Does it matter what my `LocalDateTime(..)` is?
The `OffsetDateTimePattern.Create` method requires a default value. It's only used if parsing fails and you didn't check `result.Success` before using `result.Value`. The other patterns have an overload that doesn't require a default value (see issue #267). I chose the particular default value of `2000-01-01T00:00:00.0000000+00:00` because it's similar to what the other patterns use when you don't specify a default explicitly. There really isn't any significance, though. You can use any default you wish.
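For illustration, here is a minimal sketch of supplying that default value when creating the pattern. The pattern string and input are hypothetical; only the shape of the `Create` call matters:

```csharp
using System.Globalization;
using NodaTime;
using NodaTime.Text;

// The last argument is the default value discussed above.
var pattern = OffsetDateTimePattern.Create(
    "yyyy-MM-dd'T'HH:mm:sso<G>",
    CultureInfo.InvariantCulture,
    new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.Zero));

var result = pattern.Parse("2024-05-01T13:45:00+02:00");
if (result.Success)
{
    OffsetDateTime value = result.Value;
}
```

Checking `result.Success` before touching `result.Value`, as shown, means the default never actually surfaces.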
How do I convert the time to a Unix timestamp? It does not appear there's a built-in method to do so.
The `result.Value` is an `OffsetDateTime`. The `Instant` type uses the Unix epoch, so you can do this:

```csharp
long unixTime = result.Value.ToInstant().Ticks / NodaConstants.TicksPerSecond;
```

Note that Unix timestamps are precise to the nearest second. If you're passing the value to JavaScript, you'd want to use `TicksPerMillisecond` and return it in a `long`.
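As an aside, if you're on Noda Time 2.x or later, `Instant` exposes Unix-time conversions directly, so you don't need the tick arithmetic at all. A sketch (the `OffsetDateTime` here is a placeholder for your parsed value):

```csharp
using NodaTime;

var odt = new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.Zero);
Instant instant = odt.ToInstant();

long seconds = instant.ToUnixTimeSeconds();      // whole seconds since the Unix epoch
long millis = instant.ToUnixTimeMilliseconds();  // suitable for passing to JavaScript
```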
... I was wondering if I can parse in a date/time that HAS offset defined inside, and at the same time, my `timeZone` input also allows input of timezones/offsets.
Sorry, but I don't fully understand what you're asking here. Can you please clarify?
From the code you provided, it looks like you are confusing the offset for the default value with the offset for the input string. The default value is only used if parsing fails.
If you want to control the offset instead of including it in the input, then use a `LocalDateTimePattern` instead of an `OffsetDateTimePattern` to do the parsing. After it's parsed, you can associate it with a particular zone.

Also, watch your naming conventions. `int timeZone` doesn't make sense (that's an offset, not a time zone). Perhaps `int offsetHours`, or better yet, `Offset timeZoneOffset`.
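To illustrate that suggestion, a sketch that parses a local date/time and then applies a separately supplied offset (the pattern, input, and variable names are illustrative):

```csharp
using System.Globalization;
using NodaTime;
using NodaTime.Text;

var pattern = LocalDateTimePattern.Create("yyyy-MM-dd HH:mm:ss", CultureInfo.InvariantCulture);
var result = pattern.Parse("2024-05-01 13:45:00");

if (result.Success)
{
    // The offset comes from your own input, not from the string being parsed.
    Offset timeZoneOffset = Offset.FromHours(-5);
    OffsetDateTime odt = result.Value.WithOffset(timeZoneOffset);
}
```

If the separate input is a real time zone rather than a fixed offset, resolve the `LocalDateTime` against a `DateTimeZone` instead (for example via `InZoneLeniently`), since a zone's offset varies over time.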