I'm having a little trouble converting nanoseconds to a DateTime so I can use the Google Fit API (https://developers.google.com/fit/rest/v1/reference/users/dataSources/datasets/get):
Dataset identifier that is a composite of the minimum data point start
time and maximum data point end time represented as nanoseconds from
the epoch. The ID is formatted like: "startTime-endTime" where
startTime and endTime are 64 bit integers.
I was able to convert from DateTime to nanoseconds this way:
DateTime zuluTime = ssDatetime.ToUniversalTime();
DateTime unixEpoch = new DateTime(1970, 1, 1);
ssNanoSeconds = (Int32)(zuluTime.Subtract(unixEpoch)).TotalSeconds + "000000000";
But now I need to convert nanoseconds back to a DateTime. How can I do it?
Use the AddTicks method. Don't forget to divide the nanoseconds by 100 to get ticks, since one tick is 100 nanoseconds.
long nanoseconds = 1449491983090000000;
DateTime epochTime = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime result = epochTime.AddTicks(nanoseconds / 100);
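To tie this back to the Google Fit dataset ID format from the question, here's a sketch that splits a "startTime-endTime" identifier and converts both halves; the ID string itself is made up for illustration:

```csharp
using System;

class Program
{
    static DateTime FromEpochNanoseconds(long nanoseconds)
    {
        // One tick is 100 ns, so divide by 100 to convert nanoseconds to ticks
        var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        return epoch.AddTicks(nanoseconds / 100);
    }

    static void Main()
    {
        // A hypothetical "startTime-endTime" dataset identifier
        string datasetId = "1449491983090000000-1449495583090000000";
        string[] parts = datasetId.Split('-');

        DateTime start = FromEpochNanoseconds(long.Parse(parts[0]));
        DateTime end = FromEpochNanoseconds(long.Parse(parts[1]));

        Console.WriteLine(start); // 7 December 2015 12:39:43 UTC
        Console.WriteLine(end);
    }
}
```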
The Ticks property counts 100-nanosecond intervals, so to go from ticks to nanoseconds you multiply by 100 (not divide). So how about:
var ssNanoSeconds = (zuluTime.Subtract(unixEpoch)).Ticks * 100;
And from nanoseconds back to DateTime, divide by 100 to get ticks:
DateTime dateTime = new DateTime(1970, 1, 1).AddTicks(nanoSeconds / 100);
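For the record, the two directions are multiply-by-100 and divide-by-100 respectively; a quick round-trip sketch confirms they are inverses, at least down to tick (100 ns) precision:

```csharp
using System;

class Program
{
    static void Main()
    {
        var unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        var zuluTime = new DateTime(2015, 12, 7, 12, 39, 43, DateTimeKind.Utc);

        // DateTime -> nanoseconds since the epoch: 1 tick = 100 ns, so multiply
        long nanoSeconds = (zuluTime - unixEpoch).Ticks * 100;

        // nanoseconds -> DateTime: divide by 100 to get back to ticks
        DateTime roundTrip = unixEpoch.AddTicks(nanoSeconds / 100);

        Console.WriteLine(roundTrip == zuluTime); // True
    }
}
```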
To convert nanoseconds to a DateTime object, you can do the following:
Divide the nanoseconds value by 1,000,000,000 to get the equivalent number of seconds.
Create a DateTime object representing the Unix epoch (January 1, 1970).
Add the number of seconds to the Unix epoch to get the desired DateTime object.
long nanoseconds = 1234567890123; // nanoseconds value to convert
// Divide the nanoseconds value by 1,000,000,000 to get the equivalent number of seconds
long seconds = nanoseconds / 1000000000; // use long: an int overflows for dates after January 2038
// Create a DateTime object representing the Unix epoch (January 1, 1970)
DateTime unixEpoch = new DateTime(1970, 1, 1);
// Add the number of seconds to the Unix epoch to get the desired DateTime object
DateTime datetime = unixEpoch.AddSeconds(seconds);
Note that the integer division discards any sub-second part of the timestamp. Also, a signed 64-bit nanosecond count can only represent roughly 292 years before or after the Unix epoch, so values outside that window cannot be expressed as nanoseconds in a long at all (the DateTime type itself covers the years 1 through 9999).
public static DateTime ToDateTimeForEpochMSec(this double microseconds)
{
var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime tempDate = epoch.AddMilliseconds(microseconds / 1000);
return tempDate;
}
I tried using this code but I'm losing microsecond precision in the DateTime.
You need to adjust your code to handle microseconds. We know that 1000 microseconds = 1 millisecond, therefore 1/1000th of a millisecond is a microsecond. Now, after AddMilliseconds, the next level of precision that DateTime supports is Ticks, so we'll have to use that.
We can find the number of ticks in a millisecond by using TimeSpan:
long ticksPerMillisecond = TimeSpan.TicksPerMillisecond;
If 1 microsecond is 1/1000th of a millisecond, then it stands to reason that we need 1/1000th of the ticks to represent a microsecond. The documentation defines a tick as:
one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond (see TicksPerMillisecond) and 10 million ticks in a second.
We should therefore be safe to use integer division to divide by 1000:
long ticksPerMicrosecond = TimeSpan.TicksPerMillisecond / 1000;
Now we can convert the number of microseconds into ticks:
long ticks = (long)(microseconds * ticksPerMicrosecond);
And add it to the epoch:
DateTime tempDate = epoch.AddTicks(ticks);
Putting it all together we get:
public static DateTime ToDateTimeForEpochMSec(this double microseconds)
{
var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
long ticksPerMicrosecond = TimeSpan.TicksPerMillisecond / 1000;
long ticks = (long)(microseconds * ticksPerMicrosecond);
DateTime tempDate = epoch.AddTicks(ticks);
return tempDate;
}
By the way, I'm not sure why microseconds is a double here. It seems like it would be fine as a long.
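For completeness, a quick usage sketch of the extension method (repeated here so the snippet is self-contained); the input value is made up:

```csharp
using System;

static class EpochExtensions
{
    public static DateTime ToDateTimeForEpochMSec(this double microseconds)
    {
        var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        long ticksPerMicrosecond = TimeSpan.TicksPerMillisecond / 1000; // 10 ticks per µs
        return epoch.AddTicks((long)(microseconds * ticksPerMicrosecond));
    }
}

class Program
{
    static void Main()
    {
        // 1500 µs = 1.5 ms: the microsecond part survives as ticks
        DateTime dt = 1500d.ToDateTimeForEpochMSec();
        long ticksPastEpoch = dt.Ticks - new DateTime(1970, 1, 1).Ticks;
        Console.WriteLine(ticksPastEpoch); // 15000
    }
}
```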
In jQuery I do this:
new Date(2009, 12, 1).getTime()
and I get a huge number like this: 1262304000000
How can I change the datetime variable in c# to get the same result that I would get in jquery?
The JavaScript getTime function returns the number of milliseconds since midnight on 1st January 1970.
So to get the same figure from a .NET System.DateTime object, you must subtract the epoch of 1st January 1970 and return the TotalMilliseconds property of the resulting TimeSpan:
var dateOfInterest = new DateTime(2009,12,1);
var epoch = new DateTime(1970,1,1);
var differenceInMilliseconds = (dateOfInterest - epoch).TotalMilliseconds;
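On .NET Framework 4.6+ the same figure is available directly via DateTimeOffset; a sketch (JavaScript's getTime() is UTC-based, so the offset is given explicitly here):

```csharp
using System;

class Program
{
    static void Main()
    {
        // 1 December 2009, midnight UTC
        var dateOfInterest = new DateTimeOffset(2009, 12, 1, 0, 0, 0, TimeSpan.Zero);
        long ms = dateOfInterest.ToUnixTimeMilliseconds();
        Console.WriteLine(ms); // 1259625600000
    }
}
```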
C# has the DateTime.Ticks property, but it doesn't do exactly the same thing as getTime(). There is no exact equivalent in C#: you can divide the result by 10,000 to get milliseconds, but Ticks still counts from the year 0001, whereas JavaScript's getTime() counts from 1970.
Gets the number of ticks that represent the date and time of this
instance.
A single tick represents one hundred nanoseconds or one ten-millionth
of a second. There are 10,000 ticks in a millisecond.
The value of this property represents the number of 100-nanosecond
intervals that have elapsed since 12:00:00 midnight, January 1, 0001.
DateTime dt = new DateTime(2009, 12, 1);
dt.Ticks.Dump(); // 633952224000000000
From http://www.w3schools.com/jsref/jsref_gettime.asp
The getTime() method returns the number of milliseconds between
midnight of January 1, 1970 and the specified date.
As a better solution, you can use the TimeSpan structure to subtract your dates and use its TotalMilliseconds property, like:
DateTime start = new DateTime(2009, 12, 1);
DateTime end = new DateTime(1970, 1, 1);
double milliseconds = (start - end).TotalMilliseconds;
milliseconds.Dump(); // 1259625600000
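Incidentally, the two figures differ because JavaScript months are zero-based: new Date(2009, 12, 1) actually means 1 January 2010, not 1 December 2009. A sketch of the C# calculation that matches the number in the question:

```csharp
using System;

class Program
{
    static void Main()
    {
        // JavaScript's new Date(2009, 12, 1) rolls over to 1 January 2010
        double ms = (new DateTime(2010, 1, 1) - new DateTime(1970, 1, 1)).TotalMilliseconds;
        Console.WriteLine(ms); // 1262304000000
    }
}
```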
When I generate my Unix timestamp using the code below, I get two numbers after the decimal point, like this: 1389200512.12
Should those numbers be here? Do they mean anything? Are those fractions of milliseconds?
private double ConvertToTimestamp(DateTime value)
{
//create Timespan by subtracting the value provided from
//the Unix Epoch
TimeSpan span = (value - new DateTime(1970, 1, 1, 0, 0, 0, 0));
//return the total seconds (which is a UNIX timestamp)
return Math.Round((double)span.TotalSeconds,2);
}
As Peter said, they are fractions of a second, because you call span.TotalSeconds. 1389200512.12 isn't milliseconds since the epoch but seconds since the epoch. If you want milliseconds, call span.TotalMilliseconds instead.
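To see where the fraction comes from, here's a sketch; the sample DateTime is reverse-engineered from the value in the question, so treat it as illustrative:

```csharp
using System;

class Program
{
    static void Main()
    {
        // 8 January 2014 17:01:52.120 UTC: the 120 ms becomes the .12 fraction
        var value = new DateTime(2014, 1, 8, 17, 1, 52, 120, DateTimeKind.Utc);
        TimeSpan span = value - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        Console.WriteLine(Math.Round(span.TotalSeconds, 2)); // 1389200512.12
    }
}
```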
There are a number of questions on this site explaining how to do this. My problem is that when I do what seems to work for everyone else, I don't get the correct date or time. The code is ...
long numberOfTicks = Convert.ToInt64(callAttribute);
startDateTime = new DateTime(numberOfTicks);
The value of callAttribute is = "1379953111"
After converting it the value of numberOfTicks = 1379953111
But the DateTime ends up being startDateTime = {1/1/0001 12:02:17 AM}
I have taken the same value for ticks and converted it online and it comes up with the correct date/time.
What am I doing wrong?
Your value doesn't seem to be a number of ticks; I suspect it's a UNIX timestamp (number of seconds since 1970/01/01 UTC)
Here's a function to convert from a UNIX timestamp:
static readonly DateTime _unixEpoch =
new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
public static DateTime DateFromTimestamp(long timestamp)
{
return _unixEpoch.AddSeconds(timestamp);
}
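Applied to the value from the question, a quick sketch:

```csharp
using System;

class Program
{
    static readonly DateTime _unixEpoch =
        new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    public static DateTime DateFromTimestamp(long timestamp)
    {
        return _unixEpoch.AddSeconds(timestamp);
    }

    static void Main()
    {
        // The "1379953111" from the question is seconds since the epoch, not ticks
        DateTime startDateTime = DateFromTimestamp(1379953111);
        Console.WriteLine(startDateTime); // 23 September 2013 16:18:31 UTC
    }
}
```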
How can I find the difference between two times?
Like 13:45:26.836 - 14:24:18.473, which are in the format "Hour:Min:Sec.Millisec". I need to find the time difference between these two values.
How can I do this in C#?
Thanks in advance.
Basically, what you need to do is put those time values into DateTime structures. Once you have your two DateTime variables, just subtract them from one another - the result is a variable of type TimeSpan:
DateTime dt1 = new DateTime(2010, 5, 7, 13, 45, 26, 836);
DateTime dt2 = new DateTime(2010, 5, 7, 14, 24, 18, 473);
TimeSpan result = dt2 - dt1;
string result2 = result.ToString();
TimeSpan has a ton of properties that expose the difference in all sorts of units, e.g. milliseconds, seconds, minutes, and so on. You can also just call .ToString() on it to get a string representation of the result. In result2, you'll get something like this:
00:38:51.6370000
Is that what you're looking for?
I'm posting an example; you can check it and adapt it to your program.
/* Read the initial time. */
DateTime startTime = DateTime.Now;
Console.WriteLine(startTime);
/* Do something that takes up some time. For example sleep for 1.7 seconds. */
Thread.Sleep(1700);
/* Read the end time. */
DateTime stopTime = DateTime.Now;
Console.WriteLine(stopTime);
/* Compute the duration between the initial and the end time.
* Print out the number of elapsed hours, minutes, seconds and milliseconds. */
TimeSpan duration = stopTime - startTime;
Console.WriteLine("hours:" + duration.Hours);
Console.WriteLine("minutes:" + duration.Minutes);
Console.WriteLine("seconds:" + duration.Seconds);
Console.WriteLine("milliseconds:" + duration.Milliseconds);
Find the number of seconds in each, subtract the two numbers, and then you can figure out the time difference. Whatever programming language you use, I am positive there must be a library that can handle it.
//Start off with a string
string time1s = "13:45:26.836";
string time2s = "14:24:18.473";
TimeSpan interval = DateTime.Parse(time2s) - DateTime.Parse(time1s);
This will produce a result of:
Days               0                      int
Hours              0                      int
Minutes            38                     int
Seconds            51                     int
Milliseconds       637                    int
Ticks              23316370000            long
TotalDays          0.02698653935185185    double
TotalHours         0.64767694444444446    double
TotalMilliseconds  2331637.0              double
TotalMinutes       38.860616666666665     double
TotalSeconds       2331.6369999999997     double
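Since both values are really times of day, TimeSpan.Parse also handles the "Hour:Min:Sec.Millisec" format directly, avoiding the detour through DateTime; a sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        TimeSpan t1 = TimeSpan.Parse("13:45:26.836");
        TimeSpan t2 = TimeSpan.Parse("14:24:18.473");

        TimeSpan interval = t2 - t1;
        Console.WriteLine(interval); // 00:38:51.6370000
    }
}
```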