Converting Epoch timestamp to DateTime - C#

Hello, I am experiencing a strange problem. My scenario: I am reading the Google Chrome browser history stored in an SQLite database using C#. Everything goes fine except the timestamp, which Chrome stores in an epoch-like format. I have to upload the data to a server using PHP and store it in a MySQL database.
Now my problem is that I can't manage to convert that epoch timestamp to a MySQL-compatible datetime.
For C# I tried the following code, taken from here:
static readonly DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc); // epoch field the snippet relies on, defined as in the linked answer

public static DateTime FromUnixTime(long unixTime)
{
    return epoch.AddSeconds(unixTime);
}
I tried all the available solutions on that link, but none of them worked in my case.
For PHP I tried the following code, taken from here:
echo date("Y-m-d H:i:s", substr($epoch, 0, 10));
It converts correctly when the timestamp is the one from the example, but it returns the wrong year when executed with my epoch timestamp.
I even tried to solve this problem at the MySQL query level, so I searched and tried the following solution, taken from here:
select from_unixtime(floor(1389422614485/1000));
It works when I keep the example epoch timestamp, but when I put in my own it does not.
Kindly help me get rid of this strange, irritating problem. It does not matter at which layer you provide the solution; all of the following are acceptable:
C#
PHP
MySQL query
An example epoch timestamp is the following:
13209562668824233
I know its length is not the same as in the examples, but note that Chrome converts it without any trouble.

As explained by this answer, the Chrome timestamp is not equal to Unix epoch time. This is why you're not getting the results you expect from such methods. It's actually microseconds since the 1st Jan, 1601 (as opposed to Unix epoch time's seconds since 1st Jan, 1970).
You can test your WebKit timestamp here, where you'll see it returns Tuesday, 6 August 2019 10:57:48 (UTC).
So to convert this in code, we should first subtract the difference between 1970 and 1601 (in microseconds) and then divide the value by 1 million to get seconds (C# solution):
public static DateTime ConvertWebKitTime(long webkitEpoch)
{
    const long epochDifferenceMicroseconds = 11644473600000000; // microseconds between 1601-01-01 and 1970-01-01
    var epoch = (webkitEpoch - epochDifferenceMicroseconds) / 1000000; // adjust to seconds since 1st Jan 1970
    return DateTimeOffset.FromUnixTimeSeconds(epoch).UtcDateTime; // convert to DateTime
}
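A quick sanity check with the timestamp from the question (the expected output is the one reported by the converter linked above):
var dt = ConvertWebKitTime(13209562668824233);
Console.WriteLine(dt.ToString("yyyy-MM-dd HH:mm:ss")); // 2019-08-06 10:57:48 (UTC)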

A solution for PHP (64-bit, without milliseconds) is:
$ts = 13209562668824233;
$date = date_create("1601-1-1 UTC")
    ->modify((int)($ts/1000000)." Seconds")
    ->format('Y-m-d H:i:s'); // "2019-08-06 10:57:48"
For more details see: What is the format of Chrome's timestamps?

Related

Should historical SQL UTC Dates change when currently viewing in DST

Let's say I have a date stored in UTC time in SQL using the datetime type. In C#, I am converting to the client using .ToLocalTime().
Will the time value itself be different depending on when I look at it? That is, during DST or not?
For example, the actual hour of this historic event (2022-09-30T16:00:00.00) is stored 4 hours ahead, so it converts as 12pm. This is correct. But when DST ends, and it's 5 hours ahead, will it show as being done at 11am? Is there a method to always show the actual hour it happened in the past, not adjusted?
No, because September 30 always falls within DST (in US time zones). It doesn't matter whether you check it in May or January; .NET applies the DST rules that were in effect on the date being converted, so dates in September always get the DST offset.
You can trivially confirm this yourself: create a DateTime for 2022-02-03 (no DST) and print it today (October 4). We currently observe DST, but the February date will still be output without DST, because DST was not in effect in February.
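A minimal sketch of that check, assuming a US Eastern local time zone (the dates are the ones discussed above):
DateTime summerUtc = new DateTime(2022, 9, 30, 16, 0, 0, DateTimeKind.Utc);
DateTime winterUtc = new DateTime(2022, 2, 3, 16, 0, 0, DateTimeKind.Utc);
Console.WriteLine(summerUtc.ToLocalTime()); // EDT (UTC-4): 12:00 PM
Console.WriteLine(winterUtc.ToLocalTime()); // EST (UTC-5): 11:00 AM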

Working with daylight saving time with the DateTime class

I'm in the process of porting a C++ program to C#. The program needs to be able to read a file's "modified" timestamp and store it in a List. This is what I have so far:
C# code:
ret = new List<Byte>(); //list to store the timestamp
var file = new FileInfo(filename);
//Get revision date/time
DateTime revTime_lastWriteTime_LT = file.LastWriteTime;
//Copy date/time into the List (binary buffer)
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Month));
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Day));
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Year % 100)); // 2-digit year
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Hour));
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Minute));
ret.Add(Convert.ToByte(revTime_lastWriteTime_LT.Second));
The problem occurs when I read in the Hours value. If the file was modified during daylight saving time (like a summer month), the hour value in the C++ program is one less. I can't seem to replicate this in my program. The MSDN article for DateTime says, under "DateTime values", that "local time is optionally affected by daylight saving time, which adds or subtracts an hour from the length of a day". In my C# code I made sure to convert my DateTime object to local time using ToLocalTime(), but apparently I haven't triggered the option the article is talking about. How do I make sure that my DateTime object in local time subtracts an hour when reading a file that was modified during daylight saving time?
C++ code just in case:
static WIN32_FILE_ATTRIBUTE_DATA get_file_data(const std::string & filename)
{
    WIN32_FILE_ATTRIBUTE_DATA ret;
    if (!GetFileAttributesEx(filename.c_str(), GetFileExInfoStandard, &ret))
        RaiseLastWin32Error();
    return ret;
}
//---------------------------------------------------------------------------
static SYSTEMTIME get_file_time(const std::string & filename)
{
    const WIN32_FILE_ATTRIBUTE_DATA data(get_file_data(filename));
    FILETIME local;
    if (!FileTimeToLocalFileTime(&data.ftLastWriteTime, &local))
        RaiseLastWin32Error();
    SYSTEMTIME ret;
    if (!FileTimeToSystemTime(&local, &ret))
        RaiseLastWin32Error();
    return ret;
}

void parse_asm()
{
    // Get revision date/time and size
    const SYSTEMTIME rev = get_file_time(filename);
    // Copy date/time into the binary buffer
    ret.push_back(rev.wMonth);
    ret.push_back(rev.wDay);
    ret.push_back(rev.wYear % 100); // 2-digit year
    ret.push_back(rev.wHour);
    ret.push_back(rev.wMinute);
    ret.push_back(rev.wSecond);
}
Update for clarity:
In Windows time settings I am in (UTC-05:00) Eastern Time (US & Canada). The file was last modified on Tues Sept 03, 2013 at 12:13:52 PM. The C++ app shows the hour value as 11 and the C# app shows the hour value as 12 using the code above. I need the C# app to show the same hour value as the C++ app.
The bug is actually not with .NET, but with your C++ code. You're using FileTimeToLocalFileTime, which has a well known bug, as described in KB932955:
Note The FileTimeToLocalFileTime() function and the LocalFileTimeToFileTime() function perform the conversion between UTC time and local time by using only the current time zone information and the DST information. This conversion occurs regardless of the timestamp that is being converted.
So in the example you gave, Sept 03, 2013 at 12:13:52 PM in the US Eastern Time zone should indeed be in daylight saving time. But because daylight saving time is not in effect right now (February 2015), you currently get 11 for the hour in your C++ program. If you run the exact same C++ code after next month's transition (March 8th, 2015), you will then get 12 for the hour.
The fix for the C++ code is described in the remarks section of the MSDN entry for the FileTimeToLocalFileTime function:
To account for daylight saving time when converting a file time to a local time, use the following sequence of functions in place of using FileTimeToLocalFileTime:
FileTimeToSystemTime
SystemTimeToTzSpecificLocalTime
SystemTimeToFileTime
Now that you understand the bug - if you actually wanted to keep that behavior in C# (which I do not recommend), then you would do the following:
TimeSpan currentOffset = TimeZoneInfo.Local.GetUtcOffset(DateTime.UtcNow);
DateTime revTime_LastWriteTime_Lt = file.LastWriteTimeUtc.Add(currentOffset);
The better thing to do would just be to leave your current code as is (using file.LastWriteTime) and call the bug fixed.
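If you prefer to make the conversion explicit, here is a small sketch using the same file variable as in the question (TimeZoneInfo.ConvertTimeFromUtc applies the DST rules in effect on the timestamp's own date, not today's):
DateTime utc = file.LastWriteTimeUtc;
DateTime local = TimeZoneInfo.ConvertTimeFromUtc(utc, TimeZoneInfo.Local);
// For the Sept 03, 2013 example this yields 12:13:52 PM, matching file.LastWriteTime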
Sorry, you need to use Add, not AddHours; Add accepts a TimeSpan. So you're looking for:
file.LastWriteTimeUtc.Add(TimeZoneInfo.Local.BaseUtcOffset);

What datetime format is a number like "22194885"?

I have a very simple issue with datetime and am seeking some help.
I have a log that I would like to get all the date information from. There are three columns of datetime formats (two in UNIX timestamp, while the other isn't).
The one with the different timestamp format has a value like 22194885, and I don't know which datetime type it belongs to.
Looks like minutes since January 1, 1970. This is Python code, but it works the same as C's localtime():
>>> import time
>>> time.localtime(22194885*60)
time.struct_time(tm_year=2012, tm_mon=3, tm_mday=13, tm_hour=19, tm_min=45, tm_sec=0, tm_wday=1, tm_yday=73, tm_isdst=1)
Works out to 3/13/2012 7:45pm.
Looks like it could be minutes since the Epoch, rather than milliseconds
22194885 minutes / 60 = 369914.75 hours
369914.75 hours / 24 = 15413.1 days
15413.1 days / 365 = 42.2 years
1970 + 42.2 = about today
For help converting Epoch time to .Net time, see
How to convert a Unix timestamp to DateTime and vice versa?
Remember that question deals with milliseconds, so you'll have to adjust the answer slightly.
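A minimal .NET sketch under that assumption (minutes, not seconds, since the Unix epoch; the value is the one from the question):
long minutes = 22194885;
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime resultUtc = epoch.AddMinutes(minutes);
// 2012-03-14 02:45:00 UTC, which matches the 3/13/2012 7:45 PM local result
// above if the answerer's local zone was US Pacific (UTC-7 in DST)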
Based on the calculations in Eric J's answer (which has been deleted), this could well be the number of minutes since the epoch. Bah! He slipped in a ninja edit.
Julian seconds (since the start of the year) is also a strong possibility.

DateTime.Parse error

Our web service uses the DateTime.Parse method to convert data from an XML document to DateTime format. It parses the date and time strings separately and adds them together like this:
DateTime.Parse(Date_string).Add(TimeSpan.Parse(Time_string))
The code was working fine except for a few hours last week, when times came out 12 hours ahead of the actual time. For example, 01/01/2011 10:00:00 was parsed as 01/01/2011 22:00:00. Most of the requests during that window were processed with datetime values 12 hours ahead, though some were processed correctly. It is working fine now, and we haven't seen the problem since.
Has anyone seen an issue like this?
You say "Code was working fine except for a few hours last week", but you didn't specify exactly when that was or what time zone you are in. Any chance it was around a daylight savings time change?
You shouldn't use TimeSpan.Parse at all. A TimeSpan does NOT represent the time-of-day, despite its appearance as hh:mm:ss. A TimeSpan represents a fixed DURATION of time.
If you really are given separate date and time strings, join them together before parsing, such as:
DateTime dt = DateTime.Parse(date_string + " " + time_string);
You should also be aware of the timezone implications of the string you are sending in. See the MSDN article on DateTime.Parse for further details.
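For instance, here is a hedged sketch of that approach with an explicit format and culture (the format strings are assumptions, not taken from the question):
using System.Globalization;

string dateString = "01/01/2011";
string timeString = "10:00:00";
DateTime dt = DateTime.ParseExact(
    dateString + " " + timeString,   // join date and time before parsing
    "MM/dd/yyyy HH:mm:ss",           // assumed input format
    CultureInfo.InvariantCulture);   // avoid culture-dependent surprises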

Timestamp as UTC Integer

I have a legacy database with a field containing an integer representing a datetime in UTC.
From the documentation:
"Timestamps within a CDR appear in Universal Coordinated Time (UTC). This value remains
independent of daylight saving time changes"
An example of a value is 1236772829.
My question is: what is the best way to convert it to a .NET DateTime (in CLR code, not in the DB), both as a UTC value and as a local time value?
I have tried googling it, but without any luck.
You'll need to know what the integer really means. This will typically consist of:
An epoch/offset (i.e. what 0 means) - for example "midnight Jan 1st 1970"
A scale, e.g. seconds, milliseconds, ticks.
If you can get two values and what they mean in terms of UTC, the rest should be easy. The simplest way would probably be to have a DateTime (or DateTimeOffset) as the epoch, then construct a TimeSpan from the integer, e.g. TimeSpan.FromMilliseconds etc. Add the two together and you're done. EDIT: Using AddSeconds or AddMilliseconds as per aakashm's answer is a simpler way of doing this bit :)
Alternatively, do the arithmetic yourself and call the DateTime constructor which takes a number of ticks. (Arguably the version taking a DateTimeKind as well would be better, so you can explicitly state that it's UTC.)
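A sketch of that tick-arithmetic alternative, stating DateTimeKind.Utc explicitly (the epoch and scale are the assumptions discussed above):
long seconds = 1236772829;
long ticks = new DateTime(1970, 1, 1).Ticks + seconds * TimeSpan.TicksPerSecond;
DateTime utc = new DateTime(ticks, DateTimeKind.Utc); // kind is explicit, so no ambiguity later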
Googling that exact phrase gives me this Cisco page, which goes on to say "The field specifies a time_t value that is obtained from the operating system."
time_t is a C library concept which strictly speaking doesn't have to be any particular implementation, but typically for UNIX-y systems will be the number of seconds since the start of the Unix epoch, 1970 January 1 00:00.
Assuming this to be right, this code will give you a DateTime from an int:
DateTime epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc); // kind stated explicitly, per the answer above
int cdrTimestamp = 1236772829;
DateTime result = epochStart.AddSeconds(cdrTimestamp);
// Now result is 2009 March 11 12:00:29 (UTC)
You should sanity check the results you get to confirm that this is the correct interpretation of these time_ts.
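For the local-time half of the question, a minimal follow-up (once the value is a UTC DateTime, .NET handles the zone conversion):
DateTime local = result.ToLocalTime(); // applies this machine's time zone rules, DST included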
