Int, Float, Double, Decimal, DateTime, etc. are value types. And I know:
Int: Represents a 32-bit signed integer.
Float: Represents a single-precision floating-point number (32-bit).
Double: Represents a double-precision floating-point number (64-bit).
...
But how many bits does DateTime use? And why are all value types in .NET structs?
Based on the documentation here, DateTime is a 64-bit value in C#:
Prior to the .NET Framework version 2.0, the DateTime structure
contains a 64-bit field composed of an unused 2-bit field concatenated
with a private Ticks field, which is a 62-bit unsigned field that
contains the number of ticks that represent the date and time. The
value of the Ticks field can be obtained with the Ticks property.
Starting with the .NET Framework 2.0, the DateTime structure contains
a 64-bit field composed of a private Kind field concatenated with the
Ticks field. The Kind field is a 2-bit field that indicates whether
the DateTime structure represents a local time, a Coordinated
Universal Time (UTC), or the time in an unspecified time zone. The
Kind field is used when performing time conversions between time
zones, but not for time comparisons or arithmetic. The value of the
Kind field can be obtained with the Kind property.
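So a DateTime value is 64 bits (8 bytes) in total: a 62-bit Ticks count plus the 2-bit Kind. Both pieces are visible through the public API; here is a minimal sketch using only standard DateTime members:
using System;

DateTime now = DateTime.UtcNow;
Console.WriteLine(now.Ticks);      // the 62-bit field: 100-ns intervals since 0001-01-01 00:00:00
Console.WriteLine(now.Kind);       // the 2-bit field: Unspecified, Utc or Local
Console.WriteLine(now.ToBinary()); // both fields packed back into one 64-bit value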
Related
I have a timestamp coming from C# (a 64-bit binary value) that I need to convert in PHP.
The timestamp is created with the DateTime.ToBinary() method.
An example is -8586307756854775808, which gives 10/11/2019 10:00:00 PM.
I tried unpack() in PHP, but without success:
$timestamp = unpack('q', -8586307756854775808);
How can I do this in PHP?
Thanks
First, you must separate the ticks from the 64-bit binary value.
The ticks in the 64-bit value are the same whether the DateTime was created in C# with DateTimeKind.Utc or DateTimeKind.Local.
Then the ticks can be converted to a Unix timestamp.
gmdate() returns a date string from the timestamp.
<?php
// Value from C# DateTime.ToBinary(), representing 2019-10-11 22:00:00
$num = -8586307756854775808;
// Mask off the top 2 Kind bits, keeping the 62-bit tick count
$ticks = $num & 0x3fffffffffffffff;
// Ticks are 100-ns units since 0001-01-01; 62135596800 is the number of
// seconds between 0001-01-01 and the Unix epoch 1970-01-01
$linuxTs = $ticks / 10000000 - 62135596800;
$strDate = gmdate("Y-m-d H:i:s", $linuxTs);
echo $strDate; // "2019-10-11 22:00:00"
Note: This solution requires a 64-bit platform.
The kind information from the 64-bit value $num may also be useful.
$kind = (ord(pack('J', $num)) & 0xC0) >> 6; // 0, 1 or 2 (Unspecified, Utc, Local)
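For reference, the same bit layout can be inspected from the .NET side. This is only an illustrative sketch using a UTC value (for Local values, ToBinary stores UTC-adjusted ticks, and DateTime.FromBinary is the supported way to reverse it inside .NET):
using System;

// Hypothetical UTC value; the question's example value carries the Local kind flag
DateTime dt = new DateTime(2019, 10, 11, 22, 0, 0, DateTimeKind.Utc);
long binary = dt.ToBinary();

long ticks = binary & 0x3FFFFFFFFFFFFFFF;  // low 62 bits, same mask as the PHP above
int kind = (int)((ulong)binary >> 62);     // top 2 bits: 0, 1 or 2

Console.WriteLine(ticks == dt.Ticks);                  // True for a Utc value
Console.WriteLine((DateTimeKind)kind);                 // Utc
Console.WriteLine(DateTime.FromBinary(binary) == dt);  // True: the supported round trip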
DateTime.ToBinary creates a value that combines the .Ticks and .Kind property into a single 64-bit value. You shouldn't use it unless you intend to pass that value back into DateTime.FromBinary in .NET.
While it is possible to interpret it outside of .NET, one would have to apply the same logic that is in the implementation of DateTime.FromBinary. You can see the source code here.
Instead, consider using a standard interchange format. The most common is ISO 8601 (also covered in part by RFC 3339).
string iso = dateTime.ToString("O");
The above uses the "round-trip" specifier to get the ISO 8601 format. There are other ways as well.
If your values are UTC based, you could also consider passing a Unix Timestamp. The simplest way to get one from .NET is through DateTimeOffset.ToUnixTimeSeconds or DateTimeOffset.ToUnixTimeMilliseconds depending on what precision you want. Of course, you will need to convert your DateTime to a DateTimeOffset first.
long ms = new DateTimeOffset(dateTime).ToUnixTimeMilliseconds();
The above assumes that your DateTime has either DateTimeKind.Local or DateTimeKind.Utc in its .Kind property. If it is DateTimeKind.Unspecified, then you may need to work with TimeZoneInfo to get aligned to UTC before figuring out the Unix Timestamp.
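For example, here is a sketch of the Unspecified case; the time zone ID is only an assumption for illustration:
using System;

// An Unspecified DateTime carries no time zone information by itself
DateTime unspecified = new DateTime(2019, 10, 11, 22, 0, 0, DateTimeKind.Unspecified);

// Assumption: the value really represents UK time ("GMT Standard Time" is the
// Windows zone ID; on Linux you would use "Europe/London")
TimeZoneInfo zone = TimeZoneInfo.FindSystemTimeZoneById("GMT Standard Time");

DateTime utc = TimeZoneInfo.ConvertTimeToUtc(unspecified, zone);
long unixMs = new DateTimeOffset(utc).ToUnixTimeMilliseconds();
Console.WriteLine(unixMs);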
I have been having some fun with DateTime parsing from strings in .NET MVC, and I have identified some curious behaviour. Look at this test:
[Test]
public void DoesItWork()
{
    DateTime theTime = DateTime.Now;
    DateTime theUTCTime = theTime.ToUniversalTime();
    Assert.IsTrue(theTime == theUTCTime);
}
I'm in the UK right now, and it's BST, so I expect the UTC time to be an hour behind the value of DateTime.Now. So it is. But when I call .ToUniversalTime() on my initial date time, as well as subtracting an hour, the value's Kind property is also updated - from Local to Utc. This is also what I would expect.
But when I come to compare the values of these two DateTime variables, the equality operator doesn't take account of the different Kind values, and simply reports that they're different values. To me, this seems flat out wrong.
Can anyone elucidate why it works this way?
According to MSDN and MSDN2, when you compare two DateTime values:
Remarks
To determine the relationship of the current instance to value, the CompareTo method compares the Ticks property of the current instance and value but ignores their Kind property. Before comparing DateTime objects, make sure that the objects represent times in the same time zone. You can do this by comparing the values of their Kind properties.
Remarks
The Equality operator determines whether two DateTime values are equal by comparing their number of ticks. Before comparing DateTime objects, make sure that the objects represent times in the same time zone. You can do this by comparing the values of their Kind property.
So it's correct.
Link to the DateTime.Kind property, and again from the remarks:
The Kind property allows a DateTime value to clearly reflect either Coordinated Universal Time (UTC) or the local time. In contrast, the DateTimeOffset structure can unambiguously reflect any time in any time zone as a single point in time.
UPDATE
Regarding your comment: IMHO it is expected behavior, because usually you don't need to compare two DateTimes from different time zones. If you do need to do this, you have to use the DateTimeOffset structure, which is:
Remarks
The DateTimeOffset structure includes a DateTime value, together with an Offset property that defines the difference between the current DateTimeOffset instance's date and time and Coordinated Universal Time (UTC). Because it exactly defines a date and time relative to UTC, the DateTimeOffset structure does not include a Kind member, as the DateTime structure does. It represents dates and times with values whose UTC ranges from 12:00:00 midnight, January 1, 0001 Anno Domini (Common Era), to 11:59:59 P.M., December 31, 9999 A.D. (C.E.).
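To see both behaviours side by side, here is a small illustrative sketch of the rules quoted above (not from the original answer):
using System;

DateTime local = DateTime.Now;
DateTime utc = local.ToUniversalTime();

// DateTime equality compares ticks only and ignores Kind, so the same instant
// expressed in two zones compares as "not equal" (when the local offset is non-zero)
Console.WriteLine(local == utc);   // False in the UK during BST

// DateTimeOffset equality compares the underlying UTC instant,
// so the same moment is equal regardless of offset
Console.WriteLine(new DateTimeOffset(local) == new DateTimeOffset(utc)); // True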
I have a small problem: if I save a DateTime field as an SQL command parameter, it loses precision, often by less than a millisecond.
e.g. The parameter's Value is:
TimeOfDay {16:59:35.4002017}
But its SqlValue is:
TimeOfDay {16:59:35.4000000}
And that's the time that's saved in the database.
Now I'm not particularly bothered about a couple of microseconds, but it causes problems later on when I'm comparing values; they show up as not equal.
(Also, in some comparisons the type of the field is not known until run-time, so I'm not even sure at dev-time whether I'll even need special DateTime "rounding" logic.)
Is there any easy fix for this when adding the parameter?
You're using DateTime, which is documented with:
Accuracy: Rounded to increments of .000, .003, or .007 seconds
It sounds like you want DateTime2:
Precision, scale: 0 to 7 digits, with an accuracy of 100ns. The default precision is 7 digits.
That 100ns precision is the same as .NET's DateTime (1 tick = 100ns).
Or just live with the difference and write methods to round DateTime before comparing - that may end up being simpler.
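If you can change the parameter, one option is to type it explicitly as datetime2. This is a sketch assuming System.Data.SqlClient, an open connection conn, and a hypothetical table and column that are (or can become) datetime2:
using System;
using System.Data;
using System.Data.SqlClient;

// conn is assumed to be an open SqlConnection; table and column names are made up
using (var cmd = new SqlCommand("UPDATE dbo.SomeTable SET LoggedAt = @when WHERE Id = 1", conn))
{
    // Explicitly typed as datetime2 so the full 100-ns precision survives
    cmd.Parameters.Add("@when", SqlDbType.DateTime2).Value = DateTime.Now;
    cmd.ExecuteNonQuery();
}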
Try using datetime2; it has better precision.
I have a date time that I generate like this:
DateTime myDateTime = DateTime.Now;
I then store it in the database (in a DateTime typed column) with Entity Framework. I then retrieve it with OData (WCF Data Services).
When it goes in the TimeOfDay value is: 09:30:03.0196095
When it comes out the TimeOfDay value is: 09:30:03.0200000
The net effect is that the milliseconds are seen as 19 before it is saved and 20 after it is re-loaded.
So when I do a compare later in my code, it fails where it should be equal.
Does SQL Server not have as much precision as .NET? Or is it Entity Framework or OData that is messing this up?
I will just truncate off the milliseconds (I don't really need them). But I would like to know why this is happening.
This really depends on the version of SQL Server you are using.
The resolution of the datetime field is three decimal places, for example 2011-06-06 23:59:59.997, and it is only accurate to within 3.33 ms.
In your case, 09:30:03.0196095 is being rounded up to 09:30:03.020 on storage.
Beginning with SQL Server 2008, other data types were added to provide more detail, such as datetime2, which has up to 7 decimal places and is accurate to within 100ns.
See the following for more information:
http://karaszi.com/the-ultimate-guide-to-the-datetime-datatypes
I think your best bet is to round to the second prior to storing it in SQL Server if the milliseconds are unimportant.
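If truncating to whole seconds is acceptable, a small helper like this sketch would do it before the value is stored (not part of the original answer):
using System;

// Drops everything below one second while preserving the Kind
static DateTime TruncateToSeconds(DateTime value) =>
    new DateTime(value.Ticks - (value.Ticks % TimeSpan.TicksPerSecond), value.Kind);

// Usage: store the truncated value so later comparisons line up
DateTime toStore = TruncateToSeconds(DateTime.Now);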
This is due to the precision of the SQL datetime type. According to MSDN:
Datetime values are rounded to increments of .000, .003, or .007 seconds
Look at the Rounding of datetime Fractional Second Precision section of this MSDN page and you'll understand how the rounding is done.
As indicated by others, you can use datetime2 instead of datetime to get better precision:
datetime time range is 00:00:00 through 23:59:59.997
datetime2 time range is 00:00:00 through 23:59:59.9999999
For those who do not have the ability to use datetime2 in SQL (e.g. like me, using tables generated by a separate system that would be expensive to change for this single issue), there is a simple code modification that will do the rounding for you.
Reference System.Data and import the System.Data.SqlTypes namespace. You can then use the SqlDateTime structure to do the conversion for you:
DateTime someDate = new SqlDateTime(DateTime.Now).Value;
This will convert the value into SQL ticks, and then back into .NET ticks, including the loss of precision. :)
A word of warning: this will lose the Kind of the original DateTime structure (i.e. Utc or Local). This conversion is also not simply rounding; there is a complete conversion including tick calculations, MaxTime changes, etc. So don't use this if you are relying on specific indicators in DateTime, as they could be lost.
The precision of DateTime in SQL Server is milliseconds (.fff). So 0.0196 would round to 0.020. If you can use datetime2, you get a higher precision.
I have a legacy database with a field containing an integer representing a datetime in UTC.
From the documentation:
"Timestamps within a CDR appear in Universal Coordinated Time (UTC). This value remains
independent of daylight saving time changes"
An example of a value is 1236772829.
My question is what is the best way to convert it to a .NET DateTime (in CLR code, not in the DB), both as the UTC value and as a local time value.
I have tried to Google it, but without any luck.
You'll need to know what the integer really means. This will typically consist of:
An epoch/offset (i.e. what 0 means) - for example "midnight Jan 1st 1970"
A scale, e.g. seconds, milliseconds, ticks.
If you can get two values and what they mean in terms of UTC, the rest should be easy. The simplest way would probably be to have a DateTime (or DateTimeOffset) as the epoch, then construct a TimeSpan from the integer, e.g. TimeSpan.FromMilliseconds etc. Add the two together and you're done. EDIT: Using AddSeconds or AddMilliseconds as per aakashm's answer is a simpler way of doing this bit :)
Alternatively, do the arithmetic yourself and call the DateTime constructor which takes a number of ticks. (Arguably the version taking a DateTimeKind as well would be better, so you can explicitly state that it's UTC.)
Googling that exact phrase gives me this Cisco page, which goes on to say "The field specifies a time_t value that is obtained from the operating system."
time_t is a C library concept which strictly speaking doesn't have to be any particular implementation, but typically for UNIX-y systems will be the number of seconds since the start of the Unix epoch, 1970 January 1 00:00.
Assuming this to be right, this code will give you a DateTime from an int:
DateTime epochStart = new DateTime(1970, 1, 1);
int cdrTimestamp = 1236772829;
DateTime result = epochStart.AddSeconds(cdrTimestamp);
// Now result is 2009 March 11 12:00:29
You should sanity check the results you get to confirm that this is the correct interpretation of these time_ts.
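Since the question also asks for both the UTC value and a local-time value, a slight variant of the snippet above marks the epoch as UTC and then converts; the seconds-since-1970 interpretation remains an assumption based on the Cisco documentation:
using System;

int cdrTimestamp = 1236772829;

// Mark the epoch as UTC so the result's Kind is Utc rather than Unspecified
DateTime epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime utcResult = epochStart.AddSeconds(cdrTimestamp);    // 2009 March 11 12:00:29 (UTC)

// Convert to the machine's local time zone when a local value is needed
DateTime localResult = utcResult.ToLocalTime();

Console.WriteLine(utcResult);
Console.WriteLine(localResult);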