Accuracy of comparing DateTime.Now in C# and GetDate() from SQL - C#

What I am doing is selecting a row that I have just added, by its DateTime, in order to get its PK, since I need it.
I store the DateTime with:
DateTime nw = DateTime.Now; and I use nw to search through my table.
My question is: what if I insert 2 rows within a span of 1 minute?
My SQL table stores them like this:
Since the milliseconds aren't visible, will both of them be selected (assuming everything happened within 1 minute)?
Edit: this is from my ASP.NET MVC project, so the DateTime is new every time my action runs.

The problem is precision. The GetDate() function in T-SQL does not have the same precision as the C# DateTime, because GetDate() returns a T-SQL datetime.
TSQL DateTime:
Defines a date that is combined with a time of day with fractional seconds that is based on a 24-hour clock.
Rounded to increments of .000, .003, or .007 seconds
C# DateTime:
The Ticks property expresses date and time values in units of one ten-millionth of a second, and the Millisecond property returns the thousandths of a second in a date and time value. However, if you are using repeated calls to the DateTime.Now property to measure elapsed time, and you are concerned with small time intervals less than 100 milliseconds, you should note that values returned by the DateTime.Now property are dependent on the system clock, which on Windows 7 and Windows 8 systems has a resolution of approximately 15 milliseconds.
However, you could use the newer SysDateTime() (available as of SQL Server 2008), which returns a datetime2(7) value that should match the precision of the C# DateTime.
datetime2(7):
Defines a date that is combined with a time of day that is based on 24-hour clock. datetime2 can be considered as an extension of the existing datetime type that has a larger date range, a larger default fractional precision, and optional user-specified precision.
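If you can change the column type, the key is also to pass the value explicitly as datetime2 on the way in. A minimal sketch, assuming a hypothetical dbo.Orders table with a datetime2(7) column CreatedAt and System.Data.SqlClient (the table and column names are illustrative):

using System;
using System.Data;
using System.Data.SqlClient;

class Datetime2InsertSketch
{
    static void Insert(string connectionString)
    {
        DateTime now = DateTime.Now; // full .NET precision (100 ns ticks)

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.Orders (CreatedAt) VALUES (@createdAt);", conn))
        {
            // SqlDbType.DateTime2 maps to datetime2, so the value is not
            // rounded to .000/.003/.007 increments on the way in.
            cmd.Parameters.Add("@createdAt", SqlDbType.DateTime2).Value = now;

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}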
This is only academically interesting, though, because you should never use a datetime as a PK.
Let's say it's Nov 6, 2016 at 1:15AM. You create a record:
MyPk
------
2016-11-06 01:15:00
One hour later you create another record...
MyPk
------
2016-11-06 01:15:00
2016-11-06 01:15:00
Duplicate PKs due to daylight saving time. Don't have daylight saving time? There are still a multitude of reasons not to use a DateTime for a PK (simply search for "datetime as primary key").
Just to name a few:
Exact select can be very difficult (milliseconds matter!)
Foreign Keys become a Nightmare
Replication is very difficult unless all systems are in the same timezone
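As an alternative, let the database generate a surrogate key and hand it back at insert time, so you never have to search by timestamp at all. A minimal sketch, assuming a hypothetical dbo.Orders table with an int IDENTITY column Id (the names are illustrative, using System.Data.SqlClient):

using System;
using System.Data;
using System.Data.SqlClient;

class IdentityKeySketch
{
    // Returns the PK that SQL Server generated for the inserted row,
    // so there is no need to look the row up by timestamp afterwards.
    static int InsertAndGetId(string connectionString, string name)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.Orders (Name, CreatedAt) " +
            "OUTPUT INSERTED.Id " +
            "VALUES (@name, SYSDATETIME());", conn))
        {
            cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100).Value = name;
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}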

If you really want to use DateTime.Now with second precision as a way to find the PK of your data, you should not declare it once and reuse it everywhere. Rather, you should use it like this:
insertDataToDataBase(data, DateTime.Now);
and then 10-20 seconds later
insertDataToDataBase(data, DateTime.Now); //still use DateTime.Now
This way your DateTime.Now will always be updated

Related

Does calling DateTime.AddDays with a whole number always leave the time unchanged?

Consider the following code which attempts to get a DateTime that's equivalent to the local time "midnight yesterday":
DateTime midnightYesterday = DateTime.Today.AddDays(-1.0d);
Will this always result in a DateTime with a time component of 00:00:00 -- regardless of any corner cases such as leap days, leap seconds, or what the local time zone is?
More generally: does calling DateTime.AddDays, passing a whole number as a parameter, always result in a DateTime being returned that has the exact same time component as the original DateTime?
The MSDN documentation for DateTime.AddDays does not address this specific question.
DateTime does not account for leap seconds. You can read this article, from which you will see that because of this it doesn't really support UTC. The documentation states that:
Time values are measured in 100-nanosecond units called ticks, and a particular date is the number of ticks since 12:00 midnight, January 1, 0001 A.D. (C.E.) in the GregorianCalendar calendar (excluding ticks that would be added by leap seconds)
About daylight saving time, the documentation states the following:
Conversion operations between time zones (such as between UTC and local time, or between one time zone and another) take daylight saving time into account, but arithmetic and comparison operations do not.
That means that adding days (which is an arithmetic operation) to a DateTime instance, even one whose Kind is Local (so it represents time in the local time zone), does not take DST into account. That makes performing arithmetic operations on DateTimes with Kind Local a really bad idea. If you need to do that with date times, first convert to UTC (which has no notion of DST), perform the operation, then convert back to local (the conversion, as stated above, does take DST into account).
You can also look at the source code to see that the DateTime value is stored internally as a number (a number of ticks) and adding days just adds a fixed value to that number. Calculating the hour/minute/second uses that value and performs fixed operations (just divisions) to obtain the target value. None of those operations account for anything like leap seconds, time zones or anything else. So the answer to your question is yes.
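A quick sketch illustrating both points (the DST comment only applies if the machine's time zone observes it):

using System;

class AddDaysSketch
{
    static void Main()
    {
        // AddDays with a whole number just shifts the tick count by
        // n * TimeSpan.TicksPerDay, so the time-of-day component is unchanged.
        DateTime midnightYesterday = DateTime.Today.AddDays(-1.0);
        Console.WriteLine(midnightYesterday.TimeOfDay); // always 00:00:00

        // Arithmetic on a Local DateTime ignores DST. To add exactly 24 elapsed
        // hours (rather than the same wall-clock time) across a DST transition,
        // convert to UTC, add, then convert back; the conversions apply DST rules.
        DateTime local = DateTime.Now; // Kind = Local
        DateTime plusOneDay = local.ToUniversalTime().AddDays(1).ToLocalTime();
        Console.WriteLine(plusOneDay);
    }
}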

How to remove time in date time?

How do I remove the time from a DateTime? In the date column only the date should be displayed.
I store the value in a repository combobox dropdown, and it stores the value including the time. How do I remove the time?
I know there are many questions about this, but the usual solution is converting it with date.ToString("dd MMM yyyy"). Is there a solution besides converting it into a string? I want the value to remain a DateTime, not a string conversion.
The code I am using still gives me a time.
DateTime date = Convert.ToDateTime(gridView1.GetDataRow(i)["date"]);
You just forgot to take the Date at the end of the conversion:
DateTime date = Convert.ToDateTime(gridView1.GetDataRow(i)["date"]).Date;
DateTime, as the name implies, stores both a date and a time.
You cannot remove the time part from a date because time is an integral part of a date.
To understand this you have to understand how the date and time are stored. The date and time can be thought of as a single number with a fractional part: 24 hours are treated as the numeric value 1, so when the value is increased by 1 the date is increased by 1 day. If the value is increased by 0.5, the date is increased by 12 hours (half a day).
So a value of 42613.00 means 31 Aug 2016 at midnight (just when the day started), 42613.25 means 6 AM on 31 Aug 2016, 42613.50 means 12 noon on 31 Aug 2016 (and 42613.39236 means 9:25:00 AM on 31 Aug 2016).
The smallest fraction of time that needs to be stored is 1 millisecond, which as a fraction of a day is about 0.0000000115740740740741. That value cannot be represented exactly in binary (its binary expansion never terminates), so milliseconds are stored as their nearest approximation.
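The fractional-day numbers above match the OLE Automation date format, which DateTime exposes through ToOADate/FromOADate (the DateTime struct itself stores a tick count, but the OA view makes the fractions visible). A small sketch reproducing the values from this answer:

using System;

class OADateSketch
{
    static void Main()
    {
        // 42613.0 is 31 Aug 2016 at midnight; 0.25 of a day is 6 hours, and so on.
        Console.WriteLine(DateTime.FromOADate(42613.00));    // 31 Aug 2016 00:00:00
        Console.WriteLine(DateTime.FromOADate(42613.25));    // 31 Aug 2016 06:00:00
        Console.WriteLine(DateTime.FromOADate(42613.39236)); // ~9:25 AM (actually 09:24:59.904)

        // Going the other way, the time of day becomes the fractional part.
        Console.WriteLine(new DateTime(2016, 8, 31, 12, 0, 0).ToOADate()); // 42613.5
    }
}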
That said, if you wish to keep only the date part, you can create your own class or struct that stores the date part of the DateTime, overrides operators for date arithmetic, and provides implicit conversions to DateTime for any code that expects a DateTime.
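A minimal sketch of such a wrapper (illustrative only; on .NET 6 and later the built-in DateOnly type already covers this):

using System;

// Illustrative date-only wrapper: it stores the DateTime with the time part
// stripped and converts back implicitly wherever a DateTime is expected.
public readonly struct DateValue
{
    private readonly DateTime _date;

    public DateValue(DateTime dt) { _date = dt.Date; }

    // Date arithmetic stays date-only because midnight plus whole days is still midnight.
    public static DateValue operator +(DateValue d, int days)
    {
        return new DateValue(d._date.AddDays(days));
    }

    public static TimeSpan operator -(DateValue a, DateValue b)
    {
        return a._date - b._date;
    }

    // Code that expects a DateTime receives midnight of that day.
    public static implicit operator DateTime(DateValue d)
    {
        return d._date;
    }

    public override string ToString()
    {
        return _date.ToString("yyyy-MM-dd");
    }
}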

Milliseconds in my DateTime changes when stored in SQL Server

I have a date time that I generate like this:
DateTime myDateTime = DateTime.Now;
I then store it in the database (in a DateTime typed column) with Entity Framework. I then retrieve it with OData (WCF Data Services).
When it goes in the TimeOfDay value is: 09:30:03.0196095
When it comes out the TimeOfDay value is: 09:30:03.0200000
The net effect is that the milliseconds are seen as 19 before the value is saved and as 20 after it is reloaded.
So when I do a compare later in my code, it fails where it should be equal.
Does SQL Server not have as much precision as .NET? Or is it Entity Framework or OData that is messing this up?
I will just truncate off the milliseconds (I don't really need them). But I would like to know why this is happening.
This really depends on the version of SQL server you are using.
The resolution of the datetime field is 3 decimal places, for example 2011-06-06 23:59:59.997, and it is only accurate to within 3.33 ms.
In your case, 09:30:03.0196095 is being rounded up to 09:30:03.020 on storage.
Beginning with SQL 2008, other data types were added to provide more detail, such as datetime2 which has up to 7 decimal places and is accurate to within 100ns.
See the following for more information:
http://karaszi.com/the-ultimate-guide-to-the-datetime-datatypes
I think your best bet is to do the rounding to the second PRIOR to storing it in SQL Server if the milliseconds are unimportant.
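For example, a small sketch that truncates to whole seconds before the value is handed to the database (round instead of truncate by adding half a second first, if you prefer):

using System;

static class DateTimeTruncation
{
    // Drops the sub-second ticks so the value survives the round trip
    // through a SQL Server datetime column unchanged.
    public static DateTime TruncateToSeconds(this DateTime dt)
    {
        return new DateTime(dt.Ticks - (dt.Ticks % TimeSpan.TicksPerSecond), dt.Kind);
    }
}

// Usage: store the truncated value and compare against the same thing later.
// DateTime stamp = DateTime.Now.TruncateToSeconds();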
This is due to the precision of the SQL datetime type. According to msdn:
Datetime values are rounded to increments of .000, .003, or .007 seconds
Look at the Rounding of datetime Fractional Second Precision section of this msdn page and you'll understand how the rounding is done.
As indicated by others, you can use datetime2 instead of datetime to have a better precision:
datetime time range is 00:00:00 through 23:59:59.997
datetime2 time range is 00:00:00 through 23:59:59.9999999
For those who do not have the ability to use datetime2 in SQL (e.g. like me, using tables that are generated by a separate system and would be expensive to change for this single issue), there is a simple code modification that will do the rounding for you.
Reference System.Data and import the System.Data.SqlTypes namespace. You can then use the SqlDateTime structure to do the conversion for you:
DateTime someDate = new SqlDateTime(DateTime.Now).Value;
This will convert the value into SQL ticks, and then back into .NET ticks, including the loss of precision. :)
A word of warning, this will lose the Kind of the original DateTime structure (i.e. Utc, Local). This conversion is also not simply rounding, there is a complete conversion including tick calculations, MaxTime changes, etc.. So don't use this if you are relying on specific indicators in DateTime as they could be lost.
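If preserving the Kind matters, one small workaround is to reapply it after the round trip, something like this sketch:

using System;
using System.Data.SqlTypes;

class SqlDateTimeRounding
{
    // Rounds a DateTime to SQL Server datetime precision (.000/.003/.007)
    // and restores the original DateTimeKind, which SqlDateTime discards.
    // Note: SqlDateTime only supports years 1753 through 9999.
    static DateTime ToSqlDateTimePrecision(DateTime dt)
    {
        DateTime rounded = new SqlDateTime(dt).Value;
        return DateTime.SpecifyKind(rounded, dt.Kind);
    }
}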
The precision of the SQL Server datetime type is roughly 3 milliseconds (values are rounded to .000, .003 or .007), so 0.0196 rounds to 0.020. If you can use datetime2, you get higher precision.

How to get current Time with Milli second precision ( C#)

I use System.DateTime.Now, but it returns something like 5/28/2011 1:45:58 AM (no millisecond precision).
I would like to save the current time (or date time) with millisecond precision in the database.
Update: sorry, I meant millisecond.
System.DateTime does carry millisecond precision; 5/28/2011 1:45:58 AM is just how it was formatted as a string.
To format it with milliseconds included, use the format string "d/M/yyyy hh:mm:ss.fff tt".
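For example (the exact output depends on the current time and culture):

using System;

class MillisecondFormatting
{
    static void Main()
    {
        DateTime now = DateTime.Now;

        // Custom format with milliseconds:
        Console.WriteLine(now.ToString("d/M/yyyy hh:mm:ss.fff tt")); // e.g. 28/5/2011 01:45:58.237 AM

        // The round-trip "o" format keeps the full 100 ns precision
        // (and appends the UTC offset because DateTime.Now has Kind = Local):
        Console.WriteLine(now.ToString("o")); // e.g. 2011-05-28T01:45:58.2370000+03:00
    }
}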
If you want to store it in a SQL Server database, ADO.Net automatically converts the CLR System.DateTime datatype to a SQL Server datetime datatype (and vice-versa).
The CLR System.DateTime has 100-nanosecond precision (i.e., each tick is 100 nanoseconds; there are 10,000 ticks per millisecond and 10 million ticks per second).
The SQL Server datetime datatype is precise to (approximately) 3ms.
You shouldn't need to worry about it: ADO.Net will take care of it for you.
OTOH, if you really want to throw away extra nanoseconds, something like this ought to do the trick:
public static DateTime ToExactMillisecondPrecision(DateTime dt)
{
    const long TICKS_PER_MILLISECOND = 10000;
    long totalMilliseconds = dt.Ticks / TICKS_PER_MILLISECOND;
    return new DateTime(totalMilliseconds * TICKS_PER_MILLISECOND);
}
Can't really see the need myself.
Look under the properties list in this link. All the different options are there.
http://msdn.microsoft.com/en-us/library/system.datetime.aspx
Including seconds, milliseconds, and ticks
The string you posted contains seconds, so I suppose you're not asking for second precision, but for more precise timing.
The value of DateTime.Now is returned with more than millisecond precision; it's just that with the default formatting the milliseconds aren't displayed. To display the value with milliseconds, you can either use the o standard format string, or write your own custom format string that includes the millisecond format specifier fff.
Note that just because the returned value is precise doesn't mean it is equally accurate. The actual accuracy is not defined exactly, but it tends to be in the tens of milliseconds.
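If the goal is to measure short elapsed intervals rather than to record wall-clock time, Stopwatch is the usual tool, since it uses the high-resolution performance counter instead of the system clock. A short sketch (DoWork is a placeholder):

using System;
using System.Diagnostics;

class ElapsedTimeSketch
{
    static void Main()
    {
        // DateTime.Now is fine for timestamps, but its accuracy is limited by
        // the system clock; for short elapsed intervals prefer Stopwatch.
        var sw = Stopwatch.StartNew();
        DoWork();
        sw.Stop();

        Console.WriteLine($"Elapsed: {sw.Elapsed.TotalMilliseconds:F3} ms");
    }

    static void DoWork() { /* the code being timed goes here */ }
}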
It should not be necessary to convert the date to a string. Perhaps the real problem is that you are using dynamic SQL.

Timestamp as UTC Integer

I have a legacy database with a field containing an integer representing a datetime in UTC.
From the documentation:
"Timestamps within a CDR appear in Universal Coordinated Time (UTC). This value remains
independent of daylight saving time changes"
An example of a value is 1236772829.
My question is what is the best way to convert it to a .NET DateTime (in CLR code, not in the DB), both as the UTC value and as a local time value.
I have tried to Google it but without any luck.
You'll need to know what the integer really means. This will typically consist of:
An epoch/offset (i.e. what 0 means) - for example "midnight Jan 1st 1970"
A scale, e.g. seconds, milliseconds, ticks.
If you can get two values and what they mean in terms of UTC, the rest should be easy. The simplest way would probably be to have a DateTime (or DateTimeOffset) as the epoch, then construct a TimeSpan from the integer, e.g. TimeSpan.FromMilliseconds etc. Add the two together and you're done. EDIT: Using AddSeconds or AddMilliseconds as per aakashm's answer is a simpler way of doing this bit :)
Alternatively, do the arithmetic yourself and call the DateTime constructor which takes a number of ticks. (Arguably the version taking a DateTimeKind as well would be better, so you can explicitly state that it's UTC.)
Googling that exact phrase gives me this Cisco page, which goes on to say "The field specifies a time_t value that is obtained from the operating system."
time_t is a C library concept which strictly speaking doesn't have to be any particular implementation, but typically for UNIX-y systems will be the number of seconds since the start of the Unix epoch, 1970 January 1 00:00.
Assuming this to be right, this code will give you a DateTime from an int:
DateTime epochStart = new DateTime(1970, 1, 1);
int cdrTimestamp = 1236772829;
DateTime result = epochStart.AddSeconds(cdrTimestamp);
// Now result is 2009 March 11 12:00:29
You should sanity check the results you get to confirm that this is the correct interpretation of these time_ts.
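Putting the two answers together, marking the epoch as UTC makes the Kind explicit, and ToLocalTime then gives the local-time view the question asks for. A sketch:

using System;

class CdrTimestampSketch
{
    static void Main()
    {
        int cdrTimestamp = 1236772829;

        // Epoch marked as UTC so the resulting DateTime has Kind = Utc.
        DateTime epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

        DateTime utc = epochStart.AddSeconds(cdrTimestamp); // 2009 March 11 12:00:29 UTC
        DateTime local = utc.ToLocalTime();                 // same instant in the local time zone

        Console.WriteLine(utc);
        Console.WriteLine(local);
    }
}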
