When I compare a DateTime variable with SqlDateTime.MinValue:
if (StartDate > SqlDateTime.MinValue)
{
// some code
}
I get the following runtime exception if StartDate is < SqlDateTime.MinValue:
SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and
12/31/9999 11:59:59 PM.
This can be easily solved with a small change:
if (StartDate > SqlDateTime.MinValue.Value)
{
// some code
}
I understand that in the first code snippet I'm comparing apples to oranges. What I don't understand is the exception message. It seems like I'm assigning a DateTime value to a SqlDateTime variable.
What am I missing?
The native .NET DateTime type (to be specific, it's a structure) holds a broader range of possible values than the SqlDateTime data type can support. More specifically, a DateTime value can range from 1/1/0001 to 12/31/9999.
When the compiler coerces the types for the comparison, it implicitly converts the DateTime value (StartDate) to a SqlDateTime. If that value lies outside (below, or 'before' in this context) the range SqlDateTime supports, the conversion overflows - hence the exception.
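To see the conversion in action, here is a minimal sketch (the 1700 date is just an arbitrary out-of-range example):

```csharp
using System;
using System.Data.SqlTypes;

class Program
{
    static void Main()
    {
        // Arbitrary example value below SqlDateTime's 1/1/1753 minimum
        DateTime startDate = new DateTime(1700, 1, 1);

        try
        {
            // startDate is implicitly converted to SqlDateTime here...
            if (startDate > SqlDateTime.MinValue) { }
        }
        catch (SqlTypeException ex)
        {
            // ...and the conversion, not the comparison itself, throws the overflow
            Console.WriteLine(ex.Message);
        }

        // Comparing DateTime to DateTime involves no conversion at all
        Console.WriteLine(startDate > SqlDateTime.MinValue.Value); // False
    }
}
```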
From SqlDateTime Structure on MSDN:
SqlDateTime structure
Represents the date and time data ranging in value from January 1, 1753 to December 31, 9999 to an accuracy of 3.33 milliseconds to be stored in or retrieved from a database. The SqlDateTime structure has a different underlying data structure from its corresponding .NET Framework type, DateTime, which can represent any time between 12:00:00 AM 1/1/0001 and 11:59:59 PM 12/31/9999, to the accuracy of 100 nanoseconds. SqlDateTime actually stores the relative difference to 00:00:00 AM 1/1/1900. Therefore, a conversion from "00:00:00 AM 1/1/1900" to an integer will return 0.
Related
I have been having some fun with DateTime parsing from strings in .NET MVC, and I have identified some curious behaviour. Look at this test:
[Test]
public void DoesItWork()
{
DateTime theTime = DateTime.Now;
DateTime theUTCTime = theTime.ToUniversalTime();
Assert.IsTrue(theTime==theUTCTime);
}
I'm in the UK right now, and it's BST, so I expect the UTC time to be an hour behind the value of DateTime.Now. So it is. But when I call .ToUniversalTime() on my initial date time, as well as subtracting an hour, the value's Kind property is also updated - from Local to Utc. This is also what I would expect.
But when I come to compare the values of these two DateTime variables, the equality operator doesn't take account of the different Kind values, and simply reports that they're different values. To me, this seems flat out wrong.
Can anyone elucidate why it works this way?
According to MSDN and MSDN2 when you compare two DateTime values:
Remarks
To determine the relationship of the current instance to value, the CompareTo method compares the Ticks property of the current instance and value but ignores their Kind property. Before comparing DateTime objects, make sure that the objects represent times in the same time zone. You can do this by comparing the values of their Kind properties.
Remarks
The Equality operator determines whether two DateTime values are equal by comparing their number of ticks. Before comparing DateTime objects, make sure that the objects represent times in the same time zone. You can do this by comparing the values of their Kind property.
So it's correct.
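In other words, normalize both values to the same Kind before comparing; a quick sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        DateTime theTime = DateTime.Now;
        DateTime theUTCTime = theTime.ToUniversalTime();

        // Equality compares Ticks only and ignores Kind,
        // so this is False whenever the local offset is nonzero (e.g. during BST):
        Console.WriteLine(theTime == theUTCTime);

        // Convert both to the same frame of reference first:
        Console.WriteLine(theTime.ToUniversalTime() == theUTCTime); // True
    }
}
```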
Link to DateTime.Kind Property and again from the remark:
The Kind property allows a DateTime value to clearly reflect either Coordinated Universal Time (UTC) or the local time. In contrast, the DateTimeOffset structure can unambiguously reflect any time in any time zone as a single point in time.
UPDATE
Regarding your comment: IMHO it is expected behavior, because usually you don't need to compare two DateTimes from different time zones. If you do need to, though, you have to use the DateTimeOffset structure, which is:
Remarks
The DateTimeOffset structure includes a DateTime value, together with an Offset property that defines the difference between the current DateTimeOffset instance's date and time and Coordinated Universal Time (UTC). Because it exactly defines a date and time relative to UTC, the DateTimeOffset structure does not include a Kind member, as the DateTime structure does. It represents dates and times with values whose UTC ranges from 12:00:00 midnight, January 1, 0001 Anno Domini (Common Era), to 11:59:59 P.M., December 31, 9999 A.D. (C.E.).
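A short illustration of the difference (the specific date and offsets are my own examples):

```csharp
using System;

class Program
{
    static void Main()
    {
        // The same instant expressed with two different offsets (13:00 BST == 12:00 UTC)
        var bst = new DateTimeOffset(2016, 8, 31, 13, 0, 0, TimeSpan.FromHours(1));
        var utc = new DateTimeOffset(2016, 8, 31, 12, 0, 0, TimeSpan.Zero);

        // DateTimeOffset equality compares the point in time, offsets included:
        Console.WriteLine(bst == utc); // True

        // DateTime equality, by contrast, compares the Ticks alone:
        Console.WriteLine(bst.DateTime == utc.DateTime); // False
    }
}
```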
How do I remove the time from a DateTime? On the date column it's only the display format.
I store the value from a repository combobox dropdown, and it stores the value including the time. How do I remove the time?
I know there are many questions about this, but the solutions all convert it with date.ToString("dd MMM yyyy"). Is there a solution besides converting it to a string? I want the value to remain a DateTime, not a string conversion.
The code I am using still gives me a time:
DateTime date = Convert.ToDateTime(gridView1.GetDataRow(i)["date"]);
You just forgot to take the Date at the end of the conversion:
DateTime date = Convert.ToDateTime(gridView1.GetDataRow(i)["date"]).Date;
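Note that .Date doesn't strip the time storage away; it returns a new DateTime with the time component set to midnight, so formatting still decides what gets displayed. A quick sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        DateTime stamp = new DateTime(2016, 8, 31, 9, 25, 0);
        DateTime dateOnly = stamp.Date;

        Console.WriteLine(dateOnly.TimeOfDay);               // 00:00:00 - still there, just zeroed
        Console.WriteLine(dateOnly.ToString("dd MMM yyyy")); // 31 Aug 2016
    }
}
```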
DateTime, as the name implies, stores both date and time.
You cannot remove the time part from a DateTime, because the time is an integral part of it.
To understand this, you have to understand how the date and time are stored. Internally (in the serial-number representation this example uses, as in OLE Automation dates), the date and time are stored as a fractional number. 24 hours are represented by the numeric value 1, so when the value increases by 1 the date advances by 1 day; if the value increases by 0.5 the date advances by 12 hours (half a day).
So a value of 42613.00 means 31 Aug 2016 at midnight (just as the day starts), 42613.25 means 6 AM of 31 Aug 2016, 42613.50 means 12 noon of 31 Aug 2016 (and 42613.39236 means 9:25:00 AM of 31 Aug 2016).
The smallest fraction of time that needs to be stored is 1 millisecond. That means the values of a DateTime field need a precision finer than 0.0000000115740740740741. But this value has no exact finite representation in binary (the nearest match is 1.00000000000000000000000000110001101101011101010000111010111111..., where ... means there are more digits), so milliseconds are stored as their nearest approximations.
That said, if you wish to keep only the date part, you can create your own class or struct that stores the date part of the DateTime, overrides the operators for date arithmetic, and provides implicit conversions to DateTime for any code that expects a DateTime.
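A minimal sketch of such a wrapper (the DateOnlyValue name and its members are mine, not a framework type):

```csharp
using System;

// Hypothetical date-only wrapper around DateTime (illustrative, not a framework type)
struct DateOnlyValue
{
    private readonly DateTime _date;

    public DateOnlyValue(DateTime value) => _date = value.Date; // discard the time part

    // Date arithmetic stays date-only
    public DateOnlyValue AddDays(int days) => new DateOnlyValue(_date.AddDays(days));

    // Implicit conversion for code that expects a DateTime
    public static implicit operator DateTime(DateOnlyValue d) => d._date;

    public override string ToString() => _date.ToString("dd MMM yyyy");
}

class Program
{
    static void Main()
    {
        var d = new DateOnlyValue(new DateTime(2016, 8, 31, 9, 25, 0));
        Console.WriteLine(d);              // 31 Aug 2016
        DateTime back = d;                 // implicit conversion; time is midnight
        Console.WriteLine(back.TimeOfDay); // 00:00:00
    }
}
```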
I have a datetime value to pass into a SqlParameter.
DateTime object value
However, when I pass it to myCmd.Parameters.Add("@TrxDate", adt_TrxDate);
the SqlParameter's SqlValue and Value properties hold different values.
I found that it adds a day when executing the query. I noticed this in Profiler and finally worked out that the SqlParameter's SqlValue and Value properties differ. Why does this happen, and any ideas?
The root cause is that .NET's DateTime has higher precision than SQL Server's DateTime, so it is rounded off. From SQL Server 2008 onwards, DateTime2 supports higher precision.
Since the data type in SQL is DateTime, SqlParameter rounds the .NET DateTime to the nearest SQL DateTime. Because the rounding algorithm works in increments of .000, .003 and .007 seconds, it may add or remove milliseconds; you can look at this SO question for more details: Undesired rounding of DateTime in SQL Server. So if the value sits on the boundary between two days, the added milliseconds can push it into the next day.
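The rounding itself can be seen without a database by converting through SqlDateTime (the sample value is my own):

```csharp
using System;
using System.Data.SqlTypes;

class Program
{
    static void Main()
    {
        // SQL Server's datetime rounds to increments of .000, .003 and .007 seconds
        DateTime fine = new DateTime(2016, 8, 31, 12, 0, 0).AddMilliseconds(2);

        SqlDateTime rounded = new SqlDateTime(fine);

        Console.WriteLine(fine.Millisecond);          // 2
        Console.WriteLine(rounded.Value.Millisecond); // 3 - rounded to the nearest 1/300 s
    }
}
```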
From MSDN,
https://msdn.microsoft.com/en-us/library/system.data.sqldbtype(v=vs.110).aspx
DateTime. Date and time data ranging in value from January 1, 1753 to December 31, 9999 to an accuracy of 3.33 milliseconds.
DateTime2
Date and time data. Date value range is from January 1,1 AD through December 31, 9999 AD. Time value range is 00:00:00 through 23:59:59.9999999 with an accuracy of 100 nanoseconds.
So I guess if you change the parameter type to DateTime2 it will preserve the actual time. However, unless the column's type in SQL Server is DateTime2, this will have no effect.
You have to specify the SqlDbType for the parameter, like the following:
myCmd.Parameters.Add("@TrxDate", SqlDbType.Date).Value = adt_TrxDate;
Use SqlDbType.DateTime if the column is DateTime.
Or you can use AddWithValue:
myCmd.Parameters.AddWithValue("@TrxDate", adt_TrxDate);
You can refer to this thread for the difference between the two.
This was born from my previous question
I have a DateTime in C#.
This value is inserted into the database.
After that I select the value and check that the date is the same as it was at the beginning.
What is the best way to do this?
Since SQL datetime has a different tick length, the DateTime from the first step will not be the same as the SQL DateTime (row["MyDate"]).
How to compare them?
Subtract one from the other and check that the ticks of the resulting TimeSpan are within acceptable limits for the difference in tick length.
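A sketch of that tolerance check (the 1/300-second limit reflects SQL Server datetime's tick length):

```csharp
using System;

class Program
{
    // SQL Server datetime ticks are 1/300 s (~3.33 ms), so allow that much drift
    static bool RoughlyEqual(DateTime a, DateTime b) =>
        Math.Abs((a - b).TotalMilliseconds) <= 10.0 / 3.0;

    static void Main()
    {
        DateTime original     = new DateTime(2016, 8, 31, 12, 0, 0, 2);
        DateTime roundTripped = new DateTime(2016, 8, 31, 12, 0, 0, 3); // as SQL stored it

        Console.WriteLine(original == roundTripped);             // False
        Console.WriteLine(RoughlyEqual(original, roundTripped)); // True
    }
}
```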
You can use the SqlDateTime structure.
DateTime now = DateTime.Now;
SqlDateTime sqlNow = new SqlDateTime(now);
bool equal = now == sqlNow.Value; // false
So if you have a DateTime and want to know if it's equal to a DB-DateTime use:
Assert.Equal(dbEndTime, new SqlDateTime(endTime).Value); // true
SqlDateTime:
Represents the date and time data ranging in value from January 1,
1753 to December 31, 9999 to an accuracy of 3.33 milliseconds to be
stored in or retrieved from a database. The SqlDateTime structure has
a different underlying data structure from its corresponding .NET
Framework type, DateTime, which can represent any time between
12:00:00 AM 1/1/0001 and 11:59:59 PM 12/31/9999, to the accuracy of
100 nanoseconds. SqlDateTime actually stores the relative difference
to 00:00:00 AM 1/1/1900. Therefore, a conversion from "00:00:00 AM
1/1/1900" to an integer will return 0.
If you can ignore the millisecond difference, you can try this:
SELECT * FROM MyTable WHERE DATEADD(ms, -DATEPART(ms, endTime), endTime) = @value
The following code compiles fine with comparison operator.
if (dateTimeVariable > SqlDateTime.MinValue) // compiles OK. dateTimeVariable is of type DateTime
{
}
However, the following code fails to compile.
DateTime dateTimeVariable = SqlDateTime.MinValue;
// Compile error: cannot convert source type SqlDateTime to DateTime. Which is obvious.
My question is: why is comparison allowed between SqlDateTime and DateTime types but not assignment (unless the comparison operators are doing some implicit conversion)?
I'm guessing I must be missing something really basic.
There's an implicit conversion in SqlDateTime that takes care of converting a DateTime to an SqlDateTime without any additional work:
public static implicit operator SqlDateTime(DateTime value)
{
return new SqlDateTime(value);
}
// SqlDateTime mySqlDate = DateTime.Now
What must be happening is that dateTimeVariable is being implicitly converted from a DateTime to an SqlDateTime for the comparison:
if (dateTimeVariable > SqlDateTime.MinValue)
{
// if dateTimeVariable, after conversion to an SqlDateTime, is greater than the
// SqlDateTime.MinValue, this code executes
}
But in the case of the following code, there's nothing that allows you to simply stuff an SqlDateTime into a DateTime variable, so it doesn't allow it.
DateTime dateTimeVariable = SqlDateTime.MinValue; // fails
Cast your initial value and it will compile okay, but there's a chance you're going to lose some valuable information that is part of an SqlDateTime but not a DateTime.
DateTime dateTimeVariable = (DateTime)SqlDateTime.MinValue;
This is a question of potential loss of precision. Usually this occurs in the context of "narrowing" versus "widening".
Integers are a subset of numbers. All integers are numbers, some numbers are not integers. Thus, the type "number" is wider than the type "integer".
You can always assign a type to a wider type without losing information.
Narrowing is another matter. To assign 1.3 to an integer you must lose information. This is possible but the compiler won't perform a narrowing conversion unless you explicitly state that this is what you want.
As a result, assignments that require a widening conversion are automatically and implicitly converted, but narrowing assignments require explicit casting or conversion (not all conversions are simple casting).
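The same rule is easy to see with numbers:

```csharp
using System;

class Program
{
    static void Main()
    {
        int i = 42;
        double wide = i;        // widening: implicit, no information can be lost

        double x = 1.3;
        // int bad = x;         // does not compile: narrowing must be explicit
        int narrow = (int)x;    // explicit cast; the fraction is discarded

        Console.WriteLine(wide);   // 42
        Console.WriteLine(narrow); // 1
    }
}
```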
Although SqlDateTime is arguably narrower than DateTime, differences in representation mean that conversions in both directions are potentially lossy. As a result, assigning a SqlDateTime to a DateTime requires an explicit conversion. Strictly speaking, converting a DateTime to a SqlDateTime ought to require an explicit conversion too, but the implicit conversion implemented in the SqlDateTime type (q.v. Grant's answer) makes SqlDateTime behave as though it were wider. I made the mistake of assuming SqlDateTime was wider because that's how it behaves in this case; many kudos to the commenters for picking out this important subtlety.
This implicit conversion thing is actually a bit of an issue with VARCHAR columns and ADO.NET implicitly typed parameters: C# strings are Unicode and become NVARCHAR, so comparing them to an indexed column of type VARCHAR causes a widening conversion to NVARCHAR (implicit widening conversions also occur in T-SQL), which can prevent the use of the index. That won't stop the query from returning the correct results, but it will cripple performance.
From MSDN
SqlDateTime Structure
Represents the date and time data ranging in value from January 1, 1753 to December 31, 9999 to an accuracy of 3.33 milliseconds to be stored in or retrieved from a database. The SqlDateTime structure has a different underlying data structure from its corresponding .NET Framework type, DateTime, which can represent any time between 12:00:00 AM 1/1/0001 and 11:59:59 PM 12/31/9999, to the accuracy of 100 nanoseconds. SqlDateTime actually stores the relative difference to 00:00:00 AM 1/1/1900. Therefore, a conversion from "00:00:00 AM 1/1/1900" to an integer will return 0.