I have a DateTime variable that holds the following value: 5/11/2014 7:56:26 am
I am currently using the following code to format this as a string: uDate.ToString("s"). Using this code, I get the following value: 2014-11-05T07:56:26
I need this to be more precise. The exact value I am trying to get includes a fraction of a second, like this: 2014-11-05T07:56:26.4
I have done some research at the following link: http://www.w3.org/TR/NOTE-datetime
Here is what I have found:
This profile does not specify how many digits may be used to represent
the decimal fraction of a second. An adopting standard that permits
fractions of a second must specify both the minimum number of digits
(a number greater than or equal to one) and the maximum number of
digits (the maximum may be stated to be "unlimited").
The value I want to produce is the one Azure uses when displaying a DateTime variable in a web service request.
How can I format my own DateTime strings so that they are the same format as the format used on Azure? Is there a specific format that Azure uses?
Basically, how can I format a DateTime string with more precision than uDate.ToString("s") gives me?
Thanks in advance.
I believe the most "complete" representation comes from using the "O" specifier, which includes the fractional seconds and also the time zone offset (when available).
var myDate = DateTime.Now.ToString("o");
See also this article. Microsoft states that it "complies with ISO 8601", and every .NET platform (including Azure) can read this string back as a DateTime and also properly restore the DateTimeKind when the information is available.
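A minimal sketch of that round trip (the number of fractional digits shown for DateTime.Now depends on the system clock's resolution, but "o" always emits seven):
using System;
using System.Globalization;

DateTime myDate = DateTime.Now;

// "o" always emits seven fractional digits and, for Local/Utc values,
// a UTC offset or "Z" suffix, e.g. 2014-11-05T07:56:26.4000000+01:00
string iso = myDate.ToString("o");
Console.WriteLine(iso);

// DateTimeStyles.RoundtripKind restores the DateTimeKind when parsing back.
DateTime restored = DateTime.Parse(iso, CultureInfo.InvariantCulture, DateTimeStyles.RoundtripKind);
Console.WriteLine(restored.Kind); // Local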
"s" standard format specifier uses SortableDateTimePattern property of your CurrentCulture and it doesn't includes miliseconds part.
This format specifier is always the same regardless the culture you use and it is "yyyy'-'MM'-'dd'T'HH':'mm':'ss" format.
You can just use custom date and time format strings like;
uDate.ToString("yyyy-MM-dd'T'HH:mm:ss.f");
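For illustration, each additional f adds one more fractional digit; the uDate value below is just a made-up sample matching the question:
using System;
using System.Globalization;

DateTime uDate = new DateTime(2014, 11, 5, 7, 56, 26, 400); // hypothetical sample value

Console.WriteLine(uDate.ToString("yyyy-MM-dd'T'HH:mm:ss.f", CultureInfo.InvariantCulture));
// 2014-11-05T07:56:26.4
Console.WriteLine(uDate.ToString("yyyy-MM-dd'T'HH:mm:ss.fff", CultureInfo.InvariantCulture));
// 2014-11-05T07:56:26.400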
Related
Will the following DateTime format, "%M/%d/yyyy %H:%m:%s", cover both of these lines, i.e. with and without leading zeros:
Line 1: 4/8/2022 7:6:3
Line 2: 04/08/2022 07:06:03
It seems to be working, but a pointer to the related documentation would be welcome.
The related documentation can be found here: https://learn.microsoft.com/en-us/dotnet/standard/base-types/custom-date-and-time-format-strings
Note the description of the % symbol: "Defines the following character as a custom format specifier".[1]
Since you have a custom date time format string, the symbols M, d, H, ... are custom format specifiers. This means, here % essentially becomes a no-operation without any effect, because the symbols following it are already custom format specifiers.
So, what exactly is the purpose of % if the symbols in a custom date time format string are already custom format specifiers regardless of % being there or not? The reason for % becomes understandable when you consider that there are also standard date time format strings, which consist of a single character, a single format specifier. Pertinent documentation here: https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-date-and-time-format-strings
Basically, any date time format string made of only one character is treated as a standard date time format string. And any date time format string with two or more characters is treated as a custom date time format string.
What if you want to use a custom date time format string consisting of only one custom format specifier? That one-character string will be interpreted as a standard date time format string instead. And that is a problem.
If you compare the lists of specifiers for standard and custom date time format strings, you'll notice that many of the standard date time format specifiers use symbols that are also used by custom date time format specifiers. However, standard date time format specifiers represent different data and/or formatting patterns than the respective custom date time format specifier using the same symbol. For example, the standard date time format specifier y yields year+month, while the custom date time format specifier y yields the last two digits of the year.
Therefore, if you need a functionally single-specifier custom date time format string, you gotta fatten up that string and turn it from a one-character string into a two-characters string with the help of the "no-op" specifier %, so that it will be correctly treated as a custom date time format string.
As an example, imagine you want to get just the last two digits of the year and nothing more, and you decide to use the custom format specifier y which does exactly what you want. However, the format string "y" is a standard date time format string yielding year+month. So, how do you get what you want? You turn the standard date time format string "y" into a custom date time format string by using "%y".
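A small sketch of that difference, using the invariant culture so the output is predictable:
using System;
using System.Globalization;

DateTime date = new DateTime(2022, 4, 8, 7, 6, 3);

// "y" alone is a *standard* format string: the year/month pattern.
Console.WriteLine(date.ToString("y", CultureInfo.InvariantCulture));  // 2022 April

// "%y" forces the *custom* specifier: the last two digits of the year.
Console.WriteLine(date.ToString("%y", CultureInfo.InvariantCulture)); // 22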
[1] According to that documentation, it should theoretically be possible to use contiguous sequences of multiple % in a custom date time format string like "%%%%%M/%%%%%%d". Each of those % sequences should functionally collapse into a single %, because by definition, according to the quoted documentation, each % defines the following % as the custom format specifier it already is. However -- and for the better, I might add -- the DateTime formatting functions will have none of such shenanigans and will throw a FormatException at you for even trying...
It seems that C# does not manage to parse a time in a valid RFC 3339 format:
DateTime.ParseExact("2019-12-31T00:00:00.123456789+01:00", "yyyy'-'MM'-'dd'T'HH':'mm':'ss'.'fffffffffzzz", null)
This line throws an exception, while this line works just fine:
DateTime.ParseExact("2019-12-31T00:00:00.1234567+01:00", "yyyy'-'MM'-'dd'T'HH':'mm':'ss'.'fffffffzzz", null)
So it seems there is a limit on the number of fractional-second digits, but I cannot find any documentation on that. Is this how it is supposed to be?
The reason I want to parse this date is that I have an input date field. We use the OAS (Swagger) date-time format, which quite clearly says that any date in the RFC 3339 Internet Date/Time format should be valid. From the spec, section 5.6:
time-secfrac = "." 1*DIGIT
As far as I understand, this means that up to 9 digits should be allowed, and to be 100% compliant we have to allow these inputs, but it does not seem that C# even supports that.
Any ideas on how to fix it?
Per the MSDN documentation, you can use at most fffffff:
The fffffff custom format specifier represents the seven most
significant digits of the seconds fraction; that is, it represents the
ten millionths of a second in a date and time value.
In your first example
DateTime.ParseExact("2019-12-31T00:00:00.123456789+01:00", "yyyy'-'MM'-'dd'T'HH':'mm':'ss'.'fffffffffzzz", null)
you are using fffffffff, which asks for more precision than .NET custom date and time format strings support.
As far as I know, .NET supports at most the seven most significant digits of the seconds fraction, which is what the "fffffff" custom format specifier is for.
The "fffffff" custom format specifier represents the seven most
significant digits of the seconds fraction; that is, it represents the
ten millionths of a second in a date and time value.
Although it's possible to display the ten millionths of a second
component of a time value, that value may not be meaningful. The
precision of date and time values depends on the resolution of the
system clock.
That means you are supplying precision that is not meaningful and is not supported by the .NET Framework. I strongly suggest not doing that.
In addition to the information in the other answers, if you cannot change your input and you still want to parse it, you may use one of the following solutions:
If your input will always be in the same format (i.e., has 9 seconds-fraction digits), you could just remove the two extra ones and proceed to parse it:
string input = "2019-12-31T00:00:00.123456789+01:00";
input = input.Remove(27, 2); // drop the 8th and 9th fraction digits
DateTime parsedDate = DateTime.ParseExact(input, "yyyy'-'MM'-'dd'T'HH':'mm':'ss'.'fffffffzzz", null);
If you don't know the number of the seconds-fraction digits beforehand, you may use something like this:
string input = "2019-12-31T00:00:00.123456789+01:00";
string format = "yyyy'-'MM'-'dd'T'HH':'mm':'ss'.'FFFFFFFzzz";
var regEx = new Regex(@"^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{1,7})\d*"); // requires using System.Text.RegularExpressions;
input = regEx.Replace(input, "$1");
DateTime parsedDate = DateTime.ParseExact(input, format, null);
I have a timestamp coming from C# (a 64-bit binary value) that I need to convert in PHP.
The timestamp is created with DateTime.ToBinary() method.
An example is -8586307756854775808, which corresponds to 10/11/2019 10:00:00 PM.
I tried with unpack() in PHP but without success:
$timestamp = unpack('q', -8586307756854775808);
How can I do this in PHP?
Thanks
First, separate the ticks from the 64-bit binary value.
The ticks in the 64-bit value are the same regardless of whether the DateTime was created in C# with DateTimeKind.Utc or DateTimeKind.Local.
Then the ticks can be converted to a Unix timestamp.
gmdate() returns a date string from that timestamp.
<?php
//num from c# DateTime.ToBinary()
//2019-10-11 22:00:00
$num = -8586307756854775808;
$ticks = $num & 0x3fffffffffffffff;
$linuxTs = $ticks/10000000 - 62135596800;
$strDate = gmdate("Y-m-d H:i:s",$linuxTs);
echo $strDate; //"2019-10-11 22:00:00"
Note: This solution requires a 64-bit platform.
The kind information from the 64-bit value $num may also be useful.
$kind = (ord(pack('J',$num)) & 0xC0)>>6; //0,1 or 2
DateTime.ToBinary creates a value that combines the .Ticks and .Kind property into a single 64-bit value. You shouldn't use it unless you intend to pass that value back into DateTime.FromBinary in .NET.
While it is possible to interpret it outside of .NET, one would have to apply the same logic that is in the implementation of DateTime.FromBinary. You can see the source code here.
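A minimal sketch of the intended round trip; the exact binary value produced for a Local DateTime depends on the machine's time zone, so the number from the question is only illustrative:
using System;

DateTime original = new DateTime(2019, 10, 11, 22, 0, 0, DateTimeKind.Local);

// Packs .Ticks and .Kind into one signed 64-bit value.
long binary = original.ToBinary();
Console.WriteLine(binary);

// FromBinary reverses the packing and restores the Kind.
DateTime restored = DateTime.FromBinary(binary);
Console.WriteLine(restored == original); // True
Console.WriteLine(restored.Kind);        // Local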
Instead, consider using a standard interchange format. The most common is ISO 8601 (also covered in part by RFC 3339).
string iso = dateTime.ToString("O");
The above uses the "round-trip" specifier to get the ISO 8601 format. There are other ways as well.
If your values are UTC based, you could also consider passing a Unix Timestamp. The simplest way to get one from .NET is through DateTimeOffset.ToUnixTimeSeconds or DateTimeOffset.ToUnixTimeMilliseconds depending on what precision you want. Of course, you will need to convert your DateTime to a DateTimeOffset first.
long ms = new DateTimeOffset(dateTime).ToUnixTimeMilliseconds();
The above assumes that your DateTime has either DateTimeKind.Local or DateTimeKind.Utc in its .Kind property. If it is DateTimeKind.Unspecified, then you may need to work with TimeZoneInfo to get aligned to UTC before figuring out the Unix Timestamp.
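A rough sketch of that last case; the time zone id used here ("Central European Standard Time", a Windows id) is only an assumption for illustration and should be replaced with whatever zone your unspecified values actually represent:
using System;

// A value read from a file or database often arrives as DateTimeKind.Unspecified.
DateTime unspecified = new DateTime(2019, 10, 11, 22, 0, 0, DateTimeKind.Unspecified);

// Decide which zone the value really belongs to (assumption: Central European time).
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Central European Standard Time");
DateTimeOffset withOffset = new DateTimeOffset(unspecified, tz.GetUtcOffset(unspecified));

long ms = withOffset.ToUnixTimeMilliseconds();
Console.WriteLine(ms);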
I have a windows service application in which I am getting the current date and time using DateTime.Now.ToString(), which returns '04-05-2018 05:50:12'.
I tried the same in a sample console application, but it returns the date in a different format: '5/4/2018 5:51:32 AM'.
Both applications are being executed on the same machine. Can someone let me know why there is a date format difference between these applications?
DateTime.ToString() formats the DateTime according to the current culture. As written in the documentation:
Converts the value of the current DateTime object to its equivalent
string representation using the formatting conventions of the current
culture.(Overrides ValueType.ToString().)
If you want the same string in both applications, use the DateTime.ToString(String) overload instead and provide the exact format you want.
The ToString(String) method returns the string representation of a
date and time value in a specific format that uses the formatting
conventions of the current culture; for more information, see
CultureInfo.CurrentCulture.
The format parameter should contain either a single format specifier
character (see Standard Date and Time Format Strings) or a custom
format pattern (see Custom Date and Time Format Strings) that defines
the format of the returned string. If format is null or an empty
string, the general format specifier, 'G', is used.
Some uses of this method include:
Getting a string that displays the date and time in the current
culture’s short date and time format. To do this, you use the “G”
format specifier.
Getting a string that contains only the month and year. To do this,
you use the “MM/yyyy” format string. The format string uses the
current culture’s date separator.
Getting a string that contains the date and time in a specific format.
For example, the “MM/dd/yyyyHH:mm” format string displays the date and
time string in a fixed format such as “19//03//2013 18:06". The format
string uses “/” as a fixed date separator regardless of
culture-specific settings.
Getting a date in a condensed format that could be used for
serializing a date string. For example, the "yyyyMMdd" format string
displays a four-digit year followed by a two-digit month and a
two-digit day with no date separator.
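As a rough sketch of the difference in practice (the explicit pattern below is just an example; use whichever format both of your applications should share):
using System;
using System.Globalization;

DateTime now = DateTime.Now;

// Culture-dependent: the output differs between machines or processes
// running under different regional settings.
Console.WriteLine(now.ToString());

// Fixed pattern and culture: identical output everywhere.
Console.WriteLine(now.ToString("dd-MM-yyyy HH:mm:ss", CultureInfo.InvariantCulture));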
Basically, I am reading an Excel file where one of the columns holds dates in the format dd/MM/yyyy, e.g. 11/04/2016.
When I use DateTime.TryParse() to parse that string into a DateTime, TryParse() treats the first number as the month (11 in the example above). However, the same code running on other computers takes the second number (04 in the example above) as the month.
So my question is: why is there a difference between them, and what actually decides the behavior of the TryParse method?
I think the main difference is in the IFormatProvider (hard to say without being able to check the settings on the target system), but I usually use another method to get a proper DateTime object:
DateTime someDate = DateTime.ParseExact(myStringDate, "dd/MM/yyyy", System.Globalization.CultureInfo.InvariantCulture);
It always gives me what I want no matter how the client environment is configured.
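For instance, with the sample value from the question:
using System;
using System.Globalization;

string myStringDate = "11/04/2016";
DateTime someDate = DateTime.ParseExact(myStringDate, "dd/MM/yyyy", CultureInfo.InvariantCulture);

Console.WriteLine(someDate.ToString("yyyy-MM-dd")); // 2016-04-11 (11 April, not 4 November)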
Hope this helps. :)
From DateTime.TryParse(String, DateTime) documentation:
Because the DateTime.TryParse(String, DateTime) method tries to parse
the string representation of a date and time using the formatting
rules of the current culture, trying to parse a particular string
across different cultures can either fail or return different results.
If a specific date and time format will be parsed across different
locales, use the
DateTime.TryParse(String, IFormatProvider, DateTimeStyles, DateTime)
method or one of the overloads of the TryParseExact method and provide
a format specifier.
That means your computers have different culture settings, which are exposed through the CurrentCulture property.
It looks like one computer's current culture has dd/MM/yyyy and the other computer's current culture has MM/dd/yyyy as its standard date format.
Since you are sure your values are always in dd/MM/yyyy format, I would use DateTime.ParseExact instead of the DateTime.TryParse or DateTime.TryParseExact methods, like this:
var dt = DateTime.ParseExact(yourColumnValue, "dd/MM/yyyy", CultureInfo.InvariantCulture);
Or you could set every computer's current culture to match the first computer, but remember that CultureInfo data is not stable; it might change in the future with a Windows update, a .NET Framework version, or an OS version.
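If you would rather not risk an exception on a malformed cell, here is a TryParseExact sketch along the same lines (yourColumnValue is the same placeholder as above):
using System;
using System.Globalization;

string yourColumnValue = "11/04/2016"; // placeholder for the cell content

if (DateTime.TryParseExact(yourColumnValue, "dd/MM/yyyy",
        CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime dt))
{
    Console.WriteLine(dt.ToString("yyyy-MM-dd")); // 2016-04-11
}
else
{
    Console.WriteLine("Not a valid dd/MM/yyyy value: " + yourColumnValue);
}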