How to convert an integer to an unsigned 64-bit integer - C#

I'm converting this code from C# to Ruby:
C# Code
DateTime dtEpoch = new DateTime(1970, 01, 01, 0, 0, 0, 0, DateTimeKind.Utc);
string strTimeStamp = Convert.ToUInt64((DateTime.UtcNow - dtEpoch).TotalSeconds).ToString();
Ruby Code
now = Time.now.utc
epoch = Time.utc(1970,01,01, 0,0,0)
time_diff = ((now - epoch).to_s).unpack('Q').first.to_s
I need to convert the integer into an unsigned 64-bit integer. Is unpack really the way to go?

I'm not really sure what value your code returns, but it sure isn't seconds since the epoch.
Ruby stores dates and times internally as seconds since the epoch. Time.now.to_i will return exactly what you're looking for.
require 'date'
# Seconds from epoch to this very second
puts Time.now.to_i
# Seconds from epoch until today, 00:00
puts Date.today.to_time.to_i

In short, Time.now.to_i is enough.
Ruby internally stores Times in seconds since 1970-01-01T00:00:00.000+0000:
Time.now.to_f #=> 1439806700.8638804
Time.now.to_i #=> 1439806700
And you don't have to convert the value to something like ulong in C#, because Ruby automatically coerces the integer type so that it doesn't fight against your common sense.
A more verbose explanation: Ruby stores integers as instances of Fixnum if the number fits in 63 bits (not 64, weird huh?). If the number exceeds that size, Ruby automatically converts it to a Bignum, which has arbitrary size.
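For reference, here is a minimal C# sketch (assuming .NET 4.6 or later for ToUnixTimeSeconds) showing that the original C# line and Ruby's Time.now.to_i count the same thing:
// Seconds since 1970-01-01T00:00:00Z, computed two ways
DateTime dtEpoch = new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc);
// Original approach: subtract the epoch and convert to whole seconds
ulong manual = Convert.ToUInt64((DateTime.UtcNow - dtEpoch).TotalSeconds);
// Built-in shortcut, available from .NET 4.6 onward
long builtIn = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
// Both print the same count of seconds since the epoch (give or take rounding of the current second),
// which is exactly what Ruby's Time.now.to_i returns
Console.WriteLine(manual);
Console.WriteLine(builtIn);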

Related

Calculate number of ticks from outside of the .NET framework

Does anyone know how to calculate the number of .NET time ticks from outside of the .NET framework? My situation involves a Python script on Linux (my side) needing to send a timestamp as the number of ticks in a message destined for a C# process (the other side, which can't change its code). I can't find any libraries that do this... Any thoughts? Thanks!
You can easily determine what the Unix/Linux epoch (1970-01-01 00:00:00) is in DateTime ticks:
DateTime netTime = new DateTime(1970, 1, 1, 0, 0, 0);
Console.WriteLine(netTime.Ticks);
// Prints 621355968000000000
Since the Unix/Linux timestamp is the number of seconds since 1970-01-01 00:00:00, you can convert that to 100ns ticks by multiplying it by 10000000, and then convert it to DateTime ticks by adding 621355968000000000:
long netTicks = 621355968000000000L + 10000000L * linuxTimestamp;
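As a sanity check, here is the same arithmetic in both directions in C# (the Python side only needs the same two constants; the example timestamp is illustrative):
const long epochTicks = 621355968000000000L; // 1970-01-01 00:00:00 in DateTime ticks
const long ticksPerSecond = 10000000L;       // one tick = 100 ns
long linuxTimestamp = 1236772829;            // example Unix timestamp, in seconds
// Unix seconds -> .NET ticks
long netTicks = epochTicks + ticksPerSecond * linuxTimestamp;
// .NET ticks -> Unix seconds (the reverse direction)
long backToUnix = (netTicks - epochTicks) / ticksPerSecond;
Console.WriteLine(new DateTime(netTicks, DateTimeKind.Utc)); // 2009-03-11 12:00:29
Console.WriteLine(backToUnix);                               // 1236772829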

Another data type to store date and time

I'm creating a C# program that builds a data packet. One of the fields in the packet is the date and time, with a size of 4 bytes. I did this:
public byte[] convertDateTimetoHex4bytes(DateTime dt)
{
    // Convert date/time to the format 20091202041800
    string str = dt.ToString("yyyyMMddhhmmss");
    // Convert to Int32
    Int32 decValue = Convert.ToInt32(str);
    // Convert to bytes
    byte[] bytes = BitConverter.GetBytes(decValue);
    return bytes;
}
However, it seems that I need to use 8 bytes of data, since 32 bits is too small (it fails at run time). Can anyone help me find a smaller date/time format that fits in 4 bytes, or some other way to do this?
The error is expected: the given string value, when converted to a number, is simply too big to be represented by an Int32, and the extra information cannot be "shoved in".
20140616160000 -- requested number (i.e. DateTime for today)
2147483647 -- maximum Int32 value (2^31-1)
A Unix timestamp might work, as it is traditionally 32 bits, but it only has second precision and a limited range of 1970–2038. See How to convert a Unix timestamp to DateTime and vice versa? if interested in this approach.
The timestamp results in a smaller numeric range because it doesn't have unused numbers around the different time components; e.g. with the original format, numbers in the union of the following formats are not used and thus "wasted":
yyyyMMddhhmm60-yyyyMMddhhmm99 -- these seconds will never be
yyyyMMddhh60ss-yyyyMMddhh99ss -- these minutes will never be
yyyyMMdd24mmss-yyyyMMdd99mmss -- these hours will never be
-- etc.
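If the Unix-timestamp route is acceptable, a minimal sketch of packing a DateTime into 4 bytes might look like the following (the method names are made up for illustration):
static readonly DateTime Epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
// Pack a DateTime into 4 bytes as whole seconds since the Unix epoch.
// Only valid for dates between 1970 and 2038 (signed 32-bit range).
public static byte[] ToUnixBytes(DateTime dt)
{
    int seconds = (int)(dt.ToUniversalTime() - Epoch).TotalSeconds;
    return BitConverter.GetBytes(seconds); // 4 bytes, machine byte order
}
// Unpack the 4 bytes back into a UTC DateTime.
public static DateTime FromUnixBytes(byte[] bytes)
{
    return Epoch.AddSeconds(BitConverter.ToInt32(bytes, 0));
}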

What is the C time_t equivalent for C#?

There is a native method from a DLL written in C which takes a parameter of type time_t.
Is it possible to use a C# uint or ulong for this parameter?
I wouldn't say they are equivalent, but you can convert a time_t to a DateTime this way:
int t = 1070390676; // value of time_t in an ordinary integer
System.DateTime dt = new System.DateTime(1970, 1, 1).AddSeconds(t);
And this example is from How can I convert date-time data in time_t to C# DateTime class format?
I should also say that UInt32 is sometimes used for time_t, too; check DateTime to time_t.
Depends on how time_t was defined in the Standard C header files the DLL was compiled against.
If time_t is 64-bit, the C# equivalent is long.
If time_t is 32-bit, then it has the Year 2038 bug and you should ask whoever wrote the DLL for a non-buggy version.
According to Wikipedia's article on time_t, you could use an integer (Int32 or Int64):
Unix and POSIX-compliant systems implement time_t as an integer or real-floating type (typically a 32- or 64-bit integer) which represents the number of seconds since the start of the Unix epoch: midnight UTC of January 1, 1970 (not counting leap seconds).
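To make this concrete, here is a hedged P/Invoke sketch; the DLL name and export below are hypothetical, and it assumes this particular build uses a 64-bit time_t (for a 32-bit time_t the parameter would be int or uint instead):
using System;
using System.Runtime.InteropServices;
static class NativeTime
{
    // Hypothetical export: void set_start_time(time_t t);
    // If the DLL was built with a 64-bit time_t, long matches its width.
    [DllImport("legacy.dll", CallingConvention = CallingConvention.Cdecl)]
    static extern void set_start_time(long t);
    public static void SetStartTime(DateTime utc)
    {
        long secondsSinceEpoch =
            (long)(utc - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;
        set_start_time(secondsSinceEpoch);
    }
}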
Bastardo's solution did not help me. I was facing an issue with DST, so an additional conversion to local time was required; otherwise the resulting time was off by one hour. This is what I do:
return new DateTime(1970, 1, 1).ToLocalTime().AddSeconds(time_t);

Timestamp as UTC Integer

I have a legacy database with a field containing an integer representing a datetime in UTC.
From the documentation:
"Timestamps within a CDR appear in Universal Coordinated Time (UTC). This value remains
independent of daylight saving time changes"
An example of a value is 1236772829.
My question is what is the best way to convert it to a .NET DateTime (in CLR code, not in the DB), both as the UTC value and as a local time value.
I have tried to Google it, but without any luck.
You'll need to know what the integer really means. This will typically consist of:
An epoch/offset (i.e. what 0 means) - for example "midnight Jan 1st 1970"
A scale, e.g. seconds, milliseconds, ticks.
If you can get two values and what they mean in terms of UTC, the rest should be easy. The simplest way would probably be to have a DateTime (or DateTimeOffset) as the epoch, then construct a TimeSpan from the integer, e.g. TimeSpan.FromMilliseconds etc. Add the two together and you're done. EDIT: Using AddSeconds or AddMilliseconds as per aakashm's answer is a simpler way of doing this bit :)
Alternatively, do the arithmetic yourself and call the DateTime constructor which takes a number of ticks. (Arguably the version taking a DateTimeKind as well would be better, so you can explicitly state that it's UTC.)
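For example, assuming the integer turns out to be seconds since the Unix epoch (the epoch and scale still need to be confirmed, and the method name here is illustrative), a sketch of the epoch-plus-TimeSpan approach with the Kind stated explicitly:
static DateTime FromCdrValue(int value)
{
    // Epoch declared as UTC so the result's Kind is correct
    DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    return epoch + TimeSpan.FromSeconds(value);
}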
Googling that exact phrase gives me this Cisco page, which goes on to say "The field specifies a time_t value that is obtained from the operating system."
time_t is a C library concept which strictly speaking doesn't have to be any particular implementation, but typically for UNIX-y systems will be the number of seconds since the start of the Unix epoch, 1970 January 1 00:00.
Assuming this to be right, this code will give you a DateTime from an int:
DateTime epochStart = new DateTime(1970, 1, 1);
int cdrTimestamp = 1236772829;
DateTime result = epochStart.AddSeconds(cdrTimestamp);
// Now result is 2009 March 11 12:00:29
You should sanity check the results you get to confirm that this is the correct interpretation of these time_ts.
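Since the question also asks for a local-time value, here is a small follow-on sketch (assuming the time_t interpretation above is right) that keeps the Kind explicit and converts at the end:
int cdrTimestamp = 1236772829;
DateTime epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime utcResult = epochStart.AddSeconds(cdrTimestamp);   // 2009 March 11 12:00:29 UTC
DateTime localResult = utcResult.ToLocalTime();             // shifted by the machine's UTC offset
Console.WriteLine(utcResult);
Console.WriteLine(localResult);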

Convert System::DateTime to _timeb

I have a legacy C++-based application that timestamps incoming network traffic using the CRT _ftime() function. The _ftime() function returns a _timeb structure, which has a 32-bit and a 64-bit implementation. We are using the 32-bit implementation, which looks like this:
struct _timeb {
    long time;              // 4 bytes
    unsigned short millitm; // 2 bytes
    short timezone;         // 2 bytes
    short dstflag;          // 2 bytes
};
From the MSDN documentation, here is how each field is interpreted:
dstflag - nonzero if daylight savings time is currently in effect for the local time zone (see _tzset for an explanation of how daylight savings time is determined.)
millitm - fraction of a second in milliseconds
time - time in seconds since midnight (00:00:00), January 1, 1970, coordinated universal time (UTC).
timezone - difference in minutes, moving westward, between UTC and local time. The value of timezone is set from the value of the global variable _timezone (see _tzset).
I am re-working the portion of the code that does the timestamping to use C# in .NET 3.5. Timestamps are now generated using the System.DateTime structure, but I still need to convert them back to the _timeb structure so the legacy C++ code can operate on them. Here is how I am doing that in my managed C++ bridge library:
DateTime dateTime = DateTime::UtcNow;
DateTime baseTime(1970, 1, 1, 0, 0, 0, DateTimeKind::Utc);
TimeSpan delta = dateTime - baseTime;
_timeb timestamp;
timestamp.time = delta.TotalSeconds;
timestamp.millitm = dateTime.Millisecond;
timestamp.dstflag = TimeZoneInfo::Local->IsDaylightSavingTime(dateTime) ? 1 : 0;
timestamp.timezone = TimeZoneInfo::Local->BaseUtcOffset.TotalMinutes * -1;
From what I can tell, this appears to reconstruct the _timeb structure as if I had called _ftime() directly, and that's good. The thing is, timestamps are a critical piece of our application, so this has to be right.
My question is two-fold.
Is my algorithm flawed somehow? Does anyone see anything obvious that I've missed? Are there boundary conditions where this won't work right?
Is there a better way to do the conversion? Does .NET have a way to do this in a more straightforward manner?
You're aware of the Y2K38 problem? I assume you checked the sign of .timezone. Avoid the cleverness of using dateTime.Millisecond; that just confuses the next guy. Looks good otherwise.
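Here is a C# sketch of that last suggestion, deriving both fields from the same TimeSpan so the seconds and milliseconds can never disagree (ManagedTimeb is a stand-in for the native _timeb, not a real CRT type):
struct ManagedTimeb
{
    public int Time;        // seconds since 1970-01-01 00:00:00 UTC
    public ushort Millitm;  // fractional milliseconds
    public short Timezone;  // minutes west of UTC
    public short Dstflag;   // nonzero if DST is in effect locally
}
static ManagedTimeb Now()
{
    DateTime utcNow = DateTime.UtcNow;
    TimeSpan delta = utcNow - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
    return new ManagedTimeb
    {
        Time = (int)delta.TotalSeconds,        // truncates to whole seconds
        Millitm = (ushort)delta.Milliseconds,  // taken from the same delta, so always consistent with Time
        Timezone = (short)(-TimeZoneInfo.Local.BaseUtcOffset.TotalMinutes),
        Dstflag = (short)(TimeZoneInfo.Local.IsDaylightSavingTime(utcNow) ? 1 : 0)
    };
}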
