I want to get the accurate current time in the following format:
HH:MM:SS:mmm
HH - hours
MM - minutes
SS - seconds
mmm - milliseconds
As far as I know, DateTime.Now is not accurate enough and Stopwatch is only for measuring elapsed time.
I want to get an accuracy of 1 millisecond.
The problem you have here is that DateTime gives you a very precise result, but its accuracy is governed by both the system's hardware clock and how quickly the OS responds to the time request.
Precision and accuracy may seem like the same thing, but they're not. You may be able to write a measurement down to 1 billionth of a metre, but if your measuring device is 1 metre long without graduations then you only have an accuracy of 1 metre.
If you really need an accurate time down to 1 millisecond then you'll need another time source that you can poll directly thus bypassing any OS delays.
It is possible to get devices that connect to a realtime clock, for example GPS receivers (which are highly accurate). Again, the accuracy between two receivers will depend on whether or not they are the same distance from the satellite.
Well, the resolution of DateTime actually goes down to 100 nanoseconds (even though the effective resolution may be coarser in the implementation), so millisecond resolution should not be a problem.
All properties you need to solve your problem are available in the normal DateTime type.
Edit: As @CodeInChaos points out, even millisecond resolution is not guaranteed by this API, so if you actually need that resolution, it's no good :-/
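For the formatting part of the question, a minimal sketch using a custom format string (the resolution caveats above still apply; "fff" is the milliseconds specifier):

    using System;

    class Program
    {
        static void Main()
        {
            // "HH" = 24-hour hours, "mm" = minutes, "ss" = seconds, "fff" = milliseconds,
            // which matches the requested HH:MM:SS:mmm layout.
            Console.WriteLine(DateTime.Now.ToString("HH:mm:ss:fff"));
        }
    }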
DateTime.Now has a resolution of 10ms
http://msdn.microsoft.com/en-us/library/system.datetime.now.aspx
What resolution would be accurate enough for you?
Windows Multimedia Timer timeGetTime()
Granularity: 1 millisecond
Precision: 1 millisecond
Apparent precision: 2.32804262366679e-006 seconds
Apparent jitter: 1.19727906360006e-007 seconds
Ripped from here
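A hedged sketch of calling timeGetTime() from C# via P/Invoke. Note that it returns milliseconds since Windows started (a relative timestamp, not a time of day), and that timeBeginPeriod/timeEndPeriod are used here to request 1 ms granularity; all three functions live in winmm.dll:

    using System;
    using System.Runtime.InteropServices;

    class MultimediaTimerDemo
    {
        [DllImport("winmm.dll")]
        static extern uint timeGetTime();          // milliseconds since Windows started

        [DllImport("winmm.dll")]
        static extern uint timeBeginPeriod(uint uPeriod);

        [DllImport("winmm.dll")]
        static extern uint timeEndPeriod(uint uPeriod);

        static void Main()
        {
            timeBeginPeriod(1);                    // request 1 ms timer granularity
            uint start = timeGetTime();
            // ... do something ...
            uint elapsedMs = timeGetTime() - start;
            Console.WriteLine("Elapsed: {0} ms", elapsedMs);
            timeEndPeriod(1);                      // always undo the granularity request
        }
    }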
Related
I am trying to implement a time service that will report time with greater accuracy than 1 ms. I thought an easy solution would be to take an initial measurement and use Stopwatch to add a delta to it. The problem is that this method seems to diverge from wall time extremely fast. For example, the following code attempts to measure the divergence between wall time and my high resolution clock:
    using System;
    using System.Diagnostics;
    using System.Threading;

    public static class Program
    {
        public static void Main(string[] args)
        {
            Stopwatch s = new Stopwatch();
            DateTime baseDateTime = DateTime.UtcNow;   // wall-clock baseline
            s.Start();
            long counter = 0;
            while (true)
            {
                DateTime utcnow = DateTime.UtcNow;               // wall time
                DateTime hpcutcnow = baseDateTime + s.Elapsed;   // baseline + Stopwatch delta
                Console.WriteLine("{0}) DT:{1} HP:{2} DIFF:{3}",
                    ++counter, utcnow, hpcutcnow, utcnow - hpcutcnow);
                Thread.Sleep(1000);
            }
        }
    }
I am diverging at a rate of about 2 ms/minute on fairly recent server hardware.
Is there another time facility in Windows that I am not aware of that will be more accurate? If not, is there a better approach to creating a high resolution clock, or a 3rd party library I should be using?
Getting an accurate clock is difficult. Stopwatch has a very high resolution, but it is not accurate: it derives its frequency from a signal in the chipset, which operates at typical electronic-part tolerances. Cut-throat competition in the hardware business pushed out the expensive crystal oscillators with a guaranteed and stable frequency.
DateTime.UtcNow isn't all that accurate either, but it gets help. Windows periodically contacts a time service (the default is time.windows.com) to obtain an update from a high-quality clock, and uses it to recalibrate the machine's clock, inserting small adjustments to make the clock catch up or slow down.
You need much bigger tricks to get accuracy down to a millisecond. You can only get a guarantee like that for code that runs in kernel mode, at interrupt priority so it cannot get pre-empted by other code, and with its code and data pages locked so it can't get hit with page faults. Commercial solutions use a GPS radio to read the clock signal of the GPS satellites, backed up by an oscillator that runs in an oven to provide temperature stability. Reading such a clock is the hard problem: you don't have much use for a sub-millisecond clock source when the program that uses it can get pre-empted by the operating system just as it obtained the time and not start running again until ~45 msec later. Or worse.
DateTime.UtcNow is accurate to 15.625 milliseconds and stable over very long periods thanks to the time service updates. Going lower than that just doesn't make much sense; you can't get the execution guarantees you need in user mode to take advantage of it.
Apparently, in Windows 8/Server 2012 a new API was added specifically for getting high resolution timestamps: GetSystemTimePreciseAsFileTime. I haven't had a chance to play around with it, but it looks promising.
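A minimal sketch of calling it via P/Invoke, assuming Windows 8 / Server 2012 or later (the FILETIME is marshaled as a long and converted with DateTime.FromFileTimeUtc):

    using System;
    using System.Runtime.InteropServices;

    static class PreciseTime
    {
        // Available on Windows 8 / Server 2012 and later.
        [DllImport("kernel32.dll", ExactSpelling = true)]
        static extern void GetSystemTimePreciseAsFileTime(out long fileTime);

        public static DateTime UtcNow
        {
            get
            {
                long fileTime;
                GetSystemTimePreciseAsFileTime(out fileTime);
                return DateTime.FromFileTimeUtc(fileTime);   // 100 ns resolution
            }
        }
    }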
3rd party library
I tried to create my own based on several Internet sources. Here is a link to it: https://github.com/Anonymous87549236/HighResolutionDateTime/releases .
But then I realized that in .NET Core the resolution is already the maximum, so I think .NET Core is the best implementation.
I'd like to use 3 or 4 C# Timers with an Interval that could be 40ms (to work with image data 1000/25 = 40).
According to MSDN, a Timer seems to be a good fit for performing a task every 40 ms. The default interval is 100 ms.
In real life, I'd like to know if 40 ms is still OK, or if I should use another threading design pattern. Is the wakeup/sleep behavior nearly CPU-free?
There is no special relevance to the 100 msec default, you can change it as needed.
You do need to pick your values carefully if you want an interval that's consistent from one machine to another. The accuracy of the Timer class is affected by the operating system's default clock interrupt rate. On most Windows machines that interrupt occurs 64 times per second, i.e. every 15.625 milliseconds. Some machines have a higher rate, going as low as 1 msec, as a side effect of other programs changing the interrupt rate; the timeBeginPeriod() winapi function does this, and it has a global effect.
So the best intervals to pick are ones that stay just below a multiple of 15.625, so that your chosen interval maps to the same number of clock interrupts on any machine. That makes the good choices:
15
31
46
62
etc.
Your best bet for aiming near 40 msec is therefore 46. It will be accurate to 1.4% on any machine. I always pick 45 myself; nice round number.
Do beware that actual intervals can be arbitrarily longer if the machine is under heavy load or you have a lot of active threadpool threads in your program.
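A minimal sketch with System.Timers.Timer, assuming the 46 msec value suggested above:

    using System;
    using System.Timers;

    class FrameTimerDemo
    {
        static void Main()
        {
            // 46 ms sits just below 3 clock interrupts (3 x 15.625 = 46.875 ms),
            // so it behaves consistently on machines with the default interrupt rate.
            var timer = new Timer(46);
            timer.Elapsed += (sender, e) =>
                Console.WriteLine("Tick at {0:HH:mm:ss.fff}", e.SignalTime);
            timer.AutoReset = true;
            timer.Start();

            Console.ReadLine();   // keep the process alive while the timer fires
            timer.Stop();
        }
    }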
I created an app which uses the Timer class to call back a method at a certain time of day and to call it again every 24 hours after that.
I use Ticks to signify 24 hours later. (int) TimeSpan.FromHours(24).TotalMilliseconds
I use that to retrieve the ticks for 24 hours.
This works fine for me but on different computers, the trigger time is way off.
Any way to debug this? How should I handle this issue?
How much is "way off" to you? If you want an app to run at a specific time, schedule it for that specific time, not 24 hours from the time it finishes - you're inevitably going to see some slippage doing it that way because the time will always be off the next day X seconds, where X is how long the program took to complete the previous day.
How "way off" yes desktop computers clocks frequently fluctuate by a second or more every day, they generally use a NTP server to correct these fluctuations. But its just the nature of the beast.
First of all, there is no "my ticks" because a Tick is a well-defined value.
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
A DateTime object also has a Ticks property that you can use to access it. I wrote some simple code that I've posted here; it worked great for me, producing perfect results.
I see no reason your implementation should drift so much. Please put together a sample that consistently reproduces the problem, or post the relevant pieces of your own source.
I don't understand the meaning of timer precision and resolution. Can anyone explain it to me?
NOTE: This question is related to Stopwatch.
Accuracy and precision are opposing goals; you can't get both. An example of a very accurate timing source is DateTime.UtcNow. It provides absolute time that's automatically corrected for clock rate errors by the kernel, using a timing service to periodically re-calibrate the clock. You have probably heard of time.windows.com, the NTP server that most Windows PCs use. Very accurate; you can count on less than a second of error over an entire year. But not precise: the value only updates 64 times per second. It is useless for timing anything that takes less than a second with any kind of decent precision.
The clock source for Stopwatch is very different. It uses a free-running counter that is driven by a frequency source available somewhere in the chipset. This used to be a dedicated crystal running at the color burst frequency (3.579545 MHz), but relentless cost cutting has eliminated that from most PCs. Stopwatch is very precise; you can tell from its Frequency property. You should get something between a megahertz and the CPU clock frequency, allowing you to time down to a microsecond or better. But it is not accurate; it is subject to electronic part tolerances. Particularly mistrust any Frequency beyond a gigahertz: that's derived from a multiplier, which also multiplies the error. And beware the Heisenberg principle: starting and stopping the Stopwatch takes non-zero overhead that will affect the accuracy of very short measurements. Another common accuracy problem with Stopwatch is the operating system switching out your thread to allow other code to run. You need to take multiple samples and use the median value.
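A minimal sketch of the multiple-samples-and-median approach described above (the DoWork body is a placeholder workload):

    using System;
    using System.Diagnostics;
    using System.Linq;

    class MedianTimingDemo
    {
        static void Main()
        {
            const int samples = 9;
            double[] elapsedMs = new double[samples];

            for (int i = 0; i < samples; i++)
            {
                Stopwatch sw = Stopwatch.StartNew();
                DoWork();                                   // the code being measured
                sw.Stop();
                elapsedMs[i] = sw.Elapsed.TotalMilliseconds;
            }

            // The median is less sensitive to a sample that got hit by a context switch.
            double median = elapsedMs.OrderBy(t => t).ElementAt(samples / 2);
            Console.WriteLine("Median: {0:F3} ms", median);
        }

        static void DoWork()
        {
            System.Threading.Thread.SpinWait(100000);   // placeholder workload
        }
    }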
They are the same as with any measurement. See this Wikipedia article for more details --
http://en.wikipedia.org/wiki/Accuracy_and_precision
There are different types of timers in .NET (3 or 4 of them, if I remember correctly), each working with its own algorithm. The precision of a timer means how accurately it raises its tick events for the consuming application. For example, if you use a timer and set it to raise its tick event every 1000 ms, the precision of the timer means how close to the specified 1000 ms it will actually tick.
For more information (at least for C#), I suggest you read the MSDN page on timers:
From MSDN Stopwatch Class: (emphasis mine)
"The Stopwatch measures elapsed time by counting timer ticks in the underlying timer mechanism. If the installed hardware and operating system support a high-resolution performance counter, then the Stopwatch class uses that counter to measure elapsed time. Otherwise, the Stopwatch class uses the system timer to measure elapsed time. Use the Frequency and IsHighResolution fields to determine the precision and resolution of the Stopwatch timing implementation."
Is it a viable option to compare two FileInfo.CreationTimeUtc.Ticks of two files on two different computers to see which version is newer - or is there a better way?
Do Ticks depend on OS time or are they really physical ticks from some fixed date in the past?
The aim of a UTC time is that it will be universal - but both computers would have to have synchronized clocks for it to be appropriate to just compare the ticks. For example, if both our computers updated a file at the exact same instant (relativity aside) they could still record different times - it's not like computers tend to come with atomic clocks built-in.
Of course, they can synchronize time with NTP etc to be pretty close to each other. Whether that's good enough for your uses is hard to say without more information. You should also bear in mind the possibility of users deliberately messing with their system clocks - could that cause a problem for your use case?
Do Ticks depend on OS time or are they really physical ticks from some fixed date in the past?
Ticks are pretty much independent of the OS.
If I remember correctly, 1 second = 10000000 ticks.
So whatever time you are checking, what you get from Ticks is about ten million times what you get from TotalSeconds. (Although it is more accurate than TotalSeconds, obviously.)
A tick is basically the smallest unit of time that you can measure from .NET.
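A small sketch illustrating the ten-million-ticks-per-second relationship:

    using System;

    class TickDemo
    {
        static void Main()
        {
            TimeSpan span = TimeSpan.FromSeconds(1.5);

            Console.WriteLine(span.Ticks);               // 15000000
            Console.WriteLine(span.TotalSeconds);        // 1.5
            Console.WriteLine(TimeSpan.TicksPerSecond);  // 10000000
        }
    }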
As for UTC: yes, it is as good as you can get. If the system time on the machine your app is running on is accurate enough, you'll manage with it without issue.
Basically, the more frequently the files are updated, the less reliable this will be. If someone creates two versions of the file within one second, all of the system clocks involved must be precisely synchronized to get a correct result.
If you only get a new version once every several minutes, then it is very much good enough for you.
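A small sketch of that kind of comparison, with a hypothetical tolerance window (and hypothetical paths) to absorb clock skew between the two machines:

    using System;
    using System.IO;

    class NewerFileCheck
    {
        // Hypothetical allowance for clock skew between the two machines.
        static readonly TimeSpan ClockSkewTolerance = TimeSpan.FromMinutes(1);

        static void Main()
        {
            var a = new FileInfo(@"\\machineA\share\data.bin");   // hypothetical paths
            var b = new FileInfo(@"\\machineB\share\data.bin");

            TimeSpan diff = a.CreationTimeUtc - b.CreationTimeUtc;

            if (diff.Duration() <= ClockSkewTolerance)
                Console.WriteLine("Too close to call; the clocks may simply be out of sync.");
            else if (diff > TimeSpan.Zero)
                Console.WriteLine("A is newer.");
            else
                Console.WriteLine("B is newer.");
        }
    }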
The short answer is that no, this is not a viable solution, at least not in the theoretical, anything-can-happen type of world.
The clock of a computer might be accurate to a billionth of a billionth of a second, but the problem isn't the accuracy, it's whether the clocks of the two computers are synchronized.
Here's an experiment. Look at your watch, then ask a random stranger around you what the time is. If your file was written to on your computer when you looked at the watch, and written to on the computer belonging to the person you're asking 1 second ago, would your comparison determine that your file or his/her file was newer?
Now, your solution might work, assuming that:
You're not going to have to compare clock values that differ by only milliseconds or nanoseconds
The clocks of the computers in question are synchronized against some common source, or at the very least set as close to each other as your inaccuracy criteria allow
Depending on your requirements, I would seriously look at trying to find a different way of ensuring that you get the right value.
If you are talking about arbitrary files, then the answer about UTC is worth a try. The clocks should be the same.
If you have control over the files and the machines writing to them, I would write a version number at the start of the file. Then you can compare the version number. You would have to look into how to get unique version numbers if two machines write a file independently. To solve this, a version webservice could be implemented which both machines use.
Some file formats have version information built-in, and also time stamps. For example, Microsoft Office formats. You could then use the internal information to decide which was the newest. But you could end up with version conflict on these as well.
From the way you phrased the question, I assume that for some reason you cannot call the DateTime comparison methods. Assuming this, the MSDN page for the Ticks property states: "A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond." Thus the ticks refer to the value assigned by the .NET library, do not depend on the machine/OS (which is probably Windows since you are using C#), and can be safely used for comparisons.