Different values of Stopwatch.Frequency on the same PC - c#

I'm working on an online game with a client and a server. The client uses Unity3D and C#; the server is also written in C#. For synchronization I use timers, and as we know timers depend on ticks. The tick counter in C# is the Stopwatch class, and the number of ticks in one second equals Stopwatch.Frequency, but the values of Stopwatch.Frequency on the server and the client are different, and that breaks my synchronization because the timer on the server runs slower than the timer on the client. Stopwatch.Frequency on the server equals 3,124,980, and Stopwatch.Frequency on the client equals 10,000,000. Why is that? How can I change the value of Stopwatch.Frequency to keep the timers synchronized? Thanks, and sorry for my bad English.

Stopwatch can be unreliable on a PC with multiple processors, or with processors that do not have a constant clock speed (processors that can reduce the clock to conserve energy), so you simply can't rely on it in a game (because you want the game to work on every computer).
Many games use a global clock, and I've seen that even the simplest global-clock algorithm can be good enough to synchronize clients with a server for a game. Take a look at Cristian's algorithm.
Once you have a global clock, you can simply use DateTime.UtcNow to measure time.
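For illustration, here is a minimal sketch of Cristian's algorithm on the client, assuming a hypothetical RequestServerTimeUtc() call that asks the server for its current DateTime.UtcNow:

using System;
using System.Diagnostics;

// Measure the round trip of one time request and assume the reply
// travelled for roughly half of it.
Stopwatch roundTrip = Stopwatch.StartNew();
DateTime serverTime = RequestServerTimeUtc();   // hypothetical network call
roundTrip.Stop();

DateTime estimatedServerNow = serverTime + TimeSpan.FromTicks(roundTrip.Elapsed.Ticks / 2);
TimeSpan offset = estimatedServerNow - DateTime.UtcNow;

// From now on, "game time" on the client is local UTC time plus the stored offset.
DateTime gameTimeUtc = DateTime.UtcNow + offset;

Repeating the exchange a few times and keeping the sample with the smallest round trip usually tightens the estimate.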

Stopwatch.Frequency cannot be changed - it gets the frequency of the timer as the number of ticks per second, and the field is read-only.
The timer used by the Stopwatch class depends on the system hardware and operating system.
The field Stopwatch.IsHighResolution is true if the Stopwatch timer is based on a high-resolution performance counter. Otherwise, IsHighResolution is false, in which case it indicates that the Stopwatch timer is based on the system timer.
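Rather than trying to change Frequency, you can make the two machines comparable by always dividing raw ticks by the local Frequency, so both sides reason in seconds. A small sketch:

using System;
using System.Diagnostics;

// Frequency and IsHighResolution describe the local timer hardware; they differ per machine.
Console.WriteLine("IsHighResolution: " + Stopwatch.IsHighResolution);
Console.WriteLine("Frequency: " + Stopwatch.Frequency + " ticks per second");

Stopwatch sw = Stopwatch.StartNew();
// ... work to be timed ...
sw.Stop();
long rawTicks = sw.ElapsedTicks;                          // hardware-dependent units
double seconds = (double)rawTicks / Stopwatch.Frequency;  // means the same on every machine
Console.WriteLine(seconds + " s (or simply use sw.Elapsed / sw.ElapsedMilliseconds)");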

Related

StopWatch vs Timer - When to Use

Forgive me for this question, but I can't seem to find a good source of when to use which. Would be happy if you can explain it in simple terms.
Furthermore, I am facing this dilemma:
See, I am coding a simple application. I want it to show the elapsed time (hh:mm:ss format or something). But also, to be able to "speed up" or "slow down" its time intervals (i.e. speed up so that a minute in real time equals an hour in the app).
For example, in YouTube videos (let's not consider the fact that we can jump to specific parts of the video), we see the actual time spent watching the video in the bottom-left corner of the screen, but by navigating the options menu we are able to speed the video up or slow it down.
And we can actually see that the time gets updated in a manner that agrees with the speed factor (for example, if you choose twice the speed, the timer below updates twice as fast as normal), and you can change this speed rate whenever you want.
This is what I'm kinda after: something like how YouTube videos measure the elapsed time, and the fact that they can change the time intervals. So, which of the two do you think I should choose? Timer or Stopwatch?
I'm just coding a Windows Form Application, by the way. I'm simulating something and I want the user to be able to speed up whenever he or she wishes to. Simple as this may be, I wish to implement a proper approach.
As far as I know, the main differences are:
Timer
A Timer is just a simple scheduler that runs some operation/method once in a while.
It executes the method on a separate thread, which prevents blocking the main thread.
A Timer is good when we need to execute some task at a certain interval without blocking anything.
Stopwatch
A Stopwatch by default runs on the same thread.
It counts time and exposes a TimeSpan (via its Elapsed property), which is useful when we need some additional information.
A Stopwatch is good when we need to watch the time and get extra detail such as how many processor ticks a method takes, etc.
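To illustrate the difference, here is a minimal sketch (interval and workload are arbitrary):

using System;
using System.Diagnostics;
using System.Threading;

// Timer: raises an event on a threadpool thread every interval.
var timer = new System.Timers.Timer(1000);            // 1000 ms interval
timer.Elapsed += (s, e) => Console.WriteLine("tick at " + e.SignalTime);
timer.Start();

// Stopwatch: measures elapsed time on the calling thread.
var sw = Stopwatch.StartNew();
Thread.Sleep(3500);                                   // stand-in for real work
sw.Stop();
timer.Stop();
Console.WriteLine("Elapsed: " + sw.Elapsed);          // a TimeSpan, e.g. 00:00:03.5xxxxxx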
This has already been covered in a number of other questions, including
here. Basically, you can either use a Stopwatch together with a speed factor, and the scaled result is your "elapsed time", or take the more complicated approach of implementing a Timer and changing its Interval property. The Stopwatch approach is sketched below.
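A minimal sketch of the Stopwatch-plus-speed-factor idea, assuming the speed factor may change while the clock is running (the class and member names here are illustrative, not from any library):

using System;
using System.Diagnostics;

class ScalableClock
{
    private readonly Stopwatch _sw = Stopwatch.StartNew();
    private TimeSpan _accumulated = TimeSpan.Zero;   // simulated time banked so far
    private double _speed = 1.0;                     // 2.0 = twice as fast, 0.5 = half speed

    public double Speed
    {
        get { return _speed; }
        set
        {
            // Bank the time measured at the old speed before switching.
            _accumulated += TimeSpan.FromTicks((long)(_sw.Elapsed.Ticks * _speed));
            _sw.Restart();
            _speed = value;
        }
    }

    // Elapsed simulated time, e.g. for display in hh:mm:ss format.
    public TimeSpan Elapsed
    {
        get { return _accumulated + TimeSpan.FromTicks((long)(_sw.Elapsed.Ticks * _speed)); }
    }
}

In the form you would poll the clock's Elapsed property from a UI timer (say every 100 ms) and show it with Elapsed.ToString(@"hh\:mm\:ss").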

Best way to implement a high resolution DateTime.UtcNow in C#?

I am trying to implement a time service that will report time with greater accuracy than 1 ms. I thought an easy solution would be to take an initial measurement and use Stopwatch to add a delta to it. The problem is that this method seems to diverge extremely fast from wall time. For example, the following code attempts to measure the divergence between wall time and my high-resolution clock:
using System;
using System.Threading;

class Program
{
    public static void Main(string[] args)
    {
        System.Diagnostics.Stopwatch s = new System.Diagnostics.Stopwatch();
        DateTime baseDateTime = DateTime.UtcNow;
        s.Start();
        long counter = 0;
        while (true)
        {
            DateTime utcnow = DateTime.UtcNow;               // wall-clock time
            DateTime hpcutcnow = baseDateTime + s.Elapsed;   // base time plus Stopwatch delta
            Console.WriteLine(String.Format("{0}) DT:{1} HP:{2} DIFF:{3}",
                ++counter, utcnow, hpcutcnow, utcnow - hpcutcnow));
            Thread.Sleep(1000);
        }
    }
}
I am diverging at a rate of about 2 ms/minute on fairly recent server hardware.
Is there another time facility in Windows that I am not aware of that would be more accurate? If not, is there a better approach to creating a high-resolution clock, or a 3rd-party library I should be using?
Getting an accurate clock is difficult. Stopwatch has very high resolution, but it is not accurate: it derives its frequency from a signal in the chipset, which operates at typical electronic-part tolerances. Cut-throat competition in the hardware business did away with the expensive crystal oscillators that had a guaranteed and stable frequency.
DateTime.UtcNow isn't all that accurate either, but it gets help. Windows periodically contacts a time service (the default one is time.windows.com) to obtain an update from a high-quality clock, and uses it to recalibrate the machine's clock, inserting small adjustments to make the clock catch up or slow down.
You need lots of bigger tricks to get it accurate down to a millisecond. You can only get a guarantee like that for code that runs in kernel mode, running at interrupt priority so it cannot get pre-empted by other code and with its code and data pages page-locked so it can't get hit with page faults. Commercial solutions use a GPS radio to read the clock signal of the GPS satellites, backed up by an oscillator that runs in an oven to provide temperature stability. Reading such a clock is the hard problem, you don't have much use for a sub-millisecond clock source when your program that uses it can get pre-empted by the operating system just as it obtained the time and not start running again until ~45 msec later. Or worse.
DateTime.UtcNow is accurate to 15.625 milliseconds and stable over very long periods thanks to the time-service updates. Going lower than that just doesn't make much sense; you can't get the execution guarantee you need in user mode to take advantage of it.
Apparently in Windows 8/Server 2012 a new API was added specifically for getting high resolution timestamps, the GetSystemTimePreciseAsFileTime API. I haven't had a chance to play around with this, but it looks promising.
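For what it's worth, here is a minimal P/Invoke sketch of calling that API (Windows 8 / Server 2012 or later only; on current .NET Core, DateTime.UtcNow already uses it internally):

using System;
using System.Runtime.InteropServices;

static class PreciseClock
{
    // Only exists on Windows 8 / Server 2012 and later; calling it on older
    // systems throws an EntryPointNotFoundException.
    [DllImport("kernel32.dll", ExactSpelling = true)]
    private static extern void GetSystemTimePreciseAsFileTime(out long fileTime);

    public static DateTime UtcNow
    {
        get
        {
            long fileTime;
            GetSystemTimePreciseAsFileTime(out fileTime);
            return DateTime.FromFileTimeUtc(fileTime);   // sub-microsecond resolution
        }
    }
}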
3rd party library
I tried to create my own based on several Internet sources. Here is a link to it: https://github.com/Anonymous87549236/HighResolutionDateTime/releases .
But then I realized that in .NET Core the resolution of DateTime.UtcNow is already at its maximum, so I think .NET Core's implementation is the best one.

C# Timer.Interval Pattern with 40ms?

I'd like to use 3 or 4 C# Timers with an interval that could be 40 ms (to work with image data: 1000/25 = 40).
According to MSDN this seems to be the right pattern for performing a task every 40 ms. The default interval is 100 ms.
In real life, I'd like to know whether 40 ms is still OK, or whether I should use another threading design pattern. Is the wake-up/sleep behavior nearly CPU-free?
There is no special relevance to the 100 msec default, you can change it as needed.
You do need to pick your values carefully if you want an interval that's consistent from one machine to another. The accuracy of the Timer class is affected by the operating system's default clock-interrupt rate. On most Windows machines, that interrupt occurs 64 times per second, which equals 15.625 milliseconds. There are machines that have a higher rate; some go as low as 1 msec, a side effect of other programs changing the interrupt rate. The timeBeginPeriod() winapi function does this, and it has a global effect.
So the best intervals to pick are multiples of 15.625, rounded down to stay just below them, so that your chosen interval repeats well on any machine. That makes the good choices:
15
31
46
62
etc.
Your best bet for aiming near 40 msec is therefore 46. It will be accurate to within 1.4% on any machine. I always pick 45 myself, a nice round number.
Do beware that actual intervals can be arbitrarily longer if the machine is under heavy load or you have a lot of active threadpool threads in your program.
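To make that concrete, a small sketch of what this might look like with System.Timers.Timer for the 25 fps (nominally 40 ms) case described above:

using System;
using System.Timers;

class FrameTicker
{
    static void Main()
    {
        // 46 ms sits just under three 15.625 ms clock interrupts, so it repeats
        // consistently on machines using the default 64 Hz interrupt rate.
        var timer = new Timer(46);
        timer.Elapsed += (sender, e) =>
            Console.WriteLine("process frame at " + e.SignalTime.ToString("HH:mm:ss.fff"));
        timer.AutoReset = true;
        timer.Start();

        Console.WriteLine("Press Enter to stop.");
        Console.ReadLine();
        timer.Stop();
    }
}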

What is the meaning of timer precision and resolution?

I don't understand the meaning of timer precision and resolution. Can anyone explain it to me?
NOTE: This question is related to Stopwatch.
Accuracy and precision are opposing goals; you can't get both. An example of a very accurate timing source is DateTime.UtcNow. It provides absolute time that's automatically corrected for clock-rate errors by the kernel, which uses a timing service to periodically re-calibrate the clock. You have probably heard of time.windows.com, the NTP server that most Windows PCs use. Very accurate: you can count on less than a second of error over an entire year. But not precise: the value only updates 64 times per second, so it is useless for timing anything that takes less than a second with any kind of decent precision.
The clock source for Stopwatch is very different. It uses a free-running counter driven by a frequency source available somewhere in the chipset. This used to be a dedicated crystal running at the color-burst frequency (3.579545 MHz), but relentless cost cutting has eliminated that from most PCs. Stopwatch is very precise; you can tell from its Frequency property. You should get something between a megahertz and the CPU clock frequency, allowing you to time down to a microsecond or better. But it is not accurate; it is subject to electronic-part tolerances. Particularly mistrust any Frequency beyond a gigahertz: it is derived from a multiplier, which also multiplies the error. And beware the Heisenberg principle: starting and stopping the Stopwatch takes non-zero overhead that will affect the accuracy of very short measurements. Another common accuracy problem with Stopwatch is the operating system switching out your thread to allow other code to run. You need to take multiple samples and use the median value.
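A minimal sketch of that multiple-samples-and-median idea (WorkToMeasure is a placeholder for whatever you want to time):

using System;
using System.Diagnostics;
using System.Linq;

class MedianTiming
{
    static void Main()
    {
        const int samples = 15;
        var results = new double[samples];

        for (int i = 0; i < samples; i++)
        {
            var sw = Stopwatch.StartNew();
            WorkToMeasure();                     // placeholder for the code being timed
            sw.Stop();
            results[i] = sw.Elapsed.TotalMilliseconds;
        }

        // The median is far less sensitive than the mean to the occasional sample
        // that was inflated by a thread switch.
        double median = results.OrderBy(t => t).ElementAt(samples / 2);
        Console.WriteLine("median: " + median + " ms");
    }

    static void WorkToMeasure()
    {
        double x = 0;
        for (int i = 1; i < 100000; i++) x += Math.Sqrt(i);
    }
}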
They are the same as with any measurement. See this Wikipedia article for more details --
http://en.wikipedia.org/wiki/Accuracy_and_precision
There are different types of timers in .NET (3 or 4 of them, if I remember correctly), each working with its own algorithm. The precision of a timer means how accurately it informs the using application of its tick events. For example, if you use a timer and set it to trigger its tick event every 1000 ms, the precision of the timer means how close to the specified 1000 ms it will actually tick.
For more information (at least in C#), I suggest you read the MSDN pages on timers.
From MSDN Stopwatch Class: (emphasis mine)
"The Stopwatch measures elapsed time by counting timer ticks in the underlying timer mechanism. If the installed hardware and operating system support a high-resolution performance counter, then the Stopwatch class uses that counter to measure elapsed time. Otherwise, the Stopwatch class uses the system timer to measure elapsed time. Use the Frequency and IsHighResolution fields to determine the precision and resolution of the Stopwatch timing implementation."

Executing a task via a thread on an interval a certain number of times

I don't have a great deal of experience with threads. I'm using .NET 4 and would like to use the .NET 4 threading features to solve this. Here is what I want to do.
I have a class with two methods, 'A' and 'B'. I want 'A' to call 'B' some number of times (like 100) every some number of milliseconds (like 3000). I want to record the average execution time of method 'B' when it's done executing its 100 (or whatever) times. The class will have some private properties to keep track of the total elapsed execution time of 'B' in order to calculate an average.
I'm not sure if method 'A' should call 'B' via a System.Timers.Timer thread (where the interval can be set, but not the number of times) or if there is a better (.NET 4) way of doing this.
Thanks very much.
In reading over your question, I think the root question you have is about safely kicking off a set of events and timing their execution in a thread-safe manner. In your example, you are running 100 iterations every 3000 ms, which means each iteration should take at most 30 ms. Unfortunately, System.Timers.Timer (which is System.Threading.Timer with a wrapper around it) is not that precise. Expect a precision of 10 ms at best, and possibly a lot worse. In order to get the 1 ms precision you really need, you are going to have to tap into native interop. Here is a quote I found on this:
The precision of multithreaded timers depends on the operating system, and is typically in the 10–20 ms region. If you need greater precision, you can use native interop and call the Windows multimedia timer. This has precision down to 1 ms and it is defined in winmm.dll. First call timeBeginPeriod to inform the operating system that you need high timing precision, and then call timeSetEvent to start a multimedia timer. When you’re done, call timeKillEvent to stop the timer and timeEndPeriod to inform the OS that you no longer need high timing precision. You can find complete examples on the Internet that use the multimedia timer by searching for the keywords dllimport winmm.dll timesetevent
-Joseph Albahari ( http://www.albahari.com/threading/part3.aspx )
If you follow his advice, you should get the precision you need.
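If the standard timer's scheduling jitter turns out to be acceptable for the 3000 ms interval itself (the Stopwatch measurement of 'B' is unaffected by it), a plain .NET 4 sketch might look like this (class and member names are illustrative):

using System;
using System.Diagnostics;
using System.Timers;

class Runner
{
    private readonly Timer _timer = new Timer(3000);    // interval in ms
    private readonly Stopwatch _total = new Stopwatch();
    private int _runs;
    private const int MaxRuns = 100;

    public void A()
    {
        // Elapsed fires on a threadpool thread; with a 3000 ms interval and a fast B()
        // the callbacks do not overlap, so no extra locking is used in this sketch.
        _timer.Elapsed += OnElapsed;
        _timer.AutoReset = true;
        _timer.Start();
    }

    private void OnElapsed(object sender, ElapsedEventArgs e)
    {
        _total.Start();
        B();
        _total.Stop();

        if (++_runs >= MaxRuns)
        {
            _timer.Stop();
            Console.WriteLine("average B() time: "
                + _total.Elapsed.TotalMilliseconds / _runs + " ms over " + _runs + " runs");
        }
    }

    private void B()
    {
        // the work being measured goes here
    }
}

For the 1 ms precision discussed above you would swap the timer for the winmm.dll multimedia timer, but the measuring and averaging part stays the same.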
