Imagine we have a dedicated OS thread that must do something every 4.30 minutes.
You cannot suspend the thread, because it needs to check/do other things, nor can you use async/await, because that would cause the continuation to drop off the dedicated OS thread, defeating the purpose of creating one in the first place.
To solve the problem above, I found 2 solutions:
Use System.Diagnostics.Stopwatch to start a new Stopwatch object and check every second whether it has reached the desired time-frame: if (timer.ElapsedMilliseconds >= 258000) { ... }
Use System.DateTime to initialize a DateTime object with DateTime.UtcNow.AddMinutes(4.30) and compare it to the current time every second: if (DateTime.UtcNow >= initializedDate) { ... }
The first approach uses a Stopwatch. I think the Stopwatch uses a thread to do all its work (which is why I wanted to avoid it), but I'm not sure.
The second approach creates a lot of DateTime objects: every second we'd be constructing a new DateTime for the current time.
The two solutions are equivalent unless you need very high precision: Stopwatch uses a high-resolution system timer if one is available and falls back to DateTime ticks if not. To address your concerns:
Stopwatch does NOT use a separate thread. It just reads the system ticks (high or low precision) and does some math for you.
DateTime is a struct so creating a new instance every second should be cheap and put zero pressure on the GC.
I'd use the Stopwatch because it gives me the elapsed time for free. :)
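For illustration, here is a minimal sketch of what that dedicated thread's loop might look like with the Stopwatch approach; the DoPeriodicWork/DoOtherThings methods and the once-a-second poll are assumptions drawn from the question, not a prescribed design:
using System;
using System.Diagnostics;
using System.Threading;

class DedicatedWorker
{
    private const long IntervalMs = 258000;   // 4.30 minutes = 4 min 18 s

    public void Run()
    {
        var timer = Stopwatch.StartNew();
        while (true)
        {
            if (timer.ElapsedMilliseconds >= IntervalMs)
            {
                DoPeriodicWork();    // the every-4.30-minutes task (placeholder)
                timer.Restart();     // begin timing the next interval
            }
            DoOtherThings();         // the thread's other duties (placeholder)
            Thread.Sleep(1000);      // poll roughly once a second, as in the question
        }
    }

    private void DoPeriodicWork() { /* ... */ }
    private void DoOtherThings() { /* ... */ }
}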
Related
I just ran into some unexpected behavior with DateTime.UtcNow while doing some unit tests. When you call DateTime.Now/UtcNow in rapid succession, it appears to give back the same value for a longer-than-expected interval of time, rather than capturing more precise millisecond increments.
I know there is a Stopwatch class that would be better suited for doing precise time measurements, but I was curious if someone could explain this behavior in DateTime? Is there an official precision documented for DateTime.Now (for example, precise to within 50 ms?)? Why would DateTime.Now be made less precise than what most CPU clocks could handle? Maybe it's just designed for the lowest common denominator CPU?
public static void Main(string[] args)
{
    var stopwatch = new Stopwatch();
    stopwatch.Start();
    for (int i = 0; i < 1000; i++)
    {
        var now = DateTime.Now;
        Console.WriteLine(string.Format(
            "Ticks: {0}\tMilliseconds: {1}", now.Ticks, now.Millisecond));
    }
    stopwatch.Stop();
    Console.WriteLine("Stopwatch.ElapsedMilliseconds: {0}",
        stopwatch.ElapsedMilliseconds);
    Console.ReadLine();
}
Why would DateTime.Now be made less precise than what most CPU clocks could handle?
A good clock should be both precise and accurate; those are different. As the old joke goes, a stopped clock is exactly accurate twice a day, a clock a minute slow is never accurate at any time. But the clock a minute slow is always precise to the nearest minute, whereas a stopped clock has no useful precision at all.
Why should DateTime be precise to, say, a microsecond when it cannot possibly be accurate to the microsecond? Most people do not have any source of official time signals that are accurate to the microsecond. Therefore, giving six digits after the decimal place of precision, the last five of which are garbage, would be lying.
Remember, the purpose of DateTime is to represent a date and time. High-precision timing is not at all the purpose of DateTime; as you note, that's the purpose of Stopwatch. The purpose of DateTime is to represent a date and time for purposes like displaying the current time to the user, computing the number of days until next Tuesday, and so on.
In short, "what time is it?" and "how long did that take?" are completely different questions; don't use a tool designed to answer one question to answer the other.
Thanks for the question; this will make a good blog article! :-)
DateTime's precision is somewhat specific to the system it's being run on. The precision is tied to the system timer interval (the same interval that governs context switching), which tends to be around 15 or 16 ms. (On my system, it is actually about 14 ms from my testing, but I've seen some laptops where it's closer to 35-40 ms accuracy.)
Peter Bromberg wrote an article on high precision code timing in C#, which discusses this.
I would like a precise DateTime.Now :), so I cooked this up:
using System;                 // Exception, EventArgs
using System.Diagnostics;     // Stopwatch
using Microsoft.Win32;        // SystemEvents

public class PreciseDatetime
{
    // using DateTime.Now resulted in many many log events with the same timestamp.
    // use static variables in case there are many instances of this class in use in the same program
    // (that way they will all be in sync)
    private static readonly Stopwatch myStopwatch = new Stopwatch();
    private static System.DateTime myStopwatchStartTime;

    static PreciseDatetime()
    {
        Reset();

        try
        {
            // In case the system clock gets updated
            SystemEvents.TimeChanged += SystemEvents_TimeChanged;
        }
        catch (Exception)
        {
        }
    }

    static void SystemEvents_TimeChanged(object sender, EventArgs e)
    {
        Reset();
    }

    // SystemEvents.TimeChanged can be slow to fire (3 secs), so allow forcing of reset
    static public void Reset()
    {
        myStopwatchStartTime = System.DateTime.Now;
        myStopwatch.Restart();
    }

    public System.DateTime Now { get { return myStopwatchStartTime.Add(myStopwatch.Elapsed); } }
}
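Usage is then essentially a drop-in replacement for DateTime.Now; for example (the format string is only illustrative):
var clock = new PreciseDatetime();
Console.WriteLine(clock.Now.ToString("HH:mm:ss.ffffff"));   // timestamps now differ between rapid calls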
From MSDN you'll find that DateTime.Now has an approximate resolution of 10 milliseconds on all NT operating systems.
The actual precision is hardware dependent. Better precision can be obtained using QueryPerformanceCounter.
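Stopwatch wraps that API when a high-resolution counter is available. If you wanted to call it directly, a rough P/Invoke sketch (for illustration only, error handling omitted) might look like this:
using System;
using System.Runtime.InteropServices;

static class HighResTimer
{
    [DllImport("kernel32.dll")]
    static extern bool QueryPerformanceCounter(out long lpPerformanceCount);

    [DllImport("kernel32.dll")]
    static extern bool QueryPerformanceFrequency(out long lpFrequency);

    // Raw counter reading.
    public static long Read()
    {
        QueryPerformanceCounter(out long value);
        return value;
    }

    // Seconds elapsed between two raw counter readings.
    public static double SecondsBetween(long start, long end)
    {
        QueryPerformanceFrequency(out long frequency);   // ticks per second
        return (end - start) / (double)frequency;
    }
}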
For what it's worth, short of actually checking the .NET source, Eric Lippert provided a comment on this SO question saying that DateTime is only accurate to approx 30 ms. The reasoning for not being nanosecond accurate, in his words, is that it "doesn't need to be."
From MSDN documentation:
The resolution of this property depends on the system timer.
They also claim that the approximate resolution on Windows NT 3.5 and later is 10 ms :)
The resolution of this property depends on the system timer, which depends on the underlying operating system. It tends to be between 0.5 and 15 milliseconds.
As a result, repeated calls to the Now property in a short time interval, such as in a loop, may return the same value.
MSDN Link
I'm coding a clock app using DispatcherTimer in C#, but for some reason my clock seems to skip 1 second every now and then.
e.g. 52s -> 54s, skipping 53s.
It seems to me that the timer does not execute exactly every second.
DispatcherTimer timer = new DispatcherTimer();
timer.Tick += DispatcherTimerEventHandler;
timer.Interval = new TimeSpan(0, 0, 0, 0, 1000);
or
timer.Interval = new TimeSpan(0, 0, 0, 1);
Neither one fixes the problem.
From the documentation of the DispatcherTimer (emphasis mine):
Timers are not guaranteed to execute exactly when the time interval occurs, but they are guaranteed to not execute before the time interval occurs. This is because DispatcherTimer operations are placed on the Dispatcher queue like other operations. When the DispatcherTimer operation executes is dependent on the other jobs in the queue and their priorities.
Usually I would recommend using some kind of scheduling/cron framework like Quartz.NET, but that seems like huge overhead for your use case.
For a "clock app", although it's quite difficult to know what exactly you want to do, I would go for my own solution - meaning some kind of new thread with a while-loop or a BackgroundWorker.
Even a timer may help you, for example like in this answer.
Basically your approach is wrong. Don't add "one second" to a TimeSpan (or any kind of counter) based on any kind of timer, because in Windows, timers aren't guaranteed to fire at the exact interval. Using that approach will result in "drift" over longer periods of time.
Instead, store a DateTime "start" value and subtract that from the current DateTime to get a TimeSpan representing how much time has transpired since "start". With that approach it won't matter how often you update the clock. You could update 10 times a second, or once a minute, and the TimeSpan returned will still be correct.
An alternative is to use the Stopwatch class. It encapsulates the above process for you by returning a TimeSpan from its Elapsed property.
With either of the two approaches above, it wouldn't matter how often the Timer fires as it will keep accurate time independent of the Timer frequency/timing.
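A minimal sketch of the start-time-subtraction approach, reusing the DispatcherTimerEventHandler from the question (the clockText TextBlock is just an assumed display element for illustration):
private readonly DateTime _start = DateTime.Now;    // store the "start" value once

private void DispatcherTimerEventHandler(object sender, EventArgs e)
{
    TimeSpan elapsed = DateTime.Now - _start;        // correct no matter how late the tick fires
    clockText.Text = elapsed.ToString(@"mm\:ss");    // e.g. "07:42"
}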
I just need a stable count of the current program's progression in milliseconds in C#. I don't care what timestamp it is measured from, whether it's when the program starts, midnight, or the epoch; I just need a single function that returns a stable millisecond value that does not change in an abnormal manner and only increases by 1 each millisecond. You'd be surprised how few comprehensive and simple answers I could find by searching.
When your program starts, create a Stopwatch and Start() it.
private Stopwatch sw = new Stopwatch();

public void StartMethod()
{
    sw.Start();
}
At any point you can query the Stopwatch:
public void SomeMethod()
{
    var a = sw.ElapsedMilliseconds;
}
If you want something accurate/precise then you need to use a Stopwatch, and please read Eric Lippert's blog post (he was formerly a principal developer on the C# compiler team), Precision and accuracy of DateTime.
Excerpt:
Now, the question “how much time has elapsed from start to finish?” is a completely different question than “what time is it right now?” If the question you want to ask is about how long some operation took, and you want a high-precision, high-accuracy answer, then use the StopWatch class. It really does have nanosecond precision and accuracy that is close to its precision.
If you don't need an accurate time, and you don't care about precision and the possibility of edge cases that cause your milliseconds to actually be negative, then use DateTime.
Do you mean DateTime.Now? It holds absolute time, and subtracting two DateTime instances gives you a TimeSpan object which has a TotalMilliseconds property.
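In code, that subtraction looks like this:
DateTime start = DateTime.UtcNow;
// ... do some work ...
double elapsedMs = (DateTime.UtcNow - start).TotalMilliseconds;   // milliseconds since start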
You could store the current time in milliseconds when the program starts, then in your function get the current time again and subtract.
Edit:
If what you're going for is a stable count of process cycles, I would use processor clocks instead of time.
As per your comment, you can use DateTime.Ticks; each tick is 1/10,000 of a millisecond (100 nanoseconds).
Also, if you want to do the time thing, you can use DateTime.Now as the variable you store when you start your program, and take another DateTime.Now whenever you want the time. It has a Millisecond property.
Either way, DateTime is what you're looking for.
It sounds like you are just trying to get the current date and time, in milliseconds. If you are just trying to get the current time, in milliseconds, try this:
long milliseconds = DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond;
How can I do Thread.Sleep(10.4166667);?
OK, I see now that Sleep is not the way to go.
So I use a Timer, but a timer works in milliseconds too, and I need something more precise.
Is there a timer with nanosecond accuracy?
So you want your thread to sleep precisely for that time and then resume? Forget about it. This parameter tells the system to wake the thread after at least this number of milliseconds. At least. And after resuming, the thread could be put to sleep once again in the blink of an eye. That's just how operating systems work, and you cannot control it.
Please note that Thread.Sleep sleeps for as long as you tell it (and not even that precisely), regardless of how long the code before or after it takes to execute.
Your question seems to imply that you want some code to be executed in certain intervals, since a precise time seems to matter. Thus you might prefer a Timer.
To do such a precise sleep you would need to use a real-time operating system, and you would likely need specialized hardware. Integrity RTOS claims to respond to interrupts in nanoseconds, as do others.
This isn't going to happen with C# or any kind of high level sleep call.
Please note that the argument is in milliseconds, so 10 is 10 milliseconds. Are you sure you want 10.41 etc milliseconds? If you want 10.41 seconds, then you can use 10416.
The input to Thread.Sleep is the number of milliseconds for which the thread is blocked. After that it will be runnable, but you have no influence over when it is actually scheduled. I.e. in theory the thread could wait forever before resuming execution.
It hardly ever makes sense to rely on specific number of milliseconds here. If you're trying to synchronize work between two threads there are better options than using Sleep.
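For example, instead of sleeping and hoping the other thread is done, you can block on a synchronization primitive until it signals. A minimal sketch using AutoResetEvent (the delays and messages are only illustrative):
using System;
using System.Threading;

class Handshake
{
    // Signaled by the producer when work is ready; the consumer blocks on it.
    static readonly AutoResetEvent workReady = new AutoResetEvent(false);

    static void Main()
    {
        new Thread(() =>
        {
            Thread.Sleep(250);        // simulate preparing some work
            workReady.Set();          // wake the waiting thread immediately
        }).Start();

        // Blocks until signaled (or gives up after 5 seconds) - no guessed Sleep durations.
        if (workReady.WaitOne(TimeSpan.FromSeconds(5)))
            Console.WriteLine("Work is ready.");
        else
            Console.WriteLine("Timed out waiting for work.");
    }
}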
As you already mentioned, you could combine DispatcherTimer with Stopwatch (making sure Stopwatch.IsHighResolution and Stopwatch.Frequency suit your needs). Start the timer and the stopwatch, and on discrete Tick events of the timer check the exact elapsed time of the stopwatch.
If you are trying to rate-limit a calculation and insist on using only Thread.Sleep, then be aware there is an underlying kernel pulse rate (roughly 15 ms), so your thread will only resume when a pulse occurs. The guarantee provided is to "wait at least the specified duration." For example, if you call Thread.Sleep(1) (to wait 1 ms), and the last pulse was 13 ms ago, then you will end up waiting 2 ms until the next pulse comes.
The draw synchronization I implemented for a rendering engine does something similar to dithering to get the quantization to the 15ms intervals to be uniformly distributed around my desired time interval. It is mostly just a matter of subtracting half the pulse interval from the sleep duration, so only half the invocations wait the extra duration to the next 15ms pulse, and half occur early.
public class TimeSynchronizer {
    //see https://learn.microsoft.com/en-us/windows/win32/api/synchapi/nf-synchapi-sleep
    public const double THREAD_PULSE_MS = 15.6d;//TODO read exact value for your system

    public readonly TimeSpan Min = TimeSpan.Zero;

    public TimeSynchronizer(TimeSpan? min = null) {
        if (min.HasValue && min.Value.Ticks > 0L) this.Min = min.Value;
    }

    private DateTime _targetTimeUtc = DateTime.UtcNow;//you may wish to defer this initialization so the first Synchronize() call assuredly doesn't wait

    public void Synchronize() {
        if (this.Min.Ticks > 0L) {
            DateTime nowUtc = DateTime.UtcNow;
            TimeSpan waitDuration = this._targetTimeUtc - nowUtc;
            //store the exact desired return time for the next interval
            if (waitDuration.Ticks > 0L)
                this._targetTimeUtc += this.Min;
            else this._targetTimeUtc = nowUtc + this.Min;//missed it (this does not preserve absolute synchronization and can de-phase from metered interval times)

            if (waitDuration.TotalMilliseconds > THREAD_PULSE_MS/2d)
                Thread.Sleep(waitDuration.Subtract(TimeSpan.FromMilliseconds(THREAD_PULSE_MS/2d)));
        }
    }
}
I do not recommend this solution if your nominal sleep durations are significantly less than the pulse rate, because it will frequently not wait at all in that case.
The following screenshot shows rough percentile bands on how long it truly takes (from buckets of 20 samples each - dark green are the median values), with a (nominal) minimum duration between frames set at 30fps (33.333ms):
I am suspicious that the exact pulse duration is 1 second / 600, since in SQL server a single DateTime tick is exactly 1/300th of a second
Ideally I would like to have something similar to the Stopwatch class but with an extra property called Speed which would determine how quickly the timer changes minutes. I am not quite sure how I would go about implementing this.
Edit
Since people don't quite seem to understand why I want to do this: consider playing a soccer game, or any sports game. The halves are measured in minutes, but the time-frame in which the game is played is significantly shorter, i.e. a 45-minute half is played in about 2.5 minutes.
Subclass it, call through to the superclass methods to do their usual work, but multiply all the return values by Speed as appropriate.
I would use the Stopwatch as it is, then just multiply the result, for example:
var Speed = 1.2; //Time progresses 20% faster in this example
var s = new Stopwatch();
s.Start();
//do things
s.Stop();
var parallelUniverseMilliseconds = s.ElapsedMilliseconds * Speed;
The reason your simple "multiplication" doesn't work is that it doesn't speed up the passing of time - the factor applies to all time that has passed, as well as time that is passing.
So, if you set your speed factor to 3 and then wait 10 minutes, your clock will correctly read 30 minutes. But if you then change the factor to 2, your clock will immediately read 20 minutes because the multiplication is applied to time already passed. That's obviously not correct.
I don't think the stopwatch is the class you want to measure "system time" with. I think you want to measure it yourself, and store the elapsed time in your own variable.
Assuming that your target project really is a game, you will likely have your "game loop" somewhere in code. Each time through the loop, you can use a regular stopwatch object to measure how much real-time has elapsed. Multiply that value by your speed-up factor and add it to a separate game-time counter. That way, if you reduce your speed factor, you only reduce the factor applied to passing time, not to the time you've already recorded.
You can wrap all this behaviour into your own stopwatch class if need be. If you do that, then I'd suggest that you calculate/accumulate the elapsed time both "every time it's requested" and also "every time the factor is changed." So you have a class something like this (note that I've skipped field declarations and some simple private methods for brevity - this is just a rough idea):
public class SpeedyStopwatch
{
    // This is the time that your game/system will run from
    public TimeSpan ElapsedTime
    {
        get
        {
            CalculateElapsedTime();
            return this._elapsedTime;
        }
    }

    // This can be set to any value to control the passage of time
    public double TimeFactor
    {
        get { return this._timeFactor; }
        set
        {
            CalculateElapsedTime();
            this._timeFactor = value;
        }
    }

    private void CalculateElapsedTime()
    {
        // Find out how long (real-time) since we last called the method
        TimeSpan lastTimeInterval = GetElapsedTimeSinceLastCalculation();

        // Multiply this time by our factor
        lastTimeInterval *= this._timeFactor;

        // Add the multiplied time to our elapsed time
        this._elapsedTime += lastTimeInterval;
    }
}
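Usage might then look like this, assuming the skipped fields and GetElapsedTimeSinceLastCalculation are filled in; the factor of 18 is only an example matching the soccer scenario (45 minutes played in 2.5 minutes):
var gameClock = new SpeedyStopwatch();
gameClock.TimeFactor = 18.0;               // 45 simulated minutes elapse in 2.5 real minutes
// ... later, each pass through the game loop ...
TimeSpan halfTime = gameClock.ElapsedTime;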
According to modern physics, what you need to do to make your timer go "faster" is to speed up the computer that your software is running on. I don't mean the speed at which it performs calculations, but its physical speed. The closer you get to the speed of light (the constant c), the greater the rate at which time passes for your computer, so as you approach the speed of light, time will "speed up" for you.
It sounds like what you might actually be looking for is an event scheduler, where you specify that certain events must happen at specific points in simulated time and you want to be able to change the relationship between real time and simulated time (perhaps dynamically). You can run into boundary cases when you start to change the speed of time in the process of running your simulation and you may also have to deal with cases where real time takes longer to return than normal (your thread didn't get a time slice as soon as you wanted, so you might not actually be able to achieve the simulated time you're targeting.)
For instance, suppose you wanted to update your simulation at least once per 50ms of simulated time. You can implement the simulation scheduler as a queue where you push events and use a scaled output from a normal Stopwatch class to drive the scheduler. The process looks something like this:
Push (simulate at t=0) event to event queue
Start stopwatch
lastTime = 0
simTime = 0
While running
    simTime += scale * (stopwatch.Time - lastTime)
    lastTime = stopwatch.Time
    While events in queue that have passed their time
        pop and execute event
        push (simulate at t = lastEventT + dt) event to event queue
This can be generalized to different types of events occurring at different intervals. You still need to deal with the boundary case where the event queue is ballooning because the simulation can't keep up with real time.
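A rough C# sketch of that loop, assuming .NET 6+ for PriorityQueue; the ScaledScheduler name, the RunOnce shape, and the 50 ms reschedule step are illustrative rather than part of the answer above:
using System;
using System.Collections.Generic;
using System.Diagnostics;

class ScaledScheduler
{
    private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
    private readonly PriorityQueue<Action, double> _queue = new PriorityQueue<Action, double>();

    public double Scale { get; set; } = 1.0;   // simulated seconds per real second
    private double _simTime;                   // accumulated simulated time, in seconds
    private double _lastReal;                  // last Stopwatch reading, in seconds

    public void Push(Action action, double atSimTime) => _queue.Enqueue(action, atSimTime);

    // Call this repeatedly from your main loop.
    public void RunOnce()
    {
        double nowReal = _stopwatch.Elapsed.TotalSeconds;
        _simTime += Scale * (nowReal - _lastReal);   // simTime += scale * (now - lastTime)
        _lastReal = nowReal;

        // Pop and execute every event whose simulated time has passed,
        // rescheduling each one 50 ms (simulated) later.
        while (_queue.TryPeek(out Action action, out double due) && due <= _simTime)
        {
            _queue.Dequeue();
            action();
            Push(action, due + 0.050);
        }
    }
}
Because only the real-time delta since the last call is scaled, changing Scale mid-run affects future simulated time without rewriting time that has already passed.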
I'm not entirely sure what you're looking to do (doesn't a minute always have 60 seconds?), but I'd utilize Thread.Sleep() to accomplish what you want.