Running an accurate clock - C#

I am working on an analog clock app that has the accuracy of a normal clock. Instead of displaying the current time, it has to display the time starting from a given time. I used a timer event handler that is triggered every second.
protected void Timer_Tick(object sender, EventArgs e)
{
    DateTime CurrentTime = DateTime.Now;
    TimeSpan ElapsedTime = m_PrevCurrentTime.HasValue ? CurrentTime - m_PrevCurrentTime.Value : TimeSpan.Zero;
    Time += ElapsedTime;
    // Update clock.
    // ...
    m_PrevCurrentTime = CurrentTime;
}
However, ElapsedTime is never exactly 1 second, so the clock does not run exactly.
Is there a better approach?

You can never get an exact clock reading, at least not on Windows.
The ordinary resolution of the system clock is about 15 ms, meaning that if you keep reading the value you get the same result for roughly 15 ms in a row.
You can get more information in this question.
It is possible to reduce the timer period to a lower value (i.e., get a finer resolution), and you can find details in the same question I referenced above.
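As a rough illustration of reducing the timer period, here is a minimal sketch (not taken from the referenced question) using the winmm.dll timeBeginPeriod/timeEndPeriod calls; the 1 ms value and the wrapper class are assumptions:
using System;
using System.Runtime.InteropServices;

static class HighResPeriod
{
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uMilliseconds);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uMilliseconds);

    public static void Run(Action work)
    {
        timeBeginPeriod(1);                 // request ~1 ms system timer resolution
        try { work(); }
        finally { timeEndPeriod(1); }       // always restore the default period
    }
}
Keep in mind that this raises the timer frequency system-wide and can increase power consumption, so the finer period should be held only for as long as it is needed.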

Related

How to use NMEA time as elapsed time in C#?

In my Windows Forms application I use a Stopwatch to calculate the elapsed time between events, but I want to get the time from NMEA (the GPRMC sentence) and use it as the elapsed time instead. How can I do this?
Here is my code:
Stopwatch watch = new Stopwatch();
Textbox1.Text = gsaat.Substring(4, 2);
// gsaat is the UTC time part of the GPRMC sentence

private void timer2_Tick(object sender, EventArgs e)
{
    if (Convert.ToDouble(textBox2.Text) >= Convert.ToDouble(label1.Text) && Convert.ToDouble(label1.Text) >= Convert.ToDouble(textBox1.Text))
    {
        watch.Start();
        var time = watch.Elapsed.TotalSeconds;
        // I want to use gsaat as the elapsed time here instead of the stopwatch
        label2.Text = String.Format("{0:00.00}", time);
    }
}
EDIT:
In my application I am running an acceleration test using a GNSS module. In the application I set boundary conditions for the velocity in Textbox1 and Textbox2; label1 shows the current velocity of the car and label2 shows the elapsed time while the car's velocity is between those boundaries. For example, I set Textbox1.Text = 10 and Textbox2.Text = 100, the car starts to accelerate, and I want to measure the time it takes to accelerate from 10 to 100. To do this I used a Stopwatch, but I ran into some problems. Now I want to use UTC time to measure the time between the car's velocity being 10 and 100. I can get the UTC time from the GPRMC sentences and I have parsed it, too. However, I can't make it start from zero when the car's velocity is 10 and stop when the velocity passes 100.
It's not really clear what you want to achieve, but here is a guess:
Parse the NMEA time and save it as a DateTime object. Then do
var timeSpan = DateTime.UtcNow - nmeaTimestamp;
to get the time since the transmitted timestamp.
Have a look at the NMEA documentation for the meaning of this timestamp ("UTC of position fix"): it is the time at which the transmitting station was close to the transmitted position.
If you want to be alerted after a particular time span, use a Timer. It will raise an event when the time has elapsed.
EDIT: Pick a packet close to velocity 10 and a packet close to velocity 100, parse the timestamps and subtract them:
var accelerationTimeMilliseconds = (timestampAt100 - timestampAt10).TotalMilliseconds;
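For completeness, a hedged sketch of how the two timestamps might be obtained; the field layout (hhmmss.ss in the second field of GPRMC) is the standard one, but the helper, the variable names and the same-day assumption are illustrative, not from the question:
using System;
using System.Globalization;

// Parses the UTC time field of a GPRMC sentence, e.g. "$GPRMC,123519.00,A,...".
static DateTime ParseGprmcTime(string sentence)
{
    string utc = sentence.Split(',')[1];                    // "hhmmss.ss"
    int hours = int.Parse(utc.Substring(0, 2));
    int minutes = int.Parse(utc.Substring(2, 2));
    double seconds = double.Parse(utc.Substring(4), CultureInfo.InvariantCulture);
    return DateTime.UtcNow.Date.AddHours(hours).AddMinutes(minutes).AddSeconds(seconds);
}

// Usage: keep the sentence received nearest each boundary, then subtract.
var timestampAt10 = ParseGprmcTime(sentenceAt10);     // packet where the velocity reached 10
var timestampAt100 = ParseGprmcTime(sentenceAt100);   // packet where the velocity reached 100
double elapsedMs = (timestampAt100 - timestampAt10).TotalMilliseconds;
Note that this sketch assumes both packets fall on the same UTC day.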

What is the default time precision of DateTime?

I wrote a small WinForms program for data transfer in Visual Studio, and I used a method to measure the duration of the transfer. After the transfer is done, the program shows the time in a dialog window.
But I don't know what the time precision or resolution of the timer is. How can it have such precision, even down to microseconds?
var startTime = DateTime.Now;
this.transferdata();
var endTime = DateTime.Now;
var timeElapsed = endTime.Subtract(startTime);
When I looked at the definition of the DateTime class, I only saw a precision of milliseconds. Can anybody tell me why there is such a high-resolution timer in Visual Studio 2012? Or is it related to the operating system?
The precision of the clock depends on the operating system. The system clock ticks a certain number of times per second, and you can only measure whole ticks.
You can test the resolution for a specific computer using code like this:
DateTime t1 = DateTime.Now;
DateTime t2;
while ((t2 = DateTime.Now) == t1) ;
Console.WriteLine(t2 - t1);
On my computer the result is 00:00:00.0156253, which means that the system clock ticks 64 times per second.
(Note: The DateTime type also has ticks, but that is not the same as the system clock ticks. A DateTime tick is 1/10000000 second.)
To measure time more precisely, you should use the Stopwatch class. Its resolution also depends on the system, but is much higher than the system clock. You can get the resolution from the Stopwatch.Frequency property, and on my computer it returns 2143566 which is a tad more than 64...
Start a stopwatch before the work and stop it after, then get the elapsed time:
Stopwatch time = Stopwatch.StartNew();
this.transferdata();
time.Stop();
TimeSpan timeElapsed = time.Elapsed;
That will return the time in the resolution that the TimeSpan type can handle, i.e., 1/10000000 second. You can also calculate the time in seconds from the number of Stopwatch ticks:
double secondsElapsed = (double)time.ElapsedTicks / Stopwatch.Frequency;
You are confusing several things: precision, accuracy, frequency, and resolution.
You might have a variable that is accurate to a billion decimal places. But if you can't actually measure a number that small, that's the difference between precision and resolution. Frequency is the number of times per second a measurement is taken, which relates to resolution. Accuracy is how close a given sample is to the real measurement.
So, given that DateTime has a precision much higher than the system clock, simply calling DateTime.Now will not necessarily give you an exact timestamp. There are, however, higher-resolution timers in Windows, and the Stopwatch class uses them to measure elapsed time, so if you use that class you get much better accuracy.
DateTime has no "default precision". It has only one precision, along with the minimum and maximum values it can store. DateTime internally stores its value as a single number, and this value is formatted to whatever unit you want to display (seconds, minutes, days, ticks, whatever...).
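To make the distinction concrete, here is a small sketch (assumed, not from the answers above) that prints DateTime's representational tick rate alongside the Stopwatch counter frequency:
using System;
using System.Diagnostics;

class PrecisionVsResolution
{
    static void Main()
    {
        // DateTime/TimeSpan can *represent* 100 ns ticks...
        Console.WriteLine("TimeSpan ticks per second: " + TimeSpan.TicksPerSecond);   // 10,000,000

        // ...but DateTime.Now only *advances* at the system clock rate, while
        // Stopwatch reads a hardware counter running at this frequency:
        Console.WriteLine("Stopwatch frequency:       " + Stopwatch.Frequency);
        Console.WriteLine("High resolution counter:   " + Stopwatch.IsHighResolution);
    }
}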

Update UI element from thread

I am trying to create a timer in Windows Phone that updates every 10-15 ms (for a UI element), and I want to be able to add time to it. So I created a TimeSpan and a DispatcherTimer with an interval of 15 ms. Every 15 ms the event fires, I subtract 15 ms from the TimeSpan, and when the TimeSpan is <= 0 I call some method. When I set the TimeSpan to 4 seconds (for example), more than 4 seconds actually pass, about 4.6 seconds. I also tried async/await, but that didn't work. I tried System.Threading, but I don't know how to update an element that was created on another thread.
Every 15 ms the event fires, where I subtract 15 ms from the TimeSpan, and when the TimeSpan is <= 0 I call some method.
Your logic is flawed. You can't update your TimeSpan this way because:
As Stephen Cleary mentioned in his answer, you have no guarantee that the timer will fire at exactly 15 ms.
Even if it did, this doesn't take into account the time needed to actually update your TimeSpan (say it takes 1 ms to compute the new value; your timer will then drift by 1 ms every 15 ms).
To keep accurate time, store the timestamp at which you started (retrieve it using DateTime.UtcNow). Every time your timer ticks, take a new timestamp and subtract the saved one from it. This way you know exactly how much time has passed, and your timer will never drift.
private DateTime start; // Set this with DateTime.UtcNow when starting the timer

void timer_Tick(object sender, EventArgs e)
{
    // Compute the new timespan
    var timespan = DateTime.UtcNow - start;
    // Do whatever with it (check if it's greater than 4 seconds, display it, ...)
}
Every 15 ms the event fires, where I subtract 15 ms from the TimeSpan
And there's your problem.
When you set a timer on any Windows platform, you can't expect a high level of precision. On the desktop, I believe the normal scheduler period on consumer hardware is ~12 ms (and of course other apps can throw that off considerably). I have no idea what the scheduling is like on the phone, but I assume it's less accurate than the desktop (for battery lifetime reasons).
So you simply can't approach the problem that way on Windows, because you can't assume that a timer will fire every 15 ms. Instead, just set the DispatcherTimer to the full time span that you need, e.g., 4 seconds.
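A minimal sketch of that suggestion, assuming a Silverlight/Windows Phone style DispatcherTimer; the 4-second value and the handler body are illustrative:
using System;
using System.Windows.Threading;

var timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(4) };
timer.Tick += (s, e) =>
{
    timer.Stop();          // one-shot: stop once the full span has elapsed
    // Call whatever should happen after 4 seconds here.
};
timer.Start();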
Try this to update a UI element from your timer thread:
Dispatcher.BeginInvoke(() =>
{
    // Update your element.
});

Trigger event using timer on a specific day and time

I am using a System.Timers.Timer to trigger an event. Currently I trigger it every hour and check whether it matches the configured value (day, time).
But is it possible to trigger it at a specific time? Say, on Sunday at 12 AM.
Windows Task Scheduler would be more appropriate, but it's not an option.
Thanks in advance.
It isn't clear why you wouldn't just set the timer's Interval to the time remaining until the target date/time. There is a limit on the number of milliseconds: you can time up to 2^31 milliseconds, which is close to 25 days. You'll be fine as long as you can stay within that range.
private static void SetTimer(Timer timer, DateTime due)
{
    var ts = due - DateTime.Now;
    timer.Interval = ts.TotalMilliseconds;
    timer.AutoReset = false;
    timer.Start();
}
The timer doesn't support this kind of interval, but you could check the current day and time every 20 seconds.
Edit: Sorry, you are doing that already... why not make the interval half of the time remaining each time you check (a Zeno timer)?
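A rough sketch of that "Zeno timer" idea; the 20 ms cut-off and the OnTargetReached handler are assumptions for illustration:
// Re-arm the timer for half of the remaining time on every tick until the
// target moment is close enough, then do the real work.
void Rearm(System.Timers.Timer timer, DateTime due)
{
    double remainingMs = (due - DateTime.Now).TotalMilliseconds;
    if (remainingMs <= 20)
    {
        OnTargetReached();              // hypothetical handler for the scheduled work
        return;
    }
    timer.Interval = remainingMs / 2;   // wait half of what is left
    timer.Start();                      // AutoReset is assumed to be false
}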
In a similar situation, I'm using a System.Threading.Timer to achieve this. Basically, I set its due time to desiredDateTime - DateTime.Now, so that it ticks at desiredDateTime.
If you get another date in the meantime, you can use Timer.Change() to change the tick time to the new date. Don't forget to Dispose() the Timer when you no longer need it!
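A minimal sketch of that approach; the DoWork callback and the variable names are illustrative:
using System;
using System.Threading;

// Fire once at desiredDateTime.
var timer = new Timer(_ => DoWork(), null,
                      desiredDateTime - DateTime.Now,   // due time
                      Timeout.InfiniteTimeSpan);        // no periodic repeat

// If the target date changes in the meantime:
timer.Change(newDesiredDateTime - DateTime.Now, Timeout.InfiniteTimeSpan);

// When it is no longer needed:
timer.Dispose();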

inaccurate .NET timer?

I'm developing an application and I need to get the current date from a server (it differs from the machine's date).
I receive the date from the server and with a simple Split I create a new DateTime:
globalVars.fec = new DateTime(DateTime.Now.Year, DateTime.Now.Month, DateTime.Now.Day, int.Parse(infoHour[0]), int.Parse(infoHour[1]), int.Parse(infoHour[2]));
globalVars is a class and fec is a public static variable so that I can access it anywhere in the application (bad coding, I know...).
Now I need a timer that checks whether that date is equal to any of the dates I have stored in a List, and if it is, calls a function.
List<DateTime> fechas = new List<DateTime>();
Before having to obtain the date from a server I was using the computer's date, so to check whether the dates matched I was using this:
private void timerDatesMatch_Tick(object sender, EventArgs e)
{
    DateTime tick = DateTime.Now;
    foreach (DateTime dt in fechas)
    {
        if (dt == tick)
        {
            // blah blah
        }
    }
}
Now I have the date from the server, so DateTime.Now can't be used here. Instead I have created a new timer with Interval = 1000, and on each tick I add 1 second to globalVars.fec using:
globalVars.fec = globalVars.fec.AddSeconds(1);
But the clock isn't accurate, and every 30 minutes it loses about 30 seconds.
Is there another way of doing what I'm trying to do? I've thought about using a System.Threading.Timer instead, but I need access to other threads and non-static functions.
Store the difference between the server's time and the local time. Calculate the server's time when you need it using that difference.
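A small sketch of that idea (the variable names are illustrative):
// Capture the offset once, when the server time is received.
TimeSpan serverOffset = serverTime - DateTime.UtcNow;

// Whenever the current server time is needed, derive it from the local clock.
DateTime currentServerTime = DateTime.UtcNow + serverOffset;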
If you create a timer with an interval of 1000 ms, it will fire no sooner than 1000 ms after it is started. So you can pretty much guarantee that it will fire after more than 1000 ms, which means you will "lose" time by adding 1 s on each tick; this error accumulates with every tick. A better approach is to record a start time and use the current time to determine the offset from that known start time, so that no error accumulates in your time keeping. (There will still be some error, but you will not drift away from real time over time.)
Different timers (Forms.Timer, Threading.Timer, etc.) give different accuracies as well; Forms.Timer is particularly poor for accuracy.
You could also use a high-performance timer to keep track of the time better - see here, for example.
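As a hedged sketch of that drift-free approach applied to the server-time scenario (the member names are assumptions for illustration):
using System;
using System.Diagnostics;

// Class members (sketch):
DateTime serverBase;        // server time at the moment it was received
Stopwatch sinceBase;        // high-resolution counter started at that same moment

void OnServerTimeReceived(DateTime serverTime)
{
    serverBase = serverTime;
    sinceBase = Stopwatch.StartNew();
}

// Derived on demand; nothing is accumulated per tick, so missed or late ticks cause no drift.
DateTime CurrentServerTime() => serverBase + sinceBase.Elapsed;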
Here is a reliable 1 μs timer:
See https://stackoverflow.com/questions/15725711/obtaining-microsecond-precision-using-net-without-platform-invoke?noredirect=1#comment22341931_15725711
I guarantee it's faster and more accurate than Stopwatch and performance counters, and it uses the fractions of a second you have in the time slice wisely!
