Can Ticks be ticked at different speed depending on computer? - c#

I created an app that uses the Timer class to call back a method at a certain time of day and re-invoke it every 24 hours after that.
I use `(int)TimeSpan.FromHours(24).TotalMilliseconds` to get the interval for "24 hours later".
This works fine for me, but on different computers the trigger time is way off.
Any way to debug this? How should I handle this issue?

How much is "way off" to you? If you want an app to run at a specific time, schedule it for that specific time, not 24 hours from the time it finishes - you're inevitably going to see some slippage doing it that way, because each day's start time slips by X seconds, where X is however long the previous day's run took.

How far is "way off"? Yes, desktop computer clocks frequently drift by a second or more per day; they generally use an NTP server to correct the drift. But it's just the nature of the beast.

First of all, there is no "my ticks" because a Tick is a well-defined value.
A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.
A DateTime object also has a Ticks property that you can use to access it. I wrote some simple code along these lines and it worked great for me, producing accurate results.
I see no reason your implementation should drift so much. Please post a sample that consistently reproduces the problem, or the relevant pieces of your own source.
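As the first answer suggests, scheduling for a specific time of day (and re-arming for the next occurrence after each run) avoids the cumulative slippage of "now + 24 hours". A minimal sketch with `System.Threading.Timer`; the 03:00 target time is an arbitrary example:

```csharp
using System;
using System.Threading;

// Compute how long to wait until the next occurrence of a target time of day.
static TimeSpan DelayUntil(TimeSpan targetTimeOfDay, DateTime now)
{
    var next = now.Date + targetTimeOfDay;
    if (next <= now)
        next = next.AddDays(1);   // target already passed today -> schedule for tomorrow
    return next - now;
}

var target = new TimeSpan(3, 0, 0);   // hypothetical target: 03:00
Timer timer = null;
timer = new Timer(_ =>
{
    Console.WriteLine($"Fired at {DateTime.Now:HH:mm:ss}");
    // Re-arm for the *next* 03:00 rather than adding a flat 24 h to "now".
    timer.Change(DelayUntil(target, DateTime.Now), Timeout.InfiniteTimeSpan);
}, null, DelayUntil(target, DateTime.Now), Timeout.InfiniteTimeSpan);

Console.WriteLine($"First run in {DelayUntil(target, DateTime.Now)}");
```

Note that the process must stay alive for the timer to fire; in a short-lived console run it will simply print the computed delay and exit.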

Related

ConfigurationChangeWatcher.Poller()

When I profile my application, it seems that 70% of the time is spent in the method:
Microsoft.Practices.EnterpriseLibrary.Configuration.Storage.ConfigurationChangeWatcher.Poller()
From what I can gather this method should only be invoked every 50 seconds, so I find it hard to believe that it is actually taking up that much time.
Does anyone know how I can reduce the frequency with which this method is called?
I'm surprised that, in an application doing real work, a timer thread that executes once every 15 seconds (the default) and appears to just be comparing file times is taking up so much time.
What if you try to set the timer interval to a longer interval sometime after initializing Enterprise Library:
ConfigurationChangeWatcher.SetDefaultPollDelayInMilliseconds(int.MaxValue);
If you do that does the time spent decrease?
Also, if you use the FileConfigurationSource class programmatically there is a constructor overload to disable watching for configuration file changes.

How can I run a method in Silverlight once a month?

I have a program that displays some information in the form of a chart. The information updates every month, so I need to retrieve it once a month.
I thought about having a thread that sleeps for a month, but I don't know if that is doable. Can someone suggest a better way to do this?
Thanks!
You can store the last-accessed date in a file/table and then check it each time the program starts. If more than one month (30/31 days) has elapsed, re-fetch your data.
Use the DateTime class in .NET to help you with it; DateTime.Subtract() gives you the difference between two dates.
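A minimal sketch of that check. The 30-day threshold and the temp-file location are assumptions for illustration; Silverlight code would use IsolatedStorageFile rather than a plain file path:

```csharp
using System;
using System.Globalization;
using System.IO;

// A refresh is due if there is no record yet, or if ~30 days have elapsed
// since the stored "last fetched" date (threshold chosen arbitrarily here).
static bool RefreshDue(DateTime? lastFetched, DateTime now) =>
    lastFetched == null || now.Subtract(lastFetched.Value).TotalDays >= 30;

// Hypothetical storage location; a Silverlight app would use isolated storage.
var path = Path.Combine(Path.GetTempPath(), "last-fetch.txt");

DateTime? last = File.Exists(path)
    ? DateTime.Parse(File.ReadAllText(path), CultureInfo.InvariantCulture,
                     DateTimeStyles.RoundtripKind)
    : (DateTime?)null;

if (RefreshDue(last, DateTime.UtcNow))
{
    Console.WriteLine("Fetching new chart data...");
    File.WriteAllText(path, DateTime.UtcNow.ToString("o"));  // record this fetch
}
```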
I'm wondering what you're trying to do here. It sounds... odd.
So, where is the info coming from? A database? If so, can you just calculate the month's values when pulling the data normally? Or you could have a job running in SQL Server Agent that updates a table once a month with the calculated values.
On Windows you could set up a scheduled task with a similar concept: it runs once a month and pulls down the new data.
But basically, a little more info about what you're trying to do might help us give you better answers.
I have seen implementations where, on startup, a thread is started that calculates the time until the next read. The thread then waits on a shutdown event (signalled by the app), using the previously calculated time as the wait timeout. When the wait completes, if it wasn't due to the shutdown event, the data is read.
However, the more common implementation I have seen is to use a scheduled task to load the data for the app. Although this takes longer to implement than the threaded version, it should provide a more robust solution.
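The wait-with-timeout pattern described above can be sketched like this; the 200 ms timeout stands in for the computed time-until-next-read, shortened so the loop is observable:

```csharp
using System;
using System.Threading;

// Worker waits on a shutdown event; a timeout means "time to read the data",
// a signalled event means "the app is shutting down".
var shutdown = new ManualResetEvent(false);

var worker = new Thread(() =>
{
    var untilNextRead = TimeSpan.FromMilliseconds(200); // stand-in for the computed delay
    while (true)
    {
        // WaitOne returns true if the event was signalled, false on timeout.
        if (shutdown.WaitOne(untilNextRead))
        {
            Console.WriteLine("Shutdown requested, exiting worker.");
            return;
        }
        Console.WriteLine("Timeout elapsed: reading data now.");
    }
});

worker.Start();
Thread.Sleep(500);   // let a couple of simulated reads happen
shutdown.Set();      // the app signals shutdown
worker.Join();
```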

How to get an accurate time of day instead of DateTime.Now.TimeOfDay?

I want to get the accurate current time in the following format:
HH:MM:SS:mmm
HH - hours
MM - minutes
SS - seconds
mmm - milliseconds
As far as I know, DateTime.Now is not accurate enough, and Stopwatch is just for measuring elapsed time.
I want to get accuracy of 1 millisecond.
The problem that you have here is that DateTime gives you a very precise result, but its accuracy is governed by both the system's hardware clock and how quickly the OS responds to the time request.
Precision and accuracy may seem like the same thing, but they're not. You may be able to write a measurement down to one billionth of a metre, but if your measuring device is one metre long with no graduations, then you only have an accuracy of one metre.
If you really need an accurate time down to 1 millisecond then you'll need another time source that you can poll directly thus bypassing any OS delays.
It is possible to get devices that provide a connection to a real-time clock, for example GPS receivers (which are highly accurate). Even then, accuracy between two receivers will depend on whether they are the same distance from the satellite or not.
Well, the resolution of DateTime actually goes down to 100 nanoseconds (even though the effective resolution may be coarser in practice), so millisecond resolution should not be a problem.
All properties you need to solve your problem are available in the normal DateTime type.
Edit: As @CodeInChaos points out, even millisecond accuracy is not guaranteed using this API, so if you actually need that accuracy, it's no good :-/
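For the formatting side of the question, the standard custom format string does the job. Note the specifiers: in .NET, minutes are `mm` and milliseconds are `fff` (`MM` would be the month):

```csharp
using System;

// Format a DateTime as HH:MM:SS:mmm in the question's notation,
// i.e. "HH:mm:ss:fff" as a .NET custom format string.
static string Stamp(DateTime t) => t.ToString("HH:mm:ss:fff");

Console.WriteLine(Stamp(DateTime.Now));
// Millisecond *resolution* is in the output; millisecond *accuracy*
// still depends on the system clock, as discussed above.
```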
DateTime.Now has a resolution of approximately 10 ms:
http://msdn.microsoft.com/en-us/library/system.datetime.now.aspx
What resolution would be accurate enough for you?
Windows Multimedia Timer timeGetTime()
Granularity: 1 millisecond
Precision: 1 millisecond
Apparent precision: 2.32804262366679e-006 seconds
Apparent jitter: 1.19727906360006e-007 seconds
Ripped from here
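A workaround worth knowing (an addition here, not from the answers above, so treat it as a sketch): calibrate a wall-clock baseline once and extrapolate with Stopwatch, which ticks far finer than the ~10-15 ms system clock. Overall accuracy is still only as good as the initial DateTime reading:

```csharp
using System;
using System.Diagnostics;

// Calibrate once: remember a baseline wall-clock time and start a Stopwatch.
var baseline = DateTime.UtcNow;
var sw = Stopwatch.StartNew();

// Later reads extrapolate from the baseline using the Stopwatch's elapsed
// time, giving sub-millisecond resolution between reads.
DateTime PreciseUtcNow() => baseline + sw.Elapsed;

Console.WriteLine(PreciseUtcNow().ToString("HH:mm:ss:ffffff"));
```

Long-running processes would need to re-calibrate periodically, since Stopwatch and the system clock drift apart over time.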

Compare DateTime ticks on two machines

Is it a viable option to compare two FileInfo.CreationTimeUtc.Ticks of two files on two different computers to see which version is newer - or is there a better way?
Do Ticks depend on OS time or are they really physical ticks from some fixed date in the past?
The aim of a UTC time is that it will be universal - but both computers would have to have synchronized clocks for it to be appropriate to just compare the ticks. For example, if both our computers updated a file at the exact same instant (relativity aside) they could still record different times - it's not like computers tend to come with atomic clocks built-in.
Of course, they can synchronize time with NTP etc to be pretty close to each other. Whether that's good enough for your uses is hard to say without more information. You should also bear in mind the possibility of users deliberately messing with their system clocks - could that cause a problem for your use case?
"Do Ticks depend on OS time or are they really physical ticks from some fixed date in the past?"
Ticks are pretty much independent of the OS.
If I remember correctly, 1 second = 10000000 ticks.
So whatever time you are checking, the value you get from Ticks is ten million times what you get from TotalSeconds (though Ticks is more precise than TotalSeconds, obviously).
A tick is basically the smallest unit of time that you can measure from .NET.
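The conversion factors quoted above are exposed directly as constants on TimeSpan, so there is no need to hard-code them:

```csharp
using System;

// One second is ten million 100-nanosecond ticks; one millisecond is 10,000.
Console.WriteLine(TimeSpan.TicksPerSecond);       // prints 10000000
Console.WriteLine(TimeSpan.TicksPerMillisecond);  // prints 10000

// A one-second TimeSpan therefore carries exactly ten million ticks.
Console.WriteLine(TimeSpan.FromSeconds(1).Ticks); // prints 10000000
```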
As for using UTC: yes, it is as good as you can get. If the system time on the machine your app is running on is accurate enough, you'll manage without issue.
Basically, the more frequently the files are updated, the more this inaccuracy matters. If someone creates two versions of a file within one second, all of the system clocks must be precisely synchronized to get a correct result.
If you only have different versions once every several minutes, then it is very much good enough for you.
The short answer is that no, this is not a viable solution, at least not in the theoretical, anything-can-happen type of world.
The clock of a computer might be accurate to a billionth of a billionth of a second, but the problem isn't the accuracy, it's whether the clocks of the two computers are synchronized.
Here's an experiment: look at your watch, then ask a random stranger near you what time it is. If your file was written on your computer at the moment you looked at your watch, and the other file was written one second earlier on the stranger's computer, would your comparison correctly determine which file is newer?
Now, your solution might work, assuming that:
You're not going to have to compare millisecond- or nanosecond-level clock differences
The clocks of the computers in question are synchronized against some common source
(or at the very least set as close to each other as your inaccuracy criteria allow)
Depending on your requirements, I would seriously look at trying to find a different way of ensuring that you get the right value.
If you are talking about arbitrary files, then the answer about UTC is worth a try. The clocks should be the same.
If you have control over the files and the machines writing to them, I would write a version number at the start of the file. Then you can compare the version number. You would have to look into how to get unique version numbers if two machines write a file independently. To solve this, a version webservice could be implemented which both machines use.
Some file formats have version information built-in, and also time stamps. For example, Microsoft Office formats. You could then use the internal information to decide which was the newest. But you could end up with version conflict on these as well.
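The version-number idea above can be sketched as follows; the file path, the 4-byte header layout, and the payload are all hypothetical choices for illustration:

```csharp
using System;
using System.IO;

// Write a version integer at the start of the file, so two copies can be
// compared without trusting either machine's clock.
static void WriteWithVersion(string path, int version, byte[] payload)
{
    using var w = new BinaryWriter(File.Create(path));
    w.Write(version);   // 4-byte little-endian version header
    w.Write(payload);
}

// Read back just the version header.
static int ReadVersion(string path)
{
    using var r = new BinaryReader(File.OpenRead(path));
    return r.ReadInt32();
}

var path = Path.Combine(Path.GetTempPath(), "versioned.bin");
WriteWithVersion(path, 7, new byte[] { 1, 2, 3 });
Console.WriteLine(ReadVersion(path));   // prints 7
```

Generating unique, monotonically increasing version numbers across machines is the hard part, which is why the answer suggests a central version web service.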
From the way you phrased the question, I assume that for some reason you cannot call the DateTime comparison methods. Assuming this: the MSDN page for the Ticks property states, "A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond." The tick is thus a unit defined by the .NET library, not by the machine or OS (which is probably Windows, since you are using C#), so tick values can safely be compared.
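Given the clock-skew caveats raised in the other answers, a comparison that treats small differences as inconclusive is safer than a raw ordering. A sketch, with an arbitrary example tolerance:

```csharp
using System;

// Compare two UTC tick values, treating differences smaller than an assumed
// clock-skew tolerance as "can't tell" (0) rather than a definite ordering.
static int CompareWithTolerance(long ticksA, long ticksB, TimeSpan tolerance)
{
    long diff = ticksA - ticksB;
    if (Math.Abs(diff) <= tolerance.Ticks) return 0;  // within skew: treat as equal
    return diff > 0 ? 1 : -1;
}

var a = new DateTime(2024, 1, 1, 12, 0, 0, DateTimeKind.Utc);
var b = a.AddSeconds(1);

// With a 2-second tolerance, a one-second difference is inconclusive.
Console.WriteLine(CompareWithTolerance(a.Ticks, b.Ticks, TimeSpan.FromSeconds(2)));  // prints 0
```

In real code the tick values would come from `FileInfo.CreationTimeUtc.Ticks` on each machine, and the tolerance would reflect how well the clocks are synchronized.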

Alternatives to checking against the system time

I have an application whose license should expire after some period of time.
I can check the time in the application against the system time, but the system time can be changed by an administrator, so in my opinion checking against the system time is not a good idea.
What alternatives do I have?
Thank you
You could check against an online NTP server or, alternatively, on every run note how many days have elapsed on the system clock. If the clock is ever set back before a time you know has already elapsed, it will be obvious, and you will know they are trying to cheat.
I'd say that forcing the user to run with the wrong date set would be enough of a deterrent for most people. However, you could always combine it with storing the last-run date in the registry and comparing it with today's date every time: if the last-run date is later than today's date, they've probably set the clock back. There could be legitimate reasons for this, though, so you might want to only disable the software after this has happened x number of times, or similar.
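The rollback check described above reduces to a single comparison; a minimal sketch, where the stored last-run date would normally come from the registry:

```csharp
using System;

// The clock has probably been set back if "today" is earlier than the
// last date we recorded a run on.
static bool ClockRolledBack(DateTime lastRun, DateTime now) => now.Date < lastRun.Date;

var lastRun = new DateTime(2024, 6, 10);  // would normally be read from the registry

Console.WriteLine(ClockRolledBack(lastRun, new DateTime(2024, 6, 1)));   // prints True
Console.WriteLine(ClockRolledBack(lastRun, new DateTime(2024, 6, 11)));  // prints False
```

As the answer notes, a single trip of this check isn't proof of tampering (time zone changes, a corrected clock), so count occurrences before acting on it.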
You can use a counter, so the user can run the software n times.
Another way is to store the start and end dates when the user starts the application; the next time the application starts, check against them.
Yet another way is to always check with an online server.
