What is a "tick"? - c#

I know it's a unit of time equal to 100 nanoseconds and, as far as I know, it's the smallest unit of time you can get in C#.
However, what exactly are "ticks"? Is 1 tick = 1 CPU tick? I'd guess not, but then how does the system know when to add/increment a tick?
Bonus: if no one can look beyond "a tick is 100 nanoseconds, deal with it", then how does the system "get;" a tick?
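For reference, a minimal sketch of where tick values surface in the framework: DateTime/TimeSpan ticks are fixed 100-nanosecond units, while Stopwatch counts raw hardware-counter ticks and exposes its own Frequency for conversion. The names and printed values below are purely illustrative.
using System;
using System.Diagnostics;

class TickDemo
{
    static void Main()
    {
        // DateTime/TimeSpan ticks are fixed 100 ns units.
        Console.WriteLine(TimeSpan.TicksPerSecond);        // 10,000,000
        Console.WriteLine(TimeSpan.TicksPerMillisecond);   // 10,000
        Console.WriteLine(DateTime.UtcNow.Ticks);          // ticks since 0001-01-01 00:00

        // Stopwatch "ticks" are a different unit: raw counts of the
        // high-resolution hardware counter, converted via Frequency.
        long start = Stopwatch.GetTimestamp();
        long freq = Stopwatch.Frequency;                   // counts per second
        Console.WriteLine($"Counter resolution: {1e9 / freq:F1} ns per count");
        Console.WriteLine(Stopwatch.GetTimestamp() - start);
    }
}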

Related

2100 hours + 1 minute as a Timer.Interval

In here says "" The time, in milliseconds, between Elapsed events. The value must be greater than zero, and less than or equal to Int32.MaxValue" [2,147,483,647]
However, I need 2100 hours plus 1 minute as a Timer.Interval. [7,560,000,000]
How to solve this, There's another way?
Timers shouldn't live anywhere near that long. Fire a short timer periodically, and check the system clock to see if it's time to perform your long-running event or not.
Better yet, use Quartz.net, which is already designed for this.
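A minimal sketch of the poll-the-clock approach under some assumptions (the target of 2100 hours and 1 minute from startup, and a placeholder DoLongRunningEvent method): a short System.Timers.Timer fires once a minute and compares the system clock against the target.
using System;
using System.Timers;

class LongDelayScheduler
{
    // Hypothetical target: 2100 hours + 1 minute after startup.
    static readonly DateTime target = DateTime.UtcNow.AddHours(2100).AddMinutes(1);
    static readonly Timer poller = new Timer(60000); // short interval: check once per minute

    static void Main()
    {
        poller.Elapsed += (sender, e) =>
        {
            if (DateTime.UtcNow >= target)
            {
                poller.Stop();
                DoLongRunningEvent(); // placeholder for the real work
            }
        };
        poller.AutoReset = true;
        poller.Start();
        Console.ReadLine(); // keep the demo process alive
    }

    static void DoLongRunningEvent() => Console.WriteLine("2100 hours and 1 minute have elapsed.");
}
Quartz.NET remains the more robust option for schedules this long, since it survives restarts and clock changes.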

Putting a thread to sleep for decimal value

This question is about System.Threading.Thread.Sleep(int). I know there is no method for a decimal value, but I really need to work with decimals.
I have a device which takes 20.37 milliseconds to turn by 1 degree. So I need to put the code to sleep for an appropriate multiple of 20.37 (2 degrees = 20.37 * 2, etc.). Since the Thread class has no Sleep overload that takes a decimal, how can I do this?
It does not work that way. Sleep guarantees that the thread stays idle for at least x time, but not that it won't stay idle for longer. The end of the sleep period only means that the thread becomes available to the scheduler again; the scheduler may choose to run other threads/processes at that moment.
Get the initial instant, get the final instant, and calculate the current turn from the time that has actually passed. Also, do not forget to check how precise the time functions are.
Real-time programming has enough particularities of its own that I'd advise you to read up on the topic before trying to get something working. It can be pretty extensive (multiprocessing vs. single-processor OS, priorities, etc.).
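A minimal sketch of that measure-instead-of-sleep idea, assuming the 20.37 ms-per-degree figure from the question and a made-up 5-degree target: the current angle is derived from elapsed time rather than from how long the thread actually slept.
using System;
using System.Diagnostics;
using System.Threading;

class TurnTimer
{
    const double MillisecondsPerDegree = 20.37; // figure from the question

    static void Main()
    {
        int targetDegrees = 5;                                   // hypothetical target
        double targetMs = targetDegrees * MillisecondsPerDegree; // 101.85 ms

        var sw = Stopwatch.StartNew();
        while (sw.Elapsed.TotalMilliseconds < targetMs)
        {
            Thread.Sleep(1); // coarse sleep; accuracy comes from measuring, not from sleeping
        }
        sw.Stop();

        // Derive the turn actually completed from the time that really passed.
        double degreesTurned = sw.Elapsed.TotalMilliseconds / MillisecondsPerDegree;
        Console.WriteLine($"Elapsed {sw.Elapsed.TotalMilliseconds:F2} ms, about {degreesTurned:F2} degrees");
    }
}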
Right, as pointed out in the comments, Thread.Sleep isn't 100% accurate. However, you can (in theory) get it to wait for 20.37 milliseconds by converting the milliseconds to ticks, constructing a TimeSpan from them, and calling the overload that takes a TimeSpan, as follows:
Thread.Sleep(new TimeSpan(203700));
// 203700 = 20.37 * TimeSpan.TicksPerMillisecond (which is 10,000)
Again, this is probably not going to be 100% accurate (Thread.Sleep only guarantees AT LEAST that amount of time). But if that's accurate enough, it'll be fine.
You can simply divide the integer - I just figured that out.
I needed the thread to sleep for less than a millisecond, so I just divided the time by an integer. You can either define a constant or just type:
System.Threading.Thread.Sleep(time / 100);
Or whatever number you want.
Alternatively, as mentioned, you can do it like this:
int thisIsTheNumberYouDivideTheTimeBy = 100;
Thread.Sleep(time / thisIsTheNumberYouDivideTheTimeBy);
It's actually quite simple. Hope that helped.
By the way, instead of
System.Threading.Thread.Sleep(x);
you can just type
Thread.Sleep(x);
provided you have written
using System.Threading;
at the top of the file.
I had the same problem. As a workaround, I take the float value but convert it to an int when passing it in. The conversion rounds it off for me and the thread sleeps for that long. As I said, it's a workaround; I'm not claiming it's accurate.
You can use a little bit of math as a workaround.
Let's assume that you don't want to be extremely precise,
but still need the overall average sleep to hit a fractional value.
Thread.Sleep(new Random().Next(20, 22)); // upper bound is exclusive, so this sleeps 20 or 21 ms
This should give you an average sleep of ~20.5 ms. Use your imagination from there.
TotalSleep / tries will approach the desired value, but no single sleep interval will equal it.
Don't call new Random() inline like this - create one instance beforehand and reuse it.
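Building on that averaging idea, here is a minimal sketch (the SleepAverage name and the 20.37 ms figure are illustrative): to average a fractional duration, sleep the next-higher whole millisecond with probability equal to the fractional part, using a single shared Random instance as suggested above. Individual sleeps remain whole milliseconds, and Sleep itself is imprecise; only the long-run average approaches the target.
using System;
using System.Threading;

class FractionalAverageSleep
{
    static readonly Random rng = new Random(); // single shared instance, as noted above

    // Sleeps a whole number of milliseconds chosen so that the *average*
    // over many calls approaches the requested fractional duration.
    static void SleepAverage(double milliseconds)
    {
        int floor = (int)Math.Floor(milliseconds);
        double fraction = milliseconds - floor;                  // e.g. 0.37 for 20.37
        int toSleep = rng.NextDouble() < fraction ? floor + 1 : floor;
        Thread.Sleep(toSleep);
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            SleepAverage(20.37); // long-run average is roughly 20.37 ms
    }
}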

Use StopWatch to measure FPS with strange issue

I have a code snippet like this:
while (true)
{
    myStopWatch.Start();
    DoMyJob();
    myStopWatch.Stop();
    FPS = 1000.0 / myStopWatch.Elapsed.TotalMilliseconds;
    myStopWatch.Reset();
}
which works pretty well; I get an FPS of around 100 (+/- 2). But sometimes I want to focus on the performance of just one part of DoMyJob() and add some feedback, so I split DoMyJob() into DoMyJob_1() and DoMyJob_2(). The first part is mainly calculation; the second part visualizes the calculation on the Form and updates some indicators.
So the code becomes:
while (true)
{
    myStopWatch.Start();
    DoMyJob_1();
    myStopWatch.Stop();
    FPS = 1000.0 / myStopWatch.Elapsed.TotalMilliseconds;
    myStopWatch.Reset();
    DoMyJob_2();
}
I did not expect anything to mess up the FPS, since DoMyJob_1 is almost the same as the original DoMyJob. But oops... it did. The FPS went frenzied, bouncing between 40 and 600 in a seemingly random manner. When I removed DoMyJob_2(), the FPS went back to a steady 100.
When I examined the FPS sequence more closely, I found the values are not random at all - they fall into 4 or 5 distinct ranges; in my case roughly 30-50, 100-120, 300-360, 560-600, etc. Not a single number falls into the gaps between them. Then I tried the code on another laptop and the issue still exists, just with different ranges. I know Stopwatch uses the Win32 API. Is it buggy because I run the code on a 64-bit system?
BTW: what is the best way to measure FPS in a .NET Windows Forms app (say, when FPS is 100 or more)?
If DoMyJob_2 takes a variable amount of time, then you have a slice of time from every second that is not being taken into account. You could use your method to calculate an average time to execute DoMyJob_1, but not to determine frames per second. For example:
loop 1:
task 1: 5ms
reported fps: 1000/5ms = 200
task 2: 15ms
real fps: 1000/20ms = 50
loop 2:
task 1: 5ms
reported fps: 1000/5ms = 200
task 2: 25ms
real fps: 1000/30ms = 33
...
So I'm not sure that's what you are seeing, but it seems possible. What you are describing (fluctuating reported fps) might actually make more sense if the total length of the job tends to be stable, but the way you split the job makes each part variable.
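If the goal is to keep both numbers, a rough sketch along those lines (the DoMyJob_1/DoMyJob_2 bodies below are placeholder stand-ins): time the whole iteration for the real FPS, and time the calculation step separately.
using System;
using System.Diagnostics;
using System.Threading;

class FpsMeasurement
{
    // Stand-ins for the methods from the question.
    static void DoMyJob_1() => Thread.Sleep(8);
    static void DoMyJob_2() => Thread.Sleep(5);

    static void Main()
    {
        var loopWatch = new Stopwatch();
        var partWatch = new Stopwatch();

        while (true)
        {
            loopWatch.Restart();

            partWatch.Restart();
            DoMyJob_1();
            partWatch.Stop();

            DoMyJob_2();

            loopWatch.Stop();

            double part1Ms = partWatch.Elapsed.TotalMilliseconds;           // cost of the calculation step
            double realFps = 1000.0 / loopWatch.Elapsed.TotalMilliseconds;  // frames actually produced per second
            Console.WriteLine($"Part 1: {part1Ms:F2} ms, real FPS: {realFps:F1}");
        }
    }
}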

.NET Timers, do they fire at the exact interval or after processing + interval

So a simple enough question really.
How exactly does the interval for System.Timers.Timer work?
Does it fire every second, regardless of how long the Elapsed event handler takes, or does it wait for the handler to finish and then restart the interval?
So either:
1 sec....1 sec....1 sec and so on
1 sec + process time....1 sec + process time....1 sec + process time and so on
The reason I ask is that I know my "processing" takes much less than 1 second, but I would like to fire it every second on the dot (or as close as possible).
I had been using a Thread.Sleep method like so:
Thread.Sleep(1000 - ((int)(DateTime.Now.Subtract(start).TotalMilliseconds) >= 1000 ? 0 : (int)(DateTime.Now.Subtract(start).TotalMilliseconds)));
Where the start time is registered at the start of the routine. The problem here is that Thread.Sleep only works in whole milliseconds. So my routine could restart after 1000 ms, or a fraction over, like 1000.0234 ms - which can happen, since one of my routines takes 0 ms according to TimeSpan but has obviously used some ticks/nanoseconds - and then the timing is off and it is no longer every second. If I could sleep by ticks or nanoseconds it would be bang on.
If number 1 applies to System.Timers then I guess I'm sorted. If not, I need some way to "sleep" the thread at a higher time resolution, i.e. ticks/nanoseconds.
You might ask why I use an inline IF statement: sometimes the processing can go above 1000 ms, so we need to make sure we don't produce a negative figure. Also, by the time we determine this, the end time has changed slightly - not by much, but it could make the thread sleep slightly longer, throwing the timing of all subsequent sleeps off.
I know, I know, the difference would be negligible... but what happens if the system suddenly stalls for a few ms? This would protect against that case.
Update 1
Ok. So I didn't realise you can put a TimeSpan in as the timing value. So I used the below code:
Thread.Sleep(TimeSpan.FromMilliseconds(1000) - ((DateTime.Now.Subtract(start).TotalMilliseconds >= 1000) ? TimeSpan.FromMilliseconds(0) : DateTime.Now.Subtract(start)));
If I am right, this should allow me to repeat the routine at exactly one-second intervals - or as close as the system will allow.
If you have set AutoReset = true, then your theory 1 is correct; otherwise you will have to deal with it in code - see the documentation for Timer on MSDN.
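A minimal sketch of that setup, assuming the one-second requirement from the question: with AutoReset = true, System.Timers.Timer keeps firing on the interval, and the Elapsed handler runs on a thread-pool thread, so a handler that finishes well inside the interval does not delay the next tick.
using System;
using System.Timers;

class OneSecondTimer
{
    static void Main()
    {
        var timer = new Timer(1000);   // fire every second
        timer.AutoReset = true;        // keep firing on the interval
        timer.Elapsed += (sender, e) =>
        {
            // Handler should finish well within the interval, per the question.
            Console.WriteLine($"Fired at {e.SignalTime:HH:mm:ss.fff}");
        };
        timer.Start();
        Console.ReadLine(); // keep the demo process alive
    }
}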

How would I go about implementing a stopwatch with different speeds?

Ideally I would like to have something similar to the Stopwatch class but with an extra property called Speed which would determine how quickly the timer changes minutes. I am not quite sure how I would go about implementing this.
Edit
Since people don't quite seem to understand why I want to do this: consider a soccer game, or any sports game. The halves are measured in minutes, but the time-frame in which the game is played is significantly shorter, i.e. a 45-minute half is played in about 2.5 minutes.
Subclass it, call through to the superclass methods to do their usual work, but multiply all the return values by Speed as appropriate.
I would use the Stopwatch as it is, then just multiply the result, for example:
var Speed = 1.2; //Time progresses 20% faster in this example
var s = new Stopwatch();
s.Start();
//do things
s.Stop();
var parallelUniverseMilliseconds = s.ElapsedMilliseconds * Speed;
The reason your simple "multiplication" doesn't work is that it doesn't speed up the passing of time - the factor applies to all the time that has already passed, as well as to the time that is passing.
So, if you set your speed factor to 3 and then wait 10 minutes, your clock will correctly read 30 minutes. But if you then change the factor to 2, your clock will immediately read 20 minutes because the multiplication is applied to time already passed. That's obviously not correct.
I don't think Stopwatch is the class you want to measure "system time" with. I think you want to measure it yourself and store the elapsed time in your own variable.
Assuming that your target project really is a game, you will likely have your "game loop" somewhere in code. Each time through the loop, you can use a regular stopwatch object to measure how much real-time has elapsed. Multiply that value by your speed-up factor and add it to a separate game-time counter. That way, if you reduce your speed factor, you only reduce the factor applied to passing time, not to the time you've already recorded.
You can wrap all this behaviour into your own stopwatch class if need be. If you do that, I'd suggest that you calculate/accumulate the elapsed time both every time it is requested and every time the factor is changed. So you have a class something like this (note that I've skipped the field declarations and some simple private methods for brevity - this is just a rough idea):
public class SpeedyStopwatch
{
    // This is the time that your game/system will run from
    public TimeSpan ElapsedTime
    {
        get
        {
            CalculateElapsedTime();
            return this._elapsedTime;
        }
    }

    // This can be set to any value to control the passage of time
    public double TimeFactor
    {
        get { return this._timeFactor; }
        set
        {
            CalculateElapsedTime();
            this._timeFactor = value;
        }
    }

    private void CalculateElapsedTime()
    {
        // Find out how long (real-time) since we last called the method
        TimeSpan lastTimeInterval = GetElapsedTimeSinceLastCalculation();

        // Multiply this time by our factor
        lastTimeInterval = TimeSpan.FromTicks((long)(lastTimeInterval.Ticks * this._timeFactor));

        // Add the multiplied time to our elapsed time
        this._elapsedTime += lastTimeInterval;
    }
}
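Hypothetical usage of such a class (assuming the TimeFactor property above and whatever constructor you give it) might look like:
var matchClock = new SpeedyStopwatch();
matchClock.TimeFactor = 18.0;               // 45 simulated minutes in 2.5 real minutes
// ... game loop runs for a while ...
TimeSpan half = matchClock.ElapsedTime;     // scaled time shown to the player
matchClock.TimeFactor = 2.0;                // affects only time passing from now on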
According to modern physics, what you need to do to make your timer go "faster" is to speed up the computer that your software is running on. I don't mean the speed at which it performs calculations, but its physical speed. The closer you get to the speed of light (the constant c), the greater the rate at which time passes for your computer, so as you approach the speed of light, time will "speed up" for you.
It sounds like what you might actually be looking for is an event scheduler, where you specify that certain events must happen at specific points in simulated time and you want to be able to change the relationship between real time and simulated time (perhaps dynamically). You can run into boundary cases when you start to change the speed of time in the process of running your simulation and you may also have to deal with cases where real time takes longer to return than normal (your thread didn't get a time slice as soon as you wanted, so you might not actually be able to achieve the simulated time you're targeting.)
For instance, suppose you wanted to update your simulation at least once per 50ms of simulated time. You can implement the simulation scheduler as a queue where you push events and use a scaled output from a normal Stopwatch class to drive the scheduler. The process looks something like this:
Push (simulate at t=0) event to event queue
Start stopwatch
lastTime = 0
simTime = 0
While running
    simTime += scale*(stopwatch.Time - lastTime)
    lastTime = stopwatch.Time
    While events in queue that have passed their time
        pop and execute event
        push (simulate at t=lastEventT + dt) event to event queue
This can be generalized to different types of events occurring at different intervals. You still need to deal with the boundary case where the event queue is ballooning because the simulation can't keep up with real time.
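A rough C# sketch of that loop under simplifying assumptions (a single recurring "simulate" event every 50 ms of simulated time, and a placeholder Simulate method); the scale factor could be changed between iterations:
using System;
using System.Diagnostics;
using System.Threading;

class ScaledScheduler
{
    static void Main()
    {
        double scale = 2.0;                       // simulated seconds per real second
        var dt = TimeSpan.FromMilliseconds(50);   // simulate at least once per 50 ms of simulated time

        var clock = Stopwatch.StartNew();
        TimeSpan lastReal = TimeSpan.Zero;
        TimeSpan simTime = TimeSpan.Zero;
        TimeSpan nextEvent = TimeSpan.Zero;       // next scheduled simulation step

        while (true)
        {
            // Advance simulated time by the scaled real time since the last pass.
            TimeSpan nowReal = clock.Elapsed;
            simTime += TimeSpan.FromTicks((long)((nowReal - lastReal).Ticks * scale));
            lastReal = nowReal;

            // Run every event whose simulated time has passed.
            while (nextEvent <= simTime)
            {
                Simulate(nextEvent);              // placeholder simulation step
                nextEvent += dt;                  // schedule the next step
            }

            Thread.Sleep(1);                      // yield; a real loop would render or process input here
        }
    }

    static void Simulate(TimeSpan t) => Console.WriteLine($"Simulating t = {t.TotalMilliseconds} ms");
}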
I'm not entirely sure what you're looking to do (doesn't a minute always have 60 seconds?), but I'd utilize Thread.Sleep() to accomplish what you want.
