I have a timer. When it ticks, the positions of 12 panels are recalculated from formulas and updated.
The problem is that, although the timer's interval is 1 millisecond, the movement is very slow. There are many calculations. What can be done to improve the speed: a drawing class, or something else?
The GUI shows the positions, and I can move the panels by clicking to set their values. If a drawing class is the correct way, can I still move the rectangles by clicking and read their values?
although the timer's interval is 1 millisecond
That's the core problem: a Timer cannot tick that fast. The actual timer resolution is constrained by the operating system's clock interrupt rate, which ticks 64 times per second on most Windows machines, or once every 15.625 milliseconds. The smallest interval you can hope to get is therefore 16 msec, so these panels probably now move 16 times slower than you hoped they would.
Keep in mind how this is observed: you only need to keep human eyes happy. They can't perceive anything that changes every 1 msec; anything that updates faster than about 25 times per second just looks like a blur. That's something TV and cinema take advantage of: a movie updates at 24 frames per second, once every ~42 milliseconds.
So a sane setting for Timer.Interval is a hair below three times the clock interrupt period, 46 milliseconds. The actual tick interval will be 3 x 15.625 = 46.875 msec on a regular machine, and still close to 46 msec if the machine runs with a higher clock interrupt rate. You'll get an equivalent frame rate of about 21 fps, right on the edge of a blur to human eyes. The next lower sane setting is two times the interrupt period, or 31 msec, for 32 fps. Making it any smaller doesn't make sense; it isn't observable and just burns CPU time for no benefit.
And, importantly, the rate at which a panel moves is now determined by how much you change its Location property in the Tick event handler. The interval is fixed, so the amount of motion you get is determined by the increment in the position, which will need to be more than the one pixel you are probably using now.
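A minimal sketch of that approach with a Windows Forms Timer (the timer, panel list, and step size below are illustrative names, not taken from your code):

private readonly Timer moveTimer = new Timer();                 // System.Windows.Forms.Timer
private readonly List<Panel> movingPanels = new List<Panel>();

private void Form1_Load(object sender, EventArgs e)
{
    moveTimer.Interval = 31;             // two clock-interrupt periods, ~32 fps
    moveTimer.Tick += MoveTimer_Tick;
    moveTimer.Start();
}

private void MoveTimer_Tick(object sender, EventArgs e)
{
    foreach (Panel panel in movingPanels)
    {
        // Move several pixels per tick; tune the step to get the speed you
        // wanted from the 1 ms timer instead of moving one pixel at a time.
        panel.Location = new Point(panel.Location.X + 5, panel.Location.Y);
    }
}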
I wrote a simple OpenTK program based on the OpenTK official tutorial (here) that basically draws a cube on the screen, applies a mixed texture, and rotates it based on elapsed time.
What I found is that when I set GameWindowSettings RenderFrequency and UpdateFrequency, weird stuff happens (all time/frequencies are measured with System.Diagnostics.Stopwatch):
If I don't set any frequency, the program runs at max speed and I see a smooth textured cube rotating in the window. The measured screen FPS is about 1000 and updates run about 1000 times per second. The cube rotates smoothly and the CPU/GPU run at 100%.
If I set RenderFrequency to, say, 60.0 Hz and UpdateFrequency to, say, 1.0 Hz, the magic happens: the actual UpdateFrequency is about 1 Hz, but the actual RenderFrequency is about half of what I set, around 30 Hz. The most stunning thing is that the cube still rotates, but instead of rotating smoothly it stays still for around 1 second and then suddenly moves, and the whole rotation amount happens in a short time, but in a smooth way (I don't see any twitchy movement of the image; it simply rotates very fast and smoothly). CPU and GPU run at about 10-15%.
If I leave RenderFrequency at its default value and set only UpdateFrequency, then the actual update rate matches the setting, the RenderFrequency goes to 1500-2000 Hz, and the cube rotates smoothly. The CPU is at about 50% and the GPU goes up to 90-100%.
If I leave UpdateFrequency at its default and set RenderFrequency to 60.0 Hz, then the actual UpdateFrequency goes to more than 100 kHz, the RenderFrequency matches 60 Hz, and the cube rotates as in case 2. The CPU runs at about 30% (why not the max?) and the GPU at around 10%.
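For reference, this is roughly how the frequencies in the scenarios above get set (a sketch assuming OpenTK 4.x, where GameWindowSettings exposes both properties; CubeWindow stands in for the tutorial's GameWindow subclass):

using OpenTK.Windowing.Desktop;

var gameSettings = new GameWindowSettings
{
    RenderFrequency = 60.0,   // target render rate in Hz (the default 0 means uncapped)
    UpdateFrequency = 1.0     // target update rate in Hz (the default 0 means uncapped)
};
var nativeSettings = NativeWindowSettings.Default;
nativeSettings.Title = "Rotating cube";

using (var window = new CubeWindow(gameSettings, nativeSettings))
{
    window.Run();
}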
OpenTK behavior, CPU/GPU usage, and the rotating cube I see don't make any sense to me.
I run the software on a Lifebook U Series Laptop with Win10.
Any clue?
I'm losing a whole night on this; many thanks in advance.
I am making the character run, but the animation is extremely quick because all I am doing each frame is:
_frameIndex++;
_frameIndex is the value which points to the image in the sprite sheet. Does anyone know how I can use gameTime.ElapsedGameTime.TotalMilliseconds to slow the animation down?
I saw that you've asked a couple of questions tonight concerning animation and spritesheets, so here's an example from Aaron Reed's "Learning XNA 4.0," from Chapter 3 under the "Adjusting the Animation Speed" heading.
First, create two class-level variables to track time between animation frames:
int timeSinceLastFrame = 0;
int millisecondsPerFrame = 50;
The first variable tracks time passed since the animation frame was changed, and the second is an arbitrary amount of time that you specify to wait before moving the frame index again. So making millisecondsPerFrame smaller will increase the animation speed, and making it larger will decrease the animation speed.
Now, in your Update method, you can take advantage of gameTime.ElapsedGameTime to check the time passed since the last frame change, and advance the frame when that value exceeds millisecondsPerFrame:
timeSinceLastFrame += gameTime.ElapsedGameTime.Milliseconds;
if (timeSinceLastFrame > millisecondsPerFrame)
{
    timeSinceLastFrame -= millisecondsPerFrame;
    // Increment the current frame here (see the link for the implementation)
}
This sort of solution is similar to what you've found works, except that you can take an extra step to specify exactly how often you want the animation to update, or even change that duration later on in your code if you like. For example, if you have some condition that would "speed up" the sprite (like a power-up) or likewise slow it down, you could do so by changing millisecondsPerFrame.
I removed the code that actually updates the current frame, since you should already have something that does that, given that you have a working animation. If you'd like to see the example in full, you can download it from the textbook's website.
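In case it helps, the increment that was removed typically looks something like this (a sketch, not the book's exact code; totalFrames stands in for however many frames your sheet has):

timeSinceLastFrame += gameTime.ElapsedGameTime.Milliseconds;
if (timeSinceLastFrame > millisecondsPerFrame)
{
    timeSinceLastFrame -= millisecondsPerFrame;
    // Advance the frame and wrap back to the first one after the last.
    _frameIndex = (_frameIndex + 1) % totalFrames;
}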
I debugged the code and noticed that gameTime.ElapsedGameTime.TotalMilliseconds always came out to 33. So I did the following:
milliSeconds += gameTime.ElapsedGameTime.Milliseconds;
if (milliSeconds > 99)
{
    _frameIndex++;
    milliSeconds = 0;
}
Which basically means that roughly every THIRD frame of the game, _frameIndex goes up, and the milliseconds are reset to start over.
I am trying to run an evolutionary algorithm using XNA. I would like to run only the logical side of the game and add the animation after a long calculation time.
Does anybody know how to accelerate the calculation time and disable the Draw() call?
Thanks
Just don't draw in your Draw method. You can keep a counter or a time stamp as reference, and only draw once out of 100 times or once per second.
However, your problem is not the number of times Draw is called, but the number of times Update is called. By default, XNA will never call Update more than 30/60 times per second. You can change the frame rate XNA tries to achieve as explained in this post. To call Update 100 times per second, just change the target elapsed time in your game to:
this.TargetElapsedTime = TimeSpan.FromSeconds(1.0f / 100.0f);
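Putting both ideas together, a rough sketch might look like this (the class and counter names are illustrative):

using System;
using Microsoft.Xna.Framework;

public class EvolutionGame : Game
{
    private readonly GraphicsDeviceManager graphics;
    private int updatesSinceLastDraw;

    public EvolutionGame()
    {
        graphics = new GraphicsDeviceManager(this);
        // Ask XNA to call Update 100 times per second instead of the default.
        TargetElapsedTime = TimeSpan.FromSeconds(1.0f / 100.0f);
    }

    protected override void Update(GameTime gameTime)
    {
        // run one step of the evolutionary algorithm here
        updatesSinceLastDraw++;
        base.Update(gameTime);
    }

    protected override void Draw(GameTime gameTime)
    {
        // Skip rendering until 100 updates have gone by, then draw one frame.
        if (updatesSinceLastDraw < 100)
            return;
        updatesSinceLastDraw = 0;

        GraphicsDevice.Clear(Color.CornflowerBlue);
        base.Draw(gameTime);
    }
}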
I've made a game in GDI and I want to make an FPS counter. To do this I use a System.Windows.Forms.Timer with an interval of 1000 ms. In Form1_Paint() I increment the fps variable, draw text showing the FPS, and call this.Invalidate() at the end. In Timer_Tick() I set fps = 0. In Form1_Load() I enable the timer and start it. But the timer doesn't start, and the fps variable never goes back to 0. Why doesn't the timer start?
I think the problem comes from this.Invalidate(); I think it doesn't let the timer call Timer_Tick(). How can I make the timer call it, if this is the problem?
System.Windows.Forms.Timer is a synchronous timer. It runs on the same thread as the GUI, which means that if the GUI is busy with some heavy logic/calculations, the timer won't run.
You're most likely looking for an asynchronous timer, which runs on its own thread, like System.Timers.Timer or System.Threading.Timer. But watch out for cross-thread calls.
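For example, a rough sketch with System.Timers.Timer (the fps field and Form1_Load are from your question; the timer field name is illustrative):

private readonly System.Timers.Timer fpsTimer = new System.Timers.Timer(1000);

private void Form1_Load(object sender, EventArgs e)
{
    fpsTimer.Elapsed += (s, args) =>
    {
        // Elapsed fires on a thread-pool thread, so marshal back to the UI
        // thread before touching the form or anything the Paint handler uses.
        this.BeginInvoke(new Action(() => fps = 0));
    };
    fpsTimer.AutoReset = true;   // fire every second, not just once
    fpsTimer.Start();
}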
Use a System.Diagnostics.Stopwatch to measure the time between paints.
Then calculate the frame rate with ((double)1 / Stopwatch.ElapsedMilliseconds) * 1000.
From ElapsedMilliseconds we get "milliseconds per frame"; inverting the number gives us "frames per millisecond", and multiplying by 1000 gives us the sought-after frames per second.
private readonly Stopwatch watch = new Stopwatch();   // System.Diagnostics.Stopwatch, started once before the first paint

private void Form1_Paint(object sender, PaintEventArgs e)
{
    long msec = watch.ElapsedMilliseconds;   // milliseconds since the previous paint
    watch.Reset();
    watch.Start();
    label1.Text = ((1d / msec) * 1000).ToString("F") + " FPS";
}
First you'll want to make sure that the timer is actually working. I suspect that it is, but you'll have to write some debugging code to find out for sure. One way is to put a label on your form and have the timer update the label with a tick count (i.e. increments the count each time the timer fires). If the timer is working, then that label should update once per second (approximately).
For your FPS counter, I would suggest that rather than having the timer update the UI, it should just compute the FPS and let the normal game update logic display the FPS when you want it. That way you're not updating the display in multiple places, and you're not wasting time updating the screen just to show the FPS.
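A rough sketch of that separation (the field names are illustrative):

private int framesThisSecond;   // incremented by the Paint handler
private int currentFps;         // read by whatever draws the on-screen counter

private void Form1_Paint(object sender, PaintEventArgs e)
{
    framesThisSecond++;
    // ... draw the game here, showing currentFps wherever you like ...
    this.Invalidate();
}

private void fpsTimer_Tick(object sender, EventArgs e)
{
    // The 1000 ms timer only converts the count into an FPS figure;
    // it never touches the screen itself.
    currentFps = framesThisSecond;
    framesThisSecond = 0;
}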
I'm trying to get a general smooth scrolling mechanism that I can implement in my mobile applications.
I want it to be generic enough so that it can port to any platform, but I am currently working in C# on the .net Compact Framework.
What I'm doing right now is:
Create a Stopwatch object (in the panel's ctor)
on mouse down start the Stopwatch and save the current mouse point _lastMouse
on mouse move, stop the Stopwatch, and store velocity = (_lastMouse - curMouse) / Stopwatch.TotalSeconds then reset the Stopwatch and start it again
In most cases Stopwatch.TotalSeconds is between 0.02 and 0.03
on mouse up, I pass the velocity value into a smooth scrolling function, and that function continues to scroll the panel until either the end is hit or the increasing friction causes the velocity to be == 0
My problem is in the final step. The velocity values are generally in the 2,000-3,000 pixel range. The velocity is in pixels per second, so this is to be expected. I take the Stopwatch (which should still be running), stop it, find the elapsed time since the last mouse move, multiply the velocity by Stopwatch.TotalSeconds to get the distance, then reset and start the Stopwatch, loop back, and start all over again.
The expected result is that the elapsed time between refreshes multiplied by the velocity should give me the number of pixels (according to the last mouse move) that I should scroll. The actual result is that sometimes the panel goes flying and sometimes it barely moves! The gradual slowdown is fine; it's just the beginning velocity that is off.
Is there a flaw in the logic? Should I be doing something else?
Thanks for any help!
It seems to me that there are three possible sources of inaccuracy here. Firstly, as "A.R." said, you'll have problems if the granularity of your timer isn't good enough. Have you checked IsHighResolution and Frequency to make sure it's OK?
Secondly, even if your timer is perfect, there may be some inaccuracy in your position measurements, and if you're taking two in very quick succession then it may hurt you. My guess is that this isn't a big deal, but e.g. if you're on a capacitive touchscreen then as the finger lifts off you may get variation in position as the contact area goes down.
Thirdly, the physical motion of the finger (or stylus or mouse or whatever you've got doing the actual input; I'm going to guess a finger) may not be all that well behaved. At the end of a gesture, it may change from being mostly horizontal to being mostly vertical.
All these problems would be substantially mitigated by using a longer sampling period and maybe (try it both ways) ignoring the very last sample or two. So, keep a little circular buffer of recent samples, and when you get the mouse-up look back (say) 100ms or thereabouts and use that to decide your velocity. Or, if you don't want to keep that much history, use a simple IIR filter: every time you get a sample, do something like
filtered_dt = filtered_dt + SMALL*(latest_dt-filtered_dt);
filtered_dx = filtered_dx + SMALL*(latest_dx-filtered_dx);
filtered_dy = filtered_dy + SMALL*(latest_dy-filtered_dy);
where SMALL should be, at a guess, somewhere around 0.2 or so; then use filtered_dx/filtered_dt and filtered_dy/filtered_dt as your velocity estimate. (I think this is better than calculating a velocity every time and filtering that, because e.g. the latter will still blow up if you ever get a spuriously small dt. If in doubt, try both ways.)
If you use the IIR approach, you may still want to make it ignore the very last sample if that turns out to be unreliable; if you remember the latest dt,dx and dy you can do that by undoing the last update: use (filtered_dt-SMALL*latest_dt)/(1-SMALL), etc.
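A concrete C# sketch of the IIR version (the names are illustrative; SMALL is the coefficient discussed above):

private const double SMALL = 0.2;
private double filteredDt, filteredDx, filteredDy;
private Point lastMouse;
private readonly Stopwatch velocityWatch = new Stopwatch();

private void panel_MouseDown(object sender, MouseEventArgs e)
{
    lastMouse = new Point(e.X, e.Y);
    filteredDt = filteredDx = filteredDy = 0;
    velocityWatch.Reset();
    velocityWatch.Start();
}

private void panel_MouseMove(object sender, MouseEventArgs e)
{
    double dt = velocityWatch.Elapsed.TotalSeconds;
    velocityWatch.Reset();
    velocityWatch.Start();

    double dx = e.X - lastMouse.X;
    double dy = e.Y - lastMouse.Y;
    lastMouse = new Point(e.X, e.Y);

    // Filter the raw deltas rather than an instantaneous velocity, so one
    // spuriously small dt can't blow the estimate up.
    filteredDt += SMALL * (dt - filteredDt);
    filteredDx += SMALL * (dx - filteredDx);
    filteredDy += SMALL * (dy - filteredDy);
}

private void panel_MouseUp(object sender, MouseEventArgs e)
{
    if (filteredDt <= 0) return;
    double vx = filteredDx / filteredDt;   // pixels per second
    double vy = filteredDy / filteredDt;
    // hand vx and vy to the smooth-scrolling routine from the question
}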
Here's an off-the-wall suggestion that may or may not work. You mentioned that you get more erratic results when there's a "flick" at the end of the gesture. Perhaps you can use that to your advantage: look at, say, how rapidly the estimated velocity is changing right at the end of the gesture, and if it's changing very rapidly then increase the velocity you use somewhat.