Making smooth animations in .NET/GDI+ (C#)

I'm having problems making smooth animations in GDI+. The problem as I understand it from Googling is that there is some kind of bug with the clock-ticks in .NET on a multi-core processor. Here's a simplified version of what I'm doing:
class Animation
{
    System.Diagnostics.Stopwatch sw = new Stopwatch();
    float AnimationTime = 1000; // time it takes the animation to complete

    public bool IsComplete
    { get { return sw.ElapsedMilliseconds > AnimationTime; } }

    public void StartAnimation()
    {
        sw.Reset();
        sw.Start();
    }

    public void DoFrame()
    {
        float PercentComplete = (float)sw.ElapsedMilliseconds / AnimationTime;
        // draw the animation based on PercentComplete
    }
}
DoFrame() is called like this:
Animation.StartAnimation();
do
{
    Animation.DoFrame();
    Application.DoEvents();
} while (!Animation.IsComplete);
The problem is that the animation is very smooth for about 15 frames, then it jerks: it actually goes backward (sw.ElapsedMilliseconds returns a smaller value than the previous query). It's very annoying and it ruins my otherwise smooth animation, which looks great even on a Core 2 Duo (despite Microsoft saying this is a multi-core bug). I have an i7, and the animation is smooth except for 2-3 frames per second that look "jerky".
I understand that this is a known issue and Microsoft blames it on the processor, so my question is: does anyone know of a solution for this? I tried using a Kalman filter and it sort of works, but I was hoping there is an established "correct" solution.
Oh, by the way: I tried using DateTime instead of Stopwatch and got the same results.
I also tried:
double PercentComplete = (double)sw.ElapsedTicks / (double)Stopwatch.Frequency * 1000 / AnimationTime;
It gives me the same results.

It probably has to do with the way you call DoFrame(). Using the following Windows Forms / GDI+ based algorithm, you should always get very smooth animations:
const double desiredFps = 500.0;
long ticks1 = 0;
var interval = Stopwatch.Frequency / desiredFps;
while (true)
{
    Application.DoEvents();
    var ticks2 = Stopwatch.GetTimestamp();
    if (ticks2 >= ticks1 + interval)
    {
        ticks1 = Stopwatch.GetTimestamp();
        // do the drawing here
    }
}
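For reference, here is a minimal sketch of how that pacing loop might drive the Animation class from the question (the animation variable and the 500 fps cap are assumptions carried over from the snippets above, not code from the original answer):

var animation = new Animation();
animation.StartAnimation();

const double desiredFps = 500.0;
var interval = Stopwatch.Frequency / desiredFps;
long lastDraw = 0;

while (!animation.IsComplete)
{
    Application.DoEvents();

    var now = Stopwatch.GetTimestamp();
    if (now >= lastDraw + interval)
    {
        lastDraw = now;
        animation.DoFrame(); // draws based on elapsed time, as in the question
    }
}

The drawing is rate-limited to the desired frame interval, which is what the answer above relies on for smoothness.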

Related

Would this C# delta time loop be good for running a simulation at 60fps on server side?

For running a simulation on the server side at 60 logic updates per second, do you think the following would suffice and be stable enough? It's not real-time physics, but a mild action-RPG type of multiplayer game. From what I've read, I don't think Thread.Sleep is a good idea, but I'm not sure how else to keep the loop from eating up unnecessary CPU cycles.
class Program
{
    static void Main()
    {
        long fps = 60;
        long interval = 1000 / fps * 10000; // ticks per frame (1 tick = 100 ns)
        long ticks, delta;
        long prevTicks = DateTime.Now.Ticks;
        long accum = 0;

        while (true)
        {
            ticks = DateTime.Now.Ticks;
            delta = ticks - prevTicks;
            prevTicks = ticks;
            accum += delta;

            if (accum >= interval)
            {
                // update logic
                Console.WriteLine(accum);
                accum -= interval;
            }

            Thread.Sleep(0);
        }
    }
}
From experience, the accumulator is really only useful in physics simulations: it makes the simulation's physics more accurate, but at the price of jankier rendering when the loop is running slowly (at lower fps). I'd also suggest using doubles instead of longs.
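A minimal sketch of the same loop using doubles and Stopwatch timestamps instead of longs and DateTime.Now.Ticks (my own variant of the question's code, following the suggestion above):

using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    static void Main()
    {
        const double fps = 60.0;
        double interval = 1.0 / fps;   // seconds per logic update
        double accum = 0.0;
        long prev = Stopwatch.GetTimestamp();

        while (true)
        {
            long now = Stopwatch.GetTimestamp();
            accum += (now - prev) / (double)Stopwatch.Frequency;
            prev = now;

            while (accum >= interval)
            {
                // update logic here
                accum -= interval;
            }

            Thread.Sleep(1); // yield so the loop doesn't burn a full core
        }
    }
}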

Thread.Sleep is messing with Stopwatch

I'm using a Stopwatch to measure how long some operations run, for a game's points system.
I'm defining it like this:
Stopwatch sekunde = new Stopwatch();
long tocke;
long glavnetocke;
long cas;
tocke and glavnetocke are points.
Then in "construct" I need to do this:
sekunde.Start();
Thread.Sleep(10000);
sekunde.Stop();
If I don't call Thread.Sleep for 10000 ms, it won't work.
Later in event handlers I use it like this:
sekunde.Start();
viewer.Clear();
viewer.DrawBody(body, 10.0, Brushes.Green, 10.0, Brushes.Green);
sekunde.Stop();
cas = (long)sekunde.Elapsed.TotalSeconds;
tocke = tocke + ((cas / 10));
if (tocke > 200)
{
    glavnetocke = glavnetocke + 1;
    tocke = 0;
}
viewer draws the body of the person standing in front of a camera.
If I leave Thread.Sleep at 10000 it works; otherwise it doesn't. But I need this to go away, because it stops the whole program. Even though it only happens after I press a button in my form, I have to wait 10 seconds before starting. I was thinking of just adding a loading screen, but that doesn't work either, as it freezes the whole program, so the loading gif never shows.
EDIT:
Well, as you can see from tocke = tocke + ((cas / 10));, tocke is always 0 if I remove Thread.Sleep or even lower the sleep duration. Does Stopwatch need time to initialize or something?
I want to get how long the operation runs into a long variable so I can then use it for in-game points calculation (cas = (long)sekunde.Elapsed.TotalSeconds;).
By casting TotalSeconds to a long you are chopping off the fractional part. I've redone your example using doubles and it seems to work for me:
Stopwatch sekunde = new Stopwatch();
long glavnetocke = 0;
double cas;
double tocke = 0;
sekunde.Start();
sekunde.Stop();
cas = sekunde.Elapsed.TotalSeconds;
tocke = tocke + ((cas / 10));
if (tocke > 200)
{
    glavnetocke = glavnetocke + 1;
    tocke = 0;
}
Result
cas = 0.0000015 (on my setup)
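To make the truncation concrete, here is a small sketch (the Thread.Sleep is just a stand-in for the drawing work being timed):

var sw = new Stopwatch();
sw.Start();
Thread.Sleep(50);                              // stand-in for viewer.DrawBody(...)
sw.Stop();

long asLong = (long)sw.Elapsed.TotalSeconds;   // 0 -- the fraction is chopped off
double asDouble = sw.Elapsed.TotalSeconds;     // ~0.05 -- keeps the fraction

With the long cast, tocke only ever grows when a single operation takes more than a full second, which is why it stays 0 without the ten-second sleep.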

How can I increase the volume of a sound from unhearable (0db) to loud (60db)

I'm trying to increase the volume by doing this:
public void maakgeluid() {
    WaveOut waveOut = new WaveOut();
    SineWaveOscillator osc = new SineWaveOscillator(44100);
    for (short amplitude = 1; amplitude < 500; amplitude += 1) {
        Console.WriteLine("amplitude:" + amplitude);
        for (int i = 1; i < 10; i++) {
            Console.WriteLine(i);
            osc.Frequency = 500;
            osc.Amplitude = amplitude;
            try {
                waveOut.Init(osc);
            } catch (Exception) { Console.WriteLine("error"); }
            waveOut.Play();
        }
    }
}
The purpose is to generate a sound like the one you hear when you go to the ear specialist and take a hearing test. So it should start very quietly and then slowly get louder.
But I have several problems:
I hear the sound immediately.
The sound increases too fast.
I use the loop with the i counter to lengthen the duration of the sound, but I don't think that is the right way to do it.
The loop that increases the sound level stops too quickly, and I don't see why.
Thanks!
This is based on the code from msdn.microsoft.com/en-us/magazine/ee309883.asp
The bel scale is logarithmic, so you need to do maths to convert between decibels and a sample scaling factor.
In digital (floating point) audio a full signal (i.e. a waveform that reaches from +1 to -1, peak to trough) is considered to be 0dB.
As such, you'd probably want to go from -60dB to 0dB.
The conversion is as follows (assuming signed audio, rather than unsigned as with 8-bit audio):
double sampleValue = 1.0d;
// every -6 dB represents an approximate halving of the linear sample value
double decibelValue = -6.0d;
double linearScalingRatio = Math.Pow(10d, decibelValue / 20d);
var newSampleValue = sampleValue * linearScalingRatio;
So now, newSampleValue is 0.501187233627272
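To ramp from -60 dB up to 0 dB, as suggested above, you can step the decibel value and convert each step to a linear gain. A small sketch (the 1 dB step size and the DecibelsToLinear helper are my own, for illustration):

// hypothetical helper: decibels -> linear amplitude factor
static double DecibelsToLinear(double decibels)
{
    return Math.Pow(10d, decibels / 20d);
}

// step from -60 dB (barely audible) to 0 dB (full scale)
for (double db = -60.0; db <= 0.0; db += 1.0)
{
    double gain = DecibelsToLinear(db); // 0.001 at -60 dB, 1.0 at 0 dB
    // multiply the oscillator's samples by 'gain' here
}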
Your current code keeps recreating WaveOut devices, which is not a good idea. Open the soundcard once, then feed it a single signal that gradually increases in volume over time. One way to do this is to use the SignalGenerator class to make the sine wave, then pass that through a FadeInOutSampleProvider to gradually fade it in:
var waveOut = new WaveOut();
var sg = new SignalGenerator(44100, 1);
sg.Frequency = 500;
sg.Type = SignalGeneratorType.Sin;
var fadeIn = new FadeInOutSampleProvider(sg, true);
fadeIn.BeginFadeIn(20000); // fade in over 20 seconds
waveOut.Init(fadeIn);
waveOut.Play();
As spender rightly points out, 0dB is maximum, so this is going from negative infinity decibels up to 0dB over the duration of the fade-in time. If you wanted to make it start at -60dB, or for the ramp-up of the multiplier to not be linear, then you'd need to make your own custom ISampleProvider similar to FadeInOutSampleProvider and use that instead.
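A rough sketch of what such a custom ISampleProvider could look like, ramping the gain in decibels from a configurable floor (say -60 dB) up to 0 dB. This is my own illustration, not part of the original answer, and the class name is made up:

using System;
using NAudio.Wave;

// Hypothetical provider that fades its source in from floorDb up to 0 dB.
class DecibelFadeInProvider : ISampleProvider
{
    private readonly ISampleProvider source;
    private readonly double floorDb;
    private readonly int fadeSamples;
    private int position;

    public DecibelFadeInProvider(ISampleProvider source, double floorDb, double fadeSeconds)
    {
        this.source = source;
        this.floorDb = floorDb;
        this.fadeSamples = (int)(fadeSeconds * source.WaveFormat.SampleRate);
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(float[] buffer, int offset, int count)
    {
        int read = source.Read(buffer, offset, count);
        for (int i = 0; i < read; i++)
        {
            // interpolate in decibels, then convert to a linear gain
            double t = Math.Min(1.0, (double)position / fadeSamples);
            double db = floorDb * (1.0 - t); // floorDb at the start, 0 dB at the end
            buffer[offset + i] *= (float)Math.Pow(10.0, db / 20.0);
            if (position < fadeSamples) position++;
        }
        return read;
    }
}

Something like new DecibelFadeInProvider(sg, -60.0, 20.0) could then be passed to waveOut.Init in place of the FadeInOutSampleProvider above.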

More Precise timer than Stopwatch?

I'm trying to have a stopwatch start and stop when recording positions for the Kinect:
// process x and y coordinates
public void calculateJoints(Skeleton skeleton)
{
    Joint rightHand = skeleton.Joints[JointType.HandRight];
    Joint leftHand = skeleton.Joints[JointType.HandLeft];
    rightX = rightHand.Position.X;
    rightY = rightHand.Position.Y;
}

// start the stopwatch (tried to use a greater time between positions 1 and 5 vs 1 and 2)
public void processJointsAndRepeat(Skeleton skeleton)
{
    startTime();
    while (numPositions < 5)
    {
        calculateJoints(skeleton);
        numPositions++;
    }
    stopTime();
    double tempTime = calculateTimeElapsed();
}

// calculate time in milliseconds
private double calculateTimeElapsed()
{
    long milliseconds = stopWatch.ElapsedMilliseconds;
    return (double)milliseconds;
}
But whenever I try to store the x, y, and time values with time as the key, it throws an error for duplicate keys, and when I retrieve the value of tempTime it only shows 0.
Is this a problem with my code, or do I need a more precise stopwatch?
I realize that getting a time for something that is 30 fps is difficult, so if you have any other suggestions, that'd be great! I'm basically just trying to calculate the average velocities between points to adjust the playback speed of an audio file. Thanks!
Stopwatch is a wrapper around the highest-resolution timer available on a regular Windows box. You can get better than millisecond resolution by using Stopwatch.ElapsedTicks together with Stopwatch.Frequency.
Note that your problem is probably not related to timers but rather to some other code you did not show...
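For example, a quick sketch of sub-millisecond timing with those two members (variable names are mine):

var stopWatch = Stopwatch.StartNew();
// ... process the joints ...
double elapsedMs = stopWatch.ElapsedTicks * 1000.0 / Stopwatch.Frequency;
Console.WriteLine(elapsedMs); // e.g. 0.8437 instead of a truncated 0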

What are my options for timing?

I'm making a TextBox control in XNA and do not have access to the GameTime class. Currently I am trying to simulate the blinking text cursor caret and have successfully done so using this code:
int deltaTickCount = Environment.TickCount - previousTickCount;
if (deltaTickCount < CursorBlinkRate && Selected)
{
    spriteBatch.Draw(emptyPixel, new Rectangle(caretLocation, Rectangle.Y + 1, caretWidth, caretHeight), Color.Black);
}
else if (deltaTickCount > CursorBlinkRate * 2)
{
    previousTickCount = Environment.TickCount;
}
However, I'm a bit wary of using Environment.TickCount. If the computer were running long enough, wouldn't the program eventually crash or produce unpredictable behavior when the tick count exceeded the range of its integer type?
Does anyone know what Windows does? I imagine it would use the system clock. Would that be a more suitable solution? I imagine they use something like total milliseconds into the second instead of the tick count, but I'm not sure.
Thanks for reading.
I generally use the System.Diagnostics Stopwatch in a lot of situations.
It's a pretty powerful tool that gives you a timer with a lot of good controls.
using System.Diagnostics;
Stopwatch timer = new Stopwatch();
Then use the built-in members:
timer.Start();
if (timer.ElapsedMilliseconds > ...)
{ }
timer.Reset();
etc...
This would allow you to reset the timer?
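For instance, a rough sketch of the caret blink driven by a Stopwatch instead of Environment.TickCount (CursorBlinkRate, Selected, and the draw call are taken from the question; Restart requires .NET 4 or later):

Stopwatch blinkTimer = Stopwatch.StartNew();

// inside the draw code:
long elapsed = blinkTimer.ElapsedMilliseconds;
if (elapsed < CursorBlinkRate && Selected)
{
    spriteBatch.Draw(emptyPixel, new Rectangle(caretLocation, Rectangle.Y + 1, caretWidth, caretHeight), Color.Black);
}
else if (elapsed > CursorBlinkRate * 2)
{
    blinkTimer.Restart(); // no rollover to worry about
}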
When Environment.TickCount rolls over (it jumps from Int32.MaxValue to Int32.MinValue), a negative deltaTickCount tells you it has happened, and you can compensate:
if (deltaTickCount < 0)
    deltaTickCount = (int.MaxValue - previousTickCount) + (Environment.TickCount - int.MinValue) + 1;
Without bothering with what would happen in the case of an integer overflow, simply change to:
int deltaTickCount =
    Environment.TickCount > previousTickCount
        ? Environment.TickCount - previousTickCount
        : CursorBlinkRate * 3;
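Putting either guard into the caret code from the question might look roughly like this (a sketch only; previousTickCount and the blink logic come from the question):

int deltaTickCount = Environment.TickCount - previousTickCount;
if (deltaTickCount < 0)
{
    // rollover (or some other discontinuity): just restart the blink cycle
    previousTickCount = Environment.TickCount;
    deltaTickCount = 0;
}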
