I am trying to read a video (.mp4) from a file and show it in an ImageBox on a WinForms form. When I time the total playback, it is actually longer than the video's length. For example, if the video is 10 seconds, it plays for 12-13 seconds. Why is that? The frame rate of my video is 31 fps.
I made sure the fps variable already holds the correct frame rate.
private void Play_Video(string filePath)
{
VideoCapture cameraCapture = new VideoCapture(filePath);
var stopwatch = new Stopwatch();
stopwatch.Start();
while (true)
{
Mat m = new Mat();
cameraCapture.Read(m);
if (!m.IsEmpty)
{
imageBox1.Image = m;
double fps = cameraCapture.GetCaptureProperty(CapProp.Fps);
Thread.Sleep(1000 / Convert.ToInt32(fps));
}
else
{
break;
}
}
stopwatch.Stop();
double elapsed_time = stopwatch.ElapsedMilliseconds * 0.001;
// My elapsed_time here shows about 13 seconds, when the actual length of video is 10 seconds. Why?
}
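Worth noting: the overrun has two sources. Thread.Sleep(1000 / fps) is added on top of the time that Read and the ImageBox assignment take, and Thread.Sleep itself only has roughly 15 ms granularity. A minimal sketch of a compensated loop, assuming the same Emgu CV types as above (the method name and timing fields are illustrative, not from the original post):

private void Play_Video_Compensated(string filePath)
{
    using (var cameraCapture = new VideoCapture(filePath))
    {
        double fps = cameraCapture.GetCaptureProperty(CapProp.Fps);
        int frameIntervalMs = (int)Math.Round(1000.0 / fps);
        var frameTimer = new Stopwatch();
        while (true)
        {
            frameTimer.Restart();
            Mat m = new Mat();
            cameraCapture.Read(m);
            if (m.IsEmpty)
                break;
            imageBox1.Image = m;
            // Sleep only for what is left of the frame interval, so decode
            // and render time are not added on top of it.
            int remaining = frameIntervalMs - (int)frameTimer.ElapsedMilliseconds;
            if (remaining > 0)
                Thread.Sleep(remaining);
        }
    }
}

This still inherits Sleep's ~15 ms granularity, which is what the busy-wait version further down addresses.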
NEW EDIT:
I moved frame retrieval into a separate function so it doesn't cause a delay during rendering, but there is still a few seconds of delay. For example, a 30-second video plays for 32-33 seconds. My updated code:
private void Play_Video(List<Mat> frames, double fps, Emgu.CV.UI.ImageBox imageBox)
{
var stopwatch = new Stopwatch();
stopwatch.Start();
for (int i = 0; i < frames.Count; i++)
{
imageBox.Image = frames[i];
Thread.Sleep(1000 / Convert.ToInt32(fps));
}
stopwatch.Stop();
double elapsed_time = stopwatch.ElapsedMilliseconds * 0.001;
}
This was the only way I could get the exact fps, because Thread.Sleep takes 10-15 ms to cycle on and off; I had the same problem with Task.Delay. This seems to work perfectly. If anyone has a better solution, I would love to see it.
private void PlayVideo()
{
double framecount = video.Get(Emgu.CV.CvEnum.CapProp.FrameCount);
double fps = video.Get(Emgu.CV.CvEnum.CapProp.Fps);
video.Set(Emgu.CV.CvEnum.CapProp.PosFrames, (double)(framecount * (double)(savePercent / 100)));
int ms = Convert.ToInt32(((double)1000 / fps));
while (disablePreviewTimer == false)
{
DateTime dt = DateTime.Now;
Emgu.CV.Mat img = video.QueryFrame();
if (img == null)
{
disablePreviewTimer = true;
break;
}
else
{
Emgu.CV.Util.VectorOfByte vb = new Emgu.CV.Util.VectorOfByte();
Emgu.CV.CvInvoke.Resize(img, img, PreviewSize);
Emgu.CV.CvInvoke.Imencode(".jpg", img, vb);
pictureBox5.Image = Image.FromStream(new MemoryStream(vb.ToArray()));
Application.DoEvents();
while ((DateTime.Now - dt).TotalMilliseconds < ms)
{
Application.DoEvents();
}
label6.Text = ((int)(DateTime.Now - dt).TotalMilliseconds).ToString();
}
}
}
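A related variation on the busy-wait above: pace each frame against a single Stopwatch's absolute elapsed time instead of a fresh per-frame deadline, so decode time and Sleep jitter cannot accumulate over the whole clip. A sketch assuming the same fields as the code above (video, PreviewSize, pictureBox5); the method name is illustrative:

private void PlayVideoStopwatchPaced()
{
    double fps = video.Get(Emgu.CV.CvEnum.CapProp.Fps);
    double frameMs = 1000.0 / fps;
    var sw = System.Diagnostics.Stopwatch.StartNew();
    long frameIndex = 0;
    while (true)
    {
        Emgu.CV.Mat img = video.QueryFrame();
        if (img == null)
            break;
        Emgu.CV.CvInvoke.Resize(img, img, PreviewSize);
        using (var vb = new Emgu.CV.Util.VectorOfByte())
        {
            Emgu.CV.CvInvoke.Imencode(".jpg", img, vb);
            pictureBox5.Image = Image.FromStream(new MemoryStream(vb.ToArray()));
        }
        // Deadline for this frame measured from the start of playback:
        // a slow frame shrinks the next wait instead of delaying every later frame.
        double deadline = ++frameIndex * frameMs;
        int wait = (int)(deadline - sw.Elapsed.TotalMilliseconds);
        if (wait > 0)
            System.Threading.Thread.Sleep(wait);
        Application.DoEvents(); // keep the UI responsive, as in the original loop
    }
}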
Related
Issue 1: Delay in preview of webcam, FPS drop on live preview
At high resolution I am facing a delay in the webcam preview and an FPS drop on the live preview in AForge.Video.DirectShow (Version=2.2.5.0). I am using AForge.Video.DirectShow to get the camera feed in a WPF C# application. At high resolution (1920 x 1080 @ 60 FPS) the user can observe a delay in the webcam preview; at normal resolutions the preview looks good. But when the user selects a higher resolution, the frame rate drops and the preview visibly lags.
This is the event that gets each camera frame, assigns it to the picture box, and calculates the frame rate:
public void VideoCaptureDevice_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
//// make the FPS measurement
stopwatch.Stop();
TimeSpan ts = stopwatch.Elapsed;
stopwatch.Restart();
// this is called exponential smoothing. The weight for the previous
// avgFramePeriod and the weight for the new measurement should equal 1
avgFramePeriod = 0.9 * avgFramePeriod + 0.1 * ts.TotalSeconds;
try
{
previewBox.Image = (Bitmap)eventArgs.Frame.Clone();
_latestFrame = (Bitmap)eventArgs.Frame.Clone();
if (_recording && _writer.IsOpen)
{
if (_firstFrameTime != null)
{
_writer.WriteVideoFrame(_latestFrame, DateTime.Now - _firstFrameTime.Value);
}
else
{
_writer.WriteVideoFrame(_latestFrame);
_firstFrameTime = DateTime.Now;
}
}
}
catch { }
// suggest the garbage collector run -- with high res images, if you don't
// do this, it can run out of memory
GC.Collect();
}
private void Timer_Tick(object sender, EventArgs e)
{
if (this.Visible)
{
double avgFPS = 1.0 / avgFramePeriod;
fpsValStatusLabel.Text = avgFPS.ToString("0.##");
}
}
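Not part of the original post, but two things in the handler above are expensive at 1920 x 1080 @ 60 FPS: every frame is cloned twice, and GC.Collect() forces a full collection on every frame, which can easily cost more than the 16 ms frame budget. An untested sketch of the same handler cloning once and disposing the previous preview bitmap instead of forcing the GC (same fields as above):

public void VideoCaptureDevice_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
    stopwatch.Stop();
    avgFramePeriod = 0.9 * avgFramePeriod + 0.1 * stopwatch.Elapsed.TotalSeconds;
    stopwatch.Restart();
    try
    {
        // Clone once and share the copy between preview and writer.
        Bitmap frame = (Bitmap)eventArgs.Frame.Clone();
        var previous = previewBox.Image;
        previewBox.Image = frame;
        _latestFrame = frame;
        // Disposing the previous bitmap promptly keeps memory bounded
        // without a forced full collection every frame.
        previous?.Dispose();
        if (_recording && _writer.IsOpen)
        {
            if (_firstFrameTime != null)
                _writer.WriteVideoFrame(_latestFrame, DateTime.Now - _firstFrameTime.Value);
            else
            {
                _writer.WriteVideoFrame(_latestFrame);
                _firstFrameTime = DateTime.Now;
            }
        }
    }
    catch { }
}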
Issue 2: Video gets pixelated when the user selects a high resolution (1920 x 1080 @ 60 FPS)
For video recording I have used the Accord.Video.FFMPEG.VideoFileWriter (Version=3.8.0.0) package, but the video gets pixelated when the user selects a high resolution (1920 x 1080 @ 60 FPS). In the screenshot we can see the pixelated video at high resolution.
This is the button click event where I create the VideoFileWriter object for recording:
private void button1_Click(object sender, EventArgs e)
{
_writer = new Accord.Video.FFMPEG.VideoFileWriter();
string fileName = $"temp_{DateTime.Now.ToFileTime()}.mp4";
string InitialDirectory = Environment.GetFolderPath(Environment.SpecialFolder.MyVideos);
string pathToNexigoFolder = System.IO.Path.Combine(InitialDirectory, "Nexigo");
if (!Directory.Exists(pathToNexigoFolder))
{
try
{
Directory.CreateDirectory(pathToNexigoFolder);
}
catch
{
pathToNexigoFolder = InitialDirectory;
}
}
string filePath = System.IO.Path.Combine(pathToNexigoFolder, fileName);
_writer.Open(filePath, _latestFrame.Width, _latestFrame.Height, 60, Accord.Video.FFMPEG.VideoCodec.MPEG4, 60);
_recording = true;
}
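One thing that stands out in the Open call above: if the last argument of this VideoFileWriter.Open overload is the target bit rate in bits per second (as its signature suggests), passing 60 would by itself explain the pixelation at 1080p60. A sketch with a more plausible value; the exact number is a quality/size trade-off, not something from the original post:

// ~8 Mbit/s is a reasonable starting point for 1080p60 MPEG-4;
// the original call passed 60, i.e. 60 bits per second.
int bitRate = 8000000;
_writer.Open(filePath, _latestFrame.Width, _latestFrame.Height, 60,
             Accord.Video.FFMPEG.VideoCodec.MPEG4, bitRate);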
private void button2_Click(object sender, EventArgs e)
{
_recording = false;
_writer.Close();
_writer.Dispose();
}
The callback event from the AForge.Video.DirectShow library for getting each frame from the webcam is the same VideoCaptureDevice_NewFrame handler (and Timer_Tick) shown under Issue 1 above.
So, I'm trying to create a method that animates the movement of a control on a form. I generate all the points the control is going to travel to beforehand, like this:
private static List<decimal> TSIncrement(int durationInMilliseconds, decimal startPoint, decimal endPoint)
{
List<decimal> tempPoints = new List<decimal>();
decimal distance = endPoint - startPoint;
decimal increment = distance / durationInMilliseconds;
decimal tempPoint = (decimal)startPoint;
for (decimal i = durationInMilliseconds; i > 0; i--)
{
tempPoint += increment;
tempPoints.Add(tempPoint);
}
return tempPoints;
}
This outputs a list with as many points as there are milliseconds in the duration of the animation. I think you can guess what I'm doing afterwards:
public static void ControlAnimation(Control control, Point locationEndpoint, int delay)
{
if (delay > 0)
{
List<decimal> tempXpoints = TSIncrement(delay, control.Location.X, locationEndpoint.X);
List<decimal> tempYpoints = TSIncrement(delay, control.Location.Y, locationEndpoint.Y);
for (int i = 0; i < delay; i++)
{
control.Location = new Point((int)Math.Round(tempXpoints[i]), (int)Math.Round(tempYpoints[i]));
Thread.Sleep(1); //I won't leave this obviously, it's just easier for now
}
}
}
In the actual method, I go through this list of points and use those to create the new location of the control (I actually use two lists for the abscissa and the ordinate).
My problem lies in creating one millisecond of delay between each shift. Since the code in the loop takes a bit of time to execute, I usually end up with approximately 5 seconds of extra duration.
I tried using a stopwatch to measure the time it takes to set control.Location and subtracting that from the 1 millisecond delay. The stopwatch adds some delay as well though, since I have to start, stop, and reset it every time.
So what should I do, and how could I improve my code? Any feedback is greatly appreciated :)
You won't get a reliable delay below around 50 milliseconds in WinForms, so that is the delay I used below:
private Random R = new Random();
private async void button1_Click(object sender, EventArgs e)
{
button1.Enabled = false;
label1.AutoSize = false;
label1.Size = button2.Size;
Point p = new Point(R.Next(this.Width - button2.Width), R.Next(this.Height - button2.Height));
label1.Location = p;
label1.SendToBack();
await MoveControl(button2, p, R.Next(2000, 7001));
button1.Enabled = true;
}
private Task MoveControl(Control control, Point LocationEndPoint, int delayInMilliseconds)
{
return Task.Run(new Action(() =>
{
decimal p;
int startX = control.Location.X;
int startY = control.Location.Y;
int deltaX = LocationEndPoint.X - startX;
int deltaY = LocationEndPoint.Y - startY;
System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();
sw.Start();
while(sw.ElapsedMilliseconds < delayInMilliseconds)
{
System.Threading.Thread.Sleep(50);
p = Math.Min((decimal)1.0, (decimal)sw.ElapsedMilliseconds / (decimal)delayInMilliseconds);
control.Invoke((MethodInvoker)delegate {
control.Location = new Point(startX + (int)(p * deltaX), startY + (int)(p * deltaY));
});
}
}));
}
I need a timer that fires every 25ms. I've been comparing the default Timer implementation between Windows 10 and Linux (Ubuntu Server 16.10 and 12.04) on both the dotnet core runtime and the latest mono-runtime.
There are some differences in the timer precision that I don't quite understand.
I'm using the following piece of code to test the Timer:
// inside Main()
var s = new Stopwatch();
var offsets = new List<long>();
const int interval = 25;
using (var t = new Timer((obj) =>
{
offsets.Add(s.ElapsedMilliseconds);
s.Restart();
}, null, 0, interval))
{
s.Start();
Thread.Sleep(5000);
}
foreach(var n in offsets)
{
Console.WriteLine(n);
}
Console.WriteLine(offsets.Average(n => Math.Abs(interval - n)));
On Windows it's all over the place:
...
36
25
36
26
36
5,8875 # <-- average timing error
Using .NET Core on Linux, it's less all over the place:
...
25
30
27
28
27
2.59776536312849 # <-- average timing error
But the mono Timer is very precise:
...
25
25
24
25
25
25
0.33 # <-- average timing error
Edit: Even on Windows, mono still maintains its timing precision:
...
25
25
25
25
25
25
25
24
0.31
What is causing this difference? Is there a benefit to the way the dotnet core runtime does things compared to mono, that justifies the lost precision?
Unfortunately you cannot rely on the timers in the .NET framework. The best one has ~15 ms resolution even if you want it to fire every millisecond. But you can implement a high-resolution timer with microsecond precision, too.
Note: This works only when Stopwatch.IsHighResolution returns true. In Windows this is true starting with Windows XP; however, I did not test other frameworks.
public class HiResTimer
{
// The number of ticks per one millisecond.
private static readonly float tickFrequency = 1000f / Stopwatch.Frequency;
public event EventHandler<HiResTimerElapsedEventArgs> Elapsed;
private volatile float interval;
private volatile bool isRunning;
// Ticks that would fire later than this threshold (in milliseconds) are skipped and counted as fallouts.
private volatile float ignoreElapsedThreshold = Single.PositiveInfinity;
public HiResTimer() : this(1f)
{
}
public HiResTimer(float interval)
{
if (interval < 0f || Single.IsNaN(interval))
throw new ArgumentOutOfRangeException(nameof(interval));
this.interval = interval;
}
// The interval in milliseconds. Fractions are allowed so 0.001 is one microsecond.
public float Interval
{
get { return interval; }
set
{
if (value < 0f || Single.IsNaN(value))
throw new ArgumentOutOfRangeException(nameof(value));
interval = value;
}
}
public bool Enabled
{
set
{
if (value)
Start();
else
Stop();
}
get { return isRunning; }
}
public void Start()
{
if (isRunning)
return;
isRunning = true;
Thread thread = new Thread(ExecuteTimer);
thread.Priority = ThreadPriority.Highest;
thread.Start();
}
public void Stop()
{
isRunning = false;
}
private void ExecuteTimer()
{
float nextTrigger = 0f;
int fallouts = 0; // consecutive ticks skipped because they exceeded ignoreElapsedThreshold
Stopwatch stopwatch = new Stopwatch();
stopwatch.Start();
while (isRunning)
{
float intervalLocal = interval;
nextTrigger += intervalLocal;
float elapsed;
while (true)
{
elapsed = ElapsedHiRes(stopwatch);
float diff = nextTrigger - elapsed;
if (diff <= 0f)
break;
if (diff < 1f)
Thread.SpinWait(10);
else if (diff < 10f)
Thread.SpinWait(100);
else
{
// By default Sleep(1) lasts about 15.5 ms (if not configured otherwise for the application by WinMM, for example)
// so not allowing sleeping under 16 ms. Not sleeping for more than 50 ms so interval changes/stopping can be detected.
if (diff >= 16f)
Thread.Sleep(diff >= 100f ? 50 : 1);
else
{
Thread.SpinWait(1000);
Thread.Sleep(0);
}
// if we have a larger time to wait, we check if the interval has been changed in the meantime
float newInterval = interval;
if (intervalLocal != newInterval)
{
nextTrigger += newInterval - intervalLocal;
intervalLocal = newInterval;
}
}
if (!isRunning)
return;
}
float delay = elapsed - nextTrigger;
if (delay >= ignoreElapsedThreshold)
{
fallouts += 1;
continue;
}
Elapsed?.Invoke(this, new HiResTimerElapsedEventArgs(delay, fallouts));
fallouts = 0;
// restarting the timer every hour to prevent precision problems
if (stopwatch.Elapsed.TotalHours >= 1d)
{
stopwatch.Restart();
nextTrigger = 0f;
}
}
stopwatch.Stop();
}
private static float ElapsedHiRes(Stopwatch stopwatch)
{
return stopwatch.ElapsedTicks * tickFrequency;
}
}
public class HiResTimerElapsedEventArgs : EventArgs
{
    public float Delay { get; }
    public int Fallouts { get; }
    internal HiResTimerElapsedEventArgs(float delay, int fallouts)
    {
        Delay = delay;
        Fallouts = fallouts;
    }
}
Edit 2021: Using the latest version that does not have the issue @hankd mentions in the comments.
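For reference, a minimal usage sketch for the class above (the 25 ms interval matches the question; nothing outside the posted class is assumed):

var timer = new HiResTimer(25f); // interval in milliseconds, fractions allowed
timer.Elapsed += (s, e) =>
{
    // e.Delay is how late the tick fired (ms); e.Fallouts counts skipped ticks.
    Console.WriteLine($"tick, delay: {e.Delay:0.000} ms");
};
timer.Start();
Thread.Sleep(5000);
timer.Stop();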
I'm trying to do the following setup (two resampled inputs mixed and written out through a SavingWaveProvider) and it works fine when I use a real output.
I'm not sure what the right approach is without one. I tried to use a Timer and it works for some time, but then fails because it drifts a bit and I get a buffer-full exception.
var mixSampleProvider = new MixingSampleProvider(resampleWaveFormat);
mixSampleProvider.AddMixerInput(inputAResampler);
mixSampleProvider.AddMixerInput(inputBResampler);
var mixWaveProvider = new SampleToWaveProvider(mixSampleProvider);
savingWaveProvider = new SavingWaveProvider(mixWaveProvider);
System.Timers.Timer timer = new System.Timers.Timer(98);
timer.Elapsed += (sender, args) =>
{
var count = resampleWaveFormat.AverageBytesPerSecond / 10;
var dummy = new byte[count];
savingWaveProvider.Read(dummy, 0, count);
};
timer.Start();
I have tried to calculate how much I should read on each tick e.g.
var readCount = Math.Min(inputABufferedWaveProvider.BufferedBytes, inputBBufferedWaveProvider.BufferedBytes);
but I cannot make it work, and I have tried to use the DataAvailable event, but since there are two inputs being mixed I cannot get that to work either.
The resolution of System.Timers.Timer is approximately 15.6 ms, based on the Windows clock tick. You need to track the time using a more accurate mechanism and adjust your read rates based on the true elapsed time rather than the rate of timer ticks.
The most popular method of tracking elapsed time is to use a System.Diagnostics.Stopwatch to determine how much time has actually elapsed since your process started, which you can then use to calculate the number of samples to read to stay in sync.
Here's an IWavePlayer implementation that uses a timer and a stopwatch to figure out how many samples to read from its input:
public class SyncedNullOutput : IWavePlayer
{
// where to read data from
private IWaveProvider _source;
// time measurement
Stopwatch _stopwatch = null;
double _lastTime = 0;
// timer to fire our read method
System.Timers.Timer _timer = null;
PlaybackState _state = PlaybackState.Stopped;
public PlaybackState PlaybackState { get { return _state; } }
public SyncedNullOutput()
{ }
public SyncedNullOutput(IWaveProvider source)
{
Init(source);
}
public void Dispose()
{
Stop();
}
void _timer_Elapsed(object sender, ElapsedEventArgs args)
{
// get total elapsed time, compare to last time
double elapsed = _stopwatch.Elapsed.TotalSeconds;
double deltaTime = elapsed - _lastTime;
_lastTime = elapsed;
// work out number of samples we need to read...
int nSamples = (int)(deltaTime * _source.WaveFormat.SampleRate);
// ...and how many bytes those samples occupy
int nBytes = nSamples * _source.WaveFormat.BlockAlign;
// Read samples from the source
byte[] buffer = new byte[nBytes];
_source.Read(buffer, 0, nBytes);
}
public void Play()
{
if (_state == PlaybackState.Stopped)
{
// create timer
_timer = new System.Timers.Timer(90);
_timer.AutoReset = true;
_timer.Elapsed += _timer_Elapsed;
_timer.Start();
// create stopwatch
_stopwatch = Stopwatch.StartNew();
_lastTime = 0;
}
else if (_state == PlaybackState.Paused)
{
// restart stopwatch (Restart rather than Reset, otherwise it stays stopped and no samples are read)
_stopwatch.Restart();
_lastTime = 0;
// restart timer
_timer.Start();
}
_state = PlaybackState.Playing;
}
public void Stop()
{
if (_timer != null)
{
_timer.Stop();
_timer.Dispose();
_timer = null;
}
if (_stopwatch != null)
{
_stopwatch.Stop();
_stopwatch = null;
}
_lastTime = 0;
_state = PlaybackState.Stopped;
}
public void Pause()
{
_timer.Stop();
_state = PlaybackState.Paused;
}
public void Init(IWaveProvider waveProvider)
{
Stop();
_source = waveProvider;
}
public event EventHandler<StoppedEventArgs> PlaybackStopped;
protected void OnPlaybackStopped(Exception exception = null)
{
if (PlaybackStopped != null)
PlaybackStopped(this, new StoppedEventArgs(exception));
}
public float Volume {get;set;}
}
I did some tests with this hooked up to a BufferedWaveProvider that was being fed samples from a default WaveInEvent instance (8kHz PCM 16-bit mono). The timer was ticking at around 93ms instead of the requested 90ms, as judged by the total run time vs number of reads, and the input buffer remained constantly under 3800 bytes in length. Changing to 44.1kHz stereo IeeeFloat format upped the buffer size to just under 80kB... still very manageable, and no overflows. In both cases the data was arriving in blocks just under half the maximum buffer size - 35280 bytes per DataAvailable event vs 76968 bytes maximum buffer length in a 60 second run, with DataAvailable firing every 100ms on average.
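For reference, the test rig described above might look roughly like this (a sketch assuming NAudio's WaveInEvent and BufferedWaveProvider; 8 kHz 16-bit mono matches the test):

var waveIn = new NAudio.Wave.WaveInEvent { WaveFormat = new NAudio.Wave.WaveFormat(8000, 16, 1) };
var buffer = new NAudio.Wave.BufferedWaveProvider(waveIn.WaveFormat);
waveIn.DataAvailable += (s, e) => buffer.AddSamples(e.Buffer, 0, e.BytesRecorded);

var output = new SyncedNullOutput(buffer); // drains the buffer at real-time rate
waveIn.StartRecording();
output.Play();
// ...let it run; buffer.BufferedBytes should stay bounded...
output.Stop();
waveIn.StopRecording();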
Try it out and see how well it works for you.
I want to develop an app to match your tinnitus frequency: a tone is played and the user decreases or increases the frequency by pressing a plus or minus button (see part of the code, based on some code from Stack Overflow, thanks :-)).
public static short[] BufferSamples = new short[44100 * 1 * 2];
private SourceVoice sourceVoice;
private AudioBuffer buffer;
private int Tfreq;
public MatchTinn()
{
InitializeComponent();
Loaded += MatchTinn_Loaded;
Tfreq = 5000;
}
private void MatchTinn_Loaded(object sender, RoutedEventArgs e)
{
var dataStream = DataStream.Create(BufferSamples, true, true);
buffer = new AudioBuffer
{
LoopCount = AudioBuffer.LoopInfinite,
Stream = dataStream,
AudioBytes = (int)dataStream.Length,
Flags = BufferFlags.EndOfStream
};
FillBuffer(BufferSamples, 44100, Tfreq);
var waveFormat = new WaveFormat();
XAudio2 xaudio = new XAudio2();
MasteringVoice masteringVoice = new MasteringVoice(xaudio);
sourceVoice = new SourceVoice(xaudio, waveFormat, true);
// Submit the buffer
sourceVoice.SubmitSourceBuffer(buffer, null);
}
private void FillBuffer(short[] buffer, int sampleRate, int frequency)
{
if (sourceVoice != null)
{
sourceVoice.FlushSourceBuffers();
}
double totalTime = 0;
for (int i = 0; i < buffer.Length - 1; i += 2)
{
double time = (double)totalTime / (double)sampleRate;
short currentSample = (short)(Math.Sin(2 * Math.PI * frequency * time) * (double)short.MaxValue);
buffer[i] = currentSample;
buffer[i + 1] = currentSample;
totalTime++;
}
}
private void m1_OnTap(object sender, GestureEventArgs e)
{
Tfreq = Tfreq - 1;
if (Tfreq < 0)
{
Tfreq = 0;
}
FillBuffer(BufferSamples, 44100, Tfreq);
}
private void p1_OnTap(object sender, GestureEventArgs e)
{
Tfreq = Tfreq + 1;
if (Tfreq > 16000)
{
Tfreq = 16000;
}
FillBuffer(BufferSamples, 44100, Tfreq);
}
Playing the frequency is fine, but when the user presses a button you hear a clicking sound when the frequency is updated. Do you have any idea what makes the sound and how I can get rid of it?
Thanks.
When you change the frequency, you're causing a discontinuity in the waveform that manifests as a click. Instead of making your signal calculations against absolute time, you should keep track of the phase of your sine calculation (e.g. a value from 0 to 2*pi), and figure out how much you need to add to your phase (subtracting 2*pi every time you exceed 2*pi) for the next sample when playing a specific frequency. This way, when you change frequency, the phase that you supply as a parameter to Math.Sin doesn't change abruptly causing a click.
Expanding on the answer @spender gave (I need 50 rep to add a comment to his answer), I had a similar problem with NAudio. I was able to solve the issue by adding two bool values that monitor the current sign of the sine value and the previous sign of the sine value. If the previous sine was negative and the current sine is positive, we know we can safely adjust the frequency of the sine wave.
double sine = amplitude * Math.Sin(Math.PI * 2 * frequency * time);
isPreviousSineWaveValPositive = isSineWaveValPositive;
if (sine < 0)
{
isSineWaveValPositive = false;
}
else
{
isSineWaveValPositive = true;
}
// When the time is right, change the frequency
if ( false == isPreviousSineWaveValPositive && true == isSineWaveValPositive )
{
time = 0.0;
frequency = newFrequency;
}
Here's an example of how you can get rid of the clicking. Instead of using a time variable, you should keep track of the current phase and calculate how much the phase changes at the required frequency. This _currentPhase must also be persistent so it keeps its previous value (declaring it within the method would result in a click as well, on most frequencies):
private double _currentPhase = 0;
private void FillBuffer(short[] buffer, int sampleRate, int frequency)
{
if (sourceVoice != null)
{
sourceVoice.FlushSourceBuffers();
}
var phaseStep = ((Math.PI * 2) / (double)sampleRate) * frequency;
for (int i = 0; i < buffer.Length - 1; i += 2)
{
_currentPhase += phaseStep;
short currentSample = (short)(Math.Sin(_currentPhase) * (double)short.MaxValue);
buffer[i] = currentSample;
buffer[i + 1] = currentSample;
}
}