Observed periodic noise with ASIO sound card in my audio processing application - C#

I observe noise that shows up periodically (every 5 seconds or so) when I use an ASIO sound card with my custom-built audio processing application; it is visible in the visualisation tab, which displays the frequency analysis.
The noise is not observed when using a DirectSound device with the same audio.
I have tried changing the number of ASIO channels listening to the audio from 8 to 2, but that doesn't fix the issue.
The sampling rate is 48 kHz (I also tried 44.1 kHz, which doesn't fix it either).
The application is written in C# and uses the NAudio API.
I've included images of the waveform at this link:
https://www.sendbig.com/view-files?Id=8fe0ff05-d27e-9ec2-161f-415d923599b7
The first image is the clean signal with no noise and the second shows the audio along with the noise.
Any input on this is appreciated!
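For reference, a minimal NAudio ASIO capture along these lines; the driver index, channel count, and sample rate below are placeholders matching the description above, not the actual application code:

```csharp
using System;
using NAudio.Wave;

class AsioCaptureSketch
{
    static void Main()
    {
        // Placeholder: pick the first installed ASIO driver.
        string driverName = AsioOut.GetDriverNames()[0];
        using var asio = new AsioOut(driverName);

        int inputChannels = 2;   // reduced from 8 to 2, as described above
        int sampleRate = 48000;  // 48 kHz

        asio.InputChannelOffset = 0;
        asio.InitRecordAndPlayback(null, inputChannels, sampleRate); // record-only

        asio.AudioAvailable += (s, e) =>
        {
            // Interleaved float samples that would feed the FFT / visualisation tab.
            var samples = new float[e.SamplesPerBuffer * inputChannels];
            e.GetAsInterleavedSamples(samples);
            // ... pass 'samples' to the frequency analysis ...
        };

        asio.Play();
        Console.WriteLine("Recording; press Enter to stop.");
        Console.ReadLine();
        asio.Stop();
    }
}
```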

Related

DirectShow webcam video and audio async - audio lag

I am trying to record video with audio and save it as an uncompressed AVI file. The graph is as shown in the picture. The problem is that the audio recording is ~500 ms behind the video. It doesn't matter which sources I use. What can I do to keep video and audio in sync?
The default audio capture buffer is pretty large, about 500 ms in length. You only start getting data once a buffer is filled, hence the lag. Large buffers might be fine for some scenarios but are not good for others. You can use the IAMBufferNegotiation interface to request smaller buffers; a sketch follows the links below.
See these related questions (you will see that a 500 ms lag is a typical complaint):
Audio Sync problems using DirectShow.NET
Minimizing audio capture latency in DirectShow
Delay in video in DirectShow graph
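A rough sketch of adjusting the capture buffering, assuming the DirectShowLib interop bindings; it must run on the audio capture filter's output pin before the pin is connected, and the 50 ms / 44.1 kHz stereo figures are illustrative:

```csharp
using DirectShowLib;

static class CaptureBufferTuning
{
    // Call on the audio capture filter BEFORE its output pin is connected/rendered.
    public static void SuggestSmallAudioBuffers(IBaseFilter audioCaptureFilter)
    {
        IPin outPin = DsFindPin.ByDirection(audioCaptureFilter, PinDirection.Output, 0);
        var negotiation = (IAMBufferNegotiation)outPin;

        int bytesPerSecond = 44100 * 2 * 2;       // 44.1 kHz, 16-bit, stereo (illustrative)
        var props = new AllocatorProperties
        {
            cbBuffer = bytesPerSecond / 20,       // ~50 ms of audio per buffer
            cBuffers = 4,                         // keep a few buffers in flight
            cbAlign = -1,                         // -1 = no preference
            cbPrefix = -1
        };
        DsError.ThrowExceptionForHR(negotiation.SuggestAllocatorProperties(props));
    }
}
```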

Output multiple audio streams to individual sound card channels

I have an eight-channel sound card: 8 audio in and 8 audio out. When the driver for the sound card is installed, the following is shown under Playback devices.
Play 1-2
Play 3-4
Play 5-6
Play 7-8
I can enumerate all four devices (Play 1-2, Play 3-4, etc.) and play on each of them, but this only gives me four stereo outputs. The sound card also has an individual audio out wire/port for each channel.
What I want to do is play on Play 1, Play 2, Play 3, etc. individually. Is there any way to achieve this? I am currently using NAudio to enumerate the devices and play back audio.
You can play two independent mono streams out of one of the stereo pairs using MultiplexingSampleProvider or MultiplexingWaveProvider. If you want to treat the whole soundcard as a single device, then I've found AsioOut tends to be the only option and again you can use the multiplexing providers to route individual NAudio wavestreams to the different device outputs.
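A minimal sketch of that routing with MultiplexingWaveProvider and AsioOut; the file names and driver index are placeholders, and both sources are assumed to be mono with matching sample rates:

```csharp
using System;
using NAudio.Wave;

// Two mono sources routed to the first two physical outputs of an 8-out ASIO device.
var source1 = new AudioFileReader("click.wav");   // placeholder file names,
var source2 = new AudioFileReader("music.wav");   // assumed mono, same sample rate

var mux = new MultiplexingWaveProvider(new IWaveProvider[] { source1, source2 }, 8);
mux.ConnectInputToOutput(0, 0);   // source1 -> Play 1
mux.ConnectInputToOutput(1, 1);   // source2 -> Play 2
// Outputs 2-7 stay silent unless other inputs are connected to them.

using var asioOut = new AsioOut(AsioOut.GetDriverNames()[0]);
asioOut.Init(mux);
asioOut.Play();
Console.ReadLine();   // keep playing until Enter is pressed
```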

DirectShow transform filter with multiple video frames - Sync with audio

I've written a DirectShow transform filter (in C#, but the concept is the same in C++) which buffers multiple video frames before sending them to the renderer (hence a delay). These frames are processed before producing an output frame (think of a sliding window of, say, 6 frames).
On a 6 fps video source, this causes a 1-second delay, so the audio ends up playing back 1 second ahead of the video. How do I tell the graph to delay the audio by the same amount?
Video and audio renderers present data respecting attached time stamps. You need to restamp your audio data adding the desired delay.
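The exact plumbing depends on how your filter is built, but as a sketch, assuming the DirectShowLib IMediaSample bindings and a filter sitting in the audio path, the restamping itself might look like this (the 1-second delay matches the 6-frame / 6 fps example above):

```csharp
using DirectShowLib;

static class AudioRestamp
{
    // 1 second expressed in DirectShow reference-time units (100 ns each).
    const long DelayUnits = 10_000_000;

    // Call for every audio IMediaSample passing through a filter in the audio path.
    public static void DelayTimestamps(IMediaSample sample)
    {
        long start, end;
        int hr = sample.GetTime(out start, out end);
        if (hr >= 0) // timestamps are present
        {
            sample.SetTime(new DsLong(start + DelayUnits), new DsLong(end + DelayUnits));
        }
    }
}
```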

Synchronize playback of audio to a timer using NAudio

I want to synchronize the playback of a song to a timer so that I can keep the beats of a song in sync with things rendered on the screen. Any way of accomplishing this using NAudio?
Several of the output devices in NAudio support the IWavePosition interface, which gives a more accurate indication of how far the soundcard has got through the buffer it is playing. Usually this is reported as the number of bytes played since playback started, so it does not necessarily correspond to the position within the file or song you are playing; if you use this, you will need to keep track of when you started playing.
Usually you would keep the things rendered on screen synchronized to the audio playback position, rather than the other way round.
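As a sketch, converting the byte position reported by IWavePosition into elapsed playback time might look like this (works with, for example, WaveOutEvent, which implements the interface):

```csharp
using System;
using NAudio.Wave;

static class PlaybackClock
{
    // waveOut is e.g. a WaveOutEvent that has already started playing.
    public static TimeSpan GetPlaybackTime(IWavePosition waveOut)
    {
        long bytesPlayed = waveOut.GetPosition();      // bytes played since Play() was called
        WaveFormat format = waveOut.OutputWaveFormat;  // format of the hardware playback buffer
        return TimeSpan.FromSeconds((double)bytesPlayed / format.AverageBytesPerSecond);
    }
}
```

You would poll this from your render loop and reset your own reference point whenever playback is stopped and restarted.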

Live Playback of PCM (wave) Audio from Network Stream

Very similar to this question, I have a networked micro-controller which collects PCM audio (8-bit, 8 kHz) and streams the data as raw bytes over a TCP network socket. I am able to connect to the device, open a NetworkStream, and create a RIFF/wave file out of the collected data.
I would like to take it a step further and enable live playback of the measurements. My approach so far has been to buffer the incoming data into multiple MemoryStreams with an appropriate RIFF header, and when each chunk is complete, to use System.Media.SoundPlayer to play that wave-file segment. To avoid high latency, each segment is only 0.5 seconds long.
My major issue with this approach is that there is often a distinct popping sound between segments (since each chunk is not necessarily zero-centered or zero-ended).
Questions:
Is there a more suitable or direct method to playback live streaming PCM audio in C#?
If not, are there additional steps I can take to make the multiple playbacks run more smoothly?
I don't think you can avoid the popping sounds with SoundPlayer, because there must not be any delay when pushing buffers. Normally you should always keep one extra buffer queued, but SoundPlayer only buffers a single buffer, so even when it raises the event that it is ready, you are already too late to start the next sound.
I advise you to check this link: Recording and Playing Sound with the Waveform Audio Interface http://msdn.microsoft.com/en-us/library/aa446573.aspx
It has some examples of SoundPlayer (skip those), but it also shows how to use waveOut. Look at the section "Playing with WaveOut".
The SoundPlayer is normally used for notification sounds.
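Since the rest of this thread already uses NAudio, here is an alternative sketch of the same idea using NAudio's BufferedWaveProvider and WaveOutEvent rather than the article's raw waveOut calls; networkStream is assumed to be the already-connected stream from the question:

```csharp
using System;
using System.Net.Sockets;
using NAudio.Wave;

static class LivePcmPlayer
{
    // Plays raw 8-bit, 8 kHz mono PCM arriving on an already-open NetworkStream.
    public static void Play(NetworkStream networkStream)
    {
        var format = new WaveFormat(8000, 8, 1);        // matches the micro-controller's stream
        var buffer = new BufferedWaveProvider(format)
        {
            BufferDuration = TimeSpan.FromSeconds(5),   // headroom for network jitter
            DiscardOnBufferOverflow = true
        };

        using var output = new WaveOutEvent { DesiredLatency = 200 };
        output.Init(buffer);
        output.Play();                                  // outputs silence until data arrives

        var chunk = new byte[1024];
        int read;
        while ((read = networkStream.Read(chunk, 0, chunk.Length)) > 0)
        {
            buffer.AddSamples(chunk, 0, read);          // continuous playback, no per-chunk RIFF headers
        }
    }
}
```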
