WasapiCapture NAudio - C#

We are using the NAudio stack written in C# and trying to capture audio in exclusive mode with PCM at 8 kHz and 16 bits per sample.
In the following function:
private void InitializeCaptureDevice()
{
    if (initialized)
        return;

    long requestedDuration = REFTIMES_PER_MILLISEC * 100;
    if (!audioClient.IsFormatSupported(AudioClientShareMode.Shared, WaveFormat) &&
        !audioClient.IsFormatSupported(AudioClientShareMode.Exclusive, WaveFormat))
    {
        throw new ArgumentException("Unsupported Wave Format");
    }

    var streamFlags = GetAudioClientStreamFlags();
    audioClient.Initialize(AudioClientShareMode.Shared,
        streamFlags,
        requestedDuration,
        requestedDuration,
        this.waveFormat,
        Guid.Empty);

    int bufferFrameCount = audioClient.BufferSize;
    this.bytesPerFrame = this.waveFormat.Channels * this.waveFormat.BitsPerSample / 8;
    this.recordBuffer = new byte[bufferFrameCount * bytesPerFrame];
    Debug.WriteLine(string.Format("record buffer size = {0}", this.recordBuffer.Length));

    initialized = true;
}
We configure the WaveFormat to (8000, 1) before calling this function, and also request a period of 100 ms.
We expected the system to allocate 1600 bytes for the buffer and an interval of 100 ms, as requested.
But we noticed the following occurred:
1. the system allocated audioClient.BufferSize = 4800 frames, making "this.recordBuffer" an array of 9600 bytes (which means a buffer for 600 ms, not 100 ms).
2. the thread goes to sleep and then receives 2400 samples (4800 bytes) per wake-up, not frames of 1600 bytes as expected.
Any idea what is going on there?
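The sizing arithmetic above can be checked in isolation (plain C#, no NAudio involved; the figures are the ones reported in the question):

```csharp
using System;

class BufferMathSketch
{
    // channels * bits per sample / 8
    public static int BytesPerFrame(int channels, int bitsPerSample) => channels * bitsPerSample / 8;

    // buffer duration in milliseconds for a given frame count
    public static int BufferMilliseconds(int frames, int sampleRate) => frames * 1000 / sampleRate;

    static void Main()
    {
        int bytesPerFrame = BytesPerFrame(1, 16); // = 2 for 16-bit mono

        // 100 ms at 8 kHz should be 800 frames = 1600 bytes...
        Console.WriteLine(800 * bytesPerFrame);            // 1600

        // ...but the engine reported BufferSize = 4800 frames,
        // i.e. 9600 bytes, which at 8 kHz is 600 ms of audio.
        Console.WriteLine(4800 * bytesPerFrame);           // 9600
        Console.WriteLine(BufferMilliseconds(4800, 8000)); // 600
    }
}
```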

You say you are capturing audio in exclusive mode, but in the example code you call the Initialize method with AudioClientShareMode.Shared. It strikes me as very unlikely that shared mode will let you work at 8 kHz. Unlike the wave... APIs, WASAPI does no resampling of playback or capture for you, so the sound card itself must be operating at the sample rate you specify.

Related

NAudio proper way to stream MediaFoundationReader

I'm trying to use NAudio to play shoutcast streams. This is doable (apparently) with a few lines of code:
var url = "http://dance.radiomonster.fm/320.mp3";
using (var radioStream = new MediaFoundationReader(url))
using (var wo = new WaveOutEvent())
{
    wo.Init(radioStream);
    wo.Play();
    while (wo.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(1000);
    }
}
Works fine for playback. However, I need to take that as float samples converted to 48 kHz mono, to be sent off as a byte stream elsewhere. So I do this:
// Convert the wave to 48 kHz, mono
str1 = new WaveFormatConversionProvider(new WaveFormat(48000, 1), radioStream);
// Convert the converted wave to floats
str = new Wave16ToFloatProvider(str1);
Under previous iterations of this I would do something like this to retrieve the data:
while ((readResultCount = str.Read(result, 0, result.Length)) > 0)
{
    float[] output = new float[result.Length / sizeof(float)];
    Buffer.BlockCopy(result, 0, output, 0, sizeof(float) * output.Length);
    au.EnqueueEncodeVoiceData(output);
}
However, this results in audio output that skips worse than a CD that's been run over.
I've tried to see whether there is any way to determine if there is enough data in the buffer and avoid reading the stream if not, but there don't seem to be any facilities for this.
What am I doing wrong?
Ah, here's a matter of hidden knowledge.
The loop itself was being called from a timer tick event, which meant that multiple instances of the loop could (and would) get called, since the Read operation blocks until either the requested data arrives or the stream ends for whatever reason.
Since the timer ticked regardless of whether the loop had exited, it was tripping over its own toes.
I resolved this by removing the timer tick and moving the routine into a Task.
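A minimal sketch of that fix, using a plain Stream in place of the NAudio provider chain (the 'enqueue' callback stands in for au.EnqueueEncodeVoiceData, which is not shown in the question):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class ReaderLoopSketch
{
    // One long-running Task owns the blocking Read loop, so no second
    // instance can ever overlap it the way a re-entrant timer tick can.
    public static Task StartReadLoop(Stream source, Action<float[]> enqueue)
    {
        return Task.Run(() =>
        {
            var result = new byte[4096];
            int readResultCount;
            while ((readResultCount = source.Read(result, 0, result.Length)) > 0)
            {
                // Convert only the bytes actually read, not the whole buffer,
                // so a short read does not enqueue stale data.
                var output = new float[readResultCount / sizeof(float)];
                Buffer.BlockCopy(result, 0, output, 0, output.Length * sizeof(float));
                enqueue(output);
            }
        });
    }
}
```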

C# - Microphone noise detection

I am using the nAudio library to capture microphone input, but I have run into a problem.
I am using the code (which I have modified slightly) from an nAudio sample app.
The code generates a WAV file based on mic input and renders it as a waveform. Here is the code for that:
private void RenderFile()
{
    SampleAggregator.RaiseRestart();
    using (WaveFileReader reader = new WaveFileReader(this.voiceRecorderState.ActiveFile))
    {
        this.samplesPerSecond = reader.WaveFormat.SampleRate;
        SampleAggregator.NotificationCount = reader.WaveFormat.SampleRate / 10;
        // Sample rate is 44100
        byte[] buffer = new byte[1024];
        WaveBuffer waveBuffer = new WaveBuffer(buffer);
        waveBuffer.ByteBufferCount = buffer.Length;
        int bytesRead;
        do
        {
            bytesRead = reader.Read(waveBuffer, 0, buffer.Length);
            int samples = bytesRead / 2;
            double sum = 0;
            for (int sample = 0; sample < samples; sample++)
            {
                if (bytesRead > 0)
                {
                    sampleAggregator.Add(waveBuffer.ShortBuffer[sample] / 32768f);
                    double sample1 = waveBuffer.ShortBuffer[sample] / 32768.0;
                    sum += (sample1 * sample1);
                }
            }
            double rms = Math.Sqrt(sum / (SampleAggregator.NotificationCount));
            var decibel = 20 * Math.Log10(rms);
            System.Diagnostics.Debug.WriteLine(decibel.ToString() + " in dB");
        } while (bytesRead > 0);
        int totalSamples = (int)reader.Length / 2;
        TotalWaveFormSamples = totalSamples / sampleAggregator.NotificationCount;
        SelectAll();
    }
    audioPlayer.LoadFile(this.voiceRecorderState.ActiveFile);
}
Below is a small chunk of the result for a 2-second WAV file with no sound, only mic noise.
-54.089102453893 in dB
-51.9171950072361 in dB
-53.3478098666891 in dB
-53.1845794096928 in dB
-53.8851764055102 in dB
-57.5541358628342 in dB
-54.0121140454216 in dB
-55.5204248291508 in dB
-54.9012326746571 in dB
-53.6831017096011 in dB
-52.8728852678309 in dB
-55.7021600863786 in dB
As we can see, the dB level hovers around -55 when there is no input sound, only silence. If I record myself saying "Hello" into the mic in a normal tone, the dB value goes to -20 or so. I read somewhere that average human speech is around 20 dB and that -3 dB to -6 dB is the ZERO value range for a mic.
Question: Am I calculating the dB value correctly? (I used a formula proposed here by someone else.) Why does the dB value always come up negative? Am I missing a crucial concept or mechanism?
I searched the nAudio documentation at CodePlex and didn't find an answer. In my observation, the documentation there needs to be more explanatory than just a bunch of Q&A [no offense, nAudio :)].
If I understood the code correctly, the value you're actually calculating is dBFS (decibels relative to full scale): the samples are normalised by 32768, so a full-scale signal gives 0 dBFS and everything quieter is negative. dB on its own is just a ratio, so a level measurement always needs a reference point, and here that reference is digital full scale, not an absolute sound pressure.
The negative values are there simply because of the logarithmic conversion in the formula and because the signals you're dealing with are relatively weak compared to full scale.
So in conclusion, it looks like you've done everything right.
Hope it helps.
EDIT: BTW, as you can see, there really is a difference of roughly 23-25 dB between silence and human speech.
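A minimal self-contained sketch of the calculation, for reference. Note that it divides by the number of samples actually summed; the question's code divides by NotificationCount, which skews the figure whenever a read returns fewer samples than that:

```csharp
using System;

class DbfsSketch
{
    // RMS level of 16-bit samples relative to full scale, in dBFS.
    // 0 dBFS is a full-scale signal; silence trends toward negative
    // infinity, which is why mic noise shows up around -55 rather
    // than as a positive number.
    public static double RmsDbfs(short[] samples)
    {
        double sum = 0;
        foreach (short s in samples)
        {
            double v = s / 32768.0; // normalise to [-1, 1)
            sum += v * v;
        }
        double rms = Math.Sqrt(sum / samples.Length); // divide by the actual sample count
        return 20 * Math.Log10(rms);
    }
}
```

For example, a constant signal at half of full scale comes out at about -6.02 dBFS.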

Determine Exact Upload Speed?

I wanted to check the upload speed of the system.
void CheckUploadSpeed()
{
    using (var wc = new WebClient())
    {
        IPv4InterfaceStatistics ipis = networkInterface.GetIPv4Statistics();
        BytesSentb4Upload = ipis.BytesSent;
        FileStream stream = File.OpenRead(string.Format("{0}speedtext.txt", path)); // speedtext.txt is a 5 MB file.
        var fileBytes = new byte[stream.Length];
        stream.Read(fileBytes, 0, fileBytes.Length);
        stream.Close();
        startTime = Environment.TickCount;
        wc.UploadDataAsync(new Uri("http://www.example.com/"), fileBytes);
        InternetSpeedResult = "Data upload started. Uploading 5MB file";
        wc.UploadProgressChanged += new UploadProgressChangedEventHandler(UploadProgressCallback);
        wc.UploadDataCompleted += wc_UploadDataCompleted;
    }
}
And on Upload Progress Changed
void UploadProgressCallback(object sender, UploadProgressChangedEventArgs e)
{
    InternetSpeedResult = string.Format("Checking Upload Speed ... ");
    double endTime = Environment.TickCount;
    double secs = Math.Round(Math.Floor(endTime - startTime) / 1000, 0);
    if (secs >= 30)
    {
        UploadComplete(sender, e);
    }
}
This code mostly serves my purpose, but the problem is that it does not give consistent results every time. Since I am counting the total bytes sent in a particular period of time, the number naturally varies. If the speed is very low (less than 512 Kbps) or very high (greater than 20 Mbps), it does not give the expected upload rate.
What should I change in the code so that I can rely on the results?
Is there any other approach to checking the upload speed, particularly when the speed is very low (less than 512 Kbps) or very high (greater than 20 Mbps)?
I would not call this a solution; it is rather a small workaround that might help you get results you can rely on.
Increase the size of the uploaded file to somewhere around 100 MB.
Set the threshold to secs >= 60.
Now, whatever the network speed is: if it is low, the code will measure for a minute and tell you the upload speed. If it is high, then either all 100 MB is transferred and you get the upload speed from that, or you still learn the speed after 60 seconds.
With a high-speed connection, the problem is that the bandwidth is not at its optimum when the transfer begins; it ramps up as data starts flowing. Widening the window to 60 seconds and increasing the file size gives you usable results in both cases.
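One further sketch, under the assumption that a per-transfer figure is wanted: UploadProgressChangedEventArgs.BytesSent already isolates this one upload (unlike IPv4InterfaceStatistics, which counts all traffic on the adapter), so the rate reduces to a small calculation. The helper name below is hypothetical, not part of the original code:

```csharp
using System;

class UploadRateSketch
{
    // Converts a progress snapshot into megabits per second.
    // bytesSent would come from UploadProgressChangedEventArgs.BytesSent;
    // elapsedMs from a Stopwatch started just before UploadDataAsync.
    public static double MegabitsPerSecond(long bytesSent, double elapsedMs)
    {
        if (elapsedMs <= 0) return 0;              // guard against a zero interval
        return bytesSent * 8 / (elapsedMs * 1000); // bits / microseconds == Mbit/s
    }

    static void Main()
    {
        // 5 MB uploaded in 10 seconds -> 4 Mbit/s
        Console.WriteLine(MegabitsPerSecond(5_000_000, 10_000));
    }
}
```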

Broken MIDI File Output

I'm currently trying to implement my own single-track MIDI file output. It turns an 8x8 grid of colours, stored in multiple frames, into a MIDI file that can be imported into a digital audio interface and played through a Novation Launchpad. Some more context here.
I've managed to output a file that programs recognise as MIDI, but the resultant MIDI does not play, and it doesn't match files generated from the same frame data. I've been doing comparisons by recording my program's live MIDI messages through a dedicated MIDI program, and then spitting out a MIDI file from that. I then compare my generated file to that properly generated file in a hex editor. Things are correct as far as the headers, but that seems to be it.
I've been slaving over multiple renditions of the MIDI specification and existing Stack Overflow questions with no complete solution.
Here is my code, based on what I have researched. I can't help but feel I'm missing something simple. I'm avoiding the use of existing MIDI libraries, as I only need this one MIDI function to work (and want the learning experience of doing it from scratch). Any guidance would be very helpful.
/// <summary>
/// Outputs a MIDI file based on frames for the Novation Launchpad.
/// </summary>
/// <param name="filename"></param>
/// <param name="frameData"></param>
/// <param name="bpm"></param>
/// <param name="ppq"></param>
public static void WriteMidi(string filename, List<FrameData> frameData, int bpm, int ppq) {
    decimal totalLength = 0;
    using (FileStream stream = new FileStream(filename, FileMode.Create, FileAccess.Write)) {
        // Output midi file header
        stream.WriteByte(77);
        stream.WriteByte(84);
        stream.WriteByte(104);
        stream.WriteByte(100);
        for (int i = 0; i < 3; i++) {
            stream.WriteByte(0);
        }
        stream.WriteByte(6);

        // Set the track mode
        byte[] trackMode = BitConverter.GetBytes(Convert.ToInt16(0));
        stream.Write(trackMode, 0, trackMode.Length);

        // Set the track amount
        byte[] trackAmount = BitConverter.GetBytes(Convert.ToInt16(1));
        stream.Write(trackAmount, 0, trackAmount.Length);

        // Set the delta time
        byte[] deltaTime = BitConverter.GetBytes(Convert.ToInt16(60000 / (bpm * ppq)));
        stream.Write(deltaTime, 0, deltaTime.Length);

        // Output track header
        stream.WriteByte(77);
        stream.WriteByte(84);
        stream.WriteByte(114);
        stream.WriteByte(107);
        for (int i = 0; i < 3; i++) {
            stream.WriteByte(0);
        }
        stream.WriteByte(12);

        // Get our total byte length for this track. All colour arrays are the same length in the FrameData class.
        byte[] bytes = BitConverter.GetBytes(frameData.Count * frameData[0].Colours.Count * 6);
        // Write our byte length to the midi file.
        stream.Write(bytes, 0, bytes.Length);

        // Cycle through frames and output the necessary MIDI.
        foreach (FrameData frame in frameData) {
            // Calculate our relative delta for this frame. Frames are originally stored in milliseconds.
            byte[] delta = BitConverter.GetBytes((double) frame.TimeStamp / 60000 / (bpm * ppq));
            for (int i = 0; i < frame.Colours.Count; i++) {
                // Output the delta length to MIDI file.
                stream.Write(delta, 0, delta.Length);
                // Get the respective MIDI note based on the colours array index.
                byte note = (byte) NoteIdentifier.GetIntFromNote(NoteIdentifier.GetNoteFromPosition(i));
                // Check if the current color signals a MIDI off event.
                if (!CheckEqualColor(frame.Colours[i], Color.Black) && !CheckEqualColor(frame.Colours[i], Color.Gray) && !CheckEqualColor(frame.Colours[i], Color.Purple)) {
                    // Signal a MIDI on event.
                    stream.WriteByte(144);
                    // Write the current note.
                    stream.WriteByte(note);
                    // Check colour and write the respective velocity.
                    if (CheckEqualColor(frame.Colours[i], Color.Red)) {
                        stream.WriteByte(7);
                    } else if (CheckEqualColor(frame.Colours[i], Color.Orange)) {
                        stream.WriteByte(83);
                    } else if (CheckEqualColor(frame.Colours[i], Color.Green) || CheckEqualColor(frame.Colours[i], Color.Aqua) || CheckEqualColor(frame.Colours[i], Color.Blue)) {
                        stream.WriteByte(124);
                    } else if (CheckEqualColor(frame.Colours[i], Color.Yellow)) {
                        stream.WriteByte(127);
                    }
                } else {
                    // Calculate the delta that the frame had.
                    byte[] offDelta = BitConverter.GetBytes((double) (frameData[frame.Index - 1].TimeStamp / 60000 / (bpm * ppq)));
                    // Write the delta to MIDI.
                    stream.Write(offDelta, 0, offDelta.Length);
                    // Signal a MIDI off event.
                    stream.WriteByte(128);
                    // Write the current note.
                    stream.WriteByte(note);
                    // No need to set our velocity to anything.
                    stream.WriteByte(0);
                }
            }
        }
    }
}
BitConverter.GetBytes returns the bytes in the native byte order, but MIDI files use big-endian values. If you're running on x86 or ARM, you must reverse the bytes.
The third value in the file header is not called "delta time"; it is the number of ticks per quarter note, which you already have as ppq.
The length of the track is not 12; you must write the actual length.
Due to the variable-length encoding of delta times (see below), this is usually not possible before collecting all bytes of the track.
You need to write a tempo meta event that specifies the number of microseconds per quarter note.
A delta time is not an absolute time; it specifies the interval starting from the time of the previous event.
A delta time specifies a number of ticks; your calculation is wrong.
Use TimeStamp * bpm * ppq / 60000.
Delta times are not stored as a double floating-point number but as a variable-length quantity; the specification has example code for encoding it.
The last event of the track must be an end-of-track meta event.
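Since the variable-length quantity is the part that trips up most first implementations, here is a minimal sketch of the encoding as the Standard MIDI File specification describes it: 7 bits per byte, most significant group first, high bit set on every byte except the last.

```csharp
using System;
using System.Collections.Generic;

class VlqSketch
{
    // Encodes a value as a MIDI variable-length quantity.
    // Valid for the 0 .. 0x0FFFFFFF range the MIDI file format allows.
    public static byte[] EncodeVlq(uint value)
    {
        var bytes = new List<byte>();
        // Last byte carries the low 7 bits with the high bit clear.
        bytes.Add((byte)(value & 0x7F));
        value >>= 7;
        // Prepend the remaining 7-bit groups with the high bit set.
        while (value > 0)
        {
            bytes.Insert(0, (byte)((value & 0x7F) | 0x80));
            value >>= 7;
        }
        return bytes.ToArray();
    }
}
```

For example, 0 encodes as 00, 128 as 81 00, and 0x0FFFFFFF as FF FF FF 7F.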
Another approach would be to use one of the .NET MIDI libraries to write the MIDI file. You just have to convert your frames into MIDI objects and pass them to the library to save; the library takes care of all the MIDI details.
You could try MIDI.NET and the C# MIDI Toolkit. I'm not sure whether NAudio writes MIDI files, or at what abstraction level that would be...
Here is more info on the MIDI File Format specification:
http://www.blitter.com/~russtopia/MIDI/~jglatt/tech/midifile.htm
Hope it helps,
Marc

Measure data transfer rate over tcp using c#

I want to measure the current download speed. I'm sending a huge file over TCP. How can I capture the transfer rate every second? If I use IPv4InterfaceStatistics or a similar method, I capture the device transfer rate rather than the file transfer rate. The problem with capturing the device transfer rate is that it covers all data going through the network device, not just the single file I transfer.
How can I capture the file transfer rate? I'm using C#.
Since you don't have control over the stream to tell it how much to read, you can take a timestamp before and after each stream read and then calculate the speed from the bytes received or sent:
using System.IO;
using System.Net;
using System.Diagnostics;

// some code here...
Stopwatch stopwatch = new Stopwatch();

// Beginning of the loop
int offset = 0;
stopwatch.Reset();
stopwatch.Start();
byte[] buffer = new byte[1024]; // 1 KB buffer
int actualReadBytes = myStream.Read(buffer, offset, buffer.Length);
// Now we have read 'actualReadBytes' bytes
// in 'stopwatch.ElapsedMilliseconds' milliseconds.
stopwatch.Stop();
offset += actualReadBytes;
long speed = (actualReadBytes * 8) / stopwatch.ElapsedMilliseconds; // kbps
// End of the loop
You should put the Stream.Read in a try/catch and handle the read exception. Writing to streams and calculating the speed works the same way; only these two lines change:
myStream.Write(buffer, 0, buffer.Length);
long speed = (buffer.Length * 8) / stopwatch.ElapsedMilliseconds; // kbps
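The fragment above can be folded into a complete loop. The sketch below is an illustration under the same assumptions: it counts only the bytes of this one transfer (unlike IPv4InterfaceStatistics, which sees all adapter traffic), accumulating them and reporting roughly once per second:

```csharp
using System;
using System.Diagnostics;
using System.IO;

class ThroughputSketch
{
    // Copies 'source' to 'destination', invoking 'report' about once per
    // second with the throughput in kilobits per second.
    // Returns the total number of bytes copied.
    public static long CopyWithRate(Stream source, Stream destination, Action<double> report)
    {
        var buffer = new byte[8192];
        var stopwatch = Stopwatch.StartNew();
        long total = 0, bytesThisInterval = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
            bytesThisInterval += read;
            if (stopwatch.ElapsedMilliseconds >= 1000)
            {
                // bits per millisecond is numerically kilobits per second
                report(bytesThisInterval * 8.0 / stopwatch.ElapsedMilliseconds);
                bytesThisInterval = 0;
                stopwatch.Restart();
            }
        }
        return total;
    }
}
```

The source could be a NetworkStream from the TCP connection; the per-second window smooths out the jitter of individual reads.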
