I have an ASP.NET + C# application that is not playing a WAV audio file.
My code causes no error; it starts playing but stops shortly after.
My code:
AudioFileReader wave = null;
WaveOut outputSound = null;
wave = new NAudio.Wave.AudioFileReader("C:/appwin/Audio.WAV");
var offsetSampleProvider = new OffsetSampleProvider(wave);
outputSound = new WaveOut();
outputSound.Init(offsetSampleProvider);
outputSound.Play();
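A likely cause (an assumption, not stated in the question): Play() returns immediately, and if the reader and WaveOut are locals, they go out of scope and can be garbage collected before the file finishes. Note also that in an ASP.NET application the sound plays on the server machine, not in the visitor's browser. A minimal sketch that keeps the player alive until playback ends, using WaveOutEvent since a server app has no GUI message loop for WaveOut's default window-message callbacks:
using System.Threading;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

using (var reader = new AudioFileReader("C:/appwin/Audio.WAV"))
using (var output = new WaveOutEvent()) // event-driven callbacks; no GUI thread needed
{
    output.Init(new OffsetSampleProvider(reader));
    output.Play();
    while (output.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(100); // keep the player alive until the file has finished
    }
}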
I want to record conversations over Skype or similar applications (these recordings will be processed after being saved). I was trying to accomplish that with NAudio.
So far I have managed to record speaker audio using WasapiLoopbackCapture and save it to a WAV file, and I have also managed to record and save microphone audio using WaveIn. The main problem is that I cannot mix these two files into a single file, even though I followed the approach described here: https://github.com/naudio/NAudio/blob/master/Docs/MixTwoAudioFilesToWav.md
The function where I start my recording looks like this:
waveSourceSpeakers = new WasapiLoopbackCapture();
string outputFilePath = @"xxxx\xxx\xxx";
waveFileSpeakers = new WaveFileWriter(outputFilePath, waveSourceSpeakers.WaveFormat);
waveSourceSpeakers.DataAvailable += (s, a) =>
{
waveFileSpeakers.Write(a.Buffer, 0, a.BytesRecorded);
};
waveSourceSpeakers.RecordingStopped += (s, a) =>
{
waveFileSpeakers.Dispose();
waveFileSpeakers = null;
waveSourceSpeakers.Dispose();
};
waveSourceSpeakers.StartRecording();
waveSourceMic = new WaveIn();
waveSourceMic.WaveFormat = new WaveFormat(44100, 1);
waveSourceMic.DataAvailable += new EventHandler<WaveInEventArgs>(waveSource_DataAvailable);
waveSourceMic.RecordingStopped += new EventHandler<StoppedEventArgs>(waveSource_RecordingStopped);
waveFileMic = new WaveFileWriter(@"xxxx\xxx\xxx", waveSourceMic.WaveFormat);
waveSourceMic.StartRecording();
The function where I try to mix my 2 wav files looks like this:
using (var reader1 = new AudioFileReader(@"xxx\xxx\file1.wav"))
using (var reader2 = new AudioFileReader(@"xxx\xxx\file2.wav"))
{
var mixer = new MixingSampleProvider(new[] { reader1, reader2 });
WaveFileWriter.CreateWaveFile16(@"xxxx\xxx\mixed.wav", mixer);
}
and while trying to create the MixingSampleProvider I get this exception: System.ArgumentException: 'All mixer inputs must have the same WaveFormat'.
I was wondering whether I am recording both audio sources the right way. Also, it would be great if there were a way to record both into one file, but I'm not sure that is possible.
The exception message
All mixer inputs must have the same WaveFormat
hints that yours don't.
Change the line
waveSourceMic.WaveFormat = new WaveFormat(44100, 1);
to
waveSourceMic.WaveFormat = waveSourceSpeakers.WaveFormat;
So now you will be using the same format for both the mic and the speakers, and the mixer should be fine.
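If WaveIn can't be made to capture in the loopback device's format directly, another option is to convert the two recordings at mix time instead. A sketch (file names are the same placeholders as above; WdlResamplingSampleProvider and MonoToStereoSampleProvider are standard NAudio sample providers):
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

using (var speakers = new AudioFileReader(@"xxx\xxx\file1.wav"))
using (var mic = new AudioFileReader(@"xxx\xxx\file2.wav"))
{
    // Resample the mic recording to the speaker recording's sample rate.
    ISampleProvider micResampled = new WdlResamplingSampleProvider(mic, speakers.WaveFormat.SampleRate);
    // Match channel counts (a mono mic against stereo speakers).
    if (micResampled.WaveFormat.Channels == 1 && speakers.WaveFormat.Channels == 2)
    {
        micResampled = new MonoToStereoSampleProvider(micResampled);
    }
    var mixer = new MixingSampleProvider(new ISampleProvider[] { speakers, micResampled });
    WaveFileWriter.CreateWaveFile16(@"xxxx\xxx\mixed.wav", mixer);
}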
Yo guys, it's me again with my noob questions. This time I've used CSCore to record Windows sounds, then send the recorded bytes to another PC over sockets and play them there.
I just could not figure out how to play the bytes received in the DataAvailable callback...
I've tried writing the received bytes to a file and playing that file; that worked, but the sound does not play correctly, as if there are some unexpected sounds mixed in with it.
So here's my code:
WasapiCapture capture = new WasapiLoopbackCapture();
capture.Initialize();
capture.DataAvailable += (s, e) =>
{
WaveWriter w = new WaveWriter("file.mp3", capture.WaveFormat);
w.Write(e.Data, e.Offset, e.ByteCount);
w.Dispose();
MemoryStream stream = new MemoryStream(File.ReadAllBytes("file.mp3"));
SoundPlayer player = new SoundPlayer(stream);
player.Play();
stream.Dispose();
};
capture.Start();
Any help would be highly appreciated ;-;
If you want to hear how the sound comes out this way, I can record the result for you.
NOTE: if I just record the sounds to a file and open it later, it works perfectly, but if I write and play instantly, unexpected sounds are heard.
Use the SoundInSource as an adapter.
var capture = new WasapiCapture(...);
capture.Initialize(); // initialize always first!
var soundInSource = new SoundInSource(capture) { FillWithZeros = true };
// FillWithZeros = true prevents WasapiOut from stopping when WasapiCapture does not serve any data.
var soundOut = new WasapiOut();
soundOut.Initialize(soundInSource);
soundOut.Play();
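SoundInSource covers playback on the capturing machine. For the receiving PC in the sockets scenario, a sketch (assuming CSCore's WriteableBufferingSource, and that the capture's WaveFormat is sent to the receiver once before the audio bytes):
using CSCore;
using CSCore.SoundOut;
using CSCore.Streams;

// receivedFormat is assumed to be the WaveFormat transmitted by the capturing PC.
var bufferingSource = new WriteableBufferingSource(receivedFormat) { FillWithZeros = true };
var soundOut = new WasapiOut();
soundOut.Initialize(bufferingSource);
soundOut.Play();

// For every chunk of audio bytes read from the socket:
bufferingSource.Write(receivedBytes, 0, receivedBytes.Length);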
I play an MP3 using NAudio, but I don't know how to loop the music for as long as the program is open.
IWavePlayer waveOutDevice = new WaveOut();
AudioFileReader musicBackground = new AudioFileReader(@"....mp3");
waveOutDevice.Init(musicBackground);
waveOutDevice.Play();
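One simple way (a sketch; NAudio has no built-in loop flag on WaveOut) is to rewind and restart in the PlaybackStopped event, which fires when the reader reaches the end of the file:
IWavePlayer waveOutDevice = new WaveOutEvent();
AudioFileReader musicBackground = new AudioFileReader(@"....mp3");
waveOutDevice.Init(musicBackground);
waveOutDevice.PlaybackStopped += (s, e) =>
{
    musicBackground.Position = 0; // rewind to the start of the file
    waveOutDevice.Play();         // and play again
};
waveOutDevice.Play();
Another common pattern in NAudio examples is a small WaveStream subclass (often called LoopStream) that rewinds inside Read, which avoids the brief gap at the restart.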
I am trying to use NAudio in Unity. I managed to link the NAudio DLL, but I am running into a strange problem when I try to play music with NAudio's BufferedWaveProvider.
If I do this:
WaveOut player;
BufferedWaveProvider buf;
AudioFileReader reader;
void Start () {
reader = new AudioFileReader(@"..\music.mp3"); // some music
player = new WaveOut();
player.Init(reader);
player.Play();
}
The music plays normally, without any problems.
But when I try to use BufferedWaveProvider:
WaveOut player;
BufferedWaveProvider buf;
AudioFileReader reader;
void Start () {
reader = new AudioFileReader(@"..\music.mp3"); // some music
buf = new BufferedWaveProvider(reader.WaveFormat);
byte[] tmp = new byte[50000];
reader.Read(tmp, 0, tmp.Length); //read 50000 bytes
buf.AddSamples(tmp, 0, tmp.Length); //add bytes to buf
player = new WaveOut();
player.Init(buf); //init the WaveOut with buff
player.Play(); // play
}
It doesn't play! I debugged a lot and found out that the BufferedWaveProvider is consuming the samples (BufferedBytes keeps decreasing), but I don't get any sound out of it!
I am using BufferedWaveProvider because of a more complex project, but it's already a problem in this simple example.
What am I missing?
Note: The same code WORKS in C# Windows Forms...
Try using WaveOutEvent instead of WaveOut; it worked for me in at least one project.
As Mark pointed out:
it works because WaveOut uses Windows message callbacks by default, so if you have no GUI thread (e.g. you are in a console app), then it can't be used and WaveOutEvent should be preferred
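Applied to the snippet above, only the construction of the player changes:
player = new WaveOutEvent(); // callbacks on a background thread instead of window messages
player.Init(buf);
player.Play();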
I need to convert a wave that I created inside my app into a byte array and then back.
I have no clue how to start.
This is the method where I create the sound file.
private void forecast(string forecast)
{
MemoryStream streamAudio = new MemoryStream();
System.Media.SoundPlayer m_SoundPlayer = new System.Media.SoundPlayer();
SpeechSynthesizer speech = new SpeechSynthesizer();
speech.SetOutputToWaveStream(streamAudio);
speech.Speak(forecast);
streamAudio.Position = 0;
m_SoundPlayer.Stream = streamAudio;
m_SoundPlayer.Play();
// Set the synthesizer output to null to release the stream.
speech.SetOutputToNull();
}
After you've called Speak, the data is in the MemoryStream. You can copy that into a byte array and do whatever you like with it:
speech.Speak(forecast);
byte[] speechBytes = streamAudio.ToArray();
speechBytes contains the data you're looking for.
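Going the other way is symmetric (a sketch): SetOutputToWaveStream writes a complete WAV, header included, so the byte array can be wrapped in a MemoryStream and handed straight back to a SoundPlayer:
using System.IO;

using (var stream = new MemoryStream(speechBytes))
{
    var player = new System.Media.SoundPlayer(stream);
    player.PlaySync(); // blocks until the clip has finished playing
}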