AudioClip to file won't play properly - c#

I'm trying to send AudioClips through UnityWebRequest (not important for now), and the way I've found to do that is to convert the AudioClip to a byte[] and send it as a raw audio file. But this audio file doesn't seem to be working correctly. When I save the raw audio directly to my computer and play it in Audacity, it sounds very noisy and corrupt. When I speak into the mic, my voice can barely be made out in the static when I play it back in Audacity.
This is what I am currently doing to get my AudioClip to a file:
// AudioClip data is 32-bit float PCM, so each sample takes 4 bytes
float[] samples = new float[audioClip.samples * audioClip.channels];
audioClip.GetData(samples, 0);
byte[] byteArray = new byte[samples.Length * 4];
Buffer.BlockCopy(samples, 0, byteArray, 0, byteArray.Length);
File.WriteAllBytes("myAudio/audio.raw", byteArray);
Playing it within Unity works fine (whether playing the original AudioClip or converting it back from bytes). How can I get the file to play correctly? Is there a specific way to decode it in Audacity? Do the bytes not correspond to a raw audio file, and must they somehow be converted?

The import configuration within Audacity was actually not trivial. It took some trial and error, but the file ended up playing correctly once I imported it as raw data: 32-bit float, little-endian, 1 channel, 44100 Hz.
Thanks to Ron Beyer who also suggested a custom Audacity import configuration.
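For anyone who would rather skip the manual raw-import step, here is a rough sketch (not part of the original answer; SaveAsWav is a made-up helper) that wraps the same float samples in a minimal 32-bit float WAV header so the file opens directly in Audacity or most players:
// Hypothetical helper: prepends a minimal RIFF/WAVE header (format tag 3 =
// IEEE float) to the raw sample bytes. Most tools accept this minimal header,
// though a strictly conforming float WAV would also carry a "fact" chunk.
static void SaveAsWav(string path, float[] samples, int channels, int sampleRate)
{
    byte[] data = new byte[samples.Length * 4];
    Buffer.BlockCopy(samples, 0, data, 0, data.Length);

    using (var fs = new FileStream(path, FileMode.Create))
    using (var w = new BinaryWriter(fs))
    {
        w.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
        w.Write(36 + data.Length);                      // remaining chunk size
        w.Write(System.Text.Encoding.ASCII.GetBytes("WAVEfmt "));
        w.Write(16);                                    // fmt chunk size
        w.Write((short)3);                              // 3 = IEEE float
        w.Write((short)channels);
        w.Write(sampleRate);
        w.Write(sampleRate * channels * 4);             // byte rate
        w.Write((short)(channels * 4));                 // block align
        w.Write((short)32);                             // bits per sample
        w.Write(System.Text.Encoding.ASCII.GetBytes("data"));
        w.Write(data.Length);
        w.Write(data);
    }
}
With the clip above this would be called as SaveAsWav("myAudio/audio.wav", samples, audioClip.channels, audioClip.frequency).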

Related

NAudio Loopback Record Crackle Sound Error

I'm recording sound via WasapiLoopbackCapture and writing it to an MP3 file via the NAudio.Lame lib:
LAMEPreset quality = LAMEPreset.ABR_320;
audiostream = new WasapiLoopbackCapture();
audiostream.DataAvailable += stream_DataAvailable;
audiostream.RecordingStopped += stream_RecordingStopped;
mp3writer = new LameMP3FileWriter(
    Environment.GetEnvironmentVariable("USERPROFILE") + @"\Music\record_temp.mp3",
    audiostream.WaveFormat, quality);
audiostream.StartRecording();
When the user presses the stop-recording-button, I save the MP3 and stop the recording:
mp3writer.Flush();
audiostream.Dispose();
mp3writer.Dispose();
All works fine, except that the output file has some disturbing crackle noises in it (see here for an example). I think it might be the case that my computer is a bit too slow to compress and write the audio data in real time, so some of the values get lost, but that is just my guess.
Edit: When recording to WAVE, the errors don't appear.
What may be the problem here and how could I possibly solve it / work around it?
Start off by saving your audio to a WAV file. Does that have crackles in it? If so, the crackles are coming from the soundcard. If not, they are coming from the MP3 encoding code.
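A minimal sketch of that test, assuming NAudio's WaveFileWriter and the same capture setup as in the question (the output path is just an example):
// Record the same loopback audio straight to WAV instead of through LAME,
// so the capture path can be checked in isolation.
var capture = new WasapiLoopbackCapture();
var wavWriter = new WaveFileWriter(
    Environment.GetEnvironmentVariable("USERPROFILE") + @"\Music\record_test.wav",
    capture.WaveFormat);

capture.DataAvailable += (s, e) =>
    wavWriter.Write(e.Buffer, 0, e.BytesRecorded);   // write the raw captured bytes
capture.RecordingStopped += (s, e) =>
{
    wavWriter.Dispose();   // finalizes the WAV header
    capture.Dispose();
};
capture.StartRecording();
// ... later, when the stop button is pressed:
// capture.StopRecording();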

Manipulating Mp3's as Array Using NAudio

I'm trying to reimplement in C# an existing Matlab 8-band equalizer GUI I created for a project last week. In Matlab, songs load as a dynamic array into memory, where they can be freely manipulated, and playing is as easy as sound(array).
I found the NAudio library which conveniently already has Mp3 extractors, players, and both convolution and FFT defined. I was able to open the Mp3 and read all its data into an array (though I'm not positive I'm going about it correctly.) However, even after looking through a couple of examples, I'm struggling to figure out how to take the array and write it back into a stream in such a way as to play it properly (I don't need to write to file).
Following the examples I found, I read my mp3's like this:
private byte[] CreateInputStream(string fileName)
{
    byte[] stream;
    if (fileName.EndsWith(".mp3"))
    {
        WaveStream mp3Reader = new Mp3FileReader(fileName);
        songFormat = mp3Reader.WaveFormat; // songFormat is a class field
        long sizeOfStream = mp3Reader.Length;
        stream = new byte[sizeOfStream];
        mp3Reader.Read(stream, 0, (int)sizeOfStream);
    }
    else
    {
        throw new InvalidOperationException("Unsupported Exception");
    }
    return stream;
}
Now I have an array of bytes presumably containing raw audio data, which I intend to eventually convert to floats so as to run through the DSP module. Right now, however, I'm simply trying to see if I can play the array of bytes.
Stream outstream = new MemoryStream(stream);
WaveFileWriter wfr = new WaveFileWriter(outstream, songFormat);
// outputStream is an array of bytes and a class variable
wfr.Write(outputStream, 0, (int)outputStream.Length);
WaveFileReader wr = new WaveFileReader(outstream);
volumeStream = new WaveChannel32(wr);
waveOutDevice.Init(volumeStream);
waveOutDevice.Play();
Right now I'm getting errors thrown in WaveFileReader(outstream) which say that it can't read past the end of the stream. I suspect that's not the only thing I'm not doing correctly. Any insights?
Your code isn't working because you never close the WaveFileWriter so its headers aren't written correctly, and you also would need to rewind the MemoryStream.
However, there is no need to write a WAV file if you want to play back an array of bytes. Just use a RawSourceWaveStream and pass in your MemoryStream.
You may also find the AudioFileReader class more suitable to your needs as it will provide the samples as floating point directly, and allow you to modify the volume.
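A minimal sketch of those suggestions, reusing the byte array and the songFormat field from CreateInputStream above (names assumed from the question's code):
// Play the decoded bytes directly, without going through a WAV file.
var memStream = new MemoryStream(stream);
var rawStream = new RawSourceWaveStream(memStream, songFormat);
volumeStream = new WaveChannel32(rawStream);
waveOutDevice.Init(volumeStream);
waveOutDevice.Play();

// Or, skipping the manual decode entirely: AudioFileReader decodes the MP3,
// exposes the samples as 32-bit floats, and has a Volume property.
// var reader = new AudioFileReader(fileName);
// waveOutDevice.Init(reader);
// waveOutDevice.Play();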

My code crashes if it tries to play a mp3 file, works fine for wav files

All my sound clips are in mp3, so I do not really feel like converting them to wav.
If I try to play an mp3 file I get an exception, but the code works fine for .wav files.
I figure there must be a way to play mp3 files.
Stream s = TitleContainer.OpenStream("sounds/bag.mp3");
// throws an exception if it's an mp3 file
SoundEffect effect = SoundEffect.FromStream(s);
FrameworkDispatcher.Update();
effect.Play();
It is by design - SoundEffect.FromStream only works with wave files. As per MSDN:
The Stream object must point to the head of a valid PCM wave file.
Also, this wave file must be in the RIFF bitstream format.
The audio format has the following restrictions:
Must be a PCM wave file
Can only be mono or stereo
Must be 8 or 16 bit
Sample rate must be between 8,000 Hz and 48,000 Hz
Try using MediaElement:
MediaElement MyMedia = new MediaElement();
MyMedia.Source = new Uri("sounds/bag.mp3", UriKind.RelativeOrAbsolute);
MyMedia.Play();

Sound quality issues when using NAudio to play several wav files at once

My objective is this: to allow users of my .NET program to choose their own .wav files for sound effects. These effects may be played simultaneously. NAudio seemed like my best bet.
I decided to use WaveMixerStream32. One early challenge was that my users had .wav files of different formats, so to be able to mix them together with WaveMixerStream32, I needed to "normalize" them to a common format. I wasn't able to find a good example of this to follow so I suspect my problem is a result of my doing this part wrong.
My problem is that when some sounds are played, there are very noticeable "clicking" sounds at their end. I can reproduce this myself.
Also, my users have complained that sometimes, sounds aren't played at all, or are "scratchy" all the way through. I haven't been able to reproduce this in development but I have heard this for myself in our production environment.
I've played the user's wav files myself using Windows Media and VLC, so I know the files aren't corrupt. It must be a problem with how I'm using them with NAudio.
My NAudio version is v1.4.0.0.
Here's the code I used. To set up the mixer:
_mixer = new WaveMixerStream32 { AutoStop = false, };
_waveOutDevice = new WaveOut(WaveCallbackInfo.NewWindow())
{
DeviceNumber = -1,
DesiredLatency = 300,
NumberOfBuffers = 3,
};
_waveOutDevice.Init(_mixer);
_waveOutDevice.Play();
Surprisingly, if I set "NumberOfBuffers" to 2 here I found that sound quality was awful, with audible "ticks" occurring several times a second.
To initialize a sound file, I did this:
var sample = new AudioSample(fileName);
sample.Position = sample.Length; // To prevent the sample from playing right away
_mixer.AddInputStream(sample);
AudioSample is my class. Its constructor is responsible for the "normalization" of the wav file format. It looks like this:
private class AudioSample : WaveStream
{
    private readonly WaveChannel32 _channelStream;

    public AudioSample(string fileName)
    {
        MemoryStream memStream;
        using (var fileStream = File.OpenRead(fileName))
        {
            memStream = new MemoryStream();
            memStream.SetLength(fileStream.Length);
            fileStream.Read(memStream.GetBuffer(), 0, (int)fileStream.Length);
        }
        WaveStream originalStream = new WaveFileReader(memStream);
        var pcmStream = WaveFormatConversionStream.CreatePcmStream(originalStream);
        var blockAlignReductionStream = new BlockAlignReductionStream(pcmStream);
        var waveFormatConversionStream = new WaveFormatConversionStream(
            new WaveFormat(44100, blockAlignReductionStream.WaveFormat.BitsPerSample, 2),
            blockAlignReductionStream);
        var waveOffsetStream = new WaveOffsetStream(waveFormatConversionStream);
        _channelStream = new WaveChannel32(waveOffsetStream);
    }
    // ... (WaveStream overrides that delegate to _channelStream)
}
Basically, the AudioSample delegates to its _channelStream object. To play an AudioSample, my code sets its "Position" to 0. The code that does this is marshalled onto the UI thread.
This almost works great. I can play multiple sounds simultaneously. Unfortunately the sound quality is bad as described above. Can anyone help me figure out why?
Some points in response to your question:
Yes, you have to have all inputs at the same sample rate before you feed them into a mixer. This is simply how digital audio works. The ACM sample rate conversion provided by WaveFormatConversionStream isn't brilliant (it has no aliasing protection). What sample rates are your input files typically at?
You are passing every input through two WaveFormatConversionStreams. Only do this if it is absolutely necessary.
I'm surprised that you are getting bad sound with NumberOfBuffers=2, which is now the default in NAudio. Have you been pausing and resuming? There was a bug where a buffer could get dropped (fixed in the latest code and will be fixed for NAudio 1.4 final).
A click at the end of a file can just mean it doesn't end on a zero sample. You would have to add a fade out to eliminate this (a lot of media players do this automatically)
Whenever you are troubleshooting a bad sound issue, I always recommend using WaveFileWriter to convert your WaveStream into a WAV file (taking care not to produce a never ending file!), so you can listen to it in another media player. This allows you to quickly determine whether it is your audio processing that is causing the problem, or the playback itself.
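A sketch of that debugging step, assuming the mixer from the question (the output path and the 30-second cap are arbitrary; run this instead of _waveOutDevice.Init/Play while investigating):
// Dump a bounded amount of the mixer output to a WAV file so it can be
// listened to in another media player. The cap avoids a never-ending file,
// since AutoStop is false on the mixer.
int bytesToCapture = _mixer.WaveFormat.AverageBytesPerSecond * 30;
using (var writer = new WaveFileWriter(@"C:\temp\mixer_debug.wav", _mixer.WaveFormat))
{
    var buffer = new byte[_mixer.WaveFormat.AverageBytesPerSecond];
    int totalWritten = 0;
    int read;
    while (totalWritten < bytesToCapture &&
           (read = _mixer.Read(buffer, 0, buffer.Length)) > 0)
    {
        writer.Write(buffer, 0, read);
        totalWritten += read;
    }
}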

Decrypt a media file and play it without saving to the HDD

I need to develop a WinForms app which will be able to decrypt a media file (a movie) and then play it without saving the decrypted file to the HDD (the decrypted file will ultimately be stored in a memory stream). The problem is, how do I then play that movie from the memory stream? Is it possible?
It is possible, but I expect you will need to write your own DirectShow filter to do so, which once created will act as a file reader (implementing the IFileSourceFilter interface), and, as the video plays, will read successive frames from the file, decrypt them, and pass them up to the next filter.
This will only work, however, if the file is encrypted in a sequential form (i.e. each individual frame is encrypted as a separate entity). Otherwise, you will have to decrypt the entire file at once, which could be intensive, slow, and would probably have to hit the hard drive to store the end file.
But anyway, this link should get you started: http://msdn.microsoft.com/en-us/library/dd375454%28VS.85%29.aspx
I'm afraid that in order to create the DirectShow filter, you will need to use C++, and it isn't the easiest API to get your head around.
An alternate way to do it may be to use the Windows Media Format SDK, which allows you to pass custom video packets to a renderer in real time. There is also a good interop library for C# (WindowsMediaLib)
First of all, it's a good idea to encrypt the source video piece by piece, so the encrypted video file is a set of encrypted parts. Just split the original file into parts of the same size and encrypt them.
Here is the scheme (OutputStream is the stream of the encrypted video file, InputStream is the original file stream, ChunkSize is the size of each part in the original file; we also write some metadata: the sizes of the original and encrypted pieces):
using (BinaryWriter Writer = new BinaryWriter(OutputStream))
{
    byte[] Buf = new byte[ChunkSize];
    List<int> SourceChunkSizeList = new List<int>();
    List<int> EncryptedChunkSizeList = new List<int>();
    int ReadBytes;
    while ((ReadBytes = InputStream.Read(Buf, 0, Buf.Length)) > 0)
    {
        byte[] EncryptedData = Encrypt(Buf, ReadBytes);
        OutputStream.Write(EncryptedData, 0, EncryptedData.Length);
        SourceChunkSizeList.Add(ReadBytes);
        EncryptedChunkSizeList.Add(EncryptedData.Length);
    }
    foreach (int SourceChunkSize in SourceChunkSizeList)
        Writer.Write(SourceChunkSize);
    foreach (int EncryptedChunkSize in EncryptedChunkSizeList)
        Writer.Write(EncryptedChunkSize);
}
Such metadata will help to locate an encrypted part quickly.
Secondly, don't decrypt the data on every read request. Cache it: video playback is, in most cases, just sequential reading.
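A hypothetical sketch of that caching idea (all names are invented; decryption is delegated to whatever scheme matches the Encrypt call above):
// Decrypt a chunk only when a read moves into it, and keep the most recently
// decrypted chunk in memory so sequential reads don't re-decrypt the same data.
class DecryptingChunkCache
{
    private readonly Stream encryptedFile;
    private readonly List<long> chunkOffsets;   // start of each encrypted chunk
    private readonly List<int> encryptedSizes;  // from the metadata written above
    private readonly Func<byte[], byte[]> decrypt;
    private int cachedIndex = -1;
    private byte[] cachedPlain;

    public DecryptingChunkCache(Stream encryptedFile, List<long> chunkOffsets,
        List<int> encryptedSizes, Func<byte[], byte[]> decrypt)
    {
        this.encryptedFile = encryptedFile;
        this.chunkOffsets = chunkOffsets;
        this.encryptedSizes = encryptedSizes;
        this.decrypt = decrypt;
    }

    public byte[] GetDecryptedChunk(int index)
    {
        if (index != cachedIndex)
        {
            // assumes a single Read returns the whole chunk; a loop would be safer
            var cipher = new byte[encryptedSizes[index]];
            encryptedFile.Position = chunkOffsets[index];
            encryptedFile.Read(cipher, 0, cipher.Length);
            cachedPlain = decrypt(cipher);   // decrypt once, reuse for later reads
            cachedIndex = index;
        }
        return cachedPlain;
    }
}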
The tricky part is how to play the encrypted video file. You may write either a DirectShow filter (a video-specific solution), or check a 3rd-party product (a multipurpose solution): BoxedApp, a virtualization SDK. What's cool is that they have an article that shows how to solve exactly this task, look: http://boxedapp.com/encrypted_video_streaming.html
