Save streaming data to a WAV file using NAudio - C#

I want to save the incoming stream data to a WAV file on my hard disk drive. How can I change the code below to be able to record the stream into a valid WAV file?
From the demo here:
private void StreamMP3(object state)
{
    this.fullyDownloaded = false;
    string url = (string)state;
    webRequest = (HttpWebRequest)WebRequest.Create(url);
    HttpWebResponse resp = null;
    try
    {
        resp = (HttpWebResponse)webRequest.GetResponse();
    }
    catch (WebException e)
    {
        if (e.Status != WebExceptionStatus.RequestCanceled)
        {
            ShowError(e.Message);
        }
        return;
    }
    byte[] buffer = new byte[16384 * 4]; // Needs to be big enough to hold a decompressed frame
    IMp3FrameDecompressor decompressor = null;
    try
    {
        using (var responseStream = resp.GetResponseStream())
        {
            var readFullyStream = new ReadFullyStream(responseStream);
            do
            {
                if (bufferedWaveProvider != null &&
                    bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes <
                    bufferedWaveProvider.WaveFormat.AverageBytesPerSecond / 4)
                {
                    Debug.WriteLine("Buffer getting full, taking a break");
                    Thread.Sleep(500);
                }
                else
                {
                    Mp3Frame frame = null;
                    try
                    {
                        frame = Mp3Frame.LoadFromStream(readFullyStream);
                    }
                    catch (EndOfStreamException)
                    {
                        this.fullyDownloaded = true;
                        // Reached the end of the MP3 file / stream
                        break;
                    }
                    catch (WebException)
                    {
                        // Probably we have aborted download from the GUI thread
                        break;
                    }
                    if (decompressor == null)
                    {
                        // I don't think these details matter too much - just help ACM select the right codec.
                        // However, the buffered provider doesn't know what sample rate it is working at
                        // until we have a frame.
                        WaveFormat waveFormat = new Mp3WaveFormat(
                            frame.SampleRate,
                            frame.ChannelMode == ChannelMode.Mono ? 1 : 2,
                            frame.FrameLength,
                            frame.BitRate);
                        decompressor = new AcmMp3FrameDecompressor(waveFormat);
                        this.bufferedWaveProvider = new BufferedWaveProvider(decompressor.OutputFormat);
                        this.bufferedWaveProvider.BufferDuration = TimeSpan.FromSeconds(20); // Allow us to get well ahead of ourselves
                        //this.bufferedWaveProvider.BufferedDuration = 250;
                    }
                    int decompressed = decompressor.DecompressFrame(frame, buffer, 0);
                    //Debug.WriteLine(String.Format("Decompressed a frame {0}", decompressed));
                    bufferedWaveProvider.AddSamples(buffer, 0, decompressed);
                }
            } while (playbackState != StreamingPlaybackState.Stopped);
            Debug.WriteLine("Exiting");
            // I was doing this in a finally block, but for some reason
            // we are hanging on response stream .Dispose, so we never get there.
            decompressor.Dispose();
        }
    }
    finally
    {
        if (decompressor != null)
        {
            decompressor.Dispose();
        }
    }
}

I wouldn't take that particular approach to saving to disk. It's a bit too hands-on, because it has to deal with playing back at the right rate. Just buffer up the response, and then wrap it in an Mp3FileReader stream and use WaveFileWriter to write the WAV file:
MemoryStream mp3Buffered = new MemoryStream();
using (var responseStream = resp.GetResponseStream())
{
    byte[] buffer = new byte[65536];
    int bytesRead = responseStream.Read(buffer, 0, buffer.Length);
    while (bytesRead > 0)
    {
        mp3Buffered.Write(buffer, 0, bytesRead);
        bytesRead = responseStream.Read(buffer, 0, buffer.Length);
    }
}
mp3Buffered.Position = 0;
using (var mp3Stream = new Mp3FileReader(mp3Buffered))
{
    WaveFileWriter.CreateWaveFile("file.wav", mp3Stream);
}
That does, of course, assume that your MP3 file's wave format is compatible with WAV, and in particular with your WAV player. If it isn't, you'll need to add a WaveFormatConversionStream as well.
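For example, a minimal sketch of that conversion (hedged: it assumes NAudio's WaveFormatConversionStream helper and that an ACM codec for the target PCM format is installed):

using (var mp3Stream = new Mp3FileReader(mp3Buffered))
using (var pcmStream = WaveFormatConversionStream.CreatePcmStream(mp3Stream))
{
    // CreatePcmStream picks a PCM format that ACM can convert to,
    // so the resulting WAV should be playable by ordinary players
    WaveFileWriter.CreateWaveFile("file.wav", pcmStream);
}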

You can use the following line to save the raw MP3 data to a MemoryStream:
mp3Buffered.Write(frame.RawData, 0, frame.RawData.Length);
Saving the stream to a file is described in MattW's answer.
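For context, a sketch of where that line could sit inside the frame loop of StreamMP3 above (assuming mp3Buffered is a MemoryStream created before the loop starts; this is illustrative, not part of the original demo):

frame = Mp3Frame.LoadFromStream(readFullyStream);
if (frame != null)
{
    // keep the raw, still-compressed MP3 bytes so they can be converted
    // to WAV later with Mp3FileReader / WaveFileWriter
    mp3Buffered.Write(frame.RawData, 0, frame.RawData.Length);
}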

Related

Copy and update content of actively recording Wav file to a new file

I have an active audio recording in progress, written in WAV format with the NAudio library.
private void RecordStart() {
    try {
        _sourceStream = new WaveIn {
            DeviceNumber = _recordingInstance.InputDeviceIndex,
            WaveFormat = new WaveFormat(
                44100,
                WaveIn.GetCapabilities(_recordingInstance.InputDeviceIndex).Channels)
        };
        _sourceStream.DataAvailable += SourceStreamDataAvailable;
        if (!Directory.Exists(_recordingInstance.AudioFilePath)) {
            Directory.CreateDirectory(_recordingInstance.AudioFilePath);
        }
        // assign to the field so SourceStreamDataAvailable can see it
        _waveWriter = new WaveFileWriter(
            _recordingInstance.AudioFilePath + _recordingInstance.AudioFileName,
            _sourceStream.WaveFormat);
        _sourceStream.StartRecording();
    }
    catch (Exception exception) {
        Log.Error("Recording failed", exception);
    }
}

private void SourceStreamDataAvailable(object sender, WaveInEventArgs e) {
    if (_waveWriter == null) return;
    _waveWriter.Write(e.Buffer, 0, e.BytesRecorded);
    _waveWriter.Flush();
}
I want to copy the latest available content to another location. The copied file should be in WAV format, should play back the duration recorded so far, and should be updated whenever more content becomes available.
I have tried the following sample code (using NAudio) with a static WAV file, but the solution is not working: the resulting WAV file is corrupted - not in the correct format.
using (WaveFileReader reader = new WaveFileReader(remoteWavFile))
{
    byte[] buffer = new byte[reader.Length];
    int read = reader.Read(buffer, 0, buffer.Length);
}
When the recording is in progress, the code throws an exception "File is in use by another application".
I have solved the problem with the help of the NAudio library itself.
When we use only the WaveFileReader class of NAudio, it throws the exception "file is in use by another application".
So I had to create a file stream that opens the source file (the live recording file) with File.Open(inPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), then pass this stream as the input of WaveFileReader.
Then create a WaveFileWriter (NAudio class) with the same WaveFormat as the reader.
Copied below is the code I have used.
public static void CopyWavFile(string inPath, string outPath){
    using (var fs = File.Open(inPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)){
        using (var reader = new WaveFileReader(fs)){
            using (var writer = new WaveFileWriter(outPath, reader.WaveFormat)){
                reader.Position = 0;
                var endPos = (int)reader.Length;
                var buffer = new byte[1024];
                while (reader.Position < endPos){
                    var bytesRequired = (int)(endPos - reader.Position);
                    if (bytesRequired <= 0) continue;
                    var bytesToRead = Math.Min(bytesRequired, buffer.Length);
                    var bytesRead = reader.Read(buffer, 0, bytesToRead);
                    if (bytesRead > 0){
                        writer.Write(buffer, 0, bytesRead);
                    }
                }
            }
        }
    }
}
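A possible usage sketch (the paths and the idea of calling it on a timer are illustrative assumptions, not part of the answer):

// call periodically while recording is still running; each call rewrites
// the snapshot with whatever the recorder has flushed so far
CopyWavFile(@"C:\recordings\live.wav", @"C:\recordings\snapshot.wav");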

YouTube Direct Upload - OutOfMemory Exception

Whenever I try to upload a large video via Direct Upload using the YouTube API, I get an OutOfMemoryException. Is there anything I can do to get rid of this? The YouTube API documentation does not say anything about a video size limit for direct upload.
I gave up on Direct Upload. Now I am trying the resumable upload approach. My code is below.
YouTubeRequest request;
YouTubeRequestSettings settings = new YouTubeRequestSettings("YouTube Upload", Client Key, "Username", "Password");
request = new YouTubeRequest(settings);
Video newVideo = new Video();
ResumableUploader m_ResumableUploader = null;
Authenticator YouTubeAuthenticator;
m_ResumableUploader = new ResumableUploader(256); // chunk size 256 kilobytes
m_ResumableUploader.AsyncOperationCompleted += new AsyncOperationCompletedEventHandler(m_ResumableUploader_AsyncOperationCompleted);
m_ResumableUploader.AsyncOperationProgress += new AsyncOperationProgressEventHandler(m_ResumableUploader_AsyncOperationProgress);
YouTubeAuthenticator = new ClientLoginAuthenticator("YouTubeUploader", ServiceNames.YouTube, "kjohnson#resoluteinnovations.com", "password");
//AtomLink link = new AtomLink("http://uploads.gdata.youtube.com/resumable/feeds/api/users/uploads");
//link.Rel = ResumableUploader.CreateMediaRelation;
//newVideo.YouTubeEntry.Links.Add(link);
System.IO.FileStream stream = new System.IO.FileStream(filePath, System.IO.FileMode.Open, System.IO.FileAccess.Read);
byte[] chunk = new byte[256000];
int count = 1;
while (true) {
    int index = 0;
    while (index < chunk.Length) {
        int bytesRead = stream.Read(chunk, index, chunk.Length - index);
        if (bytesRead == 0) {
            break;
        }
        index += bytesRead;
    }
    if (index != 0) { // Our previous chunk may have been the last one
        newVideo.MediaSource = new MediaFileSource(new MemoryStream(chunk), filePath, "video/quicktime");
        if (count == 1) {
            m_ResumableUploader.InsertAsync(YouTubeAuthenticator, newVideo.YouTubeEntry, new MemoryStream(chunk));
            count++;
        }
        else
            m_ResumableUploader.ResumeAsync(YouTubeAuthenticator, new Uri("http://uploads.gdata.youtube.com/resumable/feeds/api/users/uploads"), "POST", new MemoryStream(chunk), "video/quicktime", new object());
    }
    if (index != chunk.Length) { // We didn't read a full chunk: we're done
        break;
    }
}
Can anyone tell me what is wrong? My 2 GB video is not uploading.
The reason I was getting a 403 Forbidden error was that I was not passing in:
a username and password
a developer key
The request variable in the code above is never used in the upload, so I was effectively doing an unauthorized upload.
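A minimal sketch of the authenticated setup, assuming the old GData .NET client library (developerKey, username and password are placeholders):

// the second argument is the developer key; without it and without
// credentials the upload is rejected with 403 Forbidden
var settings = new YouTubeRequestSettings(
    "YouTube Upload",  // application name
    developerKey,
    "username",
    "password");
var request = new YouTubeRequest(settings);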
Chances are that you are not disposing your objects. Ensure all disposable objects are wrapped in a using statement.
For example, this code will upload a large zip file to a server:
try
{
    using (Stream ftpStream = FTPRequest.GetRequestStream())
    {
        using (FileStream file = File.OpenRead(ImagesZipFile))
        {
            // set up variables we'll use to read the file
            int length = 1024;
            byte[] buffer = new byte[length];
            int bytesRead = 0;
            // write the file to the request stream
            do
            {
                bytesRead = file.Read(buffer, 0, length);
                ftpStream.Write(buffer, 0, bytesRead);
            }
            while (bytesRead != 0);
        }
    }
}
catch (Exception)
{
    // rethrow the exception ("throw;" preserves the original stack trace, unlike "throw e;")
    throw;
}

Partially download and serialize big file in C#?

As part of an upcoming project at my university, I need to write a client that downloads a media file from a server and writes it to the local disk. Since these files can be very large, I need to implement partial download and serialization in order to avoid excessive memory use.
What I came up with:
namespace PartialDownloadTester
{
    using System;
    using System.Diagnostics.Contracts;
    using System.IO;
    using System.Net;
    using System.Text;

    public class DownloadClient
    {
        public static void Main(string[] args)
        {
            var dlc = new DownloadClient(args[0], args[1], args[2]);
            dlc.DownloadAndSaveToDisk();
            Console.ReadLine();
        }

        private WebRequest request;

        // directory of file
        private string dir;

        // full file identifier
        private string filePath;

        public DownloadClient(string uri, string fileName, string fileType)
        {
            this.request = WebRequest.Create(uri);
            this.request.Method = "GET";
            var sb = new StringBuilder();
            sb.Append("C:\\testdata\\DownloadedData\\");
            this.dir = sb.ToString();
            sb.Append(fileName + "." + fileType);
            this.filePath = sb.ToString();
        }

        public void DownloadAndSaveToDisk()
        {
            // make sure directory exists
            this.CreateDir();
            var response = (HttpWebResponse)request.GetResponse();
            Console.WriteLine("Content length: " + response.ContentLength);
            var rStream = response.GetResponseStream();
            int bytesRead = -1;
            do
            {
                var buf = new byte[2048];
                bytesRead = rStream.Read(buf, 0, buf.Length);
                rStream.Flush();
                this.SerializeFileChunk(buf);
            }
            while (bytesRead != 0);
        }

        private void CreateDir()
        {
            if (!Directory.Exists(dir))
            {
                Directory.CreateDirectory(dir);
            }
        }

        private void SerializeFileChunk(byte[] bytes)
        {
            Contract.Requires(!Object.ReferenceEquals(bytes, null));
            FileStream fs = File.Open(filePath, FileMode.Append);
            fs.Write(bytes, 0, bytes.Length);
            fs.Flush();
            fs.Close();
        }
    }
}
For testing purposes, I've used the following parameters:
"http://itu.dk/people/janv/mufc_abc.jpg" "mufc_abc" "jpg"
However, the picture is incomplete (only the first ~10% looks right) even though the content length prints 63780, which is the actual size of the image.
So my questions are:
Is this the right way to go for partial download and serialization or is there a better/easier approach?
Is the full content of the response stream stored in client memory? If this is the case, do I need to use HttpWebRequest.AddRange to partially download data from the server in order to conserve my client's memory?
How come the serialization fails and I get a broken image?
Do I introduce a lot of overhead when I use FileMode.Append? (MSDN states that this option "seeks to the end of the file".)
Thanks in advance
You could definitely simplify your code using a WebClient:
class Program
{
    static void Main()
    {
        DownloadClient("http://itu.dk/people/janv/mufc_abc.jpg", "mufc_abc.jpg");
    }

    public static void DownloadClient(string uri, string fileName)
    {
        using (var client = new WebClient())
        {
            using (var stream = client.OpenRead(uri))
            {
                // work with chunks of 2KB => adjust if necessary
                const int chunkSize = 2048;
                var buffer = new byte[chunkSize];
                using (var output = File.OpenWrite(fileName))
                {
                    int bytesRead;
                    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, bytesRead);
                    }
                }
            }
        }
    }
}
Notice how I am writing only the number of bytes I have actually read from the socket to the output file and not the entire 2KB buffer.
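On the HttpWebRequest.AddRange part of the question: range requests are only needed if you want to fetch the file in separate pieces; the streaming loop above already avoids holding the whole response in memory. A hedged sketch, assuming the server honours Range headers:

var rangeRequest = (HttpWebRequest)WebRequest.Create(uri);
// ask for bytes 0..65535 only; servers that ignore Range simply return the full content
rangeRequest.AddRange(0, 65535);
using (var response = (HttpWebResponse)rangeRequest.GetResponse())
using (var stream = response.GetResponseStream())
{
    // read this chunk and append it to the output file as shown above
}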
I don't know if this is the source of the problem; however, I would change the loop like this:
const int ChunkSize = 2048;
var buf = new byte[ChunkSize];
var rStream = response.GetResponseStream();
int bytesRead;
do {
    bytesRead = rStream.Read(buf, 0, ChunkSize);
    if (bytesRead > 0) {
        this.SerializeFileChunk(buf, bytesRead);
    }
} while (bytesRead == ChunkSize);
The serialize method would get an additional argument
private void SerializeFileChunk(byte[] bytes, int numBytes)
and then write the right number of bytes
fs.Write(bytes, 0, numBytes);
UPDATE:
I do not see the need for closing and reopening the file each time. I also would use the using statement, which closes the resources, even if an exception should occur. The using statement calls the Dispose() method of the resource at the end, which in turn calls Close() in the case of file streams. using can be applied to all types implementing IDisposable.
var buf = new byte[2048];
int bytesRead;
using (var rStream = response.GetResponseStream()) {
    using (FileStream fs = File.Open(filePath, FileMode.Append)) {
        do {
            bytesRead = rStream.Read(buf, 0, buf.Length);
            fs.Write(buf, 0, bytesRead);
        } while (...);
    }
}
The using statement does something like this
{
    var rStream = response.GetResponseStream();
    try
    {
        // do some work with rStream here.
    } finally {
        if (rStream != null) {
            rStream.Dispose();
        }
    }
}
Here is the solution from Microsoft: http://support.microsoft.com/kb/812406
Updated 2021-03-16: seems the original article is not available now. Here is the archived one: https://mskb.pkisolutions.com/kb/812406

Frustrating TCP Serialization Exception: Binary stream '0' does not contain a valid BinaryHeader

I posted a question on how to send large objects over TCP, and it seems the primary issue is solved, but now I frequently get another exception:
Binary stream '0' does not contain a valid BinaryHeader. Possible causes are invalid stream or object version change between serialization and deserialization.
The issue is still in my Receive method:
public Message Receive()
{
    if (_tcpClient == null || !_tcpClient.Connected)
    {
        throw new TransportException("Client Not Connected");
    }

    // buffers
    byte[] msgBuffer;
    byte[] sizeBuffer = new byte[sizeof(int)];
    // bytes read
    int readSize = 0;
    // message size
    int size = 0;

    MemoryStream memStream = new MemoryStream();
    NetworkStream netStream = _tcpClient.GetStream();
    BinaryFormatter formatter = new BinaryFormatter();
    try
    {
        // Read the message length
        netStream.Read(sizeBuffer, 0, sizeof(int));
        // Extract the message length
        size = BitConverter.ToInt32(sizeBuffer, 0);
        msgBuffer = new byte[size];

        // Fill up the message msgBuffer
        do
        {
            // Clear the buffer
            Array.Clear(msgBuffer, 0, size);
            // Read the message
            readSize += netStream.Read(msgBuffer, 0, _tcpClient.ReceiveBufferSize);
            // Write the msgBuffer to the memory stream
            memStream.Write(msgBuffer, 0, readSize);
        } while (readSize < size);

        // Reset the memory stream position
        memStream.Position = 0;
        // Deserialize the message
        return (Message)formatter.Deserialize(memStream); // <-- Exception here
    }
    catch (System.Exception e)
    {
        if (_tcpClient == null || !_tcpClient.Connected)
        {
            throw new TransportException("Client Not Connected");
        }
        else
        {
            throw e;
        }
    }
}
The rest of the code relevant to this example can be found in my original question.
Does anybody know what is causing this exception and how I can avoid it?
Update
Changed the Read to read a maximum of _tcpClient.ReceiveBufferSize bytes at a time, rather than trying to read the full message size (which can be larger than the buffer size). While the frequency of the exception decreased slightly, it is still occurring quite often.
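The likely cause is that NetworkStream.Read can return fewer bytes than requested, while the loop above always reads into offset 0 of msgBuffer and then writes readSize (the cumulative count) bytes to the memory stream, so partial reads get stitched together at the wrong offsets. A hedged sketch of a loop that accumulates into the buffer at the correct offset (illustrative, not taken from the answer below):

byte[] msgBuffer = new byte[size];
int totalRead = 0;
while (totalRead < size)
{
    // Read may return fewer bytes than asked for, so keep looping
    int read = netStream.Read(msgBuffer, totalRead, size - totalRead);
    if (read == 0)
    {
        throw new EndOfStreamException("Connection closed before the full message arrived");
    }
    totalRead += read;
}
// msgBuffer now holds exactly 'size' bytes and can be deserialized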
Let me suggest a slight simplification of your code:
public Message Receive()
{
    try
    {
        if (_tcpClient == null || !_tcpClient.Connected)
        {
            throw new TransportException("Client Not Connected");
        }
        using (var stream = _tcpClient.GetStream())
        using (var reader = new BinaryReader(stream))
        {
            int size = reader.ReadInt32();
            byte[] buffer = reader.ReadBytes(size);
            using (var memStream = new MemoryStream(buffer))
            {
                var formatter = new BinaryFormatter();
                return (Message)formatter.Deserialize(memStream);
            }
        }
    }
    catch (System.Exception e)
    {
        if (_tcpClient == null || !_tcpClient.Connected)
        {
            throw new TransportException("Client Not Connected");
        }
        throw e;
    }
}
Also, if you are doing this for fun and/or educational purposes then it's fine, but for a real project you should probably consider WCF for transmitting objects over the wire.
WCF is not so good for client-server scenarios. The polling duplex binding is still quite a raw technology.

If I check stream for valid image I can't write bytes to server

I am trying to check if a file is an image before I upload it to the image server.
I am doing it with the following function, which works exceptionally well:
static bool IsValidImage(Stream imageStream)
{
    bool isValid = false;
    try
    {
        // Read the image without validating image data
        using (Image img = Image.FromStream(imageStream, false, false))
        {
            isValid = true;
        }
    }
    catch
    {
        ;
    }
    return isValid;
}
The problem is that when the code below is called immediately afterwards, the line:
while ((bytesRead = request.FileByteStream.Read(buffer, 0, bufferSize)) > 0)
evaluates to zero and no bytes are read. I notice that when I remove the IsValidImage function, bytes are read and the file is written. It seems that bytes can only be read once? Any idea how to fix this?
using (FileStream outfile = new FileStream(filePath, FileMode.Create))
{
    const int bufferSize = 65536; // 64K
    int bytesRead = 0;
    Byte[] buffer = new Byte[bufferSize];
    while ((bytesRead = request.FileByteStream.Read(buffer, 0, bufferSize)) > 0)
    {
        outfile.Write(buffer, 0, bytesRead);
    }
    outfile.Close(); // necessary?
}
UPDATE: Thanks for your help, Marc. I am new to stream manipulation and could use a little more help here. I took a shot but may be mixing up the use of FileStream and MemoryStream. Would you mind taking a look? Thanks again.
using (FileStream outfile = new FileStream(filePath, FileMode.Create))
using (MemoryStream ms = new MemoryStream())
{
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = request.FileByteStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, bytesRead);
    }
    // ms now has a seekable/rewindable copy of the data
    // TODO: read ms the first time
    // I replaced request.FileByteStream with ms but am unsure about
    // the using statement in the IsValidImage function.
    if (!IsValidImage(ms) == true)
    {
        ms.Close();
        request.FileByteStream.Close();
        return;
    }
    ms.Position = 0;
    // TODO: read ms the second time
    byte[] m_buffer = new byte[ms.Length];
    while ((bytesRead = ms.Read(m_buffer, 0, (int)ms.Length)) > 0)
    {
        outfile.Write(m_buffer, 0, bytesRead);
    }
}
static bool IsValidImage(MemoryStream imageStream)
{
    bool isValid = false;
    try
    {
        // Read the image without validating image data
        using (Image img = Image.FromStream(imageStream, false, false))
        {
            isValid = true;
        }
    }
    catch
    {
        ;
    }
    return isValid;
}
As you read from any stream, the position increases. If you read a stream to the end (as is typical), and then try to read again, then it will return EOF.
For some streams, you can seek - set the Position to 0, for example. However, you should avoid relying on this, as it is not available for many streams (especially when network IO is involved). You can query this ability via CanSeek, but it would be simpler to avoid it, partly because if you are branching based on this, you suddenly have twice as much code to maintain.
If you need the data twice, then the options depends on the size of the data. For small streams, buffer it in-memory, as either a byte[] or a MemoryStream. For larger streams (or if you don't know the size) then writing to a scratch file (and deleting afterwards) is a reasonable approach. You can open and read the file as many times (in series, not in parallel) as you like.
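A hedged sketch of the scratch-file approach (illustrative only; the two passes stand in for the validation read and the final copy):

string tempFile = Path.GetTempFileName();
try
{
    // first pass: copy the incoming stream to a temporary file
    using (var scratch = File.Create(tempFile))
    {
        byte[] buffer = new byte[8192];
        int bytesRead;
        while ((bytesRead = request.FileByteStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            scratch.Write(buffer, 0, bytesRead);
        }
    }
    // the temp file can now be opened as many times as needed:
    // once to validate the image, once more to copy it to its final location
    using (var readBack = File.OpenRead(tempFile))
    {
        // validate / copy here
    }
}
finally
{
    File.Delete(tempFile); // clean up the scratch file
}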
If you are happy the stream isn't too large (although maybe add a cap to prevent people uploading swap-files, etc):
using (MemoryStream ms = new MemoryStream()) {
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = inputStream.Read(buffer, 0, buffer.Length)) > 0) {
        ms.Write(buffer, 0, bytesRead);
    }
    // ms now has a seekable/rewindable copy of the data
    // TODO: read ms the first time
    ms.Position = 0;
    // TODO: read ms the second time
}
Indeed, Stream instances remember where the current "cursor" is. Some streams support rewinding; the CanSeek property will then return true. In the case of an HTTP request this won't work (CanSeek = false).
Isn't a MIME-type sent from the browser as well?
If you really want to keep your way of checking, you'll have to go with Marc's suggestion.
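For illustration, the rewind itself is just a position reset, and it only works on seekable streams such as the MemoryStream copy Marc suggests:

if (ms.CanSeek)
{
    ms.Position = 0; // rewind so the image bytes can be read again
}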
In your update, you have a problem reading the stream a second time.
byte[] m_buffer = new byte[ms.Length];
while ((bytesRead = ms.Read(m_buffer, 0, (int)ms.Length)) > 0)
{
    outfile.Write(m_buffer, 0, bytesRead);
}
The solution is simple:
byte[] m_buffer = ms.ToArray();
outfile.Write(m_buffer, 0, m_buffer.Length);
See also MemoryStream.ToArray
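Equivalently (a minor variation, not from the original answer), a MemoryStream can write its buffered contents straight to another stream without the intermediate array:

ms.WriteTo(outfile); // copies the entire buffered content to the output file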
public static bool IsImagen(System.IO.Stream stream, String fileName)
{
    try
    {
        using (Image img = Image.FromStream(stream, false, false))
        {
            if (fileName.ToLower().IndexOf(".jpg") > 0)
                return true;
            if (fileName.ToLower().IndexOf(".gif") > 0)
                return true;
            if (fileName.ToLower().IndexOf(".png") > 0)
                return true;
        }
    }
    catch (ArgumentException) { }
    return false;
}
