ASP.NET ffmpeg video encoding hangs - C#

I have the methods below to encode videos uploaded to a web site with ffmpeg. It works fine for videos up to 8-9 MB, but if the video is larger than that, it hangs the web site. The only way to recover is restarting IIS.
When I watch the process I can see that ffmpeg encodes the video and exits, and the resulting video is just fine. The problem starts as soon as ffmpeg exits.
The application runs on a Windows Server 2003 x86 web server with IIS 6.
Has anyone got experience with encoding large files from an ASP.NET web app using ffmpeg?
public EncodedVideo EncodeVideo(VideoFile input, string encodingCommand, string outputFile)
{
EncodedVideo encoded = new EncodedVideo();
Params = string.Format("-i \"{0}\" {1} \"{2}\"", input.Path, encodingCommand, outputFile);
//string output = RunProcess(Params);
string output = RunProcessLargeFile(Params);
encoded.EncodingLog = output;
encoded.EncodedVideoPath = outputFile;
encoded.Success = File.Exists(outputFile);
//System.Web.HttpContext.Current.Response.Write(Params);
return encoded;
}
private string RunProcessLargeFile(string Parameters)
{
/* The below should be the right solution ....
* The while loop which reads the stream is very important
* for FFMPEG, as .NET does not provide more memory to FFMPEG.
* When converting large files, FFMPEG's output stream gets filled
* and it waits for .NET to read from it, which is never done.
* In order to use less memory, we clear the buffer periodically.
**/
ProcessStartInfo oInfo = new ProcessStartInfo(this.FFmpegPath, Parameters);
oInfo.WorkingDirectory = Path.GetDirectoryName(this.FFmpegPath);
oInfo.UseShellExecute = false;
oInfo.CreateNoWindow = true;
oInfo.RedirectStandardOutput = true;
oInfo.RedirectStandardError = true;
Process proc = System.Diagnostics.Process.Start(oInfo);
StreamReader objStreamReader = proc.StandardError;
System.Text.StringBuilder sbOutPut = new StringBuilder();
while (!proc.WaitForExit(1000))
{
sbOutPut.Append(objStreamReader.ReadToEnd());
}
proc.Close();
if (objStreamReader != null)
{
objStreamReader.Close();
}
return sbOutPut.ToString();
}

Smells like a typical deadlock.
You call proc.WaitForExit before reading proc.StandardError to the end, and that can cause a deadlock.
A reference from MSDN:
A deadlock condition can result if the parent process calls WaitForExit before StandardOutput.ReadToEnd and the child process writes enough text to fill the redirected stream. The parent process would wait indefinitely for the child process to exit. The child process would wait indefinitely for the parent to read from the full StandardOutput stream.
So replacing your while cycle with:
string output = objStreamReader.ReadToEnd();
proc.WaitForExit();
should solve your problem.
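Put together, a minimal sketch of the corrected method (same names as in the question; note that stdout is no longer redirected, since nothing reads it and an unread redirected pipe can block the child the same way):
private string RunProcessLargeFile(string parameters)
{
ProcessStartInfo oInfo = new ProcessStartInfo(this.FFmpegPath, parameters);
oInfo.WorkingDirectory = Path.GetDirectoryName(this.FFmpegPath);
oInfo.UseShellExecute = false;
oInfo.CreateNoWindow = true;
oInfo.RedirectStandardError = true; // ffmpeg writes its log to stderr
using (Process proc = Process.Start(oInfo))
{
// Drain stderr completely BEFORE waiting, so ffmpeg never blocks
// on a full pipe buffer.
string output = proc.StandardError.ReadToEnd();
proc.WaitForExit();
return output;
}
}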

I am not an ASP.NET person; I work with PHP, and in php.ini, which is a config file, I can set the maximum number of megabytes I am able to upload. Maybe check your config file for an upload limit and set it to the maximum; in ASP.NET the equivalent is the httpRuntime maxRequestLength setting in web.config.

Related

Uploading Large Files to WCF from Xamarin Android App Crashes

I'm trying to upload a large video (1 GB+) from my Xamarin app, and it keeps crashing once it reaches about 0.5 GB of the file. The only way I've found to post the videos to my WCF service while sending data along with them is the multipart logic, but I'm not sure if I'm running out of memory or what, because even in debug mode it simply crashes without any real error message.
I'm running it on a real device (not a simulator): a Samsung Galaxy S9 with Android 9.
Here's the upload code I'm using: (p.s. - as a test, I tried putting the WriteAsync into a for loop, thinking that maybe trying to write the whole gig at once was the problem, but the result was the same. That's why you'll see the MAXFILESIZEPART constant in there, which is just an int equal to 10000000.)
private async Task<byte[]> GetMultipartFormDataAsync(Dictionary<string, object> postParameters, string boundary)
{
try
{
using (Stream formDataStream = new System.IO.MemoryStream())
{
bool needsCRLF = false;
foreach (var param in postParameters)
{
// Add a CRLF to allow multiple parameters to be added.
// Skip it on the first parameter, add it to subsequent parameters.
if (needsCRLF)
await formDataStream.WriteAsync(Encoding.UTF8.GetBytes("\r\n"), 0, Encoding.UTF8.GetByteCount("\r\n"));
needsCRLF = true;
if (param.Value is FileParameter)
{
FileParameter fileToUpload = (FileParameter)param.Value;
// Add just the first part of this param, since we will write the file data directly to the Stream
string header = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"; filename=\"{2}\"\r\nContent-Type: {3}\r\n\r\n",
boundary,
param.Key,
fileToUpload.FileName ?? param.Key,
fileToUpload.ContentType ?? "application/octet-stream");
await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(header), 0, Encoding.UTF8.GetByteCount(header));
// Write the file data directly to the Stream, rather than serializing it to a string.
if (fileToUpload.File.Length > MAXFILESIZEPART)
{
for (var i = 0; i < fileToUpload.File.Length; i += MAXFILESIZEPART)
{
var len = i + MAXFILESIZEPART > fileToUpload.File.Length
? fileToUpload.File.Length - i
: MAXFILESIZEPART;
await formDataStream.WriteAsync(fileToUpload.File, i, len);
}
}
else
{
await formDataStream.WriteAsync(fileToUpload.File, 0, fileToUpload.File.Length);
}
}
else
{
string postData = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"\r\n\r\n{2}",
boundary,
param.Key,
param.Value);
await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(postData), 0, Encoding.UTF8.GetByteCount(postData));
}
}
// Add the end of the request. Start with a newline
string footer = "\r\n--" + boundary + "--\r\n";
await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(footer), 0, Encoding.UTF8.GetByteCount(footer));
// Dump the Stream into a byte[]
formDataStream.Position = 0;
byte[] formData = new byte[formDataStream.Length];
formDataStream.Read(formData, 0, formData.Length);
return formData;
}
}
catch (Exception e)
{
Console.WriteLine(e);
throw;
}
}
And it's eventually failing on the following line
await formDataStream.WriteAsync(fileToUpload.File, i, len);
but only after a certain point (about 500 MB), so I'm assuming it's a memory issue, though it doesn't say so. Is there a better way to accomplish this? I'm doing it this way so that progress is recorded as the upload happens; I'm trying to accomplish something like uploading large videos via the Facebook app, where the upload continues in the background while you keep working. It works great with smaller files (i.e. < 500 MB), but this is the first time I've tried a file almost a gig in size.
NOTE: This happens BEFORE anything is posted to the server, so it's not IIS or WCF related. This code crashes just writing the bytes to the memory stream.
Any suggestions?
Thanks!
According to your description, the upload stops at a certain point, and because the file you are transferring is about 1 GB, it is likely a SendTimeout problem: if the transfer does not complete within the specified time, an exception is thrown. SendTimeout specifies how long a write operation has to complete before timing out; the default value is 1 minute.
For example, with SendTimeout set to 15 seconds in the configuration file, any transfer taking longer than 15 seconds throws an exception. You can set it to a higher value to avoid the timeout.
For more information about SendTimeout, please refer to the following link:
https://learn.microsoft.com/en-us/dotnet/api/system.servicemodel.channels.binding.sendtimeout?view=dotnet-plat-ext-3.1
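If the binding is created in code rather than in web.config, the same idea looks roughly like this (a sketch; the binding type is an assumption, pick whichever your endpoint actually uses):
BasicHttpBinding binding = new BasicHttpBinding();
binding.SendTimeout = TimeSpan.FromMinutes(10); // default is 1 minute
binding.MaxReceivedMessageSize = 1073741824; // also needed server-side for ~1 GB uploads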
UPDATE
I now think it might be a memory problem: a file this large may cause an out-of-memory condition because it cannot all be buffered at once.
You can refer to the following link for solutions:
https://learn.microsoft.com/en-us/archive/blogs/johan/are-you-getting-outofmemoryexceptions-when-uploading-large-files
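Separately from the links above (this is not from the answer, just a common alternative): the large in-memory buffer can be avoided entirely by letting HttpClient stream the file from disk with StreamContent, instead of building the whole multipart body in a MemoryStream. A sketch, where videoPath and serviceUrl are hypothetical and this would replace GetMultipartFormDataAsync rather than fix it (uses System.Net.Http and System.Net.Http.Headers, inside an async method):
using (var client = new HttpClient())
using (var content = new MultipartFormDataContent())
using (var fileStream = File.OpenRead(videoPath))
{
var filePart = new StreamContent(fileStream);
filePart.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
content.Add(filePart, "file", Path.GetFileName(videoPath));
// HttpClient streams the request body from the FileStream, so the
// 1 GB file is never materialized as a single byte[] in memory.
HttpResponseMessage response = await client.PostAsync(serviceUrl, content);
}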

Executing a process on IIS makes RAM go up really quickly

I built an ASP.NET MVC API hosted on IIS on Windows 10 Pro (a VM on Azure - 4 GB RAM, 2 CPUs). Within it I call an .exe (wkhtmltopdf) to convert an HTML page to an image and save it locally. Everything works fine, except I noticed that after some calls to the API the RAM goes crazy, and while investigating with Task Manager I saw a process called IIS Worker Process that adds more RAM every time the API is called. Of course I wrapped my System.Diagnostics.Process instance in a using statement so it gets disposed, because IDisposable is implemented, but it still consumes more and more RAM, and after a while the server becomes laggy and unresponsive (it has only 4 GB of RAM after all). I noticed that after some minutes (10-15-20 maybe) this IIS Worker Process calms down in terms of RAM usage. Here is my code, pretty straightforward:
Gets a base64-encoded URL
Decodes it
Uses wkhtmltoimage.exe to convert it to an image
Saves it locally
Reads the byte array
Creates a blob in Azure with the image
Returns JSON with the URL
public async Task<ActionResult> Index(string url)
{
object oJSON = new { url = string.Empty };
if (!string.IsNullOrEmpty(value: url))
{
try
{
byte[] EncodedData = Convert.FromBase64String(s: url);
string DecodedURL = Encoding.UTF8.GetString(bytes: EncodedData);
using (Process proc = new Process())
{
proc.StartInfo.FileName = wkhtmltopdfExecutablePath;
proc.StartInfo.Arguments = $"--encoding utf-8 \"{DecodedURL}\" {LocalImageFilePath}";
proc.Start();
proc.WaitForExit();
oJSON = new { procStatusCode = proc.ExitCode };
}
if (System.IO.File.Exists(path: LocalImageFilePath))
{
byte[] pngBytes = System.IO.File.ReadAllBytes(path: LocalImageFilePath);
System.IO.File.Delete(path: LocalImageFilePath);
string ImageURL = await CreateBlob(blobName: $"{BlobName}.png", data: pngBytes);
oJSON = new { url = ImageURL };
}
}
catch (Exception ex)
{
Debug.WriteLine(value: ex);
}
}
return Json(data: oJSON, behavior: JsonRequestBehavior.AllowGet);
}
private async Task<string> CreateBlob(string blobName, byte[] data)
{
string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=" + AzureStorrageAccountName + ";AccountKey=" + AzureStorageAccessKey + ";EndpointSuffix=core.windows.net";
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString: ConnectionString);
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName: AzureBlobContainer);
await cloudBlobContainer.CreateIfNotExistsAsync();
BlobContainerPermissions blobContainerPermissions = await cloudBlobContainer.GetPermissionsAsync();
blobContainerPermissions.PublicAccess = BlobContainerPublicAccessType.Container;
await cloudBlobContainer.SetPermissionsAsync(permissions: blobContainerPermissions);
CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName: blobName);
cloudBlockBlob.Properties.ContentType = "image/png";
using (Stream stream = new MemoryStream(buffer: data))
{
await cloudBlockBlob.UploadFromStreamAsync(source: stream);
}
return cloudBlockBlob.Uri.AbsoluteUri;
}
Here are the resources I've been reading that seem related to this issue IMO, but haven't helped much:
Investigating ASP.Net Memory Dumps for Idiots (like Me)
ASP.NET app eating memory. Application / Session objects the reason?
IIS Worker Process using a LOT of memory?
Run dispose method upon asp.net IIS app restart
IIS: Idle Timeout vs Recycle
UPDATE:
if (System.IO.File.Exists(path: LocalImageFilePath))
{
string BlobName = Guid.NewGuid().ToString(format: "n");
string ImageURL = string.Empty;
using (FileStream fileStream = new FileStream(LocalImageFilePath, FileMode.Open))
{
ImageURL = await CreateBlob(blobName: $"{BlobName}.png", data: fileStream);
}
}
System.IO.File.Delete(path: LocalImageFilePath);
oJSON = new { url = ImageURL };
}
The most likely cause of your pain is the allocation of large byte arrays:
byte[] pngBytes = System.IO.File.ReadAllBytes(path: LocalImageFilePath);
The easiest change to make, to encourage the GC to compact the Large Object Heap, is to set GCSettings.LargeObjectHeapCompactionMode to CompactOnce at the end of the method. That might help.
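In code that is just two lines (GCSettings lives in System.Runtime; the flag resets itself after the next blocking full collection):
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(); // the LOH is compacted during this collection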
But, a better idea would be to remove the need for the large array altogether. To do this, change:
private async Task<string> CreateBlob(string blobName, byte[] data)
to instead be:
private async Task<string> CreateBlob(string blobName, FileStream data)
And then later use:
await cloudBlockBlob.UploadFromStreamAsync(source: data);
In the caller, you'll need to stop using ReadAllBytes and instead open a FileStream over the file, as the question's UPDATE above now shows.

capturing ALL stdout data using Process.Start [duplicate]

In C# (.NET 4.0 running under Mono 2.8 on SuSE) I would like to run an external batch command and capture its output in binary form. The external tool I use is called 'samtools' (samtools.sourceforge.net) and among other things it can return records from an indexed binary file format called BAM.
I use Process.Start to run the external command, and I know that I can capture its output by redirecting Process.StandardOutput. The problem is that that's a text stream with an encoding, so it doesn't give me access to the raw bytes of the output. The almost-working solution I found is to access the underlying stream.
Here's my code:
Process cmdProcess = new Process();
ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
cmdStartInfo.FileName = "samtools";
cmdStartInfo.RedirectStandardError = true;
cmdStartInfo.RedirectStandardOutput = true;
cmdStartInfo.RedirectStandardInput = false;
cmdStartInfo.UseShellExecute = false;
cmdStartInfo.CreateNoWindow = true;
cmdStartInfo.Arguments = "view -u " + BamFileName + " " + chromosome + ":" + start + "-" + end;
cmdProcess.EnableRaisingEvents = true;
cmdProcess.StartInfo = cmdStartInfo;
cmdProcess.Start();
// Prepare to read each alignment (binary)
var br = new BinaryReader(cmdProcess.StandardOutput.BaseStream);
while (!cmdProcess.StandardOutput.EndOfStream)
{
// Consume the initial, undocumented BAM data
br.ReadBytes(23);
// ... more parsing follows
But when I run this, the first 23 bytes I read are not the first 23 bytes of the output, but come from somewhere several hundred or thousand bytes downstream. I assume the StreamReader does some buffering, so the underlying stream has already advanced, say, 4K into the output, and the underlying stream does not support seeking back to the start.
And I'm stuck here. Does anyone have a working solution for running an external command and capturing its stdout in binary form? The output may be very large, so I would like to stream it.
Any help appreciated.
By the way, my current workaround is to have samtools return the records in text format and then parse those, but this is pretty slow, and I'm hoping to speed things up by using the binary format directly.
Using StandardOutput.BaseStream is the correct approach, but you must not use any other property or method of cmdProcess.StandardOutput. For example, accessing cmdProcess.StandardOutput.EndOfStream will cause the StreamReader for StandardOutput to read part of the stream, removing the data you want to access.
Instead, simply read and parse the data from br (assuming you know how to parse the data, and won't read past the end of stream, or are willing to catch an EndOfStreamException). Alternatively, if you don't know how big the data is, use Stream.CopyTo to copy the entire standard output stream to a new file or memory stream.
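For example, a minimal read loop over the raw stream (parsing omitted; the buffer size is arbitrary):
cmdProcess.Start();
Stream raw = cmdProcess.StandardOutput.BaseStream;
byte[] chunk = new byte[65536];
int n;
// Loop until the pipe reports end-of-stream (Read returns 0); never
// touch cmdProcess.StandardOutput's other members, or its StreamReader
// will buffer bytes away from you.
while ((n = raw.Read(chunk, 0, chunk.Length)) > 0)
{
// hand chunk[0..n) to the BAM parser here
}
cmdProcess.WaitForExit();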
Since you explicitly specified running on SuSE Linux under Mono, you can work around the problem by using native Unix calls to create the redirection and read from the stream, such as:
using System;
using System.Diagnostics;
using System.IO;
using Mono.Unix;
class Test
{
public static void Main()
{
int reading, writing;
Mono.Unix.Native.Syscall.pipe(out reading, out writing);
int stdout = Mono.Unix.Native.Syscall.dup(1);
Mono.Unix.Native.Syscall.dup2(writing, 1);
Mono.Unix.Native.Syscall.close(writing);
Process cmdProcess = new Process();
ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
cmdStartInfo.FileName = "cat";
cmdStartInfo.CreateNoWindow = true;
cmdStartInfo.Arguments = "test.exe";
cmdProcess.StartInfo = cmdStartInfo;
cmdProcess.Start();
Mono.Unix.Native.Syscall.dup2(stdout, 1);
Mono.Unix.Native.Syscall.close(stdout);
Stream s = new UnixStream(reading);
byte[] buf = new byte[1024];
int bytes = 0;
int current;
while((current = s.Read(buf, 0, buf.Length)) > 0)
{
bytes += current;
}
Mono.Unix.Native.Syscall.close(reading);
Console.WriteLine("{0} bytes read", bytes);
}
}
Under unix, file descriptors are inherited by child processes unless marked otherwise (close on exec). So, to redirect stdout of a child, all you need to do is change the file descriptor #1 in the parent process before calling exec. Unix also provides a handy thing called a pipe which is a unidirectional communication channel, with two file descriptors representing the two endpoints. For duplicating file descriptors, you can use dup or dup2 both of which create an equivalent copy of a descriptor, but dup returns a new descriptor allocated by the system and dup2 places the copy in a specific target (closing it if necessary). What the above code does, then:
Creates a pipe with endpoints reading and writing
Saves a copy of the current stdout descriptor
Assigns the pipe's write endpoint to stdout and closes the original
Starts the child process so it inherits stdout connected to the write endpoint of the pipe
Restores the saved stdout
Reads from the reading endpoint of the pipe by wrapping it in a UnixStream
Note: in native code, a process is usually started by a fork+exec pair, so the file descriptors can be modified in the child process itself, before the new program is loaded. This managed version is not thread-safe, as it has to temporarily modify the stdout of the parent process.
Since the code starts the child process without managed redirection, the .NET runtime does not change any descriptors or create any streams, so the only reader of the child's output is the user code, which uses a UnixStream to work around the StreamReader's encoding issue.
I checked out what's happening with Reflector. It seems to me that StreamReader doesn't read until you call Read on it, though it's created with a buffer size of 0x1000, so maybe it does. Luckily, until you actually read from it, you can safely get the buffered data out of it: it has a private field byte[] byteBuffer and two integer fields, byteLen and bytePos; the first says how many bytes are in the buffer, the second how many you have consumed (it should be zero). So first read this buffer with reflection, then create the BinaryReader.
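In code, that rescue might look like this (a sketch via reflection; byteBuffer, byteLen and bytePos are internals of the .NET Framework StreamReader and can differ on other runtimes, so treat this as fragile):
var flags = BindingFlags.NonPublic | BindingFlags.Instance;
StreamReader sr = cmdProcess.StandardOutput;
byte[] buffered = (byte[])typeof(StreamReader).GetField("byteBuffer", flags).GetValue(sr);
int byteLen = (int)typeof(StreamReader).GetField("byteLen", flags).GetValue(sr);
int bytePos = (int)typeof(StreamReader).GetField("bytePos", flags).GetValue(sr);
// Bytes already swallowed by the reader but not yet consumed live in
// buffered[bytePos .. byteLen). Prepend them to whatever you then
// read from sr.BaseStream with the BinaryReader.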
Maybe you can try something like this:
public class ThirdExe
{
private System.Diagnostics.Process _process = null;
private Stream _messageStream;
private byte[] _recvBuff = new byte[65536];
private int _recvBuffLen;
private Queue<TonguePb.Msg> _msgQueue = new Queue<TonguePb.Msg>();
void StartProcess()
{
try
{
_process = new System.Diagnostics.Process();
_process.EnableRaisingEvents = true; // required for the Exited event below to fire
_process.StartInfo.FileName = "d:/code/boot/tongueerl_d.exe"; // Your exe
_process.StartInfo.UseShellExecute = false;
_process.StartInfo.CreateNoWindow = true;
_process.StartInfo.RedirectStandardOutput = true;
_process.StartInfo.RedirectStandardInput = true;
_process.StartInfo.RedirectStandardError = true;
_process.ErrorDataReceived += new System.Diagnostics.DataReceivedEventHandler(ErrorReceived);
_process.Exited += new EventHandler(OnProcessExit);
_process.Start();
_messageStream = _process.StandardInput.BaseStream;
_process.BeginErrorReadLine();
AsyncRead();
}
catch (Exception e)
{
Debug.LogError("Unable to launch app: " + e.Message);
}
}
private void AsyncRead()
{
_process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length
, new AsyncCallback(DataReceived), null);
}
void DataReceived(IAsyncResult asyncResult)
{
int nread = _process.StandardOutput.BaseStream.EndRead(asyncResult);
if (nread == 0)
{
Debug.Log("process read finished"); // process exit
return;
}
_recvBuffLen += nread;
Debug.LogFormat("recv data size.{0} remain.{1}", nread, _recvBuffLen);
ParseMsg();
AsyncRead();
}
void ParseMsg()
{
if (_recvBuffLen < 4)
{
return;
}
int len = IPAddress.NetworkToHostOrder(BitConverter.ToInt32(_recvBuff, 0));
if (len > _recvBuffLen - 4)
{
Debug.LogFormat("current call can't parse the NetMsg for data incomplete");
return;
}
TonguePb.Msg msg = TonguePb.Msg.Parser.ParseFrom(_recvBuff, 4, len);
Debug.LogFormat("recv msg count.{1}:\n {0} ", msg.ToString(), _msgQueue.Count + 1);
_recvBuffLen -= len + 4;
// Shift any unconsumed bytes to the front of the buffer so the next
// parse starts at offset 0 (the original snippet omitted this step).
Buffer.BlockCopy(_recvBuff, len + 4, _recvBuff, 0, _recvBuffLen);
_msgQueue.Enqueue(msg);
}
}
The key is _process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length, new AsyncCallback(DataReceived), null); the important part is using asynchronous reads on the raw stream instead of a text-based event like Process.OutputDataReceived.

Is it possible to "talk" with a running process?

I want to create a service that will run as a simple process and give other applications the possibility to send it an XML stream.
What I mean is to create a simple process (exe) with an infinite loop, so that any application is able to send XML (a file or stream) to this process, and this process will forward the XML to some socket.
Is it possible to do it without a pipe?
I want to do something like COM, which can 'catch' an instance of a working process.
Sure. You can use the named pipe classes in C#.
Server:
using (var s = new NamedPipeServerStream ("myPipe"))
{
s.WaitForConnection();
s.WriteByte (100);
Console.WriteLine (s.ReadByte());
}
client code:
using (var s = new NamedPipeClientStream ("myPipe"))
{
s.Connect();
Console.WriteLine (s.ReadByte());
s.WriteByte (200);
}
edit
You can also do it with a file plus the FileSystemWatcher class: put a file in a folder, have the other process watch that folder, and now you can transfer info.
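A sketch of the watching side (the folder path is an assumption; note the file may still be open for writing when Created fires, so real code should retry):
var watcher = new FileSystemWatcher(@"C:\inbox", "*.xml");
watcher.Created += (s, e) =>
{
// e.FullPath is the file the other process just dropped
string xml = File.ReadAllText(e.FullPath);
// ... forward the xml to the socket here
};
watcher.EnableRaisingEvents = true;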
edit2
You can use a MemoryMappedFile and open a view in each process onto the same memory region to transfer data. I think it's the best option.
Process A :
static void Main(string[] args)
{
using (MemoryMappedFile mmf = MemoryMappedFile.CreateNew("testmap", 4000))
{
bool mutexCreated;
Mutex mutex = new Mutex(true, "testmapmutex", out mutexCreated);
using (MemoryMappedViewStream stream = mmf.CreateViewStream())
{
BinaryWriter writer = new BinaryWriter(stream);
string st = "Hellow";
int stringSize = Encoding.UTF8.GetByteCount(st); // 6 bytes of text
writer.Write(st); // BinaryWriter prepends a 1-byte length prefix: 1 + 6 = 7 bytes
writer.Write(123); // 7 + 4 bytes = 11 bytes total (hence the offset in Process B)
}
mutex.ReleaseMutex();
Console.WriteLine("Start Process B and press ENTER to continue.");
Console.ReadLine();
mutex.WaitOne();
using (MemoryMappedViewStream stream = mmf.CreateViewStream())
{
BinaryReader reader = new BinaryReader(stream);
Console.WriteLine("Process A says: {0}", reader.ReadString());
Console.WriteLine("Process A says: {0}", reader.ReadInt32());
Console.WriteLine("Process B says: {0}", reader.ReadInt32());
}
mutex.ReleaseMutex();
}
}
Process B writes to its own region:
static void Main(string[] args)
{
try
{
using (MemoryMappedFile mmf = MemoryMappedFile.OpenExisting("testmap"))
{
Mutex mutex = Mutex.OpenExisting("testmapmutex");
mutex.WaitOne();
using (MemoryMappedViewStream stream = mmf.CreateViewStream(11, 0)) // start at byte 11, after Process A's data
{
BinaryWriter writer = new BinaryWriter(stream, Encoding.UTF8);
writer.Write(2);
}
mutex.ReleaseMutex();
}
}
catch (FileNotFoundException)
{
Console.WriteLine("Memory-mapped file does not exist. Run Process A first.");
}
}
Just use C# Sockets that listen for connections from the other process and write a custom XML file receiver.
Yes, of course you can use a TCP socket connection. If you want to avoid a network connection, as pointed out in a comment, you can use a shared-memory approach, for example with memory-mapped files.
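A minimal sketch of the socket approach both answers mention (the port is arbitrary; each sender connects, writes its XML, and closes):
var listener = new TcpListener(IPAddress.Loopback, 9000);
listener.Start();
while (true)
{
using (TcpClient client = listener.AcceptTcpClient())
using (var reader = new StreamReader(client.GetStream(), Encoding.UTF8))
{
string xml = reader.ReadToEnd(); // one XML document per connection
// ... hand the xml to the forwarding socket here
}
}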
What you are looking for is some form of IPC (inter-process communication). There is a huge number of possibilities:
A regular file. Windows provides a location specifically for temp files (%TEMP%)
For small data you could use the registry, although in most cases that is not its proper use
A memory-mapped file; it's similar to a file, but lives in RAM
As Royi properly mentioned, NamedPipeStream is the way to go if you decide to give pipes a try
You could create a WCF endpoint (see the sketch after this list). It sounds like a drag, but Visual Studio will create all the scaffolding for you, so it's not such an issue in the end
Window messages could be used if you are developing a forms application, and sometimes even if not
You mentioned that the data is XML, so this method is not for you, but I'll mention it anyway: you could use named kernel objects, such as mutexes, events and semaphores, to pass signals from one program to another.
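For the WCF option, a minimal self-hosted named-pipe endpoint might look like this (a sketch; XmlSink is a hypothetical class implementing the contract, and the address is arbitrary):
[ServiceContract]
public interface IXmlSink
{
[OperationContract]
void Push(string xml);
}

var host = new ServiceHost(typeof(XmlSink), new Uri("net.pipe://localhost"));
host.AddServiceEndpoint(typeof(IXmlSink), new NetNamedPipeBinding(), "xmlsink");
host.Open(); // clients connect at net.pipe://localhost/xmlsink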

About redirected stdout in System.Diagnostics.Process

I've recently been working on a program that converts flac files to mp3 in C# using flac.exe and lame.exe. Here is the code that does the job:
ProcessStartInfo piFlac = new ProcessStartInfo( "flac.exe" );
piFlac.CreateNoWindow = true;
piFlac.UseShellExecute = false;
piFlac.RedirectStandardOutput = true;
piFlac.Arguments = string.Format( flacParam, SourceFile );
ProcessStartInfo piLame = new ProcessStartInfo( "lame.exe" );
piLame.CreateNoWindow = true;
piLame.UseShellExecute = false;
piLame.RedirectStandardInput = true;
piLame.RedirectStandardOutput = true;
piLame.Arguments = string.Format( lameParam, QualitySetting, ExtractTag( SourceFile ) );
Process flacp = null, lamep = null;
byte[] buffer = BufferPool.RequestBuffer();
flacp = Process.Start( piFlac );
lamep = new Process();
lamep.StartInfo = piLame;
lamep.OutputDataReceived += new DataReceivedEventHandler( this.ReadStdout );
lamep.Start();
lamep.BeginOutputReadLine();
int count = flacp.StandardOutput.BaseStream.Read( buffer, 0, buffer.Length );
while ( count != 0 )
{
lamep.StandardInput.BaseStream.Write( buffer, 0, count );
count = flacp.StandardOutput.BaseStream.Read( buffer, 0, buffer.Length );
}
lamep.StandardInput.BaseStream.Close(); // close lame's stdin to signal EOF once flac is done
Here I set the command-line parameters to tell lame.exe to write its output to stdout, and make use of the Process.OutputDataReceived event to gather the output data, which is mostly binary. But DataReceivedEventArgs.Data is of type string, and I have to convert it to byte[] before putting it in the cache; I think this is ugly, and I tried this approach but the result is incorrect.
Is there any way I can read the raw redirected stdout stream, either synchronously or asynchronously, bypassing the OutputDataReceived event?
PS: the reason I don't have lame write to disk directly is that I'm converting several files in parallel, and direct writing to disk would cause severe fragmentation.
Thanks a lot!
I don't think lame and flac use stdout to output their converted data; I think they write the converted files directly to disk. Stdout is only used for info/error messages.
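That said, if lame really is writing the mp3 to stdout as the question's parameters intend, the raw bytes can be drained on a background thread instead of via BeginOutputReadLine. A sketch (lamep is the Process from the question; cache stands for whatever sink the program uses and is hypothetical):
// replaces lamep.BeginOutputReadLine() and the OutputDataReceived handler
Thread drain = new Thread(() =>
{
Stream mp3 = lamep.StandardOutput.BaseStream;
byte[] buf = new byte[65536];
int n;
while ((n = mp3.Read(buf, 0, buf.Length)) > 0)
{
cache.Write(buf, 0, n); // raw mp3 bytes, no string conversion
}
});
drain.Start();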
