I've been working recently on a program that converts FLAC files to MP3 in C# using flac.exe and lame.exe. Here is the code that does the job:
ProcessStartInfo piFlac = new ProcessStartInfo( "flac.exe" );
piFlac.CreateNoWindow = true;
piFlac.UseShellExecute = false;
piFlac.RedirectStandardOutput = true;
piFlac.Arguments = string.Format( flacParam, SourceFile );
ProcessStartInfo piLame = new ProcessStartInfo( "lame.exe" );
piLame.CreateNoWindow = true;
piLame.UseShellExecute = false;
piLame.RedirectStandardInput = true;
piLame.RedirectStandardOutput = true;
piLame.Arguments = string.Format( lameParam, QualitySetting, ExtractTag( SourceFile ) );
Process flacp = null, lamep = null;
byte[] buffer = BufferPool.RequestBuffer();
flacp = Process.Start( piFlac );
lamep = new Process();
lamep.StartInfo = piLame;
lamep.OutputDataReceived += new DataReceivedEventHandler( this.ReadStdout );
lamep.Start();
lamep.BeginOutputReadLine();
int count = flacp.StandardOutput.BaseStream.Read( buffer, 0, buffer.Length );
while ( count != 0 )
{
lamep.StandardInput.BaseStream.Write( buffer, 0, count );
count = flacp.StandardOutput.BaseStream.Read( buffer, 0, buffer.Length );
}
Here I set the command-line parameters to tell lame.exe to write its output to stdout, and use the Process.OutputDataReceived event to gather the output data, which is mostly binary. But DataReceivedEventArgs.Data is of type string, and I have to convert it to byte[] before putting it in the cache. I think this is ugly; I tried this approach, but the result is incorrect.
Is there any way that I can read the raw redirected stdout stream, either synchronously or asynchronously, bypassing the OutputDataReceived event?
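Something like this is roughly what I have in mind (just a sketch; it assumes lame's arguments make it write the encoded MP3 to stdout, and that I drop the BeginOutputReadLine call):
// Read lame's stdout as raw bytes instead of via OutputDataReceived.
using (Stream mp3Stream = lamep.StandardOutput.BaseStream)
{
    byte[] outBuffer = BufferPool.RequestBuffer();
    int read;
    while ((read = mp3Stream.Read(outBuffer, 0, outBuffer.Length)) > 0)
    {
        CacheData(outBuffer, read); // CacheData stands in for my caching code
    }
}
In practice this loop would have to run concurrently with the flac-to-lame copy loop above, so that neither pipe fills up.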
PS: the reason why I don't use lame to write to disk directly is that I'm trying to convert several files in parallel, and direct writing to disk will cause severe fragmentation.
Thanks a lot!
I don't think lame and flac use stdout to output their converted data. I think they write the converted files directly to disk; stdout is only used for info/error messages.
Related
Does C# provide an event when data is received on the stdin stream for my own process? Something like Process.OutputDataReceived, only I need an event for InputDataReceived.
I've searched high and low, and learned how to redirect stdin to stdout, monitor the output streams of spawned apps, and a ton of other stuff, but nowhere has anyone shown which event is raised when data arrives on stdin. Unless I use a dumb polling loop in Main():
// dumb polling loop -- is this the only way? does this consume a lot of CPU?
string line;
while ((line = Console.ReadLine()) != null && line != "") {
// do work
}
Also, I need to get binary data from the stream, something like this:
using (Stream stdin = Console.OpenStandardInput())
using (Stream stdout = Console.OpenStandardOutput())
{
byte[] buffer = new byte[2048];
int bytes;
while ((bytes = stdin.Read(buffer, 0, buffer.Length)) > 0) {
stdout.Write(buffer, 0, bytes);
}
}
The polling loop won't consume much CPU, because ReadLine blocks and waits. Put that code in its own worker thread and raise your event from there. As far as I know, there is no such built-in feature in .NET.
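For example, a minimal sketch of that worker-thread idea (the class and event names here are just illustrative):
using System;
using System.Threading;

public static class StdinListener
{
    // Raised once per line read from stdin; the name mirrors what the question asks for.
    public static event Action<string> InputDataReceived;

    public static void Start()
    {
        var worker = new Thread(() =>
        {
            string line;
            // ReadLine blocks, so this loop does not spin the CPU.
            while ((line = Console.ReadLine()) != null)
            {
                var handler = InputDataReceived;
                if (handler != null)
                    handler(line);
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }
}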
EDIT: I was wrong here in the first place. Corrected:
You can actually read the binary data from stdin, as this SO answer says:
To read binary, the best approach is to use the raw input stream - here showing something like "echo" between stdin and stdout:
using (Stream stdin = Console.OpenStandardInput())
using (Stream stdout = Console.OpenStandardOutput())
{
byte[] buffer = new byte[2048];
int bytes;
while ((bytes = stdin.Read(buffer, 0, buffer.Length)) > 0) {
stdout.Write(buffer, 0, bytes);
}
}
Here's an async approach. Like OutputDataReceived, the callback fires once per newline. For binary data, streaming base64 might work; switching to a true binary stream is harder because you can no longer split on newlines.
using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static void ListenToParent(Action<string> onMessageFromParent)
{
    Task.Run(async () =>
    {
        while (true)
        {
            // ReadLineAsync completes once per line received from the parent.
            var text = await Console.In.ReadLineAsync();
            if (text == null) break; // stdin was closed by the parent
            onMessageFromParent(text);
        }
    });
}
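For the binary case mentioned above, one possibility (an untested sketch) is to skip Console.In entirely and read raw bytes from the standard input stream asynchronously; the caller then has to do its own framing, for example with a length prefix, since there is no newline to split on:
using System;
using System.IO;
using System.Threading.Tasks;

public static void ListenToParentBinary(Action<byte[], int> onBytesFromParent)
{
    Task.Run(async () =>
    {
        using (Stream stdin = Console.OpenStandardInput())
        {
            byte[] buffer = new byte[4096];
            int read;
            // ReadAsync returns 0 when the parent closes our stdin.
            while ((read = await stdin.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                // Note: the buffer is reused, so the callback must copy what it needs.
                onBytesFromParent(buffer, read);
            }
        }
    });
}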
Here's how my parent app sets up the child process:
var child = new Process()
{
EnableRaisingEvents = true,
StartInfo =
{
FileName = ..., // .exe path
RedirectStandardOutput = true,
RedirectStandardInput = true,
UseShellExecute = false,
CreateNoWindow = true
},
};
child.Start();
child.BeginOutputReadLine();
... and how it sends a line to the child process:
child.StandardInput.WriteLine("Message from parent");
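If the messages were binary rather than text lines, the same setup should work by writing raw bytes through the base stream instead (a sketch, not something from my actual app; GetMessageBytes is just a placeholder):
// Send raw bytes to the child's stdin instead of a text line.
byte[] payload = GetMessageBytes();             // placeholder for producing the binary message
Stream childStdin = child.StandardInput.BaseStream;
childStdin.Write(payload, 0, payload.Length);
childStdin.Flush();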
In C# (.NET 4.0 running under Mono 2.8 on SuSE) I would like to run an external batch command and capture its output in binary form. The external tool I use is called 'samtools' (samtools.sourceforge.net) and among other things it can return records from an indexed binary file format called BAM.
I use Process.Start to run the external command, and I know that I can capture its output by redirecting Process.StandardOutput. The problem is, that's a text stream with an encoding, so it doesn't give me access to the raw bytes of the output. The almost-working solution I found is to access the underlying stream.
Here's my code:
Process cmdProcess = new Process();
ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
cmdStartInfo.FileName = "samtools";
cmdStartInfo.RedirectStandardError = true;
cmdStartInfo.RedirectStandardOutput = true;
cmdStartInfo.RedirectStandardInput = false;
cmdStartInfo.UseShellExecute = false;
cmdStartInfo.CreateNoWindow = true;
cmdStartInfo.Arguments = "view -u " + BamFileName + " " + chromosome + ":" + start + "-" + end;
cmdProcess.EnableRaisingEvents = true;
cmdProcess.StartInfo = cmdStartInfo;
cmdProcess.Start();
// Prepare to read each alignment (binary)
var br = new BinaryReader(cmdProcess.StandardOutput.BaseStream);
while (!cmdProcess.StandardOutput.EndOfStream)
{
// Consume the initial, undocumented BAM data
br.ReadBytes(23);
// ... more parsing follows
But when I run this, the first 23 bytes that I read are not the first 23 bytes of the output, but rather start several hundred or thousand bytes downstream. I assume that StreamReader does some buffering, so the underlying stream has already advanced, say, 4K into the output. The underlying stream does not support seeking back to the start.
And I'm stuck here. Does anyone have a working solution for running an external command and capturing its stdout in binary form? The output may be very large, so I would like to stream it.
Any help appreciated.
By the way, my current workaround is to have samtools return the records in text format, then parse those, but this is pretty slow and I'm hoping to speed things up by using the binary format directly.
Using StandardOutput.BaseStream is the correct approach, but you must not use any other property or method of cmdProcess.StandardOutput. For example, accessing cmdProcess.StandardOutput.EndOfStream will cause the StreamReader for StandardOutput to read part of the stream, removing the data you want to access.
Instead, simply read and parse the data from br (assuming you know how to parse the data, and won't read past the end of stream, or are willing to catch an EndOfStreamException). Alternatively, if you don't know how big the data is, use Stream.CopyTo to copy the entire standard output stream to a new file or memory stream.
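For example, the corrected read loop might look something like this (a sketch; the 23-byte record size is taken from your snippet, and the actual BAM parsing is left out):
// After cmdProcess.Start(), read raw bytes straight from BaseStream and never touch
// any other member of cmdProcess.StandardOutput (EndOfStream, ReadLine, ...), or the
// StreamReader will buffer data away from the underlying stream.
var br = new BinaryReader(cmdProcess.StandardOutput.BaseStream);
while (true)
{
    byte[] record = br.ReadBytes(23);   // 23-byte chunk size taken from your snippet
    if (record.Length < 23)
        break;                          // ReadBytes returns a short array at end of stream
    // ... parse the record here
}
cmdProcess.WaitForExit();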
Since you explicitly mention running on SuSE Linux under Mono, you can work around the problem by using native Unix calls to create the redirection and read from the stream. Such as:
using System;
using System.Diagnostics;
using System.IO;
using Mono.Unix;
class Test
{
public static void Main()
{
int reading, writing;
Mono.Unix.Native.Syscall.pipe(out reading, out writing);
int stdout = Mono.Unix.Native.Syscall.dup(1);
Mono.Unix.Native.Syscall.dup2(writing, 1);
Mono.Unix.Native.Syscall.close(writing);
Process cmdProcess = new Process();
ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
cmdStartInfo.FileName = "cat";
cmdStartInfo.CreateNoWindow = true;
cmdStartInfo.Arguments = "test.exe";
cmdProcess.StartInfo = cmdStartInfo;
cmdProcess.Start();
Mono.Unix.Native.Syscall.dup2(stdout, 1);
Mono.Unix.Native.Syscall.close(stdout);
Stream s = new UnixStream(reading);
byte[] buf = new byte[1024];
int bytes = 0;
int current;
while((current = s.Read(buf, 0, buf.Length)) > 0)
{
bytes += current;
}
Mono.Unix.Native.Syscall.close(reading);
Console.WriteLine("{0} bytes read", bytes);
}
}
Under unix, file descriptors are inherited by child processes unless marked otherwise (close on exec). So, to redirect stdout of a child, all you need to do is change the file descriptor #1 in the parent process before calling exec. Unix also provides a handy thing called a pipe which is a unidirectional communication channel, with two file descriptors representing the two endpoints. For duplicating file descriptors, you can use dup or dup2 both of which create an equivalent copy of a descriptor, but dup returns a new descriptor allocated by the system and dup2 places the copy in a specific target (closing it if necessary). What the above code does, then:
Creates a pipe with endpoints reading and writing
Saves a copy of the current stdout descriptor
Assigns the pipe's write endpoint to stdout and closes the original
Starts the child process so it inherits stdout connected to the write endpoint of the pipe
Restores the saved stdout
Reads from the reading endpoint of the pipe by wrapping it in a UnixStream
Note, in native code, a process is usually started by a fork+exec pair, so the file descriptors can be modified in the child process itself, but before the new program is loaded. This managed version is not thread-safe as it has to temporarily modify the stdout of the parent process.
Since the code starts the child process without managed redirection, the .NET runtime does not change any descriptors or create any streams. So, the only reader of the child's output will be the user code, which uses a UnixStream to work around the StreamReader's encoding issue.
I checked what's happening with Reflector. It seems to me that StreamReader doesn't read from the underlying stream until you call Read on it, although it's created with a buffer size of 0x1000, so maybe it does. Luckily, until you actually read from it, you can safely get any buffered data out of it: it has a private byte[] field byteBuffer and two integer fields, byteLen and bytePos; the first says how many bytes are in the buffer, the second how many of them have been consumed (which should be zero). So first read this buffer with reflection, then create the BinaryReader.
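A sketch of that reflection step might look like this (the field names are the ones visible in Reflector for that framework version, so treat them as fragile internals):
using System;
using System.IO;
using System.Reflection;

// Pull out whatever the StreamReader has already buffered, then switch to raw reads.
StreamReader reader = cmdProcess.StandardOutput;
BindingFlags flags = BindingFlags.Instance | BindingFlags.NonPublic;

byte[] byteBuffer = (byte[])typeof(StreamReader).GetField("byteBuffer", flags).GetValue(reader);
int byteLen = (int)typeof(StreamReader).GetField("byteLen", flags).GetValue(reader);
int bytePos = (int)typeof(StreamReader).GetField("bytePos", flags).GetValue(reader);

// Bytes already pulled off the pipe but not yet consumed by the reader.
byte[] pending = new byte[byteLen - bytePos];
Array.Copy(byteBuffer, bytePos, pending, 0, pending.Length);

// From here on, read the rest of the output directly.
var br = new BinaryReader(reader.BaseStream);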
Maybe you can try something like this:
public class ThirdExe
{
private static TongueSvr _instance = null;
private Diagnostics.Process _process = null;
private Stream _messageStream;
private byte[] _recvBuff = new byte[65536];
private int _recvBuffLen;
private Queue<TonguePb.Msg> _msgQueue = new Queue<TonguePb.Msg>();
void StartProcess()
{
try
{
_process = new Diagnostics.Process();
_process.EnableRaisingEvents = false;
_process.StartInfo.FileName = "d:/code/boot/tongueerl_d.exe"; // Your exe
_process.StartInfo.UseShellExecute = false;
_process.StartInfo.CreateNoWindow = true;
_process.StartInfo.RedirectStandardOutput = true;
_process.StartInfo.RedirectStandardInput = true;
_process.StartInfo.RedirectStandardError = true;
_process.ErrorDataReceived += new Diagnostics.DataReceivedEventHandler(ErrorReceived);
_process.Exited += new EventHandler(OnProcessExit);
_process.Start();
_messageStream = _process.StandardInput.BaseStream;
_process.BeginErrorReadLine();
AsyncRead();
}
catch (Exception e)
{
Debug.LogError("Unable to launch app: " + e.Message);
}
}
private void AsyncRead()
{
_process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length
, new AsyncCallback(DataReceived), null);
}
void DataReceived(IAsyncResult asyncResult)
{
int nread = _process.StandardOutput.BaseStream.EndRead(asyncResult);
if (nread == 0)
{
Debug.Log("process read finished"); // process exit
return;
}
_recvBuffLen += nread;
Debug.LogFormat("recv data size.{0} remain.{1}", nread, _recvBuffLen);
ParseMsg();
AsyncRead();
}
void ParseMsg()
{
if (_recvBuffLen < 4)
{
return;
}
int len = IPAddress.NetworkToHostOrder(BitConverter.ToInt32(_recvBuff, 0));
if (len > _recvBuffLen - 4)
{
Debug.LogFormat("current call can't parse the NetMsg for data incomplete");
return;
}
TonguePb.Msg msg = TonguePb.Msg.Parser.ParseFrom(_recvBuff, 4, len);
Debug.LogFormat("recv msg count.{1}:\n {0} ", msg.ToString(), _msgQueue.Count + 1);
_recvBuffLen -= len + 4;
// Shift the unconsumed bytes to the front of the buffer before the next read.
Buffer.BlockCopy(_recvBuff, len + 4, _recvBuff, 0, _recvBuffLen);
_msgQueue.Enqueue(msg);
}
}
The key is _process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length, new AsyncCallback(DataReceived), null); and the very important part is that it turns the raw stream reads into an asynchronous callback, much like Process.OutputDataReceived, but with binary data.
I have the methods below to encode videos uploaded to a web site with ffmpeg. It works fine for videos up to 8-9 MB, but if the video is larger than that it hangs the web site, and the only way to recover is to restart IIS.
When I watch the process I can see that ffmpeg encodes the video and exits, and the resulting video is just fine. The problem starts as soon as ffmpeg exits.
The application runs on a Windows Server 2003 x86 web server with IIS 6.
Does anyone have experience encoding large files from an ASP.NET web app using ffmpeg?
public EncodedVideo EncodeVideo(VideoFile input, string encodingCommand, string outputFile)
{
EncodedVideo encoded = new EncodedVideo();
Params = string.Format("-i \"{0}\" {1} \"{2}\"", input.Path, encodingCommand, outputFile);
//string output = RunProcess(Params);
string output = RunProcessLargeFile(Params);
encoded.EncodingLog = output;
encoded.EncodedVideoPath = outputFile;
if (File.Exists(outputFile))
{
encoded.Success = true;
}
else
{
encoded.Success = false;
}
//System.Web.HttpContext.Current.Response.Write(Params);
return encoded;
}
private string RunProcessLargeFile(string Parameters)
{
/* The below will be the right solution...
 * The while loop which reads the stream is very important:
 * .NET does not keep handing FFmpeg more buffer space, so when converting
 * large files FFmpeg's output stream gets filled and it waits for .NET
 * to read from it, which never happens unless we drain it ourselves.
 * In order to use less memory, we clear the buffer periodically.
 **/
ProcessStartInfo oInfo = new ProcessStartInfo(this.FFmpegPath, Parameters);
oInfo.WorkingDirectory = Path.GetDirectoryName(this.FFmpegPath);
oInfo.UseShellExecute = false;
oInfo.CreateNoWindow = true;
oInfo.RedirectStandardOutput = true;
oInfo.RedirectStandardError = true;
Process proc = System.Diagnostics.Process.Start(oInfo);
StreamReader objStreamReader = proc.StandardError;
System.Text.StringBuilder sbOutPut = new System.Text.StringBuilder();
while (!proc.WaitForExit(1000))
{
sbOutPut.Append(objStreamReader.ReadToEnd().ToString());
}
if (proc.ExitCode == 0)
{
proc.Close();
if (objStreamReader != null)
{
objStreamReader.Close();
}
}
else
{
proc.Close();
if (objStreamReader != null) objStreamReader.Close();
}
return sbOutPut.ToString();
}
Smells like a typical deadlock.
You call proc.WaitForExit before the proc.StandardError.ReadToEnd method, and it seems to cause a deadlock.
A reference from MSDN:
A deadlock condition can result if the parent process calls WaitForExit before StandardOutput.ReadToEnd and the child process writes enough text to fill the redirected stream. The parent process would wait indefinitely for the child process to exit. The child process would wait indefinitely for the parent to read from the full StandardOutput stream.
So replacing your while loop with:
string output = objStreamReader.ReadToEnd();
proc.WaitForExit();
should solve your problem.
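If you keep both streams redirected, a slightly safer pattern (sketch only) is to drain standard output asynchronously while reading standard error to the end, so neither pipe can ever fill up; ffmpeg writes its progress log to stderr:
ProcessStartInfo oInfo = new ProcessStartInfo(this.FFmpegPath, Parameters);
oInfo.UseShellExecute = false;
oInfo.CreateNoWindow = true;
oInfo.RedirectStandardOutput = true;
oInfo.RedirectStandardError = true;

Process proc = Process.Start(oInfo);

// Drain stdout via events so it can never fill up and block ffmpeg...
proc.OutputDataReceived += (s, e) => { /* ignore or log e.Data */ };
proc.BeginOutputReadLine();

// ...while reading all of stderr (ffmpeg's log) synchronously.
string output = proc.StandardError.ReadToEnd();
proc.WaitForExit();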
I am not an ASP.NET person; I work with PHP, and in php.ini, which is a config file, I can set the maximum number of megabytes I am allowed to upload. Maybe check your config file for an upload-size setting and set it to the maximum.
I need to concatenate 3 files using C#. A header file, content, and a footer file, but I want to do this as cool as it can be done.
Cool = really small code or really fast (non-assembly code).
I agree with Mehrdad Afshari that his code is exactly the same as the implementation used in System.IO.Stream.CopyTo.
I would still wonder why he did not use that same function instead of rewriting its implementation:
string[] srcFileNames = { "file1.txt", "file2.txt", "file3.txt" };
string destFileName = "destFile.txt";
using (Stream destStream = File.OpenWrite(destFileName))
{
foreach (string srcFileName in srcFileNames)
{
using (Stream srcStream = File.OpenRead(srcFileName))
{
srcStream.CopyTo(destStream);
}
}
}
According to the disassembler (ILSpy), the default buffer size is 4096. The CopyTo function has an overload that lets you specify the buffer size in case you are not happy with 4096 bytes.
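For example, to copy with a larger buffer than the default:
// Copy with an 80 KB buffer instead of the 4096-byte default.
srcStream.CopyTo(destStream, 80 * 1024);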
const int BUFFER_SIZE = 32 * 1024;   // pick any reasonable chunk size

void CopyStream(Stream destination, Stream source) {
    int count;
    byte[] buffer = new byte[BUFFER_SIZE];
    while ((count = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, count);
}
CopyStream(outputFileStream, fileStream1);
CopyStream(outputFileStream, fileStream2);
CopyStream(outputFileStream, fileStream3);
If your files are text and not large, there's something to be said for dead-simple, obvious code. I'd use the following.
File.ReadAllText("file1") + File.ReadAllText("file2") + File.ReadAllText("file3");
If your files are large text files and you're on Framework 4.0, you can use File.ReadLines to avoid buffering the entire file.
File.WriteAllLines("out", new[] { "file1", "file2", "file3" }.SelectMany(File.ReadLines));
If your files are binary, see Mehrdad's answer.
Another way... how about letting the OS do it for you?
ProcessStartInfo psi = new ProcessStartInfo("cmd.exe",
String.Format(@" /c copy {0} + {1} + {2} {3}",
file1, file2, file3, dest));
psi.UseShellExecute = false;
Process process = Process.Start(psi);
process.WaitForExit();
You mean 3 text files?? Does the result need to be a file again?
How about something like:
string contents1 = File.ReadAllText(filename1);
string contents2 = File.ReadAllText(filename2);
string contents3 = File.ReadAllText(filename3);
File.WriteAllText(outputFileName, contents1 + contents2 + contents3);
Of course, with a StringBuilder and a bit of extra smarts, you could easily extend that to handle any number of input files :-)
Cheers
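Something along those lines, for instance (just a sketch):
using System.IO;
using System.Text;

static void ConcatTextFiles(string outputFileName, params string[] inputFileNames)
{
    var sb = new StringBuilder();
    foreach (string fileName in inputFileNames)
    {
        sb.Append(File.ReadAllText(fileName));
    }
    File.WriteAllText(outputFileName, sb.ToString());
}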
If you are in a Win32 environment, the most efficient solution could be to use the Win32 API function "WriteFile". There is an example in VB 6, but rewriting it in C# is not difficult.
I am uploading multiple files using HttpWebRequest's BeginGetRequestStream, but I want to update a progress control I have written while I push the data up the stream.
How should this be done? I have tried calling Dispatcher.BeginInvoke (as below) from within the loop that pushes the data into the stream, but it locks the browser until it has finished, so it seems to be in some sort of worker/UI-thread deadlock.
This is a code snippet of pretty much what I am doing:
class RequestState
{
public HttpWebRequest request; // holds the request
public FileDialogFileInfo file; // store our file stream data
public RequestState( HttpWebRequest request, FileDialogFileInfo file )
{
this.request = request;
this.file = file;
}
}
private void UploadFile( FileDialogFileInfo file )
{
UriBuilder ub = new UriBuilder( app.receiverURL );
ub.Query = string.Format( "filename={0}", file.Name );
// Open the selected file to read.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create( ub.Uri );
request.Method = "POST";
RequestState state = new RequestState( request, file );
request.BeginGetRequestStream( new AsyncCallback( OnUploadReadCallback ), state );
}
private void OnUploadReadCallback( IAsyncResult asynchronousResult )
{
RequestState state = (RequestState)asynchronousResult.AsyncState;
HttpWebRequest request = (HttpWebRequest)state.request;
Stream postStream = request.EndGetRequestStream( asynchronousResult );
PushData( state.file, postStream );
postStream.Close();
state.request.BeginGetResponse( new AsyncCallback( OnUploadResponseCallback ), state.request );
}
private void PushData( FileDialogFileInfo file, Stream output )
{
byte[] buffer = new byte[ 4096 ];
int bytesRead = 0;
Stream input = file.OpenRead();
while( ( bytesRead = input.Read( buffer, 0, buffer.Length ) ) != 0 )
{
output.Write( buffer, 0, bytesRead );
bytesReadTotal += bytesRead;
App app = App.Current as App;
int totalPercentage = Convert.ToInt32( ( bytesReadTotal / app.totalBytesToUpload ) * 100 );
// enabling the following locks up my UI and browser
Dispatcher.BeginInvoke( () =>
{
this.ProgressBarWithPercentage.Percentage = totalPercentage;
} );
}
}
I was going to say that I didn't think Silverlight 2's HttpWebRequest supported streaming, because the request data gets buffered into memory entirely. It had been a while since I last looked at it, though, so I went back to check whether Beta 2 supports it, and it turns out it does. I'm glad I re-read before stating otherwise. You can enable it by setting AllowReadStreamBuffering to false. Did you set this property on your HttpWebRequest? That could be causing your block.
MSDN Reference
File upload component for Silverlight and ASP.NET
Edit: I found another reference for you. You may want to follow its approach of breaking the file into chunks. It was written last March, so I'm not sure whether it will work in Beta 2 or not.
Thanks for that; I will take a look at those links. I was considering chunking my data anyway, since it seems to be the only way I can get any reasonable progress reports out of it.
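For anyone curious, the chunked approach would look roughly like this on my side (a sketch only; UploadChunk is a placeholder for posting one chunk with the same BeginGetRequestStream pattern as above, and the loop is assumed to run off the UI thread, like PushData does):
private void UploadInChunks(FileDialogFileInfo file)
{
    const int chunkSize = 64 * 1024;
    byte[] buffer = new byte[chunkSize];
    long totalSent = 0;

    using (Stream input = file.OpenRead())
    {
        long totalToSend = input.Length;
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            UploadChunk(file.Name, totalSent, buffer, read); // placeholder: POST one chunk
            totalSent += read;

            int percentage = (int)(totalSent * 100 / totalToSend);
            Dispatcher.BeginInvoke(() =>
            {
                this.ProgressBarWithPercentage.Percentage = percentage;
            });
        }
    }
}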