Sharing variable using MemoryMappedFile gives error - c#

I need to share a variable value from App "A" with different applications on the same system. I am required to use only MemoryMappedFiles. I created another simple App "B" which reads the value of the shared variable from App A.
But App "B" gives the error:
Unable to find the specified file.
Sample Apps A and B are written in C# using VS 2012.
On the button click event of App A, to share the variable Name:
private void button1_Click(object sender, EventArgs e)
{
    string Name = txtName.Text.ToString();
    int howManyBytes = Name.Length * sizeof(Char) + 4;
    label1.Text = howManyBytes.ToString();

    using (var MyText = MemoryMappedFile.CreateOrOpen("MyGlobalData", howManyBytes, MemoryMappedFileAccess.ReadWrite))
    {
        byte[] array1 = new byte[howManyBytes];
        array1 = GetBytes(Name);
        using (var accessor = MyText.CreateViewAccessor(0, array1.Length))
        {
            accessor.WriteArray(0, array1, 0, array1.Length);
        }
    }
}

static byte[] GetBytes(string str)
{
    byte[] bytes = new byte[str.Length * sizeof(char)];
    System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
    return bytes;
}
On the button click event of App B, to read the shared variable from "MyGlobalData":
try
{
    using (var mmf = MemoryMappedFile.OpenExisting("MyGlobalData"))
    {
        using (var accessor = mmf.CreateViewAccessor(0, 34))
        {
            byte[] array1 = new byte[34];
            accessor.ReadArray(0, array1, 0, array1.Length);
            string result = System.Text.Encoding.UTF8.GetString(array1);
            txtName.Text = result;
        }
    }
}
catch (Exception ex)
{
    MessageBox.Show(" Error :- " + ex.Message.ToString());
}
When trying to read the shared variable from App B, it gives the error:
Unable to find the specified file
Do I need to write the variable value to a text file and then share it?

Memory-mapped files come in two types:
Persisted memory-mapped files
Non-persisted memory-mapped files
You're creating a non-persisted memory-mapped file, so when the last process holding a handle to the MMF closes that handle, the MMF is destroyed. You're closing that handle by calling Dispose on the MMF through the using statement, and thus the MMF is destroyed.
You either need to keep the MMF open, or use a persisted memory-mapped file. To keep the MMF open, just don't dispose it.
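For example, here is a minimal sketch for App A: hold the MemoryMappedFile in a field for the lifetime of the form and dispose it only when the form closes. The fixed 1024-byte capacity and the 4-byte length prefix are illustrative choices, not from the original question; it assumes the same WinForms form as above plus a using System.IO.MemoryMappedFiles directive.

private MemoryMappedFile _sharedData;   // keep the MMF alive for the lifetime of App A

private void button1_Click(object sender, EventArgs e)
{
    byte[] data = GetBytes(txtName.Text);

    // Create the mapping once with a fixed capacity and keep it alive;
    // disposing it here would destroy the non-persisted MMF immediately.
    if (_sharedData == null)
        _sharedData = MemoryMappedFile.CreateOrOpen("MyGlobalData", 1024, MemoryMappedFileAccess.ReadWrite);

    using (var accessor = _sharedData.CreateViewAccessor(0, 1024))
    {
        accessor.Write(0, data.Length);                // store the length first
        accessor.WriteArray(4, data, 0, data.Length);  // then the string bytes
    }
}

protected override void OnFormClosed(FormClosedEventArgs e)
{
    if (_sharedData != null)
        _sharedData.Dispose();   // release the mapping only when App A shuts down
    base.OnFormClosed(e);
}

App B can then read the int at offset 0 to learn how many bytes to read from offset 4, instead of hard-coding 34.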

Related

Access Embedded Resource In Console App

I am embedding a .docx file into my Console App, and I want to be able to distribute the console.exe and have the users be able to access the .docx file inside it.
I have set the .docx file as a resource (see image) - however, if I try to "access" it by using Resources.Test.docx it seems as if it does not exist, and IntelliSense is not offering it as an option.
How should I do this in a C# console app?
EDIT
In WinForms I would embed a resource like this:
static void Main(string[] args)
{
    AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
    {
        string rn1 = new AssemblyName(args.Name).Name + ".docx";
        string rs1 = Array.Find(this.GetType().Assembly.GetManifestResourceNames(), element => element.EndsWith(rn1));
        using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(rs1))
        {
            Byte[] assemblydata = new Byte[stream.Length];
            stream.Read(assemblydata, 0, assemblydata.Length);
            return Assembly.Load(assemblydata);
        }
    };
}
And access the file like this:
Object oFName;
byte[] resourceFile = Properties.Resources.Report;
string destination = Path.Combine(Path.GetTempPath(), "Test.docx");
System.IO.File.WriteAllBytes(destination, resourceFile);
oFName = destination;
EDIT 2
If I try to use the code I use for WinForms (the AppDomain.CurrentDomain.AssemblyResolve handler), I receive the errors below:
A local or parameter named 'args' cannot be declared in this scope because that name is used in an enclosing local scope to define a local or parameter
Keyword 'this' is not valid in a static property, static method, or static field initializer
Your first method, with a little bit of modification, should be able to return the resource data. Here is basically what you have, modified slightly to just read the stream:
public static byte[] GetResourceData(string resourceName)
{
    var embeddedResource = Assembly.GetExecutingAssembly()
        .GetManifestResourceNames()
        .FirstOrDefault(s => string.Compare(s, resourceName, true) == 0);
    if (!string.IsNullOrWhiteSpace(embeddedResource))
    {
        using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(embeddedResource))
        {
            var data = new byte[stream.Length];
            stream.Read(data, 0, data.Length);
            return data;
        }
    }
    return null;
}
This method can be called using the resource name and will return all the bytes that are inside the embedded resource.
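For example, a minimal usage sketch; the resource name "MyConsoleApp.Test.docx" is hypothetical (manifest resource names are typically DefaultNamespace.FileName), and it needs System.IO:

byte[] docxBytes = GetResourceData("MyConsoleApp.Test.docx");
if (docxBytes != null)
{
    // Write the embedded document to a temp file so it can be opened externally.
    string destination = Path.Combine(Path.GetTempPath(), "Test.docx");
    File.WriteAllBytes(destination, docxBytes);
}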

Reading file contents from many processes at the same time

I have a class library that gets called from a Windows service; the class library can be called many times at the same time.
I have an issue where I have to read file contents in my class, so I get the error that the file is being used by another process when the class is called by many processes at once.
This is how I read the file content:
File.ReadAllBytes("path");
What is the best solution in this case?
Thank you
The following code demonstrates how to access a file by setting its share permissions. The first using block creates and writes the file; the second and third using blocks open and read it.
var fileName = "test.txt";
using (var fsWrite = new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.Write, FileShare.ReadWrite))
{
    var content = Encoding.UTF8.GetBytes("test");
    fsWrite.Write(content, 0, content.Length);
    fsWrite.Flush();

    using (var fsRead_1 = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        var bufRead_1 = new byte[fsRead_1.Length];
        fsRead_1.Read(bufRead_1, 0, bufRead_1.Length);
        Console.WriteLine("fsRead_1:" + Encoding.UTF8.GetString(bufRead_1));

        using (var fsRead_2 = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            var bufRead_2 = new byte[fsRead_2.Length];
            fsRead_2.Read(bufRead_2, 0, bufRead_2.Length);
            Console.WriteLine("fsRead_2:" + Encoding.UTF8.GetString(bufRead_2));
        }
    }
}
You need to synchronize the access to the whole file using standard thread synchronization approaches.
The simplest one is Monitor, via the lock statement:
public class A
{
    private static readonly object _sync = new object();

    public void DoStuff()
    {
        // All threads trying to enter this critical section will
        // wait until the first to enter exits it
        lock (_sync)
        {
            byte[] buffer = File.ReadAllBytes(@"C:\file.jpg");
        }
    }
}
Note
At first I understood that the OP was accessing the file from different processes, but when I double-checked the statement:
I have a class library that gets called from a windows service, the class library can be called many times at the same time.
...I realized the OP is calling a method which reads all the bytes from some file within the same Windows service instance.
Use a Mutex for syncing among different processes. File.ReadAllBytes uses FileAccess.Read and FileShare.Read when it reads the file, so normally you don't need any locks here. You get this exception because the file is being written somewhere (or is at least locked for writing).
Solution 1 - if you are the one who writes this file
private static Mutex mutex;

public void WriteFile(string path)
{
    Mutex mutex = GetOrCreateMutex();
    try
    {
        mutex.WaitOne();
        // TODO: ... write file
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}

public byte[] ReadFile(string path)
{
    // Note: If you just read the file, this lock is completely unnecessary
    // because File.ReadAllBytes uses Read access. This just protects the file
    // from being read and written at the same time.
    Mutex mutex = GetOrCreateMutex();
    try
    {
        mutex.WaitOne();
        return File.ReadAllBytes(path);
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}

private static Mutex GetOrCreateMutex()
{
    try
    {
        mutex = Mutex.OpenExisting("MyMutex");
    }
    catch (WaitHandleCannotBeOpenedException)
    {
        mutex = new Mutex(false, "MyMutex");
    }
    return mutex;
}
Remark: A reader-writer lock would be better here, because a file can safely be read in parallel as long as it is not being written; however, there is no built-in inter-process read-write lock in .NET, but one can be implemented with Mutex and Semaphore types.
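A crude sketch of that idea (an assumption-laden illustration, not a full reader-writer lock): readers each take one slot of a named Semaphore, and a writer, serialized by a named Mutex, drains every slot to get exclusive access. The names, slot count, and class are illustrative; it needs System.IO and System.Threading.

public static class FileGate
{
    private const int MaxReaders = 10;   // illustrative upper bound on concurrent readers

    private static Semaphore OpenGate()
    {
        // Named semaphore shared across processes; each reader takes one slot.
        return new Semaphore(MaxReaders, MaxReaders, "Global\\MyFileRwGate");
    }

    public static byte[] ReadShared(string path)
    {
        using (var gate = OpenGate())
        {
            gate.WaitOne();                       // take one reader slot
            try { return File.ReadAllBytes(path); }
            finally { gate.Release(); }
        }
    }

    public static void WriteShared(string path, byte[] data)
    {
        using (var writerLock = new Mutex(false, "Global\\MyFileWriterMutex"))
        using (var gate = OpenGate())
        {
            writerLock.WaitOne();                 // serialize writers among themselves
            try
            {
                for (int i = 0; i < MaxReaders; i++)
                    gate.WaitOne();               // drain every reader slot => exclusive
                try { File.WriteAllBytes(path, data); }
                finally { gate.Release(MaxReaders); }
            }
            finally { writerLock.ReleaseMutex(); }
        }
    }
}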
Solution 2 - if you just read the file
You must simply be prepared for the file to be locked while it is being written by a third process:
public byte[] TryReadFile(string path, int maxTry)
{
    Exception e = null;
    for (int i = 0; i < maxTry; i++)
    {
        try
        {
            return File.ReadAllBytes(path);
        }
        catch (IOException io)
        {
            e = io;
            Thread.Sleep(100);
        }
    }
    throw e; // or just return null
}
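A hypothetical call site (path and retry count are illustrative):

try
{
    // Retry up to five times, roughly half a second in total.
    byte[] data = TryReadFile(@"C:\data\settings.bin", 5);
    // ... use data
}
catch (IOException)
{
    // The file stayed locked for every attempt.
}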

capturing ALL stdout data using Process.Start [duplicate]

In C# (.NET 4.0 running under Mono 2.8 on SuSE) I would like to run an external batch command and capture its output in binary form. The external tool I use is called 'samtools' (samtools.sourceforge.net) and among other things it can return records from an indexed binary file format called BAM.
I use Process.Start to run the external command, and I know that I can capture its output by redirecting Process.StandardOutput. The problem is, that's a text stream with an encoding, so it doesn't give me access to the raw bytes of the output. The almost-working solution I found is to access the underlying stream.
Here's my code:
Process cmdProcess = new Process();
ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
cmdStartInfo.FileName = "samtools";
cmdStartInfo.RedirectStandardError = true;
cmdStartInfo.RedirectStandardOutput = true;
cmdStartInfo.RedirectStandardInput = false;
cmdStartInfo.UseShellExecute = false;
cmdStartInfo.CreateNoWindow = true;
cmdStartInfo.Arguments = "view -u " + BamFileName + " " + chromosome + ":" + start + "-" + end;
cmdProcess.EnableRaisingEvents = true;
cmdProcess.StartInfo = cmdStartInfo;
cmdProcess.Start();
// Prepare to read each alignment (binary)
var br = new BinaryReader(cmdProcess.StandardOutput.BaseStream);
while (!cmdProcess.StandardOutput.EndOfStream)
{
    // Consume the initial, undocumented BAM data
    br.ReadBytes(23);
    // ... more parsing follows
But when I run this, the first 23 bytes that I read are not the first 23 bytes of the output, but rather come from somewhere several hundred or thousand bytes downstream. I assume that StreamReader does some buffering, and so the underlying stream has already advanced, say, 4K into the output. The underlying stream does not support seeking back to the start.
And I'm stuck here. Does anyone have a working solution for running an external command and capturing its stdout in binary form? The output may be very large, so I would like to stream it.
Any help appreciated.
By the way, my current workaround is to have samtools return the records in text format, then parse those, but this is pretty slow and I'm hoping to speed things up by using the binary format directly.
Using StandardOutput.BaseStream is the correct approach, but you must not use any other property or method of cmdProcess.StandardOutput. For example, accessing cmdProcess.StandardOutput.EndOfStream will cause the StreamReader for StandardOutput to read part of the stream, removing the data you want to access.
Instead, simply read and parse the data from br (assuming you know how to parse the data, and won't read past the end of stream, or are willing to catch an EndOfStreamException). Alternatively, if you don't know how big the data is, use Stream.CopyTo to copy the entire standard output stream to a new file or memory stream.
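A minimal sketch of that approach, assuming the process was started with RedirectStandardOutput = true as in the question; the destination path is illustrative:

// Consume the raw bytes of stdout without touching any other member of
// cmdProcess.StandardOutput (which would pull data into its own buffer).
using (var output = File.Create("/tmp/region.bam"))   // illustrative destination
{
    cmdProcess.StandardOutput.BaseStream.CopyTo(output);
}
cmdProcess.WaitForExit();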
Since you explicitly specified running on SuSE Linux and Mono, you can work around the problem by using native Unix calls to create the redirection and read from the stream. For example:
using System;
using System.Diagnostics;
using System.IO;
using Mono.Unix;

class Test
{
    public static void Main()
    {
        int reading, writing;
        Mono.Unix.Native.Syscall.pipe(out reading, out writing);

        int stdout = Mono.Unix.Native.Syscall.dup(1);
        Mono.Unix.Native.Syscall.dup2(writing, 1);
        Mono.Unix.Native.Syscall.close(writing);

        Process cmdProcess = new Process();
        ProcessStartInfo cmdStartInfo = new ProcessStartInfo();
        cmdStartInfo.FileName = "cat";
        cmdStartInfo.CreateNoWindow = true;
        cmdStartInfo.Arguments = "test.exe";
        cmdProcess.StartInfo = cmdStartInfo;
        cmdProcess.Start();

        Mono.Unix.Native.Syscall.dup2(stdout, 1);
        Mono.Unix.Native.Syscall.close(stdout);

        Stream s = new UnixStream(reading);
        byte[] buf = new byte[1024];
        int bytes = 0;
        int current;
        while ((current = s.Read(buf, 0, buf.Length)) > 0)
        {
            bytes += current;
        }
        Mono.Unix.Native.Syscall.close(reading);
        Console.WriteLine("{0} bytes read", bytes);
    }
}
Under Unix, file descriptors are inherited by child processes unless marked otherwise (close on exec). So, to redirect stdout of a child, all you need to do is change file descriptor #1 in the parent process before calling exec. Unix also provides a handy thing called a pipe, which is a unidirectional communication channel with two file descriptors representing the two endpoints. For duplicating file descriptors, you can use dup or dup2, both of which create an equivalent copy of a descriptor; dup returns a new descriptor allocated by the system, while dup2 places the copy in a specific target (closing it if necessary). What the above code does, then:
Creates a pipe with endpoints reading and writing
Saves a copy of the current stdout descriptor
Assigns the pipe's write endpoint to stdout and closes the original
Starts the child process so it inherits stdout connected to the write endpoint of the pipe
Restores the saved stdout
Reads from the reading endpoint of the pipe by wrapping it in a UnixStream
Note, in native code, a process is usually started by a fork+exec pair, so the file descriptors can be modified in the child process itself, but before the new program is loaded. This managed version is not thread-safe as it has to temporarily modify the stdout of the parent process.
Since the code starts the child process without managed redirection, the .NET runtime does not change any descriptors or create any streams. So the only reader of the child's output is the user code, which uses a UnixStream to work around the StreamReader's encoding issue.
I checked out what's happening with Reflector. It seems to me that StreamReader doesn't read until you call Read on it, although it's created with a buffer size of 0x1000, so maybe it does. Luckily, until you actually read from it, you can safely get the buffered data out of it: it has a private field byte[] byteBuffer and two integer fields, byteLen and bytePos; the first is how many bytes are in the buffer, the second how many of them have been consumed (it should be zero). So first read this buffer with reflection, then create the BinaryReader.
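A rough sketch of that reflection idea, using System.Reflection and System.IO. The private field names (byteBuffer, byteLen, bytePos) are those of the desktop .NET Framework StreamReader mentioned above; other runtimes use different names, so GetField may return null there.

var reader = cmdProcess.StandardOutput;
var flags = BindingFlags.NonPublic | BindingFlags.Instance;
var bufferField = typeof(StreamReader).GetField("byteBuffer", flags);
if (bufferField != null)
{
    byte[] buffered = (byte[])bufferField.GetValue(reader);
    int byteLen = (int)typeof(StreamReader).GetField("byteLen", flags).GetValue(reader);
    int bytePos = (int)typeof(StreamReader).GetField("bytePos", flags).GetValue(reader);

    // Bytes already pulled from the pipe but not yet consumed by the reader:
    byte[] pending = new byte[byteLen - bytePos];
    Array.Copy(buffered, bytePos, pending, 0, pending.Length);
    // Prepend `pending` to whatever you read next from the BinaryReader.
}
var br = new BinaryReader(cmdProcess.StandardOutput.BaseStream);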
Maybe you can try something like this:
public class ThirdExe
{
    private static TongueSvr _instance = null;
    private Diagnostics.Process _process = null;
    private Stream _messageStream;
    private byte[] _recvBuff = new byte[65536];
    private int _recvBuffLen;
    private Queue<TonguePb.Msg> _msgQueue = new Queue<TonguePb.Msg>();

    void StartProcess()
    {
        try
        {
            _process = new Diagnostics.Process();
            _process.EnableRaisingEvents = false;
            _process.StartInfo.FileName = "d:/code/boot/tongueerl_d.exe"; // Your exe
            _process.StartInfo.UseShellExecute = false;
            _process.StartInfo.CreateNoWindow = true;
            _process.StartInfo.RedirectStandardOutput = true;
            _process.StartInfo.RedirectStandardInput = true;
            _process.StartInfo.RedirectStandardError = true;
            _process.ErrorDataReceived += new Diagnostics.DataReceivedEventHandler(ErrorReceived);
            _process.Exited += new EventHandler(OnProcessExit);
            _process.Start();
            _messageStream = _process.StandardInput.BaseStream;
            _process.BeginErrorReadLine();
            AsyncRead();
        }
        catch (Exception e)
        {
            Debug.LogError("Unable to launch app: " + e.Message);
        }
    }

    private void AsyncRead()
    {
        _process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length,
            new AsyncCallback(DataReceived), null);
    }

    void DataReceived(IAsyncResult asyncResult)
    {
        int nread = _process.StandardOutput.BaseStream.EndRead(asyncResult);
        if (nread == 0)
        {
            Debug.Log("process read finished"); // process exit
            return;
        }
        _recvBuffLen += nread;
        Debug.LogFormat("recv data size.{0} remain.{1}", nread, _recvBuffLen);
        ParseMsg();
        AsyncRead();
    }

    void ParseMsg()
    {
        if (_recvBuffLen < 4)
        {
            return;
        }
        int len = IPAddress.NetworkToHostOrder(BitConverter.ToInt32(_recvBuff, 0));
        if (len > _recvBuffLen - 4)
        {
            Debug.LogFormat("current call can't parse the NetMsg for data incomplete");
            return;
        }
        TonguePb.Msg msg = TonguePb.Msg.Parser.ParseFrom(_recvBuff, 4, len);
        Debug.LogFormat("recv msg count.{1}:\n {0} ", msg.ToString(), _msgQueue.Count + 1);
        _recvBuffLen -= len + 4;
        _msgQueue.Enqueue(msg);
    }
}
The key is _process.StandardOutput.BaseStream.BeginRead(_recvBuff, 0, _recvBuff.Length, new AsyncCallback(DataReceived), null); the important point is switching to asynchronous reads on the raw stream, rather than relying on a text-based event like Process.OutputDataReceived.

Keeping log files under a certain size

I have an application that is running on a stand-alone panel PC in a kiosk (C#/WPF). It performs some typical logging operations to a text file. The PC has some limited amount of disk space to store these logs as they grow.
What I need to do is be able to specify the maximum size that a log file is allowed to be. If, when attempting to write to the log, the max size is exceeded, new data will be written to the end of the log and the oldest data will be purged from the beginning.
Getting the file size is no problem, but are there any typical file manipulation techniques to keep a file under a certain size?
One technique to handle this is to have two log files which are half the maximum size each. You simply rotate between the two as you reach the max size of each file. Rotating to a file causes it to be overwritten with a new file.
A logging framework such as log4net has this functionality built in.
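If you do want to hand-roll it, here is a minimal sketch of the two-file rotation idea; the file names and size budget are illustrative, and it does nothing about concurrent writers.

using System;
using System.IO;

public class RotatingLog
{
    // Two alternating files, each holding at most half of the total budget.
    private readonly string[] _files = { "app.log.0", "app.log.1" };
    private readonly long _maxBytesPerFile;
    private int _current;

    public RotatingLog(long maxTotalBytes)
    {
        _maxBytesPerFile = maxTotalBytes / 2;
    }

    public void Write(string line)
    {
        var fi = new FileInfo(_files[_current]);
        if (fi.Exists && fi.Length >= _maxBytesPerFile)
        {
            // Rotate: switch to the other file and overwrite it,
            // discarding the oldest half of the log.
            _current = 1 - _current;
            File.WriteAllText(_files[_current], string.Empty);
        }
        File.AppendAllText(_files[_current], line + Environment.NewLine);
    }
}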
Try using Log4Net
http://www.codeproject.com/KB/aspnet/log4net.aspx
There's no easy way to strip data from the beginning of a file, so you have several options:
Keep the log in several smaller log files and delete the oldest "chunks" if the total size of all log files exceeds your limit. This is similar to what you want to do, but on a different level.
Rename the log file to "log.date" and start a new log. Similar to (1), but not an option if you have limited disk space.
If you have enough RAM and your log is small enough to fit in memory, you can do the following: map the whole file into memory using a memory-mapped file, then move the data from the middle of the file to the beginning, and truncate the file. This is the only way to easily strip data from the beginning of the log file without creating a copy of it; a minimal sketch follows below.
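A minimal sketch of that third option, assuming the file fits in memory and using System.IO and System.IO.MemoryMappedFiles; the method name and parameters are illustrative:

static void TrimLogStart(string path, long bytesToRemove)
{
    long length = new FileInfo(path).Length;
    long remaining = length - bytesToRemove;
    if (remaining <= 0)
        return;

    byte[] tail = new byte[remaining];
    using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
    using (var accessor = mmf.CreateViewAccessor(0, length))
    {
        accessor.ReadArray(bytesToRemove, tail, 0, (int)remaining);  // the data to keep
        accessor.WriteArray(0, tail, 0, (int)remaining);             // move it to the front
    }
    using (var fs = new FileStream(path, FileMode.Open, FileAccess.Write))
    {
        fs.SetLength(remaining);   // truncate the now-duplicated tail
    }
}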
Linux OS: check out logrotate - http://www.cyberciti.biz/faq/how-do-i-rotate-log-files/
Windows OS: try googling "windows logrotate", for example: http://blog.arithm.com/2008/02/07/windows-log-file-rotation/
I wanted a simple solution as well, but I didn't want to add another dependency, so I made a simple method. This has everything you need other than the part that compresses the old file to a zip, which you can find here: Create zip file in memory from bytes (text with arbitrary encoding)
static int iMaxLogLength = 2000; // Probably should be bigger, say 200,000
static int KeepLines = 5;        // minimum of how much of the old log to leave

public static void ManageLogs(string strFileName)
{
    try
    {
        FileInfo fi = new FileInfo(strFileName);
        if (fi.Length > iMaxLogLength) // if the log file length is already too long
        {
            var file = File.ReadAllLines(strFileName);
            var LineArray = file.ToList();
            var AmountToCull = (int)(LineArray.Count - KeepLines);
            var trimmed = LineArray.Skip(AmountToCull).ToList();
            File.WriteAllLines(strFileName, trimmed);
            string archiveName = strFileName + "-" + DateTime.Now.ToString("MM-dd-yyyy") + ".zip";
            File.WriteAllBytes(archiveName, Compression.Zip(string.Join("\n", file)));
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Failed to write to logfile : " + ex.Message);
    }
}
I have this as part of the initialization / reinitialization section of my application, so it gets run a few times a day.
ErrorLogging.ManageLogs("Application.log");
I wouldn't use this for a file meant to be over, say, 1 MB, and it's not terribly efficient, but it works well if you need to solve the pesky problem of a log file that you can't conveniently maintain. Make sure the log file exists before you use this, though... or you could add code for that, as well as checking that the location exists, etc.
// This is how to call it
private void buttonLog_Click(object sender, EventArgs e)
{
    c_Log.writeToFile(textBoxMessages.Text, "../../log.log", 1);
}

public static class c_Log
{
    static int iMaxLogLength = 15000;     // Probably should be bigger, say 200,000
    static int iTrimmedLogLength = -1000; // minimum of how much of the old log to leave

    static public void writeToFile(string strNewLogMessage, string strFile, int iLogLevel)
    {
        try
        {
            FileInfo fi = new FileInfo(strFile);
            Byte[] bytesSavedFromEndOfOldLog = null;

            if (fi.Length > iMaxLogLength) // if the log file length is already too long
            {
                using (BinaryReader br = new BinaryReader(File.Open(strFile, FileMode.Open)))
                {
                    // Seek to our required position of what you want saved.
                    br.BaseStream.Seek(iTrimmedLogLength, SeekOrigin.End);
                    // Read what you want to save and hang onto it.
                    bytesSavedFromEndOfOldLog = br.ReadBytes((-1 * iTrimmedLogLength));
                }
            }

            byte[] newLine = System.Text.ASCIIEncoding.ASCII.GetBytes(Environment.NewLine);

            FileStream fs = null;
            // If the log file is less than the max length, just open it at the end to write there
            if (fi.Length < iMaxLogLength)
                fs = new FileStream(strFile, FileMode.Append, FileAccess.Write, FileShare.Read);
            else // If the log file is more than the max length, just open it empty
                fs = new FileStream(strFile, FileMode.Create, FileAccess.Write, FileShare.Read);

            using (fs)
            {
                // If you are trimming the file length, write what you saved.
                if (bytesSavedFromEndOfOldLog != null)
                {
                    Byte[] lineBreak = Encoding.ASCII.GetBytes("### " + DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss") + " *** *** *** Old Log Start Position *** *** *** *** ###");
                    fs.Write(newLine, 0, newLine.Length);
                    fs.Write(newLine, 0, newLine.Length);
                    fs.Write(lineBreak, 0, lineBreak.Length);
                    fs.Write(newLine, 0, newLine.Length);
                    fs.Write(bytesSavedFromEndOfOldLog, 0, bytesSavedFromEndOfOldLog.Length);
                    fs.Write(newLine, 0, newLine.Length);
                }

                Byte[] sendBytes = Encoding.ASCII.GetBytes(strNewLogMessage);
                // Append your last log message.
                fs.Write(sendBytes, 0, sendBytes.Length);
                fs.Write(newLine, 0, newLine.Length);
            }
        }
        catch (Exception ex)
        {
            ; // Nothing to do...
            //writeEvent("writeToFile() Failed to write to logfile : " + ex.Message + "...", 5);
        }
    }
}

Get Binary data from a SQL Database

I have an ASP.NET (3.5) website. I have the following code that uploads a file as binary data to a SQL database:
Print("
protected void UploadButton_Click(object sender, EventArgs e)
{
//Get the posted file
Stream fileDataStream = FileUpload.PostedFile.InputStream;
//Get length of file
int fileLength = FileUpload.PostedFile.ContentLength;
//Create a byte array with file length
byte[] fileData = new byte[fileLength];
//Read the stream into the byte array
fileDataStream.Read(fileData, 0, fileLength);
//get the file type
string fileType = FileUpload.PostedFile.ContentType;
//Open Connection
WebSysDataContext db = new WebSysDataContext(Contexts.WEBSYS_CONN());
//Create New Record
BinaryStore NewFile = new BinaryStore();
NewFile.BinaryID = "1";
NewFile.Type = fileType;
NewFile.BinaryFile = fileData;
//Save Record
db.BinaryStores.InsertOnSubmit(NewFile);
try
{
db.SubmitChanges();
}
catch (Exception)
{
throw;
}
}");
The files that will be uploaded are PDFs. Can you please help me write the code to get the PDF out of the SQL database and display it in the browser? (I am able to get the binary data using a LINQ query, but I am not sure how to process the bytes.)
So are you really just after how to serve a byte array in ASP.NET? It sounds like the database part is irrelevant, given that you've said you are able to get the binary file with a LINQ query.
If so, look at HttpResponse.BinaryWrite. You should also set the content type of the response appropriately, e.g. application/pdf.
How big are the files? Huge buffers (i.e. byte[fileLength]) are usually a bad idea.
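For modestly sized files, a minimal sketch of the BinaryWrite approach, reusing the context and entity names from the question (WebSysDataContext, BinaryStores, BinaryFile) and assuming BinaryFile is exposed as a byte[] (if it is mapped as System.Data.Linq.Binary, call ToArray() on it first); the handler name is hypothetical and it needs System.Linq:

protected void ViewPdf_Click(object sender, EventArgs e)
{
    using (var db = new WebSysDataContext(Contexts.WEBSYS_CONN()))
    {
        // Pull the stored bytes back out with a LINQ query.
        byte[] fileData = db.BinaryStores
                            .Where(x => x.BinaryID == "1")
                            .Select(x => x.BinaryFile)
                            .Single();

        Response.Clear();
        Response.ContentType = "application/pdf";   // tell the browser it is a PDF
        Response.AddHeader("Content-Disposition", "inline; filename=document.pdf");
        Response.BinaryWrite(fileData);
        Response.End();
    }
}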
Personally, I'd look at things like this and this, which show reading/writing data as streams (the second shows pushing the stream as an http response). But updated to use varchar(max) ;-p
protected void Test_Click(object sender, EventArgs e)
{
    WebSysDataContext db = new WebSysDataContext(Contexts.WEBSYS_CONN());
    var GetFile = from x in db.BinaryStores
                  where x.BinaryID == "1"
                  select x.BinaryFile;

    FileStream MyFileStream;
    long FileSize;
    MyFileStream = new FileStream(GetFile, FileMode.Open);
    FileSize = MyFileStream.Length;
    byte[] Buffer = new byte[(int)FileSize];
    MyFileStream.Read(Buffer, 0, (int)FileSize);
    MyFileStream.Close();

    Response.Write("<b>File Contents: </b>");
    Response.BinaryWrite(Buffer);
}
I tried this and it did not work. I get a compile error on the line "MyFileStream = new FileStream(GetFile, FileMode.Open);".
I am not sure where I am going wrong; is it due to the way I have stored it?
When you store binary files in SQL Server it adds an OLE header to the binary data, so you must strip that header before actually using the byte[]. Here's how you do this.
// First strip out the OLE header
const int OleHeaderLength = 78;
byte[] rawData = (byte[])datarow["Field"];
int strippedDataLength = rawData.Length - OleHeaderLength;
byte[] strippedData = new byte[strippedDataLength];
Array.Copy(rawData, OleHeaderLength, strippedData, 0, strippedDataLength);
Once you run this code, strippedData will contain the actual file data. You can then use MemoryStream or FileStream to perform I/O on the byte[].
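For example, continuing from the snippet above (the output path is illustrative; needs System.IO):

// The stripped bytes can now be used like any other byte[],
// e.g. wrapped in a MemoryStream or written straight to disk.
using (var ms = new MemoryStream(strippedData))
{
    // ... pass ms to anything that expects a Stream
}
File.WriteAllBytes(@"C:\temp\output.pdf", strippedData);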
