How do I prevent an unexpected disposal of Streams? (StreamReader, FileStream...) - c#

I have some code where I try to read a line from a mainframe file before downloading it from the host. I create an instance of the Stream class called reader, retrieve the FTP data stream from the host into it, and then copy the original stream's data into a second Stream object called readerCopy. My issue, I think, is that when I pass readerCopy into a method that retrieves some data from the stream (RetrieveDateFromFile), the resources for both readerCopy and reader are disposed of when the method ends. So when my calling method tries to use reader later on, it throws the following:
System.ObjectDisposedException: 'Cannot access a disposed object. Object name: 'System.Net.Sockets.NetworkStream''
I thought that wrapping all of the Stream objects in using statements would keep their resources from being disposed of until the end of those statements was reached, but it seems they might be disposed of sooner.
What am I missing?
Calling method:
public void FtpFile()
{
    // Gets the FTP data stream and stores it into reader; creates a new Stream object called readerCopy.
    using (Stream reader = request.GetResponse().GetResponseStream(), readerCopy = new MemoryStream())
    {
        if (reader != null)
        {
            reader.CopyTo(readerCopy); // Copies the original Stream to readerCopy.
            readerCopy.Position = 0;   // Sets the position to the beginning of the Stream.
            SMDPPITrigger trigger = new SMDPPITrigger(); // Custom class
            using (StreamReader fileReader = new StreamReader(readerCopy))
            {
                using (FileStream fileStream = new FileStream(ftpFileDestination, FileMode.Create))
                {
                    if (trigger.CheckIfExists(RetrieveDateFromFile(fileReader)) == false)
                        while (true)
                        {
                            bytesRead = reader.Read(buffer, 0, buffer.Length); // <--- error occurs here.
                            if (bytesRead == 0)
                                break;
                            fileStream.Write(buffer, 0, bytesRead);
                        }
                }
            }
        }
    }
}
Method to retrieve data from stream:
public DateTime RetrieveDateFromFile(StreamReader mainframeFile)
{
    string lineParsed = "";
    // StreamReader fileReader = new StreamReader(mainframeFile);
    for (int i = 0; i <= 2; i++)
        switch (i)
        {
            case 2:
                string line = mainframeFile.ReadLine();
                if (line != null)
                {
                    lineParsed = line.Substring(124);
                    break;
                }
                else
                {
                    break;
                }
            default:
            {
                mainframeFile.ReadLine();
                break;
            }
        }
    return DateTime.Parse(lineParsed);
}

I suspect the two lines of code below are causing the problem.
reader.CopyTo(readerCopy); // Copies the original stream to readerCopy.
readerCopy.Position = 0;   // Sets the position to the beginning of the stream.
Can you try specifying the length of the stream you want to copy into readerCopy?
e.g.
reader.CopyTo(readerCopy, 124);

Closing the loop on my question: the issue and resolution are exactly what @KlausGütter said. reader.CopyTo(readerCopy) left the original stream's position at the end, so my error wasn't really that the object had been disposed but that there was nothing left to read. Reading from the stream I had copied into, readerCopy, solved my issue, since that stream is seekable.
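A minimal sketch of that fix, reusing the question's members (request, ftpFileDestination, SMDPPITrigger, RetrieveDateFromFile) and assuming .NET 4.5+ for the leaveOpen overload of StreamReader:

public void FtpFile()
{
    using (Stream reader = request.GetResponse().GetResponseStream())
    using (MemoryStream readerCopy = new MemoryStream())
    {
        reader.CopyTo(readerCopy); // drains reader; the network stream is now positioned at its end
        readerCopy.Position = 0;   // the MemoryStream copy is seekable, so rewind it

        SMDPPITrigger trigger = new SMDPPITrigger();
        // leaveOpen: true keeps the StreamReader from disposing readerCopy with it.
        using (var fileReader = new StreamReader(readerCopy, Encoding.UTF8, true, 1024, leaveOpen: true))
        {
            if (trigger.CheckIfExists(RetrieveDateFromFile(fileReader)))
                return; // date already known: skip the download
        }

        readerCopy.Position = 0; // rewind again, then write the copy (not reader) to disk
        using (var fileStream = new FileStream(ftpFileDestination, FileMode.Create))
            readerCopy.CopyTo(fileStream);
    }
}

All subsequent reads come from readerCopy; the forward-only NetworkStream is never touched after the initial CopyTo.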

Related

Reading from a ZipArchiveEntry causes an exception and a memory leak when using a MemoryStream

I have the following code, which produces two kinds of errors. First, as written, I get the exception 'NotSupportedException: This stream from ZipArchiveEntry does not support reading.' How am I supposed to read the data?
Furthermore, if I use a MemoryStream (as in the commented code), I can read the data and deserialize it correctly, but the MemoryStream I created still remains in memory even after Dispose has been called on it, causing a memory leak. Any idea what is wrong with this code?
void Main()
{
    List<Product> products;
    using (var s = GetDb().Result)
    {
        products = Utf8Json.JsonSerializer.Deserialize<List<Product>>(s).ToList();
    }
}

// Define other methods and classes here
public static Task<Stream> GetDb()
{
    var filepath = Path.Combine("c:/users/tom/Downloads", "productdb.zip");
    using (var archive = ZipFile.OpenRead(filepath))
    {
        var data = archive.Entries.Single(e => e.FullName == "productdb.json");
        return Task.FromResult(data.Open());
        //using (var reader = new StreamReader(data.Open()))
        //{
        //    var ms = new MemoryStream();
        //    data.Open().CopyTo(ms);
        //    ms.Seek(0, SeekOrigin.Begin);
        //    return Task.FromResult((Stream)ms);
        //}
    }
}
With the commented code you open the stream into a reader, never use the reader, then open the entry a second time and copy that second stream into the MemoryStream without ever closing it.
It is that second opened stream that remains in memory, not the MemoryStream.
Refactor:
public static async Task<Stream> GetDb() {
    var filepath = Path.Combine("c:/users/tom/Downloads", "productdb.zip");
    using (var archive = ZipFile.OpenRead(filepath)) {
        var entry = archive.Entries.Single(e => e.FullName == "productdb.json");
        using (var stream = entry.Open()) {
            var ms = new MemoryStream();
            await stream.CopyToAsync(ms);
            ms.Position = 0; // rewind so the caller reads from the beginning
            return ms;
        }
    }
}
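Hypothetical usage of the refactored method (Product and the Utf8Json call come from the question): because the entry's contents were fully copied into the MemoryStream, the stream outlives the already-disposed archive and can be read safely:

List<Product> products;
using (var s = await GetDb())
{
    products = Utf8Json.JsonSerializer.Deserialize<List<Product>>(s);
}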

.Net BinaryWriter automatically flushes stream when checking stream Position

Has anyone else seen this, or does anyone know how to turn it off?
I have code that periodically checks the Position of the stream underlying a BinaryWriter. Every read of the BinaryWriter.BaseStream.Position property causes that stream's Flush method to be invoked.
I tried both a BinaryWriter and a StreamWriter, and only the BinaryWriter exhibited this behavior.
Some sample code below:
using System;
using System.IO;
using System.Text;

namespace FlushaholicStreams
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var stream = new PrivateStream())
            using (var writer = new BinaryWriter(stream))
            {
                var data = "hi there, this is a really long string. Very very very long";
                for (int i = 0; i < 19; i++)
                {
                    data += data;
                }
                for (int j = 0; j < 8; j++)
                {
                    var bytes = Encoding.ASCII.GetBytes(data);
                    writer.Write(bytes);
                    var position = writer.BaseStream.Position;
                    Console.WriteLine("The position was {0}", position);
                }
                Console.WriteLine("Flushed the stream {0} times", stream.FlushCount);
            }
            Console.WriteLine("All done");
        }
    }

    class PrivateStream : MemoryStream
    {
        public int FlushCount = 0;
        public int CloseCount = 0;

        public override void Close()
        {
            this.CloseCount++;
            Console.WriteLine("Closing the stream");
            base.Close();
        }

        public override void Flush()
        {
            this.FlushCount++;
            Console.WriteLine("Flushing the stream");
            base.Flush();
        }
    }
}
That code yields the output:
Flushing the stream
The position was 30932992
Flushing the stream
The position was 61865984
Flushing the stream
The position was 92798976
Flushing the stream
The position was 123731968
Flushing the stream
The position was 154664960
Flushing the stream
The position was 185597952
Flushing the stream
The position was 216530944
Flushing the stream
The position was 247463936
Flushed the stream 8 times
Closing the stream
Closing the stream
All done
I'm using .NET 4.5.
It looks like the BinaryWriter class forces this excessive flushing and there's no way to override it. I'll just keep a reference to the original stream and check Position on it directly.
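A minimal sketch of that workaround, reusing the PrivateStream class from the sample above:

using (var stream = new PrivateStream())
using (var writer = new BinaryWriter(stream))
{
    writer.Write(Encoding.ASCII.GetBytes("some payload"));
    // Read Position from your own reference to the stream; BinaryWriter.BaseStream
    // (and the Flush hidden inside its getter) is never invoked.
    Console.WriteLine("The position was {0}", stream.Position);
}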
I found the (alleged) source code here:
http://reflector.webtropy.com/default.aspx/4#0/4#0/DEVDIV_TFS/Dev10/Releases/RTMRel/ndp/clr/src/BCL/System/IO/BinaryWriter#cs/1305376/BinaryWriter#cs
/*
 * Returns the stream associate with the writer. It flushes all pending
 * writes before returning. All subclasses should override Flush to
 * ensure that all buffered data is sent to the stream.
 */
public virtual Stream BaseStream {
    get {
        Flush();
        return OutStream;
    }
}

// Clears all buffers for this writer and causes any buffered data to be
// written to the underlying device.
public virtual void Flush()
{
    OutStream.Flush();
}
The framework is prone to flushing, yes. That's bad because it forces disk access. (Subjective note: It's a design flaw.)
Write yourself a Stream that wraps another stream. In your wrapper class you override the necessary methods to maintain the Position yourself in an instance field. That way the Position member of the wrapped stream does not need to be accessed at all.
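A minimal sketch of such a wrapper (the class name and layout are mine; it tracks the position in a field, so reading Position never calls into the inner stream):

using System;
using System.IO;

class PositionTrackingStream : Stream
{
    private readonly Stream inner;
    private long position;

    public PositionTrackingStream(Stream inner) { this.inner = inner; }

    public override bool CanRead { get { return inner.CanRead; } }
    public override bool CanSeek { get { return inner.CanSeek; } }
    public override bool CanWrite { get { return inner.CanWrite; } }
    public override long Length { get { return inner.Length; } }

    // The locally maintained counter: no Flush, no call into the inner stream.
    public override long Position
    {
        get { return position; }
        set { inner.Position = value; position = value; }
    }

    public override void Flush() { inner.Flush(); }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int n = inner.Read(buffer, offset, count);
        position += n;
        return n;
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        position = inner.Seek(offset, origin);
        return position;
    }

    public override void SetLength(long value) { inner.SetLength(value); }

    public override void Write(byte[] buffer, int offset, int count)
    {
        inner.Write(buffer, offset, count);
        position += count;
    }
}

Hand the wrapper to the BinaryWriter, keep your own reference to it, and read wrapper.Position instead of writer.BaseStream.Position. Since BinaryWriter writes straight through to its underlying stream, the tracked count stays accurate.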

FileStream create or append issue

I'm having issues with FileStreams. I'm writing a C# serial interface for an FPGA project I'm working on, which receives a packet (containing 16 bytes), creates a file and writes the bytes to it, and subsequently appends to the created file.
The program doesn't throw any errors, but it doesn't appear to get past creating the file and never writes any data to it.
Any ideas? Is there a better way to open-or-append a file?
Thanks in advance,
Michael
private void SendReceivedDataToFile(int sendBytes)
{
    if (saveFileCreated == false)
    {
        FileStream writeFileStream = new FileStream(tbSaveDirectory.Text, FileMode.Create);
        writeFileStream.Write(oldData, 0, sendBytes);
        writeFileStream.Flush();
        writeFileStream.Close();
        saveFileCreated = true;
        readByteCount = readByteCount + sendBytes;
    }
    else
    {
        using (var writeFilestream2 = new FileStream(tbSaveDirectory.Text, FileMode.Append))
        {
            writeFilestream2.Write(oldData, 0, sendBytes);
            writeFilestream2.Flush();
            writeFilestream2.Close();
            readByteCount = readByteCount + sendBytes;
        }
    }
    if (readByteCount == readFileSize) // all data has been received, so close the file.
    {
        saveFileCreated = false;
    }
}
FileMode.Append already means "create or append", so you really only need the else {} part of your if. You also don't need to call Flush() or Close(): disposing the stream does that for you.
Not sure about the data not being written... did you try tracing through your code?
So first I would reduce your code to
private void SendReceivedDataToFile(int sendBytes)
{
    using (var fs = new FileStream(tbSaveDirectory.Text, FileMode.Append))
        fs.Write(oldData, 0, sendBytes);
    readByteCount += sendBytes;
}
then try to figure out what exactly is in oldData.

Cannot access a closed stream ASP.NET v2.0

We have a very odd problem. The code below works fine on all developers' machines and on our two test servers, both from source and as a built version; however, when it runs on a virtual machine with Windows Server 2003 and ASP.NET v2.0 it throws the error:
Cannot access a closed stream.
public String convertResultToXML(CResultObject[] state)
{
    MemoryStream stream = null;
    TextWriter writer = null;
    try
    {
        stream = new MemoryStream(); // read xml in memory
        writer = new StreamWriter(stream, Encoding.Unicode);
        // get serialise object
        XmlSerializer serializer = new XmlSerializer(typeof(CResultObject[]));
        serializer.Serialize(writer, state); // read object
        int count = (int)stream.Length; // saves object in memory stream
        byte[] arr = new byte[count];
        stream.Seek(0, SeekOrigin.Begin);
        // copy stream contents in byte array
        stream.Read(arr, 0, count);
        UnicodeEncoding utf = new UnicodeEncoding(); // convert byte array to string
        return utf.GetString(arr).Trim();
    }
    catch
    {
        return string.Empty;
    }
    finally
    {
        if (stream != null) stream.Close();
        if (writer != null) writer.Close();
    }
}
Any idea why it would do this?
For your Serialize call, use a using block so the stream doesn't remain open.
Something like this:
using (StreamWriter streamWriter = new StreamWriter(fullFilePath))
{
    xmlSerializer.Serialize(streamWriter, toSerialize);
}
I originally thought it was because you close the stream and then close the writer; you should just close the writer, because it will close the stream as well: http://msdn.microsoft.com/en-us/library/system.io.streamwriter.close(v=vs.80).aspx.
However, despite MSDN's claim, I can't see any evidence that it actually does this when reflecting over the code.
Looking at your code, though, I can't see why you're using the writer in the first place. I'll bet if you change your code as follows (I've taken out the bad exception swallowing too) it'll be alright:
public String convertResultToXML(CResultObject[] state)
{
    using (var stream = new MemoryStream())
    {
        XmlSerializer serializer = new XmlSerializer(typeof(CResultObject[]));
        serializer.Serialize(stream, state); // Serialize(Stream, ...) writes UTF-8 by default
        int count = (int)stream.Length;
        byte[] arr = new byte[count];
        stream.Seek(0, SeekOrigin.Begin);
        // copy stream contents into the byte array
        stream.Read(arr, 0, count);
        // decode as UTF-8 to match what the serializer wrote
        return Encoding.UTF8.GetString(arr).Trim();
    }
}
Now you're working with the stream directly, and it'll only get closed once, most definitely getting rid of this strange error, which I'll wager has something to do with a service pack or similar.
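If a string is all that's needed, a shorter alternative (my sketch, not from the answer) is to serialize to a StringWriter and skip the byte array entirely:

public String convertResultToXML(CResultObject[] state)
{
    XmlSerializer serializer = new XmlSerializer(typeof(CResultObject[]));
    using (var writer = new StringWriter())
    {
        serializer.Serialize(writer, state); // a StringWriter declares UTF-16 in the XML header
        return writer.ToString().Trim();
    }
}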

DataContractSerializer doesn't overwrite all data

I've noticed that when I persist an object back to a file using a DataContractSerializer, if the new XML is shorter than the XML originally in the file, the remnants of the original XML beyond the length of the new XML remain in the file and break it.
Does anyone have a good solution to fix this?
Here's the code I am using to persist the object:
/// <summary>
/// Flushes the current instance of the given type to the datastore.
/// </summary>
private void Flush()
{
    try
    {
        string directory = Path.GetDirectoryName(this.fileName);
        if (!Directory.Exists(directory))
        {
            Directory.CreateDirectory(directory);
        }

        FileStream stream = null;
        try
        {
            stream = new FileStream(this.fileName, FileMode.OpenOrCreate);
            for (int i = 0; i < 3; i++)
            {
                try
                {
                    using (XmlDictionaryWriter writer = XmlDictionaryWriter.CreateTextWriter(stream, new System.Text.UTF8Encoding(false)))
                    {
                        stream = null;
                        // The serializer is initialized upstream.
                        this.serializer.WriteObject(writer, this.objectValue);
                    }
                    break;
                }
                catch (IOException)
                {
                    Thread.Sleep(200);
                }
            }
        }
        finally
        {
            if (stream != null)
            {
                stream.Dispose();
            }
        }
    }
    catch
    {
        // TODO: Localize this
        throw;
        //throw new IOException(String.Format(CultureInfo.CurrentCulture, "Unable to save persistable object to file {0}", this.fileName));
    }
}
It's because of how you are opening your stream with:
stream = new FileStream(this.fileName, FileMode.OpenOrCreate);
Try using:
stream = new FileStream(this.fileName, FileMode.Create);
See the FileMode documentation.
I believe this is due to using FileMode.OpenOrCreate: if the file already exists, it is opened and the data is overwritten from the first byte onward, leaving any old bytes beyond the new length in place. If you change to FileMode.Create, any existing file is truncated first.
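If the stream really must be opened with FileMode.OpenOrCreate for some other reason, an alternative sketch is to truncate it explicitly before writing:

stream = new FileStream(this.fileName, FileMode.OpenOrCreate);
stream.SetLength(0); // drop any leftover bytes from a previous, longer payload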
