Out of Memory issue on a 128 GB RAM x64 CPU - C#

I am working on a program that reads a 312 MB encrypted file into a MemoryStream, decrypts it, and copies the result into a destination stream. The program works fine with files of around 120 MB, and I can't figure out why it fails for this larger file.
My system info: 64-bit CPU, 128 GB RAM.
The C# project is built with the "Any CPU" setting in Configuration Manager.
I wrote a sample program to check at what size I start getting out of memory, and I see that it fails at around 512 MB. I know that a MemoryStream requires contiguous blocks of memory and that memory can be fragmented, but the RAM size is huge here. I also tried on multiple machines with 8 GB, 14 GB and 64 GB of RAM.
Any help is appreciated.
The sample program I wrote to find the size at which I get the OutOfMemoryException:
const int bufferSize = 4096;
byte[] buffer = new byte[bufferSize];
int fileSize = 1000 * 1024 * 1024;
int total = 0;
try
{
    using (MemoryStream memory = new MemoryStream())
    {
        // Keep appending 4 KB blocks until the target size is reached or allocation fails.
        while (total < fileSize)
        {
            memory.Write(buffer, 0, bufferSize);
            total += bufferSize;
        }
    }
    Console.WriteLine("No errors");
}
catch (OutOfMemoryException)
{
    Console.WriteLine("OutOfMemory around size : " + (total / (1024m * 1024m)) + " MB");
}

Just running out of Large Object Heap, I guess: a MemoryStream's internal buffer lives on the LOH once it grows past 85 KB, and it doubles whenever it fills up, so each growth step needs a fresh contiguous block roughly twice the size of the last one. However, another approach to solving your problem is to not read the stream into memory at all. Most decryption APIs just want a System.IO.Stream, so reading it into memory is a relatively pointless step; just pass your incoming file or network stream to the decryption API instead.
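A minimal sketch of that idea, assuming AES and a plain file-to-file decrypt (the key/IV handling, algorithm and file names here are placeholders, not taken from the question):

using System;
using System.IO;
using System.Security.Cryptography;

class StreamingDecrypt
{
    static void Main()
    {
        // Placeholder key/IV; use whatever your application actually derives.
        byte[] key = new byte[32];
        byte[] iv = new byte[16];

        using (Aes aes = Aes.Create())
        {
            aes.Key = key;
            aes.IV = iv;

            using (FileStream input = File.OpenRead("encrypted.bin"))
            using (CryptoStream decrypted = new CryptoStream(input, aes.CreateDecryptor(), CryptoStreamMode.Read))
            using (FileStream output = File.Create("decrypted.bin"))
            {
                // Data flows through in small chunks; no 312 MB buffer is ever allocated.
                decrypted.CopyTo(output);
            }
        }
    }
}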

Try disabling the "Prefer 32-bit" option in the project's properties, on the "Build" tab; that worked for me. With "Any CPU" and "Prefer 32-bit" checked, the process runs as 32-bit even on a 64-bit OS, so the address space is tiny no matter how much RAM the machine has.
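If you would rather set it in the project file, the equivalent MSBuild property is Prefer32Bit; a sketch for an old-style .csproj (the surrounding PropertyGroup and its condition are whatever your project already uses):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- false = run as a 64-bit process on a 64-bit OS, giving the full address space -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>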
Good luck!

Related

How to increase the allowed RAM usage for IIS Express?

First of all, I have seen this question:
IIS Express - increse memory limit
But it is not a duplicate, because all the answers there point to the 64-bit version of IIS Express. I need to support 32-bit, and a 32-bit process can only use about 2 GB of RAM by default.
I was debugging a strange problem, so I created a console application, which works fine:
// Simplified version of my application logic
var trash = new List<int>();
long mbUsed = 0;
while (mbUsed < 600)
{
    for (var i = 0; i < 100000; i++)
    {
        trash.Add(i);
    }
    GC.Collect();
    mbUsed = Process.GetCurrentProcess().WorkingSet64 / 1024 / 1024;
    Console.WriteLine(mbUsed + " MB used");
}

// Creating an image
var bitmap = new Bitmap(2000, 4000);
Basically it fills the RAM up to 600 MB and then tries to create a large image.
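For scale, a rough back-of-the-envelope for the unmanaged pixel buffer behind that Bitmap (assuming the default Format32bppArgb, i.e. 4 bytes per pixel; just arithmetic, not a measurement):

// new Bitmap(2000, 4000) with 32 bits per pixel:
long pixelBufferBytes = 2000L * 4000L * 4L;                      // 32,000,000 bytes
Console.WriteLine(pixelBufferBytes / (1024.0 * 1024.0) + " MB"); // ~30.5 MB, allocated as one contiguous block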
Now if I paste the same code into an MVC action, surprise, I get an OutOfMemoryException.
If I read it correctly, I am using less than 500 MB at that point.
So how can I use more RAM? For normal IIS I can change it on the application pool.

How to efficiently set the number of bytes to download for HttpWebRequest?

I'm currently working on a file downloader project. The application is designed to support resumable downloads. All downloaded data and its metadata (download ranges) are stored on disk immediately after each call to ReadBytes. Let's say I use the following code snippet:
var reader = new BinaryReader(response.GetResponseStream());
var buffr = reader.ReadBytes(_speedBuffer);
DownloadSpeed += buffr.Length;//used for reporting speed and zeroed every second
Here, _speedBuffer is the number of bytes to read per call, and it is set to a default value.
I have tested the application in two ways. First, by downloading a file hosted on a local IIS server: the speed is great. Second, I tried to download a copy of the same file from the internet, where my connection is quite slow. What I observed is that if I increase _speedBuffer, the download speed from the local server is good, but the reported speed for the internet copy is slow; if I decrease _speedBuffer, the reported speed for the internet copy is good, but not for the local server. So I thought, why not change _speedBuffer at runtime? But all the custom algorithms I came up with for adjusting the value were inefficient, and the download speed was still slow compared to other downloaders.
Is this approach OK?
Am I doing it the wrong way?
Should I stick with default value for _speedBuffer(byte count)?
The problem with ReadBytes in this case is that it attempts to read exactly that number of bytes, and only returns early when there is no more data to read at all.
So if you receive a packet containing 99 bytes of data, a call to ReadBytes(100) will wait for the next packet to arrive to supply the missing byte.
I wouldn't use a BinaryReader at all:
byte[] buffer = new byte[bufferSize];
using (Stream responseStream = response.GetResponseStream())
{
    int bytes;
    while ((bytes = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        DownloadSpeed += bytes; // used for reporting speed and zeroed every second
        // on each iteration, "bytes" bytes of the buffer have been filled, store these to disk
    }
    // bytes was 0: end of stream
}
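If the goal is speed reporting that does not depend on the buffer size, one option is to accumulate bytes and divide by elapsed time instead of zeroing a counter every second. A sketch under that assumption (the SpeedMeter class is hypothetical, not part of the question's code):

using System;
using System.Diagnostics;

// Hypothetical helper: accumulates downloaded bytes and reports an average rate.
class SpeedMeter
{
    private readonly Stopwatch _watch = Stopwatch.StartNew();
    private long _bytes;

    public void Add(int byteCount)
    {
        _bytes += byteCount;
    }

    // Average bytes per second since the download started.
    public double BytesPerSecond
    {
        get { return _bytes / Math.Max(_watch.Elapsed.TotalSeconds, 0.001); }
    }
}

// Inside the read loop: meter.Add(bytes);
// In a UI timer:        show meter.BytesPerSecond / 1024 / 1024 + " MB/s"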

Virtual COM port is very slow compared to terminal

I'm writing an application where I need to send a file (~600 kB) to another unit via a virtual serial port.
When I send it using a terminal application (TeraTerm) it takes less than 10 seconds, but using my program it takes 1-2 minutes.
My code is very simple:
port.WriteTimeout = 30000;
port.ReadTimeout = 5000;
port.WriteBufferSize = 1024 * 1024; // Buffer size larger than file size
...
fs = File.OpenRead(filename);
byte[] filedata = new byte[fs.Length];
fs.Read(filedata, 0, Convert.ToInt32(fs.Length));
...
for (int iter = 0; iter < filedata.Length; iter++) {
    port.Write(filedata, iter, 1);
}
Calling port.Write with the entire file length always seems to cause a write timeout for reasons unknown, so I'm writing 1 byte at a time.
Solved it. Here are the details in case someone else finds this; it might give some hints about what's wrong.
I was reading the file incorrectly: somehow the application ended up using \r\n as newlines when transferring, while the file itself is an Intel .hex file whose checksums were calculated using \r as newlines.
The checksum errors caused the other device to ACK very slowly, which, combined with the PC application now checking for checksum errors, made the transfer super slow.
If you have similar errors, I recommend using a software serial-port snoop to monitor what's actually being sent.
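As a side note on the one-byte-at-a-time loop above: once the data itself is correct, writing in larger chunks is usually much friendlier to the driver. A minimal sketch (the port settings, file name and chunk size are assumptions, not taken from the original code):

using System;
using System.IO;
using System.IO.Ports;

class ChunkedSend
{
    static void Main()
    {
        byte[] filedata = File.ReadAllBytes("firmware.hex"); // raw bytes, no newline translation
        const int chunkSize = 4096;                          // assumed chunk size

        using (var port = new SerialPort("COM3", 115200))
        {
            port.WriteTimeout = 30000;
            port.Open();

            for (int offset = 0; offset < filedata.Length; offset += chunkSize)
            {
                int count = Math.Min(chunkSize, filedata.Length - offset);
                port.Write(filedata, offset, count);
            }
        }
    }
}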

StreamReader.ReadToEnd causes massive memory usage / leaks

What it does: for each EncryptedBase64PictureFile, it reads the content, decrypts the Base64 string and creates a PictureBox.
Where the problem is: insane memory usage! I guess some data is not released properly after each loop iteration. For example, 100 iterations over roughly 100 MB of encrypted input, which should produce around 100 MB of image files, use around 1.5 GB of memory! And when I try to decrypt just a little more data, around 150 MB, I get an OutOfMemoryException. Visual Studio's memory profiling report says that the "string fileContent = reader.ReadToEnd();" line is responsible for 80% of the allocations.
foreach (var EncryptedBase64PictureFile in encryptedBase64PictureFiles) // collection name assumed
{
    Rijndael rijAlg = Rijndael.Create();
    rijAlg.Key = ASCIIEncoding.ASCII.GetBytes(sKey);
    rijAlg.IV = ASCIIEncoding.ASCII.GetBytes(sKey);

    FileStream fsread = new FileStream(EncryptedBase64PictureFile, FileMode.Open, FileAccess.Read);
    ICryptoTransform desdecrypt = rijAlg.CreateDecryptor();
    CryptoStream cryptostreamDecr = new CryptoStream(fsread, desdecrypt, CryptoStreamMode.Read);
    StreamReader reader = new StreamReader(cryptostreamDecr);

    string fileContent = reader.ReadToEnd(); // this should be the memory eater

    var ms = new MemoryStream(Convert.FromBase64String(fileContent));
    PictureBox myPictureBox = new PictureBox();
    myPictureBox.Image = Image.FromStream(ms);

    ms.Close();
    reader.Close();
    cryptostreamDecr.Close();
    fsread.Close();
}
So the question is: is there a way to deallocate the memory properly after each loop iteration? Or is the problem something else?
Thanks for every idea!
EDIT:
Of course I tried to Dispose() all 4 streams, but the result was the same...
ms.Dispose();
reader.Dispose();
cryptostreamDecr.Dispose();
fsread.Dispose();
EDIT:
Found the problem. It was not Dispose(), but creating the picture from the stream. After getting rid of the picture, memory usage went from 1.5 GB to 20 MB.
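For reference, a minimal sketch of releasing the picture explicitly (assuming the PictureBox is the only thing holding it; Image wraps unmanaged GDI+ memory that stays allocated until Dispose is called):

// Release the unmanaged GDI+ memory held by the old image.
if (myPictureBox.Image != null)
{
    myPictureBox.Image.Dispose();
    myPictureBox.Image = null;
}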
EDIT:
The pictures are about 500 kB as .jpg files and around 700 kB in encrypted Base64 form. But I really have no idea how big the PictureBox object is.
EDIT:
"100 loops with input around 100 MB" means that each loop handles around 1 MB; 100 MB is the total for all 100 loops.
Another answer: live with it.
As in: you work with 100 MB blocks in what appears to be a 32-bit application. This will not work without reusing buffers, due to the large object heap and general memory fragmentation.
As in: the memory is there, just not in large enough contiguous blocks. This results in allocation errors.
There is no real way around this except going 64-bit, where the larger address space handles the issue.
Information about this can be found at:
https://connect.microsoft.com/VisualStudio/feedback/details/521147/large-object-heap-fragmentation-causes-outofmemoryexception
https://www.simple-talk.com/dotnet/.net-framework/large-object-heap-compaction-should-you-use-it/
The second link describes a possible solution these days, enabling large object heap compaction:
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(); // This can be omitted
LOH operations are expensive, but 100 MB blocks flying around is not exactly a GC-recommended scenario. Not in 32-bit.
Use a Base64 transform when decrypting your stream. Do not use Convert.FromBase64String, as that requires all the data to be in memory.
using (FileStream f64 = File.Open(fileout, FileMode.Open))                                // content is in base64
using (var cs = new CryptoStream(f64, new FromBase64Transform(), CryptoStreamMode.Read))  // transform passed to constructor
using (var fo = File.Open(filein + ".orig", FileMode.Create))
{
    cs.CopyTo(fo); // stream is accessed as if it was already decrypted
}
Code sample taken from this related answer -
Convert a VERY LARGE binary file into a Base64String incrementally
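Applied to the loop above, the same idea can be chained so that neither the decrypted text nor a giant Base64 string is ever held in one piece. A sketch, reusing sKey and myPictureBox from the original code (the file-path variable is a placeholder):

using (Rijndael rijAlg = Rijndael.Create())
{
    rijAlg.Key = ASCIIEncoding.ASCII.GetBytes(sKey);
    rijAlg.IV = ASCIIEncoding.ASCII.GetBytes(sKey);

    using (var fsread = new FileStream(encryptedBase64PictureFile, FileMode.Open, FileAccess.Read))
    using (var decrypt = new CryptoStream(fsread, rijAlg.CreateDecryptor(), CryptoStreamMode.Read))
    using (var fromBase64 = new CryptoStream(decrypt, new FromBase64Transform(), CryptoStreamMode.Read))
    {
        // Image.FromStream needs a seekable stream, so the decoded image bytes still go
        // into one MemoryStream, but the intermediate string and byte[] copies are gone.
        var ms = new MemoryStream();
        fromBase64.CopyTo(ms);
        ms.Position = 0;
        myPictureBox.Image = Image.FromStream(ms);
        // The MemoryStream must stay alive as long as the Image is in use.
    }
}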

Why the buffer size changes my stream output?

I'm trying to stream a PDF file. Most of the files open without any problems, but sometimes it fails, and when it fails the file also looks smaller than the original. For example, I was trying to open a 47 KB file, but the output streamed to the browser was only 44.5 KB. When I check the size of the stream (result.FileStream), it is 47 KB, as it is supposed to be.
I'm using Stream.Read to output the file to the browser. When I had the problem, I was using a buffer size of 10000 bytes. However, when I changed the buffer size from 10000 to 1000, the problem disappeared and I was able to open the file. I cannot explain why the change in buffer size makes the streaming behave differently.
Here's the code I'm using; result.FileStream is of type Stream:
using (result.FileStream)
{
    int length;
    const int byteSize = 1000;
    var buffer = new byte[byteSize];
    while ((length = result.FileStream.Read(buffer, 0, byteSize)) > 0 && Response.IsClientConnected)
    {
        Response.OutputStream.Write(buffer, 0, length);
        Response.Flush();
    }
}
Response.Close();
Please enlighten me because I definitely don't understand something.
You're using Response.Close(), which seems to be much more evil than the documentation would make you believe.
http://forums.iis.net/t/1152058.aspx
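A common alternative (a sketch, not taken from the linked thread) is to let ASP.NET finish the response instead of tearing the connection down:

// Instead of Response.Close(): flush what was written and let the pipeline
// complete the request normally.
Response.Flush();
HttpContext.ApplicationInstance.CompleteRequest(); // in an MVC controller; elsewhere HttpContext.Current.ApplicationInstance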
