C# MemoryMappedFile: not enough resources

I'm doing operations on images. Some of these operations require me to create 3 different versions of the pixel data from the image and then later on combine them and do operations on the result.
For regular/small images the code works fine; I simply initialize my image raster data as new int[size].
However, for bigger images with a higher resolution (600, 1200, ...) the new int[size] throws an OutOfMemoryException when trying to allocate more than 2 GB, even though I've built it as 64-bit (not AnyCPU or 32-bit).
To resolve this issue, I first tried to create a MemoryMappedFile in memory itself. This also ran out of resources. Next I tried to create a MemoryMappedFile by first creating a file on disk and then creating an accessor over the complete file.
I'm still facing the "not enough resources" error with the temporary file on disk and the MemoryMappedFile/ViewAccessor.
Am I doing something wrong in the code below? I thought the MMF and accessor would handle the virtual memory paging automatically.
mmfPath = Path.GetTempFileName();
// create a file on disk first
using (var fs = File.OpenWrite(mmfPath))
{
    var widthBytes = new byte[width * 4];
    for (int y = 0; y < height; y++)
    {
        fs.Write(widthBytes, 0, widthBytes.Length);
    }
}
// open the file on disk as a MMF
_RasterData = MemoryMappedFile.CreateFromFile(mmfPath,
    FileMode.OpenOrCreate,
    Guid.NewGuid().ToString(),
    0, // 0 sets the capacity to the size of the file on disk
    MemoryMappedFileAccess.ReadWrite);
_RasterDataAccessor = _RasterData.CreateViewAccessor(); // <-- not enough memory resources
Not enough memory resources are available to process this command.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.MemoryMappedFiles.MemoryMappedView.CreateView(SafeMemoryMappedFileHandle memMappedFileHandle, MemoryMappedFileAccess access, Int64 offset, Int64 size)
at System.IO.MemoryMappedFiles.MemoryMappedFile.CreateViewAccessor(Int64 offset, Int64 size, MemoryMappedFileAccess access)
at System.IO.MemoryMappedFiles.MemoryMappedFile.CreateViewAccessor()
...
Even if I can resolve the problem above, I think I will run into the same issue again later, when I need to create a resulting bitmap out of the pixel data (2 GB limit).
The goal is working with big images (and temporary copies of their pixel data for raster/raster operations).
The current issue is that I'm getting "out of memory resources" with MemoryMappedFile, which I thought would get around the 2 GB limit, with Windows/the framework handling the virtual memory paging.
(.NET Framework 4.8, 64-bit build.)
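For comparison, here is a minimal sketch of mapping the raster in smaller views instead of one accessor over the whole file; the 256 MB window size is an arbitrary assumption and this only sketches the idea, it is not a confirmed fix:
// Sketch: map the file in fixed-size windows rather than one big view,
// so no single mapping has to cover the whole raster at once.
long fileLength = new FileInfo(mmfPath).Length;
long viewSize = 256L * 1024 * 1024; // 256 MB per view (arbitrary assumption)
using (var mmf = MemoryMappedFile.CreateFromFile(mmfPath, FileMode.Open))
{
    for (long offset = 0; offset < fileLength; offset += viewSize)
    {
        long size = Math.Min(viewSize, fileLength - offset);
        using (var accessor = mmf.CreateViewAccessor(offset, size))
        {
            int firstPixel = accessor.ReadInt32(0); // positions are relative to this window
        }
    }
}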

Related

Using memory stream is throwing out of memory exception

I have a requirement where I need to encrypt a file of 1-2 GB in an Azure Function. I am using the PgpCore library to encrypt the file in memory. The code below throws an out of memory exception if the file size is above 700 MB. Note: I am using an Azure Function; scaling up the App Service plan didn't help.
Is there any alternative to MemoryStream that I can use? After encryption, I am uploading the file into blob storage.
var privateKeyEncoded = Encoding.UTF8.GetString(Convert.FromBase64String(_options.PGPKeys.PublicKey));
using Stream privateKeyStream = StringToStreamUtility.GenerateStreamFromString(privateKeyEncoded);
privateKeyStream.Position = 0;
var encryptionKeys = new EncryptionKeys(privateKeyStream);
var pgp = new PGP(encryptionKeys);
//encrypt stream
var encryptStream = new MemoryStream();
await pgp.EncryptStreamAsync(streamToEncrypt, encryptStream);
MemoryStream is a Stream wrapper over a byte[] buffer. Every time that buffer is full, a new one with double the size is allocated and the data is copied. This eventually uses double the final buffer size (4 GB for a 2 GB file) but, worse, it results in such memory fragmentation that eventually the memory allocator can't find a new contiguous memory block to allocate. That's when you get an OOM.
While you could avoid OOM errors by specifying a capacity in the constructor, storing 2GB in memory before even starting to write it is very wasteful. With a real FileStream the encrypted bytes would be written out as soon as they were available.
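For illustration, a minimal sketch of that capacity approach; the assumption here is that the source stream's length is known up front:
// Pre-size the MemoryStream so its internal buffer is allocated once
// instead of repeatedly doubling and copying (everything is still kept in memory).
long expectedSize = streamToEncrypt.Length;               // assumption: the source stream is seekable
var encryptStream = new MemoryStream((int)expectedSize);  // capacity is an int, so this only works below 2 GB
await pgp.EncryptStreamAsync(streamToEncrypt, encryptStream);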
Azure Functions allow temporary storage. This means you can create a temporary file, open a stream on it and use it for encryption.
var tempPath = Path.GetTempFileName();
try
{
    using (var outputStream = File.Open(tempPath, FileMode.Create))
    {
        await pgp.EncryptStreamAsync(streamToEncrypt, outputStream);
        ...
    }
}
finally
{
    File.Delete(tempPath);
}
MemoryStream uses a byte[] internally, and any byte[] gets a bit brittle as it grows to around or above 1 GiB (although in theory a byte[] can be nearly 2 GiB, in reality this isn't a good idea and is rarely seen).
Frankly, MemoryStream simply isn't a good choice here; I'd suggest using a temporary file instead, with a FileStream. This doesn't attempt to keep everything in memory at once, and is more reliable at large sizes. Alternatively: avoid ever needing all the data at once, by performing the encryption in a pass-through streaming way.
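As a rough sketch of the temp-file route end to end (blobClient here is a hypothetical, already-configured Azure.Storage.Blobs BlobClient; it is not part of the original code): encrypt to a temp file, then stream the file into blob storage.
var tempPath = Path.GetTempFileName();
try
{
    // encrypt straight to disk instead of into a MemoryStream
    using (var outputStream = File.Open(tempPath, FileMode.Create))
    {
        await pgp.EncryptStreamAsync(streamToEncrypt, outputStream);
    }
    // upload from disk; UploadAsync reads from the FileStream, so nothing is buffered as one big byte[]
    using (var uploadStream = File.OpenRead(tempPath))
    {
        await blobClient.UploadAsync(uploadStream, overwrite: true);
    }
}
finally
{
    File.Delete(tempPath);
}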

Parallel read same file different segments from multiple threads

I need to read different sections of a large file (50-500 GB) from a network-attached storage platform, do some processing on them, and I need to do this very quickly.
I'm writing applications using .NET 5, Go and C++, and I publish my code for all platforms (Windows, Linux, macOS).
The parallel code below works fine when I publish it for Linux and macOS, and I get the benefit of parallel reading (like 4x-32x, depending on the number of CPU cores) compared to the single-thread method.
However, with the same hardware configuration and the same code, I don't get any performance benefit from the parallel method on a Windows machine compared to the single-thread method.
Another unexpected behavior is that when I write the same logic in Go for Linux platforms, different distros show different behaviors. For example, my code can do parallel reading on Ubuntu only if the storage device is mounted with the NFS protocol, whereas on CentOS it can do parallel reading with both configurations (NFS and block storage).
So I'm confused.
If the problem is the OS, then why can my Go code do parallel reads on NFS but not on block storage when using Ubuntu?
If the problem is the language (C# or Go), then why can the C# application do parallel reads on Linux (Ubuntu or CentOS) but not on Windows (Win Server 2019)?
If the problem is the protocol with which the network storage device is mounted, then how come I can achieve parallel reads in every scenario when I use CentOS?
You can also find the benchmark tools that I've prepared for this scenario below.
storage-benchmark-go
storage-benchmark-csharp
I know this question is a very niche one and only interests people who work with network storage devices, but I'll try my chance and see if some OS, storage, or software people can comment on this. Thanks all.
Single Thread Method in C#
//Store max of each section
int[] maxBuffer = new int[numberOfSections];
using (FileStream streamSource = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, FileOptions.Asynchronous))
{
    for (int index = 0; index < numberOfSections; index++)
    {
        byte[] sectionBuffer = new byte[1024L * 20L];
        streamSource.Position = (((long)sectionBuffer.Length + numberOfBytesToSkip) * (long)index) % streamSource.Length;
        streamSource.Read(sectionBuffer, 0, sectionBuffer.Length);
        maxBuffer[index] = sectionBuffer.Max();
    }
}
Console.WriteLine(maxBuffer.Sum());
Parallel Method C#
//Store max of each section
int[] maxBuffer = new int[numberOfSections];
Parallel.For(0, numberOfSections, index =>
{
using (FileStream streamSource = new FileStream(filePathOfLargeFile, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, FileOptions.Asynchronous))
{
byte[] sectionBuffer = new byte[1024L*20L];
streamSource.Position = (((long)sectionBuffer.Length + numberOfBytesToSkip) * (long)index)%streamSource.Length;
streamSource.Read(sectionBuffer, 0, sectionBuffer.Length);
maxBuffer[index] = sectionBuffer.Max();
}
});
Console.WriteLine(maxBuffer.Sum());
I'm attaching an image to visualize the implementation of the code above.
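For reference, one alternative worth benchmarking alongside the FileStream version above is reading each section through a memory-mapped view. This is only a sketch reusing the same variable names as the benchmark (numberOfSections, numberOfBytesToSkip, filePathOfLargeFile); whether it behaves any differently on Windows is exactly what the benchmark would have to show.
// Sketch: each iteration reads its section through its own memory-mapped view stream.
int[] maxBuffer = new int[numberOfSections];
long sectionLength = 1024L * 20L;
long fileLength = new FileInfo(filePathOfLargeFile).Length;
using (var mmf = MemoryMappedFile.CreateFromFile(filePathOfLargeFile, FileMode.Open, null, 0, MemoryMappedFileAccess.Read))
{
    Parallel.For(0, numberOfSections, index =>
    {
        long offset = ((sectionLength + numberOfBytesToSkip) * (long)index) % fileLength;
        byte[] sectionBuffer = new byte[sectionLength];
        using (var view = mmf.CreateViewStream(offset, Math.Min(sectionLength, fileLength - offset), MemoryMappedFileAccess.Read))
        {
            view.Read(sectionBuffer, 0, sectionBuffer.Length);
        }
        maxBuffer[index] = sectionBuffer.Max();
    });
}
Console.WriteLine(maxBuffer.Sum());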

System.IO.File.ReadAllBytes for file larger than 2GB

I have a large file that I need to copy to memory for further processing. The software works fine for files smaller than 2GB, but as soon as they pass this limit I get an exception that ReadAllBytes only supports files smaller than 2GB.
byte[] buffer = System.IO.File.ReadAllBytes(file); // exception if file > 2GB
What is the fastest way to copy a file larger than 2GB to memory?
The process is already 64-bit and the gcAllowVeryLargeObjects flag is already set.
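(For reference only, since the question says this is already in place: in a .NET Framework app.config that setting looks like the following.)
<configuration>
  <runtime>
    <!-- allows arrays larger than 2 GB in total size in a 64-bit process; per-dimension element count limits still apply -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>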
I doubt you can do anything faster than a memory mapped file http://msdn.microsoft.com/en-us/library/system.io.memorymappedfiles.memorymappedfile(v=vs.110).aspx.
using ( var file = MemoryMappedFile.CreateFromFile( "F:\\VeryLargeFile.data" ) )
{
}
You can then use CreateViewAccessor or CreateViewStream to manipulate the data.
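For example, a minimal sketch of reading the mapped file in chunks through CreateViewStream; the 64 MB chunk size and the processing step are placeholders, not part of the original answer:
using (var file = MemoryMappedFile.CreateFromFile("F:\\VeryLargeFile.data"))
{
    long length = new FileInfo("F:\\VeryLargeFile.data").Length;
    byte[] chunk = new byte[64 * 1024 * 1024]; // 64 MB working buffer (placeholder size)
    using (var stream = file.CreateViewStream(0, length))
    {
        int read;
        while ((read = stream.Read(chunk, 0, chunk.Length)) > 0)
        {
            // process the first 'read' bytes of 'chunk' here
        }
    }
}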

File.ReadAllBytes() throws OutOfMemoryException

I am loading small PDF files into the buffer and getting the OutOfMemoryException. A 220 KB file works fine; the next size I have tested is 4.50 MB and this file throws the exception. What is the maximum file size and what can I do to change it? 4.5 MB is not that much :-)
This is the related code:
ListViewDataItem dataItem = (ListViewDataItem)e.Item;
int i = dataItem.DisplayIndex;
byte[] buffer = File.ReadAllBytes(Session["pdfFileToSplit"].ToString());
string unique = Guid.NewGuid().ToString();
Session[unique] = buffer;
Panel thumbnailPanel = (Panel)e.Item.FindControl("thumbnails");
Thumbnail thumbnail = new Thumbnail();
thumbnail.SessionKey = unique;
thumbnail.Index = i+1;
thumbnail.DPI = 17;
thumbnail.BorderColor = System.Drawing.Color.Blue;
thumbnailPanel.Controls.Add(thumbnail);
OK, I just saw something really mysterious (to me). I uploaded a file below 10 MB and watched the memory used by the IIS server (w3wp.exe): nothing dramatic happened, a few MB up, a few down, everything worked fine. Then I tried the same thing with a 12 MB file. At the beginning it looked the same, but then, suddenly, out of nowhere, the memory used by w3wp.exe exploded to 1.5 GB and then the server crashed...
Is the OutOfMemoryException on the server side or the client side?
When you use Session[unique] = buffer, you're storing all the files (represented as byte arrays) simultaneously in your session.
That can be a lot of information.
If your session is "InProc", your server will probably run out of memory.
The limit is the memory of the machine.
When your request finishes, the memory stays allocated in the session. That's the problem. You should set Session[unique] = null if this isn't the desired behavior, so the server can release that memory. If you put in 10 files, 10 will be stored simultaneously in the session even after the requests finish. They will be released only when the session ends.
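As a quick sketch of that cleanup (assuming the same unique key is still at hand once the thumbnail no longer needs the bytes):
byte[] buffer = File.ReadAllBytes(Session["pdfFileToSplit"].ToString());
string unique = Guid.NewGuid().ToString();
Session[unique] = buffer;      // stored for the Thumbnail control
// ... once the thumbnail has been rendered and the bytes are no longer needed ...
Session.Remove(unique);        // or Session[unique] = null; so the server can reclaim the memory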

Loading saved byte array to memory stream causes out of memory exception

At some point in my program the user selects a bitmap to use as the background image of a Panel object. When the user does this, the program immediately draws the panel with the background image and everything works fine. When the user clicks "Save", the following code saves the bitmap to a DataTable object.
MyDataSet.MyDataTableRow myDataRow = MyDataSet.MyDataTableRow.NewMyDataTableRow(); //has a byte[] column named BackgroundImageByteArray
using (MemoryStream stream = new MemoryStream())
{
    this.Panel.BackgroundImage.Save(stream, ImageFormat.Bmp);
    myDataRow.BackgroundImageByteArray = stream.ToArray();
}
Everything works fine, there is no out of memory exception with this stream, even though it contains all the image bytes. However, when the application launches and loads saved data, the following code throws an Out of Memory Exception:
using (MemoryStream stream = new MemoryStream(myDataRow.BackgroundImageByteArray))
{
    this.Panel.BackgroundImage = Image.FromStream(stream);
}
The streams are the same length. I don't understand how one throws an out of memory exception and the other doesn't. How can I load this bitmap?
P.S. I've also tried
using (MemoryStream stream = new MemoryStream(myDataRow.BackgroundImageByteArray.Length))
{
    stream.Write(myDataRow.BackgroundImageByteArray, 0, myDataRow.BackgroundImageByteArray.Length); // throws the OoM exception here
}
The issue I think is here:
myDataRow.BackgroundImageByteArray = stream.ToArray();
Be advised that Stream.ToArray() will convert the stream to an array of bytes with length = stream.Length. Stream.Length is the size of the stream's buffer, which can be larger than the actual data that has been loaded into it. You can work around this by using Stream.ReadByte() in a while loop until it returns -1, indicating the end of the data within the stream.
You might give this library a look.
http://arraysegments.codeplex.com/
Project Description
Lightweight extension methods for ArraySegment, particularly useful for byte arrays.
Supports .NET 4.0 (client and full), .NET 4.5, Metro/WinRT, Silverlight 4 and 5, Windows Phone 7 and 7.5, all portable library profiles, and XBox.
