How to increase the allowed RAM usage for IIS Express? - c#

First of all, I have seen this question:
IIS Express - increse memory limit
But it is not a duplicate, because all the answers there point to the 64-bit version of IIS Express. I need to support 32-bit! And a 32-bit process can use up to 2 GB of RAM.
I was debugging a strange problem, so I created a console application, which works fine:
// Simplified version of my application logic
var trash = new List<int>();
long mbUsed = 0;
while (mbUsed < 600)
{
    for (var i = 0; i < 100000; i++)
    {
        trash.Add(i);
    }
    GC.Collect();
    mbUsed = Process.GetCurrentProcess().WorkingSet64 / 1024 / 1024;
    Console.WriteLine(mbUsed + " MB used");
}

// Creating an image
var bitmap = new Bitmap(2000, 4000);
Basically it fills the RAM up to 600 MB and then tries to create a large image.
Now if I paste the same code into an MVC action, surprise, I get an OutOfMemoryException:
If I read it correctly, I am using less than 500 MB.
So how can I use more RAM? For normal IIS I can change it on the application pool.
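For reference, a minimal diagnostic that can be dropped into the action to confirm the worker process bitness and current memory usage (standard BCL only, nothing IIS-specific assumed):
// Requires System and System.Diagnostics; output shows up in the debugger.
var process = Process.GetCurrentProcess();
Debug.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Debug.WriteLine("Working set:    " + (process.WorkingSet64 / 1024 / 1024) + " MB");
Debug.WriteLine("Private bytes:  " + (process.PrivateMemorySize64 / 1024 / 1024) + " MB");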

Related

Parallel read same file different segments from multiple threads

I need to read different sections of a large file (50-500 GB) from a network-attached storage platform, do some processing on it, and I need to do this very quickly.
I'm writing applications using .NET 5, Go and C++, and I publish my code for all platforms (Windows, Linux, macOS).
The parallel code below works fine when I publish it for Linux and macOS, and I get the benefit of parallel reading (around 4x-32x depending on the number of CPU cores) compared to the single-thread method.
However, with the same hardware configuration and the same code, I don't get any performance benefit on a Windows machine with the parallel method compared to the single-thread method.
Another unexpected behavior is that when I write the same logic in Go for Linux platforms, different distros show different behaviors. For example, my code can do parallel reading on Ubuntu only if the storage device is mounted with the NFS protocol. However, with CentOS it can do parallel reading with both configurations (NFS and block storage).
So I'm confused.
If the problem is the OS, then why can my Go code do parallel reads on NFS but not on block storage when using Ubuntu?
If the problem is the language (C# or Go), then why can the C# application do parallel reads on Linux (Ubuntu or CentOS) but not on Windows (Windows Server 2019)?
If the problem is the protocol the network storage device is mounted with, then how come I can achieve parallel reads in every scenario when I use CentOS?
You can also find the benchmark tools that I've prepared for this scenario below:
storage-benchmark-go
storage-benchmark-csharp
I know this question is a very niche one and only interests people who work with network storage devices, but I'll take my chances in case some OS, storage, or software people can comment on this. Thanks all.
Single Thread Method in C#
// Store max of each section
int[] maxBuffer = new int[numberOfSections];
using (FileStream streamSource = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, FileOptions.Asynchronous))
{
    for (int index = 0; index < numberOfSections; index++)
    {
        byte[] sectionBuffer = new byte[1024L * 20L];
        streamSource.Position = (((long)sectionBuffer.Length + numberOfBytesToSkip) * (long)index) % streamSource.Length;
        streamSource.Read(sectionBuffer, 0, sectionBuffer.Length);
        maxBuffer[index] = sectionBuffer.Max();
    }
}
Console.WriteLine(maxBuffer.Sum());
Parallel Method in C#
// Store max of each section
int[] maxBuffer = new int[numberOfSections];
Parallel.For(0, numberOfSections, index =>
{
    using (FileStream streamSource = new FileStream(filePathOfLargeFile, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, FileOptions.Asynchronous))
    {
        byte[] sectionBuffer = new byte[1024L * 20L];
        streamSource.Position = (((long)sectionBuffer.Length + numberOfBytesToSkip) * (long)index) % streamSource.Length;
        streamSource.Read(sectionBuffer, 0, sectionBuffer.Length);
        maxBuffer[index] = sectionBuffer.Max();
    }
});
Console.WriteLine(maxBuffer.Sum());
I'm attaching an image to visualize the implementation of the code above.
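For anyone experimenting with the Windows case, one variation worth trying (purely a hypothesis, not a confirmed fix) is opening the FileStream with FileOptions.RandomAccess instead of FileOptions.Asynchronous, so the OS is hinted that the reads are not sequential; a minimal sketch of the parallel method with only that change:
// Sketch: identical to the parallel method above, except the FileStream is
// opened with FileOptions.RandomAccess to hint the OS cache manager that
// the access pattern is non-sequential.
int[] maxBuffer = new int[numberOfSections];
Parallel.For(0, numberOfSections, index =>
{
    using (FileStream streamSource = new FileStream(filePathOfLargeFile, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, FileOptions.RandomAccess))
    {
        byte[] sectionBuffer = new byte[1024L * 20L];
        streamSource.Position = (((long)sectionBuffer.Length + numberOfBytesToSkip) * (long)index) % streamSource.Length;
        streamSource.Read(sectionBuffer, 0, sectionBuffer.Length);
        maxBuffer[index] = sectionBuffer.Max();
    }
});
Console.WriteLine(maxBuffer.Sum());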

Downloaded size is more than the real size

I have written a program with C# and the Microsoft API that shows the amount of data you have downloaded, but it doesn't tell me the correct downloaded size. For example: if I download a 10 MB file, the program shows 11 MB downloaded.
I also checked the network status window, and it reports the same value as my program.
Why? Does other software, or my ISP, measure it the same way I do?
objIPInterfaceStatistics2 = objNetworkInterface[numberinterface].GetIPStatistics();
long newBytesreceived;
newBytesreceived = objIPInterfaceStatistics2.BytesReceived;
if (checkdata == true)
{
    checkdata = false;
    newBytesreceived = 0;
}
long newUsage = newBytesreceived - oldBytesreceived2;
trafficusage += newUsage;
float converttrafficusage = trafficusage / 1000000;
oldBytesreceived2 = objIPInterfaceStatistics2.BytesReceived;
worker.ReportProgress((int)Math.Ceiling(converttrafficusage));
Thread.Sleep(1000);
I can just assume that the two values are calculated differently.
In general, 1 megabyte refers to 1,000,000 bytes, whereas 1 mebibyte refers to 2^20 = 1,048,576 bytes. Normally, megabyte is used as it is easier to calculate with.
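For example, converting the same byte count with both definitions (a minimal sketch; the 11,000,000 figure is just an illustrative value):
// Sketch: the same byte count expressed as megabytes (SI) and mebibytes (binary).
long bytesReceived = 11000000;                        // example counter value
double megabytes = bytesReceived / 1000000.0;         // 1 MB  = 1,000,000 bytes
double mebibytes = bytesReceived / (1024.0 * 1024.0); // 1 MiB = 1,048,576 bytes
Console.WriteLine("{0:F2} MB vs {1:F2} MiB", megabytes, mebibytes);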
To be sure, some example code from you showing where the downloaded packages are being counted would be good ^^

Out of Memory issue on a 128 GB RAM x64 CPU

I am working on a program that reads a 312 MB encrypted file into a memory stream, decrypts it and copies it into a destination stream. My program works well with file sizes of around 120 MB. I couldn't figure out why it is failing for this one.
My system info: 64-bit CPU, 128 GB RAM.
I also built the C# code using the Any CPU setting in Configuration Manager.
I wrote a sample program to check where I am running out of memory, and I see that it's failing at 512 MB. I do know that a MemoryStream requires contiguous blocks of memory and that memory can get fragmented. But the RAM size is huge here, and I also tried on multiple machines with 14 GB, 64 GB and 8 GB of RAM.
Any help is appreciated.
The sample program I wrote to test the out-of-memory size:
const int bufferSize = 4096;
byte[] buffer = new byte[bufferSize];
int fileSize = 1000 * 1024 * 1024;
int total = 0;
try
{
    using (MemoryStream memory = new MemoryStream())
    {
        while (total < fileSize)
        {
            memory.Write(buffer, 0, bufferSize);
            total += bufferSize;
        }
    }
    Console.WriteLine("No errors");
}
catch (OutOfMemoryException)
{
    Console.WriteLine("OutOfMemory around size : " + (total / (1024m * 1024.0m)) + "MB");
}
You're probably just running out of Large Object Heap. (Note that a growing MemoryStream doubles its internal buffer, so a 512 MB stream briefly needs roughly 1 GB of contiguous address space, which a 32-bit process can easily fail to find.) However, another approach to solving your problem is to not read the stream into memory at all: most decryption APIs just want a System.IO.Stream, so reading it into memory is a relatively pointless step. Just pass your incoming file or network stream to the decryption API instead.
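A minimal sketch of that streaming approach, using AES via CryptoStream as a stand-in for whatever decryption API is actually in use (key, iv, sourcePath and destinationPath are placeholders):
// Requires System.IO and System.Security.Cryptography.
// Sketch: decrypt straight from the source file into the destination file,
// without buffering the whole payload in a MemoryStream.
using (Aes aes = Aes.Create())
using (ICryptoTransform decryptor = aes.CreateDecryptor(key, iv))
using (FileStream encrypted = File.OpenRead(sourcePath))
using (CryptoStream cryptoStream = new CryptoStream(encrypted, decryptor, CryptoStreamMode.Read))
using (FileStream destination = File.Create(destinationPath))
{
    cryptoStream.CopyTo(destination); // copies in small chunks; no 300+ MB buffer needed
}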
Try disabling the "Prefer 32-bit" option in the project's properties, under the "Build" tab; that worked for me.
Good luck!

Virtual COM port is very slow compared to terminal

I'm writing an application where I need to send a file (~600 kB) to another unit via a virtual serial port.
When I send it using a terminal application (TeraTerm) it takes less than 10 seconds, but using my program it takes 1-2 minutes.
My code is very simple:
port.WriteTimeout = 30000;
port.ReadTimeout = 5000;
port.WriteBufferSize = 1024 * 1024; // Buffer size larger than file size
...
fs = File.OpenRead(filename);
byte[] filedata = new byte[fs.Length];
fs.Read(filedata, 0, Convert.ToInt32(fs.Length));
...
for (int iter = 0; iter < filedata.Length; iter++) {
    port.Write(filedata, iter, 1);
}
Calling port.Write with the entire file length seems to always cause a write timeout for an unknown reason, so I'm writing 1 byte at a time.
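A middle ground between one huge Write call and single-byte writes is sending the file in modest chunks; a minimal sketch reusing the port and filename from above (the 1 KB chunk size is an arbitrary assumption to tune for the device):
// Sketch: write the file in fixed-size chunks instead of one byte at a time.
const int chunkSize = 1024;
byte[] filedata = File.ReadAllBytes(filename);
for (int offset = 0; offset < filedata.Length; offset += chunkSize)
{
    int count = Math.Min(chunkSize, filedata.Length - offset);
    port.Write(filedata, offset, count);
}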
Solved it. Here are the details in case someone else finds this; it might give some hints on what's wrong.
I was reading the file wrong: somehow the application used \r\n as newlines when transferring, while the file itself is an Intel .hex file whose checksums were calculated using \r as newlines.
The checksum errors caused the other device to ACK very slowly, which made the transfer super slow, combined with the PC application now having to handle the checksum errors.
If you have similar errors, I recommend using a software serial snoop to monitor what's actually being sent.

File.ReadAllBytes() throws OutOfMemoryException

I am loading small PDF files into a buffer and getting an OutOfMemoryException. A file size of 220 KB works fine; the next size I tested is 4.50 MB, and this file throws the exception. What is the maximum file size, and what can I do to change it? 4.5 MB is not that much :-)
This is the related code:
ListViewDataItem dataItem = (ListViewDataItem)e.Item;
int i = dataItem.DisplayIndex;
byte[] buffer = File.ReadAllBytes(Session["pdfFileToSplit"].ToString());
string unique = Guid.NewGuid().ToString();
Session[unique] = buffer;
Panel thumbnailPanel = (Panel)e.Item.FindControl("thumbnails");
Thumbnail thumbnail = new Thumbnail();
thumbnail.SessionKey = unique;
thumbnail.Index = i+1;
thumbnail.DPI = 17;
thumbnail.BorderColor = System.Drawing.Color.Blue;
thumbnailPanel.Controls.Add(thumbnail);
OK, I just saw something really mysterious (for me). I uploaded a file below 10 MB and watched the memory used by the IIS server (w3wp.exe); nothing dramatic happened, a few MB up, a few down, everything worked fine. Then I tried the same thing with a 12 MB file. At the beginning it looked the same, but then, suddenly, out of nowhere, the memory used by w3wp.exe exploded to 1.5 GB and the server crashed...
Is the OutOfMemoryException on the server side or the client side?
When you use Session[unique] = buffer, you're storing all the files (represented as byte arrays) simultaneously in your session.
That can be a lot of information.
If your session is "InProc", your server will probably run out of memory.
The limit is the memory of the machine.
When your request finishes, the memory stays allocated in the session. That's the problem. You should set Session[unique] = null if this isn't the desired behavior, so the server can release the memory. If you put in 10 files, all 10 will be stored in the session simultaneously, even after the requests finish. They will be released only when the session ends.
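For illustration, a minimal sketch of releasing the buffer once the thumbnail no longer needs it (where exactly this fits in the page lifecycle is an assumption, not taken from the question):
// Sketch: drop the byte[] from session state as soon as it is no longer needed,
// so the server can reclaim the memory before the session expires.
Session[unique] = null;   // releases the reference...
Session.Remove(unique);   // ...or remove the key entirely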
