Lock / Use / Allocate memory - C#

Is there any option to lock or allocate memory in C#?
Scenario 1:
My virtual machine has 16 GB of RAM. For a test I need to use 8 GB, so that 8 GB remain 'free' for the operating system and the rest of the applications.
Scenario 2:
The same virtual machine with 16 GB of RAM, but now I need to use 14 GB.
For now I have written a memory-leak function, but this is no good because it eventually takes all the memory in my virtual machine:
List<byte[]> memoryUsage = new List<byte[]>();
while (true)
{
    try
    {
        memoryUsage.Add(new byte[1024]); // grows the heap 1 KB at a time
    }
    catch (OutOfMemoryException)
    {
        throw; // rethrowing immediately means this catch adds nothing
    }
}
What I want is to allocate or lock an amount of RAM that the user (me) inputs. E.g. I want to allocate/lock 8 GB of RAM, the program allocates/locks 8 GB, and the other 8 GB remain 'free'.
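A minimal sketch of what such an allocator could look like (illustrative code, not from the original post): it allocates the requested amount in 64 MB chunks and writes to each 4 KB page so the memory is actually committed rather than lazily backed by zero pages. It must run as a 64-bit process to get past the ~3 GB 32-bit ceiling.

using System;
using System.Collections.Generic;

class MemoryHog
{
    const int ChunkSize = 64 * 1024 * 1024; // 64 MB per chunk (arbitrary choice)

    static void Main()
    {
        long targetBytes = 8L * 1024 * 1024 * 1024; // e.g. 8 GB, as in scenario 1
        var chunks = new List<byte[]>();

        for (long allocated = 0; allocated < targetBytes; allocated += ChunkSize)
        {
            var chunk = new byte[ChunkSize];
            // Touch each 4 KB page so the OS actually commits it to the
            // working set instead of handing back demand-zero pages.
            for (int i = 0; i < chunk.Length; i += 4096)
                chunk[i] = 1;
            chunks.Add(chunk);
        }

        Console.WriteLine("Allocated ~{0} GB; press Enter to release.", targetBytes >> 30);
        Console.ReadLine();
        GC.KeepAlive(chunks); // keep the list reachable until this point
    }
}

Note that this only allocates the memory; as the answer below explains, it does not stop the OS from paging it out again.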

If you're running this on a Windows box, you need to keep in mind the concept of virtual memory. You can allocate - or leak - memory to your heart's content in C#; however, if the underlying operating system deems it safe to page the memory being used by your process out to the paging file (assuming at least one such file is defined), it will do so.
Let's take your first scenario - the one where you want 8 GB of RAM allocated. Your code can do precisely that. However, your OS can kick in and move some of the pages representing your allocated data from RAM to disk. There are several reasons why the memory manager would do this - take a look at some here (under the "Pages can be removed from a process working set..." paragraph). You'll thus be left with less RAM in use than you originally intended.
To my understanding, you're after a constant working set for your process, and I'm not sure C# - even in an unsafe context - allows you to do that. You could try invoking Win32 functions that work at a low level via P/Invoke, as stated here, but like I said, I'm not sure it will work.
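For what it's worth, a sketch of that P/Invoke route might look like the following. VirtualLock and SetProcessWorkingSetSize are real kernel32 functions, but I haven't verified that Windows will honor a lock of this size in every configuration, and the working-set padding values below are guesses, not documented requirements.

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class LockedAllocation
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool VirtualLock(IntPtr lpAddress, UIntPtr dwSize);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(
        IntPtr hProcess, IntPtr dwMinimumWorkingSetSize, IntPtr dwMaximumWorkingSetSize);

    static void Main()
    {
        long bytes = 1L << 30; // 1 GB for illustration

        // VirtualLock fails if the locked region would not fit inside the
        // minimum working set, so raise the limits first.
        IntPtr self = Process.GetCurrentProcess().Handle;
        if (!SetProcessWorkingSetSize(self, (IntPtr)(bytes + (64L << 20)), (IntPtr)(bytes + (128L << 20))))
            throw new InvalidOperationException("SetProcessWorkingSetSize failed: " + Marshal.GetLastWin32Error());

        // Allocate unmanaged memory and ask the OS to keep it resident.
        IntPtr block = Marshal.AllocHGlobal((IntPtr)bytes);
        if (!VirtualLock(block, (UIntPtr)(ulong)bytes))
            throw new InvalidOperationException("VirtualLock failed: " + Marshal.GetLastWin32Error());

        Console.WriteLine("Locked {0} MB of unmanaged memory; press Enter to exit.", bytes >> 20);
        Console.ReadLine();
    }
}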
You'll also have to keep an eye on the variable(s) that reference your allocated data. If the GC (garbage collector) decides the data you've allocated is no longer required from some point on (say, because your remaining code no longer references it), it will happily reclaim it, again leaving you with less allocated memory than you originally wanted.
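A tiny illustration of guarding against that (my own example, not from the question):

using System;
using System.Threading;

class KeepAliveDemo
{
    static void Main()
    {
        byte[] block = new byte[256 * 1024 * 1024]; // 256 MB test allocation

        // Nothing below reads 'block', so without the KeepAlive call the GC
        // would be free to collect it while the test window is still open.
        Thread.Sleep(60000); // stand-in for the actual measurement window

        GC.KeepAlive(block); // extends the allocation's lifetime to this point
    }
}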
You haven't really said anything about the target platform you used to build this. A process built for x86 (or AnyCPU with 'Prefer 32-bit') will hit an out-of-memory error much earlier - after allocating only around 3 GB - since that's how things work in WOW64.
If you don't really have to write the code that's doing the allocation yourself in C#, maybe you can merely invoke something like Testlimit (and play around with setting the minimum working set).

Related

Does the Mono/.NET GC release freed memory back to the OS after collection? If not, why not?

I have heard many times that once a C# managed program requests more memory from the OS, it doesn't free it back unless the system is out of memory. E.g. when an object is collected, it gets deleted, and the memory that was occupied by the object is free to be reused by another managed object, but the memory itself is not returned to the operating system (for example, Mono on Unix wouldn't call brk/sbrk to decrease the amount of virtual memory available to the process back to what it was before the allocation).
I don't know if this really happens or not, but I can see that my C# applications, running on Linux, use a small amount of memory at the beginning; then, when I do something memory-expensive, they allocate more of it. But later on, when all the objects have been deleted (I can verify that by putting a debug message in the destructors), the memory is not freed. On the other hand, no more memory is allocated when I run that memory-expensive operation again. The program just keeps eating the same amount of memory until it is terminated.
Maybe it is just my misunderstanding of how the GC in .NET works, but if it really does work like this, why is that? What is the benefit of keeping the allocated memory for later instead of returning it to the system? How can it even know whether the system needs it back or not? What about other applications that crash or can't start because of an OOM caused by this effect?
I know that people will probably answer something like "the GC manages memory better than you ever could, just don't worry about it", or "the GC knows what it does best", or "it doesn't matter at all, it's just virtual memory". But it does matter: on my 2 GB laptop I run out of memory (and the kernel OOM killer gets triggered because of it) very often when I have been running C# applications for some time, precisely because of this irresponsible memory management.
Note: I was testing all of this with Mono on Linux, because I have a really hard time understanding how Windows manages memory, so debugging on Linux is much easier for me. Also, Linux memory management is open source code; the memory management of the Windows kernel / .NET is rather a mystery to me.
The memory manager works this way because there is no benefit to having a lot of unused system memory when you don't need it.
If the memory manager always tried to have as little memory allocated as possible, it would do a lot of work for no reason. That would only slow the application down, and the only benefit would be more free memory that no application is using.
Whenever the system needs more memory, it will tell the running applications to return as much as possible. The same signal is also sent to an application when you minimise it.
If this doesn't work the same with Mono in Linux, then that is a problem with that specific implementation.
Generally, if an app needs memory once, it will need it again. Releasing memory back to the OS only to request it again is overhead, and if nothing else wants the memory, why bother? The runtime is optimizing for the very likely scenario of needing it again. Additionally, releasing memory back requires entire, contiguous blocks that can be handed over, which has a very specific impact on things like compaction: it isn't quite as simple as "hey, I'm not using most of this: have it back". The runtime needs to figure out which blocks can be released, presumably after a full collect-and-compact (relocating objects, etc.) cycle.
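For completeness, here is roughly what asking for that collect-and-compact cycle looks like in code on .NET 4.5.1 and later - a sketch of the mechanism, not something the runtime does for you automatically:

using System;
using System.Runtime;

class CompactOnce
{
    static void Main()
    {
        // Request a one-off compaction of the large object heap on the next
        // blocking full collection (available since .NET 4.5.1).
        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();                   // full blocking collection
        GC.WaitForPendingFinalizers();
        GC.Collect();                   // collect objects freed by finalizers

        Console.WriteLine("Compaction requested; unused segments may now be returned to the OS.");
    }
}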

Windows - WPF Process out of memory crash with plenty of available physical memory

I have a WPF desktop app that crashed with the following exception:
System.Data.SqlServerCe.SqlCeException (0x80004005): There is not enough memory on the device running SQL Server
However, the memory values at crash time are somewhat unclear to me:
Current Process Current Working Set: 806 MB
Current Process Peak Working Set: 1157 MB
Current Process Current Page Memory Size: 779 MB
Current Process Peak Page Memory Size: 1502 MB
Current Process Private Memory Size: 779 MB
ComputerInfo TotalPhysicalMemory: 5884 MB
ComputerInfo TotalVirtualMemory: 2047 MB
ComputerInfo AvailablePhysicalMemory: 3378 MB
ComputerInfo AvailableVirtualMemory: 166 MB
By the way: the Current Process values are taken from the C# Process class; the ComputerInfo values are taken from the VB.NET ComputerInfo class.
My app is compiled with the x86 configuration. The process is running on a 64-bit Windows 7 machine.
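For reference, here is a sketch of how figures like these can be gathered programmatically (the property names are real members of System.Diagnostics.Process and Microsoft.VisualBasic.Devices.ComputerInfo; the layout is illustrative, and ComputerInfo requires a reference to Microsoft.VisualBasic.dll):

using System;
using System.Diagnostics;
using Microsoft.VisualBasic.Devices;

class MemoryReport
{
    static void Main()
    {
        Process p = Process.GetCurrentProcess();
        Console.WriteLine("Working Set:        {0} MB", p.WorkingSet64 >> 20);
        Console.WriteLine("Peak Working Set:   {0} MB", p.PeakWorkingSet64 >> 20);
        Console.WriteLine("Paged Memory:       {0} MB", p.PagedMemorySize64 >> 20);
        Console.WriteLine("Peak Paged Memory:  {0} MB", p.PeakPagedMemorySize64 >> 20);
        Console.WriteLine("Private Memory:     {0} MB", p.PrivateMemorySize64 >> 20);

        var info = new ComputerInfo();
        Console.WriteLine("Total Physical:     {0} MB", info.TotalPhysicalMemory >> 20);
        Console.WriteLine("Total Virtual:      {0} MB", info.TotalVirtualMemory >> 20);
        Console.WriteLine("Available Physical: {0} MB", info.AvailablePhysicalMemory >> 20);
        Console.WriteLine("Available Virtual:  {0} MB", info.AvailableVirtualMemory >> 20);
    }
}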
I see that the Available Virtual Memory is 166 MB, which looks pretty low.
How is it possible that the process crashed when there is plenty of AvailablePhysicalMemory reported by the VB.NET ComputerInfo class?
The high Current and Peak Working Set values indicate that there is probably a memory leak somewhere, but I still don't get why it crashed when there was plenty of available RAM.
Your assumption that physical memory is in any way relevant is the fundamental cause of your confusion. Remember, the right way to think about memory is that process memory is disk space. Physical memory is just a fast cache on top of the disk. Again, let me emphasize this: if you run out of physical memory then your machine gets slower. It doesn't give an out of memory error.
The relevant resource is virtual address space, not memory. You only get 4 GB of virtual address space per 32-bit process, and 2 GB of that is reserved for the use of the operating system. Suppose you have 166 MB of virtual address space left and that it is divided into four chunks of 42 MB each. If a request for 50 MB comes in, that request cannot be fulfilled. Unfortunately, the error you get is "out of memory" and not "out of virtual address space", which would be a more accurate error message.
The solution to your problem is either (1) allocate way less than 2GB of user memory per process, (2) implement your own system for mapping memory into and out of virtual address space, or (3) use a 64 bit process that has a far larger amount of available virtual address space.
Each 32-bit process (and yours is 32-bit, as shown by TotalVirtualMemory: 2047 MB) can address only up to 2 GB of memory, regardless of the available physical memory.
An OutOfMemoryException can be caused by a number of things.
It can be caused when your application doesn't have enough space in the Gen0 managed heap or in the large object heap (LOH) to process a new allocation. This is a rare case, but it will typically happen when the heap is too fragmented to allow a new allocation (sometimes of quite a small size!). In Gen0 this might happen due to excessive use of pinned objects (when handling interop with unmanaged code); in the LOH this was once a common problem, but it appears much less frequent in later versions of .NET. It's worth noting that SqlCe access does involve unmanaged code; I've not heard of any major issues with this, but it's possible that your use of the SqlCe classes is causing problems.
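To make the pinning point concrete, here is a small illustrative sketch (mine, not from the app in question) of the interop pattern that, when overused or when handles are held too long, produces exactly this kind of Gen0 fragmentation:

using System;
using System.Runtime.InteropServices;

class PinningDemo
{
    static void Main()
    {
        byte[] buffer = new byte[4096];
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        try
        {
            // While pinned, the GC cannot move this object, so free space
            // around it cannot be compacted away.
            IntPtr address = handle.AddrOfPinnedObject();
            Console.WriteLine("Buffer pinned at 0x{0:X}; pass it to unmanaged code here.", address.ToInt64());
        }
        finally
        {
            handle.Free(); // unpin promptly to limit fragmentation
        }
    }
}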
Alternatively, it could be a virtual memory issue - which seems quite plausible given the figures you've posted. Eric Lippert has a good blog post about this kind of issue here. If your application is trying to write pages of memory to disk so that it can keep something else in memory, you might well see the exception because your VM is so limited.

C# 64 bit applications, large memory address aware

I'm a little confused with regard to the memory limitations of an application. As far as I can see, if I write a C# application targeting x64, my program will have access to 8 TB of virtual address space - does that equal space on the HD?
OSes >= Windows 7 Professional support 192 GB of RAM. So if I had a 192 GB system (unfortunately I don't), could I load just over 8.1 TB of data into memory (assuming no other processes were running)?
Is virtual memory only used when I have run out of available RAM? I'm assuming there is a performance implication associated with virtual memory vs. RAM?
Apologies if these appear to be stupid questions, but when it comes to memory management, I'm rather green.
Your question is actually several related questions; taking each individually:
OSes >= Windows 7 Professional support 192 GB of RAM. So if I had a 192 GB system (unfortunately I don't), could I load just over 8.1 TB of data into memory (assuming no other processes were running)?
No, it would still be 8 TB. That is the maximum amount of addressable space, whether it is backed by RAM or by something else.
However, you could never have the full 8 TB in use, even if you somehow unloaded Windows itself, as the OS needs to keep track of the space being used. In total, you could probably get to approximately 7 TB.
Is virtual memory only used when I have run out of available RAM?
No. If you have virtual memory turned on, the entirety of RAM is typically also mirrored onto your HDD (give or take a few seconds). This allows the OS to unload something to make room if it feels the need, without first having to persist the data. Note that the OS keeps thorough track, so it will know whether this is the case or not.
I'm assuming there is a performance implication associated with virtual memory vs. RAM?
That depends on your context. Every seek on the hard drive takes a computational eternity, though it is still a fraction of a second. Assuming your process isn't thrashing - repeatedly accessing paged-out memory - you should not notice a significant performance hit outside of high-performance computing.
Apologies if these appear to be stupid questions, but when it comes to memory management, I'm rather green.
Your main problem is that you have some preconceived notions about how memory works that don't line up with reality. If you are really interested, you should look into how memory is used in a modern system.
For instance, most people conceptualize a pointer as pointing to a location in memory, since it is the fundamental structure. This isn't quite true. In fact, a pointer contains a piece of information that can be decoded into a location in the addressable space of the system, which isn't always in RAM. This decoding process uses quite a few interesting tricks, but they are beyond the scope of this question.
Normally, you should write applications targeting Any CPU. The .NET loader then decides (depending on the platform it is running on) which version of the runtime will execute the application and what kind of native code it will be compiled into. There is no need to specify the platform unless you are using custom native components that must be loaded into the process created for your application. This process is then associated with some virtual address space; how that maps to physical memory is managed by the OS.
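If you want to confirm at run time what the loader decided, a quick check using standard Environment properties (available since .NET 4.0) looks like this:

using System;

class BitnessCheck
{
    static void Main()
    {
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("IntPtr.Size:    {0} bytes", IntPtr.Size); // 8 => 64-bit
    }
}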

GC.GetTotalMemory(false) and Process.WorkingSet

After running my .NET process for a long time, I see a big difference between the memory the GC is aware of and the process working set.
The values I'm monitoring are GC.GetTotalMemory(false) and Process.WorkingSet.
If I check in Task Manager (or using a Sysinternals tool), the value of WorkingSet (from the Process class) and what's shown in Task Manager don't match, but they're somewhat close (say, 100 MB in Task Manager, 130 MB in WorkingSet). What's shocking to me is that GC.GetTotalMemory(false) reports only about 40 MB.
I've run the process through profilers several times (my favorite is ANTS Memory Profiler from Redgate), and there it is easy to see a value called "free memory", which is normally the difference between what the GC "sees" and what the OS sees (OK, plus loaded DLLs and so on).
A couple of questions:
Is there a way to programmatically monitor this "GC free memory"? Yes, I could use the profiler, but in a long-running process that is not that easy.
If your process runs for a long time, allocates a lot of memory and then frees it (and by allocation I mean thousands or millions of objects, not a single big allocation and free), the problem is that the working set never "shrinks" to a low value, despite the fact that the GC values are low and correct. Is there a way to fix this? It could be that something is broken in my process, but it has been profiled several times over the years and it doesn't seem to have a leak.
Thanks
Just to resurrect an ancient thread: if you want to monitor "GC free memory", you can periodically call GC.GetTotalMemory(false) and store the value - dump it to disk or a database or something. Alternatively, you can use a library like prometheus-net (https://github.com/prometheus-net/), which will export this and a whole bunch of other metrics for you.
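A minimal sketch of that periodic sampling, using a plain System.Threading.Timer (the 30-second interval and console output are arbitrary choices; swap in whatever sink you use):

using System;
using System.Threading;

class GcMemoryMonitor
{
    static void Main()
    {
        var timer = new Timer(_ =>
        {
            long gcBytes = GC.GetTotalMemory(false); // no forced collection
            Console.WriteLine("{0:u}  GC heap in use: {1} MB",
                DateTime.UtcNow, gcBytes >> 20);
        }, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));

        Console.ReadLine(); // sample every 30 s until Enter is pressed
        timer.Dispose();
    }
}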
As for your second question: there is a way to force your application's working set to shrink, but it isn't native to .NET, it's Windows-only, and it's NOT recommended:
[DllImport("psapi.dll")]
static extern int EmptyWorkingSet(IntPtr hwProc);
static void MinimizeFootprint()
{
EmptyWorkingSet(Process.GetCurrentProcess().Handle);
}
From what little I know, I think the OS will take care of this if the server is under memory pressure, and it's more efficient for your app, and for the server in general, if the memory remains allocated to your process. There's a good chance the memory will be needed again in the future, so there's no point in cleaning it up only to allocate it again, especially when your server has enough memory.

Hitting a memory limit slows down the .Net application

We have a 64-bit C#/.NET 3.0 application that runs on a 64-bit Windows server. From time to time the app can use a large amount of memory, which is available on the machine. In some instances the application stops allocating additional memory and slows down significantly (500+ times slower). When I check the memory in Task Manager, the amount of memory used barely changes. The application keeps running very slowly and never throws an out-of-memory exception.
Any ideas? Let me know if more data is needed.
You might try enabling server mode for the garbage collector. By default, all .NET apps run in workstation mode, where the GC tries to do its sweeps while keeping the application running. If you turn on server mode, it temporarily stops the application so that it can free up memory (much) faster, and it also uses a separate heap for each processor/core.
Most server apps will see a performance improvement with the server GC mode, especially if they allocate a lot of memory. The downside is that your app will basically stall when it starts to run low on memory (until the GC finishes).
To enable this mode, insert the following into your app.config or web.config:
<configuration>
  <runtime>
    <gcServer enabled="true"/>
  </runtime>
</configuration>
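If you want to confirm the setting took effect, GCSettings (in the System.Runtime namespace, .NET 4.0 and later) will tell you at run time:

using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        Console.WriteLine("Server GC:    {0}", GCSettings.IsServerGC);
        Console.WriteLine("Latency mode: {0}", GCSettings.LatencyMode);
    }
}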
The moment you hit the physical memory limit, the OS will start paging (that is, writing memory out to disk). This will indeed cause the kind of slowdown you are seeing.
Solutions?
Add more memory - this will only help until you hit the new memory limit
Rewrite your app to use less memory
Figure out if you have a memory leak and fix it
If memory is not the issue, perhaps your application is hitting the CPU very hard? Do you see the CPU running close to 100%? If so, check for large collections that are being iterated over and over.
Just as on 32-bit Windows operating systems, there is a 2 GB limit on the size of a single object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
Investigating Memory Issues (MSDN article)
There is an awful lot of good stuff mentioned in the other answers. However, I'm going to chip in my two pence (or cents - depending on where you're from!) anyway.
Assuming that this is indeed a 64-bit process as you have stated, here's a few avenues of investigation...
Which memory usage are you checking? Mem Usage or VMem Size? VMem Size is the one that actually matters, since it applies to both paged and non-paged memory. If the two numbers are far out of whack, then memory usage is indeed the cause of the slowdown.
What's the actual memory usage across the whole server when things start to slow down? Does the slowdown also apply to other apps? If so, you may have a kernel memory issue - which can be due to huge amounts of disk access and low-level resource usage (for example, creating 20,000 mutexes, or loading a few thousand bitmaps via code that uses Win32 HBitmaps). You can get some indication of this in Task Manager (although Windows 2003's version is more directly informative on this than 2008's).
When you say that the app gets significantly slower, how do you know? Are you using vast dictionaries or lists? Could it not just be that the internal data structures are getting so big as to complicate whatever work the internal algorithms are performing? At huge sizes, some algorithms can become slower by orders of magnitude.
What's the CPU load of the application when it's running at full pelt? Is it the same as when the slowdown occurs? If the CPU usage decreases as the memory usage goes up, that means that whatever it's doing is taking the OS longer to fulfill, which suggests it's putting too much load on the OS. If there's no difference in CPU load, then my guess is that it's the internal data structures getting so big as to slow down your algorithms.
I would certainly look at running Perfmon against the application - starting off with some .NET and native memory counters, cache hits and misses, and disk queue length. Run it over the course of the application's life, from startup until it starts to run like an asthmatic tortoise, and you might just get a clue from that as well.
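If it helps, a couple of those counters can also be read from code rather than the Perfmon UI. A sketch, assuming the standard ".NET CLR Memory" performance counter category (the category, counter, and instance names below are the ones Perfmon shows; the instance is normally the process name):

using System;
using System.Diagnostics;

class CounterSample
{
    static void Main()
    {
        // For multiple instances of the same exe, Perfmon suffixes the name.
        string instance = Process.GetCurrentProcess().ProcessName;

        using (var heapBytes = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance))
        using (var gen2 = new PerformanceCounter(".NET CLR Memory", "# Gen 2 Collections", instance))
        {
            Console.WriteLine("Bytes in all heaps: {0} MB", (long)heapBytes.NextValue() >> 20);
            Console.WriteLine("Gen 2 collections:  {0}", gen2.NextValue());
        }
    }
}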
Having skimmed through the other answers, I'd say there's a lot of good ideas. Here's one I didn't see:
Get a memory profiler, such as SciTech's MemProfiler. It will tell you what's being allocated and by what, and it lets you slice and dice the whole picture.
It also has video tutorials in case you don't know how to use it. In my case, I discovered I had IDisposable instances that I wasn't wrapping in using(...) blocks.
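For anyone hitting the same thing, the fix for that kind of leak is deterministic disposal - a generic illustration, with StreamReader standing in for whatever IDisposable type you own:

using System.IO;

class DisposalExample
{
    static string ReadFirstLine(string path)
    {
        // 'using' guarantees Dispose() runs on scope exit,
        // even if ReadLine throws.
        using (var reader = new StreamReader(path))
        {
            return reader.ReadLine();
        }
    }
}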
