C# calling COM fails to allocate memory

I've got a problem with a C# application and a COM component allocating memory:
The C# program calls a function in a COM DLL written in C++ which does matrix processing. The function allocates a lot of memory (around 800 MB in eight 100 MB chunks). This fails (the allocation reports "bad allocation") when the function is called from C#.
If I run the same function from a C program, allocating the same amount of memory, there's no problem allocating it.
I've got 8 GB RAM and Win7 x64, and there is plenty of free memory.
How can I make the allocation succeed when calling from the C# application?
I tried to google it, but didn't really know what to search for. I searched for setting the heap size and the like, but that didn't turn up anything.
I feel a bit lost! All help is appreciated!

The amount of physical memory (8 GB) is not the constraint that limits the memory consumption of your application. Presumably, you built a 32-bit application, which has a fundamental limit of 4 GB of directly addressable bytes. For historical reasons, an application not doing any magic gets only half of this: 2 GB. This is where you allocate from, and this space is also used for other needs. 100 MB chunks are large enough that memory/address fragmentation further reduces the effectively usable space (you are not just asking for 800 MB in total, you are asking for eight contiguous 100 MB regions).
The easiest solution here is to build 64-bit applications. The limits there are distant.
If you still want 32-bit code:
enable /LARGEADDRESSAWARE on the hosting application binary to extend the limit from 2 to 4 GB
use file mappings, which let you keep your data in physical memory and map it into the limited address space on demand
allocate smaller chunks
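
Before rebuilding anything, it's worth confirming what the C# host actually runs as, since an AnyCPU build with "Prefer 32-bit" set will still load the COM DLL into a 32-bit process. A minimal check (Environment.Is64BitProcess requires .NET 4 or later):

using System;

class BitnessCheck
{
    static void Main()
    {
        // A 32-bit C# host caps the whole process, including any COM DLL it loads in-process.
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("Pointer size:   " + IntPtr.Size + " bytes");
    }
}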


Lock / Use / Allocate memory

Is there any option to lock or allocate memory in C#?
Scenario 1:
In my virtual machine there is 16 GB RAM; for a test I need to use 8 GB RAM, so that 8 GB remains 'free' for the operating system and the rest of the applications.
Scenario 2:
The same virtual machine with 16 GB RAM, and now I need to use 14 GB RAM.
For now, I have created a memory-leak function, but this is not good because it takes all the memory from my virtual machine.
List<byte[]> memoryUsage = new List<byte[]>();
while (true)
{
    try
    {
        // keep allocating 1 KB blocks until the runtime runs out of memory
        memoryUsage.Add(new byte[1024]);
    }
    catch (OutOfMemoryException)
    {
        throw;
    }
}
I want to allocate or lock an amount of RAM that the user (me) specifies, e.g. I want to allocate/lock 8 GB of RAM, the program allocates/locks 8 GB, and the other 8 GB remains 'free'.
If you're running this on a Windows box, you need to keep in mind the concept of virtual memory. You can allocate - or leak - memory to your heart's content with C#, but if the underlying operating system deems it safe to page the memory used by your process out to the paging file (assuming at least one such file is defined), it will do so.
Let's take your first scenario - the one where you want 8 GB of RAM allocated. Your code can do precisely that. However, the OS can kick in and move some of the pages representing your allocated data from RAM to disk. There are several reasons why the memory manager would do this - take a look at some here (under the "Pages can be removed from a process working set..." paragraph). You'll thus be left with less used RAM than you originally intended.
To my understanding you're after a constant working set occupied by your process, and I'm not sure C# - even in an unsafe context - allows you to do that. You could try invoking Win32 functions that work at a low level, possibly using P/Invoke as stated here, but like I've said, I'm not sure it will work.
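For what it's worth, here is a minimal sketch of that P/Invoke route, combining VirtualAlloc, VirtualLock and SetProcessWorkingSetSize. Treat it as an experiment rather than a guarantee: VirtualLock is subject to working-set quotas and privileges, and the 1 GB figure is just an example.

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class LockedAllocation
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize, uint flAllocationType, uint flProtect);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool VirtualLock(IntPtr lpAddress, UIntPtr dwSize);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess, UIntPtr dwMin, UIntPtr dwMax);

    const uint MEM_COMMIT = 0x1000, MEM_RESERVE = 0x2000, PAGE_READWRITE = 0x04;

    static void Main()
    {
        ulong bytes = 1UL << 30; // 1 GB for illustration; adjust to taste

        // VirtualLock fails unless the working-set maximum is large enough,
        // so raise the limits first (this may require suitable privileges).
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
            (UIntPtr)(bytes + (64UL << 20)), (UIntPtr)(bytes + (128UL << 20)));

        IntPtr block = VirtualAlloc(IntPtr.Zero, (UIntPtr)bytes, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
        if (block == IntPtr.Zero || !VirtualLock(block, (UIntPtr)bytes))
            Console.WriteLine("Allocation or lock failed, Win32 error " + Marshal.GetLastWin32Error());
        else
            Console.WriteLine("Locked " + (bytes >> 20) + " MB into physical memory.");

        Console.ReadLine(); // keep the process alive so the pages stay locked
    }
}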
You'll also have to keep an eye on the variable(s) that reference your allocated data. If the GC (Garbage Collector) decides the data you've allocated is no longer required from some point on (say, because your remaining code no longer references it), it will happily reclaim it, again leaving you with less allocated memory than you originally wanted.
You haven't really said anything about the target platform you've used to build this. An out-of-memory error will be thrown much earlier - after allocating only around 3 GB - by a process that was built for x86 (or AnyCPU + Prefer 32-bit), since that's how things work in Wow64.
If you don't really have to write the code that's doing the allocation yourself in C#, maybe you can merely invoke something like Testlimit (and play around with setting the minimum working set).
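The managed Process class exposes the same working-set knobs that Testlimit pokes; a short sketch, assuming you only need a hint rather than a hard guarantee (the 512 MB / 1 GB figures are arbitrary):

using System;
using System.Diagnostics;

class WorkingSetBounds
{
    static void Main()
    {
        // Ask Windows to keep at least 512 MB of this process resident.
        // Set the maximum first so the new minimum fits under it; this can
        // require elevated privileges and is still only a hint to the OS.
        Process self = Process.GetCurrentProcess();
        self.MaxWorkingSet = (IntPtr)(1024L << 20);
        self.MinWorkingSet = (IntPtr)(512L << 20);
        Console.WriteLine("Min working set: " + ((long)self.MinWorkingSet >> 20) + " MB");
    }
}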

Why are the bytes in all heaps much larger than the total memory usage?

We have a C# Windows service (running on Windows Server 2008 + .NET Framework 4.6 + GC perf hotfix). After running for a few days, the size of bytes in all heaps reaches more than 100 GB (committed), and the private bytes are very high too (110 GB+), but the RAM usage is only 68 GB (Working Set) and 59 GB (Private Bytes). There is only 10 GB of page file on this server.
I made a dump and ran WinDbg + SOS to analyze memory usage, and I found that there are a lot of Free objects (about 54 GB). Could this be caused by the Free objects? Do those Free objects take up only virtual memory but no physical memory? If not, how is it possible that the committed virtual memory is much larger than the used physical memory plus the page files?
You have just discovered the concept of demand-zero pages.
Let me cite from Windows Internals, 6th edition, part 2, chapter 10, which is about memory management (page 276 in my edition of the book):
For many of those items, the commit charge may represent the potential use of storage rather than the actual. For example, a page of private committed memory does not actually occupy either a physical page of RAM or the equivalent page file space until it's been referenced at least once. Until then, it's a demand-zero page [...] But commit charge accounts for such pages when the virtual space is first created. This ensures that when the page is later referenced, actual physical storage space will be available for it.
This means: Windows will increase the size of either the working set or the page file only when the committed (but not yet accessed) memory is actually touched.
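You can watch this happen from C#. The following sketch allocates a large array (commit charge rises immediately) and only then touches its pages (working set rises); the exact numbers will vary with GC and OS behavior, and the 512 MB size is just for illustration:

using System;
using System.Diagnostics;

class DemandZeroDemo
{
    static void Report(string label)
    {
        Process self = Process.GetCurrentProcess();
        self.Refresh(); // re-read the counters
        Console.WriteLine(label + ": private = " + (self.PrivateMemorySize64 >> 20)
            + " MB, working set = " + (self.WorkingSet64 >> 20) + " MB");
    }

    static void Main()
    {
        Report("start");
        byte[] buffer = new byte[512 << 20]; // 512 MB: commit charge rises at once
        Report("after allocation");          // ...but the working set barely moves
        for (int i = 0; i < buffer.Length; i += 4096)
            buffer[i] = 1;                   // touch each 4 KB page
        Report("after touching pages");      // now the working set rises too
        GC.KeepAlive(buffer);
    }
}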
Your question
Do those free objects only take up virtual memory but no physical memory?
does not really fit the rest of the question. Any memory, regardless of what it is filled with (.NET Free objects, regular .NET objects, or even C++ data), may consume physical memory (then it's in the working set) or not (then it's in the page file, or, as described above, not yet backed by anything at all).

Windows - WPF Process out of memory crash with plenty of available physical memory

I have a WPF desktop app that crashed with the following exception:
System.Data.SqlServerCe.SqlCeException (0x80004005): There is not enough memory on the device running SQL Server
However, the memory values at crash time are somewhat not clear to me:
Current Process Current Working Set: 806 MB
Current Process Peak Working Set: 1157 MB
Current Process Current Page Memory Size: 779 MB
Current Process Peak Page Memory Size: 1502 MB
Current Process Private Memory Size: 779 MB
ComputerInfo TotalPhysicalMemory: 5884 MB
ComputerInfo TotalVirtualMemory: 2047 MB
ComputerInfo AvailablePhysicalMemory: 3378 MB
ComputerInfo AvailableVirtualMemory: 166 MB
By the way, the Current Process values are taken from the C# Process class, and the ComputerInfo values are taken from the VB.NET ComputerInfo class.
My app is compiled with the (x86) configuration. The process is running on a Windows 7 64-bit machine.
I see that the Available Virtual Memory is 166MB which looks pretty low.
How is it possible that the process crashed when there is plenty of AvailablePhysicalMemory reported by the VB.NET ComputerInfo class?
The high current and peak working set indicates that there is probably a memory leak somewhere, but I still don't get why it crashed when there was plenty of available RAM.
Your assumption that physical memory is in any way relevant is the fundamental cause of your confusion. Remember, the right way to think about memory is that process memory is disk space. Physical memory is just a fast cache on top of the disk. Again, let me emphasize this: if you run out of physical memory then your machine gets slower. It doesn't give an out of memory error.
The relevant resource is virtual address space, not memory. You only get 4GB of virtual address space per 32 bit process and 2GB of that is reserved for the use of the operating system. Suppose you have 166 MB of virtual address space left and that it is divided into four chunks of 42 MB each. If a request for 50MB comes in, that request cannot be fulfilled. Unfortunately the error you get is "out of memory" and not "out of virtual address space", which would be a more accurate error message.
The solution to your problem is either (1) allocate way less than 2GB of user memory per process, (2) implement your own system for mapping memory into and out of virtual address space, or (3) use a 64 bit process that has a far larger amount of available virtual address space.
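If you want to check whether fragmentation rather than total exhaustion is what kills you, a small probe like the one below can help. It is only a sketch: Marshal.AllocHGlobal stands in for any native allocation, and the search bounds are arbitrary.

using System;
using System.Runtime.InteropServices;

class LargestBlockProbe
{
    // Binary-search for the largest single native allocation that still succeeds.
    static long LargestAllocatable()
    {
        long lo = 0;
        long hi = Environment.Is64BitProcess ? (16L << 30) : int.MaxValue;
        while (hi - lo > (1L << 20)) // stop at 1 MB precision
        {
            long mid = lo + (hi - lo) / 2;
            try
            {
                IntPtr p = Marshal.AllocHGlobal(new IntPtr(mid));
                Marshal.FreeHGlobal(p);
                lo = mid; // a contiguous region of mid bytes still exists
            }
            catch (OutOfMemoryException)
            {
                hi = mid; // no contiguous region of mid bytes left
            }
        }
        return lo;
    }

    static void Main()
    {
        Console.WriteLine("Largest contiguous free block: "
            + (LargestAllocatable() >> 20) + " MB");
    }
}

Running this inside the afflicted process shortly before the crash would show whether the largest contiguous block has shrunk far below the nominally available virtual memory.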
Each 32-bit process (yours is 32-bit, since TotalVirtualMemory reports 2047 MB) can address only up to 2 GB of memory, regardless of the available physical memory.
An OutOfMemoryException can be caused by a number of things.
It can be caused when your application doesn't have enough space in the Gen0 managed heap or in the large object heap to process a new allocation. This is a rare case, but will typically happen when the heap is too fragmented to allow a new allocation (sometimes of quite a small size!). In Gen0 this might happen due to an excessive use of pinned objects (when handling interop with unmanaged code); in the LOH this was once a common problem but appears much less frequent in later versions of .NET. It's worth noting that SqlCe access would include unmanaged code; I've not heard of any major issues with this but it's possible that your use of the SqlCe classes is causing problems.
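If LOH fragmentation does turn out to be the culprit and you can move to .NET 4.5.1 or later, the runtime lets you request a one-off LOH compaction; a sketch:

using System;
using System.Runtime;

class LohCompaction
{
    static void Main()
    {
        // Request a one-time compaction of the Large Object Heap on the next
        // blocking full collection (API available since .NET 4.5.1).
        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
        Console.WriteLine("LOH compaction requested and full GC forced.");
    }
}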
Alternatively, it could be a virtual memory issue - which seems quite plausible given the figures you've posted. Eric Lippert has a good blog post about this kind of issue here. If your application is trying to write pages of memory to disk so that it can keep something else in memory, you might well see the exception because your VM is so limited.

Identify owner of native memory used by C# application

I am working on a C# application which is designed to run in the system tray all the time. I would therefore like to minimise the amount of memory which the application uses when idle. Using Windows perfmon and the Windows Task Manager I have got some figures for idle memory usage.
Windows XP TaskManager - Mem Usage - 96,300K
PerfMon
.NET CLR Memory
# Bytes in all Heaps - 34,513,708
# Total committed Bytes - 40,591,360
# Total reserved Bytes - 50,319,360
I think these figures mean that my application has been allocated 96 MB of memory by Windows. 50 MB of this has been allocated to the CLR. The CLR has handed out 40 MB of this.
Is there any way to work out what the other 46 MB of memory which hasn't been assigned to the CLR is being used for? I assume this will be a combination of memory used for loading DLLs into the process and memory used by native code.
EDIT: I have downloaded VMMap and found the following.
Private:
Total - 72 MB
Managed Heap - 25 MB
Stack - 16 MB (seems quite large)
Private Data - 13 MB (not sure what this is)
Image - 8 MB (mostly .NET DLLs)
Page Table - 6 MB (seems quite large)
Heap - 3 MB
Can anyone suggest an interpretation of the Stack, Private Data and Page Table figures?
NOTE: The counters I originally quoted are now showing some bizarre figures.
Windows XP TaskManager - Mem Usage - 43,628K
PerfMon
.NET CLR Memory
# Bytes in all Heaps - 20 MB
# Total committed Bytes - 23 MB
# Total reserved Bytes - 50 MB
This suggests that the CLR has reserved more memory than has been allocated to the process. Obviously this can't be true, so Task Manager must only be showing what is currently paged in.
Note that the difference between the total memory usage (I'm not exactly sure which figure Task Manager is showing; Windows tools have a bad history of using different terms for the same concepts) and "# Total reserved Bytes" may also be used by the CLR, just not by the managed heap - native allocations by the CLR, loaded DLLs, thread stacks, and so on also land there.
You may want to check out Sysinternals VMMap to get more detailed information.
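If you want to track the managed-versus-native split programmatically rather than in the perfmon UI, the same counters are reachable from code. A sketch using the counter names quoted above (instance-name handling is simplified):

using System;
using System.Diagnostics;

class ClrMemorySplit
{
    static void Main()
    {
        // The instance name is the process name; with several instances of the
        // same executable running, perfmon appends #1, #2, ... to disambiguate.
        string instance = Process.GetCurrentProcess().ProcessName;
        using (var heaps = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance))
        using (var reserved = new PerformanceCounter(".NET CLR Memory", "# Total reserved Bytes", instance))
        {
            long privateBytes = Process.GetCurrentProcess().PrivateMemorySize64;
            Console.WriteLine("Managed heaps: " + (heaps.RawValue >> 20) + " MB");
            Console.WriteLine("CLR reserved:  " + (reserved.RawValue >> 20) + " MB");
            Console.WriteLine("Private bytes: " + (privateBytes >> 20) + " MB");
            // Whatever private bytes exceed the CLR's reservation is native:
            // loaded DLLs, thread stacks, CLR bookkeeping, other native allocations.
        }
    }
}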

WPF out of memory exception when loading a large amount of bitmaps in a single instance of the app. Is there a limit?

I need to load large amounts of bitmaps into memory for display in a WPF app (using .NET 4.0). Where I run into trouble is when I approach around 1,400 MB of memory (I am getting this from the process list in Task Manager).
The same thing happens whether the app is run on a machine with 4 GB of memory or 6 GB (and some other configs that I do not have the details on). It is easy to test by reducing the number of images loaded: when it works on one machine it works on them all, and when it crashes on one it crashes on all.
When I reduce the image count and allow the app to load without causing the memory exception, I can run multiple instances of the app (exceeding the 1.4 GB of the single instance) without the problem, so it appears to be some per-instance limit or a per-instance error on my part.
I load the images as BitmapImage and they are either stored in a List<BitmapImage> or loaded into a List<byte[]>, where they are later used in a bunch of layered sequences (using a WriteableBitmap).
The error occurs when I load the images, not while they are in use. In the repeatable case I load 600 640x640 images plus another 200-300 smaller images ranging from 100x100 to 200x200, although it appears to be the overall bit count that is the problem.
So my questions are:
*Is there some built-in per-process memory limit in a situation like this?
*Is there a better technique for loading large amounts of image data into memory?
Thanks,
Brian
Yes, there is a limit on per-process memory allocations. One solution is to make your binary LARGEADDRESSAWARE so it can use more address space.
Refer to Out of memory? Easy ways to increase the memory available to your program; it has a great discussion of solutions to this.
The following may be a cause, but I am not sure:
The problem is not about loading a large amount of data as such; the CLR maintains a Large Object Heap for objects greater than 85 KB of memory, and you have no direct control over freeing it. These objects become long-lived and are normally deallocated only when the AppDomain unloads.
I would therefore suggest loading the larger images in another AppDomain and using that AppDomain to manipulate them, as sketched below.
See the MSDN entry on profiling the GC, and see if memory-mapped files help, in case you are using .NET 4.0.
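A sketch of the AppDomain idea, for .NET Framework only (AppDomain unloading is not available on .NET Core); ProcessImages and its 200 MB buffer are stand-ins:

using System;

class AppDomainIsolation
{
    // Runs inside the worker domain; everything it allocates on the managed
    // heaps (including LOH segments) is released when the domain unloads.
    static void ProcessImages()
    {
        byte[] pixels = new byte[200 << 20]; // stand-in for large image buffers
        Console.WriteLine("Working in: " + AppDomain.CurrentDomain.FriendlyName
            + ", allocated " + (pixels.Length >> 20) + " MB");
    }

    static void Main()
    {
        AppDomain worker = AppDomain.CreateDomain("ImageWork");
        try
        {
            worker.DoCallBack(ProcessImages);
        }
        finally
        {
            AppDomain.Unload(worker); // returns the domain's heap segments to the OS
        }
    }
}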
An x86 build can access 4 GB on 64-bit Windows, so that's the theoretical upper limit for the process. This requires the application to be large address aware. Additionally, .NET imposes a 2 GB limit on a single object.
You may be suffering from LOH fragmentation. Objects larger than 85000 bytes are stored on the Large Object Heap, which is a special part of the managed heap that doesn't get compacted.
You say that the images are 640x640, but what is the pixel format, and is there a mask as well? If you use a byte per color channel plus a byte for the alpha channel, each picture is 640 x 640 x 4 bytes, roughly 1.6 MB uncompressed, so trying to load 600 of them at once (close to 1 GB) plus the smaller images will be a problem in a 32-bit process.
You're running into the limitation of 32-bit processes, which can only access about 2 GB of data. If you were to run 64-bit you wouldn't have these issues.
There are a number of ways to work around the issue, some of which are:
Simply don't load that much data; load only when needed and use caching (see the loading sketch after this list).
Use memory-mapped files to map whole chunks of data into memory. Not recommended, as you'll have to do all the memory management yourself.
Use multiple processes to hold the data and use an IPC mechanism to bring over only the data you need, similar to item 1.
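
As one concrete form of the first item, WPF can decode bitmaps at a reduced resolution so the full-size pixels never enter memory. A sketch (LoadScaled and its parameters are illustrative, not from the original post):

using System;
using System.Windows.Media.Imaging;

static class BitmapLoader
{
    // Decodes the file at a reduced pixel width so WPF only allocates
    // buffers for the scaled-down image, not the full-size one.
    public static BitmapImage LoadScaled(string path, int decodePixelWidth)
    {
        var bmp = new BitmapImage();
        bmp.BeginInit();
        bmp.CacheOption = BitmapCacheOption.OnLoad; // read eagerly, then release the file
        bmp.DecodePixelWidth = decodePixelWidth;    // e.g. 320 instead of 640
        bmp.UriSource = new Uri(path, UriKind.RelativeOrAbsolute);
        bmp.EndInit();
        bmp.Freeze(); // immutable: shareable across threads, no change tracking
        return bmp;
    }
}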
