How can I hog memory to test another application? - c#

So I want to test my windows application under low memory conditions, and I have found that the easiest way to do this is to create another app (a Console application) that just hogs memory.
I have created this monster:
while (true)
{
    try
    {
        Marshal.AllocHGlobal(1024);
    }
    catch { }
}
But it only goes to 3.7 GB. I then open another instance of this application and it goes back down.
How can I keep the garbage collector from collecting my allocations?
Or: how can I test low-memory conditions on my universal windows application?

You can try changing the GCSettings latency mode to SustainedLowLatency, which avoids garbage collection entirely unless the system is about to run out of memory.
GCSettings.LatencyMode = GCLatencyMode.SustainedLowLatency;

You are probably running into the limit that you see because you are running your memory hog as a 32-bit process, which can only address ~4GB of memory.
Try running as a 64-bit process (compile for x64 explicitly), or spawning multiple copies.
That said, there are better ways to limit the memory available to the process under test. One example is given here: Set Windows process (or user) memory limit
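Note that Marshal.AllocHGlobal hands out native memory that the GC never touches, so the collector isn't actually the problem; the ~3.7 GB ceiling is the 32-bit address space. A 64-bit hog might look like the following sketch (hypothetical; the 64 MB block size and console output are arbitrary choices). Writing to each page matters, because otherwise the OS may only reserve the pages rather than commit physical memory:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

class MemoryHog
{
    static void Main()
    {
        // Keep every pointer reachable so nothing can be reclaimed.
        var blocks = new List<IntPtr>();
        const int blockSize = 64 * 1024 * 1024; // 64 MB per allocation

        while (true)
        {
            try
            {
                IntPtr p = Marshal.AllocHGlobal(blockSize);

                // Touch every page (4 KB) so the OS actually commits it.
                for (int offset = 0; offset < blockSize; offset += 4096)
                    Marshal.WriteByte(p, offset, 1);

                blocks.Add(p);
                Console.WriteLine($"Held: {blocks.Count * (blockSize / (1024 * 1024))} MB");
            }
            catch (OutOfMemoryException)
            {
                Console.WriteLine("Allocation failed; holding what we have.");
                Console.ReadKey(); // keep the memory held until a key is pressed
                break;
            }
        }
    }
}
```

Compile it for x64 explicitly (e.g. `<PlatformTarget>x64</PlatformTarget>` in the project file) so the process can address more than 4 GB.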

Related

.NET Core app Process Memory doesn't decrease after objets are deallocated

I have problems with an ASP.NET Core 2.1 application running on Windows that increases its memory consumption until it finally crashes, requiring me to kill the .NET Core Host process. I suspected that the cause could be a synchronization task run in the background once per hour, and I have confirmed that disabling it solves the problem.
I've been profiling this synchronization task with VisualStudio 2019 Diagnostic Tools and I've found a behavior that I don't understand:
As you can see, I have taken 3 snapshots:
At the start of the synchronization method
At the end of the synchronization method
A bit later, once we have exited the scope
In the snapshot table I see a behavior that seems logical to me: the heap size grows considerably during the task (2) and is reduced almost to the initial size (1) when the scope is exited (3). However, the "Process Memory" chart shows a different story: the memory consumption increase is there, but it never goes down.
I have launched the application with dotnet run in Release mode and I see the same behavior when looking at the memory used by the .NET Core Host process.
I have two questions:
Why is there this divergence between the Heap Size and the Process Memory? Shouldn't they be closely related?
Could this be the cause of my webapp crashing? It seems so, but the memory consumption increase should be temporary, not permanent up to the point of crashing it. How could I fix it?
Remark: I have reproduced the same behavior with a simple .NET Core Console App (2.1 and 2.2) with no dependencies, so this is not linked to the ASP part or any other library:
internal class Program
{
    private static void Main()
    {
        new Whatever().AllocateSomeStrings();
        // Snapshot3
        Console.ReadKey();
    }
}

public class Whatever
{
    public void AllocateSomeStrings()
    {
        // Snapshot1
        List<string> numbers = Enumerable.Range(0, 50000).Select(n => n.ToString()).ToList();
        // Snapshot2
    }
}
To answer my own questions:
The heap size increases when objects are created and is reduced when they are GCed. The process memory increases when the heap size increases, but it does not necessarily decrease when the heap size decreases. That memory can stay assigned to the process unless other processes need it (i.e. the available memory on the machine runs low), forcing it to be released. This release is handled by the OS, not by the GC, which only manages the managed heap inside the process. See also: When is memory, allocated by .NET process, released back to Windows
I found that the root problem was a memory leak in the code (a poorly written while loop); fixing that fixed the crashing app.
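The divergence described above can be seen in a small experiment. This is a hypothetical sketch (exact numbers vary by machine and runtime): after a full collection the managed heap is small again, yet the working set usually stays elevated because the freed pages remain assigned to the process.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class HeapVsWorkingSet
{
    static void Main()
    {
        // Allocate ~50,000 short-lived strings, then drop the references.
        var numbers = new List<string>();
        for (int i = 0; i < 50_000; i++)
            numbers.Add(i.ToString());
        numbers = null;

        // Force a full collection so the managed heap shrinks.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        long heap = GC.GetTotalMemory(false);
        long workingSet = Process.GetCurrentProcess().WorkingSet64;

        // workingSet stays well above heap: it also covers free GC
        // segments, the runtime itself, and loaded DLLs.
        Console.WriteLine($"Managed heap: {heap / 1024} KB");
        Console.WriteLine($"Working set:  {workingSet / 1024} KB");
    }
}
```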

GC.GetTotalMemory(false) and Process.WorkingSet

After running my .net process for a long time, I see there's a big difference between the memory GC is aware of and the process working set.
The values I'm monitoring are GC.GetTotalMemory(false) and Process.WorkingSet.
If I check in Task Manager (or with a SysInternals tool), the WorkingSet value of the Process object and what's shown in Task Manager don't match, but they're somewhat close (say, 100 MB in Task Manager, 130 MB in WorkingSet). What's shocking to me is that GC.GetTotalMemory(false) reports only about 40 MB.
I've run several times through profilers (my favorite is ants memory profiler from redgate), and there it is easy to check that there's a value called "free memory" which is normally the difference between what GC "sees" and what the OS sees (ok, plus loaded DLLs and so on).
A couple of questions:
Is there a way to programmatically monitor this "GC free memory"? Yes, I could use the profiler, but in a long-running process that is not easy.
If your process runs for a long time, allocates a lot of memory and then frees it (where allocation means thousands or millions of objects, not one big allocation and free), the problem is that it will never "shrink" back to a low value, even though the GC figures are low and correct. Is there a way to fix this? Something could be broken in my process, but it has been profiled several times over the years and there doesn't seem to be a leak.
Thanks
Just to resurrect an ancient thread: if you want to monitor "GC free memory", you can periodically make a call to GC.GetTotalMemory(false) and store the value. Dump it to disk or a database or something. Alternatively you can use a library like prometheus-net (https://github.com/prometheus-net/) which will export this and a whole bunch of other metrics for you.
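A minimal periodic sampler along those lines might look like this (hypothetical sketch; the 30-second interval and console output are arbitrary, and a real service would write to a log file or metrics store instead):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class GcMemoryMonitor
{
    // One sample: managed heap, OS working set, and their difference
    // (the difference also includes free GC segments, native
    // allocations, and loaded DLLs, so treat it as a rough estimate).
    public static string Sample()
    {
        long gcHeap = GC.GetTotalMemory(false);
        long workingSet = Process.GetCurrentProcess().WorkingSet64;
        return $"{DateTime.UtcNow:O} heap={gcHeap} ws={workingSet} diff={workingSet - gcHeap}";
    }

    static void Main()
    {
        // Sample every 30 seconds until a key is pressed.
        using var timer = new Timer(_ => Console.WriteLine(Sample()),
                                    null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
        Console.ReadKey();
    }
}
```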
As for your second question, there is a way to force your application to shrink, but it's not dot net native, it's windows only, and it's NOT recommended:
[DllImport("psapi.dll")]
static extern int EmptyWorkingSet(IntPtr hwProc);

static void MinimizeFootprint()
{
    EmptyWorkingSet(Process.GetCurrentProcess().Handle);
}
From what little I know I think the OS will take care of this if the server is under memory pressure, and it's more efficient for your app, and for the server in general, if the memory remains allocated to your process. This is because there's a good chance it will need it again in the future, so there's no point in cleaning it up when it will just have to allocate it again, especially when your server has enough memory.

Large Paged Pool in C# program ... what is this?

I have a server-side C# program running 24/7. After a few days the process's 'Paged Pool' (as displayed in Windows Task Manager) builds up to 12 MB; when it reaches 13-14 MB the machine blue screens. The main 'Mem Usage' is 180 MB.
I am running 32-bit Windows Server 2003 SP2.
The question is; What is the 'Paged Pool'? What in my C# program could be causing this?
Thanks
The Windows Paged Pool is a section of memory set aside by the Windows kernel for satisfying demands from the kernel and device drivers for memory which can be paged to disk, as opposed to memory which should never be paged to disk. (For an in-depth look, read: http://blogs.technet.com/b/markrussinovich/archive/2009/03/26/3211216.aspx)
I don't see how your process could be allocating Paged Pool, since it is managed by the kernel; however, given that you are getting a blue screen, there could be some connection. Are you using the Registry or memory-mapped files at all? Those are big consumers of Paged Pool resources. Perhaps you are reading a lot of registry entries over the life of the process and never releasing them. However, as you can see from the above article, exhausting the paged pool won't blue screen by itself, but perhaps you have a hardware device whose driver crashes on exhaustion.
Ultimately, you'd need to get more details about the problem, since a number of things could be going on here. Recording the Stop Error code, describing what the program does, etc., will all help in troubleshooting.
It sounds like your program is acquiring memory and not releasing it.
Do you have an infinite loop running where objects that implement IDisposable are being created?
Check that they are being disposed of somewhere in the loop, either by calling Dispose on them directly, or wrapping them in a using block.
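For example (a hypothetical sketch, assuming a loop that appends to a file): without the using block each iteration leaves a file handle behind for the finalizer to clean up eventually; with it, the handle is released at the end of every iteration.

```csharp
using System;
using System.IO;

class DisposeInLoop
{
    static void Main()
    {
        File.Delete("log.txt"); // start fresh for the demo

        for (int i = 0; i < 1000; i++)
        {
            // Bad: new FileStream(...) without Dispose leaves the
            // unmanaged handle alive until the finalizer runs.

            // Good: the using block calls Dispose when the scope ends.
            using (var stream = new FileStream("log.txt", FileMode.Append))
            {
                var payload = new byte[] { 0x42 };
                stream.Write(payload, 0, payload.Length);
            }
        }
    }
}
```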

Hitting a memory limit slows down the .Net application

We have a 64-bit C#/.NET 3.0 application that runs on a 64-bit Windows server. From time to time the app can use a large amount of the available memory. In some instances the application stops allocating additional memory and slows down significantly (500+ times slower). When I check the memory in Task Manager, the amount of memory used barely changes. The application keeps on running very slowly and never gives an out-of-memory exception.
Any ideas? Let me know if more data is needed.
You might try enabling server mode for the Garbage Collector. By default, all .NET apps run in Workstation Mode, where the GC tries to do its sweeps while keeping the application running. If you turn on server mode, it temporarily stops the application so that it can free up memory (much) faster, and it also uses different heaps for each processor/core.
Most server apps will see a performance improvement using the GC server mode, especially if they allocate a lot of memory. The downside is that your app will basically stall when it starts to run out of memory (until the GC is finished).
To enable this mode, insert the following into your app.config or web.config:
<configuration>
    <runtime>
        <gcServer enabled="true"/>
    </runtime>
</configuration>
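If you want to confirm the setting took effect, GCSettings (in System.Runtime) reports the active mode at runtime. A small sketch:

```csharp
using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // True when <gcServer enabled="true"/> was picked up.
        Console.WriteLine($"Server GC:    {GCSettings.IsServerGC}");
        Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
    }
}
```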
The moment you are hitting the physical memory limit, the OS will start paging (that is, write memory to disk). This will indeed cause the kind of slowdown you are seeing.
Solutions?
Add more memory - this will only help until you hit the new memory limit
Rewrite your app to use less memory
Figure out if you have a memory leak and fix it
If memory is not the issue, perhaps your application is hitting CPU very hard? Do you see the CPU hitting close to 100%? If so, check for large collections that are being iterated over and over.
As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
Investigating Memory Issues (MSDN article)
There is an awful lot of good stuff mentioned in the other answers. However, I'm going to chip in my two pence (or cents - depending on where you're from!) anyway.
Assuming that this is indeed a 64-bit process as you have stated, here's a few avenues of investigation...
Which memory usage are you checking? Mem Usage or VMem Size? VMem size is the one that actually matters, since that applies to both paged and non-paged memory. If the two numbers are far out of whack, then the memory usage is indeed the cause of the slow-down.
What's the actual memory usage across the whole server when things start to slow down? Does the slow down also apply to other apps? If so, then you may have a kernel memory issue - which can be due to huge amounts of disk accessing and low-level resource usage (for example, create 20000 mutexes, or load a few thousand bitmaps via code that uses Win32 HBitmaps). You can get some indication of this on the Task Manager (although Windows 2003's version is more informative directly on this than 2008's).
When you say that the app gets significantly slower, how do you know? Are you using vast dictionaries or lists? Could it not just be that the internal data structures are getting so big so as to complicate the work any internal algorithms are performing? When you get to huge numbers some algorithms can start to become slower by orders of magnitude.
What's the CPU load of the application when it's running at full pelt? Is it actually the same as when the slow-down occurs? If the CPU usage decreases as the memory usage goes up, that means that whatever it's doing is taking the OS longer to fulfil, so it's probably putting too much load on the OS. If there's no difference in CPU load, then my guess is that it's internal data structures getting so big that they slow down your algorithms.
I would certainly be looking at running a Perfmon on the application - starting off with some .Net and native memory counters, Cache hits and misses, and Disk Queue length. Run it over the course of the application from startup to when it starts to run like an asthmatic tortoise, and you might just get a clue from that as well.
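Those counters can also be read from code on Windows via System.Diagnostics.PerformanceCounter (a hypothetical sketch; the ".NET CLR Memory" category and counter names are the standard .NET Framework perfmon ones, and this only works on Windows with the counters installed):

```csharp
using System;
using System.Diagnostics;

class PerfmonSketch
{
    static void Main()
    {
        // Counter instances are keyed by process name.
        string instance = Process.GetCurrentProcess().ProcessName;

        // ".NET CLR Memory" is the classic perfmon category for managed heaps.
        using var gen0  = new PerformanceCounter(".NET CLR Memory", "# Gen 0 Collections", instance);
        using var bytes = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance);

        Console.WriteLine($"Gen 0 collections:  {gen0.NextValue()}");
        Console.WriteLine($"Bytes in all heaps: {bytes.NextValue()}");
    }
}
```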
Having skimmed through the other answers, I'd say there's a lot of good ideas. Here's one I didn't see:
Get a memory profiler, such as SciTech's MemProfiler. It will tell you what's being allocated and by what, and it lets you slice and dice the results.
It also has video tutorials in case you don't know how to use it. In my case, I discovered I had IDisposable instances that I wasn't wrapping in using(...) blocks.

Windows memory and page file usage

Can someone please explain to me why minimizing a Windows app massively reduces its memory usage?
For example, I run Visual Studio showing 800MB memory usage in Task Manager then I minimize the Visual Studio app window and the memory usage now only shows 50MB in task manager. This seems to happen in all winforms apps.
From here:
What Task Manager shows as an application's memory usage is actually its working set. Windows trims the working set of an application when it is minimized, so that's why this figure goes down. The working set is not an accurate representation of how much memory an application is using.
In Windows Vista, Microsoft modified Task Manager to show private bytes instead (which is a much more useful figure), so this phenomenon doesn't occur anymore.
It's normal for applications not to be so aggressive about returning memory to the system. A computer doesn't run faster by having a lot of unused memory, so it's better to save the cleanup work until it's really needed.
When you minimise a program, Windows takes that as a cue that the memory isn't needed for the moment and trims the process's working set, paging out whatever isn't in active use; the program itself doesn't have to do anything special.
