I have a server-side C# program running 24/7. After a few days, the process's 'Paged Pool' (as displayed in Windows Task Manager) builds up to 12 MB, and when it reaches 13-14 MB the machine blue screens. The main 'Mem Usage' is 180 MB.
I am running 32-bit Windows Server 2003 SP2.
The question is: what is the 'Paged Pool', and what in my C# program could be causing this?
Thanks
The Windows Paged Pool is a section of memory set aside by the Windows kernel for satisfying demands from the kernel and device drivers for memory which can be paged to disk, as opposed to memory which should never be paged to disk. (For an in-depth look, read: http://blogs.technet.com/b/markrussinovich/archive/2009/03/26/3211216.aspx)
I don't see how your process could be allocating Paged Pool, since it is managed by the kernel; however, given that you are getting a blue screen, there could be some connection. Are you using the Registry or memory-mapped files at all? Those are big consumers of Paged Pool resources. Perhaps you are reading a lot of registry entries over the life of the process and never releasing them. However, as you can see from the above article, exhausting the paged pool won't blue screen on its own, but perhaps you have a hardware device whose driver crashes on exhaustion.
Ultimately, you'd need to get more details about the problem, since a number of things could be going on here. Recording the Stop Error code, describing what the program does, etc., will all help in troubleshooting.
It sounds like your program is acquiring memory and not releasing it.
Do you have an infinite loop running where objects that implement IDisposable are being created?
Check that they are being disposed of somewhere in the loop, either by calling Dispose on them directly, or wrapping them in a using block.
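As a minimal sketch of that pattern (the registry path and value name here are hypothetical, just to show the shape of the loop), each RegistryKey is disposed on every iteration:

using System.Threading;
using Microsoft.Win32;

class PollingLoop
{
    static void Run()
    {
        while (true)
        {
            // RegistryKey implements IDisposable; the using block guarantees the
            // underlying handle is released on every pass through the loop.
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\MyApp"))
            {
                object value = (key != null) ? key.GetValue("PollSetting") : null;
                // ... do something with value ...
            }
            Thread.Sleep(1000);
        }
    }
}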
Related
I've been looking for a solution for about 1 1/2 days now and just can't get to the bottom of it. I am trying to start a *.lnk file on PocketPC 2003 from our C# application. This *.lnk file contains a link to evm.exe, which is a JVM for PocketPC. One of the arguments passed is -Xms8M, which tells the JVM to reserve at least 8 MB of memory.
If directly started from Windows Explorer there's no problem.
Now I created a process in C# pointing to the *.lnk file. When I try to start it the JVM console opens and brings up one of two errors: "EVM execution history too large" or "failed to initialize heap (Phase 1)" (or something like that).
If I delete the mentioned parameter the application comes up with no problem.
Because of this behaviour, I assume that too little memory is assigned to the newly created process. Is this realistic? And if so: is there a way to assign more memory to the newly created process? Or am I completely wrong and have to go some other way (if any is available)?
Edit:
this.myStartProcess = new Process { StartInfo = { FileName = appName }, EnableRaisingEvents = true };
this.myStartProcess.Start();
Edit 2:
After doing some more research it turned out that the real problem is that there are very limited resources available, eaten up by my launcher application (which is about 1.8 MB in total after starting) over time.
To improve things I started to study how the garbage collector works in Windows Mobile and so used two techniques to bring up the virtual machine.
The first one is to reduce the memory taken by my own application by sending it to the background (the form's SendToBack() method) and waiting for the garbage collector to finish (GC.WaitForPendingFinalizers()).
After that, I look for 9 MB of free space in program memory before trying to bring the VM up. If there isn't enough space, I try to shift the needed memory from storage memory to program memory.
These two techniques improved things a lot!
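Roughly what those two techniques look like in code, as a sketch only: it assumes the .NET Compact Framework on Windows CE, where GlobalMemoryStatus (P/Invoked from coredll.dll) reports free program memory, and StartVm() stands in for the actual launch of the *.lnk file:

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class VmLauncher
{
    [StructLayout(LayoutKind.Sequential)]
    private struct MEMORYSTATUS
    {
        public uint dwLength;
        public uint dwMemoryLoad;
        public uint dwTotalPhys;
        public uint dwAvailPhys;
        public uint dwTotalPageFile;
        public uint dwAvailPageFile;
        public uint dwTotalVirtual;
        public uint dwAvailVirtual;
    }

    [DllImport("coredll.dll")]
    private static extern void GlobalMemoryStatus(out MEMORYSTATUS status);

    public static bool TryFreeMemoryAndLaunch(Form launcherForm)
    {
        // Technique 1: push the launcher to the background and let the GC
        // finish any pending cleanup before bringing the VM up.
        launcherForm.SendToBack();
        GC.Collect();
        GC.WaitForPendingFinalizers();

        // Technique 2: only start the VM if roughly 9 MB of program memory
        // is free (shifting storage memory to program memory, if needed,
        // is left out of this sketch).
        MEMORYSTATUS status;
        GlobalMemoryStatus(out status);
        if (status.dwAvailPhys < 9 * 1024 * 1024)
        {
            return false;
        }

        StartVm();
        return true;
    }

    private static void StartVm()
    {
        // Placeholder: the Process.Start call on the *.lnk file goes here.
    }
}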
There's still a problem with my launcher application. The allocated bytes (strings and boxed objects, to be concrete) increase over time while my launcher application is in front... It's about 30 KB in 10 minutes. After 24 hours the device is rebooted automatically, and at the moment I assume the launcher will be in front for about 10 minutes total during that period. Nevertheless, it's not good to have memory leaks. Anyone got an idea how to chase this down?
Thanks in advance
Best regards
Marcel
It looks like there are two reasons why this could happen:
The default values provided for the MinWorkingSet and MaxWorkingSet properties are not satisfactory for your requirements. From http://msdn.microsoft.com/en-us/library/system.diagnostics.process.maxworkingset.aspx:
The working set of a process is the set of memory pages currently visible to the process in physical RAM memory. These pages are resident and available for an application to use without triggering a page fault.

The working set includes both shared and private data. The shared data includes the pages that contain all the instructions that your application executes, including the pages in your .dll files and the system.dll files. As the working set size increases, memory demand increases.

A process has minimum and maximum working set sizes. Each time a process resource is created, the system reserves an amount of memory equal to the minimum working set size for the process. The virtual memory manager attempts to keep at least the minimum amount of memory resident when the process is active, but it never keeps more than the maximum size.

The system sets the default working set sizes. You can modify these sizes using the MaxWorkingSet and MinWorkingSet members. However, setting these values does not guarantee that the memory will be reserved or resident.
It is effectively impossible to reserve the memory you require for the JVM on your machine because of the way the OS manages memory (which I would find really surprising, because every modern OS has virtual memory support).
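For the first reason, a minimal sketch of what adjusting the working set could look like on the full .NET Framework (whether these Process properties are available and honoured by the Compact Framework on PocketPC 2003 is another matter, and the sizes and the .lnk path are illustrative only):

using System;
using System.Diagnostics;

class WorkingSetExample
{
    static void Main()
    {
        // Start the target (the path to the *.lnk file is a placeholder).
        Process proc = Process.Start("evm.lnk");

        // Ask the OS to keep roughly 8-16 MB of the process resident.
        // As the MSDN excerpt above says, these are only hints: setting
        // them does not guarantee the memory will be reserved or resident.
        proc.MinWorkingSet = (IntPtr)(8 * 1024 * 1024);
        proc.MaxWorkingSet = (IntPtr)(16 * 1024 * 1024);
    }
}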
This is a VERY open question.
Basically, I have a computing application that launches test combinations for N Scenarios.
Each test is conducted in a single dedicated thread, and involves reading large binary data, processing it, and dropping results to DB.
If the number of threads is too large, the app goes rogue, eats up all available memory and hangs.
What is the most efficient way to exploit all CPU+RAM capabilities (high-performance computing, i.e. 12 cores/16 GB RAM) without bringing the system to its knees (which happens if "too many" simultaneous threads are launched, "too many" being a relative notion of course)?
I have to specify that I have a worker buffer queue with N workers; every time one finishes and dies, a new one is launched via the queue. This works pretty well as of now. But I would like to avoid "manually" and "empirically" setting the number of simultaneous threads, and instead have an intelligent, scalable system that launches as many threads at a time as the system can properly handle, and stops at a "reasonable" memory usage (the target server is dedicated to the app, so there is no problem regarding other applications except the system).
PS: I know that .NET 3.5 comes with thread pools and .NET 4 has interesting TPL capabilities, which I am still considering right now (I never went very deep into this so far).
PS 2: After reading this post I was a bit puzzled by the "don't do this" answers. Though I think such a request is fair for a memory-demanding computing program.
EDIT
After reading this post I will try to use WMI features.
None of the built-in threading capabilities in .NET support adjusting according to memory usage. You need to build this yourself.
You can either predict memory usage or react to low memory conditions. Alternatives:
Look at the amount of free memory on the system before launching a new task. If it is below 500 MB, wait until enough has been freed.
Launch tasks as they come and throttle as soon as some of them start to fail because of OOM. Restart them later. This alternative sucks big time because your process will do garbage collections like crazy to avoid the OOMs.
I recommend (1).
You can either look at free system memory or your own process's memory usage. To get your own memory usage, I recommend looking at private bytes using the Process class.
If you set aside a 1 GB buffer on your 16 GB system, you run at 94% efficiency and are pretty safe.
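A rough sketch of alternative (1), using a performance counter for free system memory (the 1 GB threshold and the one-second polling interval are arbitrary choices for illustration):

using System;
using System.Diagnostics;
using System.Threading;

class MemoryThrottledLauncher
{
    // Don't start another worker unless at least this much system memory is free.
    private const float MinFreeMegabytes = 1024f;

    private static readonly PerformanceCounter FreeMemory =
        new PerformanceCounter("Memory", "Available MBytes");

    public static void Launch(ThreadStart work)
    {
        // Block until the system has enough headroom, then start the worker.
        while (FreeMemory.NextValue() < MinFreeMegabytes)
        {
            Thread.Sleep(1000);
        }
        new Thread(work).Start();
    }
}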
We have a 64-bit C#/.NET 3.0 application that runs on a 64-bit Windows server. From time to time the app can use a large amount of memory, which is available. In some instances the application stops allocating additional memory and slows down significantly (500+ times slower). When I check the memory from Task Manager, the amount of memory used barely changes. The application keeps on running very slowly and never throws an out-of-memory exception.
Any ideas? Let me know if more data is needed.
You might try enabling server mode for the Garbage Collector. By default, all .NET apps run in Workstation Mode, where the GC tries to do its sweeps while keeping the application running. If you turn on server mode, it temporarily stops the application so that it can free up memory (much) faster, and it also uses different heaps for each processor/core.
Most server apps will see a performance improvement using the GC server mode, especially if they allocate a lot of memory. The downside is that your app will basically stall when it starts to run out of memory (until the GC is finished).
To enable this mode, insert the following into your app.config or web.config:
<configuration>
  <runtime>
    <gcServer enabled="true"/>
  </runtime>
</configuration>
The moment you hit the physical memory limit, the OS will start paging (that is, writing memory to disk). This will indeed cause the kind of slowdown you are seeing.
Solutions?
Add more memory - this will only help until you hit the new memory limit
Rewrite your app to use less memory
Figure out if you have a memory leak and fix it
If memory is not the issue, perhaps your application is hitting CPU very hard? Do you see the CPU hitting close to 100%? If so, check for large collections that are being iterated over and over.
As with 32-bit Windows operating systems, there is a 2 GB limit on the size of a single object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
Investigating Memory Issues (MSDN article)
There is an awful lot of good stuff mentioned in the other answers. However, I'm going to chip in my two pence (or cents - depending on where you're from!) anyway.
Assuming that this is indeed a 64-bit process as you have stated, here's a few avenues of investigation...
Which memory usage are you checking? Mem Usage or VMem Size? VMem size is the one that actually matters, since that applies to both paged and non-paged memory. If the two numbers are far out of whack, then the memory usage is indeed the cause of the slow-down.
What's the actual memory usage across the whole server when things start to slow down? Does the slow down also apply to other apps? If so, then you may have a kernel memory issue - which can be due to huge amounts of disk accessing and low-level resource usage (for example, create 20000 mutexes, or load a few thousand bitmaps via code that uses Win32 HBitmaps). You can get some indication of this on the Task Manager (although Windows 2003's version is more informative directly on this than 2008's).
When you say that the app gets significantly slower, how do you know? Are you using vast dictionaries or lists? Could it not just be that the internal data structures are getting so big as to complicate the work any internal algorithms are performing? When you get to huge numbers, some algorithms can start to become slower by orders of magnitude.
What's the CPU load of the application when it's running at full pelt? Is it actually the same as when the slow-down occurs? If the CPU usage decreases as the memory usage goes up, then whatever it's doing is taking the OS longer to fulfil, meaning that it's probably putting too much load on the OS. If there's no difference in CPU load, then my guess is that it's the internal data structures getting so big as to slow down your algorithms.
I would certainly be looking at running a Perfmon on the application - starting off with some .Net and native memory counters, Cache hits and misses, and Disk Queue length. Run it over the course of the application from startup to when it starts to run like an asthmatic tortoise, and you might just get a clue from that as well.
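If you want to read the same kind of counters from code rather than in Perfmon, something along these lines works on the full framework (the instance-name lookup is simplified; it can differ when several processes share a name):

using System;
using System.Diagnostics;

class ClrMemorySnapshot
{
    static void Main()
    {
        string instance = Process.GetCurrentProcess().ProcessName;

        // Two of the ".NET CLR Memory" counters you would otherwise add in Perfmon.
        PerformanceCounter heapBytes =
            new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance);
        PerformanceCounter gen2Collections =
            new PerformanceCounter(".NET CLR Memory", "# Gen 2 Collections", instance);

        Console.WriteLine("Managed heap: {0:N0} bytes, Gen 2 collections: {1:N0}",
                          heapBytes.NextValue(), gen2Collections.NextValue());
    }
}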
Having skimmed through the other answers, I'd say there's a lot of good ideas. Here's one I didn't see:
Get a memory profiler, such as SciTech's MemProfiler. It will tell you what's being allocated and by what, and it will let you slice and dice the results any way you like.
It also has video tutorials in case you don't know how to use it. In my case, I discovered I had IDisposable instances that I wasn't wrapping in using(...).
Can someone please explain to me why minimizing a windows app massively reduces the memory usage?
For example, I run Visual Studio showing 800 MB memory usage in Task Manager, then I minimize the Visual Studio window and the memory usage now shows only 50 MB in Task Manager. This seems to happen with all WinForms apps.
From here:
What Task Manager shows as an application's memory usage is actually its working set. Windows trims the working set of an application when it is minimized, so that's why this figure goes down. The working set is not an accurate representation of how much memory an application is using.
In Windows Vista, Microsoft modified Task Manager to show private bytes instead (which is a much more useful figure), so this phenomenon doesn't occur anymore.
It's normal for applications not to be so aggressive about returning memory to the system. A computer doesn't run faster by having a lot of unused memory, so it's better to save the cleanup work until it's really needed.
When you minimise a program, Windows takes that as a signal that it's time to return as much memory to the system as possible, so it trims the program's working set and releases all the physical memory that it can.
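You can trigger the same working-set trim yourself through the Win32 API; a minimal sketch (passing -1 for both sizes asks Windows to empty the working set, roughly what happens on minimize):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class WorkingSetTrim
{
    [DllImport("kernel32.dll")]
    private static extern bool SetProcessWorkingSetSize(IntPtr process, IntPtr minSize, IntPtr maxSize);

    static void Main()
    {
        // (-1, -1) tells Windows to trim as many pages as possible from the
        // current process's working set; the pages come back on demand.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                 (IntPtr)(-1), (IntPtr)(-1));
    }
}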
I'm not sure if the question title is the best one but it was the best one I could come up with...
I have this .NET (C#) application which starts up with Windows and remains open until the computer is turned off. The app stays in the tray, and I open it by clicking the tray icon and close it the same way.
The app is not slow at first; it works normally, no problems there. But after a long period of inactivity it gets very slow when it is shown again for the first time in a long while. Know what I mean?
For instance, I might not use/open it (click the tray icon) for a few days, and during those days I opened, closed and used lots of other apps, heavy apps too, and I probably hibernated and resumed the computer a few times. When I then needed to open my app again, it was slow. After a few minutes of using it, it goes back to normal and works fine.
I believe this has something to do with memory management: the system probably frees up most of my app's memory so other programs can use it more efficiently. And maybe .NET memory management has something to do with it too...
Whatever the reason, is there anything I can do to optimize my app regarding that issue?
This is almost certainly due to memory being paged out to disk.
When you cease using your application and other applications or tasks start exerting memory pressure, pages of memory from your app can be written out to disk. When you try to use your application again, all this data must be read back in, causing the stalls that you see.
Unfortunately, there is no good solution - if you run as administrator or have the SeLockMemoryPrivilege, you can lock parts of your application into physical memory. You can try "touching" pages periodically to keep them in physical memory. Unfortunately, both these options will cause your application to interact badly with other applications on the system - your memory is getting paged out because the physical memory is needed for something else. You can attempt to lower your overall memory footprint, but you will still run into this issue in some cases. There are options for tweaking your working set size, but the OS is free to ignore those, and will in many cases.
You can use the .NET memory profiler to get a good idea of what your application is doing with memory over time. You can check if anything is building up where you don't expect it to (collections, lists, etc.) and causing your memory footprint to grow.
Have you tried double-clicking your running app in Process Explorer and observing it for leaks? Is it CPU utilization, a UI handle leak, a memory leak, recurrent I/O?
If the issue is indeed that the program is being paged out due to an extended period of inactivity, have you tried setting a "keep-alive" timer that fires every few minutes and basically keeps the application in a state where the paging system doesn't ever see it as idle?
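A sketch of such a keep-alive timer: touching one byte per 4 KB page of some long-lived data keeps those pages warm, though the OS can still page them out under real memory pressure (the 10 MB buffer and the five-minute interval are just placeholders):

using System;
using System.Timers;

class KeepAlive
{
    // Stand-in for whatever long-lived data the app wants to keep resident.
    private static readonly byte[] cachedData = new byte[10 * 1024 * 1024];

    private static readonly Timer timer = new Timer(5 * 60 * 1000);

    // Written only so the touch loop is never optimized away.
    private static long lastTouchSum;

    public static void Start()
    {
        timer.Elapsed += delegate
        {
            long sum = 0;
            // Read one byte per 4 KB page so every page gets touched.
            for (int i = 0; i < cachedData.Length; i += 4096)
            {
                sum += cachedData[i];
            }
            lastTouchSum = sum;
        };
        timer.AutoReset = true;
        timer.Start();
    }
}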
I reckon that this behaviour is normal in Windows. Windows will serve the application that needs memory most, i.e. the application that is currently in use. Thus, if an application is minimized or not active for a prolonged duration, like the case described above, it will be the best candidate for Windows to page out, making memory available to active applications. Windows restores the application's memory from the paging file (hard disk) when it becomes active again, which explains why it is slow.
This might not be the best answer, but if I were you, I would experiment with increasing the base priority of the application. This can be done using the Thread object in the .NET Framework.
Thanks.
This isn't a root cause answer or solution, but if Michael is right and there's nothing you can really do about it, then you might just want to consider visually informing the user that your application is active with a progress bar or nifty progress circle.
It won't speed up your program, but it may ease the minds of your users just a little bit.