I have a custom application that allows users to open custom models.
If I open a model in the application and then open another model, the memory used by the first model isn't released.
When I try to profile the memory leak with a profiler (ANTS Memory Profiler), the application releases the memory and I'm not able to track the leak. How can I deal with this problem?
When you take a snapshot, ANTS Memory Profiler performs a full garbage collection.
When taking snapshots, I normally take 2-3 until there are no memory differences between two consecutive snapshots, then compare against your baseline snapshot.
Go to the instance list and see whether any instances are growing. Select "Objects with source" to filter out the heaps of system objects.
If any instances are growing, pick one and look at its object retention graph, which will show you exactly which instance holds the reference.
Also, make sure that you have implemented IDisposable properly, dispose all disposable objects, and unsubscribe from all event subscriptions.
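For illustration only, here is a minimal sketch of that pattern; Model, DataChanged and ModelViewer are hypothetical names, not types from your application:

using System;

// Hypothetical stand-ins for the application's model and its change event.
public class Model
{
    public event EventHandler DataChanged;

    public void RaiseChanged()
    {
        var handler = DataChanged;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

public class ModelViewer : IDisposable
{
    private Model _model;

    public ModelViewer(Model model)
    {
        _model = model;
        _model.DataChanged += OnDataChanged; // the subscription keeps this viewer reachable from the model
    }

    private void OnDataChanged(object sender, EventArgs e)
    {
        // react to model changes
    }

    public void Dispose()
    {
        if (_model != null)
        {
            // Unsubscribe so the model no longer holds a reference to this viewer,
            // then drop the reference to the (potentially large) model.
            _model.DataChanged -= OnDataChanged;
            _model = null;
        }
    }
}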
Have a look at the walkthroughs below:
http://www.red-gate.com/products/dotnet-development/ants-memory-profiler/walkthrough
http://www.red-gate.com/products/dotnet-development/ants-memory-profiler/
What is the reason for pinned GC handles when working with unmanaged .NET components? This happens from time to time without any code changes or anything else changing. When investigating the issue, I see a lot of pinned GC handles.
These handles seem to stay in memory for the entire application lifetime. In this case, the library is GdPicture (14). Is there any way to investigate why those instances are not cleaned up? I'm using Dispose()/using everywhere and can't find any GC roots in the managed code.
Thanks a lot!
EDIT
Another strange behaviour is that Task Manager shows the application using about 6 GB of RAM, while the memory profiler shows a usage of 400 MB (the red line is live bytes).
What is the reason for pinned GC handles when working with unmanaged .net components?
Pinning is needed when working with unmanaged code. It prevents an object from being moved during garbage collection so that unmanaged code can hold a pointer to it. The garbage collector will update all .NET references, but it will not update unmanaged pointer values.
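For illustration, this is the kind of pinning that interop code (or a library wrapping unmanaged code) typically performs explicitly; the buffer here is just an example:

using System;
using System.Runtime.InteropServices;

class PinningExample
{
    static void Main()
    {
        byte[] buffer = new byte[1024];

        // Pin the array so the GC cannot move it while unmanaged code holds a pointer to it.
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        try
        {
            IntPtr unmanagedPointer = handle.AddrOfPinnedObject();
            // ... pass unmanagedPointer to native code here ...
        }
        finally
        {
            // If a handle like this is never freed (e.g. a missing Dispose in a wrapper library),
            // the pinned object stays in memory for the lifetime of the process.
            handle.Free();
        }
    }
}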
Is there any way to investigate why those instances are not cleaned up?
No. The reason is always the same: there's a bug in the code, either in your code (assume that first) or in a third-party library (libraries are widely used, so chances are that leaks in the library have already been found by someone else).
I'm using Dispose()/using everywhere
Seems like you missed one or it's not using the disposable pattern.
Another strange behaviour is that Task Manager shows the application using about 6 GB of RAM, while the memory profiler shows a usage of 400 MB (the red line is live bytes)
A .NET memory profiler may only show the .NET part of memory (400 MB) and omit the rest (5600 MB).
Task manager is not interested in .NET. It cares about physical RAM mostly, which is why Task Manager is not a good analytics tool in general. You don't want to analyze physical RAM, you want to analyze virtual memory.
To look for memory leaks, use Process Explorer and show the "Private Bytes" and "Virtual Size" columns. Process Explorer can also show you a graph over time per process.
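If you want to see the same discrepancy from inside the application, you can compare the managed heap size with the process's private bytes (a quick sketch, not a profiler replacement):

using System;
using System.Diagnostics;

class MemoryReport
{
    static void Main()
    {
        long managedBytes = GC.GetTotalMemory(false);                        // managed heap only
        long privateBytes = Process.GetCurrentProcess().PrivateMemorySize64; // managed + unmanaged

        Console.WriteLine("Managed heap:  {0} MB", managedBytes / (1024 * 1024));
        Console.WriteLine("Private bytes: {0} MB", privateBytes / (1024 * 1024));
        // A large gap between the two points to unmanaged allocations (native libraries,
        // pinned buffers, bitmaps, etc.) that a .NET-only profiler will not show.
    }
}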
How to proceed?
Forget about the unmanaged leak for a moment. Use a .NET profiler that can take memory snapshots and lets you see each individual object inside them as well as statistics.
Try to figure out the steps it takes to create more leaks in a consistent way. Then:
1. Take a snapshot.
2. Repeat the leak procedure 10 times.
3. Take a snapshot.
4. Repeat the leak procedure another 10 times.
5. Take a snapshot.
Compare the snapshots from steps 1 and 3 and check for managed types whose counts differ by a multiple of 10. Then compare the snapshots from steps 3 and 5 and check the same types again; the difference must again be a multiple of 10. You can't leak 7 objects when you run a method 10 times (a small sketch of the repeat step follows below).
Do a code review on the places where the affected types are used, based on your knowledge of the leak procedure (which methods are called) and of the managed type. Make sure everything is disposed or released properly.
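A possible shape for that repeat step, wrapped in a tiny harness so leaked instances show up as multiples of the repeat count; OpenAndCloseModel is a hypothetical stand-in for whatever action leaks in your application:

using System;

class LeakHarness
{
    // Hypothetical stand-in for the action that reproduces the leak.
    static void OpenAndCloseModel()
    {
        // open a model/document, work with it, then close it
    }

    static void Main()
    {
        const int repeats = 10;

        // Take a profiler snapshot here, then:
        for (int i = 0; i < repeats; i++)
        {
            OpenAndCloseModel();
        }
        // Take the next snapshot here; leaked instances should show up
        // as a multiple of 'repeats' in the snapshot comparison.
        Console.WriteLine("Leak procedure repeated {0} times.", repeats);
    }
}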
I was tracking down a memory leak related to an event handler when I discovered that each time I open a RadDocking and then close it, around 500 KB of memory is used and not released.
I'm using the MVVM pattern, and as far as I've seen it's not related to the MVVM library.
When I close a RadPane I set its context to null; isn't that enough?
Thanks
You need to call RadPane's RemoveFromParent() method for it to be garbage collected.
Please check out these links:
http://www.telerik.com/forums/radpanegroup-memory-leak
http://www.telerik.com/forums/radpane-not-garbage-collected-when-closed
A few points:
Setting the RadPane's context to null isn't enough. You should also unsubscribe from any event subscriptions on long-lived objects and call Dispose on all disposable objects.
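For example, a rough sketch of a cleanup routine for a closing pane, assuming the "context" is the pane's DataContext; RemoveFromParent() is the Telerik method mentioned above, and the event/service names are placeholders:

// Hypothetical cleanup for a closing pane; 'pane' is the RadPane being closed and
// the commented-out line stands in for whatever event subscriptions your code made.
private void CleanUpPane(RadPane pane)
{
    // 1. Unsubscribe from long-lived publishers so they no longer keep the pane (or its view model) alive.
    // someLongLivedService.SomethingChanged -= OnSomethingChanged;

    // 2. Dispose anything disposable owned by the pane's view model.
    var disposable = pane.DataContext as IDisposable;
    if (disposable != null)
    {
        disposable.Dispose();
    }

    // 3. Drop the view model reference and detach the pane from the docking tree.
    pane.DataContext = null;
    pane.RemoveFromParent();
}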
How do you measure memory? The memory would not be released right after you close the RadPane; garbage collection occurs only when it is required. If you want to test memory usage, force a collection and wait for finalization first:
GC.Collect();
GC.WaitForPendingFinalizers();
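A commonly used variant adds a second collection after the finalizers have run, since finalizers can release references that only become collectable afterwards:

GC.Collect();                  // collect, and queue objects that need finalization
GC.WaitForPendingFinalizers(); // let the finalizer thread finish
GC.Collect();                  // collect objects that were released by the finalizers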
However, if you want to track memory leaks accurately, you need to use a proper profiling tool which would show you growing objects and their retention graphs.
Have a look at this answer for a good memory profiler.
I am receiving a very large list as a method argument and would like to remove it from memory after it is used. Normally I would let the GC do its thing, but I need to be very careful of memory usage in this app.
Will this code accomplish my goal? I've read a lot of differing opinions and am confused.
public void Save(IList<Employee> employees)
{
// I've mapped the passed-in list
var data = Mapper<Employee, EmployeeDTO>.MapList(employees);
// ?????????????
employees = null;
GC.Collect();
// Continues to process very long running methods....
// I don't want this large list to stay in memory
}
Maybe I should use another technique that I'm not aware of?
If the list is not used anymore the GC will automatically collect it when available memory is an issue.
However, if the caller uses the list after passing it to your function, the GC won't collect it even if you set your parameter to null (all you have is a reference to the list; you can't do anything about other objects that hold references as well).
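A small example of why setting the parameter to null has no effect on the caller's reference (the types and names here are just for illustration):

using System;
using System.Collections.Generic;

class ReferenceDemo
{
    static void Save(IList<int> employees)
    {
        // 'employees' is a copy of the caller's reference; setting it to null
        // only clears this local copy, not the caller's variable.
        employees = null;
    }

    static void Main()
    {
        var list = new List<int> { 1, 2, 3 };
        Save(list);
        Console.WriteLine(list.Count); // still prints 3 - the caller's list is untouched
    }
}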
Unless you have a measurable problem don't try and outsmart the GC.
This is not a direct answer to the question, but some ideas on how to handle the low-memory situations the poster (#Big Daddy) referred to.
If you're running into low-memory situations on an x64 platform with 8 GB of memory, you should first determine whether your application is responsible for it. If it is, run a memory profiler (CLR Profiler or something else, or even take a full user dump and run WinDbg on it) to see what is allocating the memory. It's possible that you have objects that are no longer used but are still referenced somewhere; this is not a true memory leak, but it is memory that never gets freed in your application. Most decent profilers will identify big objects (or objects with a lot of instances) along with their types.
I find it hard to believe that the list passed to this Save function would stress a server with 8 GB of memory, but we don't know how much free memory is available to the process or what kind of process it is (IIS, desktop, etc.).
If Save is called on several threads with huge inputs, it can potentially lead to memory pressure, but even then it's not very likely, and I'd check various counters and profiling data to see when and why the memory pressure happens.
Is there any tool that can take a heap dump from a running application and determine/group objects by where in the source code they were created?
With no changes to the source code, and ideally something free.
What about a .NET memory profiler, for example ANTS Memory Profiler?
Maybe CLR Profiler.
The information is not available if you create a memory dump. In order to gather this, you have to monitor the process as it is running. You could launch the application via WinDbg and set breakpoints on all the constructors you're interested in (hopefully you don't want to look at each and every object).
If you set up the breakpoints so that they dump the stack, you will have the point of creation for each object. However, keep in mind that objects may move around during GC, which makes pairing objects with stacks difficult (or even impossible in some cases).
Since your question is tagged with performance and profiling, I gather that you want to reduce memory allocations. Why not just look at the number of objects created (or possibly the largest objects created) by looking at the heap, and then go through the source code and figure out where such instances are created?
As others have suggested memory profilers: MemProfiler is definitely the most advanced one (I've tried all of the existing .NET profilers). It has a 14-day trial.
You need a .NET memory profiler. These tools allow you to follow the object graphs on the garbage-collected heap and can be very useful in identifying the sources of memory leaks. While they may not necessarily tell you the method where an object was created, they will tell you which instances of which classes are holding on to the objects, and they allow you to diff snapshots of the GC heap. They don't require modifications to the source code. You may want to have a look at
What Are Some Good .NET Profilers?
Our QA teams use http://www.jetbrains.com/profiler/ for this kind of thing here when we run into bottlenecks. I'm pretty sure it will give you a list of allocations by method call. I'll go install it and check :)
Good old WinDbg + SOS + PDBs will handle the dumping.
As for the "where in source code they were created" part - that is impossible without instrumentation or injection.
The SOS Debugging Extension
How to use it: http://msdn.microsoft.com/en-us/library/yy6d2sxs.aspx
I am hoping that someone can shed some light on how .NET handles garbage collection in the following case.
I have a program where I need to provide a very specific kind of "Find in Files" functionality like you would see in Visual Studio. I have to search potentially thousands of files, and I collect the results in a List<Pair> object, where Pair is a simple class I created for storing a pair of items (obviously).
When I am done with what I need, I call Clear() on the list in order to get rid of the old information. This does not seem to help free memory, because I can see in Task Manager that the memory consumed does not decrease.
For a really large search, I am potentially dealing with 5,000,000 lines of information (approx. 500 MB of memory usage on my machine) that need to be handled. When my search is finished, the memory consumption stays at the same level. I made my Pair class implement IDisposable, and that didn't help.
Any idea what I might be missing? Thanks!
The garbage collector will reclaim the memory when needed - that is, not when you "clear" the list, but when it finds that none of the items that were referenced in it are referenced anymore and the process/computer is running low on memory.
There is no need to micromanage memory in C#.
The .NET garbage collector is surprisingly good. In general you shouldn't worry about the memory consumption you see in Task Manager because, as you are observing, the garbage collector doesn't reclaim memory as soon as you might think. The reason is that reclaiming memory is an expensive operation: if the memory isn't needed at that moment, why go messing around in there? The inner workings of when it does reclaim space are pretty involved. The GC goes through different levels of collection (called generations) to reclaim memory, optimized for speed.
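If you are curious, you can watch the generations at work from code (a small illustration, not a profiling technique):

using System;

class GenerationDemo
{
    static void Main()
    {
        var data = new byte[1000];
        Console.WriteLine("Generation after allocation: " + GC.GetGeneration(data)); // typically 0

        GC.Collect(); // force a collection; the surviving object is promoted
        Console.WriteLine("Generation after one collection: " + GC.GetGeneration(data)); // typically 1

        Console.WriteLine("Gen 0 collections so far: " + GC.CollectionCount(0));
        Console.WriteLine("Gen 2 collections so far: " + GC.CollectionCount(2));
    }
}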
There are lots of articles that explain this in far more detail than I can here. This is a good starting point:
http://msdn.microsoft.com/en-us/library/ms973837.aspx
For now you should see at what point you end up getting out of memory exceptions, if at all, and go from there.
When you call Clear(), all references to the Pair objects will be removed. This will cause those objects to be GC'ed eventually, unless another object holds references to them, but you cannot count on when that will happen - it also depends on memory pressure.
As a side note you can use Tuple in C# 4 instead of Pair.
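For example, the find-in-files results could be stored without a custom Pair class (a minimal sketch, assuming the two items are a file path and a matching line of text):

using System;
using System.Collections.Generic;

class FindInFilesDemo
{
    static void Main()
    {
        // Tuple<string, string> replaces the hand-written Pair class.
        var results = new List<Tuple<string, string>>();
        results.Add(Tuple.Create(@"C:\project\Program.cs", "// TODO: fix this"));

        foreach (var result in results)
        {
            Console.WriteLine(result.Item1 + ": " + result.Item2);
        }

        // When the results are no longer needed, clearing the list and dropping the
        // reference is enough; the GC reclaims the memory when it needs to.
        results.Clear();
        results = null;
    }
}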