I have an object that lives forever. I am deleting every reference to it that I can see after using it, but it is still not collected. Its life cycle is pretty sophisticated, so I can't be sure that all references have been cleared.
if ( container.Controls.Count > 0 )
{
    var controls = new Control[ container.Controls.Count ];
    container.Controls.CopyTo( controls, 0 );
    foreach ( var control in controls )
    {
        container.Controls.Remove( control );
        control.Dispose();
    }
    controls = null;
}
GC.Collect();
GC.Collect(1);
GC.Collect(2);
GC.Collect(3);
How can I find out what references it still has? Why is it not collected?
Try using a memory profiler (e.g. ANTS); it will tell you what is keeping the object alive. Trying to second-guess this type of problem is very hard.
Red Gate gives a 14-day trial, which should be more than enough time to track down this problem and decide whether a memory profiler provides you with long-term value.
There are lots of other memory profilers on the market (e.g. .NET Memory Profiler), most of which have free trials; however, I have found that the Red Gate tools are easy to use, so I tend to try them first.
You'll have to use WinDbg and the SOSEX extension.
The !DumpHeap and !GCRoot commands can help you identify the instance and all remaining references that keep it alive.
I solved a similar issue with the SOS extension (which apparently no longer works with Visual Studio 2013, but works fine with older versions of Visual Studio).
I used following code to get the address of the object for which I wanted to track references:
public static string GetAddress(object o)
{
    if (o == null)
    {
        return "00000000";
    }
    else
    {
        unsafe
        {
            System.TypedReference tr = __makeref(o);
            System.IntPtr ptr = **(System.IntPtr**) (&tr);
            return ptr.ToString("X");
        }
    }
}
and then, in Visual Studio 2012 immediate window, while running in the debugger, type:
.load C:\Windows\Microsoft.NET\Framework\v4.0.30319\sos.dll
which will load the SOS.dll extension.
You can then use GetAddress(x) to get the hexadecimal address of the object (for instance 8AB0CD40), and then use:
!do 8AB0CD40
!GCRoot -all 8AB0CD40
to dump the object and find all references to the object.
Just keep in mind that if the GC runs, it might change the address of the object.
I've been using .NET Memory Profiler to do some serious memory profiling on one of our projects. It's a great tool for looking into the memory management of your app. I don't get paid for this info :) but it has helped me a lot.
Garbage collection in .NET is not a reference-counting scheme (as in COM), but a mark-and-sweep implementation. Basically, the GC runs at "random" times when it feels the need to do so, so collection of objects is not deterministic.
You can, however, manually trigger a collection (GC.Collect()), but you may have to wait for finalizers to run then (GC.WaitForPendingFinalizers()). Doing this in a production app, however, is discouraged, because it may affect the efficiency of the memory management (GC runs too often, or waits for finalizers to run). If the object still exists, it actually still has some live reference somewhere.
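For reference, a minimal sketch of the forced-collection pattern mentioned above (these are standard System.GC calls; treat it as a diagnostic aid, not a fix):
// Sketch: force a full, blocking collection for diagnostic purposes only.
GC.Collect();                   // collect all generations
GC.WaitForPendingFinalizers();  // let pending finalizers run
GC.Collect();                   // collect objects whose finalizers have just run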
It is not collected because you haven't removed all references to it. The GC will only mark objects for collection if they have no roots in the application.
What means are you using to check up on the GC to see if it has collected your object?
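One simple check (a sketch; suspectObject stands in for however you obtain a reference to your object) is to hold it only through a WeakReference and see whether it survives a forced collection:
// Sketch: use a WeakReference to verify whether the object has really been collected.
var weak = new WeakReference(suspectObject);
suspectObject = null;   // drop your own strong reference

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

Console.WriteLine(weak.IsAlive
    ? "Still rooted somewhere - not collected."
    : "Collected - no live references remain.");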
Using the System.DirectoryServices.Protocols library:
I have a class LdapItemOperator that takes a SearchResultEntry object from an LDAP query (not Active Directory related) and stores the attributes for the object in a field: readonly SearchResultAttributeCollection LdapAttributes.
The problem I am experiencing is that when I run a large operation, the garbage collector never seems to delete these objects after they ought to have been disposed of, because of the LdapAttributes field in my objects; at least I think that's the problem. What ways can I try to dispose of the objects when they are no longer required? I can't seem to find a way to incorporate a using statement here, although I only have a little experience with it.
As an example, let's say I have the following logic:
List<LdapItemOperator> itemList = GetList(ldapFilter);
List<bool> resultList = new List<bool>();
foreach (LdapItemOperator item in itemList) {
    bool result = doStuff(item);
    resultList.Add(result);
}
// Even though we are out of the loop now, the objects are still stored in memory, how come?
// Same goes for the previous objects in the loop, they seem to remain in memory
Logic.WriteResultToLog(resultList);
After a good while of running the logic on large filesets, this process starts taking up enormous amounts of memory, of course...
I think you might be a little confused about how the GC works. You can never know exactly when the GC will run, and objects you are still holding a reference to will not be collected (unless it's a weak reference...).
Also, "disposing" is a different concept that doesn't have much to do with the GC.
Basically, all objects will already be in memory after the call to GetList, and memory consumption will not change much after that; the foreach loop shouldn't affect it at all.
Without knowing your implementation, maybe try returning an enumerable instead of a single list, or make multiple batched calls.
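As a rough illustration of the "return an enumerable instead of a single list" idea, here is a sketch; EnumerateItems and RunLdapSearch are hypothetical stand-ins for the question's GetList and its LDAP query:
// Sketch: stream items one at a time instead of materializing the whole list up front.
static IEnumerable<LdapItemOperator> EnumerateItems(string ldapFilter)
{
    foreach (SearchResultEntry entry in RunLdapSearch(ldapFilter))  // hypothetical helper
    {
        yield return new LdapItemOperator(entry);
    }
}

// Each item becomes unreachable (and collectible) as soon as the loop moves on,
// assuming nothing else keeps a reference to it.
foreach (var item in EnumerateItems(ldapFilter))
{
    resultList.Add(doStuff(item));
}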
I'm developing a C# tool to assist me in analyzing log files for my work, and I'm using Winforms to create the interface. I've noticed that every time I check or un-check an item in a CheckedListBox, the memory usage of my program jumps a bit (a few hundred kilobytes at most). Repeatedly checking and unchecking items causes the program to climb in memory usage from something like 50MB to 150MB, and it just keeps rising from there.
I've monitored the memory usage through Visual Studio's profiling tools, dotMemory, and Task Manager. Each confirms that the memory is climbing, but I'm not sure why.
I was unable to locate the "leak" in my code, so I tested checking/unchecking on a blank Winform containing a number of identical items in a CheckedListBox and noticed that the memory climbs similarly (albeit to a lesser degree)!
I'm obviously a novice when it comes to C# and memory management. I'm not sure whether this is something I should be worried about, or whether I'm just being impatient with the garbage collector. The problem appears to be unmanaged memory, though...
The full spaghetti source is here if you're at all interested.
In your code:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(CountMatches(item, splitFileContents));
You created the toAdd object, but then you don't add it to the checked list box; instead you call CountMatches again and create another one.
Use the same object when adding to the checked list box:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(toAdd);
You can also free the object's memory by calling its Dispose method in the ControlRemoved event:
checkedListBoxMore.ControlRemoved += (ss, ee) => {
    ee.Control.Dispose();
};
I have the following simple program which I am trying to use with VS 2015's Diagnostic Tools related to Memory Usage.
class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Begin");
        Console.ReadLine();
        Goo();
        Console.WriteLine("End");
        Console.ReadLine();
    }

    private static void Goo()
    {
        var list = new List<string>();
        for (var i = 0; i < 1000; i++)
        {
            Foo(list);
        }
    }

    private static void Foo(IEnumerable<string> strings)
    {
        foreach (var str in strings)
        {
        }
    }
}
While profiling this application, I took a couple of snapshots and was expecting to see 1000 boxed List<string>+Enumerator objects. For example, I get this kind of information in JetBrains's dotMemory product, but for some reason I am unable to see it in VS's tool... I am obviously missing something. Can anyone point me in the right direction?
As you can see in the above snapshot, I get information about the mscorlib module only, whereas I do not see any information about my executing program. What am I missing? Some more info below:
I used Start Diagnostic Tools Without Debugging in visual studio
After taking and opening the snapshot, I even unselected the option Collapse small objects to see if this was hiding any info, but that also did not help.
Update (responding to the answer below):
I am using dotMemory version 4.4. Following is a snapshot of the data I get from it. NOTE: make sure to click the Collect Allocations button before you hit any key after seeing the Begin message.
All objects created in Goo and Foo have already been collected by the time you take the snapshot at the "End" point. I profiled this code with dotMemory 10.0.1 and also do not see any objects created in the Goo and Foo methods.
UPDATE: In dotMemory you are looking at the "Memory Traffic" view. Memory traffic means objects that were created, and possibly already collected, up to a given point in time. dotMemory shows you a warning that it is not able to display collected objects. If you check the "Start collecting allocation data immediately" checkbox in the profiling setup dialog, dotMemory will show you that these 1000 objects were allocated and already collected. In the VS Diagnostic Tools you are looking at a graph of live objects. I'm not very familiar with this tool, but it seems that there is no information about memory traffic.
If you look at the live objects graph in dotMemory (the "All objects" view), you won't find these objects either.
The memory analysis tool works by iterating all of the GC roots and traversing the object graphs from there - similar to what the GC does. Once the method is out of scope, the local variable containing the enumerator is no longer a GC root. Any heap objects whose reference is stored in a local and is not referenced through another GC root are unreachable and essentially gone at that point.
I just delivered my first C# WebAPI application to the first customer. Under normal load, performance initially is even better than I expected. Initially.
Everything worked fine until, at some point, memory usage shot up and garbage collection started running riot (as in "it collects objects that are not yet garbage"). At that point, there were multiple w3wp worker processes using some ten gigabytes of RAM altogether, with single-digit gigabytes per worker. After a restart of IIS everything was back to normal, but of course the memory usage started rising again.
Please correct me if I am wrong, but
Shouldn't C# have automatic garbage collection?
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
And please help me out:
How can I explicitly state what GC should collect, thus preventing memory leaks? Is someBigList = null; the way to go?
How can I detect where the memory leaks are?
EDIT: Let me clarify some things.
My .NET WebAPI application is mostly a bunch of
public class MyApiController : ApiController
{
    [HttpGet]
    public MyObjectClass[] MyApi(string someParam) {
        List<MyObjectClass> list = new List<MyObjectClass>();
        ...
        for/while/foreach {
            MyObjectClass obj = new MyObjectClass();
            obj.firstStringAttribute = xyz;
            ...
            list.Add(obj);
        }
        return list.ToArray();
    }
}
Under such conditions, GC should be easy: after "return", all local variables should be garbage. Yet with every single call the used memory increases.
I initially thought that C# WebAPI programs behave similarly to (pre-compiled) PHP: IIS calls the program, it is executed, returns the value, and is then completely disposed of.
But this is not the case. For instance, I found that static variables keep their data between runs, so I have now disposed of all static variables.
Because I found static variables to be a problem for GC:
internal class Helper
{
    private static List<string> someVar = new List<string>();

    internal Helper() {
        someVar = new List<string>();
    }

    internal void someFunc(string str) {
        someVar.Add(str);
    }

    internal string[] someOtherFunc(string str) {
        string[] s = someVar.ToArray();
        someVar = new List<string>();
        return s;
    }
}
Here, under low-memory conditions, someVar threw a NullReferenceException, which in my opinion can only have been caused by the GC, since I did not find any code of mine that actively sets someVar to null.
I think the memory increase slowed down since I actively set the biggest array variables in the most often used Controllers to null, but this is only a gut feeling and not even nearly a complete solution.
I will now do some profiling using the link you provided, and get back with some results.
Shouldn't C# have automatic garbage collection?
C# is a programming language for the .NET runtime, and .NET brings automatic garbage collection to the table. So, yes, although technically C# isn't the piece that brings it.
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
Sure, it should be just as easy as for any other type of .NET application.
The common theme here is garbage. How does .NET determine that something is garbage? By verifying that there are no more live references to the object. To be honest I think it is far more likely that you have verified one of your assumptions wrongly, compared to there being a serious bug in the garbage collector in such a way that "It collects objects that are not yet garbage".
To find leaks, you need to figure out what objects are currently held in memory, make a determination whether that is correct or not, and if not, figure out what is holding them there. A memory profiler application would help with that, there are numerous available, such as the Red-Gate ANTS Memory Profiler.
For your other questions, how to make something eligible for garbage collection? By turning it into garbage (see definition above). Note that setting a local variable to null may not necessarily help or be needed. Setting a static variable to null, however, might. But the correct way to determine that is to use a profiler.
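A minimal sketch of that difference (the Cache class and its field are invented for illustration):
// Sketch: a static field roots everything it references for the lifetime of the application.
static class Cache
{
    public static List<byte[]> BigData = new List<byte[]>();
}

class SomeController
{
    void Cleanup()
    {
        byte[] local = new byte[1024];
        local = null;          // usually pointless: the local dies when the method returns anyway

        Cache.BigData = null;  // this can matter: the list and its contents are no longer rooted
    }
}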
Here are some shot-in-the-dark type of tips you might look into:
Look at static classes, static fields, and static properties. Are you storing data there that is accumulating?
How about static events? Do you have any? Do you remember to unsubscribe from the event when you no longer need it? (See the sketch after these tips.)
And by "static fields, properties, and events", I also mean normal instance fields, properties and events that are held in objects that directly or indirectly are stored in static fields or properties. Basically, anything that will keep the objects in memory.
Are you remembering to Dispose of all your IDisposable objects? If not, then the memory being used could be unmanaged. Typically, however, when the garbage collector collects the managed object, the finalizer of that object should clean up the unmanaged memory as well, however you might allocate memory that the GC algorithm isn't aware of, and thus thinks it isn't a big problem to wait with collection. See the GC.AddMemoryPressure method for more on this.
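Here is a rough sketch of the static-event point from the list above; Notifier and RequestHandler are made-up names for illustration:
// Sketch: a static event keeps every subscriber alive until it unsubscribes.
static class Notifier
{
    public static event EventHandler SomethingHappened;
}

class RequestHandler
{
    public RequestHandler()
    {
        Notifier.SomethingHappened += OnSomethingHappened;  // the static event now roots this instance
    }

    public void Shutdown()
    {
        Notifier.SomethingHappened -= OnSomethingHappened;  // without this, the instance never becomes garbage
    }

    private void OnSomethingHappened(object sender, EventArgs e) { }
}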
I am using a 3rd-party object I didn't create that over time consumes a lot of resources. This object shouldn't in any way contain a state, it simply performs a calculation. Despite this fact, everytime I call a specific function of this object a little more memory is consumed. A few hours later, and my program is sitting at gigabytes of allocated memory.
The object was originally initialized as a static member of my Program class in my command-line application. I have found that if I wrap my entire program in a class and reinitialize it every now and again, the older (and bloated) object is collected by the GC and a new, smaller object replaces it.
My issue is this method is quite clumsy and ruins the flow of my Program.
Is there any other way to dispose of an object? I am led to believe GC.Collect() will only collect unreachable objects. Is there any way I can make an object 'unreachable'?
Edit: As requested, the code:
static ILexicon lexicon = new Lexicon();
...
lexicon.LoadDataFromFile(@"lexicon.dat", null);
...
byte similarityScore(string w1, string w2, PartOfSpeech pos, SimilarityMeasure measure)
{
    if (w1 == w2)
        return 255;
    if (pos != PartOfSpeech.Noun && pos != PartOfSpeech.Verb)
        return 0;

    IList<ILemma> w1_lemmas = lexicon.FindSenses(w1, pos);
    IList<ILemma> w2_lemmas = lexicon.FindSenses(w2, pos);

    byte result;
    byte score = 0;

    foreach (ILemma w1_lemma in w1_lemmas)
    {
        foreach (ILemma w2_lemma in w2_lemmas)
        {
            result = (byte) (w1_lemma.GetSimilarity(w2_lemma, measure) * 255);
            if (result > score)
                score = result;
        }
    }

    return score;
}
As similarityScore is called, more memory is allocated to a private member of lexicon. It does not implement IDisposable and there are no obvious functions to clear the memory. The library is based on WordNet, and uses an algorithm to find path lengths in the hypernym tree to calculate the similarity of two words. Unless there is caching, I can't see why it would need to store any memory. What is for sure, is I can't change it. I'm almost certain there is nothing wrong with my code. I just need to dispose of lexicon when it gets too large (N.B. it takes a second or two to load the lexicon from file to memory)
If the object doesn't implement IDisposable and you want to push it out of scope, you can set all references to it to null and then force garbage collection with GC.Collect().
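Applied to the question's code, that would look roughly like this (a sketch only, reusing the Lexicon and LoadDataFromFile calls shown above):
// Sketch: drop the only reference to the bloated lexicon, let the GC reclaim it, then rebuild it.
// Remember the question notes that reloading the data file takes a second or two.
lexicon = null;
GC.Collect();
GC.WaitForPendingFinalizers();

lexicon = new Lexicon();
lexicon.LoadDataFromFile(@"lexicon.dat", null);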
GC.Collect() is very expensive. If you're going to have to do this frequently, you might want to consider contacting the vendor.
Find out:
Whether you are using their library correctly, or whether something you're doing wrong is causing the memory leak.
If their library is leaking memory even when used as intended, can they fix the leak?
Additional note: If the 3rd party library is native and you're having to use interop, you can use Marshal.ReleaseComObject to free unmanaged memory.
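If that turns out to be the case, the call looks roughly like this (a sketch; comObject is a placeholder for the runtime callable wrapper you obtained through interop):
// Sketch: release a COM runtime callable wrapper once you are done with it.
// Requires: using System.Runtime.InteropServices;
if (comObject != null && Marshal.IsComObject(comObject))
{
    Marshal.ReleaseComObject(comObject);
    comObject = null;
}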
You could try calling the Dispose() method. This would make the object unusable, so you would have to instantiate another one. I assume your program runs in a loop, so it can be a loop variable, with the call to Dispose at the bottom.
I would suggest that if you can get your hands on a memory profiler, you use it. A memory profiler will let you pause your program, click on a class, and see a list of objects of that class. One can then click on an object and see how it was created, and the "path" to that object from a root (e.g. there's a static class foo, which holds a reference to a bar, which holds a reference to a boz, which holds a reference to a reallybigthing). Often, seeing that will make it clear what needs to be done to break the chain.
You might be able to download the source from the WordNet repository and modify the code, since it is open source.