I have the following simple program which I am trying to use with VS 2015's Diagnostic Tools related to Memory Usage.
class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Begin");
        Console.ReadLine();
        Goo();
        Console.WriteLine("End");
        Console.ReadLine();
    }

    private static void Goo()
    {
        var list = new List<string>();
        for (var i = 0; i < 1000; i++)
        {
            Foo(list);
        }
    }

    private static void Foo(IEnumerable<string> strings)
    {
        foreach (var str in strings)
        {
        }
    }
}
While profiling this application's project, I took a couple of snapshots and was expecting to see 1000 boxed List<string>+Enumerator objects. I get this kind of information in JetBrains's dotMemory product, for example, but for some reason I am unable to see it in VS's tool. I am obviously missing something; can anyone point me in the right direction?
As you can see in the above snapshot, I get information about the mscorlib module only, whereas I see nothing about my own program. What am I missing? Some more info below:
I used Start Diagnostic Tools Without Debugging in Visual Studio.
After taking and opening the snapshot, I even unselected the Collapse small objects option to see if it was hiding any info, but that did not help either.
Update (responding to the answer below):
I am using dotMemory version 4.4. Following is a snapshot of the data I get from it. NOTE: make sure to click the Collect Allocations button before you press any key after seeing the Begin message.
All objects created in Goo and Foo have already been collected by the time you take the snapshot at the "End" point. I profiled this code with dotMemory 10.0.1 and also do not see any objects created in the Goo and Foo methods.
UPDATE: In dotMemory you are looking at the "Memory Traffic" view. Memory traffic means objects that were created, and possibly already collected, by a given point in time. dotMemory shows you a warning that it is not able to display collected objects. If you check the "Start collecting allocation data immediately" checkbox in the profiling setup dialog, dotMemory will show you that these 1000 objects were allocated and already collected. In the VS Diagnostic Tools you are looking at a graph of live objects. I'm not very familiar with this tool, but it seems there is no information about memory traffic.
If you look at the live objects graph in dotMemory (the "All objects" view), you won't find these objects either.
The memory analysis tool works by iterating over all of the GC roots and traversing the object graphs from there, similar to what the GC does. Once the method returns, the local variable containing the enumerator is no longer a GC root. Any heap object whose reference is stored only in such a local, and is not reachable through another GC root, is unreachable and essentially gone at that point.
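For context on where the 1000 boxes come from: List<T>.GetEnumerator() returns a struct enumerator, and calling it through the IEnumerable<T> interface (as Foo does via foreach) boxes that struct onto the heap. A minimal sketch of the effect; the demo class name is mine, not from the original program:

```csharp
using System;
using System.Collections.Generic;

class EnumeratorBoxingDemo
{
    static void Main()
    {
        var list = new List<string> { "a", "b" };

        // foreach over the concrete List<string> uses the value-type
        // List<string>.Enumerator directly: no heap allocation.
        foreach (var s in list) { }

        // Calling GetEnumerator through the interface boxes the struct;
        // each such call allocates a new box on the heap.
        IEnumerable<string> asInterface = list;
        IEnumerator<string> boxed = asInterface.GetEnumerator();

        // The underlying type is still the value-type enumerator...
        Console.WriteLine(boxed.GetType().IsValueType); // True
        // ...but 'boxed' is an interface reference, so the struct lives
        // in a heap-allocated box until that box becomes unreachable.
    }
}
```

Typing Foo's parameter as List<string> instead of IEnumerable<string> would avoid the box entirely, since foreach would then use the struct enumerator directly.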
Related
I'm developing a C# tool to assist me in analyzing log files for my work, and I'm using Winforms to create the interface. I've noticed that every time I check or un-check an item in a CheckedListBox, the memory usage of my program jumps a bit (a few hundred kilobytes at most). Repeatedly checking and unchecking items causes the program to climb in memory usage from something like 50MB to 150MB, and it just keeps rising from there.
I've monitored the memory usage through Visual Studio's profiling tools, dotMemory, and Task Manager. Each confirms that the memory is climbing, but I'm not sure why.
Here's dotMemory:
I am unable to locate the "leak" in my code, so I tested checking/unchecking on a blank Winform with a number of identical items in a CheckedListBox, and noticed that the memory climbs similarly (albeit to a lesser degree)!
I'm obviously a novice when it comes to C# and memory management. I'm not sure whether this is something I should be worried about, or whether I'm just being impatient with the garbage collector. The problem does appear to be in unmanaged memory, though...
The full spaghetti source is here if you're at all interested.
In your code:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(CountMatches(item, splitFileContents));
You create the toAdd object, but then you never add it to the checked list; instead you call CountMatches a second time and add that new object. Use the same object when adding to the checked list:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(toAdd);
You can also release a control's resources by calling its Dispose method in the ControlRemoved event:
checkedListBoxMore.ControlRemoved += (ss, ee) => {
    ee.Control.Dispose();
};
I have a long-running application that consistently fails due to a memory leak.
I suspect my use of static properties may be the cause. Here's an example of what I have today:
public class StaticReferences
{
    public static readonly object Fixed1 = new object();
}

public class ShortLived
{
    public object Object1;
}

public class Doer // This class is instantiated once
{
    public void DoStuff() // This method is called over and over again.
    {
        var shortLived = new ShortLived()
        {
            Object1 = StaticReferences.Fixed1
        };
    }
}
Will an instance of ShortLived with its reference to StaticReferences.Fixed1 (via the ShortLived.Object1 property) get properly garbage collected once it is out of scope?
No, just referencing global static properties won't create a memory leak. The example you posted is fine: shortLived becomes eligible for collection as soon as it goes out of scope, and the reference to Fixed1 is cleaned up when your program exits. Your problem is very likely elsewhere, but it's impossible to say from your simplified example. Do you have any proof that you're actually looking at a memory leak?
I suggest you use a memory profiler or get a full memory dump and analyze it (WinDbg is free, but there are other, easier-to-use paid tools too). Another tool you can try is DebugDiag from Microsoft (also free): get a dump and then run it through DebugDiag to get a memory report.
As #EricJ mentioned in his comment, the profiler in Visual Studio 2015 is also a great tool to analyze memory use and it's available in all editions, including the free Community Edition.
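By contrast, here is the kind of static usage that does leak: a static field is a GC root for the lifetime of the process, so anything reachable from it can never be collected. A hedged sketch; the names are invented for illustration and not taken from the question:

```csharp
using System;
using System.Collections.Generic;

class StaticLeakDemo
{
    // A static field is a GC root for the lifetime of the AppDomain,
    // so everything reachable from it is never collected.
    static readonly List<byte[]> Cache = new List<byte[]>();

    static void HandleRequest()
    {
        // Each call pins another 1 MB in memory "forever".
        Cache.Add(new byte[1024 * 1024]);
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            HandleRequest();

        // The ShortLived example from the question does NOT do this:
        // it only copies a reference *from* a static field into a
        // short-lived object, which keeps nothing extra alive.
        Console.WriteLine(Cache.Count); // 10
    }
}
```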
I just delivered my first C# WebAPI application to the first customer. Under normal load, performance initially is even better than I expected. Initially.
Everything worked fine until, at some point, memory usage was up and garbage collection started running riot (as in "it collects objects that are not yet garbage"). At that point there were multiple W3WP worker processes holding some ten gigabytes of RAM altogether, with single-digit gigabytes per worker. After a restart of IIS everything was back to normal, but of course the memory usage is rising again.
Please correct me if I am wrong, but
Shouldn't C# have automatic garbage collection?
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
And please help me out:
How can I explicitly state what GC should collect, thus preventing memory leaks? Is someBigList = null; the way to go?
How can I detect where the memory leaks are?
EDIT: Let me clarify some things.
My .NET WebAPI application is mostly a bunch of
public class MyApiController : ApiController
{
    [HttpGet]
    public MyObjectClass[] MyApi(string someParam)
    {
        List<MyObjectClass> list = new List<MyObjectClass>();
        ...
        for/while/foreach {
            MyObjectClass obj = new MyObjectClass();
            obj.firstStringAttribute = xyz;
            ...
            list.Add(obj);
        }
        return list.ToArray();
    }
}
Under such conditions, GC should be easy: after "return", all local variables should be garbage. Yet with every single call the used memory increases.
I initially thought that C# WebAPI programs behave similarly to (pre-compiled) PHP: IIS calls the program, it executes, returns its value, and is then completely disposed of.
But this is not the case. For instance, I found that static variables keep their data between runs, so I have now gotten rid of all static variables.
Because I found static variables to be a problem for GC:
internal class Helper
{
    private static List<string> someVar = new List<string>();

    internal Helper()
    {
        someVar = new List<string>();
    }

    internal void someFunc(string str)
    {
        someVar.Add(str);
    }

    internal string[] someOtherFunc(string str)
    {
        string[] s = someVar.ToArray();
        someVar = new List<string>();
        return s;
    }
}
Here, under low-memory conditions, accessing someVar threw a NullReferenceException, which in my opinion can only be caused by the GC, since I did not find any code where I actively set someVar to null.
I think the memory increase has slowed down since I actively set the biggest array variables in the most frequently used controllers to null, but this is only a gut feeling and not even close to a complete solution.
I will now do some profiling using the link you provided, and get back with some results.
Shouldn't C# have automatic garbage collection?
C# is a programming language for the .NET runtime, and .NET brings the automatic garbage collection to the table. So, yes, although technically C# isn't the piece that brings it.
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
Sure, it should be just as easy as for any other type of .NET application.
The common theme here is garbage. How does .NET determine that something is garbage? By verifying that there are no more live references to the object. To be honest, I think it is far more likely that one of your assumptions is wrong than that there is a serious bug in the garbage collector such that "it collects objects that are not yet garbage".
To find leaks, you need to figure out what objects are currently held in memory, make a determination whether that is correct or not, and if not, figure out what is holding them there. A memory profiler application would help with that, there are numerous available, such as the Red-Gate ANTS Memory Profiler.
For your other questions, how to make something eligible for garbage collection? By turning it into garbage (see definition above). Note that setting a local variable to null may not necessarily help or be needed. Setting a static variable to null, however, might. But the correct way to determine that is to use a profiler.
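To make the local-versus-static distinction concrete, here is a small sketch (the field and variable names are invented): nulling a local is usually redundant, while nulling a static field genuinely releases a GC root.

```csharp
using System;
using System.Collections.Generic;

class NullingDemo
{
    // Rooted for the process lifetime: setting this to null is the only
    // way to let its contents become garbage before shutdown.
    static List<byte[]> staticCache = new List<byte[]> { new byte[1024] };

    static void Main()
    {
        // Local: nulling it is usually pointless. The JIT tracks the last
        // use of 'local', and the array becomes collectible at that point
        // even without the assignment below.
        var local = new byte[1024];
        Console.WriteLine(local.Length);
        local = null; // redundant in optimized builds

        // Static: this genuinely releases the root, so the cached
        // arrays can now be collected.
        staticCache = null;

        Console.WriteLine("done");
    }
}
```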
Here are some shot-in-the-dark type of tips you might look into:
Look at static classes, static fields, and static properties. Are you storing data there that is accumulating?
How about static events? Do you use any? Do you remember to unsubscribe from the event when you no longer need it?
And by "static fields, properties, and events", I also mean normal instance fields, properties and events that are held in objects that directly or indirectly are stored in static fields or properties. Basically, anything that will keep the objects in memory.
Are you remembering to Dispose of all your IDisposable objects? If not, the memory being used could be unmanaged. Typically, when the garbage collector collects a managed object, that object's finalizer should clean up the unmanaged memory as well; however, you might be allocating memory that the GC algorithm isn't aware of, so it thinks it isn't a big problem to postpone collection. See the GC.AddMemoryPressure method for more on this.
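On the static-events point above: a static event's invocation list is itself reachable from a GC root, so every object still subscribed to it stays alive. A hedged sketch of the subscribe/unsubscribe discipline (all type and event names here are invented):

```csharp
using System;

static class AppEvents
{
    // A static event; its invocation list is reachable from a GC root.
    public static event EventHandler SomethingHappened;

    public static void Raise() =>
        SomethingHappened?.Invoke(null, EventArgs.Empty);
}

class RequestScopedListener
{
    public RequestScopedListener() =>
        AppEvents.SomethingHappened += OnSomething;

    void OnSomething(object sender, EventArgs e) { }

    // Without this, every listener ever created stays reachable through
    // the static event's delegate list and is never collected.
    public void Unsubscribe() =>
        AppEvents.SomethingHappened -= OnSomething;
}

class Demo
{
    static void Main()
    {
        var listener = new RequestScopedListener();
        AppEvents.Raise();
        listener.Unsubscribe(); // the listener can now be collected
    }
}
```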
I am experiencing a problem in my application with an OutOfMemoryException. My application searches for words within texts. When I start a long-running search of about 2000 different texts for about 2175 different words, the application terminates about 50% of the way through with an OutOfMemoryException (after about 6 hours of processing).
I have been trying to find the memory leak. I have an object graph like this (--> denotes a reference):
a static global application object (controller) --> an algorithm starter object --> text mining starter object --> text mining algorithm object (this object performs the searching).
The text mining starter object starts the text mining algorithm object's Run() method in a separate thread.
To try to fix the issue, I edited the code so that the text mining starter object splits the texts into several groups and initializes one text mining algorithm object per group sequentially (when one text mining algorithm object has finished, a new one is created to search the next group of texts). Each time, I set the previous text mining algorithm object to null, but this did not solve the issue.
When I create a new text mining algorithm object I have to give it some parameters. These are taken from properties of the previous text mining algorithm object before I set that object to null. Will this prevent garbage collection of the text mining algorithm object?
Here is the code for the creation of new text mining algorithm objects by the text mining algorithm starter:
private void RunSeveralAlgorithmObjects()
{
    IEnumerable<ILexiconEntry> currentEntries =
        allLexiconEntries.GetGroup(intCurrentAlgorithmObject, intNumberOfAlgorithmObjectsToUse);
    algorithm.LexiconEntries = currentEntries;
    algorithm.Run();
    intCurrentAlgorithmObject++;
    for (int i = 0; i < intNumberOfAlgorithmObjectsToUse - 1; i++)
    {
        algorithm = CreateNewAlgorithmObject();
        AddAlgorithmListeners();
        algorithm.Run();
        intCurrentAlgorithmObject++;
    }
}

private TextMiningAlgorithm CreateNewAlgorithmObject()
{
    TextMiningAlgorithm newAlg = new TextMiningAlgorithm();
    newAlg.SortedTermStruct = algorithm.SortedTermStruct;
    newAlg.PreprocessedSynonyms = algorithm.PreprocessedSynonyms;
    newAlg.DistanceMeasure = algorithm.DistanceMeasure;
    newAlg.HitComparerMethod = algorithm.HitComparerMethod;
    newAlg.LexiconEntries = allLexiconEntries.GetGroup(intCurrentAlgorithmObject, intNumberOfAlgorithmObjectsToUse);
    newAlg.MaxTermPercentageDeviation = algorithm.MaxTermPercentageDeviation;
    newAlg.MaxWordPercentageDeviation = algorithm.MaxWordPercentageDeviation;
    newAlg.MinWordsPercentageHit = algorithm.MinWordsPercentageHit;
    newAlg.NumberOfThreads = algorithm.NumberOfThreads;
    newAlg.PermutationType = algorithm.PermutationType;
    newAlg.RemoveStopWords = algorithm.RemoveStopWords;
    newAlg.RestrictPartialTextMatches = algorithm.RestrictPartialTextMatches;
    newAlg.Soundex = algorithm.Soundex;
    newAlg.Stemming = algorithm.Stemming;
    newAlg.StopWords = algorithm.StopWords;
    newAlg.Synonyms = algorithm.Synonyms;
    newAlg.Terms = algorithm.Terms;
    newAlg.UseSynonyms = algorithm.UseSynonyms;
    algorithm = null;
    return newAlg;
}
Here is the start of the thread that is running the whole search process:
// Run the algorithm in its own thread
Thread algorithmThread = new Thread(new ThreadStart(RunSeveralAlgorithmObjects));
algorithmThread.Start();
Can something here prevent the previous text mining algorithm object from being garbage collected?
I recommend first identifying what exactly is leaking. Then postulate a cause (such as references in event handlers).
To identify what is leaking:
Enable native debugging for the project. Properties -> Debug -> check Enable unmanaged code debugging.
Run the program. Since the memory leak is probably gradual, you probably don't need to let it run the whole 6 hours; just let it run for a while and then Debug -> Break All.
Bring up the Immediate window. Debug -> Windows -> Immediate
Type one of the following into the immediate window, depending on whether you're running 32 or 64 bit, .NET 2.0/3.0/3.5 or .NET 4.0:
.load C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\sos.dll for 32-bit .NET 2.0-3.5
.load C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\sos.dll for 32-bit .NET 4.0
.load C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\sos.dll for 64-bit .NET 2.0-3.5
.load C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\sos.dll for 64-bit .NET 4.0
You can now run SoS commands in the Immediate window. I recommend checking the output of !dumpheap -stat, and if that doesn't pinpoint the problem, check !finalizequeue.
Notes:
Running the program the first time after enabling native debugging may take a long time (minutes) if you have VS set up to load symbols.
The debugger commands that I recommended both start with ! (exclamation point).
These instructions are courtesy of the incredible Tess from Microsoft, and Mario Hewardt, author of Advanced .NET Debugging.
Once you've identified the leak in terms of which object is leaking, then postulate a cause and implement a fix. Then you can do these steps again to determine for sure whether or not the fix worked.
1) As I said in a comment, if you use events in your code (AddAlgorithmListeners makes me suspect you do), subscribing to an event can create a "hidden" dependency between objects which is easily forgotten. This dependency can mean that an object is not freed because someone is still listening to one of its events. Make sure you unsubscribe from all events when you no longer need to listen to them.
2) Also, I'd like to point you to one (probably not-so-off-topic) issue with your code:
private void RunSeveralAlgorithmObjects()
{
    ...
    algorithm.LexiconEntries = currentEntries;
    // ^ when/where is algorithm initialized?
    for (...)
    {
        algorithm = CreateNewAlgorithmObject();
        ....
    }
}
Is algorithm already initialized when this method is invoked? Otherwise, setting algorithm.LexiconEntries wouldn't seem like a valid thing to do. This means your method depends on some external state, which to me looks like a place for bugs to creep into your program logic.
If I understand it correctly, this object contains some state describing the algorithm, and CreateNewAlgorithmObject derives a new state for algorithm from the current one. If this were my code, I would make algorithm an explicit parameter to all of these functions, as a signal that the method depends on this object. It would then no longer be hidden "external" state upon which your functions depend.
P.S.: If you don't want to go down that route, the other thing you could consider to make your code more consistent is to turn CreateNewAlgorithmObject into a void method and re-assign algorithm directly inside that method.
Is AddAlgorithmListeners attaching event handlers to events exposed by the algorithm object? Do the listening objects live longer than the algorithm object? If so, they can keep the algorithm object from being collected.
If yes, try unsubscribing from the events before you let the object go out of scope.
for (int i = 0; i < intNumberOfAlgorithmObjectsToUse - 1; i++)
{
    algorithm = CreateNewAlgorithmObject();
    AddAlgorithmListeners();
    algorithm.Run();
    RemoveAlgorithmListeners(); // See if this fixes your issue.
    intCurrentAlgorithmObject++;
}
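A RemoveAlgorithmListeners implementation would simply mirror every += in AddAlgorithmListeners with a -=. A sketch of what that pairing might look like; the ProgressChanged event and handler names are guesses, since the real TextMiningAlgorithm members aren't shown in the question:

```csharp
using System;

class TextMiningAlgorithm
{
    // Assumed event; the real class presumably exposes something similar.
    public event EventHandler ProgressChanged;

    public void Run() => ProgressChanged?.Invoke(this, EventArgs.Empty);
}

class AlgorithmStarter
{
    TextMiningAlgorithm algorithm = new TextMiningAlgorithm();

    void AddAlgorithmListeners()
    {
        algorithm.ProgressChanged += OnProgressChanged;
    }

    // Mirror every += with a -=. The publisher's event holds references
    // to its subscribers, so unsubscribing breaks the link between the
    // algorithm object and this long-lived starter.
    void RemoveAlgorithmListeners()
    {
        algorithm.ProgressChanged -= OnProgressChanged;
    }

    void OnProgressChanged(object sender, EventArgs e) { }
}
```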
My suspicion is AddAlgorithmListeners(); are you sure you remove the listeners after execution has completed?
Is the IEnumerable returned by GetGroup() throw-away or cached? That is, does it hold onto the objects it has emitted? If it does, memory would obviously grow linearly with each iteration.
Memory profiling is useful; have you tried examining the application with a profiler? I found Red Gate's useful in the past (it's not free, but it does have an evaluation version, IIRC).
I have an object which lives forever. After using it I delete all the references to it that I can see, but it is still not collected. Its life cycle is pretty sophisticated, so I can't be sure that all references have been cleared.
if (container.Controls.Count > 0)
{
    var controls = new Control[container.Controls.Count];
    container.Controls.CopyTo(controls, 0);
    foreach (var control in controls)
    {
        container.Controls.Remove(control);
        control.Dispose();
    }
    controls = null;
}
GC.Collect();
GC.Collect(1);
GC.Collect(2);
GC.Collect(3);
How can I find out what references does it still have? Why is it not collected?
Try using a memory profiler (e.g. ANTS); it will tell you what is keeping the object alive. Trying to second-guess this type of problem is very hard.
Red Gate gives a 14-day trial, which should be more than enough time to track down this problem and decide whether a memory profiler provides you with long-term value.
There are lots of other memory profilers on the market (e.g. .NET Memory Profiler); most of them have free trials. However, I have found the Red Gate tools easy to use, so I tend to try them first.
You'll have to use WinDbg and the SOSEX extension.
The !DumpHeap and !GCRoot commands can help you identify the instance and all the remaining references that keep it alive.
I solved a similar issue with the SOS extension (which apparently no longer works with Visual Studio 2013, but works fine with older versions of Visual Studio).
I used following code to get the address of the object for which I wanted to track references:
public static string GetAddress(object o)
{
    if (o == null)
    {
        return "00000000";
    }
    else
    {
        unsafe
        {
            System.TypedReference tr = __makeref(o);
            System.IntPtr ptr = **(System.IntPtr**)(&tr);
            return ptr.ToString("X");
        }
    }
}
and then, in Visual Studio 2012 immediate window, while running in the debugger, type:
.load C:\Windows\Microsoft.NET\Framework\v4.0.30319\sos.dll
which will load the SOS.dll extension.
You can then use GetAddress(x) to get the hexadecimal address of the object (for instance 8AB0CD40), and then use:
!do 8AB0CD40
!GCRoot -all 8AB0CD40
to dump the object and find all references to the object.
Just keep in mind that if the GC runs, it might change the address of the object.
I've been using .NET Memory Profiler to do some serious memory profiling on one of our projects. It's a great tool for looking into the memory management of your app. I don't get paid for this info :) but it just helped me a lot.
The garbage collection in .NET is not a reference-counting scheme (as in COM), but a mark-and-sweep implementation. Basically, the GC runs at "random" times when it feels the need to do so, so the collection of objects is not deterministic.
You can, however, manually trigger a collection (GC.Collect()), but you may then have to wait for finalizers to run (GC.WaitForPendingFinalizers()). Doing this in a production app is discouraged, because it can hurt the efficiency of the memory management (the GC runs too often, or waits for finalizers to run). If the object still exists after that, it really does still have a live reference somewhere.
It is not collected because you haven't removed all references to it. The GC will only mark objects for collection if they have no roots in the application.
What means are you using to check up on the GC to see if it has collected your object?
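One concrete way to check up on the GC is a WeakReference: it tracks an object without keeping it alive, so after a forced collection you can ask whether the object was actually reclaimed. A minimal sketch:

```csharp
using System;
using System.Runtime.CompilerServices;

class CollectionCheckDemo
{
    // Create the target in a separate, non-inlined method so no
    // JIT-extended local in Main's frame keeps it alive.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static WeakReference MakeTracked()
    {
        var target = new object();
        return new WeakReference(target);
    }

    static void Main()
    {
        WeakReference wr = MakeTracked();

        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        // False means the object really was collected; True means
        // something (a root) still holds a reference to it. Note that
        // debug builds may extend local lifetimes and report True.
        Console.WriteLine(wr.IsAlive);
    }
}
```

Replacing `new object()` with the object in question (or its factory) turns this into a quick yes/no test before reaching for WinDbg.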