Increasing memory usage from CheckedListBox -- Is this normal? - c#

I'm developing a C# tool to assist me in analyzing log files for my work, and I'm using Winforms to create the interface. I've noticed that every time I check or un-check an item in a CheckedListBox, the memory usage of my program jumps a bit (a few hundred kilobytes at most). Repeatedly checking and unchecking items causes the program to climb in memory usage from something like 50MB to 150MB, and it just keeps rising from there.
I've monitored the memory usage through Visual Studio's profiling tools, dotMemory, and Task Manager. Each confirms that the memory is climbing, but I'm not sure why.
Here's dotMemory:
I am unable to locate the "leak" in my code, so I tested checking / unchecking a blank Winform with a number of identical items in a CheckedListBox and noticed that the memory climbs similarly (albeit to a lesser degree)!
I'm obviously a novice when it comes to C# and memory management. I'm not sure whether this is something I should be worried about, or whether I'm just being impatient with the garbage collector, though the problem appears to be unmanaged memory...
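One quick way to tell "lazy collection" from a real leak is to compare the managed heap size around a forced collection. This is a minimal sketch of that idea (my code, not from the question):
long before = GC.GetTotalMemory(true);   // true forces a full collection first
// ... check/uncheck items a few hundred times here ...
long after = GC.GetTotalMemory(true);
System.Diagnostics.Debug.WriteLine("Managed heap delta: " + (after - before) + " bytes");
If the delta collapses after the forced collection, the growth you see in Task Manager is mostly uncollected-but-collectible garbage; if it keeps climbing, something is genuinely holding references (or the growth is in unmanaged memory that GetTotalMemory doesn't see).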
The full spaghetti source is here if you're at all interested.

In your code:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(CountMatches(item, splitFileContents));
You create the toAdd object, but then you never add it to the checked list; instead you call CountMatches a second time and add that new object.
Use the same object when adding to the checked list:
CheckListItem toAdd = CountMatches(item, splitFileContents);
if (toAdd.Display != string.Empty)
    checkedListBoxMore.Items.Add(toAdd);
You can also free a control's memory by calling its Dispose method in the ControlRemoved event handler:
checkedListBoxMore.ControlRemoved += (ss, ee) => {
    ee.Control.Dispose();
};

Related

C# Issue handling memory usage for SearchResultAttributeCollection LDAP

Using the System.DirectoryServices.Protocols library:
I have a class LdapItemOperator that takes a SearchResultEntry object from an LDAP query (not Active Directory related) and stores the attributes for the object in a field: readonly SearchResultAttributeCollection LdapAttributes.
The problem I am experiencing is that, during a large operation, the garbage collector never seems to collect these objects after they ought to have been released; I suspect the LdapAttributes field in my objects is the cause. What ways can I try to dispose of the objects when they are no longer required? I can't seem to find a way to incorporate a using statement here, although I only have a little experience with it.
As an example, let's say I have the following logic:
List<LdapItemOperator> itemList = GetList(ldapFilter);
List<bool> resultList = new List<bool>();
foreach (LdapItemOperator item in itemList) {
    bool result = doStuff(item);
    resultList.Add(result);
}
// Even though we are out of the loop now, the objects are still stored in memory, how come?
// Same goes for the previous objects in the loop, they seem to remain in memory
Logic.WriteResultToLog(resultList);
After a good while of running the logic on large filesets, this process starts taking up enormous amounts of memory, of course...
I think you might be a little confused about how GC works. You can never know exactly when GC will run. And objects you are still holding a reference to will not be collected (unless it's a weak reference...).
Also "disposing" is yet another different concept, that hasn't much to do with GC.
Basically, all objects will be in memory already after the call to GetList. And memory consumption will not change much after that, the foreach loop shouldn't affect it at all.
Without knowing your implementation, maybe try returning an enumerable instead of a single list, or make multiple batched calls.
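For example, a lazy enumerator along these lines lets each item become unreachable as soon as you are done with it. This is only a sketch: the LdapItemOperator constructor taking a SearchResultEntry and the RunPagedSearch helper are assumptions, not your actual API.
// Sketch: stream items one at a time instead of materializing the whole list.
IEnumerable<LdapItemOperator> GetItems(string ldapFilter)
{
    foreach (SearchResultEntry entry in RunPagedSearch(ldapFilter)) // RunPagedSearch is hypothetical
    {
        yield return new LdapItemOperator(entry);
    }
}

// Once an iteration finishes, nothing references that item anymore, so the GC
// is free to collect it together with its SearchResultAttributeCollection.
foreach (LdapItemOperator item in GetItems(ldapFilter))
{
    resultList.Add(doStuff(item));
}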

C# WebAPI Garbage Collection

I just delivered my first C# WebAPI application to the first customer. Under normal load, performance initially is even better than I expected. Initially.
Everything worked fine until, at some point, memory usage was up and garbage collection started running riot (as in "it collects objects that are not yet garbage"). At that point, there were multiple W3WP worker processes using some ten gigabytes of RAM altogether, with single-digit gigabytes per worker. After a restart of IIS everything was back to normal, but of course the memory usage is rising again.
Please correct me if I am wrong, but
Shouldn't C# have automatic garbage collection?
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
And please help me out:
How can I explicitly state what GC should collect, thus preventing memory leaks? Is someBigList = null; the way to go?
How can I detect where the memory leaks are?
EDIT: Let me clarify some things.
My .NET WebAPI application is mostly a bunch of
public class MyApiController : ApiController
{
    [HttpGet]
    public MyObjectClass[] MyApi(string someParam)
    {
        List<MyObjectClass> list = new List<MyObjectClass>();
        ...
        for/while/foreach {
            MyObjectClass obj = new MyObjectClass();
            obj.firstStringAttribute = xyz;
            ...
            list.Add(obj);
        }
        return list.ToArray();
    }
}
Under such conditions, GC should be easy: after "return", all local variables should be garbage. Yet with every single call the used memory increases.
I initially thought that C# WebAPI programs behave similarly to (pre-compiled) PHP: IIS calls the program, it is executed, returns the value, and is then completely disposed of.
But this is not the case. For instance, I found that static variables keep their data between runs, so I have now gotten rid of all static variables.
Because I found static variables to be a problem for GC:
internal class Helper
{
    private static List<string> someVar = new List<string>();

    internal Helper()
    {
        someVar = new List<string>();
    }

    internal void someFunc(string str)
    {
        someVar.Add(str);
    }

    internal string[] someOtherFunc(string str)
    {
        string[] s = someVar.ToArray();
        someVar = new List<string>();
        return s;
    }
}
Here, under low-memory conditions, someVar threw a NullReferenceException, which in my opinion can only have been caused by the GC, since I did not find any code of mine where someVar is actively set to null.
I think the memory increase slowed down once I actively set the biggest array variables in the most frequently used controllers to null, but this is only a gut feeling and nowhere near a complete solution.
I will now do some profiling using the link you provided, and get back with some results.
Shouldn't C# have automatic garbage collection?
C# is a programming language for the .NET runtime, and .NET brings the automatic garbage collection to the table. So, yes, although technically C# isn't the piece that brings it.
Shouldn't it be easy for GC to collect the garbage of a WebAPI application?
Sure, it should be just as easy as for any other type of .NET application.
The common theme here is garbage. How does .NET determine that something is garbage? By verifying that there are no more live references to the object. To be honest I think it is far more likely that you have verified one of your assumptions wrongly, compared to there being a serious bug in the garbage collector in such a way that "It collects objects that are not yet garbage".
To find leaks, you need to figure out what objects are currently held in memory, make a determination whether that is correct or not, and if not, figure out what is holding them there. A memory profiler application would help with that, there are numerous available, such as the Red-Gate ANTS Memory Profiler.
For your other questions, how to make something eligible for garbage collection? By turning it into garbage (see definition above). Note that setting a local variable to null may not necessarily help or be needed. Setting a static variable to null, however, might. But the correct way to determine that is to use a profiler.
Here are some shot-in-the-dark type of tips you might look into:
Look at static classes, static fields, and static properties. Are you storing data there that is accumulating?
How about static events? Do you have any? Do you remember to unsubscribe from them when you no longer need them? (See the sketch after this list.)
And by "static fields, properties, and events", I also mean normal instance fields, properties and events that are held in objects that directly or indirectly are stored in static fields or properties. Basically, anything that will keep the objects in memory.
Are you remembering to Dispose of all your IDisposable objects? If not, the memory being used could be unmanaged. Typically, when the garbage collector collects a managed object, that object's finalizer should clean up the unmanaged memory as well; however, you might allocate memory that the GC algorithm isn't aware of, so it thinks it isn't a big problem to postpone collection. See the GC.AddMemoryPressure method for more on this.
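To make the static-event point concrete, here is a minimal sketch; the type and member names are mine, not taken from your code:
using System;

public static class AppEvents
{
    public static event EventHandler SomethingHappened;

    public static void Raise()
    {
        SomethingHappened?.Invoke(null, EventArgs.Empty);
    }
}

public class RequestHelper
{
    public RequestHelper()
    {
        // The static event's delegate list now references this instance, so the
        // instance stays reachable for the lifetime of the AppDomain unless removed.
        AppEvents.SomethingHappened += OnSomethingHappened;
    }

    private void OnSomethingHappened(object sender, EventArgs e) { }

    public void Detach()
    {
        // Unsubscribing removes that root and lets the instance become garbage again.
        AppEvents.SomethingHappened -= OnSomethingHappened;
    }
}
Every RequestHelper that never calls Detach accumulates in memory, which looks exactly like the per-request growth you describe.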

List<T>.AddRange is causing a brief delay

I have a list of entities which implement an ICollidable interface. This interface is used to resolve collisions between entities. My entities are thus:
Players
Enemies
Projectiles
Items
Tiles
On each game update (about 60 t/s), I am clearing the list and adding the current entities based on the game state. I am accomplishing this via:
collidableEntities.Clear();
collidableEntities.AddRange(players);
collidableEntities.AddRange(enemies);
collidableEntities.AddRange(projectiles);
collidableEntities.AddRange(items);
collidableEntities.AddRange(camera.VisibleTiles);
Everything works fine until I add the visible tiles to the list. The first ~1-2 seconds of running the game loop causes a visible hiccup that delays drawing (so I can see a jitter in the rendering). I can literally remove/add the line that adds the tiles and see the jitter occur and not occur, so I have narrowed it down to that line.
My question is, why? The list of VisibleTiles is about 450-500 tiles, so it's really not that much data. Each tile contains a Texture2D (image) and a Vector2 (position) to determine what is rendered and where. I'm going to keep looking, but off the top of my head, I can't understand why only the first 1-2 seconds hiccup and it is then smooth from there on out.
Any advice is appreciated.
Update
I have tried increasing the list's initial capacity to the approximate number of elements, but no difference was observed.
Update
As requested, here is the code for camera.VisibleTiles
public List<Tile> VisibleTiles
{
    get { return this.visibleTiles; }
}
Whenever you have a performance issue like this, the correct response is not to guess at what possible causes might be; the correct response is to profile the application. There are a number of options available; I'm sure you can find suggestions here on StackOverflow.
However, I'm feeling lucky today, so I'm going to ignore the advice I just gave you and take a wild guess. I will explain my reasoning, which I hope will be helpful in diagnosing such problems in the future.
The fact that the problem you describe only happens once and only at the beginning of gameplay leads me to believe that it's not anything inherent to the functionality of the list itself, or of the collision logic. This is assuming, of course, that the number of objects in the list is basically constant across this period of time. If it was caused by either of these, I would expect it to happen every frame.
Therefore I suspect the garbage collector. You're likely to see GC happening right at the beginning of the game, in other words right after you've loaded all of your massive assets into memory. You're also likely to see it happen at seemingly random points in the code, because any object you allocate can theoretically push it over the edge into a collection.
My guess is this: when you load your game, the assets you create are generating a large amount of collection pressure which is nonetheless not sufficient to trigger a collection. As you allocate objects during the course of gameplay (in this case, as a result of resizing the list), it is increasing the collection pressure to the point where the GC finally decides to activate, causing the jitter you observe. Once the GC has run, and all of the appropriate objects have been relegated to their correct generations, the jitter stops.
As I said, this is just a guess, albeit an educated one. It is, however, simple to test. Add a call to GC.Collect(2) prior to entering your render loop. If I'm right, the jitter should go away.
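For instance, a sketch assuming an XNA Game subclass (BeginRun runs once before the first frame; this is my illustration, not code from the question):
protected override void BeginRun()
{
    // Collect the post-load garbage up front so it doesn't land in the first frames.
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
    base.BeginRun();
}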
If you want to be more thorough than that (and I highly recommend it), Microsoft provides a tool that is useful for debugging memory issues: the CLR Profiler. It will show you exactly when collections occur and why, and it's very useful for XNA development.
The 1-2 seconds may be caused by the GC adjusting the sizes of each generation. Look at the corresponding perf counters and see whether there is a large number of Gen1/Gen2 collections in the first seconds (minimizing GC is a useful goal for games).
Additional random guess: all your objects are structs for whatever reason, and very large, so copying them takes a long time (though that would not explain the smoothing out after the first seconds).
Notes:
Instead of merging the items into a single list, consider creating an iterator that avoids all the copying (see the sketch after this list).
Get a profiler.
Learn to look at the perf counters for your process. Interesting categories are memory, GC, CPU, and handles.
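A sketch of that iterator idea, reusing the collections named in the question (players, enemies, projectiles, items, camera.VisibleTiles); the method name is made up:
private IEnumerable<ICollidable> EnumerateCollidables()
{
    // Yields each entity in turn without copying anything into a merged list.
    foreach (ICollidable p in players) yield return p;
    foreach (ICollidable e in enemies) yield return e;
    foreach (ICollidable pr in projectiles) yield return pr;
    foreach (ICollidable i in items) yield return i;
    foreach (ICollidable t in camera.VisibleTiles) yield return t;
}
The collision code can then foreach over EnumerateCollidables() each frame instead of rebuilding collidableEntities.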
You can use an overloaded constructor for List<T> to initialize the list with a predefined size:
var collidableEntities = new List<object>(500);

Question about Garbage collection C# .NET

I am experiencing a problem in my application with an OutOfMemoryException. My application can search for words within texts. When I start a long-running search of about 2000 different texts for about 2175 different words, the application terminates with an OutOfMemoryException when it is about 50% through (after about 6 hours of processing).
I have been trying to find the memory leak. I have an object graph like: (--> are references)
a static global application object (controller) --> an algorithm starter object --> text mining starter object --> text mining algorithm object (this object performs the searching).
The text mining starter object will start the text mining algorithm object's run()-method in a separate thread.
To try to fix the issue I have edited the code so that the text mining starter object splits the texts to search into several groups and initializes one text mining algorithm object per group sequentially (so when one text mining algorithm object is finished, a new one is created to search the next group of texts). Here I set the previous text mining algorithm object to null. But this does not solve the issue.
When I create a new text mining algorithm object I have to give it some parameters. These are taken from properties of the previous text mining algorithm object before I set that object to null. Will this prevent garbage collection of the text mining algorithm object?
Here is the code for the creation of new text mining algorithm objects by the text mining algorithm starter:
private void RunSeveralAlgorithmObjects()
{
    IEnumerable<ILexiconEntry> currentEntries = allLexiconEntries.GetGroup(intCurrentAlgorithmObject, intNumberOfAlgorithmObjectsToUse);
    algorithm.LexiconEntries = currentEntries;
    algorithm.Run();
    intCurrentAlgorithmObject++;
    for (int i = 0; i < intNumberOfAlgorithmObjectsToUse - 1; i++)
    {
        algorithm = CreateNewAlgorithmObject();
        AddAlgorithmListeners();
        algorithm.Run();
        intCurrentAlgorithmObject++;
    }
}

private TextMiningAlgorithm CreateNewAlgorithmObject()
{
    TextMiningAlgorithm newAlg = new TextMiningAlgorithm();
    newAlg.SortedTermStruct = algorithm.SortedTermStruct;
    newAlg.PreprocessedSynonyms = algorithm.PreprocessedSynonyms;
    newAlg.DistanceMeasure = algorithm.DistanceMeasure;
    newAlg.HitComparerMethod = algorithm.HitComparerMethod;
    newAlg.LexiconEntries = allLexiconEntries.GetGroup(intCurrentAlgorithmObject, intNumberOfAlgorithmObjectsToUse);
    newAlg.MaxTermPercentageDeviation = algorithm.MaxTermPercentageDeviation;
    newAlg.MaxWordPercentageDeviation = algorithm.MaxWordPercentageDeviation;
    newAlg.MinWordsPercentageHit = algorithm.MinWordsPercentageHit;
    newAlg.NumberOfThreads = algorithm.NumberOfThreads;
    newAlg.PermutationType = algorithm.PermutationType;
    newAlg.RemoveStopWords = algorithm.RemoveStopWords;
    newAlg.RestrictPartialTextMatches = algorithm.RestrictPartialTextMatches;
    newAlg.Soundex = algorithm.Soundex;
    newAlg.Stemming = algorithm.Stemming;
    newAlg.StopWords = algorithm.StopWords;
    newAlg.Synonyms = algorithm.Synonyms;
    newAlg.Terms = algorithm.Terms;
    newAlg.UseSynonyms = algorithm.UseSynonyms;
    algorithm = null;
    return newAlg;
}
Here is the start of the thread that is running the whole search process:
// Run the algorithm in its own thread
Thread algorithmThread = new Thread(new ThreadStart(RunSeveralAlgorithmObjects));
algorithmThread.Start();
Can something here prevent the previous text mining algorithm object from being garbage collected?
I recommend first identifying what exactly is leaking. Then postulate a cause (such as references in event handlers).
To identify what is leaking:
Enable native debugging for the project. Properties -> Debug -> check Enable unmanaged code debugging.
Run the program. Since the memory leak is probably gradual, you probably don't need to let it run the whole 6 hours; just let it run for a while and then Debug -> Break All.
Bring up the Immediate window. Debug -> Windows -> Immediate
Type one of the following into the immediate window, depending on whether you're running 32 or 64 bit, .NET 2.0/3.0/3.5 or .NET 4.0:
.load C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\sos.dll for 32-bit .NET 2.0-3.5
.load C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\sos.dll for 32-bit .NET 4.0
.load C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\sos.dll for 64-bit .NET 2.0-3.5
.load C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\sos.dll for 64-bit .NET 4.0
You can now run SoS commands in the Immediate window. I recommend checking the output of !dumpheap -stat, and if that doesn't pinpoint the problem, check !finalizequeue.
Notes:
Running the program the first time after enabling native debugging may take a long time (minutes) if you have VS set up to load symbols.
The debugger commands that I recommended both start with ! (exclamation point).
These instructions are courtesy of the incredible Tess from Microsoft, and Mario Hewardt, author of Advanced .NET Debugging.
Once you've identified the leak in terms of which object is leaking, then postulate a cause and implement a fix. Then you can do these steps again to determine for sure whether or not the fix worked.
1) As I said in a comment, if you use events in your code (the AddAlgorithmListeners call makes me suspect this), subscribing to an event can create a "hidden" dependency between objects which is easily forgotten. This dependency can mean that an object is not freed because someone is still listening to one of its events. Make sure you unsubscribe from all events when you no longer need to listen to them.
2) Also, I'd like to point you to one (probably not-so-off-topic) issue with your code:
private void RunSeveralAlgorithmObjects()
{
    ...
    algorithm.LexiconEntries = currentEntries;
    // ^ when/where is algorithm initialized?
    for (...)
    {
        algorithm = CreateNewAlgorithmObject();
        ....
    }
}
Is algorithm already initialized when this method is invoked? Otherwise, setting algorithm.LexiconEntries wouldn't seem like a valid thing to do. This means your method depends on some external state, which to me looks like a potential place for bugs creeping into your program logic.
If I understand it correctly, this object contains some state describing the algorithm, and CreateNewAlgorithmObject derives a new state for algorithm from the current state. If this was my code, I would make algorithm an explicit parameter to all your functions, as a signal that the method depends on this object. It would then no longer be hidden "external" state upon which your functions depend.
P.S.: If you don't want to go down that route, the other thing you could consider to make your code more consistent is to turn CreateNewAlgorithmObject into a void method and re-assign algorithm directly inside that method.
Is AddAlgorithmListeners attaching event handlers to events exposed by the algorithm object? Are the listening objects living longer than the algorithm object? In that case they can keep the algorithm object from being collected.
If yes, try unsubscribing events before you let the object go out of scope.
for (int i = 0; i < intNumberOfAlgorithmObjectsToUse - 1; i++)
{
    algorithm = CreateNewAlgorithmObject();
    AddAlgorithmListeners();
    algorithm.Run();
    RemoveAlgorithmListeners(); // See if this fixes your issue.
    intCurrentAlgorithmObject++;
}
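A matching remove method might look roughly like this; the HitFound event name is an assumption, so mirror whatever events AddAlgorithmListeners actually subscribes to:
private void RemoveAlgorithmListeners()
{
    // Every += in AddAlgorithmListeners should have a corresponding -= here,
    // otherwise the handlers stay hooked up and objects can remain reachable
    // longer than intended.
    algorithm.HitFound -= OnHitFound;
}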
My suspect is AddAlgorithmListeners(); are you sure you remove the listeners after execution has completed?
Is the IEnumerable returned by GetGroup() throw-away or cached? That is, does it hold onto the objects it has emitted? If it does, memory would obviously grow linearly with each iteration.
Memory profiling is useful, have you tried examining the application with a profiler? I found Red Gate's useful in the past (it's not free, but does have an evaluation version, IIRC).
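Purely as an illustration of the "cached" case warned about above (this is not the asker's code, just the pattern to look for; SelectEntries is hypothetical):
// If GetGroup keeps what it yields in a field like this, every processed entry
// stays reachable and memory grows with each group.
private readonly List<ILexiconEntry> emitted = new List<ILexiconEntry>();

public IEnumerable<ILexiconEntry> GetGroup(int groupIndex, int groupCount)
{
    foreach (ILexiconEntry entry in SelectEntries(groupIndex, groupCount))
    {
        emitted.Add(entry); // this caching is what would cause the linear growth
        yield return entry;
    }
}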

Find references to the object in runtime

I have an object which lives forever. I am deleting all the references to it that I can see after using it, but it is still not collected. Its life cycle is pretty sophisticated, so I can't be sure that all references have been cleared.
if ( container.Controls.Count > 0 )
{
var controls = new Control[ container.Controls.Count ];
container.Controls.CopyTo( controls, 0 );
foreach ( var control in controls )
{
container.Controls.Remove( control );
control.Dispose();
}
controls = null;
}
GC.Collect();
GC.Collect(1);
GC.Collect(2);
GC.Collect(3);
How can I find out what references to it still exist? Why is it not collected?
Try using a memory profiler (e.g. ANTS); it will tell you what is keeping the object alive. Trying to second-guess this type of problem is very hard.
Red Gate gives a 14-day trial, which should be more than enough time to track down this problem and decide whether a memory profiler provides you with long-term value.
There are lots of other memory profilers on the market (e.g. .NET Memory Profiler), and most of them have free trials; however, I have found the Red Gate tools easy to use, so I tend to try them first.
You'll have to use WinDbg and the SOSEX extension.
The !DumpHeap and !GCRoot commands can help you to identify the instance, and all remaining references that keep it alive.
I solved a similar issue with the SOS extension (which apparently no longer works with Visual Studio 2013, but works fine with older versions of Visual Studio).
I used the following code to get the address of the object for which I wanted to track references:
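// Note: this needs unsafe code enabled for the project; it returns the object's
// current address on the managed heap, which the GC is free to change later.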
public static string GetAddress(object o)
{
if (o == null)
{
return "00000000";
}
else
{
unsafe
{
System.TypedReference tr = __makeref(o);
System.IntPtr ptr = **(System.IntPtr**) (&tr);
return ptr.ToString ("X");
}
}
}
and then, in Visual Studio 2012 immediate window, while running in the debugger, type:
.load C:\Windows\Microsoft.NET\Framework\v4.0.30319\sos.dll
which will load the SOS.dll extension.
You can then use GetAddress(x) to get the hexadecimal address of the object (for instance 8AB0CD40), and then use:
!do 8AB0CD40
!GCRoot -all 8AB0CD40
to dump the object and find all references to the object.
Just keep in mind that if the GC runs, it might change the address of the object.
I've been using .NET Memory Profiler to do some serious memory profiling on one of our projects. It's a great tool for looking into the memory management of your app. I don't get paid for this info :) but it just helped me a lot.
The garbage collection in .NET is not a counting scheme (such as COM), but a mark-and-sweep implementation. Basically, the GC runs at "random" times when it feels the need to do so, and the collection of the objects is therefore not deterministic.
You can, however, manually trigger a collection (GC.Collect()), but you may have to wait for finalizers to run then (GC.WaitForPendingFinalizers()). Doing this in a production app, however, is discouraged, because it may affect the efficiency of the memory management (GC runs too often, or waits for finalizers to run). If the object still exists, it actually still has some live reference somewhere.
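For diagnosis only (not production code), a manual full collection typically looks like this:
GC.Collect();                  // collect everything currently unreachable
GC.WaitForPendingFinalizers(); // let finalizers run
GC.Collect();                  // collect objects whose finalizers just completed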
It is not collected because you haven't removed all references to it. The GC will only mark objects for collection if they have no roots in the application.
What means are you using to check up on the GC to see if it has collected your object?
