C# Garbage collection? [closed]

I'm writing a service right now to automate a couple of my routines. I've really only started learning C# within the last month or so, so I'm still fairly new (but really liking it so far). I've designed my service so that it just runs a method on a 5-minute timer (configured via AppSettings), runs a few checks, organizes a few things if need be, and that's it until the next interval.
I quickly realized the initial way I was doing things seemed to have a pretty bad memory leak, so I've rewritten much of it to wrap things in "using" blocks so they're disposed when done. The "using" blocks were recommended by a developer at work who's been really helpful, but I don't like to bother him too much with my personal projects when he's got work to do.
Currently I'm not really having a problem with memory usage: the service uses about 25 MB of RAM, but when it starts it's only using about 8 MB, and with each polling interval it climbs. Once it reaches that 25 MB threshold I can see it dip a little lower, which I'm assuming is garbage collection doing its thing, and then it climbs back up to 25 MB, rinse and repeat. So my application's memory usage is stable, but it just seems higher than it needs to be, so I'm curious.
If I call GC.Collect manually, the memory usage drops to about half. I realize this isn't ideal, as I've already done some research on this. But my question really comes down to this: is there some sort of default threshold in .NET for memory usage and garbage collection? I ask because it would explain what I'm seeing.
I did look at the documentation for the Process.MaxWorkingSet property, but I'm not sure whether it would make any difference at all or just potentially cause me problems.
I also tried running a profiler against it, but to be honest this is still new to me and I wasn't entirely clear on what I was looking for.

Conditions for a garbage collection
Garbage collection occurs when one of the following conditions is true:
• The system has low physical memory.
• The memory that is used by allocated objects on the managed heap surpasses an acceptable threshold. This threshold is continuously adjusted as the process runs.
• The GC.Collect method is called. In almost all cases, you do not have to call this method, because the garbage collector runs continuously. This method is primarily used for unique situations and testing.
When the garbage collector detects that the survival rate is high in a generation, it increases the threshold of allocations for that generation, so the next collection gets a substantial size of reclaimed memory. The CLR continually balances two priorities: not letting an application's working set get too big and not letting the garbage collection take too much time.
These are quotes from the MSDN GC article.
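To see these thresholds in action from your own service, you can log what the collector is doing at every polling interval. A minimal sketch, assuming a logging helper of my own invention (LogGcStats is not from the question or the MSDN article):

using System;

static class GcDiagnostics
{
    // Logs how much managed memory is currently in use and how many
    // collections each generation has had so far. Called once per polling
    // interval, this makes the "climb, dip, climb" pattern visible.
    public static void LogGcStats()
    {
        long managedBytes = GC.GetTotalMemory(forceFullCollection: false);
        Console.WriteLine(
            "Managed heap: {0:N0} bytes | Gen0: {1} | Gen1: {2} | Gen2: {3}",
            managedBytes,
            GC.CollectionCount(0),
            GC.CollectionCount(1),
            GC.CollectionCount(2));
    }
}

If the gen-0 count climbs steadily while gen-2 stays near zero, the sawtooth in the working set is just the threshold-based collection described above, not a leak.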

Related

What is the reason for high % Time in GC? For our app pool, in the APM perfmon tool, we see it crossing 99% and staying there for hours [closed]

Does this mean memory leakage? The % Time in GC goes to 99% even when no one is using the application. Could you please help me understand why this % Time in GC counter is behaving so strangely? Could this be a code issue? The application is in ASP.NET and uses services to call some methods.
For disposing Oracle connections, we used the Dispose method; we used the standard dispose pattern in the application.
Could someone give me insights into this?
It is hard to diagnose this kind of problem without very detailed analysis and direct observation of the measurements, but on the surface what this suggests is that you have a very large number of objects that have been allocated and retained for a long time, combined with some form of memory pressure. The net performance of a full (gen-2) garbage collection is essentially bound by the number of live/reachable objects in your system. So: what is the memory consumption? Is it in the GB range? A large memory footprint doesn't necessarily mean a leak, but it can mean a leak. You can use memory analysis tools (usually against memory dump files) to investigate what objects exist, and how they are "rooted", i.e. what is stopping them from being collected.
The most common things that cause this are:
a huge object model loaded into memory and retained for a long period - for example, loading a large chunk of your database into very large arrays/lists and keeping them globally to "help performance"
a common case of the above is reusing a single "data context" / "unit of work" / etc in your DAL between many requests
inappropriate use of events, especially registering objects to listen to events on a long-lived object, which causes objects to stay reachable forever via an event that may never actually fire; for example, doing globalObj.SomeEvent += row.SomeHandler; for every row: once you've done this, row is reachable from globalObj, so if globalObj doesn't die, neither will row (see the sketch after this answer)
a common case of the above is subscribing temporary objects to static events (and not unsubscribing them); static events don't die
As for what it is in your case (if there even is an actual problem): only deeper analysis will tell.
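To make the event point concrete, here is a minimal sketch of the leak pattern and its fix; the GlobalPublisher and Row class names are illustrative, not from the original answer:

using System;

class GlobalPublisher
{
    // A long-lived (effectively static) event source.
    public static event EventHandler SomeEvent;
}

class Row
{
    public void SomeHandler(object sender, EventArgs e) { /* ... */ }

    public void Subscribe()
    {
        // Leak: the static event now holds a reference to this Row,
        // so the Row stays reachable for the lifetime of the process.
        GlobalPublisher.SomeEvent += SomeHandler;
    }

    public void Unsubscribe()
    {
        // Fix: detach the handler when the Row is no longer needed,
        // typically from a Dispose method.
        GlobalPublisher.SomeEvent -= SomeHandler;
    }
}

Unsubscribing is what breaks the reference from the long-lived publisher back to the temporary object.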

How to create a large (10 GB+) persistent cache in .NET

Spin-off from my other question.
.NET Garbagecollector trouble. Blocks for 15-40 mins
I want to create a simple persistent cache. For simplicity, everything that goes in stays in. I have currently implemented this with an ImmutableDictionary<int,DataItem>, but I have a problem with the garbage collector.
It seems to think I use lots of data, which is true, as the cache contains 10,000-100,000 complex objects, and then it decides it's a good idea to scan the cache very often, with blocking generation-2 collections. At least that's what I believe it does. "% Time in GC" is 90%+ and my application is painfully slow.
Can I somehow mark the cache as untouchable or let my app use more memory before GC thinks it should do a full collect? I have loads of free memory on the server.
Might switching to NCache or Redis for Windows (MSOpenTech) be a better solution?
You could use GC.TryStartNoGCRegion to strictly limit the times at which GC occurs. Though in line with my previous comment, I believe this is not a good approach.
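For reference, a minimal sketch of how GC.TryStartNoGCRegion (available from .NET Framework 4.6 / .NET Core onward) is typically used around a latency-sensitive section; the 64 MB budget and the RunCriticalSection name are assumptions for illustration:

using System;
using System.Runtime;

static class NoGcExample
{
    public static void RunCriticalSection()
    {
        // Ask the runtime to pre-allocate enough heap (here 64 MB) so that
        // no collection is needed while the region is active.
        bool started = GC.TryStartNoGCRegion(64 * 1024 * 1024);
        try
        {
            // ... latency-sensitive work that must not be paused by the GC ...
        }
        finally
        {
            // Only end the region if it was started and is still active;
            // allocating past the budget ends it implicitly.
            if (started && GCSettings.LatencyMode == GCLatencyMode.NoGCRegion)
            {
                GC.EndNoGCRegion();
            }
        }
    }
}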
Have you considered examining how the cached objects are cleaning themselves up? Perhaps when objects exceed the 4 hour ttl, cleaning them up takes too long?
Are the objects disposable? Are they finalizable? If yes to either one, what's the time cost of a Dispose or Finalize operation on an individual object? If they are finalizable, do they need to be?

Undesirable Garbage Collection

In a section titled "Forcing a Garbage Collection" from the book "C# 2010 and the .NET 4 Platform" by Andrew Troelsen, it is written:
"Again, the whole purpose of the .NET garbage collector is to manage memory on our behalf. However, in some very rare circumstances, it may be beneficial to programmatically force a garbage collection using GC.Collect(). Specifically:
• Your application is about to enter into a block of code that you don’t want interrupted by a possible garbage collection.
...
"
But wait: is there such a case, when garbage collection is undesirable? I have never seen or read about anything like that (because of my limited development experience, of course). If in your practice you have come across something like this, please share. For me it's a very interesting point.
Thank you!
Yes, there's absolutely a case when garbage collection is undesirable: when a user is waiting for something to happen, and they have to wait longer because the code can't proceed until garbage collection has completed.
That's Troelsen's point: if you have a specific point where you know a GC isn't problematic and is likely to be able to collect significant amounts of garbage then it may be a good idea to provoke it then, to avoid it triggering at a less opportune moment.
I run a recipe related website, and I store a massive graph of recipes and their ingredient usage in memory. Due to the way I pivot this information for quick access, I have to load several gigs of data into memory when the application loads before I can organize the data into a very optimized graph. I create a huge amount of tiny objects on the heap that, once the graph is built, become unreachable.
This is all done when the web application loads, and probably takes 4-5 seconds to do. After I do so, I call GC.Collect(); because I'd rather reclaim all that memory now than potentially block all threads during an incoming HTTP request while the garbage collector is frantically cleaning up all these short-lived objects. I also figure it's better to clean up now since the heap is probably less fragmented at this time, since my app hasn't really done anything else so far. Delaying this might result in many more objects being created, and the heap needing to be compacted more when GC runs automatically.
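A minimal sketch of that kind of one-off, post-startup collection; BuildRecipeGraph is a placeholder, and the large-object-heap compaction setting (available from .NET 4.5.1 onward) is my own addition in the same "clean up once after loading" spirit:

using System;
using System.Runtime;

static class StartupCleanup
{
    public static void LoadAndCompact()
    {
        BuildRecipeGraph();   // placeholder for the expensive load that creates
                              // millions of temporary objects

        // One-off, deliberate cleanup right after startup, before any requests
        // arrive. Also compact the large object heap while we're at it.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
    }

    static void BuildRecipeGraph() { /* ... */ }
}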
Other than that, in my 12 years of .NET programming, I've never come across a situation where I wanted to force the garbage collector to run.
The recommendation is that you should not explicitly call Collect in your code. Can you find circumstances where it's useful?
Others have detailed some, and there are no doubt more. The first thing to understand, though, is: don't do it. It's a last resort. Investigate other options, learn how the GC works, look at how your code is impacted, and follow best practices in your designs.
Calling Collect at the wrong point will make your performance worse. Worse still, relying on it makes your code very fragile. The rare conditions required to make a call to Collect beneficial, or at least not harmful, can be utterly undone by a simple change to the code, which will result in unexpected OOMs, sluggish performance and the like.
I call it before performance measurements so that the GC doesn't falsify the results.
Another situation is unit tests that check for memory leaks:
object doesItLeak = /*...*/; // The object you want to have tested
WeakReference reference = new WeakReference(doesItLeak);
// Release the strong reference, otherwise the local variable alone keeps the object alive.
doesItLeak = null;
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
// If nothing else still references it, the object must have been collected by now.
Assert.That(!reference.IsAlive);
Besides those, I did not encounter a situation in which it would actually be helpful.
Especially in production code, GC.Collect should never be found IMHO.
It would be very rare, but GC can be a moderately expensive process, so if there's a particular section that's timing-sensitive, you don't want that section interrupted by GC.
Your application is about to enter into a block of code that you don't want interrupted by a possible garbage collection. ...
A very suspect argument (that is nevertheless used a lot).
Windows is not a Real Time OS. Your code (Thread/Process) can always be pre-empted by the OS scheduler. You do not have a guaranteed access to the CPU.
So it boils down to: how does the time for a GC run compare to a scheduler time slot (~20 ms)?
There is very little hard data available about that; I have searched a few times.
From my own observation (very informal), a gen-0 collection is < 40 ms, usually a lot less. A full gen-2 can run into ~100 ms, probably more.
So the 'risk' of being interrupted by the GC is of the same order of magnitude as being swapped out for another process. And you can't control the latter.
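If you want your own numbers rather than my informal observations, you can get a rough figure by timing forced collections of each generation; a minimal sketch (purely a measurement harness, and a forced collection is not identical in cost to an automatic one):

using System;
using System.Diagnostics;

static class GcTiming
{
    public static void Measure()
    {
        for (int gen = 0; gen <= 2; gen++)
        {
            var sw = Stopwatch.StartNew();
            // Force a blocking collection of this generation and time it.
            GC.Collect(gen, GCCollectionMode.Forced, blocking: true);
            sw.Stop();
            Console.WriteLine("Gen {0} collection took {1:F1} ms", gen, sw.Elapsed.TotalMilliseconds);
        }
    }
}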

C# Production Server, Do I collect the garbage?

I know there are tons of threads about this, and I've read a few of them.
I'm wondering whether, in my case, it is correct to call GC.Collect();
I have a server for an MMORPG; in production it is online day and night, and it is restarted every other day to roll out changes to the production codebase. Every twenty minutes the server pauses all other threads and serializes the current game state. This usually takes 0.5 to 4 seconds.
Would it be a good idea to call GC.Collect(); after serialization?
The server is, obviously, constantly creating and destroying game items.
Would I get a noticeable gain in performance or memory usage?
Should I not manually collect?
I've read about how collecting can be bad if used in the wrong moments or too frequently, but I'm thinking these saves are both a good moment to collect, and not that frequent.
The server is on .NET Framework 4.0.
Update in answer to a comment:
We are randomly experiencing server freezes: sometimes, unexpectedly, the server's memory usage will rise steadily until it reaches a point where the server takes way too long to handle any network operation. Thus, I'm considering a lot of different approaches to solve the issue; this is one of them.
The garbage collector knows best when to run, and you shouldn't force it.
It will not improve performance or memory usage. The CLR tells the GC to collect objects that are no longer used when there is a need to do so.
Answer to an updated part:
Forcing the collection is not a good solution to the problem. You should instead look a bit deeper into your code to find out what is wrong. If memory usage grows unexpectedly, you might have an issue with unmanaged resources that are not properly handled, or even "leaky" managed code.
One more thing: I would be surprised if calling GC.Collect fixed the problem.
Every twenty minutes the server pauses all other threads, and serializes the current game state. This usually takes 0.5 to 4 seconds.
If all your threads are suspended already anyway you might as well call the garbage collection, since it should be fairly fast at this point. I suspect doing this will only mask your real problem though, not actually solve it.
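If you do go that route, a minimal sketch of the pattern; PauseAllGameThreads, ResumeAllGameThreads and SaveGameState are placeholders for the server's own mechanics, not APIs from the question:

using System;

static class SaveCycle
{
    public static void SaveAndCollect()
    {
        PauseAllGameThreads();      // the server already does this every 20 minutes
        try
        {
            SaveGameState();        // the existing 0.5-4 second serialization
            // The world is already paused, so a blocking collection here costs
            // no extra responsiveness; it only lengthens the existing pause.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        finally
        {
            ResumeAllGameThreads();
        }
    }

    static void PauseAllGameThreads() { /* ... */ }
    static void ResumeAllGameThreads() { /* ... */ }
    static void SaveGameState() { /* ... */ }
}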
We are randomly experiencing server freezes: sometimes, unexpectedly, the server's memory usage will rise steadily until it reaches a point where the server takes way too long to handle any network operation. Thus, I'm considering a lot of different approaches to solve the issue; this is one of them.
This sounds more like you actually are still referencing all these objects that use the memory: if you weren't, the GC would run due to memory pressure and try to release those objects. You might be looking at an actual bug in your production code (e.g. objects that are still subscribed to events or otherwise being referenced when they shouldn't be) rather than something you can fix by manually taking out the garbage.
If possible in this scenario you should run a performance analysis to see where your bottlenecks are and what part of your code is causing the brunt of the memory allocations.
Could the memory increase be an "attack" by a player with a fake/modified game-client? Is a lot of memory allocated by the server when it accepts a new client connection? Does the server handle bogus incoming data well?

How to overcome a memory leak problem in .NET (Windows application) [closed]

I have a problem with memory leaks in my application while it's running.
The application's resource usage looks like this:
Minimum percentage is 6%
Maximum percentage is 35%
Maximum peak memory is 90 MB
I have used ANTS Memory Profiler to analyze memory leaks in the application.
But I don't know how to reduce the application's memory usage while it is running.
Please, can anyone give me a solution as soon as possible?
Thanks and Regards
Ramesh N
How do you know you have memory leaks? Bear in mind that the GC may not run if there's no memory pressure on the system, so it may look like memory is being allocated and not released - the GC will deal with it if necessary.
Why do you think your application is leaking? If it stays at a consistent 90MB usage then this isn't a leak - it's just showing more memory use than you think. If it's a genuine memory leak then over time it would creep higher through usage. If you can't get it to 100MB then it's not really leaking...
.NET applications often show higher memory usage (especially in certain views of task manager) than you'd expect. Is this actually a problem for you, or are you perceiving it as a problem because it's higher than you think?
Do you experience any problems from the memory use? Otherwise it doesn't seem like there is a problem at all.
Unless there are any actual memory leaks (but I suppose there aren't, as you have profiled the code), an application using several megabytes of memory, or even growing constantly up to a certain point, is not a problem.
It's a common misconception that a computer should have as much free memory as possible, but there is no performance benefit from that. Having unused memory doesn't make the applications run any faster in any way.
It's normal for a .NET application to allocate more memory as it runs. As long as there is free memory, this is far more efficient than running thorough garbage collections to try to free up memory. The application will clean up the memory when needed.
The system can send a signal to an application to make it free up as much memory as possible. If you minimise an application, this signal is sent to it, so you can use that to find out approximately how much memory your application uses more than the absolute minimum.
First, put in some TEMPORARY code that calls GC.GetTotalMemory(true) at regular intervals and logs it.
Run the application for some time.
THEN TAKE OUT THE TEMPORARY CODE. This call really does hurt performance (it forces a full collection and waits for it), but it will give you some useful detail in doing so. Remember, this is purely an investigatory step, not something to use in 99% of production code.
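A minimal sketch of that temporary logging, assuming a simple System.Threading.Timer and Debug output (both my own choices, not the answer's):

using System;
using System.Diagnostics;
using System.Threading;

static class MemoryLogger
{
    static Timer _timer;

    // TEMPORARY diagnostic: every 30 seconds, force a full collection and
    // log the surviving managed heap size. Remove once the investigation is done.
    public static void Start()
    {
        _timer = new Timer(_ =>
        {
            long survivingBytes = GC.GetTotalMemory(forceFullCollection: true);
            Debug.WriteLine("{0:u} surviving managed bytes: {1:N0}", DateTime.UtcNow, survivingBytes);
        }, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }
}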
Now, see if the figures it's returning are steadily climbing. If they're not (and that includes climbing a bit and then dropping again), you've no problem. End of solution.
If you do, then you need to look at direct or indirect use of unmanaged resources, which either are unmanaged memory or use it. These split into two cases.
The first is where you yourself are using unmanaged resources. Make sure that you are wrapping them in some SafeHandle-based wrapper, that they are disposed after each use, and that the wrapper has a finalizer. Don't mix direct use of managed and unmanaged resources in the same class (and then you can avoid the Dispose(bool) pattern, as it's really part of the anti-pattern of mixing these).
The second is where you use something that in turn uses unmanaged resources. A class probably does if it implements IDisposable. Make sure these are always disposed.
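As a concrete illustration of the SafeHandle approach from the first case, a minimal sketch wrapping a hypothetical native allocation (the HGlobalHandle and NativeBuffer names and the use of Marshal.AllocHGlobal are illustrative, not from the answer):

using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

// SafeHandle takes care of the finalizer and reliable release for us.
sealed class HGlobalHandle : SafeHandleZeroOrMinusOneIsInvalid
{
    public HGlobalHandle(int size) : base(ownsHandle: true)
    {
        SetHandle(Marshal.AllocHGlobal(size));
    }

    protected override bool ReleaseHandle()
    {
        Marshal.FreeHGlobal(handle);
        return true;
    }
}

// The wrapper class only holds the SafeHandle, so a plain Dispose is enough;
// no Dispose(bool) pattern and no hand-written finalizer are needed.
sealed class NativeBuffer : IDisposable
{
    private readonly HGlobalHandle _handle;

    public NativeBuffer(int size)
    {
        _handle = new HGlobalHandle(size);
    }

    public void Dispose()
    {
        _handle.Dispose();
    }
}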
Make sure you are not interning strings needlessly. Interning strings is a useful memory-saving technique, but only if you know that the string value in question will be used regularly throughout the lifetime of the project (or at the very least, you will add few which won't be used again throughout that lifetime). If you intern strings that aren't regularly used, you've hit on one of the best ways to push memory into a tight spot with managed code (GC can happen on the intern pool now, but it often doesn't).
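For reference, a minimal sketch of deliberate interning for a value known to repeat often; the status-string scenario is my own example, not the answer's:

using System;

static class InternExample
{
    // Good candidate: a small set of status strings parsed from millions of
    // records, where each distinct value recurs for the life of the process.
    public static string NormalizeStatus(string rawStatus)
    {
        return string.Intern(rawStatus);
    }

    // Bad candidate: arbitrary one-off user input; interned strings live in
    // the intern pool, so interning them just pins memory.
}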
There are also techniques to reduce memory use rather than avoid leaks, but since you're only using a very small amount of memory (90MB) these aren't worth considering here.
Incidentally, what size paging file do you have? 90MB being 35% means a total memory of 256MB. Unless you've got 64MB of physical RAM, that's a bit low. Current advice puts page files at about 100% or less of physical RAM, but that's based on the tendency toward larger RAM sizes these days. If you've got 128MB in that thing, I'd at least double up that page file to give a total memory of around 390MB.
