C# Simple garbage collection code not working

I have a simple WPF application that allocates memory, clears it, and interacts with the garbage collector. Unfortunately I can never see the garbage collector automatically free memory. For example, if I click the Alloc button 10 times it allocates a gig; then if I click the New button the allocated memory does not go down. However, if I force a garbage collection with GC.Collect (GC button) it does free the memory. I have enabled large collections with gcAllowVeryLargeObjects set to true, as I would like to test with more than 2 gigs of use. Any idea how I can get the garbage collector to automatically collect and free the memory?
Simple code excerpt:
List<byte[]> m_allocs = new List<byte[]>();

// Allocates 100 MB and keeps it reachable via m_allocs.
private void AllocClick(object sender, RoutedEventArgs e)
{
    int oneHundredMegsAsBytes = 100000000;
    byte[] array = new byte[oneHundredMegsAsBytes];
    Array.Clear(array, 0, oneHundredMegsAsBytes);
    m_allocs.Add(array);
}

// Drops the old list (and its arrays) by replacing it with a new one.
private void NewClick(object sender, RoutedEventArgs e)
{
    m_allocs = new List<byte[]>();
}

// Empties the list, making the arrays unreachable.
private void ClearClick(object sender, RoutedEventArgs e)
{
    m_allocs.Clear();
}

// Forces a full blocking collection.
private void GCClick(object sender, RoutedEventArgs e)
{
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
}

You don't.
The GC has been developed to only run when needed. If yours isn't running, then it doesn't need to.
You don't want it to.
GC is expensive: the GC has to stop the program and figure out what is and isn't still needed. If it did this at the end of every cycle, your program would grind to a crawl.
However...
You could change the high-memory percent to a lower value, which will make the GC get much more aggressive, much earlier. This will probably do you more harm than good.

While experimenting based on the new information from the comments I found a solution. If I press Alloc many times, e.g. until 10 gigs are in use, then click Clear and continue allocating more, the memory usage eventually drops to a lower value, e.g. 2 gigs.
So the garbage collector does run automatically with the sample code; it just kicks in at higher memory usage than I was testing with, and further allocations help trigger it.
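To double-check this without staring at Task Manager, a small helper like the sketch below (using only standard GC APIs; the method name is mine) can be called from a button handler. The Gen 0/1/2 counts climbing without any GC.Collect() call show the collector running on its own:

using System;
using System.Diagnostics;

// Sketch: log collection counts and managed heap size without forcing a collection.
static void LogGcActivity()
{
    long managedBytes = GC.GetTotalMemory(false); // false: do not force a collection
    Debug.WriteLine("Gen0: " + GC.CollectionCount(0)
        + ", Gen1: " + GC.CollectionCount(1)
        + ", Gen2: " + GC.CollectionCount(2)
        + ", managed heap: " + (managedBytes / (1024 * 1024)) + " MB");
}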

Related

Memory leak analysis and help requested

I've been using the methodology outlined by Shivprasad Koirala to check for memory leaks from code running inside a C# application (VoiceAttack). It basically involves using the Performance Monitor to track an application's private bytes as well as bytes in all heaps, and comparing these counters to assess whether there is a leak and of what type (managed/unmanaged). Ideally I need to test outside of Visual Studio, which is why I'm using this method.
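In case it is useful, the same two counters can also be sampled from code rather than from the Performance Monitor UI. This is only a rough sketch: it assumes the standard "Process\Private Bytes" and ".NET CLR Memory\# Bytes in all Heaps" counters and uses the plain process name as the instance name (which may carry a #n suffix if several instances of the process are running):

using System;
using System.Diagnostics;

// Sketch: sample the same counters Performance Monitor shows.
// A rising gap between Private Bytes and # Bytes in all Heaps suggests
// unmanaged growth; both rising together suggests managed growth.
string instance = Process.GetCurrentProcess().ProcessName;
using (var privateBytes = new PerformanceCounter("Process", "Private Bytes", instance))
using (var heapBytes = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance))
{
    Console.WriteLine("Private Bytes:        " + privateBytes.NextValue());
    Console.WriteLine("# Bytes in all Heaps: " + heapBytes.NextValue());
}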
The following portion of code generates the memory profile below (bear in mind the code is formatted a little differently than it would be in Visual Studio, because it is a function contained within the main C# application):
public void main()
{
    string FilePath = null;

    using (FileDialog myFileDialog = new OpenFileDialog())
    {
        myFileDialog.Title = "this is the title";
        myFileDialog.FileName = "testFile.txt";
        myFileDialog.Filter = "txt files (*.txt)|*.txt|All files (*.*)|*.*";
        myFileDialog.FilterIndex = 1;

        if (myFileDialog.ShowDialog() == DialogResult.OK)
        {
            FilePath = myFileDialog.FileName;
            var extension = Path.GetExtension(FilePath);
            var compareType = StringComparison.InvariantCultureIgnoreCase;

            if (extension.Equals(".txt", compareType) == false)
            {
                FilePath = null;
                VA.WriteToLog("Selected file is not a text file. Action canceled.");
            }
            else
                VA.WriteToLog(FilePath);
        }
        else
            VA.WriteToLog("No file selected. Action canceled.");
    }
    VA.WriteToLog("done");
}
You can see that after running this code the private bytes don't come back to the original count and the bytes in all heaps are roughly constant, which implies that there is a portion of unmanaged memory that was not released. Running this same inline function a few times consecutively doesn't cause further increases to the maximum observed private bytes or the unreleased memory. Once the main C# application (VoiceAttack) closes all the related memory (including the memory for the above code) is released. The bad news is that under normal circumstances the main application may be kept running indefinitely by the user, causing the allocated memory to remain unreleased.
For good measure I threw this same code into VS (with a pair of Thread.Sleep(5000) added before and after the using block for better graphical analysis) and built an executable to track with the Performance Monitor method, and the result is the same. There is an initial unmanaged memory jump for the OpenFileDialog and the allocated unmanaged memory never comes back down to the original value.
Does the memory and leak tracking methodology outlined above make sense? If YES, is there anything that can be done to properly release the unmanaged memory?
Does the memory and leak tracking methodology outlined above make sense?
No. You shouldn't expect unmanaged committed memory (Private Bytes) to always be released. For instance, processes have an unmanaged heap that is kept around to serve subsequent allocations rather than being returned immediately. And since Windows can page out committed memory, it isn't critical to minimize each process's committed memory.
If repeated calls don't increase memory use, you don't have a memory leak; you have delayed initialization. Some components aren't initialized until you use them, so their memory usage isn't taken into account when you establish your baseline.
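One way to account for that delayed initialization (a sketch of the idea, not part of the original methodology) is to run the code once as a warm-up, take the baseline afterwards, and only then compare repeated runs against it. RunTheCodeUnderTest() below is a hypothetical stand-in for the dialog code from the question:

using System;
using System.Diagnostics;

static void MeasureWithWarmUp()
{
    RunTheCodeUnderTest(); // warm-up run: pays the one-time initialization cost

    Process proc = Process.GetCurrentProcess();
    proc.Refresh();
    long baseline = proc.PrivateMemorySize64; // baseline taken only after warm-up

    for (int i = 0; i < 5; i++)
    {
        RunTheCodeUnderTest();
        proc.Refresh(); // re-read the counters for this process
        Console.WriteLine("Delta after run " + (i + 1) + ": "
            + (proc.PrivateMemorySize64 - baseline) + " bytes");
    }
}

// Hypothetical stand-in for the OpenFileDialog code shown in the question.
static void RunTheCodeUnderTest()
{
}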

Am I experiencing a memory leak when I add to and remove from a certain static collection in memory?

I currently have a class where I am storing a static collection that gets objects added and removed as certain methods are called. Here is the current code:
public class MatchMaker : Hub
{
    private static HashSet<SoloUser> soloUsers = new HashSet<SoloUser>();

    // Client Requests
    public void findNewPartner(string Name, string Country)
    {
        SoloUser soloUser = soloUsers.FirstOrDefault(s => (s.Name == Name) && (s.Country == Country));
        if (soloUsers.Count > 0)
        {
            Clients.partnerRequestResult(soloUsers.FirstOrDefault());
            soloUsers.Remove(soloUser);

            Process currentProcess = System.Diagnostics.Process.GetCurrentProcess();
            long totalBytesOfMemoryUsed = currentProcess.WorkingSet64;
            Debug.WriteLine("TotalMemoryUsed: " + totalBytesOfMemoryUsed);
        }
        else
        {
            soloUser = new SoloUser
            {
                Name = Name,
                Country = Country
            };
            soloUsers.Add(soloUser);

            Process currentProcess = System.Diagnostics.Process.GetCurrentProcess();
            long totalBytesOfMemoryUsed = currentProcess.WorkingSet64;
            Debug.WriteLine("TotalMemoryUsed: " + totalBytesOfMemoryUsed);
        }
    }
}
When I run:
Process currentProcess = System.Diagnostics.Process.GetCurrentProcess();
long totalBytesOfMemoryUsed = currentProcess.WorkingSet64;
Debug.WriteLine("TotalMemoryUsed: " + totalBytesOfMemoryUsed);
each time an object is added to or removed from the collection, the output of totalBytesOfMemoryUsed gets larger and larger (by about 2 MB each time), whether I add or remove the object. Is this due to a memory leak? Is this even a sufficient way to check memory management? Do I need to dispose of an object when I remove it from the collection?
There is probably not enough information here to comment on memory leaks, but in the general case - put your trust in the Garbage Collector.
If you still suspect of a memory leak, use a memory profiler (such as ANTS or dotTrace).
It is a good practice to dispose of any objects that implement IDisposable as soon as you are finished with them.
Given that, you need to remember that the .NET Garbage Collector is non-deterministic. This means you aren't necessarily going to know beforehand when a collection is going to occur. When the runtime needs to perform a collection, it will do so. You shouldn't be afraid of 2 MB anyway. Let the Garbage Collector do its job.
Trying to analyze your program's memory usage by getting the working set of memory or looking at the Windows Task Manager is usually a trip down the rabbit hole. Use a memory profiler after you actually have a problem (such as ANTS).
I also suggest reading MSDN regarding the Garbage Collector in general. You can start here.
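If you just want a rough managed-heap number without pulling in a profiler, a minimal sketch along these lines separates what the GC tracks from what the OS has resident; if the managed number stays flat while the working set grows, that usually points away from a managed leak:

using System;
using System.Diagnostics;

// Sketch: contrast the managed heap with the OS working set.
static void LogManagedVsWorkingSet()
{
    long managedBytes = GC.GetTotalMemory(true); // forces a collection first, so it approximates live objects
    long workingSetBytes = Process.GetCurrentProcess().WorkingSet64;

    Debug.WriteLine("Managed heap (live objects): " + managedBytes + " bytes");
    Debug.WriteLine("OS working set:              " + workingSetBytes + " bytes");
}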

Reading from PackagePart stream does not release memory

In our application, we are reading an XPS file using the System.IO.Packaging.Package class. When we read from a stream of a PackagePart, we can see from the Task Manager that the application's memory consumption rises. However, when the reading is done, the memory consumption doesn't fall back to what it was before reading from the stream.
To illustrate the problem, I wrote a simple code sample that you can use in a standalone WPF application.
public partial class Window1 : Window
{
    public Window1()
    {
        InitializeComponent();
        _package = Package.Open(@"c:\test\1000pages.xps", FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    }

    private void ReadPackage()
    {
        foreach (PackagePart part in _package.GetParts())
        {
            using (Stream partStream = part.GetStream())
            {
                byte[] arr = new byte[partStream.Length];
                partStream.Read(arr, 0, (int)partStream.Length);
                partStream.Close();
            }
        }
    }

    Package _package;

    private void Button_Click(object sender, RoutedEventArgs e)
    {
        ReadPackage();
    }
}
The ReadPackage() method reads all the PackagePart objects' stream contents into a local array. In the sample, I used a 1000-page XPS document as the package source in order to easily see the memory consumption change of the application. On my machine, the standalone app's memory consumption starts at 18 MB and rises to 100 MB after calling the method. Calling the method again can raise the memory consumption further, but it falls back to about 100 MB. However, it never falls back to 18 MB.
Has anyone experienced this while using PackagePart? Or am I using it wrong? I think the internal implementation of PackagePart is caching the data that was read.
Thank you!
You do not specify how you measure the "memory consumption" of your application, but perhaps you are using Task Manager? To get a better view of what is going on I suggest that you examine some performance counters for your application. Both .NET heap and general process memory performance counters are available.
If you really want to understand the details of how your application uses memory you can use the Microsoft CLR Profiler.
What you see may be a result of the .NET heap expanding to accommodate a very large file. Big objects are placed on the Large Object Heap (LOH), and even when the .NET memory is garbage collected, the freed memory is not necessarily returned to the operating system. Also, objects on the LOH are never moved around during garbage collection, and this may fragment the LOH, exhausting the available address space even though there is plenty of free memory.
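On .NET 4.5.1 and later you can also ask the GC to compact the LOH on its next full blocking collection, which makes a handy experiment for checking whether LOH fragmentation is what you are seeing (a sketch; not something to run routinely in production):

using System;
using System.Runtime;

// Sketch (.NET 4.5.1+): request a one-off LOH compaction on the next full blocking collection.
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();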
Has anyone experienced this while using PackagePart? Or am I using it wrong?
If you want to control the resources used by the package, you are not using it in the best way. Packages are disposable and in general you should use them like this:
using (var package = Package.Open(@"c:\test\1000pages.xps", FileMode.Open, FileAccess.ReadWrite, FileShare.None))
{
    // ... process the package
}
At the end of the using statement resources consumed by the package are either already released or can be garbage collected.
If you really want to keep the _package member of your form, you should at some point call Close() (or IDisposable.Dispose()) to release the resources. Calling GC.Collect() is not recommended and will not necessarily be able to recycle the resources used by the package. Any managed memory (e.g. package buffers) that is reachable from _package will not be garbage collected, no matter how often you try to force a garbage collection.
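Applied to the sample in the question, that could look roughly like this: a sketch that opens and disposes the package on every click instead of keeping the _package field alive (and opens the file read-only, unlike the original ReadWrite code):

// Sketch: open the package per call so its buffers become collectable
// as soon as the using block ends.
private void ReadPackage()
{
    using (Package package = Package.Open(@"c:\test\1000pages.xps",
                                          FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        foreach (PackagePart part in package.GetParts())
        {
            using (Stream partStream = part.GetStream())
            {
                byte[] arr = new byte[partStream.Length];
                partStream.Read(arr, 0, (int)partStream.Length);
            }
        }
    }
}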

Simple app with timer is eating memory away?

Can anybody explain why this tiny app's memory usage keeps increasing?
using System;
using System.Diagnostics;
using System.Windows.Forms;

static class Program
{
    private static System.Timers.Timer _TestTimer;

    [STAThread]
    static void Main()
    {
        _TestTimer = new System.Timers.Timer();
        _TestTimer.Interval = 30;
        _TestTimer.Elapsed += new System.Timers.ElapsedEventHandler(_TestTimer_Elapsed);
        _TestTimer.Enabled = true;

        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        Application.Run(new Form1());
    }

    static void _TestTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
    {
        string test = "tick";
        Trace.WriteLine(test);
        test = null;
    }
}
Thanks!
Pika81
You are assuming that the memory usage should not increase. Wrong assumption, that's just not how either the .NET garbage collector or the Windows heap manager work. Both of them work efficiently by using memory that's available for use instead of constantly releasing and reallocating memory.
Let it run for a week. Might go quicker if you make the Interval smaller. Also minimize the form for spectacular effects.
My suggestion would be to crank up Windbg and find out what objects exist in memory.
Agreed, we devs generally use it as a last resort, but in this case it would give you the exact reason for the memory increase.
I was digging through the source of DefaultTraceListener and I found this:
private void WriteLine(string message, bool useLogFile)
{
    if (base.NeedIndent)
    {
        this.WriteIndent();
    }
    this.Write(message + "\r\n", useLogFile);
    base.NeedIndent = true;
}
Each call allocates a new string for the message + "\r\n" concatenation, so the memory usage is probably growing too slowly for the GC to react immediately.
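If you want to confirm that those small per-tick allocations really are collected eventually, one rough sketch is to hook a second timer to a handler like the one below and watch the Gen 0 collection count keep climbing while the managed heap size stays bounded:

using System;
using System.Diagnostics;

// Sketch: log GC stats from a second timer to watch Gen 0 collections
// absorb the per-tick string allocations.
static void LogGcStats(object sender, System.Timers.ElapsedEventArgs e)
{
    Trace.WriteLine("Gen0 collections: " + GC.CollectionCount(0)
        + ", managed heap: " + GC.GetTotalMemory(false) + " bytes");
}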
If you're only looking at the Task Manager to see how much memory your process is using you're probably not getting a very accurate reading.
Have a read of this article:
http://www.itwriting.com/dotnetmem.php
It explains some of the shortcomings of using Task Manager as a means to measure the memory usage of your .NET application.

Process Memory Size - Different Counters

I'm trying to find out how much memory my own .Net server process is using (for monitoring and logging purposes).
I'm using:
Process.GetCurrentProcess().PrivateMemorySize64
However, the Process object has several different properties that let me read the memory space used:
Paged, NonPaged, PagedSystem, NonPagedSystem, Private, Virtual, WorkingSet
and then the "peak" variants, which I'm guessing just store the maximum values these counters ever reached.
Reading through the MSDN definition of each property hasn't proved too helpful for me. I have to admit my knowledge regarding how memory is managed (as far as paging and virtual goes) is very limited.
So my question is obviously "which one should I use?", and I know the answer is "it depends".
This process will basically hold a bunch of lists in memory of things that are going on, while other processes communicate with it and query it for stuff. I'm expecting the server this will run on to require lots of RAM, and so I'm querying this data over time to be able to estimate RAM requirements when compared to the sizes of the lists it keeps inside.
So... Which one should I use and why?
If you want to know how much the GC uses try:
GC.GetTotalMemory(true)
If you want to know what your process uses from Windows (VM Size column in TaskManager) try:
Process.GetCurrentProcess().PrivateMemorySize64
If you want to know what your process has in RAM (as opposed to in the pagefile) (Mem Usage column in TaskManager) try:
Process.GetCurrentProcess().WorkingSet64
See here for more explanation on the different sorts of memory.
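For a quick side-by-side comparison of those three numbers, a minimal sketch:

using System;
using System.Diagnostics;

// Sketch: print the three most commonly compared numbers side by side.
Process proc = Process.GetCurrentProcess();
Console.WriteLine("GC managed heap:      " + GC.GetTotalMemory(true));
Console.WriteLine("Private bytes (VM):   " + proc.PrivateMemorySize64);
Console.WriteLine("Working set (in RAM): " + proc.WorkingSet64);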
OK, I found through Google the same page that Lars mentioned, and I believe it's a great explanation for people that don't quite know how memory works (like me).
http://shsc.info/WindowsMemoryManagement
My short conclusion was:
Private Bytes = The Memory my process has requested to store data. Some of it may be paged to disk or not. This is the information I was looking for.
Virtual Bytes = The Private Bytes, plus the space shared with other processes for loaded DLLs, etc.
Working Set = The portion of ALL the memory of my process that has not been paged to disk. So the amount paged to disk should be (Virtual - Working Set).
Thanks all for your help!
If you want to use the "Memory (Private Working Set)" as shown in Windows Vista task manager, which is the equivalent of Process Explorer "WS Private Bytes", here is the code. Probably best to throw this infinite loop in a thread/background task for real-time stats.
using System.Threading;
using System.Diagnostics;

// namespace...class...method
Process thisProc = Process.GetCurrentProcess();
PerformanceCounter PC = new PerformanceCounter();
PC.CategoryName = "Process";
PC.CounterName = "Working Set - Private";
PC.InstanceName = thisProc.ProcessName;

while (true)
{
    String privMemory = (PC.NextValue() / 1000).ToString() + "KB (Private Bytes)";
    // Do something with string privMemory
    Thread.Sleep(1000);
}
To get the value that Task Manager gives, my hat's off to Mike Regan's solution above. However, one change: it is not: perfCounter.NextValue()/1000; but perfCounter.NextValue()/1024; (i.e. a real kilobyte). This gives the exact value you see in Task Manager.
Following is a full solution for displaying the 'memory usage' (Task manager's, as given) in a simple way in your WPF or WinForms app (in this case, simply in the title). Just call this method within the new Window constructor:
private string origWindowTitle; // field holding the original window title

private void DisplayMemoryUsageInTitleAsync()
{
    origWindowTitle = this.Title; // save the WinForms or WPF window title to the field

    BackgroundWorker wrkr = new BackgroundWorker();
    wrkr.WorkerReportsProgress = true;

    wrkr.DoWork += (object sender, DoWorkEventArgs e) =>
    {
        Process currProcess = Process.GetCurrentProcess();
        PerformanceCounter perfCntr = new PerformanceCounter();
        perfCntr.CategoryName = "Process";
        perfCntr.CounterName = "Working Set - Private";
        perfCntr.InstanceName = currProcess.ProcessName;

        while (true)
        {
            int value = (int)perfCntr.NextValue() / 1024;
            string privateMemoryStr = value.ToString("n0") + "KB [Private Bytes]";
            wrkr.ReportProgress(0, privateMemoryStr);
            Thread.Sleep(1000);
        }
    };

    wrkr.ProgressChanged += (object sender, ProgressChangedEventArgs e) =>
    {
        string val = e.UserState as string;
        if (!string.IsNullOrEmpty(val))
            this.Title = string.Format(@"{0} ({1})", origWindowTitle, val);
    };

    wrkr.RunWorkerAsync();
}
Is this a fair description? I'd like to share this with my team so please let me know if it is incorrect (or incomplete):
There are several ways in C# to ask how much memory my process is using.
Allocated memory can be managed (by the CLR) or unmanaged.
Allocated memory can be virtual (stored on disk) or loaded (into RAM pages)
Allocated memory can be private (used only by the process) or shared (e.g. belonging to a DLL that other processes are referencing).
Given the above, here are some ways to measure memory usage in C#:
1) Process.VirtualMemorySize64: returns all the memory used by a process - managed or unmanaged, virtual or loaded, private or shared.
2) Process.PrivateMemorySize64: returns all the private memory used by a process - managed or unmanaged, virtual or loaded.
3) Process.WorkingSet64: returns all the private, loaded memory used by a process - managed or unmanaged.
4) GC.GetTotalMemory(): returns the amount of managed memory being watched by the garbage collector.
Working set isn't a good property to use. From what I gather, it includes everything the process can touch, even libraries shared by several processes, so you're seeing double-counted bytes in that counter. Private memory is a much better counter to look at.
I'd also suggest monitoring how often page faults happen. A page fault happens when you try to access data that has been moved from physical memory to the swap file, and the system has to read the page back from disk before you can access it.
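If it helps, the "Process" performance counter category also exposes a "Page Faults/sec" counter, so the rate can be sampled from code. A sketch (assuming a single instance of the process; the instance name may otherwise carry a #n suffix):

using System;
using System.Diagnostics;
using System.Threading;

// Sketch: sample the per-process page fault rate once a second.
string instance = Process.GetCurrentProcess().ProcessName;

using (var pageFaults = new PerformanceCounter("Process", "Page Faults/sec", instance))
{
    for (int i = 0; i < 10; i++)
    {
        Console.WriteLine("Page Faults/sec: " + pageFaults.NextValue());
        Thread.Sleep(1000);
    }
}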
