Reclaim string memory quickly (device with limited RAM) - C#

I am writing a C# app for a device with limited RAM (Mono on iPhone/iPad).
When I assign a large string:
string xml = "10 meg xml string from REST service";
then clear it
xml = null;
Will the GC free up that memory ASAP? Is there a way to make sure it's cleaned up? GC has a Collect method, but is it executed right away?
The problem is that I am downloading many large xml files in a loop and even though I am setting the string to null, memory use is growing.

In general GC does not happen immediately because garbage collection is relatively expensive. The runtime generally tries to do it while it's not busy doing other things, but this clearly isn't always possible. I ran into a non-deterministic out of memory error at one point because it was putting off GC too long (sometimes) while I was running a tight memory intensive loop. Lesson: Generally the collector knows what it is doing and you don't need to tweak it. But not always.
To force a collection to happen, you need two lines:
GC.Collect();
GC.WaitForPendingFinalizers();
EDIT: Wrote this before I saw your note on what you were running on. This is written based on the desktop .NET machine. It may (or may not) be different on mono / iPad.

First, avoid keeping large strings or large arrays in memory all at once, especially on memory-constrained devices like phones. Use XmlTextReader, for example, to parse XML files, and if you get them from the network, stream them to disk, etc.
Next, the garbage-collection issue: the current Mono GC does a conservative scan of the thread stacks, which means that some pointers to objects may still be visible to the GC even after the programmer has cleared them (like setting to null in your example).
To limit the consequences of this behavior, you should try to allocate or otherwise manipulate big arrays and strings in a separate stack frame. For example, instead of coding it this way:
while (true) {
    string s = get_big_string_from_network ();
    do_something_with_string (s);
    handle_ui ();
    s = null;
}
do the following:
void manipulate_big_string () {
    string s = get_big_string_from_network ();
    do_something_with_string (s);
}
...
while (true) {
    manipulate_big_string ();
    handle_ui ();
}
Normally, setting a reference to null has the intended effect only when applied to a static or instance field; using it with a method local variable may not be enough to hide the reference from the GC.
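The streaming advice above can be sketched with XmlReader (the modern pull-parsing API that superseded XmlTextReader); this is a minimal example with a small inline document standing in for the 10 MB download:

```csharp
using System;
using System.IO;
using System.Xml;

// A tiny stand-in for the large REST response.
string xml = "<feed><item id=\"1\"/><item id=\"2\"/><item id=\"3\"/></feed>";

// XmlReader pulls one node at a time, so only a small buffer lives in
// memory at any moment, never the whole document as one big string.
int itemCount = 0;
using (var reader = XmlReader.Create(new StringReader(xml)))
{
    while (reader.Read())
    {
        if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
            itemCount++;
    }
}
Console.WriteLine(itemCount); // 3
```

In the real app, the StringReader would be replaced by the network or file stream, so the full document never materializes as a string at all.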

I think if you are developing for iPhone you don't have the garbage collector from the desktop .NET Framework; memory management is handled by the Mono runtime on iOS.
I think you should check the Mono documentation in order to find out how to manage memory in this case. Xcode offers automatic object management called automatic reference counting (ARC), which is not garbage collection like in the .NET Framework; it is just an automatic tool to release unused objects.
Now, thinking just about .NET: when working with big strings, you should always use StringBuilder instead of plain string concatenation.
Now, thinking about iOS: you should not compare an app written for the iOS environment with a desktop application for Windows (on a PC you have far more resources). iOS will not allow a big spike in memory consumption; if an app does this, the operating system will close it automatically to keep the system running.
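The StringBuilder advice can be sketched in a few lines; repeated `+` concatenation allocates a new string on every pass, while StringBuilder grows one internal buffer:

```csharp
using System;
using System.Text;

// Build "0,1,2,3,4," without allocating an intermediate string per iteration.
var sb = new StringBuilder();
for (int i = 0; i < 5; i++)
    sb.Append(i).Append(',');

string result = sb.ToString(); // single final allocation
Console.WriteLine(result); // 0,1,2,3,4,
```

For a handful of concatenations the difference is negligible; it matters in loops or when assembling large payloads.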

I'm not a Mono expert, but here are some simple things to check. You noted that you are setting your variable to null, but are you actually calling a .Close() or .Dispose() method as appropriate, or placing the scope of the variable in question inside a using block?
It could be a case where you're stuck waiting for a finalizer (i.e. if you have a handle on an unmanaged resource like a file handle), or the variable is still in scope for some reason. This would result in increased memory pressure.
Ideally, handle opening your files one at a time in a method with explicitly clear variable scope, in combination with a using block, thus ensuring that the appropriate finalizers, etc. are called even if an exception is thrown.
Hope that helps!
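The using-block advice above can be sketched with a MemoryStream standing in for a file handle; Dispose() is guaranteed to run when the block exits, even if an exception is thrown inside it:

```csharp
using System;
using System.IO;

var ms = new MemoryStream();

// 'using' compiles to try/finally: writer.Dispose() flushes the buffer and
// releases the writer deterministically when the block ends.
using (var writer = new StreamWriter(ms, leaveOpen: true))
{
    writer.Write("hello");
} // Dispose() runs here, exception or not

ms.Position = 0;
string text = new StreamReader(ms).ReadToEnd();
Console.WriteLine(text); // hello
```

With a real FileStream the same pattern releases the OS file handle immediately instead of waiting for a finalizer.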

Related

Limiting the allowed RAM for a service, possible using MaxWorkingSet

I have a service that runs on a domain controller that is randomly accessed by other computers on the network. I can't shutdown the service and run it only when needed (this would defeat the purpose of running it as a service anyway).
The problem is that the memory used by the service doesn't seem to ever get cleared, and increases every time the service is queried by a remote computer.
Is there a way to set a limit on the RAM used by the application?
I've found a few references to using MaxWorkingSet, but none of the references actually tell me how to use it. Can I use MaxWorkingSet to limit the RAM used to, for example, 35MB? and if so, how? (what is the syntax etc?)
Otherwise, is there a function like "clearall()" that I could use to reset the variables and memory at the end of each run through? I've tried using GC.Collect(), but it didn't work.
Strictly speaking, MaxWorkingSet only affects the working set, which is the amount of physical memory. To restrict overall memory usage, you need the Job Object API. But it is dangerous if your program really needs that memory (much code doesn't handle OutOfMemoryException, and the .NET runtime can behave strangely when memory is low).
You need to:
Create a Win32 Job object
Set the maximum memory to the job
Assign your process to the job
Here is a wrapper for .NET.
Besides, you could try this method of GC: (for .NET 4.6 or newer)
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(2, GCCollectionMode.Forced, true, true);
(for older versions, but it sometimes doesn't work)
GC.Collect(2, GCCollectionMode.Forced);
The third parameter in the 4.6 version of GC.Collect() tells the runtime whether to do the garbage collection immediately. In older versions, GC.Collect() only notifies the runtime and leaves the decision to it.
As for some programming advice, I suggest you could wrap a class for one query. The class could be explicitly disposed after a query is done. It may help make GC smarter.
Finally, there are indeed some things in the .NET Framework that you need to manage yourself. Like Bitmap.GetHbitmap, they need to be released manually.

C# GC.Collect() and Memory

I am receiving a very large list as a method argument and would like to remove it from memory after it is used. Normally I would let the GC do its thing, but I need to be very careful of memory usage in this app.
Will this code accomplish my goal? I've read a lot of differing opinions and am confused.
public void Save(IList<Employee> employees)
{
    // I've mapped the passed-in list
    var data = Mapper<Employee, EmployeeDTO>.MapList(employees);

    // ?????????????
    employees = null;
    GC.Collect();

    // Continues to process very long running methods....
    // I don't want this large list to stay in memory
}
Maybe I should use another technique that I'm not aware of?
If the list is not used anymore, the GC will automatically collect it when available memory becomes an issue.
However, if the caller uses the list after passing it to your function, then the GC won't collect it even if you set it to null (all you have is one reference to the list; you can't do anything about other objects that hold references as well).
Unless you have a measurable problem don't try and outsmart the GC.
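The point about the caller's reference can be demonstrated directly; a minimal sketch with List<int> standing in for the employee list:

```csharp
using System;
using System.Collections.Generic;

// Assigning null to a parameter only clears the method's own copy of the
// reference; the caller's variable still roots the list on the heap.
void Save(List<int> employees)
{
    employees = null; // affects only this local slot, not the caller
}

var data = new List<int> { 1, 2, 3 };
Save(data);
Console.WriteLine(data.Count); // 3 — the caller's reference is untouched
```

So `employees = null;` inside Save buys nothing as long as the caller keeps its own reference alive.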
This is not a direct answer to the question, but some ideas on how to handle the low-memory situations the poster (Big Daddy) referred to.
If you're running into low memory situations on an x64 platform with 8 GB of memory, you should determine if your application is responsible for it. If it is, then run a memory profiler (CLR Profiler or something else or even get a full user dump and run WinDbg on it) to see what allocates the memory. It's possible that you have some objects that are not used anymore but are still referenced somewhere - this is not a true memory leak but it's memory that's not freed up in your application - most decent profilers will identify big objects (or objects with a lot of instances) along with their types.
I find it hard to believe that the list passed to this Save function would stress a server with 8 GB of memory but we don't know how much free memory is available to the process and what kind of a process it is (IIS, desktop, etc.)
If Save is called on several threads with huge inputs, it can potentially lead to memory stress but even then, it's not very likely and I'd check various counters and profile data to see when and why memory stress happens.

Releasing Memory taken by variables

I am having fun creating my own wallpaper-changer program. I know there are plenty on the internet, but I am simply trying to learn new stuff. Until now, every time I created a simple program, I didn't care about RAM/memory because I was mostly creating programs for school; they were one-time-use programs, and then I forgot about them.
But now I am trying to create an application I would actually like to use, something of my own. I noticed my program takes around ~4,000 K in the Task Manager (Alt+Ctrl+Del) window, and it sometimes takes up to 200,000 K when it changes the wallpaper; sometimes it goes back down, and sometimes it stays that high until it changes to another one.
So here comes the question: what are some tips to make my app use the least possible RAM while running (tray icon, and the main window is hidden using if (FormWindowState.Minimized == WindowState) Hide();)?
Does a variable inside a function take any memory? Example:
int function(int a) {
    int b = 0;
    int c = a + b;
    return c;
}
Or are these variables released after function returns some value?
I could use some tips, guides, and/or links to articles where I could get some info about that. Newbie-friendly, though.
EDIT:
Okay, I have read some, started disposing bitmaps, and got rid of one of the global variables I was using, and it's at a steady 4,000-7,000 K now, rising a little when changing the wallpaper, but then dropping back down. So I guess that's a kind of success for me. One more thing left: I downloaded a fairly big wallpaper-changing program with many more options than mine, and it still takes only around 1,000-2,000 K, so I'm going to read up on what can take so "much" RAM in mine. Right when I run my program it's about 4,100, so I guess I can still do something to optimize that. Thanks everyone for the answers, though! :)
Memory from your program's perspective is divided in two blocks if you will. The Stack and the Heap.
The Stack represents the current frame of execution (for instance the currently executing function), and it is used to pass function parameters and return values, and is where local variables are generally stored. That memory is purged when the current frame of execution ends (for example, your function exiting).
The Heap represents a memory pool where objects can be created and stored for longer periods of time. Generally, anything created using the "new" operator will go on the Heap, with the references existing on the Stack (for local context). If references to the allocated object stop being used, that memory remains taken until the Garbage Collector runs at some unspecified time in the future and frees the memory. When the GC runs can not be guaranteed - it might be when your program is running out of memory, or at scheduled intervals etc.
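The stack/heap split shows up directly in assignment semantics; a small illustration using built-in types:

```csharp
using System;

// Value types (int) are copied on assignment: each variable owns its bits.
int a = 1;
int b = a;   // full copy
b = 99;
Console.WriteLine(a); // 1 — 'a' is unchanged

// Reference types (arrays) live on the heap: assignment copies only the
// reference, so both variables point at the same object.
int[] x = { 1 };
int[] y = x;
y[0] = 99;
Console.WriteLine(x[0]); // 99 — both variables see the same array
```

This is also why setting one reference variable to null does not free the object while any other variable still points at it.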
I think in the memory behaviour you are observing, spikes are due to opening up and loading resources, troughs are after the GC runs. Another way to observe this is to look at a program's memory footprint when there is UI showing on the screen, and when the program is minimized. When minimized the memory footprint will shrink, because all the graphical elements are no longer necessary. When you maximize the UI and redraw it, memory usage peaks.
You can look at the following articles for a better understanding of Stack and Heap:
C# Stack and Heap
What are stack and heap?
You might also want to look into Garbage Collection:
Garbage collection article on MSDN
... and Value vs Reference types
Make sure you use using blocks around anything that implements the IDisposable interface, particularly if you are reading files, any streams, or any requests. You can read a bit more about it at http://msdn.microsoft.com/en-us/library/yh598w02(v=vs.80).aspx, which gives a few examples of how to use it.
Memory taken for locally declared variables will be automatically released.
Memory taken for variables that persist outside the function will be released too, when they are no longer used, by something called the garbage collector (GC for short).
So don't worry, you're not creating a memory leak with your example function.
It's difficult to say where those 200,000 K could be going. There are profilers which can help (I have none to recommend, but this one comes up first on Google: http://memprofiler.com/).

C# .NET Garbage Collection not functioning?

I am working on a relatively large solution in Visual Studio 2010. It has various projects, one of them being an XNA Game-project, and another one being an ASP.NET MVC 2-project.
With both projects I am facing the same issue: After starting them in debug mode, memory usage keeps rising. They start at 40 and 100MB memory usage respectively, but both climb to 1.5GB relatively quickly (10 and 30 minutes respectively). After that it would sometimes drop back down to close to the initial usage, and other times it would just throw OutOfMemoryExceptions.
Of course this would indicate severe memory leaks, so that is where I initially tried to spot the problem. After searching for leaks unsuccessfully, I tried calling GC.Collect() regularly (about once per 10 seconds). After introducing this "hack", memory usage stayed at 45 and 120MB respectively for 24 hours (until I stopped testing).
.NET's garbage collection is supposed to be "very good", but I can't help suspecting that it just isn't doing its job. I have used CLR Profiler in an attempt to troubleshoot the issue, and it showed that the XNA project had retained a lot of byte arrays I had indeed been using, but to which the references should already have been released, and which should therefore have been collected by the garbage collector.
Again, when I call GC.Collect() regularly, the memory usage issues seem to have gone. Does anyone know what could be causing this high memory usage? Is it possibly related to running in Debug mode?
After searching for leaks unsuccessfully
Try harder =)
Memory leaks in a managed language can be tricky to track down. I have had good experiences with the Redgate ANTS Memory Profiler. It's not free, but they give you a 14-day, full-featured trial. It has a nice UI and shows you where your memory is allocated and why these objects are being kept in memory.
As Alex says, event handlers are a very common source of memory leaks in a .NET app. Consider this:
public static class SomeStaticClass
{
    public static event EventHandler SomeEvent;
}

private class Foo
{
    public Foo()
    {
        SomeStaticClass.SomeEvent += MyHandler;
    }

    private void MyHandler(object sender, EventArgs e) { /* whatever */ }
}
I used a static class to make the problem as obvious as possible here. Let's say that, during the life of your application, many Foo objects are created. Each Foo subscribes to the SomeEvent event of the static class.
The Foo objects may fall out of scope at one time or another, but the static class maintains a reference to each one via the event handler delegate. Thus, they are kept alive indefinitely. In this case, the event handler simply needs to be "unhooked".
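The unhook step can be sketched at top level, using a plain delegate variable in place of the static class's event (the names here are illustrative):

```csharp
using System;

// 'someEvent' stands in for SomeStaticClass.SomeEvent in the answer above.
EventHandler someEvent = null;
int calls = 0;
EventHandler myHandler = (sender, e) => calls++;

someEvent += myHandler;                // subscribe: publisher now holds a
                                       // reference that keeps the subscriber alive
someEvent?.Invoke(null, EventArgs.Empty);

someEvent -= myHandler;                // unsubscribe: the reference is dropped,
                                       // so the subscriber can be collected
someEvent?.Invoke(null, EventArgs.Empty);

Console.WriteLine(calls); // 1 — the second raise had no subscribers
```

In the Foo example, the `-=` would typically live in a Dispose() method or an explicit Unsubscribe() so each Foo unhooks itself before falling out of scope.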
...the XNA project seemed to have saved a lot of byte arrays I was indeed using...
You may be running into fragmentation in the LOH. If you are allocating large objects very frequently they may be causing the problem. The total size of these objects may be much smaller than the total memory allocated to the runtime, but due to fragmentation there is a lot of unused memory allocated to your application.
The profiler I linked to above will tell you if this is a problem. If it is, you will likely be able to track it down to an object leak somewhere. I just fixed a problem in my app showing the same behavior and it was due to a MemoryStream not releasing its internal byte[] even after calling Dispose() on it. Wrapping the stream in a dummy stream and nulling it out fixed the problem.
Also, stating the obvious, make sure to Dispose() of your objects that implement IDisposable. There may be native resources lying around. Again, a good profiler will catch this.
My suggestion; it's not the GC, the problem is in your app. Use a profiler, get your app in a high memory consumption state, take a memory snapshot and start analyzing.
First and foremost, the GC works, and works well. There's no bug in it that you have just discovered.
Now that we've gotten that out of the way, some thoughts:
Are you using too many threads?
Remember that GC is non-deterministic; it'll run whenever it thinks it needs to run (even if you call GC.Collect()).
Are you sure all your references are going out of scope?
What are you loading into memory in the first place? Large images? Large text files?
Your profiler should tell you what's using so much memory. Start cutting at the biggest culprits as much as you can.
Also, calling GC.Collect() every X seconds is a bad idea and is unlikely to solve your real problem.
Analyzing memory issues in .NET is not a trivial task, and you definitely should read several good articles and try different tools to achieve a result. I ended up with the following article after my investigations: http://www.alexatnet.com/content/net-memory-management-and-garbage-collector You can also try to read some of Jeffrey Richter's articles, like this one: http://msdn.microsoft.com/en-us/magazine/bb985010.aspx
From my experience, there are two most common reasons for Out-Of-Memory issue:
Event handlers - they may hold the object even when no other objects referencing it. So ideally you need to unsubscribe event handlers to destroy the object automatically.
The finalizer thread is blocked by some other thread in STA mode. For example, when the STA thread does a lot of work, the other threads are stopped and the objects in the finalization queue cannot be destroyed.
Edit: Added link to Large Object Heap fragmentation.
Edit: Since it looks like it is a problem with allocating and throwing away the Textures, can you use Texture2D.SetData to reuse the large byte[]s?
First, you need to figure out whether it is managed or unmanaged memory that is leaking.
Use perfmon to watch your process's '.NET CLR Memory\# Bytes in all Heaps' and 'Process\Private Bytes' counters. Compare the numbers as memory rises. If the rise in private bytes outpaces the rise in heap memory, then it's unmanaged memory growth.
Unmanaged memory growth would point to objects that are not being disposed (but eventually collected when their finalizer executes).
If it's managed memory growth, then we'll need to see which generation/LOH (there are also performance counters for each generation of heap bytes).
If it's Large Object Heap bytes, you'll want to reconsider the use and throwing away of large byte arrays. Perhaps the byte arrays can be re-used instead of discarded. Also, consider allocating large byte arrays that are powers of 2. This way, when disposed, you'll leave a large "hole" in the large object heap that can be filled by another object of the same size.
A final concern is pinned memory, but I don't have any advice for you on this because I haven't ever messed with it.
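The reuse idea can be sketched as a tiny buffer pool (in modern .NET, System.Buffers.ArrayPool&lt;byte&gt;.Shared provides this ready-made; the hand-rolled pool here is purely illustrative):

```csharp
using System;
using System.Collections.Generic;

// Rent the same power-of-two byte[] instead of allocating a fresh large
// array per use, which would fragment the Large Object Heap.
var pool = new Stack<byte[]>();

byte[] Rent(int size) => pool.Count > 0 ? pool.Pop() : new byte[size];
void Return(byte[] buffer) => pool.Push(buffer);

byte[] first = Rent(1 << 20); // 1 MB, a power of two
// ... fill and use the buffer ...
Return(first);

byte[] second = Rent(1 << 20);
Console.WriteLine(ReferenceEquals(first, second)); // True — same array reused
```

Because the same block is handed back out, the LOH sees one stable allocation instead of a churn of equally sized holes.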
I would also add that if you are doing any file access, make sure that you are closing and/or disposing of any Readers or Writers. You should have a matching 1-1 between opening any file and closing it.
Also, I usually use the using clause for resources, like a Sql Connection:
using (var connection = new SqlConnection())
{
    // Do sql connection work in here.
}
Are you implementing IDisposable on any objects and possibly doing something custom that is causing any issues? I would double check all of your IDisposable code.
The GC doesn't take into account the unmanaged heap. If you are creating lots of objects that are merely wrappers in C# around larger unmanaged memory, then your memory is being devoured, but the GC can't make rational decisions based on this as it only sees the managed heap.
You end up in a situation where the GC collector doesn't think you are short of memory because most of the things on your gen 1 heap are 8 byte references where in actual fact they are like icebergs at sea. Most of the memory is below!
You can make use of these GC calls:
GC.AddMemoryPressure(sizeOfField);
GC.RemoveMemoryPressure(sizeOfField);
These methods allow the garbage collector to see the unmanaged memory (if you provide it the right figures)
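Usage is symmetric: report the pressure when the native block is allocated and remove the same amount when it is freed. A minimal sketch (the 10 MB figure is illustrative):

```csharp
using System;

// Pretend a small managed wrapper owns 10 MB of native memory the GC
// cannot see on the managed heap.
long nativeSize = 10L * 1024 * 1024;

GC.AddMemoryPressure(nativeSize);    // call when the native block is allocated
// ... work with the native resource here ...
GC.RemoveMemoryPressure(nativeSize); // call when it is freed (e.g. in Dispose)

bool balanced = true;                // both calls completed without throwing
Console.WriteLine(balanced);
```

In real code the Add/Remove pair usually lives in the wrapper's constructor and Dispose()/finalizer, so the pressure is always balanced over the object's lifetime.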

set_new_handler in C#

For creating my own memory management in C#, I need a way to intercept the new operator before it returns null or throws an exception. When new is used, I want to call the original handler first. If that handler fails to return a block of memory, I want to tell all my mappable objects to write themselves to disk and free memory.
In C++ there is a way to intercept new by assigning a different new handler (set_new_handler). In C# I couldn't find anything which shows the same behaviour.
Has anyone seen a way to do this?
Thanks
Martin
You can't do what you're after in C#, or in any managed language. Nor should you try. The .NET runtime manages allocations and garbage collection. It's impossible for you to instruct your objects to free memory, as you have no guarantee when (or, technically, even if) a particular object will be collected once it's no longer rooted. Even eliminating all references and manually calling GC.Collect() is not an absolute guarantee. If you're looking for granular memory management, you need to be using a lower-level environment.
As an important point, it is not possible for the new operator to return a null reference. It can only return either a reference to the specified type or throw an exception.
If you want to do your own management of how and when objects are allocated, you'll have to use something along the lines of a factory pattern.
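The factory idea can be sketched as a single allocation function that enforces a budget; the names and the limit here are illustrative, not an API ('new' itself cannot be intercepted in C#):

```csharp
using System;

// Route all large allocations through one function so the application can
// track or cap them, instead of trying to hook 'new'.
long totalAllocated = 0;
long limit = 64L * 1024 * 1024; // illustrative 64 MB budget

byte[] CreateBuffer(int size)
{
    if (totalAllocated + size > limit)
        throw new InvalidOperationException("allocation budget exceeded");
    totalAllocated += size;
    return new byte[size];
}

byte[] buf = CreateBuffer(1024);
Console.WriteLine(totalAllocated); // 1024
```

A fuller version would also offer a Release path that decrements the counter and triggers the "write mappable objects to disk" step when the budget is tight, before the allocation actually fails.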
I think you're approaching this from the wrong angle; the whole point of using a runtime with managed memory is so that you don't have to worry about memory. The tradeoff is that you can't do this type of low-level trickery.
As an aside, you can 'override new' for a limited class of objects (those descending from ContextBoundObject) by creating a custom ProxyAttribute, though this likely does not address what you're intending.
I believe that you are not understanding the side-effects of what you're asking for. Even in C++, you can't really do what you think you can do. The reason is simple, if you have run out of memory, you can't even make your objects serialize to disk because you have no memory to accomplish that. By the time memory is exhausted, the only real thing you can do is either discard memory (without saving or doing anything else first) or abend the program.
Now, what you're talking about will still work 95% of the time because your memory allocation will likely be sufficiently large that when it fails, you have a little room to play with, but you can't guarantee that this will be the case.
Example: If you have only 2MB of memory left, and you try to allocate 10MB, then it will fail, and you still have 2MB to play with to try and free up some memory, which will allow you to allocate small chunks of memory needed to serialize objects to disk.
But, if you only have 10 bytes of memory left, then you don't even have enough memory to create a new exception object (unless it comes from a reserved pool). So, in essence, you're creating a very poor situation that will likely crash at some point.
Even in C++ low memory conditions are almost impossible to get right, and it's almost impossible to recover from every case unless you have very carefully planned, and pre-allocated memory for your recovery routines.
Now, when you're talking about a garbage-collected runtime, you have no control over how memory is allocated or freed. At best, all you can do is give hints. There is very little you can reliably do here, by the nature of garbage collection: it's non-deterministic.
