How to avoid and predict Out of Memory Exception - c#

So, if I have a List<Bitmap> myBitmaps in which I need to store a lot of Bitmaps taken from a database, and it might throw an Out of Memory Exception, how can I avoid that?
Should I store a bunch of images on the hard drive, then load another bunch after the first bunch is processed, and so on?
Another case: if I decode Base64 to a String and encode a String back to Base64 against a large amount of data, how can I do it without causing an Out of Memory Exception, and how can I predict whether an Out of Memory Exception is going to occur in both cases?
PS: If someone offers a solution, could they please explain whether it slows performance and why.

From the sound of it, the Bitmaps are large and it will be impractical to store them all in memory. If you are doing bulk operations on them, then swapping to disk will certainly cause performance degradation, but given that in the real world you are working within hardware limits, there is no way around it.
You might be able to cache certain metadata about the bitmaps in memory to speed up operations. Another option is to cache entire bitmaps in memory and go to disk when the data is not available in core memory.
Exactly which ones you cache will again depend on your usage pattern. Consider using WeakReference as well.
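As a minimal sketch of a WeakReference-backed cache (the load delegate standing in for whatever database query you use is hypothetical): bitmaps stay in memory while the GC allows it, and are reloaded from the database once they have been collected.

using System;
using System.Collections.Generic;
using System.Drawing;

class WeakBitmapCache
{
    // Maps a database key to a weakly held Bitmap; the GC may reclaim
    // the Bitmap under memory pressure, in which case we reload it.
    private readonly Dictionary<int, WeakReference> _cache = new Dictionary<int, WeakReference>();
    private readonly Func<int, Bitmap> _load;   // hypothetical delegate that queries the database

    public WeakBitmapCache(Func<int, Bitmap> load)
    {
        _load = load;
    }

    public Bitmap Get(int id)
    {
        WeakReference weak;
        if (_cache.TryGetValue(id, out weak))
        {
            var cached = weak.Target as Bitmap;
            if (cached != null)
                return cached;               // still alive in memory
        }

        Bitmap loaded = _load(id);           // cache miss: go back to the database
        _cache[id] = new WeakReference(loaded);
        return loaded;
    }
}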
Base64 conversion for large data can be trivially turned into an online algorithm that converts a small amount of data at a time.
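As a rough sketch of such an online conversion, the framework's ToBase64Transform can be wrapped in a CryptoStream so that only a small buffer is ever in memory at once (this assumes a file-to-file scenario and .NET 4's Stream.CopyTo; the paths are placeholders).

using System.IO;
using System.Security.Cryptography;

static class Base64Streaming
{
    // Encodes a file to Base64 without ever holding the whole payload in memory:
    // data flows through the transform in small buffered chunks.
    public static void EncodeFileToBase64(string inputPath, string outputPath)
    {
        using (var input = File.OpenRead(inputPath))
        using (var output = File.Create(outputPath))
        using (var base64 = new CryptoStream(output, new ToBase64Transform(), CryptoStreamMode.Write))
        {
            input.CopyTo(base64, 81920);   // 80 KB buffer; only this much is in memory at a time
        }
    }
}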

I would prefer storing such a (potentially) large mass of data in a database. The performance with a database depends on the database, the hardware, and the connection to it.

If the bitmaps are so big, you could just load them one by one. Each time, load just one bitmap, process it, don't forget to Dispose() it, and then move on to the next bitmap.
If doing lots of SQL queries like this makes a difference for you, load and process the bitmaps in bunches of N (where N depends on your specific circumstances). There is no need to save them to the disk, that's what the database is for.
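A minimal sketch of that load-process-dispose loop, assuming a hypothetical LoadBitmapFromDatabase helper and your own Process method:

using System.Collections.Generic;
using System.Drawing;

static void ProcessAll(IEnumerable<int> bitmapIds)
{
    foreach (int id in bitmapIds)
    {
        // LoadBitmapFromDatabase and Process are hypothetical: your DB helper and per-image work.
        using (Bitmap bitmap = LoadBitmapFromDatabase(id))
        {
            Process(bitmap);
        }   // Dispose() runs here, releasing the GDI+ memory before the next image is loaded
    }
}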

Related

C# Dynamically keep recent objects in memory and load the olders when needed

I need to monitor the clipboard and build a list of past clips. The problem is that I need to keep as many clips as possible. While small text or file drops are inexpensive, a large image can occupy a large chunk of memory, for example a 20 MP image copied into the clipboard.
My plan is to save clips into SQLite; while the user navigates forward or backward through the clipboard history, a list in memory is dynamically built for a faster preview (so there is no need to wait for the SQL query and to load an image, for example).
A pagination system could be another solution.
I wonder if there is any kind of library that does this kind of job, something like a cache with memory management at the same time, or a better approach to this problem.
This sounds like you're trying to prematurely optimize without actually testing performance. Get your solution working before you even bother trying to come up with clever ways to conserve memory.
Next, try out some of the strategies you've outlined for managing memory and measure each. Decide which trade-off you'd rather live with.
The simplest would be to just leave everything in memory (in an array), have a memory limit setting, track the size of each item being added, and truncate the oldest items until the item being added can fit.
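A rough sketch of that simplest approach, assuming clips arrive as byte arrays and the limit is expressed in bytes:

using System.Collections.Generic;
using System.Linq;

class ClipHistory
{
    private readonly LinkedList<byte[]> _clips = new LinkedList<byte[]>();
    private readonly long _memoryLimitBytes;
    private long _currentBytes;

    public ClipHistory(long memoryLimitBytes)
    {
        _memoryLimitBytes = memoryLimitBytes;
    }

    public void Add(byte[] clip)
    {
        // Drop the oldest clips until the new one fits under the limit.
        while (_clips.Count > 0 && _currentBytes + clip.Length > _memoryLimitBytes)
        {
            _currentBytes -= _clips.First.Value.Length;
            _clips.RemoveFirst();
        }

        _clips.AddLast(clip);
        _currentBytes += clip.Length;
    }

    public IEnumerable<byte[]> NewestFirst()
    {
        return _clips.Reverse();   // newest clip sits at the end of the list, so reverse for preview
    }
}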

Extremely high rates of paging active memory to disk but low constant memory usage

As the title states, I have a problem with high page file activity.
I am developing a program that processes a lot of images, which it loads from the hard drive.
From every image it generates some data, which I save in a list. For every 3600 images, I save the list to the hard drive; its size is about 5 to 10 MB. The program runs as fast as it can, so it maxes out one CPU thread.
The program works, it generates the data that it is supposed to, but when I analyze it in Visual Studio I get a warning saying: DA0014: Extremely high rates of paging active memory to disk.
The memory consumption of the program, according to Task Manager, is about 50 MB and seems to be stable. When I ran the program I had about 2 GB left out of 4 GB, so I guess I am not running out of RAM.
http://i.stack.imgur.com/TDAB0.png
The DA0014 rule description says "The number of Pages Output/sec is frequently much larger than the number of Page Writes/sec, for example, because Pages Output/sec also includes changed data pages from the system file cache. However, it is not always easy to determine which process is directly responsible for the paging or why."
Does this mean that I get this warning simply because I read a lot of images from the hard drive, or is it something else? Not really sure what kind of bug I am looking for.
EDIT: Link to image inserted.
EDIT1: The images are about 300 KB each. I dispose each one before loading the next.
UPDATE: From experiments, it looks like the paging comes just from loading the large number of files. As I am no expert in C# or the underlying GDI+ API, I don't know which of the answers is most correct. I chose Andras Zoltan's answer as it was well explained and because it seems he did a lot of work to explain the reason to a newcomer like me. :)
Updated following more info
The working set of your application might not be very big - but what about the virtual memory size? Paging can occur because of this and not just because of its physical size. See this screenshot from Process Explorer of VS2012 running on Windows 8:
And in Task Manager? Apparently the private working set for the same process is 305,376 KB.
We can take from this a) that Task Manager can't necessarily be trusted and b) an application's size in memory, as far as the OS is concerned, is far more complicated than we'd like to think.
You might want to take a look at this.
The paging is almost certainly because of what you do with the files, and the high final figures almost certainly because of the number of files you're working with. A simple test of that would be to experiment with different numbers of files and generate a dataset of final paging figures alongside those. If the number of files is causing the paging, then you'll see a clear correlation.
Then take out any processing (but keep the image-loading) you do and compare again - note the difference.
Then stub out the image-loading code completely - note the difference.
Clearly you'll see the biggest drop in faults when you take out the image loading.
Now, looking at the Emgu.CV Image code, it uses the Image class internally to get the image bits - so that's firing up GDI+ via the function GdipLoadImageFromFile (second entry in this index) to decode the image (using system resources, plus potentially large byte arrays) - and then it copies the data to an uncompressed byte array containing the actual RGB values.
This byte array is allocated using GCHandle.Alloc (also surrounded by GC.AddMemoryPressure and GC.RemoveMemoryPressure) to create a pinned byte array to hold the image data (uncompressed). Now I'm no expert on .NET memory management, but it seems to me that we have the potential for heap fragmentation here, even if each file is loaded sequentially and not in parallel.
Whether that's causing the hard paging I don't know. But it seems likely.
In particular the in-memory representation of the image could be specifically geared around displaying, as opposed to being the original file bytes. So if we're talking JPEGs, for example, then a 300 KB JPEG could be considerably larger in physical memory, depending on its dimensions. E.g. a 1024x768 32-bit image is 3 MB - and that's allocated twice for each image, since it's loaded (first allocation) then copied (second allocation) into the EMGU image object before being disposed.
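As a quick illustration of that arithmetic (assuming 4 bytes per pixel for a 32-bit image):

// Decoded size in memory depends on the pixel dimensions, not the compressed file size.
int width = 1024, height = 768;
long bytesPerImage = (long)width * height * 4;   // 4 bytes per pixel for a 32-bit image
// 1024 * 768 * 4 = 3,145,728 bytes, roughly 3 MB per decoded copy of the image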
But you have to ask yourself if it's necessary to find a way around the problem. If your application is not consuming vast amounts of physical RAM, then it will have much less of an impact on other applications; one process hitting the page file lots and lots won't badly affect another process that doesn't, if there's sufficient physical memory.
However, it is not always easy to determine which process is directly responsible for the paging or why.
The devil is in that cop-out note. Bitmaps are mapped into memory from the file that contains the pixel data using a memory-mapped file. That's an efficient way to avoid reading and writing the data directly into/from RAM; you only pay for what you use. The mechanism that keeps the file in sync with RAM is paging. So it is inevitable that if you process a lot of images then you'll see a lot of page faults. The tool you use just isn't smart enough to know that this is by design.
Feature, not a bug.

How to resolve Out of memory exception in WP7 using dictionary?

I am building an application in WP7 where I need to load around 20,000 hard-coded entries ({'a',"XYZ"}, {'b',"mno"}, ...) on which I have to perform a search. So I am trying to do this by creating a dictionary, making 'a' the key and "XYZ" the value. As soon as my dictionary gets filled, it throws an Out of Memory Exception.
How can I solve this problem, considering that I am building a WP7 application?
Or is there some way other than using a dictionary?
Whenever you are loading so much data onto a phone, you're doing it wrong. Firstly, the bandwidth issue is going to kill your app. Second, the memory issue has already killed your app. Thirdly, the CPU issue is going to kill your app. The conclusion is, your user ends up killing your app.
Recommended solution: find a way to categorize the data so that not all of it must download to the phone. Do your processing on the server where it belongs (not on a phone).
If you insist on processing so much data on the phone, first try to manage the download size. Remember you're talking about a mobile phone here, and not everywhere has max 3G speeds. Try to compress the data structure as much as possible (e.g. using a tree to store common prefixes). Also try to zip up the data before downloading.
Then count your per-object memory usage aggressively. Putting in 20,000 strings can easily consume a lot of memory. You'd want to reduce per-object memory usage as much as possible. In your example, you are just putting strings in there, so I can't guess how you'd be using up the tens of MB allowable on a WP7 app. However, if you are putting in not just strings but large objects, count the bytes.
Also, manage fragmentation aggressively. The last thing you'll want to do is new Dictionary() and then dict.Add(x,y); in a for-loop. When the dictionary's internal table space runs full, it gets reallocated in a new place, and the entire dictionary is copied there, wasting the original space. You end up with lots of fragmented memory. Do a new Dictionary(20000) or something similar to reserve the space in one go up front.
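A minimal sketch of that, following the question's own char/string layout (hardCodedData is hypothetical and stands in for wherever the 20,000 pairs come from):

using System.Collections.Generic;

// Reserving capacity up front avoids the dictionary repeatedly reallocating and
// rehashing its internal tables as it grows past each size threshold.
var lookup = new Dictionary<char, string>(20000);

// hardCodedData: your 20,000 source key/value pairs.
foreach (KeyValuePair<char, string> entry in hardCodedData)
{
    lookup.Add(entry.Key, entry.Value);
}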
Instead of storing it in memory as a Dictionary, you can store it in a database (wp7sqlite) and fetch only the data required. In this way you can store any amount of data.
Edit
No, nothing extra is required from the user's end. You can create the database using SQLite Manager and attach it to the project. Copy the DB to isolated storage on first usage, and you can access the DB whenever you want. Check this link: DB helper. That link uses sqlitewindowsphone instead of WP7Sqlite; I prefer wp7sqlite since I got an error using sqlitewindowsphone.

.NET out of memory troubleshooting

After reading a few enlightening articles about memory in .NET, I understand that Out of Memory does not refer to physical memory (597499).
I thought I understood why a C# app would throw an out of memory exception -- until I started experimenting with two servers -- both have 2.5 GB of RAM, run Windows Server 2003, and run identical programs.
The only significant difference between the two is that one has 7% hard drive storage left and the other more than 50%.
The server with 7% storage space left is consistently throwing an out of memory exception while the other is performing consistently well.
My app is a C# web application that processes hundreds of MBs of String objects.
Why would this difference happen, seeing that the most likely reason for the out of memory issue is running out of contiguous virtual address space?
All I can think of is that you're exhausting the virtual memory. Sounds like you need to run a memory profiler on the app.
I've used the Red Gate profiler in similar situations in the past. You may be surprised how much memory your strings are actually using.
Is the paging file fragmentation different on each machine? High fragmentation could slow down paging operations and thus exacerbate memory issues. If the paging file is massively fragmented, sort it out e.g. bring the server off-line, set the paging file size to zero, defrag the drive, re-create the paging file.
It's hard to give any specific advice on how to deal with perf problems with your string handling without more detail of what you are doing.
Why would this difference happen seeing that the most likely reason for the out of memory issue is out of contiguous virtual address space?
With 7% free hard disk your server is probably running out of space to page out memory from either your process or other processes, hence it has to keep everything in RAM and therefore you are unable to allocate additional memory more often than on the server with 50% free space.
What solutions do you guys propose?
Since you've already run a profiler and seen at least 600 MB+ of usage with all the string data, you need to start tackling this problem.
The obvious answer would be to not hold all that data in memory. If you are processing a large data set then load a bit, process it and then throw that bit away and load the next bit instead of loading it all up front.
If it's data you need to serve, look at a caching strategy like LRU (least recently used) and keep only the hottest data in memory but leave the rest on disk.
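A minimal LRU cache sketch along those lines (the caller falls back to disk or the database on a miss; capacity and key/value types are up to you):

using System.Collections.Generic;

// Minimal LRU cache: a dictionary for O(1) lookup plus a linked list
// that keeps entries ordered from most to least recently used.
class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> _order =
        new LinkedList<KeyValuePair<TKey, TValue>>();

    public LruCache(int capacity) { _capacity = capacity; }

    public bool TryGet(TKey key, out TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (_map.TryGetValue(key, out node))
        {
            _order.Remove(node);           // mark as most recently used
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;                       // caller falls back to disk or the database
    }

    public void Put(TKey key, TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> existing;
        if (_map.TryGetValue(key, out existing))
        {
            _order.Remove(existing);
            _map.Remove(key);
        }
        if (_map.Count >= _capacity)
        {
            var coldest = _order.Last;      // evict the least recently used entry
            _map.Remove(coldest.Value.Key);
            _order.RemoveLast();
        }
        var node = _order.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
        _map[key] = node;
    }
}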
You could even offload the strings into a database (in-memory or disk-based) and let that handle the cache management for you.
A slightly left-of-field solution I've had to use in the past was simply compressing the string data in memory as it arrived and decompressing it again when needed, using SharpZipLib. Surprisingly, it wasn't that slow.
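The same idea can be sketched with the framework's built-in GZipStream instead of SharpZipLib: the strings live in memory only as compressed byte arrays and are inflated on demand.

using System.IO;
using System.IO.Compression;
using System.Text;

static class StringCompression
{
    // Keep the string as compressed UTF-8 bytes while it sits in memory...
    public static byte[] Compress(string text)
    {
        byte[] raw = Encoding.UTF8.GetBytes(text);
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return output.ToArray();       // ToArray still works after the stream is closed
        }
    }

    // ...and inflate it back only when the string is actually needed.
    public static string Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            var buffer = new byte[4096];
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
            return Encoding.UTF8.GetString(output.ToArray());
        }
    }
}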
I would agree that your best bet is to use a memory profiler. I've used .Net Memory Profiler 3.5 and was able to diagnose the issue, which in my case were undisposed Regex statements. They have demo tutorials which will walk you through the process if you're not familiar.
As to your question, any single reference to the strings, the jagged array for instance, would still prevent the strings from being collected. Without knowing more about your architecture, it would be tough to make a specific recommendation. I would suggest trying to optimize your app before extending memory, though. It will come back to bite you later.
An OutOfMemoryException is more likely to indicate fragmentation in your page file - not that you are out of RAM or disk space.
It is generally (wrongly) assumed that the page file is used as a swap disk - that RAM overflow is written to the page file. All allocated memory is stored in the page file and only data that is under heavy usage is copied to RAM.
There's no simple code fix to this problem other than trying to reduce the memory footprint of your application. But if you really get desperate you can always try PageDefrag, which is a free application originally developed by SysInternals.
There are a few tricks to increase memory (I don't know if this works with a web app, but it looks like it does):
"Out of memory? Easy ways to increase the memory available to your program"
http://blogs.msdn.com/b/calvin_hsia/archive/2010/09/27/10068359.aspx

Dealing with large number of text strings

My project, when it is running, will collect a large number of string text blocks (about 20K, and the largest I have seen is about 200K of them) in a short span of time and store them in a relational database. Each string text block is relatively small, averaging about 15 short lines (about 300 characters). The current implementation is C# (VS2008), .NET 3.5, and the backend DBMS is MS SQL Server 2005.
Performance and storage are both important concerns of the project, but the priority will be performance first, then storage. I am looking for answers to these:
Should I compress the text before storing them in DB? or let SQL Server worry about compacting the storage?
Do you know what will be the best compression algorithm/library to use for this context that gives me the best performance? Currently I just use the standard GZip in .NET framework
Do you know any best practices to deal with this? I welcome outside-the-box suggestions as long as they are implementable in the .NET Framework (it is a big project and this requirement is only a small part of it).
EDITED: I will keep adding to this to clarify points raised
I don't need text indexing or searching on this text. I just need to be able to retrieve it at a later stage for display as a text block using its primary key.
I have a working solution implemented as above, and SQL Server has no issue at all handling it. This program will run quite often and needs to work with a large data context, so you can imagine the size will grow very rapidly; hence every optimization I can do will help.
The strings are, on average, 300 characters each. That's either 300 or 600 bytes, depending on Unicode settings. Let's say you use a varchar(4000) column and use (on average) 300 bytes each.
Then you have up to 200,000 of these to store in a database.
That's less than 60 MB of storage. In the land of databases, that is, quite frankly, peanuts. 60 GB of storage is what I'd call a "medium" database.
At this point in time, even thinking about compression is premature optimization. SQL Server can handle this amount of text without breaking a sweat. Barring any system constraints that you haven't mentioned, I would not concern myself with any of this until and unless you actually start to see performance problems - and even then it will likely be the result of something else, like a poor indexing strategy.
And compressing certain kinds of data, especially very small amounts of data (and 300 bytes is definitely small), can actually sometimes yield worse results. You could end up with "compressed" data that is actually larger than the original data. I'm guessing that most of the time, the compressed size will probably be very close to the original size.
SQL Server 2008 can perform page-level compression, which would be a somewhat more useful optimization, but you're on SQL Server 2005. So no, definitely don't bother trying to compress individual values or rows, it's not going to be worth the effort and may actually make things worse.
If you can upgrade to SQL Server 2008, I would recommend just turning on page compression, as detailed here: http://msdn.microsoft.com/en-us/library/cc280449.aspx
As an example, you can create a compressed table like this:
CREATE TABLE T1
(c1 int, c2 nvarchar(50) )
WITH (DATA_COMPRESSION = PAGE);
If you can't use compression in the database, unfortunately your strings (no more than 300 chars) are not going to be worthwhile to compress using something like System.IO.Compression. I suppose you could try it, though.
Compression will consume resources and typically will hurt performance where significant time is just local communication and processing.
Not entirely clear on what you are asking.
In regard to performance - if you are compressing the strings in memory before storing them in the database, your program is going to be slower than if you just stuff the data straight into the table and let SQL worry about it later. The trade-off is that the SQL database will be larger, but 1 TB hard drives are cheap, so is storage really that big a deal?
Based on your numbers (200K by 300 bytes) you are only talking about roughly 60 MB. That is not a very large dataset. Have you considered using the Bulk Copy feature in ADO.NET (http://msdn.microsoft.com/en-us/library/7ek5da1a.aspx)? If all of your data goes into one table, this should be straightforward.
This would be an alternative to having something like EF generating essentially 200K insert statements.
UPDATE
Here is another example: http://weblogs.sqlteam.com/mladenp/archive/2006/08/26/11368.aspx
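A rough sketch of the SqlBulkCopy approach mentioned above (the table name, column layout, connectionString, and collectedStrings collection are all placeholders):

using System.Data;
using System.Data.SqlClient;

// Build the rows in memory once, then push them to SQL Server in a single bulk operation.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Body", typeof(string));

foreach (var item in collectedStrings)          // collectedStrings: your ~200K text blocks
{
    table.Rows.Add(item.Id, item.Text);
}

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.TextBlocks";   // hypothetical target table
        bulkCopy.BatchSize = 10000;                          // commit in chunks of 10K rows
        bulkCopy.WriteToServer(table);
    }
}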
I wouldn't worry about compressing them. For strings this size (300 characters or so), it's going to be more of a headache than it's worth. Compressing strings takes time (no matter how small), and SQL server 2005 does not have a native way of doing this, which means that you are going to have to write something to do it. If you do this in the application that is going to hurt your performance, you could write a CLR routine to do it in the database, but it is still going to be an extra step to actually use the compressed string in your application (or any other that uses it for that matter).
Space in a database is cheap, so you aren't really saving much by compressing all the strings. Your biggest problem is going to be keeping a large number of strings in your application's memory. If you are routinely going back to the database to load some of them and not trying to cache all of them at the same time, I wouldn't worry about it unless you are actually seeing problems.
Sounds like you would benefit from using Large-Value Data Types
These data types will store up to 2^31-1 bytes of data
If all of your strings are smallish, there is a diminishing return to be gained by compressing them. Without native SQL compression, they will not be searchable anyway if you compress them.
It sounds like you are trying to solve a definitely non-relational problem with a relational database. Why exactly are you using a database? It can be done of course, but some problems just don't fit well. TFS shows that you can brute-force a problem into an RDBMS once you throw enough hardware at it, but that doesn't make it a good idea.
