How to resolve Out of memory exception in WP7 using dictionary? - c#

I am building a WP7 application in which I need to load around 20,000 hard-coded entries ({'a',"XYZ"},{'b',"mno"},....) that I then have to search. I tried to do this by creating a dictionary with 'a' as the key and "XYZ" as the value. As soon as the dictionary gets filled, it throws an Out of memory exception.
How can I solve this problem, given that I am building a WP7 application?
Or is there some alternative to using a dictionary?

Whenever you are loading so much data onto a phone, you're doing it wrong. First, the bandwidth issue is going to kill your app. Second, the memory issue has already killed your app. Third, the CPU issue is going to kill your app. The conclusion is, your user ends up killing your app.
Recommended solution: find a way to categorize the data so that not all of it must download to the phone. Do your processing on the server where it belongs (not on a phone).
If you insist on processing so much data on the phone, first try to manage the download size. Remember you're talking about a mobile phone here, and not everywhere has max 3G speeds. Try to compress the data structure as much as possible (e.g. using a tree to store common prefixes). Also try to zip up the data before downloading.
Then count your per-object memory usage aggressively. Putting in 20,000 strings can easily consume a lot of memory. You'll want to reduce per-object memory usage as much as possible. In your example you are just putting strings in there, so I can't guess how you'd be using up the tens of MB allowed for a WP7 app. However, if you are storing not just strings but larger objects, count the bytes.
Also, manage fragmentation aggressively. The last thing you want to do is new Dictionary() and then dict.Add(x,y); in a for-loop. When the dictionary's internal table runs out of space, a new one is allocated elsewhere and the entire dictionary is copied over, wasting the original space. You end up with lots of fragmented memory. Do a new Dictionary(20000) or something similar to reserve the space in one go up front.
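A minimal sketch of reserving capacity up front, reusing the sample data from the question (the char key type is just an assumption based on that sample):

var lookup = new Dictionary<char, string>(20000); // reserve space once, avoiding repeated grow-and-copy
lookup.Add('a', "XYZ");
lookup.Add('b', "mno");
// ... add the remaining entries ...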

Instead of storing it in memory as a Dictionary, you can store it in a database (wp7sqlite) and fetch only the data required. That way you can store any amount of data.
Edit
No, nothing extra is required from the user's end. You can create the database using SQLite Manager and attach it to the project. Copy the DB to isolated storage on first usage, and you can then access the DB whenever you want. Check this link: DB helper. That link uses sqlitewindowsphone instead of WP7Sqlite; I prefer wp7sqlite since I got an error using sqlitewindowsphone.
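A rough sketch of the "copy DB to isolated storage on first usage" step; the file name data.sqlite and its packaging as a content resource are assumptions for illustration:

using System;
using System.IO;
using System.IO.IsolatedStorage;
using System.Windows;

private static void EnsureDatabaseCopied()
{
    using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        if (store.FileExists("data.sqlite"))
            return; // already copied on a previous run

        var resource = Application.GetResourceStream(new Uri("data.sqlite", UriKind.Relative));
        using (Stream input = resource.Stream)
        using (IsolatedStorageFileStream output = store.CreateFile("data.sqlite"))
        {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }
    }
}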

Related

Application performance degradation due to several large in memory Dictionary objects in C#

I am working on a WinForms application where I have to load data from Web API calls. A few million rows of data will be returned and have to be stored in a Dictionary. The logic goes like this: the user clicks on an item and its data is loaded. If the user clicks on another item, another new Dictionary is created. Over time, several such heavyweight Dictionary objects get created. The user might not use the old Dictionary objects after some time. Is this a case for using WeakReference? Note that recreating any Dictionary object takes 10 to 20 seconds. If I opt to keep all the objects in memory, the application's performance slowly degrades over time.
The answer here is to use a more advanced technique.
Use a memory-mapped file to store the dictionaries on disk; then you don't have to worry about holding them all in memory at once, as they will be swapped in and out by the OS on demand.
You will want to write a Dictionary designed specifically to operate in the memory mapped file region, and a heap to store things pointed to by the key value pairs in the dictionary. Since you aren't deleting anything, this is actually pretty straightforward.
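A minimal sketch of the memory-mapped file idea, using fixed-size records addressed by index; the file name, map name, record size and count are placeholders, and a real key/value store would still need the index and heap layout described above:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

// Each record is a fixed 256-byte UTF-8 slot, so the OS pages data in and out
// on demand instead of the whole structure living on the GC heap.
const int RecordSize = 256;
const long RecordCount = 1000000;

using (var mmf = MemoryMappedFile.CreateFromFile(
    "lookup.cache", FileMode.OpenOrCreate, "lookupCache", RecordSize * RecordCount))
using (var view = mmf.CreateViewAccessor())
{
    // Write record #42 (payload assumed to fit within one slot).
    byte[] payload = Encoding.UTF8.GetBytes("some value");
    view.WriteArray(42L * RecordSize, payload, 0, payload.Length);

    // Read it back later.
    byte[] buffer = new byte[RecordSize];
    view.ReadArray(42L * RecordSize, buffer, 0, RecordSize);
    string value = Encoding.UTF8.GetString(buffer).TrimEnd('\0');
}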
Otherwise you should take Fildor 4's suggestion and Just Use A Database, as it will basically do everything I just mentioned for you and wrap it up in a nice syntax.

Best way to manage a large amount of data in memory?

I'm trying to find the best way to manage a large amount of data in memory using C# without accessing a database. (The DB will be used to store just part of this information once it becomes final.)
When I say "large amount of data" I'm talking about hundreds of megabytes, and I would like to manage a complex structure, not just something like a table with millions of records.
I need to search inside them as fast as possible and I need to be able to remove part of them when they become obsolete.
Luckily I can split this data into groups that don't need to be related in any way... so I don't need to find or update a row among millions; instead I need to find a group among, let's say, 50,000 others, search, add and remove data within that group, and delete the whole group when it becomes obsolete.
I have some projects that already manage data in memory, but nothing this huge, so I don't know whether these methods are also applicable in this situation:
- I used the .NET cache object, but I never worked with more than 10 or 20 megabytes
- private static List data = new List(); in which I stored groups of data in XML format, but I never worked with more than a couple of megabytes
- DataTable objects, one per group; in this last case, too, I never worked with more than 10 megabytes, and I had problems managing access because DataTables aren't thread-safe
What could be the best way to manage this kind of situation? Is there any kind of limit in Windows or in the .NET Framework that could cause me problems?
You should be fine using memory to store the data in a data structure and working with it there. The trade-off is that the memory will not be available to other applications running on the same server. Also, when you try to commit the data from memory to disk/DB, the time taken for large data sets is longer.
The structure has to be defined based on your data. If the data is not interrelated, you should create different objects for each entity.
You will also need to devise a strategy to update/refresh your cache. It can be hourly, daily, or weekly, depending on your needs.
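As a rough sketch of keeping each group in its own object, so a group can be searched on its own and dropped in one call when it becomes obsolete (the class, key types, and method names here are placeholders, not taken from the question):

using System.Collections.Generic;

// Items are bucketed per group; removing a group drops the whole bucket and
// lets the GC reclaim it.
class GroupedCache<TKey, TItem>
{
    private readonly Dictionary<string, Dictionary<TKey, TItem>> groups =
        new Dictionary<string, Dictionary<TKey, TItem>>();

    public void Add(string group, TKey key, TItem item)
    {
        Dictionary<TKey, TItem> bucket;
        if (!groups.TryGetValue(group, out bucket))
        {
            bucket = new Dictionary<TKey, TItem>();
            groups[group] = bucket;
        }
        bucket[key] = item;
    }

    public bool TryFind(string group, TKey key, out TItem item)
    {
        Dictionary<TKey, TItem> bucket;
        if (groups.TryGetValue(group, out bucket))
            return bucket.TryGetValue(key, out item);
        item = default(TItem);
        return false;
    }

    // Drop an entire group once it becomes obsolete.
    public void RemoveGroup(string group)
    {
        groups.Remove(group);
    }
}

If several threads touch the cache (the DataTable thread-safety problem mentioned in the question), wrap access in a lock or use ConcurrentDictionary instead.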

How to avoid and predict Out of Memory Exception

So, if I have a List<Bitmap> myBitmaps in which I have to store a lot of Bitmaps taken from a database, and it might throw an Out of Memory Exception, how can I avoid that?
Should I store a bunch of images on the hard drive, then load another bunch after processing the first bunch, and so on?
Another case: if I decode Base64 to a string and encode a string to Base64 for a large amount of data, how can I do that without causing an Out of Memory Exception, and how can I predict whether an Out of Memory Exception is going to occur in both cases?
PS: If someone offers a solution, could they please explain whether it slows performance and why?
From the sound of it, the Bitmaps are large and it will be impractical to store them all in memory. If you are doing bulk operations on them, swapping to disk will certainly cause performance degradation, but given that in the real world you are working within hardware limits, there is no alternative.
You might be able to cache certain metadata about the bitmaps in memory to speed up operations. Another option is to cache entire bitmaps in memory and go to disk when the data is not available in memory.
Exactly which ones you cache will again depend on your usage pattern. Consider using WeakReference as well.
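A small sketch of a WeakReference-based bitmap cache; the int key and LoadBitmapFromDisk are hypothetical placeholders:

using System;
using System.Collections.Generic;
using System.Drawing;

// Cached bitmaps can be reclaimed by the GC under memory pressure; when that
// happens, the bitmap is simply reloaded from disk.
private readonly Dictionary<int, WeakReference> cache = new Dictionary<int, WeakReference>();

public Bitmap GetBitmap(int id)
{
    WeakReference entry;
    if (cache.TryGetValue(id, out entry))
    {
        var cached = entry.Target as Bitmap;
        if (cached != null)
            return cached;              // still alive in memory
    }

    Bitmap loaded = LoadBitmapFromDisk(id); // hypothetical reload
    cache[id] = new WeakReference(loaded);
    return loaded;
}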
Base64 conversion for large data can be trivially turned into an online algorithm that converts a small amount of data at a time.
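For example, a streaming encode can be sketched with ToBase64Transform so that only a small buffer is in memory at any time; the file names are placeholders:

using System.IO;
using System.Security.Cryptography;

// Encode input.bin to Base64 one block at a time instead of building one huge string.
using (FileStream input = File.OpenRead("input.bin"))
using (FileStream output = File.Create("output.b64"))
using (var base64 = new CryptoStream(output, new ToBase64Transform(), CryptoStreamMode.Write))
{
    byte[] buffer = new byte[4096];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        base64.Write(buffer, 0, read);
    }
    base64.FlushFinalBlock();
}

FromBase64Transform works the same way for decoding in the other direction.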
I would prefer storing such a (possible) mass of data in a database. The performance with a database depends on the database, the hardware and the connection to the database.
If the bitmaps are so big, you could just load them one by one. Each time, load just one bitmap, process it, don't forget to Dispose() it, and then move on to the next bitmap.
If doing lots of SQL queries like this makes a difference for you, load and process the bitmaps in batches of N (where N depends on your specific circumstances). There is no need to save them to disk; that's what the database is for.
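A sketch of the load-one, process-one, dispose-one loop; bitmapIds, LoadBitmapBytes, and ProcessBitmap are hypothetical placeholders for the database read and the per-image work:

using System.Drawing;
using System.IO;

foreach (int id in bitmapIds)
{
    byte[] raw = LoadBitmapBytes(id);        // fetch a single row from the database
    using (var stream = new MemoryStream(raw))
    using (var bitmap = new Bitmap(stream))  // only one bitmap alive at a time
    {
        ProcessBitmap(bitmap);
    }                                        // Dispose() frees the GDI+ handle here
}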

Reading from SerialPort & Memory Management - C#

I am working on a program that reads in from a serial port, then parses, formats, and displays the information appropriately. This program, however, needs to run for upwards of 12 hours, constantly handling a stream of incoming data. I am finding that as I let my app run for a while, the memory usage increases at a linear rate, which is not good for 12 hours of usage.
I have implemented a logger that writes the raw incoming binary data to a file - is there a way I can utilize this idea to clear my memory cache at regular intervals? I.e. how can I, every so often, write to the log file in such a way that the data doesn't need to be stored in memory?
Also - are there other aspects of a Windows Form Application that would contribute to this? E.g. I print the formatted strings to a textbox, which ends up displaying the entire string. Because this is running for so long, it easily displays hundreds of thousands of lines of text. Should I be writing this to a file and clearing the text? Or something else?
Obviously, if the string grows over time, your app's memory usage will also grow over time. Also, WinForms textboxes can have trouble dealing with very large strings. How large does the string get?
Unless you really want to display the entire string onscreen, you should definitely clear it periodically (depending on your users' expectations); this will save memory and probably improve performance.
Normally, memory management in .NET is completely automatic. You should be careful about extrapolating short observations (minutes) to a 12-hour period. And please note that Task Manager is not a very good tool for measuring memory usage.
Writing out the incoming data should not increase memory usage significantly. But there are a few things you should avoid doing, and concatenating to a string over and over is one of them. Your TextBox is probably costing a lot more than you seem to think. Using a ListBox would be more efficient. And easier.
I have several serial applications which run either as an application or as a windows service. These are required to be up 24/7-365. The best mechanism I have found to avoid this same problem is two-fold.
1) Write the information out to a log file. For a service, this is the only way of getting the info out. The log file does not increase your memory usage.
2) For the application, write the information out to a log file as well as putting it into a listbox. I generally limit the listbox to the last 500 or 1000 entries. With the newer .NET controls, listboxes are virtualized, which helps, and you also avoid other memory issues such as textbox concatenation.
You can take a system down with a textbox by constantly appending to its string over a number of hours, as it is not intended for that kind of abuse out of the box.
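A rough sketch of point 2 above; logWriter and listBox1 are assumed to be an already-open StreamWriter and the WinForms ListBox:

// Append to the log file first, then keep only the last 1000 lines on screen.
private void AppendLine(string line)
{
    logWriter.WriteLine(line);
    listBox1.Items.Add(line);
    while (listBox1.Items.Count > 1000)
    {
        listBox1.Items.RemoveAt(0);
    }
}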

Memory Limits

I have an app which I have created to copy users (10,000+) from one domain and create the user in another domain with custom properties.
The app seems to run fine until it hits 1.7 GB of memory. I know there is a limit of 2 GB per process on 32-bit machines, but I am running this on a copy of Windows Server 2008 x64 with 24 GB of RAM.
My app does not crash but completes before it should (around 3,000 users). Maybe a memory limitation is not my problem here, but it was the first thing that stood out when comparing my app to a simpler app that just counts the users as it loops through.
I have my project set to "Any CPU" and it shows in task manager without the *32 flag.
Can anyone help me to understand what is going on?
Thanks
Does any single data structure in your program exceed 2 GB? The .NET runtime can access more than 2 GB, but no single object can be larger than 2 GB in size. For example, you can't allocate an array of bytes that's larger than 2 GB.
This can trip you up when using .NET collections. In particular the generic Dictionary and HashSet collection types.
Are you getting the out of memory exception when trying to add something to a collection?
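To illustrate the single-object limit, a quick check like this sketch throws even in a 64-bit process (on .NET 4.5+ the gcAllowVeryLargeObjects setting can lift the cap for 64-bit):

using System;

try
{
    // ~2.4 GB of int data in one array exceeds the 2 GB per-object cap.
    int[] huge = new int[600000000];
    Console.WriteLine(huge.Length);
}
catch (OutOfMemoryException)
{
    Console.WriteLine("Single-object size limit hit.");
}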
The bulk of my application loops through all 10,000+ users using the DirectorySearcher; then for each DirectoryEntry found, I create an account using the following:
using (DirectoryEntry newUser = root.Children.Add("LDAPPATH"))
{
    // Assign properties to the newUser
    newUser.CommitChanges();
    // Write the new username to a file using a StreamWriter.
}
I then close the StreamWriter and report the number of users that have been created.
I have the PageSize of the DirectorySearcher set to 110000 to get all the results. My app completes as if everything worked, but it only reports 3500 users created.
I created a simple test using the same code for the DirectorySearcher, and instead of creating the accounts I just incremented a counter; the counter reached the number I expected, so I think it must be something with the account creation or the logging to file.
What is your source for the records? Can you use a connected DataReader type model rather than a disconnected DataTable/DataSet type model to avoid keeping more than the current record in memory?
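A sketch of the connected-reader model for a SQL source; the connection string, query, column, and CreateUser call are placeholders (the question's actual source is Active Directory, so this only illustrates the reading pattern):

using System.Data.SqlClient;

// Only the current row is materialised; nothing accumulates in a DataTable.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT UserName FROM SourceUsers", connection))
{
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            string userName = reader.GetString(0);
            CreateUser(userName);   // hypothetical per-user processing
        }
    }
}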
I doubt you just happen to have a server with 24GB of RAM lying around not doing anything else. It must have some job to do that requires the RAM be available for that purpose. Just because you have that much RAM on the machine doesn't necessarily mean it's available, or that it's a good idea to rely on it.
Whether or not you should be hitting a limit, it sounds to me like you're holding onto references to some objects that you don't really need.
If you don't know why you're consuming so much memory you might want to try profiling your memory usage to see if you really need all that memory. There are many options, but the Microsoft CLR Profiler is free and would do what you need.
