In an Outlook add-in I'm working on, I use a list to grab all the messages in the current folder, then process them, then save them. First I create a list of all messages, then I create a second list from it, and finally I create a third list of the messages that need to be moved. Essentially they are all copies of each other; I structured it this way for organization. Would it improve performance if I used only one list? I thought lists just held references to the actual items.
Without seeing your code it is impossible to tell if you are creating copies of the list itself or copies of the reference to the list - the latter is preferable.
Another thing to consider is whether you could stream the messages from Outlook using an iterator block. By using a List&lt;T&gt; you are currently buffering the entire sequence of messages, which means you must hold them all in memory even though you only process them one at a time. Streaming the messages would reduce the memory pressure on your application, as you would only need to hold each message in memory long enough to process it.
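A minimal illustration of the difference (the message source here is a stand-in for enumerating the Outlook folder's Items collection):

```csharp
using System;
using System.Collections.Generic;

class StreamingDemo
{
    // Stand-in for the Outlook folder: in the real add-in this method
    // would "yield return" each MailItem from the folder's Items collection.
    static IEnumerable<string> GetMessages()
    {
        for (int i = 1; i <= 3; i++)
            yield return "message " + i; // produced lazily, one at a time
    }

    static void Main()
    {
        // No List<string> buffer: each message is processed and then
        // becomes eligible for collection before the next one is produced.
        foreach (string msg in GetMessages())
            Console.WriteLine(msg);
    }
}
```

With `yield return`, nothing is materialized until the `foreach` asks for the next element, so the whole folder is never held in memory at once.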
Unless your lists contain 10 million items or more, it should not be a problem.
Outlook tends to have problems with much smaller mailboxes anyway, so I would say you are pretty safe.
Related
I am working on a WinForms application where I have to load data from Web API calls. A few million rows of data will be returned and have to be stored in a Dictionary. The logic goes like this: the user clicks on an item and its data is loaded; if the user clicks on another item, another new dictionary is created. Over time, several such heavyweight Dictionary objects get created, and the user might not use the old ones again. Is this a case for using WeakReference? Note that recreating any Dictionary object takes 10 to 20 seconds. If I keep all the objects in memory, the application's performance slowly degrades over time.
The answer here is to use a more advanced technique.
Use a memory-mapped file to store the dictionaries on disk; then you don't have to worry about holding them all in memory at once as they will be swapped in and out by the OS per demand.
You will want to write a Dictionary designed specifically to operate in the memory mapped file region, and a heap to store things pointed to by the key value pairs in the dictionary. Since you aren't deleting anything, this is actually pretty straightforward.
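As a starting point, the raw mechanics of a memory-mapped file look like this (the file path and capacity are placeholders; a real dictionary-over-MMF would layer its bucket and heap layout on top of the view accessor):

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MmfSketch
{
    static void Main()
    {
        // Hypothetical backing file for the on-disk dictionary.
        string path = Path.Combine(Path.GetTempPath(), "demo.mmf");

        // The OS pages this file in and out on demand, so the data
        // never has to fit in the process's working set all at once.
        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Create, null, 1024))
        using (var accessor = mmf.CreateViewAccessor())
        {
            accessor.Write(0, 42);                    // store a value at offset 0
            Console.WriteLine(accessor.ReadInt32(0)); // read it back
        }

        File.Delete(path);
    }
}
```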
Otherwise you should take Fildor 4's suggestion and Just Use A Database, as it will basically do everything I just mentioned for you and wrap it up in a nice syntax.
I have a Dictionary&lt;T, U&gt; in an ASP.NET application. The application runs for a long time, and new elements are continuously added to the dictionary. My goal is to keep as many elements in the Dictionary as possible, without running out of memory or causing other side effects. How should I clean the collection to free memory for new elements by deleting old ones? How do I check how many elements are old, and when should they be deleted?
Have you checked out Application.Cache?
Sounds to me like it's just what you need...
http://msdn.microsoft.com/en-us/library/ms178597(v=vs.100).aspx
Quote:
The advantage of using the application cache is that ASP.NET manages the cache and removes items when they expire or become invalidated, or when memory runs low.
This is not a simple task. It sounds as if you are using the dictionary as a cache, in which case you should look into standard caching for ASP.NET.
That said, you could choose different strategies. You could set an upper bound on the number of elements you want in the dictionary. This can be done by keeping a linked list alongside the dictionary and letting the dictionary store LinkedListNode&lt;U&gt; values. Whenever you retrieve something from the dictionary, you move its node to the front of the linked list, so the most recently used elements are always at the top. When you insert an element, you add it to both the linked list and the dictionary, and test whether the dictionary's size limit has been reached. If it has, you remove elements from the bottom of the linked list and from the dictionary.
Remember locking!
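A minimal sketch of that linked-list-plus-dictionary eviction scheme (class and member names are mine, and locking is omitted for brevity, so this is not production-ready for ASP.NET as-is):

```csharp
using System;
using System.Collections.Generic;

// Dictionary gives O(1) lookup; the LinkedList tracks recency.
// Evict from the back of the list when over capacity.
class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> _order =
        new LinkedList<KeyValuePair<TKey, TValue>>();

    public LruCache(int capacity) { _capacity = capacity; }

    public void Add(TKey key, TValue value)
    {
        var node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
            new KeyValuePair<TKey, TValue>(key, value));
        _map[key] = node;
        _order.AddFirst(node);
        if (_map.Count > _capacity)
        {
            var oldest = _order.Last;      // least recently used
            _order.RemoveLast();
            _map.Remove(oldest.Value.Key);
        }
    }

    public bool TryGet(TKey key, out TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (_map.TryGetValue(key, out node))
        {
            _order.Remove(node);           // touched: move to the front
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;
    }
}

class Demo
{
    static void Main()
    {
        var cache = new LruCache<string, int>(2);
        cache.Add("a", 1);
        cache.Add("b", 2);
        int unused;
        cache.TryGet("a", out unused);  // touch "a" so "b" becomes oldest
        cache.Add("c", 3);              // evicts "b"
        Console.WriteLine(cache.TryGet("b", out unused)); // False
        Console.WriteLine(cache.TryGet("a", out unused)); // True
    }
}
```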
Since you are using a managed language, you are limited in how you can manage memory; manually freeing memory or forcing the garbage collector to run are both generally considered suboptimal practices.
So there is no direct way to say how many elements your generic dictionary is allowed to keep, or how much memory you can still claim. At the same time, deleting some old values from the dictionary will not automatically lead to an immediate increase in available memory.
That said, you can keep monitoring the memory currently allocated to your application using GC.GetTotalMemory (as long as your objects are fully managed, i.e. contain no unmanaged code or data blocks) and compare it to the total memory available to the application.
Or, if you know the size of the objects you put into your Dictionary&lt;&gt;, you might be interested in the MemoryFailPoint class, which will tell you whether you can create a new object of the given size, so you won't run into an OutOfMemoryException and will have time to free some resources.
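A sketch of how MemoryFailPoint is typically used (the 16 MB figure is an arbitrary stand-in for your estimated object size):

```csharp
using System;
using System.Runtime;

class FailPointSketch
{
    static void Main()
    {
        try
        {
            // Ask the CLR whether ~16 MB can likely be allocated before
            // committing to building the large dictionary entry.
            using (new MemoryFailPoint(16))
            {
                Console.WriteLine("enough memory, safe to allocate");
                // ... build the large object here ...
            }
        }
        catch (InsufficientMemoryException)
        {
            // Thrown up front instead of a hard OutOfMemoryException
            // mid-operation, giving us a chance to evict old entries.
            Console.WriteLine("low memory, evict old entries first");
        }
    }
}
```

Which branch runs depends on the machine's memory pressure at the time, which is exactly the point: you get to react before the allocation fails.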
Finally, if you are using the dictionary as a cache, there are some built-in cache providers in ASP.NET, read from here. The cache will automatically drop objects that are obsolete or old in case of memory pressure.
I use a List&lt;string&gt; to which I add a few hundred thousand random strings that are then used within my program.
I need this list right at the beginning of the program, and the random strings it uses are saved in a text file. At the moment, the first thing the program does after loading is add all the items to the list. However, since the list is exactly the same every single time, I wonder if there is a way to save it somehow so the list can be used directly and does not need to be rebuilt on every single startup.
The list needs to be persisted somewhere, otherwise when the application shuts down you will lose all its values. When the application shuts down, the memory that was used to store this list is returned to the operating system. So no, there is no way to have the list in memory when the application starts without reading it from somewhere — whether that is a file, a database or some other storage, you need to load it from there or regenerate it from scratch.
If you do not care about the file format in which the list is stored you could use a BinaryFormatter for faster serialization and deserialization compared to XML, JSON and other formats.
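A minimal sketch of that approach (note that BinaryFormatter is considered obsolete and is disabled by default in modern .NET, so this applies to the .NET Framework era of this question; the file path is a placeholder):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class PersistListSketch
{
    static void Main()
    {
        // Hypothetical cache file for the pre-built list.
        string path = Path.Combine(Path.GetTempPath(), "strings.bin");
        var list = new List<string> { "alpha", "beta", "gamma" };

        var formatter = new BinaryFormatter();
        using (var fs = File.Create(path))
            formatter.Serialize(fs, list);      // one-time save

        List<string> loaded;
        using (var fs = File.OpenRead(path))
            loaded = (List<string>)formatter.Deserialize(fs); // fast load on startup

        Console.WriteLine(loaded.Count);
        File.Delete(path);
    }
}
```

Note that the deserialized list still has to be read from disk and rebuilt in memory; the win is only in parsing speed versus a text format.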
I am building a WP7 application where I need to load around 20,000 hard-coded entries ({'a',"XYZ"},{'b',"mno"},...) on which I have to perform searches. I am trying to do this by creating a dictionary, making 'a' the key and "XYZ" the value. As soon as my dictionary gets filled, it throws an OutOfMemoryException.
How can I solve this problem, given that I am building a WP7 application?
Or is there some way to do this other than using a dictionary?
Whenever you are loading so much data onto a phone, you're doing it wrong. Firstly, the bandwidth issue is going to kill your app. Second, the memory issue has already killed your app. Thirdly, the CPU issue is going to kill your app. The conclusion is, your user ends up killing your app.
Recommended solution: find a way to categorize the data so that not all of it must download to the phone. Do your processing on the server where it belongs (not on a phone).
If you insist on processing so much data on the phone, first try to manage the download size. Remember you're talking about a mobile phone here, and not everywhere has max 3G speeds. Try to compress the data structure as much as possible (e.g. using a tree to store common prefixes). Also try to zip up the data before downloading.
Then count your per-object memory usage aggressively. Putting in 20,000 strings can easily consume a lot of memory. You want to reduce per-object memory usage as much as possible. In your example you are just putting strings in there, so I can't guess how you'd be using up the tens of MB allowed for a WP7 app. However, if you are putting in not just strings but large objects, count the bytes.
Also, manage fragmentation aggressively. The last thing you want to do is new Dictionary() followed by dict.Add(x,y); in a for-loop. When the dictionary's internal table runs full, a larger table is allocated and the entire contents are copied over, wasting the original space. You end up with lots of fragmented memory. Do new Dictionary(20000) or similar to reserve the space in one go.
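For example, the difference is just passing the expected size to the constructor:

```csharp
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        // Reserving capacity up front means the internal buckets are
        // allocated once, instead of being reallocated and copied each
        // time the table fills up during the 20,000 Adds.
        var dict = new Dictionary<char, string>(20000);

        for (int i = 0; i < 20000; i++)
            dict[(char)i] = "value " + i;   // no intermediate reallocations

        Console.WriteLine(dict.Count);
    }
}
```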
Instead of storing it in memory as a Dictionary, you can store it in a database (wp7sqlite) and fetch only the data required. This way you can store any amount of data.
Edit
No, nothing extra is required on the user's end. You can create the database using SQLite Manager and attach it to the project. Copy the DB to isolated storage on first use, and you can access the DB whenever you want. Check this link: DB helper. That link uses sqlitewindowsphone instead of WP7Sqlite; I prefer wp7sqlite, since I got an error using sqlitewindowsphone.
I have a data structure, specifically a queue, that is growing so large as to cause an out-of-memory exception. This was unexpected behavior, given the relatively simple object it holds (essentially one string field).
Is there an easy way, or a built-in .NET way, to save this collection to a file on disk (there is no database to work with) and have it continue to function transparently as a queue?
Maybe a queue is not an appropriate data structure for your problem. How many objects are in your queue? Do you really need to store all those strings for later? Could you replace the strings with something smaller like enums or something more object-oriented like the Flyweight design pattern?
If you are processing lots of data, sometimes it's faster to recompute or reload the original data than saving a copy for later. Or you can process the data as you load it and avoid saving it for later processing.
I would first investigate why you are getting the OOM.
If you are adding to the queue - keep a check on the size and perform some action when a threshold is breached.
Can you filter those items? Do the items have many duplicates? In which case you could replace duplicates with a pre-cached object.
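A minimal sketch of the pre-cached-duplicates idea: a pool dictionary hands out one canonical instance per distinct value, so the queue holds repeated references instead of repeated copies (names are mine):

```csharp
using System;
using System.Collections.Generic;

class DedupSketch
{
    // One canonical instance per distinct string; the queue then stores
    // many references to a few objects instead of many duplicate objects.
    static readonly Dictionary<string, string> Pool =
        new Dictionary<string, string>();

    static string Canonical(string s)
    {
        string cached;
        if (!Pool.TryGetValue(s, out cached))
            Pool[s] = cached = s;
        return cached;
    }

    static void Main()
    {
        var queue = new Queue<string>();
        foreach (var s in new[] { "pending", "pending", "done", "pending" })
            queue.Enqueue(Canonical(s));

        Console.WriteLine(Pool.Count);   // distinct values pooled
        Console.WriteLine(queue.Count);  // queued references
    }
}
```

The same shape works for richer objects than strings, which is where the savings become significant (strings built at runtime from a small set of distinct values, status objects, and so on).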
I would use Sqlite to save the data to disk.
In response to your comment on the question, I guess you could split your file-collecting thread into two threads:
The first thread merely counts the number of files to be processed, incrementing a volatile int count. This thread only updates the count; it does not store anything in the queue.
The second thread is very similar to the first one, except that it doesn't update the count, and instead it actually saves the data into the queue. When the size of the queue reaches a certain threshold, your thread should block for some time, and then resume adding data to the queue. This ensures that your queue is never larger than a certain threshold.
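If you are on .NET 4 or later, a bounded BlockingCollection&lt;T&gt; gives you this block-when-full behavior without hand-rolling the threshold logic; a minimal sketch (capacity and item count are arbitrary):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class BoundedQueueSketch
{
    static void Main()
    {
        // Add() blocks whenever 100 items are already pending, so the
        // queue can never grow beyond the threshold.
        using (var queue = new BlockingCollection<string>(boundedCapacity: 100))
        {
            var producer = Task.Run(() =>
            {
                for (int i = 0; i < 1000; i++)
                    queue.Add("file " + i);   // blocks until the consumer makes room
                queue.CompleteAdding();
            });

            int consumed = 0;
            foreach (var item in queue.GetConsumingEnumerable())
                consumed++;                   // never more than 100 buffered

            producer.Wait();
            Console.WriteLine(consumed);
        }
    }
}
```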
I would even guess that you wouldn't actually need the second thread. Since the first one would give you the count you need, you can find the actual files in your main thread, one file at a time. This way you'll save yourself from having the queue, which will reduce your memory requirements.
However, I doubt that your queue is the reason you're getting out-of-memory exceptions. Even if you're adding one million entries to the queue, that would only take about 512 MB of memory. I suggest you check your processing logic.
The specific answer is no, there is not an easy or built in way to do this. You have to write it to disk "yourself".
Figure out why you are getting the out-of-memory condition; it might surprise you. Maybe it's string interning, maybe you are fragmenting the GC heap with all the small object allocations. The CLR Profiler from Microsoft is fantastic for this.