Does Session.Remove() clear memory in C#?

I am using a Session variable to pass a DataTable from one page to another. Sometimes the DataTable can contain well over 100,000 records. After running it a few times, I get an Out of Memory exception, so I have a few questions:
Is Session the best way to handle this?
Does Session.Clear("session") release it from Memory? If not, does anything release the Session from memory?
If I store a DataTable in a Session object and then store another DataTable in that same Session object, does it keep using up memory, or does it overwrite the existing object?

I'll assume you're talking about In-Process session state.
You aren't actually storing the DataTable itself in session; you are storing a reference to a DataTable. Thus when you create a new DataTable and add it to session, you're simply overwriting the reference. You still have two DataTables somewhere in memory, until garbage collection cleans up any to which there are no live references.
Remember that garbage collection in .NET is non-deterministic. That is to say, setting an object to null does not immediately release the memory. It simply makes the object eligible for collection, and at some point in the future the garbage collector may see the dead object and release the memory associated with it.
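To make that concrete, here is a minimal sketch of the reference semantics described above (the page class and session key are made up for illustration):

    using System;
    using System.Data;
    using System.Web.UI;

    public partial class ResultsPage : Page   // hypothetical page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            var first = new DataTable("First");   // lives on the managed heap
            Session["Results"] = first;           // session stores only a reference

            var second = new DataTable("Second");
            Session["Results"] = second;          // overwrites the reference, not the
                                                  // table; "first" is now eligible for
                                                  // collection (if nothing else uses it)

            Session.Remove("Results");            // drops the reference to "second";
                                                  // the memory is freed whenever the GC
                                                  // runs, not at the moment of the call
        }
    }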
You probably want to rethink your design if you're running out of memory. It might be better to have the second page refetch the data again, perhaps from a caching layer (possibly implemented on an application server as suggested by another poster) or perhaps from the database itself.

Use an app-server layer to hold the data, and have each page take it from there.

My first question would be: why do you need to store the entire database in the Session or the Application? Here is a good article that goes over all of your options; it advises against storing large amounts of data in the Session or Application caches. What issue are you trying to resolve by doing this?
Edit:
Are you displaying all the data at once on that page, e.g. scrolling down through 10,000 records? If so, that doesn't sound very user friendly (an assumption on my part). Have you considered paging the data? You could have pages of 50 records and n number of pages. That would make the data call a lot faster, and you could then implement filters, sorting, etc.
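As a rough sketch, paging could look something like this with LINQ; the repository, the Order type, and the query source are assumptions, not taken from the question:

    using System.Collections.Generic;
    using System.Linq;

    public class Order { public int Id { get; set; } }   // hypothetical entity

    public class OrderRepository
    {
        // Fetch a single page of records instead of the whole table.
        public IList<Order> GetPage(IQueryable<Order> orders, int pageIndex, int pageSize)
        {
            return orders
                .OrderBy(o => o.Id)          // Skip/Take require a stable ordering
                .Skip(pageIndex * pageSize)  // a LINQ-to-SQL/EF provider translates
                .Take(pageSize)              // this to SQL, so only one page of rows
                .ToList();                   // ever crosses the wire
        }
    }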

Related

Application performance degradation due to several large in memory Dictionary objects in C#

I am working on a WinForms application where I have to load data from Web API calls. A few million rows of data will be returned and must be stored in a Dictionary. The logic goes like this: the user clicks on an item and its data is loaded. If the user clicks on another item, another new Dictionary is created. Over time, several such heavyweight Dictionary objects get created, and the user might not touch the old ones again. Is this a case for using WeakReference? Note that recreating any Dictionary object takes 10 to 20 seconds. If I opt to keep all the objects in memory, application performance slowly degrades over time.
The answer here is to use a more advanced technique.
Use a memory-mapped file to store the dictionaries on disk; then you don't have to worry about holding them all in memory at once, as they will be swapped in and out by the OS on demand.
You will want to write a Dictionary designed specifically to operate in the memory-mapped file region, plus a heap to store the things pointed to by the key-value pairs in the dictionary. Since you aren't deleting anything, this is actually pretty straightforward.
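For illustration, here is a minimal sketch of the primitive such a dictionary would be built on (not the custom dictionary itself); the file name, map name, and capacity are arbitrary:

    using System;
    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MmfSketch
    {
        static void Main()
        {
            // Create (or open) a 1 GB file-backed mapping. The OS pages regions
            // in and out on demand, so the data never has to fit in RAM at once.
            using (var mmf = MemoryMappedFile.CreateFromFile(
                       "data.bin", FileMode.OpenOrCreate, "demoMap", 1L << 30))
            using (var view = mmf.CreateViewAccessor())
            {
                // A real dictionary would lay out its buckets and a value heap
                // inside this region; here we just write and read one record.
                view.Write(0, 42L);                   // long value at offset 0
                Console.WriteLine(view.ReadInt64(0)); // prints 42
            }
        }
    }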
Otherwise you should take Fildor 4's suggestion and Just Use A Database, as it will basically do everything I just mentioned for you and wrap it up in a nice syntax.

ASP.NET session management for huge object instance

What is the best way to approach ASP.NET session management when I have a huge object instance to keep in session? The session should also be database-managed, i.e. the session ID is kept in the database.
Also, the object instance that I keep in session has DataTable instances within it.
Thanks in advance for any suggestions/ comments.
ASP.NET has a few session state providers; one is SqlSessionStateStore, which stores session state in SQL Server.
You can read more on MSDN.
Another thing is that storing huge objects in session (especially DataTables) seems like a bad idea. One problem is that you can hit memory issues, locks, and data going out of sync; another is that retrieving, serializing, and deserializing the object can be both time- and resource-consuming.
The 'best way' would be removing huge objects from session storage. A DataTable instance in session storage is a textbook 'never use session like this' example.
ASP.NET session is an InProc collection (by default). It saves all the data you place in the Session collection in the current process's memory, so if the app pool recycles, you lose the session.
Every other type of 'session provider' you could try requires serializing the data placed in session. This means HUGE problems for your DataTables.
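A small sketch of what that constraint means in practice; the DTO and page class are hypothetical:

    using System;
    using System.Web.UI;

    // Anything placed in an out-of-process session (StateServer or SQL Server)
    // must be serializable, or the provider throws when persisting the session.
    [Serializable]
    public class SearchResult   // hypothetical lightweight DTO, used instead of a DataTable
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public partial class SearchPage : Page   // hypothetical page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // With InProc this stores a cheap reference; with SqlSessionStateStore
            // the object graph is serialized on every write and deserialized on
            // every read -- exactly the per-request cost warned about above.
            Session["result"] = new SearchResult { Id = 1, Title = "example" };
        }
    }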

ASP.NET Memory Management Techniques

We have several different projects using ASP.NET and DevExpress ASPxGridView components. Throughout the development of these projects, several databinding techniques have been used, and we're now finding that some of these projects are eating up all the memory on the server.
Originally we were using a call to a stored procedure and binding a DataSet to the gridview, but on DX's recommendation we modified this to an ObjectDataSource and created an object that ultimately runs a LINQ statement against the DB and returns a generic list of objects, which is then bound.
Unfortunately, this has not cured the problem. We're still noticing large amounts of memory being eaten up, and I'm trying to get to the bottom of it. When running through the RedGate memory profiler, I notice lots of strings, RuntimeTypeHandles, and instances of my object created every time we rebind to the grid.
The DataBind is done on page load, and the grid uses postbacks for sorting, but this is causing MBs of memory to leak on every bind, so I'm wondering what techniques and best practices I can use for managing the objects we have control over. I've implemented IDisposable on the data object, disposing of the LINQ context and setting any other objects to null, but it doesn't seem to make a difference. I seem to be creating an instance of the data object on every call, and even calling Dispose makes no difference.
Wow, lots of plumbing and moving parts in there.
Is it possible to narrow things down a bit? That is, can you strip stuff off the page and see how it performs?
Forgive this, but when you say 'leaking memory', what do you mean and how do you know? The GC is 'lazy' and won't do anything until there is pressure to do so. This is a good thing, but it also means memory may appear to accumulate until a collection is needed, and then you may find a lot of it gets freed. Memory-profiler graphs often look like a saw-tooth for this reason.
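A toy console sketch of that lazy behavior (forcing a collection like this is for demonstration only, never something to ship):

    using System;

    class GcDemo
    {
        static void Main()
        {
            var big = new byte[100 * 1024 * 1024];       // ~100 MB allocation
            Console.WriteLine(GC.GetTotalMemory(false)); // high

            big = null;                                  // array is unreachable now...
            Console.WriteLine(GC.GetTotalMemory(false)); // ...but likely still counted,
                                                         // since nothing has collected it

            GC.Collect();                                // demo only; don't do this
                                                         // in production code
            Console.WriteLine(GC.GetTotalMemory(true));  // now the number drops
        }
    }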
How are you storing the grid data to make the paging work? I've seen datasets persisted in viewstate, which means the data goes to the client along with the grid. If you're querying again on post-back page-load you're wasting a lot of space there.
Another common problem is event subscriptions keeping large objects alive longer than they should. I've actually seen code where a datagrid was placed in session state, which kept the page alive for as long as the session was. On each post-back this happened again and again until, poof. In this case the GC couldn't help us, because the objects were indeed still 'in use'.
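A stripped-down sketch of that failure mode, with a made-up static event standing in for whatever kept the page reachable:

    using System;

    public static class DataFeed
    {
        public static event EventHandler Updated;   // hypothetical app-wide event
    }

    public class ReportPage : IDisposable           // hypothetical page-like class
    {
        public ReportPage()
        {
            DataFeed.Updated += OnUpdated;          // DataFeed now references this page
        }

        private void OnUpdated(object sender, EventArgs e) { /* rebind the grid */ }

        public void Dispose()
        {
            DataFeed.Updated -= OnUpdated;          // without this, the page (and its
                                                    // data) stays "in use" forever and
                                                    // the GC can never reclaim it
        }
    }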
So try to simplify - turn off sorting, get rid of the 3rd party control, use a smaller data set, etc. Using a memory profiler and something that puts the server under pressure, measure this scenario. If you find no 'leaks' then start adding stuff back to see when it goes haywire.
You may be returning too much data to your IIS server each time. Remember that with a standard LINQ datasource and the DevExpress grid, each time you do a callback for sorting, paging, or anything else, the whole data set is loaded into memory and then sorted and paged.
This means that if you are loading a very large amount of data, you will easily waste server memory. Consider that many users may have the same page open, and the whole data set is loaded into memory for each of them; the GC may not have time enough to free all that stuff.
For this, DevExpress provides the LinqServerModeDataSource, which does all the paging and sorting on the database server.
If you cannot use that, try to retrieve a smaller set of data by filtering it.

'Caching' a large table in ASP.NET

I understand that each page refresh, especially in 'AjaxLand', causes my back-end/code-behind class to be created from scratch. This is a problem because my class (which is a member object of my System.Web.UI.Page) contains a lot of data that it sources from a database. So every page refresh in AjaxLand causes large backend DB calls, rather than just reusing a class object from memory. Any fix for this? Is this where session variables come into play? Are session variables the only option I have for retaining an object in memory that is tied to a single user and a single session instance?
You need ASP.NET Caching.
Specifically Data Caching.
If your data is user-specific, then Session would be the way to go. Be careful if you have a web farm or web garden, in which case you'll need a session state server or a database for your session.
If your data is application-level then Application Data Cache could be the way to go. Be careful if you have limited RAM and your data is huge. The cache can empty itself at an inopportune moment.
Either way, you'll need to test how your application performs with your changes. You may even find going back to the database to be the least bad option.
In addition, you could have a look at Lazy Loading some of the data, to make it less heavy.
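For reference, a minimal read-through sketch against the Data Cache mentioned above; the helper class, the key, and the ten-minute expiration are arbitrary choices:

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCache   // hypothetical helper
    {
        public static IList<Product> GetProducts()
        {
            var cached = HttpRuntime.Cache["products"] as IList<Product>;
            if (cached != null)
                return cached;                      // cache hit: no database call

            IList<Product> fresh = QueryDatabase(); // the expensive call
            HttpRuntime.Cache.Insert(
                "products", fresh,
                null,                               // no dependency
                DateTime.UtcNow.AddMinutes(10),     // absolute expiration
                Cache.NoSlidingExpiration);
            return fresh;
        }

        private static IList<Product> QueryDatabase()
        {
            return new List<Product>();             // stand-in for the real query
        }
    }

    public class Product { public int Id { get; set; } }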
Take a look at this MS article on various caching mechanisms for ASP.NET. There is a section named "Cache arbitrary objects in server memory" that may interest you.
Since you mention Ajax, I think you might want to consider the following points:
Assume this large data set is static and not transient. On the first Ajax call, your app queries the database, retrieves lots of data, and returns it to the client (i.e. the browser/JavaScript running in the browser, etc.); the client now has all of it in memory already. Subsequently, there's no need to go back to the server for the same data the client already has; you just use JavaScript to rebuild the DOM or whatever. Everything can be done on the client from this point on.
Now assume the data is not static but transient. Caching on the server by putting it in the session won't be the solution you want anyway: every time your client sends a request and the server just returns what's in the cache (session), the data is already stale, so it is no different from the data the client already has in memory.
The point is: if the data is static, save round trips to the server once you already have it in memory. If the data is transient, I'm afraid there's no cheap solution except re-querying or re-retrieving the data somehow and sending everything back to the client.

HttpRuntime.Cache[] vs Application[]

I know that most people recommend using HttpRuntime.Cache because it has more flexibility... etc. But what if you want the object to persist in the cache for the life of the application? Is there any big downside to using the Application[] object to cache things?
As long as you don't abuse the application state, then I don't see a problem in using it for items that you don't want to expire.
Alternatively, I would probably use a static variable near the code that uses it. That way you avoid going through HttpApplicationState and being forced to take a reference to System.Web just to access the data.
But be sure to think through how you use the object(s) you store in HttpApplicationState. If it's a DataSet that you keep adding to on each request, at some point you'll eat up too much memory on the web server. The same goes for continually adding new items to HttpApplicationState as you process requests: at some point you will force the application to restart.
That's probably the advantage of using the Cache in your situation: consuming larger amounts of memory isn't as fatal, because you allow ASP.NET to release the items in your cache when memory becomes scarce.
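A small sketch of the static-variable alternative mentioned earlier; the holder class and its contents are made up:

    using System.Collections.Generic;

    // Application-scoped state without HttpApplicationState: a static member
    // lives for the lifetime of the app domain, and the code that uses it
    // needs no reference to System.Web.
    public static class LookupData   // hypothetical holder
    {
        // Initialized once, on first access; an app-pool recycle resets it,
        // exactly as it would reset Application[].
        public static readonly IDictionary<string, string> CountryCodes =
            new Dictionary<string, string> { { "US", "United States" } };
    }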
Application is deprecated by Cache. If you need something with application scope, you should either create it as a static member of a class or use the Cache. If you want to go the Cache route but don't ever want it to expire, use the CacheItemPriority.NotRemovable option when you Insert the value into the cache. Note that it is possible to use this priority and still use cache dependencies, for instance if your data depends on something in the file system. All the CacheItemPriority does is prevent HttpRuntime.Cache from clearing the item when it feels memory pressure and uses its least-recently-used algorithm to purge items that aren't seeing much use.
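A minimal sketch of that Insert call; the wrapper class is hypothetical, but the overload and the NotRemovable priority are the ones described above:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class AppLifetimeCache   // hypothetical helper
    {
        public static void Store(string key, object value)
        {
            HttpRuntime.Cache.Insert(
                key, value,
                null,                           // or a CacheDependency, e.g. on a file
                Cache.NoAbsoluteExpiration,     // never expires by time
                Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable, // survives memory-pressure scavenging
                null);                          // no removal callback
        }
    }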
Use the Cache when you want items to expire automatically or be reclaimed when memory is scarce. Otherwise use static variables if you can, because they will yield better performance than digging through the ApplicationState collection. I'm not exactly sure when ApplicationState would be the right choice, but there are sure to be some cases.
