ASP.NET session management for a huge object instance - C#

What is the best way to approach ASP.NET session management when I have a huge object instance to keep in session, and the session should also be database managed, i.e. the session id is kept in the database?
The object instance that I keep in session also contains DataTable instances.
Thanks in advance for any suggestions or comments.

ASP.NET has a few session state providers; one is SqlSessionStateStore, which stores session state in SQL Server.
You can read more on MSDN.
Another thing is that storing huge objects in session (especially DataTables) seems like a bad idea. For one, you can hit memory issues, locks, and data going out of sync; for another, retrieving, serializing, and deserializing the data can be both time and resource consuming.
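If you do go the SQL Server route, the provider is switched on in web.config, and the session database itself is set up with the aspnet_regsql.exe tool (-ssadd). A minimal sketch, with a placeholder connection string:

<sessionState
    mode="SQLServer"
    sqlConnectionString="Data Source=YOUR_SQL_SERVER;Integrated Security=SSPI;"
    timeout="20" />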

The 'best way' would be removing huge objects from session storage altogether. A DataTable instance in session storage is the textbook 'never use session like this' example.
ASP.NET session is an in-process (InProc) collection by default. It keeps all the data you place in the 'Session' collection in the current process's memory, so if the app pool recycles, you lose the session.
Every out-of-process 'session provider' you might use instead requires serializing the data placed in session. This means HUGE problems for your DataTables.
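As an illustration, anything placed in session under an out-of-process provider has to be serializable; a small, purely illustrative class:

[Serializable]
public class ReportCriteria
{
    public int CustomerId { get; set; }
    public DateTime From { get; set; }
    public DateTime To { get; set; }
    // Keep session payloads small: store keys/criteria like this and
    // reload the heavy DataTables from the database (or a cache) on demand.
}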

Related

Can I use MemoryCache in place of Session?

I have an aspx page that is hitting the database over and over. Given that the data does not change often, I wanted to at least store the data in a session object. I just learned that session objects for our application are stored in the database.
After researching, I've found a class called MemoryCache, which caches data in memory. I'd like to know whether I can use that class to store data just while someone is working on that page, and then destroy it at the end of the operation.
Can someone explain whether I can use MemoryCache for that purpose?
Thanks for helping.
Yes, you can use it to cache data between page loads. The MemoryCache class is the replacement for the ASP.NET-only Cache object. Its values can be persisted for as long as the CLR process is alive, i.e. as long as the server doesn't go down.
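A minimal sketch of that pattern with System.Runtime.Caching; the cache key, the expiry, and the LoadReportFromDatabase call are illustrative assumptions, not part of the original question:

using System.Data;
using System.Runtime.Caching;

// Try the in-memory cache first; fall back to the database on a miss.
var cache = MemoryCache.Default;
var report = cache.Get("report-data") as DataTable;
if (report == null)
{
    report = LoadReportFromDatabase();   // hypothetical data-access call
    cache.Set("report-data", report, new CacheItemPolicy
    {
        // Evict after 10 minutes so long-lived entries don't pile up in memory.
        AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10)
    });
}
// When the user finishes the operation you can drop it explicitly:
// cache.Remove("report-data");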

Nhibernate; control over when Session Per Request is saved

I'm trying to develop a web forms application using NHibernate and the Session Per Request model. All the examples I've seen have an HttpModule that creates a session and a transaction at the beginning of each request and then commits the transaction and closes the session at the end of the request. I've got this working, but I have some concerns.
The main concern is that objects are automatically saved to the database when the web request finishes. I'm not particularly pleased with this and would much prefer some way to take a more active approach to deciding what is actually saved when the request is finished. Is this possible with the Session Per Request approach?
Ideally I'd like for the interaction with the database to go something like this:
Retrieve an object from the database or create a new one
Modify it in some way
Call a save method on the object which validates that it's indeed ready to be committed to the database
Object gets saved to the database
I'm able to accomplish this if I do not use the Session Per Request model and instead wrap the interactions in using-session / using-transaction blocks. The problem I ran into with that approach is that after the object is loaded from the database the session is closed, and I am not able to utilize lazy loading. Most of the time that's okay, but there are a few objects which have lists of other objects that then cannot be modified because, as stated, the session has been closed. I know I could eagerly load those objects, but they don't always get used, and I feel that in doing so I'm failing to utilize NHibernate properly.
Is there some way to use the Session Per Request (or any other model, it seems like that one is the most common) which will allow me to utilize lazy loading AND provide me with a way to manually decide when an object is saved back to the database? Any code, tutorials, or feedback is greatly appreciated.
Yes, this is possible and you should be able to find examples of it. This is how I do it:
Use session-per-request but do not start a transaction at the start of the request.
Set ISession.FlushMode to Commit.
Use individual transactions (occasionally multiple per session) as needed.
At the end of the session, throw an exception if there's an active uncommitted transaction. If the session is dirty, flush it and log a warning.
With this approach, the session is open during the request lifetime so lazy loading works, but the transaction scope is limited as you see fit. In my opinion, using a transaction-per-request is a bad practice. Transactions should be compact and surround the data access code.
Be aware that if you use database assigned identifiers (identity columns in SQL Server), NHibernate may perform inserts outside of your transaction boundaries. And lazy loads can of course occur outside of transactions (you should use transactions for reads also).
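A rough sketch of that arrangement as an IHttpModule; SessionFactory, the Items key, and the Order entity are placeholders, and error handling is trimmed:

public class NHibernateSessionModule : IHttpModule
{
    private const string SessionKey = "nh.session";

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (s, e) =>
        {
            // Open a session for the request, but do NOT start a transaction here.
            var session = SessionFactory.OpenSession();  // SessionFactory: your ISessionFactory singleton
            session.FlushMode = FlushMode.Commit;         // flush only when a transaction commits
            HttpContext.Current.Items[SessionKey] = session;
        };

        app.EndRequest += (s, e) =>
        {
            var session = (ISession)HttpContext.Current.Items[SessionKey];
            if (session == null) return;

            // A still-active transaction at the end of the request is a programming error.
            if (session.Transaction != null && session.Transaction.IsActive)
                throw new InvalidOperationException("Uncommitted transaction at end of request.");

            if (session.IsDirty())
            {
                // Safety net only; log a warning here in real code.
                session.Flush();
            }
            session.Dispose();
        };
    }

    public void Dispose() { }
}

// In page/business code, saves are then explicit and narrowly scoped:
// using (var tx = session.BeginTransaction())
// {
//     var order = session.Get<Order>(id);  // lazy loading still works: the session stays open
//     order.MarkShipped();
//     tx.Commit();                          // changes flush only here, because FlushMode is Commit
// }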

'Caching' a large table in ASP.NET

I understand that each page refresh, especially in 'AjaxLand', causes my back-end/code-behind class to be called from scratch... This is a problem because my class (which is a member object in System.Web.UI.Page) contains A LOT of data that it sources from a database. So now every page refresh in AjaxLand is causing me to make large back-end DB calls, rather than just reusing a class object from memory. Is there any fix for this? Is this where session variables come into play? Are session variables the only option I have for retaining an object in memory that is tied to a single user and a single session instance?
You need ASP.Net Caching.
Specifically Data Caching.
If your data is user-specific, then Session would be the way to go. Be careful if you have a web farm or web garden, in which case you'll need a session state server or database for your session.
If your data is application-level then Application Data Cache could be the way to go. Be careful if you have limited RAM and your data is huge. The cache can empty itself at an inopportune moment.
Either way, you'll need to test how your application performs with your changes. You may even find going back to the database to be the least bad option.
In addition, you could have a look at Lazy Loading some of the data, to make it less heavy.
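A small sketch of the application-level data cache option; the "products" key, the 15-minute expiry, and the LoadProductsFromDatabase call are assumptions for illustration:

using System;
using System.Data;
using System.Web;
using System.Web.Caching;

// Shared across all users; entries can be evicted under memory pressure, so always handle a miss.
DataTable products = (DataTable)HttpContext.Current.Cache["products"];
if (products == null)
{
    products = LoadProductsFromDatabase();    // hypothetical DB call
    HttpContext.Current.Cache.Insert(
        "products", products,
        null,                                 // no cache dependency
        DateTime.Now.AddMinutes(15),          // absolute expiration
        Cache.NoSlidingExpiration);
}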
Take a look at this MS article on various caching mechanisms for ASP.NET. There is a section named "Cache arbitrary objects in server memory" that may interest you.
Since you mention Ajax, I think you might want to consider the following points:
Assume this large data set is static and not transient. On the first Ajax call, your app queries the database, retrieves lots of data, and returns it to the client (i.e. the browser/JavaScript running on the browser, etc.); the client now has all of that in memory already. Subsequently, there's no need to go back to the server for the same data that your client already has in memory. What you need to do is use JavaScript to rebuild the DOM or whatever. Everything can be done on the client from this point on.
Now assume the data is not static but transient. Caching on the server by putting it in the session won't be the solution you want anyway. Every time your client sends a request to the server and the server just returns what's in the cache (session), the data is already stale, and there's no difference from the data that the client already has in memory.
The point is: if the data is static, save round trips to the server once you already have the data in memory. If the data is transient, I'm afraid there's no cheap solution except re-querying or re-retrieving the data somehow and sending everything back to the client.

DAL, Session, Cache architecture

I'm attempting to create Data Access Layer for my web application. Currently, all datatables are stored in the session. When I am finished the DAL will populate and return datatables. Is it a good idea to store the returned datatables in the session? A distributed/shared cache? Or just ping the database each time? Note: generally the number of rows in the datatable will be small < 2000.
Additional info:
Almost none of the data is shared. The parameters that are sent to the SQL queries are chosen by the user. The parameter values available to the user are based on who the user is. In most cases it is impossible for two users to run the same sql queries. However, the same user can run the same query more than once.
More info:
Number of concurrent users ~50,000
Important info:
In 99% of the cases no two users will have the same data/queries, however, the same user may run the same query/get the same data multiple times.
Thanks
Storing the data in session is not a good idea because:
Every user gets a separate copy of the same data - enormous waste of server memory.
IIS will recycle a session if you fill it with too much data.
I recommend storing the data tables in Cache, and also populating each table only when first requested rather than all at once. That way, if IIS starts reclaiming space in the cache, your code won't be affected.
Very simple example of fetching on demand:
// Returns the cached value for cacheKey, or fetches it via getDirect and caches it on a miss.
T GetCached<T>(string cacheKey, Func<T> getDirect)
{
    object value = HttpContext.Current.Cache[cacheKey];
    if (value == null)
    {
        value = getDirect();
        // No expiration or dependency specified here; add one if the data can go stale.
        HttpContext.Current.Cache.Insert(cacheKey, value);
    }
    return (T)value;
}
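For example, usage might look like this (the key format and the LoadOrdersForUser helper are hypothetical):

DataTable orders = GetCached("orders:" + userId,
                             () => LoadOrdersForUser(userId));  // hits the database only on a cache miss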
EDIT (in response to the question update):
Cache vs local Session - Local session state is all-or-nothing. If it gets too full, IIS will recycle everything in it. By contrast, cache items are dropped individually when memory gets too low, so it's much less of a problem.
Cache vs Session state server - I don't have any data to back this up, so please say so if I've got this wrong, but I would have thought that caching the data independently in memory in each physical server AppDomain would scale better than storing it in a shared session state service.
The first thing I would say is: caching is not mandatory everywhere. You should use it wisely, and especially on bottlenecks related to data access.
I don't think it's a good idea to store 1000 different datatables with 2000 records each anywhere. If queries are so dynamic that seeing the same query twice within a short period is the exception, then caching doesn't seem like a good option.
In relation to a distributed cache option, I suggest you check http://memcached.org, a distributed cache used by many big projects around the world.
I know Velocity is coming, but as far as I know it needs Windows Server 2008 and it's still very new. Microsoft products are usually only good from version 2.0 onwards :-)
Store lookups/dictionaries, and items that your app requires very frequently, in the Application or Cache object; query the database for data that depends on the user's role.
--EDIT--
This is in response to your comment.
Usually in any data-oriented system the queries revolve around the fact tables (or tables that are unavoidable to query); assuming you do have a set of such unavoidable tables, you can use Cache.Insert() as sketched after the list below:
Load the inevitable tables on app startup;
Load most queried tables in Cache upon table request-basis;
Query database for least queried tables.
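A sketch of the first two points; the table names, cache keys, and loader calls are placeholders:

// Global.asax.cs
protected void Application_Start(object sender, EventArgs e)
{
    // Load the small, always-needed lookup tables once at startup.
    HttpRuntime.Cache.Insert("lookup:countries", LoadCountries());   // hypothetical loaders
    HttpRuntime.Cache.Insert("lookup:statuses", LoadStatuses());
}

// Frequently queried tables can be loaded into Cache on first request instead
// (see the GetCached helper earlier on this page); rarely used tables just hit the database.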
If you do not have any performance issues then let SQL handle everything.
Storing that amount of data in the Session is a very bad idea. Each user will get their own version!
If this is shared data (same for all users), consider moving it to the Application object.

Does Session.Remove() clear memory in c#?

I am using a Session variable to pass a DataTable from one page to another. Sometimes the DataTable can contain well over 100,000 records. After running it a few times, I get thrown an Out of Memory exception, so I have a few questions:
Is Session the best way to handle this?
Does Session.Clear("session") release it from Memory? If not, does anything release the Session from memory?
If I store a datatable into a Session object and then I store another datatable into that same Session object, does it keep using up memory or does it write over the existing Session object?
I'll assume you're talking about In-Process session state.
You aren't actually storing the DataTable itself in session; you are storing a reference to a DataTable. Thus when you create a new DataTable and add that to session, you're simply overwriting the reference. You still have two DataTables somewhere in memory, until garbage collection cleans up any to which there are no live references.
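In other words (purely illustrative; LoadBigTable is a made-up helper):

DataTable first = LoadBigTable();
Session["Results"] = first;    // session now holds a reference to 'first'

DataTable second = LoadBigTable();
Session["Results"] = second;   // only the reference is overwritten; 'first' stays in memory
                               // until nothing references it and the GC collects it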
Remember that garbage collection in .NET is non-deterministic. That is to say, setting an object to null does not immediately release the memory; it simply leaves the object unreferenced, and at some point in the future the garbage collector may see the dead object and release the memory associated with it.
You probably want to rethink your design if you're running out of memory. It might be better to have the second page refetch the data again, perhaps from a caching layer (possibly implemented on an application server as suggested by another poster) or perhaps from the database itself.
Use an app-server layer to hold the data, and each page should take it from there...
My first question would be: why do you need to store the entire database in the Session or the Application? Here is a good article that goes over all of your options; it advises against storing large amounts of data in the Session or Application caches. What issue are you trying to resolve by doing this?
Edit:
Are you displaying all the data at once on that page, e.g. scrolling down through 10,000 records? If so, that doesn't sound very user friendly (assumption). Have you considered paging the data? You could have a page of 50 records and n number of pages. That would make the data call a lot faster, and then you could implement filters, sorting, etc.
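As a rough sketch of paging at the database level (OFFSET/FETCH requires SQL Server 2012 or later; the table, columns, and connection string are made up):

using System.Data;
using System.Data.SqlClient;

// Fetch a single page of rows instead of the whole result set.
DataTable GetPage(string connectionString, int pageIndex, int pageSize)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        @"SELECT Id, Name, CreatedOn
          FROM dbo.Records
          ORDER BY CreatedOn DESC
          OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY", conn))
    {
        cmd.Parameters.AddWithValue("@Offset", pageIndex * pageSize);
        cmd.Parameters.AddWithValue("@PageSize", pageSize);

        var table = new DataTable();
        new SqlDataAdapter(cmd).Fill(table);   // Fill opens and closes the connection itself
        return table;
    }
}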
