I have an .aspx page that is hitting the database over and over. Given that the data does not change often, I wanted to at least store the data in a session object, but I just learned that session objects for our application are stored in the database.
After researching, I've found a class called MemoryCache, which caches data in memory. I'd like to know whether I can use that class to store data just for the time someone is working on that page, and then destroy it at the end of the operation.
Can someone explain whether I can use MemoryCache for that purpose?
Thanks for helping.
Yes, you can use it to cache data between page loads. The MemoryCache class is the replacement for the ASP.NET-only Cache object. Its values can persist for as long as the CLR process is alive, i.e. as long as the server doesn't go down.
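A minimal sketch of the cache-aside pattern this enables, assuming a hypothetical cache key, a ten-minute expiration, and a LoadFromDatabase placeholder for your existing query:

```csharp
using System;
using System.Data;
using System.Runtime.Caching;

public static class PageDataCache
{
    // Get the data from the in-memory cache, loading it from the
    // database only when it is missing or has expired.
    public static DataTable GetPageData()
    {
        ObjectCache cache = MemoryCache.Default;
        var data = cache.Get("PageData") as DataTable;
        if (data == null)
        {
            data = LoadFromDatabase(); // hypothetical DB call
            cache.Set("PageData", data,
                      DateTimeOffset.Now.AddMinutes(10)); // absolute expiry
        }
        return data;
    }

    // Call this at the end of the operation to drop the cached copy.
    public static void Invalidate()
    {
        MemoryCache.Default.Remove("PageData");
    }

    private static DataTable LoadFromDatabase()
    {
        // ... your existing query code goes here ...
        return new DataTable();
    }
}
```

Note that, unlike Session, MemoryCache.Default is shared by every user of the app pool, so include a user or session identifier in the key if the data is per-user.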
Related
I'm writing an ASP.NET MVC 5 application. I know that actions where session["foo"] = bar is used run sequentially; to avoid this, I want to store some information in a MemoryCache object rather than in session. My doubt is: is the cache managed like the session? Do the actions where I put ObjectCache.Set("foo", bar, null) run sequentially like they do with session?
I know the scope difference between cache and session, but in this case it's not important to me.
Thanks to everyone.
I understand that you are trying to avoid the session lock on the page.
The cache does not lock full page access, so the answer is that cache access does not run sequentially.
There are two kinds of cache: one in memory, which uses a static dictionary to keep the data, and one persisted outside the process, which uses files or a database to save the data. Both of them lock the data only for the duration of the read/write, while the session locks the full access to the page from the start of the request to the end of it.
So use the cache, but disable the session on the pages where you have this issue. Also keep in mind that if you use a web garden, the in-memory cache can hold different data in each worker process, because the memory cache has its own static space in each process.
Also note that the session is different for each user, while the cache is shared by all users.
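In MVC, a minimal sketch of this advice might look like the following; the controller name, cache key, and expiration are assumptions:

```csharp
using System;
using System.Runtime.Caching;
using System.Web.Mvc;
using System.Web.SessionState;

// Opting the controller out of the session lock: requests to these
// actions are no longer serialized per user.
[SessionState(SessionStateBehavior.Disabled)]
public class ReportsController : Controller
{
    public ActionResult Index()
    {
        // MemoryCache locks only for the duration of the read/write,
        // so concurrent requests are not forced to run one at a time.
        ObjectCache cache = MemoryCache.Default;
        cache.Set("foo", "bar", DateTimeOffset.Now.AddMinutes(20));
        var value = cache.Get("foo");
        return View(value);
    }
}
```

SessionStateBehavior.ReadOnly is the milder option if the actions still need to read session values.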
Some more to read: ASP.NET Server does not process pages asynchronously
I think the term you are looking for is thread safety - especially around concurrent access, typically writing.
It seems that according to MSDN, System.Runtime.Caching.MemoryCache is indeed thread-safe. See also: Is MemoryCache.Set() thread-safe?
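A quick sketch illustrating that thread safety, not taken from the linked answer; concurrent writers need no external locking, though the last write wins for a given key:

```csharp
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

class ThreadSafetyDemo
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;

        // Many threads writing the same key concurrently: safe, no
        // corruption, but the final value is whichever write landed last.
        Parallel.For(0, 1000, i =>
            cache.Set("foo", i, DateTimeOffset.Now.AddMinutes(5)));

        Console.WriteLine(cache.Get("foo")); // some value in 0..999
    }
}
```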
What is the best way to approach ASP.NET session management, keeping in mind that I have a huge object instance to keep in session, and that it should be database-managed, i.e. the session id is kept in the database?
Also, the object instance that I keep in session has DataTable instances within it.
Thanks in advance for any suggestions/comments.
ASP.NET has a few session state providers; one is SqlSessionStateStore, which stores session state in SQL Server.
You can read more on MSDN.
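Switching to it is a web.config change; a minimal sketch, with a placeholder connection string:

```xml
<system.web>
  <!-- Store session state in SQL Server instead of in-process memory.
       Create the session tables beforehand with aspnet_regsql.exe -ssadd. -->
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=.;Integrated Security=True"
                timeout="20" />
</system.web>
```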
Another thing: storing huge objects in session (especially DataTables) seems like a bad idea. You can hit memory issues, locks, and data going out of sync, and retrieving, serializing, and deserializing the object can be both time- and resource-consuming.
The 'best way' would be removing huge objects from session storage. A DataTable instance in session storage is a textbook 'never use session like this' example.
ASP.NET session is an InProc collection by default. It keeps all the data you place in the Session collection in the current process's memory, so if the app pool recycles, you lose the session.
Every type of session provider you might use instead requires serializing the data placed in session. This means HUGE problems for your DataTables.
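For illustration only (this class is hypothetical): with an out-of-process provider, everything placed in session must be serializable, and it pays to store small keys and filters rather than the tables themselves:

```csharp
using System;

// With an out-of-process session provider (StateServer/SQLServer),
// everything stored in session is serialized on every request, so
// store small keys and filters rather than whole DataTables.
[Serializable]
public class SearchState
{
    public string Filter { get; set; }
    public int[] SelectedIds { get; set; }
}
```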
I'm trying to develop a Web Forms application using NHibernate and the session-per-request model. All the examples I've seen have an HttpModule that creates a session and transaction at the beginning of each request and then commits the transaction and closes the session at the end of the request. I've got this working, but I have some concerns.
The main concern is that objects are automatically saved to the database when the web request is finished. I'm not particularly pleased with this and would much prefer some way to take a more active approach to deciding what is actually saved when the request is finished. Is this possible with the Session Per Request approach?
Ideally I'd like for the interaction with the database to go something like this:
Retrieve an object from the database or create a new one
Modify it in some way
Call a save method on the object which validates that it's indeed ready to be committed to the database
Object gets saved to the database
I'm able to accomplish this if I do not use the session-per-request model and instead wrap the interactions in using-session / using-transaction blocks. The problem I ran into with that approach is that after the object is loaded from the database the session is closed, and I am not able to utilize lazy loading. Most of the time that's okay, but there are a few objects which have lists of other objects that then cannot be modified because, as stated, the session has been closed. I know I could eagerly load those objects, but they don't always get used, and I feel that in doing so I'm failing to utilize NHibernate properly.
Is there some way to use the Session Per Request (or any other model, it seems like that one is the most common) which will allow me to utilize lazy loading AND provide me with a way to manually decide when an object is saved back to the database? Any code, tutorials, or feedback is greatly appreciated.
Yes, this is possible and you should be able to find examples of it. This is how I do it:
Use session-per-request but do not start a transaction at the start of the request.
Set ISession.FlushMode to Commit.
Use individual transactions (occasionally multiple per session) as needed.
At the end of the session, throw an exception if there's an active uncommitted transaction. If the session is dirty, flush it and log a warning.
With this approach, the session is open during the request lifetime so lazy loading works, but the transaction scope is limited as you see fit. In my opinion, using a transaction-per-request is a bad practice. Transactions should be compact and surround the data access code.
Be aware that if you use database assigned identifiers (identity columns in SQL Server), NHibernate may perform inserts outside of your transaction boundaries. And lazy loads can of course occur outside of transactions (you should use transactions for reads also).
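A minimal sketch of the module side of this approach, assuming a hypothetical SessionFactoryHolder that exposes your ISessionFactory:

```csharp
using System;
using System.Web;
using NHibernate;

public class NHibernateSessionModule : IHttpModule
{
    private const string SessionKey = "nh.session";

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            // Open the session but do NOT start a transaction here.
            ISession session = SessionFactoryHolder.Factory.OpenSession();
            session.FlushMode = FlushMode.Commit; // flush only on commit
            HttpContext.Current.Items[SessionKey] = session;
        };

        app.EndRequest += (sender, e) =>
        {
            var session = HttpContext.Current.Items[SessionKey] as ISession;
            if (session == null) return;
            try
            {
                // Enforce the rule from the list above: no transaction
                // should still be open at the end of the request.
                if (session.Transaction != null && session.Transaction.IsActive)
                    throw new InvalidOperationException(
                        "Uncommitted transaction at end of request.");

                // A dirty session here means entities were changed without
                // a committed transaction: flush and log a warning.
                if (session.IsDirty())
                {
                    session.Flush(); // and log a warning in real code
                }
            }
            finally
            {
                session.Dispose();
            }
        };
    }

    public void Dispose() { }
}
```

Page code then wraps each unit of work in its own session.BeginTransaction() / tx.Commit() pair, so nothing reaches the database until you explicitly commit, while lazy loading keeps working because the session stays open for the whole request.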
I understand that each page refresh, especially in 'AjaxLand', causes my back-end/code-behind class to be called from scratch... This is a problem because my class (which is a member object of System.Web.UI.Page) contains A LOT of data that it sources from a database. So now every page refresh in AjaxLand causes large back-end DB calls, rather than just reusing a class object from memory. Any fix for this? Is this where session variables come into play? Are session variables the only option I have for retaining an object in memory that is linked to a single user and a single session instance?
You need ASP.Net Caching.
Specifically Data Caching.
If your data is user-specific, then Session would be the way to go. Be careful if you have a web farm or web garden; in that case you'll need a session state server or a database for your session.
If your data is application-level, then the Application Data Cache could be the way to go. Be careful if you have limited RAM and your data is huge: the cache can empty itself at an inopportune moment.
Either way, you'll need to test how your application performs with your changes. You may even find going back to the database to be the least bad option.
In addition, you could have a look at Lazy Loading some of the data, to make it less heavy.
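A minimal sketch of the application-level data cache, where the key name and the LoadProducts helper are assumptions:

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static DataTable GetProducts()
    {
        var cache = HttpRuntime.Cache;
        var products = cache["Products"] as DataTable;
        if (products == null)
        {
            products = LoadProducts(); // hypothetical DB call
            // Cache application-wide for 10 minutes; ASP.NET may still
            // evict the entry earlier under memory pressure.
            cache.Insert("Products", products, null,
                         DateTime.UtcNow.AddMinutes(10),
                         Cache.NoSlidingExpiration);
        }
        return products;
    }

    private static DataTable LoadProducts()
    {
        // ... existing database query ...
        return new DataTable();
    }
}
```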
Take a look at this MS article on various caching mechanisms for ASP.NET. There is a section named "Cache arbitrary objects in server memory" that may interest you.
Since you mention Ajax, I think you might want to consider the following points:
Assume this large data set is static and not transient. On the first Ajax call, your app queries the database, retrieves lots of data, and returns it to the client (i.e. the browser / the JavaScript running on the browser), so the client now has all of it in memory already. Subsequently, there's no need to go back to the server for the same data the client already has; just use JavaScript to rebuild the DOM or whatever. Everything can be done on the client from this point on.
Now assume the data is not static but transient. Caching on the server by putting it in the session won't be the solution you want anyway: every time your client sends a request and the server just returns what's in the cache (session), the data is already stale, and there's no difference from the data the client already has in memory.
The point is: if the data is static, save round trips to the server once you already have the data in memory. If the data is transient, I'm afraid there's no cheap solution except re-querying or re-retrieving the data somehow and sending everything back to the client.
I am using a Session variable to pass a DataTable from one page to another. Sometimes the DataTable can contain well over 100,000 records. After running it a few times, I get an Out of Memory exception, so I guess I have a few questions:
Is Session the best way to handle this?
Does Session.Clear("session") release it from memory? If not, does anything release the Session from memory?
If I store a datatable into a Session object and then I store another datatable into that same Session object, does it keep using up memory or does it write over the existing Session object?
I'll assume you're talking about In-Process session state.
You aren't actually storing the DataTable itself in session; you are storing a reference to a DataTable. Thus when you create a new DataTable and add that to session, you're simply overwriting the reference. You still have two DataTables somewhere in memory until garbage collection cleans up any to which there are no live references.
Remember that garbage collection in .NET is non-deterministic. That is to say, setting an object to null does not immediately release the memory; it simply marks the object, and at some point in the future the garbage collector may see the dead object and release the memory associated with it.
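A short illustration of those reference semantics; BuildBigTable is a hypothetical stand-in for your data load:

```csharp
using System.Data;
using System.Web.UI;

public partial class ResultsPage : Page
{
    protected void Page_Load(object sender, System.EventArgs e)
    {
        // Overwriting the session slot only replaces the *reference*; the
        // old DataTable stays in memory until the GC reclaims it.
        Session["results"] = BuildBigTable(); // table A, referenced
        Session["results"] = BuildBigTable(); // table B; A unreferenced
        Session.Remove("results");            // B unreferenced too
        // Neither table is freed immediately; the garbage collector
        // reclaims them at some later, non-deterministic point.
    }

    // Hypothetical stand-in for the real data load.
    private DataTable BuildBigTable()
    {
        return new DataTable();
    }
}
```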
You probably want to rethink your design if you're running out of memory. It might be better to have the second page refetch the data again, perhaps from a caching layer (possibly implemented on an application server as suggested by another poster) or perhaps from the database itself.
Use an app-server layer to hold the data, and have each page take it from there...
My first question would be: why do you need to store the entire database in the Session or the Application? Here is a good article that goes over all of your options; it advises against storing large amounts of data in the Session or Application caches. What issue are you trying to resolve by doing this?
Edit:
Are you displaying all the data at once on that page, e.g. scrolling down through 10,000 records? If so, that doesn't sound very user-friendly (assumption). Have you considered paging the data? You could have a page of 50 records and n number of pages. That would make the data call a lot faster, and then you could implement filters, sorting, etc.
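A minimal sketch of server-side paging along those lines; the Orders table, its columns, and the connection string are assumptions, and OFFSET/FETCH needs SQL Server 2012 or later:

```csharp
using System.Data;
using System.Data.SqlClient;

public class OrdersRepository
{
    // Placeholder connection string: an assumption, not from the question.
    private readonly string connectionString =
        "Data Source=.;Initial Catalog=Shop;Integrated Security=True";

    // Fetch a single page of rows instead of the whole table.
    public DataTable GetPage(int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT *
            FROM Orders
            ORDER BY OrderId
            OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        using (var adapter = new SqlDataAdapter(cmd))
        {
            cmd.Parameters.AddWithValue("@Offset", pageIndex * pageSize);
            cmd.Parameters.AddWithValue("@PageSize", pageSize);
            var page = new DataTable();
            adapter.Fill(page); // Fill opens and closes the connection
            return page;
        }
    }
}
```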