Entity Framework: Cache management? (C#)

I'm using Entity Framework 4.0 behind WCF services. My problem is that the memory used by the program keeps growing: it starts at about 200 MB, and I stopped it at around 1.1 GB.
How can I manage the cache? I have two data contexts, and one of them is never used to read data, so can I disable its cache?
And for the other, can I specify how much memory it can use? Is there a way to monitor these resources? Is there a way to use fewer resources?
Thank you!

First of all, you should not use shared contexts. Create a new context for each WCF request and dispose of it before you finish processing the operation! If you need data caching, do it outside of EF. EF itself is not meant to be used as a cache, and there is no way to control that behavior.
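A minimal sketch of that per-request pattern, assuming a hypothetical MyEntities context and Item entity (not your actual types):

// Inside the WCF service implementation: one context per operation.
public Item GetItem(int id)
{
    // The using block guarantees the context (and all its tracked
    // entities) is disposed when the operation completes, even on error.
    using (var context = new MyEntities())
    {
        return context.Items.SingleOrDefault(i => i.Id == id);
    }
}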
If you host your service in IIS, you can configure AppPool recycling by setting a Private Memory Limit in the advanced settings of the AppPool. But that will simply kill everything running in that AppPool.

What may be happening is that each call creates a new context, which remains in memory until the connection times out and the garbage collector removes it.
Are you not disposing of the data context each time you use it?
Are you closing your connections from the client?
Are you using per call session mode?

Related

IIS App Pool, memory management

I have a RESTful WCF service hosted on IIS 7.5. When a certain operation is called, it returns almost immediately but starts a complex task, dealing with combinatorics and opening big files in memory. After several requests, about 50% of memory is in use by the application pool, although the tasks have completed. When does the IIS pool reclaim memory? I tried calling GC.Collect(), but nothing happened. Is there any way to profile applications like this one? I tried several profilers, but they only show the .NET classes that IIS uses to process the request itself.
Long-running tasks don't typically suit web applications, as they time out or hang the responsiveness of the website/API.
Is it possible to configure a background task to run asynchronously of the IIS site? Then you could push these slow tasks into a queue and process them in the background.
I think the memory usage on the process is an issue but doesn't tell the whole story. What have you managed to profile so far? Do you have unclosed connections lingering? Are you creating instances of classes that are not being disposed of effectively? I would profile the call execution plan rather than the memory usage, as it is more likely to show where items are being left behind.
When you say 50% memory, how much are we actually talking about in MB? IIS can be a little greedy/lazy when it doesn't need to give up RAM.
The worker process itself will not release memory to the operating system on its own. You can set the process to recycle on a schedule; this restarts the process, releasing memory without interfering with running requests.
You probably should not do that, though: basically, .NET is holding on to the memory to avoid having to reallocate it for later requests. The memory is available for reuse within the WCF process, and if the memory is not used, the OS will page it out and allow it to be reused when other processes need it. See the answer to When is memory, allocated by .NET process, released back to Windows for more details.
I had a similar issue, and I solved it by using Castle.Windsor as an IoC container: I added the svc client class to the container with Transient scope, and finally I decorated the svc class with:
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
All other dependent bindings were also registered with the Transient lifestyle, making their lifetime depend on their instantiator. I'm not sure this will help in your situation, given that you work with large files, but if everything else fails, try implementing IDisposable on your most memory-hungry classes and check that Dispose is called when it should be.
Hope that helps!
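For reference, a minimal sketch of the registration described above (IMyService/MyService are illustrative names, not from the original post):

using Castle.MicroKernel.Registration;
using Castle.Windsor;

// Somewhere in application startup:
var container = new WindsorContainer();
container.Register(
    Component.For<IMyService>()
             .ImplementedBy<MyService>()
             .LifestyleTransient()); // new instance per resolution, matching PerCall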

Should ObjectContexts in Entity Framework 5 be singletons?

When using ObjectContext in EF 5, should it be a singleton, or is it better to create a new instance every time, as with SqlConnection? If so, why?
Create and dispose the context as soon as possible. Quote from the guidelines on MSDN:
Here are some general guidelines when deciding on the lifetime of the context:
When working with a long-running context, consider the following:
- As you load more objects and their references into memory, the memory consumption of the context may increase rapidly. This may cause performance issues.
- Remember to dispose of the context when it is no longer required.
- If an exception causes the context to be in an unrecoverable state, the whole application may terminate.
- The chances of running into concurrency-related issues increase as the gap between the time when the data is queried and updated grows.
When working with Web applications, use a context instance per request.
When working with Windows Presentation Foundation (WPF) or Windows Forms, use a context instance per form. This lets you use the change-tracking functionality that the context provides.
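In code, the "context per unit of work" guidance looks roughly like this (MyEntities, Items, and id are hypothetical stand-ins):

using (var context = new MyEntities())
{
    var item = context.Items.First(i => i.Id == id);
    item.Price += 1;
    context.SaveChanges();
} // disposed here: no tracked entities or connection survive the unit of work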

How to make my AppDomain live longer?

Here is the situation that we're in.
We are distributing our assemblies (purely DLLs) to our clients (we don't have control over their environment).
They call us with a list of item IDs, and we search through our huge database and return the items with the highest price. Since we have an SLA (30 milliseconds) to meet, we cache our items in a memory cache (using Microsoft's MemoryCache); we cache about a million items.
The problem is that the cache only lives for our client application's lifetime. When the process exits, so do all the cached items.
Is there a way I can make my MemoryCache live longer, so that a subsequent process can reuse the cached items?
I have considered having a Windows service and letting all these different processes communicate with it on the same box, but that's going to create a huge mess when it comes to deployment.
We are using AppFabric as our distributed cache, but the only way we can achieve our SLA is to use MemoryCache.
Any help would be greatly appreciated. Thank you.
I don't see a way to make sure that your AppDomain lives longer, since all the calling assembly has to do is unload the AppDomain...
One option could be (although messy too) to implement some sort of "persisting MemoryCache"... to achieve performance you could use a ConcurrentDictionary persisted in a MemoryMappedFile...
Another option would be to use a local database - it could even be SQLite - and implement the cache interface in-memory such that all writes/updates/deletes are "write-through" while reads are pure RAM access...
Another option could be to include an EXE (as an embedded resource, for example) and start it from inside the DLL if it is not already running... the EXE provides the MemoryCache, and communication could be via IPC (for example shared memory...). Since the EXE is a separate process, it would stay alive even after your AppDomain unloads... the problem with this is more whether the client likes it and/or permissions allow it...
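To illustrate the shared-memory idea: a named memory-mapped file stays alive as long as any process holds a handle to it, so a helper process can keep data around across AppDomain unloads. A minimal sketch follows; the map name, capacity, and payload format are all illustrative assumptions, and a real cache would need a proper layout and cross-process locking.

using System;
using System.IO.MemoryMappedFiles;
using System.Text;

class SharedCacheSketch
{
    const string MapName = "MyItemCache";      // hypothetical name
    const long Capacity = 1024 * 1024;         // 1 MB demo capacity

    static void Main()
    {
        // CreateOrOpen attaches to an existing map or creates a new one.
        using (var mmf = MemoryMappedFile.CreateOrOpen(MapName, Capacity))
        using (var accessor = mmf.CreateViewAccessor())
        {
            byte[] payload = Encoding.UTF8.GetBytes("item:42=199.99");
            accessor.Write(0, payload.Length);                  // length prefix
            accessor.WriteArray(4, payload, 0, payload.Length); // data bytes
        }
    }
}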
I really like the Windows Service approach, although I agree it could be a deployment mess...
The basic issue seems to be that you don't have control of the run-time Host - which is what controls the lifespan (and hence the cache).
I'd investigate creating some sort of (light-weight ?) host - maybe a .exe or a service.
The bulk of your DLLs would hang off the new host, but you could still deploy a "facade" DLL which in turn calls your main solution (tied to your host). Yes, you could have the external clients call your new host directly, but that would mean changing/re-configuring those external callers, whereas leaving your original DLL/API in place would isolate them from your internal changes.
This would (I assume) mean completely gutting and re-structuring your solution, particularly whatever DLLs the external callers currently hit, because instead of processing the requests itself it would just pass them off to your new host.
Performance
Inter-process communication is more expensive than keeping everything within one process - I'm not sure how the change in approach would affect your performance and ability to hit the SLA.
In particular, spinning up a new instance of the host will incur a performance hit.

ASP.NET + thread-aware unmanaged API

I'm thinking over an ASP.NET application that uses ESENT for persistence.
At this point this is just my hobby project, so the requirements are very flexible. However I'd like it to work on Windows 7, Windows 2008, and 2008 R2, with .NET 3.5 and higher, and default IIS settings.
In ESENT, most operations require you to open a session object. The documentation says: "A session tracks which thread it was used on, and it will throw an error if used on multiple threads with an open transaction." The API documentation mentions the native threads, not managed threads.
I assume the open session operation is relatively expensive, that's why I don't want to open/close session for every HTTP request.
Finally, here are my questions.
How, in ASP.NET, do I initialize/deinitialize something exactly once on every native thread that executes my C# code?
Will code like that posted below work for me?
Is there something bad I don't know about in keeping the ASP.NET managed thread permanently pinned to the native thread with the BeginThreadAffinity method? Won't my sessions leak once IIS has been under load for a month without a single reboot?
Thanks in advance!
using System;
using System.Threading;

class MySession : IDisposable
{
    // One session per native thread, created lazily on first access.
    [ThreadStatic]
    private static MySession s_session;

    public static MySession Instance
    {
        get { return s_session ?? (s_session = new MySession()); }
    }

    private MySession()
    {
        // Pin the managed thread to its native thread for the session's lifetime.
        Thread.BeginThreadAffinity();
        // Open a new session, store the handle in a non-static data member.
    }

    void IDisposable.Dispose()
    {
        // Close the session.
        Thread.EndThreadAffinity();
    }
}
One good approach is to create a pool of sessions and have threads grab a session from the pool and then return the session when done. A session can be used by different threads, but ESENT will complain if you migrate a session between threads while a transaction is active (it is possible to disable that behaviour though).
Several large server apps that use ESENT have taken the session pool approach and it works well for them.
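A minimal sketch of such a pool; Session here is a hypothetical wrapper around an ESENT session handle, not the real ManagedEsent API:

using System.Collections.Concurrent;

class Session { /* wraps a session handle; hypothetical */ }

class SessionPool
{
    private readonly ConcurrentBag<Session> _pool = new ConcurrentBag<Session>();

    public Session Acquire()
    {
        Session session;
        // Reuse an idle session if one is available, otherwise open a new one.
        return _pool.TryTake(out session) ? session : new Session();
    }

    public void Release(Session session)
    {
        // The caller must have ended any open transaction before returning it,
        // since ESENT objects to sessions migrating threads mid-transaction.
        _pool.Add(session);
    }
}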
Our current research shows that instantiating a new session in Page_Load and disposing of it in Page_Unload easily yields 600 requests/sec with WCAT, for a simple script that seeks on an index and does two more seeks for each returned row.
In other words, with proper tuning of ESENT parameters, a session pool might not be needed.
The example above is with maxsessions set to 256; adjusting the minimum cache size also helps performance. This was on a quad-core test server with 8 GB RAM.
This will probably not work in this form if you really intend to leave the session open across requests.
The finalizer will run on a separate thread, and closing the session will throw an error - most probably
JET_errSessionInUse:
the session was in use on another thread, or the session was not set or reset properly in JetEndSession() during Dispose().
If you really must use ESENT, maybe you can fire up and manage a dedicated pool of threads by hand and marshal calls to/from them.
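A minimal sketch of that hand-rolled approach: a dedicated thread drains a work queue, so anything it owns (such as an ESENT session) never migrates between native threads. All names are illustrative, and Task/TaskCompletionSource assume .NET 4.

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class DedicatedWorker : IDisposable
{
    private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();
    private readonly Thread _thread;

    public DedicatedWorker()
    {
        _thread = new Thread(() =>
        {
            // Open the ESENT session here; it never leaves this thread.
            foreach (var work in _queue.GetConsumingEnumerable())
                work();
            // Close the session here once the queue is drained.
        }) { IsBackground = true };
        _thread.Start();
    }

    public Task<T> Run<T>(Func<T> func)
    {
        var tcs = new TaskCompletionSource<T>();
        _queue.Add(() =>
        {
            try { tcs.SetResult(func()); }
            catch (Exception ex) { tcs.SetException(ex); }
        });
        return tcs.Task;
    }

    public void Dispose()
    {
        _queue.CompleteAdding(); // lets the worker loop finish
        _thread.Join();
    }
}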

Correct use of NHibernate session

I have a client-server application; the server uses NHibernate.
I want to know how I should use the session:
per call?
per client?
a single session?
some other way?
Also, how can I keep the session cache on the server?
And is the session thread-safe?
You should use one session per unit of work. If that includes multiple operations, so be it.
Use the session.BeginTransaction() to wrap the unit of work and commit once all the items are done.
Sessions are NOT thread safe, but the session factory is (which you definitely want to keep around).
NHibernate has various cache options for data, but sessions are meant to be used and disposed of.
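A minimal sketch of session-per-unit-of-work; sessionFactory is assumed to be an application-wide singleton ISessionFactory, and Customer/customerId are hypothetical:

using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var customer = session.Get<Customer>(customerId);
    customer.Name = "Updated";
    tx.Commit(); // flushes pending changes and commits atomically
}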
Normally it's done once per request. You can create an HttpApplication which opens the session at the beginning of the request and closes it at the end of the request (example).
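In Global.asax that per-request pattern could look roughly like this (SessionFactory is an assumed singleton, and NHibernate must be configured with a web session context):

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Open a session for this request and bind it to the current context.
    NHibernate.Context.CurrentSessionContext.Bind(SessionFactory.OpenSession());
}

protected void Application_EndRequest(object sender, EventArgs e)
{
    // Unbind and dispose of the request's session.
    var session = NHibernate.Context.CurrentSessionContext.Unbind(SessionFactory);
    if (session != null) session.Dispose();
}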
Per call should be the usual solution.
There really is no one right answer to the question of session lifetime. You can make any session lifetime work, it depends on your requirements. Sessions are not thread safe, but session factories are.
To keep the cache around, you need to keep the session around. It is likely to be fairly challenging to keep the cache around and keep the cache correct in anything but simple single user, single process applications.
There's a great example I've used from NHibernate Best Practices.
The code example uses a session per ASP.NET request.
