Windows Phone 8 localdb thread safety - c#

I have a WP8 app that has multiple (at times, up to 40) threads that have to get the data from a webservice and then commit to a localdb.
I have implemented an AutoResetEvent-based pattern where each Repository method looks somewhat like this:
public class MySuperAppRepository
{
    public static AutoResetEvent DataAccess = new AutoResetEvent(true);

    public MyFancyObject CreateMyFancyObject(string path, int something)
    {
        DataAccess.WaitOne();
        try
        {
            using (var dbContext = new MySuperAppDataContext(MySuperAppDataContext.DbConnectionString))
            {
                var mfo = new MyFancyObject();
                dbContext.MyFancyObjects.InsertOnSubmit(mfo);
                mfo.Path = path;
                mfo.Something = something;
                dbContext.SubmitChanges();
                return mfo;
            }
        }
        finally
        {
            DataAccess.Set();
        }
    }
}
This is all nice and clean, but as soon as I get multiple threads (as mentioned above), the performance is PATHETIC. Lots of requests come down, and then they all sit waiting for the database to be free.
Is there a better alternative? Would using lock(object) improve the performance?

Can you try not creating a new DataContext on every data operation?
Also try out some of the best practices mentioned here, in particular:
Enabling fast updates with a version column
One of the easiest ways to optimize the performance of an update operation on a table is to add a version column. This optimization is specific to LINQ to SQL for Windows Phone. For example, in an entity, add the following code.
[Column(IsVersion=true)]
private Binary _version;
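As a sketch, an entity carrying such a version column might look like the following (the class and the other columns are borrowed from the question above purely for illustration; requires System.Data.Linq and System.Data.Linq.Mapping):

```csharp
[Table]
public class MyFancyObject
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int Id { get; set; }

    [Column]
    public string Path { get; set; }

    [Column]
    public int Something { get; set; }

    // The version column: with this present, LINQ to SQL on Windows Phone
    // can update rows without re-querying them first, which is the
    // optimization described above.
    [Column(IsVersion = true)]
    private Binary _version;
}
```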

Related

Async methods to load data from a database

I'm building a Visual Studio .NET application in C# with two databases and Entity Framework.
At first I thought it would be better to load the most-used data into static lists when the app is launched, but that took a bit too long, so now I'm trying to use Tasks. I have one method per table I want to load, and I call them from a more general method:
public static void fillConstantes()
{
    new Task(FillFiches).Start();
    new Task(FillCustomers).Start();
    new Task(FillEmployees).Start();
    new Task(FillContact1).Start();
    new Task(FillServiceItems).Start();
    new Task(FillServiceLine).Start();
}
The problem is, if I try to access the data in another class before the task has finished, it crashes, obviously.
So I was wondering whether there's a good way to check if the tasks are done, or do I just have to check whether the data is there whenever I try to retrieve it?
Thanks for your help!
Task t = new Task(() => { });
t.Start();
You can use t.IsCompleted for your purpose.
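If the rest of the app needs to block until loading is finished, one option (a sketch, reusing the Fill* methods from the question) is to keep the task references around and wait on them instead of polling IsCompleted:

```csharp
private static Task[] _loadTasks;

public static void fillConstantes()
{
    _loadTasks = new[]
    {
        Task.Factory.StartNew(FillFiches),
        Task.Factory.StartNew(FillCustomers),
        Task.Factory.StartNew(FillEmployees),
        Task.Factory.StartNew(FillContact1),
        Task.Factory.StartNew(FillServiceItems),
        Task.Factory.StartNew(FillServiceLine),
    };
}

// Call this before reading any of the cached lists; it blocks
// until every loading task has completed.
public static void EnsureLoaded()
{
    Task.WaitAll(_loadTasks);
}
```

Task.WaitAll also rethrows any exception a loader threw, which is preferable to silently reading half-loaded lists.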

Nhibernate sessionPerThread

I am creating entities with multiple threads at the same time.
When I do this sequentially (with one thread) everything is fine, but when I introduce concurrency there is almost always a new exception.
I call this method asynchronously:
public void SaveNewData()
{
    // ...do some hard work...
    var data = new Data
    {
        LastKnownName = workResult.LastKnownName,
        MappedProperty = new MappedProperty
        {
            PropertyName = "SomePropertyName"
        }
    };
    m_repository.Save(data);
}
I already got this exception:
a different object with the same identifier value was already
associated with the session: 3, of
entity:TestConcurrency.MappedProperty
and also this one:
Flushing during cascade is dangerous
and of course my favourite one:
Session is closed! Object name: 'ISession'.
What I think is going on: every thread gets the same NHibernate session, and then it all goes wrong because they all try to send queries through the same session at once.
For NHibernate configuration I use NHibernateIntegration with Castle Windsor.
m_repository.Save(data) looks like:
public virtual void Save(object instance)
{
    using (ISession session = m_sessionManager.OpenSession())
    {
        Save(instance, session);
    }
}
where m_sessionManager is an ISessionManager injected into the constructor by Castle. Is there any way to force this ISessionManager to give me a session per thread, or any other concurrency-safe session handling?
So I researched, and it seems that the NHibernateIntegration facility doesn't support this session management out of the box.
I solved it by switching to the new Castle.NHibernate.Facility, which supersedes Castle.NHibernateIntegration; please note that this is currently only a beta version.
Castle.NHibernate.Facility supports session-per-transaction management, so it solved my problem completely.

Multithreading in opennetcf.orm (how to use SqlCeDataStore)

I just started using the OpenNETCF.ORM framework, and I ran into a problem. What is the correct way to use SqlCeDataStore in a multithreaded application?
In a single-threaded application I would simply use a static field:
public class MyApp
{
    private static SqlCeDataStore _store;

    public static SqlCeDataStore Store
    {
        get
        {
            if (_store == null)
            {
                _store = new SqlCeDataStore("database.sdf");
                // other initialization stuff, DiscoverTypes() etc...
            }
            return _store;
        }
    }
}
And then I would use it like so:
var customers = MyApp.Store.Select<Customer>().ToArray();
After some research on SQL Server Compact, I found out that connections aren't thread-safe, so each thread should have its own connection. OpenNETCF.ORM does have an option to use a new connection each time you connect to the database. Should I just use that?
Another option would be to create a new SqlCeDataStore for each thread. Is that better?
What is the correct way?
We use SQL Compact in a variety of heavily multithreaded applications using the OpenNETCF ORM without any problems. We run these on full Windows and Windows CE.
We use the "Maintain Maintenance Connection" connection behavior, where a new connection is created for every CRUD call but a single background one is kept for doing maintenance work (creating tables, etc.). This gives good performance and a reasonable amount of thread safety.
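As a sketch of how that behavior is selected (the ConnectionBehavior property and the HoldMaintenance value are recalled from the library's API, so verify them against your OpenNETCF.ORM version):

```csharp
var store = new SqlCeDataStore("database.sdf");

// "Maintain Maintenance Connection": one long-lived connection for
// maintenance work (creating tables, etc.), and a fresh connection
// for each CRUD call.
store.ConnectionBehavior = ConnectionBehavior.HoldMaintenance;

// ...other initialization stuff, DiscoverTypes() etc., as in the question
```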

Entity Framework - Effect of MultipleActiveResultSets on Caching

So I have a class that looks something like the following. There is a thread that does some work using an Entity Framework Code First DbContext.
The problem I'm having is that the m_DB context seems to be caching data even though it should be disposed and recreated on every processing loop.
What I've seen is that some data in a relationship isn't present in the loaded models. If I kill and restart the process, the data is suddenly found just as it should be.
The only thing I can think of is that this app uses MultipleActiveResultSets=true in the database connection string, but I can't find anything clearly stating that this would cause the behavior I'm seeing.
Any insight would be appreciated.
public class ProcessingService
{
    private MyContext m_DB = null;
    private bool m_Run = true;

    private void ThreadLoop()
    {
        while (m_Run)
        {
            try
            {
                if (m_DB == null)
                    m_DB = new MyContext();

                ProcessingStepOne();
                ProcessingStepTwo();
            }
            catch (Exception ex)
            {
                // Log error
            }
            finally
            {
                if (m_DB != null)
                {
                    m_DB.Dispose();
                    m_DB = null;
                }
            }
        }
    }

    private void ProcessingStepOne()
    {
        // Do some work with m_DB
    }

    private void ProcessingStepTwo()
    {
        // Do some work with m_DB
    }
}
Multiple Active Result Sets, or MARS, is a feature of SQL Server 2005/2008 and ADO.NET that lets one connection be used by multiple active result sets, just as the name implies. Try switching it off in the connection string and observe the behaviour of the app; I am guessing this could be the likely cause of your problem. Read the following MSDN link for more on MARS:
MSDN - Multiple Active Result Sets
Edit:
Try:
var results = from s in context.SomeEntity.AsNoTracking()
              where s.Something == that
              select s;
AsNoTracking() switches off internal change tracking of entities, and it should also force Entity Framework to reload entities every time.
Whatever is said and done, you will require some amount of refactoring, since there is obviously a design flaw in your code.
I hate answering my own question, especially when I don't have a good explanation of why it fixes the problem.
I ended up removing MARS, and it did resolve my issue. The best explanation I have is this:
Always read to the end of results for procedural requests regardless of whether they return results or not, and for batches that return multiple results. (http://technet.microsoft.com/en-us/library/ms131686.aspx)
My application doesn't always read through all the results returned, so my theory is that this somehow caused data to be cached and reused by the new DbContext.

Is locking single session in repository thread safe? (NHibernate)

I have read many posts saying that multithreaded applications must use a separate session per thread. Perhaps I don't understand how the locking works, but if I put a lock on the session in all repository methods, would that not make a single static session thread-safe?
like:
public void SaveOrUpdate(T instance)
{
    if (instance == null) return;

    lock (_session)
    {
        using (ITransaction transaction = _session.BeginTransaction())
        {
            lock (instance)
            {
                _session.SaveOrUpdate(instance);
                transaction.Commit();
            }
        }
    }
}
EDIT:
Please consider the context/type of applications I'm writing:
Not multi-user, not typical user-interaction, but a self-running robot reacting to remote events like financial data and order-updates, performing tasks and saves based on that. Intermittently this can create clusters of up to 10 saves per second. Typically it's the same object graph that needs to be saved every time. Also, on startup, the program does load the full database into an entity-object-graph. So it basically just reads once, then performs SaveOrUpdates as it runs.
Given that the application typically edits the same object graph, perhaps it would make more sense to have a single thread dedicated to applying these edits to the object graph and then saving them to the database, or perhaps a pool of threads servicing a common queue of edits, where each thread has its own (dedicated) session that it does not need to lock. Look up producer/consumer queues (to start, look here).
Something like this:
[Producer Threads]                         [Database Servicer Thread]
Edit Event --\
Edit Event ---->  Queue  ->  Dequeue and apply to session  ->  Database
Edit Event --/
I'd imagine that a BlockingCollection&lt;Action&lt;ISession&gt;&gt; would be a good starting point for such an implementation.
Here's a rough example (note this is obviously untested):
// Assuming you have a work queue defined as
public static BlockingCollection<Action<ISession>> myWorkQueue = new BlockingCollection<Action<ISession>>();

// and your event args look something like this
public class MyObjectUpdatedEventArgs : EventArgs
{
    public MyObject MyObject { get; set; }
}

// And one of your event handlers
public void MyObjectWasChangedEventHandler(object sender, MyObjectUpdatedEventArgs e)
{
    myWorkQueue.Add(s => s.SaveOrUpdate(e.MyObject));
}

// Then a thread in a constant loop processing these items could work:
public void ProcessWorkQueue()
{
    using (ISession mySession = mySessionFactory.OpenSession())
    {
        while (true)
        {
            var nextWork = myWorkQueue.Take();
            nextWork(mySession);
        }
    }
}

// And to run the above:
var dbUpdateThread = new Thread(ProcessWorkQueue);
dbUpdateThread.IsBackground = true;
dbUpdateThread.Start();
At least two disadvantages are:
You reduce performance significantly. Having this on a busy web server is like having a crowd outside a cinema but letting people in through a person-wide entrance.
A session has an internal identity map (first-level cache). A single session per application means that memory consumption grows as users access different data from the database. Ultimately you can end up with the whole database in memory, which of course would simply not work. You would then have to call a method to drop the first-level cache from time to time, but there is no good moment to do so: you can't just drop it at the beginning of a request, because other concurrent sessions could suffer from it.
I am sure people will add other disadvantages.
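For reference, the cache-dropping method in question is NHibernate's ISession.Clear() (ISession.Evict() detaches a single entity). A minimal sketch of the call that, under a shared session, has no safe moment:

```csharp
// Detaches everything in this session's identity map (first-level cache).
// With a single shared session, any other thread's in-flight entities
// are detached too, which is exactly the hazard described above.
_session.Clear();

// Or detach just one entity instead of the whole cache:
_session.Evict(staleEntity);
```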
