I need to update 4 objects (entities) that exist in an ObservableCollection.
They are not bound to the view (UI).
What's the best way to update them using WCF RIA Services, or how should they be updated?
I would like to avoid making 4 trips to the database.
Would this generate 4 SQL UPDATE commands?
What about the time frame while the user decides what to change: another user could be changing one of the entities in the meantime. If so, what happens?
Any links I could read related to these questions?
Thanks in advance.
I should at least know what kind of update you need to do. Based on your question, I'm assuming that you need the user to change some arbitrary values on some entities, so no "optimizations" or grouped updates can be done.
The DomainContext will keep track of your changes and send them as a whole in a single ChangeSet.
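To illustrate, here is a minimal sketch of that flow, assuming a generated DomainContext with a Products entity set (all names here are hypothetical):

    // Hypothetical generated DomainContext; entity and query names are illustrative.
    var context = new MyDomainContext();

    context.Load(context.GetProductsQuery(), loadOperation =>
    {
        // Edit the four entities; the DomainContext tracks every change.
        foreach (var product in loadOperation.Entities.Take(4))
        {
            product.Price += 1m;
        }

        // All tracked edits travel to the server in a single ChangeSet,
        // i.e. one round trip from the client.
        context.SubmitChanges(submitOperation =>
        {
            if (submitOperation.HasError)
            {
                // Validation/concurrency errors surface here.
                submitOperation.MarkErrorAsHandled();
            }
        }, null);
    }, null);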
The number of trips you'll make to the database is not related to WCF RIA Services; rather, it's a feature of your data layer. However, if you are using an ORM like NHibernate, take a look at its batch size, or for EF take a look at this extension: http://weblogs.asp.net/pwelter34/archive/2011/11/29/entity-framework-batch-update-and-future-queries.aspx
Normally, yes. Any out-of-the-box data layer solution I know of will generate 4 distinct updates.
This is known as concurrency. Again, it is something you should manage at your data layer. Raising an exception if another user has changed that row is a reasonable approach in most cases.
Take a look at this: http://blogs.infosupport.com/optimistic-concurrency-with-wcf-ria-services-and-the-entity-framework/
I suggest you reformulate your question into more specific points. As it stands it is too broad; each point requires analysis of your needs, and it's impossible to point to a single way forward.
Related
Let's say we've got a type Book with ten properties. This type is a representation of a table in the database.
What's the best way to update such a type? I've used the repository pattern, where I have an Update method that takes a Book and updates all of its fields.
So when I want to update a Book, I get it by Id from the database, change the fields I want (1, 2, or all of them), and then invoke the Update method on the repository.
My friend, by contrast, says we should update only the fields we want; so, for example, if I only want to update the bookTitle field, I should create an UpdateTitle method in the repo, and so on.
What's the best solution? To be honest, one Update method in the repo that writes all fields looks much better to me than multiple methods that each update some subset of the properties.
Dapper does not generate queries for you; you write the query and pass it to Dapper. Dapper does not track your types; you maintain the tracking of changes yourself. Dapper just executes what you pass in and maps the result to your type/POCO/entity/class, whatever, if applicable.
That said, the question of how to update selected fields/columns is out of frame here.
Now, about the other question: which approach to take? Both of the approaches you mention have their own pros and cons.
Remember that premature optimization is the root of all evil. Dapper is simple; keep it simple. Update the entire type to begin with. Over time, if you face critical performance issues AND you are sure the issue is due to updating all the fields, go ahead and optimize the query (or write a new one) to update selected fields.
This will save you effort, will help improve performance if needed, and will keep things simple.
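As a rough sketch of that "update the entire type" approach (Book is the type from the question; the table name and the columns shown are just examples):

    using System.Data.SqlClient;
    using Dapper;

    public class BookRepository
    {
        private readonly string _connectionString;

        public BookRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        // One Update for the whole type: you write the SQL yourself and
        // Dapper maps the Book's properties onto the @parameters.
        public void Update(Book book)
        {
            const string sql = @"UPDATE Book
                                 SET Title = @Title,
                                     Author = @Author,
                                     Price = @Price
                                 WHERE Id = @Id";

            using (var connection = new SqlConnection(_connectionString))
            {
                connection.Execute(sql, book);
            }
        }
    }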
Dapper itself doesn't support tracking of database entities; there are ORMs that do (e.g. Entity Framework).
Tracking allows you to load database objects into model class instances; once these are updated in code, you can save the changes to the database, and the generated query will update only the changed fields. As said, this is not supported in Dapper.
If you just use Dapper, I would recommend keeping it simple: save the whole updated object, and consider changing that method only if you run into performance issues.
I am using the latest version of NHibernate, and I recently stumbled upon an interesting problem.
Let's say I have a table called Profile, and I want to retrieve a list of all my profiles. Alongside each row, however, I have a computed column called CanDelete, which prohibits a profile from being deleted if (for instance) it is in use.
However, this CanDelete computed column is not part of my entity, and I don't want to pollute the entity when I only need the CanDelete value in this scenario; and computing it individually for every profile is too slow.
Is there a way in NHibernate to execute some query and fetch the rows of that query as objects, but then somehow fetch an additional computed column as well?
Let's say I am using an N-tier architecture. All the way up in the presentation layer I need a list of all profiles (and, for each profile, whether or not I can delete it). What would my Business Logic Layer and Data Access Layer look like?
Right now my repository has a GetProfiles method, and then a CanDeleteProfile method that I run for every profile fetched. But as I mentioned above, that is simply too slow. I could make a GetProfilesWithCanDeleteStatus method, but that would require me to create a specialized entity with that computed column on it as well.
What are your suggestions for architecting this properly so that I don't hit O(n^2) performance in my profiles list? In particular, I would like to avoid the N+1 problem.
I am not necessarily looking for an NHibernate solution (I tagged NHibernate because it might have specific tools for this kind of thing); general solutions for other ORMs are welcome.
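To make the ask concrete, something shaped like this sketch is what I'm hoping for, expressed here with NHibernate's native SQL API (the Orders table and the "in use" rule are invented for illustration):

    // Native SQL route: {p.*} expands to the mapped columns of Profile,
    // and CanDelete comes back as one extra scalar per row.
    var rows = session.CreateSQLQuery(@"
            SELECT {p.*},
                   CASE WHEN EXISTS (SELECT 1 FROM Orders o WHERE o.ProfileId = p.Id)
                        THEN CAST(0 AS BIT) ELSE CAST(1 AS BIT)
                   END AS CanDelete
            FROM Profile p")
        .AddEntity("p", typeof(Profile))
        .AddScalar("CanDelete", NHibernateUtil.Boolean)
        .List<object[]>();

    foreach (var row in rows)
    {
        var profile = (Profile)row[0];
        var canDelete = (bool)row[1];
        // hand (profile, canDelete) pairs up to the presentation layer
    }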
I have an idea that might work for you, but it won't be perfect, since every way you design your program has its downsides.
I suggest you change the definition of the CanDelete column so that it is just like any other column in that entity (not computed at run time) and of type Boolean, without damaging the requirements.
By doing so, reading it becomes a plain SELECT from the DB, which is very quick.
Now, the tricky part is to ensure that the column indicates, at any time it is needed, whether the profile is in use (and whether it can be deleted).
Because I don't know how you compute whether a Profile entity is in use (and can be deleted), it's difficult to say exactly how to design the DAL and BL, but the guideline is:
In every other place where you change state in the DB (of that entity or another) in a way that might affect the CanDelete column, encapsulate the change in a function that recomputes that value, validates its state, and changes it if needed (see the sketch below).
If you ensure this happens every time the BL changes one of the columns that CanDelete is computed from, then the CanDelete column is always a true indication.
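A minimal sketch of that guideline (the "in use" rule, the Order entity, and the method names are invented for illustration):

    // Requires using NHibernate.Linq for session.Query<T>().
    // Called from every BL operation that might affect whether a profile is in use.
    private void RefreshCanDelete(ISession session, Profile profile)
    {
        // Recompute from whatever rules define "in use"; here we pretend a
        // profile is in use while any Order still references it.
        bool inUse = session.Query<Order>().Any(o => o.Profile == profile);

        profile.CanDelete = !inUse;
        session.Update(profile);
    }

    // Example call site: a state change that could affect the flag.
    public void AssignProfileToOrder(ISession session, Order order, Profile profile)
    {
        order.Profile = profile;
        session.Update(order);
        RefreshCanDelete(session, profile);   // keep the persisted flag truthful
    }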
The downsides of this approach are:
1. It leaves room for a programmer to make the mistake of not using the encapsulating function when needed.
2. It assumes this app is the only one changing this DB.
3. You have to be careful about inserting raw data from Management Studio or a script.
I hope this works for your BL.
This might seem like an odd question, but it's been bugging me for a while now. Given that I'm not a hugely experienced programmer, and I'm the sole application/C# developer in the company, I felt the need to sanity-check this with you guys.
We have created an application that handles shipping information internally within our company; this application works against a central DB at our IT office.
We've recently switched DB from MySQL to MSSQL, and during the transition we decided to forgo the web services previously used and connect directly to the DB using an application role. For added security we only allow access to stored procedures, and all CRUD operations are handled via these.
However, we currently have a stored procedure for updating every single field in one of our objects, which is quite a few stored procedures, and as such quite a bit of work on the client for the DataRepository (needing separate code to call each procedure and pass the right params).
So I'm thinking: would it be better to simply update the entire object (in this case an object represents a table, for example shipments), given that a lot of that data is changed one field at a time after the initial insert, and that we are trying to keep network usage down, as some of the clients run with limited internet?
What's the standard practice for this kind of thing? Or is there a method I've overlooked?
I would say that updating all the columns for the entire row is a much more common practice.
If you have a proc for each field and you change multiple fields in one update, you will have to wrap all the stored procedure calls in a single transaction to avoid the database getting into an inconsistent state. You also have to detect which fields changed (which means you need to compare the old row to the new row).
Look into using an Object-Relational Mapper (ORM) like Entity Framework for these kinds of operations. You will find that there is no general consensus on whether ORMs are a great solution for all data access needs, but it's hard to argue that they don't solve the problem of CRUD pretty comprehensively.
Connecting directly to the DB over the internet isn't something I'd switch to in a hurry.
"we decided to forgo the webservices previously used and connect directly to the DB"
What made you decide this?
If you are intent on this model, then a single sproc to update an entire row would be advantageous over one per column. I have a similar application which uses sprocs in this way; however, the data from the client comes in via XML, and a middleware application on our server end deals with updating the DB.
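For example, a single row-level proc call from the repository could look roughly like this (the proc and parameter names are made up):

    using System.Data;
    using System.Data.SqlClient;

    // One proc call updates the whole shipment row in a single
    // statement/transaction, instead of one call per column.
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.Shipment_Update", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@Id", shipment.Id);
        command.Parameters.AddWithValue("@Status", shipment.Status);
        command.Parameters.AddWithValue("@Destination", shipment.Destination);

        connection.Open();
        command.ExecuteNonQuery();
    }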
The standard practice is not to connect to the DB over the internet.
Even for a small app, this should be the overall model:
Client app -> over internet -> server-side app (WCF web service) -> LAN/localhost -> SQL DB
Benefits:
Your client app would not even know that you have switched DB implementations.
It would not know anything about DB security, etc.
You, as a programmer, would not be thinking in terms of "rows" and "columns" on the client side; those would be objects and fields.
You would be able to use different protocols: send only single-field updates between the client app and the server app, but update entire rows between the server app and the DB.
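A sketch of what the middle tier's contract could look like under that model (service and member names are illustrative):

    using System.ServiceModel;

    // Hypothetical middle-tier contract; Shipment is an illustrative DTO.
    [ServiceContract]
    public interface IShipmentService
    {
        [OperationContract]
        Shipment GetShipment(int id);

        // The client sends just the changed field; the server-side
        // implementation reads the row, applies the change, and updates
        // the entire row against the DB over the LAN.
        [OperationContract]
        void UpdateShipmentStatus(int id, string newStatus);
    }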
Now, given your situation, updating the entire row (the entire object) is definitely more of a standard practice than updating a single column.
It's better to update only what you changed if you know what changed (when using an ORM like Entity Framework, for example), but if you're going down the stored proc route, then yes, definitely update everything in a row at once; that's granular enough.
Since you're already in the middle of a big change anyway, you should take the switch as an opportunity to move over to LINQ to Entities, and ditch stored procedures in the process wherever possible.
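For instance, with EF's change tracking the generated UPDATE only touches the properties you actually changed; a sketch, with a hypothetical context and entity:

    // Hypothetical DbContext and entity; EF's snapshot change tracking
    // marks only Status as modified, so the generated statement is
    // UPDATE ... SET Status = ... WHERE Id = ...
    using (var db = new ShippingContext())
    {
        var shipment = db.Shipments.Find(shipmentId);
        shipment.Status = "Delivered";
        db.SaveChanges();
    }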
Hi everyone.
In C# .NET (VS 2008) I'm developing an N-tier CRM framework solution, and when it's done I want to share it.
The architecture is based on:
Data Access Layer, Entity Framework, Business Logic Layer, WCF, and finally the presentation layer (WinForms).
Somewhere I read that more than 2 tiers are problematic because of optimistic concurrency updates (multiple client transactions with the same data).
In solutions with at most 2 tiers this should not be a problem, because controls (like DataGridView) solve it by themselves, so I'm asking myself whether it wouldn't be better to work with 2 tiers and thus avoid the optimistic concurrency problem.
Actually, I want to make an N-tier solution for huge projects, not a 2-tier one. I don't know how to solve concurrency problems like this and hope to get help right here.
Certainly there should be some good mechanisms to solve this... any suggestions, examples, etc.?
Thanking you in anticipation.
Best regards,
Jooj
It's not really a question of the number of tiers. The question is how your data access logic deals with concurrency. Dealing with concurrency should happen in whichever tier handles your data access, regardless of how many tiers you have. But I understand where you're coming from, as the .NET controls and components can hide this functionality and reduce the number of tiers needed.
There are two common methods of optimistic concurrency resolution.
The first is using a timestamp on rows to determine whether the version the user was looking at when they started their edit has been modified by the time they commit it. Keep in mind that this is not necessarily a proper Timestamp database data type; different systems will use different data types, each with their own benefits and drawbacks. This is the simpler approach and works fine with most well-designed databases.
The second common approach is, when committing changes, to identify the row in question not only by id but by all of the original values of the fields that the user changed. If the original values of the fields and the id don't match on the record being edited, you know that at least one of those fields has been changed by another user. This option has the benefit that even if two users edit the same record, as long as they don't change the same fields, the commit works. The downside is that there is potentially extra work involved to guarantee that the data in the database record is in a consistent state as far as business rules are concerned.
Here's a decent explanation of how to implement simple optimistic concurrency in EF.
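As a rough illustration of the timestamp approach in EF (code-first style; the entity and its members are illustrative):

    using System.ComponentModel.DataAnnotations;

    // Illustrative entity using the timestamp approach.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }

        [Timestamp]                       // maps to a SQL Server rowversion column
        public byte[] RowVersion { get; set; }
    }

    // On SaveChanges, EF includes RowVersion in the WHERE clause of the UPDATE.
    // If another user changed the row since it was read, zero rows match and
    // EF throws DbUpdateConcurrencyException, which you catch and resolve.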
We use a combination of manual merging (determining change sets and collisions) and last-man-wins, depending on the data requirements. If data changes collide (the same field changed from a common original value), merge-type exceptions are thrown and the clients handle them.
A few things come to my mind:
1) If you are using EF, surely you don't have a Data Access Layer?! Do you mean the database itself?
2) The question of tiers is both a physical and a logical one. So do you mean physical or logical tiers?
3) In any-tiered application there is this issue with concurrency. Even in client-server, people could open a form, go somewhere, come back, and then save while the data has been changed by someone else. You can use a timestamp while saving to check that nothing has changed since you read the data.
4) Do not think too much about fewer or more tiers. Just implement the functionality as simply as possible and with the minimum number of layers.
Is there any way to have a datagrid listen to the database and automatically update its data when the database data changes? I am using a SQL Server database.
I'd like to use LINQ to SQL if possible.
Because @Slaggg asked: there are fairly straightforward ways of doing this, but they're almost certainly going to involve a lot of coding, it'll impact performance pretty significantly, and my strong suspicion is that it'll be more trouble than it's worth.
That said, for a typical n-tier application, at a very high level you'll need:
(1) A way to notify the middle tier when the data changes. You could use custom-coded triggers inside each table that fire off some sort of notification (perhaps using WCF and CLR stored procedures), or you could use a SqlDependency object; the second would probably work better (see the SqlDependency sketch after this list).
(2) A way to notify each client connected to that middle tier. Assuming that you're using WCF, you'll need to use one of the duplex bindings that are available, such as Net.TCP or HttpPollingDuplex (for Silverlight). You'll need to make sure this is configured correctly on both the client and the server. You'll also need to manually keep track of which clients might be interested in the update, so that you can know which ones to update, and you'll need to be able to remove them from that list when they go away or timeout. Tomek, from the MS WCF team, has some pretty good examples on his blog that you might want to investigate.
(3) A mechanism to update the local client's model and/or viewmodel and/or UI once you get the notification from the middle tier that something has changed. This is more complicated than you'd think: it's difficult enough to keep your UI in sync with your data model under normal circumstances, but it gets dramatically more complicated when that data model can be changing under you from the other direction as well.
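To give an idea of point (1), a minimal SqlDependency subscription looks roughly like this (table and column names are illustrative, and the database needs Service Broker enabled):

    using System.Data.SqlClient;

    // Call once at application start; requires Service Broker on the DB.
    SqlDependency.Start(connectionString);

    void Subscribe()
    {
        using (var connection = new SqlConnection(connectionString))
        // Query notifications need an explicit column list and two-part
        // table names; SELECT * will not register.
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Products", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) =>
            {
                // Fires once per subscription: re-query, push the fresh data
                // out to interested clients, then subscribe again.
                Subscribe();
            };

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* refresh the middle tier's copy */ }
            }
        }
    }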
The idea behind these sorts of notifications is straightforward enough, but getting all the details right is likely to keep you debugging way into the night. And the guy who has to support all this two years from now will curse your name.
Hope this helps.
It depends on where you are updating the database from:
1. From the same context (in Silverlight: are you adding, deleting, editing on the same page?)
2. From a ChildWindow in your Silverlight application
3. From an external, non-related tool, outside of your Silverlight application