I have a situation where I pull data from a table by date. If no data is supplied for a given date I create a record using default values and display it all to the user. When the user is done manipulating the data I need to commit the changes.
So my question is: how do I handle, in Entity Framework, submitting a table where both updates and adds may be needed? This is in C# using MVC3 and Entity Framework.
So here's what the data might look like to start:
Table A
NAME AGE PHONE_NUM
Jim 25 555-555-5555
Jill 48 555-551-5555
After the user is done with the data, it could look like this:
Table A
NAME AGE PHONE_NUM
Jim 25 555-555-5555
Jill 28 555-551-5555
Rob 42 555-534-6677
How do I commit these changes? My problem is that both updates and inserts are needed.
I've found some code like this but I don't know if it will work in this case.
For adding rows of data:
entities.TABLEA.AddObject(TableOBJECT);
entities.SaveChanges();
or for updating data:
entities.TABLEA.Attach(entities.TABLEA.Single(t => t.NAME == TableOBJECT.NAME));
entities.TABLEA.ApplyCurrentValues(TableOBJECT);
entities.SaveChanges();
Will any of this work, or do I need to keep track of what's there and what was added?
Ideas?
More or less you already have the solution. You just need to check whether your Single call, which tries to load the object from the DB, has a result or not (use SingleOrDefault instead). If the result is null you need to insert, otherwise update:
foreach (var TableOBJECT in collectionOfYourTableOBJECTsTheUserWorkedWith)
{
    var objectInDB = entities.TABLEA
        .SingleOrDefault(t => t.NAME == TableOBJECT.NAME);

    if (objectInDB != null) // UPDATE
        entities.TABLEA.ApplyCurrentValues(TableOBJECT);
    else // INSERT
        entities.TABLEA.AddObject(TableOBJECT);
}
entities.SaveChanges();
(I'm assuming that NAME is the primary key property of your TableOBJECT entity.)
I think you have to keep track of what is new and what is modified. If you do that, then the two code examples you provided are going to work.
A simple workaround which I used is to check if an entity's primary key property is set to anything. If it is set to a value, then that is an updated object, otherwise it's new.
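For example, a minimal sketch of that check against the earlier code, assuming an integer identity key named ID that is still 0 for records the user just created (the property name is an assumption):

foreach (var TableOBJECT in collectionOfYourTableOBJECTsTheUserWorkedWith)
{
    if (TableOBJECT.ID == 0) // default key value: never persisted, so INSERT
    {
        entities.TABLEA.AddObject(TableOBJECT);
    }
    else // key already assigned, so UPDATE
    {
        // Loading the row attaches the current DB copy to the context;
        // ApplyCurrentValues then copies the edited values onto it by key.
        var attached = entities.TABLEA.Single(t => t.ID == TableOBJECT.ID);
        entities.TABLEA.ApplyCurrentValues(TableOBJECT);
    }
}
entities.SaveChanges();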
Another solution would be to use Entity Framework's Self Tracking Entities, but I do not think that's the right direction to go in a web application (maybe it is in a distributed WCF app).
Related
I am trying to find a neater way to save table/column updates. I have an object representing the current order table entry and an object with all updated values. I want to compare each variable with matching names in each object and save the update to the database if different.
The current way I am doing this (which I know is badly written) is as follows:
if (currentOrder.Comment != editedOrder.Comment)
{
    createOrderUpdateRecord("Comment", currentOrder.Comment.ToString(), editedOrder.Comment.ToString(), OrderID);
    currentOrder.Comment = editedOrder.Comment;
    anyChangesMade = true;
}
if (currentOrder.CustomerName != editedOrder.CustomerName)
{
    createOrderUpdateRecord("CustomerName", currentOrder.CustomerName.ToString(), editedOrder.CustomerName.ToString(), OrderID);
    currentOrder.CustomerName = editedOrder.CustomerName;
    anyChangesMade = true;
}
and then in the createOrderUpdateRecord method, I save the information to an edits table.
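For what it's worth, a reflection-based helper can collapse those repeated if blocks into a single loop. This is only a sketch: it assumes both objects are the same Order type, that the properties you care about are public read/write scalars, and that createOrderUpdateRecord keeps the signature used above (the helper name ApplyEdits is made up).

// Requires System.Linq and System.Reflection.
private bool ApplyEdits(Order currentOrder, Order editedOrder, int OrderID)
{
    bool anyChangesMade = false;
    foreach (var prop in typeof(Order).GetProperties().Where(p => p.CanRead && p.CanWrite))
    {
        object oldValue = prop.GetValue(currentOrder, null);
        object newValue = prop.GetValue(editedOrder, null);
        if (!object.Equals(oldValue, newValue))
        {
            // Log the change, then copy the edited value across.
            createOrderUpdateRecord(prop.Name,
                oldValue == null ? "" : oldValue.ToString(),
                newValue == null ? "" : newValue.ToString(),
                OrderID);
            prop.SetValue(currentOrder, newValue, null);
            anyChangesMade = true;
        }
    }
    return anyChangesMade;
}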
In case anyone's interested, I ended up using the tracker-enabled DbContext library alongside Entity Framework to track changes to table fields, which did the job perfectly.
In case you are using the normal .NET WinForms DataGridView when you refer to table/column updates:
There is an event called "CellValueChanged"; there is also "CellEndEdit". If you subscribe to either of those, you could set a "changed" variable to true to track whether you must update the value in the database.
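A minimal sketch of that idea (the grid and flag names are assumptions):

private bool rowChanged = false;

// In the form's constructor, subscribe to the event:
//   dataGridView1.CellValueChanged += dataGridView1_CellValueChanged;

private void dataGridView1_CellValueChanged(object sender, DataGridViewCellEventArgs e)
{
    // Ignore header cells; any edit to a real cell marks the data as dirty,
    // so you know an UPDATE is needed when the user saves.
    if (e.RowIndex >= 0)
        rowChanged = true;
}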
This question already has answers here: Fastest Way of Inserting in Entity Framework (32 answers).
Closed 7 years ago.
I have a table with 100,000+ records. Customer has asked that we encrypt the username field, copy the encrypted value to a new field, and clear the original username field. The encryption can only be performed in the application, not the database.
The existing codebase used the Entity Framework in the past for these tasks, but never for a table of this size. Legacy code looked something like:
foreach (var phone in db.Phones)
{
    phone.Enc_Serial = Encrypt(phone.Serial);
    phone.Serial = "";
}
db.SaveChanges();
Given this is a bulk update, would there be any advantage to doing this with a raw SQL command? I'm thinking that at least we wouldn't have a ton of tracked objects sitting in the DbContext consuming memory.
var idsAndSerials = db.Phones.Select(p => new { id = p.Id, serial = p.Serial });
foreach (var item in idsAndSerials)
{
    string sql = String.Format("Update phone set Enc_Serial ='{0}' where phoneId={1}", Encrypt(item.serial), item.id.ToString());
    db.Database.ExecuteSqlCommand(sql);
}
In the example you've provided, no way. You're still iterating through each record and calling UPDATE. At least in the first example I believe those statements would get executed as a batch, and would be transactional so that all updates succeed or none of them do.
Since this is a significant update, I'd suggest creating a tracking table (on the SQL side) in which you sequentially number each of the rows to be updated (and also store the row's PK value). Also include a column in the tracking table that lets you mark the row as done (say 0 or 1). Set a foreign key into the original table via the PK value.
Update your data model on the EF side to include the new tracking table. Now you have a new table that will easily let you retrieve, say, 1K record batches to work on at a time. This won't have excessive memory consumption. The application logic can do the encrypting. As you update those records, mark the ones that are updated as "done" in your tracking table.
Get the next 1K of not-done records via the tracking table (use navigation properties to get the real records). Repeat.
It'll be done very fast and with no undue load. 100000+ records is not really a lot, especially if you use a divide and conquer approach (100+ batches).
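A rough sketch of that batching loop, assuming a tracking entity set named ToUpdates with Sequence and Done columns plus a Phone navigation property, and a context class named MyEntities (all of these names are assumptions; Encrypt is the application's own method):

const int batchSize = 1000;
while (true)
{
    // A fresh context per batch keeps the tracked-object graph small.
    using (var db = new MyEntities())
    {
        var batch = db.ToUpdates
                      .Include("Phone")
                      .Where(t => !t.Done)
                      .OrderBy(t => t.Sequence)
                      .Take(batchSize)
                      .ToList();
        if (batch.Count == 0)
            break; // nothing left to process

        foreach (var item in batch)
        {
            item.Phone.Enc_Serial = Encrypt(item.Phone.Serial);
            item.Phone.Serial = "";
            item.Done = true; // mark the tracking row as done
        }
        db.SaveChanges(); // one round of updates per batch
    }
}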
Given the following code (which is mostly irrelevant except for the last two lines), what would your method be to get the value of the identity field for the new record that was just created? Would you make a second call to the database to retrieve it based on the primary key of the object (which could be problematic if there isn't one), or based on the last inserted record (which could be problematic with multithreaded apps), or is there maybe a more clever way to get the new value back at the same time you are making the insert?
Seems like there should be a way to get an Identity back based on the insert operation that was just made rather than having to query for it based on other means.
public void Insert(O obj)
{
    var sqlCmd = new SqlCommand() { Connection = con.Conn };
    var sqlParams = new SqlParameters(sqlCmd.Parameters, obj);
    var props = obj.Properties.Where(o => !o.IsIdentity);

    InsertQuery qry = new InsertQuery(this.TableAlias);
    qry.FieldValuePairs = props.Select(o => new SqlValuePair(o.Alias, sqlParams.Add(o))).ToList();

    sqlCmd.CommandText = qry.ToString();
    sqlCmd.ExecuteNonQuery();
}
EDIT: While this question isn't a duplicate in the strictest manner, it's almost identical to this one which has some really good answers: Best way to get identity of inserted row?
It strongly depends on your database server. For example, for Microsoft SQL Server you can get the value of the @@IDENTITY variable, which contains the last identity value assigned.
To prevent race conditions you must keep the insert query and the variable read inside a transaction.
Another solution could be to create a stored procedure for every type of insert you have to do and make it return the identity value and accept the insert arguments.
Otherwise, inside a transaction you can implement whatever ID assignment logic you want and be preserved from concurrency problems.
AFAIK there is no ready-made way.
I solved it by using client-generated ids (GUIDs), so that my method generates the id and returns it to the caller.
Perhaps you could analyse some SQL Server system tables to see what changed last, but you would run into concurrency issues (what if someone else inserts a very similar record?).
So I would recommend a strategy change and generating the ids on the client.
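A minimal sketch of that approach, assuming the entity exposes a Guid Id property (the property name and return type are assumptions):

public Guid Insert(O obj)
{
    // Generate the key on the client, so no round trip is needed to learn it.
    Guid newId = Guid.NewGuid();
    obj.Id = newId;

    // ... build and execute the INSERT exactly as before ...

    return newId;
}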
You can take a look at this link.
I may add that, to avoid the problem of multiple matching rows, you can use transactions: run the insert and the select inside the same transaction.
Good luck.
The proper approach is to learn SQL.
You can issue a SQL command followed by a SELECT in one run, so you can get the assigned identity back.
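Concretely, with SQL Server you can append a SELECT SCOPE_IDENTITY() to the INSERT and read the value back with ExecuteScalar. A sketch of the last two lines of the Insert method above (the CAST assumes an int identity column):

sqlCmd.CommandText = qry.ToString() + "; SELECT CAST(SCOPE_IDENTITY() AS int);";
// ExecuteScalar returns the first column of the first row, i.e. the
// identity value this INSERT generated in the current scope.
int newId = (int)sqlCmd.ExecuteScalar();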
I have a ListView with two columns in WPF, CustomerName and IsValid. I am using LINQ to SQL to get the data from my SQL table. When I try to update a value in the table I don't see any changes in the table.
Here is my code for when I click on the save button:
try
{
    CustomersDataContext dataContext = new CustomersDataContext();
    Customer customerRow = MyDataGrid.SelectedItem as Customer;
    string m = customerRow.CustomerName;

    Customer customer = (from p in dataContext.Customers
                         where p.CustomerName == customerRow.CustomerName
                         select p).Single();

    customer.Isvalid = false;
    dataContext.SubmitChanges();
    MessageBox.Show("Row Updated Successfully.");
}
catch (Exception Ex)
{
    MessageBox.Show(Ex.Message);
    return;
}
I can see that I am able to query the record based on the customer name selected, but the value is not updating.
I would be glad if someone could point out where I am missing the logic to update the "ISVALID" value in the database.
Firstly, where's your using(get_datacontext){...} block? You need to dispose of DataContexts when you are done with them!
Anyway...
My guess is that the update statement is generating a where clause that's far too tight, or just plain wrong.
I would be checking the 'Update Check' property of each of the columns in your mapped table in the Linq to Sql designer. The simplest thing to do is to set the primary key column to Always and set all the others to Never. You can also consider setting them to WhenChanged.
The designer's default behaviour is generally to set it to Always for everything; not only does this cause horrible WHERE clauses for updates, but can occasionally also cause problems. Obviously such behaviour is required for proper concurrency checking (i.e. two threads updating the same row); so be aware of that.
Oh, thinking of something else - you can also get this behaviour if your table doesn't have a primary key designated in the designer - make sure one of the columns is.
Finally, you can check the SQL being generated when SubmitChanges is called by attaching a TextWriter to the DataContext.Log property; equally, IntelliTrace in VS2010 will collect all ADO.NET queries that are run if you start with debugging. This is invaluable in debugging why L2S stuff isn't working as expected.
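For example, a minimal sketch using the code from the question:

// Route the SQL that LINQ to SQL generates to the console before saving.
dataContext.Log = Console.Out;
customer.Isvalid = false;
dataContext.SubmitChanges(); // the UPDATE statement (or its absence) is now visible in the output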
You should add the updated Customer to the list of customers being updated. I mean, before saving changes you should do something like db.AddToCustomers(customer). AddToCustomers is used in EF; I don't know its exact equivalent in LINQ to SQL.
Can someone explain why the following code fails to update the database or what else I can do to troubleshoot?
// *********************************
// People Updates
// *********************************
// In Engr and SoE
EmplIDs = InputList.GetPeopleIds(InputType.Engr | InputType.SoE); // retrieve IDs in tables Engr and SoE
Engr = DB.GetEngrByIds(EmplIDs); // retrieve objects from Engr
SoE = DB.GetSoEByIds(EmplIDs); // retrieve objects from SoE
Batch.Update(SoE, Engr); // update SoE with Engr data
DB.Save(SoE); // persist
// Inside DB repository
public void Save(List<SoE_People> people) {
    ChangeSet cs = dc.GetChangeSet();
    foreach (SoE_People person in people.Where(p => cs.Updates.Contains(p))) {
        person.LastUpdate = DateTime.Now;
    }
    dc.SubmitChanges();
}
I've checked the following:
people.Count ~ 2500, cs.Updates.Count ~ 200
dc.GetChangeSet().Updates.Count is 0 after calling SubmitChanges()
all updates to the people object are correct. They are visible in the locals window via people and cs.Updates[x]
no exceptions are thrown by dc.SubmitChanges()
setting dc.Log = Console.Out shows no SQL for the SubmitChanges()
a previous section of the code that inserts new records via dc.InsertAllOnSubmit() works fine -> no write permission problem.
manually cutting and pasting data into the SoE_People table works -> no foreign key constraint problem.
Without even looking into the logic you have above, here are some recommendations:
Put everything in a try/catch. How do you know there's no exception being thrown?
dc.SaveChanges() returns an int - number of records that were CRUDed. Capture the return value and check it.
I'm not familiar with DataContext (don't use it), but I'll throw this out there.
If it's like an SqlDataAdapter, did you write/define the SQL text for inserting or updating your records?
You have probably overridden the SoE_People update method in the DataContext.
There should be a this.ExecuteDynamicUpdate(instance) call in there.
Look there; maybe you have commented it out or removed it.
You can take a look here if you don't understand what I mean:
Custom Entity Insert/Update/Delete Method Validation.
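For reference, a minimal sketch of what that override pattern looks like in a generated DataContext (the context class name is an assumption; the entity name is taken from the question):

public partial class MyDataContext
{
    // If this partial method exists but the ExecuteDynamicUpdate call was
    // commented out or removed, SubmitChanges will emit no UPDATE for SoE_People.
    partial void UpdateSoE_People(SoE_People instance)
    {
        this.ExecuteDynamicUpdate(instance);
    }
}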
Maybe it is not the code, it is the database table.
I had the same issue where the table would not update via SubmitChanges().
I fixed the problem by giving the table a primary key.