I have two models: Machine and Device.
The relation between them is: a Machine has a collection of Devices.
How the POST action should work: when the user creates a new Machine, they also declare the number of Devices that Machine has.
This way, if 3 devices are declared for a Machine, 3 records must be saved in the Device table.
The Code:
[HttpPost, ActionName("CreateEdit")]
[ValidateAntiForgeryToken]
public async Task<IActionResult> CreateEditPost(int? id,
    [Bind("TypeID,BrandID,SupplierID,StoreID,MchName,NumDevices,FechaCompra,CostoMaq,MachineStatus")] Machine model,
    [Bind("Id,DeviceName")] Device devicemodel)
{
    if (id == null)
    {
        return NotFound();
    }
    if (ModelState.IsValid)
    {
        if (id == 0)
        {
            _context.Add(model);
            for (var items = 0; items < model.NumDevices; items++)
            {
                var contador = items + 1;
                string devicename = model.MchName + "-" + contador.ToString();
                devicemodel.DeviceName = devicename;
                _context.Add(devicemodel);
                await _context.SaveChangesAsync();
            }
            await _context.SaveChangesAsync();
            return RedirectToAction("Index");
        }
    }
    return RedirectToAction("Index");
}
The problem:
When indicating, for example, 2 devices, here is what the debugger shows:
As shown, on the first attempt the DeviceID is 0; on the second attempt it is 1006. This DeviceID is autogenerated.
At this point the application breaks, claiming:
SqlException: Cannot insert explicit value for identity column in table 'Device' when IDENTITY_INSERT is set to OFF.
I believe this is happening because it's trying to write a zero to a key field (DeviceID).
But it is also saving one record to the database:
It's saving a combination of the two attempts: (1) the DeviceName from attempt 1, and (2) the DeviceID from attempt 2.
Can someone explain why the DeviceID is zero on the first attempt? How can this be fixed? And why is it saving a mix of both attempts?
Thanks in advance.
From what I can tell, your loop iterates over the number of devices it thinks it has, based on the auto-bound NumDevices on the Machine model, which I assume is hand-entered on your MVC form.
For each "Device", you are literally telling Entity Framework to add the same object (after modifying its properties) and save it to the database. After the first SaveChanges call, the device's ID column is updated to the ID the database assigned to it. If you then add that object to the DbContext again, EF tries to create a NEW device with the SAME id, which is illegal unless, as the error says, IDENTITY_INSERT is set to ON. Even if that setting were ON, it would still be illegal because of the likely uniqueness constraint.
So, first of all, it's better practice to have DISCONNECTED models and a data layer that converts those models into actual entities and inserts them into the DB. But, barring that, something like this, which creates a new Device on each iteration, would work better:
if (id == 0)
{
    _context.Add(model);
    for (var items = 0; items < model.NumDevices; items++)
    {
        var contador = items + 1;
        string devicename = model.MchName + "-" + contador.ToString();
        var devNew = new Device();
        devNew.DeviceName = devicename;
        _context.Add(devNew);
        await _context.SaveChangesAsync();
    }
    await _context.SaveChangesAsync();
    return RedirectToAction("Index");
}
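One further refinement worth considering, assuming Machine exposes a Devices collection navigation property (not shown in the question): let EF wire up the foreign keys and save the whole graph in one round-trip, instead of calling SaveChangesAsync inside the loop.

if (id == 0)
{
    for (var i = 0; i < model.NumDevices; i++)
    {
        // Devices is assumed to be an initialized collection on Machine;
        // EF fills in each Device's foreign key when the machine is saved.
        model.Devices.Add(new Device { DeviceName = model.MchName + "-" + (i + 1) });
    }
    _context.Add(model);               // adds the machine and its devices together
    await _context.SaveChangesAsync(); // a single round-trip
    return RedirectToAction("Index");
}

If that relationship is required, this also addresses a second issue: the original code never linked the devices to the machine at all.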
Related
I need to auto-increment TimesModified by 1 every time the Edit method runs.
When I use the following code, only TimesModified gets incremented; the other properties do not change (even though I'm changing, say, Age):
When I use the other version of the code, everything else can be changed/updated, but TimesModified doesn't get incremented:
I also tried this:
public async Task<IActionResult> Edit(int id, [Bind("ID,FirstName,LastName,Email,Phone,Age,City,Department,HiredDate,FiredDate,TimesModified")] Employee employee)
{
    if (id != employee.ID)
    {
        return NotFound();
    }
    if (ModelState.IsValid)
    {
        try
        {
            _context.Update(employee);
            await _context.SaveChangesAsync();
            _context.Update(employee.TimesModified++);
            await _context.SaveChangesAsync();
        }
    }
}
And I get this error:
Any suggestions?
The 1st suggestion gives the following error:
The 2nd suggestion has no errors, but when I change anything (Age, for example) it resets back to its original (pre-edit) value. (TimesModified does get incremented, though.)
Try this:
var dbEmployee = await _context.Employees
    .Where(e => e.ID == employee.ID)
    .FirstOrDefaultAsync();
dbEmployee.TimesModified += 1;
_context.Entry(dbEmployee).State = EntityState.Modified;
//or _context.Entry(dbEmployee).Property(t => t.TimesModified).IsModified = true;
await _context.SaveChangesAsync();
Instead of fetching the whole entity, just query the TimesModified value.
E.g.:
var tm = _context.Employees.Where(e => e.ID == employee.ID).Select(e => e.TimesModified).First();
employee.TimesModified = tm + 1;
_context.Update(employee);
await _context.SaveChangesAsync();
3rd suggestion was my final SOLUTION!!!
Added an extra input field inside the Edit view.
Inside the Edit action, the update happens in 2 steps: save the input, increment, save again.
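For reference, a minimal sketch of that two-step Edit action (names follow the question's Employee model; the hidden TimesModified input in the view is assumed):

if (ModelState.IsValid)
{
    // Step 1: save the posted edits as-is.
    _context.Update(employee);
    await _context.SaveChangesAsync();

    // Step 2: bump the counter on the now-tracked entity and save again.
    employee.TimesModified++;
    await _context.SaveChangesAsync();

    return RedirectToAction("Index");
}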
I am trying to insert a list of Answers into Questions, which in turn are inserted into Exams. All my code works just fine except one part: inserting new Answers.
All the data is inserted fine except the Answers' data, which is never stored in the database. I also tried to get the QuestionId so I could store it on each Answer as a foreign key, and I failed at that too.
API Controller
public IActionResult addExam([FromBody] Exams exam)
{
    try
    {
        if (exam == null)
        {
            return StatusCode(401, "data is null");
        }
        var userId = this.help.GetCurrentUser(HttpContext);
        Exams exams = new Exams
        {
            Name = exam.Name,
            Number = exam.Number,
            FullMarck = exam.FullMarck,
            CreatedBy = userId,
            CreatedOn = DateTime.Now,
            Status = exam.Status
        };
        db.Exams.Add(exams);
        var questionsList = new List<Questions>();
        foreach (Questions item in exam.Questions)
        {
            var question = new Questions
            {
                ExamId = exam.Id,
                Points = item.Points,
                CreatedBy = userId,
                CreatedOn = DateTime.Now,
                Status = item.Status,
            };
            questionsList.Add(question);
        }
        exams.Questions = questionsList;
        db.SaveChanges();
        foreach (Questions item in exam.Questions)
        {
            var answersList = new List<Answers>();
            foreach (Answers answers in item.Answers)
                answersList.Add(new Answers
                {
                    QuestionId = item.Id,
                    ExamAnswers = answers.ExamAnswers,
                    CreatedBy = userId,
                    CreatedOn = DateTime.Now
                });
            item.Answers = answersList;
        }
        db.SaveChanges();
        return Ok("successfully created ");
    }
    catch (Exception e)
    {
        return StatusCode(500, e.InnerException.Message);
    }
}
The way you build your graph is a bit unusual. I would have expected a single set of nested loops that takes your supplied model and populates entity collections without forcing ids. EF will track the IDs; you don't need to worry about them. When you add a new Answer to a particular question.Answers collection, you don't need to tell the Answer what its QuestionID is; EF knows, based on which question it was added to. If a question's ID is not yet set, because it is generated by the db and no save has occurred, then saving will generate the ID and EF will ripple the change out to all the Answers in question.Answers; you don't need to micromanage it.
Here's pseudo-code of how I would expect it to go:
//model is an ExamModel
Exam e = new Exam(); //exam is a db entity
e.Title = model.ExamTitle; //model is not a db entity
foreach (QuestionModel mq in model.Questions) { //enumerate the question models from the front end and build a db entity graph
    Question q = new Question(); //make a new db entity
    q.Subject = mq.QuestionHeader; //set entity property from model
    q.Body = mq.BodyText; //set property from model
    if (e.Questions == null) //not sure how your entities are declared; if this is already done elsewhere, remove it
        e.Questions = new List<Question>();
    e.Questions.Add(q); //add the question db entity to the exam db entity
    //notice I didn't set the question id. EF will do that - it knows what exam this question belongs to
    foreach (AnswerModel ma in mq.Answers) { //while our question db entity q is still in scope, add the related answers to it
        Answer a = new Answer(); //create EF entity
        a.Text = ma.AnswerText; //set db entity property from model property
        if (q.Answers == null)
            q.Answers = new List<Answer>();
        q.Answers.Add(a); //add the answer to the question db entity
    }
}
//exam e now has a collection of questions, each with a collection of answers; save it to the db
db.Exams.Add(e);
db.SaveChanges(); //only one SaveChanges call is needed for the whole graph
I think the way you've split your operations up hasn't created a connected graph of entities, and/or forcing the question ids onto the answers means EF hasn't kept the relationships up to date upon saving.
You should also have a separation between the data objects arriving in your controller (I've called these ExamModel, QuestionModel, AnswerModel) and the entities in your EF (I've called these Exam, Question, Answer - yours are plural). This separation is achieved by giving your front-end controllers different classes than your back-end db context uses. At first it looks like things are being repeated for no good reason, but eventually the system becomes complex enough that not every db property can or should be exposed all the way to the front end and back, and the front end may need calculated or other non-db data. At that point you really need your front-end data models to be completely separate from your back-end data entities.
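A minimal sketch of that separation (class and property names here are illustrative, not taken from your code):

// Front-end model: what the controller receives and returns.
public class ExamModel
{
    public string ExamTitle { get; set; }
    public List<QuestionModel> Questions { get; set; }
}

// EF entity: what the DbContext maps to the database.
public class Exam
{
    public int Id { get; set; }
    public string Title { get; set; }
    public List<Question> Questions { get; set; }
}

The controller accepts an ExamModel, and a mapping step (hand-written or via a library) builds the Exam entity graph from it.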
I think you need to call db.SaveChanges() right after adding the new exam. In your case the exam id is not yet generated and is always 0, so you can't save a question with ExamId 0.
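That is, something like this sketch:

db.Exams.Add(exams);
db.SaveChanges();          // the database generates exams.Id here
// ...questions can now safely reference exams.Id as their ExamId...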
item.Answers = answersList; is wrong, because item there is a question from the incoming request model, not the tracked entity. Attach each answersList to the corresponding question in questionsList (the entities actually added to the context) instead.
Remove the first db.SaveChanges();, the one before the last foreach loop. This will insert all your data at once and should do the job.
I have an export job migrating data from an old database into a new database. The problem I'm having is that the user population is around 3 million and the job takes a very long time to complete (15+ hours). The machine I am using only has 1 processor so I'm not sure if threading is what I should be doing. Can someone help me optimize this code?
static void ExportFromLegacy()
{
    var exportStart = DateTime.Now; // start time for the summary at the end
    var usersQuery = _oldDb.users.Where(x => x.status == "active");
    int BatchSize = 1000;
    var errorCount = 0;
    var successCount = 0;
    var batchCount = 0;
    // Using MoreLinq's Batch for sequences
    // https://www.nuget.org/packages/MoreLinq.Source.MoreEnumerable.Batch
    foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
    {
        Console.WriteLine(String.Format("Batch count at {0}", batchCount));
        batchCount++;
        foreach (var user in batch)
        {
            try
            {
                var userData = _oldDb.userData.Where(x => x.user_id == user.user_id).ToList();
                if (userData.Count > 0)
                {
                    // Insert into table
                    var newData = new newData()
                    {
                        UserId = user.user_id // shortened code for brevity.
                    };
                    _db.newUserData.Add(newData);
                    _db.SaveChanges();
                    // Insert item(s) into table
                    foreach (var item in userData)
                    {
                        if (!_db.userDataItems.Any(x => x.id == item.id))
                        {
                            var newItem = new Item()
                            {
                                UserId = user.user_id, // shortened code for brevity.
                                DataId = newData.id // id from object created above
                            };
                            _db.userDataItems.Add(newItem);
                        }
                        _db.SaveChanges();
                        successCount++;
                    }
                }
            }
            catch (Exception ex)
            {
                errorCount++;
                Console.WriteLine(String.Format("Error saving changes for user_id: {0} at {1}.", user.user_id.ToString(), DateTime.Now));
                Console.WriteLine("Message: " + ex.Message);
                Console.WriteLine("InnerException: " + ex.InnerException);
            }
        }
    }
    Console.WriteLine(String.Format("End at {0}...", DateTime.Now));
    Console.WriteLine(String.Format("Successful imports: {0} | Errors: {1}", successCount, errorCount));
    Console.WriteLine(String.Format("Total running time: {0}", (DateTime.Now - exportStart).ToString(@"hh\:mm\:ss")));
}
Unfortunately, the major issue is the number of database round-trips.
You make a round-trip:
For every user, to retrieve the user data by user id from the old database
For every user, to save the user data in the new database
For every user, to save the user data items in the new database
So if you have 3 million users and every user has an average of 5 user data items, that means at least 3M + 3M + 15M = 21 million database round-trips, which is insane.
The only way to dramatically improve performance is to reduce the number of database round-trips.
Batch - Retrieve user data by id
You can quickly cut the round-trips by retrieving all the user data for a batch at once, and since you don't have to track these entities, use AsNoTracking() for even more performance.
var list = batch.Select(x => x.user_id).ToList();
var userDatas = _oldDb.userData
    .AsNoTracking()
    .Where(x => list.Contains(x.user_id))
    .ToList();

foreach (var userData in userDatas)
{
    // ...
}
This change alone should already save you a few hours.
Batch - Save Changes
Every time you save a user data row or an item, you perform a database round-trip.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
You can either call BulkSaveChanges at the end of the batch, or build lists to insert and use BulkInsert directly for even more performance.
You will, however, have to set a relation to the newData instance instead of using the ID directly.
foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
{
    // Retrieve all user data for the batch at once.
    var list = batch.Select(x => x.user_id).ToList();
    var userDatas = _oldDb.userData
        .AsNoTracking()
        .Where(x => list.Contains(x.user_id))
        .ToList();

    // Create lists used for BulkInsert
    var newDatas = new List<newData>();
    var newDataItems = new List<Item>();

    foreach (var userData in userDatas)
    {
        // newDatas.Add(newData);
        // newDataItem.OwnerData = newData;
        // newDataItems.Add(newDataItem);
    }

    _db.BulkInsert(newDatas);
    _db.BulkInsert(newDataItems);
}
EDIT: Answer to subquestion

One of the properties of a newDataItem is the id of newData (e.g. newDataItem.newDataId), so newData would have to be saved first in order to generate its id. How would I BulkInsert if there is a dependency on another object?
You must use navigation properties instead. With a navigation property, you never have to specify the parent id; you set the parent object instance instead.
public class UserData
{
    public int UserDataID { get; set; }
    // ... properties ...
    public List<UserDataItem> Items { get; set; }
}

public class UserDataItem
{
    public int UserDataItemID { get; set; }
    // ... properties ...
    public UserData OwnerData { get; set; }
}

var userData = new UserData();
var userDataItem = new UserDataItem();

// Use the navigation property to set the parent.
userDataItem.OwnerData = userData;
Tutorial: Configure One-to-Many Relationship
Also, I don't see a BulkSaveChanges in your example code. Would that have to be called after all the BulkInserts?

BulkInsert inserts directly into the database. You don't have to call SaveChanges or BulkSaveChanges; once the method returns, it's done ;)
Here is an example using BulkSaveChanges:
foreach (IEnumerable<users> batch in usersQuery.Batch(BatchSize))
{
    // Retrieve all user data for the batch at once.
    var list = batch.Select(x => x.user_id).ToList();
    var userDatas = _oldDb.userData
        .AsNoTracking()
        .Where(x => list.Contains(x.user_id))
        .ToList();

    // Create lists used for BulkSaveChanges
    var newDatas = new List<newData>();
    var newDataItems = new List<Item>();

    foreach (var userData in userDatas)
    {
        // newDatas.Add(newData);
        // newDataItem.OwnerData = newData;
        // newDataItems.Add(newDataItem);
    }

    var context = new UserContext();
    context.userDatas.AddRange(newDatas);
    context.userDataItems.AddRange(newDataItems);
    context.BulkSaveChanges();
}
BulkSaveChanges is slower than BulkInsert because it has to use some internal methods from Entity Framework, but it is still way faster than SaveChanges.
In the example, I create a new context for every batch to avoid memory issues and gain some performance. If you re-use the same context for all batches, you will end up with millions of tracked entities in the ChangeTracker, which is never a good idea.
Entity Framework is a very bad choice for importing large amounts of data. I know this from personal experience.
That being said, I found a few ways to optimize things when I tried to use it in the same way you are.
The Context caches objects as you add them, and the more inserts you do, the slower future inserts get. My solution was to limit each context to about 500 inserts before disposing of that instance and creating a new one. This boosted performance significantly.
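A rough sketch of that recycling pattern (MyDbContext, MyEntity and Map are placeholders for your own types, and the 500 threshold is just what worked for me):

var context = new MyDbContext();
int pending = 0;
foreach (var row in sourceRows)
{
    context.Set<MyEntity>().Add(Map(row));
    if (++pending % 500 == 0)
    {
        context.SaveChanges();
        context.Dispose();            // throw away the bloated change tracker
        context = new MyDbContext();  // start fresh
    }
}
context.SaveChanges(); // flush the final partial batch
context.Dispose();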
I was able to use multiple threads to increase performance, but you have to be very careful about resource contention. Each thread definitely needs its own Context; don't even think about sharing one between threads. My machine had 8 cores, so threading helped me; since you have a single processor, I doubt it will help you at all.
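If you do try threads, the shape is roughly this (a sketch; MyDbContext, MyEntity and Map are placeholders as above):

Parallel.ForEach(batches, batch =>
{
    using (var ctx = new MyDbContext()) // one context per worker, never shared
    {
        foreach (var row in batch)
            ctx.Set<MyEntity>().Add(Map(row));
        ctx.SaveChanges();
    }
});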
Turn off automatic change detection with AutoDetectChangesEnabled = false; change detection is incredibly slow. Unfortunately this means you have to modify your code to register all changes explicitly through the context. No more Entity.Property = "Some Value"; it becomes something like Context.Entry(entity).Property(e => e.Property).CurrentValue = "Some Value"; which makes the code ugly.
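For reference, in EF6 the switch lives on the context's Configuration object, and explicit updates go through Entry (a sketch; the entity and property names are placeholders):

_db.Configuration.AutoDetectChangesEnabled = false;

// With automatic detection off, register modifications explicitly:
var entry = _db.Entry(entity);
entry.Property(e => e.SomeProperty).CurrentValue = "Some Value";
entry.Property(e => e.SomeProperty).IsModified = true;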
Any queries you do should definitely use AsNoTracking.
With all that, I was able to cut a ~20-hour process down to about 6 hours, but I still don't recommend using EF for this. It was an extremely painful project, due almost entirely to my poor choice of EF for adding the data. Please use something else... anything else...
I don't want to give the impression that EF is a bad data-access library; it is great at what it was designed to do. Unfortunately, this is not what it was designed for.
I can think of a few options.
1) A small speed increase can be gained by moving your _db.SaveChanges() below your foreach() closing bracket:
foreach (...){
}
successCount += _db.SaveChanges();
2) Add items to a list, and then to the context:
List<ObjClass> list = new List<ObjClass>();
foreach (...)
{
    list.Add(new ObjClass() { ... });
}
_db.newUserData.AddRange(list);
successCount += _db.SaveChanges();
3) If it's a big amount of data, save in bunches:
List<ObjClass> list = new List<ObjClass>();
int cnt = 0;
foreach (...)
{
    list.Add(new ObjClass() { ... });
    if (++cnt % 100 == 0) // bunches of 100
    {
        _db.newUserData.AddRange(list);
        successCount += _db.SaveChanges();
        list.Clear();
        // Optional if a HUGE amount of data
        if (cnt % 1000 == 0)
        {
            _db = new MyDbContext();
        }
    }
}
// Don't forget that!
_db.newUserData.AddRange(list);
successCount += _db.SaveChanges();
list.Clear();
4) If it's REALLY big, consider using bulk inserts. There are a few examples on the internet and a few free libraries around.
Ref: https://blogs.msdn.microsoft.com/nikhilsi/2008/06/11/bulk-insert-into-sql-from-c-app/
With most of these options you lose some control over error handling, as it is difficult to know which row failed.
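For option 4, SqlBulkCopy from ADO.NET is the usual no-dependency route (a sketch; the destination table name and the pre-filled DataTable are placeholders):

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.newUserData";
    bulk.BatchSize = 5000;
    bulk.WriteToServer(dataTable); // a DataTable (or IDataReader) built from the legacy rows
}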
I am very new to MVC and hope someone can assist me.
I have a controller method to save post back data from a form. It has a field called OrderStatus. If the order status value is "Received" then only I want to execute a block of code.
What I am doing in this code is: read the posted values, re-read the EF data using Find, and compare the values. All seems OK, but when I try to save the record, it gives me the error below.
An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.
I do understand the problem, but how can I check the existing values in the database, compare them, and save?
My code is below
// POST: /Purchasing/Edit/5
[HttpPost]
public ActionResult Edit(PurchaseMaster purchasemaster)
{
    if (ModelState.IsValid)
    {
        if (purchasemaster.OrderStatus == "Received")
        {
            string myId = purchasemaster.PurchaseId;
            //check if the existing status is already set as Received or not
            PurchaseMaster pm = db.PurchaseMasters.Find(myId);
            if (pm.OrderStatus != "Received") //this will prevent duplicate stock updates
            {
                //load the items and loop through to update the stock
                List<PurchaseDetail> purchasedetails = db.PurchaseDetails.Where(x => x.PurchaseId == myId).ToList();
                foreach (PurchaseDetail singleitem in purchasedetails)
                {
                    string itemcode = singleitem.ItemCode;
                    Item item = db.Items.Find(itemcode);
                    item.QtyInHand = item.QtyInHand + singleitem.Quantity;
                    db.Entry(item).State = EntityState.Modified;
                    db.SaveChanges();
                }
            }
        }
        db.Entry(purchasemaster).State = EntityState.Modified;
        db.SaveChanges();
        return RedirectToAction("Index");
    }
    return View(purchasemaster);
}
Try this; it should work:

//don't get this object from the database
//PurchaseMaster pm = db.PurchaseMasters.Find(myId);
if (db.PurchaseMasters.Any(x => x.PurchaseId == myId && x.OrderStatus != "Received"))
{
    // Do your stuff
}
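If you do need the existing values for comparison, an equivalent alternative is to read them without tracking, so the posted entity remains the only tracked instance with that key (a sketch using the question's property names):

var currentStatus = db.PurchaseMasters
    .AsNoTracking()
    .Where(x => x.PurchaseId == myId)
    .Select(x => x.OrderStatus)
    .FirstOrDefault();

if (currentStatus != "Received")
{
    // Do your stuff
}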
I am trying to write a program that scans a directory containing TV show folders, looks up details about the shows via the TVRage API, and then saves the details to a database using Entity Framework.
My TVShow table's primary key is the show id taken from the TVRage database, and I am having issues when duplicate or similar folder names return the same show info. With a directory containing three folders, "Alias", "Alias 1", "Band of Brothers", I get the following output from my code:
* TV SHOWS *
Alias....... NO MATCH......ADDING........DONE
Alias 1 ...... NO MATCH.....ADDING....CANT ADD, ID ALREADY EXISTS IN DB
Band of Brothers ...... NO MATCH..ADDING....
Before getting an UpdateException on the context.SaveChanges(); line
Violation of PRIMARY KEY constraint 'PK_TVShows'.
Using SQL Profiler, I can see that the problem is my app trying to insert the Alias show a second time with a duplicate key, but I can't see why. When I step through the code on the second iteration of the foreach loop (the second "Alias" folder), the code that saves the show entity to the database is bypassed.
It is only on the next iteration of the foreach loop, when I have created a new TVShow entity for "Band of Brothers", that I actually reach the code which adds a TVShow to the context and saves, at which point the app crashes. In Visual Studio, at the point of the crash, I can see that:
the "show" entity in context.TVShows.AddObject(show) is "Band of Brothers" with a unique ID
context.TVShows contains only one record, the first Alias entity
But SQL Profiler shows that Entity Framework is instead inserting Alias a second time, and I am stumped as to why.
private void ScanForTVShowFolders(GenreDirectoryInfo drive)
{
    IEnumerable<DirectoryInfo> shows = drive.DirInfo.EnumerateDirectories();
    foreach (DirectoryInfo d in shows)
    {
        //showList contains a list of existing TV show names previously queried out of DB
        if (showList.Contains(d.Name))
        {
            System.Console.WriteLine(d.Name + ".....MATCH");
        }
        else
        {
            System.Console.Write(d.Name + "......NO MATCH..ADDING....");
            TVShow show = LookUpShowOnline(d.Name, drive.GenreName);
            if (show.Id == -1) // id of -1 means online search failed
            {
                System.Console.Write("..........CANT FIND SHOW" + Environment.NewLine);
            }
            else if (context.TVShows.Any(a => a.Id == show.Id)) //catch duplicate primary key insert
            {
                System.Console.Write(".......CANT ADD, ID ALREADY EXISTS IN DB" + Environment.NewLine);
            }
            else
            {
                context.TVShows.AddObject(show);
                context.SaveChanges();
                System.Console.Write("....DONE" + Environment.NewLine);
            }
        }
    }
}

private TVShow LookUpShowOnline(string name, string genre)
{
    string xmlPath = String.Format("http://services.tvrage.com/feeds/search.php?show='{0}'", name);
    TVShow aShow = new TVShow();
    aShow.Id = -1; // -1 = can't find
    XmlDocument xmlResp = new XmlDocument();
    try { xmlResp.Load(xmlPath); } catch (WebException e) { System.Console.WriteLine(e); }
    XmlNode root = xmlResp.FirstChild;
    if (root.NodeType == XmlNodeType.XmlDeclaration) { root = root.NextSibling; }
    XmlNode tvShowXML;
    //if (showXML["episode"] == null)
    //    return false;
    tvShowXML = root["show"];
    if (tvShowXML != null)
    {
        aShow.Id = System.Convert.ToInt16(tvShowXML["showid"].InnerText);
        aShow.Name = tvShowXML["name"].InnerText.Trim();
        aShow.StartYear = tvShowXML["started"].InnerText.Trim();
        aShow.Status = tvShowXML["status"].InnerText.Trim();
        aShow.TVGenre = context.TVGenres.Where(b => b.Name.Trim() == genre).Single();
    }
    return aShow;
}
Edit
Doing some more reading, I added context.ObjectStateManager to my debug watch list, and I can see that every time I create a new TVShow entity, a new record is added to _addedEntityStore. In fact, if I remove context.TVShows.AddObject(show) the code still updates the database, so manually adding to the context seems redundant.
If you are inserting objects in a foreach loop, it is better to compute the primary key outside the loop and increment it yourself:

eg: int newID = Shows.Select(d => d.Id).Max();
foreach (............)
{
    show.Id = newID++;
    // ...remaining fields...
    context.TVShows.AddObject(show);
}
context.SaveChanges();

It works for me...!!
Turns out context.TVShows.AddObject(show) is unnecessary in my case; I was inadvertently attaching every created show entity to the context when this query ran:
aShow.TVGenre = context.TVGenres.Where(b => b.Name.Trim() == genre).Single();
This is not what I wanted; I just wanted to create the object and then decide whether to add it. It will be pretty easy to fix now that I know why it's happening.
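For anyone hitting the same thing, the shape of the fix is to defer the relationship (and the add) until the decision is made, e.g. inside ScanForTVShowFolders (a sketch):

TVShow show = LookUpShowOnline(d.Name, drive.GenreName); // no longer touches the context
if (show.Id != -1 && !context.TVShows.Any(a => a.Id == show.Id))
{
    // Setting the navigation property to a tracked TVGenre is what attaches the show,
    // so only do it once we know we want to insert.
    show.TVGenre = context.TVGenres.Single(b => b.Name.Trim() == drive.GenreName);
    context.TVShows.AddObject(show);
    context.SaveChanges();
}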