I am new to LINQ to SQL and have the following scenario: I have two tables, MasterTable and DetailTable.
What I am trying to do is insert new rows into DetailTable and then, based on the DetailTable rows, update MasterTable, all in one transaction.
Here is my code :
DBContext context = new DBContext();
context.Connection.Open();
context.Transaction = context.Connection.BeginTransaction();
DetailTable detail = new DetailTable();
detail.Amount = 100;
var detailTable = context.GetTable<DetailTable>();
// pass in the object with insert on submit
// and then submit changes
detailTable.InsertOnSubmit(detail);
var result = (from Total in context.MasterTable
              select Total.Amount).Sum();   // assuming an Amount column on MasterTable
decimal total = (decimal)result; // This total is not the latest.
// UpdateMaster.....
// ................
context.SubmitChanges();
context.Transaction.Commit();
The problem I am facing is that I am not getting the latest sum from MasterTable. For example, after inserting a new row with an amount of 100, I should get 600, but I still get 500 (the sum as if I had not inserted the new row).
Please let me know whether this is possible with LINQ to SQL and, if it is, how; or whether I am trying to achieve something that is not possible.
Try putting context.SubmitChanges(); above the query that computes the total.
The data in your DataContext is stale. LINQ to SQL will not apply pending inserts until you call SubmitChanges.
So what you need to do is one of the following:
decimal total = (decimal)result + detail.Amount;
or indeed what Jan P. suggested above. That will work too, since you are managing the transaction yourself.
Additionally: why are you opening the connection yourself? There is no need to do so in this case.
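For completeness, here is a minimal sketch of the second option: flush the pending insert with SubmitChanges() before computing the sum, so the SELECT runs inside the same (still uncommitted) transaction and sees the new row. The Amount column and the choice to sum DetailTable (the table receiving the insert) are assumptions based on the question; adjust to your actual schema.

using (DBContext context = new DBContext())
{
    context.Connection.Open();
    context.Transaction = context.Connection.BeginTransaction();
    try
    {
        DetailTable detail = new DetailTable { Amount = 100 };
        context.GetTable<DetailTable>().InsertOnSubmit(detail);

        // Flush the pending insert to the database; it is still uncommitted,
        // but subsequent queries on this context/transaction will see it.
        context.SubmitChanges();

        // The sum now includes the new row (summing DetailTable here, since
        // that is the table the new row went into).
        decimal total = context.GetTable<DetailTable>().Sum(d => d.Amount);

        // ... update the master row(s) using total ...

        context.SubmitChanges();
        context.Transaction.Commit();
    }
    catch
    {
        context.Transaction.Rollback();
        throw;
    }
}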
I have a list of rows that I want to insert in one batch (add X rows, single call to SaveChanges). Unfortunately, from time to time, some of the items in the list already exist. Since the insertion process is taking place in a transaction (all or nothing), nothing gets added.
Code to show the idea:
using (var context = new CacheDbContext())
{
    context.Counter.Add(new Counter
    {
        Id = "test-3",
        CounterType = "test",
        Expiry = DateTime.UtcNow.AddHours(1),
        Value = 0
    });

    context.Counter.Add(new Counter
    {
        Id = "test-2",
        CounterType = "test",
        Expiry = DateTime.UtcNow.AddHours(1),
        Value = 0
    });

    await context.SaveChangesAsync().ConfigureAwait(false);
}
My goal is to do the insert and, if one or more items already exist, ignore them.
The naive solution is to check whether each ID exists before inserting it. This works, but it performs poorly. I want to execute multiple inserts with one call.
I know this is possible in SQL, for example:
INSERT INTO table_name(c1)
VALUES(c1)
ON DUPLICATE KEY UPDATE c1 = VALUES(c1) + 1;
If EF would translate my INSERT into something like this, that would be good enough for me.
Is this possible?
Any other solution will be welcome.
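One possible approach, short of a true upsert: fetch the IDs that already exist with a single Contains query, then add only the missing rows before the one SaveChangesAsync call. This is only a sketch of the idea, assuming Counter.Id is the primary key; it is not an ON DUPLICATE KEY translation.

var counters = new List<Counter>
{
    new Counter { Id = "test-3", CounterType = "test", Expiry = DateTime.UtcNow.AddHours(1), Value = 0 },
    new Counter { Id = "test-2", CounterType = "test", Expiry = DateTime.UtcNow.AddHours(1), Value = 0 }
};

using (var context = new CacheDbContext())
{
    var ids = counters.Select(c => c.Id).ToList();

    // One SELECT ... WHERE Id IN (...) round trip to find the duplicates.
    // ToListAsync requires Microsoft.EntityFrameworkCore.
    var existingIds = await context.Counter
        .Where(c => ids.Contains(c.Id))
        .Select(c => c.Id)
        .ToListAsync()
        .ConfigureAwait(false);

    // Add only the counters that are not already in the table.
    context.Counter.AddRange(counters.Where(c => !existingIds.Contains(c.Id)));

    await context.SaveChangesAsync().ConfigureAwait(false);
}

Note that there is still a race window between the SELECT and the insert; a raw SQL upsert (MERGE, or ON DUPLICATE KEY on MySQL) executed through context.Database.ExecuteSqlRaw would be the more robust alternative.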
Okay so I've already asked this question but I've narrowed it down and am now able to word it better.
I have a SQL database and an ASP.NET MVC project using Entity Framework. I have already figured out how to query the database and display all of its contents. Now I need to query the database and only display the rows where column "a" is greater than or equal to column "b".
Edit: the data types in both columns are int.
Here is the query I need
Select *
from Inventory
Where quantity <= statusLow
var context = new MyContext();
var query = context.Inventory.Where(p => p.quantity <= p.statusLow); // build the query (deferred execution)
var result = query.ToList(); // materialize the result; this is what hits the database
You can try as shown below.
using (var db = new yourContext())
{
    var result = db.Inventory.Where(a => a.quantity <= a.statusLow).ToList();
}
You can learn more about LINQ to Entities here.
I am using C# and LINQ to pull/push data housed in SQL Azure. The basic scenario is that we have a Customer table that contains all customers (PK = CompanyID) and supporting tables like LaborTypes and Materials (FK CompanyID to the Customer table).
When a new customer signs up, a new record is created in the Customers table. Once that is complete, I want to load a set of default materials and labor types from a separate seed table. It would be simple enough if I just wanted to copy data directly from one table to another, but in order to populate the existing tables for the new customer, I need to take the seed data (e.g. laborType, laborDescription), add the CompanyID to each row of seed data, and then insert into the existing tables.
What is the best method to accomplish this using C# and LINQ with SQL Azure?
An example of a direct insert from user input for LaborTypes is below for contextual reference.
using (var context = GetContext(memCustomer))
{
    var u = GetUserByUsername(context, memUser);

    var l = (from lbr in context.LaborTypes
             where lbr.LaborType1.ToLower() == laborType
                && lbr.Company == u.Company
             select lbr).FirstOrDefault();

    if (l == null)
    {
        l = new AccountDB.LaborType();
        l.Company = u.Company;
        l.Description = laborDescription;
        l.LaborType1 = laborType;
        l.FlatRate = flatRate;
        l.HourlyRate = hourlyRate;
        context.LaborTypes.InsertOnSubmit(l);
        context.SubmitChanges();
    }

    result = true;
}
What you'll want to do is write a query retrieving the data from table B and then do an insert on table A using the result(s).
This has been covered elsewhere on SO, I think; here would be a good place to start.
I don't know the LINQ syntax offhand, but by constructing something similar to @Murph's answer beyond that link, I think this might work.
var fromB = from b in TableB
            where ... ;        // identify the row(s)/data from table B

// you may need to materialize fromB from the DB here.
var toInsert = ...;            // construct your object(s) with the desired data

// do other necessary things here
TableA.InsertAllOnSubmit(toInsert);
dc.SubmitChanges();            // submit your changes
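Applying that idea to the question's tables, a sketch might look like the following. DefaultLaborTypes and newCompanyId are hypothetical names for the seed table and the new customer's ID; the LaborType property names come from the code above. Note that LINQ to SQL does not allow constructing a mapped entity type inside a query, so the seed rows are materialized first and projected in memory.

using (var context = GetContext(memCustomer))
{
    // Pull the seed rows into memory first (LINQ to SQL will not project
    // directly into a mapped entity type inside the query).
    var seedRows = context.DefaultLaborTypes.ToList();   // hypothetical seed table

    var newLaborTypes = seedRows
        .Select(seed => new AccountDB.LaborType
        {
            Company = newCompanyId,            // the freshly created customer's CompanyID
            LaborType1 = seed.LaborType1,
            Description = seed.Description,
            FlatRate = seed.FlatRate,
            HourlyRate = seed.HourlyRate
        })
        .ToList();

    context.LaborTypes.InsertAllOnSubmit(newLaborTypes);
    context.SubmitChanges();
}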
I'm working on an import from a CSV file to my ASP.NET MVC3/C#/Entity Framework Application.
Currently this is my code, but I'm looking to optimise:
var excel = new ExcelQueryFactory(file);
var data = from c in excel.Worksheet(0)
           select c;
var dataList = data.ToList();

List<FullImportExcel> importList = new List<FullImportExcel>();

foreach (var s in dataList.ToArray())
{
    if ((s[0].ToString().Trim().Length < 6) && (s[1].ToString().Trim().Length < 7))
    {
        FullImportExcel item = new FullImportExcel();
        item.Carrier = s[0].ToString().Trim();
        item.FlightNo = s[1].ToString().Trim();
        item.CodeFlag = s[2].ToString().Trim();
        //etc etc (50 more columns here)
        importList.Add(item);
    }
}

PlannerEntities context = new PlannerEntities();
context.Configuration.AutoDetectChangesEnabled = false;
int count = 0;
foreach (var item in importList)
{
    ++count;
    context = AddToFullImportContext(context, item, count, 100, true);
}

private PlannerEntities AddToFullImportContext(PlannerEntities context, FullImportExcel entity, int count, int commitCount, bool recreateContext)
{
    context.Set<FullImportExcel>().Add(entity);

    if (count % commitCount == 0)
    {
        context.SaveChanges();
        if (recreateContext)
        {
            context.Dispose();
            context = new PlannerEntities();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }

    return context;
}
This works fine, but isn't as quick as it could be, and the import that I'm going to need to do will be a minimum of 2 million lines every month. Are there any better methods out there for bulk imports?
Am I better off avoiding EF altogether and using SqlConnection and inserting that way?
Thanks
I do like how you're only committing records every X number of records (100 in your case.)
I've recently written a system that once a month, needed to update the status of upwards of 50,000 records in one go - this is updating each record and inserting an audit record for each updated record.
Originally I wrote this with the entity framework, and it took 5-6 minutes to do this part of the task. SQL Profiler showed me it was doing 100,000 SQL queries - one UPDATE and one INSERT per record (as expected I guess.)
I changed this to a stored procedure which takes a comma-separated list of record IDs, the status and user ID as parameters, which does a mass-update followed by a mass-insert. This now takes 5 seconds.
In your case, for this number of records, I'd recommend creating a BULK IMPORT file and passing that over to SQL to import.
http://msdn.microsoft.com/en-us/library/ms188365.aspx
For a large number of inserts, SQL Server bulk copy is the fastest way. You can use the SqlBulkCopy class to access bulk copy from code. You will have to create an IDataReader for your List, or you can use this IDataReader for inserting generic Lists that I have written.
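If writing a custom IDataReader is more than you need, a DataTable works with SqlBulkCopy as well. A rough sketch against the question's FullImportExcel list, with only three of the ~50 columns shown; the destination table name and connectionString are assumptions to adjust:

// Requires System.Data and System.Data.SqlClient.
var table = new DataTable();
table.Columns.Add("Carrier", typeof(string));
table.Columns.Add("FlightNo", typeof(string));
table.Columns.Add("CodeFlag", typeof(string));

foreach (var item in importList)
{
    table.Rows.Add(item.Carrier, item.FlightNo, item.CodeFlag);
}

using (var bulk = new SqlBulkCopy(connectionString))    // connection string to the same database
{
    bulk.DestinationTableName = "dbo.FullImportExcels"; // adjust to the real table name
    bulk.BatchSize = 5000;                              // optional: send rows in batches
    bulk.ColumnMappings.Add("Carrier", "Carrier");
    bulk.ColumnMappings.Add("FlightNo", "FlightNo");
    bulk.ColumnMappings.Add("CodeFlag", "CodeFlag");
    bulk.WriteToServer(table);
}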
Thanks to Andy for the heads up - this was the code used in SQL, with a little help from the ever helpful, Pinal Dave - http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/ :)
DECLARE @bulkinsert NVARCHAR(2000)
DECLARE @filepath NVARCHAR(100)

SET @filepath = 'C:\Users\Admin\Desktop\FullImport.csv'

SET @bulkinsert =
    N'BULK INSERT FullImportExcel2s FROM ''' +
    @filepath +
    N''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'

EXEC sp_executesql @bulkinsert
Still got a bit of work to do to work it into the code, but we're down to 25 seconds for 50000 rows instead of an hour, so a huge improvement!
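Working it into the application could be as simple as running that same statement through a plain SqlCommand. A sketch, with the path and table name taken from the T-SQL above and a hypothetical connectionString:

// Requires System.Data.SqlClient. In a verbatim string the \n stays literal,
// which is exactly what BULK INSERT expects for ROWTERMINATOR.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"BULK INSERT FullImportExcel2s
      FROM 'C:\Users\Admin\Desktop\FullImport.csv'
      WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')", conn))
{
    conn.Open();
    cmd.CommandTimeout = 300;   // bulk loads can outlast the 30-second default
    cmd.ExecuteNonQuery();
}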
I need to write some code to insert around 3 million rows of data.
At the same time I need to insert the same number of companion rows.
I.e. schema looks like this:
Item
- Id
- Title
Property
- Id
- FK_Item
- Value
My first attempt was something vaguely like this:
BaseDataContext db = new BaseDataContext();

foreach (var value in values)
{
    Item i = new Item() { Title = value["title"] };
    ItemProperty ip = new ItemProperty() { Item = i, Value = value["value"] };
    db.Items.InsertOnSubmit(i);
    db.ItemProperties.InsertOnSubmit(ip);
}

db.SubmitChanges();
Obviously this was terribly slow so I'm now using something like this:
BaseDataContext db = new BaseDataContext();

DataTable dt = new DataTable("Item");
dt.Columns.Add("Title", typeof(string));

foreach (var value in values)
{
    DataRow item = dt.NewRow();
    item["Title"] = value["title"];
    dt.Rows.Add(item);
}

using (System.Data.SqlClient.SqlBulkCopy sb = new System.Data.SqlClient.SqlBulkCopy(db.Connection.ConnectionString))
{
    sb.DestinationTableName = "dbo.Item";
    sb.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Title", "Title"));
    sb.WriteToServer(dt);
}
But this doesn't allow me to add the corresponding 'Property' rows.
I'm thinking the best solution might be to add a Stored Procedure like this one that generically lets me do a bulk insert (or at least multiple inserts, but I can probably disable logging in the stored procedure somehow for performance) and then returns the corresponding ids.
Can anyone think of a better (i.e. more succinct, near equal performance) solution?
To combine the two best previous answers and add the missing piece for the IDs:
1) Use BCP to load the data into a temporary "staging" table, defined like this:
CREATE TABLE stage(Title VARCHAR(??), Value {whatever});
and you'll need the appropriate index for performance later:
CREATE INDEX ix_stage ON stage(Title);
2) Use SQL INSERT to load the Item table:
INSERT INTO Item(Title) SELECT Title FROM stage;
3) Finally load the Property table by joining stage with Item:
INSERT INTO Property(FK_ItemID, Value)
SELECT Item.Id, stage.Value
FROM stage
JOIN Item ON Item.Title = stage.Title
The best way to move that much data into SQL Server is bcp. Assuming that the data starts in some sort of file, you'll need to write a small script to funnel the data into the two tables. Alternately you could use bcp to funnel the data into a single table and then use an SP to INSERT the data into the two tables.
Bulk copy the data into a temporary table, and then call a stored proc that splits the data into the two tables you need to populate.
You can bulk copy in code as well, using the .NET SqlBulkCopy class.
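Putting those last two answers together, a sketch of the staging-table route in code. Table and column names are taken from the steps above; stagingTable is assumed to be a DataTable holding the Title/Value pairs, connectionString is hypothetical, and the join assumes titles are unique.

// Requires System.Data and System.Data.SqlClient.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        // 1) Bulk copy the raw rows into the staging table.
        using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
        {
            bulk.DestinationTableName = "dbo.stage";
            bulk.WriteToServer(stagingTable);   // DataTable with Title and Value columns
        }

        // 2) + 3) Split the staged rows into Item and Property in one batch.
        var splitSql =
            "INSERT INTO Item(Title) SELECT Title FROM stage; " +
            "INSERT INTO Property(FK_ItemID, Value) " +
            "SELECT Item.Id, stage.Value FROM stage JOIN Item ON Item.Title = stage.Title;";

        using (var cmd = new SqlCommand(splitSql, conn, tx))
        {
            cmd.ExecuteNonQuery();
        }

        tx.Commit();
    }
}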