How can I migrate a temp table using LINQ to SQL? - C#

I have two tables: contacts and contact_temps. The contact_temps table mirrors the contacts table. What I'm trying to do is simply pull records from the temp table and insert them into contacts. Afterwards I will remove those records from the contact_temps table.
The code below only migrates one record and doesn't delete anything from the temp table. How can I fix my issue? Thanks.
// migrate temp profile(s)...
var tempProfilesToMigrate = from ct in db.contact_temps
                            where ct.SessionKey == contact.Profile.SessionId
                            select new contact();
db.contacts.InsertAllOnSubmit(tempProfilesToMigrate);
db.SubmitChanges();
//...clear temp table records
var tempProfilesToDelete = from ct in db.contact_temps
                           where ct.SessionKey == contact.Profile.SessionId
                           select ct;
db.contact_temps.DeleteAllOnSubmit(tempProfilesToDelete);
db.SubmitChanges();

I wonder if your "Insert All on Submit" is causing the entities to become associated with db.contacts. Try this.
// migrate temp profile(s)...
var tempProfiles = from ct in db.contact_temps
                   where ct.SessionKey == contact.Profile.SessionId
                   select ct;
foreach (var c in tempProfiles)
{
    contact newC = new contact();   // same entity class that db.contacts holds
    newC.Name = c.Name;
    // copy other values
    db.contacts.InsertOnSubmit(newC);
}
// WAIT! Do it all at once in a single transaction => avoid db.SubmitChanges() here.
db.contact_temps.DeleteAllOnSubmit(tempProfiles);
// Both sets of changes go through in one transaction.
db.SubmitChanges();
You can also write a stored-proc and import it into the db context and just call it.

var result = db.ExecuteCommand("insert into contacts select * from contact_temps where SessionKey={0}", contact.Profile.SessionId);
Of course, that's just off the top of my head, but you get the idea. Even better would be to put the migration and deletion into a stored procedure. The method you're using will round trip all the contact_temp records twice (once back and forth for the insert, once for the delete).
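If you don't want a stored procedure, you can still collapse it to two set-based statements in one transaction with ExecuteCommand, with no entities materialized on the client. A rough sketch (the column list, Name and Email, is an assumption; substitute the real non-identity columns the two tables share):
using (var tx = new System.Transactions.TransactionScope())
{
    // Copy the matching temp rows into contacts. Listing the columns
    // explicitly (instead of select *) avoids writing to an identity column.
    db.ExecuteCommand(
        "insert into contacts (Name, Email /* ...other shared columns... */) " +
        "select Name, Email /* ...other shared columns... */ " +
        "from contact_temps where SessionKey = {0}",
        contact.Profile.SessionId);

    // Remove the migrated rows in the same transaction.
    db.ExecuteCommand(
        "delete from contact_temps where SessionKey = {0}",
        contact.Profile.SessionId);

    tx.Complete();   // commit both statements together
}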
P.S. Google "code first stored procedures" for a way to call stored procs using EF 4.1.

Related

Using SQLBulkCopy to Insert into Related Tables

I am using SQL bulk copy to read data from Excel into a SQL database. In the database, I have two tables into which I need to insert this data from Excel: Table A, and Table B, which uses the ID (an IDENTITY primary key) from Table A for its corresponding rows.
I am able to insert into one table (Table A) using the following Code.
using (SqlConnection connection = new SqlConnection(strConnection)) {
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection)) {
        bulkCopy.DestinationTableName = "dbo.[EMPLOYEEINFO]";
        try {
            // Write from the source to the destination.
            SqlBulkCopyColumnMapping NameMap = new SqlBulkCopyColumnMapping(data.Columns[0].ColumnName, "EmployeeName");
            SqlBulkCopyColumnMapping GMap = new SqlBulkCopyColumnMapping(data.Columns[1].ColumnName, "Gender");
            SqlBulkCopyColumnMapping CMap = new SqlBulkCopyColumnMapping(data.Columns[2].ColumnName, "City");
            SqlBulkCopyColumnMapping AMap = new SqlBulkCopyColumnMapping(data.Columns[3].ColumnName, "HomeAddress");
            bulkCopy.ColumnMappings.Add(NameMap);
            bulkCopy.ColumnMappings.Add(GMap);
            bulkCopy.ColumnMappings.Add(CMap);
            bulkCopy.ColumnMappings.Add(AMap);
            bulkCopy.WriteToServer(data);
        }
        catch (Exception ex) {
            Console.WriteLine(ex.Message);
        }
    }
}
But I am not sure how to extend this to two tables bound by a foreign key relationship, especially since Table B uses the identity value from Table A. Any example would be great. I googled it and none of the threads on SO gave a working example.
AFAIK bulk copy can only be used to upload into a single table. In order to achieve a bulk upload into two tables, you will therefore need two bulk uploads. Your problem comes from using a foreign key which is an identity. You can work around this, however. I am pretty sure that bulk copy uploads sequentially, which means that if you upload 1,000 records and the last record gets an ID of 10,197, then the ID of the first record is 9,198! So my recommendation would be to upload your first table, check the max id after the upload, deduct the number of records and work from there!
Of course, in a high-use database someone might insert after you, so you would need to get the top ID by selecting the record which matches your last one by other details (assuming a combination of (up to) all fields would be guaranteed to be unique). Only you know if this is likely to be a problem.
The alternative is not to use an identity column in the first place, but I presume you have no control over the design? In my younger days, I made the mistake of using identities, I never do now. They always find a way of coming back to bite!
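For the "check the max ID" step, a minimal sketch (it assumes the open SqlConnection and the data DataTable from the upload code above, an int IDENTITY column named Id on the destination table, and no concurrent inserts in the meantime; see the caveat above):
// Recover the identity range the first bulk copy was assigned.
int startID;
using (SqlCommand cmd = new SqlCommand("SELECT MAX(Id) FROM dbo.[EMPLOYEEINFO]", connection))
{
    int maxId = (int)cmd.ExecuteScalar();
    // first assigned ID = max ID - (number of rows uploaded - 1)
    startID = maxId - (data.Rows.Count - 1);
}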
For example, to add the second set of data:
DataTable secondTable = new DataTable("SecondTable");
secondTable.Columns.Add("ForeignKey", typeof(int));
secondTable.Columns.Add("DataField", typeof(yourDataType));
...
Then add the data to secondTable (how depends on the format of your second data set):
int cnt = 0;
foreach (var d in mySecondData)
{
    DataRow newRow = secondTable.NewRow();
    newRow["ForeignKey"] = cnt++;   // relative position; offset by startID below
    newRow["DataField"] = d.YourData;
    secondTable.Rows.Add(newRow);
}
Then, once you have found the starting identity (int startID, as sketched above):
for (int i = 0; i < secondTable.Rows.Count; i++)
{
    secondTable.Rows[i]["ForeignKey"] = (int)secondTable.Rows[i]["ForeignKey"] + startID;
}
Finally:
bulkCopy.DestinationTableName = "YourSecondTable";
bulkCopy.WriteToServer(secondTable);

Query with LINQ using a where clause

Okay, so I've already asked this question, but I've narrowed it down and am now able to word it better.
I have a SQL database and an ASP.NET MVC project with Entity Framework. I already figured out how to query the database and display all contents. But now I need to query the database and only display the rows where column "a" is greater than or equal to column "b".
Edit: the datatype of both columns is int.
Here is the query I need:
SELECT *
FROM Inventory
WHERE quantity <= statusLow
var context = new MyContext();
var query = context.Inventory.Where(p => p.quantity <= p.statusLow); // build the query (deferred execution)
var result = query.ToList(); // enumerating the result triggers the database call
You can try as shown below.
using (var db = new yourContext())
{
    var result = db.Inventory.Where(a => a.quantity <= a.statusLow).ToList();
}
You can learn more about LINQ to Entities here.

Pulling data from one SQL Azure table, add a column, then populate a different table

I am using C# and LINQ to pull/push data housed in SQL Azure. The basic scenario is that we have a Customer table that contains all customers (PK = CompanyID) and supporting tables like LaborTypes and Materials (FK CompanyID to the Customer table).
When a new customer signs up, a new record is created in the Customers table. Once that is complete, I want to load a set of default materials and labor types from a separate table. It would be simple enough if I just wanted to copy data directly from one table to another, but in order to populate the existing tables for the new customer, I need to take the seed data (e.g. laborType, laborDescription), add the CompanyID to each row of seed data, and then do the insert into the existing table.
What is the best method to accomplish this using C# and LINQ with SQL Azure?
An example of a direct insert from user input for LaborTypes is below for contextual reference.
using (var context = GetContext(memCustomer))
{
    var u = GetUserByUsername(context, memUser);
    var l = (from lbr in context.LaborTypes
             where lbr.LaborType1.ToLower() == laborType
                && lbr.Company == u.Company
             select lbr).FirstOrDefault();
    if (l == null)
    {
        l = new AccountDB.LaborType();
        l.Company = u.Company;
        l.Description = laborDescription;
        l.LaborType1 = laborType;
        l.FlatRate = flatRate;
        l.HourlyRate = hourlyRate;
        context.LaborTypes.InsertOnSubmit(l);
        context.SubmitChanges();
    }
    result = true;
}
What you'll want to do is write a query retrieving data from table B and do an Insert Statement on Table A using the result(s).
This has been covered elsewhere on SO, I think; here would be a good place to start.
I don't know the LINQ syntax specifically, but by constructing something similar to @Murph's answer behind that link, I think this might work.
var fromB = from b in TableB
            where ... ; // identify the row/data from table B
// you may need to make fromB populate from the DB here.
var toInsert = ...; // construct your object(s) with the desired data
// do other necessary things here
TableA.InsertAllOnSubmit(toInsert);
dc.SubmitChanges(); // submit your changes
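For the specific seed-data scenario in the question, a sketch in LINQ to SQL terms might look like the following. The seed table (DefaultLaborTypes), its columns, and newCompanyId are assumed names; substitute your real ones. Note the AsEnumerable() call: LINQ to SQL does not allow constructing a mapped entity type inside a database query, so the projection has to run on the client.
using (var context = GetContext(memCustomer))
{
    // Pull the seed rows, then project each one into a new LaborType
    // stamped with the new customer's CompanyID.
    var newLaborTypes = context.DefaultLaborTypes
        .AsEnumerable()   // materialize first; entity construction must happen client-side
        .Select(seed => new AccountDB.LaborType
        {
            Company = newCompanyId,
            LaborType1 = seed.LaborType1,
            Description = seed.Description,
            FlatRate = seed.FlatRate,
            HourlyRate = seed.HourlyRate
        });

    context.LaborTypes.InsertAllOnSubmit(newLaborTypes);
    context.SubmitChanges();
}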

Create a stored procedure to select data from multiple tables SQL Server WPF

I'm trying to get some information from the database and show it in a DataGrid, but it is quite slow because I am pulling data from multiple tables. I need to speed it up. Here is how my database tables are structured:
TestPack (Id, test_pack_no, train_no, .....)
Sheet (id, **testPackId**, sheet_no, ....)
Spool (id, **sheetId**, spool_no, bore_size, ....)
FieldJoint(id, **spoolId**, thickness, size, ...)
Here is the code which creates a List of FieldJoints (a custom class with the required fields from TestPack, Sheet, Spool and FieldJoint):
foreach (var tp in allTestPacks)
{
    foreach (var sheet in tp.Sheets)
    {
        foreach (var spool in sheet.Spools)
        {
            foreach (var joint in spool.FieldJoints)
            {
                var newJoint = new FieldJoint
                {
                    TestPackNo = tp.test_pack_no,
                    TrainNo = tp.train_no,
                    IsometricNo = sheet.sheet_no,   // sheet_no lives on Sheet, not Spool
                    SpoolNo = spool.spool_no,
                    BoreSize = spool.bore_size,
                    Thickness = joint.joint_thickness.Value,
                    JointSize = joint.joint_size.Value,
                };
                _fieldJointsInfo.Add(newJoint);
            }
        }
    }
}
I believe that if I can create a stored procedure that runs directly on the server where the database is hosted, it will speed up the performance. How can I convert the above code into a stored procedure that takes a string "trainNo" as input and returns all the above information based on that trainNo, which is a column in the TestPack table?
If I understand this correctly, you want to read related data. You do not show all your code, but reading data in nested for-loops sounds like a pain in the neck...
What about a function like this?
Attention: Make sure to have indexes on your key and foreign key columns...
CREATE FUNCTION dbo.GetTestPackDetails
(
    @TrainNo VARCHAR(100)
)
RETURNS TABLE
AS
RETURN
    SELECT tp.Id AS TestPackId
          ,tp.test_pack_no
          ,tp.train_no
          ,sh.id AS SheetId
          ,sh.sheet_no
          ,sp.id AS SpoolId
          ,sp.spool_no
          ,sp.bore_size
          ,fj.id AS FieldJointId
          ,fj.thickness
          ,fj.size
          --add more columns...
    FROM TestPack AS tp
    INNER JOIN Sheet AS sh ON tp.Id = sh.testPackId
    INNER JOIN Spool AS sp ON sh.id = sp.sheetId
    INNER JOIN FieldJoint AS fj ON sp.id = fj.spoolId
    WHERE tp.train_no = @TrainNo;
You can call this with
SELECT * FROM dbo.GetTestPackDetails('TheTrainNo');
In C# you might create a typed DataSet directly from this function.
Btw: I would not use a Stored Procedure just to read data. SPs are meant to do something...
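For instance, a minimal ADO.NET sketch that reads the function's result set into a DataTable (connectionString and trainNo are assumed to come from your context):
// using System.Data; using System.Data.SqlClient;
var results = new DataTable();
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT * FROM dbo.GetTestPackDetails(@TrainNo)", conn))
{
    cmd.Parameters.AddWithValue("@TrainNo", trainNo);
    using (var adapter = new SqlDataAdapter(cmd))
    {
        adapter.Fill(results);   // one round trip; the rows arrive already joined
    }
}
// Bind "results" to the DataGrid, or map its rows onto your FieldJoint class.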

Fast insert relational(normalized) data tables into SQL Server 2008 database

I'm trying to find a better and faster way to insert a pretty massive amount of data (~50K rows) than the LINQ I'm using now.
The data I'm trying to write to a local database is a list of ORM-mapped entities, serialized and received from WCF.
I'm keen on using SqlBulkCopy, but the problem is that the tables are normalized and are actually a sequence of interconnected tables with one-to-many relationships.
Here's some code that illustrates my point:
foreach (var meeting in meetingsList)
{
    int meetingId = dbAccess.InsertM(value1, value2...);
    foreach (var competition in meeting.COMPETITIONs)
    {
        int competitionId = dbAccess.InsertC(meetingId, value1, value2...);
        foreach (var competitor in competition.COMPETITORs)
        {
            int competitorId = dbAccess.InsertCO(competitionId, value1, ...);
            // and so on
        }
    }
}
where dbAccess.InsertM looks something like this:
// check if the meeting exists
int meetingId = GetMeeting(meeting, date);
if (meetingId == 0)
{
    // if the meeting doesn't exist, insert a new one
    var m = new MEETING
    {
        NAME = name,
        DATE = date
    };
    // InsertOnSubmit lives on Table<MEETING>, here assumed to be _db.MEETINGs
    _db.MEETINGs.InsertOnSubmit(m);
    _db.SubmitChanges();
}
Thanks in advance for any answers.
Bojan
I would still use SqlBulkCopy to quickly copy your data from the source into a staging table that has the same (flat) structure as the source data (you'll need to create that table ahead of time).
Once it's loaded, you can split up the data across the multiple tables using e.g. a stored procedure - that should be pretty fast since everything is on the server already.
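A sketch of that two-step shape, assuming a flat staging table dbo.Meetings_Staging, a splitting procedure dbo.SplitStagingData, and a DataTable (flatMeetingsTable) you have built from meetingsList; all of these names are placeholders:
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // Step 1: one fast, flat bulk copy into the staging table.
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.Meetings_Staging";
        bulk.WriteToServer(flatMeetingsTable);
    }

    // Step 2: let the server split the staged rows into the
    // normalized MEETING / COMPETITION / COMPETITOR tables.
    using (var cmd = new SqlCommand("dbo.SplitStagingData", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}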
