INSERT INTO two tables in one query - C#

How can I insert values into two tables at once?
If either insert is not successful, both tables should roll back.
I am using SQL Server, and the query is passed through C# code.

You could either run the two queries as one statement
insert into table1 (...) values (...); insert into table2 (...) values (...)
or write a trigger to do the second INSERT.
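To get the rollback behaviour you ask for, you can wrap the batch in a transaction from C#. A minimal sketch using SqlTransaction; the table and column names are placeholders:
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlTransaction transaction = connection.BeginTransaction())
    {
        try
        {
            // Both INSERTs run on the same connection and transaction.
            using (var command = new SqlCommand(
                "insert into table1 (col1) values (@value1); " +
                "insert into table2 (col2) values (@value2);",
                connection, transaction))
            {
                command.Parameters.AddWithValue("@value1", value1);
                command.Parameters.AddWithValue("@value2", value2);
                command.ExecuteNonQuery();
            }
            transaction.Commit(); // both rows are kept together
        }
        catch
        {
            transaction.Rollback(); // neither table is changed
            throw;
        }
    }
}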

I would typically write a stored procedure to take in all of the values you want to write out, then call a series of INSERT INTO statements wrapped in a transaction.
If you provide more information, such as table structure and sample data, we can help you further.

You can find answers to your question here:
SQL Server: Is it possible to insert into two tables at the same time?
How can I INSERT data into two tables simultaneously in SQL Server?

Related

T-SQL: merge or upsert child table

I have three tables with data as shown in the following screenshot:
I have a working MERGE statement for the Users table that is sent to the SQL Server from C#. The MERGE statement correctly merges data into the ICS_USERS table. The #temp table is also created and populated in C#:
MERGE INTO ICS_Users AS Target
USING #temp AS Source ON Target.UserID = Source.UserID
WHEN MATCHED THEN
UPDATE
SET Target.UserName = Source.UserName, Target.Active = Source.Active
WHEN NOT MATCHED THEN
INSERT (UserName, Active, UserInitials)
VALUES (Source.UserName, Source.Active, Source.UserInitials);
I want to allow the user to add/change/delete the role for a user and send in the MERGE statement to handle it. Note that the user will only ever be allowed to change a single user at one time.
How do I change the merge statement to account for the Role and User/Role associative table?
You can follow these steps to achieve this:
Create a stored procedure that accepts all the input parameters required to join, insert, update, and delete values using the source and target. Make every parameter optional (nullable) so that the MERGE can run even without input values.
Insert these values into a table variable that has the same columns and data types as your target tables.
Union your actual source table with that table variable, and use the result as the source in your MERGE logic.
This handles both requirements in a single MERGE statement; even when the input columns are null, the first query in the union will always supply values to drive the MERGE logic.
Try it; if it does not work, leave a comment and I will provide the SQL script. I am intentionally not posting the SQL here so that you can read up, try it yourself, and write the logic in SQL.

How to insert data into a database by fetching data from another database?

I have two databases, for example A and B. I have already made a Crystal Report which takes values from database B. Now I want to upload this report to database A with a unique ID. When the user views the report, the program will fetch the report from database A, and the report will take its data from database B. I need some suggestions.
We can copy all columns from one table to another, existing table:
INSERT INTO table2
SELECT * FROM table1;
Or we can copy only the columns we want to into another, existing table:
INSERT INTO table2
(column_name(s))
SELECT column_name(s)
FROM table1;
Assuming that you are calling a stored procedure to fetch data for your report:
in your SP, after the SELECT statement for your report, you can fire INSERT queries into database A from database B itself.
If your databases are on different servers and the two servers can talk to each other, you could do it with a SQL INSERT statement.
First, run the following on your first server:
Execute sp_addlinkedserver SERVER_NAME1
Then just create the insert statement:
INSERT INTO [SERVER_NAME1].DATABASE_NAME.dbo.TABLE_NAME (Names_of_Columns_to_be_inserted)
SELECT Names_of_Columns_to_be_inserted
FROM TABLE_NAME
If both the databases are on the same server then you can exclude the [SERVER_NAME1] part from the above query.
If this doesn't solve your problem, please elaborate on your question.
Hope this helps

Using a SQL DataReader to capture inserted or deleted values

I'm currently calling a stored procedure from a .net application that inserts records into the database. However, I need to get a list of the records that I've just inserted successfully.
I know that I could return the inserted rows from the stored procedure directly, but I was hoping there was a way to do this programmatically in C#.
Is it possible to use the SqlDataReader class to achieve this functionality, i.e. reading from the inserted/deleted tables? Or is there some other class that can accommodate this request?
The inserted/deleted tables are available in the OUTPUT clause of the INSERT statement.
You could use the OUTPUT clause in the INSERT in your stored procedure and use a SqlDataReader to pick up the result set.
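A minimal sketch of that pattern; the table, columns, and connection string are placeholders:
// OUTPUT INSERTED.* streams the rows that were actually inserted back as a result set.
string sql = "INSERT INTO MyTable (Name) OUTPUT INSERTED.Id, INSERTED.Name VALUES (@name);";
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(sql, connection))
{
    command.Parameters.AddWithValue("@name", "example");
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Each row here is a record the INSERT created.
            Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
        }
    }
}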

Execute multiple queries in one shot in SQLite C#?

I know how to execute single statements, but is there a way to execute a block of statements easily? I just want to delete a column from a table.
BEGIN TRANSACTION;
CREATE TEMPORARY TABLE t1_backup(a,b);
INSERT INTO t1_backup SELECT a,b FROM t1;
DROP TABLE t1;
CREATE TABLE t1(a,b);
INSERT INTO t1 SELECT a,b FROM t1_backup;
DROP TABLE t1_backup;
COMMIT;
It looks like the only way is to execute each statement as a separate query inside a transaction. I wish there were an API to execute a bunch of queries at once.
It looks like you have this solved already. I don't think there's a better solution than what you posted in your question.
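For what it's worth, a sketch assuming the System.Data.SQLite provider, whose SQLiteCommand will generally run a multi-statement batch passed as a single CommandText:
// using System.Data.SQLite;
string script = @"
BEGIN TRANSACTION;
CREATE TEMPORARY TABLE t1_backup(a,b);
INSERT INTO t1_backup SELECT a,b FROM t1;
DROP TABLE t1;
CREATE TABLE t1(a,b);
INSERT INTO t1 SELECT a,b FROM t1_backup;
DROP TABLE t1_backup;
COMMIT;";
using (var connection = new SQLiteConnection("Data Source=mydb.sqlite"))
using (var command = new SQLiteCommand(script, connection))
{
    connection.Open();
    command.ExecuteNonQuery(); // the provider prepares and runs each statement in order
}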

How to improve SQL query performance in my case

I have a table whose schema is very simple: an ID column as the unique primary key (uniqueidentifier type) and some other nvarchar columns. My current goal is, for 5000 inputs, to determine which ones are already contained in the table and which are not. The inputs are strings, and I have a C# function which converts a string into a uniqueidentifier (GUID). My logic is: if the ID already exists, then I treat the string as already contained in the table.
My question is: if I need to find out which of the 5000 input strings are already contained in the DB and which are not, what is the most efficient way?
BTW: my current implementation converts each string to a GUID using C# code, then invokes a stored procedure which queries whether that ID exists in the database and returns the result to the C# code.
My working environment: VSTS 2008 + SQL Server 2008 + C# 3.5.
My first instinct would be to pump your 5000 inputs into a single-column temporary table X, possibly index it, and then use:
SELECT X.thecol
FROM X
JOIN ExistingTable ON ExistingTable.thecol = X.thecol
to get the ones that are present, and (if both sets are needed)
SELECT X.thecol
FROM X
LEFT JOIN ExistingTable ON ExistingTable.thecol = X.thecol
WHERE ExistingTable.thecol IS NULL
to get the ones that are absent. Worth benchmarking, at least.
Edit: as requested, here are some good docs & tutorials on temp tables in SQL Server. Bill Graziano has a simple intro covering temp tables, table variables, and global temp tables. Randy Dyess and SQL Master discuss performance issues for and against them (but remember that if you're getting performance problems you should benchmark alternatives, not just go on theoretical considerations).
MSDN has articles on tempdb (where temp tables are kept) and optimizing its performance.
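A sketch of the temp-table approach from C#, using SqlBulkCopy to load the candidate GUIDs; the table and column names are placeholders:
// using System.Data; using System.Data.SqlClient;
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Session-scoped temp table lives as long as this connection.
    using (var create = new SqlCommand(
        "CREATE TABLE #Candidates (Id UNIQUEIDENTIFIER PRIMARY KEY);", connection))
    {
        create.ExecuteNonQuery();
    }
    var table = new DataTable();
    table.Columns.Add("Id", typeof(Guid));
    foreach (Guid id in candidateIds) // candidateIds: the 5000 GUIDs converted in C#
        table.Rows.Add(id);
    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "#Candidates";
        bulk.WriteToServer(table);
    }
    // One round trip to find the candidates that are absent from the real table.
    using (var query = new SqlCommand(
        "SELECT c.Id FROM #Candidates c " +
        "LEFT JOIN ExistingTable e ON e.ID = c.Id " +
        "WHERE e.ID IS NULL;", connection))
    using (SqlDataReader reader = query.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetGuid(0)); // not yet in the table
    }
}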
Step 1. Make sure you have a problem to solve. Five thousand inserts isn't a lot to insert one at a time in a lot of contexts.
Are you certain that the simplest way possible isn't sufficient? What performance issues have you measured so far?
What do you need to do with the entries that do or don't exist in your table?
Depending on what you need, the new MERGE statement in SQL Server 2008 might fit the bill: update what's already there, insert new rows, all wrapped neatly into a single SQL statement. Check it out!
http://blogs.conchango.com/davidportas/archive/2007/11/14/SQL-Server-2008-MERGE.aspx
http://www.sql-server-performance.com/articles/dba/SQL_Server_2008_MERGE_Statement_p1.aspx
http://blogs.msdn.com/brunoterkaly/archive/2008/11/12/sql-server-2008-merge-capability.aspx
Your statement would look something like this:
MERGE INTO
(your target table) AS t
USING
(your source table, e.g. a temporary table) AS s
ON t.ID = s.ID
WHEN NOT MATCHED THEN -- the row does not exist in the base table
....(do whatever you need to do)
WHEN MATCHED THEN -- the row exists in the base table
... (do whatever else you need to do)
;
To make this really fast, I would load the "new" records from e.g. a TXT or CSV file into a temporary table in SQL server using BULK INSERT:
BULK INSERT YourTemporaryTable
FROM 'c:\temp\yourimportfile.csv'
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR =' |\n'
)
BULK INSERT combined with MERGE should give you the best performance you can get on this planet :-)
Marc
PS: here's a note from TechNet on MERGE performance and why it's faster than individual statements:
In SQL Server 2008, you can perform multiple data manipulation language (DML) operations in a single statement by using the MERGE statement. For example, you may need to synchronize two tables by inserting, updating, or deleting rows in one table based on differences found in the other table. Typically, this is done by executing a stored procedure or batch that contains individual INSERT, UPDATE, and DELETE statements. However, this means that the data in both the source and target tables are evaluated and processed multiple times; at least once for each statement.
By using the MERGE statement, you can replace the individual DML statements with a single statement. This can improve query performance because the operations are performed within a single statement, therefore, minimizing the number of times the data in the source and target tables are processed. However, performance gains depend on having correct indexes, joins, and other considerations in place. This topic provides best practice recommendations to help you achieve optimal performance when using the MERGE statement.
Try to ensure you end up running only one query - i.e. if your solution consists of running 5000 queries against the database, that'll probably be the biggest consumer of resources for the operation.
If you can insert the 5000 IDs into a temporary table, you could then write a single query to find the ones that don't exist in the database.
If you want simplicity, since 5000 records is not very many, then from C# just use a loop to generate an INSERT statement for each of the strings you want to add to the table. Wrap each INSERT in a TRY...CATCH block and send them all up to the server in one shot, like this:
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
If you have a unique index or primary key defined on your string GUID, then the duplicate inserts will fail. Checking ahead of time to see whether the record exists just duplicates work that SQL Server is going to do anyway.
If performance is really important, then consider downloading the 5000 GUIDs to your local station and doing all the analysis locally. Reading 5000 GUIDs should take much less than 1 second. This is simpler than bulk importing into a temp table (which is the only way you will get performance from a temp table) and doing an update using a join to the temp table.
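A sketch of that local-analysis approach; the table and column names are placeholders:
// Pull the existing GUIDs down once, then test all 5000 candidates in memory.
var existing = new HashSet<Guid>();
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT ID FROM ExistingTable;", connection))
{
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
            existing.Add(reader.GetGuid(0));
    }
}
foreach (Guid candidate in candidateIds) // candidateIds: the 5000 GUIDs converted in C#
{
    // HashSet gives an O(1) membership test per candidate.
    Console.WriteLine("{0} -> {1}", candidate, existing.Contains(candidate) ? "found" : "missing");
}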
Since you are using SQL Server 2008, you could use table-valued parameters: a way to provide a table as a parameter to a stored procedure.
Using ADO.NET you could easily pre-populate a DataTable and pass it as a SqlParameter.
Steps you need to perform:
Create a custom SQL type
CREATE TYPE MyType AS TABLE
(
UniqueId INT NOT NULL,
Column NVARCHAR(255) NOT NULL
)
Create a stored procedure which accepts the Type
CREATE PROCEDURE spInsertMyType
@Data MyType READONLY
AS
xxxx
Call using C#
SqlCommand insertCommand = new SqlCommand(
"spInsertMyType", connection);
insertCommand.CommandType = CommandType.StoredProcedure;
SqlParameter tvpParam =
insertCommand.Parameters.AddWithValue(
"@Data", dataReader);
tvpParam.SqlDbType = SqlDbType.Structured;
Links: Table-valued Parameters in Sql 2008
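For completeness, a sketch of pre-populating a DataTable that matches MyType and passing it instead of a data reader; the rows shown are made up:
// Columns must match the table type's columns in order and data type.
var data = new DataTable();
data.Columns.Add("UniqueId", typeof(int));
data.Columns.Add("Column", typeof(string));
data.Rows.Add(1, "first value");
data.Rows.Add(2, "second value");
SqlParameter tvpParam = insertCommand.Parameters.AddWithValue("@Data", data);
tvpParam.SqlDbType = SqlDbType.Structured;
tvpParam.TypeName = "MyType"; // the server-side table type created above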
Definitely do not do it one-by-one.
My preferred solution is to create a stored procedure with one parameter that takes XML in the following format:
<ROOT>
<MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000000"/>
<MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000001"/>
....
</ROOT>
Then, in the procedure, the argument of type NVARCHAR(MAX) is converted to XML, after which you use it as a table with a single column (let's call it @FilterTable). The stored procedure looks like:
CREATE PROCEDURE dbo.sp_MultipleParams(@FilterXML NVARCHAR(MAX))
AS BEGIN
SET NOCOUNT ON
DECLARE @x XML
SELECT @x = CONVERT(XML, @FilterXML)
-- table variable (must have it, because we cannot join on the XML directly)
DECLARE @FilterTable TABLE (
"ID" UNIQUEIDENTIFIER
)
-- insert into the table variable
-- important: XML element and attribute names are case-sensitive
INSERT @FilterTable
SELECT x.value('@ID', 'UNIQUEIDENTIFIER')
FROM @x.nodes('/ROOT/MyObject') AS R(x)
SELECT o.ID,
SIGN(SUM(CASE WHEN t.ID IS NULL THEN 0 ELSE 1 END)) AS FoundInDB
FROM @FilterTable o
LEFT JOIN dbo.MyTable t
ON o.ID = t.ID
GROUP BY o.ID
END
GO
You run it as:
EXEC sp_MultipleParams '<ROOT><MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000000"/><MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000002"/></ROOT>'
And your results look like:
ID FoundInDB
------------------------------------ -----------
60EAD98F-8A6C-4C22-AF75-000000000000 1
60EAD98F-8A6C-4C22-AF75-000000000002 0
