How can I find the last inserted ID in a SQL Server database for a given table?
I'm using an int identity column, not a GUID, and, if it matters, NHibernate as my ORM.
Thanks
You could try something like this:
var sql = "SELECT IDENT_CURRENT('TableName')";
var query = session.CreateSQLQuery(sql);
var result = query.UniqueResult();
This will return the last identity value generated for a given table.
Depending on how your IDs are generated, you should be able to return it straight from ISession.Save():
int insertedID = (int)session.Save(entity);
I want to create a simple database at runtime, fill it with data from an internal resource, and then read each record in a loop. Previously I used LiteDB for that, but I couldn't squeeze out any more time, so I chose SQLite.
I think there are a few things to improve that I am not aware of.
Database creation process:
The first step is to create the table:
using var create = transaction.Connection.CreateCommand();
create.CommandText = "CREATE TABLE tableName (Id TEXT PRIMARY KEY, Value TEXT) WITHOUT ROWID";
create.ExecuteNonQuery();
Next, the insert command is defined:
var insert = transaction.Connection.CreateCommand();
insert.CommandText = "INSERT OR IGNORE INTO tableName VALUES (@Id, @Value)";
var idParam = insert.CreateParameter();
var valueParam = insert.CreateParameter();
idParam.ParameterName = "@" + IdColumn;
valueParam.ParameterName = "@" + ValueColumn;
insert.Parameters.Add(idParam);
insert.Parameters.Add(valueParam);
Then each value is inserted in a loop:
idParam.Value = key;
valueParam.Value = value.ValueAsText;
insert.ExecuteNonQuery();
The transaction is then committed with transaction.Commit().
Finally, the index is created:
using var index = transaction.Connection.CreateCommand();
index.CommandText = "CREATE UNIQUE INDEX idx_tableName ON tableName(Id);";
index.ExecuteNonQuery();
And after that I perform a million selects (each retrieving a single value):
using var command = _connection.CreateCommand();
command.CommandText = "SELECT Value FROM tableName WHERE Id = @id;";
var param = command.CreateParameter();
param.ParameterName = "@id";
param.Value = id;
command.Parameters.Add(param);
using var reader = command.ExecuteReader(CommandBehavior.SingleResult);
return reader.Read() ? reader.GetString(0) : null;
One connection is shared for all selects and never closed. The inserts are quite fast (less than a minute), but the selects are very troublesome here. Is there a way to improve them?
The table is quite big (around 2 million records) and Value contains quite heavy serialized objects.
The System.Data.SQLite provider is used, and the connection string contains these additional options: Version=3;Journal Mode=Off;Synchronous=Off;
If you go for performance, you need to consider this: each independent SELECT command is a round trip to the DB with some extra cost. It's similar to the N+1 select problem with parent-child relations.
The best thing you can do is to get a LIST of items (values):
SELECT Value FROM tableName WHERE Id IN (1, 2, 3, 4, ...);
Here's a link on how to code that: https://www.mikesdotnetting.com/article/116/parameterized-in-clauses-with-ado-net-and-linq
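For illustration, here is a rough sketch of building such a parameterized IN clause against the question's schema (SelectBatch is a hypothetical helper of mine, not an existing API; note that SQLite caps the number of parameters per statement, so keep each batch reasonably small):
using System.Collections.Generic;
using System.Data;
using System.Linq;

static Dictionary<string, string> SelectBatch(IDbConnection connection, IReadOnlyList<string> ids)
{
    // Build "@p0, @p1, ..." and bind one parameter per id.
    var names = ids.Select((_, i) => "@p" + i).ToList();

    using var command = connection.CreateCommand();
    command.CommandText =
        $"SELECT Id, Value FROM tableName WHERE Id IN ({string.Join(", ", names)});";

    for (int i = 0; i < ids.Count; i++)
    {
        var p = command.CreateParameter();
        p.ParameterName = names[i];
        p.Value = ids[i];
        command.Parameters.Add(p);
    }

    // Read every matching (Id, Value) pair in a single round trip.
    var results = new Dictionary<string, string>();
    using var reader = command.ExecuteReader();
    while (reader.Read())
    {
        results[reader.GetString(0)] = reader.GetString(1);
    }
    return results;
}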
You could also avoid recreating the select command for every Id: create it once and only execute it per Id. From your code it seems every select does CreateCommand/CreateParameter and so on. See this for example: https://learn.microsoft.com/en-us/dotnet/api/system.data.idbcommand.prepare?view=net-5.0 - you run .Prepare() once and then only execute (the commands don't need to be NonQuery).
You could then try to see whether ExecuteScalar is faster, since it avoids creating a reader for a single-value result: https://learn.microsoft.com/en-us/dotnet/api/system.data.idbcommand.executescalar?view=net-5.0
If the scalar approach does not prove faster, you could try CommandBehavior.SingleRow instead of SingleResult in your ExecuteReader for a possible performance optimisation. According to https://learn.microsoft.com/en-us/dotnet/api/system.data.commandbehavior?view=net-5.0 it might work. I doubt it, but if the first two don't help, why not try it too.
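Putting the second and third suggestions together, here is a rough sketch using the names from the question (idsToLookUp is just a placeholder for whatever drives your loop; whether Prepare() actually helps depends on the provider, so measure it):
using var command = _connection.CreateCommand();
command.CommandText = "SELECT Value FROM tableName WHERE Id = @id;";

var param = command.CreateParameter();
param.ParameterName = "@id";
command.Parameters.Add(param);
command.Prepare(); // compile the statement once, outside the loop

foreach (var id in idsToLookUp)
{
    param.Value = id;
    // ExecuteScalar returns the first column of the first row (or null if no match),
    // so no data reader needs to be created per lookup.
    var value = command.ExecuteScalar() as string;
    // ... use value ...
}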
I'm working on a WPF application and using SQLite database. I can do every CRUD operation with Entity Framework, but in some specific cases I have to use raw SQL queries, and sometimes it's not returning what I need.
Here is a sample code:
using (var db = new DbContext(AppIO.DatabaseFilePath)) {
var key = 12;
string sql = $"SELECT COUNT(*) FROM SomeTable WHERE SomeField={key}";
var result = db.Database.ExecuteSqlCommand(sql);
}
I simplified the example. The result I got here is -1. I copied the sql string value (after it was built) and executed it in SQLiteStudio on the same database, and it returned the correct value.
The DatabaseFilePath is correct. The connection is set correctly. I'm checking the same database (in code and in SQLiteStudio). Any other ideas?
Try this:
var result = db.Database.SqlQuery<int>(sql).First();
You have to call the SqlQuery method, not the ExecuteSqlCommand method. Since SqlQuery returns an IEnumerable, you have to call Single (or First). This is the way to retrieve scalar values from a query.
using (var db = new DbContext(AppIO.DatabaseFilePath)) {
var key = 12;
string sql = $"SELECT COUNT(*) FROM SomeTable WHERE SomeField={key}";
var result = db.Database.SqlQuery<int>(sql).Single();
}
I am working on a transaction website. When a subscription is created, the modified date is inserted in the table. What I am trying to accomplish is to uniquely identify that record. Here is what I am trying:
DateTime currentTime = DateTime.Now;
Debug.WriteLine("Current Timestamp is: " + currentTime.ToString("yyyy-MM-dd HH:mm:ss:fff"));
sCmd.CommandText = "SELECT distinct userID FROM table WHERE Report_OID = #reportID AND ModifiedDate = #currentTime";
sCmd.Parameters.AddWithValue("#reportID", reportID);
sCmd.Parameters.AddWithValue("#currentTime", currentTime.ToString("yyyy-MM-dd HH:mm:ss:fff"));
Output:
Current Timestamp is: 2016-06-09 10:19:16:586
Modified Date is: 2016-06-09 10:19:16.553
Tried this:
sCmd.CommandText = "SELECT LAST_INSERT_ID();";
IDataReader reader = sCmd.ExecuteReader();
while (reader != null && reader.Read())
{
long lastInsertedID = reader.GetInt64(0);
Debug.WriteLine("return id is : " + lastInsertedID);
}
Got this error:
'LAST_INSERT_ID' is not a recognized built-in function name.
This error may have happened because the insert is done by the createSubscription call to the reporting service web service, which handles the insert in the database; I do not have insert statements in my own code.
This will fail as the milliseconds won't be the same. The reason I want to use milliseconds is to prevent getting multiple user IDs if two users insert a subscription around the same time.
What I am trying to figure out is how to compare these two dates so I can accurately get the userID of my transaction and not worry about grabbing someone else's user ID if two transactions are inserted at the same time. Is this possible? Am I trying to solve this the wrong way? Any help will be appreciated. Thank you.
Forgive me for the pseudo-code (I don't use MySQL often), but the proper way to do this is to modify your insert to return the identity of the last record inserted by your statement instead of trying to match things up using timestamps and userIds. The code will be simpler and more reliable. This is done in MySQL using the LAST_INSERT_ID() function (the approximation of MS SQL's SCOPE_IDENTITY() function).
For example, your insert (assuming the table has an identity column called "FooID") might look something like this:
INSERT INTO Foo (Bar) VALUES ('Biz');
SELECT LAST_INSERT_ID();
If the new record had a FooID of 148, then your statement would insert the record and return 148, at which point you could:
SELECT Bar FROM Foo WHERE FooID = 148;
That should accomplish what you are trying to do.
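Note that the "not a recognized built-in function name" error is a SQL Server error, and the SQL Server counterpart of LAST_INSERT_ID() is SCOPE_IDENTITY(). Here is a minimal ADO.NET sketch of the same idea, assuming you control the INSERT yourself (the Subscriptions table and its columns are invented here; in the question the insert is actually done by the reporting web service, so this only applies if you can change that code):
// conn is an open SqlConnection; table/column names are hypothetical.
var sql = @"INSERT INTO Subscriptions (Report_OID, OwnerID, ModifiedDate)
            VALUES (@reportID, @ownerID, GETDATE());
            SELECT CAST(SCOPE_IDENTITY() AS int);";

using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@reportID", reportID);
    cmd.Parameters.AddWithValue("@ownerID", ownerID);

    // SCOPE_IDENTITY() returns the identity generated by this INSERT in the current
    // session and scope, so concurrent inserts by other users don't interfere.
    int newId = (int)cmd.ExecuteScalar();
    Debug.WriteLine("return id is : " + newId);
}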
Just use one of the inputs to find the record. You have several.
Thanks to @Paparazzi. I created another select statement to get the latest subscription by using the email address and a DateTime variable captured before the createSubscription method is called. With that I was able to get the most recent inserted ID, and it's working now.
DateTime subscriptionCreateTime = DateTime.Now;
CreateSubscription(); //Create a new ssrs subscription
//SQL connection here
sCmd.CommandText = "SELECT DISTINCT ownerID FROM db.table WHERE Report_ID = @reportID AND ModifiedDate >= @currentTimeStamp AND Description LIKE @emailAddress";
sCmd.Parameters.AddWithValue("@reportID", gRptID);
sCmd.Parameters.AddWithValue("@emailAddress", "%" + tbTO.Text + "%");
sCmd.Parameters.AddWithValue("@currentTimeStamp", subscriptionCreateTime);
SqlDataReader reader = sCmd.ExecuteReader();
I am using Dapper with C# and the back end is MS Access. My DAL method inserts a record in the database. I want to return the unique identifier (or the updated POCO with the unique identifier) of the inserted row.
I am expecting my function to look something like the following (I know this does not work; it's just to explain what I want):
public MyPoco Insert(MyPoco myPoco)
{
sql = #"INSERT INTO MyTable (Field1, Field2) VALUES (#Field1, #Field2)";
var param = GetMappedParams(myPoco);//ID property here is null.
var result = _connection.Query<MyPoco>(sql, param, null, false, null, CommandType.Text);.Single();
return result;//This result now contains ID that is created by database.
}
I am from the NHibernate world, and the POCO updates automatically with NH. If not, we can call the Refresh method and it updates the ID.
I am not aware how to achieve this with Dapper.
I read this question on SO, which is not relevant as it talks about SQL Server.
Another question I found does not have an accepted answer.
I also read this question, where the accepted answer explains the pitfalls of using @@IDENTITY.
This is what works for me:
static MyPoco Insert(MyPoco myPoco)
{
string sql = "INSERT INTO MyTable (Field1, Field2) VALUES (#Field1, #Field2)";
_connection.Execute(sql, new {myPoco.Field1, myPoco.Field2});
myPoco.ID = _connection.Query<int>("SELECT ##IDENTITY").Single();
return myPoco; // This result now contains ID that is created by database.
}
Note that this will work with an OleDbConnection to the Access database, but it will not work with an OdbcConnection.
Edit re: comment
To ensure that the Connection remains open between the INSERT and the SELECT calls, we could do this:
static void Insert(MyPoco myPoco)
{
string sql = "INSERT INTO MyTable (Field1, Field2) VALUES (#Field1, #Field2)";
bool connAlreadyOpen = (_connection.State == System.Data.ConnectionState.Open);
if (!connAlreadyOpen)
{
_connection.Open();
}
_connection.Execute(sql, new {myPoco.Field1, myPoco.Field2});
myPoco.ID = _connection.Query<int>("SELECT @@IDENTITY").Single();
if (!connAlreadyOpen)
{
_connection.Close();
}
return; // (myPoco now contains ID that is created by database.)
}
Just a couple of extra thoughts: if the @@IDENTITY pitfalls are an issue, then another option would be to create a new GUID ahead of time in code and then insert that GUID with the rest of the data, rather than letting Access create the identity value when it creates the new record.
I appreciate that this will only work if your particular situation allows for a GUID primary key for the table, but it does guarantee that you know the true value of the key for the record you just inserted.
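Here is a rough sketch of that idea with Dapper, reusing the names from the question (the main assumption being that ID is a GUID column here rather than an autonumber, so MyPoco.ID would be a Guid):
static MyPoco Insert(MyPoco myPoco)
{
    // Generate the key in code, so it is known before the row ever reaches Access.
    myPoco.ID = Guid.NewGuid();

    string sql = "INSERT INTO MyTable (ID, Field1, Field2) VALUES (@ID, @Field1, @Field2)";
    _connection.Execute(sql, new { myPoco.ID, myPoco.Field1, myPoco.Field2 });

    return myPoco; // no SELECT @@IDENTITY round trip needed
}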
Alternatively, if you don't want a GUID key you could create a table with a single row that holds the current seed value for any manually managed keys in your application. You can then manually increment the particular seed's value each time you want to insert a new record. As with the GUID approach, you'd then manually insert the ID with the record, this time the ID would be the newly incremented seed you just retrieved.
Again, that should guarantee you a unique key for each insert, although now you are doing a read and two writes for each insert.
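And here is a sketch of the seed-table alternative (the KeySeeds table is hypothetical, and in a multi-user scenario the read-and-increment would need to be wrapped in a transaction or some other lock):
static MyPoco Insert(MyPoco myPoco)
{
    // Read the current seed for this table and bump it for the next caller.
    int nextId = _connection.Query<int>(
        "SELECT NextId FROM KeySeeds WHERE TableName = 'MyTable'").Single();
    _connection.Execute(
        "UPDATE KeySeeds SET NextId = NextId + 1 WHERE TableName = 'MyTable'");

    // Insert the new row with the key we just reserved.
    myPoco.ID = nextId;
    _connection.Execute(
        "INSERT INTO MyTable (ID, Field1, Field2) VALUES (@ID, @Field1, @Field2)",
        new { myPoco.ID, myPoco.Field1, myPoco.Field2 });

    return myPoco;
}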
I am using C# with SqlBulkCopy. I have a problem, though: I need to do a mass insert into one table, then another mass insert into another table.
These 2 have a PK/FK relationship.
Table A
Field1 - PK, auto-incrementing (easy; SqlBulkCopy is straightforward here)
Table B
Field1 - PK/FK - this field makes the relationship and is also the PK of this table. It is not auto-incrementing and needs to have the same ID as the row in Table A.
So these tables have a one-to-one relationship, but I am unsure how to get back all the PK IDs that the mass insert generated, since I need them for Table B.
Edit
Could I do something like this?
SELECT *
FROM Product
WHERE NOT EXISTS (SELECT * FROM ProductReview WHERE Product.ProductId = ProductReview.ProductId AND Product.Qty = NULL AND Product.ProductName != 'Ipad')
This should find all the rows that were just inserted with the SQL bulk copy. I am not sure how to take the results from this and then do a mass insert with them from an SP.
The only problem I can see with this is that if a user is adding records one at a time and this statement runs at the same time, it could try to insert a row twice into the Product Review table.
So say one user is using the manual way and another user is doing the mass way at about the same time.
manual way.
1. User submits data
2. A LINQ to SQL Product object is made, filled with the data, and submitted.
3. This object now contains the ProductId.
4. Another LINQ to SQL object is made for the Product Review table and is inserted (the ProductId from step 3 is sent along).
Mass way.
1. User grabs data from a user sharing the data.
2. All Product rows from the sharing user are grabbed.
3. SQL Bulk copy insert on Product rows happens.
4. My SP selects all rows that only exist in the Product table and meet some other conditions.
5. Mass insert happens with those rows.
So what happens if step 3 (manual way) happens at the same time as step 4 (mass way)? I think it would try to insert the same row twice, causing a primary key constraint exception.
In that scenario, I would use SqlBulkCopy to insert into a staging table (i.e. one that looks like the data I want to import, but isn't part of the main transactional tables), and then at the DB do an INSERT/SELECT to move the data into the first real table.
Now I have two choices depending on the server version; I could do a second INSERT/SELECT to the second real table, or I could use the INSERT/OUTPUT clause to do the second insert, using the identity rows from the first table.
For example:
-- dummy schema
CREATE TABLE TMP (data varchar(max))
CREATE TABLE [Table1] (id int not null identity(1,1), data varchar(max))
CREATE TABLE [Table2] (id int not null identity(1,1), id1 int not null, data varchar(max))
-- imagine this is the SqlBulkCopy
INSERT TMP VALUES('abc')
INSERT TMP VALUES('def')
INSERT TMP VALUES('ghi')
-- now push into the real tables
INSERT [Table1]
OUTPUT INSERTED.id, INSERTED.data INTO [Table2](id1,data)
SELECT data FROM TMP
If your app allows it, you could add another column in which you store an identifier of the bulk insert (a guid for example). You would set this id explicitly.
Then after the bulk insert, you just select the rows that have that identifier.
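For example, a rough sketch of that idea (the BatchId column is something you would add to the Product table yourself; dataTable and connection are assumed to already exist):
// Tag every row of this bulk insert with the same batch identifier.
var batchId = Guid.NewGuid();
foreach (DataRow row in dataTable.Rows)
{
    row["BatchId"] = batchId;
}

using (var bulk = new SqlBulkCopy(connection))
{
    bulk.DestinationTableName = "Product";
    bulk.WriteToServer(dataTable);
}

// Afterwards, read back the identity values generated for exactly this batch.
using (var cmd = new SqlCommand(
    "SELECT ProductId FROM Product WHERE BatchId = @batchId", connection))
{
    cmd.Parameters.AddWithValue("@batchId", batchId);
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            int productId = reader.GetInt32(0);
            // ... use productId for the Product Review / Table B insert ...
        }
    }
}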
I had the same issue where I had to get back ids of the rows inserted with SqlBulkCopy.
My ID column was an identity column.
Solution:
I have inserted 500+ rows with bulk copy, and then selected them back with the following query:
SELECT TOP InsertedRowCount *
FROM MyTable
ORDER BY ID DESC
This query returns the rows I have just inserted with their IDs. In my case I had another unique column, so I selected that column and the ID, then mapped them with an IDictionary like so:
IDictionary<string, int> mymap = new Dictionary<string, int>();
mymap[Name] = ID;
Hope this helps.
My approach is similar to what RiceRiceBaby described, except one important thing to add: the call to retrieve Max(Id) needs to be part of a transaction, along with the call to SqlBulkCopy.WriteToServer. Otherwise, someone else may insert during your transaction and this would make your IDs incorrect. Here is my code:
public static void BulkInsert<T>(List<ColumnInfo> columnInfo, List<T> data,
    string destinationTableName, SqlConnection conn = null, string idColumn = "Id")
{
    NLogger logger = new NLogger();
    var closeConn = false;

    if (conn == null)
    {
        closeConn = true;
        conn = new SqlConnection(_connectionString);
        conn.Open();
    }

    SqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable);
    try
    {
        var options = SqlBulkCopyOptions.KeepIdentity;
        var sbc = new SqlBulkCopy(conn, options, tran);

        // Read the current max id inside the serializable transaction so no one
        // can insert new rows while we assign ids ourselves.
        var command = new SqlCommand(
            $"SELECT Max({idColumn}) from {destinationTableName};", conn, tran);
        var id = command.ExecuteScalar();

        int maxId = 0;
        if (id != null && id != DBNull.Value)
        {
            maxId = Convert.ToInt32(id);
        }

        // Assign sequential ids to the objects before writing them to the server.
        data.ForEach(d =>
        {
            maxId++;
            d.GetType().GetProperty(idColumn).SetValue(d, maxId);
        });

        var dt = ConvertToDataTable(columnInfo, data);
        sbc.DestinationTableName = destinationTableName;
        foreach (System.Data.DataColumn dc in dt.Columns)
        {
            sbc.ColumnMappings.Add(dc.ColumnName, dc.ColumnName);
        }
        sbc.WriteToServer(dt);
        tran.Commit();

        if (closeConn)
        {
            conn.Close();
            conn = null;
        }
    }
    catch (Exception ex)
    {
        tran.Rollback();
        logger.Write(LogLevel.Error, $@"An error occurred while performing a bulk
            insert into table {destinationTableName}. The entire
            transaction has been rolled back.
            {ex.ToString()}");
        throw;
    }
}
Depending on your needs and how much control you have of the tables, you may want to consider using UNIQUEIDENTIFIERs (Guids) instead of your IDENTITY primary keys. This moves key management outside of the database and into your application. There are some serious tradeoffs to this approach, so it may not meet your needs. But it may be worth considering. If you know for sure that you'll be pumping a lot of data into your tables via bulk-insert, it is often really handy to have those keys managed in your object model rather than your application relying on the database to give you back the data.
You could also take a hybrid approach with staging tables as suggested before. Get the data into those tables using GUIDs for the relationships, and then via SQL statements you could get the integer foreign keys in order and pump data into your production tables.
I would:
Turn on identity insert on the table
Grab the Id of the last row of the table
Loop from (int i = Id; i < datatable.Rows.Count + 1; i++)
In the loop, assign the Id property of your datatable row to i + 1.
Run your SQL bulk insert with KeepIdentity turned on.
Turn identity insert back off
I think that's the safest way to get your IDs in an SQL bulk insert, because it will prevent the mismatched IDs that could be caused by the application executing on another thread.
Disclaimer: I'm the owner of the project C# Bulk Operations
The library overcomes SqlBulkCopy limitations and adds flexible features like outputting the inserted identity value.
Behind the code, it does exactly what the accepted answer does, but it's way easier to use.
var bulk = new BulkOperation(connection);
// Output Identity
bulk.ColumnMappings.Add("ProductID", ColumnMappingDirectionType.Output);
// ... Column Mappings...
bulk.BulkInsert(dt);