I am inserting values into the table QueryList:
[QueryID] [WorkItemID] [RaisedBy]
1         123          xyz
2         234          abc
where QueryID is an identity column.
I am using a foreach loop to insert more than one row at a time. My question is: how do I get all of the newly inserted identity values in Entity Framework 3.5?
This is my code:
using (TransactionScope currentScope = new TransactionScope())
{
    Query newQuery = new Query();
    foreach (long workItemId in workItemID)
    {
        newQuery = new Query();
        ...
        currentScope.Complete();
        success = true;
    }
}
entityCollection.SaveChanges(true);
int QueryID = newQuery.QueryID; // this only gives me the last identity value
You have to track each newly created Query object separately. I suggest using a List<Query> for simplicity:
List<Query> newQueries = new List<Query>();
using (TransactionScope currentScope = new TransactionScope())
{
    foreach (long workItemId in workItemID)
    {
        Query newQuery = new Query();
        newQueries.Add(newQuery);
        ...
    }
    currentScope.Complete();
    success = true;
}
entityCollection.SaveChanges(true);
var queryIDs = newQueries.Select(q => q.QueryID);
Side note: in your code sample you create a Query object outside the foreach loop but never use it. That may just be an artifact of the sample, but if you don't actually use it or add it to your context, don't create it.
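For what it's worth, this round trip only works because QueryID is store generated: SaveChanges writes the rows and syncs the database-assigned identities back into the tracked objects. A minimal sanity check (my addition, assuming the identity mapping is in place) would be:
// After SaveChanges, every tracked Query should carry a database-generated
// identity rather than the default value.
System.Diagnostics.Debug.Assert(newQueries.All(q => q.QueryID > 0));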
I'm using the EF Core Bulk Extensions library. This is the code I use to insert a list of entities:
var entities = new List<Entity>();
for (var i = 0; i < 1000; i++)
{
    var entity = new Entity();
    // setting properties
    entities.Add(entity);
}
using var context = new DatabaseContext();
context.BulkInsert(entities);
The problem is that if even one record cannot be inserted, nothing gets inserted at all.
I don't want it to act transactionally and atomically. I want it to insert as many records as it can, for example 900 out of 1000.
Is that possible?
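One way to approximate this without library support is a chunk-and-fallback strategy: bulk insert in batches and, when a batch fails, retry its rows one by one with plain EF Core and skip the offenders. This is only a sketch; the batch size, the DbUpdateException handling, and the fallback path are my assumptions, not part of the BulkExtensions API:
var failed = new List<Entity>();
foreach (var batch in entities.Chunk(500)) // Enumerable.Chunk, .NET 6+
{
    try
    {
        context.BulkInsert(batch.ToList());
    }
    catch (Exception)
    {
        // The whole batch rolled back; retry its rows one by one.
        foreach (var entity in batch)
        {
            try
            {
                context.Add(entity);
                context.SaveChanges();
            }
            catch (DbUpdateException)
            {
                context.Entry(entity).State = EntityState.Detached; // drop the bad row
                failed.Add(entity);
            }
        }
    }
}
// "failed" now holds only the rows the database refused.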
Update:
In MariaDB and C#, the following code does the job in a somewhat fuzzy manner (rows the server rejects surface as warnings through the connection's InfoMessage event instead of aborting the whole insert):
public List<string> BulkInsert(List<T> entities)
{
    var problems = new List<string>();
    var table = entities.ToTable();
    var connectionString = ConnectionString;
    if (!connectionString.Contains("AllowLoadLocalInfile"))
    {
        connectionString = $"{connectionString};AllowLoadLocalInfile=true;";
    }
    using var connection = new MySqlConnection(connectionString);
    connection.InfoMessage += (s, e) =>
    {
        foreach (var item in e.Errors)
        {
            problems.Add(item.Message);
        }
    };
    connection.Open();
    var bulkCopy = new MySqlBulkCopy(connection);
    bulkCopy.DestinationTableName = TableName;
    try
    {
        var result = bulkCopy.WriteToServer(table);
        return problems;
    }
    catch (MySqlException ex)
    {
        Logger.LogException(ex);
        Logger.LogError(problems.Merge());
        throw new ServerException(ex.Message + ". See the logs for the list of problems. Also make sure your database schema, especially the column order, is correct, and that you use FakerHelper for correct formatting.");
    }
}
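Usage is then just a matter of inspecting the returned list (repository here stands for whatever class hosts the BulkInsert method above):
var problems = repository.BulkInsert(entities);
foreach (var problem in problems)
{
    Console.WriteLine(problem); // per-row warnings reported by the server
}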
I am attempting to add the variable newRetentionLimit to a table in Microsoft SQL Server. I pass the value I want to insert into a parameter and then run ExecuteNonQuery. I get no errors back but the newRetentionLimit isn't placed into the table. I have debugged to make sure that newRetentionLimit isn't null and is an actual integer.
The problem appears to be that ExecuteNonQuery isn't substituting the parameter value for the name I put in the script. It appears it's just trying to run the script with the literal parameter name. Does anyone have any idea why?
if (request.SystemSettings.Any(s => s.SettingName.Equals("HISTORYRETENTIONDAYS")))
{
    var entities = entityRepo.GetList();
    var newRetentionLimit = request.SystemSettings.Find(setting => setting.SettingName.Equals("HISTORYRETENTIONDAYS")).SettingValue.ToInt();
    var requestContext = new RequestContext();
    var sqlParameter = new List<SqlParameter> {
        SqlParameterMaker.MakeTypedValueParameter("@retentionValue", newRetentionLimit, SqlDbType.Int)
    };
    foreach (var entity in entities)
    {
        var sql = $@"ALTER TABLE [data].[t{entity.Name}] SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = [hist].[t{entity.Name}], HISTORY_RETENTION_PERIOD = @retentionValue DAYS));";
        requestContext.DatabaseContext.ExecuteNonQuery(sql, sqlParameter);
    }
}
I ended up finding a solution that still lets me use SqlParameter. The root cause is that SQL Server does not allow parameters in DDL statements, so the ALTER TABLE has to be assembled as dynamic SQL and run through EXEC:
if (request.SystemSettings.Any(s => s.SettingName.Equals("HISTORYRETENTIONDAYS")))
{
    var entities = entityRepo.GetList();
    var newRetentionLimit = request.SystemSettings.Find(setting => setting.SettingName.Equals("HISTORYRETENTIONDAYS")).SettingValue.ToInt();
    var requestContext = new RequestContext();
    foreach (var entity in entities)
    {
        var sqlParameters = new List<SqlParameter> {
            new SqlParameter("@entityName", entity.Name),
            new SqlParameter("@retentionPeriod", newRetentionLimit)
        };
        var sql = "EXEC('ALTER TABLE [data].[t' + @entityName + '] SET (SYSTEM_VERSIONING = ON (HISTORY_RETENTION_PERIOD = ' + @retentionPeriod + ' DAYS))');";
        requestContext.DatabaseContext.ExecuteNonQuery(sql, sqlParameters.ToArray());
    }
}
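One caveat (my addition, since I don't know where entity.Name originates): concatenating identifiers into an EXEC string is open to SQL injection. If the table name can ever be influenced from outside, a sketch of a hardened version would wrap the identifier in QUOTENAME and cast the numeric parameter explicitly:
// Hypothetical hardening sketch: QUOTENAME escapes the dynamic identifier,
// and CAST avoids an implicit int-to-string conversion inside the EXEC string.
var sql = @"
    DECLARE @stmt nvarchar(max) =
        N'ALTER TABLE [data].' + QUOTENAME('t' + @entityName) +
        N' SET (SYSTEM_VERSIONING = ON (HISTORY_RETENTION_PERIOD = ' +
        CAST(@retentionPeriod AS nvarchar(10)) + N' DAYS))';
    EXEC(@stmt);";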
I am trying to save a large CSV file into the database. The file I am using has about 7,000 rows, and each row contains 14 columns. I have to tag each column of every row with a topic ID that I pass in my API. After saving each row item I then loop through its actual data and use the generated ID to save each value in another table. My problem is that I have nested foreach loops, and in the outer loop I call db.SaveChanges() for every row so I can reference the generated ID, but that is A LOT of SaveChanges() calls made before the data is fully processed.
For example:
public static void Save(TopicRequest req)
{
    using (var db = new DbContext())
    {
        foreach (var row in req.items)
        {
            var obj = new Entity
            {
                topicId = req.topicId,
                year = req.year
            };
            db.Add(obj);
            db.SaveChanges(); // needed so obj.id is populated
            foreach (var col in row)
            {
                var newData = new Entity
                {
                    TopicObjId = obj.id,
                    Value = col
                };
                db.TopicData.Add(newData);
            }
            db.SaveChanges();
        }
    }
}
So for a 7,000-row file with 14 columns, my loops end up making on the order of 98,000 calls to save into the db. This is causing a timeout, and the file is never fully saved. How can I properly handle such large amounts of data?
I suggest using AddRange to improve the performance.
Add vs AddRange
Here's an example:
public async Task Save(TopicRequest req)
{
    using (var db = new DbContext())
    {
        var list1 = new List<Entity1>();
        foreach (var row in req.items)
        {
            list1.Add(new Entity1
            {
                topicId = req.topicId,
                year = req.year
            });
        }
        db.Topic.AddRange(list1);
        await db.SaveChangesAsync(); // one round trip; identity values are synced back here
        // list1[i] corresponds to the i-th row of req.items, so walk them together
        var list2 = new List<Entity2>();
        var i = 0;
        foreach (var row in req.items)
        {
            foreach (var col in row)
            {
                list2.Add(new Entity2
                {
                    TopicObjId = list1[i].id,
                    Value = col
                });
            }
            i++;
        }
        db.TopicData.AddRange(list2);
        await db.SaveChangesAsync();
    }
}
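A side note on top of this (my addition; the property path below is the EF Core spelling, in EF6 it is context.Configuration.AutoDetectChangesEnabled): disabling automatic change detection is a common companion optimization for large batches. It matters most if Add is ever called in a loop, since AddRange already triggers change detection only once:
db.ChangeTracker.AutoDetectChangesEnabled = false;
try
{
    db.Topic.AddRange(list1);
    await db.SaveChangesAsync();
}
finally
{
    // restore the default so the rest of the context behaves normally
    db.ChangeTracker.AutoDetectChangesEnabled = true;
}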
My problem is very common, but I have not found any solution.
This is my code:
public async Task<QueryResult> RollbackQuery(ActionLog action)
{
    var inputParameters = JsonConvert.DeserializeObject<Parameter[]>(action.Values);
    var data = DeserailizeByteArrayToDataSet(action.RollBackData);
    using (var structure = PrepareStructure(action.Query, action.Query.DataBase, inputParameters))
    {
        // _queryPlanner is the implementor for my interface
        return await _queryPlanner.RollbackQuery(structure, data);
    }
}
I need to load a DataTable (from wherever) and write its data back to the database. This is my rollback function. The function uses a CommandStructure in which I've encapsulated all the SqlClient objects; PrepareStructure initializes them all:
// _dataLayer is a helper that creates System.Data.SqlClient objects
// e.g. _dataLayer.CreateCommand(preSelect) => new SqlCommand(preSelect)
private CommandStructure PrepareStructure(string sql, string preSelect, DataBase db, IEnumerable<Parameter> inputParameters)
{
    var parameters = inputParameters as IList<Parameter> ?? inputParameters.ToList();
    var structure = new CommandStructure(_logger);
    structure.Connection = _dataLayer.ConnectToDatabase(db);
    structure.SqlCommand = _dataLayer.CreateCommand(sql);
    structure.PreSelectCommand = _dataLayer.CreateCommand(preSelect);
    structure.QueryParameters = _dataLayer.CreateParemeters(parameters);
    structure.WhereParameters = _dataLayer.CreateParemeters(parameters.Where(p => p.IsWhereClause.HasValue && p.IsWhereClause.Value));
    structure.CommandBuilder = _dataLayer.CreateCommandBuilder();
    structure.DataAdapter = new SqlDataAdapter();
    return structure;
}
So my function uses SqlCommandBuilder and a DataAdapter to operate on the database.
PreSelectCommand is something like "Select * from Purchase where CustomerId = @id".
The table Purchase has one primary key, on the ID field.
public virtual async Task<QueryResult> RollbackQuery(CommandStructure cmd, DataTable oldData)
{
    await cmd.OpenConnectionAsync();
    int record = 0;
    using (var cmdPre = cmd.PreSelectCommand as SqlCommand)
    using (var dataAdapt = new SqlDataAdapter(cmdPre))
    using (var cmdBuilder = new SqlCommandBuilder(dataAdapt))
    {
        dataAdapt.UpdateCommand = cmdBuilder.GetUpdateCommand();
        dataAdapt.DeleteCommand = cmdBuilder.GetDeleteCommand();
        dataAdapt.InsertCommand = cmdBuilder.GetInsertCommand();
        using (var tbl = new DataTable(oldData.TableName))
        {
            dataAdapt.Fill(tbl);
            dataAdapt.FillSchema(tbl, SchemaType.Source);
            tbl.Merge(oldData);
            foreach (DataRow row in tbl.Rows)
            {
                row.SetModified();
            }
            record = dataAdapt.Update(tbl);
        }
    }
    return new QueryResult
    {
        RecordAffected = record
    };
}
I execute the code and get no errors, but the data is not updated.
The variable record contains the right number of (supposedly) modified records, but nothing happens on the table.
Can someone help me?
EDIT 1:
With SQL Profiler I saw that no query is executed on the DB, only the select issued by the .Fill(tbl) call.
EDIT 2:
Now I have made one change:
tbl.Merge(oldData) => tbl.Merge(oldData, true)
Now the expected query is executed, but with the parameters reversed:
UPDATE Purchase SET price=123 where id=6 and price=22
instead of
UPDATE Purchase SET price=22 where id=6 and price=123
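For what it's worth, the reversed UPDATE suggests the rollback values ended up in the rows' Original version rather than the Current one: the command builder emits Current values in the SET clause and Original values in the WHERE clause. Below is a sketch of an alternative to the Merge/SetModified block, assuming FillSchema has established the primary key (so Rows.Find works) and using the id column that appears in the profiler output:
// Copy the rollback values into the *current* version of each fetched row;
// the values loaded by Fill remain as the Original version, so the builder
// generates: UPDATE Purchase SET price=22 WHERE id=6 AND price=123
foreach (DataRow oldRow in oldData.Rows)
{
    var current = tbl.Rows.Find(oldRow["id"]);
    if (current == null) continue; // row no longer exists; skip (or insert)
    foreach (DataColumn col in oldData.Columns)
    {
        current[col.ColumnName] = oldRow[col.ColumnName];
    }
    // the row is now in the Modified state; no SetModified() needed
}
record = dataAdapt.Update(tbl);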
I have code like below:
QueryExpression query = new QueryExpression();
query.EntityName = "new_callistyorder";
ColumnSet col = new ColumnSet("new_nomororder", "new_customer");
query.ColumnSet = col;
EntityCollection colect = service.RetrieveMultiple(query);
string str = string.Empty;
foreach (Entity e in colect.Entities)
{
    if (e.Contains("new_nomororder"))
    {
        str = str + e.Attributes["new_nomororder"].ToString();
    }
}
throw new InvalidPluginExecutionException(str);
Through this code I am able to get data from a Microsoft Dynamics entity.
Now I want to get the record with the biggest ID.
As a SQL query it would look something like this: "Select top 1 my_id from Account order by my_id desc".
How can I do that with a QueryExpression?
Thanks
You can add the order by using this:
query.AddOrder("my_id", OrderType.Descending);
and then take the first element retrieved:
var entityCollection = service.RetrieveMultiple(query);
if (entityCollection.Entities.Count > 0)
{
    var newest = entityCollection.Entities[0]; // the record with the biggest my_id
}
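As a refinement (assuming a reasonably recent SDK, where QueryExpression exposes a TopCount property), you can also ask the server for just one record instead of pulling the whole set:
var query = new QueryExpression("new_callistyorder")
{
    ColumnSet = new ColumnSet("my_id"),
    TopCount = 1 // server-side equivalent of SELECT TOP 1
};
query.AddOrder("my_id", OrderType.Descending);
var newest = service.RetrieveMultiple(query).Entities.FirstOrDefault();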