Working with multiple rows in a table as properties - C#

I'm working with SQLite databases using the System.Data.SQLite library. I have a "metadata" table, which looks like this:
[Table screenshot]
It doesn't always contain all of these rows and fields, and there can be many optional ones as well.
Usually I work with these fields as properties (which I populate through read queries and reflection). My class:
class MetadataTable
{
    public string Version { get; set; }
    public string Timestamp { get; set; }
    public string Author { get; set; }
    public string Location { get; set; }

    public MetadataTable(string pathToDb)
    {
        using (SQLiteConnection connection = new SQLiteConnection($"Data Source={pathToDb};Version=3;"))
        {
            try
            {
                connection.Open();
            }
            catch (Exception ex)
            {
                // Preserve the original error as the inner exception.
                throw new Exception("Unable to open database.", ex);
            }
            using (SQLiteCommand command = new SQLiteCommand(connection))
            {
                command.CommandText = "SELECT key, value FROM metadata;";
                using (SQLiteDataReader reader = command.ExecuteReader())
                {
                    List<string> propertyNames = new List<string>();
                    List<string> propertyValues = new List<string>();
                    while (reader.Read())
                    {
                        propertyNames.Add(reader[0].ToString());
                        propertyValues.Add(reader[1].ToString());
                    }
                    // Capitalize the first letter so "version" matches the Version property.
                    for (int i = 0; i < propertyNames.Count; i++)
                        propertyNames[i] = char.ToUpper(propertyNames[i][0]) + propertyNames[i].Substring(1);
                    for (int i = 0; i < propertyValues.Count; i++)
                        typeof(MetadataTable).GetProperty(propertyNames[i])?.SetValue(this, propertyValues[i]);
                }
            }
        }
    }
}
And how it's called from Main():
string pathToDb = "D:/Downloads/mytest.db";
MetadataTable metadataTable = new MetadataTable(pathToDb);
Console.WriteLine($"Version:{metadataTable.Version}, Author:{metadataTable.Author}, " +
$"Timestamp:{metadataTable.Timestamp}, Location:{metadataTable.Location ?? "Not specified"}");
Recently I decided to try LINQ to SQL and wrote a simple class for my table:
[Table(Name = "metadata")]
class Metadata
{
    [Column(Name = "key")]
    public string Key { get; set; }

    [Column(Name = "value")]
    public string Value { get; set; }
}
That's how I read it in Main():
using (SQLiteConnection connection = new SQLiteConnection($"Data Source={pathToDb};Version=3;"))
{
    using (DataContext context = new DataContext(connection))
    {
        Table<Metadata> metadataFields = context.GetTable<Metadata>();
        foreach (Metadata metadataField in metadataFields)
            Console.WriteLine($"Key:{metadataField.Key}, Value:{metadataField.Value}");
    }
}
I find it very convenient, but is there an equally convenient way to work with rows/fields as properties (like in the MetadataTable code above) without using reflection?
And by the way, is Entity Framework more suitable (and more performant) for this task? I haven't worked with LINQ to Entities or LINQ to SQL before, so it makes no big difference which one I learn first.
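If the goal is simply to avoid reflection, one reflection-free option is to read all key/value rows into a dictionary and assign the properties explicitly. This is a minimal sketch (assuming the same System.Data.SQLite setup and lowercase key names as above, and C# 7 for `out var`); missing rows simply leave the corresponding property null:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SQLite;

class MetadataTable
{
    public string Version { get; set; }
    public string Timestamp { get; set; }
    public string Author { get; set; }
    public string Location { get; set; }

    public MetadataTable(string pathToDb)
    {
        var fields = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        using (var connection = new SQLiteConnection($"Data Source={pathToDb};Version=3;"))
        {
            connection.Open();
            using (var command = new SQLiteCommand("SELECT key, value FROM metadata;", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    fields[reader.GetString(0)] = reader.GetString(1);
            }
        }
        // Explicit mapping: no reflection, and optional rows are simply absent.
        fields.TryGetValue("version", out var v);   Version = v;
        fields.TryGetValue("timestamp", out var t); Timestamp = t;
        fields.TryGetValue("author", out var a);    Author = a;
        fields.TryGetValue("location", out var l);  Location = l;
    }
}
```

The trade-off is one line per property instead of the reflection loop, in exchange for compile-time checking of the property names.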

Related

Infinite Loop When Loading Object

I've been racking my brains for a few days trying to avoid infinite loops when I load my classes with data coming from a database through a method inside the class.
Here is a simplified example of what is happening. Consider the class below:
public class Person
{
    public int ID { get; set; }
    public string Name { get; set; }
    public Person Carrier { get; set; }
}
In a method inside the class, I load it like this:
public void Load()
{
    DataTable tableResult;
    using (MySqlConnection con = new MySqlConnection(ConnectionString))
    {
        using (MySqlCommand com = new MySqlCommand(My.Resources.PersonSelect, con))
        {
            com.Parameters.AddWithValue("@ID", this.ID);
            using (MySqlDataAdapter adp = new MySqlDataAdapter(com))
            {
                tableResult = new DataTable();
                adp.Fill(tableResult);
                if (tableResult.Rows.Count == 1)
                {
                    ID = Convert.ToInt32(tableResult.Rows[0]["id"]);
                    Name = tableResult.Rows[0]["name"].ToString();
                    Carrier = new Person { ID = Convert.ToInt32(tableResult.Rows[0]["carrierid"]) };
                    Carrier.Load();
                }
            }
        }
    }
}
Also consider that I might have a person whose carrier is himself; this generates an infinite loop.
I've already tried implementing lazy loading in the properties, but I still don't know which classes might have this same problem. Is there any other way to solve it?
You can try this:
public void Load()
{
    var tableResult = new DataTable();
    using (var con = new MySqlConnection(ConnectionString))
    using (var com = new MySqlCommand(My.Resources.PersonSelect, con))
    using (var adp = new MySqlDataAdapter(com))
    {
        com.Parameters.AddWithValue("@ID", this.ID);
        adp.Fill(tableResult);
    }
    if (tableResult.Rows.Count == 1)
    {
        ID = Convert.ToInt32(tableResult.Rows[0]["id"]);
        Name = tableResult.Rows[0]["name"].ToString();
        var carrierID = Convert.ToInt32(tableResult.Rows[0]["carrierid"]);
        if (ID == carrierID)
        {
            // Self-reference: reuse this instance instead of loading again.
            Carrier = this;
        }
        else
        {
            Carrier = new Person { ID = carrierID };
            Carrier.Load();
        }
    }
}
It still has the potential to loop in a chain (Person A's carrier is Person B, whose carrier is Person A again), but if that situation doesn't occur in your data you might be okay.
Otherwise you can opt to lazy-load the carrier, like this:
public class Person
{
    public int ID { get; set; }
    public string Name { get; set; }

    private int _carrierID;
    private Person _carrier = null;

    public Person Carrier
    {
        get
        {
            // Loaded only on first access.
            if (_carrier == null) _carrier = Person.Load(_carrierID);
            return _carrier;
        }
        private set
        {
            _carrier = value;
            if (_carrier != null) _carrierID = _carrier.ID;
        }
    }

    public static Person Load(int id)
    {
        var tableResult = new DataTable();
        using (var con = new MySqlConnection(ConnectionString))
        using (var com = new MySqlCommand(My.Resources.PersonSelect, con))
        using (var adp = new MySqlDataAdapter(com))
        {
            com.Parameters.AddWithValue("@ID", id);
            adp.Fill(tableResult);
        }
        if (tableResult.Rows.Count == 1)
        {
            // The object initializer may set the private field here
            // because we are inside the Person class itself.
            return new Person
            {
                ID = Convert.ToInt32(tableResult.Rows[0]["id"]),
                Name = tableResult.Rows[0]["name"].ToString(),
                _carrierID = Convert.ToInt32(tableResult.Rows[0]["carrierid"])
            };
        }
        return null;
    }
}
This can also still possibly loop forever if you try to write code like this:
var p = Person.Load(someID);
while (p.Carrier != null)
{
    p = p.Carrier;
}
But at least now you have to put the loop out in the open where you can find it.
Also note how I converted the Load() method to static.
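For arbitrary cycles (not just self-references), another approach is an "identity map": cache every Person already loaded, keyed by id, so a cycle resolves to an existing instance instead of triggering another query. A hedged sketch, reusing the Person shape from above; `FetchRow` is a hypothetical helper standing in for the PersonSelect query:

```csharp
using System.Collections.Generic;
using System.Data;

public class Person
{
    public int ID { get; set; }
    public string Name { get; set; }
    public Person Carrier { get; set; }

    public static Person Load(int id, Dictionary<int, Person> cache = null)
    {
        cache = cache ?? new Dictionary<int, Person>();
        if (cache.TryGetValue(id, out var existing))
            return existing;                 // cycle detected: reuse the instance

        DataRow row = FetchRow(id);          // hypothetical: runs PersonSelect for this id
        if (row == null) return null;

        var p = new Person { ID = id, Name = row["name"].ToString() };
        cache[id] = p;                       // register BEFORE loading the carrier
        p.Carrier = Load((int)row["carrierid"], cache);
        return p;
    }
}
```

Registering the instance in the cache before recursing is what breaks the loop: when the recursion comes back around to an id already seen, it returns the cached object immediately.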

Trying to use a JSON array to insert data into SQL with C#, but getting a Newtonsoft.Json.Linq.JValue error

I'm passing an array of JSON data from the client side to the server side (C#). However, when I try to insert this data into a SQL table, I get a Newtonsoft.Json.Linq.JValue error at the moment of query execution.
Here's the code I have so far:
Client (JS):
var json = [];
var Firstcotizacion = {
    idCotizacion: "111111",
    idProyecto: "8047",
    nombreProyecto: "Edificio Caiquen",
    idProducto: "MLC462815278"
};
var Secondcotizacion = {
    idCotizacion: "222222",
    idProyecto: "1234",
    nombreProyecto: "Edificio malbec",
    idProducto: "MLC29870FD"
};
json.push(Firstcotizacion);
json.push(Secondcotizacion);
$.post("../Ajax/GuardaCotizacionesPI", { json: JSON.stringify(json) }, function (data) {
    console.log(data);
});
Server-side (C#):
public string GuardarCotizacion(string json)
{
    string SP_INSERTA_COTIZACIONPI = "SVI_CPI_COTIZACION_PI";
    dynamic cotizaciones = JsonConvert.DeserializeObject(json);
    foreach (var cotizacion in cotizaciones)
    {
        using (SqlConnection conexion = new SqlConnection(Inventa.PazCorp.Data.BaseDatos.StringConexionGestionContactos(System.Configuration.ConfigurationManager.AppSettings["ambiente"].ToString())))
        {
            conexion.Open();
            using (SqlCommand comm = new SqlCommand(SP_INSERTA_COTIZACIONPI, conexion))
            {
                comm.CommandType = System.Data.CommandType.StoredProcedure;
                comm.Parameters.AddWithValue("IDCOTIZACION", cotizacion.idCotizacion);
                comm.Parameters.AddWithValue("IDPROYECTO", cotizacion.idProyecto);
                comm.Parameters.AddWithValue("NOMBRE_PROYECTO", cotizacion.nombreProyecto);
                comm.Parameters.AddWithValue("IDPRODUCTO", cotizacion.idProducto);
                comm.ExecuteNonQuery(); // Here is where the error appears
            }
        }
    }
    return json;
}
Any Idea what could be wrong?
Because you use the non-generic version of JsonConvert.DeserializeObject, JSON.NET deserializes everything into its own objects, like JObject and JValue. And because you hide all of that behind dynamic, you are susceptible to all sorts of runtime errors.
If you use a concrete class instead, you get type-safe code. For example:
public class MyThing
{
    public string idCotizacion { get; set; }
    public string idProyecto { get; set; }
    public string nombreProyecto { get; set; }
    public string idProducto { get; set; }
}
Now use the generic method:
var cotizaciones = JsonConvert.DeserializeObject<List<MyThing>>(json);
And the rest of your code should fit right in, but is now much clearer and safer.
As an aside, you should not be using AddWithValue: it makes ADO.NET infer the parameter's SQL type from the .NET value, which can hurt both correctness and performance.
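A sketch of the explicitly typed alternative (the SqlDbType choices and lengths here are assumptions; match them to the parameter declarations in your stored procedure):

```csharp
using System.Data;

comm.Parameters.Add("@IDCOTIZACION", SqlDbType.VarChar, 20).Value = cotizacion.idCotizacion;
comm.Parameters.Add("@IDPROYECTO", SqlDbType.VarChar, 20).Value = cotizacion.idProyecto;
comm.Parameters.Add("@NOMBRE_PROYECTO", SqlDbType.VarChar, 100).Value = cotizacion.nombreProyecto;
comm.Parameters.Add("@IDPRODUCTO", SqlDbType.VarChar, 20).Value = cotizacion.idProducto;
```

Declaring the type up front avoids implicit conversions on the server and keeps query plans stable.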
Based on the code you posted, I would implement a Cotizacion class :
public class Cotizacion
{
    public string idCotizacion { get; set; }
    public string idProyecto { get; set; }
    public string nombreProyecto { get; set; }
    public string idProducto { get; set; }
}
And then do something like :
public string GuardarCotizacion(string json)
{
    string SP_INSERTA_COTIZACIONPI = "SVI_CPI_COTIZACION_PI";
    var cotizaciones = JsonConvert.DeserializeObject<List<Cotizacion>>(json);
    foreach (var cotizacion in cotizaciones)
    {
        using (SqlConnection conexion = new SqlConnection(Inventa.PazCorp.Data.BaseDatos.StringConexionGestionContactos(System.Configuration.ConfigurationManager.AppSettings["ambiente"].ToString())))
        {
            conexion.Open();
            using (SqlCommand comm = new SqlCommand(SP_INSERTA_COTIZACIONPI, conexion))
            {
                comm.CommandType = System.Data.CommandType.StoredProcedure;
                comm.Parameters.AddWithValue("IDCOTIZACION", cotizacion.idCotizacion);
                comm.Parameters.AddWithValue("IDPROYECTO", cotizacion.idProyecto);
                comm.Parameters.AddWithValue("NOMBRE_PROYECTO", cotizacion.nombreProyecto);
                comm.Parameters.AddWithValue("IDPRODUCTO", cotizacion.idProducto);
                comm.ExecuteNonQuery();
            }
        }
    }
    return json;
}

How to retrieve only a few columns of CSV data using column names instead of column numbers in C#

I have a CSV consisting of many columns, from which I have to select only a few required columns.
The code I have written is:
for (int i = 0; i < lineCount; i++)
{
    var line = str.ReadLine();
    if (line != null)
    {
        var values = line.Split(',');
        dataInformation.Add(new DataInformation
        {
            timestamp_iso = values[3],
            last_attributed_touch_data_tilde_campaign = values[9],
            last_attributed_touch_data_tilde_channel = values[11],
            last_attributed_touch_data_tilde_feature = values[12],
            last_attributed_touch_data_tilde_ad_set_name = values[19],
            user_data_platform = values[69],
            user_data_aaid = values[70],
            user_data_idfa = values[71],
            user_data_idfv = values[72]
        });
    }
}
I am getting wrong values while using this. Is there another approach that retrieves the values using column names instead of column numbers?
DataInformation is a class:
public class DataInformation
{
    public string timestamp_iso { get; set; }
    public string last_attributed_touch_data_tilde_campaign { get; set; }
    public string last_attributed_touch_data_tilde_channel { get; set; }
    public string last_attributed_touch_data_tilde_feature { get; set; }
    public string last_attributed_touch_data_tilde_ad_set_name { get; set; }
    public string user_data_platform { get; set; }
    public string user_data_aaid { get; set; }
    public string user_data_idfa { get; set; }
    public string user_data_idfv { get; set; }
}
Please help me on this.
I recommend using a library to deal with the CSV format. CsvHelper is a good one. It allows accessing fields by column name:
csv.Read();
var field = csv["HeaderName"];
CSV format may look simple, but there are a few corner cases (like quotes), so it is better to use an existing solution.
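If pulling in a library is not an option, a minimal sketch of the same header-name idea: read the header line once, build a name-to-index map, and look fields up by name (the `dataInformation` list and file name are assumed from the question's code). Note this naive `Split(',')` does NOT handle quoted fields containing commas, which is exactly why a real CSV library is the safer choice:

```csharp
using System.Collections.Generic;
using System.IO;

using (var str = new StreamReader(fileName))
{
    var headers = str.ReadLine().Split(',');
    var col = new Dictionary<string, int>();
    for (int i = 0; i < headers.Length; i++)
        col[headers[i].Trim()] = i;       // column name -> index

    string line;
    while ((line = str.ReadLine()) != null)
    {
        var values = line.Split(',');
        dataInformation.Add(new DataInformation
        {
            timestamp_iso = values[col["timestamp_iso"]],
            user_data_platform = values[col["user_data_platform"]]
            // ...remaining properties looked up by name the same way
        });
    }
}
```

This way the code keeps working even when the exporter reorders or inserts columns, which is the usual cause of "wrong values" with hard-coded indexes.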
I used the code below to get all the records of type DataInformation:
using (TextReader fileReader = File.OpenText(FileName))
{
    var csv = new CsvReader(fileReader);
    dataInformation = csv.GetRecords<DataInformation>().ToList();
}
And after that I have used the below code to get the required columns.
using (TextWriter writer = new StreamWriter(ConfigurationManager.AppSettings["downloadFilePath"] + ConfigurationManager.AppSettings["fileName"] + date + ConfigurationManager.AppSettings["csvExtension"].ToString()))
{
    using (var csv = new CsvWriter(TextWriter.Synchronized(writer)))
    {
        csv.WriteHeader(typeof(DataInformation));
        csv.NextRecord();
        csv.WriteRecords(dataInformation);
    }
}
It works for me.

Populate Model class from Data in Backend

I have a database that has two tables as follows (please ignore the data; the format looks as follows):
Now I have a Model class that is constructed as follows:
public class FamilyModel
{
    public string Name { get; set; }
    public List<FamilyModel> FamilyList { get; set; }

    public FamilyModel()
    {
        FamilyList = new List<FamilyModel>();
    }
}
Now all I want is to get data from the two tables and populate the list.
So I have a stored procedure that returns data as follows:
I have written some code to populate the above class, but it doesn't work: I get a count of 5 when I debug. I want the count to be 2, and when expanded I want something like FamilyA -> {Nick, Tom, Pam}, FamilyB -> {Harry}, and so on. Please help me fix this code.
public static FamilyModel familyData()
{
    //FamilyModel fml = new FamilyModel();
    //fml.FamilyList = new List<FamilyModel>();
    using (SqlConnection con = new SqlConnection(@"Data Source=(LocalDB)\v11.0; AttachDbFilename=|DataDirectory|\Families.mdf; Integrated Security=True; Connect Timeout=30;"))
    {
        con.Open();
        SqlCommand cmd = new SqlCommand("sp_GetFamilies", con);
        cmd.CommandType = CommandType.StoredProcedure;
        SqlDataReader dr = cmd.ExecuteReader();
        while (dr.Read())
        {
            FamilyModel fm = new FamilyModel();
            fm.Name = dr["FamilyName"].ToString();
            foreach (var item in dr["ChildName"].ToString())
            {
                if (Convert.ToInt32(dr["id"]) == Convert.ToInt32(dr["FID"]))
                {
                    fm.FamilyList.Add(new FamilyModel() { Name = dr["ChildName"].ToString() });
                }
            }
        }
        return fm;
    }
}
Here is some source code that should get the right idea across. Below it, I've included some explanation for what's going on.
using Dapper;

public class FamilyModel
{
    public int Id { get; set; }
    public string FamilyName { get; set; }
    public List<Person> Members { get; set; } = new List<Person>(); // auto-property initializer, C# 6 and later
}

public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class DatabasePOCO
{
    public string FamilyName { get; set; }
    public string ChildName { get; set; }
    public int Fid { get; set; }
    public int Id { get; set; }
}
void Main()
{
    using (IDbConnection conn = new SqlConnection("..."))
    {
        conn.Open();
        var raw = conn.Query<DatabasePOCO>("sp_GetFamilies",
            commandType: CommandType.StoredProcedure); // could be dynamic instead of typed
        var familyList = raw
            .GroupBy(x => x.Fid)
            .Select(x =>
            {
                var rawMembers = x.ToList();
                var fId = x.First().Fid;
                var fName = x.First().FamilyName;
                var members = rawMembers.Select(y => new Person
                {
                    Id = y.Id,
                    Name = y.ChildName
                });
                return new FamilyModel
                {
                    Id = fId,
                    FamilyName = fName,
                    Members = members.ToList()
                };
            });
        // Whatever else you want to do here
    }
}
Consider using Dapper. It is a great ORM that makes accessing data from a database really easy. It's designed to work with SQL Server, but I've had success using Oracle too, and most other RDBMS systems will probably work.
Consider using Slapper. If you have control over your stored procedure, this can reduce a lot of the boilerplate code above.
If you use Dapper (I hope you do), you can play around with C# dynamic objects, or you can create a POCO to help get some type enforcement on your code.
Understand whether you care about reference equality. The code I provided above does not enforce reference equality of objects. Reference equality, in my experience, doesn't buy you much and is a pain to enforce.
You need to distinguish between a new row in the data set and a new FamilyModel. One way to do this is to declare a list of models, then look up the "right" one before you add the current child row:
var rootModel = new FamilyModel();
rootModel.Name = "root";
// ... Set up data reader ...
while (dr.Read())
{
    // This requires support for the Id in your FamilyModel:
    var id = (int)dr["Id"];
    // FirstOrDefault returns null when the family hasn't been seen yet
    // (First() would throw on an empty match instead).
    var fm = rootModel.FamilyList.Where(x => x.Id == id).FirstOrDefault();
    if (fm == null)
    {
        fm = new FamilyModel();
        fm.Name = dr["FamilyName"].ToString();
        rootModel.FamilyList.Add(fm);
    }
    fm.FamilyList.Add(new FamilyModel() { Name = dr["ChildName"].ToString() });
}
For each row in your database query, you'll:
Try to look up that family in your list
If you don't find one, create a new one. Add it to your top-level list.
Add the child name as a sub-element of the "current" family.
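The same steps can be sketched with a dictionary keyed by family id, which avoids the linear `Where(...)` scan on every row; this is a variant sketch assuming the same reader columns as above:

```csharp
using System;
using System.Collections.Generic;

var families = new Dictionary<int, FamilyModel>();
while (dr.Read())
{
    var id = (int)dr["Id"];
    if (!families.TryGetValue(id, out var fm))
    {
        // First time we see this family: create it and register it.
        fm = new FamilyModel { Name = dr["FamilyName"].ToString() };
        families[id] = fm;
    }
    // Every row contributes one child to the current family.
    fm.FamilyList.Add(new FamilyModel { Name = dr["ChildName"].ToString() });
}
// families.Values now holds one FamilyModel per family, children nested inside.
```

With two families and five child rows, `families.Count` comes out as 2, which is the grouping the question asks for.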

23,908 objects in a JSON I am trying to add values to properties before adding them to SQL database

I am creating a Web API that receives a stringified JSON. After the JSON convert, and before I add the result to the SQL database, I assign values to two of the properties in a for loop, but it takes more than 5 minutes before it finally hits db.SaveChanges(). The JSON has around 23,908 objects. Is there a better implementation than mine to speed up adding each object?
// This handles the stringified JSON conversion and assigns the property values
// before adding everything to the SQL database.
public void SaveCSV(string file, string fileName)
{
    var csv = JsonConvert.DeserializeObject<List<SecurityFile>>(file);
    using (ApplicationDbContext db = ApplicationDbContext.Create())
    {
        // For loop that adds the values to each object in the JSON
        for (var i = 0; i < csv.Count; i++)
        {
            csv[i].DateSubmitted = DateTime.Now;
            csv[i].FileName = fileName;
            db.SecurityFiles.Add(csv[i]);
        }
        // Saves it to the SQL database
        db.SaveChanges();
    }
}
// Here is my class
public class SecurityFile
{
    [Key]
    public int ID { get; set; }
    [JsonProperty("Plugin ID")]
    public string PluginId { get; set; }
    [JsonProperty("CVE")]
    public string CVE { get; set; }
    [JsonProperty("Risk")]
    public string Risk { get; set; }
    [JsonProperty("Host")]
    public string Host { get; set; }
    [JsonProperty("Protocol")]
    public string Protocol { get; set; }
    [JsonProperty("Port")]
    public string Port { get; set; }
    [JsonProperty("Name")]
    public string Name { get; set; }
    [JsonProperty("Synopsis")]
    public string Synopsis { get; set; }
    [JsonProperty("Description")]
    public string Description { get; set; }
    [JsonProperty("Solution")]
    public string Solution { get; set; }
    [JsonProperty("See Also")]
    public string SeeAlso { get; set; }
    [JsonProperty("FileName")]
    public string FileName { get; set; }
    [JsonProperty("DateSubmitted")]
    public DateTime DateSubmitted { get; set; }
}
There are several optimization options.
You may profit from parallelizing the property assignment using the TPL, and then adding the entities all at once:
using System.Threading.Tasks;

public void SaveCSV(string file, string fileName)
{
    var csv = JsonConvert.DeserializeObject<List<SecurityFile>>(file);
    using (ApplicationDbContext db = ApplicationDbContext.Create())
    {
        var now = DateTime.Now;
        Parallel.ForEach(csv, item =>
        {
            item.DateSubmitted = now;
            item.FileName = fileName;
        });
        // Attach the entities all at once
        db.SecurityFiles.AddRange(csv);
        // Saves it to the SQL database
        db.SaveChanges();
    }
}
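If this is an EF6 DbContext (an assumption; the flags below are the standard EF6 configuration properties and do not exist in the same form in EF Core), another tweak worth trying is to disable automatic change detection and validation while bulk-adding, which often cuts insert preparation time dramatically:

```csharp
using (ApplicationDbContext db = ApplicationDbContext.Create())
{
    // Skip per-entity change scans and validation during the bulk add.
    db.Configuration.AutoDetectChangesEnabled = false;
    db.Configuration.ValidateOnSaveEnabled = false;

    db.SecurityFiles.AddRange(csv);
    db.SaveChanges();
}
```

The change-tracker scan is O(n) per Add by default, so for ~24,000 entities the repeated scans dominate; AddRange plus disabled detection avoids that quadratic behavior.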
20,000 objects isn't very much.
You should be able to do it in well under five seconds, possibly in around 700 ms.
Your way is very slow because you effectively do:
foreach (jsonObject)
    connection.open()
    cmd.execute() // individual insert statement
    connection.close()
Opening and closing a connection is expensive, as is executing each command separately.
You should instead batch the statements:
int batchSize = 100;
System.Text.StringBuilder sb = new System.Text.StringBuilder();
connection.Open();
transaction.Begin();
for (int i = 0; i < Count; ++i)
{
    sb.Append("SQL-Insert");
    if (i % batchSize == 0 && i != 0)
    {
        execute(sb.ToString());
        sb.Length = 0;
    }
}
execute(sb.ToString());
transaction.Commit();
// TODO: Try/Catch + Rollback
connection.Close();
A much easier way is to fetch the table you want to insert your objects into with SELECT * FROM table_name WHERE (1=2), add the entries to the resulting DataTable, and then call DataAdapter.Update().
If your table has a primary key, you can even auto-generate the insert command using SqlCommandBuilder.
C#:
public static void InsertUpdateDataTable(string strTableName, System.Data.DataTable dt)
{
    string strSQL = string.Format("SELECT * FROM [{0}] WHERE 1 = 2 ", strTableName.Replace("]", "]]"));
    using (System.Data.SqlClient.SqlDataAdapter daInsertUpdate = new System.Data.SqlClient.SqlDataAdapter(strSQL, getConnectionString()))
    {
        SqlCommandBuilder cmdBuilder = new SqlCommandBuilder(daInsertUpdate);
        daInsertUpdate.InsertCommand = cmdBuilder.GetInsertCommand();
        daInsertUpdate.UpdateCommand = cmdBuilder.GetUpdateCommand();
        daInsertUpdate.Update(dt);
    }
}
VB.NET:
Public Shared Sub InsertUpdateDataTable(strTableName As String, dt As System.Data.DataTable)
    Dim strSQL As String = String.Format("SELECT * FROM [{0}] WHERE 1 = 2 ", strTableName.Replace("]", "]]"))
    Using daInsertUpdate As New System.Data.SqlClient.SqlDataAdapter(strSQL, getConnectionString())
        Dim cmdBuilder As New SqlCommandBuilder(daInsertUpdate)
        daInsertUpdate.InsertCommand = cmdBuilder.GetInsertCommand()
        daInsertUpdate.UpdateCommand = cmdBuilder.GetUpdateCommand()
        daInsertUpdate.Update(dt)
    End Using
End Sub
You can set the batch size on the DataAdapter, and it generates the SQL for you as well.
And in a database-agnostic way:
private static System.Data.Common.DbProviderFactory m_factory = System.Data.Common.DbProviderFactories.GetFactory(typeof(System.Data.SqlClient.SqlClientFactory).Namespace);

public static void InsertUpdateDataTable(string strTableName, System.Data.DataTable dt)
{
    if (dt == null)
        throw new System.ArgumentNullException("DataTable dt may not be NULL.");

    // https://msdn.microsoft.com/en-us/library/aadf8fk2(v=vs.110).aspx
    using (System.Data.Common.DbDataAdapter daInsertUpdate = m_factory.CreateDataAdapter())
    {
        using (System.Data.Common.DbConnection conn = m_factory.CreateConnection())
        {
            conn.ConnectionString = getConnectionString();
            daInsertUpdate.SelectCommand = conn.CreateCommand();
            daInsertUpdate.SelectCommand.CommandText = string.Format("SELECT * FROM [{0}] WHERE 1 = 2 ", strTableName.Replace("]", "]]"));
            using (System.Data.Common.DbCommandBuilder cmdBuilder = m_factory.CreateCommandBuilder())
            {
                cmdBuilder.DataAdapter = daInsertUpdate;
                daInsertUpdate.InsertCommand = cmdBuilder.GetInsertCommand();
                daInsertUpdate.UpdateCommand = cmdBuilder.GetUpdateCommand();
            } // End Using cmdBuilder
            daInsertUpdate.Update(dt);
        } // End Using conn
    } // End Using daInsertUpdate
}
And using a transaction (assuming ls is a list of numbers):
private static System.Data.Common.DbProviderFactory m_factory = System.Data.Common.DbProviderFactories.GetFactory(typeof(System.Data.SqlClient.SqlClientFactory).Namespace);

public static string getConnectionString()
{
    System.Data.SqlClient.SqlConnectionStringBuilder csb = new System.Data.SqlClient.SqlConnectionStringBuilder();
    csb.DataSource = System.Environment.MachineName;
    csb.InitialCatalog = "TestDb";
    csb.IntegratedSecurity = true;
    return csb.ConnectionString;
}

public static System.Data.Common.DbConnection GetConnection()
{
    var con = m_factory.CreateConnection();
    con.ConnectionString = getConnectionString();
    return con;
}

public static int BatchedInsert(System.Collections.IList ls)
{
    int iAffected = 0;
    int batchSize = 100; // Each batch corresponds to a single round-trip to the DB.
    using (System.Data.IDbConnection idbConn = GetConnection())
    {
        lock (idbConn)
        {
            using (System.Data.IDbCommand cmd = idbConn.CreateCommand())
            {
                lock (cmd)
                {
                    if (cmd.Connection.State != System.Data.ConnectionState.Open)
                        cmd.Connection.Open();

                    using (System.Data.IDbTransaction idbtTrans = idbConn.BeginTransaction())
                    {
                        try
                        {
                            cmd.Transaction = idbtTrans;
                            System.Text.StringBuilder sb = new System.Text.StringBuilder();
                            for (int i = 0; i < ls.Count; ++i)
                            {
                                sb.Append("INSERT INTO T_TransactionInsertTest(TestValue) VALUES ( ");
                                sb.Append(ls[i].ToString());
                                sb.AppendLine(");");
                                if (i % batchSize == 0 && i != 0)
                                {
                                    cmd.CommandText = sb.ToString();
                                    iAffected += cmd.ExecuteNonQuery();
                                    sb.Length = 0;
                                }
                            }
                            if (sb.Length != 0)
                            {
                                cmd.CommandText = sb.ToString();
                                iAffected += cmd.ExecuteNonQuery();
                            }
                            idbtTrans.Commit();
                        } // End Try
                        catch (System.Data.Common.DbException ex)
                        {
                            if (idbtTrans != null)
                                idbtTrans.Rollback();
                            iAffected = -1;
                            //if (Log(ex))
                            throw;
                        } // End catch
                        finally
                        {
                            if (cmd.Connection.State != System.Data.ConnectionState.Closed)
                                cmd.Connection.Close();
                        } // End Finally
                    } // End Using idbtTrans
                } // End lock cmd
            } // End Using cmd
        } // End lock idbConn
    } // End Using idbConn
    return iAffected;
} // End Function BatchedInsert
I would recommend inserting all rows with one SQL query rather than inserting each row in a loop.
You can insert all items into a temp table, then build an insert-join statement and execute it outside of the loop.
Take a look at this question: bulk-record-update-with-sql.
Alternatively, you can execute a BULK INSERT statement:
using (ApplicationDbContext db = ApplicationDbContext.Create())
{
    db.Database.ExecuteSqlCommand(@"BULK INSERT SecurityFiles
        FROM 'your file path'
        WITH
        (
            FIRSTROW = 1,
            FIELDTERMINATOR = ';',
            ROWTERMINATOR = '\n'
        )");
}
