I need to retrieve an image from a database and save it to disk. In the database, the image is stored in binary form (as a Base64 string), but the datatype of the column is varchar(5000).
This is the code that I am using to retrieve the image and save it to disk:
public void CreateImageDataUsingDataReader_ForNetezzaDB()
{
    string strDbConn = string.Empty;
    string strImageFileName = string.Empty;
    string strImageData = string.Empty;
    string strImgSavePath = string.Empty;
    string strQuery = string.Empty;
    Byte[] byteImageData;
    MemoryStream stmImageData = new MemoryStream();
    Image saveImage;
    try
    {
        //---open the database connection
        strDbConn = ConfigurationSettings.AppSettings["NetezzaDBConnection"].ToString().Trim();
        OleDbConnection dbcon = new OleDbConnection(strDbConn);
        dbcon.Open();
        strQuery = "select name,signature_vod__c from sfb_call2_vod where signature_vod__c is not null limit 10";
        OleDbCommand cmdSelect = new OleDbCommand(strQuery, dbcon);
        OleDbDataReader imageReader = cmdSelect.ExecuteReader();
        if (imageReader.HasRows)
        {
            while (imageReader.Read())
            {
                strImageFileName = imageReader["name"].ToString().Trim();
                strImageData = imageReader["signature_vod__c"].ToString().Trim();
                stmImageData.Seek(0, SeekOrigin.Begin);
                //---convert the Base64 string to a byte array
                byteImageData = Convert.FromBase64String(strImageData);
                //---create a memory stream from the image byte data
                stmImageData.Write(byteImageData, 0, byteImageData.Length);
                //---save the image
                //saveImage = Image.FromStream(stmImageData);
                using (saveImage = Image.FromStream(stmImageData))
                {
                    strImgSavePath = ConfigurationSettings.AppSettings["ImageSavePath"].ToString().Trim();
                    saveImage.Save(strImgSavePath + strImageFileName + ".png", System.Drawing.Imaging.ImageFormat.Png); //---error occurs on this line
                }
            }
        }
        imageReader.Close();
        dbcon.Close();
        stmImageData.Close();
        stmImageData = null;
    }
    catch (Exception ex)
    {
        throw new Exception("Error occurred in method CreateImageDataUsingDataReader: " + ex.Message);
    }
}
but I keep getting an error:
A generic error occurred in GDI+.
The same code works fine when I run it against a SQL Server database; the issue occurs only with the Netezza database.
Please help me resolve this issue.
You mention that you store the binary image in a varchar column. This, together with the fact that it works on another database technology, makes it obvious that you are reading different data back in the Netezza case.
I would suggest setting up a test project where you persist the same image to the two different databases (Netezza and MSSQL), read it back from both, and do a bitwise comparison of the results, either between the two database results or between the original and what was read back from the database.
I would be surprised if you got the same result, and if I am right, you should probably consider using a binary data type to persist the image data in your database backend.
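The bitwise comparison suggested above can be sketched as follows (the helper name is mine, not from the answer); it returns -1 when the arrays are identical, otherwise the offset of the first difference, which also reveals truncation:

```csharp
using System;

static class ImageBytes
{
    // Compare the original image bytes with the bytes read back from the
    // database: -1 means identical, otherwise the first differing offset.
    // If the lengths differ but the common prefix matches, the shorter
    // length is returned, which indicates truncation.
    public static int FirstDifference(byte[] expected, byte[] actual)
    {
        int len = Math.Min(expected.Length, actual.Length);
        for (int i = 0; i < len; i++)
            if (expected[i] != actual[i])
                return i;
        return expected.Length == actual.Length ? -1 : len;
    }
}
```

If the Netezza round trip reports a difference right at the 5000-character mark, the varchar(5000) column is truncating the Base64 text, which would explain why `Image.FromStream` then fails with the generic GDI+ error.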
Related
Hi, I have a problem importing a CSV file into SQL Server. This CSV file contains articles that need to be saved in the SQL Server database. Once the import (done with the C# code written below) is finished, some imported fields (such as Descrizione and CodArt) are not written correctly to the database and contain strange characters. To download the csv file click here.
SqlServer improper import over blue line:
Import C# Code:
using (var rd = new StreamReader(labelPercorso.Text))
{
    Articolo a = new Articolo();
    a.db = this.db;
    while (!rd.EndOfStream)
    {
        // reset CodEAN and Immagine to empty on every iteration
        CodEAN = "";
        Immagine = "";
        try
        {
            var splits = rd.ReadLine().Split(';');
            CodArt = splits[0];
            Descrizione = splits[1];
            String Price = splits[2];
            Prezzo = decimal.Parse(Price);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Non è presente nè immagine nè codean");
        }
        a.Prezzo = Prezzo;
        a.CodiceArticolo = CodArt;
        a.Descrizione = Descrizione;
        a.Fornitore = fornitore;
        // TODO: check whether the article already exists and, if so, update it
        a.InserisciArticoloCSV();
    }
}
Code of function: InserisciArticoloCSV
try
{
    SqlConnection conn = db.apriconnessione();
    String query = "INSERT INTO Articolo(CodArt,Descrizione,Prezzo,PrezzoListino,Fornitore,Importato,TipoArticolo) VALUES(@CodArt,@Descrizione,@Prezzo,@PrezzoListino,@Fornitore,@Importato,@TipoArticolo)";
    String Importato = "CSV";
    String TipoArticolo = "A";
    SqlCommand cmd = new SqlCommand(query, conn);
    // MessageBox.Show("CodArt: " + CodiceArticolo + "\n Descrizione :" + Descrizione + "\n Prezzo: " + Prezzo);
    cmd.Parameters.AddWithValue("@CodArt", CodiceArticolo.ToString());
    cmd.Parameters.AddWithValue("@Descrizione", Descrizione.ToString());
    cmd.Parameters.AddWithValue("@Prezzo", Prezzo);
    cmd.Parameters.AddWithValue("@PrezzoListino", Prezzo);
    cmd.Parameters.AddWithValue("@Fornitore", Fornitore.ToString());
    cmd.Parameters.AddWithValue("@Importato", Importato.ToString());
    cmd.Parameters.AddWithValue("@TipoArticolo", TipoArticolo.ToString());
    cmd.ExecuteNonQuery();
    db.chiudiconnessione();
    conn.Close();
    return true;
}
catch (Exception ex)
{
    Console.WriteLine("Errore nell'inserimento dell'articolo " + ex);
    //MessageBox.Show("Errore nel inserimento dell'articolo: " + ex);
    return false;
}
Your CSV file is not well formatted; there are intermediate carriage returns in between, which screw up the parsing. Open the file in Notepad++ and turn on showing line breaks, and this is what you will find.
So for the lines that are in the correct format, the data import works fine; for the others, the logic does not work.
As others have pointed out, you have numerous problems: encoding, carriage returns, and a lot of white space. In addition, you are using single-row inserts into your database, which is very slow. The sample code below illustrates how to deal with all of these points.
IFormatProvider fP = new CultureInfo("it");
DataTable tmp = new DataTable();
tmp.Columns.Add("CodArt", typeof(string));
tmp.Columns.Add("Descrizione", typeof(string));
tmp.Columns.Add("Prezzo", typeof(decimal));
using (var rd = new StreamReader("yourFileName", Encoding.GetEncoding("iso-8859-1")))
{
    while (!rd.EndOfStream)
    {
        try
        {
            var nextLine = Regex.Replace(rd.ReadLine(), @"\s+", " ");
            while (nextLine.Split(';').Length < 3)
            {
                nextLine = nextLine.Replace("\r\n", "") + Regex.Replace(rd.ReadLine(), @"\s+", " ");
            }
            var splits = nextLine.Split(';');
            DataRow dR = tmp.NewRow();
            dR[0] = splits[0];
            dR[1] = splits[1];
            string Price = splits[2];
            dR[2] = decimal.Parse(Price, fP);
            tmp.Rows.Add(dR);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
    }
}
using (var conn = db.apriconnessione())
{
    var sBC = new SqlBulkCopy(conn);
    conn.Open();
    sBC.DestinationTableName = "yourTableName";
    sBC.WriteToServer(tmp);
    conn.Close();
}
Now for some explanation:
Firstly I am storing the parsed values in a DataTable. Please note that I have only included the three fields that are in the CSV. In practice you must supply the other columns and fill the extra columns with the correct values for each row. I was simply being lazy, but I am sure you will get the idea.
I do not know what encoding your csv file is, but iso-8859-1 worked for me!
I use Regex to replace multiple white space with a single space.
If any line does not have the required number of splits, I keep adding further lines (having deleted the carriage return) until I hit success!
Once I have a complete line, I can now split it, and assign it to the new DataRow (please see my comments above for extra columns).
Finally once the file has been read, the DataTable will have all the rows and can be uploaded to your database using BulkCopy. This is very fast!
HTH
PS Some of your lines have double quotes. You probably want to get rid of these as well!
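For instance, a minimal sketch of stripping those quotes during the same cleanup pass (the helper name is mine, not from the answer):

```csharp
static class CsvClean
{
    // Remove stray double quotes from a raw CSV line before splitting on ';'.
    // Note: this is only safe because the file uses ';' as its delimiter and
    // the quotes here are noise, not CSV-style quoting of embedded delimiters.
    public static string StripQuotes(string line) => line.Replace("\"", "");
}
```

This would be applied to `nextLine` right after the white-space cleanup and before the split.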
You should specify the correct encoding when you read your file. Is it UTF-8? Is it ASCII with a specific code page? You should also specify the SqlDbType of your SQL parameters, especially the string parameters, which will be either varchar or nvarchar; there is a big difference between them.
// what is the encoding of your file? This is an example using code page windows-1252
var encoding = Encoding.GetEncoding("windows-1252");
using (var file = File.Open(labelPercorso.Text, FileMode.Open))
using (var reader = new StreamReader(file, encoding))
{
// rest of code unchanged
}
SQL code. Note that I added using blocks for the types that implement IDisposable, like the connection and command.
try
{
    String query = "INSERT INTO Articolo(CodArt,Descrizione,Prezzo,PrezzoListino,Fornitore,Importato,TipoArticolo) VALUES(@CodArt,@Descrizione,@Prezzo,@PrezzoListino,@Fornitore,@Importato,@TipoArticolo)";
    String Importato = "CSV";
    String TipoArticolo = "A";
    using (SqlConnection conn = db.apriconnessione())
    using (SqlCommand cmd = new SqlCommand(query, conn))
    {
        // -1 indicates you used MAX, like nvarchar(max); otherwise use the maximum number of characters in the schema
        cmd.Parameters.Add(new SqlParameter("@CodArt", SqlDbType.NVarChar, -1)).Value = CodiceArticolo.ToString();
        cmd.Parameters.Add(new SqlParameter("@Descrizione", SqlDbType.NVarChar, -1)).Value = Descrizione.ToString();
        /*
        Rest of your parameters created in the same manner
        */
        cmd.ExecuteNonQuery();
        db.chiudiconnessione();
    }
    return true;
}
catch (Exception ex)
{
    Console.WriteLine("Errore nell'inserimento dell'articolo " + ex);
    //MessageBox.Show("Errore nel inserimento dell'articolo: " + ex);
    return false;
}
Just in case you are interested in exploring a library to handle all your parsing needs with a few lines of code, you can check out Cinchoo ETL, an open source library. Here is a sample that parses the CSV file and shows how to get either a DataTable or a list of records to later load into the database.
System.Threading.Thread.CurrentThread.CurrentCulture = new CultureInfo("it");
using (var p = new ChoCSVReader("Bosch Luglio 2017.csv")
    .Configure((c) => c.MayContainEOLInData = true) // handle newline chars in data
    .Configure(c => c.Encoding = Encoding.GetEncoding("iso-8859-1")) // specify the encoding for reading
    .WithField("CodArt", 1) // first column
    .WithField("Descrizione", 2) // second column
    .WithField("Prezzo", 3, fieldType: typeof(decimal)) // third column
    .Setup(c => c.BeforeRecordLoad += (o, e) =>
    {
        e.Source = e.Source.CastTo<string>().Replace(@"""", String.Empty); // remove the quotes
    }) // scrub the data
    )
{
    var dt = p.AsDataTable();
    //foreach (var rec in p)
    //    Console.WriteLine(rec.Prezzo);
}
Disclaimer: I'm the author of this library.
I have a scenario where I want to store a video in a database and show it in a grid for download from the database. I will upload the video with a FileUpload control, and it should be a SQL Server 2008 database. Which is the best way to do this in C# and ASP.NET?
There are many links for this. Check out these:
http://weblogs.asp.net/hajan/archive/2010/06/21/save-and-display-youtube-video-links-on-asp-net-website.aspx
http://forums.asp.net/t/1104451.aspx/1?How+to+retrieve+video+file+from+sql+server+database
http://www.saurabhdeveloper.com/techtips_details.php?tipsid=15
http://forums.asp.net/p/1533758/3719583.aspx
http://forums.asp.net/t/1045855.aspx/2/10
http://forums.asp.net/t/1511588.aspx/1
http://www.dotnetspider.com/forum/274821-Play-video-file-asp-net-page.aspx
http://blogs.ugidotnet.org/kfra/archive/2006/10/04/50003.aspx
http://www.dotnetspider.com/resources/16239-code-for-video-upload.aspx
http://www.c-sharpcorner.com/Forums/Thread/88899/
http://www.asp.net/webmatrix/tutorials/10-working-with-video
Try this code
byte[] buffer; // this array of bytes will hold the uploaded file data
SqlConnection connection;

protected void ButtonUpload_Click(object sender, EventArgs e)
{
    // check the file
    if (FileUpload1.HasFile && FileUpload1.PostedFile != null
        && FileUpload1.PostedFile.FileName != "")
    {
        // retrieve the HttpPostedFile object
        HttpPostedFile file = FileUpload1.PostedFile;
        buffer = new byte[file.ContentLength];
        int bytesRead = file.InputStream.Read(buffer, 0,
            FileUpload1.PostedFile.ContentLength);
        if (bytesRead > 0)
        {
            try
            {
                string connectionString =
                    ConfigurationManager.ConnectionStrings[
                        "uploadConnectionString"].ConnectionString;
                connection = new SqlConnection(connectionString);
                SqlCommand cmd = new SqlCommand
                    ("INSERT INTO Videos (Video, Video_Name, Video_Size)" +
                     " VALUES (@video, @videoName, @videoSize)", connection);
                cmd.Parameters.Add("@video",
                    SqlDbType.VarBinary, buffer.Length).Value = buffer;
                cmd.Parameters.Add("@videoName",
                    SqlDbType.NVarChar).Value = FileUpload1.FileName;
                cmd.Parameters.Add("@videoSize",
                    SqlDbType.BigInt).Value = file.ContentLength;
                using (connection)
                {
                    connection.Open();
                    int i = cmd.ExecuteNonQuery();
                    Label1.Text = "uploaded, " + i.ToString() + " rows affected";
                }
            }
            catch (Exception ex)
            {
                Label1.Text = ex.Message.ToString();
            }
        }
    }
    else
    {
        Label1.Text = "Choose a valid video file";
    }
}
You say it "should" be SQL Server 2008, but does it have to be?
If not, SQL Server 2012 offers File Tables that are designed for storing large files in SQL Server. It actually stores the data on the filesystem but in a way that you don't have to write any code to manage the actual file.
If it does need to be SQL Server 2008 then you have the following options:
1) FileStream using a varbinary(MAX) column type - there can be performance issues with this method. I personally would not store video in this way.
2) Put the file on the file system (on disk) and only store the filepath/filename in the DB. You then write code to manage CRUD operations on the file system.
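Option 2 can be sketched like this (a minimal sketch; the folder name and helper are illustrative, not from the answer): write the uploaded bytes to disk and store only the returned path in the database row.

```csharp
using System.IO;

// Sketch of option 2: keep the video on the file system and store only its
// path in the database.
static class VideoStorage
{
    public static string SaveVideoToDisk(string folder, string fileName, byte[] data)
    {
        Directory.CreateDirectory(folder);        // ensure the target folder exists
        string path = Path.Combine(folder, fileName);
        File.WriteAllBytes(path, data);           // write the uploaded bytes to disk
        return path;                              // persist this path in a varchar column
    }
}
```

The grid then binds to the stored paths, and the download handler streams the file from disk; deleting a row must also delete the file, which is the CRUD bookkeeping the answer mentions.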
It will help you... see the following link http://www.codeproject.com/Articles/308552/Upload-and-Download-Files-to-SQL-Servers-in-ASP-Ne
I have an Excel file (xls) that has a column called Money. In the Money column all the cells are formatted as Number, except for some that have the marker saying "stored as text" against them. I convert the Excel file to CSV using a C# script that uses IMEX=1 in the connection string to open it. The fields that are marked "stored as text" do not come through to the CSV file. The file is large, about 20MB, so this means about 100 values like 33344 etc. do not come through to the CSV file.
I tried to put a delay in where I open the Excel file. This worked on my PC but not on the development machine.
Do you have any idea how to get around this without manual intervention, like formatting all columns with mixed data types as Number? I am looking for an automated solution that works every time. This is on SSIS 2008.
static void ConvertExcelToCsv(string excelFilePath, string csvOutputFile, int worksheetNumber = 1) {
    if (!File.Exists(excelFilePath)) throw new FileNotFoundException(excelFilePath);
    if (File.Exists(csvOutputFile)) throw new ArgumentException("File exists: " + csvOutputFile);

    // connection string
    var cnnStr = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=\"Excel 8.0;IMEX=1;HDR=NO\"", excelFilePath);
    var cnn = new OleDbConnection(cnnStr);

    // get schema, then data
    var dt = new DataTable();
    try {
        cnn.Open();
        var schemaTable = cnn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
        if (schemaTable.Rows.Count < worksheetNumber) throw new ArgumentException("The worksheet number provided cannot be found in the spreadsheet");
        string worksheet = schemaTable.Rows[worksheetNumber - 1]["table_name"].ToString().Replace("'", "");
        string sql = String.Format("select * from [{0}]", worksheet);
        var da = new OleDbDataAdapter(sql, cnn);
        da.Fill(dt);
    }
    catch (Exception) {
        throw; // rethrow without resetting the stack trace
    }
    finally {
        // free resources
        cnn.Close();
    }

    // write out CSV data
    using (var wtr = new StreamWriter(csvOutputFile)) {
        foreach (DataRow row in dt.Rows) {
            bool firstLine = true;
            foreach (DataColumn col in dt.Columns) {
                if (!firstLine) { wtr.Write(","); } else { firstLine = false; }
                var data = row[col.ColumnName].ToString().Replace("\"", "\"\"");
                wtr.Write(String.Format("\"{0}\"", data));
            }
            wtr.WriteLine();
        }
    }
}
My solution was to specify a format for the incoming files which said no columns with mixed data types. Solution was from business side and not technology.
I want to save an image in SQL Server using C# WinForms and the Dapper micro ORM.
The Photo field in the db is of type VARBINARY(MAX).
Inside the Book entity I have a Photo property of type byte[].
public class Book
{
    ...
    public byte[] Photo { get; set; }
}
Inside winforms Window I have
OpenFileDialog open = new OpenFileDialog() { Filter = "Image Files(*.jpeg;*.bmp;*.png;*.jpg)|*.jpeg;*.bmp;*.png;*.jpg" };
if (open.ShowDialog() == DialogResult.OK)
{
    txtPhoto.Text = open.FileName;
}
string image = txtPhoto.Text;
Bitmap bmp = new Bitmap(image);
FileStream fs = new FileStream(image, FileMode.Open, FileAccess.Read);
byte[] bimage = new byte[fs.Length];
fs.Read(bimage, 0, Convert.ToInt32(fs.Length));
fs.Close();
byte[] Photo = bimage;

// inside my repository I get an error when saving the object, at the line Photo = @Photo
var sql = "UPDATE Book " +
          "SET Title = @Title, " +
          "    Language = @Language, " +
          ....
          "    Photo = @Photo" +
          "WHERE Id = @Id";
this.db.Execute(sql, book); // error occurs here
return book;
Error is
A first chance exception of type 'System.Data.SqlClient.SqlException'
occurred in System.Data.dll
Additional information: Incorrect syntax near 'Photo'.
Am I missing something?
Thanks
You are missing white space before the WHERE keyword:
" Photo = @Photo" + // no space at the end here
"WHERE Id = @Id";   // and no space before WHERE here
Also, I suggest you use a multiline string (i.e. a verbatim string literal) for the SQL query text; that makes the query more readable:
var sql = @"UPDATE Book
            SET Title = @Title,
                Language = @Language,
                Photo = @Photo
            WHERE Id = @Id";
And one more thing: it's better to wrap the Stream usage in a using block (in order to release the file handle in case of an exception):
byte[] photo;
using (var stream = File.OpenRead(txtPhoto.Text))
{
    photo = new byte[stream.Length];
    stream.Read(photo, 0, photo.Length);
}
// db query with dapper is OK
// db query with dapper is OK
My program is still running right now, importing data from a log file into a remote SQL Server database. The log file is about 80MB in size and contains about 470000 lines, with about 25000 lines of data. My program can import only 300 rows/second, which is really bad. :(
public static int ImportData(string strPath)
{
    //NameValueCollection collection = ConfigurationManager.AppSettings;
    using (TextReader sr = new StreamReader(strPath))
    {
        sr.ReadLine(); // ignore the first three lines of the log file
        sr.ReadLine();
        sr.ReadLine();
        string strLine;
        var cn = new SqlConnection(ConnectionString);
        cn.Open();
        while ((strLine = sr.ReadLine()) != null)
        {
            if (strLine.Trim() != "") // if not a blank line, import into the database
            {
                InsertData(strLine, cn);
                _count++;
            }
        }
        cn.Close();
        sr.Close();
        return _count;
    }
}
InsertData is just a normal insert method using ADO.NET. It uses a parsing method:
public Data(string strLine)
{
    string[] list = strLine.Split(new[] { '\t' });
    try
    {
        Senttime = DateTime.Parse(list[0] + " " + list[1]);
    }
    catch (Exception)
    {
    }
    Clientip = list[2];
    Clienthostname = list[3];
    Partnername = list[4];
    Serverhostname = list[5];
    Serverip = list[6];
    Recipientaddress = list[7];
    Eventid = Convert.ToInt16(list[8]);
    Msgid = list[9];
    Priority = Convert.ToInt16(list[10]);
    Recipientreportstatus = Convert.ToByte(list[11]);
    Totalbytes = Convert.ToInt32(list[12]);
    Numberrecipient = Convert.ToInt16(list[13]);
    DateTime temp;
    if (DateTime.TryParse(list[14], out temp))
    {
        OriginationTime = temp;
    }
    else
    {
        OriginationTime = null;
    }
    Encryption = list[15];
    ServiceVersion = list[16];
    LinkedMsgid = list[17];
    MessageSubject = list[18];
    SenderAddress = list[19];
}
InsertData method:
private static void InsertData(string strLine, SqlConnection cn)
{
    var dt = new Data(strLine); // parse the log line into proper fields
    const string cnnStr =
        "INSERT INTO LOGDATA ([SentTime],[client-ip],[Client-hostname]," +
        "[Partner-Name],[Server-hostname],[server-IP],[Recipient-Address]," +
        "[Event-ID],[MSGID],[Priority],[Recipient-Report-Status],[total-bytes]," +
        "[Number-Recipients],[Origination-Time],[Encryption],[service-Version]," +
        "[Linked-MSGID],[Message-Subject],[Sender-Address]) VALUES (" +
        "@Senttime,@Clientip,@Clienthostname,@Partnername,@Serverhostname," +
        "@Serverip,@Recipientaddress,@Eventid,@Msgid,@Priority," +
        "@Recipientreportstatus,@Totalbytes,@Numberrecipient,@OriginationTime," +
        "@Encryption,@ServiceVersion,@LinkedMsgid,@MessageSubject,@SenderAddress)";
    var cmd = new SqlCommand(cnnStr, cn) { CommandType = CommandType.Text };
    cmd.Parameters.AddWithValue("@Senttime", dt.Senttime);
    cmd.Parameters.AddWithValue("@Clientip", dt.Clientip);
    cmd.Parameters.AddWithValue("@Clienthostname", dt.Clienthostname);
    cmd.Parameters.AddWithValue("@Partnername", dt.Partnername);
    cmd.Parameters.AddWithValue("@Serverhostname", dt.Serverhostname);
    cmd.Parameters.AddWithValue("@Serverip", dt.Serverip);
    cmd.Parameters.AddWithValue("@Recipientaddress", dt.Recipientaddress);
    cmd.Parameters.AddWithValue("@Eventid", dt.Eventid);
    cmd.Parameters.AddWithValue("@Msgid", dt.Msgid);
    cmd.Parameters.AddWithValue("@Priority", dt.Priority);
    cmd.Parameters.AddWithValue("@Recipientreportstatus", dt.Recipientreportstatus);
    cmd.Parameters.AddWithValue("@Totalbytes", dt.Totalbytes);
    cmd.Parameters.AddWithValue("@Numberrecipient", dt.Numberrecipient);
    if (dt.OriginationTime != null)
        cmd.Parameters.AddWithValue("@OriginationTime", dt.OriginationTime);
    else
        cmd.Parameters.AddWithValue("@OriginationTime", DBNull.Value); // insert NULL when OriginationTime was missing
    cmd.Parameters.AddWithValue("@Encryption", dt.Encryption);
    cmd.Parameters.AddWithValue("@ServiceVersion", dt.ServiceVersion);
    cmd.Parameters.AddWithValue("@LinkedMsgid", dt.LinkedMsgid);
    cmd.Parameters.AddWithValue("@MessageSubject", dt.MessageSubject);
    cmd.Parameters.AddWithValue("@SenderAddress", dt.SenderAddress);
    cmd.ExecuteNonQuery();
}
How can my program run faster?
Thank you so much!
Use SqlBulkCopy.
Edit: I created a minimal implementation of IDataReader and created a Batch type so that I could insert arbitrary in-memory data using SqlBulkCopy. Here is the important bit:
IDataReader dr = batch.GetDataReader();
using (SqlTransaction tx = _connection.BeginTransaction())
{
    try
    {
        using (SqlBulkCopy sqlBulkCopy =
            new SqlBulkCopy(_connection, SqlBulkCopyOptions.Default, tx))
        {
            sqlBulkCopy.DestinationTableName = TableName;
            SetColumnMappings(sqlBulkCopy.ColumnMappings);
            sqlBulkCopy.WriteToServer(dr);
            tx.Commit();
        }
    }
    catch
    {
        tx.Rollback();
        throw;
    }
}
The rest of the implementation is left as an exercise for the reader :)
Hint: the only bits of IDataReader you need to implement are Read, GetValue and FieldCount.
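That hint can be fleshed out as a minimal sketch (the class name and row representation are mine, not from the answer): an in-memory IDataReader over a list of object[] rows, where only Read, GetValue, and FieldCount do real work and the remaining interface members are stubs.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Minimal in-memory IDataReader over a list of object[] rows. Only Read,
// GetValue and FieldCount do real work; the remaining members are stubs,
// on the assumption that SqlBulkCopy.WriteToServer does not need them for
// plain index-based column mappings.
sealed class RowDataReader : IDataReader
{
    private readonly IReadOnlyList<object[]> _rows;
    private int _pos = -1; // current row index; -1 = before the first Read()

    public RowDataReader(IReadOnlyList<object[]> rows, int fieldCount)
    {
        _rows = rows;
        FieldCount = fieldCount;
    }

    public int FieldCount { get; }
    public bool Read() => ++_pos < _rows.Count;      // advance to the next row
    public object GetValue(int i) => _rows[_pos][i]; // value of column i in the current row

    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, FieldCount);
        Array.Copy(_rows[_pos], values, n);
        return n;
    }

    // Stubs below: not exercised in this sketch.
    public int Depth => 0;
    public bool IsClosed => false;
    public int RecordsAffected => -1;
    public void Close() { }
    public void Dispose() { }
    public DataTable GetSchemaTable() => throw new NotSupportedException();
    public bool NextResult() => false;
    public object this[int i] => GetValue(i);
    public object this[string name] => throw new NotSupportedException();
    public bool GetBoolean(int i) => (bool)GetValue(i);
    public byte GetByte(int i) => (byte)GetValue(i);
    public long GetBytes(int i, long fo, byte[] buf, int off, int len) => throw new NotSupportedException();
    public char GetChar(int i) => (char)GetValue(i);
    public long GetChars(int i, long fo, char[] buf, int off, int len) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public string GetDataTypeName(int i) => GetFieldType(i).Name;
    public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
    public decimal GetDecimal(int i) => (decimal)GetValue(i);
    public double GetDouble(int i) => (double)GetValue(i);
    public Type GetFieldType(int i) => GetValue(i)?.GetType() ?? typeof(object);
    public float GetFloat(int i) => (float)GetValue(i);
    public Guid GetGuid(int i) => (Guid)GetValue(i);
    public short GetInt16(int i) => (short)GetValue(i);
    public int GetInt32(int i) => (int)GetValue(i);
    public long GetInt64(int i) => (long)GetValue(i);
    public string GetName(int i) => "col" + i;
    public int GetOrdinal(string name) => throw new NotSupportedException();
    public string GetString(int i) => (string)GetValue(i);
    public bool IsDBNull(int i) => GetValue(i) == null || GetValue(i) == DBNull.Value;
}
```

Parsed log lines would be collected into a `List<object[]>`, wrapped in this reader, and handed to `SqlBulkCopy.WriteToServer`.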
Hmmm, let's break this down a little bit.
In pseudocode, what you did is the following:
Open the file
Open a connection
For every line that has data:
    Parse the string
    Save the data in SQL Server
Close the connection
Close the file
Now the fundamental problems in doing it this way are:
You are keeping a SQL connection open while waiting for your line parsing (pretty susceptible to timeouts and stuff)
You might be saving the data line by line, each in its own transaction. We won't know until you show us what the InsertData method is doing
Consequently you are keeping the file open while waiting for SQL to finish inserting
The optimal way of doing this is to parse the file as a whole, and then insert them in bulk. You can do this with SqlBulkCopy (as suggested by Matt Howells), or with SQL Server Integration Services.
If you want to stick with ADO.NET, you can pool your INSERT statements together and pass them off in one large SqlCommand, instead of setting up one SqlCommand object per insert statement.
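That pooling idea can be sketched as follows (a sketch only; the helper and the example table/column names are illustrative, not from the question): build one parameterized multi-row INSERT, suffixing each parameter name with its row index.

```csharp
using System.Linq;

// Sketch: build one multi-row parameterized INSERT instead of issuing one
// command per row. Parameter names get the row index as a suffix
// (@SentTime0, @SentTime1, ...), so each row's values bind to distinct
// parameters on the same command.
static class BatchSql
{
    public static string BuildBatchInsert(string table, string[] columns, int rowCount)
    {
        string cols = string.Join(",", columns);
        string values = string.Join(",",
            Enumerable.Range(0, rowCount).Select(r =>
                "(" + string.Join(",", columns.Select(c => "@" + c + r)) + ")"));
        return $"INSERT INTO {table} ({cols}) VALUES {values}";
    }
}
```

Note that SQL Server caps a single command at 2100 parameters, so the batch size must be chosen accordingly (e.g. with 19 columns, at most around 110 rows per command).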
You create a SqlCommand object for every row of data. The simplest improvement would therefore be to create a single
private static SqlCommand cmdInsert
and declare the parameters once with the Parameters.Add() method. Then, for each data row, set the parameter values using
cmdInsert.Parameters["@paramXXX"].Value = valueXXX;
A second performance improvement might be to skip the creation of a Data object for each row and assign the parameter values directly from the list[] array.