Logging a parameterized SQL query with parameters included - C#

I have a couple of functions that take a SQL query (parameters already added) and run it against the database. When the SQL query fails, I'd like to log the complete query, parameters included, in order to see exactly what caused the failure. query.ToString() just returns "IBM.Data.Informix.IfxCommand", so currently I'm only capturing query.CommandText, but if the parameters are what caused the failure, that alone doesn't tell me exactly what I'm dealing with.
Here's one of the query functions I'm using:
public DataTable CallDtQuery(IfxCommand query)
{
DataTable dt = new DataTable();
using (IBM.Data.Informix.IfxConnection conn = new IfxConnection(sqlConnection))
{
try
{
IBM.Data.Informix.IfxDataAdapter adapter = new IfxDataAdapter();
query.Connection = conn;
conn.Open();
adapter.SelectCommand = new IfxCommand("SET ISOLATION TO DIRTY READ", conn);
adapter.SelectCommand.ExecuteNonQuery(); //Dirty read: don't block or fail on locked rows.
adapter.SelectCommand = query;
adapter.Fill(dt);
conn.Close();
adapter.Dispose();
}
catch (IBM.Data.Informix.IfxException ex)
{
LogError(ex, query.CommandText);
SendErrorEmail(ex, query.CommandText);
DisplayError();
}
}
return dt;
}
Here's the logging function:
private void LogError(IfxException ex, string query)
{ //Logs the error.
string filename = HttpContext.Current.Server.MapPath("~") + "/Logs/sqlErrors.txt";
System.IO.FileStream fs = new System.IO.FileStream(filename, System.IO.FileMode.Append);
System.IO.StreamWriter sw = new System.IO.StreamWriter(fs);
sw.WriteLine("=======BEGIN ERROR LOG=======");
sw.WriteLine(DateTime.Now.ToShortDateString() + " " + DateTime.Now.ToLongTimeString());
sw.WriteLine("Query = " + query);
sw.WriteLine("User = " + HttpContext.Current.Session["UserID"]);
sw.WriteLine("Error Message = " + ex.Message);
sw.WriteLine("Message Source:");
sw.WriteLine(ex.Source);
sw.WriteLine("=============================");
sw.WriteLine("Message Target:");
sw.WriteLine(ex.TargetSite);
sw.WriteLine("=============================");
sw.WriteLine("Stack Trace:");
sw.WriteLine(ex.StackTrace);
sw.WriteLine("========END ERROR LOG========");
sw.WriteLine("");
sw.Close();
fs.Close();
}
Is there a way to pass the entire string, parameters included, for logging as I have here? The only approach I've figured out that should work is to pass the query object to the logging function and loop over its Parameters collection to log each parameter as a separate item. Since some of these queries have a number of parameters, and since I wouldn't get the complete query in one easy string, that's not an ideal solution.

What about:
// ...
catch (IBM.Data.Informix.IfxException ex)
{
LogError(ex, query); // NOTE
SendErrorEmail(ex, query.CommandText);
DisplayError();
}
And create an overload like this:
private void LogError(IfxException ex, IfxCommand query)
{
StringBuilder sb = new StringBuilder();
sb.Append(String.Format("{0}\n", query.CommandText));
foreach (IDataParameter parameter in query.Parameters)
sb.Append(String.Format("\t{0} = {1}\n",
parameter.ParameterName, parameter.Value));
LogError(ex, sb.ToString());
}

Rubens:
Since I want to pass the query to both the LogError and SendErrorEmail functions, I made a separate function for this approach. I also tweaked your version to replace the parameter placeholders in the query itself (for example, "cmt_slmno = ?") instead of appending a list of parameters. Here's the result:
private string RecreateQuery(IfxCommand query)
{
StringBuilder sb = new StringBuilder();
sb.Append(query.CommandText);
foreach (IfxParameter parameter in query.Parameters)
sb.Replace(" ? ", string.Format(" {0} ", parameter.Value.ToString()));
return sb.ToString();
}
And the correspondingly adjusted catch statement:
catch (IBM.Data.Informix.IfxException ex)
{
string errorQuery = RecreateQuery(query);
LogError(ex, errorQuery);
SendErrorEmail(ex, errorQuery);
DisplayError();
}
This approach has the potential problem of a rogue ? being replaced in the text, for example if the statement contains a question mark inside a string value it's trying to set. That should be very rare with the data in this application, and surrounding the question mark with spaces as above should cover all but the extremely rare, oddly formatted string a user puts in. A slightly more defensive variant is sketched below.
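A sketch of that variant (not tested against Informix): substitute each parameter exactly once, walking the placeholders left to right, so a literal ? later in the text can't be clobbered by an earlier parameter:
private string RecreateQuerySafer(IfxCommand query)
{
    // Hypothetical variant: replace each " ? " placeholder once, in parameter
    // order, instead of doing a global Replace per parameter.
    StringBuilder sb = new StringBuilder(query.CommandText);
    int searchFrom = 0;
    foreach (IfxParameter parameter in query.Parameters)
    {
        int pos = sb.ToString().IndexOf(" ? ", searchFrom, StringComparison.Ordinal);
        if (pos < 0)
            break; // fewer placeholders than parameters; log what we have
        string value = parameter.Value == null ? "NULL" : parameter.Value.ToString();
        sb.Remove(pos + 1, 1);     // take out the "?"
        sb.Insert(pos + 1, value); // splice in the value
        searchFrom = pos + 1 + value.Length;
    }
    return sb.ToString();
}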

Related

Running a command although the connection is closed - ADOMD

I have read the Microsoft documentation. When we open a connection and then close it, it is possible to keep using the session.
I have written this block of code to run a command, but I get an error message which says there is no connection. Do you have any idea how I can close the connection but still use the session to run a command:
try
{
using (AdomdConnection adomdConnection = new AdomdConnection("MY Connection String"))
{
adomdConnection.Open();
adomdConnection.Close(false);
while (true)
{
String query = #"EVALUATE { BLANK()}";
AdomdCommand adomdCommand = new AdomdCommand(query);
Console.WriteLine(adomdConnection.SessionID.ToString() + " " + DateTime.Now.ToString());
AdomdDataReader reader = adomdCommand.ExecuteReader();
reader.Close();
System.Threading.Thread.Sleep(30000);
}
}
}
catch(AdomdConnectionException ex)
{
Console.WriteLine(ex.Message.ToString());
}
In the example shown in the document you reference, it has:
/*First, try to connect to the specified data source.
If the connection string is not valid, or if the specified
provider does not support sessions, an exception is thrown. */
objConnection.ConnectionString = connectionString;
objConnection.Open();
// Now that the connection is open, retrieve the new
// active session ID.
strSessionID = objConnection.SessionID;
// Close the connection, but leave the session open.
objConnection.Close(false);
return strSessionID;
And in your code specifically, you have:
adomdConnection.Open();
adomdConnection.Close(false);
while (true)
{
String query = #"EVALUATE { BLANK()}";
AdomdCommand adomdCommand = new AdomdCommand(query);
Console.WriteLine(adomdConnection.SessionID.ToString() + " " +
DateTime.Now.ToString());
AdomdDataReader reader = adomdCommand.ExecuteReader();
reader.Close();
System.Threading.Thread.Sleep(30000);
}
Wouldn't you want to have this instead (based on the example given)?
adomdConnection.Open();
while (true)
{
String query = #"EVALUATE { BLANK()}";
AdomdCommand adomdCommand = new AdomdCommand(query);
Console.WriteLine(adomdConnection.SessionID.ToString() + " " +
DateTime.Now.ToString());
AdomdDataReader reader = adomdCommand.ExecuteReader();
reader.Close();
System.Threading.Thread.Sleep(30000);
}
adomdConnection.Close(false);
It seems as though it's complaining because you're closing the connection before you even use it, given the order in which your code operates. Try moving adomdConnection.Close(false); to after your while loop; a sketch of how the session can be reused afterwards follows.
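Specifically, based on the document the question references (as I read it), you save the SessionID, close the connection with Close(false), and assign the saved SessionID to a fresh connection before opening it. A minimal sketch, reusing the question's connection string:
string sessionId;
using (AdomdConnection first = new AdomdConnection("MY Connection String"))
{
    first.Open();
    sessionId = first.SessionID; // remember the server-side session
    first.Close(false);          // close the connection, keep the session
}

using (AdomdConnection second = new AdomdConnection("MY Connection String"))
{
    second.SessionID = sessionId; // rejoin the existing session
    second.Open();
    AdomdCommand cmd = new AdomdCommand("EVALUATE { BLANK()}", second);
    using (AdomdDataReader reader = cmd.ExecuteReader())
    {
        // consume the results here
    }
    second.Close(); // end the session this time
}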

When I import a CSV file into SQL Server with C#, some fields are written incorrectly

Hi, I have a problem importing a CSV file into SQL Server. The CSV file contains articles that need to be saved in the SQL Server database. Once the import (done with the C# code written below) is finished, some imported fields (such as Descrizione and CodArt) are not written correctly in the database and contain strange characters. To download the csv file click here.
[Screenshot: SQL Server table showing the incorrectly imported rows above the blue line]
Import C# Code:
using (var rd = new StreamReader(labelPercorso.Text))
{
Articolo a = new Articolo();
a.db = this.db;
while (!rd.EndOfStream)
{
//reset CodEAN and Immagine on every iteration
CodEAN = "";
Immagine = "";
try
{
var splits = rd.ReadLine().Split(';');
CodArt = splits[0];
Descrizione = splits[1];
String Price = splits[2];
Prezzo = decimal.Parse(Price);
}
catch (Exception ex)
{
Console.WriteLine("Non è presente nè immagine nè codean");
}
a.Prezzo = Prezzo;
a.CodiceArticolo = CodArt;
a.Descrizione = Descrizione;
a.Fornitore = fornitore;
//still to do: check whether the article already exists and, if so, update it
a.InserisciArticoloCSV();
}
}
Code of function: InserisciArticoloCSV
try
{
SqlConnection conn = db.apriconnessione();
String query = "INSERT INTO Articolo(CodArt,Descrizione,Prezzo,PrezzoListino,Fornitore,Importato,TipoArticolo) VALUES(#CodArt,#Descrizione,#Prezzo,#PrezzoListino,#Fornitore,#Importato,#TipoArticolo)";
String Importato = "CSV";
String TipoArticolo = "A";
SqlCommand cmd = new SqlCommand(query, conn);
// MessageBox.Show("CodArt: " + CodiceArticolo + "\n Descrizione :" + Descrizione + "\n Prezzo: " + Prezzo);
cmd.Parameters.AddWithValue("#CodArt", CodiceArticolo.ToString());
cmd.Parameters.AddWithValue("#Descrizione", Descrizione.ToString());
cmd.Parameters.AddWithValue("#Prezzo", Prezzo);
cmd.Parameters.AddWithValue("#PrezzoListino", Prezzo);
cmd.Parameters.AddWithValue("#Fornitore", Fornitore.ToString());
cmd.Parameters.AddWithValue("#Importato", Importato.ToString());
cmd.Parameters.AddWithValue("#TipoArticolo", TipoArticolo.ToString());
cmd.ExecuteNonQuery();
db.chiudiconnessione();
conn.Close();
return true;
}
catch (Exception ex)
{
Console.WriteLine("Errore nell'inserimento dell'articolo " + ex);
//MessageBox.Show("Errore nel inserimento dell'articolo: " + ex);
return false;
}
Your CSV file is not well formatted; there are stray carriage returns in the middle of records, which breaks the parsing. Open the file in Notepad++ and turn on the display of line breaks, and this is what you find.
So for the lines that are correctly formatted the import works fine; for the others the logic does not work.
As others have pointed out, you have numerous problems: encoding, carriage returns, and a lot of white space. In addition, you are using single-row inserts into your database, which is very slow. The sample code below illustrates how to deal with all of these points.
IFormatProvider fP = new CultureInfo("it");
DataTable tmp = new DataTable();
tmp.Columns.Add("CodArt", typeof(string));
tmp.Columns.Add("Descrizione", typeof(string));
tmp.Columns.Add("Prezzo", typeof(decimal));
using (var rd = new StreamReader("yourFileName", Encoding.GetEncoding("iso-8859-1")))
{
while (!rd.EndOfStream)
{
try
{
var nextLine = Regex.Replace(rd.ReadLine(), @"\s+", " ");
while (nextLine.Split(';').Length < 3)
{
nextLine = nextLine.Replace("\r\n", "") + Regex.Replace(rd.ReadLine(), @"\s+", " ");
}
var splits = nextLine.Split(';');
DataRow dR = tmp.NewRow();
dR[0] = splits[0];
dR[1] = splits[1];
string Price = splits[2];
dR[2] = decimal.Parse(Price, fP);
tmp.Rows.Add(dR);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
}
}
using (var conn = db.apriconnessione())
{
var sBC = new SqlBulkCopy(conn);
conn.Open();
sBC.DestinationTableName = "yourTableName";
sBC.WriteToServer(tmp);
conn.Close();
}
Now for some explanation:
Firstly I am storing the parsed values in a DataTable. Please note that I have only included the three fields that are in the CSV. In practice you must supply the other columns and fill the extra columns with the correct values for each row. I was simply being lazy, but I am sure you will get the idea.
I do not know what encoding your csv file is, but iso-8859-1 worked for me!
I use Regex to replace multiple white space with a single space.
If any line does not have the required number of splits, I keep adding further lines (having deleted the carriage return) until I hit success!
Once I have a complete line, I can now split it, and assign it to the new DataRow (please see my comments above for extra columns).
Finally once the file has been read, the DataTable will have all the rows and can be uploaded to your database using BulkCopy. This is very fast!
HTH
PS Some of your lines have double quotes. You probably want to get rid of these as well!
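If you want to drop those quotes during the same pass, a small tweak to the split in the sample above (this assumes using System.Linq is available) would do it:
// Hypothetical tweak: trim whitespace and surrounding double quotes from each
// field before it goes into the DataRow.
var splits = nextLine.Split(';')
                     .Select(s => s.Trim().Trim('"'))
                     .ToArray();
dR[0] = splits[0];
dR[1] = splits[1];
dR[2] = decimal.Parse(splits[2], fP);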
You should specify the correct encoding when you read your file. Is it UTF-8? Is it ASCII with a specific code page? You should also specify the SqlDbType of your SQL parameters, especially the string parameters, which will be either varchar or nvarchar; there is a big difference between the two.
// what is the encoding of your file? This is an example using code page windows-1252
var encoding = Encoding.GetEncoding("windows-1252");
using (var file = File.Open(labelPercorso.Text, FileMode.Open))
using (var reader = new StreamReader(file, encoding))
{
// rest of code unchanged
}
SQL code. Note that I added using blocks for the types that implement IDisposable, like the connection and the command.
try
{
String query = "INSERT INTO Articolo(CodArt,Descrizione,Prezzo,PrezzoListino,Fornitore,Importato,TipoArticolo) VALUES(#CodArt,#Descrizione,#Prezzo,#PrezzoListino,#Fornitore,#Importato,#TipoArticolo)";
String Importato = "CSV";
String TipoArticolo = "A";
using(SqlConnection conn = db.apriconnessione())
using(SqlCommand cmd = new SqlCommand(query, conn))
{
// -1 indicates you used MAX like nvarchar(max), otherwise use the maximum number of characters in the schema
cmd.Parameters.Add(new SqlDbParameter("#CodArt", SqlDbType.NVarChar, -1)).Value = CodiceArticolo.ToString();
cmd.Parameters.Add(new SqlDbParameter("#Descrizione", SqlDbType.NVarChar, -1)).Value = Descrizione.ToString();
/*
Rest of your parameters created in the same manner
*/
cmd.ExecuteNonQuery();
db.chiudiconnessione();
}
return true;
}
catch (Exception ex)
{
Console.WriteLine("Errore nell'inserimento dell'articolo " + ex);
//MessageBox.Show("Errore nel inserimento dell'articolo: " + ex);
return false;
}
In case you are interested in exploring a library that handles all the parsing needs with a few lines of code, you can check out Cinchoo ETL, an open-source library. Here is a sample that parses the CSV file and shows how to get either a DataTable or a list of records, which you can later load into the database.
System.Threading.Thread.CurrentThread.CurrentCulture = new CultureInfo("it");
using (var p = new ChoCSVReader("Bosch Luglio 2017.csv")
.Configure((c) => c.MayContainEOLInData = true) //Handle newline chars in data
.Configure(c => c.Encoding = Encoding.GetEncoding("iso-8859-1")) //Specify the encoding for reading
.WithField("CodArt", 1) //first column
.WithField("Descrizione", 2) //second column
.WithField("Prezzo", 3, fieldType: typeof(decimal)) //third column
.Setup(c => c.BeforeRecordLoad += (o, e) =>
{
e.Source = e.Source.CastTo<string>().Replace(@"""", String.Empty); //Remove the quotes
}) //Scrub the data
)
{
var dt = p.AsDataTable();
//foreach (var rec in p)
// Console.WriteLine(rec.Prezzo);
}
Disclaimer: I'm the author of this library.

Index was outside the bounds of the array in MSCORLIB.DLL

I will be amazed if I find a solution for this, since it is very specific and vague, but I figured I would try. I'll try to give as much information as humanly possible, since I've been searching for answers for some time now.
I am building a utility in C# which copies records from a file in a library on the i-series/AS400 and builds an encrypted text file with each record from the AS400 as a comma separated string. In the file, it will have values like filename, fieldvalue1, fieldvalue2, fieldvalue3. I then take that text file to another PC, and run a C# utility which copies that record into the same file name in a library over there on a different i-series machine. Unfortunately, I receive the outside bounds of the array exception in some cases, but I cannot determine why. In the record just prior to the exception, the record looks pretty much the same and it works fine. My code is below in a nutshell. I usually don't give up, but I don't expect to ever figure this out. If someone does, I'll probably sing karaoke tonight.
// Select records from AS400 file and write them to text file
Recordset rs = new Recordset();
sqlQuery = "SELECT * FROM " + dataLibrary + "." + fileName;
try
{
rs.Open(sqlQuery, con);
while (!rs.EOF)
{
int[] fieldLengths;
fieldLengths = new int[rs.Fields.Count];
String[] fieldValues;
fieldValues = new String[rs.Fields.Count];
String fullString = "";
for (i = 0; i < rs.Fields.Count; i++)
{
fieldLengths[i] += rs.Fields[i].DefinedSize;
fieldValues[i] += rs.Fields[i].Value;
}
fullString = fileName + "," + String.Join(",", fieldValues);
fullString = Functions.EncryptString(fullString);
File.AppendAllText(savefile.FileName, fullString + Environment.NewLine);
rs.MoveNext();
}
}
catch (Exception ex)
{
}
cmd.Dispose();
// This gives me a text file of filename, fieldvalue1, fieldvalue2, etc...
// Next, I take the file to another system and run this process:
while ((myString = inputFile.ReadLine()) != null)
{
int stringLength = myString.Length;
String[] valuesArray = myString.Split(',');
for (i = 0; i < valuesArray.Length; i++)
{
if (i == 0)
{
fileName = valuesArray[0];
// Create file if it doesn't exist already
createPhysicalFile(newLibrary, fileName);
SQLStatement = "INSERT INTO " + newLibrary + "." + fileName + "VALUES(";
}
else
{
if (i == valuesArray.Length - 1)
{
SQLStatement += "#VAL" + i + ")";
}
else
{
SQLStatement += "#VAL" + i + ", ";
}
}
}
try
{
using (connection)
{
try
{
connection.Open();
}
catch (Exception ex)
{
}
// Create a new SQL command
iDB2Command command = new iDB2Command(SQLStatement, connection);
for (i = 1; i < valuesArray.Length; i++)
{
try
{
command.Parameters.AddWithValue("#VAL" + i, (valuesArray[i]));
}
catch (Exception ex)
{
}
}
// Just split the array into a string to visually check
// differences in the records
String arraySplit = ConvertStringArrayToString(valuesArray);
// The query gets executed here. The command looks something
// like:
// INSERT INTO LIBNAME.FILENAME VALUES(@VAL1, @VAL2, @VAL3, @VAL4)
// There are actually 320 fields in the file I'm having a problem with,
// so it's possible I'm overlooking something. I have narrowed it down to
// field # 316 when the exception occurs, but in both cases
// field 316 is blanks (when it works and when it doesn't).
command.ExecuteNonQuery();
}
}
catch (Exception ex)
{
// Here I get the exception out of bounds error in MSCORLIB.DLL.
// Some records are added fine, while others cause this exception.
// I cannot visibly tell any major differences, nor do I see any
// errors in the AS400 job log or anything in C# that would lead me
// down a certain path.
String error = ex.Message;
}
}
For what it's worth, I found this happening on a smaller file in the system and was able to figure out what was going on, after painstaking research into the code and on the net. Basically, the file has numeric fields on the i-series. Somehow, the records were written to the file on the original system with null values in the numeric fields instead of numeric values. When storing the original records, I had to add this check:
String fieldType = rs.Fields[i].Type.ToString();
object objValue = rs.Fields[i].Value;
if (fieldType == "adNumeric" && objValue is DBNull)
{
fieldValues[i] += "0";
}
else
{
fieldValues[i] += rs.Fields[i].Value;
}
After this, if a null value was found in one of the numeric fields, it just put "0" in its place, so that when writing to the new machine it would put a valid numeric character in there and continue writing the rest of the values. Thanks for all the advice and moral support. :)

New to C# - trying to write code to do a simple function

I'm new to C# (worked in PHP, Python, and Javascript) and I'm trying to more or less make a duplicate of another page and change some things - to make a form and database submission.
Anyway, here's the code:
public partial class commenter : System.Web.UI.Page
{
string employee_reviewed;
//public Commenter();
public void SaveBtn_Click(object sender, EventArgs e)
{
if (CommentTB.Text == "Please enter a comment.")
{
String csname = "Email Error";
Type cstype = this.GetType();
ClientScriptManager cs = Page.ClientScript;
if (!cs.IsStartupScriptRegistered(cstype, csname))
{
String cstext = "alert('Please submit at least one comment.');";
cs.RegisterStartupScript(cstype, csname, cstext, true);
}
FormMessage.Text = "Please submit at least one comment.";
return;
}
string comment = CommentTB.Text;
comment = comment.Replace("'", "''");
comment = comment.Replace("’", "''");
comment = comment.Replace("`", "''");
try
{
//myCommand.Connection.Open();
//myCommand.ExecuteNonQuery();
//myCommand.Connection.Close();
MySqlCommand myCommand;
MySqlConnection connection;
string connStringName = "server=localhost;database=hourtracking;uid=username;password=password";
connection = new MySqlConnection(connStringName);
string sql_query;
sql_query = "insert into peer_review_comment " + " (emp_id, comment)" + " values(?employeeid, ?comment) ";
//String csname = "Email Error";
//Type cstype = this.GetType();
//ClientScriptManager cs = Page.ClientScript;
//cs.RegisterStartupScript(cstype, csname, sql_query, true);
myCommand = new MySqlCommand(sql_query, connection);
//FormMessage.Text = sql_query;
//return;
Trace.Write("comment = ", comment);
myCommand.Parameters.Add(new MySqlParameter("?employeeid", ViewState["employeeid"].ToString()));
myCommand.Parameters.Add(new MySqlParameter("?comment", comment));
try
{
myCommand.Connection.Open();
myCommand.ExecuteNonQuery();
myCommand.Connection.Close();
}
catch (Exception ex)
{
FormMessage.Text = "Error:SaveBtn_Click - " + ex.Message;
}
//SendNotification(from, to, cc, subject, body, attach);
FormMessage.Text = "\n Thank you for leaving anonymous feedback for " + employee_reviewed; ;
ThankyouDiv.Visible = true;
FormFieldDiv.Visible = false;
reviewHeader.Visible = false;
}
catch (Exception ex)
{
FormMessage.Text = "Error:SaveBtn_Click - " + ex.Message;
}
}
}
I really have little idea what I'm doing - I'm reading the tutorials, but C# is a significantly different language than I am used to.
I currently get the JavaScript alert when I don't change the text, but the submission isn't working - I want it to insert into the peer_review_comment table, filling in employeeid as well as the submitted comment.
Sorry if my understanding is so spotty, I am a TOTAL C# newbie (currently reading http://www.csharp-station.com/Tutorial/CSharp/)
My guess is the problem is here:
try
{
myCommand.Connection.Open();
myCommand.ExecuteNonQuery();
myCommand.Connection.Close();
}
catch (Exception ex)
{
FormMessage.Text = "Error:SaveBtn_Click - " + ex.Message;
// no "return;" !!
}
//SendNotification(from, to, cc, subject, body, attach);
FormMessage.Text = "\n Thank you for leaving anonymous feedback for " +
employee_reviewed; ;
Your catch block is setting the FormMessage.Text value but not exiting the method, so execution continues after the catch block, resetting the Text value and making it appear that no exception was thrown.
Add a return; at the end of your catch block to see the exception message.
Some general guidelines to make these kinds of problems easier to trap:
Don't try to do too much in one method. Have one method that validates the message (or do it client-side using Validators), another to do the DB call, etc.
Learn to use the debugger. You can step through code and get a better idea of what causes these kinds of errors.
Unless you can DO something about an exception, there's no harm in letting it bubble up to a higher-level event handler (like Elmah) so exceptions don't get accidentally swallowed as happens here. In general it's preferable to re-throw exceptions in lower-level methods (maybe adding some context or a user-friendly message) so the higher-level exception handling can decide what to do (show a message, log, etc.). A sketch of that is below.
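A minimal sketch of wrapping and re-throwing with context (the InsertComment method name is made up for illustration):
private void InsertComment(string employeeId, string comment)
{
    try
    {
        // ...build the MySqlCommand with parameters and call ExecuteNonQuery()...
    }
    catch (MySqlException ex)
    {
        // Add context for the caller, but keep the original exception as
        // InnerException so nothing is silently swallowed.
        throw new InvalidOperationException(
            "Saving the peer review comment failed for employee " + employeeId, ex);
    }
}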
I have taken the liberty of refactoring your code. This shows some better code practices but may also show you the problem. Along with these code changes I would also recommend reading D. Stanley's answer; there are some helpful tips in there as well.
if (CommentTB.Text == "Please enter a comment.")
{
String csname = "Email Error";
Type cstype = this.GetType();
ClientScriptManager cs = Page.ClientScript;
if (!cs.IsStartupScriptRegistered(cstype, csname))
{
String cstext = "alert('Please submit at least one comment.');";
cs.RegisterStartupScript(cstype, csname, cstext, true);
}
FormMessage.Text = "Please submit at least one comment.";
return;
}
// This helps some but very little, just wanted to show an alternative to writing three statements
string comment = CommentTB.Text.Replace("'", "''").Replace("’", "''").Replace("`", "''");
//string comment = CommentTB.Text;
//comment = comment.Replace("'", "''");
//comment = comment.Replace("’", "''");
//comment = comment.Replace("`", "''");
try
{
// No need to do string concatenation...just make it one string.
// sql_query = "insert into peer_review_comment " + " (emp_id, comment)" + " values(?employeeid, ?comment) ";
string sql_query = "insert into peer_review_comment (emp_id, comment) values (?employeeid, ?comment) ";
string connStringName = "server=localhost;database=hourtracking;uid=username;password=password";
// Use a "using" clause because it guarantees the connection is closed even when an exception occurs.
using (MySqlConnection connection = new MySqlConnection(connStringName))
{
connection.Open();
// Again, use a "using" clause
using (MySqlCommand myCommand = new MySqlCommand(sql_query, connection))
{
Trace.Write("comment = ", comment);
myCommand.Parameters.Add(new MySqlParameter("?employeeid", ViewState["employeeid"].ToString()));
myCommand.Parameters.Add(new MySqlParameter("?comment", comment));
myCommand.ExecuteNonQuery();
// No need for a Close statement with "using" clause.
//myCommand.Connection.Close();
}
}
FormMessage.Text = "\n Thank you for leaving anonymous feedback for " + employee_reviewed;
ThankyouDiv.Visible = true;
FormFieldDiv.Visible = false;
reviewHeader.Visible = false;
}
catch (Exception ex)
{
FormMessage.Text = "Error:SaveBtn_Click - " + ex.Message;
}

Import from text file to SQL Server database - is ADO.NET too slow?

My program is still running, importing data from a log file into a remote SQL Server database. The log file is about 80 MB in size and contains about 470000 lines, with about 25000 lines of data. My program can import only 300 rows/second, which is really bad. :(
public static int ImportData(string strPath)
{
//NameValueCollection collection = ConfigurationManager.AppSettings;
using (TextReader sr = new StreamReader(strPath))
{
sr.ReadLine(); //ignore three first lines of log file
sr.ReadLine();
sr.ReadLine();
string strLine;
var cn = new SqlConnection(ConnectionString);
cn.Open();
while ((strLine = sr.ReadLine()) != null)
{
{
if (strLine.Trim() != "") //if not a blank line, then import into database
{
InsertData(strLine, cn);
_count++;
}
}
}
cn.Close();
sr.Close();
return _count;
}
}
InsertData is just a normal insert method using ADO.NET. It uses a parsing method:
public Data(string strLine)
{
string[] list = strLine.Split(new[] {'\t'});
try
{
Senttime = DateTime.Parse(list[0] + " " + list[1]);
}
catch (Exception)
{
}
Clientip = list[2];
Clienthostname = list[3];
Partnername = list[4];
Serverhostname = list[5];
Serverip = list[6];
Recipientaddress = list[7];
Eventid = Convert.ToInt16(list[8]);
Msgid = list[9];
Priority = Convert.ToInt16(list[10]);
Recipientreportstatus = Convert.ToByte(list[11]);
Totalbytes = Convert.ToInt32(list[12]);
Numberrecipient = Convert.ToInt16(list[13]);
DateTime temp;
if (DateTime.TryParse(list[14], out temp))
{
OriginationTime = temp;
}
else
{
OriginationTime = null;
}
Encryption = list[15];
ServiceVersion = list[16];
LinkedMsgid = list[17];
MessageSubject = list[18];
SenderAddress = list[19];
}
InsertData method:
private static void InsertData(string strLine, SqlConnection cn)
{
var dt = new Data(strLine); //parse the log line into proper fields
const string cnnStr =
"INSERT INTO LOGDATA ([SentTime]," + "[client-ip]," +
"[Client-hostname]," + "[Partner-Name]," + "[Server-hostname]," +
"[server-IP]," + "[Recipient-Address]," + "[Event-ID]," + "[MSGID]," +
"[Priority]," + "[Recipient-Report-Status]," + "[total-bytes]," +
"[Number-Recipients]," + "[Origination-Time]," + "[Encryption]," +
"[service-Version]," + "[Linked-MSGID]," + "[Message-Subject]," +
"[Sender-Address]) " + " VALUES ( " + "#Senttime," + "#Clientip," +
"#Clienthostname," + "#Partnername," + "#Serverhostname," + "#Serverip," +
"#Recipientaddress," + "#Eventid," + "#Msgid," + "#Priority," +
"#Recipientreportstatus," + "#Totalbytes," + "#Numberrecipient," +
"#OriginationTime," + "#Encryption," + "#ServiceVersion," +
"#LinkedMsgid," + "#MessageSubject," + "#SenderAddress)";
var cmd = new SqlCommand(cnnStr, cn) {CommandType = CommandType.Text};
cmd.Parameters.AddWithValue("#Senttime", dt.Senttime);
cmd.Parameters.AddWithValue("#Clientip", dt.Clientip);
cmd.Parameters.AddWithValue("#Clienthostname", dt.Clienthostname);
cmd.Parameters.AddWithValue("#Partnername", dt.Partnername);
cmd.Parameters.AddWithValue("#Serverhostname", dt.Serverhostname);
cmd.Parameters.AddWithValue("#Serverip", dt.Serverip);
cmd.Parameters.AddWithValue("#Recipientaddress", dt.Recipientaddress);
cmd.Parameters.AddWithValue("#Eventid", dt.Eventid);
cmd.Parameters.AddWithValue("#Msgid", dt.Msgid);
cmd.Parameters.AddWithValue("#Priority", dt.Priority);
cmd.Parameters.AddWithValue("#Recipientreportstatus", dt.Recipientreportstatus);
cmd.Parameters.AddWithValue("#Totalbytes", dt.Totalbytes);
cmd.Parameters.AddWithValue("#Numberrecipient", dt.Numberrecipient);
if (dt.OriginationTime != null)
cmd.Parameters.AddWithValue("#OriginationTime", dt.OriginationTime);
else
cmd.Parameters.AddWithValue("#OriginationTime", DBNull.Value);
//if OriginationTime was null, then insert with null value to this column
cmd.Parameters.AddWithValue("#Encryption", dt.Encryption);
cmd.Parameters.AddWithValue("#ServiceVersion", dt.ServiceVersion);
cmd.Parameters.AddWithValue("#LinkedMsgid", dt.LinkedMsgid);
cmd.Parameters.AddWithValue("#MessageSubject", dt.MessageSubject);
cmd.Parameters.AddWithValue("#SenderAddress", dt.SenderAddress);
cmd.ExecuteNonQuery();
}
How can my program run faster?
Thank you so much!
Use SqlBulkCopy.
Edit: I created a minimal implementation of IDataReader and created a Batch type so that I could insert arbitrary in-memory data using SqlBulkCopy. Here is the important bit:
IDataReader dr = batch.GetDataReader();
using (SqlTransaction tx = _connection.BeginTransaction())
{
try
{
using (SqlBulkCopy sqlBulkCopy =
new SqlBulkCopy(_connection, SqlBulkCopyOptions.Default, tx))
{
sqlBulkCopy.DestinationTableName = TableName;
SetColumnMappings(sqlBulkCopy.ColumnMappings);
sqlBulkCopy.WriteToServer(dr);
tx.Commit();
}
}
catch
{
tx.Rollback();
throw;
}
}
The rest of the implementation is left as an exercise for the reader :)
Hint: the only bits of IDataReader you need to implement are Read, GetValue and FieldCount.
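For what it's worth, a bare-bones sketch of what such a reader might look like over an in-memory list of object[] rows (the class name is mine, not the Batch type above; it assumes the default ordinal column mappings, so everything beyond Read, GetValue and FieldCount just throws):
// Minimal sketch; assumes: using System; using System.Collections.Generic; using System.Data;
class InMemoryDataReader : IDataReader
{
    private readonly IList<object[]> _rows;
    private readonly int _fieldCount;
    private int _index = -1;

    public InMemoryDataReader(IList<object[]> rows, int fieldCount)
    {
        _rows = rows;
        _fieldCount = fieldCount;
    }

    // The three members SqlBulkCopy actually needs here:
    public int FieldCount { get { return _fieldCount; } }
    public bool Read() { return ++_index < _rows.Count; }
    public object GetValue(int i) { return _rows[_index][i]; }

    public void Dispose() { }
    public void Close() { }
    public bool IsClosed { get { return _index >= _rows.Count; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public bool NextResult() { return false; }
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }

    // IDataRecord members that should not be needed for ordinal mappings:
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { throw new NotSupportedException(); } }
    public string GetName(int i) { throw new NotSupportedException(); }
    public int GetOrdinal(string name) { throw new NotSupportedException(); }
    public string GetDataTypeName(int i) { throw new NotSupportedException(); }
    public Type GetFieldType(int i) { throw new NotSupportedException(); }
    public int GetValues(object[] values) { throw new NotSupportedException(); }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public string GetString(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public bool IsDBNull(int i) { throw new NotSupportedException(); }
}
With something like this, batch.GetDataReader() in the snippet above would simply return new InMemoryDataReader(rows, columnCount).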
Hmmm, let's break this down a little bit.
In pseudocode, what you did is the following:
Open the file
Open a connection
For every line that has data:
    Parse the string
    Save the data in SQL Server
Close the connection
Close the file
Now the fundamental problems in doing it this way are:
You are keeping a SQL connection open while waiting for your line parsing (pretty susceptible to timeouts and stuff)
You might be saving the data line by line, each in its own transaction. We won't know until you show us what the InsertData method is doing
Consequently you are keeping the file open while waiting for SQL to finish inserting
The optimal way of doing this is to parse the file as a whole, and then insert them in bulk. You can do this with SqlBulkCopy (as suggested by Matt Howells), or with SQL Server Integration Services.
If you want to stick with ADO.NET, you can pool your INSERT statements together and pass them off in one large SqlCommand, instead of setting up one SqlCommand object per insert statement as you do now. A rough sketch of that idea follows.
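In this sketch (illustrative, not a drop-in replacement), only two of the columns are shown and the rows are assumed to have been collected into a list of the poster's Data objects first:
// Illustrative sketch: send N parameterised INSERTs in one command, so each
// round trip to SQL Server carries many rows instead of one.
private static void InsertBatch(SqlConnection cn, IList<Data> rows)
{
    var sql = new StringBuilder();
    using (var cmd = new SqlCommand { Connection = cn })
    {
        for (int i = 0; i < rows.Count; i++)
        {
            sql.AppendFormat(
                "INSERT INTO LOGDATA ([SentTime], [client-ip]) VALUES (@Senttime{0}, @Clientip{0});", i);
            cmd.Parameters.AddWithValue("@Senttime" + i, rows[i].Senttime);
            cmd.Parameters.AddWithValue("@Clientip" + i, rows[i].Clientip);
            // ...repeat for the remaining columns...
        }
        cmd.CommandText = sql.ToString();
        using (var tx = cn.BeginTransaction())
        {
            cmd.Transaction = tx;
            cmd.ExecuteNonQuery();
            tx.Commit();
        }
    }
}
Keep the batches modest: a single SqlCommand is limited to 2100 parameters, so with 19 columns per row that is roughly 100 rows per batch.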
You create a SqlCommand object for every row of data. The simplest improvement would therefore be to create a
private static SqlCommand cmdInsert
and declare the parameters with the Parameters.Add() method. Then for each data row, set the parameter values using
cmdInsert.Parameters["#paramXXX"].Value = valueXXX;
A second performance improvement might be to skip the creation of Data objects for each row, and assign the parameter values directly from the list[] array. A sketch combining both ideas is shown below.
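The sketch below shows only three of the columns; the parameter sizes are guesses and should match your schema:
// Sketch: build the command and its parameters once, then per log line just
// update the values and execute. Column list shortened for brevity.
private static SqlCommand cmdInsert; // the field suggested above

private static void PrepareInsert(SqlConnection cn)
{
    cmdInsert = new SqlCommand(
        "INSERT INTO LOGDATA ([SentTime], [client-ip], [Client-hostname]) " +
        "VALUES (@Senttime, @Clientip, @Clienthostname)", cn);
    cmdInsert.Parameters.Add("@Senttime", SqlDbType.DateTime);
    cmdInsert.Parameters.Add("@Clientip", SqlDbType.VarChar, 50);        // size is a guess
    cmdInsert.Parameters.Add("@Clienthostname", SqlDbType.VarChar, 255); // size is a guess
    cmdInsert.Prepare();
}

private static void InsertLine(string strLine)
{
    // Assign values straight from the split line, skipping the Data object.
    string[] list = strLine.Split('\t');
    cmdInsert.Parameters["@Senttime"].Value = DateTime.Parse(list[0] + " " + list[1]);
    cmdInsert.Parameters["@Clientip"].Value = list[2];
    cmdInsert.Parameters["@Clienthostname"].Value = list[3];
    cmdInsert.ExecuteNonQuery();
}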
