Reading large text files into SQL Server stored procedure - c#

I've got to read text files and put their contents into a SQL Server table.
I initially started with a BULK INSERT, but I don't know the encoding of the files, so the BULK INSERT sometimes fails with errors.
I then decided to do it through a C# console application, and everything works fine except with large files. Up to about 300-400 MB I don't have any problems, but above that I get an OutOfMemoryException.
Here's my code:
static void Main()
{
    string fullFileName = @"C:\Temp\410.4604";
    string FileName = System.IO.Path.GetFileName(fullFileName);
    string table = FileName.Substring(0, 3);

    DateTime inicio = DateTime.Now;   // declared here for completeness
    Console.WriteLine("Started at: " + inicio);

    // Every line of the file is loaded into a single-column DataTable held in memory.
    DataTable data = new DataTable();
    data.Columns.Add("Text");

    using (System.IO.FileStream fs = System.IO.File.Open(fullFileName, System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.Read))
    using (System.IO.BufferedStream bs = new System.IO.BufferedStream(fs))
    using (StreamReader sr = new StreamReader(bs))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            data.Rows.Add(line);
        }
    }

    // The whole DataTable is passed to the stored procedure as a table-valued parameter.
    var conn = new System.Data.SqlClient.SqlConnection(myConn);
    SqlCommand insertCommand = new SqlCommand(myProc, conn);
    insertCommand.CommandTimeout = 0;
    insertCommand.CommandType = CommandType.StoredProcedure;

    SqlParameter insParam1 = insertCommand.Parameters.AddWithValue("@data", data);
    insParam1.SqlDbType = SqlDbType.Structured;
    SqlParameter insParam2 = insertCommand.Parameters.AddWithValue("@table", table);

    conn.Open();
    insertCommand.ExecuteNonQuery();
    conn.Close();

    data.Clear();
    data = null;
    GC.Collect();

    Console.WriteLine("Press any key to exit.");
    System.Console.ReadKey();
}
Any ideas or advice on how to improve this?
Many thanks
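One way to avoid the OutOfMemoryException is to stream the rows into the structured parameter with an IEnumerable<SqlDataRecord> instead of materializing the whole file in a DataTable, so only one line is held in memory at a time. A minimal sketch, assuming the table-valued parameter type behind @data has a single NVARCHAR(MAX) column named Text (the type name dbo.TextLines below is a placeholder):
static IEnumerable<Microsoft.SqlServer.Server.SqlDataRecord> StreamRows(string path)
{
    // One NVARCHAR(MAX) column, matching the assumed "Text" column of the TVP type.
    var meta = new Microsoft.SqlServer.Server.SqlMetaData("Text", SqlDbType.NVarChar, Microsoft.SqlServer.Server.SqlMetaData.Max);
    using (var sr = new StreamReader(path))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            var record = new Microsoft.SqlServer.Server.SqlDataRecord(meta);
            record.SetString(0, line);
            yield return record;   // rows are produced one at a time, never all at once
        }
    }
}

// Usage: the streaming enumerable replaces the DataTable as the parameter value.
SqlParameter insParam1 = insertCommand.Parameters.AddWithValue("@data", StreamRows(fullFileName));
insParam1.SqlDbType = SqlDbType.Structured;
insParam1.TypeName = "dbo.TextLines";   // placeholder: substitute the real table type name
With this approach the file is never fully loaded, so memory use stays flat regardless of file size.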

Related

Stored Procedure does not fire in Windows Service

I created a Windows service with the code below. I'm not sure why it will not fire the last SQL Server stored procedure. If I have nothing in the code but the stored procedures, then it fires OK. There are no errors.
using (SqlConnection sqlConnection = new SqlConnection(connectionString))
{
    sqlConnection.Open();

    // First stored procedure
    using (SqlCommand cmdCreateTableSP = new SqlCommand("CreateSP", sqlConnection))
    {
        cmdCreateTableSP.CommandType = CommandType.StoredProcedure;
        cmdCreateTableSP.ExecuteNonQuery();
    }

    string INTable = "IN";
    string XMLPackagesDir = "D:\\Cle\\";

    // Create a DataTable with the columns that match the destination table:
    DataTable INHReponseDT = new DataTable("INHReponseDT");
    INHReponseDT.Columns.Add("XM");
    INHReponseDT.Columns.Add("Cl");
    INHReponseDT.Columns.Add("I");
    INHReponseDT.Columns.Add("INH");
    INHReponseDT.Columns.Add("IN");

    DirectoryInfo DirInfo = new DirectoryInfo(XMLPackagesDir);
    DataRow INHReponseRow = INHReponseDT.NewRow();

    foreach (FileInfo fi in DirInfo.GetFiles("*.*", SearchOption.AllDirectories))
    {
        XmlSerializer serializer = new XmlSerializer(typeof(Response));
        Response i;
        FileStream fs = new FileStream(Path.Combine(XMLPackagesDir, fi.Name), FileMode.Open);
        using (TextReader tr = new StreamReader(fs))
        {
            // Deserialize the XML file and copy its fields into a new row
            i = (Response)serializer.Deserialize(tr);
            INHReponseRow = INHReponseDT.NewRow();
            INHReponseRow["XM"] = fi.Name;
            INHReponseRow["Cl"] = i.ClientCorrelationID;
            INHReponseRow["I"] = i.StatusInformation.StatusItem.MessageText;
            INHReponseRow["INH"] = i.ResponseStatus;
            INHReponseRow["IN"] = i.RequestProcessedTime.ToString();
            INHReponseDT.Rows.Add(INHReponseRow);
        }

        // Insert into SQL table
        using (SqlBulkCopy s = new SqlBulkCopy(sqlConnection))
        {
            s.DestinationTableName = INTable;
            s.BatchSize = INHReponseDT.Rows.Count;
            s.WriteToServer(INHReponseDT);
            s.Close();
        }
    }

    // Last stored procedure -- the one that never seems to fire
    using (SqlCommand cmdUpdateCaseInformationINHResponseSP = new SqlCommand("UpdateCaseSP", sqlConnection))
    {
        cmdUpdateCaseInformationINHResponseSP.CommandType = CommandType.StoredProcedure;
        cmdUpdateCaseInformationINHResponseSP.ExecuteNonQuery();
    }
}
I had a similar issue with "extra code" in the middle of some SQL commands. While I couldn't see it immediately, an error was in fact being thrown by some of the code between the SQL commands. Wrapping the whole thing in a try/catch may reveal it.
To fix this (and as a matter of good practice), create a new connection for each SQL command rather than reusing the same connection object and keeping it open for so long between commands. If you need the commands to be handled as a single transacted unit, wrap the call to this method in a new TransactionScope().
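A minimal sketch of that pattern, assuming the stored procedure names from the question and a connectionString variable in scope (it needs a reference to System.Transactions):
using (var scope = new TransactionScope())
{
    // Each command gets its own short-lived connection; both enlist in the ambient transaction.
    using (var conn1 = new SqlConnection(connectionString))
    using (var cmd1 = new SqlCommand("CreateSP", conn1) { CommandType = CommandType.StoredProcedure })
    {
        conn1.Open();
        cmd1.ExecuteNonQuery();
    }

    using (var conn2 = new SqlConnection(connectionString))
    using (var cmd2 = new SqlCommand("UpdateCaseSP", conn2) { CommandType = CommandType.StoredProcedure })
    {
        conn2.Open();
        cmd2.ExecuteNonQuery();
    }

    scope.Complete();   // nothing is committed until this is called
}
Be aware that opening more than one connection inside a single scope can escalate the transaction to a distributed (MSDTC) one on older SQL Server versions.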

Reading binary data from SQL Server CE in C#

I'm trying to read binary data from a SQL Server CE database file.
I tried with this code:
string fileName = "C:\\Users\\Luca\\Desktop\\TestNoah\\TestNoah\\NOAHDatabaseCoreSqlCompact.sdf";
string connectionString = string.Format("DataSource='{0}'; Persist Security Info=False", fileName);

using (SqlCeConnection con = new SqlCeConnection(connectionString))
{
    con.Open();
    using (SqlCeCommand command = new SqlCeCommand("SELECT PublicData FROM Action", con))
    {
        // ExecuteScalar returns only the first column of the first row
        byte[] barrImg = (byte[])command.ExecuteScalar();
        string strfn = Convert.ToString(DateTime.Now.ToFileTime());
        using (FileStream fs = new FileStream("C:\\Users\\Luca\\Desktop\\TestNoah\\TestNoah\\" + strfn + ".jpg", FileMode.CreateNew, FileAccess.Write))
        {
            fs.Write(barrImg, 0, barrImg.Length);
            fs.Flush();
        }
    }
}
This code creates an image that I can't open. Any ideas? Thanks to all
The code below uses ExecuteReader instead of ExecuteScalar. ExecuteScalar should only be used when you expect a single value from the result set, for example when your SqlCeCommand has a WHERE clause limited by the primary key or a TOP 1 in the SELECT.
string targetDir = "C:\\Users\\Luca\\Desktop\\TestNoah\\TestNoah\\";
using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        byte[] img = (byte[])reader["PublicData"];
        string path = System.IO.Path.Combine(targetDir, string.Format("{0}.jpg", DateTime.Now.ToFileTime()));
        System.IO.File.WriteAllBytes(path, img);
    }
}
If you still have image corruption issues after that, I would expect the problem to be in the source data itself.

How to add/insert files such as pdf, xls etc. to mysql database using c# form application

I am developing a Windows Forms application in C# with MySQL as the database. I simply need to insert files of any type (.pdf, .doc, .xls, etc.) into a MySQL table column (say table attachment, column files, in a database called MYDB).
Can anyone suggest the C# code or the MySQL procedure to do this?
I searched the Internet but found no relevant answer.
Any answer would be much appreciated.
You can insert any type of file by using the code below.
Pass 0 for the uFileExpiryCapping parameter.
In most cases uFilePhysicalPath will be the same as uFilePath.
public int UploadFileToDB(string uFilePath, string uFilename, string uFileType, int uFileExpiryCapping, string uFilePhysicalPath)
{
    int intResult = 0;
    FileStream fs = null;
    try
    {
        string strConnectionString = string.Empty;

        // Read the whole file into a byte array
        fs = new FileStream(uFilePath, FileMode.Open, FileAccess.Read);
        byte[] fileData = new byte[fs.Length];
        fs.Read(fileData, 0, System.Convert.ToInt32(fs.Length));

        // Set up parameters (5 input and 1 output)
        SqlParameter[] arParms = new SqlParameter[6];

        // @UploadedFile input parameter
        arParms[0] = new SqlParameter("@UploadedFile", SqlDbType.Image);
        arParms[0].Value = fileData;
        arParms[0].Direction = ParameterDirection.Input;

        // @UploadedFileName input parameter
        arParms[1] = new SqlParameter("@UploadedFileName", SqlDbType.NVarChar, 100);
        arParms[1].Value = uFilename;
        arParms[1].Direction = ParameterDirection.Input;

        // @UploadedFileType input parameter
        arParms[2] = new SqlParameter("@UploadedFileType", SqlDbType.NVarChar, 50);
        arParms[2].Value = uFileType;
        arParms[2].Direction = ParameterDirection.Input;

        // @UploadedFileExpiryCapping input parameter
        arParms[3] = new SqlParameter("@UploadedFileExpiryCapping", SqlDbType.Int);
        arParms[3].Value = uFileExpiryCapping;
        arParms[3].Direction = ParameterDirection.Input;

        // @UploadedFileSaveLocation input parameter
        arParms[4] = new SqlParameter("@UploadedFileSaveLocation", SqlDbType.NVarChar, int.MaxValue);
        arParms[4].Value = uFilePhysicalPath;
        arParms[4].Direction = ParameterDirection.Input;

        // @ErrorCode output parameter
        arParms[5] = new SqlParameter("@ErrorCode", SqlDbType.Int);
        arParms[5].Direction = ParameterDirection.Output;

        strConnectionString = ""; // initialise connection string
        using (SqlConnection sqlConnection = new SqlConnection(strConnectionString))
        {
            intResult = SqlHelper.ExecuteNonQuery(sqlConnection, CommandType.StoredProcedure, "Your Procedure", arParms);
        }
    }
    catch (Exception ex)
    {
        // Swallowing the exception hides failures; at least log ex in real code.
    }
    finally
    {
        if (fs != null)
        {
            fs.Dispose();
        }
    }
    return intResult;
}
You can store them in BINARY/VARBINARY fields or with FILESTREAM by submitting them to your database as a byte[] (byte array). Which approach you prefer will depend on your use case (e.g. how many objects are stored and what the average file sizes are).
Recommended usage for uploading images/BLOBs in this manner is explained on MSDN.
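For MySQL specifically, which is what the question asks about, the same byte[] approach works through a parameterized INSERT. A minimal sketch using the Connector/NET provider (MySql.Data.MySqlClient); the attachment table, files column, and MYDB database come from the question, while the extra filename column, the file path, and the connection string are placeholders:
using System.IO;
using MySql.Data.MySqlClient;

byte[] fileData = File.ReadAllBytes(@"C:\docs\report.pdf");   // placeholder path; works for any file type

using (var conn = new MySqlConnection("server=localhost;database=MYDB;uid=user;pwd=pass;"))
using (var cmd = new MySqlCommand("INSERT INTO attachment (filename, files) VALUES (@name, @data)", conn))
{
    cmd.Parameters.AddWithValue("@name", "report.pdf");
    cmd.Parameters.AddWithValue("@data", fileData);   // the files column should be BLOB or LONGBLOB
    conn.Open();
    cmd.ExecuteNonQuery();
}
To read the file back, select the column, cast the result to byte[], and write it to disk with File.WriteAllBytes.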

Can't put varbinary from SQL into a byte[] in C# program using SqlDataReader and ExecuteReader

I have seen many solutions to this problem where people just use Command.ExecuteScalar as byte[];, but their SQL queries fetch one varbinary field at a time. I am trying to select about 30k rows of varbinary entries, put them into a byte[], and deserialize them.
Here is my code:
public void MNAdapter()
{
    IsoStorage retVal = new IsoStorage();

    SqlConnectionStringBuilder csb = new SqlConnectionStringBuilder();
    csb.DataSource = @"LocalMachine\SQLDEV";
    csb.InitialCatalog = "Support";
    csb.IntegratedSecurity = true;
    string connString = csb.ToString();

    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();
        SqlCommand command = conn.CreateCommand();
        command.CommandText = @"SELECT S.Settings
            from Support.dbo.SavedLocalSettings S
            inner join WebCatalog.Published.People P
              on P.PKey = S.PeopleLink
            inner join WebCatalog.Published.Company C
              on P.Link = C.PeopleList
            where S.DateSaved >= GETDATE()-34
              and C.PKey != '530F4622-C30D-DD11-A23A-00304834A8C9'
              and C.PKey != '7BAF7229-9249-449E-BEA5-4B366D7ECCD1'
              and C.PKey != 'CCBB2140-C30D-DD11-A23A-00304834A8C9'
              and S.CompanyName not like 'Tech Support%'
            Group By S.PeopleLink, S.Settings";

        using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            //DataTable dt = new DataTable();
            //dt.Load(reader);
            byte[] blob = null;
            BinaryFormatter bf = new BinaryFormatter();
            bf.Binder = new CustomBinder();
            while (reader.Read())
            {
                reader.GetBytes(0, 0, blob, 0, 100000);
                Console.WriteLine(blob.ToString());
                retVal = bf.Deserialize(new MemoryStream(blob)) as IsoStorage;
            }
        }
    }
}
I also tried putting them in a DataTable first, even though I thought that would be redundant, but the values get read in as integers.
I don't get any errors and the data reaches the data reader, but it's as if reader.GetBytes(0,0,blob,0,100000); never runs, because blob stays null.
Why not simply use:
blob = (byte[])reader["Settings"];
reader.GetBytes(0,0,blob,0,100000);
You seem to expect this method to create the byte array for you. It won't; it needs a reference to an existing array. You have to allocate the array yourself.
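A sketch of that allocation pattern, sizing the buffer from the field itself (passing a null buffer to GetBytes returns the total number of bytes in the column):
while (reader.Read())
{
    long size = reader.GetBytes(0, 0, null, 0, 0);   // length query, no data copied
    byte[] blob = new byte[size];
    reader.GetBytes(0, 0, blob, 0, (int)size);

    using (var ms = new MemoryStream(blob))
    {
        retVal = bf.Deserialize(ms) as IsoStorage;
    }
}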

Convert varbinary back to .txt file

I have a SQL Server 2008 database where I store files.
These are saved with the varbinary(max) type.
Now I need to get the .txt file back so I can loop through its contents like I used to do with StreamReader.
while ((line = file.ReadLine()) != null)
{
    string code = line.Substring(line.Length - 12);
}
But how can I convert the varbinary byte[] back into a normal .txt file so I can go through its contents line by line?
I found some ideas involving MemoryStream or FileStream but can't get them to work.
Thanks in advance!
MemoryStream m = new MemoryStream(byteArrayFromDB);
StreamReader file = new StreamReader(m);
while ((line = file.ReadLine()) != null)
{
    string code = line.Substring(line.Length - 12);
}
Try this:
System.IO.File.WriteAllBytes("path to save your file", bytes);
Here cv is a varbinary(max) field:
SqlCommand sqlCmd = new SqlCommand("SELECT cv FROM [job].[UserInfo] Where ID = 39", conn);
SqlDataReader reader = sqlCmd.ExecuteReader();
if (reader.Read())
{
    byte[] buffer = (byte[])reader["cv"];
    File.WriteAllBytes("c:\\cv1.txt", buffer);
}
static void Main(string[] args)
{
    GetData();
}

public static void GetData()
{
    SqlConnection conn = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["main"].ConnectionString);
    conn.Open();

    string query = "SELECT FileStream FROM [EventOne] Where ID=2";
    SqlCommand cmd = new SqlCommand(query, conn);

    DataTable dt = new DataTable();
    dt.Load(cmd.ExecuteReader());

    byte[] buffer = dt.AsEnumerable().Select(c => c.Field<byte[]>("FileStream")).SingleOrDefault();
    File.WriteAllBytes("c:\\FileStream.txt", buffer);
}
