Is there a performant way to load very big CSV files (several gigabytes in size) into a SQL Server 2008 database with .NET?
I would combine this CSV reader with SqlBulkCopy; i.e.
using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = has header row
using (var bcp = new SqlBulkCopy(connection))
{
    bcp.DestinationTableName = "TableName";
    bcp.WriteToServer(csv);
}
This uses the bulk-copy API to do the inserts, while using a fully managed (and fast) IDataReader implementation that, crucially, streams the data rather than loading it all at once.
Look into using the SqlBulkCopy class.
Use SqlBulkCopy
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
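For completeness, a minimal sketch of the plain SqlBulkCopy route (untested; the connection string, table name, and source DataTable are placeholders). For multi-gigabyte CSVs, the streaming IDataReader approach above is still preferable to materializing a DataTable:

using System.Data;
using System.Data.SqlClient;

// bulk-load an in-memory DataTable into a target table
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bcp = new SqlBulkCopy(conn))
    {
        bcp.DestinationTableName = "dbo.TargetTable"; // placeholder table name
        bcp.BatchSize = 10000;    // commit every 10,000 rows
        bcp.BulkCopyTimeout = 0;  // disable the command timeout for very large loads
        bcp.WriteToServer(table); // table is a System.Data.DataTable
    }
}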
Related
I need to export a Postgres DB (around 20 tables) to Excel using C#. I need to apply some logic to the data from the DB and then export it. Any idea how to export all the data using C#?
using Npgsql;
using OfficeOpenXml; // NuGet: EPPlus
using System.Data;
using System.IO;
EPPlus has a one-step method to export a data table into a spreadsheet, so if you leveraged this, you should be able to loop through your queries and export each one to a unique sheet.
Something like this (untested, but it should be 99% there) should do the trick:
FileInfo fi = new FileInfo("foo.xlsx");
ExcelPackage excel = new ExcelPackage(fi);
int sheet = 1;

foreach (string sql in sqlQueries)
{
    DataTable dt = new DataTable();
    NpgsqlCommand cmd = new NpgsqlCommand(sql, conn);
    NpgsqlDataAdapter da = new NpgsqlDataAdapter(cmd);
    da.Fill(dt);

    // one worksheet per query
    ExcelWorksheet ws = excel.Workbook.Worksheets.Add(string.Format("Sheet{0}", sheet++));
    ws.Cells["A1"].LoadFromDataTable(dt, true); // true = include column headers
}

excel.Save();
Of course, I'd recommend some refinements to deal with data types, formatting, and the like, but this is the basic construct.
Also, of course, wrap the IDisposable objects in using blocks liberally, as in the sketch below.
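For instance, a sketch of the same loop with the disposables wrapped (untested; it assumes the same sqlQueries and conn as above):

using (var excel = new ExcelPackage(new FileInfo("foo.xlsx")))
{
    int sheet = 1;
    foreach (string sql in sqlQueries)
    {
        var dt = new DataTable();
        using (var cmd = new NpgsqlCommand(sql, conn))
        using (var da = new NpgsqlDataAdapter(cmd))
        {
            da.Fill(dt);
        }
        ExcelWorksheet ws = excel.Workbook.Worksheets.Add(string.Format("Sheet{0}", sheet++));
        ws.Cells["A1"].LoadFromDataTable(dt, true);
    }
    excel.Save();
}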
The problem can be divided into two subproblems:
getting data into C# from Postgres,
pushing that data into Excel.
Now, solving one problem at a time:
Here is a good article on working with Postgres from C#.
Once you have your data in C#, you can use any one of the many libraries available for working with Excel from C#. One of them is NPOI; here is one with an example, and a rough sketch follows below.
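A rough, untested sketch of the NPOI side, assuming the query results have already been loaded into a DataTable (the method name and file path are made up):

using System.Data;
using System.IO;
using NPOI.SS.UserModel;
using NPOI.XSSF.UserModel; // NuGet: NPOI

static void WriteToExcel(DataTable dt, string path)
{
    var workbook = new XSSFWorkbook();
    string name = string.IsNullOrEmpty(dt.TableName) ? "Sheet1" : dt.TableName;
    ISheet sheet = workbook.CreateSheet(name);

    // header row from the column names
    IRow header = sheet.CreateRow(0);
    for (int c = 0; c < dt.Columns.Count; c++)
        header.CreateCell(c).SetCellValue(dt.Columns[c].ColumnName);

    // one spreadsheet row per DataRow (everything written as text here;
    // map the types properly in real code)
    for (int r = 0; r < dt.Rows.Count; r++)
    {
        IRow row = sheet.CreateRow(r + 1);
        for (int c = 0; c < dt.Columns.Count; c++)
            row.CreateCell(c).SetCellValue(dt.Rows[r][c].ToString());
    }

    using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
        workbook.Write(fs);
}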
Happy coding!
I want to export a .NET DataTable to CSV using FileHelpers.
CommonEngine.DataTableToCsv doesn't cover my needs, as it only writes to a file; I need the result in memory.
I found a post that used the snippet below, but its accepted answer was again CommonEngine.DataTableToCsv:
DelimitedClassBuilder cb = new DelimitedClassBuilder("DataRow", ",", table); // build a record class from the DataTable's columns
Type t = cb.CreateRecordClass();
DelimitedFileEngine engine = new DelimitedFileEngine(t);
Any help would be awesome.
Thanks
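One way to keep the output in memory is to combine that snippet with the engine's WriteString. An untested sketch (it assumes the generated record type exposes public fields named exactly after the DataTable's columns; DelimitedClassBuilder lives in FileHelpers.RunTime or FileHelpers.Dynamic depending on the FileHelpers version):

using System;
using System.Collections.Generic;
using System.Data;
using FileHelpers;

DelimitedClassBuilder cb = new DelimitedClassBuilder("DataRow", ",", table);
Type t = cb.CreateRecordClass();
DelimitedFileEngine engine = new DelimitedFileEngine(t);

// copy each DataRow into an instance of the generated record type
var records = new List<object>();
foreach (DataRow row in table.Rows)
{
    object rec = Activator.CreateInstance(t);
    foreach (DataColumn col in table.Columns)
        t.GetField(col.ColumnName).SetValue(rec, row[col]); // assumes field name == column name
    records.Add(rec);
}

string csv = engine.WriteString(records); // the CSV as an in-memory string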
I'm trying to grab an image from the web, and add it to my database.
String lsResponse = string.Empty;
using (HttpWebResponse lxResponse = (HttpWebResponse)req1.GetResponse())
{
    using (BinaryReader reader = new BinaryReader(lxResponse.GetResponseStream()))
    {
        Byte[] lnByte = reader.ReadBytes(1 * 1024 * 1024 * 10); // read up to 10 MB
        using (FileStream lxFS = new FileStream(id + ".png", FileMode.Create))
        {
            lxFS.Write(lnByte, 0, lnByte.Length);
        }
    }
}
My SQL Server data type is varbinary(max). I tried different types (image, ...) and they did not work.
Then I put it into the database:
SqlCommand cmd = new SqlCommand("INSERT INTO PlayersDB (id) VALUES (@id)");
cmd.Parameters.AddWithValue("@id", lnByte);
I keep getting an error:
Implicit conversion from data type nvarchar to varbinary(max) is not allowed. Use the CONVERT function to run this query.
So, my program sees my lnByte not as binary?
First, be sure the data type is correct. I don't know much about SQL Server, but I've encountered a similar problem in MySQL: I changed the data type to Blob and it worked.
We use blobs for storing images within a SQL db at my workplace. I'm not particularly experienced with it myself, but here are some TechNet resources that can explain it better than I can:
Binary Large Object (Blob) Data (SQL Server)
FILESTREAM (SQL Server)
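Coming back to the error in the question: a minimal sketch (untested; it reuses lnByte and an assumed open connection from the question) that types the parameter explicitly as varbinary(max), so no implicit nvarchar conversion is attempted:

using System.Data;
using System.Data.SqlClient;

using (var cmd = new SqlCommand("INSERT INTO PlayersDB (id) VALUES (@id)", connection))
{
    cmd.Parameters.Add("@id", SqlDbType.VarBinary, -1).Value = lnByte; // -1 = varbinary(max)
    cmd.ExecuteNonQuery();
}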
I have a FileStream connected to an XML file that I would like to read directly into a SHA512 object in order to compute a hash for the purposes of a checksum (not a security use).
The issue is twofold:
I want to omit some of the nodes in the XML,
the file is quite large, and I would rather not load the whole thing into memory.
I could read the whole file into an XML structure, delete the nodes, then write it to a stream that is fed into SHA512.ComputeHash, but that would cost performance. I would prefer to delete the nodes as an operation on a stream and then chain the streams together into a single stream that can be passed into SHA512.ComputeHash(Stream).
How can I accomplish this?
using (var hash = new SHA512Cng())
using (var stream = new CryptoStream(Stream.Null, hash, CryptoStreamMode.Write))
using (var writer = XmlWriter.Create(stream))
using (var reader = XmlReader.Create("input.xml"))
{
    while (reader.Read())
    {
        // ... write node to writer ...
    }
    writer.Flush();
    stream.FlushFinalBlock();
    var result = hash.Hash;
}
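The elided middle is the interesting part. Here is a sketch of that loop, loosely based on the standard shallow-copy pattern for XmlReader/XmlWriter; the element name "excluded" is a placeholder for whatever nodes you want to omit:

if (reader.Read())
{
    while (!reader.EOF)
    {
        // drop unwanted elements together with their whole subtree;
        // Skip() already advances the reader, so loop without another Read()
        if (reader.NodeType == XmlNodeType.Element && reader.LocalName == "excluded")
        {
            reader.Skip();
            continue;
        }

        switch (reader.NodeType)
        {
            case XmlNodeType.Element:
                writer.WriteStartElement(reader.Prefix, reader.LocalName, reader.NamespaceURI);
                writer.WriteAttributes(reader, true);
                if (reader.IsEmptyElement)
                    writer.WriteEndElement();
                break;
            case XmlNodeType.Text:
                writer.WriteString(reader.Value);
                break;
            case XmlNodeType.CDATA:
                writer.WriteCData(reader.Value);
                break;
            case XmlNodeType.EndElement:
                writer.WriteFullEndElement();
                break;
            // forward comments, whitespace, PIs, etc. the same way if the
            // checksum should cover them
        }

        if (!reader.Read())
            break;
    }
}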
I want to select a picture that is saved as a large object in a PostgreSQL database.
I know lo_export can do this,
but there is a problem: I want to save the picture directly to my computer, because I can't access files saved on the server using lo_export.
(I think it would be best for me if the picture could be transferred to my computer by a select query.)
I don't exactly know my way around C#, but the Npgsql Manual has an example sort of like this of writing a bytea column to a file:
command = new NpgsqlCommand("select blob from t where id = 1", conn);
Byte[] result = (Byte[])command.ExecuteScalar();
FileStream fs = new FileStream(args[0] + "database", FileMode.Create, FileAccess.Write);
BinaryWriter bw = new BinaryWriter(new BufferedStream(fs));
bw.Write(result);
bw.Flush();
bw.Close(); // close the writer before the underlying stream
fs.Close();
So you just read it out of the database pretty much like any other column and write it to a local file. The example is about halfway down the page I linked to; just search for "bytea" and you'll find it.
UPDATE: For large objects, the process appears to be similar but less SQL-ish. The manual (as linked to above) includes a few large object examples:
NpgsqlTransaction t = Polacz.BeginTransaction(); // large object access must happen inside a transaction
LargeObjectManager lbm = new LargeObjectManager(Polacz);
LargeObject lo = lbm.Open(takeOID(idtowaru), LargeObjectManager.READWRITE); // take picture oid from method takeOID
byte[] buf = lo.Read(lo.Size());
MemoryStream ms = new MemoryStream();
ms.Write(buf, 0, buf.Length);
// ...
ms.Position = 0; // rewind before handing the stream to Image.FromStream
Image zdjecie = Image.FromStream(ms);
Search the manual for "large object" and you'll find it.
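Since the goal is to save the picture on the client machine, the buffer could just as well be written straight to a local file instead of a MemoryStream (an untested sketch reusing lo and t from the example above; the path is made up, and File.WriteAllBytes comes from System.IO):

byte[] buf = lo.Read(lo.Size());
File.WriteAllBytes(@"C:\temp\picture.png", buf); // hypothetical local path
lo.Close();
t.Commit();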
I'm not familiar with C#, but if you have contrib/dblink around and better access to a separate PostgreSQL server, this might work:
Select the large object from the bad DB server.
Copy the large object into the good DB server using dblink.
Run lo_export on the good DB server.
If your pictures don't exceed 1 GB (and you don't need to access only parts of the bytes), then bytea is the better choice for storing them.
A lot of SQL GUI tools also let you directly download (and even view) the content of bytea columns.