How to fill PostgreSQL tables from C# DataSet via Npgsql?

I have a DataSet in C# with DataTables, and a PostgreSQL database with the same tables. I fill the DataTables in my code and want to INSERT them into the PostgreSQL database. I tried inserting with simple SQL queries (INSERT INTO...), but it's very slow when I have hundreds of tables with thousands of rows. I guess using a DataAdapter will improve performance, but I can't understand how it works. Can you explain it to me with two example cases?
case 1:
Inserting the DataSet's tables into PostgreSQL with a DataAdapter.
case 2:
Inserting only unique values from the DataSet into PostgreSQL (when a table in the database has rows with unique keys and the DataTable contains the same keys).
Or maybe you can suggest what to read to learn about DataAdapters... Anyway, thanks.

With the exception of trivially small datasets, you're going to have a hard time beating the performance of Npgsql's implementation of COPY, which is available via the BeginTextImport method of your NpgsqlConnection object.
So, regardless of how your data exists in your application, if you dump the output via the text import (COPY), it should be very zippy. Here is an example of how you would do that with a DataTable. Bear in mind that the columns in the DataTable and the columns in the database table have to line up; if not, you need to manage that one way or the other.
This presupposes Npgsql 3.1.9 or higher.
object[] outRow = new object[dt.Columns.Count];

using (var writer = conn.BeginTextImport("COPY <table> FROM STDIN WITH NULL AS '' CSV"))
{
    foreach (DataRow rw in dt.Rows)
    {
        // Copy the row's values into the reusable buffer...
        for (int col = 0; col < dt.Columns.Count; col++)
            outRow[col] = rw[col];
        // ...and emit them as one CSV line. Values containing commas or
        // quotes would need proper CSV escaping, which this skips.
        writer.WriteLine(string.Join(",", outRow));
    }
}
As far as duplicates... wow, that really depends. Define "duplicates." If it's just a "select distinct," then it also depends on how many duplicates you expect. If it's a small amount, List<T>.Exists would probably be adequate, but if you have a large number of dupes, a Dictionary would make each lookup a lot more efficient: a typical List lookup is O(n), while a Dictionary lookup is O(1).
Here's a pretty brute-force example of a dictionary distinct insert for the above example:
object[] outRow = new object[dt.Columns.Count];
Dictionary<string, bool> already = new Dictionary<string, bool>();
bool test;

using (var writer = conn.BeginTextImport("COPY <table> FROM STDIN WITH NULL AS '' CSV"))
{
    foreach (DataRow rw in dt.Rows)
    {
        for (int col = 0; col < dt.Columns.Count; col++)
            outRow[col] = rw[col];
        string output = string.Join(",", outRow);
        // Only write lines we haven't emitted before.
        if (!already.TryGetValue(output, out test))
        {
            writer.WriteLine(output);
            already.Add(output, true);
        }
    }
}
Disclaimer: This is a memory pig. If you can manage dupes any other way, or guarantee the ordering of the data, there are numerous other options.
If you can't (or won't) use a bulk copy insert, something that would help performance is wrapping your inserts in a transaction (NpgsqlTransaction), but for hundreds of thousands of rows, I can't see why you would.
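For completeness, here is a minimal sketch of that transaction approach. The table and column names (my_table, col1, col2) are placeholders, not from the original question:

// Batched inserts inside one transaction; assumes my_table(col1 text, col2 text).
using (var tx = conn.BeginTransaction())
using (var cmd = new NpgsqlCommand(
    "INSERT INTO my_table (col1, col2) VALUES (@p1, @p2)", conn, tx))
{
    var p1 = cmd.Parameters.Add("p1", NpgsqlTypes.NpgsqlDbType.Text);
    var p2 = cmd.Parameters.Add("p2", NpgsqlTypes.NpgsqlDbType.Text);
    foreach (DataRow rw in dt.Rows)
    {
        p1.Value = rw[0];
        p2.Value = rw[1];
        cmd.ExecuteNonQuery(); // one round trip per row, but only one commit overall
    }
    tx.Commit();
}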

Related

Avoid duplication in DataTable query and build

What would be the right way to avoid duplication when querying a DataTable and then saving the result to another DataTable? I'm using the pattern below, which gets very error-prone once tables grow. I looked at the hints below: with the first one, CopyToDataTable() looks not really applicable, and the second is much too complex for the task. I would like to split the code below into two separate methods (the first to build the query and the second to retrieve the DataTable). Perhaps if I avoid the anonymous type in the query it would be easier to avoid hardcoding all the column names, but I'm somewhat lost with this.
Filling a DataSet or DataTable from a LINQ query result set
or
https://msdn.microsoft.com/en-us/library/bb669096%28v=vs.110%29.aspx
public DataTable retrieveReadyReadingDataTable()
{
    DataTable dtblReadyToSaveToDb = RetrieveDataTableExConstraints();
    var query = from scr in scrTable.AsEnumerable()
                from products in productsTable.AsEnumerable()
                where scr.Field<string>("EAN") == products.Field<string>("EAN")
                select new
                {
                    Date = DateTime.Today.Date,
                    ProductId = products.Field<string>("SkuCode"),
                    Distributor = scr.Field<string>("Distributor"),
                    Price = float.Parse(scr.Field<string>("Price")),
                    Url = scr.Field<string>("Url")
                };
    foreach (var q in query)
    {
        DataRow newRow = dtblReadyToSaveToDb.Rows.Add();
        newRow.SetField("Date", q.Date);
        newRow.SetField("ProductId", q.ProductId);
        newRow.SetField("Distributor", q.Distributor);
        newRow.SetField("Price", q.Price);
        newRow.SetField("Url", q.Url);
    }
    return dtblReadyToSaveToDb;
}
Firstly, you have to decide what "duplicate" means in your case. According to your code, I would say a duplicate is a row with the same values in the columns Date, ProductId and Distributor, so add a multi-column primary key for those columns first.
Secondly, you should add some code that first queries the existing rows and then compares them to the rows you want to create. If a match is found, simply don't insert a new row.
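A minimal sketch of that check, assuming duplicates are defined by (Date, ProductId, Distributor); existingRows and the "|"-joined key format are illustrative, not from the original code:

// Remember the composite keys already present in the database.
var seen = new HashSet<string>();
foreach (DataRow existing in existingRows) // previously queried from the DB
    seen.Add(existing["Date"] + "|" + existing["ProductId"] + "|" + existing["Distributor"]);

foreach (var q in query)
{
    string key = q.Date + "|" + q.ProductId + "|" + q.Distributor;
    if (!seen.Add(key))
        continue; // already in the database (or earlier in this query); skip it
    DataRow newRow = dtblReadyToSaveToDb.Rows.Add();
    newRow.SetField("Date", q.Date);
    newRow.SetField("ProductId", q.ProductId);
    newRow.SetField("Distributor", q.Distributor);
    newRow.SetField("Price", q.Price);
    newRow.SetField("Url", q.Url);
}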

Filling table takes a lot of time in MS Word

I made the following code to add an external data table to another table in an MS Word document. It works fine, but it takes a lot of time when the number of rows is more than 100, and with a row count above 500 it fills the Word table really slowly and can't complete the task.
I tried hiding the document and disabling screen updating, but there is still no solution for the slow performance.
//Get the required external data into the DT data table
DataTable DT = XDt.GetData();
Word.Table TB; // assumed to be assigned elsewhere to the target Word table
int X = 1;
foreach (DataRow Rw in DT.Rows)
{
    // Insert a new row before the row at index X + 1
    Word.Row Rn = TB.Rows.Add(TB.Rows[X + 1]);
    for (int i = 0; i <= DT.Columns.Count - 1; i++)
    {
        Rn.Cells[i + 1].Range.Text = Rw[i].ToString();
    }
    X++;
}
So is there a way to make this process go faster?
The most efficient way to add a table to Word is to first concatenate the data in a delimited text string, where "\n" must be the symbol for end-of-row (record separator). The end-of-cell (field separator) can be any character you like that's not in the string content that makes up the table.
Assign this string to a Range object, then use the ConvertToTable() method to create the table.
You're retrieving the last row of the current table for the BeforeRow parameter of TB.Rows.Add. This is significantly slower than simply adding the row. You should replace this:
Word.Row Rn = TB.Rows.Add(TB.Rows[X + 1]);
With this:
Word.Row Rn = TB.Rows.Add();
Utilizing parallelization as suggested in the comments might help slightly, but I'm afraid it's not going to do much good, seeing as the table-add code runs on the main thread, as mentioned in this link.
EDIT:
If performance is still an issue, I'd look into creating the Word table independently of the Word object model by using OpenXML. It's orders of magnitude faster.
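For illustration, a minimal OpenXML sketch using the DocumentFormat.OpenXml SDK; the output file name and the bare, unstyled table (no borders, widths or properties) are simplifications:

using System.Data;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Wordprocessing;

// Build the table directly in the .docx package, with no Word interop at all.
using (var doc = WordprocessingDocument.Create("out.docx", WordprocessingDocumentType.Document))
{
    var mainPart = doc.AddMainDocumentPart();
    mainPart.Document = new Document(new Body());

    var table = new Table();
    foreach (DataRow Rw in DT.Rows)
    {
        var row = new TableRow();
        for (int i = 0; i < DT.Columns.Count; i++)
            row.Append(new TableCell(
                new Paragraph(new Run(new Text(Rw[i].ToString())))));
        table.Append(row);
    }
    mainPart.Document.Body.Append(table);
}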
The ConvertToTable method is orders of magnitude faster than adding Rows/Cells one at a time:
var items = new List<string>();
object[] values;

// Build one tab-delimited string per record.
while (reader.Read())
{
    values = new object[reader.FieldCount];
    reader.GetValues(values);
    items.Add(String.Join("\t", values));
}

// Rows are separated by newlines.
var data = String.Join("\n", items.ToArray());
int rows = items.Count;

var tempDocument = application.Documents.Add();
var range = tempDocument.Range();
range.Text = data;

// A single interop call converts the whole string into a table.
var tempTable = range.ConvertToTable(
    Separator: Microsoft.Office.Interop.Word.WdTableFieldSeparator.wdSeparateByTabs,
    NumColumns: reader.FieldCount,
    NumRows: rows,
    DefaultTableBehavior: WdDefaultTableBehavior.wdWord9TableBehavior,
    AutoFitBehavior: WdAutoFitBehavior.wdAutoFitWindow);

Insert large amount of data into database using LINQ

I want to insert around 1 million records into a database using LINQ in ASP.NET MVC, but when I try the following code it doesn't work: it throws an OutOfMemoryException, and the loop took 3 days. Can anyone please help me with this?
db.Database.ExecuteSqlCommand("DELETE FROM [HotelServices]");

DataTable tblRepeatService = new DataTable();
tblRepeatService.Columns.Add("HotelCode", typeof(System.String));
tblRepeatService.Columns.Add("Service", typeof(System.String));
tblRepeatService.Columns.Add("Category", typeof(System.String));

foreach (DataRow row in xmltable.Rows)
{
    string[] servicesarr = Regex.Split(row["PAmenities"].ToString(), ";");
    for (int a = 0; a < servicesarr.Length; a++)
    {
        tblRepeatService.Rows.Add(row["HotelCode"].ToString(), servicesarr[a], "PA");
    }
    String[] servicesarrA = Regex.Split(row["RAmenities"].ToString(), ";");
    for (int b = 0; b < servicesarrA.Length; b++)
    {
        tblRepeatService.Rows.Add(row["HotelCode"].ToString(), servicesarrA[b], "RA");
    }
}

HotelAmenties _hotelamenties;
foreach (DataRow hadr in tblRepeatService.Rows)
{
    _hotelamenties = new HotelAmenties();
    _hotelamenties.Id = Guid.NewGuid();
    _hotelamenties.ServiceName = hadr["Service"].ToString();
    _hotelamenties.HotelCode = hadr["HotelCode"].ToString();
    db.HotelAmenties.Add(_hotelamenties);
}
db.SaveChanges();
tblRepeatService table has around 1 million rows.
Bulk inserts like this are highly inefficient in LINQ to SQL. Every insert creates at least three objects (the DataRow, the HotelAmenities object and the tracking record for it), chewing up memory on objects you don't need.
Given that you already have a DataTable, you can use System.Data.SqlClient.SqlBulkCopy to push the contents of the table to a temporary table on the SQL Server, then use a single INSERT statement to load the data into its final destination. This is the fastest way I have found so far to move many thousands of records from memory to SQL.
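For illustration, a minimal sketch of that staging-table approach; the staging table name and the final INSERT are assumptions to adapt to your schema:

// Bulk copy the DataTable into a staging table in one shot.
using (var bulk = new System.Data.SqlClient.SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.HotelAmenitiesStaging"; // hypothetical staging table
    bulk.ColumnMappings.Add("HotelCode", "HotelCode");
    bulk.ColumnMappings.Add("Service", "Service");
    bulk.ColumnMappings.Add("Category", "Category");
    bulk.WriteToServer(tblRepeatService);
}
// Then one set-based statement on the server, e.g.:
// INSERT INTO HotelAmenties (Id, ServiceName, HotelCode)
// SELECT NEWID(), Service, HotelCode FROM dbo.HotelAmenitiesStaging;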
If performance doesn't matter and this is a one-shot job, you can stick with the approach you're using. Your problem is that you only save at the end, so Entity Framework has to store and generate the SQL for 1 million operations at once. Modify your code so that you save every 1,000 or so inserts instead of only at the end, and it should work just fine.
int i = 0;
foreach (DataRow hadr in tblRepeatService.Rows)
{
    _hotelamenties = new HotelAmenties();
    _hotelamenties.Id = Guid.NewGuid();
    _hotelamenties.ServiceName = hadr["Service"].ToString();
    _hotelamenties.HotelCode = hadr["HotelCode"].ToString();
    db.HotelAmenties.Add(_hotelamenties);
    if ((i % 1000) == 0)
    {
        db.SaveChanges(); // flush a batch instead of accumulating everything
    }
    i++;
}
db.SaveChanges(); // flush the final partial batch

C# Optimisation: Inserting 200 million rows into database

I have the following (simplified) code which I'd like to optimise for speed:
long inputLen = 50000000; // 50 million
DataTable dataTable = new DataTable();
DataRow dataRow;
object[] objectRow = new object[4]; // reusable buffer for the four columns

while (inputLen-- > 0)
{
    objectRow[0] = ...
    objectRow[1] = ...
    objectRow[2] = ...

    // Generate output for this input
    output = ...

    for (int i = 0; i < outputLen; i++) // outputLen can range from 1 to 20,000
    {
        objectRow[3] = output[i];
        dataRow = dataTable.NewRow();
        dataRow.ItemArray = objectRow;
        dataTable.Rows.Add(dataRow);
    }
}

// Bulk copy
SqlBulkCopy bulkTask = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null);
bulkTask.DestinationTableName = "newTable";
bulkTask.BatchSize = dataTable.Rows.Count;
bulkTask.WriteToServer(dataTable);
bulkTask.Close();
I'm already using SqlBulkCopy in an attempt to speed things up, but it appears that assigning values to the DataTable itself is slow.
I don't know how DataTables work, so I'm wondering if I'm creating unnecessary overhead by first creating a reusable array, then assigning it to a DataRow, and then adding the DataRow to the DataTable. Or is using a DataTable not optimal in the first place? The input comes from a database.
I don't care much about LOC, just speed. Can anyone give some advice on this?
For such a big table, you should instead use the
public void WriteToServer(IDataReader reader)
method.
It may mean you'll have to implement a "fake" IDataReader yourself (if you don't get the data from an existing IDataReader), but this way you'll get streaming from end to end, and will avoid a 200 million row loop.
Instead of holding a huge data table in memory, I would suggest implementing an IDataReader which serves up the data as the bulk copy goes. This will reduce the need to keep everything in memory upfront, and should thus improve performance.
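For illustration, here is a sketch of the streaming approach for the simplest case, where the per-row transformation can be pushed into the source query itself; the connection strings and the SELECT text are placeholders:

using System.Data.SqlClient;

// Stream straight from a source query into the destination table;
// nothing is buffered in the application.
using (var src = new SqlConnection(sourceConnectionString))
using (var dest = new SqlConnection(destConnectionString))
{
    src.Open();
    dest.Open();
    using (var cmd = new SqlCommand("SELECT /* transformed columns */ ...", src))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dest, SqlBulkCopyOptions.TableLock, null))
    {
        bulk.DestinationTableName = "newTable";
        bulk.BulkCopyTimeout = 0;   // disable the default 30-second timeout
        bulk.WriteToServer(reader); // streams end to end
    }
}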
You should not construct the entire DataTable in memory. Use the overload of WriteToServer that takes an array of DataRow, and just split your data into chunks.
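A minimal sketch of that chunking idea, reusing the question's variables (note bulkTask has to be created before the loop here); the same pattern works with the WriteToServer(DataRow[]) overload, and the chunk size is arbitrary:

const int chunkSize = 100000; // tune for your memory budget
long inputLen = 50000000;
while (inputLen-- > 0)
{
    // ... fill objectRow and generate output exactly as in the question ...
    dataRow = dataTable.NewRow();
    dataRow.ItemArray = objectRow;
    dataTable.Rows.Add(dataRow);

    if (dataTable.Rows.Count == chunkSize)
    {
        bulkTask.WriteToServer(dataTable); // flush this chunk to the server
        dataTable.Clear();                 // release rows that were already sent
    }
}
if (dataTable.Rows.Count > 0)
    bulkTask.WriteToServer(dataTable);     // flush the remainder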

What is the best way to fast-insert SQL data and dependent rows?

I need to write some code to insert around 3 million rows of data.
At the same time I need to insert the same number of companion rows.
I.e. schema looks like this:
Item
- Id
- Title
Property
- Id
- FK_Item
- Value
My first attempt was something vaguely like this:
BaseDataContext db = new BaseDataContext();
foreach (var value in values)
{
    Item i = new Item() { Title = value["title"] };
    ItemProperty ip = new ItemProperty() { Item = i, Value = value["value"] };
    db.Items.InsertOnSubmit(i);
    db.ItemProperties.InsertOnSubmit(ip);
}
db.SubmitChanges();
Obviously this was terribly slow so I'm now using something like this:
BaseDataContext db = new BaseDataContext();
DataTable dt = new DataTable("Item");
dt.Columns.Add("Title", typeof(string));
foreach (var value in values)
{
    DataRow item = dt.NewRow();
    item["Title"] = value["title"];
    dt.Rows.Add(item);
}
using (System.Data.SqlClient.SqlBulkCopy sb = new System.Data.SqlClient.SqlBulkCopy(db.Connection.ConnectionString))
{
    sb.DestinationTableName = "dbo.Item";
    sb.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Title", "Title"));
    sb.WriteToServer(dt);
}
But this doesn't allow me to add the corresponding 'Property' rows.
I'm thinking the best solution might be to add a Stored Procedure like this one that generically lets me do a bulk insert (or at least multiple inserts, but I can probably disable logging in the stored procedure somehow for performance) and then returns the corresponding ids.
Can anyone think of a better (i.e. more succinct, near equal performance) solution?
To combine the two previous best answers and add in the missing piece for the IDs:
1) Use BCP to load the data into a temporary "staging" table defined like this:
CREATE TABLE stage (Title VARCHAR(??), Value {whatever});
and you'll need the appropriate index for performance later:
CREATE INDEX ix_stage ON stage (Title);
2) Use a SQL INSERT to load the Item table:
INSERT INTO Item (Title) SELECT Title FROM stage;
3) Finally, load the Property table by joining stage with Item:
INSERT INTO Property (FK_Item, Value)
SELECT Item.Id, stage.Value
FROM stage
JOIN Item ON Item.Title = stage.Title;
The best way to move that much data into SQL Server is bcp. Assuming the data starts in some sort of file, you'll need to write a small script to funnel the data into the two tables. Alternatively, you could use bcp to funnel the data into a single table and then use a stored procedure to INSERT the data into the two tables.
Bulk copy the data into a temporary table, and then call a stored proc that splits the data into the two tables you need to populate.
You can bulk copy in code as well, using the .NET SqlBulkCopy class.
