Handling large data from webservice to SQL Server CE database - c#

In a Windows Mobile application I am calling a web service that returns a large amount of data as an array. I then insert that data into a SQL Server CE database on the device. Right now this takes too much time, as there are a lot of tables and a large amount of data for each table. Please suggest a faster way to insert the data into the SQL Server CE database.
// Open the table directly for updatable, cursor-based inserts.
ce_command.Connection = Database.GetDbConnection();
ce_command.CommandType = CommandType.TableDirect;
ce_command.CommandText = "NHH_SOURCE";

SqlCeResultSet rsSource = ce_command.ExecuteResultSet(ResultSetOptions.Updatable);
SqlCeUpdatableRecord recSource = rsSource.CreateRecord();

NPFWebService.WebServiceGetNHH_Source[] get_source = npf_WS.GetSourceData();
if (get_source.Length > 0)
{
    for (int i = 0; i < get_source.Length; i++)
    {
        recSource.SetValue(0, get_source[i].sourceID);
        recSource.SetValue(1, get_source[i].sourceName.Replace("'", "''"));
        recSource.SetValue(2, get_source[i].organizationID);
        recSource.SetValue(3, get_source[i].transferF);
        recSource.SetValue(4, get_source[i].transferDate);
        rsSource.Insert(recSource);
    }
}

Looks like you're already using the fastest approach. Does your table have an index (or indexes)? You might be able to save some time by dropping them before the insert and recreating them after it completes.
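A minimal sketch of that, assuming Database.GetDbConnection() returns a SqlCeConnection and using a hypothetical index named NHH_SOURCE_IDX on the sourceID column:

SqlCeConnection conn = (SqlCeConnection)Database.GetDbConnection();

// Hypothetical index name; substitute whatever actually exists on NHH_SOURCE.
using (SqlCeCommand ddl = new SqlCeCommand("DROP INDEX NHH_SOURCE.NHH_SOURCE_IDX", conn))
{
    ddl.ExecuteNonQuery(); // drop before the bulk insert
}

// ... run the SqlCeResultSet insert loop from the question here ...

using (SqlCeCommand ddl = new SqlCeCommand("CREATE INDEX NHH_SOURCE_IDX ON NHH_SOURCE (sourceID)", conn))
{
    ddl.ExecuteNonQuery(); // recreate once the data is loaded
}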

Related

Bulkcopy 100 million rows of data to Azure SQL Server using C#?

I have a local SQL Server 2017 database, and I need to copy two tables to an Azure SQL Server database; one table has over 100 million rows of data including a "geography" type column. How do I do that?
Right now I am running a bulk copy:
using (SqlConnection streamsConnection = new SqlConnection(streamsConnectionString))
{
    streamsConnection.Open();
    using (SqlConnection cloudConnection = new SqlConnection(cloudConnectionString))
    {
        cloudConnection.Open();
        using (SqlCommand cmd = streamsConnection.CreateCommand())
        using (SqlBulkCopy bcp = new SqlBulkCopy(cloudConnection))
        {
            bcp.DestinationTableName = "GroundDataNodes";
            bcp.BatchSize = 200000;
            bcp.BulkCopyTimeout = 1200;
            bcp.NotifyAfter = 100000;
            bcp.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);
            cmd.CommandText = "SELECT [Id], [nodeid], [latlon], [type], [label], [code], [lat], [lon] FROM [dbo].[GroundDataNodes]";
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                bcp.WriteToServer(reader);
            }
            Console.WriteLine("Finished!");
            Console.ReadLine();
        }
    }
}
But I'm quite new to the bulk load side and am wondering how I can improve this so it doesn't take weeks to run...
Give Azure Database Migration Service a try. I migrated an on-premises SQL Server with 10 million rows in about a day. But of course it also largely depends on your bandwidth.
Also:
Use the multi CPU General Purpose Pricing Tier when you create your service instance to allow the service to take advantage of multiple vCPUs for parallelization and faster data transfer.
Temporarily scale up your Azure SQL Database target instance to the Premium tier SKU during the data migration operation to minimize Azure SQL Database throttling that may impact data transfer activities when using lower-level SKUs.
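If you keep the SqlBulkCopy approach from the question, a couple of settings can also cut overhead on the destination; a sketch of just the changed lines (whether they help enough in your case is something to measure):

// TableLock takes a bulk-update lock instead of row locks, which is
// usually faster when nothing else is writing to the table.
using (SqlBulkCopy bcp = new SqlBulkCopy(
    cloudConnectionString,          // connection-string overload, so the option flag can be passed
    SqlBulkCopyOptions.TableLock))
{
    bcp.DestinationTableName = "GroundDataNodes";
    bcp.EnableStreaming = true;     // stream rows from the IDataReader instead of buffering them in memory
    bcp.BatchSize = 0;              // 0 sends all rows as a single batch
    // ... WriteToServer(reader) as in the question ...
}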
So I have tried all sorts of upload approaches (bulk upload, migration, backup, etc.), but they all had the same problem: my upload speed isn't up to it. They would work, but take days to run. So I decided to write a server-side bit of code to populate the database directly from there, taking my upload speed out of the equation. I imagine that with better upload speeds the migration tool would have worked fine, even with the geography field: not quick, but it would work.

.NET slow insert in remote SQL Server database

I developed a client .NET WinForms application. This application is used to populate a database with thousands of records. For the data layer I used EF6.
When I run the application locally, everything works as expected. Insertions are very fast (over 500,000 records in about 2 minutes).
Now I'm trying to use a remote database on a hosted server, and I notice that insertions are very, very slow (fewer than 500 records in about 2 minutes). That's 1,000 times slower than locally.
If I try to insert 500 records in the remote database using SQL Server Management Studio the operation is completed in less than 1 second.
Is the problem on my client application?
Here the insert function:
public void SetDemoDimCustomer()
{
    DWContext dc = null;
    try
    {
        dc = DWContext.Create(SqlServerInstance, DbName);
        dc.Configuration.AutoDetectChangesEnabled = false;
        dc.Database.ExecuteSqlCommand("DELETE FROM DimCustomer");
        dc.Database.ExecuteSqlCommand("DBCC CHECKIDENT ('DimCustomer', RESEED, 0)");
        DimCustomer objCustomer;
        List<DimCustomer> lstDemoCustomers = new List<DimCustomer>();
        int length = 100;
        for (int i = 0; i < length; i++)
        {
            objCustomer = new DimCustomer();
            objCustomer.Name = "Customer " + (i + 1);
            objCustomer.CustomerBKey = (i + 1).ToString();
            lstDemoCustomers.Add(objCustomer);
        }
        dc.DimCustomer.AddRange(lstDemoCustomers);
        dc.SaveChanges();
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        if (dc != null)
        {
            dc.Dispose();
        }
    }
}
I tried using LINQ to SQL instead of EF6, but the result is the same, so maybe it is not an EF6-specific problem.
Some info about the remote system:
OS: Windows Server 2012
RDBMS: SQL Server 2014 Express
Thanks in advance.
UPDATE AFTER SOME TESTS WITH BULKINSERT
OK, here are the results of my first tests with BulkInsert:
100 records -> EF6 AddRange: 9 sec. / EF6 BulkInsert: 1 sec.
1000 records -> EF6 AddRange: 1:27 min. / EF6 BulkInsert: 1 sec. (wow!)
10000 records -> EF6 AddRange: 14:39 min. / EF6 BulkInsert: 4 sec. (wooooow!)
Now, of course, the EF6 BulkInsert package is part of my project.
Looks like most of the time is spent on the network, waiting for round-trips to complete. EF cannot be made to batch inserts (yet), so you cannot use EF for the inserts here.
Investigate the typical solutions to this problem (TVPs and SqlBulkCopy).
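For illustration, a minimal SqlBulkCopy sketch for the DimCustomer case in the question; the column names come from the question's code, and connectionString is assumed:

// Build an in-memory table shaped like the destination (IDENTITY key omitted).
var table = new DataTable();
table.Columns.Add("CustomerBKey", typeof(string));
table.Columns.Add("Name", typeof(string));
for (int i = 0; i < 10000; i++)
{
    table.Rows.Add((i + 1).ToString(), "Customer " + (i + 1));
}

using (var connection = new SqlConnection(connectionString)) // assumed connection string
using (var bcp = new SqlBulkCopy(connection))
{
    connection.Open();
    bcp.DestinationTableName = "DimCustomer";
    bcp.ColumnMappings.Add("CustomerBKey", "CustomerBKey"); // map by name so the
    bcp.ColumnMappings.Add("Name", "Name");                 // IDENTITY column is untouched
    bcp.WriteToServer(table); // one bulk operation instead of thousands of round-trips
}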
The dispose pattern you are using is not a good choice. Just wrap dc in `using` and delete all the exception handling.
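In sketch form (the body stays the same as in the question):

using (var dc = DWContext.Create(SqlServerInstance, DbName))
{
    dc.Configuration.AutoDetectChangesEnabled = false;
    // ... same work as above; Dispose is called automatically on exit,
    // and exceptions propagate without an explicit catch-and-rethrow.
}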
As suggested, SqlBulkCopy is your best bet; however, there is an interesting NuGet package which does bulk inserts for Entity Framework:
https://efbulkinsert.codeplex.com/

SQL Server 2014 Query Notifications in C++ with ADO.NET libraries

I have a C++ application that has to react to status changes that are kept in an SQL database. It has to react in real time, and currently this is done by polling. However, the real-time response I need is overloading the database.
I have read about Query Notifications, introduced in SQL Server 2005, but in truth I don't really understand the different SQL connection methods available. Currently I use ADO.NET to facilitate database communication, but the only sample code I have found for doing Query Notifications of this nature is in C# and VB, not to mention that I am struggling to understand the concept. Here is a sample of how I currently run SQL with ADO.NET.
hr = link.CreateInstance(__uuidof(Connection));
if (link) { // Execute the query on the open connection using the existing command object
    pCommand.CreateInstance(__uuidof(Command));
    pCommand->ActiveConnection = link;
    pCommand->CommandType = ado20CommandType;
    pCommand->CommandText = strQuery.GetBuffer(0);
    pCommand->PutPrepared(true);
    pCommand->NamedParameters = true;
    if (pRecordSet = pCommand->Execute(&vRecordsAffected, NULL, NULL)) { // Execute the final query and set object-level recordset object
        lRecordsAffected = vRecordsAffected.lVal;
        _RecordsetPtr pfuRecordSet = link->Execute(_T("SELECT @@IDENTITY as pmIDENT;"), NULL, adCmdText);
        CString strID = pfuRecordSet->GetFields()->GetItem(COleVariant("pmIDENT"))->Value.bstrVal;
        int nID = atoi(strID);
        if (nID > 0) {
            nLastInsertedID = nID;
        }
        return true;
    }
}
I was wondering if someone could provide me with resources and/or sample code that would help me set this up in C++? Thanks in advance.
The fastest way to get data out of a SQL Server database is to use a SqlDataReader, which is forward-only and read-only.
Take a look at this C++ code and give that a try.
http://tinyurl.com/m6u9jh7
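For reference, the C# shape of a Query Notification uses SqlDependency; the same Service Broker mechanism sits underneath any C++ implementation, and the table and column names below are hypothetical:

// Service Broker must be enabled on the database for notifications to work.
SqlDependency.Start(connectionString);

using (var connection = new SqlConnection(connectionString))
// Notifiable queries need an explicit column list and two-part table names.
using (var command = new SqlCommand("SELECT StatusId, StatusValue FROM dbo.Statuses", connection))
{
    var dependency = new SqlDependency(command);
    dependency.OnChange += (sender, e) =>
    {
        // Fires once when the result set changes; re-run the query to re-subscribe.
        Console.WriteLine("Status data changed: " + e.Info);
    };

    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read()) { /* consume the current state */ }
    }
}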

How to store and retrieve .pdf and .exe files in SQL Server 2008

I am currently working on a Windows Forms application (C#, VS 2010) and I need to create functionality that enables users to upload .pdf and .exe files into a SQL Server 2008 database and download them back.
The problem I have is that a file downloaded from the database is always corrupted (except for .txt files), even though it is the same size. I used varbinary(MAX) as the column type to store the data in the database.
Can anyone show me some example code for how to do this?
PS: I have researched this for more than a week but still cannot find a solution to my problem. Can anyone please help? Any answer will be highly appreciated.
In the below example there are a few assumptions made:
I'm using Dapper for the database access. It extends any IDbConnection object.
I have a table in the database that has a definition as such CREATE TABLE Data (Id INT IDENTITY(1, 1) PRIMARY KEY, Data VARBINARY(MAX)).
Here is the documentation for ReadAllBytes
So, obviously, since you didn't provide anything about your table structure, you're going to have to change this code to meet your needs, but it will get you started.
Writing It
this.connection.Open();
try
{
    var parameters = new
    {
        Data = File.ReadAllBytes(...) // path elided in the original
    };
    return connection.Execute("INSERT INTO Data (Data) VALUES (@Data)", parameters);
}
finally
{
    this.connection.Close();
}
Reading It
this.connection.Open();
try
{
    var parameters = new { Id = 1 };
    return connection.Query(
            "SELECT Data FROM dbo.Data WHERE Id = @Id", parameters)
        .Select(q => (byte[])q.Data)
        .Single();
}
finally
{
    this.connection.Close();
}
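To verify the round trip, write the bytes straight back to disk and compare; in this sketch GetDataById is a hypothetical wrapper around the "Reading It" query, and the path is a placeholder:

byte[] bytes = GetDataById(1);                        // hypothetical wrapper around the query above
File.WriteAllBytes(@"C:\temp\roundtrip.pdf", bytes);  // placeholder path
// If the written file is corrupted but the same size, the bytes were altered
// on the way into the database (e.g. via a string conversion), not on the way out.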

SQL SMO To Execute Batch TSQL Script

I'm using SMO to execute a batch SQL script. In Management Studio, the script executes in about 2 seconds. With the following code, it takes about 15 seconds.
var connectionString = GetConnectionString();

// Need to use master because the DB in the connection string no longer
// exists (we dropped it already).
var builder = new SqlConnectionStringBuilder(connectionString)
{
    InitialCatalog = "master"
};

using (var sqlConnection = new SqlConnection(builder.ToString()))
{
    var serverConnection = new ServerConnection(sqlConnection);
    var server = new Server(serverConnection);
    // Hangs here for about 12-15 seconds.
    server.ConnectionContext.ExecuteNonQuery(sql);
}
The script creates a new database and inserts a few thousand rows across a few tables. The resulting DB size is about 5MB.
Anyone have any experience with this or have a suggestion on why this might be running so slowly with SMO?
SMO does lots of weird stuff in the background, which is the price you pay for the ability to treat server/database objects in an object-oriented way.
Since you're not using the OO capabilities of SMO, why don't you just ignore SMO completely and simply run the script through normal ADO.NET?
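A minimal sketch of that: plain ADO.NET does not understand the GO batch separator, so split the script on it first (the regex assumes GO sits on its own line, as Management Studio scripts it):

// Split the script into batches on lines containing only "GO".
string[] batches = Regex.Split(
    sql, @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase);

using (var connection = new SqlConnection(builder.ToString()))
{
    connection.Open();
    foreach (string batch in batches)
    {
        if (string.IsNullOrWhiteSpace(batch))
            continue; // skip empty fragments between consecutive separators
        using (var command = new SqlCommand(batch, connection))
        {
            command.ExecuteNonQuery();
        }
    }
}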
The best and fastest way to upload records into a database is with SqlBulkCopy. Particularly when your scripts are around 1,000 records or more, this will make a significant speed improvement.
You will need to do a little work to get your data into a DataSet, but this can easily be done using the DataSet XML functions.
