Connecting to a SQLite db with SqlBulkCopy - C#

I'm writing a unit test in C# for a function that uses System.Data.SqlClient.SqlBulkCopy to copy a DataTable to a database server.
I use SQLite for unit tests, and wanted to connect to my SQLite in-memory database with SqlBulkCopy and then bulk copy the test data into the SQLite db.
However, I can't seem to get the connection string right.
I originally tried
var bcp = new SqlBulkCopy("FullUri=file::memory:?cache=shared")
Then
var bcp = new SqlBulkCopy("Data Source=:memory:;Cache=Shared")
which didn't recognize the Cache keyword.
So then I tried
var bcp = new SqlBulkCopy("Data Source=:memory:")
out of desperation, which simply timed out when attempting to connect to the database.
Is what I'm trying to accomplish here possible? If it is, can someone please help me with the connection string?

The answer to this was that you cannot connect SqlBulkCopy to a SQLite instance.
What I did to solve my problem (unit testing a part of the code that used SqlBulkCopy) was to create a wrapper around SqlBulkCopy that is implemented with SqlBulkCopy in production code and with a mock bulk copy in test code, effectively decoupling the dependency on SqlBulkCopy itself.
Specifically, I created
public interface IBulkCopy : IDisposable {
    string DestinationTableName { get; set; }
    void CreateColumnMapping(string from, string to);
    Task WriteToServerAsync(DataTable dt);
}
Then, I implemented this as
public class SQLBulkCopy : IBulkCopy {
    private SqlBulkCopy _sbc;

    public string DestinationTableName {
        get { return _sbc.DestinationTableName; }
        set { _sbc.DestinationTableName = value; }
    }

    public SQLBulkCopy(IDBContext ctx) {
        _sbc = new SqlBulkCopy((SqlConnection)ctx.GetConnection());
    }

    public void CreateColumnMapping(string from, string to) {
        _sbc.ColumnMappings.Add(new SqlBulkCopyColumnMapping(from, to));
    }

    public Task WriteToServerAsync(DataTable dt) {
        return _sbc.WriteToServerAsync(dt);
    }

    // IBulkCopy extends IDisposable, so the wrapper must dispose the inner copy
    public void Dispose() {
        ((IDisposable)_sbc).Dispose();
    }
}
And in my test utilities I mocked out "bulk copy" with just inserts:
class MockBulkCopy : IBulkCopy {
    private IDBContext _context;

    public MockBulkCopy(IDBContext context) {
        _context = context;
    }

    public string DestinationTableName { get; set; }

    public void CreateColumnMapping(string fromName, string toName) {
        // We don't need a column mapping for raw SQL INSERT statements.
        return;
    }

    public virtual Task WriteToServerAsync(DataTable dt) {
        return Task.Run(() => {
            using (var cn = _context.GetConnection()) {
                using (var cmd = cn.CreateCommand()) {
                    cmd.CommandText = $"INSERT INTO {DestinationTableName}({GetCsvColumnList(dt)}) VALUES {GetCsvValueList(dt)}";
                    cmd.ExecuteNonQuery();
                }
            }
        });
    }

    public void Dispose() { }
}
Here, GetCsvColumnList and GetCsvValueList are helper functions I implemented.
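For reference, a minimal sketch of what such helpers might look like (illustrative only, not the original implementations; this naive version quotes every value as a string, which is fine for test data but a real version should handle types and escaping):

// Hypothetical helpers; requires using System.Linq;
private static string GetCsvColumnList(DataTable dt) {
    // builds "Col1, Col2, Col3" from the DataTable's schema
    return string.Join(", ", dt.Columns.Cast<DataColumn>().Select(c => c.ColumnName));
}

private static string GetCsvValueList(DataTable dt) {
    // builds "('a', 'b'), ('c', 'd')" from the rows, one parenthesized tuple per row
    return string.Join(", ", dt.Rows.Cast<DataRow>()
        .Select(r => "(" + string.Join(", ", r.ItemArray.Select(v => $"'{v}'")) + ")"));
}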

You cannot use SqlBulkCopy with SQLite. SqlBulkCopy is built specifically for SQL Server.
Normally, the trick to dramatically improve insert performance in SQLite is to make sure a transaction is used.
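A minimal sketch of that approach (assuming a System.Data.SQLite connection named connection, a DataTable dt, and an example table MyTable with a Name column):

// Wrap all inserts in one transaction so SQLite syncs to disk once, not per row.
using (var tx = connection.BeginTransaction())
using (var cmd = connection.CreateCommand()) {
    cmd.Transaction = tx;
    cmd.CommandText = "INSERT INTO MyTable (Name) VALUES (@name)";
    var p = cmd.CreateParameter();
    p.ParameterName = "@name";
    cmd.Parameters.Add(p);
    foreach (DataRow row in dt.Rows) {
        p.Value = row["Name"];
        cmd.ExecuteNonQuery(); // queued in the transaction
    }
    tx.Commit(); // everything is written together here
}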
Disclaimer: I'm the owner of .NET Bulk Operations
This library is not free but allows you to easily perform and customize all bulk operations:
Bulk Insert
Bulk Delete
Bulk Update
Bulk Merge
Example
// Easy to use
var bulk = new BulkOperation(connection);
bulk.BulkInsert(dt);
bulk.BulkUpdate(dt);
bulk.BulkDelete(dt);
bulk.BulkMerge(dt);
// Easy to customize
var bulk = new BulkOperation<Customer>(connection);
bulk.BatchSize = 1000;
bulk.ColumnInputExpression = c => new { c.Name, c.FirstName };
bulk.ColumnOutputExpression = c => c.CustomerID;
bulk.ColumnPrimaryKeyExpression = c => c.Code;
bulk.BulkMerge(customers);
EDIT: Answer comment
I want to load a data table from SQLite then "bulk copy" it in other databases
This situation is possible but requires two connections:
DbConnection sourceConnection = // connection from the source
DbConnection destinationConnection = // connection from the destination
// Fill the DataTable using the sourceConnection
dt = ...;
// BulkInsert using the destinationConnection
var bulk = new BulkOperation(destinationConnection);
bulk.BulkInsert(dt);
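For illustration, the fill step could look like this (a sketch using System.Data.SQLite and a hypothetical Customer table; the original post left this part elided):

var dt = new DataTable();
using (var adapter = new SQLiteDataAdapter("SELECT * FROM Customer", (SQLiteConnection)sourceConnection)) {
    adapter.Fill(dt); // dt now holds the source rows to bulk insert at the destination
}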

Related

Takes too long to write into an Access database using a C# WPF application

I have a WPF application written in C# in Visual Studio, and I am using an Access database.
I have a loop that has to run in a maximum of 500 ms, but it takes about 570 ms.
In my program, ~340 ms of that is deliberate wait time, and there are ~160 ms more that I can try to optimize.
After checking with a Stopwatch, I found that each write to my Access database takes about ~50 ms (I have 3 writes there), and I have no idea how to optimize my database writes.
The class that connects to and uses the database lives in an external DLL and looks like this (I also include one method that takes ~50 ms of runtime, named AddDataToLocalHeaderResult):
namespace DataBaseManager
{
    public class LocalPulserDBManager
    {
        private string localConnectionString;
        private string databaseName = $@"C:\Pulser\LocalPulserDB.mdb";
        private readonly int _30DaysBack = -30;
        private static readonly Lazy<LocalPulserDBManager> lazy =
            new Lazy<LocalPulserDBManager>(() => new LocalPulserDBManager());

        public static LocalPulserDBManager LocalPulserDBManagerInstance { get { return lazy.Value; } }

        private void CreateConnectionString()
        {
            localConnectionString = $@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source={databaseName};Persist Security Info=True";
        }

        private LocalPulserDBManager()
        {
            CreateConnectionString();
        }

        public void AddDataToLocalHeaderResult(string reportNumber, string reportDescription,
            string catalog, string workerName, int machineNumber, Calibration c, string age)
        {
            if (IsHeaderLocalDataExist(reportNumber, catalog, machineNumber, c) == false)
            {
                using (OleDbConnection openCon = new OleDbConnection(localConnectionString))
                {
                    string query = "INSERT INTO [HeaderResult] ([ReportNumber],[ReportDescription],[CatalogNumber], " +
                        "[WorkerName], [LastCalibrationDate], [NextCalibrationDate], [MachineNumber], [Age]) " +
                        "VALUES (@report, @reportDescription, @catalog, @workerName," +
                        " @LastCalibrationDate, @NextCalibrationDate, @machineNumber, @age)";
                    using (OleDbCommand command = new OleDbCommand(query))
                    {
                        command.Parameters.AddWithValue("@report", reportNumber);
                        command.Parameters.AddWithValue("@reportDescription", reportDescription);
                        command.Parameters.AddWithValue("@catalog", catalog);
                        command.Parameters.AddWithValue("@workerName", workerName);
                        command.Parameters.AddWithValue("@LastCalibrationDate", c.LastCalibrationDate);
                        command.Parameters.AddWithValue("@NextCalibrationDate", c.NextCalibrationDate);
                        command.Parameters.AddWithValue("@machineNumber", machineNumber);
                        command.Parameters.AddWithValue("@age", age);
                        command.Connection = openCon;
                        openCon.Open();
                        int recordsAffected = command.ExecuteNonQuery();
                        openCon.Close();
                    }
                }
            }
        }

        ....
        ....
        METHODS
        ....
    }
}
In my executable program I use it like this:
I have a using directive: using static DataBaseManager.LocalPulserDBManager;
and in my code I execute the method like this:
LocalPulserDBManagerInstance.AddDataToLocalHeaderResult(ReportNumber, Date_Description, CatalogNumber, WorkerName, (int)MachineNumber, calibrationForSave, AgeCells);
One of my Access database tables, and a sample row from it, looked like this: [screenshots omitted from the original post]
Is 50 ms a normal runtime in this situation?
If any information is missing, please tell me...
********************* EDIT **************************
I have changed my AddDataToLocalHeaderResult method as the first comment suggested, but I got the same result:
public void AddDataToLocalHeaderResult(string reportNumber, string reportDescription,
    string catalog, string workerName, int machineNumber, Calibration c, string age)
{
    if (IsHeaderLocalDataExist(reportNumber, catalog, machineNumber, c) == false)
    {
        using (OleDbConnection openCon = new OleDbConnection(localConnectionString))
        {
            string query = "INSERT INTO [HeaderResult] ([ReportNumber],[ReportDescription],[CatalogNumber], " +
                "[WorkerName], [LastCalibrationDate], [NextCalibrationDate], [MachineNumber], [EditTime], [Age]) " +
                "VALUES (@report, @reportDescription, @catalog, @workerName," +
                " @LastCalibrationDate, @NextCalibrationDate, @machineNumber, @edittime, @age)";
            DateTime dt = DateTime.Now;
            DateTime edittime = new DateTime(dt.Year, dt.Month, dt.Day, dt.Hour, dt.Minute, dt.Second);
            using (OleDbCommand command = new OleDbCommand(query))
            {
                command.Parameters.AddWithValue("@report", reportNumber);
                command.Parameters.AddWithValue("@reportDescription", reportDescription);
                command.Parameters.AddWithValue("@catalog", catalog);
                command.Parameters.AddWithValue("@workerName", workerName);
                command.Parameters.AddWithValue("@LastCalibrationDate", c.LastCalibrationDate);
                command.Parameters.AddWithValue("@NextCalibrationDate", c.NextCalibrationDate);
                command.Parameters.AddWithValue("@machineNumber", machineNumber);
                command.Parameters.AddWithValue("@edittime", edittime);
                command.Parameters.AddWithValue("@age", age);
                command.Connection = openCon;
                openCon.Open();
                int recordsAffected = command.ExecuteNonQuery();
                openCon.Close();
            }
        }
    }
}
Using the method you're showing here, you're adding one row at a time: a connection to the db is opened, the data is written to memory, then to the physical file (.mdb), and then the indexes are updated. So that's a full four steps per row you're trying to insert. Worse, the write to the physical file is time-consuming.
If you use a different approach, you can do these four steps (connect, write to memory, write to the data file, re-index) once for the entire set of data you're inserting. So if you're adding 1000 records, rather than 4000 steps (4 x 1000), you could reduce the processing to roughly 1003 steps: 1 connection, 1000 super-fast in-memory writes, 1 data-file write, 1 index revision.
The following code gives the rough idea of what I'm talking about:
class Program
{
    static void Main(string[] args)
    {
        // memory-only list for data loading
        List<HeaderResult> mylist = new List<HeaderResult>() {
            new HeaderResult("report1", "desc of report", "ete"),
            new HeaderResult("report2", "desc of report2", "ete2")
        };

        var tableForInsert = new DataTable();
        // (the question uses Access; the same pattern works with OleDbDataAdapter/OleDbCommandBuilder)
        using (SqlDataAdapter dataAdapter = new SqlDataAdapter("SELECT * FROM HeaderResult", "my connection string"))
        {
            // the command builder generates the INSERT command that Update() will use
            var commandBuilder = new SqlCommandBuilder(dataAdapter);
            dataAdapter.Fill(tableForInsert);
            // now I have a live copy of the table into which I want to insert data
            // blast in the data
            foreach (HeaderResult hr in mylist)
            {
                tableForInsert.Rows.Add(hr.Report, hr.ReportDescription, hr.Etc);
            }
            // now all the data is written at once, and SQL will take care of the indexes after the data's written
            dataAdapter.Update(tableForInsert);
        }
    }
}

// class should have the same fields as your table
class HeaderResult
{
    public string Report;
    public string ReportDescription;
    public string Etc;

    public HeaderResult(string rpt, string desc, string e)
    {
        Report = rpt;
        ReportDescription = desc;
        Etc = e;
    }
}

Database Sync - Local and Online without Replication

So I have this code that checks whether new data has been added to the online database by comparing the row counts of the online and local databases. If new data is found, it inserts the new data into the local database.
public class Reservation
{
public string res_no { get; set; }
public string mem_fname { get; set; }
}
My Code :
private async void updateDineList()
{
    DBconnector.OpenConnection();

    // Gets data from the online database
    HttpClient client = new HttpClient();
    var response = await client.GetStringAsync("http://example.com/Reservation/view_pending_reservation");
    var persons = JsonConvert.DeserializeObject<List<Reservation>>(response);

    // Gets data from the local database
    string string_reservation = "SELECT res_no,mem_fname FROM res_no WHERE res_status='pending';";
    DataTable reservation_table = new DataTable();
    MySqlDataAdapter adapter_reservation = new MySqlDataAdapter(string_reservation, DBconnector.Connection);
    adapter_reservation.Fill(reservation_table);

    // Gets the row count of each table
    int local = reservation_table.Rows.Count;
    int online = persons.Count;

    // Compares the row counts of the online and local databases
    if (local < online)
    {
        // if the online database has more rows than the local database,
        // insert the new data into the local database
        string Command_membership = "INSERT INTO reservation_details (res_no,mem_fname) VALUES (@res_no, @mem_fname);";
        for (int i = local; i < online; i++)
        {
            // inserts new data from the online to the local database
            using (MySqlCommand myCmd = new MySqlCommand(Command_membership, DBconnector.Connection))
            {
                myCmd.CommandType = CommandType.Text;
                myCmd.Parameters.AddWithValue("@res_no", persons[i].res_no);
                myCmd.Parameters.AddWithValue("@mem_fname", persons[i].mem_fname);
                myCmd.ExecuteNonQuery();
            }
        }
        MessageBox.Show("New Records Found");
    }
    else
    {
        MessageBox.Show("No new Records");
    }
    DBconnector.Connection.Close();
}
So my question is: could any problems occur with this code? It works fine, but is there any way to improve it? I know MySQL replication would be better, but I am only using free web hosting with few MySQL privileges.
The clear improvement is not to create a new command for every row. You should either create the command and its parameters once and then set the parameter values and execute it for each row, or better still, package the whole set of updates into a single structure, like an XML string, and pass the lot to the database via a stored procedure call.
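A sketch of the first option, reusing the loop variables from the question (assuming MySql.Data and that DBconnector.Connection is open):

string sql = "INSERT INTO reservation_details (res_no, mem_fname) VALUES (@res_no, @mem_fname);";
using (var myCmd = new MySqlCommand(sql, DBconnector.Connection))
{
    // create the command and parameters once...
    myCmd.Parameters.Add("@res_no", MySqlDbType.VarChar);
    myCmd.Parameters.Add("@mem_fname", MySqlDbType.VarChar);
    for (int i = local; i < online; i++)
    {
        // ...then only set values inside the loop
        myCmd.Parameters["@res_no"].Value = persons[i].res_no;
        myCmd.Parameters["@mem_fname"].Value = persons[i].mem_fname;
        myCmd.ExecuteNonQuery();
    }
}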
The other potentially problematic issue is that you are checking purely based on row counts. I don't know if that is valid in your scenario, but it sounds dangerous: what if rows are deleted, or is that not possible in your case? Some other way of checking for updates, such as a last-modified timestamp, would probably be preferable.
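For example, a sketch of a timestamp-based check (the last_updated column is hypothetical; the question's schema does not show one):

// Find the newest local row, then ask the server only for rows newer than it.
string watermarkSql = "SELECT MAX(last_updated) FROM reservation_details;";
using (var cmd = new MySqlCommand(watermarkSql, DBconnector.Connection))
{
    var localWatermark = cmd.ExecuteScalar() as DateTime?;
    // fetch only online rows with last_updated > localWatermark, then insert those
}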
Without more context that's about all I can see.

How to delete a table in Cassandra using the C# DataStax driver?

var keyspace = "mydb";
var datacentersReplicationFactors = new Dictionary<string, int>(0);
var replication = ReplicationStrategies.CreateNetworkTopologyStrategyReplicationProperty(datacentersReplicationFactors);

using (var cluster = Cluster.Builder().AddContactPoints("my_ip").Build())
using (var session = cluster.Connect())
{
    session.CreateKeyspaceIfNotExists(keyspace, replication, true);
    session.ChangeKeyspace(keyspace);

    var entityTable = new Table<Models.Entity>(session);
    var attributeTable = new Table<Models.Attribute>(session);

    entityTable.CreateIfNotExists();    // Worked
    attributeTable.CreateIfNotExists(); // Worked

    entityTable.Delete();    // Does nothing
    attributeTable.Delete(); // Does nothing
}
EDIT: The raw query session.Execute("DROP TABLE entities;") works fine; I'm looking for a way to do it without raw queries.
The Delete() method is not intended for dropping tables. It returns a representation of a DELETE CQL statement; if you call it, you just get {DELETE FROM entities}.
If you need to drop a table, the easiest way is just to execute a DROP statement:
session.Execute("DROP TABLE entities;");
Unless there is already a method for dropping tables that I'm not aware of, you can use these extensions:
public static class DastaxTableExtensions
{
    public static void Drop<T>(this Table<T> table)
    {
        table.GetSession().Execute($"DROP TABLE {table.Name};");
    }

    public static void DropIfExists<T>(this Table<T> table)
    {
        table.GetSession().Execute($"DROP TABLE IF EXISTS {table.Name};");
    }
}
Then you can use it like this
entityTable.Drop();
attributeTable.DropIfExists();

How do I refactor this C# SQL query to do a unit test?

I have this code that queries a database. I want to move the actual database code into a separate class so I can reuse it in other places. That would leave just the read of the PassResult value, so I could write a unit test for the code without the SQL code running. I am having trouble finding references on how to make this kind of code unit testable. Could someone help out?
using System;
using System.Data;
using System.Data.SqlClient;

namespace CS_UI_Final_Inspection
{
    public class CalibrationTestCheck
    {
        // declare the variables
        private bool _calibrationTestPass = false;
        private string _connectionString = string.Empty;

        public bool CheckCalibrationTestResults(string serialNumber, IDeviceInfo deviceInfo, string mapID)
        {
            // get the database location
            DhrLocationPull dhrLocation = new DhrLocationPull();
            _connectionString = dhrLocation.PullDhrLocation();

            // build the query
            SqlConnection calibrationCheckConnection = new SqlConnection(_connectionString);
            SqlCommand calibrationCheckCommand = new SqlCommand("[MfgFloor].[GetLatestTestResultsForDeviceByTestType]",
                calibrationCheckConnection);

            // build the stored proc
            calibrationCheckCommand.CommandType = CommandType.StoredProcedure;
            calibrationCheckCommand.Parameters.Add(new SqlParameter("@SerialNumber", serialNumber));
            calibrationCheckCommand.Parameters.Add(new SqlParameter("@DeviceTypeID", mapID));
            calibrationCheckCommand.Parameters.Add(new SqlParameter("@TestDataMapTypeID", "C"));
            calibrationCheckCommand.Connection.Open();
            SqlDataReader calibrationCheckReader = calibrationCheckCommand.ExecuteReader();

            // is there data?
            if (calibrationCheckReader.HasRows)
            {
                // read the data
                calibrationCheckReader.Read();
                try
                {
                    _calibrationTestPass = (bool)calibrationCheckReader["PassResult"];
                }
                catch (InvalidOperationException)
                {
                    // means the last element was not filled in
                }
                finally
                {
                    // close refs
                    calibrationCheckReader.Close();
                    calibrationCheckCommand.Connection.Close();
                    calibrationCheckConnection.Close();
                    calibrationCheckReader.Dispose();
                    calibrationCheckCommand.Dispose();
                    calibrationCheckConnection.Dispose();
                }
            }
            return _calibrationTestPass;
        }
    }
}
Create an interface and implement it.
Move all references to be tested to use the interface (exposing any methods/properties required through the interface).
Have the constructor or method being tested take the interface as a parameter. A minimal sketch of this pattern is shown below.
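For illustration only (the type and member names here are hypothetical, not taken from the original code):

// The seam: the query logic is hidden behind an interface.
public interface ICalibrationResultReader
{
    bool ReadPassResult(string serialNumber, string mapID);
}

// The production class would implement this with the SqlCommand code above.
// In tests, a hand-rolled fake (or a mocking framework) stands in:
public class FakeCalibrationResultReader : ICalibrationResultReader
{
    public bool ResultToReturn { get; set; }
    public bool ReadPassResult(string serialNumber, string mapID) => ResultToReturn;
}

public class CalibrationTestCheck
{
    private readonly ICalibrationResultReader _reader;

    // the dependency is injected, so unit tests never touch SQL
    public CalibrationTestCheck(ICalibrationResultReader reader)
    {
        _reader = reader;
    }

    public bool CheckCalibrationTestResults(string serialNumber, string mapID)
    {
        return _reader.ReadPassResult(serialNumber, mapID);
    }
}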
Roy Osherove is a good resource on this; he wrote a great book called "The Art of Unit Testing". His website can be found here: http://osherove.com/

Reading and writing data into SQL Server simultaneously

I have a service that continuously writes data into a SQL database on a separate thread. Now, from the same service, if I try to read from the same table while I'm already writing to it, I get this exception: "There is already an open DataReader associated with this Command which must be closed first."
So can anyone help me do this simultaneously?
Here's my code for reading data:
public Collection<string[]> ReadData(string query)
{
    _result = new Collection<string[]>();
    string[] tempResult;
    SqlDataReader _readerRead;
    using (_command = new SqlCommand(query, _readConnection))
    {
        _readerRead = _command.ExecuteReader();
        while (_readerRead.Read())
        {
            tempResult = new string[4];
            tempResult[0] = _readerRead[0].ToString();
            tempResult[1] = _readerRead[1].ToString();
            tempResult[2] = _readerRead[2].ToString();
            tempResult[3] = _readerRead[3].ToString();
            _result.Add(tempResult);
            //Console.WriteLine("Name : {0} Type : {1} Value : {2} timestamp : {3}", _readerRead[0], _readerRead[1], _readerRead[2], _readerRead[3]);
        }
        if (_readerRead != null)
        {
            _readerRead.Close();
        }
        _readConnection.Close();
        return _result;
    }
}
And here is my code for writing data:
public void WriteData(Collection<TagInfo> tagInfoList)
{
    int i = 0;
    for (i = 0; i < tagInfoList.Count; i++)
    {
        using (_command = new SqlCommand(insert statement here))
        {
            _command.Parameters.AddWithValue("Name", tagInfoList[i].Name);
            _command.Parameters.AddWithValue("Type", tagInfoList[i].TagType);
            _command.Parameters.AddWithValue("Value", tagInfoList[i].Value);
            _reader = _command.ExecuteReader();
            if (_reader != null)
            {
                _reader.Close();
            }
        }
    }
}
You need a different SqlConnection to the database for your writer. You cannot use the same db connection for both.
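A minimal sketch of that setup (the connectionString variable is illustrative):

// Two independent connections: one reserved for the reader, one for the writer,
// so an open SqlDataReader on one never blocks commands on the other.
using (var readConnection = new SqlConnection(connectionString))
using (var writeConnection = new SqlConnection(connectionString))
{
    readConnection.Open();
    writeConnection.Open();
    // pass readConnection to ReadData and writeConnection to WriteData
}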
Although it's possible to do with a separate connection, I would question why you need to do this.
If you are reading and writing to one table in the same service, you will be placing unnecessary load on one SQL table, and depending on the number of queries you intend to make, this could cause you problems. If you already have this data (in a different thread), why not marshal the data from the background thread to where you need it as you write it into the database, so you don't need to read the data back at all?
However, it is difficult to give a fair answer without seeing the code/what you are looking to achieve.
