How to run non-persistent queries in Visual Studio - C#

When using the Query Design feature in Visual Studio, any queries that I run on a SQL or Microsoft Access database while testing are persistent, meaning they actually change the data in the table(s). Is there a way to make the queries non-persistent while testing them, until a program is run? I'm using C# as the programming language and .NET as the framework, if it matters. I also need to know the process for doing this with either an MS Access or SQL database.

You can use transactions in C# much like you would in SQL. Here is an example:
connection.Open();
SqlCommand command = connection.CreateCommand();
SqlTransaction transaction;

// Start a local transaction.
transaction = connection.BeginTransaction("SampleTransaction");

// Enlist the command in the transaction.
command.Transaction = transaction;

// Execute your query here, e.g.:
// command.CommandText = "UPDATE ...";
// command.ExecuteNonQuery();

// Check whether this is the test environment
// (SomeConfigFile.property stands in for however you read your configuration).
bool testEnvironment = SomeConfigFile.property("testEnvironment");
if (!testEnvironment)
{
    transaction.Commit();
}
else
{
    transaction.Rollback();
}
Here is the documentation on transactions in C#: https://msdn.microsoft.com/en-us/library/86773566%28v=vs.110%29.aspx

It should be possible for Visual Studio to create a local copy of the SQL data you're working on while you're testing. This copy is held in the bin folder. Have a look at this:
https://msdn.microsoft.com/en-us/library/ms246989.aspx
Once you're finished testing, you can simply point it back at the database you want your application to alter.

I'm not aware of a way to get exactly what you're asking for, but I think there is an approach to get close to the behaviour you want:
When using Microsoft SQL Server, creating a table with a leading hash in the name (#tableName) will cause the table to be disposed of when your session ends.
One way you could take advantage of this to get your desired behaviour is to copy your working table into a temporary table, and work on the temporary table instead of the live table.
To do so, use something like the following:
SELECT * INTO #tempTable FROM liveTable
This will create a complete copy of your liveTable, with all of the same columns and rows. Once you are finished, the table will be automatically dropped and no permanent changes will have been made.
This can also be useful for a series of queries which you execute on the same subset of a large data set. Selecting the subset of data into a smaller temporary table can make subsequent queries much faster than if you had to select from the full data set repeatedly.
Just keep in mind that as soon as your connection closes, all the data goes with it.
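For example, in C# everything has to happen on the same open connection, since the temporary table only lives for that session (a rough sketch; the connection string, column name and table names are illustrative):
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Copy the live table into a session-scoped temporary table.
    using (SqlCommand copy = new SqlCommand(
        "SELECT * INTO #tempTable FROM liveTable", connection))
    {
        copy.ExecuteNonQuery();
    }

    // Run your test queries against #tempTable on the same connection.
    using (SqlCommand test = new SqlCommand(
        "UPDATE #tempTable SET SomeColumn = 'test value'", connection))
    {
        test.ExecuteNonQuery();
    }
} // The connection closes here and #tempTable is dropped automatically.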

Related

INSERT with Dapper takes too long in SQL Azure

I've got an instance of an entity with a one-to-many relationship (Items). I'm attempting to insert this into a SQL Azure instance via Dapper.
The item count is ~3.5k
using (var transaction = connection.BeginTransaction())
{
    // (...) x inserted here, we want to insert the items next...
    connection.Execute(TSQL_InsertStatement, x.Items, transaction);
    transaction.Commit();
}
Locally (vs a .\SQLEXPRESS), the execution takes ~300-500[ms]
In Azure, the execution takes ~8[s]
I believe this action generates a ton of individual INSERTs, and that might be the problem. I've tried refactoring this into a "manually-generated-SQL" solution, which produced 4 INSERTs (1k, 1k, 1k and 500 rows), but executing that took over 2[s] locally.
Is there a way to have this run faster with Dapper? Like a bulk insert maybe?
Is there something I'm not aware of with SQL Azure? Like some hidden throttling?
Using multiple individual inserts within the same transaction is generally a bad idea in Azure SQL Database.
You should use batching to increase the performance of insert queries; see https://azure.microsoft.com/en-us/documentation/articles/sql-database-use-batching-to-improve-performance/ .
If possible, I would recommend table-valued parameters, bulk insert, or multiple parameterized inserts; these scenarios are described in the referenced article.
Azure SQL Database also has JSON support, so if your original source is an array of JSON objects you could push the entire JSON as text and parse it on the SQL side using OPENJSON (though there is still no evidence that this is faster than the other methods).
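As a rough illustration of the bulk-insert option (the Items column names and the destination table here are assumptions, not taken from the question), you could load the items into a DataTable and push them with SqlBulkCopy inside the same transaction instead of issuing per-row INSERTs:
// Sketch only: assumes each item exposes Id and Name, mapping to columns of dbo.Items.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));

foreach (var item in x.Items)
{
    table.Rows.Add(item.Id, item.Name);
}

using (var bulk = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
{
    bulk.DestinationTableName = "dbo.Items";
    bulk.BatchSize = 1000;          // send rows in batches rather than one round trip per row
    bulk.WriteToServer(table);
}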

How to properly use DataAdapter

I've used ADO for a long time, but am a relative noob to ADO.NET, and am trying to figure out the best way to build a data service for my application.
When I want to query data, I can successfully build a SQL statement, instantiate a DataAdapter, fill a DataTable and return it. So far, so good.
The trouble is, my service needs to be able to return data from several tables (not at the same time). So I build a SQL statement and send it to the method which does the above steps, and everything works for each table I need data from.
In the common method, I instantiate the DataAdapter each time. I'm not bothering to set all commands, just the select command.
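A stripped-down sketch of that kind of common method (a simplified illustration, not my exact code; the connection string is assumed to come from configuration):
private DataTable GetTable(string selectSql)
{
    var table = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter(selectSql, connection))
    {
        // Fill opens and closes the connection itself; only the SelectCommand is set.
        adapter.Fill(table);
    }
    return table;
}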
Now I'm wondering if this should be more extensible. Should I load the DataAdapter with all the commands (select, update, delete, insert) and keep it alive for as long as the service is alive? This would probably require 4 different adapters, one for each table I need to interact with.
This is for a single-user database, so there will be no worries of conflicts between users. Frankly, I could just execute SQL statements against the connection for adds, updates and deletes. I just want to get an idea of the best practice for doing this.
Thanks...

Best way to pass a SQL query other than a stored procedure

I am working on a C# console application which runs two SQL queries which are passed as strings as follows
private DataTable GetData(string resultsQuery) {
    SqlCommand sqlCmd = new SqlCommand(resultsQuery, sqlcon);
    DataTable dtlist = new DataTable();
    SqlDataAdapter dalist = new SqlDataAdapter();
    dalist.SelectCommand = sqlCmd;
    dalist.Fill(dtlist);
    sqlcon.Close();
    return dtlist;
}
But the thing is, these queries change very frequently, and every time they change I have to re-build, re-publish, and uninstall the older application before installing the updated one, which I think is bad practice. The reason I cannot use a stored procedure is that I only have read access to the database, so I cannot create one.
Can anyone suggest me a better way and best practice to deal with this?
Essentially, your query is part of your program's configuration, which is currently hard-coded. Therefore, the solution you need has little to do with accessing databases: you need a way to upgrade the configuration settings of an installed application.
Although having a stored procedure would be a fine choice, there are other ways of achieving the effect that you are looking for:
One approach would be to configure a separate database to which you have a write access, and use it as your source of query strings. Make a table that "maps" query names to query content:
QueryKey Query
-------- --------------------------------------
query1 SELECT A, B, C FROM MyTable1 WHERE ...
queryX SELECT X, Y, Z FROM MyTable2 WHERE ...
Your program can read this other DB at startup, and store queries for use at runtime. When end-users request their data, your program would execute the query it got from the configuration database against your read-only database.
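As an illustrative sketch, reading a query from that configuration table at startup could look like this (the QueryMap table name and the connection string are placeholders):
// Look up the SQL text for a named query in the configuration database.
private static string GetQueryText(string queryKey)
{
    using (var connection = new SqlConnection(configDbConnectionString))
    using (var command = new SqlCommand(
        "SELECT Query FROM QueryMap WHERE QueryKey = @key", connection))
    {
        command.Parameters.AddWithValue("@key", queryKey);
        connection.Open();
        return (string)command.ExecuteScalar();
    }
}

// Usage: run the configured query against the read-only database.
// string resultsQuery = GetQueryText("query1");
// DataTable results = GetData(resultsQuery);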
You can take other approaches to distributing this piece of configuration. Alternatives include storing the strings in a shared folder on a file server to which your application has visibility, setting up a network service of your own to feed your application its queries at start-up, or using built-in means of configuration available in .NET. The last approach requires you to change the settings on individual machines one-by-one, which may not be an ideal roll-out scenario.
The idea of using an external XML file was mentioned, but because this is .NET, that is exactly what App.config files are designed for. There is a whole class library designed to let you store information in the App.config file so it can be modified without recompiling the program. The big caveat is that if the query returns a result set with an entirely different shape, this will not help much; but if all you are changing is the WHERE condition or something like that, and not the SELECT list, then this approach will work fine.
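As a minimal sketch of the App.config route (the setting name here is just an example), you put the query text in appSettings and read it with ConfigurationManager, so changing the query only requires editing the deployed .config file:
// App.config (illustrative):
//   <configuration>
//     <appSettings>
//       <add key="ResultsQuery" value="SELECT A, B, C FROM MyTable1 WHERE ..." />
//     </appSettings>
//   </configuration>

using System.Configuration;   // add a reference to System.Configuration

string resultsQuery = ConfigurationManager.AppSettings["ResultsQuery"];
DataTable results = GetData(resultsQuery);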

insert directly or via a stored procedure

I am using SQL Server and WinForms for my application. Data would be inserted into the database tables every minute by pressing a button on a form.
For this, I am using an INSERT query.
But if I create a stored procedure and include the same insert query in it, would it be more efficient? What would the difference be?
Using stored procedures is more secure
A stored procedure would generally be quicker as the query plan is stored and does not need to be created for each call. If this is a simple insert the difference would be minimal.
A stored procedure can be run with execute permissions which is more secure than giving insert permissions to the user.
It depends on what you mean by 'efficient'.
Execution time - if you're only saving to the database every couple of seconds, then any speed difference between SPs and INSERT is most likely insignificant. If the volume is especially high, you would probably set up something like a command queue on the server before fine-tuning at this level.
Development time
using INSERT means you can write your SQL directly in your codebase (in a repository or similar). I've seen that described as poor design, but I think that as long as you have integration tests around the query there's no real problem
Stored Procedures can be more difficult to maintain - you need to have a plan to deploy the new SP to the database. Benefits are that you can implement finer-grained security on the database itself (as #b-rain and #mark_s have said) and it is easy to decide between INSERT and UPDATE within the SP, whereas to do the same in code means making certain assumptions.
Personally (at the moment) I use inline SQL for querying and deleting, and stored procedures for inserting. I have a script and a set of migration files that I can run against the production database to deploy table and SP changes, which seems to work pretty well. I also have integration tests around both the inline SQL and the SP calls. If you go for inline SQL, you definitely should use parameterised queries; they help against SQL injection attacks and are also easier to read and program.
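For the inline SQL route, a minimal parameterised insert sketch (the table and column names are only an example) would be:
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "INSERT INTO Readings (RecordedAt, Value) VALUES (@recordedAt, @value)", connection))
{
    // Parameters keep the values out of the SQL text, guarding against injection.
    command.Parameters.AddWithValue("@recordedAt", DateTime.UtcNow);
    command.Parameters.AddWithValue("@value", 42.0);

    connection.Open();
    command.ExecuteNonQuery();
}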
If your DBA is even allowing you to do this without a stored procedure I'd be very suspicious...

Copy from one database table to another C#

Using C# (vs2005) I need to copy a table from one database to another. Both database engines are SQL Server 2005. For the remote database, the source, I only have execute access to a stored procedure to get the data I need to bring locally.
I have more control over the local database, as it's used by the [asp.net] application which needs a local copy of this remote table. We would like it local for easier lookups and joins with other tables, etc.
Could you please explain to me an efficient method of copying this data to our local database.
The local table can be created with the same schema as the remote one, if it makes things simpler. The remote table has 9 columns, none of which are identity columns. There are approximately 5400 rows in the remote table, and this number grows by about 200 a year. So not a quickly changing table.
Perhaps SqlBulkCopy; use SqlCommand.ExecuteReader to get the reader that you use in the call to SqlBulkCopy.WriteToServer. This is the same as a bulk insert, so very quick. It should look something like this (untested):
using (SqlConnection connSource = new SqlConnection(csSource))
using (SqlCommand cmd = connSource.CreateCommand())
using (SqlBulkCopy bcp = new SqlBulkCopy(csDest))
{
    bcp.DestinationTableName = "SomeTable";

    // Call the remote stored procedure and stream its results straight into the bulk copy.
    cmd.CommandText = "myproc";
    cmd.CommandType = CommandType.StoredProcedure;
    connSource.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        bcp.WriteToServer(reader);
    }
}
The Bulk Copy feature of ADO.NET might help you; take a look at these:
MSDN - Multiple Bulk Copy Operations (ADO.NET)
An example article
I would first look at using SQL Server Integration Services (SSIS, née Data Transformation Services (DTS)).
It is designed for moving/comparing/processing/transforming data between databases, and IIRC allows an arbitrary expression for the source. You would need it installed on your database server (shouldn't be a problem; it is part of a default install).
Otherwise, for a code solution, given the small data size: pull all the data from the remote system into an in-memory structure, then look for rows which don't exist locally and insert those.
You may not be able to avoid this, but if at all possible, DON'T do it with a program. If you have any way of talking to someone who controls the source server, see if they will set up some sort of export of the data. If the data is as small as you say, then XML or CSV output would be 100x better than writing something in C# (or any language).
So let's assume they can't export; still, avoid writing a program. You say you have more control over the destination. Can you set up an SSIS package, or set up a linked server? If so, you'll have a much easier time migrating the data.
If, at a bare minimum, you set up the source as a linked server, you could write a small T-SQL batch such as:
TRUNCATE TABLE DestTable
INSERT INTO DestTable
SELECT * FROM [SourceServer].[Schema].[Table]
This wouldn't be as nice as SSIS (which gives you a more visual picture of what's happening), but the T-SQL above is pretty clear.
Since I would not take the programming route, the best solution I could give you would be, if you absolutely had to:
Use the SqlClient namespace.
Create two SqlConnections and two SqlCommands, and get an instance of one SqlDataReader.
Iterate through the source reader, and for each row execute the destination SqlCommand's insert with the values just read.
It'll be ugly, but it'll work.
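A rough sketch of that approach (the connection strings, the procedure name, and the column names are placeholders):
using (var source = new SqlConnection(sourceConnectionString))
using (var dest = new SqlConnection(destConnectionString))
{
    source.Open();
    dest.Open();

    using (var getData = new SqlCommand("GetRemoteData", source) { CommandType = CommandType.StoredProcedure })
    using (var reader = getData.ExecuteReader())
    using (var insert = new SqlCommand(
        "INSERT INTO LocalTable (ColA, ColB) VALUES (@a, @b)", dest))
    {
        insert.Parameters.Add("@a", SqlDbType.Int);
        insert.Parameters.Add("@b", SqlDbType.NVarChar, 100);

        while (reader.Read())
        {
            insert.Parameters["@a"].Value = reader["ColA"];
            insert.Parameters["@b"].Value = reader["ColB"];
            insert.ExecuteNonQuery();   // one round trip per row: ugly, but it works
        }
    }
}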
It doesn't seem like a huge quantity of data to synchronize. Under the conditions you described (only an SP to access the remote DB and no way to get anything else), you can go for Marc Gravell's solution.
If the data can only grow and existing data cannot be changed, you can compare the record counts on the remote and local DBs to optimize the operation; if there is no change in the remote DB, there is no need to copy.
