Is it possible to connect to SQL Server without specifying a database? - c#

I'm trying to write some unit tests for code that uses SQL Server for persistence, but I would like to be able to run the tests by just pointing them at a SQL Server instance and letting them create their own database to run in. After each test the database can simply be dropped, and on setup before the next test it can be recreated, so I know there is no legacy data or structure left over from a previous test affecting the next one.

In brief: no, you cannot do that. You might be able to leave the database out of the connection string, but in that case the connection will be made to the configured default database of the login that's connecting to SQL Server (and that default database must exist at the time the connection is made).
If you want this scenario, you need to:
first connect to your instance and the master database, and create your new testdb (or whatever it's called)
disconnect
in your tests, connect to the instance and the testdb database
Better yet: use a mocking framework of some sort so you don't even need an actual database in your testing scenario!

I use the following class to facilitate the OP's scenario:
using System;
using System.Data.SqlClient;

public class MsSqlDatabaseCreator
{
    public void Create(string connectionstring)
    {
        if (DatabaseExists(connectionstring))
        {
            DropDatabase(connectionstring);
        }
        CreateDatabase(connectionstring);
    }

    private static void CreateDatabase(string connectionString)
    {
        var sqlConnectionStringBuilder = new SqlConnectionStringBuilder(connectionString);
        var databaseName = sqlConnectionStringBuilder.InitialCatalog;
        sqlConnectionStringBuilder.InitialCatalog = "master";
        using (var sqlConnection = new SqlConnection(sqlConnectionStringBuilder.ConnectionString))
        {
            sqlConnection.Open();
            using (var sqlCommand = sqlConnection.CreateCommand())
            {
                sqlCommand.CommandText = $"CREATE DATABASE [{databaseName}]";
                sqlCommand.ExecuteNonQuery();
            }
        }
    }

    private static bool DatabaseExists(string connectionString)
    {
        var sqlConnectionStringBuilder = new SqlConnectionStringBuilder(connectionString);
        var databaseName = sqlConnectionStringBuilder.InitialCatalog;
        sqlConnectionStringBuilder.InitialCatalog = "master";
        using (var sqlConnection = new SqlConnection(sqlConnectionStringBuilder.ConnectionString))
        {
            sqlConnection.Open();
            using (var command = sqlConnection.CreateCommand())
            {
                // db_id() returns NULL (DBNull) when the database does not exist.
                command.CommandText = $"SELECT db_id('{databaseName}')";
                return command.ExecuteScalar() != DBNull.Value;
            }
        }
    }

    private static void DropDatabase(string connectionString)
    {
        var sqlConnectionStringBuilder = new SqlConnectionStringBuilder(connectionString);
        var databaseName = sqlConnectionStringBuilder.InitialCatalog;
        sqlConnectionStringBuilder.InitialCatalog = "master";
        using (var sqlConnection = new SqlConnection(sqlConnectionStringBuilder.ConnectionString))
        {
            sqlConnection.Open();
            using (var sqlCommand = sqlConnection.CreateCommand())
            {
                // Kick out any remaining connections before dropping the database.
                sqlCommand.CommandText = $@"
                    ALTER DATABASE [{databaseName}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
                    DROP DATABASE [{databaseName}];";
                sqlCommand.ExecuteNonQuery();
            }
        }
    }
}
The important part is switching the database name (initial catalog) to master. This way you can get by with just one connection string.
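For example, a test fixture's setup could call it like this (the connection string is only an illustration; adjust the server and database names for your environment):

var creator = new MsSqlDatabaseCreator();
creator.Create(@"Server=.\SQLEXPRESS;Initial Catalog=UnitTestDb;Integrated Security=true");
// The tests themselves then use the very same connection string,
// pointing at UnitTestDb rather than master.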

What you want to accomplish is possible using a mocking framework, in which case you don't even have to "connect to a database", you simply mock the return values that the database should return in order for you to test your "db handler" implementation.
There are several to choose from when it comes to C#; I can recommend Rhino Mocks and Moq, to name two. Here's a question detailing a bit more: https://stackoverflow.com/questions/37359/what-c-sharp-mocking-framework-to-use

Why not have one dedicated, consistently named database for tests, and drop and recreate it every time? That way you won't need to mess about with connection strings - it is always the same.
And yet, there is a better solution: within each test, start a transaction, run the test that messes up your data, and once the test has passed (or failed), roll the transaction back. This way you don't need to drop and recreate the database for every test, because the data is never actually changed.
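A minimal sketch of that idea, assuming NUnit-style attributes and System.Transactions (the fixture and repository names are hypothetical):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class OrderRepositoryTests
{
    private TransactionScope _scope;

    [SetUp]
    public void SetUp()
    {
        // Each test runs inside its own ambient transaction.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        // Disposing without calling Complete() rolls the transaction back,
        // so the database is unchanged for the next test.
        _scope.Dispose();
    }

    [Test]
    public void Inserting_an_order_persists_it()
    {
        // ...exercise the real test database here...
    }
}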
But you'll need to make sure the schema in the test database is always up to date, so you'll need to drop and recreate the test database whenever your schema changes.
I've blogged about database tests and how we deal with Entity Framework migrations. This might not be completely applicable to your situation, but it might help with ideas.
Regarding using mocks in your tests - yes, this is an absolutely valid suggestion and should be followed most of the time. Unless you are trying to test the database layer itself: in that case no mocks will save you, and you just have to go to the DB. Many times I have tried to mock DbContext in EF, but I never managed to simulate realistic DB behaviour, so going to the DB was easier for me than simulating it with mocks.

I'd use SQL Server Management Objects (SMO) for the task. Its Server and Database APIs don't necessarily need a connection string, but I think you might still need to specify a database; you can use master for that. (Check jeroenh's answer about creating the objects with plain SQL as well.)
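A rough sketch of what that looks like with SMO, assuming references to the SMO assemblies (Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo); the instance and database names are placeholders:

using Microsoft.SqlServer.Management.Smo;

// Connect to the instance; SMO builds its own connection under the covers.
Server server = new Server(@".\SQLEXPRESS");

// Drop the test database if it is left over from a previous run.
if (server.Databases.Contains("UnitTestDb"))
{
    server.KillDatabase("UnitTestDb");   // closes open connections and drops it
}

// Create a fresh, empty database for the test run.
Database database = new Database(server, "UnitTestDb");
database.Create();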
By the way, if you are using .NET 4.0.2 or later you can use LocalDB as well, which is even better.
Edit: Note that LocalDB is actually a SQL Server 2012 feature; however, you still need .NET Framework 4.0.2 or later to be able to use it.
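If LocalDB is available, a connection string along these lines should work (the instance name is version-dependent: "(localdb)\v11.0" for SQL Server 2012, "(localdb)\MSSQLLocalDB" for later versions):

using (var connection = new SqlConnection(@"Server=(localdb)\v11.0;Integrated Security=true"))
{
    connection.Open();
    // create and drop the throwaway test database here, just as with a full instance
}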

Related

how to properly create a postgres database connection class in c#?

I am new to working with databases. I am using a Postgres database and I want to connect to it from C# for my project. Since I have multiple form screens in my project, I assume it is better to create a separate database connection class instead of repeating the same code in every other class. I want to learn how to create an effective Postgres database connection class in C#.
There's no need to create a connection class since database connections and commands aren't complicated or expensive to create. The best practice is to create a connection and command, execute the SQL, and then dispose of both of them. The typical pattern is:
string connString = "...";  // connection string from config
using (OdbcConnection conn = new OdbcConnection(connString))
{
    using (OdbcCommand cmd = new OdbcCommand(sql, conn))
    {
        // execute command
    }
}
The using construct ensures that the connection and command are closed even if there is a database error.
Take a look at this website: https://www.connectionstrings.com/postgresql/
This is a great resource for finding connection strings for a variety of different databases! I reference it quite a bit. There are a couple of different connection strings for PostgreSQL, so you will need to determine which one is best for your use case.
I wouldn't set up a special class for a connection. Instead, I recommend that you use an appsettings.json or web.config file to store the connection string and read it when you need it. Check out the documentation from Microsoft: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-6.0
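A minimal sketch of that approach, assuming a connection string entry named "PostgresDb" (a hypothetical name) in app.config/web.config and the ODBC provider used in the answer above; in ASP.NET Core you would read appsettings.json via IConfiguration.GetConnectionString instead:

using System;
using System.Configuration;   // add a reference to System.Configuration.dll
using System.Data.Odbc;

// Read the connection string by name from the <connectionStrings> section.
string connString = ConfigurationManager.ConnectionStrings["PostgresDb"].ConnectionString;

using (OdbcConnection conn = new OdbcConnection(connString))
using (OdbcCommand cmd = new OdbcCommand("SELECT version()", conn))
{
    conn.Open();
    Console.WriteLine(cmd.ExecuteScalar());
}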

GetSchema("Databases") on ODBC connection (C#)

I am testing various DB connection methods in C#. In particular, I am testing the SqlConnection and OdbcConnection classes; my DB is SQL Server Express (.\SQLEXPRESS). Both are working reasonably well, except for listing the available databases on the server.
In my test code I use a "generic" DbConnection object and a simple factory to create an instance of the specific SqlConnection or OdbcConnection subclass (they both derive from DbConnection):
DbConnection connection;
switch (connection_type)
{
    case DbConnectionType.DBCONN_MSSQL:
        connection = new SqlConnection(...sql connection string...);
        break;
    case DbConnectionType.DBCONN_ODBC:
        connection = new OdbcConnection(...odbc connection string...);
        break;
}
The trick seems to work well except when I try to get the list of databases on the server:
DataTable databases = connection.GetSchema("Databases");
foreach (DataRow database in databases.Rows)
{
    String databaseName = database["database_name"] as String;
    Console.WriteLine(databaseName);
}
When "connection" is an OdbcConnection (and, note, the database is the same), I get an exception saying that "Databases" key was not found. I listed all the keys exposed by GetSchema(), and the ODBC version returns only a subset of the items exposed by the SQLServer version. I couldn't find any hint about this specific problem. Is it a documented/expected behaviour? Am I doing something wrong?
NOTE: here is how I build the ODBC connection string:
OdbcConnectionStringBuilder builder;
builder = new OdbcConnectionStringBuilder();
builder.Driver = "SQL Server";
builder.Add("Server", ".\\SQLEXPRESS");
builder.Add("Uid", "");
builder.Add("Pwd", ""); // Using current user
builder.Add("Integrated Security", "SSPI");
connection = new OdbcConnection(builder.ConnectionString);
Is it a documented/expected behaviour?
Yes. See Retrieving Database Schema Information
Am I doing something wrong?
If your goal is to read SQL Server metadata in a provider-agnostic way, then yes. You should query the SQL Server catalog views directly: sys.databases, sys.tables, etc.
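For example, a provider-agnostic sketch that lists databases by querying sys.databases through the common DbConnection/DbCommand base classes, so it works the same whether the concrete connection is a SqlConnection or an OdbcConnection:

using System;
using System.Data.Common;

static void ListDatabases(DbConnection connection)
{
    connection.Open();
    using (DbCommand command = connection.CreateCommand())
    {
        // sys.databases is a SQL Server catalog view; this bypasses GetSchema entirely.
        command.CommandText = "SELECT name FROM sys.databases ORDER BY name";
        using (DbDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader.GetString(0));
            }
        }
    }
}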
Make sure your "Databases" model has a valid key. Add the [Key] data annotation if the key you want for that entity doesn't follow the Entity Framework "ClassName" + "ID" convention.

Beginner requests Help with Mysql and C#

I'm trying to build a program that uses C# to work with a MySQL DB. I get the C# syntax and can write the language, but I don't have much experience with the libraries, and I feel a bit lost.
Could someone post examples of how a program would be built (in technical terms, syntax would be nice, but pseudo code is fine, too)?
I understand the theory of how it works, but need a hands on approach to it.
Thank you.
EDIT
I forgot to add that I want to learn how to do it with the .NET v.2.0 framework / VS2005 / MySQL v5.0 combination.
EDIT # 2
Only .NET 2.0 will be supported. =)
Here is a tutorial for Entity Framework + MySQL.
There are lots of other ways to work with the DB, depending on what you need:
If you need to execute raw SQL queries against the DB - use OdbcConnection + OdbcCommand
Need to manipulate items in the DB as objects - use an ORM (Entity Framework, NHibernate, Linq2Sql)
Like old-style DB interop? - DataSets are your choice.
I really like EF. It's an easy thing to start with.
PS: Before mixing UI and DB interop, please read about separation of concerns. MVC is interesting to read about too. About "libraries": create another project in your solution and put the DB interop logic there. Don't mix it all into one assembly, because once your project becomes bigger than a "Hello Database!" application, that will create a big mess in the code and logic, really.
UPDATE:
Using VS2005 and .NET 2.0 is a puzzling choice, really. Lots of tools and assemblies have appeared since the 2.0 release: LINQ, ORMs, etc. Living without them is hard, and many of the benefits of C# are lost. I highly recommend using the latest techniques unless there is a strict reason to stay on 2.0.
If you are using SQL Server, OdbcCommand and OdbcConnection can be replaced with SqlCommand and SqlConnection. (Thanks to @Abe Miessler's comment.)
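Since the question is about MySQL specifically, here is a minimal sketch assuming the MySQL Connector/NET provider (MySql.Data.MySqlClient) is referenced; the connection string, table and column names are made up for illustration:

using System;
using MySql.Data.MySqlClient;  // MySQL Connector/NET provider (assumed referenced)

class MySqlExample
{
    static void Main()
    {
        // Hypothetical connection string; adjust server, database and credentials.
        string connString = "Server=localhost;Database=testdb;Uid=testuser;Pwd=secret;";

        using (MySqlConnection conn = new MySqlConnection(connString))
        using (MySqlCommand cmd = new MySqlCommand("SELECT id, name FROM customers", conn))
        {
            conn.Open();
            using (MySqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}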
Here is an example swiped from MSDN:
public void InsertRow(string connectionString, string insertSQL)
{
    using (OdbcConnection connection = new OdbcConnection(connectionString))
    {
        // The insertSQL string contains a SQL statement that
        // inserts a new row in the source table.
        OdbcCommand command = new OdbcCommand(insertSQL, connection);

        // Open the connection and execute the insert command.
        try
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
        // The connection is automatically closed when the
        // code exits the using block.
    }
}
If you want to read records in a DB, look at this example:
public static void ReadData(string connectionString)
{
    string queryString = "SELECT DISTINCT CustomerID FROM Orders";
    using (OdbcConnection connection = new OdbcConnection(connectionString))
    {
        OdbcCommand command = new OdbcCommand(queryString, connection);
        connection.Open();

        // Execute the DataReader and access the data.
        OdbcDataReader reader = command.ExecuteReader();
        while (reader.Read())
        {
            Console.WriteLine("CustomerID={0}", reader[0]);
        }

        // Call Close when done reading.
        reader.Close();
    }
}
FYI, I am just copying/pasting these directly from MSDN. I highly recommend reading over their documentation and looking at their examples if you are just getting started.
http://msdn.microsoft.com/en-us/library/system.data.odbc.odbcdatareader.aspx
Here is a blog post getting you started with MySql and C#.
http://blog.bobcravens.com/2010/06/the-repository-pattern-with-linq-to-fluent-nhibernate-and-mysql/
Hope that gets you started.
Bob

C# Sync MS Access database to sql server

I am trying to set up a synchronization routine in C# to send data from an MS Access database to a SQL Server. MS Access is not my choice; it's just the way it is.
I am able to query the MS Access database and get an OleDbDataReader record set. I could potentially read each individual record and insert it into SQL Server, but it seems so wasteful.
Is there a better way to do this? I know I could do it easily by linking MS Access to SQL Server and performing the update there, but this is for end users and I don't want them messing with Access.
EDIT:
Just looking at SqlBulkCopy I think that may be the answer if I get my results into DataRow[]
I found a solution in .NET that I am very happy with. It allows me to give the access to the sync routine to any user within my program. It involves the SQLBulkCopy class.
private static void BulkCopyAccessToSQLServer(CommandType commandType, string sql, string destinationTable)
{
    using (DataTable dt = new DataTable())
    {
        // Read the Access data into a DataTable.
        using (OleDbConnection conn = new OleDbConnection(Settings.Default.CurriculumConnectionString))
        using (OleDbCommand cmd = new OleDbCommand(sql, conn))
        using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
        {
            cmd.CommandType = commandType;
            cmd.Connection.Open();
            adapter.SelectCommand.CommandTimeout = 240;
            adapter.Fill(dt);
        }

        // Bulk copy the DataTable into SQL Server.
        using (SqlConnection conn2 = new SqlConnection(Settings.Default.qlsdat_extensionsConnectionString))
        {
            conn2.Open();
            using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
            {
                copy.DestinationTableName = destinationTable;
                copy.BatchSize = 1000;
                copy.BulkCopyTimeout = 240;
                copy.NotifyAfter = 1000;
                copy.WriteToServer(dt);
            }
        }
    }
}
Basically this puts the data from MS Access into a DataTable; it then uses the second connection conn2 and the SqlBulkCopy class to send the data from this DataTable to SQL Server. It's probably not the best code, but it should give anyone reading this the idea.
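A hypothetical call might look like this (the table names are placeholders; the connection strings are the ones baked into the method via Settings):

// Copy every row from the Access table "Students" into dbo.Students on SQL Server.
BulkCopyAccessToSQLServer(CommandType.Text, "SELECT * FROM Students", "dbo.Students");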
You should harness the power of set-based queries over RBAR (row-by-agonizing-row) efforts.
Look into a SSIS solution to synchronize the data and then schedule the package to run at regular intervals using SQL Server Agent.
You can call an SSIS package from the command line so you can effectively do it from MS Access or from C#.
Also, the SQL Server, the MS Access DB and the SSIS package do not have to be on the same machine. As long as your calling program can see the SSIS package, and the package can connect to the SQL Server and the MS Access DB, you can transfer data from one place to another.
It sounds like what you are doing is ETL. There are several tools that are built to do this and to me, there is little reason to reinvent the functionality. You have SQL Server, therefore you have SSIS. It has a ton of tools for automated transformations, cleanups, lookups, etc. that you can use out of the box.
Unless this is a real cut-and-dry data load and there is absolutely no scope for the complexity of the upload to increase later on (yeah, right!) I would go with a tried and tested ETL tool.
If SQL Server Integration Services isn't an option, you could write out to a temporary text file the data that you read from Access and then call bcp.exe to load it to the database.
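If you go that route, a rough sketch of shelling out to bcp.exe from C# (the database, table and server names are placeholders; -T uses a trusted connection and -c loads character data with the default tab delimiter):

using System.Diagnostics;

static void BulkLoadWithBcp(string dataFile)
{
    var startInfo = new ProcessStartInfo
    {
        FileName = "bcp.exe",
        Arguments = "TargetDb.dbo.Students in \"" + dataFile + "\" -S .\\SQLEXPRESS -T -c",
        UseShellExecute = false
    };

    using (Process bcp = Process.Start(startInfo))
    {
        bcp.WaitForExit();
    }
}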
I have done something like this before.
I used
OleDbConnection aConnection = new OleDbConnection(String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0}", fileName));
aConnection.Open();
to open the access db. Then
OleDbCommand aCommand = new OleDbCommand(String.Format("select * from {0}", accessTable), aConnection);
OleDbDataReader aReader = aCommand.ExecuteReader();
to execute the read from the table. Then
int fieldCount = aReader.FieldCount;
to get the field count
while (aReader.Read())
to loop the records and
object[] values = new object[fieldCount];
aReader.GetValues(values);
to retrieve the values.
There are several ways to sync, but it can cause problems when you change a field name in SQL Server, or add or delete a column. The best option would be:
Create connections for SQL Server and OleDb.
Write a custom query to fetch records from one connection and save them to the other.
Before executing, make sure your program updates all the table definitions.
In my case this helped because the load on SQL Server went down.
Can you not transfer the Access file to the server and delete it once the sync is complete?
You could create a Windows service for that.

Managing SQL Server Connections

What is the best practice for SQL connections?
Currently I am using the following:
using (SqlConnection sqlConn = new SqlConnection(CONNECTIONSTRING))
{
    sqlConn.Open();
    // DB CODE GOES HERE
}
I have read that this is a very effective way of doing SQL connections. By default SQL connection pooling is active, so as I understand it, when the using block ends the SqlConnection object is closed and disposed, but the actual connection to the DB is returned to the connection pool. Am I wrong about this?
That's most of it. Some additional points to consider:
Where do you get your connection string? You don't want that hard-coded all over the place and you may need to secure it.
You often have other objects to create as well before you really use the connection (SqlCommand, SqlParameter, DataSet, SqlDataAdapter), and you want to wait as long as possible to open the connection. The full pattern needs to account for that.
You want to make sure your database access is forced into its own data layer class or assembly. So a common thing to do is express this as a private function call:
private static string connectionString = "load from encrypted config file";

private SqlConnection getConnection()
{
    return new SqlConnection(connectionString);
}
And then write your sample like this:
using (SqlConnection sqlConn = getConnection())
{
    // create command and add parameters

    // open the connection
    sqlConn.Open();

    // run the command
}
That sample can only exist in your data access class. An alternative is to mark it internal and spread the data layer over an entire assembly. The main thing is that a clean separation of your database code is strictly enforced.
A real implementation might look like this:
public IEnumerable<IDataRecord> GetSomeData(string filter)
{
    string sql = "SELECT * FROM [SomeTable] WHERE [SomeColumn] LIKE @Filter + '%'";

    using (SqlConnection cn = getConnection())
    using (SqlCommand cmd = new SqlCommand(sql, cn))
    {
        cmd.Parameters.Add("@Filter", SqlDbType.NVarChar, 255).Value = filter;
        cn.Open();

        using (IDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                yield return (IDataRecord)rdr;
            }
        }
    }
}
Notice that I was also able to "stack" the creation of the cn and cmd objects, and thus reduce nesting and only create one scope block.
Finally, a word of caution about using the yield return code in this specific sample. If you call the method and don't complete your DataBinding or other use right away it could hold the connection open for a long time. An example of this is using it to set a data source in the Load event of an ASP.NET page. Since the actual data binding event won't occur until later you could hold the connection open much longer than needed.
Microsoft's Patterns and Practices libraries are an excellent approach to handling database connectivity. The libraries encapsulate most of the mechanisms involved with opening a connection, which in turn will make your life easier.
Your understanding of using is correct, and that usage is the recommended way of doing it. You can also call Close explicitly in your code.
Also : Open late, close early.
Don't open the connection until there are no more steps left before calling the database. And close the connection as soon as you're done.
