C#/ASP.NET Web Site using SQL Server CE Database Slow Query

The main issue I have is that a simple select from my SQL Server CE database has a typical wait time of 17 seconds for the first query and ~5 seconds for each subsequent query.
After letting the site sit idle for a couple of minutes, the 17-second delay comes back. I assume the delay comes from establishing connections, caching, etc., but on our LAN this is an unreasonable wait, and it will only get worse for outside users.
The site is a C#/ASP.NET web site using a SQL Server Compact database (v4.0.8876.1), deployed to an IIS 8 server. The site calculates values from user input using coefficients for different models that are stored in the database.
What am I doing wrong for there to be such a massive delay during the connection/queries?
Connection string:
<add name="DatabaseConnection"
connectionString="Data Source=|DataDirectory|\Database.sdf"
providerName="System.Data.SqlServerCe.4.0"/>
Query:
string connstr = ConfigurationManager.ConnectionStrings["DatabaseConnection"].ConnectionString;
using (SqlCeConnection conn = new SqlCeConnection(connstr))
{
    // Parameterised lookup of a single conservation model.
    string queryString = @"select id, description, imagePath
                           from ConservationModels
                           where model = @modelName";
    using (SqlCeCommand command = new SqlCeCommand(queryString, conn))
    {
        command.Parameters.AddWithValue("@modelName", modelName);
        conn.Open();
        using (SqlCeDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                modelID = (int)reader[0];
                description = (string)reader[1];
                imagePath = (string)reader[2];
            }
        }
    }
}
Apologies for any missing information, and I appreciate any suggestions. If there is a better way to achieve this goal for a web site, I am all ears.

This may be belated, but I recently found out (the hard way) that if you use SQL Server CE with ASP.NET on OWIN hosted on IIS, IIS will recompile or build a new app domain on every request. No idea why this happens, but the time you're seeing is likely ASP.NET rebuilding the application.
My application originally used SQL Server Express with Entity Framework and everything worked fine: the typical slow response on the first request while Entity Framework compiles the database schema, then lightning-fast afterwards. Static variables on the application carried across multiple requests.
I moved from SQL Server Express to SQL Server CE, and suddenly Entity Framework was recompiling the schema on every request. That meant every request was slow.
Also, any static variables I kept on the application were reset on every request, consistent with a new app domain being built.
Moving the whole stack, unchanged, to self-hosting on OWIN made it normal again: the first request slow, the subsequent ones lightning-fast.
This suggests that ASP.NET on OWIN on IIS with SQL Server CE is doing something strange by not reusing the existing app domain and constantly building new ones.
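A quick way to confirm this behavior yourself is to watch a static counter across requests; this is a hypothetical diagnostic sketch, not from the original post. If the counter never gets past 1, each request really is being served by a freshly built app domain.

using System;
using System.Threading;

// Hypothetical diagnostic: static state survives only as long as the
// app domain does, so a recycled app domain restarts the counter at 1.
public static class AppDomainProbe
{
    private static int _requests;

    public static int NextRequestNumber() => Interlocked.Increment(ref _requests);
}

// Somewhere in your request pipeline (controller, middleware, handler):
//     int n = AppDomainProbe.NextRequestNumber();
//     logger.LogInformation("Request #{N} in app domain {Id}", n, AppDomain.CurrentDomain.Id);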

Following MatteoSp's advice, I removed the SQL Server CE components from the site and created a SQL Server Express database to replace them. Though the site's compilation time seems to have increased for some reason, this resolved the slow queries/connections.
Thank you for the assistance.

Related

Performance problem on Azure web app using Dapper, Azure SQL and .NET Core 3.1

We have built a web api with .NET Core 3.1, Dapper and Azure SQL which is hosted on an Azure web app. Using loader.io we ran some performance tests, which were disappointing (~10 RPS) for an S1 instance. Each API call triggers a couple of SQL queries using Dapper to retrieve the data from an Azure SQL database (100 DTU).
The Application Insights profiler traces show that a lot of time is spent in the database, travelling the hot path. The database itself doesn't appear stressed (DTU or other metrics), and the web app has some peaks on CPU while the threads look OK. (Screenshots omitted.)
The code is injected using standard .NET Core IServiceCollection DI and Scrutor:
// Injection of all SqlServer repositories.
services.Scan(scan => scan
.FromAssemblyOf<SomeRepository>()
.AddClasses(classes => classes.Where(type => type.Name.EndsWith("Repository")))
.AsImplementedInterfaces()
.WithTransientLifetime());
Dapper is used in the SQL Server repositories to retrieve the information:
public async Task<IReadOnlyCollection<DocumentRevisionOverview>> GetDocumentRevisionsAsync()
{
Logger.LogInformation("Starting {Method} on {Class}.", nameof(GetDocumentRevisionsAsync), ClassName);
var sql = "select statement";
// databaseParameters.ConnectionString (type is string) is injected via the constructor.
using (var connection = new SqlConnection(databaseParameters.ConnectionString))
{
var documentRevisions = await connection.QueryAsync<DocumentRevisionOverview>(sql);
Logger.LogInformation("Finished {Method} on {Class}.", nameof(GetDocumentRevisionsAsync), ClassName);
return documentRevisions.ToList();
}
}
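To narrow down where the time goes, it may help to time the pool checkout/open separately from the query itself. A sketch of the same repository method with hypothetical Stopwatch instrumentation added (requires using System.Diagnostics; in addition to the Dapper and SqlClient usings):

public async Task<IReadOnlyCollection<DocumentRevisionOverview>> GetDocumentRevisionsTimedAsync()
{
    var sw = Stopwatch.StartNew();
    using (var connection = new SqlConnection(databaseParameters.ConnectionString))
    {
        await connection.OpenAsync();          // explicit open, to time the pool checkout
        var openMs = sw.ElapsedMilliseconds;

        var documentRevisions = await connection.QueryAsync<DocumentRevisionOverview>("select statement");
        Logger.LogInformation("Open took {OpenMs} ms, open+query took {TotalMs} ms.",
            openMs, sw.ElapsedMilliseconds);
        return documentRevisions.ToList();
    }
}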
We don't have varchars in our database tables, so the varchar to nvarchar conversion is not applicable.
All resources are located in the same resource group and location.
The tables contain only a couple of hundred records.
The JSON that is returned from the web API is 88 KB.
Dapper should re-use ADO.NET connection pooling, which is keyed on the connection string, and ours doesn't vary.
The connection string format is "Server={server};Initial Catalog={database};Persist Security Info=False;User ID={user};Password={password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
Individual calls (from Postman, for example) are quick, ~150 ms.
What can I do to further investigate the cause of the waiting time?
PS: I was pleasantly surprised to see a 5x improvement when deploying to a Linux web app (on a P1V2, 2x the size of an S1).
Without knowing your situation it's hard to know for sure, but I have seen Azure SQL be slower than SQL Server by quite a large amount, because the Azure SQL instance is not close enough to the executing server, even within the same data center.
At my last position we moved to Azure SQL; then, because our application was calculation-heavy, we moved back to SQL Server located on the virtual machine, and we used SQL Data Sync to push the data to Azure SQL for backup.
This won't solve your problem directly, but from what I have found, Azure SQL is best suited for back-end jobs.

C# Sql LocalDB Application Slow Performance

I am completely new to SQL/database applications and am trying out a simple contact management application using Visual Studio 2015 and C#. I am using SQL Express LocalDB. I have read on Google that it is meant for development purposes, but Microsoft also mentions that it can be used in production.
My problem is that on my development system the application takes a few seconds to load the first time, but after that every query runs quickly. When I tried it on a friend's system, it takes time every time I run any query. The database has just 20-30 records.
I create new connection using 'new SqlConnection' and then execute command created by 'new SqlCommand' and after executing query I close the connection.
Here is the code snippet from my app
using (SqlConnection sqlConnection = new SqlConnection(
    @"Data Source=(LocalDB)\MSSQLLocalDB; AttachDbFilename=""C:\ContactsDB.mdf""; Integrated Security=True; Connect Timeout=30"))
using (SqlCommand sqlCmd = new SqlCommand(
    "SELECT Id, first_name, last_name FROM ContactsMaster ORDER BY first_name", sqlConnection))
{
    sqlConnection.Open();
    using (SqlDataReader reader = sqlCmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // ListViewItem expects a string, so convert the column value.
            ListViewItem lvi = new ListViewItem(reader["first_name"].ToString());
            lvi.SubItems.Add(reader["Id"].ToString());
            listViewItems.Add(lvi);
        }
    }
}
Q. Should I keep the connection open the whole time the app is running? I don't think that's advisable: if the app crashes, the database could become corrupt.
One drawback people mention is that LocalDB shuts down after a short idle period. So should I keep pinging the database every few milliseconds? Or should I not use LocalDB in production at all?
I want to keep the app's database prerequisites as light as possible; LocalDB installation is really seamless.
I have not used SQL Server Express. Is the Express installation as seamless as LocalDB's, and can I use a LocalDB-style connection string in Express too, giving the .mdf filename directly?
LocalDB has auto-shutdown; the default is 5 minutes. You can set it to a higher value (e.g. 12 hours).
The max is 65535 minutes.
See: How to prevent SQL Server LocalDB auto shutdown?
SQL Server Express auto-shutdown is 1 hour, if I'm not wrong.
Symptoms on my PC:
The first open is 10-30 seconds slow. If I reopen the app right afterwards, it is below 1 second. If I wait for a while, it is slow again.
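If you want to raise the timeout from code rather than from a tool, something like the following should work. This is only a sketch, assuming the sp_configure 'user instance timeout' option from the linked question applies to your LocalDB instance; the value is in minutes (720 = 12 hours, max 65535).

using System.Data.SqlClient;

// Sketch: raise LocalDB's idle auto-shutdown timeout via sp_configure.
class LocalDbTimeout
{
    static void Main()
    {
        using (var conn = new SqlConnection(@"Data Source=(LocalDB)\MSSQLLocalDB; Integrated Security=True"))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = @"
                EXEC sp_configure 'show advanced options', 1;
                RECONFIGURE;
                EXEC sp_configure 'user instance timeout', 720;
                RECONFIGURE;";
            cmd.ExecuteNonQuery();
        }
    }
}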
There are many things to take into account for database performance; it's not a simple question. For such a small number of records there shouldn't be performance problems. Try storing the database files on a disk other than the OS disk; even better, place the data file and the log file on different disks too.
As for your question: connections must always be closed and disposed properly, in a finally block or inside a using block.
SQL Express is very easy to install and also uses a connection string; the biggest difference is that it can be used across the network.
I finally moved to SQLite, which is much faster compared with SQL LocalDB.

C# creating a database programmatically with SMO

I am trying to create a database, but once created, I cannot connect to it.
The server is Microsoft SQL Server 2008, and we're on .NET 4.5. We're creating the database with SMO, but we usually use Dapper to connect to and query the database.
This is the code I have so far, which works:
System.Data.SqlClient.SqlConnection con = new System.Data.SqlClient.SqlConnection(connectionString);
Microsoft.SqlServer.Management.Smo.Server srv = new Microsoft.SqlServer.Management.Smo.Server(new Microsoft.SqlServer.Management.Common.ServerConnection(con));
var database = new Microsoft.SqlServer.Management.Smo.Database(srv, dbName);
database.Create(false);
database.Roles["db_datareader"].AddMember(???);
database.Roles["db_datawriter"].AddMember(???);
database.Roles["db_backupoperator"].AddMember(???);
srv.Refresh();
Notice the ???. I have tried
System.Environment.UserDomainName + "\\" + System.Environment.UserName
and
System.Environment.UserName
but it fails with the error "Add member failed for DatabaseRole 'db_datareader'." for both values.
The problem is that when I create the database, I cannot connect to it afterwards (using Dapper) for some reason, from the same program. I get the error message: Cannot open database "<database_name>" requested by the login. The login failed. Login failed for user '<domain>\<username>' (where <database_name> is the database name, <domain> my logon domain, and <username> my Windows logon).
Am I missing something? Am I doing the right thing? I've tried searching the web, but it seems no one creates databases this way. The methods are there; it should work, no?
** Update **
If I comment out the database.Roles["..."].AddMember(...) lines and add a breakpoint at srv.Refresh(), resuming the program from there solves everything.
Why does a breakpoint solve everything? I can't just break the program in production, nor break it every time a database is created.
It sounds like the Dapper connection issue is a problem with SQL Server doing some of the SMO operations asynchronously. In all likelihood, the new database is not ready for other users/connections immediately, but requires a small amount of time for SQL Server to prepare it. In "human time" (in SSMS, or at a breakpoint) this isn't noticeable, but in "program time" it's too fast, so you probably need to give it a pause.
This may also be the problem with the role's AddMember, but there are a number of things that could be wrong here, and we do not have enough information to tell (specifically: does AddMember work later on, and are the strings being passed correct?).
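For example, a small retry loop before the first Dapper query can absorb that preparation window. A sketch under the assumption that the new database just needs a moment to accept logins (the helper name and timings are illustrative):

using System;
using System.Data.SqlClient;
using System.Threading;

// Illustrative sketch: retry opening a connection to the new database
// until SQL Server has finished preparing it, or give up after N attempts.
static void WaitUntilDatabaseIsReady(string connectionString, int attempts = 10)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open(); // succeeds once the database accepts logins
                return;
            }
        }
        catch (SqlException) when (i < attempts - 1)
        {
            Thread.Sleep(500); // brief pause before retrying
        }
    }
}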
This is happening because you've created the user, but no login for that user. Though I don't know the exact syntax, you're going to have to create a Login. You'll want to set its LoginType to LoginType.WindowsUser. Further, you'll likely need to set the WindowsLoginAccessType to WindowsLoginAccessType.Grant, and you'll need to set the credential by building one, probably a NetworkCredential with the user name you want.
To put a visual on this: in Management Studio, the Login sits under the Security node for the server, whereas the User sits under the Security node for the database. Both need to exist for access to the SQL Server.
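In SMO terms, that might look like the following. This is only a sketch, assuming a Windows login for the current user, and reusing the srv and database objects from the question; as the answer above notes, the exact syntax is uncertain:

using Microsoft.SqlServer.Management.Smo;

// Sketch: create a server-level Login, then a database User mapped to it,
// and only then add the user to the database roles.
string account = System.Environment.UserDomainName + "\\" + System.Environment.UserName;

var login = new Login(srv, account)
{
    LoginType = LoginType.WindowsUser
};
login.Create(); // Windows logins need no password; if required, set
                // WindowsLoginAccessType before Create() as described above.

var user = new User(database, account) { Login = login.Name };
user.Create();

database.Roles["db_datareader"].AddMember(user.Name);
database.Roles["db_datawriter"].AddMember(user.Name);
database.Roles["db_backupoperator"].AddMember(user.Name);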

Inserting different data simultaneously from different clients

I created a Windows Forms application in C# and a database in MS SQL Server 2008 Express, and I use LINQ to SQL to insert and edit data.
The database is hosted on a server running Windows Server 2008 R2 (Standard edition). Right now the application runs on five different computers, and users are authenticated through Active Directory.
One complaint reported to me was that sometimes, when different data is entered and submitted, that data does not appear in the application's listing. I use a try-catch block to surface errors, but no errors appear in the application; the data simply disappears.
The id of the table records is an auto-increment integer. As I have to tell users the registration number that was entered, I use the following piece of code:
try
{
    ConectionDataContext db = new ConectionDataContext();
    Table_Registers tr = new Table_Registers();
    tr.Name = textbox1.Text;
    tr.sector = textbox2.Text;
    db.Table_Registers.InsertOnSubmit(tr);
    db.SubmitChanges();
    // The auto-increment id is populated on the entity after SubmitChanges.
    int numberRegister = tr.NumberRegister;
    MessageBox.Show(numberRegister.ToString());
}
catch (Exception e)
{
    // Surface the error instead of swallowing it silently.
    MessageBox.Show(e.Message);
}
I wonder if I'm doing something wrong, or if you know of any article on the web that explains how to insert data from different clients into MS SQL Server databases; please let me know.
Thanks.
That's what a database server DOES: "insert data simultaneously from different clients".
One thing you can do is to consider "transactions":
http://www.sqlteam.com/article/introduction-to-transactions
Another thing you can (and should!) do is to ensure as much work as possible is done on the server, by using "stored procedures":
http://www.sql-server-performance.com/2003/stored-procedures-basics/
You should also check the SQL Server error logs, especially for potential deadlocks. You can see these in your SSMS GUI, or in the "logs" directory under your SQL Server installation.
But the FIRST thing you need to do is determine exactly what's going on. Since you've only got MSSQL Express (which is not a good choice for production use!), perhaps the easiest approach is to create a "log" table: insert an entry into your "log" every time you insert a row into the real table, and see if anything is "missing" (i.e. you have more entries in the log table than in the data table).
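A sketch of that diagnostic, combining it with the transaction suggestion above. Table_Log is a hypothetical entity you would add for this purpose, and the snippet reuses the question's context (requires a reference to System.Transactions):

using System.Transactions;

// Sketch: write a row to a hypothetical Table_Log alongside every real
// insert, inside one transaction, then compare row counts later to see
// whether inserts are going missing.
using (var scope = new TransactionScope())
{
    var db = new ConectionDataContext();

    var tr = new Table_Registers { Name = textbox1.Text, sector = textbox2.Text };
    db.Table_Registers.InsertOnSubmit(tr);

    var log = new Table_Log { Action = "insert", Name = textbox1.Text }; // hypothetical entity
    db.Table_Log.InsertOnSubmit(log);

    db.SubmitChanges();
    scope.Complete(); // commit both inserts atomically
}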

Getting SQL Connection fragmentation, different way to connect to DB's

We have multiple DB servers. On one of the servers we have a master config table that holds instructions as to what DB server and DataBase name an Agency is supposed to use.
Currently each database always has 2 connections open, even when not in use (which we are fixing). However, we're trying to keep our connections from being scattered all over the place, and to relieve some of the stress on our DB servers.
After a lot of research we found some articles saying to make all connections to one central location and then change which database we're using through the SqlConnection object. That seems a bit roundabout, but it could work.
So I'm wondering what others do in this situation?
The current path for this is:
- User logs in.
- System reads the ConfigTable to find out which database the user should connect to.
- System loads the agency connection settings into memory (session) for that user.
- Every request now directly hits that user's database.
Is there a more efficient way of doing this?
Open connections late, and close them early.
For example:
string result;
using (var con = new SqlConnection(...))
{
    con.Open();
    var com = con.CreateCommand();
    com.CommandText = "select 'hello world'";
    // ExecuteScalar returns object, so cast to the expected type.
    result = (string)com.ExecuteScalar();
}
ADO.NET will make sure to efficiently pool and reuse connections. And since you're only using connections when you need them, there are no idle connections lying around.
EDIT: ADO.NET only pools connections whose connection strings are literally identical, so if you use Initial Catalog=<dbname> in the connection string, that could hurt performance by requiring 500+ connection pools for one server.
So if you have 4 servers with a lot of databases, make sure you only use 4 connection strings. After connecting, switch databases with:
com.CommandText = "use <dbname>";
com.ExecuteNonQuery();
Or query with a three-part name like:
select * from <dbname>.dbo.YourTable
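ADO.NET also exposes this switch directly. A small sketch using SqlConnection.ChangeDatabase, which issues the equivalent of "use" for you while the pool still sees a single connection string (the server and database names here are hypothetical):

using System.Data.SqlClient;

// Sketch: one pooled connection string per server; switch databases
// per request with ChangeDatabase instead of baking Initial Catalog
// into the connection string.
using (var con = new SqlConnection("Server=myServer;Integrated Security=true"))
{
    con.Open();
    con.ChangeDatabase("AgencyDb42"); // hypothetical database name

    var com = con.CreateCommand();
    com.CommandText = "select count(*) from dbo.Users";
    int userCount = (int)com.ExecuteScalar();
}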
