ASP.NET MVC kills my IIS - too many requests - c#

I'm facing a big issue. I've built software that over 100 users now use at once.
Since then my ASP.NET MVC application has been dying: IIS crashes 30-40 times a day.
I don't have any recursive code.
The main problem is that all users fetch a boolean that tells them whether they need to get new data.
That fetch HTTP method is called 1-5 times per second by different users, but the SQL reader is slower than the rate of incoming requests.
Error message: There is already an open DataReader associated with this Command which must be closed first.
or
Error message: Internal connection fatal error. Error state: 15, Token: 97
My method:
[HttpGet]
[Route("fetch")]
public IHttpActionResult fetch()
{
    SqlConnection connection = new SqlConnection(con);
    string sql = "SELECT whentime FROM did_datachange WHERE ux_admin_id = " + id;
    connection.Open();
    using (SqlCommand command = new SqlCommand(sql, connection))
    {
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                if (reader.FieldCount == 1)
                    dateTime = reader.GetDateTime(0);
            }
        }
    }
    connection.Close();
    ....following code...
}
I know that a lock could solve the problem, but locking slows the code down significantly. What can I do?

It was bad code (multi-threading that threw a lot of exceptions).
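In other words, shared state reused by concurrent requests. A minimal sketch of a thread-safe version, assuming `con` is the connection string and `id` and `dateTime` were shared fields: every request opens its own pooled connection and parameterizes the query, so no connection, command, or reader is ever shared between threads.

[HttpGet]
[Route("fetch")]
public IHttpActionResult fetch()
{
    // Local variable instead of a shared field.
    DateTime dateTime = DateTime.MinValue;

    // A fresh SqlConnection per request is cheap: ADO.NET pools the
    // physical connections, so nothing is shared across requests.
    using (var connection = new SqlConnection(con))
    using (var command = new SqlCommand(
        "SELECT whentime FROM did_datachange WHERE ux_admin_id = @id", connection))
    {
        command.Parameters.AddWithValue("@id", id);
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
                dateTime = reader.GetDateTime(0);
        }
    }
    ....following code...
}

Since 100 clients polling several times a second all ask the same question, caching that timestamp in memory for a second or two would also remove most of the SQL traffic without any locking.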

Related

C# Using multiple MySQL connection or put queries in queue to execute respectively

I have a client/server app, and my server stores data in a MySQL database. Currently I open one connection and run queries without a queue or anything. I don't think this is a good solution, because while one MySqlDataReader is open, another can't execute at the same time; the first one must be closed. I think I have two options: either open a connection per DataReader, or put my queries in a queue and execute them one by one.
I want to know which one is best, or whether there is any other way to prevent the exceptions that cause this error:
There is already an open DataReader associated with this Connection which must be closed first.
This is how I currently run queries. I first get the main connection and execute against it, which can cause the error above.
string query = "SELECT * FROM users WHERE username = #username";
ServerModel.Database.CheckConnection(); // Get the main connection
MySqlCommand cmd = new MySqlCommand(query, ServerModel.Database);
cmd.Parameters.AddWithValue("#username", username);
UserStatus userStatus;
using (MySqlDataReader dataReader = cmd.ExecuteReader())
{
if (dataReader.Read())
{
...
dataReader.Close();
return userStatus;
}
}
Note that this server may run thousands of queries at a time; think of a chat server.
In this case, please don't use the using block; I hope the approach below works fine.
string query = "SELECT * FROM users WHERE username = #username";
ServerModel.Database.CheckConnection(); // Get the main connection
MySqlCommand cmd = new MySqlCommand(query, ServerModel.Database);
cmd.Parameters.AddWithValue("#username", username);
UserStatus userStatus;
MySqlDataReader dataReader = cmd.ExecuteReader()
if (dataReader.Read())
{
...
dataReader.Close();
return userStatus;
}
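For comparison, a minimal sketch of the first option from the question (one connection per DataReader), relying on the driver's connection pool; `connectionString` and the `UserStatus` mapping are assumptions here:

string query = "SELECT * FROM users WHERE username = @username";

// A short-lived MySqlConnection per query is cheap: the driver pools
// the physical connections, and because every reader gets its own
// connection, two open readers can never collide.
using (var conn = new MySqlConnection(connectionString))
using (var cmd = new MySqlCommand(query, conn))
{
    cmd.Parameters.AddWithValue("@username", username);
    conn.Open();
    using (MySqlDataReader dataReader = cmd.ExecuteReader())
    {
        if (dataReader.Read())
        {
            // ... populate a UserStatus from the row here ...
        }
    }
} // Dispose returns the connection to the pool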

Cannot insert value into database? (ERROR: The database file is locked)

I created a SQLite database in Unity and tried to connect with this function.
void AddScores(string conn)
{
    IDbConnection dbconn;
    dbconn = (IDbConnection)new SqliteConnection(conn);
    dbconn.Open();
    using (IDbCommand dbCmd = dbconn.CreateCommand())
    {
        // string sqlQuery = "SELECT Id FROM PickAndPlace ";
        string sqlQuery = "INSERT INTO PickAndPlace (Id) VALUES (324)";
        dbCmd.CommandText = sqlQuery;
        using (IDataReader reader = dbCmd.ExecuteReader())
        {
            while (reader.Read())
            {
                print(reader.GetInt32(0));
            }
            dbconn.Close();
            reader.Close();
            dbCmd.Dispose();
        }
    }
}
The code above does not work if I try to insert values; it shows the error "The database file is locked: database is locked". But if I run a SELECT, it works fine. So where is my mistake?
SQLite generally accepts a single "connection". Once one application connects to the database, meaning it acquires the write lock, no other application can access it for writes, though they can still read. That is exactly the behaviour you are seeing. See File Locking And Concurrency Control in SQLite Version 3 for many more details about how this works, the various locking states, etc.
In principle, then, you can only have a single write connection open, so somehow you have more than one. Either you forgot to close some connections, or multiple threads or applications are trying to modify the database. Or perhaps an error occurred and left the locking files in a bad state.
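A minimal sketch of a write that cannot leak the lock, assuming the same PickAndPlace table: use ExecuteNonQuery for the INSERT and let using dispose everything in the right order.

void AddScore(string conn)
{
    // using guarantees the connection is closed even if the insert
    // throws, so no stale write lock is left on the database file.
    using (IDbConnection dbconn = new SqliteConnection(conn))
    {
        dbconn.Open();
        using (IDbCommand dbCmd = dbconn.CreateCommand())
        {
            // An INSERT returns no rows, so ExecuteNonQuery is the right call.
            dbCmd.CommandText = "INSERT INTO PickAndPlace (Id) VALUES (324)";
            dbCmd.ExecuteNonQuery();
        }
    } // Dispose closes the connection and releases the file lock
}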

"An operation is already in progress" on DataAdaper.Fill

I am trying Postgres Plus 9.5 with .NET 4.5 and the Npgsql 3.1.6 NuGet package.
I have read what has been posted here about this error, but I do not understand why I get it. Everything is disposed. Here is the code:
public override DataTable getActListData(int FunkNr)
{
    using (var cmd = new NpgsqlCommand())
    {
        cmd.CommandText = npgsqlCommand3.CommandText;
        cmd.Connection = this.npgsqlConnection;
        cmd.Parameters.Add(new Npgsql.NpgsqlParameter("ANWENDUNG", NpgsqlTypes.NpgsqlDbType.Numeric));
        cmd.Parameters.Add(new Npgsql.NpgsqlParameter("XFUNKNR", NpgsqlTypes.NpgsqlDbType.Numeric));
        using (var da = new NpgsqlDataAdapter(npgsqlCommand3))
        {
            var tab = new DataTable();
            da.SelectCommand.Parameters["ANWENDUNG"].Value = getAnwendung();
            da.SelectCommand.Parameters["XFUNKNR"].Value = FunkNr;
            da.Fill(tab); // Here is the error on the 5th call
            return tab;
        }
    }
}
Does this problem come from Npgsql or from Postgres?
Some other questions:
I have read here that lazy loading is impossible, but I didn't understand whether that is because of Npgsql or because of Postgres.
Is it possible in Postgres to open several cursors and read on demand on the same connection?
Edit: Changed the code:
using (var npgsqlConnection = new NpgsqlConnection())
{
    ConnectionString = string.Format(DataClientFactory.DataBaseConnectString, DB, User, PW);
    npgsqlConnection.ConnectionString = ConnectionString;
    npgsqlConnection.Open();
    ....
    the code above here
    ....
}
The same error in the same call. The error:
System.InvalidOperationException occurred
HResult=-2146233079
Message=An operation is already in progress.
Source=Npgsql
StackTrace:
    at Npgsql.NpgsqlConnector.StartUserAction(ConnectorState newState)
InnerException:
If the piece of code you posted runs concurrently on the same connection, then that's your problem: Npgsql connections aren't thread-safe, and it's not possible to have multiple readers open at the same time (MARS).
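A minimal sketch of that fix, assuming the ConnectionString built in the edit above: create the command and the connection locally, so concurrent calls never share state (and note the adapter wraps the local cmd, not the shared npgsqlCommand3).

public override DataTable getActListData(int FunkNr)
{
    // One connection per call: Npgsql pools the physical connections,
    // and nothing here is visible to any other thread.
    using (var conn = new NpgsqlConnection(ConnectionString))
    using (var cmd = new NpgsqlCommand(npgsqlCommand3.CommandText, conn))
    {
        cmd.Parameters.Add(new Npgsql.NpgsqlParameter("ANWENDUNG", NpgsqlTypes.NpgsqlDbType.Numeric));
        cmd.Parameters.Add(new Npgsql.NpgsqlParameter("XFUNKNR", NpgsqlTypes.NpgsqlDbType.Numeric));
        cmd.Parameters["ANWENDUNG"].Value = getAnwendung();
        cmd.Parameters["XFUNKNR"].Value = FunkNr;

        using (var da = new NpgsqlDataAdapter(cmd)) // wrap the local cmd
        {
            var tab = new DataTable();
            da.Fill(tab); // Fill opens and closes the connection itself
            return tab;
        }
    }
}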

Timeout Expired - Session - How to Fix

Hi all, and thanks in advance. I am trying to fix a method that inserts information into a database table. It currently times out because it runs in a while loop that takes too long to process all the contents. I know I could just increase the command timeout, but I don't think that solves the problem; I think the code is the problem, I'm just not certain what the correct fix is. I have access to Dapper. Would it be more efficient to write a method that passes the necessary variables and executes a quick, simple statement for each group, then fetches the next one? Or is that just perpetuating what's below in a different way? Should I move this out of the code and onto the server for better performance?
UPDATE Full error message:
Exception of type 'System.Web.HttpUnhandledException' was thrown.
File: c:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files\6caa4c91\19b853c6\App_Web_o3102kpb.9.cs
Method: ProcessRequest Line Number : 0
Inner Exception: {Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
File: z:\inetpub\wwwroot\SessionTransfer.aspx.cs
Method: AddSessionToDatabase Line Number : 94
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
File: z:\inetpub\wwwroot\SessionTransfer.aspx.cs Method: Page_Load Line Number : 33 }
Here is the original code:
SqlConnection con = new SqlConnection(connectionString);
SqlCommand cmd = new SqlCommand();
con.Open();
cmd.Connection = con;
int i = 0;
string strSql, guid = GetGuid();
string temp = "";
while (i < Session.Contents.Count)
{
    if (Session.Contents[i] == null)
        temp = "";
    else
    {
        if ((Session.Contents[i].ToString().Trim().Length) > 0)
            temp = Session.Contents[i].ToString().Replace("'", "''");
        else
            temp = "";
    }
    strSql = "INSERT INTO SessionTable (GUID, SessionKey, SessionValue) " +
             "VALUES ('" + guid + "', '" + Session.Contents.Keys[i].ToString() + "', '" + temp + "')";
    cmd.CommandText = strSql;
    cmd.ExecuteNonQuery();
    i++;
}
con.Close();
cmd.Dispose();
con.Dispose();
return guid;
UPDATE - FINAL SOLUTION:
var SessionList = new List<Session>();
while (i < Session.Contents.Count)
{
    string temp = "";
    if (Session.Contents[i] == null)
        temp = "";
    else
    {
        temp = (Session.Contents[i].ToString().Trim().Length) > 0 ? Session.Contents[i].ToString().Replace("'", "''") : "";
    }
    var s = new Session
    {
        TempGuid = guidTemp,
        Contents = Session.Contents[i] != null ? Session.Contents[i].ToString() : null,
        Temp = temp
    };
    SessionList.Add(s);
    i++;
}
mySession = SerializationUtilities.SerializeObjectToXML(SessionList);
using (var con = new SqlConnection())
{
    con.ExecuteHGW("Transfer", new { mySession }, commandType: CommandType.StoredProcedure);
}
Then on the SQL side I just put the XML into a table and did one single insert statement against it; the time is significantly improved.
I would like to suggest two improvements:
1) In the loop you are executing insert statements one by one against the db. It takes a lot of time to send each query, execute it, and return the result. It is much better to batch them: gather, say, 10,000 insert statements, build the whole instruction with a StringBuilder, and execute it against the db in one go. This will really increase the speed of your app. Choose the number of insert statements per batch yourself, based on system tests (see the sketch after this list).
2) If the timeout problem still occurs after applying hint 1), I would suggest two possible solutions:
a) Do not send all elements to be processed against the db to the web service at once; instead, as previously, batch them (apply a second level of batching). For instance, send 50,000 elements to the web service to be inserted, wait for confirmation from the web service, and then proceed with the next batch. The big advantage is that you can easily show the user a progress bar with the current state of the operation.
b) Send all items to be processed against the db at once, but do not wait for the result. In your app just show that the items are being processed, and every 10 s send the web service a request asking whether the job is finished. When it is finished, signal it to the user.
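A minimal sketch of the batching idea from 1), staying close to the original code (values escaped by doubling single quotes, as before); maxBatch is an assumption to tune via those system tests:

var sb = new StringBuilder();
int batchSize = 0;
const int maxBatch = 1000; // tune based on system tests

using (var con = new SqlConnection(connectionString))
{
    con.Open();
    for (int i = 0; i < Session.Contents.Count; i++)
    {
        string temp = Session.Contents[i] == null
            ? ""
            : Session.Contents[i].ToString().Replace("'", "''");

        sb.AppendFormat(
            "INSERT INTO SessionTable (GUID, SessionKey, SessionValue) VALUES ('{0}', '{1}', '{2}');",
            guid, Session.Contents.Keys[i], temp);

        // One round trip per maxBatch statements instead of one per row.
        if (++batchSize == maxBatch)
        {
            using (var cmd = new SqlCommand(sb.ToString(), con))
                cmd.ExecuteNonQuery();
            sb.Clear();
            batchSize = 0;
        }
    }
    if (sb.Length > 0) // flush the remainder
        using (var cmd = new SqlCommand(sb.ToString(), con))
            cmd.ExecuteNonQuery();
}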

NpgsqlCopyIn fails by timeout ("CommandTimeout" setting ignored)

I have a fairly large dataset (900K records, 140 MB on disk) stored in a CSV file in a client app (.NET 4.0). I need to load this data into a Postgres 9 db the fastest way possible. I use the Npgsql "NpgsqlCopyIn" technique (Npgsql library version 2.1.0).
For a probe load (138K rows) the insertion works fine; it takes about 7 seconds.
But for the whole batch (900K), the code throws a timeout exception:
"ERROR: 57014: canceling statement due to statement timeout"
The stack trace is:
at Npgsql.NpgsqlState.d_9.MoveNext()
at Npgsql.NpgsqlState.ProcessAndDiscardBackendResponses(NpgsqlConnector context)
at Npgsql.NpgsqlCopyInState.SendCopyDone(NpgsqlConnector context)
at Npgsql.NpgsqlCopyInState.StartCopy(NpgsqlConnector context, NpgsqlCopyFormat copyFormat)
at Npgsql.NpgsqlState.d_9.MoveNext()
at Npgsql.NpgsqlState.ProcessAndDiscardBackendResponses(NpgsqlConnector context)
at Npgsql.NpgsqlConnector.ProcessAndDiscardBackendResponses()
at Npgsql.NpgsqlCommand.ExecuteBlind()
at Npgsql.NpgsqlCopyIn.Start()
I tried setting CommandTimeout to large values (>7200) and to zero; I tried the same values for the connection "Timeout" parameter. I also tried setting "CommandTimeout" via the connection string, but still no result: "ERROR 57014" comes up again and again.
Please help me load the batch correctly!
Here is the code I use:
private static void pgBulkCopy(string connection_string, FileInfo fiDataFile)
{
    using (Npgsql.NpgsqlConnection con = new Npgsql.NpgsqlConnection(connection_string))
    using (FileStream ifs = new FileStream(fiDataFile.FullName, FileMode.Open, FileAccess.Read))
    {
        con.Open();
        string queryString = "COPY schm.Addresses(FullAddress,lat,lon) FROM STDIN;";
        NpgsqlCommand cmd = new NpgsqlCommand(queryString, con);
        cmd.CommandTimeout = 7200; // 7200 sec, 120 min, 2 hours
        NpgsqlCopyIn copyIn = new NpgsqlCopyIn(cmd, con, ifs);
        try
        {
            copyIn.Start();
            copyIn.End();
        }
        catch (Exception ex)
        {
            Console.WriteLine("[DB] pgBulkCopy error: " + ex.Message);
        }
        finally
        {
            con.Close();
        }
    }
}
Npgsql has a bug regarding command timeout and NpgsqlCopyIn handling.
You may test our current master, where we have a lot of fixes for command timeout handling.
You can download a copy of the project from our GitHub page: https://github.com/npgsql/Npgsql/archive/master.zip
Please give it a try and let us know if it works for you.
