I have a fairly simple WPF application that uses Entity Framework. The main page of the application has a list of records that I am getting from a database on startup.
Each record has a picture, so the operation can be a little slow when the wireless signal is poor. I'd like this (and many of my SQL operations) to run in the background if possible. I have async/await set up, and at first it seemed to be working exactly as I wanted, but now I'm seeing that my application becomes unresponsive when accessing the DB.
Eventually I'm thinking I'm going to load up the text in one query and the images in another background operation and load them as they come in. This way I get the important stuff right away and the pictures can come in in the background, but the way things are going it's still looking like it will lock up if I do this.
On top of that, I'm trying to implement something to handle connectivity issues (in case the Wi-Fi cuts out momentarily) so that the application notifies the user of a connection issue, automatically retries a few times, etc. I put in a try/catch for SqlException, which seems to be working for me, but the whole application locks up for about a minute while it is trying to connect to the DB.
I tried testing my async/await using await Task.Delay() and everything is very responsive as expected while awaiting the delay, but everything locks up when awaiting the .ToListAsync(). Is this normal and expected? My understanding of async/await is pretty limited.
My code is kind of messy (I'm new), but it does what I need it to do for the most part. I understand there are probably plenty of improvements I can make and better ways to do things, but one step at a time here. My main goal right now is to keep the application from crashing on database access exceptions and to keep the user informed of what the application is doing (searching, trying to access the DB, unable to reach the DB and retrying, etc.), as opposed to appearing frozen, which is what they're going to think when they see it unresponsive for over a minute.
Some of my code:
In my main view model
DataHelper data = new DataHelper();
private async void GetQualityRegisterQueueAsync()
{
try
{
var task = data.GetQualityRegisterAsync();
IsSearching = true;
await task;
IsSearching = false;
QualityRegisterItems = new ObservableCollection<QualityRegisterQueue>(task.Result);
OrderQualityRegisterItems();
}
catch (M1Exception ex)
{
Debug.WriteLine(ex.Message);
Debug.WriteLine("QualityRegisterLogViewModel.GetQualityRegisterQueue() Operation Failed");
}
}
My Data Helper Class
public class DataHelper
{
private bool debugging = false;
private const int MAX_RETRY = 2;
private const double LONG_WAIT_SECONDS = 5;
private const double SHORT_WAIT_SECONDS = 0.5;
private static readonly TimeSpan longWait = TimeSpan.FromSeconds(LONG_WAIT_SECONDS);
private static readonly TimeSpan shortWait = TimeSpan.FromSeconds(SHORT_WAIT_SECONDS);
private enum RetryableSqlErrors
{
ServerNotFound = -1,
Timeout = -2,
NoLock = 1204,
Deadlock = 1205,
}
public async Task<List<QualityRegisterQueue>> GetQualityRegisterAsync()
{
if(debugging) await Task.Delay(5000);
var retryCount = 0;
using (M1Context m1 = new M1Context())
{
for (; ; )
{
try
{
return await (from a in m1.QualityRegisters
where (a.qanClosed == 0)
//orderby a.qanAssignedDate descending, a.qanOpenedDate
orderby a.qanAssignedDate.HasValue descending, a.qanAssignedDate, a.qanOpenedDate
select new QualityRegisterQueue
{
QualityRegisterID = a.qanQualityRegisterID,
JobID = a.qanJobID.Trim(),
JobAssemblyID = a.qanJobAssemblyID,
JobOperationID = a.qanJobOperationID,
PartID = a.qanPartID.Trim(),
PartRevisionID = a.qanPartRevisionID.Trim(),
PartShortDescription = a.qanPartShortDescription.Trim(),
OpenedByEmployeeID = a.qanOpenedByEmployeeID.Trim(),
OpenedByEmployeeName = a.OpenedEmployee.lmeEmployeeName.Trim(),
OpenedDate = a.qanOpenedDate,
PartImage = a.JobAssembly.ujmaPartImage,
AssignedDate = a.qanAssignedDate,
AssignedToEmployeeID = a.qanAssignedToEmployeeID.Trim(),
AssignedToEmployeeName = a.AssignedEmployee.lmeEmployeeName.Trim()
}).ToListAsync();
}
catch (SqlException ex)
{
Debug.WriteLine("SQL Exception number = " + ex.Number);
if (!Enum.IsDefined(typeof(RetryableSqlErrors), ex.Number))
throw new M1Exception(ex.Message, ex);
retryCount++;
if (retryCount > MAX_RETRY) throw new M1Exception(ex.Message, ex);
Debug.WriteLine("Retrying. Count = " + retryCount);
Thread.Sleep(ex.Number == (int)RetryableSqlErrors.Timeout ?
longWait : shortWait);
}
}
}
}
}
Edit: Mostly looking for general guidance here, though a specific example of what to do would be great. For these types of operations where I am downloading data, is it just a given that if I need the application to be responsive I need to be making multiple threads? Is that a common solution to this type of problem? Is this not something I should be expecting async/await to solve?
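For reference, here is a minimal sketch of the "multiple threads" option being asked about, reusing the view model and DataHelper names from above (one common workaround, not necessarily the only fix): push the whole EF call onto a thread-pool thread with Task.Run, so the UI thread stays free even when parts of the query pipeline run synchronously.
private async void GetQualityRegisterQueueAsync()
{
    try
    {
        IsSearching = true;
        // Task.Run moves the EF query (including any synchronous parts of
        // ToListAsync, connection opening, and the retry waits) off the UI thread.
        var items = await Task.Run(() => data.GetQualityRegisterAsync());
        QualityRegisterItems = new ObservableCollection<QualityRegisterQueue>(items);
        OrderQualityRegisterItems();
    }
    catch (M1Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
    finally
    {
        IsSearching = false;
    }
}
A related detail: Thread.Sleep inside the retry loop blocks whichever thread is running the query, so swapping it for await Task.Delay(...) would keep the retry waits from tying up a thread as well.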
If you call this method from your UI thread, every await captures the UI thread's synchronization context and marshals its continuation back onto that thread. Your service call also won't necessarily be "performant", because each continuation has to wait until the UI thread is free before it can continue.
The solution is simple: just add ConfigureAwait(false) to the awaited call:
.ToListAsync().ConfigureAwait(false);
I hope it helps
Related
FIRST! I know very similar questions have been asked before; however, the answers have been so case-specific that I seriously don't understand how to apply them. Please don't roast me.
All I want is to build a rather generic application that has CRUD modules and reporting (C# and MySQL), nothing special. However, every time my app runs a line that calls a method querying the server, the UI freezes for a few ms, and that freeze-like stuttering of the UI feels incredibly annoying.
As I understand it, every DB call needs to be done on a different thread, with a BackgroundWorker or a Task, so the main thread doesn't get busy and make the UI unresponsive. BUT whenever I try to do this I run into issues because I can't access the UI components.
Can you guys help me with this simple login form as an example, please?
I have a simple form, 2 textboxes, txtUser and txtPassword and a button with the following code:
MySqlDataReader? user = MdlUsers.GetUserByUsername(txtUser.Text);
if (user != null && user.Read())
{
string? pwd = user["clave"].ToString();
if (pwd != null && pwd.Equals(txtPassword.Text))
{
MessageBox.Show("Acceso correcto.");
logged = true;
}
else
{
MessageBox.Show("Acceso denegado.");
logged = false;
}
}
else
{
MessageBox.Show("No regresó nada.");
logged = true;
}
This is the code to the MdlUser class:
public class MdlUsers
{
public static MySqlDataReader? GetUserByUsername(string username)
{
MySqlDataReader? result = null;
try
{
string qry = "SELECT * FROM usuarios WHERE login = @username";
MySqlCommand cmd = new MySqlCommand(qry, MdlConnection.Connect());
cmd.Parameters.AddWithValue("@username", username);
result = cmd.ExecuteReader();
return result;
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
return null;
}
}
}
Whenever I click the button the UI stutters. So how can I handle this? Thank you very much.
The way to handle any situation of that sort is to perform your long blocking operation on a separate task, and when it finishes, post a call to the GUI thread.
You have not told us what kind of GUI framework you are using (that's what tags are for on Stack Overflow); I suspect it is WinForms, so the way to do this in WinForms is with Control.BeginInvoke().
When you specify an Action to be invoked with Control.BeginInvoke(), that Action will run on the GUI thread, so it will be able to safely "access the UI components."
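A rough sketch of that pattern, assuming WinForms and reusing the question's MdlUsers.GetUserByUsername call, txtUser/txtPassword text boxes, and logged field (the handler name btnLogin_Click is made up for illustration):
private void btnLogin_Click(object sender, EventArgs e)
{
    // Read UI values on the GUI thread before starting the background work.
    string username = txtUser.Text;
    string password = txtPassword.Text;

    Task.Run(() =>
    {
        // The blocking database call runs on a thread-pool thread.
        bool accessGranted = false;
        MySqlDataReader? user = MdlUsers.GetUserByUsername(username);
        if (user != null && user.Read())
        {
            string? pwd = user["clave"].ToString();
            accessGranted = pwd != null && pwd.Equals(password);
        }

        // Post the result back to the GUI thread before touching any controls.
        this.BeginInvoke(new Action(() =>
        {
            logged = accessGranted;
            MessageBox.Show(accessGranted ? "Acceso correcto." : "Acceso denegado.");
        }));
    });
}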
I'm fairly new to databases and asynchronous programming. I'm making a POS app that will eventually have hundreds of customers and possibly thousands of transactions. When I do a customer search or look up a previous ticket, I don't want my program to hang waiting on the results.
Here is the method that shows the search result:
private void searchCritiria_TextChanging(TextBox sender, TextBoxTextChangingEventArgs args)
{
FilteredCustomer.Clear();
if(searchCritiria.Text.Length >= 3)
{
SQLiteConnection dbConnection = new SQLiteConnection("Customers.db");
string sSQL = null;
sSQL = @"SELECT [first],[last],[spouse],[home],[work],[cell] FROM Customers";
ISQLiteStatement dbState = dbConnection.Prepare(sSQL);
while (dbState.Step() == SQLiteResult.ROW)
{
string sFirst = dbState["first"] as string;
string sLast = dbState["last"] as string;
string sSpouse = dbState["spouse"] as string;
string sHome = dbState["home"] as string;
string sWork = dbState["work"] as string;
string sCell = dbState["cell"] as string;
//Load into observable collection
if (searchType.SelectedIndex == 0)//name search
{
if(sFirst.Contains(searchCritiria.Text) || sLast.Contains(searchCritiria.Text) || sSpouse.Contains(searchCritiria.Text))
FilteredCustomer.Add(new Customer {first = sFirst, last = sLast, spouse = sSpouse, home = sHome, work = sWork, cell = sCell});
}
else//number search
{
if(sWork.Contains(searchCritiria.Text)|| sHome.Contains(searchCritiria.Text) || sCell.Contains(searchCritiria.Text))
FilteredCustomer.Add(new Customer { first = sFirst, last = sLast, spouse = sSpouse, home = sHome, work = sWork, cell = sCell });
}
}
}
}
Throughout my program I have void methods that are structured similarly to this.
I'm not sure how to solve this issue. I tried doing some research but had no success. Any advice would be greatly appreciated!
Edit
As correctly pointed out by @AndriyK, the answer would not have made your code asynchronous.
What it means:
Although the code works and you're able to push in data, if you ever hit a situation where two threads try to write to the database at the same time (a "database is locked" condition), your application would still freeze until the database is no longer locked or the call times out. This means any loaders you are showing would freeze as well.
Making it asynchronous
To do so, run your SQLite code inside a Task.Run() call. This ensures the database work runs on a different thread and your UI never hangs. It also allows you to show a ProgressRing (or another loader) to the user while the application waits for the database to unlock. Below is the code:
public void PerformSQLTasks()
{
// your SQL code here!
}
and run the above code by using:
Task.Run(() => PerformSQLTasks());
// or even
Task.Run(PerformSQLTasks);
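Put together with the question's handler, that could look roughly like this (a sketch; QueryCustomers is a hypothetical method that wraps the SQLite query from the question and returns a plain List<Customer> instead of touching FilteredCustomer directly):
private async void searchCritiria_TextChanging(TextBox sender, TextBoxTextChangingEventArgs args)
{
    string text = searchCritiria.Text;
    FilteredCustomer.Clear();
    if (text.Length < 3)
        return;

    // The SQLite work runs on a thread-pool thread, so the UI stays responsive.
    List<Customer> matches = await Task.Run(() => QueryCustomers(text));

    // Execution resumes on the UI thread here, so the ObservableCollection
    // can be updated safely.
    foreach (var customer in matches)
        FilteredCustomer.Add(customer);
}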
Accepted Answer:
To make your UI not hang, you can make your method return a Task instead of void so that it is awaitable. This will not make your code async, but it does allow the caller to await the method's completion. To achieve this, use the method signature below:
public Task PerformSQLTasks()
{
// your code here!
return Task.CompletedTask;
}
and you can call it by:
private async void searchCritiria_TextChanging(TextBox sender, TextBoxTextChangingEventArgs args)
{
await PerformSQLTasks();
}
Remember, this alone won't make your code asynchronous.
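To actually get the work off the UI thread, the two ideas combine: keep an awaitable signature, but let the body run on the thread pool (a sketch reusing PerformSQLTasks from above):
public Task PerformSQLTasksAsync()
{
    // The SQLite work now runs on a thread-pool thread,
    // so awaiting this no longer blocks the UI.
    return Task.Run(() => PerformSQLTasks());
}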
I need to test if there's any memory leak in our application and monitor to see if memory usage increases too much while processing the requests.
I'm trying to develop some code to make multiple simultaneous calls to our api/webservice method. This api method is not asynchronous and takes some time to complete its operation.
I've done a lot of research on Tasks, Threads and Parallelism, but so far I've had no luck. The problem is that even after trying all the solutions below, the result is always the same: it appears to process only two requests at a time.
Tried:
-> Creating tasks inside a simple for loop and starting them with and without setting them with TaskCreationOptions.LongRunning
-> Creating threads inside a simple for loop and starting them with and without high priority
-> Creating a list of actions on a simple for loop and starting them using
Parallel.ForEach(list, options, item => item.Invoke())
-> Running directly inside a Parallel.For loop (below)
-> Running TPL methods with and without Options and TaskScheduler
-> Tried with different values for MaxParallelism and maximum threads
-> Checked this post too, but it didn't help either. (Could I be missing something?)
-> Checked some other posts here on Stack Overflow, but they had F# solutions that I don't know how to properly translate to C#. (I've never used F#...)
(Task Scheduler class taken from msdn)
Here's the basic structure that I have:
public class Test
{
Data _data;
String _url;
public Test(Data data, string url)
{
_data = data;
_url = url;
}
public ReturnData Execute()
{
ReturnData returnData;
using(var ws = new WebService())
{
ws.Url = _url;
ws.Timeout = 600000;
var wsReturn = ws.LongRunningMethod(data);
// Basically convert wsReturn to my method return, with some logic if/else etc
}
return returnData;
}
}
sealed class ThreadTaskScheduler : TaskScheduler, IDisposable
{
// The runtime decides how many tasks to create for the given set of iterations, loop options, and scheduler's max concurrency level.
// Tasks will be queued in this collection
private BlockingCollection<Task> _tasks = new BlockingCollection<Task>();
// Maintain an array of threads. (Feel free to bump up _n.)
private readonly int _n = 100;
private Thread[] _threads;
public ThreadTaskScheduler()
{
_threads = new Thread[_n];
// Create unstarted threads based on the same inline delegate
for (int i = 0; i < _n; i++)
{
_threads[i] = new Thread(() =>
{
// The following loop blocks until items become available in the blocking collection.
// Then one thread is unblocked to consume that item.
foreach (var task in _tasks.GetConsumingEnumerable())
{
TryExecuteTask(task);
}
});
// Start each thread
_threads[i].IsBackground = true;
_threads[i].Start();
}
}
// This method is invoked by the runtime to schedule a task
protected override void QueueTask(Task task)
{
_tasks.Add(task);
}
// The runtime will probe if a task can be executed in the current thread.
// By returning false, we direct all tasks to be queued up.
protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
{
return false;
}
public override int MaximumConcurrencyLevel { get { return _n; } }
protected override IEnumerable<Task> GetScheduledTasks()
{
return _tasks.ToArray();
}
// Dispose is not thread-safe with other members.
// It may only be used when no more tasks will be queued
// to the scheduler. This implementation will block
// until all previously queued tasks have completed.
public void Dispose()
{
if (_threads != null)
{
_tasks.CompleteAdding();
for (int i = 0; i < _n; i++)
{
_threads[i].Join();
_threads[i] = null;
}
_threads = null;
_tasks.Dispose();
_tasks = null;
}
}
}
And the test code itself:
private void button2_Click(object sender, EventArgs e)
{
var maximum = 100;
var options = new ParallelOptions
{
MaxDegreeOfParallelism = 100,
TaskScheduler = new ThreadTaskScheduler()
};
// To prevent UI blocking
Task.Factory.StartNew(() =>
{
Parallel.For(0, maximum, options, i =>
{
var data = new Data();
// Fill data
var test = new Test(data, _url); //_url is pre-defined
var ret = test.Execute();
// Check return and display on screen
var now = DateTime.Now.ToString("HH:mm:ss");
var newText = $"{Environment.NewLine}[{now}] - {ret.ReturnId}) {ret.ReturnDescription}";
AppendTextBox(newText, ref resultTextBox);
});
});
}
public void AppendTextBox(string value, ref TextBox textBox)
{
if (InvokeRequired)
{
this.Invoke(new ActionRef<string, TextBox>(AppendTextBox), value, textBox);
return;
}
textBox.Text += value;
}
And the result that I get is basically this:
[10:08:56] - (0) OK
[10:08:56] - (0) OK
[10:09:23] - (0) OK
[10:09:23] - (0) OK
[10:09:49] - (0) OK
[10:09:50] - (0) OK
[10:10:15] - (0) OK
[10:10:16] - (0) OK
etc
As far as I know there's no limitation on the server side. I'm relatively new to the Parallel/Multitasking world. Is there any other way to do this? Am I missing something?
(I simplified all the code for clarity and I believe the provided code is enough to illustrate the scenarios mentioned. I also didn't post the application code, but it's a simple WinForms screen just to make the calls and show results. If any other code is relevant, please let me know and I can edit and post it too.)
Thanks in advance!
EDIT1: I checked the server logs and it is receiving the requests two at a time, so the issue is indeed on the sending side, not the receiving side.
Could it be a network problem/limitation related to how the framework manages the requests/connections? Or something with the network at all (unrelated to .net)?
EDIT2: Forgot to mention, it's a SOAP webservice.
EDIT3: One of the properties that I send (inside data) needs to change for each request.
EDIT4: I noticed that there's always an interval of ~25 seconds between each pair of requests, in case it's relevant.
I would recommend not to reinvent the wheel and just use one of the existing solutions:
Most obvious choice: if your Visual Studio license allows it, you can use the MS Load Testing Framework; most likely you won't even have to write a single line of code. See How to: Create a Web Service Test.
SoapUI is a free and open-source web services testing tool; it has some limited load testing capabilities.
If for some reason SoapUI is not suitable (e.g. you need to run load tests in clustered mode from several hosts, or you need more advanced reporting), you can use Apache JMeter - a free and open-source multiprotocol load testing tool which supports web services load testing as well.
A good way to create load tests without writing your own project is to use this service: https://loader.io/targets
It is free for small tests; you can send POST parameters and headers, and you get nice reporting.
Isn't the "two requests at a time" behavior simply the result of the default maxconnection = 2 limit in connectionManagement?
<configuration>
<system.net>
<connectionManagement>
<add address = "http://www.contoso.com" maxconnection = "4" />
<add address = "*" maxconnection = "2" />
</connectionManagement>
</system.net>
</configuration>
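If editing the config file isn't convenient, the same default limit can also be raised in code at startup (a sketch; 100 is an arbitrary value chosen to match the test's degree of parallelism):
// Run once at startup, before the first request goes out.
System.Net.ServicePointManager.DefaultConnectionLimit = 100;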
My favorite load testing library is NBomber. It has an easy and powerful API, realistic user simulations, and provides you with nice HTML reports about latency and requests per second.
I used it to test my API and wrote an article about how I did it.
I have a service layer project in an ASP.NET MVC 5 application I am creating on .NET 4.5.2, which calls out to an external third-party WCF service to get information asynchronously. One of the original methods that calls the external service is below (there are 3 of these in total, all similar, which I call in order from my GetInfoFromExternalService method; note it isn't actually called that - I'm just naming it for illustration):
private async Task<string> GetTokenIdForCarsAsync(Car[] cars)
{
try
{
if (_externalpServiceClient == null)
{
_externalpServiceClient = new ExternalServiceClient("WSHttpBinding_IExternalService");
}
string tokenId= await _externalpServiceClient .GetInfoForCarsAsync(cars).ConfigureAwait(false);
return tokenId;
}
catch (Exception ex)
{
//TODO plug in log 4 net
throw new Exception("Failed" + ex.Message);
}
finally
{
CloseExternalServiceClient(_externalpServiceClient);
_externalpServiceClient= null;
}
}
So that meant that when each async call completed, the finally block ran - the WCF client was closed and set to null, and then newed up again when another request was made. This was working fine until a change needed to be made: if the number of cars passed in by the user exceeds 1000, I call a split function and then invoke my GetInfoFromExternalService method for each batch of 1000 inside a WhenAll - as below:
if (cars.Count > 1000)
{
const int packageSize = 1000;
var packages = SplitCarss(cars, packageSize);
//kick off the number of split packages we got above in Parallel and await until they all complete
await Task.WhenAll(packages.Select(GetInfoFromExternalService));
}
However, this now falls over: if I have 3000 cars, the call to GetTokenId news up the WCF client, but the finally block closes it, so the second batch of 1000 throws an exception. If I remove the finally block the code works OK - but it is obviously not good practice to leave the WCF client unclosed.
I tried putting the cleanup after the if/else block where cars.Count is evaluated - but if a user uploads, say, 2000 cars and that completes in about a minute, in the meantime the user still has control of the web page and could upload another 2000 (or another user could upload), and again it falls over with an exception.
Is there a good way anyone can see to correctly close the External Service Client?
Based on your related question, your "split" logic doesn't seem to give you what you're trying to achieve. WhenAll still executes the requests in parallel, so you may end up running more than 1000 requests at any given moment. Use SemaphoreSlim to throttle the number of simultaneously active requests and limit that number to 1000. This way, you don't need to do any splits.
Another issue might be in how you handle the creation/disposal of the ExternalServiceClient client. I suspect there might be a race condition there.
Lastly, when you re-throw from the catch block, you should at least include a reference to the original exception.
Here's how to address these issues (untested, but should give you the idea):
const int MAX_PARALLEL = 1000;
SemaphoreSlim _semaphoreSlim = new SemaphoreSlim(MAX_PARALLEL);
volatile int _activeClients = 0;
readonly object _lock = new Object();
ExternalServiceClient _externalpServiceClient = null;
ExternalServiceClient GetClient()
{
lock (_lock)
{
if (_activeClients == 0)
_externalpServiceClient = new ExternalServiceClient("WSHttpBinding_IExternalService");
_activeClients++;
return _externalpServiceClient;
}
}
void ReleaseClient()
{
lock (_lock)
{
_activeClients--;
if (_activeClients == 0)
{
_externalpServiceClient.Close();
_externalpServiceClient = null;
}
}
}
private async Task<string> GetTokenIdForCarsAsync(Car[] cars)
{
var client = GetClient();
try
{
await _semaphoreSlim.WaitAsync().ConfigureAwait(false);
try
{
string tokenId = await client.GetInfoForCarsAsync(cars).ConfigureAwait(false);
return tokenId;
}
catch (Exception ex)
{
//TODO plug in log 4 net
throw new Exception("Failed" + ex.Message, ex);
}
finally
{
_semaphoreSlim.Release();
}
}
finally
{
ReleaseClient();
}
}
Updated based on the comment:
the External WebService company can accept me passing up to 5000 car objects in one call - though they recommend splitting into batches of 1000 and running up to 5 in parallel at a time - so when I mention 7000 I don't mean GetTokenIdForCarAsync would be called 7000 times; with my code currently it should be called 7 times, i.e. giving me back 7 token ids. I am wondering, can I use your SemaphoreSlim to run the first 5 in parallel and then the remaining 2?
The changes are minimal (but untested). First:
const int MAX_PARALLEL = 5;
Then, using Marc Gravell's ChunkExtension.Chunkify, we introduce GetAllTokenIdForCarsAsync, which in turn will be calling GetTokenIdForCarsAsync from above:
private async Task<string[]> GetAllTokenIdForCarsAsync(Car[] cars)
{
var results = new List<string>();
var chunks = cars.Chunkify(1000);
var tasks = chunks.Select(chunk => GetTokenIdForCarsAsync(chunk)).ToArray();
await Task.WhenAll(tasks);
return tasks.Select(task => task.Result).ToArray();
}
Now you can pass all 7000 cars into GetAllTokenIdForCarsAsync. This is a skeleton; it can be improved with some retry logic in case any of the batch requests fails (I'm leaving that up to you).
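Chunkify itself isn't shown here; a minimal version of such an extension (my own sketch, not necessarily Marc Gravell's exact implementation) could look like this:
public static class ChunkExtension
{
    // Splits an array into consecutive chunks of at most chunkSize elements.
    public static IEnumerable<T[]> Chunkify<T>(this T[] source, int chunkSize)
    {
        for (int i = 0; i < source.Length; i += chunkSize)
        {
            int length = Math.Min(chunkSize, source.Length - i);
            var chunk = new T[length];
            Array.Copy(source, i, chunk, 0, length);
            yield return chunk;
        }
    }
}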
The code that throws the exception is extremely simple - it's a very ordinary insert-then-SubmitChanges statement that looks like this:
context.tb_dayErrorLog.InsertOnSubmit(data);
context.SubmitChanges();
So really nothing special. This statement is executed about 50 thousand times a day without any problem, but about 6-10 times a day it fails with:
The operation cannot be performed during a call to SubmitChanges.
StackTrace: at System.Data.Linq.DataContext.CheckNotInSubmitChanges()
at System.Data.Linq.Table`1.InsertOnSubmit(TEntity entity)
I've been trying to figure out what could cause this, but I can't find a clue.
This behavior is, to put it politely, not very deterministic - how can it finish correctly 50k times and fail only a few times?
The DataContext was initially created as a static instance and then reused for all calls, so I thought maybe that was the problem. Then I changed it to be initialized on every call, but the results are quite similar: still a few exceptions a day.
Any idea?
some additions:
function looks like:
public override bool Log(ErrorLogData logData)
{
try
{
logData.ProcessID = _processID;
//Create new log dataset
var data = new DataRecord
{
application = logData.Application,
date = DateTime.Now,
Other = logData.Other,
process = logData.ProcessName,
processid = logData.ProcessID,
severity = logData.Severity,
username = logData.UserName,
Type = (short)logData.ErrorType
};
var context = new DataContext(ConnectionString);
context.tb_dayErrorLog.InsertOnSubmit(data);
context.SubmitChanges();
}
catch (Exception ex)
{
//log log in eventviewer
LogEvent(logData.ToString(), ex);
return false;
}
return true;
}
so simple record initialization and then insert.
As I wrote in the comment, when I do the same thing with ADO.NET and SqlCommand this problem does not occur...
So my curiosity makes me wonder: why?
This sounds like a threading issue where you are calling Log and hence SubmitChanges on one thread when another thread is in the middle of SubmitChanges.
I suspect your DataContext is still a global static variable.
Try changing your Log method to
using (var context = new DataContext(ConnectionString))
{
context.tb_dayErrorLog.InsertOnSubmit(data);
context.SubmitChanges();
}
@SgMoore points to a concurrency problem, and in my case it really was one. If that's the case, another approach is to use a lock, like this:
private static readonly object dbLock = new object(); // lock on a dedicated object, not on a string (interned strings are shared)
lock (dbLock)
{
context.tb_dayErrorLog.InsertOnSubmit(data);//UPDATE: concurrency error can occur here too
context.SubmitChanges();
}