Async Web Service call not consistently asynchronous - c#

I'm having issues creating an asynchronous web service using the Task Parallel Library with ASP.NET Web API 2. I make an asynchronous call to a method StartAsyncTest and create a cancellation token to abort the method. I store the token globally and then retrieve it and cancel it from a second method, CancelAsyncTest. Here is the code:
// Private Global Dictionary to hold text search tokens
private static Dictionary<string, CancellationTokenSource> TextSearchLookup
= new Dictionary<string, CancellationTokenSource>();
/// <summary>
/// Performs an asynchronous test using a Cancellation Token
/// </summary>
[Route("StartAsyncTest")]
[HttpGet]
public async Task<WsResult<long>> StartAsyncTest(string sSearchId)
{
Log.Debug("Method: StartAsyncTest; ID: " + sSearchId + "; Message: Entering...");
WsResult<long> rWsResult = new WsResult<long>
{
Records = -1
};
try
{
var rCancellationTokenSource = new CancellationTokenSource();
{
var rCancellationToken = rCancellationTokenSource.Token;
// Set token right away in TextSearchLookup
TextSearchLookup.Add("SyncTest-" + sSearchId, rCancellationTokenSource);
HttpContext.Current.Session["SyncTest-" + sSearchId] =
rCancellationTokenSource;
try
{
// Start a New Task which has the ability to be cancelled
var rHttpContext = (HttpContext)HttpContext.Current;
await Task.Factory.StartNew(() =>
{
HttpContext.Current = rHttpContext;
int? nCurrentId = Task.CurrentId;
StartSyncTest(sSearchId, rCancellationToken);
}, TaskCreationOptions.LongRunning);
}
catch (OperationCanceledException e)
{
Log.Debug("Method: StartAsyncText; ID: " + sSearchId
+ "; Message: Cancelled!");
}
}
}
catch (Exception ex)
{
rWsResult.Result = "ERROR";
if (string.IsNullOrEmpty(ex.Message) == false)
{
rWsResult.Message = ex.Message;
}
}
// Remove token from Dictionary
TextSearchLookup.Remove(sSearchId);
HttpContext.Current.Session[sSearchId] = null;
return rWsResult;
}
private void StartSyncTest(string sSearchId, CancellationToken rCancellationToken)
{
// Spin for 1100 seconds
for (var i = 0; i < 1100; i++)
{
if (rCancellationToken.IsCancellationRequested)
{
rCancellationToken.ThrowIfCancellationRequested();
}
Log.Debug("Method: StartSyncTest; ID: " + sSearchId
+ "; Message: Wait Pass #" + i + ";");
Thread.Sleep(1000);
}
TextSearchLookup.Remove("SyncTest-" + sSearchId);
HttpContext.Current.Session.Remove("SyncTest-" + sSearchId);
}
[Route("CancelAsyncTest")]
[HttpGet]
public WsResult<bool> CancelAsyncTest(string sSearchId)
{
Log.Debug("Method: CancelAsyncTest; ID: " + sSearchId
+ "; Message: Cancelling...");
WsResult<bool> rWsResult = new WsResult<bool>
{
Records = false
};
CancellationTokenSource rCancellationTokenSource =
(CancellationTokenSource)HttpContext.Current.Session["SyncTest-" + sSearchId];
// Session doesn't always persist values. Use TextSearchLookup as backup
if (rCancellationTokenSource == null)
{
rCancellationTokenSource = TextSearchLookup["SyncTest-" + sSearchId];
}
if (rCancellationTokenSource != null)
{
rCancellationTokenSource.Cancel();
TextSearchLookup.Remove("SyncTest-" + sSearchId);
HttpContext.Current.Session.Remove("SyncTest-" + sSearchId);
rWsResult.Result = "OK";
rWsResult.Message = "Cancel delivered successfully!";
}
else
{
rWsResult.Result = "ERROR";
rWsResult.Message = "Reference unavailable to cancel task"
+ " (if it is still running)";
}
return rWsResult;
}
After I deploy this to IIS, the first time I call StartAsyncTest and then CancelAsyncTest (via the REST endpoints), both requests go through and the task cancels as expected. However, the second time, the CancelAsyncTest request just hangs, and the method is only invoked after StartAsyncTest completes (after 1100 seconds). I don't know why this occurs. StartAsyncTest seems to hijack all threads after it's called once. I appreciate any help anyone can provide!

I store the token globally and then retrieve it and cancel it from a second method, CancelAsyncTest.
This is probably not a great idea. You can store these tokens "globally", but that's only "global" to a single server. This approach would break as soon as a second server enters the picture.
That said, HttpContext.Current shouldn't be assigned to, ever. This is most likely the cause of the odd behavior you're seeing. Also, if your real code is more complex than ThrowIfCancellationRequested - i.e., if it's actually listening to the CancellationToken - then the call to Cancel can execute the remainder of StartSyncTest from within the call to Cancel, which would cause considerable confusion over the value of HttpContext.Current.
To summarize:
I recommend doing away with this approach completely; it won't work at all on web farms. Instead, keep your "task state" in an external storage system like a database (a rough sketch follows below).
Don't pass HttpContext across threads.
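As a rough illustration of the external-storage suggestion (a sketch only; ISearchStateStore and its implementation are hypothetical, not part of the original code), the worker would poll a shared flag instead of a CancellationToken held in one server's memory:
// Hypothetical abstraction over a database/cache row keyed by search id.
// Every server in the farm reads and writes the same flag, unlike a static Dictionary.
public interface ISearchStateStore
{
    void MarkStarted(string searchId);       // called by StartAsyncTest
    void RequestCancel(string searchId);     // called by CancelAsyncTest
    bool IsCancelRequested(string searchId); // polled by the worker
}

private void StartSyncTest(string sSearchId, ISearchStateStore store)
{
    for (var i = 0; i < 1100; i++)
    {
        // Poll the shared store instead of a token held in one process's memory.
        if (store.IsCancelRequested(sSearchId))
        {
            throw new OperationCanceledException();
        }
        Thread.Sleep(1000);
    }
}
CancelAsyncTest would then simply call store.RequestCancel(sSearchId), and it would work no matter which server in the farm receives the request.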

A colleague offered an alternative call to Task.Factory.StartNew (within StartAsyncTest):
await Task.Factory.StartNew(() =>
{
StartSyncTest(sSearchId, rCancellationToken);
},
rCancellationToken,
TaskCreationOptions.LongRunning,
TaskScheduler.FromCurrentSynchronizationContext());
This implementation seemed to solve the asynchronous issue. Now future calls to CancelAsyncTest succeed and cancel the task as intended.

Related

Cloud Firestore GetSnapshotAsync to return a value. Unity and C#

I'm trying to create a class to manage my Cloud Firestore requests (like any SQLiteHelper class). However, Firebase uses async calls and I'm not able to return a value to other scripts.
Here is an example (bool return):
public bool CheckIfIsFullyRegistered(string idUtente)
{
DocumentReference docRef = db.Collection("Utenti").Document(idUtente);
docRef.GetSnapshotAsync().ContinueWithOnMainThread(task =>
{
DocumentSnapshot snapshot = task.Result;
if (snapshot.Exists)
{
Debug.Log(String.Format("Document {0} exist!", snapshot.Id));
return true; //Error here
}
else
{
Debug.Log(String.Format("Document {0} does not exist!", snapshot.Id));
}
});
}
Unfortunately, since Firestore is acting as a frontend for some slow running I/O (disk access or a web request), any interactions you have with it will need to be asynchronous. You'll also want to avoid blocking your game loop if at all possible while performing this access. That is to say, there won't be a synchronous call to GetSnapshotAsync.
Now there are two options you have for writing code that feels synchronous (if you're like me, it's easier to think like this than with callbacks or reactive structures).
First is that GetSnapshotAsync returns a task. You can opt to await on that task in an async function:
public async Task<bool> CheckIfIsFullyRegistered(string idUtente)
{
    DocumentReference docRef = db.Collection("Utenti").Document(idUtente);
    // this is equivalent to `task.Result` in the continuation code
    DocumentSnapshot snapshot = await docRef.GetSnapshotAsync();
    return snapshot.Exists;
}
The catch with this is that async/await makes some assumptions about C# object lifecycle that aren't guaranteed in the Unity context (more information in my related blog post and video). If you're a long-time Unity developer, or just want to avoid this == null ever being true, you may opt to wrap your async call in a WaitUntil block:
private IEnumerator CheckIfIsFullyRegisteredInCoroutine()
{
    string idUtente = null;
    // set idUtente somewhere here
    var isFullyRegisteredTask = CheckIfIsFullyRegistered(idUtente);
    yield return new WaitUntil(() => isFullyRegisteredTask.IsCompleted);
    if (isFullyRegisteredTask.Exception != null)
    {
        // do something with the exception here
        yield break;
    }
    bool isFullyRegistered = isFullyRegisteredTask.Result;
}
One other pattern I like to employ is to use listeners instead of just retrieving a snapshot. I would populate some Unity-side class with whatever the latest data is from Firestore (or RTDB) and have all my Unity objects ping that MonoBehaviour. This fits especially well with Unity's new ECS architecture or any time you're querying your data on a per-frame basis.
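As a rough sketch of that listener approach (assuming the Unity Firestore SDK's DocumentReference.Listen; the class and field names here are illustrative):
using Firebase.Firestore;
using UnityEngine;

public class UtenteCache : MonoBehaviour
{
    // Other scripts read this cached value instead of awaiting a snapshot.
    public bool IsFullyRegistered { get; private set; }

    private ListenerRegistration registration;

    public void StartListening(FirebaseFirestore db, string idUtente)
    {
        DocumentReference docRef = db.Collection("Utenti").Document(idUtente);

        // Update the cached value every time the document changes.
        registration = docRef.Listen(snapshot =>
        {
            IsFullyRegistered = snapshot.Exists;
        });
    }

    private void OnDestroy()
    {
        // Stop listening when this object goes away.
        registration?.Stop();
    }
}
Other scripts can then read IsFullyRegistered on a per-frame basis rather than waiting on a task.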
I hope that all helps!
This is how I got it to work, but excuse my ignorance as I've only been using the Firebase DB for about a week.
Here is a snippet; I hope it helps someone.
Create a task that we can chain onto our login event:
static Task DI = new System.Threading.Tasks.Task(LoginAnon);
Logging in anon
DI = FirebaseAuth.DefaultInstance.SignInAnonymouslyAsync().ContinueWith(result =>
{
Debug.Log("LOGIN [ID: " + result.Result.UserId + "]");
userID = result.Result.UserId;
FirebaseDatabase.DefaultInstance.GetReference("GlobalMsgs/").ChildAdded += HandleNewsAdded;
FirebaseDatabase.DefaultInstance.GetReference("Users/" + userID + "/infodata/nickname/").ValueChanged += HandleNameChanged;
FirebaseDatabase.DefaultInstance.GetReference("Users/" + userID + "/staticdata/status/").ValueChanged += HandleStatusChanged;
FirebaseDatabase.DefaultInstance.GetReference("Lobbies/").ChildAdded += HandleLobbyAdded;
FirebaseDatabase.DefaultInstance.GetReference("Lobbies/").ChildRemoved += HandleLobbyRemoved;
loggedIn = true;
});
And then get values.
DI.ContinueWith(Task =>
{
FirebaseDatabase.DefaultInstance.GetReference("Lobbies/" + selectedLobbyID + "/players/").GetValueAsync().ContinueWith((result) =>
{
DataSnapshot snap2 = result.Result;
Debug.Log("Their nickname is! -> " + snap2.Child("nickname").Value.ToString());
Debug.Log("Their uID is! -> " + snap2.Key.ToString());
//Add the user ID to the lobby list we have
foreach (List<string> lobbyData in onlineLobbies)
{
Debug.Log("Searching for lobby:" + lobbyData[0]);
if (selectedLobbyID == lobbyData[0].ToString()) //This is the id of the user hosting the lobby
{
Debug.Log("FOUND HOSTS LOBBY ->" + lobbyData[0]);
foreach (DataSnapshot snap3 in snap2.Children)
{
//add the user key to the lobby
lobbyData.Add(snap3.Key.ToString());
Debug.Log("Added " + snap3.Child("nickname").Value.ToString() + " with ID: " + snap3.Key.ToString() + " to local lobby.");
currentUsers++;
}
return;
}
}
});
});
Obviously you can alter it how you like. It does not really need the loops, but I'm using them to test before I compact the code into something less readable and more intuitive.

How to know when all my threads have finished executing when in recursive method?

I have been working on a web scraping project.
I am having two issues. One is presenting the number of URLs processed as a percentage, but a far larger issue is that I cannot figure out how to know when all the threads I am creating are totally finished.
NOTE: I am aware that a parallel foreach moves on once it is done, BUT this is within a recursive method.
My code below:
public async Task Scrape(string url)
{
var page = string.Empty;
try
{
page = await _service.Get(url);
if (page != string.Empty)
{
if (regex.IsMatch(page))
{
Parallel.For(0, regex.Matches(page).Count,
index =>
{
try
{
if (regex.Matches(page)[index].Groups[1].Value.StartsWith("/"))
{
var match = regex.Matches(page)[index].Groups[1].Value.ToLower();
if (!links.Contains(BaseUrl + match) && !Visitedlinks.Contains(BaseUrl + match))
{
Uri ValidUri = WebPageValidator.GetUrl(match);
if (ValidUri != null && HostUrls.Contains(ValidUri.Host))
links.Enqueue(match.Replace(".html", ""));
else
links.Enqueue(BaseUrl + match.Replace(".html", ""));
}
}
}
catch (Exception e)
{
log.Error("Error occured: " + e.Message);
Console.WriteLine("Error occured, check log for further details."); ;
}
});
WebPageInternalHandler.SavePage(page, url);
var context = CustomSynchronizationContext.GetSynchronizationContext();
Parallel.ForEach(links, new ParallelOptions { MaxDegreeOfParallelism = 25 },
webpage =>
{
try
{
if (WebPageValidator.ValidUrl(webpage))
{
string linkToProcess = webpage;
if (links.TryDequeue(out linkToProcess) && !Visitedlinks.Contains(linkToProcess))
{
ShowPercentProgress();
Thread.Sleep(15);
Visitedlinks.Enqueue(linkToProcess);
Task d = Scrape(linkToProcess);
Console.Clear();
}
}
}
catch (Exception e)
{
log.Error("Error occured: " + e.Message);
Console.WriteLine("Error occured, check log for further details.");
}
});
Console.WriteLine("parallel finished");
}
}
catch (Exception e)
{
log.Error("Error occured: " + e.Message);
Console.WriteLine("Error occured, check log for further details.");
}
}
Note that Scrape gets called multiple times (it is recursive).
I call the method like this:
public Task ExecuteScrape()
{
var context = CustomSynchronizationContext.GetSynchronizationContext();
Scrape(BaseUrl).ContinueWith(x => {
Visitedlinks.Enqueue(BaseUrl);
}, context).Wait();
return null;
}
which in turn gets called like so:
static void Main(string[] args)
{
RunScrapper();
Console.ReadLine();
}
public static void RunScrapper()
{
try
{
_scrapper.ExecuteScrape();
}
catch (Exception e)
{
Console.WriteLine(e);
throw;
}
}
my result:
How do I solve this?
(Is it ethical for me to answer a question about web page scraping?)
Don't call Scrape recursively. Place the list of urls you want to scrape in a ConcurrentQueue and begin processing that queue. As the process of scraping a page returns more urls, just add them into the same queue.
I wouldn't use just a string, either. I recommend creating a class like
public class UrlToScrape //because naming things is hard
{
public string Url { get; set; }
public int Depth { get; set; }
}
Regardless of how you execute this it's recursive, so you have to somehow keep track of how many levels deep you are. A website could deliberately generate URLs that send you into infinite recursion. (If they did this then they don't want you scraping their site. Does anybody want people scraping their site?)
When your queue is empty that doesn't mean you're done. The queue could be empty, but the process of scraping the last url dequeued could still add more items back into that queue, so you need a way to account for that.
You could use a thread safe counter (int using Interlocked.Increment/Decrement) that you increment when you start processing a url and decrement when you finish. You're done when the queue is empty and the count of in-process urls is zero.
This is a very rough model to illustrate the concept, not what I'd call a refined solution. For example, you still need to account for exception handling, and I have no idea where the results go, etc.
public class UrlScraper
{
private readonly ConcurrentQueue<UrlToScrape> _queue = new ConcurrentQueue<UrlToScrape>();
private int _inProcessUrlCounter;
private readonly List<string> _processedUrls = new List<string>();
public UrlScraper(IEnumerable<string> urls)
{
foreach (var url in urls)
{
_queue.Enqueue(new UrlToScrape {Url = url, Depth = 1});
}
}
public void ScrapeUrls()
{
while (_queue.TryDequeue(out var dequeuedUrl) || _inProcessUrlCounter > 0)
{
if (dequeuedUrl != null)
{
// Make sure you don't go more levels deep than you want to.
if (dequeuedUrl.Depth > 5) continue;
if (_processedUrls.Contains(dequeuedUrl.Url)) continue;
_processedUrls.Add(dequeuedUrl.Url);
Interlocked.Increment(ref _inProcessUrlCounter);
var url = dequeuedUrl;
Task.Run(() => ProcessUrl(url));
}
}
}
private void ProcessUrl(UrlToScrape url)
{
try
{
// As the process discovers more urls to scrape,
// pretend that this is one of those new urls.
var someNewUrl = "http://discovered";
_queue.Enqueue(new UrlToScrape { Url = someNewUrl, Depth = url.Depth + 1 });
}
catch (Exception ex)
{
// whatever you want to do with this
}
finally
{
Interlocked.Decrement(ref _inProcessUrlCounter);
}
}
}
If I were doing this for real, the ProcessUrl method would be its own class, and it would take HTML, not a URL. In this form it's difficult to unit test. If it were in a separate class then you could pass in HTML, verify that it outputs results somewhere, and verify that it calls a method to enqueue new URLs it finds.
It's also not a bad idea to maintain the queue as a database table instead. Otherwise, if you're processing a bunch of URLs and you have to stop, you'd have to start all over again.
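A minimal sketch of that database-backed queue, assuming a SQL Server table (the table, columns, and connection handling are illustrative, not part of the answer):
// Assumed table:
// CREATE TABLE UrlQueue (Id INT IDENTITY PRIMARY KEY,
//                        Url NVARCHAR(2000), Depth INT,
//                        Done BIT NOT NULL DEFAULT 0)
using System.Data.SqlClient;

public class PersistentUrlQueue
{
    private readonly string _connectionString;

    public PersistentUrlQueue(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Enqueue(string url, int depth)
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO UrlQueue (Url, Depth) VALUES (@url, @depth)", conn))
        {
            cmd.Parameters.AddWithValue("@url", url);
            cmd.Parameters.AddWithValue("@depth", depth);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Claims one pending row and marks it done in a single statement, so a
    // restarted scraper simply resumes with whatever rows are still pending.
    public UrlToScrape TryDequeue()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            @"UPDATE TOP (1) UrlQueue SET Done = 1
              OUTPUT inserted.Url, inserted.Depth
              WHERE Done = 0", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new UrlToScrape { Url = reader.GetString(0), Depth = reader.GetInt32(1) };
            }
        }
    }
}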
Can't you add all the tasks (Task d) to some type of concurrent collection that you thread through all the recursive calls (via a method argument) and then simply call Task.WhenAll(tasks).Wait()?
You'd need an intermediate method (it makes things cleaner) that launches the base Scrape call and passes in the empty task collection. When the base call returns, you have all the tasks in hand and you simply wait them out.
public async Task Scrape(string url)
{
    var tasks = new ConcurrentQueue<Task>();

    // call your implementation, but change it so that
    // you add all launched tasks d to tasks
    Scrape(url, tasks);

    // 1st option: Wait(). This will block the caller until all tasks finish
    Task.WhenAll(tasks).Wait();

    // or 2nd option: await. This won't block and will return to the caller.
    // Once all tasks are finished the method will resume at WriteLine
    await Task.WhenAll(tasks);

    Console.WriteLine("Finished!");
}
Simple rule: if you want to know when something finishes, the first step is to keep track of it. In your current implementation you are essentially firing and forgetting all launched tasks...
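For concreteness, the tracked recursive overload might look roughly like this (a sketch only; the body is abbreviated and it reuses the question's fields as-is):
private async Task Scrape(string url, ConcurrentQueue<Task> tasks)
{
    var page = await _service.Get(url);
    // ... same regex matching and link collection as in the original Scrape ...

    Parallel.ForEach(links, new ParallelOptions { MaxDegreeOfParallelism = 25 },
        webpage =>
        {
            if (links.TryDequeue(out var linkToProcess) && !Visitedlinks.Contains(linkToProcess))
            {
                Visitedlinks.Enqueue(linkToProcess);

                // Instead of discarding Task d, record it so the caller can await it later.
                Task d = Scrape(linkToProcess, tasks);
                tasks.Enqueue(d);
            }
        });
}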

Pusher for Windows Phone failing to disconnect

I have encountered a problem with the PusherClient.WP8 package I installed from NuGet. Every time the app is sent to the background, I disconnect from Pusher and unsubscribe from the channel. When I resume the app, the code hangs when the connection is re-established with Pusher.
I have tried resetting the Pusher object, resetting the channel, and creating a new instance of Pusher, and still nothing works. It seems there is a problem with the package, or rather the WebSocket disconnect method is failing to disconnect from Pusher. However, when I close the app, everything gets reset, but that does not help me in instances when a user opens a photo picker from my app.
Does anyone have a suggestion, or know of another Pusher client I can use for Windows Phone 8? I have been struggling with this problem for weeks now.
Here is the github link of the package I used: https://github.com/bszypelow/PusherClient.WP8/blob/master/PusherClient.WP8
Thank you
public ConnectionState Connect()
{
var task = Task.Run(async () => await pusher.ConnectAsync());
try
{
task.Wait();
}
catch (Exception ex)
{
Debug.WriteLine("Exception " + ex.Message + " at " + ex.Source + "Inner exception " + ex.InnerException + " additional data " + ex.Data);
}
return task.Result;
}
From looking at the source code I have a guess:
none of the async methods ever uses .ConfigureAwait(false), which could be the reason for the deadlock.
This matters especially if you call them from the UI thread using .Wait() or .Result - from an event handler, for example.
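On the calling side, the block can also be avoided entirely by staying asynchronous. A rough sketch, assuming pusher.ConnectAsync() returns Task<ConnectionState> as the Connect wrapper above implies:
public async Task<ConnectionState> ConnectAsync()
{
    try
    {
        // Await instead of Task.Run(...).Wait(); nothing blocks the UI thread here,
        // so a missing ConfigureAwait(false) inside the library cannot deadlock it.
        return await pusher.ConnectAsync();
    }
    catch (Exception ex)
    {
        Debug.WriteLine("Exception " + ex.Message + " at " + ex.Source);
        throw;
    }
}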
On the library side, I suggest you update the code (it's MIT licensed, so you can do that):
public Task<Channel> Subscribe(string channelName)
{
if (_connection.State != ConnectionState.Connected)
throw new PusherException("You must wait for Pusher to connect before you can subscribe to a channel", ErrorCodes.NotConnected);
if (Channels.ContainsKey(channelName))
{
return Task.FromResult(Channels[channelName]);
}
// If private or presence channel, check that auth endpoint has been set
var chanType = ChannelTypes.Public;
if (channelName.ToLower().StartsWith("private-"))
chanType = ChannelTypes.Private;
else if (channelName.ToLower().StartsWith("presence-"))
chanType = ChannelTypes.Presence;
return SubscribeToChannel(chanType, channelName); //await is not needed here
}
private async Task<Channel> SubscribeToChannel(ChannelTypes type, string channelName)
{
switch (type)
{
case ChannelTypes.Public:
Channels.Add(channelName, new Channel(channelName, this));
break;
case ChannelTypes.Private:
AuthEndpointCheck();
Channels.Add(channelName, new PrivateChannel(channelName, this));
break;
case ChannelTypes.Presence:
AuthEndpointCheck();
Channels.Add(channelName, new PresenceChannel(channelName, this));
break;
}
if (type == ChannelTypes.Presence || type == ChannelTypes.Private)
{
string jsonAuth = await _options.Authorizer.Authorize(channelName, _connection.SocketID)
.ConfigureAwait(false); //do not capture the context!!
var template = new { auth = String.Empty, channel_data = String.Empty };
var message = JsonConvert.DeserializeAnonymousType(jsonAuth, template);
_connection.Send(JsonConvert.SerializeObject(new { @event = Constants.CHANNEL_SUBSCRIBE, data = new { channel = channelName, auth = message.auth, channel_data = message.channel_data } }));
}
else
{
// No need for auth details. Just send subscribe event
_connection.Send(JsonConvert.SerializeObject(new { @event = Constants.CHANNEL_SUBSCRIBE, data = new { channel = channelName } }));
}
return Channels[channelName];
}
You can check out the source code; then, if you don't already use it, you can install a trial version of ReSharper and this plug-in. They will help you find all the lines where .ConfigureAwait is missing.

limit # of web service requests simultaneously

I have an Excel add-in written in C# on .NET 4.5. It sends many web service requests to a web server to get data; e.g. it sends 30,000 requests to the web service server. When the data for a request comes back, the add-in plots the data in Excel.
Originally I made all the requests asynchronously, but sometimes I would get an OutOfMemoryException.
So I changed it to send the requests one by one, but that is too slow; it takes a long time to finish all the requests.
I wonder if there is a way that I can make 100 requests at a time asynchronously and, once the data for all 100 requests has come back and been plotted in Excel, send the next 100 requests.
Thanks
Edit
In my add-in there is a ribbon button "Refresh"; when it is clicked, the refresh process starts.
On the main UI thread, when the ribbon button is clicked, it calls the web service BuildMetaData;
once that returns, its callback MetaDataCompleteCallback sends another web service call.
Once that returns, its callback DataRequestJobFinished calls Plot to plot the data in Excel. See below.
RefreshBtn_Click()
{
if (cells == null) return;
Range firstOccurence = null;
firstOccurence = cells.Find(functionPattern, null,
null, null,
XlSearchOrder.xlByRows,
XlSearchDirection.xlNext,
null, null, null);
DataRequest request = null;
_reportObj = null;
Range currentOccurence = null;
while (!Helper.RefreshCancelled)
{
if(firstOccurence == null ||IsRangeEqual(firstOccurence, currentOccurence)) break;
found = true;
currentOccurence = cells.FindNext(currentOccurence ?? firstOccurence);
try
{
var excelFormulaCell = new ExcelFormulaCell(currentOccurence);
if (excelFormulaCell.HasValidFormulaCell)
{
request = new DataRequest(_unityContainer, XLApp, excelFormulaCell);
request.IsRefreshClicked = true;
request.Workbook = Workbook;
request.Worksheets = Worksheets;
_reportObj = new ReportBuilder(_unityContainer, XLApp, request, index, false);
_reportObj.ParseParameters();
_reportObj.GenerateReport();
//this is necessary b/c error message is wrapped in valid object DataResponse
//if (!string.IsNullOrEmpty(_reportObj.ErrorMessage)) //Clear previous error message
{
ErrorMessage = _reportObj.ErrorMessage;
Errors.Add(ErrorMessage);
AddCommentToCell(_reportObj);
Errors.Remove(ErrorMessage);
}
}
}
catch (Exception ex)
{
ErrorMessage = ex.Message;
Errors.Add(ErrorMessage);
_reportObj.ErrorMessage = ErrorMessage;
AddCommentToCell(_reportObj);
Errors.Remove(ErrorMessage);
Helper.LogError(ex);
}
}
}
On the ReportBuilder class:
public void GenerateReport()
{
Request.ParseFunction();
Request.MetacompleteCallBack = MetaDataCompleteCallback;
Request.BuildMetaData();
}
public void MetaDataCompleteCallback(int id)
{
try
{
if (Request.IsRequestCancelled)
{
Request.FormulaCell.Dispose();
return;
}
ErrorMessage = Request.ErrorMessage;
if (string.IsNullOrEmpty(Request.ErrorMessage))
{
_queryJob = new DataQueryJob(UnityContainer, Request.BuildQueryString(), DataRequestJobFinished, Request);
}
else
{
ModifyCommentOnFormulaCellPublishRefreshEvent();
}
}
catch (Exception ex)
{
ErrorMessage = ex.Message;
ModifyCommentOnFormulaCellPublishRefreshEvent();
}
finally
{
Request.MetacompleteCallBack = null;
}
}
public void DataRequestJobFinished(DataRequestResponse response)
{
Dispatcher.Invoke(new Action<DataRequestResponse>(DataRequestJobFinishedUI), response);
}
public void DataRequestJobFinishedUI(DataRequestResponse response)
{
try
{
if (Request.IsRequestCancelled)
{
return;
}
if (response.status != Status.COMPLETE)
{
ErrorMessage = ManipulateStatusMsg(response);
}
else // COMPLETE
{
var tmpReq = Request as DataRequest;
if (tmpReq == null) return;
new VerticalTemplate(tmpReq, response).Plot();
}
}
catch (Exception e)
{
ErrorMessage = e.Message;
Helper.LogError(e);
}
finally
{
//if (token != null)
// this.UnityContainer.Resolve<IEventAggregator>().GetEvent<DataQueryJobComplete>().Unsubscribe(token);
ModifyCommentOnFormulaCellPublishRefreshEvent();
Request.FormulaCell.Dispose();
}
}
On the plot class:
public void Plot()
{
...
attributeRange.Value2 = headerArray;
DataRange.Value2 = ....
DataRange.NumberFormat = ...
}
An OutOfMemoryException is not about too many requests being sent simultaneously. It is about freeing your resources the right way. In my experience there are two main problems behind such an exception:
Working incorrectly with immutable structures or the System.String class
Not disposing your disposable resources, especially graphics objects and WCF requests
In the case of reporting, in my opinion, you have the second type of problem. DataRequest and DataRequestResponse are good places to start the investigation for such objects.
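As a generic illustration of the disposal point (the client type and operation below are placeholders, not the add-in's actual proxy): WCF clients should not simply be wrapped in a using block, because Dispose() calls Close(), which can itself throw on a faulted channel; the usual pattern is Close with Abort in the error paths:
var client = new SomeServiceClient(); // placeholder WCF proxy
try
{
    var response = client.GetData(request); // placeholder operation
    // ... use the response ...
    client.Close();
}
catch (CommunicationException)
{
    client.Abort(); // channel is faulted; Close() would throw
}
catch (TimeoutException)
{
    client.Abort();
}
catch (Exception)
{
    client.Abort();
    throw;
}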
If this doesn't help, try using the Task library with the async/await pattern; you can find good examples here:
// Signature specifies Task<TResult>
async Task<int> TaskOfTResult_MethodAsync()
{
int hours;
// . . .
// Return statement specifies an integer result.
return hours;
}
// Calls to TaskOfTResult_MethodAsync
Task<int> returnedTaskTResult = TaskOfTResult_MethodAsync();
int intResult = await returnedTaskTResult;
// or, in a single statement
int intResult = await TaskOfTResult_MethodAsync();
// Signature specifies Task
async Task Task_MethodAsync()
{
// . . .
// The method has no return statement.
}
// Calls to Task_MethodAsync
Task returnedTask = Task_MethodAsync();
await returnedTask;
// or, in a single statement
await Task_MethodAsync();
In your code I see a while loop in which you can collect a Task[] of size 100, on which you can use the WaitAll method, and the problem should be solved. Sorry, but your code is large enough that I can't provide a more direct example.
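As a generic illustration of that idea (a sketch only; the request type, the service call, and the plotting method are placeholders, not the add-in's real API):
const int BatchSize = 100;

public async Task RefreshInBatchesAsync(IReadOnlyList<RequestParameters> allRequests)
{
    for (int i = 0; i < allRequests.Count; i += BatchSize)
    {
        // Start up to 100 requests, then wait for the whole batch before continuing.
        Task<DataResponse>[] batch = allRequests
            .Skip(i)
            .Take(BatchSize)
            .Select(p => FetchDataAsync(p)) // placeholder web service call
            .ToArray();

        DataResponse[] responses = await Task.WhenAll(batch);

        foreach (var response in responses)
        {
            PlotInExcel(response); // placeholder: plot the returned data
        }
    }
}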
I'm having a lot of trouble parsing your code to figure out what is being iterated for your requests, but the basic template for batching asynchronously is going to be something like this:
const int batchSize = 100;

public async Task<IEnumerable<Results>> GetDataInBatches(IEnumerable<RequestParameters> parameters)
{
    if (!parameters.Any())
        return Enumerable.Empty<Results>();

    var batchResults = await Task.WhenAll(parameters.Take(batchSize).Select(doQuery));
    return batchResults.Concat(await GetDataInBatches(parameters.Skip(batchSize)));
}
where doQuery is something with the signature
async Task<Results> doQuery(RequestParameters parameters)
{
    //.. however you do the query
}
I wouldn't use this for a million requests since it's recursive, but your case would generate a call stack only 300 deep, so you'll be fine.
Note that this also assumes your data requests are made asynchronously and return a Task. Most libraries have been updated to do this (look for methods with the Async suffix). If yours doesn't expose that API, you might want to create a separate question about how to get your library to play nicely with the TPL.

await CLGeocoder.GeocodeAddressAsync never returns

I'm trying to test the accuracy of Apple's forward geocoding service to cover the rare case when the latitude/longitude we have on our data model is missing or invalid, but we still know the physical address. Monotouch provides CLGeocoder.GeocodeAddress(String, CLGeocodeCompletionHandler) and GeocodeAddressAsync(String) but when I call them the completionHandler is never called and the async method never returns.
Nothing in my application log indicates a problem and enclosing the call in a try-catch block didn't turn up any exceptions. Maps integration is enabled in the project options. Short of capturing network traffic (which is probably SSL anyway) I'm out of ideas.
Here's the code which loads placemarks and tries to geocode addresses:
private void ReloadPlacemarks()
{
// list to hold any placemarks which come back with empty/invalid coordinates
List<ServiceCallWrapper> geoList = new List<ServiceCallWrapper> ();
mapView.ClearPlacemarks ();
List<MKPlacemark> placemarks = new List<MKPlacemark>();
if (serviceCallViewModel.ActiveServiceCall != null) {
var serviceCall = serviceCallViewModel.ActiveServiceCall;
if (serviceCall.dblLatitude != 0 && serviceCall.dblLongitude != 0) {
placemarks.Add (serviceCall.ToPlacemark ());
} else {
// add it to the geocode list
geoList.Add (serviceCall);
}
}
foreach (var serviceCall in serviceCallViewModel.ServiceCalls) {
if (serviceCall.dblLatitude != 0 && serviceCall.dblLongitude != 0) {
placemarks.Add (serviceCall.ToPlacemark ());
} else {
//add it to the geocode list
geoList.Add (serviceCall);
}
}
if (placemarks.Count > 0) {
mapView.AddPlacemarks (placemarks.ToArray ());
}
if (geoList.Count > 0) {
// attempt to forward-geocode the street address
foreach (ServiceCallWrapper s in geoList) {
ServiceCallWrapper serviceCall = GeocodeServiceCallAddressAsync (s).Result;
mapView.AddPlacemark (serviceCall.ToPlacemark());
}
}
}
private async Task<ServiceCallWrapper> GeocodeServiceCallAddressAsync(ServiceCallWrapper s)
{
CLGeocoder geo = new CLGeocoder ();
String addr = s.address + " " + s.city + " " + s.state + " " + s.zip;
Console.WriteLine ("Attempting forward geocode for service call UID: " + s.call_uid + " with address: " + addr);
//app hangs on this
CLPlacemark[] result = await geo.GeocodeAddressAsync(addr);
//code updating latitude and longitude (omitted)
return s;
}
Your problem is this line:
ServiceCallWrapper serviceCall = GeocodeServiceCallAddressAsync (s).Result;
By calling Task<T>.Result, you are causing a deadlock. I explain this fully on my blog, but the gist of it is that await will (by default) capture a "context" when it yields control, and will use that context to complete the async method. In this case, the "context" is the UI context. So, the UI thread is blocked (waiting on Result) when the response comes in, and the async method cannot continue because it's waiting to run on the UI thread.
The solution is to use async all the way. In other words, replace every Task<T>.Result and Task.Wait with await:
private async Task ReloadPlacemarksAsync()
{
...
ServiceCallWrapper serviceCall = await GeocodeServiceCallAddressAsync (s);
...
}
Note that your void ReloadPlacemarks is now Task ReloadPlacemarksAsync, so this change affects your callers. async will "grow" through the codebase, and this is normal. For more information, see my MSDN article on async best practices.
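For example, the caller chain might end at an async event handler (a sketch; the handler name is illustrative):
// An async void handler is acceptable at the UI boundary (e.g. a button tap);
// everything beneath it stays async all the way down.
private async void RefreshButton_Clicked(object sender, EventArgs e)
{
    await ReloadPlacemarksAsync();
}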
