Firebase Unity - getDownloadURLAsync for multiple images - c#

I'm trying to retrieve the download URLs for two images from Firebase, in order to pass them on to another method as strings so it can download them.
For a single image, I'm using this code:
img1_ref.GetDownloadUrlAsync().ContinueWith((Task<Uri> task) => {
    if (!task.IsFaulted && !task.IsCanceled) {
        UnityMainThreadDispatcher.Instance().Enqueue(() => UIcontroller.displayImage(task.Result.ToString()));
    }
});
Which works perfectly - but I can't figure out how to fetch two URLs, make sure I've got them both, and then pass them on to the download method...
I'm struggling to get my head around async...
Any help gratefully received!

If you want to use Continuations, it's relatively straightforward.
First I'd suggest changing your original code to this:
img1_ref.GetDownloadUrlAsync().ContinueWithOnMainThread((Task<Uri> task) => {
    if (!task.IsFaulted && !task.IsCanceled) {
        UIcontroller.displayImage(task.Result.ToString());
    }
});
This uses a Firebase extension method on Task to move your logic onto the main thread more concisely.
Next, you can use Task.WhenAll to create a task that waits for multiple tasks to complete:
var task1 = img1_ref.GetDownloadUrlAsync();
var task2 = img2_ref.GetDownloadUrlAsync();
Task.WhenAll(task1, task2).ContinueWithOnMainThread((Task<Uri[]> task) => {
    if (!task.IsFaulted && !task.IsCanceled) {
        // work with task.Result (a Uri[]) here
    }
});
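Inside that continuation, task.Result is a Uri[] holding the download URLs in the same order as the tasks passed to WhenAll. As a minimal sketch (assuming displayImage can simply be called once per URL):
UIcontroller.displayImage(task.Result[0].ToString()); // URL from task1
UIcontroller.displayImage(task.Result[1].ToString()); // URL from task2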
Of course, now we can have a little fun. I'd recommend reading this article on threading in Firebase for a little more background.
You can use @Frank van Puffelen's answer and do something like this:
async void DownloadAsync() {
    var task1 = img1_ref.GetDownloadUrlAsync();
    var task2 = img2_ref.GetDownloadUrlAsync();
    await Task.WhenAll(task1, task2);
    // you're on the calling thread here, which is usually the main thread
    UIcontroller.displayImage(task1.Result.ToString());
}
Or you can move this all to a coroutine to get a little bit more Unity-aware memory safety:
IEnumerator DownloadCoroutine() {
    var task1 = img1_ref.GetDownloadUrlAsync();
    var task2 = img2_ref.GetDownloadUrlAsync();
    var whenAll = Task.WhenAll(task1, task2);
    yield return new WaitUntil(() => whenAll.IsCompleted);
    // coroutines always run on the main thread, and only if this object hasn't been destroyed
    UIcontroller.displayImage(task1.Result.ToString());
}

Based on Running multiple async tasks and waiting for them all to complete that'd be:
var task1 = img1_ref.GetDownloadUrlAsync();
var task2 = img2_ref.GetDownloadUrlAsync();
await Task.WhenAll(task1, task2);
...

Related

How to run multiple methods in parallel in ASP.NET

I have 8 methods in an ASP.NET console app, like Fun1(), Fun2(), Fun3() ... and so on. In my console application I call all these methods sequentially, but now the requirement is that I have to do this using parallel programming concepts. I have read about task and threading concepts in Java but am completely new to .NET parallel programming.
Here is the flow of methods I need in my console app.
As you can see in the diagram, Task1 and Task2 should run in parallel, and Task3 should only start after completion of the previous two.
The functions inside each task, for example Fun3 and Fun4 for Task1, should run sequentially, one after the other.
Can anyone please help me out?
One way to solve this is by using Task.WhenAll.
To take an example, I have created X number of methods with the name FuncX() like this:
async static Task<int> FuncX()
{
await Task.Delay(500);
var result = await Task.FromResult(1);
return result;
}
In this case, we have Func1, Func3, Func4, Func5, and Func6.
So we call methods and pass them to a list of Task.
var task1 = new List<Task<int>>();
task1.Add(Func3());
task1.Add(Func4());
var task2 = new List<Task<int>>();
task2.Add(Func1());
task2.Add(Func5());
task2.Add(Func6());
You have 2 options to get the result:
// option1
var eachFunctionIsDoneWithAwait1 = await Task.WhenAll(task1);
var eachFunctionIsDoneWithAwait2 = await Task.WhenAll(task2);
var sum1 = eachFunctionIsDoneWithAwait1.Sum() + eachFunctionIsDoneWithAwait2.Sum();
Console.WriteLine(sum1);
// option2
var task3 = new List<List<Task<int>>>();
task3.Add(task1);
task3.Add(task2);
var sum2 = 0;
// avoid an async lambda in List<T>.ForEach: it compiles to async void,
// so nothing awaits it and sum2 could be printed before the tasks finish
foreach (var x in task3)
{
    var r = await Task.WhenAll(x);
    sum2 += r.Sum();
}
Console.WriteLine(sum2);
This is just an example for inspiration; you can change it and adapt it the way you want.
Here is how you could create the tasks according to the diagram, using the Task.Run method:
Task task1 = Task.Run(() =>
{
Fun3();
Fun4();
});
Task task2 = Task.Run(() =>
{
Fun1();
Fun5();
Fun6();
});
Task task3 = Task.Run(async () =>
{
await Task.WhenAll(task1, task2);
Fun7();
Fun8();
});
Task.Run invokes the delegate on the ThreadPool, not on a dedicated thread. If you have some reason to create a dedicated thread for each task, you could use the more advanced Task.Factory.StartNew method with the TaskCreationOptions.LongRunning argument, as shown here.
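For example, a minimal sketch of that dedicated-thread variant for the first task (same Fun3/Fun4 as above) could look like this:
Task task1 = Task.Factory.StartNew(() =>
{
    Fun3();
    Fun4();
}, CancellationToken.None, TaskCreationOptions.LongRunning, TaskScheduler.Default);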
It should be noted that the above implementation does not behave optimally in case of exceptions. If Fun3() fails immediately, the optimal behavior would be to stop the execution of task2 as soon as possible; instead, this implementation will execute all three functions Fun1, Fun5 and Fun6 before propagating the error. You could fix this minor flaw by creating a CancellationTokenSource and invoking Token.ThrowIfCancellationRequested after each function, but it's going to be messy.
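A hedged sketch of that CancellationTokenSource idea, shown for task1 only (task2 would follow the same pattern with Fun1, Fun5 and Fun6):
var cts = new CancellationTokenSource();
Task task1 = Task.Run(() =>
{
    try
    {
        Fun3();
        cts.Token.ThrowIfCancellationRequested();
        Fun4();
    }
    catch { cts.Cancel(); throw; } // signal the other task to stop early
});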
Another issue is that if both task1 and task2 fail, only the exception of task1 is going to be propagated through task3. Solving this issue is not trivial.
Update: Here is one way to solve the issue of partial exception propagation:
Task task3 = Task.WhenAll(task1, task2).ContinueWith(t =>
{
if (t.IsFaulted)
{
TaskCompletionSource tcs = new();
tcs.SetException(t.Exception.InnerExceptions);
return tcs.Task;
}
if (t.IsCanceled)
{
TaskCompletionSource tcs = new();
tcs.SetCanceled(new TaskCanceledException(t).CancellationToken);
return tcs.Task;
}
Debug.Assert(t.IsCompletedSuccessfully);
Fun7();
Fun8();
return Task.CompletedTask;
}, default, TaskContinuationOptions.DenyChildAttach, TaskScheduler.Default)
.Unwrap();
In case both task1 and task2 fail, the task3 will propagate the exceptions of both tasks.

Nested async methods in a Parallel.ForEach

I have a method that runs multiple async methods within it. I have to iterate over a list of devices, and pass the device to this method. I am noticing that this is taking a long time to complete so I am thinking of using Parallel.ForEach so it can run this process against multiple devices at the same time.
Let's say this is my method.
public async Task ProcessDevice(Device device) {
var dev = await _deviceService.LookupDeviceIndbAsNoTracking(device);
var result = await DoSomething(dev);
await DoSomething2(dev);
}
Then DoSomething2 also calls an async method.
public async Task DoSomething2(Device dev) {
foreach(var obj in dev.Objects) {
await DoSomething3(obj);
}
}
The list of devices continuously gets larger over time, so the more this list grows, the longer it takes the program to finish running ProcessDevice() against each device. I would like to process more than one device at a time. So I have been looking into using Parallel.ForEach.
Parallel.ForEach(devices, async device => {
try {
await ProcessDevice(device);
} catch (Exception ex) {
throw ex;
}
});
It appears that the program is finishing before the device is fully processed. I have also tried creating a list of tasks, and then foreach device, add a new task running ProcessDevice to that list and then awaiting Task.WhenAll(listOfTasks);
var listOfTasks = new List<Task>();
foreach(var device in devices) {
var task = Task.Run(async () => await ProcessDevice(device));
listOfTasks.Add(task);
}
await Task.WhenAll(listOfTasks);
But it appears that the task is marked as completed before ProcessDevice() is actually finished running.
Please excuse my ignorance on this issue as I am new to parallel processing and not sure what is going on. What is happening to cause this behavior and is there any documentation that you could provide that could help me better understand what to do?
You can't mix async with Parallel.ForEach. Since your underlying operation is asynchronous, you'd want to use asynchronous concurrency, not parallelism. Asynchronous concurrency is most easily expressed with WhenAll:
var listOfTasks = devices.Select(ProcessDevice).ToList();
await Task.WhenAll(listOfTasks);
In your last example there are a few problems:
var listOfTasks = new List<Task>();
foreach (var device in devices)
{
await Task.Run(async () => await ProcessDevice(device));
}
await Task.WhenAll(listOfTasks);
Doing await Task.Run(async () => await ProcessDevice(device)); means you are not moving to the next iteration of the foreach loop until the previous one is done. Essentially, you're still doing them one at a time.
Additionally, you aren't adding any tasks to listOfTasks so it remains empty and therefore Task.WhenAll(listOfTasks) completes instantly because there's no tasks to await.
Try this:
var listOfTasks = new List<Task>();
foreach (var device in devices)
{
var task = Task.Run(async () => await ProcessDevice(device));
listOfTasks.Add(task);
}
await Task.WhenAll(listOfTasks);
I can explain the problem with Parallel.ForEach. An important thing to understand is that when the await keyword acts on an incomplete Task, it returns. It will return its own incomplete Task if the method signature allows (if it's not void). Then it is up to the caller to use that Task object to wait for the job to finish.
But the second parameter in Parallel.ForEach is an Action<T>, which is a void method, which means no Task can be returned, which means the caller (Parallel.ForEach in this case) has no way to wait until the job has finished.
So in your case, as soon as it hits await ProcessDevice(device), it returns and nothing waits for it to finish so it starts the next iteration. By the time Parallel.ForEach is finished, all it has done is started all the tasks, but not waited for them.
So don't use Parallel.ForEach with asynchronous code.
Stephen's answer is more appropriate. You can also use WSC's answer, but that can be dangerous with larger lists: queuing hundreds or thousands of tasks to the thread pool all at once will not help your performance.
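As a side note, if you are on .NET 6 or later, Parallel.ForEachAsync does understand asynchronous bodies and lets you cap the concurrency. A minimal sketch, assuming the ProcessDevice method from the question:
await Parallel.ForEachAsync(
    devices,
    new ParallelOptions { MaxDegreeOfParallelism = 4 },
    async (device, cancellationToken) =>
    {
        await ProcessDevice(device);
    });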
Not very sure if this is what you are asking for, but I can give an example of how we start an async process:
private readonly Func<Worker> _worker;
private void StartWorkers(IEnumerable<Props> props){
Parallel.ForEach(props, timestamp => { _worker.Invoke().Consume(timestamp); });
}
I would recommend reading about Parallel.ForEach, as it will do some of that work for you.

How do I stop awaiting a task but keep the task running in background?

I have a list of tasks which I want to execute together, but only a certain number of them at a time. So I await one of them until it finishes, at which point the awaiting should stop and a new task should be allowed to start. But when one of them finishes, I don't know how to stop awaiting the task that is currently being awaited. I don't want to cancel that task, just stop awaiting it and let it continue running in the background.
I have the following code
foreach (var link in SharedVars.DownloadQueue)
{
if (currentRunningCount != batch)
{
var task = DownloadFile(extraPathPerLink, link, totalLen);
_ = task.ContinueWith(_ =>
{
downloadQueueLinksTasks.Remove(task);
currentRunningCount--;
// TODO SHOULD CHANGE WHAT IS AWAITED
});
currentRunningCount++;
downloadQueueLinksTasks.Add(task);
}
if (currentRunningCount == batch)
{
// TODO SHOULD NOT AWAIT 0
await downloadQueueLinksTasks[0];
}
}
I found out about Task.WhenAny, but from this comment here I understood that the other tasks will be ignored, so it's not the solution I want.
I'm sorry if the question is stupid or wrong, but I can't seem to find any information related to solving this, or even the name of the operation I want to achieve so I can search for it correctly.
Solution Edit
All the answers provided are correct, I accepted the one I decided to use but still all of them are correct.
Thank you everyone, I learned a lot from all of you from these different answers and different ways to approach the problem and how to think about it.
What I learned about this specific problem was that I still needed to await the remaining tasks, so the solution was to have Task.WhenAny inside the loop (which returns the finished task; this is also important) AND Task.WhenAll outside the loop to await the tasks that are left.
Task.WhenAny returns the Task which completed.
foreach (var link in SharedVars.DownloadQueue)
{
var task = DownloadFile(extraPathPerLink, link, totalLen);
downloadQueueLinksTasks.Add(task);
if (downloadQueueLinksTasks.Count == batch)
{
// Wait for any Task to complete, then remove it from
// the list of pending tasks.
var completedTask = await Task.WhenAny(downloadQueueLinksTasks);
downloadQueueLinksTasks.Remove(completedTask);
}
}
// Wait for all of the remaining Tasks to complete
await Task.WhenAll(downloadQueueLinksTasks);
You can use Task.WaitAny()
Here is the demonstration of the behavior:
public static async Task Main(string[] args)
{
IList<Task> tasks = new List<Task>();
tasks.Add(TestAsync(0));
tasks.Add(TestAsync(1));
tasks.Add(TestAsync(2));
tasks.Add(TestAsync(3));
tasks.Add(TestAsync(4));
tasks.Add(TestAsync(5));
var result = Task.WaitAny(tasks.ToArray());
Console.WriteLine("returned task id is {0}", result);
// do other operations here
// before exiting, wait for the other tasks so that they won't get a cancellation signal
await Task.WhenAll(tasks.ToArray());
}
public static async Task TestAsync(int i)
{
Console.WriteLine("Staring to wait" + i);
await Task.Delay(new Random().Next(1000, 10000));
Console.WriteLine("Task finished" + i);
}
Output:
Starting to wait0
Starting to wait1
Starting to wait2
Starting to wait3
Starting to wait4
Starting to wait5
Task finished0
returned task id is 0
Task finished4
Task finished2
Task finished1
Task finished5
Task finished3
What you're asking about is throttling, which for asynchronous code is best expressed via SemaphoreSlim:
var semaphore = new SemaphoreSlim(batch);
var tasks = SharedVars.DownloadQueue.Select(async link =>
{
    await semaphore.WaitAsync();
    try { return await DownloadFile(extraPathPerLink, link, totalLen); }
    finally { semaphore.Release(); }
});
var results = await Task.WhenAll(tasks);
You should use Microsoft's Reactive Framework (aka Rx) - NuGet System.Reactive and add using System.Reactive.Linq; - then you can do this:
IDisposable subscription =
SharedVars.DownloadQueue
.ToObservable()
.Select(link =>
Observable.FromAsync(() => DownloadFile(extraPathPerLink, link, totalLen)))
.Merge(batch) //max concurrent downloads
.Subscribe(file =>
{
/* process downloaded file here
(no longer a task) */
});
If you need to stop the downloads before they would naturally finish just call subscription.Dispose().

Parallel.ForEach using Thread.Sleep equivalent

So here's the situation: I need to make a call to a web site that starts a search. This search continues for an unknown amount of time, and the only way I know if the search has finished is by periodically querying the website to see if there's a "Download Data" link somewhere on it (it uses some strange ajax call on a javascript timer to check the backend and update the page, I think).
So here's the trick: I have hundreds of items I need to search for, one at a time. So I have some code that looks a little bit like this:
var items = getItems();
Parallel.ForEach(items, item =>
{
startSearch(item);
var finished = isSearchFinished(item);
while(finished == false)
{
finished = isSearchFinished(item); //<--- How do I delay this action 30 Secs?
}
downloadData(item);
});
Now, obviously this isn't the real code, because there could be things that cause isSearchFinished to always be false.
Obvious infinite loop danger aside, how would I correctly keep isSearchFinished() from being called over and over and over, and instead have it called every, say, 30 seconds or 1 minute?
I know Thread.Sleep() isn't the right solution, and I think the solution might be accomplished by using Threading.Timer() but I'm not very familiar with it, and there are so many threading options that I'm just not sure which to use.
It's quite easy to implement with tasks and async/await, as noted by @KevinS in the comments:
async Task<ItemData> ProcessItemAsync(Item item)
{
while (true)
{
if (await isSearchFinishedAsync(item))
break;
await Task.Delay(30 * 1000);
}
return await downloadDataAsync(item);
}
// ...
var items = getItems();
var tasks = items.Select(i => ProcessItemAsync(i)).ToArray();
await Task.WhenAll(tasks);
var data = tasks.Select(t => t.Result);
This way, you don't block ThreadPool threads in vain for what is mostly a bunch of I/O-bound network operations. If you're not familiar with async/await, the async-await tag wiki might be a good place to start.
I assume you can convert your synchronous methods isSearchFinished and downloadData to asynchronous versions using something like HttpClient for non-blocking HTTP requests, returning a Task<>. If you are unable to do so, you can still simply wrap them with Task.Run, as await Task.Run(() => isSearchFinished(item)) and await Task.Run(() => downloadData(item)). Normally this is not recommended, but as you have hundreds of items, it still would give you a much better level of concurrency than Parallel.ForEach in this case, because you won't be blocking pool threads for 30 seconds, thanks to the asynchronous Task.Delay.
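For illustration, a hedged sketch of that Task.Run wrapping, keeping the original synchronous isSearchFinished and downloadData signatures:
async Task<ItemData> ProcessItemAsync(Item item)
{
    // poll on a pool thread, but release the thread during the 30-second delay
    while (!await Task.Run(() => isSearchFinished(item)))
    {
        await Task.Delay(TimeSpan.FromSeconds(30));
    }
    return await Task.Run(() => downloadData(item));
}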
You can also write a generic function using TaskCompletionSource and Threading.Timer to return a Task that becomes complete once a specified retry function succeeds.
public static Task RetryAsync(Func<bool> retryFunc, TimeSpan retryInterval)
{
return RetryAsync(retryFunc, retryInterval, CancellationToken.None);
}
public static Task RetryAsync(Func<bool> retryFunc, TimeSpan retryInterval, CancellationToken cancellationToken)
{
var tcs = new TaskCompletionSource<object>();
cancellationToken.Register(() => tcs.TrySetCanceled());
var timer = new Timer((state) =>
{
var taskCompletionSource = (TaskCompletionSource<object>) state;
try
{
if (retryFunc())
{
taskCompletionSource.TrySetResult(null);
}
}
catch (Exception ex)
{
taskCompletionSource.TrySetException(ex);
}
}, tcs, TimeSpan.FromMilliseconds(0), retryInterval);
// Once the task is complete, dispose of the timer so it doesn't keep firing. Also captures the timer
// in a closure so it does not get disposed.
tcs.Task.ContinueWith(t => timer.Dispose(),
CancellationToken.None,
TaskContinuationOptions.ExecuteSynchronously,
TaskScheduler.Default);
return tcs.Task;
}
You can then use RetryAsync like this:
var searchTasks = new List<Task>();
searchTasks.AddRange(items.Select(
downloadItem => RetryAsync( () => isSearchFinished(downloadItem), TimeSpan.FromSeconds(2)) // retry interval
.ContinueWith(t => downloadData(downloadItem),
CancellationToken.None,
TaskContinuationOptions.OnlyOnRanToCompletion,
TaskScheduler.Default)));
await Task.WhenAll(searchTasks.ToArray());
The ContinueWith part specifies what you do once the task has completed successfully. In this case it will run your downloadData method on a thread pool thread because we specified TaskScheduler.Default and the continuation will only execute if the task ran to completion, i.e. it was not canceled and no exception was thrown.

Concurrent execution of async methods

Using the async/await model, I have a method which makes 3 different calls to a web service and then returns the union of the results.
var result1 = await myService.GetData(source1);
var result2 = await myService.GetData(source2);
var result3 = await myService.GetData(source3);
allResults = Union(result1, result2, result3);
Using a typical await on each call, these 3 calls will execute sequentially with respect to each other. How would I go about letting them execute concurrently and join the results as they complete?
How would I go about letting them execute in parallel and join the results as they complete?
The simplest approach is just to create all the tasks and then await them:
var task1 = myService.GetData(source1);
var task2 = myService.GetData(source2);
var task3 = myService.GetData(source3);
// Now everything's started, we can await them
var result1 = await task1;
var result2 = await task2;
var result3 = await task3;
You might also consider Task.WhenAll. You need to consider the possibility that more than one task will fail... with the above code you wouldn't observe the failure of task3 for example, if task2 fails - because your async method will propagate the exception from task2 before you await task3.
I'm not suggesting a particular strategy here, because it will depend on your exact scenario. You may only care about success/failure and logging one cause of failure, in which case the above code is fine. Otherwise, you could potentially attach continuations to the original tasks to log all exceptions, for example.
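For reference, a minimal Task.WhenAll version of the same code might look like this (a sketch, assuming GetData returns a Task<T> whose results Union accepts):
var task1 = myService.GetData(source1);
var task2 = myService.GetData(source2);
var task3 = myService.GetData(source3);
var whenAll = Task.WhenAll(task1, task2, task3);
try
{
    var results = await whenAll;
    allResults = Union(results[0], results[1], results[2]);
}
catch
{
    // whenAll.Exception aggregates the exceptions from every faulted task,
    // not just the first one that await rethrows
    throw;
}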
You could use the Parallel class:
Parallel.Invoke(
() => result1 = myService.GetData(source1),
() => result2 = myService.GetData(source2),
() => result3 = myService.GetData(source3)
);
For more information visit: http://msdn.microsoft.com/en-us/library/system.threading.tasks.parallel(v=vs.110).aspx
As a more generic solution you can use the API I wrote below; it also allows you to define a real-time throttling mechanism for the maximum number of concurrent async requests.
The inputEnumerable will be the enumerable of your source, and asyncProcessor is your async delegate (myService.GetData in your example).
If the asyncProcessor - myService.GetData - returns void or just a Task without any type, then you can simply update the API to reflect that (just replace all Task<TOut> references with Task).
public static async Task<TOut[]> ForEachAsync<TIn, TOut>(
IEnumerable<TIn> inputEnumerable,
Func<TIn, Task<TOut>> asyncProcessor,
int? maxDegreeOfParallelism = null)
{
IEnumerable<Task<TOut>> tasks;
if (maxDegreeOfParallelism != null)
{
SemaphoreSlim throttler = new SemaphoreSlim(maxDegreeOfParallelism.Value, maxDegreeOfParallelism.Value);
tasks = inputEnumerable.Select(
async input =>
{
await throttler.WaitAsync();
try
{
return await asyncProcessor(input).ConfigureAwait(false);
}
finally
{
throttler.Release();
}
});
}
else
{
tasks = inputEnumerable.Select(asyncProcessor);
}
return await Task.WhenAll(tasks);
}
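A hypothetical usage with the GetData example from the question, throttled to two concurrent calls:
var sources = new[] { source1, source2, source3 };
var allResults = await ForEachAsync(sources, source => myService.GetData(source), 2);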
