C# Task.WaitAll isn't waiting

My aim is to download images from an Amazon Web Services bucket.
I have the following function, which downloads multiple images at once:
public static void DownloadFilesFromAWS(string bucketName, List<string> imageNames)
{
int batchSize = 50;
int maxDownloadMilliseconds = 10000;
List<Task> tasks = new List<Task>();
for (int i = 0; i < imageNames.Count; i++)
{
string imageName = imageNames[i];
Task task = Task.Run(() => GetFile(bucketName, imageName));
tasks.Add(task);
if (tasks.Count > 0 && tasks.Count % batchSize == 0)
{
Task.WaitAll(tasks.ToArray(), maxDownloadMilliseconds);//wait to download
tasks.Clear();
}
}
//if there are any left, wait for them
Task.WaitAll(tasks.ToArray(), maxDownloadMilliseconds);
}
private static void GetFile(string bucketName, string filename)
{
try
{
using (AmazonS3Client awsClient = new AmazonS3Client(Amazon.RegionEndpoint.EUWest1))
{
string key = Path.GetFileName(filename);
GetObjectRequest getObjectRequest = new GetObjectRequest() {
BucketName = bucketName,
Key = key
};
using (GetObjectResponse response = awsClient.GetObject(getObjectRequest))
{
string directory = Path.GetDirectoryName(filename);
if (!Directory.Exists(directory))
{
Directory.CreateDirectory(directory);
}
if (!File.Exists(filename))
{
response.WriteResponseStreamToFile(filename);
}
}
}
}
catch (AmazonS3Exception amazonS3Exception)
{
if (amazonS3Exception.ErrorCode == "NoSuchKey")
{
return;
}
if (amazonS3Exception.ErrorCode != null && (amazonS3Exception.ErrorCode.Equals("InvalidAccessKeyId") || amazonS3Exception.ErrorCode.Equals("InvalidSecurity")))
{
// Log AWS invalid credentials
throw new ApplicationException("AWS Invalid Credentials");
}
else
{
// Log generic AWS exception
throw new ApplicationException("AWS Exception: " + amazonS3Exception.Message);
}
}
catch
{
//
}
}
The downloading of the images all works fine, but the Task.WaitAll seems to be ignored and the rest of the code continues to execute - meaning I try to access files that don't exist yet (as they've not finished downloading).
I found this answer to another question which seems to be the same as mine. I tried to use the answer to change my code but it still wouldn't wait for all files to be downloaded.
Can anyone tell me where I am going wrong?

The code behaves as expected. Task.WaitAll returns after ten seconds even when not all files have been downloaded, because you have specified a timeout of 10 seconds (10000 milliseconds) in the maxDownloadMilliseconds variable.
If you really want to wait for all downloads to finish, call Task.WaitAll without specifying a timeout.
Use
Task.WaitAll(tasks.ToArray());//wait to download
in both places.
For a good explanation of how to implement parallel downloads without stressing the system (allowing only a maximum number of parallel downloads at a time), see the answer at How can I limit Parallel.ForEach?
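If you want both behaviours at once (wait for everything, but never have more than N downloads in flight), a SemaphoreSlim throttle is one option. The following is only a sketch: it reuses the GetFile helper from the question and assumes an arbitrary limit of 10 concurrent downloads, replacing the batch-of-50 logic entirely.
// Requires System.Linq, System.Threading and System.Threading.Tasks
public static void DownloadFilesFromAWS(string bucketName, List<string> imageNames)
{
    const int maxParallelDownloads = 10; // illustrative limit
    using (var throttle = new SemaphoreSlim(maxParallelDownloads))
    {
        Task[] tasks = imageNames.Select(async imageName =>
        {
            await throttle.WaitAsync();
            try
            {
                // GetFile is the synchronous helper from the question
                await Task.Run(() => GetFile(bucketName, imageName));
            }
            finally
            {
                throttle.Release();
            }
        }).ToArray();

        // No timeout argument, so this really does block until every download is done
        Task.WaitAll(tasks);
    }
}
Here the semaphore, not the batching, is what caps the concurrency, and the single Task.WaitAll at the end waits for all of the downloads.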

Related

Download multiple files concurrently from FTP using FluentFTP with a maximum value

I would like to download multiple files recursively from an FTP directory. To do this I'm using the FluentFTP library, and my code is this:
private async Task downloadRecursively(string src, string dest, FtpClient ftp)
{
foreach(var item in ftp.GetListing(src))
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0)
{
System.IO.Directory.CreateDirectory(Path.Combine(dest, item.Name));
await downloadRecursively(Path.Combine(src, item.Name), Path.Combine(dest, item.Name), ftp); // await so the recursion finishes before moving on
}
}
else if (item.Type == FtpFileSystemObjectType.File)
{
await ftp.DownloadFileAsync(Path.Combine(dest, item.Name), Path.Combine(src, item.Name));
}
}
}
I know you need one FtpClient per concurrent download, but how can I cap the number of connections at a maximum? I guess the idea is to create, connect, download and close for every file I find, but with only X files downloading at the same time. I'm also not sure whether I should create Tasks with async, or Threads, and my biggest problem is how to implement all of this.
The answer from @Bradley here seems pretty good, but that question reads every file that has to be downloaded from an external file and it doesn't have a maximum concurrent download value, so I'm not sure how to apply both requirements.
Use:
ConcurrentBag class to implement a connection pool;
Parallel class to parallelize the operation;
ParallelOptions.MaxDegreeOfParallelism to limit the number of concurrent threads.
var clients = new ConcurrentBag<FtpClient>();
var opts = new ParallelOptions { MaxDegreeOfParallelism = maxConnections };
Parallel.ForEach(files, opts, file =>
{
file = Path.GetFileName(file);
string thread = $"Thread {Thread.CurrentThread.ManagedThreadId}";
if (!clients.TryTake(out var client))
{
Console.WriteLine($"{thread} Opening connection...");
client = new FtpClient(host, user, pass);
client.Connect();
Console.WriteLine($"{thread} Opened connection {client.GetHashCode()}.");
}
string remotePath = sourcePath + "/" + file;
string localPath = Path.Combine(destPath, file);
string desc =
$"{thread}, Connection {client.GetHashCode()}, " +
$"File {remotePath} => {localPath}";
Console.WriteLine($"{desc} - Starting...");
client.DownloadFile(localPath, remotePath);
Console.WriteLine($"{desc} - Done.");
clients.Add(client);
});
Console.WriteLine($"Closing {clients.Count} connections");
foreach (var client in clients)
{
Console.WriteLine($"Closing connection {client.GetHashCode()}");
client.Dispose();
}
Another approach is to start a fixed number of threads with one connection each and have them pick files from a queue; a rough sketch of that approach follows the links below.
For an example of an implementation, see my article for the WinSCP .NET assembly:
Automating transfers in parallel connections over SFTP/FTP protocol
A similar question about SFTP:
Processing SFTP files using C# Parallel.ForEach loop not processing downloads
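For illustration only, a minimal sketch of that queue-based approach might look like the following. It assumes FluentFTP's FtpClient, the same host, user, pass, sourcePath and destPath variables as the snippet above, a maxConnections worker count, and a flat list of file names in files; it is the shape of the idea, not production code.
// Requires System.Collections.Concurrent, System.IO, System.Linq and System.Threading.Tasks
var queue = new BlockingCollection<string>();
foreach (var file in files) queue.Add(file);
queue.CompleteAdding();

var workers = Enumerable.Range(0, maxConnections).Select(_ => Task.Run(() =>
{
    // Each worker owns exactly one connection for its whole lifetime
    using (var client = new FtpClient(host, user, pass))
    {
        client.Connect();
        foreach (var file in queue.GetConsumingEnumerable())
        {
            client.DownloadFile(Path.Combine(destPath, file), sourcePath + "/" + file);
        }
    }
})).ToArray();

Task.WaitAll(workers); // all files are done once every worker has drained the queue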
Here is a TPL Dataflow approach. A BufferBlock<FtpClient> is used as a pool of FtpClient objects. The recursive enumeration takes a parameter of type IEnumerable<string> that holds the segments of one filepath. These segments are combined differently when constructing the local and the remote filepath. As a side effect of invoking the recursive enumeration, the paths of the remote files are sent to an ActionBlock<IEnumerable<string>>. This block handles the parallel downloading of the files. Its Completion property contains eventually all the exceptions that may have occurred during the whole operation.
public static Task FtpDownloadDeep(string ftpHost, string ftpRoot,
string targetDirectory, string username = null, string password = null,
int maximumConnections = 1)
{
// Arguments validation omitted
if (!Directory.Exists(targetDirectory))
throw new DirectoryNotFoundException(targetDirectory);
var fsLocker = new object();
var ftpClientPool = new BufferBlock<FtpClient>();
async Task<TResult> UsingFtpAsync<TResult>(Func<FtpClient, Task<TResult>> action)
{
var client = await ftpClientPool.ReceiveAsync();
try { return await action(client); }
finally { ftpClientPool.Post(client); } // Return to the pool
}
var downloader = new ActionBlock<IEnumerable<string>>(async path =>
{
var remotePath = String.Join("/", path);
var localPath = Path.Combine(path.Prepend(targetDirectory).ToArray());
var localDir = Path.GetDirectoryName(localPath);
lock (fsLocker) Directory.CreateDirectory(localDir);
var status = await UsingFtpAsync(client =>
client.DownloadFileAsync(localPath, remotePath));
if (status == FtpStatus.Failed) throw new InvalidOperationException(
$"Download of '{remotePath}' failed.");
}, new ExecutionDataflowBlockOptions()
{
MaxDegreeOfParallelism = maximumConnections,
BoundedCapacity = maximumConnections,
});
async Task Recurse(IEnumerable<string> path)
{
if (downloader.Completion.IsCompleted) return; // The downloader has failed
var listing = await UsingFtpAsync(client =>
client.GetListingAsync(String.Join("/", path)));
foreach (var item in listing)
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0) await Recurse(path.Append(item.Name));
}
else if (item.Type == FtpFileSystemObjectType.File)
{
var accepted = await downloader.SendAsync(path.Append(item.Name));
if (!accepted) break; // The downloader has failed
}
}
}
// Move on to the thread pool, to avoid ConfigureAwait(false) everywhere
return Task.Run(async () =>
{
// Fill the FtpClient pool
for (int i = 0; i < maximumConnections; i++)
{
var client = new FtpClient(ftpHost);
if (username != null && password != null)
client.Credentials = new NetworkCredential(username, password);
ftpClientPool.Post(client);
}
try
{
// Enumerate the files to download
await Recurse(new[] { ftpRoot });
downloader.Complete();
}
catch (Exception ex) { ((IDataflowBlock)downloader).Fault(ex); }
try
{
// Await the downloader to complete
await downloader.Completion;
}
catch (OperationCanceledException)
when (downloader.Completion.IsCanceled) { throw; }
catch { downloader.Completion.Wait(); } // Propagate AggregateException
finally
{
// Clean up
if (ftpClientPool.TryReceiveAll(out var clients))
foreach (var client in clients) client.Dispose();
}
});
}
Usage example:
await FtpDownloadDeep("ftp://ftp.test.com", "", @"C:\FtpTest",
"username", "password", maximumConnections: 10);
Note: The above implementation enumerates the remote directory lazily, following the tempo of the downloading process. If you prefer to enumerate it eagerly, gathering all info available about the remote listings ASAP, just remove the BoundedCapacity = maximumConnections configuration from the ActionBlock that downloads the files. Be aware that doing so could result in high memory consumption, in case the remote directory has a deep hierarchy of subfolders, containing cumulatively a huge number of small files.
I'd split this into three parts:
1. Recursively build a list of source and destination pairs.
2. Create the directories required.
3. Concurrently download the files.
It's the last part that is slow and should be done in parallel.
Here's the code:
private async Task DownloadRecursively(string src, string dest, FtpClient ftp)
{
/* 1 */
IEnumerable<(string source, string destination)> Recurse(string s, string d)
{
foreach (var item in ftp.GetListing(s))
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0)
{
foreach(var pair in Recurse(Path.Combine(s, item.Name), Path.Combine(d, item.Name)))
{
yield return pair;
}
}
}
else if (item.Type == FtpFileSystemObjectType.File)
{
yield return (Path.Combine(s, item.Name), Path.Combine(d, item.Name));
}
}
}
var pairs = Recurse(src, dest).ToArray();
/* 2 */
// Each destination is a file path, so create its directory part
foreach (var d in pairs.Select(x => Path.GetDirectoryName(x.destination)).Distinct())
{
System.IO.Directory.CreateDirectory(d);
}
/* 3 */
var downloads =
pairs
.AsParallel()
.Select(x => ftp.DownloadFileAsync(x.destination, x.source)) // local path first, then remote, matching FluentFTP's signature
.ToArray();
await Task.WhenAll(downloads);
}
The result should be clean, neat code that is easy to reason about.

How can I cancel an asynchronous task after a given time and how can I restart a failed task?

How can I cancel an asynchronous task when it takes very long to complete, or will probably never complete? Is it possible to give each task a fixed time (for example 10 seconds) and, when it doesn't complete within that time, have the task cancelled automatically?
Is it possible to restart a task or create the same task again after it failed? What can I do if one of the tasks in a task list fails? Is it possible to only restart the failed task?
In my code, playerCountryDataUpdate should only be executed after every task in TasksList1 has completed without an error or exception. I want to restart a task when it fails. If the same task fails again, don't restart it; instead display an error message on the screen. How can I do that?
bool AllMethods1Completed = false;
bool AllMethods2Completed = false;
public async Task PlayerAccountDetails()
{
var playerCountryDataGet = GetPlayerCountryData();
var playerTagsData = GetPlayerTagsData();
var TasksList1 = new List<Task> { playerCountryDataGet, playerTagsData };
try
{
await Task.WhenAll(TasksList1);
AllMethods1Completed = true;
}
catch
{
AllMethods1Completed = false;
}
if (AllMethods1Completed == true)
{
var playerCountryDataUpdate = UpdatePlayerCountryData("Germany", "Berlin");
var TasksList2 = new List<Task> { playerCountryDataUpdate };
try
{
await Task.WhenAll(TasksList2);
AllMethods2Completed = true;
}
catch
{
AllMethods2Completed = false;
}
}
}
private async Task GetPlayerTagsData()
{
var resultprofile = await PlayFabServerAPI.GetPlayerTagsAsync(new PlayFab.ServerModels.GetPlayerTagsRequest()
{
PlayFabId = PlayerPlayFabID
});
if (resultprofile.Error != null)
Console.WriteLine(resultprofile.Error.GenerateErrorReport());
else
{
if ((resultprofile.Result != null) && (resultprofile.Result.Tags.Count() > 0))
PlayerTag = resultprofile.Result.Tags[0].ToString();
}
}
private async Task GetPlayerCountryData()
{
var resultprofile = await PlayFabClientAPI.GetUserDataAsync(new PlayFab.ClientModels.GetUserDataRequest()
{
PlayFabId = PlayerPlayFabID,
Keys = null
});
if (resultprofile.Error != null)
Console.WriteLine(resultprofile.Error.GenerateErrorReport());
else
{
if (resultprofile.Result.Data == null || !resultprofile.Result.Data.ContainsKey("Country") || !resultprofile.Result.Data.ContainsKey("City"))
Console.WriteLine("No Country/City");
else
{
PlayerCountry = resultprofile.Result.Data["Country"].Value;
PlayerCity = resultprofile.Result.Data["City"].Value;
}
}
}
private async Task UpdatePlayerCountryData(string country, string city)
{
var resultprofile = await PlayFabClientAPI.UpdateUserDataAsync(new PlayFab.ClientModels.UpdateUserDataRequest()
{
Data = new Dictionary<string, string>() {
{"Country", country},
{"City", city}
},
Permission = PlayFab.ClientModels.UserDataPermission.Public
});
if (resultprofile.Error != null)
Console.WriteLine(resultprofile.Error.GenerateErrorReport());
else
Console.WriteLine("Successfully updated user data");
}
You need to build a cancellation mechanism directly into the task itself. C# provides the CancellationTokenSource and CancellationToken classes to assist with this: https://learn.microsoft.com/en-us/dotnet/api/system.threading.cancellationtoken?view=netcore-3.1
Add an (optional) CancellationToken to your task's parameters. Then check the token at appropriate intervals to determine if the task needs to abort before it completes.
In the case of a long running query, it would be best to figure out how to break the query into chunks and then check the CancellationToken between queries.
private List<PlayerXXXData> GetPlayerXXXData(CancellationToken ct = default)
{
    int limit = 100;
    int total = Server.GetPlayerXXXCount();
    var results = new List<PlayerXXXData>();
    // Fetch in chunks and check the token between chunks
    while (!ct.IsCancellationRequested && results.Count < total)
    {
        results.AddRange(Server.GetPlayerXXXData(results.Count, limit));
    }
    return results;
}
Mind that the above has no error handling in it, but you get the idea. You might consider making it faster (so you can start using the data sooner) by implementing deferred execution with your own custom IEnumerable implementation. Then you can query one chunk and iterate over that chunk before querying for the next chunk. This could also help prevent you from loading too much into RAM, depending upon the number of records you intend to process.
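A minimal sketch of that deferred-execution idea, assuming the same hypothetical Server API and PlayerXXXData type as the snippet above, could look like this:
private IEnumerable<PlayerXXXData> EnumeratePlayerXXXData(CancellationToken ct = default)
{
    int limit = 100;
    int total = Server.GetPlayerXXXCount();   // Server.* is the same placeholder API used above
    int fetched = 0;
    while (!ct.IsCancellationRequested && fetched < total)
    {
        // Each chunk is queried only when the caller iterates this far,
        // so processing can start before all records are in memory
        var chunk = new List<PlayerXXXData>(Server.GetPlayerXXXData(fetched, limit));
        if (chunk.Count == 0) yield break; // the server ran out of data early
        foreach (var item in chunk) yield return item;
        fetched += chunk.Count;
    }
}
The caller can then foreach over the sequence (and still pass a token to stop it between chunks), processing each chunk while the next one has not been fetched yet.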
Set a timeout in your logic to abandon the task:
int timeout = 1000;
var task = SomeOperationAsync();
if (await Task.WhenAny(task, Task.Delay(timeout)) == task) {
// task completed within timeout
} else {
// timeout logic
}
Asynchronously wait for Task<T> to complete with timeout
Also, put a try/catch block in a while loop with a retry counter for as many attempts as you want:
var retry = 0;
while (retry < 3)
{
    try
    {
        var task = SomeOperationAsync();
        // Await with a timeout, as above; treat expiry as a timeout exception
        if (await Task.WhenAny(task, Task.Delay(timeout)) != task)
            throw new TimeoutException();
        await task; // propagate any exception the task itself threw
        break;      // success, stop retrying
    }
    catch (TimeoutException)
    {
        retry++;
        if (retry == 3)
        {
            throw; // give up after three attempts
        }
    }
}

Task does not always return upon completion

So I am trying to build a program to control a machine. Communication with said machine is via a serial port, for which I have written a driver. Continuous polling of the machine is necessary for status feedback etc. In my program I have a dedicated ExecutionEngine() class to handle serial send and receive. I also need to have two separate control sequences running, which I have put into the methods RunSequenceA() and RunSequenceB() respectively. During normal operation, all three methods need to run until both control sequences finish, at which point the StopSequence() method is called. My issue is that sometimes, for whatever reason, the StopSequence() method is never called, leaving my ExecutionEngine() method in an infinite loop!
Code for ExecutionEngine():
private static void ExecutionEngine()
{
// Clear both lists in case they have old data
_commandList.Clear();
_pollingList.Clear();
// Poll while user has not yet clicked "STOP"
while (!_cTokenSource.Token.IsCancellationRequested)
{
// If there are commands to be sent, send them first
if (_commandList.Count > 0)
{
Command[] tempCommandArray;
lock (_commandList)
tempCommandArray = _commandList.ToArray();
foreach (var c in tempCommandArray)
{
if (_cTokenSource.Token.IsCancellationRequested)
break;
var response = SerialDriver.ComCycle(c.CommandBytes, _serialPort);
var success = CheckErrorReturn(response, false);
if (success)
{
AddPolling(c);
RemoveCommand(c);
}
}
}
// Do polling operation on applicable controllers
if (_pollingList.Count > 0)
{
Command[] tempPollingArray;
lock (_pollingList)
tempPollingArray = _pollingList.ToArray();
foreach (var c in tempPollingArray)
{
if (_cTokenSource.Token.IsCancellationRequested)
break;
var response = SerialDriver.ComCycle(c.PollBytes, _serialPort);
var success = ProcessPollReturn(response);
if (success)
{
c.FlagDone();
RemovePolling(c);
}
}
}
if (_commandList.Count + _pollingList.Count == 0)
{
// Will get stuck here if neither list gets new items added
Console.WriteLine("Bad place");
Thread.Sleep(500);
}
}
// Cancellation has been requested
lock (_commandList)
_commandList.Clear();
lock (_pollingList)
_pollingList.Clear();
ResetTriggers();
var endCommand = new Command("GL_SYSCMD", 0);
SerialDriver.ComCycle(endCommand.CommandBytes, _serialPort);
_serialPort.Close();
_vm.SequenceRunning = false;
return;
}
Code for running sequences:
private static async Task RunSequencesAsync()
{
var taskArray = new Task[2];
var a = new Action(RunSequenceA);
var b = new Action(RunSequenceB);
taskArray[0] = Task.Run(a);
taskArray[1] = Task.Run(b);
await Task.WhenAll(taskArray).ConfigureAwait(continueOnCapturedContext: false);
// Sometimes this never fires, WHY?
UpdateStatus("All done!");
StopSequence();
}
// Run A sequence
internal static void RunSequenceA()
{
if (_sequenceA1 != null && _sequenceA1.Count > 0)
{
foreach (var s in _sequenceA1)
{
if (_cTokenSource.Token.IsCancellationRequested)
return;
s.Execute();
if (s.Reference != null && TriggerStepCompleted != null)
TriggerStepCompleted(s, EventArgs.Empty);
}
// This part always fires
Console.WriteLine("Sequence A finished");
return;
}
else
return;
}
And finally, the methods to start and stop everything:
private static async Task StartSequenceAsync()
{
_serialPort.PortName = _vm.SelectedComPort;
_serialPort.Open();
_serialPort.DiscardInBuffer();
_serialPort.DiscardOutBuffer();
// Start
_cTokenSource = new CancellationTokenSource();
_vm.SequenceRunning = true;
var taskArray = new Task[2];
taskArray[0] = Task.Run(() => ExecutionEngine());
Thread.Sleep(50);
taskArray[1] = Task.Run(() => RunSequencesAsync());
await Task.WhenAll(taskArray).ConfigureAwait(continueOnCapturedContext: false);
}
private static void StopSequence()
{
_cTokenSource.Cancel();
}
To reiterate, the problem doesn't happen every time. In fact, most times the program runs fine. It seems that problems only arise if I manually call the StopSequence() method halfway through execution; then it's 50/50 as to whether the problem shows up. I'm pretty sure my issue is threading-related, but I'm not sure exactly what is going wrong. Any help pointing me in the right direction will be greatly appreciated!

C# Multi-threading - Upload to FTP Server

I would like to seek your help in implementing Multi-Threading in my C# program.
The program aims to upload 10,000+ files to an FTP server. I am planning to use at least 10 threads to increase the speed of the process.
This is the code that I have:
I have initialized 10 Threads:
public ThreadStart[] threadstart = new ThreadStart[10];
public Thread[] thread = new Thread[10];
My plan is to assign one file to a thread, as follows:
file 1 > thread 1
file 2 > thread 2
file 3 > thread 3
.
.
.
file 10 > thread 10
file 11 > thread 1
.
.
.
And so I have the following:
foreach (string file in files)
{
loop++;
threadstart[loop] = new ThreadStart(() => ftp.uploadToFTP(uploadPath + @"/" + Path.GetFileName(file), file));
thread[loop] = new Thread(threadstart[loop]);
thread[loop].Start();
if (loop == 9)
{
loop = 0;
}
}
The passing of files to their respective threads is working. My problem is that the threads' work overlaps.
For example, while Thread 1 is still running, another file is passed to it. That causes an error, since Thread 1 has not yet finished and a new parameter is being passed to it. The same is true for the other threads.
What is the best way to implement this?
Any feedback will be greatly appreciated. Thank you! :)
Use async-await and just pass an array of files into it:
private static async void TestFtpAsync(string userName, string password, string ftpBaseUri,
IEnumerable<string> fileNames)
{
var tasks = new List<Task<byte[]>>();
foreach (var fileInfo in fileNames.Select(fileName => new FileInfo(fileName)))
{
using (var webClient = new WebClient())
{
webClient.Credentials = new NetworkCredential(userName, password);
tasks.Add(webClient.UploadFileTaskAsync(ftpBaseUri + fileInfo.Name, fileInfo.FullName));
}
}
Console.WriteLine("Uploading...");
foreach (var task in tasks)
{
try
{
await task;
Console.WriteLine("Success");
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
}
}
}
Then call it like this:
const string userName = "username";
const string password = "password";
const string ftpBaseUri = "ftp://192.168.1.1/";
var fileNames = new[] { @"d:\file0.txt", @"d:\file1.txt", @"d:\file2.txt" };
TestFtpAsync(userName, password, ftpBaseUri, fileNames);
Why do it the hard way?
.NET already has a class called ThreadPool.
You can just use that and it manages the threads itself.
Your code will be like this:
static void DoSomething(object n)
{
Console.WriteLine(n);
Thread.Sleep(10);
}
static void Main(string[] args)
{
ThreadPool.SetMaxThreads(20, 10);
for (int x = 0; x < 30; x++)
{
ThreadPool.QueueUserWorkItem(new WaitCallback(DoSomething), x);
}
Console.Read();
}
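One hedged caveat to the above: QueueUserWorkItem gives you no completion signal (the example relies on Console.Read() to keep the process alive), and SetMaxThreads applies process-wide and cannot be set below the processor count. If you need to block until every queued upload has finished, a CountdownEvent is one way to fill that gap; here is a sketch of a variant of the Main method above:
static void Main(string[] args)
{
    ThreadPool.SetMaxThreads(20, 10);
    const int workItems = 30;
    using (var countdown = new CountdownEvent(workItems))
    {
        for (int x = 0; x < workItems; x++)
        {
            int n = x; // copy for the closure
            ThreadPool.QueueUserWorkItem(_ =>
            {
                try { DoSomething(n); }
                finally { countdown.Signal(); } // count down even if the work item throws
            });
        }
        countdown.Wait(); // returns once all 30 work items have signalled
    }
}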

Wait for finished download file in selenium and c#

I have a problem.
I download a file after clicking an icon in a web application.
My next step executes before the file has finished downloading.
I want to wait until the file is downloaded.
Does anyone know how to wait for that?
I use the following script (fileName should be passed in).
The first part waits until the file appears on disk (fine for Chrome).
The second part waits until it stops changing (and starts to have some content).
var downloadsPath = Environment.GetEnvironmentVariable("USERPROFILE") + @"\Downloads\" + fileName;
for (var i = 0; i < 30; i++)
{
if (File.Exists(downloadsPath)) { break; }
Thread.Sleep(1000);
}
var length = new FileInfo(downloadsPath).Length;
for (var i = 0; i < 30; i++)
{
Thread.Sleep(1000);
var newLength = new FileInfo(downloadsPath).Length;
if (newLength == length && length != 0) { break; }
length = newLength;
}
I started from Dmitry's answer and added some support for controlling the timeouts and the polling intervals.
This is the solution:
/// <exception cref="TaskCanceledException" />
internal async Task WaitForFileToFinishChangingContentAsync(string filePath, int pollingIntervalMs, CancellationToken cancellationToken)
{
await WaitForFileToExistAsync(filePath, pollingIntervalMs, cancellationToken);
var fileSize = new FileInfo(filePath).Length;
while (true)
{
if (cancellationToken.IsCancellationRequested)
{
throw new TaskCanceledException();
}
await Task.Delay(pollingIntervalMs, cancellationToken);
var newFileSize = new FileInfo(filePath).Length;
if (newFileSize == fileSize)
{
break;
}
fileSize = newFileSize;
}
}
/// <exception cref="TaskCanceledException" />
internal async Task WaitForFileToExistAsync(string filePath, int pollingIntervalMs, CancellationToken cancellationToken)
{
while (true)
{
if (cancellationToken.IsCancellationRequested)
{
throw new TaskCanceledException();
}
if (File.Exists(filePath))
{
break;
}
await Task.Delay(pollingIntervalMs, cancellationToken);
}
}
And you use it like this:
using var cancellationTokenSource = new CancellationTokenSource(timeoutMs);
await WaitForFileToFinishChangingContentAsync(filePath, pollingIntervalMs, cancellationTokenSource.Token);
A cancellation operation will be automatically triggered after the specified timeout.
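If you want to react to that timeout rather than let the exception bubble up, a small illustrative wrapper (timeoutMs, pollingIntervalMs and filePath are placeholders, as above) could be:
try
{
    using var cts = new CancellationTokenSource(timeoutMs);
    await WaitForFileToFinishChangingContentAsync(filePath, pollingIntervalMs, cts.Token);
}
catch (TaskCanceledException)
{
    // The file did not appear, or did not stop growing, within timeoutMs
}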
You can check if there is a temporary file in the download directory.
I poll the folder from time to time.
The example below is for the Google Chrome driver:
while (Directory.GetFiles(downloadPath2).Count(i => i.EndsWith(".crdownload")) > 0)
{
Thread.Sleep(2000);
}
Maybe look into these:
How can I ask the Selenium-WebDriver to wait for few seconds in Java?
https://msdn.microsoft.com/en-us/library/system.threading.thread.sleep%28v=vs.110%29.aspx
