Can I use a progress bar to show the progress of
File.WriteAllBytes(file, array)
in C#?
No.
You'll need to write the bytes in chunks using a loop. Something like the following should get you started. Note that this needs to run on a background thread; if you are using WinForms, you can use a BackgroundWorker.
using (var stream = new FileStream(...))
using (var writer = new BinaryWriter(stream))
{
    var bytesLeft = array.Length; // assuming array is an array of bytes
    var bytesWritten = 0;
    while (bytesLeft > 0)
    {
        var chunkSize = Math.Min(64, bytesLeft);
        writer.Write(array, bytesWritten, chunkSize);
        bytesWritten += chunkSize;
        bytesLeft -= chunkSize;
        // notify the progress bar (assuming you're using a BackgroundWorker)
        backgroundWorker.ReportProgress(bytesWritten * 100 / array.Length);
    }
}
EDIT: as Patashu pointed out below, you can also use tasks and await. I think my method is fairly straightforward and doesn't require any additional threading machinery (besides the one background thread you need to do the operation). It's the traditional way and works well enough.
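If it helps, here is a rough sketch of how the BackgroundWorker side might be wired up, say in the form's constructor. progressBar1 and WriteFileInChunks are placeholder names; the latter stands for the chunked loop above.

var backgroundWorker = new BackgroundWorker { WorkerReportsProgress = true };

backgroundWorker.DoWork += (s, e) =>
{
    // Runs on a thread-pool thread; do the chunked write here,
    // calling backgroundWorker.ReportProgress(percent) as it goes.
    WriteFileInChunks(file, (byte[])e.Argument, backgroundWorker);
};

backgroundWorker.ProgressChanged += (s, e) =>
{
    // Raised on the UI thread, so touching controls here is safe.
    progressBar1.Value = e.ProgressPercentage;
};

backgroundWorker.RunWorkerAsync(array); // pass the byte array as the argument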
Since WriteAllBytes is a synchronous method, you can do nothing and know nothing about the operation until it finishes.
What you need is a method like WriteAllBytes, but written to be asynchronous, as described in http://msdn.microsoft.com/en-AU/library/jj155757.aspx. Because it runs separately, your asynchronous method can stop every so often and report its progress to the GUI.
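For illustration, a minimal sketch of such a method, assuming a plain byte array and an IProgress<int> for percentage updates (the method name and the 4096-byte chunk size are just placeholders):

public static async Task WriteAllBytesWithProgressAsync(string path, byte[] data, IProgress<int> progress)
{
    const int chunkSize = 4096;
    using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write,
                                       FileShare.None, chunkSize, useAsync: true))
    {
        int written = 0;
        while (written < data.Length)
        {
            int count = Math.Min(chunkSize, data.Length - written);
            await stream.WriteAsync(data, written, count);
            written += count;
            progress.Report(written * 100 / data.Length);
        }
    }
}

Called from the UI, a Progress<int> created on the UI thread marshals the callback back automatically, e.g. await WriteAllBytesWithProgressAsync(file, array, new Progress<int>(p => progressBar1.Value = p));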
Related
I have three async functions that I want to call from multiple threads in parallel. So far I have tried the following approach:
int numOfThreads = 4;
var taskList = new List<Task>();
using (fs = new FileStream(inputFilePath, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite))
{
    for (int i = 1; i <= numOfThreads; i++)
    {
        taskList.Add(Task.Run(async () =>
        {
            byte[] buffer = new byte[length];     // length could be up to a few thousand
            await Function1Async();               // Reads from the file into a byte array
            long result = await Function2Async(); // Does some async operation with that byte array data
            await Function3Async(result);         // Writes the result into the file
        }));
    }
}
Task.WaitAll(taskList.ToArray());
However, not all of the tasks complete before execution reaches the end. I have limited experience with threading in C#. What am I doing wrong in my code, or should I take an alternative approach?
EDIT -
So I made some changes to my approach. I got rid of Function3Async for now:
for (int i = 1; i <= numOfThreads; i++)
{
    using (fs = new FileStream(----))
    {
        taskList.Add(Task.Run(async () =>
        {
            byte[] buffer = new byte[length]; // length could be up to a few thousand
            await Function1Async(buffer);     // Reads from the file into a byte array
            Stream data = new MemoryStream(buffer);
            /** Write the Stream into a file and return
             *  the offset at which the write operation was done
             */
            long blockStartOffset = await Function2Async(data);
            Console.WriteLine($"Block written at - {blockStartOffset}");
        }));
    }
}
Task.WaitAll(taskList.ToArray());
Now all the tasks seem to run to completion, but Function2Async seems to randomly write some Japanese characters to the output file. I'm guessing it is some threading issue?
Here is the implementation of Function2Async:
public async Task<long> Function2Async(Stream data)
{
    long offset = getBlockOffset();
    using (var outputFs = new FileStream(fileName,
                                         FileMode.OpenOrCreate,
                                         FileAccess.ReadWrite,
                                         FileShare.ReadWrite))
    {
        outputFs.Seek(offset, SeekOrigin.Begin);
        await data.CopyToAsync(outputFs);
    }
    return offset;
}
In your example you have passed neither fs nor buffer into Function1Async but your comment says it reads from fs into buffer, so I will assume that is what happens.
You cannot read from a stream in parallel. It does not support that. If you find one that supports it, it will be horribly inefficient, because that is how hard disk storage works. Even worse if it is a network drive.
Read from the stream into your buffers first, in sequence; then let your threads loose and run your logic in parallel, on the buffers that already exist in memory.
Writing, by the way, would have the same problem if you wrote to the same file. Writing one file per buffer is fine; otherwise, do it sequentially.
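A rough sketch of that ordering, using placeholder names (ProcessBlockAsync stands in for whatever Function2Async does with the data):

async Task ReadThenProcessAsync(string inputFilePath, int blockLength, int blockCount)
{
    var blocks = new List<byte[]>();

    // 1) Read sequentially: one reader, one stream position, no races.
    using (var fs = new FileStream(inputFilePath, FileMode.Open, FileAccess.Read))
    {
        for (int i = 0; i < blockCount; i++)
        {
            var buffer = new byte[blockLength];
            int read = await fs.ReadAsync(buffer, 0, buffer.Length);
            if (read == 0) break; // end of file
            blocks.Add(buffer);
        }
    }

    // 2) Only now process the buffers in parallel; they are independent, in-memory data.
    await Task.WhenAll(blocks.Select(block => ProcessBlockAsync(block)));
}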
So let me explain: I'm making a file signature scanner in C# using MD5 to compute the hashes. A problem I encountered is that ComputeHash() would block the UI thread. So then I thought of Task.Run(), which would solve my problems, or at least I hoped so.
Even when putting the hash computation (and the whole thing) in an async task, it would still block, or at least slow down, the UI thread. And if I remove the hash computation, the UI thread isn't blocked anymore.
Here is my little snippet of code:
Task.Run(() =>
{
    if (int.Parse(label5.Text) != int.Parse(label7.Text))
    {
        listBox1.SelectedIndex++;
        label11.Text = listBox1.SelectedItem.ToString();
        /*progressBar1.Increment(1);
        label5.Text = progressBar1.Value.ToString();
        int percentage = Convert.ToInt32(progressBar1.Value / (double)progressBar1.Maximum * 100);
        label2.Text = "Scanning files (" + percentage + "%)";*/
        label2.Text = "Scanning";
        label9.Text = currentThreats.ToString();
        try
        {
            StringBuilder buff = new StringBuilder();
            using (MD5 md5 = MD5.Create())
            {
                using (FileStream stream = File.OpenRead(label11.Text))
                {
                    byte[] hash = md5.ComputeHash(stream);
                    buff.Append(BitConverter.ToString(hash).Replace("-", string.Empty));
                }
            }
            if (Reference.VirusList.Contains(buff.ToString())) currentThreats++;
        }
        catch { }
        scanned++;
        label5.Text = scanned.ToString();
    }
    else
    {
        StopCurrentScan(false);
    }
});
NOTE: This is being run in a Timer with an interval of 1 millisecond. Just mentioning it in case it helps solve the problem.
Well, my UI thread isn't lagging out anymore now that I followed the steps below, which people suggested to me:
1. I used await Task.Run() instead of Task.Run() so I could log exceptions, and I added Control.CheckForIllegalCrossThreadCalls = true; just after InitializeComponent so I could better understand my mistakes.
2. I removed ALL UI-thread calls (like updating UI elements) from the async task; instead (in my case) I use a timer that updates the UI elements I need from local variables.
3. Similar to 2 but reversed: only the "resource-intensive" work (like ComputeHash()) goes into the async task, as sketched below.
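As a rough illustration of point 3 (ScanFileAsync is a made-up wrapper; the field names come from the question's code), only the hash computation runs inside the task, and everything after the await is back on the UI thread:

private async Task ScanFileAsync(string filePath)
{
    // Only the heavy work runs off the UI thread.
    string hash = await Task.Run(() =>
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(filePath))
        {
            return BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", string.Empty);
        }
    });

    // Back on the UI thread after the await: safe to update counters and controls.
    if (Reference.VirusList.Contains(hash)) currentThreats++;
    scanned++;
    label5.Text = scanned.ToString();
}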
I have this method for receiving a file.
public Task Download(IProgress<int> downloadProgress)
{
return Task.Run
(
async () =>
{
var counter = 0;
var buffer = new byte[1024];
while (true)
{
var byteCount = await _networkStream.ReadAsync(buffer, 0, buffer.Length);
counter += byteCount;
downloadProgress.Report(counter);
if (byteCount != buffer.Length)
break;
}
}
);
}
Then in the UI I call it like this:
await Download(progress);
where progress is simply updating a label.
When I run it, the UI is blocked (though after some time it correctly updates the label). I don't understand why; shouldn't Task.Run() create a new thread?
How do I fix this please?
You are calling downloadProgress.Report in an infinite loop without any pause in execution. My educated guess is that every time execution time is available on the UI thread, the non-UI thread is requesting an operation that requires the UI thread's time (as demanded by the synchronisation context), clogging it up with invocations.
Essentially, rather than blocking the UI thread with one long execution, you may be blocking it with an unending stream of tiny ones.
Try putting a Thread.Sleep(10) in your 'spinlock' while(true) { ... } loop and see if that alleviates the issue.
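Alternatively, as a rough sketch based on your own loop (the 64 KB threshold is arbitrary), you can report less often so the UI thread isn't flooded:

var counter = 0;
var lastReported = 0;
var buffer = new byte[1024];
while (true)
{
    var byteCount = await _networkStream.ReadAsync(buffer, 0, buffer.Length);
    counter += byteCount;

    // Only post to the UI once at least 64 KB has arrived since the last report.
    if (counter - lastReported >= 64 * 1024)
    {
        downloadProgress.Report(counter);
        lastReported = counter;
    }

    if (byteCount != buffer.Length)
        break;
}
downloadProgress.Report(counter); // final report so the label ends up accurate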
I've managed to implement a Task in my class UpdateManager that downloads a file from my webspace.
public async Task DownloadPackageTask(IProgress<int> progress)
{
    var webRequest = WebRequest.Create(new Uri("http://www.mywebspace.de/archive.zip"));
    using (var webResponse = await webRequest.GetResponseAsync())
    {
        var buffer = new byte[1024];
        FileStream fileStream = File.Create("Path");
        using (Stream input = webResponse.GetResponseStream())
        {
            int received = 0;
            double total = 0d;
            var size = GetFileSize(new Uri("...")); // Gets the content length with a request as the Stream.Length-property throws a NotSupportedException
            if (size != null)
                total = (long) size.Value;
            int bytesRead = await input.ReadAsync(buffer, 0, buffer.Length);
            while (bytesRead > 0)
            {
                fileStream.Write(buffer, 0, bytesRead);
                received += bytesRead;
                progress.Report((int)((received / total) * 100));
                bytesRead = await input.ReadAsync(buffer, 0, buffer.Length);
            }
        }
    }
}
This works well: the file is downloaded, and if I add a Debug.Print of (received/total)*100 it outputs the correct percentage; everything is alright. The method is marked as async so that it can be awaited/wrapped asynchronously in a task.
The problem occurs in another class, UpdaterUi, that is basically the interface between the manager and the user interface and calls the method like this:
public void ShowUserInterface()
{
TaskEx.Run(async delegate
{
var downloadDialog = new UpdateDownloadDialog
{
LanguageName = _updateManager.LanguageCulture.Name,
PackagesCount = _updateManager.PackageConfigurations.Count()
};
_context.Post(downloadDialog.ShowModalDialog, null); // Do this with a SynchronizationContext as we are not on the UI thread.
var progressIndicator = new Progress<int>();
progressIndicator.ProgressChanged += (sender, value) =>
downloadDialog.Progress = value;
await TaskEx.Run(() => _updateManager.DownloadPackageTask(progressIndicator));
});
}
The anonymous method that should be invoked whenever the progress changes is never called; nothing happens. I checked this with breakpoints while debugging.
The problem is maybe that the progressIndicator is not created on the UI thread, but on the new thread created by TaskEx.Run. It doesn't fire the event, and consequently the UI does not update the progress bar it contains (which it does in the setter of the Progress-property that is initialized above).
The problem is that I don't know what to do to make it work. How am I supposed to change the implementation in my project, and is my understanding of the threading problem correct?
Thanks in advance!
Your speculation about the problem is right. Progress<T> should be created on the UI thread in order for its notifications to arrive on the UI thread.
It works by capturing the SynchronizationContext and posting the delegate for execution in the captured context. Since yours is created in a non-UI context (the default or thread-pool context), it will raise ProgressChanged on a thread-pool thread.
If you move the line var progressIndicator = new Progress<int>(); out of the TaskEx.Run, it should work (provided ShowUserInterface is called from the UI thread).
I see you're creating UpdateDownloadDialog on a worker thread. You shouldn't do that; move that to the UI thread as well.
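Roughly, assuming ShowUserInterface itself is called on the UI thread, the shape could be something like this (downloadDialog.Show() just stands in for however you actually display the dialog; the key point is that both the dialog and the Progress<int> are created on the UI thread):

public async void ShowUserInterface()
{
    // Created on the UI thread, so Progress<T> captures the UI SynchronizationContext.
    var downloadDialog = new UpdateDownloadDialog
    {
        LanguageName = _updateManager.LanguageCulture.Name,
        PackagesCount = _updateManager.PackageConfigurations.Count()
    };
    var progressIndicator = new Progress<int>(value => downloadDialog.Progress = value);

    downloadDialog.Show(); // shown non-modally here so the await below can run

    // DownloadPackageTask is already asynchronous; no extra TaskEx.Run is needed.
    await _updateManager.DownloadPackageTask(progressIndicator);
}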
I have an application that has many cases. Each case has many multipage tif files. I need to convert the tif files to pdf files. Since there are so many files, I thought I could thread the conversion process. I'm currently limiting the process to ten conversions at a time (i.e. ten threads). When one conversion completes, another should start.
This is the current setup I'm using.
private void ConvertFiles()
{
    List<AutoResetEvent> semaphores = new List<AutoResetEvent>();
    foreach (String fileName in filesToConvert)
    {
        String file = fileName;
        if (semaphores.Count >= 10)
        {
            WaitHandle.WaitAny(semaphores.ToArray());
        }
        AutoResetEvent semaphore = new AutoResetEvent(false);
        semaphores.Add(semaphore);
        ThreadPool.QueueUserWorkItem(
            delegate
            {
                Convert(file);
                semaphore.Set();
                semaphores.Remove(semaphore);
            }, null);
    }
    if (semaphores.Count > 0)
    {
        WaitHandle.WaitAll(semaphores.ToArray());
    }
}
Using this sometimes results in an exception stating that the WaitHandle.WaitAll() or WaitHandle.WaitAny() array parameter must not exceed a length of 64. What am I doing wrong in this approach, and how can I correct it?
There are a few problems with what you have written.
1st, it isn't thread safe. You have multiple threads adding, removing, and waiting on the array of AutoResetEvents. The individual elements of the List can be accessed on separate threads, but anything that adds, removes, or checks all elements (like the WaitAny call) needs to do so inside a lock.
2nd, there is no guarantee that your code will only process 10 files at a time. The window between checking the size of the List and adding a new item is open for multiple threads to get through.
3rd, there is potential for the threads started by QueueUserWorkItem to convert the same file. Without capturing the fileName inside the loop, the thread that converts the file will use whatever value is in fileName when it executes, NOT whatever was in fileName when you called QueueUserWorkItem.
This codeproject article should point you in the right direction for what you are trying to do: http://www.codeproject.com/KB/threads/SchedulingEngine.aspx
EDIT:
var semaphores = new List<AutoResetEvent>();
foreach (String fileName in filesToConvert)
{
    String file = fileName;
    AutoResetEvent[] array;
    lock (semaphores)
    {
        array = semaphores.ToArray();
    }
    if (array.Length >= 10)
    {
        WaitHandle.WaitAny(array);
    }
    var semaphore = new AutoResetEvent(false);
    lock (semaphores)
    {
        semaphores.Add(semaphore);
    }
    ThreadPool.QueueUserWorkItem(
        delegate
        {
            Convert(file);
            lock (semaphores)
            {
                semaphores.Remove(semaphore);
            }
            semaphore.Set();
        }, null);
}
Personally, I don't think I'd do it this way...but, working with the code you have, this should work.
Are you using a real semaphore (System.Threading)? When using semaphores, you typically allocate your max resources and it'll block for you automatically (as you add & release). You can go with the WaitAny approach, but I'm getting the feeling that you've chosen the more difficult route.
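For example, a rough sketch with a counted semaphore doing the throttling (CountdownEvent requires .NET 4 or later; the names are illustrative):

var throttle = new Semaphore(10, 10);               // at most ten conversions in flight
using (var allDone = new CountdownEvent(filesToConvert.Count))
{
    foreach (String fileName in filesToConvert)
    {
        String file = fileName;                     // capture a copy for the closure
        throttle.WaitOne();                         // block here until a slot frees up
        ThreadPool.QueueUserWorkItem(delegate
        {
            try { Convert(file); }
            finally
            {
                throttle.Release();
                allDone.Signal();
            }
        });
    }
    allDone.Wait();                                 // wait for every file, not just the last ten
}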
Looks like you need to remove the handle that triggered the WaitAny in order to proceed:
if (semaphores.Count >= 10)
{
    int index = WaitHandle.WaitAny(semaphores.ToArray());
    semaphores.RemoveAt(index);
}
So basically I would remove the:
semaphores.Remove(semaphore);
call from the thread and use the above to remove the signaled event and see if that works.
Maybe you shouldn't create so many events?
// input
var filesToConvert = new List<string>();
Action<string> Convert = Console.WriteLine;

// limit
const int MaxThreadsCount = 10;
var fileConverted = new AutoResetEvent(false);
long threadsCount = 0;

// start
foreach (var file in filesToConvert) {
    if (Interlocked.Read(ref threadsCount) >= MaxThreadsCount) // reached max threads count
        fileConverted.WaitOne(); // wait for one of the started threads to finish
    Interlocked.Increment(ref threadsCount);
    ThreadPool.QueueUserWorkItem(
        delegate {
            Convert(file);
            Interlocked.Decrement(ref threadsCount);
            fileConverted.Set();
        });
}

// wait
while (Interlocked.Read(ref threadsCount) > 0) // paranoia?
    fileConverted.WaitOne();