Processing files in multiple threads [closed] - c#

I have a Windows Forms application which loads a list of paths to video files into an array and then, with a foreach (string[] file in fileList) loop, goes through each file, runs some analysis on the video, and writes the result back into the fileList array.
The trouble is that it processes each video file at slightly less than real time, which is not ideal. I am aiming to split the work across multiple threads. I have tested opening the application five times and running the processing on separate files; the CPU handles this without issue.
What would be the simplest way to split up the processing across multiple threads?
Edit: I am new to multithreading and currently learning. I know there are different ways to multithread, but I am looking for the method I should use in order to learn about it.
I found this example, but I don't understand how it works; it seems too simple compared to the other examples I have been looking at.
Best way to do a multithread foreach loop

If the results are added to the same collection, PLINQ makes it a bit easier:
var results = fileList.AsParallel().Select(file => {
    var bytes = File.ReadAllBytes(file);
    var result = bytes.Length;
    return result;
}).ToList();
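Applied to the original question, the same pattern would look roughly like this. This is only a sketch: AnalyzeVideo is a hypothetical stand-in for whatever per-file analysis is already being run, and fileList is assumed to hold the file paths.

// Requires: using System.Linq;
var analysisResults = fileList
    .AsParallel()
    .Select(path => new { Path = path, Result = AnalyzeVideo(path) }) // AnalyzeVideo = your existing analysis routine
    .ToList();

Collecting the results into a new list also avoids writing back into the shared fileList array from multiple threads at once.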

Related

How to Process large JSON Data By MultiThreading in C#? [closed]

I am new to threading and multithreading. I have a method which fetches IDs as input in JSON format from the DB, like:
{
    "ID": ["1", ..., "30000"]
}
Now these IDs have to be processed again via a WebAPI POST call. The issue is that, although the code is optimized, it takes hours to process all the data.
How can I process these IDs in batches or with multithreading to make it faster?
Recent versions of .NET have great libraries you can use that take care of the multi-threading for you.
Check out the Parallel For Each loop: https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.parallel.foreach?view=netcore-3.1
You pass it a list, and everything inside the loop is executed once for each item in the list; the runtime will run some of the iterations in parallel (multi-threaded). That means you could process more than one ID at the same time.
Whether or not it improves performance depends on the environment and work being done.
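A minimal sketch of that approach, assuming the IDs have already been deserialized into a list and PostId is a hypothetical synchronous wrapper around the WebAPI POST call:

// Requires: using System.Threading.Tasks;
Parallel.ForEach(
    ids,                                                  // List<string> of the deserialized IDs
    new ParallelOptions { MaxDegreeOfParallelism = 8 },   // tune for your environment
    id => PostId(id));                                    // hypothetical wrapper around the POST call

Because the POST calls are I/O-bound rather than CPU-bound, an async approach (for example batching the calls and awaiting them with Task.WhenAll) may scale better than Parallel.ForEach, which keeps thread-pool threads blocked while they wait on the network.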

C# threading for file uploader [closed]

I'm working with C# WinForms.
I have a list of files. With a loop I need to upload all the files (5000 images) to another server. How can I implement this with multithreading?
And another point: how can I know when one thread has finished so it can be used for the next file waiting to be uploaded? Do I need to use the Monitor class?
You can use PLINQ for that:
IEnumerable<string> yourFiles = new[] { "C:\\file.txt", "D:\\data.dat" };
int numberOfThreads = 10;
yourFiles.AsParallel().WithDegreeOfParallelism(numberOfThreads).ForAll(UploadFile);

private static void UploadFile(string file)
{
    // do the actual uploading
}
Maybe Parallel.For is something for you. It is easy to use. You know when a thread is finished because you can set some signal at the end of the method running on the other thread, something like a ManualResetEvent; see the sketch after the links below. I think Parallel.For is the fastest to implement. You can use a thread pool as well. Read through the Microsoft documentation.
Parallel.For (.NET 4): For(Int32, Int32, Action<Int32>)
ThreadPool (.NET 2): ThreadPool.QueueUserWorkItem(WaitCallback)
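A rough sketch of the ThreadPool option, using a counter plus a ManualResetEvent so the caller knows when the last upload has finished. files and UploadFile are placeholders for your own list and upload routine:

// Requires: using System.Threading;
int remaining = files.Count;
var allDone = new ManualResetEvent(false);

foreach (var file in files)
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        UploadFile(file);                              // your upload routine
        if (Interlocked.Decrement(ref remaining) == 0)
            allDone.Set();                             // the last finished item signals completion
    });
}

allDone.WaitOne();  // blocks until every queued upload has run

Parallel.For and Parallel.ForEach hide all of that: the call itself does not return until every iteration has completed.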

Calculate Statistics over multiple DataSets using R or some other statistics package called from C# [closed]

Looking for the best way to calculate statistics (like the Shapiro–Wilk test) for multiple datasets and get the calculations back. I can do it in the R GUI one dataset at a time manually, but I am wondering if I can write a C# program to pull the data from SQL Server and then pass each dataset to some statistics package like R.
Thanks
Yours is a somewhat broad question but, in principle, you may explore the following approach:
1. A .NET program reads data from a database and writes that data to text files.
2. The .NET program "spawns" the R interpreter (as a process), which executes an R script.
3. The R script reads the data from the text files, computes the statistics, and writes the results to text files.
4. The .NET program reads the results produced by the R script from the text files.
I've had some success with this workflow in the past. I wasn't aware of any usable integration options for R and .NET back then -- you may also check whether you can find something for a more "refined" integration between the two. For example, I know a COM interface is available here.
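A hedged sketch of step 2, assuming Rscript is on the PATH and analysis.R is the hypothetical script that reads the exported text files and writes its results back out:

// Requires: using System.Diagnostics;
var psi = new ProcessStartInfo
{
    FileName = "Rscript",          // assumes the R script runner is on the PATH
    Arguments = "analysis.R",      // hypothetical script name
    UseShellExecute = false,
    RedirectStandardOutput = true
};

using (var r = Process.Start(psi))
{
    string consoleOutput = r.StandardOutput.ReadToEnd();
    r.WaitForExit();               // block until the R script has finished
}
// The results can now be read back from the files the script wrote.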

Quick file loading in C# [closed]

I have a requirement to load a file containing up to 1 million lines of string data. My first thought is to use C# 5.0 async to load the data whilst not blocking the UI thread. If the user tries to access something that relies on the data they will get a loading message.
Still I would like the fastest possible method in order to improve the user's experience.
Is the speed of reading data from the disk purely a function of the disk speed, and thus is File.ReadAllLines() as performant as any other C# code? Or is there something 'fancy' I can do to boost performance programmatically? This does not have to be described in detail. If so, what approximate percentage improvement might be achieved?
I am purely interested in read speed and not concerned with the speed of code that may process the data once loaded.
First of all, take a look at the file size; detailed performance measurements are available here.
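For what it's worth, a minimal sketch of keeping the UI responsive while the read happens on a background thread; the raw read speed is still bounded by the disk, and path is a placeholder:

// Requires: using System.IO; using System.Threading.Tasks;
// Inside an async event handler:
string[] lines = await Task.Run(() => File.ReadAllLines(path));
// If you don't need the whole array at once, File.ReadLines(path) streams
// the lines lazily instead of materializing a million strings up front.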

Zip Files one by one Or All of them together [closed]

I use GZip in C#. I have multiple files to zip and put together.
But I don't know whether zipping the files one by one and then saving them into one file is better than combining them into a single file first and then zipping only that combined file.
What about the size of the produced file? Which option produces the smaller file size?
If you are looking to optimize for file size, putting them in one zip file might be a bit better, depending on the data in the files, the exact settings used, etc.
If you are looking to optimize on running time, and the library you're using is single-threaded, then you can potentially get a large performance bump by zipping each file in its own thread, but that assumes that Gzip is processor bound, which it might well not be.
Overall, the results will probably be subjective, and very dependent on the data you're trying to compress. Your best option would likely be to grab a sample of somewhat representative files, GZip them a few different ways, and see which way works best for what you need.
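If you want to measure it yourself, a minimal per-file GZip sketch (file names are placeholders) looks like the following; for the "combine first" variant you would concatenate the inputs into one file and compress only that:

// Requires: using System.IO; using System.IO.Compression;
static void GZipFile(string inputPath, string outputPath)
{
    using (var input = File.OpenRead(inputPath))
    using (var output = File.Create(outputPath))
    using (var gzip = new GZipStream(output, CompressionMode.Compress))
    {
        input.CopyTo(gzip);   // compress the whole input into the .gz output
    }
}

// e.g. GZipFile("Image1.jpg", "Image1.jpg.gz");

Keep in mind that GZip compresses a single stream and is not an archive format, which is exactly why the files have to be combined first if you want one compressed output.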
I tried both options on two image files and the result is:
Image1 6 KB + Image2 7 KB = 13 KB
Image1Compressed 3 KB + Image2Compressed 3 KB = 6 KB
In this sample there is no difference, so it is better to use the second option.
