We have an Excel invoice data input of 3 million customers to be processed every month.
There are only 8 fields in the data. The time consumed converting it to PDF is too long, and we are not able to meet the TAT (turnaround time).
Can anyone suggest ways to reduce the processing time?
There are several possibilities to reduce processing time.
First, check where the bottleneck actually is.
Here are some ideas to reduce processing time:
- Use the TPL to parallelize processing (see the sketch after this list): https://en.wikipedia.org/wiki/Parallel_Extensions#Task_Parallel_Library
- Use a third-party library to process the Excel files (e.g. Aspose.Cells, Aspose.PDF)
- If the hardware is the bottleneck, use an SSD and a faster CPU, or use CUDA for the processing: https://en.wikipedia.org/wiki/CUDA
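To illustrate the TPL point, here is a minimal sketch assuming the rows are already loaded into an Invoice type; Invoice and GeneratePdf are placeholders for your own data model and whatever rendering library you end up using, not a specific API:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Invoice
{
    public string CustomerId { get; set; }
    // ... the remaining fields of the invoice row
}

static class InvoiceBatchProcessor
{
    // Processes all invoices, generating one PDF per customer in parallel.
    public static void ProcessAll(IReadOnlyList<Invoice> invoices, string outputFolder)
    {
        var options = new ParallelOptions
        {
            // Tune this against your CPU and disk throughput.
            MaxDegreeOfParallelism = Environment.ProcessorCount
        };

        Parallel.ForEach(invoices, options, invoice =>
        {
            // GeneratePdf stands in for your existing per-invoice rendering code
            // (e.g. an Aspose.Cells/Aspose.PDF call).
            GeneratePdf(invoice, outputFolder);
        });
    }

    static void GeneratePdf(Invoice invoice, string outputFolder)
    {
        // ... render the 8 fields into a PDF and save it under outputFolder
    }
}
```

Whether this helps depends on whether the per-invoice rendering is CPU-bound; if the disk is the limit, parallelism alone will not close the gap.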
I am new to threading and multithreading. I have a method that fetches IDs as input, in JSON format, from the DB:
{
    "ID": [ "1",
            ....,
            "30000" ]
}
These IDs then need to be processed via a Web API POST call. The issue is that, even though the code is optimized, it takes hours to process all the data.
How can I process these IDs in batches or with multithreading to make it faster?
Recent versions of .NET have great libraries you can use that take care of the multi-threading for you.
Check out the Parallel For Each loop: https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.parallel.foreach?view=netcore-3.1
You pass it a list; everything inside the loop body is executed once for each item, and the runtime performs some of the iterations in parallel (multi-threaded). That means you can process more than one ID at a time.
Whether or not it improves performance depends on the environment and work being done.
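As a rough sketch of that idea, assuming a single shared HttpClient and a hypothetical api/process endpoint (adjust the URL and payload to your real API):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

static class IdProcessor
{
    // HttpClient is meant to be created once and reused across requests.
    static readonly HttpClient Client = new HttpClient { BaseAddress = new Uri("https://example.com/") };

    public static void ProcessIds(IReadOnlyList<string> ids)
    {
        var options = new ParallelOptions
        {
            // Cap concurrency so the Web API is not flooded; tune as needed.
            MaxDegreeOfParallelism = 16
        };

        Parallel.ForEach(ids, options, id =>
        {
            // Hypothetical endpoint and payload -- replace with your real POST call.
            var content = new StringContent($"{{\"ID\":\"{id}\"}}", Encoding.UTF8, "application/json");

            // Parallel.ForEach bodies are synchronous, so the async call is blocked on here.
            var response = Client.PostAsync("api/process", content).GetAwaiter().GetResult();
            response.EnsureSuccessStatusCode();
        });
    }
}
```

For purely I/O-bound work like HTTP calls, an async approach (e.g. Task.WhenAll combined with a SemaphoreSlim to limit concurrency) usually scales better than blocking threads, but the sketch above stays with the Parallel.ForEach suggested here.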
Recently I was working on a project that writes a CSV file and an XML file at the same time.
They contain the same metadata. How can I open two StreamWriters in C# at the same time?
The question is broad, so the answer is generic. In order "to write a CSV file and an XML file at the same time" you need some form of multi-threading: use the Task Parallel Library (TPL, recommended) or another technique available in .NET, and run the two write procedures on two different threads (in either blocking or non-blocking mode).
More details on TPL implementation: https://msdn.microsoft.com/en-us/library/dd537609%28v=vs.110%29.aspx.
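A minimal sketch of that approach using the TPL's Parallel.Invoke, assuming the CSV and XML serialization already exist as two methods (WriteCsv and WriteXml are placeholders for your own logic):

```csharp
using System.IO;
using System.Threading.Tasks;

static class MetadataExporter
{
    // Writes the CSV file and the XML file concurrently on two threads.
    public static void ExportBoth(string csvPath, string xmlPath)
    {
        Parallel.Invoke(
            () => WriteCsv(csvPath),
            () => WriteXml(xmlPath));
    }

    static void WriteCsv(string path)
    {
        using (var writer = new StreamWriter(path))
        {
            // ... write the metadata as CSV rows
            writer.WriteLine("id,name,value");
        }
    }

    static void WriteXml(string path)
    {
        using (var writer = new StreamWriter(path))
        {
            // ... write the same metadata as XML
            writer.WriteLine("<metadata />");
        }
    }
}
```

Each thread owns its own StreamWriter and its own file, so no synchronization between the two writers is needed.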
Hope this may help.
I want to test my ASP.NET Web API REST service, hosted on an HP-580dl, and measure the performance and response time when 10,000 simultaneous requests hit the service.
Is there any way to do that in C#?
Siege is a good tool for measuring load under concurrent requests: http://www.joedog.org/siege-home/
It's not written in C#, but there's no reason why it should be.
The Visual Studio load testing tools provide this functionality, and can control multiple agents in cases where you want a distributed profile and/or a greater concurrency level than a single client machine can support.
See the Microsoft documentation: Create and run a load test.
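If a rough measurement from C# itself is enough, a minimal sketch could fire the requests with HttpClient and Task.WhenAll and time them with Stopwatch; the endpoint URL is a placeholder, and note that a single client machine often cannot generate 10,000 truly simultaneous requests (port and thread-pool limits), which is why the dedicated tools above exist:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class LoadTest
{
    static readonly HttpClient Client = new HttpClient();

    static async Task Main()
    {
        const int requestCount = 10_000;                    // target number of concurrent requests
        const string url = "http://localhost/api/values";   // placeholder endpoint

        var stopwatch = Stopwatch.StartNew();

        // Start all requests without awaiting, then wait for them all to complete.
        var tasks = Enumerable.Range(0, requestCount)
                              .Select(_ => Client.GetAsync(url))
                              .ToArray();
        var responses = await Task.WhenAll(tasks);

        stopwatch.Stop();

        Console.WriteLine($"Completed {responses.Length} requests in {stopwatch.Elapsed}.");
        Console.WriteLine($"Successful: {responses.Count(r => r.IsSuccessStatusCode)}");
    }
}
```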
I have developed a C# application that I wish to sell. As you probably know, many people simply set their system clock back and keep using trial software. How can I prevent that?
Any ideas?
The easiest and safest way would be to require internet access and validate the time online.
But requiring network access is a hard constraint, and it does not suit every usage scenario.
Alternatively, you could keep an encrypted file in which you store the last time your application was launched. If the system clock on the next launch is earlier than the stored last launch, something must be fishy.
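A minimal sketch of that second idea; for clarity it stores the timestamp as plain ticks under a hypothetical file name, whereas in practice you would encrypt or otherwise protect the value as suggested above:

```csharp
using System;
using System.IO;

static class ClockTamperCheck
{
    // Hypothetical location for the stored last-launch timestamp.
    static readonly string StampPath =
        Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                     "MyApp", "lastrun.dat");

    // Returns true if the system clock appears to have been rolled back.
    public static bool ClockLooksRolledBack()
    {
        // UTC avoids false positives from time zone or DST changes.
        var now = DateTime.UtcNow;

        if (File.Exists(StampPath))
        {
            long storedTicks = long.Parse(File.ReadAllText(StampPath));
            var lastRun = new DateTime(storedTicks, DateTimeKind.Utc);

            if (now < lastRun)
                return true; // current time is before the last recorded launch => fishy
        }

        Directory.CreateDirectory(Path.GetDirectoryName(StampPath));
        File.WriteAllText(StampPath, now.Ticks.ToString());
        return false;
    }
}
```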
I have a requirement to load a file containing up to 1 million lines of string data. My first thought is to use C# 5.0 async to load the data whilst not blocking the UI thread. If the user tries to access something that relies on the data they will get a loading message.
Still I would like the fastest possible method in order to improve the user's experience.
Is the speed of reading data from the disk purely a function of the disk speed, so that File.ReadAllLines() is as performant as any other C# code? Or is there something 'fancy' I can do programmatically to boost performance? This does not have to be described in detail. If so, what approximate percentage improvement might be achieved?
I am purely interested in read speed and not concerned with the speed of code that may process the data once loaded.
First of all, take a look at the file size; detailed performance measurements of the common read methods are available.
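As a point of comparison, here is a minimal sketch of loading the file off the UI thread; File.ReadAllLines is the simple baseline, and the buffered StreamReader variant shows the kind of tweak (larger buffer, sequential-scan hint) that typically yields only modest gains because the disk remains the bottleneck. LineLoader and both method names are placeholders:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

static class LineLoader
{
    // Simple baseline: read everything on a background thread so the UI stays responsive.
    public static Task<string[]> LoadAllLinesAsync(string path)
    {
        return Task.Run(() => File.ReadAllLines(path));
    }

    // Variant with a larger buffer and a sequential-scan hint for the OS.
    public static Task<List<string>> LoadAllLinesBufferedAsync(string path)
    {
        return Task.Run(() =>
        {
            var lines = new List<string>();
            var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read,
                                        bufferSize: 1 << 16, FileOptions.SequentialScan);
            using (var reader = new StreamReader(stream))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    lines.Add(line);
            }
            return lines;
        });
    }
}
```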