Calling multiple services in a method. How to do it effectively? - C#

I have an ASP.NET (MVC) web page displaying 10,000 products.
For this I am using a method that has to call an external web service 20 times: the service returns 500 records per call, so getting 10,000 records takes 20 calls.
Those 20 calls make the page load slowly, and I need to improve the performance. Since the web service is external, I cannot change it.
Threading is one option I thought of. Since I can use page numbers (the service pages the data), each service call is almost independent.
Another option is using Parallel LINQ.
Should I use Parallel LINQ, or choose threading?
Please guide me here, or let me know another way to achieve this.
Note: this web page can be used by many users at a time.
We have filters on the left side of the page, and to build them we need all 10,000 records; otherwise page-wise data would have been enough. Caching is not possible because of the huge load on the server: 400-1000 users can hit it at a time. The web service's response time is about 10 seconds per call, and we hit it many times.
We have to hit the service 20 times to get all the data, and I need a way to speed those hits up. Is threading the only option?

If you can't cache the data from the service, then just fetch the data you need when you need to display it. I very much doubt that anybody wants to see all 10,000 products on a single web page, and if they do, there is probably something wrong!

Threads and Parallel LINQ will not help you here.
Parallel LINQ is meant for sharing lots of CPU-bound work across cores; what you want to do is make 20 web requests at the same time, and you will need threading (or async I/O) to do that.
You'll probably want to use the built-in async capability of HttpWebRequest (see BeginGetResponse).
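For illustration, a hedged sketch of that approach: wrap the BeginGetResponse/EndGetResponse pair in tasks (via .NET 4's Task.Factory.FromAsync) so all 20 page requests are in flight at once. The endpoint URL and paging parameters are placeholders for the real service.

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

static Task<string> GetPageAsync(int page)
{
    // Placeholder endpoint; substitute the real service URL and paging scheme.
    var request = (HttpWebRequest)WebRequest.Create(
        "https://example.com/products?page=" + page + "&pageSize=500");
    return Task<WebResponse>.Factory
        .FromAsync(request.BeginGetResponse, request.EndGetResponse, null)
        .ContinueWith(t =>
        {
            using (var response = t.Result)
            using (var reader = new StreamReader(response.GetResponseStream()))
                return reader.ReadToEnd(); // one raw 500-record page
        });
}

// Usage: all 20 requests are in flight at once, so the total wait is
// close to one slow call rather than twenty in sequence:
// string[] pages = Task.WhenAll(Enumerable.Range(1, 20).Select(GetPageAsync)).Result;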

Consider calling that service asynchronously. Most of the delay in calling a web service comes from I/O operations, which can be done simultaneously.
But getting 10,000 items per request is something very scary :)


Best way to rate limit a client-side API in C#

I've run into an issue that I'm struggling to decide the best way to solve. Perhaps my software architecture needs to change?
I have a cron job that hits a method on my website every 10 seconds, and that method in turn makes a call to an external API each time. However, the API is rate limited to x calls per minute and y calls per day.
Currently I'm exceeding the API limits and need to control this in the website method somehow. I've thought about storing state in a file, but that seems hacky; similarly a database, as I don't currently use one for this project.
I've tried this package: https://github.com/David-Desmaisons/RateLimiter but alas it doesn't work in my scenario; I think it would work if I made the requests in a single loop, as in his examples. I noticed he has a persistent timer (PersistentCountByIntervalAwaitableConstraint), but there is no documentation or examples for it (I emailed him in case). I've done a lot of googling and can only find examples of server-side rate limiting, which is the other way around: the server limiting the client, not the client limiting its own requests to the server.
How can I solve this without changing the cron jobs? What does everyone think is the best solution?
Assuming that you don't want to change the clients generating the load, there is no choice but to implement rate limiting on the server.
Since an ASP.NET application can be restarted at any time, the state used for that rate-limiting must be persisted somewhere. You can choose any data store you like for that.
In this case you have two limits: One per minute and one per day. If you simply apply two separate rate limiters you will end up with the daily limit being exceeded fairly quickly. After that, there will be no further access for the rest of the day. Likely, this is undesirable.
It seems better to only apply the daily limit because it is more restrictive. A simple solution would be to calculate how far apart requests must be to meet the daily limit. Then, you store the date of the last request. Any new incoming request is immediately failed if not enough time has passed.
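A minimal sketch of that spacing idea, assuming a daily limit of 1000 calls and a plain file as the persisted store (the limit, path, and store are all placeholders; it also assumes a single caller at a time, which matches one cron job):

using System;
using System.Globalization;
using System.IO;

public static class DailyRateGate
{
    const int DailyLimit = 1000;                 // placeholder: the real daily quota
    const string StatePath = "last-request.txt"; // placeholder: any persistent store works

    // The minimum spacing between requests that keeps a full day under the quota.
    static readonly TimeSpan MinInterval =
        TimeSpan.FromTicks(TimeSpan.FromDays(1).Ticks / DailyLimit);

    // Single-caller assumption: fine for one cron job, not for concurrent clients.
    public static bool TryAcquire()
    {
        var last = File.Exists(StatePath)
            ? DateTime.Parse(File.ReadAllText(StatePath), null, DateTimeStyles.RoundtripKind)
            : DateTime.MinValue;

        if (DateTime.UtcNow - last < MinInterval)
            return false; // fail the incoming request immediately

        File.WriteAllText(StatePath, DateTime.UtcNow.ToString("o"));
        return true;
    }
}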
Let me know if this helps you.

How do I optimize parallel calls to a service?

I've hit a challenge when pulling data down from a service. I'm using the following call to Parallel.ForEach:
Parallel.ForEach(idList, id => GetDetails(id));
GetDetails(id) calls a web service that takes roughly half a second and adds the resulting details to a list.
static void GetDetails(string id)
{
    var details = WebService.GetDetails(Key, Secret, id);
    AllDetails.Add(id, details);
}
The problem is, I know the service can handle more calls, but I can't figure out how to get my process to ramp up more calls, UNLESS I split my list and run the process multiple times. In other words, if I open this app, GetDetails.exe, 4 times and split the IDs between them, I cut the run time down to 25% of the original. This tells me the capacity is there, but I'm unsure how to use it without running the console app multiple times.
Hopefully this is a pretty simple issue for folks that are more familiar with parallelism, but in my research I've yet to solve it without running multiple instances.
A few possibilities:
There's a chance that WebService.GetDetails(...) is using some kind of mechanism to ensure that only one web request actually happens at a time.
.NET itself may be limiting the number of connections, either to a given host or in general; see this question's answers for details about these kinds of limits.
If WebService.GetDetails(...) reuses some kind of identifier like a session key, the server may be limiting the number of concurrent requests that it accepts for that one session.
It's generally a bad idea to try to solve performance issues by hammering the server with more concurrent requests. If you control the server, then you're causing your own server to do way more work than it needs to. If not, you run the risk of getting IP-banned or something for abusing their service. It's worth checking to see if the service you're accessing has some options to batch your requests or something.
As Scott Chamberlain mentioned in comments, you need to be careful with parallel processes because accessing structures like Dictionary<> from multiple threads concurrently can cause sporadic, hard-to-track-down bugs. You'd probably be better off using async requests rather than parallel threads. If you're careful about your awaits, you can have multiple requests be active concurrently while still using just a single thread at a time.
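As a hedged sketch of that async approach: GetDetailsAsync is hypothetical (if the service only offers the synchronous call, wrap it instead), and a ConcurrentDictionary stands in for the shared dictionary so concurrent writes are safe.

using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// object stands in for the service's details type.
static readonly ConcurrentDictionary<string, object> AllDetails =
    new ConcurrentDictionary<string, object>();

static async Task GetAllDetailsAsync(IEnumerable<string> idList)
{
    var tasks = idList.Select(async id =>
    {
        // GetDetailsAsync is assumed; if the service has no task-based
        // API, wrap the synchronous call (e.g. in Task.Run) instead.
        var details = await WebService.GetDetailsAsync(Key, Secret, id);
        AllDetails[id] = details; // the indexer set is thread-safe
    });
    await Task.WhenAll(tasks); // all requests in flight together
}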

How to reduce the execution time in C# while calling an API?

I am creating a Windows Forms application which calls a web service to fetch data. I have to fetch information for 200+ clients, and for each client I have to fetch all of its users' information; a client can have 50 to 100 users. So, after getting the full client list, I call the web service in a loop, once per client, to fetch that client's users. This is a long process, currently taking 40-50 minutes for a single data fetch, and I want to reduce the execution time for the whole thing. Please suggest the approach best suited to my application, whether multithreading or anything else.
Thanks in advance.
If you are in control of the web service, add a method that returns all the clients at once instead of one by one, to avoid round trips, as Michael suggested.
If not, make sure to issue as many requests at the same time (not in sequence) as possible, to cut out as much latency as you can. Each request costs at least one round trip (at least your ping's worth of delay), so 150 sequential requests cost your ping to the server times 150 of "just waiting on the network". If you split those requests into 4 bunches and run the bunches in parallel, you only wait about (150/4) times the ping, as sketched below. The more requests you run concurrently, the less you wait.
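A minimal sketch of that bunching idea, where FetchUsersAsync is a placeholder for whatever call fetches one client's users: each bunch walks its share of the list sequentially while the bunches themselves run in parallel.

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

static Task FetchAllAsync(IReadOnlyList<int> clientIds, int bunchCount = 4)
{
    var bunches = Enumerable.Range(0, bunchCount).Select(async b =>
    {
        // Bunch b handles indices b, b + bunchCount, b + 2 * bunchCount, ...
        for (int i = b; i < clientIds.Count; i += bunchCount)
            await FetchUsersAsync(clientIds[i]); // hypothetical per-client call
    });
    return Task.WhenAll(bunches); // only ~count/bunchCount round trips are felt
}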
I suggest you avoid calling the service in a loop for every client to get the details; instead, do that loop on the server and return all the data in one shot. Otherwise you will suffer a lot of needless latency from the thousands of calls, over and above the server time and data-transfer time.
This is also a pattern, called Remote Facade (a remote flavour of the Facade pattern), described by Martin Fowler and the Gang of Four:
any object that's intended to be used as a remote object needs a coarse-grained interface that minimizes the number of calls needed to get something done [...] Rather than ask for an order and its order lines individually, you need to access and update the order and order lines in a single call.
In case you're not in control of the web service, you could try using a Parallel.ForEach loop instead of a plain foreach loop to query the web service.
The MSDN has a tutorial on how to use it: http://msdn.microsoft.com/en-us/library/dd460720(v=vs.110).aspx
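For instance, a hedged sketch along those lines, where GetUsersForClient is a placeholder for the real call and the degree of parallelism is a guess to be tuned against what the service tolerates:

using System.Threading.Tasks;

Parallel.ForEach(
    clients,
    new ParallelOptions { MaxDegreeOfParallelism = 8 }, // assumption: tune to the service
    client => GetUsersForClient(client.Id));            // hypothetical per-client fetch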

Writing synchronous queries or async ones

I have an ASP.NET WebForms app, and on one of its most complex pages I run 8 queries. 2 of the queries can be cached and already are, but the other 6 require hitting the DB. The page loads fine, without any delay, within 2 seconds. However, as a best practice, and for performance, I want to know if I should make the queries async. The problem is that if I make them async, each query has to use a different connection, because currently I store the connection object in HttpContext.Current.Items, which won't be available if I'm on a different thread.
Should I use the Task API, or leave the queries synchronous? Please suggest the best practice.
In my opinion the best option is to combine those queries together. If that is completely impossible, at least run them over a single SQL connection. Async will probably not improve the load time as long as you use a shared SQL connection, and I don't think sharing one across threads is possible anyway.
If you want to work on optimizing database access time then try to implement what Garath and mesterak suggested – that should give you additional performance improvements.
However, I must say that if my page was loading under 2s I wouldn’t really bother making any optimizations in this area.
A couple of questions to ask yourself before you continue working on this:
How do you know it's the database calls that make the biggest impact on page load?
Have you looked at a page trace, or just assumed that it's the database making the biggest impact?
What have you done to optimize other elements?
Here are a couple of other suggestions for you to try:
Create a page trace and examine the results. Here is a good tutorial on this.
Examine your page using PageSpeed and see if there are any optimizations in other areas
Check out these tutorials on how to optimize other page elements
I believe it depends on how you are making calls to your DB. It is possible to create a simple static database-handling class that establishes one connection, so any queries performed reuse the same connection object. Async calls just mean you are performing them in parallel; this may improve the user experience (faster perceived performance), but the number of calls to the DB stays the same. Perhaps we could better answer your question if you provided further details on how you execute queries against your database in your code.
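If you do experiment with the Task route, here is a hedged sketch: each query opens its own connection (connection pooling makes that cheap), so nothing needs to come out of HttpContext.Current.Items. The connection string and SQL are placeholders.

using System.Data.SqlClient;
using System.Threading.Tasks;

static async Task<object> RunQueryAsync(string connectionString, string sql)
{
    // A fresh pooled connection per query keeps the queries independent
    // of each other and of HttpContext.Current.Items.
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        await conn.OpenAsync();
        return await cmd.ExecuteScalarAsync(); // placeholder: adapt to read full result sets
    }
}

// Usage: the six uncached queries can then run concurrently, e.g.
// var results = await Task.WhenAll(sqlStatements.Select(q => RunQueryAsync(cs, q)));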

Hints for a high-traffic web service, C# ASP.NET SQL 2000

I'm developing a web service whose methods will be called from a "dynamic banner" that shows a sort of queue of messages read from a SQL Server table.
The banner will be under heavy pressure on the home pages of high-traffic sites; every time the banner loads, it will call my web service to obtain the fresh queue of messages.
Now: I don't want all this traffic driving queries to the database every time the banner loads, so I'm thinking of using the ASP.NET cache (i.e. HttpRuntime.Cache[cacheKey]) to limit database accesses; I'll try to refresh the cache every minute or so.
Obviously I'll try to keep the messages as small as possible, to limit traffic.
But maybe there are other ways to deal with such a scenario; for example, I could write the latest version of the queue to the file system and have the web service read that file, or something mixing the two approaches...
The stack is a C# web service, ASP.NET 3.5, SQL Server 2000.
Any hint? Other approaches?
Thanks
Andrea
It depends on a lot of things:
If there is little change in the data (think a backend with a "publish" button, or daily batches), then I would definitely use static files (updated via push from the backend). We used this solution on a couple of large sites and it worked really well.
If the data is small enough, memory caching (i.e. the HTTP cache) is viable, but beware of locking issues, and also be aware that the HTTP cache will not work that well under heavy memory load, because items can be expired early if the framework needs memory. I have been bitten by that before! With the above caveats, the HTTP cache works quite well.
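For reference, a minimal sketch of the one-minute HttpRuntime.Cache refresh the question describes, assuming a hypothetical LoadMessagesFromDb() helper that runs the real query:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

static List<string> GetBannerMessages()
{
    var messages = (List<string>)HttpRuntime.Cache["banner-messages"];
    if (messages == null) // note the locking caveat above: two concurrent misses can both hit the DB
    {
        messages = LoadMessagesFromDb(); // hypothetical helper running the real query
        HttpRuntime.Cache.Insert(
            "banner-messages", messages, null,
            DateTime.UtcNow.AddMinutes(1),   // absolute expiry: at most ~one DB hit per minute
            Cache.NoSlidingExpiration);
    }
    return messages;
}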
I think caching is a reasonable approach and you can take it a step further and add a SQL Dependency to it.
ASP.NET Caching: SQL Cache Dependency With SQL Server 2000
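A hedged sketch of that variant, assuming the polling-based dependency has been configured for SQL Server 2000 (via aspnet_regsql and web.config); "BannerDb" and "Messages" are placeholder names, and messages is the list from the sketch above:

using System.Web;
using System.Web.Caching;

// The cached entry is evicted automatically when the Messages table changes.
var dependency = new SqlCacheDependency("BannerDb", "Messages");
HttpRuntime.Cache.Insert("banner-messages", messages, dependency);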
If you go the file route, keep this in mind.
http://petesbloggerama.blogspot.com/2008/02/aspnet-writing-files-vs-application.html
Writing a file is a better solution IMHO - it's served by IIS kernel code, without the huge ASP.NET overhead, and you can copy the file to CDNs later.
AFAIK dependency caching is not very efficient with SQL Server 2000.
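Going back to the file route, a minimal sketch under stated assumptions (the path and JSON serialization are choices, not requirements): rewrite one small file whenever the queue refreshes and let IIS serve it statically.

using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Web.Script.Serialization; // in System.Web.Extensions (ASP.NET 3.5)

static void PublishMessages(List<string> messages)
{
    var json = new JavaScriptSerializer().Serialize(messages);
    var path = HttpContext.Current.Server.MapPath("~/banner/messages.json"); // placeholder path
    // Write to a temp file first, then swap, so readers never see a half-written file.
    var tmp = path + ".tmp";
    File.WriteAllText(tmp, json);
    File.Copy(tmp, path, true);
}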
Also, one way to get around the memory limitation mentioned by Skliwz is that if you are using this service outside of the normal application, you can isolate it in its own app pool. I have seen this done before, and it helps as well.
Thanks all; as the data is small in size but the underlying tables will change, I think I'll go the HttpCache way: what I actually need is a way to reduce DB access even though the data changes (which is the reason for not using a direct SQL dependency, as suggested by #Bloodhound).
I'll run some stress tests before going public, I think.
Thanks again all.
Of course you could (should) also use the caching features in the SixPack library.
Forward (normal) cache, based on HttpCache, which works by putting attributes on your class. It is the simplest to use, but in some cases you have to wait for the content to actually be fetched from the database.
Pre-fetch cache, built from scratch, which after the first call will start refreshing the cache behind the scenes; in some cases you are guaranteed to get content without waiting.
More info on the SixPack library homepage. Note that the code (especially the forward cache) is load tested.
Here's an example of simple caching:
[Cached]
public class MyTime : ContextBoundObject
{
    [CachedMethod(1)]
    public DateTime Get()
    {
        Console.WriteLine("Get invoked.");
        return DateTime.Now;
    }
}
