Is it possible to read API data as it arrives? I wrote the C# code below in my controller, but sometimes the data takes more than two minutes to come back, and I was wondering whether I could load the data into my website as it comes in instead of waiting for all of it. Below is my current code:
private static async Task<List<Model>> GetFlightData()
{
using (var client = new HttpClient())
{
client.Timeout = TimeSpan.FromMilliseconds(Timeout.Infinite);
var content = await client.GetStringAsync(URL);
var result = JsonConvert.DeserializeObject<List<Model>>(content);
return result;
}
}
The fastest way is to store the data statically and initialize it on startup.
The problem with that solution is that IIS may restart your website when there's no traffic, the data will be lost, and the next visitor will have to wait the whole two minutes.
The best suggestion I have is to save the data to Redis (or another cache of your choosing) and then just pull it from there.
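For illustration, here is a minimal sketch of the Redis variant using the StackExchange.Redis package. The "localhost" endpoint, the cache key, and the 30-minute expiry are all assumptions you would adjust; the slow call is passed in as a delegate so it maps onto the GetFlightData method from the question:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Newtonsoft.Json;
using StackExchange.Redis;

public static class FlightDataCache
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("localhost"); // assumed Redis endpoint

    // fetch = the slow call, e.g. the GetFlightData method from the question.
    public static async Task<List<Model>> GetAsync(Func<Task<List<Model>>> fetch)
    {
        IDatabase db = Redis.GetDatabase();
        RedisValue cached = await db.StringGetAsync("flightData");
        if (cached.HasValue)
            return JsonConvert.DeserializeObject<List<Model>>(cached);

        // Cache miss: the first visitor after a restart still waits once,
        // but the result survives IIS recycles because it lives in Redis.
        List<Model> data = await fetch();
        await db.StringSetAsync("flightData", JsonConvert.SerializeObject(data),
            TimeSpan.FromMinutes(30)); // expiry is an assumption; tune it
        return data;
    }
}

You could also warm the cache from a scheduled job instead of from the first request, so no visitor ever pays the two-minute cost.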
Could someone tell me if this is a good way to retrieve data from an API?
I'm new to this and wonder if someone could recommend a setup and structure, and whether this could be done in a more correct way?
string baseUrl = "https://api.data.com/getITems/";
//Create a new instance of HttpClient
using (HttpClient client = new HttpClient())
{
var byteArray = Encoding.ASCII.GetBytes("username:password");
client.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", Convert.ToBase64String(byteArray));
using (HttpResponseMessage result = await client.GetAsync(baseUrl))
using (HttpContent content = result.Content)
{
result.EnsureSuccessStatusCode();
string data = await content.ReadAsStringAsync();
var test1 = JsonConvert.DeserializeObject<Results>(data);
foreach (var item in test1.results)
{
foreach (var item1 in item.result)
{
Console.WriteLine("Result: {0}", item1.Symbol);
}
}
    }
}
Your code looks fine. You could use a NuGet package to simplify some of the operations, or turn it into a helper class. The one thing you should do differently is not create a new HttpClient every time; that goes against recommended practice. You should create it once and reuse it many times.
Deciding whether your implementation is a good way is out of my scope, since I do not know your requirements, and generally you can always extract some functionality into another class while weighing the effort you put in against the expected value you get out of it.
That being said, I'll just go into the one thing that springs to my eye and that is your usage of HttpClient itself.
If this is a long-lived application and you keep calling that part of your code, you will keep newing up an instance of HttpClient and then disposing it. Disposing kills the connection to your endpoint immediately, but leaves the port in the TIME_WAIT state.
Do this a couple of hundred or thousand times and you will most likely run into a SocketException, because your machine has run out of available sockets.
From the documentation:
HttpClient is intended to be instantiated once and reused throughout
the life of an application.
If all of your HttpClients share the same headers and base address, you can reuse a single instance of that client throughout your application. If you need different headers, create another static client.
In both scenarios you want to use them as a singleton throughout your application.
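As a rough sketch of what that looks like (the base address and credentials here are placeholders taken from the question, not real values):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public static class ApiClients
{
    // One client for plain calls, reused for the lifetime of the application.
    public static readonly HttpClient Default = new HttpClient
    {
        BaseAddress = new Uri("https://api.data.com/") // assumed base address
    };

    // A second static client for calls that need different default headers,
    // e.g. the Basic auth header from the question.
    public static readonly HttpClient Authenticated = CreateAuthenticatedClient();

    private static HttpClient CreateAuthenticatedClient()
    {
        var client = new HttpClient();
        var credentials = Encoding.ASCII.GetBytes("username:password");
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", Convert.ToBase64String(credentials));
        return client;
    }
}

Static readonly fields like these are initialized once and the clients are never disposed, which is exactly the reuse the documentation asks for.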
Some further reading:
https://medium.com/@nuno.caneco/c-httpclient-should-not-be-disposed-or-should-it-45d2a8f568bc
https://learn.microsoft.com/en-us/aspnet/web-api/overview/advanced/calling-a-web-api-from-a-net-client
https://aspnetmonsters.com/2016/08/2016-08-27-httpclientwrong/
https://codereview.stackexchange.com/questions/69950/single-instance-of-reusable-httpclient/69954#69954
I am consuming a web service provided by a vendor in a C# application. The application calls a web method in a loop, and that slows down performance; it takes more than an hour to get the complete set of results.
Can I apply multi threading on my side to consume this web service in multiple threads and combine the results together?
Is there any better approach to retrieve data in minutes instead of hours?
First of all, you have to make sure your vendor does indeed support this, or at least does not prohibit it (which is quite possible too).
The code itself to do this is fairly straightforward, using a method such as Parallel.For
Simple Example (google.com):
Parallel.For(0, noRequests, i =>
{
    // Code that does your request goes here
});
Explanation:
In a Parallel.For loop, all the iterations are executed in parallel (as the name implies), which can potentially provide a very significant increase in performance.
Further reading:
MSDN on Parallel.For loops
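If the calls are I/O-bound rather than CPU-bound, an async alternative to Parallel.For is to start all the requests and await Task.WhenAll, which also hands you the combined results directly. A sketch, where the URL and the idea of fetching by page number are hypothetical stand-ins for however the vendor's API splits its data:

using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public static class VendorClient
{
    private static readonly HttpClient Client = new HttpClient();

    public static async Task<List<string>> GetAllPagesAsync(int pageCount)
    {
        // Start every request without awaiting, so they all run concurrently.
        var requests = Enumerable.Range(0, pageCount)
            .Select(page => Client.GetStringAsync(
                "https://vendor.example/api/items?page=" + page)) // assumed URL
            .ToList();

        // Task.WhenAll completes when all responses are in, preserving order.
        string[] responses = await Task.WhenAll(requests);
        return responses.ToList();
    }
}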
You should really ask your vendor. We can only speculate about why it takes that long, or whether firing multiple requests will actually yield the same results as the single long-running one.
Basically, sending one request and getting one response should beat the multi-threaded variant, because a single call should be easier to optimize on the server's side.
If you want to know why this is not the case with the current version of the service, ask the vendor.
Here is just a sample of how to call web services in parallel:
private void TestParallelForeach()
{
string[] uris = {"http://192.168.1.2", "http://192.168.1.3", "http://192.168.1.4"};
var results = new List<string>();
var syncObj = new object();
Parallel.ForEach(uris, uri =>
{
using (var webClient = new WebClient())
{
webClient.Encoding = Encoding.UTF8;
try
{
var result = webClient.DownloadString(uri);
lock (syncObj)
{
results.Add(result);
}
}
catch (Exception ex)
{
// Do error handling here...
}
}
});
// Do with "results" here....
}
I have a Windows Service that sends JSON data to an MVC5 WebAPI using WebClient. Both the Windows Service and the WebClient are currently on the same machine.
After it has run for about 15 minutes at about 10 requests per second, each post starts taking unreasonably long to complete. It can start out at about 3 ms per request and build up to about 5 seconds, which is far too much for my application.
This is the code I'm using:
private WebClient GetClient()
{
var webClient = new WebClient();
webClient.Headers.Add("Content-Type", "application/json");
return webClient;
}
public string Post<T>(string url, T data)
{
var sw = new Stopwatch();
try
{
var json = JsonConvert.SerializeObject(data);
sw.Start();
var result = GetClient().UploadString(GetAddress(url), json);
sw.Stop();
if (Log.IsVerboseEnabled())
Log.Verbose(String.Format("json: {0}, time(ms): {1}", json, sw.ElapsedMilliseconds));
return result;
}
catch (Exception)
{
sw.Stop();
Log.Debug(String.Format("Failed to send to webapi, time(ms): {0}", sw.ElapsedMilliseconds));
return "Failed to send to webapi";
}
}
The result of the request isn't really of importance to me.
The serialized data size varies from just a few bytes to about 1 kB but that does not seem to affect the time it takes to complete the request.
The api controllers that receive the request completes their execution almost instantly (0-1 ms).
From various questions here on SO and some blog posts, I've seen people suggest using HttpWebRequest instead, to be able to control the options of the request.
Using HttpWebRequest, I've tried these things that did not work (sketched in the snippet below):
Setting the proxy to an empty proxy.
Setting the proxy to null.
Setting the ServicePointManager.DefaultConnectionLimit to an arbitrary large number.
Disable KeepAlive (I don't want to but it was suggested).
Not opening the response stream at all (had some impact but not enough).
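Roughly, those attempts looked like this (just a sketch of the settings listed above, not a recommendation, since none of them solved the problem):

using System.Net;

public static class RequestTweaks
{
    public static HttpWebRequest Create(string url)
    {
        ServicePointManager.DefaultConnectionLimit = 1000; // arbitrarily large
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Proxy = null;      // also tried new WebProxy() as an empty proxy
        request.KeepAlive = false; // reluctantly, as suggested
        return request;
    }
}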
Why are the requests taking so long? Any help is greatly appreciated.
It turned out that another part of the program was taking all the available connections, i.e. I was out of sockets and had to wait for a free one.
I found out by monitoring ASP.NET Applications\Requests/Sec in the Performance Monitor.
I'm currently developing a small application based on a Master Detail template. One of my pages requires some data to be loaded immediately, and I don't know how to do this. In every example, data is loaded once the user presses a button.
Here is my current code:
string test = async (sender, e) => {
Task<string> json = GetRandomRelations ();
return await json;
};
And here is my method:
public async Task<string> GetRandomRelations () {
var client = new System.Net.Http.HttpClient ();
client.BaseAddress = new Uri("http://127.0.0.1/loltools/web/app_dev.php/api/relation/");
string response = await client.GetStringAsync("random/20");
return response;
}
I'm currently just trying to get the JSON response, but I cannot even manage to do that. My main problem is that I cannot convert the lambda expression to a string.
Thanks for your help!
One of my pages requires some data to be loaded immediately, and I don't know how to do this.
Think about this for a bit. What you're really asking is how to reconcile two opposing requirements:
The page must show some data immediately. The UI must be responsive. The data must be available synchronously to display.
The data is retrieved asynchronously. It is not available immediately. It will take some (unknown) amount of time to even get the data to display.
So, obviously, there's no direct solution. Instead, you have to satisfy the both of the core requirements ("The UI must be responsive" and "The data is retrieved asynchronously") in a different way. One common approach is to (immediately and synchronously) display a "Loading..." view of the data - a spinner or whatnot. Then, update the display when the data arrives.
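Here is a minimal sketch of that pattern, assuming a Xamarin.Forms-style page (the labels are hypothetical controls); adapt it to whatever UI framework your Master Detail template uses:

using System;
using System.Threading.Tasks;
using Xamarin.Forms;

public class RelationsPage : ContentPage
{
    readonly Label loadingLabel = new Label { Text = "Loading..." };
    readonly Label resultLabel = new Label();

    public RelationsPage()
    {
        // Shown immediately and synchronously: the UI stays responsive.
        Content = new StackLayout { Children = { loadingLabel, resultLabel } };
    }

    protected override async void OnAppearing()
    {
        base.OnAppearing();

        // The data arrives asynchronously; update the display when it does.
        string json = await GetRandomRelations();
        loadingLabel.IsVisible = false;
        resultLabel.Text = json;
    }

    // The method from the question, unchanged apart from formatting.
    async Task<string> GetRandomRelations()
    {
        var client = new System.Net.Http.HttpClient();
        client.BaseAddress = new Uri("http://127.0.0.1/loltools/web/app_dev.php/api/relation/");
        return await client.GetStringAsync("random/20");
    }
}

async void is acceptable here because OnAppearing is an event-style override; in most other places you would return a Task instead.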
I'm not absolutely sure what you are trying to do, but what's wrong with simply:
string test = await GetRandomRelations ();
Using Visual Studio 2012, C# .NET 4.5, SQL Server 2008, Feefo, nopCommerce.
Hey guys, I recently implemented a new review service into a current site we have.
When the change went live, everything worked fine on the first day.
Since then, though, the sending of sales to Feefo hasn't been working, and there are no logs of anything going wrong either.
In OrderProcessingService.cs in nopCommerce's services, I make an HttpWebRequest call when an order has been confirmed as completed. Here is the code:
var email = HttpUtility.UrlEncode(order.Customer.Email.ToString());
var name = HttpUtility.UrlEncode(order.Customer.GetFullName().ToString());
var description = HttpUtility.UrlEncode(productVariant.ProductVariant.Product.MetaDescription != null ? productVariant.ProductVariant.Product.MetaDescription.ToString() : "product");
var orderRef = HttpUtility.UrlEncode(order.Id.ToString());
var productLink = HttpUtility.UrlEncode(string.Format("myurl/p/{0}/{1}", productVariant.ProductVariant.ProductId, productVariant.ProductVariant.Name.Replace(" ", "-")));
string itemRef = "";
try
{
itemRef = HttpUtility.UrlEncode(productVariant.ProductVariant.ProductId.ToString());
}
catch
{
itemRef = "0";
}
var url = string.Format("feefo Url",
    login, password, email, name, description, orderRef, productLink, itemRef);
var request = (HttpWebRequest)WebRequest.Create(url);
request.KeepAlive = false;
request.Timeout = 5000;
request.Proxy = null;
using (var response = (HttpWebResponse)request.GetResponse())
{
if (response.StatusDescription == "OK")
{
var stream = response.GetResponseStream();
if(stream != null)
{
using (var reader = new StreamReader(stream))
{
var content = reader.ReadToEnd();
}
}
}
}
So as you can see, it's a simple web request that is processed on an order, and all product variants are sent to Feefo.
Now:
the sending hasn't been happening all week, since the 15th (the day of the implementation)
the site has been grinding to a halt recently.
The stream and reader around the content variable are only there for debugging.
I'm wondering: does the code red-flag anything to you that could relate to the slowdown of the website?
Also note that I have run some SQL statements to check for deadlocks or large escalations; so far everything seems fine, and the logs are also fine, just the usual logging of bots.
Any help would be much appreciated!
EDIT: also note that this code is in a method that is called and wrapped in a try/catch.
UPDATE: well, forget about the "not sending" part; I was just told my code was rolled back last week.
A call to another website while processing the order can degrade performance, as you are calling a site that you do not control and you don't know how much time it will take. Furthermore, the GetResponse method can throw an exception; if you don't log anything in your outer try/catch block, you won't be able to know what's happening.
The best way to perform such a task is to implement something like the "Send Emails" scheduled task and send the data when you can afford to wait for the remote service. It is easy to do, more resilient, and easier to maintain if you upgrade the nopCommerce code base.
This is how I do similar things:
Avoid modifying the OrderProcessingService: create a custom service or plugin that consumes the OrderPlacedEvent or the OrderPaidEvent (just implement the IConsumer<OrderPaidEvent> or IConsumer<OrderPlacedEvent> interface).
Do not call a third-party service directly while processing the request if you don't need the response at that moment; it will only delay your process. In the service created in step 1, store the data and send it to Feefo later. You can store it in the database, or use a static collection if you don't mind losing pending data when the site restarts (which could be acceptable for statistical data, for instance).
The best way to implement point #2 is to add a new scheduled task implementing ITask (remember to add a record to the ScheduleTask table). Just recover the stored data and do your processing.
Add some logging. It is easy: just get an ILogger instance and call Insert. A rough sketch of these steps follows below.
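A rough sketch of those steps, with the caveat that the exact namespaces and registration details vary between nopCommerce versions, so treat this as an outline rather than drop-in code:

using System.Collections.Concurrent;
// nopCommerce usings omitted: the namespaces of IConsumer<T>, OrderPaidEvent,
// Order and ITask differ between versions, so resolve them in your solution.

// Step 1: consume the event instead of modifying OrderProcessingService.
public class FeefoOrderPaidConsumer : IConsumer<OrderPaidEvent>
{
    // Step 2: a static collection; pending entries are lost on an app
    // restart, which can be acceptable for statistical data.
    public static readonly ConcurrentQueue<Order> Pending = new ConcurrentQueue<Order>();

    public void HandleEvent(OrderPaidEvent eventMessage)
    {
        Pending.Enqueue(eventMessage.Order);
    }
}

// Step 3: a scheduled task (remember the ScheduleTask record) that sends
// the queued orders to Feefo outside the request pipeline.
public class FeefoSendTask : ITask
{
    public void Execute()
    {
        Order order;
        while (FeefoOrderPaidConsumer.Pending.TryDequeue(out order))
        {
            // Build the Feefo URL and fire the request here, logging any
            // failures via ILogger.Insert(...) (step 4).
        }
    }
}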
As far as I can see, you are making a blocking synchronous call to another website, which will definitely slow down your site during the request-response round trip. What Marco has suggested is valid: try to do it in an ITask. Or, if you need things done immediately instead of on a schedule, you can use an asynchronous web request (sketched below) to potentially remove the blocking. :)
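For the asynchronous variant, something along these lines should work on .NET 4.5; a minimal sketch based on the request code from the question:

using System.Net;
using System.Threading.Tasks;

public static class FeefoSender
{
    public static async Task SendAsync(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.KeepAlive = false;
        // Note: HttpWebRequest.Timeout only applies to the synchronous path;
        // the async call below is not bound by it.
        using (var response = (HttpWebResponse)await request.GetResponseAsync())
        {
            // The body is not needed; disposing releases the connection.
        }
    }
}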