Speed up HTTP POST speed - C#

I am writing an application to send text messages through HTTP posts to a Slooce Tech API. Because the application will have to send a high volume of text messages, I'm trying to optimize its speed.
The second piece of code below is the method I am currently using to send the posts. I wrote the first piece of code, leaving out the HttpWebResponse, to try to make it faster.
The problem is that the new method is actually slower: rather than taking 0.25 seconds to execute, it takes a second or more and sometimes gets stuck.
Does anyone know why that would happen, or have any other tips for improving the speed of this application? I have already set Request.Proxy = null, which speeds it up a little.
Thanks.
The modified code is:
public void QuickSend()
{
    XML = "<message id=\"" + lMessageID + "\"><partnerpassword>" + PartnerPassword + "</partnerpassword><content>" + sMessage + "</content></message>";
    URL = "http://sloocetech.net:****/spi-war/spi/" + PartnerID + "/" + sRecipient + "/" + Keyword + "/messages/mt";
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    RequestBytes = System.Text.Encoding.ASCII.GetBytes(XML);
    Request.Method = "POST";
    Request.ContentType = "text/xml;charset=utf-8";
    Request.ContentLength = RequestBytes.Length;
    RequestStream = Request.GetRequestStream();
    RequestStream.Write(RequestBytes, 0, RequestBytes.Length);
    RequestStream.Close();
}
And here is the original code:
public XDocument SendSMS()
{
    XML = "<message id=\"" + lMessageID + "\"><partnerpassword>" + PartnerPassword + "</partnerpassword><content>" + sMessage + "</content></message>";
    URL = "http://sloocetech.net:****/spi-war/spi/" + PartnerID + "/" + sRecipient + "/" + Keyword + "/messages/mt";
    HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(URL);
    RequestBytes = System.Text.Encoding.ASCII.GetBytes(XML);
    Request.Method = "POST";
    Request.ContentType = "text/xml;charset=utf-8";
    Request.ContentLength = RequestBytes.Length;
    RequestStream = Request.GetRequestStream();
    RequestStream.Write(RequestBytes, 0, RequestBytes.Length);
    RequestStream.Close();
    HttpWebResponse Resp = (HttpWebResponse)Request.GetResponse();
    oReader = new StreamReader(Resp.GetResponseStream(), System.Text.Encoding.Default);
    string backstr = oReader.ReadToEnd();
    oReader.Close();
    Resp.Close();
    Doc = XDocument.Parse(backstr);
    return Doc;
}

First of all, I'd be very skeptical of any claims that you're going to see massive improvements just because you're crafting your HttpWebRequest in a special way. The bottleneck on single-threaded requests like yours is going to be network latency as well as the response time of the server. (Perhaps they're doing a lot of server-side processing before responding to your request.)
You're making a blocking request, which means your CPU is doing nothing while it waits for a response.
If you want to multithread your application, you could do something like the following:
var tasks = new Task[10];
for (int i = 0; i < 10; i++)
{
    tasks[i] = Task.Factory.StartNew(() =>
    {
        int messages_sent_by_one_task = 0;
        while (messages_sent_by_one_task < 10)
        {
            QuickSend();
            messages_sent_by_one_task++;
        }
    });
}
Task.WaitAll(tasks); // wait for all tasks to complete
This will spawn 10 tasks that will each send 10 messages. If one response is taking a long time, the other 9 threads will continue chugging along happily.
I believe you could probably improve on this if you were to incorporate asynchronous requests and HttpClient, so that none of your 10 threads ever blocked. However, I've never tried this myself, so I can't give a definitive example.
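That said, here's a rough sketch of what it could look like (untested and purely illustrative: the class and method names are made up, and the URL and XML would come from the same fields QuickSend uses). A single shared HttpClient issues the POSTs, and a SemaphoreSlim caps how many are in flight at once:

using System;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class AsyncSender
{
    // HttpClient is designed to be created once and reused for many requests.
    private static readonly HttpClient client = new HttpClient();

    // Sends 'total' copies of the XML payload, never running more than
    // 'maxConcurrency' requests at the same time.
    public static async Task SendAllAsync(string url, string xml, int total, int maxConcurrency)
    {
        using (var throttle = new SemaphoreSlim(maxConcurrency))
        {
            var tasks = Enumerable.Range(0, total).Select(async _ =>
            {
                await throttle.WaitAsync();
                try
                {
                    var content = new StringContent(xml, Encoding.UTF8, "text/xml");
                    using (var response = await client.PostAsync(url, content))
                    {
                        // Drain the body so the connection is released back to the pool promptly.
                        await response.Content.ReadAsStringAsync();
                    }
                }
                finally
                {
                    throttle.Release();
                }
            });

            await Task.WhenAll(tasks);
        }
    }
}

The idea is the same as the task-based version above, except that no thread sits blocked while a response is in flight; the semaphore plays the role that the fixed pool of 10 tasks plays in my example.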
You might be tempted to crank the number of threads up to some ungodly number, but avoid the temptation. The overhead of creating and managing threads will soon catch up with you. I don't know what the ideal number is, but you're welcome to experiment.

Related

Occasional timeout exception when running webclients in parallel [duplicate]

I am sending a large number of simultaneous requests to a particular web service with different data. To achieve this, I have created a number of threads (around 50). The total number of requests per minute may reach 10,000.
The application, which runs as a Windows service, works fine for a few minutes, and then an operation timeout error is encountered.
I have tried the usual suspects, such as increasing DefaultConnectionLimit and closing the web response object. Since the requests do not take much time on the server, I have also set the request Timeout and ReadWriteTimeout to 5 seconds.
Below is the code snippet which is called repeatedly by different threads.
// The line below is executed at the start of the application
ServicePointManager.DefaultConnectionLimit = 15000;

// The code below is executed repeatedly by different threads
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
request.Host = hostName;
request.Proxy = null;
request.UserAgent = "Windows Service";

byte[] bytes = new byte[0];
if (body != null)
{
    bytes = System.Text.Encoding.ASCII.GetBytes(body);
    request.ContentType = "text/xml; encoding='utf-8'";
    request.ContentLength = bytes.Length;
}

request.Method = "POST";
request.Timeout = 5000;
request.ReadWriteTimeout = 5000;
request.Headers["Authorization"] = "Basic " + Convert.ToBase64String(Encoding.Default.GetBytes(username + ":" + password));
request.CookieContainer = this.cookieContainer;

if (body != null)
{
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(bytes, 0, bytes.Length);
    requestStream.Close();
}

HttpWebResponse httpResponse = (HttpWebResponse)request.GetResponse();
using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
{
    responseText = streamReader.ReadToEnd();
}
httpResponse.Close();
ServicePointManager.DefaultConnectionLimit limits the number of outgoing web requests to a given server. The default is generally 2 or 10.
If you are making 50 parallel calls to that web service, you should set ServicePointManager.DefaultConnectionLimit (at app startup) to a larger number (e.g. 40-50).
Additionally, you should make sure the request stream and the response are always disposed, even when the call throws; wrapping them in using blocks takes care of this for you.
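For illustration only, the shape would be roughly this (a sketch, not a drop-in fix; 50 is just an example value to tune, and url/responseText are the variables from the question):

// At application startup: allow roughly as many connections to the host as you have threads.
ServicePointManager.DefaultConnectionLimit = 50;

// Per request: let using dispose the response (and its stream) even when an exception is thrown.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "POST";
// ... set the headers and write the request body exactly as before ...

using (var httpResponse = (HttpWebResponse)request.GetResponse())
using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
{
    responseText = streamReader.ReadToEnd();
}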

Parallel async HttpWebRequests with different proxies. How to maximize performance?

I'm writing an app that needs to make a lot of parallel HttpWebRequests to a bookmaker site through different proxies. The reason I'm using different proxies is that the bookmaker can ban an IP if too many requests come from it. My goal is to get the freshest site content as fast as possible.
Here is my code with all the settings I have:
ServicePointManager.DefaultConnectionLimit = 1000;
ServicePointManager.Expect100Continue = false;
ServicePointManager.UseNagleAlgorithm = false;

for (var i = 0; i < proxyCollection.Count; i++)
{
    var proxyLocal = proxyCollection[i];
    var iLocal = i;
    Task.Run(async () =>
    {
        var httpWebRequest = (HttpWebRequest)WebRequest.Create(String.Format("https://bookmaker.com/page{0}", iLocal));
        httpWebRequest.Proxy = proxyLocal;
        httpWebRequest.PreAuthenticate = true;
        httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        using (var httpWebResponse = (HttpWebResponse)await httpWebRequest.GetResponseAsync())
        using (var responseStream = httpWebResponse.GetResponseStream())
        using (var streamReader = new StreamReader(responseStream))
        {
            var stringContent = await streamReader.ReadToEndAsync();
            // Here I'm processing the new data. It works pretty fast, so this isn't the problem.
            ProcessStringContent(stringContent);
        }
    });
}
This code does not work as fast as I expected.
The first problem is that, for some strange reason, the requests don't all start at the same time. As I can see in Task Manager, the network load has two or more peaks. Moreover, I have one really fast proxy and proxyCollection contains it, but if I add await Task.Delay(5000) after the code above, in some cases the request using my fast proxy hasn't even started by the time the 5000 ms have passed!
The second problem is that the total execution time of all the tasks is slow. I expected that if one request needs 200-300 ms to execute, then 100 parallel, asynchronous requests would need only a little more time. But sometimes this "more" is 10-20 times longer. I suspect that something is wrong.
The third problem is that when I run this code it freezes the UI (not a full freeze, but the UI lags). I read that WebRequest.Create runs synchronously and can take some time (DNS lookup, proxy settings, etc.), and that if I'm making a lot of requests they can fill all my threads (the UI thread too) just creating WebRequests. But I tried creating requests to a direct IP address (WebRequest.Create(String.Format("https://10.10.10.1/page{0}", iLocal))) and nothing changed, and I'm setting the proxy explicitly (so proxy auto-detection isn't needed), so I don't understand why creation can take so much time (and whether the problem is with creation or maybe with something else).
Can someone point out what I'm doing wrong? I'm lost in all the ServicePointManager settings (nothing I tried helped). Can .NET handle this type of task, or do I need to use Node.js, for example, to get the best performance?
P.S.: The collection of proxies is not that big (50-100 different proxies).
Back to (Parallel) Basics: Don't Block Your Threads, Make Async I/O Work For You
Example:
Parallel.For(0, 10, delegate(int i)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri("http://www.mysi.com"));
    string dataToSend = "Data";
    byte[] buffer = System.Text.Encoding.GetEncoding(1252).GetBytes(dataToSend);
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = buffer.Length;
    request.Host = "www.mysite.com";
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(buffer, 0, buffer.Length);
    requestStream.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
});
Send multiple WebRequest in Parallel.For
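If you want to keep the I/O asynchronous end to end, as the first link argues, a rough sketch applied to the question's loop might look like this (untested; it assumes an async calling context and reuses proxyCollection and ProcessStringContent from the question):

// One truly asynchronous request per proxy, all awaited together.
var tasks = proxyCollection.Select(async (proxy, i) =>
{
    var request = (HttpWebRequest)WebRequest.Create(
        String.Format("https://bookmaker.com/page{0}", i));
    request.Proxy = proxy;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

    using (var response = (HttpWebResponse)await request.GetResponseAsync())
    using (var responseStream = response.GetResponseStream())
    using (var reader = new StreamReader(responseStream))
    {
        var stringContent = await reader.ReadToEndAsync();
        ProcessStringContent(stringContent);
    }
}).ToArray();

await Task.WhenAll(tasks); // no thread is blocked while the responses are in flight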

GetResponse and GetRequestStream silverlight

For some reason I can't use GetRequestStream or GetResponse in Silverlight; they come up underlined, and I'm not sure what to use instead. I am trying to connect to my web service; here is where I get the error:
string uri = "http://localhost:8002/Service/Customer";
StringBuilder sb = new StringBuilder();
sb.Append("<Customer>");
sb.AppendLine("<FirstName>" + this.textBox1.Text + "</FirstName>");
sb.AppendLine("<LastName>" + this.textBox2.Text + "</LastName>");
sb.AppendLine("</Customer>");
string NewCustomer = sb.ToString();
byte[] arr = Encoding.UTF8.GetBytes(NewCustomer);
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uri);
req.Method = "POST";
req.ContentType = "application/xml";
req.ContentLength = arr.Length;
Stream reqStrm = req.GetRequestStream(); // error here: GetRequestStream
reqStrm.Write(arr, 0, arr.Length);
reqStrm.Close();
HttpWebResponse resp = (HttpWebResponse)req.GetResponse(); // error here: GetResponse
MessageBox.Show("Staff Creation: Status " + resp.StatusDescription);
reqStrm.Close();
resp.Close();
Does anyone have a workaround?
Silverlight only supports the Asynchronous network access. There are no synchronous GetRequestStream and GetResponse methods in Silverlight. You will need to use the asynchronous methods BeginGetRequestStream/EndGetRequestStream and BeginGetResponse/EndGetResponse.
More importantly, you will need to get up to speed on how to do things asynchronously in general. For example, something will be calling your code above and expecting that certain changes will have happened once it completes. In the asynchronous world that will not be true: the code will return quickly, and the work will happen later.
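For illustration, a rough sketch of the Begin/End pattern applied to the code above could look like this (untested; uri and arr are the variables from the question, and the MessageBox call is marshalled back to the UI thread through the Dispatcher):

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uri);
req.Method = "POST";
req.ContentType = "application/xml";

req.BeginGetRequestStream(requestResult =>
{
    // Write the body inside the callback, then start waiting for the response.
    using (Stream reqStrm = req.EndGetRequestStream(requestResult))
    {
        reqStrm.Write(arr, 0, arr.Length);
    }

    req.BeginGetResponse(responseResult =>
    {
        using (var resp = (HttpWebResponse)req.EndGetResponse(responseResult))
        {
            // Callbacks run on a background thread, so marshal back before touching the UI.
            Deployment.Current.Dispatcher.BeginInvoke(() =>
                MessageBox.Show("Staff Creation: Status " + resp.StatusDescription));
        }
    }, null);
}, null);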

Why does sending post data with WebRequest take so long?

I am currently creating a C# application to tie into a PHP/MySQL online system. The application needs to send POST data to scripts and get the response.
When I send the following data
username=test&password=test
I get the following timings...
Starting request at 22/04/2010 12:15:42
Finished creating request : took 00:00:00.0570057
Transmitting data at 22/04/2010 12:15:42
Transmitted the data : took 00:00:06.9316931 <<--
Getting the response at 22/04/2010 12:15:49
Getting response 00:00:00.0360036
Finished response 00:00:00.0360036
Entire call took 00:00:07.0247024
As you can see, it is taking about 6 seconds to actually send the data to the script. I have done further testing by sending data from telnet and by sending POST data from a local file to the URL, and those don't even take a second, so this is not a problem with the hosted script on the site.
Why is it taking 6 seconds to transmit the data when it is two simple strings?
I use a custom class to send the data
class httppostdata
{
    WebRequest request;
    WebResponse response;

    public string senddata(string url, string postdata)
    {
        var start = DateTime.Now;
        Console.WriteLine("Starting request at " + start.ToString());

        // create the request to the url passed in the parameters
        request = (WebRequest)WebRequest.Create(url);

        // set the method to post
        request.Method = "POST";

        // set the content type and the content length
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = postdata.Length;

        // convert the post data into a byte array
        byte[] byteData = Encoding.UTF8.GetBytes(postdata);

        var end1 = DateTime.Now;
        Console.WriteLine("Finished creating request : took " + (end1 - start));

        var start2 = DateTime.Now;
        Console.WriteLine("Transmitting data at " + start2.ToString());

        // get the request stream and write the data to it
        Stream dataStream = request.GetRequestStream();
        dataStream.Write(byteData, 0, byteData.Length);
        dataStream.Close();

        var end2 = DateTime.Now;
        Console.WriteLine("Transmitted the data : took " + (end2 - start2));

        // get the response
        var start3 = DateTime.Now;
        Console.WriteLine("Getting the response at " + start3.ToString());
        response = request.GetResponse();
        //Console.WriteLine(((WebResponse)response).StatusDescription);
        dataStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(dataStream);

        var end3 = DateTime.Now;
        Console.WriteLine("Getting response " + (end3 - start3));

        // read the response
        string serverresponse = reader.ReadToEnd();

        var end3a = DateTime.Now;
        Console.WriteLine("Finished response " + (end3a - start3));
        Console.WriteLine("Entire call took " + (end3a - start));

        //Console.WriteLine(serverresponse);
        reader.Close();
        dataStream.Close();
        response.Close();

        return serverresponse;
    }
}
And to call it I use
private void btnLogin_Click(object sender, EventArgs e)
{
    // string postdata;
    if (txtUsername.Text.Length < 3 || txtPassword.Text.Length < 3)
    {
        MessageBox.Show("Missing your username or password.");
    }
    else
    {
        string postdata = "username=" + txtUsername.Text +
                          "&password=" + txtPassword.Text;
        httppostdata myPost = new httppostdata();
        string response = myPost.senddata("http://www.domainname.com/scriptname.php", postdata);
        MessageBox.Show(response);
    }
}
Make sure you explicitly set the Proxy property of the WebRequest to null, or it will try to auto-detect the proxy settings, which can take some time.
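In the posted senddata method that is a one-line addition right after the request is created:

// create the request to the url passed in the parameters
request = (WebRequest)WebRequest.Create(url);

// skip automatic proxy detection, which can add several seconds to the first request
request.Proxy = null;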
Chances are that because, in your test, you only call this once, the delay you see is the C# code being JIT compiled.
A better test would be to call this twice, and discard the timings from the first time and see if they are better.
An even better test would be to discard the first set of timings, and then run this many times and take an average, although for a very loose "indicative" view, this is probably not necessary.
As an aside, for this sort of timing, you are better off using the System.Diagnostics.Stopwatch class over System.DateTime.
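For example, the block that measures the transmit step could be rewritten along these lines:

// Stopwatch is monotonic and has much higher resolution than DateTime.Now.
var sw = System.Diagnostics.Stopwatch.StartNew();

Stream dataStream = request.GetRequestStream();
dataStream.Write(byteData, 0, byteData.Length);
dataStream.Close();

Console.WriteLine("Transmitted the data : took " + sw.Elapsed);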
[EDIT]
Also, noting Mant101's suggestion about proxies: if setting the proxy to null fails to resolve things, you may wish to set up Fiddler and configure your request to use Fiddler as its proxy. This would allow you to intercept the actual HTTP calls so you can get a better breakdown of the HTTP call timings themselves from outside the framework.
