Possible Infinite Loop when streaming response - c#

We are trying to identify high CPU usage in services we have, and we believe there are a few potential areas that may be causing infinite loops. Below is code that we believe may potentially be causing an infinite loop. Is there anything specific that sticks out that may cause the while loop to run indefinitely?
WebRequest request = WebRequest.Create(Url);
request.ContentLength = formDataLength;
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
using (Stream rs = request.GetRequestStream())
{
    ASCIIEncoding encoding = new ASCIIEncoding();
    var postData = encoding.GetBytes(formData);
    rs.Write(postData, 0, postData.Length);
    string str = string.Empty;
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        using (Stream sm = response.GetResponseStream())
        {
            int totalBytesRead = 0;
            long responseBytesToRead = 1024;
            byte[] buffer = new byte[responseBytesToRead];
            int bytesRead;
            do
            {
                bytesRead = sm.Read(buffer, totalBytesRead, (int)(responseBytesToRead - totalBytesRead));
                totalBytesRead += bytesRead;
            } while (totalBytesRead < bytesRead);
            request.Abort();
            str = Encoding.Default.GetString(buffer);
        }
    }
    return str;
}

From the MSDN documentation:

Return Value
Type: System.Int32. The total number of bytes read into the buffer. This can be less than the number of bytes requested if that many bytes are not currently available, or zero (0) if the end of the stream has been reached.
0 indicates the end of the stream, and the documentation already defines the condition for reaching it. The condition you rely on (totalBytesRead < bytesRead) is unreliable and unnecessary. Try

while (bytesRead != 0)
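As a sketch, the question's loop could be rewritten to stop on end of stream (reusing the `sm`, `buffer`, and `str` variables from the code above):

```csharp
int totalBytesRead = 0;
int bytesRead;
// Keep reading until Read returns 0 (end of stream) or the buffer is full.
// This still assumes the response fits into the 1024-byte buffer, as the
// original code does.
while (totalBytesRead < buffer.Length &&
       (bytesRead = sm.Read(buffer, totalBytesRead, buffer.Length - totalBytesRead)) != 0)
{
    totalBytesRead += bytesRead;
}
// Only decode the bytes actually read, not the whole buffer
str = Encoding.Default.GetString(buffer, 0, totalBytesRead);
```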

It is better to use a StreamReader:

using (StreamReader reader = new StreamReader(response.GetResponseStream()))
...
reader.ReadToEnd();
// or
while (!reader.EndOfStream)
{
    // do read.
}

Related

saving a binary file to ram in c#

I am developing a flash tool for a mobile device which sends a loader to it for further communication. Currently I receive the loader file through a web request from a URL and store it on disk. Below is the code I use:
private void submitData()
{
    try
    {
        ASCIIEncoding encoding = new ASCIIEncoding();
        string cpuid = comboBox1.Text;
        string postdata = "cpuid=" + cpuid;
        byte[] data = encoding.GetBytes(postdata);
        WebRequest request = WebRequest.Create("http://127.0.0.1/fetch.php");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = data.Length;
        Stream stream = request.GetRequestStream();
        stream.Write(data, 0, data.Length);
        stream.Close();
        WebResponse response = request.GetResponse();
        stream = response.GetResponseStream();
        StreamReader sr = new StreamReader(stream);
        string path = sr.ReadToEnd();
        MessageBox.Show(path);
        DateTime startTime = DateTime.UtcNow;
        WebRequest request1 = WebRequest.Create("http://127.0.0.1/" + path);
        WebResponse response2 = request1.GetResponse();
        using (Stream responseStream = response2.GetResponseStream())
        {
            using (Stream fileStream = File.OpenWrite(@"e:\loader.mbn"))
            {
                byte[] buffer = new byte[4096];
                int bytesRead = responseStream.Read(buffer, 0, 4096);
                while (bytesRead > 0)
                {
                    fileStream.Write(buffer, 0, bytesRead);
                    DateTime nowTime = DateTime.UtcNow;
                    if ((nowTime - startTime).TotalMinutes > 1)
                    {
                        throw new ApplicationException("Download timed out");
                    }
                    bytesRead = responseStream.Read(buffer, 0, 4096);
                }
                MessageBox.Show("COMPLETED");
            }
        }
        sr.Close();
        stream.Close();
    }
    catch (Exception ex)
    {
        MessageBox.Show("ERR :" + ex.Message);
    }
}
My question: is there a way to store the file directly in RAM and then use it from there? So far I have tried to use MemoryStream with no results.
There's nothing wrong with MemoryStream. You have to remember to reset the stream's position to the beginning if you want to use the data, though. (Note that await requires the enclosing method to be async, and MemoryStream's capacity constructor takes an int, so the ContentLength needs a cast.)

// If the size is known, use it to avoid reallocations
using (var memStream = new MemoryStream((int)response.ContentLength))
using (var responseStream = response.GetResponseStream())
{
    // Avoid blocking while waiting for the copy by using CopyToAsync instead of CopyTo
    await responseStream.CopyToAsync(memStream);
    // Reset the position
    memStream.Position = 0;
    // Use the memory stream
    ...
}

Force an async WebRequest to flush its buffer c#

I have the following connection code:
_request = (HttpWebRequest)WebRequest.Create(complianceUrl);
_request.Method = "GET";
var authInfo = string.Format("{0}:{1}", _username, _password);
authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
_request.Headers.Add("Authorization", "Basic " + authInfo);
// set stream parameters
_request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
_request.Headers.Add("Accept-Encoding", "gzip");
_request.Accept = "application/json";
_request.ContentType = "application/json";
_request.ReadWriteTimeout = 30000;
_request.AllowReadStreamBuffering = false;
_request.Timeout = 30; //seconds, sends 15-second heartbeat.
_asyncCallback = HandleResult; //Setting handleResult as Callback method...
_request.BeginGetResponse(_asyncCallback, _request); //Calling BeginGetResponse on
This works fine: the buffer fills with data, and with large volumes of data that is fine. But with low volumes of data it takes a while for the buffer to fill up, and I want to periodically flush the buffer if there has been no activity in a while.
I tried to do that this way:
_request.GetRequestStream().FlushAsync();
But this is wrong, as it throws a ProtocolViolationException, I guess because this is a GET verb?
Can anyone tell me how to forcibly cause the connection to dump the buffer to the client?
Handling response code added:
private void HandleResult(IAsyncResult result)
{
    using (var response = (HttpWebResponse)_request.EndGetResponse(result))
    using (var stream = response.GetResponseStream())
    using (var memory = new MemoryStream())
    {
        var compressedBuffer = new byte[BlockSize];
        while (stream != null && stream.CanRead)
        {
            var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
            // if readCount is 0, then the stream must have disconnected. Process and abort!
            if (readCount == 0)
            {
            }
        }
    }
}
It is not possible to make the server side of an HTTP call send you data in a particular way. Neither HTTP nor TCP have provisions for that. You have to take what you get, or talk to the vendor.
Read does not block until a buffer is full. It gives you what arrives immediately. This behavior is well known, this is not a guess of mine.
The 3rd party service send 15 second heartbeats CRLF which will also eventually fill up the buffer.
This sounds like the service is doing what you want. If that is true, the problem must be in your code. But not in the code shown in the question. Try this:
while (true)
{
    var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
    Console.WriteLine(readCount);
}
This should show 2 bytes every 15 seconds. If not, the vendor is at fault. If yes, you are.
while (stream != null && stream.CanRead) this is weird because none of these conditions can become false. And even if they become false, what do you do about it?! This should be while (true) plus break on stream depletion.
// if readCount is 0, then the stream must have disconnected.
This condition means that the remote side has orderly finished their sending of the HTTP response.
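Putting these points together, a sketch of the corrected read loop (reusing `stream` and `compressedBuffer` from the code above; `ProcessChunk` is a hypothetical handler for the received data) could be:

```csharp
while (true)
{
    var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
    if (readCount == 0)
    {
        // The remote side has orderly finished sending the HTTP response
        break;
    }
    // Hypothetical handler; process whatever bytes arrived in this chunk
    ProcessChunk(compressedBuffer, readCount);
}
```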
For anyone interested, you can read and deflate manually doing the following:
using (var response = (HttpWebResponse)_request.EndGetResponse(result))
using (var stream = response.GetResponseStream())
using (var compressedMemory = new MemoryStream())
using (var uncompressedMemory = new MemoryStream()) // Added new memory stream
using (var gzipStream = new GZipStream(compressedMemory, CompressionMode.Decompress))
{
    var compressedBuffer = new byte[BlockSize];
    while (stream != null && stream.CanRead)
    {
        var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
        compressedMemory.Write(compressedBuffer, 0, readCount);
        compressedMemory.Position = 0;
        gzipStream.CopyTo(uncompressedMemory); // use CopyTo rather than trying to read
        var outputString = Encoding.UTF8.GetString(uncompressedMemory.ToArray());
        Debug.WriteLine(outputString);
        uncompressedMemory.Position = 0;
        uncompressedMemory.SetLength(0);
        compressedMemory.Position = 0;
        compressedMemory.SetLength(0); // reset length
    }
}

Error (HttpWebRequest): Bytes to be written to the stream exceed the Content-Length bytes size specified

I can't seem to figure out why I keep getting the following error:
Bytes to be written to the stream exceed the Content-Length bytes size specified.
at the following line:
writeStream.Write(bytes, 0, bytes.Length);
This is on a Windows Forms project. If anyone knows what is going on here I would surely owe you one.
private void Post()
{
    HttpWebRequest request = null;
    Uri uri = new Uri("xxxxx");
    request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    XmlDocument doc = new XmlDocument();
    doc.Load("XMLFile1.xml");
    request.ContentLength = doc.InnerXml.Length;
    using (Stream writeStream = request.GetRequestStream())
    {
        UTF8Encoding encoding = new UTF8Encoding();
        byte[] bytes = encoding.GetBytes(doc.InnerXml);
        writeStream.Write(bytes, 0, bytes.Length);
    }
    string result = string.Empty;
    request.ProtocolVersion = System.Net.HttpVersion.Version11;
    request.KeepAlive = false;
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            using (Stream responseStream = response.GetResponseStream())
            {
                using (System.IO.StreamReader readStream = new System.IO.StreamReader(responseStream, Encoding.UTF8))
                {
                    result = readStream.ReadToEnd();
                }
            }
        }
    }
    catch (Exception exp)
    {
        // MessageBox.Show(exp.Message);
    }
}
There are three possible options:

1. Fix the ContentLength as described in the answer from @rene.
2. Don't set the ContentLength; the HttpWebRequest buffers the data and sets the ContentLength automatically.
3. Set the SendChunked property to true, and don't set the ContentLength. The request is sent chunk-encoded to the webserver (needs HTTP 1.1 and has to be supported by the webserver).
Code:
...
request.SendChunked = true;
using (Stream writeStream = request.GetRequestStream())
{ ... }
The encoded byte array from your InnerXml might be longer than the string length, as some characters take up 2 or 3 bytes each in UTF-8 encoding.
Change your code as follows:
using (Stream writeStream = request.GetRequestStream())
{
    UTF8Encoding encoding = new UTF8Encoding();
    byte[] bytes = encoding.GetBytes(doc.InnerXml);
    request.ContentLength = bytes.Length;
    writeStream.Write(bytes, 0, bytes.Length);
}
To show exactly what is going on, try this in LINQPad:
var s = "é";
s.Length.Dump("string length");
Encoding.UTF8.GetBytes(s).Length.Dump("array length");
This will output:
string length: 1
array length: 2
and now use an e without the accent:
var s = "e";
s.Length.Dump("string length");
Encoding.UTF8.GetBytes(s).Length.Dump("array length");
which will output:
string length: 1
array length: 1
So remember: string length and the number of bytes needed for a specific encoding might differ.
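If you only need the byte count (for example, to set ContentLength) without allocating the array, Encoding.GetByteCount gives the same number; continuing the LINQPad example:

```csharp
var s = "é";
// GetByteCount returns the encoded size without building the byte array
Encoding.UTF8.GetByteCount(s).Dump("byte count"); // byte count: 2
```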

c# .NET Fast (realtime) webresponse reading

I have code that sends a GET request and receives the answer in a stream, which I read to the end with a StreamReader. Here is the code:
HttpWebRequest requestGet = (HttpWebRequest)WebRequest.Create(url);
requestGet.Method = "GET";
requestGet.Timeout = 5000;
HttpWebResponse responseGet = (HttpWebResponse)requestGet.GetResponse();
StreamReader reader = new StreamReader(responseGet.GetResponseStream());
StringBuilder output = new StringBuilder();
output.Append(reader.ReadToEnd());
responseGet.Close();
But I don't like that the program waits until all data is received before it starts working with the response. It would be great if I could do it like this (pseudocode):
//here sending GET request
do
{
response.append(streamPart recieved);
//here work with response
} while (stream not ended)
I tried streamReader.Read(char[], int32_1, int32_2), but I can't specify int32_2 because I don't know how many characters I have received. And if I use ReadToEnd, it waits for the whole response to load.
Read takes a char[] buffer and returns the number of characters read from the stream.
Here's an example of how to read the response in chunks:
public static void Main(string[] args)
{
    var req = WebRequest.Create("http://www.google.com");
    req.Method = "GET";
    req.Timeout = 5000;
    using (var response = req.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        char[] buffer = new char[1024];
        int read = 0;
        int i = 0;
        do
        {
            read = reader.Read(buffer, 0, buffer.Length);
            Console.WriteLine("{0}: Read {1} characters", i++, read);
            Console.WriteLine("'{0}'", new String(buffer, 0, read));
            Console.WriteLine();
        } while (!reader.EndOfStream);
    }
}
I didn't see any extraneous white space or repeated data.

Out of Memory Exception - Reading large text file for HttpWebRequest

I am trying to read 500 MB text file to send it contents through HttpWebRequest. According to my requirement, I cannot send the data in chunks. Code is as follows :
using (StreamReader reader = new StreamReader(filename))
{
    postData = reader.ReadToEnd();
}
byte[] byteArray = Encoding.UTF8.GetBytes(postData);
request.ContentType = "text/plain";
request.ContentLength = byteArray.Length;
Stream dataStream = request.GetRequestStream();
dataStream.Write(byteArray, 0, byteArray.Length);
dataStream.Close();
WebResponse response = request.GetResponse();
Console.WriteLine(((HttpWebResponse)response).StatusDescription);
dataStream = response.GetResponseStream();
using (StreamReader reader = new StreamReader(dataStream))
{
responseFromServer = reader.ReadToEnd();
}
Console.WriteLine(responseFromServer);
dataStream.Close();
response.Close();
Reading such large file gives me out of memory exception. Is there a way I can do this?
Sounds like you may be encountering this documented issue with HttpWebRequest. Per the KB article, try setting the HttpWebRequest.AllowWriteStreamBuffering property to false.
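As a sketch, disabling write buffering before the request body is written might look like this (`request` and `filename` are from the question's code):

```csharp
// Disable in-memory buffering of the request body so the 500 MB payload
// is streamed to the server instead of being held in RAM first.
// Note: with buffering off, ContentLength must be set up front (or
// SendChunked enabled), since the request body cannot be replayed.
request.AllowWriteStreamBuffering = false;
request.ContentLength = new FileInfo(filename).Length;
```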
All files are transferred in chunks - that's what an ethernet packet is; it's a single chunk of data. I would wager that the requirement really means "this file must be transferred in a single web service call."
Assuming that's the case, you'd read the data from disk into a 64KB buffer, and then write the buffer to the request.
request.ContentType = "text/plain";
request.ContentLength = new FileInfo(filename).Length;
using (Stream dataStream = request.GetRequestStream())
using (FileStream file = File.OpenRead(filename))
{
    const int BUFFER_SIZE = 65536;
    byte[] buffer = new byte[BUFFER_SIZE];
    int count;
    // Use a FileStream rather than a StreamReader: StreamReader reads
    // chars, not the raw bytes we need to write to the request stream.
    // Read returning 0 signals end of file.
    while ((count = file.Read(buffer, 0, BUFFER_SIZE)) > 0)
    {
        dataStream.Write(buffer, 0, count);
    }
}
