I have the following connection code:
_request = (HttpWebRequest)WebRequest.Create(complianceUrl);
_request.Method = "GET";
var authInfo = string.Format("{0}:{1}", _username, _password);
authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
_request.Headers.Add("Authorization", "Basic " + authInfo);
// set stream parameters
_request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
_request.Headers.Add("Accept-Encoding", "gzip");
_request.Accept = "application/json";
_request.ContentType = "application/json";
_request.ReadWriteTimeout = 30000;
_request.AllowReadStreamBuffering = false;
_request.Timeout = 30; //seconds, sends 15-second heartbeat.
_asyncCallback = HandleResult; // set HandleResult as the callback method
_request.BeginGetResponse(_asyncCallback, _request); // begin the async GET on the request
This works fine: the buffer fills with data, and with large volumes of data that is fine. But with low volumes of data it takes a while for the buffer to fill up, and I want to periodically flush the buffer if there has been no activity for a while.
I tried to do that this way:
_request.GetRequestStream().FlushAsync();
But this is wrong: it throws a ProtocolViolationException, I guess because this is a GET verb?
Can anyone tell me how to forcibly cause the connection to dump the buffer to the client?
Edit: response-handling code added:
private void HandleResult(IAsyncResult result)
{
using (var response = (HttpWebResponse) _request.EndGetResponse(result))
using (var stream = response.GetResponseStream())
using (var memory = new MemoryStream())
{
var compressedBuffer = new byte[BlockSize];
while (stream != null && stream.CanRead)
{
var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
// if readCount is 0, then the stream must have disconnected. Process and abort!
if (readCount == 0)
{
}
}
}
}
It is not possible to make the server side of an HTTP call send you data in a particular way. Neither HTTP nor TCP have provisions for that. You have to take what you get, or talk to the vendor.
Read does not block until a buffer is full. It gives you what arrives immediately. This behavior is well known; this is not a guess of mine.
The third-party service sends a CRLF heartbeat every 15 seconds, which will also eventually fill up the buffer.
This sounds like the service is doing what you want. If that is true, the problem must be in your code. But not in the code shown in the question. Try this:
while (true)
{
var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
Console.WriteLine(readCount);
}
This should show 2 bytes every 15 seconds. If not, the vendor is at fault. If yes, you are.
while (stream != null && stream.CanRead) is weird because neither of these conditions can become false. And even if they did become false, what would you do about it?! This should be while (true) plus a break on stream depletion.
// if readCount is 0, then the stream must have disconnected.
This condition means that the remote side has finished sending the HTTP response in an orderly fashion.
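For reference, a minimal sketch of that loop shape (ProcessChunk is a hypothetical placeholder for whatever you do with the data):
var buffer = new byte[BlockSize];
while (true)
{
    var readCount = stream.Read(buffer, 0, buffer.Length);
    if (readCount == 0)
        break; // the server has finished the response body
    ProcessChunk(buffer, readCount); // hypothetical: consume only the bytes read
}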
For anyone interested, you can read and decompress manually by doing the following:
using (var response = (HttpWebResponse) _request.EndGetResponse(result))
using (var stream = response.GetResponseStream())
using (var compressedMemory = new MemoryStream())
using (var uncompressedMemory = new MemoryStream()) // Added new memory stream
using (var gzipStream = new GZipStream(compressedMemory, CompressionMode.Decompress))
{
var compressedBuffer = new byte[BlockSize];
while (true)
{
    var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
    if (readCount == 0)
        break; // remote side finished the response; stop reading
    compressedMemory.Write(compressedBuffer, 0, readCount); // write only the bytes actually read
    compressedMemory.Position = 0;
    gzipStream.CopyTo(uncompressedMemory); // use CopyTo rather than trying to read
    var outputString = Encoding.UTF8.GetString(uncompressedMemory.ToArray());
    Debug.WriteLine(outputString);
    uncompressedMemory.Position = 0;
    uncompressedMemory.SetLength(0); // reset both streams for the next chunk
    compressedMemory.Position = 0;
    compressedMemory.SetLength(0);
}
}
Related
Hi, I want to upload some dynamically generated content to my Web API.
On the client I use HttpWebRequest. The data should be uploaded synchronously, and I want to write to the stream AFTER(!) I have executed the HTTP request.
(From server to client it works fine, but from client to server I get some exceptions.)
The client implementation looks like:
HttpWebRequest httpWebRequest = HttpWebRequest.Create(myUrl) as HttpWebRequest;
httpWebRequest.Method = "POST";
httpWebRequest.Headers["Authorization"] = "Basic " + ... ;
httpWebRequest.PreAuthenticate = true;
httpWebRequest.SendChunked = true;
//httpWebRequest.AllowWriteStreamBuffering = false; //does not help...
httpWebRequest.ContentType = "application/octet-stream";
Stream st = httpWebRequest.GetRequestStream();
Task<WebResponse> response = httpWebRequest.GetResponseAsync();
// NOW: Write after GetResponse()
var b = Encoding.UTF8.GetBytes("Test1");
st.Write(b, 0, b.Length);
b = Encoding.UTF8.GetBytes("Test2");
st.Write(b, 0, b.Length);
b = Encoding.UTF8.GetBytes("Test3");
st.Write(b, 0, b.Length);
st.Close();
var x = response.Result;
Stream resultStream = x.GetResponseStream();
//do some output...
I get exceptions (NotSupportedException: "The stream does not support concurrent IO read or write operations.") on stream.Write().
Why do I get an exception here? Sometimes the first writes work and the later writes throw the exception.
At the beginning the stream.CanWrite property is true, but after the first, second, or third write it becomes false... and then the exception is thrown on the next write.
Edit: Changing AllowWriteStreamBuffering did not help
I found my problem.
The order of my code caused the problem.
The solution is to call it in this order:
GetRequestStream (writing asynchronously to the stream; the request is sent to the server after the first write), then:
GetResponseAsync()
GetResponseStream()
My understanding was that "GetResponseAsync" triggers the client to send the request (for now, the headers only), but I discovered this step to be unnecessary, because the request had already been sent after the first few bytes were written to the stream.
The second cause of my problems was Fiddler: Fiddler currently only supports streaming of responses, not requests.
The working code using the HttpWebRequest class:
HttpWebRequest httpWebRequest = HttpWebRequest.Create("http://xxx") as HttpWebRequest;
httpWebRequest.Method = "POST";
httpWebRequest.Headers["Authorization"] = "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pw"));
httpWebRequest.PreAuthenticate = true;
httpWebRequest.SendChunked = true;
httpWebRequest.AllowWriteStreamBuffering = false;
httpWebRequest.AllowReadStreamBuffering = false;
httpWebRequest.ContentType = "application/octet-stream";
Stream st = httpWebRequest.GetRequestStream();
Console.WriteLine("Go");
try
{
st.Write(buffer, 0, buffer.Length); // with the first write, the request will be sent
st.Write(buffer, 0, buffer.Length);
st.Write(buffer, 0, buffer.Length);
for (int i = 1; i <= 10; i++)
{
st.Write(buffer, 0, buffer.Length); // still writing while my ASP.NET Web API can already read from the stream
}
}
catch (WebException ex)
{
var y = ex.Response;
}
finally
{
st.Close();
}
// Now we can read the response from the server in chunks
Task<WebResponse> response = httpWebRequest.GetResponseAsync();
Stream resultStream = response.Result.GetResponseStream();
byte[] data = new byte[1028];
int bytesRead;
while ((bytesRead = resultStream.Read(data, 0, data.Length)) > 0)
{
string output = System.Text.Encoding.UTF8.GetString(data, 0, bytesRead);
Console.WriteLine(output);
}
The working code using the HttpClient class:
HttpClientHandler ch = new HttpClientHandler();
HttpClient c = new HttpClient(ch);
c.DefaultRequestHeaders.TransferEncodingChunked = true;
c.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pw")));
AsyncStream asyncStream = new AsyncStream(); // custom class exposing a WriteToStream() method for use with PushStreamContent
PushStreamContent streamContent = new PushStreamContent(asyncStream.WriteToStream);
HttpRequestMessage requestMessage = new HttpRequestMessage(new HttpMethod("POST"), "http://XXX") { Content = streamContent };
requestMessage.Headers.TransferEncodingChunked = true;
HttpResponseMessage response = await c.SendAsync(requestMessage, HttpCompletionOption.ResponseHeadersRead);
// The request has been sent, since the first write in the "WriteToStream()" method.
response.EnsureSuccessStatusCode();
Task<Stream> result = response.Content.ReadAsStreamAsync();
byte[] data = new byte[1028];
int bytesRead;
while ((bytesRead = await result.Result.ReadAsync(data, 0, data.Length)) > 0)
{
string output = System.Text.Encoding.UTF8.GetString(data, 0, bytesRead);
Console.WriteLine(output);
}
Console.ReadKey();
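The original post does not show the AsyncStream class, so here is a hedged sketch of what it might look like; the signature matches the Action<Stream, HttpContent, TransportContext> delegate that PushStreamContent's constructor accepts, and the body content is made up for illustration:
public class AsyncStream
{
    public void WriteToStream(Stream outputStream, HttpContent content, TransportContext context)
    {
        using (outputStream)
        {
            byte[] chunk = Encoding.UTF8.GetBytes("Test");
            for (int i = 0; i < 10; i++)
            {
                outputStream.Write(chunk, 0, chunk.Length); // each write is sent as a chunk
            }
        } // disposing the stream ends the chunked request body
    }
}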
You are using objects of an HttpWebRequest on multiple threads concurrently. response is a task that runs concurrently with your writes. That is clearly thread-unsafe.
Also, I don't see what you want to achieve. HTTP has a request-then-response model. The server can't (usually) send a single response byte before receiving the entire request. Theoretically, this is possible. But that is very unusual and likely not supported by the .NET BCL. This would only be supported by a (very unusual) custom HTTP server of yours.
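For what it's worth, a minimal sketch of the safe ordering under that constraint (assuming an async method): finish and close the request stream first, and only then read the response, so the two never run concurrently:
using (Stream st = httpWebRequest.GetRequestStream())
{
    byte[] b = Encoding.UTF8.GetBytes("Test1");
    st.Write(b, 0, b.Length); // the request starts going out on the first write
} // closing the stream ends the chunked request body
using (WebResponse response = await httpWebRequest.GetResponseAsync())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}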
I have code that sends a GET request and receives the answer as a stream, which I read to the end with a StreamReader. Here is the code:
HttpWebRequest requestGet = (HttpWebRequest)WebRequest.Create(url);
requestGet.Method = "GET";
requestGet.Timeout = 5000;
HttpWebResponse responseGet = (HttpWebResponse)requestGet.GetResponse();
StreamReader reader = new StreamReader(responseGet.GetResponseStream());
StringBuilder output = new StringBuilder();
output.Append(reader.ReadToEnd());
responseGet.Close();
But I don't like that the program waits until all the data has been received before it starts working with the response. It would be great if I could do it like this (pseudocode):
//here sending GET request
do
{
response.append(streamPart recieved);
//here work with response
} while (stream not ended)
I tried streamReader.Read(char[], int32_1, int32_2), but I can't specify int32_2 because I don't know how many characters I have received. And if I use ReadToEnd, it waits for the whole response to load.
Read takes a char[] buffer and returns the number of characters read from the stream.
Here's an example of how to read the response in chunks:
public static void Main(string[] args)
{
var req = WebRequest.Create("http://www.google.com");
req.Method = "GET";
req.Timeout = 5000;
using (var response = req.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
char[] buffer = new char[1024];
int read = 0;
int i = 0;
do
{
read = reader.Read(buffer, 0, buffer.Length);
Console.WriteLine("{0}: Read {1} bytes", i++, read);
Console.WriteLine("'{0}'", new String(buffer, 0, read));
Console.WriteLine();
} while(!reader.EndOfStream);
}
}
I didn't see any extraneous white space or repeated data.
We are trying to track down high CPU usage in our services, and we believe a few areas may be causing infinite loops. Below is one piece of code we suspect. Is there anything specific that sticks out that could make the while loop run indefinitely?
WebRequest request = WebRequest.Create(Url);
request.ContentLength = formDataLength;
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
using (Stream rs = request.GetRequestStream())
{
ASCIIEncoding encoding = new ASCIIEncoding();
var postData = encoding.GetBytes(formData);
rs.Write(postData, 0, postData.Length);
string str = string.Empty;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
using (Stream sm = response.GetResponseStream())
{
int totalBytesRead = 0;
long responseBytesToRead = 1024;
byte[] buffer = new byte[responseBytesToRead];
int bytesRead;
do
{
bytesRead = sm.Read(buffer, totalBytesRead, (int)(responseBytesToRead - totalBytesRead));
totalBytesRead += bytesRead;
} while (totalBytesRead < bytesRead);
request.Abort();
str = Encoding.Default.GetString(buffer);
}
}
return str;
}
From the MSDN documentation:
Return Value
Type: System.Int32
The total number of bytes read into the buffer. This can be less than the number of bytes requested if that many bytes are not currently available, or zero (0) if the end of the stream has been reached.
0 indicates the end of the stream; that is the defined condition for reaching the end. The condition you rely on is unreliable and unnecessary.
Try
while(bytesRead != 0)
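For illustration, a sketch of the read loop built around that termination condition, accumulating into a MemoryStream so responses larger than one buffer are also handled:
using (var memory = new MemoryStream())
{
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = sm.Read(buffer, 0, buffer.Length)) != 0)
    {
        memory.Write(buffer, 0, bytesRead); // append only what was actually read
    }
    str = Encoding.Default.GetString(memory.ToArray());
}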
It is better to use StreamReader
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    str = reader.ReadToEnd();
    // or, to read incrementally:
    // while (!reader.EndOfStream)
    // {
    //     // do read.
    // }
}
I am using the code below to send a byte array to a site. Why is this code not throwing an exception even when there is no internet connection? Even with no connection I am able to get the stream and write to it. I expect it to throw an exception at Stream postStream = request1.EndGetRequestStream(result). Does anyone have any idea why it behaves like this?
private void UploadHttpFile()
{
HttpWebRequest request = WebRequest.CreateHttp(new Uri(myUrl));
request.ContentType = string.Format("multipart/form-data; boundary={0}", boundary);
request.UserAgent = "Mozilla/4.0 (Windows; U; Windows Vista;)";
request.Method = "POST";
request.UseDefaultCredentials = true;
request.BeginGetRequestStream(GetStream, request);
}
private void GetStream(IAsyncResult result)
{
try
{
HttpWebRequest request1 = (HttpWebRequest)result.AsyncState;
using (Stream postStream = request1.EndGetRequestStream(result))
{
int len = postBody.Length;
len += mainBody.Length;
len += endBody.Length;
byte[] postArray = new byte[len + 1];
Encoding.UTF8.GetBytes(postBody.ToString()).CopyTo(postArray, 0);
Encoding.UTF8.GetBytes(mainBody).CopyTo(postArray, postBody.Length);
Encoding.UTF8.GetBytes(endBody).CopyTo(postArray, postBody.Length + mainBody.Length);
postStream.Write(postArray, 0, postArray.Length);
}
    }
    catch (Exception ex)
    {
        // error handling omitted in the original post
    }
}
I expect it's buffering everything until you're done writing, at which point it will be able to use the content length immediately. If you set:
request.AllowWriteStreamBuffering = false;
then I suspect it will fail at least when you write to the stream.
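To illustrate (a sketch, not verified against your exact setup): with buffering off you must either set ContentLength up front or enable chunked transfer, and the first write should then require a live connection:
request.AllowWriteStreamBuffering = false;
request.SendChunked = true; // or set request.ContentLength explicitly instead
// With no network available, getting the request stream or the first
// Write should now fail instead of quietly filling an in-memory buffer.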
Btw, your calculation of the required length for postArray appears to be assuming one byte per character, which won't always be the case... and you're calling ToString on postBody which looks like it's redundant. I'm not sure why you're trying to write in a single call anyway... either you could call Write three times:
byte[] postBodyBytes = Encoding.UTF8.GetBytes(postBody);
postStream.Write(postBodyBytes, 0, postBodyBytes.Length);
// etc
or (preferably) just use a StreamWriter:
using (Stream postStream = request1.EndGetRequestStream(result))
{
using (StreamWriter writer = new StreamWriter(postStream))
{
writer.Write(postBody);
writer.Write(mainBody);
writer.Write(endBody);
}
}
It's also unclear why you've added 1 to the required length when initializing postArray. Are you trying to send an extra "0" byte at the end of the data?
I'm trying to create a small HTTP proxy service. It is not working so well: it serves HTML okay-ish, but it chokes on images. That is, some images.
Sending in a url through my proxy yields 19.4 kb in the response (according to firebug)
Visiting that url directly also yields 19.4 kb in the response, again according to firebug. The difference is, it doesn't show up when I put it through my proxy, but it does when I browse directly.
A completely different url works just fine. Does anyone have any idea?
private void DoProxy()
{
var http = listener.GetContext();
string url = http.Request.QueryString["url"];
WebRequest request = HttpWebRequest.Create(url);
WebResponse response = request.GetResponse();
http.Response.ContentType = response.ContentType;
byte[] content;
using (Stream responseStream = response.GetResponseStream())
content = ReadAll(responseStream);
http.Response.ContentLength64 = content.Length;
http.Response.OutputStream.Write(content, 0, content.Length);
http.Response.Close();
}
private byte[] ReadAll(Stream stream)
{
IList<byte> array = new List<byte>();
int b;
while ((b = stream.ReadByte()) != -1)
array.Add(Convert.ToByte(b));
return array.ToArray();
}
I would try and flush/close the OutputStream before you close the response.
Also, as a second suggestion, have a look at the HTTP traffic from the original site and then through your proxy using an HTTP debugger like Fiddler - there must be a difference when going through your proxy.
To make the ReadAll method more efficient, I would in general avoid loading the full content into memory, because this will blow up on huge files - just stream the data directly from the input stream to the output stream (a direct-copy sketch follows after the code below). If you still want to use byte arrays, consider the following (untested, but it should work):
private byte[] ReadAll(Stream stream)
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    List<byte> arrayList = new List<byte>();
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // append only the bytes actually read in this pass
        arrayList.AddRange(new ArraySegment<byte>(buffer, 0, bytesRead));
    }
    return arrayList.ToArray();
}
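And a sketch of the direct-copy alternative mentioned above, which never holds the whole body in memory; you would call it with the upstream response stream and http.Response.OutputStream:
private void CopyStream(Stream input, Stream output)
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead); // copy in fixed-size chunks
    }
}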
You can try to replace
http.Response.Close();
with
http.Response.Flush();
http.Response.End();
A problem could be that you don't specify the MIME type of the response. Browsers these days are very forgiving, but maybe there is a circumstance where the browser doesn't know how to handle whatever you are shoving down its throat.
I have written a very small file-based HTTP server, presented here, which as far as I can remember can serve images without much of a problem.
Just separate the text response and the image response, and write the outputs separately. I did it as below and it worked for me:
static void Main(string[] args)
{
HttpListener server = new HttpListener();
server.Prefixes.Add("http://localhost:9020/");
server.Start();
Console.WriteLine("Listening...");
while (true)
{
try
{
HttpListenerContext context = server.GetContext();
HttpListenerResponse response = context.Response;
String localpath = context.Request.Url.LocalPath;
string page = Directory.GetCurrentDirectory() + localpath;
string msg = "";
bool imgtest = false;
if (localpath == "/")
page = "index.html";
Console.WriteLine(localpath);
if (!page.Contains("jpg") && !page.Contains("png"))//Separates image request
{
TextReader tr = new StreamReader(page);
msg = tr.ReadToEnd();
tr.Dispose();
}
else
{
byte[] output = File.ReadAllBytes(page);
response.ContentLength64 = output.Length;
Stream st1 = response.OutputStream;
st1.Write(output, 0, output.Length);
imgtest = true;
}
if (imgtest==false)
{
byte[] buffer = Encoding.UTF8.GetBytes(msg);
response.ContentLength64 = buffer.Length;
Stream st = response.OutputStream;
st.Write(buffer, 0, buffer.Length);
context.Response.Close();
}
}
catch (Exception ex)
{
Console.WriteLine("Error: "+ex);
Console.ReadKey();
}
    }
}