Hi, I want to upload some dynamically generated content to my Web API. On the client I use HttpWebRequest. The data should be uploaded synchronously, and I want to write to the stream AFTER(!) I have executed the HTTP request.
(From server to client this works fine, but from client to server I get exceptions.)
The client implementation looks like this:
HttpWebRequest httpWebRequest = HttpWebRequest.Create(myUrl) as HttpWebRequest;
httpWebRequest.Method = "POST";
httpWebRequest.Headers["Authorization"] = "Basic " + ... ;
httpWebRequest.PreAuthenticate = true;
httpWebRequest.SendChunked = true;
//httpWebRequest.AllowWriteStreamBuffering = false; //does not help...
httpWebRequest.ContentType = "application/octet-stream";
Stream st = httpWebRequest.GetRequestStream();
Task<WebResponse> response = httpWebRequest.GetResponseAsync();
// NOW: Write after GetResponse()
var b = Encoding.UTF8.GetBytes("Test1");
st.Write(b, 0, b.Length);
b = Encoding.UTF8.GetBytes("Test2");
st.Write(b, 0, b.Length);
b = Encoding.UTF8.GetBytes("Test3");
st.Write(b, 0, b.Length);
st.Close();
var x = response.Result;
Stream resultStream = x.GetResponseStream();
//do some output...
I get exceptions (NotSupportedException: The stream does not support concurrent IO read or write operations.) on stream.Write(). Why do I get an exception here? Sometimes the first writes work and the later writes throw the exception.
At the beginning the stream.CanWrite property is true, but after the first, second, or third write it becomes false, and then the exception is thrown on the next write.
Edit: Changing AllowWriteStreamBuffering did not help
I found my problem: the order of my code caused it.
The solution is to call things in this order:
GetRequestStream()
(write asynchronously to the stream; the request is sent to the server with the first write)
GetResponseAsync()
GetResponseStream()
I thought "GetResponseAsync" triggers the client to send the request (the headers only, at first), but it turned out to be unnecessary, because the request had already been sent after the first few bytes were written to the stream.
The second cause of my problems was Fiddler: Fiddler currently only supports streaming of responses, not requests.
The working code using the HttpWebRequest class:
HttpWebRequest httpWebRequest = HttpWebRequest.Create("http://xxx") as HttpWebRequest;
httpWebRequest.Method = "POST";
httpWebRequest.Headers["Authorization"] = "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pw"));
httpWebRequest.PreAuthenticate = true;
httpWebRequest.SendChunked = true;
httpWebRequest.AllowWriteStreamBuffering = false;
httpWebRequest.AllowReadStreamBuffering = false;
httpWebRequest.ContentType = "application/octet-stream";
Stream st = httpWebRequest.GetRequestStream();
byte[] buffer = Encoding.UTF8.GetBytes("Test"); // example payload; the original snippet did not show how 'buffer' was filled
Console.WriteLine("Go");
try
{
st.Write(buffer, 0, buffer.Length); //with the first write, the request is sent.
st.Write(buffer, 0, buffer.Length);
st.Write(buffer, 0, buffer.Length);
for (int i = 1; i <= 10; i++)
{
st.Write(buffer, 0, buffer.Length); //still writing while my ASP.NET Web API can already read from the stream
}
}
catch (WebException ex)
{
var y = ex.Response;
}
finally
{
st.Close();
}
// Now we can read the response from the server in chunks
Task<WebResponse> response = httpWebRequest.GetResponseAsync();
Stream resultStream = response.Result.GetResponseStream();
byte[] data = new byte[1028];
int bytesRead;
while ((bytesRead = resultStream.Read(data, 0, data.Length)) > 0)
{
string output = System.Text.Encoding.UTF8.GetString(data, 0, bytesRead);
Console.WriteLine(output);
}
The same thing using the HttpClient class:
HttpClientHandler ch = new HttpClientHandler();
HttpClient c = new HttpClient(ch);
c.DefaultRequestHeaders.TransferEncodingChunked = true;
c.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pw")));
AsyncStream asyncStream = new AsyncStream(); // custom helper that supplies the WriteToStream() callback for PushStreamContent (sketched below)
PushStreamContent streamContent = new PushStreamContent(asyncStream.WriteToStream);
HttpRequestMessage requestMessage = new HttpRequestMessage(new HttpMethod("POST"), "http://XXX") { Content = streamContent };
requestMessage.Headers.TransferEncodingChunked = true;
HttpResponseMessage response = await c.SendAsync(requestMessage, HttpCompletionOption.ResponseHeadersRead);
// The request has been sent, since the first write in the "WriteToStream()" method.
response.EnsureSuccessStatusCode();
Stream resultStream = await response.Content.ReadAsStreamAsync();
byte[] data = new byte[1028];
int bytesRead;
while ((bytesRead = await resultStream.ReadAsync(data, 0, data.Length)) > 0)
{
string output = System.Text.Encoding.UTF8.GetString(data, 0, bytesRead);
Console.WriteLine(output);
}
Console.ReadKey();
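A minimal sketch of what the custom AsyncStream helper might look like; the class name and the test payload are my own placeholders, only the WriteToStream signature is dictated by PushStreamContent:
public class AsyncStream
{
    // Called by PushStreamContent once the request is ready to stream its body.
    public async Task WriteToStream(Stream outputStream, HttpContent content, TransportContext context)
    {
        using (outputStream)
        {
            for (int i = 1; i <= 10; i++)
            {
                byte[] chunk = Encoding.UTF8.GetBytes("Test" + i); // placeholder payload
                await outputStream.WriteAsync(chunk, 0, chunk.Length);
                await outputStream.FlushAsync(); // push the data out instead of buffering it
            }
        } // disposing the stream ends the chunked request body
    }
}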
You are using the same HttpWebRequest object from multiple threads concurrently: response is a task that runs concurrently with your writes. That is clearly not thread-safe.
Also, I don't see what you want to achieve. HTTP has a request-then-response model: the server can't (usually) send a single response byte before it has received the entire request. Theoretically this is possible, but it is very unusual and likely not supported by the .NET BCL; it would only work with a (very unusual) custom HTTP server of yours.
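For illustration, a minimal reordering of the snippet from the question that avoids using the request concurrently: finish and close the request stream first, then ask for the response.
using (Stream st = httpWebRequest.GetRequestStream())
{
    byte[] b = Encoding.UTF8.GetBytes("Test1");
    st.Write(b, 0, b.Length);
    b = Encoding.UTF8.GetBytes("Test2");
    st.Write(b, 0, b.Length);
} // the request body is complete once the stream is closed

Task<WebResponse> response = httpWebRequest.GetResponseAsync(); // started only after the writes are done
using (WebResponse x = response.Result)
using (Stream resultStream = x.GetResponseStream())
{
    // read the response here
}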
I have the following connection code:
_request = (HttpWebRequest)WebRequest.Create(complianceUrl);
_request.Method = "GET";
var authInfo = string.Format("{0}:{1}", _username, _password);
authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
_request.Headers.Add("Authorization", "Basic " + authInfo);
// set stream parameters
_request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
_request.Headers.Add("Accept-Encoding", "gzip");
_request.Accept = "application/json";
_request.ContentType = "application/json";
_request.ReadWriteTimeout = 30000;
_request.AllowReadStreamBuffering = false;
_request.Timeout = 30; // note: Timeout is in milliseconds, not seconds; the service sends a 15-second heartbeat.
_asyncCallback = HandleResult; // set HandleResult as the callback method
_request.BeginGetResponse(_asyncCallback, _request); // start the asynchronous request
This works fine: the buffer fills with data, and with large volumes of data that is fine. But with low volumes of data it takes a while for the buffer to fill up, and I want to periodically flush the buffer if there has been no activity for a while.
I tried to do that this way:
_request.GetRequestStream().FlushAsync();
But this is wrong, as it throws a ProtocolViolationException, I guess because this is a GET verb?
Can anyone tell me how to forcibly cause the connection to dump the buffer to the client?
Edit: response-handling code added:
private void HandleResult(IAsyncResult result)
{
using (var response = (HttpWebResponse) _request.EndGetResponse(result))
using (var stream = response.GetResponseStream())
using (var memory = new MemoryStream())
{
var compressedBuffer = new byte[BlockSize];
while (stream != null && stream.CanRead)
{
var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
// if readCount is 0, then the stream must have disconnected. Process and abort!
if (readCount == 0)
{
}
}
}
}
It is not possible to make the server side of an HTTP call send you data in a particular way. Neither HTTP nor TCP have provisions for that. You have to take what you get, or talk to the vendor.
Read does not block until a buffer is full. It gives you what arrives immediately. This behavior is well known, this is not a guess of mine.
The 3rd-party service sends 15-second heartbeats (CRLF), which will also eventually fill up the buffer.
This sounds like the service is doing what you want. If that is true, the problem must be in your code. But not in the code shown in the question. Try this:
while (true)
{
var readCount = stream.Read(compressedBuffer, 0, compressedBuffer.Length);
Console.WriteLine(readCount);
}
This should show 2 bytes every 15 seconds. If not, the vendor is at fault. If yes, you are.
while (stream != null && stream.CanRead) is weird, because neither of these conditions can become false. And even if they did become false, what would you do about it? This should be while (true) plus a break on stream depletion.
// if readCount is 0, then the stream must have disconnected.
This condition actually means that the remote side has finished sending the HTTP response in an orderly way.
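A minimal sketch of that loop shape (the buffer size and the Process call are placeholders):
var buffer = new byte[8192];
while (true)
{
    int readCount = stream.Read(buffer, 0, buffer.Length);
    if (readCount == 0)
        break; // the remote side has finished sending the response body
    Process(buffer, readCount); // hypothetical handler for whatever arrived
}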
For anyone interested, you can read and decompress manually by doing the following:
using (var response = (HttpWebResponse) _request.EndGetResponse(result))
using (var stream = response.GetResponseStream())
using (var compressedMemory = new MemoryStream())
using (var uncompressedMemory = new MemoryStream()) // Added new memory stream
using (var gzipStream = new GZipStream(compressedMemory, CompressionMode.Decompress))
{
var compressedBuffer = new byte[BlockSize];
while (stream != null && stream.CanRead)
{
var readCount = stream.Read(compressedBuffer, 0, readCount > 0 ? compressedBuffer.Length : compressedBuffer.Length);
if (readCount == 0)
break; // the server has finished sending the response
compressedMemory.Write(compressedBuffer, 0, readCount); // write only the bytes actually read
compressedMemory.Position = 0;
gzipStream.CopyTo(uncompressedMemory); // use copy to rather than trying to read
var outputString = Encoding.UTF8.GetString(uncompressedMemory.ToArray());
Debug.WriteLine(outputString);
uncompressedMemory.Position = 0;
uncompressedMemory.SetLength(0);
compressedMemory.Position = 0;
compressedMemory.SetLength(0); // reset length
}
}
I have read other similar questions and tried the solutions from them, but since that did not work, I am posting here.
When I send the POST request below, it fails with the following error message:
System.Net.ProtocolViolationException: You must write ContentLength bytes to the request stream before calling [Begin]GetResponse.
at System.Net.HttpWebRequest.GetResponse()
....
....
My GET requests to other URL endpoints work fine; I am only having this issue when issuing a POST request. Also, I have already set the ContentLength in the code appropriately, but I am still unable to send the POST request. Thoughts?
public void TestSubmitJobWithParams1()
{
const string RestActionPath = "URL_GOES_HERE";
// if you have multiple parameters, separate them with the '&' delimiter.
var postData = HttpUtility.UrlEncode("MaxNumberOfRowsPerSFSTask") + "=" + HttpUtility.UrlEncode("3000");
var request = (HttpWebRequest)WebRequest.Create(RestActionPath);
request.Method = "POST";
request.Credentials = CredentialCache.DefaultCredentials;
request.PreAuthenticate = true;
request.ContentLength = 0;
request.Timeout = 150000;
request.CachePolicy = new RequestCachePolicy(RequestCacheLevel.BypassCache);
request.ContentType = "application/x-www-form-urlencoded";
byte[] bytes = Encoding.ASCII.GetBytes(postData);
request.ContentLength = bytes.Length;
Stream newStream = request.GetRequestStream();
newStream.Write(bytes, 0, bytes.Length);
string output = string.Empty;
try
{
using (var response = request.GetResponse())
{
using (var stream = new StreamReader(response.GetResponseStream(), Encoding.GetEncoding(1252)))
{
output = stream.ReadToEnd();
}
}
}
catch (WebException ex)
{
if (ex.Status == WebExceptionStatus.ProtocolError)
{
using (var stream = new StreamReader(ex.Response.GetResponseStream()))
{
output = stream.ReadToEnd();
}
}
else if (ex.Status == WebExceptionStatus.Timeout)
{
output = "Request timeout is expired.";
}
}
catch (ProtocolViolationException e)
{
Console.WriteLine(e);
}
Console.WriteLine(output);
Console.ReadLine();
}
A few things:
Firstly, you don't need to set ContentLength explicitly; just leave it out (it defaults to -1, and with the default write buffering the framework fills it in for you). You're actually setting it twice, so remove both assignments.
Also, you need to call Close() on the request stream before calling GetResponse():
Stream newStream = request.GetRequestStream();
newStream.Write(bytes, 0, bytes.Length);
newStream.Close();
Alternatively, you could wrap it in a using statement, which handles the closing and disposal for you:
using (var newStream = request.GetRequestStream())
{
newStream.Write(bytes, 0, bytes.Length);
}
You also don't technically need HttpUtility.UrlEncode() for the postData, since there's nothing in your string that would violate a URL's integrity. Just do:
string postData = "MaxNumberOfRowsPerSFSTask=3000";
Let me know if that solves it for you.
For a more thorough rundown, check this out: http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.getresponse.aspx
Specifically, the section about ProtocolViolationException and also where it says:
When using the POST method, you must get the request stream, write the data to be posted, and close the stream. This method blocks waiting for content to post; if there is no time-out set and you do not provide content, the calling thread blocks indefinitely.
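Putting those points together, the write/close/GetResponse sequence would look roughly like this (same postData as in the question):
var request = (HttpWebRequest)WebRequest.Create(RestActionPath);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

byte[] bytes = Encoding.ASCII.GetBytes("MaxNumberOfRowsPerSFSTask=3000");
request.ContentLength = bytes.Length; // optional; if set, it must match what is actually written

using (var requestStream = request.GetRequestStream())
{
    requestStream.Write(bytes, 0, bytes.Length);
} // the stream is closed here, before GetResponse() is called

using (var response = request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}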
I have two identical web services currently installed on the same PC for testing purposes. A request that is received by the first service is supposed to go to the second one as well. My intention was to do this using an HttpModule; I handle the request during Application.BeginRequest.
public void AsyncForwardRequest(HttpRequest request)
{
try
{
// Prepare web request...
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(WSaddress);
req.Credentials = CredentialCache.DefaultCredentials;
req.Headers.Add("SOAPAction", request.Headers["SOAPAction"]);
req.ContentType = request.Headers["Content-Type"];
req.Accept = request.Headers["Accept"];
req.Method = request.HttpMethod;
// Send the data.
long posStream = request.InputStream.Position;
long len = request.InputStream.Length;
byte[] buff = new byte[len];
request.InputStream.Seek(0, SeekOrigin.Begin);
if (len < int.MaxValue && request.InputStream.Read(buff, 0, (int)len) > 0)
{
using (Stream stm = req.GetRequestStream())
{
stm.Write(buff, 0, (int)len);
}
request.InputStream.Position = posStream;
DebugOutputStream(request.InputStream);
request.InputStream.Seek(0, SeekOrigin.Begin);
IAsyncResult result = (IAsyncResult)req.BeginGetResponse(new AsyncCallback(RespCallback), req);
}
}
catch (Exception ex)
{
App.Error2(String.Format("RequestDuplicatorModule - BeginRequest; ERROR begin request: {0}", ex.Message), ex);
}
}
private static void RespCallback(IAsyncResult asynchronousResult)
{
try
{
// State of request is asynchronous.
var req = (HttpWebRequest)asynchronousResult.AsyncState;
WebResponse resp = (HttpWebResponse)req.EndGetResponse(asynchronousResult);
Stream responseStream = resp.GetResponseStream();
Byte[] arr = new Byte[1024 * 100];
int bytesRead = responseStream.Read(arr, 0, arr.Length); // capture how many bytes were actually read
if (bytesRead > 0)
{
string body = new ASCIIEncoding().GetString(arr, 0, bytesRead);
Debug.Print(body);
}
}
catch (WebException ex)
{
App.Error2(String.Format("RequestDuplicatorModule - BeginRequest; ERROR begin request: {0}", ex.Message), ex);
}
}
The call (HttpWebResponse)req.EndGetResponse(asynchronousResult) throws an exception (500 Internal Server Error) and fails. The original request goes through fine.
request.InnerStream:
...
Does this have anything to do with the web service addresses being different? In my case they're:
http://localhost/WS/service.asmx
http://localhost/WS_replicate/service.asmx
I would at least implement this, to be able to read the content of an HTTP response that has status 500:
catch (WebException ex)
{
HttpWebResponse webResponse = (HttpWebResponse)ex.Response;
Stream dataStream = webResponse.GetResponseStream();
if (dataStream != null)
{
StreamReader reader = new StreamReader(dataStream);
response = reader.ReadToEnd();
}
}
Not answering your question directly, but I would suggest having a simple asmx that reads the request and posts it to the two different web services. If you need some code for that, let me know.
An additional advantage is that the asmx would be easier to support, both for the developer and for whoever handles the deployment. You could add configuration options to make the actual URLs dynamic. A rough sketch of the idea follows.
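A rough sketch of that idea, written as a generic handler (IHttpHandler) rather than a full asmx; the class name, the hard-coded target URLs, and the minimal error handling are assumptions:
public class DuplicatorHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Buffer the incoming SOAP body once so it can be forwarded to both services.
        byte[] body;
        using (var ms = new MemoryStream())
        {
            context.Request.InputStream.CopyTo(ms);
            body = ms.ToArray();
        }

        // These could come from configuration to keep the URLs dynamic.
        string[] targets = { "http://localhost/WS/service.asmx", "http://localhost/WS_replicate/service.asmx" };
        foreach (string target in targets)
        {
            var req = (HttpWebRequest)WebRequest.Create(target);
            req.Method = "POST";
            req.ContentType = context.Request.ContentType;
            req.Headers.Add("SOAPAction", context.Request.Headers["SOAPAction"]);
            using (Stream stm = req.GetRequestStream())
            {
                stm.Write(body, 0, body.Length);
            }
            req.GetResponse().Close(); // error handling omitted in this sketch
        }
    }
}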
I was able to modify the AsyncForwardRequest method so that I can edit the SOAP envelope itself:
string buffString = System.Text.UTF8Encoding.UTF8.GetString(buff);
using (Stream stm = req.GetRequestStream())
{
// The request stream is write-only (stm.CanRead == false), so the body cannot be read back
// and modified in place. Workaround: decode the byte array into a string, modify <wsa:To>,
// encode the string back into a buffer, and write that buffer to the stream.
buffString = buffString.Replace("http://localhost/WS/Service.asmx", WSaddress);
buff = System.Text.UTF8Encoding.UTF8.GetBytes(buffString);
stm.Write(buff, 0, buff.Length);
}
How can I upload a large string (in my case XML with a BLOB) with POST without GetResponse timing out?
Increasing the timeout helps, but this isn't really a solution: if the server is really dead or the POST was interrupted, I have to wait for the extremely long timeout.
Any ideas?
HttpWebRequest webRequest = null;
string response = "";
byte[] bytes = Encoding.UTF8.GetBytes(xml);
try
{
webRequest = (HttpWebRequest)WebRequest.Create("http://" + this.host + ":" + this.port);
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.Method = "POST";
webRequest.Timeout = 5000;
webRequest.ContentLength = bytes.Length;
using (Stream requeststream = webRequest.GetRequestStream())
{
requeststream.Write(bytes, 0, bytes.Length);
requeststream.Close();
}
using (HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse())
{
using (StreamReader sr = new StreamReader(webResponse.GetResponseStream()))
{
response = sr.ReadToEnd().Trim();
sr.Close();
}
webResponse.Close();
}
}
catch(Exception ex)
{
MessageBox.Show(ex.ToString());
}
return response;
Yes, this is pretty much expected http behaviour.
Options:
have a large timeout (you've already done this), and accept that it could take a long time to legitimately time out (as opposed to taking a while because of bandwidth)
maybe you can apply gzip to the request (and tell the server you're sending it compressed); I honestly don't know if this is supported automatically, but it could certainly be done by the API explicitly checking for a particular header and applying gzip decompression to the payload (a sketch of the client side follows after this list)
change the API to perform a number of small uploads followed by a completion message
live with it
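For the gzip option, a minimal sketch of the client side, assuming the API has been changed to look for a Content-Encoding: gzip header and decompress the body itself (the server-side part is not shown):
// requires System.IO.Compression
byte[] raw = Encoding.UTF8.GetBytes(xml);
byte[] compressed;
using (var ms = new MemoryStream())
{
    using (var gzip = new GZipStream(ms, CompressionMode.Compress, true))
    {
        gzip.Write(raw, 0, raw.Length);
    }
    compressed = ms.ToArray();
}

var webRequest = (HttpWebRequest)WebRequest.Create("http://" + this.host + ":" + this.port);
webRequest.Method = "POST";
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.Headers["Content-Encoding"] = "gzip"; // the server has to know to undo this
webRequest.ContentLength = compressed.Length;
using (Stream requestStream = webRequest.GetRequestStream())
{
    requestStream.Write(compressed, 0, compressed.Length);
}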
I'm trying to create a small HTTP proxy service. It is not working so well: it is able to serve HTML okay, but it chokes on images. That is, some images.
Sending a URL through my proxy yields 19.4 kB in the response (according to Firebug).
Visiting that URL directly also yields 19.4 kB in the response, again according to Firebug. The difference is that the image doesn't show up when I put it through my proxy, but it does when I browse directly.
A completely different URL works just fine. Does anyone have any idea?
private void DoProxy()
{
var http = listener.GetContext();
string url = http.Request.QueryString["url"];
WebRequest request = HttpWebRequest.Create(url);
WebResponse response = request.GetResponse();
http.Response.ContentType = response.ContentType;
byte[] content;
using (Stream responseStream = response.GetResponseStream())
content = ReadAll(responseStream);
http.Response.ContentLength64 = content.Length;
http.Response.OutputStream.Write(content, 0, content.Length);
http.Response.Close();
}
private byte[] ReadAll(Stream stream)
{
IList<byte> array = new List<byte>();
int b;
while ((b = stream.ReadByte()) != -1)
array.Add(Convert.ToByte(b));
return array.ToArray();
}
I would try to flush/close the OutputStream before you close the response.
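For example (a guess at the intended order, using the HttpListener response from the question):
http.Response.OutputStream.Flush();
http.Response.OutputStream.Close();
http.Response.Close();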
Also, as a second suggestion, have a look at the HTTP traffic from the original site and then through your proxy using an HTTP debugger like Fiddler; there must be a difference when using your proxy.
Also, to make the ReadAll method more effective: in general I would avoid loading the full content into memory, because this will blow up on huge files. Just stream the content directly from the input stream to the output stream (see the sketch after the code below). If you still want to use byte arrays, consider the following (untested, but should work):
private byte[] ReadAll(Stream stream)
{
byte[] buffer = new byte[8192];
int bytesRead;
using (var memory = new MemoryStream())
{
while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
{
// append only the bytes actually read in this iteration
memory.Write(buffer, 0, bytesRead);
}
return memory.ToArray();
}
}
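And a minimal sketch of the direct-streaming alternative mentioned above, copying from the remote response straight into the HttpListener output (ContentLength64 is then not known up front, so the response would typically go out chunked):
// inside DoProxy, instead of buffering the whole body:
using (Stream responseStream = response.GetResponseStream())
using (Stream output = http.Response.OutputStream)
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead); // forward each block as soon as it arrives
    }
}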
You can try to replace
http.Response.Close();
with
http.Response.Flush();
http.Response.End();
A problem could be that you don't specify the MIME type of the response. Browsers these days are very forgiving, but maybe there is a circumstance where the browser doesn't know how to handle whatever you are shoving down its throat.
I have written a very small file-based HTTP server, presented here, which as far as I can remember can serve images without much problem.
Just separate the text response and the image response, and write the outputs separately. I did it as below, and it worked for me.
static void Main(string[] args)
{
HttpListener server = new HttpListener();
server.Prefixes.Add("http://localhost:9020/");
server.Start();
Console.WriteLine("Listening...");
while (true)
{
try
{
HttpListenerContext context = server.GetContext();
HttpListenerResponse response = context.Response;
String localpath = context.Request.Url.LocalPath;
string page = Directory.GetCurrentDirectory() + localpath;
string msg = "";
bool imgtest = false;
if (localpath == "/")
page = "index.html";
Console.WriteLine(localpath);
if (!page.Contains("jpg") && !page.Contains("png"))//Separates image request
{
TextReader tr = new StreamReader(page);
msg = tr.ReadToEnd();
tr.Dispose();
}
else
{
byte[] output = File.ReadAllBytes(page);
response.ContentLength64 = output.Length;
Stream st1 = response.OutputStream;
st1.Write(output, 0, output.Length);
imgtest = true;
context.Response.Close(); // close the response for the image branch as well
}
if (imgtest==false)
{
byte[] buffer = Encoding.UTF8.GetBytes(msg);
response.ContentLength64 = buffer.Length;
Stream st = response.OutputStream;
st.Write(buffer, 0, buffer.Length);
context.Response.Close();
}
}
catch (Exception ex)
{
Console.WriteLine("Error: "+ex);
Console.ReadKey();
}
}
}