Decoding a response in C#

I'm developing an API for testing, and I have a problem when I make a web request, specifically when I retrieve the web response.
I use this code:
string request = HttpPost("http://iunlocker.net/check_imei.php", "ime_i=013270000134001");

public static string HttpPost(string URI, string Parameters)
{
    try
    {
        System.Net.WebRequest req = System.Net.WebRequest.Create(URI);
        req.ContentType = "application/x-www-form-urlencoded";
        req.Method = "POST";
        byte[] bytes = System.Text.Encoding.ASCII.GetBytes(Parameters);
        req.ContentLength = bytes.Length;
        System.IO.Stream os = req.GetRequestStream();
        os.Write(bytes, 0, bytes.Length);
        os.Close();
        System.Net.WebResponse resp = req.GetResponse();
        if (resp == null) return null;
        System.IO.StreamReader sr = new System.IO.StreamReader(resp.GetResponseStream());
        return sr.ReadToEnd().Trim();
    }
    catch (Exception ex) { }
    return null;
}
The website in the call is just an example; with this and with other websites I can't retrieve the result correctly. I receive an exception with "Error 403".
Can anyone help me by telling me what I may be doing wrong?
I thought the problem was with encoding/decoding (in fact, Fiddler asks me if I want to decode before seeing the text), but with another website used for testing I receive the same message from Fiddler and can still retrieve the response without a problem.
Thanks in advance.

An HTTP 403 error means "access forbidden". The destination website is refusing to fulfill your request, for reasons of its own.
Given this particular website http://iunlocker.net/, I'm going to hazard a guess that it may be checking the HTTP_REFERER. In other words, it's refusing to fulfill your request because it knows the request didn't come from a browser that was viewing the form.
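If the server really is checking the referrer, you can try supplying one yourself before sending the POST. This is only a guess at what the site checks, not a confirmed fix; a minimal sketch:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://iunlocker.net/check_imei.php");
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
// Pretend the POST came from the form page; both values below are guesses, not confirmed requirements.
req.Referer = "http://iunlocker.net/";
req.UserAgent = "Mozilla/5.0 (compatible; test client)";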
[EDIT] After viewing the response from
curl --form ime_i=013270000134001 -i http://iunlocker.net/check_imei.php
I can see that the immediate response sets a cookie and issues a redirect.
HTTP/1.1 307 Temporary Redirect
Server: nginx
Date: Wed, 03 Jul 2013 04:00:27 GMT
Content-Type: text/html
Content-Length: 180
Connection: keep-alive
Set-Cookie: PMBC=35e9e4cd3a7f9d50e7f3bb39d43750d1; path=/
Location: http://iunlocker.net/check_imei.php?pmtry=1
<html>
<head><title>307 Temporary Redirect</title></head>
<body bgcolor="white">
<center><h1>307 Temporary Redirect</h1></center>
<hr><center>nginx</center>
</body>
</html>
This site does not want you scraping it; if you wish to defeat this you will have to make use of its cookies.
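If you still want to try, the usual approach is to give the request a CookieContainer and handle the 307 yourself, so the POST body is re-sent to the redirect target along with the cookie. A rough sketch (assuming using System, System.IO, System.Net and System.Text; whether the site ultimately accepts a non-browser client is untested):
var cookies = new CookieContainer();
byte[] body = Encoding.ASCII.GetBytes("ime_i=013270000134001");

// First POST: expect the 307 plus the Set-Cookie header.
var req1 = (HttpWebRequest)WebRequest.Create("http://iunlocker.net/check_imei.php");
req1.Method = "POST";
req1.ContentType = "application/x-www-form-urlencoded";
req1.CookieContainer = cookies;   // captures the PMBC cookie from the response
req1.AllowAutoRedirect = false;   // handle the redirect ourselves so the POST body is re-sent
req1.ContentLength = body.Length;
using (var s = req1.GetRequestStream())
    s.Write(body, 0, body.Length);

string location;
using (var resp1 = (HttpWebResponse)req1.GetResponse())
    location = resp1.Headers["Location"];   // e.g. .../check_imei.php?pmtry=1

// Second POST to the redirect target, now carrying the cookie.
var req2 = (HttpWebRequest)WebRequest.Create(location);
req2.Method = "POST";
req2.ContentType = "application/x-www-form-urlencoded";
req2.CookieContainer = cookies;
req2.ContentLength = body.Length;
using (var s = req2.GetRequestStream())
    s.Write(body, 0, body.Length);

using (var resp2 = (HttpWebResponse)req2.GetResponse())
using (var reader = new StreamReader(resp2.GetResponseStream()))
    Console.WriteLine(reader.ReadToEnd());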

http://en.wikipedia.org/wiki/HTTP_403 - The web server is denying you access to that URL.
Perhaps the IP address you are using is not allowed to access that resource. Check the web server's configuration.

Related

How can I read a client request?

I have created an HttpListener, and I need to read the data when a client sends it to me. The problem is that I don't know how the client should send the data.
HttpListener listener = new HttpListener();
listener.Prefixes.Add("http://192.168.1.26:8282/");
listener.Prefixes.Add("http://localhost:8282/");
listener.Prefixes.Add("http://127.0.0.1:8282/");
listener.Start();
new Thread(() =>
{
    Thread.CurrentThread.IsBackground = true;
    for (;;)
    {
        Console.WriteLine("Listening...");
        // Note: The GetContext method blocks while waiting for a request.
        HttpListenerContext context = listener.GetContext();
        HttpListenerRequest request = context.Request;
        string text;
        using (var reader = new StreamReader(request.InputStream,
                                             request.ContentEncoding))
        {
            text = reader.ReadToEnd();
            MessageBox.Show(text);
        }
        // Obtain a response object.
        HttpListenerResponse response = context.Response;
        // Construct a response.
        string responseString = "HelloWorld";
        byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseString);
        // Get a response stream and write the response to it.
        response.ContentLength64 = buffer.Length;
        System.IO.Stream output = response.OutputStream;
        output.Write(buffer, 0, buffer.Length);
        // You must close the output stream.
        output.Close();
    }
}).Start();
So from the client I send this command:
GET / 192.168.1.26:8282 HTTP/1.0
But I'm getting this message:
Recv 34 bytes
SEND OK
+IPD,1,518:HTTP/1.1 400 Bad Request
Content-Type: text/html; charset=us-ascii
Server: Microsoft-HTTPAPI/2.0
Date: Wed, 13 Jun 2018 13:16:03 GMT
Connection: close
Content-Length: 339
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Bad Request</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Bad Request - Invalid Header</h2>
<hr><p>HTTP Error 400. The request has an invalid header name.</p>
</BODY></HTML>
1,CLOSED
I can't understand what is wrong. Also, in my code I set up a message box to show every time a request happens, but it never runs.
This is what Mozilla is sending.
You are not attempting to invoke the service correctly. Here is your client request:
GET / 192.168.1.26:8282 HTTP/1.0
What you should be doing is first establishing a socket connection to host 192.168.1.26 over port 8282. Then you must issue an HTTP request in a valid format:
GET / HTTP/1.0
Don't forget to add some newlines after the request (i.e. \r\n\r\n). Then your web server should respond to the HTTP request.
Quick example in Telnet:
telnet 192.168.1.26 8282
GET / HTTP/1.0
Quick example with netcat:
nc 192.168.1.26 8282
GET / HTTP/1.0
Note that these quick examples are provided just to help you ensure that your web service is accessible and functioning correctly. Ideally, you would use a more robust HTTP client customized for your particular needs. The process is still the same (see the C# sketch after this list):
1. Establish a connection to your host IP address over the listening port.
2. Issue an HTTP request in a valid format: (HTTP_VERB PATH HTTP_VERSION)
3. Parse the response in some meaningful way.
You may also want to check out the developer tools in your browser of choice (F12 -> Network) to see how HTTP headers are sent.
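As a rough C# sketch of those same steps: the IP, port, and the expected "HelloWorld" reply come from the listener above; everything else is illustrative, not a prescribed client.
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

class RawHttpGet
{
    static void Main()
    {
        // 1. Connect to the host and port the HttpListener is bound to.
        using (var client = new TcpClient("192.168.1.26", 8282))
        using (var stream = client.GetStream())
        {
            // 2. Send a well-formed request line plus the blank line that ends the headers.
            byte[] request = Encoding.ASCII.GetBytes(
                "GET / HTTP/1.0\r\nHost: 192.168.1.26:8282\r\n\r\n");
            stream.Write(request, 0, request.Length);

            // 3. Read the response (status line, headers and the "HelloWorld" body).
            using (var reader = new StreamReader(stream, Encoding.UTF8))
                Console.WriteLine(reader.ReadToEnd());
        }
    }
}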
"Also in my code i set to get a message box every time a request will happen." - You should try putting in a manual message to the message box, instead of reading from the input stream. This is a good debugging technique. In a HTTP GET request you generally are not sending data except in the form of optional query string parameters. I have a feeling that you are not getting the results you are expecting because you are reading from input that isn't there. Before reading from the stream input, first make sure that the connection is successful.

HTTP Upload Corrupts Zip File?

I have a zip file that I can read with DotNetZipLib from the file system. However, when I POST it via a form to my MVC application it can't be read as a stream. My best guess at the moment is that the HTTP upload is somehow corrupting the zip file. There's no shortage of questions with the same problem, and I thought I'd accounted for the stream properly but perhaps I'm not using the .NET object(s) here as intended.
Here's my WebAPI POST handler:
public void Post(HttpRequestMessage request)
{
    using (var fileData = request.Content.ReadAsStreamAsync().Result)
    {
        if (fileData.Length > 0)
        {
            var zip = ZipFile.Read(fileData); // exception
        }
    }
}
The exception, of course, is from the DotNetZipLib ZipFile just saying that the stream can't be read as a zip. If I replace fileData with just a path to the file (this is all being tested on the same machine) then it reads it, so it has to be the HTTP upload.
In FireBug, the headers for the POST are:
Response Headers:
Cache-Control no-cache
Content-Length 1100
Content-Type application/xml; charset=utf-8
Date Sat, 01 Feb 2014 23:18:32 GMT
Expires -1
Pragma no-cache
Server Microsoft-IIS/8.0
X-AspNet-Version 4.0.30319
X-Powered-By ASP.NET
X-SourceFiles =?UTF-8?B?QzpcRGF0YVxDb2RlXE9yZ1BvcnRhbFxPcmdQb3J0YWxTZXJ2ZXJcYXBpXGFwcHg=?=
Request Headers
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding gzip, deflate
Accept-Language en-US,en;q=0.5
Connection keep-alive
Cookie uvts=ENGUn8FXEnEQFeS
Host localhost:48257
Referer http://localhost:48257/Home/Test
User-Agent Mozilla/5.0 (Windows NT 6.3; WOW64; rv:26.0) Gecko/20100101 Firefox/26.0
Request Headers From Upload Stream
Content-Length 31817
Content-Type multipart/form-data; boundary=---------------------------265001916915724
And the form is simple enough:
<form action="/api/appx" method="post" enctype="multipart/form-data">
<input name="postedFile" type="file" />
<input type="submit" />
</form>
Am I doing something wrong with the stream? Pulling data from the HttpRequestMessage incorrectly? Or perhaps I should be receiving the upload in an entirely different way?
When you post a file using an HTML form, the media type is multipart/form-data, which has its own framing syntax (boundaries and per-part headers), as you can see from your Firebug details. You can't just read the request body as a stream and expect it to match the file that was sent. There is a set of ReadAsMultipartAsync extension methods for handling this media type.
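A rough sketch of that approach, assuming an ASP.NET Web API controller (System.Net.Http.Formatting) and the DotNetZip ZipFile class; it is an illustration, not a drop-in replacement for your action:
public async Task<HttpResponseMessage> Post()
{
    if (!Request.Content.IsMimeMultipartContent())
        return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);

    // Buffer each multipart section in memory and unwrap the uploaded file part(s).
    var provider = await Request.Content.ReadAsMultipartAsync(new MultipartMemoryStreamProvider());
    foreach (var part in provider.Contents)
    {
        using (var stream = await part.ReadAsStreamAsync())
        using (var zip = ZipFile.Read(stream))   // now sees only the file bytes, not the multipart framing
        {
            // ... inspect zip.Entries here ...
        }
    }
    return Request.CreateResponse(HttpStatusCode.Created);
}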
The code below worked fine for both zip and text files; you may try this out:
public HttpStatusCode Post(string fileName)
{
    var task = this.Request.Content.ReadAsStreamAsync();
    task.Wait();
    Stream requestStream = task.Result;
    try
    {
        Stream fileStream = File.Create(HttpContext.Current.Server.MapPath("~/" + fileName));
        requestStream.CopyTo(fileStream);
        fileStream.Close();
        requestStream.Close();
    }
    catch (IOException)
    {
        throw new HttpResponseException(HttpStatusCode.InternalServerError);
    }
    HttpResponseMessage response = new HttpResponseMessage();
    response.StatusCode = HttpStatusCode.Created;
    return response.StatusCode;
}

HTTPS POST in C#, Winforms (Stream Writer, HttpWebResponse, HttpWebRequest)

UPDATE: I am trying to POST data to an HTTPS URI. The POST works for HTTP but fails for an HTTPS URI.
Hi, I am creating a C# WinForms exe to post data to a website; the code is below. The issue is that the stream duplicates my POST data.
E.g. suppose I want to post this: username=bob
When I check the traffic, what is actually sent is username=bobusername=bob
It duplicates the data: it appends the same line once more to the end of the buffer and sends it.
I have been going crazy trying to find the issue for two days. Can anybody solve this or give me some hints, please? Thank you.
(Content-Length is correctly set to 12, but it sends 24 bytes after appending the same data once again to the tail of the buffer.)
These are the headers:
POST /login/ HTTP/1.0
Content-Type: application/x-www-form-urlencoded
Host: abc.test.com
Content-Length: 12
username=bobusername=bob
-
This is the code I am currently using
string post_data = "username=bob";
string uri = "https://abc.test.com/login/";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
request.Method = "POST";
byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = postBytes.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(postBytes, 0, postBytes.Length);
MessageBox.Show(postBytes.Length.ToString());
requestStream.Close();
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
StreamReader sr = new StreamReader(response.GetResponseStream());
string tmp = sr.ReadToEnd().Trim();
I put a breakpoint on the line byte[] postBytes = Encoding.ASCII.GetBytes(post_data); and postBytes contains the correct data... but it gets output twice.
Why is this happening? I hope I am clear.
I tried out your code and it seemed to work as expected (it sent an HTTP POST with a 12-byte payload) after I changed the host in the URI to something that was addressable (I used http://adsf.com/login). Here's the trace from Wireshark:
You might try out the URI I used to see what you get; this will at least rule out your computer or code as possible sources of the problem. If the problem disappears when using a different URI, then the problem might be between your network equipment and the web server (reverse-proxy configuration, web server configuration, network switch configuration, etc.).
You can try to get more information by setting up trace configuration as described in this page. When I tried your code, I got the following output:
System.Net Verbose: 0 : [2324] Data from ConnectStream#26756241::Write
System.Net Verbose: 0 : [2324] 00000000 : 75 73 65 72 6E 61 6D 65-3D 62 6F 62 : username=bob
System.Net Verbose: 0 : [2324] Exiting ConnectStream#26756241::Write()
Looks like data is correctly written to the ConnectStream. Something wrong somewhere else?
And don't forget to close the WebResponse object.
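Here is a sketch of the same POST with the request stream and the response wrapped in using blocks so they are always closed; the URI and payload are the ones from the question. This won't by itself explain the duplicated bytes, but it rules out a lingering open response or stream as a contributing factor:
string post_data = "username=bob";
string uri = "https://abc.test.com/login/";

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
request.ContentLength = postBytes.Length;

// Write the body exactly once and dispose the stream before calling GetResponse().
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(postBytes, 0, postBytes.Length);
}

// Dispose the response so the connection is released even if parsing fails.
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader sr = new StreamReader(response.GetResponseStream()))
{
    string tmp = sr.ReadToEnd().Trim();
    Console.WriteLine(tmp);
}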

HttpWebRequest with caching enabled throws exceptions

I'm working on a small C#/WPF application that interfaces with a web service implemented in Ruby on Rails, using handcrafted HttpWebRequest calls and JSON serialization. Without caching, everything works as it's supposed to, and I've got HTTP authentication and compression working as well.
Once I enable caching, by setting request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable);, things go awry - in the production environment. When connecting to a simple WEBrick instance, things work fine, I get HTTP/1.1 304 Not Modified as expected and HttpWebRequest delivers the cached content.
When I try the same against the production server, running nginx/0.8.53 + Phusion Passenger 3.0.0, the application breaks. First request (uncached) is served properly, but on the second request which results in the 304 response, I get a WebException stating that "The request was aborted: The request was canceled." as soon as I invoke request.GetResponse().
I've run the connections through fiddler, which hasn't helped a whole lot; both WEBrick and nginx return an empty entity body, albeit different response headers. Intercepting the request and changing the response headers for nginx to match those of WEBrick didn't change anything, leading me to think that it could be a keep-alive issue; setting request.KeepAlive = false; changes nothing, though - it doesn't break stuff when connecting to WEBrick, and it doesn't fix stuff when connecting to nginx.
For what it's worth, the WebException.InnerException is a NullReferenceException with the following StackTrace:
at System.Net.HttpWebRequest.CheckCacheUpdateOnResponse()
at System.Net.HttpWebRequest.CheckResubmitForCache(Exception& e)
at System.Net.HttpWebRequest.DoSubmitRequestProcessing(Exception& exception)
at System.Net.HttpWebRequest.ProcessResponse()
at System.Net.HttpWebRequest.SetResponse(CoreResponseData coreResponseData)
Headers for the (working) WEBrick connection:
########## request
GET /users/current.json HTTP/1.1
Authorization: Basic *REDACTED*
Content-Type: application/json
Accept: application/json
Accept-Charset: utf-8
Host: testbox.local:3030
If-None-Match: "84a49062768e4ca619b1c081736da20f"
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
########## response
HTTP/1.1 304 Not Modified
X-Ua-Compatible: IE=Edge
Etag: "84a49062768e4ca619b1c081736da20f"
Date: Wed, 01 Dec 2010 18:18:59 GMT
Server: WEBrick/1.3.1 (Ruby/1.8.7/2010-08-16)
X-Runtime: 0.177545
Cache-Control: max-age=0, private, must-revalidate
Set-Cookie: *REDACTED*
Headers for the (exception-throwing) nginx connection:
########## request
GET /users/current.json HTTP/1.1
Authorization: Basic *REDACTED*
Content-Type: application/json
Accept: application/json
Accept-Charset: utf-8
Host: testsystem.local:8080
If-None-Match: "a64560553465e0270cc0a23cc4c33f9f"
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
########## response
HTTP/1.1 304 Not Modified
Connection: keep-alive
Status: 304
X-Powered-By: Phusion Passenger (mod_rails/mod_rack) 3.0.0
ETag: "a64560553465e0270cc0a23cc4c33f9f"
X-UA-Compatible: IE=Edge,chrome=1
X-Runtime: 0.240160
Set-Cookie: *REDACTED*
Cache-Control: max-age=0, private, must-revalidate
Server: nginx/0.8.53 + Phusion Passenger 3.0.0 (mod_rails/mod_rack)
UPDATE:
I tried doing a quick-and-dirty manual ETag cache, but it turns out that's a no-go: I get a WebException when invoking request.GetResponse(), telling me that "The remote server returned an error: (304) Not Modified." - yeah, .NET, I kinda knew that, and I'd like to (attempt to) handle it myself, grr.
UPDATE 2:
Getting closer to the root of the problem. The showstopper seems to be a difference in the response headers for the initial request. WEBrick includes a Date: Wed, 01 Dec 2010 21:30:01 GMT header, which isn't present in the nginx reply. There are other differences as well, but after intercepting the initial nginx reply with Fiddler and adding a Date header, the subsequent HttpWebRequests are able to process the (unmodified) nginx 304 replies.
Going to try to look for a workaround, as well as getting nginx to add the Date header.
UPDATE 3:
It seems that the serverside issue is with Phusion Passenger, they have an open issue about lack of the Date header. I'd still say that HttpWebRequest's behavior is... suboptimal.
UPDATE 4:
Added a Microsoft Connect ticket for the bug.
I think the designers find it reasonable to throw an exception when the "expected behavior", i.e. getting a response body, cannot be completed. You can handle this somewhat intelligently as follows:
catch (WebException ex)
{
    if (ex.Status == WebExceptionStatus.ProtocolError)
    {
        var statusCode = ((HttpWebResponse)ex.Response).StatusCode;
        // Test against the HttpStatusCode enumeration.
    }
    else
    {
        // Do something else, e.g. throw;
    }
}
So, it turns out to be Phusion Passenger (or nginx, depending on how you look at it - and Thin as well) that doesn't add a Date HTTP response header, combined with what I see as a bug in .NET HttpWebRequest (in my situation there's no If-Modified-Since, thus Date shouldn't be necessary) leading to the problem.
The workaround for this particular case was to edit our Rails ApplicationController:
class ApplicationController < ActionController::Base
  # ...other stuff here

  before_filter :add_date_header

  # Bugfix for the .NET HttpWebRequest 304-handling bug and various
  # web servers' laziness in not adding the Date: response header.
  def add_date_header
    response.headers['Date'] = Time.now.to_s
  end
end
UPDATE:
Turns out it's a bit more complex than "just" setting HttpRequestCachePolicy; to repro, I also need to have manually constructed HTTP Basic Auth. So the involved components are the following:
- an HTTP server that doesn't include an HTTP "Date:" response header,
- manual construction of the HTTP Authorization request header,
- use of HttpRequestCachePolicy.
Smallest repro I've been able to come up with:
namespace Repro
{
    using System;
    using System.IO;
    using System.Net;
    using System.Net.Cache;
    using System.Text;

    class ReproProg
    {
        const string requestUrl = "http://drivelog.miracle.local:3030/users/current.json";

        // Manual construction of HTTP basic auth so we don't get an unnecessary server
        // roundtrip telling us to auth, which is what we get if we simply use
        // HttpWebRequest.Credentials.
        private static void SetAuthorization(HttpWebRequest request, string _username, string _password)
        {
            string userAndPass = string.Format("{0}:{1}", _username, _password);
            byte[] authBytes = Encoding.UTF8.GetBytes(userAndPass.ToCharArray());
            request.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
        }

        static public void DoRequest()
        {
            var request = (HttpWebRequest)WebRequest.Create(requestUrl);
            request.Method = "GET";
            request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable);
            SetAuthorization(request, "user#domain.com", "12345678");

            using (var response = request.GetResponse())
            using (var stream = response.GetResponseStream())
            using (var reader = new StreamReader(stream))
            {
                string reply = reader.ReadToEnd();
                Console.WriteLine("########## Server reply: {0}", reply);
            }
        }

        static public void Main(string[] args)
        {
            DoRequest(); // works
            DoRequest(); // explodes
        }
    }
}

C# 403 error because the file contains an inaccessible image? or what?

I'm trying to get a stream from a URL, http://actueel.nl.pwc.com/site/syndicate.jsp, but I get a 403 error. It doesn't require a login. I used Fiddler to check why IE can open it while my code can't. What I found was that two connections were made when opening the link in IE: one succeeded while the other got a 403. The 403 was for a sublink to a GIF image. It seems the XML is a public file, but the image it references is located in an inaccessible folder.
I need to know how to ignore the image so I can still get the rest of the stream. This is my code to test it (by the way, I tried with WebClient and headers too):
try
{
    WebRequest request = WebRequest.Create("http://actueel.nl.pwc.com/site/syndicate.jsp");
    request.Credentials = CredentialCache.DefaultCredentials;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream dataStream = response.GetResponseStream();
    StreamReader reader = new StreamReader(dataStream);
    MessageBox.Show(reader.ReadToEnd());
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}
Thanks for your reactions ;)
I agree with Dmytro. The WebRequest is NOT attempting to download the GIF image referenced in the jsp file; only the contents of the jsp itself are being downloaded. Try looking carefully (in Fiddler) at the IE request compared to yours - not only the URL but also all the request/response headers - and see if anything else is missing, such as cookies or Accept headers.
Using Wireshark and wget, the differences were in the headers only.
The remote server requires User-Agent and Accept headers, e.g.:
WebRequest request = WebRequest.Create("http://actueel.nl.pwc.com/site/syndicate.jsp");
((HttpWebRequest)request).UserAgent = "stackoverflow.com/q/4233673/111013";
((HttpWebRequest) request).Accept = "*/*";
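Putting that together with the original request code, a small sketch (the User-Agent value is just the arbitrary string used in the test above):
WebRequest request = WebRequest.Create("http://actueel.nl.pwc.com/site/syndicate.jsp");
((HttpWebRequest)request).UserAgent = "stackoverflow.com/q/4233673/111013";
((HttpWebRequest)request).Accept = "*/*";

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // The feed itself downloads fine; the inaccessible GIF it references is never requested here.
    string feedXml = reader.ReadToEnd();
    MessageBox.Show(feedXml);
}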
