WebRequest NameResolutionFailure - C#

I'm attempting to write a small screen-scraping tool for statistics aggregation in C#. I'm using the following code (posted many times here, but included again for detail):
public static string GetPage(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)";

    // Dispose of the response, stream and reader even if reading throws
    using (WebResponse response = request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        return reader.ReadToEnd();
    }
}
However, some (not all) of the websites I try to connect to, typically ones that use Ajax or server-side includes, cause the request to throw a NameResolutionFailure exception, and I can't read the data.
An example of this is: pgatour stats.
I am led to believe that the HttpWebRequest class emulates a browser when requesting information, so you get the post-generated HTML. Currently, the only way I can read the data is by writing an iMacro that grabs it from the page source after it has run through the browser. As I said, the page works in the browser, so I don't think the error is a DNS issue, and the website does generate a response (HaveResponse is set).
Has anyone else encountered this issue, and what did you use to resolve it?
Thanks.
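(For reference, one hedged workaround if the failure really is intermittent DNS resolution: treat NameResolutionFailure as transient and retry the request a few times. This is only a sketch of that idea, wrapping the GetPage method above; the retry count and delay are arbitrary.)
// Sketch: retry GetPage when the WebException reports a name-resolution
// failure, on the assumption that the failure is transient.
public static string GetPageWithRetry(string url, int maxAttempts)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return GetPage(url);
        }
        catch (WebException ex)
        {
            if (ex.Status != WebExceptionStatus.NameResolutionFailure || attempt >= maxAttempts)
                throw;
            // Back off briefly before retrying the DNS lookup.
            System.Threading.Thread.Sleep(1000 * attempt);
        }
    }
}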

Related

C# HttpWebRequest Medium Blog returns 403 Forbidden but Site is Open

Medium blog pages open fine in browsers such as Chrome and IE, but I cannot fetch this blog with the code below; it returns 403 Forbidden. By the way, this method was working properly a couple of days ago. I changed my IP address numerous times, thinking they might have banned it, but that did not help.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://medium.com/#coinbaseblog");
request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (System.IO.Stream stream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(stream))
{
    return reader.ReadToEnd();
}
Try this:
var request = (HttpWebRequest)WebRequest.Create("https://medium.com/#coinbaseblog");
request.UserAgent = @"Mozilla/5.0 (compatible; Rigor/1.0.0; http://rigor.com)";
var result = (HttpWebResponse)request.GetResponse();
HTTP status code 403 indicates that the server understood the request but refuses to authorize it; re-authenticating will make no difference. It is similar to status code 401, except that with 401 re-authenticating can succeed. With 403 you may well be authenticated, but the resource you are trying to access is restricted for you.
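When the server does return a 403, the response body sometimes explains why (for example, a bot-detection page). Here is a small hedged sketch of reading that body out of the WebException that HttpWebRequest throws for non-success status codes; request is assumed to be the HttpWebRequest built above:
// Sketch: HttpWebRequest throws a WebException for non-2xx responses,
// but the error response (and its body) is still attached to the exception.
try
{
    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        Console.WriteLine(reader.ReadToEnd());
    }
}
catch (WebException ex)
{
    if (ex.Response == null) throw;
    using (var errorResponse = (HttpWebResponse)ex.Response)
    using (var reader = new StreamReader(errorResponse.GetResponseStream()))
    {
        Console.WriteLine((int)errorResponse.StatusCode); // e.g. 403
        Console.WriteLine(reader.ReadToEnd());            // the server's error page
    }
}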

WCF stops responding after some requests

I have built a WCF service and it is working well.
The issue is that when I call it many times, it returns the following error:
The server encountered an error processing the request. See server
logs for more details
I configured a WCF tracing file, but it always remains empty. What could be the reason for this sudden stop of the service, and how can I fix it?
Here is the code that I run on the client side every 20 seconds:
string url = "http://host/Service.svc/method";
HttpWebRequest webrequest = (HttpWebRequest)WebRequest.Create(url);
webrequest.Method = "GET";
ASCIIEncoding encoding = new ASCIIEncoding();
HttpWebResponse webresponse = (HttpWebResponse)webrequest.GetResponse();
Encoding enc = System.Text.Encoding.GetEncoding("utf-8");
StreamReader loResponseStream =
    new StreamReader(webresponse.GetResponseStream(), enc);
string strResult = loResponseStream.ReadToEnd();
loResponseStream.Close();
webresponse.Close();
I fixed the issue: it was due to open database connections. I had forgotten to close the database connections on the server side. Thank you for the answers.
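For reference, the usual way to guarantee that server-side connections are released is to wrap them in using blocks. A minimal sketch, assuming SQL Server via ADO.NET; the connection string and query are placeholders:
// Sketch: dispose the connection even if the query throws, so the
// connection pool is not exhausted after many requests.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT Value FROM SomeTable", connection))
{
    connection.Open();
    using (var dataReader = command.ExecuteReader())
    {
        while (dataReader.Read())
        {
            // read values here
        }
    }
} // the connection is closed and returned to the pool here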
It could also be a memory issue on the server/host: if less than 5% of memory is available, you get no response.

How to interact with a website without a browser? [closed]

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 10 years ago.
Say I am building a C# application.
The purpose of the application is to:
get a username & password from the user.
show some information present on the website.
In the background, after taking the username and password, it should:
log in to the website with those credentials.
click on the anchor link that appears after logging in.
find the span that holds the info.
get the info.
That was just an example. I am actually building an app to show bandwidth usage information.
The server does not expose any API for this.
Is there any tutorial/info/article available for a similar purpose? I just don't know what to search for.
Basic Introduction To HttpWebRequests
Firstly, you're going to need the right tools for the job. Go and download the Live HTTP Headers plugin for Firefox. This will allow you to view HTTP headers in real time so that you can see the POST data sent when you interact with the website. Once you know what data is sent to the website, you can emulate the process by creating your own HTTP web requests programmatically.
Load Live HTTP Headers by navigating to Tools > Live HTTP Headers. Once you've loaded the GUI, navigate to the website you wish to log in to; I will use Facebook for demonstration purposes. Type in your credentials ready to log in, but before you do, clear the GUI text window and ensure that the check box labeled Capture is ticked. Once you hit login you will see the text window flood with information about the requests, including the POST data you need.
I find it best to click Save All... and then search for your username in the text document so that you can identify the POST data easily. For my request the POST data looked like this:
lsd=AVp-UAbD&display=&legacy_return=1&return_session=0&trynum=1&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84&timezone=0&lgnrnd=214119_mDgc&lgnjs=1356154880&email=%myfacebookemail40outlook.com&pass=myfacebookpassword&default_persistent=0
Which can then be defined in C# like so:
StringBuilder postData = new StringBuilder();
postData.Append("lsd=AVqRGVie&display=");
postData.Append("&legacy_return=1");
postData.Append("&return_session=0");
postData.Append("&trynum=1");
postData.Append("&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84");
postData.Append("&timezone=0");
postData.Append("&lgnrnd=153743_eO6D");
postData.Append("&lgnjs=1355614667");
postData.Append(String.Format("&email={0}", "CUSTOM_EMAIL"));
postData.Append(String.Format("&pass={0}", "CUSTOM_PASSWORD"));
postData.Append("&default_persistent=0");
I'm aiming to show you the relationship between the POST data that we send 'manually' via the web browser and how we can use that data to emulate the request in C#. Understand that sending POST data is far from a one-size-fits-all process: different websites work in different ways and can throw all kinds of things your way. Below is a function I put together to validate that a set of Facebook credentials is correct. I can't and shouldn't go into extraordinary depth here, as the classes and their members are well documented. You can find better information than I can offer about the methods used on MSDN, for example the WebRequest.Method property.
private bool ValidateFacebookCredentials(string email, string password)
{
    CookieContainer cookies = new CookieContainer();
    HttpWebRequest request = null;
    HttpWebResponse response = null;
    string returnData = string.Empty;

    // Need to retrieve cookies first
    request = (HttpWebRequest)WebRequest.Create(new Uri("https://www.facebook.com/login.php?login_attempt=1"));
    request.Method = "GET";
    request.CookieContainer = cookies;
    response = (HttpWebResponse)request.GetResponse();

    // Set up the request
    request = (HttpWebRequest)WebRequest.Create(new Uri("https://www.facebook.com/login.php?login_attempt=1"));
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13";
    request.Referer = "https://www.facebook.com/login.php?login_attempt=1";
    request.AllowAutoRedirect = true;
    request.KeepAlive = true;
    request.CookieContainer = cookies;

    // Format the POST data
    StringBuilder postData = new StringBuilder();
    postData.Append("lsd=AVqRGVie&display=");
    postData.Append("&legacy_return=1");
    postData.Append("&return_session=0");
    postData.Append("&trynum=1");
    postData.Append("&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84");
    postData.Append("&timezone=0");
    postData.Append("&lgnrnd=153743_eO6D");
    postData.Append("&lgnjs=1355614667");
    postData.Append(String.Format("&email={0}", email));
    postData.Append(String.Format("&pass={0}", password));
    postData.Append("&default_persistent=0");

    // Write the POST data to the request stream
    using (StreamWriter writer = new StreamWriter(request.GetRequestStream()))
        writer.Write(postData.ToString());

    response = (HttpWebResponse)request.GetResponse();

    // Read the web page (HTML) that we retrieve after sending the request
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        returnData = reader.ReadToEnd();

    return !returnData.Contains("Please re-enter your password");
}
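A quick usage sketch (the credentials are placeholders, and Facebook's actual login flow has long since changed, so treat this purely as an illustration of the pattern):
// Hypothetical call site; the credentials are placeholders.
bool valid = ValidateFacebookCredentials("user@example.com", "secret");
Console.WriteLine(valid ? "Credentials accepted" : "Credentials rejected");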
Sample Code on Grabbing Contents (Screen Scraping)
Uri uri = new Uri("http://www.microsoft.com/default.aspx");
if (uri.Scheme == Uri.UriSchemeHttp)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = WebRequestMethods.Http.Get;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    StreamReader reader = new StreamReader(response.GetResponseStream());
    string tmp = reader.ReadToEnd();
    response.Close();
    Response.Write(tmp);
}
Sample Code on how to Post Data to remote Web Page using HttpWebRequest
Uri uri = new Uri("http://www.amazon.com/exec/obidos/search-handle-form/102-5194535-6807312");
string data = "field-keywords=ASP.NET 2.0";
if (uri.Scheme == Uri.UriSchemeHttp)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = WebRequestMethods.Http.Post;
    request.ContentLength = data.Length;
    request.ContentType = "application/x-www-form-urlencoded";
    StreamWriter writer = new StreamWriter(request.GetRequestStream());
    writer.Write(data);
    writer.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    StreamReader reader = new StreamReader(response.GetResponseStream());
    string tmp = reader.ReadToEnd();
    response.Close();
    Response.Write(tmp);
}
Source
Any HTTP client implementation will do; there are tons of open-source libraries for that. Look at curl, for example; someone has made a .NET wrapper for it.
You can continue using WebClient to POST (instead of GET, which is the HTTP verb you're currently using with DownloadString), but I think you'll find it easier to work with the (slightly) lower-level classes WebRequest and WebResponse.
There are two parts to this: the first is to post the login form; the second is to recover the "Set-Cookie" header and send it back to the server as "Cookie" along with your GET request. The server will use this cookie to identify you from now on (assuming it uses cookie-based authentication, which I'm fairly confident it does, as that page returns a Set-Cookie header that includes "PHPSESSID").
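HttpWebRequest can also handle the cookie round-trip for you via a CookieContainer, so you rarely need to copy the Set-Cookie header by hand. A hedged sketch of that pattern follows; the URLs and form field names are placeholders for whatever the login page actually expects (requires System.Net, System.IO and System.Text):
// Sketch: share one CookieContainer between the login POST and later GETs
// so the PHPSESSID (or similar) session cookie is sent back automatically.
CookieContainer cookies = new CookieContainer();

// 1) POST the login form (URL and field names are hypothetical).
HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://example.com/login.php");
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
login.CookieContainer = cookies;
byte[] body = Encoding.ASCII.GetBytes("username=me&password=secret");
using (Stream requestStream = login.GetRequestStream())
    requestStream.Write(body, 0, body.Length);
((HttpWebResponse)login.GetResponse()).Close();

// 2) GET the protected page; the session cookie is attached automatically.
HttpWebRequest page = (HttpWebRequest)WebRequest.Create("http://example.com/usage");
page.CookieContainer = cookies;
using (HttpWebResponse response = (HttpWebResponse)page.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
}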

Uploading image through http form with fields

I currently have a program that can post to my site, but I need to add file upload support (for JPGs). I had enough trouble just getting the program to post, and now after many hours I can't get it to work with file upload.
My code is not very good, but it works and I'm the only one using it.
Here is my code, a bit stripped down, but I hope you get the idea of what I need help with.
Thanks!
("upfile" is the input field name for images)
public static string Post(string url, int id, string message)
{
    try
    {
        string param =
            string.Format(
                "MAX_FILE_SIZE=2097152&id={0}&com={1}&upfile", id, message);
        byte[] bytes = Encoding.ASCII.GetBytes(param);

        var post = (HttpWebRequest)WebRequest.Create(url);
        post.ProtocolVersion = HttpVersion.Version10;
        post.Method = "POST";
        post.AllowAutoRedirect = false;
        post.ContentType = "application/x-www-form-urlencoded";
        post.ContentLength = bytes.Length;
        post.UserAgent =
            "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.15) Gecko/2009101601 Firefox/3.0.15 (.NET CLR 3.5.30729)";
        post.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";

        Stream requestStream = post.GetRequestStream();
        requestStream.Write(bytes, 0, bytes.Length);
        requestStream.Close();

        var webResponse = (HttpWebResponse)post.GetResponse();
        var sr = new StreamReader(webResponse.GetResponseStream());
        return sr.ReadToEnd();
    }
    catch (Exception exception)
    {
        return null;
    }
}
Edit:
Sorry for my poor explanation; here is what I need help with:
I currently use the method above to post to my site, but I need to add support for uploading files.
I can't change how the web page works in the near future, so it has to be done with a POST request.
I can't get it to work when I try to do it myself, so I'm wondering if anybody could help me with what needs to be added to read an image and post it with the request!
Thanks!
You need to use multipart/form-data encoding instead of application/x-www-form-urlencoded, and then you need to encode the POST data accordingly.
See Upload files with HTTPWebrequest (multipart/form-data)
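To give an idea of what that encoding looks like on the wire, here is a hedged sketch of a multipart/form-data POST built by hand with HttpWebRequest. The field names (id, com, upfile) match the question; the boundary value is arbitrary, error handling is omitted, and a JPEG upload is assumed (requires System, System.IO, System.Net and System.Text):
// Sketch: hand-built multipart/form-data POST with two text parts
// ("id", "com") and one file part ("upfile"). The boundary is arbitrary,
// but it must match the one declared in the Content-Type header.
public static string PostWithFile(string url, int id, string message, string filePath)
{
    string boundary = "----boundary" + DateTime.Now.Ticks.ToString("x");
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "POST";
    request.ContentType = "multipart/form-data; boundary=" + boundary;

    using (Stream stream = request.GetRequestStream())
    {
        Action<string> writeText = s =>
        {
            byte[] b = Encoding.ASCII.GetBytes(s);
            stream.Write(b, 0, b.Length);
        };

        // Text fields
        writeText("--" + boundary + "\r\nContent-Disposition: form-data; name=\"id\"\r\n\r\n" + id + "\r\n");
        writeText("--" + boundary + "\r\nContent-Disposition: form-data; name=\"com\"\r\n\r\n" + message + "\r\n");

        // File field (Content-Type assumed to be image/jpeg)
        writeText("--" + boundary + "\r\nContent-Disposition: form-data; name=\"upfile\"; filename=\""
                  + Path.GetFileName(filePath) + "\"\r\nContent-Type: image/jpeg\r\n\r\n");
        byte[] fileBytes = File.ReadAllBytes(filePath);
        stream.Write(fileBytes, 0, fileBytes.Length);

        // Closing boundary
        writeText("\r\n--" + boundary + "--\r\n");
    }

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}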
I'm assuming you need a better solution than your current one.
If you are using C# with ASP.NET, there is a file upload control that you can use to upload files.
You can even restrict it to the types of files that you allow.
The control is FileUpload.
Shouldn't your content type be multipart/form-data?
If it is possible for you to use an upload class, I can make one available to you.

C# 403 error because the file contains an inaccessible image? or what?

I'm trying to get a stream from a URL: http://actueel.nl.pwc.com/site/syndicate.jsp, but I get a 403 error. It doesn't require a login. I used Fiddler to check why IE can open it while my code can't. What I found was that two connections are made when opening the link in IE: one succeeded while the other got a 403. The 403 was for a sub-link to a GIF image. It seems the XML is a public file, but the image it references is located in an inaccessible folder.
I need to know how to ignore the image so I can still get the rest of the stream. This is my code to test it (by the way, I tried WebClient too, and with headers):
try
{
    WebRequest request = WebRequest.Create("http://actueel.nl.pwc.com/site/syndicate.jsp");
    request.Credentials = CredentialCache.DefaultCredentials;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream dataStream = response.GetResponseStream();
    StreamReader reader = new StreamReader(dataStream);
    MessageBox.Show(reader.ReadToEnd());
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}
Thanks for your reactions ;)
I agree with Dmytro. The WebRequest is NOT attempting to download the GIF image referenced in the JSP file; only the contents of the JSP itself are being downloaded. Try looking carefully (in Fiddler) at the IE request compared to yours, not only the URL but also all the request/response headers, and see if anything else is missing, such as cookies or Accept headers.
Using Wireshark and wget, the differences were in the headers only.
The remote server requires a User-Agent header and an Accept header.
For example:
WebRequest request = WebRequest.Create("http://actueel.nl.pwc.com/site/syndicate.jsp");
((HttpWebRequest)request).UserAgent = "stackoverflow.com/q/4233673/111013";
((HttpWebRequest) request).Accept = "*/*";
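With those two headers set, the rest of the request can proceed as in the question's own code; a brief continuation for completeness:
// Continue as before once the headers are set.
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    MessageBox.Show(reader.ReadToEnd());
}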
