Hello, I am trying to get the entire total page size (images, scripts, etc.) with HttpWebRequest. I need to calculate the total traffic of a web page when a user visits it via a web browser. The Content-Length header gives me the length of the content in bytes, but that is only the document length. I also need the traffic from images and scripts.
This is my code so far:
NetworkCredential cred = new NetworkCredential("username", "password");
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Url);
request.Proxy = new System.Net.WebProxy("proxy", true);
request.Proxy.Credentials = cred;
request.UserAgent = @"Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5";
using (HttpWebResponse requestresponse = (HttpWebResponse)request.GetResponse())
{
    Headers = requestresponse.Headers;
    Url = requestresponse.ResponseUri;
    int ContentLength;
    if (int.TryParse(requestresponse.Headers.Get("Content-Length"), out ContentLength))
    {
        // This is one way I get the content length
        WebPageSize = ContentLength;
    }
    // This is another way I get the content length
    WebPageSize = requestresponse.ContentLength;
    return ProcessContent(requestresponse);
}
Also, the server is not guaranteed to send the Content-Length header back at all.
Any suggestions?
If you just need to check the size once, why don't you try using the Net panel of Firebug (a Firefox extension) or the equivalent tool in another browser? It will tell you the size of all the requests performed while loading a webpage.
If you are the one doing the measurement and can guarantee there will be no other network traffic, you can try the following:
Get TotalBytesReceived from IPv4Statistics
Open the page in WebBrowserControl and wait until all resources are loaded.
Get the total bytes again.
From those two numbers, you can calculate the total size.
Here is an article about working with IPv4Statistics:
http://www.m0interactive.com/archives/2008/02/06/how_to_calculate_network_bandwidth_speed_in_c_/
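The steps above can be sketched roughly as follows. This is only a sketch of the byte-counting idea, not a complete solution: it assumes no other network traffic is happening, and the `WebBrowser` loading step is left as a placeholder comment since it depends on your UI framework.

```csharp
using System;
using System.Net.NetworkInformation;

class TrafficMeasurer
{
    // Sum BytesReceived across all interfaces that are currently up.
    public static long TotalBytesReceived()
    {
        long total = 0;
        foreach (NetworkInterface nic in NetworkInterface.GetAllNetworkInterfaces())
        {
            if (nic.OperationalStatus == OperationalStatus.Up)
                total += nic.GetIPv4Statistics().BytesReceived;
        }
        return total;
    }

    static void Main()
    {
        long before = TotalBytesReceived();

        // ... load the page in a WebBrowser control here and wait for the
        //     DocumentCompleted event before taking the second reading ...

        long after = TotalBytesReceived();
        Console.WriteLine("Approximate page traffic: {0} bytes", after - before);
    }
}
```

Because the counters are machine-wide, any background traffic (updates, other tabs, etc.) will be included in the difference, so the result is an approximation at best.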
Related
I am working on getting information that is behind a log in page, and using this as my starting point.
Looking at the Network tab, I looked at the form data and saw there were 3 additional values than just client/password (csrf, time, hash).
I attempted to log into the site as follows.
string formUrl = "mysite_loginaction";
string formParams = string.Format("client_id={0}&password={1}", "client", "password");
string cookieHeader;
WebRequest req = WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
When I print out the resp to my console, it shows me the login page, when I was expecting the next page after login (the Google 2FA page).
Do I need to post the csrf, time, and hash values as well to get a successful login?
Like it has been mentioned in your link, there is a concept of a session id token. If you want to stay logged in, you need to pass that token every time on the following HTTP requests.
Also, the CSRF token will be different each time you make the request, but you do need to pass it along with your next request to be successful.
To know more about CSRF, I should redirect you to this link
You're going to have to mess around with it. Most of the time you don't need all the headers, but I would assume that hash is required.
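The flow described above can be sketched like this: GET the login page with a CookieContainer (so the session cookie is kept), pull the hidden csrf field out of the HTML, then POST it back together with the credentials. The URLs, the field names (`csrf` in particular), and the regex are assumptions; match them against the actual form markup on your site, and add the time/hash fields the same way if the server requires them.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Text.RegularExpressions;

class LoginWithCsrf
{
    // Pull the value out of a hidden input such as <input name="csrf" value="...">.
    // Assumes name comes before value in the tag; adjust to the real markup.
    public static string ExtractHiddenField(string html, string name)
    {
        Match m = Regex.Match(html,
            "name=\"" + Regex.Escape(name) + "\"[^>]*value=\"([^\"]*)\"");
        return m.Success ? m.Groups[1].Value : null;
    }

    static void Main()
    {
        CookieContainer cookies = new CookieContainer();

        // 1) GET the login page so we receive the session cookie and the tokens.
        HttpWebRequest get = (HttpWebRequest)WebRequest.Create("https://example.com/login");
        get.CookieContainer = cookies;
        string loginHtml;
        using (var resp = (HttpWebResponse)get.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            loginHtml = reader.ReadToEnd();

        string csrf = ExtractHiddenField(loginHtml, "csrf");

        // 2) POST the credentials plus the token, reusing the same cookies.
        string formParams = string.Format(
            "client_id={0}&password={1}&csrf={2}",
            Uri.EscapeDataString("client"),
            Uri.EscapeDataString("password"),
            Uri.EscapeDataString(csrf ?? ""));

        HttpWebRequest post = (HttpWebRequest)WebRequest.Create("https://example.com/loginaction");
        post.Method = "POST";
        post.ContentType = "application/x-www-form-urlencoded";
        post.CookieContainer = cookies;   // same container carries the session id

        byte[] bytes = Encoding.ASCII.GetBytes(formParams);
        post.ContentLength = bytes.Length;
        using (Stream os = post.GetRequestStream())
            os.Write(bytes, 0, bytes.Length);

        using (var resp = (HttpWebResponse)post.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}
```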
Say I am building a C# application.
The purpose of the application is to:
get the username & password from the user.
and show some information present on the website.
In the background, after taking the username and password, it should:
log in to a website with those credentials.
click on the anchor link that appears after logging in.
find the span that holds the info.
get the info.
That was an example. I am actually building an app to show bandwidth usage information.
The server does not expose any API for that.
Is there any tutorial/info/article available for a similar purpose? I just don't know what to search for.
Basic Introduction To HttpWebRequests
Firstly, you're going to need the right tools for the job. Go and download the Live HTTP Headers plugin for Firefox. This will allow you to view HTTP headers in real time so you can see the POST data that is sent when you interact with the website. Once you know the data that is sent to the website, you can emulate the process by creating your own HTTP web requests programmatically.
Load Live HTTP Headers by navigating to Tools > Live HTTP Headers. Once you've loaded the GUI, navigate to the website you wish to log in to; I will use Facebook for demonstration purposes. Type in your credentials ready to log in, but before you do, clear the GUI text window and ensure that the check box labeled Capture is checked. Once you hit login you will see the text window flood with various information about the requests, including the POST data you need.
I find it best to click Save All... and then search for your username in the text document so that you can identify the POST data easily. For my request the POST data looked like this:
lsd=AVp-UAbD&display=&legacy_return=1&return_session=0&trynum=1&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84&timezone=0&lgnrnd=214119_mDgc&lgnjs=1356154880&email=myfacebookemail%40outlook.com&pass=myfacebookpassword&default_persistent=0
Which can then be defined in C# like so:
StringBuilder postData = new StringBuilder();
postData.Append("lsd=AVqRGVie&display=");
postData.Append("&legacy_return=1");
postData.Append("&return_session=0");
postData.Append("&trynum=1");
postData.Append("&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84");
postData.Append("&timezone=0");
postData.Append("&lgnrnd=153743_eO6D");
postData.Append("&lgnjs=1355614667");
postData.Append(String.Format("&email={0}", "CUSTOM_EMAIL"));
postData.Append(String.Format("&pass={0}", "CUSTOM_PASSWORD"));
postData.Append("&default_persistent=0");
I'm aiming to show you the relation between the POST data that we can send 'manually' via the web browser and how we can use that data to emulate the request in C#. Understand that sending POST data is far from deterministic: different websites work in different ways and can throw all kinds of things your way. Below is a function I put together to validate that Facebook credentials are correct. I can't and shouldn't go into extraordinary depth here, as the classes and their members are well documented. You can find better information than I can offer at MSDN, for example, the WebRequest.Method Property.
private bool ValidateFacebookCredentials(string email, string password)
{
CookieContainer cookies = new CookieContainer();
HttpWebRequest request = null;
HttpWebResponse response = null;
string returnData = string.Empty;
//Need to retrieve cookies first
request = (HttpWebRequest)WebRequest.Create(new Uri("https://www.facebook.com/login.php?login_attempt=1"));
request.Method = "GET";
request.CookieContainer = cookies;
response = (HttpWebResponse)request.GetResponse();
response.Close();
//Set up the request
request = (HttpWebRequest)WebRequest.Create(new Uri("https://www.facebook.com/login.php?login_attempt=1"));
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13";
request.Referer = "https://www.facebook.com/login.php?login_attempt=1";
request.AllowAutoRedirect = true;
request.KeepAlive = true;
request.CookieContainer = cookies;
//Format the POST data
StringBuilder postData = new StringBuilder();
postData.Append("lsd=AVqRGVie&display=");
postData.Append("&legacy_return=1");
postData.Append("&return_session=0");
postData.Append("&trynum=1");
postData.Append("&charset_test=%E2%82%AC%2C%C2%B4%2C%E2%82%AC%2C%C2%B4%2C%E6%B0%B4%2C%D0%94%2C%D0%84");
postData.Append("&timezone=0");
postData.Append("&lgnrnd=153743_eO6D");
postData.Append("&lgnjs=1355614667");
postData.Append(String.Format("&email={0}", email));
postData.Append(String.Format("&pass={0}", password));
postData.Append("&default_persistent=0");
//write the POST data to the stream
using(StreamWriter writer = new StreamWriter(request.GetRequestStream()))
writer.Write(postData.ToString());
response = (HttpWebResponse)request.GetResponse();
//Read the web page (HTML) that we retrieve after sending the request
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
returnData = reader.ReadToEnd();
return !returnData.Contains("Please re-enter your password");
}
Sample Code on Grabbing Contents (Screen Scraping)
Uri uri = new Uri("http://www.microsoft.com/default.aspx");
if (uri.Scheme == Uri.UriSchemeHttp)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = WebRequestMethods.Http.Get;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    StreamReader reader = new StreamReader(response.GetResponseStream());
    string tmp = reader.ReadToEnd();
    response.Close();
    Response.Write(tmp);
}
Sample Code on how to Post Data to remote Web Page using HttpWebRequest
Uri uri = new Uri("http://www.amazon.com/exec/obidos/search-handle-form/102-5194535-6807312");
string data = "field-keywords=ASP.NET 2.0";
if (uri.Scheme == Uri.UriSchemeHttp)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = WebRequestMethods.Http.Post;
    // ContentLength must be the byte count, not the character count
    byte[] bytes = Encoding.ASCII.GetBytes(data);
    request.ContentLength = bytes.Length;
    request.ContentType = "application/x-www-form-urlencoded";
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(bytes, 0, bytes.Length);
    requestStream.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    StreamReader reader = new StreamReader(response.GetResponseStream());
    string tmp = reader.ReadToEnd();
    response.Close();
    Response.Write(tmp);
}
Source
Any HTTP client implementation will do; there are tons of open-source libraries for that. Look at curl, for example; someone has made a .NET wrapper for it.
You can continue using WebClient to POST (instead of GET, which is the HTTP verb you're currently using with DownloadString), but I think you'll find it easier to work with the (slightly) lower-level classes WebRequest and WebResponse.
There are two parts to this - the first is to post the login form, the second is recovering the "Set-cookie" header and sending that back to the server as "Cookie" along with your GET request. The server will use this cookie to identify you from now on (assuming it's using cookie-based authentication which I'm fairly confident it is as that page returns a Set-cookie header which includes "PHPSESSID").
Click Here to Check in Detail
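The two parts described above can be sketched like this (the URLs and form field names are placeholders): POST the login form, capture the Set-Cookie header, strip its attributes, and send the name=value pair back as the Cookie header on the follow-up GET. Assigning the same CookieContainer to both requests would do all of this automatically; the manual version is shown to make the mechanism visible.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class CookieRoundTrip
{
    // Set-Cookie arrives as "PHPSESSID=abc; path=/; ..."; only the
    // name=value pair before the first ';' should be echoed back.
    public static string CookiePair(string setCookie)
    {
        int i = setCookie.IndexOf(';');
        return i >= 0 ? setCookie.Substring(0, i) : setCookie;
    }

    static void Main()
    {
        // 1) POST the login form and capture the session cookie.
        byte[] form = Encoding.ASCII.GetBytes("username=user&password=pass");
        HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://example.com/login.php");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        login.ContentLength = form.Length;
        using (Stream s = login.GetRequestStream())
            s.Write(form, 0, form.Length);

        string cookieHeader;
        using (WebResponse resp = login.GetResponse())
            cookieHeader = resp.Headers["Set-Cookie"];

        // 2) Send the cookie back with the authenticated GET.
        HttpWebRequest page = (HttpWebRequest)WebRequest.Create("http://example.com/members.php");
        page.Headers["Cookie"] = CookiePair(cookieHeader);
        using (WebResponse resp = page.GetResponse())
        using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}
```

Note that if the server sends several Set-Cookie headers, this simple version only handles the first; the CookieContainer approach copes with multiple cookies for free.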
I have a program today that I can use to post to my site, but I need to add file upload support (for JPG images). I had such a problem just getting my program to post, and now after many hours I can't get it to work with file upload.
Today my code is not very good, but it works and I'm the only one using it.
Here is my code, a bit stripped down, but I hope you get the idea of what I need help with.
Thanks!
("upfile" is the input field name for images)
public static string Post(string url, int id, string message)
{
try
{
string param =
string.Format(
"MAX_FILE_SIZE=2097152&id={0}&com={1}&upfile", id, message);
byte[] bytes = Encoding.ASCII.GetBytes(param);
var post = (HttpWebRequest)WebRequest.Create(url);
post.ProtocolVersion = HttpVersion.Version10;
post.Method = "POST";
post.AllowAutoRedirect = false;
post.ContentType = "application/x-www-form-urlencoded";
post.ContentLength = bytes.Length;
post.UserAgent =
"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.15) Gecko/2009101601 Firefox/3.0.15 (.NET CLR 3.5.30729)";
post.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
Stream requestStream = post.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();
var webResponse = (HttpWebResponse)post.GetResponse();
var sr = new StreamReader(webResponse.GetResponseStream());
return sr.ReadToEnd();
}
catch (Exception exception)
{
return null;
}
}
Edit:
Sorry for my poor explanation; here is what I need help with:
I use the method above today to post to my site, but I need to add support for uploading files in my method.
I can't change how the webpage works in the near future, so it has to be done using a POST request.
I can't get it to work when I try to do it myself, so I'm wondering if anybody could help me with what needs to be added to read an image and post it with the request!
Thanks!
You need to use the multipart/form-data encoding instead of application/x-www-form-urlencoded, and then you need to encode the POST data accordingly.
See Upload files with HTTPWebrequest (multipart/form-data)
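As a sketch of what a multipart/form-data version of the Post method above could look like: each form field (including the original MAX_FILE_SIZE, id, and com fields) and the file become separate parts divided by a boundary string, and the body ends with the closing boundary. The image Content-Type and the boundary format are reasonable choices, not requirements of the target site; verify them against what the browser actually sends.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class MultipartPost
{
    // Build the multipart/form-data body: text fields and the file each get
    // their own part, and the body ends with "--boundary--".
    public static byte[] BuildBody(string boundary, string id, string message,
                                   string fileName, byte[] fileBytes)
    {
        var sb = new StringBuilder();
        sb.AppendFormat("--{0}\r\nContent-Disposition: form-data; name=\"MAX_FILE_SIZE\"\r\n\r\n2097152\r\n", boundary);
        sb.AppendFormat("--{0}\r\nContent-Disposition: form-data; name=\"id\"\r\n\r\n{1}\r\n", boundary, id);
        sb.AppendFormat("--{0}\r\nContent-Disposition: form-data; name=\"com\"\r\n\r\n{1}\r\n", boundary, message);
        sb.AppendFormat("--{0}\r\nContent-Disposition: form-data; name=\"upfile\"; filename=\"{1}\"\r\n", boundary, fileName);
        sb.Append("Content-Type: image/jpeg\r\n\r\n");

        byte[] head = Encoding.ASCII.GetBytes(sb.ToString());
        byte[] tail = Encoding.ASCII.GetBytes("\r\n--" + boundary + "--\r\n");

        byte[] body = new byte[head.Length + fileBytes.Length + tail.Length];
        Buffer.BlockCopy(head, 0, body, 0, head.Length);
        Buffer.BlockCopy(fileBytes, 0, body, head.Length, fileBytes.Length);
        Buffer.BlockCopy(tail, 0, body, head.Length + fileBytes.Length, tail.Length);
        return body;
    }

    public static string Post(string url, int id, string message, string imagePath)
    {
        string boundary = "----boundary" + DateTime.Now.Ticks.ToString("x");
        byte[] body = BuildBody(boundary, id.ToString(), message,
                                Path.GetFileName(imagePath), File.ReadAllBytes(imagePath));

        var post = (HttpWebRequest)WebRequest.Create(url);
        post.Method = "POST";
        post.ContentType = "multipart/form-data; boundary=" + boundary;
        post.ContentLength = body.Length;
        using (Stream s = post.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var resp = (HttpWebResponse)post.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            return reader.ReadToEnd();
    }
}
```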
I'm assuming you need a better solution than your current one.
If you are on C#, there is a file upload control that you can use to upload files.
You can even restrict it to the types of valid files that you allow.
The control is FileUpload.
Shouldn't your content type be multipart/form-data?
If it is possible for you to use an upload class, I can make one available to you.
I'm attempting to write a small screen-scraping tool for statistics aggregation in C#. I have attempted to use this code (posted many times here, but again for detail):
public static string GetPage(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.UserAgent = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)";
    using (WebResponse response = request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}
However, some (not all) websites I attempt to connect to that use Ajax or server-side includes throw NameResolutionFailure exceptions, and I cannot read the data.
An example of this is: pgatour stats
I am led to believe the HttpWebRequest class emulates a browser when requesting information, so you get the post-generated HTML. Currently, the only way I can read the data is with an iMacro that grabs it from the page source after it runs through the browser. As said before, it works in the browser, so I don't think the error is related to a DNS issue, and the website does generate a response (HaveResponse is set).
Has anyone else encountered this issue, and what did you use to resolve it?
Thanks.
My app currently uses OAuth to communicate with the Twitter API. Back in December, Twitter upped the rate limit for OAuth to 350 requests per hour. However, I am not seeing this. I am still getting 150 from the account/rate_limit_status method.
I was told that I needed to use the X-RateLimit-Limit HTTP header to get the new rate limit. However, in my code, I do not see that header.
Here is my code...
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(newURL);
request.Method = "GET";
request.ServicePoint.Expect100Continue = false;
request.ContentType = "application/x-www-form-urlencoded";
using (WebResponse response = request.GetResponse())
{
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
responseString = reader.ReadToEnd();
}
}
If I inspect the response, I can see that it has a Headers property and that there are 16 headers. However, I do not see X-RateLimit-Limit in the list.
Any idea what I am doing wrong?
You should simply be able to use:
using (WebResponse response = request.GetResponse())
{
string limit = response.Headers["X-RateLimit-Limit"];
...
}
If that doesn't work as expected, you can do a watch on response.Headers and see what's in there.
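To see exactly what the server sent without a debugger, you can also enumerate the whole header collection. A minimal sketch (the URL is the rate-limit endpoint from the question; any response works the same way):

```csharp
using System;
using System.Net;

class DumpHeaders
{
    static void Main()
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
            "http://twitter.com/account/rate_limit_status.xml");
        using (WebResponse response = request.GetResponse())
        {
            // Print every header the server actually returned,
            // so a missing X-RateLimit-Limit is immediately visible.
            for (int i = 0; i < response.Headers.Count; i++)
                Console.WriteLine("{0}: {1}", response.Headers.GetKey(i), response.Headers[i]);
        }
    }
}
```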
Look at the raw response text (e.g., with Fiddler). If the header isn't there, no amount of C# code is going to make it appear. :) From what you've shown, it seems the header isn't in the response.
Update: When I go to: http://twitter.com/account/rate_limit_status.xml there is no X-RateLimit-Limit header. But when I go to http://twitter.com/statuses/public_timeline.xml, it's there. So I think you just need to use a different method.
It still says 150, though!