I am trying to view a report using someone else's credentials. I am stuck on how to assign the credentials and then navigate to the URL.
Question:
Is it possible to assign credentials to a WebRequest and then navigate to the URL? Something like this, maybe:
WebRequest request = WebRequest.Create(repUrl);
request.Credentials = new NetworkCredential("UserName", "Password");
webBrowser1.Navigate(request.RequestUri);
I know I am able to see the report when I use something like this, but then I cannot click anything inside the report:
WebRequest request = WebRequest.Create(repUrl);
request.Credentials = new NetworkCredential("Username", "Password");
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
StreamReader receiveStream = new StreamReader(response.GetResponseStream());
webBrowser1.ScriptErrorsSuppressed = true;
webBrowser1.DocumentStream = receiveStream.BaseStream;
Please help me with this. I have been stuck on this problem for too long now. Thank you.
Setting credentials like below
request.Credentials = new NetworkCredential("UserName", "Password");
would only work if the page accepts credentials via HTTP authentication (basic authentication, for example), which in my experience is usually not the case.
If you want to use WebRequest, you would need to supply the right headers and cookies, which also means you need a cookie container; how to get those cookies in the first place is another story.
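For illustration, here is a rough sketch of what that would look like; the cookie name and value are placeholders that you would have to obtain from a real login first:
// Sketch only: the cookie name/value are assumptions, normally captured from a login response
var cookies = new CookieContainer();
cookies.Add(new Uri(repUrl), new Cookie("ASP.NET_SessionId", "value-from-login"));
var request = (HttpWebRequest)WebRequest.Create(repUrl);
request.CookieContainer = cookies;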
Since you are using the WebBrowser control, the easiest way I can suggest is to navigate to the login page first, fill in the credentials and click the submit button programmatically, then navigate to the page you want:
private void webbrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    // Check whether we landed on the login page
    if (webbrowser1.Url.AbsolutePath.EndsWith("login.aspx"))
    {
        var doc = webbrowser1.Document;
        doc.GetElementById("email").SetAttribute("value", email);
        doc.GetElementById("password").SetAttribute("value", password);
        doc.GetElementsByTagName("input").OfType<HtmlElement>()
           .FirstOrDefault(x => x.GetAttribute("type") == "submit")
           .InvokeMember("click");
    }
    else if (webbrowser1.Url.ToString() != repUrl)
    {
        // Logged in now; go to the report (the guard prevents re-navigating forever)
        webbrowser1.Navigate(repUrl);
    }
}
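For completeness, a minimal sketch of wiring this up; the handler is the one above, and the start URL is an assumption:
webbrowser1.DocumentCompleted += webbrowser1_DocumentCompleted;
webbrowser1.Navigate("http://yourserver/login.aspx"); // assumed login URL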
In the code above, I have just assumed that the login page is login.aspx and that the email and password fields have the ids email and password; of course, you have to adjust how you fill these inputs using their actual ids, or their names if they don't have ids, and so on.
I'm trying to log in to a website and download some pages as a logged-in user would see them, in C#.
I have a class with functions that send a POST to the login page to log in. I have verified that I actually do log in, but the session data is not kept, so when I download the HTML of the page I need, I get a page telling me to log in.
I found this, but I don't know how to implement it.
I have a class with two functions: Login(), used to log in to the website, based on this example:
using (WebClient client = new WebClient())
{
    var reqparm = new System.Collections.Specialized.NameValueCollection();
    reqparm.Add("param1", "<any> kinds & of = ? strings");
    reqparm.Add("param2", "escaping is already handled");
    byte[] responsebytes = client.UploadValues("http://localhost", "POST", reqparm);
    string responsebody = Encoding.UTF8.GetString(responsebytes);
}
Then, the second function, DownloadHtml(string url), contains this:
using (WebClient client = new WebClient())
{
    client.Encoding = Encoding.UTF8;
    html = client.DownloadString(url);
    return html;
}
How do I save cookies in Login() and use them in DownloadHtml() to see the page as a logged-in user? Or shouldn't I be using WebClient at all? If not, what should I use?
Thanks.
After you receive cookies from your first authorization request, you should save them (in some variable, perhaps) and then manually add them to any subsequent request you make.
Adding cookies to a request can be done like this:
using (WebClient client = new WebClient())
{
    client.Headers.Add("Cookie", "AUTH_COOKIE_NAME=" + AUTH_COOKIE_VALUE);
    client.Encoding = Encoding.UTF8;
    html = client.DownloadString(url);
    return html;
}
I have a computer that cannot access the internet without a proxy and authentication (username and password). So I did this:
var webProxy = new WebProxy(PROXY_ADRESS, PORT);
webProxy.Credentials = new NetworkCredential(USERNAME, PASSWORD, DOMAIN);
var webRequest = (HttpWebRequest)WebRequest.Create("https://www.google.com.br/");
webRequest.Proxy = webProxy;
HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse();
Stream receiveStream = response.GetResponseStream();
webBrowser1.DocumentStream = receiveStream;
When I execute this code, the web browser loads the Google page (with some script errors), but it does not load some of the images, and when I click Search or any other button, the page turns white except for a bit of text showing the end of the URL, for example: /search.
How can I navigate to other pages and have them load fully (including the images)?
I followed this tutorial and now it's working:
http://www.journeyintocode.com/2013/08/c-webbrowser-control-proxy.html
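For reference, the usual technique behind that tutorial is to set the proxy process-wide through WinINET, which the WebBrowser control uses under the hood, instead of per request. A rough sketch, assuming the documented WinINET constants and omitting error handling:
using System;
using System.Runtime.InteropServices;

public static class WinInetProxy
{
    [DllImport("wininet.dll", SetLastError = true)]
    private static extern bool InternetSetOption(IntPtr hInternet, int dwOption, IntPtr lpBuffer, int dwBufferLength);

    private const int INTERNET_OPTION_PROXY = 38;
    private const int INTERNET_OPEN_TYPE_PROXY = 3;

    [StructLayout(LayoutKind.Sequential)]
    private struct INTERNET_PROXY_INFO
    {
        public int dwAccessType;
        public IntPtr proxy;
        public IntPtr proxyBypass;
    }

    // Sets the proxy for the whole process; the WebBrowser control picks it up,
    // so navigation, images and postbacks all go through the proxy.
    public static void SetProxy(string proxyAddress) // e.g. "192.168.0.1:8080"
    {
        var info = new INTERNET_PROXY_INFO
        {
            dwAccessType = INTERNET_OPEN_TYPE_PROXY,
            proxy = Marshal.StringToHGlobalAnsi(proxyAddress),
            proxyBypass = Marshal.StringToHGlobalAnsi("local")
        };
        IntPtr buffer = Marshal.AllocCoTaskMem(Marshal.SizeOf(info));
        Marshal.StructureToPtr(info, buffer, false);
        InternetSetOption(IntPtr.Zero, INTERNET_OPTION_PROXY, buffer, Marshal.SizeOf(info));
        Marshal.FreeCoTaskMem(buffer);
        Marshal.FreeHGlobal(info.proxy);
        Marshal.FreeHGlobal(info.proxyBypass);
    }
}
You then navigate with webBrowser1.Navigate(...) as usual and let the browser render the page itself; the proxy credentials can typically be entered once in the dialog the control shows.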
The code below posts a request to an IIS basic-authorization site and successfully logs in to the site using Windows credentials. But what I need to do is convert this to opening the website in a browser, much like opening a new hyperlink with a target="null".
So, just to recap: how do you post the WebRequest to a new browser tab? Or how do you send the CredentialCache to a new URL request?
var request = WebRequest.Create(testURL);
SetBasicAuthHeader(request, "username", "password", testURL);
var response = request.GetResponse();

public void SetBasicAuthHeader(WebRequest request, String userName, String userPassword, String testURL)
{
    CredentialCache credentialCache = new CredentialCache();
    credentialCache.Add(new System.Uri(testURL), "Basic", new NetworkCredential(userName, userPassword, "domain"));
    request.Credentials = credentialCache;
    request.PreAuthenticate = true;
}
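As an aside, despite its name, SetBasicAuthHeader above relies on request.Credentials, which only answers a 401 challenge (PreAuthenticate merely re-sends the header on subsequent requests). A small sketch of building the Authorization header yourself, so the credentials go out on the very first request:
public void SetBasicAuthHeaderDirectly(WebRequest request, string userName, string userPassword)
{
    // Basic auth is just "user:password", Base64-encoded, in a header
    string token = Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes(userName + ":" + userPassword));
    request.Headers["Authorization"] = "Basic " + token;
}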
The short answer is: you can't send a response to a new tab. The problem with what you are trying to do is that the request/response you build on the server is technically the server's, not the client's. So even if you build the request and redirect the user to the URL you just authenticated against, the client is still considered unauthenticated, because the authentication took effect on the server.
I am using the WebBrowser control, and I get the list of URLs of all the profiles from the search result.
After this, is there any way to use HttpWebRequest to get the data from those URLs?
I wanted to use the LinkedIn search profile API, but that is very confusing.
I also tried using HttpWebRequest, but it takes me to the LinkedIn login page.
I was thinking that, since I signed in to LinkedIn using the WebBrowser control, maybe I could take that information from the WebBrowser and add it to my request to pretend to be logged in.
Any ideas? Please help.
The HttpWebRequest sent you to the login page because it isn't carrying the validation cookie.
So, you can connect using the WebBrowser control and get the cookie, then put the cookie into the web request:
webBrowser.Navigate(someUrl);
...
// Copy the browser's cookies into a CookieContainer for the HttpWebRequest
CookieContainer cookies = new CookieContainer();
foreach (string cookie in webBrowser.Document.Cookie.Split(';'))
{
    string name = cookie.Split('=')[0];
    string value = cookie.Substring(name.Length + 1);
    string path = "/";
    string domain = "yourdomain.com"; // the site's cookie domain
    cookies.Add(new Cookie(name.Trim(), value.Trim(), path, domain));
}
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.CookieContainer = cookies;
...
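One caveat with this approach: webBrowser.Document.Cookie only exposes cookies visible to script, so HttpOnly session cookies (which login cookies often are) won't show up there. A sketch of reading them through WinINET's InternetGetCookieEx instead, using the documented flag:
using System;
using System.Runtime.InteropServices;
using System.Text;

static class WinInetCookies
{
    [DllImport("wininet.dll", CharSet = CharSet.Auto, SetLastError = true)]
    private static extern bool InternetGetCookieEx(string url, string cookieName,
        StringBuilder cookieData, ref int size, int flags, IntPtr reserved);

    private const int INTERNET_COOKIE_HTTPONLY = 0x00002000;

    // Returns the "name=value; name2=value2" cookie header for a URL,
    // including HttpOnly cookies, or null if there are none.
    public static string GetCookieHeader(string url)
    {
        int size = 0;
        // The first call just asks for the required buffer size
        InternetGetCookieEx(url, null, null, ref size, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero);
        if (size <= 0)
            return null;
        var sb = new StringBuilder(size);
        if (!InternetGetCookieEx(url, null, sb, ref size, INTERNET_COOKIE_HTTPONLY, IntPtr.Zero))
            return null;
        return sb.ToString();
    }
}
The returned string can be sent as-is with request.Headers.Add("Cookie", ...) instead of being parsed into a CookieContainer.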
I have an application that reads parts of the source code on a website. That all works; but the problem is that the page in question requires the user to be logged in to access the source code. What my program needs is a way to initially log the user in to the website; after that is done, I'll be able to access and read the source code.
The website that needs to be logged into is:
mmoinn.com/index.do?PageModule=UsersLogin
You can continue using WebClient to POST (instead of GET, which is the HTTP verb you're currently using with DownloadString), but I think you'll find it easier to work with the (slightly) lower-level classes WebRequest and WebResponse.
There are two parts to this - the first is to post the login form, the second is recovering the "Set-cookie" header and sending that back to the server as "Cookie" along with your GET request. The server will use this cookie to identify you from now on (assuming it's using cookie-based authentication which I'm fairly confident it is as that page returns a Set-cookie header which includes "PHPSESSID").
POSTing to the login form
Form posts are easy to simulate; it's just a case of formatting your post data as follows:
field1=value1&field2=value2
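One thing to watch: if a value can contain reserved characters such as & or =, URL-encode it first, for instance with Uri.EscapeDataString (email and password here stand for your own variables):
string formParams = string.Format("email_address={0}&password={1}",
    Uri.EscapeDataString(email), Uri.EscapeDataString(password));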
Using WebRequest and code I adapted from Scott Hanselman, here's how you'd POST form data to your login form:
// NOTE: This is the URL the form POSTs to, not the URL of the form
// (you can find it in the "action" attribute of the HTML form tag).
string formUrl = "http://www.mmoinn.com/index.do?PageModule=UsersAction&Action=UsersLogin";
string formParams = string.Format("email_address={0}&password={1}", "your email", "your password");
string cookieHeader;
WebRequest req = WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
Here's an example of what you should see in the Set-cookie header for your login form:
PHPSESSID=c4812cffcf2c45e0357a5a93c137642e; path=/; domain=.mmoinn.com,wowmine_referer=directenter; path=/; domain=.mmoinn.com,lang=en; path=/;domain=.mmoinn.com,adt_usertype=other,adt_host=-
GETting the page behind the login form
Now you can perform your GET request to a page that you need to be logged in for.
string pageSource;
string getUrl = "the url of the page behind the login";
WebRequest getRequest = WebRequest.Create(getUrl);
getRequest.Headers.Add("Cookie", cookieHeader);
WebResponse getResponse = getRequest.GetResponse();
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
    pageSource = sr.ReadToEnd();
}
EDIT:
If you need to view the results of the first POST, you can recover the HTML it returned with:
using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
{
    pageSource = sr.ReadToEnd();
}
Place this directly below cookieHeader = resp.Headers["Set-cookie"]; and then inspect the string held in pageSource.
You can simplify things quite a bit by creating a class that derives from WebClient, overriding its GetWebRequest method and setting a CookieContainer object on it. If you always set the same CookieContainer instance, then cookie management will be handled automatically for you.
But the only way to get at the HttpWebRequest before it is sent is to inherit from WebClient and override that method.
public class CookieAwareWebClient : WebClient
{
    private CookieContainer cookie = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = cookie;
        }
        return request;
    }
}
var client = new CookieAwareWebClient();
client.BaseAddress = @"https://www.site.com/any/base/url/";
var loginData = new NameValueCollection();
loginData.Add("login", "YourLogin");
loginData.Add("password", "YourPassword");
client.UploadValues("login.php", "POST", loginData);
// Now you are logged in and can request pages
string htmlSource = client.DownloadString("index.php");
Matthew Brindley, your code worked very well for a website I needed to log in to, but I had to change to HttpWebRequest and HttpWebResponse, otherwise I would get a 404 Bad Request from the remote server. I would also like to share a workaround using your code: I tried to use it to log in to a website based on Moodle, but it didn't work at your step "GETting the page behind the login form", because even though the login POST succeeded, the 'Set-Cookie' header didn't return anything, whereas other websites do return it.
So I think this is where we need to store the cookies for the next requests, and I added this.
To the "POSTing to the login form" code block:
var cookies = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(formUrl);
req.CookieContainer = cookies;
And to the "GETting the page behind the login form" block:
HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create(getUrl);
getRequest.CookieContainer = new CookieContainer();
getRequest.CookieContainer.Add(resp.Cookies); // resp must be the HttpWebResponse from the login POST
getRequest.Headers.Add("Cookie", cookieHeader);
Doing this lets me log in and get the source code of the page behind the login (a Moodle-based website). I know this is a loose use of CookieContainer and HTTP cookies, because we could first check whether a previously saved set of cookies exists before sending the request to the server. It works without problems anyway, but here is some good material to read about WebRequest and WebResponse, with sample projects and tutorials:
Retrieving HTTP content in .NET
How to use HttpWebRequest and HttpWebResponse in .NET
Sometimes it may help to switch off AllowAutoRedirect and to set the same user agent on both the login POST and the page GET requests.
request.UserAgent = userAgent;
request.AllowAutoRedirect = false;
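Applied to the earlier POST/GET example, that looks roughly like this; the user agent string is an arbitrary example, and with AllowAutoRedirect off, a login that answers with a 302 still lets you read its Set-cookie header before following the redirect yourself:
string userAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"; // any consistent value

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(formUrl);
req.UserAgent = userAgent;        // same agent for the login POST...
req.AllowAutoRedirect = false;

HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create(getUrl);
getRequest.UserAgent = userAgent; // ...and for the page GET
getRequest.AllowAutoRedirect = false;
Note that UserAgent and AllowAutoRedirect are members of HttpWebRequest, so the requests must be created (or cast) as HttpWebRequest rather than plain WebRequest.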