Web browser proxy authentication - C#

I have a computer that cannot access the internet without going through a proxy that requires authentication (username and password). So I did this:
var webProxy = new WebProxy(PROXY_ADDRESS, PORT);
webProxy.Credentials = new NetworkCredential(USERNAME, PASSWORD, DOMAIN);
var webRequest = (HttpWebRequest)WebRequest.Create("https://www.google.com.br/");
webRequest.Proxy = webProxy;
HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse();
Stream receiveStream = response.GetResponseStream();
webBrowser1.DocumentStream = receiveStream;
When I execute this code, the web browser loads the Google page (with some script errors), but it does not load some of the images, and when I click Search or any other button the page goes white, showing only the end of the URL as text, for example: /search.
How can I navigate to other pages and load the full page (including the images)?

I've followed this tutorial and now it's working:
http://www.journeyintocode.com/2013/08/c-webbrowser-control-proxy.html
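For anyone who can't reach that link: as far as I remember, the approach in that tutorial is to point the WinINet session (which the WebBrowser control uses) at the proxy, and then pass the proxy credentials as an extra header when navigating, so the control itself fetches the page, its images and every follow-up request through the authenticated proxy. Below is a rough sketch of that idea, reconstructed from memory rather than copied from the tutorial; the WinInetProxy helper and the placeholder values are mine:

using System;
using System.Runtime.InteropServices;
using System.Text;
using System.Windows.Forms;

// Sketch only: routes the current process's WinINet session (used by the
// WebBrowser control) through a proxy.
public static class WinInetProxy
{
    [StructLayout(LayoutKind.Sequential)]
    private struct INTERNET_PROXY_INFO
    {
        public int dwAccessType;
        public IntPtr proxy;
        public IntPtr proxyBypass;
    }

    [DllImport("wininet.dll", SetLastError = true)]
    private static extern bool InternetSetOption(IntPtr hInternet, int dwOption,
        IntPtr lpBuffer, int dwBufferLength);

    private const int INTERNET_OPTION_PROXY = 38;
    private const int INTERNET_OPEN_TYPE_PROXY = 3;

    public static void SetProxy(string proxyAddress) // e.g. "10.0.0.1:8080"
    {
        var info = new INTERNET_PROXY_INFO
        {
            dwAccessType = INTERNET_OPEN_TYPE_PROXY,
            proxy = Marshal.StringToHGlobalAnsi(proxyAddress),
            proxyBypass = Marshal.StringToHGlobalAnsi("local")
        };

        IntPtr buffer = Marshal.AllocCoTaskMem(Marshal.SizeOf(info));
        Marshal.StructureToPtr(info, buffer, true);

        // Applies only to the current process, which is exactly what we want
        // for the WebBrowser control hosted in this application.
        InternetSetOption(IntPtr.Zero, INTERNET_OPTION_PROXY, buffer, Marshal.SizeOf(info));

        Marshal.FreeCoTaskMem(buffer);
    }
}

// e.g. called from the form's Load event (webBrowser1 is the control from the question):
private void Form1_Load(object sender, EventArgs e)
{
    WinInetProxy.SetProxy("PROXY_ADDRESS:PORT"); // placeholder values from the question
    string auth = Convert.ToBase64String(Encoding.ASCII.GetBytes(USERNAME + ":" + PASSWORD));
    webBrowser1.Navigate("https://www.google.com.br/", null, null,
        "Proxy-Authorization: Basic " + auth + "\r\n");
}

The key point is to let the control do the navigation itself (instead of feeding it a DocumentStream), so that relative links, images and scripts are all requested through the same authenticated session.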

Related

How to assign credential to WebRequest then navigate to it?

I am trying to view a report using someone else's credentials. I am stuck on how to assign the credentials and then navigate to the URL.
Question:
Is it possible to assign credentials to a WebRequest and then navigate to the URL? Something like this, maybe:
WebRequest request = WebRequest.Create(repUrl);
request.Credentials = new NetworkCredential("UserName", "Password");
webBrowser1.Navigate(request.RequestUri);
I know I am able to see the report when I use something like this, but I cannot click anything in the report:
WebRequest request = WebRequest.Create(repUrl);
request.Credentials = new NetworkCredential("Username", "Password");
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
StreamReader receiveStream = new StreamReader(response.GetResponseStream());
webBrowser1.ScriptErrorsSuppressed = true;
webBrowser1.DocumentStream = receiveStream.BaseStream;
Please help me with this. I have been stuck on this problem for too long now. Thank you.
Setting credentials like below
request.Credentials = new NetworkCredential("UserName", "Password");
would only work if the page accepts credentials via basic authentication, which in my experience is usually not the case.
If you want to use WebRequest, you would need to supply headers and cookies, and you would need a cookie container; how to obtain those cookies is another story.
Since you are using the WebBrowser control, the easiest way I would suggest is to navigate to the login page first, fill in the credentials and click the submit button, then navigate to the page you want:
private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    if (webBrowser1.Url.AbsolutePath.EndsWith("login.aspx")) // check if it is the login url
    {
        var doc = webBrowser1.Document;
        doc.GetElementById("email").SetAttribute("value", email);
        doc.GetElementById("password").SetAttribute("value", password);
        doc.GetElementsByTagName("input").OfType<HtmlElement>()
           .FirstOrDefault(x => x.GetAttribute("type") == "submit")
           .InvokeMember("click");
    }
    else
    {
        webBrowser1.Navigate(repUrl);
    }
}
In the answer above, I have assumed that the login page is login.aspx and that the email and password fields have the ids email and password; of course, you have to change how you fill these inputs to use their actual ids, or their names if they don't have ids, and so on.
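Side note: if the report URL really does sit behind plain basic authentication (see the caveat above), one thing worth trying is the Navigate overload that accepts additional headers, which lets you attach the Authorization header directly. This is only a sketch, untested against the asker's server; repUrl, "UserName" and "Password" are the question's own placeholders:

// Only applies if the server accepts HTTP Basic authentication.
string credentials = Convert.ToBase64String(
    System.Text.Encoding.ASCII.GetBytes("UserName" + ":" + "Password"));
webBrowser1.Navigate(repUrl, null, null,
    "Authorization: Basic " + credentials + Environment.NewLine);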

Set the same cookies from a WebRequest on the web browser client in C#?

I use WebRequest in an ASP.NET page (running on the server) to get a Google Drive direct download link. After that, I send this link to the client (they use a browser, e.g. Chrome or Firefox) to download the file. But the issue is cookies: the WebRequest cookies and the client's cookies are not the same, so of course the client can't download.
So, how can I set the WebRequest cookies on the client?
Or how can I make the WebRequest cookies and the client cookies the same? Thanks, and I hope someone has an idea to solve this problem. This is my code:
HttpWebRequest objWebRequest = (HttpWebRequest)WebRequest.Create(url);
objWebRequest.CookieContainer = cookies;
objWebRequest.Credentials = CredentialCache.DefaultCredentials;
HttpWebResponse objWebResponse = (HttpWebResponse)objWebRequest.GetResponse();
Stream receiveStream = objWebResponse.GetResponseStream();
StreamReader readStream = new StreamReader(receiveStream, System.Text.Encoding.UTF8);
HtmlAgilityPack.HtmlDocument doc = new HtmlDocument();
doc.Load(readStream);
string link = ""; //=> this link i need to send client
foreach (HtmlNode row in doc.DocumentNode.SelectNodes("//a[@id='uc-download-link']"))
{
    link += row.Attributes["href"].Value;
}

Posting Credentials to a basic authorization site and opening the link in a new browser. Open URL instead of a response

The code below will post a request to an IIS basic authorization site and successfully log in to the site using Windows credentials. But what I need to do is open the website in a browser, much like opening a new hyperlink with target="_blank".
So, just to recap: how do you post the WebRequest to a new browser tab? Or how do you send the CredentialCache to a new URL request?
var request = WebRequest.Create(testURL);
SetBasicAuthHeader(request, "username", "password", testURL);
var response = request.GetResponse();

public void SetBasicAuthHeader(WebRequest request, String userName, String userPassword, String testURL)
{
    CredentialCache credentialCache = new CredentialCache();
    credentialCache.Add(new System.Uri(testURL), "Basic", new NetworkCredential(userName, userPassword, "domain"));
    request.Credentials = credentialCache;
    request.PreAuthenticate = true;
}
The short answer is: you can't send a response to a new tab. The problem with what you are trying to do is that the request/response you build on the server is technically the server's, not the client's. So even if you build the request and then redirect the user to the URL you just authenticated against, the client is still considered unauthenticated, because the authentication took effect on the server.

FormsAuthentication through WCF (how do I make the session apply to the browser?)

Users authenticate against a REST WCF service (my own). The credentials are sent via AJAX in JSON format, and when authentication succeeds the service replies with an OK and a little info (a redirect URL) for the client.
Now there is a new method provided for external authentication, and I have to create a compact code snippet that is easy to paste and run inside an ASP.NET code-behind method.
A typical WCF request could end up like this:
http://testuri.org/WebService/AuthenticationService.svc/ExtLogin?cId=197&aId=someName&password=!!pwd
My code snippet so far:
protected void bn_Click(object sender, EventArgs e)
{
    WebHttpBinding webHttpBinding = new WebHttpBinding();
    EndpointAddress endpointAddress = new EndpointAddress(url);
    ContractDescription cd =
        ContractDescription.GetContract(typeof(IAuthenticationService));
    ServiceEndpoint sep = new ServiceEndpoint(cd);
    sep.Behaviors.Add(new WebHttpBehavior());
    sep.Address = endpointAddress;
    sep.Binding = webHttpBinding;

    var resp = new ChannelFactory<IAuthenticationService>(sep).CreateChannel();
    LoginResult result = resp.ExtLogin(cId, aId, hashPwd);
    Response.Redirect(result.RedirectUri);
    // I.e. http://testuri.org/Profile.aspx (requires authentication to visit)
}
I receive a correct, authenticated reply in the resp/result objects, so the communication is fine. But when redirecting to the actual website, I'm not authenticated, and I can't locate the problem. If I paste the URI above (with valid credentials) into my web browser's address bar and then manually type the profile URI, I am authenticated.
I've spent a day searching the net for this, without success.
There is a LOT of info, but none of it seems to apply.
What am I missing?
I also tried another approach, but the same problem persists.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uriWithParameters);
CookieContainer cookieContainer = new CookieContainer();
request.CookieContainer = cookieContainer;
request.ContentType = "application/json";
request.Accept = "application/json";
request.Method = "GET";

string result;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
    result = reader.ReadToEnd();

JavaScriptSerializer jsonDeserializer = new JavaScriptSerializer();
LoginResult contact = jsonDeserializer.Deserialize<LoginResult>(result);
Response.Redirect(contact.RedirectUri);
I'm not sure about this answer, but will offer it anyway as nobody else has posted.
I think it's because the request that has been authenticated is the request sent via code. When you redirect, it's a totally different request, so it is still not authenticated.
All authentication techniques require some way of maintaining the authenticated state across 'stateless' HTTP requests: session cookies or some kind of authentication token. Whatever token you get back from the call to the authentication service needs to be available to your website requests as well; dumping the token from the request into a cookie might be an option.
Can you see (in something like Fiddler) an auth token being sent as part of the request to 'RedirectUrl'?
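To make the "dump the token into a cookie" idea concrete, here is a rough sketch. It assumes the authentication service issues a normal cookie (for example a Forms Authentication ".ASPXAUTH" cookie) and that the page doing the redirect lives on a domain that cookie is valid for; uriWithParameters comes from the question, while redirectUri and the cookie handling are my assumptions:

CookieContainer cookieContainer = new CookieContainer();
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uriWithParameters);
request.CookieContainer = cookieContainer;

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // Copy whatever cookies the service set (e.g. ".ASPXAUTH") onto the outgoing
    // ASP.NET response, so the browser - not just this server-side request -
    // ends up holding the authentication ticket.
    foreach (Cookie serviceCookie in response.Cookies)
    {
        Response.Cookies.Add(new HttpCookie(serviceCookie.Name, serviceCookie.Value)
        {
            Path = serviceCookie.Path,
            Expires = serviceCookie.Expires
        });
    }
}

// Redirect only after the cookie is on its way to the client.
Response.Redirect(redirectUri); // the RedirectUri returned by the service

Whether an auth cookie is actually present in that response (and valid for your site's domain) is exactly what Fiddler should show, as suggested above.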

How to make my web scraper log in to this website via C#

I have an application that reads parts of the source code of a website. That all works, but the problem is that the page in question requires the user to be logged in to access this source code. What my program needs is a way to initially log the user into the website; after that is done, I'll be able to access and read the source code.
The website that needs to be logged into is:
mmoinn.com/index.do?PageModule=UsersLogin
You can continue using WebClient to POST (instead of GET, which is the HTTP verb you're currently using with DownloadString), but I think you'll find it easier to work with the (slightly) lower-level classes WebRequest and WebResponse.
There are two parts to this - the first is to post the login form, the second is recovering the "Set-cookie" header and sending that back to the server as "Cookie" along with your GET request. The server will use this cookie to identify you from now on (assuming it's using cookie-based authentication which I'm fairly confident it is as that page returns a Set-cookie header which includes "PHPSESSID").
POSTing to the login form
Form posts are easy to simulate, it's just a case of formatting your post data as follows:
field1=value1&field2=value2
Using WebRequest and code I adapted from Scott Hanselman, here's how you'd POST form data to your login form:
string formUrl = "http://www.mmoinn.com/index.do?PageModule=UsersAction&Action=UsersLogin";
// NOTE: This is the URL the form POSTs to, not the URL of the form
// (you can find it in the "action" attribute of the HTML form tag).
string formParams = string.Format("email_address={0}&password={1}", "your email", "your password");
string cookieHeader;
WebRequest req = WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
Here's an example of what you should see in the Set-cookie header for your login form:
PHPSESSID=c4812cffcf2c45e0357a5a93c137642e; path=/; domain=.mmoinn.com,wowmine_referer=directenter; path=/; domain=.mmoinn.com,lang=en; path=/;domain=.mmoinn.com,adt_usertype=other,adt_host=-
GETting the page behind the login form
Now you can perform your GET request to a page that you need to be logged in for.
string pageSource;
string getUrl = "the url of the page behind the login";
WebRequest getRequest = WebRequest.Create(getUrl);
getRequest.Headers.Add("Cookie", cookieHeader);
WebResponse getResponse = getRequest.GetResponse();
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
    pageSource = sr.ReadToEnd();
}
EDIT:
If you need to view the results of the first POST, you can recover the HTML it returned with:
using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
{
    pageSource = sr.ReadToEnd();
}
Place this directly below cookieHeader = resp.Headers["Set-cookie"]; and then inspect the string held in pageSource.
You can simplify things quite a bit by creating a class that derives from WebClient, overriding its GetWebRequest method and setting a CookieContainer object on it. If you always set the same CookieContainer instance, then cookie management will be handled automatically for you.
But the only way to get at the HttpWebRequest before it is sent is to inherit from WebClient and override that method.
public class CookieAwareWebClient : WebClient
{
    private CookieContainer cookie = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = cookie;
        }
        return request;
    }
}
var client = new CookieAwareWebClient();
client.BaseAddress = @"https://www.site.com/any/base/url/";
var loginData = new NameValueCollection();
loginData.Add("login", "YourLogin");
loginData.Add("password", "YourPassword");
client.UploadValues("login.php", "POST", loginData);
//Now you are logged in and can request pages
string htmlSource = client.DownloadString("index.php");
Matthew Brindley, your code worked very well for a website I needed (one with a login), but I had to change to HttpWebRequest and HttpWebResponse, otherwise I got a "404 Bad Request" from the remote server. I would also like to share my workaround using your code: I tried to log in to a Moodle-based website, but it didn't work at your step "GETting the page behind the login form", because even when the login POST succeeded, the 'Set-Cookie' header didn't return anything, unlike on other websites.
So I think this is where we need to store the cookies for the next requests, so I added this.
To the "POSTing to the login form" code block :
var cookies = new CookieContainer();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(formUrl);
req.CookieContainer = cookies;
And to the "GETting the page behind the login form" block:
HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create(getUrl);
getRequest.CookieContainer = new CookieContainer();
getRequest.CookieContainer.Add(resp.Cookies);
getRequest.Headers.Add("Cookie", cookieHeader);
Doing this lets me log in and get the source code of the "page behind the login" (a Moodle-based website). I know this is a vague use of the CookieContainer and HTTP cookies, because we could first ask whether there is a previously saved set of cookies before sending the request to the server. It works without problems anyway, but here is some good reading about WebRequest and WebResponse, with sample projects and tutorials:
Retrieving HTTP content in .NET
How to use HttpWebRequest and HttpWebResponse in .NET
Sometimes it may help to switch off AllowAutoRedirect and to set the same user agent on both the login POST and the page GET requests.
request.UserAgent = userAgent;
request.AllowAutoRedirect = false;
