Logging into a website using HttpWebRequest/Response in C#?

Now, first off, I want to understand whether it's better to use HttpWebRequest/HttpWebResponse or to simply use a WebBrowser control. Most people seem to prefer the WebBrowser control, yet whenever I ask about it, I'm told that HttpWebRequest and HttpWebResponse are better. So, if this question could be avoided by switching to a WebBrowser control (and there's a good reason why that's better), please let me know!
Basically, I set up a test site, written in PHP, running on localhost. It consists of three files.
The first is index.php, which just contains a simple login form. The session code is just me testing how sessions work, so it's not very well written; like I said, it's just for testing purposes:
<?php
session_start();
$_SESSION['id'] = 2233;
?>
<form method="post" action="login.php">
U: <input type="text" name="username" />
<br />
P: <input type="password" name="password" />
<br />
<input type="submit" value="Log In" />
</form>
Then, I have login.php (the action of the form), which looks like:
<?php
session_start();
$username = $_POST['username'];
$password = $_POST['password'];
if ($username == "username" && $password == "password" && $_SESSION['id'] == 2233)
{
    header('Location: loggedin.php');
    die();
}
else
{
    die('Incorrect login details');
}
?>
And lastly, loggedin.php just displays "Success!".
As you can see, a very simple test, and many of the things I have there are just for testing purposes.
So, then I go to my C# code. I created a method called "HttpPost". It looks like:
private static string HttpPost(string url)
{
    request = HttpWebRequest.Create(url) as HttpWebRequest;
    request.CookieContainer = cookies;
    request.UserAgent = userAgent;
    request.KeepAlive = keepAlive;
    request.Method = "POST";
    response = request.GetResponse() as HttpWebResponse;
    if (response.StatusCode != HttpStatusCode.Found)
        throw new Exception("Website not found");
    StreamReader sr = new StreamReader(response.GetResponseStream());
    return sr.ReadToEnd();
}
I built a Windows Forms application, so in the button's Click event I want to call the HttpPost method with the appropriate URL. However, I'm not really sure what I'm supposed to put there to make it actually log in.
Can anyone help me out? I'd also appreciate some general pointers on programmatically logging into websites!
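For reference, here's a minimal sketch of what I think the missing piece could be: the form fields written into the request body before the response is read. The field names and URL come from the PHP test site above; everything else is just a guess, not a confirmed fix.
// Sketch only: HttpPost extended to take URL-encoded form data.
// Requires System.Net, System.IO and System.Text.
// The existing cookies field should be reused across calls so the PHP session survives.
private static string HttpPost(string url, string formData)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.CookieContainer = cookies;
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";

    byte[] body = Encoding.UTF8.GetBytes(formData);
    request.ContentLength = body.Length;
    using (var stream = request.GetRequestStream())
    {
        stream.Write(body, 0, body.Length);
    }

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}

// In the button's Click handler (assuming the test site runs on localhost):
// string html = HttpPost("http://localhost/login.php", "username=username&password=password");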

Have you considered using WebClient?
It provides a set of higher-level methods for working with web resources, including UploadValues, though I'm not sure whether that covers your needs.
Also, it's probably better not to use WebBrowser, as that's a full-blown web browser that can execute scripts and so on; HttpWebRequest and WebClient are much more lightweight.
Edit: Login to website, via C#
Check that answer out; I think it's exactly what you're looking for.
Relevant code snippet from the above link:
var client = new WebClient();
client.BaseAddress = @"https://www.site.com/any/base/url/";
var loginData = new NameValueCollection();
loginData.Add("login", "YourLogin");
loginData.Add("password", "YourPassword");
client.UploadValues("login.php", "POST", loginData);
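One caveat with that snippet: a plain WebClient does not remember the session cookie that index.php sets, so a login followed by a second request would lose the session. A commonly used workaround (a sketch, not part of the linked answer) is to subclass WebClient and attach a CookieContainer:
using System;
using System.Net;

// Cookie-aware WebClient: every request made through it shares one cookie jar.
public class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; } = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            httpRequest.CookieContainer = Cookies;
        }
        return request;
    }
}
Using this in place of new WebClient() above means the session cookie set by the login page is sent along on later requests.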

You could use something like the WCF Web API HttpClient; it's much easier that way.
The following code is written off the top of my head, but it should give you the idea.
using (var client = new HttpClient())
{
    var data = new Dictionary<string, string>() { { "username", "username_value" }, { "password", "the_password" } };
    var content = new FormUrlEncodedContent(data);
    var response = client.Post("yourdomain/login.php", content);
    if (response.StatusCode == HttpStatusCode.OK)
    {
        //
    }
}
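For reference, the same idea with the System.Net.Http.HttpClient that ships with current .NET looks roughly like this (a sketch; the URL is a placeholder, and the cookie handler is added because the PHP login above relies on a session cookie):
// Requires System.Net, System.Net.Http, System.Collections.Generic and System.Threading.Tasks.
static async Task<string> LoginAsync()
{
    var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
    using (var client = new HttpClient(handler))
    {
        var data = new Dictionary<string, string>
        {
            { "username", "username_value" },
            { "password", "the_password" }
        };
        var content = new FormUrlEncodedContent(data);
        var response = await client.PostAsync("http://yourdomain/login.php", content);
        response.EnsureSuccessStatusCode();
        // The handler's CookieContainer now holds the session cookie for later requests.
        return await response.Content.ReadAsStringAsync();
    }
}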

Related

How can I pull data from a website using C#

Web-page data into the application
You can replicate the request the website makes to get a list of relevant numbers. The following code might be a good start.
var httpRequest = (HttpWebRequest) WebRequest.Create("<url>");
httpRequest.Method = "POST";
httpRequest.Accept = "application/json";
string postData = "{<json payload>}";
using (var streamWriter = new StreamWriter(httpRequest.GetRequestStream())) {
streamWriter.Write(postData);
}
var httpResponse = (HttpWebResponse) httpRequest.GetResponse();
string result;
using (var streamReader = new StreamReader(httpResponse.GetResponseStream())) {
result = streamReader.ReadToEnd();
}
Console.WriteLine(result);
Now, for the <url> and <json payload> values:
Open the web inspector in your browser.
Go to the Network tab.
Set it so Fetch/XHR/AJAX requests are shown.
Refresh the page.
Look for a request that you want to replicate.
Copy the request URL.
Copy the payload (JSON data; to use it in your C# code you'll have to add a \ before every ", as in the sketch below).
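A quick illustration of that escaping (the payload fields here are made up; the real ones come from whatever you copied in the Network tab):
// Regular string literal: every " from the copied payload becomes \"
string postData = "{\"query\": \"example\", \"page\": 1}";

// Or use a verbatim string, where quotes are doubled instead of backslash-escaped.
string postDataVerbatim = @"{""query"": ""example"", ""page"": 1}";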
Side note: The owner of the website you are making automated requests to might not be very happy about your tool, and you/it might be blocked if it makes too many requests in a short time.

Scrape data from web page with HtmlAgilityPack c#

I had a problem scraping data from a web page, for which I got a solution here:
Scrape data from web page that using iframe c#
My problem is that they changed the webpage, which is now https://webportal.thpa.gr/ctreport/container/track. I don't think it uses iframes anymore, and I cannot get any data back.
Can someone tell me if I can use the same method to get data from this webpage, or should I use a different approach?
I don't know how #coder_b figured out that I should use https://portal.thpa.gr/fnet5/track/index.php as the URL and that I should use
var reqUrlContent =
    hc.PostAsync(url,
        new StringContent($"d=1&containerCode={reference}&go=1", Encoding.UTF8,
            "application/x-www-form-urlencoded"))
    .Result;
to pass the variables.
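For context, that snippet presumably lives inside something like the following, with hc being an HttpClient (this framing is a guess, not #coder_b's original code):
// Requires System.Net.Http and System.Text.
static string Track(string reference)
{
    string url = "https://portal.thpa.gr/fnet5/track/index.php";
    using (var hc = new HttpClient())
    {
        var reqUrlContent = hc.PostAsync(url,
            new StringContent($"d=1&containerCode={reference}&go=1", Encoding.UTF8,
                "application/x-www-form-urlencoded")).Result;
        return reqUrlContent.Content.ReadAsStringAsync().Result;
    }
}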
EDIT: When I check the webpage, there is an input which contains the number:
<input type="text" id="report_container_containerno"
       name="report_container[containerno]" required="required"
       class="form-control" minlength="11" maxlength="11"
       placeholder="E/K για αναζήτηση" value="ARKU2215462" />
Can I pass that value somehow and then use HtmlAgilityPack? Then it should be easy to read the result.
Also, when I check the DocumentNode, it seems to show me the cookie-consent page that I'm supposed to agree to.
Can I bypass it or automatically accept the cookies?
Try this:
public static string Download(string search)
{
var request = (HttpWebRequest)WebRequest.Create("https://webportal.thpa.gr/ctreport/container/track");
var postData = string.Format("report_container%5Bcontainerno%5D={0}&report_container%5Bsearch%5D=", search);
var data = Encoding.ASCII.GetBytes(postData);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;
using (var stream = request.GetRequestStream())
{
stream.Write(data, 0, data.Length);
}
using (var response = (HttpWebResponse)request.GetResponse())
using (var stream = new StreamReader(response.GetResponseStream()))
{
return stream.ReadToEnd();
}
}
Usage:
var html = Download("ARKU2215462");
UPDATE
To find the POST parameters to use, press F12 in the browser to open the dev tools, then select the Network tab. Now fill the search input with your ARKU2215462 and press the button.
That makes a request to the server, and you can inspect both the request and the response. There are lots of requests (styles, scripts, images...), but you want the HTML page; look at its Form Data section.
That is the form data that was sent. If you click "view source", you get the data URL-encoded, like "report_container%5Bcontainerno%5D=ARKU2215462&report_container%5Bsearch%5D=", which is exactly what you need in your code.

Submit an HTML form using C#

I'm working on a school project: I want to see our timetable on our Windows 8.1 devices with a universal app, so that I don't have to log in every time I want to check it. I need a method that logs me in to our school's website and lets me see the source code, so I can read out the lessons.
The website I need to log in to with C# is here.
The source code looks like this:
<form method="post" action="/adfs/ls/?SAMLRequest=nZJPb9swDMW/iqF7bMmt11SIU2QZihZosSBxd9hloG1m1WZJriglwT79lD9uM2DrYUeBj3yPP2py%0As9NdskFHypqSiZSzBE1jW2W%2Bl%2Bypuh2N2c10QqC7vJez4J/NEl8Ckk9ioyF5rJQsOCMtkCJpQCNJ%0A38jV7PFB5imXvbPeNrZjyYwInY9Wc2soaHQrdBvV4NPyoWTP3vcks0yrH4Z8aBWmv6D9qQympsug%0A77Mt1kSWJbfWNXgIU7I1dIQsuf9Usm8C4epacH6Zg/iQQ12PEQpx2dQXdcFbhCijBRCpDb41EgW8%0Aj4ZgfMlyLoqR4CN%2BUYmxFFzmeVoU119Zsjgt8VGZI5z3Nq6PIpJ3VbUYLT6vKpZ8GSBHATshlQd3%0Ad87y/cEwAGTTAdfACNvQBPoT1SQ7t3m94%2BmE2B4Yxlt43PlkbnUPTtE%2Bo4ad0kG/5jwXzruYYonr%0A/0q9l62xRQf7t4Q4F41XzfG5jdzslobYf3Odnor/2OKtfP5Zp78B&RelayState=Zadkine" id="MainForm">
<input name="ctl00$ContentPlaceHolder1$UsernameTextBox" type="text" id="ContentPlaceHolder1_UsernameTextBox" />
<input name="ctl00$ContentPlaceHolder1$PasswordTextBox" type="password" id="ContentPlaceHolder1_PasswordTextBox" />
<input type="submit" name="ctl00$ContentPlaceHolder1$SubmitButton" value="Aanmelden" id="ContentPlaceHolder1_SubmitButton" class="Resizable" />
</form>
This is it, basically. There are some __VIEWSTATE fields, but I don't know if they matter.
I found two types of solutions.
WebRequest examples here on Stack Overflow, which didn't work :/
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://sts.zadkine.nl/adfs/ls/?SAMLRequest=nZJPb9swDMW/iqF7bMmt11SIU2QZihZosSBxd9hloG1m1WZJriglwT79lD9uM2DrYUeBj3yPP2py%0As9NdskFHypqSiZSzBE1jW2W%2Bl%2Bypuh2N2c10QqC7vJez4J/NEl8Ckk9ioyF5rJQsOCMtkCJpQCNJ%0A38jV7PFB5imXvbPeNrZjyYwInY9Wc2soaHQrdBvV4NPyoWTP3vcks0yrH4Z8aBWmv6D9qQympsug%0A77Mt1kSWJbfWNXgIU7I1dIQsuf9Usm8C4epacH6Zg/iQQ12PEQpx2dQXdcFbhCijBRCpDb41EgW8%0Aj4ZgfMlyLoqR4CN%2BUYmxFFzmeVoU119Zsjgt8VGZI5z3Nq6PIpJ3VbUYLT6vKpZ8GSBHATshlQd3%0Ad87y/cEwAGTTAdfACNvQBPoT1SQ7t3m94%2BmE2B4Yxlt43PlkbnUPTtE%2Bo4ad0kG/5jwXzruYYonr%0A/0q9l62xRQf7t4Q4F41XzfG5jdzslobYf3Odnor/2OKtfP5Zp78B&RelayState=Zadkine");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
using (var requestStream = request.GetRequestStream())
using (var writer = new StreamWriter(requestStream))
{
writer.Write("ctl00$ContentPlaceHolder1$UsernameTextBox=" + yourusername + "&ctl00$ContentPlaceHolder1$PasswordTextBox=" + yourpassword);
}
using (var responseStream = request.GetResponse().GetResponseStream())
using (var reader = new StreamReader(responseStream))
{
var result = reader.ReadToEnd();
richTextBox1.Text = result;
}
On other websites, if I try this code, I get an error like "You have to allow cookies to login.", but on my school's website I don't get anything back, not even "Wrong password". (If I type a wrong password in a browser, I do get the wrong-password error.)
Duplicating the form into an .html file and using a WebView to log in with JavaScript. If I try this, I get redirected to another page and get a very weird error like "User null couldn't recognized". So these two types of solutions didn't work for me.
So, the question is: how can I log in to the website with C#?
Example code in the WebBrowser DocumentCompleted event handler:
HtmlElement element;
// Filling the username
element = webBrowser.Document.GetElementById("ContentPlaceHolder1_UsernameTextBox");
if (element != null)
{
element.InnerText = "username";
}
// In case if there is no id of the input field you can get it by name
HtmlElementCollection elements = null;
elements = webBrowser.Document.All.GetElementsByName("pass");
element = elements[0];
element.InnerText = "password";
//login (click)
elements = webBrowser.Document.All.GetElementsByName("submit");
element = elements[0];
element.InvokeMember("CLICK");
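For completeness, the snippet above is meant to run inside the DocumentCompleted handler, which could be wired up roughly like this (a sketch; the control name and the URL check are assumptions):
// Somewhere in the form's setup code:
webBrowser.DocumentCompleted += webBrowser_DocumentCompleted;
webBrowser.Navigate(loginUrl); // the full ADFS URL from the question

void webBrowser_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
{
    // Ignore frames and intermediate redirects; only act once the final page has loaded.
    if (e.Url != webBrowser.Url) return;

    // ...the element-filling and click code shown above goes here...
}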

How to integrate HTML markup from another URL in C#

I have an .aspx file that loads HTML markup. This markup contains a div element which is basically a container for more HTML markup retrieved from another URL. A code snippet would look like this:
<div id="container">
<%= RetrieveIntegrationMarkup() %>
</div>
What is the best way to retrieve the markup in RetrieveIntegrationMarkup()? Currently, we are using a workaround to accept self-signed SSL certificates, but it only works in our test environments; it doesn't work in production.
I don't know if this will help, but here's a snippet of the method in question:
HttpWebRequest.DefaultCachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.Revalidate);
ServicePointManager.CertificatePolicy = new MyPolicy();
Uri serviceUri = new Uri(integrationUrl, UriKind.Absolute);
HttpWebRequest webRequest = (HttpWebRequest)System.Net.WebRequest.Create(serviceUri);
HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse();
using (var sr = new StreamReader(response.GetResponseStream()))
{
markup = sr.ReadToEnd();
}
Thanks!
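As a side note, the ServicePointManager.CertificatePolicy line above is the obsolete API. If the workaround really is just "accept the self-signed certificate in test environments", the more usual form is the validation callback; here is a sketch under that assumption:
// Requires System.Net and System.Net.Security.
// This is a global setting: prefer scoping it (e.g. checking the host) over trusting everything.
ServicePointManager.ServerCertificateValidationCallback =
    (sender, certificate, chain, sslPolicyErrors) =>
        sslPolicyErrors == SslPolicyErrors.None ||
        sslPolicyErrors == SslPolicyErrors.RemoteCertificateChainErrors; // typical for self-signed certs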

How to log in to a website using HttpWebRequest via my web app or generic handler and access the content?

Basically, I am making a chat app for my university's students only, and to make sure they are genuine I have to check their details on the UMS (university management system) and fetch their basic details, so that they chat under their real identities. I am nearly done with the chat app; only the login is left.
So I want to log in to my UMS page via my website from a generic handler,
and then navigate to another page in it to access their basic info, keeping the session alive.
I did research on HttpWebRequest and failed to log in with my credentials.
https://ums.lpu.in/lpuums
(built in ASP.NET)
I tried code from other posts for the login.
I am a novice in this area, so bear with me; any help will be appreciated.
Without an actual handshake with UMS via a defined API, you would end up scraping UMS HTML, which is bad for various reasons.
I would suggest you read up on Single Sign-On (SSO).
A few articles on SSO and ASP.NET:
1. CodeProject
2. MSDN
3. ASP.NET forums
Edit 1
Although I think this is a bad idea, since you say you are out of options, here is a link that shows how Html Agility Pack can help with scraping web pages.
Beware of the drawbacks of screen scraping: changes on the UMS side will not be communicated to you, and you will see your application stop working all of a sudden.
public string Scrap(string Username, string Password)
{
    string Url1 = "https://www.example.com";            // first url
    string Url2 = "https://www.example.com/login.aspx"; // url to post the request to

    // first request
    CookieContainer jar = new CookieContainer();
    HttpWebRequest request1 = (HttpWebRequest)WebRequest.Create(Url1);
    request1.CookieContainer = jar;

    // Get the response from the server and save the cookies from the first request
    HttpWebResponse response1 = (HttpWebResponse)request1.GetResponse();

    // second request
    string postData = "***viewstate here***"; // VIEWSTATE
    HttpWebRequest request2 = (HttpWebRequest)WebRequest.Create(Url2);
    request2.CookieContainer = jar;
    request2.KeepAlive = true;
    request2.Referer = Url2;
    request2.Method = WebRequestMethods.Http.Post;
    request2.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
    request2.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
    request2.ContentType = "application/x-www-form-urlencoded";
    request2.AllowWriteStreamBuffering = true;
    request2.ProtocolVersion = HttpVersion.Version11;
    request2.AllowAutoRedirect = true;

    byte[] byteArray = Encoding.ASCII.GetBytes(postData);
    request2.ContentLength = byteArray.Length;
    Stream newStream = request2.GetRequestStream(); // open connection
    newStream.Write(byteArray, 0, byteArray.Length); // send the data
    newStream.Close();

    HttpWebResponse response2 = (HttpWebResponse)request2.GetResponse();
    string responseData;
    using (StreamReader sr = new StreamReader(response2.GetResponseStream()))
    {
        responseData = sr.ReadToEnd();
    }
    return responseData;
}
This is the code that works for me. Anyone can plug in their own URLs and viewstate for the ASP.NET websites they want to scrape, and you need to take care of the cookies too.
Other (non-ASP.NET) websites don't require a viewstate.
Use Fiddler to find what needs to go in the headers, the viewstate, or the cookies.
Hope this helps if someone runs into the same problem. :)
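Since the earlier edit recommends Html Agility Pack for the actual parsing, a short sketch of how the HTML returned by Scrap could be queried might look like this (the XPath and element id are hypothetical; they depend entirely on the page being scraped):
// Requires the HtmlAgilityPack NuGet package.
var doc = new HtmlAgilityPack.HtmlDocument();
doc.LoadHtml(Scrap(username, password));

// Hypothetical selector: take the real id/XPath from the target page's markup.
var nameNode = doc.DocumentNode.SelectSingleNode("//span[@id='studentName']");
string studentName = nameNode != null ? nameNode.InnerText.Trim() : null;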
