I log into the site https://dmarket.com with Selenium. I want to save the cookies and reuse them later, so that I don't have to go through the login flow the next time.
private void login_Click(object sender, EventArgs e)
{
string login = textBox1.Text;
string password = textBox2.Text;
string steamguard = textBox3.Text;
IWebDriver driver = new ChromeDriver();
driver.Navigate().GoToUrl(@"https://steamcommunity.com/openid/login?openid.claimed_id=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0%2Fidentifier_select&openid.identity=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0%2Fidentifier_select&openid.mode=checkid_setup&openid.ns=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0&openid.realm=https%3A%2F%2Fapi.dmarket.live&openid.return_to=https%3A%2F%2Fapi.dmarket.live%2Fauth%2Fv1%2Fcallback%2Fsteam%2F901e7d34-06c1-44b0-82b4-2f982c058361");
driver.FindElement(By.XPath("//*[@id=\"steamAccountName\"]")).SendKeys(login);
driver.FindElement(By.XPath("//*[@id=\"steamPassword\"]")).SendKeys(password);
driver.FindElement(By.XPath("//*[@id=\"imageLogin\"]")).Click();
driver.Manage().Timeouts().ImplicitWait = TimeSpan.FromSeconds(150);
driver.FindElement(By.XPath("//*[@id=\"twofactorcode_entry\"]")).SendKeys(steamguard);
driver.FindElement(By.XPath("//*[@id=\"login_twofactorauth_buttonset_entercode\"]/div[1]")).Click();
var cookies = driver.Manage().Cookies.AllCookies;
driver.Manage().Cookies.AddCookie(cookies);
}
But I get a compile error: Error CS1503 Argument 1: cannot convert from "System.Collections.ObjectModel.ReadOnlyCollection<OpenQA.Selenium.Cookie>" to "OpenQA.Selenium.Cookie". Maybe I'm doing something wrong, or maybe this should be done differently altogether.
Thank you!
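For reference, the compile error happens because AddCookie() takes a single Cookie while AllCookies returns a read-only collection, so the cookies have to be added back one at a time. A minimal sketch of saving and restoring them, assuming a plain-text file for storage (the file name and format are made up for illustration, and the saved session cookies must still be valid when reloaded):
// requires: using System.IO; using System.Linq; using OpenQA.Selenium;
// Save name/value pairs after a successful login.
var allCookies = driver.Manage().Cookies.AllCookies;
File.WriteAllLines("cookies.txt", allCookies.Select(c => c.Name + "\t" + c.Value));
// Later, in a new session: open the target domain first, then add the cookies back.
driver.Navigate().GoToUrl("https://dmarket.com");
foreach (var line in File.ReadAllLines("cookies.txt"))
{
    var parts = line.Split('\t');
    driver.Manage().Cookies.AddCookie(new Cookie(parts[0], parts[1]));
}
driver.Navigate().Refresh();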
From my experience, dealing with cookies at a high level will fail you. To get to the root of the problem, my way is:
Get a cookie manager extension for Firefox or whatever browser you are using.
See how many cookie key/value pairs you get after logging in.
Install the Fiddler sniffer and see how many of them are sent in requests after login while browsing the website.
Extract those cookies and inject them into HttpClient or a similar class, then track the requests with Fiddler to see whether they succeed.
Once the raw request succeeds, I add the same headers and cookies to the Selenium session and continue doing my Selenium stuff.
It may be a longer approach, but it has always worked for me. Let me show you an example with an Instagram login:
var ig_did = driver.Manage().Cookies.GetCookieNamed("ig_did");
var sessionid = driver.Manage().Cookies.GetCookieNamed("sessionid");
var mid = driver.Manage().Cookies.GetCookieNamed("mid");
var ig_nrcb = driver.Manage().Cookies.GetCookieNamed("ig_nrcb");
var rur = driver.Manage().Cookies.GetCookieNamed("rur");
var csrftoken = driver.Manage().Cookies.GetCookieNamed("csrftoken");
var ds_user_id = driver.Manage().Cookies.GetCookieNamed("ds_user_id");
string ig_did_value = ig_did.ToString().Substring(0, ig_did.ToString().IndexOf(";")).Replace("ig_did=", "");
string sessionid_value = sessionid.ToString().Substring(0, sessionid.ToString().IndexOf(";")).Replace("sessionid=", "");
string mid_value = mid.ToString().Substring(0, mid.ToString().IndexOf(";")).Replace("mid=", "");
string ig_nrcb_value = ig_nrcb.ToString().Substring(0, ig_nrcb.ToString().IndexOf(";")).Replace("ig_nrcb=", "");
string rur_value = rur.ToString().Substring(0, rur.ToString().IndexOf(";")).Replace("rur=", "");
string ds_user_id_value = ds_user_id.ToString().Substring(0, ds_user_id.ToString().IndexOf(";")).Replace("ds_user_id=", "");
string csrftoken_value = csrftoken.ToString().Substring(0, csrftoken.ToString().IndexOf(";")).Replace("csrftoken=", "");
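Side note: OpenQA.Selenium.Cookie also exposes the value directly, so the string parsing above can be skipped if you prefer (same variables as above assumed):
string sessionid_value = sessionid.Value;
string csrftoken_value = csrftoken.Value;
// ...and likewise for the other cookies.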
Then inject them into HttpClient and sniff the traffic with Fiddler:
var baseAddress = new Uri("https://www.instagram.com");
var cookieContainer = new CookieContainer();
using (var handler = new HttpClientHandler()
{
CookieContainer = cookieContainer,
Proxy = new WebProxy("127.0.0.1:8888", false),
UseProxy = true,
AllowAutoRedirect = true
})
using (var httpclient = new HttpClient(handler) { BaseAddress = baseAddress })
{
httpclient.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.82 Safari/537.36");
httpclient.DefaultRequestHeaders.Add("X-CSRFToken", csrftoken_value);
httpclient.DefaultRequestHeaders.Add("Referer", "My_Instagram_URL");
httpclient.DefaultRequestHeaders.Add("X-IG-App-ID", Ig_app_Id_value);
httpclient.DefaultRequestHeaders.Add("Origin", "https://www.instagram.com");
httpclient.DefaultRequestHeaders.Add("Connection", "keep-alive");
httpclient.DefaultRequestHeaders.Add("X-Requested-With", "XMLHttpRequest");
httpclient.DefaultRequestHeaders.Add("Sec-Fetch-Site", "same-origin");
httpclient.DefaultRequestHeaders.Add("Sec-Fetch-Mode", "cors");
httpclient.DefaultRequestHeaders.Add("Sec-Fetch-Dest", "empty");
cookieContainer.Add(baseAddress, new System.Net.Cookie("ig_did", ig_did_value));
cookieContainer.Add(baseAddress, new System.Net.Cookie("mid", mid_value));
cookieContainer.Add(baseAddress, new System.Net.Cookie("ig_nrcb", ig_nrcb_value));
cookieContainer.Add(baseAddress, new System.Net.Cookie("csrftoken", csrftoken_value));
cookieContainer.Add(baseAddress, new System.Net.Cookie("sessionid", sessionid_value));
cookieContainer.Add(baseAddress, new System.Net.Cookie("rur", rur_value));
string url = "My_Instagram_URL";
var response = await httpclient.GetAsync(url);
}
As I said, it looks like a long approach, but it has always worked for me.
Good luck.
Related
Using RestSharp, I am able to log in:
RestClient client = new RestClient(Constants.APIURL + "method/login");
CookieContainer cookieJar = new CookieContainer();
RestRequest request = new RestRequest(Method.POST);
client.CookieContainer = cookieJar;
request.AddHeader("Content-Type", "application/json");
request.AddHeader("Accept", "application/json");
request.AddJsonBody(new
{
usr = username,
pwd = password
});
var response = client.Execute(request);
var cookie = HttpContext.Current.Server.UrlDecode(response.Headers.ToList().Find(x => x.Name == "Set-Cookie").Value.ToString());
I am then storing the cookies and sending them to another API call, also through RestSharp.
RestClient client = new RestClient(Constants.APIURL);
RestRequest request = new RestRequest("resource/Asset", Method.GET);
request.AddCookie("Cookie", HttpContext.Current.Server.UrlEncode(cookie));
But it keeps returning 403 Forbidden. I tried it in Postman and it works absolutely fine.
Any help? Am I sending the cookies wrongly? I tried sending the cookies in an HttpWebRequest and that works absolutely fine.
I also tried copy-pasting the code generated by Postman, where the cookie was passed in a header, but it didn't work. Sending the cookie as below worked:
client.AddDefaultHeader("Cookie", cookie);
For RestSharp v107 and later:
You can use a CookieContainer to store all the received cookies,
then pass the CookieContainer to RestClientOptions and use the options when instantiating the client: var client1 = new RestClient(options1);
CookieContainer cookieJar = new CookieContainer();
cookieJar.Add(response.Cookies);
var options1 = new RestClientOptions(UL.moodle_host + "login/index.php?")
{
ThrowOnAnyError = true,
UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:99.0) Gecko/20100101 Firefox/99.0",
FollowRedirects = true,
CookieContainer = cookieJar
};
// new request: try logging in
var client1 = new RestClient(options1);
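A possible follow-up with the same client, reusing the cookie jar for the next authenticated request (the resource path below is only a placeholder):
// cookieJar already holds the login cookies, so client1 sends them automatically.
var request1 = new RestRequest("my/protected/page");
var response1 = await client1.ExecuteAsync(request1);
Console.WriteLine(response1.StatusCode);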
I'm working on setting up an authorized RESTful request and I'm having a hell of a time getting a valid response back. If I paste the request URL into a browser (Firefox Quantum and Chrome), I get a response of Status:Authenticated;token:[token string], but when I try WebRequest.Create([url]) I keep getting "400: Bad Request". I'm copying the URL straight from debug code, so I know it's valid. I'm pretty sure I'm doing something simple wrong. Would someone point me in the right direction?
string loginReq = _authPath + "?user=" + _username + "&pw=" + _password;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(loginReq);
request.Accept = "text/html, application/xhtml + xml, */*";
request.UserAgent = "Mozilla/5.0(Windows NT 6.1; WOW64; Trident/7.0; rv: 11.0) like Gecko";
request.KeepAlive = true;
WebResponse response = request.GetResponse();
Console.WriteLine(response.ResponseUri);
Console.Read();
Ok, after doing some more poking it looks like the site I'm calling is refusing the request because it thinks I'm using IE9. Here's the latest version of the code
private static string GetAuthorization() {
string token = string.Empty;
string loginReq = _authPath + "?user=" + _ftpUsername + "&pw=" + _ftpPassword;
string task = SendWebRequest(loginReq);
//HttpWebRequest request = (HttpWebRequest)WebRequest.Create(loginReq);
//request.Accept = "text/html, application/xhtml + xml, */*";
//request.UserAgent = "Mozilla/5.0(Windows NT 6.1; WOW64; Trident/7.0; rv: 11.0) like Gecko";
//request.KeepAlive = true;
//request.Headers.Add("Accept-Encoding", "gzip, deflate");
//request.Headers.Add("Cache-Control", "no-cache");
//request.Headers.Add("Accept-Language", "en-US");
//request.Headers.Add("DNT", "1");
//request.Method = "GET";
//request.CookieContainer = new CookieContainer();
//request.Headers.Add("Request", "GET /xc2/QAPI_Upload?user=user#ottrtest1.com&pw=ETqDJeQ1! HTTP/1.1");
//HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Console.WriteLine(task);
Console.Read();
return token;
}
public static string SendWebRequest(string requestUrl) {
using (HttpClient client = new HttpClient())
using (HttpResponseMessage response = client.GetAsync(requestUrl).GetAwaiter().GetResult())
return response.Content.ReadAsStringAsync().GetAwaiter().GetResult();
}
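If the IE9-style user agent really is what the server objects to, the HttpClient helper could present a browser-like one as well; a sketch only, with the header string borrowed from the commented-out attempt above:
public static string SendWebRequest(string requestUrl) {
    using (HttpClient client = new HttpClient()) {
        // Send a browser-like User-Agent instead of none at all.
        client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent",
            "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko");
        using (HttpResponseMessage response = client.GetAsync(requestUrl).GetAwaiter().GetResult())
            return response.Content.ReadAsStringAsync().GetAwaiter().GetResult();
    }
}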
I keep trying different things but I'm getting the same result (400 Bad Request). Here's the latest version of what I'm working with:
string loginReq = $"{_authPath}?user={_userName}&pw={_passWord}";
string result;
using (WebClient wc = new WebClient()) {
var json = wc.DownloadString(loginReq);
result = json.ToString();
Console.WriteLine(json);
}
If I change the URL to "https://www.google.com", my code works. If I paste loginReq into SoapUI, it works. I just can't get the URL and my code to work together...
Fiddler found the problem. Once I reviewed the request in Fiddler, I saw that I needed to set the security protocol type to TLS 1.0, TLS 1.1, or TLS 1.2. Once I did that, the call finally worked. Here's the working code in case anyone needs it for reference:
ServicePointManager.SecurityProtocol = (SecurityProtocolType)192 | (SecurityProtocolType)768 | (SecurityProtocolType)3072; // Tls | Tls11 | Tls12
string loginReq = $"{_authPath}?user={_userName}&pw={_passWord}";
string result;
using (WebClient wc = new WebClient()) {
var json = wc.DownloadString(loginReq);
result = json.ToString();
Console.WriteLine(json);
}
return result;
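On targets where the named enum members are available, the numeric casts are not needed and the same line can be written as:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;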
I'm trying to get all the images from a page:
public async Task<PhotoURL> GetImagePortal()
{
strLinkPage = "http://www.propertyguru.com.sg/listing/19077438";
var lstString = new List<string>();
int itotal = default(int);
HttpClient client = new HttpClient();
var doc = new HtmlAgilityPack.HtmlDocument();
string strHtml = await client.GetStringAsync(strLinkPage);
doc.LoadHtml(strHtml);
var pageHtml = doc.DocumentNode;
if (pageHtml != null)
{
var projectRoot = pageHtml.SelectSingleNode("//*[contains(@class,'submain')]");
//var projectChild = projectRoot.SelectSingleNode("div/div[2]");
var imgRoot = projectRoot.SelectSingleNode("//*[contains(@class,'white-bg-padding')]");
var imgChilds = imgRoot.SelectNodes("div[1]/div[1]/ul[1]/li");
itotal = imgChilds.Count();
foreach (var imgItem in imgChilds)
{
string linkImage = imgItem.SelectSingleNode("img").Attributes["src"].Value;
lstString.Add(linkImage);
}
}
return await Task.Run(() => new PhotoURL { total = itotal, URL = lstString });
}
At the line
string strHtml = await client.GetStringAsync(strLinkPage);
I get error 405 Method Not Allowed.
I also tried using WebClient and HttpWebRequest.
Help me, please!
The site requires a User-Agent, and since you are using an HttpClient without any options, the site does not consider it a valid request (without the user agent it does not look like it is coming from a browser).
Try this:
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.110 Safari/537.36");
Or any other user agent string you prefer.
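In GetImagePortal() above, that goes right after the client is created, before the GetStringAsync call:
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.110 Safari/537.36");
string strHtml = await client.GetStringAsync(strLinkPage);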
I need some help with the new Windows.Web.Http.HttpClient Class. I am writing my first WP8.1 App right now and it drives me crazy. I am logging into a website like this:
var values = new Dictionary<string, string>();
values.Add("login_username", _username);
values.Add("login_password", _password);
values.Add("login_lifetime", "36000");
var parameters = new HttpFormUrlEncodedContent(values);
var response = await Forum.Http.PostAsync(new Uri("http://foo.bar.xyz"), parameters);
var buffer = await response.Content.ReadAsBufferAsync();
byte[] byteArray = buffer.ToArray();
string content = Encoding.UTF8.GetString(byteArray, 0, byteArray.Length);
if (content.Contains("Wrong password/user name"))
{
return false;
}
return true;
And this works fine. My HttpClient is a static field, like this:
public static HttpBaseProtocolFilter Filter = new HttpBaseProtocolFilter();
public static HttpClient Http = new HttpClient(Filter);
The login works just fine, but it doesn't save the cookies the website sends after logging in. How can I save them and can I send them to the website on every GetAsync()?
You can use HttpClientHandler instead of HttpBaseProtocolFilter. If you must use HttpBaseProtocolFilter, then there is a read-only CookieManager property of type HttpCookieManager that could help you.
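For the HttpBaseProtocolFilter route, here is a rough sketch of reading the cookies the filter already tracks (assuming the static Filter field from the question; untested):
// Cookies received through the shared HttpClient live in the filter's CookieManager.
var cookieManager = Forum.Filter.CookieManager;
foreach (Windows.Web.Http.HttpCookie cookie in cookieManager.GetCookies(new Uri("http://foo.bar.xyz")))
{
    System.Diagnostics.Debug.WriteLine(cookie.Name + "=" + cookie.Value);
}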
Here's an example using HttpClientHandler:
public static CookieContainer Cookies = new CookieContainer();
public static HttpClientHandler HttpClientHandler = new HttpClientHandler() { CookieContainer = Cookies };
public static HttpClient Http = new HttpClient(HttpClientHandler);
After your PostAsync() call returns, you can extract the cookies
var uri = new Uri("http://foo.bar.xyz");
var response = await Forum.Http.PostAsync(uri, parameters);
IEnumerable<Cookie> responseCookies = Cookies.GetCookies(uri).Cast<Cookie>();
foreach (Cookie cookie in responseCookies) {
Console.WriteLine(cookie.Name + ": " + cookie.Value);
}
If you'd like to re-use a cookie from the initial request, you can create your own CookieContainer and copy the cookie over from the response cookies. Or you could add a hard-coded cookie like this (note it has to be a System.Net.Cookie, and Domain is a host name, not a URL):
Cookies.Add(new Cookie("Name", "Value") { Domain = "foo.bar.xyz" });
Hey guys, I'm trying to write a C# application in which the user can log in to their Instagram account from a WPF app. The problem I'm having is getting the authorization code. When I use this code, I keep getting the login page URL back, not the successful login page.
Help please!
Any feedback is appreciated! I've been stuck on this for a while.
private static AuthInfo GetInstagramAuth(string oAuthUri, string clientId, string redirectUri, InstagramConfig config,
string login, string password)
{
List<Auth.Scope> scopes = new List<Auth.Scope>();
scopes.Add(Auth.Scope.basic);
var link = InstaSharp.Auth.AuthLink(oAuthUri, clientId, redirectUri, scopes);
// Log in at the specified endpoint
CookieAwareWebClient client = new CookieAwareWebClient();
// Load the login page
var result = client.DownloadData(link);
var html = System.Text.Encoding.Default.GetString(result);
// Grab the CSRF token
string csr = "";
string pattern = @"csrfmiddlewaretoken""\svalue=""(.+)""";
var r = new System.Text.RegularExpressions.Regex(pattern);
var m = r.Match(html);
csr = m.Groups[1].Value;
// Log in
string loginLink = string.Format(
"https://instagram.com/accounts/login/?next=/oauth/authorize/%3Fclient_id%3D{0}%26redirect_uri%3Dhttp%3A//kakveselo.ru%26response_type%3Dcode%26scope%3Dbasic", clientId);
NameValueCollection parameters = new NameValueCollection();
parameters.Add("csrfmiddlewaretoken", csr);
parameters.Add("username", login);
parameters.Add("password", password);
// Need to add the secret cookies obtained before login
// Headers seem to be required too
string agent = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)";
client.Headers["Referer"] = loginLink;
client.Headers["Host"] = "instagram.com";
//client.Headers["Connection"] = "Keep-Alive";
client.Headers["Content-Type"] = "application/x-www-form-urlencoded";
//client.Headers["Content-Length"] = "88";
client.Headers["User-Agent"] = agent;
// client.Headers["Accept-Language"] = "ru-RU";
//client.Headers["Accept-Encoding"] = "gzip, deflate";
client.Headers["Accept"] = "text/html, application/xhtml+xml, */*";
client.Headers["Cache-Control"] = "no-cache";
// Send the request
var result2 = client.UploadValues(loginLink, "POST", parameters);
// Post the data and get the code
// The new link points to instagram, not the API
string newPostLink = string.Format(
"https://instagram.com/oauth/authorize/?client_id={0}&redirect_uri=http://kakveselo.ru&response_type=code&scope=basic", clientId);
HttpWebRequest request =
(HttpWebRequest) WebRequest.Create(newPostLink);
request.AllowAutoRedirect = false;
request.CookieContainer = client.Cookies;
request.Referer = newPostLink;
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.UserAgent = agent;
string postData = String.Format("csrfmiddlewaretoken={0}&allow=Authorize", csr);
request.ContentLength = postData.Length;
ASCIIEncoding encoding = new ASCIIEncoding();
byte[] loginDataBytes = encoding.GetBytes(postData);
request.ContentLength = loginDataBytes.Length;
Stream stream = request.GetRequestStream();
stream.Write(loginDataBytes, 0, loginDataBytes.Length);
// send the request
var response = request.GetResponse();
string location = response.Headers["Location"];
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine("--Responce from the webrequest--");
Console.ResetColor();
Console.WriteLine(((HttpWebResponse)response).ResponseUri+"\n\n");
// Now extract the code and obtain authentication
pattern = @"kakveselo.ru\?code=(.+)";
r = new System.Text.RegularExpressions.Regex(pattern);
m = r.Match(location);
string code = m.Groups[1].Value;
// Finally, get the authentication token
var auth = new InstaSharp.Auth(config); //.OAuth(InstaSharpConfig.config);
// now we have to call back to instagram and include the code they gave us
// along with our client secret
var oauthResponse = auth.RequestToken(code);
return oauthResponse;
}
}
I was using this website as an example and CookieAwareWebClient is just a WebClient that handles Cookies. I'll post it below:
using System;
using System.Net;
/// <summary>
/// A Cookie-aware WebClient that will store authentication cookie information and persist it through subsequent requests.
/// </summary>
public class CookieAwareWebClient : WebClient
{
//Properties to handle implementing a timeout
private int? _timeout = null;
public int? Timeout
{
get
{
return _timeout;
}
set
{
_timeout = value;
}
}
//A CookieContainer class to house the Cookie once it is contained within one of the Requests
public CookieContainer Cookies { get; private set; }
//Constructor
public CookieAwareWebClient()
{
Cookies = new CookieContainer();
}
//Method to handle setting the optional timeout (in milliseconds)
public void SetTimeout(int timeout)
{
_timeout = timeout;
}
//This handles using and storing the Cookie information as well as managing the Request timeout
protected override WebRequest GetWebRequest(Uri address)
{
//Handles the CookieContainer
var request = (HttpWebRequest)base.GetWebRequest(address);
request.CookieContainer = Cookies;
//Sets the Timeout if it exists
if (_timeout.HasValue)
{
request.Timeout = _timeout.Value;
}
return request;
}
}
Are you sure the login process on the website doesn't use JavaScript in some step(s)?
As far as I'm aware, if that's the case, web requests won't do the job.
Any data or actions that are JavaScript-related will simply not exist through mere web requests.
I've noticed that, for security reasons, websites with personal accounts now tend to mix JavaScript into their login process to block bot requests.
Okay, so I figured out the issue. If you want to use web requests and web responses, you need to make sure the header information is correct. The issue with mine was that I wasn't passing enough of the information the browser sends. To see this information I used Tamper Data.
It's an add-on for Firefox that lets you look at everything you are sending to or receiving from the server.