The POST request's headers, as captured from the browser:
Accept:application/json, text/javascript, */*; q=0.01
Accept-Charset:ISO-8859-9,utf-8;q=0.7,*;q=0.3
Accept-Encoding:gzip,deflate,sdch
Accept-Language:tr-TR,tr;q=0.8,en-US;q=0.6,en;q=0.4
Connection:keep-alive
Content-Length:0
Cookie:pfu=32904422; pfp=PO7PkdBDUwKoMG4FqkriwDLF7jrwcHBEoVqnX2i3; pfe=1386687638;
logged_in=1; tmgioct=5hRBmncU3JQtInFOSa4qqoHX
Host:www.tumblr.com
Origin:http://www.tumblr.com
Referer:http://www.tumblr.com/customize/hayirasla?redirect_to=http%3A%2F%2Fhayirasla.tumblr.com%2F
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.95 Safari/537.11
X-Requested-With:XMLHttpRequest
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "POST";
request.Headers.Add("Origin", "http://www.tumblr.com");
request.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.95 Safari/537.11"); // Throws: "This header must be modified using the appropriate property or method."
request.Headers.Add(HttpRequestHeader.Cookie, "pfu=32904422;pfe=1386687638;pfp=PO7PkdBDUwKoMG4FqkriwDLF7jrwcHBEoVqnX2i3;logged_in=1;");
return new StreamReader(((HttpWebResponse)request.GetResponse()).GetResponseStream()).ReadToEnd();
You cannot add certain headers via HttpWebRequest.Headers.Add(); they are deliberately restricted. Internally, the framework keeps a list of them:
private static readonly string[] RestrictedHeaders = new[]
{
"Accept", "Connection", "Content-Type", "Content-Length", "Date", "Expect", "Host", "Range", "Referer", "User-Agent"
};
None of the headers listed above may be added directly; each one can only be set through its dedicated property:
// HttpWebRequest.[TheProperty] = value;  e.g. request.UserAgent = "...";
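Applied to the question's code, a corrected sketch might look like the following. The header and cookie values are copied from the question (and are surely stale by now), and the target URL stays a parameter because the original never shows it:

```csharp
using System.Net;

static class TumblrRequestSketch
{
    // Builds the POST request, setting restricted headers through their
    // dedicated properties and the remaining ones through Headers.Add().
    public static HttpWebRequest Build(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentLength = 0; // the captured request had Content-Length: 0
        request.Accept = "application/json, text/javascript, */*; q=0.01";
        request.Referer = "http://www.tumblr.com/customize/hayirasla?redirect_to=http%3A%2F%2Fhayirasla.tumblr.com%2F";
        request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.95 Safari/537.11";
        // These are not on the restricted list, so Add() works for them:
        request.Headers.Add("Origin", "http://www.tumblr.com");
        request.Headers.Add(HttpRequestHeader.Cookie, "pfu=32904422;pfe=1386687638;pfp=PO7PkdBDUwKoMG4FqkriwDLF7jrwcHBEoVqnX2i3;logged_in=1;");
        request.Headers.Add("X-Requested-With", "XMLHttpRequest");
        return request;
    }
}
```

Reading the response then works exactly as in the question: `new StreamReader(((HttpWebResponse)request.GetResponse()).GetResponseStream()).ReadToEnd()`.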
Also, did you forget to fill in the POST data? From what I can see, you're sending a request that has only headers and no body.
How can I download OSM tiles? I specified various request headers, but the site always returns 0 bytes.
using (WebClient client = new WebClient())
{
    client.Headers.Add("method", "GET");
    client.Headers.Add("scheme", "https");
    client.Headers.Add(HttpRequestHeader.AcceptLanguage, "en-US");
    client.Headers.Add("sec-ch-ua", "\"Chromium\";v=\"88\", \"Google Chrome\";v=\"88\", \"; Not A Brand\";v=\"99\"");
    client.Headers.Add(HttpRequestHeader.Referer, "https://www.openstreetmap.org/");
    client.Headers.Add("sec-fetch-dest", "document");
    client.Headers.Add("sec-fetch-mode", "navigate");
    client.Headers.Add("sec-fetch-site", "same-site");
    client.Headers.Add("sec-fetch-user", "?1");
    client.Headers.Add("upgrade-insecure-requests", "1");
    client.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.190 Safari/537.36");
    byte[] data = client.DownloadData("https://tile.openstreetmap.org/0/0/0.png");
}
Update:
client.Headers.Add("sec-ch-ua", "\" Not A; Brand\";v=\"99\", \"Chromium\";v=\"101\", \"Microsoft Edge\";v=\"101\"");
client.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.64 Safari/537.36 Edg/101.0.1210.47");
These values from my browser don't work either.
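For what it's worth, the OpenStreetMap tile usage policy asks clients to send a User-Agent that identifies the application itself, and warns that faking another application's (or a browser's) User-Agent can get you blocked, so impersonating Chrome may be exactly why the server returns nothing. A minimal sketch that identifies itself honestly (the app name and contact URL here are made-up placeholders):

```csharp
using System.Net;

static class OsmTileSketch
{
    public static WebClient CreateClient()
    {
        var client = new WebClient();
        // Identify the application instead of impersonating a browser;
        // "MyTileViewer/1.0" and the contact URL are placeholders.
        client.Headers.Add(HttpRequestHeader.UserAgent, "MyTileViewer/1.0 (+https://example.com/contact)");
        return client;
    }
}

// Usage (hits the network):
// byte[] tile = OsmTileSketch.CreateClient().DownloadData("https://tile.openstreetmap.org/0/0/0.png");
```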
I'm learning RestSharp, but I'm stuck on getting the cookie header string for a client. Here is my code:
var cookieJar = new CookieContainer();
var client = new RestClient("https://server.com")
{
UserAgent =
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36",
};
client.CookieContainer = cookieJar;
var request = new RestRequest(Method.GET);
var cookie = client.CookieContainer.GetCookieHeader(new Uri("https://server.com"));
MessageBox.Show(""+cookie);
The cookie always comes back empty. Can anyone help me?
This will set the cookie for your client; after that, all you need to do is call client.Execute. The code is C#, but I'm pretty sure you can adapt it to anything else.
string myUserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36";
var myUri = new Uri("https://server.com");
client.CookieContainer.Add(new Cookie("UserAgent", myUserAgent) { Domain = myUri.Host });
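It is also worth noting why the original snippet prints nothing: a CookieContainer only holds cookies after one has been added manually (as above) or after a response carrying a Set-Cookie header has been processed, and the question's code never actually sends the request. A sketch of the fix, reusing the question's `client` and `request` variables:

```csharp
// Cookies appear in the container only after a response has been
// processed, so execute the request before reading the cookie header.
client.Execute(request);
var cookie = client.CookieContainer.GetCookieHeader(new Uri("https://server.com"));
```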
It is impossible to register. The server writes:
{\"bSuccess\":false,\"details\":\"captcha data missing!\"}
I checked that I entered the CAPTCHA correctly, and the request also looks right. What's wrong?
P.S. I'm using xNet.
public bool accaunt_create(AccauntData ad)
{
HttpRequest httpRequest = new HttpRequest();
httpRequest.AllowAutoRedirect = true;
httpRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.85 Safari/537.36 OPR/32.0.1948.38";
httpRequest.Cookies = cook;
httpRequest.AddParam("accountname", ad.accountname);
httpRequest.AddParam("password", ad.password);
httpRequest.AddParam("email", ad.email);
httpRequest.AddParam("captchagid", captchaID);
httpRequest.AddHeader("captcha_text", ad.captcha_text);
httpRequest.AddParam("i_agree", "1");
httpRequest.AddParam("ticket", "");
httpRequest.AddParam("count", "4");
httpRequest.AddHeader("X-Requested-With", "XMLHttpRequest");
httpRequest.AddHeader("X-Prototype-Version", "1.7");
httpRequest.AddHeader("Accept", "text/javascript, text/html, application/xml, text/xml, */*");
var res = httpRequest.Post("https://store.steampowered.com/join/createaccount/").ToString();
return false;
}
Maybe this:
httpRequest.AddParam("captchagid", captchaID);
has to be this:
httpRequest.AddParam("captchaid", captchaID);
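Another thing worth checking (my own guess, not confirmed in the thread): the original code sends captcha_text with AddHeader while every other form field goes through AddParam, so the server may never see the CAPTCHA text in the POST body at all. The sketch below swaps that one call:

```csharp
// Guess: send the CAPTCHA text as a form parameter like the other
// fields, rather than as an HTTP header.
httpRequest.AddParam("captcha_text", ad.captcha_text);
```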
I have the following image. When it is opened in any browser, in either normal or private mode, it loads perfectly,
but requesting it with HttpWebRequest & HttpWebResponse produces the following error.
Here is my code:
/// <summary>
/// Downloads the image at the given URL, sending browser-like request headers.
/// </summary>
/// <param name="imageUrl">Absolute URL of the image to download.</param>
/// <returns>The downloaded image.</returns>
private static Image GetImage(string imageUrl)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(imageUrl);
request.Method = "GET";
request.KeepAlive = true;
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
request.Host = "www.google.com";
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.107 Safari/537.36";
var buffer = new MemoryStream();
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
{
    stream.CopyTo(buffer); // copy so the Image is not tied to the network stream
}
buffer.Position = 0; // rewind before decoding
return Image.FromStream(buffer); // GDI+ needs the backing stream kept alive
}
And here is the Raw HttpRequest from Fiddler:
GET https://www.google.com/recaptcha/api/image?c=03AHJ_VusG9FppmdZqKUPjxoVUV4290vtoCQDM8jw9BROEI396sx7J4YYxUBab1zEQg2T94n19CiW2ahGLAtGm5H7nEgckTtJACFcXGZg_frhYIsBAg8VaY5JAbP4oJCx3cmknqltFsYeDUrdv0ZWbFjcMtYuZ0FQnh3blc5ZEx1oJuWJpLzPy0WhhZUIgJ_EXpiC5w-pvt00P HTTP/1.1
Host: www.google.com
Connection: keep-alive
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.107 Safari/537.36
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8,ar;q=0.6
Can anybody help me download this image successfully? Thanks.
The question is: how do I construct an HttpWebRequest so the queried server thinks it comes from a browser?
You could set the User-Agent HTTP request header.
var request = (HttpWebRequest)WebRequest.Create("http://www.google.com");
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
or if you work with a WebClient:
using (var client = new WebClient())
{
client.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
...
}
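On newer .NET, HttpClient is the usual replacement for both HttpWebRequest and WebClient; a sketch of the same User-Agent trick (reusing the UA string from the answer above):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class UserAgentSketch
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // ParseAdd accepts a full User-Agent string and validates it.
            client.DefaultRequestHeaders.UserAgent.ParseAdd(
                "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2");
            string html = await client.GetStringAsync("http://www.google.com");
            Console.WriteLine(html.Length);
        }
    }
}
```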