I am trying to fake a POST request to a site built with C# (ASP.NET).
I used Wireshark to sniff the communication between my computer and the server.
I noticed that the client sends ViewState data (encoded in Base64), and I would like to know how to fake it in my request.
My POST code:
public static void sendPostRequest(string responseUri, CookieCollection responseCookies)
{
    HttpWebRequest mPostRequest =
        (HttpWebRequest)WebRequest.Create("http://tickets.cinema-city.co.il/webtixsnetglilot/SelectSeatPage2.aspx?dtticks=" + responseUri + "&hideBackButton=1");
    mPostRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.95 Safari/537.36";
    mPostRequest.KeepAlive = false;
    mPostRequest.Method = "POST";
    mPostRequest.ContentType = "application/x-www-form-urlencoded";

    // Forward the cookies that came back with the earlier response.
    CookieContainer mCookies = new CookieContainer();
    foreach (Cookie cookie in responseCookies)
    {
        mCookies.Add(cookie);
    }
    mPostRequest.CookieContainer = mCookies;

    // Note: no request body is written yet; the form fields (including the view state) still need to be sent.
    HttpWebResponse myHttpWebResponse2 = (HttpWebResponse)mPostRequest.GetResponse();
}
If you can "fake" signed/encrypted data you don't really need to deal with fake posts - just steal all SSL traffic :).
View state comes in original response for the page encrypted - so you simply need to parse original response (use Html Agility Pack) and send that view state back in post request.
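For illustration, a minimal sketch of that idea, assuming the page uses the standard ASP.NET hidden fields (__VIEWSTATE and, typically, __EVENTVALIDATION) and that originalResponseHtml holds the HTML of the earlier GET response:
// Sketch: extract the hidden ASP.NET fields from the GET response
// and echo them back in the POST body of mPostRequest.
// Field names here are the standard ASP.NET ones; adjust to what the page actually contains.
var doc = new HtmlAgilityPack.HtmlDocument();
doc.LoadHtml(originalResponseHtml);

string viewState = doc.DocumentNode
    .SelectSingleNode("//input[@name='__VIEWSTATE']")
    ?.GetAttributeValue("value", "");
string eventValidation = doc.DocumentNode
    .SelectSingleNode("//input[@name='__EVENTVALIDATION']")
    ?.GetAttributeValue("value", "");

string postBody = "__VIEWSTATE=" + Uri.EscapeDataString(viewState ?? "")
                + "&__EVENTVALIDATION=" + Uri.EscapeDataString(eventValidation ?? "");

byte[] bodyBytes = Encoding.UTF8.GetBytes(postBody);
mPostRequest.ContentLength = bodyBytes.Length;
using (Stream requestStream = mPostRequest.GetRequestStream())
{
    requestStream.Write(bodyBytes, 0, bodyBytes.Length);
}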
Related
I am trying to download the HTML from a site and parse it. I am actually interested only in the OpenGraph data in the head section. For most sites, using WebClient, HttpClient, or HtmlAgilityPack works, but for some domains I get a 403, for example: westelm.com
I have tried setting the headers to be exactly the same as they are when I use the browser, but I still get a 403. Here is some code:
string url = "https://www.westelm.com/m/products/brushed-herringbone-throw-t5792/?";
var doc = new HtmlDocument();
using(WebClient client = new WebClient()) {
client.Headers["User-Agent"] = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.102 Safari/537.36";
client.Headers["Accept"] = "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9";
client.Headers["Accept-Encoding"] = "gzip, deflate, br";
client.Headers["Accept-Language"] = "en-US,en;q=0.9";
doc.Load(client.OpenRead(url));
}
At this point, I am getting a 403.
Am I missing something, or is the site administrator protecting the site from API requests?
How can I make this work? Is there a better way to get OpenGraph data from a site?
Thanks.
I used your question to resolve the same problem. I don't know if you have already fixed this, but I'll tell you how it worked for me.
A page was giving me a 403 for the same reason. The thing is: you need to emulate a "web browser" from the code by sending a lot of headers.
I used one of your headers that I wasn't sending before (such as Accept-Language).
I didn't use WebClient though; I used HttpClient to download the page:
private static async Task<string> GetHtmlResponseAsync(HttpClient httpClient, string url)
{
    using var request = new HttpRequestMessage(HttpMethod.Get, new Uri(url));
    request.Headers.TryAddWithoutValidation("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9");
    request.Headers.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate, br");
    request.Headers.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.102 Safari/537.36");
    request.Headers.TryAddWithoutValidation("Accept-Charset", "UTF-8");
    request.Headers.TryAddWithoutValidation("Accept-Language", "en-US,en;q=0.9");

    using var response = await httpClient.SendAsync(request).ConfigureAwait(false);
    if (response == null)
        return string.Empty;

    // The body is decompressed manually here, assuming the server picked gzip
    // from the Accept-Encoding list above.
    using var responseStream = await response.Content.ReadAsStreamAsync().ConfigureAwait(false);
    using var decompressedStream = new GZipStream(responseStream, CompressionMode.Decompress);
    using var streamReader = new StreamReader(decompressedStream);
    return await streamReader.ReadToEndAsync().ConfigureAwait(false);
}
If it helps you, I'm glad. If not, I will leave this answer here to help someone else in the future!
I am trying to access a specific page on the site and pull information out of it.
I did a GET request to the homepage and got a response with status code OK.
Then I did another GET request to the page that contains the JSON I want to retrieve, and that response status code was also OK.
Now I want to retrieve the information, so I do a GET request for the resource (another URL that the last page loads).
And I get the error at this line:
HttpWebResponse oHttpResponseIndicesApiUrl = (HttpWebResponse)oHttpRequestIndicesApiUrl.GetResponse();
"Content-Length or Chunked Encoding cannot be set for an operation that does not write data"
I set all the headers just like the GET request shown in Chrome's Inspect -> Network tab (selecting the URL I want shows the GET request headers there).
This is the code that I run:
HttpWebRequest oHttpRequestIndicesApiUrl = (HttpWebRequest)WebRequest.Create(sIndicesApiURL);
LOG.DebugFormat("{0}:calculateIndexSecurityWeights(), Create get request to '{1}'", Name, sIndicesApiURL);

// Forward the cookies from the previous response.
oHttpRequestIndicesApiUrl.CookieContainer = new CookieContainer();
foreach (Cookie oCookie in oHttpResponseIndicesParmsUrl.Cookies)
{
    oHttpRequestIndicesApiUrl.CookieContainer.Add(oCookie);
}

oHttpRequestIndicesApiUrl.AllowAutoRedirect = false;
oHttpRequestIndicesApiUrl.Accept = "application/json, text/plain, */*";
oHttpRequestIndicesApiUrl.Headers.Add("accept-encoding", "gzip, deflate, br");
oHttpRequestIndicesApiUrl.Headers.Add("accept-language", "he-IL");
oHttpRequestIndicesApiUrl.KeepAlive = true;

// Request body metadata, although this is a GET request and no body is written.
oHttpRequestIndicesApiUrl.ContentLength = 120;
oHttpRequestIndicesApiUrl.ContentType = "application/json;charset=UTF-8";

oHttpRequestIndicesApiUrl.Host = "api.tase.co.il";
oHttpRequestIndicesApiUrl.Headers.Add("origin", "https://www.tase.co.il");
oHttpRequestIndicesApiUrl.Referer = sIndicesParamsURL;
oHttpRequestIndicesApiUrl.Headers.Add("sec-fetch-mode", "cors");
oHttpRequestIndicesApiUrl.Headers.Add("sec-fetch-site", "same-site");
oHttpRequestIndicesApiUrl.Headers.Add("upgrade-insecure-requests", "1");
oHttpRequestIndicesApiUrl.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36";
LOG.DebugFormat("{0}:calculateIndexSecurityWeights(), Set headers to '{1}'", Name, sIndicesApiURL);

HttpWebResponse oHttpResponseIndicesApiUrl = (HttpWebResponse)oHttpRequestIndicesApiUrl.GetResponse();
if (oHttpResponseIndicesApiUrl.StatusCode != HttpStatusCode.OK)
{
    // response failed
    throw new ApplicationException(string.Format("get response from url '{0}' failed, Status Code: '{1}', Status Description '{2}'", sIndicesApiURL, oHttpResponseIndicesApiUrl.StatusCode, oHttpResponseIndicesApiUrl.StatusDescription));
}
I can't understand why this is happening.
I'm trying to automate a WebSocket service that denies the connection unless you send a User-Agent with the CONNECT request.
I tried sending the upgrade request with HttpWebRequest and setting the User-Agent via the UserAgent property.
Using Fiddler to debug the request, this is what was sent out:
CONNECT *.*.com:443 HTTP/1.1
Host: *.*.com:443
Connection: keep-alive
How do I add the User-Agent string to the CONNECT request and then upgrade to using WebSocket protocol?
My code so far:
public void Login(Action onEnd = null) {
    var req = CreateUpgradeRequest();
    var res = GetResponse(req);
}

private HttpWebRequest CreateUpgradeRequest() {
    HttpWebRequest request = WebRequest.Create("https://lobby35.runescape.com/") as HttpWebRequest;
    request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36";
    request.Connection = "Upgrade";
    SetWebSocketHeader(request, "Key", "5LENZfSifyj/Rw1ghTvpgw==");
    SetWebSocketHeader(request, "Version", "13");
    SetWebSocketHeader(request, "Extensions", "permessage-deflate; client_max_window_bits");
    SetWebSocketHeader(request, "Protocol", "jagex");
    return request;
}
You cannot use WebRequest to create a WebSocket connection. You will need ClientWebSocket and ClientWebSocket.Options.SetRequestHeader.
Note, you may have issues adding that header: Setting "User-Agent" HTTP header in ClientWebSocket
Update: Since you cannot add that header with ClientWebSocket, try WebSocket4Net.
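For what it's worth, the ClientWebSocket route looks roughly like this (a minimal sketch; the URL and subprotocol come from the question, and whether the User-Agent header is accepted depends on the runtime - older .NET Framework versions reject it as a restricted header, which is what the linked question is about):
using System;
using System.Net.WebSockets;
using System.Threading;

// Minimal sketch: connect with ClientWebSocket and try to set the User-Agent header.
var socket = new ClientWebSocket();
socket.Options.AddSubProtocol("jagex");
// May throw on runtimes that treat User-Agent as a restricted header.
socket.Options.SetRequestHeader("User-Agent",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36");

await socket.ConnectAsync(new Uri("wss://lobby35.runescape.com/"), CancellationToken.None);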
I have a website with the web service active (PrestaShop).
This site requires authentication.
I use this code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.107 Safari/537.36";
request.Method = "GET";
request.Credentials = new NetworkCredential("key", "");
request.PreAuthenticate = true;
//request.Connection
request.Host = "localhost";
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream dataStream = response.GetResponseStream();
StreamReader reader = new StreamReader(dataStream);
String R = reader.ReadToEnd();
The code works, but my problem is that there is a login form for the webservice.
In fact, the HttpWebRequest object sends two requests:
the first response is "not authorized", while the second has OK status.
I used the Fiddler web debugger.
I apologize for my English.
If the form is submitted using the GET method, you must pass the form parameters in the URL query string, for instance http://url?username={0}&pass={1}. If it uses the POST method, you must pass the form data in the HTTP request body (there are a lot of examples of this on Stack Overflow). You also have to handle the cookies, which is done with a CookieContainer. In the first request, initialize the container:
request.CookieContainer = new CookieContainer();
When the request comes back with OK status, the cookies will be in response.Cookies, which is a CookieCollection instance. Later, for further requests, you must pass these cookies in order to retrieve the correct data:
request.CookieContainer = new CookieContainer();
request.CookieContainer.Add(userCookies);
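As a rough illustration of the POST case (a sketch only - the login URL and form field names here are hypothetical placeholders, so check the actual form with Fiddler):
// Hypothetical login POST - URL and field names must match the real form.
var loginRequest = (HttpWebRequest)WebRequest.Create("http://yoursite/login");
loginRequest.Method = "POST";
loginRequest.ContentType = "application/x-www-form-urlencoded";
loginRequest.CookieContainer = new CookieContainer();

byte[] body = Encoding.UTF8.GetBytes("username=myUser&pass=myPass");
loginRequest.ContentLength = body.Length;
using (var stream = loginRequest.GetRequestStream())
{
    stream.Write(body, 0, body.Length);
}

using (var loginResponse = (HttpWebResponse)loginRequest.GetResponse())
{
    // Keep these cookies and add them to the CookieContainer of later requests.
    CookieCollection userCookies = loginResponse.Cookies;
}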
Hope it helps!
Is it possible to read an image attachment from System.Net.HttpWebResponse?
I have a URL to a Java page which generates images.
When I open the URL in Firefox, the download dialog appears. The content type is application/png.
Seems to work.
When I try this in C# and make a GET request, I retrieve the content type text/html and no Content-Disposition header.
Simple code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.GetResponseStream() is empty.
A try with Java was successful.
Do I have to prepare the WebRequest differently, or is it something else?
You probably need to set a User-Agent header.
Run Fiddler and compare the requests.
Writing something in the UserAgent property of the HttpWebRequest does indeed make a difference in a lot of cases. A common practice for web services seems to be to ignore requests with an empty UserAgent.
See: Webmasters: Interpretation of empty User-agent
Simply set the UserAgent property to a non-empty string. You can, for example, use the name of your application, assembly information, impersonate a common UserAgent, or some other identifying string.
Examples:
request.UserAgent = "my example program v1";
request.UserAgent = $"{System.Reflection.Assembly.GetExecutingAssembly().GetName().Name.ToString()} v{System.Reflection.Assembly.GetExecutingAssembly().GetName().Version.ToString()}";
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36";
And just to give a full working example:
using System.IO;
using System.Net;

void DownloadFile(Uri uri, string filename)
{
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uri);
    request.Timeout = 10000;
    request.Method = "GET";
    request.UserAgent = "my example program v1";

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        using (Stream receiveStream = response.GetResponseStream())
        {
            using (FileStream fileStream = File.Create(filename))
            {
                receiveStream.CopyTo(fileStream);
            }
        }
    }
}
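Hypothetical usage (placeholder URL and output path):
// Example call - replace with the actual image URL and a writable path.
DownloadFile(new Uri("https://example.com/generated/image.png"), @"C:\temp\image.png");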