C# WebClient returns 403 Forbidden

I am trying to emulate the process of accepting a trade offer in Steam. I have asked Steam support and they confirmed that this action is allowed as long as I do not disrupt their service to other players.
So here are the details:
The URL for accepting a trade offer is https://steamcommunity.com/tradeoffer/OfferID/accept
Here is their AJAX code for doing so:
return $J.ajax(
    {
        url: 'https://steamcommunity.com/tradeoffer/' + nTradeOfferID + '/accept',
        data: rgParams,
        type: 'POST',
        crossDomain: true,
        xhrFields: { withCredentials: true }
    }
);
Here are the headers I tracked using IE10:
Request POST /tradeoffer/xxxxxxx/accept HTTP/1.1
Accept */*
Content-Type application/x-www-form-urlencoded; charset=UTF-8
Referer http://steamcommunity.com/tradeoffer/xxxxxxx/
Accept-Language en-CA
Origin http://steamcommunity.com
Accept-Encoding gzip, deflate
User-Agent Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)
Host steamcommunity.com
Content-Length 51
DNT 1
Connection Keep-Alive
Cache-Control no-cache
The post body:
sessionid=SESSIONID&tradeofferid=OfferID
Cookie:
Sent sessionid SessionID
Sent __utmc XXXXX
Sent steamLogin XXXXX
Sent webTradeEligibility XXXXXXX
Sent Steam_Language XXXXXXX
Sent timezoneOffset XXXXXXXX
Sent __utma XXXXXXXXXXXXX
Sent __utmz XXXXXXXXXXXXX
Sent steamMachineAuth XXXXXXXXXXXXX
Sent strInventoryLastContext XXXXXXXXX
Sent steamRememberLogin XXXXXXXXXXXX
Sent steamCC_XXXXXXXXXXXX XXXXXXX
Sent __utmb XXXXXXX
Sent tsTradeOffersLastRead XXXXXXX
The initiator of the request is XMLHttpRequest.
In my code I did:
public bool AcceptOffer(string offerID)
{
    string path = "tradeoffer/" + offerID + "/";
    // Simulate the browser opening the trade offer window
    _steamWeb.Get(new Uri(WebAPI.SteamCommunity + path));
    NameValueCollection data = new NameValueCollection();
    data.Add("sessionid", _steamWeb.SessionID);
    data.Add("tradeofferid", offerID);
    string result = _steamWeb.Post(new Uri("https://steamcommunity.com/" + path + "accept"), data);
    return true;
}
_steamWeb contains a cookie-aware WebClient which is used to do all the POST/GET requests.
Here are parts of the code for the cookie-aware WebClient:
protected override WebRequest GetWebRequest(Uri address)
{
    HttpWebRequest request = base.GetWebRequest(address) as HttpWebRequest;
    if (request != null)
    {
        request.CookieContainer = _cookieContainer;
        if (_lastPage != null)
            request.Referer = _lastPage;
    }
    _lastPage = address.ToString();
    return request;
}
protected override WebResponse GetWebResponse(WebRequest request)
{
    WebResponse response = base.GetWebResponse(request); // 403 exception here
    ReadCookies(response);
    return response;
}
Here are the headers I am setting:
void SetCommonHeaders(Uri uri)
{
    _webClient.Headers[HttpRequestHeader.Accept] = "text/html, application/xhtml+xml, */*";
    _webClient.Headers[HttpRequestHeader.AcceptLanguage] = "en-CA";
    _webClient.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded; charset=UTF-8";
    _webClient.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
    _webClient.Headers[HttpRequestHeader.Host] = uri.Host;
    _webClient.Headers.Add("DNT", "1");
}
Here are the cookie headers of the request I am sending:
sessionid=XXXX;
steamMachineAuthXXXXX=XXXXXX;
steamLogin=XXXXXXX;
steamRememberLogin=XXXXXXXX;
Steam_Language=english;
webTradeEligibility=XXXXXXXXX;
steamCC_XXXXX=CA;
tsTradeOffersLastRead=XXXXXXXXX
I did not set those cookies manually; all of them were obtained via GET requests to steamcommunity.com.
I am sending pretty much the same request as the browser, but my POST gets 403 Forbidden. I have tried setting the X-Requested-With: XMLHttpRequest header, but it does not help. I see they are doing something with credentials (withCredentials) in the AJAX call, so am I supposed to do something similar in my HttpWebRequest posts? Thanks

Problem solved. There were two issues:
The Keep-Alive header was not being sent properly due to a .NET bug.
I had encoded the sessionid twice.
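A minimal sketch of the two fixes, for anyone hitting the same wall. The class name, field names, and sessionIdFromCookie are placeholders, not anything from Steam's or .NET's API beyond WebClient/HttpWebRequest themselves:

```csharp
using System;
using System.Collections.Specialized;
using System.Net;
using System.Web; // for HttpUtility

class CookieAwareWebClient : WebClient
{
    private readonly CookieContainer _cookieContainer = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = base.GetWebRequest(address) as HttpWebRequest;
        if (request != null)
        {
            request.CookieContainer = _cookieContainer;
            // Fix 1: set Keep-Alive via the property. Writing a Connection
            // header by hand is what trips the .NET quirk mentioned above.
            request.KeepAlive = true;
        }
        return request;
    }
}

class Example
{
    static NameValueCollection BuildAcceptBody(string sessionIdFromCookie, string offerID)
    {
        // Fix 2: the sessionid cookie value is already URL-encoded, and
        // posting it through UploadValues encodes it a second time. Decode
        // it once here so it is only encoded once on the wire.
        return new NameValueCollection
        {
            { "sessionid", HttpUtility.UrlDecode(sessionIdFromCookie) },
            { "tradeofferid", offerID }
        };
    }
}
```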

Related

Content-Length or Chunked Encoding cannot be set for an operation that does not write data: what should I do?

I am trying to access a specific page on a site and pull information out of it.
I did a GET request to the homepage and got response status code == OK.
Then I did another GET request to the page that contains the JSON I want to retrieve, and the response status code == OK.
Now I want to retrieve the information, so I do a GET request for the resource (another URL that the last page loads).
And I get the error at this line:
HttpWebResponse oHttpResponseIndicesApiUrl = (HttpWebResponse)oHttpRequestIndicesApiUrl.GetResponse();
"Content-Length or Chunked Encoding cannot be set for an operation
that does not write data"
I set all the headers just like the GET request shown in Chrome's Inspect -> Network tab -> the URL I want (there I can see the GET request headers).
This is the code that I run:
HttpWebRequest oHttpRequestIndicesApiUrl = (HttpWebRequest)WebRequest.Create(sIndicesApiURL);
LOG.DebugFormat("{0}:calculateIndexSecurityWeights(), Create get request to '{0}'", Name, sIndicesApiURL);
oHttpRequestIndicesApiUrl.CookieContainer = new CookieContainer();
foreach (Cookie oCookie in oHttpResponseIndicesParmsUrl.Cookies)
{
    oHttpRequestIndicesApiUrl.CookieContainer.Add(oCookie);
}
oHttpRequestIndicesApiUrl.AllowAutoRedirect = false;
oHttpRequestIndicesApiUrl.Accept = ("application/json, text/plain, */*");
oHttpRequestIndicesApiUrl.Headers.Add("accept-encoding", "gzip, deflate, br");
oHttpRequestIndicesApiUrl.Headers.Add("accept-language", "he-IL");
oHttpRequestIndicesApiUrl.KeepAlive = true;
oHttpRequestIndicesApiUrl.ContentLength = 120;
oHttpRequestIndicesApiUrl.ContentType = "application/json;charset=UTF-8";
oHttpRequestIndicesApiUrl.Host = "api.tase.co.il";
oHttpRequestIndicesApiUrl.Headers.Add("origin", "https://www.tase.co.il");
oHttpRequestIndicesApiUrl.Referer = sIndicesParamsURL;
oHttpRequestIndicesApiUrl.Headers.Add("sec-fetch-mode", "cors");
oHttpRequestIndicesApiUrl.Headers.Add("sec-fetch-site", "same-site");
oHttpRequestIndicesApiUrl.Headers.Add("upgrade-insecure-requests", "1");
oHttpRequestIndicesApiUrl.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36";
LOG.DebugFormat("{0}:calculateIndexSecurityWeights(), Set headers to '{1}'", Name, sIndicesApiURL);
HttpWebResponse oHttpResponseIndicesApiUrl = (HttpWebResponse)oHttpRequestIndicesApiUrl.GetResponse();
if (oHttpResponseIndicesApiUrl.StatusCode != HttpStatusCode.OK)
{
    // response failed
    throw new ApplicationException(string.Format("get response from url '{0}' failed, Status Code: '{1}', Status Description '{2}'", sIndicesApiURL, oHttpResponseIndicesApiUrl.StatusCode, oHttpResponseIndicesApiUrl.StatusDescription));
}
I can't understand why this is happening.
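One likely cause, for what it's worth: HttpWebRequest throws exactly this exception when ContentLength is set on a request that never writes a body, and the code above sets ContentLength = 120 on what is otherwise a GET. A sketch of the same request without the body-only properties (sIndicesApiURL as in the question):

```csharp
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(sIndicesApiURL);
request.Method = "GET";
request.Accept = "application/json, text/plain, */*";
request.KeepAlive = true;
// Do NOT set ContentLength or ContentType on a GET: both describe a request
// body, and a GET here writes none, hence the exception at GetResponse().
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
```

Copying every header from the browser capture is usually unnecessary; the body-related ones are the ones that break a body-less request.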

Posting to a form, can't get the address right

I have intercepted an HTTP POST as follows:
Header
Key Value
Request POST /east-berkshire/local/quick_search HTTP/1.1
Accept text/html, application/xhtml+xml, */*
Referer https://www.netmums.com/east-berkshire/local/index/childcare/nannies-au-pairs
Accept-Language en-GB
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Content-Type application/x-www-form-urlencoded
Accept-Encoding gzip, deflate
Host www.netmums.com
Content-Length 107
DNT 1
Connection Keep-Alive
Cache-Control no-cache
Cookie AMCV_44326DF2572396FB7F000101%40AdobeOrg=817868104%7CMCMID%7C34574735755395522184062187835447062918%7CMCAAMLH-1486721296%7C6%7CMCAAMB-1486721296%7CNRX38WO0n5BH8Th-nqAG_A%7CMCOPTOUT-1486123696s%7CNONE; _ga=GA1.2.258060262.1486116497; _gat=1; _lp4_u=dZXxbBpqGf; __qca=P0-238174588-1486116496764; _tynt_crtg=; aam_uuid=34158303305859258534090346121149142657; __gads=ID=b3ba42a045f2be6a:T=1486116505:S=ALNI_MZHsVecqphdMO7SI-l4IEGrCyFpsg; AMCVS_44326DF2572396FB7F000101%40AdobeOrg=1; ABTastySession=LiwioHashMRASN%3Anull%5E%7C%5ELiwioUTMC%3A1; ABTasty=ABTastyUTMB%3A1%5E%7C%5ELiwioTracking%3A17020310101198682%5E%7C%5EsegmentationTracking%3A17020310101198682%5E%7C%5ELiwioUTMA%3A0.1.1486116611618.0.1486116611618.2; firstvisit=1; Cake=3qdc1afjmdvq0fg9kdunu2okn4; NetmumsLocation=east-berkshire; OX_plg=swf|sl|shk|pm
Body
_method=POST&data%5BListing%5D%5Blisting_category_id%5D=2&data%5BListing%5D%5Blisting_subcategory_id%5D=211
I have written the following C# code to try to simulate this:
var request = WebRequest.Create("https://www.netmums.com/east-berkshire/local/quick_search") as HttpWebRequest;
if (request == null) throw new HttpRequestException("Could not create web request");
request.Method = "post";
request.ContentType = "application/x-www-form-urlencoded";
var bs = Encoding.ASCII.GetBytes("[Listing][listing_category_id]=2&[Listing][listing_subcategory_id]=211");
using (var reqStream = request.GetRequestStream())
    reqStream.Write(bs, 0, bs.Length);
string result;
using (var response = request.GetResponse())
{
    var stream = response.GetResponseStream();
    if (stream == null) throw new HttpRequestException("No data returned");
    var sr = new StreamReader(stream);
    result = sr.ReadToEnd();
    sr.Close();
}
However when I execute it, on the GetResponse() call I get the error
The remote server returned an error: (404) Not Found.
What am I doing wrong?
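A hedged guess at the mismatch: the intercepted body starts with _method=POST and uses data[Listing][...] keys (URL-encoded as data%5BListing%5D...), while the C# body drops both the _method field and the data prefix, and the method is lowercase "post". A sketch sending the captured body verbatim:

```csharp
using System.Net;
using System.Text;

var request = (HttpWebRequest)WebRequest.Create(
    "https://www.netmums.com/east-berkshire/local/quick_search");
request.Method = "POST"; // HTTP method names are case-sensitive
request.ContentType = "application/x-www-form-urlencoded";

// Reproduce the intercepted body exactly, including the _method field and
// the data[...] key prefix the site's framework appears to route on.
string body = "_method=POST"
    + "&data%5BListing%5D%5Blisting_category_id%5D=2"
    + "&data%5BListing%5D%5Blisting_subcategory_id%5D=211";
byte[] bs = Encoding.ASCII.GetBytes(body);
using (var reqStream = request.GetRequestStream())
    reqStream.Write(bs, 0, bs.Length);
```

Whether the 404 comes from routing on this body or from a missing cookie or Referer is hard to say without testing, but the body mismatch is the most visible difference from the capture.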

AngularJS $http.get() cannot handle 401 status returned by the server

I'm trying to send an $http.get() request and authenticate it on the server side (WebApi):
$http.get(config.remoteServiceUrl + "api/account", {
headers:
{
'Authorization': 'Basic ' + encoded,
'Content-Type': "application/json"
},
params:
{
'email': credentials.Email
}
}).then(
//Success
function (data, status) {
setCredentials(credentials.Email, credentials.Password);
service.user.email = credentials.Email;
loggedin = true;
result.data = data;
result.status = status;
deferred.resolve(result);
},
//Error
function (data, status) {
result.data = data;
result.status = status;
deferred.reject(result);
}
);
For every unauthorized request the server should return a 401 status:
HttpResponseMessage reply = request.CreateErrorResponse(HttpStatusCode.Unauthorized, "Invalid Username or Password");
return Task.FromResult(reply);
But when I check the response on the client side, the status is always empty instead of 401.
Below is the request being sent:
Request URL:http://127.0.0.1:81/api/account?email=login#yahoo.com
Request Headers:
Accept:application/json, text/plain, */*
Authorization:Basic bG9naW5AeWFob28uY29tOjExMTE=
Origin:http://localhost
Referer:http://localhost/EMR.WebUI/index.html
User-Agent:Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
Query String Parameters:
email:login#yahoo.com
When I check the request status after the call in the Chrome debugger Network tab, it says:
GET (Cancelled)
Does anyone know why this is happening?
It works correctly when proper authorization is passed:
Request URL:http://127.0.0.1:81/api/account?email=login#yahoo.com
Request Method:GET
Status Code:200 OK
Request Headers:
Accept:application/json, text/plain, */*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:fil,fil-PH;q=0.8,tl;q=0.6,en-US;q=0.4,en;q=0.2
Authorization:Basic bG9naW5AeWFob28uY29tOjE=
Connection:keep-alive
Host:127.0.0.1:81
Origin:http://localhost
Referer:http://localhost/EMR.WebUI/index.html
User-Agent:Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36
Query String Parameters:
email:login#yahoo.com
Response Headers:
Access-Control-Allow-Origin:*
Cache-Control:no-cache
Content-Length:110
Content-Type:application/json; charset=utf-8
Date:Mon, 13 Jan 2014 11:00:35 GMT
Expires:-1
Pragma:no-cache
Server:Microsoft-IIS/8.0
X-AspNet-Version:4.0.30319
X-Powered-By:ASP.NET
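One plausible explanation, judging from the Origin header and the fact that Access-Control-Allow-Origin:* appears only on the 200 response: this is a cross-origin request, and if the 401 reply goes out without that CORS header, the browser discards the response before Angular ever sees it, which would match both the empty status and the (Cancelled) entry. A sketch of attaching the header to the error reply on the WebApi side, building on the CreateErrorResponse line above:

```csharp
HttpResponseMessage reply = request.CreateErrorResponse(
    HttpStatusCode.Unauthorized, "Invalid Username or Password");
// Without this header the browser blocks the cross-origin 401 and the
// client-side error callback receives an empty status instead of 401.
reply.Headers.Add("Access-Control-Allow-Origin", "*");
return Task.FromResult(reply);
```

In a real deployment you would normally handle this once via the WebApi CORS machinery rather than per response; the inline header here is just to illustrate the cause.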

Web site log in for data scraping

I am attempting to web-scrape data from my various remote transmitters. I have one brand of transmitter that I can log into with the following C# code:
public static string getSourceCode(string url, string user, string pass)
{
    SecureString pw = new SecureString();
    foreach (char c in pass.ToCharArray()) pw.AppendChar(c);
    NetworkCredential credential = new NetworkCredential(user, pw, url);
    CredentialCache cache = new CredentialCache();
    cache.Add(new Uri(url), "Basic", credential);
    Uri realLink = new Uri(url);
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(realLink);
    req.Credentials = CredentialCache.DefaultNetworkCredentials;
    HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
    StreamReader sr = new StreamReader(resp.GetResponseStream());
    string sourceCode = sr.ReadToEnd();
    sr.Close();
    resp.Close();
    return sourceCode;
}
The second brand of transmitter (I'm hesitant to put the URL out in public), instead of returning a web page requesting username and password, pops up a dialog box requesting them. Using the above code just returns an unauthorized error.
Fiddler says the following is sent when I successfully login to the site:
GET http(colon slash slash)lasvegas3abn(*)dyndns(*)tv(PORT)125(slash)measurements(*)htm HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-US
User-Agent: Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0; Touch)
Accept-Encoding: gzip, deflate
Host: lasvegas3abn.dyndns.tv:125
Authorization: Basic dXNlcjpsaW5lYXI=
Connection: Keep-Alive
DNT: 1
Any suggestions?
Instead of:
req.Credentials = CredentialCache.DefaultNetworkCredentials;
you can specify a credential that uses a specific username and password:
req.Credentials = new NetworkCredential("username", "password");
This should enable you to get through the login prompt (assuming that you specify the correct username and password).
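If the device still answers 401 after that change, it may be because .NET sends credentials only after receiving a 401 challenge, while some embedded servers expect them on the first request. A sketch of sending the Basic header preemptively, mirroring the Authorization line in the Fiddler capture (url, user, and pass as in the question's method):

```csharp
using System;
using System.Net;
using System.Text;

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
// Send the Basic credentials up front instead of waiting for a 401
// challenge, as the browser did in the captured session.
string token = Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + pass));
req.Headers["Authorization"] = "Basic " + token;
HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
```

This is an assumption about the device's behavior, not something the capture proves; try the NetworkCredential fix first.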

Log in to site programmatically and redirect browser to signed in state

I want to sign in to a site when a link is clicked and then redirect the browser there with a signed-in session. I'm having some trouble; here is what I've tried:
First I get the session cookies from the login site:
CookieContainer cookies= new CookieContainer();
HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create("http://someuri.com");
myHttpWebRequest.CookieContainer = cookies;
HttpWebResponse myHttpWebResponse = (HttpWebResponse)myHttpWebRequest.GetResponse();
myHttpWebResponse.Close();
Then I post to the sign in page to get signed in:
HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create("http://signInURL.com");
getRequest.CookieContainer = cookies;
getRequest.Method = WebRequestMethods.Http.Post;
getRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
getRequest.AllowWriteStreamBuffering = true;
getRequest.ProtocolVersion = HttpVersion.Version11;
getRequest.AllowAutoRedirect = true;
getRequest.ContentType = "application/x-www-form-urlencoded";
byte[] byteArray = Encoding.ASCII.GetBytes(PostParameterStringWithSignInInfo);
getRequest.ContentLength = byteArray.Length;
Stream newStream = getRequest.GetRequestStream();
newStream.Write(byteArray, 0, byteArray.Length);
newStream.Close();
HttpWebResponse getResponse = (HttpWebResponse)getRequest.GetResponse();
Then I figured I need to set the cookies on the client:
CookieCollection cooki = getRequest.CookieContainer.GetCookies(new Uri("http://someUri.com"));
for (int i = 0; i < cooki.Count; i++)
{
    Cookie c = cooki[i];
    Response.Cookies.Add(new HttpCookie(c.Name, c.Value));
}
And then redirect to where you end up being signed in:
Response.Redirect("http://URLwhenBeingSignedIn.com");
This doesn't work. When redirected, I'm still logged out.
I tried to do this with Fiddler and succeeded in signing in and getting redirected:
Get the session cookies:
GET / HTTP/1.1
Content-type: application/x-www-form-urlencoded
Host: someuri.com
Post to the sign in page to get signed in:
POST /signIn HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://someuri.com
Accept-Language: en-GB,en;q=0.7,tr;q=0.3
User-Agent: Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 90
DNT: 1
Host: signInURL.com
Pragma: no-cache
Cookie: JSESSIONID=fromBefore; Cookie2=fromBefore
PostParameterStringWithSignInInfo
Perhaps there's an easier way than the one I tried, now that you can see the Fiddler requests that work; if so, I'm happy to see it.
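For what it's worth, one reason the relay likely fails as written: Response.Cookies.Add sets cookies for your own application's domain, and a browser will not accept cookies for someuri.com from a response served by a different site. Unless your page runs on the same domain (or a parent domain) as the sign-in site, the session cookie never reaches it. A sketch of the relay with that caveat spelled out (cookies is the CookieContainer from the sign-in steps above):

```csharp
// NOTE: this only has a chance of working when your app shares a domain
// with the sign-in site; browsers reject cookies set for foreign domains.
CookieCollection cooki = cookies.GetCookies(new Uri("http://someuri.com"));
foreach (Cookie c in cooki)
{
    Response.Cookies.Add(new HttpCookie(c.Name, c.Value)
    {
        Path = c.Path,        // preserve the original scope
        HttpOnly = c.HttpOnly
    });
}
```

If the domains differ, a server-side proxy (your app forwards requests with the session cookie attached) is usually the workable alternative.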
