I have a problem with getting and sending this cookie header, specifically the MenuData cookie.
I'm sending a GET request, and with the response I get this one particular cookie, which I need to send back in the next POST request.
The problem is that I get it in two parts (in the CookieContainer) and I have no idea how to encode/decode it so I can send it properly.
Here's the cookie header of the POST I'm trying to send
(//// marks the parts I need to put together):
Cookie:
//// MenuData={'Type':null;
ASP.NET_SessionId=oe5qzthlb51ri5nzxddadzzo;
.LoginISerwis=2880E262ECC48BD7D12443EDC97D9641E85401A345B629C2002AC89F22CEBD201700417EB0D499C6E8F10816AC1F457FF7CBD671C83509CEF405236C91D6CDD81543BF1EC507319EDD587E6FFDEBA80DFAD30D769DF6F70C942ABBCB383A0C0A0BF127F40FB4C04F25A6F68469EFAF51503EF10DCFF2F51A9B31040575B14962;
CustomerLogin=ID=xxxx&Login=xxxxx&RememberLogin=True;
//// 'Id':null}=
I need some advice on how to handle this kind of cookie.
Unfortunately, CookieContainer is known to have problems with parsing cookies with commas in them. My advice is to manually read the cookie header at this particular step and pass the cookie along to your next request.
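For illustration, here is a rough sketch of that manual approach (the URLs are placeholders, the cookie names come from the question, and the MenuData value is reassembled with a comma on the assumption that CookieContainer split it at that comma):

using System.Net;

// Grab the raw Set-Cookie header from the GET response yourself and rebuild
// the Cookie header for the POST, bypassing CookieContainer entirely.
var getRequest = (HttpWebRequest)WebRequest.Create("https://example.com/login"); // placeholder URL
string rawSetCookie;
using (var getResponse = (HttpWebResponse)getRequest.GetResponse())
{
    // All Set-Cookie headers get folded into one comma-separated string here,
    // which is exactly what trips up CookieContainer when a value itself contains commas.
    rawSetCookie = getResponse.Headers["Set-Cookie"] ?? string.Empty;
}

// Reassemble the MenuData fragments by hand (naive illustration; real code should
// parse rawSetCookie and drop attributes such as Path and Expires).
string menuData = "MenuData={'Type':null,'Id':null}=";

var postRequest = (HttpWebRequest)WebRequest.Create("https://example.com/submit"); // placeholder URL
postRequest.Method = "POST";
// Deliberately no CookieContainer: the Cookie header is set by hand
// ("..." stands for the session/login values elided in the question).
postRequest.Headers["Cookie"] =
    menuData + "; ASP.NET_SessionId=...; .LoginISerwis=...; CustomerLogin=...";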
My app communicates with an internal web API that requires authentication.
When I send the request I get the 401 challenge as expected, the handshake occurs, the authenticated request is re-sent and everything continues fine.
However, I know that the auth is required. Why do I have to wait for the challenge? Can I force the request to send the credentials in the first request?
My request generation is like this:
private static HttpWebRequest BuildRequest(string url, string methodType)
{
    var request = HttpWebRequest.CreateHttp(url);
    request.PreAuthenticate = true;
    request.AuthenticationLevel = AuthenticationLevel.MutualAuthRequested;
    request.Credentials = CredentialCache.DefaultNetworkCredentials;
    request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
    request.ContentType = CONTENT_TYPE;
    request.Method = methodType;
    request.UserAgent = BuildUserAgent();
    return request;
}
Even with this code, the auth header isn't included in the initial request.
I know how to include the auth info with basic.... what I want to do is to use Windows Auth of the user executing the app (so I can't store the password in a config file).
UPDATE: I also tried using an HttpClient and its own .Credentials property, with the same result: no auth header is added to the initial request.
The only way I could get the auth header in was to hack it in manually using Basic authentication (which won't fly for this use-case).
NTLM is a challenge/response-based authentication protocol. You need to make the first request so that the server can issue the challenge; in the subsequent request the client sends the response to the challenge. The server then verifies this response with the domain controller by giving it the challenge and the response that the client sent.
Without knowing the challenge you can't send the response, which is why two requests are needed.
Basic authentication is password based, so you can short-circuit this by sending the credentials with the first request, but in my experience this can be a problem for some servers to handle.
More details available here:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa378749(v=vs.85).aspx
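For comparison, a minimal sketch of that Basic-auth short-circuit (not usable for the Windows-auth scenario in the question; the URL and credentials are placeholders):

using System;
using System.Net;
using System.Text;

// The Authorization header goes out with the very first request,
// so no 401 challenge round-trip is needed.
var request = (HttpWebRequest)WebRequest.Create("https://internal.example.com/api/resource");
string token = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
request.Headers["Authorization"] = "Basic " + token;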
I'm not 100% sure, but I suspect that there is no way around this; it's simply the way HttpWebRequest works.
In the online .NET source, in the function DoSubmitRequestProcessing (which is here), you can see this comment just after the start of the function, at line 1731:
// We have a response of some sort, see if we need to resubmit
// it do to authentication, redirection or something
// else, then handle clearing out state and draining out old response.
A little further down (line 1795) (some lines removed for brevity)
if (resubmit)
{
if (CacheProtocol != null && _HttpResponse != null) CacheProtocol.Reset();
ClearRequestForResubmit(ntlmFollowupRequest);
...
}
And in ClearRequestForResubmit line 5891:
// We're uploading and need to resubmit for Authentication or Redirect.
and then (Line 5923):
// The second NTLM request is required to use the same connection, don't close it
if (ntlmFollowupRequest) {....}
To my (admittedly n00bish) eyes these comments seem to indicate that the developers decided to follow the "standard" challenge-response protocol for NTLM/Kerberos and not include any way of sending authentication headers up-front.
Setting PreAuthenticate is what you want, which you are doing. The very first request will still do the handshake but for subsequent requests it will automatically send the credentials (based on the URL being used). You can read up on it here: http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.preauthenticate(v=vs.110).aspx.
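If that description is right, an HttpClient version of the same setup might look like the sketch below (assuming the handler instance is reused across calls; the URLs are placeholders):

using System.Net;
using System.Net.Http;

var handler = new HttpClientHandler
{
    PreAuthenticate = true,
    Credentials = CredentialCache.DefaultNetworkCredentials
};
var client = new HttpClient(handler);

// First call still gets the 401 challenge and completes the handshake.
var first = await client.GetAsync("https://internal.example.com/api/items");
// Later calls through the same client/handler can send credentials up front.
var second = await client.GetAsync("https://internal.example.com/api/items/1");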
I am having some trouble adding a value to the Page.Request & Page.Response headers and have the key & value stay/persist through a redirect.
I have an enum tracking code that I want to place in the headers to trace how a user goes through my site prior to their checkout.
I am using this code to add the headers to response and request context.
var RequestSessionVariable = context.Request.Headers["SessionTrackingCode"];
if (RequestSessionVariable == null)
{
    context.Response.AddHeader("SessionTrackingCode", ((int)tracker).ToString());
    context.Request.Headers.Add("SessionTrackingCode", ((int)tracker).ToString());
}
else
{
    if (!RequestSessionVariable.Contains(((int)tracker).ToString()))
    {
        RequestSessionVariable += ("," + ((int)tracker).ToString());
        context.Request.Headers["SessionTrackingCode"] = RequestSessionVariable;
        context.Response.Headers["SessionTrackingCode"] = RequestSessionVariable;
    }
}
The method call that occurs in Page_Load of the necessary controls within the website:
trackingcodes.AddPageTrackingCode(TrackingCode.TrackingCodes.ShoppingCart, this.Context);
The SessionTrackingCode header is there, but after a Response.Redirect("~/value.aspx") the RequestSessionVariable is always null. Is there something that happens on the redirect that wipes out the headers that I add? Or what am I doing wrong in adding the header key and value?
this equals:
public partial class Cart : System.Web.UI.UserControl
Headers are sent by the client on every request, so any redirect requires the client to send the headers again.
Unless you are using some special client (not a browser), any custom headers will essentially be ignored/lost across requests. A browser will only send headers it knows about (cookies, authentication, referrer) in requests and will only act on a known set of response headers (Set-Cookie). You are using a custom header the browser does not know about, so the browser will neither read it from the response nor send it in the next request.
Your options:
switch to cookies for your tracking (same as everyone else)
use AJAX requests to send/receive custom headers (probably not what you are looking for, as the URLs look like regular GET/POST ones)
build a custom client that pays attention to your headers (purely theoretical; unless you are building some sort of sales terminal, no one will install your client just to visit your site)
Note: adding headers to the request in page code does not make much sense, as that request will not be sent anywhere (it is what came from the browser).
This looks like a job for cookies, rather than http headers. The browser will not return your custom headers to you, but it will return your cookies.
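Both answers point at cookies, so here is a sketch of what a cookie-based version of the tracking method might look like (the method and enum names just mirror the question; adjust them to your own types):

using System.Linq;
using System.Web;

public void AddPageTrackingCode(TrackingCode.TrackingCodes tracker, HttpContext context)
{
    string code = ((int)tracker).ToString();
    HttpCookie existing = context.Request.Cookies["SessionTrackingCode"];

    if (existing == null)
    {
        // First tracked page: start a new session cookie (no explicit expiry,
        // so it disappears when the browser closes).
        context.Response.Cookies.Add(new HttpCookie("SessionTrackingCode", code));
    }
    else if (!existing.Value.Split(',').Contains(code))
    {
        // Append this page's code and re-issue the cookie; the browser sends it
        // back on every subsequent request, so it survives the redirect.
        existing.Value += "," + code;
        context.Response.Cookies.Add(existing);
    }
}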
I have an application that contains a button; clicking it opens a browser window using a URL with query string parameters (the URL of a page that I am coding).
Is there a way to ensure that the URL is coming from my application, and only from my application, and not from just anyone typing the URL manually into a web browser?
If not, what is the best way to ensure that a specific URL is requested only by a specific application, and not just entered manually in the address bar of a web browser?
I'm using ASP.NET.
You can check if the request was made from one of the pages of your application using:
Request.UrlReferrer != null && Request.UrlReferrer.Host.Contains("mywebsite.com")
That's the simple way.
The secure way is to put a cookie on the client containing a value encrypted using a secure key or hashed using a secure salt. If the cookie is set to expire when the browser is closed, it should be impossible for someone to forge.
Here's an example:
On the pages that would redirect to the page you are trying to protect:
HttpCookie cookie = new HttpCookie("SecureCheck");
//don't set the cookie's expiration so it's deleted when the browser is closed
cookie.Value = System.Web.Security.FormsAuthentication.HashPasswordForStoringInConfigFile(Session.SessionID, "SHA1");
Response.Cookies.Add(cookie);
On the page you are trying to protect:
//check to see if the cookie is there and it has the correct value
HttpCookie check = Request.Cookies["SecureCheck"];
if (check == null || System.Web.Security.FormsAuthentication.HashPasswordForStoringInConfigFile(Session.SessionID, "SHA1") != check.Value)
    throw new Exception("Invalid request. Please access this page only from the application.");
//if we got this far the exception was not thrown and we are safe to continue
//insert whatever code here
There's no reliable way to do this for a GET request, nor is there any reason to try for a legitimate user. What you should do instead is ensure that, regardless of where the request comes from, the user has the proper permissions and access rights and that the session is protected appropriately (HTTP-only cookies, SSL, etc.). If the request is changing data, then it should be a POST, not a GET, and it should be accompanied by suitable cross-site request forgery prevention techniques (such as a cookie containing a nonce that is verified against a matching nonce on the form itself).
There is no way, other than rejecting the request if it doesn't contain a previously generated random one-time token in the parameters (that would be stored in the session, for example).
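A minimal sketch of that one-time token idea in Web Forms code-behind (the session key "LaunchToken", the target page name, and the query string parameter are all invented for illustration):

// In the page that builds the link/button inside your application:
string token = Guid.NewGuid().ToString("N"); // use a cryptographic RNG for anything serious
Session["LaunchToken"] = token;
string url = ResolveUrl("~/protected.aspx") + "?token=" + token; // hypothetical target page

// In Page_Load of the protected page: verify and immediately discard the token.
string expected = Session["LaunchToken"] as string;
Session.Remove("LaunchToken"); // one-time use: a second request with the same token fails
if (expected == null || Request.QueryString["token"] != expected)
{
    Response.StatusCode = 403;
    Response.End();
}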
While there is no 100% secure way to do this, what I am suggesting might at least take care of your basic needs.
This is what you can do.
Client: add an HTTP header containing an encoded string, such as a SHA-256 hash of some agreed word.
Then make your client always do a POST request instead of a GET.
Server: check the HTTP header for the encoded string, and also make sure it is a POST request.
This is not 100% secure; of course someone smart enough could figure it out and still generate a request, but depending on your needs you might find this enough.
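A server-side sketch of that check (the header name "X-App-Token" and the secret word are invented for this example):

using System;
using System.Security.Cryptography;
using System.Text;
using System.Web;

// Require POST plus a known SHA-256 value in a custom header.
static bool IsTrustedRequest(HttpRequest request)
{
    if (request.HttpMethod != "POST")
        return false;

    string header = request.Headers["X-App-Token"];
    using (var sha = SHA256.Create())
    {
        string expected = Convert.ToBase64String(
            sha.ComputeHash(Encoding.UTF8.GetBytes("some-agreed-word")));
        return header == expected;
    }
}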
You can check the referrer, the user agent, add an additional header to the request, or always do POST requests to that URL. However, since HTTP is transmitted in plain text, somebody can always run Wireshark or Fiddler, capture the HTTP packets, and recreate the requests with your measures in place.
Pass parameters from your application so that you can verify them on the server side.
I suggest you use an encryption algorithm and generate the text using a password (key). Then decrypt the parameter on the server side and check whether it matches your expectation.
I am not being very precise, sorry about that, but if I had to do something like this, I would do something similar to what is mentioned above.
You can check the header on the MVC controller, e.g. Request.Headers["Accept"], to see whether the request is coming from your AngularJS or jQuery code.
A sample AngularJS call looks like this:
var url = ServiceServerPath + urlSearchService + '/SearchCustomer?input=' + $scope.strInput;
$http({
method: 'GET',
url: url,
headers: {
'Content-Type': 'application/json'
},.....
And on the MVC [HttpGet] Action method
[HttpGet]
[PreventDirectAccess] // my custom filter
// ---> /Index/SearchCustomer?input={input}/
public string SearchCustomer(string input)
{
    try
    {
        // Check whether the request carries the JSON Accept header set by the
        // AngularJS/jQuery call; a plain browser navigation won't have it.
        var isJsonRequestOnMVC = Request.Headers["Accept"];
        if (!isJsonRequestOnMVC.Contains("application/json")) return "Error Request on server!";

        var serialize = new JavaScriptSerializer();
        ISearch customer = new SearchCustomer();
        IEnumerable<ContactInfoResult> returnSearch = customer.GetCustomerDynamic(input);
        return serialize.Serialize(returnSearch);
    }
    catch (Exception)
    {
        throw;
    }
}
Try to save them like this:
HttpCookie latcook = new HttpCookie("latitude", lat.Value.ToString());
HttpCookie lngcook = new HttpCookie("longitude", lng.Value.ToString());
Request.Cookies.Add(latcook);
Request.Cookies.Add(lngcook);
Everything has a value, and the code steps through without error.
Then immediately after those are set, I refresh my page and step through this:
HttpCookie latcook = Request.Cookies.Get("latitude");
HttpCookie lngcook = Request.Cookies.Get("longitude");
The latcook and lngcook variables have names, but no values. What am I doing wrong?
You are adding your cookies to the request object. They should be added to the response:
Response.Cookies.Add(latcook);
Response.Cookies.Add(lngcook);
Cookies added to the response are returned to the user's browser via a series of Set-Cookie HTTP headers. They are then subsequently sent back (upon the next request) via the Cookie HTTP header. (You should be able to watch this happen using Firebug, etc.) Ultimately, this header will be parsed and populate the Request.Cookies collection.
OK, I just want to clarify something about my solution.
I have a requirement to grab a file from a repository somewhere; this repository requires a session token to be passed, in the form of a cookie, along with the request for the file.
I authenticate the user against this repository and store the session token in the user's cookie collection for my application when the user first logs on to my application.
The problem is that the cookie will not get sent to the repository when a user tries to access a file, because the repository is on a different URL and domain. Therefore I am creating a new HTTP request, appending the cookie, and getting the response stream back.
I now need to send this response stream back to the user, headers and all (as this response stream will contain the headers for the file the user is trying to access).
Can I use this:
string session = cookie.Value;
StreamReader reader = new StreamReader(Utility.GetLinkStream(url, session));
Context.Response.ClearHeaders();
Context.Response.Clear();
Context.Response.Write(reader.ReadToEnd());
Essentially the call to Utility.GetLinkStream goes off, creates an HTTP request, and then returns the stream of the response. Will the call to Write write out the whole response, headers and all, or is there a better way to achieve this?
Response.Write() will only write the content; you have to set the headers before calling it. You could enumerate the headers from the WebResponse and add them to Response.Headers manually.
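A sketch along those lines; GetLinkResponse is a hypothetical variant of Utility.GetLinkStream that returns the whole HttpWebResponse so its headers are available, not just the body stream:

using System.IO;
using System.Net;

HttpWebResponse upstream = Utility.GetLinkResponse(url, session); // hypothetical helper

Context.Response.ClearHeaders();
Context.Response.Clear();

// Copy the headers that matter for a file download before writing any content.
Context.Response.ContentType = upstream.ContentType;
string disposition = upstream.Headers["Content-Disposition"];
if (disposition != null)
    Context.Response.AddHeader("Content-Disposition", disposition);

// Stream the body as raw bytes; StreamReader.ReadToEnd() would mangle binary files.
using (Stream body = upstream.GetResponseStream())
{
    body.CopyTo(Context.Response.OutputStream);
}
Context.Response.End();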