I have included the lines of code below in my Web.config and Global.asax.cs files. Still, when I inspect the site with the browser's developer tools, I can see that the Secure flag is not set on the cookies below.
I have also configured the SSL settings in IIS (checked the Require SSL checkbox).
I would like to set the Secure attribute on all cookies, not only received cookies but also sent ones. Any suggestions, please?
In Web.config:
<httpCookies requireSSL="true"/>
In Global.asax.cs:
protected void Application_EndRequest(object sender, EventArgs e)
{
    if (Request.IsSecureConnection == true && HttpContext.Current.Request.Url.Scheme == "https")
    {
        Request.Cookies["ASP.NET_SessionID"].Secure = true;
        if (Request.Cookies.Count > 0)
        {
            foreach (string s in Request.Cookies.AllKeys)
            {
                Request.Cookies[s].Secure = true;
            }
        }
        Response.Cookies["ASP.NET_SessionID"].Secure = true;
        if (Response.Cookies.Count > 0)
        {
            foreach (string s in Response.Cookies.AllKeys)
            {
                Response.Cookies[s].Secure = true;
            }
        }
    }
}
In the browser (dev tools screenshot omitted), the cookies are still listed without the Secure flag.
There are two ways. The httpCookies element in web.config lets you turn on requireSSL, which transmits all cookies, including the session cookie, over SSL only. Forms authentication has the same setting, and if you turn on requireSSL for httpCookies you must also turn it on inside the forms configuration:
<httpCookies requireSSL="true" />
<authentication mode="Forms">
  <forms requireSSL="true" />
</authentication>
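Beyond the configuration, the Global.asax approach from the question can be reduced to the response side only. A minimal sketch of that, assuming the flag should be forced on every outgoing cookie (request cookies are never sent back to the browser, so setting Secure on Request.Cookies has no effect):
protected void Application_EndRequest(object sender, EventArgs e)
{
    if (!Request.IsSecureConnection)
    {
        return; // nothing to do for plain-HTTP requests
    }

    // Only Response.Cookies produce Set-Cookie headers, so flag those.
    foreach (string name in Response.Cookies.AllKeys)
    {
        HttpCookie cookie = Response.Cookies[name];
        if (cookie != null)
        {
            cookie.Secure = true; // adds "; Secure" to the Set-Cookie header
        }
    }
}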
Related
I'm building an authentication app using OWIN. I'm trying to get both the Bearer token and the userinfo claims. The code below gets me to about 85% of what I want. While initially writing the code I used IIS Express and debugged and coded against that environment. For whatever reason, after the initial Challenge call in the else block, Request.IsAuthenticated is false after the return from the login screen (using KeyCloak as the IdP). The code then drops the user into the else-if block, where I find that Request.Form has my Bearer token. I must then execute Authentication.Challenge again (no KeyCloak login screen opens) and I return to the top of Page_Load; this time Request.IsAuthenticated is true and I can get the userinfo, but Request.Form is empty. This is fine for me because I can store all the info somewhere for later use.
Once I got to this point I targeted IIS. I ran the code and got different behavior. The code drops into the else block initially (same as before), I log in, but upon return from the IdP this time Request.IsAuthenticated is true. I have the userinfo but not the Bearer token. Any ideas why?
protected void Page_Load(object sender, EventArgs e)
{
    if (Request.IsAuthenticated)
    {
        String str = String.Empty;
        var qry = ((System.Security.Claims.ClaimsPrincipal)Request.RequestContext.HttpContext.User).Claims;
        if (null != qry)
        {
            foreach (System.Security.Claims.Claim item in qry)
            {
                if (item.Type == "preferred_username")
                {
                    str = item.Value;
                }
            }
        }
    }
    else if (!Request.IsAuthenticated && Request.Form.Count > 0)
    {
        HttpContext.Current.GetOwinContext().Authentication.Challenge(
            new AuthenticationProperties { },
            OpenIdConnectAuthenticationDefaults.AuthenticationType);
    }
    else
    {
        HttpContext.Current.GetOwinContext().Authentication.Challenge(
            new AuthenticationProperties { RedirectUri = "/XXXapp locationXXX/" },
            OpenIdConnectAuthenticationDefaults.AuthenticationType);
    }
}
I've figured it out. I needed to set the save-tokens flag to true. This allows the token to be carried along in the request, so I don't need the if/else. Now that I've got that working I'm changing this section of code. My main issue is that it is hard to find complete and current documentation with sample code for my use case. Thanks.
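For anyone else hunting for it, this is roughly what that flag looks like in the OWIN startup. A sketch only, assuming a Katana version of Microsoft.Owin.Security.OpenIdConnect that exposes SaveTokens, with placeholder Authority/ClientId/RedirectUri values:
app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
{
    Authority = "https://keycloak.example.com/realms/myrealm", // placeholder
    ClientId = "my-client-id",                                 // placeholder
    RedirectUri = "https://localhost/myapp/",                  // placeholder
    ResponseType = "code id_token",
    Scope = "openid profile",
    SaveTokens = true // keeps the tokens on the authentication ticket for later retrieval
});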
I've been trying to figure out how to set the secure flag on all the server cookies for our website. We're running .NET 4.5. I tried adding <httpCookies requireSSL="true" /> to the web.config file. I tried adding <authentication><forms requireSSL="true" /></authentication>. I tried setting the secure flag in code. Nothing had any effect. Adding the following C# function to Global.asax.cs was supposed to work, but didn't:
protected void Application_EndRequest()
{
    string authCookie = FormsAuthentication.FormsCookieName;
    foreach (string sCookie in Response.Cookies)
    {
        if (sCookie.Equals(authCookie))
        {
            // Set the cookie to be secure. Browsers will send the cookie
            // only to pages requested with https
            var httpCookie = Response.Cookies[sCookie];
            if (httpCookie != null) httpCookie.Secure = true;
        }
    }
}
It finally started working after I got rid of the "if (sCookie.Equals(authCookie))..." statement. So this is the working version:
protected void Application_EndRequest()
{
    string authCookie = FormsAuthentication.FormsCookieName;
    foreach (string sCookie in Response.Cookies)
    {
        // Set the cookie to be secure. Browsers will send the cookie
        // only to pages requested with https
        var httpCookie = Response.Cookies[sCookie];
        if (httpCookie != null) httpCookie.Secure = true;
    }
}
I have several questions. First, what is the logic behind putting this in the Application_EndRequest method? Second, why did I have to get rid of the sCookie.Equals(authCookie) check? Finally, has anyone found a more elegant solution? Thanks.
If you are executing the request over HTTP and not HTTPS, then I do not think you can set Secure = true. Can you verify that you are running over a secure connection? You can do some Google/Bing searches on how to generate a local certificate if you are testing on your dev box. Also, do not forget to encrypt your cookie so it's not readable on the client side.
Here is some sample code.
var userName = "userName";
var expiration = DateTime.Now.AddHours(3);
var rememberMe = true;
var ticketValueAsString = generateAdditionalTicketInfo(); // get additional data to include in the ticket
var ticket = new FormsAuthenticationTicket(1, userName, DateTime.Now, expiration, rememberMe, ticketValueAsString);
var encryptedTicket = FormsAuthentication.Encrypt(ticket); // encrypt the ticket
var cookie = new HttpCookie(FormsAuthentication.FormsCookieName, encryptedTicket)
{
    HttpOnly = true,
    Secure = true,
};
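The sample stops at creating the cookie; presumably it still needs to be attached to the response and decrypted on the way back in. A sketch of those two steps (not part of the original snippet):
// Attach the encrypted, secure cookie to the outgoing response.
Response.Cookies.Add(cookie);

// On a later request, read the ticket back and decrypt it.
var incoming = Request.Cookies[FormsAuthentication.FormsCookieName];
if (incoming != null)
{
    FormsAuthenticationTicket decrypted = FormsAuthentication.Decrypt(incoming.Value);
    string additionalInfo = decrypted.UserData; // the value passed in as ticketValueAsString
}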
EDIT - Added link
Also take a look at this previous answer, which shows how you can configure your web.config to ensure that cookies are always marked as secure.
I am testing a new load-balanced staging site, and HTTPS is set up at the load balancer level, not at the site level. Also, this site will always be HTTPS, so I don't need RequireHttps attributes etc. The URL displays https, but that is not visible to my code. I have a few issues because of this.
Request.Url.Scheme is always http:
public static string GetProtocol()
{
    var protocol = "http";
    if (HttpContext.Current != null && HttpContext.Current.Request != null)
    {
        protocol = HttpContext.Current.Request.Url.Scheme;
    }
    return protocol;
}
Same thing with this base URL; the protocol is http:
public static string GetBaseUrl()
{
    var baseUrl = String.Empty;
    if (HttpContext.Current == null || HttpContext.Current.Request == null || String.IsNullOrWhiteSpace(HttpRuntime.AppDomainAppPath)) return baseUrl;
    var request = HttpContext.Current.Request;
    var appUrl = HttpRuntime.AppDomainAppVirtualPath;
    baseUrl = string.Format("{0}://{1}{2}", request.Url.Scheme, request.Url.Authority, appUrl);
    if (!string.IsNullOrWhiteSpace(baseUrl) && !baseUrl.EndsWith("/"))
        baseUrl = String.Format("{0}/", baseUrl);
    return baseUrl;
}
Now the biggest issue is referencing JS files and Google Fonts referenced in the style sheets. I am using // here, without http or https, but these are treated as http and I see a mixed-content-blocked message in Firebug.
How can I overcome this issue?
As you've said, HTTPS termination is done at the load balancer level ("https is set up at the load balancer level"), which means the original scheme may not reach the site at all, depending on the load balancer configuration.
It looks like in your case the LB is configured to talk to the site over HTTP all the time, so your site will never see the original scheme on HttpContext.Request.RawUrl (or similar properties).
Fix: usually when an LB, proxy, or CDN is configured this way, there are additional headers that carry the original scheme and other incoming request parameters (such as the full URL and the client's IP) that are not directly visible to the site behind the proxying device, as sketched below.
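For example, a minimal check against the conventional header; this assumes the load balancer forwards X-Forwarded-Proto, so verify the actual header name in your LB's configuration:
// Sketch: recover the original scheme from the proxy header, if present.
string forwardedProto = HttpContext.Current.Request.Headers["X-Forwarded-Proto"];
bool wasHttps = string.Equals(forwardedProto, "https", StringComparison.OrdinalIgnoreCase);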
I override the ServerVariables to convince MVC it really is communicating over HTTPS, and also to expose the user's IP address. This uses the X-Forwarded-For and X-Forwarded-Proto HTTP headers set by your load balancer.
Note that you should only use this if you're really sure these headers are under your control; otherwise clients might inject whatever values they like.
using System;
using System.Linq;
using System.Web;

public sealed class HttpOverrides : IHttpModule
{
    void IHttpModule.Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // Use the first (client) entry of X-Forwarded-For as the caller's address.
        string forwardedFor = app.Context.Request.Headers["X-Forwarded-For"]?.Split(new char[] { ',' }).FirstOrDefault();
        if (forwardedFor != null)
        {
            app.Context.Request.ServerVariables["REMOTE_ADDR"] = forwardedFor;
            app.Context.Request.ServerVariables["REMOTE_HOST"] = forwardedFor;
        }

        // Convince ASP.NET the request arrived over HTTPS when the LB says so.
        string forwardedProto = app.Context.Request.Headers["X-Forwarded-Proto"];
        if (forwardedProto == "https")
        {
            app.Context.Request.ServerVariables["HTTPS"] = "on";
            app.Context.Request.ServerVariables["SERVER_PORT"] = "443";
            app.Context.Request.ServerVariables["SERVER_PORT_SECURE"] = "1";
        }
    }

    void IHttpModule.Dispose()
    {
    }
}
And in Web.config:
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <add name="HttpOverrides" type="Namespace.HttpOverrides" preCondition="integratedMode" />
  </modules>
</system.webServer>
I know this is an old question, but after encountering the same problem I discovered that if I look at the UrlReferrer property of the HttpRequest object, the values reflect what was actually in the client browser's address bar.
So for example, with UrlReferrer I got:
Request.UrlReferrer.Scheme == "https"
Request.UrlReferrer.Port == 443
But for the same request, with the Url property I got the following:
Request.Url.Scheme == "http"
Request.Url.Port == 80
According to https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/proxy-load-balancer
When HTTPS requests are proxied over HTTP, the original scheme (HTTPS) may be lost and must be forwarded in a header.
In ASP.NET Core I found (not sure if it works for all scenarios) that even if request.Scheme misleadingly shows "http" for an original "https" request, the request.IsHttps property is more reliable.
I am using the following code
//Scheme may return http for https
var scheme = request.Scheme;
if (request.IsHttps) scheme = scheme.EnsureEndsWith("S");

//General string extension
public static string EnsureEndsWith(this string str, string sEndValue, bool ignoreCase = true)
{
    if (!str.EndsWith(sEndValue, CurrentCultureComparison(ignoreCase)))
    {
        str = str + sEndValue;
    }
    return str;
}
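For what it's worth, the approach the linked doc describes for ASP.NET Core is the forwarded-headers middleware. A sketch of that, assuming the standard Microsoft.AspNetCore.HttpOverrides package and a proxy that sends X-Forwarded-Proto / X-Forwarded-For:
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.HttpOverrides;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Rewrites Request.Scheme / Request.IsHttps (and the client IP) from the
// X-Forwarded-Proto / X-Forwarded-For headers sent by the proxy. By default
// only loopback proxies are trusted; configure KnownProxies or KnownNetworks
// if the load balancer sits at another address.
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedProto | ForwardedHeaders.XForwardedFor
});

app.MapGet("/", (HttpContext ctx) => $"Scheme: {ctx.Request.Scheme}, IsHttps: {ctx.Request.IsHttps}");

app.Run();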
The cookie used for session in ASP.NET MVC is httpOnly (property set to true).
Is there a way to make it not httpOnly?
I want to be able to access this cookie from javascript.
Even if it is less secure than the "What if all the universe stands against me?!" default setting.
If you REALLY need it you could try to add this to your Global.asax:
void Application_EndRequest(Object sender, EventArgs e)
{
    if (Response.Cookies.Count > 0)
    {
        foreach (string s in Response.Cookies.AllKeys)
        {
            if (s == "ASP.NET_SessionId")
            {
                Response.Cookies["ASP.NET_SessionId"].HttpOnly = false;
            }
        }
    }
}
Solution was taken from here.
I built a system that uses cookies to store search params across the site.
On the home page there are links, and I wanted to use jQuery to save a cookie with the item id in it.
On click, the user is then sent to an advanced search page where they can use .NET controls to modify the search. The cookies are saved again there, but they need to stay writable by the JS on the home page when the user browses back.
So I set HttpOnly like this:
var cookie = new HttpCookie(name)
{
    Value = val,
    HttpOnly = false // #DEV search cookies can be modified by JS
};
HttpContext.Current.Response.Cookies.Add(cookie);
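For the round trip described above, reading the same cookie back on the server is then just a lookup. A sketch, where "searchItemId" stands in for whatever cookie name the site actually uses:
// Read the search cookie that the jQuery code on the home page wrote.
var searchCookie = HttpContext.Current.Request.Cookies["searchItemId"]; // hypothetical name
string itemId = searchCookie != null ? searchCookie.Value : null;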
I have some proof-of-concept code for an HTTP module. The code checks to see if a cookie exists; if so, it retrieves a value; if the cookie does not exist, it creates it and sets the value.
Once this is done, I write to the screen to see what action has been taken (all nice and simple). So on the first request the cookie is created; subsequent requests retrieve the value from the cookie.
When I test this in a normal ASP.NET web site everything works correctly (yay!). However, as soon as I transfer it to SharePoint something weird happens: the cookie is never saved, that is, the code always branches into creating the cookie and never takes the branch that retrieves the value, regardless of page refreshes or secondary requests.
Here's the code...
public class SwitchMasterPage : IHttpModule
{
    public void Dispose()
    {
    }

    public void Init(HttpApplication context)
    {
        // register handler
        context.PreRequestHandlerExecute += new EventHandler(PreRequestHandlerExecute);
    }

    void PreRequestHandlerExecute(object sender, EventArgs e)
    {
        string outputText = string.Empty;
        HttpCookie cookie = null;
        string cookieName = "MPSetting";

        cookie = HttpContext.Current.Request.Cookies[cookieName];
        if (cookie == null)
        {
            // cookie doesn't exist, create it
            HttpCookie ck = new HttpCookie(cookieName);
            ck.Value = GetCorrectMasterPage();
            ck.Expires = DateTime.Now.AddMinutes(5);
            HttpContext.Current.Response.Cookies.Add(ck);
            outputText = "storing master page setting in cookie.";
        }
        else
        {
            // get the master page from the cookie
            outputText = "retrieving master page setting from cookie.";
        }

        HttpContext.Current.Response.Write(outputText + "<br/>");
    }

    private string GetCorrectMasterPage()
    {
        // logic goes here to get the correct master page
        return "/_catalogs/masterpage/BlackBand.master";
    }
}
This turned out to be the authentication of the web app. To work correctly you must use an FQDN that has been configured for Forms Authentication.
You can use Fiddler or Firebug (on Firefox) to inspect the response and see if your cookie is being sent. If not, then perhaps you can try your logic in PostRequestHandlerExecute. This assumes that SharePoint or some other piece of code is tinkering with the response cookies; this way, you can be the last one adding the cookie.
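Moving the handler is just a matter of subscribing to the later event in Init; a sketch of that change (same cookie logic as above, only the event differs):
public void Init(HttpApplication context)
{
    // Run after the page handler, so SharePoint has already finished with the
    // response cookies and this module gets the last word.
    context.PostRequestHandlerExecute += new EventHandler(PostRequestHandlerExecute);
}

void PostRequestHandlerExecute(object sender, EventArgs e)
{
    // ... same cookie check/create logic as in PreRequestHandlerExecute ...
}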