I'm building an iframe canvas application for Facebook. I'm not using the JavaScript SDK.
This is the code I'm using, and it works well in all browsers except Safari.
protected FacebookApp app;
protected CanvasAuthorizer cauth;
Response.AddHeader("p3p", "CP=\"IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT\"");
app = new FacebookApp();
cauth = new CanvasAuthorizer(app);
if (!cauth.IsAuthorized())
{
myAuth.Authorize(app, Request, Response, perms);
}
if (cauth.Authorize())
{
// Do my app stuff here
}
public class myAuth
{
public static void Authorize(FacebookApp fbApp, System.Web.HttpRequest request, System.Web.HttpResponse response, string perms)
{
Authorize(fbApp, request, response, perms, null);
}
public static void Authorize(FacebookApp fbApp, System.Web.HttpRequest request, System.Web.HttpResponse response, string perms, string redirectUrl)
{
Uri url = fbApp.GetLoginUrl();
NameValueCollection nvc = System.Web.HttpUtility.ParseQueryString(url.Query);
if (!string.IsNullOrEmpty(perms))
nvc.Add("req_perms", perms);
if (!string.IsNullOrEmpty(redirectUrl))
nvc["next"] = GetAppRelativeUrl(redirectUrl);
else if (request.QueryString.Count > 0)
nvc["next"] = GetAppRelativeUrl(request.Path.Replace(request.ApplicationPath, string.Empty).Replace(request.ApplicationPath.ToLower(), string.Empty) + "?" + request.QueryString);
else
nvc["next"] = GetAppRelativeUrl(request.Path.Replace(request.ApplicationPath, string.Empty).Replace(request.ApplicationPath.ToLower(), string.Empty));
UriBuilder ub = new UriBuilder(url);
ub.Query = nvc.ToString();
string content = CanvasUrlBuilder.GetCanvasRedirectHtml(ub.Uri);
response.ContentType = "text/html";
response.Write(content);
response.End();
}
public static string GetAppRelativeUrl(string url)
{
// Note: the url parameter is currently ignored; this always returns the canvas page URL.
return CanvasSettings.Current.CanvasPageUrl.ToString();
}
}
I read that Safari doesn't allow third-party cookies, and I figure that's where the problem lies. My question is whether there's a way to handle this using the SDK, or what my options are.
Regards,
Anders Pettersson
I've had some problems with Safari changing the case of data sent in HTTP headers, so make sure any parsing/comparing you are doing is case insensitive.
See here: Facebook Iframe App with multiple pages in Safari Session Variables not persisting
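For example, a case-insensitive comparison in C# (a minimal sketch; the header name and expected value are just illustrative):
string headerValue = Request.Headers["X-Some-Header"] ?? string.Empty;
// OrdinalIgnoreCase ignores any casing differences Safari may introduce
bool matches = string.Equals(headerValue, "expected-value", StringComparison.OrdinalIgnoreCase);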
I was getting the same problem, but I found a solution for Safari.
Just change the validation mode in web.config:
<system.web>
<pages enableViewState="true" validateRequest="false" />
<httpRuntime requestValidationMode="2.0"/>
<!--
Enable this if you still have the problem; cookieless sessions carry the
session ID in the URL, so no third-party cookie is needed at all:
<sessionState cookieless="true" regenerateExpiredSessionId="true" />-->
</system.web>
I would like to know how one can get the response containing the auth code from Google when the redirect URI is a custom URI.
This is the code I use. I open the URL in a browser (currently testing in the iOS simulator):
var url = "https://accounts.google.com/o/oauth2/v2/auth?";
string scope = "email%20profile";
string redirect_uri = "com.companyname.exampleauth://";
string response_type = "code";
string access_type = "offline";
// assemble the full authorization URL (client_id omitted here, as in the original)
url += "scope=" + scope + "&redirect_uri=" + redirect_uri + "&response_type=" + response_type + "&access_type=" + access_type;
Once the user allows the app, it goes back to the app. But how do I get the response with the code?
In Info.plist, under the Advanced tab, you'll see URL Types. Register your app there. Use your bundle identifier as the Identifier, your redirect_uri as the URL scheme, and Editor for the role.
The AppDelegate will get called when the user is done. Here's an example:
public override bool OpenUrl(UIApplication application, NSUrl url, string sourceApplication, NSObject annotation)
{
if (url.Scheme == "yourScheme") {
// the query string of the callback URL carries the auth response parameters
var queryParams = url.Query.Split('&');
return true;
}
return false;
}
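To actually extract the auth code, you can scan those query parameters for the code key (a minimal sketch under the same assumptions as the snippet above):
// url.Query is the query string of the redirect, e.g. "code=...&scope=..."
string authCode = null;
foreach (var pair in url.Query.Split('&'))
{
    var parts = pair.Split('=');
    if (parts.Length == 2 && parts[0] == "code")
        authCode = Uri.UnescapeDataString(parts[1]); // decode before use
}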
I am testing a new load-balanced staging site, and HTTPS is set up at the load balancer level, not at the site level. Also, this site will always be HTTPS, so I don't need RequireHttps attributes etc. The URL displays https, but that is not available in my code. I have a few issues for this reason.
Request.Url.Scheme is always http:
public static string GetProtocol()
{
var protocol = "http";
if (HttpContext.Current != null && HttpContext.Current.Request != null)
{
protocol = HttpContext.Current.Request.Url.Scheme;
}
return protocol;
}
Same thing with this base URL; the protocol is http:
public static string GetBaseUrl()
{
var baseUrl = String.Empty;
if (HttpContext.Current == null || HttpContext.Current.Request == null || String.IsNullOrWhiteSpace(HttpRuntime.AppDomainAppPath)) return baseUrl;
var request = HttpContext.Current.Request;
var appUrl = HttpRuntime.AppDomainAppVirtualPath;
baseUrl = string.Format("{0}://{1}{2}", request.Url.Scheme, request.Url.Authority, appUrl);
if (!string.IsNullOrWhiteSpace(baseUrl) && !baseUrl.EndsWith("/"))
baseUrl = String.Format("{0}/", baseUrl);
return baseUrl;
}
Now the biggest issue is referencing JS files and Google fonts referenced in the style sheets. I am using protocol-relative URLs (//) here, without http or https, but these are treated as http and I see a "mixed content blocked" message in Firebug.
How can I overcome this issue?
As you've said, HTTPS termination is done at the load balancer level ("https is set up at the load balancer level"), which means the original scheme may not reach the site, depending on the load balancer configuration.
It looks like in your case the LB is configured to talk to the site over HTTP all the time, so your site will never see the original scheme on HttpContext.Request.RawUrl (or similar properties).
Fix: usually when an LB, proxy or CDN is configured this way, there are additional headers that carry the original scheme and other incoming request parameters (such as the full URL and the client's IP) that are not directly visible to the site behind such a proxying device.
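For example, a minimal sketch that prefers the forwarded scheme when present (the header name depends on your device; X-Forwarded-Proto is the common convention):
var forwardedProto = HttpContext.Current.Request.Headers["X-Forwarded-Proto"];
// fall back to the scheme ASP.NET saw directly if the LB didn't set the header
var scheme = !string.IsNullOrEmpty(forwardedProto) ? forwardedProto : HttpContext.Current.Request.Url.Scheme;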
I override the ServerVariables to convince MVC it really is communicating through HTTPS, and also to expose the user's IP address. This uses the X-Forwarded-For and X-Forwarded-Proto HTTP headers set by your load balancer.
Note that you should only use this if you're really sure these headers are under your control, otherwise clients might inject values of their liking.
public sealed class HttpOverrides : IHttpModule
{
void IHttpModule.Init(HttpApplication app)
{
app.BeginRequest += OnBeginRequest;
}
private void OnBeginRequest(object sender, EventArgs e)
{
HttpApplication app = (HttpApplication)sender;
string forwardedFor = app.Context.Request.Headers["X-Forwarded-For"]?.Split(new char[] { ',' }).FirstOrDefault();
if (forwardedFor != null)
{
app.Context.Request.ServerVariables["REMOTE_ADDR"] = forwardedFor;
app.Context.Request.ServerVariables["REMOTE_HOST"] = forwardedFor;
}
string forwardedProto = app.Context.Request.Headers["X-Forwarded-Proto"];
if (forwardedProto == "https")
{
app.Context.Request.ServerVariables["HTTPS"] = "on";
app.Context.Request.ServerVariables["SERVER_PORT"] = "443";
app.Context.Request.ServerVariables["SERVER_PORT_SECURE"] = "1";
}
}
void IHttpModule.Dispose()
{
}
}
And in Web.config:
<system.webServer>
<modules runAllManagedModulesForAllRequests="true">
<add name="HttpOverrides" type="Namespace.HttpOverrides" preCondition="integratedMode" />
</modules>
</system.webServer>
I know this is an old question, but after encountering the same problem, I discovered that if I look at the UrlReferrer property of the HttpRequest object, the values reflect what was actually in the client browser's address bar.
So for example, with UrlReferrer I got:
Request.UrlReferrer.Scheme == "https"
Request.UrlReferrer.Port == 443
But for the same request, with the Url property I got the following:
Request.Url.Scheme == "http"
Request.Url.Port == 80
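So one option is to prefer the referrer's scheme when it is available (a sketch; UrlReferrer can be null, e.g. on direct requests, so it needs a fallback):
var scheme = Request.UrlReferrer != null ? Request.UrlReferrer.Scheme : Request.Url.Scheme;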
According to https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/proxy-load-balancer
When HTTPS requests are proxied over HTTP, the original scheme (HTTPS)
may be lost and must be forwarded in a header.
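In ASP.NET Core, the usual fix is the forwarded-headers middleware, which rewrites Request.Scheme from X-Forwarded-Proto (a minimal sketch; register it early in the pipeline and configure KnownProxies/KnownNetworks for your environment):
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    // trust the client IP and scheme forwarded by the load balancer
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});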
In ASP.NET Core I found (not sure if it works for all scenarios) that even if request.Scheme misleadingly shows "http" for an original "https" request, the request.IsHttps property is more reliable.
I am using the following code:
//Scheme may return http for https
var scheme = request.Scheme;
if (request.IsHttps) scheme = scheme.EnsureEndsWith("s"); // normalize "http" to "https"
//General string extension
public static string EnsureEndsWith(this string str, string sEndValue, bool ignoreCase = true)
{
var comparison = ignoreCase ? StringComparison.CurrentCultureIgnoreCase : StringComparison.CurrentCulture;
if (!str.EndsWith(sEndValue, comparison))
{
str = str + sEndValue;
}
return str;
}
I'm working on a very specific web-scraping application, and it needs to login to several websites and retrieve some data from them.
I am using a WebClient that has been made aware of cookies by overriding the following method:
protected override WebRequest GetWebRequest(Uri address)
{
WebRequest request = base.GetWebRequest(address);
var castRequest = request as HttpWebRequest;
if (castRequest != null)
{
castRequest.CookieContainer = this.CookieContainer;
}
return request;
}
I can log in to the sites fine with regular POST/GET requests (via the appropriate download/upload methods on the WebClient).
The targeted websites use Ajax ASP.NET top-level forms, and there is a state variable that gets enabled after you click a button on the page. That is, when you click the button, the form gets submitted, the state gets changed, and when the response loads it has the information I need. The state modification at this point is also persistent: if I reload the page, or even close the tab and re-open it, the data I need will still be there, because it is associated with the ASP session. As soon as the ASP session expires, you have to log in and click the button again before the server will send the data I need.
I have watched the submitted form via the Chrome developer tools when clicking the button, and I re-created the form submit exactly as I saw it in the Chrome network watch window, but it still does not correctly modify the viewstate.
So my question is: how can I simulate clicking this button so that the server will modify the viewstate and return the value I need?
I cannot use a web-browser control for this, but I could use the HTML Agility Pack if it makes things substantially easier (although I would really prefer not to use an external library).
The button is defined as this:
<form name="aspnetForm" method="post" action="enterurlhere..." id="aspnetForm">
<input type="image" name="ctl00$....." id="ctl00...." title="...." src="...." style="height:50px;border-width:0px;">
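One detail that may matter here (an assumption on my part, not something confirmed in the thread): when a form is submitted by clicking an <input type="image">, browsers append the click coordinates to the post data as name.x and name.y fields, and ASP.NET inspects those fields to decide that this image button raised the postback. If a re-created submit omits them, the click handler may never run. The post data would need entries shaped like:
ctl00$.....x=23
ctl00$.....y=17
(the name prefix mirrors the elided name in the markup above; the coordinate values are arbitrary)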
If your target is an ASP.NET WebForms site where:
1) you must log in first to navigate to the required page, and
2) on the required page there is an UpdatePanel containing, say, a textbox into which you need to enter something and then submit it, and if that information is correct you get "what you expect",
then the approach below works. I've done various crawlers previously, so I took one as the base but stripped it down quite a lot (no error logging, no validation that you are logged in or are still logged in when requesting the page, no HtmlAgilityPack, no structure, code cleanliness, user-agent string randomization, etc.) to keep it simple for you; you can of course enhance it :)
Anyway, I created a web project (Web Forms) in Visual Studio 2013. As you may know, it has some landing pages, including user registration. Then you have the "Manage account" page, which obviously requires the user to be authenticated. On that page I added another div, and inside it I placed an UpdatePanel (which ajaxifies the postback). Inside the UpdatePanel I placed textbox, button and literal server controls. In the code-behind I added a click event handler for that button: if the user input equals, say, "secret", it puts some text into the literal to indicate that the operation was successful. So the application had to log in first, then get the secret text by submitting the secret phrase to the "Manage account" page.
Actual fetcher:
using Pokemon.BL.Utils;
using System;
using System.Text;
using System.Web;
namespace Pokemon.BL
{
sealed class UrlFetcher : IDisposable
{
private static readonly UrlFetcher _instance;
private CGWebClient _cgWebClient;
private string loginPostString = "__EVENTTARGET={0}&__EVENTARGUMENT={1}&__VIEWSTATE={2}&__VIEWSTATEGENERATOR={3}&__EVENTVALIDATION={4}&ctl00$MainContent$Email={5}&ctl00$MainContent$Password={6}&ctl00$MainContent$ctl05={7}";
private string secretPhrasePostString = "__EVENTTARGET={0}&__EVENTARGUMENT={1}&__VIEWSTATE={2}&__VIEWSTATEGENERATOR={3}&__EVENTVALIDATION={4}&__ASYNCPOST=true&ctl00$MainContent$btnGetSecretPhrase=Button&ctl00$ctl08=ctl00$MainContent$UpdatePanel1|ctl00$MainContent$btnGetSecretPhrase&ctl00$MainContent$txtSecret={5}";
private UrlFetcher()
{
_cgWebClient = new CGWebClient();
}
static UrlFetcher()
{
_instance = new UrlFetcher();
}
#region Methods
public void LoginToSite(string email, string password)
{
var loginUrl = "http://localhost:53998/Account/Login";
byte[] response = _cgWebClient.DownloadData(loginUrl);
var content = Encoding.UTF8.GetString(response);
string eventTarget = ExtractToken("__EVENTTARGET", content);
string eventArg = ExtractToken("__EVENTARGUMENT", content);
string viewState = ExtractToken("__VIEWSTATE", content);
string viewStateGen = ExtractToken("__VIEWSTATEGENERATOR", content);
string eventValidation = ExtractToken("__EVENTVALIDATION", content);
string postData = string.Format(
loginPostString,
eventTarget,
eventArg,
viewState,
viewStateGen,
eventValidation,
email,
password,
"Log in"
);
_cgWebClient.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
response = _cgWebClient.UploadData(loginUrl, "POST", Encoding.UTF8.GetBytes(postData));
_cgWebClient.Headers.Remove("Content-Type");
}
public void GetSecretPhrase()
{
var loginUrl = "http://localhost:53998/Account/Manage";
byte[] response = _cgWebClient.DownloadData(loginUrl);
var content = Encoding.UTF8.GetString(response);
string eventTarget = ExtractToken("__EVENTTARGET", content);
string eventArg = ExtractToken("__EVENTARGUMENT", content);
string viewState = ExtractToken("__VIEWSTATE", content);
string viewStateGen = ExtractToken("__VIEWSTATEGENERATOR", content);
string eventValidation = ExtractToken("__EVENTVALIDATION", content);
string postData = string.Format(
secretPhrasePostString,
eventTarget,
eventArg,
viewState,
viewStateGen,
eventValidation,
"secret"
);
_cgWebClient.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
_cgWebClient.Headers.Add("X-Requested-With", "XMLHttpRequest");
response = _cgWebClient.UploadData(loginUrl, "POST", Encoding.UTF8.GetBytes(postData));
_cgWebClient.Headers.Remove("Content-Type");
_cgWebClient.Headers.Remove("X-Requested-With");
Console.WriteLine(Encoding.UTF8.GetString(response));
}
#region IDisposable Members
public void Dispose()
{
if (_cgWebClient != null)
{
_cgWebClient.Dispose();
}
}
#endregion
private string ExtractToken(string whatToExtract, string content)
{
string viewStateNameDelimiter = whatToExtract;
string valueDelimiter = "value=\"";
int viewStateNamePosition = content.IndexOf(viewStateNameDelimiter);
int viewStateValuePosition = content.IndexOf(valueDelimiter, viewStateNamePosition);
int viewStateStartPosition = viewStateValuePosition + valueDelimiter.Length;
int viewStateEndPosition = content.IndexOf("\"", viewStateStartPosition);
return HttpUtility.UrlEncode(
content.Substring(
viewStateStartPosition,
viewStateEndPosition - viewStateStartPosition
)
);
}
#endregion
#region Properties
public static UrlFetcher Instance { get { return _instance; } }
#endregion
}
}
WebClient wrapper:
using System;
using System.Collections.Generic;
using System.Net;
namespace Pokemon.BL.Utils
{
// http://codehelp.smartdev.eu/2009/05/08/improve-webclient-by-adding-useragent-and-cookies-to-your-requests/
public class CGWebClient : WebClient
{
private System.Net.CookieContainer cookieContainer;
private string userAgent;
private int timeout;
public System.Net.CookieContainer CookieContainer
{
get { return cookieContainer; }
set { cookieContainer = value; }
}
public string UserAgent
{
get { return userAgent; }
set { userAgent = value; }
}
public int Timeout
{
get { return timeout; }
set { timeout = value; }
}
public CGWebClient()
{
timeout = -1;
userAgent = "Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0";
cookieContainer = new CookieContainer();
}
protected override WebRequest GetWebRequest(Uri address)
{
WebRequest request = base.GetWebRequest(address);
if (request.GetType() == typeof(HttpWebRequest))
{
((HttpWebRequest)request).CookieContainer = cookieContainer;
((HttpWebRequest)request).UserAgent = userAgent;
((HttpWebRequest)request).Timeout = timeout;
}
return request;
}
}
}
and finally run it:
UrlFetcher.Instance.LoginToSite("username", "password");
UrlFetcher.Instance.GetSecretPhrase();
UrlFetcher.Instance.Dispose();
This outputs the secret phrase to the console application. Of course you will need to tweak this to make it work, for example depending on the ASP.NET version your target site is running, and so on :)
Hope this helps :)
I don't think this will work server-side, because the client needs the session information. To do this you could implement an iframe control that loads the form, and make a server-side or client-side call to click the button in the iframe and load the session information.
We have a website where some of the pages use HTTPS; those pages are kept in a "magic" folder.
HTTPS and its port number are configured in web.config for the site.
However, if the user tries to access the magic folder's contents over HTTP, we need to redirect back to HTTPS, and vice versa.
Example case 1: working for http to https:
http://mysite/magic-look to https://mysite/magic-look
Here, we used
<urlMappings>
<add url="~/magic-look" mappedUrl="~/magic/look.aspx"/>
<add url="~/help" mappedUrl="~/Help/default.aspx"/>
</urlMappings>
In Global.asax
protected void Application_BeginRequest(object sender, EventArgs e)
{
string url = HttpContext.Current.Request.Url.AbsoluteUri;
var secPort = String.IsNullOrEmpty(ConfigurationManager.AppSettings["securePort"]) ? 0 : Convert.ToInt32(ConfigurationManager.AppSettings["securePort"]);
var secProtocolEnabled = String.IsNullOrEmpty(ConfigurationManager.AppSettings["useSecure"]) ? false : true;
bool isSecureUrl = (url.IndexOf("/magic/", StringComparison.OrdinalIgnoreCase) >= 0) ? true : false;
if (url.IndexOf(".aspx", StringComparison.OrdinalIgnoreCase) >= 0)
{
url = url.Replace(":" + secPort, "");
if (isSecureUrl && secProtocolEnabled)
{
if (HttpContext.Current.Request.Url.Port != secPort)
{
//change .aspx page back to original SEO friendly URL and redirect
url = url.Replace(HttpContext.Current.Request.Url.AbsolutePath, HttpContext.Current.Request.RawUrl);
HttpContext.Current.Response.Redirect(Regex.Replace(url, "http", "https", RegexOptions.IgnoreCase));
}
}
else
{
if (HttpContext.Current.Request.Url.Port == secPort && !isSecureUrl)
{
//cause infinite loop
url = url.Replace(HttpContext.Current.Request.Url.AbsolutePath, HttpContext.Current.Request.RawUrl);
var targetUrl = Regex.Replace(url, "https", "http", RegexOptions.IgnoreCase);
HttpContext.Current.Response.Redirect(targetUrl);
}
}
}
}
ISSUE: not working for https to http. A non-HTTPS page accessed using HTTPS redirects in an infinite loop:
https://mysite/help should go to http://mysite/help
but it keeps redirecting to https://mysite/help:
https://mysite/help --> 302 Found
https://mysite/Help --> 302 Found
https://mysite/Help --> 302 Found
https://mysite/Help --> 302 Found .............
UPDATE:
If I remove this, it works fine:
url = url.Replace(HttpContext.Current.Request.Url.AbsolutePath,
HttpContext.Current.Request.RawUrl);
But that takes 3 requests instead of 2:
https://mysite/help --> 302 Found
https://mysite/Help/Default.aspx--> 302 Found
http://mysite/Help/Default.aspx--> 200 OK
However, I want an SEO-friendly URL like http://mysite/Help/.
UPDATE 2: ROOT CAUSE:
Whenever the URL is https://../something and we redirect to http://../something, the browser always makes the request as https://../something again.
I just refactored your code and used the string.Format method to build the URL:
private static string GetMyUrl(string securePort, string useSecure, Uri uri)
{
int secPort;
if (!int.TryParse(securePort, out secPort)) secPort = 0;
bool secProtocolEnabled = !string.IsNullOrWhiteSpace(useSecure);
bool shouldBeSecureUrl = uri.AbsolutePath.Contains("/magic/");
if (!uri.AbsolutePath.EndsWith(".aspx")) return uri.AbsoluteUri;
bool forceSecure = (shouldBeSecureUrl && secProtocolEnabled);
string url = string.Format("{0}://{1}{2}{3}",
forceSecure
? "https"
: "http",
uri.Host,
forceSecure && secPort != 0
? string.Format(":{0}", secPort)
: string.Empty,
uri.PathAndQuery);
return url;
}
protected void Application_BeginRequest(object sender, EventArgs e)
{
string url = GetMyUrl(ConfigurationManager.AppSettings["securePort"],
ConfigurationManager.AppSettings["useSecure"], HttpContext.Current.Request.Url);
if (HttpContext.Current.Request.Url.AbsoluteUri != url) HttpContext.Current.Response.Redirect(url);
}
You can test your addresses. Samples:
Console.WriteLine(GetMyUrl("8484","t", new Uri("https://www.contoso.com/catalog/shownew.aspx?date=today")));
Console.WriteLine(GetMyUrl("8484","t", new Uri("https://www.contoso.com:8484/magic/shownew.aspx?date=today")));
Console.WriteLine(GetMyUrl("8484","t", new Uri("http://www.contoso.com:8080/magic/shownew.aspx?date=today")));
Console.WriteLine(GetMyUrl("","t", new Uri("https://www.contoso.com/catalog/shownew.aspx?date=today")));
Console.WriteLine(GetMyUrl("8484","", new Uri("https://www.contoso.com/catalog/shownew.aspx?date=today")));
I finally found that we have URL rewrite enabled, so I just added a config entry, which works for all scenarios:
<configSections>
<section name="URLRewriteConfiguration"
type="MyAssembly.Routing.URLRewriteConfiguration, MyAssembly.Routing,
Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
</configSections>
<URLRewriteConfiguration>
<Routes>
<add Pattern="/help/default.aspx" PermanentRedirect="true" Replacement="/help"/>
</Routes>
</URLRewriteConfiguration>
and removed
url = url.Replace(HttpContext.Current.Request.Url.AbsolutePath, HttpContext.Current.Request.RawUrl);
I have a web application (hosted in IIS) that talks to a Windows service. The Windows service uses ASP.NET Web API (self-hosted), and so can be communicated with over HTTP using JSON. The web application is configured to do impersonation, the idea being that the user who makes the request to the web application should be the user that the web application uses to make the request to the service. The structure looks like this:
(The user highlighted in red is the user being referred to in the examples below.)
The web application makes requests to the Windows service using an HttpClient:
var httpClient = new HttpClient(new HttpClientHandler()
{
UseDefaultCredentials = true
});
httpClient.GetStringAsync("http://localhost/some/endpoint/");
This makes the request to the Windows service, but does not pass the credentials over correctly (the service reports the user as IIS APPPOOL\ASP.NET 4.0). This is not what I want to happen.
If I change the above code to use a WebClient instead, the credentials of the user are passed correctly:
WebClient c = new WebClient
{
UseDefaultCredentials = true
};
c.DownloadStringAsync(new Uri("http://localhost/some/endpoint/"));
With the above code, the service reports the user as the user who made the request to the web application.
What am I doing wrong with the HttpClient implementation that is causing it to not pass the credentials correctly (or is it a bug with the HttpClient)?
The reason I want to use HttpClient is that it has an async API that works well with Tasks, whereas WebClient's async API has to be handled with events.
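To illustrate the difference (a minimal sketch of the two async styles; the endpoint URL is the one from the question):
// HttpClient: Task-based, composes directly with async/await
string body = await httpClient.GetStringAsync("http://localhost/some/endpoint/");

// WebClient: event-based, the result arrives in a callback
var wc = new WebClient { UseDefaultCredentials = true };
wc.DownloadStringCompleted += (s, e) => Console.WriteLine(e.Result);
wc.DownloadStringAsync(new Uri("http://localhost/some/endpoint/"));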
You can configure HttpClient to automatically pass credentials like this:
var myClient = new HttpClient(new HttpClientHandler() { UseDefaultCredentials = true });
I was also having this same problem. I developed a synchronous solution thanks to the research done by @tpeczek in the following SO article: Unable to authenticate to ASP.NET Web Api service with HttpClient
My solution uses a WebClient, which as you correctly noted passes the credentials without issue. The reason HttpClient doesn't work is that Windows security disables the ability to create new threads under an impersonated account (see the SO article above). HttpClient creates new threads via the Task Factory, thus causing the error. WebClient, on the other hand, runs synchronously on the same thread, thereby bypassing the rule and forwarding its credentials.
Although the code works, the downside is that it will not work asynchronously.
var wi = (System.Security.Principal.WindowsIdentity)HttpContext.Current.User.Identity;
var wic = wi.Impersonate();
try
{
var data = JsonConvert.SerializeObject(new
{
Property1 = 1,
Property2 = "blah"
});
using (var client = new WebClient { UseDefaultCredentials = true })
{
client.Headers.Add(HttpRequestHeader.ContentType, "application/json; charset=utf-8");
client.UploadData("http://url/api/controller", "POST", Encoding.UTF8.GetBytes(data));
}
}
catch (Exception exc)
{
// handle exception
}
finally
{
wic.Undo();
}
Note: Requires NuGet package: Newtonsoft.Json, which is the same JSON serializer WebAPI uses.
What you are trying to do is get NTLM to forward the identity on to the next server, which it cannot do - it can only do impersonation which only gives you access to local resources. It won't let you cross a machine boundary. Kerberos authentication supports delegation (what you need) by using tickets, and the ticket can be forwarded on when all servers and applications in the chain are correctly configured and Kerberos is set up correctly on the domain.
So, in short you need to switch from using NTLM to Kerberos.
For more on Windows Authentication options available to you and how they work start at:
http://msdn.microsoft.com/en-us/library/ff647076.aspx
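If you want to check which case you are in from code, the incoming token's impersonation level is a useful diagnostic (a minimal sketch; Delegation is the level the second hop requires):
var identity = (System.Security.Principal.WindowsIdentity)HttpContext.Current.User.Identity;
// "NTLM" vs "Kerberos"/"Negotiate"
Console.WriteLine(identity.AuthenticationType);
// must be TokenImpersonationLevel.Delegation for credentials to survive a second hop
Console.WriteLine(identity.ImpersonationLevel);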
OK, so thanks to all of the contributors above. I am using .NET 4.6 and we also had the same issue. I spent time debugging System.Net.Http, specifically the HttpClientHandler, and found the following:
if (ExecutionContext.IsFlowSuppressed())
{
IWebProxy webProxy = (IWebProxy) null;
if (this.useProxy)
webProxy = this.proxy ?? WebRequest.DefaultWebProxy;
if (this.UseDefaultCredentials || this.Credentials != null || webProxy != null && webProxy.Credentials != null)
this.SafeCaptureIdenity(state);
}
So after assessing that ExecutionContext.IsFlowSuppressed() might be the culprit, I wrapped our impersonation code as follows:
using (((WindowsIdentity)ExecutionContext.Current.Identity).Impersonate())
using (System.Threading.ExecutionContext.SuppressFlow())
{
// HttpClient code goes here!
}
The code inside SafeCaptureIdenity (not my spelling mistake) grabs WindowsIdentity.GetCurrent(), which is our impersonated identity. This is now picked up because we are suppressing flow. Because of the using/dispose, this is reset after invocation.
It now seems to work for us, phew!
In .NET Core, I managed to get a System.Net.Http.HttpClient with UseDefaultCredentials = true to pass through the authenticated user's Windows credentials to a back end service by using WindowsIdentity.RunImpersonated.
HttpClient client = new HttpClient(new HttpClientHandler { UseDefaultCredentials = true } );
HttpResponseMessage response = null;
if (identity is WindowsIdentity windowsIdentity)
{
await WindowsIdentity.RunImpersonated(windowsIdentity.AccessToken, async () =>
{
var request = new HttpRequestMessage(HttpMethod.Get, url);
response = await client.SendAsync(request);
});
}
It worked for me after I set up a user with internet access in the Windows service.
In my code:
HttpClientHandler handler = new HttpClientHandler();
handler.Proxy = System.Net.WebRequest.DefaultWebProxy;
handler.Proxy.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
.....
HttpClient httpClient = new HttpClient(handler)
....
OK, so I took Joshoun's code and made it generic. I am not sure if I should implement the singleton pattern on the SynchronousPost class; maybe someone more knowledgeable can help.
Implementation
//I assume you have your own concrete type. In my case I am using code-first with a class called FileCategory
FileCategory x = new FileCategory { CategoryName = "Some Bs"};
SynchronousPost<FileCategory>test= new SynchronousPost<FileCategory>();
test.PostEntity(x, "/api/ApiFileCategories");
Here is the generic class; you can pass any type:
public class SynchronousPost<T> where T : class
{
public SynchronousPost()
{
Client = new WebClient { UseDefaultCredentials = true };
}
public void PostEntity(T PostThis,string ApiControllerName)//The ApiController name should be "/api/MyName/"
{
//this just determines the root url.
Client.BaseAddress = string.Format(
(
System.Web.HttpContext.Current.Request.Url.Port != 80) ? "{0}://{1}:{2}" : "{0}://{1}",
System.Web.HttpContext.Current.Request.Url.Scheme,
System.Web.HttpContext.Current.Request.Url.Host,
System.Web.HttpContext.Current.Request.Url.Port
);
Client.Headers.Add(HttpRequestHeader.ContentType, "application/json;charset=utf-8");
Client.UploadData(
ApiControllerName, "Post",
Encoding.UTF8.GetBytes
(
JsonConvert.SerializeObject(PostThis)
)
);
}
private WebClient Client { get; set; }
}
My API class looks like this, if you are curious:
public class ApiFileCategoriesController : ApiBaseController
{
public ApiFileCategoriesController(IMshIntranetUnitOfWork unitOfWork)
{
UnitOfWork = unitOfWork;
}
public IEnumerable<FileCategory> GetFiles()
{
return UnitOfWork.FileCategories.GetAll().OrderBy(x=>x.CategoryName);
}
public FileCategory GetFile(int id)
{
return UnitOfWork.FileCategories.GetById(id);
}
//Post api/ApiFileCategories
public HttpResponseMessage Post(FileCategory fileCategory)
{
UnitOfWork.FileCategories.Add(fileCategory);
UnitOfWork.Commit();
return new HttpResponseMessage();
}
}
I am using Ninject and the repository pattern with unit of work. Anyway, the generic class above really helps.
Set identity impersonate to true and validateIntegratedModeConfiguration to false in web.config:
<configuration>
<system.web>
<authentication mode="Windows" />
<authorization>
<deny users="?" />
</authorization>
<identity impersonate="true"/>
</system.web>
<system.webServer>
<validation validateIntegratedModeConfiguration="false" ></validation>
</system.webServer>
</configuration>
string url = "https://www..com";
System.Windows.Forms.WebBrowser webBrowser = new System.Windows.Forms.WebBrowser();
this.Controls.Add(webBrowser);
webBrowser.ScriptErrorsSuppressed = true;
webBrowser.Navigate(new Uri(url));
var webRequest = WebRequest.Create(url);
// Basic auth is "Basic " + base64("username:password") (note the space after "Basic" and the colon separator)
webRequest.Headers["Authorization"] = "Basic " + Convert.ToBase64String(Encoding.Default.GetBytes(Program.username + ":" + Program.password));
webRequest.Method = "POST";