Create dynamic subdomain in ASP.NET - C#

I need to do URL rewriting in such a way that
if the request is for abc.domain.com, it is processed as
domain.com/default.aspx?cid=abc
In general, a request for <anything>.domain.com should be handled as domain.com/default.aspx?cid=<anything>.
The subdomain is not fixed to abc; it can be any value, but the request should always be processed with that value as the query-string parameter.
My domain is on shared hosting.
Can anyone shed some light on this?

The subdomain must be created and configured on the DNS server and in IIS.
After you have set up your site to accept those subdomains, you can use RewritePath directly to map a subdomain to a specific file with different parameters.
In Application_BeginRequest you find the subdomain and rewrite the path:
protected void Application_BeginRequest(Object sender, EventArgs e)
{
    // GetSubDomain expects a Uri, so pass Request.Url
    var subDomain = GetSubDomain(HttpContext.Current.Request.Url);

    // simple example: rewrite abc.domain.com/... to /default.aspx?cid=abc
    if (!String.IsNullOrEmpty(subDomain))
        HttpContext.Current.RewritePath("~/default.aspx", "", "cid=" + subDomain);
}

// adapted from: http://madskristensen.net/post/Retrieve-the-subdomain-from-a-URL-in-C.aspx
private static string GetSubDomain(Uri url)
{
    string host = url.Host;                 // e.g. "abc.domain.com"
    if (host.Split('.').Length > 2)         // require an actual subdomain
    {
        int index = host.IndexOf(".");
        return host.Substring(0, index);    // "abc"
    }
    return null;
}
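On the page side the subdomain then arrives as an ordinary query-string value. A small sketch of what default.aspx.cs might do with it (the lookup itself is up to your application):
protected void Page_Load(object sender, EventArgs e)
{
    // after the rewrite, a request to abc.domain.com shows up here as ?cid=abc
    string cid = Request.QueryString["cid"];

    if (!String.IsNullOrEmpty(cid))
    {
        // load whatever "abc" identifies in your application
    }
}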
Similar:
How to remap all the request to a specific domain to subdirectory in ASP.NET
Redirect Web Page Requests to a Default Sub Folder
Retrieve the subdomain from a URL in C#

Related

Redirect to different pages for requests coming from a subdomain site

I have a main site and a subdomain site.
When a user enters the main site URL www.mysite.com, they should be redirected to 'Login.aspx'.
When a user enters the subdomain URL sample.mysite.com, they should be redirected to 'Welcome.aspx'.
What is the best way to achieve this?
Changes in IIS settings?
Modify code in Global.asax?
Create a dummy page for redirection?
What code do I have to write if I need to modify Global.asax?
Personally, I would choose the 2nd option and modify the Global.asax file.
void Application_PostAcquireRequestState(Object source, EventArgs e)
{
    // Session state is not available yet in Application_BeginRequest,
    // so hook a later event such as PostAcquireRequestState.
    string host = HttpContext.Current.Request.Url.Host.ToLower();
    string path = HttpContext.Current.Request.Path;
    var session = HttpContext.Current.Session;

    if (host == "www.mysite.com")
    {
        // skip the check on the target pages themselves to avoid a redirect loop
        if (path.EndsWith("Login.aspx", StringComparison.OrdinalIgnoreCase) ||
            path.EndsWith("Home.aspx", StringComparison.OrdinalIgnoreCase))
            return;

        // in case you are using any session
        if (session == null || session["User"] == null)
            Response.Redirect("Login.aspx");
        else
        {
            // validate the session
            Response.Redirect("Home.aspx");
        }
    }
    else if (host == "sample.mysite.com")
    {
        if (!path.EndsWith("Welcome.aspx", StringComparison.OrdinalIgnoreCase))
            Response.Redirect("Welcome.aspx");
    }
}
To achieve the redirection, modifying the code in Global.asax is the best option.

Request.Url.Scheme gives http instead of https on load balanced site

I am testing a new load-balanced staging site where HTTPS is set up at the load balancer level, not at the site level. The site will always be HTTPS, so I don't need RequireHttps attributes and the like. The URL shows https in the browser, but that is not what my code sees, and this causes a few issues.
Request.Url.Scheme is always http:
public static string GetProtocol()
{
    var protocol = "http";
    if (HttpContext.Current != null && HttpContext.Current.Request != null)
    {
        protocol = HttpContext.Current.Request.Url.Scheme;
    }
    return protocol;
}
The same thing happens with this base URL helper; the protocol is http:
public static string GetBaseUrl()
{
    var baseUrl = String.Empty;
    if (HttpContext.Current == null || HttpContext.Current.Request == null || String.IsNullOrWhiteSpace(HttpRuntime.AppDomainAppPath))
        return baseUrl;

    var request = HttpContext.Current.Request;
    var appUrl = HttpRuntime.AppDomainAppVirtualPath;
    baseUrl = string.Format("{0}://{1}{2}", request.Url.Scheme, request.Url.Authority, appUrl);

    if (!string.IsNullOrWhiteSpace(baseUrl) && !baseUrl.EndsWith("/"))
        baseUrl = String.Format("{0}/", baseUrl);

    return baseUrl;
}
The biggest issue is referencing JS files and Google Fonts from the style sheets. I am using protocol-relative references (// without http or https), but these are treated as http and I see a "mixed content blocked" message in Firebug.
How can I overcome this issue?
As you've said, HTTPS termination is done at the load balancer level ("https is set up at the load balancer level"), which means the original scheme may never reach the site, depending on the load balancer configuration.
It looks like in your case the LB is configured to talk to the site over HTTP all the time, so your site will never see the original scheme on HttpContext.Request.RawUrl (or similar properties).
Fix: when an LB, proxy or CDN is configured this way, there are usually additional headers that carry the original scheme and other incoming request details (such as the full URL or the client's IP) that are not directly visible to the site behind the proxying device.
I override the ServerVariables to convince MVC it really is communicating over HTTPS, and also to expose the user's IP address. This relies on the X-Forwarded-For and X-Forwarded-Proto HTTP headers set by your load balancer.
Note that you should only use this if you're sure these headers are under your control; otherwise clients might inject values of their choosing.
using System;
using System.Linq;
using System.Web;

public sealed class HttpOverrides : IHttpModule
{
    void IHttpModule.Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // X-Forwarded-For may contain a comma-separated chain; take the first (client) entry
        string forwardedFor = app.Context.Request.Headers["X-Forwarded-For"]
            ?.Split(',')
            .FirstOrDefault()
            ?.Trim();
        if (forwardedFor != null)
        {
            app.Context.Request.ServerVariables["REMOTE_ADDR"] = forwardedFor;
            app.Context.Request.ServerVariables["REMOTE_HOST"] = forwardedFor;
        }

        string forwardedProto = app.Context.Request.Headers["X-Forwarded-Proto"];
        if (forwardedProto == "https")
        {
            app.Context.Request.ServerVariables["HTTPS"] = "on";
            app.Context.Request.ServerVariables["SERVER_PORT"] = "443";
            app.Context.Request.ServerVariables["SERVER_PORT_SECURE"] = "1";
        }
    }

    void IHttpModule.Dispose()
    {
    }
}
And in Web.config:
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <add name="HttpOverrides" type="Namespace.HttpOverrides" preCondition="integratedMode" />
  </modules>
</system.webServer>
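If you'd rather not override ServerVariables, a lighter-weight alternative is to consult the forwarded header directly in a helper such as the question's GetProtocol. This is only a sketch and assumes, as above, that the load balancer sets X-Forwarded-Proto and that clients cannot spoof it:
public static string GetProtocol()
{
    var protocol = "http";
    if (HttpContext.Current != null && HttpContext.Current.Request != null)
    {
        var request = HttpContext.Current.Request;

        // prefer the scheme the load balancer says the client used, if present
        var forwardedProto = request.Headers["X-Forwarded-Proto"];
        protocol = !string.IsNullOrEmpty(forwardedProto)
            ? forwardedProto.Split(',')[0].Trim()
            : request.Url.Scheme;
    }
    return protocol;
}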
I know this is an old question, but after encountering the same problem I discovered that if I look at the UrlReferrer property of the HttpRequest object, the values reflect what was actually in the client browser's address bar.
So for example, with UrlReferrer I got:
Request.UrlReferrer.Scheme == "https"
Request.UrlReferrer.Port == 443
But for the same request, with the Url property I got the following:
Request.Url.Scheme == "http"
Request.Url.Port == 80
According to https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/proxy-load-balancer
When HTTPS requests are proxied over HTTP, the original scheme (HTTPS)
may be lost and must be forwarded in a header.
In ASP.NET Core I found (not sure whether it works for all scenarios) that even when request.Scheme misleadingly shows "http" for an original "https" request, the request.IsHttps property is more reliable.
I am using the following code:
// Scheme may return http for https
var scheme = request.Scheme;
if (request.IsHttps) scheme = scheme.EnsureEndsWith("s");

// general string extension
public static string EnsureEndsWith(this string str, string sEndValue, bool ignoreCase = true)
{
    var comparison = ignoreCase
        ? StringComparison.CurrentCultureIgnoreCase
        : StringComparison.CurrentCulture;

    if (!str.EndsWith(sEndValue, comparison))
    {
        str = str + sEndValue;
    }
    return str;
}
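For reference, the linked docs describe the Forwarded Headers Middleware as the standard way to restore the scheme in ASP.NET Core. A minimal sketch, assuming the Microsoft.AspNetCore.HttpOverrides package is referenced and that you still configure KnownProxies/KnownNetworks for your own environment:
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        // rewrites Request.Scheme / Request.IsHttps from X-Forwarded-Proto
        // (and the client IP from X-Forwarded-For) before anything else
        // in the pipeline reads them
        app.UseForwardedHeaders(new ForwardedHeadersOptions
        {
            ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
        });

        // ... rest of the pipeline
    }
}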

Azure cloudapp.net domain and duplicate content issue

I have a C#/MVC4 site hosted on Azure as a web role located at http://www.equispot.com. During a check on Google for some searches related to my site, I came across a search result that links to this page:
http://equispot.cloudapp.net/horses-for-sale/quarter-horses/13
Note the difference in the domain name. Now, I have a canonical tag already (view the source on the cloudapp.net link and you can see the canonical rel tag points to the main site at http://www.equispot.com).
Since that's the case, why would Google have indexed the page at the cloudapp.net domain? I recently noticed a drop in my SERPs and I'm wondering if this is part of the reason (I migrated to Azure about the same time as the SERP change). It may be unrelated but still...
How can I prevent these pages from being indexed by Google or how can I prevent my Azure web role from responding to anything except www.equispot.com and equispot.com? When I had this hosted on premise, I just configured IIS to respond only to my domain (my previous provider produced some dupe content for some reason as well).
You can check whether the host the request came in on is the domain name you want. If it is not, do a 302 redirect to the correct domain.
There are several places where you can inspect the request and do the redirect:
- Global.asax (a minimal sketch follows this list)
- Custom module
- Override the OnActionExecuting for action methods
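For the Global.asax option, a rough sketch (the hostnames are the ones from the question; adjust to your own):
void Application_BeginRequest(object sender, EventArgs e)
{
    var url = HttpContext.Current.Request.Url;

    // let the real domain (and local debugging) through
    if (url.Host.EndsWith("equispot.com", StringComparison.OrdinalIgnoreCase) ||
        url.Host.Equals("localhost", StringComparison.OrdinalIgnoreCase))
        return;

    // anything else (e.g. *.cloudapp.net) is bounced to the canonical host with a 302
    var builder = new UriBuilder(url) { Host = "www.equispot.com" };
    HttpContext.Current.Response.Redirect(builder.Uri.ToString(), false);
    HttpContext.Current.ApplicationInstance.CompleteRequest();
}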
I couldn't find a straightforward way to do this using hostHeader configuration in the ServiceDefinition.csdef file so I rolled my own RedirectInvalidDomainsAttribute class to perform a 301 (Moved Permanently) redirect back to my main site during a request for an invalid domain. In case anyone else runs into the same problem, here's the code:
App_Start/FilterConfig.cs
public static void RegisterGlobalFilters(GlobalFilterCollection filters)
{
    filters.Add(new RedirectInvalidDomainsAttribute());
}
RedirectInvalidDomainsAttribute.cs
public class RedirectInvalidDomainsAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var url = filterContext.HttpContext.Request.Url;
        if (url == null) return;

        var host = url.Host;
        if (host.Contains("equispot.com") || host.Contains("localhost")) return;

        string subdomain = GetSubDomain(host);
        Guid guid;
        if (Guid.TryParseExact(subdomain, "N", out guid))
        {
            // this is a staging domain, it's okay
            return;
        }

        // invalid domain - 301 redirect
        UriBuilder builder = new UriBuilder(url) { Host = "www.equispot.com" };
        filterContext.Result = new RedirectResult(builder.Uri.ToString(), true);
    }

    // This isn't perfect, but it works for the sub-domains Azure provides
    private static string GetSubDomain(string host)
    {
        if (host.Split('.').Length > 1)
        {
            int index = host.IndexOf(".");
            return host.Substring(0, index);
        }
        return null;
    }
}

Sharing session between two web applications on the same IIS server

I have a web application developed in MVC 2.0. I have created a dynamic data website for this web application. For accessing the dynamic data website, the user is redirected to the login page of the MVC application.
On login, a session is created for the user. I want to access this session on my dynamic data website to check the user's role and username. Only users with the admin role should be allowed to access the dynamic data website.
How can I achieve this?
Try this solution:
On your first web app (the one the user logs in to):
// create a method that will redirect to your web app 2
public ActionResult RedirectToAnotherSite(string name, string role)
{
    // specify the IP or URL for your web app 2
    // pass the parameters
    var rawUrl = string.Format("http://localhost:8051?name={0}&role={1}", name, role);
    return Redirect(rawUrl);
}
On web app 2 (the dynamic data site), open Global.asax and add the following code:
void Application_BeginRequest(object sender, EventArgs e)
{
    var app = (HttpApplication)sender;

    // get the passed parameters
    var name = app.Request.QueryString["name"] ?? string.Empty;
    var role = app.Request.QueryString["role"] ?? string.Empty;

    // check the role
    if (!role.Equals("admin"))
    {
        // return HTTP 401 (access denied) and stop processing the request
        app.Response.StatusCode = 401;
        app.CompleteRequest();
    }
}
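Note that this only covers the initial redirect; later requests to web app 2 won't carry the query string. If the second app needs the values afterwards, it has to persist them itself, for example into its own session once session state is available. A rough sketch (session is not available in BeginRequest, hence the later event):
void Application_PostAcquireRequestState(object sender, EventArgs e)
{
    var context = ((HttpApplication)sender).Context;
    if (context.Session == null) return;

    // persist the values passed on the initial redirect so that later
    // requests (without the query string) still know the user
    var name = context.Request.QueryString["name"];
    var role = context.Request.QueryString["role"];
    if (!string.IsNullOrEmpty(name)) context.Session["name"] = name;
    if (!string.IsNullOrEmpty(role)) context.Session["role"] = role;
}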

HTTP Module and Cookies in SharePoint 2007

I have some proof-of-concept code for an HTTP module. The code checks whether a cookie exists; if so it retrieves a value, and if the cookie does not exist it creates it and sets the value.
Once this is done, I write to the screen to see what action has been taken (all nice and simple). So on the first request the cookie is created; subsequent requests retrieve the value from the cookie.
When I test this in a normal ASP.NET web site everything works correctly. However, as soon as I transfer it to SharePoint something weird happens: the cookie is never saved. That is, the code always takes the branch that creates the cookie and never the branch that retrieves the value, regardless of page refreshes or secondary requests.
Here's the code...
public class SwithcMasterPage : IHttpModule
{
    public void Dispose()
    {
        // nothing to clean up
    }

    public void Init(HttpApplication context)
    {
        // register handler
        context.PreRequestHandlerExecute += new EventHandler(PreRequestHandlerExecute);
    }

    void PreRequestHandlerExecute(object sender, EventArgs e)
    {
        string outputText = string.Empty;
        string cookieName = "MPSetting";

        HttpCookie cookie = HttpContext.Current.Request.Cookies[cookieName];
        if (cookie == null)
        {
            // cookie doesn't exist, create it
            HttpCookie ck = new HttpCookie(cookieName);
            ck.Value = GetCorrectMasterPage();
            ck.Expires = DateTime.Now.AddMinutes(5);
            HttpContext.Current.Response.Cookies.Add(ck);
            outputText = "storing master page setting in cookie.";
        }
        else
        {
            // get the master page from the cookie
            outputText = "retrieving master page setting from cookie.";
        }

        HttpContext.Current.Response.Write(outputText + "<br/>");
    }

    private string GetCorrectMasterPage()
    {
        // logic goes here to get the correct master page
        return "/_catalogs/masterpage/BlackBand.master";
    }
}
This turned out to be down to the authentication of the web app. To work correctly you must use an FQDN that has been configured for Forms Authentication.
You can use Fiddler or Firebug (on Firefox) to inspect the response and see whether your cookie is being sent. If not, you could try your logic in PostRequestHandlerExecute. This assumes that SharePoint or some other piece of code is tinkering with the response cookies; this way, you are the last one adding the cookie.
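For example, a rough sketch of moving the cookie write to PostRequestHandlerExecute inside the same module (reusing the cookie name and master page path from the question):
public void Init(HttpApplication context)
{
    // write the cookie as late as possible so nothing later in the
    // pipeline can drop or replace the response cookie
    context.PostRequestHandlerExecute += PostRequestHandlerExecute;
}

void PostRequestHandlerExecute(object sender, EventArgs e)
{
    var app = (HttpApplication)sender;
    const string cookieName = "MPSetting";

    if (app.Context.Request.Cookies[cookieName] == null)
    {
        var ck = new HttpCookie(cookieName)
        {
            Value = "/_catalogs/masterpage/BlackBand.master",
            Expires = DateTime.Now.AddMinutes(5)
        };
        app.Context.Response.Cookies.Add(ck);
    }
}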
