Prevent other sites from linking to my CSS - C#

I have a commercial CSS font on my site. I use IIS, and the vendor says that others can use my CSS fonts because they know the URL. Is it possible to configure the server so that only my site can use them? It is about Cufón.

Things you can do:
1. Give up. If your users can see it, they can steal it. Similarly, don't expect to protect your site from users viewing its source code.
2. If the font is a vector font, rasterize the font for all the font sizes you support, but no others. This may have a negative impact on the browsing experience of your users. It makes a stolen copy of your font less useful, but doesn't actually stop the theft.
3. Replace all use of the font with bitmaps. That is much more work to steal, and only gives the thief a rasterized version of the font (and not necessarily all the letters). You can create a special text UserControl that emits a bitmap wherever you put it (see the sketch below), so this isn't actually that much work to do or maintain. It does increase the bandwidth requirements for your page, though. It also forces you to do by hand some of the layout that is normally handled by the browser, which could add heavy or minimal maintenance costs, depending on how your site's layout works. And as with #2, it can have a negative impact on the browsing experience of your users. It also hurts accessibility, though not absurdly so, since your UserControl will presumably use alt text to duplicate the text.
I strongly recommend #1.
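For what option 3 involves in practice, here is a minimal sketch using System.Drawing, assuming the purchased font file sits in App_Data; the handler name, font path, and fixed 16pt size are all illustrative, not the asker's setup:

using System.Drawing;
using System.Drawing.Imaging;
using System.Drawing.Text;
using System.IO;
using System.Web;

// Hypothetical endpoint: TextImage.ashx?text=Hello renders the text server-side
// with the protected font, so the font file itself is never sent to the browser.
public class TextImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        string text = ctx.Request.QueryString["text"] ?? string.Empty;

        // Load the commercial font from App_Data, which is not browsable.
        var fonts = new PrivateFontCollection();
        fonts.AddFontFile(ctx.Server.MapPath("~/App_Data/MyFont.ttf"));

        using (var font = new Font(fonts.Families[0], 16f))
        using (var probe = new Bitmap(1, 1))
        using (var measure = Graphics.FromImage(probe))
        {
            // Measure first, then render at the measured size.
            SizeF size = measure.MeasureString(text, font);
            using (var bmp = new Bitmap((int)size.Width + 1, (int)size.Height + 1))
            using (var g = Graphics.FromImage(bmp))
            using (var ms = new MemoryStream())
            {
                g.Clear(Color.White);
                g.DrawString(text, font, Brushes.Black, 0f, 0f);

                // PNG needs a seekable stream, so render to memory first.
                bmp.Save(ms, ImageFormat.Png);
                ctx.Response.ContentType = "image/png";
                ms.WriteTo(ctx.Response.OutputStream);
            }
        }
    }

    public bool IsReusable { get { return false; } }
}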

If you are on IIS7 or greater you can perform a Referer check without writing any custom code, simply by using IIS URL Rewrite in the manner discussed here. However, since this is still only a Referer check, it has the shortcomings discussed in the other answers.
(For introduction to IIS URL Rewrite see here.)
Excerpt from the first link:
Let me now explain what we have done on this property page:

- Specified the name of the rule as "Prevent Leeching". This must be a unique rule name.
- Every requested URL will be matched, as the pattern is ".*" and is a regular expression.
- Added two conditions and specified that both conditions must be satisfied (see "Logical Grouping" is "Match All"):
  - HTTP_REFERER does not match empty, as it can be a direct reference to the image.
  - HTTP_REFERER does not match my own site, http://www.contoso.com.
- If the above two conditions are satisfied (meaning the request is coming from some other site), we redirect it to pick up some other image, which can be anything.

And that's it. So without writing even a single line of code we are able to prevent hot-linking.
I would probably tailor your Rewrite configuration so that it is only performed on your font URLs (and other static assets of concern) rather than every single incoming request.
If you don't have remote desktop access or are just editing web.config, your rewrite rule will probably look something like:
<rule name="block font leeching" stopProcessing="true">
  <match url="myFontFile.woff" />
  <conditions logicalGrouping="MatchAny">
    <add input="{HTTP_REFERER}" pattern="^$" /><!-- no referrer -->
    <add input="{HTTP_REFERER}" pattern="yourdomain.com" negate="true" /><!-- or not your site -->
  </conditions>
  <action type="AbortRequest" /><!-- block the request -->
</rule>
In this example I chose to block the request entirely (through AbortRequest); however, you could just as well have redirected to a page with a friendly notice.

Not reliably. In order to serve up the embedded fonts they need to be readable by the public, and referable by your CSS.
What you could do is create an ASP.NET page, or a handler, which takes the font file as a parameter, reads the file from somewhere in your web site (App_Data is a good place to put them, since App_Data is not browsable) and writes it out. In the handler you can check the HTTP_REFERER server variable: if it is blank or comes from your site, you serve the file; if it doesn't, you don't.
MSDN has an example of how to serve up a binary file in C#. You'll need to ensure you get the MIME type right; however, be aware this would probably break any caching provided by the browser or proxies. This also wouldn't stop people downloading the fonts by typing the URL into their browser and saving them locally, but if bandwidth is the concern, that's not really going to be a problem.
If you're on IIS7 you could write an Http Module which would do the referrer check for you, Scott Hansleman wrote one for image leeching prevention quite a while ago, you could edit that to match your purposes.
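A minimal sketch of that handler approach, assuming the font lives in App_Data; the handler name, file name, and MIME type are illustrative, and as noted the Referer header is easily spoofed:

using System;
using System.Web;

public class FontHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        string referrer = ctx.Request.ServerVariables["HTTP_REFERER"] ?? string.Empty;

        // Blank referrers are allowed (direct requests); anything else must come
        // from this host. Remember: the Referer header is trivially spoofed.
        bool allowed = referrer.Length == 0 ||
                       new Uri(referrer).Host.Equals(ctx.Request.Url.Host,
                           StringComparison.OrdinalIgnoreCase);
        if (!allowed)
        {
            ctx.Response.StatusCode = 403;
            return;
        }

        // App_Data is not browsable, so the font can only be read via this handler.
        string path = ctx.Server.MapPath("~/App_Data/myFontFile.woff");
        ctx.Response.ContentType = "application/font-woff"; // MIME type may vary
        ctx.Response.WriteFile(path);
    }

    public bool IsReusable { get { return true; } }
}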

You could make an HTTP handler to serve up CSS files. In your custom HTTP handler, check that request.Url.Host equals request.UrlReferrer.Host. If they don't match, set the response to 404 or serve up an empty CSS file.
This is untested but should be close to what you would need.
You would add a link to css like:
<link rel="Stylesheet" href="CustomCSSHandler.ashx?file=site.css" />
using System.Globalization;
using System.IO;
using System.Web;

public class CustomCSSHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        HttpRequest req = ctx.Request;

        // Get the file from the query string
        string file = req.QueryString["file"];

        // Find the actual path
        // NB: consider validating 'file' against path traversal (e.g. reject "..")
        string path = ctx.Server.MapPath(file); // Might need to modify location of css

        // Limit to only css files
        if (Path.GetExtension(path) != ".css")
            ctx.Response.End();

        if (req.UrlReferrer != null && req.UrlReferrer.Host.Length > 0)
        {
            if (CultureInfo.InvariantCulture.CompareInfo.Compare(req.Url.Host, req.UrlReferrer.Host, CompareOptions.IgnoreCase) != 0)
            {
                // Referrer is some other site: point at a file that doesn't exist
                path = ctx.Server.MapPath("~/thiswontexist.css");
            }
        }

        // Make sure file exists
        if (!File.Exists(path))
        {
            ctx.Response.StatusDescription = "File not found";
            ctx.Response.StatusCode = 404;
            ctx.Response.End();
        }

        ctx.Response.StatusCode = 200;
        ctx.Response.ContentType = "text/css";
        ctx.Response.WriteFile(path);
    }

    public bool IsReusable { get { return true; } }
}

Related

Parse page using AngleSharp

I want to parse a website using C# with AngleSharp. It's easy to do with static pages, but there is a problem: I can't parse info available only to authorized users. What should I do to authorize programmatically into the website and parse all the info available to me?
Depending on the used authorization scheme this may either be super simple or ultra hard / impossible.
So let's first visit what can be done with AngleSharp:
- Any kind of request incl. its manipulation (on request, but also before response)
- General cookie management (and their manipulation, of course)
- Querying the DOM and performing "simple" actions (e.g., clicking a button, submitting a form)
- Running trivial JavaScript files

Here trivial means: scripts that do not need any capabilities beyond what AngleSharp offers (e.g., rendering tree information, advanced CSSOM access, ...) and that do not require a non-ES5 compliant parser (e.g., make use of ES6 or some special non-standard capabilities).
Now, since I do not know what the authorization scheme is, or the exact problem you are hitting (some code / an MWE would be helpful!), I'll just go for a simple login example.
// WithCookies() is what persists the auth cookie between the login POST and later requests
var context = BrowsingContext.New(Configuration.Default.WithDefaultLoader().WithCookies());
var loginPage = await context.OpenAsync("http://yourpage.com");
var loginForm = loginPage.QuerySelector<IHtmlFormElement>("form");
var profilePage = await loginForm.SubmitAsync(new { userName = "myUser", password = "password" });
// get something on profilePage
Note that in this example the form field names for the login form are userName and password - they may be different for your login page. Also note that your page may contain multiple forms and the selector could be more sophisticated than a simple form.
HTH!

How can I get various maximum request segment lengths from my webserver?

We're building an application on the following stack:
.NET 4.6.1
ASP.NET MVC 5
IIS 8
We have people generating very long search filter query strings in our application. Currently the resulting GET requests result in an exception on the server. We'd like to warn them if their request string is going to be too long, rather than submitting an invalid GET request.
Our .js guy can shim AJAX requests to support length checking for the full URL and the querystring pretty easily. The problem is we don't want to assume we know the right maximum lengths for those values. We want to query the server to know what the maximum lengths for the URL and querystring are, and just use that dynamically on the client side.
I can build the Web API endpoint to return those values very easily, but I'm having trouble being certain I'm getting the right values. For example, I can read our configuration file directly and look for things like this:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxUrl="2000" maxQueryString="2000" />
    </requestFiltering>
  </security>
</system.webServer>
However, I'm not certain that is really the authoritative place to look. I'm hoping somebody knows how to use C# to ask IIS the following questions and be certain of having the right answers:
- What is the maximum allowed URL length for a GET request to the current application?
- What is the maximum allowed querystring length for a GET request to the current application?
Here's how you can read the config values - which can then be returned by your API:
using System.Web.Configuration;
using System.Xml.Linq;

// Open the application's web.config and dig out the requestLimits element
var config = WebConfigurationManager.OpenWebConfiguration(Request.ApplicationPath);
var section = config.GetSection("system.webServer");
var xml = section.SectionInformation.GetRawXml();
var doc = XDocument.Parse(xml);
var element = doc.Root.Element("security").Element("requestFiltering").Element("requestLimits");
var maxUrl = element.Attribute("maxUrl").Value;
var maxQs = element.Attribute("maxQueryString").Value;
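Wrapped in the Web API endpoint the asker mentions, it might look like this sketch (controller name and route are illustrative):

using System.Web.Http;
using System.Xml.Linq;

public class RequestLimitsController : ApiController
{
    // GET api/requestlimits - lets client-side code fetch the server's limits
    // instead of hard-coding them.
    public IHttpActionResult Get()
    {
        var config = System.Web.Configuration.WebConfigurationManager
            .OpenWebConfiguration("~");
        var xml = config.GetSection("system.webServer").SectionInformation.GetRawXml();
        var limits = XDocument.Parse(xml)
            .Root.Element("security").Element("requestFiltering").Element("requestLimits");

        return Ok(new
        {
            maxUrl = (int)limits.Attribute("maxUrl"),
            maxQueryString = (int)limits.Attribute("maxQueryString")
        });
    }
}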
Keep in mind though that the client (browser) will impose its own limit, as can proxy servers.
The SO answer below implies that you may have to set the limit in the httpRuntime element as well, although it is not explained why. See: How to configure the web.config to allow requests of any length
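For reference, both limits can be declared side by side; a sketch with illustrative values (requestFiltering is enforced by IIS, while httpRuntime's maxUrlLength/maxQueryStringLength are enforced by ASP.NET, so the effective limit is whichever is smaller):

<configuration>
  <system.web>
    <!-- Enforced by ASP.NET (.NET 4.0+) -->
    <httpRuntime maxUrlLength="2000" maxQueryStringLength="2000" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- Enforced by IIS request filtering -->
        <requestLimits maxUrl="2000" maxQueryString="2000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>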
This article on www.asp.net may also be helpful with regard to increasing the allowable URL size: http://www.asp.net/whitepapers/aspnet4#0.2__Toc253429244

Cache images provided through script

I have a script which, by using several querystring variables, provides an image. I am also using URL rewriting within IIS 7.5.
So images have a URL like this:
http://mydomain/pictures/ajfhajkfhal/44/thumb.jpg
or
http://mydomain/pictures/ajfhajkfhal/44.jpg
This is rewritten to:
http://mydomain/Picture.aspx?group=ajfhajkfhal&id=44&thumb=thumb.jpg
or
http://mydomain/Picture.aspx?group=ajfhajkfhal&id=44
I added caching rules to IIS to cache JPG images when they are requested. This works for my images that are real files on disk. When images are provided through the script, however, they are always requested through the script, without being cached.
The images do not change that often, so keeping them cached for at least 30 minutes (or until the file changes) would be best.
I am using .NET/C# 4.0 for my website. I tried setting several cache options in C#, but I can't seem to find how to cache these images client-side, while my static images are cached properly.
EDIT: I use the following options to cache the image on the client side, where 'fileName' is the physical filename of the image (on disk).
context.Response.AddFileDependency(fileName);
context.Response.Cache.SetETagFromFileDependencies();
context.Response.Cache.SetLastModifiedFromFileDependencies();
context.Response.Cache.SetCacheability(HttpCacheability.Public);
context.Response.Cache.SetExpires(DateTime.Now.AddTicks(600));
context.Response.Cache.SetMaxAge(new TimeSpan(0, 5, 0));
context.Response.Cache.SetSlidingExpiration(true);
context.Response.Cache.SetValidUntilExpires(true);
context.Response.ContentType = "image/jpg";
EDIT 2: Thanks for pointing that out, that was indeed a very stupid mistake ;). I changed it to 30 minutes from now (DateTime.Now.AddMinutes(30)).
But this doesn't solve the problem. I really think the problem lies with Firefox. I use Firebug to track each request, and somehow I think I am doing something fundamentally wrong. Normal images (which are cached and static) give back a "304 (Not Modified)" response code, while my page always gives back a "200 (OK)".
(Screenshot of the Firebug network panel: http://images.depl0y.com/capture.jpg)
If what you mean by "script" is the code in your Picture.aspx, I should point out that C# is not a scripting language, so it is technically not a script.
You can use the Caching API provided by ASP.NET.
I assume you already have a method which contains something like this. Here is how you can use the Caching API:
string fileName = ... // The name of your file
byte[] bytes = null;

if (HttpContext.Current.Cache[fileName] != null)
{
    bytes = (byte[])HttpContext.Current.Cache[fileName];
}
else
{
    bytes = ... // Retrieve your image's bytes
    HttpContext.Current.Cache[fileName] = bytes; // Set the cache
}

// Send it to the client
Response.BinaryWrite(bytes);
Response.Flush();
Note that the keys you use in the cache must be unique to each cached item, so it might not be enough to just use the name of the file for this purpose.
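For example, a hypothetical helper that builds the key from the querystring parts identifying one generated image (group, id, and thumb are the parameters from the rewritten URLs above):

// Builds a cache key from the querystring parts that identify one generated
// image, so distinct images never collide in the cache.
private static string ImageCacheKey(string group, string id, string thumb)
{
    return string.Format("img:{0}:{1}:{2}", group, id, thumb);
}

// Usage: HttpContext.Current.Cache[ImageCacheKey(group, id, thumb)] = bytes;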
EDIT:
If you want to enable caching the content on the client side, use the following:
Response.Cache.SetCacheability(HttpCacheability.Public);
You can experiment with the different HttpCacheability values. With this, you can specify how and where the content should be cached. (Eg. on the server, on proxies, and on the client)
This will make ASP.NET send the caching rules to the client with the appropriate HTTP headers.
This will not guarantee that the client will actually cache it (it depends on browser settings, for example), but it will tell the browser "You should cache this!"
The best practice would be to use caching on both the client and the server side.
EDIT 2:
The problem with your code is the SetExpires(DateTime.Now.AddTicks(600)). 600 ticks is only a fraction of a second... (1 second = 10000000 ticks)
Basically, the content gets cached but expires the moment it gets to the browser.
Try these:
context.Response.Cache.SetExpires(DateTime.Now.AddMinutes(5));
context.Response.Cache.SetMaxAge(TimeSpan.FromMinutes(5));
(The TimeSpan.FromMinutes is also more readable than new TimeSpan(...).)

Can a page opt out of IIS 7 compression?

My pages are automatically being compressed by IIS7 with GZIP.
That is great... but, for one particular page, I need to stream it to the user, using Response.Flush() when needed. But when the output is being compressed, the IIS server seems to collect all my output until the page is done before compressing and sending it to the client. That nullifies my attempt to Flush the content out to the user.
Is there a way that I can have this one page opt out of the compression?
One possible option
I've determined that if I manually set the content type to one that does not match the IIS configuration at c:\windows\system32\inetsrv\config\applicationhost.config, then IIS will not compress it. Eg. Response.ContentType = "x-text/html". This works okay with IE8, as it falls back to display the HTML. But Firefox will ask the user what to do with the unknown file type.
This could work, if there was another Mime Type I could use that browsers would accept as HTML, that is not matched in the applicationhost.config. For reference, these are the mime types that will be compressed:
<add mimeType="text/*" enabled="true" />
<add mimeType="message/*" enabled="true" />
<add mimeType="application/x-javascript" enabled="true" />
<add mimeType="application/atom+xml" enabled="true" />
<add mimeType="application/xaml+xml" enabled="true" />
Other options?
Are there other options to opt out of compression?
It may not be possible to disable compression for a certain page, but you can for a directory.
This describes how to disable static compression, but it may work for dynamic compression: (From http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/502ef631-3695-4616-b268-cbe7cf1351ce.mspx?mfr=true)
To disable static compression for only a single directory, first enable global static compression (if it is disabled) and then disable static compression at that directory. For example, to disable static compression for the directory at http://www.contoso.com/Home/StyleSheets, perform the following steps:
1. Enable global static compression by executing the following command at a command prompt:
adsutil set w3svc/filters/compression/parameters/HcDoStaticCompression true
2. Disable static compression at this directory by executing the following command at a command prompt:
adsutil set w3svc/1/root/Home/StyleSheets/DoStaticCompression false
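On IIS7+, where the asker's app lives, the same per-directory (or per-file) opt-out can likely be expressed declaratively in web.config via a location tag, avoiding adsutil, assuming the urlCompression section is unlocked for delegation; the path below is illustrative:

<location path="streaming-page.aspx">
  <system.webServer>
    <!-- Turn off both static and dynamic compression for just this URL -->
    <urlCompression doStaticCompression="false" doDynamicCompression="false" />
  </system.webServer>
</location>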
Not sure I like this but maybe worth mentioning:
Disable GZIP compression for IE6 clients
You could use a custom made compression module, like this one:
HTTP compression of WebResource.axd and pages in ASP.NET
Using such a module, it should be easy to customize which files to include/exclude.
I know of no way for a page to disable compression programmatically during the request. However, you can work around the compression and send some extra padding garbage, enough for gzip to start a new block. Your padding data should be as random as possible so it doesn't compress well, filling the deflate buffer faster.
The actual amount of data to send depends on the compression module's configuration.
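A rough sketch of that workaround; the 4 KB padding size is a guess that you would tune against your server's compression buffer:

// Emit incompressible padding inside an HTML comment so the compressed
// stream fills its buffer and IIS is forced to flush a block to the client.
private static void FlushThroughGzip(System.Web.HttpResponse response)
{
    var random = new System.Random();
    var padding = new byte[4096]; // size is a guess; tune as needed
    random.NextBytes(padding);

    response.Write("<!-- ");
    response.Write(System.Convert.ToBase64String(padding)); // random => compresses poorly
    response.Write(" -->");
    response.Flush();
}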
If you set Response.BufferOutput = false it will stop the built-in compression from working, albeit not cleanly. You may get event log warnings that headers cannot be added after they have already been sent to the client.
If you need a solution which depends only on C#, you may adapt this method I have written to cope with a problem in the Android Browser:
/// <summary>
/// Alters the current HTTP response only for Android user agents, in order to disable
/// web page compression, so the Android browser will not cut off most of the page
/// content based on the Content-Length HTTP header.
/// </summary>
public static void fixAndroidPageDisplay()
{
    HttpContext c = HttpContext.Current;
    if (c == null) return;

    HttpRequest r = c.Request;
    if (r == null || r.UserAgent == null) return;

    if (r.UserAgent.ToLowerInvariant().Contains("android"))
    {
        HttpResponse rsp = c.Response;
        if (rsp == null) return;

        // Look for an existing Content-Encoding header (e.g. "gzip")
        string ce = null;
        foreach (string s in rsp.Headers.Keys)
        {
            if (s != null && s.ToLowerInvariant().Equals("content-encoding"))
            {
                ce = s;
            }
        }

        if (ce != null)
        {
            // Drop the encoding header and detach the compression filter by
            // pointing the filter back at the raw output stream
            rsp.Headers.Remove(ce);
            rsp.Filter = rsp.OutputStream;
        }
    }
}

Truncating Query String & Returning Clean URL C# ASP.net

I would like to take the original URL, truncate the query string parameters, and return a cleaned-up version of the URL. I would like this to occur across the whole application, so doing it in global.asax would be ideal. Also, I think a 301 redirect would be in order as well.
ie.
in: www.website.com/default.aspx?utm_source=twitter&utm_medium=social-media
out: www.website.com/default.aspx
What would be the best way to achieve this?
System.Uri is your friend here. This has many helpful utilities on it, but the one you want is GetLeftPart:
string url = "http://www.website.com/default.aspx?utm_source=twitter&utm_medium=social-media";
Uri uri = new Uri(url);
Console.WriteLine(uri.GetLeftPart(UriPartial.Path));
This gives the output: http://www.website.com/default.aspx
[The Uri class does require the protocol, http://, to be specified]
GetLeftPart basically says "get the left part of the URI up to and including the part I specify". This can be Scheme (just the http:// bit), Authority (the www.website.com part), Path (the /default.aspx) or Query (the querystring).
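To make the four options concrete, here is what each UriPartial value returns for the example URL:

Uri uri = new Uri("http://www.website.com/default.aspx?utm_source=twitter");

Console.WriteLine(uri.GetLeftPart(UriPartial.Scheme));    // http://
Console.WriteLine(uri.GetLeftPart(UriPartial.Authority)); // http://www.website.com
Console.WriteLine(uri.GetLeftPart(UriPartial.Path));      // http://www.website.com/default.aspx
Console.WriteLine(uri.GetLeftPart(UriPartial.Query));     // http://www.website.com/default.aspx?utm_source=twitter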
Assuming you are on an aspx web page, you can then use Response.Redirect(newUrl) to redirect the caller.
Here is a simple trick:
Dim uri = New Uri(Request.Url.AbsoluteUri)
Dim reqURL = uri.GetLeftPart(UriPartial.Path)
Here is a quick way of getting the root of the URL, without the path and query.
string path = Request.Url.AbsoluteUri.Replace(Request.Url.PathAndQuery,"");
This may look a little better.
string rawUrl = String.Concat(this.GetApplicationUrl(), Request.RawUrl);
if (rawUrl.Contains("/post/"))
{
    bool hasQueryStrings = Request.QueryString.Keys.Count > 1;
    if (hasQueryStrings)
    {
        Uri uri = new Uri(rawUrl);
        rawUrl = uri.GetLeftPart(UriPartial.Path);
        HtmlLink canonical = new HtmlLink();
        canonical.Href = rawUrl;
        canonical.Attributes["rel"] = "canonical";
        Page.Header.Controls.Add(canonical);
    }
}
Followed by a function to properly fetch the application URL.
Works perfectly.
I'm guessing that you want to do this because you want your users to see pretty looking URLs. The only way to get the client to "change" the URL in its address bar is to send it to a new location - i.e. you need to redirect them.
Are the query string parameters going to affect the output of your page? If so, you'll have to look at how to maintain state between requests (session variables, cookies, etc.) because your query string parameters will be lost as soon as you redirect to a page without them.
There are a few ways you can do this globally (in order of preference):
If you have direct control over your server environment then a configurable server module like ISAPI_ReWrite or IIS 7.0 URL Rewrite Module is a great approach.
A custom IHttpModule is a nice, reusable roll-your-own approach.
You can also do this in global.asax, as you suggest (a sketch follows this list)
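A minimal global.asax sketch of that approach, assuming you only want to strip tracking parameters such as utm_* (Response.RedirectPermanent requires .NET 4; on earlier versions set the 301 status manually):

// In Global.asax.cs - issues a permanent redirect to the bare path whenever
// a GET request arrives carrying utm_ tracking noise in its querystring.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpRequest request = HttpContext.Current.Request;

    // Only redirect GETs that actually carry tracking parameters
    if (request.HttpMethod == "GET" &&
        request.Url.Query.Contains("utm_"))
    {
        string cleanUrl = request.Url.GetLeftPart(UriPartial.Path);
        HttpContext.Current.Response.RedirectPermanent(cleanUrl); // 301
    }
}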
You should only use the 301 response code if the resource has indeed moved permanently. Again, this depends on whether your application needs to use the query string parameters. If you use a permanent redirect a browser (that respects the 301 response code) will skip loading a URL like .../default.aspx?utm_source=twitter&utm_medium=social-media and load .../default.aspx - you'll never even know about the query string parameters.
Finally, you can use POST method requests. This gives you clean URLs and lets you pass parameters in, but will only work with <form> elements or requests you create using JavaScript.
Take a look at the UriBuilder class. You can create one from a URL string, and the object will then parse this URL and let you access just the elements you desire.
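For instance, a small sketch that rebuilds the example URL without its query:

var builder = new UriBuilder("http://www.website.com/default.aspx?utm_source=twitter");
builder.Query = string.Empty;   // drop the querystring entirely
Console.WriteLine(builder.Uri); // http://www.website.com/default.aspx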
After completing whatever processing you need to do on the query string, just split the url on the question mark:
Dim _CleanUrl As String = Request.Url.AbsoluteUri.Split("?"c)(0)
Response.Redirect(_CleanUrl)
Granted, my solution is in VB.NET, but I'd imagine that it could be ported over pretty easily. And since we are only looking for the first element of the split, it even "fails" gracefully when there is no querystring.
