I have a website (SITE_DOMAIN) that needs URL rewrite rules.
The site is built with ASP.NET Web Forms.
For example, SITE_DOMAIN/abbigliamento/donna/jeans maps to SITE_DOMAIN/Products/Donna/0/42/1.
I have a table called Rewrites with the fields Chiave, Pagina and Param1 through Param5 (as used in the code below).
In Global.asax I have:
public static List<Rewrite> rewrites = null;
public static string oldChiave = "";

public void GetRewrites()
{
    if (rewrites == null)
        rewrites = Rewrite.getRules(); // reads from table (about 5000 rows)
}

protected void Application_BeginRequest(object sender, EventArgs e)
{
    GetRewrites();
    String fullOriginalPath = Request.Url.ToString();
    int index = fullOriginalPath.IndexOf('/', fullOriginalPath.IndexOf(SITE_DOMAIN)) + 1;
    string chiave = fullOriginalPath.Substring(index).ToLower();
    if (oldChiave != chiave)
    {
        oldChiave = chiave;
        Rewrite r = rewrites.Find(y => y.Chiave == chiave);
        if (r != null)
        {
            string url = "/" + r.Pagina;
            if (r.Param1 != null)
                url += "/" + r.Param1;
            if (r.Param2 != null)
                url += "/" + r.Param2;
            if (r.Param3 != null)
                url += "/" + r.Param3;
            if (r.Param4 != null)
                url += "/" + r.Param4;
            if (r.Param5 != null)
                url += "/" + r.Param5;
            Context.RewritePath(url);
        }
        // If the key was not found among the stored keys, the URL might instead be the
        // composed form of the parameters in Param1..5, e.g. /Products/Uomo/0/0/1,
        // which must be turned back into Scarpe-Uomo.
        string[] param = chiave.Split('/');
        if (param.Length == 5)
        {
            r = rewrites.Find(x => x.Pagina == param[0] &&
                x.Param1 == param[1] &&
                x.Param2 == param[2] &&
                x.Param3 == param[3] &&
                x.Param4 == param[4]);
            if (r != null)
                Response.Redirect("/" + r.Chiave);
        }
        if (param.Length == 6)
        {
            r = rewrites.Find(x => x.Pagina == param[0] &&
                x.Param1 == param[1] &&
                x.Param2 == param[2] &&
                x.Param3 == param[3] &&
                x.Param4 == param[4] &&
                x.Param5 == param[5]);
            if (r != null)
                Response.Redirect("/" + r.Chiave);
        }
    }
}
The website is too slow, and I'm sure the problem is here.
The Global.asax Application_BeginRequest method is fired 3 or more times when I click a link.
Is there any other method I could use, or any third-party DLL?
P.S. For completeness, my Global.asax also contains the following method:
using Microsoft.AspNet.FriendlyUrls;

protected void Application_Start(object sender, EventArgs e)
{
    RouteTable.Routes.EnableFriendlyUrls();
    RouteTable.Routes.MapPageRoute("", "home", "~/Default.aspx");
    RouteTable.Routes.MapPageRoute("", "carrello", "~/Carrello.aspx");
    RouteTable.Routes.MapPageRoute("", "contatti", "~/Contatti.aspx");
    RouteTable.Routes.MapPageRoute("", "checkout", "~/Checkout2.aspx");
    RouteTable.Routes.MapPageRoute("", "logout", "~/Logout.aspx");
    RouteTable.Routes.MapPageRoute("", "pagamenti", "~/Pagamenti.aspx");
    RouteTable.Routes.MapPageRoute("", "chi-siamo/scarpe-online-di-marca", "~/ChiSiamo.aspx");
    RouteTable.Routes.MapPageRoute("", "i-miei-ordini", "~/PageOrdini.aspx");
    RouteTable.Routes.MapPageRoute("", "pre-checkout", "~/PreCheckout.aspx");
    RouteTable.Routes.MapPageRoute("", "privacy-and-cookies", "~/PrivacyAndCookies.aspx");
    RouteTable.Routes.MapPageRoute("", "ricerca-prodotto/{Filtri}/{Pagina}", "~/ProductsSearch.aspx");
    RouteTable.Routes.MapPageRoute("", "il-mio-profilo", "~/Profilo.aspx");
    RouteTable.Routes.MapPageRoute("", "registrazione", "~/Registrazione.aspx");
    RouteTable.Routes.MapPageRoute("", "resi", "~/Resi.aspx");
    RouteTable.Routes.MapPageRoute("", "spedizioni", "~/Spedizioni.aspx");
    RouteTable.Routes.MapPageRoute("", "termini-e-condizioni", "~/TerminiECondizioni.aspx");
    RouteTable.Routes.MapPageRoute("", "grazie", "~/Thanks.aspx");
    RouteTable.Routes.MapPageRoute("", "product/{ProductId}", "~/Product.aspx");
    RouteTable.Routes.MapPageRoute("", "products/{Menu}/{Marca}/{Categoria}/{Pagina}", "~/Products.aspx");
    RouteTable.Routes.MapPageRoute("", "blog/{Pagina}", "~/Blog.aspx");
    RouteTable.Routes.MapPageRoute("", "lista-dei-desideri/{Pagina}", "~/Wishlist.aspx");
    RouteTable.Routes.MapPageRoute("", "blogpost/{NewsId}", "~/BlogPost.aspx");
}
These rewrites are fixed, so they don't need to be stored in the table mentioned earlier.
Since you are open to third-party solutions ("Is there any other method I could use, or any third-party DLL?"), you can consider the following.
Option 1: Since your website is built on the Microsoft stack, I assume you will deploy it on an IIS web server. If so, you can use the URL Rewrite module extension for IIS to handle all of your rewrites. Your rules would be saved in the web.config file.
Option 2: Use third-party software.
Option 1 Steps
Open IIS Manager and select your web site.
Click URL Rewrite in the Feature View. If you don't see this option, the URL Rewrite extension might not be installed; get it from https://www.iis.net/downloads/microsoft/url-rewrite and install it.
Click Add Rules in the Actions pane (on the right side).
Select Blank Rule in the Add Rules dialog box.
Create your rules. The rule options are pretty self-explanatory.
Since your rules are stored in a database, you might need to write code that reads them from the database (which you seem to have already done) and reformats them to conform to the IIS rule format. At the end of the day, rules are name-value pairs. Say you have a prod.aspx page that takes product id and size parameters to display product details, and your current URL is "/products/prod.aspx?id=1234&size=3". In the IIS rule mapping dialog box, the "Original Value" would be "/product-details" and the "New Value" would be "/products/prod.aspx?id=1234&size=3". The original value specifies the URL path to rewrite from, and the new value specifies the URL path to rewrite to.
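For illustration, here is roughly what such a mapping can look like in web.config once the module is installed; the map name and the entry below are hypothetical examples, not your actual rules:

<system.webServer>
  <rewrite>
    <rewriteMaps>
      <!-- Each entry maps a friendly path to the real page; these values are examples -->
      <rewriteMap name="StaticRewrites">
        <add key="/product-details" value="/products/prod.aspx?id=1234&amp;size=3" />
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <rule name="RewriteMapRule" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- Look the requested path up in the map; rewrite only if a match is found -->
          <add input="{StaticRewrites:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Rewrite" url="{C:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>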
You might need to use a custom rewrite provider with the URL Rewrite Module to interface directly with your SQL tables and import the rules. You can read more about how to do this here: https://learn.microsoft.com/en-us/iis/extensions/url-rewrite-module/using-custom-rewrite-providers-with-url-rewrite-module
Option 2
ISAPI Rewrite by Helicon Tech (http://www.isapirewrite.com/)
Ionic's Isapi Rewrite Filter (https://github.com/DinoChiesa/IIRF)
I have not used any of the above software products myself and therefore, I couldn't tell you how well they work. Try them at your own risk.
As other answers have suggested, there are various middleware options and URL rewrite modules that can be applied outside of your running application, but if you do not understand where your actual bottlenecks are, how can you be sure that any change in this area will be effective?
There are multiple compounding issues at play here.
The website is too slow
What is your definition of "too slow", and what response times are you looking to achieve?
Whilst the complex route logic looks like a prime candidate, even if it is inefficient, worst case it should only add a couple of hundred milliseconds to each response.
Be aware that each page load, depending on the architecture and the code within it, will usually trigger multiple requests, sometimes tens or hundreds of individual requests for content. It might not be the routing that is the issue, but rather the servicing of the content in some or all of those requests.
Use the dev tools in your browser to identify which individual requests in the page life cycle take the longest to execute; the routing itself might not be a significant factor.
Loading the list of routes
Whilst what we assume to be a database call is loaded once into a static variable, the code pattern is ambiguous; instead of executing it "on request" it should be forced into a single call in Application_Start. But do not overlook the importance of making that call efficient: if this simple DB lookup takes too long to respond, we can safely assume that all other queries against the database are also slow, and that is the MOST LIKELY factor in your response times.
5K records doesn't sound like much, but in some frameworks and low-cost architectures it might add a few seconds to cold-start times, in which case you should consider a compilation or deployment step that reads the data from the database into a local file, or use T4 templates or something similar to generate code from the rules; a sketch of the file-export variant follows.
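For example, a deploy-time export step could look roughly like this. The file name is illustrative, and Newtonsoft.Json is assumed here purely for the sketch:

// Hypothetical deploy/build step: dump the Rewrites table to a local JSON file
// so the site can load the rules at startup without a database round-trip.
var rules = Rewrite.getRules();
System.IO.File.WriteAllText(
    "App_Data/rewrites.json",
    Newtonsoft.Json.JsonConvert.SerializeObject(rules));

// At startup, the site would then load the file instead of querying the database:
// rewrites = Newtonsoft.Json.JsonConvert.DeserializeObject<List<Rewrite>>(
//     System.IO.File.ReadAllText(Server.MapPath("~/App_Data/rewrites.json")));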
You need to know how long Rewrite.getRules() takes to execute to make decisions around this point. Simple code like the following will capture the time spent getting the rules; if the result is unacceptable, that is a clear place to optimise. I would expect 5K rows to come back in under 1 second in a production environment:
public static List<Rewrite> rewrites = null;
public static TimeSpan _getRules_Duration = TimeSpan.Zero;

public void GetRewrites()
{
    if (rewrites == null)
    {
        var sw = System.Diagnostics.Stopwatch.StartNew();
        try
        {
            rewrites = Rewrite.getRules(); // reads from table (about 5000 rows)
        }
        finally
        {
            sw.Stop();
            _getRules_Duration = sw.Elapsed;
        }
    }
}
Post the value of _getRules_Duration. If it is more than 1 second, something is seriously wrong with your data access implementation; if a simple query like that takes too long, then your entire data-driven web site is doomed from the start. Referring back to the points above: if a page makes multiple individual hits to the database, each of those queries may also be affected by the poorly performing DAL.
Even 1 second is generous. I just profiled a much more complex EF query, returning over 10K rows with 4 levels of Included relationship data, and the whole lot comes back in under 100ms. The whole page response is closer to 700ms, so even if I optimised the DB query, the best-case scenario is a page load closer to 600ms. Is it worth the effort for such a minimal improvement?
A poor DAL can be caused by poor code or poorly implemented ORM logic; however, we often overlook the operating environment and the resources the deployed website runs on. The latency between the web client, the web server, the DAL and the database introduces physical hardware bottlenecks that your code may need to respect if you can't improve the bandwidth or resources.
Move the call that loads the routes into Application_Start to make it obvious that this data is only loaded once in the application lifecycle. It is needed for every request, so you want to know early if it is going to fail:
protected void Application_Start(object sender, EventArgs e)
{
    GetRewrites();
    RouteTable.Routes.EnableFriendlyUrls();
    ...
}
Refactor your logic so you can test it
Right now, the OP and all the other posts are focusing on the logic used to match the routes, but as with the load times, if the route logic only takes 200ms to evaluate, then at best we can only reduce each individual response time by that amount. Let's refactor this logic into a form that we can both test and measure:
private string oldChiave;

protected void Application_BeginRequest(object sender, EventArgs e)
{
    String fullOriginalPath = Request.Url.ToString();
    int index = fullOriginalPath.IndexOf('/', fullOriginalPath.IndexOf(SITE_DOMAIN)) + 1;
    string chiave = fullOriginalPath.Substring(index).ToLower();
    if (oldChiave != chiave)
    {
        oldChiave = chiave;
        if (TryGetRewritePath(chiave, out string rewriteUrl))
            Context.RewritePath(rewriteUrl);
        else if (TryGetRedirectPath(chiave, out string redirectUrl))
            Response.Redirect(redirectUrl);
    }
}
public bool TryGetRewritePath(string chiave, out string url)
{
    url = null; // out parameter must be assigned on every path
    Rewrite r = rewrites.Find(y => y.Chiave == chiave);
    if (r != null)
    {
        url = "/" + r.Pagina;
        if (r.Param1 != null)
            url += "/" + r.Param1;
        if (r.Param2 != null)
            url += "/" + r.Param2;
        if (r.Param3 != null)
            url += "/" + r.Param3;
        if (r.Param4 != null)
            url += "/" + r.Param4;
        if (r.Param5 != null)
            url += "/" + r.Param5;
        return true;
    }
    return false;
}
public bool TryGetRedirectPath(string chiave, out string url)
{
    // If the key was not found among the stored keys, the URL might be the composed
    // form of the parameters in Param1..5, e.g. /Products/Uomo/0/0/1, which must be
    // turned back into Scarpe-Uomo.
    url = null;
    string[] param = chiave.Split('/');
    if (param.Length == 5)
    {
        Rewrite r = rewrites.Find(x => x.Pagina == param[0] &&
            x.Param1 == param[1] &&
            x.Param2 == param[2] &&
            x.Param3 == param[3] &&
            x.Param4 == param[4]);
        if (r != null)
        {
            url = "/" + r.Chiave;
            return true;
        }
    }
    if (param.Length == 6)
    {
        Rewrite r = rewrites.Find(x => x.Pagina == param[0] &&
            x.Param1 == param[1] &&
            x.Param2 == param[2] &&
            x.Param3 == param[3] &&
            x.Param4 == param[4] &&
            x.Param5 == param[5]);
        if (r != null)
        {
            url = "/" + r.Chiave;
            return true;
        }
    }
    return false;
}
Now, as with the GetRewrites example, we can use the Stopwatch again to record the duration of each request. Here we simply trace out the info; you can adapt this to your needs, perhaps storing the maximum or average processing time.
The overall point is that you need data to inform you if this is the source of your overall performance issues.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    String fullOriginalPath = Request.Url.ToString();
    int index = fullOriginalPath.IndexOf('/', fullOriginalPath.IndexOf(SITE_DOMAIN)) + 1;
    string chiave = fullOriginalPath.Substring(index).ToLower();
    if (oldChiave != chiave)
    {
        oldChiave = chiave;
        var sw = System.Diagnostics.Stopwatch.StartNew();
        if (TryGetRewritePath(chiave, out string rewriteUrl))
        {
            sw.Stop();
            // log the duration
            System.Diagnostics.Trace.WriteLine($"URL Rewrite evaluated in: {sw.ElapsedMilliseconds}ms. '{chiave}' => '{rewriteUrl}'");
            Context.RewritePath(rewriteUrl);
        }
        else if (TryGetRedirectPath(chiave, out string redirectUrl))
        {
            sw.Stop();
            // log the duration (this includes the rewrite lookup above AS WELL)
            System.Diagnostics.Trace.WriteLine($"URL Redirect evaluated in: {sw.ElapsedMilliseconds}ms. '{chiave}' => '{redirectUrl}'");
            Response.Redirect(redirectUrl);
        }
        else
        {
            sw.Stop();
            // log the duration (this includes both lookups above AS WELL)
            System.Diagnostics.Trace.WriteLine($"NO REDIRECT evaluated in: {sw.ElapsedMilliseconds}ms. '{chiave}'");
        }
    }
}
OP, please post the times your code takes to complete these functions; from that you can decide whether the route processing is a significant factor or not.
Chiave Logic
The other obvious issue in your code is that you have tried to prevent repeated evaluation of the route logic based on whether the chiave has changed. If your site serves requests for multiple values of the path prefix, I'm not sure this logic is correct: if two different users are browsing under different chiave values at the same time, each request overwrites the previous value of oldChiave and the same logic is executed all over again.
As a minimal step, make sure that oldChiave is an instance member, not a static one. But I'm not sure that really helps your issue at all; what you probably want to implement is the following type of logic:
On Request:
- If the current URL has already been evaluated for redirect, use the previous result
- Otherwise, check if we need to redirect, and if we do, cache the result for next time.
As usual, there are a lot of different code patterns to go about this, however it is first important to know if this is going to improve your response times, and if so, by how much. That can only be determined by analysing the logic.
Note that static is used here because we don't know which application instance might be serving the request; and since Application_BeginRequest can run concurrently, a ConcurrentDictionary is used below so that lookups and inserts are thread-safe.
static System.Collections.Concurrent.ConcurrentDictionary<string, Tuple<string, string>> rewriteCache =
    new System.Collections.Concurrent.ConcurrentDictionary<string, Tuple<string, string>>();

protected void Application_BeginRequest(object sender, EventArgs e)
{
    String fullOriginalPath = Request.Url.ToString();
    int index = fullOriginalPath.IndexOf('/', fullOriginalPath.IndexOf(SITE_DOMAIN)) + 1;
    string chiave = fullOriginalPath.Substring(index).ToLower();
    if (rewriteCache.TryGetValue(chiave, out Tuple<string, string> cacheItem))
    {
        if (cacheItem != null)
        {
            if (cacheItem.Item1 != null) Context.RewritePath(cacheItem.Item1);
            if (cacheItem.Item2 != null) Response.Redirect(cacheItem.Item2);
        }
    }
    else
    {
        if (TryGetRewritePath(chiave, out string rewriteUrl))
        {
            rewriteCache.TryAdd(chiave, new Tuple<string, string>(rewriteUrl, null));
            Context.RewritePath(rewriteUrl);
        }
        else if (TryGetRedirectPath(chiave, out string redirectUrl))
        {
            rewriteCache.TryAdd(chiave, new Tuple<string, string>(null, redirectUrl));
            Response.Redirect(redirectUrl);
        }
        else
        {
            // Cache the no-redirect scenario
            rewriteCache.TryAdd(chiave, null);
        }
    }
}
Application_BeginRequest method is fired 3 or more times when I click a link
To minimize the calls: Application_BeginRequest is called on every request, not only for .aspx files.
You can limit the rewrite logic to .aspx and handler files. Here is an example of how to do that:
string sExtentionOfThisFile = System.IO.Path.GetExtension(HttpContext.Current.Request.Path);
if (sExtentionOfThisFile.Equals(".aspx", StringComparison.InvariantCultureIgnoreCase) ||
    sExtentionOfThisFile.Equals(".ashx", StringComparison.InvariantCultureIgnoreCase))
{
    // run your rewrites here
}
rewrites = Rewrite.getRules(); //reads from table (about 5000 rows)
Rewrite r = rewrites.Find(y => y.Chiave == chiave);
This is the point where you perform a linear search over 5000 items (an average of about 2500 string comparisons) on every call. Imagine if the most-accessed pages are at the end of the list: that is nearly 5000 string comparisons per request. This is where you have the main issue.
To make it faster in your specific case, I suggest using a Dictionary<>. It needs a little more code and a different way of searching, but it will make the difference.
Also optimise the parameter search the same way, replacing this linear scan with a Dictionary<> keyed on the composed parameters (see the sketch after the code below):
r = rewrites.Find(x => x.Pagina == param[0] &&
x.Param1 == param[1] &&
x.Param2 == param[2] &&
x.Param3 == param[3] &&
x.Param4 == param[4]);
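A minimal sketch of the Dictionary<> approach, assuming the Rewrite fields used in the question (Chiave, Pagina, Param1 through Param5) are strings; build both lookups once, right after loading the rules:

// Requires: using System; using System.Linq; using System.Collections.Generic;
static Dictionary<string, Rewrite> rewritesByChiave;
static Dictionary<string, Rewrite> rewritesByParams;

static void BuildLookups(List<Rewrite> rules)
{
    rewritesByChiave = new Dictionary<string, Rewrite>(StringComparer.OrdinalIgnoreCase);
    rewritesByParams = new Dictionary<string, Rewrite>(StringComparer.OrdinalIgnoreCase);
    foreach (var r in rules)
    {
        // Forward lookup: friendly key => rule
        rewritesByChiave[r.Chiave] = r;
        // Reverse lookup: compose the same key shape an incoming URL would have,
        // e.g. "Products/Uomo/0/0/1", skipping unused params
        string composed = string.Join("/",
            new[] { r.Pagina, r.Param1, r.Param2, r.Param3, r.Param4, r.Param5 }
                .Where(p => p != null));
        rewritesByParams[composed] = r;
    }
}

Each request then costs a single hash lookup, e.g. rewritesByChiave.TryGetValue(chiave, out Rewrite r) or rewritesByParams.TryGetValue(chiave, out Rewrite r), instead of scanning up to 5000 items.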
I would add the verification of the extension, as Aristos said, and I would perhaps also look for a faster "Find" implementation (using binary trees, or some other method) to make it faster.
I think that is the issue.
The query to the DB is not the issue, since it runs only once.
Another possible solution could be to store the list in a Redis server, which is fast (although you should run tests comparing Redis against the in-memory list of 5000+ items to make sure it is future-proof).
I'm working on report rendering with FastReports, coming from a system that rendered with Crystal Reports. When using Crystal, I found that preloading a report and then binding parameters on request sped it up dramatically, since most of the time for a small layout like an invoice is in the setup. I'm now trying to achieve the same with FastReports.
It's unclear how much time the setup takes, however, so I'd also be interested to know whether this is a worthwhile endeavour.
My issue is that I have used a JSON API call, with ConnectionStringExpression and a single parameter. In a nutshell, changing the parameter does not reload the data when I call Prepare.
Here's my code; with the second report load commented out, it renders the same report twice.
var report = new Report();
report.Load("C:\\dev\\ia\\products\\StratusCloud\\AppFiles\\Reports\\SalesQuoteItems.frx");

var urlTemplate = "http://localhost:9502/data/sales-quote/{CardCode#}/{DocEntry#}";
var reportParms = new Dictionary<string, dynamic>();
reportParms.Add("CardCode#", "C20000");
reportParms.Add("DocEntry#", 77);

var connectionstring = "Json=" + System.Text.RegularExpressions.Regex.Replace(urlTemplate, "{([^}]+)}", (m) => {
    if (reportParms.ContainsKey(m.Groups[1].Value))
    {
        return string.Format("{0}", reportParms[m.Groups[1].Value]);
    }
    return m.Value;
});

var dataapiparm = report.Dictionary.Parameters.FindByName("DataAPIUrl#");
if (dataapiparm != null)
{
    dataapiparm.Value = connectionstring;
}

foreach (FastReport.Data.Parameter P in report.Dictionary.Parameters)
{
    if (reportParms.ContainsKey(P.Name))
    {
        P.Value = reportParms[P.Name];
    }
}

report.Prepare();
var pdfExport = new PDFSimpleExport();
pdfExport.Export(report, "test1.pdf");

//report = new Report();
//report.Load("C:\\dev\\ia\\products\\StratusCloud\\AppFiles\\Reports\\SalesQuoteItems.frx");

reportParms["DocEntry#"] = 117;
connectionstring = "Json=" + System.Text.RegularExpressions.Regex.Replace(urlTemplate, "{([^}]+)}", (m) => {
    if (reportParms.ContainsKey(m.Groups[1].Value))
    {
        return string.Format("{0}", reportParms[m.Groups[1].Value]);
    }
    return m.Value;
});

dataapiparm = report.Dictionary.Parameters.FindByName("DataAPIUrl#");
if (dataapiparm != null)
{
    dataapiparm.Value = connectionstring;
}

foreach (FastReport.Data.Parameter P in report.Dictionary.Parameters)
{
    if (reportParms.ContainsKey(P.Name))
    {
        P.Value = reportParms[P.Name];
    }
}

report.Prepare();
pdfExport.Export(report, "test2.pdf");
Cheers,
Mark
FastReport definitely doesn't recalculate the ConnectionStringExpression on report.Prepare(), so I went back to another method I had been looking at. It turns out that if the ConnectionString itself is rewritten, report.Prepare() does refetch the data.
A simple connection string without a schema takes a long time to process, so I split off everything after the semicolon and keep it, replace the URL portion of the connection string, and then stick the same schema information back on the end.
Copying the schema information into each report-generation connection string seems to shave around 10 seconds off report.Prepare()!
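A minimal sketch of that idea (the helper name is mine; it assumes the connection string has the form "Json=<url>;<schema and other settings>"):

// Hypothetical helper: keep everything from the first ';' onward (the schema
// and other settings) and swap only the URL portion at the front.
static string RebuildConnectionString(string existingConnectionString, string newUrl)
{
    int separator = existingConnectionString.IndexOf(';');
    string schemaTail = separator >= 0 ? existingConnectionString.Substring(separator) : string.Empty;
    return "Json=" + newUrl + schemaTail;
}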
At the moment this is the best I can do, and I wonder whether there is a more efficient way of rerunning the same report against new data (with the same schema).
I need to read all users from AD. Here is the code I am using:
using Novell.Directory.Ldap;
using Novell.Directory.Ldap.Controls;
using System.Linq;

namespace LdapTestApp
{
    class Program
    {
        static void Main()
        {
            LdapConnection ldapConn = new LdapConnection();
            ldapConn.SecureSocketLayer = true;
            ldapConn.Connect(HOST, PORT);
            try
            {
                var cntRead = 0;
                int? cntTotal = null;
                var curPage = 0;
                ldapConn.Bind(USERNAME, PASSWORD);
                do
                {
                    var constraints = new LdapSearchConstraints();
                    constraints.SetControls(new LdapControl[]
                    {
                        new LdapSortControl(new LdapSortKey("sn"), true),
                        new LdapVirtualListControl("sn=*", 0, 10)
                    });
                    ILdapSearchResults searchResults = ldapConn.Search(
                        "OU=All Users,DC=homecredit,DC=ru",
                        LdapConnection.ScopeSub,
                        "(&(objectCategory=person)(objectClass=user))",
                        null,
                        false,
                        constraints
                    );
                    while (searchResults.HasMore() && ((cntTotal == null) || (cntRead < cntTotal)))
                    {
                        ++cntRead;
                        try
                        {
                            LdapEntry entry = searchResults.Next();
                        }
                        catch (LdapReferralException)
                        {
                            continue;
                        }
                    }
                    ++curPage;
                    cntTotal = GetTotalCount(searchResults as LdapSearchResults);
                } while ((cntTotal != null) && (cntRead < cntTotal));
            }
            finally
            {
                ldapConn.Disconnect();
            }
        }

        private static int? GetTotalCount(LdapSearchResults results)
        {
            if (results.ResponseControls != null)
            {
                var r = (from c in results.ResponseControls
                         let d = c as LdapVirtualListResponse
                         where (d != null)
                         select (LdapVirtualListResponse)c).SingleOrDefault();
                if (r != null)
                {
                    return r.ContentCount;
                }
            }
            return null;
        }
    }
}
I used this question as a basis: Page LDAP query against AD in .NET Core using Novell LDAP.
Unfortunately, I get this exception when trying to receive the very first entry:
"Unavailable Critical Extension"
000020EF: SvcErr: DSID-03140594, problem 5010 (UNAVAIL_EXTENSION), data 0
What am I doing wrong?
VLVs are browsing indexes and are not directly related to whether you can browse large numbers of entries (see the generic documentation). So even if this control were activated on your AD, you wouldn't be able to retrieve more than 1000 entries this way:
how VLVs work on AD
MaxPageSize is 1000 by default on AD (see documentation)
So what you can do:
use a specific paged results control, but it seems that the Novell C# LDAP library does not have one
ask yourself: "is it pertinent to look for all the users in a single request?" Your request looks like a batch request; remember that an LDAP server is not designed for the same purposes as a classic database, which can easily return millions of entries, and that is why most LDAP directories have default size limits of around 1000.
If the answer is no: review your design; be more specific in your LDAP search filter, your search base, etc.
If the answer is yes:
If you have a single AD server: ask your administrator to change the MaxPageSize value, but this setting is global and can lead to side effects (i.e. what happens if everybody starts requesting all the users all the time?).
If you have several AD servers: you can configure one of them for specific "batch-like" queries like the one you're trying to run (so a large MaxPageSize, large timeouts, etc.).
I had to use the approach described here:
https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard/issues/71#issuecomment-420917269
The solution is far from being perfect but at least I am able to move on.
Starting with version 3.5, the library supports the Simple Paged Results Control (https://ldapwiki.com/wiki/Simple%20Paged%20Results%20Control). Usage is as simple as ldapConnection.SearchUsingSimplePaging(searchOptions, pageSize) or ldapConnection.SearchUsingSimplePaging(ldapEntryConverter, searchOptions, pageSize). See the GitHub repo (https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard) for more details; the tests in particular double as usage samples.
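For illustration, a rough sketch of that call, adapted from the code in the question; the SearchOptions constructor shape is an assumption based on the project's documentation, so check the repo's tests for the exact signature:

// Assumes Novell.Directory.Ldap.NETStandard >= 3.5.
var searchOptions = new SearchOptions(
    "OU=All Users,DC=homecredit,DC=ru",             // search base
    LdapConnection.ScopeSub,                        // scope
    "(&(objectCategory=person)(objectClass=user))", // filter
    null);                                          // attributes (null = all)
// Issues paged searches under the hood and returns all matching entries.
var entries = ldapConn.SearchUsingSimplePaging(searchOptions, 1000);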
Given a URL as follows:
foo.bar.car.com.au
I need to extract foo.bar.
I came across the following code:
private static string GetSubDomain(Uri url)
{
    if (url.HostNameType == UriHostNameType.Dns)
    {
        string host = url.Host;
        if (host.Split('.').Length > 2)
        {
            int lastIndex = host.LastIndexOf(".");
            int index = host.LastIndexOf(".", lastIndex - 1);
            return host.Substring(0, index);
        }
    }
    return null;
}
This gives me foo.bar.car, but I want foo.bar. Should I just use Split and take parts 0 and 1?
But then there is the possible www. prefix.
Is there an easy way to do this?
Given your requirement (you want the first two levels, not including 'www'), I'd approach it something like this:
private static string GetSubDomain(Uri url)
{
    if (url.HostNameType == UriHostNameType.Dns)
    {
        string host = url.Host;
        var nodes = host.Split('.');
        int startNode = 0;
        if (nodes[0] == "www") startNode = 1;
        return string.Format("{0}.{1}", nodes[startNode], nodes[startNode + 1]);
    }
    return null;
}
I faced a similar problem and, based on the preceding answers, wrote this extension method. Most importantly, it takes a parameter that defines the "root" domain, i.e. whatever the consumer of the method considers to be the root. In the OP's case, the call would be:
var uri = new Uri("http://foo.bar.car.com.au");
uri.DnsSafeHost.GetSubdomain("car.com.au"); // returns foo.bar
uri.DnsSafeHost.GetSubdomain(); // returns foo.bar.car
Here's the extension method:
/// <summary>Gets the subdomain portion of a url, given a known "root" domain</summary>
public static string GetSubdomain(this string url, string domain = null)
{
    var subdomain = url;
    if (subdomain != null)
    {
        if (domain == null)
        {
            // Since we were not provided with a known domain, assume that the
            // second-to-last period divides the subdomain from the domain.
            var nodes = url.Split('.');
            var lastNodeIndex = nodes.Length - 1;
            if (lastNodeIndex > 0)
                domain = nodes[lastNodeIndex - 1] + "." + nodes[lastNodeIndex];
        }

        // Verify that what we think is the domain is truly the ending of the hostname... otherwise we're hooped.
        if (!subdomain.EndsWith(domain))
            throw new ArgumentException("Site was not loaded from the expected domain");

        // Quash the domain portion, which should leave us with the subdomain and a trailing dot IF there is a subdomain.
        subdomain = subdomain.Replace(domain, "");

        // Check if we have anything left. If we don't, there was no subdomain; the request was directly to the root domain:
        if (string.IsNullOrWhiteSpace(subdomain))
            return null;

        // Quash any trailing periods
        subdomain = subdomain.TrimEnd(new[] { '.' });
    }
    return subdomain;
}
You can use the NuGet package Nager.PublicSuffix. It uses the Public Suffix List from Mozilla to split the domain.
PM> Install-Package Nager.PublicSuffix
Example
var domainParser = new DomainParser();
var data = await domainParser.LoadDataAsync();
var tldRules = domainParser.ParseRules(data);
domainParser.AddRules(tldRules);
var domainName = domainParser.Get("sub.test.co.uk");
//domainName.Domain = "test";
//domainName.Hostname = "sub.test.co.uk";
//domainName.RegistrableDomain = "test.co.uk";
//domainName.SubDomain = "sub";
//domainName.TLD = "co.uk";
private static string GetSubDomain(Uri url)
{
    if (url.HostNameType == UriHostNameType.Dns)
    {
        string host = url.Host;
        String[] subDomains = host.Split('.');
        return subDomains[0] + "." + subDomains[1];
    }
    return null;
}
OK, first: are you specifically looking at 'com.au', or are these general Internet domain names? Because if it's the latter, there is simply no automatic way to determine how much of the domain is a "site" or "zone" or whatever, and how much is an individual "host" or other record within that zone.
If you need to be able to figure that out from an arbitrary domain name, you will want to grab the list of TLDs from the Mozilla Public Suffix project (http://publicsuffix.org) and use their algorithm to find the TLD in your domain name. Then you can assume that the portion you want ends with the last label immediately before the TLD.
I would recommend using a regular expression. The following code snippet should extract what you are looking for:
using System.Text.RegularExpressions;

string input = "foo.bar.car.com.au";
var match = Regex.Match(input, @"^\w*\.\w*\.\w*");
var output = match.Value;
In addition to the NuGet Nager.PublicSuffix package mentioned in this answer, there is also the NuGet Louw.PublicSuffix package, which according to its GitHub project page is a .NET Core library that parses the Public Suffix List. It is based on the Nager.PublicSuffix project, with the following changes:
Ported to .NET Core Library.
Fixed library so it passes ALL the comprehensive tests.
Refactored classes to split functionality into smaller focused classes.
Made classes immutable. Thus DomainParser can be used as a singleton and is thread-safe.
Added WebTldRuleProvider and FileTldRuleProvider.
Added functionality to know if Rule was a ICANN or Private domain rule.
Uses the async programming model.
The page also states that many of the above changes were submitted back to the original Nager.PublicSuffix project.
I am using the ImageResizer library to resize and deliver images in my C# MVC application.
One thing that isn't happening, though, is that my images aren't getting cached.
I am struggling to understand what is required to properly add caching for each image.
I just need to know whether I am on the right track. Will this cache my images correctly?
I think what I need to do is set FinalContentType and LastModifiedDate in my ImageResizer_OnPostAuthorizeRequestStart (I don't know where to get these values).
Then, I am hoping that in Application_PreSendRequestHeaders I can use the code below to set the cache headers correctly.
I have used a modified version of the method described here.
Here is my code:
private static void ImageResizer_OnPostAuthorizeRequestStart(IHttpModule sender2, HttpContext context)
{
    string path = Config.Current.Pipeline.PreRewritePath;
    if (!path.StartsWith(PathUtils.ResolveAppRelative("~/s3"), StringComparison.OrdinalIgnoreCase)) return;
    Config.Current.Pipeline.SkipFileTypeCheck = true;
    Config.Current.Pipeline.ModifiedQueryString["cache"] = ServerCacheMode.Always.ToString();
}

protected void Application_PreSendRequestHeaders(Object source, EventArgs e)
{
    var app = source as HttpApplication;
    HttpContext context = (app != null) ? app.Context : null;
    if (context != null && context.Items != null && context.Items["FinalContentType"] != null && context.Items["LastModifiedDate"] != null)
    {
        //Clear previous output
        //context.Response.Clear();
        context.Response.ContentType = context.Items["FinalContentType"].ToString();
        //FinalContentType is set to image/jpeg or whatever the image mime-type is earlier in code.

        //Add caching headers
        int mins = 1; //Or Configuration.AppSettings["whatever"]
        if (mins > 0)
        {
            context.Response.Expires = mins;
        }
        var lastModified = (DateTime?)context.Items["LastModifiedDate"]; //Set earlier in code.
        if (lastModified != DateTime.MinValue)
        {
            Response.Cache.SetLastModified(lastModified.Value);
        }
        Response.Cache.SetCacheability(context.Request.IsAuthenticated ? HttpCacheability.Private : HttpCacheability.Public);
    }
}
Use the DiskCache and ClientCache plugins to handle disk caching and cache headers respectively.
ASP.NET output caching is useless here.
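For reference, a minimal sketch of what enabling those plugins typically looks like in web.config; the cache directory, the minutes value, and the <configSections> entry (which the NuGet install normally adds for you) are illustrative, so consult the ImageResizer documentation for your version:

<configuration>
  <configSections>
    <section name="resizer" type="ImageResizer.ResizerSection" requirePermission="false" />
  </configSections>
  <resizer>
    <plugins>
      <add name="DiskCache" />
      <add name="ClientCache" />
    </plugins>
    <!-- Illustrative values: server-side cache folder and client cache lifetime in minutes -->
    <diskCache dir="~/imagecache" />
    <clientcache minutes="1440" />
  </resizer>
</configuration>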
Can anyone help me resolve an out-of-memory error on my ASP.NET page? I'm using LINQ to SQL. After adding several rows of data (more than 10) to the grid, an out-of-memory error occurs. Attached below is my add function.
public ServiceDetail checkservicedetailid()
{
    string ServiceName = ViewState["Tab"].ToString();
    ServiceDetail checkservicedetailid = ServiceDetails_worker.get(a => a.ServiceName == ServiceName && a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID).SingleOrDefault();
    return checkservicedetailid;
}

public IEnumerable<ServiceDetail> get(Expression<Func<ServiceDetail, Boolean>> express)
{
    return ServiceDetailsDB.ServiceDetails.Where(express);
}

protected void btnSaveEmptyOC_Click(object sender, EventArgs e)
{
    try
    {
        if (checkservicedetailid() != null)
        {
            CashExpense tblCashExpenses = new CashExpense();
            Guid CashExpensesID = Guid.NewGuid();
            tblCashExpenses.CashExpensesID = CashExpensesID;
            tblCashExpenses.ServiceDetailsID = checkservicedetailid().ServiceDetailsID;
            tblCashExpenses.Description = txtDescriptionEmptyOC.Text;
            tblCashExpenses.Quantity = Decimal.Parse(txtQTYEmptyOC.Text);
            tblCashExpenses.UnitCost = Decimal.Parse(txtUnitCostEmptyOC.Text);
            tblCashExpenses.CreatedBy = User.Identity.Name;
            tblCashExpenses.DateCreated = DateTime.Now;
            tblCashExpenses.CashExpensesTypeID = "OTHER";
            CashExpenses_worker.insert(tblCashExpenses);
            CashExpenses_worker.submit();

            //Clear items after saving
            txtDescriptionEmptyOC.Text = "";
            txtQTYEmptyOC.Text = "";
            txtUnitCostEmptyOC.Text = "";
            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOC2, "SaveEmptyOC", this.Page);
            MyAuditProvider.Insert(this.GetType().ToString(), ViewState["MarginAnalysisID"].ToString(), MessageCenter.Mode.ADD, MessageCenter.CashExpenseMaintenace.InsertOC2, Page.Request, User);
            divOtherCost.Visible = false;
            grd_othercost.Visible = true;
            btnaddothercost.Visible = true;
        }
        else
        {
            //Displays a message on the validation summary (Service Id does not exist)
            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.SaveServiceDetailOC, "SaveEmptyOC", this.Page);
        }
    }
    catch
    {
        //Displays a message on the validation summary (Error on saving)
        ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOCError, "SaveEmptyOC", this.Page);
    }
    finally
    {
        //Rebinds the grid
        populategrd_othercost();
    }
}
I'm guessing from your code here:
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID
).SingleOrDefault();
that .get() is taking a Func<SomeType, bool>, and you are doing something like:
var row = dbCtx.SomeTable.Where(predicate);
(please correct me here if I'm incorrect)
This, however, uses LINQ-to-Objects, meaning it loads every row from the table to the client and tests it locally. That will hurt memory, especially if a different db-context is created for each row. Additionally, the checkmarginanalysisid() call is executed per row, when presumably it doesn't change between rows.
You should be testing this with an Expression<Func<SomeType, bool>>, which would be translated to TSQL and executed at the server. You may also need to hoist untranslatable method calls out of the expression, i.e.
var marginAnalysisId = checkmarginanalysisid().MarginAnalysisID;
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == marginAnalysisId
).SingleOrDefault();
where that is get(Expression<Func<SomeType, bool>>).
I tried all of the solutions given to me, both by my peers and those provided here, from GC.Collect to disposing the LINQ DataContext after use, etc., but the error kept occurring. I then tried removing the UpdatePanel, after reading a site that showed how badly UpdatePanel handles data, especially when a function runs repeatedly. And poof! The memory problem is gone!