I need the pages in my site to load faster. I heard about an HTML compressor that can reduce the size of the HTML I send to the client. Does anybody know of a way to do that? I'd prefer an already-made DLL if possible...
Using compression does not always help.
When IIS compresses a page it keeps it in memory until the page expires or its contents change. If the server side has many dynamic pages carrying large amounts of data, compression can actually degrade performance.
You should try to optimize the server-side code and also reduce the client-side code.
Many people make the mistake of writing JavaScript with long variable names; this increases the size of the page.
Unnecessary comments in the HTML are also not good.
Move common functions into .js files so the browser can cache them.
If you have data which does not change frequently, then depending on its type and size you could cache it in the server-side Cache. This reduces queries against the database.
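For illustration, a minimal sketch of that pattern using the ASP.NET Cache (the key name and data-access call are placeholders, not from the original post):

using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    // Returns the cached product list, hitting the database only when the
    // entry has expired or was never loaded.
    public static object GetProducts()
    {
        object products = HttpRuntime.Cache["products"];
        if (products == null)
        {
            products = LoadProductsFromDatabase();   // placeholder for your real query
            HttpRuntime.Cache.Insert("products", products, null,
                DateTime.UtcNow.AddMinutes(30),      // absolute expiration
                Cache.NoSlidingExpiration);
        }
        return products;
    }

    private static object LoadProductsFromDatabase()
    {
        return new object();   // stand-in for real data access
    }
}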
Compression is good for static pages.
Take a look at the HTML that your site produces.
Do so by browsing to the pages in Internet Explorer (or another browser), right-clicking the page body, and selecting View Source. If you are using ASP.NET, the hidden field __VIEWSTATE may be big. If so, try to disable ViewState on the page controls where it's not needed. Also look through the source for other unneeded output.
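For example, ViewState can be switched off per control in code-behind; the page and control names here are hypothetical:

using System;
using System.Web.UI;

public partial class Catalog : Page
{
    protected void Page_Init(object sender, EventArgs e)
    {
        // These controls are re-bound on every request, so their state does
        // not need to round-trip in the __VIEWSTATE hidden field.
        productGrid.EnableViewState = false;    // hypothetical GridView
        newsRepeater.EnableViewState = false;   // hypothetical Repeater
    }
}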
IIS has this compression built in and it's on for static files by default.
To enable dynamic compression at the server level, use the following command:
C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:urlCompression -doStaticCompression:true -doDynamicCompression:true
Alternatively, if you would like to enable dynamic compression for only one site:
C:\Windows\System32\Inetsrv\Appcmd.exe set config "Site Name" -section:urlCompression -doStaticCompression:true -doDynamicCompression:true
If you would like to learn more about configuring compression in IIS for dynamic files, see the links below:
http://technet.microsoft.com/en-us/library/cc771003%28v=ws.10%29.aspx
http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx
Heavy CSS files affect the load time of web pages. You can use this online utility to compress your CSS, which will make your pages load faster.
http://www.cssdrive.com/index.php/main/csscompressor
The same applies to JS files; here is a JS compressor tool:
http://jscompress.com/
Note: I am not a promoter or supporter of the above sites.
A little while back, one of the junior developers at our company was tasked with creating a website for users to enter timesheets offsite. Mostly this is used for staff that reside offshore and have limited bandwidth (it's satellite internet, so we're already looking at a 500ms - 600ms response time, typically with only 10KB/s or less, including 10% - 20% intermittent packet loss).
So it's a challenging situation...
Recently I've been tasked with helping the junior developer improve the speed and functionality of the website, mostly for my own benefit, since I'm usually a desktop dev. One thing I've noticed is that the website uses MultiView, and I'm wondering if that's the best approach. I can see the reasoning: download the entire website once, then just make queries back and forth, showing/hiding the various views as necessary. Except it doesn't seem to work as smoothly as that.
95% of operations require a round trip to the server; e.g. adding a new timesheet means telling the server, which in turn creates a new entry in the database. When the server is done, it seems to cause the client to download the entire webpage again, which is obviously counterproductive.
So my question(s) are as follows:
Is this the expected behaviour, given the above situation? i.e. should the entire webpage be re-downloaded once the server has completed its actions?
If so, is this the best approach for the situation? Would it be better to have smaller, individual pages for the various features (timesheets/leave/etc.)?
I know this is probably a bit opinion based, but any ideas or assistance is greatly appreciated; for both our benefits.
Going from memory, MultiView only renders one of the views, not all of them. But since you mention MultiView, that tells me you are using the older WebForms technology, which often carries a large amount of overhead saving/restoring state. You can try to optimize that, especially if you are using some kind of grid control.
A better approach may be to ditch WebForms and switch to a newer technology like MVC. Rewrite the application to use AJAX with a web service that returns JSON whenever possible, to reduce the amount of data that needs to be sent to and from the server. Using MVC will also reduce the number of resources required for a page load (no resource.axd, etc.), which will help page load times, especially over high-latency links.
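As a rough sketch of that shape (the controller, action, and data-access names are made up for illustration):

using System.Web.Mvc;

public class TimesheetController : Controller
{
    [HttpPost]
    public JsonResult Add(string project, decimal hours)
    {
        int newId = SaveEntry(project, hours);   // assumed data-access call
        // Return only the new row's id as a small JSON payload instead of
        // re-rendering and re-sending the whole page.
        return Json(new { success = true, id = newId });
    }

    private int SaveEntry(string project, decimal hours)
    {
        return 0;   // stand-in for the real database insert
    }
}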
Make sure the server is set to compress dynamic pages with GZIP.
Compress and minify your javascript and CSS.
Don't use inline styles (the style attribute) in your HTML; use classes or IDs plus child selectors to reduce HTML size.
Bundle all your javascript and CSS (see the bundling sketch after this list).
Sprite your images in CSS where possible.
Run your images through a good image optimizer like http://kraken.io
Make sure you are caching whatever you can, and the cache duration is set properly.
Minify your HTML.
Stop using WebForms (or watch your page state and control state very closely).
Check into some of the SPA architectures out there -- you may be able to make the whole application "offline-able" with the exception of the calls to get/update/create data.
Ultimately, each page should only require 1 HTML file, 1 CSS file, 1 Javascript file, and 1 sprite sheet on the first page hit, and then every page after that should only require a single HTML file.
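For the bundling item above, a sketch using ASP.NET's System.Web.Optimization package (the file names are illustrative):

using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Combine and minify all scripts into a single request.
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-1.10.2.js",
            "~/Scripts/app.js"));

        // Same for stylesheets.
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));

        // Force bundling/minification even when debug="true" in web.config.
        BundleTable.EnableOptimizations = true;
    }
}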
You might also want to look into using a client-side library like Angular or Knockout to handle rendering views. This can reduce the amount of traffic that needs to be sent (although it will likely increase the number of requests by one).
I think the best bet is a SPA (Single Page App) with AngularJS. Done right, it greatly reduces the number of HTTP requests. Navigation does not cause an entire page reload in any case. JavaScript files, CSS files, etc. are loaded just once, at app load time. Once the app is loaded in the browser, the traffic is mainly JSON going back and forth.
There are some tricks you should apply to reduce app load time:
Bundle javascript files into just one minified javascript file.
Bundle css files into just one css file.
Leverage the HTTP cache. You can use file versioning combined with the max-age HTTP header, so the browser does not even ask the server whether the file has changed.
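One possible shape for the versioning half, as a sketch (the helper name and paths are assumptions):

using System;
using System.IO;
using System.Web;

public static class StaticUrl
{
    // Appends a version token derived from the file's last write time, so the
    // URL changes whenever the file changes and can otherwise be cached "forever".
    public static string Versioned(string virtualPath)
    {
        string physical = HttpContext.Current.Server.MapPath(virtualPath);
        long stamp = File.GetLastWriteTimeUtc(physical).Ticks;
        return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + stamp;
    }
}

Reference it as <script src="<%= StaticUrl.Versioned("~/Scripts/app.js") %>"></script> and serve static files with a far-future max-age.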
Some tools to help:
Fiddler, look at what is being cached and what isn't.
Facebook's Augmented Traffic Control, to simulate slow links like the one your users are on.
To my understanding, AJAX would be the best choice for you. If 95% of operations have to hit the server and reload the page with the new elements, performance will suffer.
So instead of doing this, make partial reloads with AJAX or jQuery. There is plenty of functionality available in jQuery that uses AJAX to reload a specific portion of the webpage instead of the whole page. It would increase the performance a lot.
One more thing I would like to add: the response packet coming from the server might be a huge chunk. So instead of sending the server's response out directly, implement GZip compression in the website. It will shrink the data packets, and the page will load/reload much faster.
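If you can't enable compression in IIS itself, one common approach is a response filter in Global.asax; this is a sketch, not drop-in code:

using System;
using System.IO.Compression;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = HttpContext.Current.Request;
        HttpResponse response = HttpContext.Current.Response;

        string acceptEncoding = request.Headers["Accept-Encoding"];
        if (!string.IsNullOrEmpty(acceptEncoding) && acceptEncoding.Contains("gzip"))
        {
            // Wrap the output stream so everything written is gzip-compressed.
            response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
            response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}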
Other than these, place your CSS and JS code in .css and .js files instead of inside the page itself (and reuse them from as many pages as possible). The browser will cache those files and reuse them instead of downloading them every time it contacts the server.
I believe you have already figured out what's wrong. No, MultiView is not good if it is implemented as-is without tweaks. If your website uses ViewState and on top of that you have MultiView implemented, then it is going to be a costly affair.
Here are your options.
To get the most out of the code, I would recommend converting your methods into HTTP GET/POST endpoints which can then be called separately from the needed actions in the HTML (see the sketch after this list).
Don't re-render the entire page; render only the content that changes on a menu action.
Move the non-changing parts of your page/site into static content and apply compression to the static content.
Enable page caching.
Cache the data offline wherever possible. (Remember it comes with an overhead of syncing data.)
If you are considering a revamp, give a thought to HTML5's offline features.
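For the first point, a minimal sketch of such an endpoint as an ASP.NET generic handler (the handler name and form fields are hypothetical):

using System.Web;

// AddTimesheet.ashx: called from the page via an AJAX POST instead of a full postback.
public class AddTimesheetHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string project = context.Request.Form["project"];
        string hours = context.Request.Form["hours"];

        // ... persist the entry here (data access omitted) ...

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"success\":true}");
    }

    public bool IsReusable { get { return true; } }
}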
I'm developing a C# replacement for a legacy VB app for my company. The front end is basically a Web Browser control inside of a Windows form, serving offline content which is sometimes altered to include the user's data. Because there are 100 or more web files in the legacy app, we are going to reuse the web UI from the old application with a new C# wrapper around it, modifying them as needed.
My questions are about how to store and deliver the web content.
Does it make sense to copy the web files to a temporary folder and point the Web Browser control to the file:// address of the temporary folder?
Is there some kind of pre-built offline-friendly server framework that makes more sense than copying the files to a temporary folder?
I have the web source files in my project as resources, but I'm not sure if that is appropriate for my uses. Is it?
The legacy VB implementation alters the web files to inject data using Substring methods; it searches for magic strings and replaces them with the appropriate data. That code smells pretty bad; is there a better, more native data-injection strategy I should look at?
Some background:
The data is presented using HTML/CSS/JS and also sometimes XSL.
The browser delivers content that is available at compile time.
I'm going to have to handle some events using C# code when users click on buttons of the page.
I'm free to choose whatever approach is necessary to implement the application.
Hosting
I would probably avoid using a temporary location for the web content; it just seems a little crude. If there is no internal linking between your HTML pages and all the CSS/JS is embedded in one file, it may be easier to just use the WebBrowser.DocumentText property.
Another option I have successfully used as a lightweight embedded web server is logv-http; it has a pretty easy-to-configure syntax. If you want to bind to anything other than localhost it does require administrator privileges, but it sounds like everything will be local.
// Listen on localhost:13337 and answer GET requests with "Hello World!".
var server = new Server("localhost", 13337);
server.Get("http://localhost:13337", (req, res) => res.Write("Hello World!"));
server.Start();
Templating
I think the string replaces aren't necessarily bad; it depends how many there are and how complicated they try to be, but simple find-and-replace shouldn't be too hard to manage. If there are lots of replaces, wrapping them into a single Regex pass should help performance.
Storing the web content as embedded resources is probably how I would go; that way you can read the files out at run time, do your pre-processing, and then return them either via the web-server method or directly into DocumentText.
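A rough sketch of that flow; the resource name and {{token}} syntax are made up for illustration:

using System.Collections.Generic;
using System.IO;
using System.Reflection;
using System.Text.RegularExpressions;

public static class TemplateLoader
{
    // Reads an embedded HTML resource and fills {{token}} placeholders in one pass.
    public static string Render(string resourceName, IDictionary<string, string> values)
    {
        var assembly = Assembly.GetExecutingAssembly();
        string html;
        using (var stream = assembly.GetManifestResourceStream(resourceName))
        using (var reader = new StreamReader(stream))
        {
            html = reader.ReadToEnd();
        }

        // A single Regex pass instead of many chained string.Replace calls.
        return Regex.Replace(html, @"\{\{(\w+)\}\}", m =>
        {
            string value;
            return values.TryGetValue(m.Groups[1].Value, out value) ? value : m.Value;
        });
    }
}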
Looking for information - I am creating a catalog website that includes a list of products. Each product has an image stored on the hard drive of the server. If the image does not exist, I want to show a default image. What's the best way of doing this? I am using C# and considered checking on the server side whether the image exists, but as some pages could have 50-60 images this would slow down the page. I use jQuery on the client side. Any tips on this?
This is a great question, as the situation arises in many circumstances. I see several options:
1) check for image availability during rendering of the catalog and use a link to the default image for items that do not have an image;
2) check for image availability in the image controller and return the default image when not available;
3) put images inline in the document using data URLs.
A major factor here is the possibility of caching.
Option (1) facilitates caching of the default image, but precludes caching of the catalog page. It is better if there are many items without an image, since such items will not even generate a hit to the server. Furthermore, if there's a low chance that an image will appear for an item, you could cache the index too (for a reasonably short time).
Option (2) facilitates caching of the index page, but each image will have to send a request to the server. Again, you could use aggressive caching to avoid repeating those requests the second time the page is rendered.
Option (3) is best if your images are small and if the catalog page is relatively static. Be sure to use caching on the server side though while generating the page to reduce the load on the filesystem/database.
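To make option (2) concrete, here is a hedged sketch of an image handler with a default fallback; the paths and handler name are assumptions:

using System.Web;

// ProductImage.ashx?id=123 serves the product image, or a default if it's missing.
public class ProductImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Note: validate the id in real code to prevent path traversal.
        string id = context.Request.QueryString["id"];
        string path = context.Server.MapPath("~/images/products/" + id + ".jpg");

        if (!System.IO.File.Exists(path))
            path = context.Server.MapPath("~/images/default.jpg");

        // Let clients cache the result so repeat visits skip the disk check.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetMaxAge(System.TimeSpan.FromHours(1));
        context.Response.ContentType = "image/jpeg";
        context.Response.TransmitFile(path);
    }

    public bool IsReusable { get { return true; } }
}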
Sounds like this is a web application, so you should look into doing some caching. Even though image file lookups are expensive, once your page gets hit a few times the disk lookups will no longer be necessary.
Or you could store the information about whether a product image exists in your database. Then you prepopulate the database with the information and no disk checks are necessary.
Your best bet is to do this server-side as you suggest. You could do it client-side (attempt to load image, and load a default image if that fails), but this is not really what client-side scripting is designed for. You're making the user do extra HTTP requests, which is slower for the user.
An even better solution, as marcind suggests, is to pre-populate the database with default images. So in your CMS, when you create a new item, it assigns a default image URL to itself. You can then manually change it from there.
How does your jQuery code know the name of the image?
Seeing that your image files are physical files on the server and are accessible from a browser, I'd probably leave that part as is since that implies you don't have to serve the images yourself and IIS can handle that for you as a static file.
So your jQuery code obviously knows the name of the image for each product. I assume this name is given to it by some server-side process, so that process needs to give it either the name of the product's image or the name of the default image.
Some part of your code has to go through the process of figuring out whether an image exists for the product and react accordingly. If you're using a database for your products, you could have a field in the product table that indicates whether the product has an image or not.
I'm developing a newsletter in ASP.NET that will be sent to a large number of users, so each kilobyte I can shave off will help a lot with bandwidth consumption. What I've done until now is write the aspx excluding some spaces between tags, and before rendering I've renamed some control IDs to "-" to save more space.
So now the file is 50 KB. I need a file of 25 KB.
Can anyone teach me any other way to save more space?
P.S.: I use 3 divs with some data, and 2 repeaters, one inside the other, to generate a table with some data for me.
EDIT: I've disabled ViewState and removed unnecessary divs. I'll try to verify that gzip is enabled in IIS.
Thanks in advance.
Make sure HTTP compression is enabled. Trimming the HTML helps, but enabling HTTP compression will give far more than the marginal improvements you're likely to see from hand-tuning markup.
There are different ways to enable compression, depending on your version of IIS. For instance, in IIS 6.0, you can manually edit metabase.xml or run:
cscript adsutil.vbs set w3svc/filters/compression/parameters/HcDoDynamicCompression true
You can check the HTTP headers to verify that compression is enabled, using something like Live HTTP Headers for Firefox (https://addons.mozilla.org/en-US/firefox/addon/3829/). Check your response headers for "Content-Encoding: gzip".
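You can also check from code; a small sketch (the URL is a placeholder):

using System;
using System.Net;

class CompressionCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/page.aspx");
        // Advertise gzip support manually (no automatic decompression), so the
        // Content-Encoding response header is left intact for inspection.
        request.Headers[HttpRequestHeader.AcceptEncoding] = "gzip";

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine("Content-Encoding: " + response.Headers[HttpResponseHeader.ContentEncoding]);
        }
    }
}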
Don't use an aspx page if you want full control. Make a Generic Handler, and then you can have full control over every byte generated.
Instead of using Repeaters, just loop through a dataset and output tables or spans or something. Although, I have to say, repeaters are very easy to control the exact output of, too.
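A minimal sketch of that idea (the handler name and data source are placeholders):

using System.Web;

// Newsletter.ashx: full control over every byte, no ViewState, no control-ID bloat.
public class NewsletterHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        var response = context.Response;
        response.ContentType = "text/html";
        response.Write("<html><body><table>");

        // Loop over the data directly instead of binding a Repeater.
        foreach (string row in GetRows())   // placeholder for your data access
        {
            response.Write("<tr><td>" + HttpUtility.HtmlEncode(row) + "</td></tr>");
        }

        response.Write("</table></body></html>");
    }

    private static string[] GetRows()
    {
        return new[] { "example row" };   // stand-in data
    }

    public bool IsReusable { get { return true; } }
}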
Look at your generated html and see if you can identify any obvious culprits.
You can disable the viewstate and optimize image files if you are using any.
I recommend going over this article if you're using ASP.NET Web Forms; it explains how to correctly utilize ViewState:
http://weblogs.asp.net/infinitiesloop/archive/2006/08/03/Truly-Understanding-Viewstate.aspx
You might want to look into IIS compression (it uses gzip, IIRC). That should knock the file size down.
Also minify the JavaScript/CSS (I've done that and seen up to a 40% reduction in js/css file sizes). Here's a link to a book/website that covers other things you can do: http://developer.yahoo.com/performance/rules.html. The book's title is "High Performance Web Sites" (ISBN 978-0-596-52930-7).
It may not work out to much savings, but you can also reduce markup by seriously considering whether to use DIVs for styling purposes when styling the contents directly would achieve the same result.
For instance,
<div class="sidebar">
<ul>
<li>Lorem</li>
</ul>
</div>
In most cases you can get the same result from styling the UL directly:
<ul class="sidebar">
<li>Lorem</li>
</ul>
But in your case, the repeaters are probably the main source of the bloat. Make sure you're using a custom template for them with clean HTML, and not relying on the out of the box rendering, which can be quite messy.
Like someone else posted, turn off viewstate for any controls you don't need it on - that's a TON of junk alone.
I have a web-based application. The content for the Home page is currently written directly into the Home page's HTML markup, so to change the content at any time in the future, the HTML code itself has to be edited. :(
Is there a way to pull the content from some external place and have it reflected on the website? That way, any change required could be made at the external location without touching the application's code.
Please advise if there is any solution for it.
Thanks.
You can
Use a database
Include external files using Server Side Includes
Read external files and write out their contents as an alternative method (see the sketch below)
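For the last option, a small sketch (the file path and Literal control name are assumptions):

using System.IO;
using System.Web.UI;

public partial class Home : Page
{
    protected void Page_Load(object sender, System.EventArgs e)
    {
        // Read an editable HTML fragment from outside the compiled application
        // and inject it into a Literal control on the page.
        string path = Server.MapPath("~/content/home-intro.html");
        if (File.Exists(path))
            litHomeContent.Text = File.ReadAllText(path);   // litHomeContent is hypothetical
    }
}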
Sounds like you're looking for a Content Management System (CMS), which will allow your content editors access to modify only specific blocks of a page that you specify.
There are a ton out there to do what you want, so you don't have to start from scratch. Just Google 'CMS'.
Although I haven't used it myself, DotNetNuke is a popular one these days and has a free version.