Webserver not using the latest files - C#

One of the projects I am working on includes a website that
is hosted on a cheap shared hosting server.
Whenever I upload updated files to the server, they don't
necessarily become available immediately.
It can take 15 to 30 minutes before the server actually
starts using the new files instead of the old ones, and in some
cases I even need to upload the updated files a second time.
Some more info:
- C# WebForms files (.aspx and .aspx.cs)
- If there was no previous file with that name on the server,
the file always becomes available immediately.
- But if I first delete the older file and refresh the page,
I immediately get a "file not found" error; if I then upload
the newer file, the "file not found" error stops immediately, but I
get the older file back again.
I understand that the server isn't actually serving the .aspx
page itself but rather the compiled DLL version it has made from it
(right?), so maybe this is a compilation problem on the server somehow?
I'm not sure if this would be better on serverfault.com,
but as a programmer SO is where I usually come.
Any idea why this is happening, and preferably a solution
so that when I upload an updated page it is used immediately?
Thank you.

Usually, touching your web.config file will recycle the application, and doing so should flush any caches. Just upload a new web.config with a trivial change and see if that helps.
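If your uploads are scripted, the touch can be automated as well. A minimal C# sketch, assuming a console deploy step and an example path (substitute your own site root):

    // Touch web.config so ASP.NET notices a change and recycles the
    // application domain, which drops any cached/stale compilations.
    using System;
    using System.IO;

    class TouchWebConfig
    {
        static void Main()
        {
            // Hypothetical path; point this at your site's web.config.
            string path = @"C:\inetpub\wwwroot\MySite\web.config";
            File.SetLastWriteTimeUtc(path, DateTime.UtcNow);
        }
    }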

If you are using .NET 2.0 Web Site projects, you can have problems with the .dlls in the bin folder. Converting to a Web Application project should solve your problem permanently.
http://webproject.scottgu.com/

I have seen this behavior on one of my sites as well.
In my case the issues began just after the hosting provider had moved my site to their new SAN solution.
It turned out that this new storage solution did not support file system watchers,
and without them IIS never receives a notification when a file has been updated.
The workaround they introduced was to move the applications into new application pools at regular intervals. (This gives the symptoms you are describing, with updates only being applied at regular intervals.)
The only solution I found was to move my sites to a different hosting provider.
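If you suspect the same cause, you can check whether the storage actually delivers change notifications before blaming anything else. A quick diagnostic sketch, assuming you can run a console app against the site's folder (the path is an example):

    // If editing a file under the watched folder never prints anything,
    // the storage isn't delivering change notifications, and IIS won't
    // see your uploaded updates either.
    using System;
    using System.IO;

    class WatcherTest
    {
        static void Main()
        {
            using (var watcher = new FileSystemWatcher(@"\\nas\siteroot")) // example path
            {
                watcher.IncludeSubdirectories = true;
                watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName;
                watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);
                watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
                watcher.EnableRaisingEvents = true;
                Console.WriteLine("Watching... edit a file there, then press Enter to quit.");
                Console.ReadLine();
            }
        }
    }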

Related

xml files are downloaded in the wrong format

We have a C# web application where our users can download .xml files. Since we introduced IdentityServer3 in our latest update, downloads intermittently stop working: some early-access users reported that (2 times out of 3) they download authorise.htm and/or a file without an extension (which, if renamed, has all the content), and so on.
What part of our system is likely to be causing this? I'm a tester, so I'm not an expert in debugging, but I am trying to understand who our suspects are.
Could it be some configuration issue? Firewall issues?

Can you edit aspx html content

Is it possible to edit the raw HTML content of a deployed .aspx page without recompiling or redeploying?
For example, say I have a deployed .NET 4.6 ASPX website. If I give someone FTP access to the .aspx files on the server, can that person edit and update the layout of the HTML elements without recompiling?
This might occur when a website is deployed onto a web host and a third-party contractor is asked to help improve the GUI without being given full access to the .sln file (e.g. C#\WCF\classes etc.).
Yes, this is possible.
.aspx files are not precompiled (unless you specifically say so, in which case you won't see them on the server).
One thing to keep in mind is that every time you change an .aspx file, it will be recompiled the first time it's requested. Also, after a number of changes (15 by default, controlled by the numRecompilesBeforeAppRestart attribute in web.config), the application domain will restart.
These page recompilations and app-domain restarts could impact the performance of your site.
Apart from that, the usual guidelines apply, and I would advise against doing this:
- the changes won't be synced back to your local repository
- you have little control over, or testing of, the changes
- you have no backups of these changes
It does seem possible to edit deployed .aspx files; just be aware that the changes will not show up in the solution that was used to deploy them, but will only be visible on the machine where the edits occurred.
Yes, in my experience you can change .aspx, .cshtml, .html, .css and any image file on the host server if you have access to that server. The solution does not need to be recompiled to change these files, because they are uploaded to the host server as-is. So for any change to these files, either on your local machine or on the host server, you can just copy the file from one location to the other over a remote connection to synchronize manually.
Only for .cs files do we upload a DLL instead, so the solution needs to be recompiled before uploading whenever a class file changes.
So if a third party changes hosted files such as .aspx, .css or image files, these can be copied straight back to the local machine to overwrite the old ones and then checked in to the repository.

Concurrency Issue on IO operation

I'm writing a multi-threaded console application which downloads PDF files from the web and copies them locally onto our content server (a Windows server). This is also the location from which the files are served to our website.
I am skeptical about this approach because of concurrency issues: if a user on the website requests a PDF file from the content server at the same time the file is being written or updated by the console application, there might be an IOException. (The application also updates the PDF files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want the operations that create and update the files where they are served to be atomic, so that any other process dealing with those files gets the correct version, not one that is still open for writing.
Instead of writing the files directly to where they will be served, you could write them to a temporary location and then move them into the directory they are served from.
Similarly, when your application updates the PDFs, make sure the served files themselves are not changed until writing has finished. You can test this by making your application sleep after it has started writing to a file, for example.
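A minimal sketch of the write-then-move idea in C#, assuming a console downloader and illustrative paths. File.Replace swaps the new file in atomically on NTFS (and can keep a backup of the old version), so readers never observe a half-written PDF:

    using System.IO;
    using System.Net;

    class AtomicPdfWriter
    {
        public static void SaveAtomically(string url, string servedPath)
        {
            // Download under a temporary name next to the destination
            // (same volume, so the swap below stays atomic).
            string tempPath = servedPath + ".tmp";
            using (var client = new WebClient())
            {
                client.DownloadFile(url, tempPath);
            }

            if (File.Exists(servedPath))
                File.Replace(tempPath, servedPath, servedPath + ".bak"); // atomic swap
            else
                File.Move(tempPath, servedPath); // first download: plain move
        }
    }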
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
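In ASP.NET, that custom server-side script could be a small handler that reads a pointer file naming the current version and streams it. A hedged sketch (the folder layout, pointer file and handler name are assumptions; you would still need to map the handler to the PDF URLs in IIS):

    using System.IO;
    using System.Web;

    // Hypothetical handler: App_Data/pdfs/current.txt contains the file
    // name of the live version (e.g. "report_v42.pdf"). Updating that
    // one-line pointer is effectively atomic, so a new version can be
    // written under a fresh name without disturbing readers.
    public class PdfHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            string dir = context.Server.MapPath("~/App_Data/pdfs");
            string current = File.ReadAllText(Path.Combine(dir, "current.txt")).Trim();
            context.Response.ContentType = "application/pdf";
            context.Response.TransmitFile(Path.Combine(dir, current));
        }
    }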
Mark the files hidden until the copy or update is complete.
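IIS by default refuses to serve hidden files (requests get a 404), so this acts as a crude lock: requests fail fast during the update instead of reading a half-written file. A sketch, assuming that default IIS behaviour and an illustrative helper:

    using System.IO;

    class HiddenDuringWrite
    {
        public static void UpdateFile(string path, byte[] newContent)
        {
            // Hide the file while writing; IIS won't serve hidden files.
            File.SetAttributes(path, File.GetAttributes(path) | FileAttributes.Hidden);
            try
            {
                File.WriteAllBytes(path, newContent);
            }
            finally
            {
                // Make it visible (servable) again.
                File.SetAttributes(path, File.GetAttributes(path) & ~FileAttributes.Hidden);
            }
        }
    }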

Find out what is causing a problem in IIS7

We have a C# web application, and the latest deploy doesn't work on our Windows Small Business Server 2008 (IIS7). An exact copy of that site runs fine on my Windows 7 machine (IIS7.5). The previous version and other builds still work on the Server 2008 R2 machine, but this iteration doesn't.
I've checked the W3SVC logs, but no requests are logged. I've checked the event log for errors, but none are logged. I also checked in Fiddler, but as far as I can tell the request simply never gets a response (the Result column remains -).
When you open the url, the browser will just keep loading (no timeout).
Is there anything else I can check or enable to debug this IIS7 behaviour?
Thanks in advance,
Nick.
UPDATE
I published the application again and created a new site in IIS, and this new version works. While my immediate problem is solved for now, I would still like to know how to debug IIS7, see how it works, and understand why it would keep loading indefinitely.
First, I would drop a regular .html file into the site's directory and have a browser request that specific static file. This bypasses the .NET engine and should be logged.
If for some reason it doesn't work and/or isn't logged, then there are other things to check; let us know.
Assuming that it does serve the file and you are pointing at the correct machine, inspect your global.asax file and remove any error handling you might have, and turn off the customErrors section of your web.config. Either of these, if improperly coded, could result in the server essentially spinning off into nothingness. If you spin up any additional threads on access, see if you can turn those off or add additional logging.
Next, look in the HTTPERR logs to see if you can identify what's going on. These are located at
%SystemRoot%\system32\LogFiles\HTTPERR\httperr*.log
Info about this log file is at: http://support.microsoft.com/default.aspx?scid=kb;en-us;820729
If your app uses ADO, then there is a chance, depending on whether the build occurred on Windows 7 and whether SP1 was installed at the time of the build, that your build is broken by a Microsoft ADO update contained in SP1 (see http://www.codeproject.com/Articles/225491/Your-ADO-is-broken.aspx).
If no requests are logged in the W3SVC logs, then it probably means that IIS is not receiving the request at all, likely due to firewall configuration or something similar.
You should diagnose why IIS is unreachable (for example by attempting to serve some static content) and then try again.
Try these:
- re-register the ASP.NET runtime with your IIS7 (aspnet_regiis.exe -i, run from the framework directory)
- make sure the ASP.NET extension for the correct version is set to Allowed under 'ISAPI and CGI Restrictions' in your IIS

Redeploying an ASP.NET site in IIS7 without files in use interfering

We've got a process which redeploys ASP.NET websites. The deployment code is itself an ASP.NET application. The current method, which has worked for quite a while, is simply to loop over all the files in one folder and copy them over the top of the files in the webroot.
The problem that's arisen is that occasionally files end up being in use and hence can't be copied over. In the past this was intermittent to the point it didn't matter, but on some of our higher-traffic sites it now happens the majority of the time.
I'm wondering if anyone has a workaround or alternative approach that I haven't thought of. Currently my ideas are:
1. Simply retry each file until it works. That's going to cause errors for a short time, though, which isn't really acceptable.
2. Deploy to a new folder and update IIS's webroot to point to the new folder. I'm not sure how to do this short of running the application as an administrator and running batch files, which is very untidy.
Does anyone know the best way to do this, or whether it's possible to do #2 without running the publishing application as a user who has admin access? (I'm willing to grant it special privileges, but I'd prefer to stop short of administrator.)
Edit
Clarification of infrastructure: we have two IIS 7 web servers in an NLB cluster running their webroots off a shared NAS (to be more clear, they use the exact same webroot on the NAS). We do a lot of deploys, to the point where any approach we can't automate really won't be viable.
What you need to do is temporarily stop IIS from processing any incoming requests for that app, so you can copy the new files and then start it again. This will lead to a small downtime for your clients, but unless your website is mission-critical, that shouldn't be a big problem.
ASP.NET has a feature that targets exactly this scenario. Basically, it boils down to temporarily creating a file named App_Offline.htm in the root of your web app. Once the file is there, IIS will take down the worker process for your app and unload any files in use. Once you have copied over your files, you can delete App_Offline.htm and IIS will happily start churning again.
Note that while that file is there, IIS will serve its content as a response to any requests to your webapp. So be careful what you put in the file. :-)
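The whole deploy step can then be scripted around that file. A minimal sketch, assuming the deploying process has write access to the webroot and both paths are examples:

    using System.IO;

    class Deployer
    {
        public static void Deploy(string stagingDir, string webRoot)
        {
            // While App_Offline.htm exists, IIS unloads the app, so
            // nothing in bin\ stays locked during the copy.
            string offline = Path.Combine(webRoot, "App_Offline.htm");
            File.WriteAllText(offline, "<html><body>Back in a minute...</body></html>");
            try
            {
                foreach (string source in Directory.GetFiles(stagingDir, "*", SearchOption.AllDirectories))
                {
                    string relative = source.Substring(stagingDir.Length).TrimStart('\\');
                    string target = Path.Combine(webRoot, relative);
                    Directory.CreateDirectory(Path.GetDirectoryName(target));
                    File.Copy(source, target, true);
                }
            }
            finally
            {
                File.Delete(offline); // bring the site back up
            }
        }
    }

One known quirk: Internet Explorer substitutes its own friendly error page for responses under 512 bytes, so App_Offline.htm is often padded with a comment to get past that limit.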
Another solution is IIS programmatic administration.
You can copy your new/updated web content to an alternative directory and then switch the IIS root of your web app to that alternative directory. Then it doesn't matter if files are locked in the original root. This is a good solution for website availability.
However, it requires some permission tuning...
You can do it via ADSI or WMI for IIS 6 or Microsoft.Web.Administration for IIS 7.
Regarding your #2: note that WMI doesn't require administrator privileges the way ADSI does. You can configure rights per object. Check your WMI console (mmc).
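For IIS 7, the root switch via Microsoft.Web.Administration looks roughly like this (site name and path are assumptions; the account running it needs rights to the IIS configuration, which is the permission tuning mentioned above):

    using Microsoft.Web.Administration; // reference Microsoft.Web.Administration.dll

    class SiteSwitcher
    {
        public static void SwitchRoot(string siteName, string newPath)
        {
            using (var manager = new ServerManager())
            {
                // Repoint the site's root virtual directory at the
                // freshly deployed folder; IIS picks this up immediately.
                var app = manager.Sites[siteName].Applications["/"];
                app.VirtualDirectories["/"].PhysicalPath = newPath;
                manager.CommitChanges();
            }
        }
    }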
Since you're already load balancing between two web servers, you can:
1. In the load balancer, take web server A offline, so only web server B is in use.
2. Deploy the updated site to web server A. (As a bonus, you can do an extra test pass on web server A before it goes into production.)
3. In the load balancer, take B offline and put A online, so only web server A is in use.
4. Deploy the updated site to web server B. (As a bonus, you can do an extra test pass on web server B before it goes into production.)
5. In the load balancer, put B back online. Now both web servers are upgraded and back in use in production.
You could also try modifying the timestamp of the web.config file in the root folder before attempting to copy the files. This will unload the application and free the files it had in use.
Unless you're manually opening a handle to a file on your web server, IIS won't keep locks on your files.
Try shutting down other services that might be locking your files. Some examples of common services that do just that:
- Windows Search
- Google Desktop Search
- Windows Backup
- any other anti-virus or indexing software
We had the same server (2003) and the same problem. Certain DLLs were being locked, and putting App_Offline.htm in the website root did jack diddly for us.
Solution:
File permissions!
We were using a web service running under the Network Service account (or the IIS_WPG account) to deploy updates to the web site, so it needed write access to all the files. I already knew this and had set the permissions on the directory a while ago. But for some strange reason, the necessary permissions were not set on this one problem DLL. You should check the permissions not only on the directory, but on the problem file as well.
We gave the Network Service and IIS_WPG users read/write access to the entire webroot directory, and that solved our file-in-use, file-locked, timeout, and access-denied issues.
