Building, and Publishing, and User-Data (oh my!) - C#

What is the best way to update a "Web-Application" ("re-publish" from dev server to live server) while preserving user-data (such as images, videos, and audio stored in the filesystem) in a VS 2010 build/publish setup?
Additionally, what is the best way to minimize site downtime during these updates?
My backstory:
Usually I "build/publish" the site to a folder on my dev machine, FTP into the live server, then drag-and-drop the new "published" files and folders to the live site while making sure not to overwrite any user-generated directories.
Obviously this method comes from my static-html days where it didn't matter. And obviously this is dangerous, flawed, and counter-productive for any Web Application with user-generated data in the FS.

The easiest way is to have a directory that's outside of your code folder where you store the user data. You can even map this folder as a virtual folder in IIS when you need this folder to be available from the internet. Like:
C:\Inetpub
    \ProjectWebsite
    \ProjectFiles
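For example, here is a minimal sketch of saving an upload into that outside folder, assuming the path is kept in a web.config appSettings entry (the key name and paths below are hypothetical):

    // Minimal sketch, assuming an appSettings entry such as
    //   <add key="UserFilesPath" value="C:\Inetpub\ProjectFiles" />
    // so user data never lives inside the deployed code folder.
    using System.Configuration;
    using System.IO;
    using System.Web;

    public static class UserFiles
    {
        public static void Save(HttpPostedFile upload)
        {
            string root = ConfigurationManager.AppSettings["UserFilesPath"];
            string target = Path.Combine(root, Path.GetFileName(upload.FileName));
            upload.SaveAs(target);
        }
    }

Because a publish then only ever touches \ProjectWebsite, overwriting the whole site no longer risks the user-generated files.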

Related

Microsoft Azure App Service Storage

I have a question about purchasing a Microsoft Azure App Service to host my app. I have already tested the Free tier and am considering switching to a Basic tier.
I have seen in this table on the Azure website here that I'll have 10GB of disk space for my application files.
When I went to the pricing calculator, however, I saw 10GB listed as temporary storage.
So my question is: why does it say 10GB of temporary storage? Will I lose my application files located in the wwwroot folder at any time?
Will I lose my application files located in the wwwroot folder at any time?
Your application files located in the wwwroot folder will not be lost. We can get the answer from the documentation.
Persisted files
This is what you can view as your web site's files. They follow a structure described here. They are rooted in d:\home, which can also be found using the %HOME% environment variable.
These files are persistent, meaning that you can rely on them staying there until you do something to change them. Also, they are shared between all instances of your site (when you scale it up to multiple instances). Internally, the way this works is that they are stored in Azure Storage instead of living on the local file system.
Free and Shared sites get 1GB of space, Basic sites get 10GB, and Standard sites get 50GB
Temporary files
A number of common Windows locations use temporary storage on the local machine. For instance:
%APPDATA% points to something like D:\local\AppData.
%TMP% goes to D:\local\Temp.
Unlike Persisted files, these files are not shared among site instances. Also, you cannot rely on them staying there. For instance, if you restart a web app, you'll find that all of these folders get reset to their original state.
For Free, Shared and Consumption (Functions) sites, there is a 500MB limit for all these locations together (i.e. not per-folder). For Standard and Basic sites, the limit is very high (over 100GB).
You can check the application files in D:\home\site\wwwroot using the Azure Kudu tool (https://yoursite.scm.azurewebsites.net/).
Available disk space is shown on the Environment page.
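As a quick check from code, here is a minimal C# sketch (assuming it runs inside a Windows App Service instance) that prints both roots:

    // Minimal sketch: %HOME% is the persisted, shared root (d:\home),
    // while %TMP% lives on the per-instance local disk and is reset on restart.
    using System;

    class StorageRoots
    {
        static void Main()
        {
            Console.WriteLine("Persisted: " + Environment.GetEnvironmentVariable("HOME"));
            Console.WriteLine("Temporary: " + Environment.GetEnvironmentVariable("TMP"));
        }
    }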

Opening files locally from web application

I recently added a way for my web application (ASP.NET written in C#) to go to a folder which contains a bunch of spreadsheets and import them into SQL server tables. I set the folders and file names using an admin table so it knows how to handle each file and which table they should go to etc. It even keeps track of the file dates and times so it ignores anything that isn't new since the last time it imported them. Very cool but it only works on my development machine, most likely because the path is easily recognized there.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to either do it using a known path or by having a user select the local folder? Is it perhaps done more easily if I put the files in a folder within the site?
Dana
If I understand your question correctly, the approach is that you want a user to type in a local file path and have you process it.
This will not work through a website, and from a security perspective that is very wise, as you point out. So unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog and have the user explicitly locate the files for you, click upload, and then process them on the server.
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.
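Here is a minimal sketch of that server-side handler, assuming ASP.NET MVC (a Web Forms FileUpload control works much the same); the App_Data subfolder and the ImportSpreadsheet call are hypothetical stand-ins for the existing import code:

    // Minimal sketch, assuming ASP.NET MVC. The user picks the file in the
    // browser's upload dialog; the server never reaches into their machine.
    using System.IO;
    using System.Web;
    using System.Web.Mvc;

    public class ImportController : Controller
    {
        [HttpPost]
        public ActionResult Upload(HttpPostedFileBase file)
        {
            if (file != null && file.ContentLength > 0)
            {
                string path = Path.Combine(Server.MapPath("~/App_Data/Imports"),
                                           Path.GetFileName(file.FileName));
                file.SaveAs(path);
                // ImportSpreadsheet(path);  // hypothetical: the existing parse/import logic
            }
            return RedirectToAction("Index");
        }
    }

The matching form needs enctype="multipart/form-data" and an <input type="file" name="file" />.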

.NET MVC Websites - shared folders between websites

I have a production website that sits on two servers that use local label files to drive their page labels (requests go round robin between the two).
Users need the ability to upload new label files, but once a file is uploaded to one server I need it updated on the second website as well; this needs to be immediate. I was trying to use a shared folder on one of the servers, but even if I give Everyone full access I get the error "Exception message: Unable to find label folder at \\MACHINENAME\LabelFiles" when reading from the other server. I've also tried giving full permissions to "IIS AppPool\DefaultAppPool", but get the same issue.
I'm using IIS 7.5 on Windows Server 2008 R2
Question-
Is there a way to share a folder between the two sites?
Is there a better alternative solution?
Thanks
Both the websites should have a virtual folder pointing to the same physical folder, where the users can upload files.
Also make sure that Anonymous access is disabled.
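If you prefer to create that virtual directory from code rather than in IIS Manager, here is a minimal sketch using Microsoft.Web.Administration (IIS 7+); the site name and UNC path are assumptions based on the question:

    // Minimal sketch: point /LabelFiles in "Default Web Site" at the share.
    // Run once on each of the two servers so both use the same physical folder.
    using Microsoft.Web.Administration;

    class AddLabelVdir
    {
        static void Main()
        {
            using (var serverManager = new ServerManager())
            {
                var app = serverManager.Sites["Default Web Site"].Applications["/"];
                app.VirtualDirectories.Add("/LabelFiles", @"\\MACHINENAME\LabelFiles");
                serverManager.CommitChanges();
            }
        }
    }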
One approach is to map the folder as a drive on each of the production machines; it should then be as simple as referring to that particular drive letter.
This can be done by navigating to the folder in Windows Explorer, then clicking Map Network Drive.
I cannot guarantee this will work, but it might be worth a go.
The IIS AppPool\DefaultAppPool user is a special account that is only local to the machine where it is being used. You cannot access network file shares with that account. You will either need to use the 'Network Service' account or a domain account as the app pool user.
Also, since you are load balancing this site between two servers, you might want to consider using some type of SAN or NAS storage that is shared between the two servers. Otherwise, you will need to come up with some kind of process to synchronize the file share on both servers.

sync database to uploaded file using Windows Service or website code

I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files in through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or another?
Performance benefits?
ASPX page wins the "ease of maintenance" category.
I would create a Windows Service to watch a secure folder and use a directory watcher to look for new files. Since the files are coming from another system, it is asynchronous in nature, and it is much more performant to have a Windows Service running separately to watch for updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
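Here is a minimal sketch of such a service built around FileSystemWatcher (the drop-folder path and *.txt pattern are assumptions; the parsing code you already wrote would plug into ProcessFile):

    // Minimal sketch, assuming a Windows Service project referencing
    // System.ServiceProcess.
    using System.IO;
    using System.ServiceProcess;

    public class ImportService : ServiceBase
    {
        private FileSystemWatcher _watcher;

        protected override void OnStart(string[] args)
        {
            // Watch the FTP drop folder for newly created text files.
            _watcher = new FileSystemWatcher(@"C:\ImportDrop", "*.txt");
            _watcher.Created += (s, e) => ProcessFile(e.FullPath);
            _watcher.EnableRaisingEvents = true;
        }

        private void ProcessFile(string path)
        {
            // Parse the comma-delimited file and update the database here.
        }

        protected override void OnStop()
        {
            if (_watcher != null) _watcher.Dispose();
        }
    }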

Redeploying an ASP.NET site in IIS7 without files in use interfering

We've got a process currently which causes ASP.NET websites to be redeployed. The code is itself an ASP.NET application. The current method, which has worked for quite a while, is simply to loop over all the files in one folder and copy them over the top of the files in the webroot.
The problem that's arisen is that occasionally files end up being in use and hence can't be copied over. This has in the past been intermittent to the point it didn't matter but on some of our higher traffic sites it happens the majority of the time now.
I'm wondering if anyone has a workaround or alternative approach to this that I haven't thought of. Currently my ideas are:
1. Simply retry each file until it works. That's going to cause errors for a short time though, which isn't really that good.
2. Deploy to a new folder and update IIS's webroot to the new folder. I'm not sure how to do this short of running the application as an administrator and running batch files, which is very untidy.
Does anyone know what the best way to do this is, or if it's possible to do #2 without running the publishing application as a user who has admin access (Willing to grant it special privileges, but I'd prefer to stop short of administrator)?
Edit
Clarification of infrastructure... We have 2 IIS 7 webservers in an NLB running their webroots off a shared NAS (To be more clear, they're using the exact same webroot on the NAS). We do a lot of deploys, to the point where any approach we can't automate really won't be viable.
What you need to do is temporarily stop IIS from processing any incoming requests for that app, so you can copy the new files and then start it again. This will lead to a small downtime for your clients, but unless your website is mission critical, that shouldn't be that big of a problem.
ASP.NET has a feature that targets exactly this scenario. Basically, it boils down to temporarily creating a file named App_Offline.htm in the root of your webapp. Once the file is there, IIS will take down the worker process for your app and unload any files in use. Once you copy over your files, you can delete the App_Offline.htm file and IIS will happily start churning again.
Note that while that file is there, IIS will serve its content as a response to any requests to your webapp. So be careful what you put in the file. :-)
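Here is a minimal sketch of that deploy step (webRoot and stagingFolder are assumed variables; the actual copy is elided):

    // Minimal sketch: take the app offline, copy the new files, bring it back.
    using System.IO;

    class AppOfflineDeploy
    {
        static void Deploy(string webRoot, string stagingFolder)
        {
            string offline = Path.Combine(webRoot, "App_Offline.htm");
            // IIS serves this file's content for every request while it exists.
            File.WriteAllText(offline, "<html><body>Updating, back shortly.</body></html>");
            try
            {
                // ... copy files from stagingFolder over webRoot here ...
            }
            finally
            {
                File.Delete(offline);
            }
        }
    }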
Another solution is IIS Programmatic Administration.
You can then copy your new/updated web app to an alternative directory and switch the IIS root of your webapp to this alternative directory. Then it doesn't matter if files are locked in the original root. This is a good solution for website availability.
However it requires some permission tuning...
You can do it via ADSI or WMI for IIS 6 or Microsoft.Web.Administration for IIS 7.
Regarding your option #2, note that WMI doesn't require administrator privileges the way ADSI does. You can configure rights per object. Check your WMI console (mmc).
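For IIS 7, here is a minimal sketch of that root switch with Microsoft.Web.Administration (the site name and target directory are assumptions):

    // Minimal sketch: repoint the site's root virtual directory at the
    // freshly deployed folder, so locked files in the old root don't matter.
    using Microsoft.Web.Administration;

    class SwitchWebRoot
    {
        static void Main()
        {
            using (var serverManager = new ServerManager())
            {
                var rootVdir = serverManager.Sites["MySite"]
                                            .Applications["/"]
                                            .VirtualDirectories["/"];
                rootVdir.PhysicalPath = @"D:\Deployments\Release2";
                serverManager.CommitChanges();
            }
        }
    }

The account running this needs rights to the IIS configuration, which is the permission tuning mentioned above.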
Since you're already load balancing between 2 web servers, you can:
In the load balancer, take web server A offline, so only web server B is in use.
Deploy the updated site to web server A.
(As a bonus, you can do an extra test pass on web server A before it goes into production.)
In the load balancer, take B offline and put A online, so only web server A is in use.
Deploy the updated site to web server B.
(As a bonus, you can do an extra test pass on web server B before it goes into production.)
In the load balancer, put B back online. Now both web servers are upgraded and back in use in production.
You could also try to modify the timestamp of web.config in the root folder before attempting to copy the files. This will unload the application and free used files.
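For example, a one-line "touch" (a sketch; webRoot is an assumed variable):

    // Minimal sketch: touching web.config makes ASP.NET recycle the app
    // domain, releasing file handles before the copy starts.
    using System;
    using System.IO;

    class TouchWebConfig
    {
        static void Touch(string webRoot)
        {
            File.SetLastWriteTimeUtc(Path.Combine(webRoot, "web.config"), DateTime.UtcNow);
        }
    }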
Unless you're manually opening a handle to a file on your web server, IIS won't keep locks on your files.
Try shutting down other services that might be locking your files. Some examples of common services that do just that:
Windows Search
Google Desktop Search
Windows Backup
any other anti-virus or indexing software
We had the same setup (Server 2003) and the same problem. Certain DLLs were being locked, and putting App_Offline.htm in the website root did jack diddly for us.
Solution:
File permissions!
We were using a web service which runs under the Network Service account or the IIS_WPG account to deploy updates to the web site. Thus it needed write access to all the files. I already knew this, and had already set the permissions on the directory a while ago. But for some strange reason, the necessary permissions were not set on this one problem dll. You should check the permissions not only on the directory, but on the problem file as well.
We gave Network Service and IIS_WPG users read/write access to the entire web root directory and that solved our file in use, file locked, timeout, and access denied issues.
