We are currently working on moving our ASP.NET MVC app from a shared hosting provider to Azure. Our users can upload files such as images and documents to our server, and we store these files under app-url/content/data, which works pretty well.
Question:
Is it safe to keep doing the same thing and uploading files under app-url/content/data? I've read about Azure Blob storage, but we would like to minimize the amount of work required to move to Azure (this is definitely something we could do in the coming months).
Azure provides a number of storage options such as Azure SQL, DocumentDB, Azure Blob storage and more; you can use any of them. If your application is just storing images, Azure Blob storage is the best option.
Is it safe to keep doing the same thing and uploading files under app-url/content/data?
Definitely. Platform security is not a concern for Azure customers; it is Microsoft's concern. You can learn about Azure security here.
we would like to minimize the amount of work required to move to Azure.
This depends on your application's back-end storage and resource management. If you are setting up a new Azure VM to run your application, it might take a while. If you are going to use Azure Web Apps (recommended), your migration workload will be minimal, since the deployment model is very close to what you are already familiar with.
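For illustration, an upload action along the lines of the sketch below (the controller name and folder are placeholders, not the asker's actual code) should keep working unchanged on Azure Web Apps, because Server.MapPath still resolves to the site's content folder under D:\home\site\wwwroot:

```csharp
using System.IO;
using System.Web;
using System.Web.Mvc;

public class UploadController : Controller
{
    // Hypothetical upload action: saves user files under ~/Content/data,
    // just as it would on shared hosting.
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        if (file == null || file.ContentLength == 0)
            return new HttpStatusCodeResult(400, "No file uploaded");

        // On Azure Web Apps this resolves to D:\home\site\wwwroot\Content\data
        var folder = Server.MapPath("~/Content/data");
        Directory.CreateDirectory(folder);

        var path = Path.Combine(folder, Path.GetFileName(file.FileName));
        file.SaveAs(path);

        return Content("Saved " + file.FileName);
    }
}
```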
Related
I am developing a web application (hosted on an Azure Cloud Service) which generates PDFs. Those PDFs have to be stored somewhere because they will be downloaded later. So where should I store those files?
Right now I am storing them in the 'Content' folder of the project, but the issue is that when I publish the site again, the 'Content' folder gets emptied.
So I am having an issue with storing files. What is the best way to store them? Should I store them in a DB or somewhere else in the cloud services? I cannot use Azure Storage; the client doesn't want to use it.
Please help me with this issue.
Using Azure Storage is best practice, and it has a lower cost per GB than most other options in Azure. If the costs at scale are a problem, there are options to move files from hot to cool storage to reduce the per-GB cost as data builds up.
You do need to store them in some sort of persistent storage, which Cloud Services does not offer. A database would work, as would deploying the application to a regular VM rather than Cloud Services. App Service offers persistent storage (the files are stored in Azure Storage rather than on the VM), if that is an option. There are plenty of other options, and any sort of file share will work.
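If the database route is chosen, a minimal sketch along these lines would work; the GeneratedPdfs table and the DefaultConnection connection string name are assumptions, not something from the question:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static class PdfStore
{
    // Assumed schema: GeneratedPdfs(Id INT IDENTITY, FileName NVARCHAR(260), Content VARBINARY(MAX))
    public static void Save(string fileName, byte[] pdfBytes)
    {
        var connStr = ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString;
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "INSERT INTO GeneratedPdfs (FileName, Content) VALUES (@name, @content)", conn))
        {
            cmd.Parameters.AddWithValue("@name", fileName);
            cmd.Parameters.AddWithValue("@content", pdfBytes);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The stored bytes can later be read back and returned to the browser as the PDF download.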
Q1: Where do you think is the right place to put a SQLite database file (database.sqlite) in Azure Web App file system? For example:
D:\home\data\database.sqlite
D:\home\site\database.sqlite
D:\home\site\wwwroot\database.sqlite
other?
Q2: What else should be taken into consideration in order to make sure that the database file won't be accessible to public users as well as not being accidentally overwritten during deployments or when the app is scaled up/down? (The Web App is configured for deployments from a Local Git Repository)
Q3: Where to learn more about the file system used in Azure App Service, the official source URL? E.g. how it's shared between multiple VMs within a single Web App, how does it work when the App is scaled up/down, what's the difference between D:\home (persistent) vs D:\local (non-persistent)...
Note that SQLite does not work in Azure Blob Storage, so that one is not an option. Please don't suggest alternative storage solutions; this question is specifically about SQLite.
References
Appropriate Uses For SQLite
In a Web App, your app is deployed to d:\home\site\wwwroot. This is the area where you may write files. As an example, the Ghost deployment writes its SQLite database to d:\home\site\wwwroot\content\data\ghost.db (this is easy to see if you open up the Kudu console via yourapp.scm.azurewebsites.net).
This file area is shared amongst your web app instances, similar to an SMB file share, but specific to web apps (and different from Azure's File service).
The content under wwwroot is durable, unless you delete your app service. Scaling up/down impacts the amount of space available. (I have no idea what happens if you scale down and the smaller size has less disk space than what you're consuming already).
I would say the best location would be the App_Data folder under site\wwwroot. Create the folder if it doesn't exist.
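As a rough illustration, a connection helper like the one below (assuming the System.Data.SQLite package; the file name is just an example) would put the database under App_Data, which ASP.NET does not serve to public requests by default:

```csharp
using System.Data.SQLite;    // System.Data.SQLite NuGet package (assumed)
using System.IO;
using System.Web.Hosting;

public static class Db
{
    public static SQLiteConnection Open()
    {
        // Resolves to D:\home\site\wwwroot\App_Data on an Azure Web App
        var folder = HostingEnvironment.MapPath("~/App_Data");
        Directory.CreateDirectory(folder);

        var path = Path.Combine(folder, "database.sqlite");
        var conn = new SQLiteConnection("Data Source=" + path + ";Version=3;");
        conn.Open();
        return conn;
    }
}
```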
Web Apps can connect to storage accounts, so you can in fact use blob storage and connect it to your web app. In terms of learning more about it, you should look at the appropriate page of the documentation.
In your Web App settings you can then select which storage account to use. You can find this under Settings > Data Connections, where you can select Storage from the drop-down box.
I have an ASP.NET Web Forms web site that uses many files from the server disk, accepts uploads, and processes files on the server. All the files are stored on the web server's disks.
I would like to move my site to Azure Web Sites. But to do that, I think we need to update the site code to keep files in Azure blobs and process them from there. Right now we are not able to do that. So can I move my web site to Azure without using Azure blobs? Is there any way I can move my whole site and its files to Azure, and keep and publish them on Azure, but not in Azure blobs?
Using a virtual machine is not an option for us right now.
Every Azure App Service/Website comes with persisted storage, which is technically an Azure storage blob mapped to the local file system. However, your code need not be aware of that. The details are described in the File System section here.
If you can configure paths for your server files, this persisted storage should suffice.
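A minimal sketch of such a configurable path, assuming a hypothetical FileStoreFolder app setting, could root everything under the persisted D:\home area (exposed via the HOME environment variable on App Service):

```csharp
using System;
using System.Configuration;
using System.IO;

public static class FileStorePath
{
    // Returns a folder under the persisted D:\home area of an Azure App Service.
    // "FileStoreFolder" is a hypothetical appSettings key, not part of the question.
    public static string Get()
    {
        var home = Environment.GetEnvironmentVariable("HOME") ?? @"D:\home";
        var folder = ConfigurationManager.AppSettings["FileStoreFolder"] ?? @"data\uploads";
        var path = Path.Combine(home, folder);
        Directory.CreateDirectory(path);
        return path;
    }
}
```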
I'm new to working with files, so I have done some reading, although I feel that I'm still not certain how to deal with them using ASP.NET Web API.
What I want is to be able to reach images through my Web API. What I've read so far is that many people prefer saving the file and then requesting it by its URI; instead of saving the image to the database, you only save the URI there. So I then created an ImageController in the Web API that does exactly this (at least it works using localhost). Some people now argue that I should use blob storage (since I use Azure).
My question is: is it wrong or bad practice to have a folder in my project where I save my image files? If so, what would be a better way to save images?
Your question is really two questions:
1. database vs. filesystem
It depends on 2 main factors: security and performance.
If your images are sensitive and the risk of accessing them "outside" your app (for example by hotlinking) is unacceptable, you must go for the database and serve images via an ASP.NET request - which you can authenticate any way you want. However, this is MUCH more resource-intensive than the second option (see below).
If security is no concern, you definitely want to go for filesystem storage. On traditional hosting, you would save them "anywhere" on disk, and IIS (the web server) would serve them to the user (by direct URL, bypassing your ASP.NET application). This alone is a HUGE performance improvement over the DB+ASP.NET solution for many reasons (request thread pool, memory pressure, avg. request duration, caching on IIS...).
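To illustrate the first (database) option, an authenticated image action could look roughly like this; the ImagesController, IImageRepository and ImageRecord names are hypothetical placeholders, assuming some dependency injection is already set up:

```csharp
using System.Web.Mvc;

[Authorize]   // only authenticated users can fetch images
public class ImagesController : Controller
{
    private readonly IImageRepository _repository;

    public ImagesController(IImageRepository repository)
    {
        _repository = repository;
    }

    public ActionResult Get(int id)
    {
        // Hypothetical repository call: loads the image bytes + content type from the database
        var image = _repository.Find(id);
        if (image == null)
            return HttpNotFound();

        return File(image.Bytes, image.ContentType);
    }
}

public interface IImageRepository
{
    ImageRecord Find(int id);
}

public class ImageRecord
{
    public byte[] Bytes { get; set; }
    public string ContentType { get; set; }
}
```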
2. Local directory on webrole vs. blob storage
However, in Azure you can, and HAVE TO, go one step further - use dedicated blob storage, independent from your web role, so that not even the IIS on your web role serves them; a dedicated server on blob storage does (this does not need to concern you at all - it just works). Your web role should not store anything permanently - it can fail, or be destroyed and replaced with a new one at any time by the Azure fabric. All permanent content must go to Azure blob storage.
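A rough sketch of such an upload, assuming the WindowsAzure.Storage NuGet package, an "images" container and a hypothetical StorageConnectionString app setting, might look like this:

```csharp
using System.Configuration;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ImageBlobStore
{
    // Uploads a stream to blob storage and returns the blob URI,
    // which can then be stored in the database alongside the image metadata.
    public static string Upload(Stream imageStream, string fileName)
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        var container = account.CreateCloudBlobClient().GetContainerReference("images");
        container.CreateIfNotExists();

        var blob = container.GetBlockBlobReference(fileName);
        blob.UploadFromStream(imageStream);

        return blob.Uri.ToString();
    }
}
```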
Adding to #rouen's excellent answer (and I will focus only on local directory vs. blob storage).
Assuming you're deploying your WebApi as an Azure Web Application (instead of Web Role), consider the following scenarios:
What would happen to your images if you accidentally deleted your Web Application? In that case, all your images would be lost.
What would happen if you need to scale your application from one instance to more than one? Since the files are located in an instance's local directory, they will not be replicated across other instances. If a request to fetch an image lands on an instance where that image is not present on the VM, you will not be able to serve it.
What would happen if you need to store more images than the disk size available to you in your web application?
What would happen if you need to serve the same images through another application (other than your WebApi)?
Considering all these, my recommendation would be to go with blob storage. What you do is store the image in blob storage and save the image URL in the database. Some of the advantages offered by blob storage are:
At this time, you can store 500 GB worth of data in a single blob storage account. If you need more space, you simply create another storage account. Currently you can create 100 storage accounts per subscription so essentially you can store 50TB worth of data in a single subscription.
Blob storage is cheap and you only pay for the storage that you use + storage transaction costs which are also very cheap.
Contents in blob storage are replicated three times, and if you opt for the geo-redundancy option (there's an extra cost for that), your content is replicated six times (three times in the primary region + three times in the secondary region). However, you should not confuse replication with backup!
Since the content is served from blob storage, you're essentially freeing up the IIS server from serving that content.
You can take advantage of Azure CDN and serve the content from a CDN node closest to your end user.
Considering all these advantages, I would strongly urge you to take a look at blob storage.
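For the blob URLs saved in the database to be directly downloadable by browsers (and cacheable by a CDN), the container needs anonymous read access; a small sketch, assuming the WindowsAzure.Storage package:

```csharp
using Microsoft.WindowsAzure.Storage.Blob;

public static class ContainerSetup
{
    // Allows anonymous read access to individual blobs (but not listing the container),
    // so stored image URLs can be served straight from blob storage / CDN, bypassing IIS.
    public static void AllowPublicReadOnBlobs(CloudBlobContainer container)
    {
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
    }
}
```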
Is there an API available to pull telemetry/insights data for websites from the Azure portal, so I can display that data on my 3rd-party site? I'm looking for something similar to the Google Analytics Embed API.
I do not think there is a specific API for Azure Web Apps analytics (the new name, as I found out today). However, you can configure your web site to log to Azure storage - either table storage or blob storage, and then use the regular storage REST APIs (or use the storage client classes from one of the Azure SDKs). You can do this configuration when you enable logging in the Azure portal. The choices for log destination are File (i.e. the local filesystem - these can then be pulled by FTP) or Table or Blob (these can then be accessed by the storage REST API).
It is summarised here:
http://azure.microsoft.com/en-us/documentation/articles/web-sites-enable-diagnostic-log/
Note: In that link they describe how you can stream logs for real time log viewing (using PowerShell or the Azure command line tools). I think this is intended for debugging purposes rather than embedding in an application, but it might help you...
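If you do go down the table storage route, reading the entries back with the storage client library could look roughly like the sketch below; the WAWSAppLogTable table name is an assumption and depends on how logging was configured:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class WebAppLogReader
{
    public static void DumpRecentEntries(string storageConnectionString)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        // Table name is a guess - check which table your diagnostics configuration writes to
        var table = account.CreateCloudTableClient().GetTableReference("WAWSAppLogTable");

        var query = new TableQuery<DynamicTableEntity>().Take(100);
        foreach (var entity in table.ExecuteQuery(query))
        {
            EntityProperty message;
            entity.Properties.TryGetValue("Message", out message);
            Console.WriteLine("{0:u}  {1}", entity.Timestamp,
                message != null ? message.StringValue : "");
        }
    }
}
```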
An alternative approach (disclaimer: I never tried this) is to integrate NewRelic into your application and then use their REST API to extract their monitoring information:
https://docs.newrelic.com/docs/apm/apis/requirements/extracting-metric-data