Azure instance change deleted files - C#

I am new to Windows Azure. I am using a 6-month subscription plan with 1 VM. Yesterday the instance of my VM changed and my files in the root folder were deleted. How can I restore those files, and how can I prevent this in the future?

Azure PaaS instances aren't meant to store anything persistently on the local disk, since it may (and eventually will) be wiped when the instances are automatically replaced.
For best performance, you could store the files in blob storage where they can be accessed by any server using your storage account instead of just the single machine.
If you really need persistent file storage, you can attach blob storage as a local disk and store your data there. Note though that the disk will only be accessible by one instance at a time.
As for the files stored on the local file system when the instance was replaced, unless you have a backup, I know of no way to restore them.
This link is a good read regarding the various storage options in Azure and gives much more detail than this space allows.

If you are using a virtual machine (IaaS):
Add a data disk and store files there, not on the OS disk
You are responsible for making backups yourself
If you are using cloud services (PaaS):
Don't store data on the machine itself; use it only for cache and temporary data
Add a data disk and mount it, if it's a single server
Use blob storage if the data is used from multiple hosts (a minimal sketch follows below)
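For the blob storage option, a minimal sketch using the current Azure.Storage.Blobs SDK might look like the following; the container name, file paths, and connection-string setting are assumptions, not anything from the question:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class BlobFileStore
{
    // Connection string is assumed to live in an app setting / environment variable.
    static readonly string ConnectionString =
        Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");

    static async Task Main()
    {
        var container = new BlobContainerClient(ConnectionString, "app-files");
        await container.CreateIfNotExistsAsync();

        // Upload (or overwrite) a local file so every instance can read it later,
        // even after this VM/instance is replaced.
        BlobClient blob = container.GetBlobClient("reports/summary.pdf");
        await blob.UploadAsync(@"D:\local\summary.pdf", overwrite: true);

        // Any instance can download it again when needed.
        await blob.DownloadToAsync(@"D:\local\summary.pdf");
    }
}
```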

Related

How do I detect an Azure blob storage file change and delete the local cached copy of that file?

I am currently migrating a legacy .NET application from a dedicated server to an Azure web app. The application uses System.Web.Caching.CacheDependency for XML file caching.
Caching.CacheDependency(xmlFile) normally detects changes made to the file and updates the cache with the latest version.
The issue is that the files are now stored in an Azure storage account (i.e. not on the local file system) and I need a way to detect changes made to them. Caching.CacheDependency(xmlFile) will not work in this case, as it expects a local path.
Since the file-based CacheDependency does not detect changes to files in Azure blob storage, how can we make the web app detect changes and remove the stale file from the local cache?
I am thinking that an Azure Function or WebJob with a blob trigger would solve the file-monitoring part, but how do I then remove the file from the web app's System.Web cache? I am also concerned about excessive resource consumption, as there are thousands of files.
Has anyone run into this yet, and if so, what was your solution?
I had an issue like that.
The solution was to create a new endpoint in the web app that simply clears the cache. We then built a WebJob with a blob storage trigger; when the trigger fires, the WebJob calls the new endpoint with a POST, and the app reloads the fresh data into its cache.
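As a rough sketch of that pattern (the container name, route, and cache key below are assumptions; the trigger uses the WebJobs SDK and the endpoint evicts from the classic System.Web cache the question mentions):

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public class CacheInvalidationJob
{
    static readonly HttpClient Http = new HttpClient();

    // Fires whenever a blob in the "xml-config" container is added or updated.
    public static async Task OnXmlBlobChanged(
        [BlobTrigger("xml-config/{name}")] string content, string name)
    {
        // Tell the web app which cached file is now stale.
        await Http.PostAsync(
            "https://myapp.azurewebsites.net/cache/invalidate?file=" + name, null);
    }
}

// In the web app, the endpoint only has to evict the stale entry, e.g.:
//
//   [HttpPost]
//   public ActionResult Invalidate(string file)
//   {
//       HttpRuntime.Cache.Remove(file);   // same key used when the XML was cached
//       return new HttpStatusCodeResult(200);
//   }
```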

Microsoft Azure App Service Storage

I have a question about purchasing a Microsoft Azure App Service to host my app. I have already tested the Free tier and I am considering switching to a Basic tier.
Here is my question.
I have seen in the table on the Azure website (here) that I will get 10 GB of disk space for my application files.
However, when I went to the pricing calculator, that space is listed as temporary storage.
So my question is:
Why is it shown as 10 GB of temporary storage? Will I lose my application files located in the wwwroot folder at some point?
Will I lose my application files located in the wwwroot folder at some point?
No, your application files will not be lost if they are located in the wwwroot folder. We can find the answer in the documentation:
Persisted files
This is what you can view as your web site's files. They follow a structure described here. They are rooted in d:\home, which can also be found using the %HOME% environment variable.
These files are persistent, meaning that you can rely on them staying there until you do something to change them. Also, they are shared between all instances of your site (when you scale it up to multiple instances). Internally, the way this works is that they are stored in Azure Storage instead of living on the local file system.
Free and Shared sites get 1 GB of space, Basic sites get 10 GB, and Standard sites get 50 GB.
Temporary files
A number of common Windows locations use temporary storage on the local machine. For instance:
%APPDATA% points to something like D:\local\AppData.
%TMP% goes to D:\local\Temp.
Unlike Persisted files, these files are not shared among site instances. Also, you cannot rely on them staying there. For instance, if you restart a web app, you'll find that all of these folders get reset to their original state.
For Free, Shared and Consumption (Functions) sites, there is a 500MB limit for all these locations together (i.e. not per-folder). For Standard and Basic sites, the limit is very high (over 100GB).
We can check the application files under D:\home\site\wwwroot using the Azure Kudu tool (https://yoursite.scm.azurewebsites.net/).
Available disk space is shown on the Environment page.
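In code, the two areas described above map to environment variables, so a small sketch of the distinction (the file names here are hypothetical) would be:

```csharp
using System;
using System.IO;

class StoragePathsDemo
{
    static void Main()
    {
        // d:\home - persisted, shared between instances, survives restarts.
        string persistedRoot = Environment.GetEnvironmentVariable("HOME");

        // d:\local\Temp - per-instance, reset when the web app restarts.
        string tempRoot = Environment.GetEnvironmentVariable("TMP");

        // Files the app must keep belong under the persisted root:
        File.WriteAllText(
            Path.Combine(persistedRoot, @"site\wwwroot\App_Data\settings.json"), "{}");

        // Scratch output can go to the temporary area:
        string scratch = Path.Combine(tempRoot, "render.tmp");
        File.WriteAllText(scratch, "temporary working data");
    }
}
```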

Synchronization of Cloud Blobs with the local files

In a desktop application I'm developing, I would like a feature to store and sync the local save files. Basically, I would like a sync button (or an automatic sync when the program starts) that does the following:
Download all the blobs (files) in the Azure storage account that are newer than the local file or that do not exist in the local save folder.
Upload all the files in the local save folder that are newer than the Azure storage blob or that are not present in the storage account.
Basically, I want both the local folder and the storage account to contain the same files when the sync is over, keeping only the newer version of each.
To do this, I first tried comparing them by their DateTime, which works with one exception. Every time I sync, the timestamps end up a couple of seconds apart because of the time it takes to upload/download, so either the local version or the cloud version always appears to be newer than the other. As a result, the program finds something to sync every time the sync button is pressed. I could apply a time threshold to the comparison, but that doesn't feel like a valid solution.
Is there a way to make this work through time comparison? Or is there a way to compare the actual files and see if the file has changed? Or any other solution?
Note: I am using File.GetLastWriteTime() for the local file's time, reading the "Last-Modified" header from the response for the blob's date, and comparing them with DateTime.Compare().
You can probably complement the date with the ETag property, which is updated on every modification of the blob.
Here is an article that presents the use of ETags for managing blobs:
https://azure.microsoft.com/en-us/blog/managing-concurrency-in-microsoft-azure-storage-2/
You can find more details on the Blob service headers that can help you perform the synchronization here:
https://learn.microsoft.com/fr-fr/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations
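A minimal sketch of that idea, assuming you persist the blob's ETag and the local write time observed at the last successful sync (for example in a small manifest file); the SyncState type and the names below are assumptions, not a library API:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

public record SyncState(string ETagAtLastSync, DateTime LocalWriteTimeUtcAtLastSync);

public static class Synchronizer
{
    public static async Task SyncOneAsync(BlobClient blob, string localPath, SyncState last)
    {
        BlobProperties props = await blob.GetPropertiesAsync();

        bool cloudChanged = props.ETag.ToString() != last.ETagAtLastSync;
        bool localChanged = File.GetLastWriteTimeUtc(localPath) != last.LocalWriteTimeUtcAtLastSync;

        if (cloudChanged && !localChanged)
            await blob.DownloadToAsync(localPath);               // only the blob changed
        else if (localChanged && !cloudChanged)
            await blob.UploadAsync(localPath, overwrite: true);  // only the local file changed
        else if (cloudChanged && localChanged)
        {
            // Both sides changed since the last sync: resolve the conflict explicitly
            // (keep the newer one, keep both, ask the user, ...).
        }

        // After a successful transfer, re-read the blob properties and the local write
        // time and persist them as the SyncState for the next run.
    }
}
```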

Redeploy All Files to Cloud Storage After Implementing within Kentico

I had a Kentico site that I was using at first without any cloud storage. Now that I have switched to Amazon S3 using their documentation (https://docs.kentico.com/display/K8/Configuring+Amazon+S3), I have a lot of files that are still being stored locally. I want to move them into the cloud automatically without having to touch each file.
Is there an easy way to automatically push files in media library, app_theme, attachments, images, etc into the new bucket in the Amazon S3 cloud storage?
It should be possible to move all the files to Amazon S3 storage, but it is important to note that there are a few downsides to the process as well.
First of all, you can copy all the files you want to move to the corresponding bucket in the storage, which should ensure the system looks for and retrieves the files from the storage instead of the local file system. However, I would also recommend deleting the files from your local file system afterwards, because leaving them in place can cause conflicts in specific scenarios.
The downside is that Amazon S3 does not support some special characters, so you might need to adjust file names manually, which would mean changing the references accordingly.
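As a hedged sketch of the bulk copy itself, the AWS SDK for .NET's TransferUtility can push a whole local folder into the bucket; the bucket name and local path below are assumptions, and the resulting folder layout in the bucket must match what Kentico's S3 provider expects:

```csharp
using Amazon.S3;
using Amazon.S3.Transfer;

class MediaLibraryUploader
{
    static void Main()
    {
        // Credentials are picked up from the standard AWS configuration/environment.
        var transfer = new TransferUtility(new AmazonS3Client());

        // Recursively uploads every file under the local media library folder,
        // keeping the relative paths as S3 object keys.
        transfer.UploadDirectory(new TransferUtilityUploadDirectoryRequest
        {
            BucketName = "my-kentico-bucket",
            Directory = @"C:\inetpub\kentico\CMS\MyProject\media",
            SearchOption = System.IO.SearchOption.AllDirectories
        });
    }
}
```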

PDF Attachments in Azure, use memory or temporary directory?

I'm planning on writing an application that sends multiple PDFs to the users' emails as attachments.
Should I use memory (MemoryStream) or is there a temporary directory that I can use? Which is more advisable? Thanks!
BTW I'm using C# ASP.NET
I would go with file-system storage, since memory is a scarcer resource. Windows Azure provides Local Storage Resources for this purpose: areas of disk that you configure in the Service Definition and then access through the Azure SDK at runtime. They are not permanent storage and will get cleaned up when a role recycles, which makes them ideal for temporary operations such as the one you describe. You should still clean up the files after each operation to make sure you don't fill up the space.
Full information on Local Storage Resources is here: http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx
A table detailing the amount of disk space available for Local Storage Resources on each instance size is here: http://msdn.microsoft.com/en-us/library/windowsazure/ee814754.aspx
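A minimal sketch of using such a resource from a role, assuming a Local Storage Resource named "PdfScratch" has been declared in ServiceDefinition.csdef (the resource name, addresses, and file names are hypothetical):

```csharp
using System.IO;
using System.Net.Mail;
using Microsoft.WindowsAzure.ServiceRuntime;

class PdfMailer
{
    static void Main()
    {
        // Resolve the local storage resource declared in ServiceDefinition.csdef.
        LocalResource scratch = RoleEnvironment.GetLocalResource("PdfScratch");
        string pdfPath = Path.Combine(scratch.RootPath, "invoice.pdf");

        // ... generate the PDF and write it to pdfPath ...

        using (var message = new MailMessage("noreply@example.com", "user@example.com"))
        using (var smtp = new SmtpClient())
        {
            message.Subject = "Your invoices";
            message.Attachments.Add(new Attachment(pdfPath, "application/pdf"));
            smtp.Send(message);
        }

        // Clean up so the local resource does not fill up over time.
        File.Delete(pdfPath);
    }
}
```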
You could use a different pattern: put the PDFs in blob storage and place a queue message with the e-mail address and the list of PDFs to send, then have a separate worker role build and send the e-mail. You could use an X-Small or Small instance. Since this also allows for asynchronous communication, you could start with just one instance; if it can't keep up, spin up a second one via the config file (i.e. no re-deployment). This also has the added benefit of giving your solution more aggregate bandwidth.
If the traffic isn't very heavy, you could just spin up a separate thread (or process) that does the same thing.
Pat
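A rough sketch of the queue hand-off Pat describes, written against the current Azure.Storage.Queues SDK (which the original answer predates); the queue name, blob names, and addresses are assumptions:

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Queues;

class EmailJobProducer
{
    static async Task Main()
    {
        string connectionString =
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");

        // Web front end: stage the PDFs in blob storage, then enqueue a small job message.
        var queue = new QueueClient(connectionString, "email-jobs");
        await queue.CreateIfNotExistsAsync();

        var job = new
        {
            To = "user@example.com",
            PdfBlobs = new[] { "pdfs/invoice-123.pdf", "pdfs/invoice-124.pdf" }
        };
        await queue.SendMessageAsync(JsonSerializer.Serialize(job));

        // A separate worker instance polls the queue, downloads the listed blobs,
        // builds and sends the e-mail, and finally deletes the queue message.
    }
}
```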
