Synchronization of Cloud Blobs with Local Files - C#

In a desktop application I'm developing, I would like a feature to store and sync the local save files. Basically, I would like a sync button (or an automatic sync when the program starts) that does the following:
Download all the blobs (files) in the Azure storage account that are newer than the local file or do not exist in the local save folder.
Upload all the files in the local save folder that are newer than the Azure storage blob or are not present in the storage account.
Basically, I want both the local folder and the storage account to contain the same files when the syncing is over, keeping only the newer versions.
To do this, I first tried comparing them by their DateTime, which works with one exception: every time I sync, the time it takes to upload/download leaves the DateTimes off by a couple of seconds. As a result, either the local version or the cloud version always appears newer than the other, so the program finds something to sync every time the sync button is pressed. I could apply a time threshold to the comparison, but that doesn't sound like a valid solution.
Is there a way to make this work through time comparison? Or is there a way to compare the actual files and see if the file has changed? Or any other solution?
Note: I am using File.GetLastWriteTime() to get the local file's time, reading the "Last-Modified" header from the response for the storage blob's date, and using DateTime.Compare() to compare them.
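For reference, a minimal sketch of that comparison, assuming the Azure.Storage.Blobs SDK (the connection string, container, and file names are placeholders). Comparing both sides in UTC avoids time-zone surprises, but both timestamps are still rewritten by every transfer, which is exactly the drift described above:

using System;
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var blob = new BlobClient("<connection-string>", "saves", "save1.dat"); // placeholders
string localPath = @"C:\Saves\save1.dat";                               // placeholder

BlobProperties props = await blob.GetPropertiesAsync();

// Both timestamps in UTC; each upload/download rewrites one of them,
// so after a sync they always differ by a few seconds.
DateTime localTime = File.GetLastWriteTimeUtc(localPath);
DateTime blobTime = props.LastModified.UtcDateTime;

int result = DateTime.Compare(localTime, blobTime); // >0 local newer, <0 blob newer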

You can complement the date comparison with the blob's ETag property, which is updated on every modification of the blob.
Here is an article that presents the use of ETags for managing blobs:
https://azure.microsoft.com/en-us/blog/managing-concurrency-in-microsoft-azure-storage-2/
You can find more details on the blob service's conditional headers, which can also help you perform the synchronization:
https://learn.microsoft.com/fr-fr/rest/api/storageservices/specifying-conditional-headers-for-blob-service-operations
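As a concrete illustration, a minimal sketch of a change check that sidesteps timestamp drift, assuming the Azure.Storage.Blobs SDK: it compares the MD5 hash of the local file with the Content-MD5 the service stores for the blob:

using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

static async Task<bool> BlobMatchesLocalFileAsync(BlobClient blob, string localPath)
{
    BlobProperties props = await blob.GetPropertiesAsync();

    // Hash the local file the same way the service hashes uploads.
    using var md5 = MD5.Create();
    using var stream = File.OpenRead(localPath);
    byte[] localHash = md5.ComputeHash(stream);

    // ContentHash is null when no Content-MD5 was set at upload time.
    return props.ContentHash != null &&
           localHash.AsSpan().SequenceEqual(props.ContentHash);
}

If the hashes match, nothing needs to be transferred regardless of what the timestamps say; you can also record the blob's ETag (props.ETag) after each sync and treat the blob as unchanged for as long as the ETag stays the same.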

Related

Application permanently store variable value

I am using Properties.Settings.Default to permanently store a value in a C# application on the same PC.
The problem I am facing is that when I save the value and then copy the application to another PC, the stored value does not carry over. Does the Properties.Settings approach not work in this scenario? If so, please advise a solution.
You need to store the settings in a location that is accessible by all devices you plan to run your program on.
You can either
make sure the current location of the settings file is synchronized between all devices - this way you can keep your existing code. You can sync the file via roaming profiles in a Windows domain, by letting a file-share synchronization tool such as OneDrive sync it, by manually copying the file, or in any other way you can find.
write the settings file yourself to a shared location that can be accessed by all devices - pretty much any service that allows you to upload data would do. Some allow authenticated access so you can scope the settings to a particular user (OneDrive, Google Drive, ...); others offer some form of anonymous/semi-authenticated upload (which makes personalized settings harder and leaves them public for all to see). You may still be able to reuse some of your existing code, but serializing your settings to JSON and uploading that would likely be easier, as in the sketch below.
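To illustrate, a minimal sketch that persists settings as JSON in a folder a sync tool such as OneDrive already replicates; the AppSettings type and the path are hypothetical:

using System;
using System.IO;
using System.Text.Json;

public record AppSettings(string Theme, int FontSize);

public static class SettingsStore
{
    // Hypothetical shared location; any folder synced across devices works.
    private static readonly string SettingsPath = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.UserProfile),
        "OneDrive", "MyApp", "settings.json");

    public static void Save(AppSettings settings)
    {
        Directory.CreateDirectory(Path.GetDirectoryName(SettingsPath)!);
        File.WriteAllText(SettingsPath, JsonSerializer.Serialize(settings));
    }

    public static AppSettings? Load() =>
        File.Exists(SettingsPath)
            ? JsonSerializer.Deserialize<AppSettings>(File.ReadAllText(SettingsPath))
            : null;
}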

How do I detect an Azure blob storage file change and delete the locally cached instance of that file?

I am currently migrating a legacy .NET application from a dedicated server to an Azure web app. The application uses System.Web.Caching's CacheDependency for XML file caching.
Caching.CacheDependency(xmlFile) normally detects changes made to the file and updates the cache with the latest version.
The issue is that the files are now stored in an Azure storage account (i.e. not on the local file system), and I need a way to detect changes made to them. Caching.CacheDependency(xmlFile) will not work in this case because it expects a local path.
Since the file-based CacheDependency does not detect changes to files in Azure blob storage, how can the web app detect changes and remove the stale file from its local cache?
I am thinking that a web function with a blob trigger would solve the file-monitoring part, but how do I remove the file from the web app's System.Web cache? I am also concerned about excessive resource consumption; there are thousands of files.
Has anyone run into this yet, and if so, what was your solution?
I had an issue like that.
The solution was to create a new endpoint in the web app that does nothing but clear the cache. We then built a WebJob with a blob storage trigger; when the trigger fires, the WebJob calls the new endpoint with a POST, and the cache is refreshed with the new data.
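A hedged sketch of that setup, using the WebJobs SDK's blob trigger; the container name and the invalidation endpoint URL are assumptions:

using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CacheInvalidator
{
    private static readonly HttpClient Http = new HttpClient();

    // Fires whenever a blob in the "xml-files" container is added or updated.
    [FunctionName("InvalidateCache")]
    public static async Task Run(
        [BlobTrigger("xml-files/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Blob changed: {name}; invalidating web app cache.");

        // Hypothetical endpoint in the web app; its handler would evict the
        // stale entry, e.g. HttpRuntime.Cache.Remove(name).
        await Http.PostAsync(
            $"https://myapp.azurewebsites.net/cache/invalidate?key={name}", null);
    }
}

Since the trigger fires per blob, only the changed entry is evicted, which keeps resource consumption manageable even with thousands of files.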

Reading/Writing SQLite3 database to OneDrive with WinRT 8.1

I am having a problem getting an SQLite3 database to work with OneDrive. I have no problems running the database from ApplicationData.Current.RoamingFolder, but when I try to add a FolderPicker to allow the user to specify a folder to store the database, I keep getting a CannotOpen error from SQLite.
I am adding a token for the folder to the FutureAccessList like so:
// Prompt the user to pick a folder, which grants the app permission to use it
var folderPicker = new FolderPicker();
folderPicker.FileTypeFilter.Add("*");
_currentFolder = await folderPicker.PickSingleFolderAsync();
// Store a token for the folder so it can be reopened in later sessions;
// the indexer (rather than Add) avoids an exception if the key already exists
var pickedFolderToken = StorageApplicationPermissions.FutureAccessList.Add(_currentFolder);
ApplicationData.Current.LocalSettings.Values["FolderTokenSettingsKey"] = pickedFolderToken;
As I said, the application runs fine when using RoamingFolder, but I am trying to get the database to be stored on remote storage so I can keep it synchronized between multiple devices.
I initially thought that RoamingFolder would be synchronized across multiple devices (as long as it is the same Microsoft account logged in), but that does not appear to be the case, so I am trying to get it to work with SkyDrive, or some other remote storage service.
The SQLite3 database is configured to create the database if it does not exist, and I have tried both scenarios with the same result - CannotOpen in either case. Does anyone know if this is allowed in WinRT 8.1 applications? My other option is to copy the database from the picked folder to the RoamingFolder, access that copy from the application, and then copy it back when writing to the database.
SQLite works with native Windows filesystem APIs (Win32), but the pickers (like FolderPicker) and the FutureAccessList work only with brokered filesystem APIs (Windows.Storage). There is no way to mix the two at the same time.
But items in your roaming folder should get copied to other devices signed in with the same Microsoft account, although it isn't instantaneous. How long have you waited to see if it turns up? How big is the file? Did you read this topic and the related items on MSDN?
Otherwise, yes, your workaround of copying the file from the brokered location into your app's storage, modifying it, and copying it back once it has been closed would work, as in the sketch below.
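A minimal sketch of that workaround, assuming the folder token from the question was stored under "FolderTokenSettingsKey" and a hypothetical database file name of app.db:

using Windows.Storage;
using Windows.Storage.AccessCache;

// Re-acquire the brokered folder via the stored FutureAccessList token.
var token = (string)ApplicationData.Current.LocalSettings.Values["FolderTokenSettingsKey"];
StorageFolder remoteFolder = await StorageApplicationPermissions.FutureAccessList.GetFolderAsync(token);

// Copy the database into app-local storage, where SQLite's Win32 calls can reach it.
StorageFile remoteDb = await remoteFolder.GetFileAsync("app.db");
StorageFile localDb = await remoteDb.CopyAsync(
    ApplicationData.Current.LocalFolder, "app.db", NameCollisionOption.ReplaceExisting);

// ... open localDb.Path with SQLite, make changes, close the connection ...

// Push the modified copy back to the brokered folder.
await localDb.CopyAndReplaceAsync(remoteDb);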

Redeploy All Files to Cloud Storage After Implementing within Kentico

I had a Kentico site that I initially ran without any cloud storage. Now that I have switched to Amazon S3 following their documentation (https://docs.kentico.com/display/K8/Configuring+Amazon+S3), I have a lot of files that are still stored locally. I want to move them into the cloud automatically, without having to touch each file.
Is there an easy way to automatically push the files in the media library, app_theme, attachments, images, etc. into the new bucket in Amazon S3 cloud storage?
It should be possible to move all the files to Amazon S3 storage, but it is important to note there are a few downsides to the process as well.
First of all, you can copy all the files you want to move to the corresponding bucket in the storage, which should ensure the system looks for and retrieves those files from the storage instead of the local file system. I would also recommend deleting the files from your local file system afterwards, because leaving them in place might cause conflicts in specific scenarios.
The downside is that Amazon S3 does not support some special characters in object names, so you might need to adjust file names manually, which in turn means the references to those files would need to be changed accordingly.
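As a sketch of the bulk copy, the AWS SDK for .NET's TransferUtility can upload an entire folder while preserving relative paths; the bucket name, local directory, and key prefix below are assumptions, and you would repeat the call for each folder (media libraries, attachments, images, and so on):

using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;

var transfer = new TransferUtility(new AmazonS3Client()); // credentials from config/env

// Recursively upload one local folder, keeping the same relative structure
// that Kentico used on disk so its storage provider can find the objects.
await transfer.UploadDirectoryAsync(new TransferUtilityUploadDirectoryRequest
{
    BucketName = "my-kentico-bucket",
    Directory = @"C:\inetpub\wwwroot\Kentico\CMS\MySite\media",
    KeyPrefix = "MySite/media",
    SearchOption = SearchOption.AllDirectories
});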

Azure Instance change deleted files

I am new to Windows Azure. I am using a 6-month subscription plan with 1 VM. Yesterday the instance of my VM changed and my files in the root folder were deleted. How can I restore those files, and how can I prevent this in the future?
Azure PaaS instances aren't meant to store anything persistently on the local disk, since it may (and eventually will) be wiped when the instances are automatically replaced.
For best performance, you could store the files in blob storage where they can be accessed by any server using your storage account instead of just the single machine.
If you really need persistent file storage, you can attach blob storage as a local disk and store your data there. Note though that the disk will only be accessible by one instance at a time.
As for the files stored on the local file system when the instance was replaced, unless you have a backup, I know of no way to restore them.
This link is a good read on the various storage options in Azure and gives much more detail than this space allows for.
If you are using a virtual machine (IaaS):
Add a data disk and store files there, not on the OS disk
You are responsible for making backups yourself
If you are using cloud services (PaaS):
Don't store data on the machine itself; use it only for caches and temporary data
Add a data disk and mount it if it's a single server
Use blob storage if the data is used from multiple hosts
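For the blob storage option, a minimal sketch with the Azure.Storage.Blobs SDK; the connection string, container, and blob names are placeholders:

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

var container = new BlobContainerClient("<storage-connection-string>", "app-files");
await container.CreateIfNotExistsAsync();

// Data written here survives instance replacement and is reachable from
// every instance that has the storage account's credentials.
await container.UploadBlobAsync("reports/daily.txt",
    new BinaryData("contents that must outlive this VM"));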
