Copy Azure Storage data to development storage - C#

I have been working on an ASP.NET application which populates its UI from an Azure storage account.
It makes development extremely tedious: for a small change in the UI code, I need to wait for all the data to reload from Azure storage (which is extremely slow unless PartitionKey & RowKey are provided properly).
I wish to synchronize all data in the cloud storage account to local development
storage.
I am sure I am not the first one to face this problem. How does the
community handle this scenario? Is there a tool which could copy
all data, including blobs & tables, to a local development account?
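To make it concrete, the kind of one-way copy I'm after would look roughly like this sketch using the Azure.Storage.Blobs SDK (the source connection string is a placeholder, and tables would need similar handling with Azure.Data.Tables):

```csharp
// Rough one-way copy of every blob from a cloud account into the local
// development emulator (Azurite / storage emulator). Error handling,
// concurrency, and table sync are omitted.
using Azure.Storage.Blobs;

var source = new BlobServiceClient("<cloud-account-connection-string>");
var dest   = new BlobServiceClient("UseDevelopmentStorage=true");

await foreach (var container in source.GetBlobContainersAsync())
{
    var srcContainer = source.GetBlobContainerClient(container.Name);
    var dstContainer = dest.GetBlobContainerClient(container.Name);
    await dstContainer.CreateIfNotExistsAsync();

    await foreach (var blob in srcContainer.GetBlobsAsync())
    {
        // Stream each blob down and re-upload it locally.
        using var stream = await srcContainer.GetBlobClient(blob.Name).OpenReadAsync();
        await dstContainer.GetBlobClient(blob.Name).UploadAsync(stream, overwrite: true);
    }
}
```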

Related

Azure blob storage migration with huge data in each container with decryption

I have a requirement to migrate encrypted blobs from a source Azure storage account to a destination storage account in decrypted format (decrypted with a Key Vault key).
I have written C# code, but it was taking almost 3 days for a single container. I am now trying an Event Grid-triggered Azure Function connected to the destination storage account (on the new-file-captured event) and migrating blobs using an Azure Data Factory copy pipeline; the Azure Function uses an App Service plan which can scale out to 10 instances.
Am I on the right path? Is there any other, more performant way?
If your Azure Function is only needed to initiate the ADF pipeline, then I guess you can take advantage of an event-based trigger instead, or you can opt for a Logic App to do the same job with better performance.
Event-driven architecture (EDA) is a popular data integration paradigm that involves event creation, detection, consumption, and response. Data integration scenarios frequently require triggering pipelines based on storage account events, such as the arrival or deletion of a file in an Azure Blob Storage account.
Please check the link below to learn more about event-based triggers: Create a trigger that runs a pipeline in response to a storage event | Microsoft Docs
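That said, if you do keep the Event Grid-triggered function in the flow, a minimal sketch of starting a pipeline run from it might look like this (assuming the Microsoft.Azure.Management.DataFactory SDK; all IDs, names, and the sourcePath pipeline parameter are placeholders):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;

public static class StartDecryptCopyPipeline
{
    [FunctionName("StartDecryptCopyPipeline")]
    public static async Task Run([EventGridTrigger] EventGridEvent blobEvent, ILogger log)
    {
        // Authenticate with a service principal (all IDs are placeholders).
        var authContext = new AuthenticationContext("https://login.microsoftonline.com/<tenant-id>");
        var token = await authContext.AcquireTokenAsync(
            "https://management.azure.com/", new ClientCredential("<client-id>", "<client-secret>"));

        var adf = new DataFactoryManagementClient(new TokenCredentials(token.AccessToken))
        {
            SubscriptionId = "<subscription-id>"
        };

        // Start the copy pipeline, passing the blob path from the event as a parameter.
        var run = await adf.Pipelines.CreateRunAsync(
            "<resource-group>", "<data-factory-name>", "CopyAndDecryptPipeline",
            parameters: new Dictionary<string, object> { ["sourcePath"] = blobEvent.Subject });

        log.LogInformation($"Started pipeline run {run.RunId}");
    }
}
```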
Also, you can consider increasing the Data Integration Units (DIUs) and parallel copy settings inside the copy activity, which helps improve the performance of your copy.
Sometimes there is a need to migrate a large amount of data from a data lake or an enterprise data warehouse (EDW) to Azure; other times, you need to import huge volumes of data into Azure from several sources for big data analytics. In each scenario, achieving optimal performance and scalability is important.
Please check the link below for more details: Copy activity performance and scalability guide

Difference between Azure Redis Cache and Azure CDN

I need to implement a cache in my application using Azure Cache for Redis, but I came across some blogs where I have the option to store my responses or data using Azure CDN instead.
Could someone explain the difference between them?
As per my understanding, Redis is used to store cached data, whereas a CDN caches data as well and also gives a faster response from a nearby server.
Azure Redis Cache
It perfectly complements Azure database services such as Cosmos DB. It provides a cost-effective solution to scale the read and write throughput of your data tier. Store and share database query results, session state, static content, and more using the common cache-aside pattern.
With the cache-aside pattern on Azure Storage, we first hit Redis Cache to see if the item is available. If so, we fetch it from the cache; otherwise, we pull the item from Table storage and re-cache it.
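A rough sketch of that cache-aside flow with StackExchange.Redis (the loadFromTable delegate stands in for whatever table or database access you use; the 10-minute TTL is illustrative):

```csharp
using System;
using System.Threading.Tasks;
using StackExchange.Redis;

public class CatalogCache
{
    private readonly IDatabase _cache;

    public CatalogCache(ConnectionMultiplexer redis) => _cache = redis.GetDatabase();

    // Cache-aside: try Redis first; on a miss, load from the table and re-cache.
    public async Task<string> GetItemAsync(string key, Func<string, Task<string>> loadFromTable)
    {
        RedisValue cached = await _cache.StringGetAsync(key);
        if (cached.HasValue)
            return cached;                                                  // cache hit

        string value = await loadFromTable(key);                            // cache miss
        await _cache.StringSetAsync(key, value, TimeSpan.FromMinutes(10));  // re-cache with a TTL
        return value;
    }
}
```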
Azure CDN
“A content delivery network (CDN) is a distributed network of edge servers that can efficiently deliver web content to users. CDNs store cached content on edge servers in point-of-presence (POP) locations that are close to end users, to minimize latency. A CDN profile, belonging to one Azure subscription, can have multiple CDN endpoints.”
From "What is a content delivery network on Azure?" (Microsoft Docs)
It lets you reduce load times, save bandwidth, and speed responsiveness—whether you’re developing or managing websites or mobile apps, or encoding and distributing streaming media, gaming software, firmware updates, or IoT endpoints.
Web-Queue-Worker on Azure App Service
Conclusion
Azure Cache for Redis stores session state and other data that needs low latency access.
Azure CDN is used to cache static content such as images, CSS, or HTML.

Azure instance change code roll back

I am new to Azure. I have a small instance of a cloud service, and in the last week my instance has changed 2 times and all my project data was lost; it rolled back to a state a month old. All my client data is lost. Is there any way to recover that data, and why does this issue occur?
There is no way to recover your data and there's no way to prevent this from happening. This is by design.
Whenever your machine crashes or there's an update to the system, it is completely wiped. A new system image is copied over, the machine boots again, and your application is copied onto it. Azure cloud services are Platform-as-a-Service (PaaS).
This leaves you with two possible options. The first would be to not store persistent data on the cloud service in the first place; that is not the proper approach for Azure Cloud Services. Instead, store your data in Azure Storage or an Azure SQL database (or wherever you like).
Another option would be to use a virtual machine instead of a cloud service. That machine is completely in your hands: it's your duty to update it, keep it secure, and do whatever it takes to keep it running. With this approach you also have to take care of a load balancer, multiple instances, etc. yourself, so scaling out becomes a lot harder. This is Infrastructure-as-a-Service (IaaS).
So it actually depends on what you want to do.
Cloud instances are stateless; this means that anything you've stored on the local storage of the virtual machines can and will be deleted in the event of a node failure, a system upgrade, or even a new deployment of a package that you upload.
A couple of things you can do:
If you need to add additional files or configurations to your project upon deployment, make use of OnStart() to perform it (see the sketch after this list). This ensures that on each deployment or failure restore you get back the same environment you always had.
To avoid losing your source code, I recommend you set up source control and integrate it with your cloud instance implementation. You can do this either with Git or with Team Foundation Service (check out tfspreview.com).
If you need to store files on the server, such as assets or client-updated media, consider using Azure Blob Storage. Blob storage is replicated locally within the datacenter, and is geo-replicated to other datacenters if you choose to do so.
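For the OnStart() point, a minimal sketch of a web role (the directory path is just an example of per-deployment setup):

```csharp
// Re-apply per-deployment setup in OnStart() so a reimaged instance
// comes back with the same environment it always had.
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Recreate anything the app expects on local disk; the disk is wiped
        // whenever the instance is reimaged or redeployed.
        System.IO.Directory.CreateDirectory(@"C:\Resources\temp\uploads");
        return base.OnStart();
    }
}
```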
Hope that helps

WP7 Isolated Storage when app is updated

I’m storing data that my app uses in isolated storage, which is working fine. However, I’m not sure what will happen to the data when I release an update to my app – is the isolated storage data cleared, or will I still be able to access it?
Your isolated storage is untouched by an app update.
It's your app's responsibility to manage any updates needed to keep app version and data in sync.
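For example, a common approach is to stamp the stored data with a schema version and migrate it on launch – a sketch, where MigrateData is a hypothetical app-specific upgrade routine:

```csharp
// In App.xaml.cs: on launch, compare the stored schema version with the
// current one and upgrade old data if the app was updated.
using System.IO.IsolatedStorage;
using Microsoft.Phone.Shell;

private void Application_Launching(object sender, LaunchingEventArgs e)
{
    var settings = IsolatedStorageSettings.ApplicationSettings;
    const int CurrentSchemaVersion = 2;

    int storedVersion;
    if (!settings.TryGetValue("SchemaVersion", out storedVersion))
        storedVersion = 1; // data written by a version that never stamped it

    if (storedVersion < CurrentSchemaVersion)
    {
        MigrateData(storedVersion, CurrentSchemaVersion); // hypothetical upgrade routine
        settings["SchemaVersion"] = CurrentSchemaVersion;
        settings.Save();
    }
}
```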

Is caching infrequently changed data to speed client application load times an appropriate use of Isolated Storage?

I'm working on an enterprise application re-write using Silverlight. The project is still in its early stages of development, but there is a heavy initial data load at application start as it pulls several sets of business objects from the server. Some of these data sets change infrequently once they're set up by the user, like the list of all customized data types in use by the user.
In these cases, the thought is to cache the data objects (probably in a serialized form) in Isolated Storage so there's no wait on an asynchronous call to the server to grab the data after the first application load.
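Concretely, the caching I have in mind would look something like this sketch (CustomTypeList is a stand-in for one of our business-object types):

```csharp
// Illustrative cache: serialize a business-object graph into isolated storage
// and read it back on the next launch instead of calling the server.
using System.IO;
using System.IO.IsolatedStorage;
using System.Runtime.Serialization;

public static class IsoCache
{
    public static void Save(string fileName, CustomTypeList data)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        using (var stream = store.CreateFile(fileName))
        {
            new DataContractSerializer(typeof(CustomTypeList)).WriteObject(stream, data);
        }
    }

    public static CustomTypeList Load(string fileName)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            if (!store.FileExists(fileName))
                return null; // cache miss: fall back to the async server call

            using (var stream = store.OpenFile(fileName, FileMode.Open))
                return (CustomTypeList)new DataContractSerializer(typeof(CustomTypeList)).ReadObject(stream);
        }
    }
}
```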
I thought that Isolated Storage was meant to store configuration data such as user preferences, or to share data across the in-browser and out-of-browser versions of an app... that it works a lot like a cookie store.
My main concern is that I'm unsure of how secure Isolated Storage is, and I don't trust caching application data in it. To be fair, the user would also have access to the Silverlight .xap file.
Is this an appropriate use for Isolated Storage, why or why not?
It's a fair use of isolated storage, if you're comfortable with the caveats.
The first caveat in my mind is that whatever you store in isolated storage on one machine will not be available when the user fires up your app on another machine - you lose the mobility advantage of web applications over desktop installed apps. If the user spends some time configuring their preferences, etc, they will be irritated that they have to do it all over again just because they switched to a different computer to view your web app. To solve this, you should replicate the user's customizations to cloud storage so that it can be copied down to whatever machine they choose to run your web app on. Treat the isolated storage as a performance optimization cache for data that officially lives in the cloud.
I believe Silverlight isolated storage is written to disk in the user's private data area in the file system – \Users\<username>\AppData or similar. This will keep it isolated away from other users on the same machine, but will not provide any protection from other programs running as the same user. I don't recall if Silverlight isolated storage is encrypted on disk. I highly doubt it.
A second caveat is that Silverlight isolated storage has a quota limit, and it's fairly small by default (1MB). The quota can be increased with a call to IncreaseQuotaTo(), which will prompt the end user to ok the request.
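A sketch of that quota bump – note IncreaseQuotaTo() must be called from a user-initiated event handler (a button click here), or the prompt won't appear:

```csharp
using System.IO.IsolatedStorage;
using System.Windows;

// Handler wired to a button; Silverlight only shows the quota prompt in
// response to user-initiated events. The 10 MB target is illustrative.
private void GrowCache_Click(object sender, RoutedEventArgs e)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        long desired = 10 * 1024 * 1024;
        if (store.Quota < desired && !store.IncreaseQuotaTo(desired))
        {
            // The user declined; keep working within the current quota.
        }
    }
}
```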
The third caveat is that if you're going to use local storage as a cache of data that lives in the cloud, you have to manage the data synchronization yourself. If the user makes changes locally, you need to push that up to the storage of authority in the cloud, and you'll have to decide when or how often to refresh the local cache from the cloud, and what to do when both have been changed at the same time (collision).
The browser cookie store is not a great metaphor for describing Silverlight isolated storage. Browser cookies for a given domain are attached to every http request that is made from the client to the server. The cookies are transmitted to the server constantly. The data in Silverlight isostorage is only accessible to the Silverlight code running on the client machine - it is never transmitted anywhere by Silverlight or the browser.
Treat Silverlight's isolated storage as a local cache of cloud data and you should be fine. Treat isostorage as a permanent storage and you'll piss off your customers because the data won't follow them everywhere they can use your web app.
Not a complete answer to your story but a data point to consider:
Beware the IO speeds of IsolatedStorage. While there has been considerable effort put into speeding it up, you may want to consider other options if you plan to do multiple small reads/writes as it can be extremely slow. (That, or use appropriate buffering techniques to ensure your reads/writes are larger and infrequent.)
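For instance, instead of many tiny writes, you can serialize everything into a MemoryStream and flush it to isolated storage in one go (WriteRecord and records are hypothetical stand-ins for your own data and serialization):

```csharp
using System.IO;
using System.IO.IsolatedStorage;

// Accumulate all records in memory, then perform a single large write
// instead of many slow small ones.
using (var buffer = new MemoryStream())
{
    foreach (var record in records)
        WriteRecord(buffer, record); // hypothetical per-record serializer

    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var file = store.CreateFile("records.bin"))
    {
        byte[] bytes = buffer.ToArray();
        file.Write(bytes, 0, bytes.Length);
    }
}
```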
