Copy Azure Storage Tables in C#

Thanks to AzCopy, it is quite easy to transfer data between different Azure storage accounts from the command line. But I have failed to find an efficient way to copy Azure Table Storage in C#. I noticed that there is a Microsoft Azure Storage Data Movement Library that claims to power AzCopy, but according to the library reference there seems to be no direct way to copy tables. Any suggestions on how to implement this efficiently?
P.S. I have millions of entities to transfer now and then, and I would prefer to integrate this into a C# project without using cmd.exe or PowerShell.

I noticed that there is a Microsoft Azure Storage Data Movement Library that claims to power AzCopy, but according to the library reference there seems to be no direct way to copy tables.
As you mentioned, there is no method for copying tables in the Microsoft Azure Storage Data Movement Library.
I would prefer to integrate this into a C# project without using cmd.exe or PowerShell.
For how to work with Azure Table storage from C#, refer to Get started with Azure Table storage using .NET.
I have millions of entities to transfer now
Since a huge number of entities needs to be transferred, based on my experience you could use Azure Data Factory to do that.
Related resources:
ETL using azure table storage
Copy data to or from Azure Table using Azure Data Factory
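If you would rather keep everything in C# without Data Factory, below is a minimal sketch (my own, not part of the Data Movement Library) of a straightforward entity copy using the Azure.Data.Tables SDK. It batches writes per partition key, since table transactions are limited to a single partition key and at most 100 operations; the table name and connection strings are placeholders.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Data.Tables;

public static class TableCopier
{
    public static async Task CopyTableAsync(
        string sourceConnection, string destConnection, string tableName)
    {
        var source = new TableClient(sourceConnection, tableName);
        var dest = new TableClient(destConnection, tableName);
        await dest.CreateIfNotExistsAsync();

        var batch = new List<TableTransactionAction>();
        string currentPartition = null;

        // Table storage returns entities ordered by PartitionKey, then RowKey,
        // so entities with the same partition key arrive together.
        await foreach (TableEntity entity in source.QueryAsync<TableEntity>())
        {
            // Flush when the batch is full or the partition key changes.
            if (batch.Count == 100 ||
                (currentPartition != null && currentPartition != entity.PartitionKey))
            {
                await dest.SubmitTransactionAsync(batch);
                batch.Clear();
            }

            currentPartition = entity.PartitionKey;
            batch.Add(new TableTransactionAction(
                TableTransactionActionType.UpsertReplace, entity));
        }

        if (batch.Count > 0)
        {
            await dest.SubmitTransactionAsync(batch);
        }
    }
}
```

For millions of entities you would typically run several such copies in parallel (for example, one per partition-key range) to get reasonable throughput, which is essentially what AzCopy and Data Factory do for you.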

Related

Copy .csv file from Azure Blob Storage to Sharepoint site

I have a CSV file stored in Blob storage. The goal is to move this file into a SharePoint site and set some metadata. What would be the best way to do this? The client does not want us to use Power Automate or Logic Apps.
I tried using Azure Data Factory, but there seems to be an issue with writing data to SharePoint. I used the copy activity, but the 'sink' to SharePoint failed. Does Data Factory support writing to SharePoint?
The client does not want us to use Power Automate or Logic Apps.
Why not? This is the simplest way to achieve this, and it is also more maintainable than, for instance, C# code.
Does Data Factory support writing to SharePoint?
Yes, it does. However, using Data Factory just to copy a file to SharePoint is overkill.
If Logic Apps are not an option, have a look at an Azure Function that triggers automatically when the file is created in Azure Storage, and see, for instance, Upload File To SharePoint Office 365 Programmatically Using C# CSOM – PNP for a C# way of uploading a file to SharePoint.
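As a rough illustration of that combination, here is a minimal sketch of a blob-triggered Azure Function that pushes the new CSV into a SharePoint document library via CSOM (Microsoft.SharePoint.Client) and sets a metadata field on the list item. The container, site URL, and library path are hypothetical, and authentication is intentionally omitted because it depends on your tenant setup (app-only certificate, PnP auth, etc.).

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.SharePoint.Client;

public static class CsvToSharePoint
{
    [FunctionName("CsvToSharePoint")]
    public static void Run(
        [BlobTrigger("exports/{name}.csv")] Stream csvStream,  // hypothetical container
        string name,
        ILogger log)
    {
        using var ctx = new ClientContext("https://contoso.sharepoint.com/sites/Finance");
        // ctx.Credentials = ...; auth is environment-specific and omitted here.

        var folder = ctx.Web.GetFolderByServerRelativeUrl("/sites/Finance/Shared Documents");
        var file = folder.Files.Add(new FileCreationInformation
        {
            ContentStream = csvStream,
            Url = $"{name}.csv",
            Overwrite = true
        });

        // Set metadata on the list item behind the uploaded file.
        var item = file.ListItemAllFields;
        item["Title"] = name;
        item.Update();

        ctx.ExecuteQuery();
        log.LogInformation($"Uploaded {name}.csv to SharePoint.");
    }
}
```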

Cosmos DB Attachment Limits and Alternate Attachment Locations

We're moving the data storage for our core product to Cosmos DB. For documents, it works very well but I'm having trouble finding the information I need for attachments.
I can successfully do everything I want with attachments from my C# code using the Microsoft.Azure.DocumentDB NuGet package v1.19.1.
According to information I can find, attachments are limited to 2GB total for all attachments in an account. This is hugely limiting. Info found here:
https://learn.microsoft.com/en-us/azure/cosmos-db/sql-api-resources#attachments-and-media
It states:
Azure Cosmos DB allows you to store binary blobs/media either with Azure Cosmos DB (maximum of 2 GB per account) or to your own remote media store.
There seems to be some implication that you can create attachments that point to resources stored elsewhere, perhaps on a CDN, but I can't find any documentation on how to actually do this from C#.
Does anyone know if Cosmos DB can, in fact, attach to BLOB payloads stored outside of itself? If so, can the .NET NuGet package do it or is it only available for pure REST calls?
Many thanks in advance.
There's nothing inherently built-in to manage externally-stored attachments. Rather, it's up to you to store them and then reference them.
The most common pattern is to store a URL to the specific attachment within the document (e.g. a URL to a blob in Azure Storage). This results in effectively two operations, sketched below:
A query to retrieve the document from Cosmos DB
A read from storage, based on the URL found in the returned Cosmos DB document.
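A minimal sketch of that two-step read, using the newer Microsoft.Azure.Cosmos and Azure.Storage.Blobs SDKs for illustration (the Invoice document type and its attachmentUrl property are hypothetical):

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.Cosmos;

public class Invoice
{
    public string id { get; set; }
    public string customerId { get; set; }      // assumed partition key
    public string attachmentUrl { get; set; }   // URL of the blob holding the attachment
}

public static class AttachmentReader
{
    public static async Task<byte[]> ReadInvoiceAttachmentAsync(
        Container container, string invoiceId, string customerId)
    {
        // 1. Read the document from Cosmos DB.
        ItemResponse<Invoice> response = await container.ReadItemAsync<Invoice>(
            invoiceId, new PartitionKey(customerId));
        Invoice invoice = response.Resource;

        // 2. Read the referenced blob using the URL stored in the document.
        //    Assumes the URL is public or already carries a SAS token.
        var blobClient = new BlobClient(new Uri(invoice.attachmentUrl));
        var download = await blobClient.DownloadContentAsync();
        return download.Value.Content.ToArray();
    }
}
```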
Note: all responsibility is on you to manage referenced content: updating it, deleting it, etc. And if you're using Blob storage, you'll need to deal with things such as private vs. public access (generating SAS for private URLs where necessary when returning URLs to your clients, vs. streaming the content yourself).
One more thing: CDN isn't a storage mechanism on its own. You cannot store something directly to CDN; that's more of a layer on top of something like Azure Storage (for public-accessible content).
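To make the earlier note about SAS concrete, here is a minimal sketch of a helper that turns a stored blob URL into a short-lived, read-only SAS URL before handing it to a client. It assumes shared-key authentication via the storage connection string; names are placeholders.

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class AttachmentUrls
{
    public static Uri GetReadUrl(string connectionString, string blobUrl)
    {
        var serviceClient = new BlobServiceClient(connectionString);
        var parsed = new BlobUriBuilder(new Uri(blobUrl));
        var blobClient = serviceClient
            .GetBlobContainerClient(parsed.BlobContainerName)
            .GetBlobClient(parsed.BlobName);

        // GenerateSasUri works when the client is authenticated with a shared key
        // (e.g. created from a connection string containing the account key).
        return blobClient.GenerateSasUri(
            BlobSasPermissions.Read,
            DateTimeOffset.UtcNow.AddMinutes(15));
    }
}
```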

ASP.NET MVC and Azure storage

We are currently working on moving our ASP.NET MVC app from a shared hosting provider to Azure. Our users can upload files such as images and documents to our server, and we store these files under app-url/content/data, which works pretty well.
Question:
Is it safe to keep doing the same thing and uploading files under app-url/content/data? I've read about Azure Blob storage, but we would like to minimize the amount of work required to move to Azure (this is definitely something we could do in the coming months).
Azure provides a number of storage options such as Azure SQL, DocumentDB, Azure Blob storage and more; you can use any of them. If your application is just storing images, Azure Blob storage is the best option.
Is it safe to keep doing the same thing and uploading files under app-url/content/data?
Definitely; security at that level is not a concern for Azure customers, it is Microsoft's concern. You can learn about Azure security here.
we would like to minimize the amount of work required to move to Azure.
This depends on your application's back-end storage and resource management. If you are setting up a new Azure VM to run your application, it might take longer. If you use Azure Web Apps (recommended), it will minimize your migration workload, since you are likely already familiar with them.
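If you do move the uploads to Blob storage as suggested above, the change is fairly small. Below is a minimal sketch using the Azure.Storage.Blobs SDK, with a hypothetical container name; you would store the returned URI with the user's record instead of an app-url/content/data path.

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class FileStore
{
    private readonly BlobContainerClient _container;

    public FileStore(string connectionString)
    {
        // "user-uploads" is a placeholder container name.
        _container = new BlobContainerClient(connectionString, "user-uploads");
        _container.CreateIfNotExists();
    }

    // Saves the uploaded content and returns the blob URI to persist alongside the user's data.
    public async Task<string> SaveAsync(string fileName, Stream content)
    {
        var blob = _container.GetBlobClient(fileName);
        await blob.UploadAsync(content, overwrite: true);
        return blob.Uri.ToString();
    }
}
```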

What is the best practice to consume Azure Table storage in a Windows Store app?

I'm using the PRISM framework to develop my app. Is there any Patterns & Practices guidance available on consuming Azure Table storage from a Windows Store app?
What is the best practice to consume Azure Table storage in a Windows Store app?
App calling WCF REST service which then talks to Azure table through Azure SDK
App calling Azure table storage REST service
App calling Azure mobile service which then talks to Azure table through data script
App consuming Azure table storage through Azure SDK
Any other option?
I don't think there's any guidance available on best practices for consuming Azure Table Storage from a Windows Store app.
Given your four options above, I would not recommend using #2 and #4 as-is, for one reason: in order to use either of these options you would need to include your storage credentials (account name/account key) in the application itself, which I think is a big security risk.
There's another way in which you can use #2 and #4, and that's by using Shared Access Signature (SAS) functionality. Essentially you create SAS tokens using some kind of server-side code (WCF/Mobile Service/Web API etc.) and provide the SAS token to your client application. Then you can use the #2 or #4 approach.
The advantage of this approach, to me, is that your server-side component is really lightweight, as all it is doing is creating SAS tokens, and your Windows 8 application talks directly to the storage service without the need for an intermediary. Given that Windows Azure Table Storage now supports JSON, the data transferred between your app and the storage service will be minimal (compared to the AtomPub XML format, which was quite bulky).
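For reference, the server-side piece really can be tiny. Below is a minimal sketch of a SAS issuer written with the current Azure.Data.Tables SDK (the era of this answer used the WindowsAzure.Storage client, but the idea is identical); the client then uses the returned SAS URL to talk to the table directly.

```csharp
using System;
using Azure.Data.Tables;
using Azure.Data.Tables.Sas;

public static class TableSasIssuer
{
    // Returns a short-lived, read-only SAS URL for one table.
    public static Uri GetReadOnlySasForTable(string connectionString, string tableName)
    {
        // Requires the client to be constructed with shared-key credentials
        // (e.g. from a connection string containing the account key).
        var table = new TableClient(connectionString, tableName);

        return table.GenerateSasUri(
            TableSasPermissions.Read,
            DateTimeOffset.UtcNow.AddHours(1));
    }
}
```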

Dynamic Database / Schema in Entity framework on Windows Azure / SQL Azure

We are currently in the process of developing a SaaS application, codenamed FinAcuity, which will be hosted on the Windows Azure platform with SQL Azure as the primary database.
Here are some Technical Specifications for our product:
Development Environment - ASP.NET 4.0 with MVC 3 (Razor), Entity Framework
Database - SQL Azure
Here is our Business Case:
Our product is a SaaS product, and as it will contain clients' financial data, we are going to provide a separate database to each client to achieve a higher level of multi-tenancy, data isolation and security.
A client can create multiple companies under their account, and these companies will be separated by schemas within that client's database.
Note: Table structure will be same for each Schema.
Here are some scenarios that will give you a deeper view of our application's processes.
Scenario 1:
To provision a new database upon client registration, we are going to run a stored procedure that creates the database with the basic structure.
Our Doubt: Is this the correct way of doing it on SQL Azure, or is there some other way?
Scenario 2:
To access multiple schemas under a client database, we dynamically generate an SSDL file for each individual schema and use that file for the connection.
Our Doubt: Is there any other way of doing this, such as using the same SSDL file instance for multiple connections and passing the metadata per connection?
Scenario 3:
As our application supports ad-hoc querying and dynamic table creation from Excel files, we are going to provide a wizard that runs a stored procedure in the back end and creates the table dynamically from the Excel file, based on the headers selected in the wizard, under a particular schema of the client database.
Our Doubt: Is there a better way of doing this?
Scenario 4:
Now that the new table has been added to the schema, we have to update the EDMX file to get data from that newly created table. To do this we are going to run a stored procedure that fetches data from the newly created table.
Our Doubt: Is there any way of updating the EDMX file at runtime and getting the data?
We need advice on the best possible solution for each scenario listed above.
Thank you in advance.
Best Regards - Sahil
I think this is a little too much for a single question.
And I personally think you're looking at it from the wrong perspective. Why did you choose Entity Framework and SQL Azure? Do you really think these are the best technologies to address your problems?
I suggest you take a step back and investigate what other technologies could be used, because what you're asking for here looks like a schema-less solution, and SQL Azure / SQL Server wasn't built for that, IMHO.
You can start by looking at a NoSQL (schema-less, key value store) solution, possibly in Windows Azure. There's a whitepaper that will get you started on that: NoSQL and the Windows Azure platform -- Investigation of an Unlikely Combination
Windows Azure Table Storage is a key-value store that could solve some of your issues:
Customer isolation / multiple schemas: WAZ Table Storage supports partitions; you could partition your data per customer instead of putting all the data together.
Provisioning: No need to provision anything. Get a storage account and you can get started. Then you can simply have some code that writes data in a specific partition for that customer.
Cost (not mentioned in the question): Table Storage is much cheaper than SQL Azure
...
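To make the partition-per-customer idea concrete, here is a minimal sketch using the current Azure.Data.Tables SDK; the LedgerEntry entity and its fields are purely hypothetical.

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.Data.Tables;

public class LedgerEntry : ITableEntity
{
    public string PartitionKey { get; set; }   // customer/tenant id
    public string RowKey { get; set; }         // entry id
    public string Account { get; set; }
    public double Amount { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}

public static class LedgerStore
{
    public static Task AddEntryAsync(TableClient table, string customerId, string entryId,
        string account, double amount)
    {
        return table.AddEntityAsync(new LedgerEntry
        {
            PartitionKey = customerId,
            RowKey = entryId,
            Account = account,
            Amount = amount
        });
    }

    // All of one customer's data lives in a single partition, so a tenant-scoped
    // query is just a filter on PartitionKey.
    public static AsyncPageable<LedgerEntry> GetEntries(TableClient table, string customerId)
        => table.QueryAsync<LedgerEntry>(e => e.PartitionKey == customerId);
}
```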
