Upload a text file with Azure Mobile Services - C#

I want to create an application where the user, after authentication, can upload a file (txt/xml, etc.) using Azure Mobile Services and can afterwards download only the files he uploaded himself.
I've watched a lot of tutorials (including this one: link ), but in those they simply insert a row into a database table. I want basically the same thing, just with files. How can I do that?
I'm really new to this, so I'm just guessing, but should I upload the files to Blob Storage and store a link to each file in the database? I'm searching for the best practice.

Yes, you are correct!
You would be limited in size if you tried to store the text file as a field in the database.
This walkthrough shows how to do what you want, just with images:
http://azure.microsoft.com/en-us/documentation/articles/mobile-services-windows-store-dotnet-upload-data-blob-storage/
You would want to change the image stream to a text stream here:
// Get the new image as a stream.
using (var fileStream = await media.OpenStreamForReadAsync())
{
...
}
And use the Stream classes to open your text file as a stream instead:
http://msdn.microsoft.com/en-us/library/windows/apps/system.io.stream(v=vs.105).aspx
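For example, a minimal sketch of the adapted upload could look like the following. This is not the tutorial's exact code: the SAS-bearing blob URL (blobUriWithSas) and the file-type filters are assumptions on my part.
// Rough sketch: blobUriWithSas is the blob URL plus SAS query string returned by
// the Mobile Services insert script, as in the image walkthrough.
using System;
using System.IO;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Pickers;
using Microsoft.WindowsAzure.Storage.Blob;

async Task UploadTextFileAsync(string blobUriWithSas)
{
    // Let the user pick a text/XML file instead of an image.
    var picker = new FileOpenPicker();
    picker.FileTypeFilter.Add(".txt");
    picker.FileTypeFilter.Add(".xml");
    StorageFile file = await picker.PickSingleFileAsync();
    if (file == null)
    {
        return;
    }

    // Reference the blob through the SAS-enabled URI and upload the picked file's stream.
    var blob = new CloudBlockBlob(new Uri(blobUriWithSas));
    using (var fileStream = await file.OpenStreamForReadAsync())
    {
        // Older WinRT storage clients may expect fileStream.AsInputStream() here instead.
        await blob.UploadFromStreamAsync(fileStream);
    }
}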

Related

Google Cloud Storage .Net Client Bulk Upload

I am uploading files to Cloud Storage using the .NET client.
At the moment I am uploading files one by one, like this:
StorageClient client = StorageClient.Create();
foreach (var file in files)
{
client.UploadObject(bucketName, uploadLocation, contentType, file);
}
But I couldn't find any way to bulk-upload files. Is there any way to upload files in bulk?
You are effectively bulk uploading; you're just uploading each file serially 😃
If you're looking for a method that you give files to and it handles the entire upload, I'm unsure that this exists.
You can run the uploads in parallel using threads or equivalent.
You'll want to ensure that you can resume failed uploads (uploading multiple files increases the likelihood of a failure); see Create Object Uploader.
You'll need to manage resuming after failures yourself.
It's possible that there are libraries that implement this abstraction.
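For example, a rough sketch of fanning the uploads out with tasks; bucketName, contentType and the file paths are assumed to come from your existing code, and UploadObjectAsync is the async counterpart of UploadObject:
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Google.Cloud.Storage.V1;

async Task UploadAllAsync(string bucketName, string contentType, IEnumerable<string> filePaths)
{
    var client = StorageClient.Create();
    var uploads = filePaths.Select(async path =>
    {
        // One stream per upload; the StorageClient instance itself can be shared.
        using (var stream = File.OpenRead(path))
        {
            await client.UploadObjectAsync(bucketName, Path.GetFileName(path), contentType, stream);
        }
    });
    // Note: this does not retry/resume failed uploads; add that per the points above.
    await Task.WhenAll(uploads);
}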
I understand that you are searching for a way to upload a large number of files to Google Cloud Storage at once. There is no direct method that handles the entire upload for you, but if you have many files you can perform a parallel, multi-threaded/multi-processing copy. The steps are:
1. Instantiate a StorageClient object.
2. Specify the parallelization options.
3. Get a list of file names from your upload folder.
4. Get a list of the files already in the cloud.
5. Use Parallel.ForEach.
6. Call the UploadObject method from the Google Cloud Storage client library.
You can also refer to this article for more details on the above methods. A rough sketch of these steps follows.
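Here is that sketch; the folder path, bucket name and content type are placeholders of my own:
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Google.Cloud.Storage.V1;

var client = StorageClient.Create();                                      // 1. instantiate the client
var options = new ParallelOptions { MaxDegreeOfParallelism = 8 };         // 2. parallelization options
string[] localFiles = Directory.GetFiles(@"C:\upload");                   // 3. local file names
var existing = client.ListObjects("my-bucket")                            // 4. names already in the bucket
                     .Select(o => o.Name)
                     .ToHashSet();

Parallel.ForEach(localFiles, options, path =>                             // 5. parallel foreach
{
    string objectName = Path.GetFileName(path);
    if (existing.Contains(objectName))
    {
        return; // skip files that are already uploaded
    }
    using (var stream = File.OpenRead(path))
    {
        client.UploadObject("my-bucket", objectName, "application/octet-stream", stream); // 6. upload
    }
});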

Windows service to GCP

I was looking for resources on how to create a simple background service in C# that checks a specific folder for FLAC files and sends them to a GCP bucket; once a file is uploaded successfully, it is erased or moved to another folder. Where can I find something to read about this kind of thing?
To move a file to another location in C# you can use the File.Move method. It moves an existing file to a new location, with the same or a different file name, and deletes the original file; it is also the method used to rename files. It takes two parameters: the source path and the destination path.
Example:
try
{
File.Move(sourceFile, destinationFile);
}
catch (IOException iox)
{
Console.WriteLine(iox.Message);
}
If you need more examples of the File.Move method, please follow this link.
Adding to that, you can use the Directory.GetFiles method to filter by file extension, as in the example below.
This is the original thread where the example was posted
Example:
//Assume user types .txt into textbox
string fileExtension = "*" + textbox1.Text;
string[] txtFiles = Directory.GetFiles("Source Path", fileExtension);
foreach (var item in txtFiles)
{
File.Move(item, Path.Combine("Destination Directory", Path.GetFileName(item)));
}
If you want to know more about the Directory.GetFiles method, follow this link.
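Putting the two together for the FLAC scenario, a minimal sketch could look like the following; the folder paths and bucket name are placeholders, and a real Windows service would run this on a timer or from a FileSystemWatcher callback:
using System.IO;
using Google.Cloud.Storage.V1;

var client = StorageClient.Create();
foreach (var flacPath in Directory.GetFiles(@"C:\music\incoming", "*.flac"))
{
    // Upload the file to the bucket.
    using (var stream = File.OpenRead(flacPath))
    {
        client.UploadObject("my-flac-bucket", Path.GetFileName(flacPath), "audio/flac", stream);
    }
    // Only after a successful upload, move the file out of the watched folder.
    File.Move(flacPath, Path.Combine(@"C:\music\uploaded", Path.GetFileName(flacPath)));
}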
And concerning GCP: using the Cloud Storage Transfer Service you can move or back up data to a Cloud Storage bucket, either from other cloud storage providers or from your on-premises storage. Storage Transfer Service provides options that make data transfers and synchronization easier. For example, you can:
Schedule one-time or recurring transfer operations.
Delete existing objects in the destination bucket if they do not have a corresponding object in the source.
Delete data source objects after transferring them.
Schedule periodic synchronization from a data source to a data sink, with advanced filters based on file creation dates, file names, and the times of day you prefer to import data.
If you want to know more about GCP Cloud Storage Transfer Service follow this link
If you want to know more about how to create storage buckets follow this link

Upload to ADLS from file stream

I am making a custom activity in ADF, which involves reading multiple files from Azure Storage Blobs, doing some work on them, and then finally writing a resulting file to the Azure Data Lake Store.
The last step is where I'm stuck, because as far as I can see, the .NET SDK only allows uploading from a local file.
Is there any way to (programmatically) upload a file to ADL Store when it is not a local file? It could be a blob or a stream. If not, any workarounds?
Yes, it's possible to upload from a Stream; the trick is to create the file first and then append your stream to it:
// 'credentials' and 'filepath' are assumed to be set up already; 'stream' is your source data.
string dataLakeAccount = "DLSAccountName";
var adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(credentials);
// Create the (empty) file, overwriting any existing one, then append the stream to it.
adlsFileSystemClient.FileSystem.Create(dataLakeAccount, filepath, overwrite: true);
adlsFileSystemClient.FileSystem.Append(dataLakeAccount, filepath, stream);
See also this article.

How do I pass a Stream from a Web API to Azure Blob Storage without temp files?

I am working on an application where file uploads happen often, and can be pretty large in size.
Those files are being uploaded to a Web API, which will then get the Stream from the request and pass it on to my storage service, which then uploads it to Azure Blob Storage.
I need to make sure that:
No temp files are written on the Web API instance
The request stream is not fully read into memory before passing it on to the storage service (to prevent OutOfMemoryExceptions).
I've looked at this article, which describes how to disable input stream buffering, but because many file uploads from many different users happen simultaneously, it's important that it actually does what it says on the tin.
This is what I have in my controller at the moment:
if (this.Request.Content.IsMimeMultipartContent())
{
var provider = new MultipartMemoryStreamProvider();
await this.Request.Content.ReadAsMultipartAsync(provider);
var fileContent = provider.Contents.SingleOrDefault();
if (fileContent == null)
{
throw new ArgumentException("No filename.");
}
var fileName = fileContent.Headers.ContentDisposition.FileName.Replace("\"", string.Empty);
// I need to make sure this stream is ready to be processed by
// the Azure client lib, but not buffered fully, to prevent OoM.
var stream = await fileContent.ReadAsStreamAsync();
}
I don't know how I can reliably test this.
EDIT: I forgot to mention that uploading directly to Blob Storage (circumventing my API) won't work, as I am doing some size checking (e.g. can this user upload 500 MB? Has this user used up his quota?).
Solved it, with the help of this Gist.
Here's how I am using it, along with a clever "hack" to get the actual file size without copying the file into memory first. Oh, and it's twice as fast (obviously).
// Create an instance of our provider.
// See https://gist.github.com/JamesRandall/11088079#file-blobstoragemultipartstreamprovider-cs for implementation.
var provider = new BlobStorageMultipartStreamProvider ();
// This is where the uploading is happening, by writing to the Azure stream
// as the file stream from the request is being read, leaving almost no memory footprint.
await this.Request.Content.ReadAsMultipartAsync(provider);
// We want to know the exact size of the file, but this info is not available to us before
// we've uploaded everything - which has just happened.
// We get the stream from the content (and that stream is the same instance we wrote to).
var stream = await provider.Contents.First().ReadAsStreamAsync();
// Problem: If you try to use stream.Length, you'll get an exception, because BlobWriteStream
// does not support it.
// But this is where we get fancy.
// Position == size, because the file has just been written to it, leaving the
// position at the end of the file.
var sizeInBytes = stream.Position;
Voilà, you've got your uploaded file's size without having to copy the file into your web instance's memory.
As for getting the file length before the file is uploaded, that's not as easy, and I had to resort to some rather non-pleasant methods in order to get just an approximation.
In the BlobStorageMultipartStreamProvider:
var approxSize = parent.Headers.ContentLength.Value - parent.Headers.ToString().Length;
This gives me a pretty close file size, off by a few hundred bytes (depends on the HTTP header I guess). This is good enough for me, as my quota enforcement can accept a few bytes being shaved off.
Just for showing off, here's the memory footprint, as reported by the insanely accurate and advanced Performance tab in Task Manager (screenshots omitted): before, using a MemoryStream and reading everything into memory before uploading, versus after, writing directly to Blob Storage.
I think a better approach is for you to go directly to Azure Blob Storage from your client. By leveraging the CORS support in Azure Storage you eliminate load on your Web API server resulting in better overall scale for your application.
Basically, you will create a Shared Access Signature (SAS) URL that your client can use to upload the file directly to Azure storage. For security reasons, it is recommended that you limit the time period for which the SAS is valid. Best practices guidance for generating the SAS URL is available here.
For your specific scenario check out this blog from the Azure Storage team where they discuss using CORS and SAS for this exact scenario. There is also a sample application so this should give you everything you need.
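For completeness, a rough server-side sketch of issuing such a short-lived SAS URL with the classic storage client; the connection string, container and blob names are placeholders:
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

static string GetUploadSasUrl(string connectionString, string containerName, string blobName)
{
    var account = CloudStorageAccount.Parse(connectionString);
    var container = account.CreateCloudBlobClient().GetContainerReference(containerName);
    var blob = container.GetBlockBlobReference(blobName);

    var policy = new SharedAccessBlobPolicy
    {
        // Write-only and short-lived, per the best-practices guidance above.
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
    };

    // GetSharedAccessSignature returns the query string ("?sv=..."), so append it to the blob URI.
    return blob.Uri + blob.GetSharedAccessSignature(policy);
}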

Is it possible to store .exe or .scr files in a SQL Server database using ASP.NET C#?

I am developing an ASP.NET 3.5 website and I need to store some screensaver (.scr) files on the host machine. I wonder if there is a way to store .scr or .exe files in a SQL Server database and retrieve them? Any help would be appreciated.
Store them as a BLOB (Binary Large Object), i.e. store the file's contents, which is basically a byte array. (Normally the file itself is stored on a server and the database only holds the file path and name.) You will have to add a column for the BLOB data to your table; in SQL Server a varbinary(8000) column allows up to 8000 bytes (use varbinary(max) for larger files, as noted below):
BLOBData varBinary(8000)
// Using a DataTable as a mock-up database.
System.Data.DataTable table = new System.Data.DataTable();
table.Columns.Add("file name", typeof(string));
table.Columns.Add("data", typeof(byte[]));
// Read the file's bytes from disk.
byte[] bytes = System.IO.File.ReadAllBytes("th.exe");
// Add the file to the db (how you add data to your real database will differ).
table.Rows.Add("th.exe", bytes);
// Retrieve the file name and data from the "database" again.
string fileName = (string)table.Rows[0].ItemArray[0];
byte[] data = (byte[])table.Rows[0].ItemArray[1];
// Create a file stream and write the data back to disk.
using (var save = new System.IO.FileStream(fileName, System.IO.FileMode.Create))
{
    save.Write(data, 0, data.Length);
}
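To show the database side as well, here is a rough sketch using plain ADO.NET; the Files table (FileName nvarchar(260), Data varbinary(max)) and the connection string are assumptions for illustration:
using System.Data.SqlClient;
using System.IO;

byte[] fileBytes = File.ReadAllBytes("screensaver.scr");
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Insert the file's bytes as a varbinary parameter.
    using (var insert = new SqlCommand("INSERT INTO Files (FileName, Data) VALUES (@name, @data)", connection))
    {
        insert.Parameters.AddWithValue("@name", "screensaver.scr");
        insert.Parameters.AddWithValue("@data", fileBytes);
        insert.ExecuteNonQuery();
    }
    // Read the bytes back and write them to disk.
    using (var select = new SqlCommand("SELECT Data FROM Files WHERE FileName = @name", connection))
    {
        select.Parameters.AddWithValue("@name", "screensaver.scr");
        byte[] data = (byte[])select.ExecuteScalar();
        File.WriteAllBytes("restored.scr", data);
    }
}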
Though you can store the file as a BLOB, performance can be a concern if the file size is large.
Alternatively, you can use FILESTREAM storage, which is implemented as a varbinary(max) column in which the data is stored as BLOBs in the file system. The sizes of those BLOBs are limited only by the volume size of the file system; the standard 2-GB limit on varbinary(max) does not apply to BLOBs stored in the file system.
For more details check MSDN:
http://msdn.microsoft.com/en-us/library/bb933993.aspx
cheers
