Uploading a video file from Azure Blob Storage to Azure Media Services - C#

I have a Web API where I upload .mp4 files to my blob storage account on Azure, which is connected to my Media Services account. I want to copy the video from my storage account and upload it to my Media Services account. What I've tried so far is to get the file path that points to my storage account file and do it like this:
public void UploadToMediaServices(Uri storageAddress)
{
    var filePath = storageAddress.ToString();
    var context = new CloudMediaContext("Name", "Key");
    var uploadAsset = context.Assets.Create(Path.GetFileNameWithoutExtension(filePath), AssetCreationOptions.None);
    var assetFile = uploadAsset.AssetFiles.Create(Path.GetFileName(filePath));
    assetFile.Upload(filePath);
}
This is what the Azure website recommended in its code snippet. The only thing is that I assume they take the file from the local disk.
When I do this, it uploads a name to the Media Services account, but nothing else is available and I can't publish the video or see its size.
Does anyone know what to do in this situation?
Thanks.

I assume you want to copy media files from your storage account into the storage account you attached to Azure Media Services. We have an article online to help you with that: https://msdn.microsoft.com/en-us/library/azure/jj933290.aspx. Please let me know whether it works for you.
Thanks,
Mingfei Yan
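For reference, here is a minimal sketch of the flow that article describes: copy the existing blob into the new asset's storage container, then register it as an asset file so that the size shows up and the asset can be published. It assumes the WindowsAzure.Storage client and the Media Services .NET SDK; the connection strings, container name, and "video.mp4" file name below are placeholders, and waiting for the server-side copy to complete is omitted.
var context = new CloudMediaContext("Name", "Key");
// Destination: a new, empty asset plus a temporary write policy/locator so we can reach its container.
IAsset asset = context.Assets.Create("video.mp4", AssetCreationOptions.None);
IAccessPolicy writePolicy = context.AccessPolicies.Create("writePolicy", TimeSpan.FromHours(1), AccessPermissions.Write);
ILocator destinationLocator = context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
// Source: the blob that already sits in the storage account the web API uploads to.
CloudStorageAccount sourceAccount = CloudStorageAccount.Parse(sourceStorageConnectionString);
CloudBlockBlob sourceBlob = sourceAccount.CreateCloudBlobClient()
    .GetContainerReference("uploads")
    .GetBlockBlobReference("video.mp4");
// The asset's container name is the first path segment of the SAS locator.
CloudStorageAccount destinationAccount = CloudStorageAccount.Parse(mediaServicesStorageConnectionString);
CloudBlobContainer assetContainer = destinationAccount.CreateCloudBlobClient()
    .GetContainerReference(new Uri(destinationLocator.Path).Segments[1].TrimEnd('/'));
// Server-side copy into the asset's container (if source and destination are different accounts,
// pass a SAS URI for the source; poll destinationBlob.CopyState until the copy completes).
CloudBlockBlob destinationBlob = assetContainer.GetBlockBlobReference("video.mp4");
destinationBlob.StartCopy(sourceBlob);
// Register the copied blob as an asset file so Media Services knows its name and size.
sourceBlob.FetchAttributes();
IAssetFile assetFile = asset.AssetFiles.Create("video.mp4");
assetFile.ContentFileSize = sourceBlob.Properties.Length;
assetFile.IsPrimary = true;
assetFile.Update();
asset.Update();
// Clean up the temporary write locator and policy.
destinationLocator.Delete();
writePolicy.Delete();
Once ContentFileSize is set on the asset file, the asset should show its size and be publishable, which is the part that the original snippet was missing.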

Related

Deleting files in Azure Synapse Notebook

This should have been simple but turned out to require a bit of GoogleFu.
I have an Azure Synapse Spark notebook written in C# that:
Receives a list of Deflate-compressed IIS files.
Reads the files as binary into a DataFrame.
Decompresses these files one at a time and writes them into Parquet format.
Now after all of them have been successfully processed I need to delete the compressed files.
This is my proof of concept but it works perfectly.
Create a linked service pointing to the storage account that contains the files you want to delete (see Configure access to Azure Blob Storage).
See the code sample below:
#r "nuget:Azure.Storage.Files.DataLake,12.0.0-preview.9"
using Microsoft.Spark.Extensions.Azure.Synapse.Analytics.Utils;
using Microsoft.Spark.Extensions.Azure.Synapse.Analytics.Notebook.MSSparkUtils;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;
string blob_sas_token = Credentials.GetConnectionStringOrCreds('your linked service name here');
Uri uri = new Uri($"https://'your storage account name here'.blob.core.windows.net/'your container name here'{blob_sas_token}") ;
DataLakeServiceClient _serviceClient = new DataLakeServiceClient(uri);
DataLakeFileSystemClient fileClient = _serviceClient.GetFileSystemClient("'path to directory containing the file here'") ;
fileClient.DeleteFile("'file name here'") ;
The call to Credentials.GetConnectionStringOrCreds returns a signed SAS token that is ready for your code to attach to a storage resource URI.
You could of course use the DeleteFileAsync method if you so desire.
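For instance, with the same placeholder file name as above:
// Asynchronous variant of the delete call above.
await fileClient.DeleteFileAsync("<file name here>");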
Hope this saves someone else a few hours of GoogleFu.

How to extract a thumbnail from an MP4 video located in Azure Storage

I want to extract a thumbnail from an MP4 video hosted in Azure Storage. My current method in C# uses the NReco NuGet package, but that works on a local file. How do I extract the thumbnail from a file in Azure Storage?
string mp4inputpath = Server.MapPath("~/testfolder/myvideo.mp4");
string thumbOutputPath = Server.MapPath("~/testfolder/mythumb.jpg");
var ffMpeg = new NReco.VideoConverter.FFMpegConverter();
// Get the thumbnail at the frame 1 second into the video
ffMpeg.GetVideoThumbnail(mp4inputpath, thumbOutputPath, 1);
That works! But I need to use an Azure Storage file URL for mp4inputpath.
I can download the MP4 file from Azure Storage and save it temporarily into my Azure web app; I can do that programmatically.
Then extract the thumbnail, i.e.,
ffMpeg.GetVideoThumbnail(mp4inputpath, thumbOutputPath, 1);
Then delete the temporary MP4 within my app.
This works, but I don't know if it is advisable to download MP4 files into my Azure web app, or whether it will scale. This is the only solution I have so far.
string mp4Url = @"https://mysorageaccount.blob.core.windows.net/mp4/vacation/summer/dogbarking.mp4";
string thumbOutputPath = Server.MapPath("~/testfolder/mythumb.jpg");
var ffMpeg = new NReco.VideoConverter.FFMpegConverter();
// Get the thumb at the frame 1 second into the video
ffMpeg.GetVideoThumbnail(mp4Url, thumbOutputPath, 1);
This does not seem to work. There is no error, but the thumbOutputPath file is empty.
What you've done is pretty much what you have to do, since you cannot open an object in Azure Storage as you would a local file. So, grabbing the file into a local file or a stream is what you'd need to do.
As far as scaling: that will depend on the size (and number of instances) you're running in your Web App. Just be aware that you should have both your storage account and your Web App in the same region, to reduce latency and avoid egress charges for bandwidth.
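For completeness, here is a minimal sketch of that download-then-thumbnail approach, assuming the WindowsAzure.Storage client library; the container, blob name, and output path are taken from the question's URL and are placeholders.
// Locate the blob in storage (container and blob names are placeholders).
var account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlockBlob blob = account.CreateCloudBlobClient()
    .GetContainerReference("mp4")
    .GetBlockBlobReference("vacation/summer/dogbarking.mp4");
// Download to a temporary local file so FFMpeg can seek within it.
string tempMp4 = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".mp4");
string thumbOutputPath = Server.MapPath("~/testfolder/mythumb.jpg");
try
{
    blob.DownloadToFile(tempMp4, FileMode.Create);
    // Grab the frame at 1 second, exactly as in the local-file version.
    var ffMpeg = new NReco.VideoConverter.FFMpegConverter();
    ffMpeg.GetVideoThumbnail(tempMp4, thumbOutputPath, 1);
}
finally
{
    // Remove the temporary copy once the thumbnail has been written.
    if (File.Exists(tempMp4)) File.Delete(tempMp4);
}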

Azure Data Lake: How to get processed files

I've just started working with Data Lake and I'm currently trying to figure out the real workflow steps and how to automate the whole process.
Say I have some files as an input and I would like to process them and download output files in order to push into my data warehouse or/and SSAS.
I've found an absolutely lovely API, and it's all good, but I can't find a way to get all the file names in a directory so that I can download them.
Please correct my thoughts regarding the workflow. Is there another, more elegant way to automatically get all the processed data (outputs) into storage (like a conventional SQL Server, SSAS, a data warehouse, etc.)?
If you have a working solution based on Data Lake, please describe the workflow (from "raw" files to reports for end-users) with a few words.
Here is my example of a .NET Core application:
using Microsoft.Azure.DataLake.Store;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest.Azure.Authentication;
// Authenticate with the AAD application (service principal) credentials.
var creds = new ClientCredential(ApplicationId, Secret);
var clientCreds = ApplicationTokenProvider.LoginSilentAsync(Tenant, creds).GetAwaiter().GetResult();
// Create a Data Lake Store client and fetch a single directory entry.
var client = AdlsClient.CreateClient("myfirstdatalakeservice.azuredatalakestore.net", clientCreds);
var result = client.GetDirectoryEntry("/mynewfolder", UserGroupRepresentation.ObjectID);
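If the goal is to get all the file names in a directory (the part the question says is missing), a sketch using the same AdlsClient and the same "/mynewfolder" placeholder path could look like this:
// List every entry under the folder; EnumerateDirectory pages through the listing lazily.
foreach (DirectoryEntry entry in client.EnumerateDirectory("/mynewfolder"))
{
    if (entry.Type == DirectoryEntryType.FILE)
    {
        // entry.FullName is the full path, suitable as input to a later download step.
        Console.WriteLine(entry.FullName);
    }
}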
Say I have some files as an input and I would like to process them and download output files in order to push into my data warehouse or/and SSAS.
If you want to download the files from a folder in Azure Data Lake to a local path, you could use the following code:
client.BulkDownload("/mynewfolder", @"D:\Tom\xx"); // local path
But based on my understanding, you could also use Azure Data Factory to push your data from Data Lake Store to Azure Blob Storage or Azure File Storage.

Upload files to Azure File Storage from a web app using the REST API

I have a web app that is currently using webforms, not MVC, which is going to be hosted on the Azure platform.
The main function of this web app is to upload user files to Azure File Storage.
The files may be PDF, MP3, etc., not simple text or a data stream.
I am told to use the Azure REST API to upload files, but I am really not familiar with it and can't find a good sample, tutorial, or video online. The current documentation from Microsoft reads like ?????? to me.
Currently I just upload to a local folder, so the code (in C#) looks like:
FileUpload1.PostedFile.SaveAs(Server.MapPath("fileupload\\" + FileUpload1.FileName));
Where do I go from there? I think I am supposed to add a StorageConnectionString which looks like DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=yyy, which I already have.
And then I should write some code to 'post' the file in C#?
Azure provides a NuGet library that you can use to upload files and do other "file management" types of activities on Azure File Storage.
The library is called:
WindowsAzure.Storage
UPDATE: The newer library to use for Azure File Storage is Azure.Storage.Files.Shares (WindowsAzure.Storage has since been superseded).
Here are the basics of getting this going:
//Connect to Azure
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
// Create a reference to the file client.
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
// Create a reference to the Azure path (shareName, path, fileName and localfile are assumed to be defined elsewhere).
CloudFileShare share = fileClient.GetShareReference(shareName);
CloudFileDirectory cloudFileDirectory = share.GetRootDirectoryReference().GetDirectoryReference(path);
//Create a reference to the filename that you will be uploading
CloudFile cloudFile = cloudFileDirectory.GetFileReference(fileName);
//Open a stream from a local file.
Stream fileStream = File.OpenRead(localfile);
//Upload the file to Azure.
await cloudFile.UploadFromStreamAsync(fileStream);
fileStream.Dispose();
More links and info here (note scroll a fair way down for samples): https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
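If you go with the newer Azure.Storage.Files.Shares package mentioned in the update above, the equivalent upload is roughly as follows; this is only a sketch, with the same placeholder share and folder names, and fileName/localfile assumed to be defined elsewhere.
using Azure.Storage.Files.Shares;
// Point at the share and target directory (placeholders as above).
var share = new ShareClient(connectionString, "yourfilesharename");
ShareDirectoryClient directory = share.GetDirectoryClient("folderthatyouuploadto");
ShareFileClient file = directory.GetFileClient(fileName);
using (Stream stream = File.OpenRead(localfile))
{
    // Azure Files requires the file to be created at its final length before content is uploaded.
    await file.CreateAsync(stream.Length);
    await file.UploadAsync(stream);
}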
This piece of code is based on the answer I got from Gary Holland above. I hope other people benefit from it. I am not good at programming; hopefully the code looks OK.
if (FileUpload1.HasFile)
{
    //Connect to Azure
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
    // Create a reference to the file client.
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    // Get a reference to the file share we created previously.
    CloudFileShare share = fileClient.GetShareReference("yourfilesharename");
    if (share.Exists())
    {
        // Get the directory to upload to and a reference to the target file.
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();
        CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("folderthatyouuploadto");
        CloudFile file = sampleDir.GetFileReference(FileUpload1.FileName);
        // Upload the posted file's stream.
        Stream fileStream = FileUpload1.PostedFile.InputStream;
        file.UploadFromStream(fileStream);
        fileStream.Dispose();
    }
}

C# Windows Store BackgroundDownloader to removable storage

I am working on an app that will run on all Windows 8 devices (RT support a must) and I am working on adding some offline capabilities, but I can't figure out how to download to a removable storage device such as a USB drive or, in the case of a Surface RT, the micro SD card. Ideally I would like to be able to have the user specify the directory, but it may end up downloading hundreds of files so it has to be specified just once, not once per file. I also want to avoid requiring the user to manually configure libraries.
I have found plenty of articles about how to download to the various libraries, but those go to internal storage, which has very limited space on a Surface RT. How can I have the user specify a location for a large number of files to download and/or download to a removable storage device?
A really slick solution would be a way to programmatically create a library in a location of the user's choosing so the user can choose if they want it on the local system or on a removable device.
I appreciate any suggestions.
You should take advantage of FutureAccessList. It allows you to reuse files and folders that the user has previously granted you access to.
First the user will select the target folder using a FolderPicker:
var picker = new FolderPicker();
picker.FileTypeFilter.Add("*");
var folder = await picker.PickSingleFolderAsync();
You then add the folder to FutureAccessList and get back a string token which you can store for later use (e.g. to ApplicationData.LocalSettings):
var token = StorageApplicationPermissions.FutureAccessList.Add(folder);
When you want to download a file, first get the folder from FutureAccessList and create the target file:
var folder = await StorageApplicationPermissions.FutureAccessList
.GetFolderAsync(token);
var file = await folder.CreateFileAsync(filename);
With that data you can create a DownloadOperation:
var downloader = new BackgroundDownloader();
var download = downloader.CreateDownload(uri, file);
From here on proceed as if you were downloading to any other location (start the download, monitor progress...).
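For example, starting the transfer and logging progress could look roughly like this (the progress handler body is just illustrative):
// Report progress as the bytes arrive.
var progress = new Progress<DownloadOperation>(op =>
{
    System.Diagnostics.Debug.WriteLine(string.Format("{0}/{1} bytes",
        op.Progress.BytesReceived, op.Progress.TotalBytesToReceive));
});
// StartAsync completes when the download finishes (or throws if it fails).
await download.StartAsync().AsTask(progress);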
