I have large media files in Azure Storage block blobs, and I would like to encode them with the Azure Media Services v3 API.
I've found the equivalent in API v2 ("Copying existing blobs into a Media Services Asset"), but v2 is obsolete and will be retired soon.
Where can I find an example of creating an Asset from an existing blob? All examples I can find (including the official SDK v3 tutorials) use small local videos uploaded directly.
Also, it's not clear whether in v3 I still need to copy my blob into an asset as in v2, or whether I can use a blob with an asset directly as long as the Media Services account uses the same Storage Account (because, as stated in the v2 to v3 migration guide, AssetFiles don't exist anymore in v3).
I have a pretty extensive sample of copying content from a storage account, encoding it with AMS, and delivering it back to the same location in this Node.js/TypeScript sample:
https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/VideoEncoding/Encoding_Bulk_Remote_Storage_Account_SAS
Take a look there first and tell me if that is what you need. There are a number of helper functions I use with the Storage Blob SDK in the Common folder here:
https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/Common
Keep in mind that the workflow for remote assets can be achieved in a couple of ways in v3.
V3 Jobs support the JobInputHttp object, which can point to a read-only SAS URL that you pass in from your remote storage blob (if the storage account is not attached to the AMS account).
You can create an empty Asset, copy the blob into it from a remote storage account, and then submit the job as a JobInputAsset as usual.
You can create an Asset and pass in the container name. If this is an attached storage account, you can wrap an existing storage container as an Asset and then submit a job with the specified file in that Asset container as the input source. This is what you described in your last sentence above, but it may not be clear that you can do this with JobInputAsset: look at the Files property on JobInputAsset to pass in a specific list of files to the encoder (single or multiple if doing overlays). A rough C# sketch follows the JSON snippet below.
"input": {
"#odata.type": "#Microsoft.Media.JobInputAsset",
"files": [],
"inputDefinitions": [],
"assetName": "job1-InputAsset"
},
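For reference, here is a rough C# sketch of that third option using the Microsoft.Azure.Management.Media (v3) SDK. The resource group, account, transform, container, and file names are all placeholders, and client is assumed to be an already-authenticated AzureMediaServicesClient; treat this as a sketch rather than a drop-in implementation.

// Requires Microsoft.Azure.Management.Media and Microsoft.Azure.Management.Media.Models.
// Wrap an existing container (in an attached storage account) as an Asset.
Asset inputAsset = await client.Assets.CreateOrUpdateAsync(
    resourceGroup, accountName, "job1-InputAsset",
    new Asset(container: "existing-media-container"));

// Create an empty output Asset for the encoded result.
Asset outputAsset = await client.Assets.CreateOrUpdateAsync(
    resourceGroup, accountName, "job1-OutputAsset", new Asset());

// Submit the job, using Files to point at the specific blob inside the wrapped container.
Job job = await client.Jobs.CreateAsync(
    resourceGroup, accountName, "MyTransform", "job1",
    new Job
    {
        Input = new JobInputAsset(assetName: "job1-InputAsset", files: new[] { "movie.mp4" }),
        Outputs = new List<JobOutput> { new JobOutputAsset("job1-OutputAsset") }
    });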
Related
We store our logs in a Blob container and create an individual JSON file for each action,
for example 12345.json:
{"User\":\"User1\",\"Location\":\"LOC\","timestamp":"2023-01-10T10:34:43.5470187+00:00","Id":"12345"}
I want to return all the data where User = User1.
I can use BlobServiceClient to connect to the Blob storage account and retrieve all the JSON files. I assume I can read each individual JSON file and do some filtering, but are there better ways to do this?
My ultimate goal is to create an endpoint that accepts a list of keywords and a date range and then returns the corresponding results.
If you want to use only Blob Storage, then one option would be to first list all the blobs in the container and then search inside each blob using Query Blob Contents (I linked the REST API documentation; please check the equivalent method in the SDK).
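As a rough illustration of that first option with the .NET SDK (Azure.Storage.Blobs), something along these lines should work; the container name and the exact query-acceleration SQL are assumptions you will want to verify against the docs.

// Requires Azure.Storage.Blobs, Azure.Storage.Blobs.Models and Azure.Storage.Blobs.Specialized.
var container = new BlobContainerClient("connection string", "logs");

await foreach (BlobItem blobItem in container.GetBlobsAsync())
{
    // Query Blob Contents is exposed as QueryAsync on BlockBlobClient.
    BlockBlobClient blockBlob = container.GetBlockBlobClient(blobItem.Name);

    var options = new BlobQueryOptions
    {
        InputTextConfiguration = new BlobQueryJsonTextOptions()
    };

    // The query uses the query-acceleration SQL subset; field names come from the JSON document.
    Response<BlobDownloadInfo> response = await blockBlob.QueryAsync(
        "SELECT * FROM BlobStorage WHERE User = 'User1'", options);

    using var reader = new StreamReader(response.Value.Content);
    string matches = await reader.ReadToEndAsync();
    if (!string.IsNullOrWhiteSpace(matches))
    {
        Console.WriteLine($"{blobItem.Name}: {matches}");
    }
}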
The other option (a much better one, IMO) would be to use Azure Cognitive Search and create a blob indexer: have the contents of the blob container indexed by Azure Cognitive Search and then search over that indexed data.
You can learn more about using Azure Cognitive Search with Blob Storage here: https://learn.microsoft.com/en-us/azure/search/search-blob-storage-integration. For working with JSON data in Blob Storage, please see this: https://learn.microsoft.com/en-us/azure/search/search-howto-index-json-blobs.
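If it helps, here is a very rough sketch of wiring up a blob data source and indexer with the Azure.Search.Documents SDK. The search endpoint, key, index name, and container name are placeholders, and the target index ("logs-index") is assumed to already exist with fields matching your JSON (User, Location, timestamp, Id).

// Requires the Azure.Search.Documents package (Azure.Search.Documents.Indexes namespaces).
var indexerClient = new SearchIndexerClient(
    new Uri("https://my-search-service.search.windows.net"),
    new AzureKeyCredential("admin-key"));

// Point a data source at the blob container that holds the JSON log files.
var dataSource = new SearchIndexerDataSourceConnection(
    "logs-datasource",
    SearchIndexerDataSourceType.AzureBlob,
    "storage-connection-string",
    new SearchIndexerDataContainer("logs"));
await indexerClient.CreateOrUpdateDataSourceConnectionAsync(dataSource);

// Create an indexer that parses each blob as a JSON document into the existing index.
var indexer = new SearchIndexer("logs-indexer", dataSource.Name, "logs-index")
{
    Parameters = new IndexingParameters
    {
        IndexingParametersConfiguration = new IndexingParametersConfiguration
        {
            ParsingMode = BlobIndexerParsingMode.Json
        }
    }
};
await indexerClient.CreateOrUpdateIndexerAsync(indexer);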
I have the following code:
var client = new BlobServiceClient("connection string");
var container = client.GetBlobContainerClient("my-container");
await foreach (var item in container.GetBlobsAsync())
{
    // do something with the item if it is a file
}
I found that there is no way to determine whether the item is a virtual directory or a file. I also tried client.GetBlobsByHierarchyAsync, but the folder still shows up as a BlobItem, not a Prefix. What am I missing here?
** Edit for clarification.
I am using Azure Storage Account blob containers. There are past similar posts whose answers suggested that directories are not supported in Azure Blob Storage, but Azure moves at a pretty fast pace. Please see the attached screenshot; they sure look like directories to me.
** Edit for further clarification.
It seems that there are two "kinds" of storage account. In the Storage accounts UI, they are indicated as either "Storage" or "StorageV2". My StorageV2 account has Hierarchical namespace = Enabled under its "Data Lake Storage" properties. I will update the title accordingly.
ADLS Gen2 accounts (V2 storage accounts with hierarchical namespace enabled) are built on top of Blob storage. That means some of the features available in Blob storage are also available for ADLS Gen2 accounts.
For the most part, you can use the Azure.Storage.Blobs SDK to manage the data in your ADLS Gen2 accounts.
However, there are some additional features that are only available in ADLS Gen2 accounts (e.g. first-class support for folder hierarchy, POSIX-style permissions, etc.). To take advantage of these additional features and manage them, you would want to use the Azure.Storage.Files.DataLake SDK instead of Azure.Storage.Blobs.
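For example, with the Azure.Storage.Files.DataLake SDK (a minimal sketch, assuming a connection string and a container named "my-container"), the path listing tells you directly whether each entry is a directory:

// Requires the Azure.Storage.Files.DataLake package.
var serviceClient = new DataLakeServiceClient("connection string");
DataLakeFileSystemClient fileSystem = serviceClient.GetFileSystemClient("my-container");

// With hierarchical namespace enabled, PathItem.IsDirectory distinguishes folders from files.
await foreach (PathItem item in fileSystem.GetPathsAsync(recursive: false))
{
    if (item.IsDirectory == true)
    {
        Console.WriteLine($"Directory: {item.Name}");
    }
    else
    {
        Console.WriteLine($"File: {item.Name}");
    }
}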
What I am trying to do is take a photo, upload it using Dropzone (which is working fine for how I have implemented it), and load it into an NTFS file system. I store the "uploadpath" in my SQL Server database so I can pull the image faster later. The problem I am running into is that I have no idea how to load my images into an Azure file system. Also, from what I gather, blob storage isn't quite what I need, since that is based on a type of table storage format which isn't using NTFS.
I have been trying to research this and I have no idea where to actually start. I have read a few articles on MSDN to try to understand it, but everything I keep finding pertains to blob storage.
foreach (string filename in Request.Files)
{
    HttpPostedFileBase file = Request.Files[filename];
    fname = file.FileName;
    if (file != null && file.ContentLength > 0)
    {
        var path = Path.Combine(Server.MapPath("~/uploadeimg"));
        string pathstring = Path.Combine(path.ToString());
        string filename1 = Guid.NewGuid() + Path.GetExtension(file.FileName);
        bool isexist = Directory.Exists(pathstring);
        if (!isexist)
        {
            Directory.CreateDirectory(pathstring);
        }
        uploadpath = string.Format("{0}\\{1}", pathstring, filename1);
        file.SaveAs(uploadpath);
    }
}
As for documentation, the following links are what I have read and looked through:
File Uploading to Azure Storage by ASP.NET Webpages
https://learn.microsoft.com/en-us/azure/storage/files/storage-dotnet-how-to-use-files
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
I appreciate any assistance you may be able to provide. I am looking to get more experience with this type of programming, and I have just decided to play around and see what I can do with it.
I should also note that I can save the files in the area where I host the project and retrieve them that way as well, but I am not certain that would be the proper way to go about handling this.
On Azure it is common to store your files in Blob storage. I would recommend storing your photos in Blob storage instead of inside the Azure Web App.
All objects saved in Azure Blob storage have their own URL, which you can store in your SQL database to retrieve them later.
Check https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks?toc=%2fazure%2fstorage%2ffiles%2ftoc.json#comparison-files-and-blobs to get a clear comparison between Azure Files and BLOBS.
For legacy applications that use the native file system APIs, you can mount your Azure file share in an Azure VM or on-premises as a network drive, and then access the file share just like the local file system. You can also use AzCopy and Azure Storage Explorer to manage your file storage.
For Azure Web Apps, you cannot mount an Azure file share. By default, Azure Web App content is stored on Azure Storage managed on the Azure side. You can use the home directory (d:\home) to store your files on the Web App; for details, see the File System Restrictions/Considerations section of the Azure Web App sandbox documentation.
In summary, we recommend you store your files in Azure Blob storage. You can use the Azure Storage client library to communicate with your Blob storage; details you can follow here.
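As a minimal sketch of that approach with the Azure.Storage.Blobs client library (the connection string and the "images" container are placeholders), the inner part of your loop could upload the posted file directly instead of calling file.SaveAs:

// Requires the Azure.Storage.Blobs package.
var containerClient = new BlobContainerClient("connection string", "images");
await containerClient.CreateIfNotExistsAsync();

string blobName = Guid.NewGuid() + Path.GetExtension(file.FileName);
BlobClient blobClient = containerClient.GetBlobClient(blobName);

// Upload the posted file's stream straight to Blob storage instead of the local disk.
await blobClient.UploadAsync(file.InputStream, overwrite: true);

// This is the URL you would store in SQL Server to retrieve the image later.
string uploadpath = blobClient.Uri.ToString();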
I am using log4net in my web application
We are deploying it via Cloud Services (NOT App Services).
My understanding is that I won't be able to access the log files on disk (and further, these files are not persistent anyway).
My reading says to use Blob storage, but I don't see any code out there on how to do this. There is a NuGet package,
https://www.nuget.org/packages/log4net.Appender.Azure
but the documentation says it creates a file for each log entry.
What I want is the RollingLogFile.
Do I basically have to create my own? As in, pull down the log4net source code and create my own appender that logs to a cloud storage account instead of to disk? It seems like a lot of work; I would have thought someone would have coded this feature already.
Thanks.
This project provides some samples that store log entries in Azure Blob storage using AzureBlobAppender or AzureAppendBlobAppender for log4net.
According to the code, AzureBlobAppender creates a separate XML file for each log entry in Azure Blob storage, while AzureAppendBlobAppender stores all the logs generated in one day in a single log file by calling the CloudAppendBlob.AppendBlock method to append a new block of log data to the end of an existing blob.
If you do not want to create an XML file for each log entry, you could try using AzureAppendBlobAppender.
private static string Filename(string directoryName)
{
    return string.Format("{0}/{1}.entry.log.xml",
        directoryName,
        DateTime.Today.ToString("yyyy_MM_dd",
            DateTimeFormatInfo.InvariantInfo));
}
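If you end up rolling something yourself, the daily append-blob pattern the appender uses boils down to roughly this with the newer Azure.Storage.Blobs SDK; the container name and entry format below are placeholders, not the appender's actual configuration.

// Requires Azure.Storage.Blobs and Azure.Storage.Blobs.Specialized.
var container = new BlobContainerClient("connection string", "logs");
await container.CreateIfNotExistsAsync();

// One append blob per day, mirroring the Filename method above.
string blobName = string.Format("{0}.entry.log.xml",
    DateTime.Today.ToString("yyyy_MM_dd", DateTimeFormatInfo.InvariantInfo));
AppendBlobClient appendBlob = container.GetAppendBlobClient(blobName);
await appendBlob.CreateIfNotExistsAsync();

// Each log entry is appended as a new block at the end of the same daily blob.
byte[] entry = Encoding.UTF8.GetBytes("<LogEntry>...</LogEntry>\n");
using (var stream = new MemoryStream(entry))
{
    await appendBlob.AppendBlockAsync(stream);
}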
I am trying to find a simple way to list log files which I am storing in an Azure Blob container, so developers and admins can easily get to dev log information. I am following the information in this API doc https://msdn.microsoft.com/en-us/library/dd135734.aspx but when I go to
https://-my-storage-url-.blob.core.windows.net/dev?comp=list&include={snapshots,metadata,uncommittedblobs,copy}&maxresults=1000
I see one file listed, which is a block blob, but the log files I have generated, which are of type append blob, are not showing. How can I construct this API call to include append blobs?