I am trying to deserialize a blob stream to a JSON object using an Azure Blob trigger. The trigger fires whenever I upload a video to blob storage. However, it is throwing this error:
Newtonsoft.Json: Unexpected character encountered while parsing value: . Path ''.
This is the code that I am using to deserialize:
public static void Run(Stream myBlob, string name, TraceWriter log)
{
    myBlob.Position = 0; // reset the stream's position to 0
    var serializer = new JsonSerializer();
    using (var sr = new StreamReader(myBlob))
    {
        using (var jsonTextReader = new JsonTextReader(sr))
        {
            BlobData blobData = serializer.Deserialize<BlobData>(jsonTextReader);
        }
    }
}

public class BlobData
{
    public string path { get; set; }
}
Any help would be appreciated. Thanks.
As I mentioned earlier, the blob will contain a video and the trigger will fire after the upload. As of now, I am using some sample videos.
As Gaurav Mantri commented, you cannot deserialize a video to a JSON object. Per my understanding, you want to retrieve the blob URI after a video blob is uploaded and store that URL in some other data store. In that case, you could bind the CloudBlockBlob type to your myBlob parameter and retrieve the blob URL as follows:
run.csx
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Blob;
public static void Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
    // Option 1: the blob has public read access permission, so the plain URI is enough
    var blobData = new BlobData() { path = myBlob.Uri.ToString() };

    // Option 2: the blob is private, so generate a SAS token for this blob with the limited permission(s)
    var blobSasToken = myBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(2),
        Permissions = SharedAccessBlobPermissions.Read
    });
    blobData = new BlobData()
    {
        path = $"{myBlob.Uri.ToString()}{blobSasToken}"
    };

    //TODO:
}
Moreover, you could follow Create and use a SAS with Blob storage, Azure Functions Blob storage bindings, and Get started with Azure Blob storage using .NET for more detailed code samples.
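As a rough sketch of what the //TODO placeholder in the run.csx above might do, here is a hypothetical, minimal run.csx that simply serializes the BlobData to JSON and logs it; swap the log call for whatever output binding or data store you actually use:
    #r "Microsoft.WindowsAzure.Storage"

    using Microsoft.WindowsAzure.Storage.Blob;
    using Newtonsoft.Json;

    public static void Run(CloudBlockBlob myBlob, string name, TraceWriter log)
    {
        // Build the DTO from the blob's URI (a public-read container is assumed in this sketch).
        var blobData = new BlobData() { path = myBlob.Uri.ToString() };

        // Serialize the DTO so it can be handed off to another service, queue, or database.
        string json = JsonConvert.SerializeObject(blobData);
        log.Info($"Blob '{name}' uploaded, payload: {json}");
    }

    public class BlobData
    {
        public string path { get; set; }
    }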
I'm a complete noob at C# and know very little about Azure APIs; I'm a CS student doing a project for work. I built some middleware with YouTube tutorials that authenticates with a storage account using a connection string, and it enumerates, uploads, downloads, and deletes blobs within a container. The issue I'm having lies with ONLY the downloading functionality, and ONLY when the storage account access is set to private. The function works fine with anonymous access. I suspect the issue is with appending the URL, and I'm not sure how to fix it. The blobs are mainly CSV data, if that matters. Any help or direction to resources would be greatly appreciated 🙂 Here is the relevant code:
URL function
public async Task<string> GetBlob(string name, string containerName)
{
var containerClient = _blobClient.GetBlobContainerClient(containerName);
var blobClient = containerClient.GetBlobClient(name);
return blobClient.Uri.AbsoluteUri;
}
The config file
"AllowedHosts": "*",
"BlobConnection" : "<mystringconnection>******==;EndpointSuffix=core.windows.net"
Action request
[HttpGet]
public async Task<IActionResult> ViewFile(string name)
{
var res = await _blobService.GetBlob(name, "<mystorageacc>");
return Redirect(res);
}
The reason you are not able to download blobs from a private container is that you are simply returning the blob's URL from your method without any authorization information. Requests to access blobs in a private container must be authorized.
What you would need to do is create a Shared Access Signature (SAS) with at least Read permission and then return that SAS URL. The method you would want to use is GenerateSasUri. Your code would be something like:
// requires: using Azure.Storage.Sas;
public async Task<string> GetBlob(string name, string containerName)
{
    var containerClient = _blobClient.GetBlobContainerClient(containerName);
    var blobClient = containerClient.GetBlobClient(name);
    // GenerateSasUri returns the complete blob URI with the SAS token appended
    return blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTime.UtcNow.AddMinutes(5)).AbsoluteUri;
}
This will give you a link which is valid for 5 minutes from the time of creation and has the permission to read (download) the blob.
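For completeness, the ViewFile action from your question can stay exactly as it is; a minimal usage sketch, reusing your own controller code, since GetBlob now returns a SAS URL the redirect target is readable even though the container is private:
    [HttpGet]
    public async Task<IActionResult> ViewFile(string name)
    {
        // GetBlob now returns a time-limited SAS URL, so the browser can read the private blob.
        var res = await _blobService.GetBlob(name, "<mystorageacc>");
        return Redirect(res);
    }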
If you want to download from the blob service:
public async Task<byte[]> ReadFileAsync(string path)
{
using var ms = new MemoryStream();
var blob = _client.GetBlobClient(path);
await blob.DownloadToAsync(ms);
return ms.ToArray();
}
If you want to return the file's byte array from a controller, you can check this:
https://stackoverflow.com/a/3605510/3024129
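For example, a hedged sketch of a controller action that serves those bytes as a download; the content type, the file name, and the assumption that ReadFileAsync above lives on the injected _blobService are all placeholders:
    [HttpGet]
    public async Task<IActionResult> DownloadFile(string name)
    {
        // Read the blob into memory, then hand the bytes back as a file download.
        byte[] bytes = await _blobService.ReadFileAsync(name);
        return File(bytes, "text/csv", name); // "text/csv" assumed for the CSV blobs mentioned above
    }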
If you want to set a blob file's public access level:
https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-configure.
Or you can connect with Azure Storage Explorer and choose the easy way.
This worked for me by returning a byte array:
byte[] base64ImageRepresentation = new byte[] { };
BlobClient blobClient = new BlobClient(blobConnectionString,
    blobContainerUserDocs, fileName); // BlobClient(connectionString, containerName, blobName)
if (await blobClient.ExistsAsync())
{
using var ms = new MemoryStream();
await blobClient.DownloadToAsync(ms);
return ms.ToArray();
}
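If, as the variable name suggests, you then need a base64 string of the downloaded image, the conversion is a one-liner (sketch only; DownloadBlobAsync is a hypothetical wrapper around the snippet above):
    byte[] fileBytes = await DownloadBlobAsync(fileName); // hypothetical wrapper around the snippet above
    string base64ImageRepresentation = Convert.ToBase64String(fileBytes);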
So far I'm listing the names and creation dates of the blobs from a container in Azure Blob Storage.
Now I want to add a list of the URLs of the same blobs. I did a lot of research, but I can't really find anything that is of use.
Is it possible to achieve this with the same method I used for the other blob properties or is there another way?
My code:
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Azure.Storage.Blobs;
using System;
using System.Collections.Generic;
namespace getBlobData
{
// Define data transfer objects (DTO)
public class ContainerInfo
{
public string Name
{
get; set;
}
public DateTimeOffset? CreatedOn
{
get; set;
}
}
public static class GetBlobData
{
[FunctionName("getBlobData")]
public static async Task<List<ContainerInfo>> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
ILogger log)
{
// Connect to container in storage account
// Get Blobs inside of it
string connection_string = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
BlobServiceClient blobServiceClient = new BlobServiceClient(connection_string);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("container");
var response = containerClient.GetBlobsAsync();
// Get name and creation date of Blobs
// Return DTOs
var res = new List<ContainerInfo>();
await foreach (var item in response)
{
res.Add(new ContainerInfo { Name = item.Name, CreatedOn = item.Properties.CreatedOn });
}
return res;
}
}
}
If you're using the latest Azure Storage package, Azure.Storage.Blobs 12.8.0, there are two ways to fetch the blob URL.
1. You can build the blob URL yourself. First get the blob container URL, then concatenate it with the blob name; the code looks like this:
//other code
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("container");
//get the container url here
string container_url = containerClient.Uri.ToString();
var response = containerClient.GetBlobsAsync();
// Get name and creation date of Blobs
// Return DTOs
var res = new List<ContainerInfo>();
await foreach (var item in response)
{
//here you can concatenate the container url with blob name
string blob_url = container_url + "/" + item.Name;
res.Add(new ContainerInfo { Name = item.Name, CreatedOn = item.Properties.CreatedOn });
}
2. Another option: after you get the blob name, use it to get a BlobClient, then get the blob URL from the BlobClient. Code like below:
//other code
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("container");
var response = containerClient.GetBlobsAsync();
//define a BlobClient here.
BlobClient blobClient = null;
// Get name and creation date of Blobs
// Return DTOs
var res = new List<ContainerInfo>();
await foreach (var item in response)
{
//here you can get a BlobClient by using blob name
blobClient = containerClient.GetBlobClient(item.Name);
//then you can get the blob url by using BlobClient
string blob_url = blobClient.Uri.ToString();
res.Add(new ContainerInfo { Name = item.Name, CreatedOn = item.Properties.CreatedOn });
}
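If the goal is to actually return the URL alongside the name and creation date, one option is to extend the ContainerInfo DTO; this is only a sketch, and the Url property below is an assumption rather than part of the original code:
    public class ContainerInfo
    {
        public string Name { get; set; }
        public DateTimeOffset? CreatedOn { get; set; }
        public string Url { get; set; } // added property (assumption)
    }

    // ...inside Run, after getting containerClient:
    var res = new List<ContainerInfo>();
    await foreach (var item in containerClient.GetBlobsAsync())
    {
        var blobClient = containerClient.GetBlobClient(item.Name);
        res.Add(new ContainerInfo
        {
            Name = item.Name,
            CreatedOn = item.Properties.CreatedOn,
            Url = blobClient.Uri.ToString()
        });
    }
    return res;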
I assume you can use the conventions of blob URL construction to generate the relevant URL:
https://myaccount.blob.core.windows.net/mycontainer/myblob
https://learn.microsoft.com/en-us/rest/api/storageservices/copy-blob-from-url
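A minimal sketch of building the URL by that convention; this only holds for the default core.windows.net endpoint and blob names that need no extra escaping, and the names below are placeholders:
    string accountName = "myaccount";
    string containerName = "mycontainer";
    string blobName = "myblob";

    // Matches the pattern https://myaccount.blob.core.windows.net/mycontainer/myblob
    string blobUrl = $"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}";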
We have a parquet-format file (500 MB) which is located in an Azure blob. How can we read the file directly from the blob and save it in memory in C#, say e.g. in a DataTable?
I am able to read a parquet file which is physically located in a folder using the below code.
public void ReadParquetFile()
{
using (Stream fileStream = System.IO.File.OpenRead("D:/../userdata1.parquet"))
{
using (var parquetReader = new ParquetReader(fileStream))
{
DataField[] dataFields = parquetReader.Schema.GetDataFields();
for (int i = 0; i < parquetReader.RowGroupCount; i++)
{
using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(i))
{
DataColumn[] columns = dataFields.Select(groupReader.ReadColumn).ToArray();
DataColumn firstColumn = columns[0];
Array data = firstColumn.Data;
//int[] ids = (int[])data;
}
}
}
}
}
(I am able to read a CSV file directly from the blob using a source stream.) Please suggest the fastest method to read the parquet file directly from the blob.
Per my experience, the way to read the parquet file directly from the blob is to first generate the blob URL with a SAS token, then get a stream from HttpClient for that URL, and finally read the HTTP response stream via ParquetReader.
First, please refer to the sample code below from the section Create a service SAS for a blob of the official document Create a service SAS for a container or blob with .NET, using the Azure Blob Storage SDK for .NET Core.
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
string sasBlobToken;
// Get a reference to a blob within the container.
// Note that the blob may not exist yet, but a SAS can still be created for it.
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
if (policyName == null)
{
// Create a new access policy and define its constraints.
// Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad hoc SAS, and
// to construct a shared access policy that is saved to the container's shared access policies.
SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
{
// When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
// Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
};
// Generate the shared access signature on the blob, setting the constraints directly on the signature.
sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
Console.WriteLine();
}
else
{
// Generate the shared access signature on the blob. In this case, all of the constraints for the
// shared access signature are specified on the container's stored access policy.
sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
Console.WriteLine();
}
// Return the URI string for the container, including the SAS token.
return blob.Uri + sasBlobToken;
}
Then get the HTTP response stream from HttpClient using the URL with the SAS token:
var blobUrlWithSAS = GetBlobSasUri(container, blobName);
var client = new HttpClient();
var stream = await client.GetStreamAsync(blobUrlWithSAS);
Finally, read it via ParquetReader; the code comes from the Reading Data section of the GitHub repo aloneguid/parquet-dotnet.
var options = new ParquetOptions { TreatByteArrayAsString = true };
var reader = new ParquetReader(stream, options);
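If you then want the data in a DataTable, as mentioned in the question, here is a rough sketch (not production code) that flattens all row groups; it assumes the same Parquet.Net API used above and leaves the column types as object to avoid guessing the schema:
    using System;
    using System.Data;
    using System.Linq;
    using Parquet;
    using Parquet.Data;

    public static DataTable ToDataTable(ParquetReader parquetReader)
    {
        DataField[] dataFields = parquetReader.Schema.GetDataFields();

        // One DataTable column per parquet data field; types are left as object on purpose.
        var table = new DataTable();
        foreach (DataField field in dataFields)
        {
            table.Columns.Add(field.Name, typeof(object));
        }

        for (int g = 0; g < parquetReader.RowGroupCount; g++)
        {
            using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(g))
            {
                // Read every column of this row group into memory.
                Parquet.Data.DataColumn[] columns = dataFields.Select(groupReader.ReadColumn).ToArray();

                int rowCount = columns[0].Data.Length;
                for (int r = 0; r < rowCount; r++)
                {
                    object[] rowValues = columns.Select(c => c.Data.GetValue(r)).ToArray();
                    table.Rows.Add(rowValues);
                }
            }
        }

        return table;
    }
Calling ToDataTable(reader) with the reader created above would then give you the in-memory DataTable.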
I am currently trying to download a file from Azure Blob Storage using the DownloadToStream method, in order to get the contents of a blob as a text string. However, I am not getting anything back but an empty string.
Here is the code I use to connect to the Azure blob container and retrieve the blob file:
public static string DownLoadFroalaImageAsString(string blobStorageName, string companyID)
{
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference(companyID.ToLower());
//retrieving the actual filename of the blob
string removeString = "BLOB/";
string trimmedString = blobStorageName.Remove(blobStorageName.IndexOf(removeString), removeString.Length);
// Retrieve reference to a blob named "trimmedString"
CloudBlockBlob blockBlob2 = container.GetBlockBlobReference(trimmedString);
string text;
using (var memoryStream = new MemoryStream())
{
blockBlob2.DownloadToStream(memoryStream);
text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
}
return text;
}
I was following along with this documentation, however I cannot seem to get it to work. Any help would be greatly appreciated.
However I am not getting anything back but an empty string.
I tested your supplied code on my side and it works correctly. I assume that the test blob content is empty in your case. We could troubleshoot in the following ways:
1. Please try checking the Length of memoryStream. If the length equals 0, we know that the blob content is empty.
using (var memoryStream = new MemoryStream())
{
blockBlob2.DownloadToStream(memoryStream);
var length = memoryStream.Length;
text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
}
2. We could upload a blob with content to the container; we can do that easily with the Azure portal or Microsoft Azure Storage Explorer. Then please try the test again with the uploaded blob.
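Alternatively, a quick sketch of uploading a test blob with known content via the same SDK, so the download code has something non-empty to read (the blob name is a placeholder, and container refers to the container object from your code above):
    // Upload a small text blob into the same container used by DownLoadFroalaImageAsString.
    CloudBlockBlob testBlob = container.GetBlockBlobReference("test.txt");
    testBlob.UploadText("hello from a test blob");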
If you want to get the text from the blob, you can use DownloadTextAsync()
var text = await blockBlob2.DownloadTextAsync();
If you want to return the file stream back in an API response, you can use FileStreamResult, which is an IActionResult.
var stream = await blockBlob2.OpenReadAsync();
return File(stream, blockBlob2.Properties.ContentType, "name");
I want to upload a file to Azure Blob Storage asynchronously. I have tried the way suggested in the official SDK:
This is how I get the container:
public static class BlobHelper
{
public static CloudBlobContainer GetBlobContainer()
{
// Pull these from config
var blobStorageConnectionString = ConfigurationManager.AppSettings["BlobStorageConnectionString"];
var blobStorageContainerName = ConfigurationManager.AppSettings["BlobStorageContainerName"];
// Create blob client and return reference to the container
var blobStorageAccount = CloudStorageAccount.Parse(blobStorageConnectionString);
var blobClient = blobStorageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(blobStorageContainerName);
container.CreateIfNotExists();
container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
return container;
}
}
And this is how I try to upload the file:
public class FilesService
{
public async Task<string> UploadFiles(HttpContent httpContent)
{
var documentName = Guid.NewGuid().ToString();
CloudBlobContainer container = BlobHelper.GetBlobContainer();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(documentName);
using (var fileStream = System.IO.File.OpenRead(@"path\myfile"))
{
await blockBlob.UploadFromStreamAsync(fileStream);
}
return blockBlob.Uri.ToString();
}
}
The problem is that I do not know how to get the path to my file (it is uploaded by the user).
When I try this:
var rootpath = HttpContext.Current.Server.MapPath("~/App_Data");
var streamProvider = new MultipartFileStreamProvider(rootpath);
await httpContent.ReadAsMultipartAsync(streamProvider);
foreach (var file in streamProvider.FileData)
{
var localName = file.LocalFileName;
using (var fileStream = System.IO.File.OpenRead(file.LocalFileName))
{
await blockBlob.UploadFromStreamAsync(fileStream);
}
}
And when I try a POST request, the request just crashes and does not return anything (not even an exception).
Solution:
The issue was resolved in the following way: I used a service method so that I could upload a collection of files.
In the static BlobHelper class I keep the information needed for the container and instantiate it there. Using a collection makes it possible to upload multiple files as part of the same stream.
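For reference, a hedged sketch of the kind of service method described above, built only from the pieces already shown in the question; the names and structure are illustrative, not the exact code that was used:
    public class FilesService
    {
        public async Task<List<string>> UploadFiles(HttpContent httpContent)
        {
            // Buffer the multipart request to App_Data, as in the question.
            var rootPath = HttpContext.Current.Server.MapPath("~/App_Data");
            var streamProvider = new MultipartFileStreamProvider(rootPath);
            await httpContent.ReadAsMultipartAsync(streamProvider);

            CloudBlobContainer container = BlobHelper.GetBlobContainer();
            var uploadedUris = new List<string>();

            foreach (var file in streamProvider.FileData)
            {
                // One block blob per uploaded file.
                CloudBlockBlob blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString());

                using (var fileStream = System.IO.File.OpenRead(file.LocalFileName))
                {
                    await blockBlob.UploadFromStreamAsync(fileStream);
                }

                uploadedUris.Add(blockBlob.Uri.ToString());
            }

            return uploadedUris;
        }
    }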
I think you are trying to get the path to the file that is being uploaded to Blob Storage using standard ASP.NET methods and the local context. Files uploaded to a blob will not be accessible that way.
It seems like you are uploading your blob properly. Now, if your file uploaded successfully, your method should return blockBlob.Uri.ToString(), which is the link to your file; you may store it in a database or anywhere else.