Azure Functions - Blob Stream dynamic input bindings - C#

I'm running a C# function on Azure which needs to take in files from a container. The only problem is that the paths to the input files will (potentially) be different each time, and the number of input files will vary from one to about four or five. As far as I'm aware, that rules out the default input blob bindings. My options seem to be either giving the container anonymous access and grabbing the files through their links, or figuring out how to get dynamic input bindings.
Does anyone know how to declare the path for the input blob stream at runtime (in the C# code)?
If it helps, I've managed to find this for dynamic output bindings:
using (var writer = await binder.BindAsync<TextWriter>(
    new BlobAttribute(containerPath + fileName)))
{
    writer.Write(OutputVariable);
}
Thanks in advance, Cuan

Try the code below:
string filename = string.Format("{0}/{1}_{2}.json",
    blobname,
    DateTime.UtcNow.ToString("ddMMyyyy_hh.mm.ss.fff"),
    Guid.NewGuid().ToString("n"));

using (var writer = await binder.BindAsync<TextWriter>(
    new BlobAttribute(filename, FileAccess.Write)))
{
    writer.Write(JsonConvert.SerializeObject(a_object));
}

For dynamic output bindings, you could leverage the following code snippet:
var attributes = new Attribute[]
{
    new BlobAttribute("{container-name}/{blob-name}"),
    new StorageAccountAttribute("brucchStorage") // connection string name for the storage connection
};

using (var writer = await binder.BindAsync<TextWriter>(attributes))
{
    writer.Write(userBlobText);
}
Note: The above code creates the target blob if it does not exist and overwrites the existing blob if it does. Moreover, if you do not specify the StorageAccountAttribute, the target blob is created in the storage account referenced by the app setting AzureWebJobsStorage.
Additionally, you could follow Azure Functions imperative bindings for more details.
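For context, binder is not something you construct yourself; the runtime passes it in as a function parameter. Below is a minimal sketch of the surrounding function, assuming the v1 C# script (.csx) model from the imperative bindings docs; the queue trigger, names, and blob path are illustrative, not part of the original answer:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static async Task Run(string myQueueItem, Binder binder, TraceWriter log)
{
    // Path and storage account are resolved at runtime, not in function.json
    var attributes = new Attribute[]
    {
        new BlobAttribute("my-container/" + myQueueItem),
        new StorageAccountAttribute("brucchStorage")
    };

    using (var writer = await binder.BindAsync<TextWriter>(attributes))
    {
        writer.Write("contents decided at runtime");
    }
}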
UPDATE:
For dynamic input binding, you could just change the binding type as follows:
var blobString = await binder.BindAsync<string>(attributes);
Or you could set the binding type to CloudBlockBlob and add the following assembly reference and namespace for Azure Storage blobs:
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Blob;
CloudBlockBlob blob = await binder.BindAsync<CloudBlockBlob>(attributes);
Moreover, for more details about the operations available on CloudBlockBlob, you could follow here.
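Applied back to the original question (one to four or five input files whose paths are only known at runtime), the same pattern can simply be repeated per path. A minimal sketch, assuming each blob is bound to a read-only Stream; the paths array is illustrative:
using System.IO;

// paths computed at runtime, e.g. from the trigger payload
var inputPaths = new[] { "mycontainer/fileA.csv", "mycontainer/fileB.csv" };

foreach (var path in inputPaths)
{
    using (var stream = await binder.BindAsync<Stream>(
        new BlobAttribute(path, FileAccess.Read)))
    using (var reader = new StreamReader(stream))
    {
        // read each input blob's contents at runtime
        string contents = await reader.ReadToEndAsync();
    }
}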

Related

Getting big data through SignalR - Blazor

I have a component library that uses JS code to generate an image as a base64 string, and the image needs to be transferred to C#. The image size is larger than MaximumReceiveMessageSize.
Can I get the value of the MaximumReceiveMessageSize property in C#? I need a way to correctly split the picture into chunks, or some other way to transfer it.
My component can be used in a Wasm or Server application. I can't change the value of the MaximumReceiveMessageSize property.
Thanks
Using a stream, as described in Stream from JavaScript to .NET, solved my problem.
From Microsoft docs:
In JavaScript:
function streamToDotNet() {
    return new Uint8Array(10000000);
}
In C# code:
var dataReference = await JS.InvokeAsync<IJSStreamReference>("streamToDotNet");
using var dataReferenceStream = await dataReference.OpenReadStreamAsync(maxAllowedSize: 10_000_000);
var outputPath = Path.Combine(Path.GetTempPath(), "file.txt");
using var outputFileStream = File.OpenWrite(outputPath);
await dataReferenceStream.CopyToAsync(outputFileStream);
In the preceding example: JS is an injected IJSRuntime instance. The dataReferenceStream is written to disk (file.txt) at the current user's temporary folder path (GetTempPath).
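For context, a minimal sketch of how this might be wired into a component; the streamToDotNet function and the 10 MB limit are carried over from the docs sample above, while the button handler and in-memory buffering are assumptions for illustration:
@using System.IO
@using Microsoft.JSInterop
@inject IJSRuntime JS

<button @onclick="ReceiveImage">Receive image</button>

@code {
    private async Task ReceiveImage()
    {
        // Ask JS for a stream reference instead of a serialized payload,
        // which avoids the SignalR MaximumReceiveMessageSize limit entirely.
        var dataReference = await JS.InvokeAsync<IJSStreamReference>("streamToDotNet");
        using var stream = await dataReference.OpenReadStreamAsync(maxAllowedSize: 10_000_000);

        // Buffer into memory; the bytes are the image generated on the JS side
        using var ms = new MemoryStream();
        await stream.CopyToAsync(ms);
        byte[] imageBytes = ms.ToArray();
    }
}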

Firebase Storage read or query existing data with .NET C#

Environment: VS project, .NET, C#
I've implemented uploading documents to my Firebase Storage Bucket via the example in the link below:
How to Upload File to Firebase Storage in .Net C# Windows Form?
I'm trying to find documentation on how to use the same library/functionality to read a file that I've manually uploaded to my Bucket.
In essence: how to 'peek' or 'read' a file that is already on Storage? I basically want to query data inside an existing csv file.
So far I've found documentation only here, which doesn't provide much in terms of a possible solution, at least as far as I can understand it...
Firebase Storage Introduction
There is seemingly more related information in the 'Firebase Store' section on the same page, but that isn't the same as Firebase Storage :/
Any ideas?
Looking at the docs, it seems you can open files by downloading them.
var client = StorageClient.Create();

// Create a bucket with a globally unique name
var bucketName = Guid.NewGuid().ToString();
var bucket = client.CreateBucket(projectId, bucketName);

// Upload some files
var content = Encoding.UTF8.GetBytes("hello, world");
var obj1 = client.UploadObject(bucketName, "file1.txt", "text/plain", new MemoryStream(content));
var obj2 = client.UploadObject(bucketName, "folder1/file2.txt", "text/plain", new MemoryStream(content));

// List objects
foreach (var obj in client.ListObjects(bucketName, ""))
{
    Console.WriteLine(obj.Name);
}

// Download file
using (var stream = File.OpenWrite("file1.txt"))
{
    client.DownloadObject(bucketName, "file1.txt", stream);
}
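Since the goal is to peek at a CSV that is already in the bucket, here is a minimal sketch that downloads into memory and reads it line by line; the bucket and object names are placeholders, not values from the question:
using System;
using System.IO;
using System.Text;
using Google.Cloud.Storage.V1;

var client = StorageClient.Create();

// Download the existing object into memory instead of a file on disk
using var stream = new MemoryStream();
client.DownloadObject("your-bucket-name", "data/existing.csv", stream);
stream.Position = 0; // rewind before reading

using var reader = new StreamReader(stream, Encoding.UTF8);
string line;
while ((line = reader.ReadLine()) != null)
{
    // naive split; use a CSV library if fields can contain quoted commas
    var fields = line.Split(',');
    Console.WriteLine(fields[0]);
}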

How to set ContentMD5 in DataLakeFileClient?

When uploading to an Azure Data Lake with Microsoft Azure Storage Explorer, a value for the file's ContentMD5 property is generated and stored automatically. The same happens in a function app that uses a Blob binding.
However, this value is not generated automatically when uploading from a C# DLL.
I want to use this value to compare files in the future.
My code for the upload is very simple.
DataLakeFileClient fileClient = await directoryClient.CreateFileAsync("testfile.txt");
await fileClient.UploadAsync(fileStream);
I also know I can generate an MD5 hash using the code below, but I'm not certain if this is the same way that Azure Storage Explorer does it:
using (var md5gen = MD5.Create())
{
    md5hash = md5gen.ComputeHash(fileStream);
}
But I have no idea how to set this value on the ContentMD5 property of the file.
I have found the solution.
The UploadAsync method has an overload that accepts a parameter of type DataLakeFileUploadOptions. This class contains an HttpHeaders object, which in turn has a ContentHash property that stores the hash as a property of the document.
var uploadOptions = new DataLakeFileUploadOptions();
uploadOptions.HttpHeaders = new PathHttpHeaders();
uploadOptions.HttpHeaders.ContentHash = md5hash;
await fileClient.UploadAsync(fileStream, uploadOptions);
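Putting the two pieces together, a minimal sketch, assuming fileStream is seekable; note the rewind, since ComputeHash leaves the stream positioned at its end:
using System.Security.Cryptography;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;

byte[] md5hash;
using (var md5gen = MD5.Create())
{
    md5hash = md5gen.ComputeHash(fileStream);
}
fileStream.Position = 0; // rewind: ComputeHash consumed the stream

var uploadOptions = new DataLakeFileUploadOptions
{
    HttpHeaders = new PathHttpHeaders { ContentHash = md5hash }
};
await fileClient.UploadAsync(fileStream, uploadOptions);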

Restore an Azure blob snapshot using the C# library

I am writing a CLI utility that does a lot of different things, but what I'm struggling with right now is this: I have a known blob, and I want to restore a snapshot that was taken of that blob.
await foreach (var snapshot in containerClient.GetBlobsAsync(
    BlobTraits.All,
    BlobStates.Snapshots,
    blobPath))
{
    _logger.LogInformation($"found blob {snapshot.Name} - {snapshot.Snapshot}");
    if (DecideIfRightSnapshot(snapshot)) {
        BlobClient snapshotBlob = containerClient.GetBlobClient(snapshot.Name);
        _logger.LogInformation($"found snapshot {snapshotBlob.Uri}");
        await sourceBlob.StartCopyFromUriAsync(snapshotBlob.Uri);
    }
    break;
}
First, the filter isn't working right because the last blob in the list is always the base blob. But I can work around that one.
The real issue I'm struggling with is the proper way to restore a blob from a snapshot using the libraries. I'm really concerned because the .Uri property always returns the base blob's URI, even if it's a snapshot. I was led to believe the URI would be something like this:
https://me.blob.core.windows.net/myapp/doc?snapshot=2020-12-16T17:07:44.1076450Z
but that's not the URI that's getting logged. Am I supposed to construct the full URI myself?
In all the searches they refer to this as "promoting" a snapshot, but I can't find a "promote" method in the API.
Am I doing this right?
If you're using the new version of the blob storage SDK, Azure.Storage.Blobs, then you should construct the full URI yourself. Sample code is below:
//other code
await foreach (var snapshot in containerClient.GetBlobsAsync(
    BlobTraits.All,
    BlobStates.Snapshots,
    blobPath))
{
    _logger.LogInformation($"found blob {snapshot.Name} - {snapshot.Snapshot}");
    if (DecideIfRightSnapshot(snapshot))
    {
        BlobClient snapshotBlob = containerClient.GetBlobClient(snapshot.Name);
        // construct the snapshot url
        var snapshot_uri = snapshotBlob.Uri.ToString() + "?snapshot=" + snapshot.Snapshot;
        _logger.LogInformation($"found snapshot {snapshot_uri}");
        // StartCopyFromUriAsync expects a Uri, not a string
        await sourceBlob.StartCopyFromUriAsync(new Uri(snapshot_uri));
        break; // stop once the desired snapshot has been restored
    }
}
As for "promoting", it means restoring the snapshot via the Azure portal. It's a UI operation, and under the hood it calls the Put Blob From URL API. Currently, there is no such method in the SDK.
But if you're using one of the old packages like WindowsAzure.Storage, it has many methods for working with snapshots; see this article. Note: using the old packages is not recommended.
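As an aside, newer releases of Azure.Storage.Blobs also expose a WithSnapshot method, which returns a client scoped to the snapshot so that its Uri carries the ?snapshot=... query string; if it's available in your version, it avoids the string concatenation. A sketch, reusing snapshot and sourceBlob from the code above:
BlobClient snapshotBlob = containerClient
    .GetBlobClient(snapshot.Name)
    .WithSnapshot(snapshot.Snapshot); // Uri now includes ?snapshot=...

// "promote" the snapshot by copying it over the base blob
await sourceBlob.StartCopyFromUriAsync(snapshotBlob.Uri);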

Is it possible to change a file's encoding in Azure Blob Storage using C#?

I use PolyBase to export files to a storage account, but the encoding is UTF-8 and I need to change it to SJIS. Is there any easy way to change it to SJIS using C#? Is it possible to do it using Blob Storage's REST API?
For the REST API, you can use Set Blob Properties, then set x-ms-blob-content-encoding in the request header.
For code, if you're using the Azure Blob Storage SDK, you can refer to this article. You should modify the code, since that sample gets the properties of a container. You can use the sample code below for setting a blob property:
CloudBlobContainer blobContainer = blobClient.GetContainerReference("xxx");
CloudBlockBlob myblob = blobContainer.GetBlockBlobReference("xxx");
myblob.Properties.ContentEncoding = "SJIS";
myblob.SetProperties(); // persist the updated properties to the service
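One caution: ContentEncoding is only metadata returned in the Content-Encoding response header; it does not re-encode the stored bytes. If the goal is to actually convert the file contents from UTF-8 to Shift-JIS, here is a minimal sketch using the same WindowsAzure.Storage client (on .NET Core, Shift-JIS requires the System.Text.Encoding.CodePages package; the blob name is a placeholder):
using System.Text;
using Microsoft.WindowsAzure.Storage.Blob;

// Required on .NET Core before Shift-JIS encodings resolve
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

CloudBlockBlob blob = blobContainer.GetBlockBlobReference("xxx");

// Download as UTF-8 text, then re-upload as Shift-JIS bytes
string text = await blob.DownloadTextAsync(Encoding.UTF8, null, null, null);
byte[] sjisBytes = Encoding.GetEncoding("shift_jis").GetBytes(text);
await blob.UploadFromByteArrayAsync(sjisBytes, 0, sjisBytes.Length);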
