In Azure Functions V1 (.NET Framework 4), the code below worked fine:
[FunctionName("run")]
public static HttpResponseMessage run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage request,
[Blob("sample.txt", FileAccess.Read)] Stream readStream,
[Blob("sample.txt", FileAccess.Write)] Stream writeStream)
{
//read & write to sample.txt stream works ok
}
But in Azure Functions V4 (.NET 6), the same code throws an error saying that only one of the streams can be accessed:
[FunctionName("run")]
public static HttpResponseMessage run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage request,
[Blob("sample.txt", FileAccess.Read)] Stream readStream,
[Blob("sample.txt", FileAccess.Write)] Stream writeStream)
{
//can ONLY use read stream or write stream, can't use both
}
How can I read from and write to the same blob file in an Azure Function (.NET 6)?
Note: the issue only appears in Azure, not in local debugging.
The final solution that worked in Azure Functions V4 (.NET 6) is BlobClient.
Code to use BlobClient to read/write files in blob storage:
[FunctionName("Function1")]
public static async Task<IActionResult> Function1(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage incomingRequest,
[Blob("container-name/file-name.xml", FileAccess.ReadWrite)] BlobClient fileBlobClient)
{ ... }
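The original body is elided, but a possible sketch of how the bound BlobClient can both read and write the same blob looks like this (DownloadContentAsync/UploadAsync are the Azure.Storage.Blobs v12 APIs; the appended comment string is purely illustrative):

```csharp
[FunctionName("Function1")]
public static async Task<IActionResult> Function1(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage incomingRequest,
    [Blob("container-name/file-name.xml", FileAccess.ReadWrite)] BlobClient fileBlobClient)
{
    // Read: download the blob's current contents, if it exists
    string contents = "";
    if (await fileBlobClient.ExistsAsync())
    {
        var download = await fileBlobClient.DownloadContentAsync();
        contents = download.Value.Content.ToString();
    }

    // Write: upload the modified contents back to the same blob
    contents += "<!-- processed -->";
    await fileBlobClient.UploadAsync(BinaryData.FromString(contents), overwrite: true);

    return new OkObjectResult(contents);
}
```

Because the single BlobClient instance performs both operations explicitly, there is no conflict between a read stream and a write stream on the same blob.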
After reproducing from our end, this was working even though we added two streams. Make sure your environment is up to date.
[FunctionName("Function1")]
public static void Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequestMessage req,
[Blob("samples-workitems/samples.txt", FileAccess.Read)] Stream readStream,
[Blob("samples-workitems/samples.txt", FileAccess.Write)] Stream writeStream)
{
Console.WriteLine("Blob length is " + readStream.Length);
string sample = "Sample Text";
byte[] bytes = Encoding.ASCII.GetBytes(sample);
writeStream.Write(bytes, 0, bytes.Length);
}
Considering this to be my initial text file:
RESULT:
Text file after the POST trigger:
I need to save my HTTP message to the blob. I am using the code below. The file is getting created, but without content; I need the XML data to be stored in the blob.
[FunctionName("Function11")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
[Blob("Order/sales_{Datetimenow}.xml", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream outputStream)
{
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
return new OkObjectResult(requestBody);
}
Looking at the code you posted, and specifically this part
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
return new OkObjectResult(requestBody);
it shows that you are...
reading the body of the POST
putting the body in the requestBody variable
returning an OkObjectResult with requestBody as the content
What you are NOT doing is writing the body to the outputStream. You should create a StreamWriter and write the body to the stream; otherwise, nothing is ever written to the blob.
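As a minimal, self-contained sketch of that missing step (plain .NET only; a MemoryStream stands in for the blob output binding's stream, and the helper name is mine):

```csharp
using System;
using System.IO;
using System.Text;

public static class BlobWriteSketch
{
    // Writes the request body text into the bound output stream.
    // In the function, 'outputStream' would be the [Blob] output Stream.
    public static void WriteBodyToStream(string requestBody, Stream outputStream)
    {
        // leaveOpen: true so the runtime can flush/close the bound stream itself;
        // UTF8Encoding(false) avoids writing a byte-order mark
        using (var writer = new StreamWriter(outputStream, new UTF8Encoding(false), 1024, leaveOpen: true))
        {
            writer.Write(requestBody);
        }
    }

    public static void Main()
    {
        using var ms = new MemoryStream();
        WriteBodyToStream("<order id=\"1\" />", ms);
        Console.WriteLine(Encoding.UTF8.GetString(ms.ToArray()));
    }
}
```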
Thank you.
[FunctionName("HTTPTrigger")]
public static async Task<IActionResult> RunAsync(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
[Blob("sample/order{DateTime.now}.xml", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream oStream, ILogger log)
{
var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
oStream.Write(Encoding.UTF8.GetBytes(requestBody));
return new OkObjectResult(requestBody);
}
I'm trying to create an Azure function which takes an image file from an HTML form POST request and saves it to Blob Storage for further use by another function. Here is my code:
public static class Function2
{
[FunctionName("Function2")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
foreach (var file in req.Form.Files)
{
using (var ms = new MemoryStream())
{
var file2 = req.Form.Files[0];
await file2.CopyToAsync(ms);
ms.Seek(0, SeekOrigin.Begin);
var connectionString = "DefaultEndpointsProtocol=https;" +
"AccountName=mystorageaccount;" +
"AccountKey=8Hk5k6j65j5j665j67k==;" +
"EndpointSuffix=core.windows.net";
// initialize BlobClient
Azure.Storage.Blobs.BlobClient blobClient = new Azure.Storage.Blobs.BlobClient(
connectionString: connectionString,
blobContainerName: "image-storage",
blobName: "images");
// upload the file
blobClient.Upload(file2);
}
}
return new OkObjectResult("Image uploaded successfully");
}
}
However, this raises an exception:
Error CS1503 Argument 1: cannot convert from
'Microsoft.AspNetCore.Http.IFormFile' to 'System.IO.Stream'
Any advice would be very highly appreciated.
Edit: I have previously created Blob Container "image-storage" to my storage account using Azure Portal.
BlobClient's Upload method expects a Stream, not an IFormFile.
Passing the MemoryStream you already have in the ms variable resolves the issue:
blobClient.Upload(ms);
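Putting it together, the loop body might look like the sketch below (connection string and container name are the placeholders from the question; using file.FileName as the blob name is my assumption, since uploading every file to the fixed name "images" would overwrite the same blob each iteration):

```csharp
foreach (var file in req.Form.Files)
{
    using (var ms = new MemoryStream())
    {
        await file.CopyToAsync(ms);   // copy the form file into memory
        ms.Seek(0, SeekOrigin.Begin); // rewind before uploading

        var blobClient = new Azure.Storage.Blobs.BlobClient(
            connectionString: connectionString,
            blobContainerName: "image-storage",
            blobName: file.FileName);  // assumption: one blob per uploaded file

        await blobClient.UploadAsync(ms, overwrite: true); // Upload expects a Stream
    }
}
```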
I'm trying to get a binary file from Blob Storage through Azure API Management by using an Azure Function with an HTTP trigger. How should the Azure Function be implemented to have the smallest possible memory footprint?
This implementation works, but it seems to require a lot of memory, which leads to out-of-memory exceptions when too many concurrent requests are processed:
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "dbrevisions/{dbRevision}")] HttpRequestMessage request,
[Blob("typedatadev/typedata_{dbRevision}.db", FileAccess.Read)] Stream blobStream,
string dbRevision,
ILogger log)
{
var memoryBlobStream = new MemoryStream();
blobStream.CopyTo(memoryBlobStream);
var result = new FileStreamResult(memoryBlobStream, "application/octet-stream");
result.FileStream.Flush();
result.FileStream.Seek(0, SeekOrigin.Begin);
return result;
}
This one ends up with status 502 (with or without the "Flush" call):
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "dbrevisions/{dbRevision}")] HttpRequestMessage request,
[Blob("typedatadev/typedata_{dbRevision}.db", FileAccess.Read)] Stream blobStream,
string dbRevision, ILogger log)
{
var result = new FileStreamResult(blobStream, "application/octet-stream");
result.FileStream.Flush();
result.FileStream.Seek(0, SeekOrigin.Begin);
return result;
}
This also fails:
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StreamContent(blobStream)
};
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
return response;
I did it like below for a minimal memory footprint. Note that instead of binding to a Stream, I am binding to an ICloudBlob instance (luckily, C# functions support several flavors of blob input binding) and returning the open stream. I tested it with a memory profiler and it works fine, with no memory leak even for large blobs.
NOTE: You don't need to seek to stream position 0, flush, or dispose (disposal happens automatically when the response ends).
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.Storage.Blob;
namespace TestFunction1
{
public static class MyFunction
{
[FunctionName("MyFunction")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = "dbrevisions/{dbRevision}")] HttpRequest req,
[Blob("typedatadev/typedata_{dbRevision}.db", FileAccess.Read, Connection = "BlobConnection")] ICloudBlob blob,
string dbRevision,
ILogger log)
{
var blobStream = await blob.OpenReadAsync().ConfigureAwait(false);
return new FileStreamResult(blobStream, "application/octet-stream");
}
}
}
The question was asked in 2020; I came across this post looking for the same solution for my code, which is on .NET 6 with Visual Studio 2022, hence the latest libraries.
I am trying to read a .pfx certificate file which is stored as a blob in an Azure container. The container and blob access is private.
I connect to the Azure Storage account, find the file, fetch it from Azure as a memory stream, and finally return it as an X509Certificate2 certificate object.
This is my final code, tested OK.
using Azure.Storage.Blobs;
using Microsoft.Extensions.Logging;
using System;
using System.IO;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;
namespace ConfigEncryption
{
internal static class CloudStorage
{
//Connection string of the Azure Storage account - found under Storage account | Access keys - Show keys
const string connectionString = "DefaultEndpointsProtocol=https;AccountName=storageaccountname;AccountKey=thelongkeyvaluewithfewslashesinit;EndpointSuffix=core.windows.net";
const string containerName = "yourcontainername";
const string certName = "Your_Certificate.pfx";
const string password = "the_certificate_password";
public static async Task<X509Certificate2> GetCertAsync(ILogger log)
{
try
{
log.LogInformation("Connecting to cloud storage to fetch the certificate");
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
BlobClient blobClient = containerClient.GetBlobClient(certName);
byte[] certData;
if (await blobClient.ExistsAsync())
{
var memorystream = new MemoryStream();
await blobClient.DownloadToAsync(memorystream);
certData = memorystream.ToArray();
X509Certificate2 cert = new X509Certificate2(certData, password);
log.LogInformation("Found the certificate on cloud storage.");
return cert;
}
log.LogError("Error: Certificate not found in the container");
}
catch(Exception ex)
{
log.LogError($"Error: Getting certificate from Cloud storage. Exception Details :{ex.Message}");
}
return null;
}
}
}
I hope it helps someone, happy coding!
Regards,
Rakesh
I want to pass the filename of the blob to the HTTP trigger through a GET request, as below.
http://localhost:7071/api/CSVDataMigrationHttpTrigger/testdata.csv
Code for the Azure function:
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "CSVDataMigrationHttpTrigger/{name}")]
HttpRequest req, string name,
[Blob("csvdata-upload/{name}", FileAccess.Read, Connection = "AzureWebJobsStorage")]
Stream inputBlob, ILogger log)
{}
The inputBlob parameter is not resolved and returns null.
But if I give the filename as "testData.csv" directly in the Blob attribute, as below, then inputBlob is resolved properly.
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = "CSVDataMigrationHttpTrigger/{name}")]
HttpRequest req, string name,
[Blob("csvdata-upload/testData.csv", FileAccess.Read, Connection = "AzureWebJobsStorage")]
Stream inputBlob, ILogger log){}
I finally found out that the filename is case sensitive when passed to the blob binding. Hope it helps anyone who has the same issue.
Check whether your blob is actually uploaded to the storage container. The stream will be null only if the blob does not exist or cannot be found in the container.
Very, very new to Azure Functions and getting very frustrated.
All I want to do is handle a 'get' request in an HTTP-triggered function and return stream content from a CloudBlobContainer.
I really don't see why this is so hard. Just trying to host a SPA using Azure Functions.
Something like this
public static class UIHandler
{
[FunctionName("UIHandler")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", Route = null)]HttpRequest req,
TraceWriter log,
CloudBlobContainer container)
{
log.Info("C# HTTP trigger function processed a request.");
var stream = await container.GetBlockBlobReference({Infer file name from request here}).OpenReadAsync();
return new HttpResponseMessage()
{
StatusCode = HttpStatusCode.OK,
Content = new StreamContent(stream)
};
}
}
When I try to run this, I get the following error.
Run: Microsoft.Azure.WebJobs.Host: Error indexing method
'UIHandler.Run'. Microsoft.Azure.WebJobs.Host: Cannot bind parameter
'container' to type CloudBlobContainer. Make sure the parameter Type
is supported by the binding. If you're using binding extensions (e.g.
ServiceBus, Timers, etc.) make sure you've called the registration
method for the extension(s) in your startup code (e.g.
config.UseServiceBus(), config.UseTimers(), etc.).
I'm using Azure Functions 2. I can't see from the web how to set up the binding extensions for this. I've also looked into input and output bindings. I don't understand what makes a parameter input- or output-bound when you're using C#; that only seems to exist in the JSON.
Do I need a corresponding JSON file? If so, what is it called and where does it go?
Thanks in Advance
Have a look at the Blob Storage input binding documentation. The very first sample there shows how to read a blob stream; just replace the Queue trigger with an HTTP trigger, e.g.
[FunctionName("UIHandler")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", Route = "{name}")] HttpRequest req,
string name,
TraceWriter log,
[Blob("samples-workitems/{name}", FileAccess.Read)] Stream stream)
{
log.Info($"C# HTTP trigger function processed a request for {name}.");
return new HttpResponseMessage()
{
StatusCode = HttpStatusCode.OK,
Content = new StreamContent(stream)
};
}