Azure function app blob trigger cannot upload file - c#

I've created a blob trigger that reads a file from one blob container, unzips it, and writes it to another container using streams.
My code looks like this:
[FunctionName("Requests")]
[StorageAccount("samplecontainer")]
public static void Run([BlobTrigger("incoming/{name}")]Stream myBlob,
[Blob("processing/{name}.xml", FileAccess.Write)] TextWriter output,
string name, ILogger log)
{
log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
string returnValue;
using (Stream blob = myBlob)
{
using (GZipStream decompressionStream = new GZipStream(blob, CompressionMode.Decompress))
{
log.LogInformation($"Unzipping file");
name = name.Replace(".gz", "");
var content = String.Empty;
using (StreamReader reader = new StreamReader(decompressionStream))
{
content = reader.ReadToEnd();
reader.Close();
}
returnValue = content;
}
}
log.LogInformation($"C# Blob trigger function finished processing blob\n Name:{name} \n Now writing to xml");
output.WriteLine(returnValue);
}
This does in fact work while running locally using "AzureWebJobsStorage": "UseDevelopmentStorage=true" in my local.settings.json file.
However, once I've deployed the app and upload a file to the real container using Azure Storage Explorer, nothing happens, and the activity log shows that the operation failed: in Write.Tiny.txt.gz, Could not find file 'D:\home\site\wwwroot\Requests\tiny.txt.gz'.
I have HTTP triggers that do work, and I have tried turning off the WEBSITE_RUN_FROM_PACKAGE setting, to no effect. Am I maybe missing some setting or configuration?
When I open a console at this path in the function app and cat the function.json, I get this:
{
"generatedBy": "Microsoft.NET.Sdk.Functions-1.0.29",
"configurationSource": "attributes",
"bindings": [
{
"type": "blobTrigger",
"connection": "samplecontainer",
"path": "incoming/{name}",
"name": "myBlob"
}
],
"disabled": false,
"scriptFile": "../bin/azure-functions.dll",
"entryPoint": "Requests.Run"
}

OK, I can reproduce your problem.
Since it works locally, the code itself should be fine.
I can explain why this happened: your function app on Azure doesn't have an RBAC role that allows it to write data to Azure Blob Storage.
The following steps should solve the problem:
First, create a managed identity for your function app.
Second, assign your function app's identity an RBAC role on the storage account.
By the way, these settings will not take effect immediately; you need to wait a few minutes for them to take effect.
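For reference, the two steps above can also be done from the Azure CLI, roughly like this (a sketch only; all resource names are placeholders, and the role shown, Storage Blob Data Contributor, is one common choice for blob write access):

```shell
# 1. Enable a system-assigned managed identity on the function app
az functionapp identity assign \
  --name <function-app-name> \
  --resource-group <resource-group>

# 2. Grant that identity a blob-writing role on the storage account
#    (use the principalId printed by the previous command)
az role assignment create \
  --assignee <principal-id> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```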

I fully believe I also needed Bowman Zhu's answer for this to work, but in my specific case I had forgotten to add the storage account as a connection string in my appSettings.json, as we had recently updated to Microsoft.Azure.WebJobs.Extensions.Storage version 3 and this was our first blob trigger.
In appSettings.json:
{
"ConnectionStrings": {
"AzureWebJobsStorage": "{endpoint}"
}
}
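As a side note, when running an Azure Functions project locally the equivalent setting usually lives under Values in local.settings.json rather than in a ConnectionStrings section (the connection string below is a placeholder):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
  }
}
```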

Related

Cannot get AWS Lambda to PutObject to S3 in same account no matter what

I am trying to write an AWS Lambda that runs FFMPEG. I have done this before at another workplace, so I know that it is possible.
The log shows that I make a temporary URL to read the file in, it processes with FFMPEG, and I pipe the output to a byte[] that contains data; then I try to do an S3 PutObjectRequest, which always fails with this message:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
The S3 client uses the same credentials that I use in automation all the time to upload files to S3 from different servers at different locations of our company. I have tried a couple of different IAM roles to no effect.
I am not trying to do any sort of signature whatsoever. I am simply doing this:
var putRequest = new PutObjectRequest
{
BucketName = m.Groups["bucket"].Value,
Key = m.Groups["key"].Value,
InputStream = new MemoryStream(Encoding.UTF8.GetBytes(data ?? "")),
CannedACL = S3CannedACL.PublicRead,
ContentType = MimeTypes.GetMimeType(Path.GetExtension(m.Groups["key"].Value)),
DisablePayloadSigning = true,
};
putRequest.Headers.ContentLength = data.Length;
_context.Logger.LogLine($"Saving file to bucket '{putRequest.BucketName}' and key '{putRequest.Key}' and content type '{putRequest.ContentType}' and content length {putRequest.Headers.ContentLength}");
try
{
await _s3Client.PutObjectAsync(putRequest);
}
catch (AmazonS3Exception s3x)
{
_context.Logger.LogLine($"S3 Exception: {s3x.Message}");
}
I have checked the bucket and key and they are correct. data.Length is greater than 0. The content type is "audio/mpeg", which is correct for .mp3. The data is there to be written.
My lambda is running under AWSLambda_Full_Access with the following additional rights granted:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": [
"s3:ListStorageLensConfigurations",
"s3:ListAccessPointsForObjectLambda",
"s3:GetAccessPoint",
"s3:PutAccountPublicAccessBlock",
"s3:GetAccountPublicAccessBlock",
"s3:ListAllMyBuckets",
"s3:*",
"s3:ListAccessPoints",
"s3:ListJobs",
"s3:PutStorageLensConfiguration",
"s3:ListMultiRegionAccessPoints",
"s3:CreateJob"
],
"Resource": "*"
}
]
}
Does anyone have any ideas what else I could be missing? I have been stuck on this one problem for over 3 days now and have tried everything I can think of, so it must be something I'm not thinking of.
Thanks.

Not able to run Cosmos DB Change Feed Trigger Azure Function locally

I am not able to run Cosmos DB Change Feed Trigger function locally.
Cosmos DB Change Feed Trigger Azure Function:
public static class NotificationChangeFeed
{
[FunctionName("NotificationChangeFeed")]
public static async Task Run([CosmosDBTrigger(
databaseName: "FleetHubNotifications",
collectionName: "Notification",
ConnectionStringSetting = "CosmosDBConnection",
CreateLeaseCollectionIfNotExists = true,
LeaseCollectionName = "leases")]IReadOnlyList<Document> input,
[Inject] ILoggingService loggingService,
[Inject] IEmailProcessor emailProcessor)
{
var logger = new Logger(loggingService);
try
{
if (input != null && input.Count > 0)
{
foreach (Document document in input)
{
string requestBody = document.ToString();
var notification = requestBody.AsPoco<Notification>();
var result = await emailProcessor.HandleEmailAsync(notification, logger);
if (result)
{
logger.Info($"Email Notification sent successfully for file name: {document.Id}");
}
else
{
logger.Warning($"Unable to process document for Email Notification for file with name: {document.Id}");
}
}
}
}
catch (Exception ex)
{
logger.Error($"Unable to process Documents for Email Notification for Files: {input?.Count}", ex,
nameof(NotificationChangeFeed));
throw;
}
}
}
local.settings.json
{
"IsEncrypted": "false",
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"AzureWebJobsDashboard ": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"CosmosDbId": "FleetHubNotifications",
//Localhost
"CosmoDbAuthKey": "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
"CosmoDbEndpoint": "https://localhost:8081/",
"CosmosDBConnection": "AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
}
}
When I press F5, it gets stuck in the console window (as shown in the screenshot below).
I am also not able to call HTTP trigger functions. I get the error below when calling:
Error: connect ECONNREFUSED 127.0.0.1:7071
Any thoughts?
In that part of the output you only see the endpoints of the HTTP functions. The other functions are also initialized and are waiting for an event of their respective type. You can see it in the line Found the following functions: 4 are HTTP triggers, the other 2 are blob triggers.
If you want to debug your NotificationChangeFeed function, you will have to create a new document in the database while the function is running and waiting for that event. You will then see the telemetry in the console and can debug the function.
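To trigger the function manually, here is a minimal sketch of inserting a test document with the v2 SDK (Microsoft.Azure.DocumentDB), assuming the emulator endpoint and key from the local.settings.json above; the document shape here is made up:

```csharp
using System;
using Microsoft.Azure.Documents.Client;

var client = new DocumentClient(
    new Uri("https://localhost:8081/"),
    "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==");

// Writing into the monitored collection makes the change feed emit the document,
// which in turn fires the CosmosDBTrigger function.
await client.CreateDocumentAsync(
    UriFactory.CreateDocumentCollectionUri("FleetHubNotifications", "Notification"),
    new { id = Guid.NewGuid().ToString(), subject = "Test notification" });
```

This requires the Cosmos DB emulator to be running locally.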

How to refactor code to turn HTTP trigger to a Blob Trigger?

I have this function that basically uploads a video to YouTube. However, currently I have to hard-code the actual file path where the video is located (example: #"C:\Users\Peter\Desktop\audio\test.mp4";). I am wondering whether there is a way to make this more dynamic. For example, right now I am using an HTTP trigger, but how can the code be refactored to make it a blob trigger, so that when I upload a new .mp4 file into my blob storage container, this function gets triggered?
I am asking because I am planning to move this function to the Azure portal, and there I won't be able to specify a local file path like I am doing right now. Thanks in advance.
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Upload;
using Google.Apis.YouTube.v3.Data;
using System.Reflection;
using Google.Apis.YouTube.v3;
using Google.Apis.Services;
using System.Threading;
namespace UploadVideoBlob
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task Run([BlobTrigger("video/{name}")]Stream myBlob, string name, Microsoft.Azure.WebJobs.ExecutionContext context, ILogger log)
{
UserCredential credential;
using(var stream = new FileStream(System.IO.Path.Combine(context.FunctionDirectory, "client_secrets.json"), FileMode.Open, FileAccess.Read))
{
credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
new[] { YouTubeService.Scope.YoutubeUpload },
"user",
CancellationToken.None
);
}
var youtubeService = new YouTubeService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = Assembly.GetExecutingAssembly().GetName().Name
});
var video = new Video();
video.Snippet = new VideoSnippet();
video.Snippet.Title = "Default Video Title";
video.Snippet.Description = "Default Video Description";
video.Snippet.Tags = new string[] { "tag1", "tag2" };
video.Snippet.CategoryId = "22";
video.Status = new VideoStatus();
video.Status.PrivacyStatus = "unlisted";
var VideoInsertRequest = youtubeService.Videos.Insert(video, "snippet,status", myBlob, "video/*");
await VideoInsertRequest.UploadAsync();
}
}
}
function.json
{
"generatedBy": "Microsoft.NET.Sdk.Functions-1.0.29",
"configurationSource": "attributes",
"bindings": [
{
"type": "blobTrigger",
"path": "video/{name}",
"name": "myBlob"
}
],
"disabled": false,
"scriptFile": "../bin/UploadVideoBlob.dll",
"entryPoint": "UploadVideoBlob.Function1.Run"
}
client_secrets.json
{
"installed": {
"client_id": "147300761218-dl0rhktkoj8arh0ebu5pu56es06hje5p.apps.googleusercontent.com",
"project_id": "mytestproj",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_secret": "xxxxxxxxxxxxxxxxxx",
"redirect_uris": [ "urn:ietf:wg:oauth:2.0:oob"]
}
}
I had some spare time, and despite using Azure Functions for more than a year, I had never had the chance to implement an actual BlobTrigger. So I thought I'd share my tiny .NET Core 3.0 test implementation and add an overkill answer as a supplement to @Bryan Lewis' correct answer.
If you want to test this before launching on Azure, you should first make sure you have Azure Storage Emulator. If you have Visual Studio 2019, it should already be installed. If you have VS19 but it's not installed, you should open the Visual Studio Installer and modify your VS19 installation. Under "Individual Components", you should be able to find "Azure Storage Emulator". If you don't have VS19, you can get it here.
Next I'd recommend downloading Azure Storage Explorer from here. If the emulator is running and you didn't change the default ports for the Storage Emulator you should be able to find a default entry under Local & Attached > Storage Accounts > (Emulator - Default ports).
Using the Storage Explorer, you can expand "Blob Containers". Right click "Blob Containers", and choose to "Create Blob Container", and give it a name. For my example I named it "youtube-files". I also created another container calling it "youtube-files-descriptions".
Now for the actual function. I took the liberty of doing this with dependency injection (I just dread the static chaos). For this you'll have to include the NuGet packages Microsoft.Azure.Functions.Extensions and Microsoft.Extensions.DependencyInjection.
Startup
We register our services and what-not here. I'll add an InternalYoutubeService (named as such so as not to confuse it with the one supplied by the Google APIs). You can read more about DI and Azure Functions here.
// Notice that the assembly definition comes before the namespace
[assembly: FunctionsStartup(typeof(FunctionApp1.Startup))]
namespace FunctionApp1
{
public class Startup : FunctionsStartup
{
public override void Configure(IFunctionsHostBuilder builder)
{
// You can of course change service lifetime as needed
builder.Services.AddTransient<IInternalYoutubeService, InternalYoutubeService>();
}
}
}
BlobFunction
You don't have to add/register classes containing any azure functions, they're handled automagically.
Notice the difference between BlobTrigger and Blob. BlobTrigger dictates that the function only triggers when files are uploaded into the "youtube-files" container; at the same time, the Blob binding looks up a blob in the "youtube-files-descriptions" container with the same filename as the incoming one from the BlobTrigger, but with a "-description" suffix, and only if it uses the ".txt" extension. If that blob is not found, the binding returns null, and the bound string description will be null. You can find the different available bindings here. The link will, generally speaking, tell you what you need to know about the BlobTrigger and Blob attributes.
[StorageAccount("AzureWebJobsStorage")]
public class BlobFunction
{
private readonly IInternalYoutubeService _YoutubeService;
private readonly ILogger _Logger;
// We inject the YoutubeService
public BlobFunction(IInternalYoutubeService youtubeService, ILogger<BlobFunction> logger)
{
_YoutubeService = youtubeService;
_Logger = logger;
}
[FunctionName("Function1")]
public async Task Run(
[BlobTrigger("youtube-files/{filename}.{extension}")] Stream blob,
[Blob("youtube-files-descriptions/{filename}-description.txt")] string description,
string filename,
string extension)
{
switch (extension)
{
case "mp4":
await _YoutubeService.UploadVideo(blob, filename, description, "Some tag", "Another tag", "An awesome tag");
break;
case "mp3":
await _YoutubeService.UploadAudio(blob, filename, description);
break;
default:
_Logger.LogInformation($"{filename}.{extension} not handled");
break;
}
}
}
YoutubeService
Will contain the logic that handles the actual authentication (the OAuth2 flow you're using) and the upload of the file. You can refer to @Bryan Lewis' answer for how to use the incoming Stream. We can store our credentials in the function app's configuration and inject the IConfiguration interface, which lets us access the values by the keys defined in the configuration. This way you avoid hardcoding any credentials in your code. I have omitted the YouTube-specific upload logic, as I have no experience with the library you're using, but it should be simple enough to migrate that logic to the service.
public interface IInternalYoutubeService
{
Task UploadVideo(Stream stream, string title, string description, params string[] tags);
Task UploadAudio(Stream stream, string title, string description, params string[] tags);
}
internal class InternalYoutubeService : IInternalYoutubeService
{
private readonly IConfiguration _Configuration;
private readonly ILogger _Logger;
public InternalYoutubeService(IConfiguration configuration, ILogger<InternalYoutubeService> logger)
{
_Configuration = configuration;
_Logger = logger;
}
public async Task UploadAudio(Stream stream, string title, string description, params string[] tags)
{
_Logger.LogInformation($"{_Configuration["YoutubeAccountName"]}");
_Logger.LogInformation($"{_Configuration["YoutubeAccountPass"]}");
_Logger.LogInformation($"Bytes: {stream.Length} - {title} - {description} - {string.Join(", ", tags)}");
}
public async Task UploadVideo(Stream stream, string title, string description, params string[] tags)
{
_Logger.LogInformation($"{_Configuration["YoutubeAccountName"]}");
_Logger.LogInformation($"{_Configuration["YoutubeAccountPass"]}");
_Logger.LogInformation($"Bytes: {stream.Length} - {title} - {description} - {string.Join(", ", tags)}");
}
}
local.settings.json
When you're done testing locally, you'd of course put these values (except the development storage string) into your function app's configuration in the Azure Portal.
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"YoutubeAccountName": "MyAccountName",
"YoutubeAccountPass": "MySecretPassword"
}
}
Testing
I'm testing with a simple text file, "Test File-description.txt", containing the text "This is a sample description.". I also have a ~5MB MP3 file, "Test File.mp3". I start by drag-and-dropping the text file into the "youtube-files-descriptions" container, followed by drag-and-dropping "Test File.mp3" into the "youtube-files" container. The function is not triggered by uploading the text file; it's not until I upload "Test File.mp3" that the function triggers. I see the following lines logged:
Executing 'Function1' (Reason='New blob detected: youtube-files/Test File.mp3', Id=50a50657-a9bb-41a5-a7d5-2adb84477f69)
MyAccountName
MySecretPassword
Bytes: 5065849 - Test File - This is a sample description. -
Executed 'Function1' (Succeeded, Id=50a50657-a9bb-41a5-a7d5-2adb84477f69)
It's a pretty straightforward change. Just swap in the blob trigger. Replace:
public static async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req, ILogger log)
with
public static void Run([BlobTrigger("blob-container-name/{fileName}")] Stream videoBlob, string fileName, ILogger log)
This gives you a stream (videoBlob) to work with (and the video's filename if you need it). Then substitute this new stream for your FileStream. It appears that you used the Google/YouTube example code to construct your function, but there is not much need for a separate Run() method in an Azure Function. You could simplify things by combining your Run() method into the main function code instead of calling "await Run();".
Change your function to this:
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.YouTube.v3;
using Google.Apis.YouTube.v3.Data;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using System.IO;
using System.Reflection;
using System.Threading;
using System.Threading.Tasks;
namespace UploadVideoBlob
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task Run([BlobTrigger("video/{name}",
Connection = "DefaultEndpointsProtocol=https;AccountName=uploadvideoblob;AccountKey=XXXX;EndpointSuffix=core.windows.net")]Stream videoBlob, string name,
Microsoft.Azure.WebJobs.ExecutionContext context, ILogger log)
{
UserCredential credential;
using (var stream = new FileStream(System.IO.Path.Combine(context.FunctionDirectory, "client_secrets.json"), FileMode.Open, FileAccess.Read))
{
credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
// This OAuth 2.0 access scope allows an application to upload files to the
// authenticated user's YouTube channel, but doesn't allow other types of access.
new[] { YouTubeService.Scope.YoutubeUpload },
"user",
CancellationToken.None
);
}
var youtubeService = new YouTubeService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = Assembly.GetExecutingAssembly().GetName().Name
});
var video = new Video();
video.Snippet = new VideoSnippet();
video.Snippet.Title = "Default Video Title";
video.Snippet.Description = "Default Video Description";
video.Snippet.Tags = new string[] { "tag1", "tag2" };
video.Snippet.CategoryId = "22"; // See https://developers.google.com/youtube/v3/docs/videoCategories/list
video.Status = new VideoStatus();
video.Status.PrivacyStatus = "unlisted"; // or "private" or "public"
var videosInsertRequest = youtubeService.Videos.Insert(video, "snippet,status", videoBlob, "video/*");
await videosInsertRequest.UploadAsync();
}
}
}
Using the ProgressChanged and ResponseReceived events is not really needed in a Function except for logging, so you could keep those or eliminate them. The Google example is a console app, so it outputs a lot of status to the console.
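If you do want to keep them, here is a sketch of wiring those events to the function's logger instead of the console (assuming the videosInsertRequest and log variables from the code above; handlers must be attached before calling UploadAsync()):

```csharp
using Google.Apis.Upload;

videosInsertRequest.ProgressChanged += progress =>
{
    switch (progress.Status)
    {
        case UploadStatus.Uploading:
            log.LogInformation($"{progress.BytesSent} bytes sent.");
            break;
        case UploadStatus.Failed:
            log.LogError($"Upload failed: {progress.Exception}");
            break;
    }
};
videosInsertRequest.ResponseReceived += video =>
    log.LogInformation($"Video id '{video.Id}' was uploaded.");
```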
I also made one additional change to correct the file path of your "client_secrets.json". This code assumes that the json file is in the same directory as your function and is published with the Function.

Deserialize Azure Blob stream to Json Object

I am trying to deserialize the blob stream to JSON object using azure blob trigger. This trigger would be fired whenever I upload a video to blob storage. However, it is throwing this error:
Newtonsoft.Json: Unexpected character encountered while parsing value: . Path ''.
This is the code that I am using to deserialize:
public static void Run(Stream myBlob, string name, TraceWriter log)
{
myBlob.Position = 0; //resetting stream's position to 0
var serializer = new JsonSerializer();
using(var sr = new StreamReader(myBlob))
{
using(var jsonTextReader = new JsonTextReader(sr))
{
BlobData blobData = serializer.Deserialize<BlobData>(jsonTextReader);
}
}
}
public class BlobData
{
public string path { get; set; }
}
Any help would be appreciated. Thanks.
As I mentioned earlier, the blob will contain a video, and after upload a trigger will fire. As of now, I am using some sample videos.
As Gaurav Mantri commented, you cannot deserialize a video to a JSON object. My understanding is that you want to retrieve the blob URI after a video blob is uploaded and store the video blob URL in some other data store. In that case, you could bind the CloudBlockBlob type to your myBlob parameter and retrieve the blob URL as follows:
run.csx
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Blob;
public static void Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
//if the blob has public read access, its URI alone is enough
var blobData = new BlobData() { path = myBlob.Uri.ToString() };
//if the blob is private, generate a SAS token for it with limited permission(s) instead
var blobSasToken = myBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(2),
Permissions = SharedAccessBlobPermissions.Read
});
blobData = new BlobData()
{
path = $"{myBlob.Uri.ToString()}{blobSasToken}"
};
//TODO:
}
Moreover, you could follow Create and use a SAS with Blob storage, Azure Functions Blob storage bindings, Get started with Azure Blob storage using .NET for more detailed code sample.

Azure: Cannot Read Blob Text from Azure Function HttpTrigger (400 - Bad Request Error)

I am having trouble reading text stored in a blob on Azure (Blob) Storage.
The blob contains only a single line of text (a string). The blob is filled with text via an Azure Functions HttpTrigger (C#) that receives text via a POST and saves the text to a blob with a user-specified name. The name is converted to all lowercase when saving the blob.
The user can then visit a simple HTML webpage and enter the blob's name into a form. When the user clicks "Submit" on the HTML form, a POST is performed against a different Azure Function API. This function accesses the blob and reads text from it. Whenever I test the Function from within Azure Functions or using Postman, it works correctly (I get the blob text back).
When I try to use the webpage to interact with the API, I get a "400 - Bad Request" when the Azure Function goes to read from the blob. Please see the details below:
Log of API Working correctly from Postman:
2017-04-11T20:19:14.340 Function started (Id=ea82f5c6-4345-40cc-90a5-1cb1cad78b7b)
2017-04-11T20:19:14.340 C# HTTP trigger function processed a request.
2017-04-11T20:19:14.340 Data from POST: blobName=TestBlob1submit=SubmitButtonText
2017-04-11T20:19:14.340 Blob name is: testblob1
2017-04-11T20:19:14.340 Accessing Azure Storage account.
2017-04-11T20:19:14.402 Text in Blob: Hello world test!
2017-04-11T20:19:14.402 Function completed (Success, Id=ea82f5c6-4345-40cc-90a5-1cb1cad78b7b)
Log of API Not Working via HTML Form:
2017-04-11T20:19:52.594 Function started (Id=1b1a39b6-0ab8-4673-bbf0-ae0006f7f7cf)
2017-04-11T20:19:52.594 C# HTTP trigger function processed a request.
2017-04-11T20:19:52.594 Data from POST: blobName=TestBlob1
submit=Retrieve Blob Text
2017-04-11T20:19:52.594 Blob name is: testblob1
2017-04-11T20:19:52.594 Accessing Azure Storage account.
2017-04-11T20:19:52.626 Function completed (Failure, Id=1b1a39b6-0ab8-4673-bbf0-ae0006f7f7cf)
2017-04-11T20:19:52.672 Exception while executing function: Functions.Austin-SteelThread-HttpTrigger-DisplayBlobText. Microsoft.WindowsAzure.Storage: The remote server returned an error: (400) Bad Request.
Here is the Azure Function in question:
#r "Microsoft.WindowsAzure.Storage"
using System;
using System.IO;
using System.Net;
using System.Text;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, Binder binder)
{
log.Info("C# HTTP trigger function processed a request.");
// Get text passed in POST
string postData = await req.Content.ReadAsStringAsync();
log.Info("Data from POST: " + postData);
// Format blobName string to remove unwanted text
// Help from http://stackoverflow.com/questions/9505400/extract-part-of-a-string-between-point-a-and-b
int startPos = postData.LastIndexOf("blobName=") + "blobName=".Length;
int length = postData.IndexOf("submit=") - startPos;
string blobName = postData.Substring(startPos, length);
blobName = blobName.ToLower(); // Name of blob must be all lower-case
log.Info("Blob name is: " + blobName);
// START BLOB READING
log.Info("Accessing Azure Storage account.");
string containerAndBlob = "usertext-container/blob-" + blobName;
var attributes = new Attribute[]
{
new StorageAccountAttribute("[StorageAccountName]"),
new BlobAttribute(containerAndBlob)
};
string userBlobText = null;
try
{
userBlobText = await binder.BindAsync<string>(attributes);
}
catch (StorageException ex)
{
var requestInformation = ex.RequestInformation;
var extendedInformation = requestInformation.ExtendedErrorInformation;
if (extendedInformation == null)
{
log.Info("No Extended Error Information!");
log.Info(requestInformation.HttpStatusMessage);
}
else
{
log.Info(requestInformation.HttpStatusMessage);
var errorMessage = string.Format("({0}) {1}", extendedInformation.ErrorCode, extendedInformation.ErrorMessage);
var errorDetails = extendedInformation.AdditionalDetails.Aggregate("", (s, pair) =>
{
return s + string.Format("{0}={1},", pair.Key, pair.Value);
});
log.Info(errorMessage + ": Error Details: " + errorDetails);
}
}
log.Info("Text in Blob: " + userBlobText.ToString());
// END BLOB READING
return userBlobText == null
? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass blob name in the request body.")
: req.CreateResponse(HttpStatusCode.OK, "Your blob stored the text: " + userBlobText.ToString());
}
How can I fix this issue so that the Function reads the blob text and the web browser displays the blob's text (right now I get just an empty string)? Thank you in advance.
Instead of connecting to Blob storage manually, you should take advantage of the binding engine. Add a Binder parameter to your function and then use it to retrieve the file:
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
TraceWriter log, Binder binder)
{
// Do await, not .Result
string postData = await req.Content.ReadAsStringAsync();
// ... get your HTTP parameters here
var attributes = new Attribute[]
{
new StorageAccountAttribute(accountName),
new BlobAttribute(blobName) // blobName should have "container/blob" format
};
var userBlobText = await binder.BindAsync<string>(attributes);
// do whatever you want with this blob...
}
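As an aside on the "// ... get your HTTP parameters here" step: the Substring/IndexOf parsing in the question is fragile when the form posts fields on separate lines (note the stray newline in the failing log above). A sketch of parsing the body with HttpUtility.ParseQueryString instead, assuming a standard form-encoded POST:

```csharp
using System.Web; // HttpUtility

string postData = "blobName=TestBlob1&submit=Retrieve+Blob+Text";
var form = HttpUtility.ParseQueryString(postData);
// Trim() guards against stray whitespace; blob names must be lower-case here
string blobName = form["blobName"].Trim().ToLower(); // "testblob1"
```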
