Connecting Azure Function to Azure Storage Blob with HTTP trigger - C#

I came across multiple ways of connecting Azure Storage Blobs to Azure Functions (like this one, for example), but all of them require me to use a BlockBlob-typed parameter in my Run function, thereby replacing the HttpRequestMessage parameter I need. Is there a way to keep the HttpRequestMessage parameter and still connect to Azure Storage Blob?
Ultimately, I need to get a file reference from the blob to send to another service via the Azure Function.
When I try to add more parameters to Run, the function compiles properly, but calling it returns a 500 error. When I change the parameters back to two, it works properly. The only differences are the extra parameter and the section I add to function.json, which leaves the entire file looking like this:
{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "blob",
      "name": "myBlobbo",
      "path": "mycontainer",
      "connection": "value",
      "direction": "inout"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ],
  "disabled": false
}
Edit: now the logs are telling me that I didn't specify a connection string, even though I have a local.settings.json file with this inside:
{
  "ConnectionStrings": {
    "xyz": "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=yyy;EndpointSuffix=core.windows.net"
  }
}
I'm probably going to just connect manually by passing a URI to CloudBlobContainer and using a file stream or %TEMP% to pass the contents, but I would still very much like to know how to get this binding to work.
Edit: I'm using the Azure environment to develop the function.
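For the manual fallback mentioned above, here is a minimal sketch using the WindowsAzure.Storage SDK; the app setting, container, and blob names are placeholders. Note that a binding's connection property is normally resolved from an app setting with that name (Application settings in the portal, or the Values section of local.settings.json when running locally), which may be why a ConnectionStrings entry is not picked up.
#r "Microsoft.WindowsAzure.Storage"
using System;
using System.Net;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // "AzureWebJobsStorage" is only an example; substitute whichever app setting
    // holds your storage connection string.
    var connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
    var account = CloudStorageAccount.Parse(connectionString);
    var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
    var blob = container.GetBlockBlobReference("myblob.txt"); // hypothetical blob name
    string content = await blob.DownloadTextAsync();
    return req.CreateResponse(HttpStatusCode.OK, content);
}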

The following example shows how to get blob content from an HTTP request (HTTP trigger, blob input binding, HTTP output):
run.csx
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, string inputBlob, TraceWriter log)
{
    log.Info("Blob content: " + inputBlob);
    return req.CreateResponse(HttpStatusCode.OK, inputBlob);
}
function.json
{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    },
    {
      "type": "blob",
      "name": "inputBlob",
      "path": "incontainer/myblob.txt",
      "connection": "AzureWebJobsDashboard",
      "direction": "in"
    }
  ],
  "disabled": false
}
Storage: the connection value AzureWebJobsDashboard refers to an app setting that contains the storage account's connection string.
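If you need the blob content as a file/stream rather than a string (for example, to forward it to another service), the same binding can also be declared as a Stream. This is only a sketch, assuming the function.json above stays unchanged:
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, Stream inputBlob, TraceWriter log)
{
    // Copy the blob into memory so the content outlives the binding's stream.
    var buffer = new MemoryStream();
    await inputBlob.CopyToAsync(buffer);
    log.Info($"Read {buffer.Length} bytes from the blob.");

    var response = req.CreateResponse(HttpStatusCode.OK);
    response.Content = new ByteArrayContent(buffer.ToArray());
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    return response;
}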

Related

Blob Storage trigger is not getting fired upon uploading an image to the container, why?

I am new to Azure and tried my hand at an Azure Blob Storage trigger function.
I have created a function:
public static void Run(Stream myBlob, string name, Stream outputBlob, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}
function.json file:
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "hunaincontainer/{blobname}.{blobextension}.",
      "connection": "hunainfunctionstorage1_STORAGE"
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "path": "hunaincontainer/{blobname}-ResizedImage.{blobextension}",
      "connection": "hunainfunctionstorage1_STORAGE",
      "direction": "out"
    }
  ],
  "disabled": false
}
hunaincontainer is a blob storage container in a general-purpose storage account. I am using free Azure for testing.
I run the function and it compiles successfully; then I upload an image to hunaincontainer using the portal, but it doesn't hit the function. Why? The connection string, key value, and everything else are set.
I think your issue is a stray period in the path you're setting! Instead of "path": "hunaincontainer/{blobname}.{blobextension}.", try "path": "hunaincontainer/{blobname}.{blobextension}"

VMExtensionProvisioningError while deploying a virtual machine

I am trying to deploy a virtual machine in Azure via JSON, using JSON templates and parameters as much as possible. When I try to generate an extension for the VM I get the error:
"VMExtensionProvisioningError"
with the message:
"Failed to decode, decrypt, and deserialize the protected settings string. Error Message: Keyset does not exist"
and I don't know what this means.
{
  "apiVersion": "2017-12-01",
  "dependsOn": [
    "[concat('Microsoft.Compute/virtualMachines/', parameters('virtualMachineName'))]"
  ],
  "location": "[resourceGroup().location]",
  "name": "[concat(parameters('virtualMachineName'),'/CustomScriptExtension')]",
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.7",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "fileUris": [
        "https://{storageAccountName}.blob.core.windows.net/scripts/{scriptName}"
      ],
      "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -file "{scriptName}" ')]"
    },
    "protectedSettings": { "storageAccountName": "[parameters('storageAccountName')]" }
  },
  "type": "Microsoft.Compute/virtualMachines/extensions"
},
What I am trying to do is execute a script obtained from a blob in Azure.
What am I doing wrong?

SendGrid send email to multiple recipients in Azure Function failed

(For the complete version of this question, please refer to
https://social.msdn.microsoft.com/Forums/en-US/20bb5b37-82af-4cf3-8a59-04e5f19572bc/send-email-to-multiple-recipients-using-sendgrid-failure?forum=AzureFunctions)
Sending to a single recipient was successful, but I found no way to send to multiple recipients or CC/BCC in an Azure Function.
I tried several formats, including
{ "to": [{ "email": ["john.doe@example.com", "sendgridtesting@gmail.com" ] }] }
It seemed to be a limitation of the Azure Function binding, but I am not sure where it goes wrong yet. Please refer to the "bindings" below:
{
  "bindings": [
    {
      "name": "telemetryEvent",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "threshold-email-queue",
      "connection": "RootManageSharedAccessKey_SERVICEBUS",
      "accessRights": "Manage"
    },
    {
      "type": "sendGrid",
      "name": "$return",
      "apiKey": "SendGridKey",
      "direction": "out",
      "from": "ABC@sample.com",
      "to": [{
        "email": ["test1@sample1.com", "test2@sample2.com"]
      }]
    }
  ],
  "disabled": false
}
I did my example with an HTTP trigger, but based on this you'll be able to make it work with a Service Bus trigger.
my function.json:
{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "sendGrid",
      "name": "mails",
      "apiKey": "MySendGridKey",
      "direction": "out",
      "from": "samples@functions.com"
    }
  ],
  "disabled": false
}
My run.csx:
#r "SendGrid"
using System;
using System.Net;
using SendGrid.Helpers.Mail;
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, ICollector<Mail> mails)
{
log.Info("C# HTTP trigger function processed a request.");
Mail message = new Mail()
{
Subject = $"Hello world from the SendGrid C#!"
};
var personalization = new Personalization();
personalization.AddTo(new Email("foo#bar.com"));
personalization.AddTo(new Email("foo2#bar.com"));
// you can add some more recipients here
Content content = new Content
{
Type = "text/plain",
Value = $"Hello world!"
};
message.AddContent(content);
message.AddPersonalization(personalization);
mails.Add(message);
return null;
}
I used this source to build my example:
Azure Function SendGrid
Thanks to Tamás Huj, the function can do the job now, so I'm posting the solution in detail for others' reference.
{
  "bindings": [
    {
      "name": "telemetryEvent",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "threshold-email-queue",
      "connection": "RootManageSharedAccessKey_SERVICEBUS",
      "accessRights": "Manage"
    },
    {
      "type": "sendGrid",
      "name": "$return",
      "apiKey": "SendGridKey",
      "direction": "out",
      "from": "ABC@sample.com"
    }
  ],
  "disabled": false
}
Then write the run.csx as:
#r "SendGrid"
#r "Newtonsoft.Json"
using System;
using SendGrid.Helpers.Mail;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;
public static Mail Run(string telemetryEvent, TraceWriter log)
{
var telemetry = JsonConvert.DeserializeObject<Telemetry>(telemetryEvent);
Mail message = new Mail()
{
Subject = "Write your own subject"
};
var personalization = new Personalization();
personalization.AddBcc(new Email("sample1#test.com"));
personalization.AddTo(new Email("sample2#test.com"));
//add more Bcc,cc and to here as needed
Content content = new Content
{
Type = "text/plain",
Value = $"The temperature value is{temperature.Temperature}."
};
message.AddContent(content);
message.AddPersonalization(personalization);
return message;
}
public class Telemetry
{
public float Temperature { get; set; }
}
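For reference, the queue message is expected to be a JSON payload that deserializes into Telemetry. Below is a purely illustrative sketch of building such a message body (how it is enqueued to Service Bus is out of scope here):
using Newtonsoft.Json;

// Illustrative helper: produces the JSON body the function expects on the queue,
// e.g. {"Temperature":25.5}.
public static string BuildTelemetryMessage(float temperature)
{
    return JsonConvert.SerializeObject(new Telemetry { Temperature = temperature });
}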

Azure Function - C# - ServiceBusTrigger with Blob binding

I have a Python function with a Service Bus trigger and a blob input binding. The name of the blob matches the content of the queue message.
My function.json file looks like that:
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "name": "inputMessage",
      "connection": "Omnibus_Validation_Listen_Servicebus",
      "queueName": "validation-input-queue",
      "accessRights": "listen",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "inputBlob",
      "path": "baselines/{inputMessage}",
      "connection": "Omnibus_Blob_Storage",
      "direction": "in"
    }
  ],
  "disabled": false
}
And it is working like a charm.
I'd like to create a C# function with the same bindings, but it does not seem to work.
I've used the same function.json file.
I have a project.json file:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "WindowsAzure.Storage": "8.5.0"
      }
    }
  }
}
and my run.csx file looks like this:
public static void Run(string inputMessage, Stream inputBlob, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {inputMessage}");
}
When I save/run the function, I receive this error:
Function ($import-baseline) Error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.import-baseline'. Microsoft.Azure.WebJobs.Host: No binding parameter exists for 'inputMessage'.
Is there any difference between the Python and C# SDKs for this kind of binding?
I can also reproduce it on my side if I bind the input blob path with baselines/{serviceBusTrigger} or baselines/{inputMessage} in the function.json file.
I am not sure whether combining a Service Bus trigger with a blob input binding is currently supported; we could give our feedback to the Azure Functions team.
If an Azure Storage queue is acceptable, we could use an Azure Storage queue trigger instead. I tested it on my side and it works correctly.
run.csx file
using System;
using System.IO;
using System.Threading.Tasks;

public static void Run(string myQueueItem, Stream inputBlob, TraceWriter log)
{
    log.Info($"C# storage queue trigger function processed message: {myQueueItem}");
    StreamReader reader = new StreamReader(inputBlob);
    string text = reader.ReadToEnd();
    log.Info(text);
}
function.json
{
  "bindings": [
    {
      "type": "blob",
      "name": "inputBlob",
      "path": "incontainer/{queueTrigger}",
      "connection": "testweblog_STORAGE",
      "direction": "in"
    },
    {
      "type": "queueTrigger",
      "name": "myQueueItem",
      "queueName": "myqueue",
      "connection": "testweblog_STORAGE",
      "direction": "in"
    }
  ],
  "disabled": false
}
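To see the binding expression in action: the text of the queue message becomes the blob name that {queueTrigger} resolves to in the path incontainer/{queueTrigger}. Below is a purely illustrative sketch of enqueuing such a message with the WindowsAzure.Storage SDK (the connection string, queue name, and blob name are placeholders):
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class EnqueueSketch
{
    // Illustrative only: the message text ("myblob.txt") is what {queueTrigger}
    // resolves to, so the function will read the blob incontainer/myblob.txt.
    public static async Task SendBlobNameAsync(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var queue = account.CreateCloudQueueClient().GetQueueReference("myqueue");
        await queue.CreateIfNotExistsAsync();
        await queue.AddMessageAsync(new CloudQueueMessage("myblob.txt"));
    }
}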

FunctionInvocationException when Azure function is called

I'm having an issue with my Azure Function. The function cannot start up because a FunctionInvocationException is thrown each time.
The inner exception is an InvalidOperationException with the message below:
PartitionKey must be supplied for this operation
I have two bindings in my function that connect to a Document DB: one is an input binding that retrieves a specific document from a collection, and the other is an output binding used for auditing.
These both work fine in my dev, QA, and UAT environments; it is only the production environment that has an issue. Both collections (for the settings and the auditing) were created without a partition key, the same as in every other environment.
System.InvalidOperationException:
at Microsoft.Azure.Documents.Client.DocumentClient+d__347.MoveNext (Microsoft.Azure.Documents.Client, Version=1.11.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35)
Any ideas?
I've tried deleting and remaking the audit collection but that didn't work. I don't get why the other environments are fine, but this one isn't.
Edit:
The function.json
{
  "scriptFile": "..\\bin\\Cso.Notification.Function.dll",
  "entryPoint": "Cso.Notification.Function.Program.Run",
  "disabled": false,
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "webHookType": "genericJson",
      "name": "request",
      "methods": [
        "post"
      ]
    },
    {
      "type": "documentDB",
      "name": "subscriberSettings",
      "databaseName": "CSO",
      "collectionName": "Settings",
      "id": "SubscriberSettings",
      "connection": "CsoDocDb",
      "direction": "in"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "documentDB",
      "name": "collector",
      "databaseName": "CSO",
      "collectionName": "AuditJSON",
      "connection": "CsoDocDb",
      "direction": "out"
    }
  ]
}
I've found an answer. One of the documents we were querying for an initial value (SubscriberSettings) was incorrectly set up with a partition key.
Deleting the collection and recreating it without a partition key did the trick.
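For context, this error typically occurs because point reads and writes against a partitioned collection must supply a partition key, which the binding was not providing in this setup. Below is a purely illustrative sketch with the DocumentDB SDK (the endpoint, key, and partition key value are placeholders):
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class PartitionedReadSketch
{
    // Illustrative only: reading a document from a partitioned collection requires
    // RequestOptions.PartitionKey; a non-partitioned collection does not.
    public static async Task<Document> ReadSettingsAsync(string endpoint, string authKey)
    {
        var client = new DocumentClient(new Uri(endpoint), authKey);
        var uri = UriFactory.CreateDocumentUri("CSO", "Settings", "SubscriberSettings");
        var options = new RequestOptions { PartitionKey = new PartitionKey("somePartitionValue") };
        var response = await client.ReadDocumentAsync(uri, options);
        return response.Resource;
    }
}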
