Azure blob trigger, how to trigger on "move" for a file? - c#

So I have a blob trigger, and I recently discovered that if I move an existing file into the folder in question, the blob trigger does not fire. How can I trigger on a "move"?
Context: the blob trigger is looking for a JSON file. It DOES fire if I copy or drag-and-drop from another folder on my PC via upload and overwrite an existing blob, but NOT if the file is new!
How I came across this: I have an "a" folder and my trigger folder. If the file is NOT already in my trigger folder and I perform a "move" from folder "a" into the trigger folder, the blob trigger ignores it. Why is that? Is there a workaround?
As far as code goes, I still have to clean it up some, but it's a general consumption-plan blob trigger configured to look for a JSON file, on an ADLS Gen2 storage account. It DOES fire on other blob copies and such, just NOT on moves.
So far I have tried moving the file and it never triggers, but if I copy or drag-and-drop WITH overwrite, it triggers. I've looked through the configuration and the documentation and can't find any mention of this yet.

After reproducing this on my end, it works fine. I'm using an ADLS Gen2 storage account. Even after moving a file from a different folder into the trigger folder, the function is triggered. I verified this after attaching the storage account to both a function app and a logic app. Below is the function.json for my function app.
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input/{name}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
RESULTS:
Make sure the extension you have added as the path filter is in the right format for the trigger to fire.
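One thing worth noting for the original scenario: on ADLS Gen2, a "move" is a rename on the DFS endpoint, which is a metadata-only operation and may not surface to the trigger as a newly written blob, whereas a copy does. If moves keep getting missed, one workaround is to perform the move as a server-side copy followed by a delete of the source. A minimal sketch using the Azure.Storage.Blobs SDK; the container and blob names are placeholders:

```csharp
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class BlobMover
{
    // Emulates a "move" as copy + delete so the blob trigger observes a
    // newly written blob in the destination container.
    public static async Task MoveBlobAsync(BlobServiceClient service)
    {
        BlobClient source = service.GetBlobContainerClient("a").GetBlobClient("data.json");
        BlobClient dest   = service.GetBlobContainerClient("input").GetBlobClient("data.json");

        // Server-side copy: this writes a new blob the trigger can pick up.
        // (Within the same account, the destination's authorization covers the source.)
        var copy = await dest.StartCopyFromUriAsync(source.Uri);
        await copy.WaitForCompletionAsync();

        // Delete the original to complete the "move".
        await source.DeleteIfExistsAsync();
    }
}
```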

Related

unzip Http FormFile in Azure Function

I have a front end where the user can upload a zip. My idea is to use an HTTP-triggered Azure Function to unzip that file and send the contents to Azure Blob Storage. So I am simulating the HTTP function with Postman, sending the zip in the form-data. I can't figure out how to go from the HTTP form file to the unzipped files that I want to send. I am using C#.
Do you have any suggestions or references?
Maybe my approach is wrong and I should first send the data (which unzipped is about 60-70 MB) to the blob and then use a blob trigger to send the unzipped files to another container. This last approach feels more resource-intensive to me. Which would you choose?
As per the article credited to FBoucher:
Install the extension from inside Visual Studio Code along with the Azure Functions Core Tools, and set AzureWebJobsStorage to UseDevelopmentStorage=true in the local.settings.json file.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "unziptools_STORAGE": "DefaultEndpointsProtocol=https;AccountName=unziptools;AccountKey=XXXXXXXXX;EndpointSuffix=core.windows.net"
  }
}
In the Azure Functions extension, select your subscription and function app name; under the AzUnzipEverything function app, add the new settings cloud5mins_storage, destinationStorage and destinationContainer.
In your storage account (in its resource group), select the blob container input-files and upload a zip file.
After a few minutes, the uploaded zip file will be unzipped into the blob storage container output-files.
For your reference:
https://github.com/FBoucher/AzUnzipEverything by FBoucher
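If you would rather unzip inline in your own HTTP-triggered function instead of using the prebuilt tool, a minimal sketch could look like the following. The helper name, form-file handling, and container are assumptions, not the tool's actual code:

```csharp
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;

public static class UnzipHelper
{
    // Reads the uploaded zip from the request's form data and uploads each
    // entry as its own blob. Zip entry streams are not seekable, so each one
    // is buffered into memory first (fine for a 60-70 MB archive).
    public static async Task UnzipToBlobsAsync(IFormFile zipFile, BlobContainerClient container)
    {
        using var archive = new ZipArchive(zipFile.OpenReadStream(), ZipArchiveMode.Read);
        foreach (var entry in archive.Entries)
        {
            if (entry.Length == 0) continue;          // skip directory entries

            using var buffer = new MemoryStream();
            using (var entryStream = entry.Open())
                await entryStream.CopyToAsync(buffer);
            buffer.Position = 0;

            await container.GetBlobClient(entry.FullName)
                           .UploadAsync(buffer, overwrite: true);
        }
    }
}
```

Inside the function you would grab the file with something like req.Form.Files[0] and pass it in. On the second question: for 60-70 MB, unzipping directly in the HTTP function is arguably the lighter option, since the blob-trigger detour writes the zip to storage first and then reads it back.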

Windows service to GCP

I was looking for resources on how to create a simple background service in C# that checks a specific folder for FLAC files and sends them to a GCP bucket; once a file is uploaded successfully, it is erased or moved to another folder. Where can I find something to read about this kind of thing?
To move a file to another location in C# you can use the File.Move method. It moves an existing file to a new location, with the same or a different file name, and deletes the original; because of this, File.Move is also the method used to rename files. It takes two parameters: the source path and the destination path.
Example:
try
{
    File.Move(sourceFile, destinationFile);
}
catch (IOException iox)
{
    Console.WriteLine(iox.Message);
}
If you need more examples of the File.Move method, please follow this link.
Adding to that, you can use the Directory.GetFiles method to filter by file extension, as in the example below.
This is the original thread where the example was posted.
Example:
// Assume the user types ".txt" into the textbox
string fileExtension = "*" + textbox1.Text;
string[] txtFiles = Directory.GetFiles("Source Path", fileExtension);
foreach (var item in txtFiles)
{
    File.Move(item, Path.Combine("Destination Directory", Path.GetFileName(item)));
}
If you want to know more about the Directory.GetFiles method, follow this link.
And concerning GCP: using the Cloud Storage Transfer Service, you can move or back up data to a Cloud Storage bucket either from other cloud storage providers or from your on-premises storage. Storage Transfer Service provides options that make data transfers and synchronization easier. For example, you can:
- Schedule one-time or recurring transfer operations.
- Delete existing objects in the destination bucket if they do not have a corresponding object in the source.
- Delete data source objects after transferring them.
- Schedule periodic synchronization from a data source to a data sink, with advanced filters based on file creation dates, file names, and the times of day you prefer to import data.
If you want to know more about the GCP Cloud Storage Transfer Service, follow this link.
If you want to know more about how to create storage buckets, follow this link.
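For the simpler per-file case the question describes (watch a folder, upload, then move), the pieces above can be combined into a sketch like this, using the Google.Cloud.Storage.V1 client library; the bucket name and folder paths are placeholders:

```csharp
using System.IO;
using Google.Cloud.Storage.V1;

// Upload every FLAC file found in a watch folder to a GCS bucket, then move
// each file to an archive folder once its upload has succeeded.
var storage = StorageClient.Create();   // uses Application Default Credentials
foreach (var path in Directory.GetFiles(@"C:\watch", "*.flac"))
{
    using (var stream = File.OpenRead(path))
    {
        storage.UploadObject("my-flac-bucket", Path.GetFileName(path), "audio/flac", stream);
    }
    // Runs only if UploadObject did not throw, so a failed upload keeps the file.
    File.Move(path, Path.Combine(@"C:\archive", Path.GetFileName(path)));
}
```

For the "simple background service" part, this loop could run on a timer inside a .NET Worker Service (a BackgroundService implementation).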

How do I set up my Azure Function with the content of the local.settings.json file?

My question is pretty specific but I hope someone here will be able to help me...
Okay long story short
I'm developing an Azure Function in C# (with .NET Core 3.1) which outputs PowerPoint slideshows using data fetched from SharePoint lists, and a slideshow template also stored on said SharePoint. To achieve this, I'm using a ConfigurationBuilder to load the configuration in the local.settings.json file.
This file pretty much looks like this:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  },
  "TokenEndpoint": "https://login.microsoftonline.com/common/oauth2/token",
  "User": "!! put this line filled with the email of the account in your secrets.json !!",
  "Password": "!! put this line filled with the password of the account in your secrets.json !!",
  "AzureAppId": "some guid",
  "SharePointSite": "https://something.sharepoint.com/sites/something_else",
  "TemplatesConfig": {
    "Path": "Shared%20files/",
    "FirstTemplateName": "FirstTemplate.pptx",
    "SecondTemplateName": "SecondTemplate.pptx"
  }
}
To connect to SharePoint, we're using the email and password stored in this file, but to avoid leaking the credentials, I'm using user secrets that I'm "adding" to the configuration builder to replace the placeholder credentials with actual ones.
Then I go into the TemplatesConfig item and read the Path, FirstTemplateName and SecondTemplateName properties to find where to look for the template files and the names of the first and second templates.
I can run the function locally, everything works as intended. Good.
Where the problems begin
Now, I want to publish the function to the cloud to use it. The function app is created and pretty much set up. I can publish it, but I can't run it... The problem comes from the fact that the configuration builder can't read TemplatesConfig, as it's not defined.
Reading the docs and searching for answers on some forums here and there, I found that the configuration file local.settings.json is, obviously, only used when executing the function locally, and you can't use it when published like I'm trying to. Instead, the key values in this file need to be entered on Azure, in the application settings:
Thing is, the keys entered in the application settings are the keys you could enter in the Values item of the configuration JSON, and that's not where my configuration data is... Furthermore, it looks like you can only store key-value associations, but I'm storing a whole object containing them...
According to the documentation, there are multiple sections possible in local.settings.json, and two of them seem to be linked to the two sections of the page shown in the screenshot above (Application Settings and Connection Strings), so I think this is the right place to look for answers, but it feels more like I'm going to have to rebuild my credentials and template data storage from the ground up...
It's the first time I'm making an Azure function. In fact, someone else started the job and then passed it on to me when the whole "base" was built (which includes this part of the program which bugs me out because I'm having a hard time understanding it).
So, what's my question then?
Do you have an idea of how I should properly store the information about the templates, and also the credentials to connect to SharePoint? Is the current solution correct, or have I completely missed another mechanism I should be using instead? Should I redo my authentication completely and use connection strings to connect to SharePoint instead of storing the credentials as keys?
Thank you so much for your time reading me, and thank you in advance for your replies.
If I understood correctly, could you not simply set your settings like below?
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "TemplatesConfig:Path": "Shared%20files/",
    "TemplatesConfig:FirstTemplateName": "FirstTemplate.pptx",
    "TemplatesConfig:SecondTemplateName": "SecondTemplate.pptx"
  },
  "TokenEndpoint": "https://login.microsoftonline.com/common/oauth2/token",
  "User": "!! put this line filled with the email of the account in your secrets.json !!",
  "Password": "!! put this line filled with the password of the account in your secrets.json !!",
  "AzureAppId": "some guid",
  "SharePointSite": "https://something.sharepoint.com/sites/something_else"
}
And in your application settings:
Name: TemplatesConfig:Path
Value: Shared%20files/
Name: TemplatesConfig:FirstTemplateName
Value: FirstTemplate.pptx
...
I would suspect a problem with your credentials.
If I remember correctly, user secrets are not published when you deploy; they are only used in development.
You should not use app secrets in production, as stated in the title here: https://learn.microsoft.com/en-us/aspnet/core/security/app-secrets?view=aspnetcore-5.0&tabs=windows
You should instead use environment variables: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-5.0
Try it and complement your question with further details on the problem, so we can assist you further.
Okay, so I'm back with a solution I found with your help along with my colleagues'.
What I did was put everything that was in the local.settings.json file into the Values object, like this:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "SharePointSite": "https://something.sharepoint.com/sites/something_else",
    "User": "!! put this line filled with the email of the account in your secrets.json !!",
    "Password": "!! put this line filled with the password of the account in your secrets.json !!",
    "TemplatesConfig:Path": "Shared%20files/",
    "TemplatesConfig:FirstTemplateName": "FirstTemplate.pptx",
    "TemplatesConfig:SecondTemplateName": "SecondTemplate.pptx"
  },
  "TokenEndpoint": "https://login.microsoftonline.com/common/oauth2/token",
  "AzureAppId": "some guid"
}
I've understood that Azure only works with environment variables, and that Visual Studio reads this file's Values object to "emulate" environment variables while developing.
In my configuration-loading code, I just had to add a call to Microsoft.Extensions.Configuration.ConfigurationBuilder.AddEnvironmentVariables() to pick them up from Azure when the function runs online (after being published). Now it runs fine locally and, most importantly, online.
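For reference, the configuration-loading code described above might look roughly like this; the file name handling and the Startup type are assumptions about this particular project, not its actual code:

```csharp
using System;
using Microsoft.Extensions.Configuration;

var config = new ConfigurationBuilder()
    .SetBasePath(Environment.CurrentDirectory)
    // Only present locally; on Azure this file simply doesn't exist.
    .AddJsonFile("local.settings.json", optional: true)
    // User secrets override the placeholder User/Password while developing.
    // Startup stands for any type in the assembly carrying the UserSecretsId.
    .AddUserSecrets<Startup>()
    // On Azure, App Settings arrive as environment variables and win here.
    .AddEnvironmentVariables()
    .Build();

// Locally, the Functions host also surfaces Values entries as environment
// variables, so "TemplatesConfig:Path" resolves in both environments when it
// is stored inside Values (or as an App Setting with that exact name).
string templatePath = config["TemplatesConfig:Path"];
```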
As for the credentials, in my secrets.json file I just had to put the two lines for the e-mail and password inside a Values JSON object too, just like in local.settings.json, and voilà, it does the job both while developing and online.
For more info, I checked the links people here replied with, and also this doc :
Best place to store environment variables for Azure Function
Thank you for your help, everyone !

Azure Media Services (v3) blob storage, assets, and locators backup

I'm trying to figure out how to backup videos produced by Azure Media Services.
Where are the assets and streaming locators stored, and how can I back them up or recreate them for the existing binary files kept in the Media Services blob storage?
Proposed solution:
I've come up with a solution: once the video is processed by the transformation job, the app creates a copy of the container in a separate backup blob storage account.
Since, from my understanding, the data produced by transformation jobs is immutable, I don't have to manage any further synchronization.
if (job.State == JobState.Finished)
{
    StreamingLocator locator = await AzureMediaServicesService.CreateStreamingLocatorAsync(client, azureMediaServicesConfig, outputAssetName, locatorName);
    var videoUrls = await AzureMediaServicesService.GetVideoUrlsAsync(client, azureMediaServicesConfig, locator.Name);

    // back up blobs in the created container here
}
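The commented-out backup step might be sketched like this: enumerate the asset's container and start a server-side copy of each blob into the backup account. The method and names are placeholders, and a cross-account copy generally needs the source to be readable by the service:

```csharp
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static async Task BackupContainerAsync(BlobContainerClient source, BlobContainerClient backup)
{
    await backup.CreateIfNotExistsAsync();

    await foreach (var item in source.GetBlobsAsync())
    {
        BlobClient sourceBlob = source.GetBlobClient(item.Name);

        // Across accounts the service must be able to read the source, which
        // usually means appending a read SAS token to sourceBlob.Uri.
        await backup.GetBlobClient(item.Name).StartCopyFromUriAsync(sourceBlob.Uri);
    }
}
```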
Is the binary data stored in blob storage sufficient on its own for restoring the videos successfully? After a restore, will the already-existing streaming and download links still work properly?
Since I pass the asset name when creating locators, I reckon I should back up the asset's data too. Can/should I somehow back up assets and locators? Where are they stored? Is there any better way to back up the videos?
I was looking for the answers here:
https://learn.microsoft.com/en-us/azure/media-services/latest/streaming-locators-concept
https://learn.microsoft.com/en-us/azure/media-services/latest/stream-files-tutorial-with-api#get-a-streaming-locator
https://learn.microsoft.com/en-us/azure/media-services/latest/limits-quotas-constraints
Part of what you're asking is 'What is an asset in Media Services?' The storage container created as part of the encoding process is definitely a good portion of what you need to back up. Technically, that is all you need to recreate an asset from the backup storage account, provided you don't mind recreating the other aspects of the asset.
An asset is/can be several things:
- The Storage container and its contents. These include the MP4 video files, the manifests (.ism and .ismc), and metadata XML files.
- The published locator, i.e. the URL where clients make GET requests to the streaming endpoint.
- Metadata: things like the asset name, creation date, description, etc.
If your backup keeps track of the Storage container and the metadata associated with it, and you have a way of updating your site with a new streaming locator, then the Storage container really is all you need to recreate the asset.
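To make that last point concrete: with the v3 management SDK you can recreate an asset on top of an existing (restored) container and then publish a locator for it again. A sketch, assuming an authenticated IAzureMediaServicesClient; the asset, container, and locator names are placeholders:

```csharp
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

// Recreate the asset record pointing at the restored storage container.
Asset asset = await client.Assets.CreateOrUpdateAsync(
    resourceGroup, accountName, "restored-asset",
    new Asset(container: "restored-container", storageAccountName: storageAccountName));

// Publish it again. Passing the original locator's StreamingLocatorId GUID
// here is what preserves the previously issued streaming URLs.
StreamingLocator locator = await client.StreamingLocators.CreateAsync(
    resourceGroup, accountName, "restored-locator",
    new StreamingLocator(
        assetName: asset.Name,
        streamingLocatorId: originalLocatorId,   // a Guid saved in the backup
        streamingPolicyName: PredefinedStreamingPolicy.ClearStreamingOnly));
```

So, for the backup question: alongside the container contents, keeping at least the asset name, its container name, and each locator's StreamingLocatorId should be enough to reconstruct the streaming URLs exactly.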

Azure Functions Blob Trigger Dynamic Binding

I need an Azure Functions blob trigger to trigger off a bucket that is given at runtime by an app setting. I read it is possible to do this:
[FunctionName("Process")]
public static async Task Process([BlobTrigger("%BucketName%/{name}", Connection = "AzureWebJobsStorage")] Stream avroBlobStream, string name, TraceWriter log)
{
}
This works locally if I have BucketName ONLY in the Values field in appsettings.json.
{
  "IsEncrypted": false,
  "Values": {
    "BucketName": "capture-bucket"
  }
}
If it's not in the Values field, this is the error:
[6/24/2019 5:52:15 PM] Function 'SomeClass.Process' failed indexing and will be disabled.
[6/24/2019 5:52:15 PM] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
I put a setting with just BucketName into the Azure function app, but it gave me the same error. Could you suggest what the setting should be called, or what I'm doing wrong in the real Azure environment? Should it be Values:BucketName? But I have never seen an example on the Microsoft website with a Values: prefix.
Regarding your error: I tested this, and one situation that produces it is when the Microsoft.Azure.WebJobs.Extensions.Storage package is not installed. After installing it, the trigger works. You could give that a try.
As for dynamic bindings, there is a description in the official tutorial: Binding expressions - app settings. When you test locally, app setting values come from the local.settings.json file; I don't know why you are using appsettings.json. The format is just what you pasted.
On Azure, since the settings in local.settings.json are not deployed by Visual Studio, you have to go to your Azure Function's Configuration and set the binding name (BucketName) there.
I have tested this; it works this way and processed my blob file.
Hope this helps; if I misunderstood your requirements, please let me know.
