I am creating my first Azure Function and I want it to run when a blob is uploaded, so in Visual Studio 2022 I selected the Azure Functions template and then the Blob option. There are some steps I don't fully understand:
=> Do I need to select both when I also want to test my function locally with Azurite?
=> The "service dependency" on this screen: does it mean the storage that is observed for newly uploaded files, or is it the storage that the Azure Function uses for some internal management?
=> The "service depedency" on this screen, does it mean the storage that is observed for new uploaded files or is it the storage that the Azure function uses for some internal management stuff?
does it mean the storage that is observed for new uploaded files or is it the storage that the Azure function uses for some internal management stuff?
It's the observed storage. The function's runtime storage location is chosen on the first wizard screen, where you select the .NET version.
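For illustration, here is a minimal sketch (assuming the in-process .NET model; the "samples-workitems" container and the "ObservedStorage" setting name are made up for the example). The BlobTrigger path and its Connection setting identify the observed storage, while AzureWebJobsStorage points at the runtime's own storage:

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class BlobWatcher
{
    // Fires when a blob lands in the observed container; "ObservedStorage"
    // is an app setting holding that storage account's connection string.
    [FunctionName("BlobWatcher")]
    public void Run(
        [BlobTrigger("samples-workitems/{name}", Connection = "ObservedStorage")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"New blob uploaded: {name} ({blob.Length} bytes)");
    }
}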
Related
I have just started creating Azure Functions. I am creating a new Azure Function and got the option below. What does it mean? What happens if I select or unselect it? I have not found any documentation about it.
It shows this info, but I didn't understand it at all.
When you create an Azure Function, there needs to be an Azure Storage Account behind it, for various runtime needs (https://learn.microsoft.com/en-us/azure/azure-functions/functions-app-settings#azurewebjobsstorage).
For local development, you can choose to either use a real Azure Storage Account in the cloud by setting AzureWebJobsStorage app setting to that storage account's connection string, or you can use a local emulator (Azurite) that simulates a Storage Account on your machine.
So if you select that Azurite box, it will set AzureWebJobsStorage to use your local emulator instead, so there is no need for a storage account in the cloud.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite?tabs=visual-studio
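For reference, checking the Azurite box should leave you with a local.settings.json along these lines ("UseDevelopmentStorage=true" is the documented shorthand for the local emulator; the exact file Visual Studio generates may differ slightly):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}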
How can I pass a local path to Azure?
I'm trying to perform an HTTP call using Azure Function Apps (on the Azure Portal), and I have to pass a path from my PC so that this local image can be uploaded.
The request body looks something like this:
{
    "url": "C:/Users/User/Pictures/example.jpg"
}
I tried to search about it and found a solution which says to run the function locally, but I want it to run on the Azure portal.
I found another answer which says that when I run a function on the Azure portal, the function path changes to D:\home\site\wwwroot, so I could put the image in that location; but I have no such path on my PC.
Thanks a lot!!!
You need to make use of Azure Blob Storage to store the files that the function should use. If the function runs on the Consumption plan, you won't know whether it will see the same local files every time it is executed. Uploaded images should go to Blob Storage; here is an example:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-upload-process-images?tabs=dotnet
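As a minimal sketch of that flow (assuming the Azure.Storage.Blobs SDK; the "images" container and the connection string placeholder are illustrative), upload the local file to Blob Storage first and then pass the blob's URL to the function instead of a local path:

using System;
using Azure.Storage.Blobs;

// Upload the local image to a container the function can reach.
var container = new BlobContainerClient("<storage-connection-string>", "images");
await container.CreateIfNotExistsAsync();

var blob = container.GetBlobClient("example.jpg");
await blob.UploadAsync(@"C:/Users/User/Pictures/example.jpg", overwrite: true);

// The request body can now reference a URL instead of a local path, e.g.
// { "url": "https://<account>.blob.core.windows.net/images/example.jpg" }
Console.WriteLine(blob.Uri);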
I am currently migrating a legacy .NET application from a dedicated server to an Azure Web App. The application uses System.Web.Caching.CacheDependency for XML file caching.
Caching.CacheDependency(xmlFile) normally detects changes made to the file and updates the cache with the latest version.
The issue is that the files are now being stored in an Azure storage account (i.e. not the local file system) and I need a way to detect changes made to the files. Caching.CacheDependency(xmlFile) will not work in this case, as it expects a local path.
Since the file based CacheDependency does not detect changes to files on the Azure blob storage, how can we make the web app detect changes and remove the stale cache file from the local cache?
I am thinking that a WebJob or Azure Function with a blob trigger will solve the file-monitoring part, but how do I remove the file from the System.Web cache of the web app? I am also concerned about excessive resource consumption; there are thousands of files.
Has anyone run into this yet, and if so, what was your solution?
I had an issue like that.
The solution was to create a new endpoint in the web app that just clears the cache. We then built a WebJob with a blob storage trigger; when the trigger fires, the WebJob calls the new endpoint with a POST, and the cache reads the new data.
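A minimal sketch of that approach (assuming an in-process blob-triggered function; the "xml-files" container and the /api/cache/clear endpoint are made up for the example):

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class CacheInvalidator
{
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("CacheInvalidator")]
    public async Task Run(
        [BlobTrigger("xml-files/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        // Tell the web app which cached XML file is now stale.
        var response = await Http.PostAsync(
            $"https://mywebapp.azurewebsites.net/api/cache/clear?file={Uri.EscapeDataString(name)}",
            content: null);
        log.LogInformation($"Cache clear for {name}: {response.StatusCode}");
    }
}

On the web app side, the endpoint only needs to call HttpRuntime.Cache.Remove with the key for that file, so a single stale entry is evicted rather than the whole cache.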
I have been working on an ASP.NET application which populates its UI from an Azure storage account.
It makes development extremely tedious when, for a small change in the UI code, I need to wait for all data to reload from Azure storage (which is extremely slow unless PartitionKey and RowKey are provided properly).
I wish to synchronize all data in the cloud storage account to local development storage.
I am sure I am not the first one to face this problem. How does the community handle this scenario? Is there any tool which can copy all data, including blobs and tables, to a local development account?
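One hedged sketch of the blob half with the Azure.Storage.Blobs SDK (the container name and connection strings are illustrative; tables would need a similar loop with the tables SDK):

using Azure.Storage.Blobs;

var source = new BlobContainerClient("<cloud-connection-string>", "mycontainer");
var dest = new BlobContainerClient("UseDevelopmentStorage=true", "mycontainer");
await dest.CreateIfNotExistsAsync();

// Copy every blob in the container down to the local development account.
await foreach (var item in source.GetBlobsAsync())
{
    using var stream = await source.GetBlobClient(item.Name).OpenReadAsync();
    await dest.GetBlobClient(item.Name).UploadAsync(stream, overwrite: true);
}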
I'm training with the Azure environment and I have some trouble with the CloudDrive object.
I have mounted a drive on a blob (on the Storage Emulator), but I can't see it with a GUI such as CloudBerry or Azure Storage Explorer.
If I understand the topic client-side-accessing-windows-azure-drive properly, the blob which stores the drive data looks like any other blob?
So if the drive is really created, can I see it with CloudBerry?
Another linked question:
Do you know a GUI which can upload page blobs (and not block blobs)?
To create my CloudDrive I used this web site: http://archive.loicrebours.fr/index.php/2012/01/29/azure-storage-drive-55/
(but it's a French web page).
When using the storage emulator, the cloud drive is simulated using local storage on your disk. To see its contents, you can open the storage emulator and choose to view cloud drive contents (which opens an explorer window to the correct temporary directory). See this article for more details.
Note: The Windows Azure Drives lab is also in the Windows Azure Training Kit.
The cloud drive is simply a VHD stored in your storage account as a page blob. Like David explains, if you're working in the emulator the cloud drive is simulated. But if you run the application in Windows Azure, the VHD file will be present in your storage account (I assume you'll be able to see it in CloudBerry, but I don't know if it supports page blobs).
I don't know if there's a GUI allowing you to upload page blobs, but there's a console application that allows you to upload VHD files as page blobs like this:
vhdupload.exe input-file http://accountname.blob.core.windows.net/container/blobname key.txt
ClumsyLeaf CloudXplorer uploads page blobs nicely, so it's a good way to get your VHD into blob storage to use as a Drive.