How do I get a local path from a user's PC into an Azure Function? - c#

How can I pass a local path to Azure?
I'm trying to perform an HTTP call against an Azure Function App (on the Azure Portal), and I need to pass a path from my PC so that a local image can be uploaded.
The request body looks something like this:
{
  "url": "C:/Users/User/Pictures/example.jpg"
}
I searched and found a solution that suggests running the function locally, but I want it to run on the Azure Portal.
I also found an answer saying that when a function runs on the Azure Portal, its working directory becomes D:\home\site\wwwroot> and that I could put the image in that location; but no such path exists on my PC.
Thanks a lot!!!

You need to use Azure Blob Storage to store the files your function works with. On the Consumption plan you can't rely on the function seeing the same local files on every execution, so uploaded images should go to Blob Storage. An example:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-upload-process-images?tabs=dotnet
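Rather than sending a path from the user's machine (which the function can never reach), the client sends the image bytes themselves and the function writes them to Blob Storage. A minimal sketch, assuming the in-process Functions model, the Azure.Storage.Blobs package, the AzureWebJobsStorage connection string, and a container named "images":

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class UploadImage
{
    [FunctionName("UploadImage")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // The caller sends the file content in the request body and the
        // file name as a query parameter, e.g. ?name=example.jpg.
        string name = req.Query["name"];

        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"),
            "images");
        await container.CreateIfNotExistsAsync();

        // Stream the request body straight into the blob.
        await container.GetBlobClient(name).UploadAsync(req.Body, overwrite: true);
        return new OkObjectResult($"Stored {name} in blob storage.");
    }
}
```

The client would then POST the image file as the body (for example with HttpClient or curl), instead of posting a path.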

Related

Understanding steps of the Azure function blob template in Visual Studio

I am creating my first Azure Function and I want it to run when a blob is uploaded, so in Visual Studio 2022 I selected the Azure Functions template and then the Blob option. There are some steps I don't fully understand:
=> Do I need to select both when I also want to test my function locally with Azurite?
=> The "service dependency" on this screen: does it mean the storage that is observed for newly uploaded files, or the storage that the Azure Function uses for some internal management?
It's the observed container. The function's runtime storage location is chosen on the first wizard screen, where you select the .NET version.
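To make the distinction concrete, here is roughly what the Visual Studio Blob trigger template generates (the container name "samples-workitems" is the template's placeholder): the path in the BlobTrigger attribute is the observed container, while AzureWebJobsStorage, configured by the wizard, is where the runtime keeps its own bookkeeping.

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobTriggerSample
{
    [FunctionName("BlobTriggerSample")]
    public static void Run(
        // Fires whenever a blob lands in "samples-workitems".
        [BlobTrigger("samples-workitems/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Blob trigger fired for {name}, {blob.Length} bytes");
    }
}
```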

What is Use Azurite for runtime storage account (AzureWebJobsStorage)?

I have just started creating Azure Functions. While creating a new function I got the option below. What does it mean, and what happens if I select or unselect it? I haven't found any documentation about it.
It shows this info, but I didn't understand it at all.
When you create an Azure Function, there needs to be an Azure Storage Account behind it, for various runtime needs (https://learn.microsoft.com/en-us/azure/azure-functions/functions-app-settings#azurewebjobsstorage).
For local development, you can choose to either use a real Azure Storage Account in the cloud by setting AzureWebJobsStorage app setting to that storage account's connection string, or you can use a local emulator (Azurite) that simulates a Storage Account on your machine.
So if you select that Azurite box, it will set AzureWebJobsStorage to use your local emulator instead, therefore no need for a storage account in the cloud.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite?tabs=visual-studio
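Concretely, checking the box results in a local.settings.json along these lines, where `UseDevelopmentStorage=true` is the shorthand connection string that points the runtime at the Azurite emulator (the `FUNCTIONS_WORKER_RUNTIME` value shown is for an in-process .NET project):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```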

How can I access a "local" path in an Azure Function?

I have written an Azure Function to create leads in Zoho CRM. I've gotten great guidance from members here and I have one last hurdle to get over. The Zoho API writes to a location it calls the Resource Path, which has to be a local path. For example, running the function in VS, I can use a path to My Documents but not a system path to the temp folder or any other. I've tried several system paths, which all throw an error. The actual error is a generic API error, but Zoho support told me it's an access issue.
string path = context.FunctionAppDirectory;
string path = System.IO.Path.GetTempPath();
The last conversation I had with support on this, they explained it as
However, upon having further discussions based on your questions with my development team, I was told that it is mandatory for the resource path to be a local path (Documents path in our case) and that it is not possible to retrieve a file stored on cloud and use it with the local SDK.
So I THINK my question is: can I create something that looks like a local path to use in my Azure Function? I was reading about mounting an Azure file share as a drive letter (https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows), but it seems to only work with Azure Windows servers rather than Azure Functions.
Any suggestions are appreciated.
Using the temp path is usually the way to go for most, though I'm unsure why that doesn't work with the Zoho API.
An option that you could try is to write data to %HOME% or $HOME on Windows or Linux plans, respectively. If you are on Linux, you could also consider mounting a file share.
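One way to sketch the suggestion above: copy the file out of Blob Storage to a path that *is* local to the function instance, then hand that path to the SDK. This assumes the Azure.Storage.Blobs package; the container and blob names are placeholders, and whether the Zoho SDK accepts the resulting path is exactly the open question in this thread.

```csharp
using System.IO;
using Azure.Storage.Blobs;

public static class LocalPathHelper
{
    public static string DownloadToLocalPath(string connectionString,
                                             string containerName,
                                             string blobName)
    {
        // GetTempPath() resolves to a writable per-instance folder;
        // %HOME% could be substituted on a Windows plan.
        string localPath = Path.Combine(Path.GetTempPath(), blobName);

        var blob = new BlobClient(connectionString, containerName, blobName);
        blob.DownloadTo(localPath);   // the file now exists on local disk

        return localPath;             // candidate value for the Zoho resource path
    }
}
```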
I got on the Microsoft Q&A to ask and was told it wasn't possible in Azure Functions. As you indicate, it would have to be on some kind of server hosting with a drive letter mounted.

How do I detect an azure blob storage file change and delete local cache instance of that file?

I am currently migrating a legacy .NET application from a dedicated server to an Azure Web App. The application uses System.Web.Caching.CacheDependency for XML file caching.
Caching.CacheDependency(xmlFile) normally detects changes made to the file and updates the cache with the latest version.
The issue is that the files are now being stored in an Azure storage account (ie. not the local file system) and I need a way to detect changes made to the files. The Caching.CacheDependency(xmlFile) will not work in this case as it looks for a local path.
Since the file based CacheDependency does not detect changes to files on the Azure blob storage, how can we make the web app detect changes and remove the stale cache file from the local cache?
I am thinking that a webfunction with a blob trigger will solve the file monitoring part but how do I remove the file from the System.Cache of the web app? I am also concerned about excessive resources being consumed. There are thousands of files.
Has anyone run into this yet, and if so, what was your solution?
I had an issue like that.
The solution was to create a new endpoint in the Web App that just clears the cache. We built a WebJob with a blob storage trigger; when the trigger fires, the WebJob calls the new endpoint with a POST, and the cache is then repopulated with the new data.
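A minimal sketch of that approach, using a blob-triggered function in place of the WebJob (the container name, endpoint URL, and cache-key convention are all placeholders):

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class CacheInvalidator
{
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("OnXmlFileChanged")]
    public static async Task Run(
        // Fires once per changed blob, so only stale entries are evicted,
        // even with thousands of files.
        [BlobTrigger("xml-files/{name}")] byte[] content,
        string name)
    {
        // Tell the web app which cache key to drop.
        await Http.PostAsync(
            $"https://myapp.azurewebsites.net/cache/invalidate?key={name}", null);
    }
}

// In the web app, the invalidation endpoint just removes the entry, e.g.:
//   HttpRuntime.Cache.Remove(key);
```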

How do I see the contents of my Azure drive?

I'm training with the Azure environment and I'm having some trouble with the CloudDrive object.
I have mounted a drive on a blob (on the Storage Emulator), but I can't see it with a GUI such as CloudBerry or Azure Storage Explorer.
If I understand this topic correctly (client-side-accessing-windows-azure-drive), the blob that stores the drive data looks like any other blob?
So if the drive is really created, I should be able to see it with CloudBerry?
Another related question:
Do you know of a GUI that can upload page blobs (and not just block blobs)?
To create my CloudDrive I used this web site: http://archive.loicrebours.fr/index.php/2012/01/29/azure-storage-drive-55/
(but it's a French web page).
When using the storage emulator, the cloud drive is simulated using local storage on your disk. To see its contents, you can open the storage emulator and choose to view cloud drive contents (which opens an explorer window to the correct temporary directory). See this article for more details.
Note: The Windows Azure Drives lab is also in the Windows Azure Training Kit.
The cloud drive is simply a VHD stored in your storage account as a page blob. As David explains, if you're working in the emulator the cloud drive is simulated. But if you run the application in Windows Azure, the VHD file will be present in your storage account (I assume you'll be able to see it in CloudBerry, but I don't know if it supports page blobs).
I don't know if there's a GUI allowing you to upload page blobs, but there's a console application that allows you to upload VHD files as page blobs like this:
vhdupload.exe input-file http://accountname.blob.core.windows.net/container/blobname key.txt
ClumsyLeaf CloudXplorer uploads page blobs nicely, so it's a good way to get your VHD into blob storage to use as a Drive.
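For reference, a page blob can also be uploaded programmatically. A sketch with the current SDK (Azure.Storage.Blobs, PageBlobClient), assuming a container named "drives"; note that the Windows Azure Drive feature discussed in this thread has long since been retired:

```csharp
using System.IO;
using Azure.Storage.Blobs.Specialized;

public static class PageBlobUploader
{
    public static void Upload(string connectionString, string vhdPath)
    {
        var blob = new PageBlobClient(connectionString, "drives",
                                      Path.GetFileName(vhdPath));

        using FileStream vhd = File.OpenRead(vhdPath);

        // Page blobs must be created at a fixed size, a multiple of 512 bytes;
        // fixed-size VHD files satisfy this by construction.
        blob.Create(vhd.Length);
        blob.UploadPages(vhd, offset: 0);   // write the whole file from offset 0
    }
}
```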
