Force file download for IE9 on Azure Blob Storage - c#

I am trying to use Azure Blob Storage as a location for secure file downloads using a Shared Access Signature. Everything is working well, except for one problem: I am trying to let the user save files from the browser, and I have all browsers working except IE9.
Reviewing this question,
What content type to force download of text response?
the approach works well when I can control all of the headers. In Azure Blob Storage, however, I have set the Content-Type to application/octet-stream, which makes every browser except IE ask the user to save the file; IE simply opens it. It appears that known file types (e.g. .jpg, .wmv) always open inline.
In Azure, I have found no way to set
Content-Disposition: attachment;filename="My Text File.txt"
Is there a way to make IE download any file directly from Azure Blob Storage?
Thanks in advance.

I don't think there's a way to set the Content-Disposition header in Windows Azure blob storage.

You can now, by right-clicking the resource and selecting Properties (for example in Azure Storage Explorer).
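For anyone finding this later: the storage service later added Content-Disposition support, both as a stored blob property and as a per-request override carried by the SAS itself. A minimal sketch with the Azure.Storage.Blobs (v12) SDK, assuming hypothetical account, container, and file names:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

class ForceDownload
{
    static void Main()
    {
        // The connection string must include the account key so the
        // client can sign the SAS (all values here are placeholders).
        var blob = new BlobClient(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<key>;EndpointSuffix=core.windows.net",
            "mycontainer",
            "My Text File.txt");

        var sas = new BlobSasBuilder
        {
            BlobContainerName = "mycontainer",
            BlobName = "My Text File.txt",
            Resource = "b", // blob-level SAS
            ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15),
            // Applied only to responses served with this SAS; the
            // stored blob properties are left untouched.
            ContentDisposition = "attachment; filename=\"My Text File.txt\"",
            ContentType = "application/octet-stream",
        };
        sas.SetPermissions(BlobSasPermissions.Read);

        Console.WriteLine(blob.GenerateSasUri(sas));
    }
}

Because the override rides on the SAS, different links to the same blob can force different download filenames without touching the blob itself.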

Related

Using an Azure SAS URL to download all blob files at once from Azure Blob Storage into the browser

Please can you help me understand: is it possible to use an Azure SAS URL to download all blob files at once from Azure Blob Storage into the browser?
I mean, can the user click a generated SAS URL and have it download all the files into the browser as a folder?
The problem is that I sometimes have very large files; I zip them and want to put them into Azure Blob Storage and give the user a URL to click and download everything as one ZIP folder, not file by file.
Or if you have another suggestion, please let me know; if there is a code-level solution, I can do it in C#.
Any help will be appreciated.
I can't download the Azure blob using the SAS URL. I'm new to this space, so if you have another solution or another approach, please let me know. Thank you :)
You could do this in two steps:
First, run a program that takes all the files in a blob storage container and places them in a zip file, then store that zip file in another container.
Then give the user a SAS URL to that file, as in the sketch below.
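A minimal sketch of that two-step flow with the Azure.Storage.Blobs SDK; the container names are hypothetical, the connection string must carry the account key so the SAS can be signed, and the archive is built in memory for brevity:

using System;
using System.IO;
using System.IO.Compression;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

class ZipAndShare
{
    static void Main()
    {
        var service = new BlobServiceClient("<connection string with AccountKey>");
        var source = service.GetBlobContainerClient("files");
        var dest = service.GetBlobContainerClient("archives");
        dest.CreateIfNotExists();

        using var zipStream = new MemoryStream();
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
        {
            // Step 1: copy every blob in the source container into the zip.
            foreach (var item in source.GetBlobs())
            {
                var entry = archive.CreateEntry(item.Name, CompressionLevel.Optimal);
                using var entryStream = entry.Open();
                source.GetBlobClient(item.Name).DownloadTo(entryStream);
            }
        }

        zipStream.Position = 0;
        var zipBlob = dest.GetBlobClient("files.zip");
        zipBlob.Upload(zipStream, overwrite: true);

        // Step 2: a read-only SAS URL, valid for one hour, for the user.
        Uri sasUri = zipBlob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1));
        Console.WriteLine(sasUri);
    }
}

For very large files you would stream the archive to a temporary file or upload it in blocks instead of buffering the whole zip in memory.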

Upload File from local computer to server c#

Is there any way to upload a PDF file that is on my local computer in a way where anyone can retrieve the PDF?
If I do something like
var filename = @"c:\test.pdf";
m.Attachments.Add(new Attachment(filename));
Anyone who goes onto the website would have to have the same local path and the same test.pdf file in the correct place (which would be impossible). So is there a way to upload the PDF so it's not hard-coded to a local path?
There are many ways to upload a file, e.g. using an HTTP request, and then download it again using another HTTP request. If you only need to download files (no other API endpoints), consider using blob storage such as an AWS S3 bucket, Azure Blob Storage, or the equivalent from another provider directly, and configure the access rules so that public read access is possible. To download the file in your program you then just need an instance of HttpClient and a request to your blob store.
If you want to upload files through ASP.NET, have a look at the Microsoft docs, which describe everything to be considered in detail. You will still need to store the file somewhere when using this approach: either on the disk of the server you run the application on, or again in a blob store (and there are of course other possibilities). A rough sketch of such an endpoint is shown below.
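To make that concrete, here is a hedged sketch of an ASP.NET Core controller that accepts an upload and serves it back; the storage folder, route, and size limit are illustrative assumptions:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/files")]
public class FilesController : ControllerBase
{
    private readonly string _storageRoot = Path.Combine(AppContext.BaseDirectory, "uploads");

    [HttpPost]
    [RequestSizeLimit(10 * 1024 * 1024)] // cap uploads at 10 MB
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file is null || file.Length == 0)
            return BadRequest("No file provided.");

        Directory.CreateDirectory(_storageRoot);
        // Never trust the client-supplied name for the path on disk.
        var safeName = Path.GetRandomFileName() + Path.GetExtension(file.FileName);
        var fullPath = Path.Combine(_storageRoot, safeName);

        await using var stream = System.IO.File.Create(fullPath);
        await file.CopyToAsync(stream);

        return Ok(new { name = safeName });
    }

    [HttpGet("{name}")]
    public IActionResult Download(string name)
    {
        var fullPath = Path.Combine(_storageRoot, Path.GetFileName(name));
        if (!System.IO.File.Exists(fullPath))
            return NotFound();
        return PhysicalFile(fullPath, "application/pdf", name);
    }
}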

Generating PDF file with PuppeteerSharp from webpage containing images linking to Azure Blob storage

I have a web page that contains href tags pointing to pictures stored in Azure Blob storage. The Azure container is private, and the link used to access each image is generated with an Azure SAS token. The format of an href link is similar to https://myblob.blob.core.windows.net/mycontainer/myfolder%2Fmyfile.jpeg?sv=2019-12-12&st=2020-10-13T18%3A52%3A48Z&se=2020-10-13T18%3A58%3A48Z&sr=b&sp=r&sig=P5JRdwKa4GkbIFF55sWywOe4vnPnWOCoSf29UHYmNPA%3D
When generating the PDF with PuppeteerSharp using WaitUntilNavigation.Networkidle0, I did not succeed in retrieving the images.
I also tested each generated secured SAS link and they work without problem. I also replaced each href link with a base64 data-encoded image, and that works great.
I tested PDF generation using an online Puppeteer service based on Node.js (https://try-puppeteer.appspot.com/) and it works like a charm, so there seems to be an issue with the PuppeteerSharp version (v2.0.4).
Any idea on what could be the issue?
After struggling with the issue for several hours, we finally located it. It is not related to Puppeteer, which works like a charm, but to the way a private blob storage container handles authentication: because our request contained an Authorization HTTP header with a bearer token required by our own app, Chromium also sent this header while retrieving the remote images from the blob container.
Unfortunately, the Azure service tried to validate that token and rejected our request.
How did we identify that? By connecting a Chrome debugger to the Chromium instance and checking the logs; it is indeed possible to launch Puppeteer with a remote debugging port.
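A hedged sketch of both parts with PuppeteerSharp v2: exposing a remote debugging port, and stripping the app's Authorization header from blob storage requests via request interception (the page URL and host check are illustrative assumptions):

using System.Collections.Generic;
using System.Threading.Tasks;
using PuppeteerSharp;

class PdfGenerator
{
    static async Task Main()
    {
        // Download a compatible Chromium build first.
        await new BrowserFetcher().DownloadAsync(BrowserFetcher.DefaultRevision);

        using var browser = await Puppeteer.LaunchAsync(new LaunchOptions
        {
            Headless = true,
            // Attach Chrome DevTools to this port to inspect what
            // Chromium actually sends over the wire.
            Args = new[] { "--remote-debugging-port=9222" },
        });
        using var page = await browser.NewPageAsync();

        await page.SetRequestInterceptionAsync(true);
        page.Request += async (sender, e) =>
        {
            var headers = new Dictionary<string, string>(e.Request.Headers);
            // The SAS token in the URL already authenticates the image;
            // a bearer token on top makes Azure reject the request.
            if (e.Request.Url.Contains(".blob.core.windows.net"))
                headers.Remove("Authorization");
            await e.Request.ContinueAsync(new Payload { Headers = headers });
        };

        await page.GoToAsync("https://myapp.example.com/report",
            new NavigationOptions { WaitUntil = new[] { WaitUntilNavigation.Networkidle0 } });
        await page.PdfAsync("report.pdf");
    }
}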

Azure storage files force download to browser

I have my files stored in Azure Files, and here are the requirements -
Users should be able to view the documents without downloading them to their local machine.
This works fine for PDFs but not for any other MIME type.
I tried setting Content-Type and Content-Disposition (in the Azure file properties, but no luck) and also an iframe.
Users should be able to edit the document online without downloading it.
I don't think this is possible with Azure alone and it may have to be integrated with OneDrive? Correct me if I am wrong.
I would really appreciate any inputs/thoughts.
Not sure if this is a viable option, but using storage accounts in Azure you can map these accounts as network drives on any client machine, so users would be able to access the files via File Explorer.
This link covers the basic steps in setting it up.
Unfortunately, anyone who wishes to use this feature needs to be on Windows 8 (or above) to map a network drive successfully, as it uses SMB 3.0. The mapping command is shown below.
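For reference, that setup boils down to a single command (the account name, share name, and key here are placeholders):

net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:Azure\mystorageaccount <storage-account-key>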
If this option is a no go I will delete the post.

How to properly upload video files to Azure Media Services from angularjs

For my scenario, our current app is being coded in HTML5 and AngularJS, communicating with a Web API. I have a workflow scenario that I cannot seem to find an end-to-end example for: I would like to allow users of my website to upload videos and images to Azure Media Services. I found several examples that seem to move the data from a web page to blob storage and then copy it over to Azure Media Services.
Is there a way to upload the file directly to Media Services instead of having a temporary and a permanent blob container (one tied to AMS), since this approach seems to force me to have an additional storage container? Or is there a way to move the file to blob storage and then link the blob to AMS via IAssetFile?
Can someone provide an end-to-end example that demonstrates the flow from web frontend upload to the file ending up in AMS?
Once it is up there, is there a way to make sure users can view but not download videos?
1. Is there a way to upload the file directly to Media Services?
The Media Services SDK requires you to first create an Asset object in our system. That Asset object is backed by a container in Storage. You can create an empty Asset object and request a write-only SAS URL (we call them "Locators" in our API) to upload your content into directly. You may want to check out this AngularJS module and see if it works: http://ngmodules.org/modules/angular-azure-blob-upload
2. Can someone provide an end-to-end example that demonstrates the flow from web frontend upload to the file ending up in AMS?
Your Web API/frontend should use the Media Services SDK to create the empty Asset first. Once it is created, create a write-only SAS URL and hand it back to your Angular client. The Angular client can then use a client-side JavaScript library to upload directly to the blob/container using the SAS URL, with an azure-blob-upload module like the one here: http://ngmodules.org/modules/angular-azure-blob-upload
3. Once up there, is there a way to make sure users can view but not download videos?
Once the video is uploaded, you should delete the write-only SAS "Locator" from your Asset's Locators collection. That way, nobody can use it to write again.
At this point you can create a streaming Locator, so users will only be able to stream the file through our streaming services. Your file has to be encoded in a format we can support streaming for, so you may first have to kick off an encoding job to get it into the right format and encoding settings (MP4 files with H.264 video and AAC audio). If you want to stream from Media Services, you need to make sure you have at least one streaming reserved unit enabled on your account. In addition, if you are looking to protect your files, you can take a look at our Content Protection services, which provide on-the-fly AES-128 or PlayReady DRM encryption for your assets. You can integrate that with JWT tokens and Active Directory to authenticate/authorize your users before they are able to decrypt the video on the client side. The server-side flow is sketched below.
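A hedged sketch of that server-side flow with the classic AMS v2 .NET SDK (Microsoft.WindowsAzure.MediaServices.Client); the credentials, asset name, and file name are placeholders:

using System;
using Microsoft.WindowsAzure.MediaServices.Client;

class AmsUploadFlow
{
    static void Main()
    {
        var context = new CloudMediaContext(
            new MediaServicesCredentials("myAmsAccount", "myAmsAccountKey"));

        // 1. Create the empty Asset and register the file name.
        IAsset asset = context.Assets.Create("uploaded-video", AssetCreationOptions.None);
        IAssetFile assetFile = asset.AssetFiles.Create("video.mp4");

        // 2. Write-only SAS locator the Angular client uploads against.
        IAccessPolicy writePolicy = context.AccessPolicies.Create(
            "upload-policy", TimeSpan.FromHours(2), AccessPermissions.Write);
        ILocator sasLocator = context.Locators.CreateLocator(
            LocatorType.Sas, asset, writePolicy);

        // Insert the file name before the SAS query string and hand the
        // resulting URL to the browser-side blob-upload module.
        var uploadUrl = new UriBuilder(sasLocator.Path);
        uploadUrl.Path += "/" + assetFile.Name;
        Console.WriteLine(uploadUrl.Uri);

        // 3. After the upload completes, revoke write access and publish
        // a streaming locator instead (encode the asset first if needed).
        sasLocator.Delete();
        IAccessPolicy readPolicy = context.AccessPolicies.Create(
            "stream-policy", TimeSpan.FromDays(30), AccessPermissions.Read);
        ILocator streamingLocator = context.Locators.CreateLocator(
            LocatorType.OnDemandOrigin, asset, readPolicy);
        Console.WriteLine(streamingLocator.Path);
    }
}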
I'm not too familiar with Azure Media Services but after looking at this guide http://azure.microsoft.com/en-us/documentation/articles/media-services-rest-get-started/ it appears to me that you can create an asset on Azure Media Services and link it to a blob. This means you'll only have one blob container.
AMS provides a REST API for all media processing capabilities, including uploading, encoding, and publishing. There is a sample project (a Postman collection) on GitHub to play around with, accompanied by a well-written article. Please find the links below.
https://github.com/Azure-Samples/media-services-v3-rest-postman
https://learn.microsoft.com/en-us/azure/media-services/latest/stream-files-tutorial-with-rest
Hope this helps.
