I need to figure out a way to let my users download several PDF files (sometimes thousands) from Azure Blob Storage. I know that I can download the files in parallel, and that would make things quicker, but the issue here is that the user could have thousands of PDF files to download, and that isn't reasonable at all.
Also, I can't download the files to another server, zip them, and let the user download them from there, as that would be incredibly inefficient for me.
Is there a way to create a zip of the files and let the user download that (other than the way above)? I saw other questions on this topic but none gave an answer/solution that suits my needs.
What would be the absolute best way to do this? Or is there no other way to perform this task?
Thank you in advance.
Since no one gave an answer, and I see more posts about this on Stack Overflow and other sites, I decided to share my solution here (I can't share code, because reasons...).
Firstly, as of today, 04-09-2020, there is still no support for bulk download from Azure Blob Storage as a zip (or any other format) directly from Azure to the client, without routing the download through a server that does the organizing and zipping.
The problem I had...
I needed to download (several) files from Azure Blob Storage, zip them (maybe organizing them by folders), and prompt the client to download them in bulk, without any download data passing through the server and without filling the client's downloads folder with scattered files...
During my research I thought about doing everything on the client's side in JavaScript, in memory, and letting the client download it, but that could be quite memory-expensive since my downloads could be in the GB range.
The solution...
Then I came across a JavaScript library called StreamSaver. This library writes the files with streams, directly onto the client's machine, meaning the memory cost is much lower.
By luck, this library also allows organizing the files inside the 'download directory' that will be prompted to the user, and even lets me zip that directory before asking the user whether they want to download it, meaning that this one library solved almost all my problems.
Now I only have a webmethod, called from JavaScript, that returns all the Azure SAS URLs to download from; the rest happens in JavaScript on the client.
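For reference, the server side can be as small as one page method handing out short-lived, read-only SAS links. Here's a minimal sketch (not the real code, which I can't share): it assumes the Azure.Storage.Blobs SDK, and the method name, container name, connection string, and one-hour expiry are all made up.

```csharp
using System;
using System.Collections.Generic;
using System.Web.Services;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public partial class Downloads : System.Web.UI.Page
{
    // Hypothetical page method: returns one read-only SAS URL per requested blob.
    [WebMethod]
    public static List<string> GetDownloadUrls(List<string> blobNames)
    {
        var service = new BlobServiceClient("<storage-connection-string>");
        var container = service.GetBlobContainerClient("pdfs"); // assumed container name

        var urls = new List<string>();
        foreach (var name in blobNames)
        {
            BlobClient blob = container.GetBlobClient(name);
            // Short-lived, read-only link so the client can fetch the blob directly.
            Uri sasUri = blob.GenerateSasUri(BlobSasPermissions.Read,
                                             DateTimeOffset.UtcNow.AddHours(1));
            urls.Add(sasUri.ToString());
        }
        return urls;
    }
}
```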
TL;DR:
Used the StreamSaver JavaScript library to download, organize, and zip all the files on the client side and then prompt the user to download the result, using only a webmethod to get all the URLs which are to be downloaded.
This solution works (from what I've tested) in at least these browsers:
Chrome;
Firefox;
Opera;
Edge (Chromium)
Problems I came across using the StreamSaver Library...
There are a few drawbacks/problems with the library:
1st: Safari doesn't support it! More info about this here.
2nd: StreamSaver only allows zipping files smaller than 4GB; this can be worked around by using yet another library for zipping...
Related
I have an ASP.NET website that stores large numbers of files such as videos. I want an easy way to allow the user to download all the files in a single package. I was thinking about creating ZIP files dynamically.
All the examples I have seen involve creating the file before it is downloaded, but potentially terabytes of information will be downloaded, and therefore the user would face a long wait. Apparently ZIP files store all the information regarding the archive's contents at the end of the file.
My idea is to dynamically create the file as it is downloaded. That way I could let the user click download, the download would start immediately, and nothing would need to be pre-packaged on the server, since the data would be copied over uncompressed, sequentially. The final part of the file would contain the information about the contents that have been downloaded.
Has anyone had any experience with this? Does anyone know a better way of doing it? At the moment I can't see any pre-made utilities for this, but I believe it will work. If none exist, then I'm thinking I will have to read the ZIP file format specification and write my own code... something that will take more time than I was intending to spend on this.
https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT
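For what it's worth, .NET's ZipArchive already behaves this way when writing in Create mode to a non-seekable stream: entries go out sequentially, and the central directory is only written at the very end, when the archive is disposed. A rough sketch of an uncompressed streaming handler built on that (the source folder and download file name are placeholders):

```csharp
using System.IO;
using System.IO.Compression;
using System.Web;

public class ZipDownloadHandler : IHttpHandler
{
    public bool IsReusable => false;

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/zip";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=files.zip");
        context.Response.BufferOutput = false; // stream as we go instead of buffering

        // In Create mode on the (non-seekable) response stream, entries are emitted
        // sequentially and the central directory is written last, on dispose.
        using (var zip = new ZipArchive(context.Response.OutputStream,
                                        ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var path in Directory.EnumerateFiles(@"D:\videos")) // assumed source folder
            {
                // NoCompression ~ "copy things over uncompressed sequentially".
                var entry = zip.CreateEntry(Path.GetFileName(path),
                                            CompressionLevel.NoCompression);
                using (var entryStream = entry.Open())
                using (var fileStream = File.OpenRead(path))
                {
                    fileStream.CopyTo(entryStream);
                }
            }
        }
    }
}
```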
I am trying to make an app that makes use of open data.
The data I'm trying to read is in CSV format (and is about 40MB).
I have two problems I can't solve.
First, I'm having difficulty reading the file from the web.
I already read on MSDN how to read files asynchronously, but it's all about local files. I want to make a list of objects; each line (except the first) contains all the properties for one object.
Secondly, once I've managed to read the file, is there a way to save its data and read it back the next time? Because 40MB is pretty big to re-download every time you open the app, and it takes a lot of time.
I was wondering whether it is possible that, when I read the file from the web again, it only reads and adds the new lines.
I am a newbie to UWP (C#) applications, so my apologies for the questions.
Thanks in advance.
There are two APIs you can use to download a file. One is HttpClient, described here in the MSDN documentation and in a UWP sample here. This class is usually recommended for smaller files and smaller amounts of data, but it can easily handle larger files as well. Its disadvantage is that when the user closes the app, the file will stop downloading.
The alternative is BackgroundDownloader, again here on MSDN and here in the UWP samples. This class is usually recommended for downloading larger files and data, as it performs the download in the background, so the download will continue even when the app is closed.
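A minimal sketch of BackgroundDownloader, assuming a placeholder URL and file name, and saving into the app's local folder described below:

```csharp
using System;
using System.Threading.Tasks;
using Windows.Networking.BackgroundTransfer;
using Windows.Storage;

// Sketch: start a background download of the CSV into the app's local folder.
private async Task StartCsvDownloadAsync()
{
    StorageFile destination = await ApplicationData.Current.LocalFolder
        .CreateFileAsync("opendata.csv", CreationCollisionOption.ReplaceExisting);

    var downloader = new BackgroundDownloader();
    DownloadOperation download = downloader.CreateDownload(
        new Uri("http://example.com/opendata.csv"), destination); // placeholder URL

    // The transfer is handled by the OS and survives the app being closed.
    await download.StartAsync();
}
```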
To store your files, you can use ApplicationData.Current.LocalFolder. This is a special folder provided to you by the system for storing application files. You have read/write access to this folder, and you can not only store your files here but even create a subfolder structure using the UWP StorageFile and StorageFolder APIs. More about this is on MSDN.
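Putting the pieces together, here is a sketch of a download-once-then-cache helper for your CSV; the file name and URL are placeholders:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Windows.Storage;

// Sketch: fetch the CSV once, cache it in LocalFolder, and reuse the cached
// copy on later runs.
private async Task<string> GetCsvAsync()
{
    StorageFolder folder = ApplicationData.Current.LocalFolder;

    // Reuse the cached copy if we already downloaded it on a previous run.
    if (await folder.TryGetItemAsync("opendata.csv") is StorageFile cached)
    {
        return await FileIO.ReadTextAsync(cached);
    }

    // First run: download with HttpClient and cache the text to disk.
    using (var client = new HttpClient())
    {
        string csv = await client.GetStringAsync("http://example.com/opendata.csv");
        StorageFile file = await folder.CreateFileAsync(
            "opendata.csv", CreationCollisionOption.ReplaceExisting);
        await FileIO.WriteTextAsync(file, csv);
        return csv;
    }
}
```

On later runs this returns the cached copy immediately; reading only the new lines would additionally require comparing the cached text with a fresh download (or a server that supports range requests), which this sketch doesn't attempt.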
For the past 3 days I've been trying to create an upload system for multiple, possibly large, files, with progress bars.
I've been roaming the web relentlessly for the past few days, and I can say I am now familiar with most of the difficulties.
Sadly, none of the solutions I've found online are written in C# or VBScript; in fact, most of them are written in PHP.
I wouldn't mind switching to another language, but the entire website is written in VB.NET, and for the sake of coherence I thought it might be best to stick with it.
File uploads:
Problem 1 - progress bar:
I understand file uploads will not work with AJAX, since the AJAX response will only occur after the file has completed its upload.
I understand there is a solution using iframes, but I cannot seem to find any online examples (preferably using VB.NET or C#).
I understand there is another alternative using Flash. How?
I also understand people are mostly against using iframes, but I can't find what the reason might be.
Problem 2 - Multiple Files:
I can have multiple-file support with HTML5. Great, but IE doesn't support it? Well... IE users will just have to upload one file at a time.
Problem 3 - Large files:
How?
I heard something about chunking and blobs, but these are still just random gibberish words to me. Can somebody explain the meaning and the implementation?
References to reading material are much appreciated, even though, if it's on the web, I've probably already read it in my search for a solution.
#DevlshOne has a decent thread with some good information.
Here are the three basic requirements for what I did:
Create a Silverlight app for client-side access and upload control. (Use the app of your choice.)
Create an HttpHandler to receive the data in chunks and manage requests.
Create the database backend to handle the files.
Silverlight worked well because I was already in VB (ASP.NET). When used in-browser, as opposed to out-of-browser, the ASP.NET session was shared with Silverlight, so there was no need for additional security/login measures. Silverlight also allowed me to limit which file types could be selected and to let the user select multiple files from the same folder.
The Silverlight app grabs the files selected by the user, displays them for editing of certain properties, and then begins the upload when the user clicks the 'upload' button. This sets off a number of threads that each upload chunks of data to the HttpHandler. The HttpHandler and the Silverlight app send and receive in chunks, with the HttpHandler always sending an OK or ERROR message once the request for an uploaded chunk has been processed.
Our specific implementation of file uploading also required some database properties (fields) to be filled out by the user, so we also had inputs for those properties and uploaded them to the server with the file data.
An in-browser Silverlight app can also have parameters passed into it through the html, so I do this with settings like 'max chunk size' or 'max thread count'. I can change the setting in the database and have it apply to all users.
The database backend is basically a few stored procedures (insert your data management preference here) that control the flow of the logic. One table holds completed files (no file data), and a second holds the temp files that are in progress of being uploaded. One stored procedure initiates a new file record in the temp table and processes additional chunk uploads, and another controls the migration of the completely uploaded file from the temp table to the completed table. (A piece of VB code in the HttpHandler migrates the actual binary file data from the temp table to a physical file.)
This seems pretty complex, but the most difficult part is the interaction with the handler and passing the chunks around (responses/requests, uploading successive chunks, etc.). I've left out a lot of information, but this is the basic implementation.
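To give a rough idea of the handler side, here is a bare-bones sketch of a chunk receiver. The query parameters and the append-to-a-temp-file approach are illustrative assumptions only; as described above, our real implementation staged the chunk data in a temp table instead.

```csharp
using System.IO;
using System.Web;

// Sketch of a chunk-receiving HttpHandler. The client uploads chunks one after
// another and waits for OK/ERROR before sending the next one.
public class UploadChunkHandler : IHttpHandler
{
    public bool IsReusable => true;

    public void ProcessRequest(HttpContext context)
    {
        try
        {
            // GetFileName strips any path segments a malicious client might send.
            string fileId = Path.GetFileName(context.Request.QueryString["fileId"] ?? "");
            string tempPath = Path.Combine(@"D:\uploads\temp", fileId + ".part");

            // Append this chunk's bytes to the in-progress file.
            using (var target = new FileStream(tempPath, FileMode.Append, FileAccess.Write))
            {
                context.Request.InputStream.CopyTo(target);
            }

            context.Response.Write("OK");     // client proceeds to the next chunk
        }
        catch
        {
            context.Response.Write("ERROR");  // client can retry this chunk
        }
    }
}
```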
I want to upload 30GB files with the ASP.NET file upload control. I have heard that FTP, or some advanced uploader, can do this. I searched but did not find any suitable code or an open-source plugin for ASP.NET. Do you know of a library, or what's the right way to do this? I am confused.
I am in search of an ASP.NET file upload that could handle a large file, e.g. 30GB, with some mechanism like FTP or some other way of resuming. So is there any plugin which can do this job?
I don't think you can do this. Many browsers have an upload limit of ~2GB. Think about a different solution than HTTP POST, e.g. direct FTP upload.
Here is a pretty good writeup of the problem
http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
In my own experience working with gigabyte uploads in .NET several years ago, it is not easy with the common controls and infrastructure. You will be fighting HTTP timeouts and will have to adjust web.config a bit to allow for the file size.
What has to happen to make it work is some form of chunking: you divide the file up into much smaller pieces and then attempt to upload each one, keeping track of which pieces you have received and which you have not.
A better/easier solution is to add some RIA functionality to your application so you can handle the upload in a richer client such as Silverlight or Flash.
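To make the chunking idea concrete, here is a rough sketch of the client side; the endpoint and its query parameters are invented for illustration:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: split a large file into 4 MB pieces and POST them one at a time.
// The upload.ashx endpoint and its parameters are hypothetical.
static async Task UploadInChunksAsync(string filePath)
{
    const int chunkSize = 4 * 1024 * 1024;
    using (var client = new HttpClient())
    using (var file = File.OpenRead(filePath))
    {
        var buffer = new byte[chunkSize];
        int bytesRead, index = 0;
        while ((bytesRead = file.Read(buffer, 0, buffer.Length)) > 0)
        {
            var content = new ByteArrayContent(buffer, 0, bytesRead);
            var url = $"http://example.com/upload.ashx?fileId=myfile&chunkIndex={index++}";
            HttpResponseMessage response = await client.PostAsync(url, content);
            response.EnsureSuccessStatusCode(); // track/retry failed pieces here
        }
    }
}
```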
You could try NeatUpload http://neatupload.codeplex.com/. I last used it for uploading files as large as 10MB without problems. I've never tried it with multi-GB files, though.
The control requires full trust.
I think the best approach would be the WebDAV server engine: http://www.webdavsystem.com/server/documentation/large_files_iis_asp_net
I am designing a website in ASP.NET that will upload video files. The question I have is: video files can get very large (e.g. 3GB), and I read that increasing the maxRequestLength in the web.config file gives hackers the chance to attack the server with large requests.
I already know about client-side validation to protect against malicious files that are not the intended files, so that's not a concern at the moment. My question is whether the file-upload method is the right approach for uploading video files. If not, is there a better approach?
For uploading big files in ASP.NET, I used Brettle.Web.NeatUpload.
You can get it at http://neatupload.codeplex.com/
I hope it is useful for you.
I use http://www.plupload.com/ combined with chunked uploads inside an ashx handler. On the server I push the parts to Amazon S3, so the server never has the full file in memory. Works great for me.
The reason to be concerned about this issue is that the built-in functionality within the .NET framework for handling file uploads to IIS in ASP.NET caches the entire upload in memory before streaming the file out to disk. Hence, if you allow very large file uploads, you run the risk of allowing someone to perform a denial-of-service attack on your IIS server, because all it takes is a blast of several very large simultaneous uploads to exhaust the server's available physical memory. The answer is either to write your own upload handler that does not cache the entire upload in memory, or to use one of the many available software components that do this for you. The other two answers point to a couple. Here's another example component I found for ASP.NET:
http://www.easyalgo.com/eaupload.aspx
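If you do go the write-your-own-handler route, the core idea is to read the request body in small buffers and write it straight to disk, so the whole file is never held in memory. A sketch, assuming .NET 4+ (where GetBufferlessInputStream avoids ASP.NET's pre-buffering) and a placeholder destination folder:

```csharp
using System.IO;
using System.Web;

// Sketch of an upload handler that never holds the whole file in memory:
// the request body is copied to disk in small buffered reads.
public class StreamingUploadHandler : IHttpHandler
{
    public bool IsReusable => true;

    public void ProcessRequest(HttpContext context)
    {
        // Read the raw request body without ASP.NET buffering it first.
        Stream input = context.Request.GetBufferlessInputStream();

        string savePath = Path.Combine(@"D:\uploads", Path.GetRandomFileName());
        var buffer = new byte[64 * 1024]; // 64 KB at a time
        int read;
        using (var output = File.Create(savePath))
        {
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
        }
        context.Response.Write("OK");
    }
}
```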