FileUpload control in C# for videos

I am designing a website in ASP.NET that will upload video files. The issue I have is that video files can get very large (e.g. 3 GB), and I have read that increasing the maxRequestLength in the web.config file gives hackers a chance to attack the server with very large requests.
I already know about client-side validation to protect against malicious files that are not the intended files, so that's not a concern at the moment. My question is whether the file-upload method is the right approach for uploading video files. If not, is there a better approach?
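For context, this is the kind of web.config change the question refers to. The values below are purely illustrative, not a recommendation; note that maxRequestLength is specified in KB while maxAllowedContentLength (IIS 7+) is in bytes:

```xml
<!-- Illustrative limits for ~3 GB uploads; tune to your own needs. -->
<system.web>
  <!-- maxRequestLength is in KB; executionTimeout is in seconds. -->
  <httpRuntime maxRequestLength="3145728" executionTimeout="3600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes. -->
      <requestLimits maxAllowedContentLength="3221225472" />
    </requestFiltering>
  </security>
</system.webServer>
```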

For uploading big files in ASP.NET, I used "Brettle.Web.NeatUpload".
You can get it at http://neatupload.codeplex.com/
I hope it is useful for you.

I use http://www.plupload.com/ combined with chunked uploads inside an .ashx handler. On the server I push the parts to Amazon S3, so the server never has the full file in memory. Works great for me.
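A minimal sketch of what such a handler can look like, assuming plupload's default chunk, chunks, and name form fields (the S3 hand-off is elided here; this sketch appends to a local temp file, whereas the answer above streams each part to S3):

```csharp
// UploadHandler.ashx -- a rough sketch, not the answerer's actual code.
using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // plupload sends the chunk index and total count as form fields.
        int chunk = int.Parse(context.Request["chunk"] ?? "0");
        int chunks = int.Parse(context.Request["chunks"] ?? "1");
        string name = context.Request["name"] ?? "upload.bin";

        HttpPostedFile file = context.Request.Files[0];
        string tempPath = Path.Combine(context.Server.MapPath("~/App_Data"), name);

        // Create the file on the first chunk, append on the rest.
        using (var fs = new FileStream(tempPath,
                   chunk == 0 ? FileMode.Create : FileMode.Append))
        {
            file.InputStream.CopyTo(fs);
        }

        if (chunk == chunks - 1)
        {
            // Last chunk received: hand the completed file off (e.g. to S3).
        }

        context.Response.Write("OK");
    }

    public bool IsReusable { get { return false; } }
}
```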

The reason to be concerned about this issue is that the built-in functionality in the .NET Framework for handling file uploads to IIS in ASP.NET caches the entire upload in memory before streaming the file out to disk. Hence, if you allow very large file uploads, you run the risk of a denial-of-service attack on your IIS server, because all it takes is a burst of several very large simultaneous uploads to exhaust the server's available physical memory. The answer is to either write your own upload handler that does not cache the entire upload in memory (a sketch follows the link below), or use one of the many available software components that do this for you. The other two answers point to a couple; here's another example component I found for ASP.NET:
http://www.easyalgo.com/eaupload.aspx
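For the "write your own upload handler" route, .NET 4.0 added HttpRequest.GetBufferlessInputStream(), which lets a handler read the request body in small pieces instead of having ASP.NET buffer the whole upload first. A rough sketch (multipart parsing is omitted, so this writes the raw request body; the target path is illustrative):

```csharp
using System.IO;
using System.Web;

public class StreamingUploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Read the body in 64 KB pieces rather than letting ASP.NET
        // cache the entire upload before the handler runs.
        Stream input = context.Request.GetBufferlessInputStream();
        string path = context.Server.MapPath("~/App_Data/upload.tmp");

        byte[] buffer = new byte[64 * 1024];
        using (var fs = new FileStream(path, FileMode.Create))
        {
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, read);
            }
        }

        context.Response.Write("OK");
    }

    public bool IsReusable { get { return false; } }
}
```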

Related

How to add images in db?

I am currently working on an intern project, where I am required to add an image when adding an employee to my table.
We are using AngularJS on the front end and ASP.NET Core 3.1 on the backend, with a SQL Server database managed through SSMS. I couldn't figure out how to upload images. My senior told me to store the path in the DB, but if I am to store the path in the DB, where will my images be uploaded? I did upload the images through an API into the wwwroot folder, but they marked that as a bad practice. So can any of you guide me? Thank you in advance :)
Steve's comment is useful. You can upload images to a new folder under wwwroot. You can refer to the blog first; it creates a Resources folder, and it also uses Angular.
In general, we still do not recommend storing pictures under wwwroot, regardless of whether you create a subfolder such as Resources.
Reason:
As the business grows, or simply over time, the contents of the folder will keep increasing. Even if the hard-disk space at the deployment site is large, this will cause maintenance problems later on.
Specifically, an image file may be accidentally deleted and be impossible to recover, and if the image files end up taking a lot of space, they will eventually exhaust the disk.
The Best Practice:
Upload the image to cloud storage, such as Azure Storage, and save the path in your database. This saves your disk space, relieves pressure on your server, and is also safer.
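A minimal sketch of that best practice with the Azure.Storage.Blobs SDK; the connection string, container name, and the idea of returning the URL for your employee row are placeholders:

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class ImageStorage
{
    // Placeholders -- use your own storage account and container.
    private const string ConnectionString = "<your-storage-connection-string>";
    private const string ContainerName = "employee-images";

    public async Task<string> UploadAsync(Stream image, string fileName)
    {
        var container = new BlobContainerClient(ConnectionString, ContainerName);
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlobClient(fileName);
        await blob.UploadAsync(image, overwrite: true);

        // Save this URL in the employee row instead of the image itself.
        return blob.Uri.ToString();
    }
}
```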
Additional tips:
If you are only storing a very small avatar file, you can convert it to a base64 string and store that in the database. In the absence of cloud services, some tiny images can be handled this way.
However, this is not suitable for growing business scenarios, because it makes the database footprint larger, which is bad for synchronization and backups, and larger files stored this way may load poorly.
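For completeness, the base64 round-trip mentioned above is just this (the file name is illustrative):

```csharp
using System;
using System.IO;

// Encode a tiny avatar for storage in a text column...
byte[] imageBytes = File.ReadAllBytes("avatar.png");
string base64 = Convert.ToBase64String(imageBytes); // store this string in the DB

// ...and decode it again when reading the record back.
byte[] original = Convert.FromBase64String(base64);
```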

Bulk Downloads in Azure Blob Storage

I need to figure out a way to let my users download several PDF files (sometimes thousands) from Azure Blob Storage. I know that I can download the files in parallel, and that would make things quicker, but the issue here is that the user could have thousands of PDF files to download, and that isn't at all reasonable.
Also, I can't download the files to another server, zip them, and let the user download them from there, as that would be incredibly inefficient for me.
Is there a way to create a zip of the files and let the user download that (other than the way above)? I saw other questions on this topic, but none gave an answer/solution that suits my needs.
What would be the absolute best way to do this? Or is there another way to perform this task?
Thank you in advance.
Since no one gave an answer, and I see more posts about this on Stack Overflow and other sites, I decided to share my solution here (can't share code, because reasons...).
Firstly, as of today, 04-09-2020, there is still no support for bulk download from Azure Blob Storage as a zip (or any other format) directly from Azure to the client, without routing the download flow through a server that does the organizing and zipping.
The problem I had...
I needed to download (several) files from Azure Blob Storage, zip them (and maybe organize them into folders), and prompt the client to download them in bulk, without any download data passing through the server and without filling the client's downloads folder with scattered files.
During my research I thought about doing everything on the client side in JavaScript, in memory, and letting the client download the result, but that could be quite memory-expensive, since my downloads could be in the GB range.
The solution...
Then I came across a JavaScript library called StreamSaver. This library writes files as streams, directly to the client's machine, so the memory cost is much lower.
Luckily, this library also allows organizing the files inside the 'download directory' that will be offered to the user, and it even lets me zip that directory before prompting the user to download it, meaning that this one library solved almost all my problems.
Now I only have a webmethod, called from JavaScript, that returns all the Azure SAS URLs to download from; the rest happens entirely in JavaScript on the client.
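The webmethod side might look roughly like this with the Azure.Storage.Blobs SDK (the poster could not share code, so this is only an approximation; the container name, expiry, and connection string are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Services;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public partial class Downloads : System.Web.UI.Page
{
    [WebMethod]
    public static List<string> GetDownloadUrls(List<string> blobNames)
    {
        // Placeholders -- the client must be built with credentials
        // that can sign a SAS (e.g. a connection string with a key).
        var container = new BlobContainerClient(
            "<your-storage-connection-string>", "documents");

        var urls = new List<string>();
        foreach (string name in blobNames)
        {
            var sas = new BlobSasBuilder
            {
                BlobContainerName = container.Name,
                BlobName = name,
                Resource = "b", // "b" = a blob-level SAS
                ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
            };
            sas.SetPermissions(BlobSasPermissions.Read);

            urls.Add(container.GetBlobClient(name).GenerateSasUri(sas).ToString());
        }
        return urls;
    }
}
```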
TL;DR:
Used the StreamSaver JavaScript library to download, organize, and zip all the files on the client side and then prompt the user to download the result, using only a webmethod to get all the URLs to be downloaded.
This solution works (from what I've tested) in at least these browsers:
Chrome;
Firefox;
Opera;
Edge (Chromium)
Problems I came across using the StreamSaver Library...
There are a few drawbacks/problems with the library:
1st: Safari doesn't support it! More info about this here.
2nd: StreamSaver only allows zips of files smaller than 4 GB; this could be worked around by using yet another library for the zipping...

File upload / did I make a mistake choosing vb.net instead of php?

For the past 3 days I've been trying to create an upload system for multiple, possibly large, files, with progress bars.
I've been roaming the web relentlessly for the past few days, and I can say I am now familiar with most of the difficulties.
Sadly, none of the solutions I've found online are written in C# or VBScript; in fact, most of them are written in PHP.
I wouldn't mind switching to another language, but the entire website is written in VB.NET, and for the sake of coherence I thought it might be best to stick with it.
File uploads:
Problem 1 - progress bar:
I understand file uploads will not work with AJAX, since the AJAX response will only occur after the file has completed its upload.
I understand there is a solution using iframes, but I cannot seem to find any online examples (preferably using VB.NET or C#).
I understand there is another alternative using Flash. How?
I also understand people are mostly against using iframes, but I can't find what the reason might be.
Problem 2 - Multiple Files:
I can have multiple-file support with HTML5. Great, but IE doesn't support it? Well... IE users will just have to upload one file at a time.
Problem 3 - Large files:
How?
I've heard something about chunking and blobs, but these are still just random gibberish words to me. Can somebody explain the meaning and the implementation?
References to reading material are much appreciated, even though, if it's on the web, I've probably already read it in my search for a solution.
#DevlshOne has a decent thread with some good information.
Here are the three basic requirements for what I did:
Create a Silverlight app for client-side access and upload control (use the app of your choice).
Create an HttpHandler to receive the data in chunks and manage requests.
Create the database backend to handle the files.
Silverlight worked well because I was already in VB (ASP.NET). When used in-browser, as opposed to out-of-browser, the ASP.NET session was shared with Silverlight, so there was no need for additional security/login measures. Silverlight also allowed me to limit which file types could be selected and to let the user select multiple files from the same folder.
The Silverlight app grabs the files selected by the user, displays them for editing of certain properties, and then begins the upload when the user clicks the 'upload' button. This sets off a number of threads that each upload chunks of data to the HttpHandler. The HttpHandler and Silverlight app send and receive in chunks, with the HttpHandler always sending an OK or ERROR message once the request for an uploaded chunk has been processed.
Our specific implementation of file uploading also required some database properties (fields) to be filled out by the user, so we also had inputs for those properties and uploaded them to the server with the file data.
An in-browser Silverlight app can also have parameters passed into it through the html, so I do this with settings like 'max chunk size' or 'max thread count'. I can change the setting in the database and have it apply to all users.
The database backend is basically a few stored procedures (insert your data management preference here) that control the flow of the logic. One table holds completed files (no file data), and a second holds the temp files that are in progress of being uploaded. One stored procedure initiates a new file record in the temp table and processes additional chunk uploads, and another controls the migration of the completely uploaded file from the temp table to the completed table. (A piece of VB code in the HttpHandler migrates the actual binary file data from the temp table to a physical file.)
This seems pretty complex, but the most difficult part would be the interaction with the handler and passing the chunks around (response/requests, uploading successive chunks, etc.). I left out a lot of information, but this is the basic implementation.
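As an illustration of the temp-table side of that flow, appending an uploaded chunk to an in-progress row can use SQL Server's varbinary(max) .WRITE syntax, so the file never has to be assembled in memory. The table and column names here are made up:

```csharp
using System.Data.SqlClient;

// Append one chunk to the in-progress row; passing NULL as the offset
// makes .WRITE append the data at the end of the existing value.
void AppendChunk(string connectionString, int fileId, byte[] chunk)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "UPDATE TempFiles SET FileData.WRITE(@chunk, NULL, NULL) " +
        "WHERE FileId = @id", conn))
    {
        cmd.Parameters.AddWithValue("@chunk", chunk);
        cmd.Parameters.AddWithValue("@id", fileId);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}
```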

Upload large file in asp.net

I want to upload 30 GB files with the ASP.NET file upload control. I have heard that FTP can do this, or some advanced uploader. I searched but did not find any suitable code or open-source plugin for ASP.NET. Do you know a library, or what's the right way to do this? I am confused.
I am looking for an ASP.NET file upload that can handle large files, e.g. 30 GB, with some kind of resuming logic, like FTP. Is there any plugin which can do this job?
I don't think you can do this. Many browsers have an upload limit of around 2 GB. Think about a different solution than HTTP POST, e.g. direct FTP upload.
Here is a pretty good writeup of the problem
http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
In my own experience working with gigabyte uploads in .NET several years ago, it is not easy with the common controls and infrastructure. You will be fighting HTTP timeouts and will have to adjust web.config to allow for the file size.
What has to happen to make it work is some form of chunking: you divide the file up into much smaller pieces and then upload each one, keeping track of which pieces you have received and which you have not.
A better/easier solution is to add some RIA functionality to your application so you can handle the upload in a richer client.
You could try NeatUpload: http://neatupload.codeplex.com/. I last used it for uploading files as large as 10 MB without problems; I never tried it with multi-GB files, though.
Note that the control requires full trust.
I think the best approach would be the WebDAV server engine: http://www.webdavsystem.com/server/documentation/large_files_iis_asp_net

How to efficiently send large files from the database to the browser?

In my web application I am working with files. Some files are very large. I use Response.Write() to write the file to the browser. This goes well for the smaller files, but for large files it can take a while and the bandwidth is fully used.
Is it possible to split large documents and send them piece by piece to the browser? Are there other ways to send the document to the browser more quickly?
I hold the document as a property of an object.
Why don't you compress the file, store it in the DB, and decompress it while extracting it?
You can do a lot of things, depending on the answers to these questions:
How often does the file change?
Do I really need the files in the DB?
Why not store the file path in the DB and the file on disk?
Anyhow, since your files consume a lot of bandwidth and you want your app to stay responsive, you might want to use AJAX to load the files asynchronously. You can have a web handler (.ashx) for this; a rough sketch follows the example links below.
Here are a few examples:
http://www.dotnetcurry.com/ShowArticle.aspx?ID=193&AspxAutoDetectCookieSupport=1
http://www.viawindowslive.com/Articles/VirtualEarth/InvokingserversidecodeusingAJAX.aspx
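A minimal sketch of such an .ashx download handler (the path and file name are illustrative); Response.TransmitFile streams the file to the client without loading it all into memory:

```csharp
using System.Web;

public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Illustrative: a real handler would resolve the file
        // from a query-string id or similar.
        string path = context.Server.MapPath("~/App_Data/document.mht");

        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=document.mht");

        // TransmitFile writes the file straight to the response
        // without buffering the whole thing in server memory.
        context.Response.TransmitFile(path);
    }

    public bool IsReusable { get { return false; } }
}
```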
My question is, is it possible to split large documents and send it piece by piece to the browser?
It depends on the file type, but in general, no. If you are sending something like an Excel file or a Word doc, the receiving application will need all of the information (bytes) to fully form the document. You could physically separate the document into multiple ones, and that would allow you to do so.
If the bandwidth is fully used, then there is nothing you can do to "speed it up" short of compressing the document before sending it. In other words, zip it up.
Depending on the document (I know you said .mht, but we're talking content here) you will see the size go down by some amount. Maybe it's enough, maybe not.
Either way, this is entirely a function of the amount of content you want to send versus the size of the pipe available to send it. One of those is more difficult to change than the other.
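Compressing before sending can be as simple as this sketch with GZipStream (for a single document; a zip library would be needed to bundle multiple files):

```csharp
using System.IO;
using System.IO.Compression;

// Compress a document's bytes before sending them; the browser can be
// informed via a Content-Encoding: gzip response header.
static byte[] Compress(byte[] data)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            gzip.Write(data, 0, data.Length);
        }
        // The GZipStream must be closed before reading the result.
        return output.ToArray();
    }
}
```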
Try setting IIS's dynamic compression. By default, it's set fairly low, but you can try setting it for a higher compression level and see how much that helps.
I'm not up to speed with ASP.NET but you might be able to buffer from a FileStream to some sort of output stream.
You can use the Flush method to send the currently buffered data to the client (the browser).
Note that this has some implications, as is described aptly here.
I considered using it myself on a project that sent documents that became fairly large, and I was cautious about holding all the data in memory. In the end I decided the data was not large enough to be a problem, though.
Sadly, the MSDN documentation is very, very vague on what Flush implies, and you will probably have to use Google to troubleshoot.
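The buffer-and-Flush approach this answer describes looks roughly like this (the 64 KB chunk size is arbitrary; Response.IsClientConnected guards against writing to a closed connection):

```csharp
using System.IO;
using System.Web;

// Stream a large file to the browser in chunks, flushing as we go,
// so the whole document never sits in server memory at once.
static void StreamFile(HttpResponse response, string path)
{
    response.BufferOutput = false; // don't buffer the entire response
    byte[] buffer = new byte[64 * 1024];

    using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
    {
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0
               && response.IsClientConnected)
        {
            response.OutputStream.Write(buffer, 0, read);
            response.Flush(); // push the buffered bytes to the client now
        }
    }
}
```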
