I'm currently working on an intern project where I am required to add an image when adding an employee to my table.
We are using AngularJS on the front end and ASP.NET Core 3.1 on the backend, with a SQL Server database managed through SSMS. I couldn't figure out how to upload the images. My senior told me to store the path in the DB, but if I store the path in the DB, where will my images actually be uploaded? I did upload the images through an API into the wwwroot folder, but they marked that as bad practice. Can anyone guide me? Thank you in advance :)
Steve's comment is useful. You can upload images to a new folder under wwwroot. You can refer to the blog first; it creates a Resources folder and also uses Angular.
In general, though, we still do not recommend storing pictures under wwwroot, regardless of whether you create a subfolder such as Resources.
Reason:
As the business grows, or simply over time, the contents of the folder will keep increasing. Even if the hard disk at the deployment site is large, this causes maintenance problems later.
Specifically, an image file may be accidentally deleted and become unrecoverable, and if the image files end up taking a lot of space, the server can run out of disk space.
The Best Practice:
Upload the image to cloud storage, such as Azure Storage, and save the path in your database. This saves disk space, relieves pressure on your web server, and is safer.
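For example, here is a minimal sketch of that flow in ASP.NET Core, assuming the Azure.Storage.Blobs v12 NuGet package; the container name, route, and PhotoUrl wiring are illustrative placeholders, not names from the original project:

```csharp
// Minimal sketch: upload an employee photo to Azure Blob Storage and keep
// only the resulting URL in SQL. Assumes the Azure.Storage.Blobs v12 NuGet
// package; the "employee-photos" container and the PhotoUrl column are
// illustrative placeholders.
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/employees")]
public class EmployeePhotoController : ControllerBase
{
    private readonly BlobContainerClient _container;

    // Register in Startup, e.g.:
    // services.AddSingleton(new BlobContainerClient(connString, "employee-photos"));
    public EmployeePhotoController(BlobContainerClient container)
        => _container = container;

    [HttpPost("{id}/photo")]
    public async Task<IActionResult> UploadPhoto(int id, IFormFile photo)
    {
        if (photo == null || photo.Length == 0)
            return BadRequest("No file uploaded.");

        // Unique blob name so uploads never overwrite each other.
        var blobName = $"{id}/{Guid.NewGuid()}{Path.GetExtension(photo.FileName)}";
        var blob = _container.GetBlobClient(blobName);

        using (var stream = photo.OpenReadStream())
            await blob.UploadAsync(stream, overwrite: true);

        // Save only the path in the employee row, e.g.:
        // employee.PhotoUrl = blob.Uri.ToString(); await db.SaveChangesAsync();
        return Ok(new { url = blob.Uri.ToString() });
    }
}
```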
Additional tips:
If you are only storing a very small avatar file, you can convert it to a base64 string and store that in the database. In the absence of cloud services, tiny images can be handled this way.
However, this is not suitable for a growing business scenario: it makes the database footprint larger, which is bad for synchronization and backups, and it handles large files poorly, which may cause problems when loading them.
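A minimal sketch of the base64 approach, assuming an ASP.NET Core IFormFile upload; the helper class is hypothetical:

```csharp
// Minimal sketch: convert a small uploaded avatar to a base64 string for a
// varchar/text column, and decode it for display. Only sensible for tiny
// images: a 100 KB file becomes roughly 133 KB of text. The browser can
// render the stored string directly via a data URI, e.g.
// <img src="data:image/png;base64,..." />
using System;
using System.IO;
using Microsoft.AspNetCore.Http;

public static class AvatarCodec
{
    public static string ToBase64(IFormFile file)
    {
        using var ms = new MemoryStream();
        file.CopyTo(ms);
        return Convert.ToBase64String(ms.ToArray());
    }

    public static byte[] FromBase64(string stored)
        => Convert.FromBase64String(stored);
}
```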
Related
I'm trying to create a website, but I don't know how to upload photos into the database. All the information I can find is outdated.
If your images are small, you can convert them to base64 and store them in your DB.
If they are very large, you will run into issues.
The correct way to save pictures and images
If the images are static resources that will not change often, you can store them under wwwroot.
If the images are created by web users, you can also follow the first suggestion and create a folder per user, such as wwwroot/userdata/userid/, with paths like wwwroot/userdata/userid/xx.jpg (a sketch follows below).
The main disadvantage is maintenance: the published files under wwwroot grow larger and larger, and the load on the web server increases when many users access them.
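A minimal sketch of that folder layout, assuming an ASP.NET Core service with access to IWebHostEnvironment; the names are illustrative:

```csharp
// Minimal sketch of the wwwroot/userdata/{userId}/ layout described above,
// using IWebHostEnvironment to locate wwwroot. File and folder names are
// illustrative.
using System;
using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class UserImageStore
{
    private readonly IWebHostEnvironment _env;

    public UserImageStore(IWebHostEnvironment env) => _env = env;

    public string Save(int userId, IFormFile image)
    {
        var dir = Path.Combine(_env.WebRootPath, "userdata", userId.ToString());
        Directory.CreateDirectory(dir); // no-op if it already exists

        var fileName = $"{Guid.NewGuid()}{Path.GetExtension(image.FileName)}";
        var fullPath = Path.Combine(dir, fileName);

        using (var fs = new FileStream(fullPath, FileMode.Create))
            image.CopyTo(fs);

        // Store the relative path in the database, e.g. "userdata/42/xx.jpg",
        // so the static-files middleware can serve it directly.
        return $"userdata/{userId}/{fileName}";
    }
}
```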
The best choice is to store the images in a storage account or another third-party service.
That way the database only needs to save the path, which makes the image data easier to maintain and puts no pressure on the web server during access.
I need to figure out a way to let my users download several PDF files (sometimes thousands) from Azure Blob Storage. I know I can download the files in parallel, and that would make things quicker, but the user could have thousands of PDF files to download, and that isn't at all reasonable.
Also, I can't download the files to another server, zip them, and let the user download them from there, as that would be incredibly inefficient for me.
Is there a way to create a zip of the files and let the user download that (other than the way above)? I saw other questions on this topic but none gave an answer/solution that suits my needs.
What would be the absolute best way to do this? Or is there another way to perform this task?
Thank you in advance.
Since no one gave an answer, and I see more posts about this on Stack Overflow and other sites, I decided to share my solution here (I can't share code, because reasons...).
Firstly, as of today, 04-09-2020, there is still no support for bulk download from Azure Blob Storage as a zip (or any other format) directly from Azure to the client, without routing the download through a server that does the organizing and zipping.
The problem I had...
I needed to download (several) files from Azure Blob Storage, zip them (and maybe organize them into folders), and prompt the client to download them in bulk, without any of the download data passing through the server and without filling the client's downloads folder with scattered files...
During my research I considered doing everything on the client side in JavaScript, in memory, and letting the client download the result, but that could be quite memory-expensive, since my downloads could be in the GB range.
The solution...
Then I came across a JavaScript library called StreamSaver. This library writes files with streams, directly to the client's machine, so the memory cost is much lower.
Luckily, the library also allows organizing files inside the 'download directory' that will be offered to the user, and even lets me zip that directory before prompting the user to download it, so this one library solved almost all my problems.
Now I only have a webmethod, called from JavaScript, that returns all the Azure SAS URLs to download from; the rest happens in JavaScript on the client.
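For the server side, here is a rough sketch of such a webmethod, assuming the Azure.Storage.Blobs v12 SDK and a container client built from a shared-key connection string (which GenerateSasUri requires); all names are illustrative, not from the original code:

```csharp
// Rough sketch of the server-side piece: return one short-lived, read-only
// SAS URL per blob so the browser (StreamSaver) can stream each file
// straight from Azure without routing bytes through this server.
using System;
using System.Collections.Generic;
using System.Linq;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public class SasUrlProvider
{
    private readonly BlobContainerClient _container;

    public SasUrlProvider(BlobContainerClient container) => _container = container;

    public List<string> GetDownloadUrls(IEnumerable<string> blobNames) =>
        blobNames
            .Select(name => _container
                .GetBlobClient(name)
                // Read-only access, expiring after 30 minutes (illustrative).
                .GenerateSasUri(BlobSasPermissions.Read,
                                DateTimeOffset.UtcNow.AddMinutes(30))
                .ToString())
            .ToList();
}
```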
TL;DR:
Used the StreamSaver JavaScript library to download, organize, and zip all the files on the client side and then prompt the user to download the result, using only a webmethod to fetch the URLs to be downloaded.
This solution works (from what I've tested) in at least these browsers:
Chrome;
Firefox;
Opera;
Edge (Chromium)
Problems I came across using the StreamSaver Library...
There are a few drawbacks/problems with the library:
1st: Safari doesn't support it! More info about this here.
2nd: StreamSaver only supports zipping files smaller than 4 GB; this could be worked around using yet another library for the zipping...
One of the many things that SharePoint does extremely well: when you have versioning enabled for files uploaded to a Document Library, every time you save changes to a file it saves only the difference from the previous version to the Content Database, NOT the whole file again.
I am trying to duplicate that same behavior with standard C# code on either a File System folder in Windows or a SQL Database blob field. Does anyone have any idea or pointers on how SharePoint accomplishes this and how it can be done outside of SharePoint?
SharePoint uses a technique called data "shredding" to contain each change to a given file. Unfortunately, I don't think you will find enough technical details to truly reproduce what they are doing, but you might be able to devise a reasonable approximation using your own design.
When shredded, the data associated with a file such as Document.docx is distributed across a set of BLOBs. Each independent BLOB is assigned a unique ID (offset) so the file can be reconstructed in the correct order when a user requests it.
Each document "shred" is stored in a SQL database table named DocStreams, and each BLOB carries a numerical ID marking its position in the source file when the shreds are coalesced. When a client updates a file, only the shredded BLOB corresponding to the change is updated, and the update happens on the database server rather than the Web server.
For more details on Shredding see
http://download.microsoft.com/download/9/6/6/9661DAC2-393D-445A-BDC1-E60743B1231E/Shredded%20Storage%20in%20SharePoint%202013.pdf
https://jeremythake.com/the-truth-behind-shredded-storage-in-sharepoint-2013-a84ec047f28e
https://www.c-sharpcorner.com/UploadFile/91b369/shredded-storage-in-sharepoint-2013/
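As a starting point for the "reasonable approximation" mentioned above, here is a minimal sketch (not SharePoint's actual algorithm): split the file into fixed-size chunks keyed by offset, hash each chunk, and rewrite only the chunks whose hash changed. The chunk size, types, and DocStreams-style table are assumptions:

```csharp
// Rough approximation of the shredding idea: a file is split into fixed-size
// chunks keyed by offset, each chunk is hashed, and on save only the chunks
// whose hash changed need an INSERT/UPDATE against a DocStreams-style table.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography;

public record Shred(long Offset, byte[] Hash, byte[] Bytes);

public static class Shredder
{
    private const int ChunkSize = 64 * 1024; // illustrative shred size

    public static List<Shred> ShredFile(byte[] file)
    {
        using var sha = SHA256.Create();
        var shreds = new List<Shred>();
        for (long offset = 0; offset < file.Length; offset += ChunkSize)
        {
            var length = (int)Math.Min(ChunkSize, file.Length - offset);
            var bytes = new byte[length];
            Array.Copy(file, offset, bytes, 0, length);
            shreds.Add(new Shred(offset, sha.ComputeHash(bytes), bytes));
        }
        return shreds;
    }

    // Compare the new shreds against the hashes already stored in the DB;
    // only the returned shreds need to be written back.
    public static IEnumerable<Shred> ChangedShreds(
        IEnumerable<Shred> current, IDictionary<long, byte[]> storedHashes)
    {
        foreach (var shred in current)
            if (!storedHashes.TryGetValue(shred.Offset, out var hash) ||
                !hash.SequenceEqual(shred.Hash))
                yield return shred;
    }
}
```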
Possible Duplicate:
Storing Images in DB - Yea or Nay?
Pretty straightforward: I am hosting a site where users can upload pictures, and I have a .NET FileUpload control working appropriately. I'm just wondering what methodology I should use to store the pictures on the server.
I can use the SaveAs() method of the FileUpload control, which saves the upload as an actual file.
Or I can break the image down into a byte[] and store it in the database.
I believe the real question here is: do I want the load on IIS or on SQL Server 2008 R2?
What's the general consensus on which methodology I should use, and why? Thanks!
There is no general consensus, because neither method is ideal.
Storing the images as files makes it harder to scale the application to run on multiple servers; you would then need a SAN/NAS to store the files instead of a local disk.
Storing the images in the database means you spend a lot of cache space on image data, slowing down other queries, and it also increases the size of the database and its backups. It also means you have to serve the images through a page; you can't just request the file directly (see the sketch below).
So it comes down to how many images you will have and how scalable the application needs to be.
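To illustrate the "serve the images through a page" drawback, here is a minimal sketch of such an endpoint, written as an ASP.NET Core controller for brevity; the data-access call is a placeholder:

```csharp
// Sketch of the drawback: when bytes live in the database, every image
// request must go through an endpoint like this instead of a direct file URL.
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("images")]
public class ImageController : ControllerBase
{
    [HttpGet("{id}")]
    public IActionResult Get(int id)
    {
        // Placeholder for real data access, e.g.
        // SELECT ContentType, Data FROM Images WHERE Id = @id
        (string contentType, byte[] data)? row = LoadImageRow(id);
        if (row == null) return NotFound();

        return File(row.Value.data, row.Value.contentType);
    }

    private (string, byte[])? LoadImageRow(int id)
    {
        // Hypothetical DAL call; returns null when the image is missing.
        return null;
    }
}
```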
Avoid putting them in the database. That will make your DB much larger just because of a few files, and it also inflates the backup size.
I don't see any real gain in having the actual file bytes in the database; having the physical path to the file in the file system suffices.
That also allows you to have your own backup strategy for the files.
I am designing a website in ASP.NET that will accept video file uploads. The question I have is: video files can get very large (e.g. 3 GB), and I have read that increasing maxRequestLength in web.config gives hackers a chance to attack the server with large requests.
I already know about client-side validation to protect against malicious files that are not the intended type, so that's not a concern at the moment. My question is whether the file-upload method is the right approach for video files, and if not, whether there is a better one.
For uploading big files in ASP.NET, I used "Brettle.Web.NeatUpload".
You can get it at http://neatupload.codeplex.com/
I hope it is useful for you.
I use http://www.plupload.com/ combined with chunked uploads inside an .ashx handler. On the server I push the parts to Amazon S3, so the server never has the full file in memory. Works great for me.
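For reference, here is a rough sketch of what such an .ashx chunk handler can look like. plupload posts each piece with "chunk" (index) and "chunks" (total) form fields; this simplified version appends chunks to a temp file on disk rather than streaming straight to S3 as the answer above does, and the paths and field names are assumptions:

```csharp
// Rough sketch of a chunked-upload .ashx handler for plupload. Each request
// carries one chunk; chunks are appended to a temp file so the full upload
// is never held in memory at once.
using System;
using System.IO;
using System.Web;

public class ChunkedUploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        var file = context.Request.Files["file"]; // plupload's default field
        int chunk = int.Parse(context.Request["chunk"] ?? "0");
        int chunks = int.Parse(context.Request["chunks"] ?? "1");

        var fileName = Path.GetFileName(file.FileName);
        var tempPath = Path.Combine(
            context.Server.MapPath("~/App_Data/uploads"), fileName);

        // Create the file on the first chunk, append on the rest.
        using (var fs = new FileStream(tempPath,
            chunk == 0 ? FileMode.Create : FileMode.Append))
        {
            file.InputStream.CopyTo(fs);
        }

        if (chunk == chunks - 1)
        {
            // Last chunk received: hand the complete file to S3 here,
            // e.g. with the AWS SDK's TransferUtility, then delete tempPath.
        }

        context.Response.Write("ok");
    }

    public bool IsReusable => false;
}
```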
The reason to be concerned about this issue is that the .NET Framework's built-in handling of file uploads to IIS in ASP.NET caches the entire upload in memory before streaming the file out to disk. If you allow very large file uploads, you therefore risk letting someone mount a denial-of-service attack on your IIS server: a burst of several very large simultaneous uploads is all it takes to exhaust the server's available physical memory. The answer is either to write your own upload handler that does not cache the entire upload in memory, or to use one of the many available software components that do this for you. The other two answers point to a couple; here's another example component I found for ASP.NET:
http://www.easyalgo.com/eaupload.aspx