ImageResizer for Azure using AzureReader2 plugin not resizing - c#

EDIT
I got it to work, but I had to add the RemoteReader plugin. When I remove the AzureReader2 plugin from my project it still works, which makes sense, but then what benefit is the AzureReader2 plugin giving me?
ORIGINAL QUESTION
I have done everything that has been outlined here (including the comments) but can't seem to figure out why I can't resize images on the fly with this ImageResizer plugin.
This is what my web.config entry under the element looks like:
<add name="AzureReader2" prefix="~/img/" connectionString="DefaultEndpointsProtocol=https;AccountName=[Account];AccountKey=[key]" endpoint="http://<account>.blob.core.windows.net/" />
and I have set up my container to be called 'img'.
When I go to this URL to test it out:
https://<account>.blob.core.windows.net/img/image.jpg?width=50
The image shows up, but only at its regular size. I've also tried running this locally and on the live site as well, but still get no resizing :(

The ImageResizer library lets you serve modified versions of images (resized, cropped, rotated, watermarked, etc.). AzureReader2 is a plugin that fetches the unmodified source images from Azure Blob storage (https://<account>.blob.core.windows.net) instead of from disk.
So the URL to use to obtain a modified version of an image is your application's URL (where the ImageResizer library is installed), not the Azure Blob URL (in your example, https://<account>.blob.core.windows.net/img/image.jpg?width=50).
EDIT
The AzureReader2 plugin lets you read images from Azure Blob storage the same way as if they were saved on disk. If your application is built so that all images come from Azure Blob storage, you can have two independent teams: one managing your images (and other media like CSS) and one managing your code. With that approach the AzureReader2 plugin is very handy.
I hope that helps.

After hours of playing around I finally understand how it works. I didn't realize the prefix is what you tack onto your application's URL, not the blob store URL. I ended up with:
http://<account>.azurewebsites.net/img/img/image.jpg?width=50
This worked, instead of my original thinking of:
https://<account>.blob.core.windows.net/img/image.jpg?width=50
For anybody looking at this: the prefix is what's tacked onto the URL of the actual site, not the blob store!

Related

How to reduce video size before uploading in asp.net C#

In my ASP.NET application I need to reduce the size of any video before it is uploaded. Does anyone have code for this?
Thanks
You cannot.
Any reduction in video size would be either compression or re-encoding to a lower resolution etc.
This is way beyond the scope of a web browser upload - unless you want to implement one or both of those in JavaScript (!).
Any size reduction would have to be done as a separate step - outside of the website - before uploading.
The whole question makes me wonder whether you have understood how web pages work in principle. There is a very strong separation of responsibilities between the web browser and the server. In particular, the following answer to a comment is - funny:
Okay no need to instantly decrease size, just before save path in
database and store file in folder decrease the size, save decreasing
file in folder
OK, let's upload the path. HOW DOES THIS HELP?
The path will be local to the uploader's machine. C:\Videos\LargeVideo.mpg is neither the video file, nor a location your ASP.NET server can access.
This does not solve the problem at all. Unless the user transcodes the file, it still sits on the user's machine and is still too large. This is like saying "OK, the package weighs too much - let's write the recipient's address in another font". It does not even try to solve the problem.
The only realistic solutions are:
Provide the bandwidth.
Provide a client-side upload application (NOT a web page) that the user installs, which can then not only do the upload but also any transcoding necessary before uploading.
You are stuck between two things:
A very strong client/server separation, and
A very limited runtime environment on the client (JavaScript in the web browser).
No amount of whining or refusal to accept it will ever change that. There is no magical way to "convert nothing, accept all types of videos, and simply decrease the file size". This is called transcoding (changing from one encoding to another - and you can, for example, change the resolution while doing so) and it is a VERY intensive process, not something doable in a browser.

Web Api 2 RESTFUL Image Upload

I'm new to Web API and I'm working on my first project: a mobile CRM system for our company.
I want to store company logos, customers' face photos, etc.
I found some tutorials on this topic, but unfortunately some of them were old (they don't use async) and the others don't work.
At the end I found this one:
http://www.intstrings.com/ramivemula/articles/file-upload-using-multipartformdatastreamprovider-in-asp-net-webapi/
It works correctly, but I don't understand a few things.
1) Should I use App_Data (or any other folder like /Uploads) for storing these images, or rather store the images in the database?
2) Can I allow only supported image types like .jpg and .png and reject any other files?
3) How can I process the image in the upload method - e.g. resize it, reduce the file size, change the quality, etc.?
Thank you
1) We are storing files in a different location than App_Data. We have a few customer groups and we gave each of them a unique folder that we get from the database. Storing in the database is also an option, but if you go down that road, make sure that the files you are saving don't belong directly to a table that you need to query often. There is no right or wrong, but have a read of this question and answer for some pros and cons.
2) If you followed that guide, you can put a check inside the loop that verifies the file extension:
List<string> allowedExtensions = new List<string> { ".jpg", ".png" };
foreach (MultipartFileData file in provider.FileData)
{
    string fileName = Path.GetFileName(file.LocalFileName);
    if (!allowedExtensions.Contains(Path.GetExtension(fileName).ToLowerInvariant()))
        throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
    files.Add(fileName);
}
3) Resizing images is something I have never personally done myself, but I think you should have a look at the System.Drawing.Graphics namespace.
Here is a link with an accepted answer for downscaling a picture: ASP.Net MVC Image Upload Resizing by downscaling or padding
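To make point 3 concrete, here is a minimal resize sketch using System.Drawing. The helper names are my own, and this assumes the System.Drawing assembly is available (on .NET Core/.NET 5+ that means the System.Drawing.Common package, which is Windows-only in recent versions):

```csharp
using System.Drawing;
using System.Drawing.Drawing2D;

public static class ImageResizeSketch
{
    // Compute target dimensions that fit within maxWidth while keeping the aspect ratio.
    public static (int Width, int Height) ScaleToWidth(int width, int height, int maxWidth)
    {
        if (width <= maxWidth) return (width, height);
        return (maxWidth, (int)((long)height * maxWidth / width));
    }

    // Draw the source image into a new bitmap of the target size.
    public static Bitmap Resize(Image source, int width, int height)
    {
        var result = new Bitmap(width, height);
        using (Graphics g = Graphics.FromImage(result))
        {
            g.InterpolationMode = InterpolationMode.HighQualityBicubic;
            g.DrawImage(source, 0, 0, width, height);
        }
        return result;
    }
}
```

You would call ScaleToWidth first to get proportional dimensions, then Resize, then save the result with Bitmap.Save and an encoder quality of your choosing.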
None of the questions are actually related to Web API or REST.
1) If you are using SQL Server 2008 or newer, the answer is to use FILESTREAM columns. This looks like a column in the database with all its advantages (i.e. backup, replication, transactions), but the data is actually stored in the file system. So you get the best of both worlds: it cannot happen that someone deletes a file accidentally so the database references a nonexistent file, or vice versa, that records are deleted from the database but the files are not, leaving you with a bunch of orphan files. Using a database has other advantages too, i.e. metadata can be associated with files and permissions are easier to set up.
2) This depends on how the files are uploaded. I.e. if using multipart forms, examine the content type of each part before the part is saved. You can even create your own MultipartStreamProvider class. Being an API, maybe the upload method has a stream or byte-array parameter plus a content-type parameter; in that case just test the value of the content-type parameter before the content is saved. For other upload methods do something similar depending on what the input is.
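As a sketch of that content-type test (the allowed list here is an assumption for the example; extend it as needed):

```csharp
using System;
using System.Collections.Generic;

public static class UploadValidation
{
    // Content types accepted in this example; adjust to your requirements.
    static readonly HashSet<string> AllowedTypes =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "image/jpeg", "image/png" };

    // Returns true only for the whitelisted image content types.
    public static bool IsAllowedImageType(string contentType)
    {
        return contentType != null && AllowedTypes.Contains(contentType);
    }
}
```

In a multipart upload you would call this with each part's Headers.ContentType value and reject the request (e.g. with 415 Unsupported Media Type) when it returns false.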
3) You can use .NET's built-in classes (i.e. Bitmap: SetResolution, RotateFlip; to resize, use the constructor that accepts a size), or if you are not familiar with image processing, choose an image-processing library instead.
All of the above work in Asp.Net, MVC, Web API 1 and 2, custom HTTP handlers, basically in any .Net code.
@Binke
Never use string operations on paths. I.e. fileName.Split('.')[1] will not return the extension if the file name is something like some.file.txt, and will fail with an index-out-of-range error if the file has no extension.
Always use the file APIs, i.e. Path.GetExtension.
Also, using the extension to determine the content type is not safe, especially where pictures and videos are involved - just think of the avi extension, which is used by many video formats.
files.Add(Path.GetFileName(file.LocalFileName)) should be files.Add(fileName).
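A quick illustration of why Path.GetExtension beats string splitting (the wrapper is just for the example):

```csharp
using System.IO;

public static class PathDemo
{
    // Path.GetExtension returns the last extension including the dot,
    // and "" - rather than crashing - when there is no extension at all.
    public static string ExtensionOf(string fileName)
    {
        return Path.GetExtension(fileName);
    }
}
```

Compare that with fileName.Split('.')[1], which returns "file" for some.file.txt and throws for a name with no dot.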

How can I deploy static content separately from my Azure solution?

I saw the following comment on Stack Overflow but I'm unable to find a reference to it.
"Yes it is. I have seen a Azure webcast from Cloud9 where the application was
broken up. The static content like images, html, css etc were deployed separately
than the Azure solution. The azure web app just linked to these resources"
Has anyone done this and have any information on how to do it?
As @Joe gennari mentioned: image links, CSS links, etc. just need to be changed to reference objects in blob storage. For instance: <img src="http://mystorage.blob.core.windows.net/images/logo.jpg" />.
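To make the link rewriting concrete, here is a tiny sketch; the account and container names are made up for illustration:

```csharp
public static class StaticContentUrls
{
    // Hypothetical storage account; replace with your own.
    const string BlobBase = "http://mystorage.blob.core.windows.net";

    // Map a site-relative static path to its blob-storage URL.
    public static string ToBlobUrl(string container, string relativePath)
    {
        return BlobBase + "/" + container + "/" + relativePath.TrimStart('/');
    }
}
```

Every place the app previously emitted /images/logo.jpg would instead emit ToBlobUrl("images", "/logo.jpg").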
To actually get content into blob storage, you can:
Create a little uploader app that makes very simple calls via one of the language SDKs (Java, PHP, .NET, Python, etc.).
Upload blobs using PowerShell cmdlets - see command documentation here.
Use a tool such as Cerebrata's Cloud Storage Studio or Clumsy Leaf CloudXplorer, which lets you work with blobs in a similar way you'd work with your local file system.
You would no longer be bundling static content with your Windows Azure project. You'd upload blob updates separately (and without need for re-uploading an Azure project). This has the benefit of reducing deployment package size.

Folder explorer options

I have recently been assigned a task which sounded relatively simple!
Upon attempting it, it became clear it wasn't as straightforward as I first imagined!!!
I am trying to download multiple files to one location on the user's machine. They select these files from lists within a custom SharePoint web part. That's the bit I have managed to get working! The downloading is done via WebClient (System.Net.WebClient).
I now want to allow the user to select a location on their local machine to download the files to.
I thought I would be able to use one, but after attempting this I realized I can only pick files :( which will confuse the user.
I want something similar to the above, but I only need it to return a path like C:\Temp or any other location the user prefers on their local machine.
Could anyone suggest a control that provides this functionality? It can also be a SharePoint control.
In the meantime I will be attempting a TreeView, as I have never used one before and from what I have read it may have the power to do this.
Cheers
Truez
Clarification on the language: ASP.NET
Unfortunately, you can't do this without some kind of active content, like a Flash control or spit activeX /spit.
It seems strange at first, but you have to consider that this kind of functionality would let a site discover the structure of anyone's storage devices; this is not 'a good thing'™.
However, perhaps a different approach might solve the problem?
Why are you using WebClient? Can't you provide the link to the client and let them choose their own download folder?
I ended up zipping the files into one archive and passing that file to be downloaded through the browser! Thanks for your comments!

Download 3000+ Images Using C#?

I have a list of around 3000 image URLs, where I need to download them to my desktop.
I'm a web dev, so naturally I wrote a little ASP.NET C# download method to do this, but the obvious problem happened and the page timed out before I got hardly any of them.
Was wondering if anyone knew of a good, quick and robust way of looping through all the image URLs and downloading them to a folder? Open to any suggestions - WinForms, batch file - although I'm a novice at both.
Any help greatly appreciated
What about wget? It can download a list of URLs specified in a file.
wget -i c:\list-of-urls.txt
Write a C# command-line application (or Winforms, if that's your inclination), and use the WebClient class to retrieve the files.
Here are some tutorials:
C# WebClient Tutorial
Using WebClient to Download a File
or, just Google C# WebClient.
You'll either need to provide a list of files to download and loop through the list, issuing a request for each file and saving the result, or issue a request for the index page, parse it using something like HTML Agility Pack to find all of the image tags, and then issue a request for each image, saving the result somewhere on your local drive.
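If you go the parse-the-index-page route, the HTML Agility Pack is the robust choice; as a stdlib-only sketch of the same idea, a simple regex can pull src attributes from reasonably well-formed pages (regexes will break on unusual markup, so treat this as illustrative only):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class ImageScraper
{
    // Collect the src attribute of every <img> tag in an HTML string.
    public static List<string> ExtractImageUrls(string html)
    {
        var urls = new List<string>();
        foreach (Match m in Regex.Matches(html, "<img[^>]+src=[\"']([^\"']+)[\"']",
                                          RegexOptions.IgnoreCase))
        {
            urls.Add(m.Groups[1].Value);
        }
        return urls;
    }
}
```

Feed the result into whatever download loop you end up with (WebClient, wget's -i file, etc.).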
Edit
If you just want to do this once (as in, not as part of an application), mbeckish's answer makes the most sense.
You might want to use an existing download manager like Orbit, rather than writing your own program for the purpose. (blasphemy, I know)
I've been pretty happy with Orbit. It lets you import a list of downloads from a text file. It'll manage the connections, downloading portions of each file in parallel over multiple connections to increase the speed of each download. It'll take care of retrying if connections time out, etc. It seems like you'd have to go to a lot of effort to build those kinds of features from scratch.
If this is just a one-time job, then one easy solution would be to write a HTML page with img tags pointing to the URLs.
Then browse it with FireFox and use an extension to save all of the images to a folder.
Working on the assumption that this is a one-off, run-once project, and as you are a novice with the other technologies, I would suggest the following:
Rather than trying to download all 3000 images in one web request, do one image per request. When the image download is complete, redirect to the same page, passing the URL of the next image to get as a query-string parameter. Download that one and repeat until all images are downloaded.
Not what I would call a "production" solution, but if my assumption is correct it will have you up and running in no time.
Another fairly simple solution would be to create a small C# console application that uses WebClient to download each of the images. The following pseudo code should give you enough to get going:
List<string> imageUrls = new List<string>();
imageUrls.Add(/* ... your URLs from wherever ... */);
foreach (string imageUrl in imageUrls)
{
    using (WebClient client = new WebClient())
    {
        byte[] raw = client.DownloadData(imageUrl);
        // ... write raw to a file, e.g. with File.WriteAllBytes ...
    }
}
I've written a similar app in WinForms that loops through URLs in an Excel spreadsheet and downloads the image files. I think the problem you're having with implementing this as a web application is that the server will only allow the process to run for a short amount of time before the request from your browser times out. You could either increase this time in the web.config file (change the executionTimeout attribute of the httpRuntime element), or implement this functionality as a WinForms application where the long execution time won't be a problem. If this is more than a throw-away application and you decide to go the WinForms route, you may want to add a progress bar to indicate progress to the user.
