I am using the WebClient.UploadFileAsync method to call a REST web service that uploads files to a server. The uploads to the server can also be done from a web application.
The server-side processing takes milliseconds, so most of the upload time is spent in transport. I am able to upload a 6.28 MB file from the web application in 2 minutes, but the same upload done from my WinForms application using WebClient.UploadFileAsync takes 3 minutes.
The difference between the web browser upload and the web service upload is that the former saves the file directly to the server, whereas in the web service case the service is called first and then the file is saved to the server.
So, what is the reason for such a huge difference in speed? And how can this difference be reduced?
Update: I tried using Fiddler as suggested, and found an interesting thing. When I uploaded a file while Fiddler was running, I got a huge improvement in upload speed, close to the speed of the web application. When I tried uploading while Fiddler wasn't running, the upload was very slow, as before. So there seems to be a bug in the WebClient class. How do I get around this issue?
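In case it's useful: Fiddler registers itself as the system proxy while it runs, so the observation above points at WebClient's default proxy handling rather than a bug as such. Below is a minimal sketch of the two settings commonly suggested for this symptom; the URL and file path are placeholders, and the same settings apply to UploadFileAsync:

    using System;
    using System.Net;

    class UploadTest
    {
        static void Main()
        {
            // Skipping the Expect: 100-continue handshake and the system proxy
            // lookup are the two usual suggestions for "fast with Fiddler,
            // slow without".
            ServicePointManager.Expect100Continue = false;
            using (var client = new WebClient())
            {
                client.Proxy = null; // bypass proxy auto-detection entirely
                // Placeholder URL and path:
                client.UploadFile(new Uri("http://example.com/api/upload"), "POST", @"C:\temp\file.bin");
            }
        }
    }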
I can't add comments due to my reputation, so sorry for getting your hopes up in advance. Since you have to go through middleware, so to speak, the overall upload time is increased. If going through the web service isn't essential and you have the right tools, there are many FTP clients and libraries out there that could do this, probably faster than your web service. If you are required to go through the web service, I don't have much of an answer other than perhaps trying a different HTTP client that might run slightly faster.
So, to sort of answer your question: using a secure FTP library would most likely be faster, and the speed difference is mainly due to the middleware you have to go through before hitting your actual server.
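For illustration, plain FTP upload is already built into the framework via WebClient, which issues an FTP STOR when given an ftp:// URI; the host, credentials, and paths below are placeholders. For FTPS you would switch to FtpWebRequest with EnableSsl = true, and SFTP needs a third-party library:

    using System.Net;

    class FtpUpload
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                client.Credentials = new NetworkCredential("user", "password"); // placeholders
                // With an ftp:// URI, WebClient performs an FTP STOR under the hood.
                client.UploadFile("ftp://ftp.example.com/uploads/file.bin", @"C:\temp\file.bin");
            }
        }
    }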
Related
There is a program which sends image files in binary by using WebClient.UploadFile(someUri, "STOR", filename). I can't change that program, but I need to build a program to receive the files. I don't want to implement a full FTP server, so what should I be looking at to create the bare minimum logic in C# to receive a file? A bonus would be a solution that uses only features present in .NET 3.5 or 4.
I don't really know where to start, so any tip is appreciated. Thanks.
It sounds like you'll need to provide not an FTP server, but an HTTP server (aka a web server). From a little quick Googling, it looks like there's a library named Nancy for embedding a simple web server into a .NET application, and a lot of people seem to have good results with it: Embedded C# web server?
Of course, this would be the quick and dirty way. Probably a better long-term approach would be to create a normal ASP.Net website to receive the images, hosted on a normal IIS web server. But if you have no experience in web development, that might be biting off a lot.
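If you go the embedded route, here is a sketch of what that can look like with a self-hosted Nancy 1.x module; the port and save folder are placeholders, and note this receives standard HTTP multipart uploads, not the FTP STOR the sender currently uses:

    using System;
    using System.IO;
    using Nancy;
    using Nancy.Hosting.Self;

    // Saves any files posted to /upload as multipart/form-data.
    public class UploadModule : NancyModule
    {
        public UploadModule()
        {
            Post["/upload"] = _ =>
            {
                foreach (var file in Request.Files)
                    using (var target = File.Create(Path.Combine(@"C:\incoming", Path.GetFileName(file.Name))))
                        file.Value.CopyTo(target);
                return HttpStatusCode.OK;
            };
        }
    }

    class Program
    {
        static void Main()
        {
            using (var host = new NancyHost(new Uri("http://localhost:8080")))
            {
                host.Start();
                Console.WriteLine("Listening on port 8080; press Enter to quit.");
                Console.ReadLine();
            }
        }
    }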
Thank you for the suggestions, but through more digging around I found out how to implement what I need: a very simple FTP server done in Python using sockets. I was able to easily replicate it in C# using sockets as well, and I have adapted it to authenticate a user of my choosing and to write the received files to disk.
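For anyone following the same path, here's roughly the shape of it: a bare-minimum, single-connection receiver that speaks just enough FTP (USER/PASS/TYPE/PASV/STOR, plus loose handling of OPTS and PWD, which .NET's FTP client also sends) for WebClient.UploadFile to complete. The hardcoded password and advertised IP are placeholders, and error handling is omitted:

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    // Bare-minimum, single-client FTP receiver: just enough protocol to
    // accept a file from WebClient.UploadFile(uri, "STOR", filename).
    class MinimalFtpReceiver
    {
        static void Main()
        {
            var control = new TcpListener(IPAddress.Any, 21);
            control.Start();
            using (var client = control.AcceptTcpClient())
            using (var stream = client.GetStream())
            using (var reader = new StreamReader(stream, Encoding.ASCII))
            using (var writer = new StreamWriter(stream, Encoding.ASCII) { AutoFlush = true, NewLine = "\r\n" })
            {
                writer.WriteLine("220 Minimal FTP receiver ready");
                TcpListener data = null;
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    string cmd = line.Split(' ')[0].ToUpperInvariant();
                    string arg = line.Length > cmd.Length ? line.Substring(cmd.Length + 1) : "";
                    switch (cmd)
                    {
                        case "USER": writer.WriteLine("331 Password required"); break;
                        case "PASS": writer.WriteLine(arg == "secret" ? "230 Logged in" : "530 Login incorrect"); break;
                        case "OPTS": writer.WriteLine("200 OK"); break;
                        case "PWD": writer.WriteLine("257 \"/\" is the current directory"); break;
                        case "TYPE": writer.WriteLine("200 Type set"); break;
                        case "PASV":
                            data = new TcpListener(IPAddress.Any, 0);
                            data.Start();
                            int port = ((IPEndPoint)data.LocalEndpoint).Port;
                            // Advertise 127.0.0.1 for local tests; use the server's real IP on a network.
                            writer.WriteLine(string.Format("227 Entering Passive Mode (127,0,0,1,{0},{1})", port / 256, port % 256));
                            break;
                        case "STOR": // assumes PASV was sent first
                            writer.WriteLine("150 Opening data connection");
                            using (var dataClient = data.AcceptTcpClient())
                            using (var file = File.Create(Path.GetFileName(arg)))
                                dataClient.GetStream().CopyTo(file); // CopyTo needs .NET 4; on 3.5 loop with a byte buffer
                            data.Stop();
                            writer.WriteLine("226 Transfer complete");
                            break;
                        case "QUIT": writer.WriteLine("221 Goodbye"); return;
                        default: writer.WriteLine("502 Not implemented"); break;
                    }
                }
            }
        }
    }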
Let me start by describing what I want to do:
I have a Unity app, and I code it in C#.
In the app I want to download a file from a server, so I am using HttpWebRequest.
I send a HEAD request first to check whether the file on the device needs to be updated, and to get the size of the file on the server. If the file needs to be updated, I download it.
The class that I use for the downloads is in this git.
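For reference, the HEAD check itself is only a few lines with HttpWebRequest; the URL below is a placeholder:

    using System;
    using System.Net;

    class HeadCheck
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/files/data.bin");
            request.Method = "HEAD"; // headers only; no body is downloaded
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                long remoteSize = response.ContentLength;    // from Content-Length
                DateTime remoteTime = response.LastModified; // from Last-Modified, if the server sends it
                Console.WriteLine("Size: {0}, modified: {1}", remoteSize, remoteTime);
                // Compare against the locally cached file and only issue the
                // full GET when the size or timestamp differ.
            }
        }
    }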
Now my problem is I don't know how to host the files because of the following issues:
I tried hosting it on a site with direct download links like ge.tt, but it does not support HEAD requests.
I tried hosting it on a free web host (000webhost), but the download gets stuck most of the time (for large files).
I tried hosting it on Dropbox but the function webRequest.EndGetResponse never returns.
I think the best solution would be to host the files on my own computer, but I don't know how to do that, or how to get a download link that will work outside of my LAN (a rough sketch of one option follows below).
I would greatly appreciate any ideas you have, and answer questions about the code.
Thank you!
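One option, sketched under assumptions: self-host with HttpListener (port 8080 and the C:\hosted folder are placeholders), which answers both GET and HEAD. Reaching it from outside the LAN additionally needs a port-forward on your router plus your public IP or a dynamic DNS name:

    using System;
    using System.IO;
    using System.Net;

    class FileHost
    {
        static void Main()
        {
            var listener = new HttpListener();
            listener.Prefixes.Add("http://+:8080/"); // may need admin rights or a urlacl reservation
            listener.Start();
            while (true)
            {
                var context = listener.GetContext();
                string path = Path.Combine(@"C:\hosted", Path.GetFileName(context.Request.Url.AbsolutePath));
                if (File.Exists(path))
                {
                    context.Response.ContentLength64 = new FileInfo(path).Length; // lets HEAD report the size
                    if (context.Request.HttpMethod != "HEAD")
                        using (var file = File.OpenRead(path))
                            file.CopyTo(context.Response.OutputStream);
                }
                else
                {
                    context.Response.StatusCode = 404;
                }
                context.Response.Close();
            }
        }
    }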
I need to somehow implement the ability to upload files through an ASP.NET application that runs within our corporate network. The problem is that those files are getting increasingly big. At the moment we're using a very generic asynchronous upload, capped at about 3.9 GB per file through maxAllowedContentLength, since that attribute is a uint and won't allow anything more. Soon the files users are supposed to upload will exceed this value and might reach up to 100 GB in size.
I tried looking online for a solution to this problem, but in most articles "large files" means 1 GB at best.
So is there any way to upload really large files (up to 100 GB) through an ASP.NET MVC/WebAPI application, or do I need to look for alternative solutions?
Yes there is: you need to split the file into smaller parts. See the example here: http://forums.asp.net/t/1742612.aspx?How+to+upload+a+big+file+in+Mvc+
You could consider sending it in chunks. This would sidestep the large-file limit (each request is only the size of the chunk you send), but it is slightly more complicated on the client and server side.
I've done something similar for streaming uploaded files over a WebSocket, but this could easily be done with multiple AJAX requests. In either case you'll want to use the JavaScript File API to read a segment of the file on the client's computer, encode that segment into something you can send (probably Base64), and send that particular segment to the web server. You could also send additional data, such as the file position, to ensure the server writes the file properly. The server can choose how to respond (it can be as simple as a "true" to acknowledge receipt), after which the client JavaScript reads and sends the next chunk of the file.
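Server-side, the receiving end can be as small as a single MVC action that decodes the Base64 chunk and writes it at the given offset; the route, parameter names, and target folder below are made up for illustration:

    using System;
    using System.IO;
    using System.Web.Mvc;

    public class UploadController : Controller
    {
        // Receives one Base64-encoded chunk and writes it at the given offset.
        [HttpPost]
        public ActionResult Chunk(string fileName, long position, string data)
        {
            byte[] bytes = Convert.FromBase64String(data);
            string path = Path.Combine(Server.MapPath("~/App_Data"), Path.GetFileName(fileName));
            using (var stream = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
            {
                stream.Seek(position, SeekOrigin.Begin);
                stream.Write(bytes, 0, bytes.Length);
            }
            return Json(true); // simple acknowledgement so the client sends the next chunk
        }
    }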
I have a demo of this using WebSockets on a github repo here (ASP.NET MVC server-side code here) but with a few tweaks you could easily make this into sequential AJAX requests.
... is very slow. We're trying to deploy a 280 MB cspkg file through the VS2010 tools, and it takes roughly 35 minutes to upload, and another 10 minutes to deploy.
Are there any ways to speed up this upload process? We're contemplating putting invariant data into a blob and pulling it from there, but we'd like to know what's happening in the first place.
Edited to reflect that we're using the VS2010 Azure integration tools.
Both deployment methods (API and Portal) allow you to deploy from a file that is already uploaded to Azure Storage. The VSTS tools are just utilizing this feature behind the scenes. (In 2010 you have to provide storage credentials for this reason).
You should look into uploading the .cspkg into a blob directly (vs. through VSTS), and then write a simple upload client that breaks the upload into blocks, which can be uploaded simultaneously. You can then tweak this (block size and number of blocks uploading at a time) to better utilize your outgoing bandwidth. Then you just use the API to "assemble" them in Azure once they are all there. This should really speed up the upload.
I think, to answer your question as to "what's happening": you are just getting synchronous WebClient I/O to Azure Storage, and all the limitations that come with it.
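For reference, here is a rough sketch of the block approach with the classic storage client library; the connection string, names, and block size are placeholders, and a real client would push several blocks in parallel rather than sequentially as here:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class BlockUploader
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("your-connection-string"); // placeholder
            var blob = account.CreateCloudBlobClient()
                              .GetContainerReference("deployments")
                              .GetBlockBlobReference("app.cspkg");

            const int blockSize = 4 * 1024 * 1024; // 4 MB per block; tune to your bandwidth
            var blockIds = new List<string>();
            using (var file = File.OpenRead("app.cspkg"))
            {
                var buffer = new byte[blockSize];
                int read, index = 0;
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Block IDs must be Base64 strings of equal length.
                    string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
                    blob.PutBlock(blockId, new MemoryStream(buffer, 0, read), null);
                    blockIds.Add(blockId);
                }
            }
            blob.PutBlockList(blockIds); // "assemble" the blocks into the final blob
        }
    }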
We have been hitting a very similar problem recently, as we had to package about 40 MB of third-party libraries to establish a SQL connection to Oracle from Windows Azure.
Through Lokad.CQRS, we did exactly what you suggest, i.e. putting all the big static libraries into blob storage and keeping the Azure package as lean as possible. It works very nicely.
We're currently creating an online music store that allows administrators (only) to upload songs and previews to the website. The problem is that uploading a song takes about 3 or 4 minutes. Is that normal? Can someone tell me what I should ask the website's hosts to check, because our client is not really happy that uploading the 100-200 songs needed to launch his website will take about 300-800 minutes (5-13 hours) :oP
Here's the httpRuntime element we've put in web.config:
    <httpRuntime maxRequestLength="20480" executionTimeout="240" />
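(For reference: maxRequestLength is measured in kilobytes, so 20480 allows uploads up to 20 MB, and executionTimeout is in seconds.)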
Thanks
The first step is to check the host's bandwidth limitations. Is this described in a service agreement or similar? Ask them. If you have access to the host server, you could check for yourself using a variety of speed-test tools, or simply by transferring a file independently of your application.
The other thing to check is the client's bandwidth. What's the ISP's bandwidth (downstream and upstream)? Are there any limits or throttling? Does the speed vary at different times of the day (or night)? Uploads will only go as fast as the slowest link in the chain, and if DSL/cable is involved, remember that these connections are often asymmetric and, if so, usually significantly slower upstream than downstream.
If the host and client bandwidths are okay, then start looking at the application's configuration.
Your server can handle fast uploads; the problem is bandwidth. Internet providers optimize connections for fast downloads and slower uploads. If you can, offer FTP access to admins so they can upload their files. It should be faster than HTTP anyway.