Download very large files (8GB+) - combining WCF and WebClient? - c#

I am having quite some trouble creating a WCF service that supports downloads of very large files. I have read a lot of guides on setting the transferMode attribute to Streamed and increasing all the messageSize and bufferSize attributes to Int32.MaxValue, but I still have no luck. (I am also returning the stream as the message body via the MessageBodyMember attribute, and sending metadata in the headers via MessageHeader attributes.)
With all these attributes set I can download smaller files fine, but when I try to download 1-2GB files I simply get a 400 Bad Request error, which makes it rather hard to debug...
My service should ideally support file sizes of at least 8GB. Is this even doable with WCF? The various messageSize attributes of the web.config file seem to be limited to Int32.MaxValue which equals a maximum file size of 2GB.
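For reference, here is a simplified sketch of what I currently have (names changed, error handling omitted). The binding shown in code corresponds to the transferMode and messageSize attributes in my web.config:

using System;
using System.IO;
using System.ServiceModel;

[MessageContract]
public class FileRequest
{
    [MessageHeader(MustUnderstand = true)]
    public string FileName { get; set; }
}

[MessageContract]
public class FileResponse
{
    [MessageHeader(MustUnderstand = true)]
    public long Length { get; set; }

    // The stream is the only body member, as required for streamed transfer.
    [MessageBodyMember(Order = 1)]
    public Stream Data { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    FileResponse GetFile(FileRequest request);
}

public static class Bindings
{
    // Code equivalent of the web.config settings described above.
    public static BasicHttpBinding CreateStreamedBinding()
    {
        return new BasicHttpBinding
        {
            TransferMode = TransferMode.Streamed,
            MaxReceivedMessageSize = int.MaxValue,   // is there any way past this 2 GB ceiling?
            SendTimeout = TimeSpan.FromHours(4)
        };
    }
}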
From my research it seems I will have to use WebClient.DownloadFile instead.
Files should only be available for download to users who have the required rights. With WCF my download method could take a token parameter that the server could check, returning the stream only if the user had the rights to download the requested file. This does not seem straightforward with the WebClient approach. If anyone has some guidelines on how to do this (via the WebClient), I would very much appreciate it.
Ideally my WCF service should administer and issue user tokens and somehow track, per file, which tokens are currently valid (tokens should be usable only once). The download itself should then happen via the WebClient.
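To make the idea concrete, this is roughly the flow I have in mind; nothing here is implemented yet, and the operation, handler and TokenStore names are all made up:

using System;
using System.Net;
using System.ServiceModel;
using System.Web;

// Hypothetical WCF contract that issues one-time download tokens.
[ServiceContract]
public interface IFileTokenService
{
    [OperationContract]
    string CreateDownloadToken(string fileId);
}

// Client side: ask the WCF service for a token, then download over plain HTTP.
public static class TokenDownloadClient
{
    public static void Download(IFileTokenService tokenService, string fileId)
    {
        string token = tokenService.CreateDownloadToken(fileId);
        string url = string.Format(
            "https://myserver/files/download.ashx?fileId={0}&token={1}",
            Uri.EscapeDataString(fileId), Uri.EscapeDataString(token));

        using (var client = new WebClient())
        {
            client.DownloadFile(url, @"C:\temp\bigfile.bin");
        }
    }
}

// Server side: a plain ASP.NET handler validates the token and streams the file from disk.
public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string token = context.Request.QueryString["token"];
        string fileId = context.Request.QueryString["fileId"];

        string path;
        if (!TokenStore.TryRedeem(token, fileId, out path))   // made-up one-time token store
        {
            context.Response.StatusCode = 403;
            return;
        }

        context.Response.ContentType = "application/octet-stream";
        context.Response.TransmitFile(path);   // streams from disk, no 2 GB message limit involved
    }

    public bool IsReusable
    {
        get { return true; }
    }
}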
Thanks in advance for any clues.

You can do this in WCF. Many moons ago I built a service that did this (we didn't have a web server as part of our configuration). We used WCF streaming:
http://msdn.microsoft.com/en-us/library/ms733742.aspx
The strategy to deal with large payloads is streaming. While messages, especially those expressed in XML, are commonly thought of as being relatively compact data packages, a message might be multiple gigabytes in size and resemble a continuous data stream more than a data package. When data is transferred in streaming mode instead of buffered mode, the sender makes the contents of the message body available to the recipient in the form of a stream and the message infrastructure continuously forwards the data from sender to receiver as it becomes available.
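On the service side it mostly boiled down to an operation that returns a Stream over a streamed binding. A simplified sketch from memory (paths and names are placeholders):

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IFileTransfer
{
    [OperationContract]
    Stream GetFile(string fileName);
}

public class FileTransferService : IFileTransfer
{
    public Stream GetFile(string fileName)
    {
        // WCF pulls from this stream while it writes the reply and disposes it afterwards,
        // so the whole file is never held in server memory.
        string path = Path.Combine(@"D:\FileStore", Path.GetFileName(fileName));
        return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
    }
}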

Related

Using web service to send large object (ex. image bitmap)

I'm trying to use a WCF web service in C# to send the bitmap data of an image to the client; I plan to send this as an object. I want to understand the nature of web services.
My question is: how does this work with a large file, such as 10MB? Does the web service send all 10MB worth of bitmap data in one single request/response? What if a network error occurs? Will the client have to reissue the request?
The reason I ask is that I have been working with sockets, and when it comes to large files I often break them down into chunks of manageable size (such as 4KB) and send them one at a time; if one succeeds, I send the next 4KB until all the data is transferred.
Thanks again.
The same principle can be applied here. You can send it in chunks. A related post you might find useful - wcf upload/download large files (i.e. Img, mp3) in chunks with windows service
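Very roughly, the contract and client loop could look like this (a sketch only; the operation names and chunk size are arbitrary):

using System;
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IChunkedDownload
{
    [OperationContract]
    long GetFileLength(string fileName);

    [OperationContract]
    byte[] GetChunk(string fileName, long offset, int count);
}

public static class ChunkedDownloadClient
{
    // Pull the file one chunk at a time and reassemble it locally.
    public static void Download(IChunkedDownload proxy, string fileName, string destination)
    {
        const int ChunkSize = 64 * 1024;   // 64 KB per call, arbitrary
        long length = proxy.GetFileLength(fileName);

        using (FileStream output = File.Create(destination))
        {
            for (long offset = 0; offset < length; offset += ChunkSize)
            {
                int count = (int)Math.Min(ChunkSize, length - offset);
                byte[] chunk = proxy.GetChunk(fileName, offset, count);
                output.Write(chunk, 0, chunk.Length);
                // If a call fails, only this chunk needs to be retried, not the whole file.
            }
        }
    }
}

With 64 KB chunks each message stays well under the default quotas, and 10 MB of bitmap data comes to roughly 160 calls.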

WCF Streaming Big file

How can I send the service a reference to a big file on the client's computer via a stream object and then start downloading it piece by piece from the client's computer (where I decide how many MB I transfer every second)? Are there any limitations when I use this?
IIS doesn't support streaming - it buffers the whole request.
CodeProject article: WCF 4.5 fixes this
Until then, if you use IIS, the whole file will be stored in server memory before it is passed to your service.
The solution for now is to send the file in chunks - each chunk sent in a separate service call.
This would also help with your bandwidth throttling. This is not built into WCF - you have to do it yourself. You can throttle each chunk either on the client or on the server.
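For example, a crude client-side throttle just paces the chunk calls. A sketch, assuming a made-up chunked contract along the lines of GetFileLength/GetChunk (repeated here so the sketch stands alone); the chunk size and rate are arbitrary:

using System;
using System.IO;
using System.ServiceModel;
using System.Threading;

[ServiceContract]
public interface IChunkedDownload
{
    [OperationContract]
    long GetFileLength(string fileName);

    [OperationContract]
    byte[] GetChunk(string fileName, long offset, int count);
}

public static class ThrottledDownloadClient
{
    // Pace the chunk calls so the average rate stays near a target bytes/second.
    public static void Download(IChunkedDownload proxy, string fileName, string destination,
                                int targetBytesPerSecond)
    {
        const int ChunkSize = 256 * 1024;   // bytes per service call, arbitrary
        TimeSpan delayPerChunk = TimeSpan.FromSeconds((double)ChunkSize / targetBytesPerSecond);

        long length = proxy.GetFileLength(fileName);
        using (FileStream output = File.Create(destination))
        {
            for (long offset = 0; offset < length; offset += ChunkSize)
            {
                int count = (int)Math.Min(ChunkSize, length - offset);
                byte[] chunk = proxy.GetChunk(fileName, offset, count);
                output.Write(chunk, 0, chunk.Length);
                Thread.Sleep(delayPerChunk);   // crude pacing between calls
            }
        }
    }
}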

Concurrency management in WCF

I have implemented a fairly simple WCF service which handles file transfers from my clients to the server. The problem is that when a client sends a file request, all of the bandwidth is allocated to that single client and the others have to wait until the requested file transfer is completed.
So my question is: how can I make the service more efficient and let the users share the bandwidth?
[ServiceBehavior(IncludeExceptionDetailInFaults = true,
                 InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
I set the InstanceContextMode attribute to PerCall, but that didn't do the trick.
UPDATE: This project is similar to mine:
http://www.codeproject.com/Articles/33825/WCF-TCP-based-File-Server
WCF does not have proper load balancing; you will have to develop it yourself.
If you are transferring files (let's assume downloads), you should send packets of data rather than the complete file at once. When doing this, add delays/sleeps to the process to limit the number of bytes the server sends in each time window; this will make room for other requests.
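A sketch of what that can look like on the server, assuming a made-up chunked contract with GetFileLength/GetChunk operations; the per-call cap and delay are arbitrary:

using System;
using System.IO;
using System.ServiceModel;
using System.Threading;

// Each call serves only a small slice and pauses briefly, so with per-call
// instancing and multiple concurrency no single client monopolizes the service.
[ServiceContract]
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class ChunkedFileService
{
    private const int MaxBytesPerCall = 64 * 1024;   // arbitrary cap per request

    [OperationContract]
    public long GetFileLength(string fileName)
    {
        return new FileInfo(MapToStorePath(fileName)).Length;
    }

    [OperationContract]
    public byte[] GetChunk(string fileName, long offset, int count)
    {
        count = Math.Min(count, MaxBytesPerCall);
        using (var stream = new FileStream(MapToStorePath(fileName), FileMode.Open,
                                           FileAccess.Read, FileShare.Read))
        {
            stream.Seek(offset, SeekOrigin.Begin);
            var buffer = new byte[count];
            int read = stream.Read(buffer, 0, count);
            if (read < buffer.Length)
            {
                Array.Resize(ref buffer, read);   // last chunk is usually shorter
            }

            Thread.Sleep(50);   // small delay per chunk leaves bandwidth for other callers
            return buffer;
        }
    }

    private static string MapToStorePath(string fileName)
    {
        // made-up helper: map the requested name into the server's file store safely
        return Path.Combine(@"D:\FileStore", Path.GetFileName(fileName));
    }
}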
It's questionable whether it's desirable to serve up files through a WCF endpoint. The reasons against doing so are pretty much exactly the problems you have been having. It works for a few clients at a time, but scaling out requires hosting new instances of the service behind a load balancer.
It would be worth considering hosting your files with some kind of storage service and having your WCF service simply return a link or handle to the file. Then the file can be retrieved offline. Microsoft created Azure Blob Storage for this exact purpose.
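As a sketch (using the current Azure.Storage.Blobs SDK purely for illustration), the service operation would only check the caller's rights and hand back a short-lived read-only link (a SAS); the storage service then serves the bytes:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class DownloadLinks
{
    // The WCF (or other) endpoint only authorizes the caller and returns this link;
    // storage serves the bytes, so the service host never touches the large payload.
    public static Uri CreateDownloadLink(string connectionString, string containerName, string blobName)
    {
        var blob = new BlobClient(connectionString, containerName, blobName);

        // Read-only access that expires after 15 minutes (the lifetime is arbitrary).
        return blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(15));
    }
}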
I appreciate this does not address your original question, and I understand the scope of your requirements may not accommodate a large reworking.
Another option is to use a chunking channel if you are transferring large files. Examples: MSDN, CodePlex.
That said, I agree with #hugh's position.

REST service and big files

Can a REST web service (which usually produces, e.g., simple JSON) both handle and return big binary input/output data?
I mean, can I call a REST service via an HTTP POST providing a big file and afterwards read the big result back? Is REST OK for that? ("Big" = a few megabytes.)
With text serializers such as JSON and XML you get roughly a 33% increase in the size of the files over the wire, as the binary data needs to be Base64 encoded. There are more optimized protocols such as MTOM to handle this scenario. WCF supports MTOM out of the box.
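Enabling MTOM is a one-line binding change; a sketch using a programmatic binding (the config equivalent is messageEncoding="Mtom" on the binding element):

using System.ServiceModel;

public static class MtomBindings
{
    // Binary parts (byte[] / Stream members) are sent raw instead of Base64 text.
    public static BasicHttpBinding Create()
    {
        return new BasicHttpBinding
        {
            MessageEncoding = WSMessageEncoding.Mtom,
            MaxReceivedMessageSize = 64 * 1024 * 1024   // illustrative: allow a few tens of MB
        };
    }
}

With MTOM, a few megabytes of binary data stay a few megabytes on the wire instead of growing by about a third.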
REST architectures are quite capable of using HTTP to serve up application/octet-stream, which is just a stream of bytes. HTTP can quite reliably serve very large files.
Since REST is primarily a service over HTTP, the standard advantages and limitations of HTTP apply to REST services too. You can send large files of a few MB via POST to a REST API, similar to the way one uploads a large file to a web app.

WCF Upload file

I've never used WCF before but I want to create a simple service. I want the computer to upload a file from the local machine. I've got this in my interface:
[OperationContract]
[WebInvoke(Method = "GET",UriTemplate = "/File")]
Stream GetFile();
In the method, I just do this:
return new FileStream(@"c:\myfile.zip", FileMode.Open);
When I run this from the client, the Result property contains the stream itself. This is fine, and it works, but I'd rather the client access the file from the server itself instead of copying the entire thing across. What would be the URL for the uploaded file so that the client can access it directly from the server? I assumed it's http://computername:port/something... (I believe this is a REST service?)
Let's think over what you are saying.
You want a user to be able to access a file directly. In that case it would be fair to think that any user could access any file on any computer. Obviously, that is not so.
WCF could provide some special interface to access the files on the computer where it is running, but that would also be a big security hole.
Any other ideas?
In any case, if you want to share the file on the web, you should make a public interface for it.
This could be an IIS server which returns it from the server machine, or it could just be your WCF server.
Anyway, what you did is right. Returning a Stream sends the file over the network, and it does not have the overhead of copying (duplicating) the data that you might think it has.
UPDATE:
To enable streaming you can look into this article
UPDATE 2:
If you really need to stream a video file, Smooth Streaming could be an option
I think you are getting things a bit mixed up here. You expose a WCF service with an operation returning a Stream. That it is a FileStream is an implementation detail; you could just as well return a MemoryStream, a NetworkStream, or anything that derives from Stream, really. If you want to give the client direct access to the file, you need to share it so it can be accessed. On Windows you could create a shared folder (but you really don't want to expose that on the internet), or you could set up an FTP or HTTP server.
I have not tinkered around with WCF and streaming, but I would expect that when you supply a stream from the server it works more or less out of the box (meaning it streams the file rather than sending it in one big chunk). Have a look at this MSDN article, which details some restrictions around streaming with WCF.
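For what it's worth, a sketch of what the streamed REST-style version could look like (paths and names are illustrative; the key points are the Streamed transfer mode and an explicit content type):

using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IDownloadService
{
    [OperationContract]
    [WebGet(UriTemplate = "/File")]
    Stream GetFile();
}

public class DownloadService : IDownloadService
{
    public Stream GetFile()
    {
        // Tell the client it is getting raw bytes rather than an XML/JSON payload.
        WebOperationContext.Current.OutgoingResponse.ContentType = "application/octet-stream";
        return new FileStream(@"c:\myfile.zip", FileMode.Open, FileAccess.Read, FileShare.Read);
    }

    // Host-side binding: without TransferMode.Streamed the whole file is buffered first.
    public static WebHttpBinding CreateStreamedBinding()
    {
        return new WebHttpBinding
        {
            TransferMode = TransferMode.Streamed,
            MaxReceivedMessageSize = long.MaxValue
        };
    }
}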
