How can I send the service a reference to a big file on the client's computer via a stream object, and then start downloading it from the client's computer piece by piece (where I decide how many MB I transfer per second)? Are there any limitations when I do this?
IIS doesn't support streaming - it buffers the whole request.
According to a CodeProject article, WCF 4.5 fixes this.
Until then, if you use IIS, the whole file will be stored in server memory before it is passed to your service.
The solution for now is to send the file in chunks, with each chunk sent in a separate service call.
This would also help with your bandwidth throttling. Chunking is not built into WCF - you have to implement it yourself. You can throttle each chunk on either the client or the server.
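As a rough sketch of the idea (all names here are invented for illustration, not a standard WCF API):

using System;
using System.IO;
using System.ServiceModel;

// Illustrative contract: each call returns one chunk, so the server
// never has to hold the whole file in memory.
[ServiceContract]
public interface IFileTransferService
{
    [OperationContract]
    long GetFileSize(string fileName);

    [OperationContract]
    byte[] GetChunk(string fileName, long offset, int chunkSize);
}

public class FileTransferService : IFileTransferService
{
    public long GetFileSize(string fileName)
    {
        return new FileInfo(fileName).Length;
    }

    public byte[] GetChunk(string fileName, long offset, int chunkSize)
    {
        using (FileStream fs = File.OpenRead(fileName))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            byte[] buffer = new byte[chunkSize];
            int read = fs.Read(buffer, 0, chunkSize);
            Array.Resize(ref buffer, read); // the last chunk may be shorter
            return buffer;
        }
    }
}

To throttle, the client simply spaces out its GetChunk calls (or the server sleeps between reads); either side controls the effective MB/sec.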
I'm trying to use a WCF web service in C# to send the bitmap data of an image to the client; I plan to send this as an object. I want to understand the nature of the web service.
My question is: how does this work with a large file, such as 10MB? Does the web service send all 10MB worth of bitmap data in one single request/response? What if a network error occurs? Will the client have to reissue the request?
The reason I ask is that I have been working with sockets, and when it comes to large files I often break them down into chunks of manageable size (such as 4KB) and send them one at a time; if one succeeds, I send the next 4KB until all the data is transferred.
Thanks again.
The same principle can be applied here: you can send the file in chunks. A related post you might find useful - wcf upload/download large files (i.e. Img, mp3) in chunks with windows service
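For illustration, the client side of such a chunked download might look like this (reusing the invented IFileTransferService contract from the earlier sketch; retry logic is omitted):

using System.IO;

public static class ChunkedDownloader
{
    public static void Download(IFileTransferService client,
                                string remoteName, string localPath)
    {
        const int ChunkSize = 64 * 1024; // 64 KB per service call; tune to taste
        long size = client.GetFileSize(remoteName);

        using (FileStream output = File.Create(localPath))
        {
            long offset = 0;
            while (offset < size)
            {
                // If a call fails, only this chunk needs to be retried,
                // not the whole transfer.
                byte[] chunk = client.GetChunk(remoteName, offset, ChunkSize);
                output.Write(chunk, 0, chunk.Length);
                offset += chunk.Length;
            }
        }
    }
}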
I have a WCF REST-based service which downloads and stores video files on the web server machine. The maximum size of the video files is around 1MB, and the download is currently asynchronous.
Is it possible to use synchronous download, and will a timeout occur if a large number of requests are submitted almost simultaneously? If so, is there a mechanism to handle the situation?
You could set up your service to act as a singleton. WCF will handle the queueing of multiple calls. A similar question can be found here, and may be of help.
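A minimal sketch of that singleton setup (the contract and file-writing body are placeholders, not from the question):

using System.ServiceModel;

[ServiceContract]
public interface IVideoService
{
    [OperationContract]
    void StoreVideo(string name, byte[] data);
}

// One instance, one call at a time: WCF queues overlapping requests
// rather than running them concurrently.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class VideoService : IVideoService
{
    public void StoreVideo(string name, byte[] data)
    {
        System.IO.File.WriteAllBytes(name, data);
    }
}

Note that queued clients are still subject to their configured sendTimeout, so if many requests pile up you may need to raise it on the client binding.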
I have implemented a fairly simple WCF service which handles file transfers from my clients to the server. The problem is that when a client sends a file request,
all of the bandwidth is allocated to that single client, and the others have to wait until the requested file transfer is completed.
So my question is: how can I make the service more efficient and let the users share the bandwidth?
[ServiceBehavior(IncludeExceptionDetailInFaults = true,
                 InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
I set the InstanceContextMode attribute to PerCall, but that didn't do the trick.
UPDATE: this project is similar to mine:
http://www.codeproject.com/Articles/33825/WCF-TCP-based-File-Server
WCF does not have proper load balancing; you will have to develop it yourself.
If you are transferring files (let's assume download), you should send packets of data rather than the complete file at once. When doing this, add delays/sleeps to the process to limit the number of bytes the server sends in each time window; this will make room for other requests.
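A sketch of that pacing idea (the rate arithmetic is the whole trick; everything else is placeholder):

using System;
using System.IO;
using System.Threading;

public static class ThrottledSender
{
    // Copy 'source' to 'destination' at no more than 'bytesPerSecond',
    // sleeping whenever we get ahead of the allowed rate.
    public static void CopyThrottled(Stream source, Stream destination,
                                     int bytesPerSecond)
    {
        byte[] buffer = new byte[8 * 1024];
        DateTime start = DateTime.UtcNow;
        long sent = 0;
        int read;

        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            sent += read;

            // How long *should* sending this many bytes have taken?
            double expectedSeconds = (double)sent / bytesPerSecond;
            double elapsedSeconds = (DateTime.UtcNow - start).TotalSeconds;
            if (expectedSeconds > elapsedSeconds)
                Thread.Sleep(TimeSpan.FromSeconds(expectedSeconds - elapsedSeconds));
        }
    }
}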
It's questionable whether it's desirable to serve up files through a WCF endpoint. The reasons against doing so are pretty much exactly the problems you have been having: it works for a few clients at a time, but scaling out requires hosting new instances of the service behind a load balancer.
It would be worth considering hosting your files with some kind of storage service and having your WCF service simply return a link or handle to the file. The file can then be retrieved out of band. Microsoft created Azure Blob Storage for this exact purpose.
I appreciate this does not address your original question, and I understand the scope of your requirement may not accommodate a large reworking.
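For illustration, the "return a link" approach could be as small as this (the operation name and the idea of a pre-authorized URL are assumptions for the sketch, not anything from the original question):

using System.ServiceModel;

[ServiceContract]
public interface IFileLinkService
{
    // The service checks permissions, then hands back a URL (e.g. an
    // Azure blob URL with a shared access signature) instead of the bytes.
    [OperationContract]
    string GetDownloadUrl(string fileId);
}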
Another option is to use the chunking channel if you are transferring large files. Examples: MSDN, CodePlex.
That said, I agree with @hugh's position.
I am having quite some trouble creating a WCF service that supports downloads of very large files. I have read a lot of guides on setting the transferMode attribute to Streamed and increasing all the messageSize and bufferSize attributes to Int32.MaxValue, and still I have no luck. (I am also returning the stream as the message body via the MessageBodyMember attribute, and metadata is sent in the headers using MessageHeader attributes.)
If I set all these attributes, I can download smaller files fine, but when I try to download 1-2GB files I simply get a 400 Bad Request error, which makes it rather hard to debug...
My service should ideally support file sizes of at least 8GB. Is this even doable with WCF? The various messageSize attributes in web.config seem to be limited to Int32.MaxValue, which equals a maximum file size of 2GB.
From my research, it seems I will have to use WebClient.DownloadFile instead.
Files should only be available for download to users who have the required rights. With WCF, my download method could take a token parameter that the server could check, returning the stream only if the user had the rights to download the requested file. This does not seem straightforward using the WebClient approach. If anyone has guidelines on how to do this (via the WebClient), I would very much appreciate it.
Ideally, my WCF service should administer and provide user tokens and somehow bind to each individual file which tokens are currently valid (tokens should be usable only once). The download should then happen via the WebClient.
Thanks in advance for any clues.
You can do this in WCF. Many moons ago I built a service that did this (we didn't have a web server as part of our configuration). We used WCF streaming:
http://msdn.microsoft.com/en-us/library/ms733742.aspx
The strategy to deal with large payloads is streaming. While messages, especially those expressed in XML, are commonly thought of as being relatively compact data packages, a message might be multiple gigabytes in size and resemble a continuous data stream more than a data package. When data is transferred in streaming mode instead of buffered mode, the sender makes the contents of the message body available to the recipient in the form of a stream and the message infrastructure continuously forwards the data from sender to receiver as it becomes available.
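In code, that boils down to an operation that returns a Stream and a binding with transferMode set to Streamed. A rough sketch (names are placeholders; note that MaxReceivedMessageSize is a long, so with streaming it is not capped at Int32.MaxValue the way the buffer sizes are):

using System;
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IStreamingService
{
    // With streaming, the returned Stream is forwarded to the client as
    // it is read; the server never buffers the whole file.
    [OperationContract]
    Stream DownloadFile(string fileName);
}

class BindingSetup
{
    // Binding configured in code rather than web.config for brevity.
    static BasicHttpBinding CreateStreamedBinding()
    {
        return new BasicHttpBinding
        {
            TransferMode = TransferMode.Streamed,
            MaxReceivedMessageSize = long.MaxValue, // 64-bit, not Int32-capped
            SendTimeout = TimeSpan.FromHours(1)     // big files take a while
        };
    }
}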
We're working on peer-to-peer comm software that would allow a number of grocery stores to sync their inventory with what we call "headquarters".
To do this, we're thinking WCF+WPF, with no IIS and no web services. My experience with WCF is basically zero, so my question is whether a TCP comm solution using WCF would work. The data being transferred is quite limited, about 2MB for a compressed plain-text file (so we're sending binary data!), and this is done only once per day. So bandwidth/load shouldn't be an issue here.
The idea at this point is to have a WCF "server" running at HQ. Stores make themselves known to that server and then send files back and forth (similar to a chat application).
What I'm not sure of: does every store need to have a WCF "server" (or endpoint)? How would the server (=HQ) send a file to one of the clients (=stores)? Every store can send a file to any other store and to HQ, and every store can also "request" a file from any other store or from HQ.
Two limitations: none of the machines/computers involved can run Windows Server for budget reasons, and, as stated before, IIS is a no-go.
If you are only sending files back and forth, I might question whether WCF even makes sense. Have you considered just using a file transfer protocol, like scp or sftp?
Every machine will have to accept connections and have a file-drop location set up, and then your application will have to monitor that location for new files. I love WCF in general, but a file transfer protocol has a leg up if that is all you want to do.
If you direct all of your traffic via the server, then there's no reason why you couldn't achieve this with WCF. The server would host WCF services in IIS, with the stores having a client that is able to upload and request files. With this method, stores would not be able to transfer files directly to each other; they would have to do it via the main server, which would suit your needs if you don't have the budget for the other scenario.
If all transfers are made once per day, each client would first request the files it requires, then upload any files that are required by the server or any other client. The final step would be the server distributing the required files to each client. Obviously, this is a simplified view of it; the actual process may require some more thinking.
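To make that concrete, a hub-and-spoke contract might look something like this (all names are invented for the sketch; a 2MB once-a-day file is small enough to send as a single byte array):

using System.ServiceModel;

[ServiceContract]
public interface IHqRelayService
{
    // Stores push their daily file to HQ...
    [OperationContract]
    void UploadFile(string storeId, string fileName, byte[] contents);

    // ...and poll HQ for anything addressed to them.
    [OperationContract]
    string[] ListPendingFiles(string storeId);

    [OperationContract]
    byte[] DownloadFile(string storeId, string fileName);
}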
You don't need to host WCF in IIS, but is there any particular reason you don't want to do that?
You can host WCF in a ServiceHost, but then you need to build, maintain and deploy a lot of server/service features that IIS provides for free, such as application process recycling, activation-based hosting, etc.
In any case, it almost sounds like you need peer to peer networking. You can do that with WCF using the NetPeerTcpBinding.
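For example, a minimal self-host looks like this (reusing the invented IHqRelayService contract from the sketch above; HqRelayService would be its implementation):

using System;
using System.ServiceModel;

class Program
{
    static void Main()
    {
        // Self-hosting: no IIS needed, just a process that stays running
        // (typically wrapped in a Windows service).
        var host = new ServiceHost(typeof(HqRelayService),
                                   new Uri("net.tcp://localhost:9000/hq"));
        host.AddServiceEndpoint(typeof(IHqRelayService),
                                new NetTcpBinding(), "");
        host.Open();

        Console.WriteLine("Service running. Press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}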
If you have an opportunity to redesign your application, I suggest you do. You can throw strings around in WCF, but if you can create a data contract you can keep all your communication strongly typed.
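For instance, a tiny (made-up) data contract for the inventory sync:

using System.Runtime.Serialization;

// Strongly typed message instead of raw strings or loose byte arrays.
[DataContract]
public class InventoryUpdate
{
    [DataMember] public string StoreId { get; set; }
    [DataMember] public byte[] CompressedPayload { get; set; }
}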
If you have access to Windows Server 2008, then the new IIS can host your WCF service even if it isn't using HTTP. Otherwise you just need to write an application that opens a ServiceHost, which you would usually wrap in a Windows service. But as @Mark Seemann pointed out, you get lots of freebies by running your service in IIS.
I don't have any experience with NetPeerTcpBinding, but I can tell you that NetTcpBinding is nice and fast, plus it comes with all sorts of goodies like encryption and authentication if you want them.