I have a WCF REST-based service that downloads and stores video files on the web server machine. The maximum size of a video file is around 1 MB, and the download is currently asynchronous.
Is it possible to use a synchronous download, and will a timeout occur if a large number of requests are submitted almost simultaneously? If so, is there a mechanism to handle that situation?
You could set up your service to act as a singleton; WCF will then handle the queueing of multiple calls. A similar question can be found here, and may be of help.
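As a minimal sketch of that setup (the service name and contract here are hypothetical, not from your code), a singleton service with serialized dispatch looks roughly like this:

```csharp
using System.ServiceModel;

// Hypothetical contract for illustration.
[ServiceContract]
public interface IVideoService
{
    [OperationContract]
    void StoreVideo(byte[] data);
}

// One instance serves all clients; with ConcurrencyMode.Single (the default),
// WCF queues concurrent calls and dispatches them one at a time.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class VideoService : IVideoService
{
    public void StoreVideo(byte[] data)
    {
        // Persist the file; calls arrive sequentially, so no extra locking is needed here.
    }
}
```

Note that queued callers are still subject to the binding's send/receive timeouts, so a long queue of simultaneous requests can still time out; the timeouts can be raised in the binding configuration.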
I am a WPF newbie and have little experience with C# thread programming. I have been assigned a C# WPF application that involves multi-threading. The requirement is to upload large local files to a
destination folder named "ABC" that sits directly under the root of the disk drive (e.g. C:\ABC) on a remote server running
Windows. I do not know which direction to take. Please advise. Thank you.
Here are the details of the requirements:
Because each uploaded file is big, the upload function needs to run on a separate thread.
I plan to use thread programming with async/await and Task objects. Any ideas?
I do not know which WPF control to use for the upload function. Please help.
For the destination folder "ABC", do I need to set its access permissions explicitly?
Should I use async/await with Task, or the BackgroundWorker class?
Update:
This is a WPF application, not a WCF application. Sorry for my typo.
To transfer large files using WCF service over HTTP, you can use the following types of bindings:
wsHttpBinding
basicHttpBinding
In wsHttpBinding, we can set the TransferMode attribute to Buffered, but this approach has a disadvantage for large files: the entire file must be held in memory before uploading/downloading, so a large buffer is required on both the web client and the WCF service host. However, this approach is very useful for transferring small files securely.
In basicHttpBinding we can set the TransferMode attribute to Streamed, so that the file is transferred in chunks.
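As a configuration sketch, enabling streaming on basicHttpBinding looks roughly like this (the service and contract names, message size, and timeout are placeholders to adapt to your service):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- Streamed transfer avoids buffering whole files in memory -->
      <binding name="fileStreaming"
               transferMode="Streamed"
               maxReceivedMessageSize="1073741824"
               sendTimeout="00:10:00" />
    </basicHttpBinding>
  </bindings>
  <services>
    <service name="MyNamespace.FileService">
      <endpoint address=""
                binding="basicHttpBinding"
                bindingConfiguration="fileStreaming"
                contract="MyNamespace.IFileService" />
    </service>
  </services>
</system.serviceModel>
```

Note that a streamed operation contract is restricted to a single Stream parameter (or return value) in the message body.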
For more information follow this article:
WCF Streaming: Upload/Download Files Over HTTP
and for transferring files over TCP/IP, read the articles below:
WCF TCP-based File Server
Sending Files using TCP
Large Message Transfer with WCF-Adapters Part 1
I’m using a service reference to communicate between my clients (C#) and my server.
I frequently want to transfer large amounts of data (> 2 MB) over this web service. This works perfectly, but for a few customers with very little upload bandwidth the process is very slow and completely saturates their upload for a while.
I am looking for a way to limit the bandwidth used by the client, to make sure some upload capacity remains available to the customer.
I have found various solutions for throttling streams, but is it possible to apply throttling to a web service call?
Maybe using a custom binding?
How can I send the service a reference to a big file on the client's computer via a stream object, and then download it from the client's computer piece by piece (deciding how many MB I transfer each second)? Are there any limitations when doing this?
IIS doesn't support streaming - it buffers the whole request.
CodeProject article: WCF 4.5 fixes this
Until then, if you use IIS, the whole file will be stored in server memory before it is passed to your service.
The solution for now is to send the file in chunks - each chunk sent in a separate service call.
This would also help with your bandwidth throttling. This is not built into WCF - you have to do it yourself. You can throttle each chunk either on the client or on the server.
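A rough client-side sketch of chunking with throttling (the `IFileService` contract and `UploadChunk` operation are hypothetical stand-ins for your own service contract):

```csharp
using System;
using System.IO;
using System.Threading;

// Hypothetical service contract the sketch assumes.
public interface IFileService
{
    void UploadChunk(string fileName, long offset, byte[] data);
}

public static class ChunkedUploader
{
    // bytesPerSecond caps the average upload rate; chunkSize trades
    // per-call overhead against throttling granularity.
    public static void Upload(IFileService service, string path,
                              int chunkSize = 64 * 1024, int bytesPerSecond = 256 * 1024)
    {
        var buffer = new byte[chunkSize];
        using (var file = File.OpenRead(path))
        {
            long offset = 0;
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
            {
                var chunk = new byte[read];
                Array.Copy(buffer, chunk, read);
                service.UploadChunk(Path.GetFileName(path), offset, chunk); // one service call per chunk
                offset += read;

                // Sleep long enough that the average rate stays under the cap.
                Thread.Sleep((int)(read * 1000L / bytesPerSecond));
            }
        }
    }
}
```

Chunking also gives you natural resume points: if a call fails, you can retry from the last acknowledged offset instead of restarting the whole file.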
I have implemented a fairly simple WCF service that handles file transfers from my clients to the server. The problem is that when a client sends a file request,
all of the bandwidth is allocated to that single client, and the others have to wait until the requested file transfer is completed.
So my question is: how do I make the service more efficient and let the users share the bandwidth?
[ServiceBehavior(IncludeExceptionDetailInFaults = true,
                 InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
I set the InstanceContextMode attribute to PerCall, but that didn't do the trick.
UPDATE: this project is similar to mine:
http://www.codeproject.com/Articles/33825/WCF-TCP-based-File-Server
WCF does not have proper load balancing; you will have to develop it yourself.
If you are transferring files (let's assume a download), you should send packets of data rather than the complete file at once. While doing this, add delays/sleeps to the process to limit the number of bytes the server sends in each time window; this makes room for other requests.
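A minimal sketch of pacing the server's writes this way (the class and parameter names are illustrative, not from any WCF API):

```csharp
using System;
using System.IO;
using System.Threading;

public static class PacedSender
{
    // Copies source to destination, sleeping after each window so the average
    // rate stays near bytesPerSecond and other requests get a share of the link.
    public static void Copy(Stream source, Stream destination, int bytesPerSecond)
    {
        var buffer = new byte[16 * 1024];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            // Sleep in proportion to the bytes just sent.
            Thread.Sleep((int)(read * 1000L / bytesPerSecond));
        }
    }
}
```

Dividing a fixed total rate by the number of active transfers before each window is one simple way to share the bandwidth fairly among concurrent clients.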
It's questionable that it's desirable to serve up files through a WCF endpoint. The reasons against doing this are pretty much exactly the problems you have been having. It works for a few clients at a time - but scaling out requires hosting new instances of the service behind a load balancer.
It would be worth considering hosting your files with some kind of storage service and have your WCF service simply return a link or handle to the file. Then the file can be retrieved offline. Microsoft have created Azure Blob Storage for this exact purpose.
I appreciate this does not address your original question, and I understand the scope of your requirement may not accommodate a large reworking.
Another option is to use a chunking channel if you are transferring large files. Examples: MSDN, CodePlex.
That said, I agree with @hugh's position.
When I first posted this question, I had strong coupling between my web service and my application controller: the controller needed to open multiple threads to the service, and as it received data back it had to do a lot of processing on the returned data and merge it into one dataset. I did not like that the client had to do so much processing and merging of the returned data before it was ready to use, and I wanted to move that layer into the service: let the service open the asynchronous threads to the suppliers and merge the results before returning them to the client.
One challenge was that I could not wait until all threads were complete and the results were merged; I had to start receiving data as it became available. That led me to implement an observer pattern on the service, so that it would notify my application whenever a new set of results was merged and ready to use, and send them to the application.
I was looking at how to do this with either ASMX web services or WCF, and so far I have found a way to implement it using WCF, but this thread is always open for suggestions and improvements.
OK, the solution to my problem came from WCF.
In addition to the classic request-reply operation of ASMX web services, WCF supports additional operation types: one-way calls, duplex callbacks, and streaming.
Not too hard to guess, duplex callback was what I was looking for.
Duplex callbacks simply allow the service to call back to the client. A callback contract is defined on the server, and the client is required to provide the callback endpoint on every call. It is then up to the service to decide when, and how many times, to use the callback reference.
Only bidirectional-capable bindings support callback operations. WCF offers WSDualHttpBinding to support callbacks over HTTP (callback support also exists in NetNamedPipeBinding and NetTcpBinding, since the TCP and IPC protocols support duplex communication).
One very important thing to note here is that duplex callbacks are nonstandard, a purely Microsoft feature. This is not a problem for my current task, as both my web service and my application run on Microsoft ASP.NET.
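A minimal sketch of such a callback contract for the supplier-merging scenario (the contract and method names here are hypothetical, not from my actual service):

```csharp
using System.ServiceModel;

// Callback contract the client must implement; the service invokes it
// whenever a merged batch of results is ready.
public interface IResultsCallback
{
    [OperationContract(IsOneWay = true)]
    void OnResultsReady(string[] results);
}

// The service contract declares its callback contract via CallbackContract.
[ServiceContract(CallbackContract = typeof(IResultsCallback))]
public interface ISupplierAggregator
{
    [OperationContract(IsOneWay = true)]
    void BeginQuery(string criteria);
}

public class SupplierAggregator : ISupplierAggregator
{
    public void BeginQuery(string criteria)
    {
        // Capture the caller's callback channel for later notifications.
        var callback = OperationContext.Current.GetCallbackChannel<IResultsCallback>();
        // ... query suppliers asynchronously, merge each batch, then notify:
        callback.OnResultsReady(new[] { "merged result" });
    }
}
```

The service can hold on to the captured callback channel and invoke it once per merged batch, which is exactly the observer-style notification described above.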
Programming WCF Services gave me a good jump start on WCF. At over 700 pages, it delves deep into all WCF concepts and has a dedicated chapter on callbacks and the other operation types.
Some other good resources I found on the net are:
Windows Communication Foundation (WCF) Screencasts
MSDN Webcast: Windows Communication Foundation Top to Bottom
Web Service Software Factory
The Service Factory for WCF
This sounds like a perfect use case for Windows Workflow Foundation. You can easily create a workflow to get information from each supplier, then merge the results when ready. It's much cleaner, and WF will do all the async stuff for you.
I'm not so sure that duplex is needed here... IMO, a standard async call with a callback should be more than sufficient to get notification of data delivery.
What is the biggest problem? If you are talking about async etc, then usually we are talking about the time taken to get the data to the client. Is this due to sheer data volume? or complexity generating the data at the server?
If it is the data volume, then I can think of a number of ways of significantly improving performance - although most of them involve using DTO objects (not DataSet/DataTable, which seemed to be implied in the question). For example, protobuf-net significantly reduces the data volume and processing required to transfer data.
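As a sketch, a protobuf-net DTO replacing a DataTable row might look like this (the type and fields are hypothetical; only the attributes and Serializer call are protobuf-net API):

```csharp
using System.IO;
using ProtoBuf; // protobuf-net NuGet package

// Contract-based DTO: a compact binary wire format instead of DataSet XML.
[ProtoContract]
public class SupplierQuote
{
    [ProtoMember(1)] public string SupplierId { get; set; }
    [ProtoMember(2)] public string Product { get; set; }
    [ProtoMember(3)] public decimal Price { get; set; }
}

public static class QuoteSerializer
{
    public static byte[] ToBytes(SupplierQuote quote)
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, quote); // writes the length-prefixed fields only
            return ms.ToArray();
        }
    }
}
```

For typical row-shaped data, the binary payload is a small fraction of the equivalent DataSet XML, which directly reduces the transfer time the client waits on.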
One of the ways to achieve this is by invoking your WS asynchronously (http://www.stardeveloper.com/articles/display.html?article=2001121901&page=1, http://www.ondotnet.com/pub/a/dotnet/2005/08/01/async_webservices.html), and then updating the GUI in the callback.
However, you could have timeout problems if querying the data takes too long. For example, if one of the suppliers' web sites is down or very slow, the whole query could fail. Maybe it would be better if the business logic on the client side did the merging instead of the WS.
Not sure if this solution fits your particular task, but anyway:
Add paging parameters to your WS API (int pageNumber, int pageSize, out int totalPages)
Add a short-lived TTL cache that associates the request details (maybe a hash value) with the output data
When your application asks for the first page, return it as soon as it is ready, and put the whole batch of collected/merged data into the cache, so that when the next page is requested you can use what has already been prepared.
Note, though, that you won't always get the most up-to-date data, so configure the cache expiration interval cautiously.
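The steps above can be sketched roughly as follows (the class, key scheme, helper method, and two-minute TTL are all illustrative assumptions):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching; // MemoryCache ships with .NET 4+

public class PagedResultService
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns one page; the full merged result set is cached under a key
    // derived from the request, so later pages reuse it.
    public IList<string> GetPage(string requestKey, int pageNumber, int pageSize,
                                 out int totalPages)
    {
        var all = Cache.Get(requestKey) as List<string>;
        if (all == null)
        {
            all = CollectAndMergeFromSuppliers(requestKey); // hypothetical helper
            // Short TTL: cached data expires after two minutes.
            Cache.Set(requestKey, all, DateTimeOffset.Now.AddMinutes(2));
        }
        totalPages = (all.Count + pageSize - 1) / pageSize;
        return all.Skip((pageNumber - 1) * pageSize).Take(pageSize).ToList();
    }

    private List<string> CollectAndMergeFromSuppliers(string requestKey)
    {
        return new List<string>(); // placeholder for the real supplier queries
    }
}
```

Only the first page pays the full collect-and-merge cost; subsequent pages within the TTL are served straight from the cache.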
The absolute best way to achieve this with your scenario and technology would be to have some kind of token shared between your web app/library and your web service, with a thread in your controller checking whether new results have arrived, etc. However, please note that you will need to get the complete data back from your WS each time, as its merge can remove items from the initial response.
Alternatively, I still think it would be better to handle the threads from the controller, using WCF web services.