How to handle large file uploads via WCF?

I am looking into using WCF for a project which would require the ability for people to upload large files (64MB-1GB) to my server. How would I handle this with WCF, possibly with the ability to resume uploads?
In order to handle a larger client base, I wanted to test out JSON via WCF. How would this affect the file upload? Can it be done with JSON, or would clients need to switch to REST for the upload portion?

If you want to upload large files, you'll definitely need to look into WCF Streaming Mode.
Basically, you can change the transfer mode on your binding; by default, it's buffered, i.e. the whole message needs to be buffered on the sender, serialized, and then transmitted as a whole.
With streaming, you can define either one-way streaming (for uploads only, or for downloads only) or bidirectional streaming. This is done by setting the transferMode of your binding to StreamedRequest, StreamedResponse, or just plain Streamed.
<bindings>
  <basicHttpBinding>
    <binding name="HttpStreaming"
             maxReceivedMessageSize="2000000"
             transferMode="StreamedRequest"/>
  </basicHttpBinding>
</bindings>
Then you need to have a service contract which either receives a parameter of type Stream (for uploads), or returns a value of type Stream (for downloads).
[ServiceContract]
public interface IFileUpload
{
    [OperationContract]
    bool UploadFile(Stream stream);
}
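On the service side, a minimal implementation could simply copy the incoming stream to disk, along these lines (a sketch; the upload directory and buffer size are assumptions):
using System.IO;

public class FileUploadService : IFileUpload
{
    public bool UploadFile(Stream stream)
    {
        // The target path is a placeholder; generate a meaningful name in practice.
        string path = Path.Combine(@"C:\Uploads", Path.GetRandomFileName());
        using (var file = File.Create(path))
        {
            // Copy in small chunks so memory use stays flat regardless of file size.
            var buffer = new byte[64 * 1024];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                file.Write(buffer, 0, read);
            }
        }
        return true;
    }
}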
That should do it!

MTOM (Message Transmission Optimization Mechanism) is optimized to handle large binary data.
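As a sketch, MTOM can be switched on for a standard binding in code (the size limit here is an assumption):
using System.ServiceModel;

var binding = new BasicHttpBinding
{
    // MTOM transmits binary data as raw MIME parts instead of base64-encoded XML.
    MessageEncoding = WSMessageEncoding.Mtom,
    MaxReceivedMessageSize = 64 * 1024 * 1024 // adjust to the expected file size
};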

You can use webHttpBinding with TransferMode streamed and a single Stream parameter or Stream response (as appropriate) for large file up/downloads, but you'd have to send any request metadata via URLs and/or headers, unless you're going to devise your own framing on the Stream. You'll have to build a custom non-browser client (like Silverlight, Flash, etc.) though, since browsers don't support random access to local files, and a normal file upload will be a form post, not JSON.
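For illustration, such a contract might pass the file name through the URL template while the body carries the raw bytes (a sketch; the interface name and URI template are assumptions):
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IRestFileUpload
{
    // The file name travels in the URL; the request body is the raw stream.
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "upload/{fileName}")]
    void Upload(string fileName, Stream fileContents);
}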

Related

Compression when using ASP.NET Core Response.BodyWriter PipeWriter

When using the Response.BodyWriter property to output JSON from an ASP.NET Core Web API endpoint, it seems the configured compression method is ignored.
Compression is enabled using the recommended services.AddResponseCompression and app.UseResponseCompression calls in Startup.cs, and indeed this works for normal endpoints that return object instances that are later serialized by the appropriate middleware.
However, using the PipeWriter method to write output seems to ignore these settings completely.
Of course, we can write compressed data to the Response.BodyWriter pipe directly, but this appears to involve determining the best compression method to use, setting up a Brotli or Gzip writer, and sending chunks through that - all of this is already done by ASP.NET somewhere, so ideally I'd like to avoid re-implementing all this myself.
Is there a standard way to apply the correct compression to the BodyWriter pipe?
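For reference, the manual approach described above would look roughly like the following sketch. It sidesteps the PipeWriter and writes through the underlying Response.Body stream instead, hard-codes Brotli, and skips Accept-Encoding negotiation (all assumptions for brevity):
using System.IO.Compression;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
public class DataController : ControllerBase
{
    [HttpGet("/data")]
    public async Task Get()
    {
        Response.ContentType = "application/json";
        Response.Headers["Content-Encoding"] = "br"; // assumes the caller accepts Brotli

        // Wrap the raw response stream so serialized bytes are compressed on the fly.
        await using var brotli = new BrotliStream(Response.Body, CompressionLevel.Fastest);
        await JsonSerializer.SerializeAsync(brotli, new { hello = "world" });
    }
}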

Unbuffered output from IHttpHandler

I want to stream data from an IHttpHandler class. I'm loading a large number of rows from the DB, serializing, and compressing them, then sending them down the wire. On the other end, I want my client to be able to decompress and deserialize the data before the server is even done serializing all the objects.
I'm using context.Response.OutputStream.Write to write my data, but it still seems like the output data is being put into a buffer before being sent to the client. Is there a way to avoid this buffering?
The Response.Flush method should send it down the wire; however, there are some exceptions. If IIS is using dynamic compression, that is, if it's configured to compress dynamic content, then IIS will not flush the stream. Then there is the whole 'chunked' transfer encoding. If you have not specified Content-Length, then the receiving end does not know how large the response body will be; this is accomplished with the chunked transfer encoding. Some HTTP servers require that the client use an Accept-Encoding request header containing the chunked keyword. Others just default to chunked when you begin writing bytes before the full length is specified; however, they do not do this if you have specified your own Transfer-Encoding response header.
With IIS 7 and compression disabled, Response.Flush should then always do the trick, right? Not really. IIS 7 can have many modules that intercept and interact with the request and response. I don't know of any that are installed or enabled by default, but you should still be aware that they can affect your desired result.
... I'm loading a large number of rows from the DB, serializing, and compressing them, then sending them down the wire...
Curious that you are compressing this content. If you are using GZIP, then you will not be in control of when and how much data is sent by calling Flush. Additionally, using GZIP content means that the receiving end may also be unable to start reading data right away.
You may want to break the records into smaller, digestible chunks of 10, 50, or 100 rows. Compress that and send it, then work on the next set of rows. Of course, now you will need to write something to the client so they know how big each compressed set of rows is, and when they have reached the end. See http://en.wikipedia.org/wiki/Chunked_transfer_encoding for an example of how the chunked transfer works.
You can use context.Response.Flush() or context.Response.OutputStream.Flush() to force buffered content to be written immediately.
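Putting those suggestions together, a handler that disables buffering and length-prefixes each compressed batch might look like this sketch (LoadRowsInBatches and SerializeAndCompress are hypothetical helpers standing in for the DB and compression code):
using System;
using System.Web;

public class StreamingHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Turn off ASP.NET's response buffering so Flush actually sends bytes.
        context.Response.BufferOutput = false;
        context.Response.ContentType = "application/octet-stream";

        foreach (var batch in LoadRowsInBatches(100)) // hypothetical data source
        {
            byte[] payload = SerializeAndCompress(batch); // hypothetical helper
            // Length-prefix each block so the client knows where it ends.
            byte[] prefix = BitConverter.GetBytes(payload.Length);
            context.Response.OutputStream.Write(prefix, 0, prefix.Length);
            context.Response.OutputStream.Write(payload, 0, payload.Length);
            context.Response.Flush(); // push this batch down the wire now
        }
    }
}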

How to increase the POST size for an ASMX web service?

Background
I am developing an ASP.Net server side control that needs to talk to an ASMX web service. The server side control uses a WebClient object to talk to the web service, since it needs to be reused often in various applications, and to make it easier on the developers, they are not required to create a service reference to the web service.
Implementation
During the use of the control, it is required to send a serialised object to the web service. The object is serialised using the XmlSerializer and the resulting XML string is then compressed using the chilkat compression library. The web service call for the control looks as follows:
webClient.UploadStringAsync(new Uri(serviceHost + serviceMethod), "POST", sendData);
The content of sendData (string) is compressedResponse={CompressedData}.
The web service has a method defined as follows to receive the data and then decompress the string value using the chilkat library before de-serialising the object using the XmlSerializer.
public void SaveResponse(string compressedResponse)
The communication between the control and the service is working. Initially there were no settings or bindings defined in the web.config for any of the above. After initial searching I did add
<httpRuntime maxRequestLength="20480"/>
to both the client and server web.config files. This has made no difference.
Problem
Compressed or uncompressed, the data posted to the web service in the sendData variable is too big for a normal POST request, and is corrupted. This is confirmed by checking the last few characters of the string before and after it is posted to the server: in uncompressed form, the XML document is missing its closing root tag when inspected in the debugger. The string can't be decompressed, and therefore the service call fails every time.
How do I increase the POST size for the WebClient request to ensure that the full string is received by the server?
I have looked at the various options on Google, but none give a good enough sample of where to make the changes, or of what the changes need to look like. I am completely lost as to whether the change needs to be made on the server or on the consuming website, and since there are no bindings defined for this, how to create a binding in the web.config for an ASMX HTTP service call.
I believe you must be hitting the ASP.NET max request length limit. You can modify that via the config file, such as:
<system.web>
  <httpRuntime executionTimeout="240" maxRequestLength="20480" />
</system.web>
The maxRequestLength value is in KB, so the above setting allows 20 MB. You can also apply the setting only to selected URLs using the location tag, e.g.
<location path="yourservice.asmx">
  <system.web>
    <httpRuntime executionTimeout="240" maxRequestLength="20480" />
  </system.web>
</location>
There seems to be no way to change the POST size for an ASMX web service when only HttpPost is enabled.
The solution in the end was to switch the service to HttpSoap and create a service reference in the assembly containing the control. Once done, the binding is created in code by the control once the endpoint is set via a property.
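For illustration, creating such a binding in code might look like the following sketch (MyServiceSoapClient is a placeholder for the generated proxy class, and the sizes are assumptions mirroring the config above):
using System;
using System.ServiceModel;

var binding = new BasicHttpBinding
{
    MaxReceivedMessageSize = 20 * 1024 * 1024, // 20 MB, mirrors maxRequestLength above
    SendTimeout = TimeSpan.FromMinutes(4)
};
binding.ReaderQuotas.MaxStringContentLength = 20 * 1024 * 1024;

// MyServiceSoapClient stands in for the proxy generated by the service reference.
var client = new MyServiceSoapClient(binding, new EndpointAddress(serviceHost + "yourservice.asmx"));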

Sending a "larger" collection of data to an Azure WCF Service

I'm trying to create a web service using Azure.
For the time being, everything is being run locally. The web service and Azure work fine, a simple string Test() method which returns "Hello world" works without problems, as you'd expect. ;)
Now, I've created two methods which add rows to Azure Data Tables. The first sends (using a special DataContract) a single row of data, and this works fine.
The second is for sending larger amounts of data, and sends an IEnumerable. So, to test the service, I've created a client application which generates a number of random rows to send. If I create up to 42 rows and send them, all goes well.
Above that, I get a 400 Bad Request error.
The problem is that there's no inner message to work with (or rather, that WAS the inner message). I strongly suspect, however, that it has to do with the size of the request.
Note that if I put a breakpoint on the service's method, it doesn't even get that far. I've read quite a few forum posts about similar issues, but those seemed to deal with ordinary WCF services, not Azure ones, and so the Web.config file doesn't contain definitions for bindings or endpoints, which would be something I could work with.
Please help.
PS. I realise I may have posted very little information. If something else is needed, please ask, and I'll do my best to include it.
Adding the following lines to the Web.config file (under system.serviceModel) in the Azure service project (NOT the Web.config in the client application) resolved the issue:
<bindings>
  <basicHttpBinding>
    <!-- The basicHttpBinding is used for clients which use the generated code to transmit data;
         the following settings make it possible to send larger amounts to the service -->
    <binding maxReceivedMessageSize="10000000" receiveTimeout="01:00:00">
      <readerQuotas maxStringContentLength="10000000" />
    </binding>
  </basicHttpBinding>
</bindings>

WCF streaming on ASMX?

I've got a WCF service for WCF streaming. It works.
But I must integrate it with our web service.
Is there any way to have a web method like this:
[WebMethod]
public Stream GetStream(string path)
{
    return Iservice.GetStream(path);
}
Iservice is a class which I copied from the WCF service to my ASMX service.
And is there any way to integrate the App.config from WCF with the web.config?
Sorry, no, ASMX web services don't support streaming.
What is the bigger picture here; what are you trying to achieve with this stream?
Like John Saunders already said: web services don't support it. This is behaviour by design: data is serialized into a platform/language-independent and human-readable XML packet, sent, and deserialized on the receiver side. Of course you could go and split up your stream into chunks and send it piece by piece, as sketched below. But it wouldn't really make sense to misuse web services like that, plus you are adding huge overhead in bandwidth and processing time.
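For completeness, the chunking workaround mentioned above might look something like this sketch (the upload directory is a placeholder; note that ASMX serializes the byte[] parameter as base64, which is where the bandwidth overhead comes from):
using System.IO;
using System.Web.Services;

[WebMethod]
public void UploadChunk(string fileName, byte[] chunk, bool isLastChunk)
{
    // The upload directory is a placeholder; sanitise fileName in real code.
    string path = Path.Combine(@"C:\Uploads", Path.GetFileName(fileName));
    using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write))
    {
        fs.Write(chunk, 0, chunk.Length);
    }
    // isLastChunk lets the service know when the file is complete.
}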
