Using a web service to send a large object (e.g. an image bitmap) - C#

I'm trying to use a WCF web service in C# to send the bitmap data of an image to the client, and I plan to send it as an object. I want to understand how web services handle this.
My question is: how does this work with a large file, such as 10 MB? Does the web service send all 10 MB of bitmap data in a single request/response? What if a network error occurs? Will the client have to reissue the request?
The reason I ask is that I have been working with sockets, and when it comes to large files I often break the data into chunks of a manageable size (such as 4 KB) and send them one at a time; if one chunk succeeds, I send the next 4 KB until all the data has been transferred.
Thanks again.

The same principle can be applied here: you can send the file in chunks. A related post you might find useful: wcf upload/download large files (i.e. Img, mp3) in chunks with windows service
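Neither the question nor the linked post includes code, so here is a minimal sketch of the chunked approach, assuming a hypothetical IFileTransferService contract with an UploadChunk operation (none of these names come from the original posts):

```csharp
using System;
using System.IO;
using System.ServiceModel;

// Hypothetical chunked-upload contract; the names are illustrative only.
[ServiceContract]
public interface IFileTransferService
{
    [OperationContract]
    void UploadChunk(string fileName, long offset, byte[] data);
}

public static class ChunkedUploader
{
    // Read the file in fixed-size chunks and send each one in its own call;
    // on a network error only the failed chunk has to be resent, starting
    // again from the last confirmed offset.
    public static void Upload(IFileTransferService service, string path, int chunkSize = 64 * 1024)
    {
        using (var fs = File.OpenRead(path))
        {
            var buffer = new byte[chunkSize];
            long offset = 0;
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                var chunk = new byte[read];
                Array.Copy(buffer, chunk, read);
                service.UploadChunk(Path.GetFileName(path), offset, chunk);
                offset += read;
            }
        }
    }
}
```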

Related

Stream Relaying for large uploads

I have a .NET web app (MVC4, Web API) that needs to receive large uploads (videos >500MB) via Multipart POST requests. I am trying to improve the user experience by relaying the incoming stream of the video part directly to the server that will process and store the video (Kaltura). This is intended as an improvement over the current process, which stores the video in a file on the filesystem that is later retrieved by the video server.
I've done this by creating a subclass of MultipartStreamProvider called RelayingStreamProvider. I pass it the request stream of a WebRequest I've created to the Kaltura server and it uses this for writing. When I construct the RelayingStreamProvider, I wrap the request stream in a BufferedStream to handle any latency or interruptions on either the reading or writing side of this process.
I've turned off Request buffering in my Web API controller as well.
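The provider code itself isn't shown in the question, but a minimal sketch along these lines (all names and the 32 KB default are illustrative, not taken from the original post) conveys the relaying approach described above:

```csharp
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

// Hypothetical reconstruction of the provider described above: every
// multipart part Web API reads is written straight into the outgoing
// request stream of the upstream (Kaltura) request, wrapped in a
// BufferedStream to smooth out read/write latency.
public class RelayingStreamProvider : MultipartStreamProvider
{
    private readonly Stream _relayStream;

    public RelayingStreamProvider(Stream upstreamRequestStream, int bufferSize = 32 * 1024)
    {
        _relayStream = new BufferedStream(upstreamRequestStream, bufferSize);
    }

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        // Hand Web API the relay stream so incoming bytes are forwarded
        // instead of being buffered to disk or into memory.
        return _relayStream;
    }
}
```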
My questions:
Am I overlooking anything that will likely bite me later with this approach? I'm aiming for efficiency by keeping the filesystem out of the process. On the upside, multiple simultaneous 500 MB uploads consume essentially no memory in my test implementation. Downsides?
I'm using a 32KB buffer for my BufferedStream. Is this a reasonable figure? What are the reasons to make this larger/smaller?
Finally, I am indeed aware that there are widget and mobile APIs I can use to take myself out of the middleman role for Kaltura, and we are going to be implementing these. Since the widget requires Flash, we are using this approach as a fallback for users who don't have Flash.

WCF Streaming Big file

How can I send the service a reference to a big file on the client's computer via a stream object, and then download it piece by piece from the client's computer (deciding how many MB I transfer every second)? Are there any limitations when I use this approach?
IIS doesn't support streaming - it buffers the whole request.
CodeProject article: WCF 4.5 fixes this
Until then, if you use IIS, the whole file will be stored in server memory before it is passed to your service.
The solution for now is to send the file in chunks - each chunk sent in a separate service call.
This would also help with your bandwidth throttling. This is not built into WCF - you have to do it yourself. You can throttle each chunk either on the client or on the server.
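As a rough illustration of the chunk-plus-throttle idea on the client side (the contract name, operation, and rate numbers below are all assumptions, not part of the answer):

```csharp
using System;
using System.IO;
using System.ServiceModel;
using System.Threading;

// Hypothetical contract: the server returns one chunk per call.
[ServiceContract]
public interface IChunkedDownloadService
{
    [OperationContract]
    byte[] DownloadChunk(string fileId, long offset, int count);
}

public static class ThrottledDownloader
{
    // Pull the file one chunk per service call and pause between calls
    // so the average rate stays close to maxBytesPerSecond.
    public static void Download(IChunkedDownloadService service, string fileId,
                                Stream destination,
                                int chunkSize = 256 * 1024,
                                int maxBytesPerSecond = 1024 * 1024)
    {
        long offset = 0;
        while (true)
        {
            byte[] chunk = service.DownloadChunk(fileId, offset, chunkSize);
            if (chunk == null || chunk.Length == 0)
                break;

            destination.Write(chunk, 0, chunk.Length);
            offset += chunk.Length;

            // Crude client-side throttle: sleep for the time slice this
            // chunk is allowed to occupy at the target rate.
            Thread.Sleep((int)(chunk.Length * 1000L / maxBytesPerSecond));
        }
    }
}
```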

Download very large files (8GB+) - combining WCF and WebClient?

I am having quite some trouble creating a WCF service that supports downloads of very large files. I have read a lot of guides on setting the transferMode attribute to Streamed and increasing all the messageSize and bufferSize attributes to Int32.MaxValue, and still I have no luck. (I am also returning the stream as the message body via the MessageBodyMember attribute, and metadata is sent via the headers using MessageHeader attributes.)
If I set all these attributes, I can download smaller files fine, but when I try to download 1-2GB files I simply get a 400 bad request error which makes it rather hard to debug...
My service should ideally support file sizes of at least 8 GB. Is this even doable with WCF? The various messageSize attributes in web.config seem to be limited to Int32.MaxValue, which would cap the maximum file size at 2 GB.
From my studies I have found that it seems I will have to use WebClient.DownloadFile instead.
Files should only be available for download to users who have the required rights. With WCF my download method could take a token parameter that the server checks, returning the stream only if the user has the rights to download the requested file. This does not seem straightforward with the WebClient approach. If anyone has guidelines on how to do this (via the WebClient), I would very much appreciate it.
Ideally my WCF service should administer and provide user tokens and somehow bind to every individual file what tokens are currently legal (tokens should be usable only once). Download should then happen via the WebClient.
Thanks in advance for any clues.
You can do this in WCF. Many moons ago I built a service that did this (we didn't have a web server as part of our configuration). We used WCF streaming:
http://msdn.microsoft.com/en-us/library/ms733742.aspx
The strategy to deal with large payloads is streaming. While messages, especially those expressed in XML, are commonly thought of as being relatively compact data packages, a message might be multiple gigabytes in size and resemble a continuous data stream more than a data package. When data is transferred in streaming mode instead of buffered mode, the sender makes the contents of the message body available to the recipient in the form of a stream and the message infrastructure continuously forwards the data from sender to receiver as it becomes available.
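As a rough sketch of what a streamed download contract can look like, following the MessageHeader/MessageBodyMember pattern the question mentions (the type and member names are illustrative, and the binding still needs transferMode set to Streamed):

```csharp
using System.IO;
using System.ServiceModel;

// Illustrative request/response messages; with a MessageContract the file
// metadata rides in headers while the body is the raw stream.
[MessageContract]
public class FileDownloadRequest
{
    [MessageHeader]
    public string Token { get; set; }
}

[MessageContract]
public class FileDownloadResponse
{
    [MessageHeader]
    public string FileName { get; set; }

    [MessageHeader]
    public long FileLength { get; set; }

    // With transferMode="Streamed" on the binding, the client reads this
    // stream as the data arrives instead of buffering the whole message.
    [MessageBodyMember]
    public Stream FileData { get; set; }
}

[ServiceContract]
public interface IFileDownloadService
{
    [OperationContract]
    FileDownloadResponse DownloadFile(FileDownloadRequest request);
}
```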

REST service and big files

Can a REST web service (which usually produces, for example, simple JSON) both handle and return large binary input/output data?
I mean calling a REST service via an HTTP POST that provides a big file, and afterwards reading the big result back. Is REST OK for that? ("Big" = a few megabytes)
With text serializers such as JSON and XML you get roughly a 33% increase in file size over the wire, because the binary data needs to be Base64 encoded. There are more optimized protocols such as MTOM to handle this scenario; WCF supports MTOM out of the box.
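For example, assuming a basicHttpBinding endpoint, switching the encoding to MTOM can be done in code roughly like this (the 64 MB limit is only an illustrative value):

```csharp
using System.ServiceModel;

public static class MtomBindingFactory
{
    // Sketch: the same basicHttpBinding endpoint can switch from text to
    // MTOM encoding, so byte[] members travel as raw binary attachments
    // instead of Base64 text (avoiding the ~33% size overhead).
    public static BasicHttpBinding Create()
    {
        return new BasicHttpBinding
        {
            MessageEncoding = WSMessageEncoding.Mtom,
            MaxReceivedMessageSize = 64L * 1024 * 1024 // illustrative limit of 64 MB
        };
    }
}
```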
REST architectures are quite capable of using HTTP to serve up application/octet-stream, which is just a stream of bytes. HTTP can quite reliably serve very large files.
Since REST is essentially a service style built on HTTP, the standard advantages and limitations of HTTP apply to REST services too. You can send large files of a few MB as a POST to a REST API, much as one uploads a large file to a web app.
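As a minimal sketch of the download side in ASP.NET Web API (the controller, route, and storage path are assumptions for illustration only):

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

// Illustrative ASP.NET Web API controller; the storage path is an assumption.
public class FilesController : ApiController
{
    [HttpGet]
    public HttpResponseMessage Get(string id)
    {
        var path = Path.Combine(@"C:\data", id); // hypothetical storage location
        if (!File.Exists(path))
            return Request.CreateResponse(HttpStatusCode.NotFound);

        var response = Request.CreateResponse(HttpStatusCode.OK);
        // StreamContent copies the file to the client as it is read,
        // without loading the whole payload into memory.
        response.Content = new StreamContent(File.OpenRead(path));
        response.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/octet-stream");
        return response;
    }
}
```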

WCF Service used by Java client and Java Service used by WCF client

I am working on a web service interface where my WCF application acts as both a client and a service. There are multiple Java clients that need to connect to my web service. I will need to accept a stream of images and documents and send back a stream of converted images.
I would also need to connect to other Java services to send the image streams as a payload to be stored in a database. I am new to web services; is there good documentation on how to enable streaming contracts between WCF and Java clients and vice versa?
If I want to return other information along with the stream of (grouped) images to the client, how would I do that? For example, the size of each image and its offset in the stream, so they can separate the images.
Thanks
In order to return additional information with your images, you will need to define a DataContract that contains the metadata elements as well as a collection to hold your images - perhaps representing your image collection as byte arrays rather than returning a raw stream of images. There are several ways to address the issue; the best solution depends on your design requirements.
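A minimal sketch of such a contract might look like this (the type and member names are purely illustrative):

```csharp
using System.Collections.Generic;
using System.Runtime.Serialization;

// Purely illustrative contract: each image is carried as a byte array
// with its own metadata, so the Java client does not need to split a
// single raw stream by hand.
[DataContract]
public class ImageBatch
{
    [DataMember]
    public List<ImageItem> Images { get; set; }
}

[DataContract]
public class ImageItem
{
    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public long SizeInBytes { get; set; }

    [DataMember]
    public byte[] Content { get; set; }
}
```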
