I need to implement a file downloader in C#. This downloader will be running on the client computer, and will download several files according to several conditions.
The main restriction I have is that the client will probably go offline during downloading (sometimes more than once), so I need the following things to happen:
1) Downloader should notice there isn’t any network communication anymore and pause downloading.
2) Downloader should resume downloading once communication is back, and continue collecting the packages, adding them to those that were already downloaded to the local disk.
I have checked previous StackOverflow posts and saw that there are two options – WebClient and WebRequest (using one of its derived classes). Can someone advise which one to use based on the requirements I have specified? And how can I detect a communication breakage?
You will need System.Net.HttpWebRequest to send HTTP requests and System.IO.FileStream to access files. The two methods you need are HttpWebRequest.AddRange and FileStream.Seek.
HttpWebRequest.AddRange adds a byte-range header to the request; the range parameter specifies the starting point of the range, and the server will send data from that point to the end of the data in the HTTP entity. FileStream.Seek sets the current position within a stream, so you can continue writing where the previous download left off.
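Put together, the two calls might look something like this (a minimal sketch; url and localPath are illustrative names, not from the question):

```csharp
// Requires: using System.IO; using System.Net;
// Resume a download from the bytes already saved locally.
long existing = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

var request = (HttpWebRequest)WebRequest.Create(url);
if (existing > 0)
    request.AddRange(existing); // adds "Range: bytes=<existing>-" to the request
                                // (the long overload needs .NET 4; older versions take int)

using (var response = request.GetResponse())
using (var network = response.GetResponseStream())
using (var file = new FileStream(localPath, FileMode.OpenOrCreate, FileAccess.Write))
{
    file.Seek(existing, SeekOrigin.Begin); // FileStream.Seek: continue where we stopped
    network.CopyTo(file);                  // Stream.CopyTo needs .NET 4+
}
```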
You need download resume (if the server you are downloading from supports it), which means you should go with WebRequest, since WebClient cannot do that (perhaps future versions will support Range requests).
As soon as the connection drops, the code reading the network stream throws an exception. That tells you there is a problem with the download (i.e. a network problem). You can then periodically try to make a new connection and, once it succeeds, resume from the last successful byte (using a Range header in the HTTP request).
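A sketch of that detect-and-retry loop (all names here are illustrative, not from the question):

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading;

class ResumingDownloader
{
    public static void DownloadWithRetry(string url, string localPath)
    {
        while (true)
        {
            try
            {
                long offset = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;
                var request = (HttpWebRequest)WebRequest.Create(url);
                if (offset > 0)
                    request.AddRange(offset); // resume from the last successful byte

                using (var response = request.GetResponse())
                using (var network = response.GetResponseStream())
                using (var file = new FileStream(localPath, FileMode.Append))
                {
                    network.CopyTo(file); // throws if the connection drops mid-read
                }
                return; // download completed
            }
            catch (WebException)
            {
                // Could not connect, or the connection died; fall through and retry.
            }
            catch (IOException)
            {
                // Reading the network stream failed part-way through.
            }
            Thread.Sleep(5000); // wait a bit before trying a new connection
        }
    }
}
```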
Last few days I've been building a web server application in C# that uses HttpListener. I've learned quite a lot on the way, and still am. Currently I got it all working, setting headers here and there depending on certain situations.
In most cases things work fine; however, at times an exception is thrown. This happens on a few occasions, most if not all of them being a connection closed before all the data is sent. That is when the error occurs. But some of them, as far as I can tell, are really caused by browsers.
Take Chrome, for example. Whenever I go to an MP3 file directly, it sends two GET requests, and one of them causes the error while the other works and receives part of the content. After this, I can listen to the MP3 with no issues. Streaming works.
But back to the request that gives me the error, there is nothing in the headers that I could use in my code to not output data, like I do already with HEAD requests. So I'm quite puzzled here.
IE also has this problem, both when opening MP3 files directly and when streaming via the HTML5 audio tag. It also varies from time to time. Sometimes I open the page and only two requests are made, the HTML page and the MP3, with no error. Sometimes, though, there are three requests and it connects to the MP3 twice. Sometimes one of those connections is aborted straight after I open the page, and sometimes two requests to the MP3 file don't even accept data. In both request headers they want the end of the file, so bytes: 123-123/124.
I've also tested it on w3schools' audio element example. IE makes two connections there as well, one aborted, the other loading the MP3 file.
So my question is: is it possible to make the web server exception-proof? Or, maybe a better question: is it bad that these exceptions are thrown? Or do you perhaps know how to fix these errors?
The error I'm getting is: I/O Operation has been aborted by either a thread exit or an application request.
The way I write to the client is:
using (Stream Output = _CResponse.OutputStream)
{
    Output.Write(_FileOutput, rangeBegin, rangeLength);
}
I am not sure if there's another (better) way. This is what I came across in many topics, tutorials and pages while researching.
About headers: the defaults are Content-Length, Content-Type and the status code. In some cases, like MP3 and video files, I add an Accept-Ranges: bytes header. If the request has a Range header, I add a Content-Range header and a PartialContent status code.
From the server's point of view, any client can disconnect at any time. This is part of the normal operation of a server. Detect this specific case, log it and swallow the exception (because it has been handled). It's not a server bug.
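With HttpListener that means catching the disconnect around the write, roughly like this (response, buffer and Log are assumed names, not from the question):

```csharp
// Requires: using System.IO; using System.Net;
try
{
    using (Stream output = response.OutputStream)
    {
        output.Write(buffer, rangeBegin, rangeLength);
    }
}
catch (HttpListenerException ex)
{
    // The client went away before we finished sending. Normal; just log it.
    Log("Client disconnected: " + ex.Message);
}
```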
Basically I have a listener that, when it receives a new connection, creates a new socketWorker and assigns the client's connection to it.
Now if the client sends a huge file (one that takes, say, 30 seconds to be fully received) and afterwards sends a tiny file of a few bytes, the tiny file isn't received until the huge file has been fully received.
This is obviously a bad approach and I wonder how I could do it so the files would be sent simultaneously?
As of now I'm using async methods; every time a file has been fully received, BeginReceive() is called again to receive the next file (a bad way to do it).
Any way to fix this?
I'd appreciate it!
You'll have to implement multiplexing, as SPDY does, for example. This is (basically) done by framing message parts and supplying a stream ID on each frame. This way, multiple streams can be exchanged over a single connection.
Alternatively, you could open one connection per file.
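A minimal sketch of the framing idea, assuming a simple layout of [4-byte stream ID][4-byte payload length][payload] (this format is illustrative, not SPDY's actual wire format):

```csharp
// Requires: using System.IO;
static byte[] BuildFrame(int streamId, byte[] payload)
{
    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write(streamId);       // which logical transfer this chunk belongs to
        writer.Write(payload.Length); // lets the receiver find the frame boundary
        writer.Write(payload);
        return ms.ToArray();
    }
}
```

The sender alternates frames from each active transfer over the one socket; the receiver reads the two header fields, then appends the payload to the partial file matching that stream ID. That way the tiny file's frames can interleave with the huge file's and arrive immediately.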
Is this possible? If so, what is the industry standard as far as software goes? Specifically, I am referring to .net controls.
Thank you
EDITED:
Here is what I need. I have a thin client with a balance, where RS-232 is used to interact with the thin client. Currently it is a Compact Framework app. What I would like to know is whether the same setup is possible in a web application. That would entail that the RS-232 port is NOT the server's RS-232 port; it is the one on the user's computer, the client. So when the RS-232 spits out input, it should go to the browser. Is that possible in a web application?
Two ways I can think of:
Buffer the serial data in an application object, then use an AJAX call triggered by a timer to grab and display the latest data.
For shorter streams of data, you could, instead of using ASP.NET controls per se, do something like:
Response.ContentType = "text/plain";
Response.Clear();
string serialData;
while ((serialData = getSerialData()) != null)
{
    Response.Write(serialData);
    Response.Flush();
}
This will write text content to the web browser in real time. You probably wouldn't want to keep this session open for too long, though.
If you want to display the stream of data within another page, just place the page with the above code inside an iframe.
Also note that the above would best be done in an ashx handler rather than an aspx page.
Sure, you can do it. You'd read from a System.IO.Ports.SerialPort instance, and output it via an HttpListener.
The trick will be knowing what you want to do with the page that the HttpListener serves up. Since data arrives on the COM port in real time, you'll probably want to buffer it between HTTP requests to your server; otherwise, if you try to read the port with no data waiting, you'll hang the listener. You can also miss data if you don't read regularly and the SerialPort read buffer fills up.
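One way to sketch that buffering (the class, the port name and the baud rate are assumptions):

```csharp
using System.IO.Ports;
using System.Text;

class SerialBuffer
{
    private readonly StringBuilder _buffer = new StringBuilder();
    private readonly object _lock = new object();

    public SerialBuffer(string portName)
    {
        var port = new SerialPort(portName, 9600);
        port.DataReceived += (sender, e) =>
        {
            // ReadExisting returns whatever has arrived without blocking.
            string chunk = port.ReadExisting();
            lock (_lock) _buffer.Append(chunk);
        };
        port.Open();
    }

    // Call this from the HttpListener request handler; it never touches the port,
    // so a request with no pending data returns immediately instead of hanging.
    public string Drain()
    {
        lock (_lock)
        {
            string data = _buffer.ToString();
            _buffer.Length = 0; // clear the buffer
            return data;
        }
    }
}
```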
I have a Silverlight 4 out-of-browser application that needs to be able to resume the download of an external file if the download is interrupted for any reason. I would like to be able resume instead of restart from the beginning because the file will be rather large and we have the potential to have users on slower connections.
I found some code at,
http://www.codeproject.com/Tips/157532/Silverlight-4-OOB-HTTP-Download-Component-with-Pau.aspx
but there seem to be numerous errors in it, so I'm not exactly confident that I'll be able to get it to work.
So, if anyone has any other original suggestions or alternatives, I'd like to hear them.
Thanks,
One approach you might consider is managing the download using the HTTP/1.1 Accept-Ranges response header and the Range request header.
Make sure the resource you are downloading will include the header:-
Accept-Ranges: bytes
when it is requested (a static file sent by IIS will do this by default).
Now, using the ClientHTTP stack, you make an initial "HEAD" request to determine whether the server will accept a Range: bytes= header in the request, and to find the total size of the content to be sent.
You then make a "GET" request for the resource including the header:-
Range: bytes=0-65535
This limits the downloaded content to just the first 64K chunk. You then repeat the same request with:-
Range: bytes=65536-131071
Each time, you save the content of the response stream to your destination file and keep track of how many bytes you have received. When you reach the final chunk, which is likely to be less than a full 64K, just use a header like:-
Range: bytes=131072-
That will read to the end of file.
If a request to the server fails, you can resume at the appropriate point in this sequence.
You also need to degrade gracefully: if the server does not include the Accept-Ranges header in the response to the initial "HEAD" request, you'll just have to download the whole file.
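The chunk loop described above might be sketched like this with HttpWebRequest (desktop syntax shown for brevity; the Silverlight ClientHTTP stack sets the header on the request slightly differently, and totalLength comes from the initial "HEAD" request):

```csharp
// Requires: using System; using System.IO; using System.Net;
static void DownloadInChunks(string url, string localPath, long totalLength)
{
    const long ChunkSize = 65536; // 64K per request

    using (var file = new FileStream(localPath, FileMode.OpenOrCreate, FileAccess.Write))
    {
        long offset = file.Length;           // resume after any bytes already saved
        file.Seek(offset, SeekOrigin.Begin);

        while (offset < totalLength)
        {
            long end = Math.Min(offset + ChunkSize, totalLength) - 1;
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.AddRange(offset, end);   // sends "Range: bytes=<offset>-<end>"

            using (var response = request.GetResponse())
            using (var stream = response.GetResponseStream())
            {
                stream.CopyTo(file);         // append this chunk to the file
            }
            offset = end + 1;                // next chunk starts after this one
        }
    }
}
```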
I am using the WebClient.UploadFile() method to post files to a service for processing. The file contains an XML document with a compressed, Base64-encoded content element in it. For some files (currently one), UploadFile throws an exception indicating that the underlying connection was closed. The innermost, socket-level exception gives the message 'An existing connection was forcibly closed by the remote host'.
Questions:
Has anyone encountered the same problem?
Why does it throw an exception for some files, and not for all?
Should I set some additional parameter for files with binary content?
Is there a workaround?
This functionality works fine over a VPN, but obviously we want it to work in standard Internet situations as well.
Thanks, Rine
Sounds like a firewall or other security software sitting in between you and the server may be rejecting the request as a potential attack. I've run into this before where firewalls were rejecting requests that contained a specific file extension-- even if that file extension was encoded in a query string parameter!
If I were you, I'd take the problematic file and start trimming XML out of it. You may be able to find a specific chunk of XML which triggers the issue. Once you've identified the culprit, you can figure out how to get around the issue (e.g. by encoding those characters using their Unicode values instead of as text before sending the files). If, however, any change to the file causes the problem to go away (it's not caused by a specific piece of worrisome text), then I'm stumped.
Any chance it's a size issue, and the problematic file is above a certain size while all the working files are below it? The server closing the connection when it hits a maximum accepted request size matches your symptom. You mentioned it worked over VPN, so it's admittedly a stretch, but maybe the VPN case hit a different server that's configured differently (or the maximum request size differs for some other reason).
Are there non-WebClient methods for uploading the file to the same service from the same machine and if so, do they work?