Upload file from 32Bit Machine fails to IIS 7 - c#

I am developing a Windows application that uploads a file to a web server (IIS). My code works just fine when I run the app on a 64-bit machine; the upload to IIS works. But when I run it on a 32-bit machine, the upload does not work.
I think it has something to do with IIS, but I don't know what it could be. Has anyone experienced the same issue?
UPDATE: This has nothing to do with the server side. I tested several endpoints, but none of them worked.
This must be related to my upload code. The following code works in 64-bit apps but not in 32-bit ones:
try
{
    System.Net.Http.HttpClient hc = new System.Net.Http.HttpClient();
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept", "text/html,application/xhtml+xml,application/xml");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 6.2; WOW64; rv:19.0) Gecko/20100101 Firefox/19.0");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept-Charset", "ISO-8859-1");

    using (VirtualStream ms = new VirtualStream() { Size = UploadSize })
    {
        StreamContent content = new StreamContent(ms, BufferSize);

        // Time reference for the calculation of the total average throughput
        var overallStart = DateTime.Now;
        var start = DateTime.Now;

        var responseTask = hc.PostAsync(URL, content);
        while (!responseTask.IsCompleted)
        {
            // Check the exit and abort constraints
            if ((DateTime.Now - overallStart).TotalMilliseconds > MaxTestLength || _cancelationRequested)
            {
                System.Diagnostics.Debug.WriteLine("Bytes sent " + bytesSent);
                hc.CancelPendingRequests();
                IsRunning = false;
                return;
            }

            try
            {
                bytesSent = ms.Position - bytesOfCalibrationPhase;
            }
            catch (Exception)
            {
                // The upload is an async process which disposes the underlying stream when the upload finishes.
                // In some cases this can lead to an ObjectDisposedException when accessing the current stream position.
                // If that happens, the upload has finished...
                break;
            }
        }
bytesSent is always 0 on 32-bit machines... Why is that?

It seems to me that the loop you have is there for three reasons:
1) You want to cancel the upload if cancellation is requested
2) You want to cancel the upload if there is a timeout
3) You want to know the progress of the upload
I suggest that you remove the loop completely and achieve these three goals in a different way. Here is how you can do this:
For (1), use the other overload of the PostAsync method that has a CancellationToken parameter. This allows you to provide a token that you can use from somewhere else to cancel the upload operation.
For (2), you can use the CancellationTokenSource (the one you used to create the CancellationToken) to request that the upload operation be canceled after some time (if the task has not already completed). See the CancelAfter method.
Here is some code sample for (1) and (2):
Put the following two lines in some place (probably as fields) so that these variables are available from both your upload code and the code that might wish to cancel the upload:
CancellationTokenSource cancellation_token_source = new CancellationTokenSource();
CancellationToken cancellation_token = cancellation_token_source.Token;
The following line of code will setup automatic cancellation after 10 seconds:
cancellation_token_source.CancelAfter(TimeSpan.FromSeconds(10));
In the following line, we pass the cancellation_token to the PostAsync method:
var responseTask = hc.PostAsync(URL, content, cancellation_token);
I noticed you were waiting for responseTask.IsCompleted to become true. This means that you don't want your method to return until the upload is complete. In this case, use the following to wait for the upload to complete.
responseTask.Wait();
If you want to convert your method to be asynchronous, mark it as async and instead use the following:
await responseTask;
You can use the following to cancel the upload:
cancellation_token_source.Cancel();
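Putting (1) and (2) together, here is a minimal sketch (the UploadAsync method name is illustrative; hc, URL and content are the variables from your code):
private readonly CancellationTokenSource cancellation_token_source = new CancellationTokenSource();

public async Task UploadAsync(HttpClient hc, string URL, HttpContent content)
{
    // Cancel automatically after 10 seconds if the upload has not finished by then.
    cancellation_token_source.CancelAfter(TimeSpan.FromSeconds(10));
    try
    {
        var response = await hc.PostAsync(URL, content, cancellation_token_source.Token);
        response.EnsureSuccessStatusCode();
    }
    catch (OperationCanceledException)
    {
        // We hit the timeout, or someone called cancellation_token_source.Cancel().
    }
}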
For (3), first look at the answers in this question.
If this does not work for you, I have the following suggestion:
You can create a decorator for the Stream class that notifies you when the stream is read, like this (please note the Read method):
public class ReadNotifierStreamWrapper : Stream
{
    private readonly Stream m_Stream;
    private readonly Action<int> m_ReadNotifier;

    public ReadNotifierStreamWrapper(Stream stream, Action<int> read_notifier)
    {
        m_Stream = stream;
        m_ReadNotifier = read_notifier;
    }

    public override void Flush()
    {
        m_Stream.Flush();
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        return m_Stream.Seek(offset, origin);
    }

    public override void SetLength(long value)
    {
        m_Stream.SetLength(value);
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        var bytes_read = m_Stream.Read(buffer, offset, count);
        m_ReadNotifier(bytes_read);
        return bytes_read;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        m_Stream.Write(buffer, offset, count);
    }

    public override bool CanRead
    {
        get { return m_Stream.CanRead; }
    }

    public override bool CanSeek
    {
        get { return m_Stream.CanSeek; }
    }

    public override bool CanWrite
    {
        get { return m_Stream.CanWrite; }
    }

    public override long Length
    {
        get { return m_Stream.Length; }
    }

    public override long Position
    {
        get { return m_Stream.Position; }
        set { m_Stream.Position = value; }
    }
}
And then you can use it to wrap your ms stream like this:
int total_bytes = 0;

var stream_wrapper = new ReadNotifierStreamWrapper(ms, bytes =>
{
    total_bytes += bytes;
    Debug.WriteLine("Bytes sent " + total_bytes);
});

HttpContent content = new StreamContent(stream_wrapper); // Here we create the StreamContent from stream_wrapper instead of ms
This way you will get a notification whenever the stream is read, and you will know how many bytes were read.
Please note that you might need to work more on the ReadNotifierStreamWrapper class to make it better. For example, maybe HttpClient decides for some reason that it wants to seek the stream via the Seek method; you might need to take that into account. That said, I don't think HttpClient will do this: it needs to read the whole file to upload it, so it makes no sense to skip parts of the file. You can put some breakpoints on the Seek method of ReadNotifierStreamWrapper to see whether it gets called.

The exception you get:
EventSourceException: No Free Buffers available from the operating system
is caused by Debug.WriteLine, which uses ETW (Event Tracing for Windows). ETW is a Windows kernel-level tracing facility that relies on a number of buffers to cache data before writing it to disk. ETW uses the buffer size and the size of physical memory to calculate the maximum number of buffers allocated for the event tracing session's buffer pool. So if your application is 64-bit, ETW will allocate more buffers because there is more addressable memory available.
The reason you are hitting the limit is that you are writing debugging events faster than ETW can process the buffers.
I believe it is possible to increase the maximum number of buffers via registry entries, but this increases memory consumption and you could still hit the limit again with bigger files, a slower connection, etc.
So your best bet is either to sleep the thread on each loop iteration so tracing events are written at a lower rate, or to replace Debug.WriteLine with some other form of logging.
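For example, a minimal sketch of rate-limited logging inside your existing loop (the 500 ms interval and the lastLog variable are arbitrary illustration choices, not part of your original code):
var lastLog = DateTime.Now;
while (!responseTask.IsCompleted)
{
    // ... existing exit/abort checks and bytesSent bookkeeping ...

    // Only emit a trace line every 500 ms instead of on every iteration.
    if ((DateTime.Now - lastLog).TotalMilliseconds > 500)
    {
        System.Diagnostics.Debug.WriteLine("Bytes sent " + bytesSent);
        lastLog = DateTime.Now;
    }

    System.Threading.Thread.Sleep(100); // also avoids a tight busy-wait loop
}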

Related

How to process data from one stream and return it as another async stream in ASP.NET

I'm writing an ASP.NET Web API (.NET 7). I need to create a service that loads specified data via a stream, does something to it on the fly, and then returns it as a stream.
Let's say this is the method definition in a StreamEncoder service class:
public async Task<Stream> EncodeStream(Stream input);
What that method needs to do is:
Immediately return the output stream
Keep loading input stream data in chunks (for example 32 bytes)
Process that chunk by encoding it into Base64
Pass that encoded chunk to the output stream.
The idea is that later this service can be used within the API endpoint. Something like this:
[HttpGet("image")]
public async Task<IActionResult> GetImage([FromQuery] string url, [FromQuery] string format)
{
// Perform checks ...
// Load stream
url = HttpUtility.UrlDecode(url);
var imageStream = await _imageLoaderService.LoadImage(url);
if (imageStream is null) return NotFound();
// Start processing the stream
var outputStream = await _streamEncoder(imageStream);
// Return immediately
return File(outputStream, Formats[format]);
}
Processing streams synchronously in batches is pretty simple, but I can't seem to find a solution for processing on the fly, so that the API client starts receiving data before the server has finished loading all of its data.
Very often the input data is over 500 MB in size and I need to process several of these at the same time. I can't just load all of the data into RAM, process it, and return the result at the end.
How can I solve this problem? Are there any libraries that would help with it?
There are a couple of things in play here.
First, streaming responses come with a couple of limitations.
The HTTP response consists of headers followed by the data. Since the status code is sent with the headers, the first limitation of streaming responses is that you cannot change the status code once you have started streaming. In order to stream a response, your API must return 200 and then stream. If you're streaming along and the upstream has an error, there's no way to change that status code to a 502 or 500; all you can do is throw an exception, and then ASP.NET will clamp the connection shut, which most clients will interpret as an error (some kind of general "communications error", not a 500).
The other limitation is that your code may not know the length of the response until after it's sent, especially since encoding is being done. This means your response won't have a Content-Length header, which means no nice progress updates for your clients.
But if you're OK with those limitations, then the specifics of how to do a streaming response come into play.
You can start streaming by calling StartAsync and then copy to the stream, as such:
[HttpGet("image")]
public async Task GetImage([FromQuery] string url, [FromQuery] string format)
{
// Load stream
url = HttpUtility.UrlDecode(url);
var imageStream = await _imageLoaderService.LoadImage(url);
if (imageStream is null)
{
Response.StatusCode = 404;
return;
}
// Set all the response headers.
Response.StatusCode = 200;
Response.ContentType = Formats[format];
Response.Headers[...] = ...
// Send the headers and start streaming.
await Response.StartAsync();
// Process the stream. This is just a straight copy as an example.
await imageStream.CopyToAsync(Response.Body);
}
Note that you do lose the nice IActionResult helpers with this approach. (In particular, if you're using File and friends to set Content-Disposition, the IActionResult helpers handle all the tedious header value encoding that is necessary.) If you want to keep the IActionResult helpers, then you can't use StartAsync directly. In that case I recommend you write your own IActionResult type.
I have a FileCallbackResult type on GitHub that passes the output stream to a callback. Using my type would look like this:
[HttpGet("image")]
public async Task<IAsyncResult> GetImage([FromQuery] string url, [FromQuery] string format)
{
// Load stream
url = HttpUtility.UrlDecode(url);
var imageStream = await _imageLoaderService.LoadImage(url);
if (imageStream is null)
return NotFound();
return new FileCallbackResult(Formats[format], async (stream, context) =>
{
// Process the stream. This is just a straight copy as an example.
await imageStream.CopyToAsync(stream);
});
}
Technically, it would also be possible to write a producer/consumer stream, but this would be more work. No type like that currently exists (except NetworkStream, but you can't control both sides of that one). In the past, this would have been considerably difficult, but today I think you could do it using pipelines. Pipelines are a more modern and more efficient form of stream that also support producer/consumer semantics. Once you had a producer/consumer stream, then you could pass it to the standard File helper method. The only tricky part is error handling: you'd have to be sure that your producer delegate was wrapped in a top-level try/catch and would capture and re-raise any exception to the consumer.
Update: Indeed, creating a producer/consumer stream is not difficult thanks to pipelines:
public sealed class ProducerConsumerStream
{
    public static Stream Create(Func<Stream, Task> producer, PipeOptions? options = null)
    {
        var pipe = new Pipe(options ?? PipeOptions.Default);
        var readStream = pipe.Reader.AsStream();
        var writeStream = pipe.Writer.AsStream();
        Run();
        return readStream;

        async void Run()
        {
            try
            {
                await producer(writeStream);
                await writeStream.FlushAsync();
                pipe.Writer.Complete();
            }
            catch (Exception ex)
            {
                pipe.Writer.Complete(ex);
            }
        }
    }
}
Usage (note that real-world usage should specify PipeOptions.PauseWriterThreshold):
public Stream EncodeStream(Stream input)
{
    return ProducerConsumerStream.Create(async output =>
    {
        // Process the stream. This is just a straight copy as an example.
        await input.CopyToAsync(output);
    });
}
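For example, a hedged sketch of passing explicit options (the threshold values below are arbitrary illustration choices) so that the producer pauses instead of buffering without bound when the consumer falls behind:
// Pause the writer once 1 MB is buffered; resume once the reader drains it below 512 KB.
var options = new PipeOptions(
    pauseWriterThreshold: 1024 * 1024,
    resumeWriterThreshold: 512 * 1024);

return ProducerConsumerStream.Create(async output =>
{
    await input.CopyToAsync(output);
}, options);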
[HttpGet("image")]
public async Task<IAsyncResult> GetImage([FromQuery] string url, [FromQuery] string format)
{
// Load stream
url = HttpUtility.UrlDecode(url);
var imageStream = await _imageLoaderService.LoadImage(url);
if (imageStream is null)
return NotFound();
var outputStream = _streamEncoder.EncodeStream(imageStream);
return File(outputStream, Formats[format]);
}

Speech recognition with Microsoft Cognitive Speech API and non-microphone real-time audio stream

Problem
My project consists of a desktop application that records audio in real-time, for which I intend to receive real-time recognition feedback from an API. With a microphone, a real-time implementation using Microsoft's new Speech-to-Text API is trivial, with my scenario differing from that only in the sense that my data is written to a MemoryStream object.
API Support
This article explains how to implement the API's Recognizer (link) with custom audio streams, which invariably requires the implementation of the abstract class PullAudioInputStream (link) in order to create the required AudioConfig object using the CreatePullStream method (link). In other words, to achieve what I require, a callback interface must be implemented.
Implementation attempt
Since my data is written to a MemoryStream (and the library I use only records to files or Stream objects), in the code below I simply copy the buffer over to the implemented class (in a sloppy way, perhaps?), resolving the divergence in method signatures.
class AudioInputCallback : PullAudioInputStreamCallback
{
    private readonly MemoryStream memoryStream;

    public AudioInputCallback(MemoryStream stream)
    {
        this.memoryStream = stream;
    }

    public override int Read(byte[] dataBuffer, uint size)
    {
        return this.Read(dataBuffer, 0, dataBuffer.Length);
    }

    private int Read(byte[] buffer, int offset, int count)
    {
        return memoryStream.Read(buffer, offset, count);
    }

    public override void Close()
    {
        memoryStream.Close();
        base.Close();
    }
}
The Recognizer implementation is as follows:
private SpeechRecognizer CreateMicrosoftSpeechRecognizer(MemoryStream memoryStream)
{
    var recognizerConfig = SpeechConfig.FromSubscription(SubscriptionKey, "westus");
    recognizerConfig.SpeechRecognitionLanguage =
        _programInfo.CurrentSourceCulture.TwoLetterISOLanguageName;

    // Constants are used as constructor params
    var format = AudioStreamFormat.GetWaveFormatPCM(
        samplesPerSecond: SampleRate, bitsPerSample: BitsPerSample, channels: Channels);

    // Implementation of PullAudioInputStreamCallback
    var callback = new AudioInputCallback(memoryStream);
    AudioConfig audioConfig = AudioConfig.FromStreamInput(callback, format);

    // The actual recognizer is created with the required objects
    SpeechRecognizer recognizer = new SpeechRecognizer(recognizerConfig, audioConfig);

    // Event subscriptions. Most handlers are implemented for debugging purposes only.
    // A log window outputs the feedback from the event handlers.
    recognizer.Recognized += MsRecognizer_Recognized;
    recognizer.Recognizing += MsRecognizer_Recognizing;
    recognizer.Canceled += MsRecognizer_Canceled;
    recognizer.SpeechStartDetected += MsRecognizer_SpeechStartDetected;
    recognizer.SpeechEndDetected += MsRecognizer_SpeechEndDetected;
    recognizer.SessionStopped += MsRecognizer_SessionStopped;
    recognizer.SessionStarted += MsRecognizer_SessionStarted;

    return recognizer;
}
How the data is made available to the recognizer (using CSCore):
MemoryStream memoryStream = new MemoryStream(_finalSource.WaveFormat.BytesPerSecond / 2);
byte[] buffer = new byte[_finalSource.WaveFormat.BytesPerSecond / 2];

_soundInSource.DataAvailable += (s, e) =>
{
    int read;
    _programInfo.IsDataAvailable = true;

    // Writes to the MemoryStream as the event fires
    while ((read = _finalSource.Read(buffer, 0, buffer.Length)) > 0)
        memoryStream.Write(buffer, 0, read);
};

// Creates the MS recognizer from the MemoryStream
_msRecognizer = CreateMicrosoftSpeechRecognizer(memoryStream);

// Initializes the loopback capture instance
_soundIn.Start();

await Task.Delay(1000);

// Starts recognition
await _msRecognizer.StartContinuousRecognitionAsync();
Outcome
When the application runs, I don't get any exceptions, nor any response from the API other than SessionStarted and SessionStopped, as shown in the log window of my application.
I could use suggestions of different approaches to my implementation, as I suspect there is some timing problem in tying the recorded DataAvailable event with the actual sending of data to the API, which is making it discard the session prematurely. With no detailed feedback on why my requests are unsuccessful, I can only guess at the reason.
The Read() callback of PullAudioInputStream should block if no data is immediately available; Read() should return 0 only when the stream reaches its end. The SDK will then close the stream after Read() returns 0 (find an API reference doc here).
However, the behavior of Read() on a C# MemoryStream is different: it returns 0 if there is no data available in the buffer. This is why you only see SessionStarted and SessionStopped events, but no recognition events.
To fix that, you need to add some kind of synchronization between PullAudioInputStream::Read() and MemoryStream::Write(), so that PullAudioInputStream::Read() waits until MemoryStream::Write() has written some data into the buffer.
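A rough, untested sketch of that idea (it replaces the MemoryStream with a BlockingCollection so Read blocks until the capture side provides data; all names here are illustrative, not part of the original code):
using System;
using System.Collections.Concurrent;
using Microsoft.CognitiveServices.Speech.Audio;

// Illustrative only: a pull callback that blocks in Read until the capture side adds data.
class BlockingAudioInputCallback : PullAudioInputStreamCallback
{
    private readonly BlockingCollection<byte[]> _chunks = new BlockingCollection<byte[]>();
    private byte[] _current;
    private int _pos;

    // Call this from the DataAvailable handler instead of MemoryStream.Write.
    public void AddChunk(byte[] data, int count)
    {
        var copy = new byte[count];
        Buffer.BlockCopy(data, 0, copy, 0, count);
        _chunks.Add(copy);
    }

    public override int Read(byte[] dataBuffer, uint size)
    {
        if (_current == null || _pos >= _current.Length)
        {
            _current = _chunks.Take(); // blocks until data is available
            _pos = 0;
        }

        int n = Math.Min((int)size, _current.Length - _pos);
        Buffer.BlockCopy(_current, _pos, dataBuffer, 0, n);
        _pos += n;
        return n;
    }
}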
Alternatively, I would recommend using PushAudioInputStream, which allows you to write your data directly into the stream. In your case, in the _soundInSource.DataAvailable event, instead of writing data into the MemoryStream you can write it directly into the PushAudioInputStream. You can find samples for PushAudioInputStream here.
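A minimal, untested sketch of that approach (reusing the SampleRate, BitsPerSample, Channels, buffer and _finalSource names from your code; everything else is illustrative):
// Create a push stream with the same PCM format and feed it directly from DataAvailable.
var format = AudioStreamFormat.GetWaveFormatPCM(
    samplesPerSecond: SampleRate, bitsPerSample: BitsPerSample, channels: Channels);
var pushStream = AudioInputStream.CreatePushStream(format);
var audioConfig = AudioConfig.FromStreamInput(pushStream);

_soundInSource.DataAvailable += (s, e) =>
{
    int read;
    while ((read = _finalSource.Read(buffer, 0, buffer.Length)) > 0)
        pushStream.Write(buffer, read); // write only the bytes actually read
};

// When capturing stops, close the stream so the recognizer sees the end of the audio:
// pushStream.Close();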
We will update the documentation in order to provide the best practice on how to use Pull and Push AudioInputStream. Sorry for the inconvenience.
Thank you!

Implementing a custom stream

First of all, I should mention that I have no internal Stream object available. Instead, I have this object:
public interface IChannel
{
    void Send(byte[] data);
    event EventHandler<byte[]> Receive;
}
I want to implement a Stream class, one like this:
public class ChannelStream : Stream
{
    private readonly IChannel _channel;

    public ChannelStream(IChannel channel)
    {
        this._channel = channel;
    }

    // TODO: Implement Stream class
}
The functionality I require is very similar to NetworkStream:
Writing bytes to my stream should add these bytes to a buffer and call _channel.Send once Flush() is called.
The Stream will also listen to _channel.Receive events and add the bytes to another internal buffer until they are read from the stream. If the Stream doesn't have any data available, it should block until new data becomes available.
I am, however, struggling with the implementation. I have experimented with using two MemoryStreams internally, but this caused the buffers to keep eating more and more RAM.
What kind of collection / stream can I use to implement my stream?
Consider what you need from the collection and go from there.
Here are a few questions you should consider when you need a collection of some sort:
Do you need random access to the items in the collection?
Is the collection going to be accessed by multiple threads?
Do you need to retain the data in the collection after it is read?
Is ordering important? If so, what order - add order, reverse add order, item ordering by some comparison?
For the output buffer in this case the answers are no, yes, no and yes: add order. That pretty much singles out the ConcurrentQueue class, which allows you to add objects from a source or sources that do not need to be on the same thread as the code that is reading them back out. It doesn't let you arbitrarily index the collection (well, not directly anyway), which you don't appear to need.
I'd use the same type for the input buffer, with a 'current block' buffer to hold the most recently read buffer, wrapped in some simple object locking semantics to handle any threading issues.
The output section looks something like this:
// Output buffer
private readonly ConcurrentQueue<byte[]> _outputBuffer = new ConcurrentQueue<byte[]>();

public override void Write(byte[] buffer, int offset, int count)
{
    // Copy the written data to a new buffer and add it to the output queue
    byte[] data = new byte[count];
    Buffer.BlockCopy(buffer, offset, data, 0, count);
    _outputBuffer.Enqueue(data);
}

public override void Flush()
{
    // Pull everything out of the queue and send it to wherever it is going
    byte[] curr;
    while (_outputBuffer.TryDequeue(out curr))
        internalSendData(curr);
}
The internalSendData method is where the data would then go out to the network.
The read buffering is a little more complex:
// Collection to hold unread input data
private readonly ConcurrentQueue<byte[]> _inputBuffer = new ConcurrentQueue<byte[]>();
// Current data block being read from
private byte[] _inputCurrent = null;
// Read offset in the current block
private short _inputPos = 0;
// Object for locking access to the above
private readonly object _inputLock = new object();

public override int Read(byte[] buffer, int offset, int count)
{
    int readCount = 0;
    lock (_inputLock)
    {
        while (count > 0)
        {
            if (_inputCurrent == null || _inputCurrent.Length <= _inputPos)
            {
                // Read the next block from the input buffer
                if (!_inputBuffer.TryDequeue(out _inputCurrent))
                    break;
                _inputPos = 0;
            }

            // Copy bytes to the destination
            int nBytes = Math.Min(count, _inputCurrent.Length - _inputPos);
            Buffer.BlockCopy(_inputCurrent, _inputPos, buffer, offset, nBytes);

            // Adjust all the offsets and counters
            readCount += nBytes;
            offset += nBytes;
            count -= nBytes;
            _inputPos += (short)nBytes;
        }
    }
    return readCount;
}
Hopefully that makes sense.
Using queues for this sort of buffering means that data is only held in memory for as long as it is waiting to be sent or read. Once you call Flush, the output buffer's memory is released for garbage collection, so you don't have to worry about memory blowouts unless you are trying to send a lot faster than the actual transport mechanism can handle. But if you're queuing up several megabytes of data every second to go out over an ADSL connection, nothing is going to save you :P
I'd add a few refinements to the above, like some checks to make sure that Flush gets called automatically once the buffer is at a reasonable level.
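For completeness, a minimal sketch (assuming the IChannel interface from the question; internalSendData is the method mentioned above, everything else is illustrative) of how the two queues could be wired to the channel:
public ChannelStream(IChannel channel)
{
    _channel = channel;

    // Incoming data is simply queued until Read consumes it.
    _channel.Receive += (sender, data) => _inputBuffer.Enqueue(data);
}

// Called by Flush for each dequeued block; this is where data leaves over the channel.
private void internalSendData(byte[] data)
{
    _channel.Send(data);
}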

Consuming a HTTP stream without reading one byte at a time

I have been trying to read data from the Twitter streaming API using C#. Since the API sometimes returns no data and I am looking for a near-real-time response, I have been hesitant to use a buffer length of more than 1 byte on the reader, in case the stream doesn't return any more data for the next day or two.
I have been using the following line:
input.BeginRead(buffer, 0, buffer.Length, InputReadComplete, null);
//buffer = new byte[1]
Now that I plan to scale the application up, I think a buffer size of 1 will result in a lot of CPU usage, so I want to increase that number, but I still don't want the stream to just block. Is it possible to get the stream to return if no more bytes are read in the next 5 seconds, or something similar?
Async Option
You can use a timer in the async callback method to complete the operation if no bytes are received for e.g. 5 seconds. Reset the timer every time bytes are received. Start it before BeginRead.
Sync Option
Alternatively, you can use the ReceiveTimeout property of the underlying socket to establish a maximum time to wait before completing the read. You can use a larger buffer and set the timeout to e.g. 5 seconds.
According to the MSDN documentation, that property only applies to synchronous reads. You could perform a synchronous read on a separate thread.
UPDATE
Here's rough, untested code pieced together from a similar problem. It will probably not run (or be bug-free) as-is, but should give you the idea:
private EventWaitHandle asyncWait = new ManualResetEvent(false);
private Timer abortTimer = null;
private bool success = false;

public void ReadFromTwitter()
{
    abortTimer = new Timer(AbortTwitter, null, 50000, System.Threading.Timeout.Infinite);
    asyncWait.Reset();
    input.BeginRead(buffer, 0, buffer.Length, InputReadComplete, null);
    asyncWait.WaitOne();
}

void AbortTwitter(object state)
{
    success = false; // Redundant but explicit for clarity
    asyncWait.Set();
}

void InputReadComplete(IAsyncResult result)
{
    // Disable the timer:
    abortTimer.Change(System.Threading.Timeout.Infinite, System.Threading.Timeout.Infinite);
    success = true;
    asyncWait.Set();
}

Implementing a timeout property on Stream Read

I'm getting a stream from HttpWebResponse.GetResponseStream() where I'm reading data from.
Now I want to implement a Timeout property. The easiest way to do it would be stream.ReadTimeout = timeout, but this throws an InvalidOperationException: "Timeouts are not supported on this stream."
Given this, I'm trying to implement the timeout myself but got stuck on a dispose issue. This is what I have so far:
public class MyStream : Stream {
    private readonly Stream _src;

    public override int ReadTimeout { get; set; }

    public MyStream(Stream src, int timeout) {
        ReadTimeout = timeout;
        _src = src;
    }

    public override int Read(byte[] buffer, int offset, int count) {
        var timer = new AutoResetEvent(false);
        int read = 0;
        ThreadPool.QueueUserWorkItem(
            _ => {
                read = _src.Read(buffer, offset, count);
                timer.Set();
            });

        bool completed = timer.WaitOne(ReadTimeout);
        if (completed) {
            return read;
        }
        throw new TimeoutException(string.Format("waited {0} milliseconds", ReadTimeout));
    }

    // ... (other Stream overrides omitted in the question)
}
The problem with this code comes after it throws a TimeoutException, which is handled properly somewhere else: the pending _src.Read(buffer, offset, count) then throws an exception saying that the _src stream was disposed.
Is there a way to cancel the queued ThreadPool work item, or should I use a better approach, and if so, which one?
Thanks
EDIT
As asked by @JotaBe, here's the code where I get the stream from the HttpWebResponse:
_httpRequest = WebRequest.CreateHttp(url);
_httpRequest.AllowReadStreamBuffering = false;
_httpRequest.BeginGetResponse(
    result =>
    {
        try
        {
            _httpResponse = (HttpWebResponse)_httpRequest.EndGetResponse(result);
            stream = _httpResponse.GetResponseStream();
        }
        catch (WebException)
        {
            downloadCompleted.Set();
            Abort();
        }
        finally
        {
            downloadCompleted.Set();
        }
    },
    null);

bool completed = downloadCompleted.WaitOne(15 * 1000);
if (completed)
{
    return new MyStream(stream, 10000);
}
If you're trying to time out when you don't receive an answer from a web server, you're trying to do it in the wrong place.
To get a Response, you usually make a Request:
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
This is the operation that can time out. You have to call it in a different way, using the Begin/End asynchronous pattern.
There is a full example of this in the MSDN documentation for the HttpWebRequest.BeginGetResponse method.
This example uses a callback function. However, there are many different ways to use Begin/End. For example, you can use a WaitHandle available in IAsyncResult like this:
IAsyncResult ar = req.BeginGetResponse(yourCallback, null);
bool completed = ar.AsyncWaitHandle.WaitOne(15000 /* your timeout in milliseconds */);
This will wait 15 seconds. If the response arrives before this, completed will be true. If not, completed will be false. You can then use the HttpWebRequest.Abort() method to abort the request.
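For example (a minimal sketch; req is the HttpWebRequest from the snippet above):
if (!completed)
{
    // Aborting causes EndGetResponse to throw a WebException (RequestCanceled) in your callback.
    req.Abort();
}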
The Begin/End pattern takes charge of managing the necessary threads.
I ended up using Nomad101's suggestion and surrounding the read with a try/catch.
