WebException (timeout) When Reading Files From Multiple Threads After Setting ThreadPool.MaxThreads - c#

So, I tracked down the issue, but I don't understand the root cause and I'm curious.
I have multiple threads reading files (sometimes the same file, but usually different files. This doesn't seem to matter) from a local drive. This is the test setup, but in production these files are retrieved from a web server.
Anyway, I noticed that, after calling ThreadPool.SetMaxThreads(), I was receiving timeouts reading these files. Removing that line makes the problem go away. My hunch is that it has to do with setting the number of asynchronous IO threads (completionPortThreads, the second argument), but even when I set that value to a large number (50, 100, ...), the issue remains.
Removing the call to SetMaxThreads "fixes" the issue, though it means I can't increase or decrease the number of threads for testing purposes.
Here is a block of code which reproduces the issue. The file size doesn't matter as my test files range anywhere from 2KB to 3MB.
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Threading;

class Program
{
    static void Main(string[] args)
    {
        _count = 15;
        // Comment this line out and everything works
        ThreadPool.SetMaxThreads(13, 50);
        using (var mre = new ManualResetEvent(false))
        {
            for (int i = 0; i < _count; ++i)
            {
                ThreadPool.QueueUserWorkItem(ThreadFunc, mre);
            }
            mre.WaitOne();
        }
    }

    private static readonly ConcurrentStack<byte[]> _files = new ConcurrentStack<byte[]>();
    private static int _count;

    private static void ThreadFunc(object o)
    {
        const string path = @"SomeLocalFile";
        var file = ReadFile(path);
        _files.Push(file);
        if (Interlocked.Decrement(ref _count) == 0)
        {
            ((ManualResetEvent)o).Set();
        }
    }

    private static byte[] ReadFile(string uri)
    {
        var request = WebRequest.Create(uri);
        using (var response = request.GetResponse())
        using (var stream = response.GetResponseStream())
        {
            var ret = new byte[stream.Length];
            stream.Read(ret, 0, ret.Length);
            return ret;
        }
    }
}
So, yeah, not sure what's going on here. Even with a large value for IO threads I timeout on each test. I'm certainly missing something.

FileWebRequest, which is the type WebRequest.Create() returns for a local file path, also uses ThreadPool.QueueUserWorkItem. Since you limit the worker threads, the work queued by FileWebRequest never gets executed. You need to set the maximum number of worker threads to at least _count + 1, so that there is at least one free thread that can process the work queued by FileWebRequest.
FileWebRequest.GetRequestStream does the following:
ThreadPool.QueueUserWorkItem(read file)
Wait until the file is read or timeout is reached
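To see why this starves the pool, here is a minimal sketch (a hypothetical stand-in for illustration, not FileWebRequest's actual internals): each queued work item queues a second work item and then blocks waiting for it, so once every worker thread is occupied by an outer item, the inner items can never run and the waits time out.
using System;
using System.Threading;

class StarvationDemo
{
    static void Main()
    {
        // Cap the pool at the processor count (values below the pool's minimum are rejected),
        // then queue exactly that many blocking outer items so every worker thread is busy.
        int maxWorkers = Environment.ProcessorCount;
        ThreadPool.SetMaxThreads(maxWorkers, maxWorkers);
        for (int i = 0; i < maxWorkers; ++i)
        {
            ThreadPool.QueueUserWorkItem(_ =>
            {
                var done = new ManualResetEvent(false);
                // The inner item cannot run while every worker thread is blocked here.
                ThreadPool.QueueUserWorkItem(__ => done.Set());
                Console.WriteLine(done.WaitOne(TimeSpan.FromSeconds(2))
                    ? "inner work item ran"
                    : "timed out waiting for inner work item");
            });
        }
        Console.ReadLine(); // keep the process alive long enough to see the output
    }
}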
Better Solution:
Do not enqueue items to the ThreadPool. Use WebRequest.GetResponseAsync instead.
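A minimal sketch of ReadFile rewritten along those lines (assuming the caller is adapted to await it, e.g. with Task.WhenAll instead of the ManualResetEvent; paths below is a hypothetical collection of file paths):
// Requires: using System.IO; using System.Linq; using System.Net; using System.Threading.Tasks;
private static async Task<byte[]> ReadFileAsync(string uri)
{
    var request = WebRequest.Create(uri);
    using (var response = await request.GetResponseAsync())
    using (var stream = response.GetResponseStream())
    using (var buffer = new MemoryStream())
    {
        // CopyToAsync also avoids relying on a single Read call returning the whole file.
        await stream.CopyToAsync(buffer);
        return buffer.ToArray();
    }
}

// Usage: read all files concurrently without blocking ThreadPool worker threads.
// byte[][] files = await Task.WhenAll(paths.Select(ReadFileAsync));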

Related

DataWriter.DetachStream() throws 'System.Runtime.InteropServices.COMException'

I'm creating a UWP program for Raspberry Pi. One of the functions of the program is to send and receive some data from an Arduino.
The problem is that when I try sending data to the Arduino rapidly and many times, I end up with a System.Runtime.InteropServices.COMException ("The operation identifier is not valid.") originating from DataWriter.DetachStream().
Sending the data rapidly works just fine, up until a certain amount it seems, where I get the exception thrown.
With "rapid", I mean using an auto clicker to click a button to send data each millisecond.
I've not tried sending data slowly many times in a row to reproduce the issue, as this would probably take a long time (seeing that it takes about 10-20 seconds with a 1 ms delay between transmissions).
I've been searching for a solution to this problem for way too many hours, but I can't seem to find any related questions/solutions.
public sealed partial class LightControl : Page
{
    int Alpha;
    int Red;
    int Green;
    int Blue;

    // This is the handler for the button to send data
    private void LightButton_Click(object sender, RoutedEventArgs e)
    {
        if (!(sender is Button button) || button.Tag == null) return;
        string tag = button.Tag.ToString();
        Alpha = int.Parse(tag.Substring(0, 2), System.Globalization.NumberStyles.HexNumber);
        Red = int.Parse(tag.Substring(2, 2), System.Globalization.NumberStyles.HexNumber);
        Green = int.Parse(tag.Substring(4, 2), System.Globalization.NumberStyles.HexNumber);
        Blue = int.Parse(tag.Substring(6, 2), System.Globalization.NumberStyles.HexNumber);
        SendLightData();
    }

    public async void SendLightData()
    {
        await ArduinoHandler.Current.WriteAsync(ArduinoHandler.DataEnum.LightArduino,
            ArduinoHandler.DataEnum.Light, Convert.ToByte(LightConstants.LightCommand.LightCommand),
            Convert.ToByte(Red), Convert.ToByte(Green), Convert.ToByte(Blue), Convert.ToByte(Alpha),
            WriteCancellationTokenSource.Token);
    }
}
public class ArduinoHandler
{
    // Code for singleton behaviour. Included for completeness
    #region Singleton behaviour
    private static ArduinoHandler arduinoHandler;
    private static Object singletonCreationLock = new Object();

    public static ArduinoHandler Current
    {
        get
        {
            if (arduinoHandler == null)
            {
                lock (singletonCreationLock)
                {
                    if (arduinoHandler == null)
                    {
                        CreateNewArduinoHandler();
                    }
                }
            }
            return arduinoHandler;
        }
    }

    public static void CreateNewArduinoHandler()
    {
        arduinoHandler = new ArduinoHandler();
    }
    #endregion

    private DataWriter dataWriter;
    private Object WriteCancelLock = new Object();

    public async Task WriteAsync(DataEnum receiver, DataEnum sender,
        byte commandByte1, byte dataByte1, byte dataByte2, byte dataByte3,
        byte dataByte4, CancellationToken cancellationToken)
    {
        try
        {
            dataWriter = new DataWriter(arduinos[receiver].OutputStream);
            byte[] buffer;
            Task<uint> storeAsyncTask;
            lock (WriteCancelLock)
            {
                buffer = new byte[8];
                buffer[0] = Convert.ToByte(receiver);
                buffer[1] = Convert.ToByte(sender);
                buffer[2] = commandByte1;
                buffer[3] = dataByte1;
                buffer[4] = dataByte2;
                buffer[5] = dataByte3;
                buffer[6] = dataByte4;
                buffer[7] = Convert.ToByte('\n');
                cancellationToken.ThrowIfCancellationRequested();
                dataWriter.WriteBytes(buffer);
                storeAsyncTask = dataWriter.StoreAsync().AsTask(cancellationToken);
            }
            uint bytesWritten = await storeAsyncTask;
            Debug.Write("\nSent: " + BitConverter.ToString(buffer) + "\n");
        }
        catch (Exception e)
        {
            Debug.Write(e.Message);
        }
        finally
        {
            dataWriter.DetachStream(); // <--- I've located the exception to originate from here, using the debugger in Visual Studio
            dataWriter.Dispose();
        }
    }

    public enum DataEnum
    {
        Light = 0x01,
        Piston = 0x02,
        PC = 0x03,
        LightArduino = 0x04
    }
}
I would expect the Raspberry Pi to send the data to the Arduino, but after a while with rapid data transmission, the exception is thrown.
Update
I tried using a local variable for the dataWriter as suggested below, but this causes strange behavior after a while of rapid data transmission, as if it slows down. It is worth noting that I no longer get an exception.
It's quite hard to explain how it behaves, but the Debug.Write logs the message I'm sending (which works fine). However, after a while it seems to "slow down", and even after I stop clicking, the data keeps being sent about once per second. It works completely fine up until that point, so I'm wondering if there is some limit I'm hitting?
Update 2
I seem to have found a rather "hacky" and weird solution to the problem.
If I use Serial.write() on the Arduino to send the data back to the Raspberry Pi, it seems to have fixed the issue somehow.
If anyone knows how this worked, I'd be very interested to know :)
const int payloadSize = 8;
byte payload[payloadSize];
int numBytes;

// Called each time serial data is available
void serialEvent()
{
    numBytes = Serial.available();
    if (numBytes == payloadSize)
    {
        for (int i = 0; i < payloadSize; i++)
        {
            payload[i] = Serial.read();
            Serial.write(payload[i]); // <--- This line fixed the issue for whatever reason
        }
    }
    checkData(); // Function to do something with the data
    for (int i = 0; i < payloadSize; i++)
    {
        payload[i] = 0; // clear the payload buffer
    }
    numBytes = 0;
}
Your problem originates from the fact that you are using a fire-and-forget approach to calling an async method. When you call SendLightData() in quick succession, it doesn't wait for the previous WriteAsync operation to complete.
Once execution reaches the first actual await expression - the await storeAsyncTask line - the UI thread is freed up to handle another button click.
This new button click can start executing and overwrite the dataWriter field on the same instance of ArduinoHandler. When the first storeAsyncTask finishes, it will actually detach the dataWriter of the second call, not its own. This can lead to several different kinds of issues and race conditions.
So you must make sure that it is not possible to click the button before the previous operation actually executes. You could use a boolean flag for that as a simple solution.
private bool _isWorking = false;

public async void SendLightData()
{
    if (!_isWorking)
    {
        try
        {
            _isWorking = true;
            await ArduinoHandler.Current.WriteAsync(ArduinoHandler.DataEnum.LightArduino,
                ArduinoHandler.DataEnum.Light, Convert.ToByte(LightConstants.LightCommand.LightCommand),
                Convert.ToByte(Red), Convert.ToByte(Green), Convert.ToByte(Blue), Convert.ToByte(Alpha),
                WriteCancellationTokenSource.Token);
        }
        finally
        {
            _isWorking = false;
        }
    }
}
This will ensure that two operations never execute simultaneously.
Another solution would be not to store the data writer in a field but to keep it as a local variable. When you avoid all shared state between the calls, you know there can be no race condition stemming from one call overwriting another call's writer, as sketched below.
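A minimal sketch of that second option, reusing the original parameters and the arduinos lookup from the question (the lock and the catch block are omitted here for brevity):
public async Task WriteAsync(DataEnum receiver, DataEnum sender,
    byte commandByte1, byte dataByte1, byte dataByte2, byte dataByte3,
    byte dataByte4, CancellationToken cancellationToken)
{
    // Local variable: each call owns its own writer, so no other call can detach or overwrite it.
    var writer = new DataWriter(arduinos[receiver].OutputStream);
    try
    {
        var buffer = new byte[8]
        {
            Convert.ToByte(receiver), Convert.ToByte(sender), commandByte1,
            dataByte1, dataByte2, dataByte3, dataByte4, Convert.ToByte('\n')
        };
        cancellationToken.ThrowIfCancellationRequested();
        writer.WriteBytes(buffer);
        uint bytesWritten = await writer.StoreAsync().AsTask(cancellationToken);
        Debug.Write("\nSent: " + BitConverter.ToString(buffer) + "\n");
    }
    finally
    {
        writer.DetachStream();
        writer.Dispose();
    }
}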

Upload file from 32Bit Machine fails to IIS 7

I am developing a Windows application that uploads a file to a webserver, IIS. My code is working just fine when I run the app on a 64Bit Machine. Upload to the IIS is working. But when I run it on a 32Bit machine, the upload is not working.
I think it has something to do with IIS, but I don't know what it could be. Has anyone experienced the same issue?
UPDATE: This has nothing to do with the server side. I tested several endpoints, but nothing worked.
This must be related to my upload code. This code is working from 64Bit Apps but not on 32Bit:
try
{
    System.Net.Http.HttpClient hc = new System.Net.Http.HttpClient();
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept", "text/html,application/xhtml+xml,application/xml");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 6.2; WOW64; rv:19.0) Gecko/20100101 Firefox/19.0");
    hc.DefaultRequestHeaders.TryAddWithoutValidation("Accept-Charset", "ISO-8859-1");
    using (VirtualStream ms = new VirtualStream() { Size = UploadSize })
    {
        StreamContent content = new StreamContent(ms, BufferSize);
        // time for the calculation of the total average throughput
        var overallStart = DateTime.Now;
        var start = DateTime.Now;
        var responseTask = hc.PostAsync(URL, content);
        while (!responseTask.IsCompleted)
        {
            // Check the Exit and Abort Constraints
            if ((DateTime.Now - overallStart).TotalMilliseconds > MaxTestLength || _cancelationRequested)
            {
                System.Diagnostics.Debug.WriteLine("Bytes sent " + bytesSent);
                hc.CancelPendingRequests();
                IsRunning = false;
                return;
            }
            try
            {
                bytesSent = ms.Position - bytesOfCalibrationPhase;
            }
            catch (Exception)
            {
                // The upload is an async process which disposes the underlying stream when the upload finishes.
                // In some cases this can lead to ObjectDisposedExceptions when accessing the current stream position.
                // If that happens, the upload has finished...
                break;
            }
        }
        // ... (rest of the method omitted)
    }
}
catch (Exception) { /* ... */ }
bytesSent is always 0 on 32-bit machines... Why is that?
It seems to me that the loop you have is there for three reasons:
1) You want to cancel the upload if cancellation is requested
2) You want to cancel the upload if there is a timeout
3) You want to know the progress of the upload
I suggest that you remove the loop completely and achieve these three goals in a different way. Here is how you can do this:
For (1), use the other overload of the PostAsync method that has a CancellationToken parameter. This allows you to provide a token that you can use from somewhere else to cancel the upload operation.
For (2), you can use the CancellationTokenSource (that you used to create the CancellationToken), to request that the upload operation be canceled after some time (if the task was not already completed). See the CancelAfter method.
Here is some code sample for (1) and (2):
Put the following two lines in some place (probably as fields) so that these variables are available from both your upload code and the code that might wish to cancel the upload:
CancellationTokenSource cancellation_token_source = new CancellationTokenSource();
CancellationToken cancellation_token = cancellation_token_source.Token;
The following line of code will setup automatic cancellation after 10 seconds:
cancellation_token_source.CancelAfter(TimeSpan.FromSeconds(10));
In the following line, we pass the cancellation_token to the PostAsync method:
var responseTask = hc.PostAsync(URL, content, cancellation_token);
I noticed you were waiting for responseTask.IsCompleted to become true. This means that you don't want your method to return until the upload is complete. In this case, use the following to wait for the upload to complete.
responseTask.Wait();
In case you want to convert your method to become asynchronous, mark your method as async, and instead use the following:
await responseTask;
You can use the following to cancel the upload:
cancellation_token_source.Cancel();
For (3), first look at the answers in this question.
If this does not work for you, I have the following suggestion:
You can create a decorator for the Stream class, that informs you when the stream was read, like this (please note the Read method):
public class ReadNotifierStreamWrapper : Stream
{
    private readonly Stream m_Stream;
    private readonly Action<int> m_ReadNotifier;

    public ReadNotifierStreamWrapper(Stream stream, Action<int> read_notifier)
    {
        m_Stream = stream;
        m_ReadNotifier = read_notifier;
    }

    public override void Flush()
    {
        m_Stream.Flush();
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        return m_Stream.Seek(offset, origin);
    }

    public override void SetLength(long value)
    {
        m_Stream.SetLength(value);
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        var bytes_read = m_Stream.Read(buffer, offset, count);
        m_ReadNotifier(bytes_read);
        return bytes_read;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        m_Stream.Write(buffer, offset, count);
    }

    public override bool CanRead
    {
        get { return m_Stream.CanRead; }
    }

    public override bool CanSeek
    {
        get { return m_Stream.CanSeek; }
    }

    public override bool CanWrite
    {
        get { return m_Stream.CanWrite; }
    }

    public override long Length
    {
        get { return m_Stream.Length; }
    }

    public override long Position
    {
        get { return m_Stream.Position; }
        set { m_Stream.Position = value; }
    }
}
And then you can use it to wrap your ms stream like this:
int total_bytes = 0;
var stream_wrapper = new ReadNotifierStreamWrapper(ms, bytes =>
{
    total_bytes += bytes;
    Debug.WriteLine("Bytes sent " + total_bytes);
});
HttpContent content = new StreamContent(stream_wrapper); // Here we are creating the StreamContent from stream_wrapper instead of ms
This way you will get a notification whenever the stream is read, and you will know how many bytes were read.
Please note that you might need to do more work on the ReadNotifierStreamWrapper class to make it more robust. For example, HttpClient might decide for some reason to seek the stream via the Seek method, and you might need to take that into account. I don't think HttpClient will do this, though, since it needs to read the whole file to upload it; it makes no sense to skip parts of the file. You can put a breakpoint on the Seek method of ReadNotifierStreamWrapper to see if it gets called.
The exception you get:
EventSourceException: No Free Buffers available from the operating system
is caused by Debug.WriteLine, which uses ETW (Event Tracing for Windows). ETW is a Windows kernel-level tracing facility that relies on a number of buffers to cache data before writing it to disk. ETW uses the buffer size and the size of physical memory to calculate the maximum number of buffers allocated for the event tracing session's buffer pool. So, if your application is 64-bit, ETW will allocate more buffers because more addressable memory is available.
The reason you are hitting the limit is that you are writing debugging events faster than ETW can process the buffers.
I believe it is possible to increase the maximum number of buffers using registry entries, but this will increase memory consumption and you have the potential to hit the limit again with bigger files, slower connection, etc.
So your best bet is to either sleep the thread on each loop to write tracing events at a lower rate or replace the Debug.WriteLine with some other form of logging.
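For the first option, a minimal sketch (reusing responseTask and bytesSent from the question's loop; the one-second interval is an arbitrary choice) could look like this:
var progressTimer = System.Diagnostics.Stopwatch.StartNew();
while (!responseTask.IsCompleted)
{
    // ... existing exit/abort checks and bytesSent calculation ...
    if (progressTimer.ElapsedMilliseconds >= 1000)
    {
        // At most one trace line per second instead of one per loop iteration,
        // so the ETW buffers are not flooded.
        System.Diagnostics.Debug.WriteLine("Bytes sent " + bytesSent);
        progressTimer.Restart();
    }
    System.Threading.Thread.Sleep(100); // also yields the CPU instead of spinning
}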

Changes to Buffer While BeginWrite is Called

I'm curious whether making changes to the byte[] before BeginWrite has actually finished writing will influence what is finally written by the FileStream.
I have the code below, with currentPage being a byte[] holding the data I want to write.
try
{
    FileStream.BeginWrite(currentPage, 0, currentPage.Length, new AsyncCallback(EndWriteCallback),
        new State(logFile.fs, currentPage, BUFFER_SIZE, manualEvent));
    manualEvent.WaitOne();
}
catch (Exception e)
{
    //handle exception here
}
I have this within a loop that will replace the data in currentPage. What will happen if I make changes to currentPage (like assigning a new byte[] with all 0's in it)? Does FileStream buffer the byte[] to be written somewhere, or does it actually just reference the byte[] I passed in when I call it?
I tried looking at the MSDN article but all I could find was
Multiple simultaneous asynchronous requests render the request completion order uncertain.
Could someone please explain this to me?
This code should answer your questions. First I create a long byte array in which every cell is equal to 255. Then I start two threads. The first one is responsible for writing the prepared byte array to a file. At the same time, the second thread modifies this array, starting from the last cell, by setting every cell to 0.
The exact results of executing this code will depend on the machine, current CPU usage, etc. On my computer, one run produced a file in which about 77% of the bytes were 255s and the rest 0s; the next run it was about 70%. This confirms that the input array is not locked against writes by the BeginWrite method.
In order to observe this effect, try running the program a few times. It might also be necessary to use a longer array.
var path = @"C:\Temp\temp.txt";
var list = new List<byte>();
for (var i = 0; i < 1000000; ++i)
    list.Add(255);
var buffer = list.ToArray();

var t1 = Task.Factory.StartNew(() =>
{
    using (var fs = File.OpenWrite(path))
    {
        var res = fs.BeginWrite(buffer, 0, buffer.Length, null, null);
        res.AsyncWaitHandle.WaitOne();
        fs.EndWrite(res); // complete the asynchronous write
    }
});
var t2 = Task.Factory.StartNew(() =>
{
    for (var i = buffer.Length - 1; i > 0; --i)
        buffer[i] = 0;
});
Task.WaitAll(t1, t2);
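So if you need the contents of currentPage to be stable for the duration of the write, one option (a sketch under that assumption, reusing the identifiers from the question, not something from the original code) is to hand BeginWrite its own copy of the buffer:
// Copy the page before starting the asynchronous write, so later changes to
// currentPage cannot affect what ends up in the file.
var snapshot = new byte[currentPage.Length];
Buffer.BlockCopy(currentPage, 0, snapshot, 0, currentPage.Length);
logFile.fs.BeginWrite(snapshot, 0, snapshot.Length, new AsyncCallback(EndWriteCallback),
    new State(logFile.fs, snapshot, BUFFER_SIZE, manualEvent));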

Consuming a HTTP stream without reading one byte at a time

I have been trying to read data from the Twitter stream API using C#, and since sometimes the API will return no data, and I am looking for a near-realtime response, I have been hesitant to use a buffer length of more than 1 byte on the reader in case the stream doesn't return any more data for the next day or two.
I have been using the following line:
input.BeginRead(buffer, 0, buffer.Length, InputReadComplete, null);
//buffer = new byte[1]
Now that I plan to scale the application up, I think a size of 1 will result in a lot of CPU usage, and want to increase that number, but I still don't want the stream to just block. Is it possible to get the stream to return if no more bytes are read in the next 5 seconds or something similar?
Async Option
You can use a timer in the async callback method to complete the operation if no bytes are received for e.g. 5 seconds. Reset the timer every time bytes are received. Start it before BeginRead.
Sync Option
Alternatively, you can use the ReceiveTimeout property of the underlying socket to establish a maximum time to wait before completing the read. You can use a larger buffer and set the timeout to e.g. 5 seconds.
According to the MSDN documentation, that property only applies to synchronous reads. You could perform the synchronous read on a separate thread.
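Here is a rough sketch of the sync option (an illustration with stated assumptions: it uses HttpWebRequest.ReadWriteTimeout to surface the receive timeout on the response stream, a placeholder URL, and a hypothetical ProcessChunk handler):
// Requires: using System.IO; using System.Net;
// Run this on a dedicated thread so the blocking Read doesn't stall anything else.
var request = (HttpWebRequest)WebRequest.Create("https://stream.twitter.com/..."); // placeholder URL
request.ReadWriteTimeout = 5000; // a Read that sees no data for 5 seconds throws instead of blocking forever

using (var response = request.GetResponse())
using (var stream = response.GetResponseStream())
{
    var buffer = new byte[8192];
    while (true)
    {
        int read;
        try
        {
            // Read returns as soon as *some* data arrives; it does not wait for a full buffer.
            read = stream.Read(buffer, 0, buffer.Length);
        }
        catch (IOException)
        {
            continue; // timeout with no data: decide here whether to retry or give up
        }
        catch (WebException)
        {
            continue; // some stream implementations surface the timeout this way instead
        }
        if (read == 0) break;       // the server closed the stream
        ProcessChunk(buffer, read); // hypothetical handler for the received bytes
    }
}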
UPDATE
Here's rough, untested code pieced together from a similar problem. It will probably not run (or be bug-free) as-is, but should give you the idea:
private EventWaitHandle asyncWait = new ManualResetEvent(false);
private Timer abortTimer = null;
private bool success = false;

public void ReadFromTwitter()
{
    abortTimer = new Timer(AbortTwitter, null, 50000, System.Threading.Timeout.Infinite);
    asyncWait.Reset();
    input.BeginRead(buffer, 0, buffer.Length, InputReadComplete, null);
    asyncWait.WaitOne();
}

void AbortTwitter(object state)
{
    success = false; // Redundant but explicit for clarity
    asyncWait.Set();
}

void InputReadComplete(IAsyncResult ar)
{
    // (call input.EndRead(ar) here to get the number of bytes read)
    // Disable the timer:
    abortTimer.Change(System.Threading.Timeout.Infinite, System.Threading.Timeout.Infinite);
    success = true;
    asyncWait.Set();
}

Maximum Thread Number

I have a user control which has a method like this:
public void DownloadFileAsync()
{
    ThreadStart thread = DownloadFile;
    Thread downloadThread = new Thread(thread);
    downloadThread.Start();
}
In the form I have four of these user controls. But when I call DownloadFileAsync() on each control, only two of them begin to download; after one of them finishes, the next one starts downloading.
What is the problem, and how can I make every download run simultaneously?
Thank you for your attention.
public void DownloadFile()
{
    int byteRecieved = 0;
    byte[] bytes = new byte[_bufferSize];
    try
    {
        _webRequestDownloadFile = (HttpWebRequest)WebRequest.Create(_file.AddressURL);
        _webRequestDownloadFile.AddRange((int)_totalRecievedBytes);
        _webResponseDownloadFile = (HttpWebResponse)_webRequestDownloadFile.GetResponse();
        _fileSize = _webResponseDownloadFile.ContentLength;
        _streamFile = _webResponseDownloadFile.GetResponseStream();
        _streamLocalFile = new FileStream(_file.LocalDirectory, _totalRecievedBytes > 0 ? FileMode.Append : FileMode.Create);
        MessageBox.Show("Salam");
        _status = DownloadStatus.Inprogress;
        while ((byteRecieved = _streamFile.Read(bytes, 0, _bufferSize)) > 0 && _status == DownloadStatus.Inprogress)
        {
            _streamLocalFile.Write(bytes, 0, byteRecieved);
            _totalRecievedBytes += byteRecieved;
            if (_totalRecievedBytes >= _fileSize)
            {
                argsCompleted.Status = DownloadStatus.Completed;
                if (_fileDownloadCompleted != null)
                    _fileDownloadCompleted(_file, argsCompleted);
                break;
            }
            argsProgress.UpdateEventArgument(DownloadStatus.Inprogress, _totalRecievedBytes);
            if (_fileDownloadProgress != null)
                _fileDownloadProgress(_file, argsProgress);
        }
    }
    catch (Exception ex)
    {
        LogOperations.Log(ex.Message);
    }
    finally
    {
        _streamFile.Close();
        _streamLocalFile.Close();
        _webResponseDownloadFile.Close();
    }
}
HTTP had a limit of 2 connections per web origin for a very long time, so that clients wouldn't open too many downloads in parallel. The hard limit has since been removed from the specification, but many implementations, including HttpWebRequest, still enforce it by default.
From draft-ietf-httpbis-p1-messaging:
Clients (including proxies) SHOULD limit the number of simultaneous connections that they maintain to a given server (including proxies). Previous revisions of HTTP gave a specific number of connections as a ceiling, but this was found to be impractical for many applications. As a result, this specification does not mandate a particular maximum number of connections, but instead encourages clients to be conservative when opening multiple connections.
You can change the connection limit by setting the ConnectionLimit Property as follows:
HttpWebRequest httpWebRequest = (HttpWebRequest)webRequest;
httpWebRequest.ServicePoint.ConnectionLimit = 10;
Don't set the limit too high, so you don't overload the server.
Also, instead of using threads, you should consider using the WebClient class and the asynchronous methods it provides (such as the DownloadDataAsync method). See How can I programmatically remove the 2 connection limit in WebClient for how to change the connection limit in that case; a sketch follows below.
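A minimal sketch of the WebClient approach (the URLs and file names are placeholders; raising the limit via ServicePointManager.DefaultConnectionLimit is one common way to do it):
using System;
using System.Net;

class Downloads
{
    static void Main()
    {
        // Allow more than the default 2 concurrent connections per host.
        ServicePointManager.DefaultConnectionLimit = 10;

        var urls = new[] { "http://example.com/a.zip", "http://example.com/b.zip" }; // placeholders
        foreach (var url in urls)
        {
            var client = new WebClient();
            client.DownloadProgressChanged += (s, e) =>
                Console.WriteLine(url + ": " + e.ProgressPercentage + "%");
            client.DownloadFileCompleted += (s, e) =>
                Console.WriteLine(url + (e.Error == null ? " done" : " failed: " + e.Error.Message));
            // Asynchronous: all downloads run concurrently without dedicated threads.
            client.DownloadFileAsync(new Uri(url), System.IO.Path.GetFileName(url));
        }
        Console.ReadLine(); // keep the console app alive while the downloads run
    }
}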
