Big problem with serverSocket Send stacking messages - C#

OK, so here is my problem: basically I made a program and implemented a timer in it. If I run the "client" that sends data to the server while the timer is active, the messages stack up when I send them very fast.
Example: Instead of:
"Hello"
I get:
"HelloHelloHelloHelloHelloHelloHelloHelloHello" stacked up if i want to send them very very fast... 1 milisecond lets say.
Is there a way to fix it without neceserally splitting them when the server gets the info? I`ll not give the recieve code as that works.. here is the problem:
private void sends()
{
    if (serverSocket.Connected)
    {
        send = textBox2.Text;
        byte[] buffer = Encoding.ASCII.GetBytes(send);
        serverSocket.Send(buffer);
    }
}

private int s = 0;
private string send = string.Empty;

void Sendata(Socket socket, string stf)
{
    byte[] data = Encoding.ASCII.GetBytes(stf);
    socket.BeginSend(data, 0, data.Length, SocketFlags.None, new AsyncCallback(SendCallback), socket);
}

private void SendCallback(IAsyncResult AR)
{
    Socket socket = (Socket)AR.AsyncState;
    socket.EndSend(AR);
}

private void button3_Click(object sender, EventArgs e)
{
    // timer1.Start();
    //Thread send = new Thread(new ThreadStart(sends));
    // send.Start();
    for (int i = 0; i < 50000; i++)
    {
        if (serverSocket.Connected)
        {
            send = textBox2.Text;
            Sendata(serverSocket, send);
        }
    }
}

You need a way to delineate messages. Maybe send a newline after each message, unless you need to be able to send newlines as part of the message. If that's the case, you could prefix each message with its length. The receiving end would then first parse the length and wait for that many bytes before considering the message complete. This way, multiple messages could be sent in one big push and the receiving end would just split them apart by length.
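For example, a minimal length-prefix sketch could look like this (the helper names are illustrative, not from the question's code):

using System;
using System.Net.Sockets;
using System.Text;

static class Framing
{
    // Sender: write a 4-byte length header, then the payload.
    public static void SendMessage(Socket socket, string text)
    {
        byte[] payload = Encoding.ASCII.GetBytes(text);
        socket.Send(BitConverter.GetBytes(payload.Length));
        socket.Send(payload);
    }

    // Receiver: read exactly 4 header bytes, then exactly that many payload bytes,
    // so back-to-back sends can never run together.
    public static string ReceiveMessage(Socket socket)
    {
        int length = BitConverter.ToInt32(ReceiveExactly(socket, 4), 0);
        return Encoding.ASCII.GetString(ReceiveExactly(socket, length));
    }

    private static byte[] ReceiveExactly(Socket socket, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
            if (read == 0)
                throw new SocketException((int)SocketError.ConnectionReset);
            offset += read;
        }
        return buffer;
    }
}

Both sides just have to agree on the header format (and on byte order if the endpoints differ).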


SslStream EndRead gets first 1 Byte

I've written a TcpClient and Server which are communicating via an SslStream.
The communication works, but when I send a message from the Client to the Server, the Server first reads 1 byte, and then the rest in the next step. Example: I want to send "test" from the Client, and the Server receives first "t", and then "est".
Here is the code for the Client to send
public void Send(string text) {
    byte[] message = Encoding.UTF8.GetBytes(text);
    SecureStream.BeginWrite(message, 0, message.Length, new AsyncCallback(WriteCallback), null);
}

private void WriteCallback(IAsyncResult AR) {
}
And here is the code the Server uses to read:
private SslStream CryptedStream = ...;
private byte[] buffer = new byte[1024];

public void BeginReadCallback(IAsyncResult AsyncCall) {
    // initialize variables
    int bytesRead = 0;
    try {
        // retrieve packet
        bytesRead = CryptedStream.EndRead(AsyncCall);
        // check if client has disconnected
        if (bytesRead > 0) {
            // copy buffer to a temporary one
            var temporaryBuffer = buffer;
            Array.Resize(ref temporaryBuffer, bytesRead);
            string read = Encoding.ASCII.GetString(temporaryBuffer);
            SetText(read);
            // read more data
            CryptedStream.BeginRead(buffer, 0, 1024, new AsyncCallback(BeginReadCallback), null);
            // client is still connected, read data from buffer
            //ProcessPacket(temporaryBuffer, temporaryBuffer.Length, helper);
        } else {
            // client disconnected, do everything to disconnect the client
            //DisconnectClient(helper);
        }
    } catch (Exception e) {
        // encountered an error, closing connection
        // Program.log.Add(e.ToString(), Logger.LogLevel.Error);
        // DisconnectClient(helper);
    }
}
Did I miss something?
Thanks for your help.
As Lasse explained, streaming APIs do not promise to return a specific number of bytes per read.
The best fix for this is to not use raw sockets. Use a higher-level API such as WCF, SignalR, HTTP, ...
If you insist, you probably should use BinaryReader/BinaryWriter to send your data. That makes it quite easy; for example, string sending is built in. You can also length-prefix manually quite easily with those classes.
You probably don't need async IO and should not use it. If you insist, you can at least get rid of the callbacks by using await.
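A rough sketch of the BinaryReader/BinaryWriter route (the method names are made up; it works over any Stream, including your SslStream):

using System.IO;
using System.Text;

static class TextProtocol
{
    // BinaryWriter.Write(string) length-prefixes the string for you,
    // so the reader knows exactly how many bytes belong to it.
    public static void SendText(Stream stream, string text)
    {
        var writer = new BinaryWriter(stream, Encoding.UTF8, leaveOpen: true);
        writer.Write(text);
        writer.Flush();
    }

    // BinaryReader.ReadString() reads the length prefix and then keeps reading
    // until that many bytes have arrived, however the stream chops them up.
    public static string ReceiveText(Stream stream)
    {
        var reader = new BinaryReader(stream, Encoding.UTF8, leaveOpen: true);
        return reader.ReadString();
    }
}

With that in place, the "t" / "est" split no longer matters: ReadString keeps reading until the whole string is in.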

Why does my server app receive only 50% of the client's messages?

Setup:
The client opens a socket to send data to the server:
private void Form1_Load(object sender, EventArgs e)
{
    client = new TcpClient();
    client.BeginConnect("127.0.0.1", 995, new AsyncCallback(ConnectCallback), client);
}

private void ConnectCallback(IAsyncResult _result) // it will send hello message from client
{
    string data;
    byte[] remdata = { };
    IAsyncResult inResult = _result;
    currentProcess = Process.GetCurrentProcess();
    string currentProcessAsText = currentProcess.Id.ToString();
    SetControlPropertyThreadSafe(proccessIdLabel, "Text", currentProcessAsText);
    try
    {
        sock = client.Client;
        // send hello message
        data = "Client with proccess id " + currentProcessAsText + " is connecting";
        sock.Send(Encoding.ASCII.GetBytes(data));
        SetControlPropertyThreadSafe(answersTextBox, "Text", answersTextBox.Text + "\n" + GetCurrentTime() + " Connection established");
    }
    catch
    {
        SetControlPropertyThreadSafe(answersTextBox, "Text", answersTextBox.Text + "\n" + GetCurrentTime() + " Can't connect");
    }
}
After that I have a handler for a button click (to send messages):
private void SendButton_Click(object sender, EventArgs e)
{
    try
    {
        string data;
        sock = client.Client;
        data = "Some text";
        sock.Send(Encoding.ASCII.GetBytes(data));
    }
    catch
    {
        SetControlPropertyThreadSafe(answersTextBox, "Text", "Can't connect");
    }
}
There is also a handler for the form close that sends the server an exit command, so it will stop the thread for this client:
private void Form1_FormClosing(object sender, FormClosingEventArgs e)
{
    try
    {
        sock.Send(Encoding.ASCII.GetBytes("exit"));
        sock.Close();
    }
    catch
    {
    }
}
The server listens on a port and handles messages:
private void Listeners()
{
    Socket socketForClient = Listener.AcceptSocket();
    string data;
    int i = 0;
    if (socketForClient.Connected)
    {
        string remoteHost = socketForClient.RemoteEndPoint.ToString();
        Console.WriteLine(Message("Client:" + remoteHost + " now connected to server."));
        while (true)
        {
            // data buffer
            byte[] buf = new byte[1024];
            try
            {
                int messageLength = socketForClient.Receive(buf);
                if (messageLength > 0)
                {
                    byte[] cldata = new byte[messageLength];
                    socketForClient.Receive(cldata);
                    data = "";
                    data = Encoding.ASCII.GetString(cldata).Trim();
                    if (data.Contains("exit"))
                    {
                        socketForClient.Close();
                        string message = Message("Client:" + remoteHost + " is disconnected from the server (client wish).");
                        Console.WriteLine(message);
                        return;
                    }
                    else
                    {
                        Console.WriteLine(Message("Recevied message from client " + remoteHost + ":\n"));
                        Console.WriteLine(data);
                        Console.WriteLine("\nEOF\n");
                    }
                }
            }
            catch
            {
                string message = Message("Client:" + remoteHost + " is disconnected from the server (forced close).");
                Console.WriteLine(message);
                socketForClient.Close();
                return;
            }
        }
    }
}

private void ServStart()
{
    Listener = new TcpListener(LocalPort);
    Listener.Start(); // started listening
    Console.WriteLine("Waiting connections [" + Convert.ToString(LocalPort) + "]...");
    for (int i = 0; i < 1000; i++)
    {
        Thread newThread = new Thread(new ThreadStart(Listeners));
        newThread.Start();
    }
}
So on server start it creates 1000 threads, which listen for client messages.
Problems:
I will describe the situation:
Start the server
The server starts its threads and is ready to accept client connections
Start the client
The connection is established. The server says that a client connected on some port. The client sends the "hello" message. The server doesn't handle this hello message.
Push the button, so the client sends "Some text" to the server. The server handles this message.
Push the button. The client sends "Some text" again. The server doesn't handle that message.
Push the button. The client sends "Some text" again. The server handles that message.
If I push it again, it will obviously not handle it...
Server logs:
Why does the server receive (or the client send) only 1 of every 2 messages? What can cause this?
I also have problems sending the exit message to the server when the client form is closing. I send the exit message on that action.
So, the situation:
I just pushed the button and the server handled it (so the server will not handle the next message).
I close the form, the message is sent, but either the client sends the wrong message or the server receives the wrong message.
Situation in the console:
You can see that when the form was closed and the client sent exit, the server handled an empty message. Why?
A situation where the client's exit command was handled by the server normally:
.....
The client sends data, the server doesn't handle it
Now the server will handle the client, so we try to close the form:
Console:
So in the 2nd item the client sent the hello message and the server failed to handle it. In the 3rd item the client sent the exit command and the server handled it correctly.
Main question: why does the server handle only 1 of every 2 messages from the client?
Another point: I also found that when the client sends the exit data, the server receives exit\0\0\0\0\0\0\0\0 (with more or fewer \0 symbols). Why?
The good news, I think, is that the server receives or misses messages consistently: 1 message is received, 1 is not. That points to a gap in my knowledge rather than a random error.
So many bugs. :(
That said, the biggest one I noticed was this one:
int messageLength = socketForClient.Receive(buf);
if (messageLength > 0)
{
    byte[] cldata = new byte[messageLength];
    socketForClient.Receive(cldata);
    data = "";
    data = Encoding.ASCII.GetString(cldata).Trim();
First, understand that in TCP, you have no guarantees about the number of bytes any given receive operation will receive. No matter how the remote endpoint sends the data, you could receive all of the data at once, or only parts of it, in separate receive operations. TCP guarantees the bytes will be received in the same order in which they were sent, and nothing more.
But the above code not only fails to take that into account, it's just completely wrong. The number of bytes received in the first operation is how many bytes were received in that operation. But you are using that number as if it would tell you something about the number of bytes received in the next call to Receive(). It does nothing of the sort. At the same time, you ignore the data you received in the first operation.
Instead, your code should look more like this:
int messageLength = socketForClient.Receive(buf);
if (messageLength > 0)
{
    data = Encoding.ASCII.GetString(buf, 0, messageLength).Trim();
That's still not quite right, because of course you could receive just a partial message in the call to Receive(), or even more than one message concatenated together. But at least you're likely to see all of the text.
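If you do want to handle that yourself, here is a minimal sketch, assuming the client is changed to terminate every message with '\n' (the posted client does not do this yet); the class and method names are illustrative:

using System;
using System.Net.Sockets;
using System.Text;

static class LineReceiver
{
    // Illustrative only: accumulate received bytes and hand out complete,
    // '\n'-terminated messages, no matter how TCP splits or merges them.
    public static void ReceiveLines(Socket socket, Action<string> onMessage)
    {
        var pending = new StringBuilder();
        byte[] buf = new byte[1024];
        int received;
        while ((received = socket.Receive(buf)) > 0)
        {
            pending.Append(Encoding.ASCII.GetString(buf, 0, received));
            int newline;
            while ((newline = pending.ToString().IndexOf('\n')) >= 0)
            {
                string message = pending.ToString(0, newline).TrimEnd('\r');
                pending.Remove(0, newline + 1);
                onMessage(message);
            }
        }
    }
}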
That change will address the specific question you've asked about. If you have trouble figuring out how to address the other bugs, please feel free to post concise, specific questions and code examples to ask for help on those. See https://stackoverflow.com/help/mcve and https://stackoverflow.com/help/how-to-ask for advice on better ways to present your question.

Serial Port Polling and Data handling

I am trying to read from several serial ports attached to sensors through microcontrollers. Each serial port will receive more than 2000 measurements (each measurement is 7 bytes, all in hex), and they are all firing at the same time. Right now I am polling 4 serial ports. I also translate each measurement into a string and append it to a StringBuilder. When I finish receiving data, it is output to a file. The problem is that CPU consumption is very high, ranging from 80% to 100%.
I went through some articles and put Thread.Sleep(100) at the end. It reduces CPU time when there is no data coming in. I also put a Thread.Sleep at the end of each poll when BytesToRead is smaller than 100. It only helps to a certain extent.
Can someone suggest a solution for polling the serial ports and handling the data I get? Maybe appending every time I get something is causing the problem?
// I use separate threads for all sensors
private void SensorThread(SerialPort mySerialPort, int bytesPerMeasurement, TextBox textBox, StringBuilder data)
{
    textBox.BeginInvoke(new MethodInvoker(delegate() { textBox.Text = ""; }));
    int bytesRead;
    int t;
    Byte[] dataIn;
    while (mySerialPort.IsOpen)
    {
        try
        {
            if (mySerialPort.BytesToRead != 0)
            {
                // trying to read a fixed number of bytes
                bytesRead = 0;
                t = 0;
                dataIn = new Byte[bytesPerMeasurement];
                t = mySerialPort.Read(dataIn, 0, bytesPerMeasurement);
                bytesRead += t;
                while (bytesRead != bytesPerMeasurement)
                {
                    t = mySerialPort.Read(dataIn, bytesRead, bytesPerMeasurement - bytesRead);
                    bytesRead += t;
                }
                // convert them into a hex string
                StringBuilder s = new StringBuilder();
                foreach (Byte b in dataIn) { s.Append(b.ToString("X") + ","); }
                var line = s.ToString();
                var lineString = string.Format("{0} ---- {1}",
                    line,
                    mySerialPort.BytesToRead);
                data.Append(lineString + "\r\n"); // append a measurement to a huge StringBuilder... Need a solution for this.
                // use delegate to change UI thread...
                textBox.BeginInvoke(new MethodInvoker(delegate() { textBox.Text = line; }));
                if (mySerialPort.BytesToRead <= 100) { Thread.Sleep(100); }
            }
            else { Thread.Sleep(100); }
        }
        catch (Exception ex)
        {
            //MessageBox.Show(ex.ToString());
        }
    }
}
This is not a good way to do it; it is far better to work with the DataReceived event.
Basically, with serial ports there's a 3-stage process that works well:
Receiving the data from the serial port
Waiting until you have a relevant chunk of data
Interpreting the data
So, something like:
class DataCollector
{
    private readonly Action<List<byte>> _processMeasurement;
    private readonly string _port;
    private SerialPort _serialPort;
    private const int SizeOfMeasurement = 4;
    List<byte> Data = new List<byte>();

    public DataCollector(string port, Action<List<byte>> processMeasurement)
    {
        _processMeasurement = processMeasurement;
        _serialPort = new SerialPort(port);
        _serialPort.DataReceived += SerialPortDataReceived;
        _serialPort.Open(); // the port has to be opened before DataReceived will fire
    }

    private void SerialPortDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        while (_serialPort.BytesToRead > 0)
        {
            var count = _serialPort.BytesToRead;
            var bytes = new byte[count];
            _serialPort.Read(bytes, 0, count);
            AddBytes(bytes);
        }
    }

    private void AddBytes(byte[] bytes)
    {
        Data.AddRange(bytes);
        while (Data.Count >= SizeOfMeasurement)
        {
            var measurementData = Data.GetRange(0, SizeOfMeasurement);
            Data.RemoveRange(0, SizeOfMeasurement);
            if (_processMeasurement != null) _processMeasurement(measurementData);
        }
    }
}
Note: AddBytes keeps collecting data until you have enough to count as a measurement, or, if you get a burst of data, splits it up into separate measurements... so you can get 1 byte one time, 2 the next, and 1 more the next, and it will then take that and turn it into a measurement. Most of the time, if your micro sends it in a burst, it will come in as one, but sometimes it will get split into 2.
Then somewhere you can do:
var collector = new DataCollector("COM1", ProcessMeasurement);
and
private void ProcessMeasurement(List<byte> bytes)
{
    // this will get called for every measurement, so then
    // put stuff into a text box.... or do whatever
}
First of all, consider reading Using Stopwatches and Timers in .NET. You can break down any performance issue with this and tell exactly which part of your code is causing the problem.
Use the SerialPort.DataReceived event to trigger the data-receiving process.
Separate the receiving process from the data-manipulation process: store your data first, then process it.
Do not edit the UI from the reading loop.
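A rough sketch of that separation, where the DataReceived handler only drains the port and a background worker does the parsing and storage (the class and member names here are made up for illustration):

using System;
using System.Collections.Concurrent;
using System.IO.Ports;
using System.Threading;

class SerialCollector
{
    private readonly SerialPort _port;
    private readonly BlockingCollection<byte[]> _chunks = new BlockingCollection<byte[]>();

    public SerialCollector(string portName)
    {
        _port = new SerialPort(portName);
        _port.DataReceived += (s, e) =>
        {
            // Receiving only: drain the driver buffer and hand the bytes off.
            int count = _port.BytesToRead;
            if (count == 0) return;
            var bytes = new byte[count];
            int read = _port.Read(bytes, 0, count);
            Array.Resize(ref bytes, read);
            _chunks.Add(bytes);
        };
    }

    public void Start()
    {
        _port.Open();
        // Processing on its own thread: parsing and file/UI work never block
        // the DataReceived handler, and no polling loop burns CPU.
        new Thread(() =>
        {
            foreach (var chunk in _chunks.GetConsumingEnumerable())
            {
                // parse the chunk into measurements and store them here
            }
        }) { IsBackground = true }.Start();
    }
}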
I guess what you should be doing is adding an event handler to process incoming data:
mySerialPort.DataReceived += new SerialDataReceivedEventHandler(mySerialPort_DataReceived);
This eliminates the need to run a separate thread for each serial port you listen to. Also, each DataReceived handler will be called precisely when there is data available and will consume only as much CPU time as is necessary to process the data, then yield to the application/OS.
If that doesn't solve the CPU usage problem, it means you're doing too much processing. But unless you've got some very fast serial ports, I can't imagine the code you've got there posing a problem.

Could I have some advice on basic Asynchronous Socket Programming in C#?

I've been developing (or trying to, anyway) a program that uses asynchronous sockets to, supposedly, pass strings to and from the server and client at any time.
This program requires that no more than one client be connected to the server. I tried plain (blocking) socket programming, but I found out it blocks the program until either side receives something.
Since I have only a basic understanding of asynchronous socket programming, I just went for the simplest approach I could find, or at least the simplest one I could understand.
Here's my code for the Server:
public Socket g_server_conn;
public byte[] g_bmsg;
public bool check = false;

private void net_As_Accept(IAsyncResult iar)
{
    Socket server_conn = (Socket)iar.AsyncState;
    g_server_conn = server_conn.EndAccept(iar);
    g_bmsg = new byte[1024];
    check = true;
    g_server_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_server_conn);
}

private void net_As_Send(IAsyncResult iar)
{
    Socket server_conn = (Socket)iar.AsyncState;
    server_conn.EndSend(iar);
}

private void net_As_Receive(IAsyncResult iar)
{
    try
    {
        Socket server_conn = (Socket)iar.AsyncState;
        server_conn.EndReceive(iar);
        if (g_bmsg.Length != 0)
        {
            net_Data_Receive(Encoding.ASCII.GetString(g_bmsg, 0, g_bmsg.Length));
            check = false;
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString(), "GG");
    }
}

public void net_Data_Send(string msg2snd) // Function for sending through socket
{
    MessageBox.Show(msg2snd);
    byte[] byData = System.Text.Encoding.ASCII.GetBytes(msg2snd);
    g_server_conn.BeginSend(byData, 0, byData.Length, SocketFlags.None, new AsyncCallback(net_As_Send), g_server_conn);
    g_server_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_server_conn);
}

private void net_Data_Receive(string txt)
{
    if (lblBuffer.InvokeRequired)
        lblBuffer.Invoke(new MethodInvoker(delegate { net_Data_Receive(txt); }));
    else
        lblBuffer.Text = txt;
    if (txt.StartsWith("&"))
    {
        // Do something
    }
}
And here's my code for the Client:
private void net_As_Connect(IAsyncResult iar)
{
    try
    {
        Socket client_conn = (Socket)iar.AsyncState;
        client_conn.EndConnect(iar);
        g_bmsg = new byte[1024];
        check = true;
        string toSendData = "&" + net_Name;
        net_Data_Send(toSendData);
        g_client_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_client_conn);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString(), "GG");
    }
}

private void net_As_Send(IAsyncResult iar)
{
    Socket client_conn = (Socket)iar.AsyncState;
    client_conn.EndSend(iar);
}

private void net_As_Receive(IAsyncResult iar)
{
    if (g_bmsg.Length != 0)
    {
        net_Data_Receive(Encoding.ASCII.GetString(g_bmsg, 0, g_bmsg.Length));
        check = false;
    }
}

public void net_Data_Send(string msg2snd)
{
    byte[] byData = System.Text.Encoding.ASCII.GetBytes(msg2snd);
    g_client_conn.BeginSend(byData, 0, byData.Length, SocketFlags.None, new AsyncCallback(net_As_Send), g_client_conn);
    g_client_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_client_conn);
}

private void net_Data_Receive(string txt)
{
    if (lblBuffer.InvokeRequired)
        lblBuffer.Invoke(new MethodInvoker(delegate { net_Data_Receive(txt); }));
    else
        lblBuffer.Text = txt;
    if (txt.StartsWith("&"))
    {
        // Do Something
    }
    else if (txt.StartsWith("$"))
    {
        // Do something Else
    }
}
Now, the Client can connect to the Server fine. The Client can even send a string containing the user's name to the Server, which is then displayed on the Server. The Server then sends out the name of its own user to the Client, which the Client receives and displays. Whatever is sent is stored in a Label (lblBuffer).
But afterwards, say I have the following code:
private void btnSendData_Click(object sender, EventArgs e)
{
    string posMov = "Stuff to send";
    net_Data_Send(posMov);
}
The Client receives nothing. Putting a message box in the net_Data_Send(msg2snd) function reveals that the server does in fact send out the message. In fact, putting the message box in that function makes it work (the Client receives it), for reasons I don't know. Since I haven't tried sending a message from the Client to the Server (other than the name when the Client connects), I assume the Client will have the same problem sending to the Server.
Also, when it does send the second message (with a message box in the net_Data_Send function), only parts of the Label (lblBuffer) are overwritten. So if my name is "Anon E. Moose", and the Server sends that when the Client connects, and I then try to send, say, "0.0" (via button press), the Label on the Client reads "0.0n E. Moose".
What did I do wrong? Can I have some help on this, please?
Perhaps I have a problem with net_Data_Receive and net_Data_Send?
I think you need to call BeginReceive on your client again; it looks like you are only calling it once, so after it has received the server name, it isn't listening for any more data from the server:
private void net_As_Receive(IAsyncResult iar)
{
    var bytesRead = g_client_conn.EndReceive(iar);
    if (bytesRead != 0)
    {
        net_Data_Receive(Encoding.ASCII.GetString(g_bmsg, 0, bytesRead));
        check = false;
    }
    g_client_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_client_conn);
}
Also, as I mentioned in my comment, use the bytesRead value to work out how much of the buffer you need to use.
You will need to work out if the data you have received from the socket is the full amount, or if you need to read more data to make up the current message from the other side.
BeginReceive doesn't just call its callback whenever a new packet (a string, in your case) arrives. In fact, BeginReceive, like any raw socket method, works in a stream-based fashion, not a packet-based one. See http://msdn.microsoft.com/en-us/library/bew39x2a.aspx for an example.
What you need to do, in your 'net_As_Receive' callback method (the naming is terrible, IMO), is first call socket.EndReceive(IAsyncResult), which returns the number of bytes received. After that, you have to decide whether to receive more data or not.
For example:
private StringBuilder packetBuilder;

private void net_As_Receive(IAsyncResult iar)
{
    if (packetBuilder == null)
        packetBuilder = new StringBuilder();
    // finalize the receive
    int length = g_server_conn.EndReceive(iar);
    if (length != 0)
    {
        // append what was received; note that 'length' is the number of bytes received, not the size of the buffer
        packetBuilder.Append(Encoding.ASCII.GetString(g_bmsg, 0, length));
        net_Data_Receive(packetBuilder.ToString());
        check = false;
    }
    // receive the next part
    g_server_conn.BeginReceive(g_bmsg, 0, g_bmsg.Length, SocketFlags.None, new AsyncCallback(net_As_Receive), g_server_conn);
}
Note that this example doesn't care about message boundaries. It will work if you're lucky, but there is a good chance that either part of a string will be shown or 2 different strings will be combined. A good implementation will look for a string terminator and only show that part, buffering the rest until a new terminator is found. You can also use a StreamReader to make your life much easier.
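A minimal sketch of the StreamReader route, assuming each message is terminated with "\n" (the question's code does not append a terminator yet) and that this runs on its own thread instead of the Begin*/End* callbacks (the ReadLines name is made up):

using System.IO;
using System.Net.Sockets;
using System.Text;

// Blocks on ReadLine until a full, '\n'-terminated message has arrived,
// then hands it to the existing net_Data_Receive handler from the question.
void ReadLines(Socket socket)
{
    using (var stream = new NetworkStream(socket, ownsSocket: false))
    using (var reader = new StreamReader(stream, Encoding.ASCII))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            net_Data_Receive(line);
        }
    }
}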

Obviously this is not the correct way to read with SerialPort

Let's say I want to have a function which reads data from the SerialPort
and returns a byte[].
public byte[] RequestData(byte[] data)
{
    //See code below
}
Something as simple as this really doesn't work/perform well and isn't very reliable:
byte[] response = new byte[port.ReadBufferSize];
port.Open();
port.Write(data, 0, data.Length);
Thread.Sleep(300); //Without this it doesn't even work at all
Console.WriteLine("Bytes to read: {0}", port.BytesToRead);
int count = port.Read(response, 0, port.ReadBufferSize);
Console.WriteLine("Read {0} bytes", count);
port.Close();
port.Dispose();
return response.GetSubByteArray(0, count);
I also tried replacing the Thread.Sleep with something like:
while (port.BytesToRead < 14)
{
    //Maybe Thread.Sleep(10) here?
}
But that causes problems too. (PS: I know I need at least 14 bytes.)
Of course a better way (I think) would be to have something like:
port.ReceivedBytesThreshold = 14;
port.DataReceived += new SerialDataReceivedEventHandler(port_DataReceived);
port.Open();
port.Write(data, 0, data.Length);
And then having a handler of course:
void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var port = (SerialPort)sender;
    while (port.BytesToRead > 0)
    {
        //Read the data here
    }
}
But then I can't return the data as the result of the function I wanted to define.
The client code using this would have to subscribe to an event raised by this code, but then how would it know that the response is really the response to the request it just made?
(Multiple messages might be sent, and I can imagine one message taking longer to process on the other side than another, or something.)
Any advice would be welcome.
UPDATE
The following code works a lot better, but if I remove the Thread.Sleep() statements it once again stops working properly. For example, the serial port monitoring tool clearly indicates 17 bytes have been written on the serial line. The first time BytesToRead = 10 and the next time BytesToRead = 4, but then BytesToRead remains 0, so where did the last 3 bytes go?
void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    Thread.Sleep(100);
    while (port.BytesToRead > 0)
    {
        Console.WriteLine("Bytes to read: {0}", port.BytesToRead);
        var count = port.BytesToRead;
        byte[] buffer = new byte[count];
        var read = port.Read(buffer, 0, count);
        if (count != read)
            Console.WriteLine("Count <> Read : {0} {1}", count, read);
        var collectAction = new Action(() =>
        {
            var response = dataCollector.Collect(buffer);
            if (response != null)
            {
                this.OnDataReceived(response);
            }
        });
        collectAction.BeginInvoke(null, null);
        Thread.Sleep(100);
    }
}
Here's how I've done it:
I have a wrapper class that accepts the vital data for the connection in its constructor and does the basic setup there. The consumer of the class calls a Connect method, which fires off another thread to perform the connection (non-blocking).
When the connection is complete, a StateEvent is fired indicating that the connection is complete. At this point a send queue is set up, a thread to work that queue is fired off, and a read thread is also set up. The read thread reads 128 characters of data from the SerialPort, converts that to a string, and then fires an event to pass along the received data. This is wrapped in a while loop that runs as long as the connection is maintained. When the consumer wants to send something, a Send method simply enqueues the data to be sent.
Knowing whether a response corresponds to something that was sent really isn't the job of a connection class. By abstracting the connection away into something that easy to handle, the consumer of the class can cleanly maintain the logic to determine whether the response is what it expected.
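An illustrative skeleton of that kind of wrapper (the names, event shapes, and the 128-byte read size are made up here, not the actual class being described):

using System;
using System.Collections.Concurrent;
using System.IO.Ports;
using System.Text;
using System.Threading;

class SerialConnection
{
    private readonly SerialPort _port;
    private readonly BlockingCollection<byte[]> _sendQueue = new BlockingCollection<byte[]>();

    public event EventHandler Connected;          // fired when the port is open
    public event EventHandler<string> DataReceived; // fired for every chunk read

    public SerialConnection(string portName, int baudRate)
    {
        _port = new SerialPort(portName, baudRate);
    }

    public void Connect()
    {
        // Connect on another thread so the caller is never blocked.
        new Thread(() =>
        {
            _port.Open();
            Connected?.Invoke(this, EventArgs.Empty);
            new Thread(SendLoop) { IsBackground = true }.Start();
            new Thread(ReadLoop) { IsBackground = true }.Start();
        }) { IsBackground = true }.Start();
    }

    // Consumers just enqueue; the send worker does the actual writing.
    public void Send(string text) => _sendQueue.Add(Encoding.ASCII.GetBytes(text));

    private void SendLoop()
    {
        foreach (var bytes in _sendQueue.GetConsumingEnumerable())
            _port.Write(bytes, 0, bytes.Length);
    }

    private void ReadLoop()
    {
        var buffer = new byte[128];
        while (_port.IsOpen)
        {
            try
            {
                int read = _port.Read(buffer, 0, buffer.Length); // blocks until some data arrives
                if (read > 0)
                    DataReceived?.Invoke(this, Encoding.ASCII.GetString(buffer, 0, read));
            }
            catch (InvalidOperationException)
            {
                break; // the port was closed underneath us
            }
        }
    }
}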
Aren't serial ports fun? My only thought is that your FIFO, assuming your device has one and it's enabled, is being overrun.
Problem solved:
void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var count = port.BytesToRead;
    byte[] buffer = new byte[count];
    var read = port.Read(buffer, 0, count);
    var response = dataCollector.Collect(buffer);
    if (response != null)
    {
        this.OnDataReceived(response);
    }
}
It seems the problem wasn't actually this code but the code in the dataCollector.Collect() method.
