I have a microcontroller (Arduino Uno) running nanopb that is sending protobuf messages over the wire. I'm finding that in one specific case I'm not receiving my full message. I thought for a while that it was the microcontroller, but it appears the problem is on the C# side that reads from it.
The issue ONLY happens for uint32 values GREATER THAN 16. 16 or less and it works fine.
I've set up a VERY simple program on the microcontroller to make sure it's not my other code causing it there. Essentially it sends a struct over the wire with one uint32_t value in it:
//Protobuf message:
message Test { required uint32 testInt = 1; }
//Resulting struct:
typedef struct Test {
    uint32_t testInt;
} Test;
//Serial code:
Serial.begin(115200);
pb_ostream_t ostream;
//Removed ostream setup code as it's not relevant here...
Test message;
message.testInt = 17;
pb_encode_delimited(&ostream, Test_fields, &message);
If I plug in my device and look at its data output using Termite I see the following data (which is correct):
[02] [08] [11] (note Termite displays it in hex)
(That's the length prefix saying the message is 2 bytes long, followed by the field key byte for testInt, followed by the value 17 - 0x11 in hex)
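Decoded against the protobuf wire format, those bytes are:
0x02 - length prefix: 2 bytes of message payload follow
0x08 - field key: (field number 1 << 3) | wire type 0 (varint)
0x11 - varint-encoded value of testInt: 17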
Now, if I bring this up in C# I should see 3 bytes when reading the message, but I only see 2. When the value in testInt is 16 or less it comes across as three bytes; with 17 or greater I only get two:
var port = new SerialPort("COM7", 115200, Parity.None, 8, StopBits.One);
port.Handshake = Handshake.RequestToSendXOnXOff;
port.Open();
while (port.IsOpen)
{
    Console.WriteLine(port.ReadByte());
    Thread.Sleep(10);
}
port.Close();
Console.ReadLine();
Output with 16: 2 8 16
Output with 17: 2 8
Any help is greatly appreciated, I'm at a loss on this one =(
You set the serial port to use Xon/Xoff. Why?
The code for XON is 17.
If you are sending binary data, don't use Xon/Xoff.
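If the device doesn't need flow control, the fix is a one-line change (hardware RTS/CTS handshaking would also avoid the problem):
var port = new SerialPort("COM7", 115200, Parity.None, 8, StopBits.One);
port.Handshake = Handshake.None; // binary-safe: 0x11 (XON) and 0x13 (XOFF) now arrive as data
port.Open();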
Looks like a simple race condition - there is nothing to ensure that the C# code gets 3 bytes. It could get one, two, three or more. If, say, it starts the loop when two bytes are in the UART buffer then it will get two bytes and output them. I suspect the 16/17 issue is just a coincidence.
Also, when there is nothing in the buffer your loop consumes 100% CPU. That's not good.
You'd be better off using the blocking SerialPort.ReadByte() call, get the length byte and then loop to read that many more bytes from the serial port.
Also, as a protocol, using protobuf messages alone with no header mark isn't great. If you get out of sync it could take a long while before you get lucky and get back in sync. You might want to add some kind of 'magic byte' or a sequence of magic bytes at the start of each message so the reader can regain sync.
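Something along these lines (a sketch; it assumes the single-byte length prefix shown above - pb_encode_delimited writes a varint length, which is one byte for messages under 128 bytes):
int length = port.ReadByte(); // blocks until the length prefix arrives
var payload = new byte[length];
int offset = 0;
while (offset < length) // Read() can return fewer bytes than requested
    offset += port.Read(payload, offset, length - offset);
// payload now holds the complete delimited message, ready for the protobuf parser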
Related
When working with .NET code (e.g. in Microsoft Visual Studio), there are multiple ways to read data from serial ports, mainly these:
Read
SerialPort.Read(byte[] buffer, int offset, int count);
This reads up to count bytes into the given buffer; it may return fewer than count if less data is currently available
Read Existing
SerialPort.ReadExisting();
This reads the data bytes currently available at the serial port
Read To
SerialPort.ReadTo(string delimiter);
This reads until a defined delimiter, e.g. "\r", is found.
Problem
The issue I am currently facing is that my device operates in two modes. In normal mode, I can send a command and a response terminated with \n is sent back, so it can be processed with
string response = SerialPort.ReadTo("\n");
In data sending mode, the device sends binary data in variable packet sizes; a packet can be 5 to 1024 bytes and has no unique character at the end. In fact, the last bytes are a checksum, so they will surely differ in each packet/stream. Therefore, the function
SerialPort.Read(byte[] buffer, int offset, int count);
cannot be used, since count is unknown. Additionally, the function
SerialPort.ReadExisting();
is of no use, since it will only read the first bytes of the packet, not the complete data stream.
Workaround
My current workaround is the following; it has the issue that it's slow and relies on an estimate of the highest packet size (1024 bytes):
private void SerialPortDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    Task.Delay(900).Wait(); // 9600 baud ≈ 1200 byte/s; 1024 byte / 1200 byte/s ≈ 0.85 s
    string response = SerialPort.ReadExisting();
}
The major issue with my workaround is that it has to wait the complete time even if the device sends only a small amount of bytes (e.g. 100). Additionally, if the device is switched to a higher speed (e.g. 38400 baud), the wait time could be way lower.
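The best refinement I've come up with is deriving the wait from the configured baud rate instead of hard-coding 900 ms, so a faster link waits less (still a worst-case wait; a sketch only):
private void SerialPortDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    // Worst-case time for a full 1024-byte packet, at ~10 bits per byte on the wire (8N1)
    int maxPacketMs = (int)(1024.0 * 10 / SerialPort.BaudRate * 1000);
    Task.Delay(maxPacketMs).Wait();
    string response = SerialPort.ReadExisting();
}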
How could I deal with this problem in a proper way?
I have 2 GUI applications, one in C++ and one in C#.
The applications are the same and there is a function that writes and reads from COM port.
When I run my C++ app I receive the right result from the Serial.Read which is a buffer with 24 bytes.
But when I run my C# app I receive uneven results:
* Just a 1-byte buffer if I don't put a sleep between write and read.
* Buffers of varying sizes (between 10-22 bytes) if I do put a sleep between write and read.
What could be the reason for that?
My C++ code:
serial.write(&c, 1, &written);
serial.read(read_buf, read_len, &received); // received = 24
My C# code:
serial.Write(temp_char, 0, 1);
received = serial.Read(read_buff, 0, read_len); // received = 1
C# with sleep:
serial.Write(temp_char, 0, 1);
Thread.Sleep(100);
received = serial.Read(read_buff, 0, read_len); // received = (10~22)
Serial ports just give you a stream of bytes; they don't know how you've written blocks of data to them. When you call read, the bytes that have been received are returned; if that isn't a complete message, you need to call read repeatedly until you have the whole message.
You need to define a protocol to indicate message boundaries. This could be a special character (e.g. a new line in a text-based protocol), or you can prefix your messages with a length.
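For the length-prefix variant, the receive side reduces to "read exactly N bytes" (a sketch; a single-byte length field is assumed):
static byte[] ReadFrame(SerialPort port)
{
    int length = port.ReadByte(); // frame header: payload length
    var frame = new byte[length];
    int offset = 0;
    while (offset < length) // loop until the whole frame has arrived
        offset += port.Read(frame, offset, length - offset);
    return frame;
}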
I'm having a problem reading data transmitted from a serial port: the data is incomplete the first time I read after starting the program.
I've tried two methods to read:
byte[] data = new byte[_serialPort.BytesToRead];
_serialPort.Read(data, 0, data.Length);
txtGateway.Text = System.Text.Encoding.UTF8.GetString(data);
And
txtGateway.Text = _serialPort.ReadExisting();
However, it only reads 14 bytes the first time after I start the program. When I trace the program, _serialPort.BytesToRead is only 14 that first time. If I send the data a second time, the data is read correctly.
Both methods give the same result. I'm sure the sender writes the complete data to the serial port.
Serial ports don't have any message boundaries. If you want framing, you have to add it yourself.
As for only 14 bytes being in the serial port buffer when your program starts: 14 bytes is the default receive-FIFO trigger level on 16550-style UARTs.
See also How do you programmatically configure the Serial FIFO Receive and Transmit Buffers in Windows?
Or just flush the receive buffer when the program starts.
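In code, right after opening the port:
_serialPort.Open();
_serialPort.DiscardInBuffer(); // throw away whatever was left in the driver's receive buffer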
You can split the data on carriage return (\r) and strip the ASCII STX character (\x02):
string data = _serialPort.ReadExisting();
string[] arr1 = data.Split('\r');
string checkFinal = arr1[0].Replace("\x02", "").Trim();
The protocol I am working with uses the Parity (Wakeup) bit to signal the beginning of a new message over RS-232 - all other bytes of the message have the Parity bit set to SPACE. The packet contains length and CRC...
I need to be able to catch the first byte with the Wakeup bit set and then process the packet from the length information in the message itself.
I have done this in several applications in assembly and C.
I am moving to C# and just need help triggering the start of message capture from the Parity/Wakeup bit of the packet frame - an 11-bit frame with 1 start bit, 1 stop bit, 8 data bits and the Wakeup bit.
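One idea I'm considering (a sketch only, untested; the port name, baud rate, and the assumption that the length byte directly follows the wakeup byte are placeholders): open the port with SPACE parity so that a MARK-parity wakeup byte arrives as a parity error, and use ParityReplace to tag it with a sentinel to scan for. The downside is that the wakeup byte's own data value is lost, and the sentinel must not occur as normal data.
var port = new SerialPort("COM3", 9600, Parity.Space, 8, StopBits.One);
port.ParityReplace = 0xFF; // bytes that fail SPACE parity are replaced with 0xFF
port.Open();

while (port.ReadByte() != 0xFF) { } // scan until the wakeup byte shows up
int length = port.ReadByte(); // assumed: the length field follows the wakeup byte
var packet = new byte[length];
int got = 0;
while (got < length)
    got += port.Read(packet, got, length - got);
// ... verify CRC, process packet ...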
I am writing an application that receives input from a long-range radio via a serial connection. I am currently using the C# SerialPort class to receive and send data, but I am having a few issues. I've noticed that my receive handler is not sizing its buffer byte array correctly. All data is sent as raw bytes (displayed in hex).
Say the other node sends 103 bytes of data. Stepping through my code with a breakpoint at the "Read()" line, I see that "serialPort1.BytesToRead - 1" evaluates to 103, BUT the byte[] array is only initialized to 17. I have no explanation for this behavior. As a result, only the first 17 bytes are put into the array. Continuing to step through, this same event is triggered again, this time with "serialPort1.BytesToRead - 1" evaluating to 85 (presumably since only the first 17 of the 103 bytes were read).
If I hardcode the data array size at 103, it works flawlessly in one pass. However, at the moment I am unable to store all the data in my byte array in one pass, which is causing a lot of problems. Does anyone know why my byte array is being initialized to such an arbitrary size?
private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    byte[] data = new byte[serialPort1.BytesToRead - 1];
    serialPort1.Read(data, 0, data.Length);
    DisplayData(ByteToHex(data) /*+ "\n"*/);
}
Updated: Here's the new method I am attempting. isHeader is a boolean initially set to true (as the first two bytes received in the packet are in fact the length of the packet).
const int NUM_HEADER_BYTES = 2;

private void serialPort1_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
    byte[] headerdata = new byte[NUM_HEADER_BYTES];
    if (isHeader)
    {
        serialPort1.Read(headerdata, 0, NUM_HEADER_BYTES);
        int totalSize = (headerdata[0] << 8 | headerdata[1]) >> 6;
        serialPort1.ReceivedBytesThreshold = totalSize - NUM_HEADER_BYTES;
        data = new byte[totalSize - NUM_HEADER_BYTES];
        isHeader = false;
    }
    else
    {
        serialPort1.Read(data, 0, data.Length);
        double[][] results = ParseData(data, data.Length);
        serialPort1.ReceivedBytesThreshold = NUM_HEADER_BYTES;
        isHeader = true;
        DisplayData(ByteToHex(data) /*+ "\n"*/);
    }
}
BytesToRead is equal to the number of bytes waiting in the buffer. It changes from moment to moment as new data arrives, and that's what you're seeing here.
When you step through with the debugger, that takes additional time; the rest of the serial data arrives while you're stepping, and so BytesToRead grows to the full value of 104.
If you know that you need 103 bytes, I believe setting ReceivedBytesThreshold to 104 will trigger the DataReceived event at the proper time. If you don't know the size of the message you need to receive, you'll need to do something else. I notice you're throwing away one byte (serialPort1.BytesToRead - 1); is that an end-of-message byte that you could search for as you read data?
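A sketch of that threshold-driven variant (assuming the fixed 104-byte transfer described above; Read() can still return fewer bytes than requested, hence the loop):
// during setup:
serialPort1.ReceivedBytesThreshold = 104; // raise DataReceived only once the whole transfer is buffered

private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var data = new byte[104];
    int offset = 0;
    while (offset < data.Length)
        offset += serialPort1.Read(data, offset, data.Length - offset);
    DisplayData(ByteToHex(data));
}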
Debuggers won't deliver when it comes to real-time data transfer. Use debug traces.
BTW, I'd go with polling for data myself, not putting my trust in events. With serial ports this is a sound and reliable approach.
Edit per comment:
Serial data transfer rate is bounded by your baud-rate.
You're worried about losing data, so let's look at the numbers:
Assuming:
baud_rate = 19200 [bytes/sec] // it's really *bits* per second, but treating it as bytes errs upwards
buffer_size = 4096 [bytes] // Windows' default allocated buffer size
So it takes:
4096 / 19200 [sec] ~ 200 [ms]
To overflow the buffer (upper bound).
So if you sample at 50 Hz, you're running on an order of magnitude safety net, and that's a good spot. On each 'sample' you read the whole buffer. There is no timing issue here.
Of course you should adapt the numbers to your case, but I'd be surprised if your low-bandwidth RF channel produces a transfer rate for which 50 Hz polling isn't a sufficient overkill.
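A polling loop in that spirit (a sketch; port is the open SerialPort, and Process() is a hypothetical consumer of the received chunk):
var buffer = new byte[4096];
while (port.IsOpen)
{
    int available = port.BytesToRead;
    if (available > 0)
    {
        int n = port.Read(buffer, 0, Math.Min(available, buffer.Length));
        Process(buffer, n); // hypothetical handler
    }
    Thread.Sleep(20); // ~50 Hz poll, well inside the ~200 ms overflow bound
}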
LAST EDIT:
Needless to say, if what you currently have works, then don't touch it.