I'm having a problem reading data transmitted over the serial port: the data is incomplete every first time I run the project.
I've tried two methods to read:
byte[] data = new byte[_serialPort.BytesToRead];
_serialPort.Read(data, 0, data.Length);
txtGateway.Text = System.Text.Encoding.UTF8.GetString(data);
And
txtGateway.Text = _serialPort.ReadExisting();
However, it only reads 14 bytes the first time after I start the program. When I trace the program, _serialPort.BytesToRead is only 14 on that first read. If I send the data a second time, it is read correctly.
Both methods give the same result. I'm sure the sending side writes the complete data to the serial port.
Serial ports don't have any message boundaries. If you want framing, you have to add it yourself.
As for only the most recent 14 bytes being in the serial port when your program starts: 14 bytes is a typical receive trigger level, since the classic 16550 UART has a 16-byte FIFO with a 14-byte receive threshold.
See also How do you programmatically configure the Serial FIFO Receive and Transmit Buffers in Windows?
Or just flush the receive buffer when the program starts.
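For example (a minimal sketch, assuming the _serialPort field from the question):
_serialPort.Open();
// Throw away whatever was already sitting in the driver's receive
// buffer (e.g. stale bytes left over in the UART FIFO) before the
// first real read.
_serialPort.DiscardInBuffer();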
Hi, you can split the received data on the carriage return (\r) and strip the ASCII STX character (\x02):
string data = _serialPort.ReadExisting();
// Split on the carriage-return frame terminator
string[] arr1 = data.Split('\r');
// Strip the STX (0x02) marker and surrounding whitespace from a frame
string checkFinal = arr1[0].Replace("\x02", "").Trim();
I have 2 GUI applications, one in C++ and one in C#.
The applications are the same and there is a function that writes and reads from COM port.
When I run my C++ app, I receive the correct result from Serial.Read: a buffer of 24 bytes.
But when I run my C# app, I get inconsistent results:
* Just a 1-byte buffer if I don't put a sleep between write and read.
* Buffers of varying size (10-22 bytes) if I do put a sleep between write and read.
What could be the reason for that?
My C++ code:
serial.write(&c, 1, &written);
serial.read(read_buf, read_len, &received); // received = 24
My C# code:
serial.Write(temp_char, 0, 1);
received = serial.Read(read_buff, 0, read_len); // received = 1
C# with sleep:
serial.Write(temp_char, 0, 1);
Thread.Sleep(100);
received = serial.Read(read_buff, 0, read_len); // received = (10~22)
Serial ports just give you a stream of bytes; they don't know how you've written blocks of data to them. When you call Read, whatever bytes have arrived so far are returned; if that isn't a complete message, you need to call Read repeatedly until you have the whole message.
You need to define a protocol that marks message boundaries; this could be a special character (e.g. a newline in a text-based protocol), or you can prefix each message with its length, as sketched below.
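For example, a minimal read loop for a hypothetical one-byte length prefix might look like this (a sketch, not a drop-in for your protocol):
using System.IO;
using System.IO.Ports;

// Read one length-prefixed message: a single length byte followed by
// that many payload bytes (a hypothetical framing scheme).
static byte[] ReadMessage(SerialPort port)
{
    int length = port.ReadByte();             // blocks until a byte arrives
    if (length < 0) throw new EndOfStreamException();

    byte[] message = new byte[length];
    int offset = 0;
    while (offset < length)
    {
        // Read may return fewer bytes than requested, so keep calling
        // it until the whole frame has been assembled.
        int n = port.Read(message, offset, length - offset);
        if (n <= 0) throw new EndOfStreamException();
        offset += n;
    }
    return message;
}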
I am writing a small HTTP server, and sometimes I encounter a problem with missing POST data.
Using Wireshark I discovered that the request is split into two TCP segments.
I only get the first segment (636 bytes); the second one (the POST data in this case) gets lost entirely.
Here is the relevant C# code:
string requestHeaderString = "";
StreamSocket socketStream = args.Socket;
IInputStream inputStream = socketStream.InputStream;
byte[] data = new byte[BufferSize];
IBuffer buffer = data.AsBuffer();
try
{
await inputStream.ReadAsync(buffer, BufferSize, InputStreamOptions.Partial);
// This is where things go missing, buffer.ToArray() should be 678 Bytes long,
// so Segment 1 (636 Bytes) and Segment 2 (42 Bytes) combined.
// But is only 636 Bytes long, so just the first Segment?!
requestHeaderString += Encoding.UTF8.GetString(buffer.ToArray());
}
catch (Exception e)
{
Debug.WriteLine("inputStream is not readable" + e.StackTrace);
return;
}
This code is part of the StreamSocketListener.ConnectionReceived event handler.
Do I manually have to reassemble the TCP Segments, isn't this what the Systems TCP Stack should do?
The problem is that the system's TCP stack treats the connection just like any other stream. You don't get "messages" with streams, you just get a stream of bytes.
The receiving side has no way to tell where one "message" ends and the next begins unless you tell it somehow. You must implement message framing on top of TCP; on the receiving side you must then call Receive repeatedly until you have received enough bytes to form a full message (using the int returned from each Receive call to see how many bytes were actually read).
Important note: if you don't know how many bytes to expect in total, for example because you frame messages with a '\0' separator, you may get the end of one message and the start of the next in a single Receive call. You need to handle that situation.
EDIT: Sorry, I skipped over the fact that you are reading HTTP. You must follow the HTTP protocol: read data until you see the pattern \r\n\r\n, then parse the header to find out how much data is in the content portion of the message, and keep calling read until you have read that many more bytes.
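A sketch of that approach against a plain System.IO.Stream (the ReadHttpRequestAsync and FindHeaderEnd names are mine; adapt it for the IInputStream in your code):
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;

// Read one HTTP request, honouring Content-Length.
static async Task<byte[]> ReadHttpRequestAsync(Stream stream)
{
    var received = new MemoryStream();
    var buffer = new byte[4096];
    int headerEnd = -1;

    // 1. Keep reading until the blank line (\r\n\r\n) that ends the header.
    while (headerEnd < 0)
    {
        int n = await stream.ReadAsync(buffer, 0, buffer.Length);
        if (n == 0) throw new EndOfStreamException("connection closed mid-header");
        received.Write(buffer, 0, n);
        headerEnd = FindHeaderEnd(received.ToArray());
    }

    // 2. Parse Content-Length out of the header (0 if absent).
    string header = Encoding.ASCII.GetString(received.ToArray(), 0, headerEnd);
    int contentLength = 0;
    foreach (string line in header.Split(new[] { "\r\n" }, StringSplitOptions.None))
        if (line.StartsWith("Content-Length:", StringComparison.OrdinalIgnoreCase))
            contentLength = int.Parse(line.Substring("Content-Length:".Length).Trim());

    // 3. Keep reading until the whole body has arrived as well.
    while (received.Length < headerEnd + contentLength)
    {
        int n = await stream.ReadAsync(buffer, 0, buffer.Length);
        if (n == 0) throw new EndOfStreamException("connection closed mid-body");
        received.Write(buffer, 0, n);
    }
    return received.ToArray();
}

// Returns the index just past the first \r\n\r\n, or -1 if not found yet.
static int FindHeaderEnd(byte[] data)
{
    for (int i = 0; i + 3 < data.Length; i++)
        if (data[i] == '\r' && data[i + 1] == '\n' &&
            data[i + 2] == '\r' && data[i + 3] == '\n')
            return i + 4;
    return -1;
}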
I have a microcontroller (Arduino Uno) running nanopb that is sending protobuf messages over the wire. I'm finding that under one specific case I'm not receiving my full message. I thought for a while that it was the microcontroller, but it appears to be on the C# side that's reading from it.
The issue ONLY happens for uint32 values GREATER THAN 16. 16 or less and it works fine.
I've set up a VERY simple program on the microcontroller to ensure it's not my other code that's causing it there. Essentially it's sending a struct over the wire with one uint32_t value in it:
//Protobuf message:
message Test { required uint32 testInt = 1; }
//Resulting struct:
typedef struct Test {
    uint32_t testInt;
} Test;
//Serial code:
Serial.begin(115200);
pb_ostream_t ostream;
//Removed ostream setup code as it's not relevant here...
Test message;
message.testInt = 17;
pb_encode_delimited(&ostream, Test_fields, &message);
If I plug in my device and look at its output using Termite, I see the following data (which is correct):
[02] [08] [11] (note Termite displays it in hex)
(That's saying the message is 2 bytes long, followed by the field-tag byte, followed by the testInt value of 17, which is 0x11 in hex.)
Now, if I bring this up in C# I should see 3 bytes when reading the message, but I only see 2. When the value in testInt is 16 or less it comes across as three bytes; 17 or greater and I only get two:
var port = new SerialPort("COM7", 115200, Parity.None, 8, StopBits.One);
port.Handshake = Handshake.RequestToSendXOnXOff;
port.Open();
while (port.IsOpen)
{
Console.WriteLine(port.ReadByte());
Thread.Sleep(10);
}
port.Close();
Console.ReadLine();
Output with 16: 2 8 16
Output with 17: 2 8
Any help is greatly appreciated, I'm at a loss on this one =(
You set the serial port to use XOn/XOff flow control. Why?
The ASCII code for XON is 17 (0x11), so the driver consumes that byte as a flow-control character instead of handing it to your application.
If you are sending binary data, don't use XOn/XOff.
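For example:
// Disable software flow control so XON (0x11) and XOFF (0x13)
// are delivered to the application like any other data byte.
port.Handshake = Handshake.None;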
Looks like a simple race condition: there is nothing to ensure that the C# code gets 3 bytes. It could get one, two, three or more. If, say, it starts the loop when two bytes are in the UART buffer, then it will get two bytes and output them. I suspect the 16/17 issue is just a coincidence.
Also, when there is nothing in the buffer your loop consumes 100% CPU. That's not good.
You'd be better off using the blocking SerialPort.ReadByte() call: get the length byte, then loop to read that many more bytes from the serial port.
Also, as a protocol, using protobuf messages alone with no header mark isn't great. If you get out of sync it could take a long while before you get lucky and get back in sync. You might want to add some kind of 'magic byte' or a sequence of magic bytes at the start of each message so the reader can regain sync.
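A sketch of that combination, using a hypothetical 0xAA marker byte in front of the existing length prefix (nanopb itself does not add the marker, you would have to send it yourself):
// Skip input until the (hypothetical) 0xAA start-of-message marker,
// then read the length byte and exactly that many payload bytes.
while (port.ReadByte() != 0xAA) { /* out of sync: discard the byte */ }

int length = port.ReadByte();
byte[] payload = new byte[length];
for (int offset = 0; offset < length; )
    offset += port.Read(payload, offset, length - offset);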
Does the Write method of SerialPort write all the bytes it was told to?
I have this code to send data via serial port.
SerialPort port = new SerialPort(
"COM1", 9600, Parity.None, 8, StopBits.One);
// Open the port for communications
port.Open();
// Write bytes
byte[] bytes = Encoding.ASCII.GetBytes("Hello world from PC");
port.Write(bytes, 0, bytes.Length);
// Close the port
port.Close();
If I send string "Hello" the device connected to my PC via serial port receives it well. If I send string "Hello world from PC"
it only receives
first 16 bytes instead of 19.
Is there a way I can verify in code that all bytes were sent? Or it is problem of the hardware which is connected via serial port?
Yes, .Write is synchronous, so all data is sent before it returns. Anyway, the 30-second timeout that you use would be more than enough, even if it weren't.
Normally all bytes are sent. Maybe the problem is at the receiving side.
To be sure that handshaking is turned off, use this code (I believe it is already the default):
serialPort.Handshake = Handshake.None;
I suspect the receiving side. It might be that the UART on the other side is not fast enough to get the characters out of its buffer. You could try adding small delays between parts of your message, but only for troubleshooting, of course. Like so:
// Write bytes for test purpose
byte[] bytes = Encoding.ASCII.GetBytes("Hello world from PC");
port.Write(bytes, 0, 10);
Thread.Sleep(100); // small delay
port.Write(bytes, 10, 9); // next block
I had a similar problem.
I write an array of 20 bytes to the serial port, which is connected to a wireless transmitter. After sending all the bytes I need to switch back to receive mode,
so it is necessary to know exactly when all the bytes have been physically written out of the serial port.
Unfortunately this is impossible under Windows, since the buffer queries do not report the status of the physical chip's buffer.
The solution for me was to send the data asynchronously with serialPort.BaseStream.WriteAsync
and then calculate the transmit time from the number of bytes and the baud rate:
serialPort.BaseStream.WriteAsync(buffer, 0, buffer.Length);
// 10 bits per byte on the wire, converted to milliseconds, plus a margin
int waittime = (int)(buffer.Length * 10.0 * 1000.0 / baudrate) + some_reserve;
Thread.Sleep(waittime);
The "10" means that one byte on the wire consists of 10 bits: 1 start bit, 8 data bits and 1 stop bit (baudrate and some_reserve, in milliseconds, are variables you define yourself).
I have a problem with a serial port reader in C#.
If I send 5555 through the serial port, the program prints out 555.
Here is the program:
public static void Main()
{
    byte[] buffer = new byte[256];
    using (SerialPort sp = new SerialPort("COM2", 6200))
    {
        sp.Open();
        //read directly
        sp.Read(buffer, 0, buffer.Length);
        //read using a Stream
        sp.BaseStream.Read(buffer, 0, buffer.Length);
        string sir = System.Text.Encoding.Default.GetString(buffer);
        Console.WriteLine(sir);
    }
}
Both your computer's UART and the other device's UART may have a hardware buffer, which passes data on according to the hardware flow control enabled for the connection. Hence you have to take care of:
hardware flow control setup;
timing of data reading / writing.
Bear in mind you are working with a real-time hardware device that has its own timing, which your application needs to respect. Communicating with a hardware device is a process; in other words, a one-shot read may not be enough to retrieve all the input you are expecting at the logical level.
Update: Google for "SerialPort tutorial C#" and study a few of them, like this one.
You need to use the int returned from the "Read" methods; the value tells you how many bytes were actually read. You will probably need to loop, calling "Read" multiple times, until you have read the number of bytes you need.
Update: This other question has some sample code that shows how to read multiple times until you have enough data to process.
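For example, a sketch of such a loop for this case:
// Loop until the expected number of bytes has arrived, using the
// int returned by Read to track how many bytes each call produced.
int expected = 4;                      // e.g. the four characters "5555"
byte[] buffer = new byte[expected];
int total = 0;
while (total < expected)
{
    int n = sp.Read(buffer, total, expected - total);
    if (n <= 0) break;                 // stream ended unexpectedly
    total += n;
}
Console.WriteLine(System.Text.Encoding.Default.GetString(buffer, 0, total));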