I am interfacing a scale to a computer through a serial port. Inside a loop, I check whether SerialPort.BytesToRead has reached 0. However, the loop exits even though BytesToRead is not 0. I can't post a screenshot as I am a new user, but stepping through the debugger I can see that BytesToRead is in fact not 0.
This results in my data not being read entirely. I have tried different expressions such as _port.BytesToRead > 0, but the result is the same. Even assigning the value of BytesToRead to a variable gives 0. Without the loop, ReadExisting doesn't return all the data sent from the scale, so I don't really have a choice; ReadLine doesn't work either. So why is BytesToRead always 0?
private void PortDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var input = string.Empty;
    // Reads the data chunk by chunk until the receive buffer is drained
    do
    {
        input += _port.ReadExisting();
    } while (_port.BytesToRead != 0);

    _scaleConfig = GenerateConfig(input);
    if (ObjectReceived != null)
        ObjectReceived(this, _scaleConfig);
}
My boss figured it out. Here's the code.
private void PortDataReceived2(object sender, SerialDataReceivedEventArgs e)
{
var bytesToRead = _port.BytesToRead;
_portDataReceived.Append(_port.ReadExisting());
// Buffer wasn't full. We are at the end of the transmission.
if (bytesToRead < _port.DataBits)
{
//Console.WriteLine(string.Format("Final Data received: {0}", _portDataReceived));
IScalePropertiesBuilder scaleReading = null;
scaleReading = GenerateConfig(_portDataReceived.ToString());
_portDataReceived.Clear();
if (ObjectReceived != null)
{
ObjectReceived(this, scaleReading);
}
}
}
Your original code was strange because I don't see any way for it to know whether the buffer is empty because your code has emptied it, or because the device hasn't sent anything yet. (This seems to be a fundamental problem in your design: you want to read until you get all the bytes, but only after you have read all the bytes can you figure out how many there should have been.)
Your later code is even stranger, because DataBits is the serial configuration for the number of bits per byte (between 5 and 8 inclusive); only in RS-232 can a byte be less than 8 bits.
That said, I have been seeing very strange behavior around BytesToRead. It appears to me that it is almost completely unreliable and must get updated only after it would be useful. There is a note on MSDN about it being inconsistent, but it doesn't include the case of it being inexplicably 0, which I have seen as well.
Perhaps when you are running under the debugger, things go slowly enough that there are in fact bytes to read, but when you run without the debugger (and therefore with no breakpoints), the loop exits before the device on the serial port has had time to send the data. Most likely ReadExisting reads all the data currently on the port, and the loop then exits immediately because no new data has arrived yet. You could alleviate the problem by putting a small wait (perhaps with Thread.Sleep()) between reading the data and checking BytesToRead. More robustly, though, you should be examining the data you are reading in order to determine when you have in fact received everything you need.
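As a minimal sketch of that workaround, using the _port field from the question (the 50 ms pause is an arbitrary assumption, not a tuned value):

private void PortDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var input = new StringBuilder();
    do
    {
        input.Append(_port.ReadExisting());
        // Give the device a moment to push more bytes before we
        // conclude that the transmission is really over.
        Thread.Sleep(50);
    } while (_port.BytesToRead > 0);
    // parse input.ToString() here
}

This is still a heuristic; a delimiter or length prefix in the data itself is the reliable way to know when a message is complete.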
I have this ReadAllBytes method, which is supposed to read a certain number of bytes from a NetworkStream:
private void ReadAllBytes(byte[] buffer, int length)
{
    if (buffer.Length != length)
        throw new ArgumentException("ReadAllBytes: buffer length must match the requested length");
    Stream stm = m_client.GetStream();
    // Keep reading until the requested number of bytes has arrived;
    // a single Read may return fewer bytes than asked for.
    int offset = 0;
    int remaining = length;
    while (remaining > 0)
    {
        int read = stm.Read(buffer, offset, remaining);
        if (read <= 0)
            throw new EndOfStreamException(
                String.Format("ReadAllBytes: End of stream reached with {0} bytes left to read", remaining));
        remaining -= read;
        offset += read;
    }
}
The thing is, it works most of the time, but sometimes when the program enters this function it never returns and seems to run forever. I found this out using logging. I use it like this:
public TcpClient m_client = new TcpClient();
m_client.Connect(IP,port);
ReadAllBytes(lengthArray, 2);
Can someone help me figure out what the problem is? I think it is related to timeouts, but how can I be sure, and how do I fix it? Could it be related to how I am disposing of this class? I also don't get any exceptions.
That method seems fine. How certain are you that the stream can deliver as many bytes as you think it should? Is the sender guaranteed to send the right amount?
The bug is that not enough data is arriving. A timeout does not fix that bug; it's a way to break the read so that the app does not hang forever if the network goes down (and also so that bugs are not catastrophic). Fix the bug, but keep the timeout as well, just in case.
Also, I don't see that bug occurring when I use the existing C++ project.
This means the bug is in your C# code. It's not a C# thing; Read can't work any other way in principle. What do you think Read is supposed to do when there is no data? It can either wait or fail; there are no other choices. The fact that you have called Read when no data is coming is your bug/problem/fault, not Read's fault.
On introducing a timeout: yes, but I'd prefer to get an exception instead of waiting forever.
Yes, a timeout is always required when it comes to network communication, in case the network goes down (or in case you have a bug and don't want to hang forever). Always have a timeout.
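For what it's worth, NetworkStream supports read timeouts directly, so a sketch along these lines (using the m_client field from the question) avoids the infinite wait:

Stream stm = m_client.GetStream();
stm.ReadTimeout = 5000; // milliseconds; a blocked Read now throws IOException instead of hanging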
I'm embarrassed to have to ask such a question, but I'm having a rough time figuring out how to reliably read data over a serial port with the .NET SerialPort class.
My first approach:
static void Main(string[] args)
{
_port = new SerialPort
{
PortName = portName,
BaudRate = 57600,
DataBits = 8,
Parity = Parity.None,
StopBits = StopBits.One,
RtsEnable = true,
DtrEnable = false,
WriteBufferSize = 2048,
ReadBufferSize = 2048,
ReceivedBytesThreshold = 1,
ReadTimeout = 5000,
};
_port.DataReceived += _port_DataReceived;
_port.Open();
// whatever
}
private void _port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
var buf = new byte[_port.BytesToRead];
var bytesRead = _port.Read(buf, 0, buf.Length);
_port.DiscardInBuffer();
for (int i = 0; i < bytesRead; ++i)
{
// read each byte, look for start/end values,
// signal complete packet event if/when end is found
}
}
So this has an obvious problem; I am calling DiscardInBuffer, so any data which came in after the event was fired is discarded, i.e., I'm dropping data.
Now, the documentation for SerialPort.Read() does not even state if it advances the current position of the stream (really?), but I have found other sources which claim that it does (which makes sense). However, if I do not call DiscardInBuffer I eventually get an RXOver error, i.e., I'm taking too long to process each message and the buffer is overflowing.
So... I'm really not a fan of this interface. If I have to process each buffer on a separate thread I'll do that, but that comes with its own set of problems, and I'm hoping that I am missing something as I don't have much experience with this interface.
Jason makes some good points about reducing UI access from the worker thread, but an even better option is to not receive the data on a worker thread in the first place.
Use port.BaseStream.ReadAsync to get your data, event-driven, on the thread where you want it. I've written more about this approach at http://www.sparxeng.com/blog/software/must-use-net-system-io-ports-serialport
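A minimal sketch of that event-free style (ProcessData is a hypothetical callback; _port is an opened SerialPort):

private async Task ReadLoopAsync()
{
    var buffer = new byte[4096];
    while (true)
    {
        // Completes as soon as at least one byte is available.
        int bytesRead = await _port.BaseStream.ReadAsync(buffer, 0, buffer.Length);
        if (bytesRead > 0)
            ProcessData(buffer, bytesRead); // hypothetical: hand the chunk to your parser
    }
}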
To correctly handle data from a serial port you need to do a couple of things.
First, don't handle the data in your receive event. Copy the data somewhere else and do any processing on another thread. (This is true of most events - it is a bad idea to do any time-consuming processing in an event handler as it delays the caller and can introduce problems. You also need to be careful as your event is raised on a different thread to your main application)
Secondly, you can't guarantee that you will receive exactly one packet, or a complete packet when you receive data - it may come to you in small fragments.
So the upshot of this is that you should create your own buffer (big enough to hold several packets), and when you receive data, append it to your buffer. Then in another thread you can process the buffer, looking to see if you can decode a packet from it and then consume that data. You may have to skip the end of a partial packet before you find the start of a valid one. If you don't have enough data to build a full packet, then you may need to wait for a bit until more data arrives.
You shouldn't call Discard on the port - just read the data and consume it. Each time you are called, there will be another fragment of data to process. It does not remember the data from previous calls - each time your event is called, it is given a small burst of data that has arrived since you were last called. Just use the data you've been given and return.
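A rough sketch of that buffering scheme (the framing details are assumptions; OnDataReceived and the worker thread's scan are placeholders for your own protocol logic):

private readonly object _rxLock = new object();
private readonly List<byte> _rxBuffer = new List<byte>();

private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var chunk = new byte[_port.BytesToRead];
    int n = _port.Read(chunk, 0, chunk.Length);
    lock (_rxLock)
        for (int i = 0; i < n; i++)
            _rxBuffer.Add(chunk[i]);
    // A separate worker thread locks _rxBuffer, scans it for a complete
    // packet (start marker ... end marker), removes what it consumes,
    // and skips any partial packet at the front.
}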
As a last suggestion: Don't change any settings for the port unless you specifically need to for it to operate properly. So you must set the baud rate, data/stop bits and parity, but avoid trying to change properties like the Rts/Dtr, buffer sizes and read thresholds unless you have a good reason to think you know better than the author of the serial port. Most serial devices work in an industry standard manner these days, and changing these low-level options is very likely to cause trouble unless you're talking to some unusual equipment and you intimately know the hardware.
In particular, setting ReceivedBytesThreshold to 1 is probably what is causing the failure you've mentioned, because you are asking the serial port to call your event handler for every single byte. At 57,600 baud with roughly 10 bits per byte on the wire, that is about 5,760 bytes per second, giving your event handler only about 0.17 milliseconds to process each byte before you'll start to get re-entrant calls.
DiscardInBuffer is typically only used immediately after opening a serial port. It is not required for standard serial port communication so you should not have it in your dataReceived handler.
I am writing an application that receives input from a long-range radio via a serial connection. I am currently using the C# SerialPort class to receive and send data, but I am having a few issues. I've noticed that the function I have for receiving data is not correctly setting the size of the buffer byte array. All data is sent as raw bytes (hex).
Say the other node sends 103 bytes of data. Stepping through my code with a breakpoint at the Read() line, I see that serialPort1.BytesToRead - 1 evaluates to 103, BUT the byte[] array is only initialized to 17. I have no explanation for this behavior. As a result, only the first 17 bytes are put into the array. Continuing the step-through, the same event is triggered again, this time with serialPort1.BytesToRead - 1 evaluating to 85 (presumably because only the first 17 of the 103 bytes were read).
If I hardcode the array size at 103, it works flawlessly in one pass. However, at the moment I am unable to store all the data in my byte array in one pass, which is causing a lot of problems. Does anyone know why my byte array is being initialized to such an arbitrary size?
private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
byte[] data = new byte[serialPort1.BytesToRead - 1];
serialPort1.Read(data, 0, data.Length);
DisplayData(ByteToHex(data) /*+ "\n"*/);
}
Update: here's the new method I am attempting. isHeader is a boolean initially set to true (the first two bytes received in each packet are the length of the packet).
const int NUM_HEADER_BYTES = 2;
private void serialPort1_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
byte[] headerdata = new byte[2];
if (isHeader)
{
serialPort1.Read(headerdata, 0, NUM_HEADER_BYTES);
int totalSize = (headerdata[0] << 8 | headerdata[1]) >> 6;
serialPort1.ReceivedBytesThreshold = totalSize - NUM_HEADER_BYTES;
data = new byte[totalSize - NUM_HEADER_BYTES];
isHeader = false;
}
else
{
serialPort1.Read(data, 0, data.Length);
double[][] results = ParseData(data, data.Length);
serialPort1.ReceivedBytesThreshold = NUM_HEADER_BYTES;
isHeader = true;
DisplayData(ByteToHex(data) /*+ "\n"*/);
}
}
BytesToRead is equal to the number of bytes waiting in the buffer. It changes from moment to moment as new data arrives, and that's what you're seeing here.
When you step through with the debugger, that takes additional time, and the rest of the serial data comes in while you're stepping in the debugger, and so BytesToRead changes to the full value of 104.
If you know that you need 103 bytes, I believe setting ReceivedBytesThreshold to 104 will trigger the DataReceived event at the proper time. If you don't know the size of the message you need to receive, you'll need to do something else. I notice you're throwing away one byte (serialPort1.BytesToRead - 1); is that an end-of-message byte that you could search for as you read the data?
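For illustration, using the numbers from the question (a sketch; 104 is the 103-byte message plus the trailing byte):

serialPort1.ReceivedBytesThreshold = 104; // don't raise DataReceived until the full message is buffered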
Debuggers won't deliver when it comes to real time data transfer. Use debug traces.
BTW, I'd go with polling for data myself, rather than putting my trust in events. With serial ports this is a sound and reliable approach.
Edit per comment:
Serial data transfer rate is bounded by your baud-rate.
You're worried about losing data, so let's look at the numbers:
Assuming:
baud_rate = 19600 [bytes/sec] // it's usually *bits*, but we want to err upwards
buffer_size = 4096 [bytes] // windows allocated default buffer size
So it takes:
4096/19600 [sec] ~ 200 [ms]
To overflow the buffer (upper bound).
So if you sample at 50 Hz, you're running on an order of magnitude safety net, and that's a good spot. On each 'sample' you read the whole buffer. There is no timing issue here.
Of course you should adapt the numbers to your case, but I'll be surprised if your low-bandwidth RF channel results in a transfer rate for which 50 Hz polling won't be sufficient overkill.
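A polling loop along those lines might look like this (a sketch; the 20 ms period, i.e. 50 Hz, follows from the numbers above, and HandleChunk is a hypothetical parser entry point):

var pollTimer = new System.Timers.Timer(20); // ~50 Hz
pollTimer.Elapsed += (s, e) =>
{
    int available = _port.BytesToRead;
    if (available > 0)
    {
        var buf = new byte[available];
        int read = _port.Read(buf, 0, buf.Length);
        HandleChunk(buf, read); // hypothetical: hand the chunk to your parser
    }
};
pollTimer.Start();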
LAST EDIT:
Needless to say, if what you currently have works, then don't touch it.
I am trying to interface an ancient network camera to my computer and I am stuck at a very fundamental problem -- detecting the end of stream.
I am using TcpClient to communicate with the camera and I can actually see it transmitting the command data, no problems here.
List<int> incoming = new List<int>();
TcpClient clientSocket = new TcpClient();
clientSocket.Connect(txtHost.Text, Int32.Parse(txtPort.Text));
NetworkStream serverStream = clientSocket.GetStream();
serverStream.Flush();
byte[] command = System.Text.Encoding.ASCII.GetBytes("i640*480M");
serverStream.Write(command, 0, command.Length);
Reading back the response is where the problem begins though. I initially thought something simple like the following bit of code would have worked:
while (serverStream.DataAvailable)
{
incoming.Add(serverStream.ReadByte());
}
But it didn't, so I had a go at another version, this time utilising ReadByte(). The documentation states:
Reads a byte from the stream and advances the position within the stream by one byte, or returns -1 if at the end of the stream.
so I thought I could implement something along the lines of:
Boolean run = true;
int rec;
while (run)
{
rec = serverStream.ReadByte();
if (rec == -1)
{
run = false;
//b = (byte)'X';
}
else
{
incoming.Add(rec);
}
}
Nope, still doesn't work. I can actually see data coming in, and after a certain point (which is not always the same, otherwise I could simply have read that many bytes every time) I start getting 0 as the value for the rest of the elements, and it doesn't halt until I manually stop execution.
So my question is, am I missing something fundamental here? How can I detect the end of the stream?
Many thanks,
H.
What you're missing is how you're thinking of a TCP data stream. It is an open connection, like an open phone line - someone on the other end may or may not be talking (DataAvailable), and just because they paused to take a breath (DataAvailable==false) it doesn't mean they're actually DONE with their current statement. A moment later they could start talking again (DataAvailable==true)
You need to have some kind of defined rules for the communication protocol ABOVE TCP, which is really just a transport layer. So, for instance, perhaps the camera sends a special character sequence when its current image transmission is complete; you would then examine every character received, determine whether that sequence has arrived, and act appropriately.
Well, you can't exactly detect EOS on a network connection (unless the other party drops the connection). Usually the protocol itself contains something to signal that the message is complete (sometimes a newline, for example). So you read the stream into a buffer and extract complete messages by applying those framing rules.
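A sketch of that strategy, assuming (hypothetically) that the camera terminates each message with a newline:

var message = new List<byte>();
int b;
// ReadByte blocks until a byte arrives; -1 means the peer closed the connection.
while ((b = serverStream.ReadByte()) != -1)
{
    if (b == '\n') // hypothetical end-of-message marker
        break;
    message.Add((byte)b);
}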
I'm working on a SerialPort app and one very simple part of it is giving me issues. I simply want to read a constant stream of data from the port and write it out to a binary file as it comes in. The problem seems to be speed: my code has worked fine on my 9600 baud test device, but when carried over to the 115200bps live device, I seem to be losing data. What happens is after a variable period of time, I miss 1 byte which throws off the rest of the data. I've tried a few things:
private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
bwLogger.Write((byte)serialPort1.ReadByte());
}
or
private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
byte[] inc = new byte[serialPort1.BytesToRead];
serialPort1.Read(inc, 0, inc.Length);
bwLogger.Write(inc);
}
and a few variations. I can't use ReadLine() as I am working with a constant stream of data (right?). I've tried fiddling with the buffer size (both serialPort1.ReadBufferSize and the hardware FIFO buffer). Ideally, for usability purposes, I'd handle this on the software side and not make the user have to change Windows driver settings.
Any ideas?
If the problem seems to be that you can't process the data fast enough, what you could try would be to double-buffer your data.
1) Allow one thread to read the serial port into one buffer. This may involve copying data off the port into the buffer (I'm not intimately familiar with .NET).
2) When you are ready to handle the incoming data, (on a different thread) make your program read into the 2nd buffer, and while this is happening you should write the first buffer to disk.
3) When the first buffer is written to disk, swap it back to the serial port buffer, and write the 2nd buffer to disk. Repeat process, continually swapping the buffers.
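In C#, the swap might look roughly like this (a sketch, not a drop-in implementation; logStream is a hypothetical FileStream):

private MemoryStream _fillBuffer = new MemoryStream();
private MemoryStream _writeBuffer = new MemoryStream();
private readonly object _swapLock = new object();

private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    var chunk = new byte[serialPort1.BytesToRead];
    int n = serialPort1.Read(chunk, 0, chunk.Length);
    lock (_swapLock)
        _fillBuffer.Write(chunk, 0, n); // the port thread only ever touches the fill buffer
}

private void FlushToDisk() // called periodically from a second thread
{
    lock (_swapLock)
    {
        var tmp = _fillBuffer; // swap the buffers under the lock
        _fillBuffer = _writeBuffer;
        _writeBuffer = tmp;
    }
    _writeBuffer.WriteTo(logStream); // hypothetical FileStream; happens outside the lock
    _writeBuffer.SetLength(0);
}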
You might try enabling handshaking, using the Handshake property of the SerialPort object.
You'll have to set it on both the sender and the receiver, however; if you're overflowing the receiver's UART buffer (very small, 16 bytes IIRC), there's probably no other way. If you can't enable handshaking on the sender, you'll probably have to stay at 9600 baud or below.
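Enabling it is one property on each end (whether your device expects hardware RTS/CTS or software XOn/XOff flow control is something to check against its documentation):

serialPort1.Handshake = Handshake.RequestToSend; // hardware (RTS/CTS) flow control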
I'd try the following:
Set the read buffer size to at least 230 KB
Set the incoming threshold (ReceivedBytesThreshold) to 16K, 32K, or 65K
Write these fixed blocks of data to the file
I'm not sure if this will help, but it should at least take the pressure off the framework from having to fire the event so often.
I would check the number of bytes read which is returned by the Read(Byte[], Int32, Int32) method and make sure it matches what you expect.
Make sure you are listening for SerialErrorReceivedEventHandler ErrorReceived events on the port object. An RXOver error would indicate your buffer is full.
Check the thread safety on your output buffer. If the write to the output buffer is not thread safe, a second write may corrupt the first write.
Is your bwLogger a BinaryWriter class? You might try using it with a BufferedStream to make the disk I/O nonblocking.
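For example (a sketch; the file name and buffer size are arbitrary choices):

var fs = new FileStream("log.bin", FileMode.Create, FileAccess.Write);
// BufferedStream batches the many small writes so the disk rarely blocks the port handler.
var bwLogger = new BinaryWriter(new BufferedStream(fs, 64 * 1024));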
Also, if your packets have a known ending character, you can set the SerialPort.NewLine property to enable you to use ReadLine/WriteLine, although I don't think that would make much of a performance difference.
The machines I've been working with recently all send a stop code (in my case ASCII code 3 or 4). If you also have this feature, you can make use of ReadTo(string) off your SerialPort object.