How to read bytes from SerialPort.BaseStream without Length - c#

I want to use the stream class to read/write data to/from a serial port. I use BaseStream to get the stream (link below), but the Length property doesn't work. Does anyone know how I can read the full buffer without knowing how many bytes there are?
http://msdn.microsoft.com/en-us/library/system.io.ports.serialport.basestream.aspx

You can't. That is, you can't guarantee that you've received everything if all you have is the BaseStream.
There are two ways you can know if you've received everything:
Send a length word as the first 2 or 4 bytes of the packet. That says how many bytes will follow. Your reader then reads that length word, reads that many bytes, and knows it's done (see the example below).
Agree on a record separator. That works great for text. For example, you might decide that a null byte or an end-of-line character signals the end of the data. This is somewhat more difficult to do with arbitrary binary data, but possible.
Or, depending on your application, you can do some kind of timing. That is, if you haven't received anything new for X seconds (or milliseconds), you assume that you've received everything. That has the obvious drawback of not working well if the sender is especially slow.
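A minimal sketch of the length-prefix approach, assuming the sender writes a 4-byte little-endian length followed by the payload (the method names here are made up for illustration):

using System;
using System.IO;

static byte[] ReadMessage(Stream stream)
{
    // Read the 4-byte length word first, then exactly that many payload bytes.
    byte[] lengthBytes = ReadExactly(stream, 4);
    int length = BitConverter.ToInt32(lengthBytes, 0);
    return ReadExactly(stream, length);
}

static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        // Read may return fewer bytes than asked for, so loop until the count is met.
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Stream closed before the full message arrived.");
        offset += read;
    }
    return buffer;
}

You would call it on the port's stream, e.g. byte[] payload = ReadMessage(serialPort.BaseStream); where serialPort is your opened SerialPort.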

Maybe you can try the SerialPort.BytesToRead property.
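For example (a sketch; port is assumed to be an open SerialPort), keeping in mind that BytesToRead only tells you how much has arrived so far, not whether the message is complete:

if (port.BytesToRead > 0)
{
    // Drain whatever has arrived so far; more may still be on the way.
    byte[] chunk = new byte[port.BytesToRead];
    int read = port.Read(chunk, 0, chunk.Length);
    // Append the 'read' bytes to your own buffer and keep reading until
    // your protocol says the message is complete.
}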

Related

NetworkStream.Length substitute

I am using a networkstream to pass short strings around the network.
Now, on the receiving side I have encountered an issue:
Normally I would do the reading like this
see if data is available at all
get count of data available
read that many bytes into a buffer
convert buffer content to string.
Assuming all the methods involved work as intended, that would look something like this:
NetworkStream stream = someTcpClient.GetStream();
while (!stream.DataAvailable)
    ;   // spin until data is available
byte[] bufferByte = new byte[(int)stream.Length];
stream.Read(bufferByte, 0, (int)stream.Length);
ASCIIEncoding enc = new ASCIIEncoding();
string result = enc.GetString(bufferByte);
However, MSDN says that NetworkStream.Length is not really implemented and will always throw an Exception when called.
Since the incoming data are of varying length I cannot hard-code the count of bytes to expect (which would also be a case of the magic-number antipattern).
Question:
If I cannot get an accurate count of the number of bytes available for reading, then how can I read from the stream properly, without risking all sorts of exceptions within NetworkStream.Read?
EDIT:
Although the provided answer leads to better overall code, I still want to share another option that I came across:
TcpClient.Available gives the number of bytes available to read. I knew there had to be a way to count the bytes in one's own inbox.
There's no guarantee that calls to Read on one side of the connection will match up 1-1 with calls to Write from the other side. If you're dealing with variable length messages, it's up to you to provide the receiving side with this information.
One common way to do this is to first work out the length of the message you're going to send and then send that length information first. On the receiving side, you then obtain the length first and then you know how big a buffer to allocate. You then call Read in a loop until you've read the correct number of bytes. Note that, in your original code, you're currently ignoring the return value from Read, which tells you how many bytes were actually read. In a single call and return, this could be as low as 1, even if you're asking for more than 1 byte.
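A hedged sketch of that length-prefix approach (the helper names are made up for illustration); the receive side loops on Read until the expected number of bytes has arrived:

using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

static void SendMessage(NetworkStream stream, string message)
{
    byte[] payload = Encoding.ASCII.GetBytes(message);
    byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);   // 4-byte length first
    stream.Write(lengthPrefix, 0, lengthPrefix.Length);
    stream.Write(payload, 0, payload.Length);
}

static string ReceiveMessage(NetworkStream stream)
{
    byte[] lengthPrefix = new byte[4];
    ReadFully(stream, lengthPrefix);                // read the length word
    int length = BitConverter.ToInt32(lengthPrefix, 0);

    byte[] payload = new byte[length];
    ReadFully(stream, payload);                     // read exactly that many bytes
    return Encoding.ASCII.GetString(payload);
}

static void ReadFully(Stream stream, byte[] buffer)
{
    int offset = 0;
    while (offset < buffer.Length)
    {
        // Read's return value is the number of bytes actually read; it can be
        // as low as 1, so keep looping until the buffer is full.
        int read = stream.Read(buffer, offset, buffer.Length - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed mid-message.");
        offset += read;
    }
}

Note that BitConverter assumes both sides agree on endianness; for cross-platform work you may want to fix the byte order explicitly.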
Another common way is to decide on message "formats" - where e.g. message number 1 is always 32 bytes in length and has X structure, and message number 2 is 51 bytes in length and has Y structure. With this approach, rather than you sending the message length before sending the message, you send the format information instead - first you send "here comes a message of type 1" and then you send the message.
A further common way, if applicable, is to use some form of sentinel - if your messages will never contain, say, a byte with value 0xff, then you scan the received bytes until you've received an 0xff byte, and everything before that byte was the message you wanted to receive.
But, whatever you want to do, whether it's one of the above approaches or something else, it's up to you to have your sending and receiving sides work together to allow the receiver to discover each message.
I forgot to say, but a further way to change everything around is this - if you want to exchange messages and don't want to do any of the above fiddling, then switch to something that works at a higher level - e.g. WCF, or HTTP, or something else - where those systems already take care of message framing and you can then just concentrate on what to do with your messages.
You could use a StreamReader to read the stream to the end:
var streamReader = new StreamReader(someTcpClient.GetStream(), Encoding.ASCII);
string result = streamReader.ReadToEnd();

Weird behavior with a BinaryReader

I have a socket-based application that exposes received data with a BinaryReader object on the client side. I've been trying to debug an issue where the data contained in the reader is not clean... i.e. the buffer that I'm reading contains old data past the size of the new data.
In the code below:
System.Diagnostics.Debug.WriteLine("Stream length: {0}", _binaryReader.BaseStream.Length);
byte[] buffer = _binaryReader.ReadBytes((int)_binaryReader.BaseStream.Length);
When I comment out the first line, the data doesn't end up being dirty (or, doesn't end up being dirty as regularly) as when I have that print line statement. As far as I can tell, from the server side the data is coming in cleanly, so it's possible that my socket implementation has some issues. But does anyone have any idea why adding that print line would cause the data to be dirty more often?
Your binary reader looks like it is a private member variable (if the leading underscore is a tell-tale sign).
Is your application multithreaded? You could be experiencing a race condition if another thread is also attempting to use your BinaryReader while you are reading from it. The fact that you experience issues even without that line seems quite suspect to me.
Are you sure that your reading logic is correct? Stream.Length indicates the length of the entire stream, not of the remaining data to be read.
Suppose that, initially, 100 bytes were available. Length is 100, and the BinaryReader correctly reads 100 bytes and advances the stream position by 100. Then another 20 bytes arrive. Length is now 120; however, your BinaryReader should only be reading 20 bytes, not 120. The 'extra' 100 bytes requested in the second read would either cause it to block or (if the stream is not implemented correctly) break.
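In other words (a sketch, assuming the stream behind _binaryReader really does report Length and Position), read only the newly arrived bytes:

// Only the delta between Length and the current Position is new data.
long remaining = _binaryReader.BaseStream.Length - _binaryReader.BaseStream.Position;
if (remaining > 0)
{
    byte[] buffer = _binaryReader.ReadBytes((int)remaining);
    // process 'buffer'...
}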
The problem was silly and unrelated. I believe my reading logic above is correct, however. The issue was that the _binaryReader I was using was a reference that was not owned by my class and hence the underlying stream was being rewritten with bad data.

Split message in serial communication

I am new to serial communication. I have read a fair few tutorials, and most of what I am trying to do is working. However, I have a question regarding serial communication with C#. I have a microcontroller that is constantly sending data through a serial line. The data is in this format:
bxxxxixx.xx,xx.xx*
where the x's represent digits and + or - signs.
At certain times I want to read this information from my C# program on my PC. The problem that I am having is that my messages seem to be split at random positions even though I am using
ReadTo("*");
I assumed this would read everything up to the * character.
How can I make sure that the message I received is complete?
Thank you for your help.
public string receiveCommandHC()
{
    string messageHC = "";
    if (serialHC.IsOpen)
    {
        serialHC.DiscardInBuffer();
        messageHC = serialHC.ReadTo("*");
    }
    return messageHC;
}
I'm not sure why you're doing it, but you're discarding everything in the serial port's input buffer just before reading, so if the computer has already received "bxxx" at that point, you throw it away and you'll only be reading "xixx.xx,xx.xx".
You'll nearly always find in serial comms that data messages (unless very small) are split. This is mostly down to the speed of communication and the point at which you retrieve data from the port.
Usually you'd set your code to run in a separate thread (to help prevent impacting the performance of the rest of your code) which raises an event when a complete message is received and also takes full messages in for transmission. Read and write functionality is dealt with by worker threads (serial comms traffic is slow).
You'll need a read and a write buffer. These should be suitably large to hold data for several cycles.
Append data read from the port to the end of your read buffer, and cyclically scan the buffer from the start for complete messages.
Depending on the protocol used there is usually a data start indicator, maybe a data end indicator, and somewhere a message size (this may be fixed, again depending on your protocol). I gather from your protocol that the message start character is 'b' and the message end character is '*'. Discard all data preceding your message start character ('b'), as this is from an incomplete message.
When a complete message is found, strip it from the front of the buffer and raise an event to indicate its arrival.
A similar process is run for sending data, except that you may need to split the message, hence data to be sent is appended to the end of the buffer and data being sent is read from the start.
I hope that this helps you in understanding how to cope with serial comms.
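As a rough sketch of that buffering idea for the 'b ... *' frames from the question (serialHC is assumed to be the question's SerialPort, and OnMessageReceived is a placeholder for whatever event or callback you raise):

// Requires: using System.Text; and using System.IO.Ports;
private readonly StringBuilder readBuffer = new StringBuilder();

private void serialHC_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    // Append whatever has arrived so far, then look for complete frames.
    readBuffer.Append(serialHC.ReadExisting());
    ExtractMessages();
}

private void ExtractMessages()
{
    while (true)
    {
        string text = readBuffer.ToString();
        int start = text.IndexOf('b');
        if (start < 0)
        {
            readBuffer.Clear();          // no start marker yet; discard the partial junk
            return;
        }
        int end = text.IndexOf('*', start);
        if (end < 0)
            return;                      // frame not complete yet; wait for more data

        string message = text.Substring(start, end - start + 1);
        readBuffer.Remove(0, end + 1);   // strip the frame from the front of the buffer
        OnMessageReceived(message);      // hand the complete message to the rest of the app
    }
}

You would subscribe the handler with serialHC.DataReceived += serialHC_DataReceived; after opening the port.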
As pointed out by Marc you're currently clearing your buffer in a way that will cause problems.
edit
As I said in my comment, I don't recognise serialHC, but if you're dealing with raw data then look at using the SerialPort class. More information on how to use it and an example (which roughly follows the process I described above) can be found here.
I'm going to take a guess that you're actually getting the ends of commands, i.e. instead of getting b01234.56.78.9 (omitting the final *), you're getting (say) .56.78.9. That is because you discarded the input buffer. Don't do that. You can't know the state at that point (just before a read), so discarding the buffer is simply wrong. Remove the serialHC.DiscardInBuffer(); line.
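So the method from the question becomes, roughly:

public string receiveCommandHC()
{
    string messageHC = "";
    if (serialHC.IsOpen)
    {
        // No DiscardInBuffer() here: the buffer may already hold the
        // start of the next message.
        messageHC = serialHC.ReadTo("*");
    }
    return messageHC;
}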

How many bits does BinaryReader.PeekChar() read?

I am working on improving a stream reader class that uses a BinaryReader. It consists of a while loop that uses .PeekChar() to check if more data exists to continue processing.
The very first operation is a .ReadInt32() which reads 4 bytes. What if PeekChar only "saw" one byte (or one bit)? This doesn't seem like a reliable way of checking for EOF.
The BinaryReader is constructed using its default parameters which, as I understand it, use UTF8 as the default encoding. I assume that .PeekChar() checks for 8 bits but I really am not sure.
How many bits does .PeekChar() look for? (and what are some alternate methods to checking for EOF?)
In the documentation for BinaryReader.PeekChar I read:
ArgumentException: The current character cannot be decoded into the internal character buffer by using the Encoding selected for the stream.
This makes it clear that how much is read depends on the Encoding applied to that stream.
EDIT
Actually, the definition according to MSDN is:
Returns the next available character and does not advance the byte or character position.
In fact, it depends on the encoding whether this is one byte or more...
Hope this helps.
Making your Read*() calls blindly and handling any exceptions that are thrown is the normal method. I don't believe that the stream position is moved if anything goes wrong.
The PeekChar() method of BinaryReader is very buggy. Even when trying to read from a memory stream with UTF8-encoded data, PeekChar() throws an exception after reading a particular length of the stream. The BCL team has acknowledged the issue, but they have not committed to resolving it. Their only response is to avoid using PeekChar() if you can.
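If your underlying stream happens to be seekable (a FileStream or MemoryStream, say), one alternative to PeekChar() is to compare Position against Length yourself; a sketch, with reader as an assumed BinaryReader:

while (reader.BaseStream.Position < reader.BaseStream.Length)
{
    // ReadInt32 consumes 4 bytes; no character decoding is involved.
    int value = reader.ReadInt32();
    // ... process value ...
}

This does not work for non-seekable streams such as NetworkStream, where Length throws.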

Send serialised object via socket

What's the best way to format a message to a server? At the moment I'm serialising an object using the BinaryFormatter and then sending it to the server.
At the server end it's listening in an async fashion, and when the buffer received is not 100% full it assumes that the transfer is complete.
This is working at the moment, and I can deserialise the object at the other end. I'm just concerned that if I start sending asynchronously this method will fail, as messages could be blurred together.
I know that I need to mark the messages somehow, so as to say that's the end of message one and this other bit belongs to message two, but I'm unsure of the correct way to do this.
Could anyone point me in the right direction and maybe give me some examples?
Thanks
You could always serialize it to a memory stream, see how big it is, send the length as a 4-byte binary number then send the stream's contents.
On the other side you can just sit and wait for 4 bytes, combine them into an integer then sit and wait for that number of bytes.
When your read function returns (make them blocking reads), you know you have the entire object in a buffer, so you just deserialize it and cast it to your common interface type.
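A sketch of that flow, using the BinaryFormatter the question mentions (the method names are just for illustration):

using System;
using System.IO;
using System.Net.Sockets;
using System.Runtime.Serialization.Formatters.Binary;

static void SendObject(NetworkStream stream, object message)
{
    var formatter = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        formatter.Serialize(ms, message);                 // serialise to memory first
        byte[] payload = ms.ToArray();
        byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
        stream.Write(lengthPrefix, 0, 4);                 // 4-byte length, then the payload
        stream.Write(payload, 0, payload.Length);
    }
}

static object ReceiveObject(NetworkStream stream)
{
    byte[] lengthPrefix = ReadExactly(stream, 4);         // blocking read of the length word
    int length = BitConverter.ToInt32(lengthPrefix, 0);

    byte[] payload = ReadExactly(stream, length);         // blocking read of the whole object
    using (var ms = new MemoryStream(payload))
        return new BinaryFormatter().Deserialize(ms);     // cast the result to your interface type
}

static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    for (int offset = 0; offset < count; )
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0) throw new EndOfStreamException();
        offset += read;
    }
    return buffer;
}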
edit: this is the answer to your specific question. That being said, you would be better off using a higher-level library than raw TCP.
If your object has a fixed length, you can receive that exact number of bytes on the other end and then create your object.
Otherwise you can send delimiters (a symbol or sequence of symbols that you are not using inside your objects) between your objects and keep reading the received data byte by byte until you see the delimiter.
You might want to take a look at Protocol Buffers (http://code.google.com/p/protobuf), a (implementation language independent) data interchange format/framework. There exist at least two .NET implementations for it (see http://code.google.com/p/protobuf/wiki/ThirdPartyAddOns).
