Does Write send all bytes? (Serial Port) - C#

Does Write method of SerialPort write all bytes it was told to?
I have this code to send data via serial port.
SerialPort port = new SerialPort(
    "COM1", 9600, Parity.None, 8, StopBits.One);
// Open the port for communications
port.Open();
// Write bytes
byte[] bytes = Encoding.ASCII.GetBytes("Hello world from PC");
port.Write(bytes, 0, bytes.Length);
// Close the port
port.Close();
If I send the string "Hello", the device connected to my PC via the serial port receives it fine. But if I send the string "Hello world from PC", it only receives the first 16 bytes instead of 19.
Is there a way I can verify in code that all bytes were sent? Or is it a problem with the hardware connected via the serial port?

Yes, .Write is synchronous, so all data is handed to the driver before it returns. In any case, the 30-second timeout you use would be more than enough even if it weren't.

Normally all bytes are sent. Maybe the problem is on the receiving side.
To be sure that handshaking is turned off, use the following code, though I believe None is already the default:
serialPort.Handshake = Handshake.None;
I suspect the receiving side. The UART on the other side might not be fast enough to get the characters out of its buffer. You could try adding small delays between parts of your message, but only for troubleshooting, of course. Like so:
// Write bytes for test purpose
byte[] bytes = Encoding.ASCII.GetBytes("Hello world from PC");
port.Write(bytes, 0, 10);
Thread.Sleep(100); // small delay
port.Write(bytes, 10, 9); // next block

I had a similar problem.
I write an array of 20 bytes to a serial port which is connected to a wireless transmitter. After sending all bytes I need to switch back to receive mode, so I need to know exactly when all bytes have physically left the serial port.
Unfortunately this is impossible under Windows, since the buffer queries do not return the status of the physical chip's buffer.
The solution for me was to send the data asynchronously with serialPort.BaseStream.WriteAsync and then calculate the transmit time from the number of bytes and the baud rate:
serialPort.BaseStream.WriteAsync(buffer, 0, buffer.Length);
// bytes * 10 bits per byte / baudrate gives seconds; * 1000 converts to milliseconds
int waittime = (int)(buffer.Length * 10 * 1000.0 / baudrate) + some_reserve;
Thread.Sleep(waittime);
the "10" means that one byte consists of 10 bits

Related

Receiving uneven results from Serial Port in C#

I have 2 GUI applications, one in C++ and one in C#.
The applications are the same, and each has a function that writes to and reads from a COM port.
When I run my C++ app, Serial.Read returns the right result: a buffer with 24 bytes.
But when I run my C# app I receive uneven results:
* Just a 1-byte buffer if I don't put a sleep between write and read.
* Buffers of varying sizes (between 10 and 22 bytes) if I do put a sleep between write and read.
What could be the reason for that?
My C++ code:
serial.write(&c, 1, &written);
serial.read(read_buf, read_len, &received); // received = 24
My C# code:
serial.Write(temp_char, 0, 1);
received = serial.Read(read_buff, 0, read_len); // received = 1
C# with sleep:
serial.Write(temp_char, 0, 1);
Thread.Sleep(100);
received = serial.Read(read_buff, 0, read_len); // received = 10~22
Serial ports just give you a stream of bytes; they don't know how you've written blocks of data to them. When you call read, whatever bytes have been received so far are returned; if that isn't a complete message, you need to call read repeatedly until you have the whole message.
You need to define a protocol to indicate message boundaries. This could be a special character (e.g. a newline in a text-based protocol), or you can prefix your messages with a length.
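For example, with a newline-terminated protocol, a minimal read loop could look like this (a sketch, assuming an open SerialPort named serial):
// Accumulate bytes until the terminating '\n' arrives
var message = new StringBuilder();
while (true)
{
    int b = serial.ReadByte();   // blocks until a byte arrives (or ReadTimeout elapses)
    if (b == '\n') break;        // end of message reached
    message.Append((char)b);
}
// message.ToString() now holds one complete message
Note that SerialPort.ReadLine() does essentially this for you when the NewLine property matches your terminator.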

C# virtual serial port TimeoutException on write

I'm having a problem with virtual serial ports in C#: when I call the Write function, it immediately throws a TimeoutException, yet the client still receives the data.
It only happens with virtual ports (I'm using Free Virtual Serial Ports from HDDSoftware, with a bridge COM12<->COM13). I open COM12 with Visual Studio and COM13 with Hercules. The application throws the timeout exception, but Hercules receives the message.
It doesn't matter whether I set 1000 ms or 1000000 ms as the Read/Write port timeout.
Thanks!!
using (SerialPort port = new SerialPort("COM13"))
{
    // configure serial port
    port.BaudRate = 9600;
    port.DataBits = 8;
    port.Parity = Parity.None;
    port.StopBits = StopBits.One;
    port.Open();
    port.ReadTimeout = 10000;

    byte[] buffer = Encoding.ASCII.GetBytes("HELLO WORLD");
    try
    {
        port.Write(buffer, 0, buffer.Length);
    }
    catch (TimeoutException)
    {
        Console.WriteLine("Write timeout");
    }
    Console.WriteLine(DateTime.Now.ToString("HH:mm:ss"));

    try
    {
        byte[] buf = new byte[100];
        port.Read(buf, 0, 1);
    }
    catch (IOException)
    {
        Console.WriteLine("Read timeout");
    }
    Console.WriteLine(DateTime.Now.ToString("HH:mm:ss"));
}
After a few tests (putting the Write into a try-catch), the Read operation also throws a TimeoutException instantly.
This is what I get when I run the test. It is supposed to be:
12:16:06
(Read timeout)
12:16:16
port.Write(buffer, offset, count);
It is up to the device driver to decide how to implement this. But all the ones I know follow the rule that the underlying WriteFile() call is allowed to return *lpNumberOfBytesWritten < nNumberOfBytesToWrite. Or to put it another way, the write is not "transactional".
A decent mental model is that Write() writes one byte from the buffer at a time, count times. At some entirely unpredictable point, writing one more byte will stall because the driver's transmit buffer has filled to capacity and cannot store another byte, eventually triggering the exception.
So part of the buffer will still make it to the other end, but you cannot tell which part from the SerialPort class. A time-out is a gross communication failure that is pretty hard to recover from. If that's a show-stopper, then consider writing one byte at a time (fine, serial ports are slow), or pay attention to WriteBufferSize - BytesToWrite to check whether the data fits, and implement your own timeout.
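A sketch of that second approach, assuming an open SerialPort named port and the message in buffer (the 5-second deadline and 10 ms polling interval are arbitrary choices):
// Wait until the driver's transmit buffer has room for the whole message,
// with a hand-rolled timeout instead of relying on WriteTimeout.
DateTime deadline = DateTime.UtcNow + TimeSpan.FromSeconds(5);
while (port.WriteBufferSize - port.BytesToWrite < buffer.Length)
{
    if (DateTime.UtcNow > deadline)
        throw new TimeoutException("Transmit buffer never drained");
    Thread.Sleep(10);
}
port.Write(buffer, 0, buffer.Length);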
As Hans Passant said, it was a problem with the virtual COM software. I tried Virtual Serial Port from Eltima Software instead and it worked fine!

Incomplete data reading from serial port C#

I'm having a problem reading the data transmitted from the serial port (incomplete data the first time I run the project, every time).
I've tried two methods of reading:
byte[] data = new byte[_serialPort.BytesToRead];
_serialPort.Read(data, 0, data.Length);
txtGateway.Text = System.Text.Encoding.UTF8.GetString(data);
And
txtGateway.Text = _serialPort.ReadExisting();
However, it only reads 14 bytes the first time after I start the program. When I trace the program, _serialPort.BytesToRead reports only 14 on that first read. If I send the data a second time, it is read correctly.
Both methods above give the same result. I'm sure that the sender writes the complete data to the serial port.
Serial ports don't have any message boundaries. If you want framing, you have to add it yourself.
As for only the most recent 14 bytes being available when your program starts: 14 bytes is a typical FIFO size for serial ports.
See also How do you programmatically configure the Serial FIFO Receive and Transmit Buffers in Windows?
Or just flush the receive buffer when the program starts.
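Flushing on startup is a one-liner (a sketch, assuming the _serialPort field from the question):
_serialPort.Open();
_serialPort.DiscardInBuffer(); // drop stale bytes left over in the receive FIFO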
Hi, you can also frame the data with a carriage return \r and the ASCII code \x02 for STX:
string data = _serialPort.ReadExisting();
string[] arr1 = data.Split('\r');                       // one entry per frame
string checkFinal = arr1[0].Replace("\x02", "").Trim(); // strip the STX marker and whitespace from a frame

How to send voice from microphone or play recorded audio from HSDPA dongle C#

I have done all the call-dialling part and it works. Now I need to play recorded sound, or microphone input, through the phone. I just need to know how I can pass the audio to the dongle and send it through the call.
this is my working code for dialling phone number
SerialPort port = new SerialPort();
port.Open();
string t = port.ReadExisting();
Thread.Sleep(100);
string cmd = "ATD";
string phoneNumber = "071********";
port.WriteLine(cmd + phoneNumber + ";\r");
port.Close();
For your modem, you have 3 COM ports available: one for sending AT commands, a second for sending data (voice data), and a third for monitoring asynchronous status changes.
When you place a call (with ATD), you'll have to wait for "^CONN: 1, 0" on the monitoring port.
Then you need a sound source at 8 kHz, 16-bit signed, 1 channel.
Write 320 bytes of this source to the data port every 20 ms (8000 samples/s × 2 bytes × 0.02 s = 320 bytes).
You'll also read 320 bytes of data from this port every 20 ms for the other side speaking.
When you get "^CEND: ...", the call is terminated.
Beware of a few catches, however. You'll get "^CONN: 1, 0" even if the other side rejected your call, so try to read data from the data port first to check whether the other side is still there after a ^CONN message.
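A minimal sketch of the transmit loop, assuming the data port is already open as dataPort and pcm holds the 8 kHz/16-bit/mono samples (both names are made up for illustration):
// Push one 20 ms voice frame (320 bytes) to the data port per iteration
const int FrameSize = 320; // 8000 samples/s * 2 bytes * 0.020 s
for (int offset = 0; offset + FrameSize <= pcm.Length; offset += FrameSize)
{
    dataPort.Write(pcm, offset, FrameSize);
    Thread.Sleep(20); // crude pacing; a real implementation needs a steadier clock
}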

How can I set the buffer size for the underlying UDP Socket? C#

As we know, for UDP receive we use Socket.ReceiveFrom or UdpClient.Receive.
Socket.ReceiveFrom accepts a byte array from you to put the UDP data in.
UdpClient.Receive directly returns a byte array holding the data.
My question is: how do I set the buffer size inside the Socket? I think the OS maintains its own buffer for receiving UDP data, right? E.g., if a UDP packet is sent to my machine, the OS puts it in a buffer and waits for us to call Socket.ReceiveFrom or UdpClient.Receive, right?
How can I change the size of that internal buffer?
I have tried Socket.ReceiveBufferSize; it has no effect at all for UDP, and the documentation clearly says it is for the TCP window. I have also done a lot of experiments that convince me Socket.ReceiveBufferSize is NOT for UDP.
Can anyone share some insights on the internal UDP buffer?
Thanks
I have seen some posts here, e.g.
http://social.msdn.microsoft.com/Forums/en-US/ncl/thread/c80ad765-b10f-4bca-917e-2959c9eb102a
where Dave said that Socket.ReceiveBufferSize can set the internal buffer for UDP. I disagree.
The experiment I did is like this: 27 hosts each send a 10KB UDP packet to me within a LAN at (almost) the same time. I have a while-loop to handle the packets, and for each packet I create a thread to handle it. I used UdpClient or Socket to receive the packets.
I lost about 50% of the packets. I think it is a burst of UDP sends and I can't handle all of them in time.
This is why I want to increase the buffer size for UDP: say, if I change the buffer size to 1MB, then 27 × 10KB = 270KB of data can be held in the buffer, right?
I tried changing Socket.ReceiveBufferSize to many, many values, and it just has no effect at all.
Can anyone help?
I use the .NET UdpClient often and I have always used Socket.ReceiveBufferSize with good results. Internally it calls Socket.SetSocketOption with the ReceiveBuffer parameter. Here is some quick, simple code you can test with:
public static void Main(string[] args)
{
    IPEndPoint remoteEp = null;
    UdpClient client = new UdpClient(4242);
    client.Client.ReceiveBufferSize = 4096;

    Console.Write("Start sending data...");
    client.Receive(ref remoteEp);
    Console.WriteLine("Good");

    Thread.Sleep(5000);
    Console.WriteLine("Stop sending data!");
    Thread.Sleep(1500);

    int count = 0;
    while (true)
    {
        client.Receive(ref remoteEp);
        Console.WriteLine(string.Format("Count: {0}", ++count));
    }
}
Try adjusting the value passed to ReceiveBufferSize. I tested by sending a constant stream of data for the 5 seconds and got 10 packets; after increasing the value 4x, I got 38 packets the next time.
I would also look at other places in your network where you may be dropping packets, especially since you mention in your other post that you are sending 10KB packets. A 10KB datagram is fragmented into MTU-sized packets when sent (with a 1500-byte Ethernet MTU, that is roughly seven fragments); if any one fragment in the series is dropped, the entire datagram is dropped.
The issue with setting ReceiveBufferSize is that you need to set it directly after creating the UdpClient object. I had the same issue with my changes not being reflected when reading the value of ReceiveBufferSize back.
UdpClient client = new UdpClient();
// no code in between these two lines accessing client
client.Client.ReceiveBufferSize = somevalue;
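To check whether the setting actually took effect, read the property back immediately (a sketch; the 4 MB figure is just an example, and the OS may clamp what it grants):
UdpClient client = new UdpClient(4242);
client.Client.ReceiveBufferSize = 4 * 1024 * 1024; // request 4 MB
// Read it back: if the OS clamped the request, this shows the size actually granted
Console.WriteLine("Effective receive buffer: " + client.Client.ReceiveBufferSize);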
