I'm currently working on an asynchronous TCP client. I am able to send and receive messages, but the following code is driving me crazy at the moment:
int rx = theSockId.thisSocket.EndReceive(asyn);
char[] rcvd = new char[rx + 1];
System.Text.Decoder d = System.Text.Encoding.ASCII.GetDecoder();
int charLen = d.GetChars(theSockId.dataBuffer, 0, rx, rcvd, 0);
// Build the string from charLen characters only, so the spare array slot
// does not end up as a trailing '\0' in the string.
System.String szData = new System.String(rcvd, 0, charLen);
Normally everything works fine, but as soon as a message starts with a dollar sign ($), I only see that one character.
I have been searching for a long time, but I couldn't find any solution.
Receive can complete when any amount of data has arrived at the socket - not necessarily a whole "message". You have to buffer the received data until a whole message (as defined by your protocol) has been received.
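A minimal sketch of that buffering, reusing the names from the question and assuming a '\n'-terminated protocol (the SocketPacket state class and HandleMessage are placeholders; adapt the delimiter to whatever your protocol defines, and note this needs System.Text):

// Accumulate partial receives; only hand over complete messages.
private readonly StringBuilder pending = new StringBuilder();

private void OnDataReceived(IAsyncResult asyn)
{
    SocketPacket theSockId = (SocketPacket)asyn.AsyncState;
    int rx = theSockId.thisSocket.EndReceive(asyn);

    // Append whatever arrived; it may be half a message or several.
    pending.Append(Encoding.ASCII.GetString(theSockId.dataBuffer, 0, rx));

    // Extract every complete, delimiter-terminated message.
    int nl;
    while ((nl = pending.ToString().IndexOf('\n')) >= 0)
    {
        string message = pending.ToString(0, nl);
        pending.Remove(0, nl + 1);
        HandleMessage(message); // placeholder for your protocol handling
    }
}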
I seem to have an issue receiving correct bytes from a PC on a PIC18F27J53.
The PIC UART is set up as standard: asynchronous, 8 bits, 9600 baud, no parity.
The PC is a Windows 10 machine. I have made a simple UART program that sends a few ints, all separated by commas, like the following:
"123,24,65,98,12,45,564,987,321,0,5.9,87,65,789,123,6554,213,8754\n"
I have tried different approaches:
Sending each char one by one; however, the PIC seems to get stuck midway or early in the transfer, and the RX flag doesn't go high anymore.
Sending each int followed by "\n", with the PIC parsing the chars and cutting the read after a "\n" is found. This works better, and I can get more data in, but the final received data is corrupted: some ints are wrong, etc.
This clearly looks like a sync issue; it seems the PC is too fast for the PIC.
If so, I could look at a synchronous UART; however, according to the web, that is far from the usual choice, which makes me think I must have another issue to resolve in asynchronous mode.
My question: what is the most popular, robust way to do full-duplex PIC-to-PC UART communication?
Here are my PIC receive APIs, fairly standard and simple (I think).
void int_receive_data(void)
{
    char input_element[10] = { 0 };
    char full_rx[128] = { 0 };

    // Read the comma-separated elements one by one and rebuild the frame.
    for (int i = 0; i < 22; i++) {
        p18f47j53_uart2_read_text(input_element, sizeof(input_element));
        strncat(full_rx, input_element, strlen(input_element));
        strncat(full_rx, ",", 1);
    }
}

void p18f47j53_uart2_read_text(char *output, uint8_t max_length)
{
    uint8_t c;
    char buffer[64] = { 0 };

    for (uint8_t i = 0; i < max_length; i++) {
        c = p18f47j53_uart2_receive_u8();
        buffer[i] = c;
        if (c == '\n') {                    // 10 is '\n', one test is enough
            buffer[i] = 0;
            memcpy(output, buffer, i + 1);  // copy the terminating '\0' too
            break;
        }
    }
}
uint8_t p18f47j53_uart2_receive_u8(void)
{
    // wait for the receive flag
    while (!PIR3bits.RC2IF);

    // reset the receiver on an overrun error
    if (RCSTA2bits.OERR) {
        RCSTA2bits.CREN = 0;
        RCSTA2bits.CREN = 1;
        return PIC_RC_FAIL;
    }

    // reset the port on a framing error
    if (RCSTA2bits.FERR) {
        RCSTA2bits.SPEN = 0;
        RCSTA2bits.SPEN = 1;
        return PIC_RC_FAIL;
    }

    return RCREG2;
}
On the PC (C#) side, my sending code looks like this:
string[] full_separated = full_tx.Split(',');
foreach (string s in full_separated)
my_port.WriteLine(s);
The PIC is running from its internal 8 MHz clock.
I never tried the synchronous way, as it seems more complicated and 99 percent of web results show the asynchronous way, which makes me think I had better debug what I am doing.
Any ideas or advice? Thanks.
Well, not really a solution, but an alternative: break the frame into small chunks, and if possible have the receiver ACK each chunk with a character to tell the transmitter to go ahead with the next one (see the sketch after this answer).
The reason I am saying that: I have a mikroE dev board with a similar PIC, and while running an "out of the box" example and sending
"111,222,333,444,555,666,777,888,999"
it looks like the "999" creates issues. Too many bytes; maybe a buffer issue, or maybe the imperfect baud-rate match accumulates error after a few bytes?
Repeating the send every 50 ms, 500 ms or 1000 ms doesn't make it better, and changing the baud rate doesn't either.
Only after removing ",999" does it seem to work all right.
Without the ",999" I am guessing it is still on the edge of working, so maybe remove "666,777,888,999" entirely and the communication should feel more comfortable.
More code, more traffic, but at least it works.
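Here is a minimal sketch of that chunk-and-ACK idea on the PC side. The chunk size, the 'A' ACK byte and the SendInChunks helper are my own assumptions, not anything from the original setup; the PIC firmware would have to send the matching ACK after draining each chunk. (Requires System.IO, System.IO.Ports and System.Text.)

// Hypothetical chunked sender: write a small chunk, then block until the
// PIC acknowledges it before sending the next one.
const int ChunkSize = 8;        // assumed; tune to the PIC's buffer size
const byte AckByte = (byte)'A'; // assumed; must match the PIC firmware

void SendInChunks(SerialPort my_port, string full_tx)
{
    byte[] payload = Encoding.ASCII.GetBytes(full_tx);
    for (int offset = 0; offset < payload.Length; offset += ChunkSize)
    {
        int len = Math.Min(ChunkSize, payload.Length - offset);
        my_port.Write(payload, offset, len);

        // Wait for the PIC to confirm it has consumed the chunk.
        int ack = my_port.ReadByte(); // blocks until one byte arrives
        if (ack != AckByte)
            throw new IOException("unexpected ACK value: " + ack);
    }
}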
I am trying to run a treadmill using the serial port. I was able to do it using MATLAB; however, I am having a few problems now that I have ported the same code to C#. I am sure that the port is open, so there is probably something wrong with the message format. It would be great if someone could tell me what mistake I am making. The MATLAB code (which works) and the C# code (which doesn't work) are given below.
MATLAB CODE:
ctr = char(12); %control character
rel = char(169); %release
set_speed = char(163);
set_dir = char (184);
%initializing the ports
R = serial('COM4');
set(R, 'BaudRate', 4800, 'Parity', 'none', 'DataBits', 8, 'StopBits', 1, 'Terminator', 'CR');
set(R, 'InputBufferSize', 128, 'OutputBufferSize', 128);
fopen(R);
if R.status == 'open'
fprintf(R, [rel ctr]);
disp('port for R belt open and released');
else
disp('error with R port-- COM4');
end;
%initial direction to FORWARD
fprintf(R, [set_dir '0' char(12)]);
%set speed to 0005
fprintf(R, [set_speed '0005' ctr]);
My C# version of the MATLAB code above:
char ctr = (char)12; //control character
char rel = (char)169; //release
char set_speed = (char)163;
char set_dir = (char)184;

void Start () {
    try {
        SerialPort R = new SerialPort();
        R.PortName = "COM4"; // the port name must be set (COM4, as in the MATLAB code)
        R.BaudRate = 4800;
        R.Parity = Parity.None;
        R.DataBits = 8;
        R.StopBits = StopBits.One;
        R.ReadBufferSize = 128;
        R.WriteBufferSize = 128;
        R.Open();
        if (R.IsOpen) {
            //Release
            R.Write(rel + "" + ctr);
            print ("Serial port is open");
        }
        else print ("Serial port is closed");
        R.Write(set_dir + "" + "0" + "" + ctr);
        R.Write(set_speed + "" + "0005" + "" + ctr);
    }
    catch (UnityException e) {
        print ("Exception");
        print (e);
    }
}
I'm not really familiar with C#, but I'll guess that you should also send the terminator character in your C# code.
Check the fprintf (serial) documentation in MATLAB:
fprintf(obj,'cmd') writes the string cmd to the device connected to the serial port object, obj. The default format is %s\n. The write operation is synchronous and blocks the command-line until execution completes.
fprintf(obj,'format','cmd') writes the string using the format specified by format.
In your calls you are using the first syntax, so your call
fprintf(R, [rel ctr]);
is actually
fprintf(R, '%s\n', [rel ctr]);
Usually, serial devices read input data until the terminator character is found. That signals that transmission of the command string or data is complete and the device can now execute the command. This is much like hitting ENTER in the MATLAB command window: the command is executed only after you do so.
Which terminator character to use should be specified in your device's programming manual.
CR seems to be correct, since your MATLAB code works.
In your MATLAB code you set the terminator to the CR character (ASCII code 13). I do not see this in your C# code, so your device waits for a CR which is never sent, and there is no reaction from the device.
I do not think that C# will send the terminator character for you; you have to take care of it yourself.
My guess is that
R.Write(rel + "" + ctr + "\r");
should solve the problem (thanks dodald for reminding me that I missed the proper conclusion).
See also the Terminator property of the SERIAL object and Rules for Writing the Terminator.
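To apply the same fix to all three commands, a small helper could append the CR terminator everywhere. This is just a sketch; WriteCommand is a hypothetical name, and the variables come from the question's code:

// Append the CR terminator (ASCII 13) to every command, mirroring
// MATLAB's Terminator = 'CR' setting.
void WriteCommand(SerialPort port, string command)
{
    port.Write(command + "\r");
}

// Usage, mirroring the three MATLAB fprintf calls:
WriteCommand(R, rel + "" + ctr);           // release
WriteCommand(R, set_dir + "0" + ctr);      // initial direction FORWARD
WriteCommand(R, set_speed + "0005" + ctr); // set speed to 0005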
I have a client-server app which sends data from a C# client to a C++ server. When the server receives a data request, 9 times out of 10 it works fine, but there is always that one time when garbage data is appended to the end of the received data on the server side.
For example, instead of receiving the number 1 it will receive 1C or 1#????.
Here are snippets of the client and server code; any help will be appreciated.
C# client
int flagSide = 1;
msg = name;
msg += "+";
msg += "qty";
msg += "+";
msg += flagSide.ToString();
ZeroMQ.ZmqContext context = ZeroMQ.ZmqContext.Create();
ZeroMQ.ZmqSocket socket = context.CreateSocket(SocketType.REQ);
socket.Connect("tcp://111.111.0.111:5556");
socket.Send(Encoding.ASCII.GetBytes(msg.ToCharArray()));
Thread.Sleep(1);
string reply = socket.Receive(Encoding.ASCII);
Console.WriteLine("Received reply = " + reply + "\n");
C++ Server
std::tr1::unordered_map <std::string, std::string> aMap;

zmq::context_t context( 1 );
zmq::socket_t responder( context, ZMQ_REP );
responder.bind( "tcp://*:5556" );

while ( 1 )
{
    zmq::message_t recvMsg;
    responder.recv( &recvMsg );

    char *t = static_cast<char*>( recvMsg.data() );
    std::string s( t );

    std::vector<std::string> strs;
    boost::split( strs, s, boost::is_any_of("+") );

    aMap["name"] = strs[0];
    aMap["qty"]  = strs[1];
    aMap["flag"] = strs[2];
    ..........
Outputting the split string on the server reveals that sometimes the flag (strs[2]) receives the garbage data.
Please help me if you see something that I'm not seeing.
Thanks
In C#, strings converted to bytes are not null-terminated, and the C++ std::string constructor you are using expects a null-terminated pointer.
So I presume what is happening here is a buffer over-read: you are reading memory that does not belong to the message.
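A minimal sketch of a fix on the C# side is to append an explicit NUL so the server's std::string(char*) constructor stops at the end of the payload (alternatively, the server could build the string with an explicit length, std::string(static_cast<char*>(recvMsg.data()), recvMsg.size()), which avoids relying on a terminator at all):

// Append '\0' so the C++ side finds a terminator inside the message.
byte[] payload = Encoding.ASCII.GetBytes(msg + "\0");
socket.Send(payload);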
I've been breaking my head over a bug in this system I've been building. Basically, I use sockets to communicate between two C# applications, or rather a Unity C# script server and a C# client application.
With manual tests, the system works perfectly fine; no anomalies whatsoever.
In order to test performance and multi-user functionality, I wrote a tester class which launches multiple threads (clients) and has each fire X messages at the server. Here's where my problem occurs... sometimes.
When a Socket sends or receives, it returns an integer containing the number of bytes that were sent/received. When the problem occurs, I can see that the correct number of bytes arrived at the server. However, after putting the bytes into a string, I'm suddenly left with an empty string instead of the message I'd normally see.
I'm at a loss as to what's causing this problem. I'm using Encoding.Default.GetString() to translate the bytes into a string.
Any help is appreciated!
David
public void ReceiveFromClient (Socket handlerSocket)
{
    serverBuffer = new byte[iBufferSize]; //iBufferSize = 8192;

    int i = handlerSocket.Receive (serverBuffer);
    Debug.Log ("Bytes received: " + i);

    string message = Encoding.UTF8.GetString (serverBuffer, 0, i);
    Debug.Log ("Message received: " + message);

    //Do stuff with the message
}

bool SendMessageToUnity(string input)
{   //returns a bool saying whether the message was sent or not
    if (clientSocket != null)
    {
        if (clientSocket.Connected)
        {
            byte[] bytes = Encoding.UTF8.GetBytes(input + "|");
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending message: " + Encoding.UTF8.GetString(bytes) + Environment.NewLine)));

            int i = clientSocket.Send(bytes);
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending " + i + " bytes. " + Environment.NewLine)));
            return true;
        }
    }
    return false;
}
Look for a zero value ('\0') in your array of bytes before converting it to a string.
private string GetString(byte[] data)
{
    data = data.Where(b => b != 0).ToArray();
    return Encoding.UTF8.GetString(data);
}
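For example, combined with the question's ReceiveFromClient, you would first slice off the bytes that were actually received and then strip the zeros (a sketch; the received local is hypothetical, and the helper above needs System.Linq):

// Decode only the i bytes actually received, with zero bytes removed.
int i = handlerSocket.Receive(serverBuffer);
byte[] received = new byte[i];
Array.Copy(serverBuffer, received, i);
string message = GetString(received);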
If you are receiving the byte array correctly, then the problem is in the encoding.
Check the encoding the sender uses. It is usually UTF-8, but you have to verify it.
Then decode with the same one: var inputStr = Encoding.UTF8.GetString(InputByteArray);
I'm trying to communicate between C# and C++ with varying amounts of success.
I am able to send a message between the two using request/reply, but the doubles that I am receiving are not correct.
For debugging purposes and understanding, I am currently running the following:
clrzmq 3.0 RC1, Google Protocol Buffers 2.5, protobuf-csharp-port 2.4, ZeroMQ 3.2.3
.Proto
package InternalComm;
message Point
{
    optional double x = 1;
    optional double y = 2;
    optional string label = 3;
}
server.cpp (the relevant part)
while (true) {
    zmq::message_t request;

    // Wait for the next request from the client
    socket.recv (&request);

    // Echo the request back unchanged
    zmq::message_t reply (request.size());
    memcpy ((void*)reply.data(), request.data(), request.size());
    socket.send (reply);
}
client.cs (the relevant part)
public static Point ProtobufPoint(Point point)
{
    Point rtn = new Point(0, 0);

    using (var context = ZmqContext.Create())
    {
        using (ZmqSocket requester = context.CreateSocket(SocketType.REQ))
        {
            requester.Connect("tcp://localhost:5555");

            var p = InternalComm.Point.CreateBuilder().SetX(point.X).SetY(point.Y).Build().ToByteArray();
            requester.Send(p);

            string reply = requester.Receive(Encoding.ASCII);
            Console.WriteLine("Input: {0}", point);

            byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reply);
            var message = InternalComm.Point.ParseFrom(bytes);
            rtn.X = message.X;
            rtn.Y = message.Y;
            Console.WriteLine("Output: {0}", rtn);
        }
    }

    return rtn;
}
On the C# side, Point is a very simple struct: just x and y properties.
Here is what I'm getting from my unit tests as a result of running the above code:
Input (1.31616874365468, 4.55516872325469)
Output (0.000473917985115791, 4.55516872323627)
Input (274.120398471829, 274.128936418736)
Output (274.077917334613, 274.128936049925)
Input (150.123798461987, 2.345E-12)
Output (145.976459594794, 1.11014954927532E-13)
Input (150, 0)
Output (145.96875, 0)
I am thinking that the problem is that my protobuf code is incorrect (it's doubtful this is a bug on Skeet's side). I am also running under the assumption that server.cpp does nothing to the message and returns it as-is.
Thoughts?
The requester.Receive(Encoding.ASCII) call is designed to receive a string, not a block of bytes. You are asking the ZmqSocket instance to return the message as an ASCII string, which is highly likely to modify the content. If you're sending a byte array, receive a byte array.
Try this:
int readSize;
byte[] reply = requester.Receive(null, out readSize);
var message = InternalComm.Point.ParseFrom(reply);
The readSize variable will contain the actual number of valid bytes in the received block, which may differ from the size of the reply array, so you may need to slice the array to make it palatable to protobuf.
Why the ASCII -> bytes -> parsing step? If you're parsing bytes, you should read bytes; if you're parsing text, you should read text.
Unnecessary charset conversions are very likely to be lossy.
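As a small illustration of why the ASCII round trip mangles the payload (the byte values here are made up for the example): protobuf output is arbitrary binary, and .NET's ASCII encoding replaces every byte above 0x7F with '?' (0x3F), so the doubles come back corrupted.

// Protobuf output is arbitrary binary; bytes >= 0x80 are not valid ASCII.
byte[] original = { 0x09, 0x9A, 0xFF, 0x3C };

// Decoding as ASCII maps each non-ASCII byte to '?' ...
string asText = Encoding.ASCII.GetString(original);

// ... so encoding back yields 0x3F wherever 0x9A and 0xFF used to be.
byte[] roundTripped = Encoding.ASCII.GetBytes(asText);
// roundTripped is now { 0x09, 0x3F, 0x3F, 0x3C }: the data is corrupted.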