PIC18 UART receiving corrupted bytes from PC - C#

I can't seem to receive correct bytes from a PC on a PIC18F27J53.
The PIC UART is set up as standard asynchronous: 8 bits, 9600 baud, no parity.
The PC runs Windows 10. I made a simple UART program that sends a few ints, all separated by commas, like the following:
"123,24,65,98,12,45,564,987,321,0,5.9,87,65,789,123,6554,213,8754\n"
I have tried different approaches.
I tried sending each char one by one, but the PIC seems to get stuck early or midway through the transfer and the RX flag never goes high again.
I have also tried sending each int followed by "\n" and having the PIC parse the chars and cut the read once a "\n" is found. This works better and I can get more data in, but the final received data is corrupted: some ints are wrong, etc.
It clearly looks like a sync issue; it is as if the PC is too fast for the PIC.
If so, I would look at a synchronous UART, but according to the web that is far from the usual choice, which makes me think I must have another issue to resolve in asynchronous mode.
My question: what is the most popular, robust way to do this PIC-to-PC full-duplex UART communication?
Here are my PIC receive routines, fairly standard and simple (I think):
void int_receive_data(void)
{
    char input_element[10] = { 0 };
    char full_rx[128] = { 0 };

    for (int i = 0; i < 22; i++) {
        p18f47j53_uart2_read_text(input_element, sizeof(input_element));
        strncat(full_rx, input_element, strlen(input_element));
        strncat(full_rx, ",", 1);
    }
}
void p18f47j53_uart2_read_text(char *output, uint8_t max_length)
{
    uint8_t c;
    char buffer[64] = { 0 };

    for (uint8_t i = 0; i < max_length; i++) {
        c = p18f47j53_uart2_receive_u8();
        buffer[i] = c;
        if (c == '\n') {                    /* 10 == '\n', one check is enough */
            buffer[i] = 0;
            memcpy(output, buffer, i + 1);  /* copy the terminator as well */
            break;
        }
    }
}
uint8_t p18f47j53_uart2_receive_u8(void)
{
    // wait for the flag
    while (!PIR3bits.RC2IF);

    // reset receiver if overrun error
    if (RCSTA2bits.OERR) {
        RCSTA2bits.CREN = 0;
        RCSTA2bits.CREN = 1;
        return PIC_RC_FAIL;
    }

    // reset if framing error
    if (RCSTA2bits.FERR) {
        RCSTA2bits.SPEN = 0;
        RCSTA2bits.SPEN = 1;
        return PIC_RC_FAIL;
    }

    return RCREG2;
}
On the PC C# side, my sending looks like this:
string[] full_separated = full_tx.Split(',');

foreach (string s in full_separated)
    my_port.WriteLine(s);
The PIC is running from its internal 8 MHz clock.
I never tried the synchronous way, as it seems more complicated and 99 percent of the results on the web show the asynchronous way, which makes me think I'd better debug what I am doing.
Any ideas or advice? Thanks.

Well, not really a solution, but an alternative: break the frame into small chunks, and if possible have the receiver ACK each chunk with a character that tells the transmitter to go ahead with the next one.
The reason I am saying that: I have a mikroE dev board with a similar PIC, and while running an "out of the box" example and sending
"111,222,333,444,555,666,777,888,999"
it looks like the "999" creates issues. Too many bytes; maybe a buffer issue, or maybe the slight baud-rate mismatch accumulates after a few bytes?
Repeating the send every 50 ms, 500 ms or 1000 ms doesn't make it better.
Changing the baud rate doesn't either.
Only after removing ",999" does it seem to work all right.
Even without the ",999" I am guessing it is still on the "edge of working", so maybe remove "666,777,888,999" as well and the communication should feel more comfortable.
More code, more traffic, but at least it works.
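A minimal sketch of the chunk-plus-ACK idea on the C# side, assuming the PIC firmware answers each value with a single 'K' byte once it has parsed it (the ACK byte, the 500 ms timeout and the SendValues helper are assumptions for illustration, not part of the original code):

using System;
using System.IO.Ports;

static class ChunkedSender
{
    // Send one comma-separated value at a time and wait for an ACK byte
    // before sending the next one, so the PIC is never flooded.
    public static void SendValues(SerialPort port, string fullTx)
    {
        port.ReadTimeout = 500;                 // assumed: 500 ms to receive each ACK

        foreach (string s in fullTx.Split(','))
        {
            port.WriteLine(s);                  // one small chunk, terminated by NewLine
            int ack = port.ReadByte();          // blocks until the PIC replies (or times out)
            if (ack != 'K')                     // 'K' is an assumed ACK byte
                throw new InvalidOperationException("Unexpected ACK: " + ack);
        }
    }
}

On the PIC side the firmware would send the ACK byte after it has parsed each value, which throttles the PC to the PIC's pace regardless of baud-rate error or receive-buffer depth.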

Related

Reading data from RadioLink R12DS receiver by S-BUS protocol

My goal is pretty simple: I want to read information from my RadioLink R12DS receiver over the S-BUS protocol using a desktop console application written in C#.
I use an AT9S transmitter bound to the receiver in 12-channel mode. I tested it on a Pixhawk flight controller; everything was fine there, no problems retrieving data.
I designed a console application based on the articles I investigated. Here are a couple of the most valuable ones:
http://forum.fpv.kz/topic/303-frsky-x8r-sbus-v-cppm-konverter-na-arduino/ https://github.com/bolderflight/SBUS
My application receives the byte stream from a COM port one byte at a time and tries to catch the message header 0x0F, but it never appears.
The SBUS protocol uses inverted serial logic with a baud rate of 100000, 8 data bits, even parity bit, and 2 stop bits. The SBUS packet is 25 bytes long consisting of:
Byte[0]: SBUS Header, 0x0F
Byte[1-22]: 16 servo channels, 11 bits per servo channel
Byte[23]:
Bit 7: digital channel 17 (0x80)
Bit 6: digital channel 18 (0x40)
Bit 5: frame lost (0x20)
Bit 4: failsafe activated (0x10)
Bit 0 - 3: n/a
Byte[24]: SBUS End Byte, 0x00
A table mapping bytes[1-22] to servo channels is included.
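For reference, those 11-bit channels are packed LSB-first straight across Byte[1-22]; a sketch of the unpacking, following the bolderflight/SBUS mapping linked above (so an assumption rather than anything RadioLink-specific), could look like:

public static UInt16[] UnpackChannels(Byte[] sbus)
{
    // 16 channels x 11 bits = 176 bits = 22 bytes, starting right after the header byte.
    var channels = new UInt16[16];
    var bitIndex = 0;
    for(var ch = 0; ch < 16; ch++)
    {
        UInt16 value = 0;
        for(var bit = 0; bit < 11; bit++, bitIndex++)
        {
            var byteIndex = 1 + bitIndex / 8;
            if((sbus[byteIndex] & (1 << (bitIndex % 8))) != 0)
                value = (UInt16)(value | (1 << bit));
        }
        channels[ch] = value;
    }
    return channels;
}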
Here is a listing of my code:
static void Main(String[] args)
{
    var availablePorts = SerialPort.GetPortNames();

    using(var port = new SerialPort(availablePorts[0], 100000, Parity.Even, 8, StopBits.Two))
    {
        port.DataReceived += PortOnDataReceived;
        while(true)
        {
            if(!port.IsOpen)
                TryReconnect(port);
            Thread.Sleep(1000);
        }
    }
}

// HANDLERS ///////////////////////////////////////////////////////////////////////////////
private static void PortOnDataReceived(Object sender, SerialDataReceivedEventArgs serialDataReceivedEventArgs)
{
    var serialPort = (SerialPort)sender;
    if(SbusConverter.TryReadMessage(serialPort, out var messageBuffer))
    {
        var message = SbusConverter.Convert(messageBuffer);
        Console.WriteLine(message.ServoChannels[0]);
    }
}

public static Boolean TryReadMessage(SerialPort serialPort, out Byte[] messageBuffer)
{
    const Int32 messageLength = 25;
    const Int32 endOfStream = -1;
    const Byte sBusMessageHeader = 0x0f;
    const Byte sBusMessageEndByte = 0x00;

    messageBuffer = new Byte[messageLength];

    if(serialPort.BytesToRead < messageLength)
        return false;

    do
    {
        var value = serialPort.ReadByte();
        if(value == endOfStream)
            return false;

        if(value == sBusMessageHeader)
        {
            messageBuffer[0] = (Byte)value;
            for(var i = 1; i < messageLength; i++)
            {
                messageBuffer[i] = (Byte)serialPort.ReadByte();
            }

            if(messageBuffer[0] == sBusMessageHeader &&
               messageBuffer[24] == sBusMessageEndByte)
                return true;
        }
    } while(serialPort.BytesToRead > 0);

    return false;
}
I have some thoughts in my head and want to ask one question here.
It's possible that RadioLink uses a different, modified or entirely their own S-BUS implementation compared to Futaba's; I have found no proper documentation yet.
Anybody who is experienced in this field, any suggestions please. It seems I am stuck.
Thank you!
I investigated the received data stream and discovered that RadioLink devices use "0x1F" as the frame start byte instead of "0x0F". The other connection and message properties are the same.
var availablePorts = SerialPort.GetPortNames();

using(var port = new SerialPort(availablePorts[0], 100000, Parity.None, 8, StopBits.One)
{
    Handshake = Handshake.None
})
{
    port.DataReceived += PortOnDataReceived;
    while(true)
    {
        if(!port.IsOpen)
            OpenPort(port);
        Thread.Sleep(1000);
    }
}
Two years too late, but I just noticed you have 1 stop bit set, instead of 2, in the code of your answer. That could probably explain the 0x1F instead of 0x0F and the otherwise shifted data.
I am trying to do the same kind of operation, intercepting the S-BUS signal from a Herelink radio controller. So far I have discovered a shift in the data, and my start byte is recognised with the value 0x1E. Moreover, the data looks super noisy on the oscilloscope; some 1s might be getting missed because of the poor signal quality and the slow ramp from 0 to 1.

Saving a variable that gets loaded on program startup (WFA) C#

The user is asked to type in the serial number of the device he's using. The program then uses this serial number for all its functions. This was done so that the user can easily replace the device without any technical help, just by typing the new serial number into the application.
However, the way I've done it, the user needs to type in the serial number each time the program is opened, which is kind of tedious.
Is there a way to store the last entered serial number, so that it loads the next time the program is run?
I have checked this link. While it seems promising, it hasn't solved the problem for me. I'll explain with my code below.
Here is the code that parses the serial number entered by the user:
byte[] xbee { get; set; }

var xbee_serienr = prop1_serienr.Text;
xbee = new byte[xbee_serienr.Length / 2];
for (var i = 0; i < xbee.Length; i++)
{
    xbee[i] = byte.Parse(xbee_serienr.Substring(i * 2, 2), NumberStyles.HexNumber);
}
I tried the aforementioned link and saved it as a user-scoped setting like so:
Name: prop1_serienr, Type: string, Scope: user, Value: 0013A20040A65E23
And then use it in the code like so:
prop1_serienr = Xbee.Properties.Settings.Default.prop1_serienr;
//keep in mind I made the silly decision using Xbee as namespace and xbee as a variable
But prop1_serienr remains empty this way.
Any tips or guidelines on how to make this easier than having to type it in every time the program starts would be greatly appreciated. If that's my only option, I will resort to hard-coding the serial numbers and changing the code every time a device is changed.
Hard-coding the serial numbers is really not an option, especially when something like "saving a serial number" is not very complicated at all (but like all things, it can be made complicated if you let it).
The very easy approach:
public partial class Form1 : Form
{
    private byte[] _xbee;

    public Form1()
    {
        if (!File.Exists("serial.txt"))
        {
            // Dispose the returned FileStream so the file isn't left open/locked.
            File.Create("serial.txt").Dispose();
        }
        else
        {
            _xbee = File.ReadAllBytes("serial.txt");
        }
        InitializeComponent();
    }

    private void btnSaveSerial_Click(object sender, EventArgs e)
    {
        var xbee_serienr = prop1_serienr.Text;
        var xbee = new byte[xbee_serienr.Length / 2];
        for (var i = 0; i < xbee.Length; i++)
        {
            xbee[i] = byte.Parse(xbee_serienr.Substring(i * 2, 2), NumberStyles.HexNumber);
        }
        _xbee = xbee;
        File.WriteAllBytes("serial.txt", xbee);
    }
}
It reads the bytes from the file at startup (if the file exists).
It writes the bytes to the file when the user has changed the serial (and clicked on a button to save it).
As I've said, you can make this as easy or as complicated as you like, but this should get you going.
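As an aside, the Settings approach from the linked question can also be made to work; a minimal sketch, assuming the project has a user-scoped string setting named prop1_serienr and the form has a TextBox of the same name (both names taken from the question):

// On startup: copy the saved value into the TextBox, not the other way around.
prop1_serienr.Text = Xbee.Properties.Settings.Default.prop1_serienr;

// When the user enters a new serial: persist it explicitly.
Xbee.Properties.Settings.Default.prop1_serienr = prop1_serienr.Text;
Xbee.Properties.Settings.Default.Save();   // without Save() the value is gone on exit

The usual pitfalls are assigning the setting to the control itself instead of its Text property, and forgetting to call Save().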

Unable to return same double using Google Protocol Buffers from C# to C++ and back

I'm trying to communicate between C# and C++ with varying amounts of success.
I am able to send a message between the two using reply/request, but the doubles that I am receiving are not correct.
For debugging purposes and understanding, I am currently running the following:
Clrzmq 3.0 rc1, Google ProtocolBuffer 2.5, Protobuf-csharp-port-2.4, ZeroMQ-3.2.3
.proto

package InternalComm;

message Point
{
    optional double x = 1;
    optional double y = 2;
    optional string label = 3;
}
server.cpp (the relevant part)
while (true) {
    zmq::message_t request;

    // Wait for next request from client
    socket.recv (&request);

    zmq::message_t reply (request.size());
    memcpy ((void*)reply.data(), request.data(), request.size());
    socket.send(reply);
}
client.cs (the relevant part)
public static Point ProtobufPoint(Point point)
{
    Point rtn = new Point(0,0);

    using (var context = ZmqContext.Create())
    {
        using (ZmqSocket requester = context.CreateSocket(SocketType.REQ))
        {
            requester.Connect("tcp://localhost:5555");

            var p = InternalComm.Point.CreateBuilder().SetX(point.X).SetY(point.Y).Build().ToByteArray();
            requester.Send(p);

            string reply = requester.Receive(Encoding.ASCII);
            Console.WriteLine("Input: {0}", point);

            byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reply);
            var message = InternalComm.Point.ParseFrom(bytes);
            rtn.X = message.X;
            rtn.Y = message.Y;
            Console.WriteLine("Output: {0}", rtn);
        }
    }
    return rtn;
}
On the C# side, Point is a very simple struct. Just x and y properties.
Here is what I'm getting from my unit tests as a result of running the above code.
Input (1.31616874365468, 4.55516872325469)
Output (0.000473917985115791, 4.55516872323627)
Input (274.120398471829, 274.128936418736)
Output (274.077917334613, 274.128936049925)
Input (150.123798461987, 2.345E-12)
Output (145.976459594794, 1.11014954927532E-13)
Input (150, 0)
Output (145.96875, 0)
I am thinking that the problem is that my protobuf code is incorrect (it is doubtful this is a bug on Skeet's side). I am also running under the assumption that server.cpp does nothing to the message but return it as is.
Thoughts?
The requester.Receive(Encoding.ASCII) call is designed to receive a string, not a block of bytes. You are asking the ZmqSocket instance to return the message as an ASCII string, which is highly likely to modify the content. If you're sending a byte array, receive a byte array.
Try this:
int readSize;
byte[] reply = requester.Receive(null, out readSize);
var message = InternalComm.Point.ParseFrom(reply);
The readSize variable will contain the actual number of valid bytes in the received block, which may vary from the size of the reply array, so you may need to slice up the array to make it palatable to ProtoBuf.
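If slicing is needed, a trimmed copy is one straightforward way to do it (a sketch building on the snippet above):

// Use only the valid bytes reported in readSize before handing them to ProtoBuf.
byte[] exact = new byte[readSize];
Array.Copy(reply, exact, readSize);
var message = InternalComm.Point.ParseFrom(exact);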
Why the ASCII --> bytes --> parsing step? If you're parsing bytes, you should read bytes. If you're parsing text, you should read that.
Unnecessary charset-conversions look very likely to be erroneous.

LibUsbDotNet UsbDevice.ControlTransfer hangs

I have a C# .NET WinForms application which uses LibUsbDotNet to program firmware into a USB device (an Atmel AVR32) using "DFU_DNLOAD" transfers, which are a special kind of control transfer. This all works, BUT: a specific kind of transfer, which causes the device to erase its internal flash, fails to send an ACK within the correct timing.
When this happens, my LibUsbDotNet connection becomes irreparably broken, which causes everything to fail.
My code does the following:
int TransferToDevice(byte request, short value, byte[] data)
{
    var setup = new UsbSetupPacket(
        (byte)(UsbCtrlFlags.Direction_Out | UsbCtrlFlags.RequestType_Class | UsbCtrlFlags.Recipient_Interface),
        request,
        value,
        0,
        (short)data.Length);

    int n;
    IntPtr unmanagedPointer = System.Runtime.InteropServices.Marshal.AllocHGlobal(data.Length);
    System.Runtime.InteropServices.Marshal.Copy(data, 0, unmanagedPointer, data.Length);

    // UsbDevice obtained elsewhere
    if (!UsbDevice.ControlTransfer(ref setup, unmanagedPointer, data.Length, out n))
    {
        n = 0;
    }
    System.Runtime.InteropServices.Marshal.FreeHGlobal(unmanagedPointer);
    return n;
}

// In order to do a "DFU_DNLOAD", the method above is used as follows:
TransferToDevice(DFU_DNLOAD, Transactions++, data); // "data" is the payload

// where DFU_DNLOAD is:
private const byte DFU_DNLOAD = 1;

// and Transactions is:
short Transactions = 0;
The above code works (the device correctly receives the "DFU_DNLOAD" message), but the missing ACK is the problem. Once the error occurs, every attempt to communicate with the device (even if I try to re-initialize everything) fails until the device is disconnected and re-inserted...
I would like to be able to reset or re-initialize the USB-connection somehow, when this error occurs. Currently I am only able to re-establish communications with the device by exiting my application and re-starting it manually.
This was never solved to my satisfaction; I ended up implementing my own "DFU" protocol on top of libusb in plain C and P/Invoking that, avoiding LibUsbDotNet entirely... This solution seems to work.
Just guessing, but in case data is an array of short, the size of the buffer should be adjusted:
int numberOfValues = data.Length;
int size = Marshal.SizeOf(typeof(short));

IntPtr unmanagedPointer = Marshal.AllocHGlobal(numberOfValues * size);
if (unmanagedPointer == IntPtr.Zero)
    throw new OutOfMemoryException("Unable to allocate memory");
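For completeness, a sketch of what the copy and cleanup would look like if data really were a short[] (hypothetical, since the question declares it as byte[]; Marshal.Copy has an Int16[] overload):

short[] data = new short[] { 1, 2, 3 };   // hypothetical payload
int size = Marshal.SizeOf(typeof(short));

IntPtr unmanagedPointer = Marshal.AllocHGlobal(data.Length * size);
try
{
    // Copies the managed short[] into the unmanaged buffer.
    Marshal.Copy(data, 0, unmanagedPointer, data.Length);
    // ... pass unmanagedPointer and data.Length * size to the control transfer ...
}
finally
{
    Marshal.FreeHGlobal(unmanagedPointer);
}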

c# search in another process's memory - sudden lockout

I need to keep track of another program's memory, constantly looking for a sequence of bytes to appear there, and when it does, I need to remember its location so I later know where to write.
I used the following post to learn how to look for a byte[] in another process's memory:
C#: Search a byte[] array in another process's memory
My program is very simple: it launches the process (using Process.Start) and then repeatedly runs this function from one of the linked thread's answers:
private static int GetMemoryAddressOfString(byte[] searchedBytes)
{
    IntPtr hProcess = OpenProcess(ProcessAccessFlags.VMOperation | ProcessAccessFlags.VMRead | ProcessAccessFlags.VMWrite, false, Program.ArtemisProcess.Id);
    if (hProcess == IntPtr.Zero)
        throw new Win32Exception(Marshal.GetLastWin32Error());

    int addr = 0;
    int speed = 1024 * 64;
    for (int j = 0x00400000; j < 0x11000000; j += speed)
    {
        byte[] bigMem = new byte[speed + searchedBytes.Length];
        IntPtr unmanagedPointer = Marshal.AllocHGlobal(4);
        ReadProcessMemory(hProcess, (IntPtr)j, bigMem, new UIntPtr((uint)(speed + searchedBytes.Length)), unmanagedPointer);
        int result = Marshal.ReadInt32(unmanagedPointer);   // number of bytes actually read
        Marshal.FreeHGlobal(unmanagedPointer);              // FreeHGlobal (not DestroyStructure) releases the allocation

        for (int k = 0; k < bigMem.Length - searchedBytes.Length; k++)
        {
            bool found = true;
            for (int l = 0; l < searchedBytes.Length; l++)
            {
                if (bigMem[k + l] != searchedBytes[l])
                {
                    found = false;
                    break;
                }
            }
            if (found)
            {
                addr = k + j;
                break;
            }
        }
        if (addr != 0)
            break;
    }
    return addr;
}
where ArtemisProcess is the Process I started with .Start().
Most of the time it works fine. As soon as I perform the action in the watched process that puts the searched sequence of bytes into memory, the next search finds it. However, sometimes it won't.
I wondered whether I was right, so I used Cheat Engine to confirm that the searched data IS there.
Then I added the part where I create an unmanaged pointer to find out how many bytes were actually read, and that's when I discovered that exactly the place in memory where the searched bytes appear (which Cheat Engine correctly identifies) reads back 0 bytes! It won't let me read memory there. This "lockout" lasts for about a minute or two, and only then am I allowed to read the memory again (out of the blue, the next attempt to read at that location succeeds and the sequence of bytes is found all right).
Now, I read on MSDN that "the function fails if the requested read operation crosses into an area of the process that is inaccessible", but how do I know which part of the process memory is accessible and which isn't?
Why is Cheat Engine able to read that memory while my program isn't?
Why does it suddenly allow me to read the process memory again?
I am at a loss here...
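One way to answer the "which parts are accessible" question is to ask Windows with VirtualQueryEx before each ReadProcessMemory call; a hedged sketch using the standard Win32 declarations (how you then skip or retry a region is up to you):

[StructLayout(LayoutKind.Sequential)]
struct MEMORY_BASIC_INFORMATION
{
    public IntPtr BaseAddress;
    public IntPtr AllocationBase;
    public uint AllocationProtect;
    public IntPtr RegionSize;
    public uint State;
    public uint Protect;
    public uint Type;
}

[DllImport("kernel32.dll")]
static extern IntPtr VirtualQueryEx(IntPtr hProcess, IntPtr lpAddress, out MEMORY_BASIC_INFORMATION lpBuffer, uint dwLength);

const uint MEM_COMMIT = 0x1000;
const uint PAGE_NOACCESS = 0x01;
const uint PAGE_GUARD = 0x100;

// True if the region containing 'address' is committed and not guarded or no-access.
static bool IsReadable(IntPtr hProcess, IntPtr address)
{
    MEMORY_BASIC_INFORMATION mbi;
    if (VirtualQueryEx(hProcess, address, out mbi, (uint)Marshal.SizeOf(typeof(MEMORY_BASIC_INFORMATION))) == IntPtr.Zero)
        return false;

    return mbi.State == MEM_COMMIT
        && (mbi.Protect & PAGE_GUARD) == 0
        && (mbi.Protect & PAGE_NOACCESS) == 0;
}

The scan loop could then skip the 64 KB windows for which IsReadable returns false instead of having the whole ReadProcessMemory call fail on them.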
