I am working on a C# development project that aims to control a flying robot using a set of OptiTrack cameras (an infrared motion-capture system). The concept works really well: the camera system provides a position, our application can access these data, and we can control the robot.
The problem comes, I believe, from the communication chain. We have a USB cable linking our PC to an STM32F4-based board, which acts as a bridge and simply forwards everything received on USB to radio via an on-board NRF module (nRF24L01+ chip). This mechanism works both ways (NRF -> USB -> PC). The board firmware is written in C and runs FreeRTOS. The PC runs Windows 10.
On the PC side we use the official STM32 VCP driver 1.4 from ST. On the board firmware, we use this third-party library for VCP on STM32F4: https://stm32f4-discovery.net/2014/08/library-24-virtual-com-port-vcp-stm32f4xx/
There are no queues on the bridge board except the ones inside the radio module itself and potentially the STM32 VCP driver itself. I checked: this driver uses a circular buffer for incoming and outgoing messages. Using J-Scope I can visualize in real time how this circular buffer behaves, and I have never seen it overflow.
My problem is that a delay suddenly appears.
The system works perfectly for tens of minutes (between 10 and 20 minutes) and then a delay becomes clearly perceptible, which makes the controller oscillate. This happens when I do a second flight.
The following have been tried:
- Restart the C# application
- Run the C# application in debug mode and in standalone mode
- Change USB cable (shorter)
- Change USB port
- Change computer
- Reinstall the ST VCP drivers
- Drastically lower the communication frequency (the control loop)
None of these helped: once the problem appears, the delay remains, and none of the above made the system more reliable.
On the C# application side, I reset all communication Lists before each flight. The SerialPort object (from System.IO.Ports) has its buffers reset as well (the DiscardOutBuffer, DiscardInBuffer, and BaseStream.Flush methods).
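For reference, here is a minimal sketch of that pre-flight reset (the method name and the list fields are placeholders, not our actual code; requires using System.IO.Ports):

// Sketch of the pre-flight reset described above; member names are illustrative.
private void ResetCommunication(SerialPort port)
{
    _txHistory.Clear();              // our communication Lists (placeholder names)
    _rxHistory.Clear();

    if (port.IsOpen)
    {
        port.DiscardOutBuffer();     // drop anything still queued for sending
        port.DiscardInBuffer();      // drop anything received but not yet read
        port.BaseStream.Flush();     // flush the underlying stream
    }
}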
I found a "hack" to make it work, but it is not what we want as a final solution. The "hack" is to simply physically disconnect / reconnect the USB connecting the board acting as a bridge. After this manipulation, the delay disappears.
So my questions are:
What could possibly introduce this delay, given that it does not appear to be application-dependent or frequency-related (bandwidth)?
What could explain why it disappears when unplugging/replugging the bridge board?
I know this question concerns a big project and could be hard to understand from the outside. Feel free to ask me for more details if needed.
Cheers,
Marc
I considered cross-posting to the Super User site or similar but don't know how. As you will see, this is not just a programming question.
I developed some C# code to communicate with an Onset InTemp thermometer via Bluetooth Low Energy (BLE). It works fine for a long time. I'm able to get beacons (which have the thermometer data) and also connect, get services and characteristics in case I need to get missed data or set parameters on the thermometer. However, after about 8 hours (can be as much as 24 hours) of continuously receiving beacons and connecting, getting historical data, etc., my app hangs on this line:
var gatt = await device.GetGattServicesAsync();
I added lines before and after this call and verified that it is clearly hanging on this line. Again, it can be after 8 hours or 24 hours of use; it certainly chugs along just fine for quite a while. Killing and restarting the program is of no help. It then hangs on the first call to:
device = await BluetoothLEDevice.FromBluetoothAddressAsync(args.BluetoothAddress);
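For completeness, here is a stripped-down sketch of the failing sequence with a diagnostic timeout around it (the timeout wrapper is only illustrative and not part of my real code; the two BLE calls are the ones quoted above):

// Diagnostic sketch: guard the call so the app fails fast instead of hanging forever.
var device = await BluetoothLEDevice.FromBluetoothAddressAsync(args.BluetoothAddress);

var gattTask = device.GetGattServicesAsync().AsTask();
var finished = await Task.WhenAny(gattTask, Task.Delay(TimeSpan.FromSeconds(30)));
if (finished != gattTask)
{
    // Once the stack is in the bad state, we end up here on every attempt
    // until the PC is rebooted.
    throw new TimeoutException("GetGattServicesAsync never completed");
}
var gatt = gattTask.Result;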
Only rebooting the PC fixes the problem. This is not surprising given the messages in the Windows Event Logs before it hangs:
The Bluetooth driver expected an HCI event with a certain size but did not receive it.
It's for this reason that I said this might be a Super User question. No matter how badly I programmed my code :), killing the program should get BLE working again. FWIW, I did try disabling/re-enabling BLE through Settings before rebooting.
I have found some links to this problem, but nothing very definitive.
I'm also working on writing a smaller, complete program to demonstrate the problem. So far, I can't get the example program to fail, from which I conclude that either (1) I haven't let it run long enough, or (2) I'm not fully duplicating what is going on in my full program. Perhaps I'm not putting as much pressure on the BLE drivers, or I'm not listening for beacons and attempting to connect in the same ratios or with the same timing.
I should say that I have seen a lot of gripes about the Windows implementation of the BLE host layer and BLE drivers. Unfortunately, I'm stuck with Windows. Nordic also suggested that everything I'm reporting is a "known problem" and that using their Nordic dongle will solve it. See for example: https://devzone.nordicsemi.com/f/nordic-q-a/65516/using-nrf52840-dongle-as-receiver-client-for-onset-thermometer.
That may well be true, but it would be a lot of work, as they have libraries in C++ and Node.js but not C#.
Any help is greatly appreciated.
Thank you!
I have a BLE sensor that sends data at a frequency of 100 Hz, and I've developed a UWP application to receive this data. I'm having a weird issue where, after a couple of seconds of everything working fine, I stop receiving notifications for new data.
Now, I say it's weird because this happens when I'm using the laptop's internal Bluetooth but not when using a Bluetooth dongle. With the dongle it works fine and never stops. Both the internal Bluetooth and dongle drivers are updated to the latest version Windows can find.
As soon as the notification stops the sensor disconnects.
The sensor is based on the Nordic nRF52832 SoC.
Now a bit of information about my code:
I have a Sensor.cs class in which I handle the connection and the streaming of the data.
I'm using BluetoothLEAdvertisementWatcher in order to find the sensor.
The GattCharacteristic objects are private members of the Sensor class.
I subscribe to notifications by calling:
await _gattDataSensorsCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(
    GattClientCharacteristicConfigurationDescriptorValue.Notify);
The application doesn't do anything fancy. Just connects to the sensor, starts the streaming and prints the data.
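To make the structure clearer, here is a stripped-down skeleton of the Sensor class as described above (heavily simplified; everything except the characteristic field name is illustrative, and discovery details are elided):

// Simplified skeleton of the Sensor class; uses Windows.Devices.Bluetooth namespaces.
public class Sensor
{
    private readonly BluetoothLEAdvertisementWatcher _watcher = new BluetoothLEAdvertisementWatcher();
    private BluetoothLEDevice _device;
    private GattCharacteristic _gattDataSensorsCharacteristic; // class member, not a local

    public void StartScanning()
    {
        _watcher.Received += OnAdvertisementReceived;
        _watcher.Start();
    }

    private async void OnAdvertisementReceived(BluetoothLEAdvertisementWatcher sender,
        BluetoothLEAdvertisementReceivedEventArgs args)
    {
        _device = await BluetoothLEDevice.FromBluetoothAddressAsync(args.BluetoothAddress);
        // ... service and characteristic discovery elided ...
        _gattDataSensorsCharacteristic.ValueChanged += Characteristic_ValueChanged;
    }

    private void Characteristic_ValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
    {
        // Print the received data.
    }
}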
I've done quite a bit of research and couldn't find anything similar. I've found posts about notifications stopping because the object was collected by the GC, because the characteristics were local variables instead of class members, or because calls weren't awaited.
Why does this happen?
1st edit
@Emil I'm sorry, my fault for not mentioning that as soon as the notifications stop, the sensor disconnects. I'm editing the question to mention that, as it's relevant. Anyway, I'm trying what you mention ASAP.
@GrooverFromHolland That option was checked. I've tried unchecking it on a couple of computers, but sadly, unchecking it only makes the code last a few seconds longer.
@Nico Zhu - MSFT I've followed the link you posted and read the doc. I also went to my sensor's manual to double-check whether the characteristic allows Notify, and it does: it allows both Read and Notify. I also do the same thing, characteristic.ValueChanged += Characteristic_ValueChanged;, as mentioned in the doc.
About the sensor disconnecting when the notifications stop / ValueChanged stops firing: I should add that in my MainWindow, the sensor object is a class member and is not disposed at any moment. So it doesn't make sense that the GC collects the object, right?
2nd edit
I've tried the code with the laptop both plugged into and unplugged from a power source, and it is always set to maximum performance. The code I'm using can be found here: link
3rd edit
Following @Emil's recommendations I was finally able to capture the traffic. I tried to understand the pcap file generated by USBPcap, but I simply can't make sense of it. I've reproduced the issue with two sensors, and the disconnect/notification-stop pattern looks different for each sensor.
I've made a Dropbox folder containing the two pcap files: link
From what I understood, in the file "ble-FE592382586F.pcap" the interesting packet numbers are 12647 and 12681. They say the source is the controller and the destination is the host. Is the controller the laptop's Bluetooth driver?
Controller means the Bluetooth chip (inside or outside the computer) and host is the main CPU. They talk to each other over USB. At packet 12681 in the first file, the disconnect reason is Connection Timeout (0x08). This means the connection was dropped unexpectedly (radio interference / bad signal?). At packet 613 in the other file, we have the same situation. A difference is that only in the second file was there a new attempt to reconnect the device.
Since you do not use bonding, you must make sure to re-write the client characteristic configuration descriptor upon every new connection. It doesn't seem like you do this.
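A minimal sketch of what that could look like (assuming _device is the connected BluetoothLEDevice and the characteristic has been re-discovered after reconnecting; both names are illustrative):

// Without bonding, the CCCD value is not remembered across connections,
// so re-enable notifications every time the link comes back up.
_device.ConnectionStatusChanged += async (device, args) =>
{
    if (device.ConnectionStatus == BluetoothConnectionStatus.Connected)
    {
        var status = await _gattDataSensorsCharacteristic
            .WriteClientCharacteristicConfigurationDescriptorAsync(
                GattClientCharacteristicConfigurationDescriptorValue.Notify);
        // status should be GattCommunicationStatus.Success; retry or log otherwise.
    }
};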
I ran into something similar. What I discovered was that the GattCharacteristic (the object you call WriteClientCharacteristicConfigurationDescriptorAsync on) was getting garbage-collected. As soon as I stored it in a class-wide variable (I made a Dictionary of all the characteristics I was tracking), my notifications kept coming.
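A rough sketch of that fix (the dictionary layout is just how I happened to organize it):

// Keep strong references to the characteristics in a class-level field
// so the GC cannot collect them while notifications are active.
private readonly Dictionary<Guid, GattCharacteristic> _characteristics =
    new Dictionary<Guid, GattCharacteristic>();

private async Task SubscribeAsync(GattCharacteristic characteristic)
{
    _characteristics[characteristic.Uuid] = characteristic;   // keep it alive
    characteristic.ValueChanged += Characteristic_ValueChanged;
    await characteristic.WriteClientCharacteristicConfigurationDescriptorAsync(
        GattClientCharacteristicConfigurationDescriptorValue.Notify);
}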
So, I have seen various posts regarding the use of an Android phone as a keyboard or as a gamepad. I'm actually working on a project that does just that. As you can imagine, I've hit a massive roadblock when it comes to sending the signal from the phone to the PC via USB.
I decided to use Unity as the base of this project. I have a functioning GUI and some simple code that basically opens the port, pushes data through with the Serial.write command, and closes the port. I also understand that this code will not do me any good unless the PC I'm connected to via USB recognizes the phone as a source of input.
That's where I'm stuck.
I've seen posts with explanations of how USB works, the different hex codes mapped to keys, ideas about modifying the kernel, third-party programs, and third-party devices, but nothing concrete on how to move forward in a video game on my PC by simply tapping the screen on my phone. It should be simple, right?
So, I'm asking whether this project is worth pursuing for the few months I have left, or whether I should consider pivoting to a project that's a little closer to my pay grade (free)?
Although I am not sure this post is a standard SO question, my train of thought would be to use Bluetooth instead of USB, client-server, etc., as those are plagued by problems. While there may be a certain lag when using Bluetooth, programmatically this should be easily achievable. This may be of interest to you: https://github.com/temach/HIDInterface
I have an Arduino Mega communicating over Bluetooth (BlueSMiRF Gold device) with a C# application that I wrote. The Arduino constantly sends a serial message of 32 characters, the first always being an "S" and the last an "E". Using PuTTY I can confirm that this message is being sent correctly 99% of the time.
Now I want to read this signal with my C# application, which I am doing with the following code:
public string receiveCommandHC()
{
    string messageHC = "";
    if (serialHC.IsOpen)
    {
        // Wait for the start marker 'S', discarding anything before it
        while (serialHC.ReadChar() != 'S')
        {
        }
        // Read the message up to (but not including) the end marker 'E'
        messageHC = serialHC.ReadTo("E");
        serialHC.DiscardInBuffer();
    }
    return messageHC;
}
serialHC is an instance of the SerialPort class.
Sometimes this works perfectly, but other times I have problems; I cannot figure out why it works sometimes and not others.
The problem I seem to be having is that sometimes there is a rather large lag in the data I read from the Arduino. I notice this because I am sending button states, and they only change a few seconds after I actually press or release the button on the Arduino. I used the standard baud rate of the Bluetooth device, 115200, and was wondering if changing that to a much lower rate could yield better results? What advantage, if any, would that have? I do not need high communication rates; even updating the state 4-5 times a second would be acceptable for my application.
Is it possible the lag is coming from my code? I think it may be from the while loop that waits for the incoming "S", but then I don't see why it should hang there, since new messages are always coming in at a high rate.
I'm using DiscardInBuffer() because I do not care about outdated data and just want to skip over it. It is much more important that I read current, up-to-date data and act on that fresh data.
Thank you for your help!
Best regards,
Bender
Update:
Just found out a bit more information while debugging. The problem only seems to appear:
- When connected over Bluetooth (over a USB cable there is absolutely NO lag)
When a second Bluetooth connection is established from the PC to another device (different COM port and different baud rate)
Does anybody have experience running two different devices off the same Bluetooth dongle on a PC? I can connect to both without a problem, but I still have the lag issue mentioned before.
Thanks for any help
You are not really using a physical serial port here; the Bluetooth driver merely emulates one. This is common: the Windows API has a well-defined set of API functions to talk to a serial port, and emulating one makes the interface to the driver simple. The vendor doesn't have to supply an interface DLL or document a complicated DeviceIoControl() protocol.
This means, for one thing, that the actual communication settings don't matter. Baud rate is meaningless in this scenario; it is the Bluetooth radio signaling that sets the transfer rate. The driver will accept whatever you select but will otherwise ignore it. Handshake signals might be interpreted; it's up to the driver to implement them. Communication error reporting is very rarely implemented, since Bluetooth has an error-correcting protocol, unlike a real serial port.
No, the loss of data here is entirely self-induced. Clearly the driver does implement DiscardInBuffer(), which accomplishes nothing but throwing away whatever data the driver has received. This goes wrong if your code runs a bit late or gets interrupted by a thread context switch.
Delete the DiscardInBuffer() call.
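For clarity, here is the receive method from the question with only that change applied (a sketch, otherwise identical to the original):

public string receiveCommandHC()
{
    string messageHC = "";
    if (serialHC.IsOpen)
    {
        // Synchronize on the start marker...
        while (serialHC.ReadChar() != 'S')
        {
        }
        // ...then read up to the end marker. With no DiscardInBuffer() call,
        // bytes that arrived in the meantime stay queued for the next read.
        messageHC = serialHC.ReadTo("E");
    }
    return messageHC;
}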
In my scenario, I don't want the printer to have to be connected to a computer in order to print something. Instead, I am trying to connect it to a modem (GSM or other). When an SMS arrives at the modem, a print command fires and the SMS is printed. My question is: is it possible to implement this with existing technology? If not, I would be delighted if you could provide some alternatives.
Unless you can customize the firmware of the GSM modem or the printer, you will likely need a small computer in between. If both can talk over serial ports, this can be really tiny: a PIC, an AVR (including packaged versions such as the Arduino), etc. If at least one needs USB, you may be better off with a bare-metal ARM board. This will range from $2 at the low end to maybe $70 at the high end. (There are also a few ARM boards that run an embedded .NET framework, if that is your background; how well they run it I'm not sure.)
If you need to do formatting, or the printer depends on the computer to do a lot of the work, or your engineers aren't familiar with the mindset of tiny embedded systems, you probably want something capable of running an operating system, i.e., a faster ARM chip with hundreds of megabytes of memory: think BeagleBoard, plug computer, Chumby Hacker Board, etc., or one of those micro-servers that are basically x86 netbooks refactored for better cooling. Depending on how careful you are, this puts you anywhere from $50 to $250.
You could also use an Android phone (pick one with known USB host capability) and fold in the GSM capability, but you may spend a lot of time tripping over the Android components when all you really want is embedded Linux with a full libc. A more "Linux-y" Linux smartphone might be preferable, if you can find one that you expect will remain available.