Adding nsh, file transfer, joystick control

Hi!

Are there any plans to add the following features?

  1. By joystick control I mean the ability to issue control commands (throttle etc.) the way QGC can, controlling the drone with on-screen touch buttons or a gaming controller. This could be useful in various testing scenarios where you don’t want to connect a real RC controller but instead issue control commands programmatically.

  2. The NuttX shell is another feature implemented in QGC which would be good to have available via the SDK.

  3. File transfer could be used to download logs. Another interesting use case (if this is even possible) would be uploading new mixer files or startup scripts, because right now that can only be done by reflashing the firmware (or removing the SD card and editing it manually).

To point 3:
look under Widgets
jakob

Thanks for the suggestions.

  1. The goal is to have offboard control which works well and allows you to hook up your buttons/game controller/joystick/whatnot. However, the implementation for reading e.g. the joystick values would not be part of the Dronecode SDK library but something that you implement in your binary. That being said, it would be nice to have an example which does exactly that as part of the repository so it’s easy to get started (a rough sketch follows after this list).

  2. Implementing the nsh would be possible. However, in my opinion, this wouldn’t be an essential part of the SDK but rather an additional plugin which can be enabled if required. The SDK tries to make things easy but also to limit you so you can’t do anything stupid, and of course with the nsh there are no limits :smile:.

  3. Yes, that’s actually something that we’ve added: https://github.com/Dronecode/DronecodeSDK/tree/develop/plugins/log_files. However, it only supports log download and not the generic MAVLink FTP which you would require (rough sketch of the download flow below).
    Again, for setting mixer files and startup scripts, I think that’s more of a setup feature and less something that the SDK necessarily should be concerned with. However, if you make a pull request with a separate plugin which does that, there is no good reason not to add it.
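
For the log download part, a minimal, untested sketch of how that plugin could be used; the connection setup follows the SDK examples, while the LogFiles calls (get_entries, download_log_file) and the Entry fields are assumptions to be checked against the plugin header linked above:

```cpp
// Rough, untested sketch -- the LogFiles method names and Entry fields are
// assumptions, check them against the log_files plugin header linked above.
#include <dronecode_sdk/dronecode_sdk.h>
#include <dronecode_sdk/plugins/log_files/log_files.h>

#include <iostream>
#include <memory>
#include <string>

using namespace dronecode_sdk;

int main()
{
    DronecodeSDK dc;
    dc.add_any_connection("udp://:14540"); // adjust to your setup

    // NOTE: in a real program, wait for a system to be discovered first.
    auto log_files = std::make_shared<LogFiles>(dc.system());

    // Assumed API: list the log entries on the vehicle, then download each one.
    const auto entries_result = log_files->get_entries();
    for (const auto &entry : entries_result.second) {
        std::cout << "Downloading log " << entry.id << std::endl;
        log_files->download_log_file(entry, "log_" + std::to_string(entry.id) + ".ulg");
    }

    return 0;
}
```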
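
And going back to point 1, the example I have in mind could look roughly like the sketch below: your binary reads its own input source (game controller, touch buttons, a test script) and feeds it to the offboard plugin as velocity setpoints. read_my_joystick() is a placeholder you would implement yourself, and the include paths and struct fields should be double-checked against the offboard plugin header; this is a sketch to get the idea across, not tested code:

```cpp
#include <dronecode_sdk/dronecode_sdk.h>
#include <dronecode_sdk/plugins/action/action.h>
#include <dronecode_sdk/plugins/offboard/offboard.h>

#include <chrono>
#include <memory>
#include <thread>

using namespace dronecode_sdk;

// Placeholder: replace with your game controller / on-screen button input.
struct StickInput { float roll; float pitch; float throttle; float yaw; };
static StickInput read_my_joystick() { return {0.0f, 0.0f, 0.0f, 0.0f}; }

int main()
{
    DronecodeSDK dc;
    dc.add_any_connection("udp://:14540"); // adjust to your setup

    // NOTE: in a real program, wait for system discovery and health checks.
    System &system = dc.system();
    auto action = std::make_shared<Action>(system);
    auto offboard = std::make_shared<Offboard>(system);

    action->arm();

    // Offboard mode needs a setpoint before it can be started.
    offboard->set_velocity_body({0.0f, 0.0f, 0.0f, 0.0f});
    offboard->start();

    while (true) {
        const StickInput in = read_my_joystick();

        // Map stick positions to body-frame velocities (m/s) and yaw rate (deg/s).
        Offboard::VelocityBodyYawspeed setpoint{};
        setpoint.forward_m_s = in.pitch * 2.0f;
        setpoint.right_m_s = in.roll * 2.0f;
        setpoint.down_m_s = -in.throttle * 1.0f;
        setpoint.yawspeed_deg_s = in.yaw * 30.0f;
        offboard->set_velocity_body(setpoint);

        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
}
```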


Thanks for the quick reply @JulianOes!

As for #1, I’m not talking about implementing a joystick driver in the SDK, but about providing an interface that lets you send control commands so you can implement such a joystick in your app, or do testing on rigs by sending commands for thrust and roll/pitch rate or position. So the following scenario would be possible:

  1. I set the drone to Manual mode, which is not yet possible with the SDK I believe (Action is probably the best place to implement such a thing)…
  2. I send thrust commands in a programmatically controlled way, with corresponding precision, to validate the drone’s thrust on the rig.
  3. I send pitch/roll commands or the corresponding rate commands to validate control and to tune parameters on the rig, again in a programmatically controlled way and with corresponding precision.

As for the other things like nsh or FTP, how about implementing everything QGC has and then building QGC on top of the SDK? Would it be easier to maintain and develop both the SDK and QGC that way? Right now you have to implement some features in the SDK and QGC separately.

As for doing stupid things, that’s a funny argument :) I believe somebody using your SDK is a more responsible person and will do fewer stupid things than somebody who knows nothing about programming but can fire up QGC and do whatever it lets them do.

Apologies, I did not realize the question was about the SDK. I meant Widgets under QGC.

As for #1, I’m not talking about implementing a joystick driver in the SDK, but about providing an interface that lets you send control commands so you can implement such a joystick in your app.

Right, makes sense.

  1. I set the drone to Manual mode, which is not yet possible with the SDK I believe (Action is probably the best place to implement such a thing)…

Manual mode is not implemented yet because my goal is for the SDK to be about autonomous flight, which is the future, rather than supporting “old-school” manual control. That’s why it currently only supports position and mission control, so you need either GPS or something like optical flow.
The idea is that you develop and tune your airframe with manual controls. Once the airframe is set up, tuned correctly, and works for autonomous flight, that’s when you can start automating things and planning missions using the SDK, etc…

If this view is not shared by the SDK users, I’m open to changing this and supporting manual control as well; it’s just that I would really like to push autonomy.

  1. I send thrust commands in a programmatically controlled way, with corresponding precision, to validate the drone’s thrust on the rig.

So this would be more like a motor test rig?

  1. I send pitch/roll commands or the corresponding rate commands to validate control and to tune parameters on the rig, again in a programmatically controlled way and with corresponding precision.

OK, I can see how the SDK would be handy for automating these things. Best would probably be a separate plugin that allows these sorts of control/motor tests.
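
Just to make that concrete, a purely hypothetical, header-style sketch of what such a plugin’s interface could look like; none of this exists in the SDK today, and every name and signature is made up for discussion:

```cpp
// Hypothetical "MotorTest" plugin sketch -- nothing like this exists in the
// SDK yet; every name and signature here is made up for discussion.
namespace dronecode_sdk {

class System;

class MotorTest {
public:
    explicit MotorTest(System &system);

    enum class Result { SUCCESS, BUSY, COMMAND_DENIED, TIMEOUT };

    // Spin a single motor at a normalized throttle (0.0 .. 1.0) for a duration.
    Result run_motor(unsigned motor_index, float throttle, float duration_s);

    // Spin all motors at the same throttle, e.g. for thrust-stand sweeps.
    Result run_all_motors(float throttle, float duration_s);

    // Stop all motors immediately.
    Result stop();
};

} // namespace dronecode_sdk
```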

As for the other things like nsh or FTP, how about implementing everything QGC has and then building QGC on top of the SDK? Would it be easier to maintain and develop both the SDK and QGC that way? Right now you have to implement some features in the SDK and QGC separately.

I had that idea too at some point but there are a few reasons against it:

  1. QGC already works well, so why would you change it for the sake of changing it? Building QGC on top of the SDK would be a full-time job for two people for a year if everything goes well (or at least that’s my wild guess). Also, QGC uses Qt for the cross-platform things while the SDK does not, so that would be something to keep in mind.
  2. It’s not a bad thing to have two implementations of the MAVLink functionality. We’re more likely to find errors in the protocol implementation this way.
  3. Maintenance and development can be quicker if code is shared between components, but at the same time you can end up in a big dependency nightmare and it actually becomes much harder to change anything. It’s a trade-off at the end of the day.

As for doing stupid things, that’s a funny argument :) I believe somebody using your SDK is a more responsible person and will do fewer stupid things than somebody who knows nothing about programming but can fire up QGC and do whatever it lets them do.

That’s probably true, but the goal should still be to try to guide the user not to make bad choices. One way is to make it clear what the normal use case is and what the “expert backdoors” are, such as the param_raw interface or a motor/tuning test plugin.

Thanks for the detailed explanation!

Still, there are cases where Manual mode could be useful for autonomous flight if the drone is controlled by an onboard PC via the SDK, where the onboard PC implements some advanced position control loops while the Pixhawk does attitude control. But then RTPS could be a better choice in terms of latency, I believe.

As for rig testing, there is currently no way to programmatically control thrust and pitch/roll except writing a MAVLink application from scratch, or maybe using the Udacidrone API, which has cmd_attitude and cmd_attitude_rate commands.

I also want to add some arguments in favour of having nsh exposed via the SDK. Suppose you write your own onboard application: it gets some arguments from the shell, does some work, and gives some output. With nsh available via the SDK you could easily interface with your app from an onboard or remote PC. As I’m not an expert in the Dronecode ecosystem, perhaps there are other ways to interface with a custom onboard application; I would appreciate it if you could name some.

Sure, thanks for bringing up your thoughts!

For control using a companion computer we have the offboard plugin which should enable you to do exactly that:
https://sdk.dronecode.org/en/guide/offboard.html
https://github.com/Dronecode/DronecodeSDK/blob/develop/plugins/offboard/include/plugins/offboard/offboard.h#L142-L154
We might want to add attitude or attitude rate control there if required.
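
To illustrate what that could look like, attitude and attitude rate setpoints might simply mirror the existing set_velocity_ned()/set_velocity_body() calls, roughly as sketched below. This is hypothetical; none of it is in the plugin today:

```cpp
// Hypothetical additions to the Offboard plugin -- set_velocity_ned() and
// set_velocity_body() exist today, the attitude calls below do not (yet).

struct Attitude {
    float roll_deg;
    float pitch_deg;
    float yaw_deg;
    float thrust; // normalized 0.0 .. 1.0, as you would need for a test rig
};

struct AttitudeRate {
    float roll_deg_s;
    float pitch_deg_s;
    float yaw_deg_s;
    float thrust; // normalized 0.0 .. 1.0
};

// These would sit next to the velocity setters in offboard.h and map to the
// MAVLink SET_ATTITUDE_TARGET message under the hood:
// void set_attitude(Attitude attitude);
// void set_attitude_rate(AttitudeRate attitude_rate);
```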

I didn’t know the Udacidrone API but it looks like it’s a simple API built on top of pymavlink.

I also want to add some arguments in favour of having nsh exposed via the SDK. Suppose you write your own onboard application: it gets some arguments from the shell, does some work, and gives some output. With nsh available via the SDK you could easily interface with your app from an onboard or remote PC. As I’m not an expert in the Dronecode ecosystem, perhaps there are other ways to interface with a custom onboard application; I would appreciate it if you could name some.

In my opinion, the nsh is more of a tool for autopilot firmware development. The application on the companion computer would then communicate with the autopilot via MAVLink, not via nsh but via specified messages.
Or maybe I misunderstood your question?

We might want to add attitude or attitude rate control there if required.

That’s exactly what I’m asking for and what I did not find in the Offboard plugin. Not only attitude is required but thrust as well.

I didn’t know the Udacidrone API but it looks like it’s a simple API built on top of pymavlink.

That’s correct. Still, it has useful features that the official SDK would benefit from adding.

The application on the companion computer would then communicate with the autopilot via MAVLink, not via nsh

I believe this requires adding some custom messages to MAVLink. I have no experience writing MAVLink applications yet.

I believe this requires adding some custom messages to MAVLink. I have no experience writing MAVLink applications yet.

What’s your specific use case?

One use case is testing.

So imagine a drone on a rig equipped with sensors (strain gauges, vibration sensors, etc.). Then I want to test different scenarios and measure thrust/torques under different conditions, changing the RPM of each of its many motors individually and of all of them at once.

There is an onboard app, pwm test, for doing some of that with PWM ESCs, and another app, motor_test, for dealing with UAVCAN ESCs. Both are quite limited and controlled via nsh. So my plan could be to write my own onboard app to allow for more detailed testing and to control the process programmatically, collecting drone telemetry and rig sensor data.
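
The skeleton of such an onboard app would be small. A rough sketch, loosely following the px4_simple_app example in the PX4 Firmware tree; the include paths and module boilerplate depend on the PX4 version, so treat this as illustrative only:

```cpp
// Rough sketch of a custom onboard PX4 app callable from nsh -- include paths
// and module registration boilerplate depend on the PX4 version in use.
#include <px4_config.h>
#include <px4_log.h>

#include <cstdlib>

extern "C" __EXPORT int rig_test_main(int argc, char *argv[]);

int rig_test_main(int argc, char *argv[])
{
    if (argc < 2) {
        PX4_INFO("usage: rig_test <throttle_percent>");
        return 1;
    }

    const int throttle = std::atoi(argv[1]);
    PX4_INFO("running rig test at %d%% throttle", throttle);

    // ... command the actuators, sample the rig sensors, print results ...

    return 0;
}
```

You would then call it from nsh (e.g. rig_test 50), and with nsh exposed via the SDK the same call could come from a remote PC.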

So my plan could be to write my own onboard app to allow for more detailed testing and to control the process programmatically, collecting drone telemetry and rig sensor data.

OK, I’m convinced, that would be a handy plugin.