Embedding Orocos-RTT on the Parrot AR.Drone

Hello,

my lab recently bought an AR.Drone and I would be interested in
developing some custom control modules within the Orocos framework
(RTT, at the bare minimum, and possibly others). I do know there are
already several interesting projects around, e.g., itasc_ardrone[1]
and ardrone_autonomy[2]. However, these projects are meant to run
the control on powerful (PC-like) "ground stations": data (IMU,
cameras, etc.) are read from the drone over a wireless link, the
control action is computed on the ground station, and the actuator
commands are finally sent back to the drone (again over wireless). I
would rather be interested in embedding the RTT middleware (possibly
the "small footprint" version of RTT[3]) directly on the AR.Drone,
similarly to what has been done in [4] with Urbi, together with the
control modules (for, e.g., autonomous set-point following with
obstacle detection).
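
To give a more concrete idea of what I have in mind, below is a rough
sketch of how an on-board control component could look with the
standard RTT 2.x C++ API. The port names, the Setpoint/ActuatorCommand
types and the (empty) control law are placeholders of mine, not taken
from any of the projects above.

// Sketch of an on-board RTT control component (RTT 2.x API); the
// Setpoint and ActuatorCommand types are hypothetical placeholders.
#include <rtt/TaskContext.hpp>
#include <rtt/Port.hpp>
#include <rtt/Component.hpp>

struct Setpoint { double x, y, z, yaw; };      // desired pose (placeholder)
struct ActuatorCommand { double motor[4]; };   // motor commands (placeholder)

class DroneController : public RTT::TaskContext {
public:
  explicit DroneController(const std::string& name)
    : RTT::TaskContext(name),
      setpoint_in("setpoint_in"),
      command_out("command_out")
  {
    // Data-flow ports through which the component talks to the rest of
    // the on-board application (sensor acquisition, motor driver, ...).
    ports()->addPort(setpoint_in).doc("Desired pose set-point");
    ports()->addPort(command_out).doc("Motor commands for the drone firmware");
  }

  // Called periodically by the component's activity.
  void updateHook()
  {
    Setpoint sp;
    if (setpoint_in.read(sp) == RTT::NewData) {
      ActuatorCommand cmd = ActuatorCommand();
      // ... control law (set-point tracking, obstacle avoidance) ...
      command_out.write(cmd);
    }
  }

private:
  RTT::InputPort<Setpoint>         setpoint_in;
  RTT::OutputPort<ActuatorCommand> command_out;
};

ORO_CREATE_COMPONENT(DroneController)

Such a component would then run on the drone's ARM/Linux board,
attached to a periodic activity, next to separate components wrapping
the sensors and Parrot's motor interface.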

I would like to have some feedback from you on this envisioned set-up.
Do you think it is feasible (with reasonable effort)? How difficult
would it be to implement the RTT OS abstraction for the AR.Drone system?
Would the Orocos community be interested in having this developed?

Thanks in advance.

[1] http://code.metager.de/source/xref/Orocos/iTaSC/robots_objects/itasc_ard...
[2] https://github.com/AutonomyLab/ardrone_autonomy
[3] http://www.orocos.org/stable/documentation/rtt/v2.x/doc-xml/orocos-compo...
[4] http://www.psykokwak.com/blog/index.php/2012/05/18/60-ar-drone-et-urbi-e...

--
Matteo

Embedding Orocos-RTT on the Parrot AR.Drone

hi,

On 06/05/2014 04:49 PM, Matteo Morelli wrote:
> Hello,
>
> my lab recently bought an AR.Drone and I would be interested in
> developing some custom control modules within the Orocos framework
> (RTT, at the bare minimum, and possibly others). I do know there are
> already several interesting projects around, e.g., itasc_ardrone[1]
itasc_ardrone just provides a kinematic model of the AR.Drone for use within the iTaSC framework

> and ardrone_autonomy[2]. However, these projects are meant to run
> the control on powerful (PC-like) "ground stations": data (IMU,
> cameras, etc.) are read from the drone over a wireless link, the
> control action is computed on the ground station, and the actuator
> commands are finally sent back to the drone (again over wireless). I
> would rather be interested in embedding the RTT middleware (possibly
> the "small footprint" version of RTT[3]) directly on the AR.Drone,
> similarly to what has been done in [4] with Urbi, together with the
> control modules (for, e.g., autonomous set-point following with
> obstacle detection).
>
> I would like to have some feedback from you on this envisioned set-up.
> Do you think it is feasible (with reasonable effort)? How difficult
> would it be to implement the RTT OS abstraction for the AR.Drone system?
> Would the Orocos community be interested in having this developed?

An AR.Drone is, in the end, an ARM processor with Linux running on it,
so this guide could help you:
http://sagar.se/yet-another-orocos-crosscompile-guide.html
(the link has changed with respect to the one mentioned on the Orocos page)

We have used the platform in our lab for student projects, but until
now only with communication to an off-board ground station, as you
mentioned. So yes, we are interested, but since it is not a platform
we use for research, I don't think we can assign many resources here.

hth,

Nick
