
Visual Navigation for Autonomous Flying Robots

Introduction
In recent years, flying robots such as autonomous quadcopters have gained increasing interest in robotics and computer vision research. To navigate safely, these robots need the ability to localize themselves autonomously using their onboard sensors. Potential applications of such systems include the automatic 3D reconstruction of buildings, inspection and simple maintenance tasks, surveillance of public places, and search and rescue missions.
In this project, I am studying and applying the current techniques for 3D localization, mapping and navigation that are suitable for quadcopters, and trying to come up with newer and better algorithms than the existing ones. This project will be based on the following topics:
Necessary background on robot hardware, sensors, 3D transformations
Motion estimation from images (including interest point detection, feature descriptors, robust estimation, visual odometry, iterative closest point; see the sketch after this list)
Filtering techniques and data fusion
Non-linear minimization, bundle adjustment, place recognition, 3D reconstruction
Autonomous navigation, path planning, exploration of unknown environments
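
To make the motion-estimation topics concrete, here is a minimal OpenCV (C++) sketch of two-frame motion estimation: ORB interest points and descriptors, brute-force matching, and robust (RANSAC) essential-matrix estimation followed by pose recovery. The image file names and the focal length/principal point are placeholder values, not calibrated ones.

#include <iostream>
#include <vector>
#include <opencv2/opencv.hpp>

int main() {
    cv::Mat img1 = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);
    cv::Mat img2 = cv::imread("frame2.png", cv::IMREAD_GRAYSCALE);

    // Interest point detection + feature descriptors (ORB)
    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);
    std::vector<cv::KeyPoint> kp1, kp2;
    cv::Mat des1, des2;
    orb->detectAndCompute(img1, cv::noArray(), kp1, des1);
    orb->detectAndCompute(img2, cv::noArray(), kp2, des2);

    // Descriptor matching (Hamming distance, cross-check enabled)
    cv::BFMatcher matcher(cv::NORM_HAMMING, true);
    std::vector<cv::DMatch> matches;
    matcher.match(des1, des2, matches);

    std::vector<cv::Point2f> pts1, pts2;
    for (const cv::DMatch& m : matches) {
        pts1.push_back(kp1[m.queryIdx].pt);
        pts2.push_back(kp2[m.trainIdx].pt);
    }

    // Robust estimation: essential matrix with RANSAC, then relative pose
    double focal = 525.0;                  // assumed focal length [px]
    cv::Point2d pp(319.5, 239.5);          // assumed principal point
    cv::Mat mask, R, t;
    cv::Mat E = cv::findEssentialMat(pts1, pts2, focal, pp, cv::RANSAC, 0.999, 1.0, mask);
    cv::recoverPose(E, pts1, pts2, R, t, focal, pp, mask);

    std::cout << "R = " << R << "\nt = " << t << std::endl;
    return 0;
}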

Deliverables:
By the end of my Hons. project, the following deliverables will be developed:
SLAM module: The following SLAM implementations will be made:
Large-Scale Direct Monocular SLAM (LSD-SLAM): In the first phase of my project, I will be implementing LSD-SLAM in OpenCV by integrating the lsd_slam_core library. After the core library is implemented, the openFABMap package will be used for detecting loop closures. Finally, for map visualization, PCL (Point Cloud Library) will be integrated with the module.
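
As a rough idea of the planned PCL integration, here is a minimal C++ sketch that simply hands the accumulated keyframe point cloud to PCL's CloudViewer; it assumes the SLAM back-end can expose its map as a pcl::PointXYZRGB cloud.

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/visualization/cloud_viewer.h>

void visualizeMap(const pcl::PointCloud<pcl::PointXYZRGB>::Ptr& map_cloud) {
    pcl::visualization::CloudViewer viewer("LSD-SLAM semi-dense map");
    viewer.showCloud(map_cloud);
    while (!viewer.wasStopped()) {
        // block until the viewer window is closed
    }
}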

Dense Visual SLAM for RGB-D Cameras: After the mid evaluation, I will implement SLAM for RGB-D cameras. An entropy-based similarity measure for keyframe selection and loop closure detection will be included. The calib3d module in OpenCV will be used for camera calibration, 3D reconstruction and for finding the camera intrinsics when porting the dvo_slam library to OpenCV.
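
To illustrate the role of the camera intrinsics in the 3D reconstruction step, here is a minimal C++ sketch that back-projects an RGB-D depth image into one 3D point per pixel using the pinhole model; the fx, fy, cx, cy values are placeholders for a Kinect-style sensor, not calibrated values.

#include <opencv2/opencv.hpp>

// depth: CV_32F depth image in metres; returns a CV_32FC3 image of 3D points.
cv::Mat backproject(const cv::Mat& depth) {
    const double fx = 525.0, fy = 525.0, cx = 319.5, cy = 239.5;  // assumed intrinsics
    cv::Mat points(depth.size(), CV_32FC3);
    for (int v = 0; v < depth.rows; ++v) {
        for (int u = 0; u < depth.cols; ++u) {
            float z = depth.at<float>(v, u);
            points.at<cv::Vec3f>(v, u) =
                cv::Vec3f((u - cx) * z / fx, (v - cy) * z / fy, z);
        }
    }
    return points;
}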

Visual odometry module: Once the SLAM system is built, semi-dense and dense visual odometry for monocular and RGB-D cameras, respectively, will be implemented. Config APIs will be developed to set the data rate and precision of the local x, y, z coordinates. Finally, the local visual odometry will be fused with the global odometry estimates from GPS (latitude, longitude). An altimeter will be used for fusing the z coordinate.
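
A minimal sketch of what such a fusion step could look like: the GPS fix is converted to local metres with a flat-earth approximation and blended with the visual-odometry estimate by a simple complementary weighting, with the altimeter supplying z. The weights are purely illustrative; a Kalman filter would be the more principled choice.

#include <cmath>

struct Vec3 { double x, y, z; };

// Convert a GPS fix (degrees) to metres relative to an origin fix,
// using a flat-earth approximation that is adequate over short distances.
Vec3 gpsToLocal(double lat, double lon, double lat0, double lon0) {
    const double R = 6371000.0;                   // mean Earth radius [m]
    const double d2r = 3.14159265358979 / 180.0;  // degrees to radians
    double x = (lon - lon0) * d2r * R * std::cos(lat0 * d2r);
    double y = (lat - lat0) * d2r * R;
    return {x, y, 0.0};
}

// Complementary blend: trust visual odometry locally, GPS/altimeter globally.
Vec3 fuse(const Vec3& vo, const Vec3& gps, double altimeter_z,
          double w_gps = 0.02, double w_alt = 0.05) {
    return {(1 - w_gps) * vo.x + w_gps * gps.x,
            (1 - w_gps) * vo.y + w_gps * gps.y,
            (1 - w_alt) * vo.z + w_alt * altimeter_z};
}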

Tracking module: The idea is to develop a robust optical flow tracker which runs in real time using a downward-facing camera. Currently, to compute the planar motion of a quadrocopter, users are restricted to ADNS-family optical flow sensors, which are generally used as mouse sensors. With this module in OpenCV, users will be able to use any generic camera module to compute coarse motion vectors and in turn compute the 2D velocity of the quadrocopter with much greater accuracy and efficiency. Once the planar velocities are computed, they can be used for stabilizing the quadrocopter while hovering.
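
A minimal sketch of this idea, using OpenCV's Farneback dense optical flow (one possible choice) on consecutive grayscale frames from the downward-facing camera; converting the mean flow into metres per second assumes the height above ground and the camera focal length in pixels are known.

#include <opencv2/opencv.hpp>

cv::Point2f planarVelocity(const cv::Mat& prev_gray, const cv::Mat& curr_gray,
                           double height_m, double focal_px, double dt_s) {
    cv::Mat flow;
    cv::calcOpticalFlowFarneback(prev_gray, curr_gray, flow,
                                 0.5, 3, 15, 3, 5, 1.2, 0);
    cv::Scalar mean_flow = cv::mean(flow);   // average (dx, dy) in pixels/frame
    // pixels/frame -> metres/second using the pinhole model at known height
    double vx = mean_flow[0] * height_m / (focal_px * dt_s);
    double vy = mean_flow[1] * height_m / (focal_px * dt_s);
    return cv::Point2f(static_cast<float>(vx), static_cast<float>(vy));
}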

Navigation module: This module is essentially meant to make the quadcopter robust by compensating for the time delays caused by the latency of sensor readings, computational processing, sending/receiving of commands, etc. It will be an OpenCV implementation of the tum_ardrone ROS package developed for robust state estimation of the Parrot AR.Drone. Using the monocular SLAM system based on PTAM (Parallel Tracking and Mapping), we rotate the visual map such that the xy-plane corresponds to the horizontal plane according to the accelerometer data, and scale it such that the average keypoint depth is 1. Next, we use the pose estimates from an EKF (Extended Kalman Filter) to identify and reject falsely tracked frames. Finally, to steer the quadcopter to a desired location, PID control is used.
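
For the final step, here is a minimal sketch of the PID control law that would steer each axis towards its setpoint; the gains shown are illustrative, not tuned values.

struct PID {
    double kp, ki, kd;           // proportional, integral, derivative gains
    double integral, prev_error; // controller state

    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// Example: one controller per axis, fed with the EKF pose estimate.
// PID x_ctrl = {0.5, 0.0, 0.3, 0.0, 0.0};   // illustrative gains, zeroed state
// double pitch_cmd = x_ctrl.update(target_x, ekf_x, 0.02);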

Obstacle avoidance module: In this approach, collision avoidance, traditionally considered a high-level planning problem, is effectively distributed between different levels of control, allowing real-time robot operation in a complex environment. The manipulator control problem is reformulated as direct control of manipulator motion in operational space (the space in which the task is originally described) rather than as control of the task's corresponding joint-space motion obtained only after geometric and kinematic transformation. Using visual sensing, real-time collision avoidance with moving obstacles has been demonstrated.
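
One common way to realize this kind of control-level avoidance in operational space is an artificial potential field, in which obstacles exert a repulsive force inside an influence radius. The sketch below is a generic version of that idea, not necessarily the exact formulation used in this module.

struct Vec3 { double x, y, z; };

// rho: distance to the nearest obstacle; away: unit vector pointing away from it;
// eta (gain) and rho0 (influence radius) are tuning parameters.
Vec3 repulsiveForce(double rho, const Vec3& away, double eta, double rho0) {
    if (rho >= rho0 || rho <= 0.0) return {0.0, 0.0, 0.0};   // outside influence zone
    double mag = eta * (1.0 / rho - 1.0 / rho0) / (rho * rho);
    return {mag * away.x, mag * away.y, mag * away.z};
}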

API Support for Beaglebone Blue

Hello everyone!!

Welcome to the blog for the project "API support for Beaglebone Blue". I successfully finished this project as part of Google Summer of Code 2016. On this blog you will find all the recent, ongoing and future developments pertaining to this project. Here are some links which will get you started in knowing more about the project:

Description:

The aim of the project is to create easy-to-use APIs for the hardware on the BB Blue. This consists of developing/improving kernel drivers for the on-board devices and then reimplementing the Strawson APIs to use these kernel drivers. These APIs can then be used by real-time applications running on the BB Blue. This project will widely help the robotics industry, as there is currently no other low-cost open-source board which offers the flexibility and performance of the Beaglebone Black/Blue. The four most important parts of this project are:

a. Enabling and developing kernel drivers: In this part, I developed and verified the kernel drivers so that they work correctly on the target kernel version, i.e. 4.4.12
i. Mostly from: https://github.com/kiran4399/beagleboard_kernel
ii. MPU-9250 Kernel driver: https://github.com/kiran4399/inv_mpu

b. Implementing Servo Kernel driver and PRU firmware: https://github.com/kiran4399/bbb_pru_firmware

c. Creating device tree base and overlay files for Beaglebone Blue: https://github.com/kiran4399/bb_blue_api/tree/master/install_files/2016-05-01

d. Developing Beaglebone Blue APIs: https://github.com/kiran4399/bb_blue_api

The APIs act as an abstraction layer that allows C programs to access the hardware present on the Beaglebone Black. All you need to do is install the APIs on your Beaglebone board, include the bb_blue_api.h header, i.e. #include <bb_blue_api.h>, in your source code, and use the APIs as mentioned here. When you are done writing the code, you can compile it by linking the bb_blue_api.so shared library or create a Makefile like the one mentioned here.
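
As a rough sketch of what a user program could look like (the function names initialize_board(), set_led() and cleanup_board(), the GREEN constant, and the build command are illustrative placeholders; see the Documentation link below for the actual API):

#include <stdio.h>
#include <bb_blue_api.h>

int main(void) {
    if (initialize_board() < 0) {          /* hypothetical init call */
        fprintf(stderr, "failed to initialize the board\n");
        return -1;
    }
    set_led(GREEN, 1);                     /* hypothetical LED helper */
    printf("board initialized, green LED on\n");
    cleanup_board();                       /* hypothetical shutdown call */
    return 0;
}

/* Build by linking the shared library, e.g. (assuming it is installed as
   libbb_blue_api.so): gcc blink.c -o blink -lbb_blue_api */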

A short note about the links mentioned below: I'll be releasing a new version once a month even after the project ends. Developers are free to fork the project (source code link below) and submit pull requests; I'd be happy to merge them into the mainline. The blog will cover the recent developments pertaining to the APIs and details on how to get started. The wiki page contains the details of each individual piece of hardware on which the APIs are built. If you find any bugs, please feel free to create an issue in the issue tracker mentioned below. Finally, all the intricate details on how to use a specific API are given in the documentation.

Links !!

Releases:
v1.0

Source Code: https://github.com/kiran4399/bb_blue_api

PRU Servo Firmware and Driver: https://github.com/kiran4399/bbb_pru_firmware

Wiki: https://github.com/kiran4399/bb_blue_api/wiki

Documentation: https://github.com/kiran4399/bb_blue_api#library-functions

Issue Tracker: https://github.com/kiran4399/bb_blue_api/issues

Commit History: https://github.com/kiran4399/bb_blue_api/commits/master

What I accomplished:

I finished the coding, testing and documentation of the v1.0 Beaglebone APIs, which deal with the following hardware:

1. LED and Button
2. MPU-9250
3. BMP-280
4. Analog voltage signals
5. DC motor control using PWM
6. Servo and ESC
7. Quadrature Encoders
8. UART, SPI and I2C
9. DSM2

I am yet to document the v2.0 APIs which are:

1. CPU frequency control
2. Vector and Quaternion Math library
3. Discrete SISO filters

Future Improvements:

This project has a lot of scope for adding new functionality. Some features which I couldn't incorporate into this project due to lack of time are as follows:

1. Adding ROS support for the APIs.
2. Porting Beaglebone Blue APIs as HAL to Ardupilot
3. Adding DMP support to the MPU-9250 kernel driver for Linux 4.4.x
4. Writing PRU firmware for PRU-1 which allows it to act as a 4th encoder (the Beaglebone Black/Blue has only 3 encoder inputs).
5. Adding these APIs to Cloud9 for the Beaglebone Blue, which enables Node.js to run hardware-specific JS code in the web browser.
6. Writing Python or JavaScript wrappers for the C APIs.
7. Testing and Documentation for v2.0 APIs
8. Creating documentation for the APIs in Doxygen.

My Experience:

It was truly a great experience doing a GSoC-2016 project for Beagleboard.org. Frankly speaking, when I look at some of the archived material which I submitted as proposals for GSoC-2015 and GSoC-2016, I can clearly see how much I've advanced in terms of programming and understanding the subtle aspects of open-source development. From using git clone or git pull to git commit or git push, I learnt everything during this time. Some of the project-related things I learnt were:

1. Kernel module development (spent 50% of my time on this, had a difficult time, but learnt a lot!!)
2. Writing a proper Makefile for a project.
3. Creating char/sysfs devices in the Linux kernel
4. Better understanding of all the hardware modules on the Beaglebone Black
5. Better usage of the Linux kernel workflow in terms of patching, generating pull requests and other git operations.
6. Last but not least, I learnt a lot just from reading and understanding other developers' code.

It might just be my luck, but my project has truly helped me even in my undergraduate Hons. project at my school. I am currently working on "Visual Navigation for Flying Robots" and have built a quadrocopter model which is used as a hardware platform to test my programs. What's more, I am using the APIs on the Beaglebone Blue (currently a Beaglebone Black + Strawson cape) as the flight computer for my drone. I proudly present my drone: Blue-copter!!! 😛

Acknowledgments:
Finally, I'd like to say something which might seem a little informal in a final report, but I feel it is very important. Seriously, I wouldn't have been able to finish this project at all without the support of my mentors, especially Alexander Hiam and Michael Welling, who took a lot of time out of their daily routine and patiently answered all my doubts and questions, even though some of them were silly and trivial. I also thank all the other Beagleboard.org mentors and my friends who participated in GSoC 2016 for Beagleboard.org for guiding and supporting me. I really owe this org one and look forward to participating again in GSoC 2017.

Hons. Project Timeline

Jan 23 – Jan 30 Week-27
Achieved: LSD-SLAM implementation (Part 1)
Description: Developing an IO wrapper for the lsd_slam_core implementation in OpenCV. The core library implements the motion estimation for the quadrocopter.

Feb 1 – Feb 6 Week-28
Achieved: LSD-SLAM implementation (Part 2)
Description: Once the core library is successfully integrated, openFABMAP will be integrated into the system for detecting loop closures. For the graph optimization problem, the g2o framework will be used after integrating it with OpenCV.

Feb 7 – Feb 14 Week-29
Achieved: Developing the lsd_slam_visualization submodule
Description: PCL (Point Cloud Library) will be used to provide 3D visualization of the map generated by the SLAM system. It must be noted that, since a quadrocopter has limited computational resources in terms of memory and processing speed, a library must be used which requires a minimal amount of onboard resources.

Feb 15 – Feb 29 Week-30,31
Achieved: Unit testing LSD-SLAM
Description: LSD-SLAM will be tested both on publicly available benchmark sequences, such as the ones developed by the University of Freiburg, and live using the monocular camera onboard the quadrocopter. The functional aspects of the SLAM module will be documented in Doxygen.

March 1 – March 7 Week-32
Achieved: Dense Visual SLAM for RGB-D camera
Description: With the freenect and OpenNI libraries as an IO wrapper, I will be integrating the dvo_core library with OpenCV. Once the core library is ported to OpenCV, a visualization module will be developed using the PCL visualizer to represent the dense 3D map.

March 8 – March 15 Week-33
Achieved: Visual odometry and Sensor model integration
Description: After implementing all the SLAM systems, the sensor model from the IMU will be integrated.

March 16 – March 23 Week-34
Achieved: Configuring Dense optical flow
Description: Dense optical flow will be used to implement a motion-sensing module with the Raspberry Pi camera. The calcOpticalFlowSF() API will be used.

March 24 – March 31 Week-35
Achieved: Unit Testing for visual odometry and tracking module
Description: The respective modules will be tested with the standard benchmarks available. For monocular visual odometry, the KITTI dataset will be used; for RGB-D dense visual odometry, the ICL-NUIM and TUM RGB-D datasets will be used.

April 1 – April 7 Week-36
Achieved: Navigation module
Description: The major task will be to develop a state-estimation module which includes an OpenCV implementation of PTAM. Then the linear Kalman filter class in OpenCV must be extended to an Extended Kalman Filter to incorporate the data from the IMU sensor. Finally, the learned 3D feature map will be visualized using the OpenGL library.
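
As a starting point, here is a minimal sketch of OpenCV's linear cv::KalmanFilter configured for a constant-velocity model (state = 3D position and velocity, measurement = 3D position); the noise values are illustrative, and the EKF extension would replace the fixed matrices with linearized Jacobians at each step.

#include <opencv2/video/tracking.hpp>

cv::KalmanFilter makeFilter(double dt) {
    cv::KalmanFilter kf(6, 3, 0, CV_32F);          // state [x y z vx vy vz], measurement [x y z]
    cv::setIdentity(kf.transitionMatrix);
    for (int i = 0; i < 3; ++i)
        kf.transitionMatrix.at<float>(i, i + 3) = static_cast<float>(dt);  // position += velocity*dt
    for (int i = 0; i < 3; ++i)
        kf.measurementMatrix.at<float>(i, i) = 1.0f;                       // measure position only
    cv::setIdentity(kf.processNoiseCov, cv::Scalar::all(1e-3));
    cv::setIdentity(kf.measurementNoiseCov, cv::Scalar::all(1e-2));
    cv::setIdentity(kf.errorCovPost, cv::Scalar::all(1.0));
    return kf;
}
// Per frame: kf.predict(); then kf.correct(measured_xyz);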

April 8 – April 15 Week-37
Achieved: Unit Testing for Navigation module
Description: The navigation module will be tested by letting the quadrocopter complete a large variety of different figures, such as a rectangle or the Haus vom Nikolaus. Documentation will be written on the usage of these APIs.

April 16 – April 23 Week-38
Achieved: Generating OpenCV Python bindings and integration testing.
Description: The experimental quadrocopter whose specifications are mentioned on the first page will be used for testing purposes. The generated Python APIs will be used in the MultiWii control program. Any remaining bugs will be addressed and usage documentation will be committed.