
Visual Navigation for Autonomous Flying Robots

Introduction
In recent years, flying robots such as autonomous quadcopters have attracted increasing interest in robotics and computer vision research. To navigate safely, these robots need the ability to localize themselves autonomously using their onboard sensors. Potential applications of such systems include automatic 3D reconstruction of buildings, inspection and simple maintenance tasks, surveillance of public places, and search and rescue.
In this project, I study and apply current techniques for 3D localization, mapping and navigation that are suitable for quadcopters, and try to improve on the existing algorithms. The project builds on the following topics:
Necessary background on robot hardware, sensors, 3D transformations
Motion estimation from images (including interest point detection, feature descriptors, robust estimation, visual odometry, iterative closest point)
Filtering techniques and data fusion
Non-linear minimization, bundle adjustment, place recognition, 3D reconstruction
Autonomous navigation, path planning, exploration of unknown environments

Deliverables:
By the end of my Hons. project, the following deliverables will be developed:
SLAM module: The following SLAM components will be built:
Large-Scale Direct Monocular SLAM (LSD-SLAM): In the first phase of my project, I will implement LSD-SLAM in OpenCV by integrating lsd_slam_core. After the core library is implemented, the openFABMap package will be used for detecting loop closures. Finally, for map visualization, PCL (Point Cloud Library) will be integrated with the module.

Dense Visual SLAM for RGB-D Cameras: After the mid-term evaluation, I will implement SLAM for RGB-D cameras. An entropy-based similarity measure for keyframe selection and loop closure detection will be included. The calib3d module in OpenCV will be used for camera calibration, 3D reconstruction and for finding the camera intrinsics when porting the dvo_slam library to OpenCV.
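
As an illustration of the calibration step, here is a minimal chessboard calibration sketch using calib3d's calibrateCamera(); the board size, square size, view count and file names are assumptions for illustration.

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    const cv::Size board(9, 6);            // inner chessboard corners (assumed)
    const float square = 0.025f;           // square edge in metres (assumed)
    std::vector<std::vector<cv::Point3f> > objectPoints;
    std::vector<std::vector<cv::Point2f> > imagePoints;

    // Reference grid of 3D corner positions on the board plane (z = 0).
    std::vector<cv::Point3f> grid;
    for (int y = 0; y < board.height; ++y)
        for (int x = 0; x < board.width; ++x)
            grid.push_back(cv::Point3f(x * square, y * square, 0.f));

    cv::Size imageSize;
    for (int i = 0; i < 20; ++i) {         // 20 calibration views (assumed)
        cv::Mat img = cv::imread(cv::format("calib_%02d.png", i), cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        imageSize = img.size();
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(img, board, corners)) {
            cv::cornerSubPix(img, corners, cv::Size(11, 11), cv::Size(-1, -1),
                cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));
            imagePoints.push_back(corners);
            objectPoints.push_back(grid);
        }
    }
    if (imagePoints.size() < 3) return -1; // not enough views to calibrate

    cv::Mat K, dist;                       // intrinsic matrix and distortion
    std::vector<cv::Mat> rvecs, tvecs;     // per-view extrinsics
    double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                     K, dist, rvecs, tvecs);
    std::cout << "RMS reprojection error: " << rms << "\nK =\n" << K << std::endl;
    return 0;
}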

Visual odometry module: Once the SLAM system is built, semi-dense and dense visual odometry will be implemented for monocular and RGB-D cameras respectively. Configuration APIs will be developed to set the data rate and precision of the local x, y, z coordinates. Finally, the local visual odometry will be fused with the global odometry estimates from GPS (latitude, longitude); an altimeter will be used for fusing the z coordinate, as in the sketch below.
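
A minimal sketch of the intended fusion, assuming a simple complementary filter and an equirectangular GPS-to-local conversion; the blend weight and example values are placeholders, and the real module would use a proper probabilistic filter.

#include <cmath>
#include <cstdio>

struct State { double x, y, z; };

// GPS degrees -> local metres relative to a fixed origin (equirectangular approx.).
State gpsToLocal(double lat, double lon, double lat0, double lon0) {
    const double R = 6371000.0;            // mean Earth radius in metres
    double dLat = (lat - lat0) * M_PI / 180.0;
    double dLon = (lon - lon0) * M_PI / 180.0;
    State s = { R * dLon * std::cos(lat0 * M_PI / 180.0), R * dLat, 0.0 };
    return s;
}

// Blend the smooth but drifting VO pose with the noisy but drift-free global fix.
State fuse(const State& vo, const State& gps, double altitude, double alpha) {
    State out;
    out.x = alpha * vo.x + (1.0 - alpha) * gps.x;
    out.y = alpha * vo.y + (1.0 - alpha) * gps.y;
    out.z = alpha * vo.z + (1.0 - alpha) * altitude;   // altimeter corrects z drift
    return out;
}

int main() {
    State vo = { 10.2, 4.9, 2.1 };                     // local VO estimate (example)
    State gps = gpsToLocal(49.0001, 8.0002, 49.0, 8.0);
    State fused = fuse(vo, gps, 2.0, 0.98);
    std::printf("fused pose: %.2f %.2f %.2f\n", fused.x, fused.y, fused.z);
    return 0;
}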

Tracking module: The idea is to develop a robust optical flow tracker which runs in real time using a downward-facing camera. Currently, to compute the planar motion of a quadrocopter, users are restricted to ADNS-family optical flow sensors, which are essentially mouse sensors. With this module in OpenCV, users will be able to use any generic camera module to compute coarse motion vectors and, in turn, the 2D velocity of the quadrocopter with much greater accuracy and efficiency. Once the planar velocities are computed, they can be used for stabilizing the quadrocopter while hovering.
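
A minimal sketch of the tracker idea, assuming pyramidal Lucas-Kanade flow on sparse corners; the camera height, focal length and frame rate are placeholder values, and a real tracker would read the height from the altimeter and undistort the image first.

#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap(0);               // downward-facing camera
    const double fx = 600.0;               // focal length in pixels (assumed)
    const double height = 1.0;             // height above ground in metres (assumed)
    const double fps = 30.0;               // frame rate (assumed)

    cv::Mat frame, gray, prevGray;
    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        if (!prevGray.empty()) {
            std::vector<cv::Point2f> p0, p1;
            std::vector<uchar> status;
            std::vector<float> err;
            cv::goodFeaturesToTrack(prevGray, p0, 100, 0.01, 8);
            if (!p0.empty()) {
                cv::calcOpticalFlowPyrLK(prevGray, gray, p0, p1, status, err);
                cv::Point2f mean(0.f, 0.f);
                int n = 0;
                for (size_t i = 0; i < p0.size(); ++i)
                    if (status[i]) { mean += p1[i] - p0[i]; ++n; }
                if (n > 0) {
                    mean *= 1.0f / n;
                    // pixels per frame -> metres per second via pinhole geometry
                    double vx = mean.x * height / fx * fps;
                    double vy = mean.y * height / fx * fps;
                    std::printf("planar velocity: (%.3f, %.3f) m/s\n", vx, vy);
                }
            }
        }
        prevGray = gray.clone();
    }
    return 0;
}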

Navigation module: This module essentially makes the quadcopter robust by compensating for the time delays caused by sensor latency, computational processing, sending and receiving, etc. It will be an OpenCV implementation of the tum_ardrone ROS package developed for robust state estimation on the Parrot AR.Drone. Using the monocular SLAM system based on PTAM (Parallel Tracking and Mapping), we rotate the visual map such that the xy-plane corresponds to the horizontal plane according to the accelerometer data, and scale it such that the average keypoint depth is 1. Next, we use the pose estimates from an EKF (Extended Kalman Filter) to identify and reject falsely tracked frames. Finally, to steer the quadcopter to a desired location, PID control is used (see the sketch below).
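
A minimal PID sketch of the kind used for the steering step; the gains are illustrative placeholders, and a real controller would clamp the output to the actuator range and handle integral windup.

#include <cstdio>

struct PID {
    double kp, ki, kd;       // gains, tuned per axis
    double integral;         // accumulated error
    double prevError;        // error at the previous step

    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

int main() {
    PID xCtrl = { 0.5, 0.05, 0.2, 0.0, 0.0 };   // illustrative gains
    // One 100 Hz control step: steer the estimated x toward the target x.
    double cmd = xCtrl.update(1.0, 0.8, 0.01);
    std::printf("pitch command: %.3f\n", cmd);
    return 0;
}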

Obstacle avoidance module: In this approach, collision avoidance, traditionally considered a high-level planning problem, is distributed across different levels of control, allowing real-time robot operation in a complex environment. The manipulator control problem is reformulated as direct control of manipulator motion in operational space, the space in which the task is originally described, rather than as control of the task's corresponding joint space motion obtained only after geometric and kinematic transformation. Using visual sensing, real-time collision avoidance demonstrations on moving obstacles have been performed.
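
As a concrete illustration of the operational-space idea, here is a minimal artificial potential field sketch in 2D: the goal attracts, obstacles repel inside an influence radius, and the commanded motion follows the sum. The gains and radius are illustrative assumptions; the real module would operate on the full 3D state.

#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

Vec2 avoidanceForce(Vec2 robot, Vec2 goal, Vec2 obstacle) {
    const double kAtt = 1.0;   // attractive gain (assumed)
    const double kRep = 0.5;   // repulsive gain (assumed)
    const double d0   = 2.0;   // obstacle influence radius in metres (assumed)

    // Attraction pulls the robot linearly toward the goal.
    Vec2 f = { kAtt * (goal.x - robot.x), kAtt * (goal.y - robot.y) };

    // Repulsion pushes away from the obstacle, growing steeply inside the
    // influence radius and vanishing outside it (Khatib-style potential).
    double dx = robot.x - obstacle.x, dy = robot.y - obstacle.y;
    double d = std::sqrt(dx * dx + dy * dy);
    if (d > 1e-6 && d < d0) {
        double mag = kRep * (1.0 / d - 1.0 / d0) / (d * d);
        f.x += mag * dx / d;
        f.y += mag * dy / d;
    }
    return f;   // feed into the velocity controller
}

int main() {
    Vec2 robot = { 0.0, 0.0 }, goal = { 5.0, 0.0 }, obstacle = { 2.0, 0.5 };
    Vec2 f = avoidanceForce(robot, goal, obstacle);
    std::printf("commanded force: (%.2f, %.2f)\n", f.x, f.y);
    return 0;
}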

API Support for Beaglebone Blue

Hello everyone!!

Welcome to the blog for the project “API support for Beaglebone Blue”. I successfully finished this project as part of Google Summer of Code 2016. On this website you will find all the recent, ongoing and future developments pertaining to this project. Here are some links which will get you started in knowing more about the project:

Description:

The aim of the project is to create easy-to-use APIs for the hardware on the BB Blue. This consists of developing/improving kernel drivers for the on-board devices and then reimplementing the Strawson APIs on top of these kernel drivers. These APIs can then be used by real-time applications running on the BB Blue. This project should be of wide use to the robotics community, as there is currently no low-cost open-source board offering the flexibility and performance of the BeagleBone Black/Blue. The four most important parts of this project are:

a. Enabling and developing kernel drivers: In this part, I developed and checked the kernel drivers so that they work correctly on the target kernel version, i.e. 4.4.12:
i. Mostly from: https://github.com/kiran4399/beagleboard_kernel
ii. MPU-9250 Kernel driver: https://github.com/kiran4399/inv_mpu

b. Implementing Servo Kernel driver and PRU firmware: https://github.com/kiran4399/bbb_pru_firmware

c. Creating device tree base and overlay files for Beaglebone Blue: https://github.com/kiran4399/bb_blue_api/tree/master/install_files/2016-05-01

d. Developing Beaglebone Blue APIs: https://github.com/kiran4399/bb_blue_api

The APIs act as an abstraction layer for C programs to access the hardware present on the BeagleBone Blue. All you need to do is install the APIs on your BeagleBone board, include the bb_blue_api.h header (i.e. #include <bb_blue_api.h>) in your source code, and use the APIs as mentioned here. When you are done writing the code, you can compile by linking against the bb_blue_api.so shared library or create a makefile like the one mentioned here.
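
A minimal usage sketch, assuming hypothetical function names modelled on the Strawson library — initialize_board(), set_led(), get_battery_voltage() and cleanup_board() are placeholders; see the Documentation link below for the real calls.

// All function names below are hypothetical placeholders.
#include <bb_blue_api.h>
#include <stdio.h>

int main(void) {
    if (initialize_board() < 0) {                 // hypothetical init call
        fprintf(stderr, "failed to initialize hardware\n");
        return -1;
    }
    set_led(GREEN, 1);                            // hypothetical LED call
    printf("battery: %.2f V\n", get_battery_voltage());   // hypothetical read
    cleanup_board();                              // hypothetical shutdown call
    return 0;
}

// build (assumed): gcc main.c -o main -lbb_blue_api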

A short note about the links mentioned below: I'll be releasing a new version about once a month even after the project. Developers are free to fork the project (source code link mentioned below) and submit pull requests; I'd be happy to merge them into the mainline. The blog will cover the recent developments pertaining to the APIs and details on how to get started. The wiki page contains details on each of the individual hardware modules on which the APIs are built. If someone finds any bugs, please feel free to create an issue in the issue tracker mentioned below. Finally, all the intricate details on how to use a specific API are given in the documentation.

Links !!

Releases:
v1.0

Source Code: https://github.com/kiran4399/bb_blue_api

PRU Servo Firmware and Driver: https://github.com/kiran4399/bbb_pru_firmware

Wiki: https://github.com/kiran4399/bb_blue_api/wiki

Documentation: https://github.com/kiran4399/bb_blue_api#library-functions

Issue Tracker: https://github.com/kiran4399/bb_blue_api/issues

Commit History: https://github.com/kiran4399/bb_blue_api/commits/master

What I accomplished:

I finished the coding, testing and documentation of the v1.0 Beaglebone APIs, which deal with the following hardware:

1. LED and Button
2. MPU-9250
3. BMP-280
4. Analog voltage signals
5. DC motor control using PWM
6. Servo and ESC
7. Quadrature Encoders
8. UART, SPI and I2C
9. DSM2

I am yet to document the v2.0 APIs which are:

1. CPU frequency control
2. Vector and Quaternion Math library
3. Discrete SISO filters

Future Improvements:

This project has a lot of scope for adding new functionality. Some features which I couldn't incorporate in this project due to lack of time are as follows:

1. Adding ROS support for the APIs.
2. Porting the Beaglebone Blue APIs as a HAL to Ardupilot.
3. Adding DMP support to the MPU-9250 kernel driver for Linux 4.4.x.
4. Writing PRU firmware for PRU-1 which allows it to act as a 4th encoder (the Beaglebone Black/Blue has only 3 encoders).
5. Adding these APIs to Cloud9 for the Beaglebone Blue, which enables Node.js to run hardware-specific JS code in the web browser.
6. Writing Python or JavaScript wrappers for the C APIs.
7. Testing and documentation for the v2.0 APIs.
8. Creating documentation for the APIs in Doxygen.

My Experience:

It was truly a great experience doing a GSoC 2016 project for Beagleboard.org. Frankly speaking, when I look at some of my archived material which I submitted as proposals for GSoC 2015 and GSoC 2016, I can clearly see how much I've advanced in terms of programming and understanding the subtle aspects of open-source development. From using git clone and git pull to git commit and git push, I learnt everything during this time. Some of the project-related things I learnt were:

1. Kernel module development (spent 50% of my time on this, had a difficult time, but learnt a lot!!)
2. Writing a proper Makefile for a project.
3. Creating char sysfs devices in the Linux kernel.
4. A better understanding of all the hardware modules on the Beaglebone Black.
5. Better use of the Linux kernel workflow in terms of patching, generating pull requests and other git operations.
6. Last but not least, I learnt a lot even in terms of reading and understanding other developers' code.

It might be my luck, but truly, my project helped me even in my undergraduate Hons. project at my school. I am currently doing my project on “Visual Navigation for Flying Robots” and made a quadrocopter model which is used as a hardware platform to test my programs. What's more, I am using the APIs on the Beaglebone Blue (currently a Beaglebone Black + Strawson cape) as the flight computer for my drone. I proudly present my drone: Blue-copter!!! 😛

Acknowledgments:
Finally, I'd like to say something which might seem a little informal in a final report, but I feel it is very important. Seriously, I wouldn't have been able to finish this project at all without the support of my mentors, especially Alexander Hiam and Michael Welling, who took a lot of time out of their daily routines and patiently answered all my doubts and questions, though some of them were silly and trivial. I also thank all the other Beagleboard.org mentors and my friends who participated in GSoC 2016 for Beagleboard.org for guiding and supporting me. I really owe this org one and look forward to participating again in GSoC 2017.

Hons. Project Timeline

Jan 23 – Jan 30 Week-27
Achieved: LSD SLAM implementation (Part 1)
Description: Developing an IO wrapper for the lsd_slam_core implementation in OpenCV. The core library implements the motion estimation for the quadrocopter.

Feb 1 – Feb 6 Week-28
Achieved: LSD SLAM implementation (Part 2)
Description: Once the core library is successfully integrated, openFABMAP will be integrated into the system for detecting loop closures. For the graph optimization problem, the g2o framework will be used after integrating it with OpenCV.

Feb 7 – Feb 14 Week-29
Achieved: Developing the lsd_slam_visualization submodule
Description: PCL (Point Cloud Library) will be used for 3D visualization of the map generated by the SLAM system. It must be noted that since a quadrocopter has limited computational resources in terms of memory and processing speed, a library must be used which requires a minimal amount of onboard resources. A minimal viewer sketch follows.
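
A minimal sketch of the intended visualization, assuming PCL's CloudViewer and a randomly generated stand-in for the keyframe map points produced by the SLAM system.

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/visualization/cloud_viewer.h>
#include <cstdlib>

int main() {
    // Stand-in for the keyframe depth points produced by the SLAM system.
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    for (int i = 0; i < 1000; ++i) {
        pcl::PointXYZ p;
        p.x = std::rand() / float(RAND_MAX);
        p.y = std::rand() / float(RAND_MAX);
        p.z = std::rand() / float(RAND_MAX);
        cloud->push_back(p);
    }
    pcl::visualization::CloudViewer viewer("SLAM map");   // lightweight viewer
    viewer.showCloud(cloud);
    while (!viewer.wasStopped()) {}                       // spin until window closed
    return 0;
}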

Feb 15 – Feb 29 Week-30,31
Achieved: Unit testing LSD-SLAM
Description: LSD-SLAM will be tested both on publicly available benchmark sequences, such as those from the University of Freiburg, and live using the monocular camera onboard the quadrocopter. The functional aspects of the SLAM module will be documented in Doxygen.

March 1 – March 7 Week-32
Achieved: Dense Visual SLAM for RGB-D camera
Description: With the freenect and OpenNI libraries as an IO wrapper, I will integrate the dvo_core library with OpenCV. Once the core library is ported to OpenCV, a visualization module will be developed using the PCL visualizer to represent the dense 3D map.

March 8 – March 15 Week-33
Achieved: Visual odometry and Sensor model integration
Description: After implementing all the SLAM systems, the sensor model from the IMU will be integrated.

March 16 – March 23 Week-34
Achieved: Configuring dense optical flow
Description: Dense optical flow will be used to implement a motion sensing module on the Raspberry Pi camera. The calcOpticalFlowSF() API will be used, as in the sketch below.
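
A minimal sketch of the intended call, assuming the OpenCV 2.4 signature of calcOpticalFlowSF() (in OpenCV 3 the function moved to the optflow contrib module); the parameter values follow the OpenCV documentation example.

#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::Mat frame1 = cv::imread("frame1.png");   // consecutive colour frames
    cv::Mat frame2 = cv::imread("frame2.png");
    if (frame1.empty() || frame2.empty()) return -1;

    cv::Mat flow;                                // CV_32FC2 per-pixel motion
    cv::calcOpticalFlowSF(frame1, frame2, flow,
                          3,                     // pyramid layers
                          2,                     // averaging block size
                          4);                    // max flow
    cv::Scalar m = cv::mean(flow);               // coarse global motion vector
    std::printf("mean flow: (%.2f, %.2f) px/frame\n", m[0], m[1]);
    return 0;
}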

March 24 – March 31 Week-35
Achieved: Unit Testing for visual odometry and tracking module
Description: The respective modules will be tested with the standard benchmarks available. For monocular visual odometry, the KITTI dataset will be used; for RGB-D dense visual odometry, the ICL-NUIM and TUM RGB-D datasets will be used.

April 1 – April 7 Week-36
Achieved: Navigation module
Description: The major task is to develop a state-estimation module which includes an OpenCV implementation of PTAM. Then the linear Kalman filter class in OpenCV must be extended to an Extended Kalman Filter to incorporate the measurements from the IMU sensor (see the sketch below). Finally, the learned 3D feature map will be rendered using the OpenGL library.
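
A minimal sketch of OpenCV's linear KalmanFilter on a toy position/velocity state; the EKF extension would re-linearize the transition model around the current state at each step. The loop rate and noise covariances are illustrative assumptions.

#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    const float dt = 0.01f;                      // 100 Hz loop (assumed)
    cv::KalmanFilter kf(2, 1);                   // state [pos, vel], measure pos
    kf.transitionMatrix = (cv::Mat_<float>(2, 2) << 1, dt,
                                                    0, 1);
    cv::setIdentity(kf.measurementMatrix);       // we observe position only
    cv::setIdentity(kf.processNoiseCov, cv::Scalar::all(1e-4));
    cv::setIdentity(kf.measurementNoiseCov, cv::Scalar::all(1e-2));
    cv::setIdentity(kf.errorCovPost, cv::Scalar::all(1.0));

    cv::Mat meas(1, 1, CV_32F);
    for (int k = 0; k < 100; ++k) {
        kf.predict();
        meas.at<float>(0) = 0.5f * k * dt;       // fake position sensor reading
        cv::Mat est = kf.correct(meas);
        std::printf("pos=%.3f vel=%.3f\n", est.at<float>(0), est.at<float>(1));
    }
    return 0;
}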

April 8 – April 15 Week-37
Achieved: Unit Testing for Navigation module
Description: The navigation module will be tested by letting the quadrocopter complete a large variety of different figures, such as a rectangle or the “Haus vom Nikolaus”. Documentation will be written on the usage of these APIs.

April 16 – April 23 Week-38
Achieved: Generating OpenCV python bindings and integration testing.
Description: The experimental quadrocopter whose specifications are mentioned on the first page will be used for testing purposes. The generated Python APIs will be used in the MultiWii control program. Any pending bugs will be addressed and usage documentation will be committed.

Project Proposal

Abstract: The aim of the project is to create easy-to-use APIs for the hardware on the BB Blue. This consists of developing/improving kernel drivers for the on-board devices and then reimplementing the Strawson APIs on top of these kernel drivers. These APIs can then be used by real-time applications running on the BB Blue. In the later phase of the project, support for the BB Blue will be added to Ardupilot, and ROS will be ported to the BB Blue using these APIs.

Introduction:
BB Blue is the latest educational robotics controller built around the popular BeagleBone open hardware computer. With a suite of sensors very useful for robotics and a supported software stack ideal for robots and drones, the BeagleBone Blue is a great board for all kinds of robots. Many robotic applications are currently limited to specific hardware; by developing APIs for the BB Blue, it becomes easy for developers to bring the latest versions of these applications to the board.

Project Goals and challenges:
The following are the project goals and their challenges which I plan to deliver by the end of the tenure:
Enabling kernel drivers: The very first step of this project is to develop the kernel drivers for the devices/modules on the BB Blue. Presently, most of the kernel drivers for the onboard modules are present in the 4.1 kernel. I have already configured the kernel driver for the MPU-9250, which combines an MPU-6500 gyro/accelerometer and an AK8963 magnetometer.

Developing the PRU servo kernel driver: Currently the pruss driver is based on kernel 3.8. During the first week I will develop the kernel driver for PRU servo and DC output. This essentially involves porting the pruss driver to 4.1 so that it communicates with the modified remoteproc framework. A userspace API will then be written for issuing PWM downcalls; a sketch of the kind of interface it could drive follows.
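
A minimal sketch of the kind of userspace call such an API could make, assuming the driver exposes the standard Linux sysfs PWM interface (/sys/class/pwm); the chip and channel numbers are placeholders.

#include <fstream>
#include <string>

// Write one value to a sysfs attribute file; returns false on failure.
static bool sysfsWrite(const std::string& path, const std::string& value) {
    std::ofstream f(path.c_str());
    if (!f) return false;
    f << value;
    return f.good();
}

int main() {
    const std::string chip = "/sys/class/pwm/pwmchip0";   // placeholder chip
    sysfsWrite(chip + "/export", "0");                    // expose channel 0
    sysfsWrite(chip + "/pwm0/period", "20000000");        // 20 ms servo frame (ns)
    sysfsWrite(chip + "/pwm0/duty_cycle", "1500000");     // 1.5 ms centre pulse (ns)
    sysfsWrite(chip + "/pwm0/enable", "1");               // start the output
    return 0;
}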

BB Blue debian image: After the kernel drivers for the onboard devices are completed, I will generate the config file and compile the kernel with the base BB Blue overlay which will be created. Finally, a debian image will be built using the appropriate EEPROM database. After booting the image on the board, I will check for any eMMC errors and system-level bugs.

BB Blue APIs: I will build the APIs which communicate with the device kernel drivers. By the end of the 6th week, all the APIs will be implemented, tested and well documented. The goal is that any program using the current APIs should not break when run on the BB Blue. Also, since the APIs are a reimplementation of the Strawson APIs, the syntax of the BB Blue APIs will adhere to the Strawson APIs so as to keep them compatible. Most importantly, the performance of this kernel+API system will be compared with the standard mmap version.

Ardupilot support for BB Blue: By the end of this project, BB Blue support for Ardupilot will be developed, which essentially involves implementing the AP_HAL APIs using the BB Blue APIs. mavlink will also be integrated with the APIs, enabling the board to communicate with other mavlink-supported applications.

Standalone ROS for BB Blue: The APIs will also be used for interfacing the onboard devices with the ROS middleware by creating some custom packages.

The following ROS packages will also be tested on BB Blue to check connectivity:
mavros: ROS interface for mavlink
roscopter: ROS interface for ArduCopter using Mavlink 1.0. Using this package, a quadcopter can be controlled directly from ROS overriding the RC commands.

Documentation and examples: I will provide extensive and accurate documentation for whatever I build in this project. Functional documentation for the APIs will be done in Doxygen. Code documentation will be written as comments in the source files.

Timeline and Milestones:
Google Summer of Code stretches over a period of 12 weeks, with mid-term evaluations in the 6th week and end-term evaluations at the end. The following is the timeline with milestones which I intend to follow strictly throughout the project tenure:

May 15 – May 22 Week-0
Aim: Reading the Documentation.
Description: I'll familiarize myself by going through the documentation for building the kernel drivers, the Strawson APIs, Ardupilot and ROS. By the end of this week, I will have a rough estimate of what to do in the coming weeks. Also, since the BB Blue is roughly based on the BB Black, I will test the board with the latest debian image and check for any hardware bugs.

May 23 – June 6 Week-1,2

Aim: Developing kernel driver for PRU servo and compiling kernel.
Description: Developing a PRU userspace API after porting prussdrv to the 4.1 kernel to communicate with the modified remoteproc framework. After all the kernel drivers are developed and enabled and the base dto is modified, the kernel will be compiled with the beaglebone_blue_defconfig file. Then, with the uImage file, I will build the image by modifying the necessary values in the BB Blue EEPROM data.

June 6 – June 13 Week-3
Aim: Fixing bugs and performance issues for kernel drivers
Description: The debian image will be thoroughly tested this week.

June 14 – June 21 Week-4
Aim: Developing BB Blue APIs (Part 1)
Description: I will follow the design and structure of the Strawson APIs in implementing the BB Blue APIs. I'll also be writing the test APIs for unit testing.


June 22 – June 27 Week-5

Aim: MID-TERM EVALUATION !!
Description: Refactoring the code and increasing its readability. Some time will be spent rechecking the API source code against the unit tests for any bugs. During this week, usage documentation for all the APIs will be clearly outlined and written.

June 28 – July 7 Week-6
Aim: Developing BB Blue APIs (Part 2)
Description: In Part 2, the next set of BB Blue APIs will be written.

July 8 – July 15 Week-7
Aim: Reserve week
Description: Reserve week for any unforeseen problems. Meanwhile, the APIs will be thoroughly tested using the examples present in the Strawson APIs git repository. Final documentation will be committed to GitHub. The communication ports will be tested.

July 16 – July 31 Week-8,9

Aim: Writing ardupilot interfaces on the BB Blue using APIs
Description: Using the APIs, BB Blue support for Ardupilot will be added. This involves implementing the APIs mentioned in the AP_HAL directory in Ardupilot; the code will thereby go into the AP_HAL_Linux directory. After coding, Ardupilot (ArduCopter) will be tested by interfacing the BB Blue with a mini quadcopter which I have built, including the sensors mentioned above. The connections between the ESCs and the brushless motors will be soldered and the input pins will be connected to the PRU servo ports. A DSM receiver will be interfaced to the BB Blue so that it can be controlled by a transmitter. All the issues related to hardware and performance will also be addressed this week. Once the coding and testing are done, I will submit merge requests to the upstream Ardupilot repository.

August 1 – August 7 Week-10
Aim: Configuring mavlink and porting ROS to BB Blue
Description: Since mavlink is a header-only library, we need not compile it on the BB Blue; it can simply be used by adding it to the API directory (see the sketch below). I'll also be installing ROS on the BB Blue and checking for bugs. BB Blue support for mavros and roscopter will be tested this week.
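
A minimal sketch of using the header-only mavlink C library: pack a HEARTBEAT message and serialize it into a byte buffer ready to be written to the serial port. The system ID is a placeholder, and the include path assumes generated headers for the common dialect.

#include <mavlink.h>      // generated mavlink headers (common dialect assumed)
#include <stdint.h>
#include <stdio.h>

int main(void) {
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_heartbeat_pack(
        1,                        // system id (placeholder)
        MAV_COMP_ID_AUTOPILOT1,   // component id
        &msg,
        MAV_TYPE_QUADROTOR,
        MAV_AUTOPILOT_GENERIC,
        0,                        // base mode
        0,                        // custom mode
        MAV_STATE_ACTIVE);

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    printf("packed %u bytes, ready to write() to the serial port\n", len);
    return 0;
}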

August 8 – August 15 Week-11
Aim: Documentation and making ROS nodes for BB Blue.
Description: Detailed testing will be done by controlling the experimental quadcopter from a ground station using mavlink and mavros. ROS support for the devices on the BB Blue will be written. Finally, I will release all the alpha packages and add them to the packages.ros.org list.

August 16 – August 23 Week-12
Aim: FINAL EVALUATION !!
Description: Checking for code smells and bugs. Refining the previous documentation so that it is easier to understand. Checking the final implementation and doing the run-through again.