• Grid-Based Uniform Feature Tracking for Visual-Inertial Navigation. Tong He, Konstantine Tsotsos, Stefano Soatto. UCLA Cross-disciplinary Scholars in Science and Technology (CSST). [PDF]

Abstract

We introduce a simple modification of common single-threshold feature detectors, such as the FAST detector, that reduces the trajectory drift of our SLAM system by 68%. By dividing the image into a grid of cells, we can run a separate feature detector with its own adaptive threshold in each cell, which prevents detected features from clustering densely in one region of the image. To discriminate features detected in dark image regions from those in textureless regions, such as the sky, we use the pixel covariance of a patch around each feature as a simple but effective criterion. We also devise an iterative feature-resampling strategy that keeps the number of tracked features close to the filter's processing capacity.
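The per-cell adaptive thresholding and patch-based texture check described above can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the gradient-magnitude response stands in for the FAST detector, the patch-covariance criterion is interpreted as intensity variance, and all parameter values (grid size, thresholds, patch size) are hypothetical.

```python
import numpy as np

def detect_cell(cell, thresh):
    """Hypothetical stand-in for FAST: treat gradient magnitude as the
    corner response and keep pixels whose response exceeds `thresh`."""
    gy, gx = np.gradient(cell.astype(np.float32))
    resp = np.abs(gx) + np.abs(gy)
    ys, xs = np.nonzero(resp > thresh)
    return [(x, y, resp[y, x]) for x, y in zip(xs, ys)]

def grid_uniform_detect(image, grid=(4, 4), per_cell=10,
                        init_thresh=40.0, min_thresh=5.0,
                        patch=7, var_thresh=25.0):
    """Run an independent detector in each grid cell, relaxing that cell's
    threshold until enough features are found, then reject low-texture
    features by the variance of the surrounding patch."""
    h, w = image.shape
    ch, cw = h // grid[0], w // grid[1]
    half = patch // 2
    features = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            y0, x0 = r * ch, c * cw
            cell = image[y0:y0 + ch, x0:x0 + cw]
            # Per-cell adaptive threshold: halve it until the cell
            # yields enough detections (or the floor is reached).
            t = init_thresh
            kps = detect_cell(cell, t)
            while len(kps) < per_cell and t > min_thresh:
                t /= 2.0
                kps = detect_cell(cell, t)
            # Keep the strongest responses so no single cell dominates.
            kps.sort(key=lambda k: k[2], reverse=True)
            for x, y, _ in kps[:per_cell]:
                X, Y = x + x0, y + y0
                p = image[max(0, Y - half):Y + half + 1,
                          max(0, X - half):X + half + 1]
                # Texture check: discard features whose surrounding
                # patch is nearly uniform (e.g. sky).
                if np.var(p.astype(np.float32)) > var_thresh:
                    features.append((X, Y))
    return features
```

Because each cell is processed with its own threshold, a strongly textured cell cannot absorb the whole feature budget, which is what spreads detections uniformly over the image.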

Grid-based uniform feature tracking gives us more well-tracked features, evidenced by a 44% denser 3D point cloud: only good features survive the outlier-rejection procedure and are triangulated into 3D points in the map. Well-tracked features are valuable to any vision-based system, such as autonomous driving. Grid-based uniform feature tracking is intuitive and can be implemented in a few lines of code. Since the feature detectors now run independently on different grid cells, the method could later be parallelized, speeding up tracking several-fold.

Learn more

https://sites.google.com/site/ktsotsos/visual-inertial-sensor-fusion