My research project
Autonomous Navigation and Control of Vehicles
I focus on perception and control for various kinds of vehicles and robots.
By making full use of off-the-shelf sensors, I seek to retrofit existing vehicles with autonomous capabilities. This would make self-driving technology accessible to ordinary consumers and usable by people without specialist expertise.
PhD period: July 2017 – 2020
I am a demonstrator for the Year 3 Robotics module (EEE3043).
This involves providing technical support for the Robot Operating System (ROS), helping students solve problems such as perception and exploration.
Accurate extrinsic sensor calibration is essential for both autonomous vehicles and robots. Traditionally this is an involved process that requires calibration targets and known fiducial markers, and is generally performed in a lab. Moreover, even a small change in the sensor layout requires recalibration. With the anticipated arrival of consumer autonomous vehicles, there is demand for a system which can do this automatically, after deployment and without specialist human expertise.
To address these limitations, we propose a flexible framework which can estimate extrinsic parameters without an explicit calibration stage, even for sensors with unknown scale. Our first contribution builds upon standard hand-eye calibration by jointly recovering scale. Our second contribution makes the system robust to imperfect and degenerate sensor data by collecting independent sets of poses and automatically selecting those best suited for calibration.
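To illustrate the core idea of hand-eye calibration with joint scale recovery, here is a minimal NumPy sketch (an illustrative toy, not the system described above; the function names, the SVD-based rotation step, and the stacked least-squares formulation are assumptions). Given pairs of relative motions A_i and B_i satisfying A_i X = X B_i, where the B translations are only known up to an unknown scale s (as with monocular odometry), the rotation of X is found by aligning rotation axes, and t_X together with s follows from a linear system:

```python
import numpy as np

def rotation_axis(R):
    # Axis-angle vector (log map) of a 3x3 rotation matrix.
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-9:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def hand_eye_with_scale(As, Bs):
    """Solve A_i X = X B_i for X = (R_X, t_X) and scale s, where the
    translations of the B_i motions are only known up to scale.
    As, Bs: lists of 4x4 homogeneous relative-motion matrices."""
    # 1) Rotation: R_Ai = R_X R_Bi R_X^T implies axis(A_i) = R_X axis(B_i).
    #    Align the two axis sets with an SVD (Kabsch / Wahba solution).
    alphas = np.array([rotation_axis(A[:3, :3]) for A in As])
    betas = np.array([rotation_axis(B[:3, :3]) for B in Bs])
    H = betas.T @ alphas
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_X = Vt.T @ D @ U.T
    # 2) Translation and scale: the translation part of A_i X = X B_i gives
    #    (R_Ai - I) t_X - s * R_X t_Bi = -t_Ai, linear in (t_X, s).
    M, v = [], []
    for A, B in zip(As, Bs):
        M.append(np.hstack([A[:3, :3] - np.eye(3),
                            -(R_X @ B[:3, 3]).reshape(3, 1)]))
        v.append(-A[:3, 3])
    sol, *_ = np.linalg.lstsq(np.vstack(M), np.concatenate(v), rcond=None)
    return R_X, sol[:3], sol[3]
```

Two caveats follow directly from the geometry: at least two motions with non-parallel rotation axes are needed for the rotation to be observable, and the motions must contain non-zero translation for the scale to be recoverable; degenerate motion sequences (for example, pure planar rotation) are exactly the cases that motivate the robust pose-set selection described above.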
We show that our approach's robustness is essential for the target scenario. Unlike previous approaches, ours runs in real time and continuously estimates the extrinsic transform. For both an ideal experimental setup and a real use case, comparison against prior approaches shows that ours outperforms the state of the art. Furthermore, we demonstrate that the recovered scale may be applied to the full trajectory, circumventing the need for scale estimation via sensor fusion.