Completion date: 18 Jan 2021
Description: Custom driver-assistance system
Tags: Robotics
Tools: CAN, C++, OpenCV, TF

Cruisin, ADAS

When I worked at Forcen, I had the unique opportunity to learn a lot about the CAN communication protocol, which is used widely in the automotive industry. With my computer vision background and that newfound knowledge, I decided to build my own driver-assist system - Cruisin - that taps into the CAN bus of my car.

The video above should cover it pretty well, but in a nutshell, I made an application that: (1) detects obstacles and vehicles, (2) maps my drive using visual odometry supplemented by CAN data, and (3) displays any telemetry data of interest from the CAN bus (speed, RPM, etc.).
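To give a feel for the telemetry side, here is a minimal sketch of reading vehicle data over Linux SocketCAN. The interface name, CAN ID, and byte scaling below are hypothetical placeholders; the real values depend on the vehicle's specific (often reverse-engineered) message definitions.

#include <cstdio>
#include <cstring>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main() {
    // Open a raw CAN socket on the assumed "can0" interface.
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return 1; }

    struct ifreq ifr {};
    std::strcpy(ifr.ifr_name, "can0");
    ioctl(s, SIOCGIFINDEX, &ifr);

    struct sockaddr_can addr {};
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    bind(s, reinterpret_cast<struct sockaddr*>(&addr), sizeof(addr));

    struct can_frame frame;
    while (read(s, &frame, sizeof(frame)) == sizeof(frame)) {
        if (frame.can_id == 0x158) {  // hypothetical speed message ID
            // Hypothetical encoding: 16-bit big-endian value in 0.01 km/h.
            double kph = ((frame.data[0] << 8) | frame.data[1]) * 0.01;
            std::printf("speed: %.1f km/h\n", kph);
        }
    }
    close(s);
    return 0;
}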

Odometry

The visual odometry pipeline I implemented was pretty neat. It works by tracking the flow of pixels between consecutive camera frames to estimate the translation vector and rotation matrix. However, geometrically, you can't resolve the magnitude of the translation with a single camera - only its direction. This is where the CAN data comes in. I'm able to monitor the car's speed, which I integrate over time to get distance and scale the translation vector. A challenge here was dealing with sampling-time and phase variation between the algorithm's inputs (e.g. CAN data arriving X milliseconds after a camera frame).
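Below is a sketch of how that scale-recovery step can look in OpenCV, assuming tracked point correspondences and time-stamped CAN speed samples are already available. The function and variable names are illustrative, not the project's actual API; the speed interpolation is one simple way to handle the CAN/camera timing offset mentioned above.

#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <utility>
#include <vector>

// Linearly interpolate vehicle speed (m/s) at a camera frame timestamp,
// compensating for the phase offset between CAN samples and frames.
double speedAt(double t, const std::vector<std::pair<double, double>>& canSpeed) {
    for (size_t i = 1; i < canSpeed.size(); ++i) {
        if (canSpeed[i].first >= t) {
            double a = (t - canSpeed[i - 1].first) /
                       (canSpeed[i].first - canSpeed[i - 1].first);
            return (1 - a) * canSpeed[i - 1].second + a * canSpeed[i].second;
        }
    }
    return canSpeed.back().second;
}

void updatePose(const std::vector<cv::Point2f>& prevPts,
                const std::vector<cv::Point2f>& currPts,
                const cv::Mat& K,
                double tPrev, double tCurr,
                const std::vector<std::pair<double, double>>& canSpeed,
                cv::Mat& R_total, cv::Mat& t_total) {
    // The essential matrix yields the rotation and a unit-norm translation
    // direction between the two frames.
    cv::Mat mask, R, t;
    cv::Mat E = cv::findEssentialMat(prevPts, currPts, K, cv::RANSAC, 0.999, 1.0, mask);
    cv::recoverPose(E, prevPts, currPts, K, R, t, mask);

    // Integrate the interpolated CAN speed over the frame interval to get the
    // metric distance travelled, which fixes the translation scale.
    double v = 0.5 * (speedAt(tPrev, canSpeed) + speedAt(tCurr, canSpeed));
    double scale = v * (tCurr - tPrev);

    t_total = t_total + scale * (R_total * t);
    R_total = R * R_total;
}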

The above video is a behind-the-scenes look at the visual odometry (mapping) feature. The red dots are features that the algorithm extracts and tracks across frames. The pixel trajectories are shown in green.
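For reference, the kind of overlay shown in the video can be produced with OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow. This is a rough sketch that mirrors the described visualization (red feature dots, green pixel trajectories), not necessarily the project's exact code.

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/video/tracking.hpp>
#include <vector>

void drawTracks(const cv::Mat& prevGray, const cv::Mat& currGray, cv::Mat& display) {
    // Detect up to 500 strong corners in the previous frame.
    std::vector<cv::Point2f> prevPts, currPts;
    cv::goodFeaturesToTrack(prevGray, prevPts, 500, 0.01, 10);

    // Track those corners into the current frame with Lucas-Kanade flow.
    std::vector<uchar> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(prevGray, currGray, prevPts, currPts, status, err);

    for (size_t i = 0; i < currPts.size(); ++i) {
        if (!status[i]) continue;                       // feature lost between frames
        cv::line(display, prevPts[i], currPts[i],
                 cv::Scalar(0, 255, 0), 2);             // green pixel trajectory
        cv::circle(display, currPts[i], 3,
                   cv::Scalar(0, 0, 255), cv::FILLED);  // red tracked feature
    }
}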
