brunoeducsantos/3D-vehicle-tracking

3D Object Tracking

Overview

In this project, the following key concepts were developed:

  • Keypoint detectors and descriptors
  • Object detection using the pre-trained YOLO deep-learning framework
  • Tracking objects by matching keypoints and bounding boxes across successive images
  • Associating regions in a camera image with lidar points in 3D space
  • Estimating time-to-collision (TTC) using lidar points and camera keypoint matches

Dependencies for Running Locally

Basic Build Instructions

  1. Clone this repo.
  2. Make a build directory in the top level project directory: mkdir build && cd build
  3. Compile: cmake .. && make
  4. Run it: ./3D_object_tracking arg1 arg2 arg3 arg4 arg5

For each argument, select one of the following options:

  • arg1 (keypoint detector type): HARRIS, FAST, BRISK, ORB, AKAZE, SIFT
  • arg2 (descriptor type): BRIEF, ORB, FREAK, AKAZE, SIFT
  • arg3 (matcher type): MAT_BF, MAT_FLANN
  • arg4 (descriptor class): DES_BINARY, DES_HOG
  • arg5 (selector type): SEL_NN, SEL_KNN

Implementation approach

Match 3D Objects

The algorithm that matches bounding boxes is implemented in matchBoundingBoxes. The implementation is divided into three steps:

  1. Store, as pairs of box IDs, the previous- and current-frame bounding boxes that share a keypoint match
  2. Count the number of keypoint-match pairs for each bounding-box combination between the current and previous frame
  3. For each bounding box in the previous frame, select the current-frame box with the highest pair count above a certain threshold, choosing only the maximum-count bounding box per detected object
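The three steps above can be sketched as follows. This is a minimal illustration, not the project's actual code: the real matchBoundingBoxes works on OpenCV DMatch objects and BoundingBox structs, while here the keypoint matches are assumed to have already been reduced to (prevBoxId, currBoxId) pairs.

```cpp
#include <map>
#include <utility>
#include <vector>

// Sketch of the matchBoundingBoxes logic. Each element of `matchPairs` is a
// (prevBoxId, currBoxId) pair derived from one keypoint match (a hypothetical
// simplification of the OpenCV-based input). For every previous-frame box we
// keep the current-frame box with the highest pair count, provided the count
// reaches `minCount`.
std::map<int, int> matchBoundingBoxes(
    const std::vector<std::pair<int, int>>& matchPairs, int minCount)
{
    // Step 2: count keypoint-match pairs per (prev, curr) box combination.
    std::map<std::pair<int, int>, int> pairCounts;
    for (const auto& p : matchPairs)
        ++pairCounts[p];

    // Step 3: for each previous box, keep the current box with the max count.
    std::map<int, int> bestMatches;  // prevBoxId -> currBoxId
    std::map<int, int> bestCounts;   // prevBoxId -> best count so far
    for (const auto& [boxes, count] : pairCounts) {
        if (count < minCount)
            continue;
        const int prevId = boxes.first;
        const int currId = boxes.second;
        auto it = bestCounts.find(prevId);
        if (it == bestCounts.end() || count > it->second) {
            bestCounts[prevId] = count;
            bestMatches[prevId] = currId;
        }
    }
    return bestMatches;
}
```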

*(Figure: bounding-box match)*

Compute Lidar-based TTC

The algorithm to compute lidar TTC is divided into three parts:

  1. Outlier removal.
  2. Compute closest point in previous and current lidar frame.
  3. Compute TTC between both frames.

The outlier removal is based on a Euclidean distance threshold around each point belonging to a cluster (in our case, cars).
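Steps 2 and 3 can be sketched as below, assuming a constant-velocity model and simplified inputs: `prevX` and `currX` are assumed to hold the forward (x) distances of the lidar points on the preceding vehicle after outlier removal, and `dT` is the time between frames. Under that model, TTC = d1 · dT / (d0 − d1), where d0 and d1 are the closest distances in the previous and current frame.

```cpp
#include <algorithm>
#include <vector>

// Sketch of the lidar TTC computation under a constant-velocity model.
// Inputs are assumed to be already cleaned of outliers.
double computeLidarTTC(const std::vector<double>& prevX,
                       const std::vector<double>& currX, double dT)
{
    // Step 2: closest point in the previous and current lidar frame.
    const double d0 = *std::min_element(prevX.begin(), prevX.end());
    const double d1 = *std::min_element(currX.begin(), currX.end());

    // Step 3: TTC between both frames; assumes the gap is closing (d0 > d1).
    return d1 * dT / (d0 - d1);
}
```

Without the outlier-removal step, a single stray lidar point below the vehicle's true closest distance would dominate the min and distort the TTC, which is why the project filters points first.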

Associate Keypoint Correspondences with Bounding Boxes

The algorithm is implemented in clusterKptMatchesWithROI. Matching keypoints to each bounding box is divided into two steps:

  1. Compute the mean absolute distance between matched keypoints in the current and previous frames, considering only the keypoints belonging to a bounding box.
  2. Keep only the keypoint matches whose distance lies within a certain threshold of that mean.
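The two steps can be sketched as follows. This is a simplified stand-in for clusterKptMatchesWithROI: the real function works on OpenCV cv::KeyPoint and cv::DMatch objects inside a bounding box, whereas here the keypoints are assumed to be already restricted to one box and represented by a hypothetical Point2 struct.

```cpp
#include <cmath>
#include <vector>

// Hypothetical simplified keypoint type standing in for cv::KeyPoint.
struct Point2 { double x, y; };

// Step 1: compute the mean Euclidean distance over all matched keypoint
// pairs. Step 2: keep only the matches whose distance stays within
// `ratio` times that mean, returning their indices.
std::vector<int> filterMatchesByMeanDistance(
    const std::vector<Point2>& prevPts,  // matched keypoints, previous frame
    const std::vector<Point2>& currPts,  // matched keypoints, current frame
    double ratio)
{
    std::vector<double> dists(prevPts.size());
    double sum = 0.0;
    for (size_t i = 0; i < prevPts.size(); ++i) {
        dists[i] = std::hypot(currPts[i].x - prevPts[i].x,
                              currPts[i].y - prevPts[i].y);
        sum += dists[i];
    }
    const double mean = sum / static_cast<double>(dists.size());

    std::vector<int> kept;  // indices of matches within the threshold
    for (size_t i = 0; i < dists.size(); ++i)
        if (dists[i] <= ratio * mean)
            kept.push_back(static_cast<int>(i));
    return kept;
}
```

Filtering against the mean distance discards erroneous matches that jump much farther than the rest, which would otherwise corrupt the camera-based TTC estimate.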

Reference

  • For further details on the analysis of the TTC estimates, check this report

Disclaimer

This project was cloned from the Udacity 3D object tracking project in the context of the Sensor Fusion Engineer Nanodegree.