Method

NVIDIA Bird's Eye View Lidar Net [NVLidarNet]


Submitted on 14 Nov. 2017 09:30 by
Jarmo Lunden (NVIDIA Helsinki Oy)

Running time: 0.1 s
Environment: GPU @ 2.5 GHz (Python + C/C++)

Method Description:
Developed by the NVIDIA self-driving team. Only the KITTI LiDAR training data has been used.
Parameters:
-
LaTeX BibTeX:

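The submission does not describe the network's input encoding. For context on the "bird's eye view" LiDAR representation the method name refers to, here is a minimal sketch that rasterizes a raw KITTI Velodyne scan into a top-down occupancy grid; the grid extent, 0.1 m cell size, the function name velodyne_to_bev, and the file path are illustrative assumptions, not details of NVLidarNet.

import numpy as np

def velodyne_to_bev(points, x_range=(0.0, 70.0), y_range=(-40.0, 40.0), cell=0.1):
    # points: (N, 4) array of x, y, z, reflectance in the Velodyne frame.
    # Returns a 2D occupancy grid with rows along forward (x) and columns
    # along lateral (y) distance; a cell is 1 if it contains at least one
    # LiDAR return. All ranges and the cell size are assumed values.
    x, y = points[:, 0], points[:, 1]
    keep = (x >= x_range[0]) & (x < x_range[1]) & (y >= y_range[0]) & (y < y_range[1])
    rows = ((x[keep] - x_range[0]) / cell).astype(int)
    cols = ((y[keep] - y_range[0]) / cell).astype(int)
    grid = np.zeros((int((x_range[1] - x_range[0]) / cell),
                     int((y_range[1] - y_range[0]) / cell)), dtype=np.uint8)
    grid[rows, cols] = 1
    return grid

# Usage with a raw KITTI scan (the path is hypothetical):
# points = np.fromfile("velodyne/000000.bin", dtype=np.float32).reshape(-1, 4)
# bev = velodyne_to_bev(points)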
Detailed Results

Object detection and orientation estimation results. Results for object detection are given in terms of average precision (AP) and results for joint object detection and orientation estimation are provided in terms of average orientation similarity (AOS).


Benchmark                      Easy      Moderate   Hard
Car (Bird's Eye View)          84.44 %   80.04 %    74.31 %
Pedestrian (Bird's Eye View)   45.21 %   37.81 %    33.82 %
Cyclist (Bird's Eye View)      63.95 %   47.97 %    44.51 %
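For reference, the AP values in the table above come from an interpolated precision-recall integral. The sketch below assumes the 11-point recall interpolation KITTI used for submissions of this era (the benchmark later moved to 40 recall points); the function name and the toy precision-recall curve are illustrative, not taken from this submission. AOS additionally weights each interpolated point by orientation similarity and is not shown here.

import numpy as np

def interpolated_ap(recall, precision, n_points=11):
    # Interpolated average precision over n_points equally spaced recall
    # levels: at each level r, precision is taken as the maximum precision
    # achieved at any recall >= r (0 if that recall is never reached).
    recall = np.asarray(recall, dtype=float)
    precision = np.asarray(precision, dtype=float)
    ap = 0.0
    for r in np.linspace(0.0, 1.0, n_points):
        reachable = recall >= r
        ap += (precision[reachable].max() if reachable.any() else 0.0) / n_points
    return ap

# Toy precision-recall curve (illustrative values only).
recall = [0.12, 0.33, 0.55, 0.74, 0.81]
precision = [0.95, 0.90, 0.85, 0.70, 0.40]
print("AP = %.2f %%" % (100.0 * interpolated_ap(recall, precision)))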


[Figures: Bird's eye view results (three plots)]



