Method

Map-Enhanced Ego-Lane Detection [la] [gp] [CyberMELD+PLARD]
https://github.com/xiaoliangabc/cyber_meld/tree/plard

Submitted on 20 Jun. 2020 05:34 by
Xiaoliang Wang (SJTU)

Running time: 0.18 s
Environment: 8 cores @ 1.5 GHz (Python + C/C++)

Method Description:
Employ the OpenStreetMap (OSM) road shape as a lane model to enhance our ego-lane detection (an illustrative sketch follows the BibTeX entry below).
Parameters:
None
Latex Bibtex:
@ARTICLE{9110871,
  author={X. {Wang} and Y. {Qian} and C. {Wang} and M. {Yang}},
  journal={IEEE Access},
  title={Map-Enhanced Ego-Lane Detection in the Missing Feature Scenarios},
  year={2020},
  volume={8},
  pages={107958-107968}
}
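
To make the one-line method description above concrete, here is a purely illustrative sketch (not the authors' implementation; see the linked repository for that) of how an OSM road centerline could serve as a lane-shape prior: the map polyline is projected into the vehicle frame using the GPS/IMU pose and shifted laterally by half a lane width to obtain candidate ego-lane boundaries. All function names, the dummy polyline, and the 3.5 m lane width are assumptions for illustration.

import numpy as np

def to_vehicle_frame(osm_xy, ego_xy, ego_yaw):
    # Rotate/translate OSM centerline points (metric map coordinates)
    # into the vehicle frame given the GPS/IMU pose.
    c, s = np.cos(-ego_yaw), np.sin(-ego_yaw)
    R = np.array([[c, -s], [s, c]])
    return (osm_xy - ego_xy) @ R.T

def offset_polyline(centerline, lateral_offset):
    # Shift a 2-D polyline sideways along its left-pointing normals,
    # e.g. by +/- half a lane width to get the two lane boundaries.
    tangents = np.gradient(centerline, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    return centerline + lateral_offset * normals

# Dummy stand-ins for a real OSM way and the GPS/IMU pose.
osm_centerline = np.array([[0.0, 0.0], [10.0, 0.5], [20.0, 2.0], [30.0, 4.5]])
ego_position, ego_yaw = np.array([0.0, 0.0]), 0.05

centerline_veh = to_vehicle_frame(osm_centerline, ego_position, ego_yaw)
lane_width = 3.5  # assumed typical lane width in metres
left_boundary = offset_polyline(centerline_veh, +lane_width / 2)
right_boundary = offset_polyline(centerline_veh, -lane_width / 2)
print(left_boundary.round(2))

In the actual method, such map-derived boundaries would still have to be fused with image/LiDAR road features; this sketch only covers the geometric prior.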

Evaluation in Bird's Eye View


Benchmark   MaxF      AP        PRE       REC       FPR      FNR
UM_LANE     94.44 %   88.59 %   95.95 %   92.97 %   0.69 %   7.03 %
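For reference, and assuming the standard KITTI road/lane benchmark conventions, the columns follow the usual pixel-wise definitions, with TP, FP, FN, TN counted over bird's-eye-view pixels:

\[
\mathrm{PRE} = \frac{TP}{TP+FP},\qquad
\mathrm{REC} = \frac{TP}{TP+FN},\qquad
\mathrm{FPR} = \frac{FP}{FP+TN},\qquad
\mathrm{FNR} = \frac{FN}{TP+FN},
\]
\[
\mathrm{MaxF} = \max_{\tau}\,\frac{2\,\mathrm{PRE}(\tau)\,\mathrm{REC}(\tau)}{\mathrm{PRE}(\tau)+\mathrm{REC}(\tau)},
\]

where \tau is the confidence threshold swept to trace the precision/recall curve and AP is the average precision over that curve.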

Behavior Evaluation


Benchmark   PRE-20    F1-20     HR-20     PRE-30    F1-30     HR-30     PRE-40    F1-40     HR-40
UM_LANE     99.18 %   99.36 %   99.29 %   98.70 %   98.20 %   97.17 %   96.74 %   90.80 %   90.79 %
The suffixes -20/-30/-40 denote the longitudinal evaluation distance in metres; PRE, F1, and HR are precision, F1 score, and hit rate.

Road/Lane Detection

The following plots show precision/recall curves for the bird's eye view evaluation.


Distance-dependent Behavior Evaluation

The following plots show the F1 score, precision, and hit rate with respect to the longitudinal distance used for evaluation.



[Plots: F1 score, precision, and hit rate over longitudinal distance]

Visualization of Results

The following images illustrate the performance of the method qualitatively on a couple of test images. We first show results in the perspective image, followed by evaluation in bird's eye view. Here, red denotes false negatives, blue areas correspond to false positives and green represents true positives.



[Qualitative result images: perspective-view results followed by bird's-eye-view error maps]
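
As a hedged illustration of this colour coding (not the benchmark's actual rendering code), the snippet below builds such an overlay from binary ground-truth and prediction masks; the function name and the toy masks are invented for the example.

import numpy as np

def colorize_errors(gt, pred):
    # Paint false negatives red, false positives blue, true positives green.
    gt, pred = gt.astype(bool), pred.astype(bool)
    img = np.zeros(gt.shape + (3,), dtype=np.uint8)  # RGB canvas, background black
    img[gt & ~pred] = (255, 0, 0)   # false negatives -> red
    img[~gt & pred] = (0, 0, 255)   # false positives -> blue
    img[gt & pred] = (0, 255, 0)    # true positives  -> green
    return img

# Toy 4x4 masks standing in for bird's-eye-view ground truth and prediction.
gt = np.array([[1, 1, 0, 0]] * 4)
pred = np.array([[0, 1, 1, 0]] * 4)
print(colorize_errors(gt, pred)[0])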

