Method

MTStereo 1.0 [mts1]
https://github.com/rbrandt1/MaxTreeS

Submitted on 13 Mar. 2020 10:10 by
Remi Brandt (RUG)

Running time: 0.8 s
Environment: 4 cores @ 3.5 GHz (C/C++)

Method Description:
Extraction of depth from images is of great
importance for various computer vision
applications. Methods based on convolutional
neural networks are very accurate but have high
computation requirements, which can be met with
GPUs. However, GPUs are difficult to use on
low-power devices such as robots and embedded
systems. In this light, we propose a stereo
matching method suited to applications in which
limited computational and energy resources are
available. The algorithm is based on a
hierarchical representation of image pairs,
which is used to restrict the disparity search
range. We propose a cost function that takes
into account region contextual information, and
a cost aggregation method that preserves
disparity borders. We tested the proposed method
on the Middlebury and KITTI benchmark data sets
and on the TrimBot2020 synthetic data. We
achieved accuracy and time efficiency results
which show that the method is suitable for
deployment on embedded and robotics systems.
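
The hierarchical representation mentioned above is a 1-D Max-Tree built per scanline. As an illustration only (this is a generic component-tree construction, not the authors' implementation; the function name `max_tree_1d` and the union-find scheme are my own), the max-tree of one image row can be built as follows:

```python
def max_tree_1d(row):
    """Build a 1-D max-tree (the component tree of the upper level
    sets) of one image row.  Pixels are processed in decreasing
    intensity order; a union-find structure merges each pixel with
    its already-processed neighbours, so every component of an upper
    level set gets a canonical pixel, and parent[i] points to the
    canonical pixel of the enclosing component at a lower level.
    The canonical pixel of the whole row is its own parent."""
    n = len(row)
    order = sorted(range(n), key=lambda i: row[i], reverse=True)
    parent = [-1] * n            # max-tree parent pointers
    root = [-1] * n              # union-find roots (path compression)
    processed = [False] * n

    def find(i):
        while root[i] != i:
            root[i] = root[root[i]]
            i = root[i]
        return i

    for i in order:
        root[i] = i
        parent[i] = i
        processed[i] = True
        for j in (i - 1, i + 1):            # the two 1-D neighbours
            if 0 <= j < n and processed[j]:
                r = find(j)
                if r != i:
                    parent[r] = i           # attach higher component
                    root[r] = i
    return parent
```

For example, `max_tree_1d([1, 3, 2])` yields `[0, 2, 0]`: pixel 1 (level 3) is a child of pixel 2 (level 2), which is a child of the root pixel 0 (level 1).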
Parameters:
See paper.
Latex Bibtex:
@article{BRANDT2020,
  title = "Efficient binocular stereo correspondence matching with 1-D Max-Trees",
  journal = "Pattern Recognition Letters",
  year = "2020",
  issn = "0167-8655",
  doi = "https://doi.org/10.1016/j.patrec.2020.02.019",
  url = "http://www.sciencedirect.com/science/article/pii/S0167865520300581",
  author = "Rafaël Brandt and Nicola Strisciuglio and Nicolai Petkov and Michael H.F. Wilkinson",
  keywords = "Stereo matching, Mathematical morphology, Tree structures",
  abstract = "Extraction of depth from images is of great importance for various computer vision applications. Methods based on convolutional neural networks are very accurate but have high computation requirements, which can be achieved with GPUs. However, GPUs are difficult to use on devices with low power requirements like robots and embedded systems. In this light, we propose a stereo matching method appropriate for applications in which limited computational and energy resources are available. The algorithm is based on a hierarchical representation of image pairs which is used to restrict disparity search range. We propose a cost function that takes into account region contextual information and a cost aggregation method that preserves disparity borders. We tested the proposed method on the Middlebury and KITTI benchmark data sets and on the TrimBot2020 synthetic data. We achieved accuracy and time efficiency results that show that the method is suitable to be deployed on embedded and robotics systems."
}

Detailed Results

This page provides detailed results for the selected method(s). For the first 20 test images, the tables show the percentage of erroneous pixels. We use the error metric described in Object Scene Flow for Autonomous Vehicles (CVPR 2015), which considers a pixel correctly estimated if the disparity or flow end-point error is <3 px or <5% (for scene flow, this criterion must be fulfilled for both disparity maps and the flow map).

Underneath each table, the left input image, the estimated results, and the error maps are shown (for disp_0/disp_1/flow/scene_flow, respectively). The error map uses the log-color scale described in Object Scene Flow for Autonomous Vehicles (CVPR 2015), depicting correct estimates (<3 px or <5% error) in blue and wrong estimates in red color tones. Dark regions in the error images denote occluded pixels which fall outside the image boundaries. The false-color maps of the results are scaled to the largest ground-truth disparity values / flow magnitudes.
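
The outlier criterion above can be sketched in a few lines. This is a generic NumPy reimplementation under stated assumptions, not the official KITTI devkit code; the function name `d1_error` and its argument names are illustrative:

```python
import numpy as np

def d1_error(disp_est, disp_gt, valid):
    """KITTI 2015 D1 outlier fraction: a pixel is erroneous only if
    its disparity end-point error is >= 3 px AND >= 5% of the true
    disparity (i.e. it is correct if the error is < 3 px or < 5%).
    `valid` is a boolean mask marking pixels with ground truth."""
    err = np.abs(disp_est - disp_gt)
    outlier = (err >= 3.0) & (err >= 0.05 * np.abs(disp_gt)) & valid
    return outlier.sum() / max(valid.sum(), 1)
```

Multiplying the returned fraction by 100 gives percentages comparable to the D1 values in the tables that follow.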

Test Set Average

Error D1-bg D1-fg D1-all
All / All 28.03 46.55 31.11
All / Est 8.89 9.06 8.92
Noc / All 27.59 46.14 30.65
Noc / Est 8.56 8.76 8.60

Test Image 0

Error D1-bg D1-fg D1-all
All / All 20.10 13.22 19.15
All / Est 13.13 2.76 10.15
Noc / All 19.95 13.22 19.01
Noc / Est 13.10 2.76 10.13
(Figures: Input Image, D1 Result, D1 Error)


Test Image 1

Error D1-bg D1-fg D1-all
All / All 27.68 38.78 28.92
All / Est 12.42 6.67 11.06
Noc / All 27.39 38.78 28.68
Noc / Est 12.10 6.67 10.81
(Figures: Input Image, D1 Result, D1 Error)


Test Image 2

Error D1-bg D1-fg D1-all
All / All 31.77 59.77 33.13
All / Est 11.47 15.69 11.71
Noc / All 31.36 59.77 32.77
Noc / Est 10.81 15.69 11.09
(Figures: Input Image, D1 Result, D1 Error)


Test Image 3

Error D1-bg D1-fg D1-all
All / All 29.16 64.33 32.41
All / Est 10.82 15.96 11.27
Noc / All 28.71 64.33 32.05
Noc / Est 10.73 15.96 11.19
(Figures: Input Image, D1 Result, D1 Error)


Test Image 4

Error D1-bg D1-fg D1-all
All / All 28.09 42.02 30.40
All / Est 8.13 4.37 7.18
Noc / All 27.30 42.02 29.78
Noc / Est 7.45 4.37 6.66
(Figures: Input Image, D1 Result, D1 Error)


Test Image 5

Error D1-bg D1-fg D1-all
All / All 39.03 50.99 40.10
All / Est 8.20 8.76 8.28
Noc / All 38.06 50.99 39.25
Noc / Est 8.17 8.76 8.26
(Figures: Input Image, D1 Result, D1 Error)


Test Image 6

Error D1-bg D1-fg D1-all
All / All 43.66 54.54 44.80
All / Est 8.72 9.22 8.80
Noc / All 43.81 54.54 44.96
Noc / Est 8.56 9.22 8.66
(Figures: Input Image, D1 Result, D1 Error)


Test Image 7

Error D1-bg D1-fg D1-all
All / All 23.81 44.33 27.82
All / Est 8.02 14.05 9.41
Noc / All 23.60 44.33 27.71
Noc / Est 7.57 14.05 9.07
(Figures: Input Image, D1 Result, D1 Error)


Test Image 8

Error D1-bg D1-fg D1-all
All / All 21.11 11.16 19.27
All / Est 6.88 0.80 5.04
Noc / All 21.12 11.16 19.28
Noc / Est 6.88 0.80 5.04
(Figures: Input Image, D1 Result, D1 Error)


Test Image 9

Error D1-bg D1-fg D1-all
All / All 21.07 22.85 21.53
All / Est 8.87 6.03 7.88
Noc / All 21.04 23.57 21.67
Noc / Est 8.76 6.03 7.81
(Figures: Input Image, D1 Result, D1 Error)


Test Image 10

Error D1-bg D1-fg D1-all
All / All 20.37 26.56 21.78
All / Est 14.51 2.78 10.09
Noc / All 20.12 26.56 21.61
Noc / Est 14.07 2.78 9.79
(Figures: Input Image, D1 Result, D1 Error)


Test Image 11

Error D1-bg D1-fg D1-all
All / All 20.96 31.31 22.82
All / Est 8.43 4.22 7.33
Noc / All 20.83 31.31 22.73
Noc / Est 8.40 4.22 7.30
(Figures: Input Image, D1 Result, D1 Error)


Test Image 12

Error D1-bg D1-fg D1-all
All / All 26.31 45.42 27.60
All / Est 7.14 4.26 6.81
Noc / All 26.22 45.42 27.52
Noc / Est 6.78 4.26 6.49
(Figures: Input Image, D1 Result, D1 Error)


Test Image 13

Error D1-bg D1-fg D1-all
All / All 24.64 17.48 23.76
All / Est 7.07 5.97 6.79
Noc / All 24.29 17.48 23.44
Noc / Est 6.82 5.97 6.61
(Figures: Input Image, D1 Result, D1 Error)


Test Image 14

Error D1-bg D1-fg D1-all
All / All 29.79 67.68 30.44
All / Est 9.08 3.85 8.98
Noc / All 29.50 67.68 30.16
Noc / Est 8.85 3.85 8.76
(Figures: Input Image, D1 Result, D1 Error)


Test Image 15

Error D1-bg D1-fg D1-all
All / All 24.14 63.33 27.68
All / Est 8.13 1.42 7.51
Noc / All 24.04 63.33 27.66
Noc / Est 7.82 1.42 7.22
(Figures: Input Image, D1 Result, D1 Error)


Test Image 16

Error D1-bg D1-fg D1-all
All / All 26.62 39.86 28.57
All / Est 7.53 5.06 7.05
Noc / All 26.32 39.86 28.33
Noc / Est 7.46 5.06 7.00
(Figures: Input Image, D1 Result, D1 Error)


Test Image 17

Error D1-bg D1-fg D1-all
All / All 16.98 38.93 19.27
All / Est 6.46 1.42 5.87
Noc / All 16.27 38.93 18.67
Noc / Est 5.79 1.42 5.28
(Figures: Input Image, D1 Result, D1 Error)


Test Image 18

Error D1-bg D1-fg D1-all
All / All 36.64 42.74 39.54
All / Est 11.65 16.72 14.61
Noc / All 36.01 42.74 39.24
Noc / Est 11.65 16.72 14.61
(Figures: Input Image, D1 Result, D1 Error)


Test Image 19

Error D1-bg D1-fg D1-all
All / All 14.29 30.82 16.16
All / Est 8.55 6.02 8.10
Noc / All 14.00 30.82 15.94
Noc / Est 8.26 6.02 7.86
(Figures: Input Image, D1 Result, D1 Error)



