The stereo / flow benchmark consists of 194 training image pairs and 195 test image pairs, saved in lossless PNG format. Our evaluation server computes the percentage of bad pixels over all non-occluded or over all (= occluded + non-occluded ground truth) pixels. We require that all methods use the same parameter set for all test pairs. Our development kit provides details about the data format as well as MATLAB / C++ utility functions for reading and writing disparity maps and flow fields.
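The development kit documents the PNG encoding of disparity maps: values are stored as 16-bit unsigned integers scaled by 256, with 0 marking pixels without a valid value. A minimal Python sketch of that conversion (the `/ 256` convention and the 0-as-invalid marker are taken from the devkit's description; the function names here are our own):

```python
import numpy as np

def disparity_from_uint16(raw: np.ndarray) -> np.ndarray:
    """Decode a uint16 disparity map: disparity = value / 256.0; 0 marks invalid (-> NaN)."""
    disp = raw.astype(np.float32) / 256.0
    disp[raw == 0] = np.nan
    return disp

def disparity_to_uint16(disp: np.ndarray) -> np.ndarray:
    """Encode float disparities back to uint16; NaN (invalid) becomes 0."""
    out = np.zeros(disp.shape, dtype=np.uint16)
    valid = ~np.isnan(disp)
    # clip to [1, 65535] so valid pixels never collide with the 0 = invalid marker
    out[valid] = np.clip(disp[valid] * 256.0, 1, 65535).round().astype(np.uint16)
    return out
```

Reading the actual PNG files additionally requires a 16-bit-aware image library (e.g. one that preserves uint16 channels); the devkit's own MATLAB / C++ readers handle this for you.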
Our evaluation table ranks all methods according to the number of erroneous non-occluded pixels at the specified disparity / end-point error threshold. All methods providing less than 100% density have been interpolated using simple background interpolation, as explained in the corresponding header file in the development kit. For each method we show:
Out-Noc: Percentage of erroneous pixels in non-occluded areas
Out-All: Percentage of erroneous pixels in total
Avg-Noc: Average disparity / end-point error in non-occluded areas
Avg-All: Average disparity / end-point error in total
Density: Percentage of pixels for which the method provides an estimate
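The four error statistics above can be sketched as follows (a simplified illustration, not the evaluation server's code; `noc_mask` is assumed to mark non-occluded ground-truth pixels, and `tau` is the error threshold, 3 pixels for the main ranking):

```python
import numpy as np

def stereo_errors(disp_est, disp_gt, noc_mask, tau=3.0):
    """Out-* = percentage of pixels whose absolute error exceeds tau;
    Avg-* = mean absolute error. 'Noc' restricts to non-occluded pixels,
    'All' uses every pixel with finite ground truth."""
    valid = np.isfinite(disp_gt)
    err = np.abs(disp_est - disp_gt)

    def stats(mask):
        e = err[mask]
        return (e > tau).mean() * 100.0, e.mean()

    out_noc, avg_noc = stats(valid & noc_mask)
    out_all, avg_all = stats(valid)
    return out_noc, out_all, avg_noc, avg_all
```

For optical flow the same structure applies with the scalar disparity error replaced by the Euclidean end-point error of the 2D flow vectors.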
Note: Our main ranking is computed at a 3 pixel error threshold, evaluating all pixels. For methods which do not provide dense results, we use background interpolation to fill in missing values. IMPORTANT NOTE: On 04.11.2013 we improved the ground truth disparity maps and flow fields, leading to slight improvements for all methods. Please re-download the stereo/flow dataset with the improved ground truth for training if you downloaded the dataset prior to 04.11.2013, and please report these new numbers for all future submissions. The last leaderboards right before the changes can be found here: stereo and flow!
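One plausible reading of the "background interpolation" used to densify sparse submissions is a row-wise fill in which each gap between two valid pixels takes the smaller (i.e., background) of the two flanking disparities, and gaps touching the image border take the nearest valid value. This is a sketch of that idea, not the devkit's exact code:

```python
import numpy as np

def interpolate_background(disp):
    """Row-wise fill of invalid (NaN) disparities.
    Interior gaps: smaller (background) of the two flanking values.
    Border gaps: nearest valid value in the row."""
    out = disp.copy()
    for row in out:  # each row is a view into out, so in-place edits stick
        valid = np.flatnonzero(np.isfinite(row))
        if valid.size == 0:
            continue
        row[:valid[0]] = row[valid[0]]        # extrapolate left border
        row[valid[-1] + 1:] = row[valid[-1]]  # extrapolate right border
        for a, b in zip(valid[:-1], valid[1:]):
            if b > a + 1:                     # interior gap between columns a and b
                row[a + 1:b] = min(row[a], row[b])
    return out
```

Preferring the smaller disparity in a gap assumes missing pixels belong to the background surface, which is the typical cause of half-occlusions in stereo.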
Additional information used by the methods
Stereo: Method uses left and right (stereo) images
Multiview: Method uses more than 2 temporally adjacent images
Motion stereo: Method uses epipolar geometry for computing optical flow
Optical Flow Evaluation
This table ranks general optical flow methods, performing a full 2D search, as compared to the motion stereo methods below.
The settings column describes additional assumptions made / information used by the methods:
ms = motion stereo: Usage of the epipolar geometry to restrict the search problem to 1D
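The epipolar constraint reduces the search as follows: for a pixel x in the first image, the matching pixel in the second image must lie on the epipolar line l' = F x (with F the fundamental matrix), so candidates can be enumerated by a single scalar parameter instead of over a 2D window. A small illustrative sketch (the function names and the candidate parametrization are our own, not part of the benchmark):

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F @ [x, y, 1] in the second image for pixel x = (x, y)
    in the first image, normalized so the normal (a, b) has unit length."""
    xh = np.array([x[0], x[1], 1.0])
    l = F @ xh
    return l / np.linalg.norm(l[:2])

def candidates_along_line(l, x, max_step=50):
    """1D candidate set: integer steps along l', starting from the point on l'
    closest to x. Replaces a 2D flow search window with a line search."""
    a, b, c = l
    d = a * x[0] + b * x[1] + c          # signed distance from x to the line
    p0 = np.array([x[0] - a * d, x[1] - b * d])  # foot of the perpendicular
    direction = np.array([-b, a])        # unit vector along the line
    ts = np.arange(-max_step, max_step + 1)
    return p0[None, :] + ts[:, None] * direction[None, :]
```

For a rectified pair (pure horizontal motion), l' is the horizontal scanline through x, and the candidate set degenerates to the familiar 1D disparity search of stereo matching.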
HCI/Bosch Robust Vision Challenge: Optical flow and stereo vision challenge on high resolution imagery recorded at a high frame rate under diverse weather conditions (e.g., sunny, cloudy, rainy). The Robert Bosch AG provides a prize for the best performing method.
Middlebury Optical Flow Evaluation: The classic optical flow evaluation benchmark, featuring eight test images, with very accurate ground truth obtained from a UV light-pattern acquisition system. 24 image pairs are provided in total.