Method

TrackMPNN: A Message Passing Graph Neural Architecture for Multi-Object Tracking
https://github.com/arangesh/TrackMPNN

Submitted on 9 Jan. 2021 00:24 by
Akshay Rangesh (UC San Diego)

Running time: 0.05 s
Environment: 4 cores @ 3.0 GHz (Python)

Method Description:
A framework based on dynamic undirected graphs that represent the data association problem over multiple timesteps, together with a message passing graph neural network (GNN) that operates on these graphs to produce the desired likelihood for every candidate association. We use only the 2D detection box location and score as the descriptor for each object instance.
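
As a rough illustration of the setup described above (not the authors' implementation; see the GitHub repository linked above for the real code), the following Python sketch builds a small two-frame association graph from box-plus-score descriptors and runs one round of message passing to score candidate associations. All layer sizes, class names, and the sum-aggregation scheme are illustrative assumptions.

# Minimal sketch, assuming PyTorch; node features are [x, y, w, h, score],
# mirroring the 2D box-plus-score descriptor mentioned in the description.
import torch
import torch.nn as nn

class ToyTrackMPN(nn.Module):
    def __init__(self, feat_dim=5, hidden_dim=32):
        super().__init__()
        self.node_enc = nn.Linear(feat_dim, hidden_dim)    # encode box + score
        self.msg = nn.Linear(2 * hidden_dim, hidden_dim)   # message from a neighbor pair
        self.update = nn.GRUCell(hidden_dim, hidden_dim)   # node state update
        self.assoc_head = nn.Linear(hidden_dim, 1)         # association likelihood

    def forward(self, node_feats, edges):
        # node_feats: (N, 5) descriptors; edges: (i, j) pairs linking detections
        # across consecutive frames (candidate associations).
        h = torch.relu(self.node_enc(node_feats))
        agg = torch.zeros_like(h)
        for i, j in edges:                                 # sum aggregation (assumption)
            agg[i] += self.msg(torch.cat([h[i], h[j]]))
            agg[j] += self.msg(torch.cat([h[j], h[i]]))
        h = self.update(agg, h)
        # one likelihood per candidate association edge
        return torch.sigmoid(
            torch.stack([self.assoc_head(h[i] + h[j]).squeeze() for i, j in edges])
        )

# Example: two detections in frame t, two in frame t+1, all cross-frame pairs.
feats = torch.tensor([[0.10, 0.20, 0.05, 0.08, 0.9],
                      [0.60, 0.25, 0.06, 0.09, 0.8],
                      [0.12, 0.21, 0.05, 0.08, 0.9],
                      [0.58, 0.26, 0.06, 0.09, 0.7]])
edges = [(0, 2), (0, 3), (1, 2), (1, 3)]
print(ToyTrackMPN()(feats, edges))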
Parameters:
N/A
Latex Bibtex:
@article{rangesh2101trackmpnn,
  title={TrackMPNN: A Message Passing Graph Neural Architecture for Multi-Object Tracking},
  author={Rangesh, Akshay and Maheshwari, Pranav and Gebre, Mez and Mhatre, Siddhesh and Ramezani, Vahid and Trivedi, Mohan M},
  journal={arXiv preprint arXiv:2101.04206},
  year={2021}
}

Detailed Results

From all 29 test sequences, our benchmark computes the commonly used tracking metrics CLEAR MOT, MT/PT/ML, identity switches, and fragmentations [1, 2]. The tables below show all of these metrics.
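
For reference, the CLEAR MOT accuracy metrics [1] can be summarized with the minimal Python sketch below; this is only the textbook form of the formulas, and does not reproduce the evaluation server's exact matching threshold or don't-care-region handling.

def mota(fn, fp, ids, num_gt):
    # Multi-Object Tracking Accuracy: penalizes misses, false positives,
    # and identity switches relative to the number of ground-truth objects.
    return 1.0 - (fn + fp + ids) / num_gt

def moda(fn, fp, num_gt):
    # Multi-Object Detection Accuracy: MOTA without the ID-switch term.
    return 1.0 - (fn + fp) / num_gt

def motp(total_overlap, num_matches):
    # Multi-Object Tracking Precision: average bounding-box overlap
    # over all matched detection/ground-truth pairs.
    return total_overlap / num_matches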


Benchmark    MOTA      MOTP      MODA      MODP
CAR          87.74 %   84.55 %   88.92 %   87.46 %
PEDESTRIAN   53.22 %   73.69 %   54.93 %   91.74 %

Benchmark    recall    precision  F1        TP      FP     FN     FAR       #objects  #trajectories
CAR          93.52 %   96.67 %    95.07 %   36710   1266   2545   11.38 %   47508     2082
PEDESTRIAN   67.13 %   85.12 %    75.06 %   15701   2745   7689   24.68 %   24523     1129

Benchmark    MT        PT        ML        IDS    FRAG
CAR          84.77 %   13.38 %   1.85 %    404    607
PEDESTRIAN   33.68 %   47.77 %   18.56 %   395    1035
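
As a quick consistency check, the detection-level figures in the CAR row above follow directly from the reported TP/FP/FN counts:

tp, fp, fn = 36710, 1266, 2545
recall = tp / (tp + fn)                               # 0.9352 -> 93.52 %
precision = tp / (tp + fp)                            # 0.9667 -> 96.67 %
f1 = 2 * precision * recall / (precision + recall)    # 0.9507 -> 95.07 %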



[1] K. Bernardin, R. Stiefelhagen: Evaluating Multiple Object Tracking Performance: The CLEAR MOT Metrics. JIVP 2008.
[2] Y. Li, C. Huang, R. Nevatia: Learning to Associate: HybridBoosted Multi-Target Tracker for Crowded Scene. CVPR 2009.
