Staple: Complementary Learners for Real-Time Tracking

Luca Bertinetto, Jack Valmadre, Stuart Golodetz, Ondrej Miksik, Philip H.S. Torr

University of Oxford

{name.surname}@eng.ox.ac.uk

 

NEWS: Staple ranked #1 in "Beyond standard benchmarks" by Čehovin et al.

NEWS: VOT'17 and VOT/TraX code available.

 
[Figure: Staple pipeline]

Correlation Filter-based trackers have recently achieved excellent performance, showing great robustness to challenging situations exhibiting motion blur and illumination changes. However, since the model that they learn depends strongly on the spatial layout of the tracked object, they are notoriously sensitive to deformation. Models based on colour statistics have complementary traits: they cope well with variation in shape, but suffer when illumination is not consistent throughout a sequence. Moreover, colour distributions alone can be insufficiently discriminative. In this paper, we show that a simple tracker combining complementary cues in a ridge regression framework can operate faster than 80 FPS and outperform not only all entries in the popular VOT14 competition, but also recent and far more sophisticated trackers according to multiple benchmarks.
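To make the idea of combining complementary cues concrete, here is a minimal sketch of the fusion step: a per-pixel linear combination of the correlation-filter (template) response with the colour-histogram response. The function name and the merge weight `alpha` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def merge_responses(cf_response, colour_response, alpha=0.3):
    """Linearly combine two dense response maps of equal shape.

    cf_response:     response of the correlation-filter (template) model
    colour_response: response of the colour-histogram model
    alpha:           weight of the colour cue (illustrative value)
    """
    return (1.0 - alpha) * cf_response + alpha * colour_response

# Toy example: two 5x5 response maps with peaks at different locations.
cf = np.zeros((5, 5)); cf[2, 2] = 1.0    # template model peaks at the centre
col = np.zeros((5, 5)); col[2, 3] = 1.0  # colour model peaks one pixel right

merged = merge_responses(cf, col, alpha=0.3)
peak = np.unravel_index(np.argmax(merged), merged.shape)
print(peak)  # (2, 2): the template term dominates since 1 - alpha > alpha
```

The target location is then read off as the argmax of the merged map; because both cues contribute, a deformation that degrades the template score can be compensated by the colour score, and vice versa.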

 

Paper (CVPR 2016 proceedings)

 

Poster

 

Code (+ webcam demo)

 

▸ Results [ OTB-13 ] [ OTB-100 ] [ VOT-2016 ] [ VOT-2015 ] [ VOT-2017 ]

 

Code submitted to VOT'17 (compatible with the VOT TraX protocol)  

 

NOTE: a preliminary version of Staple took part in the VOT15 challenge as OACF. The report is available here.