#199: Robust target tracking using adaptive color feature and likelihood fusion

A. Bouix, N. Al-Shakarji, K. Gao, F. Bunyak, A. Chazot, A. Hafiane, and K. Palaniappan

International Society for Optics and Photonics, Volume 10645, pp. 106450L, 2018

visual object tracking, staple, correlation filter, color name, fusion



Designing a robust and accurate object tracker is important in many computer vision applications. The problem becomes more complicated when additional factors such as changing appearance, illumination, and scale are introduced in the sequence. Recently, trackers based on the correlation filter approach, such as Sum of Template and Pixel-wise Learners (STAPLE), have shown state-of-the-art short-term tracking performance. STAPLE consists of two major modules: a correlation filter learned on HOG features and a global color model represented by an RGB histogram. In this paper, we propose an improved STAPLE (iSTAPLE) tracker by adding Color Names (CN) features to the correlation part of the tracker. CN complements HOG because HOG alone can lead to tracking failures in cases where occlusion or deformation is present. Since color information can be a confusing and unreliable cue under rapid illumination changes, the Bhattacharyya distance is used to measure the color similarity between the target and the surrounding area, and thereby decide whether the color information is helpful. Because we use multiple feature cues to improve tracking performance, a robust approach to fuse them is required. To fully utilize all features and optimize the tracking result, numerous weight combinations assigned to each feature are tested. We show through comprehensive experiments on the VOT Challenge 2016 dataset that iSTAPLE obtains a gain of 25% in tracking robustness.
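The two mechanisms described above, gating the color cue by target/background histogram similarity and fusing the per-feature responses with fixed weights, can be sketched as follows. This is only an illustrative sketch, not the paper's implementation: the Bhattacharyya-distance variant (Hellinger form), the threshold `tau`, the weight triple `w`, and the function names are all assumptions made for the example.

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two normalized color histograms.
    Uses the sqrt(1 - BC) (Hellinger) form: 0 = identical, 1 = disjoint."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return np.sqrt(max(0.0, 1.0 - bc))

def fuse_responses(resp_hog, resp_cn, resp_color, dist, tau=0.5,
                   w=(0.5, 0.3, 0.2)):
    """Weighted linear fusion of per-feature response maps.
    When the target and surrounding histograms are too similar
    (dist < tau), the color response is dropped as non-discriminative
    and the remaining weights are renormalized."""
    if dist < tau:
        w_hog = w[0] / (w[0] + w[1])
        w_cn = w[1] / (w[0] + w[1])
        return w_hog * resp_hog + w_cn * resp_cn
    return w[0] * resp_hog + w[1] * resp_cn + w[2] * resp_color
```

In this sketch, identical target and background histograms yield a distance of 0 (color disabled), while fully disjoint histograms yield 1 (color trusted); the target location would then be taken as the argmax of the fused response map.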