
A template consensus method for visual tracking

Zhou, Tong-xue and Zeng, Dong-dong and Zhu, Ming and Kuijper, Arjan (2019):
A template consensus method for visual tracking.
In: Optoelectronics Letters, 15 (1), pp. 70-74, ISSN 1673-1905,
DOI: 10.1007/s11801-019-8109-2,
[Online-Edition: https://doi.org/10.1007/s11801-019-8109-2],
[Article]

Abstract

Visual tracking is a challenging problem in computer vision. Recently, correlation filter-based trackers have been shown to provide excellent tracking performance. Inspired by a sample consensus approach proposed for foreground detection, which classifies a given pixel as foreground or background based on its similarity to recently observed samples, we present a template consensus tracker based on the kernelized correlation filter (KCF). Instead of keeping only one target appearance model as in the original KCF, our method maintains a feature pool of several target appearance models and predicts the new target position by searching for the location of the maximal value across the response maps. Both quantitative and qualitative evaluations are performed on the CVPR2013 tracking benchmark dataset. The results show that our proposed method improves the original KCF tracker by 8.17% in the success plot and 8.11% in the precision plot.
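The position-prediction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the per-template response maps have already been produced by the KCF correlation step, and `consensus_response` is a hypothetical helper name.

```python
import numpy as np

def consensus_response(response_maps):
    """Given one KCF response map per template in the pool, return the
    (template_index, row, col) of the single highest response value.
    The (row, col) part is the predicted target position."""
    stacked = np.stack(response_maps)  # shape: (pool_size, H, W)
    return np.unravel_index(np.argmax(stacked), stacked.shape)

# Toy example: three 5x5 response maps, with a peak planted in the second one.
maps = [np.zeros((5, 5)) for _ in range(3)]
maps[1][2, 3] = 1.0
print(consensus_response(maps))  # (1, 2, 3)
```

The pool lets the tracker fall back on an older appearance model when the most recent one drifts, since the template whose response peaks highest wins.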

Item Type: Article
Published: 2019
Creators: Zhou, Tong-xue and Zeng, Dong-dong and Zhu, Ming and Kuijper, Arjan
Title: A template consensus method for visual tracking
Language: English

Journal or Publication Title: Optoelectronics Letters
Volume: 15
Number: 1
Uncontrolled Keywords: Computer vision based tracking, Object tracking, Tracking, Filtering, Appearance
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Mathematical and Applied Visual Computing
Date Deposited: 19 Jun 2019 11:16
DOI: 10.1007/s11801-019-8109-2
Official URL: https://doi.org/10.1007/s11801-019-8109-2