Combining Local and Global Models for Robust Re-detection
2018 (English). In: Proceedings of AVSS 2018. 2018 IEEE International Conference on Advanced Video and Signal-based Surveillance, Auckland, New Zealand, 27-30 November 2018, Institute of Electrical and Electronics Engineers (IEEE), 2018, pp. 25-30. Conference paper, published paper (refereed)
Abstract [en]
Discriminative Correlation Filters (DCF) have demonstrated excellent performance for visual tracking. However, these methods still struggle in occlusion and out-of-view scenarios due to the absence of a re-detection component. While such a component requires global knowledge of the scene to ensure robust re-detection of the target, the standard DCF is only trained on the local target neighborhood. In this paper, we augment the state-of-the-art DCF tracking framework with a re-detection component based on a global appearance model. First, we introduce a tracking confidence measure to detect target loss. Next, we propose a hard negative mining strategy to extract background distractor samples, used for training the global model. Finally, we propose a robust re-detection strategy that combines the global and local appearance model predictions. We perform comprehensive experiments on the challenging UAV123 and LTB35 datasets. Our approach shows consistent improvements over the baseline tracker, setting a new state-of-the-art on both datasets.
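The loss-detection and fusion steps described in the abstract can be sketched as follows. Note this is an illustrative outline only, not the paper's exact formulation: the peak-to-sidelobe ratio used as the confidence measure, the threshold `psr_threshold`, and the linear fusion weight `alpha` are all assumptions for the sake of the example.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """Confidence proxy for target loss: compares the correlation peak
    to the mean/std of the response outside a small window around it.
    (PSR is a common DCF confidence heuristic; the paper's actual
    measure may differ.)"""
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]
    mask = np.ones_like(response, dtype=bool)
    r0, c0 = max(peak_idx[0] - exclude, 0), max(peak_idx[1] - exclude, 0)
    mask[r0:peak_idx[0] + exclude + 1, c0:peak_idx[1] + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def track_step(local_response, global_scores, psr_threshold=8.0, alpha=0.5):
    """If the local DCF response is confident, use its peak directly.
    Otherwise declare the target lost and re-detect by fusing the local
    response with the global appearance model's score map."""
    if peak_to_sidelobe_ratio(local_response) >= psr_threshold:
        pos = np.unravel_index(np.argmax(local_response), local_response.shape)
        return pos, False  # confident local prediction, no re-detection
    combined = alpha * local_response + (1.0 - alpha) * global_scores
    pos = np.unravel_index(np.argmax(combined), combined.shape)
    return pos, True  # target was lost; position from fused prediction
```

A sharp local response keeps the tracker in normal operation, while a flat, low-confidence response (e.g. after occlusion) triggers the global re-detection path.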
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2018. pp. 25-30
National subject category
Computer Vision and Robotics (Autonomous Systems); Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-158403
DOI: 10.1109/AVSS.2018.8639159
ISI: 000468081400005
ISBN: 9781538692943 (digital)
ISBN: 9781538692936 (digital)
ISBN: 9781538692950 (print)
OAI: oai:DiVA.org:liu-158403
DiVA, id: diva2:1332807
Conference
15th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), 27-30 November 2018, Auckland, New Zealand
Note
Funding Agencies|SSF (SymbiCloud); VR (EMC2) [2016-05543]; CENIIT grant [18.14]; SNIC; WASP
Available from: 2019-06-28. Created: 2019-06-28. Last updated: 2019-10-30. Bibliographically approved.