Distilled Siamese Networks for Visual Tracking
2022 (English). In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539, Vol. 44, no. 12, p. 8896-8909. Article in journal (Refereed). Published.
Abstract [en]
In recent years, Siamese network based trackers have significantly advanced the state of the art in real-time tracking. Despite their success, Siamese trackers tend to suffer from high memory costs, which restrict their applicability to mobile devices with tight memory budgets. To address this issue, we propose a distilled Siamese tracking framework to learn small, fast, and accurate trackers (students) that capture critical knowledge from large Siamese trackers (teachers) via a teacher-students knowledge distillation model. This model is intuitively inspired by the one-teacher, multiple-students learning setup typically found in schools. In particular, our model contains a single teacher-student distillation module and a student-student knowledge sharing mechanism. The former is designed using a tracking-specific distillation strategy to transfer knowledge from a teacher to students. The latter enables mutual learning between students for a more in-depth understanding of the transferred knowledge. Extensive empirical evaluations on several popular Siamese trackers demonstrate the generality and effectiveness of our framework. Moreover, results on five tracking benchmarks show that the proposed distilled trackers achieve compression rates of up to 18x and frame rates of 265 FPS, while maintaining tracking accuracy comparable to that of the base models.
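The abstract names two ingredients: teacher-to-student distillation and student-to-student knowledge sharing. As a minimal sketch (not the paper's tracking-specific losses, which operate on response maps and bounding-box outputs), both can be expressed with standard temperature-scaled KL-divergence losses in the style of Hinton et al.; the function names, temperature value, and plain-Python implementation here are illustrative assumptions:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: T > 1 softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between softened teacher and student outputs.
    # The T*T factor keeps gradient magnitudes comparable across
    # temperatures. Symbols here are generic, not the paper's notation.
    p = softmax(teacher_logits, T)  # teacher (target distribution)
    q = softmax(student_logits, T)  # student (learned distribution)
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def mutual_loss(logits_a, logits_b, T=1.0):
    # Student-student knowledge sharing as symmetric KL:
    # each student also mimics its peer's predictions.
    return kd_loss(logits_a, logits_b, T) + kd_loss(logits_b, logits_a, T)
```

In a full tracking setup these terms would be combined with the ordinary supervised tracking loss; the sketch only shows the distillation components the abstract refers to.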
Place, publisher, year, edition, pages
IEEE Computer Society, 2022. Vol. 44, no. 12, p. 8896-8909
Keywords [en]
Training; Target tracking; Real-time systems; Visualization; Correlation; Knowledge engineering; Feature extraction; Siamese network; teacher-students; knowledge distillation; visual object tracking; Siamese trackers
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:liu:diva-190193
DOI: 10.1109/TPAMI.2021.3127492
ISI: 000880661400027
PubMedID: 34762585
OAI: oai:DiVA.org:liu-190193
DiVA, id: diva2:1714279
Note
Funding Agencies|National Natural Science Foundation of China [62106128]; Natural Science Foundation of Shandong Province [ZR202102240155]
2022-11-29