Unsupervised Novelty Detection in Pretrained Representation Space with Locally Adapted Likelihood Ratio
Linköping University, Department of Computer and Information Science, The Division of Statistics and Machine Learning. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0001-7411-2177
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0001-5076-5798
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0002-9217-9997
Linköping University, Department of Computer and Information Science, The Division of Statistics and Machine Learning. Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0003-3749-5820
2024 (English). In: International Conference on Artificial Intelligence and Statistics 2024, Proceedings of Machine Learning Research, 2024, Vol. 238. Conference paper, Published paper (Refereed)
Abstract [en]

Detecting novelties given unlabeled examples of normal data is a challenging task in machine learning, particularly when the novel and normal categories are semantically close. Large deep models pretrained on massive datasets can provide a rich representation space in which the simple k-nearest neighbor (k-NN) distance works as a novelty measure. However, as we show in this paper, the basic k-NN method can be insufficient in this context because it ignores the 'local geometry' of the distribution over representations as well as the impact of irrelevant 'background features'. To address this, we propose a fully unsupervised novelty detection approach that combines the flexibility of k-NN with a locally adapted scaling of dimensions, based on the 'neighbors of the nearest neighbor', and computes a 'likelihood ratio' in pretrained (self-supervised) representation spaces. Our experiments with image data show the advantage of this method when off-the-shelf vision transformers (e.g., pretrained by DINO) are used as the feature extractor without any fine-tuning.
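The abstract refers to the simple k-NN distance in a pretrained representation space as the baseline novelty measure that the paper improves upon. A minimal sketch of that baseline (not the paper's locally adapted likelihood-ratio method) is shown below; the random vectors stand in for pretrained embeddings such as DINO features, and the function name is illustrative:

```python
# Hedged sketch of the plain k-NN novelty baseline mentioned in the abstract.
# Random vectors stand in for pretrained (e.g., DINO) feature embeddings.
import numpy as np

def knn_novelty_score(train_feats, query_feats, k=5):
    """Score each query by its distance to the k-th nearest 'normal' feature.

    Larger scores suggest the query lies farther from the normal data.
    """
    # Pairwise Euclidean distances: (n_query, n_train) via broadcasting.
    d = np.linalg.norm(query_feats[:, None, :] - train_feats[None, :, :], axis=-1)
    # k-th smallest distance per query row (0-indexed, hence k - 1).
    return np.sort(d, axis=1)[:, k - 1]

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 8))              # "normal" embeddings
queries = np.vstack([rng.normal(0.0, 1.0, size=(5, 8)),   # in-distribution
                     rng.normal(6.0, 1.0, size=(5, 8))])  # shifted "novel" points
scores = knn_novelty_score(normal, queries, k=5)
print(scores.round(2))  # novel queries receive markedly larger scores
```

Because this score depends only on raw Euclidean distance, it treats all feature dimensions equally everywhere, which is exactly the limitation (ignoring local geometry and background features) that motivates the paper's locally adapted scaling.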

Place, publisher, year, edition, pages
2024. Vol. 238
Series
Proceedings of Machine Learning Research, ISSN 2640-3498
National Category
Computer Sciences; Computer graphics and computer vision; Signal Processing
Identifiers
URN: urn:nbn:se:liu:diva-203391
ISI: 001221034002024
OAI: oai:DiVA.org:liu-203391
DiVA id: diva2:1856978
Conference
27th International Conference on Artificial Intelligence and Statistics (AISTATS), Valencia, Spain, May 2-4, 2024
Available from: 2024-05-08. Created: 2024-05-08. Last updated: 2025-02-01.

Open Access in DiVA

No full text in DiVA

Authority records

Ahmadian, Amirhossein; Ding, Yifan; Eilertsen, Gabriel; Lindsten, Fredrik
