Polya Urn Latent Dirichlet Allocation: A Doubly Sparse Massively Parallel Sampler
Imperial College London, England.
Linköping University, Department of Computer and Information Science, The Division of Statistics and Machine Learning. Linköping University, Faculty of Arts and Sciences.
Linköping University, Department of Computer and Information Science. Linköping University, Faculty of Science & Engineering.
University of California, Santa Cruz, CA 95064, USA.
2019 (English). In: IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, E-ISSN 1939-3539, Vol. 41, no. 7, p. 1709-1719. Article in journal (Refereed). Published.
Abstract [en]

Latent Dirichlet Allocation (LDA) is a topic model widely used in natural language processing and machine learning. Most approaches to training the model rely on iterative algorithms, which makes it difficult to run LDA on big corpora that are best analyzed in parallel and distributed computational environments. Indeed, current approaches to parallel inference either don't converge to the correct posterior or require storage of large dense matrices in memory. We present a novel sampler that overcomes both problems, and we show that this sampler is faster, both empirically and theoretically, than previous Gibbs samplers for LDA. We do so by employing a novel Polya-urn-based approximation in the sparse partially collapsed sampler for LDA. We prove that the approximation error vanishes with data size, making our algorithm asymptotically exact, a property of importance for large-scale topic models. In addition, we show, via an explicit example, that, contrary to popular belief in the topic modeling literature, partially collapsed samplers can be more efficient than fully collapsed samplers. We conclude by comparing the performance of our algorithm with that of other approaches on well-known corpora.

Place, publisher, year, edition, pages
IEEE Computer Society, 2019. Vol. 41, no. 7, p. 1709-1719
Keywords [en]
Bayesian inference; big data; computational complexity; Gibbs sampling; latent Dirichlet allocation; Markov chain Monte Carlo; natural language processing; parallel and distributed systems; topic models
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:liu:diva-158856
DOI: 10.1109/TPAMI.2018.2832641
ISI: 000470972300014
PubMedID: 29994329
OAI: oai:DiVA.org:liu-158856
DiVA, id: diva2:1337638
Note

Funding agencies: Swedish Foundation for Strategic Research [RIT 15-0097]

Available from: 2019-07-16 Created: 2019-07-16 Last updated: 2021-01-26

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
PubMed

Search in DiVA

By author/editor
Magnusson, Måns; Jonsson, Leif
By organisation
The Division of Statistics and Machine Learning; Faculty of Arts and Sciences; Department of Computer and Information Science; Faculty of Science & Engineering
In the same journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
On the subject
Computer Sciences
