Pachinko allocation
{{short description|Statistical tool}}
{{Primary sources|date=September 2010}}
In machine learning and natural language processing, the pachinko allocation model (PAM) is a topic model. Topic models are a suite of algorithms to uncover the hidden thematic structure of a collection of documents. {{cite web|last=Blei|first=David|title=Topic modeling|url=http://www.cs.princeton.edu/~blei/topicmodeling.html|accessdate=4 October 2012|archive-url=https://web.archive.org/web/20121002061418/http://www.cs.princeton.edu/~blei/topicmodeling.html|archive-date=2 October 2012|url-status=dead}} The algorithm improves upon earlier topic models such as latent Dirichlet allocation (LDA) by modeling correlations between topics in addition to the word correlations which constitute topics. PAM provides more flexibility and greater expressive power
than latent Dirichlet allocation.{{cite conference|last1=Li|first1=Wei|last2=Blei|first2=David|last3=McCallum|first3=Andrew|title=Nonparametric Bayes Pachinko Allocation|year=2007|conference=Twenty-Third Conference on Uncertainty in Artificial Intelligence|arxiv=1206.5270}} While first described and implemented in the context of natural language processing, the algorithm may have applications in other fields such as bioinformatics. The
model is named for pachinko machines—a game popular in Japan, in which metal balls bounce down around
a complex collection of pins until they land in various bins at the bottom.
History
Pachinko allocation was first described by Wei Li and Andrew McCallum in 2006.{{cite book
| last1 = Li | first1 = Wei
| last2 = McCallum |first2 = Andrew
| title = Proceedings of the 23rd international conference on Machine learning - ICML '06
| chapter = Pachinko allocation: DAG-structured mixture models of topic correlations
| pages = 577–584
| year = 2006
| doi = 10.1145/1143844.1143917
| isbn = 1595933832
| s2cid = 13160178
| chapter-url = http://www.cs.umass.edu/~mccallum/papers/pam-icml06.pdf
}}
The idea was extended with hierarchical Pachinko allocation by Li, McCallum, and David Mimno in 2007.{{cite book
| last1 = Mimno | first1 = David
| last2 = Li |first2 = Wei
| last3 = McCallum |first3 = Andrew
| title = Proceedings of the 24th international conference on Machine learning
| chapter = Mixtures of hierarchical topics with Pachinko allocation
| date = 2007
| pages = 633–640
| doi = 10.1145/1273496.1273576
| isbn = 9781595937933
| s2cid = 6045658
| chapter-url = http://maroo.cs.umass.edu/pdf/IR-587.pdf
}} In 2007, McCallum and his colleagues proposed a nonparametric Bayesian prior for PAM based
on a variant of the hierarchical Dirichlet process (HDP). The algorithm has been implemented in the MALLET software package published by McCallum's group at the University of Massachusetts Amherst.
Model
{{expand section|date=July 2017}}
PAM connects words in V and topics in T with an arbitrary directed acyclic graph (DAG), where topic nodes occupy the interior levels and the leaves are words.
The probability of generating a whole corpus is the product of the probabilities for every document d in the corpus D: P(D) = ∏<sub>d∈D</sub> P(d).
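As a concrete illustration, the four-level PAM of Li and McCallum—a root node, a layer of super-topics, a layer of sub-topics, and words at the leaves—can be sketched as a generative process. This is a simplified sketch, not the authors' reference implementation; all sizes, variable names, and symmetric Dirichlet parameters below are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only
n_super, n_sub, vocab_size, doc_len = 3, 5, 10, 20

# Symmetric Dirichlet parameters (assumed, not from the paper)
alpha_root = np.ones(n_super)            # root -> super-topics
alpha_super = np.ones((n_super, n_sub))  # each super-topic -> sub-topics
beta = np.ones(vocab_size)               # each sub-topic -> words

# Corpus-level: each leaf-adjacent sub-topic is a distribution over words
phi = rng.dirichlet(beta, size=n_sub)    # shape (n_sub, vocab_size)

# Document-level: sample a multinomial at every interior node of the DAG
theta_root = rng.dirichlet(alpha_root)
theta_super = np.array([rng.dirichlet(a) for a in alpha_super])

# Generate one document: a path from the root to a word for each token
doc = []
for _ in range(doc_len):
    s = rng.choice(n_super, p=theta_root)    # choose a super-topic
    z = rng.choice(n_sub, p=theta_super[s])  # choose one of its sub-topics
    w = rng.choice(vocab_size, p=phi[z])     # emit a word from the sub-topic
    doc.append(w)
```

Because the path to each word passes through a sampled super-topic, the document-level distributions over super-topics capture correlations between the sub-topics they share—the modeling step that distinguishes PAM from LDA's single flat topic layer.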
See also
- Probabilistic latent semantic indexing (PLSI), an early topic model from Thomas Hofmann in 1999.{{cite journal
|last1 = Hofmann
|first1 = Thomas
|title = Probabilistic Latent Semantic Indexing
|journal = Proceedings of the Twenty-Second Annual International SIGIR Conference on Research and Development in Information Retrieval
|year = 1999
|url = http://www.cs.brown.edu/~th/papers/Hofmann-SIGIR99.pdf
|url-status = dead
|archiveurl = https://web.archive.org/web/20101214074049/http://www.cs.brown.edu/~th/papers/Hofmann-SIGIR99.pdf
|archivedate = 2010-12-14
}}
- Latent Dirichlet allocation, a generalization of PLSI developed by David Blei, Andrew Ng, and Michael Jordan in 2003, allowing documents to have a mixture of topics.{{cite journal
|last1 = Blei
|first1 = David M.
|last2 = Ng
|first2 = Andrew Y.
|last3 = Jordan
|first3 = Michael I
|authorlink3 = Michael I. Jordan
|title = Latent Dirichlet allocation
|journal = Journal of Machine Learning Research
|date = January 2003
|volume = 3
|pages = 993–1022
|url = http://jmlr.csail.mit.edu/papers/v3/blei03a.html
|last4 = Lafferty
|first4 = John
|access-date = 19 July 2010
|archive-url = https://web.archive.org/web/20120501152722/http://jmlr.csail.mit.edu/papers/v3/blei03a.html
|archive-date = 1 May 2012
|url-status = dead
}}
- MALLET, an open-source Java library that implements Pachinko allocation.
References
{{Reflist}}
External links
- [http://videolectures.net/icml07_mimno_moht/ Mixtures of Hierarchical Topics with Pachinko Allocation], a video recording of David Mimno presenting HPAM in 2007.
{{Use dmy dates|date=October 2018}}
{{Natural Language Processing}}
{{DEFAULTSORT:Pachinko Allocation}}
Category:Statistical natural language processing
Category:Latent variable models
{{comp-sci-stub}}