DOI: 10.1145/1008992.1009122

Information retrieval using hierarchical Dirichlet processes

Published: 25 July 2004

Abstract

An information retrieval method is proposed using a hierarchical Dirichlet process as a prior on the parameters of a set of multinomial distributions. The resulting method naturally includes a number of features found in other popular methods. Specifically, tf.idf-like term weighting and document length normalisation are recovered. The new method is compared with Okapi BM-25 [3] and the Twenty-One model [1] on TREC data and is shown to give better performance.
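
The tf.idf-like weighting and length normalisation mentioned above arise from the query-likelihood form that smoothed multinomial document models take. The sketch below is illustrative only and is not the paper's exact hierarchical Dirichlet process formulation: it scores a document with a unigram model shrunk toward the collection distribution; the function name score_document and the fixed concentration parameter mu are assumptions made for this example.

import math
from collections import Counter

def score_document(query_terms, doc_terms, collection_counts, collection_len, mu=2000.0):
    # Log query likelihood under a document model smoothed toward the
    # collection distribution. The shrinkage toward the background model is
    # what produces tf.idf-like weighting and document length normalisation.
    doc_counts = Counter(doc_terms)
    doc_len = len(doc_terms)
    log_score = 0.0
    for term in query_terms:
        p_bg = collection_counts.get(term, 0) / collection_len  # background probability
        if p_bg == 0.0:
            continue  # term unseen in the collection; skipped in this sketch
        p_doc = (doc_counts[term] + mu * p_bg) / (doc_len + mu)  # smoothed document probability
        log_score += math.log(p_doc)
    return log_score

# Toy usage: rank two documents against a query.
collection = ["the", "dirichlet", "process", "the", "retrieval", "model", "model"]
coll_counts = Counter(collection)
docs = {"d1": ["dirichlet", "process", "model"], "d2": ["the", "retrieval", "model"]}
query = ["dirichlet", "retrieval"]
ranking = sorted(docs, key=lambda d: score_document(query, docs[d], coll_counts, len(collection)), reverse=True)

In the full hierarchical Dirichlet process model the shared background distribution is itself inferred rather than fixed, which is where the proposed method goes beyond simple Dirichlet smoothing.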

References

[1] D. Hiemstra and W. Kraaij. Twenty-One at TREC-7: Ad-hoc and cross-language track. In Text REtrieval Conference, pages 174--185, 1998.
[2] J. M. Ponte and W. B. Croft. A language modeling approach to information retrieval. In Research and Development in Information Retrieval, pages 275--281, 1998.
[3] S. E. Robertson, S. Walker, M. Hancock-Beaulieu, A. Gull, and M. Lau. Okapi at TREC. In Text REtrieval Conference, pages 21--30, 1992.
[4] Y. W. Teh, M. I. Jordan, M. J. Beal, and D. M. Blei. Hierarchical Dirichlet processes. Technical Report 653, Department of Statistics, UC Berkeley, 2003.

    Published In

    SIGIR '04: Proceedings of the 27th annual international ACM SIGIR conference on Research and development in information retrieval
    July 2004
    624 pages
    ISBN:1581138814
    DOI:10.1145/1008992

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. hierarchical Dirichlet processes
    2. probabilistic information retrieval

    Qualifiers

    • Article

    Conference

    SIGIR '04

    Acceptance Rates

    Overall Acceptance Rate: 792 of 3,983 submissions, 20%

    Cited By

    • (2025) A Study of the Nonparametric Bayesian Hierarchical Dirichlet Mixture Model for Document Clustering. Proceedings of the Third International Conference on Cognitive and Intelligent Computing, Volume 2, pages 341-351. DOI: 10.1007/978-981-97-9266-5_34. Online publication date: 26-Feb-2025.
    • (2024) Blocked Gibbs Sampler for Hierarchical Dirichlet Processes. Journal of Computational and Graphical Statistics, pages 1-11. DOI: 10.1080/10618600.2024.2388543. Online publication date: 27-Sep-2024.
    • (2018) Semantic relation extraction aware of N-gram features from unstructured biomedical text. Journal of Biomedical Informatics, 86, pages 59-70. DOI: 10.1016/j.jbi.2018.08.011. Online publication date: Oct-2018.
    • (2018) Hierarchical Dirichlet Processes with Social Influence. Natural Language Processing and Chinese Computing, pages 490-502. DOI: 10.1007/978-3-319-73618-1_41. Online publication date: 5-Jan-2018.
    • (2015) A Pólya Urn Document Language Model for Improved Information Retrieval. ACM Transactions on Information Systems, 33(4), pages 1-34. DOI: 10.1145/2746231. Online publication date: 4-May-2015.
    • (2015) The Supervised Hierarchical Dirichlet Process. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(2), pages 243-255. DOI: 10.1109/TPAMI.2014.2315802. Online publication date: 1-Feb-2015.
    • (2014) Unsupervised event coreference resolution. Computational Linguistics, 40(2), pages 311-347. DOI: 10.1162/COLI_a_00174. Online publication date: 1-Jun-2014.
    • (2013) Incorporating Hierarchical Dirichlet Process into Tag Topic Model. Chinese Lexical Semantics, pages 368-377. DOI: 10.1007/978-3-642-45185-0_39. Online publication date: 2013.
    • (2009) Hierarchical Dirichlet trees for information retrieval. Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics, pages 173-181. DOI: 10.5555/1620754.1620780. Online publication date: 31-May-2009.
    • (2005) Bayesian learning in text summarization. Proceedings of the conference on Human Language Technology and Empirical Methods in Natural Language Processing, pages 249-256. DOI: 10.3115/1220575.1220607. Online publication date: 6-Oct-2005.
