title: Combining Independent Modules in Lexical Multiple-Choice Problems
creator: Turney, Peter D.
creator: Littman, Michael L.
creator: Bigham, Jeffrey
creator: Shnayder, Victor
subject: Statistical Models
subject: Language
subject: Computational Linguistics
subject: Semantics
subject: Machine Learning
description: Existing statistical approaches to natural language problems are coarse approximations to the true complexity of language processing. As such, no single technique will be best for all problem instances. Many researchers are examining ensemble methods that combine the output of multiple modules to create more accurate solutions. This paper examines three merging rules for combining probability distributions: the familiar mixture rule, the logarithmic rule, and a novel product rule. These rules were applied with state-of-the-art results to two problems used to assess human mastery of lexical semantics -- synonym questions and analogy questions. All three merging rules result in ensembles that are more accurate than any of their component modules. The differences among the three rules are not statistically significant, but the results suggest that the popular mixture rule is not the best rule for either of the two problems.
publisher: John Benjamins
contributor: Nicolov, Nicolas
contributor: Botcheva, Kalina
contributor: Angelova, Galia
contributor: Mitkov, Ruslan
date: 2004
type: Book Chapter
type: PeerReviewed
format: application/pdf
identifier: http://cogprints.org/4027/1/turney.pdf
identifier: Turney, Peter D. and Littman, Michael L. and Bigham, Jeffrey and Shnayder, Victor (2004) Combining Independent Modules in Lexical Multiple-Choice Problems. [Book Chapter]
relation: http://cogprints.org/4027/
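note: The sketch below is a minimal illustration of the three merging rules named in the abstract (mixture, logarithmic, and product), written as they are commonly formulated; it is not taken from the paper, so the weight parameters, the uniform smoothing used in the product rule, the normalization steps, and all function names are assumptions made for illustration only.

```python
import numpy as np

def mixture_rule(dists, weights):
    """Weighted arithmetic mean of the module distributions."""
    dists = np.asarray(dists, dtype=float)      # shape: (n_modules, n_choices)
    weights = np.asarray(weights, dtype=float)  # one weight per module
    combined = weights @ dists                  # sum_i w_i * p_i(choice)
    return combined / combined.sum()

def logarithmic_rule(dists, weights, eps=1e-12):
    """Weighted geometric mean: prod_i p_i(choice)^w_i, renormalized."""
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    log_combined = weights @ np.log(dists + eps)           # sum_i w_i * log p_i
    combined = np.exp(log_combined - log_combined.max())   # stabilize before normalizing
    return combined / combined.sum()

def product_rule(dists, weights):
    """Product of module distributions, each smoothed toward uniform by its weight
    (an assumed form of smoothing, not necessarily the paper's exact definition)."""
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n_choices = dists.shape[1]
    uniform = 1.0 / n_choices
    smoothed = weights[:, None] * dists + (1.0 - weights)[:, None] * uniform
    combined = smoothed.prod(axis=0)
    return combined / combined.sum()

# Example: two hypothetical modules scoring a four-way multiple-choice question.
p1 = [0.5, 0.2, 0.2, 0.1]
p2 = [0.4, 0.4, 0.1, 0.1]
for rule in (mixture_rule, logarithmic_rule, product_rule):
    print(rule.__name__, rule([p1, p2], [0.5, 0.5]))
```

As the abstract notes, each rule turns the per-module probability distributions over the answer choices into a single ensemble distribution, and the highest-probability choice is taken as the answer; the rules differ only in how the module outputs are averaged or multiplied together.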