From Question Context to Answer Credibility: Modeling Semantic Structures for Question Answering Using Statistical Methods
Within a Question Answering (QA) framework, Question Context plays a vital role. We define Question Context to be background knowledge that can be used to represent the user's information need more completely than the terms in the query alone. This paper proposes a novel approach that uses statistical language modeling techniques to develop a semantic Question Context, which we then incorporate into the Information Retrieval (IR) stage of QA. Our approach uses an Aspect-Based Relevance Language Model as the basis of the Question Context Model. This model posits that the sparse vocabulary of a query can be supplemented with semantic information from concepts (or aspects) related to query terms that already exist within the corpus.
We incorporate the Aspect-Based Relevance Language Model into the Question Context Model by first obtaining all of the latent concepts that exist in the corpus for a particular question topic. Then, we derive a likelihood of relevance that relates each Context Term (CT) associated with those aspects to the user's query. Context Terms from the aspects with the highest likelihood of relevance are then incorporated into the query language model according to their relevance scores. We evaluate our approach using both query expansion and document model smoothing techniques. Our results are promising and show significant improvements in recall using the query expansion method.
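The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes aspects are already available as token lists (in the paper they are latent concepts mined from the corpus), and the function names, the top-k cutoff, and the interpolation weight `lam` are all illustrative choices.

```python
from collections import Counter
import math

def unigram_lm(tokens):
    # maximum-likelihood unigram language model over one aspect
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def query_log_likelihood(query, model, eps=1e-9):
    # log P(query | aspect model), with a small floor for unseen terms
    return sum(math.log(model.get(w, eps)) for w in query)

def expand_query(query, aspects, k=5, lam=0.7):
    """Aspect-based expansion sketch: score each aspect by its query
    likelihood, pool context-term probabilities weighted by aspect
    relevance, then interpolate the top-k context terms into the
    query language model. `lam` and `k` are illustrative parameters."""
    models = [unigram_lm(a) for a in aspects]
    # relevance of each aspect to the query (normalized likelihoods)
    scores = [math.exp(query_log_likelihood(query, m)) for m in models]
    z = sum(scores) or 1.0
    weights = [s / z for s in scores]
    # pool context-term probabilities across aspects
    pooled = Counter()
    for w_a, m in zip(weights, models):
        for term, p in m.items():
            pooled[term] += w_a * p
    context = [t for t, _ in pooled.most_common() if t not in query][:k]
    # interpolated query model: original terms share weight lam,
    # selected context terms share the remaining 1 - lam
    q_model = {w: lam / len(query) for w in query}
    for t in context:
        q_model[t] = (1 - lam) / k
    return q_model
```

For example, a query like `["bank", "loan"]` would weight a finance-themed aspect far more heavily than a river-themed one, so the pooled context terms (and hence the expansion) come mostly from the relevant aspect.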
Banerjee, P., & Han, H. (2009, March). Modeling Semantic Question Context for Question Answering. In Florida Artificial Intelligence Research Society Conference. http://aaai.org/ocs/index.php/FLAIRS/2009/paper/view/31