Improving Coherence of Topic Based Aspect Clusters using Domain Knowledge

Kavita Sanjay Asnani, Jyoti D Pawar

Abstract


The Web is loaded with opinion data belonging to multiple domains. Probabilistic topic models such as Probabilistic Latent Semantic Analysis (pLSA) and Latent Dirichlet Allocation (LDA) have been widely used to obtain thematic representations, called topic-based aspects, from such opinion data. These topic-based aspects are then clustered by algorithms such as Automated Knowledge LDA (AKL) to obtain semantically related groups. However, these algorithms have two main shortcomings: the clusters of topics obtained sometimes lack the coherence needed to accurately represent the relevant aspects in a cluster, and popular or common words, referred to as generic topics, occur across clusters in different domains. In this paper we use contextual domain knowledge from a publicly available lexical resource to increase the coherence of topic-based aspect clusters and to discriminate domain-specific, semantically relevant topical aspects from generic aspects shared across domains. BabelNet was used as the lexical resource. The dataset comprised product reviews from 36 product domains, with 1000 reviews and 14 clusters per domain. Topical aspects that occur frequently across topic clusters indicate the presence of generic aspects. On average, 28.84% of incoherent aspects were eliminated. The trend generated by the UMass metric shows improved topic coherence, and better cluster quality is also obtained: the average entropy was 0.876 without elimination and 0.906 with elimination.
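The abstract evaluates filtered topic clusters with the UMass coherence metric. A minimal sketch of how such a score can be computed is given below, assuming gensim's CoherenceModel with coherence='u_mass'; the toy review tokens and topic word lists are illustrative only and are not taken from the paper's dataset or its BabelNet-based filtering step.

    from gensim.corpora import Dictionary
    from gensim.models import CoherenceModel

    # Illustrative tokenized review snippets (assumed data, not the paper's corpus)
    texts = [
        ["battery", "life", "charge", "lasts"],
        ["screen", "display", "bright", "resolution"],
        ["battery", "charge", "drains", "screen"],
    ]

    # Build the dictionary and bag-of-words corpus used for co-occurrence counts
    dictionary = Dictionary(texts)
    corpus = [dictionary.doc2bow(t) for t in texts]

    # Hypothetical topic-based aspect clusters before and after filtering
    topics_before = [["battery", "charge", "bright"], ["screen", "display", "lasts"]]
    topics_after = [["battery", "charge", "life"], ["screen", "display", "bright"]]

    # UMass coherence: higher (less negative) values indicate more coherent topics
    for name, topics in [("before", topics_before), ("after", topics_after)]:
        cm = CoherenceModel(topics=topics, corpus=corpus,
                            dictionary=dictionary, coherence="u_mass")
        print(name, cm.get_coherence())

Comparing the scores before and after aspect elimination mirrors the kind of coherence trend the abstract reports, although the paper's actual pipeline and corpora are described in the full text.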

Keywords


Topic-based Aspect Extraction, Aspect Filtering, Aspect Coherence, Lexical Resource BabelNet, Context Domain Knowledge
