Improving Sentiment Classification for a Hotel Recommender System through Deep Learning and Data Balancing
Abstract
A recommender system is a type of information filtering system that predicts and recommends items or products to users based on their preferences and past behavior. It is commonly used in e-commerce and social media to suggest items that a user may be interested in purchasing, reading, watching, or listening to. Sentiment analysis is an area of natural language processing that has emerged as a popular way for organizations to detect and categorize opinions about a product, idea, or service. In recent years, many attempts have been made to apply sentiment analysis in the design of recommender systems that suggest various items, such as hotels. Providing quality hotel suggestions based on users' requirements and preferences is a challenging and, naturally, appealing task for tourism applications. In this paper, the quality of decision making in a hotel recommender system is improved through sentiment analysis, deep learning, and data-balancing techniques. The proposed system combines multiple approaches to deliver high-quality hotel recommendations. To achieve this goal, the existing dataset is first balanced using a translation and text-paraphrasing strategy driven by the transformer-based T5 model. Afterwards, an integrated method that pairs the transformer-based XLM-RoBERTa model with an attention mechanism is used for sentiment analysis. A comparison of the proposed model with the four best non-transformer-based models (RNN, GRU, LSTM, and Bi-LSTM) and the most recent transformer-based model, En-RFBERT, on the TripAdvisor dataset demonstrates the superiority of the proposed method. The proposed system outperforms En-RFBERT by 3%, 7%, and 5% in macro precision, recall, and F1-score, respectively, and also offers better response time.
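To make the two-stage pipeline described above more concrete, the following is a minimal Python sketch using the Hugging Face Transformers library. It is not the authors' implementation: the checkpoint names, the paraphrasing prompt, the three-class label set, and all generation settings are assumptions for illustration only, and the additional attention layer described in the paper is omitted. In particular, a T5 checkpoint fine-tuned for paraphrasing would be needed in practice; "t5-base" is used here purely as a placeholder.

```python
# Illustrative sketch (not the paper's released code): (1) augment minority-class
# reviews via T5 paraphrasing to balance the data, (2) classify sentiment with
# XLM-RoBERTa. Checkpoints, prompt, and label count are placeholder assumptions.
import torch
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          AutoModelForSequenceClassification)

# --- Step 1: data balancing via T5 paraphrasing ---
# "t5-base" is a placeholder; a paraphrase-tuned T5 checkpoint would be required.
para_tok = AutoTokenizer.from_pretrained("t5-base")
para_model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

def paraphrase(review: str, n: int = 2) -> list[str]:
    """Generate n paraphrases of a minority-class review (text-to-text format)."""
    inputs = para_tok("paraphrase: " + review, return_tensors="pt", truncation=True)
    outputs = para_model.generate(**inputs, do_sample=True, top_p=0.95,
                                  max_new_tokens=64, num_return_sequences=n)
    return [para_tok.decode(o, skip_special_tokens=True) for o in outputs]

# --- Step 2: sentiment classification with XLM-RoBERTa ---
# The classification head is randomly initialized here and would need fine-tuning
# on the balanced hotel-review data; three sentiment classes are assumed.
clf_tok = AutoTokenizer.from_pretrained("xlm-roberta-base")
clf = AutoModelForSequenceClassification.from_pretrained("xlm-roberta-base",
                                                         num_labels=3)

def predict_sentiment(review: str) -> int:
    """Return the predicted sentiment class index for one hotel review."""
    batch = clf_tok(review, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = clf(**batch).logits
    return int(logits.argmax(dim=-1))
```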
Keywords
Recommender system, sentiment analysis, data balancing, natural language processing, deep learning, transformer, attention