Structured data follows a standardized format that makes it easy to access, organize, and categorize. However, approximately 95% of data, such as text files and online reviews, is unstructured and lacks a predefined format. Analyzing unstructured data, especially in large volumes, demands considerable effort, cost, and time, and classical statistical methods are often insufficient. Transformer models, the current state of the art in natural language processing (NLP), are the strongest candidates for overcoming these limitations. In this paper, we propose a solution based on the Bidirectional Encoder Representations from Transformers (BERT) model for sentiment analysis of consumer reviews. The dataset comprises 10,975 consumer reviews of technological products from an e-commerce platform and was transformed into a structured dataset through data preprocessing. We then compared the performance of the BERT transformer model with deep learning models, specifically convolutional neural networks (CNN), long short-term memory (LSTM), and bidirectional long short-term memory (B-LSTM). Experimental results confirmed that the BERT transformer model achieved the highest kappa, 96.6%, and an overall accuracy of 97.78% for multi-class classification of consumer reviews. The proposed transformer-based model outperforms the state-of-the-art models, providing a reliable and efficient solution.
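To make the described pipeline concrete, the snippet below is a minimal sketch of fine-tuning a BERT classifier for multi-class review sentiment and scoring it with the two metrics reported above, overall accuracy and Cohen's kappa. It assumes the Hugging Face Transformers and scikit-learn libraries; the checkpoint name, three-class label scheme, hyperparameters, and toy data are illustrative assumptions, not the configuration used in the paper.

```python
# Hedged sketch: fine-tune a BERT sequence classifier on review sentiment
# and report accuracy and Cohen's kappa. Checkpoint, labels, and data are
# placeholders, not the paper's actual setup.
import torch
from torch.utils.data import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from sklearn.metrics import accuracy_score, cohen_kappa_score

NUM_LABELS = 3                      # e.g. negative / neutral / positive (assumed)
CHECKPOINT = "bert-base-uncased"    # illustrative checkpoint name

class ReviewDataset(Dataset):
    """Wraps tokenized review texts and integer sentiment labels."""
    def __init__(self, texts, labels, tokenizer):
        self.enc = tokenizer(texts, truncation=True, padding=True, max_length=128)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

def compute_metrics(eval_pred):
    """Return the two figures used for model comparison: accuracy and kappa."""
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    return {"accuracy": accuracy_score(labels, preds),
            "kappa": cohen_kappa_score(labels, preds)}

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT,
                                                           num_labels=NUM_LABELS)

# Toy placeholder reviews; in practice these come from the preprocessed dataset.
train_ds = ReviewDataset(["great phone", "battery died fast", "it is okay"],
                         [2, 0, 1], tokenizer)
eval_ds = ReviewDataset(["works as expected"], [2], tokenizer)

args = TrainingArguments(output_dir="bert-reviews", num_train_epochs=3,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args, train_dataset=train_ds,
                  eval_dataset=eval_ds, compute_metrics=compute_metrics)
trainer.train()
print(trainer.evaluate())
```

The same `compute_metrics` function can score the CNN, LSTM, and B-LSTM baselines, so all models are compared on identical accuracy and kappa figures.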
This paper does not include any studies with human or animal subjects.
| Field | Value |
|---|---|
| Primary Language | English |
| Subjects | Computer Software, Software Engineering (Other) |
| Journal Section | Research Article |
| Authors | |
| Early Pub Date | September 26, 2025 |
| Publication Date | September 30, 2025 |
| Submission Date | April 13, 2025 |
| Acceptance Date | August 17, 2025 |
| Published in Issue | Year 2025, Volume 8, Issue 3 |
The papers in this journal are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.