A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis

Authors: Zou H, Wang Y, Huang A


Affiliations

1 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China; Department of Computer Science and Software Engineering, Concordia University, 2155 Guy Street, Montreal, H3H 2L9, Quebec, Canada. Electronic address: haochen.zou@mail.concordia.ca.
2 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: yongliwang@njust.edu.cn.
3 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: anqihuang@njust.edu.cn.

Description

Fine-grained aspect-based sentiment analysis requires language models to identify aspect entities and the corresponding sentiment information in the input text. Transformer-based pre-trained large language models have demonstrated remarkable performance on various challenging natural language processing tasks. However, large language models face limitations in explicitly modelling syntactic relationships and effectively capturing local nuances between terms in the text, which constrains their capability in fine-grained aspect-based sentiment analysis. We propose a novel span and syntax enhanced joint learning framework based on the latest large language model. The framework incorporates three key components, the span-aware attention mechanism, the contextual Transformer, and the syntax-aware Transformer, which operate in parallel to generate span-aware features, contextual features, and syntax-aware features, respectively. The three streams of analyzed features are dynamically fused in the feature aggregation module, resulting in a combined feature for aspect entity recognition and sentiment classification. To the best of our knowledge, this study represents the pioneering effort to comprehensively leverage span-aware, contextual, and syntax-aware characteristics to augment large language models in addressing the fine-grained aspect-based sentiment analysis task. Experimental results on publicly available benchmark datasets validate the effectiveness of the architecture compared to state-of-the-art baseline competitors.
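The three-branch design described in the abstract can be sketched in PyTorch. This is a minimal illustration, not the paper's actual architecture: all layer choices, dimensions, the gating-based "dynamic fusion", and the head shapes are assumptions, and the syntax branch here omits the dependency-graph inputs a real syntax-aware Transformer would consume.

```python
import torch
import torch.nn as nn

class SpanSyntaxFusionSketch(nn.Module):
    """Hypothetical sketch of the framework in the abstract: span-aware,
    contextual, and syntax-aware features computed in parallel over LLM
    hidden states, dynamically fused, then fed to joint heads for aspect
    entity recognition and sentiment classification."""

    def __init__(self, hidden=768, num_aspect_tags=5, num_sentiments=3):
        super().__init__()
        # Three parallel branches (assumed layer types, not from the paper).
        self.span_branch = nn.MultiheadAttention(hidden, num_heads=8,
                                                 batch_first=True)
        self.context_branch = nn.TransformerEncoderLayer(hidden, nhead=8,
                                                         batch_first=True)
        self.syntax_branch = nn.TransformerEncoderLayer(hidden, nhead=8,
                                                        batch_first=True)
        # Dynamic fusion: per-token softmax weights over the three streams.
        self.gate = nn.Linear(3 * hidden, 3)
        # Joint heads: BIO-style aspect tags and sentiment polarity.
        self.aspect_head = nn.Linear(hidden, num_aspect_tags)
        self.sentiment_head = nn.Linear(hidden, num_sentiments)

    def forward(self, h):
        # h: (batch, seq_len, hidden) hidden states from a pre-trained LLM.
        span, _ = self.span_branch(h, h, h)
        ctx = self.context_branch(h)
        syn = self.syntax_branch(h)  # syntax masks/graphs omitted here
        # Weighted sum of the three feature streams, weights per token.
        w = torch.softmax(self.gate(torch.cat([span, ctx, syn], dim=-1)),
                          dim=-1)
        fused = (w[..., 0:1] * span + w[..., 1:2] * ctx
                 + w[..., 2:3] * syn)
        return self.aspect_head(fused), self.sentiment_head(fused)
```

Keeping the branches parallel rather than stacked means each stream sees the same LLM representation, and the learned gate decides per token how much span, context, or syntax evidence to trust.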


Keywords: Fine-grained sentiment analysis; Large language model; Natural language processing; Span-aware attention; Syntax-aware transformer


Links

PubMed: https://pubmed.ncbi.nlm.nih.gov/40876298/

DOI: 10.1016/j.neunet.2025.108012