Keyword search (4,164 papers available)

"Transformer" Keyword-tagged Publications:

1. Attention-Fusion-Based Two-Stream Vision Transformer for Heart Sound Classification. Ranipa K; Zhu WP; Swamy MNS. PMID: 41155032. Dept: ENCS.
2. Lung Nodule Malignancy Classification Integrating Deep and Radiomic Features in a Three-Way Attention-Based Fusion Module. Khademi S; Heidarian S; Afshar P; Mohammadi A; Sidiqi A; Nguyen ET; Ganeshan B; Oikonomou A. PMID: 41150036. Dept: ENCS.
3. A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis. Zou H; Wang Y; Huang A. PMID: 40876298. Dept: ENCS.
4. Deformable detection transformers for domain adaptable ultrasound localization microscopy with robustness to point spread function variations. Gharamaleki SK; Helfield B; Rivaz H. PMID: 40640235. Dept: PHYSICS.
5. SAVE: Self-Attention on Visual Embedding for Zero-Shot Generic Object Counting. Zgaren A; Bouachir W; Bouguila N. PMID: 39997554. Dept: ENCS.
6. Semantically-Enhanced Feature Extraction with CLIP and Transformer Networks for Driver Fatigue Detection. Gao Z; Chen X; Xu J; Yu R; Zhang H; Yang J. PMID: 39771685. Dept: ENCS.
7. CosSIF: Cosine similarity-based image filtering to overcome low inter-class variation in synthetic medical image datasets. Islam M; Zunair H; Mohammed N. PMID: 38492455. Dept: ENCS.
8. Enhanced identification of membrane transport proteins: a hybrid approach combining ProtBERT-BFD and convolutional neural networks. Ghazikhani H; Butler G. PMID: 37497772. Dept: ENCS.

 

Title: A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis
Authors: Zou H; Wang Y; Huang A
Link: https://pubmed.ncbi.nlm.nih.gov/40876298/
DOI: 10.1016/j.neunet.2025.108012
Publication: Neural Networks: the official journal of the International Neural Network Society
Keywords: Fine-grained sentiment analysis; Large language model; Natural language processing; Span-aware attention; Syntax-aware transformer
PMID: 40876298 Category: Date Added: 2025-08-29
Dept Affiliation: ENCS
1 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China; Department of Computer Science and Software Engineering, Concordia University, 2155 Guy Street, Montreal, H3H 2L9, Quebec, Canada. Electronic address: haochen.zou@mail.concordia.ca.
2 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: yongliwang@njust.edu.cn.
3 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: anqihuang@njust.edu.cn.

Description:

Fine-grained aspect-based sentiment analysis requires language models to identify aspect entities and the corresponding sentiment information in the input text content. Transformer-based pre-trained large language models have demonstrated remarkable performance on various challenging natural language processing tasks. However, large language models face limitations in explicitly modelling syntactic relationships and effectively capturing local nuances between terms in the text content, which constrains their capability in fine-grained aspect-based sentiment analysis. We propose a novel span and syntax enhanced joint learning framework based on the latest large language model. The framework incorporates three key components, including the span-aware attention mechanism, the contextual Transformer, and the syntax-aware Transformer, which operate in parallel to generate span-aware features, contextual features, and syntax-aware features, respectively. The three dimensions of analyzed features are dynamically fused in the feature aggregation module, resulting in a combined feature for aspect entity recognition and sentiment classification. To the best of our knowledge, this study represents the pioneering effort to comprehensively leverage span-aware, contextual, and syntax-aware characteristics to augment large language models in addressing the fine-grained aspect-based sentiment analysis task. Experimental results on publicly available benchmark datasets validate the effectiveness of the architecture compared to state-of-the-art baseline competitors.
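The abstract describes three feature streams (span-aware, contextual, syntax-aware) that are dynamically fused in a feature aggregation module. The record does not include implementation details, so the sketch below is only an illustration of one common way to fuse parallel feature vectors, a learned softmax gate over the streams; all names and dimensions here are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_fusion(span_feat, ctx_feat, syn_feat, gate_w):
    """Fuse three same-dimensional feature vectors with a learned gate.

    gate_w is a hypothetical (3, d) parameter matrix; each row scores one
    stream. A softmax over the three scores gives per-stream mixing
    weights. The paper's actual aggregation module may differ.
    """
    feats = np.stack([span_feat, ctx_feat, syn_feat])   # (3, d)
    scores = (gate_w * feats).sum(axis=1)               # one score per stream
    weights = softmax(scores)                           # (3,), sums to 1
    return weights @ feats                              # weighted sum -> (d,)

rng = np.random.default_rng(0)
d = 8
fused = dynamic_fusion(rng.normal(size=d), rng.normal(size=d),
                       rng.normal(size=d), rng.normal(size=(3, d)))
```

Because the weights form a convex combination, identical inputs pass through unchanged, which makes the gate easy to sanity-check.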





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026
Concordia University