Keyword search (4,163 papers available)

"Transformer" Keyword-tagged Publications:

1. Attention-Fusion-Based Two-Stream Vision Transformer for Heart Sound Classification
   Authors: Ranipa K; Zhu WP; Swamy MNS | PMID: 41155032 | Dept: ENCS

2. Lung Nodule Malignancy Classification Integrating Deep and Radiomic Features in a Three-Way Attention-Based Fusion Module
   Authors: Khademi S; Heidarian S; Afshar P; Mohammadi A; Sidiqi A; Nguyen ET; Ganeshan B; Oikonomou A | PMID: 41150036 | Dept: ENCS

3. A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis
   Authors: Zou H; Wang Y; Huang A | PMID: 40876298 | Dept: ENCS

4. Deformable detection transformers for domain adaptable ultrasound localization microscopy with robustness to point spread function variations
   Authors: Gharamaleki SK; Helfield B; Rivaz H | PMID: 40640235 | Dept: PHYSICS

5. SAVE: Self-Attention on Visual Embedding for Zero-Shot Generic Object Counting
   Authors: Zgaren A; Bouachir W; Bouguila N | PMID: 39997554 | Dept: ENCS

6. Semantically-Enhanced Feature Extraction with CLIP and Transformer Networks for Driver Fatigue Detection
   Authors: Gao Z; Chen X; Xu J; Yu R; Zhang H; Yang J | PMID: 39771685 | Dept: ENCS

7. CosSIF: Cosine similarity-based image filtering to overcome low inter-class variation in synthetic medical image datasets
   Authors: Islam M; Zunair H; Mohammed N | PMID: 38492455 | Dept: ENCS

8. Enhanced identification of membrane transport proteins: a hybrid approach combining ProtBERT-BFD and convolutional neural networks
   Authors: Ghazikhani H; Butler G | PMID: 37497772 | Dept: ENCS

 

Title: Enhanced identification of membrane transport proteins: a hybrid approach combining ProtBERT-BFD and convolutional neural networks
Authors: Ghazikhani H; Butler G
Link: https://pubmed.ncbi.nlm.nih.gov/37497772/
DOI: 10.1515/jib-2022-0055
Publication: Journal of Integrative Bioinformatics
Keywords: ProtBERT-BFD; neural network; protein language model; transformers; transmembrane transport proteins
PMID: 37497772 | Category: | Date Added: 2023-07-27
Dept Affiliation: ENCS

Description:

Transmembrane transport proteins (transporters) play a crucial role in the fundamental cellular processes of all organisms by facilitating the transport of hydrophilic substrates across hydrophobic membranes. Despite the availability of numerous membrane protein sequences, their structures and functions remain largely elusive. Recently, natural language processing (NLP) techniques have shown promise in the analysis of protein sequences. Bidirectional Encoder Representations from Transformers (BERT) is an NLP technique adapted for proteins to learn contextual embeddings of individual amino acids within a protein sequence. Our previous strategy, TooT-BERT-T, differentiated transporters from non-transporters by employing a logistic regression classifier with fine-tuned representations from ProtBERT-BFD. In this study, we expand upon this approach by utilizing representations from ProtBERT, ProtBERT-BFD, and MembraneBERT in combination with classical classifiers. Additionally, we introduce TooT-BERT-CNN-T, a novel method that fine-tunes ProtBERT-BFD and discriminates transporters using a Convolutional Neural Network (CNN). Our experimental results reveal that CNN surpasses traditional classifiers in discriminating transporters from non-transporters, achieving an MCC of 0.89 and an accuracy of 95.1 % on the independent test set. This represents an improvement of 0.03 and 1.11 percentage points compared to TooT-BERT-T, respectively.
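The architecture described in the abstract, per-residue embeddings from a protein language model fed into a convolutional classifier, can be sketched as follows. This is a minimal illustration assuming PyTorch; the class name `TransporterCNN`, the layer sizes, and the random stand-in tensor (in place of real ProtBERT-BFD embeddings) are all illustrative assumptions, not the authors' published TooT-BERT-CNN-T configuration.

```python
import torch
import torch.nn as nn

class TransporterCNN(nn.Module):
    """Hypothetical CNN head over per-residue embeddings.

    A 1D convolution scans along the sequence dimension, max-pooling
    collapses variable-length sequences to a fixed vector, and a linear
    layer emits transporter vs. non-transporter logits. Filter counts
    and kernel size are illustrative, not the paper's values.
    """
    def __init__(self, embed_dim=1024, n_filters=64, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        self.pool = nn.AdaptiveMaxPool1d(1)   # pool over sequence length
        self.fc = nn.Linear(n_filters, 2)     # 2 classes

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) -> Conv1d wants channels first
        h = torch.relu(self.conv(x.transpose(1, 2)))
        return self.fc(self.pool(h).squeeze(-1))  # (batch, 2) logits

# Stand-in for ProtBERT-BFD output: 4 sequences of 200 residues with
# 1024-dim embeddings (real inputs would come from the fine-tuned model).
embeddings = torch.randn(4, 200, 1024)
logits = TransporterCNN()(embeddings)
```

In practice the embedding extractor and the CNN head would be trained jointly during fine-tuning, which is what distinguishes this approach from freezing the language model and fitting a classical classifier on top.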





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026