Keyword search (4,163 papers available)

Publications tagged with the keyword "Text mining":

1. Utilizing large language models for detecting hospital-acquired conditions: an empirical study on pulmonary embolism. Cheligeer C; Southern DA; Yan J; Wu G; Pan J; Lee S; Martin EA; Jafarpour H; Eastwood CA; Zeng Y; Quan H. PMID: 40105654. Dept: ENCS
2. Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications. Luo Z; Amayri M; Fan W; Bouguila N. PMID: 36685642. Dept: ENCS
3. Understanding the temporal evolution of COVID-19 research through machine learning and natural language processing. Ebadi A; Xi P; Tremblay S; Spencer B; Pall R; Wong A. PMID: 33230352. Dept: ENCS
4. Biodiversity Observations Miner: A web application to unlock primary biodiversity data from published literature. Muñoz G; Kissling WD; van Loon EE. PMID: 30692868. Dept: BIOLOGY


Title: Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications
Authors: Luo Z; Amayri M; Fan W; Bouguila N
Link: https://pubmed.ncbi.nlm.nih.gov/36685642/
DOI: 10.1007/s10489-022-04378-3
Publication: Applied Intelligence (Dordrecht, Netherlands)
Keywords: Beta-Liouville prior; Comparative text mining; Cross-collection model; Differential privacy; Image classification; Topic correlation
PMID: 36685642  Category:  Date Added: 2023-01-23
Dept Affiliation: ENCS
Author affiliations:
1 The Concordia Institute for Information Systems Engineering (CIISE), Concordia University, Montréal, H3H 1M8 Québec Canada.
2 G-SCOP Lab, Grenoble Institute of Technology, Grenoble, 38031 France.
3 Department of Computer Science, Beijing Normal University-Hong Kong Baptist University United International College (UIC), Zhuhai, Guangdong 519088 China.

Description:

Cross-collection topic models extend single-collection topic models, such as Latent Dirichlet Allocation (LDA), to multiple document collections. The purpose of cross-collection topic modeling is to learn document-topic representations that reveal both the commonalities of each topic across collections and the differences among collections. However, the restrictiveness of the Dirichlet prior and significant privacy risks have hampered these models' performance and utility: in particular, training a cross-collection topic model may leak sensitive information from the training dataset. To address these two issues, we propose a novel model, cross-collection latent Beta-Liouville allocation (ccLBLA), which employs a more powerful prior, the Beta-Liouville distribution, whose more general covariance structure enhances topic correlation analysis. To provide privacy protection for the ccLBLA model, we leverage the inherent differential privacy guarantee of the Collapsed Gibbs Sampling (CGS) inference scheme and propose a hybrid privacy protection algorithm for ccLBLA (HPP-ccLBLA) that prevents inferring data from intermediate statistics during CGS training without sacrificing utility. More crucially, this is the first attempt to apply a cross-collection topic model to image classification and to investigate the capabilities of cross-collection topic models beyond text analysis. Experimental results on comparative text mining and image classification show the merits of the proposed approach.
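For readers unfamiliar with the inference scheme the abstract refers to, the sketch below shows a minimal collapsed Gibbs sampler for standard LDA, the single-collection baseline that ccLBLA generalizes. This is an illustrative toy implementation, not the paper's ccLBLA or HPP-ccLBLA algorithm; the corpus, hyperparameters, and function name are assumptions made up for the example.

```python
# Minimal collapsed Gibbs sampler for plain LDA (illustrative only; the
# paper's model replaces the Dirichlet prior with Beta-Liouville and adds
# cross-collection structure plus differential-privacy protection).
import numpy as np

def lda_cgs(docs, n_vocab, n_topics=2, alpha=0.1, beta=0.01,
            n_iter=200, seed=0):
    """docs: list of documents, each a list of word ids in [0, n_vocab)."""
    rng = np.random.default_rng(seed)
    n_docs = len(docs)
    # Count tables maintained by the collapsed sampler (theta and phi are
    # integrated out, so only these counts are needed).
    ndk = np.zeros((n_docs, n_topics))    # document-topic counts
    nkw = np.zeros((n_topics, n_vocab))   # topic-word counts
    nk = np.zeros(n_topics)               # tokens per topic
    z = []                                # topic assignment per token
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current token's counts ("collapsed" update).
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Conditional p(z = k | all other assignments, corpus).
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) \
                    / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    # Posterior-mean estimates of the integrated-out parameters.
    theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True)
                             + n_topics * alpha)
    phi = (nkw + beta) / (nkw.sum(axis=1, keepdims=True)
                          + n_vocab * beta)
    return theta, phi
```

Because the sampler only ever exposes aggregate count tables rather than raw documents, CGS has an inherent (if limited) privacy property, which is the starting point the HPP-ccLBLA algorithm builds on by further protecting the intermediate statistics released during training.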





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026