Keyword search (4,163 papers available)

"Mixture" Keyword-tagged Publications:

Title Authors PubMed ID
1 Trajectories of Alcohol-Related Problems Among First-Year Nursing Students: Nature, Predictors, and Outcomes Cheyroux P; Morin AJS; O'Connor RM; Colombat P; Vancappel A; Eltanoukhi R; Gillet N; 41797206
PSYCHOLOGY
2 Scientists warning: we must change paradigm for a revolution in toxicology and world food supply Seralini GE; Jungers G; Andersen A; Antoniou M; Aschner M; Bacon MH; Bertrand M; Bohn T; Bonfleur ML; Bücking E; Defarge N; Djemil R; Domingo JL; Douzelet J; Fagan J; Fournier T; Garcia JLY; Gil S; Hervé-Gruyer P; Hilbeck A; Hilty L; Huber D; Joyeux H; Khan I; Kouretas D; Lemarchand F; Loening U; Longo G; Mesnage R; Nikolopoulou DI; Panoff JM; Parente C; Robinson C; Scherber C; Sprangers D; Sultan C; Tsatsakis A; Vandelac L; Wan NF; Wynne B; Zaller JG; Zerrad-Saadi A; Zhang X; 41551494
CHEMBIOCHEM
3 Optimizing Mixtures of Metal-Organic Frameworks for Robust and Bespoke Passive Atmospheric Water Harvesting Harriman C; Ke Q; Vlugt TJH; Howarth AJ; Simon CM; 41427123
CHEMBIOCHEM
4 Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings Guo J; Fan W; Amayri M; Bouguila N; 39662201
ENCS
5 Developmental heterogeneity of school burnout across the transition from upper secondary school to higher education: A 9-year follow-up study Nadon L; Morin AJS; Gilbert W; Olivier E; Salmela-Aro K; 39645324
PSYCHOLOGY
6 Self-consolidating concrete: Dataset on mixture design and key properties Safhi AEM; 38533116
ENCS
7 Unsupervised Mixture Models on the Edge for Smart Energy Consumption Segmentation with Feature Saliency Al-Bazzaz H; Azam M; Amayri M; Bouguila N; 37837127
ENCS
8 Entropy-Based Variational Scheme with Component Splitting for the Efficient Learning of Gamma Mixtures Bourouis S; Pawar Y; Bouguila N; 35009726
ENCS
9 Mixtures of rare earth elements show antagonistic interactions in Chlamydomonas reinhardtii Morel E; Cui L; Zerges W; Wilkinson KJ; 34175518
BIOLOGY
10 BioMiCo: a supervised Bayesian model for inference of microbial community structure Shafiei M; Dunn KA; Boon E; MacDonald SM; Walsh DA; Gu H; Bielawski JP; 25774293
BIOLOGY


Title: Deep clustering analysis via variational autoencoder with Gamma mixture latent embeddings
Authors: Guo J; Fan W; Amayri M; Bouguila N
Link: https://pubmed.ncbi.nlm.nih.gov/39662201/
DOI: 10.1016/j.neunet.2024.106979
Publication: Neural networks : the official journal of the International Neural Network Society
Keywords: Clustering; Data augmentation; Gamma mixture models; VAE; Variational inference
PMID: 39662201  Category:  Date Added: 2024-12-12
Dept Affiliation: ENCS
1 CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: g_jiax@encs.concordia.ca.
2 Guangdong Provincial Key Laboratory IRADS and Department of Computer Science, Beijing Normal University-Hong Kong Baptist University United International College, Zhuhai, Guangdong, China. Electronic address: wentaofan@uic.edu.cn.
3 CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: manar.amayri@concordia.ca.
4 CIISE, Concordia University, Montreal, H3G 1T7, QC, Canada. Electronic address: nizar.bouguila@concordia.ca.

Description:

This article proposes GamMM-VAE, a deep clustering model based on the variational autoencoder (VAE) that learns latent representations of the training data for clustering in an unsupervised manner. First, whereas most existing VAE-based deep clustering methods place a Gaussian mixture model (GMM) prior on the latent space, we employ a more flexible, asymmetric Gamma mixture model to obtain higher-quality latent embeddings. Second, because the Gamma distribution is defined only for strictly positive variables, we propose a transformation from the Gaussian distribution to the Gamma distribution so that the reparameterization trick of the VAE can be exploited; this transformation can itself be viewed as a Gamma-distribution reparameterization trick, allowing gradients to be backpropagated through the sampling process. Finally, we derive the evidence lower bound (ELBO) for the Gamma mixture model in a form suited to the stochastic gradient variational Bayes (SGVB) estimator used to optimize the proposed model. The ELBO is the variational-inference objective whose maximization tightens the approximation to the posterior distribution, while SGVB enables efficient inference and learning in VAEs. We validate the effectiveness of our model through quantitative comparisons with state-of-the-art deep clustering models on six benchmark datasets. Moreover, owing to the generative nature of VAEs, the proposed model can generate highly realistic samples of specific classes without supervised information.
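The Gaussian-to-Gamma transformation mentioned in the abstract is the key to reparameterized sampling: a standard-normal draw is mapped through a smooth function of the Gamma parameters, so gradients can flow back through the sample. The paper's exact mapping is not reproduced here; the sketch below uses the classical Wilson-Hilferty approximation as a stand-in (an assumption, not the authors' formula) to illustrate the idea in plain Python.

```python
import math
import random

def gamma_reparam_sample(alpha: float, beta: float, eps: float) -> float:
    """Approximate a draw from Gamma(shape=alpha, rate=beta) as a smooth
    function of a standard-normal sample `eps` (Wilson-Hilferty).

    Because the output is differentiable in alpha, beta, and eps, this is
    the kind of mapping a "Gamma reparameterization trick" relies on.
    """
    # Wilson-Hilferty: Gamma(alpha, 1) ~ alpha * (1 - 1/(9a) + eps/(3*sqrt(a)))**3
    a = alpha
    x = a * (1.0 - 1.0 / (9.0 * a) + eps / (3.0 * math.sqrt(a))) ** 3
    return max(x, 1e-8) / beta  # clamp to keep the sample strictly positive

# Sanity check: the sample mean should approach alpha / beta = 2.5.
random.seed(0)
draws = [gamma_reparam_sample(5.0, 2.0, random.gauss(0.0, 1.0))
         for _ in range(100_000)]
mean = sum(draws) / len(draws)
```

In a full VAE, `alpha` and `beta` would be outputs of the encoder network, and an autodiff framework would propagate gradients through this mapping when optimizing the ELBO with the SGVB estimator.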





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026