"Kiar G" Authored Publications:
| # | Title | Authors | PubMed ID | Dept |
|---|---|---|---|---|
| 1 | Numerical stability of DeepGOPlus inference | Gonzalez Pepe I, Chatelain Y, Kiar G, Glatard T | 38285635 | ENCS |
| 2 | Data and Tools Integration in the Canadian Open Neuroscience Platform | Poline JB, Das S, Glatard T, Madjar C, Dickie EW, Lecours X, Beaudry T, Beck N, Behan B, Brown ST, Bujold D, Beauvais M, Caron B, Czech C, Dharsee M, Dugré M, Evans K, Gee T, Ippoliti G, Kiar G, Knoppers BM, Kuehn T, Le D, Lo D, Mazaheri M, MacFarlane D, Muja N, O'Brien EA, O'Callaghan L, Paiva S, Park P, Quesnel D, Rabelais H, Rioux P, Legault M, Tremblay-Mercier J, Rotenberg D, Stone J, Strauss T, Zaytseva K, Zhou J, Duchesne S, Khan AR, Hill S, Evans AC | 37024500 | ENCS |
| 3 | Numerical uncertainty in analytical pipelines lead to impactful variability in brain networks | Kiar G, Chatelain Y, de Oliveira Castro P, Petit E, Rokem A, Varoquaux G, Misic B, Evans AC, Glatard T | 34724000 | ENCS |
| 4 | File-based localization of numerical perturbations in data analysis pipelines | Salari A, Kiar G, Lewis L, Evans AC, Glatard T | 33269388 | ENCS |
| 5 | Comparing perturbation models for evaluating stability of neuroimaging pipelines | Kiar G, de Oliveira Castro P, Rioux P, Petit E, Brown ST, Evans AC, Glatard T | 32831546 | IMAGING |
| 6 | Boutiques: a flexible framework to integrate command-line applications in computing platforms | Glatard T, Kiar G, Aumentado-Armstrong T, Beck N, Bellec P, Bernard R, Bonnet A, Brown ST, Camarasu-Pop S, Cervenansky F, Das S, Ferreira da Silva R, Flandin G, Girard P, Gorgolewski KJ, Guttmann CRG, Hayot-Sasson V, Quirion PO, Rioux P, Rousseau MÉ, Evans AC | 29718199 | ENCS |
| 7 | A Serverless Tool for Platform Agnostic Computational Experiment Management | Kiar G, Brown ST, Glatard T, Evans AC | 30890927 | ENCS |
| Field | Value |
|---|---|
| Title | Numerical stability of DeepGOPlus inference |
| Authors | Gonzalez Pepe I, Chatelain Y, Kiar G, Glatard T |
| Link | https://pubmed.ncbi.nlm.nih.gov/38285635/ |
| DOI | 10.1371/journal.pone.0296725 |
| Publication | PLoS ONE |
| Keywords | |
| PMID | 38285635 |
| Date Added | 2024-01-29 |
| Dept Affiliation | ENCS |
| Affiliations | 1 Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada. 2 Computational Neuroimaging Laboratory, Child Mind Institute, New York, NY, United States of America. |
Description:

Convolutional neural networks (CNNs) are currently among the most widely used deep neural network (DNN) architectures and achieve state-of-the-art performance on many problems. Originally applied to computer vision tasks, CNNs work well with any data that has a spatial relationship, not only images, and have been applied across many fields. However, recent work has highlighted numerical stability challenges in DNNs, related to their known sensitivity to noise injection, which can jeopardise their performance and reliability. This paper investigates DeepGOPlus, a CNN that predicts protein function. DeepGOPlus has achieved state-of-the-art performance and can annotate the abundance of protein sequences emerging in proteomics. We determine the numerical stability of the model's inference stage by quantifying the numerical uncertainty that results from perturbations of the underlying floating-point data. In addition, we explore the use of reduced-precision floating-point formats for DeepGOPlus inference, to reduce memory consumption and latency. This is achieved by instrumenting DeepGOPlus' execution with Monte Carlo Arithmetic, a technique that experimentally quantifies floating-point operation errors, and with VPREC, a tool that emulates results with customizable floating-point precision formats. We focus on the inference stage because it is the primary deliverable of the DeepGOPlus model and is widely applicable across different environments. Overall, our results show that although the DeepGOPlus CNN is numerically very stable, it can only be selectively implemented with lower-precision floating-point formats. We conclude that predictions obtained from the pre-trained DeepGOPlus model are numerically very reliable and use existing floating-point formats efficiently.
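The Monte Carlo Arithmetic approach described in the abstract can be illustrated with a toy sketch: each floating-point operation is perturbed with random relative noise at a chosen "virtual precision" t, the computation is repeated many times, and the spread of the results yields an estimate of the number of significant digits. This is a minimal, self-contained simplification for intuition only; real MCA (e.g. as implemented in Verificarlo, which the authors' toolchain builds on) instruments every operation at compile time, and all names below are illustrative.

```python
import math
import random
import statistics

def perturb(x, t):
    # Inject random relative noise of magnitude ~2^-t, mimicking Monte Carlo
    # Arithmetic at virtual precision t (illustrative simplification).
    return x * (1.0 + random.uniform(-1.0, 1.0) * 2.0 ** (-t))

def noisy_dot(xs, ys, t):
    # Dot product with every multiply and add perturbed, standing in for
    # one instrumented layer of an inference pipeline.
    acc = 0.0
    for x, y in zip(xs, ys):
        acc = perturb(acc + perturb(x * y, t), t)
    return acc

def significant_digits(samples):
    # Estimate base-10 significant digits from the relative spread of
    # repeated stochastic executions.
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    if sigma == 0.0:
        return float("inf")
    return -math.log10(sigma / abs(mu))

random.seed(0)
xs = [random.uniform(0.0, 1.0) for _ in range(256)]
ys = [random.uniform(0.0, 1.0) for _ in range(256)]

for t in (10, 24, 53):
    runs = [noisy_dot(xs, ys, t) for _ in range(30)]
    print(f"virtual precision t={t:2d}: ~{significant_digits(runs):.1f} significant digits")
```

Sweeping the virtual precision t in this way mirrors the paper's question of whether reduced-precision formats preserve enough significant digits for reliable inference: if the digit estimate at a low t stays above the accuracy the application needs, that precision is a candidate for deployment.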



