Keyword search (4,163 papers available)

"Drouin S" Authored Publications:

No. | Title | Authors | PubMed ID | Dept Affiliation
1 | Connect Brain, a Mobile App for Studying Depth Perception in Angiography Visualization: Gamification Study | Titov A; Drouin S; Kersten-Oertel M | 41341989 | ENCS
2 | Assessment of cognitive load in the context of neurosurgery | Di Giovanni DA; Kersten-Oertel M; Drouin S; Collins DL | 40650801 | PERFORM
3 | Exploring interaction paradigms for segmenting medical images in virtual reality | Jones Z; Drouin S; Kersten-Oertel M | 40402355 | ENCS
4 | Guest editorial: Papers from the 18th joint workshop on Augmented Environments for Computer Assisted Interventions (AE-CAI) at MICCAI 2024: Guest editors' foreword | Linte CA; Yaniv Z; Chen E; Drouin S; Kersten-Oertel M; McLeod J; Sarikaya D; Wang J | 39834896 | ENCS
5 | Papers from the 17th Joint Workshop on Augmented Environments for Computer Assisted Interventions at MICCAI 2023: Guest Editors' Foreword | Linte CA; Yaniv Z; Chen E; Dou Q; Drouin S; Kalia M; Kersten-Oertel M; McLeod J; Sarikaya D | 38638501 | CONCORDIA
6 | MARIN: an open-source mobile augmented reality interactive neuronavigation system | Léger É; Reyes J; Drouin S; Popa T; Hall JA; Collins DL; Kersten-Oertel M | 32323206 | PERFORM
7 | Quantifying attention shifts in augmented reality image-guided neurosurgery | Léger É; Drouin S; Collins DL; Popa T; Kersten-Oertel M | 29184663 | PERFORM
8 | Distance sonification in image-guided neurosurgery | Plazak J; Drouin S; Collins L; Kersten-Oertel M | 29184665 | PERFORM
9 | Combining intraoperative ultrasound brain shift correction and augmented reality visualizations: a pilot study of eight cases | Gerard IJ; Kersten-Oertel M; Drouin S; Hall JA; Petrecca K; De Nigris D; Di Giovanni DA; Arbel T; Collins DL | 29392162 | PERFORM
10 | Gesture-based registration correction using a mobile augmented reality image-guided neurosurgery system | Léger É; Reyes J; Drouin S; Collins DL; Popa T; Kersten-Oertel M | 30800320 | PERFORM

 

Title: Exploring interaction paradigms for segmenting medical images in virtual reality
Authors: Jones Z; Drouin S; Kersten-Oertel M
Link: https://pubmed.ncbi.nlm.nih.gov/40402355/
DOI: 10.1007/s11548-025-03424-y
Publication: International journal of computer assisted radiology and surgery
Keywords: Contours; Interaction methods; Radiology; Segmentation; Virtual reality
PMID: 40402355
Category:
Date Added: 2025-05-22
Dept Affiliation: ENCS
Affiliations:
1 Computer Science and Software Engineering, Concordia University, 1455 De Maisonneuve Blvd. W., Montreal, QC, H3G 1M8, Canada. zacharyjonesmail@gmail.com.
2 Département de Génie Logiciel Et TI, École de Technologie Supérieure, 1100 R. Notre Dame O, Montreal, QC, H3C 1K3, Canada.
3 Computer Science and Software Engineering, Concordia University, 1455 De Maisonneuve Blvd. W., Montreal, QC, H3G 1M8, Canada.

Description:

Purpose: Virtual reality (VR) can offer immersive platforms for segmenting complex medical images to facilitate a better understanding of anatomical structures for training, diagnosis, surgical planning, and treatment evaluation. These applications rely on user interaction within the VR environment to manipulate and interpret medical data. However, the optimal interaction schemes and input devices for segmentation tasks in VR remain unclear. This study compares user performance and experience using two different input schemes.

Methods: Twelve participants segmented six CT/MRI images using two input methods: keyboard and mouse (KBM) and motion controllers (MCs). Performance was assessed using accuracy, completion time, and efficiency. A post-task questionnaire measured users' perceived performance and experience.

Results: No significant overall time difference was observed between the two input methods, though KBM was faster for larger segmentation tasks. Accuracy was consistent across input schemes. Participants rated both methods as equally challenging, with similar efficiency levels, but found MCs more enjoyable to use.

Conclusion: These findings suggest that VR segmentation software should support flexible input options tailored to task complexity. Future work should explore enhancements to motion controller interfaces to improve usability and user experience.





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026