https://doi.org/10.4081/ejtm.2026.15004
Abstract 005 | Radio-anatomical interactive library: a new paradigm for training and teaching in anatomy and disease
Gianmarco Dolino 1,2, Damiano Coato 1,2, Riccardo Forni 1,2, Arnar Evgeni Gunnarsson 1, Halldór Jónsson Jr 1,2, Paolo Gargiulo 1,3 | 1Institute of Biomedical and Neural Engineering, Reykjavík University, Reykjavík, Iceland; 2Centre for Mechanics of Biological Materials, University of Padova, Padua, Italy; 3Landspítali University Hospital, Reykjavík, Iceland.
Published: 2 March 2026
The study of human anatomy has been the foundation of medical education through the ages. While digital tools such as 3D visualization platforms, virtual reality (VR), and augmented reality (AR) have created new opportunities for healthcare training [1], traditional methods remain the cornerstone [2]. Modern technologies enable the production of patient-specific, custom-made physical models from previously acquired medical images, enhancing case understanding and pre-surgical preparation [3,4]. The Radio-Anatomical Interactive Library (RAIL) aims to set a new standard as an innovative educational platform that enhances medical training and education by offering interactive 3D visualizations of patient-specific anatomical models derived from medical imaging data.

The Institute of Biomedical and Neural Engineering at Reykjavík University, in collaboration with Landspítali, provided a set of clinical cases representing various medical conditions. Once the appropriate DICOM datasets were gathered, the images were prepared for processing through cleaning, alignment, and segmentation to initiate the 3D reconstruction workflow. Following segmentation, a 3D surface mesh was generated from the outlined regions. Finally, the reconstructed model was imported into a graphics engine, which provides the algorithms and tools needed to build interactive 3D environments for VR/AR visualization.

The RAIL project addresses gaps in traditional teaching by allowing users to explore real clinical cases in an interactive and flexible way; working with real-case imaging helps students understand complex anatomical structures more effectively than simplified or idealized models. RAIL also promotes accessibility by enabling students to interact with anatomical 3D models anytime and from anywhere, overcoming the physical limitations of traditional classrooms. The platform bridges traditional educational methods and modern technologies by combining clinical imaging, patient-specific case development, and interactive educational tools within a mixed-reality framework. It supports on-demand case access and real-time interaction, and operates with commercially available headsets, offering both clinical relevance and educational versatility.
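To make the reconstruction step concrete, the sketch below shows the DICOM-to-mesh portion of such a workflow in Python. This is a minimal illustration, not the authors' actual pipeline: the SimpleITK/scikit-image/trimesh tooling, the bone Hounsfield threshold, and the file paths are assumptions made for the example, and the segmentation in RAIL would typically be case-specific rather than a single global threshold.

import SimpleITK as sitk
from skimage import measure
import trimesh  # used here only to export the extracted surface

# 1. Load the DICOM series into a single 3D volume (placeholder path).
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("path/to/dicom_series"))
volume = sitk.Cast(reader.Execute(), sitk.sitkFloat32)

# 2. Cleaning: light smoothing to reduce acquisition noise.
volume = sitk.CurvatureFlow(volume, timeStep=0.125, numberOfIterations=5)

# 3. Segmentation: a fixed Hounsfield-unit threshold for bone, standing in
#    for the case-specific segmentation described in the abstract.
mask = sitk.BinaryThreshold(volume, lowerThreshold=300, upperThreshold=3000,
                            insideValue=1, outsideValue=0)

# 4. Surface extraction: marching cubes on the mask, scaled by voxel spacing
#    so the mesh is expressed in physical (mm) coordinates.
mask_array = sitk.GetArrayFromImage(mask)   # numpy array in (z, y, x) order
sx, sy, sz = volume.GetSpacing()            # SimpleITK spacing is (x, y, z)
verts, faces, normals, _ = measure.marching_cubes(mask_array, level=0.5,
                                                  spacing=(sz, sy, sx))

# 5. Export as STL for import into a graphics engine (e.g. Unity or Unreal),
#    where the interactive VR/AR scene is assembled.
trimesh.Trimesh(vertices=verts, faces=faces,
                vertex_normals=normals).export("patient_case_bone.stl")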

References
1. C. Moro, Z. Štromberga, A. Raikos, and A. Stirling, "The effectiveness of virtual and augmented reality in health sciences and medical anatomy," Anatomical Sciences Education, vol. 10, no. 6, pp. 549–559, 2017.
2. S. Ghosh et al., "Cadaveric dissection as an educational tool for anatomical sciences in the 21st century," Anatomical Sciences Education, vol. 10, no. 3, pp. 286–299, 2017.
3. D. Coato et al., "Synthetic 3D printed tibial plateau with gradient material properties for biomechanical accuracy," Frontiers in Bioengineering and Biotechnology, vol. 13, 1707380, 2025.
4. G. Dolino et al., "Designing a Synthetic 3D-Printed Knee Cartilage: FEA Model, Micro-Structure and Mechanical Characteristics," Applied Sciences, vol. 14, pp. 1–16, 2024.