The use of medical imaging technologies such as computed tomography, magnetic resonance imaging, and ultrasound has given clinicians powerful tools to assess and diagnose their patients. These imaging modalities allow health care providers to see inside their patients along three orthogonal anatomical planes. However, complex anatomy and rare pathologies can be difficult to interpret from two-dimensional images alone. Sometimes a 3D model is required to represent the organ in space more clearly and to give clinicians a tangible model they can physically manipulate with tools and instruments before performing procedures on their patients. Our research lab (@APIL_TGH) specializes in the 3D modelling and printing of patient-specific organs for medical education and pre-surgical planning. Based on feedback from learners and clinicians alike, 3D printing these organs for research and clinical use has proven valuable. A key weakness of 3D printed models, however, is their inability to relay contextual data to the user. Information such as descriptions of the anatomy and pathology, the patient's case history, and supplementary medical details is unavailable unless the model is accompanied by a printed data sheet or an online description. Our experiment was to develop a mobile augmented reality (AR) app that detects a 3D printed organ and overlays relevant digital data for the clinician in a responsive, interactive environment.
Our solution was to use Vuforia's Model Target Generator. This tool allowed us to quickly turn a patient-specific 3D printed heart model into a fiducial marker recognized by Vuforia's suite of AR tools in Unity. The integration between Unity and Vuforia let us iterate rapidly on different app designs and UI elements, and to test the accuracy and detection rate of the 3D printed heart against the AR scene camera across a variety of positions and lighting conditions. Several overlays were then created to label specific anatomical landmarks on the 3D model, toggle the heart's myocardium on and off, and display information about this particular heart's pathology: an aortic dissection.
Results from the development of our experimental AR app have been extremely promising. Detection of the 3D printed heart has been quick and accurate. The 3D overlays allow us to display information, such as labels for the heart's chambers, that previously could not be conveyed without an accompanying piece of paper. Overlaying the full myocardium has also been extremely useful, since it lets us display textures on the digital model that we cannot reproduce with our 3D printers. Most useful was the ability to digitally overlay the heart's full pathology. This patient-specific heart has an aortic dissection running through the aorta; 3D printing this anatomy is complex and takes nearly as much time to fabricate as the other chambers combined. Because this particular heart was printed for cardiac anatomy education, the full aorta was not necessary, and the AR app allowed us to 3D print only the required parts of the heart while still letting users see, learn from, and interact with the full model digitally. This saved us time and printing material without compromising the amount of data represented by the physical model. Continued development of the app prior to a proper release is planned. Future developments currently in the works include fully animated, beating hearts; more relevant data overlaid on the heart models; and embedded ultrasound simulators that let clinicians practice their echocardiography skills digitally on the 3D printed heart models.
- Department of Anesthesia and Pain Management, Toronto General Hospital
- The University Health Network
- Peter Munk Cardiac Centre
- The University of Toronto Faculty of Medicine
- Joshua Qua Hiansen
- Dr. Azad Mashari
- Dr. Massimiliano Meineri