Background
Apart from animal testing and clinical trials, surgical research and laparoscopic training rely mainly on phantoms. The aim of this project was to design a phantom with realistic anatomy and haptic characteristics, a modular design, and easy reproducibility. The phantom was named the open-source Heidelberg laparoscopic phantom (OpenHELP) and serves as an open-source platform.
Methods
The phantom was based on an anonymized CT scan of a male patient. The anatomical structures were segmented to obtain digital three-dimensional models of the torso and the organs. The digital models were materialized via rapid prototyping. Two methods to simulate pneumoperitoneum were developed: a flexible one using an elastic abdominal wall and a rigid one using a plastic shell. Artificial organs were produced sequentially, starting from raw gypsum models, to silicone molds, to final silicone casts. Reproduction accuracy was evaluated exemplarily for ten silicone rectum models by comparing the digital 3D surface of the original rectum with CT scans of the casts and calculating the root mean square error of the surface deviations. Haptic realism was also evaluated on a visual analog scale (VAS, 0–10) to identify the most realistic silicone compositions.
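The root mean square error described above can be illustrated with a minimal sketch. This assumes point-to-point correspondences between the original and reconstructed surfaces are already established; the registration pipeline actually used in the project is not specified in this abstract, and the function name and toy coordinates below are hypothetical.

```python
import math

def rms_surface_error(original, reconstructed):
    """Root mean square of Euclidean distances between paired 3D points (mm).

    Hedged sketch: assumes both surfaces are given as equal-length lists
    of corresponding (x, y, z) points, which in practice would come from
    a registration step not shown here.
    """
    assert len(original) == len(reconstructed)
    squared_distances = [
        sum((a - b) ** 2 for a, b in zip(p, q))
        for p, q in zip(original, reconstructed)
    ]
    return math.sqrt(sum(squared_distances) / len(squared_distances))

# Toy example: three corresponding surface points (coordinates in mm)
orig = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
cast = [(1.0, 0.0, 0.0), (10.0, 2.0, 0.0), (0.0, 10.0, 2.0)]
print(round(rms_surface_error(orig, cast), 3))  # → 1.732
```

A per-point deviation map (rather than a single scalar) is often computed alongside this metric to localize where a cast departs from the original geometry.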
Results
The rigid, durable plastic torso and the soft silicone organs of the abdominal cavity were successfully produced. Pneumoperitoneum was simulated successfully with both methods. The ten silicone rectum models showed an average root mean square error of 2.26 (range 0–11.48) mm. The most realistic rectum achieved an average haptic realism score of 7.25 (range 5.2–9.6) on the VAS.
Conclusion
The OpenHELP phantom proved feasible and accurate. It has since been applied frequently in the field of computer-assisted surgery at our institutions and is accessible to the academic community as an open-source project at www.open-cas.org.