Beyl, T., P. Nicolai, et al. (2012). “Haptic Feedback in OP:Sense – Augmented Reality in Telemanipulated Robotic Surgery.” Studies in Health Technology and Informatics 173: 58-63.
In current research, haptic feedback plays an important role in robot-assisted interventions. However, most approaches to haptic feedback consider only the mapping of the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest and helps to reduce the risk of damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
Bleuler, H., L. Santos-Carreras, et al. (2012). “Haptic handles for robotic surgery.” Studies in Health Technology and Informatics 173: 64-68.
Robotic surgery, i.e. master-slave telemanipulators for surgery, is rapidly developing. One of the key components is the surgeon’s console and, within the console, especially the “handles” (the “joysticks”) for manipulation control. Two examples of haptic (force-feedback) handle designs developed and realized at the LSRO lab are briefly presented along with design considerations (ergonomics, usability). The impact on patient safety and patient benefit and future trends in this field are hinted at, especially the possibilities offered by VR in this context.
Blondeau, A., M. Garbani, et al. (2011). “Haptic robots and rehabilitation of the hemiplegic upper limb.” Robots haptiques dans la rééducation du membre supérieur hémiplégique 11(120): 33-37.
Background: haptic robots can be used during rehabilitation of a hemiplegic patient. This article is a bibliographical synthesis that focuses on the benefits of this tool coupled with a visual interface. The objective is to present the effects of the system used in rehabilitation of the hemiplegic upper limb. Results: significant improvements were observed in the evolution of the deficient upper limb. No notable improvements were seen in spasticity or joint range of motion. Conclusion: rehabilitation with haptic robots, coupled with a visual interface, has positive effects on the recovery of the hemiplegic upper limb. However, the studies are not unanimous regarding the superiority of this tool over conventional therapy. © 2011. Elsevier Masson SAS. All rights reserved.
Pisla, D., B. Gherman, et al. (2011). “PARASURG hybrid parallel robot for minimally invasive surgery.” Chirurgia (Bucharest, Romania : 1990) 106(5): 619-625.
This paper presents the hybrid parallel robot PARASURG 9M for robotically assisted surgery, a robot that was entirely designed and produced in Romania. It is a versatile robot, composed of a positioning and orientation module, PARASURG 5M, with five degrees of freedom, to whose end either a laparoscope or an active surgical instrument for cutting/grasping, PARASIM, with four degrees of freedom, can be attached. Based on its mathematical modelling, the first low-cost experimental model of the surgical robot has been built. The robot is part of the surgical robotic system PARAMIS, with three arms, one used as a laparoscope holder and the other two for manipulating active instruments. When it is used as a manipulator of the camera, the user can give commands over a large area for positioning the laparoscope using different interfaces: joystick, microphone, keyboard & mouse, and haptic device. If the active surgical instrument PARASIM is attached, the robot commands are given through a haptic device. The main features that make the PARASURG 9M surgical robot suited for minimally invasive surgery are: precision, the elimination of the surgeon’s natural tremor, and direct control over a smooth, precise, stable view of the internal surgical field for the surgeon. It also eliminates the need for a second surgeon to be present for the entire procedure (when the robot is used as a camera holder). In addition, surgeon dexterity is improved when using the PARASIM active instrument, and the ergonomics of using the robot are better (in classic laparoscopy, the surgeon must hold a difficult position for a long period of time, while the robot never gets tired). With a relatively easy-to-understand, intuitive command system, surgeons can rapidly adapt to using the PARASURG 9M robot in surgical procedures.
Rincon-Gonzalez, L., J. P. Warren, et al. (2011). “Haptic interaction of touch and proprioception: implications for neuroprosthetics.” IEEE Transactions on Neural Systems and Rehabilitation Engineering 19(5): 490-500.
Somatosensation is divided into multiple discrete modalities that we think of as separate: e.g., tactile, proprioceptive, and temperature sensation. However, in processes such as haptics, those modalities all interact. If one intended to artificially generate a sensation that could be used for stereognosis, for example, it would be crucial to understand these interactions. We are presently examining the relationship between the tactile and proprioceptive modalities in this context. In this overview of some of our recent work, we show that signals that would normally be attributed to two of these systems separately, tactile contact and self-movement, interact both perceptually and physiologically in ways that complicate the understanding of haptic processing. In the first study described here, we show that a tactile illusion on the fingertips, the cutaneous rabbit effect, can be abolished by changing the posture of the fingers. We then discuss activity in primary somatosensory cortical neurons illustrating the interrelationship of tactile and postural signals. In this study, we used a robot-enhanced virtual environment to show that many neurons in primary somatosensory cortex with cutaneous receptive fields encode elements of both tactile contact and self-motion. We then show the results of studies examining the structure of the process that extracts the spatial location of the hand from proprioceptive signals. The structure of the spatial errors in these maps indicates that the proprioceptive-spatial map is stable but individually constructed. These seemingly disparate studies lead us to suggest that tactile sensation is encoded in a 2-D map, but one which undergoes continual dynamic modification by an underlying proprioceptive map. Understanding how the disparate signals that comprise the somatosensory system are processed to produce sensation is an important step in realizing the kind of seamless integration aspired to in neuroprosthetics.