TELESAR V is a haptic telexistence system that enables a user to embody a dexterous robot and experience what the robot feels at its fingertips when manipulating and touching objects remotely.
Through a head-mounted display, the user sees the remote environment and the robot's hands as if looking out from inside the robot, and hears what the robot hears at the remote location. With the human-like movements of TELESAR V's upper body, arms, and hands, together with the ability to see, hear, and feel haptic sensation simultaneously, the user experiences an out-of-body illusion when observing himself or herself through the system.
The user feels the haptic sensation of glass balls pouring from one cup to another and of touching an uneven surface. Furthermore, the user can feel the temperature at the robot's fingertip when it touches an object.
TELESAR V is a telexistence master–slave robot system that was developed to realize the concept of haptic telexistence. It comprises a mechanically unconstrained master cockpit and a high-speed, robust, 53-DOF anthropomorphic slave robot with a full upper body. The system provides an experience of our extended “body schema,” which allows a human to maintain an up-to-date representation of the positions of his or her various body parts in space. The body schema can be used to understand the posture of the remote body and to perform actions with the perception that the remote body is the user's own. Through simultaneous visual, auditory, and haptic sensations, users can perform tasks dexterously and perceive the robot's body as their own, which provides the simplest and most fundamental experience of telexistence. The TELESAR V master–slave system can transmit fine haptic sensations, such as the texture and temperature of a material, from the avatar robot's fingers to the user's fingers.
As shown in Figs. 1 and 2, the TELESAR V system consists of a master (local) side and a slave (remote) side. The 53-DOF dexterous robot was developed with a 6-DOF torso, a 3-DOF head, two 7-DOF arms, and two 15-DOF hands. The robot has Full HD (1920 × 1080 pixels) cameras for capturing wide-angle stereovision, and stereo microphones situated at the robot's ears capture audio from the remote site. The operator's voice is transferred to the remote site and output through a small speaker installed near the robot's mouth, enabling conventional bidirectional verbal communication. On the master side, the operator's movements are captured with a motion-capture system (OptiTrack), and finger bending is captured with 14 DOF using a modified 5DT Data Glove 14.
Figure 1 General view of TELESAR V master (left) and slave robot (right).
Figure 2 TELESAR V system configuration.
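As a quick sanity check, the joint groups listed above do sum to the stated 53 DOF. The grouping below is simply arithmetic on the figures in the text, not the robot's actual control software:

```python
# Tally of TELESAR V's joint groups as stated in the text.
# The dictionary structure is illustrative, not from the real software.
DOF_GROUPS = {
    "torso": 6,
    "head": 3,
    "arms": 2 * 7,    # left + right arm
    "hands": 2 * 15,  # left + right hand
}

total_dof = sum(DOF_GROUPS.values())
print(total_dof)  # 53
```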
The haptic transmission system consists of three parts: a haptic sensor, a haptic display, and a processing block. When the haptic sensor touches an object, it obtains haptic information such as contact force, vibration, and temperature, based on the haptic primary colors model. The haptic display applies haptic stimuli to the user's finger to reproduce the information obtained by the haptic sensor. The processing block connects the two, converting the measured physical data into data that account for physiological haptic perception before reproduction by the haptic display. The details of the scanning and displaying mechanisms are described below.
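The three-part flow above might be sketched as follows. All class names, fields, and the torque gain are illustrative assumptions, not the actual TELESAR V implementation:

```python
from dataclasses import dataclass

# Minimal sketch of the pipeline described above
# (haptic sensor -> processing block -> haptic display).
# Names and gains are illustrative, not TELESAR V's real software.

@dataclass
class SensorReading:
    force_n: tuple        # contact force vector (N)
    vibration: list       # audio-band samples from the microphone
    temperature_c: float  # surface temperature (deg C)

@dataclass
class DisplayCommand:
    motor_torques: tuple     # belt-motor torques reproducing the force
    vibration: list          # samples sent to the force reactor
    peltier_target_c: float  # setpoint for the Peltier actuator

def processing_block(r: SensorReading) -> DisplayCommand:
    """Convert measured physical data into display commands: scale force
    into motor torques (assumed linear gain), pass vibration through,
    and use the measured temperature as the Peltier setpoint."""
    K_TORQUE = 0.02  # N*m per N of contact force; illustrative gain
    return DisplayCommand(
        motor_torques=tuple(K_TORQUE * f for f in r.force_n),
        vibration=r.vibration,
        peltier_target_c=r.temperature_c,
    )

reading = SensorReading((0.0, 0.0, 1.5), [0.1, -0.1], 24.0)
cmd = processing_block(reading)
```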
First, a force sensor inside the haptic sensor measures the force vector when the sensor touches an object. Two motor-belt mechanisms in the haptic display then reproduce this force vector on the operator's fingertip. The processing block controls the electrical current drawn by each motor to produce the target torques derived from the measured force. As a result, the mechanism reproduces the force sensation of the haptic sensor touching the object.
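One way to realize the force-to-current conversion is sketched below, assuming a linear motor torque model (tau = Kt · i) and a simple two-tension decomposition. The gains, pulley radius, and the decomposition itself are illustrative assumptions, not the actual TELESAR V controller:

```python
# Hypothetical mapping from a measured fingertip force to currents for the
# two belt motors. All constants below are assumed for illustration.
KT = 0.05           # motor torque constant (N*m/A), assumed
BELT_RADIUS = 0.01  # effective pulley radius (m), assumed

def force_to_currents(shear_n: float, normal_n: float) -> tuple:
    """Map a planar fingertip force to the two motor currents whose
    torques produce the matching belt tensions."""
    # Each motor carries half the normal load plus/minus the shear;
    # belt tensions cannot be negative, so clamp at zero.
    t1 = max(0.0, 0.5 * normal_n + shear_n)  # belt tension 1 (N)
    t2 = max(0.0, 0.5 * normal_n - shear_n)  # belt tension 2 (N)
    # Required torque per motor is tension * radius; current = torque / Kt.
    return (t1 * BELT_RADIUS / KT, t2 * BELT_RADIUS / KT)

i1, i2 = force_to_currents(shear_n=0.5, normal_n=2.0)
```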
Second, a microphone in the haptic sensor records the sound generated at its surface when the sensor is in contact with an object. A force reactor in the haptic display then plays back the transmitted sound as vibration. Since this vibration conveys the high-frequency component of the haptic sensation, the information is transmitted without delay.
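Forwarding the microphone signal to the force reactor could look like the sketch below. The one-pole high-pass filter is an assumption (the text does not specify any filtering); it removes the DC offset so that only the audio-band vibration drives the actuator:

```python
# Hypothetical pre-processing of the microphone signal before playback
# on the force reactor; the filter and its coefficient are assumptions.

def high_pass(samples, alpha=0.95):
    """One-pole high-pass filter: y[n] = alpha * (y[n-1] + x[n] - x[n-1])."""
    out, y, prev = [], 0.0, 0.0
    for x in samples:
        y = alpha * (y + x - prev)
        prev = x
        out.append(y)
    return out

# A constant (DC) input decays toward zero, while sharp transients pass.
filtered = high_pass([1.0] * 50)
```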
Third, a thermistor in the haptic sensor measures the surface temperature of the object. The measured temperature is reproduced by a Peltier actuator mounted on the operator's fingertip. The processing block generates the control signal for the Peltier actuator using a PID control loop with feedback from a thermistor located on the actuator. Figs. 3 and 4 show the structures of the haptic sensor and the haptic display, respectively.
Figure 3 Structure of haptic sensor.
Figure 4 Structure of haptic display.
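The temperature-reproduction loop described above can be sketched as a PID controller driving the Peltier actuator toward the remote surface temperature, with feedback from the thermistor mounted on the actuator. The gains and the first-order thermal model below are illustrative assumptions, not TELESAR V's actual tuning:

```python
# Illustrative PID loop for the Peltier actuator; gains and the toy
# thermal plant are assumptions, not the real controller parameters.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=0.5, kd=0.0, dt=0.01)
temp = 20.0    # finger-side temperature from the actuator thermistor (deg C)
target = 30.0  # remote surface temperature reported by the haptic sensor
for _ in range(5000):
    drive = pid.update(target, temp)
    # Toy first-order plant: Peltier drive heats the pad, which slowly
    # leaks heat back toward the 20 deg C ambient.
    temp += (drive - 0.1 * (temp - 20.0)) * pid.dt
```

After the loop, the pad temperature has settled at the remote target, which is the behavior the processing block's control signal aims for.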
Figure 6 shows the left hand of the TELESAR V robot with the haptic sensors, and the haptic displays set in the modified 5DT Data Glove 14.
Figure 6 Slave hand with haptic sensors (left) and master hand with haptic displays (right).
Figure 7 shows TELESAR V performing several tasks, such as picking up sticks, transferring small balls from one cup to another, producing Japanese calligraphy, playing Japanese chess (shogi), and feeling the texture of a cloth.
Figure 7 TELESAR V conducting several tasks while transmitting haptic sensation to the user.
PDFs of the following papers are available at http://tachilab.org/publications/.
1) S. Tachi, K. Minamizawa, M. Furukawa and C. L. Fernando, “Telexistence - from 1980 to 2012,” Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS2012), pp.5440-5441, Vilamoura, Algarve, Portugal, 2012.
2) C. L. Fernando, M. Furukawa, T. Kurogi, K. Hirota, S. Kamuro, K. Sato, K. Minamizawa and S. Tachi, “TELESAR V: TELExistence Surrogate Anthropomorphic Robot,” ACM SIGGRAPH 2012, Emerging Technologies, Los Angeles, CA, USA, 2012.
3) S. Tachi, K. Minamizawa, M. Furukawa and C. L. Fernando, “Haptic Media: Construction and Utilization of Human-harmonized "Tangible" Information Environment,” Proceedings of the 23rd International Conference on Artificial Reality and Telexistence (ICAT), Tokyo, Japan, pp.145-150, 2013.
4) S. Tachi, Telexistence, 2nd Edition, World Scientific, ISBN 978-981-4618-06-9, 2015.
5) S. Tachi, “Telexistence: Past, Present, and Future,” in G. Brunnett et al. (eds.), Virtual Realities, Springer, ISBN 978-3-319-17042-8, pp.229-259, 2015.
6) S. Tachi, “Telexistence: Enabling humans to be virtually ubiquitous,” Computer Graphics and Applications, vol.36, no.1, pp.8-14, 2016.