Robotic arms with a human-like sense of touch

Birth of innovative touch sensitive avatar robotic arm based on real-haptics

Published online 30 November 2017

Touch sensitive avatar-robotic arm based on real-haptics developed by Takahiro Nozaki and colleagues at the Keio University Haptics Research Center.

© 2017 Keio University

The world's first 'real haptics' avatar-robot arm

Conventional haptic robots convey a sense of touch to their operators through mechanical vibrations derived from touch-sensor readings, but these sensors can be insensitive and prone to malfunction. As a result, this type of haptics technology is useful for games and entertainment but has limited industrial application.

Takahiro Nozaki, an assistant professor in the Faculty of Science and Technology at Keio University, and his colleagues have developed a 'real haptics' avatar-robot with a General Purpose Arm (GPA) that transmits sound, vision, movement and, importantly, a highly sensitive sense of touch to remotely located users in real time. "Our real-haptics technology is an integral part of the Internet of Actions (IoA) technology with potential applications in manufacturing, agriculture, medicine, and nursing care," says Nozaki.

This is the world's first high-precision 'tactile force transmission technology' that records human movements and allows them to be edited and reproduced. Notably, the GPA does not employ conventional touch sensors, making it cheaper, more compact and more robust. The realization of the Keio University touch-sensitive avatar-robotic arm based on real haptics is a dream come true for Nozaki. "As a high school student I wanted to study robotics, and after surveying my options I enrolled in the Department of System Design Engineering at Keio University, the top department in Japan conducting this type of research."
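
In software terms, 'record, edit and reproduce' can be pictured as logging time-stamped position and force samples from the arm and later replaying them, possibly rescaled, as references to its controller. The Python sketch below illustrates only that idea; the class, method names and data format are hypothetical and are not part of the Keio group's or Motion Lib's software.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Sample:
    """One time-stamped snapshot of the arm's state (hypothetical format)."""
    t: float         # time since the start of recording, in seconds
    position: float  # joint angle, in radians
    force: float     # estimated reaction torque, in newton-metres


class MotionRecorder:
    """Records, edits and replays position/force trajectories (illustrative only)."""

    def __init__(self) -> None:
        self.samples: List[Sample] = []

    def record(self, t: float, position: float, force: float) -> None:
        """Store one sample captured from the arm."""
        self.samples.append(Sample(t, position, force))

    def scale_force(self, gain: float) -> None:
        """Edit the recording, e.g. make the reproduced touch gentler or firmer."""
        self.samples = [Sample(s.t, s.position, s.force * gain) for s in self.samples]

    def replay(self, send_reference: Callable[[float, float], None]) -> None:
        """Reproduce the motion by feeding the stored samples to a controller callback."""
        for s in self.samples:
            send_reference(s.position, s.force)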

Core technology for the real-time haptic avatar arm

The two key innovations of the Keio avatar-robot GPA are the high-precision motors in the avatar arm and the algorithms that drive them. Precise control of force and position is what makes it possible to transmit the sense of touch without using touch sensors.
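
One common way to achieve force and position control without touch sensors is bilateral control: the operator-side (master) and remote (slave) joints are driven so that their positions track each other while their reaction forces, obtained from observers rather than sensors, sum to zero like action and reaction. The Python sketch below illustrates that control law in an idealized single-joint form; the function name, gains and scaling are assumptions for illustration, not the group's actual implementation.

def bilateral_accel_refs(x_m, v_m, f_m, x_s, v_s, f_s,
                         kp=400.0, kd=40.0, kf=1.0):
    """Compute acceleration references for the master and slave joints.

    Two goals of bilateral control, in idealized form:
      * position tracking:   x_m - x_s -> 0  (differential mode)
      * action and reaction: f_m + f_s -> 0  (common mode)
    f_m and f_s are reaction forces estimated by observers, not measured
    by tactile sensors. Gain values are illustrative only.
    """
    pos_err = x_m - x_s
    vel_err = v_m - v_s
    force_sum = f_m + f_s

    a_common = -kf * force_sum / 2.0                # balance action and reaction
    a_diff = -(kp * pos_err + kd * vel_err) / 2.0   # pull the two positions together

    a_m_ref = a_common + a_diff
    a_s_ref = a_common - a_diff
    return a_m_ref, a_s_ref

A higher-level loop would then convert these acceleration references into motor currents, typically with a disturbance observer compensating for friction and modelling errors.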

The real-haptics GPA can recognize the shape and composition of materials, whether soft or hard, and the position of objects in 3D space, and can manipulate them according to real-time instructions from a remotely located user, so that the arm acts as a real-time avatar.

The technical breakthrough in motor control and robotics that led to the robotic avatar developed by Nozaki and colleagues was first reported by Keio University's Kouhei Ohnishi in 1983 in his paper "Torque-speed regulation of DC motor based on load torque estimation method" (IPEC-Tokyo'83, p. 1209). Ohnishi continued to develop his ideas in a 1993 paper on sensorless torque control (IEEE Transactions on Industrial Electronics 40, 259 (1993)).

That work was followed by his proposals on motion control for advanced mechatronics (IEEE/ASME Transactions on Mechatronics 1, 56 (1996)).
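
The idea behind load torque estimation, and the reason no touch sensor is needed, is that the torque a motor is commanded to produce can be compared with the acceleration it actually achieves; the difference, after low-pass filtering, is an estimate of the disturbance torque exerted by the environment. The discrete-time Python sketch below illustrates this classic disturbance-observer scheme; the class structure, filter and parameter names are illustrative assumptions rather than the published algorithms.

class DisturbanceObserver:
    """Estimates load/disturbance torque from motor current and velocity.

    Sketch of the disturbance-observer idea: estimated torque equals the
    commanded torque minus inertia times measured acceleration, passed
    through a low-pass filter. Parameters and structure are illustrative.
    """

    def __init__(self, inertia: float, torque_const: float,
                 cutoff: float, dt: float) -> None:
        self.J = inertia          # rotor inertia, kg m^2
        self.Kt = torque_const    # motor torque constant, Nm/A
        self.g = cutoff           # observer cutoff frequency, rad/s
        self.dt = dt              # sampling period, s
        self.prev_velocity = 0.0
        self.tau_dis_hat = 0.0    # filtered disturbance estimate, Nm

    def update(self, current_ref: float, velocity: float) -> float:
        """Return the estimated disturbance torque for one sampling step."""
        accel = (velocity - self.prev_velocity) / self.dt
        self.prev_velocity = velocity

        # Raw disturbance: commanded torque minus what the inertia explains.
        tau_raw = self.Kt * current_ref - self.J * accel

        # First-order low-pass filter to suppress differentiation noise.
        alpha = self.g * self.dt / (1.0 + self.g * self.dt)
        self.tau_dis_hat += alpha * (tau_raw - self.tau_dis_hat)
        return self.tau_dis_hat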

Commercialization of ideas and future work

Nozaki has launched the company Motion Lib to commercialize the real-haptics technology. Its main product is the 'ABC-CORE' force/tactile controller IC, which adjusts the force of two synchronized DC/AC servomotors to transmit tactile forces between them.

Furthermore, Nozaki is collaborating with 30 companies on proof-of-concept projects for Internet of Actions (IoA) applications. "In one of our projects, the assist-avatar robotic GPA is being tested for real-life applications in supporting farmers to pick fruit and other agricultural applications," says Nozaki.

Reference

  1. Fukushima, S., Sekiguchi, H., Saito, Y., Iida, W., Nozaki, T. & Ohnishi, K. Artificial replacement of human sensation using haptic transplant technology. IEEE Transactions on Industrial Electronics, advance online publication, 2 October 2017 (doi: 10.1109/TIE.2017.2758757).