More human-like conversation by humanoid robots
Freely-movable child-like android joins the ranks of humanoid robots
The JST ERATO Ishiguro Symbiotic Human-Robot Interaction Project, led by Professor Hiroshi Ishiguro of Osaka University, has developed a multimodal conversation control system and a multi-robot conversation control system, realizing humanoid robots that give their conversation partners a stronger human-like presence as well as a “sense of conversing.”
Research and development of conversational robots has been active in recent years; in conventional human-robot conversation, however, users cannot feel the “sense of conversing,” “existence,” or “sociability” that they feel in a conversation with another human.
Employing a multimodal conversation control system that uses various sensors, such as a distance sensor, a camera, and a microphone array, together with a conversation control scheme based on the android's own intentions and desires, the researchers developed a conversation system for androids and built conversation settings for the android “ERICA.” These efforts have allowed ERICA to hold natural, human-like conversations with strangers in a waiting room in the laboratory. By giving ERICA the roles of listening to a client (in a counseling session) and of asking questions as an interviewer (during a job interview), they are also working toward android-human conversation as natural as conversation between humans.
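For illustration only, the sketch below shows one way such multimodal inputs (distance sensor, camera, microphone array) could be fused and consulted by an intention- and desire-based dialogue policy. All class names, fields, and thresholds are hypothetical and do not describe the project's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class PerceptionState:
    """Fused snapshot of the conversation scene (hypothetical structure)."""
    partner_distance_m: float    # from the distance sensor
    partner_is_facing: bool      # from camera-based face/pose detection
    speech_direction_deg: float  # from microphone-array sound localization
    partner_is_speaking: bool    # from voice activity detection


class DesireBasedDialogueManager:
    """Toy dialogue policy: the android holds a 'desire' (e.g. to be listened to)
    and picks an utterance intent that serves it, given the fused perception."""

    def __init__(self, desire: str = "be_listened_to"):
        self.desire = desire

    def choose_intent(self, state: PerceptionState) -> str:
        if state.partner_is_speaking:
            return "backchannel"        # nod or say "uh-huh" while the partner talks
        if state.partner_distance_m > 2.0:
            return "invite_closer"      # greet and invite the person to approach
        if not state.partner_is_facing:
            return "attract_attention"  # re-establish eye contact first
        return "self_disclose"          # talk about a topic tied to its own desire


# Example: a person stands 1.2 m away, facing the android, currently silent.
state = PerceptionState(1.2, True, 10.0, False)
print(DesireBasedDialogueManager().choose_intent(state))  # -> "self_disclose"
```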
In addition, the researchers developed a system that encourages the human partner to speak and keeps conversations going by generating human-like conversational behaviors: the android produces varied, natural nodding, asks back about detected focus words, and detects its partner's responses. Androids capable of such human-like conversation could in the future serve as instructors or counselors, for example in interview practice or language learning.
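The listener behaviors described above can be pictured with a minimal sketch, again with hypothetical names and made-up thresholds: a nod is chosen when the partner pauses, and a follow-up question is built around a detected focus word.

```python
import random
from typing import Optional


def choose_nod(pause_s: float) -> Optional[str]:
    """Pick a nod style when the partner pauses (toy heuristic, illustrative threshold)."""
    if pause_s < 0.3:
        return None                          # pause too short: keep listening silently
    return random.choice(["small_nod", "deep_nod", "double_nod"])


def ask_back(utterance: str, focus_words: set) -> Optional[str]:
    """If the utterance contains a focus word, ask back about it to extend the talk."""
    for word in utterance.lower().split():
        if word in focus_words:
            return f"{word.capitalize()}? Could you tell me more about that?"
    return None


focus = {"travel", "family", "work"}
print(choose_nod(0.6))                                                  # e.g. "deep_nod"
print(ask_back("I have been thinking a lot about work lately", focus))  # asks about "work"
```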
Also, in a study using multiple conversational social robots named “CommU,” the researchers developed a multi-robot conversation control system that coordinates the timing of the robots' utterances and non-verbal expressions, enabling conversations and turn-taking between the robots. They also demonstrated that watching these inter-robot conversations made the human participant feel that a conversation was actually taking place.
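Conceptually, such coordination can be reduced to a shared schedule in which one controller decides which robot speaks next and when the other robot reacts. The toy scheduler below illustrates that idea only; it is not the actual CommU control software, and the names and timing constants are invented.

```python
import itertools


class MultiRobotTurnController:
    """Toy coordinator: alternates speaking turns between robots and schedules
    a short listener reaction (a nod) after each turn."""

    def __init__(self, robots, gap_s=0.4):
        self.robots = robots              # e.g. ["CommU_A", "CommU_B"]
        self.gap_s = gap_s                # pause between turns, in seconds
        self._order = itertools.cycle(robots)

    def schedule(self, utterances):
        t = 0.0
        plan = []
        for text in utterances:
            speaker = next(self._order)
            listener = [r for r in self.robots if r != speaker][0]
            duration = 0.3 + 0.06 * len(text.split())  # crude speech-length estimate [s]
            plan.append((round(t, 2), speaker, "say", text))
            plan.append((round(t + duration, 2), listener, "nod", ""))
            t += duration + self.gap_s
        return plan


controller = MultiRobotTurnController(["CommU_A", "CommU_B"])
for event in controller.schedule(["Did you see the news today?", "Yes, it was surprising!"]):
    print(event)
```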
In addition, they developed “ibuki,” a child-like android with a mobile base. ibuki produces a walking-like motion similar to a human gait while moving on wheels: a swing mechanism combining a pair of eccentric wheels sways the body horizontally, a ball screw-driven actuator moves it vertically, and a dedicated waist joint drives the upper body.
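One way to picture the combined motion is as phase-locked oscillations layered on a wheeled forward motion: the eccentric wheels contribute a lateral sway, the ball-screw axis a vertical bounce, and the waist joint an upper-body pitch. The sketch below generates such a trajectory; the amplitudes and frequencies are illustrative values, not ibuki's specifications.

```python
import math


def gait_like_trajectory(t: float, forward_speed: float = 0.3,
                         step_freq_hz: float = 1.0):
    """Return commanded (x, sway, bounce, waist_pitch) at time t [s].
    All amplitudes and frequencies are illustrative only."""
    phase = 2.0 * math.pi * step_freq_hz * t
    x = forward_speed * t                   # wheeled base moves forward [m]
    sway = 0.02 * math.sin(phase)           # eccentric wheels: lateral sway [m]
    bounce = 0.01 * math.sin(2.0 * phase)   # ball screw: two bounces per step cycle [m]
    waist_pitch = 0.05 * math.sin(phase)    # waist joint: upper-body pitch [rad]
    return x, sway, bounce, waist_pitch


# Sample the commanded trajectory at 10 Hz for one second.
for i in range(11):
    t = i / 10.0
    print([round(v, 3) for v in gait_like_trajectory(t)])
```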
ibuki has many degrees of freedom both in the drive mechanisms of its face and head (allowing it to make facial expressions) and in those of its arms and hands (allowing it to gesture). Because every degree of freedom (DOF) of its mechanical system is driven by a back-drivable electric motor with a reducer, no air compressor for a pneumatic system is needed; ibuki can therefore work among people, for example guiding visitors in an office building or advertising products in a shop. ibuki can also engage in physical interaction by coordinating its position relative to a human.
Through research and development of these robots, the project aims to realize autonomous robots that converse with humans in everyday situations and to promote the spread of dialogue robots in society.
Figure 1. ERICA
Figure 2. CommU
Figure 3. ibuki