
Researcher Envisions Interactive Cyber-Physical Human (iCPH) Platform

Image: Tokyo University of Science

Professor Eiichi Yoshida of the Tokyo University of Science has put forward an intriguing idea of an interactive cyber-physical human (iCPH).

Humans can naturally perform complex tasks such as sitting down and picking up items. However, these activities involve a range of movements and require multiple contacts, which can prove difficult for robots. The iCPH could help solve this problem.

Understanding and Generating Human-Like Systems

The new platform can help researchers understand and generate human-like systems that perform a variety of contact-rich, whole-body motions.

The work was published in Frontiers in Robotics and AI.

“As the name suggests, iCPH combines physical and cyber elements to capture human motions,” Prof. Yoshida says. “While a humanoid robot acts as a physical twin of a human, a digital twin exists as a simulated human or robot in cyberspace. The latter is modeled through techniques such as musculoskeletal and robotic analysis. The two twins complement each other.”

Prof. Yoshida addresses several questions with the framework, such as:

  • How can humanoids mimic human motion?
  • How can robots learn and simulate human behaviors?
  • How can robots interact with humans smoothly and naturally?

The iCPH Framework

The first part of the iCPH framework measures human motion by quantifying the movement of various body parts. It also records the sequence of contacts made by a human.

The framework enables the generic description of various motions through differential equations, as well as the generation of a contact motion network. A humanoid can then act upon this network.
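To make the idea of a contact motion network more concrete, the sketch below (not taken from the paper) represents contact states as nodes of a directed graph and observed transitions as weighted edges; the state names and class design are made up purely for illustration.

```python
# Hypothetical sketch: a contact motion network as a directed graph.
# Nodes are contact states (which body parts touch which surfaces);
# edges count observed transitions, e.g. from motion-capture data.
from collections import defaultdict

class ContactMotionNetwork:
    def __init__(self):
        # transitions[prev_state][next_state] = number of times observed
        self.transitions = defaultdict(lambda: defaultdict(int))

    def add_sequence(self, contact_states):
        """Record one observed sequence of contact states (e.g. one sit-down motion)."""
        for prev, nxt in zip(contact_states, contact_states[1:]):
            self.transitions[prev][nxt] += 1

    def next_states(self, state):
        """Candidate next contact states, most frequently observed first."""
        counts = self.transitions[state]
        return sorted(counts, key=counts.get, reverse=True)

# Example: a simplified "sit down on a chair" contact sequence.
net = ContactMotionNetwork()
net.add_sequence(["feet", "feet+right_hand_on_armrest", "feet+buttocks", "buttocks+back"])
print(net.next_states("feet"))  # -> ['feet+right_hand_on_armrest']
```

A humanoid could, in principle, traverse such a graph to plan which contacts to make and break, and in what order.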

When it comes to the digital twin, it learns the network through model-based and machine learning approaches. The two are connected by analytical gradient computation, and continual learning teaches the simulated robot how to perform the contact sequence.
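The article does not spell out how the model-based and learned components are coupled, but the general idea of feeding analytically computed gradients of a physical model into an online learning loop can be sketched as follows. The linear spring contact model, the stiffness parameter, and the step size are toy assumptions for illustration, not the framework's actual formulation.

```python
# Hypothetical sketch: refining a digital twin's contact parameter online
# using an analytically derived gradient, in a continual-learning style loop.
import numpy as np

def predicted_contact_force(stiffness, penetration):
    """Simple linear spring contact model: f = k * d."""
    return stiffness * penetration

def analytic_gradient(stiffness, penetration, measured_force):
    """d/dk of the squared error between predicted and measured contact force."""
    error = predicted_contact_force(stiffness, penetration) - measured_force
    return 2.0 * error * penetration

stiffness = 500.0          # initial guess in the digital twin
true_stiffness = 800.0     # "real" value exhibited by the physical system
step = 500.0               # step size tuned for this toy example
rng = np.random.default_rng(0)

for _ in range(200):
    d = rng.uniform(0.001, 0.01)                        # observed penetration depth [m]
    f_meas = true_stiffness * d + rng.normal(0.0, 0.1)  # noisy measured force [N]
    stiffness -= step * analytic_gradient(stiffness, d, f_meas)

print(f"estimated stiffness: {stiffness:.1f}")  # should end close to 800.0
```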

The third part of the iCPH enriches the contact motion network through data augmentation before applying vector quantization. This technique extracts the symbols expressing the language of contact motion, enabling the generation of contact motions in situations the robot has not yet experienced.
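Vector quantization itself is a standard technique: continuous feature vectors are mapped to the nearest entry of a learned codebook, and a motion then reads as a sequence of codebook indices, i.e. symbols. The sketch below uses made-up contact features and k-means as the codebook learner; it only illustrates that step, not the paper's specific pipeline.

```python
# Hypothetical sketch: vector-quantizing contact-motion features into symbols.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Stand-in for per-frame contact-motion features (e.g. contact forces,
# body-part positions): 300 frames x 6 features of random data.
features = rng.normal(size=(300, 6))

# Learn a small codebook; each cluster centre acts as one "symbol".
codebook = KMeans(n_clusters=8, n_init=10, random_state=0).fit(features)

# A motion becomes a sequence of symbols (codebook indices), which can be
# recombined to compose contact motions for unfamiliar situations.
symbols = codebook.predict(features[:20])
print(symbols)  # e.g. [3 7 7 1 ...]
```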

All of this means robots can explore unknown environments while interacting with humans by using smooth motions and many contacts.

Prof. Yoshida puts forward three challenges for the iCPH, pertaining to the generic description, continual learning, and symbolization of contact motion. For iCPH to be realized, all three must be addressed.

“The data from iCPH will be made public and deployed to real-life problems for solving social and industrial issues. Humanoid robots can release humans from many tasks involving severe burdens and improve their safety, such as lifting heavy objects and working in hazardous environments,” says Prof. Yoshida. “iCPH can also be used to monitor tasks performed by humans and help prevent work-related ailments. Finally, humanoids can be remotely controlled by humans through their digital twins, which will allow the humanoids to undertake large equipment installation and object transportation.”

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.