The 2023 International Robot Exhibition was held at Tokyo Big Sight for four days, from November 29 to December 2. HBA exhibited a robot × digital twin solution that utilizes IOWN, jointly promoted with NTT Comware, and demonstrated how remote control of robots can improve maintenance efficiency.
NTT Comware is focusing on a CPS (cyber-physical system) model based on IOWN that combines robots and digital twins. The CPS model fuses real space and cyberspace, making it possible to control the real world from cyberspace. The two companies are currently working on automating data center operations using HBA's automatic patrol inspection robot "HSR."
The demo simulated the inside of a data center. The robot used AI image recognition to reflect information such as server specifications onto the digital twin in real time.
It is also possible to read meter values and display them on the digital twin, allowing server status to be understood remotely.
The AI judges the meter reading and sends an alert email if the value is abnormal.
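The article does not describe how this alerting is implemented, but the idea can be sketched as a simple threshold check followed by an email notification. Everything here is an illustrative assumption: the normal range, the SMTP host, and the addresses are placeholders, not details from the demo.

```python
# Hypothetical sketch: check a meter value (as read by image-recognition AI)
# against a normal range, and send an alert email when it is abnormal.
import smtplib
from email.message import EmailMessage

NORMAL_RANGE = (18.0, 27.0)  # assumed acceptable range for the meter

def check_meter(value: float) -> bool:
    """Return True if the reading is within the normal range."""
    low, high = NORMAL_RANGE
    return low <= value <= high

def send_alert(value: float, smtp_host: str = "mail.example.com") -> None:
    """Compose and send an alert email about an abnormal reading."""
    msg = EmailMessage()
    msg["Subject"] = f"Meter alert: abnormal reading {value}"
    msg["From"] = "robot@example.com"
    msg["To"] = "ops@example.com"
    msg.set_content(f"The patrol robot read an abnormal meter value: {value}")
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

reading = 31.5  # example value returned by the AI meter reader
if not check_meter(reading):
    print(f"abnormal reading {reading}")
    # send_alert(reading)  # disabled here: requires a reachable SMTP host
```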
Additionally, a collision prevention function was implemented on the digital twin side.
Atsushi Kawaguchi, General Manager of NTT Comware's Network Cloud Business Headquarters, explained, "The robot body originally has a collision prevention function, but the arm that actually performs operations does not. The robot inspects servers, storage, and other equipment, so preventing arm collisions is essential. Therefore, when the robot on the digital twin is operated, the real robot moves in sync with it, and we added a function to avoid collisions on the digital twin side. Controlling the robot arm on the real side would require many sensors, but this can be achieved easily on the digital twin: because the precise scale is reproduced, collision avoidance can be simulated."
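The point about simulating collision avoidance on the twin can be illustrated with a minimal sketch: because the twin reproduces the scene at precise scale, a planned arm pose can be tested against equipment geometry in software before the real arm moves, instead of relying on physical sensors. The bounding-box model and coordinates below are illustrative assumptions, not NTT Comware's implementation.

```python
# Minimal sketch: collision check on the digital twin side using
# axis-aligned bounding boxes in twin coordinates (meters).
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box in the twin's coordinate space."""
    min_x: float; min_y: float; min_z: float
    max_x: float; max_y: float; max_z: float

    def intersects(self, other: "Box") -> bool:
        # Boxes collide if they overlap on all three axes.
        return (self.min_x < other.max_x and self.max_x > other.min_x and
                self.min_y < other.max_y and self.max_y > other.min_y and
                self.min_z < other.max_z and self.max_z > other.min_z)

def arm_move_is_safe(arm: Box, obstacles: list) -> bool:
    """Simulate the move on the twin; command the real arm only if safe."""
    return not any(arm.intersects(o) for o in obstacles)

rack = Box(0.0, 0.0, 0.0, 0.6, 1.0, 2.0)      # a server rack in the twin
arm_pose = Box(0.5, 0.4, 1.0, 0.9, 0.6, 1.2)  # planned arm swept volume
print(arm_move_is_safe(arm_pose, [rack]))     # → False (overlaps the rack)
```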
There was also a demonstration in which a robot pressed the power button on a server to turn it on and off.
A series of processes has been realized in which the robot collects on-site information, converts it into a digital twin, and uses AI to analyze and make decisions, which are then linked to the robot’s actions.
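This closed loop can be sketched as a simple collect → update twin → decide → act cycle. The function names and the decision rule below are purely illustrative stand-ins, not the actual system's APIs.

```python
# Hedged sketch of the loop described above: the robot collects on-site
# data, the digital twin is updated, an AI step analyzes the twin state,
# and the decision is fed back to the robot as an action.

def collect(robot_state: dict) -> dict:
    """Stand-in for on-site data collection via sensors and image AI."""
    return {"server_id": "srv-01", "meter": robot_state["meter"]}

def update_twin(twin: dict, observation: dict) -> dict:
    """Reflect the latest observation onto the digital twin."""
    twin[observation["server_id"]] = observation
    return twin

def decide(twin: dict) -> str:
    """Stand-in AI decision: act when a meter reading is abnormal."""
    for obs in twin.values():
        if obs["meter"] > 30.0:  # assumed abnormality threshold
            return f"inspect {obs['server_id']}"
    return "continue patrol"

twin = {}
action = decide(update_twin(twin, collect({"meter": 31.5})))
print(action)  # → inspect srv-01
```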
This groundbreaking demonstration brought us one step closer to unmanned on-site work using robots.
"Currently, the robot can actually press a button. Within this fiscal year, we would like to add fine-grained control of the coordinate axes and refine the robot's actions. If we can determine coordinates using image recognition AI and the like, the robot will be able to move more accurately. We will also use millimeter waves to speed up network communication with the robot" (Mr. Kawaguchi).
Takanori Yamano, Managing Director of HBA, said, "Until now, we have focused on data acquisition by robots, but going forward we want robots to perform a variety of tasks, and we are implementing various innovations, such as making the robot's arms replaceable."