
Achieving Vibrant Aging Societies: Coexisting with Adaptable AI-enabled Robots. A Dialogue between Yasuhisa Hirata and Yoichi Ochiai.

The Moonshot Research and Development Program, initiated by the Japanese government, aims to create disruptive innovation in Japan by promoting challenging research and development. Its projects address social issues that are difficult to solve but, if solved, are expected to have a significant impact. Among them, Moonshot Goal 3 aims for the "realization of AI robots that autonomously learn, adapt to their environment, evolve in intelligence and act alongside human beings, by 2050," and an array of projects have been adopted to achieve exactly this. One of these R&D projects, "Adaptable AI-enabled Robots to Create a Vibrant Society," aims to develop adaptable, versatile AI robots that can change shape and function to provide appropriate services suited to individual users and environments.

We sat down with Professor and Moonshot Project Manager Yasuhisa Hirata, and with Yoichi Ochiai, a Moonshot ambassador who established xDiversity, a general incorporated association researching usability for a diverse spectrum of people, including elderly and disabled individuals. We asked them about the outlook for technology in caregiving, a field where this project is expected to be particularly useful.

Yasuhisa Hirata: Professor, Department of Robotics, Graduate School of Engineering at Tohoku University. He completed his Master's degree at Tohoku University in 2000, and later obtained his PhD in Engineering. After completing his Master's degree, he worked as a research associate at Tohoku University, a researcher at JST PRESTO, and subsequently became an associate professor and then a professor at Tohoku University. He serves as a member of the AdCom of the IEEE Robotics and Automation Society (RAS) and as a co-chair of the IEEE RAS Technical Committee on Rehabilitation and Assistive Robotics, among other roles.
Yoichi Ochiai: Media artist. Born in 1987, he completed his doctoral program in the Interdisciplinary Information Studies at the University of Tokyo (the first early graduation in the Graduate School of Interdisciplinary Information Studies) and obtained a PhD in Interdisciplinary Information Studies. He is currently the Center Director and Associate Professor at the Digital Nature Group, Institute of Advanced Sciences, at the University of Tsukuba and Head Researcher of the JST CREST xDiversity (Cross Diversity) project. He is an IPA-certified Super Creator and Genius Programmer, and CEO of Pixie Dust Technologies.
Photo credit: © Mika Ninagawa


I want to create a robot that can change its shape like the Flying Nimbus and move around freely.

—Could you each tell us about your areas of expertise and the research topics you have been working on recently?

Hirata: I have conducted a wide range of research, from industrial robots to cooperation between humans and robots. As a student, I studied distributed coordination control for multiple robots, which enables people to easily carry heavy or large objects by appropriately coordinating several robots. This kind of cooperative control technology, both between humans and robots and among multiple robots, can be used not only in industrial and living spaces but also in supporting people with disabilities. In recent years, I have been involved in research on assistive robots used in the nursing-care field, with the aim of developing next-generation care and rehabilitation robots.

The Aobayama Living Lab at Tohoku University, where Hirata PM is affiliated, is developing next-generation care robots and rehabilitation robots.
(Photo provided by Tohoku University)


Ochiai: My research area is computationally redefined nature, or "Digital Nature." Environments that are computationally reconstructed and implemented in society can be reinterpreted as new natural environments. Based on this idea, I have conducted research on metamaterials as materials research, and on systematizing the optimization of phase and amplitude in hologram research. Around 2017, I wanted to do research that is more directly useful to society, and started trying to deliver information to a diverse array of people based on their mental and physical conditions by integrating personal optimization technology using AI with spatial visual-tactile technology. We have also developed retinal-projection technology that allows people to "see" objects regardless of their eyesight, and ways for people who are deaf to enjoy music through vision and vibration.

The Digital Nature Lab at the University of Tsukuba, led by Mr. Ochiai. Various projects are being developed in anticipation of the advent of digital nature.
(Provided by Ochiai)


—Please tell us about the details of this Moonshot Goal 3 project, which is being led by Project Manager Hirata.

Hirata: The title of our project is "Adaptable AI-enabled Robots to Create a Vibrant Society." We mean “robots” in the plural form quite literally, since our goal is that by 2050, a variety of robots will exist in the world, and everyone will be able to use them as parts of the social infrastructure.

We aim to combine robots effectively according to their purpose, people, and environment, and provide the most appropriate support according to each user's lifestyle. There are already various robots being utilized in society, such as humanoid robots and walking assistance robots, but their shapes and roles are fixed. One of the things we are considering creating is an adaptive and versatile robot that can change its shape and form according to the user and situation, and support the user in the most appropriate way.

—What image should people have when you say robots changing according to people and situations?

Hirata: We have coined the term "ROBOTIC NIMBUS" for this. Many of you may know "Kintoun" (the flying nimbus) from stories like Journey to the West; we expect the ROBOTIC NIMBUS to change its shape freely like a cloud, according to the user's physique, degree of disability, and intended use. Like Kintoun, it will come when you call out to it, and when you ride it, it will extend or augment your abilities. People can't fly on their own, but by borrowing the power of the ROBOTIC NIMBUS, they may in the future take on endeavors they couldn't before, and ultimately move as if they were flying. We want to create a new type of robot that has never existed. Of course, we have to overcome various challenges, but we are currently considering how to realize materials that can become soft or hard to match their purpose, and mechanisms that can control that softness to fit the user's condition. We are also researching how to build a system that can reflect the user's intended movements instantly.

ROBOTIC NIMBUS supports human actions.
Three types are under development: HOLDER, WEAR, and LIMBS.


—By the time you achieve this, it seems that the very nature of the interface will have fundamentally changed. I think there are many points that can be connected to your research in such fields, Ochiai-san. What do you think?

Ochiai: One genre I also research frequently is user interfaces that transform organically. Non-powered tools can be used flexibly, sometimes even in quite unreasonable ways. For example, we can stuff rolled-up newspaper inside shoes to absorb moisture, or repurpose things entirely by changing their form. On the other hand, many tools that use electricity cannot be used so freely, so there is certainly a need for development from that perspective. Currently, it is conceivable that a large group of small robots could change formation to take on a specific shape as a whole. However, to achieve this, one runs into problems with batteries and with the actuators that convert energy into movement.

Hirata: You’re absolutely right. In order to create a ROBOTIC NIMBUS that is lightweight yet generates a large force like the Kintoun and can operate for a long time, breakthroughs in batteries and actuators will be crucial.

Ochiai: What kind of support do you expect the adaptable robots to provide people with?

Hirata: The adaptable robots aim to support users by drawing out as much of the users' own abilities as possible, without providing excessive support. For example, in welfare settings such as elderly care, the robot can help users stay independent at things they have trouble with, say, if their vision has slightly deteriorated or they have difficulty walking. This can increase self-efficacy: people will think, "I can still do it." By using robots, users' awareness can change and their behavior can be transformed. And as they become more actively involved in society, we hope society as a whole will become more vibrant.

Ochiai: You were also involved in the COGY project, developing cycling wheelchairs, weren’t you?

Hirata: Yes, it was research we conducted with TESS Corporation a while ago. Many people use wheelchairs because they have weak legs, but COGY can be moved by pedaling. Even without power assistance, once you start pedaling you can keep turning the pedals, so even people with weakened leg strength and poor balance can make it go.

I've heard of one example where a bedridden person, discouraged that they couldn't do anything, was given a COGY by their family and started going outside with it. Eventually they no longer needed the COGY; they began to walk with a cane and even started traveling.

In other words, that person had convinced themselves that they could no longer walk and had placed limits on themselves in their mind. With a little training, they might have been able to walk again without taking too much time, but their motivation had dropped and they couldn't find the energy for rehab. However, when they were given the COGY, and were able to move with their own legs, they thought to themselves, "I thought I couldn't go outside, but I still can," and "maybe I can walk again using a cane." This made them believe in their potential once again. Everyone has potential, and by giving them that little push, you can increase their motivation and self-efficacy. That's why I want to create robots that can provide this kind of support to people.

COGY, a wheelchair that can be moved by pedaling.
Even if one leg is only slightly movable, there is a possibility that the other paralyzed leg can also move again.
This photo shows a sports version developed in collaboration with TESS Co., Ltd.
(Photo provided by Hirata)


What it takes to implement technology that gives everyone in society a sense of self-efficacy.

—As you mentioned earlier, Ochiai-san, you are researching the integration of personal optimization technology using AI and spatial visual-tactile technology to make it usable by people with physical difficulties and disabilities. When doing research in caregiving fields, what do you think are the challenges in introducing new technology?

Ochiai: Actually, I think there are larger challenges than the technology itself. For example, it is often more rational for a caregiving facility to hire one more person than to introduce new equipment. This is one of the fundamental problems the current care industry faces. Due to staffing shortages and issues with the long-term care insurance system, facilities find it difficult to raise wages. As a result, they tend to prioritize securing personnel over introducing new technology for cost optimization. In light of this, I believe that introducing digital transformation (DX) before robotics may be the right approach. For example, communication tools are rarely used for daily record-keeping or sharing information about work, so introducing DX in those areas should be a priority. Robots specialized for bathing assistance, however, do seem to be making progress toward implementation.

Ochiai is a participating member in the project xDiversity (“cross diversity”), which aims to create mechanisms for friendlier problem solving by “crossing” AI with the "differences" in people and environments.
He and his team are developing devices such as self-driving wheelchairs and devices that represent sound as light and vibration.
(Photo provided by Ochiai)


—It seems that we also need to reconsider the system design around caregiving, in addition to the progress of technology.

Ochiai: I agree. Changing laws and systems would probably be faster. Under the current system, it is difficult, in my opinion, to have robots perform caregiving tasks within the cost range each facility can afford. That's why, as a participating member of the "Building a Comprehensive Social Security System for All Generations Conference," I propose comprehensive reform of the tax system, social security, and employment, along with a push-type benefit system. Initially, I was involved as a member of the visionary conference that set the Moonshot Goals, and I proposed a greater role for robots because I thought promoting the social implementation of robotics would benefit Japan, given its aging society and declining birthrate. However, the issue to be solved first was not technology but the law. That was the Achilles' heel.

Hirata: Apart from system design, I would like to talk about the challenges I feel from a different perspective. Many different technologies have already been developed. As Ochiai-san mentioned, there are already robots that assist with bathing and standing, smart speakers, and bots that open curtains and operate appliances, and some of them have already been introduced. However, the overall coordination is still shaky. Users are diverse, and each person has different thoughts and needs, so it is necessary to combine existing devices effectively and support people in a customized way. The current challenge is to connect these devices effectively. If each robot or device has to be set up and operated one by one, the workload of the caregivers will increase, and labor costs will also go up.

Another related challenge is that there are still few caregivers with knowledge of digital devices. Many people do not know that they can change the height of a bed or turn the TV on and off by giving voice instructions to a smart speaker or chatbot. So I think it is necessary to develop human resources who have sufficient caregiving experience and are also well-versed in recent technology. The important first step is to establish a system that can use the technologies already available, and to create a framework that can coordinate multiple sensors, devices, and robots in operation. Ultimately, it would be ideal to incorporate the innovative technology developed through the Moonshot Goals into these systems later on.
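The kind of coordination framework described here, one layer that ties many devices together so caregivers do not have to operate each one individually, can be sketched very roughly in code. This is purely illustrative: the device names, commands, and classes below are invented, not part of the project.

```python
# Hypothetical sketch of a device-coordination layer: every device
# implements one shared interface, and a coordinator routes a single
# instruction to whichever device understands it.
# All device names and commands are invented for illustration.

class Device:
    """Common interface every connected device implements."""
    name = "device"
    commands = ()  # instructions this device understands

    def execute(self, command: str) -> str:
        raise NotImplementedError

class CareBed(Device):
    name = "care bed"
    commands = ("raise bed", "lower bed")

    def execute(self, command: str) -> str:
        return f"{self.name}: executing '{command}'"

class CurtainBot(Device):
    name = "curtain bot"
    commands = ("open curtains", "close curtains")

    def execute(self, command: str) -> str:
        return f"{self.name}: executing '{command}'"

class Coordinator:
    """Registers devices and dispatches instructions, so the caregiver
    does not have to set up and operate each device one by one."""

    def __init__(self):
        self._devices = []

    def register(self, device: Device) -> None:
        self._devices.append(device)

    def handle(self, instruction: str) -> str:
        for device in self._devices:
            if instruction in device.commands:
                return device.execute(instruction)
        return f"no device can handle '{instruction}'"

coordinator = Coordinator()
coordinator.register(CareBed())
coordinator.register(CurtainBot())
print(coordinator.handle("raise bed"))      # care bed: executing 'raise bed'
print(coordinator.handle("open curtains"))  # curtain bot: executing 'open curtains'
```

In a real facility the registry would of course sit on top of heterogeneous wireless protocols, which is exactly where the standardization work mentioned in this conversation comes in.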

Ochiai: It is also a challenge that field workers have trouble customizing hardware and software that operate wirelessly. With hardware, you can connect things with duct tape if you need to; software, on the other hand, is far less forgiving, and that rigidity can actually get in the way.

Hirata: Currently, I am working with researchers from the Swiss Federal Institute of Technology in Zurich (ETH Zurich) to standardize software. I am not sure we can go as far as full standardization, but at least at the research level it has become relatively easy to connect devices. We are aiming to simplify this even further, so that staff at care facilities can connect devices on the spot, saying, "In this situation, we can connect this to this."

A scene from a discussion between Hirata and researchers from ETH Zurich on a Moonshot R&D project.
Hirata is collaborating with overseas researchers to standardize software for connecting devices.
(Photo provided by Hirata)


—Furthermore, if we make use of language models and speech recognition technology that have been rapidly developing in recent years, such as GPT-3 (*1), we may soon have a future where we can simply say "I want to do this" and have it immediately converted and executed as a program.

Ochiai: GPT-4 will probably be released soon as well. (This interview was conducted in February.) There have already been cases where drones and robots were controlled by human voice commands using technologies like Whisper, and I find that very interesting. I believe caregivers will be using generative AI like ChatGPT in the field in the near future. DX, which simplifies procedures, has the potential to significantly improve work efficiency by utilizing such technologies.
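The voice-command pipeline Ochiai describes typically has two stages: a speech model turns audio into text, then the text is mapped to a robot action. A minimal sketch of the second stage, with entirely invented intents and actions, might look like this (the Whisper call shown in the comment is how the transcript would be produced in a real system):

```python
# Sketch of voice-command control: map a transcribed utterance to a
# robot action via keyword matching. In a real system the transcript
# would come from a speech-recognition model such as Whisper, e.g.:
#   import whisper
#   text = whisper.load_model("base").transcribe("command.wav")["text"]
# Here we operate on an already-transcribed string. The intents and
# actions below are invented for illustration.

INTENTS = {
    "take off": ("drone", "takeoff"),
    "land": ("drone", "land"),
    "come here": ("robot", "approach_user"),
}

def parse_command(transcript: str):
    """Return a (device, action) pair for the first intent found,
    or None if the utterance matches nothing."""
    text = transcript.lower()
    for phrase, action in INTENTS.items():
        if phrase in text:
            return action
    return None

print(parse_command("Please take off now"))  # ('drone', 'takeoff')
print(parse_command("Come here, please"))    # ('robot', 'approach_user')
```

Keyword matching is the simplest possible mapping; a production system would more likely hand the transcript to a language model to resolve ambiguous phrasing, which is where generative AI fits into this picture.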

Hirata: Yes, that's right. I recently spoke with people in Kyushu who consult on matching needs and seeds, that is, on-site needs and available technologies, for nursing-care robots. They tracked direct work, meaning work done face-to-face with care recipients, and indirect tasks such as transporting items, arranging vehicles, and performing clerical work. When they monitored and sorted work time, they found that indirect work accounted for more than 50%. In other words, if DX and similar tools can take over some of the indirect work, caregivers should be able to focus more on direct caregiving.

—What do you think is important for users when adapting to new technologies, including adaptable AI robots, in society?

Hirata: Something I heard when talking with people in the caregiving field really opened my eyes: even when users are told to wear a certain sensor or use a certain robot because they have gotten older or their physical condition has deteriorated, they are unlikely to embrace it. Using these technologies forces them to acknowledge their own aging and decline, and it saddens them to see their deteriorating condition right in front of their eyes.

On the other hand, young people, even those up to middle age, have started to use wearable devices such as smartwatches and personal mobility devices in their daily lives. If they have a habit of using such technology in their lives from a young age, it will be natural for them to accept and utilize technology-based caregiving as an extension of the technology they have been using up to that point.

Ochiai: For sure. Prompt engineering (*2) should remain easy as people age, if they are accustomed to technology. By the time adaptable AI robots are implemented in society, generative AI will have evolved even further, and the need for procedural work will shrink. If that happens, people will be able to immerse themselves in expressing their creativity. Everyone will be able to draw as many pictures as they like, and even create 3D models. Programming should also be doable with just a few words. There is hope all around. It would be great if the world turns out like that.

Hirata: Reading Ochiai-san's book (*3) on "aging" while doing this research with the 2050 goal in mind inspired me to think again about aging and how we might coexist with robots. Everyone eventually gets older, and things gradually become harder to do, or daily life becomes challenging. Even before we get old, people tend to think somewhat pessimistically about their future, thinking, "Oh, I will also become like that someday." But I believe that by incorporating some form of technology, people can continue to live with a sense of self-efficacy even as they age.

Another important thing is that the choices in life after retirement may increase. If the amount of things you can do decreases and self-efficacy is lost, choices will naturally decrease. But with the development of these technologies, I think it will become an era where people can do more of what they want when they get old. This era will not only be about actively going out and taking action, but also about having the option to participate in social activities using cybernetic avatars, as proposed in the Moonshot Goal 1. In the future, far beyond this project, we can create a society where everyone can achieve the way of life they desire. I am working on this project with the hope of creating a society where people can be positive about aging.

*1: Short for Generative Pre-trained Transformer 3, a natural language processing model developed by the AI research organization OpenAI.
*2: This refers to instructions given to generative AI, such as image or text generation AI, to generate content and improve its quality.
*3: "Facing 'Aging': New Growth in an Ultra-Aged Society by Yoichi Ochiai, 34 years old” (2021)

Interview / Text: Mirei Takahashi