Breakthrough: Controlling Avatar Robot By Thought

By Helen Thomson, New Scientist, July 9, 2012

Imagine trekking across the Sahara, popping in to see a friend in France, and admiring Niagara Falls, all without moving from your home. And what if you could do this despite being unable to move your limbs? For the first time, researchers have used fMRI – which detects your brain activity in real time – to allow someone to embody a robot hundreds of kilometers away using thought alone.

“The ultimate goal is to create a surrogate, like in Avatar, although that’s a long way off yet,” says Abderrahmane Kheddar, director of the CNRS-AIST joint robotics laboratory at the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan. He is part of an international team that hopes to use this kind of technology to give healthy people and those who are “locked in” – unable to move but fully conscious – the ability to interact with the world using a surrogate body.

Teleoperated robots, those that can be remotely controlled by a human, have been around for decades. Kheddar and his colleagues are going a step further. “True embodiment goes far beyond classical telepresence, by making you feel that the thing you are embodying is part of you,” says Kheddar. “This is the feeling we want to reach.”

To attempt this feat, researchers with the international Virtual Embodiment and Robotic Re-embodiment project used fMRI to scan the brain of university student Tirosh Shapira as he imagined moving different parts of his body.

Ori Cohen and Doron Friedman from the Advanced Virtuality Lab at the Interdisciplinary Center in Herzliya, Israel, and colleagues first took Shapira through several training stages in which he attempted to direct a virtual avatar by thinking of moving his left or right hand or his legs. The scanner works by measuring changes in blood flow to the brain's primary motor cortex; from these signals the team built an algorithm that could distinguish each imagined movement. The commands were then sent via an internet connection to a small robot at the Béziers Technology Institute in France.

The set-up allowed Shapira to control the robot in near real time with his thoughts, while a camera on the robot’s head allowed him to see from the robot’s perspective. When he thought of moving his left or right hand, the robot moved 30 degrees to the left or right. Imagining moving his legs made the robot walk forward.
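The control loop described above amounts to mapping a decoded motor-imagery label to a fixed robot action. A minimal sketch of that mapping, with hypothetical names throughout (the article does not describe the team's actual software interface):

```python
def decode_to_command(label: str) -> str:
    """Map a decoded motor-imagery label to a robot action.

    The labels and command strings here are illustrative placeholders,
    not the project's real API.
    """
    mapping = {
        "left_hand": "turn_left_30",    # robot turns 30 degrees left
        "right_hand": "turn_right_30",  # robot turns 30 degrees right
        "legs": "walk_forward",         # robot walks forward
    }
    return mapping.get(label, "idle")   # unrecognised labels leave the robot idle
```

In the real system these commands were then sent over an internet connection to the robot in France, with the head camera streaming video back to the operator.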

It takes a little while for the robot to register the thought. “There is a small delay between the start of the neural activity and when we can optimally classify a volunteer’s intentions,” says Cohen. But he says that subjects can adjust for this by thinking of the intended movement ahead of time.

Shapira took part in three trials, including one in which he was able to move the robot around freely and another where he was instructed to follow a person around a room at the French lab. In the third trial he successfully piloted his avatar to locate a teapot placed somewhere in the room. To test the extent of his feelings of embodiment, the researchers also surprised him with a mirror. “I really felt like I was there,” Shapira says. “At one point the connection failed. One of the researchers picked the robot up to see what the problem was and I was like, ‘Oi, put me down!'”

The brain is very easily fooled into incorporating an external entity as its own. Over a decade ago, psychologists discovered that they could convince people that a rubber hand was their own just by putting it on a table in front of them and stroking it in the same way as their real hand. “We’re looking at what kinds of sensory illusions we can incorporate at the next stage to increase this sense of embodiment,” says Kheddar. One such illusion might involve stimulating muscles to create the sensation of movement.

The next step is to improve the surrogate. Replacing the current robot with the HRP-4, made by Kawada Industries in Japan, will increase the feeling of embodiment as it is roughly the height of an adult human and has a more stable and dynamic walk, says Kheddar.

The researchers are also fine-tuning their algorithm to look for patterns of brain activity, rather than simply areas that are active. This will allow each thought process to control a greater range of movements. “For example, you could think of moving your fingers at different speeds and we could correspond that with different speeds of walking or turning,” says Cohen, who presented the results of the embodiment trials at BioRob 2012 in Rome, Italy, last week.
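The shift from "which area is active" to "which pattern of activity is present" can be illustrated with a toy template-matching classifier: each candidate thought is represented by a stored voxel-activity pattern, and an incoming measurement is assigned to the best-correlated template. This is only a sketch of the general idea, not the team's actual algorithm, and all names are hypothetical.

```python
import numpy as np

def nearest_pattern(activity: np.ndarray, templates: dict) -> str:
    """Return the label whose stored voxel pattern best matches `activity`.

    Matching uses Pearson correlation across the whole pattern, rather
    than thresholding the overall amplitude of one brain region.
    """
    def corr(a: np.ndarray, b: np.ndarray) -> float:
        a = a - a.mean()
        b = b - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    return max(templates, key=lambda label: corr(activity, templates[label]))
```

Because distinct patterns can coexist within the same active region, a pattern-based decoder can, in principle, separate more classes of thought (fast versus slow finger movement, say) than a region-based one.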

So far, only healthy people have embodied the surrogate. Next, the researchers, along with Rafael Malach’s group at the Weizmann Institute of Science, in Rehovot, Israel, hope to collaborate with groups such as Adrian Owen’s at the University of Western Ontario in Canada to test their surrogate on people who are paralysed or locked in.

On the inside, looking out

Tirosh Shapira stepped into an fMRI scanner in Israel and took on the guise of a little robot in France. He was one of the first people to embody a surrogate robot using this particular method of mind-reading. So what did it feel like?

“It’s amazingly engaging,” he says. “Even in the training phase where you get a kind of virtual avatar and you learn to move it around using your thoughts, you get loads of enthusiasm for the whole process.”

Once you start controlling the robot, it gets much better. “It was mind-blowing. I really felt like I was there, moving around,” Shapira says. It’s not an easy job, though: “You need to concentrate, and you have to calculate a few steps in advance because there’s a small delay between thinking of a movement and it actually happening. But once you get used to it you feel like a puppet master.”

To create a left turn, right turn or leg movements, Shapira found it helpful to think about very specific actions. This enabled the computer to more easily recognise the activated areas of his brain. “I imagined turning the knob of a faucet with my right hand and a gear with my left hand. It worked best when I thought about everything in really vivid detail, like what the faucet felt like to touch.”

While Shapira was controlling the robot, the French team surprised him with one last trick. “I turned around and they’d put a mirror in front of me,” he says. He caught the first glance of his reflection. “I thought, ‘oh I’m so cute, I have blue eyes’, not ‘that robot is cute’. It was amazing.”

Photo by Béziers Technology Institute
