Rachel Fieldhouse

Partially paralysed man uses mind control to feed himself

A partially paralysed man has been able to feed himself - and use his fingers for the first time in 30 years - thanks to recent advances in neuroscience, software and robotics.

Using two robotic arms, one holding a fork and the other a knife, the man made subtle movements with his fists in response to prompts from a computerised voice, such as “select cut location”, directing the arms to cut a bite-sized piece of cake in front of him.

With another subtle gesture at the command, “moving food to mouth”, the fork was aligned with his mouth.

In less than 90 seconds, the man, who has very limited upper body mobility, fed himself some cake using his mind and some robotic hands.

To make it all happen, a team of scientists from Johns Hopkins Applied Physics Laboratory (APL) and the Department of Physical Medicine and Rehabilitation (PMR) at the Johns Hopkins School of Medicine developed a brain-machine interface (BMI) that allows for direct communication between the brain and a computer.

The computer decodes neural signals and ‘translates’ them to perform various functions, such as controlling robotic prosthetic arms.

It is the culmination of more than 15 years of research between the two groups as part of the Revolutionizing Prosthetics program, allowing a person to manoeuvre a pair of prosthetic arms with minimal mental input.

“This shared control approach is intended to leverage the intrinsic capabilities of the brain machine interface and the robotic system, creating a ‘best of both worlds’ environment where the user can personalise the behaviour of a smart prosthesis,” said Dr Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development Department. 

“Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines.”

Their findings, published in the journal Frontiers in Neurology, also show how robotics can be used to help people with disabilities.

“In order for robots to perform human-like tasks for people with reduced functionality, they will require human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton,” Dr David Handelman, the paper’s first author and a senior roboticist at APL, explained. “Our goal is to make it easy for the user to control the few things that matter most for specific tasks.”
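That division of labour - the user supplying only the few decisions that matter while the robot fills in the rest of the motion - can be sketched as a simple prompt-driven sequence. The prompt names below echo those quoted in this article; everything else (the function, the option indices, the logging) is an invented illustration, not the team's actual software.

```python
# Hypothetical sketch of shared control: at each voice prompt the user
# contributes one low-dimensional choice, and the system expands it
# into a full robot motion. Prompts echo the article; the rest is invented.

PROMPTS = ["select cut location", "moving food to mouth"]

def run_sequence(user_choices):
    """Pair each prompt with one user input and 'execute' a motion."""
    log = []
    for prompt, choice in zip(PROMPTS, user_choices):
        # The user picks a single option; the robot plans and performs
        # the complex trajectory that option implies.
        log.append(f"{prompt}: user picked option {choice}; robot executes motion")
    return log

for step in run_sequence([2, 0]):
    print(step)
```

The design point is dimensionality: rather than asking the user to steer every joint of a complex robot skeleton, the interface asks for one bit of intent per step and lets the machine handle the dexterity.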

Dr Pablo Celnik, the project’s principal investigator from PMR, said: “The human-machine interaction demonstrated in this project denotes the potential capabilities that can be developed to help people with disabilities.”


Image: Johns Hopkins University APL

Tags:
Body, Technology, Robotics, Science, Research, Disability