Engineers at the University of California, Berkeley have developed a device that can recognize hand gestures based on electrical signals detected in the forearm. The system combines wearable biosensors with artificial intelligence (AI), and it could lead to better control of prosthetics and improved human-computer interaction.
Ali Moin, a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences, was part of the design team and is co-first author of the research paper, published online on Dec. 21 in the journal Nature Electronics.
“Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers,” said Moin. “Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”
Hand Gesture Recognition System
The team worked with Ana Arias, professor of electrical engineering at UC Berkeley, during the development of the system. Together, they designed a flexible armband that reads electrical signals at 64 different points on the forearm. These signals are fed to a chip programmed with an AI algorithm that associates the signal patterns in the forearm with specific hand gestures.
The algorithm was able to identify 21 individual hand gestures.
“When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibers in your arms and hands,” Moin said. “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibers were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”
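The pattern recognition Moin describes can be illustrated with a toy sketch. This is not the paper’s actual method; it assumes a common surface-EMG feature (per-channel root-mean-square amplitude) and a simple nearest-prototype classifier, and it uses an invented 4-channel example in place of the real 64-electrode array:

```python
import math

def rms_features(window):
    """Root-mean-square amplitude per channel over one time window.

    `window` is a list of per-channel sample lists. RMS is a standard
    surface-EMG feature, used here purely for illustration.
    """
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def nearest_gesture(features, prototypes):
    """Return the gesture whose stored prototype is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prototypes, key=lambda g: dist(features, prototypes[g]))

# Toy demo with 4 channels: a "fist" pattern activates the first channels.
prototypes = {
    "fist": [0.9, 0.8, 0.1, 0.1],
    "open": [0.1, 0.1, 0.9, 0.8],
}
window = [[0.85, -0.9], [0.8, -0.75], [0.1, -0.12], [0.05, -0.1]]
print(nearest_gesture(rms_features(window), prototypes))  # → fist
```

The real system learns the prototypes from training data rather than hard-coding them, but the core idea is the same: a dense electrode array yields a feature vector whose overall pattern, not any single electrode, identifies the gesture.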
The AI algorithm first learns to associate the electrical signals in the arm with their corresponding hand gestures, which requires the user to wear the device while making those gestures. The system then goes a step further by relying on a hyperdimensional computing algorithm, an AI approach that continuously updates its model, allowing the system to correct itself as conditions change, for example with arm movement or sweat.
“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model,” Moin said. “We were able to greatly improve the classification accuracy by updating the model on the device.”
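A minimal sketch can show the flavor of hyperdimensional computing and why on-device updating is cheap. This is a simplified illustration, not the paper’s implementation: channels are encoded as random high-dimensional bipolar vectors, active channels are bundled into a gesture hypervector, and class prototypes are kept as running sums so new examples can update the model in place. The channel counts and names are invented for the demo:

```python
import random

DIM = 10_000  # hypervector dimensionality, typical for HD computing
rng = random.Random(0)

def rand_hv():
    """Random bipolar (+1/-1) hypervector, one per electrode channel."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def encode(active_channels, channel_hvs):
    """Bundle active channels: elementwise sum, then sign-threshold."""
    acc = [0] * DIM
    for ch in active_channels:
        for i, v in enumerate(channel_hvs[ch]):
            acc[i] += v
    return [1 if a >= 0 else -1 for a in acc]

def update(prototype_sums, label, hv):
    """On-line learning step: add the example into the class's running sum."""
    sums = prototype_sums.setdefault(label, [0] * DIM)
    for i, v in enumerate(hv):
        sums[i] += v

def classify(hv, prototype_sums):
    """Pick the class whose (sign-thresholded) prototype is most similar."""
    def similarity(a, b):
        return sum(x * y for x, y in zip(a, b))
    protos = {g: [1 if s >= 0 else -1 for s in sums]
              for g, sums in prototype_sums.items()}
    return max(protos, key=lambda g: similarity(hv, protos[g]))

# Toy demo: 8 channels, "fist" activates channels 0-3, "open" 4-7.
channel_hvs = [rand_hv() for _ in range(8)]
protos = {}
update(protos, "fist", encode([0, 1, 2, 3], channel_hvs))
update(protos, "open", encode([4, 5, 6, 7], channel_hvs))
print(classify(encode([0, 1, 2], channel_hvs), protos))  # → fist
```

Because updating a prototype is just vector addition, the model can keep adapting to drifting signals on the device itself, which is the property Moin highlights.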
Computing Locally on the Chip
Another notable feature of the device is that all of the computing takes place on the chip, so no personal data is transmitted to other devices. This makes computation faster and keeps the user’s biological data private.
Jan Rabaey is the Donald O. Pedersen Distinguished Professor of Electrical Engineering at UC Berkeley and senior author of the paper.
“When Amazon or Apple creates their algorithms, they run a bunch of software in the cloud that creates the model, and then the model gets downloaded onto your device,” said Rabaey. “The problem is that then you’re stuck with that particular model. In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it.”
According to Rabaey, the device could become commercialized after just a few slight changes.
“Most of these technologies already exist elsewhere, but what’s unique about this device is that it integrates the biosensing, signal processing and interpretation, and artificial intelligence into one system that is relatively small and flexible and has a low power budget,” Rabaey said.