This means that a computer can be operated without a keyboard, video games can be played without a controller, and cars can be driven without a steering wheel, the developers promise.
Signals from 64 points on the forearm
Ali Moin, who developed the system as a doctoral student with Ana Arias, professor of electrical engineering, primarily has prosthesis control in mind, but in principle the technology can be used to interact with any electronic device. “Prosthetics is an important application of this technology, but it also offers a very intuitive way of communicating with computers,” says Moin.
Gesture control of devices and prostheses can also be implemented in other ways, for example with a camera, but Moin’s method is more elegant and easier to put into practice.
The sensor is a flexible armband that registers electrical muscle signals at 64 points on the forearm. A processor running artificial-intelligence software evaluates these signals and converts them into control commands for devices and prostheses. After a training phase, the system reliably recognized 21 different gestures, including a thumbs-up, a fist, movements of individual fingers and a flat hand.
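The article does not say which learning method the software uses, so purely as an illustration, here is a minimal sketch of the general idea: a 64-channel signal sample is compared against one stored prototype per gesture, and the closest prototype wins. All names, gestures and data below are hypothetical, and a nearest-centroid classifier stands in for whatever the developers actually use.

```python
import numpy as np

N_CHANNELS = 64                                 # electrodes on the armband
GESTURES = ["thumbs_up", "fist", "flat_hand"]   # 3 of the 21 gestures

rng = np.random.default_rng(0)

# Hypothetical training data: one amplitude feature per channel,
# ten recorded repetitions per gesture (synthetic here).
train = {g: rng.normal(loc=i, scale=0.1, size=(10, N_CHANNELS))
         for i, g in enumerate(GESTURES)}

# "Training phase": store one prototype (centroid) per gesture.
centroids = {g: samples.mean(axis=0) for g, samples in train.items()}

def classify(signal):
    """Return the gesture whose prototype is closest to the 64-channel sample."""
    return min(centroids, key=lambda g: np.linalg.norm(signal - centroids[g]))

# A fresh sample resembling the "fist" training data is recognized as such.
sample = rng.normal(loc=1, scale=0.1, size=N_CHANNELS)
print(classify(sample))  # fist
```

In a real system the features would come from filtered electromyography signals rather than random numbers, but the train-then-match structure is the same.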
Improving with training
Since the signals differ from person to person, the system has to adapt to its user through training. The software, however, also learns on its own: if, for example, the electrical signals associated with a certain hand movement change because the user’s arm is sweaty or raised overhead, the algorithm can incorporate this new information into its model.
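The article gives no details of this adaptation mechanism, but the idea of folding new signal data into an existing model can be sketched with a simple running-average update of a stored gesture prototype. Everything below is a hypothetical illustration, not the developers' actual algorithm.

```python
import numpy as np

N_CHANNELS = 64
rng = np.random.default_rng(1)

# Hypothetical stored prototype for one gesture, e.g. "fist".
centroid = np.ones(N_CHANNELS)

def adapt(centroid, new_sample, rate=0.2):
    """Shift the stored prototype a little toward a newly observed sample.

    Repeated small updates let the model follow gradual signal drift,
    e.g. when the skin becomes sweaty or the arm is held overhead.
    """
    return (1 - rate) * centroid + rate * new_sample

# The signals have drifted upward; feed in a few confirmed new samples.
for _ in range(20):
    centroid = adapt(centroid, rng.normal(loc=1.5, scale=0.05, size=N_CHANNELS))

print(round(float(centroid.mean()), 1))  # the prototype has moved close to 1.5
```

The update rate trades stability against responsiveness: a small rate resists noise, a large one tracks drift faster.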
Most of the techniques used already exist, according to Jan Rabaey, the Donald O. Pedersen Distinguished Professor of electrical engineering and head of the development team. What is unique, he says, is that biosensors, signal processing and evaluation, and artificial intelligence were integrated into a closed system that is small, flexible and consumes little power.