[That's it] Today, we control devices by thought and waves

by bold-lichterman

In recent days, technological innovation seems to have turned a corner. While Intel has just created an investment fund dedicated to perceptual computing, several impressive advances could revolutionize the field of human-machine interfaces: researchers at the University of Washington have made it possible to control electronic devices remotely using simple gestures, while researchers at the University of Minnesota have developed a technique for remotely piloting a small drone using neuronal activity alone.

Wi-Fi waves that allow you to control a device remotely

Motion detection is not a new technology. We already know the Wii console, from the manufacturer Nintendo, capable of detecting the position, orientation, and movements of its controller for a more realistic gaming experience.

But WiSee, a technology developed by engineers at the University of Washington, adds an innovation: motion detection no longer requires a third-party device (such as the Wii controller), because it is provided by the Wi-Fi signals themselves. WiSee exploits the interference that the human body and its movements cause in the waves of a Wi-Fi network. The engineers designed software capable of interpreting this interference, using a method that allowed them to record nine gestures. When a person performs one of these gestures, the software detects it automatically, with a success rate of 94%.
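To make the idea concrete, here is a toy sketch, not WiSee's actual code: it classifies a gesture by comparing its observed interference signature against stored templates, the way WiSee matches measured patterns to the nine gestures it has recorded. The gesture names and the three-value "signature" format are illustrative assumptions; the real system works on Doppler shifts in raw Wi-Fi signals.

```python
# Hypothetical gesture templates: +1 = signal shift from movement toward
# the receiver, -1 = shift from movement away, 0 = no significant shift.
# These values and names are invented for illustration only.
GESTURE_TEMPLATES = {
    "push":  [+1, +1, 0],
    "pull":  [-1, -1, 0],
    "punch": [+1, -1, 0],
    "kick":  [+1, 0, -1],
}

def classify(observed):
    """Return the name of the template closest to the observed signature."""
    def distance(template):
        # Squared Euclidean distance between the two signatures.
        return sum((a - b) ** 2 for a, b in zip(observed, template))
    return min(GESTURE_TEMPLATES, key=lambda name: distance(GESTURE_TEMPLATES[name]))

print(classify([+1, +1, 0]))   # a clean "push" pattern -> push
print(classify([-1, -1, +1]))  # a noisy measurement, still nearest to pull
```

A nearest-template matcher like this is the simplest way to turn noisy, continuous measurements into one of a small fixed set of gestures.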

What practical applications? The gestures recognized by WiSee can easily be mapped to various tasks: raising and lowering the volume of a television, starting a music player, etc. Another advantage: since the Wi-Fi network reaches every room of a home, gesture recognition is not blocked by walls, so a device can be controlled from another room.

Flying a drone by thinking: the fantasy of telekinesis comes true

At the University of Minnesota, a team of biomedical engineering students has developed a revolutionary human-machine interface that allows a small drone to be controlled remotely by thought alone.

The interface relies on the brain activity that occurs when we think about making a particular movement. By defining these movements precisely, the engineers mapped each pattern of brain activity to a control command: when the pilot imagines closing their right fist, the drone turns right; imagining the left fist turns it left. To make the drone climb, the pilot imagines closing both fists.

A helmet connected to a computer is needed to make this telekinetic device work: when the pilot thinks about the movements described above, the helmet sends the signal to the computer, which processes it and then sends the corresponding command to the drone over Wi-Fi.
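The translation step on the computer can be sketched as a simple lookup from recognized thought patterns to drone commands. This is a minimal illustration, not the Minnesota team's actual software; every name below is an assumption made for the example.

```python
# Hypothetical mapping from decoded thought patterns (as reported by the
# helmet) to drone commands, mirroring the fist-closing scheme above.
COMMANDS = {
    "close_right_fist": "turn_right",
    "close_left_fist":  "turn_left",
    "close_both_fists": "climb",
}

def translate(thought_pattern):
    """Turn a recognized thought pattern into a drone command.

    Falls back to hovering when the pattern is not recognized, so an
    ambiguous reading never produces an unintended maneuver.
    """
    return COMMANDS.get(thought_pattern, "hover")

# The computer would then send this command to the drone over Wi-Fi.
print(translate("close_right_fist"))  # turn_right
print(translate("garbled_signal"))    # hover
```

Defaulting to "hover" on unrecognized input is a deliberate safety choice: with a noisy signal like brain activity, doing nothing is safer than guessing.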

What practical applications? The goal of these Minnesota engineers' start-up is not to enable thought-controlled combat drones. Professor Bin He, who directs the research, is more interested in what this technology could offer people with disabilities.

Photo credit: Shutterstock