The University of Florida hosted a fully technology-driven sport: the world's first brain-controlled drone race. Using only their brainwaves, 16 pilots flew drones through an indoor course ten yards long.
Wearing black headsets with tentacle-like sensors stretched over their foreheads, the competitors stare at cubes floating on computer screens as their small white drones prepare for takeoff.
“Three, two, one … GO!” the announcer hollers, and as the racers fix their thoughts on pushing the cubes, the drones suddenly whir, rise and buzz through the air. Some struggle to move even a few feet, while others zip confidently across the finish line.
Neither drone racing nor brain-computer interfaces (BCIs) is new, but this is the first combination of the two, and it's an effective way of bringing BCIs into the mainstream eye.
A BCI is a system that translates brain signals into commands an output device can understand. Most often, the technology is used to give individuals who are paralyzed control of prosthetic limbs.
Here’s how the technology delivers an abstract thought through the digital realm and into the real world: Each EEG headset is calibrated to identify the electrical activity associated with particular thoughts in each wearer’s brain — recording, for example, where neurons fire when the wearer imagines pushing a chair across the floor. Programmers write code to translate these “imaginary motion” signals into commands that computers send to the drones.
The calibration works much like binding keys for a video game, where you specify that certain keys move your character up, down, left, and right. Here, though, each person's unique pattern of neuron activity is recorded and translated into those commands.
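The calibration-and-translation idea described above can be sketched in a few lines of code. This is a minimal illustration, not any real drone or EEG API: the labels, the `CALIBRATION` table, and the `Drone` class are all hypothetical stand-ins for a per-pilot mapping from classified "imagined motion" signals to flight commands.

```python
# Hypothetical sketch of the BCI pipeline described above.
# All names (CALIBRATION, Drone, step) are illustrative, not a real API.

from dataclasses import dataclass

# Per-pilot calibration table: maps a classified "imagined motion"
# label to a drone command, much like binding keys in a game.
CALIBRATION = {
    "imagine_push": "forward",
    "imagine_lift": "ascend",
    "rest": "hover",
}


@dataclass
class Drone:
    x: float = 0.0  # distance along the course
    z: float = 0.0  # altitude

    def execute(self, command: str) -> None:
        # Translate a command into a small change in position.
        if command == "forward":
            self.x += 1.0
        elif command == "ascend":
            self.z += 1.0
        # "hover" leaves the drone where it is.


def step(drone: Drone, eeg_label: str) -> None:
    """Map one classified EEG window to a drone command and run it."""
    command = CALIBRATION.get(eeg_label, "hover")
    drone.execute(command)


drone = Drone()
# Simulated stream of classifier outputs from the headset.
for label in ["imagine_lift", "imagine_push", "imagine_push", "rest"]:
    step(drone, label)
print(drone)  # drone position after four classified EEG windows
```

In a real system the classifier labels would come from a machine-learning model trained on each wearer's EEG data during calibration; the dictionary lookup stands in for that translation step.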
Sources: futurism.com and eng.ufl.edu