Keystroke, Double Click, Swipe Right? No, Just Think It.

From Typing to Telepathy

Published 08-21-23

Submitted by Lenovo

From Typing to Telepathy

How human beings interface with computers has evolved steadily over the decades and is now becoming extraordinary. It all started with zero user-to-computer contact: in the early mainframe era, programmers and users handed stacks of IBM punch cards to computer operators, the specially designated people with unique access to the machine. Then, with the invention of the PC, users got hands-on with keyboards, mice, the ThinkPad TrackPoint, touchpads, digital pens, and touch screens. Today, people simply talk to computers to issue commands, or use their eyes and hand gestures to navigate augmented reality (AR) and virtual reality (VR) headsets like those offered by Lenovo ThinkReality.

What’s next? Some people are now simply thinking, and the computer responds directly through sensors interfacing with the user’s brain and body. The brain-computer interface (BCI) is a technological frontier that the scientific community has explored for decades. The technology holds promise for therapies and remedies for people with a broad range of physical impairments, for researchers trying to better understand the mind, and possibly for many more applications, including improving human cognition.

A person speaking on a stage, with a digital screen showing a person wearing a device on their head.
Photo courtesy of Gilberto Tadday / TED

Innovation Dedicated to Opening Minds

Lenovo recently began a collaboration with OpenBCI, a pioneer in the wearable BCI field. OpenBCI creates open-source tools for biosensing and neuroscience with the goal of lowering the barrier to entry for brain-computer interfacing.

OpenBCI’s latest project is Galea: a hardware and software platform merging next-generation biometrics with mixed reality. Galea is the first device to integrate EEG, EMG, EDA, PPG, and eye-tracking sensors into a single headset. By combining a multi-modal sensor system with the immersion of augmented and virtual reality, Galea gives researchers, developers, and creators a powerful new tool for understanding and augmenting the human mind and body.
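
To make the multi-modal idea concrete, here is a hypothetical sketch of what a single combined sample from such a headset might look like. This is not OpenBCI's actual data format; every field name, channel count, and unit below is an assumption for illustration.

```python
from dataclasses import dataclass

# Hypothetical illustration only -- not OpenBCI's real Galea schema.
@dataclass
class BiosensorSample:
    timestamp: float                # seconds since recording start
    eeg_uv: list[float]             # one microvolt reading per EEG channel
    emg_uv: list[float]             # one microvolt reading per EMG channel
    eda_us: float                   # electrodermal activity, microsiemens
    ppg: float                      # raw photoplethysmography (pulse) value
    gaze_xy: tuple[float, float]    # normalized eye-tracking gaze position

# One synthetic sample: 8 EEG channels, 4 EMG channels, gaze at screen center.
sample = BiosensorSample(0.004, [12.5] * 8, [40.0] * 4, 1.8, 0.62, (0.5, 0.5))
```

The point of bundling every modality under one timestamp is that downstream software can correlate, say, a spike in muscle activity with where the user was looking at that instant.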

Galea is already producing amazing results.

Christian Bayerlein is a German technologist and disability rights activist living with spinal muscular atrophy (SMA), a genetic disease that affects the motor neurons that control voluntary muscle movement. During TED2023, Christian and OpenBCI founder Conor Russomanno demonstrated how residual muscle activity from Christian’s body could be repurposed to control and fly a drone.

Galea headset
Photo courtesy of OpenBCI

Learning to Fly

The TED Talk and demonstration are part of OpenBCI’s larger “NeuroFly” project, which will culminate in the release of a documentary and an open-source software toolkit for replicating the demonstration.

OpenBCI worked with Christian to identify the four muscle groups he could most reliably activate and placed electrodes over those muscles. OpenBCI’s hardware and software translated each muscle’s activation into a digital slider that Christian learned to control, and the four sliders were then mapped to a new digital joystick. Signal processing and fine-tuned filtering helped eliminate false positives and allowed the joystick’s thresholds to adapt to Christian and to variations across sessions. With practice, Christian gained full control of the drone, and by connecting the drone’s camera to a VR headset, he was able to fly in first person over the TED audience.
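
The muscle-to-joystick mapping described above can be sketched roughly as follows. This is not OpenBCI's actual pipeline: the thresholds, ceilings, and the choice to pair the four sliders into two opposing joystick axes are all assumptions made for illustration.

```python
# Hypothetical sketch of the NeuroFly-style control mapping described in
# the article; all numbers and the axis layout are invented.

def emg_to_slider(rms: float, threshold: float, ceiling: float) -> float:
    """Normalize a rectified EMG reading into a 0..1 slider value.

    Readings below the per-user threshold are treated as rest, which is
    one simple way to suppress false-positive activations.
    """
    if rms <= threshold:
        return 0.0
    return min((rms - threshold) / (ceiling - threshold), 1.0)

def sliders_to_joystick(up: float, down: float,
                        left: float, right: float) -> tuple[float, float]:
    """Combine four opposing 0..1 sliders into joystick (x, y) in [-1, 1]."""
    return (right - left, up - down)

# Example: a strong "up" activation and a mild "right" activation.
x, y = sliders_to_joystick(emg_to_slider(0.9, 0.2, 1.0), 0.0,
                           0.0, emg_to_slider(0.5, 0.2, 1.0))
```

Adapting `threshold` and `ceiling` per session is one plausible way to realize the article's point that the system was tuned to Christian and to variation across sessions.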

The process is like learning to use a keyboard for the first time: with practice, the brain adapts and the new controls become effortless. The same sensors and interaction methods used for NeuroFly could be applied to control a range of physical or digital tools.

OpenBCI tested many different machine learning and data processing pipelines while developing NeuroFly, and the project was made possible in part through resources and expertise donated by Lenovo Workstations. The final demonstration used relatively simple signal processing techniques, but access to cutting-edge CPU, memory, and GPU horsepower greatly accelerated OpenBCI’s ability to explore different engineering paths and process the range of data generated by Galea’s numerous sensors. Lenovo ThinkStation and ThinkPad P Series workstations can support these intensive workloads without help from the cloud thanks to their large number of compute cores, fast processor clock speeds, and huge memory capacity and bandwidth. Being able to run the powerful Varjo VR headset while simultaneously streaming Galea’s data and running AI models on it was a boon for the development of NeuroFly.

Three people on a stage, with a “TED” sign to the left and digital screens above.
Photo courtesy of Gilberto Tadday / TED

My Brain Has an App for That

The applications of Galea and other BCI technology seem nearly limitless. Galea’s early users include companies in consumer technology, automotive, aviation, healthcare, neuroscience, simulation and training, and gaming; the project originally started as a collaboration with gaming giant Valve. More broadly, BCI technology has the potential to help restore lost motor and communication capabilities, unlock humans’ ability to quantify cognitive and emotional states, and power new forms of AI-assisted human-computer interaction tailored to an individual’s specific mind and body.

“Ultimately, I see the combination of neurotechnology and mixed reality as the future of personal computers,” says Russomanno. “Spatial computing, BCI, and AI will converge this decade, resulting in quite possibly the greatest technological inflection point humanity ever experienced. It is going to be remarkable.”

Within the next decade or so, PC users could begin to have far more intuitive computing experiences. Drawing on data from technology like Galea and other sources, along with new AI-enabled applications, computers will understand, predict, and respond to individuals’ needs based on biometric and other inputs. For example, if a user is tired or distracted, audio or video playback could slow down or pause to help them stay on track. Game designers and players could likewise tune settings to players’ physical and emotional reactions, making scenarios more or less frightening, exciting, or soothing, depending on user preferences.
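
As a toy sketch of that adaptive-feedback idea: the attention score, thresholds, and playback rates below are all invented for illustration. In practice such a score would have to be estimated from biometric signals like those Galea records, which is a hard problem in its own right.

```python
# Hypothetical illustration of biometric-adaptive playback; the score
# scale (0 = fully distracted, 1 = fully attentive) and cutoffs are
# assumptions, not any real product's behavior.

def playback_rate(attention: float) -> float:
    """Choose a media playback rate from an attention estimate in [0, 1]."""
    if attention < 0.2:
        return 0.0    # clearly distracted: pause entirely
    if attention < 0.5:
        return 0.75   # drifting: slow down to help the user keep up
    return 1.0        # attentive: play at normal speed
```

A game could apply the same pattern in reverse, ramping intensity up or down as a player's measured arousal drifts away from a target range.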

This convergence of smarter technologies and human experiences is transforming the philosopher René Descartes’s insight “I think, therefore I am” into the more practical “I think, therefore I do.”

Lenovo is a US$62 billion revenue global technology powerhouse, ranked #217 in the Fortune Global 500, employing 77,000 people around the world, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company by further expanding into growth areas that fuel the advancement of ‘New IT’ technologies (client, edge, cloud, network, and intelligence), including server, storage, mobile, software, solutions, and services. This transformation, together with Lenovo’s world-changing innovation, is building a more inclusive, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). To find out more, visit our website and read about the latest news via our StoryHub.
