Are you interested in trying a wireless headset that allows you to interact with digital devices simply by reading your mind? This isn’t science fiction; it was invented over a decade ago by Australian entrepreneur and inventor Tan Le.
Tan Le’s groundbreaking work in neuroscience has deepened our understanding of the brain’s inner workings. Now, she predicts that we will all use neural interfaces in our daily lives, and they will resemble easy-to-wear headphones.
My Wildest Prediction is a podcast series from Euronews Business where we dare to imagine the future with business and tech visionaries. In this episode, Tom Goodwin talks with Tan Le, CEO of Emotiv, about the future of brain technology.
What daily life with neural interfaces will look like
Imagine if everyday items such as your headphones, glasses, or hat could detect signals from your brain’s neural activity. They could suggest a break from work or pause your audiobook if you’re starting to doze off.
This noninvasive support for daily tasks epitomises the future of mind-reading devices, according to Tan Le, CEO of Emotiv, a San Francisco-based neurotechnology company.
“So whether it’s selecting music to match your mood or enhance cognitive performance, or curating playlists tailored to your preferences, it’s a natural extension,” Le explains.
The concept diverges from the brain-chip vision of Elon Musk’s Neuralink, but Le has a track record. In 2009, she disrupted the neural interface market by launching the first wireless headset for public use, enabling control of digital devices with thoughts and emotions.
This breakthrough made mind-controlled tech tangible, revealing brain-computer interface potential beyond sci-fi.
Le emphasises the health impact of integrating such devices into daily life: “Using neural interfaces will gather better data, models, and biomarkers for brain health, longevity, and resilience.”
Data collection is crucial: when it comes to something like stress, relying on how you feel isn’t enough, Le notes: “You may feel fine, but sustained stress levels can go unnoticed without objective measurements.”
One challenge is decoding the brain’s electrical signals, which are weakened by physical barriers such as the skull, hair, and skin before they reach sensors on the scalp. However, Le says progress is bringing us closer to a world where wearing such devices becomes commonplace.
A cognitive copilot
Tan Le believes neural devices will begin as health wearables, but their potential will evolve into broader neural interfaces over time.
“The models are improving significantly. Recent advancements in machine learning and AI have introduced more effective methods for decoding data and signals, especially with large datasets. AI can now identify subtle patterns and changes in electrical fluctuations.”
Le envisions a convergence of these technologies resulting in cognitive copilots. These copilots will recognise when a person is confused or struggling to recall a word, stepping in to assist.
“So, it will develop into a much more symbiotic relationship with AI, with a seamless feedback loop between humans and AI,” Le explains. “If AI can grasp the emotional tone I seek and understand my intentions, adjusting content accordingly, it will offer valuable guidance, even acting as guardrails.”
Emotiv powers neuroscience experiments in over 140 countries. According to Le, what’s particularly intriguing is how the brain adapts to lived experiences.
“Your brain is an incredible system completely responsive to the world in which you lived and your lived experience. If you live in a remote part of the world, the network connectivity in your brain can look very different to someone who lives in Miami, the US, or even in Europe. That’s really exciting for us to have this collection of brain data that really represents the diversity that exists in our world.”