Say goodbye to the keyboard; type ‘hello’ instead with your mind. And that’s just the beginning.
Today’s highlight: Neural Interfaces
What is it?
A neural interface (also called a brain-computer interface, or BCI) is an external device that detects signals from your nervous system. The “human neural system generates, transmits, and processes electrochemical signals in different parts of the body. The ‘electric part’ of those signals can be ‘read’ and ‘interpreted’.” (Kaspersky) With machine learning, the device’s interpretation can be turned into an action, such as typing or selecting an item on a screen, turning on the lights in your house, or moving a car or robotic arm. Hence, you can interact with something using only your thoughts. Despite what it might look like, it’s not the same thing as reading your mind. But that doesn’t mean it’s not amazing.
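The signal-to-action pipeline above can be sketched in a few lines. This is a toy illustration, not a real BCI: the feature, the calibration values, and the command names are all made up, and real systems run trained models over many channels of EEG or EMG data. The shape of the loop – read a window of samples, extract a feature, classify it, dispatch an action – is the part that matters.

```python
def band_energy(samples):
    """Crude feature: average squared amplitude of one signal window."""
    return sum(s * s for s in samples) / len(samples)

# Hypothetical calibration: feature centroids "learned" for two mental
# commands during a training session (values are invented).
CENTROIDS = {"select_item": 0.04, "turn_on_lights": 0.25}

def interpret(samples):
    """Nearest-centroid classification of one window into a command."""
    energy = band_energy(samples)
    return min(CENTROIDS, key=lambda cmd: abs(CENTROIDS[cmd] - energy))

def act(command):
    # A real system would dispatch to a device controller here.
    return f"executing: {command}"

# A low-energy window sits closest to the "select_item" calibration point.
window = [0.1, -0.2, 0.15, -0.1, 0.2, -0.15]
print(act(interpret(window)))  # -> executing: select_item
```

The machine-learning step in a production device is far more sophisticated, but it plays exactly this role: turning a continuous stream of electrical readings into a small vocabulary of discrete intents.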
In the past, this technology had to be implanted directly into the brain during neurosurgery. Now there are non-invasive ways to detect these brain waves, simply by wearing a headset, headband, or even a wristband. Soon we will be seeing this mind-blowing technology out in the world. In fact, last week Elon Musk of Tesla fame revealed his plans for Neuralink and its brain-reading threads in a white paper. The Defense Advanced Research Projects Agency (DARPA) is also spending $65 million to fund neural interfaces, Bryan Johnson has raised $100 million for his company Kernel, and Facebook is working on BCI as well. It’s only a matter of time before brain waves become humanity’s next everyday tool.
Neural interfaces give the end user additional autonomy. One great example is their potential to help us interact with the world and communicate more efficiently by cutting out the need to type with our hands. The world we live in now requires a lot of typing – not just texting, but entering website addresses, usernames and passwords, destinations in GPS navigation apps, and so on. One of the most obvious uses for neural interfaces is hands-free typing. Voice recognition has advanced at an astounding rate, letting us speak and have our words accurately turned into text, but for people who are nonverbal, have hearing loss, or simply don’t want their private correspondence spoken out loud, voice is not a cure-all. Neural interfaces are a great solution that will let us type with our minds while our hands, if we have use of them, are occupied with other tasks.
Here are a few other use cases for neural interfaces:
There’s a lot of potential for people with disabilities. People recovering from strokes or living with paralysis will have new options. In 2018, researchers at MIT debuted functioning smart limbs that are controlled by the brain.
This technology will allow us to effectively become cyborgs and develop superhuman abilities. Want to fly like a bird? We’ll develop functioning bionic wings. There are ways all of us could benefit from a device that constantly detects our brain waves and interprets them. What if we could be alerted the moment something feels wrong to us? It would be like augmented intuition. That’s just the tip of the iceberg – there are many more uses we can brainstorm.
> In education:
In a classroom setting, BCI could be used to test student engagement. Headbands that monitor concentration by reading brain signals have been tested on 10,000 Chinese schoolchildren. Lights on the device flag if the student is not paying attention. Say goodbye to daydreaming in class!
Neural interfaces could also provide tailored experiences for different learning levels. Imagine a digital learning environment that detects the student’s workload via brain waves (for example, how hard it is for them to solve an arithmetic exercise) and automatically adapts the difficulty of the exercises to hold the learner’s workload level in an optimal range. (See this research for more info.)
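The adaptive-learning idea above boils down to a simple control loop: estimate the learner’s workload from brain signals, then nudge the exercise difficulty up or down to keep that estimate inside a target band. The sketch below is hypothetical – the workload scale, the optimal range, and the difficulty levels are all invented for illustration.

```python
# Assumed workload band on a 0..1 scale; a real system would calibrate this.
OPTIMAL_LOW, OPTIMAL_HIGH = 0.4, 0.7

def adapt_difficulty(difficulty, workload):
    """Raise difficulty when the learner is under-challenged,
    lower it when the workload estimate exceeds the optimal range."""
    if workload < OPTIMAL_LOW:
        return min(difficulty + 1, 10)  # too easy: step up (cap at 10)
    if workload > OPTIMAL_HIGH:
        return max(difficulty - 1, 1)   # too hard: step down (floor at 1)
    return difficulty                   # in the sweet spot: hold steady

# Simulated sequence of workload estimates from a (hypothetical) headset:
level = 5
for workload in [0.2, 0.3, 0.8, 0.5]:
    level = adapt_difficulty(level, workload)
print(level)  # 5 -> 6 -> 7 -> 6, then holds at 6
```

The research cited above replaces the toy threshold rule with a trained workload classifier over EEG features, but the feedback loop has this same structure.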
> In AR and VR:
Companies like Neurable, Emotiv, and NeuroSky are creating AR/VR headsets with non-invasive BCI hardware built in. You can play a video game without ever using your hands.
In an augmented reality space, this will prove particularly useful for interacting with holographic interfaces. Imagine walking down the street wearing your AR glasses when you pull up a display where information must be entered into a form. Gesturing with your fingers to type on a holographic keyboard could work, but wouldn’t it be so much better (and far less silly-looking to passers-by) just to think of the letters you want input?
> CTRL-Labs’ wristband is a really cool alternative to a headset:
No need for a glove. Using only a single band of sensors around the forearm, the wristband knows what the hand and fingers want to do and renders that intent visually on the monitor.
Even when the hand is constrained – not moving at all – the wearer simply imagines what they want to do with their hand, and the wristband picks up the intention from the electrical pulses traveling along the neurons in the arm. So even people without hands may still be able to use this technology. My mind is blown.
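One way to picture how a forearm band could infer finger intent: each sensor channel picks up electrical activity from nearby motor neurons, and the pattern of activations maps to an intended movement. The sketch below is purely illustrative – the channel-to-finger mapping and the readings are invented, and CTRL-Labs’ actual decoding is far richer than a simple strongest-channel rule.

```python
# Hypothetical mapping from EMG sensor channels to intended fingers.
FINGER_MAP = ["thumb", "index", "middle", "ring", "pinky"]

def intended_finger(channel_amplitudes):
    """Pick the finger whose channel shows the strongest EMG activity."""
    strongest = max(range(len(channel_amplitudes)),
                    key=lambda i: channel_amplitudes[i])
    return FINGER_MAP[strongest]

# The hand itself need not move: neural activity alone drives the signal.
readings = [0.02, 0.31, 0.05, 0.04, 0.01]  # index-finger channel dominates
print(intended_finger(readings))  # -> index
```

This is why the approach can work even when the hand is immobilized: the decoder reads the motor command on its way down the arm, not the movement itself.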
> You can now get a drone that comes with a headset to control it:
The idea that you can wear a headband and use your mind to have a flying drone accompany you is the stuff of science fiction fantasy. My 7-year-old self is high-fiving me right now.
> What about libraries?
New programming, new classes: using a BCI is a skill, and users must be properly trained to achieve control over the neural interface. If they cannot perform the desired mental commands, no signal-processing algorithm can identify them. To use this new tech, one may have to learn how to focus on a specific task. Mindfulness practices will become more important than ever. Libraries may become a place for patrons to take a mindfulness or concentration class in order to learn how to use this new technology.
Patron privacy: If these devices attain mass consumer adoption as they are poised to do, and companies then get access to millions of users’ brain states and collect data about them for their own purposes, there are enormous privacy and ethical issues at stake. With authoritarian governments controlling the data or mandating citizen use (see schoolchildren example above), it could help spur the creation of a surveillance state. Libraries are already helping to keep their communities informed about privacy awareness and may have to increase their efforts.
What do you think? How else do you think libraries will be involved with neural interfaces? Let me know your thoughts! I want to detect those brain waves. 🙂