JDK’s solutions architect, Jon Gear, presented at SXSW 2019 in March on controlling drones with our minds. We interviewed him to better understand the technology that makes this possible and to learn how the audience reacted to it. Read the interview and leave your comments or questions below.
Jon, you recently presented at SXSW 2019. What was your presentation about? And how did it go?
My presentation was on using a brain-computer interface (BCI) to control a retail drone and, more broadly, on what the future looks like for BCI-powered applications. The talk went well. One of my three demos failed, which was unfortunate, but that sometimes happens with live demos. Luckily, everyone was a good sport about it.
Controlling drones with your mind sounds like something out of a sci-fi thriller. But is that really so? Are we in an age where people are co-existing with mind-controlled devices?
It definitely does sound futuristic, doesn’t it! The industry is still in its early stages, but there is a lot of exciting work happening. The most common work you will see centers on helping quadriplegic users interact with monitors by controlling the mouse cursor with their minds. Another exciting area of research involves helping users with spinal injuries regain use of their limbs through implants. The industry is still experimenting here, but there have been some successful results. There has even been successful classification of subjects’ dreams; yes, you read that right: we are starting to be able to read someone’s dreams while they sleep. And as of last month, the first paper was published detailing how to mind meld with a rat, allowing a human to suggest to the rat, in real time, how to best navigate a maze.
What technology makes this possible? How do you see it changing society, government, or business?
From a technology perspective, BCI devices focus on recording and analyzing the brain waves that your brain emits through your skull. Electrode placement follows the 10-20 system, which acts as a standardized map of positions on the scalp. There are other approaches that go below the skull, onto the brain itself, but those require surgical implants. Short term, BCI is not as efficient as manually controlling a device or using eye tracking, but I do believe the first real home for BCI in our modern world is with people who have suffered spinal injuries and whom we can help rehabilitate. From there, I believe there is a rather large leap to the next most impactful advancement. Long term, once we understand the brain better, I do see BCI devices reshaping the world as we know it. Imagine a world where, instead of having to look down at your cellphone to connect with loved ones or the internet, that power is available within your own mind. In theory, this would allow for visual overlay of internet-based content on your real-world sight, as well as thought-driven querying. If you have ever read the book Feed by M.T. Anderson, I think it closely mirrors the social and business implications of accessing the internet with your mind.
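To make the recording-and-analysis step concrete, here is a minimal sketch of how software on the receiving end might estimate power in a frequency band from a single electrode’s signal. This is an illustration, not code from the talk; the sample rate, electrode name, and synthetic signal are all assumptions.

```python
import numpy as np

np.random.seed(0)
FS = 250  # assumed sample rate in Hz, typical for consumer EEG headsets

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return power[mask].mean()

# Synthetic 1-second recording from electrode "C3" (a motor-cortex site
# on the 10-20 map): a 10 Hz alpha rhythm buried in noise.
t = np.arange(FS) / FS
c3 = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(FS)

alpha = band_power(c3, FS, 8, 13)   # alpha band: 8-13 Hz
beta = band_power(c3, FS, 13, 30)   # beta band: 13-30 Hz
print(alpha > beta)  # True: the injected alpha rhythm dominates
```

Real pipelines add filtering, artifact rejection, and many channels, but the core idea is the same: turn raw voltage traces into band-power features that software can act on.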
When did you become interested in mind-controlled technology? And where do you think it is going? What are some use cases for it?
From childhood, I guess; who didn’t want to be Professor X? But I didn’t have any practical hands-on exposure until last year, when I ran across someone on Twitter who was using BCI devices to move objects and interact with visualizations. Right now, the industry still doesn’t understand how thoughts are formed, so we are limited to working with the motor cortex, the part of your brain responsible for controlling your limbs, and the visual cortex, the part responsible for processing visual stimuli. Motor imagery, the concept of thinking about moving your limbs to trigger an interaction, has seen a lot of advancement.
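As a rough illustration of the motor-imagery idea (a toy example, not code from Jon’s demos): imagining a hand movement typically suppresses the mu rhythm, roughly 8-12 Hz over the motor cortex, an effect called event-related desynchronization. A simple detector can threshold on that power drop; the sample rate, signals, and threshold below are all assumed values.

```python
import numpy as np

FS = 250  # assumed samples per second

def mu_power(signal, fs=FS):
    """Power in the mu band (8-12 Hz), the rhythm suppressed by motor imagery."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return power[(freqs >= 8) & (freqs <= 12)].mean()

def imagined_movement(window, baseline_power, drop_ratio=0.5):
    """Flag motor imagery when mu power falls below a fraction of baseline."""
    return mu_power(window) < drop_ratio * baseline_power

rng = np.random.default_rng(1)
t = np.arange(FS) / FS

# Resting signal: a strong 10 Hz mu rhythm plus noise.
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
# "Imagining movement": the mu rhythm is largely suppressed.
imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)

baseline = mu_power(rest)
print(imagined_movement(rest, baseline))     # False: mu rhythm intact
print(imagined_movement(imagery, baseline))  # True: mu power collapsed
```

Production systems use far richer features and trained classifiers, but thresholding a band-power drop against a resting baseline captures the essence of how thinking about a movement becomes an interaction.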
What types of questions was your audience asking about mind-controlled devices?
My audience was very interested in the real-world applications of this technology as well as how to get started. In terms of real-world applications, you want to think about interfaces that make sense to control by focusing on visual and limb-based activity, which is why prosthetics and visual displays are natural areas of active research. Until we begin to understand more about how our brains turn electricity into thoughts, we will be forced to measure side effects such as brainwave emissions and blood oxygenation in the brain.
You’re a solutions architect at JDK. We develop technology solutions for the enterprise. Do you see a place for BCI in your realm of work?
Not in the immediate future. BCI still requires a rather bulky peripheral and is limited in capacity, which won’t lend itself well to the enterprise unless a company is already in the space. But I do think that once BCI advances further, we will be rethinking interaction models a great deal. Just as Alexa caused us to rethink the user experience of voice-driven interactions, I believe BCI technology will push our interaction models even further. There is a lot of work being done in the conversational UI space that I think will bleed over directly into BCI-enabled devices when the industry is ready.
Jon Gear is a Solutions Architect for JDK Technologies. Jon has built a career on cloud and web development with a value-centric approach. Jon believes strongly in embracing new technologies and pushing the envelope of what is possible. Jon’s core passions are serverless development, machine learning, and robotics.