AlterEgo lets you talk to AI silently

The device could restore comfortable communication for people who have lost their voices
An undated image of a person wearing the AlterEgo device. — MIT Media Lab

Some technologies seem like something out of a science fiction novel, at least until you see them in action. A Boston-based startup led by MIT-trained innovator Arnav Kapur, building on his research at the MIT Media Lab, has developed AlterEgo, a wearable device that lets people communicate silently with artificial intelligence (AI).

The project represents an audacious step in human-computer interaction, an interface that feels "invisible, like having a second self."

How does AlterEgo work?

The device is a non-invasive neural interface, not a brain-reading implant. It picks up faint neuromuscular signals from the face and throat that occur when you silently articulate words.

Machine-learning software decodes those signals into text or commands, and the AI then delivers its answer via bone-conduction audio that only the user can hear.
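To make the decoding step concrete, here is a minimal, purely illustrative sketch of that kind of pipeline: windowed sensor readings are reduced to simple features and matched against per-word templates. AlterEgo's actual models and signal format are not public, so every name, shape, and number below is an assumption for illustration only.

```python
import numpy as np

# Hypothetical sketch of silent-speech decoding: AlterEgo's real models
# are not public, so this is illustrative, not the actual method.

def featurize(window: np.ndarray) -> np.ndarray:
    """Reduce a raw multi-channel signal window to amplitude statistics."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def classify(window: np.ndarray, templates: dict) -> str:
    """Return the word whose feature template is nearest to this window."""
    feats = featurize(window)
    return min(templates, key=lambda w: np.linalg.norm(feats - templates[w]))

# Toy "calibration": two fake words with distinct signal statistics
# (200 samples x 4 channels each, made-up noise levels).
rng = np.random.default_rng(0)
yes_signal = rng.normal(0.0, 0.1, (200, 4))
no_signal = rng.normal(1.0, 0.5, (200, 4))
templates = {"yes": featurize(yes_signal), "no": featurize(no_signal)}

# A new window drawn from the "yes"-like distribution decodes as "yes".
print(classify(rng.normal(0.0, 0.1, (200, 4)), templates))
```

A production system would replace the nearest-template step with a trained neural network and continuous decoding, but the overall shape (sense, featurize, classify, respond) stays the same.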

“This isn’t about mind-reading,” Kapur explains. “The hardware does not have access to brain data.” Nor does the device rely on visibly mouthing the words: it captures the signal before any sound is produced. The result is an interface that is private, discreet, and completely hands-free.

The implications are enormous. For patients with ALS, multiple sclerosis, or other speech disorders, AlterEgo could restore the ability to communicate comfortably in their own voices.

For the average user, it offers a way to quietly send a text in a crowded café or ask an AI assistant a question during a meeting.

Kapur presented an initial prototype at TED in 2019. More recently, the company unveiled the wearable at the Axios AI+ Summit in Washington, DC.