This Startup Aims to Integrate Its Brain-Computer Interface with Apple Vision Pro

Cognixion is now launching its AI communication app on the Vision Pro, which Forsland says offers more features than the company’s purpose-built Axon-R headset. “The Vision Pro provides access to all your apps, the app store, and everything you wish to do,” he says.
In May, Apple introduced a new protocol for BCI integration, enabling users with severe mobility impairments to operate the iPhone, iPad, and Vision Pro without any physical movement. Another BCI company, Synchron, has successfully integrated its system with the Vision Pro through an implant placed in a blood vessel near the brain. (Apple is not currently known to be developing its own BCI technology.)
During Cognixion’s trial, the company replaced Apple’s headband with a custom version fitted with six electroencephalography (EEG) sensors. The sensors pick up activity from the visual and parietal cortices at the back of the head. Cognixion’s system specifically detects visual fixation signals, which occur when a person focuses their attention on an object, letting users select from a menu of options through mental focus alone. A neural computing device worn at the hip processes the brain data separately from the Vision Pro.
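To make that selection step concrete, here is a minimal, hypothetical sketch of one way attention-driven menu selection can work in systems of this kind: each on-screen option is tagged with its own flicker frequency, and the option whose frequency dominates the EEG recording is chosen. The sampling rate, frequencies, and function names below are illustrative assumptions, not details of Cognixion’s actual system.

```python
import numpy as np
from scipy.signal import welch

SAMPLE_RATE = 250  # Hz; assumed sampling rate, not a published Cognixion spec

# Hypothetical mapping of on-screen menu options to flicker frequencies.
MENU_FREQS = {"yes": 8.0, "no": 10.0, "more options": 12.0}

def bandpower(eeg, target_hz, bandwidth=0.5):
    """Average spectral power near target_hz across all channels."""
    freqs, psd = welch(eeg, fs=SAMPLE_RATE, nperseg=SAMPLE_RATE * 2, axis=-1)
    band = (freqs >= target_hz - bandwidth) & (freqs <= target_hz + bandwidth)
    return psd[..., band].mean()

def select_option(eeg_window):
    """Return the menu option whose tagged frequency dominates the EEG.

    eeg_window: array of shape (n_channels, n_samples), e.g. six electrodes
    over the visual and parietal cortices.
    """
    scores = {label: bandpower(eeg_window, hz) for label, hz in MENU_FREQS.items()}
    return max(scores, key=scores.get)

# Example with placeholder data: four seconds of signal from six electrodes.
window = np.random.randn(6, SAMPLE_RATE * 4)
print(select_option(window))
```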
“Our approach focuses on minimizing the communication burden on individuals,” explains Chris Ullrich, Cognixion’s chief technology officer.
Existing communication tools can help, but they fall short. Low-tech handheld letterboards, for instance, let patients point to specific letters, words, or images for caregivers to interpret, but they are slow to use. Eye-tracking technology, meanwhile, remains costly and is not always reliable.
“We design an AI for each individual participant tailored to their speaking history, humor, and anything they’ve written or communicated that we can access. We distill all that information into a user proxy,” Ullrich says.
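As an illustration of what distilling someone’s writing history into a “user proxy” might look like in practice, here is a minimal sketch that condenses a person’s prior messages into instructions for a language model. The class name, fields, and prompt wording are assumptions made for illustration; they are not Cognixion’s implementation.

```python
from dataclasses import dataclass, field

@dataclass
class UserProxy:
    """Condenses a person's prior writing into guidance for a language model."""
    name: str
    writing_samples: list[str] = field(default_factory=list)

    def system_prompt(self) -> str:
        # Fold a handful of the person's own phrasings into the prompt so
        # suggested replies sound like them rather than like a generic model.
        samples = "\n".join(f"- {s}" for s in self.writing_samples[:20])
        return (
            f"You draft short replies on behalf of {self.name}. "
            "Match the tone, humor, and phrasing of these examples:\n" + samples
        )

# Example with made-up history; real use would draw on the participant's
# messages, emails, and other writing they choose to share.
proxy = UserProxy("Alex", ["Ha, good one.", "Can we watch the game tonight?"])
print(proxy.system_prompt())
```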