Synchron has shared a new demonstration of its brain-computer interface (BCI) technology enabling hands-free control of an iPad, powered by Apple’s accessibility features. In a video from the company, Mark, a participant in Synchron’s COMMAND clinical trial and a person living with ALS, is seen navigating the iPad Home Screen, launching apps, and composing messages — all without using his hands, voice, or eyes.
The system combines Synchron’s implantable Stentrode device with Apple’s built-in Switch Control feature in iPadOS. The Stentrode is inserted into a blood vessel above the brain’s motor cortex, where it captures neural signals related to motor intent. These signals are transmitted wirelessly to an external decoder, which then interfaces directly with the iPad using Apple’s BCI HID standard.
Switch Control plays a central role in enabling interaction. It is designed to allow users with limited mobility to control their devices through adaptive switches or external signals. In this case, it responds to decoded neural data instead of physical touch, making it possible for Mark to scroll, select apps, and type on the iPad entirely through thought.
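Switch Control's core interaction model is scanning: the system steps a highlight through on-screen items on a timer, and a single switch activation selects whichever item is highlighted at that moment. A minimal sketch of that loop (the item names and the one-activation-per-step framing are illustrative, not Apple's implementation):

```python
def scan_and_select(items, activations):
    """Step a highlight through `items`, selecting when the switch fires.

    `activations` is one boolean per scan step: True means the user
    (or, here, a decoded neural intent) triggered the switch at that
    step. Returns the selected item, or None if the scan cycle ends
    without a selection.
    """
    for item, fired in zip(items, activations):
        # In iPadOS the highlight advances on a configurable timer;
        # each loop iteration stands in for one timer tick.
        if fired:
            return item
    return None

apps = ["Messages", "Safari", "Notes", "Music"]
# The switch fires on the third scan step, so "Notes" is selected.
print(scan_and_select(apps, [False, False, True, False]))  # Notes
```

This is why a single reliable binary signal is enough for full navigation: the scanning loop supplies the spatial choice, and the switch supplies only the confirmation.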
Apple's BCI HID protocol is what makes this integration possible. It supports closed-loop communication between the iPad and Synchron's system: the device shares on-screen context dynamically, so the interface stays responsive and optimized in real time. This enables more precise and efficient control even with limited neural input.
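Apple has not published the wire format of BCI HID, but the closed-loop idea can be illustrated in the abstract: the host shares on-screen context (such as how many targets are currently selectable), and the input side adapts, for example dwelling longer on each item when there are few targets so that limited neural bandwidth is spent where it matters. All names and numbers below are invented for illustration, not drawn from any published spec.

```python
from dataclasses import dataclass

@dataclass
class ScreenContext:
    """Hypothetical context a host might share each frame."""
    selectable_targets: int

def adapt_scan_interval(ctx: ScreenContext,
                        base_interval_s: float = 1.0,
                        min_interval_s: float = 0.4) -> float:
    """Pick a per-item dwell time from the shared context.

    Fewer targets: dwell longer on each, since a full scan cycle is
    short anyway. Many targets: scan faster, clamped to a floor, so a
    full cycle stays tolerable. The formula is purely illustrative.
    """
    if ctx.selectable_targets <= 0:
        return base_interval_s
    interval = base_interval_s * (4 / (ctx.selectable_targets + 3))
    return max(min_interval_s, interval)

print(adapt_scan_interval(ScreenContext(selectable_targets=1)))   # 1.0
print(adapt_scan_interval(ScreenContext(selectable_targets=25)))  # clamped to 0.4
```

The design point is the direction of information flow: in an open-loop setup the decoder pushes events blindly, whereas here the host's context feeds back into how input is interpreted and paced.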
Dr. Tom Oxley, founder and CEO of Synchron, described the moment as a technological milestone. “This is the first time the world has seen native, thought-driven control of an Apple device in action. Mark’s experience is a technical breakthrough, and a glimpse into the future of human-computer interaction, where cognitive input becomes a mainstream mode of control.”
While Apple is not commercializing the technology itself, its accessibility frameworks and platform standards have enabled third-party innovations like Synchron’s to connect directly with iPadOS. The demonstration shows how Apple’s long-standing investment in accessibility can help unlock new forms of input for users with severe physical disabilities. It also reinforces the potential of brain-computer interfaces to bring real-world benefits when paired with mainstream consumer devices.