Meta expands neural wristband tech to cars and accessibility at CES 2026

At CES 2026 in Las Vegas, Meta unveiled applications of its EMG-based neural wristband beyond smart glasses, partnering with Garmin on automotive controls and with the University of Utah on accessibility research to expand wrist-based device interaction.

Meta has spent several years developing its electromyography (EMG) technology, which captures electrical signals from muscles in the wrist to enable gesture controls. The company first introduced it commercially in 2025 with its Meta Ray-Ban Display glasses. Users wear a dedicated neural band on the wrist that detects subtle muscle movements, allowing intuitive control of the glasses without physical touchscreens or voice commands. These movements translate into actions such as scrolling, pinching to zoom, or swiping through content displayed in the glasses’ augmented reality interface.
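Conceptually, the band classifies muscle signals into discrete gesture events that a host device then maps to interface actions. The sketch below illustrates that mapping layer; the gesture names, event structure, and handler functions are assumptions made for illustration only, since Meta has not published the actual interface.

```python
# Illustrative sketch of a gesture-to-action dispatch layer.
# Gesture names, event fields, and handlers are assumptions,
# not Meta's actual API.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    kind: str               # e.g. "scroll", "pinch", "swipe"
    direction: str = ""     # e.g. "left", "right", "up", "down"
    magnitude: float = 0.0  # normalized gesture strength


def scroll_content(event: GestureEvent) -> None:
    print(f"Scrolling {event.direction} by {event.magnitude:.2f}")


def pinch_zoom(event: GestureEvent) -> None:
    print(f"Zooming by factor {1.0 + event.magnitude:.2f}")


def swipe_page(event: GestureEvent) -> None:
    print(f"Swiping to the {event.direction} page")


# Host-side dispatch table: gesture kind -> UI action.
HANDLERS: Dict[str, Callable[[GestureEvent], None]] = {
    "scroll": scroll_content,
    "pinch": pinch_zoom,
    "swipe": swipe_page,
}


def dispatch(event: GestureEvent) -> None:
    handler = HANDLERS.get(event.kind)
    if handler:
        handler(event)


if __name__ == "__main__":
    dispatch(GestureEvent(kind="swipe", direction="right"))
    dispatch(GestureEvent(kind="pinch", magnitude=0.4))
```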

At CES 2026, Meta presented demonstrations extending the neural band’s capabilities to external devices, marking the band’s first public showcase outside Meta’s smart glasses ecosystem. The company collaborated with Garmin and additional research partners to test practical implementations of the wrist-based controller in diverse environments.

Meta and Garmin previously integrated fitness tracking features into the Ray-Ban Display glasses, syncing data from Garmin wearables to improve activity monitoring while the glasses are worn. This foundation supported their joint CES presentation focused on automotive applications.

Garmin incorporated the neural band into a vehicle cockpit setup as part of its “Unified Cabin” concept, which integrates artificial intelligence for various in-car experiences centered on infotainment systems. Garmin develops these systems for multiple major car brands. In the demonstration, participants wearing the neural band navigated two applications on a touchscreen infotainment display. The first involved pinch and swipe gestures to rotate and manipulate a three-dimensional model of a car, replicating the zooming and panning motions used with the Ray-Ban glasses for image navigation. The second application featured the puzzle game 2048, where swipe gestures shifted numbered tiles across the screen to combine matching values.
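The 2048 demo maps a simple gesture vocabulary onto a well-known game rule: each swipe slides tiles toward one edge and merges adjacent equal values. The sketch below shows that core rule for a single row and a "left" swipe; wiring a real neural band gesture into it is an assumption for illustration, not Garmin’s or Meta’s implementation.

```python
# Minimal sketch of 2048's core rule: a swipe slides tiles toward one
# edge and merges adjacent equal values once per move. Shown for a
# single row and a "left" swipe; the gesture wiring is illustrative.

from typing import List


def merge_row_left(row: List[int]) -> List[int]:
    """Slide non-zero tiles left, merging equal neighbours once."""
    tiles = [v for v in row if v != 0]   # drop empty cells
    merged: List[int] = []
    i = 0
    while i < len(tiles):
        if i + 1 < len(tiles) and tiles[i] == tiles[i + 1]:
            merged.append(tiles[i] * 2)  # combine matching values
            i += 2
        else:
            merged.append(tiles[i])
            i += 1
    # Pad the right side with empty cells to keep the row length fixed.
    return merged + [0] * (len(row) - len(merged))


if __name__ == "__main__":
    # A swipe-left gesture applied to one row of the board.
    print(merge_row_left([2, 2, 4, 0]))  # -> [4, 4, 0, 0]
```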

Garmin representatives said they intend to investigate further integrations, including neural band controls for core vehicle operations such as lowering windows or unlocking doors, building on the gesture recognition accuracy shown in the infotainment demo.

Meta also initiated a research partnership with the University of Utah to apply EMG technology for individuals with amyotrophic lateral sclerosis (ALS), muscular dystrophy, and similar conditions impairing hand mobility. The collaboration tests wrist gestures enabling operation of household smart devices, including speakers for audio playback, blinds for light adjustment, thermostats for temperature regulation, locks for secure entry, and comparable connected appliances.
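In practice, this kind of accessibility setup amounts to mapping a small, reliable gesture vocabulary onto commands for connected devices. The sketch below illustrates such a mapping; the device names, gestures, and the send_command stub are assumptions for illustration, as the details of the University of Utah research setup have not been published.

```python
# Illustrative sketch of mapping a small wrist-gesture vocabulary to
# smart-home commands. Devices, gestures, and send_command are
# assumptions, not the actual research configuration.

GESTURE_TO_COMMAND = {
    "pinch":      ("speaker", "toggle_playback"),
    "swipe_up":   ("blinds", "raise"),
    "swipe_down": ("blinds", "lower"),
    "double_tap": ("thermostat", "set_comfort_temperature"),
    "hold":       ("front_door_lock", "toggle_lock"),
}


def send_command(device: str, action: str) -> None:
    # Stand-in for a real smart-home API call (e.g. via a local hub).
    print(f"{device}: {action}")


def handle_gesture(gesture: str) -> None:
    command = GESTURE_TO_COMMAND.get(gesture)
    if command:
        send_command(*command)


if __name__ == "__main__":
    handle_gesture("swipe_up")  # blinds: raise
    handle_gesture("hold")      # front_door_lock: toggle_lock
```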

In a blog post, Meta described the band’s sensitivity: “Meta Neural Band is sensitive enough to detect subtle muscle activity in the wrist — even for people who can’t move their hands.” That sensitivity lets the band pick up residual muscle signals in affected users, enabling non-invasive control without requiring visible motion.

University of Utah researchers will also evaluate mobility adaptations, targeting programs like TetraSki, which currently relies on joystick or mouth-operated controllers to let participants with disabilities steer adaptive ski equipment on the slopes.
