Wearable Devices Ltd. has announced a collaboration with Rokid to enable natural, fast, and consistent gesture control for AI and AR glasses using Wearable Devices' Mudra Link neural wristband. Under the agreement, the Mudra Link and Rokid Glasses will work together out of the box, with ready-to-use pairing, pre-mapped gestures, and a shared setup flow. Joint marketing and consumer bundle opportunities are planned for an initial rollout in the second quarter of 2026.
The companies are demonstrating the combined experience at CES 2026, highlighting a touch-free, wrist-based control approach designed to reduce friction in everyday interactions with smart glasses. Wearable Devices Ltd. is a growth company pioneering human-computer interaction through its AI-powered, touchless neural input technology. The company's consumer products, the Mudra Band and Mudra Link, combine proprietary sensors, software, and advanced AI algorithms to enable touch-free, intuitive gesture control of digital devices across multiple operating systems.
Operating through a dual-channel model of direct-to-consumer sales and enterprise licensing and collaboration, Wearable Devices aims to empower consumers with stylish, functional wearables for enhanced experiences in gaming, productivity, and extended reality. On the business side, the company provides enterprise partners with advanced input solutions for immersive and interactive environments, from augmented, virtual, and extended reality (AR/VR/XR) to smart environments. By setting the standard for neural input in the XR ecosystem, Wearable Devices seeks to shape the future of seamless, natural user experiences across some of the world's fastest-growing tech markets. The latest news and updates relating to WLDS are available in the company's newsroom at https://ibn.fm/WLDS.
This collaboration represents a significant step in integrating neural interface technology with mainstream augmented reality hardware. The partnership between a specialist in neural input and a leader in AR glasses could accelerate the adoption of more intuitive control methods for wearable computing, moving beyond voice commands and touch interfaces to a system where subtle wrist gestures can manipulate digital overlays in the physical world. The planned consumer bundles and joint marketing efforts indicate a strategy to bring this combined technology to market as a cohesive user experience rather than as separate components.
The demonstration at CES 2026 serves as a public validation of the technology's readiness and provides an early look at how gesture control might function in practical AR applications. The implications of this partnership extend beyond the immediate product integration, potentially influencing how future AR and AI interfaces are designed, with a focus on reducing the cognitive and physical load required to interact with increasingly pervasive digital environments. For more information about AINewsWire, which distributed the announcement, please visit https://www.AINewsWire.com.


