
Meta’s Reality Labs unveiled a cutting-edge wristband that uses surface electromyography (sEMG) to detect subtle electrical signals at the wrist, allowing users to control computers with simple gestures such as swipes, taps, and pinches, without touching a keyboard or mouse.
The device can also recognize handwriting in the air, send messages, launch apps, and navigate menus by interpreting motor nerve signals before any physical movement occurs. It was tested with Orion AR glasses but could also work with laptops, tablets, and smartphones.
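Meta has not published the signal-processing details behind the wristband, but the general idea of turning sEMG activity into discrete commands can be sketched. The hypothetical example below windows a multi-channel EMG stream, extracts simple amplitude features, and classifies each window as a gesture with scikit-learn; the channel count, window size, gesture names, and classifier are all assumptions for illustration, not Meta's actual pipeline.

```python
# Illustrative sketch only: a generic windowed-feature pipeline that maps
# multi-channel sEMG samples to gesture labels. NOT Meta's implementation;
# channel count, window size, features, and classifier are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

N_CHANNELS = 8   # hypothetical number of electrodes around the wrist
WINDOW = 200     # samples per analysis window (e.g. 100 ms at 2 kHz)

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features: mean absolute value and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def windows(signal: np.ndarray, size: int = WINDOW):
    """Yield non-overlapping windows from a (samples, channels) signal."""
    for start in range(0, len(signal) - size + 1, size):
        yield signal[start:start + size]

# Training: labelled recordings of each gesture (synthetic data here).
rng = np.random.default_rng(0)
gestures = ["rest", "pinch", "swipe", "tap"]
X, y = [], []
for label, _ in enumerate(gestures):
    recording = rng.normal(scale=1.0 + label, size=(WINDOW * 50, N_CHANNELS))
    for w in windows(recording):
        X.append(features(w))
        y.append(label)

clf = LinearDiscriminantAnalysis().fit(np.array(X), np.array(y))

# Inference: classify a new window and emit a command for the host device.
new_window = rng.normal(scale=2.0, size=(WINDOW, N_CHANNELS))
predicted = gestures[clf.predict(features(new_window).reshape(1, -1))[0]]
print(f"Detected gesture: {predicted}")
```

In a real system the classifier would be trained on labelled recordings from the wearer and the predicted label would be forwarded to the paired device (glasses, phone, or laptop) as an input event.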
Importantly, this non-invasive tool is designed to enhance digital access for users with motor challenges. Researchers at Carnegie Mellon University are testing it with people who have spinal cord injuries, since the wristband can tap into residual muscle signals even when the fingers are paralyzed.
This wristband signals a major shift in human-computer interaction. By turning muscle activity into digital commands, it offers a new, intuitive way to interact with technology—no visible gestures, no screens. Meta says it plans to launch the device with its next smart glasses by late 2025.
The innovation could benefit all users, offering private messaging in public, hands-free operation, and broader accessibility without the surgery that brain implants require. It marks a significant step toward natural, neural-controlled computing.