At the turn of the 20th century, William Hoy transformed Major League Baseball. The most prominent deaf player in history, he taught his team American Sign Language (ASL) to communicate on the field while keeping opponents in the dark. His silent speech, a legacy well over a century old now, also inspired umpires to make calls using hand gestures.
ASL is one of some 300 sign languages used today by roughly 70 million deaf people worldwide. But only a sliver of society understands signs. Everyday tasks, like ordering at a restaurant or meeting people at social events, can be difficult. To bridge the gap, a South Korean team developed smart rings to translate finger motions into text.
Older devices usually require a jungle of cables to connect sensors. But the new rings are wireless, freeing people to use natural hand motions. The rings also stretch to fit different finger sizes. These upgrades make them more comfortable and reliable, wrote the team. Each ring is powered by a replaceable 12-hour battery.

Fluent signers can communicate at speeds of around 100 to 150 signs per minute, similar to spoken conversation. Devices need to keep up with that speed to avoid uncomfortable pauses. So the team developed AI-based “autocomplete” for the system that, like typing, guesses the next word based on what’s already been signed to generate phrases and sentences on the fly.
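The article doesn't detail how the prediction model works, but the core idea, guessing the next sign from what's already been signed, can be sketched with a toy bigram predictor. The phrases and vocabulary below are hypothetical illustrations, not the team's training data:

```python
from collections import Counter, defaultdict

# Toy corpus of signed phrases (hypothetical, not the team's data).
phrases = [
    ["i", "want", "water"],
    ["you", "want", "food"],
    ["family", "want", "beautiful", "animal"],
    ["i", "want", "food"],
]

# Count bigrams: how often each word follows another.
follows = defaultdict(Counter)
for phrase in phrases:
    for prev, nxt in zip(phrase, phrase[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Guess the most likely next sign, like typing autocomplete."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("want"))  # "food" follows "want" most often above
```

A real system would use a far larger model, but the principle is the same: suggest the most probable continuation so conversation keeps pace with signing speed.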
Trained on 100 common words in ASL and International Sign Language (ISL), the wearable was over 88 percent accurate in tests, even for users with no experience.
The rings are a step toward “seamless interaction between signers and non-signers,” wrote the team.
Let’s Chat
There are a variety of devices that translate sign language into text or speech, some already on the market.
One design is a bit like virtual reality gaming. It uses cameras and computer vision software to recognize hand gestures. The approach is reasonably fast and accurate in the lab, but struggles in simulated real-world scenarios, where changes in lighting or background confuse the system.
Devices worn by users are more reliable. WearSign, for example, uses sensors to capture the electrical activity of muscles during signing and translates it into text. But these devices often need to be tailored to each user, a hurdle that limits adoption, since not everyone can commit to the training.
Engineers have also tried embedding tracking sensors in a smart glove. The sensors send signals through cables to a shared wireless transmitter. But it's a bit like using tools while wearing a heavy winter glove: the devices limit natural movement and are uncomfortable for daily use.
They also usually come in only one size with fixed sensor placements, wrote the team. So, depending on hand size, the sensors may be out of place, reducing accuracy.
Put a Ring on It
To overcome these problems, the team built AI rings to track the seven most dominant fingers in signing. (The right pinkie, left middle finger, and thumb didn’t make the cut.) The rings are worn right below the second knuckle to allow natural movement.
Each device is made of stretchy material to accommodate different finger sizes and looks more like a translucent Band-Aid than a typical ring. A tiny accelerometer captures movements like bending, curling, and holding still. The sensors are cheap, low-power, and already used in Apple Watches, Fitbits, and other wearables. There are also onboard chips to manage power use, wafer-thin Bluetooth transmitters, and common replaceable batteries that last nearly 12 hours.
The rings broadcast signals to a host device, which processes the data and maintains a timeline of each movement so incoming signs aren’t scrambled in translation.
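The article doesn't specify how the host keeps that timeline, but one simple approach is to timestamp each reading on the ring and re-order packets on arrival, since wireless delivery can shuffle them. The packet format and ring names below are hypothetical:

```python
# Hypothetical packets from three rings; wireless delivery may arrive
# out of order. Each packet: (timestamp_ms, ring_id, accel_reading).
packets = [
    (120, "index_R", (0.1, -0.3, 9.8)),
    (100, "thumb_R", (0.0, 0.2, 9.7)),
    (110, "middle_R", (0.4, 0.1, 9.6)),
]

# Re-order by timestamp so the host reconstructs the true motion
# timeline before trying to recognize a sign. Tuples sort by their
# first element, so sorting orders packets by timestamp.
timeline = sorted(packets)

for ts, ring, accel in timeline:
    print(ts, ring, accel)
```

Ordering by a shared clock is what keeps fast, overlapping finger movements from being scrambled in translation.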
To identify words, the system matches gestures to a database of 100 ASL and ISL signs. For example, closing both open palms into fists means “want.” The rings can also pick up signs in motion, like “dance” or “fly,” and those with fingers held still, like “I” and “you.” In first-time users, the system was 88 percent accurate for both ASL and ISL.
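Matching a gesture to a database can be sketched as a nearest-neighbor lookup: compare incoming motion features against stored templates and pick the closest sign. The feature vectors and sign entries here are invented for illustration; the actual system likely uses richer features and a learned model:

```python
import math

# Hypothetical sign "templates": one feature vector per sign in the
# database (a real system would store 100 ASL and ISL signs).
templates = {
    "want":  [0.9, 0.1, 0.0],
    "dance": [0.2, 0.8, 0.3],
    "i":     [0.0, 0.0, 1.0],
}

def classify(features):
    """Match an incoming gesture to the closest sign in the database."""
    return min(
        templates,
        key=lambda sign: math.dist(features, templates[sign]),
    )

print(classify([0.85, 0.15, 0.05]))  # nearest template is "want"
```

The same lookup handles both signs in motion and held-still signs, as long as each has a distinctive feature vector.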
To make sure that conversations flow naturally, the team added an AI to track conversations and predict what word comes next. In tests, the system autocompleted simple phrases, like “family want beautiful animal.”
While still experimental, the rings could also translate between sign languages. Because the AI learns from gestures alone, with enough training data, it could eventually turn into a kind of Google Translate for signing.
But finger gestures fail to capture the full spectrum of sign language. Facial expressions, mouth movements, shoulder and body posture, speed, and rhythm all carry critical information, including meaning and emotion. Without this context, the system could easily miscommunicate intent. Some efforts are now returning to older video-based systems to better capture the entire signing experience, this time with sleeker hardware and far more processing power.
The team thinks the rings might be useful elsewhere too, like for use in virtual or augmented reality, touchless computer interfaces, and tracking hand movements in rehabilitation.






