EE Prof. Ian Oakley’s Research Team Develops Next-Generation Wearable Interaction Technologies Using Around-Device Sensing

<(from left) Ph.D. candidate Jiwan Kim, Professor Ian Oakley and Ph.D. candidate Mingyu Han>

When will the futuristic vision of natural gesture-based interaction with computers, as seen in sci-fi films like Iron Man, become a reality? Researchers from the KAIST School of Electrical Engineering have developed AI technologies that enable natural and expressive input for wearable devices.

Professor Ian Oakley’s research team at KAIST’s School of Electrical Engineering has developed two systems: BudsID, a finger-identification system for wireless earbuds, and SonarSelect, which enables mid-air gesture input on commercial smartwatches. Both studies were presented at the ACM Conference on Human Factors in Computing Systems (CHI), the world’s premier conference in the field of human-computer interaction, held in Yokohama, Japan, from April 26 to May 1. The presentations were part of the “Earables and Hearables” and “Interaction Techniques” sessions, respectively.

BudsID uses magnetic sensing to distinguish between fingers based on magnetic field changes that occur when a user wearing a magnetic ring touches an earbud. A lightweight deep learning model identifies which finger is used, allowing different functions to be assigned to each finger, thus expanding the input expressiveness of wireless earbuds.
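
As a rough illustration of the classification step, the sketch below runs a short window of 3-axis magnetometer samples through a small 1-D convolutional network. The window length, finger set, and network shape are assumptions chosen for readability, not the published BudsID architecture.

```python
# Minimal sketch of finger identification from a magnetometer window,
# in the spirit of BudsID. All dimensions and the finger set are
# illustrative assumptions, not the authors' published model.
import torch
import torch.nn as nn

WINDOW = 50                             # assumed: (x, y, z) samples per touch
FINGERS = ["thumb", "index", "middle"]  # assumed finger set

class FingerNet(nn.Module):
    """Small 1-D CNN over the three magnetic-field channels."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),    # pool out the time axis
        )
        self.classifier = nn.Linear(32, len(FINGERS))

    def forward(self, x):               # x: (batch, 3, WINDOW)
        return self.classifier(self.features(x).squeeze(-1))

model = FingerNet().eval()
touch = torch.randn(1, 3, WINDOW)       # stand-in for a real sensor window
with torch.no_grad():
    finger = FINGERS[model(touch).argmax(dim=1).item()]
print(finger)  # untrained weights, so effectively random here
```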

<Figure 1: Overall system architecture of BudsID: a user wears a magnetic ring, and the magnetic field variations produced when a finger touches the earbud are classified by a deep learning model to identify the finger, allowing a different function to be assigned to each one and enhancing interaction expressiveness.>

This magnetic sensing system lets wireless earbud users go beyond traditional interactions such as play, pause, and call handling. By mapping a different function or input command to each finger, the system can extend interaction capabilities to augmented reality device control and beyond.
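
Once a finger label is available, routing it to an action is a simple lookup. The mapping below is purely hypothetical; the actual commands would be defined by the earbud firmware or a companion app.

```python
# Hypothetical finger-to-action dispatch; the action names are invented
# for illustration and are not part of the BudsID system.
ACTIONS = {
    "thumb": lambda: print("play / pause"),
    "index": lambda: print("next track"),
    "middle": lambda: print("volume up"),
}

def on_earbud_touch(finger: str) -> None:
    """Run the action mapped to the identified finger, if any."""
    ACTIONS.get(finger, lambda: None)()

on_earbud_touch("index")  # -> next track
```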

SonarSelect performs active sonar sensing using only the built-in microphone, speaker, and motion sensors of a commercial smartwatch. It recognizes mid-air gestures around the device, enabling precise pointer manipulation and target selection.
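
A bare-bones version of the underlying signal idea: emit a near-ultrasonic tone from the speaker and estimate finger motion from the Doppler shift of the echo at the microphone. Every parameter below (carrier frequency, sample rate, window length) is an illustrative assumption, and the real SonarSelect pipeline is considerably more sophisticated.

```python
# Toy Doppler sonar: simulate an echo from a moving finger and recover
# its radial velocity from the frequency shift. Illustrative only.
import numpy as np

FS = 48_000   # sample rate (Hz), typical for mobile audio hardware
F0 = 20_000   # assumed near-inaudible carrier (Hz)
C = 343.0     # speed of sound (m/s)
N = 8192      # analysis window length

t = np.arange(N) / FS
v_true = 0.15                       # simulated finger velocity (m/s)
f_echo = F0 * (1 + 2 * v_true / C)  # two-way Doppler-shifted echo frequency
echo = np.cos(2 * np.pi * f_echo * t) + 0.05 * np.random.randn(N)

# Find the spectral peak near the carrier and convert it back to velocity.
spectrum = np.abs(np.fft.rfft(echo * np.hanning(N)))
freqs = np.fft.rfftfreq(N, 1 / FS)
band = (freqs > F0 - 500) & (freqs < F0 + 500)
f_peak = freqs[band][np.argmax(spectrum[band])]
v_est = (f_peak / F0 - 1) * C / 2
print(f"estimated radial velocity: {v_est:.2f} m/s")  # close to v_true,
                                                      # up to FFT bin width
```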

<Figure 2: Three target selection methods using SonarSelect’s around-device movement sensing on commercial smartwatches: A) Double-Crossing, B) Dwelling, C) Pinching.>

This gesture interaction technique, which tracks finger movements via active sonar, addresses the usability problems of small smartwatch screens, where a touching finger occludes the display, and enables fine-grained 3D interaction in the space around the device.
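
Of the three confirmation methods in Figure 2, dwelling is the simplest to express in code: the pointer selects a target after hovering over it for a fixed time. The thresholds below are assumptions for readability, not the values used in the study.

```python
# Illustrative dwell-based target selection; thresholds are assumed.
import math

DWELL_TIME = 0.8      # seconds the pointer must stay on target (assumed)
TARGET_RADIUS = 20.0  # target radius in screen pixels (assumed)

def dwell_select(pointer_track, target, dt=0.05):
    """Return True once the pointer dwells inside `target` for DWELL_TIME.

    pointer_track: iterable of (x, y) samples taken every `dt` seconds.
    target: (x, y) centre of the on-screen target.
    """
    held = 0.0
    for x, y in pointer_track:
        if math.hypot(x - target[0], y - target[1]) <= TARGET_RADIUS:
            held += dt
            if held >= DWELL_TIME:
                return True
        else:
            held = 0.0  # leaving the target resets the dwell timer
    return False

track = [(100, 100)] * 20               # pointer hovers for 1 second
print(dwell_select(track, (100, 100)))  # -> True
```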

Jiwan Kim, first author of both papers, said, “We hope our research into around-device sensing for wearable interaction technologies will help shape the future of how people interact with wearable computing devices.”

The team has released both systems as open source, allowing researchers and industry professionals to use the technology freely.

[BudsID]

[SonarSelect]

This research was supported by the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (grants 2023R1A2C1004046, RS-2024-00407732) and the Institute for Information & Communications Technology Planning & Evaluation (IITP) under the University ICT Research Center (ITRC) support program (IITP-2024-RS-2024-00436398).