Research

Research Highlights

Prof. Yong Man Ro Elevated to IEEE Fellow, Class of 2026

<Prof. Yong Man Ro>

Professor Yong Man Ro has been elevated to the grade of IEEE Fellow for the Class of 2026, with the citation "for contributions to Human-Centered Multimodal Signal Processing." Recognized for bridging the gap between human perception and machine intelligence, Prof. Ro has established foundational frameworks in multimodal human signal analysis and developed the first human-centered personalized models for quantifying Virtual Reality (VR) quality. His standing in the global signal processing community is further evidenced by his widely cited research.


Building on this legacy of human-centric analysis, Prof. Ro is currently spearheading the future of AI through his research on Multimodal Large Language Models (MLLMs) and Multimodal AI. His lab focuses on creating AI agents capable of "Inclusive Human Multimodal AI," a vision recently recognized with an Outstanding Paper Award at ACL 2024, a top-tier AI conference. This research marks a leap toward empathetic Artificial General Intelligence (AGI) that can perceive human signals. Beyond his research, Prof. Ro continues to shape the field as an elected member of the Image, Video, and Multidimensional Signal Processing (IVMSP) Technical Committee of the IEEE Signal Processing Society, a member of the Editorial Board of IEEE Transactions on Image Processing (TIP), and a mentor to over 100 Ph.D. and M.S. graduates who are now leading innovation across academia and top-tier tech research institutes.