We strive to be the world’s top IT powerhouse.
Our mission is to lead innovations in information technology, create lasting impact, and educate next-generation leaders of the world.
AI and machine learning are a key thrust in EE research.
AI/machine learning efforts are already a big part of ongoing research in all six divisions of KAIST EE: Computer, Communication, Signal, Wave, Circuit, and Device.
Highlights
EE Professor Minsoo Rhu Appointed as First Asian Program Chair for MICRO
On the 5th, KAIST announced that Minsoo Rhu, Professor in the Department of Electrical Engineering, has been appointed as Program Co-Chair for the IEEE/ACM International Symposium on Microarchitecture (MICRO) scheduled to be held next year. This marks the first time in MICRO’s 57-year history that a faculty member from an Asian university has been selected as Program Chair.
Celebrating its 57th edition this year, MICRO is the oldest and most prestigious international conference in the field of computer architecture. Alongside ISCA and HPCA, it is regarded as one of the top three international conferences in computer architecture. Scholars and industry professionals from around the world participate in MICRO, with fewer than 20% of submitted papers being selected for final presentation.
Professor Rhu was appointed Program Chair of the 58th MICRO conference, set to be held next year, in recognition of his contributions to the field of computer architecture. He will serve as Program Co-Chair alongside Professor Radu Teodorescu of Ohio State University, overseeing the selection of around 300 expert members of the Program Committee and supervising the review of over 500 submitted papers.
Professor Rhu is recognized as a next-generation leader in the fields of intelligent semiconductors and computer systems for artificial intelligence (AI). His expertise is reflected in his induction into the Hall of Fame of major conferences, including HPCA in 2021, MICRO in 2022, and ISCA this year.
Professor Rhu completed his undergraduate studies in electronic engineering at Sogang University, obtained his master’s degree in electrical engineering from KAIST, and earned his Ph.D. in computer science from the University of Texas at Austin. From 2014 to 2017, he worked at NVIDIA Research, and since 2018, he has been a professor at KAIST. He also served as a visiting researcher at Meta AI from 2022 to last year.
His research has been widely recognized by academia, receiving the Best Paper Award at HPCA this year, the Google Faculty Research Award last year, and the Facebook Faculty Research Award in 2020. Last year, he was also inducted as a member of Y-KAST, an elite group of young scientists under 43 recognized for their outstanding contributions to science by the Korean Academy of Science and Technology.
Professor Rhu commented, “While maintaining MICRO’s tradition of selecting only the highest-quality papers that lead academia and industry, I will also strive to create a program that comprehensively reflects emerging research in computer hardware and software.”
Dr. Jaehyup Lee (Advisor: Munchurl Kim) Appointed as an Assistant Professor at the School of Computer Science, Kyungpook National Univ.
[Prof. Jaehyup Lee]
Dr. Jaehyup Lee (Advisor: Munchurl Kim), a graduate of the School of EE, was appointed as an assistant professor at the School of Computer Science, Kyungpook National University, on September 1, 2024.
Dr. Jaehyup Lee received his Bachelor’s, Master’s, and Ph.D. degrees, all from the School of Electrical Engineering. During his Ph.D. studies, he actively conducted computer vision research and published several key papers in internationally renowned academic journals and conferences such as IEEE TCSVT/RGSL and CVPR.
After receiving his Ph.D. degree in February 2024, he has been with the Samsung Advanced Institute of Technology, Samsung Electronics.
His main research is in the fields of image generation, translation, restoration and quality enhancement using generative AI.
Professor Seunghyup Yoo has been appointed as the 31st Head of the Department of Electrical Engineering
Professor Seunghyup Yoo has been appointed as the Head of the Department of Electrical Engineering as of September 1, 2024.
He will lead the department until August 31, 2029.
We would also like to thank Professor Joonhyuk Kang for his dedication to the department.
EE Professor Dongsu Han’s Research Team Develops Technology to Accelerate AI Model Training in Distributed Environments Using Consumer-Grade GPUs
<(from left) Professor Dongsu Han, Dr. Hwijoon Iim, Ph.D. Candidate Juncheol Ye>
Professor Dongsu Han’s research team of the KAIST Department of Electrical Engineering has developed a groundbreaking technology that accelerates AI model training in distributed environments with limited network bandwidth using consumer-grade GPUs.
Training the latest AI models typically requires expensive infrastructure, such as high-performance GPUs costing tens of millions of won and high-speed dedicated networks.
As a result, most researchers in academia and small to medium-sized enterprises have to rely on cheaper, consumer-grade GPUs for model training.
However, they face difficulties in efficient model training due to network bandwidth limitations.
<Figure 1. Problems in Conventional Low-Cost Distributed Deep Learning Environments>
To address these issues, Professor Han’s team developed a distributed learning framework called StellaTrain.
StellaTrain accelerates model training on low-cost GPUs by integrating a pipeline that utilizes both CPUs and GPUs. It dynamically adjusts batch sizes and compression rates according to the network environment, enabling fast model training in multi-cluster and multi-node environments without the need for high-speed dedicated networks.
To maximize GPU utilization, StellaTrain optimizes the training pipeline by offloading gradient compression and optimizer computation to the CPU. The team developed and applied a new sparse optimization technique and a cache-aware gradient compression technique that run efficiently on CPUs.
This implementation creates a seamless learning pipeline where CPU tasks overlap with GPU computations. Furthermore, dynamic optimization technology adjusts batch sizes and compression rates in real-time according to network conditions, achieving high GPU utilization even in limited network environments.
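To make the two ideas above concrete, here is a minimal sketch, written under our own assumptions rather than taken from the paper: it shows how CPU-side gradient compression can be overlapped with GPU computation, and how batch size and compression ratio might be tied to measured bandwidth. The helpers cpu_sparsify, choose_schedule, and train_step are hypothetical stand-ins, not StellaTrain APIs.

```python
# Minimal sketch (not the StellaTrain code) of:
# (1) overlapping GPU back-propagation with CPU-side gradient compression, and
# (2) picking batch size / compression ratio from the measured bandwidth.

import torch
from concurrent.futures import ThreadPoolExecutor

def cpu_sparsify(grad: torch.Tensor, keep_ratio: float):
    """Top-k sparsification on the CPU: keep only the largest-magnitude
    gradient entries, returning (indices, values) to send over the network."""
    flat = grad.flatten()
    k = max(1, int(flat.numel() * keep_ratio))
    _, indices = torch.topk(flat.abs(), k)
    return indices, flat[indices]

def choose_schedule(bandwidth_mbps: float) -> dict:
    """Toy policy: slower links get larger local batches (fewer syncs)
    and stronger compression (smaller keep ratio)."""
    if bandwidth_mbps < 100:
        return {"batch_size": 256, "keep_ratio": 0.01}
    return {"batch_size": 64, "keep_ratio": 0.1}

def train_step(model, loss_fn, batch, keep_ratio, cpu_pool: ThreadPoolExecutor):
    """One step: the GPU computes gradients, then hands them to CPU worker
    threads for compression so the GPU can start the next micro-batch."""
    loss = loss_fn(model(batch["x"]), batch["y"])
    loss.backward()                                   # GPU compute
    futures = []
    for p in model.parameters():
        if p.grad is None:
            continue
        g = p.grad.detach().to("cpu", non_blocking=True)
        futures.append(cpu_pool.submit(cpu_sparsify, g, keep_ratio))
    return loss, futures   # compression resolves off the GPU's critical path
```

The design point illustrated here is that compression and optimizer arithmetic no longer stall the GPU, so communication cost can be hidden behind the next step’s computation even on slow links.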
<Figure 2. Overview of the StellaTrain Learning Pipeline>
Through these innovations, StellaTrain significantly improves the speed of distributed model training in low-cost multi-cloud environments, achieving up to 104 times performance improvement compared to the existing PyTorch DDP.
Professor Han’s research team has paved the way for efficient AI model training without the need for expensive data center-grade GPUs and high-speed networks. This breakthrough is expected to greatly aid AI research and development in resource-constrained environments, such as academia and small to medium-sized enterprises.
Professor Han emphasized, “KAIST is demonstrating leadership in the AI systems field in South Korea.” He added, “We will continue active research to implement large-scale language model (LLM) training, previously considered the domain of major IT companies, in more affordable computing environments. We hope this research will serve as a critical stepping stone toward that goal.”
The research team included Dr. Hwijoon Iim and Ph.D. candidate Juncheol Ye from KAIST, as well as Professor Sangeetha Abdu Jyothi from UC Irvine. The findings were presented at ACM SIGCOMM 2024, the premier international conference in the field of computer networking, held from August 4 to 8 in Sydney, Australia (Paper title: Accelerating Model Training in Multi-cluster Environments with Consumer-grade GPUs).
Meanwhile, Professor Han’s team has also made continuous research advancements in the AI systems field, presenting a framework called ES-MoE, which accelerates Mixture of Experts (MoE) model training, at ICML 2024 in Vienna, Austria.
By overcoming GPU memory limitations, they significantly enhanced the scalability and efficiency of large-scale MoE model training, enabling fine-tuning of a 15-billion parameter language model using only four GPUs. This achievement opens up the possibility of effectively training large-scale AI models with limited computing resources.
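As a rough illustration of the expert-offloading idea, the sketch below keeps expert weights in host (CPU) memory and streams only the experts a batch actually routes to onto the GPU. This is a simplified sketch under our own assumptions, not the ES-MoE implementation; OffloadedMoELayer, the top-1 router, and the layer sizes are hypothetical, and a real system would also stream gradients and optimizer state while overlapping copies with compute.

```python
# Conceptual sketch (not ES-MoE): experts live on the CPU so GPU memory no
# longer limits how many experts the model can have.

import torch
import torch.nn as nn

class OffloadedMoELayer(nn.Module):
    def __init__(self, num_experts: int, d_model: int, d_ff: int):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [tokens, d_model]; the small router runs on the same device as x,
        # while the (much larger) experts stay on the CPU until needed.
        top1 = self.router.to(x.device)(x).argmax(dim=-1)
        out = torch.zeros_like(x)
        for eid in top1.unique().tolist():
            mask = top1 == eid
            expert = self.experts[eid].to(x.device)   # stream this expert in
            out[mask] = expert(x[mask])
            self.experts[eid].to("cpu")               # free GPU memory again
        return out
```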
<Figure 3. Overview of the ES-MoE Framework>
<Figure 4. Professor Dongsu Han’s research team has enabled AI model training in low-cost computing environments, even with limited or no high-performance GPUs, through their research on StellaTrain and ES-MoE.>
Dr. CheolJun Park appointed as an assistant professor at Kyung Hee University
KAIST EE graduate (BS, MS, and PhD) Dr. CheolJun Park from Prof. Yongdae Kim’s lab will join the School of Computing at Kyung Hee University as an Assistant Professor in the Fall of 2024.
Dr. CheolJun Park earned his Ph.D. in 2024 with a dissertation titled “A study on dynamic method for finding implementation vulnerabilities in cellular baseband.”
During his degree, he worked on over-the-air security testing and reported several security vulnerabilities in cellular modems from companies such as Qualcomm, Samsung, and Google, which allow an attacker to eavesdrop on and manipulate data traffic, spoof a smartphone’s time, or trigger memory crashes. In addition to his research, Dr. Park interned with Qualcomm’s wireless security team. After receiving his Ph.D. degree in February 2024, he has continued his research at Prof. Yongdae Kim’s lab as a postdoctoral researcher.
He will continue his research in the fields of cellular and wireless security.
Please give Dr. CheolJun Park warm encouragement and congratulations.
A new faculty member, Professor Insu Han, appointed to our School
Professor Insu Han will be appointed to our School as of September 1st, 2024.
Congratulations on your appointment.
Professor Yong Man Ro’s Research Team Wins Outstanding Paper Award at Top-Tier AI Conference (ACL 2024)
<(from left) Ph.D. candidate Se Jin Park, Ph.D. candidate Chae Won Kim>
Ph.D. students Se Jin Park and Chae Won Kim from Professor Yong Man Ro’s research team in the School of Electrical Engineering at KAIST have won the Outstanding Paper Award at the ACL (Association for Computational Linguistics) 2024 conference, held in Bangkok.
ACL is recognized as the world’s leading conference in the field of Natural Language Processing (NLP) and is one of the top-tier international conferences in Artificial Intelligence (AI).
Their award-winning paper, titled “Let’s Go Real Talk: Spoken Dialogue Model for Face-to-Face Conversation,” introduces an innovative model designed to make interactions between humans and AI more natural and human-like.
Unlike traditional text-based or speech-based dialogue models, this research developed a Human Multimodal LLM (Large Language Model) that enables AI to comprehend both visual cues and vocal signals from humans. Additionally, it allows the AI to engage in conversations using human-like facial expressions and speech.
This breakthrough opens up new possibilities for improving the intuitiveness and effectiveness of human-AI interactions by simultaneously processing visual and auditory signals during conversations.
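As a purely illustrative sketch of this general idea, and not the authors’ architecture, the snippet below shows how speech and face-video features might be projected into a language model’s embedding space and prepended to the text tokens so the model attends to all modalities at once. The encoders, projection sizes, and the MultimodalDialogueModel class are all hypothetical.

```python
# Hypothetical sketch of a multimodal spoken-dialogue LLM front end
# (not the award-winning model's implementation).

import torch
import torch.nn as nn

class MultimodalDialogueModel(nn.Module):
    def __init__(self, llm: nn.Module, d_audio=512, d_video=768, d_llm=4096):
        super().__init__()
        self.llm = llm                               # any decoder-only LM that accepts input embeddings
        self.audio_proj = nn.Linear(d_audio, d_llm)  # speech features -> LLM space
        self.video_proj = nn.Linear(d_video, d_llm)  # face-video features -> LLM space

    def forward(self, audio_feats, video_feats, text_embeds):
        # audio_feats: [B, Ta, d_audio], video_feats: [B, Tv, d_video],
        # text_embeds: [B, Tt, d_llm]; concatenate along the sequence axis.
        prefix = torch.cat([self.audio_proj(audio_feats),
                            self.video_proj(video_feats)], dim=1)
        return self.llm(torch.cat([prefix, text_embeds], dim=1))
```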
Professor Yong Man Ro stated, “This research marks a significant advancement in human-AI interaction, and we hope this technology will be widely applied in various real-world applications.
This award is yet another example of the international recognition of the excellence of AI research at KAIST’s School of Electrical Engineering.”
Professor Insu Yun’s Lab (as Part of Team Atlanta) Advances to the Finals of the U.S. DARPA ‘AI Cyber Challenge (AIxCC)’ and Secures $2 Million in Research Funding
<Professor Insu Yun>