Home

School of Electrical Engineering

We strive to be the world’s top IT powerhouse.

Our mission is to lead innovations in information technology, create lasting impact, and educate next-generation leaders of the world.
AI in EE

AI and machine learning are a key thrust in EE research.

AI/machine learning efforts are already a big part of ongoing research in all six divisions of KAIST EE: Computer, Communication, Signal, Wave, Circuit, and Device.

Highlights


Master’s student Jimin Lee from Professor Hyeon-Min Bae’s lab wins the Poster Excellence Award at the fNIRS 2024 Conference

 


<From left to right: Master’s student Jimin Lee, Ph.D. students Seongkwon Yu and Bumjun Koh, and Master’s graduate Yuqing Liang>

 

Jimin Lee, a master’s student in Professor Hyeon-Min Bae’s lab, was awarded the prestigious Poster Excellence Award at the fNIRS 2024 conference, held from September 11 to 15 at the University of Birmingham, UK.

 

Now in its 7th edition, fNIRS is a biennial international conference that brings together basic and clinical scientists focused on understanding the functional properties of biological tissues, including the brain.

 

The award-winning research poster, titled “Fiber-less Speckle Contrast Optical Spectroscopy System Using a Multi-Hole Aperture Method,” was a collaborative project involving Jimin Lee, Ph.D. students Seongkwon Yu and Bumjun Koh, and Master’s graduate Yuqing Liang.

 

This research was recognized by the fNIRS 2024 Program Committee for its excellence, earning the Poster Excellence Award, which is part of the Scientific Excellence Awards.

 

The award is given to master’s, Ph.D., and postdoctoral researchers who deliver outstanding posters or presentations, chosen from among the 350 posters presented at the conference.

 



[KAIST EE’s Insue Won (M.S., graduated Aug. 2024), Jeoungmin Ji (Ph.D. candidate), and Donggyun Lee from Prof. Seunghyup Yoo’s lab awarded at the 2024 International Meeting on Information Display (IMID)]


<(From left) M.S. graduate Insue Won, Ph.D. candidate Jeoungmin Ji>

 

Insue Won (M.S., graduated Aug. 2024) and Jeoungmin Ji (Ph.D. candidate), both advised by Prof. Seunghyup Yoo, won the Best Poster Paper Award at the 2024 International Meeting on Information Display (IMID) for their work entitled “Temperature-Dependent Dynamics of Triplet Excitons in MR-TADF OLEDs: Insights from Magneto-Electroluminescence Analysis.”

 

In addition, Dr. Donggyun Lee (Ph.D., graduated Feb. 2024) won the “Kim Yong Bae Award Grand Prize” at IMID for his work on stretchable OLED displays.

 

The International Meeting on Information Display (IMID) is one of the world’s two largest international conferences in the field of display technology, held annually during the summer.

This year, the conference took place from August 20 to 23 at the Jeju Convention Center (ICC Jeju).

 

Ph.D. candidate Jeoungmin Ji presented a poster titled “Temperature-Dependent Dynamics of Triplet Excitons in MR-TADF OLEDs: Insights from Magneto-Electroluminescence Analysis,” based on work conducted in collaboration with Samsung Display and supported by the Technology Innovation Program funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).

 

Additionally, Dr. Donggyun Lee was awarded the prestigious ‘Kim Yong Bae Award Grand Prize,’ which is presented annually at IMID to one graduate who submits an outstanding thesis in the field of display technology.

 

<Best Poster Award>

<Dr. Lee being awarded the ‘Kim Yong Bae Award Grand Prize’ at IMID 2024>


Dr. Sangmin Lee (Advisor: Yong Man Ro) has been appointed as an Assistant Professor in the Department of Data Science at Sungkyunkwan University

 

 

[Prof. Sangmin Lee]

 

Dr. Sangmin Lee, a 2023 graduate of the School of Electrical Engineering (advised by Professor Yong Man Ro), has been appointed as an Assistant Professor in the Department of Data Science at Sungkyunkwan University, Seoul, effective September 1, 2024.

 

Dr. Lee received his Ph.D. with the dissertation titled “Associative Learning for Multimodal Representation under Ambiguous Pair Problems.” During his doctoral studies, he published 19 papers in top-tier conferences and journals such as CVPR, ECCV, and IEEE Transactions on Image Processing, focusing on multimodal learning and memory learning.

 

Following his Ph.D., he pursued postdoctoral research at the University of Illinois Urbana-Champaign (UIUC), USA. During his postdoc, he presented his research, “Modeling Multimodal Social Interactions: New Challenges and Baselines with Densely Aligned Representations,” as an oral presentation at CVPR 2024.

 

Dr. Lee’s current research interests lie in multimodal learning and social artificial intelligence.

 

 


Professor In-So Kweon Selected as Recipient of the 38th Inchon Prize in the Science and Technology Category

 

<Professor In-So Kweon>
 

Professor In-So Kweon was selected as the recipient of the 38th Inchon Prize in the Science and Technology category, hosted by the Inchon Memorial Foundation and Dong-A Ilbo, on the 9th of this month.

 

The Inchon Memorial Foundation and the Dong-A Ilbo established the Inchon Prize in 1987 to honor the legacy of Inchon Seong-su Kim, who founded the Dong-A Ilbo and Gyeongseong Textile during the period of Japanese colonial rule and nurtured talents through institutions such as Choongang School and Posung College (now Korea University).

 

The award recognizes individuals and institutions with outstanding achievements in four categories: Education, Journalism and Culture, Humanities and Social Sciences, and Science and Technology.

 

The award ceremony is scheduled to take place on October 11, and the recipients will receive a cash prize of 100 million won and a medal.

EE Professor Minsoo Rhu, Appointed as First Asian Program Chair for MICRO


<Professor Minsoo Rhu>
 

On the 5th, KAIST announced that Minsoo Rhu, Professor in the Department of Electrical Engineering, has been appointed as Program Co-Chair for the IEEE/ACM International Symposium on Microarchitecture (MICRO) scheduled to be held next year. This marks the first time in MICRO’s 57-year history that a faculty member from an Asian university has been selected as Program Chair.

 

Celebrating its 57th edition this year, MICRO is the oldest and most prestigious international conference in the field of computer architecture. Alongside ISCA and HPCA, it is regarded as one of the top three international conferences in computer architecture. Scholars and industry professionals from around the world participate in MICRO, with fewer than 20% of submitted papers being selected for final presentation.

 

Professor Rhu was appointed Program Chair of the 58th MICRO conference, set to be held next year, in recognition of his contributions to the field of computer architecture. He will serve as Program Co-Chair alongside Professor Radu Teodorescu of Ohio State University, overseeing the selection of around 300 expert members of the Program Committee and supervising the review of over 500 submitted papers.

 

Professor Rhu is recognized as a next-generation leader in the fields of intelligent semiconductors and computer systems for artificial intelligence (AI). His expertise is reflected in his induction into the Hall of Fame of major conferences, including HPCA in 2021, MICRO in 2022, and ISCA this year.

 

Professor Rhu completed his undergraduate studies in electronic engineering at Sogang University, obtained his master’s degree in electrical engineering from KAIST, and earned his Ph.D. in computer science from the University of Texas at Austin. From 2014 to 2017, he worked at NVIDIA Research, and since 2018, he has been a professor at KAIST. He also served as a visiting researcher at Meta AI from 2022 to last year.

 

His research has been widely recognized by academia, receiving the Best Paper Award at HPCA this year, the Google Faculty Research Award last year, and the Facebook Faculty Research Award in 2020. Last year, he was also inducted as a member of Y-KAST, an elite group of young scientists under 43 recognized for their outstanding contributions to science by the Korean Academy of Science and Technology.

 

Professor Rhu commented, “While maintaining MICRO’s tradition of selecting only the highest-quality papers that lead academia and industry, I will also strive to create a program that comprehensively reflects emerging research in computer hardware and software.”


Dr. Jaehyup Lee (Advisor: Munchurl Kim) Appointed as an Assistant Professor at the School of Computer Science, Kyungpook National Univ.

 

 

[Prof. Jaehyup Lee]

 

Dr. Jaehyup Lee (Advisor: Munchurl Kim), a graduate of the School of Electrical Engineering, was appointed as an assistant professor at the School of Computer Science, Kyungpook National University, on September 1, 2024.

 

Dr. Jaehyup Lee received his Bachelor’s, Master’s, and Ph.D. degrees, all from the School of Electrical Engineering. During his Ph.D. studies, he actively conducted computer vision research and published several key papers in internationally renowned academic journals and conferences such as IEEE TCSVT/RGSL and CVPR.

 

After receiving his Ph.D. degree in February 2024, he has been with the Samsung Advanced Institute of Technology, Samsung Electronics.

 

His main research lies in the fields of image generation, translation, restoration, and quality enhancement using generative AI.

 


EE Professor Dongsu Han’s Research Team Develops Technology to Accelerate AI Model Training in Distributed Environments Using Consumer-Grade GPUs


<(from left) Professor Dongsu Han, Dr. Hwijoon Iim, Ph.D. Candidate Juncheol Ye>

 

Professor Dongsu Han’s research team of the KAIST Department of Electrical Engineering has developed a groundbreaking technology that accelerates AI model training in distributed environments with limited network bandwidth using consumer-grade GPUs.

 

Training the latest AI models typically requires expensive infrastructure, such as high-performance GPUs costing tens of millions of won and high-speed dedicated networks.

As a result, most researchers in academia and small to medium-sized enterprises have to rely on cheaper, consumer-grade GPUs for model training, where limited network bandwidth makes efficient training difficult.

 


<Figure 1. Problems in Conventional Low-Cost Distributed Deep Learning Environments>

 

To address these issues, Professor Han’s team developed a distributed learning framework called StellaTrain.

StellaTrain accelerates model training on low-cost GPUs by integrating a pipeline that utilizes both CPUs and GPUs. It dynamically adjusts batch sizes and compression rates according to the network environment, enabling fast model training in multi-cluster and multi-node environments without the need for high-speed dedicated networks.

 

StellaTrain maximizes GPU utilization by offloading gradient compression and optimization to the CPU, keeping the learning pipeline fully occupied. The team developed and applied a new sparse optimization technique and a cache-aware gradient compression technology that run efficiently on CPUs.
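The CPU-side compression idea can be sketched as follows. This is a minimal, pure-Python illustration of top-k gradient sparsification in general, not StellaTrain's actual code; the function names and the list-based gradient representation are assumptions for illustration.

```python
def compress_topk(grad, ratio):
    """Keep only the top `ratio` fraction of gradient entries by magnitude,
    returning (index, value) pairs -- far fewer bytes to send over a slow link."""
    k = max(1, int(len(grad) * ratio))
    # Indices of the k largest-magnitude entries.
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    return [(i, grad[i]) for i in sorted(idx)]

def decompress(pairs, size):
    """Rebuild a dense gradient vector; dropped entries become zero."""
    dense = [0.0] * size
    for i, v in pairs:
        dense[i] = v
    return dense

grad = [0.1, -2.0, 0.01, 3.5, -0.3, 0.0]
sparse = compress_topk(grad, ratio=0.5)    # keeps 3 of 6 entries
restored = decompress(sparse, len(grad))   # [0.0, -2.0, 0.0, 3.5, -0.3, 0.0]
```

In practical sparsified training, the dropped residual is typically accumulated locally and added back to the next step's gradient so that no update is permanently lost.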

 

This implementation creates a seamless learning pipeline where CPU tasks overlap with GPU computations. Furthermore, dynamic optimization technology adjusts batch sizes and compression rates in real-time according to network conditions, achieving high GPU utilization even in limited network environments.
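The dynamic-adjustment idea can be illustrated with a simple cost model: choose the largest compression keep-ratio whose transfer time still hides behind one GPU compute step, so communication stays overlapped. The function name, parameters, and the linear cost model below are illustrative assumptions, not StellaTrain's published interface.

```python
def choose_keep_ratio(grad_bytes, bandwidth_bps, gpu_step_s, min_ratio=0.01):
    """Largest fraction of the gradient we can send while the GPU computes.

    Transfer time of the kept fraction must not exceed one GPU step:
        (grad_bytes * ratio) / (bandwidth_bps / 8) <= gpu_step_s
    """
    budget_bytes = bandwidth_bps / 8 * gpu_step_s   # bytes sendable per step
    ratio = budget_bytes / grad_bytes
    return max(min_ratio, min(1.0, ratio))          # clamp to [min_ratio, 1]

# Example: 400 MB of gradients, a 100 Mbps link, and a 0.5 s GPU step
# allow only ~1.6% of the entries to be sent without stalling the pipeline.
ratio = choose_keep_ratio(400e6, 100e6, 0.5)        # 0.015625
```

On a faster link, or with a longer GPU step, the same formula returns a larger ratio so less information is dropped; this is the sense in which the compression rate tracks the network environment in real time.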

 


<Figure 2. Overview of the StellaTrain Learning Pipeline>

 

Through these innovations, StellaTrain significantly improves the speed of distributed model training in low-cost multi-cloud environments, achieving up to 104 times performance improvement compared to the existing PyTorch DDP.

 

Professor Han’s research team has paved the way for efficient AI model training without the need for expensive data center-grade GPUs and high-speed networks. This breakthrough is expected to greatly aid AI research and development in resource-constrained environments, such as academia and small to medium-sized enterprises.

 

Professor Han emphasized, “KAIST is demonstrating leadership in the AI systems field in South Korea.” He added, “We will continue active research to implement large-scale language model (LLM) training, previously considered the domain of major IT companies, in more affordable computing environments. We hope this research will serve as a critical stepping stone toward that goal.”

 

The research team included Dr. Hwijoon Iim and Ph.D. candidate Juncheol Ye from KAIST, as well as Professor Sangeetha Abdu Jyothi from UC Irvine. The findings were presented at ACM SIGCOMM 2024, the premier international conference in the field of computer networking, held from August 4 to 8 in Sydney, Australia (Paper title: Accelerating Model Training in Multi-cluster Environments with Consumer-grade GPUs). 

 

Meanwhile, Professor Han’s team has also made continuous research advancements in the AI systems field, presenting a framework called ES-MoE, which accelerates Mixture of Experts (MoE) model training, at ICML 2024 in Vienna, Austria.

 

By overcoming GPU memory limitations, they significantly enhanced the scalability and efficiency of large-scale MoE model training, enabling fine-tuning of a 15-billion parameter language model using only four GPUs. This achievement opens up the possibility of effectively training large-scale AI models with limited computing resources.

 


<Figure 3. Overview of the ES-MoE Framework>

 


<Figure 4. Professor Dongsu Han’s research team has enabled AI model training in low-cost computing environments, even with limited or no high-performance GPUs, through their research on StellaTrain and ES-MoE.>

 

 

 

 

 
