AI in EE

AI IN DIVISIONS

AI in Communication Division

Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control, IEEE International Conference on Communications (ICC) 2024 (Prof. Junil Choi's Research Group)

Title: Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control

Conference: IEEE International Conference on Communications (ICC) 2024

Abstract: In this work, we compare emergent communication (EC) built upon multi-agent deep reinforcement learning (MADRL) and language-oriented semantic communication (LSC) empowered by a pre-trained large language model (LLM) using human language. In a multi-agent remote navigation task, with multimodal input data comprising location and channel maps, it is shown that EC incurs high training cost and struggles when using multimodal data, whereas LSC yields high inference computing cost due to the LLM’s large size. To address their respective bottlenecks, we propose a novel framework of language-guided EC (LEC) by guiding the EC training using LSC via knowledge distillation (KD). Simulations corroborate that LEC achieves faster travel time while avoiding areas with poor channel conditions, as well as speeding up the MADRL training convergence by up to 61.8% compared to EC.
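The core idea of LEC is a distillation objective that pulls the emergent messages of the MADRL agents toward the LLM-based LSC teacher. Below is a minimal, hypothetical sketch of such a loss in PyTorch; the names (ec_logits, teacher_logits, alpha, T) and the exact combination of task loss and KD term are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: knowledge distillation from an LSC teacher to an EC student.
# Assumes both produce logits over a shared message vocabulary; the paper's exact
# formulation may differ.
import torch
import torch.nn.functional as F


def lec_loss(ec_logits: torch.Tensor,
             teacher_logits: torch.Tensor,
             task_loss: torch.Tensor,
             alpha: float = 0.5,
             T: float = 2.0) -> torch.Tensor:
    """Combine the MADRL task loss with a KD term that pulls the emergent
    message distribution toward the LSC teacher's message distribution."""
    # Temperature-scaled student log-probabilities and teacher probabilities.
    student_log_probs = F.log_softmax(ec_logits / T, dim=-1)
    teacher_probs = F.softmax(teacher_logits / T, dim=-1)
    # Standard KD term: KL divergence, rescaled by T^2 to keep gradient magnitudes comparable.
    kd_term = F.kl_div(student_log_probs, teacher_probs,
                       reduction="batchmean") * (T * T)
    # Weighted sum of the reinforcement-learning task loss and the distillation term.
    return (1.0 - alpha) * task_loss + alpha * kd_term
```

In this reading, alpha trades off task-driven message learning against imitation of the language-grounded teacher, which is consistent with the reported effect of faster MADRL convergence relative to plain EC.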

Main Figure:
