
With the rapid advancement of artificial intelligence (AI), the importance of ultra-low-power semiconductor technologies that integrate sensing, computation, and memory into a single platform is growing. However, conventional architectures suffer from power loss and latency caused by data movement, as well as inherent limitations in memory reliability. Addressing these challenges, researchers from the School of Electrical Engineering have presented core technologies for sensor–compute–memory integrated AI semiconductors, drawing significant attention from the international research community.
Professor Sanghun Jeon’s research team presented a total of six papers at the IEEE International Electron Devices Meeting (IEDM 2025), the world’s most prestigious conference in the field of semiconductor devices, held in San Francisco, USA, from December 8 to 10. Among these, one paper was selected as a Highlight Paper and another as a Top Ranked Student Paper.
This is considered a highly significant academic accomplishment, given that a single research laboratory presented six silicon-based semiconductor device papers at IEDM, a conference known for its low acceptance rate and its rigorous academic and industrial evaluation standards.
Highlight Paper: Monolithically Integrated Photodiode–Spiking Circuit for Neuromorphic Vision with In-Sensor Feature Extraction
Top Ranked Student Paper: A Highly Reliable Ferroelectric NAND Cell with Ultra-thin IGZO Charge Trap Layer; Trap Profile Engineering for Endurance and Retention Improvement
The monolithic 3D (M3D) integrated neuromorphic vision sensor, selected as the Highlight Paper, stacks the functions of the human eye and brain within a single chip. Simply put, the sensors that detect light and the circuits that process signals like a brain are fabricated as very thin layers and stacked vertically in one chip, so that ‘seeing’ and ‘judging’ occur simultaneously.
Through this, the research team completed the world’s first “In-Sensor Spiking Convolution” platform, where AI computation technology that “sees and judges at the same time” takes place directly within the camera sensor.


Conventional vision processing requires several stages: capturing an image (sensor), converting it to digital (ADC), storing it in memory (DRAM), and then computing (CNN). The new technology eliminates this data movement, because computation happens directly within the sensor. As a result, real-time, ultra-low-power edge AI becomes possible, with significantly reduced power consumption and dramatically faster response.
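The contrast between the two pipelines can be illustrated with a minimal sketch. This is not the team's implementation; it assumes a toy frame, a hypothetical 2×2 edge-detecting kernel, and a simple threshold-to-spike rule, purely to show why in-sensor spiking convolution avoids moving full-precision frames off the sensor.

```python
# Conceptual illustration only (not the published circuit): compare a
# conventional pipeline, which reads out the whole frame before convolving,
# with in-sensor spiking convolution, where each local weighted sum is
# thresholded at the pixel array and only binary spikes leave the sensor.

def conventional_pipeline(image, kernel):
    """Read out the full frame, then convolve it off-sensor (valid padding)."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Weighted sum of the local pixel neighborhood
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def in_sensor_spiking_convolution(image, kernel, threshold):
    """Same weighted sums, but thresholded into spikes at the sensor, so no
    full-precision frame is digitized and shipped to DRAM for a CNN."""
    acc = conventional_pipeline(image, kernel)
    return [[1 if v >= threshold else 0 for v in row] for row in acc]

# A 4x4 "frame" with a bright vertical edge, and a 2x2 edge-detecting kernel
frame = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge_kernel = [[-1, 1],
               [-1, 1]]

spikes = in_sensor_spiking_convolution(frame, edge_kernel, threshold=1.5)
# Spikes fire only along the edge column: feature extraction at the sensor.
```

In the sketch, the conventional path must carry every multi-bit pixel value through the ADC and memory before any feature emerges, while the spiking path reduces the output to a sparse binary map of where the feature (here, an edge) occurs, which is the source of the power and latency savings described above.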
Based on this approach, the research team presented six core technologies at the conference covering every layer of AI semiconductors, from input to storage. Using existing semiconductor processes, they realized both neuromorphic semiconductors that operate like the brain at a fraction of the power and next-generation memory optimized for AI.
First, on the sensor side, they designed the system so that judgment occurs at the sensor stage rather than having separate components for capturing images and calculating. Consequently, power consumption decreased and response speeds increased compared to the conventional method of taking a photo and sending it to another chip for calculation.


Furthermore, in the field of memory, they implemented a next-generation NAND flash that, within the same material system, operates at lower voltages, endures more write cycles, and retains data stably even when the power is turned off. Through this, they presented a foundational technology that satisfies the high-capacity, high-reliability, and low-power memory requirements of AI.


Professor Sanghun Jeon, who led the research, stated, “This research is significant in that it demonstrates that the entire hierarchy can be integrated into a single material and process system, moving away from the existing AI semiconductor structure where sensing, computation, and storage were designed separately.” He added, “Moving forward, we plan to expand this into a next-generation AI semiconductor platform that encompasses everything from ultra-low-power Edge AI to large-scale AI memory.”
Meanwhile, this research was conducted with support from basic research projects of the Ministry of Science and ICT and the National Research Foundation of Korea, as well as the Center for Heterogeneous Integration of Extreme-scale & Property Semiconductors (CH³IPS). It was carried out in collaboration with Samsung Electronics, Kyungpook National University, and Hanyang University.