Nanoscale resistance switches are two-terminal devices featuring at least two programmable resistance states that are retained once programmed. “Nanoscale” refers to the physical dimension of a conducting filament that works as a circuit breaker/switch between the two electrodes. Among the various classes of resistance switches, my talk mainly addresses valence change memory (VCM) in transition metal oxides. The VCM effect is based on reversible phase transitions within the insulating mother matrix, which are triggered by point defect dynamics including interfacial electrochemical reactions. From this defect-dynamics viewpoint, I first address the mechanism of the VCM effect as well as the electroforming (initialization) process that breaks the initial symmetry of a pristine switch. Feasible applications of such switches to deep learning acceleration are the main focus of the second part of my talk. The examples include analog multiply-accumulate (MAC) operation with minimal time complexity using a passive array of binary resistance switches. A new algorithm for ad hoc binary resistance updates, fully satisfying a locality constraint, is also introduced. This algorithm features greedy edge-wise training of stochastic neural networks with ternary weight values (-1, 0, 1).
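To illustrate the crossbar MAC idea mentioned above, the following is a minimal sketch (not the speaker's actual design): each cell of a passive array holds one of two conductances, and applying all row voltages at once makes every column current a full dot product by Kirchhoff's current law. The conductance values and the helper `crossbar_mac` are assumptions for illustration only.

```python
# Hypothetical sketch of analog MAC on a passive crossbar of binary
# resistance switches. Conductance values below are assumed, not from
# the talk.
G_ON, G_OFF = 1e-4, 1e-7  # siemens; assumed on/off conductances

def crossbar_mac(bits, voltages):
    """Column currents of a crossbar programmed by binary `bits`.

    bits[i][j] = 1 -> low-resistance state, 0 -> high-resistance state.
    Each column current is I_j = sum_i V_i * G_ij (Kirchhoff's current
    law), i.e. one matrix-vector product per read step.
    """
    n_cols = len(bits[0])
    return [sum(v * (G_ON if row[j] else G_OFF)
                for v, row in zip(voltages, bits))
            for j in range(n_cols)]

# A 3x2 array: all row voltages are applied simultaneously, so the read
# time does not grow with the array size (the "minimal time complexity"
# claim in the abstract).
bits = [[1, 0],
        [0, 1],
        [1, 1]]
currents = crossbar_mac(bits, [0.1, 0.2, 0.3])
```

A ternary weight (-1, 0, 1), as used by the training algorithm in the talk, could then be realized by taking the difference of two such binary columns, though that pairing scheme is an assumption here.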
In addition, I briefly introduce my group's recent efforts on analog and digital neuromorphic system engineering, from building blocks to system architecture. This includes a fully reconfigurable digital neuromorphic chip (named the Neo2C chip) that is currently in the pipeline.
Doo Seok Jeong is an Associate Professor at Hanyang University. He received his BE and ME in materials science from Seoul National University in 2002 and 2005, respectively, and his PhD in materials science from RWTH Aachen University, Germany, in 2008. He worked at the Korea Institute of Science and Technology (KIST) from September 2008 to February 2018, where he was the principal investigator of the Neo2C chip development project (KIST Open Research Program). His research interests include resistance-switch-based deep learning accelerators and spiking neural networks for temporal learning, ranging from building blocks to learning algorithms.