Soon, we can expect to build quantum computers consisting of 50 to 100 qubits. These devices will be large enough that, by consensus, their behavior will be difficult to simulate classically. While an error-corrected, fault-tolerant quantum computer promises great speedups on a range of important problems, even the best known error correction methods to date will be unable to reduce the noise on these near-term devices to the desired level. An outstanding challenge, then, is to come up with an application that can be reliably run on these noisy intermediate-scale devices. Can we do anything that is practically useful? I will survey the existing approaches to this problem and discuss the main challenges. Despite these challenges, we have quantitative reasons to believe that near-term quantum computers will be a powerful tool for studying material properties of strongly correlated quantum many-body systems, a task that lies outside existing computational capabilities. I will report on this recent progress and discuss its future prospects.
2003~2007 : Massachusetts Institute of Technology (B.S. in Physics and Mathematics)
2007~2013 : California Institute of Technology (Ph.D. in Physics)
2013/09 ~ 2016/01 : Perimeter Institute (Postdoctoral Scholar)
2016/01 ~ 2017/10 : IBM T. J. Watson Research Center (Postdoctoral Scholar)
2017/11 ~ : Stanford Institute for Theoretical Physics