A machine learning approach to the simulation of nonlocal quantum correlations
Bo-An Tsai1*, Yeong-Cherng Liang1
1Department of Physics and Center for Quantum Frontiers of Research & Technology (QFort), National Cheng Kung University, Tainan, Taiwan
* Presenter: Bo-An Tsai, email: L26081121@gs.ncku.edu.tw
Quantum nonlocality, as signified by the quantum violation of Bell inequalities, is an important feature of quantum theory that underpins various quantum information processing tasks.
Operationally, the violation of a Bell inequality by a joint conditional probability distribution means that the corresponding correlation cannot be simulated using shared randomness alone. In this regard, it is worth noting that, as shown by Toner and Bacon, shared randomness augmented by one bit of communication suffices to reproduce all quantum correlations originating from any two-qubit maximally entangled state. In this work, we consider the general problem of simulating, in a Bell-type setup where communication is allowed, the quantum correlations originating from a general two-qubit entangled state. To this end, we adapt machine learning techniques that have recently found applications in other simulation problems in the study of quantum nonlocality.
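As a concrete illustration of the Toner-Bacon protocol mentioned above, the following minimal sketch, written in Python with NumPy (the implementation and variable names are ours for illustration, not part of this work), estimates the singlet correlator by Monte Carlo sampling: Alice and Bob share two random unit vectors, Alice sends a single bit, and the empirical value of E(a, b) converges to -â·b̂.

```python
import numpy as np

rng = np.random.default_rng(seed=0)


def random_unit_vectors(n):
    """Draw n unit vectors uniformly distributed on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)


def toner_bacon_correlator(a_hat, b_hat, n_samples=200_000):
    """Estimate E(a, b) under the Toner-Bacon protocol.

    Alice and Bob share two random unit vectors lam1, lam2.
    Alice outputs a = -sgn(a_hat . lam1) and communicates the single
    bit c = sgn(a_hat . lam1) * sgn(a_hat . lam2); Bob then outputs
    b = sgn(b_hat . (lam1 + c * lam2)).  The resulting correlator
    reproduces the singlet value -a_hat . b_hat.
    """
    lam1 = random_unit_vectors(n_samples)
    lam2 = random_unit_vectors(n_samples)
    s1 = np.sign(lam1 @ a_hat)
    s2 = np.sign(lam2 @ a_hat)
    a = -s1                                          # Alice's +/-1 outcome
    c = s1 * s2                                      # the one communicated bit
    b = np.sign((lam1 + c[:, None] * lam2) @ b_hat)  # Bob's +/-1 outcome
    return np.mean(a * b)


# Example: measurement directions separated by an angle of 0.3 rad.
a_hat = np.array([0.0, 0.0, 1.0])
b_hat = np.array([np.sin(0.3), 0.0, np.cos(0.3)])
print(toner_bacon_correlator(a_hat, b_hat))  # Monte Carlo estimate
print(-a_hat @ b_hat)                        # quantum prediction, ~ -0.955
```

With this many samples, the estimate agrees with the quantum prediction -â·b̂ up to sampling noise, illustrating that one classical bit on top of shared randomness suffices for the maximally entangled case.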


Keywords: Quantum nonlocality, quantum correlation, machine learning, simulation