The physical limit of quantum optics resolves a mystery of computational complexity | 6/11/2018 | Staff
Linear optics is one of the best platforms for demonstrating quantum physics: it works at room temperature and can be observed with relatively simple devices. Linear optics involves physical processes that conserve the total number of photons. In the ideal case, if there are 100 photons at the beginning, then no matter how complicated the physical process is, exactly 100 photons remain at the end.

Photons are non-interacting bosonic particles. Nevertheless, they can still interfere with one another, exhibiting non-trivial quantum effects. A typical example is the Hong-Ou-Mandel experiment, in which two identical photons are sent into an experimental device. After a simple linear transformation, the two photons appear as if they are stuck together and unwilling to separate. In addition to providing a foundational understanding of quantum mechanics, the study of linear optics has also led to many scientific applications.
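The Hong-Ou-Mandel effect can be reproduced on paper with a standard textbook calculation: Fock-state transition amplitudes in linear optics are matrix permanents of submatrices of the mode transformation. Below is a minimal sketch (function names are my own, not from the article) that sends one photon into each port of a 50:50 beamsplitter and computes the output photon-number distribution; the coincidence outcome (1, 1) comes out with probability zero, which is the "photons stick together" signature.

```python
import itertools
import math
import numpy as np

def permanent(M):
    """Naive O(n!) matrix permanent -- fine for tiny matrices."""
    n = M.shape[0]
    return sum(math.prod(M[i, p[i]] for i in range(n))
               for p in itertools.permutations(range(n)))

def output_probability(U, in_counts, out_counts):
    """Probability that photons with occupation `in_counts` exit with
    occupation `out_counts` under the mode transformation U.
    Uses the standard permanent formula for Fock-state amplitudes."""
    rows = [i for i, c in enumerate(out_counts) for _ in range(c)]
    cols = [j for j, c in enumerate(in_counts) for _ in range(c)]
    sub = U[np.ix_(rows, cols)]            # rows/cols repeated by occupation
    norm = math.prod(math.factorial(c) for c in in_counts + out_counts)
    amplitude = permanent(sub) / math.sqrt(norm)
    return abs(amplitude) ** 2

# 50:50 beamsplitter acting on two modes, one photon in each input port
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
probs = {out: output_probability(U, (1, 1), out)
         for out in [(2, 0), (1, 1), (0, 2)]}
```

Running this gives probability 1/2 each for both photons bunching into the same output port, and exactly 0 for them coming out separately.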


In recent years, the unique properties of linear optical systems have also inspired developments in computational complexity theory. In 2011, Professor Scott Aaronson of MIT (now at the University of Texas at Austin) proposed a linear optical scheme for demonstrating quantum (computational) supremacy, based on the concept of boson sampling. More specifically, Aaronson argued that for a class of sampling problems defined by linear optical systems, it would be impossible in practice for any classical computer to simulate them. This idea immediately sparked a race toward "quantum supremacy": many quantum optics laboratories around the world became interested in building boson sampling systems that break records in photon number. On the other hand, computer...
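The source of the classical hardness in boson sampling is that each output probability is the squared magnitude of a matrix permanent, and computing permanents is #P-hard. Even the best-known exact algorithm, Ryser's inclusion-exclusion formula, only improves the naive O(n!) definition to O(2^n · n), so the cost still grows exponentially with the photon number. A minimal sketch comparing the two (both formulas are standard; the function names are mine):

```python
import itertools
import numpy as np

def permanent_naive(M):
    """Textbook definition: sum over all n! permutations."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def permanent_ryser(M):
    """Ryser's inclusion-exclusion formula: O(2^n * n) instead of O(n!)."""
    n = M.shape[0]
    total = 0.0
    for mask in range(1, 1 << n):              # every non-empty column subset
        cols = [j for j in range(n) if (mask >> j) & 1]
        total += (-1) ** len(cols) * np.prod(M[:, cols].sum(axis=1))
    return (-1) ** n * total

# sanity check on a small random matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
```

Both routines agree on small matrices, but neither scales: at 50 photons the permanent of a 50×50 matrix is already at the edge of what supercomputers can handle, which is exactly why record photon numbers matter in the supremacy race.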
(Excerpt) Read more at: