
Entanglement is the essential resource that enables quantum information processing tasks. Historically, sources of entangled light were developed as experimental tools to test the foundations of quantum mechanics. In this study, we build an extreme version of such a source, in which the entangled photons are separated in energy by 5 orders of magnitude, to engineer a quantum interconnect between light and superconducting microwave devices.

Our entanglement source is an integrated chip-scale device with a specially designed acoustic transducer, whose vibrations can simultaneously modulate the frequency of an optical cavity and generate an oscillating voltage in a superconducting electrical resonator. We operate this transducer at cryogenic temperatures to maintain the acoustic and electrical components of the device close to their quantum ground state and excite it with laser pulses to generate entangled pairs. We measure statistical correlations between the optical and microwave emission to verify entanglement.

Our work demonstrates a fundamental prerequisite for a quantum information processing architecture in which room-temperature optical communication links may be used to network superconducting quantum-bit processors in distant cryogenic setups.

Reducing the number of dimensions of a system from three to two to one dramatically changes its physical behaviour, causing different states of matter to emerge. In recent years, physicists have been using optical quantum gases to study this phenomenon.

In the new study, conducted in the framework of the collaborative research centre OSCAR, a team led by Frank Vewinger of the Institute of Applied Physics (IAP) at the University of Bonn looked at how the behaviour of a photon gas changed as it went from being 2D to 1D. The researchers prepared the 2D gas in an optical microcavity, which is a structure in which light is reflected back and forth between two mirrors. The cavity was filled with dye molecules. As the photons repeatedly interact with the dye, they cool down and the gas eventually condenses into an extended quantum state called a Bose–Einstein condensate.
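For context, photon condensation in such a dye-filled microcavity is conventionally modelled as a two-dimensional gas in a harmonic trap. In that textbook picture (a general result for trapped 2D photon gases, not a finding of this particular study), condensation occurs once the photon number exceeds a critical value:

```latex
N_c = \frac{\pi^2}{3} \left( \frac{k_B T}{\hbar \Omega} \right)^2
```

where $T$ is the temperature imposed by the dye and $\Omega$ is the trapping frequency set by the mirror curvature.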

DOOM has been ported to quantum computers, marking another milestone for this seminal 3D gaming title. However, the coder behind this feat admits that no quantum computer currently exists that is capable of executing (playing) this code. All is not lost, though, as Quandoom can run on a classical computer, even a modest laptop, using a lightweight QASM simulator.
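Quandoom's own simulator is not reproduced here, but the idea behind a lightweight statevector simulator can be sketched in a few lines of plain Python (an illustrative toy, not Quandoom's actual code):

```python
# Toy statevector simulator: amplitudes stored as a flat list of length 2**n,
# with qubit k read from bit k of each basis-state index.
import math

def apply_gate(state, gate, target):
    """Apply a 2x2 single-qubit gate to `target` of a statevector."""
    new_state = state[:]
    step = 1 << target
    for i in range(len(state)):
        if i & step == 0:          # visit each (|...0...>, |...1...>) pair once
            j = i | step
            a, b = state[i], state[j]
            new_state[i] = gate[0][0] * a + gate[0][1] * b
            new_state[j] = gate[1][0] * a + gate[1][1] * b
    return new_state

def apply_cnot(state, control, target):
    """Swap amplitudes where the control bit is 1, flipping the target bit."""
    new_state = state[:]
    for i in range(len(state)):
        if i & (1 << control) and not i & (1 << target):
            j = i | (1 << target)
            new_state[i], new_state[j] = state[j], state[i]
    return new_state

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Prepare a 2-qubit Bell state: H on qubit 0, then CNOT(0 -> 1).
state = [1.0, 0.0, 0.0, 0.0]       # |00>
state = apply_gate(state, H, 0)
state = apply_cnot(state, 0, 1)
print([round(abs(a) ** 2, 3) for a in state])  # → [0.5, 0.0, 0.0, 0.5]
```

A real QASM simulator additionally parses the OpenQASM text and supports a larger gate set, but the core loop, applying small unitaries to an exponentially large amplitude vector, is the same.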

Barcelona ICFO-based Quantum Information PhD student Luke Mortimer, AKA Lumorti, is behind this newest port of DOOM. In the ReadMe file accompanying the Quandoom 1.0.0 release, Lumorti quips that “It is a well-known fact that all useful computational devices ever created are capable of running DOOM,” and humorously suggests that Quandoom may be the first practical use found for quantum computers.

A few weeks ago, I attended the Seven Pines Symposium on Fundamental Problems in Physics outside Minneapolis, where I had the honor of participating in a panel discussion with Sir Roger Penrose. The way it worked was, Penrose spoke for a half hour about his ideas about consciousness (Gödel, quantum gravity, microtubules, uncomputability, you know the drill), then I delivered a half-hour “response,” and then there was an hour of questions and discussion from the floor. Below, I’m sharing the prepared notes for my talk, as well as some very brief recollections about the discussion afterward. (Sorry, there’s no audio or video.) I unfortunately don’t have the text or transparencies for Penrose’s talk available to me, but—with one exception, which I touch on in my own talk—his talk very much followed the outlines of his famous books, The Emperor’s New Mind and Shadows of the Mind.

Admittedly, for regular readers of this blog, not much in my own talk will be new either. Apart from a few new wisecracks, almost all of the material (including the replies to Penrose) is contained in The Ghost in the Quantum Turing Machine, Could A Quantum Computer Have Subjective Experience? (my talk at IBM T. J. Watson), and Quantum Computing Since Democritus chapters 4 and 11. See also my recent answer on Quora to “What’s your take on John Searle’s Chinese room argument?”

Still, I thought it might be of interest to some readers how I organized this material for the specific, unenviable task of debating the guy who proved that our universe contains spacetime singularities.

Battery performance is heavily influenced by the non-uniformity and failure of individual electrode particles. Understanding the reaction mechanisms and failure modes at the nanoscale is key to advancing battery technologies and extending their lifespan. However, capturing real-time electrochemical evolution at this scale remains challenging due to the limitations of existing sensing methods, which lack the necessary spatial resolution and sensitivity.

Quantum squeezing is a concept in quantum physics where the uncertainty in one aspect of a system is reduced while the uncertainty in a related aspect is increased. Imagine squeezing a round balloon filled with air. In its normal state, the balloon is perfectly spherical. When you squeeze one side, it flattens and stretches out in the other direction. This is what happens in a squeezed quantum state: you reduce the uncertainty (or noise) in one quantity, such as position, but in doing so you increase the uncertainty in another quantity, such as momentum. The product of the two uncertainties cannot fall below Heisenberg’s limit; you are simply redistributing the uncertainty between them. Nevertheless, this ‘squeezing’ allows you to measure one of those variables with much greater precision than before.
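The balloon picture corresponds to the textbook uncertainty relations for a harmonic oscillator. As a generic illustration (not tied to any particular experiment), a state squeezed by a parameter $r$ rescales the position and momentum uncertainties while preserving their minimum-uncertainty product:

```latex
\Delta x = e^{-r} \sqrt{\frac{\hbar}{2 m \omega}},
\qquad
\Delta p = e^{r} \sqrt{\frac{m \omega \hbar}{2}},
\qquad
\Delta x \, \Delta p = \frac{\hbar}{2}
```

Increasing $r$ sharpens position at the expense of momentum; a negative $r$ does the reverse.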

This technique has already been used to improve the accuracy of measurements in situations where only one variable needs to be precisely measured, such as in improving the precision of atomic clocks. However, using squeezing in cases where multiple factors need to be measured simultaneously, such as an object’s position and momentum, is much more challenging.

In a research paper published in Physical Review Research (“Squeezing-induced quantum-enhanced multiphase estimation”), Tohoku University’s Dr. Le Bin Ho explores the effectiveness of the squeezing technique in enhancing the precision of measurements in quantum systems with multiple factors. The analysis provides theoretical and numerical insights, aiding in the identification of mechanisms for achieving maximum precision in these intricate measurements.
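As general background (this is the standard bound in multiparameter estimation, not the paper's specific result), the attainable precision for a vector of phases $\boldsymbol{\theta}$ is limited by the quantum Cramér–Rao bound:

```latex
\mathrm{Cov}(\hat{\boldsymbol{\theta}}) \;\ge\; \frac{1}{M} \, F_Q^{-1}(\boldsymbol{\theta})
```

where $F_Q$ is the quantum Fisher information matrix and $M$ is the number of repeated measurements; squeezing the probe state can enlarge $F_Q$ and thereby tighten the bound on all phases at once.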