Jess Riedel on Decoherence, Research and Academia

Dr. Jess Riedel joined NTT Research as a senior research scientist in the Physics & Informatics (PHI) Lab in October 2019. He previously held post-doctoral fellowships at the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, and at IBM Research in Yorktown Heights, NY. He received his Ph.D. from the University of California, Santa Barbara (UCSB) in 2012, but conducted his dissertation research at Los Alamos National Laboratory. He has a B.A. in physics from Princeton University.

Dr. Riedel studies decoherence and the quantum-classical transition. He is especially interested in defining and efficiently identifying wavefunction “branches” in out-of-equilibrium many-body states. He has previously studied the sensitivity of massive superpositions for detecting dark matter. His PHI Lab poster at the Upgrade 2020 NTT Research Summit was titled “Symplectic covariance, Lindbladian dynamics, and the classical limit of the quantum characteristic function.” For more about Dr. Riedel, check out the video below and the following Q&A:

How did you get interested in quantum physics, and decoherence more particularly?

My philosophical approach to picking research topics is to find the ones that are both very important and neglected by other researchers, which is not easy. Some people try to work on what they think is the most important problem, but those topics are often already being studied by many researchers. In particular, I judge my career by its degree of "replaceability": in the alternative world where I didn't do any research, if someone else would have gotten more-or-less the same results, perhaps a couple of years later, then I would consider my work replaceable. My goal is to be as irreplaceable as I can be.

I picked decoherence because:

  • I think the basic features of quantum mechanics are the most profound and un-intuitive things we know about the physical world.
  • Physicists still disagree about how to understand quantum mechanics, and in particular there is no unified theory of the quantum-classical transition, i.e., the way in which the unusual features of quantum mechanics, like entanglement and superposition, become undetectable on macroscopic scales.
  • Decoherence was, and for the most part still is, the most insightful and precise theory for understanding the quantum-classical transition.
  • Decoherence is criminally understudied, both because predecessor theories are “good enough” for most practical purposes, and because of some subtle reasons that bleed into philosophy.

Of course, many researchers disagree on these points, and especially on the likelihood that further progress can be made in this direction.

Your point that “decoherence arises when the environment learns about the state of the system” reminds one of the Heisenberg uncertainty principle. Is there any relationship?

Yes, these are very related, although you won’t be surprised that it’s a really big topic and is difficult to explain compactly. In short, when quantum mechanics was first developed in the early decades of the 20th century, some researchers interpreted the Heisenberg uncertainty principle mostly as a restriction about what experimenters could know about microscopic particles. The (mostly) consensus view now is that it is in fact a restriction about what any physical system can know about any other physical system (where “know” is made precise in terms of correlation and information, like the way your hard drive “knows” which keys you pressed when writing this document because it has a saved record thereof). Furthermore, we have very strong reasons to believe that once one system knows the maximum amount of information about another system allowed by the uncertainty principle, the “missing” information is not just hidden, but in fact non-existent, i.e., there simply is no fact of the matter about exactly where particles are and how fast they are moving. That sounds suspicious, but can be made precise using very deep ideas like Bell’s theorem.

The process by which a large system appears to lose its quantum weirdness is intimately connected to the physical process by which its environment (e.g., the stray air molecules in the room, or the tiny vibrations in a table) learns about the system. We call this “decoherence.”
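This "environment learns about the system" picture can be made concrete in a toy model (my own illustrative sketch, not from the interview): a system qubit starts in an equal superposition, a single "environment" qubit records its state via a CNOT interaction, and tracing out the environment leaves a system density matrix whose off-diagonal coherence terms have vanished. All names here are hypothetical, and the model uses only the Python standard library.

```python
# Toy model of decoherence: an environment qubit "learns" the state of a
# system qubit, destroying the system's off-diagonal coherences.
# Hypothetical minimal sketch; a real environment has ~10^23 qubits.
from math import sqrt

# System qubit in (|0> + |1>)/sqrt(2); environment qubit starts in |0>.
# Joint amplitudes ordered |00>, |01>, |10>, |11> (system qubit first).
amp = [1 / sqrt(2), 0.0, 1 / sqrt(2), 0.0]

# The environment "measures" the system via a CNOT gate,
# which swaps |10> <-> |11>, correlating environment with system.
amp = [amp[0], amp[1], amp[3], amp[2]]

def reduced_density_matrix(a):
    """Partial trace over the environment qubit -> 2x2 system matrix."""
    rho = [[0j, 0j], [0j, 0j]]
    for s in (0, 1):            # system row index
        for t in (0, 1):        # system column index
            for e in (0, 1):    # environment index, summed over
                rho[s][t] += a[2 * s + e] * complex(a[2 * t + e]).conjugate()
    return rho

rho = reduced_density_matrix(amp)
# Diagonal populations survive (0.5 each), but the off-diagonal
# coherences are exactly zero: the superposition looks classical.
print(rho)
```

Skipping the CNOT line leaves the off-diagonal entries at 0.5, i.e., the coherence survives only as long as the environment has no record of the system's state.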

How does your research at the crossover between classical and quantum physics play into the PHI Lab’s work with Coherent Ising Machines (CIMs)?

CIMs, like other quantum-information processing devices, must suppress decoherence in order to function properly. If the environment interacts too strongly with the delicate photons storing the quantum information, decoherence takes place and the quantum effects that we hope will accelerate the computation are eliminated. I work on understanding decoherence better, which may give us insight about how to protect a CIM from its environment. 

However, such an application is not at all guaranteed. My work is speculative and is a good example of NTT Research’s commitment to supporting basic research. Ultimately, I am driven primarily by curiosity.

Are there any other areas you are actively investigating?

Most of my research involves decoherence in one guise or another. The three major instantiations are:

  1. Taking the existing theory of decoherence, which suffers from being “ad-hoc” and heuristically applied, and making it more formal, both in the sense of formal mathematics, but also in the sense of generalizing it so it applies to more physical systems.
  2. Going beyond the conventional decoherence framework (grounded in a system-environment distinction) and extending these ideas to a rigorous concept of “wavefunction branches.” Branches are a mathematical structure that has been discussed mostly informally and in toy models, but I, with my collaborators, would like to generalize them to a condensed-matter context.
  3. Using decoherence as a mechanism for detecting very weakly interacting particles like dark matter. I have done a lot of work on this topic in the past, but more recently I only dabble in it because the major barrier is extreme experimental difficulty, for which my theoretical skills are not very useful.

I also have a significant side interest in how scientific knowledge is stored, synthesized and taught. Suffice it to say, I think our current methods are truly terrible. However, my work in that area has so far been restricted to blogging and other informal discussion.

Your career appears to have so far focused on the research end of the academic spectrum. Is that fair to say? What attracted you to NTT Research?

Yes, my work has always been very academic, yet I have always had positions outside universities: I was a UCSB grad student, but I did my dissertation research at Los Alamos National Lab. Then I did postdocs at IBM Research (a corporate basic research lab much like NTT Research) and at the Perimeter Institute (an independent non-profit research institute weakly associated with the University of Waterloo).

Ultimately, I have pursued positions at institutions that were willing to support speculative basic research about the big foundational questions, and that have a community of other researchers interested in the same. NTT Research had this in spades, and as a bonus I was able to move to the Bay Area, which I find to be the most intellectually exciting place in the world, both academically and in terms of applied technology. Many of my colleagues who have pursued jobs in universities and industry have (understandably) had to significantly compromise their research priorities to fit the prevailing fashion, and I am grateful that I have largely been able to stay my own course.

Your post on the failure to teach the proper distinction between quantum and classical mechanics is listed as one of the most popular posts on your personal blog. Any thoughts on why? Did you strike a chord?

The main reason I started a blog is because I realized that people (myself included) are stubborn and don’t change their minds very much. Therefore, rather than trying to persuade my neighbors, it’s a lot more effective to broadcast my quirky opinions into the ether and make contact with the scattered few young researchers who have many of the same intellectual sympathies but haven’t ever met anyone who has articulated them before. 

Several of my most popular posts were driven by my own dissatisfaction with the conventional approaches and explanations that students are given during graduate school. Students often have a sense that what they are learning is incomplete and unsatisfactory, but it’s shockingly hard to dig down and find the good answers that are buried in the academic literature; professors promulgate the same bad explanations they were taught as students, even though better explanations exist, simply because they’re not aware of them and they don’t have time to go deeper.

Our mechanisms for distilling scientific knowledge into pedagogical material (textbooks and review articles) have major deficiencies, so that good information fails to reach our students. This is perhaps a self-aggrandizing opinion of my writing, but, based on emails I occasionally get, I do think that my work has been very satisfying and validating to some independently minded students who knew that the explanations they were learning in their classes were inadequate but didn’t know how to find better answers. Along these lines I recently wrote a project proposal requested by a colleague who is collecting and refining philanthropic ideas for improving academia.

Do you have a favorite post among your own list of favorites?

My favorite post describing why a conventional math/physics explanation is flawed is the one on Legendre transforms. The topic is fairly technical, so it’s not very accessible outside students of math and physics. But it’s gotten very enthusiastic feedback from its small intended audience. Other technical posts on this general theme that I like are here, here, and here.

Unfortunately, most of my posts are not accessible to a wide audience. However, my post “Beyond papers — Gitwikxiv” on improving how we store and synthesize academic knowledge was pretty popular on the famous tech nerd news site Hacker News.