My research is animated by the conviction that it is essential, both to the proper development of physics and to its assimilation into scholarly and broader human culture, that we obtain a clear, precise, and comprehensive conceptual understanding of our existing physical theories.
To that end, the backbone of my research activity over the last 20 years has been the development of a better understanding of the multi-faceted challenges that quantum theory poses to the mechanico-geometric view of reality that underpins classical physics. In this endeavour, reconstruction (see below) is the key methodological tool, and the informational view of physical reality the primary inspiration.
In recent years, I have turned increasing attention to the possibility of practically harnessing the improved understanding that has resulted from this work, and to interpreting quantum reconstructions due to myself and others. In addition, in anticipation of work on the relationship between quantum theory and space-time physics, I have begun to investigate the structure of classical physics and to look afresh at the origin and meaning of the quantum wave equations.
More broadly, I regard our physical theories as providing an unprecedented window into the relationship between thought and reality. As such, I am interested in investigating how and why we find it natural to reflect physical reality in thought in the ways that we commonly do; and in investigating the origin and meaning of the limitations of thought's ability to capture the richness of physical reality, particularly in seeing whether there are ways of partially transcending these limitations.
The theories of classical physics are based on a mechanical conception of reality. That view of reality inspired their postulates, and was in turn reflected in their mathematical formalism.
In contrast, quantum theory developed in a much less regular manner. It did not arise through the fleshing out of a distinct, non-mechanical conception of reality. Rather, it was seeded by bold new hypotheses, such as Planck's quantum of energy and de Broglie's wave-particle duality. However, as these hypotheses mixed classical and non-classical ideas, the process of giving them precise mathematical form required mathematical leaps whose physical interpretation was far from clear.
The legacy of this genesis is that, until recently, many questions about quantum theory's mathematical structure have remained unresolved: Why does the formalism use complex numbers? Why are continuous transformations represented by unitary transformations rather than by some other group of transformations? Why is the configuration space of a set of \(N\) particles \(3N\)-dimensional rather than 3-dimensional (a fact that gives rise to the possibility of entanglement)?
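To make the last of these questions concrete: a composite system is described in the tensor product of its parts' state spaces, and most states in that larger space do not factorize into states of the parts. A minimal sketch for two qubits (an illustrative example of my own, not drawn from any of the papers discussed here):

```python
# A two-qubit state (c00, c01, c10, c11) lives in the 4-dimensional
# tensor-product space, not in the 2-dimensional space of one qubit.
# It factorizes into a product of one-qubit states exactly when the
# 2x2 coefficient matrix [[c00, c01], [c10, c11]] has rank 1, i.e.
# vanishing determinant; otherwise the state is entangled.

def is_entangled(c00, c01, c10, c11, tol=1e-12):
    return abs(c00 * c11 - c01 * c10) > tol

# Product state |0>|0>: factorizes, so not entangled.
print(is_entangled(1, 0, 0, 0))  # False

# Bell state (|00> + |11>)/sqrt(2): cannot be factorized.
s = 2 ** -0.5
print(is_entangled(s, 0, 0, s))  # True
```

The determinant test works here because a \(2 \times 2\) matrix has rank 1 exactly when its determinant vanishes; for larger systems one would count nonzero singular values (the Schmidt rank) instead.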
Although the lack of answers to these questions does not obviously impede the application of quantum theory, their absence hinders the clear interpretation of the theory ("What is quantum theory telling us about how nature works?"), the full harnessing of quantum theory ("Can the behaviour of identical particles be technologically harnessed?"), and the application of the theory to novel physical domains ("Can we apply the rules of quantum theory without modification in such domains as quantum gravity?").
One powerful way to elucidate the physical origin of the mathematical rules of quantum theory is to attempt to derive—or reconstruct—these rules from a set of physical principles. Once this is done, one can more readily generate an interpretation that fits these physical principles (it is much easier to interpret a set of physical principles than a set of abstract mathematical statements), and assess the applicability of the quantum formalism in new physical domains.
My research has focussed on harnessing new ideas (particularly from quantum information) to formulate simple physical principles from which quantum theory can be reconstructed. The reconstructive project thus far divides naturally into the following parts:
My earliest reconstructive work, itself a development of my PhD work, seeks to derive the abstract formalism of quantum theory on the basis of information geometry—the natural geometry of probability distributions. The key quantum-inspired postulates concern complementarity and gauge invariance.
The key ideas are described in two (shorter and longer) papers and in this short talk. The precursor work, which is based on the notion of prior probability and Shannon-Jaynes entropy rather than metrics, is described in a paper and a corresponding talk.
The application of the abstract quantum formalism to construct models of specific systems makes use of many so-called correspondence rules. These rules specify, for example, the mathematical form of the quantum operators that correspond to measurements of specific properties (such as position and momentum) of a physical system.
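For concreteness, the most familiar textbook correspondence rules for a single particle moving in one dimension (stated here as standard background, not as the derivation given in the paper) are

\[
\hat{x}\,\psi(x) = x\,\psi(x), \qquad \hat{p}\,\psi(x) = -i\hbar\,\frac{\partial \psi}{\partial x}, \qquad \hat{H} = \frac{\hat{p}^2}{2m} + V(\hat{x}),
\]

so that measurements of position, momentum, and energy are represented by the operators \(\hat{x}\), \(\hat{p}\), and \(\hat{H}\), respectively.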
A unified derivation of the correspondence rules on the basis of a single new physical principle is given in this paper.
The key idea is very simple: the predictions of the quantum model of a physical system must "on average" (suitably defined) agree with those of the classical model of the same system.
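A familiar special case of such average-level agreement (offered here only as an illustration, not as the principle employed in the paper) is Ehrenfest's theorem, according to which quantum expectation values obey the classical equations of motion:

\[
\frac{d\langle \hat{x} \rangle}{dt} = \frac{\langle \hat{p} \rangle}{m}, \qquad \frac{d\langle \hat{p} \rangle}{dt} = \big\langle -V'(\hat{x}) \big\rangle.
\]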
My most recent reconstructive work approaches quantum theory via Feynman's formulation of the theory. Feynman's rules present the core content of quantum theory in an extremely accessible form, and it turns out that reconstructing quantum theory from this perspective has many benefits.
The derivation of Feynman's rules for single systems is given in this paper. In a further paper, I show how to derive the tensor product rule for nonidentical composite systems, and how to derive the standard state-based quantum formalism starting from Feynman's rules.
The key ideas are described in this short talk, and elaborated in this longer talk.
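For readers unfamiliar with them, Feynman's rules can be stated in a few lines: the amplitude of a sequence of steps is the product of the step amplitudes; the amplitude of a process with several indistinguishable alternatives is the sum of the alternative amplitudes; and the probability is the squared magnitude of the total amplitude. A minimal sketch (an illustrative toy model of my own, not code from the papers above; the amplitude values are made-up):

```python
import cmath

def path_amplitude(step_amplitudes):
    """Series rule: multiply the amplitudes along one path."""
    amp = 1 + 0j
    for a in step_amplitudes:
        amp *= a
    return amp

def process_amplitude(paths):
    """Parallel rule: sum the amplitudes of indistinguishable paths."""
    return sum(path_amplitude(p) for p in paths)

def probability(paths):
    """Probability = |total amplitude|^2."""
    return abs(process_amplitude(paths)) ** 2

# Two-slit toy model: equal-magnitude amplitudes through each slit,
# with a relative phase phi picked up along the second path.
def two_slit_probability(phi):
    a = 0.5  # illustrative step amplitude, not normalized from any model
    return probability([[a], [a * cmath.exp(1j * phi)]])

print(two_slit_probability(0.0))       # constructive interference: 1.0
print(two_slit_probability(cmath.pi))  # destructive interference: ~0.0
```

The phase-dependence of the result, absent in a classical probabilistic treatment (which would add the probabilities \(0.25 + 0.25 = 0.5\) regardless of phase), is precisely the interference encoded by the parallel rule.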
Most recently, I have worked out a reconstruction and a novel interpretation of Feynman's rule for handling identical quantum particles—see "The nature of identical quantum particles" project below.
Another sub-project currently underway concerns the systematic derivation of the quantum wave equations. At some point, I also intend to investigate the spin-statistics theorem from the operational-informational reconstructive perspective.
What can we learn from quantum theory about the nature of physical reality? In light of quantum theory, are there any parts of the mechanico-geometric conception of reality (due to Galileo, Descartes and Newton) that underpins classical physics which we can hold on to? What aspects of this mechanico-geometric conception should be replaced, and what should replace them? In short: is there some conceptual story that we can tell about quantum theory which is compelling and illuminating, and which can profitably guide our future explorations of physical reality?
Since its creation in the mid-1920s, many attempts have been made to answer the above questions—to interpret quantum theory. One of the fundamental weaknesses of almost all existing interpretative work is that it takes all (or the majority) of the mathematics of the quantum formalism as a given, and then appends an interpretation to it. The trouble is that it is possible to dress the quantum formalism in many, very different conceptual outfits. Each is plausible to an extent, but leaves a great many questions unanswered. And, among the many widely-divergent interpretations on the table, there is little prospect of deciding between them by experimental (or other) means.
Over the last few decades, there has been a surge of interest in the quantum reconstruction program, which aims to elucidate the physical origin of the mathematical formalism of quantum theory by formulating physical principles from which the formalism can be derived. Such reconstructive work provides an ideal launch-pad from which to interpret quantum theory in a new way—by interpreting the principles of a reconstruction, rather than the formalism itself. Such a two-step process has the inestimable advantage that it is much easier to digest physical principles expressed in natural language than an abstract mathematical formalism.
Interpretation of quantum theory on the basis of reconstruction is a nascent area. However, it is already possible to draw some striking inferences. For example, on the basis of the many existing reconstructions (such as my reconstruction of Feynman's rules of quantum theory), it is clearly possible to derive the abstract (von Neumann–Dirac) formalism without reference to the dimension, metric, or topology of space, and without reference to most of the notions (such as matter and fields) that are central to the conceptual framework of classical physics. Thus, the abstract quantum formalism seems to describe an aspect of physical reality which transcends space, matter, and fields. In contrast, the notion of time seems inescapable in these reconstructions, which points to the primacy of this notion over the others.
Over the last few years, I have also developed a new interpretation of the symmetrization postulate, which is the main rule that is used for handling identical particles in quantum theory. This interpretation is based on a reconstruction of this rule which I presented a few years earlier. According to this interpretation, identical particles manifest a new kind of complementarity (to learn more, see below).
A recent paper which describes the role of reconstruction in the elucidation of quantum theory, and describes some of the many interpretative implications of quantum reconstruction, is available here.
Introductory accounts of the promise of the informational approach to understanding quantum theory are given in these two papers. A more philosophical discussion, in the context of Husserl's reflections on the Galilean mechanico-geometric conception, is given here.
The behaviour of identical particles in quantum theory is profoundly counterintuitive, yet lies at the basis of the quantal understanding of essentially anything more complicated than the hydrogen atom. For example, the structure of the periodic table, and the mechanical, electrical, and thermal properties of materials, all hinge on this strange behaviour.
However, the origin and interpretation of the key quantum rule—known as the symmetrization postulate—which is used to enforce this behaviour within the quantum formalism has been the source of long-standing controversy. There have been two options on the table: either we view identical particles as 'indistinguishable' (i.e. unable to be reidentified), or as not being objects at all in the sense in which we ordinarily understand that word. However, as with many interpretational issues concerning quantum theory, there is no consensus on the matter.
I have shown how to systematically derive ('reconstruct') the symmetrization postulate from a new physical principle ('the operational indistinguishability postulate'), and have subsequently provided an interpretation of identical particles in terms of a new kind of complementarity—a complementarity of persistence and nonpersistence (see talk—including visual transcript—here).
In brief: classically, one thinks of a box of an ideal gas as consisting of \(N\) point particles which move around. But, there is another way of thinking about it, namely that the gas is a single (abstract) entity, and that, upon measurement, it manifests itself as \(N\) distinct point-like 'flashes.' In the latter case, there are no individual objects as such, just \(N\) flashes. Seen from this perspective, we can regard these two descriptions as different descriptions of the same underlying data. In the former ('persistence') picture, we assume that the data (the flashes) are the manifestation of persistent underlying objects ('atoms'). But, in the latter ('nonpersistence') picture, we refrain from such an assumption.
Now, in classical physics, as well as in everyday life, we ordinarily make use of the persistence picture, for this is the picture which offers the greatest scope for prediction of future happenings. However, when it comes to so-called quantum identical 'particles', one must combine both of these pictures. In a nutshell—at some risk of misunderstanding—identical particle-like events are objects and non-objects at the same time.
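The amplitude-level rule at work here can be sketched in a few lines (an illustrative toy of my own, using the standard Feynman treatment of identical particles rather than anything specific to my reconstruction; the amplitude values are made-up): for two identical particles reaching two detectors, the 'direct' and 'exchanged' processes cannot be told apart, so their amplitudes are combined before squaring, with a plus sign for bosons and a minus sign for fermions.

```python
def detection_probability(direct, exchanged, particle_type):
    """Probability of a joint detection for two identical particles.

    `direct` and `exchanged` are the complex amplitudes of the two
    processes that differ only by swapping which particle reached
    which detector."""
    if particle_type == "boson":
        amp = direct + exchanged   # symmetric combination
    elif particle_type == "fermion":
        amp = direct - exchanged   # antisymmetric combination
    else:
        # Distinguishable particles: probabilities add, not amplitudes.
        return abs(direct) ** 2 + abs(exchanged) ** 2
    return abs(amp) ** 2

# Equal direct and exchanged amplitudes (a symmetric arrangement):
d = e = 0.5 + 0j
print(detection_probability(d, e, "boson"))            # 1.0 (enhanced)
print(detection_probability(d, e, "fermion"))          # 0.0 (suppressed)
print(detection_probability(d, e, "distinguishable"))  # 0.5
```

The fermionic probability vanishing when the two amplitudes coincide is the amplitude-level face of the Pauli exclusion principle.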
More recently, I have developed a more philosophical understanding of identical particles, which addresses the question of why identical quantum particles would require such special treatment. I argue that the need for such treatment originates in the confluence of identicality and the active nature of the quantum measurement process, which together mean that reidentification is not possible in contexts in which we ordinarily say that two particles interact. I propose a conception in which detection-events are ontologically primary, while the notion of individually persistent object is relegated to merely one way of bringing order to these events.
The central tenet of information physics is that the concept of information is as fundamental as the notions of space, time, energy, and matter which form the basis of the conceptual framework of classical physics. As detailed in this paper and this talk, the notion of information entered physics nontrivially over a century ago due to the tension between thermodynamics and statistical mechanics, and has been given new meaning and importance by the rise of the informational view of quantum theory.
The informational perspective on quantum theory has not only led to the discovery that quantum systems can process information in new, nonclassical ways, but has been crucial to the wave of reconstructive work over the last two or so decades (see this talk for more detail), where new information-theoretic principles of physical import have been formulated.
Since probability theory is central to any quantitative theory of information, a clear understanding of probability theory is vital to any use of the quantitative notion of information for building an understanding of quantum theory.
To that end, I have written a paper on a new principle that determines a unique prior over probability distributions, as well as a paper that investigates the relationship between probability theory and quantum theory. The latter paper proves that probability theory and quantum theory are compatible with one another—despite widespread statements to the contrary—by showing how they both derive from fundamental symmetries, and pinpoints the error that lies behind many claims that probability theory is disproved by quantum theory.
The theories of classical physics—classical mechanics, thermodynamics, and Maxwell's theory of electromagnetism—harbour many subtle conceptual issues which are yet to be adequately resolved.
Two key notions, namely conservation and relativity, lie at the heart of classical mechanics. Each is an instance of a fundamental physical symmetry—conservation expressing the symmetry that temporal evolution of a system is accompanied by changelessness in some of its properties; relativity expressing the symmetry that, although observations are necessarily perspectival, there are classes of observers which are, in some fundamental sense, physically equivalent.
These ideas played a fundamental role in the early development of mechanics, but were eclipsed in Newton's framework by the notion of force and its associated laws, such as \(F = ma\). However, subsequent developments in physics, particularly starting with the conservation of energy in the 1830s, have brought symmetry principles firmly into the foreground.
In order to better understand how symmetry principles (particularly conservation and relativity) shape the structure of classical mechanics, I have carried out a reconstruction of classical mechanics with these principles as its driving force. One of the virtues of the reconstruction is that nonrelativistic and relativistic mechanics arise in parallel, which elucidates both their similarities and their differences.
The reconstruction takes place in three steps. First, the quantities of motion (energy & momentum) are derived from symmetry considerations (asymptotic conservation; relativity) using functional equations.
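To illustrate the flavour of such a derivation (a textbook-style sketch under my own simplifying assumptions, not necessarily the argument used in the paper): suppose each particle of mass \(m_i\) carries a velocity-dependent quantity \(m_i f(v_i)\), and demand that \(\sum_i m_i f(v_i)\) be conserved in every collision in every inertial frame, i.e.

\[
\sum_i m_i f(v_i + u) \;=\; \sum_i m_i f(v_i' + u) \quad \text{for all boosts } u,
\]

where primes denote post-collision velocities. Differentiating with respect to \(u\) and using conservation of mass and momentum, one finds that for generic collisions \(f\) must satisfy \(f''(v) = \text{const}\), so \(f(v) = \tfrac{1}{2}v^2\) up to an affine term: the conserved quantity is the kinetic energy \(\tfrac{1}{2} m v^2\).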
Second, in order that energy can be continuously conserved, a system of particles is embedded within an "energetic system" that can bear massless energy (and, in the relativistic case, also massless momentum). The imposition of symmetry constraints (continuous conservation; relativity) then leads, in the nonrelativistic case, to mass conservation and the frame-invariance of massless energy; and, in the relativistic case, to the fact that massless energy-momentum transforms as a four-vector.
Third, by introducing a staccato model of motion change, the force-based framework for particle mechanics is derived, which coincides with Newton's framework in the nonrelativistic case.
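To give a sense of what a staccato model might look like (a sketch under my own reading, assuming 'staccato' means that momentum changes in discrete impulses separated by free motion; the paper's construction may differ):

```python
# A constant force F delivered as n discrete impulses of size F*dt,
# with free (uniform) motion in between, reproduces the smooth
# Newtonian trajectory x(t) = x0 + v0*t + F*t^2/(2m) as n grows large.

def staccato_position(m, force, t_total, n_impulses, x0=0.0, v0=0.0):
    dt = t_total / n_impulses
    x, v = x0, v0
    for _ in range(n_impulses):
        v += (force / m) * dt  # impulsive momentum change: dp = F*dt
        x += v * dt            # free motion until the next impulse
    return x

# With F = m = 1 and t = 2, the smooth trajectory gives x = t^2/2 = 2.0.
print(staccato_position(1.0, 1.0, 2.0, 10))      # ~2.2 (coarse: overshoots)
print(staccato_position(1.0, 1.0, 2.0, 100000))  # ~2.00002 (converging to 2.0)
```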
The reconstruction generalizes a number of fundamental results, such as Laue's theorem, as well as Einstein's key thought-experiments concerning mass-energy interconversion, and illuminates or resolves many longstanding puzzles in nonrelativistic and relativistic mechanics.
In order to clarify the overall architecture of mechanics, I then classify the principles & assumptions that are used in the reconstruction according to their explanatory role.
One of the most surprising outcomes of the reconstruction is the emergence—and central role—of what I refer to as the 'energetic framework', which serves as a kind of meta-theory that encompasses and shapes the specific physical theories built within it, and which is in turn shaped by broad assumptions (such as the principle of relativity) made within those specific theories.
As I detail in the paper, I believe that this reconstruction paves the way for a new way to teach mechanics—both nonrelativistic and relativistic—at all levels.