Quantum theory is the most empirically successful theory in the history of physics, and one of the most extraordinary achievements of the human mind. However, despite the efforts of numerous physicists and philosophers over the last hundred years, including many of the illustrious founders of quantum theory, a comprehensive conception of the quantum world has yet to be developed. This not only limits the further development of physics, but prevents the broader scholarly and humanistic culture from fully benefiting from the deep lessons that quantum theory has to teach us—such as the relation of human thought and physical reality, and the relation between everyday reality and microphysical reality.
The broad goal of this workshop is to bring together physicists playing a leading role in developing new (particularly reconstructive) approaches to quantum theory and philosophers from diverse traditions who are seeking to elucidate quantum theory from novel, non-mainstream philosophical vantage points. The aim is to stimulate cross-disciplinary discussion, and to catalyze collaborative research projects that decisively further the elucidation of quantum theory.
There are two methods of probing the foundations of relativistic quantum field theory: the algebraic approach more often favored by mathematical physicists, and the conventional approach more often favored by other physicists. Results from the former suggest that particles do not belong to the fundamental ontology of QFT. The latter, by contrast, treats particles as indispensable, and by reference to particles it provides exceptionally precise predictions and explanations in high-energy physics that the former is quite incapable of making.
Philosophers, with some important exceptions, have tended to favor the algebraic approach because its theoretical results are more directly related to the principles of QFT and, so, have favored a field-fundamentalist ontology. However, relatively recent results in the philosophy of QFT indicate that, if the ontology is to be “read off” the theoretical formalism, a field-fundamentalist ontology is no more plausible than a particle-fundamentalist ontology because their ostensible formal representations share a basic mathematical shortcoming that precludes both from serving as good representations at all. Moreover, it has been argued, for example, by Weinberg that particles are more directly connected to relativistic quantum principles than fields are, despite the above-mentioned interpretational tendencies.
In this paper, I argue from a progressive scientific realist perspective for a QFT ontology that includes both particles and fields, wherein the particle, an entity apparently closer to direct experience, supervenes on the field: it is dependent on, but not reducible to, the field.
One of the main problems with the old quantum theory of atomic radiation was its reliance on the Bohr-Sommerfeld semi-classical model of the atom, which presented an inconsistent ontology for the atomic electron: Atomic electrons were described as corpuscles moving along orbits, yet, at the same time, they had to violate the laws of classical mechanics to ensure the stability of matter.
Heisenberg's matrix mechanics has been interpreted as a way of "de-ontologizing" the description of the behavior of atomic electrons. I propose that, when viewed from a phenomenological perspective, it actually provides a new ontology for the atomic electron. To support this claim, I will adopt the phenomenological approach to the reality of unobservables presented by Vallor in 2009. This approach is based on a re-conceptualization of ‘reality’ grounded in Husserlian phenomenology.
On this basis, Vallor argues that unobservables, such as elementary particles, are as real as everyday perceptual objects. I integrate Vallor's approach by revealing a crucial aspect of the phenomenological ‘reality’ of unobservables that Vallor overlooked: unobservables like elementary particles require the explicit construction of a theoretical-mathematical framework to emerge from experience as objects that are actual parts of the world.
From this perspective, the mathematical framework introduced by Heisenberg in matrix mechanics assumes an ontological significance, as it provides the foundation upon which the object ‘atomic electron’ emerges as an actual entity, free from the ontological contradictions of the previous semi-classical model.
In contrast to Schrödinger’s claim that experimentation does not deal with single quantum systems (Schrödinger, 1952), engineers and physicists today have developed capabilities to create, manipulate, exploit and read out quantum states of single quantum systems. Given the recent boom in technologies such as quantum computing and quantum communication, some note that quantum technology finds itself ahead of theory (e.g., Cuffaro & Hagar, 2024). In this talk, I survey the relations between the foundations of quantum mechanics and quantum technology in order to map the role of quantum technology in developing a better understanding of the quantum world.
Developments such as the Bell tests embody a particular philosophical utility of technological development, expressed as ‘experimental metaphysics’ (Shimony, 1989). However, quantum technology can do more than test existing theories – I illustrate several ways technology fruitfully interacts with foundational and interpretational debates. I show how technology can, for example, enable theoretical reflection by providing novel questions, influence the status of positions in theoretical debates through practical utility, and inform concepts through illustration and engineering challenges (such as decoherence and measurement). The overview presented in this talk suggests a more complex, multi-faceted interaction between science and technology than the linear relation that often appears to be assumed.
Although the role of practice in scientific inquiries is increasingly recognised (e.g., Hacking, 1983; Kitcher, 2001; Longino, 2002), research in the foundations of physics often unfolds along theoretical lines. By elucidating how quantum technology fruitfully interacts with theoretical accounts of the quantum world in foundational debates, I aim to start exploring a potentially underestimated approach towards formulating a more comprehensive understanding of quantum mechanics.
The usual explanation of the quantum computing speedup is parallel calculation, inserting a superposition of all possible input values into the computation. The picture painted is that the machine performs a calculation for every input, perhaps in multiple parallel worlds, and then combines the outcomes of each individual calculation by interference to obtain the desired answer. This explanation has become standard, but gives no guidance on how to identify new mathematical problems that could be solved, or on how to design algorithms to solve them. New developments seem to need another explanatory model.
An alternative is tracing the calculation in phase space, a standard tool in classical mechanics but more challenging to use in the quantum realm. The reason is that the Liouville distribution, the probability distribution over classical phase space, becomes the Wigner function in quantum mechanics, a quasi-probability distribution that is not always positive. A negative Wigner function has been linked to the presence of quantum contextuality, a behavior seen only in quantum systems.
This presentation will introduce a phase-space description of a restriction of quantum mechanics that generates a positive distribution but still reproduces the contextual behavior of quantum systems. The description is Einstein-complete, but distinct from Bohmian mechanics. We will briefly see how this can be used to give a better explanation of the quantum speedup, and also use this new description as a tool for reasoning about more generic interpretational issues within the foundations of quantum mechanics.
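The Wigner-function negativity mentioned above can be made concrete with a small numerical sketch. The following assumes Wootters’ standard discrete Wigner function for a single qubit; the two example states (a stabilizer state and a “magic” state) are illustrative choices, not taken from the abstract:

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def discrete_wigner(rho):
    """Wootters' discrete Wigner function of a qubit state rho,
    evaluated at the four phase-space points (q, p) in {0, 1}^2."""
    W = np.empty((2, 2))
    for q in (0, 1):
        for p in (0, 1):
            # Phase-point operator A(q, p)
            A = 0.5 * (I2 + (-1)**q * Z + (-1)**p * X + (-1)**(q + p) * Y)
            W[q, p] = np.real(np.trace(rho @ A)) / 2
    return W

# Stabilizer state |0>: its discrete Wigner function is non-negative
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)

# "Magic" state cos(pi/8)|0> + sin(pi/8)|1>: its Wigner function goes negative
psi = np.array([np.cos(np.pi / 8), np.sin(np.pi / 8)], dtype=complex)
rho_magic = np.outer(psi, psi.conj())

print(discrete_wigner(rho0).min())       # 0.0: non-negative everywhere
print(discrete_wigner(rho_magic).min())  # (1 - sqrt(2))/4, about -0.104
```

Both distributions sum to one, as a quasi-probability distribution must; only the non-stabilizer state exhibits a negative phase-space point, which is the feature the abstract links to contextuality.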
In this paper I undertake the first detailed historical examination of the origin of the latency concept in the works of Henry Margenau in the 1950s. Margenau was a highly influential and respected member of the theoretical physics community after the war. He was not only a prolific writer but a tireless organiser and a consummate teacher. He was also a devoted advocate of the philosophy of science and physics and did much to elevate their status among scientists and philosophers alike.
In fact, Margenau played a key role in establishing philosophy of physics as a discipline. He founded the journal Foundations of Physics and was instrumental in setting up the Philosophy of Science Association. As Professor of Theoretical Physics at Yale, he was able to attract many students to the foundations and philosophy of physics who worked out the details of Margenau’s own ideas, including the concept of quantum latency.
There certainly was a ‘Margenau school’ in the 1950s and 1960s. Margenau himself had philosophical training; he had studied the writings of Cassirer and the Marburg school and considered himself a Neo-Kantian. Because of his institutional role as a Mersenne for philosophy of physics, Margenau came into regular contact with some of the most important philosophers and physicists of his time, precisely when they were developing ‘dispositional’ accounts of quantum properties (Heisenberg) and of theoretical terms in general (Carnap).
I also consider the possible influence of Margenau’s school upon Popper and his followers in the 1960s and ask whether Popper’s “propensities” may be seen as a critical rationalist response to Margenau’s neo-Kantian latencies. Finally, I very briefly reassess the prospects for latencies or propensities as an interpretation of quantum mechanics (Suárez, 2007).
David Bohm’s work on the foundations of quantum theory is today best known as a version of the de Broglie-Bohm interpretation called “Bohmian mechanics” (see Goldstein 2024). However, Bohm himself worked on many different interpretations of quantum theory in the course of his long career, and even his own preferred version of Bohmian mechanics differs considerably from the version currently discussed in the foundations of physics.
This talk gives an overview of some of Bohm’s main attempts to interpret quantum theory:
1) the idea that quantum properties are “opposing, incompletely defined potentialities”, which he advocated in his 1951 textbook Quantum Theory (predating Heisenberg’s similar later suggestions);
2) his own version of Bohmian mechanics in which the pilot wave encodes “active information”, thus introducing a new type of ontological notion of information to fundamental physics (Bohm and Hiley 1987);
3) his more general implicate order scheme (Bohm 1980) which suggests that quantum nonlocality points to the need to consider a non-spatial order, or “Ur-space”, as Jenann Ismael (2020) has recently emphasized.
While different, and even in some ways contradictory, these Bohmian concepts, it is suggested, provide us with valuable clues about what quantum theory teaches us about the nature of reality. The view that emerges can be seen as a middle way between, on the one hand, very counterintuitive “actualist” views (such as Many Worlds (Wallace 2012) or Wave Function Realism (Albert 1996, Ney 2021)) and, on the other hand, radically epistemic and subjectivist neo-Bohrian views such as QBism (Fuchs, Mermin and Schack 2014).
Quantum theory radically challenges the core metaphysical tenets underpinning classical physics. The indeterminacy and state-disturbance of quantum measurement; the complementarity of observables; the phenomena of entanglement and steering; and the strange behaviour of identical particles all profoundly challenge the deterministic, geometric, atomistic ideals of classical physics.
However, since the creation of quantum theory almost one hundred years ago, no widely-accepted, fully-fledged quantum conception of reality, with its own set of metaphysical ideals, has emerged.
The quantum reconstruction program is perhaps the most important advance in the foundations of physics of the last half-century, and it opens up rich opportunities for philosophical reflection. However, despite the development of many reconstructions of various parts of the quantum formalism over the last twenty-five years, the reconstruction program has yet to engage the serious attention of the majority of those working in the philosophy of physics.
In this talk, I describe some of the obstacles and misconceptions that stand in the way of the embrace of the quantum reconstruction program; explain how interpretation of quantum reconstructions can avoid the pitfall of projection of implicit metaphysics that is inherent in the conventional approach of direct interpretation of the quantum formalism; and describe the exciting possibilities that certain reconstructions open up, such as the prospect of developing a rigorous metaphysical understanding of such enigmatic notions as complementarity and entanglement.
The objective of this paper is to offer an analysis of several key elements within Niels Bohr’s transcendental interpretation of quantum mechanics. After some stage setting, I will demonstrate that a transcendental perspective on Bohr offers several advantages over alternative interpretations. Specifically, I will argue that some of his most contentious claims become more plausible when viewed through a transcendental lens. However, despite these strengths, Bohr’s approach faces challenges. Following an evaluation of what I consider to be the primary weakness in his framework, the final section of the paper will explore potential avenues for enhancing the viability of the Bohrian project, with a specific focus on the role of phenomenology as a potential solution.
We quantify the difference between classical and quantum counterfactual effects, where an output distribution is somehow changed by the removal of signal (“blocking”) at some point. We show that there is a counterfactual gain in quantum counterfactual communication, which quantifies the effect it has above and beyond any classical counterfactual effect, and that this counterfactual gain comes from coherences.
This counterfactual gain contains a term proportional to a Kirkwood-Dirac quasiprobability term. When this term is positive or zero, the counterfactual gain can only distribute probability more equitably over the set of outputs; however, if this Kirkwood-Dirac term is negative, blocking can cause output probability to focus on a specific outcome. We show that this difference between quantum and classical counterfactual effects results from the measurement backaction caused by this blocking.
We show that we cannot explain quantum counterfactual effects simply by removing detection events. We link this to attempts to argue from counterfactuals in quantum mechanics (e.g. when forming noncontextual and Bell inequalities), and show that this backaction effect forms a natural explanation for the violation of the statistical, or measurement, independence assumption used to form these inequalities. See J.R. Hance et al, Quant. Sci. Technol. 9, 045015 (2024) for more details.
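The sign behavior of the Kirkwood-Dirac quasiprobability invoked above can be illustrated numerically. The sketch below is a generic construction for a pure qubit state and the Pauli-Z and Pauli-X eigenbases; the bases and the example state are illustrative assumptions, not the construction used in the cited paper:

```python
import numpy as np

def kirkwood_dirac(psi, A_basis, B_basis):
    """Kirkwood-Dirac quasiprobability Q[j, k] = <b_k|a_j><a_j|psi><psi|b_k>
    for a pure state psi and two orthonormal bases (given as rows a_j, b_k)."""
    d = len(psi)
    Q = np.empty((d, d), dtype=complex)
    for j in range(d):
        for k in range(d):
            a, b = A_basis[j], B_basis[k]
            # np.vdot conjugates its first argument, so np.vdot(b, a) = <b|a>
            Q[j, k] = np.vdot(b, a) * np.vdot(a, psi) * np.vdot(psi, b)
    return Q

# Z eigenbasis {|0>, |1>} and X eigenbasis {|+>, |->} of a qubit
Zb = np.array([[1, 0], [0, 1]], dtype=complex)
Xb = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# An example state whose KD distribution has a negative real part
theta = np.pi / 8
psi = np.array([np.cos(theta), -np.sin(theta)], dtype=complex)

Q = kirkwood_dirac(psi, Zb, Xb)
print(Q.sum())       # 1: quasiprobabilities always sum to one
print(Q.real.min())  # negative for this state
```

The distribution always normalizes to one, but unlike an ordinary joint probability its entries can be complex or negative; it is this negativity that, per the abstract, separates genuinely quantum counterfactual effects from classical ones.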
In recent years, philosophers of physics—QBists in particular—have turned increasingly to the phenomenological tradition to make sense of quantum mechanics. And the work of Merleau-Ponty especially has caught their eye. Appealing to his later ontology of “flesh”, QBists describe agents who are constitutively embodied and embroiled within an external world and who are constitutively available to public scrutiny.
In this paper, I trace the provenance of Merleau-Ponty’s ideas back to Hegel’s account of mutual recognition. Not only are there striking analogies to be found, but it is clear that Merleau-Ponty derived these analogous elements (at least in part) directly from his engagements with Hegel’s thought. This analysis: (i) reveals a new (and surprising) extension of Hegel’s influence into contemporary philosophy of physics; (ii) sheds light on the philosophical relationship between Hegel and Merleau-Ponty; (iii) opens up fresh conceptual resources to QBists (relating in particular to their response to Wigner’s “friend” paradox); and (iv) generates a deep irony concerning the distinction between the so-called “analytic” and “continental” traditions in philosophy. An additional aim of the paper is to provide a clear statement of the explanatory relationship between QBism and phenomenology.
One of the most surprising findings of quantum mechanics is wave-particle duality. It suggests, for example, that mechanical point particles and photons alike have state vectors and both evolve according to the Schrödinger equation. However, on closer inspection of the standard quantum physics of these systems, we see that we do not really use the same formalism to describe them. For example, while the velocity of point particles depends on the shape of their wave packets, the same does not apply to photons, which always move at the speed of light.
In this contribution, we discuss a possible way of altering standard quantum mechanics by including features of a recent local photon model in its formalism. The aim of phase space quantum mechanics is to increase wave-particle duality. At the same time, we address some issues related to the consistency between quantum and classical mechanics. From a philosophical point of view, one would expect classical mechanics to emerge from quantum mechanics when only expectation values are considered. Currently, this is only true in some cases. Moreover, phase space quantum mechanics might help us to eventually overcome the current gap between quantum physics and relativity.
I will argue that ideal, i.e. closed, quantum systems cannot exist: no system is completely isolated, closed systems assume a radical object-subject split, and their description involves an infinity which the world cannot instantiate. Inspired by the tension between transcendence and closure, I will then argue that the Universe itself should not be described as a closed quantum system, because that would amount to an ultimate closure. If anything, the Universe should be conceived of as a plurality. I will consider these questions in the light of mereological relations as well as epistemic features.
We use the framework of Empirical Models (EMs) and Hidden Variables Models (HVMs) to analyze the locality and stochasticity properties of relativistic quantum theories, such as Quantum Field Theory (QFT). First, we present the standard definitions of properties such as determinism, No Signaling, Locality, and Contextuality for HVMs and EMs, and the relations among them.
Then, we show that if no other conditions are added, there are only two types of EMs: an EM is either classical, in which case it is strongly deterministic, local, and non-contextual, or non-classical, in which case it is weakly deterministic, non-local, and contextual. Consequently, we define a criterion for an HVM to be Lorentz invariant and prove that it implies No Signaling. As a result, we show that a relativistic quantum theory must be genuinely stochastic, i.e., it cannot have a deterministic (strong or weak) HVM.
Finally, we discuss Bell’s definition of locality and show that it is equivalent to non-contextuality, and moreover we argue that Bell’s justification for this definition tacitly assumes non-contextuality. We propose an alternative definition of locality for contextual and relativistic theories that accounts for correlations that result from common history and renders QFT a local theory.
I propose a solution to a neglected conflict characterizing Bohr’s philosophy of quantum theory, namely between what I call the ‘distinction thesis’ and the ‘nonseparability thesis’ between quantum systems and classical apparatuses (Howard [1] p.163). Here is a quotation attesting to the former thesis: “The essentially new feature in the analysis of quantum phenomena is the introduction of a fundamental distinction between the measuring apparatus and the objects under investigation.” ([2], pp. 3-4, my emphasis). Here is one proving the latter: “The quantum postulate implies that any observation of atomic phenomena will involve an interaction with the agency of observation not to be neglected. Accordingly, an independent reality in the ordinary physical sense can neither be ascribed to the phenomena nor the agencies of observation” ([3] p. 580, my emphasis).
Many scholars have focused exclusively on the former thesis, concluding that Bohr’s approach to quantum mechanics is hopeless since it brings into play the irremediably vague distinction between the micro (the quantum) and the macro (the classical) realms. Howard, who correctly focused on the latter thesis, has controversially claimed that the way out of the conflict is to appeal to decoherence [1].
By discussing the two-screen thought experiment proposed by Einstein, I argue instead that in virtue of the nonseparability thesis, Bohr regarded any classical system (measuring apparatuses included) as a quantum system. Consequently, by discussing Zinkernagel [4], I claim that Bohr is a quantum fundamentalist, since the distinction thesis is advanced only to fulfill the epistemic and pragmatic aim of getting definite and communicable outcomes. Even though he regarded the measurement interaction as an (irreversible) physical process, Bohr did not need to invoke a collapse postulate, since treating a macroscopic apparatus as quantum or classical depends contextually on the experimenter’s variable aim: “...for every particular case it is a question of convenience at what point the concept of observation involving the quantum postulate with its inherent ‘irrationality’ is brought in” [3, p.580].
Finally, I will show that as a consequence of this pragmatic solution to the above conflict, Bohr adopted a non-constructive, principle-theory approach [5] to the measurement problem: “... the neglect of the atomic constitution of the measuring instruments themselves, ... is equally characteristic of the applications of relativity and quantum theory.” [6, p.236].
The workshop, generously supported by the Bank of Sweden Tercentenary Foundation and the Division for Philosophy and Applied Ethics at Linköping University, will take place at Vadstena Monastery, originally built in 1346, by the shores of Vättern, Sweden's second-largest lake.
Participants will stay at the Vadstena Klosterhotel.
Vadstena is about an hour from Linköping. There will be a charter bus departing from Linköping at 8:15 am on Wednesday. If you are staying at Park Hotel, Harald will be in the lobby at 7:45 am to accompany you. For those making their way to the meeting point independently, the bus will depart from the bus terminal near the main train station.
Public transportation from Linköping to Vadstena is challenging, so please do not miss the charter bus.
Philip Goyal (University at Albany)
Daniele Pizzocaro (UC Louvain)
Harald Wiltsche (Linköping University)
For questions, please contact Harald Wiltsche.