Lorentz Center

International center for scientific workshops


## Part and Whole in Physics

In many condensed matter systems, the degrees of freedom associated with constituent particles are dramatically rearranged due to correlation effects. In such cases, the (quasi-)particles from which the system is "made" (at low energy scales) may bear little resemblance to the original constituents. I will describe several examples of this phenomenon, including solitons with inverted charge/spin relations, elementary excitations in quantum spin chains, quasiparticles in the fractional quantum Hall effect, and vortices in p-wave superconductors. Time permitting, I will also describe the case of quantum critical points, where the quasiparticles are absent entirely.
Trinity College, Cambridge, UK

In this talk, I propose to do three jobs. First, I will review the rock-bottom basics of the composition of physical systems, in both classical and quantum physics, in terms of both state-spaces and algebras of quantities. I expect to mention the category-theoretic perspective, i.e. the idea of a category containing products. Then I will review the rock-bottom basics of the traditional philosophical ideas of (i) reduction as one theory being a logical definitional extension of another, and (ii) supervenience (also known as: determination or implicit definability). I expect to mention Beth's theorem, which gives conditions under which (ii) collapses into (i). Finally, I will discuss two ways in which the first two jobs are linked. These are ways in which supervenience of the whole upon the parts fails or could fail: ways which are familiar to philosophers, and which should be uncontroversial. The first way is factual, the second apparently counterfactual. Namely: (i) entangled states in quantum theory violate supervenience of the whole's quantum state upon the states of its parts; (ii) if there had been fundamental many-body forces (called 'configurational forces' by C. D. Broad, a British emergentist of the early twentieth century), then the theory of a many-body system would not supervene on a theory of its components that uses only two-body interactions.
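Point (i) can be made concrete in a few lines of linear algebra. The sketch below (assuming numpy; the states and helper functions are mine, not the speaker's) exhibits two distinct global states of a two-qubit system whose subsystems are nonetheless in exactly the same reduced states, so the whole's state is not determined by the states of its parts.

```python
import numpy as np

def dm(psi):
    """Density matrix |psi><psi| of a pure state vector."""
    return np.outer(psi, psi.conj())

def reduced(rho, keep):
    """Partial trace of a two-qubit density matrix; keep = 0 or 1."""
    r = rho.reshape(2, 2, 2, 2)  # indices: (a, b, a', b')
    if keep == 0:
        return np.trace(r, axis1=1, axis2=3)  # trace out qubit B
    return np.trace(r, axis1=0, axis2=2)      # trace out qubit A

# Two distinct entangled "wholes":
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)    # (|00> + |11>)/sqrt(2)
psi_minus = np.array([0, 1, -1, 0]) / np.sqrt(2)  # (|01> - |10>)/sqrt(2)

rho1, rho2 = dm(phi_plus), dm(psi_minus)
assert not np.allclose(rho1, rho2)  # the wholes differ...

for rho in (rho1, rho2):
    for k in (0, 1):
        # ...yet every part is in the same maximally mixed state I/2
        assert np.allclose(reduced(rho, k), np.eye(2) / 2)
print("distinct wholes, identical parts")
```

The two Bell states are physically distinguishable (e.g. by joint measurements), so no assignment of states to the parts alone fixes the state of the whole.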
Department of Philosophy, University of Waterloo
Most philosophical discussion of the `particle' (i.e., quanta) concept that is afforded by quantum field theory (QFT) has focused on free systems. I investigate whether the quanta concept for free systems can be extended to interacting systems. My conclusion is that the possible methods of accomplishing this are all unsatisfactory. Therefore, an interacting system cannot be interpreted in terms of quanta. As a consequence, QFT does not support assigning particles a fundamental place in our ontology. In contrast to much of the recent discussion of the particle concept derived from QFT, this argument does not rely on the assumption that a particulate entity be localizable. I will focus on isolating the features of QFT that rule out a quanta interpretation for interacting systems. In particular, the situation in relativistic QFT will be contrasted with that in Galilean QFT.
The two questions I shall be primarily concerned with are: what is (physical) structure? And what is the relationship between such structure and the putative entities of physics? I shall consider these in the broad context of the various forms of ‘ontic’ structural realism (OSR) currently on offer, namely eliminativist, non-eliminativist and ‘moderate’. I shall begin by outlining these positions, their underlying motivation, and their pros and cons. With regard to the first question, I shall suggest that the appropriate metaphysics needs to be tailored to fit our understanding of the relevant science, and in particular, of the role played by symmetry principles. The latter feature prominently in the above forms of OSR but tend to be ignored or dismissed in current philosophical accounts of laws and properties. I shall sketch a structuralist view of the latter that appropriately incorporates symmetries and helps illuminate the nature of the structure that the ontic structural realist posits. When it comes to the second question, I shall draw on metaphysical accounts of dependence to help explicate the relationship between the above structure and the putative objects of physics, again paying due attention to the crucial differences between the above versions of OSR. I shall also briefly consider how OSR might be extended to biology, for example, and the associated ‘mid-level’ objects. And I’ll conclude with some reflections on the relationship between metaphysics and science in the light of the above.
Adopting the quantum view of natural processes, one is naturally led to consider two puzzling situations arising from its most characteristic feature, i.e., the occurrence of entangled states of composite systems. The first one, and by far the best known, consists essentially in the measurement or macro-objectification problem, which emerges from the entanglement of a microscopic system with a macroscopic one. The second one has specifically to do with the appearance of identical constituents of composite quantum systems. In such a case, the symmetrization postulate for bosons and the antisymmetrization postulate for fermions make it quite problematic to identify whether entanglement is present or not. With reference to this case we point out that one can find many misleading approaches and conclusions in the literature. We will try to briefly describe how the problem of the whole and the parts presents itself in measurement-like situations, and we will show how some recent genuinely quantum proposals for overcoming the problem can lead to a satisfactory situation. Concerning the problem of identical constituents, we will make fully precise how to deal with the whole and the parts in the case of a composite system. Attention will be focused mainly on discussing entangled microscopic systems. Since the theoretical framework underlying the above-mentioned solutions to the macro-objectification problem satisfies the quantum requirements for systems with identical constituents, there will be no need to reconsider measurement processes taking into account the identity of the involved constituents.
I discuss a few possible reasons for thinking that the question "What is everything made of?" suffers from presupposition failure, and suggest a sense in which _something like_ this question is likely to survive all such concerns. Nothing that I say is expected to be news to philosophers - the talk's main purpose is to (hopefully) aid interdisciplinary discussion by introducing a few possibly-relevant ideas and distinctions that are common currency in analytic philosophy to a wider audience.
Stephan Hartmann

Various scientific theories stand in a reductive relation to each other. In a recent article, we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this article, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence. The talk is based on joint work with F. Dizadji-Bahmani and R. Frigg.
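The confirmatory link can be illustrated with a toy three-node chain (reducing theory, reduced theory, evidence). The numbers and the enumeration routine below are illustrative assumptions of mine, not the GNS model of the paper: before the reduction, the evidence leaves the reducing theory's probability untouched; once a reductive link is in place, the same evidence confirms it.

```python
from itertools import product

def posterior_R_given_E(p_R, p_P_given_R, p_E_given_P):
    """P(R=1 | E=1) in the chain R -> P -> E, by brute-force enumeration.

    R: reducing theory true, P: reduced theory true, E: evidence observed.
    p_P_given_R[r] = P(P=1 | R=r); p_E_given_P[p] = P(E=1 | P=p)."""
    joint = {}
    for r, p, e in product([0, 1], repeat=3):
        pr = p_R if r else 1 - p_R
        pp = p_P_given_R[r] if p else 1 - p_P_given_R[r]
        pe = p_E_given_P[p] if e else 1 - p_E_given_P[p]
        joint[(r, p, e)] = pr * pp * pe
    num = sum(v for (r, _, e), v in joint.items() if r == 1 and e == 1)
    den = sum(v for (_, _, e), v in joint.items() if e == 1)
    return num / den

p_R = 0.5
p_E_given_P = [0.1, 0.8]  # evidence is likely if the reduced theory is true

# Before the reduction: no epistemic link, P(P|R) does not depend on R.
before = posterior_R_given_E(p_R, [0.5, 0.5], p_E_given_P)

# After the reduction: R strongly supports P (toy numbers).
after = posterior_R_given_E(p_R, [0.2, 0.9], p_E_given_P)

print(before, after)  # before stays at the prior 0.5; after exceeds it
```

The point of the sketch is structural: establishing the reductive link makes the evidence for the reduced theory flow through to the reducing theory, which is the "confirmatory of each other" claim in miniature.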
I will discuss examples chosen to illustrate a variety of different ways in which physical systems may be taken to be composed of others. While these do not amount to “a plethora of quite specific, disconnected, sui generis, compositional facts”, neither do they support a view of the physical world as a compositional hierarchy. Physics succeeds by adopting a flexible, pragmatic strategy, decomposing systems into parts of whatever kind facilitates modeling their behavior in predictively and explanatorily successful ways.
A system that obeys classical dynamical equations can be taken to be in a quantum entangled state. We demonstrate how such a state evolves in time obeying a quantum Schroedinger equation. Constructing its quantum Hamiltonian and studying the properties thereof is an important mathematical exercise. To what extent is the Hamiltonian a local one? Can these quantum states be used to construct hidden variable theories? What do Bell's inequalities imply for such systems? Can a quantum entangled state originate as an initial condition for the universe, in spite of local determinism?
Philosophy Dept., University of Arizona I will begin by arguing (i) that
the phenomenon of vagueness harbors a specific kind of logical incoherence,
(ii) that this fact does not undermine vagueness in language and thought, and
(iii) that the logical incoherence inherent to vagueness renders impossible the
existence of vague objects (e.g., clouds, mountains) and vague properties
(e.g., baldness, heaphood). I will then sketch an argument,
based partly on these facts about vagueness and partly on the difficulty of
finding general and systematic principles of part-to-whole composition, that
the most viable ontology of concrete particulars is ontological monism—the view
that the only real concrete particular is the entire cosmos, construed as being
spatiotemporally complex and nonhomogenous, but also
as being partless. Finally, I will sketch an approach
to the language/world interface that reconciles ontological monism with the
truth of various statements of common sense and of science that posit ordinary
objects (e.g., clouds and mountains) or theoretical objects (e.g., electrons
and quarks); this reconciliation is what I call weak emergence. The talk will
draw largely on material from Terence Horgan and Matjaž
Leo P. Kadanoff
The University of Chicago & The Perimeter Institute

The renormalization group method, as developed by Kenneth G. Wilson, is first and foremost a method for achieving theory reduction. Historically, this method is rooted in the theory of phase transitions as it was developed by workers from van der Waals to Landau. Sharp phase transitions are necessarily connected with singularities in statistical mechanics, which require infinite systems for their realizations. A discussion of this point apparently marked a 1937 meeting in Amsterdam on van der Waals theory. Mean field theories, like van der Waals’, neither demand nor employ spatial infinities in their descriptions of phase transitions. For a good description of phase transitions, another kind of theory is required, one that weds a breaking of internal symmetries with a proper description of spatial infinities. The renormalization (semi-)group provides such a wedding. Its nature is described. I argue that each of its fixed points can be considered to be a physical theory, and the process of renormalization a theory reduction. The major conceptualizations surrounding this point of view are described.
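A minimal, self-contained instance of such a flow is the exact decimation of the one-dimensional Ising chain (my choice of example, not necessarily one from the talk): tracing out every other spin renormalizes the coupling K, and iterating the map drives every finite K to the trivial fixed point K* = 0, which is why the 1D chain has no finite-temperature phase transition.

```python
import math

def decimate(K):
    """One decimation step for the 1D Ising chain: summing out every
    other spin gives the exact recursion K' = (1/2) * ln(cosh(2K)),
    equivalently tanh(K') = tanh(K)**2."""
    return 0.5 * math.log(math.cosh(2 * K))

K = 2.0  # start at strong (but finite) coupling
flow = [K]
for _ in range(20):
    K = decimate(K)
    flow.append(K)

# The flow is monotonically decreasing: every finite coupling runs to
# the trivial fixed point K* = 0, while K = infinity is the only other
# (unstable) fixed point of the map.
print(flow[0], flow[-1])
```

In the fixed-points-as-theories reading sketched in the abstract, the iteration above is a concrete theory reduction: the long-distance description of any finite-coupling chain is the K = 0 (free, uncorrelated) theory.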
Daniel Dennett’s notion of ‘real pattern’ is a computational one. The idea is based on the compression of data and the reduction of information processing made possible by a high level description of a system that could in principle be described at a fine-grained level but at a much greater computational cost. Dennett’s paper is notoriously unclear about whether ‘real patterns’ should be regarded as real or as useful fictions. David Wallace advocates what he calls a functionalist account of ontology based on the notion of real patterns in his elucidation of the Everettian interpretation of quantum mechanics. His ontology is two-tier in that only higher-order entities such as cats and tables are understood in terms of real patterns, whereas the wavefunction or whatever else proves to be fundamental in physics is understood in categorical rather than functional terms. On the other hand, James Ladyman and Don Ross advocate a real patterns account of ontology across the board. In this paper, I will offer a version of the real patterns theory developed in terms of the dynamics of phase spaces rather than in computational terms, and clarify the question of whether the theory is best understood as a form of pragmatism or realism in ontology. I will also consider the implications of the view for the questions of emergence and composition addressed by the workshop.
I discuss the distinction between spatiotemporal mereological composition and qualitative mereological composition, with particular attention to the way that different features of the two mereologies can combine to provide an interesting range of possible overall ontological world-structures.
I will indicate the way composite systems consisting of two or more subsystems are represented in classical and quantum physics. To be more precise, I will focus on the particular structure of the state space of composite multi-partite systems as usually construed in these two parts of physics. I will argue that the differences encountered are of particular interest. Especially the structure of multi-partite entanglement in quantum physics - something not to be found in classical physics - is rather intricate and surprising. The emphasis will be on systems consisting of two and three subsystems only, as this suffices to provide us with the essential structural aspects for analyzing systems comprising more than three subsystems. These results are relevant for any philosophical analysis that bases its conclusions on the structure of the multi-partite state space of one of these two parts of physics, i.e., classical or quantum physics. If time permits I will present an example of this relevant to the part vs. whole debate: i.e., the question of holism in classical and quantum physics.
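One concrete way to see the intricacy of tripartite entanglement is the standard textbook contrast between the GHZ and W states (the code and helper functions below are my sketch, assuming numpy): tracing out one subsystem of GHZ leaves the remaining pair unentangled, while tracing one subsystem out of W leaves a pair with Wootters concurrence 2/3.

```python
import numpy as np

def ptrace_last(psi):
    """Reduced density matrix of the first two qubits of a 3-qubit pure state."""
    m = psi.reshape(4, 2)  # rows: first two qubits, columns: last qubit
    return m @ m.conj().T

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy
    evals = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sort(np.sqrt(np.abs(evals.real)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

ghz = np.zeros(8); ghz[0] = ghz[7] = 1 / np.sqrt(2)   # (|000>+|111>)/sqrt(2)
w = np.zeros(8); w[1] = w[2] = w[4] = 1 / np.sqrt(3)  # (|001>+|010>+|100>)/sqrt(3)

# GHZ entanglement is "all or nothing": discarding one party destroys it.
# W entanglement is robust: the surviving pair stays entangled.
print(concurrence(ptrace_last(ghz)))  # 0 for GHZ
print(concurrence(ptrace_last(w)))    # 2/3 for W
```

The two states cannot be converted into each other by local operations, so already at three parties "being entangled" splits into inequivalent structural kinds, which is the sort of intricacy the abstract points to.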
Analyzing things in terms of their parts is a manifestly productive research strategy. But one must be cautious about inferring conclusions about ontology from that success. (From the success of the strategy of linearizing one would not infer that all functional relations in nature are linear.) There are various reasons to be cautious. Different problems call for different decompositions that may not be interanalyzable. It’s not just the parts, but also their interactions that are crucial – and when should we say that we have interacting parts and when a whole with a corresponding property? When we hit something and things fly out, should we say that such resultants were prior parts or something created by the blow – indeed, even when the interaction is gentle, does what emerges count as the same thing that was previously a part? But most importantly, all such questions are complicated by the fact that all our accounts are idealizations or not completely accurate models, analysis into parts very much included. I will expand a bit on the first questions and then go to work on the very hard problem: How do we wrest knowledge, in particular knowledge about what there is, from falsehoods?
Quantum mechanics, understood literally as a description of the world - that is, understood in Everett's sense - faces a problem of indefiniteness: the theory seems (unless augmented somehow) to predict that macroscopic quantities have no definite values. But this criticism rests on a false dichotomy: that the macroworld must either be written directly into the formalism or be regarded as somehow illusory. By means of analogy with other areas of physics, I develop the view that the macroworld is instead to be understood in terms of certain structures and patterns, which emerge naturally in unitary quantum mechanics via the dynamical process of decoherence.
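The suppression of interference by decoherence can be sketched in a toy model (my own illustrative numbers, not a model from the talk): a qubit in superposition entangles with n environment qubits whose branch states overlap by cos(theta) each, so the off-diagonal "coherence" of the qubit's reduced state decays exponentially in n, and the two branches become effectively autonomous patterns.

```python
import numpy as np

# Each environment qubit ends up rotated by a small angle theta between
# the two branches, so the per-qubit branch overlap is cos(theta).
theta = 0.3

def coherence(n_env):
    """Magnitude of the off-diagonal element of the system's reduced
    density matrix after entangling with n_env environment qubits.

    For |psi> = (|0>|E0> + |1>|E1>)/sqrt(2) with product-state
    environments, the off-diagonal term is <E1|E0>/2 = cos(theta)**n_env / 2."""
    e0 = np.array([1.0, 0.0])
    e1 = np.array([np.cos(theta), np.sin(theta)])
    overlap = np.vdot(e1, e0) ** n_env
    return abs(overlap) / 2

# Interference terms vanish as the environment grows.
for n in (0, 1, 10, 50):
    print(n, coherence(n))
```

With no environment the coherence is the full 1/2; as records of the superposition spread into more environmental degrees of freedom, the branches stop interfering, which is the dynamical mechanism behind the "structures and patterns" reading of the macroworld.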