
    Beyond the Quantum
    from 29 May 2006 through 2 Jun 2006

 
1.                        Jean-Michel Raimond, Laboratoire Kastler Brossel, Département de Physique de l’Ecole Normale Superieure
 
 
Circular Rydberg atoms and superconducting millimetre-wave cavities realize an ideal system for quantum optics. A single long-lived atom interacts with a few photons stored in a modern realization of Einstein’s famous photon box. In the strong coupling regime of cavity QED, the coherent atom-field interaction overwhelms the dissipative processes responsible for decoherence.
 
This model system is thus quite appropriate for the realization of some of the gedanken-experiments proposed in the early times of quantum theory. It makes it possible to unveil the most intimate features of the quantum world: state superpositions, complementarity and entanglement.
 
The study of the fuzzy boundary between the quantum and the classical worlds is particularly interesting. The resonant or dispersive interaction of a single atom with a mesoscopic coherent field made up of a few tens of photons prepares quantum state superpositions at the border between the quantum and classical worlds, reminiscent of Schrödinger’s cat. These superpositions open the way to detailed studies of the decoherence process.
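
In the standard notation of cavity QED (added here only for orientation; the abstract itself gives no formulas), the dispersive interaction leaves the mesoscopic field in a superposition of two coherent states of equal amplitude and different phases, $|\Psi\rangle \propto |\alpha e^{i\phi}\rangle + |\alpha e^{-i\phi}\rangle$ with mean photon number $\bar{n} = |\alpha|^2$ of a few tens; such a superposition loses its coherence on a time scale of order $T_{\rm dec} \approx 2\,T_{\rm cav}/D^2$, where $T_{\rm cav}$ is the cavity lifetime and $D = |\alpha e^{i\phi} - \alpha e^{-i\phi}|$ the separation of the two components. Because $T_{\rm dec}$ is much shorter than $T_{\rm cav}$ for well-separated components, the decoherence process itself can be followed in real time.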
 
A stringent test of our understanding of decoherence would involve two such fields stored in two separate cavities. A single atom would prepare a mesoscopic state superposition presenting the non-local character of an EPR pair. This superposition violates an adapted version of the Bell inequality. Studying the gradual blurring of the non-local properties under the action of the decoherence mechanism is a fascinating perspective.
 
 
 
2.                        Franck Laloë, LKB, Dept physique de l'ENS, 24 rue Lhomond, F 75005 Paris (France)
 
 
Bose-Einstein condensates and Bell experiments
 
Bose-Einstein condensates such as those obtained in atomic physics experiments are well described quantum mechanically by Fock states, in which the number of particles is perfectly well defined. Such states have a completely unknown phase, and the question arises whether interference effects between two Fock states can be observed. Recent experiments on quantum gases show that they can, but where does the relative phase between the two states come from?
 
A study of the effect of quantum measurements on Fock states shows that the results of successive interference measurements are in fact exactly the same as if a relative phase existed initially, remained completely unchanged, and became progressively better and better known as measurements accumulate. Mathematically, the unknown phase arises naturally in the formalism when expressing the conservation of the number of particles in both populated states. This phase plays exactly the role of an additional variable in quantum mechanics (often called a "hidden variable"). Numerical simulations allow one to see how knowledge of this hidden variable is progressively acquired under the effect of successive measurements.
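
A minimal numerical sketch of this phase-emergence mechanism (ours, not taken from the talk; it assumes the standard large-N model in which each single-particle detection at position $x$ has likelihood proportional to $1 + \cos(x + \phi)$ for relative phase $\phi$) is the following. A flat prior over $\phi$ narrows as detections accumulate, exactly as if a definite but initially unknown phase had been present all along:

import numpy as np

rng = np.random.default_rng(0)

phis = np.linspace(-np.pi, np.pi, 361)         # grid of candidate relative phases
posterior = np.ones_like(phis) / len(phis)     # flat prior: the phase is initially undefined
xs = np.linspace(0.0, 2.0 * np.pi, 400)        # possible detection positions (in units of k*x)

for n in range(1, 201):
    # probability of the next detection position, marginalized over the current
    # posterior for phi: p(x) proportional to sum_phi p(phi) * (1 + cos(x + phi))
    px = (1.0 + np.cos(xs[:, None] + phis[None, :])) @ posterior
    px /= px.sum()
    x = rng.choice(xs, p=px)

    # Bayesian update with the likelihood of the detection just obtained
    posterior *= 1.0 + np.cos(x + phis)
    posterior /= posterior.sum()

    if n in (1, 10, 50, 200):
        R = abs(np.sum(posterior * np.exp(1j * phis)))      # circular concentration of the posterior
        print(f"after {n:3d} detections, phase uncertainty ~ {np.sqrt(-2.0 * np.log(R)):.2f} rad")

The inferred uncertainty shrinks roughly as the inverse square root of the number of detections, which is the precise sense in which the phase becomes "better and better known".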
 
The phenomenon becomes particularly illustrative in the case of spin states, where angular momentum is involved. Interesting nonlocal effects can then be predicted. For highly populated states, no violation of Bell inequalities takes place, and the situation is classical in this sense; but, as the populations decrease, more and more pronounced violations take place, until one reaches the well-known case of two spins in the S=1, M=0 state (maximum violation of the Bell inequalities).
 
 
 
3.                        Karl Hess and Walter Philipp, Beckman Institute, University of Illinois, Urbana, Illinois 61801
 
 
Probability Spaces, the Bell Theorem and Kolmogorov-type Consistency
 
In their introduction to quantum mechanics and probability theory, Breuer and Petruccione [1] note that many mathematical theorems express, in various ways, that quantum mechanics (QM) cannot be formulated as a statistical theory on a classical probability space, giving Bell (1966) and Gudder (1979) as examples. We do not fully agree with this statement and show, using the theorem of Bell as an example, that the statement should instead read: some results of QM, e.g. those involved in EPR-type experiments, cannot be formulated as a theory of a given number of random variables on one common probability space, because of the consistency problem for probability distributions first investigated by Bass, Schell and Vorob'ev [2]. The result we have obtained for the Bell theorem can also be applied to similar theorems and corresponding experiments, and we discuss the Greenberger-Horne-Zeilinger theorem and the Pan et al. experiment along these lines. Problems generally arise if the experiments being described happen to be incompatible and demand the use of different random variables and/or probability spaces. We show that the use of one common probability space cannot be justified from results of quantum mechanics, e.g. solutions of the Schroedinger equation.
 
Our examples make it clear that a comparison of Kolmogorov’s probability theory with QM and Born’s probability interpretation requires a careful assessment of the actual experimental conditions. The link to experiments has been worked out differently in the two approaches. One of the most obvious differences is that Kolmogorov’s probability theory connects to experiments via a sample space and a probability space, while QM connects to experiments through considerations of state preparation and of operators corresponding to observable quantities. We will show that no simple correspondence exists between elements of the sample space and the states of QM, and that this fact is at the root of the difficulties encountered when comparing the two approaches. To avoid these difficulties, generalities must be set aside and concrete experiments analyzed carefully, one by one. In this analysis, an assessment must be made of which observables need to be described by different random variables and/or probability spaces. We will show that dependencies on measurement time play a particularly important role in this assessment, and that relativistic considerations involving light cones provide criteria for whether certain random variables can or cannot be defined on a common probability space.
Our results shed new light on the various no-go proofs such as that of Bell: his proof only excludes small numbers of random variables on a common probability space and does not necessarily require violations of Einstein locality.
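
The consistency problem invoked above can be made concrete with a small illustration of our own (it is not taken from the paper and uses the standard CHSH settings): the four singlet-state correlations $E(a,b) = -\cos(a-b)$ admit no joint distribution of four $\pm 1$-valued random variables $A, A', B, B'$ on a single probability space. A linear-programming feasibility check over the sixteen joint outcomes confirms this:

import itertools
import numpy as np
from scipy.optimize import linprog

angles_a = [0.0, np.pi / 2]             # settings a, a' on one side
angles_b = [np.pi / 4, 3 * np.pi / 4]   # settings b, b' on the other side
outcomes = list(itertools.product([1, -1], repeat=4))   # joint values of (A, A', B, B')

# one equality constraint per pair correlation, plus normalization of the 16 probabilities
A_eq, b_eq = [], []
for i, a in enumerate(angles_a):
    for j, b in enumerate(angles_b):
        A_eq.append([o[i] * o[2 + j] for o in outcomes])
        b_eq.append(-np.cos(a - b))     # singlet-state correlation E(a, b)
A_eq.append([1.0] * len(outcomes))
b_eq.append(1.0)

res = linprog(c=np.zeros(len(outcomes)), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * len(outcomes), method="highs")
print("a single joint distribution exists:", res.success)   # prints False

Each pair taken separately has a perfectly acceptable distribution; it is only the demand that all four variables be defined on one common probability space that fails, which is the consistency problem of Bass, Schell and Vorob'ev referred to above.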
 
[1] H. P. Breuer and F. Petruccione, The Theory of Open Quantum Systems, Oxford University Press (2002)
[2] N. N. Vorob'ev, Theory of Probability and its Applications, Vol. VII, pp. 147-162 (1962)
 
 
 
4.                        Alexander Burinskii
 
 
The Kerr particle and Quantum Mechanics
 
The story of the Kerr particle spans more than 35 years. It has by now grown into a theory whose exposition requires several lectures, covering:
 
1) the specific real structure of the Kerr solution,
2) the structure of the source in the Compton region,
3) its specific complex ‘point-like’ structure,
4) relationships to the Dirac theory and QED.
 
 
 
5.                        Gregg Jaeger, College of General Studies and Quantum Imaging Lab, 1871 Commonwealth Ave. Boston University
 
 
Beyond Quantum Communication
 
I will discuss the value of communication protocols and communication complexity methods as a probe for the investigation of phenomena in the quantum realm that might ultimately require a description other than the standard quantum description.
 
 
 
6.                        Willem de Muynck
 
 
The generalization of the quantum mechanical formalism to positive operator-valued measures as a subject
 
By this extension of the mathematical formalism, new kinds of measurements can be described (for instance, joint nonideal measurements of incompatible observables, as well as complete measurements). The extension also opens up a more general view of quantum information, which may be relevant to quantum computing.
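
As one standard illustration of such a measurement (a textbook example, not necessarily the one discussed in the talk), a joint nonideal measurement of the incompatible qubit observables $\sigma_x$ and $\sigma_z$ is described by the four-outcome positive operator-valued measure $E_{\pm\pm} = \frac{1}{4}\,(I \pm \lambda_x \sigma_x \pm \lambda_z \sigma_z)$ with $\lambda_x^2 + \lambda_z^2 \le 1$; its marginals $\frac{1}{2}(I \pm \lambda_x \sigma_x)$ and $\frac{1}{2}(I \pm \lambda_z \sigma_z)$ are unsharp versions of the two projective measurements, something no projection valued measure can provide.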
 
 
 
7.                        José Lages, Laboratoire de Physique Moléculaire, UMR CNRS 6624, Université de Franche-Comté, La Bouloie, F-25030 Besançon Cedex, France
 
 
Noncommutativity in Physics
 
In the nineties, Dyson published a proof of the Maxwell equations due to Feynman. From classical assumptions (Galilean invariance), Feynman derived the Maxwell equations, which are relativistic (Lorentz invariant). The demonstration involves new brackets (Feynman brackets) which are neither the Poisson brackets nor the quantum brackets. With these new brackets, velocities do not commute, and the commutator of two velocities is related to the electromagnetic Faraday tensor. These new brackets have very interesting mathematical properties and can be related to the first two orders of the Moyal bracket in noncommutative geometry. The formalism has various fields of application, such as gravitoelectromagnetism (where monopoles are necessary to recover the SO(3) symmetry) or the anomalous Hall effect (recent experiments on the anomalous Hall effect provide evidence for noncommutativity, and a monopole, in crystal momentum space).
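
For orientation, in the notation of Dyson's 1990 American Journal of Physics article (the normalization of the brackets used in the talk may differ), the assumptions are $[x_j, x_k] = 0$ and $m\,[x_j, \dot{x}_k] = i\hbar\,\delta_{jk}$, together with Newton's law $m\ddot{x}_j = F_j(x, \dot{x}, t)$. From these one derives the existence of fields $E(x,t)$ and $B(x,t)$ such that $[\dot{x}_j, \dot{x}_k] = (i\hbar/m^2)\,\varepsilon_{jkl} B_l$ and $m\ddot{x}_j = E_j + \varepsilon_{jkl}\,\dot{x}_k B_l$, with $B$ and $E$ automatically obeying the two homogeneous Maxwell equations $\nabla \cdot B = 0$ and $\partial B/\partial t + \nabla \times E = 0$.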
 
 
 
8.                        Roland Omnès, Laboratoire de physique Théorique, Université de Paris XI
 
 
Non-linear corrections to decoherence and quantum reduction
 
The time evolution of a full density matrix is linear, and the trace over an environment defining a reduced density matrix is also a linear operation. Little attention has been given, however, to the fact that decoherence (through which the reduced density matrix tends to become diagonal) is generally a non-linear approximation. Similarly, the conservation of the “diagonal” elements of this matrix, which are considered as empirical probabilities, is also only approximate in realistic situations. As a consequence, these so-called probabilities are generally time-varying, although this effect is significantly slower than decoherence.
The time scale of this effect has not yet been evaluated, and it could be so long as to have no consequence. A fascinating possibility occurs, however, in the opposite case. If one assumes a Brownian variation of the diagonal elements of the reduced density matrix, previous works by Pearle and by the author show that the ultimate outcome will be a reduction effect, re-establishing exact agreement between the empirical probabilities and the predictions of quantum mechanics. Older ideas by Pearle and other authors on reduction through non-linear effects would then be in complete agreement with the opposite “dogma”, according to which the basic principles of quantum mechanics are completely consistent and contain everything entering into their own interpretation.
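
The reduction mechanism alluded to here can be illustrated by a minimal sketch of our own, in the spirit of Pearle's "gambler's ruin" argument (it is not code from the paper): if a diagonal element p of the reduced density matrix performs an unbiased random walk between 0 and 1, every realization eventually ends at 0 or 1, and the fraction of realizations ending at 1 reproduces the initial value of p, i.e. the quantum mechanical probability:

import numpy as np

rng = np.random.default_rng(1)
p0, n_runs, dt, eps = 0.3, 4000, 1e-3, 1e-3

p = np.full(n_runs, p0)
active = np.ones(n_runs, dtype=bool)
while active.any():
    # unbiased (martingale) step whose size vanishes at the boundaries of [0, 1]
    step = np.sqrt(dt) * rng.normal(size=active.sum()) * p[active] * (1.0 - p[active])
    p[active] = np.clip(p[active] + step, 0.0, 1.0)
    active = (p > eps) & (p < 1.0 - eps)

# every run has reduced to (nearly) 0 or 1; the frequency of 1 reproduces p0
print("fraction of runs reduced to p ~ 1:", np.mean(p > 0.5), "  initial p0:", p0)

The conservation of the mean of p (the martingale property) is what guarantees that the outcome frequencies agree with the Born probabilities.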
 
 
 
9.                        Emilio Santos
 
 
Tests of Bell's inequalities: Increasing empirical support for local realism?
 
It is argued that local realism is a fundamental principle, which should be rejected only if experiments clearly show that it is untenable. Forty years after Bell's inequalities were derived, no experiment has provided a valid, loophole-free violation of them. This persistent failure supports the view that local realism is upheld by nature. I propose that, without any change in the (Hilbert-space) formalism and the equations, quantum mechanics might be made compatible with local realism by some change in the measurement postulates. In this respect, I comment on the optical tests of Bell's inequalities.
 
 
 
10.                    Andrei Khrennikov, Director of International Center for Mathematical Modeling in Physics, Engineering, Economy and Cognitive Sc., University of Vaxjo, Sweden
 
 
Quantum mechanics as an asymptotic projection of statistical mechanics of classical fields
 
We show that QM can be represented as a natural projection of a classical statistical model on the phase space $\Omega = H \times H$, where $H$ is the real Hilbert space. Statistical states are given by Gaussian measures on $\Omega$ having zero mean value and dispersion of very small magnitude $\alpha$ (which is considered as a small parameter of the model). Such statistical states can be interpreted as fluctuations of the background field, cf. SED and Nelson's mechanics. Physical variables (e.g., energy) are given by maps $f: \Omega \to {\bf R}$ (functions of classical fields). The conventional quantum representation of our prequantum classical statistical model is constructed on the basis of the Taylor expansion (up to terms of the second order at the vacuum field point $\psi_{\rm vacuum} \equiv 0$) of variables $f: \Omega \to {\bf R}$ with respect to the small parameter $\sqrt{\alpha}$. The complex structure of QM is induced by the symplectic structure on the infinite-dimensional phase space $\Omega$. A Gaussian measure (statistical state) is represented in QM by its covariation operator. The equations of Schrödinger, Heisenberg and von Neumann are images of Hamiltonian dynamics on $\Omega$. The main experimental prediction of our prequantum model is that experimental statistical averages can deviate from those given by QM.
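
A finite-dimensional numerical check of this correspondence (ours, not part of the abstract; the actual model is infinite-dimensional) shows how the covariation operator plays the role of a density operator: for a quadratic classical variable $f(\psi) = \langle \psi, A\psi \rangle$ averaged over a zero-mean Gaussian measure with covariance $\alpha D$, the classical mean equals $\alpha\,{\rm Tr}(DA)$, a quantum-style trace formula scaled by the small parameter $\alpha$:

import numpy as np

rng = np.random.default_rng(2)
n, alpha, n_samples = 4, 1e-4, 200000

# random symmetric "observable" A and a positive, unit-trace "covariation operator" D
A = rng.normal(size=(n, n))
A = (A + A.T) / 2.0
M = rng.normal(size=(n, n))
D = M @ M.T
D /= np.trace(D)

# classical field configurations sampled from the Gaussian measure N(0, alpha*D)
psi = rng.multivariate_normal(np.zeros(n), alpha * D, size=n_samples)
classical_average = np.mean(np.einsum("ki,ij,kj->k", psi, A, psi))

print("Monte Carlo average of f(psi) = <psi, A psi> :", classical_average)
print("alpha * Tr(D A)                              :", alpha * np.trace(D @ A))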
 
 
 
11.                    Gerard 't Hooft, Institute for Theoretical Physics, Utrecht University
 
 
The Mathematical Basis for Deterministic Quantum Mechanics
 
If there exists a classical, i.e. deterministic theory underlying quantum mechanics, an explanation must be found of the fact that the Hamiltonian, which is defined to be the operator that generates evolution in time, is bounded from below. The mechanism that can produce exactly such a constraint is identified in this paper. It is the fact that not all classical data are registered in the quantum description. Large sets of values of these data are assumed to be indistinguishable, forming equivalence classes. It is argued that this should be attributed to information loss, such as what one might suspect to happen during the formation and annihilation of virtual black holes.
The nature of the equivalence classes is further elucidated, as it follows from the positivity of the Hamiltonian. Our world is assumed to consist of a very large number of subsystems that may be regarded as approximately independent, or weakly interacting with one another. As long as two (or more) sectors of our world are treated as being independent, they all must be demanded to be restricted to positive energy states only. What follows from these considerations is a unique definition of energy in the quantum system in terms of the periodicity of the limit cycles of the deterministic model.
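
The last statement can be illustrated with the simplest periodic ("cogwheel") toy model; the sketch below is ours and assumes nothing beyond what is stated in the abstract. $N$ ontological states permuted cyclically at every time step $\delta t$ define a unitary one-step evolution operator; choosing its eigenphases in $[0, 2\pi)$ yields a Hamiltonian spectrum $E_n = 2\pi n/(N\,\delta t)$, $n = 0, \dots, N-1$, which is bounded from below and fixed entirely by the period $N\,\delta t$ of the limit cycle:

import numpy as np

N, dt = 5, 1.0
U = np.roll(np.eye(N), 1, axis=0)        # deterministic update: state i -> state i+1 (mod N)

# eigenphases of the one-step evolution operator, folded into [0, 2*pi)
eigvals = np.linalg.eigvals(U)
phases = np.mod(-np.angle(eigvals), 2.0 * np.pi)
phases[np.isclose(phases, 2.0 * np.pi)] = 0.0     # keep the zero mode at E = 0

energies = np.sort(phases) / dt
print("energies from the cycle:", np.round(energies, 4))
print("expected 2*pi*n/(N*dt) :", np.round(2.0 * np.pi * np.arange(N) / (N * dt), 4))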
 
 
 
12.                    Louis Marchildon
 
 
Understanding long-distance quantum correlations
 
The interpretation of quantum mechanics (or, for that matter, of any physical theory) consists in answering the question: How can the world be for the theory to be true?  That question is especially pressing in the case of the long-distance correlations predicted by Einstein, Podolsky and Rosen, and rather convincingly established during the past decades in various laboratories.  I will review four different approaches to the understanding of long-distance quantum correlations: (i) the Copenhagen interpretation and some of its modern variants; (ii) Bohmian mechanics of spin-carrying particles; (iii) Cramer's transactional interpretation; and (iv) the Hess-Philipp analysis of extended parameter spaces.
 
 
 
13.                    Giacomo Mauro D'Ariano, Dipartimento di Fisica "A. Volta", via Bassi 6, I-27100 Pavia, Italy
 
 
Hilbert-Space Formulation of Quantum Mechanics From Purely Operational Axioms
 
I show how it is possible to derive the Hilbert-space formulation of Quantum Mechanics from a comprehensive definition of "physical experiment", assuming "experimental accessibility and simplicity" as specified by five simple postulates. Pivotal roles are played by the "local observability principle", which reconciles the holism of nonlocality with the reductionism of local observation, and by the postulated existence of "informationally complete observables" and of a "symmetric faithful state". This last notion allows one to introduce an operational definition of the real version of the "adjoint", i.e. the transposition, from which one can derive a real Hilbert-space structure via either the Mackey-Kakutani or the Gelfand-Naimark-Segal construction. I will analyze in detail the Gelfand-Naimark-Segal construction, which leads to a real Hilbert-space structure analogous to that of (classes of generally unbounded) selfadjoint operators in Quantum Mechanics. For finite dimensions, general dimensionality theorems that can be derived from the local observability principle allow us to represent the elements of the real Hilbert space as operators over an underlying complex Hilbert space. The route to the present operational axiomatization was suggested by novel ideas originating from Quantum Tomography.
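
For orientation only: in its standard (complex) form, which differs in detail from the operational, real version described above, the Gelfand-Naimark-Segal idea is that a faithful state $\omega$ defines a scalar product $\langle A, B \rangle := \omega(A^\dagger B)$ on the algebra of transformations, and the completion with respect to this product is the Hilbert space on which the transformations then act as operators.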
 
 
 
14.                    Hans Westman, Perimeter Institute
 
 
Two recent developments in pilot-wave theory
 
We present two recent developments in pilot-wave theory. Firstly, we present a numerical simulation showing that in pilot-wave theory an arbitrary distribution p(x) of particle beables approaches the |psi|^2 distribution at a coarse-grained level. This strongly suggests that the Born rule is not fundamental but merely a consequence of a relaxation process that presumably took place early in the universe. Secondly, we present a pilot-wave model that reproduces the statistics of quantum electrodynamics. The key idea is similar to the one Bell used to develop a pilot-wave theory for spin.
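
A minimal version of the first result can be sketched as follows (our illustration in the spirit of the published box simulations, not the authors' code; units with $\hbar = m = 1$ and a box of unit length are assumed): an ensemble of particles distributed differently from $|\psi|^2$ is guided by the de Broglie-Bohm velocity $v = {\rm Im}(\partial_x \psi/\psi)$ in a superposition of box eigenmodes, and its coarse-grained histogram drifts towards $|\psi|^2$:

import numpy as np

rng = np.random.default_rng(3)

M = 8                                          # number of box eigenmodes in psi
n = np.arange(1, M + 1)
c = np.exp(2j * np.pi * rng.random(M)) / np.sqrt(M)   # equal amplitudes, random phases
E = 0.5 * (n * np.pi) ** 2                     # mode energies

def psi_dpsi(x, t):
    # psi(x, t) and its spatial derivative for the superposition of box modes
    a = c * np.exp(-1j * E * t)
    s = np.sin(np.outer(x, n) * np.pi)
    ds = np.cos(np.outer(x, n) * np.pi) * (n * np.pi)
    return np.sqrt(2.0) * (s @ a), np.sqrt(2.0) * (ds @ a)

# non-equilibrium initial ensemble: density proportional to sin(pi*x), not |psi|^2
x = np.arccos(1.0 - 2.0 * rng.random(5000)) / np.pi

dt, T = 5e-4, 2.0
for step in range(int(T / dt)):
    psi, dpsi = psi_dpsi(x, step * dt)
    v = np.clip(np.imag(dpsi / psi), -50.0, 50.0)   # cut off blow-ups near nodes
    x = np.clip(x + v * dt, 1e-6, 1.0 - 1e-6)

# coarse-grained comparison of the ensemble with |psi|^2 at the final time
edges = np.linspace(0.0, 1.0, 21)
hist, _ = np.histogram(x, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
psi_T, _ = psi_dpsi(centers, T)
print(np.round(hist, 2))
print(np.round(np.abs(psi_T) ** 2, 2))

The crude Euler step and the velocity cut-off near nodes of the wave function are for illustration only; the published simulations use far more careful numerics.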
 
 
 
15.                    Theo. M. Nieuwenhuizen, Institute for Theoretical Physics, Universiteit van Amsterdam
 
 
The electron as a soliton in classical electromagnetism
 
In classical electrodynamics, extended with gradients of the electric and magnetic fields, a soliton is presented which bears features of the Kerr-Newman electron of electrogravity. In this model the electron has a ring shape, with diameter equal to the Compton length and thickness smaller by the fine-structure constant. There occur a finite mass, a spin 1/2, a g = 2 factor, and an electric quadrupole moment that is also “twice too large”. From this setup, all relativistic corrections to the classical version of the Pauli Hamiltonian are derived. There appears an additional, spin-dependent quadrupolar force that may vanish on average.
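
In explicit numbers (standard values, added for orientation, and reading “Compton length” as the reduced Compton wavelength): the diameter scale is $\hbar/mc \approx 3.9 \times 10^{-13}$ m, and the thickness scale is smaller by the fine-structure constant, $\alpha\,\hbar/mc = e^2/(4\pi\varepsilon_0 m c^2) \approx 2.8 \times 10^{-15}$ m, i.e. the classical electron radius.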
 
Particle-antiparticle annihilation and the Pauli principle may then be explained on the basis of electromagnetism alone.
 
The approach starts with a linear analysis, while non-linearities, treated perturbatively, fix mass, charge and spin.
 
 
 
 
 

