Fifty years ago, Simon Kochen and Ernst Specker proved that quantum theory cannot be explained by noncontextual models. Since then, the Kochen-Specker theorem has been regarded as one of the basic results in the foundations of quantum theory. At the same time, the questions it raised and the methods it used received little attention.
In the last decade, however, the Kochen-Specker theorem has inspired results that help us understand quantum theory and identify the origin of the power of quantum systems for information processing and computation. There is increasing evidence that contextuality, that is, the impossibility of noncontextual models, is the notion that best captures the sense in which quantum theory is fundamentally different from classical physics. There is also increasing evidence that contextuality is the basic resource behind the quantum advantage in computation and information processing. Moreover, there is increasing evidence that the principle of exclusivity, considered by Specker the fundamental principle of quantum theory, is the fundamental principle that limits correlations in nature. Most importantly, there is a growing community of young researchers worldwide actively engaged in advancing our knowledge on each of these fronts.
The aim of this workshop is to bring together at ETH Zurich, the place where the Kochen-Specker theorem was conceived, leading researchers in all these fields in order to develop broader perspectives, draw connections between different approaches, stimulate collaborations, and envision objectives for future research.
- Simon Kochen, U. Princeton, USA
- Alastair Abbott, Institut Néel, France
- Joseba Alonso, ETH Zurich, Switzerland
- Chris Fuchs, University of Massachusetts Boston, USA
- Jürg Fröhlich, ETH Zurich, Switzerland
- Angela Karanjai, U. Sydney, Australia
- Marc-Olivier Renou, U. Geneva, Switzerland
- Otfried Gühne, U. Siegen, Germany
- Robert Raussendorf, U. British Columbia, Canada
- Ana Belén Sainz, Perimeter Institute, Canada
- Robert W. Spekkens, Perimeter Institute, Canada
Thursday, 22nd June
The talk will show how exactly the same intuitively plausible definitions of state, observable, symmetry, and dynamics of the Boolean structure of intrinsic properties of classical systems, when applied to the structure of extrinsic, relational quantum properties, lead to the standard quantum formalism, including the Schrödinger equation and the von Neumann–Lüders Projection Rule. This approach is then applied to resolving the paradoxes and difficulties of the orthodox interpretation.
11:00 Marc-Olivier Renou – Network-locality [slides]
In quantum nonlocality, distant observers performing local measurements on a shared entangled quantum state can observe strong correlations that have no equivalent in classical physics. Indeed, the characterization of all classical probability distributions achievable with a single local hidden variable model is by now a relatively well understood problem.
A direct generalization is to study several independent sources distributed in a network: we go from “Bell locality” to “Network-locality”. Here, a network features distant observers, as well as several independent sources distributing states to different subsets of observers. No practical method to characterize the set of classical probability distributions for networks is known.
In this talk, we focus on several network examples, illustrating some characteristics that make the problem hard: non-convexity and non-linearity. We introduce a geometrical approach to the triangle scenario, in which three sources are distributed between three observers in a loop. With this approach, we find a new inequality and make a connection with computer science.
12:00 Angela Karanjai – Bounding complexity of stabilizer models
This talk will be about constraints on any model that reproduces the qubit stabilizer sub-theory. We show that the minimum number of classical bits required to specify the state of an n-qubit system must scale as ~n(n-3)/2 in any model that does not contradict the predictions of the quantum stabilizer sub-theory. The Gottesman-Knill algorithm, a strong simulation algorithm, is in fact very close to this bound, as it scales as ~n(2n+1). This is a result of state-independent contextuality, which puts a lower bound on the minimum number of states a model requires in order to reproduce the statistics of the qubit stabilizer sub-theory.
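As a rough editorial illustration of the two scalings quoted in the abstract (not part of the talk itself), one can compare the lower bound ~n(n-3)/2 with the Gottesman-Knill cost ~n(2n+1) for a few system sizes; both are quadratic in n, and their ratio tends to a constant:

```python
# Compare the contextuality-based lower bound on classical memory,
# ~n(n-3)/2 bits, with the ~n(2n+1)-bit cost attributed to the
# Gottesman-Knill algorithm in the abstract above.

def lower_bound_bits(n):
    """Minimum classical bits any model needs (per the abstract): ~n(n-3)/2."""
    return n * (n - 3) // 2

def gottesman_knill_bits(n):
    """Bits used by Gottesman-Knill simulation (per the abstract): ~n(2n+1)."""
    return n * (2 * n + 1)

for n in (10, 100, 1000):
    lb, gk = lower_bound_bits(n), gottesman_knill_bits(n)
    print(f"n={n}: lower bound {lb} bits, Gottesman-Knill {gk} bits, "
          f"ratio {gk / lb:.2f}")
```

The ratio n(2n+1) / [n(n-3)/2] approaches 4 as n grows, so the known simulation method is within a constant factor of the contextuality lower bound.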
14:00 Robert Raussendorf – Topological proofs of contextuality in quantum mechanics
We provide a cohomological framework for contextuality of quantum mechanics that is suited to describing contextuality as a resource in measurement-based quantum computation. This framework applies to the parity proofs first discussed by Mermin, as well as a different type of contextuality proof based on symmetry transformations. The topological arguments presented can be used in both the state-dependent and the state-independent case.
Joint work with Cihan Okay, Sam Roberts and Stephen D. Bartlett. See arXiv:1701.01888 [quant-ph].
15:30 Otfried Gühne – Contextuality and Temporal Correlations in Quantum Mechanics [slides]
Experimental tests of contextuality often make use of sequential measurements on single quantum systems. In this talk I will explore the temporal correlations that can arise if a sequence of measurements is made on a single quantum system. First, I will discuss the complexity of such correlations and the difficulty of simulating them classically. Second, I will present methods to characterize temporal correlations, allowing one to compute the maximal violation of contextuality inequalities in quantum mechanics. Finally, I will show how the correlations can be used to estimate the dimension of the underlying quantum system.
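For a concrete (editorial) example of computing a maximal quantum violation, consider the standard KCBS pentagon inequality, whose noncontextual bound is 2: the quantum maximum is the largest eigenvalue of the sum of five rank-1 projectors onto the pentagram vectors, which equals √5 ≈ 2.236. This is a textbook example and not necessarily the construction used in the talk:

```python
import numpy as np

def kcbs_vectors():
    """Five unit vectors in R^3 with consecutive pairs orthogonal (pentagram).

    The polar angle theta satisfies cos^2(theta) = cos(pi/5)/(1 + cos(pi/5)),
    which makes <v_j|v_{j+1}> = 0 for azimuthal angles 4*pi*j/5.
    """
    c2 = np.cos(np.pi / 5) / (1 + np.cos(np.pi / 5))
    ct, st = np.sqrt(c2), np.sqrt(1 - c2)
    return [np.array([st * np.cos(4 * np.pi * j / 5),
                      st * np.sin(4 * np.pi * j / 5),
                      ct]) for j in range(5)]

vecs = kcbs_vectors()
# Consecutive vectors are orthogonal, so neighbouring projectors commute
for j in range(5):
    assert abs(vecs[j] @ vecs[(j + 1) % 5]) < 1e-12

S = sum(np.outer(v, v) for v in vecs)   # sum of the five rank-1 projectors
max_val = np.linalg.eigvalsh(S)[-1]     # eigvalsh returns ascending order
print(max_val)                          # ~ 2.236 = sqrt(5), vs classical bound 2
```

The maximizing eigenvector is the symmetry axis of the pentagram, reflecting the rotational symmetry of the construction.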
Friday, 23rd June
9:30 Joseba Alonso – Quantum contextuality tests with a single trapped-ion qutrit [slides]
We report on the generation and observation of correlations in a local system beyond those allowed by purely classical models. These manifest in the violation of non-contextual inequalities. For this, we make use of a single qutrit encoded into electronic energy levels of a 40Ca+ ion in a cryogenic surface-electrode trap. In a first experiment we demonstrate sustained generation of stronger-than-classical correlations by performing over 50 million consecutive measurements on the same qutrit. We do this with a single, self-correcting sequence, randomizing the measurement settings on the go. Our results violate the relevant state-independent inequality by 236 standard deviations. In a second experiment we explore the limits of quantum correlations using sequential measurements of five observables. Our measurements violate the most fundamental non-contextual inequality by 18 standard deviations, while lying within one standard deviation of the quantum-mechanical prediction. Furthermore, we have gradually increased the complexity of our system to include up to 31 observables and found stronger-than-classical correlations in all prepared scenarios. Our findings indicate that the limits of correlations in our system are indeed consistent with QM, while clearly revealing contextuality even under the experimentally unavoidable lack of compatibility between sets of ideally compatible observables, thereby addressing the so-called compatibility loophole.
11:00 Robert W. Spekkens – Translating proofs of the Kochen-Specker theorem into noncontextuality inequalities that are robust to noise [slides]
The Kochen-Specker theorem rules out models of quantum theory that satisfy a particular assumption of context-independence: that sharp measurements are assigned outcomes both deterministically and independently of their context. This notion of noncontextuality is not suited to a direct experimental test because realistic measurements always have some degree of unsharpness due to noise. However, a generalized notion of noncontextuality has been proposed that applies to any experimental procedure, including unsharp measurements and also preparations, and for which a quantum no-go result still holds. According to this notion, the model need only specify a probability distribution over the outcomes of a measurement in a context-independent way, rather than specifying a particular outcome. It also implies novel constraints of context-independence for the representation of preparations.
I will describe a general technique for deriving inequalities for generalized noncontextuality, i.e., inequalities that test whether a given set of experimental statistics is consistent with a generalized-noncontextual model. In particular, I consider how to translate proofs of the Kochen-Specker theorem into such inequalities. Both state-independent and state-dependent proofs are considered. Unlike previous inequalities inspired by the Kochen-Specker theorem, this approach does not assume that the value-assignments are deterministic; therefore, in the face of a violation of the inequality, salvaging noncontextuality by abandoning determinism is no longer an option. The approach is operational in the sense that it does not presume quantum theory: a violation of these inequalities implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory.
12:00 Chris Fuchs – Is there a SIC in the sky when nobody looks?
14:00 Ana Belén Sainz – Kochen-Specker contextuality: a hypergraph approach with operational equivalences [slides]
Most work on contextuality so far has focused on specific examples and concrete proofs of the Kochen-Specker theorem, while general definitions and theorems about contextuality are sparse. For example, it is commonly believed that nonlocality is a special case of contextuality, but what exactly does this mean? In this work, which builds on the graph-theoretic approach of Cabello, Severini and Winter, we develop a hypergraph approach to study Kochen-Specker contextuality and Bell nonlocality in a unified manner. In this talk I will focus further on the relation between some sets of probabilistic models and graph invariants, and discuss principles to characterise quantum predictions.
15:30 Alastair Abbott – The (strong) Kochen-Specker theorem, the eigenstate-eigenvalue link, and quantum randomness [slides]
The Kochen-Specker theorem rules out the possibility of noncontextual hidden-variable formulations of quantum mechanics and, as a result (along with Bell's theorem), has played an important role in grounding the belief that quantum mechanics is indeed intrinsically random. This common view is well summed up in the "eigenstate-eigenvalue link", which states that a system has a definite property with respect to an observable iff it is in an eigenstate of that observable.
Recent work has made strong progress in developing a device-independent understanding of contextuality and understanding contextuality without assuming outcome determinism. In this talk I will look in the other direction, and discuss what the Kochen-Specker Theorem and contextuality really tell us about quantum indeterminism and randomness. I will argue that there is a discrepancy between the conclusions of the principal no-go theorems (i.e., Bell, Kochen-Specker, etc.) and the E-E link, and prove a strengthened version of the Kochen-Specker Theorem which closes this gap and, given the relevant assumptions, proves the E-E link. In contrast to standard proofs of the Kochen-Specker theorem, this stronger result requires a novel constructive method of reduction between Kochen-Specker sets. I will finish by discussing briefly the connection between these results and quantum randomness, and in particular the relation between indeterminism, unpredictability and randomness.
16:30 Jürg Fröhlich – The ETH approach to quantum mechanics
A novel approach to Quantum Mechanics is outlined, called the "ETH approach", where "E" stands for "events", "T" for "trees", and "H" for "histories".
The purpose of this approach is to add precision to the formulation of Quantum Mechanics, clarifying what an "event" is in Quantum Mechanics and how to observe or record an event using "instruments". The ultimate aim of this approach is to develop a "Quantum Theory without Observers". The ETH approach is based on a precise definition of "open isolated systems" and on two basic principles: (i) the "principle of loss of direct access to information" in open isolated systems, and (ii) a principle that specifies the impact of an event happening at some time t on the state of the system after time t. These principles imply that the dynamics of states of open isolated systems can be described in terms of a new kind of stochastic branching process with a "non-commutative state space".