A Bell test is a randomized trial that compares experimental observations against the philosophical worldview of local realism, in which the properties of the physical world are independent of our observation of them and no signal travels faster than light. A Bell test requires spatially distributed entanglement, fast and high-efficiency detection, and unpredictable measurement settings. Although technology can satisfy the first two of these requirements, the use of physical devices to choose settings in a Bell test involves making assumptions about the very physics that one aims to test. Bell himself noted this weakness of physical setting choices and argued that human 'free will' could be used rigorously to ensure unpredictability in Bell tests. Here we report a set of local-realism tests using human choices, which avoids assumptions about predictability in physics. We recruited about 100,000 human participants to play an online video game that incentivizes fast, sustained input of unpredictable selections and illustrates Bell-test methodology. The participants generated 97,347,490 binary choices, which were directed via a scalable web platform to 12 laboratories on five continents, where 13 experiments tested local realism using photons, single atoms, atomic ensembles and superconducting devices. Over a 12-hour period on 30 November 2016, participants worldwide provided a sustained data flow of over 1,000 bits per second to the experiments, which used different human-generated data to choose each measurement setting. The observed correlations strongly contradict local realism and other realistic positions in bi-partite and tri-partite scenarios.
Project outcomes include closing the 'freedom-of-choice loophole' (the possibility that the setting choices are influenced by 'hidden variables' to correlate with the particle properties), the use of video-game methods for rapid collection of human-generated randomness, and the use of networking techniques for global participation in experimental science.
Information-theoretically secure (ITS) authentication is needed in Quantum Key Distribution (QKD). In this paper, we study the security of an ITS authentication scheme proposed by Wegman & Carter in the case of a partially known authentication key. This scheme uses a new authentication key in each authentication attempt to select a hash function from an Almost Strongly Universal2 hash function family. The attacker's partial knowledge is measured as the trace distance between the authentication key distribution and the uniform distribution; this is the usual measure in QKD. We provide direct proofs of security of the scheme when using a partially known key, first in the information-theoretic setting and then in terms of distinguishability as used in the Universal Composability (UC) framework. We find that if the authentication procedure has a failure probability ε and the authentication key has trace distance ε′ to the uniform distribution, then under ITS, the adversary's success probability conditioned on an authentic message-tag pair is only bounded by ε + |T|ε′, where |T| is the size of the set of tags. Furthermore, the trace distance between the authentication key distribution and the uniform distribution increases to |T|ε′ after an authentic message-tag pair has been seen. Despite this, we are able to prove directly that the authenticated channel is indistinguishable from an (ideal) authentic channel (the desired functionality), except with probability less than ε + ε′. This proves that the scheme is (ε + ε′)-UC-secure, without using the composability theorem.
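The two bounds above, the conditional ITS bound ε + |T|ε′ and the UC distinguishing bound ε + ε′, can be compared numerically. A minimal sketch follows; all parameter values are purely illustrative assumptions, not taken from the paper.

```python
# Toy comparison of the two security bounds discussed above.
# All numeric parameter values are illustrative assumptions.

def its_bound(eps, eps_prime, tag_bits):
    """Adversary success probability conditioned on a seen authentic
    message-tag pair: eps + |T| * eps', with |T| = 2**tag_bits."""
    return eps + (2 ** tag_bits) * eps_prime

def uc_bound(eps, eps_prime):
    """Distinguishing advantage from the ideal authentic channel: eps + eps'."""
    return eps + eps_prime

eps = 2.0 ** -20        # failure probability of the authentication (assumed)
eps_prime = 2.0 ** -40  # trace distance of the key to uniform (assumed)
tag_bits = 20           # tag length in bits, so |T| = 2**20 (assumed)

# The conditional bound loses a factor |T| on eps', while the direct UC
# proof keeps the sum essentially at eps.
assert its_bound(eps, eps_prime, tag_bits) == 2.0 ** -19
assert uc_bound(eps, eps_prime) < its_bound(eps, eps_prime, tag_bits)
```

With these assumed parameters the |T| factor exactly cancels the smallness of ε′, which is why the direct (ε + ε′)-UC result is the much tighter statement.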
Quantum cryptography is an unconditionally secure key growing technique provided that an unconditionally secure authentication protocol is combined with it. This paper studies the lifetime of a message authentication scheme in which a message to be authenticated is first hashed by a secret (but fixed) Strongly Universal hash function, and the output is then encrypted with a one-time-pad key to generate a tag for the message. If the one-time pad is completely secret, then the lifetime is exponential in the tag length. If, however, the one-time-pad key is partially known in each authentication round, as is the case in practical quantum key distribution protocols, then the picture is different, because the adversary's partial knowledge of the one-time-pad key in each round contributes to his or her ability to identify the secret hash function. We estimate the lifetime of this type of authentication. Here the parameters are the length of the key identifying the secret hash function and the amount of knowledge that Eve has of the one-time pad. A theoretical estimate is presented, along with experimental results that support it.
Universal hash functions are important building blocks for unconditionally secure message authentication codes. In this paper, we present a new construction of a class of Almost Strongly Universal hash functions with much smaller description (or key) length than the Wegman-Carter construction. Unlike some other constructions, our new construction has a very short key length and a security parameter that is independent of the message length, which makes it suitable for authentication in practical applications such as Quantum Cryptography.
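As a point of reference for the defining property involved, here is a brute-force check of the textbook strongly universal2 family h_{a,b}(m) = (a·m + b) mod p over a prime field. This is not the paper's new construction, only an illustration of the property that key-length-reducing constructions must (almost) preserve.

```python
# Brute-force verification of strong universality for the classic affine
# family h_{a,b}(m) = (a*m + b) mod p over Z_p (textbook example, not the
# paper's construction). Strong universality_2 means: for any distinct
# messages m1 != m2 and any tag pair (t1, t2), exactly one key (a, b) out
# of p**2 maps (m1, m2) to (t1, t2), so tag pairs are uniform over the key.

def is_strongly_universal(p):
    for m1 in range(p):
        for m2 in range(p):
            if m1 == m2:
                continue
            counts = {}
            for a in range(p):
                for b in range(p):
                    pair = ((a * m1 + b) % p, (a * m2 + b) % p)
                    counts[pair] = counts.get(pair, 0) + 1
            # every (t1, t2) pair must be hit exactly once
            if sorted(counts.values()) != [1] * (p * p):
                return False
    return True

assert is_strongly_universal(5)
assert is_strongly_universal(7)
```

Note that the key here is a full pair (a, b) of field elements; the point of constructions such as the one in the paper is to get (almost) the same guarantee from a much shorter key.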
Secure message authentication is an important part of Quantum Key Distribution. In this paper we analyze special properties of a Strongly Universal2 hash function family, an understanding of which is important in the security analysis of the authentication used in Quantum Cryptography. We answer the following question: how much of Alice's message does Eve need to influence so that the message, along with its tag, will give her enough information to create the correct tag for her own message?
In this paper, we review and comment on "A novel protocol-authentication algorithm ruling out a man-in-the-middle attack in quantum cryptography" [M. Peev et al., Int. J. Quant. Inf. 3 (2005) 225]. In particular, we point out that the proposed primitive is not secure when used in a generic protocol, and needs additional authenticating properties of the surrounding quantum-cryptographic protocol.
Quantum Key Distribution (QKD, also referred to as Quantum Cryptography) is a technique for secret key agreement. It has been shown that QKD combined with Information-Theoretically Secure (ITS) authentication (using secret key) of the classical messages transmitted during the key distribution protocol is itself ITS. Note that QKD without any authentication can trivially be broken by man-in-the-middle attacks. Here, we study an authentication method that was originally proposed because of its low key consumption: a two-step authentication that uses a publicly known hash function followed by a secret strongly universal2 hash function, which is exchanged each round. This two-step authentication is not information-theoretically secure, but it was argued that it nevertheless does not compromise the security of QKD. In the current contribution we study intrinsic weaknesses of this approach under the common assumption that the QKD adversary has access to unlimited resources, including quantum memories. We consider one implementation of quantum-cryptographic protocols that use such authentication and demonstrate an attack that fully extracts the secret key. Even including the final key from the protocol in the authentication does not rule out the possibility of these attacks. To rectify the situation, we propose a countermeasure that, while not information-theoretically secure, restores the need for very large computing power for the attack to work. Finally, we specify conditions that must be satisfied by the two-step authentication in order to restore information-theoretic security.
The two-photon interferometric experiment proposed by J. D. Franson [Phys. Rev. Lett. 62, 2205 (1989)] is often treated as a “Bell test of local realism.” However, it has been suggested that this is incorrect due to the 50% postselection performed even in the ideal gedanken version of the experiment. Here we present a simple local hidden variable model of the experiment that successfully explains the results obtained in usual realizations of the experiment, even with perfect detectors. Furthermore, we also show that there is no such model if the switching of the local phase settings is done at a rate determined by the internal geometry of the interferometers.
We show that the Clifford group-the normaliser of the Weyl-Heisenberg group-can be represented by monomial phase-permutation matrices if and only if the dimension is a square number. This simplifies expressions for SIC vectors, and has other applications to SICs and to Mutually Unbiased Bases. Exact solutions for SICs in dimension 16 are presented for the first time.
It is known that if the dimension is a perfect square the Clifford group can be represented by monomial matrices. Another way of expressing this result is to say that when the dimension is a perfect square the standard representation of the Clifford group has a system of imprimitivity consisting of one dimensional subspaces. We generalize this result to the case of an arbitrary dimension. Let k be the square-free part of the dimension. Then we show that the standard representation of the Clifford group has a system of imprimitivity consisting of k-dimensional subspaces. To illustrate the use of this result we apply it to the calculation of SIC-POVMs (symmetric informationally complete positive operator valued measures), constructing exact solutions in dimensions 8 (hand-calculation) as well as 12 and 28 (machine-calculation).
This paper considers approximations of marginalization sums that arise in Bayesian inference problems. Optimal approximations of such marginalization sums, using a fixed number of terms, are analyzed for a simple model. The model under study is motivated by recent studies of linear regression problems with sparse parameter vectors, and of the problem of discriminating signal-plus-noise samples from noise-only samples. It is shown that for the model under study, if only one term is retained in the marginalization sum, then this term should be the one with the largest a posteriori probability. By contrast, if more than one (but not all) terms are to be retained, then these should generally not be the ones corresponding to the components with the largest a posteriori probabilities.
Klyachko and coworkers consider an orthogonality graph in the form of a pentagram, and in this way derive a Kochen-Specker inequality for spin 1 systems. In some low-dimensional situations Hilbert spaces are naturally organised, by a magical choice of basis, into SO(N) orbits. Combining these ideas some very elegant results emerge. We give a careful discussion of the pentagram operator, and then show how the pentagram underlies a number of other quantum "paradoxes", such as that of Hardy.
We report on a search for mutually unbiased bases (MUBs) in six dimensions. We find only triplets of MUBs, and thus do not come close to the theoretical upper bound 7. However, we point out that the natural habitat for sets of MUBs is the set of all complex Hadamard matrices of the given order, and we introduce a natural notion of distance between bases in Hilbert space. This allows us to draw a detailed map of where in the landscape the MUB triplets are situated. We use available tools, such as the theory of the discrete Fourier transform, to organize our results. Finally, we present some evidence for the conjecture that there exists a four dimensional family of complex Hadamard matrices of order 6. If this conjecture is true the landscape in which one may search for MUBs is much larger than previously thought.
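The triplet containing the computational and Fourier bases can be checked directly. Here is a minimal sketch of the unbiasedness condition for the order-6 discrete Fourier transform; this is standard material, not the paper's search procedure.

```python
import cmath
import math

d = 6
# Columns of the unitary DFT matrix F form a basis; it is mutually unbiased
# with the computational basis iff every overlap has modulus 1/sqrt(d).
F = [[cmath.exp(2j * cmath.pi * j * k / d) / math.sqrt(d) for k in range(d)]
     for j in range(d)]

# Unbiasedness: |<e_j|f_k>| = |F[j][k]| = 1/sqrt(d) for all j, k.
for j in range(d):
    for k in range(d):
        assert abs(abs(F[j][k]) - 1 / math.sqrt(d)) < 1e-12

# And F's columns are orthonormal, i.e. they really form a basis.
for k1 in range(d):
    for k2 in range(d):
        ip = sum(F[j][k1].conjugate() * F[j][k2] for j in range(d))
        assert abs(ip - (1 if k1 == k2 else 0)) < 1e-12
```

Up to normalization, F is a complex Hadamard matrix of order 6, which is why the set of all such matrices is the natural landscape for the MUB search described above.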
The interpretation of quantum theory is one of the longest-standing debates in physics. Type I interpretations see quantum probabilities as determined by intrinsic properties of the observed system. Type II interpretations see them as relational experiences between an observer and the system. It is usually believed that a decision between these two options cannot be made on purely physical grounds but requires an act of metaphysical judgment. Here we show that, under some assumptions, the problem is decidable using thermodynamics. We prove that type I interpretations are incompatible with the following assumptions: (i) the choice of which measurement is performed can be made randomly and independently of the system under observation, (ii) the system has limited memory, and (iii) Landauer's erasure principle holds.
In Bell experiments, one problem is to achieve high enough photodetection efficiency to ensure that there is no possibility of describing the results via a local hidden-variable model. Using the Clauser-Horne inequality and a two-photon nonmaximally entangled state, a photodetection efficiency higher than 0.67 is necessary. Here we discuss atom-photon Bell experiments. We show that, assuming perfect detection efficiency of the atom, it is possible to perform a loophole-free atom-photon Bell experiment whenever the photodetection efficiency exceeds 0.50.
The Kochen-Specker theorem states that noncontextual hidden variable models are inconsistent with the quantum predictions for every yes-no question on a qutrit, corresponding to every projector in three dimensions. It has been suggested [D.A. Meyer, Phys. Rev. Lett. 83 (1999) 3751] that the inconsistency would disappear when restricting to projectors on unit vectors with rational components; that noncontextual hidden variables could reproduce the quantum predictions for rational vectors. Here we show that a qutrit state with rational components violates an inequality valid for noncontextual hidden-variable models [A.A. Klyachko et al., Phys. Rev. Lett. 101 (2008) 020403] using rational projectors. This shows that the inconsistency remains even when using only rational vectors.
The chained Bell inequalities of Braunstein and Caves involving N settings per observer have some interesting applications. Here we obtain the minimum detection efficiency required for a loophole-free violation of the Braunstein-Caves inequalities for any N ≥ 2. We discuss both the case in which the two particles are detected with the same efficiency and the case in which they are detected with different efficiencies.
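For context, the local bound 2N − 2 and the quantum value 2N cos(π/2N) of the chained inequality can be checked by brute force and by direct evaluation with evenly spaced settings. These are standard facts; the detection-efficiency thresholds themselves are the paper's contribution and are not reproduced here.

```python
import math
from itertools import product

def classical_max(N):
    """Local-realist bound of the N-setting chained inequality, by brute
    force: max over deterministic +-1 assignments of
    sum_i (a_i + a_next) * b_i, closing the chain with a_{N+1} = -a_1."""
    best = float("-inf")
    for a in product((-1, 1), repeat=N):
        for b in product((-1, 1), repeat=N):
            s = 0
            for i in range(N):
                nxt = a[i + 1] if i < N - 1 else -a[0]
                s += (a[i] + nxt) * b[i]
            best = max(best, s)
    return best

def quantum_value(N):
    """Chained-inequality value for singlet-type correlations E = cos(delta),
    with the 2N measurement directions spaced pi/(2N) apart."""
    alpha = [i * math.pi / N for i in range(N)]
    beta = [i * math.pi / N + math.pi / (2 * N) for i in range(N)]
    s = 0.0
    for i in range(N):
        nxt = alpha[i + 1] if i < N - 1 else alpha[0] + math.pi
        s += math.cos(alpha[i] - beta[i]) + math.cos(nxt - beta[i])
    return s

for N in (2, 3, 4, 5):
    assert classical_max(N) == 2 * N - 2
    assert abs(quantum_value(N) - 2 * N * math.cos(math.pi / (2 * N))) < 1e-9
    assert quantum_value(N) > classical_max(N)  # violation for every N
```

For N = 2 this recovers the CHSH case: local bound 2 and quantum value 2√2.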
Device-independent quantum communication will require a loophole-free violation of Bell inequalities. In typical scenarios where line of sight between the communicating parties is not available, it is convenient to use energy-time entangled photons due to their intrinsic robustness while propagating over optical fibers. Here we show an energy-time Clauser-Horne-Shimony-Holt Bell inequality violation with two parties separated by 3.7 km over the deployed optical fiber network belonging to the University of Concepcion in Chile. Remarkably, this is the first Bell violation with spatially separated parties that is free of the postselection loophole, which affected all previous in-field long-distance energy-time experiments. Our work takes a further step towards a fiber-based loophole-free Bell test, which is highly desirable for secure quantum communication given the widespread existing telecommunication infrastructure.
Unconditionally secure message authentication is an important part of Quantum Cryptography (QC). We analyze the security effects of using a key obtained from QC for authentication purposes in later rounds of QC. In particular, the eavesdropper gains partial knowledge of the key in QC that may affect the security of the authentication in the later round. Our initial analysis indicates that this partial knowledge has little effect on the authentication part of the system, in agreement with previous results on the issue. However, when taking the full QC protocol into account, the picture is different. By accessing the quantum channel used in QC, the attacker can change the message to be authenticated. This, together with partial knowledge of the key, does incur a security weakness in the authentication. The underlying reason is that the authentication used, which is insensitive to such message changes when the key is unknown, becomes sensitive when used with a partially known key. We suggest a simple solution to this problem, and stress the usage of this or an equivalent extra security measure in QC.
We present approximations of the LLR distribution for a class of fixed-complexity soft-output MIMO detectors, such as the optimal soft detector and the soft-output via partial marginalization detector. More specifically, in a MIMO AWGN setting, we approximate the LLR distribution conditioned on the transmitted signal and the channel matrix with a Gaussian mixture model (GMM). Our main results consist of an analytical expression of the GMM model (including the number of modes and their corresponding parameters) and a proof that, in the limit of high SNR, this LLR distribution converges in probability towards a unique Gaussian distribution.
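As a minimal sanity check of conditional LLR modelling, the scalar BPSK-over-AWGN case (a drastic simplification, not the paper's MIMO detector) already shows that the LLR conditioned on the transmitted symbol is exactly Gaussian.

```python
import math
import random
import statistics

# Scalar BPSK over AWGN: y = x + n with x in {-1, +1} and n ~ N(0, sigma2).
# The exact LLR is 2*y/sigma2; conditioned on x = +1 it is Gaussian with
# mean 2/sigma2 and variance 4/sigma2 (so variance = 2 * mean, the usual
# "consistency" condition). This is the simplest instance of the
# conditional-LLR modelling discussed above, not the paper's detector.

random.seed(1)
sigma2 = 0.5   # noise variance (assumed, for illustration)
x = +1         # transmitted BPSK symbol

llrs = []
for _ in range(200_000):
    y = x + random.gauss(0.0, math.sqrt(sigma2))
    llrs.append(2.0 * y / sigma2)

mean = statistics.fmean(llrs)
var = statistics.pvariance(llrs)
assert abs(mean - 2.0 / sigma2) < 0.05   # close to 4
assert abs(var - 4.0 / sigma2) < 0.3     # close to 8
```

In the MIMO setting of the paper the single Gaussian is replaced by a mixture, one mode per relevant hypothesis of the detector, which is what the GMM approximation captures.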
We present a formal theory of contextuality for a set of random variables grouped into different subsets (contexts) corresponding to different, mutually incompatible conditions. Within each context the random variables are jointly distributed, but across different contexts they are stochastically unrelated. The theory of contextuality is based on the analysis of the extent to which some of these random variables can be viewed as preserving their identity across different contexts when one considers all possible joint distributions imposed on the entire set of the random variables. We illustrate the theory on three systems of traditional interest in quantum physics (and also in non-physical, e.g., behavioral studies). These are systems of the Klyachko-Can-Binicioglu-Shumovsky-type, Einstein-Podolsky-Rosen-Bell-type, and Suppes-Zanotti-Leggett-Garg-type. Listed in this order, each of them is formally a special case of the previous one. For each of them we derive necessary and sufficient conditions for contextuality while allowing for experimental errors and contextual biases or signaling. Based on the same principles that underlie these derivations we also propose a measure for the degree of contextuality and compute it for the three systems in question.
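The noncontextual bound for the KCBS-type (cyclic, five-measurement) system can be verified by exhausting deterministic assignments. A minimal sketch in correlator form follows, ignoring the error and signaling refinements that are the paper's actual subject.

```python
from itertools import product

# Brute-force check of the noncontextual bound for the KCBS (pentagram)
# system in correlator form: for +-1-valued A_1..A_5 measured pairwise
# around a 5-cycle, every deterministic assignment obeys
#   sum_i A_i A_{i+1} >= -3   (indices mod 5),
# because on an odd cycle at most four of the five products can equal -1.
# Quantum mechanics can go below this bound.

values = [sum(a[i] * a[(i + 1) % 5] for i in range(5))
          for a in product((-1, 1), repeat=5)]

assert min(values) == -3   # the noncontextual bound
assert max(values) == 5    # attained by constant assignments
```

Enumerating deterministic assignments like this is exactly what it means to scan the vertices of the noncontextual polytope for this cyclic system.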
This year in Växjö we thought we would try an experiment; it felt high time for a new result. Much of the foundations discussion of previous years has focussed on EPR-style arguments and the meaning and experimental validity of various Bell inequality violations. Yet, there is another pillar of the quantum foundations puzzle that has hardly received any attention in our great series of meetings: it is the phenomenon first demonstrated by Kochen and Specker, quantum contextuality. Recently there has been a rapid growth of activity aimed toward better understanding this aspect of quantum mechanics, which Asher Peres sloganized by the phrase, "unperformed experiments have no results." Below is a sampling of some important papers on the topic for the reader not yet familiar with the subject. What is the source of this phenomenon? Does it depend only on high-level features of quantum mechanics, or is it deep in the conceptual framework on which the theory rests? Might it, for instance, arise from the way quantum mechanics amends the classic laws of probability? What are the mathematically simplest ways contextuality can be demonstrated? How might the known results be made amenable to experimental tests? These were the sorts of discussions we hoped the session would foster.
We report an experimental violation of a Bell inequality with strong statistical significance. Our experiment employs polarization measurements on entangled single photons and closes the locality, freedom-of-choice, fair-sampling, coincidence-time, and memory loopholes simultaneously.
John Bell's theorem of 1964 states that local elements of physical reality, existing independent of measurement, are inconsistent with the predictions of quantum mechanics (Bell, J. S. (1964), Physics (College Park, Md.) 1(3), 195). Specifically, correlations between measurement results from distant entangled systems would be smaller than predicted by quantum physics. This is expressed in Bell's inequalities. Employing modifications of Bell's inequalities, many experiments have been performed that convincingly support the quantum predictions. Yet, all experiments rely on assumptions, which provide loopholes for a local realist explanation of the measurements. Here we report an experiment with polarization-entangled photons that simultaneously closes the most significant of these loopholes. We used a highly efficient source of entangled photons, distributed them over a distance of 58.5 meters, and implemented rapid random setting generation and high-efficiency detection to observe a violation of a Bell inequality with high statistical significance. The purely statistical probability of our results occurring under local realism is less than 3.74 × 10^-31, corresponding to an 11.5 standard deviation effect.
Local realism is the worldview in which physical properties of objects exist independently of measurement and where physical influences cannot travel faster than the speed of light. Bell's theorem states that this worldview is incompatible with the predictions of quantum mechanics, as is expressed in Bell's inequalities. Previous experiments convincingly supported the quantum predictions. Yet, every experiment requires assumptions that provide loopholes for a local realist explanation. Here, we report a Bell test that closes the most significant of these loopholes simultaneously. Using a well-optimized source of entangled photons, rapid setting generation, and highly efficient superconducting detectors, we observe a violation of a Bell inequality with high statistical significance. The purely statistical probability of our results occurring under local realism does not exceed 3.74 × 10^-31, corresponding to an 11.5 standard deviation effect.
We show that the phenomenon of quantum contextuality can be used to certify lower bounds on the dimension accessed by the measurement devices. To prove this, we derive bounds for different dimensions and scenarios of the simplest noncontextuality inequalities. Some of the resulting dimension witnesses work independently of the prepared quantum state. Our constructions are robust against noise and imperfections, and we show that a recent experiment can be viewed as an implementation of a state-independent quantum dimension witness.
A basic assumption behind the inequalities used for testing noncontextual hidden variable models is that the observables measured on the same individual system are perfectly compatible. However, compatibility is not perfect in actual experiments using sequential measurements. We discuss the resulting "compatibility loophole" and present several methods to rule out certain hidden variable models that obey a kind of extended noncontextuality. Finally, we present a detailed analysis of experimental imperfections in a recent trapped-ion experiment and apply our analysis to that case.
Entanglement and its consequences, in particular the violation of Bell inequalities, which defies our concepts of realism and locality, have been proven to play key roles in Nature by many experiments for various quantum systems. Entanglement can also be found in systems not consisting of ordinary matter and light, i.e. in massive meson–antimeson systems. Bell inequalities have been discussed for these systems, but to date no direct experimental test to conclusively exclude local realism has been found. This mainly stems from the fact that one only has access to a restricted class of observables and that these systems are also decaying. In this Letter we put forward a Bell inequality for unstable systems which can be tested at accelerator facilities with current technology. With this, the long-awaited proof that such systems at different energy scales can reveal the sophisticated "dynamical" nonlocal feature of Nature in a direct experiment becomes feasible. Moreover, the interplay between entanglement and CP violation, an asymmetry between matter and antimatter, is explored, a special feature offered only by these meson–antimeson systems.
Everyday experience supports the existence of physical properties independent of observation, in strong contrast to the predictions of quantum theory. In particular, the existence of physical properties that are independent of the measurement context is prohibited for certain quantum systems. This property is known as contextuality. This Rapid Communication studies whether the process of decay in space-time generally destroys the ability to reveal contextuality. We find that in the most general situation the decay property does not diminish this ability. However, applying certain constraints due to the space-time structure, either on the time evolution of the decaying system or on the measurement procedure, the criteria revealing contextuality become inherently dependent on the decay property, or even impossible to satisfy. In particular, we derive how the context-revealing setup known as Bell's nonlocality test changes for decaying quantum systems. Our findings illustrate the interdependence between hidden and local hidden parameter theories and the role of time.
In a recent Letter [Phys. Rev. Lett. 118, 030501 (2017)], Peiris, Konthasinghe, and Muller report a Franson interferometry experiment using pairs of photons generated from a two-level semiconductor quantum dot. The authors report a visibility of 66% and claim that this visibility “goes beyond the classical limit of 50% and approaches the limit of violation of Bell’s inequalities (70.7%).” We explain why we do not agree with this last statement and how to fix the problem.
Photonic systems based on energy-time entanglement have been proposed to test local realism using the Bell inequality. A violation of this inequality normally also certifies security of device-independent quantum key distribution (QKD) so that an attacker cannot eavesdrop or control the system. We show how this security test can be circumvented in energy-time entangled systems when using standard avalanche photodetectors, allowing an attacker to compromise the system without leaving a trace. We reach Bell values up to 3.63 at 97.6% faked detector efficiency using tailored pulses of classical light, which exceeds even the quantum prediction. This is the first demonstration of a violation-faking source that gives both tunable violation and high faked detector efficiency. The implications are severe: the standard Clauser-Horne-Shimony-Holt inequality cannot be used to show device-independent security for energy-time entanglement setups based on Franson’s configuration. However, device-independent security can be reestablished, and we conclude by listing a number of improved tests and experimental setups that would protect against all current and future attacks of this type.
The Franson interferometer, proposed in 1989 (Franson 1989 Phys. Rev. Lett. 62 2205-08), beautifully shows the counter-intuitive nature of light. The quantum description predicts sinusoidal interference for specific outcomes of the experiment, and these predictions can be verified in experiment. In the spirit of Einstein, Podolsky, and Rosen it is possible to ask if the quantum-mechanical description (of this setup) can be considered complete. This question will be answered in detail in this paper, by delineating the quite complicated relation between energy-time entanglement experiments and Einstein-Podolsky-Rosen (EPR) elements of reality. The mentioned sinusoidal interference pattern is the same as that giving a violation in the usual Bell experiment. Even so, depending on the precise requirements made on the local realist model, this can imply (a) no violation, (b) smaller violation than usual, or (c) full violation of the appropriate statistical bound. Alternatives include (a) using only the measurement outcomes as EPR elements of reality, (b) using the emission time as EPR element of reality, (c) using path realism, or (d) using a modified setup. This paper discusses the nature of these alternatives and how to choose between them. The subtleties of this discussion need to be taken into account when designing and setting up experiments intended to test local realism. Furthermore, these considerations are also important for quantum communication, for example in Bell-inequality-based quantum cryptography, especially when aiming for device independence.
In any Bell test, loopholes can cause issues in the interpretation of the results, since an apparent violation of the inequality may not correspond to a violation of local realism. An important example is the coincidence-time loophole that arises when detector settings might influence the time when detection occurs. This effect can be observed in many experiments where measurement outcomes are to be compared between remote stations, because the interpretation of an ostensible Bell violation strongly depends on the method used to decide coincidence. The coincidence-time loophole has previously been studied for the Clauser-Horne-Shimony-Holt and Clauser-Horne inequalities, but recent experiments have shown the need for a generalization. Here, we study the generalized chained inequality by Pearle, Braunstein, and Caves (PBC) with N ≥ 2 settings per observer. This inequality has applications in, for instance, quantum key distribution, where it has been used to reestablish security. In this paper we give the minimum coincidence probability for the PBC inequality for all N ≥ 2 and show that this bound is tight for a violation free of the fair-coincidence assumption. Thus, if an experiment has a coincidence probability exceeding the critical value derived here, the coincidence-time loophole is eliminated.
A long-standing aim of quantum information research is to understand what gives quantum computers their advantage. This requires separating problems that need genuinely quantum resources from those for which classical resources are enough. Two examples of quantum speed-up are the Deutsch-Jozsa and Simon's problems, both efficiently solvable on a quantum Turing machine, and both believed to lack efficient classical solutions. Here we present a framework that can simulate both quantum algorithms efficiently, solving the Deutsch-Jozsa problem with probability 1 using only one oracle query, and Simon's problem using linearly many oracle queries, just as expected of an ideal quantum computer. The presented simulation framework is in turn efficiently simulatable on a classical probabilistic Turing machine. This shows that the Deutsch-Jozsa and Simon's problems do not require any genuinely quantum resources, and that the quantum algorithms show no speed-up when compared with their corresponding classical simulation. Finally, this gives insight into what properties are needed in the two algorithms and calls for further study of oracle separation between quantum and classical computation.
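For reference, the textbook Deutsch-Jozsa algorithm itself (not the paper's classical simulation framework) is easy to simulate with a plain state vector, confirming the one-query behaviour discussed above.

```python
import math

# Plain state-vector simulation of the standard Deutsch-Jozsa algorithm with
# a phase oracle. After H^n on |0...0>, one phase query (-1)^f(x), and H^n
# again, the probability of measuring the all-zeros string is 1 for a
# constant f and 0 for a balanced f.

def deutsch_jozsa_p_zero(f, n):
    dim = 2 ** n
    # state after H^n on |0...0>: the uniform superposition
    amp = [1.0 / math.sqrt(dim)] * dim
    # one oracle query: phase flip on every x with f(x) = 1
    amp = [a * (-1) ** f(x) for x, a in enumerate(amp)]
    # final H^n: the amplitude of |0...0> is the sum of all amplitudes
    # rescaled by 1/sqrt(dim), i.e. the mean of the oracle phases
    a0 = sum(amp) / math.sqrt(dim)
    return a0 * a0

n = 3
constant = lambda x: 0
balanced = lambda x: x & 1          # lowest bit: four zeros, four ones
assert abs(deutsch_jozsa_p_zero(constant, n) - 1.0) < 1e-12
assert abs(deutsch_jozsa_p_zero(balanced, n) - 0.0) < 1e-12
```

This brute-force simulation costs 2^n operations; the point of the paper is that a cleverer classical representation reproduces the same one-query statistics efficiently.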
Contextuality is a natural generalization of nonlocality which does not need composite systems or spacelike separation and offers a wider spectrum of interesting phenomena. Most notably, in quantum mechanics there exist scenarios where the contextual behavior is independent of the quantum state. We show that the quest for an optimal inequality separating quantum from classical noncontextual correlations in a state-independent manner admits an exact solution, as it can be formulated as a linear program. We introduce the noncontextuality polytope as a generalization of the locality polytope and apply our method to identify two different tight optimal inequalities for the most fundamental quantum scenario with state-independent contextuality.
The simulation of quantum effects requires certain classical resources, and quantifying them is an important step to characterize the difference between quantum and classical physics. For a simulation of the phenomenon of state-independent quantum contextuality, we show that the minimum amount of memory used by the simulation is the critical resource. We derive optimal simulation strategies for important cases and prove that reproducing the results of sequential measurements on a two-qubit system requires more memory than the information-carrying capacity of the system.
Experimental violations of Bell inequalities are in general vulnerable to so-called loopholes. In this work, we analyze the characteristics of a loophole-free Bell test with photons, closing simultaneously the locality, freedom-of-choice, fair-sampling (i.e., detection), coincidence-time, and memory loopholes. We pay special attention to the effect of excess predictability in the setting choices due to nonideal random-number generators. We discuss the necessary adaptations of the Clauser-Horne and Eberhard inequalities when using such imperfect devices and, using Hoeffding's inequality and Doob's optional stopping theorem, the statistical analysis in such Bell tests.
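The role of Hoeffding's inequality in such an analysis can be sketched in its simplest form, for i.i.d. rounds of the CHSH game (the paper's optional-stopping machinery, which handles the memory loophole, is not reproduced here; the function name is hypothetical). Any local-realist strategy wins each round with probability at most 3/4, so Hoeffding bounds the chance of a large observed win rate.

```python
import math

def chsh_pvalue_bound(wins, rounds):
    """Hoeffding upper bound on the probability that a local-realist
    strategy (per-round win probability <= 3/4 in the CHSH game)
    achieves at least `wins` wins in `rounds` independent rounds."""
    c = wins / rounds
    if c <= 0.75:
        return 1.0  # no violation: the bound is vacuous
    # Hoeffding: P(mean >= c) <= exp(-2 * rounds * (c - 3/4)^2)
    return math.exp(-2 * rounds * (c - 0.75) ** 2)

# An ideal quantum strategy wins about 85.4% of rounds (Tsirelson bound),
# so even modest experiments yield astronomically small p-value bounds.
print(chsh_pvalue_bound(8500, 10000))
```

Excess predictability in the settings effectively raises the local-realist win bound above 3/4, which is why the inequalities themselves must be adapted before this statistical step.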
The notion of (non)contextuality pertains to sets of properties measured one subset (context) at a time. We extend this notion to include so-called inconsistently connected systems, in which the measurements of a given property in different contexts may have different distributions, due to contextual biases in experimental design or physical interactions (signaling): a system of measurements has a maximally noncontextual description if a joint distribution can be imposed on them in which the measurements of any one property in different contexts are equal to each other with the maximal probability allowed by their different distributions. We derive necessary and sufficient conditions for the existence of such a description in a broad class of systems including Klyachko-Can-Binicioglu-Shumovsky-type (KCBS), EPR-Bell-type, and Leggett-Garg-type systems. Because these conditions allow for inconsistent connectedness, they are applicable to real experiments. We illustrate this by analyzing an experiment by Lapkiewicz and colleagues aimed at testing contextuality in a KCBS-type system.
Quantum systems show contextuality. More precisely, it is impossible to reproduce the quantum-mechanical predictions using a non-contextual realist model, i.e., a model where the outcome of one measurement is independent of the choice of compatible measurements performed in the measurement context. There have been several attempts to quantify the amount of contextuality of specific quantum systems, for example, by the number of rays needed in a KS proof, the number of terms in certain inequalities, or the violation, noise sensitivity, and other measures. This paper takes another approach: to use a simple contextual model that reproduces the quantum-mechanical contextual behaviour, but not necessarily all quantum predictions. The amount of contextuality can then be quantified in terms of the additional resources needed as compared with a similar model without contextuality. In this case the contextual model needs to keep track of the context used, so the appropriate measure is memory. Another way to view this is as a memory requirement for reproducing quantum contextuality in a realist model. The model we use can be viewed as an extension of Spekkens' toy model [Phys. Rev. A 75, 032110 (2007)], and the relation is studied in some detail. To reproduce the quantum predictions for the Peres-Mermin square, the memory requirement is more than one bit in addition to the memory used for the individual outcomes in the corresponding noncontextual model.
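The algebraic structure behind the Peres-Mermin square can be checked directly. The sketch below (illustrative only) builds the nine two-qubit observables and verifies that every row multiplies to +I while the columns multiply to +I, +I, -I; since the six sign constraints multiply to -1, no non-contextual assignment of fixed +/-1 values can satisfy them all, which is what forces a contextual model to spend memory on the context.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Peres-Mermin square: nine mutually compatible-by-row-and-column
# two-qubit observables.
square = [
    [np.kron(X, I), np.kron(I, X), np.kron(X, X)],
    [np.kron(I, Y), np.kron(Y, I), np.kron(Y, Y)],
    [np.kron(X, Y), np.kron(Y, X), np.kron(Z, Z)],
]

I4 = np.eye(4)
for row in square:
    assert np.allclose(row[0] @ row[1] @ row[2], I4)      # rows: +I
for j in range(3):
    col = square[0][j] @ square[1][j] @ square[2][j]
    assert np.allclose(col, (I4 if j < 2 else -I4))       # columns: +I, +I, -I
print("row products +I; column products +I, +I, -I")
```

Multiplying all six constraints gives the product of the nine squared observables on one side and -1 on the other, which is the Kochen-Specker contradiction in its simplest state-independent form.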
By probabilistic means, the concept of contextuality is extended so that it can be used in non-ideal situations. An inequality is presented, which at least in principle enables a test to discard non-contextual hidden-variable models at low error rates, in the spirit of the Kochen-Specker theorem. Assuming that the errors are independent, an explicit error bound of 1.42% is derived, below which a Kochen-Specker contradiction occurs.
It is well known in the physics community that the Copenhagen interpretation of quantum mechanics is very different from the Bohm interpretation. Usually, a local realistic model is thought to be even further from these two, as in its purest form it cannot even reproduce the probabilities of quantum mechanics, by Bell's theorem. Nevertheless, by utilizing the "efficiency loophole" such a model can mimic the quantum probabilities, and more importantly, this paper shows that it is possible to interpret this latter kind of local realistic model so that it contains elements of reality as found in the Bohm interpretation, while retaining the complementarity present in the Copenhagen interpretation.
Quantum cryptography, or more accurately quantum key distribution (QKD), is based on using an unconditionally secure "quantum channel" to share a secret key between two users. A manufacturer of QKD devices could, intentionally or not, use a (semi-)classical channel instead of the quantum channel, which would remove the supposedly unconditional security. One example is the BB84 protocol, where the quantum channel can be implemented in the polarization of single photons. Here, using several photons instead of one to encode each bit of the key yields a similar but insecure system. For protocols based on violation of a Bell inequality (e.g., the Ekert protocol) the situation is somewhat different. While the possibility is mentioned by some authors, it is generally thought that an implementation of a (semi-)classical channel would differ significantly from that of a quantum channel. Here, a counterexample is given using a physical setup identical to that used in photon-polarization Ekert QKD. Since the physical implementation is identical, a manufacturer may include this modification as a Trojan horse in manufactured systems, to be activated at will by an eavesdropper. Thus, the old truth of cryptography still holds: you have to trust the manufacturer of your cryptographic device. Even when you do violate the Bell inequality.
A probabilistic version of the Kochen-Specker paradox is presented. The paradox is restated in the form of an inequality relating probabilities from a non-contextual hidden-variable model, by formulating the concept of "probabilistic contextuality." This enables an experimental test for contextuality at low experimental error rates. Using the assumption of independent errors, an explicit error bound of 0.71% is derived, below which a Kochen-Specker contradiction occurs.