Measurement is not simply the same thing as entanglement. If you identify the two, you get paradoxes and you don't match actual observations.
If you treat a measurement as entanglement alone, then by the Kochen-Specker theorem you can't condition on the measurement outcome. However, we seem to be able to do this in actual experiments.
Thus measurement is not entanglement alone, but also the elimination of other bases.
I interpret KS backward from what you're saying. From my understanding, KS says that there is no sense in which the experiment involves an objective revelation of hidden state.
After the experiment, the experimenter did not "learn" or "reveal" some objective fact about reality, i.e. that the true state was UP rather than DOWN. Instead, after the experiment the experimenter becomes entangled with the UP/DOWN system in such a way that the experimenter measured both UP and DOWN, but all observables relating to the experimenter are either wholly consistent with UP or wholly consistent with DOWN.
The lack of noncontextual hidden variables is the main implication of the Kochen-Specker theorem, from which the inability to condition follows. It's not "backward" from the conclusion of no noncontextual hidden state; it's just another consequence thereof.
So the Kochen-Specker theorem says there was no pre-existing noncontextual state for the particle. However, that doesn't in any way imply the particle was both UP and DOWN, or that the device measured both UP and DOWN. Especially not for the device, as the Kochen-Specker theorem is proved in a setting where observable outcomes are assumed to be single-valued.
However, in real practice we can condition on the states of our devices following measurement, hence they don't seem to be susceptible to a Kochen-Specker result when viewed as the system for some "super"-observational device, which they would be if they simply entered an entangled state. Thus the assumption of measurement as simple entanglement does not match actual observed reality. This point is made in many texts, such as those of Schlosshauer, Omnès and Peres, and at a very rigorous level in the theory of C*-algebras and category theory by Fröhlich and Landsman respectively.
There are other alternatives such as the derivation of quantum theory within the GPT framework and many other axiomatic derivations.
I've never seen the Born rule derived from unitary evolution and axioms for how subjective experience should work, so I don't even see this as one of the ways.
It depends on what one calls the measurement problem.
This solves the "consistency/small problem", i.e. treating the macroscopic apparatus as boolean is justified.
It doesn't resolve the "outcome problem", i.e. which outcome is selected. Of course if you accept the world is not deterministic this isn't really a problem.
One could argue even classical mechanics isn't deterministic as we think of it because of chaos, which has fascinating connections with QM. W. Hoover (of Nosé-Hoover thermostat fame) did some great work with reversible thermostats exploring the instability of Newton's equations of motion.
I think that's different. Chaos still uses classical probability and the randomness is just ignorance of underlying initial conditions. This is very different from QM.
I have. They still don't make Quantum Theory and Chaos similar. Rather, for some systems QM can motivate ergodicity as well as classical chaos can. However, that doesn't mean Quantum Probability and Classical Probability are alike simply because they give similar behaviour for certain systems in one specific limit. Their representation theory is completely different.
Chaotic systems don't have a Kochen-Specker or PBR theorem.
What would you expect "experiencing along those other bases" to look like? If you expand along a different basis you just get something like: half a chance of experiencing (1/sqrt(3)|x> + sqrt(2/3)|y>), and half a chance of experiencing (sqrt(2/3)|x> - 1/sqrt(3)|y>), so it amounts to the same thing.
> the structure of the wavefunction is that it divides cleanly into those two branches, and that's true in any basis.
It's not. It only has this Schmidt decomposition in one basis. In other bases there will be cross terms among the basis elements. What you're doing is privileging Schmidt bases as ones that give experiences. In another basis with states w, z, say, the state will be:
a|w>|w> + b|z>|z> + c|w>|z> + d|z>|w>
with the cross coefficients c, d generically nonzero.
So you won't be able to give this clean "experience" reading unless you posit we can't experience in things like the w,z basis here and only in Schmidt bases, but then you run into the problem that real macroscopic systems won't admit a Schmidt basis.
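For concreteness, the cross terms can be exhibited numerically. A minimal numpy sketch (the amplitudes and the 45-degree rotation are just illustrative choices): take a|xx> + b|yy> with a != b and rewrite both subsystems in a rotated basis:

```python
import numpy as np

# Coefficient matrix of a|x>|x> + b|y>|y> in the x,y basis,
# with unequal amplitudes so the Schmidt decomposition is unique.
a, b = np.sqrt(0.7), np.sqrt(0.3)
psi = np.diag([a, b])

# Express both subsystems in a rotated basis {|w>, |z>} at 45 degrees.
c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
R = np.array([[c, s], [-s, c]])
psi_wz = R @ psi @ R.T  # coefficient matrix in the w,z basis

# Off-diagonal entries are the |w>|z> and |z>|w> cross terms.
print(psi_wz)
print("cross terms:", psi_wz[0, 1], psi_wz[1, 0])  # nonzero since a != b
```

The off-diagonal entries work out to cos(t)sin(t)(b - a), so they vanish only when the amplitudes are equal (the degenerate, maximally entangled case) or the rotation is trivial.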
This seems like the kind of "vague" Many Worlds where one doesn't look any deeper than pretending a macro-device is a qubit (e.g. no thermal states etc.) and looking at one basis. There's a reason properly developed MWI is nothing like this; see e.g. the spacetime state realism of Wallace and Timpson.
Why one would believe in quantum state realism at all is a separate question.
>Of course you can
No you can't, it's a direct consequence of the Kochen-Specker theorem. If the device is treated quantum mechanically and it enters an entangled state of the form you gave then you cannot perform conditioning as the Kochen-Specker theorem, via the non-uniqueness of Hilbert space orthogonal decompositions, prevents an unambiguous formulation of Bayes's law. I can link to papers proving this if you wish.
The fact that we do experiments where we can condition is, in light of this theorem, a demonstration that our measurement devices do not enter into the kind of CHSH states you're giving.
> It's not. It only has this Schmidt decomposition in one basis. In other bases there will be cross terms among the basis elements. What you're doing is privileging Schmidt bases as ones that give experiences. In another basis with states w, z, say, the state will be: a|w>|w> + b|z>|z> + c|w>|z> + d|z>|w>
The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).
Any physically valid concept of "experience" would have to behave the same way. If your state is equivalent to a linear superposition of "experiencing x" and "experiencing y" then it can be characterised completely in terms of "experiencing x" and "experiencing y", and that's not dependent on your choice of basis (though it may be easier to see in one basis or another).
> No you can't, it's a direct consequence of the Kochen-Specker theorem. If the device is treated quantum mechanically and it enters an entangled state of the form you gave then you cannot perform conditioning as the Kochen-Specker theorem, via the non-uniqueness of Hilbert space orthogonal decompositions, prevents an unambiguous formulation of Bayes's law. I can link to papers proving this if you wish.
> The fact that we do experiments where we can condition is, in light of this theorem, a demonstration that our measurement devices do not enter into the kind of CHSH states you're giving.
I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement; if you've got evidence that that's not the case then a Nobel prize awaits. Non-uniqueness is a red herring, because choice of basis does not and cannot change experimental predictions; the basis exists only in the map, not the territory.
The device has to have its contextual observable algebra develop a non-trivial center, not just be entangled, as mentioned in section 5 of the paper I linked. It's well known that entanglement alone isn't enough, which again is why entanglement alone has been called "pre-measurement" since the 1980s.
Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially programming communities like this one: the knowledge is stuck in the late 1970s.
> The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).
Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience"
a|zz> + b|ww> + c|zw> + d|wz> as to say we'd experience |xx> + |yy>, so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available I don't see what the force of this argument is.
Even worse, in QFT there isn't an expansion of the form |xx> + |yy> available, due to the Reeh-Schlieder theorem, so your whole construction is moot anyway. Again, where is the paper deriving the Born rule from unitarity and basic facts about subjective experience?
> I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement.
I'm claiming a consequence of a well known theorem from Quantum Probability. See section 4.2 of this paper https://arxiv.org/abs/1310.1484
Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.
I don't know what the "Nobel prize" remark is about, as it is well known that entanglement doesn't give well-defined conditioning. That's why entanglement with the device alone is called "pre-measurement" in most papers in measurement theory, following terminology introduced by Zurek in the early 80s. A good example of the issues with pre-measurement alone is here: https://arxiv.org/abs/2003.07464. You can't just treat the device as simply entering some CHSH or GHZ style entangled state and think that solves everything about measurement. It doesn't, by the theorem in the paper above (and for other reasons).
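The non-unique orthogonal expansion is easy to see numerically for a maximally entangled state: rotating both subsystems by any real angle returns the same coefficient matrix, so every such rotated basis furnishes an equally valid orthogonal decomposition into "outcomes". A toy numpy check (the angle is an arbitrary choice):

```python
import numpy as np

# Coefficient matrix of the Bell state (|x>|x> + |y>|y>)/sqrt(2).
psi = np.eye(2) / np.sqrt(2)

# Rotate both subsystems by an arbitrary real angle theta.
theta = 0.7  # any angle works
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, s], [-s, c]])
psi_rot = R @ psi @ R.T

# The coefficient matrix is unchanged: (|w>|w> + |z>|z>)/sqrt(2)
# is an equally valid orthogonal decomposition for every theta.
print(np.allclose(psi_rot, psi))  # True
```

With no preferred decomposition, "the probability of outcome UP given the device reads UP" has no unique referent, which is the conditioning problem being pointed at.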
> Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience" a|zz> + b|ww> + c|zw> + d|wz> as to say we'd experience |xx> + |yy>, so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available I don't see what the force of this argument is.
If there's a simple description of the wavefunction that's valid then there should be a correspondingly simple description of our experiences that's valid. The fact that there's also a more complicated valid description of the wavefunction is neither here nor there. It's like looking at a basket of 4 apples and asking why your experience doesn't correspond to there being 6 - 2 apples.
> Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.
Ok, I take your point, saying that we can just condition is overly flippant: if there are cross terms (i.e. entanglement) then classical conditional probability doesn't always accurately describe the behaviour of a system, and of course that's true for a system that includes experimenters inside it. But if we treat an experimenter's conditioning as creating entanglement, like any other QM interaction, and treat the subsequent evolution of the system quantum-mechanically, then there's no problem.
> A good example of the issues with pre-measurement alone is here https://arxiv.org/abs/2003.07464. You can't just treat the device as simply entering some CHSH or GHZ style entangled state and think that solves everything about measurement. It doesn't via the theorem I gave in the paper above (and other issues).
That paper amounts to nothing more than redefining "outcome" as something that cannot be in a superposition, and then using this to argue that it makes their unfounded notion of decoherence physically meaningful. If we assume that experimenters are physical systems that can undergo superpositions like any other, then of course Bell-style "no hidden variables" results apply when those variables are the outcomes of experiments. Big whoop. (Would you find the following argument convincing: "Pre-measuring the polarisation of the photon might have one of two possible results, so it doesn't have an outcome according to any reasonable notion of "outcome". Therefore if any observer has measured a photon's polarisation, a physically meaningful process of decoherence must have occurred"? Put like that it's hopefully obvious that this is nothing more than asserting the primacy of the Copenhagen interpretation).
> Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially programming communities like this one, the knowledge is stuck in the late 1970s.
Look, I'm not a big fan of credentialism, but I do have a master's in this from a reputable institution. If working physics has found a compelling argument that there's something mysterious about measurement or experience, then that knowledge hasn't made its way even as far as taught postgrad courses, let alone the wider public, and the blame for that has to rest with the physicists. (I rather suspect that there's no such argument that has reached any significant consensus among working physicists, and that the "late 1970s" view in the public sphere reflects that.)
Those are the same states so I'm not sure what you mean.
The point is that there is no reason to select out any particular basis over another. You can't just retreat into "well, this is the only basis I can experience", because the human sensory apparatus would be able to select out a range of bases in a full unitary account, and the ambiguity of basis decomposition means you can't perform conditioning, which we do all the time in experiments.
> Those are the same states so I'm not sure what you mean.
I mean that if you decompose along a different basis than experiencing x/experiencing y, you just get an ensemble of states each of which is a superposition of experiencing x and experiencing y. So you end up with the same thing.
It's like looking at an entangled state (because that's exactly what it is). If we have a two-particle state like 1/sqrt(2)(|x>|x> + |y>|y>), that behaves like the first particle being in |x> and experiencing the other particle being in |x>, or being in |y> and experiencing the other particle being in |y>. It might look like that's an artifact of this particular basis decomposition, but it actually isn't: the structure of the wavefunction is that it divides cleanly into those two branches, and that's true in any basis.
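One way to make "true in any basis" concrete: the branch structure here is the Schmidt decomposition, and an SVD of the coefficient matrix recovers it no matter which basis the matrix happens to be written in. A toy numpy sketch (amplitudes and rotation angle are arbitrary choices):

```python
import numpy as np

# The state a|x>|x> + b|y>|y>, deliberately rewritten in a rotated
# basis so the branch structure isn't visible by inspection.
a, b = np.sqrt(0.7), np.sqrt(0.3)
theta = 0.9
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, s], [-s, c]])
M = R @ np.diag([a, b]) @ R.T  # coefficient matrix, cross terms and all

# SVD recovers the two branches and their weights regardless of
# which basis M was written in.
U, schmidt_coeffs, Vt = np.linalg.svd(M)
print(schmidt_coeffs)  # ~ [0.8367, 0.5477], i.e. sqrt(0.7), sqrt(0.3)
```

The caveat from earlier in the thread still applies: when the weights are equal (the maximally entangled case), the SVD is degenerate and the decomposition is not unique.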
> You can't just retreat into "well this is the only basis I can experience" because the human sensory apparatus would be able to select out a range of bases in a full unitary account
A system that's freely interacting will become entangled; whatever we consider to be ourselves is constantly interacting with the rest of ourselves, almost by definition.
> also the ambiguity of basis decomposition means you can't perform conditioning which we do all the time in experiments.
Of course you can, and it works exactly the way you'd expect - we already do experiments where some isolated apparatus inside the experiment does something if it detects one thing and something else if it detects something else. Choice of basis is a tool for understanding the wavefunction, not a physically real thing.
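On this view, in-experiment conditioning is just ordinary unitary evolution: an apparatus that "does something if it detects one thing" is a controlled operation. A toy numpy sketch with a CNOT standing in for the apparatus (the amplitudes are arbitrary):

```python
import numpy as np

# System in superposition, apparatus ready in |0>.
a, b = np.sqrt(0.7), np.sqrt(0.3)
system = np.array([a, b])
apparatus = np.array([1.0, 0.0])
state = np.kron(system, apparatus)  # joint state |s>|0>

# "If you detect |1>, flip yourself": a CNOT, plain unitary evolution.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
state = CNOT @ state

# Result: a|0>|0> + b|1>|1> -- the apparatus's action is conditioned
# on the system, and the joint state is now entangled.
print(state)  # [a, 0, 0, b]
```

This shows the conditional behaviour, though whether that entangled joint state by itself counts as a completed measurement is exactly what the rest of the thread disputes.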
See my reply above. You're just declaring we only experience Schmidt bases for no particular reason. Where are you getting this "clear connection" between experience and the decomposition in one particular basis? Do you have a reference?
Well, you don't need nonlinear evolution to get what alpineidyll3 is saying. It's sufficient for the observable algebra of macro-observables to be commutative. This allows the evolution to be linear while showing no interference.
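A toy two-level sketch of why a commutative macro-observable algebra shows no interference: any observable diagonal in the pointer basis gives identical statistics for a coherent superposition and the corresponding classical mixture, so the coherences are invisible to it (amplitudes and the observable are arbitrary choices):

```python
import numpy as np

# Pure superposition a|0> + b|1> vs the classical mixture
# with the same weights.
a, b = np.sqrt(0.7), np.sqrt(0.3)
psi = np.array([a, b])
rho_pure = np.outer(psi, psi)      # has off-diagonal coherences
rho_mixed = np.diag([a**2, b**2])  # coherences removed

# An observable commuting with the pointer basis is diagonal in it.
O = np.diag([1.3, -0.4])  # arbitrary "macro-observable"

# Identical expectation values: the commutative algebra can't see
# the interference terms.
e_pure = np.trace(rho_pure @ O)
e_mixed = np.trace(rho_mixed @ O)
print(e_pure, e_mixed)
```

Only a non-diagonal (non-commuting) observable could distinguish the two density matrices, which is what "no interference at the macro level" amounts to here.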
The "QM is an island in theoryspace" idea isn't strictly true either. QM is one among an entire family of probability theories. It's only rigid when considered purely from the point of view of probability theories based around vectors in Hilbert space. However, considered as part of operational probabilistic theories (OPTs) in general, there's nothing that makes it difficult to modify.