
From my limited knowledge, logic at the quantum scale appears quite different:

* Things don't have their own location or identity

* Spatial and temporal extents don't exist

* Something may be true and false at the same time, or concept of true and false may not be defined

* Cause and effect go for a toss, as the behavior of time is different

* Existence and non-existence co-exist, or come into existence together

Similar effects appear at relatively infinite scale (maybe purely mathematical):

* Comparisons (big/small/equal) break down

* Regular arithmetic and logic break down



Most of these things are misunderstandings of quantum mechanics, as we know it today.

The main thing at the root of all of them is the word "things". In QM, the ground truth of the world is the wavefunction of the system. The wavefunction assigns some amplitude (possibly 0) to every possible state of the system it describes. It then evolves purely deterministically from past to future, according to Schrodinger's equation (or Dirac's equation, if you want to discuss speeds close to that of light). The only kink is interaction with a measurement device (what constitutes a measurement device is one of the big mysteries we don't yet have an answer for). After a measurement, the wavefunction collapses non-deterministically to one of the states that the measurement device was set up to detect, with a probability proportional to the squared magnitude of that state's amplitude.
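A rough numerical sketch of that picture for a single two-state system (the labels, the rotation angle, and the number of steps are made up for illustration): the amplitudes evolve deterministically under a unitary, and only the simulated "measurement" picks a state at random, with probability given by the squared magnitude of the amplitude.

    import numpy as np

    # Wavefunction of a two-state system: amplitudes for |0> and |1>.
    psi = np.array([1.0 + 0j, 0.0 + 0j])

    # Deterministic evolution: any unitary works; this small rotation is a
    # stand-in for exp(-i*H*dt) under some made-up Hamiltonian H.
    theta = 0.3
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]], dtype=complex)

    for _ in range(5):              # Schrodinger-style evolution, fully deterministic
        psi = U @ psi

    # Measurement in the {|0>, |1>} basis: non-deterministic collapse.
    probs = np.abs(psi) ** 2        # Born rule: P(state) = |amplitude|^2
    outcome = np.random.choice([0, 1], p=probs / probs.sum())
    psi = np.zeros(2, dtype=complex)
    psi[outcome] = 1.0              # the wavefunction after collapse
    print("measured", outcome, "with probabilities", probs)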

Now, this is the "ground truth" of QM. Everything else, such as particles and space-time and so on, is just a story we tell to make sense of the wavefunction and its behavior. Sometimes those descriptions break down and start implying weird, fanciful ideas such as retrocausality - but that just proves the stories are wrong, that they are misinterpreting the math of the wavefunction.

I'd also note that the main "time is weird" factoid you encounter related to QM experiments, the delayed-choice quantum eraser, is mostly a misunderstanding / sensationalization of the actual physics and the experiment. It again only proves that certain interpretations of what the wavefunction and its collapse represent are not temporally consistent, but the direct conclusion from this should be that the interpretations are wrong, not that "cause and effect goes for a toss, as behavior of time is different".


If we can't even define a "thing" by identifying its inside and outside, what it is and what it is not, where it is and where it is not, that itself is a big contrast with Newtonian (human-scale) mechanics. Everything one can say about quantum mechanics is completely alien to the humanly perceived world. That should justify the distinction by scale.


Most of these happen at most scales, but they have more to do with the classical laws of logic that we accept a priori because they are useful (a small truth-table sketch follows this list):

* PNC: at most one is true; both can be false

* PEM: at least one is true; both can be true

* PNC + PEM: exactly one is true, exactly one is false
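To make those three bullets concrete, here is a tiny Python sketch (the helper names are mine) that lists which joint truth-value assignments for a proposition A and its negation each principle permits:

    # For a proposition A and its negation notA, list the joint truth values
    # each principle allows (a toy illustration, not a formal logic).
    assignments = [(True, True), (True, False), (False, True), (False, False)]

    pnc = lambda a, na: not (a and na)   # PNC: never both true
    pem = lambda a, na: a or na          # PEM: never both false

    print("PNC allows:      ", [x for x in assignments if pnc(*x)])
    print("PEM allows:      ", [x for x in assignments if pem(*x)])
    print("PNC + PEM allows:", [x for x in assignments if pnc(*x) and pem(*x)])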

If you stick to the familiar computational complexity classes: P = co-P, but it is conjectured that NP != co-NP, and both facts relate to how accessible truth and falsehood are. P and NP are by definition classes of decision problems, with NP specifically the ones whose yes-answers can be verified in polynomial time.

If you ignore the Heisenberg uncertainty principle to avoid that complexity, the standard formalism of QM is just mapping continuous functions onto a discrete space, and it will have problems with the above.

This happens in math too, where we use the rationals instead of the reals, or Cauchy sequences to construct the reals, because almost all reals are normal and non-computable, and even equality between two "real" real numbers is undecidable.
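A hedged sketch of that last point (the particular sequences are made up): if two reals are only given as better and better rational approximations, a finite number of agreeing terms can never confirm equality, only ever refute it.

    from fractions import Fraction

    # Two reals given as Cauchy sequences of rationals, each within 10^-n
    # of its limit at step n. Both happen to converge to 1, but a program
    # that sees only finitely many terms can never conclude x == y.
    def x(n):
        return Fraction(1) - Fraction(1, 10) ** n    # 0.9, 0.99, 0.999, ...

    def y(n):
        return Fraction(1)                           # constant 1

    def distinguish(a, b, max_terms):
        # Report "different" only if the approximations separate by more
        # than the combined error bound; otherwise give up, undecided.
        for n in range(1, max_terms + 1):
            error = 2 * Fraction(1, 10) ** n
            if abs(a(n) - b(n)) > error:
                return "different"
        return "undecided after %d terms" % max_terms

    print(distinguish(x, y, 50))   # stays undecided: equality is never confirmed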

This is also related to why, after Weierstrass showed that almost all continuous functions are nowhere differentiable, we moved to the epsilon-delta definition of limits, etc.

We have the liar's paradox, which is easier to understand than Berry's paradox, which relates to Chaitin's proof that there is an upper limit to what any algorithm can prove.

Things at the quantum scale do act very differently from our typical intuition, but lots of maps from a continuous space to discrete categories can exhibit the same behavior even at the macro scale; we can often just use a model that lets us ignore that and accomplish useful work.

Often we can even use repeated approximation or other methods to reduce those problems to something that is practical, but that is still the map and not the territory.
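A mundane macro-scale example of such a map misbehaving (nothing quantum about it): ordinary rounding sends a continuous quantity to discrete categories, and the answer depends on where the cut happens to fall.

    # Rounding is a map from a continuous space to discrete categories,
    # and it is not linear: round(a) + round(b) != round(a + b) in general.
    a, b = 0.4, 0.4
    print(round(a) + round(b))   # 0
    print(round(a + b))          # 1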

Superposition is just that:

    f(a + b) = f(a) + f(b)
And:

    f(sa) = s*f(a)
If you have two vectors, one at i (0,1) and one at 1 (1,0), a state can sit anywhere on the continuous arc between them, but the measurement map only returns the choice of 1 or i - think of that as up and down. However you divide that arc of the unit circle into UP or DOWN, almost all of quantum mechanics still works, and it is the contradictions with our intuitions that become the barrier (ignoring Heisenberg).
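Continuing that sketch in code (the angle and the UP/DOWN labels are arbitrary): a state anywhere on the arc between (1,0) and (0,1) is a superposition, linearity holds exactly on the amplitudes, and the discrete UP-or-DOWN answer only appears when the measurement map is applied.

    import numpy as np

    up   = np.array([1.0, 0.0])        # the "1" direction, (1, 0)
    down = np.array([0.0, 1.0])        # the "i" direction, (0, 1)

    theta = 0.7                        # anywhere on the arc between them
    state = np.cos(theta) * up + np.sin(theta) * down   # a superposition

    # Superposition is just linearity of the underlying map:
    f = lambda v: 2.0 * v              # any linear f will do
    assert np.allclose(f(state),
                       np.cos(theta) * f(up) + np.sin(theta) * f(down))

    # The discrete part comes from the measurement: the continuous arc is
    # cut into UP or DOWN with probabilities cos^2 / sin^2 of the angle.
    p_up = np.cos(theta) ** 2
    print(np.random.choice(["UP", "DOWN"], p=[p_up, 1 - p_up]))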

If you think about Heisenberg uncertainty as indeterminacy instead, then independence in math (like CH over ZFC, which is neither provable nor refutable), Chaitin's complexity limits, the breakdown of Laplacian determinism, and even modal logic all apply at multiple levels.

The amazing thing, in my mind, is that we found useful models and effective methods despite these limits.

But IMHO it is best to think that in the quantum world we simply aren't so lucky, rather than assume those limits aren't still lurking under the surface at the macro scale, which they very much are.



