Of course. To him that would be modern mechanics. Or just mathematical natural philosophy, or whatever.
> Maybe we should categorize sciences based on the spatial scale at which they operate.
That would not be very useful, because there is no boundary. Nothing in general relativity says "below this everything is Newtonian". As a matter of fact we need to consider relativistic effects in quantum chemistry calculations that involve some heavy elements, at length scales smaller than 0.1 nm. Similarly, they just gave a Nobel prize for work on "Quantum properties on a human scale".
> For example, quantum scale, human scale and cosmic scales have their own physics, logic and causality.
That is not at all how these frameworks are built, and that is not the dominant epistemological approach. The mainstream view is that there is a theory of everything that exists but is unknown to us, and that our various theories are approximations of that theory under different assumptions. They look categorically different because we don’t understand the overarching framework, not because nature is fundamentally different depending on scale.
Also, I don’t see how the logic is fundamentally different between e.g. quantum mechanics and general relativity. Both rely heavily on things like Hamiltonian mechanics or symmetries. Some behaviours are different (like photons following geodesics and not straight lines, or superpositions of quantum states), but these are not a fundamental problem: a straight line is a limit case of a geodesic in a flat space, and a unique state is a limit case of superposition.
I am not saying that everything is fine and we know everything, just that there is no clear boundary between the situations in which different theories are required, and that we cannot neatly decompose the universe into different realms where different theories apply.
Most of these things are misunderstandings of quantum mechanics, as we know it today.
The main thing that is at the root of all of them is the word "things". In QM, the ground truth of the world is the wavefunction of the system. The wavefunction assigns some amplitude (potentially 0) to every possible state of the system that it describes. It then evolves purely deterministically from past to future, according to Schrödinger's equation (or Dirac's equation, if you want to discuss speeds close to that of light). The only kink is interaction with a measurement device (what constitutes a measurement device is one of the big mysteries that we don't yet have an answer for). After a measurement, the wavefunction collapses non-deterministically to one of the states that the measurement device was set up to detect, with a probability proportional to the squared amplitude of that state (the Born rule).
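As a toy sketch of that two-step picture (the two-state system and the Hamiltonian H = σ_x below are made up purely for the illustration, with ħ = 1):

```python
import math, random

# Toy two-state system: psi = (a, b) with |a|^2 + |b|^2 = 1.
# Deterministic evolution under a hypothetical Hamiltonian H = sigma_x:
# U(t) = exp(-i H t) = [[cos t, -i sin t], [-i sin t, cos t]].
def evolve(psi, t):
    a, b = psi
    c, s = math.cos(t), math.sin(t)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

# Born rule: the collapse probability is the *squared* amplitude.
def measure(psi, rng=random):
    a, b = psi
    p_up = abs(a) ** 2
    return (1, 0) if rng.random() < p_up else (0, 1)

psi = (1, 0)                    # start in the "up" state
psi = evolve(psi, math.pi / 4)  # deterministic Schrodinger step
a, b = psi
assert abs(abs(a) ** 2 + abs(b) ** 2 - 1) < 1e-12  # norm is preserved
print(abs(a) ** 2)              # P(up) = cos^2(pi/4) = 0.5
```

Everything before `measure` is reversible and deterministic; the non-determinism enters only at the measurement step, which is exactly the "kink" described above.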
Now, this is the "ground truth" of QM. Everything else, such as particles and space-time and so on, are just stories we tell to make sense of the wavefunction and its behavior. Sometimes those descriptions break down, and people start attaching weird, fanciful ideas to them, such as retrocausality, but these just prove that the stories are wrong, that they are misinterpreting the math of the wavefunction.
I'd also note that the main "time is weird" factoid you encounter related to QM experiments, the delayed-choice quantum eraser, is mostly a misunderstanding / sensationalization of the actual physics and the experiment. It again only proves that certain interpretations of what the wavefunction and its collapse represent are not temporally consistent, but the direct conclusion from this should be that the interpretations are wrong, not that "cause and effect goes for a toss, as behavior of time is different".
If we can't even define a "thing", by identifying its inside and outside, what it is and what it is not, where it is and where it is not, that itself is a big contrast with Newtonian (human-scale) mechanics. Everything one can say about quantum mechanics is completely alien to the humanly perceived world. That should justify the distinction by scale.
Most of these happen at most scales, but this has more to do with the classical laws of logic that we accept a priori because they are useful.
* PNC: at most one is true; both can be false
* PEM: at least one is true; both can be true
* PNC + PEM: exactly one is true, exactly one is false
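The three regimes above can be checked by brute force over joint truth assignments for a proposition and its negation (a small illustrative script, treating the two truth values as independent bits so the principles can be toggled):

```python
from itertools import product

# Enumerate joint truth values for a proposition A and its negation not-A,
# treated as two independent bits so PNC and PEM can decouple.
def satisfies(pnc, pem):
    """Return the (A, not-A) assignments allowed under the chosen principles."""
    allowed = []
    for a, na in product([True, False], repeat=2):
        if pnc and (a and na):     # PNC: at most one of A, not-A is true
            continue
        if pem and not (a or na):  # PEM: at least one of A, not-A is true
            continue
        allowed.append((a, na))
    return allowed

# PNC alone still permits "both false"; PEM alone still permits "both true";
# together they force exactly one true and one false.
print(satisfies(pnc=True, pem=False))   # includes (False, False)
print(satisfies(pnc=False, pem=True))   # includes (True, True)
print(satisfies(pnc=True, pem=True))    # only (True, False) and (False, True)
```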
Stick to the familiar computational complexity classes: P = co-P, but (we believe) NP != co-NP, and both relate to the accessibility of truth and falsehood. P and NP are by definition decision problems, NP specifically being the ones whose yes-answers can be verified in polynomial time.
If you ignore the Heisenberg uncertainty principle to avoid that complexity, the standard formalism of QM is just mapping continuous functions to a discrete space, and it will run into the problems above.
This happens in math too, where we use the rationals rather than the reals, or Cauchy sequences to construct the reals, because almost all reals are normal and non-computable, and even equality between two "real" real numbers is undecidable.
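A concrete instance of that map-vs-territory gap: binary floating point is a finite, discrete stand-in for the reals, and even simple equalities fail there, while exact rational arithmetic (Python's `fractions` module, used here purely as an illustration) keeps them, because those values actually live in Q, where equality is decidable:

```python
from fractions import Fraction

# Floats are a discrete approximation of the reals, so an identity
# that holds for real numbers can fail for their stand-ins.
print(0.1 + 0.2 == 0.3)   # False: binary rounding error

# Exact rational arithmetic keeps the identity.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))   # True
```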
This is also related to why, after Weierstrass showed that a continuous function can be nowhere differentiable, we moved to the epsilon-delta definition of a limit, etc.
We have the liar's paradox, which is easier to understand than Berry's paradox, which relates to Chaitin's proof that there is an upper limit to what any algorithm can prove.
Things at the quantum scale do act very differently from our typical intuition, but lots of maps from a continuous space to discrete categories exhibit the same behavior even at the macro scale; we can often just use a model that lets us ignore that and accomplish useful work.
Often we can even use repeated approximation or other methods to reduce those problems to something that is practical, but that is still the map and not the territory.
Superposition is just linearity:
f(a + b) = f(a) + f(b)
And:
f(s*a) = s*f(a)
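For a concrete check, here are those two conditions verified numerically for a linear map on 2-component complex vectors (the matrix M is arbitrary, chosen only for the demonstration):

```python
# A linear map f(v) = M v on 2-component complex vectors, written out
# by hand; M is an arbitrary complex matrix chosen for the demo.
M = ((1 + 2j, 3j), (0.5, -1j))

def f(v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

a = (1 + 1j, 2)
b = (3, -1j)
s = 2 - 0.5j

# Additivity: f(a + b) == f(a) + f(b)
lhs = f((a[0] + b[0], a[1] + b[1]))
rhs = (f(a)[0] + f(b)[0], f(a)[1] + f(b)[1])
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))

# Homogeneity: f(s*a) == s*f(a)
lhs = f((s * a[0], s * a[1]))
rhs = (s * f(a)[0], s * f(a)[1])
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
print("linearity holds")
```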
If you have two basis vectors, one at i (0,1) and one at 1 (1,0), but a map that outputs only a choice of {1, i}, and you think of those as up and down, then however you divide that continuous arc of the unit circle into UP or DOWN, almost all of quantum mechanics still works, and it is the contradictions with our intuitions that become the barrier (ignoring Heisenberg).
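The point about carving a continuous arc into discrete labels can be made concrete: any map from the quarter arc between 1 and i to {UP, DOWN} must cut somewhere, and states arbitrarily close on the continuous arc land in different discrete categories (the cut angle below is an arbitrary choice for the sketch):

```python
import cmath, math

# States on the quarter arc of the unit circle between 1 (angle 0)
# and i (angle pi/2), parameterised by the angle theta.
def state(theta):
    return cmath.exp(1j * theta)

# A discrete classifier: everything below some cut angle is DOWN,
# everything at or above it is UP. The cut at pi/4 is arbitrary.
CUT = math.pi / 4

def classify(z):
    return "UP" if cmath.phase(z) >= CUT else "DOWN"

# Two states that are arbitrarily close on the continuous arc get
# different discrete labels: a feature of the map, not the territory.
eps = 1e-9
print(classify(state(CUT - eps)))   # DOWN
print(classify(state(CUT + eps)))   # UP
```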
If you think about Heisenberg uncertainty as indeterminacy instead, then independence results in math (like CH over ZFC, which is neither provable nor refutable), Chaitin's complexity limits, the breakdown of Laplacian determinism, and even modal logic all apply at multiple levels.
The amazing thing, to my mind, is that we found useful, effective models despite these limits.
But IMHO it is better to think that in the quantum world we simply aren't so lucky, rather than that those limits aren't still lurking under the surface at the macro scale, which they very much are.
> > Maybe we should categorize sciences based on the spatial scale at which they operate.
> That would not be very useful, because there is no boundary. Nothing in general relativity says "below this everything is Newtonian". As a matter of fact we need to consider relativistic effects in quantum chemistry calculations that involve some heavy elements, at length scales smaller than 0.1 nm. Similarly, they just gave a Nobel prize for work on "Quantum properties on a human scale".
You are just saying "well ackshually". I dare you to build a cabinet using the Hamiltonian. I double-dog-dare you.
> > For example, quantum scale, human scale and cosmic scales have their own physics, logic and causality.
> That is not at all how these frameworks are built, and that is not the dominant epistemological approach.
Again, 99.999% of all functional mechanics don't involve epistemology.
> The mainstream view is that there is a theory of everything that exists but is unknown to us, and that our various theories are approximations of that theory under different assumptions.
Oh! You're so close to seeing the point... There are multiple levels of approximation (at least two), and the one we all experience is Newtonian. Perhaps more accurately, our senses mostly believe pre-Newtonian approximations, which is why it took until Newton to realize how inaccurate they were.
> Also, I don’t see how the logic is fundamentally different between e.g. quantum mechanics and general relativity.
You're pretty radically moving the goalposts here. GP was talking about Newtonian mechanics, not Hamiltonian.
> You are just saying "well ackshually". I dare you to build a cabinet using the Hamiltonian. I double-dog-dare you.
The Hamiltonian (and Lagrangian) formulations are much more amenable to actual physical calculations, at least on a computer, than the Newtonian formulation of classical mechanics, but otherwise they are perfectly equivalent mathematically. I'm not sure where you'd need any kind of dynamical laws in the building of a cabinet, on the other hand. Are you trying to arrange for a system of inclined planes and pulleys to slot the pieces into place?
> Perhaps more accurately, our senses mostly believe pre-Newtonion approximations, which is why it took until Newton to realize how inaccurate they were.
This is a bit of a misnomer. Our senses and intuitions are in fact remarkably accurate for a certain range of values, and quite equivalent to what Newton's laws of motion say about them. To some extent, Newton "only" found a simple formalism to represent our existing intuitions. Our intuitions of course break down elsewhere, such as at very high speeds, or at very high altitudes, where relativistic corrections start to become significant.
QM however is a paradigm shift in how the world is described, and it is completely non-intuitive, even in regimes where its predictions are fully aligned with our intuitions and senses. You can use QM to compute the collision of two ideal balls on an ideal plane, and the results will exactly match your intuitions. But the computation and even the representation of the system will not, in any way.
> You are just saying "well ackshually". I dare you to build a cabinet using the Hamiltonian. I double-dog-dare you.
We do that every day using things like finite elements. It’s just a different Hamiltonian that accounts for the fact that we simplify bunches of atoms into a continuum.
> Again, 99.999% of all functional mechanics don't involve epistemology.
Discussing the nature of scientific theories is epistemology. The parent’s point is epistemological in nature.
> Oh! You're so close to seeing the point... There are multiple levels of approximation (at least two), and the one we all experience is Newtonian.
You are off base. There are many, many approximations that may or may not overlap. It’s not just onion layers.
> You're pretty radically moving the goalposts here. GP was talking about Newtonian mechanics, not Hamiltonian.
GP was talking about different theories applying at different scales. Sorry, it may not have been clear in context, but Hamiltonian here is not just the generalisation of Newtonian mechanics we learn about in 2nd-year physics: these theories (quantum mechanics, Newtonian mechanics, and relativity) can all be written using the Hamiltonian formalism.