I strongly disagree with your last statement - physics explicitly _is_ formulated in terms that can be made into a lookup table (see phase spaces in classical mechanics, for instance).
My point is that there's a finite light cone of possible causal influences over you at any moment in time, and in principle you can break those down into state variables finely enough to predict future states of a person. This is isomorphic to a lookup table, albeit one we aren't able to construct right now.
I'm not suggesting it's enough to consider just the person in this scenario - the causal factors are part of the lookup table.
No need, physicists already do this all the time - any computer simulation of quantum mechanical systems has to come to terms with the same problems (namely quantising the state space and representing the dynamics deterministically).
Physicists simulate on computers only what can be simulated, which is almost nothing. Consider obtaining the dynamics of water by simulating all its parts: the proton flow, hydrogen bonding, etc. of 10^{PHYSICALLY UNCOMPUTABLE} interactions.
The simulations which do exist fail to model vast amounts of what is actually there. This is why, say, climate change is given as a prediction on temperature -- because temperature can be obtained as a mean which ignores "basically everything".
And it can easily be shown that the assumptions of QM are false if Hilbert space is computable (QM becomes non-linear), and likewise those of classical mechanics (which becomes non-deterministic), and so on. I.e., the issue isn't merely 10^{PHYSICALLY UNCOMPUTABLE} interactions, but that non-computable functions are essential to the formulation.
The assertion that the world is computable is just that: there are no research projects, no textbooks, no experiments, no formalism to replace physics or anything like it -- nothing. All the basic assumptions of physics would have to be false, and we would have to have good reasons for supposing so.
This is just nonsense. The world is geometrical as described by physics. It is not computational as described by the discrete mathematician whose megalomania and platonism know no bounds.
To be honest, I don't really understand what you mean by Hilbert spaces being computable, and what that has to do with the linearity of QM, determinism of classical mechanics, universe being geometrical and not computational etc. I'm familiar with all of those concepts, but not sure how they tie together here. If you have resources you could share I would appreciate it (I had little success with google).
determinism = g(x, t_future) fully set by g(x, t_now) and g
if you model a geometric function, g : Real -> Real, with a computable one, c : Int -> Int, then there are gaps at arbitrarily high precisions, say p (e.g., p = delta(g, c) at (x, t))
Construct a classical system of arbitrary complexity (e.g., 10^BIG interactions), and describe each interaction with g. Since 10^BIG interactions are required, "delta(g, c) < BIG" is required in order for the system to remain deterministic (i.e., described by g). We can easily find cases where delta(g, c) > BIG, so CM would be non-deterministic if g were replaced by c.
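As a purely illustrative sketch of the gap-amplification point (the logistic map standing in for g, and Decimal working precision standing in for the g-c gap; all names here are my own choices, not standard physics code):

```python
from decimal import Decimal, getcontext

# Iterating a chaotic map at two different finite precisions shows how a
# tiny representation gap (the delta(g, c) above) is amplified step by
# step until the two trajectories disagree entirely.
def logistic_trajectory(x0: str, r: str, steps: int, digits: int) -> Decimal:
    getcontext().prec = digits          # the finite precision of "c"
    x, rr = Decimal(x0), Decimal(r)
    for _ in range(steps):
        x = rr * x * (1 - x)            # the map x -> r*x*(1-x)
    return x

lo = logistic_trajectory("0.4", "3.9", 100, 10)   # coarse "c"
hi = logistic_trajectory("0.4", "3.9", 100, 50)   # finer "c"
print(lo, hi, abs(lo - hi))
```

After 100 steps in the chaotic regime the two trajectories bear no resemblance to each other, even though they start from the same state and the same rule.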
As for QM, these "gaps" cause much deeper contradictions with its premises.
If you replace wavefunctions, g, with computable ones, c, then they don't sum to solutions of the wave equation, so QM fails to be linear (the delta(g, c) gaps are massive because Hilbert space is infinite-dimensional).
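To see the linearity point in miniature, here's a toy sketch (numpy assumed; grid quantisation stands in for the computable replacement c, and random vectors stand in for wavefunction amplitudes -- all illustrative choices of mine):

```python
import numpy as np

def quantise(psi: np.ndarray, step: float) -> np.ndarray:
    # Force each amplitude onto a discrete grid of spacing `step`,
    # standing in for replacing the real-valued g with a computable c.
    return np.round(psi / step) * step

rng = np.random.default_rng(0)
psi1 = rng.normal(size=64)   # toy "wavefunction" amplitudes
psi2 = rng.normal(size=64)
step = 0.1

# Quantise-then-superpose vs superpose-then-quantise: for generic states
# these disagree, so exact linearity does not survive quantisation.
lhs = quantise(psi1, step) + quantise(psi2, step)
rhs = quantise(psi1 + psi2, step)
print(np.max(np.abs(lhs - rhs)))
```

The residual is on the order of the grid spacing, not zero: quantisation and superposition do not commute exactly.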
Now it might be that reality is really computable in the sense that there's some c which can replace g, but this would violate the assumptions of physics and has no motivation. Physics might be wrong, but there's no evidence of that.
There are also other issues, but these are just two off the top of my head.
References: look for the physical Church-Turing thesis, the Church-Turing thesis, non-determinism and determinism in chaos theory, non-determinism in classical mechanics, and physical interpretations of the reals -- this will be in postgrad work, it won't be in pop-sci books.
> if you model a geometric function, g : Real -> Real, with a computable one, c : Int -> Int, then there are gaps at arbitrarily high precisions, say p (e.g., p = delta(g, c) at (x, t))
Nobody takes "computable approximation to g: R -> R" to mean "a computable function c: R* -> R" where R* is the computable reals. There are many mathematical issues with this caused by self-referential programs (realised by Turing himself in "On Computable Numbers"). Typically you would model it as "c: R* x Q -> R*" where the Q argument is a rational describing your desired precision, right?
> Since 10^BIG interactions are required, "delta(g, c) < BIG" is required in order for the system to remain deterministic (i.e., described by g).
I'm not sure what you mean by this - the computable approximation "c" is deterministic essentially by definition. If you mean "in order to remain within some bound of g" I can kinda see what you're saying but in that case you can interleave computations with smaller and smaller precisions (the "Q" I mentioned) in order to work around that issue, right? It won't be efficient, but it will certainly be computable.
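To make the "precision as input" idea concrete, here's a minimal sketch in the "c : R* x Q -> R*" style (bisection for sqrt(2); the function name and method are my own illustration, not taken from Turing's paper):

```python
from fractions import Fraction

# A computable real as a procedure: given a rational error bound eps,
# return a rational within eps of the true value. Any desired precision
# is reachable, just by paying more computation.
def sqrt2(eps: Fraction) -> Fraction:
    lo, hi = Fraction(1), Fraction(2)     # bracket: lo^2 <= 2 <= hi^2
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo   # within eps of sqrt(2), by the bracketing invariant

approx = sqrt2(Fraction(1, 10 ** 6))
print(float(approx))  # ~= 1.4142135...
```

Tightening eps buys more digits at more cost, which is exactly the interleaving-of-precisions workaround described above.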
> References: look for the physical Church-Turing thesis, the Church-Turing thesis, non-determinism and determinism in chaos theory, non-determinism in classical mechanics, and physical interpretations of the reals -- this will be in postgrad work, it won't be in pop-sci books.
Thanks! I don't know much chaos theory, I'll have a look around for a good textbook.
Edit: I just want to say - you have a pretty wild way of writing that makes it hard for me to tell if you're a crank or not. Either way, reading your posts here has given me a ton of food for thought =) what's your background?
1 yr medicine, 6 yr physics, 4 yr debating union, 20 yr c programming, 20 yr love of political and stand up comedy, 15 yr software eng, 10 yr data scientist, 15 yr python, 22 yr informal & formal philosophy, 8 yr data sci & software consult/coach to finance/defence/... and maybe soon, 4 yr PhD AI & HCI
Of those, you may decide which is the most relevant to my writing style. The amount of theatrics and irony in a live delivery might change the interpretation.
Replacing R with Q is just replacing it with (Int, Int) -- so be it. My claim concerns whether CM assumes determinism (it does) and therefore requires infinite precision: any gap whatsoever that goes missing means P(t_next|t_now) < 1.
You might say this indicates reality doesn't follow CM, and so that CM is wrong and (some now less hegemonic) views of QM are correct -- reality isn't deterministic.
Fine, but QM makes the situation worse, since it is linearity that is now under threat: we would not be able to compose QM systems linearly if the wavefunctions weren't infinite-dimensional.
One important assumption here is that we ought to take the explicit and implicit assumptions of physics as given as our starting point, i.e., Prior(Physics) = High, and Posterior(NotPhysics|Physics) = Low.
So the dialectical burden is on the "computationalists" to show that there is a workable theory of physics, at every level which preserves either (1) the assumptions of physics; or (2) motivates why those assumptions are wrong non-circularly.
Given the premise on priors above, the argument "physics is wrong because reality is computational" is both circular and unpersuasive (this doesn't mean it's wrong, just that no reason has been presented).
Wild, well, I'm a lowly math PhD so that's where my interests lie =)
I'm _not_ suggesting we replace R with Q. I'm suggesting that you bake in the desired accuracy of your computational approximation as an input. This is how Turing evades self-referential problems in his conception of computable reals, and also perhaps how you evade your criticisms about CM requiring infinite precision.
Similarly - I think it's reasonable in a computational context to assume linearity up to an error bound that is provided as an input. Of course things become non-computable if you ask for exact linearity. Equality itself is non-computable!
Either way I think we agree about physics. I don't believe the universe is describable as a computable function. Merely that we can approximate it to arbitrary degrees of accuracy =P
I think we teach people only what we can write in finite formula, and compute in finite time.
This is, imv, much like teaching people about what's under a street light just because everything else is in darkness.
I think, philosophically, we can build inferential telescopes that point to the vast (epistemic) blackness, inside say, a proton, or a cell, or the chaos in water.
As an ameliorative, or therapeutic project, I think people who build computational models too much should meditate on the number of protons flowing free in a drop of water, and what properties their interactions might bring about. And whether it would ever be possible to know them.
I've read this thread exchange with interest, but what about the results that quantum computers are simulatable by classical computers? See David Deutsch 1985. This would reduce the issue of infinite Hilbert spaces to simulation using quantum computers, and in turn, Deutsch's result which says classical Turing machines can actually simulate quantum computers.
You can always make local arguments that, say, some g can be substituted with some c.
The issue is broader than that. It concerns the premises of vast areas of physics -- you have to show they are more likely false than true.
This isn't an argument saying no c can be found for any given g. It's saying "g-c gaps have empirical consequences we haven't observed" -- and if we did observe them, physics would be foundationally wrong.
When they assert theorems like "classical TMs can simulate quantum TMs" they mean the simulation is gapless. Otherwise they use the term approximation.
> The assertion that the world is computable is just that: there are no research projects, no textbooks, no experiments, no formalism to replace physics or anything like it -- nothing. All the basic assumptions of physics would have to be false, and we would have to have good reasons for supposing so.
I have no idea what this means. Physics must be computable from straightforward physical arguments like the Bekenstein Bound: finite volumes must contain finite information. Any physical object has finite volume at any given time, the universe included, ergo they must contain finite information. Any system consisting of finite information can be modeled as a finite state machine.
Thermodynamic entropy isn't information in the relevant sense.
There's a wide class of computational mysticism born of people going around and equivocating between "information" as it means radically different things where it is used.
thermodynamic entropy (a real number) != information-theoretic entropy (bits) != information in CSI != information in stat mech != information in QM != information in a Turing machine != ...
This is basically pseudoscience at this point. If you hear people talking about "information" as if it's defined in a general sense (i.e., equivocating across physics, computer science, etc.), they have no clue what they're talking about.
Eg., the "entropy" of real-valued quantum states as measured by integer-valued notions of entropy is 1 bit (the measured state is UP,DOWN) -- but QM requires the state be real-valued (having infinite information in the computability sense).
These kinds of information are not measuring the same thing, and largely irrelevant to each other.
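The 1-bit measurement example above can be sketched concretely (the equal-superposition amplitudes are my illustrative choice; the formula is the standard Shannon entropy):

```python
import math

# The state is specified by real-valued amplitudes, but measurement in
# the UP/DOWN basis yields only a two-outcome distribution. For the equal
# superposition, the Shannon entropy of that distribution is 1 bit,
# however much precision the amplitudes themselves carry.
alpha = 1 / math.sqrt(2)   # amplitude for UP (illustrative choice)
beta = 1 / math.sqrt(2)    # amplitude for DOWN
probs = [alpha ** 2, beta ** 2]

shannon_bits = -sum(p * math.log2(p) for p in probs)
print(shannon_bits)  # ~= 1.0
```

The entropy of the outcome distribution and the precision of the amplitudes are simply different quantities, which is the distinction being drawn.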
I enjoyed the thread the other day, just want to point out there are some basic misunderstandings here - informational entropy and thermodynamic entropy in fact _do_ have a deep connection (Edwin Jaynes famously wrote about this). The key idea is that both are measures of the uncertainty of a system with respect to some distribution over it.
Saying that a real-valued state has infinite information in the computability sense is nonsense - information is a property of a _distribution_, not a _state_. You could talk about the Kolmogorov complexity of a real-valued state, but even this is generally not infinite, as anyone who's written a program to generate the digits of pi can attest.
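The pi remark can be made concrete: a few lines of integer arithmetic stream its digits indefinitely (Gibbons' unbounded spigot, shown here as an illustrative sketch):

```python
def pi_digits(count: int) -> str:
    # Gibbons' unbounded spigot: a handful of integer operations emit
    # the decimal digits of pi one at a time -- a short description of
    # an "infinite" object, which is the Kolmogorov-complexity point.
    digits = []
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    while len(digits) < count:
        if 4 * q + r - t < m * t:
            digits.append(str(m))
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return digits[0] + "." + "".join(digits[1:])

print(pi_digits(20))  # 3.1415926535897932384
```

Despite having infinitely many digits, pi is fully described by this short program, so its Kolmogorov complexity is small.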
> but QM requires the state be real-valued (having infinite information in the computability sense).
The unobservable state, which is merely a physical model that may have little resemblance to reality. All observable states necessarily have finite precision and beyond 60-70 digits are effectively undefined due to the uncertainty principle, which is yet another reason why people suggest physics is effectively computable.
While the types of information you mention are not strictly equivalent in some 1:1 sense that I don't think anyone has really suggested, there are formal correspondences, so your explanation ultimately just seems like a lot of special pleading, eg. you can derive a Bekenstein Bound for bits, thermodynamic entropy, information in QM, and so on.
No one here disagrees that measurement produces finite information; that is obvious and necessary -- if it wasn't, we would never be able to know anything. Knowing requires an "early termination".
The issue is that there's no evidence this property of measurement is a property of reality, and all the methods, premises, etc. of physics attribute the opposite to reality.
Here, it is absolutely necessary for QM to work that the unmeasured state is real-valued.
I'd also say that since measurement is finite in this manner, it then follows that large swathes of reality are unknowable -- and this makes it clear why we cannot obtain the latent state of a QM system.
> Here, it is absolutely necessary for QM to work that the unmeasured state is real-valued
You're just doubling down on the premise that a theory founded on a formalism based on unbounded numbers requires unbounded numbers to work. Sure, but why is that necessarily reflective of reality? Why does that entail that no other formalism that doesn't embed infinities / continuity is also not possible? I simply see no reason to accept your conclusion. The infinities you see as essential could very well simply be artifacts of our formalisms.
In fact, I'd conjecture that our continuous formalisms are at the very heart of some core problems in physics [1], and that at least some of those problems can be resolved by exploring more discrete formalisms. I suppose we'll see.