Hacker News

That's an interesting thread.

It seems to me that Kay sees "data" in the context of semiotics, where there is a signifier, a signified, and an interpreter, while Hickey is in the camp of physics, where things "are what they are" and can't be lied with (from the definition of semiotics as the study of everything that can be used to lie).



I am interested in reading where Kay references semiotics?

As a designer that is “graphically oriented” by nature, and also “CLI oriented” from necessity, I can easily see why Kay would lean into semiotics to iron out how humans should best interact with machines.

It’s possible all interfaces will eventually be seen as relics of a low bandwidth input era. We should understand the semiotics (and physics) of everything well before we hand off full autonomy to technology.


He doesn't directly reference semiotics; it's just that his line of argument adds an interpreter to the equation. This implies that data is just a signifier, which can then be resolved to a signified with the help of an interpreter; hence you also need to send an interpreter alongside the data.

In what form the interpreter is sent, though, remains an open question (because if the answer is "data", wouldn't that mean the argument recurses?).
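The idea above can be made concrete with a minimal sketch (this is not Kay's actual proposal; the names `Message`, `payload`, and `interpreter_src` are invented for illustration): a message carries both opaque bits and the program that resolves them, and the recursion shows up immediately, because the interpreter itself arrives as data and needs a pre-shared base interpreter to run.

```python
from dataclasses import dataclass

@dataclass
class Message:
    payload: bytes          # the signifier: opaque bits
    interpreter_src: str    # a program that resolves payload -> signified

def receive(msg: Message):
    # The interpreter arrives *as data* (a source string), so the
    # receiver still needs a pre-shared base interpreter (here,
    # Python itself via exec) -- exactly the recursion in question.
    env: dict = {}
    exec(msg.interpreter_src, env)
    return env["interpret"](msg.payload)

msg = Message(
    payload=b"3,1,4,1,5",
    interpreter_src="def interpret(b): return [int(x) for x in b.split(b',')]",
)
print(receive(msg))  # -> [3, 1, 4, 1, 5]
```

The recursion bottoms out only because sender and receiver already agree on Python; swap the base interpreter and the same bytes mean something else, or nothing.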


Thanks for clarifying that, I appreciate that connection you made.


At some point of exhaustion, the recursion can be interrupted by a dynamic conceptual framework acting as the interpreter. That still leaves us with philosophy.


Anything less than being a convincing prophet or an exhaustive orator won't suffice. There is likely no definitive answer to anything—only varying degrees of certainty, based on conceptual frameworks that are ultimately rooted in philosophy.


Don't the frame and qualification problems discredit the latter?

FWIW, most physics professionals I know who aren't just popular personalities are not in the scientific-realism camp.

They realize that all models are wrong and that some are useful.

I do think that the limits of induction and deduction are often ignored in the CS world, and abduction, being practical only in local cases, is ignored as well.

But the quants have always been pseudoscientific.

We are restricted to induction, deduction, and Laplacian determinism not because they are ideals, but because they make problems tractable on computers.

There are lots of problems that we can find solutions for, many more that we can approximate, but we are still producing models.

Throwing more and more data at the problem is an attempt to get around the frame and qualification problems.

Same problem that John McCarthy was trying to get around in this 1986 paper.

http://jmc.stanford.edu/articles/circumscription/circumscrip...
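For context, circumscription formalizes the assumption that a predicate holds of as few things as possible. A standard second-order formulation (paraphrased, not quoted verbatim from the paper) is:

```latex
\mathrm{Circ}[A; P] \;=\; A(P) \,\wedge\, \neg\exists p\,\bigl(A(p) \wedge p < P\bigr)
```

where $p < P$ abbreviates $\forall x\,(p(x) \rightarrow P(x)) \wedge \exists x\,(P(x) \wedge \neg p(x))$, i.e. models are restricted to those where the extension of $P$ is minimal among predicates satisfying the axioms $A$. It's a way of jumping to conclusions without enumerating every qualification.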

Note that local bi-abduction is the SOTA even for code.

Without solving the frame and qualification problems, everything that isn't Markovian and ergodic can be 'lied' with.



