Certainly an intriguing case, Dijkstra. I smell pedantry...but I could be wrong. A person so deeply involved in CS and not having a computer??? That's like being an expert cook who doesn't want to taste food.
> A person so deeply involved in CS and not having a computer???
I looked for a reference just now and couldn't find one. This mentions it:
> Dijkstra was famous for his general rejection of personal computers. Instead of typing papers out using a word processor, he wrote everything in longhand.
> That's like being an expert cook who doesn't want to taste food.
Oh ho! Don't let him hear you say that, eh? You'd get a scolding. He's the one who said, "Computer science is no more about computers than astronomy is about telescopes" and "Calling it computer science is like calling surgery knife science."
The analogy would be more like "an expert chef who refused to eat frozen dinners" maybe? :)
The difference as I see it: computer science (sorry, "informatics") is a mathematical discipline, and hence tends to concern itself with, out of a given class, the minimal (and maximal, when they exist) objects. Programming is an engineering discipline, and hence tends to concern itself with, out of a given class (viewed as a lattice), the intervals that are optimal under some suitability function.
In principle, the suitability function would be evaluated over the entire lattice; in practice, that function, whether explicitly or implicitly, includes a strong weight for "distance from existing solutions". In either case, this split in focus between the interior and the boundaries of the solution space means that programmers are often highly concerned with specific details that do not even appear (because they have been abstracted away) in the objects with which informaticians work.
As an example: theory people love to use 1-ary trees (induction steps cost nothing in proofs, but cases are expensive), and they will use 2-ary trees (sometimes even without pressure to sympathize with the machine), but systems people and programmers use k-ary trees, where, if k has been determined by measurement rather than by compatibility, it depends on "the" bandwidth-delay product between the storage-hierarchy levels the tree is optimized for ... or at least on what that product was at the time of writing.
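To make the k-ary case concrete, here's a back-of-the-envelope sketch of how a systems person might pick k for a disk-backed, B-tree-style node: take the widest node that still fits in one transfer unit of the storage level you're optimizing for. Every constant below is an illustrative assumption, not a measurement from any real system.

```python
# Hypothetical sizing of a k-ary (B-tree-style) node; all constants
# are assumptions for illustration, not measured values.

BLOCK_SIZE = 4096    # bytes fetched per I/O at the target storage level
KEY_SIZE = 8         # bytes per key
POINTER_SIZE = 8     # bytes per child pointer

def branching_factor(block: int, key: int, ptr: int) -> int:
    """Largest k such that a node with k children (so k-1 keys and
    k pointers) still fits in one block: (k-1)*key + k*ptr <= block."""
    return (block + key) // (key + ptr)

print(branching_factor(BLOCK_SIZE, KEY_SIZE, POINTER_SIZE))  # 256
```

The 4096 is exactly the kind of detail the theory side abstracts away, and exactly the number that silently goes stale when the storage hierarchy underneath it changes.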
FWIW, when I was poking around last night looking for references for the "forced to get a Mac" story I found this.
Tony Hoare:
> The first time I visited Edsger in Eindhoven was in the early Seventies. My purpose was to find out more about the THE operating system, [Dijkstra, May 1968.] which Edsger had designed. In the computing center at which the system was running I asked whether there was really no possibility of deadlock. "Let's see" was the answer. They then input a program with an infinite recursion. After a while, a request appeared at the operator's console for more storage to be allocated to the program, and this was granted. At the same time they put a circular paper tape loop into one of the tape readers, and this was immediately read into a buffer file by the spooling demon. After a while the reader stopped; but the operator typed a message forcing the spooler to continue reading. At the same time even more storage was allocated to the recursive program. After an interval in which the operator repeatedly forced further foolish storage allocations, the system finally ground to a complete halt, and a brief message explained that storage was exhausted and requested the operator to restart operations.
> So the answer was YES; the system did have a possibility of deadlock. But what interested me was that the restart message and the program that printed it were permanently resident in expensive core storage, so that it would be available even when the paging store and input/output utilities were inoperative. And secondly, that this was the very first time it had happened. I concluded that the THE operating system had been designed by a practical engineer of high genius. Having conducted the most fundamental and far-reaching research into deadlock and its avoidance, he nevertheless allocated scarce resources to ensure that if anything went wrong, it would be recognized and rectified. And finally, of course, nothing actually ever did go wrong, except as a demonstration to an inquisitive visitor.
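If you've only ever met deadlock in anecdotes like this one, the phenomenon itself is small enough to fit on a napkin. Here's a minimal sketch (my own toy illustration, nothing to do with how THE's allocator actually worked): two threads acquire two locks in opposite order and end up in a circular wait.

```python
# Classic two-lock deadlock, reduced to its essentials. Toy example,
# not a model of THE's storage allocator.
import threading
import time

a, b = threading.Lock(), threading.Lock()

def worker(first, second, name):
    with first:
        time.sleep(0.1)  # make sure both threads hold their first lock
        # Each thread now waits on the lock the other one holds: a
        # circular wait, so without the timeout neither would ever run.
        if second.acquire(timeout=1):
            second.release()
            print(f"{name}: got both locks")
        else:
            print(f"{name}: deadlocked, giving up")

t1 = threading.Thread(target=worker, args=(a, b, "t1"))
t2 = threading.Thread(target=worker, args=(b, a, "t2"))
t1.start(); t2.start()
t1.join(); t2.join()
```

The part of Hoare's story worth stealing is not the bug but the engineering posture: assume the proof might not cover reality, and pre-pay (in expensive core storage, no less) for the recovery path.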
People don't do computer science for the mere theory of it. In the end, CS should yield utility--solve problems in the real world--and those problems are solved through the medium of computers.
There is a reason it is called COMPUTER science while astronomy is not "telescope science". The analogy is faulty as well:
>> "Computer science is no more about computers than astronomy is about telescopes":
You can't compare astronomy to CS--astronomy needs computers and telescopes and spaceships--but in CS, the computer is the central figure. If not, can you think of ANY other tool that represents CS?
>> "Calling it computer science is like calling surgery knife science." is another faulty analogy. The knife does not have the same amount of critical importance both as a mean and an end in surgery, as does the computer for CS; in other words, a knife is merely a means to an end in surgery, but in CS, the computer is both the mean and the end: in the latter case, the computer is a sort of representative of all the knowledge and practice in CS, in a given time. The sophistication of the computer and what it can do, represent, to a large extent, what we have achieved in CS--but you can't say the same about a knife vs. surgery. Capisch?
First of all, you're a brave man going after Dijkstra. Even two decades in the ground he's still going to win this argument.
The domain of astronomy is the starry sky and the Universe it reveals. The domain of surgery is anatomy, physiology, metabolism. In Informatics (not everyone calls it "Computer science", eh?) the domain is formal systems.
In each case the instruments (telescope, scalpel, digital computer) are not the main focus of investigation; they are tools, not the domain of study.
> the computer is the central figure
This is precisely the misunderstanding that Dijkstra tilted against.
> can you think of ANY other tool that represents CS?
Yes. The human brain.
I'll leave you with another joke, one of my favorites, although I don't know who said it: "Computer science could be called the post-Turing decline in the study of formal systems."
Considering how it was the Greeks who prevented calculus from being discovered for 2000+ years, I would rather err on the side of Descartes and Leibniz and still ask questions like these.
I think you mistake my use of the term "computer" for the machine that everybody is using nowadays--but that is just an instance of the Class of computers. The ultimate goal of formal systems is making a better Class of computers that solves real-world problems more efficiently (any digression of formal systems into logic and linguistics always boomerangs back to machines).
Consider how Bayesian probability was looked down upon for decades before computers became powerful enough to reveal that the academic world was wrong to dismiss it--and those doing the dismissing were big names from the Frequentist school, just as EWD is a big name in CS....
Even if you still disagree--which you will--there is no denying the fact that not using technology when you ARE an expert in said technology is rather odd, and perhaps a bit silly. Have you seen astronomers shunning mathematics? Math is a tool that makes tractable a great many problems not ordinarily approachable with a "naked" mind. So does the computer (as an instance of the computer Class); that someone did not even want to use a typewriter, let alone a computer, is bewildering to me.
> People don't do computer science for the mere theory of it.
Are you sure? If I were to pick any arbitrary computer scientist (even stipulating it won't be EWD himself, this would still be "demonic choice", from your point of view), are you prepared to argue that whoever I pick does/did not do CS for the "mere" theory?
Exercise N: Was Euclid's GCD doing computer science? (A sketch of the algorithm follows the hint.)
Exercise S: Is watching TikTok doing computer science?
Hint: Gurfr dhrfgvbaf ner zrnag gb vyyhfgengr gur fhssvpvrapl naq/be arprffvgl bs pbzchgref gb qbvat pbzchgre fpvrapr.
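For the record, here is Exercise N's algorithm in its modern remainder form (a straightforward transcription; Euclid's own version in the Elements uses repeated subtraction):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: replace the pair by (divisor, remainder)
    until the remainder vanishes. Elements, Book VII, c. 300 BC."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21 -- perfectly computable by hand
```

Whatever Euclid was doing, it ran fine on no computer at all.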
When I say "don't, I mean "should not", but if they do, hey, that is what pedantry is for isn't it?
Exercise Q: Is doing theory for the sake of theory not ultimately about better theories that, in the end, should always yield utility in the real, applied world of computers solving hard problems?
Answer N & S: Yes, in a way, but it still ultimately has something to do with computers. Re TikTok, I am not sure how you classify "watching". It can have something to do with CS and therefore computers in the sense that the original TikTok source code is written ON A COMPUTER--and when users watch TikTok, the real data is analyzed ON A COMPUTER.
I don't mean to be lame--I see your smiley--but c'mon, it's like you're not even trying... People were doing astronomy before the invention of the telescope.
And historically, most of what we call computer science was developed before the advent of the computer: Turing, Church, Boole, Quine, Haskell Curry; Wittgenstein, Russell & Whitehead (Principia Mathematica); I could list names all day, none of whom used a mechanical computer.