It used to be that the major limitation in computing was hardware. Look at the creativity that it spawned. We are still living off of ideas three decades old.
Now what is the limitation? Is it programmer stupidity? Or maybe there is no limitation? Maybe that is the problem.
History shows hardware kept getting faster, but there has been no let-up in the production of ever larger, slower, less reliable software. Where is all the lean, "instantaneous" software running in RAM, extracting every last bit of computing power instead of mindlessly consuming it?
What exactly is the point of "programmer productivity", and of whatever problems some programmers think it justifies?
Is the point to advance computing? The state of the art?
Or is the goal to peddle their junk to those with very low expectations, not to mention those with no expectations at all? (The latter were not old enough, or even alive, to see how computing was done in the '80s and hence have nothing to compare it with.)
It is unfortunate that giving programmers what they wanted -- faster hardware -- has not resulted in software that is any more creative or powerful than the software of the past; if anything, it is less so. Only the hardware has improved.
Given the history so far, I would argue that the only proven path to true creativity in computing is through limitation.
Perhaps present day computing's limitation is programmer stupidity, or to be more gentle, programmer ignorance.
Agree 100% with all points in the blog post, except perhaps the last one. Well done.
It used to be that the major limitation in computing was hardware. Look at the creativity that it spawned.
This can be summed up in one word: demoscene. In fact I thought the article would be about the demoscene, just from its title.
History shows hardware kept getting faster, but there has been no let-up in the production of ever larger, slower, less reliable software.
That's what I think someone who took this 30-year leap would perceive if they switched from a computer of the '80s to a new system today: "This app is how many bytes!? It's pretty and all, but it can only do this? Why does it take so long to boot up?"
What exactly is the point of "programmer productivity", and of whatever problems some programmers think it justifies?
I think it's more like a combination of laziness and selfishness: programmers want to be more "productive" by doing the least work possible, and at the same time either don't care about or underestimate the impact this has on the users of their software. Trends in educating programmers that encourage this sort of attitude certainly don't help...
Latency. Our hardware may be getting faster and faster, throughput-wise, but it is certainly not improving very much in the latency department. Many of the things we'd like to build (such as AI) tend to have a lot of dependent memory accesses. These sorts of processes are highly sensitive to latency.
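Since this point hinges on dependent memory accesses being latency-bound, here is a minimal C sketch (my own illustration, not from the thread) contrasting a random pointer chase, where each load must wait for the previous one, with a sequential pass over the same array that the memory system can pipeline. The array size, the xorshift PRNG, and the POSIX clock_gettime timer are arbitrary assumptions for the sketch.

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>
    #include <time.h>

    #define N ((size_t)1 << 24)   /* ~16M entries: well beyond typical cache sizes */

    /* Tiny xorshift PRNG, just to avoid RAND_MAX pitfalls; seed is arbitrary. */
    static uint64_t rng = 0x9E3779B97F4A7C15ull;
    static uint64_t next_rand(void) {
        rng ^= rng << 13;
        rng ^= rng >> 7;
        rng ^= rng << 17;
        return rng;
    }

    static double now_sec(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);   /* POSIX timer; an assumption here */
        return (double)ts.tv_sec + ts.tv_nsec * 1e-9;
    }

    int main(void) {
        size_t *next = malloc(N * sizeof *next);
        if (!next) return 1;

        /* Sattolo's algorithm: turn the identity into one big random cycle,
         * so the chase below visits every element with no short loops. */
        for (size_t i = 0; i < N; i++) next[i] = i;
        for (size_t i = N - 1; i > 0; i--) {
            size_t j = (size_t)(next_rand() % i);
            size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
        }

        /* Dependent accesses: each load's address comes from the previous load,
         * so the CPU cannot overlap the cache misses -- latency-bound. */
        double t0 = now_sec();
        size_t p = 0;
        for (size_t i = 0; i < N; i++) p = next[p];
        double dep = now_sec() - t0;

        /* Independent accesses: addresses are known up front, so prefetching and
         * many in-flight requests hide the latency -- throughput-bound. */
        t0 = now_sec();
        size_t sum = 0;
        for (size_t i = 0; i < N; i++) sum += next[i];
        double ind = now_sec() - t0;

        /* Print p and sum as well, so the loops are not optimized away. */
        printf("dependent chase: %.3f s  (p=%zu)\n", dep, p);
        printf("sequential sum:  %.3f s  (sum=%zu)\n", ind, sum);
        free(next);
        return 0;
    }

On typical desktop hardware the chase should come out many times slower, even though both loops touch the same 128 MB of data, which is the latency-versus-throughput gap described above.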