> The truth seems to be that we don't know what makes a good language.
On the other hand, we do know what makes a bad language. We can spot flaws in a language, and fixing those flaws automatically makes the language better, at least in the aspects one is interested in. C, for instance, has a number of flaws and questionable tradeoffs that are widely known by now:
Contextual grammar. It is not too bad (the context is easy to maintain in the parser), but unknown identifiers can lead to syntax errors and slightly worse error messages.
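The classic illustration, with `foo` and `bar` standing in for whatever names you like:

```c
/* If `foo` names a type, this declares `bar` as a pointer to a foo.
 * Otherwise it is a multiplication whose result is thrown away. The
 * parser has to track typedefs to tell the two apart, and a missing
 * typedef surfaces as a baffling syntax error instead of a clean
 * "unknown type" diagnostic. */
foo * bar;
```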
Complicated syntax for types. Making declarations mirror use may have sounded neat at the time, but it kills the separation between an entity's name and its type. The result is not very readable; an ML-like syntax would be much better.
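To see how unreadable it gets, here is the standard `signal` function declared in plain C; the comment shows an ML-flavoured spelling of the same type, purely as an illustration and not any particular language:

```c
/* signal takes an int and a pointer to a function from int to void,
 * and returns a pointer to a function from int to void. Try reading
 * that off the declaration below without squinting. */
void (*signal(int sig, void (*handler)(int)))(int);

/* An ML-like syntax keeps the name and the type apart, e.g.:
 *     val signal : int -> (int -> unit) -> (int -> unit)       */
```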
Switch that falls through by default. The switch statement was clearly designed from an implementer's point of view: it maps nicely to a jump table. In practice, however, over 95% of switch statements do not fall through. Switch should break by default (having multiple cases branch to the same code is more common, but that is not incompatible with breaking by default; see how ML-style pattern matching does it).
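A minimal sketch of the usual failure mode (the enum and function are made up for the example):

```c
enum op { OP_ADD, OP_SUB };

int eval(enum op op, int a, int b)
{
    int result = 0;
    switch (op) {
    case OP_ADD:
        result = a + b;
        /* Missing `break`: OP_ADD silently falls through into OP_SUB,
         * so eval(OP_ADD, 2, 3) returns -1 instead of 5. The compiler
         * has no complaint; break-by-default would make this a non-bug. */
    case OP_SUB:
        result = a - b;
        break;
    }
    return result;
}
```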
No direct support for sum types (tagged unions). We have structs and unions, and with them we can emulate algebraic data types. It's a pain in the butt, however. Automating this somewhat would be nice, since sum types are so widely useful. Even if they were used just for error handling, that would be a nice bonus.
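Here is roughly what the manual emulation looks like, on a toy `shape` type (all names are made up for the example):

```c
/* A tagged union by hand: the tag and the payload live side by side,
 * and nothing stops you from reading the wrong member of the union
 * or forgetting a case. A language with real sum types checks both. */
enum shape_tag { CIRCLE, RECTANGLE };

struct shape {
    enum shape_tag tag;
    union {
        struct { double radius;        } circle;
        struct { double width, height; } rectangle;
    } as;
};

double area(struct shape s)
{
    switch (s.tag) {
    case CIRCLE:
        return 3.141592653589793 * s.as.circle.radius * s.as.circle.radius;
    case RECTANGLE:
        return s.as.rectangle.width * s.as.rectangle.height;
    }
    return 0.0; /* unreachable as long as the tag is valid */
}
```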
No generics. This makes it much harder to abstract away common code. Want to write a generic hash table? Good luck with pre-processor magic.
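The pre-processor magic in question looks something like this (a dynamic array rather than a hash table, to keep the sketch short, and not any real library):

```c
#include <stdlib.h>

/* Stamp out a separate, monomorphic "vector of T" for each element type.
 * Every type needs its own expansion, error messages point into the macro
 * body, and no single function can work over all the generated variants.
 * (Error handling is omitted for brevity.) */
#define DEFINE_VEC(T)                                               \
    typedef struct { T *data; size_t len, cap; } vec_##T;          \
    static void vec_##T##_push(vec_##T *v, T x) {                  \
        if (v->len == v->cap) {                                    \
            v->cap = v->cap ? v->cap * 2 : 8;                      \
            v->data = realloc(v->data, v->cap * sizeof(T));        \
        }                                                           \
        v->data[v->len++] = x;                                      \
    }

DEFINE_VEC(int)    /* generates vec_int and vec_int_push       */
DEFINE_VEC(double) /* generates vec_double and vec_double_push */
```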
Textual macros. We can do, and have done, better than that.
Too many undefined behaviours. In the name of portability, many things that would have worked fine on many architectures are now undefined because some architectures couldn't handle them. And now we have silly stuff such as undefined signed integer overflow. But we can't remove them, because compiler writers justify this madness with optimisations! (For the record, I have seen Chandler Carruth's talk on the subject, and I disagree: when compilers remove security checks because of undefined behaviour, it is just as bad as nasal demons.)
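A small, made-up example of the kind of check that disappears:

```c
/* Intended as an overflow guard. But when delta is positive, `len + delta`
 * can only be smaller than `len` by overflowing, and signed overflow is
 * undefined, so the compiler may assume it never happens and fold the
 * test to `delta < 0`. The guard then never fires on an actual overflow. */
int grow(int len, int delta)
{
    if (len + delta < len) /* "can't happen", says the optimiser */
        return -1;
    return len + delta;
}
```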
Silly `for` loop. That damn thing is fully general; we don't need that, we already have the `while` loop. A simpler, less general syntax would have allowed the optimisations that compilers currently get by exploiting undefined signed integer overflow, and then some.
---
Of course, it's easy to criticise a language over 40 years after the fact. But that's kind of the point: we have learned a good deal since the 70s. A language written now could not justify most of the flaws above. This is why so many people (myself included) dismissed Go out of hand: not providing generics in a new statically typed language is just silly.
Sure, designing a good language is still very hard. But we can avoid more mistakes now than we could some decades ago.
At least I find it encouraging that some of the newer languages are taking past mistakes into account.
They're not going to be perfect, and they surely have their own flaws, but those are things someone will fix in a couple of decades, once we know which of the new ideas are actually good and which are bad.