Now we have 300+ people voting on features that are never tested on a real compiler before being set in stone.
Compilers are now in different states of ISO C++ compliance, ranging between ISO C++14 and ISO C++23.
Several of those on-paper features were deemed not quite right when implemented and had to be redone between revisions, but naturally the old ones kind of stay in, for backwards compatibility.
All languages are hostile to cross language interop, that is why everyone pretends to be C, or uses OS IPC.
Now we have 300+ people voting on features that are never tested on a real compiler before being set in stone.
Would you prefer that only 30 people should be allowed to hold up the acceptance of half-baked proposals? How could lower participation in C++ language evolution and/or standardization possibly do anything to improve the rigor and field testing of proposals?
Compilers are now in different states of ISO C++ compliance, ranging between ISO C++14 and ISO C++23.
And? MSVC didn’t even do two-phase name lookup, a fundamental feature of C++, correctly until 2017, yet the non-conforming behavior was little more than a nuisance. It’s really no concern of mine that Clang/libc++ hasn’t completed pstl support for C++17, because I wouldn’t be using it anyway. It is concerning when implementations half-ass a feature to pencil-whip their “compliance” with new C++ standards (particularly when it only takes one implementation to make a feature useless for portable code).
Several of those on-paper features were deemed not quite right when implemented and had to be redone between revisions, but naturally the old ones kind of stay in, for backwards compatibility.
The developers behind major C++ implementations are acutely aware of every wart and failure, more so than virtually any end user. If they could, they would test every proposal on multiple compilers, across thousands of projects, targeting as many platforms as possible. But the resources and expertise to do so simply aren’t there. At some point the committee must hope for the best, take a leap of faith, and commit to a decision. It’s admirable that the C++ committee chooses to prioritize the needs of end users over the interests of implementations.
All languages are hostile to cross language interop,…
Managed languages do it every day (Kotlin and Java). But yes, the situation is quite different for native languages. What’s different about C++ is that it’s a remarkably difficult language to wrangle. C bindings are almost trivial to write or generate, but C++ introduces a mountain of complexity and ambiguity.
There are dozens of projects which have incrementally rewritten C in Rust. But it’s practically impossible to do anything similar with C++.
…that is why everyone pretends to be C, or uses OS IPC.
No language wants to bear the cross of a stable ABI, much less an ossified ABI. A platform’s C implementation has already paid that price, so everyone is welcome to freeload.
That said, nothing good is ever free, and the poverty of C ABIs proves it. With hardly any exceptions, describing a typical C ABI as “fragile” is an understatement at best.
Try calling Kotlin coroutines from Java and see how well it goes; even shared bytecode doesn't help when the semantics aren't there, or when languages build their own little islands on top of the underlying platform.
Yes, a smaller group of people; and, as in any sane language evolution process, papers without implementations don't go in.
Words aren't a replacement for proper field testing.