
The thing people don't get about C++'s complexity is that complexity is unavoidable.

It is also there in Ada, C#, Java, Python, Common Lisp, and so on.

Even if the languages started tiny, complexity eventually grows on them.

C23 + compiler extensions is quite far from where K&R C was.

Scheme R7 is quite far from where Scheme started.

Go's warts are directly related to ignoring the history of growing pains in other ecosystems.



> Even if the languages started tiny, complexity eventually grows on them.

And then of course there is the case that proves the opposite: Clojure. Sure, new ideas appear, but the core language has been more or less unchanged since it was introduced, rock solid, and decades-old projects still run just fine, usually a bit faster.


That is because Clojure is done; hardly anything is being done beyond what matters to NuBank and Datomic.

Its market share also kind of shows it.


That's a pretty far cry from "complexity is unavoidable". Reading that, to me, implies that complexity is inherent in programming language design, whereas this follow-up argument seems to say that complexity is the result of tacking on new features.

The latter is a bit tautological, since the size of the language grammar is itself a measure of complexity.


I think they haven't even adopted newer JVM features; it is a hosted language designed to depend on its host, plus it is a Lisp.

The complexity would come from growing like Common Lisp; instead, it is up to Clojure folks to write Java, C#, and JavaScript code, and therein lies the complexity.


> I think they even haven't adopted newer JVM features

You don't know what you're talking about. Not only is Clojure steadily adopting newer JVM features (when that makes sense) - java streams, functional interfaces, qualified method values, interactive lib loading, JDK 21 virtual threads, etc. - it also constantly explores beyond the JVM - e.g., Jank targets LLVM and has C++ interop.

Pick some hardcore JVM topics and try searching what Clojurists think about them - GC, profiling, concurrency, etc. There are tons of interesting, deeply involved things constantly being hacked together by incredibly knowledgeable folks. You're casually name-dropping "complexity", maybe without even realizing that it's a community that includes people who have written production experience reports on Shenandoah GC, built profiling tools that work around safepoint bias, and given conference talks on tri-color marking algorithms. Dealing with complexity is their bread and butter. Challenging Clojurists to debate "complexity" is like dropping "the brain has neurons" around a group of neurosurgeons. They'd quietly say nothing, so you can "win your argument", but they'll just... know.


I was talking about the JVM bytecodes for dynamic languages.

Also, I remember watching a recent talk where virtual threads were still "being considered".

Having to write portable code that takes into account host differences and differences in execution semantics, while still delivering the same outcome, is also complexity that keeps neurons busy.


Let's see.

> JVM bytecodes for dynamic languages

You're talking about invokedynamic - the bytecode instruction added in Java 7 specifically to make dynamic-language dispatch efficient, right? Explained simply: the JVM was designed for static types - method calls resolved at compile time. In dynamic languages you don't know the type of something until runtime, so implementations had to hack around that, typically by boxing everything and doing manual type checks. This was slow and awkward. JRuby and Groovy adopted invokedynamic eagerly. Clojure's dispatch model, though, is different. Most calls are either direct interop (already statically typed) or calls through a Var (a reference to a function value, not a dynamic method lookup). The Var indirection is a different shape of problem that invokedynamic doesn't solve as cleanly. It's not that it's useless, just that the fit isn't as natural.
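For readers unfamiliar with the machinery: invokedynamic call sites are backed by java.lang.invoke.MethodHandle - the target is resolved (linked) once at runtime and then invoked at near-direct-call speed. A minimal sketch of that linking step in plain Java (class and method names here are my own, for illustration):

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

// Sketch of what an invokedynamic call site does under the hood:
// resolve a target method once at runtime (linking), then invoke it
// through a MethodHandle.
public class IndySketch {
    static String linkAndCall() throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        // Resolve String.concat(String) at runtime, as an indy
        // bootstrap method would when linking a call site.
        MethodHandle concat = lookup.findVirtual(
                String.class, "concat",
                MethodType.methodType(String.class, String.class));
        // Once linked, invocation is close to a plain virtual call.
        return (String) concat.invokeExact("foo", "bar");
    }

    public static void main(String[] args) throws Throwable {
        System.out.println(linkAndCall()); // prints "foobar"
    }
}
```

A Clojure Var, by contrast, is a mutable cell holding a function value that is dereferenced on every call, which is why this link-once model is a less natural fit.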

> virtual threads was still "being considered"

That info is outdated. Clojure 1.12.0 shipped two years ago with virtual thread support, but the integration with core.async's thread pool model was not there yet (so you were not completely incorrect). However, core.async later reimplemented go blocks using virtual threads when available. Improvements are still underway: https://clojure.org/news/2025/10/01/async_virtual_threads
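For context, the JDK 21 primitive that such integrations build on is small: spawn one cheap virtual thread per task instead of sizing a pool. A minimal sketch (class and method names are my own):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of JDK 21 virtual threads: one thread per task is cheap,
// so thousands of blocking tasks need no pool tuning.
public class VThreadSketch {
    static int runTasks(int n) {
        AtomicInteger completed = new AtomicInteger();
        // try-with-resources: close() waits for all submitted tasks.
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < n; i++) {
                executor.submit(() -> {
                    completed.incrementAndGet();
                });
            }
        }
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println(runTasks(10_000)); // prints 10000
    }
}
```

This is roughly the model core.async can now delegate to: blocking inside a go block parks the virtual thread instead of tying up a carrier thread.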

> take into account the host differences

Okay, this one is genuinely not that straightforward. The #? reader conditional in .cljc files is a clean, minimal mechanism. I don't really know any other language that can target completely different platforms from a single namespace as cleanly - even in Node.js you can't in practice do it as nicely. Kotlin Multiplatform is probably the closest competitor, but its `expect/actual` mechanism requires separate source sets, separate files, and considerably more boilerplate. You're not writing in the same namespace; you're wiring together parallel declarations. Scala.js and GHCJS are essentially separate compilation targets with thinner sharing stories. But yes, it can still get complicated - different hosts have meaningfully different concurrency and I/O models - so it's "shared logic, host-specific edges" rather than "write once, run anywhere". I still think Clojure handles all of this far more elegantly than the alternatives.

So pragmatically speaking, you're pointing at complexity at the implementation/runtime layer, while Clojure's complexity reduction happens at a different layer entirely - data model, immutability by default, simpler concurrency reasoning, REPL workflow. Those layers mostly don't interfere with each other. You mentioned real concerns at the platform engineering level, but in practice they don't touch what Clojure is actually trying to simplify. Someone writing Clojure code never experiences invokedynamic problems one way or the other.


The current market share shows how far you can go with just being a better Java.

If (or when? I haven't checked recently) a decent and well-thought-out LLVM backend emerges for it, ideally without new underlying complexity seeping through, the market share might expand overnight.

And as for C++, while some complexity is certainly unavoidable, rigorous complexity control is desperately needed. Ideally in the same way the Bell Labs folks did when they initially conceived Go from Algol 68, C, and similar (before or after joining Google; I couldn't tell), and Rich Hickey did when he initially designed Clojure.

Some people manage the complexity using style guides and clang-tidy checks, which is great in that it doesn't need lengthy language-committee decisions. But that approach hasn't been enough to make code _sufficiently_ safe; every now and then an enterprising engineer or team finds a way to abuse a feature in a way that produces unsafe or unpredictable results. Rust is a bit better and solves a few of the common problems, but sadly the list of potential issues (of using Rust in a codebase at scale; the engineers' faults, not Rust's) is long and growing.

My verdict is that we need both complex and simple LLVM languages, co-designed to have no interop problems by design, allowing some logic to be expressed in the simple parts and some in the complex parts. Or better, a three-tier design would be nearly perfect: an expressive config language, a glue-and-research language, and a core-building-blocks language. I think a Clojure-style language can be designed to achieve all three.


All good, except that going forward, the new languages being designed are going to be specification and formal-verification languages for agents.

I think the way of classical programming languages is behind us, unless AI implodes and we are back to programming without it.


> That is because Clojure is done

Yes, that's one approach to avoiding ever growing complexity, maybe the other languages should try it sometime ;)

With that said, everything around Clojure keeps improving. While the language doesn't have static types, clojure.spec offers something that is even better than static typing (imo), and it doesn't even require any changes to the core language. Something other mainstream languages could learn from too.


Is Typed.Clojure finally stable and sound?

In theory we only need parentheses, prefix operators, and a REPL, but the mainstream never went down that route.

Anyway, the complexity then ends up in custom DSLs and macros.


> Is Typed.Clojure finally stable and sound?

It feels as if you're operating with a few keywords you picked up without fully understanding their meaning. Typed Clojure was a PhD research project. Experimental. CircleCI experimented with it at one point, but the friction was high enough that it never became standard practice - the annotation burden was significant. You'd be writing a lot of type scaffolding for a language whose entire value proposition includes getting things done with less ceremony. Clojure's power comes heavily from its data orientation: maps, sequences, heterogeneous data flowing through pipelines. Traditional type systems are deeply uncomfortable with that style. You end up either constraining how you write Clojure, or writing very complicated types to describe simple data flows. Types are there - Clojure does have types - and OMG, non-Clojure coders have no idea how expressive they can be. There's just no static checking, and that's for good reasons.

Does Typed Clojure solve complexity? - No, not really. Complexity is about incidental complexity from complecting things - and a type system doesn't untangle that (according to Rich Hickey). You can have beautifully typed spaghetti.

I'm not against static typing, and I have used languages with really nice type systems. But honestly, whenever this point pops up on forums and people go "meh, Clojure is not typed", I immediately know they probably have only shallow experience working with Clojure code.


No, I happen to know that some things people throw around haven't delivered on their initial expectations.

Just as you confirmed with your lengthy reply, by following up on my question.


What do you know about the "initial expectations" of Typed Clojure specifically? I for one have worked with Ambrose on the same team; he's a good friend of mine and we discussed his dissertation at least a few times. I don't argue about static vs. dynamic typing without specific context, because the context always matters. I have an intimate understanding of why Clojure is how it is. I'm not defending it blindly - just like any other PL it has its warts - but you're picking the wrong ones to criticize at face value.


> That is because Clojure is done

First of all: Clojure is not "done". The latest commits were 3 months ago - https://github.com/clojure/clojure. Secondly, the language is intentionally not an "all batteries included" PL. The core is meant to be a stable, minimal substrate. Most of the action happens in libraries and tools - core.async, spec and malli, babashka, nbb, etc. Check the activity in the Clojurians Slack. It's a small but unusually vibrant community; every single day there are news and announcements, updates, etc. It is done in the good sense - like a well-designed tool that doesn't need to keep changing its handle.

> market share adoption kind of shows it

NuBank being the world's largest digital bank and running Clojure at scale is not "adoption"? Besides, there's Apple, Cisco, and tons of smaller companies running on it.

> there is hardly anything being done

They are making a documentary: https://www.youtube.com/watch?v=JJEyffSdBsk Please don't say "well, there are documentaries about dinosaurs" or something. I've been using Clojure for over ten years - in different teams, companies, industries, for my own projects and professionally. I heard about it "dying" back then. I keep hearing about it dying every year, and I promise you, nothing like that is (even remotely) happening. Yes, the hype is gone (was it ever real?), but the language, community, library ecosystem, tooling - all of that is only getting better.

There's no "killing" of Lisp. As long as programming languages remain relevant, there will always be some Lisp-dialect around. It probably never will become mainstream, yet it never completely disappears. There's no killing of Lisp, because it would be like killing "graph theory" or something. Graph theory doesn't need a Fortune 500 company funding it to remain true. Similarly, a small community keeping a Lisp dialect alive is all it takes - and there will always be people drawn to the clarity you get when you strip a language down to its lambda-calculus bones and see the whole thing fit in your head at once.

Rich Hickey has made this point himself - Clojure isn't trying to be the most popular language, it's trying to be correct about certain things. And correctness doesn't go out of fashion.


The issue with C++ complexity is that the standard development model is dysfunctional.

They aren’t able to bring concepts like “a file” into the standard because one standards body member maintains a compiler in some baroque ancient environment without the concept of a filesystem. Too much is forced to be implemented at the library level instead of in the compiler, and it ends up feeling half-baked for real users (modules, coroutines, etc.).


Just like any other language under ISO - Ada, Fortran, COBOL, and, guess what, C, the FOSS-beloved language.



