That's why I don't think Ubuntu is a newbie distro. You never have to compile from source on Arch-based distros. Obviously plain Arch isn't fit for beginners, but I would argue that something like EndeavourOS or CachyOS is easier to use than Ubuntu. If you want to install something, you just run one command, and then it is installed, 99.99% of the time.
I thought this was going to be a longer rant about how Python needs to... Go away. Which, as a long-time Python programmer and contributor, and at one time an avid proponent of the language, is a suggestion I would entertain. I think all of ML being in Python is a colossal mistake that we'll pay for for years.
The main reasons being that it is slow, its type system is significantly harder to use than other languages', and it's hard to distribute. The only reason to use it is inertia. Obviously inertia can be a sufficient reason in many cases, but I would like to see the industry consider Python last, and instead consider TypeScript, Go, or Rust (depending on use case) as a best practice. Python would be considered deprecated and only used for existing codebases like PyTorch. Why would you write a web app in Python? The types are terrible and it's slow. There are way better alternatives.
With that said... there is a reason why ML went with Python. GPU programming requires C-based libraries. NodeJS does not have a good FFI story, and neither does Rust or Go. Yes, there's support, but Python's FFI support is actually better here. Zig is too immature here.
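For a sense of how low-ceremony Python's FFI can be, here is a minimal ctypes sketch calling cos() from the C math library, with no build step at all (the "libm.so.6" path assumes glibc on Linux; adjust per platform):

```python
# Minimal ctypes sketch: call a C function directly from Python.
import ctypes

libm = ctypes.CDLL("libm.so.6")  # platform-specific; glibc Linux assumed
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```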
The world deserves a Python-like language with a better type system, a better distribution system, and not nearly as many dynamism footguns / rope for people to hang themselves with.
I've actually done a fair bit of ML work in Elixir. In practice I found:
1) It's generally harder to interface with existing libraries and models. (Example: whisperX [0] is a library that combines generic Whisper speech recognition models with some additional tools, like dynamic time warping, to create a transcription with more accurate timestamp alignment, something that was very helpful when generating subtitles. But because most of this logic just lives in the Python library, using it in Elixir requires writing a lot more tooling around the existing Bumblebee Whisper implementation [1].)
but,
2) It's way easier to ship models I built and trained entirely with Elixir's ML ecosystem - EXLA, NX, Bumblebee. I trained a few models doing basic visual recognition tasks (detecting scene transitions, credits, title cards, etc), using the existing CLIP model as a visual frontend and then training a small classifier on the output of CLIP. It was pretty straightforward to do with Elixir, and I love that I can run the same exact code on my laptop and server without dealing with lots of dependencies and environment issues.
Livebook is also incredibly nice. My typical workflow has become prototyping things in Livebook with some custom visualization tools that I made, then connecting to a Livebook instance running on EC2 to do the actual training run. From there, shipping and using the model is seamless: I just publish the wrapping module as a library on our corporate GitHub, which lets anyone else import it straight into Livebook and use it.
> NodeJS does not have a good FFI story, and neither does Rust or Go. Yes, there's support, but Python's FFI support is actually better here.
Huh. I've found Rust's FFI very pleasant to work with. I understand that Zig's is second to none, but what does Python offer in this domain that Rust (or Go) doesn't?
Rust's problem is similar to Go's: the language makes some very strong guarantees, and FFI breaks those guarantees, so working with FFI in those languages "infects" the codebase and undermines the value-add of using the language in the first place.
In Rust's case, it's the necessity of wrapping FFI in unsafe. Memory deallocation, e.g. cudaFree(), is just part of the underlying reality; doing manual FFI memory management in a language with a borrow checker rather defeats the purpose of using a language with a borrow checker in the first place. Python lets library authors write __enter__ and __exit__ dunder methods to ensure that memory deallocation is handled correctly via Python context managers, which is a much more elegant abstraction. Yes, in Rust you can implement the Drop trait, but then the caller needs to remember to put the object in its own block... like I said, it's definitely possible with Rust, it's just not as nice of a story.
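To make the Python side concrete, here's a minimal sketch of that pattern; `lib.alloc`/`lib.free` are hypothetical stand-ins for a ctypes/cffi binding around something like cudaMalloc()/cudaFree():

```python
# Hypothetical wrapper: `lib` stands in for a C binding exposing
# alloc()/free(), analogous to cudaMalloc()/cudaFree().
class DeviceBuffer:
    def __init__(self, lib, nbytes):
        self.lib = lib
        self.ptr = lib.alloc(nbytes)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Runs on normal exit *and* on exceptions, so the C-side
        # memory is always released.
        self.lib.free(self.ptr)
        return False  # don't swallow exceptions

# Usage: deallocation is tied to the block and enforced by the runtime.
# with DeviceBuffer(lib, 1 << 20) as buf:
#     kernel(buf.ptr)
```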
I don't see how what you describe doesn't in general apply to FFI between any languages with different resource management philosophies. In particular:
> Yes, in Rust you can implement the Drop trait, but then the caller needs to remember to put the object in its own block...
Why would you need to remember to put the object in its own block? If you want to manually control deallocation, just call drop manually (or put the object in its own block if you really prefer). If you don't care, just let the Rust compiler pick a time to drop. In both cases, the most important guarantee – that drop doesn't happen while references to the object live – is still upheld.
> Python lets library authors write __enter__ and __exit__ dunder methods to ensure that memory deallocation is handled correctly via Python context managers, which is a much more elegant abstraction
What's stopping you from writing a WrapperPtr and implementing the Drop trait for it in Rust? That would achieve the same thing as the dunder methods in Python.
> The world deserves a Python-like language with a better type system, a better distribution system, and not nearly as many dynamism footguns / rope for people to hang themselves with.
C#/.NET? (Their overly strong focus on worthless backwards compatibility and very slow development of basic language features notwithstanding.)
Admittedly I haven't used C# in a few years, but to my knowledge it is much more ergonomic than Java, and personally it's my preferred language. The only thing stopping me from using it more is that it has a much smaller community than Java/Python etc. Wondering what you think is missing.
Its type system is miles better than Python's, and it has some basic stuff Python doesn't have, like block scope. Functional programming is also intentionally kind of a pain in Python, with the limited lambdas.
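A quick sketch of both complaints:

```python
# Lambdas are limited to a single expression; statements are not allowed.
square = lambda x: x * x                 # fine
# logged = lambda x: print(x); return x  # SyntaxError: no statements

# And there is no block scope: loop variables leak into the
# enclosing function/module scope.
for i in range(3):
    pass
print(i)  # 2 -- `i` is still alive after the loop
```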
If TypeScript had the awesome Python stdlib and the NumPy/ML ecosystem I would use it over Python in a heartbeat.
TypeScript also has significantly better performance. This is largely thanks to the browser wars funnelling an insane amount of engineering effort into JavaScript engines over the last couple of decades. Node.js runs V8, the JavaScript engine used by Chrome, and Bun uses JavaScriptCore (JSC), written for Safari.
For IO-bound tasks, it also helps that JavaScript has a much simpler threading model, and it ships an event-based IO system out of the box.
You can define a named closure in Python; I do it from time to time, though it does seem to surprise others sometimes. I think maybe it's not too common.
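For anyone surprised by it, something like this (a trivial example):

```python
def make_counter(start=0):
    count = start

    def increment(step=1):  # a named closure over `count`
        nonlocal count
        count += step
        return count

    return increment

counter = make_counter()
counter()  # 1
counter()  # 2
```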
TypeScript is a really nice language even though it sits on a janky runtime. I'd love a subset of TypeScript that compiles to Go or something like that.
TypeScript is ubiquitous in web development, and there are some amazing new frameworks that reuse TypeScript types on the server and client (tRPC, TanStack). It's faster than Python, has ergonomic types, and a massive community plus the npm ecosystem. Bun (which Anthropic just bought and uses for Claude Code) advances the state of the art for runtime performance.
Those are both valid reasons to use both languages. The "only" (whether true or not) is what the argument hinges on. It is roughly the same as saying that the only advantage of X is that it is popular, but Y is also popular and has additional advantages, therefore, Y is better than X. That is a valid argument, whether the premises are true or not.
I do not disagree, but if you are going to say that "X" is only used because of "Y", then maybe, if you are pitching "Z" instead of "X", do not start with the "Y" :)
But it comes from JavaScript and inherits its issues. It is used mostly because there is no other option on the web; its popularity comes from that lack of options, not its quality. The web is a closed ecosystem with just one language, JavaScript, plus whatever other languages can be compiled to it. And TypeScript is a band-aid that builds a sensible type system on top of JavaScript, while still leaving the option open to fall back on JavaScript's lack of rules.
Concurrency in JS also looks junky.
And on readability Python also wins (if we prohibit stupidities like PEP 572)
Typescript is a lot nicer than Python in many ways. Especially via Deno, and especially for scripting (imports work like people want!).
There are some things that aren't as good, e.g. Python's arbitrary precision integers are definitely nicer for scripting. And I'd say Python's list comprehension syntax is often quite nice even if it is weirdly irregular.
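A small illustration of both points:

```python
# Arbitrary-precision integers: exact, no overflow, nothing to opt into.
print(2 ** 200)  # exact 61-digit integer, no BigInt ceremony

# List comprehensions: concise, but the clause order is irregular --
# the expression comes first, then the loops read left to right.
flat = [x for row in [[1, 2], [3, 4]] for x in row]  # [1, 2, 3, 4]
```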
But overall Deno is a much better choice for ad-hoc scripting than Python.
I agree, but bigints don't survive JSON because in practice parsers follow JavaScript and treat all numbers as 64-bit floats; the JSON spec itself only promises interoperability within that range. Any other kind of number in JSON is effectively nonstandard.
JavaScript itself supports bigint literals just fine: just put an 'n' after your number literal, e.g. 0xffffffffffffffn.
There's a whole bunch of features I wish we could go in and add to JSON: comments, binary blobs, dates, and integers/bigints. It would be so much nicer to work with if it had that stuff.
I'd love to replace Python with something simple, expressive, and strongly typed that compiles to native code. I have a habit of building little CLI tools as conveniences for working with internal APIs, and you wouldn't think you could tell a performance difference between Go and Python for something like that, but you can. After a year or so of writing these tools in Go, I went back to Python because the LOC difference is stark, but every time I run one of them I wish it was written in Go.
(OCaml is probably what I'm looking for, but I'm having a hard time getting motivated to tackle it, because I dread dealing with the tooling and dependency management of a 20th century language from academia.)
Have you tried Nim? Strongly and statically typed, versatile, compiles down to native code via C, interops with C trivially, has macros and stuff to twist your brain if you're into that, and is trivially easy to get into.
That looks very interesting. The code samples look like very simple OO/imperative style code like Python. At first glance it's weird to me how much common functionality relies on macros, but it seems like that's an intentional part of the language design that users don't mind? I might give it a try.
Yes, Go can hardly be called statically typed when they use the empty interface everywhere.
Yes, OCaml would be a decent language to look into. Or perhaps even OxCaml. The folks over at Jane Street have put a lot of effort into tooling recently.
> Yes, Go can hardly be called statically typed, when they use the empty interface everywhere.
How often are you using any/interface {}? Yes, sometimes it's the correct solution to a problem, but it's really not that common in my experience. Certainly not common in ways that actually make life hard.
Also, since generics, I've been able to cut down my use of the empty interface even further.
You can replace Python with Nim. It checks literally all your boxes (expressive, fast, compiled, strong typing). It's as concise as Python, and IMO Nim's syntax is even more flexible.
I bounced off OCaml a few years ago because of the state of the tooling, despite it being almost exactly the language I was looking for.
I'm really happy with Gleam now, and would recommend it over OCaml for most use cases.
I always assumed a runtime specialized for highly concurrent, fault-tolerant, long-running processes would have a noticeable startup penalty, which is one of the things that bothers me about Python. Is that something you notice with Gleam?
Did you consider using F#? The language is very similar to OCaml, but it has the added benefit of good tooling and a large package ecosystem (can use any .NET package).
I've heard a lot of good things about F#, but I've also heard that C# has taken all the best features from F# and that its development has since slowed down; I don't know how true that is. There's also just some irrational anti-Microsoft bias on my part: even though I know .NET runs fine on Linux now, the idea still felt weird to me. I suspect if I'd actually tried F# I would have stuck with it.
I have looked at the Fable compiler for F#, which lets you compile F# to Rust. Very cool!
Rust might be worth a look. It gets much closer to the line count and convenience of the dynamic languages like Python than Go, plus a somewhat better type system. Also gets a fully modern tooling and dependency management system. And native code of course.
I suppose you could try TypeScript, which can compile to a single binary using Node or Bun. Both Bun and Node do type stripping of TS types and can compile a CLI to a single-file executable. This is what Anthropic does for Claude Code.
"> I think all of ML being in Python is a colossal mistake that we'll pay for for years.
Market pressure. Early ML frameworks were in Lisp, then eventually Lua with Torch, but demand dictated the choice of Python because "it's simple" even if the result is cobbled together.
Lisp is arguably still the most suitable language for neural networks for a lot of reasons beyond the scope of this post, but the tooling is missing. I’m developing such a framework right now, though I have no illusions that many will adopt it. Python may not be elegant or efficient, but it's simple, and that's what people want.
Gee, I wonder why the tooling for ML in Lisp is missing even though the early ML frameworks were in Lisp. Perhaps there is something about the language that stifles truly wide collaboration?
I doubt it considering there are massive Clojure codebases with large teams collaborating on them every day. The lack of Lisp tooling and the prevalence of Python are more a result of inertia, low barrier to entry and ecosystem lock-in.
I swear the only people who care about Python types are in Hacker News comments. I've never actually worked with or met someone who cared so much about it, and the ones who care at all seem just fine with type hints.
The people we happen to work with are an incredibly biased sample of all software engineers.
As an example, almost everyone I've worked with in my career likes using macOS and Linux. But there are entire software engineering subcommunities that stick to Windows. For them, macOS is a quaint toy.
If you’ve never met or worked with people who care about typing, I think that says more about your workplace and coworkers than anything. I’ve worked with plenty of engineers who consider dynamic typing to be abhorrent. Especially at places like FAANG.
Long before TypeScript, before Node.js, before even "JavaScript: The Good Parts", Google wrote their own JavaScript compiler called Closure. The compiler is written in Java. It could do many things, but as far as I can tell, the main purpose of the compiler was to add types to JavaScript. Why? Because Googlers would rather write a compiler from scratch than use a dynamically typed language. I know it was used to make the early versions of Gmail. It may still be in use to this day.
How much does Python really impact ML? All of the libraries are wrappers around C code that uses GPUs anyway, training is distributed, and inference can be written in faster languages for serving anyway.
You're thinking only about the final step where we're just doing a bunch of matrix computation. The real work Python does in the ML world is automatic differentiation.
Python has multiple excellent options for this: JAX, PyTorch, TensorFlow, autograd, etc. Each of these libraries excels at different use cases.
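In JAX, for instance, differentiation is just a function transform (a minimal example):

```python
import jax

f = lambda x: x ** 3 + 2 * x
df = jax.grad(f)    # builds the derivative: 3x^2 + 2
print(df(2.0))      # 14.0
```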
I also believe these are cases where Python the language is part of the reason these libraries exist (whereas, to your point, for the matrix operations pretty much any language could implement these C wrappers). Python does make it easy to perform meta-programming and is very flexible when you need to manipulate the language itself.
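To make that concrete, here's a toy sketch of the trick autograd-style libraries build on: operator overloading via dunder methods quietly records a graph while the user writes ordinary arithmetic (a simplified illustration, not any library's actual implementation):

```python
# Toy reverse-mode autodiff via __add__/__mul__ overloading.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = x * x + x   # ordinary-looking Python, but it builds a graph
y.backward()
print(y.value, x.grad)  # 12.0 7.0  (d/dx of x^2 + x at x=3 is 7)
```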
It’s especially frustrating that dependency hell seems to be embedded in the Python culture. The amount of “oh no this lib will only work with Python 3.10+ and a slew of other libs at random versions we won’t bother to tell you” while some other lib that it depends on will only work on “3.8.56z but not if you look at it funny and only if you say pretty please” is maddening. Semver is apparently not standard practice either.
I am probably biased against Python, so take this opinion with a grain of salt, but it feels to me like a whole ecosystem of amateur software devs (but professional ML-engineers, data scientists etc) cobbling together something that barely works.
I’m old enough at this point that I remember the whole old guard of software engineers falling over themselves to hate on JS and Node, call the ecosystem immature, and be quick to point out how that is not “real” software. But in the past 10-15 years it appears JS and Node got their shit together, while Python is still completely and utterly stuck in managing dependencies and environments like it is 2012. And if you ask professional Pythonistas about this, you always get an answer like “oh it’s actually really easy, you must not have taken the time to really look at it, because Python is easy, it’s just pseudocode look how amazing it all is”
I really wish ML hadn't standardized on Python. As a user of ML tools and frameworks, but not a full-time ML engineer, it's just constant pain.
>It’s especially frustrating that dependency hell seems to be embedded in the Python culture. The amount of “oh no this lib will only work with Python 3.10+ and a slew of other libs at random versions we won’t bother to tell you” while some other lib that it depends on will only work on “3.8.56z but not if you look at it funny and only if you say pretty please” is maddening. Semver is apparently not standard practice either.
There are no problems with this in modern Python if you just use the right tooling.
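For example, with PEP 723 inline script metadata and a resolver like uv, a script declares its own requirements and runs in an isolated environment (a minimal sketch; the requests dependency is just for illustration):

```python
# /// script
# requires-python = ">=3.10"
# dependencies = ["requests>=2.31"]
# ///
# Running this with `uv run script.py` resolves and installs the
# pinned dependencies into an isolated environment automatically.
import requests

print(requests.get("https://example.com").status_code)
```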
Not really applicable to ML. The massive amount of compute running on the GPU is not executing in Python, and is basically the same regardless of host language.
You say he's narrow-minded, but you focus on the least relevant thing of everything he said, speed, and suggest that, somehow, something with "fast" in its name will fix it?
Speed is the least concern, because things like NumPy are written in C and the overhead you pay is in the glue code and FFI. The lack of a standard distribution system is a big one. Dynamic typing works well for small programs and teams but does not scale when either dimension is increased.
But pure Python is inherently slow because of language design. It also cannot be compiled efficiently unless you introduce constraints into the language, at which point you're tackling a subset thereof. No library can fix this.
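The usual illustration: nothing about a Python object is fixed at compile time, so an ahead-of-time compiler can't specialize attribute access without runtime guards. A toy example:

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)

# Attributes and even methods can be rewritten at runtime, per
# instance or per class, so `p.x` can never be compiled down to a
# fixed offset load without guard checks.
p.z = 99                                   # new attribute, this object only
Point.norm = lambda self: self.x + self.y  # new method, every instance
del p.x                                    # now p.x raises AttributeError
```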
Very little of what you're claiming is relevant for FastAPI specifically, which for a web app isn't far in speed from an equivalent written in Go. You need to research the specifics of the problem at hand instead of making broad but situationally incorrect assumptions. The subject here is web apps, and Python is very much a capable language in this niche as of the end of 2025, in terms of speed, code elegance, and support for static typing (FastAPI is fully based on Pydantic) - https://www.techempower.com/benchmarks/#section=test&runid=7...
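For reference, here is roughly what typed request handling looks like in FastAPI (a minimal sketch): the handler is fully type-hinted, and the body is validated by Pydantic before user code runs.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

@app.post("/items/")
def create_item(item: Item) -> Item:
    # The request body is parsed and validated against Item by Pydantic;
    # a bad payload gets a 422 before this code ever runs.
    return item
```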
> But pure Python is inherently slow because of language design. It also cannot be compiled efficiently unless you introduce constraints into the language, at which point you're tackling a subset thereof. No library can fix this.
A similar point was raised in the other Python thread on CPython the other day, and I'm not sure I agree. For sure, it is far from trivial. However, GraalVM has shown us how it can be done for Java with generics. At a high level: take the app, compile it, and run it. Compilation takes care of any literal use of generics; running the app takes care of initialising classes and memory; and runtime instrumentation can be added to catch invocations of generics otherwise missed. Obviously, a lot of details have to be gotten right for this to work. But it can be done.
Implying that the existence of your preferred tool in another programming language makes other, equally impressive tools something akin to a "[colossal] mistake that we'll pay for for years", "simply motivated by inertia", is way below the level of discussion I would expect from Hacker News.
I would have given the OOP the effort and due respect in formulating my response if it was phrased in the way you're describing. It's only fair that comments that strongly violate the norms of substantive discourse don't get a well-crafted response back.
Who really cares? The goalpost of "AI is useless because I can't vibe code novel discoveries" is a strawman. AI and vibe coding are transformational. So are AI-enhanced efforts to solve longstanding, difficult scientific problems. If cancer is cured with AI assistance, does it really matter if it was vibe-cured or state-of-the-art-lab-cured?