
> There should be one-- and preferably only one --obvious way to do it.

This is so hilariously wrong in python though



So I imagine you have that perspective because you started less than 20 years ago. In some ways the idea of the Pythonic Way to do things evolved in opposition to Perl's vigorous advocacy of More Than One Way.

Python has been really winning for some time, so it's natural that its ideological discipline has grown ragged. The crop of kids who value options above consistency don't have the scars of the Perl age to inform their prejudices.

But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.


Back when I decided it was time to add a scripting language, Perl and Python seemed like the obvious choices, and in my mind were equally good options. I asked my best friend which I should choose, and he more or less said, "You can't go wrong with either one, but when you ask for help Perl people are assholes and Python people are nice."

I can't confirm his thoughts on Perl and I haven't interacted much with Ruby, but the Python community is definitely welcoming and patient in my experience. I wouldn't be surprised if this was a significant factor in Python's prevalence over Perl, Ruby, or anything else.


yep, the Perl community kind of had issues around the turn of the millennium, and the Perl 6 debacle did a lot to convince people that Perl was kind of a dead end.

I don't think there was any toxicity in the Ruby community, but it was made up of working programmers, whereas the big leading voices in the Python community were teaching assistants and students, so it might have been more tailored to newbies.

I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older, so I think the real answer lies in why the educational sector decided that teaching Python was easier, and significant whitespace plays a huge part there.


> I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older

Yeah, that's my recollection too. Around 2011 there weren't a lot of Python jobs yet. Perhaps in SV, but not out in the real world. Several startups were using it, including YouTube and Google at the time.

But in the F500 world, Python wasn't used at all. I started using it around 2008/9.


> [...] the Perl 6 debacle did a lot to convince people that Perl was kind of a dead end.

Not GP, but the Python 2 vs 3 holy wars were also something that kept me from adopting Python as a scripting language for a couple of years.


Yeah, the Python 2->3 transition was painful. But I would argue that was self-inflicted. Guido and company chose not to develop a 2.8/2.9/etc. series where people could move their code bases over incrementally.

I mean, I love python, but that sucked!

Yes, it would have been more work for the devs, but the amount of work it meant for the users was worse.

In fact, they've pretty much thrown away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio syntax being the notable one).


> Guido and company chose not to develop a 2.8/2.9/etc. series where people could move their code bases over incrementally.

That is literally what 2.7 was, along with some features being reimplemented in later Python 3 releases (up to 3.4).

The core team definitely had the wrong transition model at the start, and it took some time for the community to convince them and then to decide which items were the most important, but let’s not act like they did not get it in the end.

> In fact, they've pretty much thrown away anything before Python 3.6 now anyway. Many things introduced in the 3.x series before 3.6 just don't work anymore (the old asyncio syntax being the notable one).

What?


What would have been a better transition model? Are there any languages with major breaking changes that have done the upgrade smoothly?


> What would have been a better transition model?

Better supporting cross-version transition codebases.

The core team initially saw the transition as “run 2to3, fix what’s left, publish updates, P2 is gone”, but aside from 2to3 being quite limited, such a transition is quite hard for dependencies: it means they leave all older dependents behind entirely (dependents which might be the primary users of e.g. company-sponsored projects), or they have to keep two different codebases in sync (which is hard), plus there are the limitations of PyPI in terms of version segregation.

What ended up happening instead was that libraries would update to 2.7, then to cross-version codebases; this way their downstreams could migrate at their leisure, and a few years down the line people started dropping P2.
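
For anyone who missed that era, a minimal sketch of what such a cross-version codebase looked like: `__future__` imports plus the try/except import dance around the stdlib reorganization.

    from __future__ import print_function, unicode_literals

    try:
        from urllib.parse import urlparse   # Python 3 location
    except ImportError:
        from urlparse import urlparse       # Python 2 location

    print(urlparse("https://example.com/a/b").netloc)  # example.com on both

Libraries like six and python-future existed largely to paper over exactly this boilerplate.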

> Are there any languages with major breaking changes that have done the upgrade smoothly?

Some, but usually statically typed languages. E.g. Elm’s upgrade tool worked pretty well as long as you didn’t have native modules and all your dependencies had been ported. I think the Swift migrator ended up working pretty well after a while (Swift broke compatibility a lot “initially”), though I’ve less experience with that.

An alternative, again for statically typed languages more than dynamically typed ones, is to allow majorly different versions of the language to cohabit e.g. editions in Rust (Rust shares the stdlib between all editions but technically you could version the stdlib too).

Not workable for Python, not just because it doesn’t really have the tooling (it has some with the __future__ imports, but nowhere near enough) but also because it changed runtime data model components, specifically the entire string data model, which is not a small matter (and was by far the most difficult part of the transition, and why they piled smaller breakages on top while they were at it).


The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

Of course I already knew Python, and so did the rest of my team, so we had been doing tools in Python (the guy wasn't on my team), but until he pushed Ruby into places where Python would have been better (import a Python library rather than shell out to a program) I was willing to accept it was probably fine.


> The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

I mean, if you read almost any Elixir article that has hit the front page of HN, there are always comments from Pythonistas saying, "Why bother when there's Python?" Similar attitude. Obviously it's not everyone, but it's not everyone in the Ruby community either.


It’s such a bizarre reason. “One person using something rubbed me the wrong way so I decided not to use it”. Did the person extrapolate to an entire community from a sample size of 1?


    The only Ruby person I've met was insistent that Ruby was the one true way
That sucks. I've been doing Ruby full-time since 2014 at 4 companies and I've never seen that sentiment, even from people who really love it. My experiences have been really positive.


I agree. I’ve found ruby and its developers to be pretty friendly and open to other languages and styles.


My experience with Python was simply people wanting to get shit done. This was circa 2008. They weren't really engaging in language wars, but doing innovative things like extending Java with Jython.

I was arguing for the F500 company I was working at to explore using Jython to write unit tests for Java code.

Why not have a scripting language to write unit tests for Java code?
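
For context, this is roughly what that looks like under Jython 2.x, which can import Java classes directly (ArrayList is just a stand-in here for whatever production class you'd actually test):

    import unittest
    from java.util import ArrayList   # a plain Java class, importable under Jython

    class ArrayListTest(unittest.TestCase):
        def test_add_and_get(self):
            xs = ArrayList()
            xs.add("hello")
            self.assertEqual(xs.size(), 1)
            self.assertEqual(xs.get(0), "hello")

    if __name__ == "__main__":
        unittest.main()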

I see this with Rust trying to extend python in interesting ways. I don't see this with Java trying to extend C/C++ or Python.


Python and Ruby have some things where the intuitions are exactly inverted from one another. It took me a long time to figure out why Python rubbed me the wrong way, and that if I dug up how I used to structure code in Pascal, it's fine.

Not that I care much these days since I prefer writing in Elixir.


> I haven't interacted much with Ruby

“Matz is nice and so we are nice” https://en.wiktionary.org/wiki/MINASWAN :)

The Rails community is another story, unfortunately.


That’s funny because that’s one of the reasons I tend to point beginners to R instead of Python for data work.


I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.


> I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.

TBH, community had nothing to do with Python's enormous success over its competitors (Perl and Ruby, possibly Tcl too).

Nor did any technical merit, nor ergonomics.

There's one, and only one, reason why Python exploded at the expense of the other competitors: The ease and acceptance of using the language as glue for C functions.

Python's popularity is built on a solid foundation of compatibility with C.

If, in the 90s, C++ had taken off enough to displace C, I doubt Python would be as popular as it is. Python owes its ubiquity to C, because if C was not ubiquitous, Python wouldn't be either.

(It's only recently, like the last 10 years or so, that I started seeing popular Python programs which didn't have a dependency on C libraries. And even now, it's still rare.)
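
As an illustration of that glue role, the stdlib's ctypes module is one low-friction path (a sketch; the library lookup shown assumes a POSIX system where the C math library is a separate shared object):

    import ctypes
    import ctypes.util

    # Load the C math library and describe cos()'s C signature.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.cos.restype = ctypes.c_double
    libm.cos.argtypes = [ctypes.c_double]

    print(libm.cos(0.0))   # 1.0 -- a C function called straight from Python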


I don't think your analysis is accurate.

My experience of trying to get my own C functions usable from Python has been nightmarish. Yes, you can do it... if you have exactly the same compiler & version used to produce the Python interpreter itself.

C's only usefulness to Python is that it allows optimization per the 80/20 or 90/10 rule, so performance doesn't have to totally suck with Python.

Python 'won' IMHO because it hit a sweet spot -- simple enough for beginners, in fact beginner-friendly, while having a good basic set of datatypes (lists, tuples, sets, plus the usual ints, floats, and complex) that allowed complex ideas to be compactly expressed. The ability to switch between functional and imperative styles also helped.

Python is a 'good enough' lisp. MIT switched, and Norvig has said as much.

No, the astonishing thing is that Python survived the 2->3 transition and came out stronger on the other end. Language cleanups, new 'syntactic sugar' (e.g. @ as the decorator syntax), and what you see is Python actively trying to steal all the successful programming paradigms under one unified syntax.

Is python perfect? Hardly. But it's beginner-friendly and expert-optimized. AND, unlike C++ (at least for me), you can get ALL of Python into your head at the same time. (Libraries, ok, but true in any language). In this specific sense, it is exactly like C (you can keep it all in your head, even the edge cases).

There are newer languages gunning for a piece of Python's mindshare (Zig, Nim). But because Python is a moving target, getting better and better, the others will need to provide a spectacular use-case advantage --- and I just don't see that happening.


Matz (Ruby's author) is nice and so we are nice, isn’t so bad. It’s twee-sounding, but saying you are going to follow the example set by the founder is absolutely fine. Is it any worse than the Python ‘benevolent dictator for life’ example?


I'm going to answer your question directly: No, it's not worse.

My interactions with Guido haven't been awesome, but people put up with him regardless. The other people in Python have been awesome.


I'm genuinely curious: what's "cringe" about MINASWAN?

(I write mostly Python these days, but have been involved in both communities for a long time, and MINASWAN never particularly stood out to me other than as a cute reminder to be nice.)


There is no problem with someone being nice. It's only a problem when they want you to be nice exactly like them.


It was meant tongue in cheek as I was defending Ruby being the most welcoming community. It's just a bit twee like the voice-over on London Underground - "See it, say it, sorted".


Oh I have met Ruby people and it's a big factor in why I never learnt the language.


> But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.

I would revise that to say that the Pythonic culture of one acceptable way to do things is better matched with a lot of good development practices.

Perl was also very good at finding a Perl way to proceed. It's just that with Perl that often meant a lot of implicitness and "do what I mean", and multiple ways to do it so you could fit it within your preferred style and tailor it to the current project's needs.

That all sounds good until you are confronted with the flip side of the coin, which is that code written with those perspectives is harder to understand when you look at it for the first time or after a long hiatus from the project. That puts a lot of importance on policy and coding standards, which saps effort from just getting something done.

I love Perl, and it's still the language I write most often for work, and it's no great mystery why Python became preferred.


But Ruby took all the best bits of Perl so I'm still perplexed as to why Python "won".


Ruby did have a lot going for it about 15 years ago. Many Java/JSP people jumped ship and got on the ruby train. Ruby was a breath of fresh air comparatively.

Python had a great community though, and the Python Software Foundation went out of its way to make people feel accepted in the community. And frankly, programming is often more of a social activity than most people realize -- particularly for new programmers.

So new programmers tended to lean towards python, because the resources to lean on others were there. And people like Raymond Hettinger, Naomi Ceder, Ned Batchelder, and Yarko Tymurciak were approachable even if the BDFL wasn't.


What did the PSF do to make people feel accepted?


Taking a look at the bigger picture, it does indeed seem like the Perl philosophy lost to the Python philosophy overall.

Looking at the hip n cool languages, not just Python for scripting but surely Go and to some extent Rust as well for native stuff (Dart for scripting also but it didn't outright "win"), these are mostly languages that deliberately simplified things. Yes, even Rust - it needs to be compared to C++ and its biggest feature is basically that it doesn't let you do all the things C++ does, within a very similar model.

The only language that I heard is going against these trends (but I'm largely clueless what it's actually like) is Julia which sort of has its own "lay" niche like Perl did back in the day, and is mostly winning based on the premise that it's a performant scientific language.

The industry obsesses over costs of adoption, stability, maintenance; in short: how to get the most out of the least actual new development. It does make quite a lot of sense, to be honest, although sometimes it's really demotivating at an individual level.

And frankly, "learn the language, duh" usually comes up when the language is complex or unintuitive for no practical purpose. Of course there will be people who always complain if they have to learn anything but I don't think they make the majority, or even a significant minority, in the software engineering world. "Learning the language" is only as much of a virtue as the language itself serves its purpose, which includes easy of use and the readability of someone else's code.


Because the best bits of Perl were kinda trash that Python was smart to avoid.


Yeah, for sure, the Perl influence is why I dislike Ruby. It makes it really hard to read if you haven't been doing it constantly.


Python's clean, obvious syntax is what drew me to it. They actively decide not to do things because they would detract from the cleanliness of the syntax. Very often, less is more. My biggest fear is Python might be forgetting this, which I see in things like the := operator, etc.
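
For reference, that's the assignment expression ("walrus") operator added in Python 3.8; it binds and tests a value in a single expression:

    import re

    line = "retrying in 30 seconds"
    # Pre-3.8 you'd either call search() twice or split this into two statements.
    if (m := re.search(r"\d+", line)):
        print(m.group())   # 30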


God forbid we should have to familiarise ourselves with a language before using it.


God forbid things be intuitive.


Funny, because 99% of current regex libraries use Perl's regex extensions.


Not just practices, tooling. The whole typing system. Linting to check for mistakes. For the love of Black.


Python's static typing still feels very, very clunky and bolted-on, and the tooling around it was rather bad, like mypy. In general I definitely wouldn't say Python tooling was good. It's improving rapidly with ruff though, just as JavaScript did when esbuild appeared.


Dumb question: I know the built-in typing (import typing) is limited in some ways, but it works pretty well for most basic needs. So what does mypy add? I'm super interested in adding typing to some code we have, but I'm just a bit confused by the choices available.

Would mypy with pydantic be a good combination or do they overlap?


The two are complementary: the built-in `typing` module only provides type annotations, not type checking. Mypy provides the latter, via the type annotations that you (and others) add to Python codebases.

Pydantic overlaps with mypy in the sense that it provides a mypy plugin, but otherwise it's just like any other Python library.
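
To make the division of labor concrete, consider a hypothetical example.py (the file name and function are made up for illustration):

    def greet(name: str) -> str:
        return "hello " + name

    greet(42)   # CPython runs this file anyway (and fails at the `+`);
                # running `mypy example.py` instead flags it statically with
                # an error like: Argument 1 to "greet" has incompatible
                # type "int"; expected "str"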


So mypy runs "at run time"? I guess that makes sense, I thought the annotations provided some form of checking too, but now I realize that I should really spend some time to inform myself better :').


Sort of -- mypy is its own standalone program, and you run it to typecheck your program's type annotations. It does some evaluation of the Python program (specifically imports and some very basic conditional evaluation, among other things), but it never really executes the codebase itself.


Mypy runs as a type linter/checker. See https://mypy-lang.org/


No, it's more that pydantic runs "at run time" while mypy does not.


Isn't Python's tooling usually considered some of the worst?


By what standard? Yours? Java developers'? Rust developers'? Given that Python has typing now, tools to check it, and plenty of tooling around “make it faster”, I think your worldview might be stuck in 2010. Python has gone from a slow obscure scripting language to a powerhouse. Conda, NumPy, scikit-learn, PyTorch, GPU programming, games, analytics, web apps, APIs: I think it’s safe to say this ain’t your grandpa’s Python anymore.


I have seen Python in action since 2010 plenty of times, including the present. It's been a mess every time. My worldview is driven by F# and Elixir, namely the `dotnet` and `mix` tooling, respectively. So no, Python's tooling does not impress me in the slightest. The first thing you need to do is get agreement on all the different checkers, linters, formatters. For Elixir, there's one tool for each task. My worldview feels quite current.

Those other things you mentioned aren't relevant to tooling. They're distributions, libraries, and applications. But note that Conda's existence was basically due to Python's tooling being poor.


You see tooling in a different view than typical Python users. Python users see things like IPython, notebooks, and the ability to quickly do statistics and plot results as tooling -- to do their jobs. Data science. Machine learning. Things that perplex and confound static-typing OOP purists. So yes, by your worldview, Python is a mess. There's no one way to do things; there's no one tool, no one linter, no one formatter. I praise the fact that there isn't. What a boring world. What choice would you have if that tool didn't satisfy your needs? Find another language?

I began in that worldview. The C/C++/Java/.NET world where everything must have a standard, a fixation on a singular consensus. That's not how things work in the open source world of Python, JavaScript, Rust, etc. Will there be gravitation towards a paradigm? Sure, until such time that a new one emerges.

If you looked at Python in the 2.5 days and look at Python today, you cannot argue the tooling hasn't gotten better.


> You see tooling in a different view than typical Python users.

That doesn't surprise me.

I do what you mentioned with F# in Polyglot Notebooks (which support several languages in the same notebook) and Elixir in Livebook all the time, both of which are superior notebook implementations to the outdated Python Jupyter notebooks.

My worldview is not driven by comparing Python to C#, Java, C++, and other such ilk.


So following your logic, "some of the worst" includes the vast majority of actually used languages and tools. Must be cool to be on the special side. :)


They want to be lisp SOOO BAD! =)


+1 for mentioning that Python's original competitor was Perl. This point is forgotten some 20-30 years later.


Not just perl, but C/C++/Java as well. Ruby was a competitor to Java JSP development back in the day. And I remember when a lot of Java people jumped ship to Ruby. I moved from C++ to python over a decade ago and never looked back.

Back then, the python jobs were scarce -- but based upon how I picked up the language, many of the typical C++ issues just disappeared -- and I knew it was going to become popular.

One comp sci professor talking at a PyCon years ago made the point that maybe the best college-level introductory course should have been Python-based rather than SICP-based. His example was that for the first time in 5 years of teaching intro courses he had people coming up to him looking to change majors.


C++ and Python are not competitors. Sure, both are Turing complete, so you can implement anything in either. Whether you should is a different question. Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions. C++ compiles to fast binary code, while Python is typically 60x slower for logic (though Python often calls a fast algorithm implemented in a compiled language for anything significant, so this is hard to measure in typical programs). However, if your programs are not very large and you don't need every last bit of speed, Python is going to be easier to work with. (And frankly, if you're not working with a legacy C++ codebase you should look at Rust, Ada, Go, or something else I'm not aware of; C++ has many ugly warts that you shouldn't need to worry about.)


As a person who uses C++, I must say something that also applies, somewhat, to Java.

We have all the cool kids like Kotlin, Rust, etc.

However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Besides that, C++ has improved a lot over time. It still has its warts and knives but you can code very reasonable C++ with a few guidelines and there are also linters, and -Wall -Wextra -Weverything -Werror. That takes you quite far in real life provided you have a reasonable amount of training.

I would choose C++ over Rust any day. The only hope I have for C++ successors are Cppfront and Carbon and they are highly experimental. As successors those two fit the bill.

There is a third one, my favorite. It is this one: https://www.hylo-lang.org/ but I am not sure how compatible it will/would be.


The one thing Rust is getting right that I hope Carbon et al. take from it is using the type system to manage memory.

Not having to explicitly remember `free` in safe Rust code is amazing. Knowing that if my types are sound, memory will be managed reasonably is great.

I also think that immutable by default, mutable by explicit declaration is pretty great.

I do think there is a lot of room to add better ergonomics on top of these ideas, however.


The amount of complexity that Rust adds is not worth it in most scenarios, in my opinion. I can think of Rust as something for OSes with critical safety needs or so.

Besides that, in real life you end up having C in most of your software, so I am not sure how Rust behaves compared to well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

There are many ways to achieve safety, and the most promising, IMHO, is the path that Hylo programming language is taking. It sticks to Generic programming and mutable value semantics.

The language base draws from Swift and it has very strong people behind that know what they are doing. For example David Abrahams and Sean Parent. This is an implementation in a language of many of the ideas from "Better Code" from Sean Parent. It has a very solid foundation.

Besides being a solid foundation for generic programming, value semantics, and concurrency, what I like the most is how much it simplifies the model by not escaping references all around and preventing unnecessary copies. This removes the (IMHO) mess of having to manage reference escaping constantly in Rust, which is the reason why many patterns such as linked lists are not even possible.

And lists are not just an academic exercise, as I was told sometimes by Rust proponents. Linked structures (immutable linked structures, actually) are important in scenarios such as telco backends where you need replication, fast moving of data, history versioning, rollbacks, and so on.


> The amount of complexity that Rust adds is not worth it in most scenarios, in my opinion. I can think of Rust as something for OSes with critical safety needs or so.

It's difficult to have this discussion in any sane way when Rust (or C) comes up. I tried Rust, but I have projects to deliver on strict timelines and I have yet to find a client who is prepared to pay me for the (what I found to be) very large onramp time to gain deep Rust expertise.[1]

The argument of "just get gud" whenever you point out the deep learning curve of Rust is pointless; I have noticed that Rust experts only come when your employer is rich enough to pay the team to not deliver while they learn it: Basically only FAANGs and startups flush with VC money.

[1] When I have a small project I reach for C. When I need something bigger for which C is not suitable, I don't reach for C++, or Rust, I rather take the tiny performance hit and move to Go. On extremely large projects, where I work with others, C# and Java seem to hit the sweet spot.[2]

[2] Although, C# and Java are also getting a bit too complicated for my tastes too. Seems to me that every language follows C++ evolution towards complexity, because the people stewarding the language are experts in that language and/or in programming language theory.[3] They are comfortable with the changes they propose because they have no need to upskill (they are already at that skill level).

[3] I propose that a language designed by your average corporate developer who has 15 years of experience but no CS degree will have much higher adoption than languages designed by PL purists.


What makes you choose Go over C#/Java for medium projects? And why not go for the large projects?


> What makes you choose Go over C#/Java for medium projects?

Because I said:

>> C# and Java are also getting a bit too complicated for my tastes too.

I abhor complications.

> And why not go for the large projects?

Because I said:

>> where I work with others, C# and Java seem to hit the sweet spot

Yeah yeah, I know it sounds like I am whining (Maybe I am :-), but at least I am complaining about all of them.

Java and C# do appear to have been battle-tested for very large projects that aren't microservices.

Go? I dunno. I've only ever seen very large projects in Go using microservices. I like its simplicity.

My main complaint is that programming languages have too much minutiae to track that I really shouldn't have to be tracking.

Take, for example, asynchronous builtins:

Why are all the explanations wrapped in jargon that only an expert in the language would grok immediately? Promises? Futures? You gotta explain those, with examples, before you can explain what to do with a value from an async call. Go look at the popular introductions to async (say, on MDN for js, or Microsoft for C#, etc) and count how many times they have to explain something because of their leaky abstraction implementation rather than explaining the concept.

How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

That naturally leads into "So you need to check if it is finished using an identifier to identify which scheduled call you want to check"...

Which itself naturally leads to "The identifier you need was given to you when you scheduled the call"...

Which leads to "Using that identifier from `id = foo();`, you wait for the result like this: `result = id.wait()`".

You can even add "You can return that id, or pass it around so some other code can do `id.wait()`".
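
For what it's worth, Python's asyncio maps onto that mental model fairly directly, just with different vocabulary: a Task plays the role of the identifier, and await plays the role of wait().

    import asyncio

    async def foo():
        await asyncio.sleep(0.1)
        return 42

    async def main():
        task = asyncio.create_task(foo())  # schedule the call, keep an identifier
        result = await task                # wait on that identifier for the result
        print(result)                      # 42

    asyncio.run(main())

(Though even here the `async def` coloring leaks through, which is exactly the complaint above.)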

Now they don't explain it this way, because their implementations are more a leaky abstraction exposing irrelevant information about the compiler than a straightforward implementation of the concept. They are unable to separate the concept from their implementation.

The common implementation of async is so counterintuitive that they have to explain their particular implementation instead of the concept, because it has almost nothing to do with the concept. If they just explained the concept, programmers would still be confused because the implementation needs things like colored functions just to work poorly.

The concept of scheduled functions (which may return once, or may yield multiple times before returning), which is a simple path to understanding asynchronous calls, is completely divorced from the implementation which will produce errors like "cannot call asynchronous function from a top level"[1] or "cannot call await in a function not declared as async".[2]

So, yeah, I'm kinda upset at how programming has evolved over the years (I wrote my first program in 1986, so had a good seat for the later part of this particular movie), from "simple and straightforward", to "complex for complexities sake".

[1] Why? Because their abstraction is an abstraction of their implementation, and not an abstraction of asynchronous calls.

[2] Se [1] above.


> How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

This is not always true.


All this may be true (I'm not the strongest C++ developer in the world, relatively limited exposure), however the Rust memory management via the type system feels natural once you wrap your head around it. That idea is really good. I always hated dealing with `delete`, `free` and `malloc`.

Being able to offload all that busy work to the type system is just nice. There are definitely ergonomic improvements that could be made around this.

All the rest? I'll leave that to someone else to talk through, as I'm no expert here.


I write C++ all the time, and I go months between needing new or delete. unique_ptr is a wonderful thing. Not quite as powerful as Rust's borrow checker, but it saves me a lot of thinking.


> however the Rust memory management via the type system feels natural once you wrap your head around it

It disallows many valid patterns. That is why I recommend taking a look at the Hylo programming language (formerly called Val) to see what I think is a very good example of how to make a language safe without making the learning curve so steep and without needing a GC.


The way Rust does it may disallow valid patterns, but it is not inherent to the idea


> well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety. I mean, memory-unsafety is the number one vulnerability cause in software. C++ has many qualities, but safety certainly isn't one of them.


> Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety.

Are C and assembly the same level of memory safety? In theory, probably yes... but no, not in practice.

And C and C++? In theory, the same; in practice... C++ is safer.

How about Rust? In theory Rust is safer. In practice, you are going to use C libraries here and there, so... in practice not as safe as advertised.

Well-written C++ with -Wall -Werror, -Weverything, -Wextra... that is very safe, including detecting even dangling stuff to some extent (gcc 13). If you stick to `shared_ptr` and `unique_ptr`, no matter how much people complain about it, Rust with its C shims and C++ with all linters and a good environment are practically at similar levels of safety.

This is the practical, real thing that happens. I have used C++ daily for around 14 years professionally and 20 years in total.

You are all arguing from theory, but how much Rust and C++ have you really written?

Of course, the CVEs data about memory safety, well, those are true. And they are a real problem. But with a reasonably good use of C++ those would be much, much, much lower than they have been so far.


> However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Yet, somehow people do this with python, perl, and ruby. Google hires professional python people too.


Not only do this, but do it way more successfully. I'll never get tired of repeating that among top YC startups, Java as a primary language contributes to roughly 1% of value, while Python + Ruby are almost at 70%.

https://charliereese.ca/y-combinator-top-50-software-startup...


If by successfully you mean time to market, for sure you are right.

C++ gives more return when you start to save in infra because you have a more efficient language, if coded properly. Same goes for Go vs Python.

The right tool for the right job. I would use (and will; I am on it) Django for the backend of a SaaS that I have. If things start to work relatively well, then I will keep identifying bottlenecks, if there are any, and migrate parts of the workload to C++ in a hybrid architecture.

This allows me to save in infrastructure bills.


The Hylo page says it was formerly Val.

IIRC, Val has been mentioned on HN a few times before.


As painful as the Python package system is, C/C++ is so much worse.

I remember trying to compile GTK+ on a Solaris system a decade ago, and how terrible it was even to get it to compile.

You're really deluding yourself if you think spending your time in compiler dependency hell is so much better than Python's situation.


The new CMake package manager may make things easier.


> Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions

I see this argument a lot, but people often forget that Python is very concise (yet readable) compared to other languages.

100k LOC in Python typically contains way more business logic than C++, so it is only natural to be harder to maintain.


Python is much less concise at this size. Sure, your algorithms are more concise, but you lose all of that and more, because if you don't have 100% test coverage you never know when a stupid typo will make your program crash, while with C++ you can typically be fine with more reasonable coverage, say 80%, where the last 20% is things hard to test that are unlikely to fail anyway. At that scale Python is slower to build as well, because C++ lets your tests depend only on the files that changed, and thus your code-build-test cycle is faster despite C++ being famously slow to compile.


MIT dropping SICP/Scheme for Python coincided with the general decline of education as an end in itself. Python is the VHS of computer languages. I couldn't believe it when I heard MIT dropped a language renowned for its elegant lambda implementation in favour of a language in which lambdas were not only frowned upon but literally throttled. I think it says everything that MIT ditched Scheme and SICP at the same time, as it would be near impossible to teach SICP with Python.



I would argue that Python is the Betamax of computer languages, and C++ is the true VHS.

Fight me.

But seriously, Python is good. Don't let the perfect be the enemy of the good; only computer science people manage to do that.


> the best college-level introductory course should have been Python-based rather than SICP-based

I saw this linked on here recently: https://wizardforcel.gitbooks.io/sicp-in-python/content/


And we're of course forgetting Visual Basic, because who cares about Microsoft-land these days, but back when Python/Ruby emerged, even Windows Server and IIS were relevant, as this was kind of the peak of Microsoft's dominance.


Yes, it is pretty weird to think that the VB.NET flavor was fairly widely used in web dev even as late as 2006.


> There should be one-- and preferably only one --obvious way to do it.

I just wish Python applied this approach to package management. It is needlessly complicated.


The obvious part is definitely lacking, but the long and short of it is basically to ignore all the newfangled solutions. They are much more trouble than they are worth. Just use pip with a venv and a requirements.txt (easier) or a pyproject.toml (cleaner).

I really fail to see what the newer tools actually bring to the table. From my experience they solve no actual problems and just introduce more bugs and weird behavior.
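
Concretely, the suggested baseline is just the built-in venv module plus pip; a minimal sketch of the workflow (POSIX shell shown; on Windows the activate script lives under .venv\Scripts):

    python3 -m venv .venv
    . .venv/bin/activate
    pip install -r requirements.txt   # or: pip install -e .  (pyproject.toml)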


I think Python has some of the worst API documentation I’ve ever read.

Even Java puts it to shame and that is sad


Coming to Python from PHP, it was interesting to see that in the PHP world I'd get 80% of my knowledge from the PHP documentation and 20% elsewhere. In the Python world it's easily the other way around. It also doesn't help that the PHP documentation is more up to date, so I am using the most current info, while for Python their own docs are so bad I rely on other docs, but those all vary on which version of Python they cover and whether they follow current best practice. The difference is night and day, and it's one of the reasons I ended up going back to PHP.


Yeah the PHP documentation is extremely pragmatic and seems designed to get you going quickly with useful examples.

The Python documentation seems to be suffering from some sort of weird snobbery. Very wordy as another comment mentioned. Examples are frequently lacking or inadequate. They seem like they're trying to compete with MSDN in "professionalism" although these days even the MSDN examples are better. There is an entire ecosystem of python tutorial websites around the web that would not exist if the documentation was as helpful as that of PHP.


The stdlib documentation definitely has a unique flavour. I would characterise it as "wordy".


I disagree: Python seems to have a zillion build tools and multiple competing type-checkers.


I like Python, but most of the Zen has always been a meme, not a guideline of design principles for either the language or software written in it.

Besides the one you mention, I also find the "Explicit is better than implicit" line to be against everything Python stands for. The language is chock full of implicit behavior, and strongly encourages writing implicit code. From the dynamically typed nature of the language itself, to being able to define dunder methods that change the behavior of comparison and arithmetic operators.
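
A toy illustration of the dunder point: defining one method quietly changes what an operator means for a type.

    class Vec:
        def __init__(self, x, y):
            self.x, self.y = x, y
        def __add__(self, other):          # `+` now silently dispatches here
            return Vec(self.x + other.x, self.y + other.y)
        def __repr__(self):
            return "Vec(%r, %r)" % (self.x, self.y)

    print(Vec(1, 2) + Vec(3, 4))   # Vec(4, 6)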

I really like Go partly because of this. Not only does the language itself strictly follow the explicit-over-implicit and TOOWTDI principles, it encourages or even enforces them on the programmer.


It is absolutely not a meme, it's PEP 20: https://peps.python.org/pep-0020/. Just because some people don't take it seriously doesn't mean it isn't a part of the language's soul.


I think the Zen was really important for Python's success as well. Having your core values right there, well defined and out in front, wasn't something you got with a lot of languages. Any language that's been around for a long time gets sorta... muddled.


I'm from the outside looking in, so don't take me too seriously, because I tried Python and bounced off of it; there are very likely elegant parts of the language that I never really internalized because I didn't spend enough time working with it.

But speaking as someone who tried Python because I agree with principles like "have one right way to do things" and "be explicit", my initial impression working with language is that much like real souls, Python's soul is metaphysical, unobservable, and doesn't seem to interact much with the physical world in an observable way ;)

If I had to list some of my main criticisms of Python it would be that the language seems to have way too much implicit behavior and seems to have way too many ways of doing everything. I'm going to say something heretical, but it was weirdly enough early Javascript that I found to be a lot more consistent and explicit[0]. Type casting was a disaster of course, dates and the standard APIs were a complete mess, but beyond that it was rare for me to look at Javascript code and think "I have no idea what the heck that is doing." But it happened all the time in Python, it took me a while to get used to things and I still feel generally less capable in Python than I do in other languages that I've spent less time working with. There's so many little syntactic tricks in the language that are... convenient, but I resent having to memorize all of them.

[0]: Until it started messing around with classes and const and Symbols and crap -- the language is probably much harder to learn now than it used to be in the past, but I don't know because I'm disconnected from new users now. But certainly having 3 ways to declare a variable now probably doesn't help new users.

----

As an example, just this weekend I tried to convert a Python codebase from 2.0 to 3.0 and was immediately hit by needing to resolve implicit casting rules about integers, buffers, and strings that were all different now. Python has this weird thing where sometimes it does implicit casting behind the scenes and sometimes it doesn't? There's probably a rule about it, but it's never been explained to me.

So then I wanted to figure out the best way to handle converting a string of hex values into a manageable format so I searched that up and got advised that I should use `struct.unpack` except to be careful because that would give me a tuple instead of a list and also would require me to pass in the length, so instead I should actually use `map(ord, s)`, which prints out something that certainly seems to look like a list, but is not subscriptable, which is not a thing that I knew that I needed to care about but apparently do because the program broke until I cast it back to a list. And probably what I should have done was list comprehension from the start? But it wasn't clear to me if I could do list comprehension on a string or not since I do know that strings in Python technically aren't lists, they're sequences, and anyway list comprehension was not what people were suggesting.
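
For the record, and at the risk of proving the point about sequence types, here is the behavior described (Python 3):

    m = map(ord, "abc")
    # m[0]            # TypeError: 'map' object is not subscriptable
    print(list(m))    # [97, 98, 99] -- materialized into an actual list

    # And the hex-string case has a fairly direct route:
    data = bytes.fromhex("deadbeef")
    print(list(data))  # [222, 173, 190, 239] -- bytes iterate as ints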

And I know it's unfair because this is very beginner stuff in the language, but my immediate thought was, "oh right, Python. Of course when my debugger prints out something that looks like an array of values it might be one of 3 or 4 different types behind the scenes, all of which will error for subtly different reasons. It was silly of me not to see this coming."

Again, fully aware that this is basic stuff that would completely go away with familiarity with the language, but like.. oh my goodness my kingdom for having one array type that just works everywhere and one iterable quality that supports the same manipulations everywhere no matter what the underlying type is. I'm trying to do quick scripts, if I cared about these distinctions and if I cared enough about performance to need multiple ways to have a list of values, I'd have written this in Rust or at least C# or some fully typed lower-level language. There doesn't need to be this many ways in a scripting language to say "I have an ordered collection of values."

I'm not saying you're wrong, I suspect you're right. I suspect the underlying language is much more elegant than what I'm seeing. All I'm saying is just that the initial impressions of Python for people like me who are really inexperienced with the language are anything but the PEP 20 list -- the impressions are the opposite, it's exactly why I bounced off of Python so hard. And I don't think that's individuals doing something weird, that seems baked into the language? Individuals didn't give Python 4 different ways to represent a sequence of values. I don't think it's a few coders' fault that I'm constantly seeing syntax in Python where having the code look prettier seems to be the priority over making it understandable or explicit? Again, take it with a grain of salt, just... I don't know, I always laugh when I see the PEP 20 linked because it's so contrary to how I think the language looks to new users. I could compare this to something like Lisp, which I am also extremely inexperienced with and extremely bad at writing, but when people talk about Lisp having simple rules, I think, "yeah, I see that. I see the system that you're talking about and I see the consistency you're talking about." With Python I just don't see it, the initial impression makes it feel like a language written by graphic designers trying to create something that's pretty rather than systemic.


For what it's worth, I have come around to really enjoying the more recent versions of python, and it's what I'm writing most at the moment, but I totally agree with you here. I don't think of python as being explicit and having only one way to do things. I think that's a pretty inevitable result of trying to both add new things and also keep older things for compatibility.


I code both.

> explicit over implicit

Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

Further, go panics in surprising ways sometimes. A library developer might decide to panic and cause your program to crash, when that library was only used in your program for optional behavior.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly.

Maybe I'm reading it wrong, but it sounds like you had been accustomed to implicit nil type conversions in other languages, and that tripped you up when Go required you to be explicit. It seems Go is the explicit one here.

> A library developer might decide to panic and cause your program to crash

Go is abundantly clear that panics should never cross package boundaries except in exceptional circumstances. And those exceptions should crash your program. An exception means that the programmer made a mistake and the program is now invalid. There is nothing left to do but crash.

It is technically true that a library developer can violate that guidance, but it is strongly encouraged for them to not. The parent did indicate that sometimes the explicitness is only encouraged, not enforced.

Not that Go is some kind of explicitness panacea. It has its own fair share of implicitness. But I'm not sure these two are representative.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

I think it makes sense - a `nonexistent<ThisType>` is different from a `nonexistent<ThatType>`.

In C, Java, C#, C++ and other languages, null/nil is a value. In Go null/nil has a type. Now that I've used Go a little, I actually miss having error messages for certain things, and comparing a `nonexistent<ThatType>` to a `nonexistent<ThisType>` is actually a logic error in my code that I want to be warned about.


I don't think "you can define + on custom data types" to be the same as implicit behavior.

"Explicit is better than implicit" is more around things like Django not just automatically importing every directory it sees (instead requiring you to list the apps used). It's also a decent rationale for changes made from Py2 to Py3 regarding bytes-to-string conversions needing to happen explicitly.

It's also about how usually you end up explicitly listing imports (wildcard imports exist but are pretty sparing), or lacking implicit type conversions between data types.

This stuff is of course dependent on what library you are using, the projects you are working on, etc. And there's a certain level of mastery expected. But I think the general ideas exist despite the language allowing for much more in theory.


This held up much better in the earlier days of Python.

Sooner or later every sufficiently popular programming language is confronted with the dilemma of either breaking backwards compatibility and/or adding "alternative ways to do things".

Pathlib is an interesting one. It even has a correspondence table [1]. The introduction of pathlib made sense because it is just much nicer to use. But you couldn't drop the equivalent functionality from the "os" module, for backwards compatibility. It's just far too widely used.
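
The duplication in question looks like this (POSIX paths shown); both spellings remain alive and supported:

    import os.path
    from pathlib import Path

    os.path.join("data", "out", "report.txt")   # 'data/out/report.txt'
    Path("data") / "out" / "report.txt"         # PosixPath('data/out/report.txt')

    os.path.splitext("report.txt")[1]           # '.txt'
    Path("report.txt").suffix                   # '.txt'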

There is no magic bullet for this one. Either you accept introducing duplication into your language or you go through a hard change (f.ex. the Python 2 to 3 change).

The softest way to migrate might be to flag functionality as deprecated and give users a fairly long time (talking years) before dropping it. But no matter how much time you give the users there will be users who won't change until it's too late.

So there really isn't a winning path for this one.

[1]: https://docs.python.org/3/library/pathlib.html#correspondenc...


That was the one mantra from the zen of python that I always laugh at too.

Because my #1 complaint with Python is that there are so many ways to do the same thing. It sounds nice in theory if you are a new developer, because you can create your own voice and character in Python.

For example, let's say I gave a simple coding puzzle (think LeetCode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would be significantly different.

By comparison, if I gave the same puzzle to some Go engineers, all of the responses would probably be very close to identical.

This sounds fun as a new engineer; it sounds like an argument for using Python. But as you get older and more experienced, you will likely find yourself hating the flexibility of Python, because as you scan any decent-sized codebase you start to be able to identify entire parts of the codebase and who wrote them, without even needing to do a git blame.

I am currently an SRE manager who works in a very polyglot environment (primarily Node/js, bash/sh, python, and Golang with just enough java thrown in to ruin your day). When I read through python codebases, I can identify exactly who wrote it, even past employees. Then when you need to fix it, you often have to adapt to that style as you fix a bug or add an updated feature. This is a little true with Bash because you will see certain engineers always lean towards sed and others towards awk and grep to solve problems depending on their strength, but it is less significant. However, in our Go codebases, everyone writes nearly identical code.

I've been writing Python for over a decade and I still learn new features of the language every week or month. Just this week I had to dive into the `ast` library (abstract syntax trees) which was a native module I haven't ever touched before. By contrast, I haven't had to learn any new core syntax and tools of Go and Bash in a long time. You have a fairly small set of syntax and that's it. The power comes in how you use that small set of tools, not learning a new language module.

Again, the infinite flexibility sounds nice. But in a corporate environment the strictness of other languages is actually a benefit because it keeps everyone following the same path.

I truly believe the mantra from zen of python:

> There should be one-- and preferably only one --obvious way to do it

Sadly, Python lost this tradition far before I ever became acquainted with the language. And the Python 3.10+ mentality of supporting everything forever is only going to make it worse over time.


Either you move with the times or you become obsolete. 20 years ago Python codebases were clean and consistent, but language design has moved on, and Python has - barely - kept up with it, so now you have people who know the new ways and people who know the old ways and various stages in between (and it's not like they didn't try ditching backward compatibility, but that didn't work out well either). Go has the luxury of starting 20 years later and being a lot less ambitious, but it'll happen to Go too in time.


> For example, let's say I gave a simple coding puzzle (think LeetCode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would be significantly different.

I get what you're saying, but I don't think that's as true as you mean. I think that most experienced Python developers tend to gravitate towards the same style of solutions for things like coding puzzles (including the classic "read in data, map it through a comprehension, write out data" pattern).

There are multiple ways of expressing the same thing, but very often in a specific context one of the ways tends to be overwhelmingly a better fit than the other.


> For example, let's say I gave a simple coding puzzle (think LeetCode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would be significantly different.

This is true for any language. Arguably, what's different about Python is that the more senior the engineers you're interviewing, the more likely their solutions to converge.

Just because something is in the Zen of Python, doesn't mean it automatically gets followed by every Python engineer. It's Zen, after all - you don't automatically get enlightened just by reading it.


There's generally one obvious way to do it. There's almost certainly other ways to do it, and those other ways might be better suited for different specific tasks, but the obvious way tends to exist and be widely usable.

Are there some exceptions? Sure.


I'd suggest "pythonic", rather than obvious, for that sentence. It's one of the ways the community reminds itself of the utility of consistent discipline.


It's perhaps worth noting that the next line of the Zen of Python is: "Although that way may not be obvious at first unless you're Dutch." (I.e. unless you're Guido.)

So it was a bit of a tongue in cheek statement from the very beginning. :D


IMO, it's not "hilariously wrong" in Python.

Note that as written the priority is that every use case has at least one obvious approach, and that a secondary priority is that there should be a unique obvious approach.

People often seem, however, to misread it as “there should be only one possible way to perform any given task.” Which, yes, is hilariously false for Python, but also not what it says.


Yeah. It is now.

It didn't use to be, though. People would wax poetic about how Pythonic some codebase was and sing the praises of idiomatic Python. There was an ideological war between Python ("there should be one--and preferably only one--obvious way to do it") and Perl ("there's more than one way to do it", or TIMTOWTDI, pronounced "Tim Toady").

Generators and comprehensions and decorators and `functools` are all relatively new.


I feel like the entire industry has adopted Python's approach here, so probably Python doesn't stand out on this point as much as it did in the early days.

Compare Python to C/C++, where even just getting a working binary from source is fraught with many competing tools. Or boost vs std and whatnot.

Others have already pointed out the contrast with Perl.


There is always one obvious way to do things in Python, but it happens that what's obvious varies from person to person.


Like those various personal subsets of C++ :)


As evidenced by the unified Python library / dependency management system.


I use pip 100% of the time


>> I use pip 100% of the time

What about pipenv, poetry, conda, setuptools, hatch, micropipenv, PDM, pip-tools, ActiveState platform, homebrew, or your Linux / BSD distro's package manager?


As much as I love to rag on things, I would go so far as to say that the big problem with Python packaging is the fact that it tries to manage C/C++ packaging and integration.

If Python is only managing Python code, the issues to be solved are VASTLY simpler. Just about anything works.

Once it has to deal with dynamic libraries, compiled code, plugins, build systems, etc. the combinatorial explosion just makes life suck.


Python is a victim of its own success and versatility.

It has become a better "glue code to solve your problem" than many other solutions and so people want to use it everywhere for everything.

It gets packaged in many different ways because it gets used in many different ways for many different purposes.


Packaging is a difficult problem, and all attempts to simplify it fail to appreciate how complex the problem is, and so fail in some way. (Attempts like .deb do okay by only focusing on a subset of the problem.)


I use none of those except for homebrew, but I didn't mention it because it's for installing complete programs that happen to be written in Python, not Python dependencies for when I'm working with Python.


You forgot about egg!


pip (with venvs, which are a built-in Python feature) covers 99% of all use cases.

Yes, its dependency resolution could have been better, and lock files are nice (they can be emulated using constraints [1]), but I don't understand why people are so busy writing alternatives. I work on pretty sophisticated codebases on a daily basis and haven't used anything but pip.

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
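
For reference, the constraints trick in [1] amounts to keeping requirements.txt loose and letting a separately pinned file constrain resolution:

    pip freeze > constraints.txt                         # pin what's installed
    pip install -r requirements.txt -c constraints.txt   # install within pins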


Good for you, but many don't. Most people I know use Anaconda and install things with conda.


Use Poetry to produce a lockfile (or whatever lockfile-producing tool you like), and then let Nix take the wheel. It solves the dependency problem not just for Python, but for your entire stack. I've been able to hand it off to my coworkers and have everything just work with a simple "nix-shell" command, even with a dependency tree chock full of gnarly ML libs.


Whatever in the NamedTuples do you mean?


There are also TypedDict, Pydantic, attrs, and dataclasses.


Yep, but couldn't be bothered typing them all :D


Far faaaar better with respect to this than R though at least.


You're young enough to have never dealt with write-only Perl code ...

Be glad.


> You're young enough to have never dealt with write-only Perl code ...

How young does one have to be for that to be true?

As recently as 2017 I was employed at a place where there was a significant amount of perl code, as part of the build system (ugh) for the C code, and generating html (for some other system).


Back then (in 2004) the most popular programming languages were PHP, Perl, C++ and Java. Java was pretty focused (and even then, there was the distinction between int and Integer, between int[] and ArrayList, etc.) but C++ and (especially) Perl were driving people crazy because there were thousands of ways to do the same thing. And nobody had any respect for PHP's design, so let's not even talk about that one.


Not really. For the core language this applies. It does not extend to 3rd party libraries, obviously, since anyone is free to reproduce whatever someone else has already done, better, if they want.


I also like the idea of explicit > implicit, but then you see Django.

I hate significant whitespace though. I can work with it, I get it, I still don't like it.


That was then. Python does much, much less explicitly these days, too.




