Barriers to entry are invisible. They are invisible to people on the inside and most frequently invisible to people who have a hand in creating those barriers.
I once had a conversation with a founder about their signup flow. This founder had aspirations to be a global player. They said that step 2 of their flow included entering a credit card number. I had to stop them. "What about people who don't use credit cards?" They were nonplussed. "You know, China, much of Africa ..." They had never actually thought about whether people had credit cards, because every single person they interacted with had one.
Back to languages. If you've never taught an introduction to computer programming for a general audience you are blind to what does and does not matter about a language. You look at things like `foo.bar()` and think "Yeah that's a simple method invocation" and have no idea how many people you just lost.
Never underestimate how much ease of use matters. We, as a community, have selected for decades against people who care about language ergonomics because they get hit in the face with a wall of punctuation filled text and turn away. We fight over where braces belong while not realizing how many hundreds of thousands of people we've excluded.
Ease of use is the most important part of a language. It's just that the barriers to entry are invisible and so the people who happen to be well suited to vim vs emacs debates get to talk while the vast majority of potential programmers are left on the outside.
We create barriers to entry accidentally because we design for people like us. To us the barriers are invisible.
Programming has an almost infinite depth to it, depending on the complexity you are tackling/modeling. In a way, it is like mathematics: layers act as foundations/abstractions for more complex representations. An introduction to mathematics starts with counting discrete things, then the number line, then addition, which is a foundation for/abstracted by multiplication, which is in turn a foundation for/abstracted by exponents, etc., and in no time you're at the level of multivariate calculus and n-dimensional hypercubes.
Each level requires notation that is concise (so it can fit in short-term memory to be actually useful) - one cannot expect graduate-level mathematics to share the same notation/symbols as beginner (elementary) maths.
Programming languages have to trade off which level they are biased towards, I do not believe in a universal language that is both easy for beginners while being concise for advanced users.
Back on topic - Go is not an easy language, it is a simple language, those 2 things are not the same.
Go isn't even simple. They tried to be simple, but mostly managed only to move the complexity around, or to foist it onto the user's shoulders by not providing a layered, composable API in the standard library.
There are some kinds of complexity you can't eliminate, only contain through careful, thoughtful architecture. Unfortunately, people often confuse this kind with the kind that can be eliminated, and end up making a bigger and bigger mess as they push things around.
What do you mean by "layered, composable API"? Go does provide abstractions for common use cases in its standard library (`io.ReadAll`, for example), and there are interfaces to connect different parts of the standard library (`io.Reader`/`io.Writer`, `fs.FS`, etc.).
> Go is not an easy language, it is a simple language, those 2 things are not the same.
This distinction is fundamental to any sort of design, but is unfortunately lost on a lot of people (especially developers, in my experience). Easy and simple are near-universally conflated.
> I do not believe in a universal language that is both easy for beginners while being concise for advanced users.
Exactly. I really want some languages to be more intuitive and easy to pick up for non-programmers. The average person (whatever that means) does not have the first clue about the machines that run so many parts of their lives.
At the same time, such a language would likely be a bad fit for most professional software development. That hardly means it's without value.
I know languages like that exist, but they're often aimed at absolute beginners and are treated like toys. There doesn't seem to be much middle ground or "transition languages".
Insert joke about how $LANGUAGE_YOU_DISLIKE is a toy.
That’s an interesting statement as it doesn’t always work in other languages. In German (my native language) my first instinct was to translate both words as “einfach” which contains both concepts. In fact, in my online dictionary of choice the word “einfach” is the first entry for both “easy” and “simple”. So if Germans conflate these two it might be because of the language they speak :) But more to the point, I’m wondering how universal the distinction between easy and simple is when other languages cannot express that distinction as easily as in English.
Interesting. My native language, Afrikaans, has a lot of influence from both Dutch and German (as well as a bit from English and quite a bit from Bantu African languages). We say "eenvoudig" for "simple" and "maklik" for "easy". I recognize both the (different) Dutch words Google Translate provides me when I translate to Dutch as similar to the Afrikaans words, but Google Translate translates both to "einfach" when I translate to German. Maybe German for some reason conflates the meanings. That said, homonyms and homophones that conflate meanings are found in most human languages, and often for meanings that are far easier (or is that simpler ;-)) to distinguish than "simple" and "easy".
If we were to agree that a simple task has low complexity to accomplish, and an easy task requires little energy to accomplish, then conflating them is straightforward, particularly when weighing the mental effort to accomplish the task. (Tying a shoelace is simple: four steps. Tying a shoelace with 5 pound weights on my wrists is still simple, but not easy.) Of course, if you don't agree to these definitions, then the intersection of them is thinner.
I mean, Python is the de facto beginner language and also used for professional software development (and of course intermediate data science work). Are you suggesting this is an unwise or unstable equilibrium?
I'm glad you bring that up. I think Python is the closest we have to a "universal language" (even so, it still has some limitations).
I think it works well for beginners because the language itself is so consistent and they have put a lot of effort into avoiding corner cases and "gotchas". And I think it works for professional uses because of third party library support.
To answer your question: I'm not suggesting that at all. I'm honestly not entirely sure how Python balances it seemingly so well. Given the lack of focus in the industry towards "intermediate" programmers and use cases, my slight fear is that Python will be shoehorned into one direction or the other.
Even if the language itself isn't, it does feel like the use-case-complexity gap is growing exponentially, at times.
And not just with Python. Seemingly, you're either a complete beginner learning conditionals and for-loops or you're scaling out a multi-region platform serving millions of users with many 9's of uptime.
Python does this so well because of the extremely full featured and fairly easy to use C API. Advanced programmers can write extension modules for the interpreter and provide APIs to their C libraries via Python, give their types and functions a basically identical syntax to MATLAB and R, and bang, statisticians, engineers, and scientists can easily migrate from what they already know how to use, pay no performance penalty, but do it in a language that also has web frameworks and ORMs. You can do machine learning research and give your resulting predictive models a web API in the same language.
This gets badly underappreciated. I've been working in Python for a while and honestly, I hate it. I wish I could use Rust for everything I'm doing. I can't stand finding so many errors at runtime that a language with static type checking would have caught at build time.
But I also recognize the tremendous utility in having a language that can be used for application development but also for numerical computing where static typing isn't really needed because everything is some variant of a floating-point array with n dimensions. Mathematically, your functions should be able to accept and return those no matter what they're doing. All of linear algebra is just tensor transformations and you can model virtually anything that way if you come from a hard engineering background. Want to multiply two vectors? Forget about looping. Just v1 * v2. It will even automatically use SSE instructions. Why is that possible? The language developers themselves didn't provide this functionality. But they provided the building blocks, in the form of a C API and operator overloading, that allowed others to add features for them.
So the complaints you typically see about dynamic languages simply don't matter. No static typing? Who cares? Everything is a ndarray. Syntax is unreadable? Not if you're coming from R or MATLAB because the syntax is identical to what you're already used to using. Interpreted languages are slow? Not when you have an interface directly to highly optimized BLAS and ATLAS implementations that have been getting optimized since the 50s and your code is auto-vectorized without you needing to do anything. GIL? It doesn't matter if you can drop directly into C and easily get around it.
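The vectorized style described above is easy to see in a tiny NumPy sketch (the array values here are purely illustrative, and this assumes NumPy is installed):

```python
import numpy as np

# Elementwise multiply without an explicit loop: NumPy dispatches the
# operator to compiled (and typically SIMD-vectorized) C under the hood.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])

product = v1 * v2          # array([ 4., 10., 18.])

# The same operator-overloading machinery works on higher-rank arrays:
m = np.arange(6.0).reshape(2, 3)
scaled = m * 2.0           # every element doubled, still no loop
```

None of this is built into the language itself; it all rides on the operator overloading and C API hooks the parent comment describes.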
Meanwhile, it's also still beginner friendly!
EDIT: I should add, editable installs. That's the one feature I really love as a developer. You can just straight up test your application in a production-like environment, as you're writing it line by line. No need to build or deploy or anything. Technically, you can do this with any interpreted language, but Python builds this feature directly into its bundled package manager.
Great rundown! It's love/hate for me too. Python is the worst language for scientific computing, except for all the others. I think Julia's going to take the crown in a few years though, once the libraries broaden out and they figure out how to cache precompiled binaries to get the boot time down. With Python, it's not so much that you get to write C, you have to write C to get performance. I'll be interested to see whether Julia takes off for applications besides heavy numerical stuff. That seems to be the Achilles' heel of languages designed for math/science applications -- it's easier to write scientific packages inside a general-purpose language than vice versa.
This is hands down the best description I've seen of why so many of us persist in using Python despite the language or runtime. I do hope that more alternative language ecosystems will begin to thrive in the numerical space and that we'll see more ergonomic facilities for generating high performance code from within Python itself.
TL;DR: Right now Python is almost always easier for numeric beginners than Rust is, and more productive too. I just don't see Python's ease and productivity advantages remaining if Rust can catch up with Python's ecosystem and toolchain. But we'll have to wait and see if that will happen. And if and when Rust is actually (slightly) friendlier to the numeric-computing beginner, and much more productive in some numeric/scientific contexts than Python, Python loses its current intermediate-language position. Especially if similar improvements happen in other domains.
> Python does this so well because of the extremely full featured and fairly easy to use C API. Advanced programmers can write extension modules for the interpreter and provide APIs to their C libraries via Python, give their types and functions a basically identical syntax to MATLAB and R, and bang, statisticians, engineers, and scientists can easily migrate from what they already know how to use, pay no performance penalty, but do it in a language that also has web frameworks and ORMs. You can do machine learning research and give your resulting predictive models a web API in the same language.
You know what's better than "the extremely full featured and fairly easy to use C API"? A language that can itself compete with C/C++ for writing the libraries you need. The only advantages Python has over Rust regarding the library ecosystem are the first-mover advantage, and the fact that Rust makes obvious how terrible the C API is, which means people often invent new Rust libraries rather than reuse the old C libraries. The only advantages Python has over Julia are first-mover, and that I doubt Julia-native libraries can truly match highly optimized C/C++/Rust libraries performance-wise in most situations where performance actually matters.
> But I also recognize the tremendous utility in having a language that can be used for application development but also for numerical computing where static typing isn't really needed because everything is some variant of a floating-point array with n dimensions.
* Some numeric use cases need to work with more than floating points. Maybe complex (2D), quaternion (4D), or octonion (8D) numbers; maybe dollars; maybe durations. You lose all units of measurement, and they are often valuable. In Python you cannot indicate "this cannot contain NaN/None".
* In an N-D array, N is a (dependent) type; so is the size, and so is the shape. Julia got this right, but last time I checked it had a nasty tendency to cast the dependent types to `Any` when they got too complex. Imagine if you could replace most `resize` calls with `into` calls and have the compiler verify the few cases where you still need `resize`. In Rust, several libraries already use dependent types for these sorts of uses, but the lack of important features that are only now starting to approach stable (const generics, GATs) makes them very unergonomic to work with.
* I see a lot of complex types that should've been a single column in a dataframe get represented with the full complexity of multi-indexes. Yuck! Not only more complex, but far less expressive and more error prone. I haven't yet seen Rust go the extra step and represent a struct as a multi-index to get the best of both worlds, but it's what I would love, and Rust definitely has the potential. It's just not a priority yet, as we are still implementing the basics of dataframes first.
* Things get even more interesting when you throw in machine learning. As a master's degree student, it took me months (mostly during the summer vacation, so I wasn't going to ask my promoter for help) to figure out that the reason I was getting bogus results was a methodological mistake that should have been caught by the type system in a safe language with a well-designed ML library. Here the issue is "safe" and "well-designed library" more than "statically typed", but a powerful type system is required, and the type system would catch the error in milliseconds instead of hours if it is static rather than dynamic.
> Forget about looping. Just v1 * v2. It will even automatically use SSE instructions.
Many languages have operator overloading and specialization or polymorphism to enable optimizations. In Rust this is again just a case of libraries providing an optimized implementation with an ergonomic API.
> So the complaints you typically see about dynamic languages simply don't matter. No static typing? Who cares? Everything is a ndarray.
Nope. Everything is not just an ndarray. That often works well enough. But when numeric computing gets more complex, you really want a lot more typing.
> Not when you have an interface directly to highly optimized BLAS and ATLAS implementations that have been getting optimized since the 50s and your code is auto-vectorized without you needing to do anything.
Many of those decades-old optimizations are irrelevant, or even deoptimizations, on modern hardware and with modern workloads. Optimizations need maintenance. In C/C++, optimizations are very expensive to maintain; in Rust we can not only leapfrog the outdated ones but also maintain optimizations much more cheaply. Also, as we move into more and more datasets that are medium/large/big (and therefore don't fit into RAM), we're getting more and more optimizations that are very hard to make work over the FFI boundary with Python. The fastest experimental medium-data framework at the moment is implemented in Rust and has an incredibly thick wrapper that includes LLVM as a dependency (of the wrapper), since it's basically hot-reloading Python and compiling it to a brand new (library-specific) DSL at runtime, to get some of the advantages of AOT compilation and to try to recover some of the optimizations that would otherwise be lost across the FFI boundary. Note that this means you now need to do a very expensive optimized compile on every run, not just on every release compile of the program, though I guess you can do some caching. Note also that it means the maintenance cost of the wrapper quite likely dwarfs the maintenance cost of the library implementation, which is not a good situation to be in. The fastest currently-in-production Python framework for medium/large data is probably Dask, but to achieve that performance you need to know quite a bit about software engineering and about the Dask library implementation, do quite a bit of manual calculation for optimal chunk sizes, optimal use of reindexing, optimal switching back and forth with numpy, etc., and avoid all the many gotchas where something you expect would work crashes instead and needs a very complex workaround.
In Rust, you can have a much faster library where the compiler handles all of that complexity for you and where everything you think should work actually does work, and that library is already available (though not yet production ready).
> Meanwhile, it's also still beginner friendly!
* Is it? I admit its code (though definitely not its toolchain) is marginally better for programming novices, and that "marginally" is important. But remember that novices don't need to learn about borrow checking/pointers/whatever in Rust either, and that by the time they're that advanced, they need to worry about those things in Python as well; the language just provides no tools to help them, so instead of learning concepts, they learn debugging. Rust is lagging in teaching novices mostly due to the lack of a REPL and fewer truly novice-friendly docs, IMHO.
* But give Rust a good REPL and a more mature library ecosystem and I cannot imagine Python being any more beginner friendly than Rust for numeric computing. When "everything is just an ndarray of floats" is good enough, the Rust would look identical to the Python (except for the names of the libraries used) but provide better intellisense and package management. When "just an ndarray of floats" isn't good enough, Rust would have the beginner's back and help them avoid stupid mistakes Python can't help with, or express custom domain types that Python cannot express, or at least cannot express without losing the advantages of the library ecosystem.
Don't get me wrong. Right now Python is almost always easier, and more productive too, for numeric computing. I just don't see it remaining that way if Rust can catch up with its ecosystem and toolchain. But we'll have to wait and see if that will happen.
I can also think of several other domains where Rust is actually potentially better suited as intermediate language than the competitors:
* Rust arguably is already there in embedded if you can get access to and afford a Cortex-M. But I think it might actually be capable of beating MicroPython in ease on an Arduino one day. (At least for programmers not already expert in Python.) I won't go into my reasoning since this is already getting long. One day embedded Rust might also compete with C in terms of capabilities (the same or better for Rust) and portability (probably not as good as C on legacy hardware, but possibly better on new hardware).
* I think Rust is already a better language than Go for cloud native infrastructure except for its long compile times and it seems like an increasing number of cloud native infrastructure projects also feel that way. In the meantime new libraries like `lunatic` might be an indication that one day Rust might be able to compete with Go in terms of ease of writing front-ends for beginners.
* Looking at what happens in the Rust game libraries space, I think Rust can definitely be a great intermediate language there one day. It already has a library that aims to take on some beginner gamedev/art libraries in languages like JS/processing/Go and at the same time, it has several libraries aiming to be best in class for AAA games.
Python abounds with corner cases and gotchas. It may have fewer than JS/Perl, but that really isn't saying much. It may hide them until a test or real-world use shows you you've stepped on them but that's not always a good thing.
"How do you remove an item from an array in Ruby? list.delete_at(i)...Pretty easy, yeah?"
"In Go it’s … less easy; to remove the index i you need to do:"
list = append(list[:i], list[i+1:]...)
So Go is simple in that it doesn't have shortcut functions for things. There's generally one way to do things, which is simple. But it's not easy, because it's certainly not intuitive that "append" is the way to remove an array element.
Not OP, but I'd be happy to try and differentiate, as well.
Simple means uncomplicated, not a lot of pieces. A violin is arguably simpler than a guitar because of a lack of frets.
Easy means it is not difficult. A guitar is arguably easier than violin [0] because it has frets.
It's important to remember that "easy" is very subjective. What is easy for one person might be insurmountable to another.
tl;dr "simple" usually means "easy to understand" and "easy" usually means "easy to do". Both inherently assume some amount of prior knowledge or skill, so neither is entirely universal or objective.
[0]: I'm not trying to say one instrument is better or to disparage any guitarists. It's an example.
(I am not entirely sure I agree with its thesis or its applicability to Go, but since nobody had actually linked you directly to the concept, I thought it would be worthwhile to do so.)
To me, what makes Go a simple language is its limited set of building blocks and its being very opinionated - this mostly relates to syntax and what the core language provides out of the box. With Go, you only need to understand looping, branching, channels, and slices, and you're mostly good to go.
Ease is measured by how readily a language may be used to solve a problem. As an example, text-wrangling with Perl is easy - but you may have to do it using a complex (i.e. not simple) representation.
Back to Go - channels are a simple concept, but are not (always) easy to work with if you do not put a lot of thought into your concurrency model.
Edit: I just thought of another way to express the difference between "simple" and "easy". The notation of adding "1+1=2" is simple, however proving that "1+1=2" is not easy (at least at the level of elementary students).
Oh, you don't even have to go that far. Of all the people I know maybe 1 out of 10 has a credit card. The irony in regard to your post: Most of them got one because "it was needed for something online" at some point. This is in Germany.
Maybe your sample is really biased, but the actual credit card ownership rate in Germany in 2017 was 53% and growing YoY, so it's likely quite a bit higher by now.
A programming language is a tool. I don’t think it should really be optimized or designed around how easy it is to teach someone who is at step zero, basic programming concepts. I do think Go is a decent language to teach up and coming professional devs precisely because it doesn’t hide the real complexity of what’s going on, but I’d probably opt for something a bit higher level as the absolute intro to programming.
I've been coding for 10+ years, primarily in Ruby and Python. I switched to Go recently for work. While it's an interesting language, it takes forever to express what I want the code to do, compared to those languages.
Go's simplicity forces developers to introduce insane amounts of complexity.
After like 9 months of Go I am already starting to leave it behind for Rust, but one thing Rust still really lags behind Go in is the maturity and feature completeness of major libraries.
The main example in my mind is how powerful Cobra/Viper is to create CLI apps where config can come from multiple sources - files, flags, env vars. To do the same in Rust you need to write a lot of your own code and glue together several libraries.
There's also nothing I can find for Rust that can do database migrations as nicely as Goose for Go. Diesel can create its own migrations, but that's only if you're using Diesel, and I prefer SQLX over an ORM, and Diesel isn't async yet.
Ruby and Python are notorious for being difficult to read because of all the foot guns in place. Monkey patching, duck typing, metaprogramming... and that's not even talking about structural limitations like the GIL.
Go is definitely more verbose and less "fun" to write, but it's 10X easier to reason what's going on in a large application.
Of course, it's also the correct kind of solution for some types of problems. If you need a compiled language (many do), the competition to Go isn't Ruby and Python, it's C++ and Rust.
Actually, I feel like Zig may be most in the same compiled-but-keep-it-simple-like-C headspace as Go. I don't know either Go or Zig super well. From what I do know, Zig seems not quite as stubborn as Go about forcing code to be "simple to reason about" (which is also subjective, though maybe less so than "easy to do things").
I would say Zig is interesting in that "safe things are easy and pretty", "unsafe things are difficult and ugly", thus drawing eyes to the code that needs it. It's four lines of nasty looking code for me to convert a slice of bytes to a slice of (f64/f32/i32)... Which the compiler no-ops. This is dangerous because the alignment and byte count must be correct.
Nim's stdlib is actually quite large (like 300 modules). I am sure there are things in Go's that are not in Nim's (like Go's `big`), but I am equally sure there are things in Nim's that are not in Go's (like Nim's `critbits`, `pegs`, etc).
I doubt there is much in Go's stdlib not available in Nim's nimble package system, but OTOH, I am also sure the Go ecosystem is far more vast. I just didn't want people left with the impression Nim's stdlib is as small as Zig's which it is definitely not. Nim has web server/client things and ftp/smtp clients, json and SQL parsers, etc.
Python is more strongly typed than JS/Perl, granted. But it is still very weakly typed overall. Here are some examples:
1. `if list(): pass` # implicit coercion from collection -> bool. (Very uncommon weak typing and a terrible idea.)
2. `a = 123; b = 4.5; c = a + b` # implicit coercion from int -> float. (Common but not universal weak typing. More often hides bugs than it helps with ergonomics and readability, but sometimes a worthwhile tradeoff.)
3. `a = 1 + False` # implicit coercion from bool to int. (Common weak typing in scientific languages (for masking) and older C-family languages (for bit twiddling). However, that's still bad language design. Libraries/syntax sugar should special-case masking and bit twiddling; you should not have global coercion between bool and int.)
That's the difference between static and dynamic typing, not the difference between weak and strong. The values cannot be used as if they were another type.
It has nothing to do with the difference between static and dynamic typing, and also nothing to do with the difference between weak and strong typing. It is about Python not disambiguating between variable shadowing and variable reassignment.
I do think my example code demonstrates why Go is easier to reason about at scale than Python. Python conflates assignment and declaration. Go has `=` and `:=`, so it's crystal clear what's going on.
I like the way Rust makes clear what are the different possibilities:
In Rust we use a macro to handle variadic arguments, but if it were just a case of supporting different types, not of supporting an arbitrary number of arguments, we would have had three options for the type signature:
1. Monomorphized duck typing: `fn print(arg: &impl Debug)`. Here the compiler simply generates multiple print functions, one for every type the function gets called on. The exact concrete type is known at compile time.
2. Polymorphic dynamic-dispatch (with dynamic sizing) duck typing: `fn print(arg: Box<dyn Debug>)`. Here the compiler generates a vtable and the value is allocated on the heap. Only a type approximation is known at compile time, not the exact concrete type, but it still counts as static typing.
3. Dynamic typing: `fn print(arg: Box<dyn Any>)`. Note `Any` instead of `Debug`. Full dynamic typing with the type completely unknown at compile time. Yuck! But occasionally useful for prototyping or for FFI with dynamically typed languages.
Additionally, you could perfectly well have constants, or require variables to have a fixed type, in a dynamic language. You would just pay a cost at runtime to check the type on assignment.
I would argue that you pay the cost at runtime but you also pay a cognitive overhead cost while writing in a dynamically typed language. Refactoring in particular is a lot more difficult.
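As a sketch of that idea: the `Typed` descriptor below is an invented example of bolting a fixed-type check onto assignment in plain Python, paying exactly that runtime cost on every assignment.

```python
class Typed:
    """Descriptor that pins an attribute to one type, checked at runtime."""
    def __init__(self, expected_type):
        self.expected_type = expected_type

    def __set_name__(self, owner, name):
        self.name = name

    def __set__(self, obj, value):
        # The runtime cost: an isinstance check on every single assignment.
        if not isinstance(value, self.expected_type):
            raise TypeError(f"{self.name} must be {self.expected_type.__name__}")
        obj.__dict__[self.name] = value

    def __get__(self, obj, objtype=None):
        return obj.__dict__[self.name]

class Point:
    x = Typed(float)
    y = Typed(float)

p = Point()
p.x = 1.5          # fine
try:
    p.y = "oops"   # rejected at runtime, not at build time
except TypeError:
    rejected = True
```

A static type checker would flag the bad assignment before the program ever ran; here the check happens on every write instead.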
Bad example; booleans in Python have been integer subclasses since forever... 2 + True == 3.
Also,
False in [1, 2, 0] evaluates to True (because False == 0).
But you're still right saying python is strongly typed.
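Both quirks, plus the strong-typing point, are quick to confirm in a stock interpreter:

```python
# bool subclasses int: True behaves as 1 and False as 0.
total = 2 + True                   # 3
membership = False in [1, 2, 0]    # True, because False == 0

# Yet Python stays strongly typed across unrelated types:
try:
    "2" + 2                        # no implicit str/int coercion
    strict = False
except TypeError:
    strict = True
```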
I don't think you can write that much of a rant using the term "ease of use" without explaining your view on what "ease of use" is; does "use" really only mean a beginner using it for the first time? You mention languages with punctuation, what alternatives do you see instead? You claim "ease of use is THE most important part of a language", but why is it the most important?
> Back to languages. If you've never taught an introduction to computer programming for a general audience you are blind to what does and does not matter about a language. You look at things like `foo.bar()` and think "Yeah that's a simple method invocation" and have no idea how many people you just lost.
I keenly remember in college when they were first starting to teach me C++ and I asked something like “Ok: I hear what you’re saying that this is a function, and that’s a parameter, but my question is how does the computer know that you’ve named it [whatever the variable name was]?”
The teacher had no understanding that this was a conceptual barrier.
Of course now I know that the answer is “because the order of the syntax tells it so”, but stuff like that made those classes much harder than they needed to be.
> Ease of use is the most important part of a language.
It's really a matter of who your target audience is and what they're trying to achieve, though, isn't it? You might make a language easy for the whole world to use, but simultaneously make it hard for specific tasks. Likewise, a language might be easy to use for experts who are trying to achieve a specific task, but difficult for newbies. That's totally okay.
BASIC was super easy to understand and helped me get into programming, but there's no way I would use it for anything serious today.
I feel completely the opposite. While I do agree that we should make the barrier to entry as low as possible, a lot of the time the barrier to entry is in conflict with the usefulness of a tool. We should lower the barrier to entry without losing any usefulness, and no further.
> Barriers to entry are invisible. They are invisible to people on the inside and most frequently invisible to people who have a hand in creating those barriers.
I just want to say that I really, really like this phrasing.
I have a lot of opinions on how programming languages could be improved, and however much I disagree with Rob Pike on types, I still think Go hit a real sweet spot.