I was expecting something that would be a bit more controversial. I highly recommend reading through to the linked "neutral" analysis. Seeing evented and threaded styles as duals of each other is fairly useful, especially if it leads you to realize that ultimately what you want to see is the flow of the data. Often the implementation specifics are noisier than they need to be.
I think concurrency has nothing to do with doing things "at the same time". Concurrency is about overlapped execution, and you can very well have concurrency without parallelism.
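The "overlapped execution without parallelism" point is easy to demonstrate. Here's a minimal sketch using Python's asyncio (the names and the `sleep(0)` yield trick are just illustrative): two coroutines interleave on a single thread, so execution overlaps in time with no parallelism at all.

```python
import asyncio

async def worker(name, log):
    # Each await is a yield point: the single-threaded event loop
    # switches to the other coroutine, so execution is overlapped
    # (concurrent) while nothing ever runs in parallel.
    for i in range(3):
        log.append(f"{name}:{i}")
        await asyncio.sleep(0)  # cooperatively yield control

async def main():
    log = []
    await asyncio.gather(worker("a", log), worker("b", log))
    return log

log = asyncio.run(main())
print(log)  # entries from "a" and "b" strictly interleaved
```

One thread, one core, yet two overlapping execution contexts: concurrency with zero parallelism.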
It might be one of those terms in computer science that has multiple definitions.
> I think concurrency has nothing to do with doing things "at the same time".
Concurrent is a word that pre-dates computers, and for the English language to work we need to stay true to its definition. The dictionary definition is: "occurring or existing simultaneously".
Rob Pike gave a feature-length talk on the very topic of 'concurrency vs parallelism' and although I gave up 20 minutes in (I don't think I was getting what he was trying to say), I assume there might be a notion behind what we call concurrency that stands independently of the parallelism/serialism divide, maybe at a different level of abstraction.
I still don't understand what that notion is that warrants coining a new word, rather than using the existing multitasking/multithreading on a single core, parallel processing on multiple cores, or distributed computing on a cluster/network. (Maybe that's what it is: three different ways of achieving concurrency? But then all of it could just as easily be called "parallel processing" or "multitasking".)
In any case, I think an article about "Introduction to Concurrency" can be considered incomplete without a discussion of comparing/contrasting it with parallelism.
You use concurrency because having multiple execution contexts at "the same" time is useful, even if they are merely interleaved by the thread scheduler on the same core. E.g. an IRC server: having a thread per socket, blocking until there is some I/O to be performed on that socket, is a very workable, straightforward design.
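That thread-per-socket shape is easy to sketch. Here's a toy echo server in Python standing in for the IRC example (all names are mine, not from any real IRC codebase): each connection gets its own thread that simply blocks until there is I/O on its socket.

```python
import socket
import threading

def handle_client(conn):
    # One thread per connection: it blocks in recv() until there is
    # I/O on this socket, and the OS scheduler interleaves clients.
    with conn:
        while True:
            data = conn.recv(1024)
            if not data:
                break
            conn.sendall(data)  # echo back; an IRC server would relay it

def serve(listener):
    while True:
        conn, _addr = listener.accept()
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

# Demo: bind to an ephemeral localhost port and connect one client.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen()
threading.Thread(target=serve, args=(listener,), daemon=True).start()

client = socket.create_connection(listener.getsockname())
client.sendall(b"hello")
echoed = client.recv(1024)
print(echoed)
```

The control flow per client reads top-to-bottom and sequential, which is exactly why this design is so straightforward: the concurrency lives in the structure (one context per socket), not in the per-client logic.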
A concurrent process is concurrent for design reasons, and could not (sanely) be any other way.
You use parallelism because your problem gets solved faster when worked on in parallel instead of in sequence (and obviously not all problems are like that). A parallel program running on a single-core machine would not buy you any performance increase (and its design is unnecessarily convoluted).
A parallel process is parallel for performance reasons, but could be run (just proportionally slower) on a single core.
> You use concurrency because having multiple execution contexts at "the same" time is useful, even if they are merely interleaved by the thread scheduler on the same core. E.g. an IRC server: having a thread per socket, blocking until there is some I/O to be performed on that socket, is a very workable, straightforward design.
So concurrency is a technical/academic/generalized term for "multitasking". And yet:
- the article doesn't mention that word.
- Wikipedia on concurrency doesn't mention the word.
- Wikipedia on multitasking is a separate article (that does mention concurrency but doesn't clarify the relationship between the two: one being a generalization of the other).
- And people go on at length trying to explain the "difference" between the terms, without making a helpful connection [0]
"Concurrency is a system structuring mechanism.
Parallelism is a resource."
from "understanding and expressing scalable concurrency"
(Turon, PhD thesis, p. 29)
Oh dear. You might be right about "syntax", but definitely not "type". There are all kinds of arguments over what a "type" is; "stack" is probably safe; "message" has different definitions in different programming languages, but I think everybody knows that and won't be confused in the context of any particular language, so that probably counts.
For those interested in debates about the word "type", here is just one of a million online discussions that have taken place: https://github.com/JuliaLang/julia/issues/6113 (it started with an argument over what a "dependent type" was). That said, type-as-interface and class-as-implementation gets some traction because it is just vague enough.
Re parallelism and concurrency, a similarly vague-enough distinction would be: parallelism is the state of things happening at once, while concurrency is the coordination and management of parallel activities. But I agree that a lot of people don't really make the distinction and use the terms interchangeably, which is something Rob Pike pointed out (in the context of Go, here: http://blog.golang.org/concurrency-is-not-parallelism)