
butterisgood · yesterday at 8:29 PM

> Concurrency refers to the ability of a system to execute multiple tasks through simultaneous execution or time-sharing (context switching)

Wikipedia had the wrong idea about microkernels for about a decade too, so ... here we are I guess.

It's not a _wrong_ description, but it's incomplete...

Consider something like non-strict evaluation in a language like Haskell. One can be evaluating thunks from an _infinite_ computation, terminate early, and resume something else, purely as a consequence of the evaluation order.

That is something that could be simulated via generators with "yield" in other languages, and the semantics would be pretty similar.
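
A minimal sketch of the Haskell point (the names here are just illustrative): only the thunks we actually demand from an infinite computation ever get evaluated, and control simply moves on to something else.

    -- `naturals` denotes an infinite computation, but laziness means
    -- only the thunks that are actually demanded ever get evaluated.
    naturals :: [Integer]
    naturals = [0 ..]

    main :: IO ()
    main = do
      print (take 5 naturals)           -- forces just five thunks, then stops early
      putStrLn "resumed something else" -- the rest of `naturals` is never evaluated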

Also consider continuations in Lisp-family languages... or exceptions for error handling.
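
For the continuation flavor, here's a rough Haskell sketch using callCC from Control.Monad.Cont (mtl); `findFirst` is just a made-up example. The captured `exit` continuation abandons the rest of the computation early, much like throwing an exception:

    import Control.Monad (forM_, when)
    import Control.Monad.Cont (callCC, runCont)

    -- Bail out through the captured continuation `exit` as soon as a
    -- match is found; the remaining iterations simply never run.
    findFirst :: (a -> Bool) -> [a] -> Maybe a
    findFirst p xs = flip runCont id $ callCC $ \exit -> do
      forM_ xs $ \x -> when (p x) (exit (Just x))
      return Nothing

    main :: IO ()
    main = print (findFirst even [1, 3, 7, 8, 9 :: Int])  -- Just 8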

To wrangle with any of this, you have to assume all of these things could occur simultaneously relative to each other, in what "feels like" interrupted control flow. From the outside looking in, concurrency is no different: it comes down to sequencing things.

Is it evaluated in parallel? Who knows... parallelism is a strategy that can be applied to concurrent computation, but it's not required. Nor is "context switching", unless you mean switched control flow.

The article is very good, but if we're going by the "dictionary definition" (something programming environments tend to get only "partially correct" anyway), then I think we're kind of missing the point.

The stuff we call "asynchronous" is usually a subset of the asynchronous things in the real world. The stuff we treat as task switching is just one form of concurrency. But we all seem to agree on parallelism!