Concurrent generally means "occurring or able to occur at the exact same time." So you do have parallel and asynchronous IO, but not concurrent. There's lots of debate over the terms, but I think that's the agreed "exact definition."
Of course, does this really make a difference for network IO? Almost always the answer is no. The difference will be on the order of microseconds, maybe milliseconds.
* Concurrency is a property of the relationships between tasks in the problem (or the algorithm). Is fetching one page, for example, independent of fetching another? It could be, in which case the tasks are concurrent. But it might not be true: if one is a child page, you have to fetch the parent page, look at its links, and only then fetch those pages. The tasks have a hard-coded sequence, so they are not concurrent.
* Parallelism is how that algorithm or problem is solved or executed. It could be that you can execute all concurrent units at the same time, so you achieve parallelism, which is great. Or it could be that due to a particular architecture or other reasons you execute it serially. Maybe you just have a while loop: fetch one page, wait, then fetch another. The problem is concurrent, but it is not run in parallel.
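A minimal sketch of that distinction in Python (the `fetch` function and URLs here are made-up stand-ins, not a real HTTP client): the page fetches are concurrent tasks either way; only the execution strategy changes.

```python
import concurrent.futures

# Hypothetical stand-in for fetching a page. The URLs are independent
# of one another, so the *problem* is concurrent regardless of how we
# choose to execute it.
def fetch(url):
    return f"contents of {url}"

urls = ["/a", "/b", "/c"]

# Serial execution: the concurrent problem, run one task at a time.
serial_results = [fetch(u) for u in urls]

# Parallel execution: the same concurrent tasks, run on a thread pool.
with concurrent.futures.ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(fetch, urls))

assert serial_results == parallel_results  # same answer either way
```

Same problem, same result; whether it ran in parallel was a property of the execution, not of the algorithm.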
Notice my definition doesn't mention CPU or IO. In the real world there is both: CPU concurrency interleaved with IO concurrency. You can then end up running none, one, or both in parallel when you execute.
Yes. And this implies that concurrent is not necessarily parallel, but parallel is always concurrent. A way to keep the notions straight is that if you're implementing a kernel that will only ever run on one core, you still have to worry about concurrency: all processes, including the kernel, are time-sharing the single core.
I don't really agree. The heart of the problem of concurrency is non-determinism, but it's perfectly possible to have deterministic parallel algorithms. Normally the key is not letting the parallel operations interact with one another.
So to me, a useful (for discussion) definition of concurrency involves multiple logical tasks, overlapping in time, and interacting with one another, in a non-deterministic way.
Whereas parallelism is concerned with taking advantage of physical hardware that can do more than one thing simultaneously.
And concurrency does not imply parallelism, nor does parallelism imply concurrency, under my understanding. In particular, data parallelism like SIMD or CUDA is not concurrent.
> So to me, a useful (for discussion) definition of concurrency involves multiple logical tasks, overlapping in time, and interacting with one another, in a non-deterministic way.
Hmm, I would think it would be the opposite: they're concurrent precisely because they don't have to interact. They can run independently. Two requests to a server are concurrent because they don't have to know about each other and don't have to interact with each other. This is a property of the problem domain (idealized web requests); it doesn't tell us anything about how they'll run (in parallel or not).
> Whereas parallelism is concerned with taking advantage of physical hardware that can do more than one thing simultaneously.
I agree with that.
> And concurrency does not imply parallelism, nor does parallelism imply concurrency, under my understanding. In particular, data parallelism like SIMD or CUDA is not concurrent.
Don't quite agree with that, and I don't see why SIMD algorithms have to be a special case. Say you compute a dot product between two vectors. If you write the algorithm down, you have a bunch of multiplications and a sum. You notice that the algorithm has a lot of concurrency. If you don't have SIMD, you could spawn a thread to multiply each pair and then sum. That would be silly, but you'd run in parallel. Or you could just do it sequentially with a for loop. But if you have SIMD, the hardware knows how to run those concurrent algorithmic steps in parallel.
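A toy sketch of that dot-product point in plain Python (a thread pool standing in for SIMD lanes — not real vector hardware): the multiplications are independent of each other, so they can run sequentially or in any order on a pool without changing the result.

```python
import concurrent.futures
import operator

a = [1, 2, 3, 4]
b = [5, 6, 7, 8]

# Sequential: a plain loop over the independent multiplications.
dot_serial = sum(x * y for x, y in zip(a, b))

# "Parallel": each multiplication is independent (the concurrency in
# the algorithm), so a pool may run them in any order, or at the same
# time, before the final reduction.
with concurrent.futures.ThreadPoolExecutor() as pool:
    products = pool.map(operator.mul, a, b)
    dot_parallel = sum(products)

assert dot_serial == dot_parallel == 70
```

The concurrency was in the algorithm all along; SIMD, threads, or a for loop are just different ways of executing it.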
In computer science, concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other.
But it doesn't really matter, what matters is whether people understand each other, not whether they're using the "correct" words.
If they really do understand what each other is trying to say, it doesn't matter much what words they're using.
What annoys me is when half of the comments is about form, not about substance. It's understandable, people (including me) love to correct mistakes of other people, but it still annoys me.
When people start misusing a word, it loses its original meaning. There is an avoidable period of confusion between the moment people start using word A to mean a subset of A and when everyone agrees A refers to a subset of what previously was A.
I think a lot of people confuse parallelism with concurrency. The easiest analogy I can think of is this:
1. Concurrent means having two cups of water, one in each hand, and drinking (think CPU computation) a little bit from one, then switching to the other. While you drink from one cup, someone is filling up the other (think socket IO).
2. Parallel means having two cups, one in each hand, and lifting them up and drinking from both at the exact same time.
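A rough single-threaded sketch of the "two cups" interleaving (generators standing in for the drinkers; the names are just illustrative): only one sip ever happens at an instant, yet both cups make progress.

```python
# Two "cups": each generator yields one sip at a time.
def drink(cup, sips):
    for i in range(sips):
        yield f"{cup} sip {i + 1}"

# Concurrent (interleaved) drinking with a single "mouth"/thread:
# switch cups between sips until both are empty.
def interleave(*cups):
    cups = [iter(c) for c in cups]
    order = []
    while cups:
        for c in list(cups):
            try:
                order.append(next(c))
            except StopIteration:
                cups.remove(c)
    return order

order = interleave(drink("left", 2), drink("right", 2))
# Sips alternate: left, right, left, right — interleaved, not simultaneous.
```

Parallel drinking would instead need two mouths (cores): both `next()` calls happening at literally the same time.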
The confusion is worsened by the fact that "concurrent" literally means "at the same time", so calling interleaved timeslices "concurrent" is an oxymoron.
The fact that it's so hard to remember which is supposed to be "concurrency" and which "parallelism" is an indicator of how weakly these words are bound to those meanings.
I like to imagine that concurrent processes, concur (agree) on how they should share the time slices of the CPU, and parallel processes don't ever give a damn about each other, cause each has its own core, and like parallel lines, they never meet.
Now, for clarity purposes I would add that in fact concurrent processes don't "choose" per se when to run; that's the scheduler's job.
I think the whole concurrent/parallel confusion is worsened even more by the fact that on a multi-core system concurrent processes can in fact be executed in parallel.
well, "concurrent" literally means "that run together", without specifying how the race takes place, perhaps the runners have to interleave their steps :-)
Oh! Of course you're right. This is brilliant, and clearly the right way to look at it. I hope more people notice what you've said here.
What ithkuil pointed out is that the root "cur" in "current" comes from the Latin for "to run". Thus "concurrent" does not literally mean "at the same time"—what I said was wrong. It literally means "running together". There's not any piece of that word that technically refers to time, so it's not an oxymoron to use it to describe interleaved timeslicing.
Perhaps it would be clearer if we spoke of "concurrent vs. simultaneous" processes rather than "concurrent vs. parallel". I'm not sure; I still don't think everyone is talking about the same things.
seriously, the whole issue of whether threaded code is really running in parallel or not (i.e. whether adding more CPUs will make the code run faster) is misleading.
Context switches that happen without the compiler being explicitly aware of them can yield similar issues, whether the context switch is done in software or memory accesses are interleaved because the code is genuinely running on multiple execution units.
The problem stems from the fact that both the compiler and the processor might perform memory access in a different order than what you'd expect. I'd suggest an interesting read about it at http://ridiculousfish.com/blog/posts/barrier.html
Asynchronous programming lets you effectively process one event at a time, where things happen exactly as defined by a simple programming model, and the compiler can know what can safely be done to produce the requested side effects.
If the grain of the events is fine enough, you can achieve the same effect as being concurrent from the point of view of the task being performed, while nothing is actually concurrent from the point of view of the code that is running.
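A minimal sketch of that idea, assuming a toy dispatch loop (the event names and handlers are made up): handlers run to completion one at a time on a single thread, yet fine-grained events make the tasks appear to progress together.

```python
from collections import deque

# A minimal single-threaded event loop: exactly one handler runs at any
# moment, in a fully deterministic order.
def run_loop(events, handlers):
    queue = deque(events)
    log = []
    while queue:
        name, payload = queue.popleft()
        log.append(handlers[name](payload))
    return log

# Hypothetical handlers for two interleaved "tasks".
handlers = {
    "read":  lambda p: f"read {p}",
    "write": lambda p: f"wrote {p}",
}

log = run_loop([("read", "x"), ("write", "y"), ("read", "z")], handlers)
# Nothing runs simultaneously, but both kinds of work make progress.
```

From the code's point of view there is no concurrency at all; the apparent concurrency lives entirely in the interleaving of fine-grained events.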
Thus, it's not about the definition of concurrency per se, but about what is being concurrent in the system.
Yes. Very unfortunate. If they were replaced by random new labels like "frob" or "zbring" it would probably be easier to convey the ideas behind them. Existing colloquial meanings and interpretations just cause confusion.
Maybe that is why they stick with Latin when practicing law. Each term then is in a separate language and less likely to cause confusion or collisions with the English language.
To state the obvious, you're attempting to make distinctions that either don't exist, or do not have a consensus. You need to find new words. Your definition of concurrent is just... wrong.
Oh? Then why don't they say anything in their posts that disagrees with my view on the subject? (Hint: You're replying to the first comment I made in this thread, so the idea that I used the words "concurrent" or "parallel" in any particular way, much less wrongly, is objectively incorrect.)
Maybe my post wasn't clear enough, but I wasn't trying to illustrate the word "concurrent" in its literal sense, but rather from a programming perspective. (also read gruseom's comment above)
Concurrent and parallel, in their literal senses can be synonyms – and sure enough, if you read the first source you've just linked, at 2a you'll see concurrent defined as "in parallel". In the case of programming, a concurrent program is not parallel, though it could be if you have multiple cores.
Your post described only a possible manifestation of concurrency, and attempted to define that as what concurrency is in some sort of opposition to "parallel".
I suspect we're actually in agreement that ideally "concurrent" would be a description of capability and "parallel" a manifestation of that capability, but the fact is you're never going to get everyone to agree on (or remember) that[1]. So, again, new words are needed.
[1] Edit: I just found the later post of yours where you said this:
"I like to imagine that concurrent processes, concur (agree) on how they should share the time slices of the CPU, and parallel processes don't ever give a damn about each other, cause each has its own core, and like parallel lines, they never meet."
So, yet another novel definition of "concurrent". And yet you think other people are wrong. Heh.
It's great, but all these definitions (this one, the ones at https://news.ycombinator.com/item?id=6270128 and so on) have slightly different meanings. It must be maddening to anyone trying to get it for the first time.
For example, are parallel processes always also concurrent? It's hard to imagine a more elementary question, yet the different definitions don't all answer it the same way. That alone casts some doubt on how well-defined these terms are to begin with.
concurrency is a property of the algorithm, parallelism is a property of the execution environment
To expand on the above: this means that a particular problem can be talked about in terms of smaller sub-problems. Example: you are serving a site; the sub-problems are handling each client request. Another example: the problem is "crack a password via brute force"; the sub-problem is "try one particular password". This is where the discussion about whether the sub-problems are concurrent or not comes in. We are not saying anything yet about how they'll run.
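A toy sketch of the brute-force example (the target string and four-letter alphabet are made up for illustration): each candidate check is an independent sub-problem, so the problem is concurrent; here we happen to execute the checks on a thread pool.

```python
import concurrent.futures
import itertools
import string

TARGET = "ab"  # toy "password", purely illustrative

# One independent sub-problem: try a single candidate.
def try_candidate(candidate):
    return candidate if candidate == TARGET else None

# The full problem, decomposed into sub-problems: all 2-letter
# candidates over a tiny alphabet.
candidates = ["".join(p)
              for p in itertools.product(string.ascii_lowercase[:4], repeat=2)]

# The sub-problems are concurrent; executing them in parallel is a
# choice we make here, not a property of the problem.
with concurrent.futures.ThreadPoolExecutor() as pool:
    found = next(r for r in pool.map(try_candidate, candidates) if r)
```

Swapping the pool for a plain for loop would solve the same concurrent problem serially — the decomposition doesn't change.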
> For example, are parallel processes always also concurrent?
Not sure what you mean by that. Are these processes solving one particular problem? Concurrency and parallelism make sense for a particular problem or algorithm. How are these processes related? Do they just happen to run on the same machine but are otherwise solving separate problems? Then maybe it doesn't even make sense to talk about either concurrency or parallelism.
Now you can turn this on its head and look at it from the point of view of a kernel designer. His very simplified algorithm is "fairly schedule processes and IO" for all the users. So his problem now deals with any two processes, but these are now all part of one problem.
I guess I am trying to say that some questions just don't make sense to ask.
It is very common to describe pre-emptive multitasking on a single core as concurrent processing. Your definition is a general dictionary definition - it does not necessarily fit well with usage in technology.
You're right, both "parallel" and "concurrent" are ambiguous if using the dictionary definitions. Computer science has added somewhat new definitions to both of those terms but they're not really universally known or understood.
It'd be nice if new terms were used entirely, really.
No, multithreading is an OS abstraction providing the illusion of parallelism - which might actually be parallel, if more than one hardware executor is available - and is an implementation technique for concurrency, but not the only one.