Hacker News

I think a lot of people confuse parallelism with concurrency. The easiest analogy I can think of is this:

1. Concurrent means having two cups of water, one in each hand, and drinking (think CPU computation) a little from one, then switching to the other. While you drink from one cup, someone is filling up the other (think socket IO).

2. Parallel means having two cups, one in each hand, and lifting them up and drinking from both at exactly the same time.
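The two-cups analogy can be sketched in a few lines of Python (my own toy illustration, not from the comment above): a single thread, two tasks, each taking a sip and then yielding so the other gets a turn.

```python
import asyncio

async def drink(name, sips, log):
    # One sip (CPU work), then put the cup down (yield to the event loop)
    for i in range(sips):
        log.append(f"{name} sip {i}")
        await asyncio.sleep(0)  # cooperative switch to the other task

async def main():
    log = []
    # One thread, two tasks taking turns: concurrency without parallelism
    await asyncio.gather(drink("left", 2, log), drink("right", 2, log))
    return log

log = asyncio.run(main())
print(log)  # sips interleave: left, right, left, right
```

There is only ever one "mouth" (execution unit) here; the parallel version would need two threads on two cores actually running at once.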



The confusion is worsened by the fact that "concurrent" literally means "at the same time", so calling interleaved timeslices "concurrent" is an oxymoron.

The fact that it's so hard to remember which is supposed to be "concurrency" and which "parallelism" is an indicator of how weakly these words are bound to those meanings.


I think you've very much nailed it.

I like to imagine that concurrent processes concur (agree) on how they should share the time slices of the CPU, while parallel processes never give a damn about each other, because each has its own core, and, like parallel lines, they never meet.

Now, for clarity's sake, I would add that concurrent processes don't actually "choose" when to run; that's the scheduler's job.

I think the whole concurrent/parallel confusion is worsened even more by the fact that on a multi-core system concurrent processes can in fact be executed in parallel.


well, "concurrent" literally means "that run together", without specifying how the race takes place; perhaps the runners have to interleave their steps :-)


Oh! Of course you're right. This is brilliant, and clearly the right way to look at it. I hope more people notice what you've said here.

What ithkuil pointed out is that the root "cur" in "current" comes from the Latin for "to run". Thus "concurrent" does not literally mean "at the same time"—what I said was wrong. It literally means "running together". There's not any piece of that word that technically refers to time, so it's not an oxymoron to use it to describe interleaved timeslicing.

Perhaps it would be clearer if we spoke of "concurrent vs. simultaneous" processes rather than "concurrent vs. parallel". I'm not sure; I still don't think everyone is talking about the same things.


seriously, the whole issue of whether threaded code is really running in parallel or not (i.e. whether adding more CPUs will make the code run faster) is misleading.

Context switches that the compiler isn't explicitly aware of can cause the same kinds of issues whether the switch happens in software or whether memory accesses are interleaved because the code is genuinely running on multiple execution units.

The problem stems from the fact that both the compiler and the processor might perform memory access in a different order than what you'd expect. I'd suggest an interesting read about it at http://ridiculousfish.com/blog/posts/barrier.html
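Python can't demonstrate compiler or hardware reordering, but the underlying interleaving hazard itself is easy to sketch (my own example, hypothetical names): a read-modify-write on shared state is not one step, so another thread can slip in between the read and the write unless a lock serializes them.

```python
import threading

def increment_many(counter, n, lock):
    # counter["value"] += 1 is really: read, add, write back.
    # Without the lock, another thread could interleave between those
    # steps and its update would be lost.
    for _ in range(n):
        with lock:
            counter["value"] += 1

counter = {"value": 0}
lock = threading.Lock()
threads = [threading.Thread(target=increment_many, args=(counter, 100_000, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter["value"])  # with the lock, always 400000
```

Drop the `with lock:` and the final count can silently fall short, whether the interleaving came from a software context switch or from true parallel execution.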

Asynchronous programming lets you process effectively one event at a time, where things happen exactly as defined by a simple programming model, and the compiler knows what can safely be done to produce the requested side effects.

If the grain of the events is fine enough, you can achieve the same effect as concurrency from the point of view of the task being performed, while nothing is actually concurrent from the point of view of the code that is running.

Thus, it's not about the definition of concurrency per se, but about what is being concurrent in the system.
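A minimal sketch of that one-event-at-a-time model (my own toy example, not anyone's real event loop): handlers run to completion one at a time, so shared state needs no locks, yet fine-grained events still interleave work.

```python
from collections import deque

def run_events(handlers, initial):
    # Single-threaded event loop: exactly one handler runs at a time,
    # so nothing else can touch `state` while a handler is mid-update.
    queue = deque(initial)
    state = {"count": 0}
    while queue:
        kind = queue.popleft()
        handlers[kind](state, queue)
    return state

def on_tick(state, queue):
    state["count"] += 1
    if state["count"] < 3:
        queue.append("tick")  # handlers may schedule further events

state = run_events({"tick": on_tick}, ["tick"])
print(state["count"])  # -> 3
```

From the outside the ticks look like concurrent activity; from the inside it is a strictly sequential loop.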


Yes. Very unfortunate. If they were replaced by random new labels like "frob" or "zbring", it would probably be easier to convey the ideas behind them. Existing colloquial meanings and interpretations just cause confusion.

Maybe that is why they stick with Latin when practicing law. Each term is then in a separate language and less likely to cause confusion or collide with colloquial English.


http://www.merriam-webster.com/dictionary/concurrent

http://en.wiktionary.org/wiki/concurrent

http://en.wikipedia.org/wiki/Concurrency_(computer_science)

To state the obvious, you're attempting to make distinctions that either don't exist, or do not have a consensus. You need to find new words. Your definition of concurrent is just... wrong.



Oh? Then why don't they say anything in their posts that disagrees with my view on the subject? (Hint: You're replying to the first comment I made in this thread, so the idea that I used the words "concurrent" or "parallel" in any particular way, much less wrongly, is objectively incorrect.)


Maybe my post wasn't clear enough, but I wasn't trying to illustrate the word "concurrent" in its literal sense, but rather from a programming perspective. (also read gruseom's comment above)

Concurrent and parallel, in their literal senses can be synonyms – and sure enough, if you read the first source you've just linked, at 2a you'll see concurrent defined as "in parallel". In the case of programming, a concurrent program is not parallel, though it could be if you have multiple cores.


Your post described only a possible manifestation of concurrency, and attempted to define that as what concurrency is in some sort of opposition to "parallel".

I suspect we're actually in agreement that ideally "concurrent" would be a description of capability and "parallel" a manifestation of that capability, but the fact is you're never going to get everyone to agree on (or remember) that[1]. So, again, new words are needed.

[1] Edit: I just found the later post of yours where you said this:

"I like to imagine that concurrent processes, concur (agree) on how they should share the time slices of the CPU, and parallel processes don't ever give a damn about each other, cause each has its own core, and like parallel lines, they never meet."

So, yet another novel definition of "concurrent". And yet you think other people are wrong. Heh.


I like this quick comic from Joe Armstrong (Erlang's "father").

http://joearms.github.io/2013/04/05/concurrent-and-parallel-...


It's great, but all these definitions (this one, the ones at https://news.ycombinator.com/item?id=6270128 and so on) have slightly different meanings. It must be maddening to anyone trying to get it for the first time.

For example, are parallel processes always also concurrent? It's hard to imagine a more elementary question, yet the different definitions don't all answer it the same way. That alone casts some doubt on how well-defined these terms are to begin with.


What helped me understand was this point

concurrency is a property of the algorithm, parallelism is a property of the execution environment

To expand on the above: a particular problem can be talked about in terms of smaller sub-problems. For example, you are serving a site; the sub-problems are handling each client request. Another example: the problem "crack a password by brute force" has the sub-problem "try one particular password". This is where the discussion of whether there are concurrent sub-problems comes in. We are not yet sure how they'll run.
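The brute-force example makes the split concrete (a sketch of my own; the candidate list and "password" are hypothetical): the concurrent structure lives in the algorithm, and swapping the execution environment from serial to parallel changes nothing about it.

```python
from concurrent.futures import ThreadPoolExecutor

def try_password(candidate):
    # One sub-problem of the brute-force problem: test one candidate
    return candidate == "hunter2"  # hypothetical target password

candidates = ["12345", "hunter2", "qwerty"]

# Same concurrent structure, serial execution environment:
serial = [try_password(c) for c in candidates]

# Same concurrent structure, parallel execution environment:
with ThreadPoolExecutor(max_workers=3) as pool:
    parallel = list(pool.map(try_password, candidates))

print(serial == parallel)  # -> True: the algorithm didn't change
```

The sub-problems were concurrent (independent) all along; parallelism is just one way the environment may choose to run them.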

>For example, are parallel processes always concurrent?

Not sure what you mean by that. Are these processes solving one particular problem? Concurrency and parallelism make sense for a particular problem or algorithm. How are these processes related? If they just happen to run on the same machine but are otherwise solving separate problems, then maybe it doesn't even make sense to talk about either concurrency or parallelism.

Now you can turn this on its head and look at it from the point of view of a kernel designer. His (very simplified) algorithm is "fairly schedule processes and IO" for all the users. So his problem now deals with any two processes, but they are all part of one problem.

I guess I am trying to say that some questions just don't make sense to ask.


What that analogy misses is that drinking two cups at the same time is both parallel and concurrent.



