- Parallel computing is more closely tied to multi-threading, i.e., to making full use of the CPUs on a single machine.
- Distributed computing refers to the notion of divide and conquer, executing sub-tasks on different machines and then merging the results.
However, since we entered the Big Data era, the distinction seems to be blurring, and most systems today use a combination of parallel and distributed computing. An example from my day-to-day job is Hadoop with the Map/Reduce paradigm: a clearly distributed system, with workers executing tasks on different machines, that also takes full advantage of each machine with some parallel computing.

I would like some advice on how exactly to make the distinction in today's world, and whether we can still talk about parallel computing, or whether there is no longer a clear distinction. It seems to me that distributed computing has grown a lot over the past years, while parallel computing has stagnated, which could probably explain why I hear much more talk about distributing computations than about parallelizing them.
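To make the contrast concrete, here is a minimal sketch (my own illustration, not Hadoop's actual API): the "parallel" part uses multiple processes on one machine, while the map and reduce phases are only simulated in-process; in a real Map/Reduce job each chunk would be shipped to a different node.

```python
from multiprocessing import Pool


def square_sum(chunk):
    # The per-worker task: each worker handles one shard of the data.
    return sum(x * x for x in chunk)


def map_phase(chunks, mapper):
    # In a distributed system, each chunk would go to a different machine;
    # here we just apply the mapper locally to simulate that step.
    return [mapper(c) for c in chunks]


def reduce_phase(partials):
    # Merge the partial results produced by the workers.
    return sum(partials)


if __name__ == "__main__":
    data = list(range(1_000))
    chunks = [data[i::4] for i in range(4)]  # split the input into 4 shards

    # Parallel computing: 4 processes on this one machine.
    with Pool(4) as pool:
        parallel_result = reduce_phase(pool.map(square_sum, chunks))

    # Distributed computing (simulated): map over shards, then reduce.
    distributed_result = reduce_phase(map_phase(chunks, square_sum))

    assert parallel_result == distributed_result
    print(parallel_result)
```

The point of the sketch is that the decomposition (split, map, merge) is the same in both cases; what differs is whether the workers are cores on one machine or separate machines, which is why real systems like Hadoop end up doing both at once.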
Asked By: Charles Menguy
Best Answer from StackOverflow
Question Source: http://cs.stackexchange.com/questions/1580