Sebha University
# Parallel Computing

**Abstract**

This document discusses concurrency and parallelism in the context of parallel computing, with a particular focus on distributed computing.

**Concurrency vs. Parallelism**

* **Parallelism**: multiple activities happen simultaneously. For example, four tasks executing on four separate CPUs at the same time.
* **Concurrency**: the ability of multiple tasks to make progress at the same time. It includes parallelism, but also covers apparent parallelism, such as time-sharing, where multiple tasks are interleaved on a single CPU.

**Distributed Computing**

* Distributed computing runs computations across multiple independent computing units, often separated by considerable distances. This typically requires high-capacity communication networks.

**Aims of Distributed Systems**

* Increase computational capacity at reduced cost.
* Improve computing power by adding or grouping computing units.

**Architectural Models**

* **Message Passing**: communication takes place through the exchange of messages. When a processor needs to communicate with another, it sends a message, which is then routed across the network connections.
* **Shared Memory**: processors communicate through a joint memory space; each processor reads from and/or writes to the shared memory locations. A special case of shared memory is the PRAM model (shown in Figure 4-1).
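To make the concurrency/parallelism distinction concrete, here is a minimal Python sketch (not from the document): two CPU-bound tasks run on two threads, but under CPython's global interpreter lock they are time-shared rather than executed in true parallel, so this illustrates concurrency with only apparent parallelism.

```python
import threading

# Two CPU-bound tasks started concurrently on two threads.
# Under CPython's GIL they are interleaved (time-shared) on
# effectively one core: concurrency without true parallelism.
results = {}

def count_up(name, n):
    total = 0
    for i in range(n):
        total += i
    results[name] = total

threads = [
    threading.Thread(target=count_up, args=("task1", 100_000)),
    threading.Thread(target=count_up, args=("task2", 100_000)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results["task1"], results["task2"])
```

Replacing the threads with separate processes (e.g. `multiprocessing`) would turn this into genuine parallelism on a multi-core machine.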
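The message-passing model described above can be sketched as follows; this is an illustrative example, not from the document, using Python threads to stand in for processors and queues to stand in for the network links. The worker communicates with the main task only by exchanging messages, never by touching each other's variables.

```python
import queue
import threading

# Message-passing sketch: the "processors" (threads here) interact
# only by sending messages over queues, which play the role of
# network connections between computing units.
def worker(inbox, outbox):
    while True:
        msg = inbox.get()
        if msg is None:        # sentinel message: shut down
            break
        outbox.put(msg * msg)  # reply with the square of the request

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()

for x in (2, 3, 4):
    inbox.put(x)               # send three request messages
replies = [outbox.get() for _ in range(3)]
inbox.put(None)                # send the stop message
t.join()
print(replies)                 # prints [4, 9, 16]
```

In a real distributed system the queues would be replaced by sockets or a messaging library, but the structure, a send followed by a matching receive, is the same.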
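The shared-memory model can be sketched the same way; again this is an illustrative example rather than the document's own, with threads standing in for processors and a module-level variable standing in for the joint memory space. A lock serializes writes so that concurrent increments are not lost.

```python
import threading

# Shared-memory sketch: several "processors" (threads) read and
# write one shared counter. The lock makes each read-modify-write
# atomic, so no increment is lost to interleaving.
counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:             # exclusive access to the shared location
            counter += 1

workers = [threading.Thread(target=increment, args=(10_000,))
           for _ in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(counter)                 # prints 40000
```

The idealized PRAM model mentioned above abstracts exactly this setup: many processors accessing one shared memory, with the model's rules deciding how simultaneous accesses are resolved.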