
# Parallel Computing Lectures

**Synchronism (Synchronization):**

* **Synchronous System:** Assumes communication times are fixed and each processor has an internal clock. A message sent at time t by processor u must arrive at processor v by a known, fixed time t + Δ. Each processor repeats three tasks:
  * Task 1: Send messages to one or more neighboring processors.
  * Task 2: Receive messages from one or more neighboring processors.
  * Task 3: Perform local calculations. Local calculation time is assumed negligible compared to message-transfer time, so computation time is effectively determined by the waiting time for messages from other processors.
* **Asynchronous System:** Computation is triggered by events rather than by a global clock:
  * Event E1: perform action X1 when the message triggering E1 arrives.
  * Event E2: perform action X2 when the message triggering E2 arrives.
  * In short: E1 ⇒ X1 and E2 ⇒ X2.

**Multitasking:**

* Multitasking is a process-management technique that lets a single processor handle multiple tasks by interleaving them, so they appear to run simultaneously. It allows different programs (e.g., printing, storage) to run and interact without one waiting for another to finish.
* In multiprocessor systems, algorithms build on multitasking: each processing unit handles a portion of the problem concurrently, and the final result is the combined output from all units.
* Programming for multitasking requires understanding how instructions operate on data.

**Sources and Limits of Parallelism:**

* **Data Parallelism (Fig. 6-1):** This approach applies the same operation to different data items. Each processor (P1, P2, ..., Pn) applies the operation to a different dataset (D1, D2, ..., Dn), and the outputs (O1, O2, ..., On) are the results produced by those processors.
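The asynchronous, event-driven model above (E1 ⇒ X1, E2 ⇒ X2) can be sketched as a small message dispatcher. This is a minimal illustration, not code from the lectures; the event tags, handler names, and message format are all assumptions chosen for the example.

```python
# Sketch of an asynchronous, event-driven processor: each incoming
# message carries an event tag, and the processor reacts by running
# the action registered for that tag. E1/E2/X1/X2 are illustrative.

handlers = {}

def on(event):
    """Register an action to run when the given event's message arrives."""
    def register(action):
        handlers[event] = action
        return action
    return register

@on("E1")
def x1(payload):
    # Action X1, triggered by event E1.
    return f"X1 handled {payload}"

@on("E2")
def x2(payload):
    # Action X2, triggered by event E2.
    return f"X2 handled {payload}"

def receive(message):
    """Dispatch an incoming (event, payload) message to its action."""
    event, payload = message
    return handlers[event](payload)

print(receive(("E1", "msg-a")))  # -> X1 handled msg-a
print(receive(("E2", "msg-b")))  # -> X2 handled msg-b
```

The key contrast with the synchronous model is that nothing here depends on a clock: an action runs whenever its triggering message happens to arrive.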

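The data-parallel pattern (one operation, many datasets, one processor each) can be sketched with a worker pool. This is an illustrative example, not the lectures' code: the squaring operation, the worker count, and the sample data are all assumptions standing in for the operation and D1..Dn of Fig. 6-1.

```python
# Sketch of data parallelism: the same operation is applied to
# different data items D1..Dn by separate workers, and the outputs
# O1..On are collected in order. The operation here (squaring) is
# just a placeholder for whatever the processors compute.
from concurrent.futures import ProcessPoolExecutor

def operation(d):
    """The single operation every processor applies (here: squaring)."""
    return d * d

def data_parallel_map(op, datasets, workers=3):
    """Run op on each dataset on a separate worker; collect outputs in order."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(op, datasets))

if __name__ == "__main__":
    outputs = data_parallel_map(operation, [1, 2, 3, 4])
    print(outputs)  # -> [1, 4, 9, 16]
```

Note that the parallelism comes from the data, not the code: every worker runs the identical `operation`, and the available parallelism is limited by how many independent data items there are.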