Granularity of Parallel Systems

12 Questions

What does the granularity of a task measure?

The amount of work performed by a task

How is the granularity G of a task calculated?

G = Tcomp / Tcomm

What is fine-grained parallelism characterized by?

A large number of small tasks

What is the benefit of fine-grained parallelism?

Facilitates load balancing

What is an alternative way to specify granularity?

In terms of the execution time of a program

What is the purpose of considering granularity in parallel systems?

To take into account the communication overhead between processors

What is an example of a fine-grained system from outside the parallel computing domain?

The system of neurons in our brain

What occurs in coarse-grained parallelism when some tasks process the bulk of the data while others sit idle?

Load imbalance

What is the advantage of coarse-grained parallelism?

Low communication and synchronization overhead

What is medium-grained parallelism a compromise between?

Fine-grained and coarse-grained parallelism

What can result from scaling a parallel system down to fewer processors?

Performance can improve, because communication overhead falls relative to computation

Where is optimal performance achieved in parallel and distributed computing?

Between the extremes of fine-grained and coarse-grained parallelism

Study Notes

Granularity of Parallel Systems

  • Granularity is a measure of the amount of work (or computation) performed by a task.
  • It can also be defined as the ratio of computation time to communication time, wherein:
    • Computation time is the time required to perform the computation of a task.
    • Communication time is the time required to exchange data between processors.

Calculating Granularity

  • Granularity (G) can be calculated as: G = Tcomp / Tcomm
  • Granularity is usually measured in terms of the number of instructions executed in a particular task.
  • Alternatively, it can be specified in terms of the execution time of a program, combining the computation time and communication time.
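
A minimal sketch of the calculation in Python; the helper name and the timing values are illustrative placeholders, not measurements from any particular system:

    def granularity(t_comp: float, t_comm: float) -> float:
        """Ratio of computation time to communication time for a task."""
        return t_comp / t_comm

    # A task that computes for 8 ms and spends 2 ms exchanging data
    # is relatively coarse: G = 4.0.
    print(granularity(t_comp=8e-3, t_comm=2e-3))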

Types of Parallelism

Fine-grained Parallelism

  • A program is broken down into a large number of small tasks.
  • These tasks are assigned individually to many processors.
  • The amount of work associated with a parallel task is low and the work is evenly distributed among the processors.
  • Example: The system of neurons in our brain.
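
As a rough illustration (not taken from the source notes), the sketch below uses Python's multiprocessing.Pool to hand out one tiny task per data element, which is the fine-grained pattern: many small tasks, easy load balancing, but high scheduling and communication overhead.

    from multiprocessing import Pool

    def tiny_task(x):
        # Very little computation per task -> fine-grained.
        return x * x

    if __name__ == "__main__":
        with Pool(processes=8) as pool:
            # chunksize=1 hands out one element at a time: load is balanced
            # almost perfectly, but every element costs a round of communication.
            results = pool.map(tiny_task, range(10_000), chunksize=1)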

Coarse-grained Parallelism

  • A program is split into large tasks.
  • A large amount of computation takes place within each processor between communication steps.
  • This might result in load imbalance, where certain tasks process the bulk of the data while others might be idle.
  • Advantage: Low communication and synchronization overhead.
  • Example: Message-passing architecture.
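
A contrasting sketch, under the same illustrative assumptions: the data is split into one large block per worker, so communication happens only when the blocks are handed out and the partial results are collected. If one block happened to contain most of the work, the load imbalance described above would appear.

    from multiprocessing import Pool

    def big_task(block):
        # A large amount of computation per task -> coarse-grained.
        return sum(x * x for x in block)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        n_workers = 4
        step = len(data) // n_workers
        blocks = [data[i * step:(i + 1) * step] for i in range(n_workers)]
        with Pool(processes=n_workers) as pool:
            partial_sums = pool.map(big_task, blocks)
        total = sum(partial_sums)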

Medium-grained Parallelism

  • A compromise between fine-grained and coarse-grained parallelism.
  • Task size and communication time are greater than in fine-grained parallelism and lower than in coarse-grained parallelism.
  • Example: General-purpose parallel computers.
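
Continuing the same illustrative example, medium granularity amounts to choosing a chunk size between the two extremes; the chunksize values below are hypothetical and would normally be tuned by measurement.

    from multiprocessing import Pool

    def task(x):
        return x * x

    if __name__ == "__main__":
        data = range(100_000)
        with Pool(processes=8) as pool:
            fine = pool.map(task, data, chunksize=1)         # many tiny tasks
            coarse = pool.map(task, data, chunksize=12_500)  # one block per worker
            medium = pool.map(task, data, chunksize=256)     # compromise in between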

Effects of Granularity in Parallel and Distributed Computing

  • Scaling down a parallel system means using fewer than the maximum possible number of processing elements to execute a parallel algorithm.
  • Scaling down increases the amount of computation each processing element performs relative to its communication, which can improve performance.
  • Optimal performance is therefore usually found between the two extremes of fine-grained and coarse-grained parallelism, as the sketch below illustrates.
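
The sketch below uses a simple back-of-the-envelope cost model (an assumption for illustration, not a measurement): parallel run time is modeled as compute time divided by the processor count plus a per-processor communication cost. Speedup then peaks at a processor count well below the maximum, which is the scaling-down effect described above.

    T_SERIAL = 100.0  # hypothetical total computation time (arbitrary units)
    T_COMM = 0.5      # hypothetical communication cost added per processor

    def parallel_time(p: int) -> float:
        return T_SERIAL / p + T_COMM * p

    best_p = min(range(1, 65), key=parallel_time)
    print("best processor count:", best_p)                         # 14 under this model
    print("speedup at best p:", T_SERIAL / parallel_time(best_p))  # ~7.1
    print("speedup at p = 64:", T_SERIAL / parallel_time(64))      # ~3.0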

Understand the concept of granularity in parallel systems, including how it is defined in terms of computation time and communication time, and how granularity affects parallel processing and task distribution.
