Questions and Answers
Which of the following accurately describes a thread?
- A separate memory space within a process
- An isolated environment for running applications
- The basic unit of CPU utilization (correct)
- A virtual machine instance
Threads in the same process have separate address spaces.
False
What are the four basic components of a thread?
A thread ID, a program counter, a register set, and a stack
In a multithreaded process, multiple threads share the same ______ space.
Match the process type with its correct description:
Which of the following is a key benefit of multithreaded programming?
Multithreading is ineffective on multicore hardware.
Name four key advantages of multithreaded programming.
Multithreading allows a program to continue running even if part of it is ______ or performing a lengthy operation, improving responsiveness.
Match the benefit with its description:
What is the primary difference between concurrency and parallelism?
Concurrency is always achieved through parallelism.
Explain the difference between data parallelism and task parallelism.
In ________ parallelism, subsets of data are distributed across multiple cores for simultaneous processing.
Match the type of parallelism with its description:
What does thread cancellation involve?
The 'target thread' refers to the thread that initiates the cancellation process.
Describe two different scenarios in which thread cancellation may occur.
In ________ cancellation, one thread immediately terminates the target thread.
Match the type of thread cancellation with its description:
Which cancellation mode is generally safer in thread cancellation?
Asynchronous cancellation ensures that necessary system-wide resources are always freed upon thread termination.
What is the main advantage of deferred mode in thread cancellation?
In deferred cancellation, the target thread checks a ______ to determine whether or not it should be canceled.
Match the thread type with its description:
In the many-to-one multithreading model, multiple user threads are mapped to:
In the one-to-one model, creating a user thread does not require creating a corresponding kernel thread.
What is a disadvantage of the many-to-one multithreading model?
In the ________ multithreading model, multiple user-level threads are multiplexed to a smaller or equal number of kernel threads.
Match the multithreading model to its description:
What is the main purpose of a thread library?
A thread library in user space requires kernel support for thread management.
Name two well-known thread libraries.
Pthreads, the threads extension of the POSIX standard, can be provided as either a user-level or ________-level library.
Match the thread library with its typical implementation level:
In asynchronous thread creation, what is the relationship between the parent and child threads?
In synchronous thread creation, the parent thread resumes its execution immediately after creating a child thread.
What is the key difference between asynchronous and synchronous strategies for creating multiple threads?
In _________ threading, data sharing among threads is significant.
Match the type of thread creation strategy with its characteristic:
What is implicit threading?
Implicit threading requires programmers to explicitly manage all thread synchronization.
Name three methods of implicit threading.
_______ threading is mainly the use of libraries or other language support to hide the management of threads.
Match the method with its common use in implicit threading:
Flashcards
What is a Thread?
The basic unit of CPU utilization; modern systems allow multithreading.
Thread address space
Threads share the same address space. Inter-thread communication is easier and faster than inter-process communication (IPC).
Thread Components
A thread has a thread ID, a program counter, a register set, and a stack.
Responsiveness (Multithreading)
A program can continue running even if part of it is blocked or performing a lengthy operation.
Resource Sharing (Multithreading)
Threads share the memory and resources of their process by default, enabling several different threads of activity within the same address space.
Economy (Multithreading)
Creating and context-switching threads is more economical than processes because threads share the resources of the process to which they belong.
Scalability (Multithreading)
Threads can run in parallel on different processing cores of a multiprocessor; a single-threaded process is limited to one processor.
Concurrency vs Parallelism
A parallel system can perform more than one task simultaneously; a concurrent system supports more than one task by allowing all tasks to make progress.
Concurrency and Cores
Concurrency can be achieved on a single core as well as on multiple cores; parallelism requires multiple cores.
Data Parallelism
Distributes subsets of the data across multiple cores and performs the same operation on each subset.
Task Parallelism
Distributes threads across cores, with each thread performing a unique operation.
Thread Cancellation
Terminating a thread before it has completed.
Asynchronous Cancellation
One thread immediately terminates the target thread.
Deferred Cancellation
The target thread periodically checks whether it should terminate, allowing it to terminate itself in an orderly fashion.
User Threads
Threads supported above the kernel and managed without kernel support.
Kernel Threads
Threads supported and managed directly by the operating system.
Many-to-One Model
Maps many user threads to one kernel thread; the whole process blocks if one thread makes a blocking system call.
One-to-One Model
Maps each user thread to a kernel thread, allowing more concurrency and parallelism at the cost of creating a kernel thread for every user thread.
Many-to-Many Model
Multiplexes many user-level threads to a smaller or equal number of kernel threads.
Thread Library
Provides the programmer with an API for creating and managing threads.
Synchronous threading
The parent creates one or more children and waits for all of them to terminate before resuming; involves significant data sharing among threads.
Asynchronous threading
The parent resumes execution immediately after creating a child; parent and child run concurrently and independently, with little data sharing.
Implicit Threading
The creation and management of threads is handled by compilers and run-time libraries rather than by programmers.
Thread Pool
A number of threads created at process startup that wait in a pool for work.
Faster service
Servicing a request with an existing thread from the pool is faster than waiting to create a new thread.
Study Notes
- Education unlocks the world and is a passport to freedom, in the words of Oprah Winfrey.
Threads Outline
- Threads - definition and creation
- Benefits of multithreaded programming
- Thread cancellation
- Multicore programming and types of parallelism
- Multithreading models
- Thread libraries
- Implicit threading
Learning Outcomes
- Define a thread and know the difference between a thread and a process
- Know the basic components of a thread
- Differentiate between single and multithreaded processes
- Differentiate between shared and non-shared components between a thread and a process
- Know the benefits of multithreaded programming
- Know how a thread is created and cancelled with the different mechanisms of cancellation
- Define multi-core programming and differentiate between concurrency and parallelism
- Differentiate between the two main types of parallelism
- Differentiate between multi-threaded models
- Define thread libraries
- Know the importance of implicit threading and focus on thread pools
Thread Definition
- A thread represents the basic unit of CPU utilization.
- Modern systems support multithreading, allowing a single process to have multiple threads of execution, proceeding concurrently.
Thread vs. Process
- Threads of the same process share the same address space, unlike processes.
- This makes concurrent programming easier from a programmer's perspective, although synchronization is still needed.
- Inter-thread communication is easier and faster than inter-process communication (IPC) because threads share the same address space of the owner process.
- Threads are faster to create and switch between.
- Multithreading is especially effective on multi-core hardware.
- Multithreading provides better memory usage compared to multiprocessing.
Thread Components
- Components include a thread ID, a program counter, a register set, and a stack (see the sketch below).
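As a rough illustration of these components, the hypothetical C struct below sketches a per-thread control block; the field names and the MAX_REGS constant are invented for illustration and do not correspond to any particular OS's real data structures.

```c
/* Hypothetical sketch of per-thread state (a "thread control block").
 * Real kernels and thread libraries use richer structures; this only
 * mirrors the four components listed above. */
#include <stdint.h>
#include <stddef.h>

#define MAX_REGS 32

typedef struct thread_control_block {
    int       tid;                   /* thread ID                        */
    uintptr_t program_counter;       /* next instruction to execute      */
    uintptr_t registers[MAX_REGS];   /* saved register set               */
    void     *stack_base;            /* base of this thread's own stack  */
    size_t    stack_size;
} tcb_t;
```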
Single vs. Multithreaded Process
- Single-threaded processes have code, data, and files along with registers and a stack for a single thread.
- Multithreaded processes have code, data, and files shared among multiple threads, each with its own registers and stack.
Multi-threaded System Example
- A multi-threaded server architecture involves a client sending a request to a server.
- The server creates a new thread to service the request.
- The server then resumes listening for additional client requests.
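A minimal sketch of this request-per-thread pattern with Pthreads is shown below. To stay self-contained it reads "requests" as lines from standard input instead of accepting network connections, and handle_request is an invented stand-in for real request processing.

```c
/* Sketch: one thread per "request". Lines read from stdin stand in for
 * client requests; a real server would accept() network connections. */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void *handle_request(void *arg) {       /* services one request   */
    char *request = arg;
    printf("servicing request: %s", request);
    free(request);
    return NULL;
}

int main(void) {
    char line[256];
    while (fgets(line, sizeof line, stdin) != NULL) {  /* "listen" loop  */
        pthread_t tid;
        char *copy = strdup(line);      /* give the thread its own copy  */
        if (pthread_create(&tid, NULL, handle_request, copy) != 0) {
            free(copy);
            continue;
        }
        pthread_detach(tid);    /* server resumes listening immediately  */
    }
    /* note: returning from main ends the process, including any threads
     * that are still running; a real server would loop forever. */
    return 0;
}
```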
Benefits of Multithreaded Programming
- Multithreading makes interactive applications more responsive by allowing a program to continue running even if one part is blocked.
- Continued execution is allowed if another part of the process is blocked.
- Threads share memory and resources by default, enabling several different threads of activity within the same address space.
- It is more economical to create and context-switch threads because threads share resources of the process to which they belong.
- In a multiprocessor architecture, threads can run in parallel on different processing cores.
- A single-threaded process is limited to one processor.
- Multithreading increases concurrency on multi-CPU machines.
Multicore Programming and Types of Parallelism
- A multi-core processor consists of multiple cores, each with its own individual memory, which share a common memory component and are connected via a bus interface.
Concurrency vs Parallelism
- A system is parallel if it can perform more than one task simultaneously.
- A concurrent system supports more than one task by allowing all the tasks to make progress.
- Concurrency is more general than parallelism.
- Concurrency can be achieved on a single core as well as on multiple cores.
- Parallelism occurs when concurrency uses multiple cores to run many tasks simultaneously.
Types of Parallelism
- Data parallelism distributes subsets of the data across many cores, performing the same operation on each (see the sketch after this list).
- Task parallelism distributes threads across cores, with each thread performing a unique operation.
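As a concrete illustration of data parallelism, the sketch below splits the summation of an array across a fixed number of Pthreads, each applying the same operation to its own subset of the data; the array contents and thread count are arbitrary choices for the example.

```c
/* Data parallelism sketch: each thread sums one contiguous slice of the
 * array (same operation, different subset of the data). */
#include <pthread.h>
#include <stdio.h>

#define N        1000
#define NTHREADS 4

static int  data[N];
static long partial[NTHREADS];

typedef struct { int id, lo, hi; } slice_t;

static void *sum_slice(void *arg) {
    slice_t *s = arg;
    long acc = 0;
    for (int i = s->lo; i < s->hi; i++)
        acc += data[i];
    partial[s->id] = acc;
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    slice_t   sl[NTHREADS];

    for (int i = 0; i < N; i++) data[i] = i + 1;   /* 1..N */

    int chunk = N / NTHREADS;
    for (int t = 0; t < NTHREADS; t++) {
        sl[t] = (slice_t){ t, t * chunk,
                           (t == NTHREADS - 1) ? N : (t + 1) * chunk };
        pthread_create(&tid[t], NULL, sum_slice, &sl[t]);
    }

    long total = 0;
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += partial[t];
    }
    printf("sum = %ld\n", total);   /* expect N*(N+1)/2 = 500500 */
    return 0;
}
```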
Thread Cancellation
- Thread cancellation terminates a thread before completion.
- It is useful in scenarios such as a parallel database search, where the remaining threads should be canceled once one thread has found the result, or a web page that stops loading, where the threads still fetching content are canceled.
Target Thread
- A thread that is to be canceled is often referred to as the target thread
Cancellation Scenarios
- Cancellation of a target thread may occur in two different scenarios:
- Asynchronous cancellation: One thread immediately terminates the target thread.
- Deferred cancellation: The target thread periodically checks whether it should terminate, allowing it an opportunity to terminate itself in an orderly fashion.
- The difficulty with cancellation lies in managing resources allocated to a canceled thread and handling shared data.
Cancellation Mode
- Deferred mode for thread cancellation is preferred because asynchronous mode may not free necessary system-wide resources.
- Deferred cancellation ensures that the cancellation occurs safely, at a point where the thread can be safely terminated.
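In Pthreads terms, deferred cancellation can be sketched as follows: pthread_cancel() only requests cancellation, and the target thread, running in the (default) deferred mode, notices the request at a safe point via pthread_testcancel(). The worker's "unit of work" is left as a placeholder comment.

```c
/* Deferred cancellation sketch with Pthreads: pthread_cancel() only marks
 * the target; the target notices the request at its next cancellation
 * point (here, an explicit pthread_testcancel()). */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *worker(void *arg) {
    (void)arg;
    pthread_setcanceltype(PTHREAD_CANCEL_DEFERRED, NULL);  /* default mode */
    for (;;) {
        /* ... do a small unit of work ... */
        pthread_testcancel();  /* safe point: terminate here if requested  */
    }
    return NULL;
}

int main(void) {
    pthread_t tid;
    pthread_create(&tid, NULL, worker, NULL);
    sleep(1);                 /* let the worker run for a while            */
    pthread_cancel(tid);      /* request cancellation of the target        */
    pthread_join(tid, NULL);  /* target exits at its next check            */
    printf("worker canceled\n");
    return 0;
}
```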
Types of Threads
- User threads are supported above the kernel and managed without kernel support.
- Kernel threads are supported and managed directly by the Operating System.
Thread Relation Types
- There are three common ways of establishing a relationship between user threads and kernel threads:
- Many-to-One Model
- One-to-One Model
- Many-to-Many Model
Many-to-One Model
- It maps many user threads to one Kernel Thread.
- Thread management is done by the thread library in user space, making it efficient.
- Disadvantages:
- The entire process is blocked if a thread makes a blocking system call.
- Since only one thread can access the kernel at a time, multiple threads cannot run in parallel on multiprocessors.
One-to-One Model
- Maps each user thread to a kernel thread.
- Provides more concurrency than the many-to-one model, by allowing another thread to run when a thread makes a blocking system call.
- Allows multiple threads to run in parallel on multiprocessors.
- Disadvantages:
- Creating a user thread requires creating the corresponding kernel thread.
- Can burden the performance of an application because of the overhead of creating kernel threads
- Most implementations restrict the number of threads supported by the system.
Many-to-Many Model
- It multiplexes many user-level threads to a smaller or equal number of kernel threads.
- The number of kernel threads is specific to either a particular application or machine.
- Developers can create as many user threads as necessary.
- Corresponding kernel threads can run in parallel on a multiprocessor.
- When a thread runs a blocking system call, the kernel can schedule another thread for execution.
Thread Libraries
- A thread library provides the programmer with an API for creating and managing threads.
- There are two primary ways of implementing a thread library:
- To provide a library entirely in user space with no kernel support
- To implement a kernel-level library supported directly by the OS
User-Level Thread Library
- Provides a library entirely in user space with no kernel support.
- All code and data structures for the library exist in user space.
- Invoking a function in the library results in a local function call in user space, not a system call.
Kernel-Level Thread Library
- Implements a kernel-level library supported directly by the OS.
- Code and data structures for the library exist in kernel space.
- Invoking a function in the API for the library typically results in a system call to the kernel.
Well-Known Libraries
- Pthreads, the threads extension of the POSIX standard, may be provided as either a user-level or kernel-level library.
- The Win32 thread library is a kernel-level library available on Windows systems.
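For comparison with the Pthreads examples elsewhere in these notes, a minimal Win32 sketch is shown below (it assumes a Windows build environment with <windows.h>): CreateThread creates a kernel-supported thread and WaitForSingleObject waits for it to finish.

```c
/* Minimal Win32 thread sketch (Windows only): create one thread and wait. */
#include <windows.h>
#include <stdio.h>

static DWORD WINAPI worker(LPVOID param) {
    int n = *(int *)param;
    printf("worker got %d\n", n);
    return 0;
}

int main(void) {
    int arg = 42;
    DWORD tid;
    HANDLE h = CreateThread(NULL, 0, worker, &arg, 0, &tid);
    if (h == NULL) return 1;
    WaitForSingleObject(h, INFINITE);  /* wait for the thread to finish */
    CloseHandle(h);
    return 0;
}
```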
Strategies
- There are two strategies for creating multiple threads: asynchronous and synchronous threading.
Asynchronous
- The parent resumes its execution as soon as it creates a child thread.
- The parent and child execute concurrently.
- Each thread runs independently of every other thread, and the parent thread need not know when its child terminates.
- Threads are independent - typically little data sharing between threads.
- It is the strategy used in the multithreaded server example (see the sketch below).
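A sketch of asynchronous threading with Pthreads: the child is created in the detached state, so the parent never joins it and resumes immediately. The sleep at the end is only there so the demo's child gets a chance to run before the process exits.

```c
/* Asynchronous threading sketch: the parent creates a detached child and
 * continues immediately; it never waits for (joins) the child. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *child(void *arg) {
    (void)arg;
    printf("child: running independently\n");
    return NULL;
}

int main(void) {
    pthread_attr_t attr;
    pthread_t tid;

    pthread_attr_init(&attr);
    pthread_attr_setdetachstate(&attr, PTHREAD_CREATE_DETACHED);
    pthread_create(&tid, &attr, child, NULL);
    pthread_attr_destroy(&attr);

    printf("parent: resumed immediately, not waiting for the child\n");
    sleep(1);   /* give the detached child a chance to run before exit */
    return 0;
}
```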
Synchronous
- The parent creates one or more children and then waits for all of the children to terminate before resuming.
- Threads created by the parent perform work concurrently, but the parent cannot continue until this work has been completed.
- Once each thread finishes its work, it terminates and joins its parent.
- The parent can resume execution only after all children have joined.
- Involves significant data sharing among threads.
Pthreads Example
- #include <pthread.h> is required to use the Pthreads API.
- Global data is shared by the threads.
- The child thread finishes by calling pthread_exit(0) (a reconstructed example follows below).
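A minimal program in the spirit of these bullets is reconstructed below; it is a sketch, not the exact slide code. The child computes a sum into a shared global variable, exits with pthread_exit(0), and the parent joins it before printing the result (synchronous threading).

```c
/* Sketch in the spirit of the slide's Pthreads example: the child thread
 * computes a sum into a shared global, then exits with pthread_exit(0);
 * the parent joins it and prints the result (synchronous threading). */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

int sum;                              /* global data shared by the threads */

static void *runner(void *param) {    /* the child thread's work           */
    int upper = atoi(param);
    sum = 0;
    for (int i = 1; i <= upper; i++)
        sum += i;
    pthread_exit(0);                  /* child thread finishes here        */
}

int main(int argc, char *argv[]) {
    pthread_t tid;                    /* the child thread's ID             */
    pthread_attr_t attr;              /* thread attributes                 */

    if (argc != 2) {
        fprintf(stderr, "usage: %s <positive integer>\n", argv[0]);
        return 1;
    }
    pthread_attr_init(&attr);               /* default attributes          */
    pthread_create(&tid, &attr, runner, argv[1]);
    pthread_join(tid, NULL);                /* parent waits for the child  */
    printf("sum = %d\n", sum);
    return 0;
}
```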
Implicit Threading
- In implicit threading, the creation and management of threads is handled by compilers and run-time libraries rather than by programmers.
- Implicit threading helps because, as the number of threads increases, ensuring program correctness becomes more difficult with explicit threads.
- It addresses the design challenges of multithreaded applications.
- Libraries or other language support is used to hide the management of threads (see the OpenMP sketch after the list below).
- Five methods explored:
- Thread Pools
- Fork-Join
- OpenMP
- Grand Central Dispatch
- Intel Threading Building Blocks
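As referenced above, the OpenMP fragment below illustrates implicit threading: a single compiler directive asks the compiler and run-time library to create and manage the threads of a parallel loop, with no explicit pthread_create calls. It assumes a compiler with OpenMP support (e.g. gcc or clang with -fopenmp), and the array contents are purely illustrative.

```c
/* Implicit threading with OpenMP: the #pragma asks the compiler/run-time
 * to create and manage worker threads for the loop; no explicit
 * pthread_create calls appear in the program. */
#include <omp.h>
#include <stdio.h>

#define N 1000

int main(void) {
    static double a[N], b[N];
    for (int i = 0; i < N; i++) a[i] = i;   /* illustrative input data */

    #pragma omp parallel for                /* loop iterations are split */
    for (int i = 0; i < N; i++)             /* across implicit threads   */
        b[i] = 2.0 * a[i];

    printf("b[%d] = %.1f (threads available: %d)\n",
           N - 1, b[N - 1], omp_get_max_threads());
    return 0;
}
```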
Thread Pool
- A thread pool creates a number of threads at process startup and places them in a pool to wait for work.
- When the server receives a request, it awakens a thread from this pool and passes it the request for service.
- When the thread completes its service, it returns to the pool and awaits more work.
- If the pool contains no available thread, the server waits until one becomes free.
- The number of threads in the pool is set heuristically, based on factors such as the number of CPUs, the amount of memory, and the expected number of client requests.
- More sophisticated architectures dynamically adjust the number of threads in the pool according to usage patterns.
Thread Pool Benefits
- Servicing a request with an existing thread is faster than waiting to create a thread.
- It limits the number of threads that exist at any one point, which is particularly important on systems that cannot support a large number of concurrent threads.
- Separating the task from the mechanics of creating the task allows using different strategies for executing the task.
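A deliberately simplified thread-pool sketch with Pthreads follows: a fixed number of worker threads is created at process startup, submitted tasks wait in a small fixed-size queue protected by a mutex and condition variable, and idle workers pick them up. Shutdown handling, error checking, and dynamic pool resizing are omitted, and the task type, queue size, and submit/print_task helpers are invented for the example.

```c
/* Simplified thread-pool sketch: NWORKERS threads are created at startup
 * and pull tasks from a fixed-size queue guarded by a mutex and condition
 * variable. Shutdown, error handling, and dynamic resizing are omitted. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define NWORKERS 4
#define QSIZE    16

typedef void (*task_fn)(int);
typedef struct { task_fn fn; int arg; } task_t;

static task_t queue[QSIZE];
static int head, tail, count;
static pthread_mutex_t lock      = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  not_empty = PTHREAD_COND_INITIALIZER;

static void submit(task_fn fn, int arg) {        /* enqueue a request       */
    pthread_mutex_lock(&lock);
    if (count < QSIZE) {                         /* drop work if queue full */
        queue[tail] = (task_t){ fn, arg };
        tail = (tail + 1) % QSIZE;
        count++;
        pthread_cond_signal(&not_empty);
    }
    pthread_mutex_unlock(&lock);
}

static void *worker(void *unused) {              /* pool thread: wait, run  */
    (void)unused;
    for (;;) {
        pthread_mutex_lock(&lock);
        while (count == 0)                       /* sleep until work arrives */
            pthread_cond_wait(&not_empty, &lock);
        task_t t = queue[head];
        head = (head + 1) % QSIZE;
        count--;
        pthread_mutex_unlock(&lock);
        t.fn(t.arg);                             /* service the request      */
    }
    return NULL;
}

static void print_task(int n) { printf("task %d serviced\n", n); }

int main(void) {
    pthread_t tids[NWORKERS];
    for (int i = 0; i < NWORKERS; i++)           /* create the pool up front */
        pthread_create(&tids[i], NULL, worker, NULL);

    for (int i = 0; i < 8; i++)                  /* hand requests to the pool */
        submit(print_task, i);

    sleep(1);                                    /* crude wait for the demo   */
    return 0;
}
```

A production pool would also need a clean shutdown path for the workers and a way to report errors or results from tasks back to the submitter.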