Questions and Answers
What is a key characteristic of functional programming that makes it well-suited for parallelism?
- It relies heavily on shared mutable state.
- It uses pure functions that always return the same output for the same input. (correct)
- It requires explicit locking mechanisms.
- It depends on global variables.
In functional parallelism, the order of execution of G(A) and H(A) is always predetermined and cannot be changed for optimization.
False (B)
What is the main purpose of using 'future' in parallel programming?
- To manage shared memory.
- To run tasks asynchronously and retrieve results later. (correct)
- To execute tasks sequentially.
- To block the execution until a result is available.
Define the term 'future object' in the context of parallel programming with futures.
In the context of futures, the Get() operation is a ______ read.
What does the FA.Get() method do in the context of functional parallelism with Futures?
Using futures guarantees the absence of data races and errors in parallel programs.
What is the purpose of join edges in a computation graph with futures?
In Java's Fork/Join Framework, what serves as the 'handle' for accessing a task's return value?
The Compute() method of a future task in Java's Fork/Join framework must have a ______ return type.
In Java's Fork/Join Framework, what is the purpose of the fork method?
In Java's Fork/Join framework, the join method is used to retrieve the computed value from a forked task.
What is the primary purpose of memoization in functional programming?
Explain the core idea behind memoization regarding efficiency when a functional program encounters the same function call multiple times.
In parallel programming with memoization, instead of storing final values, data structures store ______ for sub-computations.
Why is inserting futures into a data structure a key idea in memoization for parallel programming?
Memoization cannot be applied to algorithms like Pascal's Triangle.
In the context of parallel tasks and memoization, what typically happens if multiple parallel tasks attempt to compute the same value?
In Pascal's Triangle, how can memoization optimize the recursive divide and conquer approach?
In Java Streams, a short block of code that takes parameters and returns a value in the body of a method, without needing a name, is known as a ______.
What benefit do Java Streams provide when working with collections?
Java Streams always require the creation of intermediate collections to store results during processing.
Which of the following is a benefit of using Java Streams?
When computing the average age of active students using Java Streams, what does the FILTER operation perform?
To enable parallel programming with Java Streams, you can extend stream() to ______.
What happens when a parallel stream is used?
Java Streams provide a cumbersome way of encoding MapReduce patterns.
Which of the following is a feature that Java Streams offers for data processing?
In Java Streams, what are the terminal operations?
A program is said to be functionally deterministic if it always gives the ______ when the function is given the same input.
What is a key characteristic of a structurally deterministic parallel program?
It is possible to create parallel programs that satisfy neither functional nor structured deterministic properties.
The given code exhibits which type of synchronization issue: Async { Sum₁ = Sum of Lower Half }, Sum₂ = Sum of Upper Half, SUM = Sum₁ + Sum₂?
What term describes a read/write of the same location by operations running in parallel?
If there is ______, then determinism is guaranteed.
What term is used to describe a data race that does not negatively impact the outcome of a parallel program, making all results acceptable?
Code involving constructs such as async, finish, and futures is guaranteed to be deterministic.
What is searched for in the 'benign non-determinism' string search example?
In the example of the 'string search', what will the inner loop j go through?
If the parallel program uses the constructs provided in this course, then the program is ______ to exhibit a data race.
Flashcards
Task Parallelism
Tasks split into independent units run concurrently.
Data Parallelism
Parallelism achieved by applying same operation to multiple data.
Functional Programming
Programming approach mimicking math functions, enabling easier parallelism.
Functional Parallelism: Futures
Future Variable
Future Object
Get() operation on Future
Join Edges
Memoization
Memoization in Parallelism
Promise Object
Java Streams
Map-Reduce
Terminal Operation
Intermediate Operation
parallelStream()
Functional Determinism
Structural Determinism
Data Race
Data Race Freedom
Benign Non-Determinism
Study Notes
Intro to Parallel Programming
- Studies the techniques for parallel programming.
Parallel Programming Topics
- Task parallelism: Focuses on the creation, termination, and scheduling of tasks using techniques like Fork/Join.
- Computation graphs, work/span analysis, multiprocessor scheduling, parallel speedup, and Amdahl's Law are considered.
- Functional Parallelism: Involves tasks with return values using Futures, Java's Fork/Join framework, memoization, Java streams, and addressing data races.
- Loop Parallelism: Covers parallel loops, matrix multiplication, barriers, and iterative averaging techniques.
- Data flow Synchronization and Pipelining: includes split-phase barriers with Java Phasers, point-to-point synchronization, and pipeline parallelism.
Functional Parallelism Introduction
- Approaches to Parallelism: Functional parallelism is inspired by functional programming; it aims to eliminate the bugs of imperative parallelism and is advocated by the parallel functional programming community.
- Topics of Study: Futures, memoization, streams, and data races.
- Java APIs: Fork/Join framework and Stream APIs.
Learning Objectives of Functional Parallelism
- Explain functional and structural determinism, data races, and optimization with memoization.
- Demonstrate functional parallelism using the Future construct.
- Create functional-parallel programs using Java’s Fork/Join Framework and Java Streams.
- Apply memoization to optimize functional parallelism.
Futures: Tasks with Return Values
- Functional Programming: Functional programming is characterized by writing programs as pure function calls, such as A = F(B), C = G(A), and D = H(A). Because A is a function of B, the value of A depends solely on the value of B.
- If F(B) is called multiple times with the same value of B, it will always yield the same value for A.
- In the equations A = F(B), C = G(A), and D = H(A), once A is available there is no ordering constraint between G(A) and H(A), so they can execute in any order and hence in parallel.
- F(B) can be wrapped in an async task with a return value, also called a "future task" or "future": FA = future{ F(B) }. The handle used to access the eventual value of A (needed by G or H) is known as a future object. A consumer can itself be a future task, e.g., future{ G(FA.Get()) }.
Explanation of future object and Get()
- Handle for an object: FA is the handle (future object) for the result; the final value of A will eventually reside behind that handle.
- Difference between FA and A: FA is a wrapper for A.
- Return characteristics: Creating a future (e.g., future{ G(FA.Get()) }) returns immediately as an async task, so the program moves on to the next statement without blocking on the future's creation.
Using FA.Get() and Running in Parallel
- When FA.Get() is used, the calling task blocks and waits until the value of A is available; the combination future{ G(FA.Get()) } guarantees no data races or errors on A's value.
- Running in parallel: Even if G and H run in parallel, a task that needs A as input calls FA.Get(), which blocks until the producing task has completed.
Completing Parallel Tasks for Future Objects
- Multiple tasks can be chained together through future objects: FA = future{ F(B) }, FC = future{ G(FA.Get()) }, FD = future{ H(FA.Get()) } (see the Java sketch below).
- The joins are special edges in the computation graph arising from Get(). By and large, this expresses parallelism intuitively, in the style of functional programming.
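The notes above use the course's future{ ... } pseudocode. A minimal Java sketch of the same dependence structure, using java.util.concurrent.CompletableFuture (not the course's own construct); the functions f, g, h and the input b are made-up placeholders:

```java
import java.util.concurrent.CompletableFuture;

public class FutureChain {
    static int f(int b) { return b * 2; }   // hypothetical pure function F
    static int g(int a) { return a + 1; }   // hypothetical pure function G
    static int h(int a) { return a - 1; }   // hypothetical pure function H

    public static void main(String[] args) throws Exception {
        int b = 21;
        // FA = future { F(B) }
        CompletableFuture<Integer> fa = CompletableFuture.supplyAsync(() -> f(b));
        // FC = future { G(FA.Get()) } and FD = future { H(FA.Get()) };
        // both block on fa's value (a join edge) and then run independently of each other.
        CompletableFuture<Integer> fc = CompletableFuture.supplyAsync(() -> g(fa.join()));
        CompletableFuture<Integer> fd = CompletableFuture.supplyAsync(() -> h(fa.join()));
        System.out.println("C = " + fc.get() + ", D = " + fd.get());
    }
}
```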
Futures: Tasks Summary Breakdown
- Async tasks are extended to future tasks and future objects: future tasks have return values, and the future object is the handle for accessing a task's return value.
- Key operations on a future object A:
  - Assignment: A is assigned a reference to a future object of the form future{ task with return value }, e.g., future{ G(A.Get()) }. The content of a future object is single-assignment: it cannot be modified after the future task has returned.
  - Blocking read: A.Get() waits until the task associated with A has completed, then propagates the task's return value as the return value of A.Get().
- Operations defined to avoid data races: These operations are carefully defined to avoid data races on the task's return value, which makes futures well suited to functional parallelism.
Summary of Futures
- Futures use two key operations: assignment of a future (a task with a return value) and the blocking read .Get().
- Futures convert functional programs into parallel programs by turning pure function calls into asynchronous tasks whose results are retrieved with the Get() operation.
- Futures allow a very rich set of parallel programs and are more general than async-finish constructs for expressing functional parallelism.
Futures in Java’s Fork/Join Framework
- Futures are a general concept; in Java they can be implemented using the Fork/Join framework.
- A class ASum extends RecursiveTask (NOT RecursiveAction, as used in task parallelism), with member fields ARR, LO, HI.
- compute(): the method that checks boundary conditions and performs the computation.
- If LO > HI, return 0 (empty range); else if LO == HI, return ARR[LO] (the single element at that position); otherwise perform divide and conquer.
Compute() and Future Tasks
- If LO > HI, return 0 (empty range); else if LO == HI, return ARR[LO] (singleton: return the element at that position).
- Otherwise, divide and conquer around MID = (LO + HI) / 2.
- Create new task objects over the two halves of the array: the left half (ARR, LO to MID) and the right half (ARR, MID+1 to HI).
- Call fork() to launch the right-half computation as a recursive future task.
Implementing the Get Operation of Future
- After fork() spawns the right-half computation as a recursive future task, the left half is computed directly, and the result is the sum of the left-half value and the right-half task's join().
- join() implements the Get operation for futures in Java: RecursiveTask plays the role of the future object, and join() waits for the task to complete and returns its computed value.
Future Tasks in ForkJoin Framework
- When using Fork/Join for futures, define a class that extends RecursiveTask, with fields (ARR, LO, HI) and a compute() method.
- compute() uses fork() to launch future subtasks that return results of the computation, and implements the Get operation using join(). A compilable sketch follows.
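A minimal sketch of the ASum example described above: a recursive array sum written as a future task in Java's Fork/Join framework. Field names follow the notes (ARR, LO, HI); the exact course code may differ.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class ASum extends RecursiveTask<Integer> {
    private final int[] arr;
    private final int lo, hi;

    ASum(int[] arr, int lo, int hi) { this.arr = arr; this.lo = lo; this.hi = hi; }

    @Override
    protected Integer compute() {
        if (lo > hi) return 0;            // empty range
        if (lo == hi) return arr[lo];     // singleton: return the element
        int mid = (lo + hi) / 2;          // divide and conquer
        ASum left  = new ASum(arr, lo, mid);
        ASum right = new ASum(arr, mid + 1, hi);
        right.fork();                     // spawn right half as a future task
        int leftSum = left.compute();     // compute left half in the current task
        return leftSum + right.join();    // join() is the Get operation on the future
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3, 4, 5};
        int sum = ForkJoinPool.commonPool().invoke(new ASum(data, 0, data.length - 1));
        System.out.println("Sum = " + sum); // prints 15
    }
}
```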
Summary: Futures in Java's F/J Framework
- In the Fork/Join framework, future tasks extend the RecursiveTask class, while regular tasks extend the RecursiveAction class.
- The compute() method of a future task has a non-void return type, whereas for regular tasks it returns void.
- Both kinds of task support methods like join(), but for a future task join() also provides the task's return value.
Memoization
- Optimization technique in functional programming: memoization is a classical optimization for sequential functional programs, relies on a simple data structure, and applies to a wide range of algorithms.
- A functional program may make multiple calls to G(X₁) when only one call is needed. In the sequential case, the wasted work of recomputing G(X₁) (e.g., for a later result Y₃) can be replaced by a single lookup and reuse of the stored value.
Memoization Functional Program optimization
- Insert computed results into a data structure for quick access.
- Instead of calling the function again, look up the key (G, X₁) in the data structure to get the stored value and reuse it.
Summary for Memoization
- We know how to convert a functional program into a parallel program using futures, but we do not want to lose the efficiency of memoization in the process.
- Creation: Wrap the call in a future task, e.g., FY₁ = future{ G(X₁) }, and insert the future object, which holds the task's eventual value and state, into the data structure.
- Computation: The operation can run as a parallel task (G, H, etc.) or be satisfied sequentially as a lookup; the value is then retrieved with .Get().
Key Elements of Memoization
- A Get() on a looked-up future may have to wait for the value, but it returns as soon as the data is available. This ensures the value is not computed twice, just as in standard memoization, and the technique applies to a wide range of problems such as Pascal's Triangle.
Optimization in Parallel
- The data structure stores a future object (promise object) as the handle for the final value during parallel execution. In cases like Pascal's Triangle, futures ensure each element is produced by only one computation instead of many.
Optimizing Pascal's Triangle with Memoization
- Computing a value of Pascal's Triangle can trigger unneeded duplicated work.
- Each lower element is the sum of the two neighbouring elements in the previous row; using tasks to compute elements together with memoization ensures each element is computed only once.
More About Optimizing Pascal's Triangle with Memoization
- In Pascal's Triangle, each element is handled by a recursive task (e.g., compute(4) and compute(6)).
- Each computation is performed only once; later uses read the stored future with the get() operation. If a task needs a value that another task is still computing, its get() blocks until that value is available. A sketch of the idea follows.
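A rough sketch of memoizing Pascal's Triangle with futures, not the course's code: a ConcurrentHashMap stores a CompletableFuture per (row, column), so at most one task ever computes a given element and everyone else blocks on that same future.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class PascalMemo {
    // Cache of futures keyed by (row, column): storing futures rather than
    // final values is the key idea of memoization in parallel programming.
    private static final ConcurrentMap<Long, CompletableFuture<Long>> cache = new ConcurrentHashMap<>();

    private static long key(int n, int k) { return ((long) n << 32) | k; }

    static CompletableFuture<Long> choose(int n, int k) {
        CompletableFuture<Long> fresh = new CompletableFuture<>();
        CompletableFuture<Long> existing = cache.putIfAbsent(key(n, k), fresh);
        if (existing != null) return existing;           // another task owns this element
        if (k == 0 || k == n) {
            fresh.complete(1L);                          // base case: edges of the triangle
        } else {
            // Each element is the sum of its two upper neighbours, both futures.
            choose(n - 1, k - 1).thenCombine(choose(n - 1, k), Long::sum)
                                .thenAccept(fresh::complete);
        }
        return fresh;
    }

    public static void main(String[] args) {
        System.out.println(choose(6, 3).join());  // prints 20
    }
}
```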
Memoization in Functional Programming Summary
- Memoization is an important optimization with futures: well known for sequential programming, it carries over nicely to parallel programming by inserting futures into the data structure rather than final values, then looking them up and reading their outputs with Get(). It provides a pathway to more efficient functional parallel programs.
Summary of Memoization
- Core idea: Memoization remembers the results of function calls f(x) as follows:
  - Storage: Create a data structure that stores a set of input/output pairs, e.g., {(x₁, y₁ = f(x₁)), (x₂, y₂ = f(x₂)), ...}.
  - Lookups: Before evaluating a new call f(x'), look x' up in the data structure and reuse the stored value if it is already known (see the sequential sketch below).
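A minimal sketch of the sequential storage-and-lookup idea above; the memoize helper, the function g, and the inputs are made-up placeholders, not from the course material.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class Memo {
    // Wraps f so that results are stored in a map and reused on repeated calls.
    static <X, Y> Function<X, Y> memoize(Function<X, Y> f) {
        Map<X, Y> cache = new HashMap<>();
        // computeIfAbsent performs the lookup and, on a miss, stores f(x).
        return x -> cache.computeIfAbsent(x, f);
    }

    public static void main(String[] args) {
        Function<Integer, Integer> g = memoize(x -> {
            System.out.println("computing G(" + x + ")");
            return x * x;
        });
        g.apply(5);   // computes and stores G(5)
        g.apply(5);   // reuses the stored value; no recomputation
    }
}
```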
Java Streams
- Example: Iterating through a list of students is traditionally done with a for-each loop, FOR (S : Students) PRINT S. With streams this becomes Students.Stream().forEach(S → PRINT S): create a stream, then provide a lambda to apply to each element. A lambda is a short block of code that takes parameters and returns a value without needing a name.
Computing Averages with Streams
- The imperative version: FOR (S : Students) IF (S is Active) Active.ADD(S); FOR (A : Active) AgeSum = AgeSum + A.Age(); Average = AgeSum / Active.SIZE();
- The stream version: Students.Stream().FILTER(S → S is Active).MAP(S → S.Age).Average() — create a stream, filter to select the active students, map each element of the filtered stream to its age, then perform the average operation. This is shorter and avoids the intermediate collection and extra iterations.
Java Streams and benefits of parallel programming
- To run a pipeline in parallel, change Stream() to parallelStream(): Students.Stream().forEach(S → PRINT S) becomes Students.parallelStream().forEach(S → PRINT S), and Students.Stream().FILTER(S → S is Active).MAP(S → S.Age).Average() becomes Students.parallelStream().FILTER(S → S is Active).MAP(S → S.Age).Average(). A compilable Java sketch is given below.
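A minimal compilable sketch of the average-age pipeline above, in sequential and parallel form. The Student record and the sample data are made-up placeholders, not from the course.

```java
import java.util.List;
import java.util.OptionalDouble;

public class AverageAge {
    record Student(String name, int age, boolean active) {}

    public static void main(String[] args) {
        List<Student> students = List.of(
                new Student("Ada", 20, true),
                new Student("Bob", 23, false),
                new Student("Cyd", 22, true));

        // Sequential pipeline: filter -> map -> average (a terminal operation).
        OptionalDouble avg = students.stream()
                .filter(Student::active)
                .mapToInt(Student::age)
                .average();

        // Parallel version: only the stream creation changes.
        OptionalDouble parallelAvg = students.parallelStream()
                .filter(Student::active)
                .mapToInt(Student::age)
                .average();

        System.out.println(avg + " " + parallelAvg);  // both print OptionalDouble[21.0]
    }
}
```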
Streams and MapReduce
- Java Streams conveniently encode MapReduce-style patterns: the FILTER and MAP operations play the role of the map phase, and a terminal operation such as Average() plays the role of the reduce phase over the mapped data.
Encoding Streams
- A stream pipeline maps the data through the supplied functions and reduces it, all expressed in a single chained statement.
Summary of Stream Operations
- Streams bring functional-style programming to collections; for example, Students.Stream().forEach(...) specifies the action to perform on each element, such as printing every result.
Java Streams, Pipeline, and Data
- A stream is an aggregate data query over a source; intermediate operations transform it through a pipeline, and a terminal operation produces the output.
- Operations such as forEach and Average() allow the computation to be written as a functional expression; with the students example, filter, map, and the average computation are combined in a single pipeline.
Data Races and Determinism
- Determinism: For a parallel program, functional determinism means the same input always yields the same output, and structural determinism means the same input always yields the same computation graph.
Functional Determinism
- Functional determinism is guaranteed when the same input always produces the same output; structural determinism additionally requires the same computation graph on every run. Programs built from async, finish, and futures that are free of data races satisfy both properties.
Non Determinism
- Example of non-determinism: async { Sum₁ = sum of lower half }; Sum₂ = sum of upper half; SUM = Sum₁ + Sum₂ — without an enclosing finish, the read of Sum₁ in the final statement can run in parallel with the async that writes it.
- A data race (a read and a write of the same location running in parallel) can lead to errors, because the value read depends on the order in which the racing accesses happen. Two parallel reads are not a problem, since performing the reads in either order gives the same result. A hypothetical Java sketch of this race follows.
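A hypothetical sketch of the racy two-way sum using plain Java threads (not the course's async/finish constructs): the main thread reads sum1 without waiting for the thread that writes it.

```java
public class RacySum {
    static int sum1, sum2;

    public static void main(String[] args) throws InterruptedException {
        int[] arr = {1, 2, 3, 4, 5, 6, 7, 8};
        int mid = arr.length / 2;

        Thread lower = new Thread(() -> {              // plays the role of the async
            for (int i = 0; i < mid; i++) sum1 += arr[i];
        });
        lower.start();

        for (int i = mid; i < arr.length; i++) sum2 += arr[i];

        // DATA RACE: sum1 may still be being written by 'lower' when it is read here.
        int sum = sum1 + sum2;
        System.out.println(sum);                       // may print 36 or a smaller value

        // Fix: call lower.join() before the read (the role played by 'finish').
        lower.join();
    }
}
```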
DRF
- Data Race Freedom (DRF) is a route to determinism: a program written with the course's constructs that is free of data races is guaranteed to be deterministic.
- Data races should generally be avoided, but 'benign non-determinism' describes cases where the different possible results are all acceptable to the application. In the string-search example, finding any occurrence of the pattern is fine, whether it is the first one or some other one (see the sketch below).
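A hypothetical sketch of benign non-determinism in a parallel string search: any index where the pattern occurs is an acceptable answer, so the non-deterministic choice among occurrences does not affect correctness. The text, pattern, and use of parallel streams are illustrative assumptions, not the course's code.

```java
import java.util.Optional;
import java.util.stream.IntStream;

public class ParallelSearch {
    public static void main(String[] args) {
        String text = "abracadabra";
        String pattern = "ra";

        Optional<Integer> anyMatch = IntStream.rangeClosed(0, text.length() - pattern.length())
                .parallel()                                  // search candidate positions in parallel
                .filter(i -> text.startsWith(pattern, i))    // inner check of the pattern at position i
                .boxed()
                .findAny();                                  // any occurrence is acceptable

        System.out.println(anyMatch);                        // may print Optional[2] or Optional[9]
    }
}
```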
More about deterministic systems
- In the absence of determinism, a parallel program may exhibit different behaviours across runs, depending on the relative scheduling of tasks and their memory accesses. If the program uses only constructs that preserve data-race freedom, these issues do not arise.
'Benign Non-Determinism'
- The program may generate different outputs on different runs, but every output is acceptable for the application, as in searching a string for a pattern.