Recursive Halving and Doubling
Figure 13.3. Adding vectors
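As a concrete illustration of the recursive-halving idea behind Figure 13.3, the sketch below adds vectors that are distributed across processes: at each stage the upper half of the active ranks sends its partial sum to the lower half, so the total reaches rank 0 in log2(P) stages. This is a hedged sketch rather than the course's own code; the vector length N, the message tag, and the power-of-two process count are assumptions made for the illustration, and in practice MPI_Reduce provides this operation.

    /* Sketch: recursive-halving sum of distributed vectors (assumes a
     * power-of-two number of processes; N and the tag are illustrative). */
    #include <mpi.h>
    #include <stdio.h>

    #define N 4

    int main(int argc, char *argv[])
    {
        int myRank, nProcs;
        double v[N], recv[N];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &myRank);
        MPI_Comm_size(MPI_COMM_WORLD, &nProcs);

        for (int i = 0; i < N; i++) v[i] = myRank + 1.0;  /* local vector */

        /* half = number of ranks still holding partial sums */
        for (int half = nProcs / 2; half >= 1; half /= 2) {
            if (myRank >= half && myRank < 2 * half) {
                /* upper half of the active ranks sends its partial sum down */
                MPI_Send(v, N, MPI_DOUBLE, myRank - half, 0, MPI_COMM_WORLD);
            } else if (myRank < half) {
                /* lower half receives and accumulates */
                MPI_Recv(recv, N, MPI_DOUBLE, myRank + half, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                for (int i = 0; i < N; i++) v[i] += recv[i];
            }
        }

        if (myRank == 0)
            printf("Sum of first components: %f\n", v[0]);

        MPI_Finalize();
        return 0;
    }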
Recursive Halving and Doubling
• Pseudocode for Broadcast Operation:
• The following algorithm completes a broadcast operation in logarithmic time. Figure 13.4 illustrates the idea.
Figure 13.4. Broadcast via recursive doubling
Recursive Halving and Doubling
• The first processor sends the data to only two other processors. Then each of these processors sends the data to two other processors, and so on. At each stage, the number of processors sending and receiving data doubles. The code is simple and looks similar to the following (a runnable MPI sketch of this tree appears below):

    if (myRank == 0) {
        send to processors 1 and 2;
    } else {
        receive from processor int((myRank-1)/2);
        torank1 = 2*myRank + 1;
        torank2 = 2*myRank + 2;
        if (torank1 exists) send to torank1;
        if (torank2 exists) send to torank2;
    }
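A minimal MPI rendering of this pseudocode might look like the sketch below. This is an illustration rather than the course's own code: the buffer name data, its length N, and the message tag are assumptions, and in practice MPI_Bcast already implements a tree-based broadcast of this kind.

    /* Sketch: broadcast via recursive doubling, following the pseudocode
     * above (data, N, and TAG are illustrative assumptions). */
    #include <mpi.h>
    #include <stdio.h>

    #define N   4
    #define TAG 0

    int main(int argc, char *argv[])
    {
        int myRank, nProcs;
        double data[N] = {0.0};

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &myRank);
        MPI_Comm_size(MPI_COMM_WORLD, &nProcs);

        if (myRank == 0) {
            /* the root owns the data to be broadcast */
            for (int i = 0; i < N; i++) data[i] = i + 1.0;
        } else {
            /* every other rank first receives from its parent */
            int fromRank = (myRank - 1) / 2;
            MPI_Recv(data, N, MPI_DOUBLE, fromRank, TAG,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }

        /* then forwards the data to up to two children, if they exist */
        int toRank1 = 2 * myRank + 1;
        int toRank2 = 2 * myRank + 2;
        if (toRank1 < nProcs)
            MPI_Send(data, N, MPI_DOUBLE, toRank1, TAG, MPI_COMM_WORLD);
        if (toRank2 < nProcs)
            MPI_Send(data, N, MPI_DOUBLE, toRank2, TAG, MPI_COMM_WORLD);

        printf("Rank %d has data[0] = %f\n", myRank, data[0]);
        MPI_Finalize();
        return 0;
    }

Each rank other than the root blocks until its parent int((myRank-1)/2) delivers the data and then forwards it to its two children if they exist, so the number of stages grows only logarithmically with the number of processes.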
Parallel Algorithm Examples
Specific Examples
• In this section, specific examples of parallel algorithms that implement message passing are given.
• The first two examples consider simple collective communication calls to parallelize matrix-vector and matrix-matrix multiplication.
• These calls are meant to be illustrative, because parallel numerical libraries usually provide the most efficient algorithms for these operations. (See Chapter 10 - Parallel Mathematical Libraries.)
• The third example shows how you can use ghost cells to construct a parallel data approach to solve Poisson's equation.
• The fourth example revisits matrix-vector multiplication, but from a client-server approach.