Patterns of parallel programming
Agenda
• Why parallel?
• Terms and measures
• Building Blocks
• Patterns overview
  – Pipeline and data flow
  – Producer-Consumer
  – Map-Reduce
  – Other
Why Moore's law is not working anymore
• Power consumption
• Wire delays
• DRAM access latency
• Diminishing returns of more instruction-level parallelism
Power consumption
[Chart: power density (W/cm²) vs. year, 1970–2010, on a log scale from 1 to 10,000. Extrapolating the trend from the 8080 through the Pentium® processors passes the power density of a hot plate, then a nuclear reactor, then a rocket nozzle, and finally the Sun's surface.]
Wire delays
Diminishing returns
• ’80s: 10 CPI → 1 CPI
• ’90s: 1 CPI → 0.5 CPI
• ’00s: multicore
No matter how fast processors get, software consistently finds new ways to eat up the extra speed.
Herb Sutter
To scale performance, put many processing cores on the microprocessor chip
The new edition of Moore’s law is about the doubling of cores.
Survival
Terms & Measures
• Work = T1
• Span = T∞
• Work Law: Tp >= T1/P
• Span Law: Tp >= T∞
• Speedup: T1/Tp
  – Linear: θ(P)
  – Perfect: P
• Parallelism: T1/T∞
• Greedy-scheduler bound: Tp <= (T1 - T∞)/P + T∞
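A quick worked example of these laws (the numbers are illustrative, not from the slides): with work T1 = 100, span T∞ = 10, and P = 4 processors, the work and span laws bound Tp from below, and the greedy-scheduler bound caps it from above.

```csharp
// Illustrative check of the work/span laws with made-up numbers.
using System;

class WorkSpanDemo
{
    static void Main()
    {
        double t1 = 100;   // Work: total time on one processor
        double tInf = 10;  // Span: time on infinitely many processors
        int p = 4;

        double lowerBound = Math.Max(t1 / p, tInf);  // Work law and span law
        double upperBound = (t1 - tInf) / p + tInf;  // Greedy-scheduler bound
        double parallelism = t1 / tInf;              // Max useful processor count

        Console.WriteLine($"{lowerBound} <= Tp <= {upperBound}, parallelism = {parallelism}");
        // 25 <= Tp <= 32.5, parallelism = 10
    }
}
```

Note that beyond P = 10 processors (the parallelism), extra cores cannot help: the span term T∞ dominates.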
Definitions
• Concurrent – several things happening at the same time
• Multithreaded – multiple execution contexts
• Parallel – multiple simultaneous computations
• Asynchronous – not having to wait
Dangers
• Race conditions
• Starvation
• Deadlock
• Livelock
• Optimizing compilers
• …
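The first danger is easy to demonstrate. A minimal sketch (not from the slides): `x++` is a read-modify-write sequence, so parallel increments can be lost, while `Interlocked.Increment` performs the update atomically.

```csharp
// Race condition sketch: the unsynchronized counter usually loses updates;
// the Interlocked counter never does.
using System;
using System.Threading;
using System.Threading.Tasks;

class RaceDemo
{
    static void Main()
    {
        int unsafeCount = 0, safeCount = 0;

        Parallel.For(0, 1_000_000, _ =>
        {
            unsafeCount++;                        // Racy: typically ends below 1,000,000
            Interlocked.Increment(ref safeCount); // Atomic: always 1,000,000
        });

        Console.WriteLine($"unsafe: {unsafeCount}, safe: {safeCount}");
    }
}
```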
Data parallelism
Parallel.ForEach(letters, ch => Capitalize(ch));
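The slide's `letters` and `Capitalize` are placeholders; one possible self-contained expansion of the one-liner looks like this:

```csharp
// Data parallelism: the same operation applied to each element,
// with iterations distributed across cores.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class DataParallelDemo
{
    static string Capitalize(string s) => s.ToUpperInvariant();

    static void Main()
    {
        string[] letters = { "alpha", "beta", "gamma" };
        var results = new ConcurrentBag<string>(); // Thread-safe result collector

        Parallel.ForEach(letters, ch => results.Add(Capitalize(ch)));

        foreach (var s in results)
            Console.WriteLine(s); // Completion order is not guaranteed
    }
}
```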
Task parallelism
Parallel.Invoke(() => Average(), () => Minimum() …);
Fork-Join
• Additional work may be started only when specific subsets of the original elements have completed processing
• All elements should be given the chance to run even if one invocation fails (Ping)
[Diagram: Fork, then Compute Mean, Compute Median, and Compute Mode in parallel, then Join]
```csharp
Parallel.Invoke(
    () => ComputeMean(),
    () => ComputeMedian(),
    () => ComputeMode());

static void MyParallelInvoke(params Action[] actions)
{
    var tasks = new Task[actions.Length];
    for (int i = 0; i < actions.Length; i++)
        tasks[i] = Task.Factory.StartNew(actions[i]);
    Task.WaitAll(tasks);
}
```
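The first bullet (start additional work when a specific subset completes) can be realized with `TaskFactory.ContinueWhenAll`. A sketch, with stand-in values in place of real computations: a summary task runs as soon as mean and median finish, without waiting for mode.

```csharp
// Continuation on a subset: "summary" depends on mean and median only.
using System;
using System.Threading.Tasks;

class ForkJoinDemo
{
    static void Main()
    {
        Task<double> mean   = Task.Factory.StartNew(() => 5.0); // Stand-in computations
        Task<double> median = Task.Factory.StartNew(() => 4.0);
        Task<double> mode   = Task.Factory.StartNew(() => 3.0);

        Task summary = Task.Factory.ContinueWhenAll(
            new[] { mean, median },
            done => Console.WriteLine($"mean={done[0].Result}, median={done[1].Result}"));

        Task.WaitAll(summary, mode); // Join: everything finishes before exit
    }
}
```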
Pipeline pattern
[Diagram: Task 1 → Task 2 → Task 3]
```csharp
Task<int> T1 = Task.Factory.StartNew(() => { return result1(); });
Task<double> T2 = T1.ContinueWith((antecedent) => { return result2(antecedent.Result); });
Task<double> T3 = T2.ContinueWith((antecedent) => { return result3(antecedent.Result); });
```
Producer/Consumer
BlockingCollection<T>
[Diagram: Read 1, Read 2, Read 3 from disk/net feed the collection; multiple Process stages consume items in parallel]
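A minimal sketch of the pattern (item counts and the `processed` message are illustrative): one producer adds items, `CompleteAdding` signals the end of the stream, and several consumers drain the collection via `GetConsumingEnumerable`.

```csharp
// Producer/consumer over BlockingCollection<T>: the bounded capacity
// gives backpressure; CompleteAdding ends the consumers' loops.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        var queue = new BlockingCollection<int>(boundedCapacity: 100);

        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 10; i++)
                queue.Add(i);       // Blocks if the queue is full
            queue.CompleteAdding(); // Signal: no more items
        });

        var consumers = new Task[3];
        for (int c = 0; c < consumers.Length; c++)
            consumers[c] = Task.Run(() =>
            {
                foreach (int item in queue.GetConsumingEnumerable())
                    Console.WriteLine($"processed {item}"); // Stand-in for Process()
            });

        producer.Wait();
        Task.WaitAll(consumers);
    }
}
```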
Other patterns
• Speculative Execution
• APM (IAsyncResult, Begin/End pairs)
• EAP (Operation/Callback pairs)
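Speculative execution can be sketched in a few lines (the sleep times and the value 42 are made up): launch two ways of computing the same answer and take whichever finishes first.

```csharp
// Speculative execution: race two attempts, use the first result.
using System;
using System.Threading;
using System.Threading.Tasks;

class SpeculativeDemo
{
    static void Main()
    {
        Task<int>[] attempts =
        {
            Task.Run(() => { Thread.Sleep(50);  return 42; }), // "Fast path"
            Task.Run(() => { Thread.Sleep(500); return 42; }), // "Slow path"
        };

        int winner = Task.WaitAny(attempts); // Index of the first task to finish
        Console.WriteLine(attempts[winner].Result);
    }
}
```

In a real implementation the losing attempt would also be cancelled (e.g. via a `CancellationToken`) rather than left running.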
References
• Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4
• Pluralsight:
  – Introduction to Async and Parallel Programming in .NET 4
  – Async and Parallel Programming: Application Design
• The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software
• Chapter 27, “Multithreaded Algorithms,” from Introduction to Algorithms, 3rd edition