TRANSCRIPT
Seminar on: Parallel Computing
Submitted by: Kartik N. Kalpande
INTRODUCTION:
Parallel computing is a type of computation in which many calculations or processes are carried out simultaneously.
There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism.
PARALLEL COMPUTING:
A problem is broken into discrete parts that can be solved concurrently
Each part is further broken down to a series of instructions
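The two steps above can be sketched in a few lines. This is a minimal illustration, not a prescription: the chunking scheme, the worker count, and the use of `ProcessPoolExecutor` are all assumptions chosen for the example.

```python
# Sketch: a large summation broken into discrete parts that run
# concurrently; each part is itself a plain series of instructions.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """One part of the problem: a simple sequential loop."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, parts=4):
    # Break the problem [0, n) into independent chunks.
    step = n // parts
    chunks = [(i * step, (i + 1) * step if i < parts - 1 else n)
              for i in range(parts)]
    # Solve the chunks concurrently, then combine the results.
    with ProcessPoolExecutor(max_workers=parts) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))  # 499999500000, same as sum(range(1_000_000))
```

The combination step (the final `sum`) is sequential; only the chunk computations run in parallel.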
Why Use Parallel Computing?
Main Reasons:
Save Time And/or Money.
Solve Larger / More Complex Problems.
Make Better Use Of Underlying Parallel Hardware.
Parallel Computer Memory Architectures:
Shared Memory:
General Characteristics:
Multiple processors can operate independently but share the same memory resources.
Changes made to a memory location by one processor are visible to all other processors.
Advantages:
Global address space provides a user-friendly programming perspective to memory.
Data sharing between tasks is both fast and uniform.
Disadvantages:
Adding more CPUs can geometrically increase traffic on the shared memory-CPU path.
The programmer is responsible for the synchronization constructs that ensure correct access to shared memory.
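The synchronization responsibility can be illustrated with threads, which share one address space. This is a minimal sketch; the counter and the lock are illustrative names, and `threading.Lock` stands in for whatever synchronization construct a real shared-memory program would use.

```python
# Sketch: threads share the same memory, so concurrent updates to a
# shared location need programmer-supplied synchronization.
import threading

counter = 0                  # shared memory location
lock = threading.Lock()      # synchronization construct

def increment(n):
    global counter
    for _ in range(n):
        with lock:           # without the lock, updates can be lost
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000: a change made by one thread is visible to all
```

Removing the lock may still print 400000 on some runs, which is exactly why such bugs are hard to find: the race is nondeterministic.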
Distributed Memory:
General Characteristics:
Like shared memory systems, distributed memory systems vary widely but share a common characteristic: they require a communication network to connect inter-processor memory.
Processors have their own local memory.
Advantages:
Memory is scalable with the number of processors
Each processor can rapidly access its own memory.
Disadvantages:
It may be difficult to map existing data structures, based on global memory, to this memory organization.
Non-uniform memory access times.
Hybrid Distributed-Shared Memory:
General Characteristics:
The largest and fastest computers in the world today employ both shared and distributed memory architectures.
The shared memory component can be a shared-memory machine and/or GPUs; the distributed memory component is the networking of multiple such machines.
Advantage :
Increased scalability.
Disadvantage:
Increased programmer complexity.
Flynn's Classical Taxonomy:
One of the more widely used classifications is called Flynn's Taxonomy, which distinguishes multi-processor architectures along the two independent dimensions of instruction streams and data streams.
The 4 possible classifications according to Flynn:
SISD: Single Instruction, Single Data
SIMD: Single Instruction, Multiple Data
MISD: Multiple Instruction, Single Data
MIMD: Multiple Instruction, Multiple Data
Types of parallelism:
Bit-level parallelism:
Historically, 4-bit microprocessors were replaced by 8-bit, then 16-bit, then 32-bit microprocessors; doubling the word size halves the number of instructions needed to operate on values wider than one word.
This trend generally came to an end with the introduction of 32-bit processors, which remained a standard in general-purpose computing for two decades until the adoption of 64-bit processors.
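Why wider words reduce instruction counts can be shown by simulating narrow-word arithmetic. The function below is a sketch under an assumed model: it adds two 16-bit values using only 8-bit operations, which takes two add steps where a 16-bit processor needs one.

```python
# Sketch: a 16-bit addition performed as two 8-bit additions (low byte
# first, then high byte plus the carry), as an 8-bit processor must.
def add16_on_8bit(a, b):
    """Add two 16-bit integers using only 8-bit operations (mod 2**16)."""
    a_lo, a_hi = a & 0xFF, a >> 8
    b_lo, b_hi = b & 0xFF, b >> 8
    lo = a_lo + b_lo                    # first 8-bit add
    carry = lo >> 8                     # carry out of the low byte
    hi = (a_hi + b_hi + carry) & 0xFF   # second 8-bit add, with carry
    return (hi << 8) | (lo & 0xFF)

print(hex(add16_on_8bit(0x12FF, 0x0001)))  # 0x1300, matching ordinary addition
```

A 16-bit (or wider) processor performs this in a single add instruction, which is exactly the bit-level parallelism gained by widening the word.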
Instruction-level parallelism:
Without instruction-level parallelism, a processor can complete less than one instruction per clock cycle on average (IPC < 1); such processors are known as subscalar.
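The opportunity a superscalar processor exploits can be shown with the classic three-instruction example. The computation below is illustrative; the parallelism happens inside real hardware, not in this Python program.

```python
# Sketch: the first two operations are independent of each other, so a
# superscalar processor could issue them in the same clock cycle; the
# third depends on both results and must wait for them.
a, b, c, d = 1, 2, 3, 4
e = a + b   # independent  \
f = c + d   # independent  / can execute simultaneously
g = e * f   # depends on e and f: must follow both
print(g)    # 21
```

Finding such independent instructions (by the compiler or by the hardware's out-of-order logic) is what raises IPC above 1.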
Applications of Parallel Computing:
Databases and data mining.
Networked videos and Multimedia technologies.
Medical imaging and diagnosis.
Advanced graphics and virtual reality.
Collaborative work environments.
Conclusion:
With the increasing demands of users and ever-higher expectations of technology, the importance of processing faster and at larger scale is growing rapidly.
THANK YOU!