Types of parallel computing:
Bit-level parallelism:
Parallelism that comes from the processor word size: a wider word lets a single instruction operate on more bits at once
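A small sketch of the idea above: a 64-bit word can be treated as many narrow lanes that are all updated by one operation (the classic "SIMD within a register" population count). The function name and constants are illustrative, not from the notes.

```python
# Bit-level parallelism sketch: one 64-bit operation updates many
# small bit fields at the same time, halving the work at each step.
MASK64 = (1 << 64) - 1

def popcount64(x: int) -> int:
    """Count set bits of a 64-bit word using parallel bit tricks."""
    x &= MASK64
    x = x - ((x >> 1) & 0x5555555555555555)                          # 2-bit lane sums
    x = (x & 0x3333333333333333) + ((x >> 2) & 0x3333333333333333)   # 4-bit lane sums
    x = (x + (x >> 4)) & 0x0F0F0F0F0F0F0F0F                          # 8-bit lane sums
    return ((x * 0x0101010101010101) & MASK64) >> 56                 # add all bytes

print(popcount64(0xFF))           # 8
print(popcount64((1 << 64) - 1))  # 64
```

A wider word (e.g. 64 vs. 8 bits) means fewer such steps per data item, which is exactly the speedup bit-level parallelism provides.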
Instruction-level parallelism:
How many instructions a processor can execute in a single clock cycle
Task parallelism:
A problem is decomposed into subtasks that can execute concurrently
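A minimal sketch of task parallelism, assuming two made-up subtasks (`load` and `checksum` are illustrative names): distinct pieces of a larger job run at the same time on different workers.

```python
# Task parallelism sketch: independent subtasks of one job run
# concurrently in a thread pool.
from concurrent.futures import ThreadPoolExecutor

def load(n):      # subtask 1: pretend to load n records
    return list(range(n))

def checksum(n):  # subtask 2: pretend to checksum n blocks
    return sum(range(n)) % 97

with ThreadPoolExecutor() as pool:
    f1 = pool.submit(load, 5)      # different subtasks...
    f2 = pool.submit(checksum, 5)  # ...submitted to run at the same time
    print(f1.result(), f2.result())
```

Contrast with data parallelism: here the concurrent units execute *different* code, not the same operation over different data.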
Distributed systems issues:
Applications of parallel computing:
Lack of global knowledge issues:
Dichotomy of parallel computing platforms:
Platforms are classified by control structure (how parallel tasks are expressed) and by communication model (how tasks interact); an explicitly parallel program must specify both
PRAM subclasses:
EREW, CREW, ERCW, and CRCW, depending on whether concurrent reads and concurrent writes to the same memory location are allowed
SIMD:
A single control unit dispatches the same instruction to multiple processing units, each of which operates on different data
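A conceptual sketch of the SIMD card above in pure Python (real SIMD happens in hardware; here `map` stands in for the single control unit broadcasting one instruction):

```python
# SIMD sketch (conceptual): one operation ("add 10") is applied
# uniformly across many data elements, as a single control unit
# would dispatch the same instruction to every processing unit.
data = [1, 2, 3, 4]            # each element stands for one unit's datum
op = lambda x: x + 10          # the single instruction
result = list(map(op, data))   # same instruction, different data
print(result)                  # [11, 12, 13, 14]
```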
Distributed memory parallel system:
Parallel systems whose processors do not share memory; they interact by passing messages
MIMD:
Each processor has its own control unit and can execute different instructions on different data items
Shared memory parallel system:
Multiple processors can access the same memory simultaneously
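A small sketch of the shared-memory model using threads (which genuinely share an address space): several workers update the same counter, with a lock keeping the simultaneous accesses consistent. The counter and thread counts are illustrative.

```python
# Shared-memory sketch: multiple threads read and write one shared
# counter; a lock serializes the updates so simultaneous access
# stays consistent.
import threading

counter = 0
lock = threading.Lock()

def bump(times):
    global counter
    for _ in range(times):
        with lock:        # protect the shared memory location
            counter += 1

threads = [threading.Thread(target=bump, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)            # 4000
```

Without the lock, interleaved read-modify-write sequences could lose updates, which previews the challenges of shared-memory systems.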
Shared memory parallel system challenges:
Keeping caches coherent, synchronizing simultaneous accesses (avoiding race conditions), and contention for memory bandwidth as the number of processors grows
Architecture of an ideal parallel computer:
The Parallel Random-Access Machine (PRAM): processors share a single global memory with uniform access
UMA (Uniform Memory Access):
The time taken by a processor to access any memory word, global or local, is identical
NUMA (Non-Uniform Memory Access):
The time taken by a processor to access memory is not identical: local memory is faster to reach than global (remote) memory
PRAM stands for _____
Parallel Random-Access Machine