Introduction to MPI Flashcards
(36 cards)
Which two functions need to be called to initialise and finalise the MPI environment?
- MPI_Init to initialise the environment
- MPI_Finalize to cleanly shut down the MPI environment
Which function is used to find the number of processes there are?
MPI_Comm_size returns the number of processes in the given communicator
Which function is used to find the current process rank?
MPI_Comm_rank returns the rank of the calling process within the communicator (an integer from 0 to size-1)
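A minimal C program tying these four calls together (a sketch; assumes an MPI implementation such as Open MPI or MPICH is installed):

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    // Initialise the MPI environment
    MPI_Init(&argc, &argv);

    int size, rank;
    // Number of processes in the global communicator
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    // Rank (ID) of the calling process, 0 .. size-1
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    printf("Hello from rank %d of %d\n", rank, size);

    // Cleanly shut down the MPI environment
    MPI_Finalize();
    return 0;
}
```

Compiled with mpicc and launched with mpirun, each started process executes this same program and prints its own rank.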
What is a communicator?
A communicator is a grouping of processes. There is a predefined global communicator known as MPI_COMM_WORLD containing all processes.
Which variable is accepted by MPI_Comm_size and MPI_Comm_rank?
comm - the communicator that is being queried.
Which compile command is used for MPI?
mpicc
Which command is used to run an MPI process?
mpirun - starts multiple instances of the executable
How is the number of MPI processes specified?
A flag (-np or -n) is passed to mpirun which specifies the number of processes to start, e.g. mpirun -np 4 ./program
How do processes communicate in MPI?
By passing messages
What is point-to-point communication?
Message passing between a pair of processes in a communicator
Which MPI functions are used for point-to-point messaging?
- MPI_Send
- MPI_Recv
What does MPI_Send do?
It sends a message to another process. Its arguments describe the data to be sent (buffer, count, datatype) and the destination (rank, tag, communicator).
What does MPI_Recv do?
It receives a message from another process. Its arguments describe the data to be received (buffer, count, datatype) and the source (rank, tag, communicator).
How are message lengths specified?
In terms of MPI data types rather than in bytes.
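The cards above can be sketched as a two-process ping: rank 0 sends four ints to rank 1, with the length given as a count of MPI_INT elements rather than bytes (a sketch; assumes at least two processes were started with mpirun):

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int data[4];
    if (rank == 0) {
        for (int i = 0; i < 4; i++) data[i] = i * i;
        // Send 4 elements of type MPI_INT (length in datatypes, not bytes)
        // to destination rank 1, with message tag 0
        MPI_Send(data, 4, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        // Receive 4 ints from source rank 0, tag 0; ignore status details
        MPI_Recv(data, 4, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("Rank 1 received: %d %d %d %d\n",
               data[0], data[1], data[2], data[3]);
    }

    MPI_Finalize();
    return 0;
}
```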
What is collective communication?
An alternative to point-to-point messaging that allows for communication to all processes in a communicator in one command.
What are the 4 types of MPI collective functions?
- MPI_Bcast
- MPI_Scatter
- MPI_Gather/Allgather
- MPI_Reduce/Allreduce
What does MPI_Bcast do?
Send data from one process to all the others in the specified communicator (one-to-all). All processes receive the same data.
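A broadcast sketch in C: rank 0 (the root) holds a value and MPI_Bcast copies it to every process in the communicator:

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int value = 0;
    if (rank == 0) value = 42;  // only the root has the value initially

    // One-to-all: after the call, every process's `value` holds 42
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("Rank %d has value %d\n", rank, value);
    MPI_Finalize();
    return 0;
}
```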
What does MPI_Scatter do?
Distributes data from one process to all processes in a communicator (one-to-all). Each process receives a different subset of the data.
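A scatter sketch in C: the root prepares one element per process, and each process (including the root) receives its own slice:

```c
#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int *all = NULL;
    if (rank == 0) {
        // Root prepares one element per process
        all = malloc(size * sizeof(int));
        for (int i = 0; i < size; i++) all[i] = 10 * i;
    }

    int mine;  // each process receives its own subset (here, one int)
    MPI_Scatter(all, 1, MPI_INT, &mine, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("Rank %d got %d\n", rank, mine);

    free(all);
    MPI_Finalize();
    return 0;
}
```

Note the count arguments are per-process: "1, MPI_INT" means one int is sent to each rank, not one int in total.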
What does MPI_Gather do?
Gather data from all processes in a communicator to one process (all-to-one). Each process contributes a subset of the data received.
What does MPI_Allgather do?
Gathers data from all the processes in a communicator to all processes (all-to-all). Each process contributes a subset of the data received. All the processes receive the result.
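The difference between the two gathers can be sketched in C: each process contributes one value; with MPI_Gather only the root ends up with the full array, with MPI_Allgather every process does:

```c
#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int mine = rank * rank;  // each process contributes one value

    // All-to-one: only the root (rank 0) receives the assembled array
    int *gathered = (rank == 0) ? malloc(size * sizeof(int)) : NULL;
    MPI_Gather(&mine, 1, MPI_INT, gathered, 1, MPI_INT, 0, MPI_COMM_WORLD);

    // All-to-all: every process receives the assembled array
    int *all = malloc(size * sizeof(int));
    MPI_Allgather(&mine, 1, MPI_INT, all, 1, MPI_INT, MPI_COMM_WORLD);

    printf("Rank %d sees last element %d\n", rank, all[size - 1]);

    free(gathered);
    free(all);
    MPI_Finalize();
    return 0;
}
```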
What does MPI_Reduce do?
Carries out a reduction (e.g. sum, maximum, logical AND) over values contributed by all processes and returns the result to one specified root process (all-to-one).
What does MPI_Allreduce do?
Carries out a reduction and returns the result to all processes
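A reduction sketch in C, using MPI_SUM: each process contributes one int, and the sum lands either on the root alone (MPI_Reduce) or on everyone (MPI_Allreduce):

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int mine = rank + 1;  // each process contributes rank+1
    int sum = 0;

    // All-to-one: the sum 1 + 2 + ... + size lands only on rank 0
    MPI_Reduce(&mine, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0) printf("Reduce: sum = %d\n", sum);

    // All-to-all: every process receives the same sum
    MPI_Allreduce(&mine, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
    printf("Rank %d Allreduce sum = %d\n", rank, sum);

    MPI_Finalize();
    return 0;
}
```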
What is a blocking communication?
MPI_Recv blocks until the message has arrived in the receive buffer. MPI_Send blocks until it is safe to reuse the send buffer, which may be before the message has actually been received by the destination.
Why are blocking communications used?
The behaviour is used to help write correct programs and avoid overwriting a send buffer before it is safe to do so.