MPI_Cart_sub
For an N-dimensional cartesian grid, create new communicators
for subgrids of up to (N-1) dimensions
Often, after we have created a cartesian grid, we wish to further
group elements of this grid into subgrids of lower dimensions.
For instance, the subgrids of a 2D cartesian grid are 1D grids of
the individual rows or columns of the original grid.
Similarly, for a 3D cartesian grid, the subgrids can
either be 2D or 1D. In either case, the full length of each retained
dimension of the original grid is used in the subgrids.
Fortran syntax
call MPI_Cart_sub(old_comm, remain_dims, new_comm, ierr)

where remain_dims is a logical array with one entry per dimension of
old_comm; a .true. entry means that dimension is kept in the subgrid.
C syntax
int MPI_Cart_sub(MPI_Comm old_comm, int *remain_dims, MPI_Comm *new_comm)
Example in Fortran
For a 2D cartesian grid, create subgrids of rows and columns
!  Create cartesian topology for processes
   dims(1) = nv
   dims(2) = mv
   call MPI_Cart_create(MPI_COMM_WORLD, ndim, dims, &
        period, reorder, grid_comm, ierr)
   call MPI_Comm_rank(grid_comm, me, ierr)
   call MPI_Cart_coords(grid_comm, me, ndim, coords, ierr)

!  Create row subgrids ==> 3 subgrids of (1x2)
   remain(0) = .false.
   remain(1) = .true.
   call MPI_Cart_sub(grid_comm, remain, row_comm, ierr)
   call MPI_Comm_rank(row_comm, row_id, ierr)
   call MPI_Cart_coords(row_comm, row_id, 1, row_coords, ierr)

!  Create column subgrids ==> 2 subgrids of (3x1)
   remain(0) = .true.
   remain(1) = .false.
   call MPI_Cart_sub(grid_comm, remain, col_comm, ierr)
   call MPI_Comm_rank(col_comm, col_id, ierr)
   call MPI_Cart_coords(col_comm, col_id, 1, col_coords, ierr)
Shown in Figure a below is a 3-by-2 cartesian topology (grid)
where the index pair "i,j" indicates row "i" and column "j". The number
in parentheses is the rank associated with the cartesian grid.
Figure b shows the row subgrids, while
Figure c shows the column subgrids.
As another example, let's look at a 3D cartesian grid of size 3x2x4.
Calling MPI_Cart_sub with the
remain_dims array defined as
remain_dims(0) = .true.
remain_dims(1) = .true.
remain_dims(2) = .false.
yields four 3-by-2 subgrids.
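A minimal sketch of the calls that produce this decomposition follows. It assumes declarations along the lines of the 2D example above (e.g. integer dims(3), logical remain_dims(0:2), logical period(3)); the names grid3d_comm and subgrid_comm are illustrative, not required by MPI.

```fortran
!  Sketch: create the 3x2x4 grid (24 processes assumed)
   dims(1) = 3
   dims(2) = 2
   dims(3) = 4
   call MPI_Cart_create(MPI_COMM_WORLD, 3, dims, &
        period, reorder, grid3d_comm, ierr)

!  Keep the first two dimensions; drop the third
   remain_dims(0) = .true.
   remain_dims(1) = .true.
   remain_dims(2) = .false.
   call MPI_Cart_sub(grid3d_comm, remain_dims, subgrid_comm, ierr)

!  Each process now belongs to one of four (3x2) subgrid
!  communicators; query its rank and coordinates within it
   call MPI_Comm_rank(subgrid_comm, id2d, ierr)
   call MPI_Cart_coords(subgrid_comm, id2d, 2, coords2d, ierr)
```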
The output of a Fortran code for this particular arrangement is shown below:
MPI_Cart_sub example: 3x2x4 cartesian grid ==> 4 (3x2) subgrids
   Iam    3D      3D cartesian      2D     2D subgrid
         Rank       coords.        Rank     coords.
     0     0  |    0  0  0    |     0  |    0  0
    18    18  |    2  0  2    |     4  |    2  0
     8     8  |    1  0  0    |     2  |    1  0
    14    14  |    1  1  2    |     3  |    1  1
    16    16  |    2  0  0    |     4  |    2  0
    12    12  |    1  1  0    |     3  |    1  1
     1     1  |    0  0  1    |     0  |    0  0
     7     7  |    0  1  3    |     1  |    0  1
     6     6  |    0  1  2    |     1  |    0  1
    10    10  |    1  0  2    |     2  |    1  0
    23    23  |    2  1  3    |     5  |    2  1
     4     4  |    0  1  0    |     1  |    0  1
     5     5  |    0  1  1    |     1  |    0  1
     2     2  |    0  0  2    |     0  |    0  0
     3     3  |    0  0  3    |     0  |    0  0
    21    21  |    2  1  1    |     5  |    2  1
     9     9  |    1  0  1    |     2  |    1  0
    15    15  |    1  1  3    |     3  |    1  1
    19    19  |    2  0  3    |     4  |    2  0
    20    20  |    2  1  0    |     5  |    2  1
    17    17  |    2  0  1    |     4  |    2  0
    13    13  |    1  1  1    |     3  |    1  1
    11    11  |    1  0  3    |     2  |    1  0
    22    22  |    2  1  2    |     5  |    2  1
The above table is illustrated below in four figures, each representing
one of the four 2D subgrids of size (3x2). The top set of numbers in each
cell denotes the 3D cartesian grid coordinates, with the process rank in
that grid enclosed in parentheses. The lower set represents the
corresponding coordinates and rank in the 2D subgrid.
Figure d. Subgrid for K = 0

    0,0,0 (0)     0,1,0 (4)
    0,0   (0)     0,1   (1)

    1,0,0 (8)     1,1,0 (12)
    1,0   (2)     1,1   (3)

    2,0,0 (16)    2,1,0 (20)
    2,0   (4)     2,1   (5)

Figure e. Subgrid for K = 1

    0,0,1 (1)     0,1,1 (5)
    0,0   (0)     0,1   (1)

    1,0,1 (9)     1,1,1 (13)
    1,0   (2)     1,1   (3)

    2,0,1 (17)    2,1,1 (21)
    2,0   (4)     2,1   (5)

Figure f. Subgrid for K = 2

    0,0,2 (2)     0,1,2 (6)
    0,0   (0)     0,1   (1)

    1,0,2 (10)    1,1,2 (14)
    1,0   (2)     1,1   (3)

    2,0,2 (18)    2,1,2 (22)
    2,0   (4)     2,1   (5)

Figure g. Subgrid for K = 3

    0,0,3 (3)     0,1,3 (7)
    0,0   (0)     0,1   (1)

    1,0,3 (11)    1,1,3 (15)
    1,0   (2)     1,1   (3)

    2,0,3 (19)    2,1,3 (23)
    2,0   (4)     2,1   (5)
We have just demonstrated the use of MPI_Cart_sub to divide a cartesian grid into
subgrids of lower dimensions.
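A common use of these subgrid communicators is to restrict a collective operation to a single row or column. As a hedged sketch, reusing row_comm from the 2D example above (buffer, count, and root are illustrative names chosen here, not part of that example):

```fortran
!  Each row subgrid acts as an independent communicator:
!  rank 0 *within each row* broadcasts to the rest of that
!  row only, with no effect on the other rows
   root = 0
   call MPI_Bcast(buffer, count, MPI_DOUBLE_PRECISION, &
        root, row_comm, ierr)
```

Note that root here is a rank within row_comm, not within the original grid communicator.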
On occasion, the information describing a subgrid may not be available, as in the
case where the subgrid communicator was created in one routine and is used in
another. In such a situation, MPI_Cartdim_get may
be called to find the number of dimensions of the subgrid. Armed with this
information, further details may be obtained by calling
MPI_Cart_get. We will discuss these routines next.
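As a preview, a sketch of how the two routines might be combined on a subgrid communicator (sub_comm is an illustrative name; dims, periods, and coords are caller-declared arrays assumed large enough to hold the result):

```fortran
!  Recover the topology of a communicator created elsewhere
   call MPI_Cartdim_get(sub_comm, ndims, ierr)      ! number of dimensions
   call MPI_Cart_get(sub_comm, ndims, dims, &       ! sizes, periodicity,
        periods, coords, ierr)                      ! and my coordinates
```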