
Commit ea39c45

Update Daint MPI infos
1 parent af96b33 · commit ea39c45


5 files changed (+24 −38 lines)


scripts/l7_runme2D.sh (+1 −4)

```diff
@@ -9,7 +9,4 @@
 #SBATCH --constraint=gpu
 #SBATCH --account class04
 
-module load daint-gpu
-module load Julia/1.9.3-CrayGNU-21.09-cuda
-
-srun julia -O3 PorousConvection_2D_xpu.jl
+srun julia --project PorousConvection_2D_xpu.jl
```
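
Both `l7_runme*` job scripts in this commit now rely on `julia --project` instead of module loads plus `julia -O3`. A minimal usage sketch, assuming the course `Project.toml` sits in the directory the job is submitted from (the one-time instantiate step is an assumption, not part of this commit):

```sh
# One-time setup (assumption): resolve the environment that --project points to
julia --project -e 'using Pkg; Pkg.instantiate()'

# Submit the updated job script to Slurm; it prints "Submitted batch job <jobid>"
sbatch l7_runme2D.sh
```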

scripts/l7_runme3D.sh (+1 −4)

```diff
@@ -9,7 +9,4 @@
 #SBATCH --constraint=gpu
 #SBATCH --account class04
 
-module load daint-gpu
-module load Julia/1.9.3-CrayGNU-21.09-cuda
-
-srun julia -O3 PorousConvection_3D_xpu.jl
+srun julia --project PorousConvection_3D_xpu.jl
```

+1 −4

```diff
@@ -1,9 +1,6 @@
 #!/bin/bash -l
 
-module load daint-gpu
-module load Julia/1.9.3-CrayGNU-21.09-cuda
-
 export MPICH_RDMA_ENABLED_CUDA=0
 export IGG_CUDAAWARE_MPI=0
 
-julia -O3 diffusion_2D_perf_multixpu.jl
+julia --project diffusion_2D_perf_multixpu.jl
```
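
As the software_install.md section further down describes, a launch script like this one is made executable and started with `srun` from inside an allocation. A short sketch using the commands quoted there (the exact script name follows that text and is an assumption here):

```sh
# Make the launch script executable (command quoted from software_install.md below)
chmod +x runme_mpi_daint.sh

# Run it on 4 MPI ranks inside an existing `salloc` allocation
srun -n4 ./runme_mpi_daint.sh
```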

scripts/l8_scripts/l8_sbatch_mpi_daint.sh (+1 −4)

```diff
@@ -9,10 +9,7 @@
 #SBATCH --constraint=gpu
 #SBATCH --account class04
 
-module load daint-gpu
-module load Julia/1.9.3-CrayGNU-21.09-cuda
-
 export MPICH_RDMA_ENABLED_CUDA=0
 export IGG_CUDAAWARE_MPI=0
 
-srun -n4 bash -c 'julia -O3 diffusion_2D_perf_multixpu.jl'
+srun -n4 bash -c 'julia --project diffusion_2D_perf_multixpu.jl'
```
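
A hedged sketch of how this batch script is typically submitted and checked: `sbatch` submission is documented in the software_install.md change below, while the `squeue`/`cat` steps and the concrete job id are illustrative assumptions:

```sh
# Submit without an interactive allocation (as described in software_install.md below)
sbatch l8_sbatch_mpi_daint.sh

# Assumption: monitor the 4-node job and, once it has finished, read the files
# named by the --output/--error directives in the script above
squeue -u $USER
cat diff2D.<jobid>.o diff2D.<jobid>.e
```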

website/software_install.md (+20 −22)

````diff
@@ -504,6 +504,25 @@ salloc -C'gpu' -Aclass04 -N4 -n4 --time=02:00:00
 srun -n4 julia --project <my_mpi_script.jl>
 ```
 
+If you do not want to use an interactive session you can use the `sbatch` command to launch an MPI job remotely on daint. Example of a `sbatch_mpi_daint.sh` you can launch (without need of an allocation) as [`sbatch sbatch_mpi_daint.sh`](https://github.com/eth-vaw-glaciology/course-101-0250-00/blob/main/scripts/l8_scripts/l8_sbatch_mpi_daint.sh):
+```sh
+#!/bin/bash -l
+#SBATCH --job-name="diff2D"
+#SBATCH --output=diff2D.%j.o
+#SBATCH --error=diff2D.%j.e
+#SBATCH --time=00:05:00
+#SBATCH --nodes=4
+#SBATCH --ntasks-per-node=1
+#SBATCH --partition=normal
+#SBATCH --constraint=gpu
+#SBATCH --account class04
+
+srun -n4 bash -c 'julia --project <my_julia_mpi_gpu_script.jl>'
+```
+
+\note{The scripts above can be found in the [scripts](https://github.com/eth-vaw-glaciology/course-101-0250-00/blob/main/scripts/l8_scripts/) folder.}
+
+
 #### CUDA-aware MPI on Piz Daint
 \warn{There is currently an issue on the Daint software stack with CUDA-aware MPI. For now, make sure **not to run** with CUDA-aware MPI, i.e., having both `MPICH_RDMA_ENABLED_CUDA` and `IGG_CUDAAWARE_MPI` set to 0.}
 
@@ -527,25 +546,4 @@ julia --project <my_script.jl>
 Which you then launch using `srun` upon having made it executable (`chmod +x runme_mpi_daint.sh`)
 ```sh
 srun -n4 ./runme_mpi_daint.sh
-```
-
-If you do not want to use an interactive session you can use the `sbatch` command to launch a job remotely on daint. Example of a `sbatch_mpi_daint.sh` you can launch (without need of an allocation) as [`sbatch sbatch_mpi_daint.sh`](https://github.com/eth-vaw-glaciology/course-101-0250-00/blob/main/scripts/l8_scripts/l8_sbatch_mpi_daint.sh):
-```sh
-#!/bin/bash -l
-#SBATCH --job-name="diff2D"
-#SBATCH --output=diff2D.%j.o
-#SBATCH --error=diff2D.%j.e
-#SBATCH --time=00:05:00
-#SBATCH --nodes=4
-#SBATCH --ntasks-per-node=1
-#SBATCH --partition=normal
-#SBATCH --constraint=gpu
-#SBATCH --account class04
-
-export MPICH_RDMA_ENABLED_CUDA=1
-export IGG_CUDAAWARE_MPI=1
-
-srun -n4 bash -c 'julia --project <my_julia_mpi_gpu_script.jl>'
-``` -->
-
-\note{The scripts above can be found in the [scripts](https://github.com/eth-vaw-glaciology/course-101-0250-00/blob/main/scripts/l8_scripts/) folder.}
+```-->
````
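
For completeness, a sketch of the interactive alternative that the surrounding software_install.md section keeps: the `salloc` and `srun` lines are taken from the diff context above, and ending the session with `exit` is an assumption:

```sh
# Request 4 GPU nodes for 2 hours (context line from the diff above)
salloc -C'gpu' -Aclass04 -N4 -n4 --time=02:00:00

# Inside the allocation, run an MPI script on 4 ranks using the project environment
srun -n4 julia --project <my_mpi_script.jl>

# Assumption: leave the interactive allocation when done
exit
```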
