Introduction#
LAMMPS is a classical molecular dynamics code. The name stands for Large-scale Atomic/Molecular Massively Parallel Simulator. LAMMPS is distributed by Sandia National Laboratories, a US Department of Energy laboratory.
Modules#
On Grex's default software stack (SBEnv), LAMMPS is built with a variety of compilers and OpenMPI 4.1.
To find out which versions are available, use module spider lammps.
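For example, the following commands list the available LAMMPS modules and then show which compiler and MPI modules must be loaded first for one specific version (the version shown here is just the one used in the examples below):
# list every LAMMPS module in the current software stack
module spider lammps
# show the prerequisite modules for a particular version
module spider lammps/2024-08-29p1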
For example, to load the 29 August 2024 (patch 1) version built with GCC:
module load arch/avx512 gcc/13.2.0 openmpi/4.1.6
module load lammps/2024-08-29p1
Or, to load an older LAMMPS version built with the Intel oneAPI compilers:
module load arch/avx512 intel-one/2024.1 openmpi/4.1.6
module load lammps/2021-09-29
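After loading either combination, an optional quick sanity check is to ask the executable for its help text, which includes the version and the list of packages it was built with; the executable name lmp matches the job scripts below:
# print the version, installed packages, and available styles of the loaded build
lmp -help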
There is also a GPU version of LAMMPS on Grex that uses the KOKKOS library as its GPU interface. It can be loaded as follows:
module load cuda/12.4.1 arch/avx2 gcc/13.2.0 openmpi/4.1.6
module load lammps/2024-08-29p1
The above LAMMPS version should be used only on GPU-enabled partitions.
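To try the GPU build interactively before submitting batch jobs, a short session on the GPU partition can be requested first; the resource values below are only an illustration:
# request one GPU and a few cores for a short interactive test session
salloc --partition=gpu --gpus=1 --cpus-per-gpu=4 --mem=8000M --time=0-0:30:00
# once on the GPU node, confirm that the GPU is visible
nvidia-smi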
It is also possible to load LAMMPS modules from the Alliance software stack after loading CCEnv:
module purge
module load CCEnv
module load arch/avx512
module load StdEnv/2023
module load intel/2023.2.1 openmpi/4.1.5
module load lammps-omp/20230802
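Once the CCEnv modules are loaded, it may be worth confirming that the LAMMPS executable is on the path; in this stack the binary is also expected to be called lmp, as assumed in the job scripts below:
# show the loaded modules and where the lmp executable comes from
module list
which lmp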
Serial version#
Script example using a module from SBEnv:
#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1
#SBATCH --mem=1500M
#SBATCH --time=0-3:00:00
#SBATCH --job-name=Lammps-Test
# Load the modules:
module load arch/avx512 intel-one/2024.1 openmpi/4.1.6
module load lammps/2021-09-29
echo "Starting run at: `date`"
lmp_exec=lmp
lmp_input="lammps.in"
lmp_output="lammps_lj_output.txt"
${lmp_exec} < ${lmp_input} > ${lmp_output}
echo "Program finished with exit code $? at: `date`"
Script example using a module from CCEnv:
#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1
#SBATCH --mem=1500M
#SBATCH --time=0-3:00:00
#SBATCH --job-name=Lammps-Test
# Load the modules:
module purge
module load CCEnv
module load arch/avx512
module load StdEnv/2023
module load intel/2023.2.1 openmpi/4.1.5
module load lammps-omp/20230802
echo "Starting run at: `date`"
lmp_exec=lmp
lmp_input="lammps.in"
lmp_output="lammps_lj_output.txt"
${lmp_exec} < ${lmp_input} > ${lmp_output}
echo "Program finished with exit code $? at: `date`"
MPI version#
Script example using a module from SBEnv:
#!/bin/bash
#SBATCH --ntasks=16
#SBATCH --cpus-per-task=1
#SBATCH --mem-per-cpu=1500M
#SBATCH --time=0-3:00:00
#SBATCH --job-name=Lammps-Test
# Load the modules:
module load arch/avx512 gcc/13.2.0 openmpi/4.1.6
module load lammps/2024-08-29p1
echo "Starting run at: `date`"
lmp_exec=lmp
lmp_input="lammps.in"
lmp_output="lammps_lj_output.txt"
srun ${lmp_exec} -in ${lmp_input} -log ${lmp_output}
echo "Program finished with exit code $? at: `date`"
Script example using a module from CCEnv:
#!/bin/bash
#SBATCH --ntasks=16
#SBATCH --cpus-per-task=1
#SBATCH --mem-per-cpu=1500M
#SBATCH --time=0-3:00:00
#SBATCH --job-name=Lammps-Test
# Load the modules:
module purge
module load CCEnv
module load arch/avx512
module load StdEnv/2023
module load intel/2023.2.1 openmpi/4.1.5
module load lammps-omp/20230802
echo "Starting run at: `date`"
lmp_exec=lmp
lmp_input="lammps.in"
lmp_output="lammps_lj_output.txt"
srun ${lmp_exec} -in ${lmp_input} -log ${lmp_output}
echo "Program finished with exit code $? at: `date`"
Script example for the GPU (KOKKOS) version from SBEnv:
#!/bin/bash
#SBATCH --ntasks=1 --partition=gpu
#SBATCH --cpus-per-gpu=8 --gpus=1
#SBATCH --mem=15000M
#SBATCH --time=0-3:00:00
#SBATCH --job-name=Lammps-Test-GPU
# Load the modules:
module load cuda/12.4.1 arch/avx2 gcc/13.2.0 openmpi/4.1.6
module load lammps/2024-08-29p1
echo "Starting run at: `date`"
lmp_exec=lmp
lmp_input="lammps.in"
lmp_output="lammps_lj_output.txt"
# this example uses KOKKOS GPU module, on a single GPU
srun ${lmp_exec} -in ${lmp_input} -k on g 1 -sf kk -pk kokkos newton off neigh full -log ${lmp_output}
echo "Program finished with exit code $? at: `date`"
Related links#
- LAMMPS website
- LAMMPS GitHub
- LAMMPS online documentation
- LAMMPS
- Tuning LAMMPS from HPC Carpentry