Schedule: 2-5 pm, Room 8-119, Jan 12 and 14 (Octopus), Jan 19 and 21 (BerkeleyGW)
Instructor: Dr. David Strubbe, Department of Materials Science and Engineering, MIT
Please sign up by January 8 by emailing me at dstrubbe @ mit.edu, so we can be sure to have enough space and NERSC accounts.
It is important to bring your laptop so you can do the hands-on exercises.
For the BerkeleyGW part (and optionally the Octopus part) we will be running simulations on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC). You will be provided with a training account to use the machine; using it is strongly recommended even if you already have a NERSC account. If you are not familiar with NERSC, and especially if you have not used other clusters or supercomputers before, look over NERSC's getting started guide. (Don't worry about the discussion of compiling and optimizing code, though; the needed software has already been installed for you.)
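You can check the status of your jobs in the queue at any time. As a minimal sketch, assuming the standard Slurm command (NERSC also provides its own wrappers):

squeue -u $USER

Sample output: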
JOBID   USER      ACCOUNT  NAME        PARTITION  QOS     NODES  TIME_LIMIT  TIME  ST  START_TIME
921487  dstrubbe  mp149    test_pulpo  regular    normal  1      4:00:00     0:00  PD  N/A
Sample job script using 4 cores of 1 node. Name it, for example, job.scr, and submit it with sbatch job.scr.
#!/bin/bash
#SBATCH -J methane
#SBATCH -N 1
#SBATCH -p debug
#SBATCH -t 00:30:00
#SBATCH --export=ALL

cd $SLURM_SUBMIT_DIR
srun -n 4 /project/projectdirs/mp149/IAP2015/octopus-5.0.1/bin/octopus_mpi &> output
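The job script expects an Octopus input file named inp in the directory you submit from. As a minimal sketch only (the parameter values and the geometry file methane.xyz are illustrative, not taken from the tutorial), such a file could be created like this:

# Create a minimal Octopus input file 'inp' in the run directory
# (values and the geometry file 'methane.xyz' are illustrative only)
cat > inp << 'EOF'
CalculationMode = gs
Radius = 3.5*angstrom
Spacing = 0.22*angstrom
XYZCoordinates = "methane.xyz"
EOF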
Octopus: real-space TDDFT.
tutorial materials
Additional instructor: Jacob Sanders, Department of Chemistry and Chemical Biology, Harvard University
We are using version 5.0.1. The Octopus tutorials can be run easily on a small machine such as your laptop or a cluster. For Mac OS, installation via MacPorts is recommended (see here). For Linux, install from a .deb file for your distribution, or along the lines of these instructions. Please be sure to run the testsuite with make check before trying any calculations.
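As a rough sketch of the installation routes above (package and file names are examples; follow the linked instructions for your system):

# Mac OS, via MacPorts:
sudo port install octopus
# Debian/Ubuntu, from a downloaded package (file name is an example):
sudo dpkg -i octopus_5.0.1_amd64.deb
# When building from source, run the testsuite after compiling:
make check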
Further reading:
BerkeleyGW: many-body perturbation theory.
Additional instructor: Dr. Huashan Li, Department of Materials Science and Engineering, MIT
We are using version 1.2-beta. The tutorials are best run on the NERSC machines. However, if you are interested in installing on your own machine for later use, you can use MacPorts for Mac OS (port install berkeleygw) for the latest open release, 1.1-beta2, or install from source following one of the example files in the config directory of the source distribution. Please be sure to run the testsuite with make check before trying any calculations.
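A rough sketch of a source build (the arch.mk file name here is only an example; pick or adapt one from the config directory to match your compilers and libraries):

cd BerkeleyGW-1.2-beta
cp config/generic.mpi.linux.mk arch.mk   # choose/edit an arch.mk for your system
make all-flavors
make check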
Prof. Louie's introduction to the theory and applications: slides
Intro to practical issues for tutorial: slides
Bethe-Salpeter equation: slides
Instructions:
# Log in to Cori
ssh -X train##@cori.nersc.gov

# Load modules
module load berkeleygw/1.2-beta paratec

# Optional: make the output of 'ls' color-coded; only have to do this once
echo 'alias ls="ls --color"' >> ~/.bashrc.ext
. ~/.bashrc

# Start an interactive job.
# ** Do not start more than one interactive job! **
salloc -N 1 -p regular --reservation=20160121_tutorial -t 1:00:00

You will see a message like the following; answer yes.
The authenticity of host 'cmom06 (128.55.145.146)' can't be established.
RSA key fingerprint is SHA256:7SF90Sk0RfoVXDVqQfQtTbMx1/GW9wY0XSuEPnfCqRg.
Are you sure you want to continue connecting (yes/no)? yes

And then these messages:
salloc: Pending job allocation 969589
salloc: job 969589 queued and waiting for resources
salloc: job 969589 has been allocated resources
salloc: Granted job allocation 969589
salloc: Waiting for resource configuration
salloc: Nodes nid00182 are ready for job

When your interactive job has started, continue by doing
# Go to the scratch directory, where all runs should happen.
cd $SCRATCH

# List all examples available
ls /project/projectdirs/mp149/IAP2015

# Copy the 1-silicon example to your directory
cp -R /project/projectdirs/mp149/IAP2015/1-silicon .

# Go to your local folder and follow instructions
cd 1-silicon
less README
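The README in each example gives the exact sequence of runs to perform. As a rough sketch only (the executable and output file names are illustrative, and we assume the berkeleygw module puts the executables in your PATH), a BerkeleyGW step is launched inside the interactive allocation with srun, for example:

# Run the epsilon executable on 4 cores (names are illustrative)
srun -n 4 epsilon.cplx.x &> epsilon.out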
Further reading: