Titan Installation Instructions
In this entry, we describe the steps needed to install UCVM on Titan. The standard UCVMC installation targets a current Linux system; on this page, we describe the details involved in installing UCVM on alternative systems, including supercomputers.
The source files will exceed 25 GB, so we will install in the project storage directories, not the home directories:
| Area | Path | Type | Permissions | Quota | Backups | Purged | Retention |
|------|------|------|-------------|-------|---------|--------|-----------|
| Project Home | /ccs/proj/[projid] | NFS | 770 | 50 GB | Yes | No | 90 days |
| Member Work | $MEMBERWORK/[projid] | Lustre® | 700 [1] | 10 TB | No | 14 days | 14 days |
| Project Work | $PROJWORK/[projid] | Lustre® | 770 | 100 TB | No | 90 days | 90 days |
| World Work | $WORLDWORK/[projid] | Lustre® | 775 | 10 TB | No | 90 days | 90 days |
| Project Archive | /proj/[projid] | HPSS | 770 | 100 TB [2] | No | No | 90 days |
- Our working directory is /lustre/atlas/proj-shared/geo112/pmaech
- git clone https://github.com/SCECcode/ucvmc
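Before cloning and downloading the large files, it is worth confirming there is enough room under the project work area. A minimal check (the geo112 project and pmaech directory are simply the ones used in this walkthrough):

```bash
# Filesystem-level free space on the Lustre project work area,
# plus current usage of our target directory.
df -h /lustre/atlas/proj-shared/geo112
du -sh /lustre/atlas/proj-shared/geo112/pmaech
```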
General guidance is to avoid use of IOBUF modules on Titan due to naming conflicts with some of the UCVM packages. We document our installation process for Titan below.
Titan has a non-standard Linux environment. It uses modules to manage the programming environment, and it is unlikely to have the extended Python packages needed by some of the UCVMC scripts.
- The intended use may not require the plotting scripts, so we will first try the stock Titan Python installation. Our fallback is to install the Anaconda package.
- First, determine a file system where UCVM can be installed. The OLCF home directories are on /ccs/home/$USER and provide only 10 GB of storage, which is too tight for the UCVM source or installation.
- We identify a Lustre filesystem for the UCVM source and installation directories. Recommended is $PROJWORK/[projid], with a 100 TB quota; $MEMBERWORK/[projid] has only a 10 TB quota.
- Titan has a git client; otherwise we would need to upload a tar of the distribution.
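A quick way to see whether the stock Titan Python is sufficient is to try importing the packages the pycvm plotting scripts rely on. The package list below is an assumption (numpy, matplotlib, and basemap are the usual suspects), so adjust it to match the scripts you actually plan to use:

```bash
# Probe the default python for the plotting dependencies.
# Any "MISSING" result points toward the Anaconda fallback.
for pkg in numpy matplotlib mpl_toolkits.basemap; do
    python -c "import $pkg" 2>/dev/null && echo "$pkg OK" || echo "$pkg MISSING"
done
```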
Previous installations on Titan suggest that UCVM will build with the GNU compilers.
First, unload the IOBUF module:
- module unload iobuf
Then swap the PGI programming environment for GNU:
- module swap PrgEnv-pgi PrgEnv-gnu
Ideally we would also link statically, but it is not clear that this is currently supported.
Test runs need to run on the compute nodes, not the head nodes, so we must submit our test jobs through the queue.
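Putting the environment pieces together, the module commands we run before configuring and building look like this. This is a sketch of our setup rather than an official recipe; the cc --version line is just a sanity check that the Cray compiler wrapper now reports gcc:

```bash
# Build environment for UCVMC on Titan (GNU programming environment).
module unload iobuf                # avoid naming conflicts with UCVM packages
module swap PrgEnv-pgi PrgEnv-gnu  # use the GNU compilers instead of PGI
module list                        # confirm PrgEnv-gnu is loaded
cc --version                       # the Cray wrapper should now report gcc
```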
Our installation steps were as follows:
- Log into Titan.
- Move to $PROJWORK/geo112/pmaech. This resolves to the absolute path /lustre/atlas/proj-shared/geo112/pmaech/.
- Create the directory ucvmc_src.
- cd there and git clone https://github.com/SCECcode/UCVMC.git
- cd largefiles/ and run ./get_large_files.py. This download took almost one hour.
- Run ./check_largefiles_md5.py to verify the downloads.
- Create install_inputs.txt containing:
  /lustre/atlas/proj-shared/geo112/pmaech/ucvm yes yes yes yes yes
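For reference, the download-and-prepare steps above collapse into a short shell session (a sketch; the paths are the geo112/pmaech directories used in this walkthrough, and the download time will vary):

```bash
# Fetch the UCVMC source and its large data files into the project work area.
cd /lustre/atlas/proj-shared/geo112/pmaech
mkdir -p ucvmc_src
cd ucvmc_src
git clone https://github.com/SCECcode/UCVMC.git
cd UCVMC/largefiles
./get_large_files.py            # took almost one hour in our case
./check_largefiles_md5.py       # verify the downloads

# install_inputs.txt holds the answers piped to ucvm_setup.py (content as above).
cd ..
cat > install_inputs.txt <<'EOF'
/lustre/atlas/proj-shared/geo112/pmaech/ucvm yes yes yes yes yes
EOF
```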
- Check which modules are loaded
    pmaech@titan-ext7:/lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> module list
    Currently Loaded Modulefiles:
      1) eswrap/1.3.3-1.020200.1278.0        9) dmapp/7.0.1-1.0502.11080.8.74.gem    17) cray-mpich/7.4.0
      2) craype-network-gemini              10) gni-headers/4.0-1.0502.10859.7.8.gem 18) craype-interlagos
      3) pgi/16.5.0                         11) xpmem/0.1-2.0502.64982.5.3.gem       19) lustredu/1.4
      4) craype/2.5.5                       12) dvs/2.5_0.9.0-1.0502.2188.1.113.gem  20) xalt/0.5.3
      5) cray-libsci/16.06.1                13) alps/5.2.4-2.0502.9774.31.12.gem     21) module_msg/0.1
      6) udreg/2.3.2-1.0502.10518.2.17.gem  14) rca/1.0.0-2.0502.60530.1.63.gem      22) modulator/1.2.0
      7) ugni/6.0-1.0502.10863.8.28.gem     15) atp/2.0.2                            23) hsi/5.0.2.p1
      8) pmi/5.0.9-1.0000.10911.175.4.gem   16) PrgEnv-pgi/5.2.82                    24) DefApps
    pmaech@titan-ext7:/lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> module unload iobuf
    pmaech@titan-ext7:/lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> module swap PrgEnv-pgi PrgEnv-gnu
    pmaech@titan-ext7:/lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> module list
    Currently Loaded Modulefiles:
      1) eswrap/1.3.3-1.020200.1278.0        9) module_msg/0.1                       17) dmapp/7.0.1-1.0502.11080.8.74.gem
      2) craype-network-gemini              10) modulator/1.2.0                      18) gni-headers/4.0-1.0502.10859.7.8.gem
      3) gcc/4.9.3                          11) hsi/5.0.2.p1                         19) xpmem/0.1-2.0502.64982.5.3.gem
      4) craype/2.5.5                       12) DefApps                              20) dvs/2.5_0.9.0-1.0502.2188.1.113.gem
      5) cray-mpich/7.4.0                   13) cray-libsci/16.06.1                  21) alps/5.2.4-2.0502.9774.31.12.gem
      6) craype-interlagos                  14) udreg/2.3.2-1.0502.10518.2.17.gem    22) rca/1.0.0-2.0502.60530.1.63.gem
      7) lustredu/1.4                       15) ugni/6.0-1.0502.10863.8.28.gem       23) atp/2.0.2
      8) xalt/0.5.3                         16) pmi/5.0.9-1.0000.10911.175.4.gem     24) PrgEnv-gnu/5.2.82
- python -v shows that the system Python is 2.6: it reports /usr/lib64/python2.6/site.pyc matches /usr/lib64/python2.6/site.py
- Using install_inputs.txt, invoke ./ucvm_setup.py:
    /lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> ./ucvm_setup.py < install_inputs.txt &> install_results.txt &
    [1] 37167
    pmaech@titan-ext7:/lustre/atlas/proj-shared/geo112/pmaech/ucvmc_src/UCVMC> tail -f install_results.txt
    configure.ac:6: installing `./install-sh'
    configure.ac:6: installing `./missing'
    libsrc/Makefile.am:29: `%'-style pattern rules are a GNU make extension
    checking for a BSD-compatible install... /usr/bin/install -c
    checking whether build environment is sane... yes
Eventually the setup printed this:
    Done installing UCVM!

    Thank you for installing UCVM. Please export the following library paths
    (note this is in Bash format):

    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/lib/euclid3/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/lib/proj-4/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/model/cvms426/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/model/cencal/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH

    We recommend adding the above lines to the end of your ~/.bashrc file so that
    they are preserved for the next time you login.

    Once you have set these environment variables, return to the UCVMC source
    directory and type

    make check

    This will run the UCVMC unite and acceptance tests. If all tests pass. UCVMC is
    correctly installed and ready to use on your computer.

    To try out ucvm, once the tests pass, move to the UCVMC installation directory,
    and run an example query. As an example:

    cd /lustre/atlas/proj-shared/geo112/pmaech/ucvm
    ./bin/ucvm_query -f ./conf/ucvm.conf -m cvms < ./tests/test_latlons.txt

    You will then see the following output:

    Using Geo Depth coordinates as default mode.
    -118.0000 34.0000 0.000 280.896 390.000 cvms 696.491 213.000 1974.976 none 0.000 0.000 0.000 crust 696.491 213.000 1974.976

    A copy of all the commands to setup UCVM has been saved at ./setup_log.sh
The default bash environment on Titan does not include a .bashrc or .bash_profile, so we created these files to preserve the required LD_LIBRARY_PATH settings. This may change if we learn there is a preferred way to set environment variables, such as through the module interface. After logging out and back in, the proper paths appear in the environment, so these new .bash files seem to meet our needs.
Our .bashrc file is:
    # This is the .bashrc. This runs on non-interactive sessions. Define
    # paths and others here. The interactive shell calls this one.
    #
    #
    #export PATH=$PATH:.
    source .bash_profile
Our .bash_profile is:
    #
    # Setup path to UCVM files
    #
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/lib/euclid3/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/lib/proj-4/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/model/cvms426/lib:$LD_LIBRARY_PATH
    LD_LIBRARY_PATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/model/cencal/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH
    #
    #
    export PYTHONPATH=/lustre/atlas/proj-shared/geo112/pmaech/ucvm/utilities/pycvm
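As a quick check that a fresh login picks these settings up (the verification mentioned above):

```bash
# On a new login shell, the ucvm entries should show up in both variables.
echo $LD_LIBRARY_PATH | tr ':' '\n' | grep ucvm
echo $PYTHONPATH
```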
Titan builds executables that run on the compute nodes, but not on the head nodes, so we cannot run the acceptance tests the standard way. To run the unittest and accepttest binaries, we create a .pbs file like the one below. It is not certain that we need both the atlas1 and atlas2 file systems mounted, but we will use that as the default for now.
    pmaech@titan-ext2:~> more ucvm_test.pbs
    #!/bin/bash
    # Begin PBS directives
    #PBS -A geo112
    #PBS -N ucvm_accept_test
    #PBS -j oe
    #PBS -l walltime=0:15:00,nodes=1
    #PBS -l gres=atlas1%atlas2
    # End PBS directives and begin shell commands
    cd $PROJWORK/geo112/pmaech/ucvm/tests
    date
    aprun -n 1 ./unittest
    aprun -n 1 ./accepttest
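Submitting and monitoring this job uses the standard PBS commands (a sketch; with -j oe the test output should land in a file named after the job, ucvm_accept_test.o&lt;jobid&gt;, in the submission directory):

```bash
qsub ucvm_test.pbs     # submit the test job to the batch queue
qstat -u $USER         # check its state (Q = queued, R = running, C = complete)
```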
Data Retention, Purge, & Quota Summary

User-Centric Storage Areas

| Area | Path | Type | Permissions | Quota | Backups | Purged | Retention |
|------|------|------|-------------|-------|---------|--------|-----------|
| User Home | $HOME | NFS | User-controlled | 10 GB | Yes | No | 90 days |
| User Archive | /home/$USER | HPSS | User-controlled | 2 TB [1] | No | No | 90 days |

Project-Centric Storage Areas

| Area | Path | Type | Permissions | Quota | Backups | Purged | Retention |
|------|------|------|-------------|-------|---------|--------|-----------|
| Project Home | /ccs/proj/[projid] | NFS | 770 | 50 GB | Yes | No | 90 days |
| Member Work | $MEMBERWORK/[projid] | Lustre® | 700 [2] | 10 TB | No | 14 days | 14 days |
| Project Work | $PROJWORK/[projid] | Lustre® | 770 | 100 TB | No | 90 days | 90 days |
| World Work | $WORLDWORK/[projid] | Lustre® | 775 | 10 TB | No | 90 days | 90 days |
| Project Archive | /proj/[projid] | HPSS | 770 | 100 TB [3] | No | No | 90 days |

- Area: The general name of the storage area.
- Path: The path (symlink) to the storage area's directory.
- Type: The underlying software technology supporting the storage area.
- Permissions: UNIX permissions enforced on the storage area's top-level directory.
- Quota: The limits placed on total number of bytes and/or files in the storage area.
- Backups: States if the data is automatically duplicated for disaster recovery purposes.
- Purged: Period of time, post-file-creation, after which a file will be marked as eligible for permanent deletion.
- Retention: Period of time, post-account-deactivation or post-project-end, after which data will be marked as eligible for permanent deletion.
The UCVMC binaries require command-line parameters, including the path to the ucvm.conf file. A script that runs ucvm_query from the shared installation directory looks like this:
    #!/bin/bash
    # Begin PBS directives
    #PBS -A geo112
    #PBS -N ucvm_accept_test
    #PBS -j oe
    #PBS -l walltime=0:15:00,nodes=1
    #PBS -l gres=atlas1%atlas2
    # End PBS directives and begin shell commands
    cd $PROJWORK/geo112/pmaech/ucvm
    date
    aprun -n 1 ./bin/ucvm_query -f ./conf/ucvm.conf -m cvmsi < $HOME/meshes/test_latlons.txt > $HOME/meshes/query_results.txt
This script moves to the UCVM installation directory, invokes the binary, and points to the ucvm.conf in the shared directory, but it reads its input from and writes its output to a personal account directory.
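As with the acceptance tests, this runs through the batch queue. A sketch of submitting it and checking the results (ucvm_query.pbs is a hypothetical file name for the script above, and $HOME/meshes/test_latlons.txt is assumed to already exist):

```bash
qsub ucvm_query.pbs                  # submit the query job
# once it completes, the material properties are in the personal directory
head $HOME/meshes/query_results.txt
```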
This wiki is licensed by University of Southern California (USC) to the public under a Creative Commons Attribution 4.0 license.