Prerequisites and installation#

This section lists the prerequisites of pyALF and how to set things up in order to use it productively.

ALF prerequisites#

Since pyALF builds on ALF, we also want to satisfy its requirements. Note, however, that pyALF’s postprocessing features are independent from ALF. This might be relevant, for example, when performing Monte Carlo runs and analysis on different machines.

The minimal ALF prerequisites are:

  • A Unix shell, e.g. Bash

  • Make

  • A recent Fortran compiler (it must support submodules)

  • BLAS+LAPACK

  • Python 3

For parallelization, an MPI development library, e.g. Open MPI, is necessary.

Results from ALF can be saved either in a plain-text format or in HDF5, but full pyALF support is only provided for the latter, which is why HDF5 is enabled by default in pyALF. ALF automatically downloads and compiles HDF5. For this to succeed, the following is needed:

  • A C compiler (most often included automatically when installing a Fortran compiler)

  • A C++ preprocessor

  • Curl

  • gzip development libraries

The recommended way of obtaining the source code is through git.

Finally, the ALF testsuite needs:

  • CMake

As an example, the requirements mentioned above can be satisfied on Debian Bullseye, Ubuntu Focal, or a similar operating system using the APT package manager by executing the command:

sudo apt install make gfortran libblas-dev liblapack-dev \
           python3 libopenmpi-dev g++ curl zlib1g-dev \
           git ca-certificates cmake

The above installs compilers from the GNU compiler collection. Other supported and tested compiler frameworks are from the Intel® oneAPI Toolkits and the NVIDIA HPC SDK. The latter is denoted as PGI in ALF.
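After installation, a quick sanity check can confirm that the main build tools are available on the $PATH. The following is just a sketch; the tool list mirrors the requirements above:

```shell
# Sketch: report any of the main ALF build tools missing from $PATH.
missing=""
for tool in make gfortran python3 git curl cmake; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
    echo "All required tools found."
else
    echo "Missing tools:$missing"
fi
```

If a tool is reported missing, install the corresponding package before attempting to compile ALF.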

pyALF prerequisites#

Besides ALF, pyALF needs the following Python 3 packages:

  • h5py

  • numpy

  • pandas

  • matplotlib

  • numba

  • scipy

  • tkinter

  • ipywidgets (= Jupyter Widgets)

  • ipympl (= Matplotlib Jupyter Integration)

  • f90nml

Furthermore, one needs JupyterLab or Jupyter Notebook to use the interactive Python notebooks of pyALF. All of this can be installed, for example, by executing:

sudo apt install python3-pip python3-tk
pip install h5py numpy pandas matplotlib numba scipy \
            jupyterlab ipywidgets ipympl f90nml
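Afterwards, one can verify that the listed packages are importable by looping over them with python3. This is a sketch; a missing package is simply reported, not installed:

```shell
# Sketch: check that each required Python package can be imported.
missing_pkgs=""
for pkg in h5py numpy pandas matplotlib numba scipy tkinter ipywidgets ipympl f90nml; do
    python3 -c "import $pkg" >/dev/null 2>&1 || missing_pkgs="$missing_pkgs $pkg"
done
if [ -z "$missing_pkgs" ]; then
    echo "All pyALF Python dependencies found."
else
    echo "Missing packages:$missing_pkgs"
fi
```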

Setting up the environment#

The recommended way of obtaining pyALF is by cloning the git repository through the shell command

git clone https://git.physik.uni-wuerzburg.de/ALF/pyALF.git

This will create a folder called pyALF in the terminal's current working directory and download the repository into it[1].

For Python to find the modules of pyALF, its location has to be added to the $PYTHONPATH environment variable. When using a Unix shell, this is achieved through the command

export PYTHONPATH="/path/to/pyALF:$PYTHONPATH"

where /path/to/pyALF is the location of the pyALF code, for example /home/jonas/Programs/pyALF. To not have to repeat this command in every terminal session, it is advisable to add it to a file sourced when starting the shell, e.g. ~/.bashrc or ~/.zshrc. Furthermore, pyALF supplies a number of command line tools. To use them conveniently, one may add /path/to/pyALF/py_alf/cli/ to the $PATH environment variable:

export PATH="/path/to/pyALF/py_alf/cli:$PATH"

Since pyALF is set up to automatically clone ALF with git, it is not strictly necessary to download ALF manually, but pyALF will download ALF every time it does not find it. It is therefore recommended to clone ALF once manually from https://git.physik.uni-wuerzburg.de/ALF/ALF.git and set the environment variable:

export ALF_DIR="/path/to/ALF"

This way, pyALF will use the same ALF source code directory every time.
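Putting the pieces together, the three environment variables can be set in one place, e.g. in ~/.bashrc. The paths below are hypothetical examples and must be replaced with the actual clone locations:

```shell
# Hypothetical example paths -- adjust to your actual clone locations.
export PYTHONPATH="$HOME/Programs/pyALF:$PYTHONPATH"
export PATH="$HOME/Programs/pyALF/py_alf/cli:$PATH"
export ALF_DIR="$HOME/Programs/ALF"
```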

Check setup#

To check if most things have been set up correctly, the script minimal_ALF_run.py can be used. It executes the same commands as the Minimal example. The script is located in py_alf/cli and one should therefore be able to run it by executing

minimal_ALF_run.py

in the Unix shell, provided $PATH has been extended correctly. If the script clones the ALF repository, $ALF_DIR has not been set up correctly. Note that on the first compilation, ALF downloads and compiles HDF5, which can take up to ~15 minutes.
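Before running the script, one can also check whether $ALF_DIR is set and points to an existing directory. A minimal sketch:

```shell
# Sketch: check whether ALF_DIR is set and points to an existing directory.
if [ -n "${ALF_DIR:-}" ] && [ -d "$ALF_DIR" ]; then
    alf_status="ok"
    echo "ALF_DIR points to $ALF_DIR"
else
    alf_status="unset-or-missing"
    echo "ALF_DIR is unset or not a directory; pyALF will clone ALF itself."
fi
```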

Ready-to-use Docker image#

For a ready-to-use environment, one can use the Docker image alfcollaboration/jupyter-pyalf-full, which has both the above-mentioned dependencies installed and the ALF and pyALF source code with the corresponding environment variables set. It is derived from the Docker image jupyter/minimal-notebook, so the documentation for that image applies as well.

Using Jupyter via SSH tunnel#

If one would like to do all the computing, including plotting, on a remote machine while still using Jupyter notebooks, there is a very easy way to do so using SSH port forwarding. The only things needed on the local machine are a browser, an SSH client, and a Unix shell[2].

When launched, JupyterLab or Jupyter Notebook sets up a web server and prints out how to access it locally, like:

http://localhost:<remote_port_number>/lab?token=<token>

or

http://localhost:<remote_port_number>/?token=<token>

where <remote_port_number> is some port number (default 8888) and <token> is the password for accessing the server.

Now, to access this web server on the remote machine, one can forward this port to the local machine using the SSH option -L and open it with the browser. With some additional options, we can make it even more efficient and convenient:

ssh -fNT -M -S <some_socket_name> -L <local_port_number>:localhost:<remote_port_number> <username@serveraddress>

The command now goes into the background and persists even if the terminal that executed it is closed. The string <some_socket_name> is the name of a control socket created by this command, which will be represented by a file-like object in the Unix environment. The number <local_port_number> is the local port to which the remote port will be mapped, and <username@serveraddress> is the SSH address of the remote server.

With the command from above, a remote JupyterLab will be accessible through the address http://localhost:<local_port_number>/lab?token=<token>.

For interacting with the background process that does the forwarding, the control sockets can be used. The following two commands check the status of the process and stop it:

ssh -S <some_socket_name> -O check <username@serveraddress>
ssh -S <some_socket_name> -O exit <username@serveraddress>