Problems installing needed libraries in a new environment

I am trying to run a retrieval on a transmission spectrum of an exoplanet atmosphere. For parallelization, the retrieval code (petitRADTRANS: exoplanet spectra for everyone! — petitRADTRANS 3.2.0 documentation) uses MPI and PyMultiNest. I am trying to set up a new Python environment where I install these libraries, but I am having trouble getting it working. I tried to follow the installation instructions in the PyMultiNest documentation (Installing PyMultiNest and PyCuba — pymultinest 2.9 documentation), but they use a sudo install, which doesn't work here. Is there any workaround for this?

I can see that the installation instructions assume the user has root access, which is not always possible.
Our recommendation for Fornax is to use packages from conda (through micromamba). This gives users more flexibility in installing their own packages, including compilers and libraries like those required for petitRADTRANS, without needing root access on the system.

Adapting the instructions from the links above to use libraries from conda, the following script compiles and installs both pymultinest and petitRADTRANS. It creates a conda environment in the user's home directory so that it persists between sessions.

If these tools are expected to be useful to more users, we can consider adding them to our software stack so they become available by default.

#!/usr/bin/bash

# Create a new conda environment called multinest in $HOME/user-envs/multinest
mkdir -p $HOME/user-envs
micromamba create -y -p $HOME/user-envs/multinest \
    python=3.12 openmpi libblas blas liblapack ipykernel compilers cmake make
micromamba activate $HOME/user-envs/multinest
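# Note: 'micromamba activate' only works if the micromamba shell hook has been
# initialized (usually via ~/.bashrc), which is why this script is run with
# 'bash -i' rather than plain 'bash' (see the instructions below).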

# install multinest
mkdir -p /tmp/multinest
cd /tmp/multinest
if [ -d MultiNest ]; then rm -rf MultiNest; fi
/opt/envs/base/bin/git clone https://github.com/JohannesBuchner/MultiNest
cd MultiNest/build

# For some reason FIND_LIBRARY fails here, but compiling with -lm works.
sed -i 's/FIND_LIBRARY/# FIND_LIBRARY/' ../src/example_eggbox_C/CMakeLists.txt
cmake -DCMAKE_POLICY_VERSION_MINIMUM=3.5  ..
make
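# install the freshly compiled MultiNest libraries into the environment's lib
# directory so that pymultinest can find libmultinest.so at runtime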
cp ../lib/* $HOME/user-envs/multinest/lib/

# install cuba
cd /tmp/multinest
if [ -d cuba ]; then rm -rf cuba; fi
/opt/envs/base/bin/git clone https://github.com/JohannesBuchner/cuba/
cd cuba
./configure
# add -Wl,--allow-multiple-definition to the gcc for shared lib
sed -i 's/-o libcuba.so/-o libcuba.so -Wl,--allow-multiple-definition/' makesharedlib.sh
./makesharedlib.sh
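# install the Cuba shared library into the environment so that pycuba can load libcuba.so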
cp libcuba.so $HOME/user-envs/multinest/lib/

# Install pymultinest
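# scipy, matplotlib and progressbar are installed alongside pymultinest for its
# analysis and plotting helpers (e.g. multinest_marginals.py)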
pip install --no-cache-dir pymultinest scipy matplotlib progressbar


# add the new env to jupyterlab launcher 
micromamba run -p $HOME/user-envs/multinest python -m ipykernel install --name multinest --prefix $JUPYTER_DIR

# check
# python -c 'import pymultinest'
# python -c 'import pycuba'


# Now do petitradtrans
pip install --no-cache-dir numpy meson-python ninja
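# --no-build-isolation makes the petitRADTRANS build use the numpy, meson-python
# and ninja installed above, together with the environment's compilers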
pip install --no-cache-dir petitRADTRANS --no-build-isolation

# test it
# python -c 'import petitRADTRANS'

# some cleaning: remove the build tools and clear the package cache to save space
micromamba uninstall -y compilers cmake make
micromamba clean -yaf

Save this to a file, say install_petitradtrans.sh.
Then call it with: bash -i install_petitradtrans.sh.
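Once the script finishes, a quick way to confirm that everything installed correctly is to run the checks that are commented out in the script against the new environment, for example:

micromamba run -p $HOME/user-envs/multinest python -c 'import pymultinest'
micromamba run -p $HOME/user-envs/multinest python -c 'import pycuba'
micromamba run -p $HOME/user-envs/multinest python -c 'import petitRADTRANS'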

The script also adds the new kernel so it can be used in notebooks. In the terminal, the environment can be activated with:

micromamba activate $HOME/user-envs/multinest

If you close your server session and start a new one, you may need to add the kernel again with the following command, as the list of kernels is reset with every new session:

micromamba run -p $HOME/user-envs/multinest python -m ipykernel install --name multinest --prefix $JUPYTER_DIR
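If you prefer not to retype that command, one option (purely a convenience; the file name below is just a suggestion) is to save it in a small helper script in your home directory and run that at the start of each session:

cat > $HOME/add_multinest_kernel.sh << 'EOF'
micromamba run -p $HOME/user-envs/multinest python -m ipykernel install --name multinest --prefix $JUPYTER_DIR
EOF

bash -i $HOME/add_multinest_kernel.sh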


Thank you so much for your response. This has worked for me! I have successfully been able to import petitRADTRANS, shut down my session and then start a new one and get petitRADTRANS running again in the new session.

As a rule of thumb moving forward, if I would like to add other packages to this environment, is it recommended to keep adding them with the same method used for numpy, petitRADTRANS, etc. (i.e. pip install --no-cache-dir XXX)? I am not completely clear on when packages should be installed via conda/micromamba and when they should be installed via pip. And if I should stick with pip, is --no-cache-dir recommended for space reasons?

Yes, you can add more packages to that environment. Using conda vs pip is a personal choice that depends on many factors. We have seen conflicts arise when mixing them, so we recommend using pip unless the package is not available via pip (e.g. the compilers or liblapack).
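For example, to add another package from PyPI, you would activate the environment in a terminal and install it with pip as before (corner below is just an illustrative package name):

micromamba activate $HOME/user-envs/multinest
pip install --no-cache-dir corner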

And yes, --no-cache-dir is there so that your home directory does not fill up with cached packages. We have learned that keeping an eye on cache files in the cloud is good practice for controlling storage costs.
In any case, we have set things up so that the cache lives under /tmp/, which is cleared with every session, rather than in the persistent home directory, but passing the option is still good practice.
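If you ever want to check where pip is putting its cache, or redirect it yourself, pip has a cache subcommand and honours the standard PIP_CACHE_DIR environment variable, e.g.:

pip cache dir                         # show the current cache location
export PIP_CACHE_DIR=/tmp/pip-cache   # send the cache to ephemeral storage for this shell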