DAIC Experimental
— Documentation for the experimental environment.
Python Setup
Set up Python environments on DAIC.
Option 1: Use modules (recommended for common packages)
Load pre-installed Python packages (run `module avail` to list everything available):
$ module load 2025/gpu
$ module load py-torch/2.5.1
$ module load py-numpy/1.26.4
$ python -c "import torch; print(torch.__version__)"
2.5.1
Option 2: UV (recommended for custom packages)
UV is a fast Python package and project manager.
Run shell setup first
If you haven’t already, run the shell setup to configure storage paths.
Install UV
$ curl -LsSf https://astral.sh/uv/install.sh | sh
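Because the home directory quota is small, it can also help to keep UV's package cache in project storage. A minimal sketch, assuming the shell setup has not already set this for you (the path is a placeholder):

```shell
# Keep UV's package cache off the small home quota (placeholder path).
export UV_CACHE_DIR="/tudelft.net/staff-umbrella/<project>/.cache/uv"
# Add the line above to your ~/.bashrc to make it persistent.
```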
Create a project
$ cd /tudelft.net/staff-umbrella/<project>
$ uv init myproject
$ cd myproject
$ uv add torch numpy pandas
Run scripts
$ uv run python train.py
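After `uv add`, dependencies are recorded in `pyproject.toml` and pinned in `uv.lock`. The project created above might contain roughly the following (versions illustrative):

```toml
[project]
name = "myproject"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "numpy>=2.0",
    "pandas>=2.2",
    "torch>=2.5",
]
```

Committing both files lets `uv sync` reproduce the same environment on another machine.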
Install CLI tools
Install Python tools globally (ruff, black, jupyter, etc.):
$ uv tool install ruff
$ uv tool install jupyter
$ ruff --version
$ jupyter lab
List installed tools:
$ uv tool list
Use in batch scripts
#!/bin/bash
#SBATCH --account=<your-account>
#SBATCH --partition=all
#SBATCH --gres=gpu:1
module purge
module load 2025/gpu cuda/12.9
cd /tudelft.net/staff-umbrella/<project>/myproject
srun uv run python train.py
Storage location
Create projects in project storage, not in your home directory (5MB quota).
Option 3: Pixi (conda alternative)
Pixi is a fast conda-compatible package manager.
Install Pixi
$ curl -fsSL https://pixi.sh/install.sh | sh
Create a project
$ cd /tudelft.net/staff-umbrella/<project>
$ pixi init myproject
$ cd myproject
$ pixi add python pytorch numpy
Run commands
$ pixi run python train.py
$ pixi shell # activate environment
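`pixi add` records dependencies in the project's `pixi.toml` manifest. For the project above it might look roughly like this (the exact layout depends on the Pixi version; channel and constraints are illustrative):

```toml
[project]
name = "myproject"
channels = ["conda-forge"]
platforms = ["linux-64"]

[dependencies]
python = "*"
pytorch = "*"
numpy = "*"
```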
Use in batch scripts
#!/bin/bash
#SBATCH --account=<your-account>
#SBATCH --partition=all
#SBATCH --gres=gpu:1
module purge
module load 2025/gpu cuda/12.9
cd /tudelft.net/staff-umbrella/<project>/myproject
srun pixi run python train.py
Option 4: Virtual environment
A venv builds on the module-provided Python; --system-site-packages lets it reuse the pre-installed packages while pip adds extras.
$ module load 2025/gpu
$ python -m venv /tudelft.net/staff-umbrella/<project>/venvs/myenv --system-site-packages
$ source /tudelft.net/staff-umbrella/<project>/venvs/myenv/bin/activate
$ pip install transformers
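Venvs work in batch jobs too; a sketch mirroring the batch scripts for the other options (account, path, and script names are placeholders):

```shell
#!/bin/bash
#SBATCH --account=<your-account>
#SBATCH --partition=all
#SBATCH --gres=gpu:1
module purge
module load 2025/gpu cuda/12.9
source /tudelft.net/staff-umbrella/<project>/venvs/myenv/bin/activate
srun python train.py
```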
Option 5: Micromamba (global conda environments)
Micromamba is a lightweight conda implementation. Use it when you need traditional conda environments shared across projects.
Install Micromamba
$ "${SHELL}" <(curl -L micro.mamba.pm/install.sh)
Follow the prompts. When asked for install location, use your project storage:
/tudelft.net/staff-umbrella/<project>/micromamba
Create and use environments
$ micromamba create -n myenv python=3.11 pytorch numpy -c conda-forge -c pytorch
$ micromamba activate myenv
$ python -c "import torch; print(torch.__version__)"
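Environments can also be described declaratively and recreated with `micromamba create -f environment.yml`. A hedged sketch of such a file, matching the environment above (versions illustrative):

```yaml
name: myenv
channels:
  - conda-forge
  - pytorch
dependencies:
  - python=3.11
  - pytorch
  - numpy
```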
Use in batch scripts
#!/bin/bash
#SBATCH --account=<your-account>
#SBATCH --partition=all
#SBATCH --gres=gpu:1
module purge
module load 2025/gpu cuda/12.9
eval "$(micromamba shell hook --shell bash)"
micromamba activate myenv
srun python train.py
When to use which tool
- UV: Best for most projects. Fast, lockfiles, reproducible.
- Pixi: When you need conda-forge packages in a project.
- Micromamba: When you need shared environments across projects.
Next steps
- See Modules for available packages
- Use Containers for complex dependencies