Custom containers

By default, Modal functions are executed within a Debian Bullseye Linux container with a basic Python installation of the same minor version as your local Python.

Oftentimes you might need third-party Python packages or other pre-installed dependencies for your function. Modal provides a few different options to customize the container your function runs in.

Additional Python packages

The simplest and most common container modification is to add a third-party Python package, like pandas. To do this, create a custom modal.Image using the modal.Image.debian_slim constructor, and then extend it with the pip_install method, passing the packages you need.

pandas_image = modal.Image.debian_slim().pip_install("pandas")

@stub.function(image=pandas_image)
def my_function():
    import pandas as pd

    df = pd.DataFrame()

Importing Python packages

You might want to use packages inside your Modal code that you don’t have on your local computer. In the example above, we build a container that uses pandas. But if we don’t have pandas locally, on the computer launching the Modal job, we can’t put import pandas at the top of the script, since it would cause an ImportError.

The easiest solution to this is to put import pandas in the function body instead, as you can see above. This means that pandas is only imported when running inside the remote Modal container, which has pandas installed.
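Why this works: Python only resolves an import statement inside a function body when the function is actually called, so defining the function is safe even when the package is missing locally. A minimal sketch of the mechanism, independent of Modal (the package name below is deliberately made up):

```python
# Importing inside a function body defers module resolution to call time.
def uses_missing_package():
    # This import only runs when the function is called, so merely
    # defining the function succeeds even if the package is absent.
    import a_package_that_is_not_installed  # hypothetical missing package
    return a_package_that_is_not_installed.do_something()

# Defining the function raised no error; calling it raises ImportError.
try:
    uses_missing_package()
    import_failed = False
except ImportError:
    import_failed = True
```

In Modal's case, the function body only ever executes inside the remote container, where the package is installed, so the deferred import always succeeds there.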

If you have a lot of functions and a lot of Python packages, you might want to keep the imports in the global scope so that every function can use the same imports. In that case, you can use the stub.is_inside() function:

pandas_image = modal.Image.debian_slim().pip_install("pandas")

if stub.is_inside():
    import pandas as pd

@stub.function(image=pandas_image)
def my_function():
    df = pd.DataFrame()

Note that is_inside is considered beta and its interface may change in the future.

Shell commands

You can also supply shell commands that should be executed when building the container image. This can be useful for installing additional binary dependencies:

ffmpeg_image = modal.Image.debian_slim().apt_install("ffmpeg")

@stub.function(image=ffmpeg_image)
def process_video():
    import subprocess

    subprocess.run(["ffmpeg", ...])

Or for preloading custom assets into the container:

image_with_model = (
    modal.Image.debian_slim().run_commands(
        "apt-get update",
        "apt-get install -y curl",
        "curl -O https://raw.githubusercontent.com/opencv/opencv/master/data/haarcascades/haarcascade_frontalcatface.xml",
    )
)

@stub.function(image=image_with_model)
def find_cats():
    content = open("/haarcascade_frontalcatface.xml").read()

Using existing Docker Hub images

Docker Hub has many pre-built images for common software packages. You can use any such image as your function container using modal.Image.from_dockerhub as long as the image conforms to the following requirements:

  • Python 3.7 or above is present, and is available as python
  • pip is installed correctly
  • The image is built for the linux/amd64 platform

sklearn_image = modal.Image.from_dockerhub("huanjason/scikit-learn")

@stub.function(image=sklearn_image)
def fit_knn():
    from sklearn.neighbors import KNeighborsClassifier

If python or pip isn’t set up properly in your base image, you can use the setup_commands argument to run extra commands before the Modal package is installed:

image = modal.Image.from_dockerhub(
    "ubuntu",  # example: a base image without pip set up
    setup_commands=["apt-get update", "apt-get install -y python3-pip"],
)

Using Conda instead of pip

Modal provides a pre-built Conda base image if you would like to use conda for package management. The Python version available is whatever version the official miniconda3 image currently comes with (3.9.12 at this time).

pymc_image = modal.Image.conda().conda_install("theano-pymc==1.1.2", "pymc3==3.11.2")

@stub.function(image=pymc_image)
def fit():
    import pymc3 as pm

Using a Dockerfile

Modal also supports building an image from a Dockerfile, using the Image.from_dockerfile function, which takes a path to an existing Dockerfile. For instance, given a Dockerfile like this:

FROM python:3.9
RUN pip install sklearn

you can build an image from it and attach it to a function:

dockerfile_image = modal.Image.from_dockerfile("Dockerfile")

@stub.function(image=dockerfile_image)
def fit():
    import sklearn

Running a function as a build step (beta)

Instead of using shell commands, you can also run a Python function as an image build step using the Image.run_function method. For example, you can use this to download model parameters to your image:

def download_models():
    import os

    import diffusers

    model_id = "runwayml/stable-diffusion-v1-5"  # example model id
    pipe = diffusers.StableDiffusionPipeline.from_pretrained(
        model_id, use_auth_token=os.environ["HUGGINGFACE_TOKEN"]
    )

image = (
    modal.Image.debian_slim()
    .pip_install("diffusers[torch]", "transformers", "ftfy", "accelerate")
    .run_function(download_models, secrets=[modal.Secret.from_name("huggingface")])
)

Any kwargs accepted by @stub.function (such as Mounts, SharedVolumes, and resource requests) can be supplied to it. Essentially, this is equivalent to running a Modal function and snapshotting the resulting filesystem as an image.

Please see the reference documentation for an explanation of which changes to your build function trigger image rebuilds.