Dockerfile

See here for the main Dockerfile used to construct the image for running an OpenAI-compatible server with vLLM.

  • Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:

    • All build stages

    • The default build target (highlighted in grey)

    • External images (with dashed borders)

    The edges of the build graph represent the following dependencies (a minimal Dockerfile sketch illustrating each edge type follows the figure):

    • FROM … dependencies (with a solid line and a full arrow head)

    • COPY --from=… dependencies (with a dashed line and an empty arrow head)

    • RUN --mount=(.*)from=… dependencies (with a dotted line and an empty diamond arrow head)

    (Build graph of the vLLM Dockerfile)

    Made using: patrickhoefler/dockerfilegraph
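
    To make the three edge types concrete, here is a minimal multi-stage Dockerfile sketch (not the actual vLLM Dockerfile; stage and file names are hypothetical) that would produce one edge of each kind in the graph:

    # syntax=docker/dockerfile:1
    # Hypothetical example, not the vLLM Dockerfile.

    # External image (dashed border in the graph)
    FROM ubuntu:22.04 AS base

    # FROM ... dependency: solid line, full arrow head
    FROM base AS builder
    RUN echo "built artifact" > /artifact.txt

    # COPY --from=... dependency: dashed line, empty arrow head
    # The last stage is the default build target (highlighted in grey)
    FROM base AS runtime
    COPY --from=builder /artifact.txt /opt/artifact.txt

    # RUN --mount=...from=... dependency: dotted line, empty diamond arrow head
    RUN --mount=type=bind,from=builder,source=/artifact.txt,target=/mnt/artifact.txt \
        cat /mnt/artifact.txt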

    Commands to regenerate the build graph (make sure to run them from the root directory of the vLLM repository, where the Dockerfile is located):

    dockerfilegraph -o png --legend --dpi 200 --max-label-length 50 --filename Dockerfile

    Or, if you want to run it directly with the Docker image:

    docker run \
      --rm \
      --user "$(id -u):$(id -g)" \
      --workdir /workspace \
      --volume "$(pwd)":/workspace \
      ghcr.io/patrickhoefler/dockerfilegraph:alpine \
      --output png \
      --dpi 200 \
      --max-label-length 50 \
      --filename Dockerfile \
      --legend

    (To run it for a different file, pass a different argument to the --filename flag.)
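
    For example, to generate a graph for another Dockerfile in the repository (the path below is only illustrative):

    dockerfilegraph -o png --legend --dpi 200 --max-label-length 50 --filename docker/Dockerfile.cpu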