docker image overwrite #572

Open · wants to merge 2 commits into main
Changes from 1 commit
2 changes: 1 addition & 1 deletion libs/infinity_emb/Dockerfile.amd_auto
@@ -88,7 +88,7 @@ else \
fi

# TODO: remove this line
-RUN apt-get install --no-install-recommends -y git && poetry run python -m pip install git+https://github.com/huggingface/transformers.git@7547f55e5d93245c0a013b50df976924f2d9e8b0 && rm -rf ~/.cache/ /tmp/*
+RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9

FROM builder AS testing
# install lint and test dependencies
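The PR makes the same swap in every Dockerfile: the commit-pinned git install of transformers is replaced with released pins. A quick sanity check of those pins could look like the sketch below; it is not part of the PR, and it assumes the builder stage above plus colpali-engine importing as colpali_engine:

# Hypothetical check (not in the PR): fail the build if the pins did not resolve as expected.
RUN poetry run python -c "import transformers, colpali_engine; assert transformers.__version__ == '4.50.3', transformers.__version__"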
2 changes: 1 addition & 1 deletion libs/infinity_emb/Dockerfile.cpu_auto
@@ -55,7 +55,7 @@ RUN poetry run python -m pip install --no-cache-dir onnxruntime-openvino

#
# TODO: remove this line
-RUN apt-get install --no-install-recommends -y git && poetry run python -m pip install git+https://github.com/huggingface/transformers.git@7547f55e5d93245c0a013b50df976924f2d9e8b0 && rm -rf ~/.cache/ /tmp/*
+RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9
Comment on lines 57 to +58 (Contributor):

style: TODO comment indicates this line should be removed, but it's being modified instead. Consider removing the line entirely or removing the TODO if the line is now permanent.
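Concretely, if the pinned install is meant to be permanent, the hunk could keep just the install line and drop the stale marker (same content as the PR, minus the TODO):

RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9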


FROM builder AS testing
# install lint and test dependencies
2 changes: 1 addition & 1 deletion libs/infinity_emb/Dockerfile.jinja2
@@ -45,7 +45,7 @@ COPY infinity_emb infinity_emb
{{main_install|replace("--no-root","")}}
{{extra_installs_main | default('#')}}
# TODO: remove this line
-RUN apt-get install --no-install-recommends -y git && poetry run python -m pip install git+https://github.com/huggingface/transformers.git@7547f55e5d93245c0a013b50df976924f2d9e8b0 && rm -rf ~/.cache/ /tmp/*
+RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9
Comment on lines 47 to +48 (Contributor):

style: consider using poetry to manage these dependencies instead of a direct pip install to maintain consistent dependency management.
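One way to follow that suggestion, as a sketch rather than part of the PR (it assumes the pins are acceptable as regular project dependencies):

# Sketch: record the pins through poetry so pyproject.toml and poetry.lock stay the source of truth.
RUN poetry add transformers==4.50.3 colpali-engine==0.3.9 && rm -rf ~/.cache/ /tmp/*

Declaring the versions in pyproject.toml and letting the existing poetry install step resolve them would achieve the same result without an extra layer.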


FROM builder AS testing
# install lint and test dependencies
2 changes: 1 addition & 1 deletion libs/infinity_emb/Dockerfile.nvidia_auto
@@ -47,7 +47,7 @@ RUN poetry install --no-interaction --no-ansi --extras "${EXTRAS}" --without li
RUN poetry run $PYTHON -m pip install --no-cache-dir https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
Comment (Contributor):

logic: flash-attention wheel is pinned to Python 3.10 and CUDA 12, but there's no explicit Python version check

Suggested change
-RUN poetry run $PYTHON -m pip install --no-cache-dir https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+RUN case "$($PYTHON --version 2>&1)" in "Python 3.10"*) : ;; *) echo "Error: Python 3.10 is required for flash-attention wheel" && exit 1 ;; esac && \
+    poetry run $PYTHON -m pip install --no-cache-dir https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
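For context on the check above: unless the image overrides SHELL, Docker runs RUN commands with /bin/sh -c, where [ ... ] does no pattern matching, hence the case statement. An alternative sketch (not part of the PR) checks the interpreter directly, mirroring the cp310 tag in the wheel name:

# Alternative sketch: fail the build early if the interpreter is not Python 3.10.
RUN poetry run $PYTHON -c "import sys; assert sys.version_info[:2] == (3, 10), sys.version"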


# TODO: remove this line
-RUN apt-get install --no-install-recommends -y git && poetry run python -m pip install git+https://github.com/huggingface/transformers.git@7547f55e5d93245c0a013b50df976924f2d9e8b0 && rm -rf ~/.cache/ /tmp/*
+RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9

FROM builder AS testing
# install lint and test dependencies
2 changes: 1 addition & 1 deletion libs/infinity_emb/Dockerfile.trt_onnx_auto
@@ -52,7 +52,7 @@ RUN poetry run $PYTHON -m pip install --no-cache-dir flash-attn --no-build-isola
RUN poetry run $PYTHON -m pip install --no-cache-dir "tensorrt==10.3.0" "tensorrt_lean==10.3.0" "tensorrt_dispatch==10.3.0"

# TODO: remove this line
-RUN apt-get install --no-install-recommends -y git && poetry run python -m pip install git+https://github.com/huggingface/transformers.git@7547f55e5d93245c0a013b50df976924f2d9e8b0 && rm -rf ~/.cache/ /tmp/*
+RUN poetry run pip install --no-cache-dir transformers==4.50.3 colpali-engine==0.3.9

FROM builder AS testing
# install lint and test dependencies