'"##[error]Error response from daemon: failed to reach build target <stage> in Dockerfile" only during CI pipeline
I'm getting this error in my PR pipeline and I'm not sure what the cause or the solution is.
The Docker task is pretty well templated, and the target stage does exist in my Dockerfile:
# docker.yaml
parameters:
- name: service
  default: ''
- name: jobName
  default: ''
- name: jobDisplayName
  default: ''
- name: taskDisplayName
  default: ''
- name: dockerCommand
  default: ''
- name: target
  default: ''
- name: tag
  default: ''
jobs:
- job: ${{ parameters.jobName }}
  displayName: ${{ parameters.jobDisplayName }}
  # Handle whether to run service or not
  variables:
    servicesChanged: $[ stageDependencies.Changed.Changes.outputs['detectChanges.servicesChanged'] ]
  condition: or(contains(variables['servicesChanged'], '${{ parameters.service }}'), eq(variables['Build.Reason'], 'Manual'))
  steps:
  # Set to app repo
  - checkout: app
  # Run the Docker task
  - task: Docker@2
    # Run if there have been changes
    displayName: ${{ parameters.taskDisplayName }}
    inputs:
      command: ${{ parameters.dockerCommand }}
      repository: $(imageRepository)-${{ parameters.service }}
      dockerfile: $(dockerFilePath)/${{ parameters.service }}/docker/Dockerfile
      buildContext: $(dockerFilePath)/${{ parameters.service }}
      containerRegistry: $(dockerRegistryServiceConnection)
      arguments: --target ${{ parameters.target }}
      tags: |
        ${{ parameters.tag }}-$(Build.BuildNumber)
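For context, a jobs template like this is consumed from the main pipeline roughly as sketched below; the stage name, file name, and parameter values are illustrative assumptions rather than the actual pipeline, except for target: unit-tests, which is the part that matters here:
# azure-pipelines.yml (illustrative sketch only)
stages:
- stage: Tests
  dependsOn: Changed   # the stage whose detectChanges output feeds servicesChanged
  jobs:
  - template: docker.yaml
    parameters:
      service: api
      jobName: api_unit_tests
      jobDisplayName: API unit tests
      taskDisplayName: Build api unit-tests image
      dockerCommand: build
      target: unit-tests
      tag: ut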
# Dockerfile
# syntax=docker/dockerfile:1
# creating a python base with shared environment variables
FROM python:3.8-slim as python-base
ENV PYTHONUNBUFFERED=1 \
PYTHONDONTWRITEBYTECODE=1 \
PIP_NO_CACHE_DIR=off \
PIP_DISABLE_PIP_VERSION_CHECK=on \
PIP_DEFAULT_TIMEOUT=100 \
POETRY_HOME="/opt/poetry" \
POETRY_VIRTUALENVS_IN_PROJECT=true \
POETRY_NO_INTERACTION=1 \
PYSETUP_PATH="/opt/pysetup" \
VENV_PATH="/opt/pysetup/.venv"
ENV PATH="$POETRY_HOME/bin:$VENV_PATH/bin:$PATH"
# builder-base is used to build dependencies
FROM python-base as builder-base
RUN apt-get update \
&& apt-get install --no-install-recommends -y \
curl \
build-essential
# Install Poetry - respects $POETRY_VERSION & $POETRY_HOME
ENV POETRY_VERSION=1.1.8
RUN curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python
# We copy our Python requirements here to cache them
# and install only runtime deps using poetry
WORKDIR $PYSETUP_PATH
COPY ./poetry.lock ./pyproject.toml ./
RUN poetry install --no-dev
# 'development' stage installs all dev deps and can be used to develop code.
# For example using docker-compose to mount local volume under /app
FROM python-base as development
# Copying poetry and venv into image
COPY --from=builder-base $POETRY_HOME $POETRY_HOME
COPY --from=builder-base $PYSETUP_PATH $PYSETUP_PATH
# Copying in our entrypoint
# COPY ./docker/docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x . /opt/pysetup/.venv/bin/activate
# venv already has runtime deps installed, so we get a quicker install
WORKDIR $PYSETUP_PATH
RUN poetry install
WORKDIR /app
COPY . .
EXPOSE 5000
CMD [ "python", "src/manage.py", "runserver", "0.0.0.0:5000"]
# 'unit-tests' stage runs our unit tests with unittest and coverage.
FROM development AS unit-tests
RUN coverage run --omit='src/manage.py,src/config/*,*/.venv/*,*/*__init__.py,*/tests.py,*/admin.py' src/manage.py test src --tag=ut && \
coverage report
The Dockerfile is being found correctly, and it looks like the pipeline starts to build the image but then throws this error. I can run a docker build ... --target unit-tests locally without issue, so it seems isolated to Azure Pipelines.
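For reference, the local build that works is roughly the following; the service name, tag, and working directory are illustrative, filled in from the pipeline inputs and the project structure in the edit below:
# run from the repository root; "api" stands in for whichever service is built
docker build \
  --target unit-tests \
  -f api/docker/Dockerfile \
  -t myrepo-api:ut-local \
  api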
Suggestions for what could be causing this?
EDIT: This is the project structure:
app/
  admin/
    docker/
      Dockerfile
      entrypoint.sh
    src/
    ...
  api/
    docker/
      Dockerfile
      entrypoint.sh
    src/
    ...
  portal/
    docker/
      Dockerfile
      entrypoint.sh
    src/
    ...
This is a portion of the devspace.yaml:
admin-ut:
  image: ${APP-NAME}/${ADMIN-UT}
  dockerfile: ${ADMIN}/docker/Dockerfile
  context: ${ADMIN}/
  build:
    buildKit:
      args: []
      options:
        target: unit-tests
EDIT2:
Maybe the issue is related to not having BuildKit enabled, per this question:
There is also a GitHub issue that is related:
https://github.com/MicrosoftDocs/azure-devops-docs/issues/9196#issuecomment-761624398
So I've modified my docker.yaml for Azure Pipelines to:
- task: Docker@2
  # Run if there have been changes
  displayName: ${{ parameters.taskDisplayName }}
  inputs:
    command: ${{ parameters.dockerCommand }}
    repository: $(imageRepository)-${{ parameters.service }}
    dockerfile: $(dockerFilePath)/${{ parameters.service }}/docker/Dockerfile
    buildContext: $(dockerFilePath)/${{ parameters.service }}
    containerRegistry: $(dockerRegistryServiceConnection)
    arguments: --target ${{ parameters.target }}
    tags: |
      ${{ parameters.tag }}-$(Build.BuildNumber)
  env:
    DOCKER_BUILDKIT: 1
Now I get a more verbose error output:
failed to solve with frontend dockerfile.v0: failed to create LLB definition: target stage unit-tests could not be found
##[error]#1 [internal] load build definition from Dockerfile
##[error]#1 sha256:acc1b908d881e469d44e7f005ceae0820d5ee08ada351a0aa2a7b8e749c8f6fe
##[error]#1 transferring dockerfile: 974B done
##[error]#1 DONE 0.0s
##[error]#2 [internal] load .dockerignore
##[error]#2 sha256:189c0a02bba84ed5c5f9ea82593d0e664746767c105d65afdf3cd0771eeb378e
##[error]#2 transferring context: 346B done
##[error]#2 DONE 0.0s
##[error]failed to solve with frontend dockerfile.v0: failed to create LLB definition: target stage unit-tests could not be found
##[error]The process '/usr/bin/docker' failed with exit code 1
Solution 1:[1]
Ok, I think I have it sorted out now and the pipeline stages are running successfully. It was a combination of adding DOCKER_BUILDKIT: 1 to the task's environment, like:
- task: Docker@2
  # Run if there have been changes
  displayName: ${{ parameters.taskDisplayName }}
  inputs:
    command: ${{ parameters.dockerCommand }}
    repository: $(imageRepository)-${{ parameters.service }}
    dockerfile: $(dockerFilePath)/${{ parameters.service }}/docker/Dockerfile
    buildContext: $(dockerFilePath)/${{ parameters.service }}
    containerRegistry: $(dockerRegistryServiceConnection)
    arguments: --target ${{ parameters.target }}
    tags: |
      ${{ parameters.tag }}-$(Build.BuildNumber)
  env:
    DOCKER_BUILDKIT: 1
And then removing # syntax=docker/dockerfile:1 from each Dockerfile. The local environment still works after making this modification to the Dockerfiles as well.
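For clarity, the only edit to each Dockerfile was dropping the parser directive on the first line; the stages themselves are unchanged:
# Before (first lines)
# syntax=docker/dockerfile:1
FROM python:3.8-slim as python-base

# After (first lines)
FROM python:3.8-slim as python-base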
Solution 2:[2]
I had this same error a few minutes ago. My client service was like this before:
services:
  client:
    container_name: client
    image: client
    build: ./client
    ports:
      - '3000:3000'
    volumes:
      - ./client:/app
      - ./app/node_modules
I moved build: ./client to the beginning of the service and it worked for me.
Afterwards, it looked like this:
services:
  client:
    build: ./client
    container_name: client
    image: client
    ports:
      - '3000:3000'
    volumes:
      - ./client:/app
      - ./app/node_modules
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | cjones |
| Solution 2 | Cyebukayire |