Build fails with "failed to prepare sha256" error #2883
Contributing guidelines
- I've read the contributing guidelines and wholeheartedly agree
I've found a bug and checked that ...
- ... the documentation does not mention anything about my problem
- ... there are no open or closed issues that are related to my problem
Description
ERROR: failed to solve: failed to prepare sha256:6ef31b9aac55f699e551706c154f1b66955d5e4379da9e6ffc45d5163cde3777 as xyx2mcarp3p5pksqoa7y6rv90: open /var/lib/docker/overlay2/2a1d88ce0a17b395ae852bb3c22bc4821b23f90f01c2caacc0503cc783a73fc9/.tmp-committed4135229564: no such file or directory
I am getting this error repeatedly when trying to build an Alpine-based image on an Ubuntu-based Jenkins server.
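For reference, the build is invoked from Jenkins roughly as sketched below (reconstructed from the logs further down; the registry, repository, and tag are placeholders, and the exact Jenkinsfile step is not shown in this report):

```sh
# Hypothetical invocation; <account>/<region> and the image tag are placeholders.
docker build \
  --build-arg CACHEBUSTER="$(date +%s)" \
  -t <account>.dkr.ecr.<region>.amazonaws.com/mantis:staging \
  .
```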
Expected behaviour
The image builds without any errors.
Actual behaviour
The build fails with the above-mentioned error.
Buildx version
github.com/docker/buildx v0.19.3 48d6a39
Docker info
ubuntu@ip-172-31-75-146:~$ docker info
Client: Docker Engine - Community
 Version:    27.4.1
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.19.3
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.32.1
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 5
  Running: 5
  Paused: 0
  Stopped: 0
 Images: 6
 Server Version: 27.4.1
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 Swarm: inactive
 Runtimes: runc io.containerd.runc.v2
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 88bf19b2105c8b17560993bee28a01ddc2f97182
 runc version: v1.2.2-0-g7cb3632
 init version: de40ad0
 Security Options:
  apparmor
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.8.0-1021-aws
 Operating System: Ubuntu 22.04.5 LTS
 OSType: linux
 Architecture: aarch64
 CPUs: 16
 Total Memory: 30.75GiB
 Name: ip-172-31-75-146
 ID: 1b28ad27-2811-410d-8e4b-db85893d2f73
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
Builders list
NAME/NODE                   DRIVER/ENDPOINT                  STATUS     BUILDKIT   PLATFORMS
multiarch*                  docker-container
 \_ multiarch0               \_ unix:///var/run/docker.sock  inactive
multiplatformbuilder        docker-container
 \_ multiplatformbuilder0    \_ unix:///var/run/docker.sock  inactive
default                     docker
 \_ default                  \_ default                      running    v0.17.3    linux/amd64, linux/arm64, linux/arm (+2), linux/ppc64le, (4 more)
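Note that the failing build runs on the default docker driver (BuildKit v0.17.3, see "#0 building with "default" instance using docker driver" in the logs), while both docker-container builders are inactive. An untested workaround sketch would be to bootstrap one of the container builders and build with it instead, since it keeps its BuildKit state inside the builder container rather than in the daemon's overlay2 directory (the image tag below is a placeholder):

```sh
# Inspect the builder the failing build currently uses.
docker buildx inspect default

# Untested workaround: switch to a docker-container builder and build there.
docker buildx use multiarch
docker buildx inspect --bootstrap
docker buildx build --load -t mantis:staging .
```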
Configuration
FROM jar-docker-images:alpine-node-arm64
# Add a cache buster argument
ARG CACHEBUSTER=1
# Install build and runtime dependencies
RUN apk add \
    bash \
    g++ \
    make \
    python3 \
    git \
    chromium \
    nss \
    freetype \
    harfbuzz \
    ca-certificates \
    ttf-freefont \
    lz4 \
    cyrus-sasl \
    openssl \
    nodejs \
    npm \
    typescript \
    && echo "Cache buster: $CACHEBUSTER"
# Clear NPM cache to avoid potential issues
RUN npm cache clean --force
# Install Puppeteer with bundled Chromium for ARM
RUN npm install puppeteer --unsafe-perm=true --loglevel=verbose || { \
        echo "Retrying Puppeteer install..."; \
        npm cache clean --force && npm install puppeteer --unsafe-perm=true; \
    }
# Set Puppeteer Chromium executable path
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=false
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser
# Set working directory
WORKDIR /usr/src/app
# Copy application files
COPY . .
# Install application dependencies
RUN npm install
# Compile TypeScript project
RUN tsc -p tsconfig.staging.json
# Expose application port
EXPOSE 5001
# Start the application in staging mode
CMD ["npm", "run", "start:staging"]
Build logs
creating docker build and pushing to the ECR
deployment info nodejs : mantis : staging
docker build for nodejs
#0 building with "default" instance using docker driver
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.32kB done
#1 DONE 0.0s
#2 [auth] sharing credentials for
#2 DONE 0.0s
#3 [internal] load metadata for /jar-docker-images:alpine-node-arm64
#3 DONE 0.1s
#4 [internal] load .dockerignore
#4 transferring context: 2B done
#4 DONE 0.0s
#5 [1/8] FROM /jar-docker-images:alpine-node-arm64@sha256:28bbfaf93681a35bfae4fe54098986f41e77998a080fa347c1e2b7f79a6a8c70
#5 CACHED
#6 [2/8] RUN apk add bash g++ make python3 git chromium nss freetype harfbuzz ca-certificates ttf-freefont lz4 cyrus-sasl openssl nodejs npm typescript && echo "Cache buster: 1"
#6 ERROR: failed to prepare sha256:6ef31b9aac55f699e551706c154f1b66955d5e4379da9e6ffc45d5163cde3777 as xyx2mcarp3p5pksqoa7y6rv90: open /var/lib/docker/overlay2/2a1d88ce0a17b395ae852bb3c22bc4821b23f90f01c2caacc0503cc783a73fc9/.tmp-committed4135229564: no such file or directory
#7 [internal] load build context
#7 transferring context: done
#7 CANCELED
------
> [2/8] RUN apk add bash g++ make python3 git chromium nss freetype harfbuzz ca-certificates ttf-freefont lz4 cyrus-sasl openssl nodejs npm typescript && echo "Cache buster: 1":
------
Dockerfile:7
--------------------
6 | # Install build and runtime dependencies
7 | >>> RUN apk add \
8 | >>> bash \
9 | >>> g++ \
10 | >>> make \
11 | >>> python3 \
12 | >>> git \
13 | >>> chromium \
14 | >>> nss \
15 | >>> freetype \
16 | >>> harfbuzz \
17 | >>> ca-certificates \
18 | >>> ttf-freefont \
19 | >>> lz4 \
20 | >>> cyrus-sasl \
21 | >>> openssl \
22 | >>> nodejs \
23 | >>> npm \
24 | >>> typescript \
25 | >>> && echo "Cache buster: $CACHEBUSTER"
26 |
--------------------
ERROR: failed to solve: failed to prepare sha256:6ef31b9aac55f699e551706c154f1b66955d5e4379da9e6ffc45d5163cde3777 as xyx2mcarp3p5pksqoa7y6rv90: open /var/lib/docker/overlay2/2a1d88ce0a17b395ae852bb3c22bc4821b23f90f01c2caacc0503cc783a73fc9/.tmp-committed4135229564: no such file or directory
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] cleanWs
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Deferred wipeout is used...
[WS-CLEANUP] done
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
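For completeness, the cleanup steps commonly suggested when the overlay2 snapshotter reports a missing .tmp-committed* file are sketched below; whether they resolve this particular failure is unverified, and they discard the build cache, so subsequent builds start cold:

```sh
# Untested mitigation: drop BuildKit's build cache and restart the daemon
# so stale overlay2 temp state is recreated.
docker builder prune --all --force
sudo systemctl restart docker
```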
Additional info
No response