Commit e08f287

Merge pull request #1793 from alicevision/build/rockyDocker
[docker] Add Dockerfiles for Rocky 9

2 parents 7b7dac1 + bbf85f8

File tree

11 files changed (+259, -284 lines)


.github/workflows/continuous-integration.yml

Lines changed: 6 additions & 2 deletions

@@ -21,15 +21,19 @@ on:
 jobs:
   build-linux:
     runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        container: ["alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0", "alicevision/alicevision-deps:2024.12.09-rocky9-cuda12.1.0"]
     container:
-      image: alicevision/alicevision-deps:2024.11.25-ubuntu22.04-cuda12.1.0
+      image: ${{ matrix.container }}
     env:
       DEPS_INSTALL_DIR: /opt/AliceVision_install
       BUILD_TYPE: Release
       CTEST_OUTPUT_ON_FAILURE: 1
       ALICEVISION_ROOT: ${{ github.workspace }}/../AV_install
       ALICEVISION_SENSOR_DB: ${{ github.workspace }}/../AV_install/share/aliceVision/cameraSensors.db
       ALICEVISION_LENS_PROFILE_INFO: ""
+      BUILD_CCTAG: "${{ matrix.container == 'alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0' && 'ON' || 'OFF' }}"
     steps:
       - uses: actions/checkout@v1

@@ -53,7 +57,7 @@ jobs:
           -DALICEVISION_BUILD_SWIG_BINDING=ON \
           -DALICEVISION_USE_OPENCV=ON \
           -DALICEVISION_USE_CUDA=ON \
-          -DALICEVISION_USE_CCTAG=ON \
+          -DALICEVISION_USE_CCTAG="${BUILD_CCTAG}" \
           -DALICEVISION_USE_POPSIFT=ON \
           -DALICEVISION_USE_ALEMBIC=ON \
           -DOpenCV_DIR="${DEPS_INSTALL_DIR}/share/OpenCV" \
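The `BUILD_CCTAG` variable above uses the GitHub Actions `cond && 'ON' || 'OFF'` ternary idiom so CCTAG is enabled only for the Ubuntu matrix entry. A minimal shell sketch of the same selection logic (the `CONTAINER` value here is hard-coded to one of the two matrix entries for illustration):

```shell
#!/bin/sh
# Sketch: mirror the workflow's ternary -- CCTAG is ON only for the Ubuntu image.
CONTAINER="alicevision/alicevision-deps:2024.12.09-rocky9-cuda12.1.0"

if [ "${CONTAINER}" = "alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0" ]; then
    BUILD_CCTAG=ON
else
    BUILD_CCTAG=OFF
fi

echo "BUILD_CCTAG=${BUILD_CCTAG}"
```

With the Rocky container selected, this prints `BUILD_CCTAG=OFF`, matching the workflow's behavior of skipping CCTAG on Rocky builds.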

INSTALL.md

Lines changed: 8 additions & 8 deletions

@@ -384,33 +384,33 @@ Check the sample in [samples](src/samples/aliceVisionAs3rdParty) for an example

 ### Docker image

-A docker image can be built using the CentOS or Ubuntu Dockerfiles.
+A docker image can be built using the Ubuntu or Rocky Linux Dockerfiles.
 The Dockerfiles are based on `nvidia/cuda` images (https://hub.docker.com/r/nvidia/cuda/)

 To generate the docker image, just run:
 ```
-./docker/build-centos.sh
+./docker/build-rocky.sh
 ```

-To do it manually, parameters `OS_TAG` and `CUDA_TAG` should be passed to choose the OS and CUDA version.
-For example, the first line of below's commands shows the example to create docker for a CentOS 7 with Cuda 11.3.1 and second line for Ubuntu 16.04 with Cuda 11.0:
+To do it manually, the parameters `ROCKY_VERSION`/`UBUNTU_VERSION` and `CUDA_TAG` should be passed to choose the OS and CUDA versions.
+For example, the first of the commands below creates a Docker image for Rocky 9 with Cuda 12.1.0, and the second for Ubuntu 22.04 with Cuda 12.1.0:

 ```
-docker build --build-arg OS_TAG=7 --build-arg CUDA_TAG=11.3.1 --tag alicevision:centos7-cuda11.3.1 .
-docker build --build-arg OS_TAG=16.04 --build-arg CUDA_TAG=11.0 --build-arg NPROC=8 --tag alicevision:ubuntu16.04-cuda11.0 -f Dockerfile_ubuntu .
+docker build --build-arg ROCKY_VERSION=9 --build-arg CUDA_TAG=12.1.0 --tag alicevision:rocky9-cuda12.1.0 -f Dockerfile_rocky .
+docker build --build-arg UBUNTU_VERSION=22.04 --build-arg CUDA_TAG=12.1.0 --build-arg NPROC=8 --tag alicevision:ubuntu22.04-cuda12.1.0 -f Dockerfile_ubuntu .
 ```

 In order to run the image [nvidia docker](https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0)) is needed.

 ```
-docker run -it --runtime=nvidia alicevision:centos7-cuda9.2
+docker run -it --runtime=nvidia alicevision:rocky9-cuda12.1.0
 ```

 To retrieve the generated files:

 ```
 # Create an instance of the image, copy the files and remove the temporary docker instance.
-CID=$(docker create alicevision:centos7-cuda11.3.1) && docker cp ${CID}:/opt/AliceVision_install . && docker cp ${CID}:/opt/AliceVision_bundle . && docker rm ${CID}
+CID=$(docker create alicevision:rocky9-cuda12.1.0) && docker cp ${CID}:/opt/AliceVision_install . && docker cp ${CID}:/opt/AliceVision_bundle . && docker rm ${CID}
 ```

 Environment variable
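The manual `docker build` invocations above encode the OS and CUDA versions into the image tag. A small sketch of that naming convention, using the version values from this commit:

```shell
#!/bin/sh
# Sketch: the image tag naming convention used by the docker build commands above.
ROCKY_VERSION=9
UBUNTU_VERSION=22.04
CUDA_TAG=12.1.0

ROCKY_IMAGE="alicevision:rocky${ROCKY_VERSION}-cuda${CUDA_TAG}"
UBUNTU_IMAGE="alicevision:ubuntu${UBUNTU_VERSION}-cuda${CUDA_TAG}"

echo "${ROCKY_IMAGE}"
echo "${UBUNTU_IMAGE}"
```

The resulting tags (`alicevision:rocky9-cuda12.1.0`, `alicevision:ubuntu22.04-cuda12.1.0`) are the same ones passed later to `docker run` and `docker create`.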

docker/Dockerfile_centos

Lines changed: 0 additions & 49 deletions
This file was deleted.

docker/Dockerfile_centos_deps

Lines changed: 0 additions & 149 deletions
This file was deleted.

docker/Dockerfile_rocky

Lines changed: 71 additions & 0 deletions

@@ -0,0 +1,71 @@
+ARG AV_DEPS_VERSION
+ARG AV_VERSION
+ARG CUDA_VERSION
+ARG ROCKY_VERSION
+FROM alicevision/alicevision-deps:${AV_DEPS_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION}
+LABEL maintainer="AliceVision Team [email protected]"
+ARG TARGET_ARCHITECTURE=core
+
+# use CUDA_VERSION to select the image version to use
+# see https://hub.docker.com/r/nvidia/cuda/
+#
+# AV_VERSION=2.2.8
+# CUDA_VERSION=11.0
+# ROCKY_VERSION=9
+# docker build \
+#     --build-arg CUDA_VERSION=${CUDA_VERSION} \
+#     --build-arg ROCKY_VERSION=${ROCKY_VERSION} \
+#     --build-arg AV_VERSION=2.2.8.develop \
+#     --tag alicevision/alicevision:${AV_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION} \
+#     -f Dockerfile_rocky .
+#
+# then execute with nvidia docker (https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0))
+# docker run -it --runtime=nvidia alicevision/alicevision:${AV_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION}
+
+# OS/Version (FILE): cat /etc/issue.net
+# Cuda version (ENV): $CUDA_VERSION
+
+ENV AV_DEV=/opt/AliceVision_git \
+    AV_BUILD=/tmp/AliceVision_build \
+    AV_INSTALL=/opt/AliceVision_install \
+    AV_BUNDLE=/opt/AliceVision_bundle \
+    PATH="${PATH}:${AV_BUNDLE}" \
+    VERBOSE=1
+
+COPY CMakeLists.txt *.md ${AV_DEV}/
+COPY src ${AV_DEV}/src
+
+WORKDIR "${AV_BUILD}"
+
+COPY docker ${AV_DEV}/docker
+
+RUN export CPU_CORES=`${AV_DEV}/docker/check-cpu.sh`
+
+RUN cmake -DCMAKE_BUILD_TYPE=Release \
+    -DBUILD_SHARED_LIBS:BOOL=ON \
+    -DTARGET_ARCHITECTURE=${TARGET_ARCHITECTURE} \
+    -DALICEVISION_BUILD_DEPENDENCIES:BOOL=OFF \
+    -DCMAKE_PREFIX_PATH:PATH="${AV_INSTALL}" \
+    -DCMAKE_INSTALL_PREFIX:PATH="${AV_INSTALL}" \
+    -DALICEVISION_BUNDLE_PREFIX="${AV_BUNDLE}" \
+    -DALICEVISION_USE_ALEMBIC:BOOL=ON \
+    -DMINIGLOG:BOOL=ON \
+    -DALICEVISION_USE_CCTAG:BOOL=OFF \
+    -DALICEVISION_USE_OPENCV:BOOL=ON \
+    -DALICEVISION_USE_OPENGV:BOOL=ON \
+    -DALICEVISION_USE_POPSIFT:BOOL=ON \
+    -DALICEVISION_USE_CUDA:BOOL=ON \
+    -DALICEVISION_USE_ONNX_GPU:BOOL=OFF \
+    -DALICEVISION_BUILD_DOC:BOOL=OFF \
+    -DALICEVISION_BUILD_SWIG_BINDING:BOOL=ON \
+    -DSWIG_DIR:PATH="${AV_INSTALL}/share/swig/4.3.0" -DSWIG_EXECUTABLE:PATH="${AV_INSTALL}/bin-deps/swig" \
+    "${AV_DEV}"
+
+RUN make install -j${CPU_CORES}
+
+RUN make bundle
+
+RUN rm -rf "${AV_BUILD}" "${AV_DEV}" && \
+    echo "export ALICEVISION_SENSOR_DB=${AV_BUNDLE}/share/aliceVision/cameraSensors.db" >> /etc/profile.d/alicevision.sh && \
+    echo "export ALICEVISION_ROOT=${AV_BUNDLE}" >> /etc/profile.d/alicevision.sh