Commit e40fcbc

Merge pull request #974 from netfs/r1.8
Docker fixes cherrypicked from master
2 parents: ad626e1 + cf0dfdf

File tree: 5 files changed (+242, -67 lines)
tensorflow_serving/g3doc/docker.md

Lines changed: 83 additions & 17 deletions

@@ -9,43 +9,109 @@ General installation instructions are
 [on the Docker site](https://docs.docker.com/installation/), but we give some
 quick links here:
 
-* [OSX](https://docs.docker.com/installation/mac/): [docker toolbox](https://www.docker.com/toolbox)
-* [ubuntu](https://docs.docker.com/installation/ubuntulinux/)
+* [OSX](https://docs.docker.com/installation/mac/): [docker
+  toolbox](https://www.docker.com/toolbox)
+* [Ubuntu](https://docs.docker.com/installation/ubuntulinux/)
 
 ## Which containers exist?
 
 We currently maintain the following Dockerfiles:
 
-* [`Dockerfile.devel`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel),
-  which is a minimal VM with all of the dependencies needed to build TensorFlow
-  Serving.
+* [`Dockerfile`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile),
+  which is a minimal VM with TensorFlow Serving installed.
+
+* [`Dockerfile.devel`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel),
+  which is a minimal VM with all of the dependencies needed to build
+  TensorFlow Serving.
+
+* [`Dockerfile.devel-gpu`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel),
+  which is a minimal VM with all of the dependencies needed to build
+  TensorFlow Serving with GPU support.
 
 ## Building a container
-run;
+
+`Dockerfile`:
+
+```shell
+docker build --pull -t $USER/tensorflow-serving .
+```
+
+`Dockerfile.devel`:
 
 ```shell
 docker build --pull -t $USER/tensorflow-serving-devel -f Dockerfile.devel .
 ```
 
+`Dockerfile.devel-gpu`:
+
+```shell
+docker build --pull -t $USER/tensorflow-serving-devel-gpu -f Dockerfile.devel-gpu .
+```
+
 ## Running a container
-This assumes you have built the container.
 
-`Dockerfile.devel`: Use the development container to clone and test the
-TensorFlow Serving repository.
+### Serving Example
 
-Run the container;
+This assumes you have built the `Dockerfile` container.
+
+First you will need a model. Clone the TensorFlow Serving repo.
 
 ```shell
-docker run -it $USER/tensorflow-serving-devel
+mkdir -p /tmp/tfserving
+cd /tmp/tfserving
+git clone --recursive https://github.com/tensorflow/serving
 ```
 
-Clone, configure and test Tensorflow Serving in the running container;
+We will use a toy model called Half Plus Three, which will predict values 0.5\*x
++ 3 for the values we provide for prediction.
+
+```shell
+docker run -p 8501:8501 \
+  -v /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_three:/models/half_plus_three \
+  -e MODEL_NAME=half_plus_three -t $USER/tensorflow-serving &
+```
+
+This will run the docker container and launch the TensorFlow Serving Model
+Server, bind the REST API port 8501, and map our desired model from our host to
+where models are expected in the container. We also pass the name of the model
+as an environment variable, which will be important when we query the model.
+
+To query the model using the predict API, you can run
+
+```shell
+curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/half_plus_three:predict
+```
+
+This should return a set of values:
+
+```json
+{ "predictions": [3.5, 4.0, 5.5] }
+```
+
+More information on using the RESTful API can be found [here](api_rest.md).
+
+### Development Example
+
+This assumes you have built the `Dockerfile.devel` container.
+
+To run the container:
 
 ```shell
-git clone --recurse-submodules https://github.com/tensorflow/serving
-cd serving/tensorflow
-./configure
-cd ..
-bazel test tensorflow_serving/...
+docker run -it -p 8500:8500 $USER/tensorflow-serving-devel
 ```
 
+This will pull the latest TensorFlow Serving release with a prebuilt binary and
+working development environment. To test a model, from inside the container try:
+
+```shell
+bazel build -c opt //tensorflow_serving/example:mnist_saved_model
+# train the mnist model
+bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
+# serve the model
+tensorflow_model_server --port=8500 --model_name=mnist --model_base_path=/tmp/mnist_model/ &
+# build the client
+bazel build -c opt //tensorflow_serving/example:mnist_client
+# test the client
+bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:8500
+```
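The Half Plus Three responses can be sanity-checked without a running server, since the model simply computes 0.5\*x + 3. A minimal sketch of the same arithmetic in plain shell (awk stands in for the model here; this is an editor's illustration, not part of the commit):

```shell
# The half_plus_three model computes 0.5*x + 3, so for the instances
# 1.0, 2.0, 5.0 the predictions should be 3.5, 4.0, 5.5.
for x in 1.0 2.0 5.0; do
  awk -v x="$x" 'BEGIN { printf "0.5*%s + 3 = %g\n", x, 0.5 * x + 3 }'
done
```

The values printed should match the `predictions` array returned by the REST call.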
tensorflow_serving/tools/docker/Dockerfile

Lines changed: 48 additions & 0 deletions

@@ -0,0 +1,48 @@
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+FROM ubuntu:16.04
+
+LABEL maintainer="[email protected]"
+
+RUN apt-get update && apt-get install -y \
+    curl \
+    gnupg
+
+# Install TF Serving pkg.
+ARG TF_SERVING_VERSION=1.8.0
+# Use tensorflow-model-server-universal for older hardware
+ARG TF_SERVING_PKGNAME=tensorflow-model-server
+RUN curl -LO https://storage.googleapis.com/tensorflow-serving-apt/pool/${TF_SERVING_PKGNAME}-${TF_SERVING_VERSION}/t/${TF_SERVING_PKGNAME}/${TF_SERVING_PKGNAME}_${TF_SERVING_VERSION}_all.deb ; \
+    dpkg -i ${TF_SERVING_PKGNAME}_${TF_SERVING_VERSION}_all.deb ; \
+    rm ${TF_SERVING_PKGNAME}_${TF_SERVING_VERSION}_all.deb
+
+# Cleanup to reduce the size of the image
+# See https://docs.docker.com/develop/develop-images/dockerfile_best-practices/#run
+RUN apt-get clean && \
+    rm -rf /var/lib/apt/lists/*
+
+# Expose ports
+# gRPC
+EXPOSE 8500
+# REST
+EXPOSE 8501
+
+# Set where models should be stored in the container
+ENV MODEL_BASE_PATH=/models
+RUN mkdir -p ${MODEL_BASE_PATH}
+
+# The only required piece is the model name in order to differentiate endpoints
+ENV MODEL_NAME=model
+ENTRYPOINT tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME}
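The serving image's ENTRYPOINT is assembled entirely from the two `ENV` variables, which is why a `docker run -e MODEL_NAME=...` override is enough to repoint the server. A sketch of that expansion outside the container (the `half_plus_three` value is just an example override, not a Dockerfile default):

```shell
# How the Dockerfile's ENV defaults and a `docker run -e` override combine
# into the final server command line.
MODEL_BASE_PATH=/models          # ENV default from the Dockerfile
MODEL_NAME=half_plus_three       # example override via `docker run -e MODEL_NAME=...`
echo "tensorflow_model_server --port=8500 --rest_api_port=8501" \
     "--model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME}"
```

With these values the server would look for the model under `/models/half_plus_three` inside the container, which is exactly where the serving example's `-v` bind mount places it.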

tensorflow_serving/tools/docker/Dockerfile.devel

Lines changed: 41 additions & 6 deletions

@@ -1,6 +1,19 @@
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 FROM ubuntu:16.04
 
-MAINTAINER Jeremiah Harmsen <jeremiah@google.com>
+LABEL maintainer=gvasudevan@google.com
 
 RUN apt-get update && apt-get install -y \
     automake \
@@ -34,17 +47,39 @@ RUN pip install mock grpcio
 
 # Set up Bazel.
 
-ENV BAZELRC /root/.bazelrc
-# Install the most recent bazel release.
-ENV BAZEL_VERSION 0.10.0
+# Running bazel inside a `docker build` command causes trouble, cf:
+# https://github.com/bazelbuild/bazel/issues/134
+# The easiest solution is to set up a bazelrc file forcing --batch.
+RUN echo "startup --batch" >>/etc/bazel.bazelrc
+# Similarly, we need to workaround sandboxing issues:
+# https://github.com/bazelbuild/bazel/issues/418
+RUN echo "build --spawn_strategy=standalone --genrule_strategy=standalone" \
+    >>/etc/bazel.bazelrc
+# Install a recent bazel release.
+ENV BAZEL_VERSION 0.11.1
 WORKDIR /
 RUN mkdir /bazel && \
     cd /bazel && \
-    curl -fSsL -O https://github.com/bazelbuild/bazel/releases/download/$BAZEL_VERSION/bazel-$BAZEL_VERSION-installer-linux-x86_64.sh && \
-    curl -fSsL -o /bazel/LICENSE.txt https://raw.githubusercontent.com/bazelbuild/bazel/master/LICENSE && \
+    curl -H "User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36" -fSsL -O https://github.com/bazelbuild/bazel/releases/download/$BAZEL_VERSION/bazel-$BAZEL_VERSION-installer-linux-x86_64.sh && \
+    curl -H "User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.133 Safari/537.36" -fSsL -o /bazel/LICENSE.txt https://raw.githubusercontent.com/bazelbuild/bazel/master/LICENSE && \
     chmod +x bazel-*.sh && \
     ./bazel-$BAZEL_VERSION-installer-linux-x86_64.sh && \
     cd / && \
     rm -f /bazel/bazel-$BAZEL_VERSION-installer-linux-x86_64.sh
 
+# Download, build, and install TensorFlow Serving
+ARG TF_SERVING_VERSION_GIT_BRANCH=1.8.0
+WORKDIR /tensorflow-serving
+RUN git clone --depth=1 --branch=${TF_SERVING_VERSION_GIT_BRANCH} \
+    https://github.com/tensorflow/serving .
+
+ARG TF_SERVING_BUILD_OPTIONS="--copt=-mavx --cxxopt=-D_GLIBCXX_USE_CXX11_ABI=0 --verbose_failures"
+RUN bazel build -c opt --color=yes --curses=yes ${TF_SERVING_BUILD_OPTIONS} \
+    --output_filter=DONT_MATCH_ANYTHING \
+    tensorflow_serving/model_servers:tensorflow_model_server && \
+    cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/ && \
+    bazel clean --expunge --color=yes && \
+    rm -rf /root/.cache
+# Clean up Bazel cache when done.
+
 CMD ["/bin/bash"]
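The `--batch` and standalone-strategy workarounds in Dockerfile.devel reach Bazel through plain `echo` appends to `/etc/bazel.bazelrc`. The same mechanism can be tried outside the image; this sketch targets a temp file instead of `/etc/bazel.bazelrc`, which is the only departure from what the Dockerfile does:

```shell
# Reproduce the bazelrc the image builds, but in a temp file so it can be
# inspected without Docker or root access.
BAZELRC=$(mktemp)
echo "startup --batch" >>"$BAZELRC"
echo "build --spawn_strategy=standalone --genrule_strategy=standalone" \
    >>"$BAZELRC"
cat "$BAZELRC"
rm -f "$BAZELRC"
```

Because the image appends these lines to the system-wide bazelrc, every `bazel` invocation during `docker build` picks up `--batch` and the standalone strategies without any per-command flags.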
