General installation instructions are
[on the Docker site](https://docs.docker.com/installation/), but we give some
quick links here:

*   [OSX](https://docs.docker.com/installation/mac/): [docker toolbox](https://www.docker.com/toolbox)
*   [Ubuntu](https://docs.docker.com/installation/ubuntulinux/)

## Which containers exist?

We currently maintain the following Dockerfiles:

*   [`Dockerfile`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile),
    which is a minimal VM with TensorFlow Serving installed.

*   [`Dockerfile.devel`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel),
    which is a minimal VM with all of the dependencies needed to build
    TensorFlow Serving.

*   [`Dockerfile.devel-gpu`](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel-gpu),
    which is a minimal VM with all of the dependencies needed to build
    TensorFlow Serving with GPU support.

## Building a container

`Dockerfile`:

```shell
docker build --pull -t $USER/tensorflow-serving .
```

`Dockerfile.devel`:

```shell
docker build --pull -t $USER/tensorflow-serving-devel -f Dockerfile.devel .
```

`Dockerfile.devel-gpu`:

```shell
docker build --pull -t $USER/tensorflow-serving-devel-gpu -f Dockerfile.devel-gpu .
```

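A quick way to confirm the builds succeeded is to list the resulting images
(the tags match the `-t` arguments above):

```shell
# show the local images built above
docker images | grep tensorflow-serving
```
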
## Running a container

### Serving Example

This assumes you have built the `Dockerfile` container.

First you will need a model. Clone the TensorFlow Serving repo.

```shell
mkdir -p /tmp/tfserving
cd /tmp/tfserving
git clone --recursive https://github.com/tensorflow/serving
```

We will use a toy model called Half Plus Three, which will predict the value
0.5 * x + 3 for each value x we provide for prediction.

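The toy model ships with the repo's test data. As a quick sanity check
(assuming the clone above succeeded), the SavedModel directory used below
should now exist on the host:

```shell
# the directory the next command mounts into the container
ls /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_three
```
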
```shell
docker run -p 8501:8501 \
  -v /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_three:/models/half_plus_three \
  -e MODEL_NAME=half_plus_three -t $USER/tensorflow-serving &
```

This will run the docker container, launch the TensorFlow Serving Model
Server, bind the REST API to port 8501, and map our desired model from the
host to where models are expected in the container. We also pass the name of
the model as an environment variable, which will be important when we query
the model.

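The same flags generalize to serving your own model. As a sketch (here
`/path/to/my_model` and `my_model` are placeholders, not paths from this
repo), you would mount the SavedModel directory and name it accordingly:

```shell
# serve a hypothetical SavedModel under the name my_model;
# /path/to/my_model should contain numeric version subdirectories (e.g. 1/)
docker run -p 8501:8501 \
  -v /path/to/my_model:/models/my_model \
  -e MODEL_NAME=my_model -t $USER/tensorflow-serving &
```
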
To query the model using the predict API, you can run:

```shell
curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/half_plus_three:predict
```

This should return a set of values:

```json
{ "predictions": [3.5, 4.0, 5.5] }
```

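You can also ask the server whether the model loaded successfully. The model
status endpoint of the same RESTful API reports the state of each loaded
version:

```shell
# query the load status of the model served above
curl http://localhost:8501/v1/models/half_plus_three
```
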
More information on using the RESTful API can be found [here](api_rest.md).

### Development Example

This assumes you have built the `Dockerfile.devel` container.

To run the container:

```shell
docker run -it -p 8500:8500 $USER/tensorflow-serving-devel
```

This will pull the latest TensorFlow Serving release with a prebuilt binary
and a working development environment. To test a model, from inside the
container try:

106+ `````` shell
107+ bazel build -c opt //tensorflow_serving/example:mnist_saved_model
108+ # train the mnist model
109+ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
110+ # serve the model
111+ tensorflow_model_server --port=8500 --model_name=mnist --model_base_path=/tmp/mnist_model/ &
112+ # build the client
113+ bazel build -c opt //tensorflow_serving/example:mnist_client
114+ # test the client
115+ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:8500
116+ `````
117+ ``````