Tiven Wang
August 04, 2020

Running a serving container with a pre-built model

When running the tensorflow/serving container, use --mount to bind-mount the directory containing the saved model into the container:

docker run -t --rm -p 8501:8501 --name tf_serving \
  --mount type=bind,source=${PWD}/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,target=/models/half_plus_two \
  -e MODEL_NAME=half_plus_two tensorflow/serving &
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
  -X POST http://localhost:8501/v1/models/half_plus_two:predict
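If the model is being served correctly, the request should return something like the following (half_plus_two computes x / 2 + 2 for each instance):

{ "predictions": [2.5, 3.0, 4.5] }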

https://www.tensorflow.org/tfx/serving/architecture

https://www.tensorflow.org/tfx/tutorials/serving/rest_simple

Running an MNIST serving container

This follows the official Serving a TensorFlow Model tutorial with the MNIST example.

Following those steps, we have already cloned the serving project repository.

# Enter the project directory
$ cd serving
# Remove the target export directory if it already exists
$ rm -rf /tmp/mnist
# Use the helper script to run the Python export script inside a Docker container
$ tools/run_in_docker.sh python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist
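The SavedModel exported to /tmp/mnist can then be served the same way as the half_plus_two example above. The commands below are a sketch following the official MNIST tutorial; the gRPC port 8500 and the bundled mnist_client.py test client are taken from that tutorial and assume the export path used above:

# Serve the exported MNIST model over gRPC on port 8500
$ docker run -t --rm -p 8500:8500 \
    --mount type=bind,source=/tmp/mnist,target=/models/mnist \
    -e MODEL_NAME=mnist tensorflow/serving &
# Send test requests with the example gRPC client and print the inference error rate
$ tools/run_in_docker.sh python tensorflow_serving/example/mnist_client.py \
    --num_tests=1000 --server=127.0.0.1:8500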
