tensorflow-serving

TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. It deals with the inference aspect of machine learning, taking models after training and managing their lifetimes, providing clients with versioned access via a high-performance, reference-counted lookup table. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.

See tensorflow.org for more information.

This image is built by IBM to run on the IBM Z architecture and is not affiliated with any other community that provides a version of this image.


License

View license information here

As with all Docker images, these likely also contain other software which may be under other licenses (such as Bash, etc., from the base distribution, along with any direct or indirect dependencies of the primary software being contained).

As with any pre-built image, it is the image user's responsibility to ensure that any use of this image complies with all relevant licenses for the software contained within.


Versions

Use the pull string below for the version of this image you require.
Version Pull String Security (IBM Cloud) Created
2.4.0 docker pull icr.io/ibmz/tensorflow-serving@sha256:d232a0532342a29ed49d9cd61957793af07da6e8fba4d4c1da808124bb5909b7 Vulnerability Report 07-29-2021
2.4-opts-vector docker pull icr.io/ibmz/tensorflow-serving@sha256:c1a9fb946305538b6c72967cf349c9504f08c2054dea4fd575fd37078bfd1d1d Vulnerability Report 08-09-2022
2.7.0 docker pull icr.io/ibmz/tensorflow-serving@sha256:8da2e8e497fc839a76cad33b16a76e1ed537730b762a4c7f17fb2673e27fcf55 Vulnerability Report 05-04-2022

Usage Notes

Exposed Ports: 8500 (gRPC), 8501 (REST)

Run the following commands as an example of using the TensorFlow Serving image.

Pull the Docker Image and Clone the Test Data:

docker pull icr.io/ibmz/tensorflow-serving:[version&hash]

git clone -b 2.4.0 https://github.com/tensorflow/serving

Run a Base Container:

docker run -d --name serving_base icr.io/ibmz/tensorflow-serving:[version&hash]

Copy Your Model Data:

docker cp /path/to/your/model_data_from_git_clone_step serving_base:/models/half_plus_two
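TensorFlow Serving expects each model directory to contain numbered version subdirectories, and it serves the highest version by default, so the copied data must follow that layout. A minimal sketch of the expected structure (the /tmp prefix and version number 00000123 are illustrative placeholders, not paths from this image):

```shell
# Sketch of the directory layout TensorFlow Serving expects under
# /models/half_plus_two; /tmp and version 00000123 are illustrative.
mkdir -p /tmp/models/half_plus_two/00000123/variables
touch /tmp/models/half_plus_two/00000123/saved_model.pb

# List the layout: one numeric version directory holding the SavedModel.
find /tmp/models/half_plus_two -mindepth 1 | sort
```

The test data cloned from the tensorflow/serving repository already follows this layout, which is why the docker cp step above works as-is.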

Commit a Custom Image with MODEL_NAME Set, Then Remove the Base Container:

docker commit --change "ENV MODEL_NAME=half_plus_two" serving_base serving_custom

docker stop serving_base

docker rm serving_base

Run TensorFlow Serving Container:

docker run -t --rm -p 8501:8501 serving_custom

Query the Model Using the Predict API:

curl -d '{"instances": [1.0, 2.0, 5.0]}' \
   -X POST http://[host_machine]:8501/v1/models/half_plus_two:predict

Returns => { "predictions": [2.5, 3.0, 4.5] }
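The half_plus_two test model computes y = x/2 + 2 for each input element, which is where the predictions above come from. A quick shell check of that arithmetic (using awk; no running server required):

```shell
# half_plus_two computes y = x/2 + 2 per input element:
# 1.0 -> 2.5, 2.0 -> 3.0, 5.0 -> 4.5
for x in 1.0 2.0 5.0; do
  awk -v x="$x" 'BEGIN { printf "%.1f\n", x / 2 + 2 }'
done
```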