TensorFlow 2.18.0 has dropped TensorRT support — what about TensorFlow Serving?
I tried to host a TensorRT model with the 2.18.0-gpu container, and it fails with the log below:

```
2025-01-17 19:47:04.267837: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:466] SavedModel load for tags { serve }; Status: fail: NOT_FOUND: Op type not registered 'CreateTRTResourceHandle' in binary running on 9b2753a90ca1. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. tf.contrib.resampler), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.. Took 2505621 microseconds.
2025-01-17 19:47:04.267889: E tensorflow_serving/util/retrier.cc:40] Loading servable: {name: tftrt_saved_model version: 1} failed: NOT_FOUND: Op type not registered 'CreateTRTResourceHandle' in binary running on 9b2753a90ca1. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib (e.g. tf.contrib.resampler), accessing should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
```
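For reference, this is roughly how the model was served — a minimal sketch, assuming a local SavedModel directory; the host path is a placeholder, and only the model name (`tftrt_saved_model`) comes from the error log above:

```shell
# Hypothetical host path; adjust to where the TF-TRT SavedModel lives.
MODEL_DIR=/path/to/tftrt_saved_model

# Serving with the 2.18.0-gpu image produces the NOT_FOUND error above;
# swapping the tag to 2.17.0-gpu loads the same model successfully.
docker run --rm --gpus all -p 8501:8501 \
  -v "$MODEL_DIR:/models/tftrt_saved_model" \
  -e MODEL_NAME=tftrt_saved_model \
  tensorflow/serving:2.18.0-gpu
```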
Hosting the same model with the 2.17.0-gpu container succeeds.
The Dockerfiles of both versions include the TensorRT-related libraries and environment variables.
If we want to build TensorFlow Serving with TensorRT support ourselves, how should we do it?
I've tried setting TF_NEED_TENSORRT=1, but it doesn't work.
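For context, this is the kind of source build that was attempted — a sketch only, assuming the historical TF Serving GPU build flow; the release branch and Dockerfile path are taken from the tensorflow/serving repository layout, and whether TF_NEED_TENSORRT=1 has any effect on a 2.18 build is exactly the open question here:

```shell
# Build a TF Serving GPU image from source (sketch; branch is an example).
git clone -b 2.17.0 https://github.com/tensorflow/serving.git
cd serving

# TF_NEED_TENSORRT=1 is the bazel-configure flag that historically
# enabled TF-TRT ops; with 2.18.0 it appears to be a no-op since the
# TensorRT code was removed from TensorFlow itself.
docker build --pull -t tfserving-devel-trt \
  --build-arg TF_SERVING_BUILD_OPTIONS="--config=release" \
  -f tools/docker/Dockerfile.devel-gpu .
```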
Thanks.