This repository was archived by the owner on Aug 11, 2023. It is now read-only.
forked from hunglc007/tensorflow-yolov4-tflite
Saved model and Tensorflow Serving #73
I'm trying to use a saved model in TensorFlow Serving, but without success.
I exported the model:
# Import assumed from the yolov4 package's TensorFlow backend:
from yolov4.tf import YOLOv4

yolo = YOLOv4()
yolo.config.parse_names("yolov4-data/coco.names")
yolo.config.parse_cfg("yolov4-data/yolov4-tiny.cfg")
yolo.make_model()
yolo.load_weights(
"./yolov4-data/trained/yolov4-tiny-18000-step.weights",
weights_type="yolo",
)
yolo.model.save("./yolov4-data/saved_model/1")
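For context, the serving setup implied by the logs below can be reproduced with a compose service along these lines. This is a sketch reconstructed from the log paths and ports; the service name, mount paths, and environment values are assumptions, not taken from the original post:

```yaml
services:
  tfserving:
    image: tensorflow/serving
    ports:
      - "8500:8500"   # gRPC endpoint seen in the logs
      - "8501:8501"   # REST endpoint seen in the logs
    volumes:
      - ./yolov4-data/saved_model:/notebooks/object-detection/yolov4-data/saved_model
    environment:
      - MODEL_NAME=object-detection-yolov4
      - MODEL_BASE_PATH=/notebooks/object-detection/yolov4-data/saved_model
```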
I ran this model in TensorFlow Serving:
tfserving | 2021-03-02 10:18:10.583329: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:152] Running initialization op on SavedModel bundle at path: /notebooks/object-detection/yolov4-data/saved_model/1
tfserving | 2021-03-02 10:18:10.727702: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:333] SavedModel load for tags { serve }; Status: success: OK. Took 3277171 microseconds.
tfserving | 2021-03-02 10:18:10.727917: I tensorflow_serving/core/loader_harness.cc:87] Successfully loaded servable version {name: object-detection-yolov4 version: 1}
tfserving | 2021-03-02 10:18:10.810755: I tensorflow_serving/model_servers/server.cc:355] Running gRPC ModelServer at 0.0.0.0:8500 ...
tfserving | [warn] getaddrinfo: address family for nodename not supported
tfserving | [evhttp_server.cc : 238] NET_LOG: Entering the event loop ...
tfserving | 2021-03-02 10:18:10.813386: I tensorflow_serving/model_servers/server.cc:375] Exporting HTTP/REST API at:localhost:8501 ...
I'm able to get the metadata of the saved model:
curl http://tfserving:8501/v1/models/object-detection-yolov4/metadata
{
"model_spec":{
"name": "object-detection-yolov4",
"signature_name": "",
"version": "1"
}
,
"metadata": {"signature_def": {
"signature_def": {
"serving_default": {
"inputs": {
"input_1": {
"dtype": "DT_FLOAT",
"tensor_shape": {
"dim": [
{
"size": "-1",
"name": ""
},
{
"size": "416",
"name": ""
},
{
"size": "416",
"name": ""
},
{
"size": "3",
"name": ""
}
],
"unknown_rank": false
},
"name": "serving_default_input_1:0"
}
},
"outputs": {
"output_1": {
"dtype": "DT_FLOAT",
"tensor_shape": {
"dim": [
{
"size": "-1",
"name": ""
},
{
"size": "13",
"name": ""
},
{
"size": "13",
"name": ""
},
{
"size": "255",
"name": ""
}
],
"unknown_rank": false
},
"name": "StatefulPartitionedCall:0"
},
"output_2": {
"dtype": "DT_FLOAT",
"tensor_shape": {
"dim": [
{
"size": "-1",
"name": ""
},
{
"size": "26",
"name": ""
},
{
"size": "26",
"name": ""
},
{
"size": "255",
"name": ""
}
],
"unknown_rank": false
},
"name": "StatefulPartitionedCall:1"
}
},
"method_name": "tensorflow/serving/predict"
},
"__saved_model_init_op": {
"inputs": {},
"outputs": {
"__saved_model_init_op": {
"dtype": "DT_INVALID",
"tensor_shape": {
"dim": [],
"unknown_rank": true
},
"name": "NoOp"
}
},
"method_name": ""
}
}
}
}
}
But I'm not able to send an image via REST and get the detected objects back:
import numpy as np
from PIL import Image
import requests
data = np.array(Image.open(TEST_IMG))
req = { "inputs": {"input_1": data.tolist()} }
resp = requests.post(SERVER_URL, json=req)
resp.raise_for_status()
print('response.status_code: {}'.format(resp.status_code))
print('response.content: {}'.format(resp.content))
Result:
HTTPError: 400 Client Error: Bad Request for url: http://tfserving:8501/v1/models/object-detection-yolov4/versions/1:predict
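For reference, the signature above expects a float32 tensor of shape [-1, 416, 416, 3], while the request sends the raw uint8 pixels at the image's native size. A preprocessing sketch that matches the advertised signature (the [0, 1] scaling is an assumption about what this particular model expects, untested here):

```python
import numpy as np
from PIL import Image

def preprocess(img, size=416):
    """Resize to the model's input resolution and scale pixels to float32 in [0, 1]."""
    img = img.convert("RGB").resize((size, size))
    arr = np.asarray(img, dtype=np.float32) / 255.0  # assumed normalization
    return arr[np.newaxis, ...]  # add batch dimension -> (1, size, size, 3)

# Synthetic stand-in for Image.open(TEST_IMG):
demo = Image.fromarray(np.zeros((480, 640, 3), dtype=np.uint8))
batch = preprocess(demo)
payload = {"inputs": {"input_1": batch.tolist()}}
print(batch.shape, batch.dtype)  # (1, 416, 416, 3) float32
```

The payload can then be posted to SERVER_URL exactly as in the snippet above; whether the model also wants pixel normalization depends on how the yolov4 package defines its input, so treat the scaling as a guess.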
Any recommendations?
Thank you.
Lukas