Code Structure for Multi-Tensor TensorFlow Deployment (Model Saving and Client Request Structure)

Putri Alvina Lutfiani
3 min readSep 25, 2021

Note that this code can't be dropped into every cloud platform as-is; each platform has its own deployment structure, although the general outline is similar. I wrote this based on my experience deploying on IBM Cloud using its Watson Studio and Machine Learning features. Still, this structure may serve as a reference, with modifications and adjustments for whichever cloud you use.

One more note: this is mostly a reference for my future self (who will not remember this code by then).


This deployment example covers a deep learning model in TensorFlow that takes multiple tensors as inputs and has multiple operations computing the outputs. So far I have only found deployment examples with a single input and a single output, so I wrote this one.

I will use dummy tensor and operation names, and TensorFlow version 2.

Save the Model

First, the model is saved using TensorFlow's SavedModelBuilder module to export it. The original, official documentation can be accessed at this link. I modified some lines of the code to make it fit this model-saving scenario.

The example structure code:

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    graph = tf.compat.v1.get_default_graph()

    # Output tensors: the operations that compute the model's outputs
    operation_1 = graph.get_tensor_by_name("tensor_name_of_operation_1:0")
    operation_2 = graph.get_tensor_by_name("tensor_name_of_operation_2:0")
    operation_3 = graph.get_tensor_by_name("tensor_name_of_operation_3:0")

    # Input tensors: the model's features
    feature_1 = graph.get_tensor_by_name("tensor_name_of_feature_1:0")
    feature_2 = graph.get_tensor_by_name("tensor_name_of_feature_2:0")
    feature_3 = graph.get_tensor_by_name("tensor_name_of_feature_3:0")
    feature_4 = graph.get_tensor_by_name("tensor_name_of_feature_4:0")
    feature_5 = graph.get_tensor_by_name("tensor_name_of_feature_5:0")

    # Build a signature that maps every input and output tensor by name
    classification_signature = tf.compat.v1.saved_model.signature_def_utils.build_signature_def(
        inputs={'feature_1': tf.compat.v1.saved_model.utils.build_tensor_info(feature_1),
                'feature_2': tf.compat.v1.saved_model.utils.build_tensor_info(feature_2),
                'feature_3': tf.compat.v1.saved_model.utils.build_tensor_info(feature_3),
                'feature_4': tf.compat.v1.saved_model.utils.build_tensor_info(feature_4),
                'feature_5': tf.compat.v1.saved_model.utils.build_tensor_info(feature_5)},
        outputs={'operation_1': tf.compat.v1.saved_model.utils.build_tensor_info(operation_1),
                 'operation_2': tf.compat.v1.saved_model.utils.build_tensor_info(operation_2),
                 'operation_3': tf.compat.v1.saved_model.utils.build_tensor_info(operation_3)}
    )

    builder = tf.compat.v1.saved_model.builder.SavedModelBuilder("name-of-model-dir")  # adjustable name of the model directory
    builder.add_meta_graph_and_variables(sess,
                                         [tf.saved_model.SERVING],
                                         signature_def_map={"model": classification_signature})
    builder.save()

The argument to the build_tensor_info function can also be the tensor or operation variable itself, passed in directly. I chose to fetch them from the graph by name instead, which helps when your TensorFlow code is highly complex.

Deployment

The deployment steps can be followed in IBM's official documentation, linked above the code. If the deployment succeeds, the model can be consumed through an API from the client.

Consume the API

I will give an example of consuming the model through the API from a client using Python. You can try it in a notebook or similar; just make sure you have an Internet connection. Keep in mind that this is an IBM Cloud request, but the JSON structure for other clouds will likely be similar.

import requests

# NOTE: you must manually set API_KEY below using information retrieved from your IBM Cloud account.
API_KEY = "<YOUR API KEY>"

# Exchange the API key for an IAM access token
token_response = requests.post('https://iam.cloud.ibm.com/identity/token',
                               data={"apikey": API_KEY, "grant_type": 'urn:ibm:params:oauth:grant-type:apikey'})
mltoken = token_response.json()["access_token"]
header = {'Content-Type': 'application/json', 'Authorization': 'Bearer ' + mltoken}

# NOTE: manually define and pass the array(s) of values to be scored below
payload_scoring = {"input_data": [
    {"id": "feature_1", "values": [the_value_of_feature_1]},
    {"id": "feature_2", "values": [the_value_of_feature_2]},
    {"id": "feature_3", "values": [the_value_of_feature_3]},
    {"id": "feature_4", "values": [the_value_of_feature_4]},
    {"id": "feature_5", "values": [the_value_of_feature_5]}
]}

response_scoring = requests.post('your_link_request', json=payload_scoring, headers=header)
print("Scoring response")
print(response_scoring.json())
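Writing the payload by hand invites typos in the ids, so it can also be generated from a plain dictionary. This is a minimal sketch; the build_payload helper and the feature values are hypothetical, not part of any IBM SDK:

```python
# Hypothetical helper: build the scoring payload from a feature dict,
# so each "id" entry matches an input key of the saved model signature.
def build_payload(features):
    return {"input_data": [{"id": name, "values": [value]}
                           for name, value in features.items()]}

# Dummy feature values, matching the dummy input names used in this post.
features = {
    "feature_1": 0.5,
    "feature_2": 1.2,
    "feature_3": 3.0,
    "feature_4": 0.0,
    "feature_5": 7.5,
}

payload_scoring = build_payload(features)
```

The resulting payload_scoring can then be passed to requests.post as the json argument, exactly as in the request code.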

Each id in the input_data list must match a key of the inputs dictionary used when building and saving the model we deployed on the cloud.
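On the way back, the outputs are paired with names as well. The exact response shape depends on the cloud and service version, so the structure below is an assumption for illustration; inspect response_scoring.json() on your own deployment before relying on it:

```python
# Hypothetical scoring response, assuming the outputs come back as a list of
# {"id": ..., "values": ...} entries mirroring the inputs (verify on your cloud).
response_json = {
    "predictions": [
        {"id": "operation_1", "values": [[0.9, 0.1]]},
        {"id": "operation_2", "values": [[0.3, 0.7]]},
        {"id": "operation_3", "values": [[42.0]]},
    ]
}

# Collect each output's values, keyed by the operation name from the signature.
outputs = {p["id"]: p["values"] for p in response_json["predictions"]}
```

With this mapping in hand, each output can be read back by the same name that was used in the outputs dictionary when saving the model.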
