Tensorflow Hub.load Model To Tflite
Solution 1:
Try using hub.KerasLayer to load your model into a tf.keras.Model and then convert it to tflite using .from_keras_model (full code below).
There's no such thing as a "keras SavedModel". There's the SavedModel, which is a .pb file + an assets folder + a variables folder. It's a file format, a way to store your model. It has nothing to do with in-memory tf.keras.Model objects. hub.load does not return a tf.keras.Model, but rather "the most generic thing" you can save in the SavedModel file format, namely a _UserObject. This is because you can save other things than just tf.keras.Model objects in the SavedModel format.
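A quick way to see the difference (a minimal sketch; the exact printed type can vary by TF/hub version):

import tensorflow as tf
import tensorflow_hub as hub

url = "https://tfhub.dev/google/universal-sentence-encoder/4"

# hub.load gives back the generic SavedModel object, not a Keras model
generic = hub.load(url)
print(type(generic))                            # some _UserObject / AutoTrackable type
print(isinstance(generic, tf.keras.Model))      # False

# hub.KerasLayer wraps the same SavedModel as a layer you can put in a tf.keras.Model
layer = hub.KerasLayer(url)
print(isinstance(layer, tf.keras.layers.Layer)) # True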
I know this was not your question, but if you do want to get your tf.keras.Model back after loading, save it with tf.keras.models.save_model (or model.save). Then it will come back as a tf.keras.Model after loading with tf.keras.models.load_model (so then it's no longer the most generic thing).
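For example (a sketch, assuming model is the tf.keras.Model built with hub.KerasLayer as in the code below; the save path is just a placeholder):

import tensorflow as tf
import tensorflow_hub as hub

tf.keras.models.save_model(model, "my_model")   # writes SavedModel format
# Depending on TF/hub versions you may need to pass
# custom_objects={"KerasLayer": hub.KerasLayer} here.
restored = tf.keras.models.load_model("my_model")
print(isinstance(restored, tf.keras.Model))     # True, unlike tf.saved_model.load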
EDIT:
Just the code:
import tensorflow as tf
import tensorflow_hub as hub

# Wrap the hub module in a Keras model so from_keras_model can be used
model = tf.keras.Sequential()
model.add(tf.keras.layers.InputLayer(dtype=tf.string, input_shape=()))
model.add(hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4"))

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
which works (it starts converting), but you get a:
2020-05-05 10:48:44.927433: I tensorflow/lite/toco/import_tensorflow.cc:659] Converting unsupported operation: StatefulPartitionedCall
So this is the code to convert models saved in the SavedModel format to tflite, but you get a google-universal-sentence-encoder specific error. No idea how to fix that though.
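One thing that sometimes helps with unsupported-op errors in general (untested for this particular model, so treat it as a sketch) is letting the converter fall back to TensorFlow ops via SELECT_TF_OPS:

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Keep TF ops the converter cannot lower to builtin TFLite ops;
# the resulting .tflite file then needs the Flex delegate at runtime.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()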