Error Trying to Convert TensorFlow Saved Model to TensorFlow.js Model
I have successfully trained a DNNClassifier to classify texts (posts from an online discussion board). I've created and saved my model using this code:
import tensorflow as tf
import tensorflow_hub as hub

# Text embedding column backed by the pre-trained German NNLM module from TF Hub
embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")

feature_columns = [embedded_text_feature_column]

estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))

# Export the trained estimator as a SavedModel with a parsing serving input receiver
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(export_dir_base="/my/dir/base",
                            serving_input_receiver_fn=serving_input_receiver_fn)
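The exported SavedModel can be loaded back in Python to double-check the serving signature and the output node names passed to the converter below. This is a minimal TF 1.x sketch, not part of the export itself; the timestamped subdirectory that export_savedmodel creates is left as a placeholder:

import tensorflow as tf

# export_savedmodel writes into a timestamped subdirectory of export_dir_base,
# e.g. /my/dir/base/1530000000 -- the path below is a placeholder.
export_dir = "/my/dir/base/<timestamp>"

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(sess, ["serve"], export_dir)
    signature = meta_graph.signature_def["serving_default"]
    print("inputs:", {k: v.name for k, v in signature.inputs.items()})
    print("outputs:", {k: v.name for k, v in signature.outputs.items()})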
Now I want to convert my saved model so I can use it with TensorFlow.js, the JavaScript version of TensorFlow, via the tfjs-converter.
When I issue the following command:
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_node_names='dnn/head/predictions/str_classes,dnn/head/predictions/probabilities' \
    --saved_model_tags=serve \
    /my/dir/base /my/export/dir
…I get this error message:
ValueError: Node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding/module_apply_default/embedding_lookup_sparse/embedding_lookup' expects to be colocated with unknown node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding
I assume I'm doing something wrong when saving the model.
What is the correct way to save an estimator model so that it can be converted with tfjs-converter?
The source code of my project can be found on GitHub.
javascript python tensorflow

asked Jul 9 at 16:18, edited Jul 11 at 6:22
Patrick Hund
At the moment it doesn't look like this is possible with the available library. Aside from this colocation issue, which seems to come from freezing word embeddings, tfjs-converter doesn't support all the ops in the graph. So even if the main TF library froze and restored the graph correctly, it would still include some unsupported ops like LookupTableFindV2 and StringToHashBucketFast. The README says to file issues to let the devs know which ops to support, but issues aren't currently enabled on the repo.
– argx
Jul 14 at 16:02
Awww, too bad... And surprising, since they have a word embeddings example in the tfjs repo, sentiment analysis of imdb movie ratings, pretty much the same as what I'm doing: github.com/tensorflow/tfjs-examples/tree/master/sentiment
– Patrick Hund
Jul 14 at 16:49
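A quick way to see which ops the exported graph actually contains (and therefore which ones the converter would need to support) is to reload the SavedModel and list its op types. A rough TF 1.x sketch, with the timestamped export directory left as a placeholder:

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess, ["serve"], "/my/dir/base/<timestamp>")
    op_types = sorted({op.type for op in sess.graph.get_operations()})
    # Ops such as LookupTableFindV2 or StringToHashBucketFast showing up here
    # are the ones the converter would reject.
    print("\n".join(op_types))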
1 Answer
You can try this; I think it will work. Just pass your input format (here, keras) to the converter:

tensorflowjs_converter --input_format keras \
    path/to/my_model.h5 \
    path/to/tfjs_target_dir
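Note that the DNNClassifier in the question is not a Keras model, so this path only works if the classifier is rebuilt with tf.keras and saved to HDF5 first. A rough sketch under that assumption (layer sizes taken from the question; the TF Hub embedding is assumed to be computed outside the model, so each input is a plain 128-dimensional vector; the paths are placeholders):

import tensorflow as tf
import tensorflowjs as tfjs  # pip install tensorflowjs

# Hypothetical tf.keras re-implementation of the DNNClassifier from the question.
# The nnlm-de-dim128 embedding is assumed to be applied outside the model,
# so each input sample is a 128-dimensional embedding vector.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(500, activation="relu", input_shape=(128,)),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adagrad", loss="sparse_categorical_crossentropy")

# ... train on the embedded sentences ...

# Save to HDF5 for "tensorflowjs_converter --input_format keras" ...
model.save("path/to/my_model.h5")
# ... or write the TensorFlow.js artifacts directly from Python:
tfjs.converters.save_keras_model(model, "path/to/tfjs_target_dir")

The catch is that the text-to-embedding lookup then has to happen outside the browser, since those lookup ops appear to be exactly what the converter can't handle, per the comments above.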
answered Nov 4 at 9:59 by Vivek Kaushik (new contributor), edited Nov 4 at 10:17 by Lee Mac
Thanks, I'll give it a try
– Patrick Hund
Nov 4 at 13:17