Error Trying to Convert TensorFlow Saved Model to TensorFlow.js Model
I have successfully trained a DNNClassifier to classify texts (posts from an online discussion board). I've created and saved my model using this code:



embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")

feature_columns = [embedded_text_feature_column]

estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))

feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(
    export_dir_base="/my/dir/base",
    serving_input_receiver_fn=serving_input_receiver_fn)


Now I want to convert my saved model to use it with the JavaScript version of TensorFlow, tf.js, using the tfjs-converter.



When I issue the following command:



tensorflowjs_converter --input_format=tf_saved_model --output_node_names='dnn/head/predictions/str_classes,dnn/head/predictions/probabilities' --saved_model_tags=serve /my/dir/base /my/export/dir


…I get this error message:




ValueError: Node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding/module_apply_default/embedding_lookup_sparse/embedding_lookup' expects to be colocated with unknown node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding'




I assume I'm doing something wrong when saving the model.



What is the correct way to save an estimator model so that it can be converted with tfjs-converter?



The source code of my project can be found on GitHub.










  • At the moment it doesn't look like this is possible with the available library. Aside from the colocation issue, which seems to come from freezing word embeddings, tfjs-converter doesn't support all the ops in the graph. So even if the main TF library froze and restored the graph correctly, the result would still include unsupported ops such as LookupTableFindV2 and StringToHashBucketFast. The README says to file issues to let the devs know which ops to support, but issues aren't currently enabled on the repo.
    – argx, Jul 14 at 16:02

  • Awww, too bad... And surprising, since they have a word embeddings example in the tfjs repo (sentiment analysis of IMDB movie ratings), pretty much the same as what I'm doing: github.com/tensorflow/tfjs-examples/tree/master/sentiment
    – Patrick Hund, Jul 14 at 16:49
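The unsupported-ops point above can be checked directly: you can load the exported SavedModel and enumerate the op types in its graph, then compare them against the list in the tfjs-converter README. A minimal sketch, assuming the TF 1.x `tf.Session`/`tf.saved_model.loader` API that matches the code in the question (the import is done lazily so the function is importable even without TensorFlow installed):

```python
def list_op_types(export_dir, tag="serve"):
    """Load a SavedModel and return the sorted set of op type names in its graph."""
    import tensorflow as tf  # TF 1.x API assumed, as in the question

    with tf.Session(graph=tf.Graph()) as sess:
        # Load the graph tagged for serving into this session's graph.
        tf.saved_model.loader.load(sess, [tag], export_dir)
        # Collect the distinct op types used by the graph's nodes.
        return sorted({node.op for node in sess.graph.as_graph_def().node})

# Hypothetical usage (the timestamped export path is illustrative):
# for op in list_op_types("/my/dir/base/1531234567"):
#     print(op)  # look for ops such as LookupTableFindV2, StringToHashBucketFast
```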
Tags: javascript, python, tensorflow
edited Jul 11 at 6:22
asked Jul 9 at 16:18
Patrick Hund
1 Answer
You can try specifying the input format explicitly; I think this will work. For a Keras HDF5 model, the command looks like this:

tensorflowjs_converter --input_format keras \
    path/to/my_model.h5 \
    path/to/tfjs_target_dir
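Note that the keras input format expects an HDF5 file produced by model.save(...), which an estimator such as DNNClassifier does not emit; for the SavedModel in the question, --input_format=tf_saved_model is the matching option. A minimal sketch of producing a file the keras path can convert (the architecture below is illustrative only, not taken from the question; the import is lazy so the function is defined even without TensorFlow installed):

```python
def save_keras_model(path="my_model.h5"):
    """Build a tiny illustrative classifier and save it as HDF5."""
    from tensorflow import keras  # tf.keras assumed available

    model = keras.Sequential([
        # Layer sizes here are placeholders, not the model from the question.
        keras.layers.Dense(100, activation="relu", input_shape=(128,)),
        keras.layers.Dense(2, activation="softmax"),
    ])
    model.save(path)  # writes an HDF5 file usable with --input_format keras
    return model
```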
answered Nov 4 at 9:59 by Vivek Kaushik (edited Nov 4 at 10:17 by Lee Mac)
  • Thanks, I'll give it a try
    – Patrick Hund
    Nov 4 at 13:17