
Error running a frozen graph of the 128-dimensional embeddings model

See original GitHub issue

I froze the new 128-dimensional embeddings model using freeze_graph.py, but when I run the session it gives this error:

Failed precondition: Attempting to use uninitialized value Bottleneck/BatchNorm/moving_variance [[Node: Bottleneck/BatchNorm/moving_variance/read = Identity[T=DT_FLOAT, _class=["loc:@Bottleneck/BatchNorm/moving_variance"], _device="/job:localhost/replica:0/task:0/cpu:0"](Bottleneck/BatchNorm/moving_variance)]]

Should I change something in freeze_graph.py?
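For reference, running a frozen graph in TensorFlow 1.x typically looks like the sketch below. The file path and the tensor names (input:0, phase_train:0, embeddings:0) are placeholders assumed for illustration, not taken from the issue. If freeze_graph.py did its job, every variable, including the BatchNorm moving statistics, has already been baked in as a constant, so no initializer should need to run at this point:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Load the serialized GraphDef produced by freeze_graph.py.
with tf.gfile.GFile("frozen_model.pb", "rb") as f:  # placeholder path
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph; all former variables should now be Const nodes.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

# Dummy input, just to show the call shape; real code would feed aligned face images.
images = np.zeros((1, 160, 160, 3), dtype=np.float32)

with tf.Session(graph=graph) as sess:
    emb = sess.run("embeddings:0",
                   feed_dict={"input:0": images, "phase_train:0": False})
    print(emb.shape)  # expected (1, 128) for the 128-dimensional model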

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

2 reactions
mhaghighat commented, Mar 7, 2017

@RakshakTalwar, it happened to me too. I updated to TensorFlow r1.0, and it worked.

0 reactions
adwin5 commented, Mar 19, 2017

Confirmed. It doesn’t work with TF 0.12, but it does with TF r1.0.1.
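For anyone still stuck on an older TensorFlow, a quick sanity check (a sketch, with a placeholder path) is to inspect the frozen GraphDef and confirm that the BatchNorm moving statistics were actually converted to Const nodes rather than left as variable ops:

import tensorflow as tf  # TensorFlow 1.x API

graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_model.pb", "rb") as f:  # placeholder path
    graph_def.ParseFromString(f.read())

# A correctly frozen graph contains no Variable ops at all.
leftover = [n.name for n in graph_def.node if n.op in ("Variable", "VariableV2")]
print("Unfrozen variable nodes:", leftover or "none")

# The node from the error message should now be a Const.
for n in graph_def.node:
    if n.name == "Bottleneck/BatchNorm/moving_variance":
        print(n.name, "is of type", n.op)  # expect Const after a successful freeze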


Top Results From Across the Web

TensorFlow 2.x: Cannot load trained model in h5 format when ...
This problem is caused by an inconsistency between the dimension of the embedding matrix in training and in prediction.

How to Use Word Embedding Layers for Deep Learning with ...
The output of the Embedding layer is a 2D vector with one embedding for each word in the input sequence of words (input...

Neural Network Embeddings Explained - Towards Data Science
In the context of neural networks, embeddings are low-dimensional, learned continuous vector representations of discrete variables.

Towards deep neuromorphic knowledge graph embeddings
Based on the insight that randomly initialized and untrained (i.e., frozen) graph neural networks are able to preserve local graph structures ...

TAGE: Task Agnostic Graph Embeddings - SNAP: Stanford
... unsupervised deep learning model for learning structural node embeddings ... low-dimensional embedding f(w|H) given node w and graph H. Then, the embedding ...
