
Feeding input to intermediate layer fails with Graph disconnected Exception

See original GitHub issue

I am writing a pipeline that fine-tunes the pre-trained models of Keras 1.2.0. To speed it up, instead of freezing the layers I try to:

  1. Feed the training images once to the “frozen” part of the network and store the intermediate output to a file.
  2. Train iteratively the remaining network by feeding directly the intermediate output from the file.

If you don’t use data augmentation, this should yield a significant speed improvement. Unfortunately, step 2 fails with a “Graph disconnected” exception. I tried alternative ways to do this (such as the K.function() approach) but it still fails.
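The caching scheme in steps 1–2 can be sketched independently of Keras: run the frozen part once, persist the features to disk, then train the head only on the cached arrays. A minimal numpy sketch (the `frozen_part` name and the pooling transform are illustrative stand-ins, not from the issue):

```python
import numpy as np

def frozen_part(images):
    # Stand-in for the frozen layers: any fixed, deterministic transform.
    return images.mean(axis=(1, 2))  # e.g. a global average pooling

# Step 1: feed the images through the frozen part once and cache the result.
images = np.random.rand(8, 224, 224, 3).astype("float32")
features = frozen_part(images)
np.save("features.npy", features)

# Step 2: train the remaining head on the cached features only,
# iterating over epochs without recomputing the frozen part.
cached = np.load("features.npy")
```

Because the frozen activations are computed exactly once, each subsequent epoch touches only the small head, which is where the speed-up comes from.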

Below you will find a simple example that reproduces the problem and the error message:

import keras.applications
from keras.models import Model
from keras.layers import Input
from keras.preprocessing import image
from keras.applications.imagenet_utils import preprocess_input
import numpy as np

# Read some random image
img = image.load_img('/path/to/image.jpg', target_size=(224, 224))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
x = preprocess_input(x)

# Load a pre-trained model
model = keras.applications.resnet50.ResNet50(weights='imagenet', include_top=False, input_tensor=Input(shape=(224, 224, 3)))

# Feed the image and get the bn_conv1 output: WORKS!
bn_conv1_model = Model(input=model.input, output=model.get_layer('bn_conv1').output)
bn_conv1_output = bn_conv1_model.predict(x)

# Feed directly the bn_conv1 output to the remaining layers: FAILS!
avg_pool_model = Model(
    input=Input(shape=model.get_layer('bn_conv1').output_shape[1:]),
    output=model.get_layer('avg_pool').output)  # This line throws the exception
avg_pool_output = avg_pool_model.predict(bn_conv1_output)

The error message is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 1987, in __init__
    str(layers_with_complete_input))
RuntimeError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(?, 224, 224, 3), dtype=float32) at layer "input_1". The following previous layers were accessed without issue: []

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Reactions: 4
  • Comments: 23 (10 by maintainers)

Top GitHub Comments

37 reactions
bstriner commented, Jan 20, 2017

Graph disconnected normally means your input and output are not part of the same graph. If your input was not the variable you used to create your output, this is the error you will get.
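This reachability rule can be illustrated with a toy graph (a schematic sketch, not Keras internals): a model's output must be traceable back to its declared input by following parent links, otherwise the graph is disconnected.

```python
class Tensor:
    """Toy tensor that remembers which tensors produced it."""
    def __init__(self, name, parents=()):
        self.name = name
        self.parents = tuple(parents)

def check_connected(inp, out):
    """Walk back from `out`; raise if `inp` is never reached."""
    stack, seen = [out], set()
    while stack:
        t = stack.pop()
        if t is inp:
            return True
        if t not in seen:
            seen.add(t)
            stack.extend(t.parents)
    raise RuntimeError(
        "Graph disconnected: cannot obtain value for tensor %s" % inp.name)

x = Tensor("input_1")
h = Tensor("bn_conv1", parents=[x])   # built from x
y = Tensor("avg_pool", parents=[h])

check_connected(x, y)                 # fine: y traces back to x

x2 = Tensor("input_2")                # a fresh Input lives in a separate graph
# check_connected(x2, y) raises: y was never built from x2
```

This is exactly the situation in the repro code: `avg_pool` was built from the original `input_1`, so pairing it with a brand-new `Input` leaves the new input unreachable.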

13 reactions
datumbox commented, Jul 22, 2017

Hi @engharat

I have not yet written a proper solution for that. The network graphs can be very complex, especially for networks that branch out and merge a lot, and handling this requires writing graph-traversal algorithms. In the future I’ll probably do this and contribute it back to Keras, but I have not done it yet.

Below I send you the latest version of the “terrible” solution that I’m using. Rest assured it is equally terrible as the previous one:

import random

from keras.models import Model

def split(model, start, end):
    confs = model.get_config()
    kept_layers = set()
    for i, l in enumerate(confs['layers']):
        if i == 0:
            confs['layers'][0]['config']['batch_input_shape'] = model.layers[start].input_shape
            if i != start:
                confs['layers'][0]['name'] += str(random.randint(0, 100000000)) # rename the input layer to avoid conflicts on merge
                confs['layers'][0]['config']['name'] = confs['layers'][0]['name']
        elif i < start or i > end:
            continue
        kept_layers.add(l['name'])
    # filter layers
    layers = [l for l in confs['layers'] if l['name'] in kept_layers]
    layers[1]['inbound_nodes'][0][0][0] = layers[0]['name']
    # set conf
    confs['layers'] = layers
    confs['input_layers'][0][0] = layers[0]['name']
    confs['output_layers'][0][0] = layers[-1]['name']
    # create new model
    submodel = Model.from_config(confs)
    for l in submodel.layers:
        orig_l = model.get_layer(l.name)
        if orig_l is not None:
            l.set_weights(orig_l.get_weights())
    return submodel
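The config surgery the function performs can be mimicked on a plain dict, which makes the rewiring steps easier to see. A schematic sketch with a hand-built, simplified config (the layer names and the flat `inbound` field are illustrative, not the real Keras config format):

```python
import random

def split_config(confs, start, end):
    # Keep the input layer plus layers start..end, as in split() above.
    kept = []
    for i, layer in enumerate(confs["layers"]):
        if i == 0 or start <= i <= end:
            kept.append(dict(layer))
    if start != 0:
        # Rename the input layer to avoid name conflicts on a later merge.
        kept[0]["name"] += str(random.randint(0, 100000000))
    # Rewire: the first kept real layer now feeds from the input layer directly.
    kept[1]["inbound"] = kept[0]["name"]
    confs = dict(confs, layers=kept)
    confs["input_layer"] = kept[0]["name"]
    confs["output_layer"] = kept[-1]["name"]
    return confs

confs = {
    "layers": [
        {"name": "input_1", "inbound": None},
        {"name": "conv1", "inbound": "input_1"},
        {"name": "bn_conv1", "inbound": "conv1"},
        {"name": "avg_pool", "inbound": "bn_conv1"},
    ],
    "input_layer": "input_1",
    "output_layer": "avg_pool",
}

sub = split_config(confs, 2, 3)  # keep bn_conv1 .. avg_pool
```

The key move is the rewiring line: by pointing the first kept layer at the (renamed) input layer, the rebuilt graph is connected end to end, which is what avoids the “Graph disconnected” error when the submodel is instantiated.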

