Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Popping layer problem when merging it

See original GitHub issue
from keras.layers import Merge
from keras.layers import Dense
from keras.models import Sequential

def pop_layer(model):
    """Remove the last layer of a Sequential model in place."""
    if not model.outputs:
        raise Exception('Sequential model cannot be popped: model is empty.')

    model.layers.pop()
    if not model.layers:
        model.outputs = []
        model.inbound_nodes = []
        model.outbound_nodes = []
    else:
        model.layers[-1].outbound_nodes = []
        model.outputs = [model.layers[-1].output]
    model.built = False

if __name__ == "__main__":
    model = Sequential()
    model.add(Dense(32, input_dim=100))
    model.add(Dense(100))
    pop_layer(model)
    print model.layers[-1].output_shape

    model2 = Sequential()
    model2.add(Dense(32, input_dim=100))
    print model2.layers[-1].output_shape

    merged = Merge([model, model2], mode='concat')
    modelp = Sequential()
    modelp.add(merged)
    print modelp.layers[-1].output_shape

    from keras.utils.visualize_util import plot
    plot(modelp, show_shapes=True, to_file='modelp.png')

So model and model2 should be identical, because I popped the last layer from model.

However, if you look at the output plot (modelp.png), model and model2 have different outputs, and the former does not seem to have been popped.

And if you check the text printed to standard output, you’ll see that model and model2 both have 32-D outputs, while modelp has a 132-D output, which doesn’t make sense: concatenating two 32-D outputs should give 64-D, and 132 = 100 + 32 is exactly what you’d get if the pop never happened.

I only see this problem when I pop and then merge.
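A plausible explanation for the 132-D output (an assumption about Keras 1.x internals, not confirmed here) is that Sequential caches its output tensors on its inbound node, and Merge reads from that cache rather than from model.outputs, so pop_layer’s update never reaches it. The failure mode can be sketched without Keras at all; every class and attribute name below is a hypothetical stand-in:

```python
# Toy model (no Keras required; all names are hypothetical stand-ins).
# ToySequential mimics a container that keeps a cached copy of its output
# separate from its layer list, the way Sequential may cache output
# tensors on its inbound node.
class ToySequential(object):
    def __init__(self):
        self.layers = []
        self.cached_outputs = []  # stand-in for inbound_nodes[0].output_tensors

    def add(self, layer):
        self.layers.append(layer)
        self.cached_outputs = [layer]  # add() keeps the cache in sync

    def naive_pop(self):
        # Mirrors pop_layer() above: the layer list shrinks,
        # but the cached outputs are left untouched.
        self.layers.pop()

    def fixed_pop(self):
        # Also refreshes the cache, so a downstream consumer that reads
        # the cache (as Merge may) agrees with the layer list.
        self.layers.pop()
        self.cached_outputs = [self.layers[-1]] if self.layers else []


m = ToySequential()
m.add('dense_32')
m.add('dense_100')
m.naive_pop()
# The layer list now ends in 'dense_32', but the cache still says
# 'dense_100' -- the same mismatch that would yield a 100 + 32 = 132-D concat.
assert m.layers[-1] == 'dense_32'
assert m.cached_outputs == ['dense_100']
```

If this reading of the internals is right, refreshing the cached tensors at the end of pop_layer (something like model.inbound_nodes[0].output_tensors = model.outputs) would be the corresponding one-line fix, but that is an untested assumption about internal attributes, not documented API.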

Thank you.

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 9

Top GitHub Comments

2 reactions
mbollmann commented, Feb 1, 2017

I’m having the same problem with popping layers: they just magically reappear. model.pop is not working either; it gives AttributeError: 'Model' object has no attribute 'pop'.

A Model object can have layers connected in arbitrary ways, not only sequentially. It doesn’t make sense conceptually to pop a layer from something that doesn’t behave like a list.
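That conceptual point can be made concrete without Keras: in a general layer graph there may be no single “last layer” to remove. A toy sketch (the graph encoding and function names here are hypothetical illustrations, not Keras API):

```python
# Each key is a layer name; each value lists the layers feeding into it.
def last_layer(graph):
    """Return the unique sink (a layer nothing else consumes),
    or None when the 'last layer' is ambiguous."""
    consumed = {src for inputs in graph.values() for src in inputs}
    sinks = [name for name in graph if name not in consumed]
    return sinks[0] if len(sinks) == 1 else None

# A Sequential model is a chain, so popping is well defined:
chain = {'input': [], 'dense_1': ['input'], 'dense_2': ['dense_1']}
assert last_layer(chain) == 'dense_2'

# A Model with two output heads has no unique last layer,
# so "pop the last layer" is not even a well-posed request:
two_heads = {'input': [], 'head_a': ['input'], 'head_b': ['input']}
assert last_layer(two_heads) is None
```

This is why pop() exists only on the list-like Sequential container and not on the general Model class.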

0 reactions
kiyoon commented, Mar 8, 2017

@alyato Yes, there’s no pop function for Model, because a Model is not necessarily sequential. Popping from a non-sequential model doesn’t make sense. So if you want to pop, use a Sequential model.

