
Call new layer on the last layer of create_model object using Functional API

See original GitHub issue

Hi. First, I want to say that I enjoy this library a lot! Thank you @martinsbruveris for creating it!

I have a question: I want to create a model body using the create_model function and add my own classification head. In the classification head I want to add another input layer for additional features, call a concatenate layer on the last layer of the create_model object and the new input layer, and add a final dense layer. Since the create_model object is not a Sequential or Functional model object, is there any way I can do that? I tried model_tfimm.output and model_tfimm.layers[-1].output, because the .output attribute works with TensorFlow models, but it does not seem to work with tfimm models:

dense_1 = tf.keras.layers.Dense(512, activation='relu', name='dense_1')(model_tfimm.layers[-1].output)

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
v:\Git\spellbook\magicClassification.py in <module>
----> 1 dense_1 = tf.keras.layers.Dense(512, activation='relu', name='dense_1')(model_tfimm.layers[-1].output)

~\AppData\Local\Programs\Python\Python37\lib\site-packages\keras\engine\base_layer.py in output(self)
  2094     """
  2095     if not self._inbound_nodes:
-> 2096       raise AttributeError('Layer ' + self.name + ' has no inbound nodes.')
  2097     return self._get_node_attribute_at_index(0, 'output_tensors', 'output')
  2098 

AttributeError: Layer activation_72 has no inbound nodes.

model_tfimm.output

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
v:\Git\spellbook\magicClassification.py in <module>
----> 1 model_tfimm.output

~\AppData\Local\Programs\Python\Python37\lib\site-packages\keras\engine\base_layer.py in output(self)
   2094     """
   2095     if not self._inbound_nodes:
-> 2096       raise AttributeError('Layer ' + self.name + ' has no inbound nodes.')
   2097     return self._get_node_attribute_at_index(0, 'output_tensors', 'output')
   2098 

AttributeError: Layer conv_ne_xt_1 has no inbound nodes.

Using the TensorFlow Functional API, this would look something like this:

model_tfimm = tfimm.create_model(TFIMM_MODEL_NAME, nb_classes=0, pretrained="timm")
feature_extractor = model_tfimm.output

add_input = tf.keras.layers.Input(shape=(NUM_ADD_FEATURES, ), name='input_features_layer')
concat_layer = tf.keras.layers.Concatenate(name='concat_features')([feature_extractor, add_input])

predictions = tf.keras.layers.Dense(NUM_CLASSES, activation=OUTPUT_ACTIVATION)(concat_layer)

model = tf.keras.Model(inputs=[model_tfimm.input, add_input], outputs=predictions)

Any ideas?

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

3 reactions
martinsbruveris commented, Feb 11, 2022

Yes, you should be able to train all layers by default. Consider this code:

import tensorflow as tf
import tfimm
from tfimm.utils.flops import get_parameters

# Image input and an extra feature input.
x = tf.keras.Input((32, 32, 3))
y = tf.keras.Input((512,))

# Backbone without a classification head (nb_classes=0).
backbone = tfimm.create_model("resnet18", nb_classes=0)

# Call the backbone on the image input, concatenate its features
# with the extra input, and add the final dense layer.
z = backbone(x)
z = tf.keras.layers.Concatenate()([y, z])
z = tf.keras.layers.Dense(units=10)(z)
model = tf.keras.Model(inputs=[x, y], outputs=z)

print(get_parameters(backbone))
print(get_parameters(model))

Note that the combined model has all the trainable parameters of the backbone plus the ones from the dense layer.
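For what it's worth, the trainable-parameter check can also be done with plain Keras instead of tfimm's get_parameters helper. The sketch below rebuilds the same model as above; the nb_trainable helper is mine (not a tfimm or Keras API), and the last lines show how the backbone could be frozen if you only wanted to train the new head:

import tensorflow as tf
import tfimm

# Same combined model as in the snippet above.
x = tf.keras.Input((32, 32, 3))
y = tf.keras.Input((512,))
backbone = tfimm.create_model("resnet18", nb_classes=0)
z = backbone(x)
z = tf.keras.layers.Concatenate()([y, z])
z = tf.keras.layers.Dense(units=10)(z)
model = tf.keras.Model(inputs=[x, y], outputs=z)

# Count trainable parameters with plain Keras (helper defined here, not a library API).
def nb_trainable(m):
    return sum(int(tf.size(w)) for w in m.trainable_weights)

print(nb_trainable(backbone))  # parameters of the backbone
print(nb_trainable(model))     # backbone plus the final dense layer

# Optional: freeze the backbone so only the new head is trained.
backbone.trainable = False
print(nb_trainable(model))     # now only the dense layer's parameters

Freezing the backbone is optional; as noted above, every layer stays trainable by default.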

1 reaction
martinsbruveris commented, Feb 11, 2022

Happy to help. I will close the issue for now. Feel free to reopen it, if needed.
