Sequential model doesn't have outputs
Shouldn't this code work?
from sklearn.datasets import make_classification
from scikeras.wrappers import KerasClassifier
import tensorflow as tf

def model():
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense(8))
    model.add(tf.keras.layers.Dense(1))
    return model

X, y = make_classification(n_features=8)
est = KerasClassifier(model=model, loss="sparse_categorical_crossentropy")
est.fit(X, y=y)
This throws a TypeError: object of type 'NoneType' has no len(), where the NoneType object is self.model_.outputs.
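For context, a quick check of my own rather than something stated in the issue: a Sequential model that has never been given an input shape is not built, so its outputs attribute is None, and that is what the len() call in _check_model_compatibility trips over. A minimal sketch (behavior as I understand it for TF 2.x):

import tensorflow as tf

m = tf.keras.Sequential([tf.keras.layers.Dense(8), tf.keras.layers.Dense(1)])
print(m.outputs)               # None - the model has not been built yet
m.build(input_shape=(None, 8))
print(len(m.outputs))          # 1 - building the model populates outputs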
Full traceback
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
~/Downloads/_junk2.py in <module>
11 X, y = make_classification(n_features=8)
12 est = KerasClassifier(model=model, loss="sparse_categorical_crossentropy")
---> 13 est.fit(X, y=y)
~/anaconda3/envs/scikeras/lib/python3.7/site-packages/scikeras/wrappers.py in fit(self, X, y, sample_weight, **kwargs)
1375 sample_weight = 1 if sample_weight is None else sample_weight
1376 sample_weight *= compute_sample_weight(class_weight=self.class_weight, y=y)
-> 1377 super().fit(X=X, y=y, sample_weight=sample_weight, **kwargs)
1378 return self
1379
~/anaconda3/envs/scikeras/lib/python3.7/site-packages/scikeras/wrappers.py in fit(self, X, y, sample_weight, **kwargs)
739 epochs=getattr(self, "fit__epochs", self.epochs),
740 initial_epoch=0,
--> 741 **kwargs,
742 )
743
~/anaconda3/envs/scikeras/lib/python3.7/site-packages/scikeras/wrappers.py in _fit(self, X, y, sample_weight, warm_start, epochs, initial_epoch, **kwargs)
855 X = self.feature_encoder_.transform(X)
856
--> 857 self._check_model_compatibility(y)
858
859 self._fit_keras_model(
~/anaconda3/envs/scikeras/lib/python3.7/site-packages/scikeras/wrappers.py in _check_model_compatibility(self, y)
541 # we recognize the attribute but do not force it to be
542 # generated
--> 543 if self.n_outputs_expected_ != len(self.model_.outputs):
544 raise ValueError(
545 "Detected a Keras model input of size"
TypeError: object of type 'NoneType' has no len()
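One possible workaround, sketched by me rather than taken from the issue: declare the input shape inside the model-building function so Keras builds the Sequential model before it is returned, leaving model.outputs populated. The Input(shape=(8,)) layer and the two-unit softmax output below are assumptions chosen to match n_features=8 and the sparse_categorical_crossentropy loss in the snippet above.

from sklearn.datasets import make_classification
from scikeras.wrappers import KerasClassifier
import tensorflow as tf

def model():
    # Declaring the input shape builds the model immediately,
    # so model.outputs is a list instead of None.
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Input(shape=(8,)))  # assumed: matches n_features=8
    model.add(tf.keras.layers.Dense(8, activation="relu"))
    # Two output units to match sparse_categorical_crossentropy on a binary target.
    model.add(tf.keras.layers.Dense(2, activation="softmax"))
    return model

X, y = make_classification(n_features=8)
est = KerasClassifier(model=model, loss="sparse_categorical_crossentropy")
est.fit(X, y=y)

Passing input_shape=(8,) to the first Dense layer would build the model in the same way.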

Eh, I like specific and narrow PRs. Let’s keep them separate.
The original model definitely works as expected - it's a default model from TensorFlow Decision Forests and does indeed train and predict correctly. It looks like the warning is something unique to TFDF (I think Keras' functional API is confusing it for whatever reason); however, the wrapped model seems to be working correctly. And you were right about the Flatten being superfluous. Thanks a lot for your help. It's much appreciated.