
Accuracy not changing across the epochs

See original GitHub issue

```python
from keras import regularizers
from keras.layers import Input, Dense, Activation, Dropout
from keras.models import Model, Sequential
from keras.utils import np_utils
from keras.optimizers import SGD
import numpy
import pandas
from sklearn import preprocessing
from sklearn.model_selection import GridSearchCV

dataset = numpy.loadtxt("Train.txt", delimiter="\t")

# split into input (X) and output (Y) variables
X = dataset[:, 0:106]
Y = dataset[:, 106]

model = Sequential()
model.add(Dense(64, input_dim=106, init='uniform', activation='relu'))
model.add(Dense(32, init='uniform', activation='tanh'))
model.add(Dense(12, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='sigmoid'))
model.add(Dense(1, init='uniform', activation='relu'))

# Compile model
optimizer = SGD(lr=0.0001, momentum=0.4)
model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])

# Fit the model
model.fit(X, Y, nb_epoch=100, batch_size=300, verbose=1)

# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1] * 100))

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)

# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")
```

For the above training run, the accuracy remains the same across all the epochs. I am using the Theano backend:

DEBUG: nvcc STDOUT mod.cu Creating library D:/tmp/xyz/theano.NOBACKUP/compiledir_Windows-8.1-6.3.9600-Intel64

Using gpu device 0: Quadro K2200 (CNMeM is disabled, cuDNN not available)

Epoch 1/100
10131566/10131566 [==============================] - 338s - loss: 4.4380 - acc: 0.7247
Epoch 2/100
10131566/10131566 [==============================] - 341s - loss: 4.4380 - acc: 0.7247
Epoch 3/100
10131566/10131566 [==============================] - 340s - loss: 4.4380 - acc: 0.7247
…
Epoch 42/100
10131566/10131566 [==============================] - 388s - loss: 4.4380 - acc: 0.7247
Epoch 43/100
10131566/10131566 [==============================] - 364s - loss: 4.4380 - acc: 0.7247

… and the loss and accuracy stay exactly the same all the way to epoch 100. Why is the model not learning?

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments:6 (1 by maintainers)

Top GitHub Comments

17 reactions
bstriner commented, Dec 25, 2016

Can you check if your Ys are all 0 or 1? Your model doesn’t match your loss function.

If your targets are [0,1], use a sigmoid output layer and binary_crossentropy loss.

If your targets are [-1,1], use a linear or tanh output layer and hinge or squared_hinge loss.

If your targets are labels for k categories, use to_categorical to convert to one-hot, use a softmax output layer with k outputs, and use categorical_crossentropy loss.

As a side note, try getting a 2 layer model to work before jumping to 5.

Cheers, Ben
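To see the mechanics behind this advice, here is a minimal numpy sketch (not Keras internals, and not part of the original thread) of the gradient of binary cross-entropy with respect to the output unit's pre-activation. With a sigmoid output the gradient simplifies to p − y and is informative for every example; with a ReLU output, any example whose pre-activation is negative contributes exactly zero gradient, so an output layer stuck in that regime never updates:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_sigmoid_grad(z, y):
    # Gradient of binary cross-entropy w.r.t. the pre-activation z
    # when the output activation is sigmoid: simplifies to (p - y).
    return sigmoid(z) - y

def bce_relu_grad(z, y, eps=1e-7):
    # With a ReLU output, negative pre-activations are clipped to 0
    # and the ReLU derivative there is 0, killing the gradient.
    p = np.maximum(z, 0.0)
    dp = np.where(z > 0, 1.0, 0.0)                   # ReLU derivative
    dl = (p - y) / np.clip(p * (1 - p), eps, None)   # dBCE/dp
    return dl * dp

z = np.array([-2.0, -0.5, 0.3])   # hypothetical pre-activations
y = np.array([1.0, 1.0, 0.0])     # 0/1 targets

print(bce_sigmoid_grad(z, y))  # nonzero everywhere: learning proceeds
print(bce_relu_grad(z, y))     # zero wherever z < 0: no weight update
```

This is the same reason the loss in the log above is frozen: a ReLU final layer paired with binary_crossentropy can leave most examples with a dead gradient.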

0 reactions
KushalDave commented, Dec 26, 2016

@bstriner @Shuailong Yes, my labels are all 0 or 1. Once I set the output layer to sigmoid, I see accuracy improving across epochs with the binary_crossentropy loss.

Also, I had tried a 2-layer network as well; the one I posted was just one of many variants I had tried.

Many thanks for all your generous responses.
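To make the accepted fix concrete, here is a self-contained numpy sketch (synthetic data with hypothetical shapes, not the poster's 106-feature Train.txt) of a single sigmoid output unit trained with binary cross-entropy on 0/1 labels. With the matched activation/loss pair, the loss falls across epochs instead of flatlining:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 0/1-labelled data (4 features, 200 rows)
X = rng.normal(size=(200, 4))
true_w = np.array([1.5, -2.0, 0.7, 0.0])
Y = (X @ true_w + rng.normal(scale=0.1, size=200) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(4)
b = 0.0
lr = 0.5
losses = []
for epoch in range(50):
    p = sigmoid(X @ w + b)                 # sigmoid output in (0, 1)
    eps = 1e-7
    loss = -np.mean(Y * np.log(p + eps) + (1 - Y) * np.log(1 - p + eps))
    losses.append(loss)
    grad_z = p - Y                         # BCE + sigmoid gradient
    w -= lr * (X.T @ grad_z) / len(Y)
    b -= lr * grad_z.mean()

acc = np.mean((sigmoid(X @ w + b) > 0.5) == Y)
print(f"loss {losses[0]:.3f} -> {losses[-1]:.3f}, accuracy {acc:.2f}")
```

Swapping the sigmoid for a ReLU in this sketch reproduces the stalled behaviour from the issue, which matches what KushalDave observed after changing the output layer.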


Top Results From Across the Web

Keras accuracy does not change - Stack Overflow
If the accuracy is not changing, it means the optimizer has found a local minimum for the loss. This may be an undesirable...

neural network - Validation Accuracy Not Changing
As the title states, my validation accuracy isn't changing when I try to train my model. I've built an NVIDIA model using tensorflow.keras...

Keras accuracy does not change - Intellipaat Community
The most seeming reason is that the optimizer isn't suited to your dataset. Here may be a list of Keras optimizers from the...

Accuracy does not change across training - PyTorch Forums
Please make sure you are matching "(predicted_eval == targets_eval)" on similar scale/type/probability thresholds! Make sure they are of same ...

Why does my validation loss increase, but validation accuracy ...
I feel like the change in accuracy could be caused by shuffling. Are you shuffling your data during training but not on test...
