
In the middle of training, the loss increases wildly, even beyond 10,000

See original GitHub issue

Here is my model

from keras.models import Sequential, Model
from keras.layers import Input, Dense, Dropout, LSTM
from numpy import array
import numpy as np
from keras.optimizers import RMSprop
from keras.models import load_model
from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler(feature_range=(0, 1))

# LSTM
lstm_output_size = 32

# Training
batch_size = 2
epochs = 2000

# Test or Train
isTrain = False


def load_data(filename):
    with open(filename, 'r') as file:
        next(file)
        x_data = []
        y_data = []
        for line in file:
            line = list(map(float, line.strip('\n').split(',')[1:]))
            if line:
                line = scaler.fit_transform(np.float64(line))
                print('line', line)
                x_data.append([line[1:7], line[7:13], line[13:19]])
                y_data.append(line[0])
    return np.float64(x_data), np.float64(y_data)


print('Loading data...')
(x_train, y_train) = load_data('tianjin_train.csv')  # shape 638 3 6
(x_test, y_test) = load_data('tianjin_test.csv')

print('x_train shape:', x_train.shape)
print('x_test shape:', x_test.shape)

if isTrain:
    print('Build model...')
    inputs = Input(shape=[3, 6])
    Dense1 = Dense(32)(inputs)
    LSTM1 = LSTM(32, return_sequences=True)(Dense1)
    LSTM2 = LSTM(32, return_sequences=True)(LSTM1)
    LSTM3 = LSTM(32, return_sequences=True)(LSTM2)
    LSTM4 = LSTM(32, return_sequences=False)(LSTM3)
    Dropout1 = Dropout(0.5)(LSTM4)
    Dense1 = Dense(32, activation='relu')(Dropout1)

    predictions = Dense(1, init='uniform', activation='linear')(Dense1)
    model = Model(input=inputs, output=predictions)

    op = RMSprop(lr=0.001, rho=0.9, epsilon=1e-06)  # lr: learning rate
    # model.compile(optimizer=op, loss='mape', metrics=['accuracy', 'mae'])
    model.compile(optimizer=op, loss='mape')

    print('Model summary')
    model.summary()

    print('Train...')
    model.fit(
        x_train,
        y_train,
        batch_size=batch_size,
        epochs=epochs,
        validation_split=0.1)

    print('Evaluate')
    loss = model.evaluate(x_train, y_train)
    print('\nloss', loss)

    print('Save model')
    model.save('model.h5')

    print('Predict')
    print('gold  predict')
    x_train_predict = model.predict(x_train, batch_size=batch_size)
    for i, j in zip(x_train_predict, y_train):
        print(i[0], j)
else:
    print('Load model')
    model = load_model('model.h5')

    print('Evaluate')
    loss = model.evaluate(x_test, y_test)
    print('loss', loss)

    print('Predict')
    print('gold  predict')
    x_test_predict = model.predict(x_test, batch_size=batch_size)
    for i, j in zip(x_test_predict, y_test):
        print(i[0], j)

I read the data from a file and use MinMaxScaler to normalize it, and my training data's shape is (638, 3, 6). But when I train this model, the loss behaves strangely.

2/574 [..............................] - ETA: 5s - loss: 11.0410
  8/574 [..............................] - ETA: 5s - loss: 12.6024
 14/574 [..............................] - ETA: 5s - loss: 11.4194
 20/574 [>.............................] - ETA: 4s - loss: 12.8124
 24/574 [>.............................] - ETA: 5s - loss: 11.8158
 28/574 [>.............................] - ETA: 5s - loss: 14.0418
 32/574 [>.............................] - ETA: 6s - loss: 13.9323
 36/574 [>.............................] - ETA: 6s - loss: 13.9805
 40/574 [=>............................] - ETA: 6s - loss: 14.5694
 44/574 [=>............................] - ETA: 7s - loss: 14.1051
 48/574 [=>............................] - ETA: 7s - loss: 14.3149
 52/574 [=>............................] - ETA: 7s - loss: 13.9272
 56/574 [=>............................] - ETA: 7s - loss: 14.1912
 60/574 [==>...........................] - ETA: 7s - loss: 13.7881
 64/574 [==>...........................] - ETA: 7s - loss: 14.0969
 68/574 [==>...........................] - ETA: 7s - loss: 14.0842
 72/574 [==>...........................] - ETA: 7s - loss: 14.1509
 76/574 [==>...........................] - ETA: 7s - loss: 13.9915
 80/574 [===>..........................] - ETA: 7s - loss: 13.8698
 84/574 [===>..........................] - ETA: 7s - loss: 13.8333
 88/574 [===>..........................] - ETA: 7s - loss: 13.9330
 92/574 [===>..........................] - ETA: 7s - loss: 14.1213
 96/574 [====>.........................] - ETA: 7s - loss: 14.3972
100/574 [====>.........................] - ETA: 7s - loss: 14.1310
104/574 [====>.........................] - ETA: 7s - loss: 13.7499
108/574 [====>.........................] - ETA: 7s - loss: 13.9128
112/574 [====>.........................] - ETA: 7s - loss: 13.9477
116/574 [=====>........................] - ETA: 7s - loss: 13.9351
120/574 [=====>........................] - ETA: 7s - loss: 14.1038
124/574 [=====>........................] - ETA: 7s - loss: 14.0430
128/574 [=====>........................] - ETA: 7s - loss: 13.9114
132/574 [=====>........................] - ETA: 7s - loss: 13.9177
136/574 [======>.......................] - ETA: 7s - loss: 13.7766
140/574 [======>.......................] - ETA: 7s - loss: 13.7123
144/574 [======>.......................] - ETA: 7s - loss: 13.8442
148/574 [======>.......................] - ETA: 7s - loss: 13.8257
152/574 [======>.......................] - ETA: 7s - loss: 13.7642
156/574 [=======>......................] - ETA: 7s - loss: 13.7518
160/574 [=======>......................] - ETA: 7s - loss: 13.8022
164/574 [=======>......................] - ETA: 6s - loss: 13.8442
168/574 [=======>......................] - ETA: 6s - loss: 13.8466
172/574 [=======>......................] - ETA: 6s - loss: 13.8214
176/574 [========>.....................] - ETA: 6s - loss: 13.8500
180/574 [========>.....................] - ETA: 6s - loss: 13.8466
184/574 [========>.....................] - ETA: 6s - loss: 14.0158
188/574 [========>.....................] - ETA: 6s - loss: 13.8391
192/574 [=========>....................] - ETA: 6s - loss: 13.8327
196/574 [=========>....................] - ETA: 6s - loss: 13.8936
200/574 [=========>....................] - ETA: 6s - loss: 13.8201
204/574 [=========>....................] - ETA: 6s - loss: 13.9233
208/574 [=========>....................] - ETA: 6s - loss: 13.8687
212/574 [==========>...................] - ETA: 6s - loss: 13.8771
216/574 [==========>...................] - ETA: 6s - loss: 13.8874
220/574 [==========>...................] - ETA: 6s - loss: 13.9138
224/574 [==========>...................] - ETA: 6s - loss: 13.9162
228/574 [==========>...................] - ETA: 5s - loss: 14.0132
232/574 [===========>..................] - ETA: 5s - loss: 13.9221
236/574 [===========>..................] - ETA: 5s - loss: 13.7742
240/574 [===========>..................] - ETA: 5s - loss: 13.6739
244/574 [===========>..................] - ETA: 5s - loss: 13.6515
248/574 [===========>..................] - ETA: 5s - loss: 13.6853
252/574 [============>.................] - ETA: 5s - loss: 13.6275
256/574 [============>.................] - ETA: 5s - loss: 13.6578
260/574 [============>.................] - ETA: 5s - loss: 13.5238
264/574 [============>.................] - ETA: 5s - loss: 13.6071
268/574 [=============>................] - ETA: 5s - loss: 13.7091
272/574 [=============>................] - ETA: 5s - loss: 13.5633
276/574 [=============>................] - ETA: 5s - loss: 13.6563
280/574 [=============>................] - ETA: 5s - loss: 13.7627
284/574 [=============>................] - ETA: 5s - loss: 13.8519
288/574 [==============>...............] - ETA: 5s - loss: 13.8510
292/574 [==============>...............] - ETA: 4s - loss: 13.8450
296/574 [==============>...............] - ETA: 4s - loss: 13.8943
300/574 [==============>...............] - ETA: 4s - loss: 13.9378
304/574 [==============>...............] - ETA: 4s - loss: 13.8734
308/574 [===============>..............] - ETA: 4s - loss: 13.8516
312/574 [===============>..............] - ETA: 4s - loss: 13.7103
316/574 [===============>..............] - ETA: 4s - loss: 13.6937
320/574 [===============>..............] - ETA: 4s - loss: 13.6311
324/574 [===============>..............] - ETA: 4s - loss: 13.6676
328/574 [================>.............] - ETA: 4s - loss: 13.7905
332/574 [================>.............] - ETA: 4s - loss: 13.8505
336/574 [================>.............] - ETA: 4s - loss: 13.8014
340/574 [================>.............] - ETA: 4s - loss: 13.7897
344/574 [================>.............] - ETA: 4s - loss: 13.7593
348/574 [=================>............] - ETA: 4s - loss: 13.7143
352/574 [=================>............] - ETA: 3s - loss: 13.6914
356/574 [=================>............] - ETA: 3s - loss: 13.7753
360/574 [=================>............] - ETA: 3s - loss: 13.7559
364/574 [==================>...........] - ETA: 3s - loss: 13.7720
368/574 [==================>...........] - ETA: 3s - loss: 13.8671
372/574 [==================>...........] - ETA: 3s - loss: 14.0041
376/574 [==================>...........] - ETA: 3s - loss: 13.9566
380/574 [==================>...........] - ETA: 3s - loss: 13.9012
384/574 [===================>..........] - ETA: 3s - loss: 13.9551
388/574 [===================>..........] - ETA: 3s - loss: 13.9283
392/574 [===================>..........] - ETA: 3s - loss: 13.9404
396/574 [===================>..........] - ETA: 3s - loss: 13.8850
400/574 [===================>..........] - ETA: 3s - loss: 13.8880
404/574 [====================>.........] - ETA: 3s - loss: 13.9073
408/574 [====================>.........] - ETA: 2s - loss: 13.9321
412/574 [====================>.........] - ETA: 2s - loss: 13.9083
416/574 [====================>.........] - ETA: 2s - loss: 13.8485
420/574 [====================>.........] - ETA: 2s - loss: 13.8084
424/574 [=====================>........] - ETA: 2s - loss: 13.7253
428/574 [=====================>........] - ETA: 2s - loss: 13.6726
432/574 [=====================>........] - ETA: 2s - loss: 13.6169
436/574 [=====================>........] - ETA: 2s - loss: 13.5947
440/574 [=====================>........] - ETA: 2s - loss: 13.5699
444/574 [======================>.......] - ETA: 2s - loss: 13.5829
448/574 [======================>.......] - ETA: 2s - loss: 13.6044
452/574 [======================>.......] - ETA: 2s - loss: 13.6060
456/574 [======================>.......] - ETA: 2s - loss: 13.5351
460/574 [=======================>......] - ETA: 2s - loss: 13.4985
464/574 [=======================>......] - ETA: 1s - loss: 13.5073
468/574 [=======================>......] - ETA: 1s - loss: 13.4950
472/574 [=======================>......] - ETA: 1s - loss: 13.4311
476/574 [=======================>......] - ETA: 1s - loss: 13.4550
480/574 [========================>.....] - ETA: 1s - loss: 13.4373
484/574 [========================>.....] - ETA: 1s - loss: 13.4539
488/574 [========================>.....] - ETA: 1s - loss: 13.5174
492/574 [========================>.....] - ETA: 1s - loss: 13.5151
496/574 [========================>.....] - ETA: 1s - loss: 13.4977
500/574 [=========================>....] - ETA: 1s - loss: 13.5094
504/574 [=========================>....] - ETA: 1s - loss: 13.4655
508/574 [=========================>....] - ETA: 1s - loss: 13.4575
512/574 [=========================>....] - ETA: 1s - loss: 13.6445
516/574 [=========================>....] - ETA: 1s - loss: 13.5949
520/574 [==========================>...] - ETA: 0s - loss: 13.6091
524/574 [==========================>...] - ETA: 0s - loss: 13.5857
528/574 [==========================>...] - ETA: 0s - loss: 13.5630
532/574 [==========================>...] - ETA: 0s - loss: 13.5292
536/574 [===========================>..] - ETA: 0s - loss: 13.4956
540/574 [===========================>..] - ETA: 0s - loss: 13.4889
544/574 [===========================>..] - ETA: 0s - loss: 13.4791
548/574 [===========================>..] - ETA: 0s - loss: 13.5655
552/574 [===========================>..] - ETA: 0s - loss: 13.5427
556/574 [============================>.] - ETA: 0s - loss: 13.5281
560/574 [============================>.] - ETA: 0s - loss: 13.4918
564/574 [============================>.] - ETA: 0s - loss: 13.4846
568/574 [============================>.] - ETA: 0s - loss: 13.4925
572/574 [============================>.] - ETA: 0s - loss: 13.5286
574/574 [==============================] - 10s - loss: 13.5482 - val_loss: 10.9326
Epoch 7/2000

...

2/574 [..............................] - ETA: 4s - loss: 25032.1348
 10/574 [..............................] - ETA: 4s - loss: 13005.2484
 18/574 [..............................] - ETA: 4s - loss: 10310.3415
 26/574 [>.............................] - ETA: 4s - loss: 9570.1200 
 34/574 [>.............................] - ETA: 4s - loss: 10021.2076
 42/574 [=>............................] - ETA: 3s - loss: 9426.5582 
 50/574 [=>............................] - ETA: 3s - loss: 8773.5551
 58/574 [==>...........................] - ETA: 3s - loss: 9654.8209
 66/574 [==>...........................] - ETA: 3s - loss: 9057.5851
 74/574 [==>...........................] - ETA: 3s - loss: 8479.7212
 82/574 [===>..........................] - ETA: 3s - loss: 8914.1833
 90/574 [===>..........................] - ETA: 3s - loss: 9410.1467
 98/574 [====>.........................] - ETA: 3s - loss: 9169.5430
106/574 [====>.........................] - ETA: 3s - loss: 8775.4716
114/574 [====>.........................] - ETA: 3s - loss: 8668.7629
122/574 [=====>........................] - ETA: 3s - loss: 8857.8727
130/574 [=====>........................] - ETA: 3s - loss: 8776.2437
138/574 [======>.......................] - ETA: 3s - loss: 8740.3830
146/574 [======>.......................] - ETA: 3s - loss: 8730.5757
154/574 [=======>......................] - ETA: 3s - loss: 8548.3567
162/574 [=======>......................] - ETA: 3s - loss: 8634.1684
170/574 [=======>......................] - ETA: 3s - loss: 8656.8425
178/574 [========>.....................] - ETA: 2s - loss: 8693.5918
186/574 [========>.....................] - ETA: 2s - loss: 8692.7155
194/574 [=========>....................] - ETA: 2s - loss: 8808.7449
202/574 [=========>....................] - ETA: 2s - loss: 8799.6075
210/574 [=========>....................] - ETA: 2s - loss: 8737.0450
218/574 [==========>...................] - ETA: 2s - loss: 8768.0472
226/574 [==========>...................] - ETA: 2s - loss: 8685.9705
234/574 [===========>..................] - ETA: 2s - loss: 8619.0025
242/574 [===========>..................] - ETA: 2s - loss: 8517.3632
250/574 [============>.................] - ETA: 2s - loss: 8550.4458
258/574 [============>.................] - ETA: 2s - loss: 8634.4164
266/574 [============>.................] - ETA: 2s - loss: 8593.0441
274/574 [=============>................] - ETA: 2s - loss: 8549.3398
282/574 [=============>................] - ETA: 2s - loss: 8562.7924
290/574 [==============>...............] - ETA: 2s - loss: 8630.2673
298/574 [==============>...............] - ETA: 2s - loss: 8565.9530
306/574 [==============>...............] - ETA: 2s - loss: 8489.1334
314/574 [===============>..............] - ETA: 1s - loss: 8551.8765
322/574 [===============>..............] - ETA: 1s - loss: 8529.2364
330/574 [================>.............] - ETA: 1s - loss: 8570.7090
338/574 [================>.............] - ETA: 1s - loss: 8484.9154
346/574 [=================>............] - ETA: 1s - loss: 8581.5754
354/574 [=================>............] - ETA: 1s - loss: 8469.7536
362/574 [=================>............] - ETA: 1s - loss: 8527.3965
370/574 [==================>...........] - ETA: 1s - loss: 8643.9054
378/574 [==================>...........] - ETA: 1s - loss: 8557.7242
386/574 [===================>..........] - ETA: 1s - loss: 8517.8851
394/574 [===================>..........] - ETA: 1s - loss: 8517.8910
402/574 [====================>.........] - ETA: 1s - loss: 8447.9392
410/574 [====================>.........] - ETA: 1s - loss: 8446.4299
418/574 [====================>.........] - ETA: 1s - loss: 8435.6283
426/574 [=====================>........] - ETA: 1s - loss: 8596.5833
434/574 [=====================>........] - ETA: 1s - loss: 8575.0673
442/574 [======================>.......] - ETA: 0s - loss: 8530.5529
450/574 [======================>.......] - ETA: 0s - loss: 8474.4823
458/574 [======================>.......] - ETA: 0s - loss: 8568.6729
466/574 [=======================>......] - ETA: 0s - loss: 8572.8954
474/574 [=======================>......] - ETA: 0s - loss: 8641.9489
482/574 [========================>.....] - ETA: 0s - loss: 8626.0034
490/574 [========================>.....] - ETA: 0s - loss: 8616.2472
498/574 [=========================>....] - ETA: 0s - loss: 8535.9563
506/574 [=========================>....] - ETA: 0s - loss: 8536.0856
514/574 [=========================>....] - ETA: 0s - loss: 8567.3827
522/574 [==========================>...] - ETA: 0s - loss: 8560.0272
530/574 [==========================>...] - ETA: 0s - loss: 8653.2590
538/574 [===========================>..] - ETA: 0s - loss: 8622.0580
546/574 [===========================>..] - ETA: 0s - loss: 8559.7420
554/574 [===========================>..] - ETA: 0s - loss: 8568.2375
562/574 [============================>.] - ETA: 0s - loss: 8549.4708
570/574 [============================>.] - ETA: 0s - loss: 8617.1536
574/574 [==============================] - 4s - loss: 8618.1509 - val_loss: 4589.8918
Epoch 653/2000



2/574 [..............................] - ETA: 4s - loss: 15461.0098
  8/574 [..............................] - ETA: 4s - loss: 11567.5577
 16/574 [..............................] - ETA: 4s - loss: 15431.1452
 22/574 [>.............................] - ETA: 4s - loss: 14907.2735
 28/574 [>.............................] - ETA: 4s - loss: 16469.8548
 34/574 [>.............................] - ETA: 4s - loss: 19166.2927
 42/574 [=>............................] - ETA: 4s - loss: 20959.5801
 50/574 [=>............................] - ETA: 4s - loss: 22395.3465
 58/574 [==>...........................] - ETA: 4s - loss: 22019.4038
 66/574 [==>...........................] - ETA: 4s - loss: 20818.4458
 74/574 [==>...........................] - ETA: 4s - loss: 19903.9495
 82/574 [===>..........................] - ETA: 4s - loss: 20732.7192
 88/574 [===>..........................] - ETA: 3s - loss: 20165.5202
 94/574 [===>..........................] - ETA: 3s - loss: 20703.4287
102/574 [====>.........................] - ETA: 3s - loss: 20655.1077
110/574 [====>.........................] - ETA: 3s - loss: 21293.1598
118/574 [=====>........................] - ETA: 3s - loss: 21487.1158
126/574 [=====>........................] - ETA: 3s - loss: 22921.2475
134/574 [======>.......................] - ETA: 3s - loss: 23231.1183
142/574 [======>.......................] - ETA: 3s - loss: 23304.4916
150/574 [======>.......................] - ETA: 3s - loss: 23299.2299
158/574 [=======>......................] - ETA: 3s - loss: 22920.9130
166/574 [=======>......................] - ETA: 3s - loss: 22801.8330
174/574 [========>.....................] - ETA: 3s - loss: 22574.0137
182/574 [========>.....................] - ETA: 3s - loss: 22662.4141
190/574 [========>.....................] - ETA: 3s - loss: 22639.9107
198/574 [=========>....................] - ETA: 3s - loss: 22767.7930
206/574 [=========>....................] - ETA: 2s - loss: 22727.1314
214/574 [==========>...................] - ETA: 2s - loss: 22566.6478
222/574 [==========>...................] - ETA: 2s - loss: 22569.7071
230/574 [===========>..................] - ETA: 2s - loss: 22542.4761
238/574 [===========>..................] - ETA: 2s - loss: 22597.3304
246/574 [===========>..................] - ETA: 2s - loss: 22427.5783
254/574 [============>.................] - ETA: 2s - loss: 22750.0249
262/574 [============>.................] - ETA: 2s - loss: 22660.5272
270/574 [=============>................] - ETA: 2s - loss: 22615.6725
278/574 [=============>................] - ETA: 2s - loss: 22500.6912
286/574 [=============>................] - ETA: 2s - loss: 22418.0539
294/574 [==============>...............] - ETA: 2s - loss: 22434.2064
302/574 [==============>...............] - ETA: 2s - loss: 22735.5259
310/574 [===============>..............] - ETA: 2s - loss: 22717.3892
318/574 [===============>..............] - ETA: 2s - loss: 22660.2040
326/574 [================>.............] - ETA: 1s - loss: 22802.6840
334/574 [================>.............] - ETA: 1s - loss: 22640.8964
342/574 [================>.............] - ETA: 1s - loss: 22747.8323
350/574 [=================>............] - ETA: 1s - loss: 22535.8510
358/574 [=================>............] - ETA: 1s - loss: 22615.5259
366/574 [==================>...........] - ETA: 1s - loss: 23410.7538
374/574 [==================>...........] - ETA: 1s - loss: 23798.1803
382/574 [==================>...........] - ETA: 1s - loss: 23541.1321
388/574 [===================>..........] - ETA: 1s - loss: 23421.8500
396/574 [===================>..........] - ETA: 1s - loss: 23234.2265
404/574 [====================>.........] - ETA: 1s - loss: 23200.4357
412/574 [====================>.........] - ETA: 1s - loss: 23320.1471
418/574 [====================>.........] - ETA: 1s - loss: 23154.9219
426/574 [=====================>........] - ETA: 1s - loss: 23650.1016
434/574 [=====================>........] - ETA: 1s - loss: 23527.1210
442/574 [======================>.......] - ETA: 1s - loss: 23332.7254
450/574 [======================>.......] - ETA: 0s - loss: 23542.9813
458/574 [======================>.......] - ETA: 0s - loss: 23401.7659
466/574 [=======================>......] - ETA: 0s - loss: 23459.3561
474/574 [=======================>......] - ETA: 0s - loss: 23447.3781
482/574 [========================>.....] - ETA: 0s - loss: 23354.8805
490/574 [========================>.....] - ETA: 0s - loss: 23345.2613
498/574 [=========================>....] - ETA: 0s - loss: 23236.4487
506/574 [=========================>....] - ETA: 0s - loss: 23315.2012
514/574 [=========================>....] - ETA: 0s - loss: 23202.0041
522/574 [==========================>...] - ETA: 0s - loss: 23109.5439
530/574 [==========================>...] - ETA: 0s - loss: 23105.3809
538/574 [===========================>..] - ETA: 0s - loss: 23032.0118
546/574 [===========================>..] - ETA: 0s - loss: 23056.5381
554/574 [===========================>..] - ETA: 0s - loss: 23180.9754
562/574 [============================>.] - ETA: 0s - loss: 23193.4535
570/574 [============================>.] - ETA: 0s - loss: 23188.4087
574/574 [==============================] - 4s - loss: 23271.4704 - val_loss: 29224.7771
Epoch 1656/2000

At first the loss is normal, but as training goes on it grows larger and larger. I have checked many times and changed the model, but haven't found anything helpful. I really don't know what's wrong.
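For context, the model above is compiled with loss='mape'. The snippet below is a rough sketch of what mean absolute percentage error computes (Keras clips the denominator at a small epsilon in the same way); because the error is divided by the absolute true value, targets at or near zero inflate the loss into the thousands, which is the kind of number the log above shows. The sample values are invented purely for illustration.

import numpy as np

def mape(y_true, y_pred, eps=1e-7):
    # 100 * mean(|y_true - y_pred| / |y_true|), with the denominator
    # clipped at eps so a zero target does not divide by zero outright.
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    denom = np.clip(np.abs(y_true), eps, None)
    return 100.0 * np.mean(np.abs(y_true - y_pred) / denom)

# Targets comfortably away from zero: a modest percentage error.
print(mape([10.0, 20.0, 30.0], [11.0, 19.0, 33.0]))   # ~8.3

# One target scaled down to (almost) zero: the same absolute error explodes.
print(mape([0.001, 20.0, 30.0], [1.0, 19.0, 33.0]))   # ~33300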

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
macrovve commented, Jun 12, 2017

The problem is solved when I scale the entire dataset. Thank you @ceteke @lhk
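For anyone reading later, a minimal sketch of what that fix can look like, assuming the same CSV layout as load_data above (an index column, then the target followed by 18 feature columns). The filenames are the ones from the original post; everything else is illustrative, not the poster's actual code.

import numpy as np
from sklearn.preprocessing import MinMaxScaler

def load_raw(filename):
    # Read the CSV into one (n_samples, 19) array: target first, then 18 features.
    rows = []
    with open(filename, 'r') as file:
        next(file)  # skip the header row
        for line in file:
            values = line.strip('\n').split(',')[1:]  # drop the index column
            if values:
                rows.append(list(map(float, values)))
    return np.asarray(rows, dtype=np.float64)

train_raw = load_raw('tianjin_train.csv')
test_raw = load_raw('tianjin_test.csv')

# Fit the scaler once on the whole training set (column-wise min/max),
# rather than calling fit_transform on every row, which rescales each
# sample against its own min/max and leaves the target on an inconsistent scale.
scaler = MinMaxScaler(feature_range=(0, 1))
train_scaled = scaler.fit_transform(train_raw)
test_scaled = scaler.transform(test_raw)  # reuse the training-set statistics

y_train = train_scaled[:, 0]
x_train = train_scaled[:, 1:].reshape(-1, 3, 6)  # same (samples, 3, 6) shape as before
y_test = test_scaled[:, 0]
x_test = test_scaled[:, 1:].reshape(-1, 3, 6)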

0 reactions
stale[bot] commented, Sep 10, 2017

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
