On the learning stability of the results of general_gnn.py
Hi Daniele and Jack,
I have a question regarding this example. In particular, running the code produces the following output file: data.txt.
However, if one plots the data above, say,
import matplotlib.pyplot as plt

test_acc, epoch = [], []
with open('data.txt', 'r') as f:
    for line in f:
        values = line.split()
        epoch.append(int(values[1]))        # epoch number as an integer
        test_acc.append(float(values[15]))  # test-accuracy column
plt.figure(figsize=(20,5))
plt.plot(epoch, test_acc)
plt.xlabel('Epoch')
plt.ylabel('Test accuracy')
plt.legend(["test_acc"], loc ="lower right")
plt.show()
there is no convergence or stability in the test accuracy: test_acc oscillates severely. Can you please explain why this learning process is considered normal when the model's accuracy does not increase overall from one epoch to the next?
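One way to check whether a trend hides beneath the per-epoch oscillation is to smooth the raw test_acc values with a moving average before plotting. A minimal numpy sketch (moving_average is my own helper name, not part of the example):

```python
import numpy as np

def moving_average(values, window=10):
    # Running mean over `window` epochs: damps per-epoch noise so the
    # long-term trend of the metric becomes visible.
    values = np.asarray(values, dtype=float)
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")
```

Overlaying plt.plot(epoch[window - 1:], moving_average(test_acc, window)) on the raw curve makes it easier to judge whether the accuracy is actually flat or slowly improving.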
Issue Analytics
- State:
- Created 2 years ago
- Comments: 7 (4 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
The issue here is that “molhiv” is a dataset that has edge attributes, but GeneralGNN expects only node attributes (x, a, i).
You can either change the dataset or implement a model similar to GeneralGNN that is designed to discard edge attributes. Something like:
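A minimal sketch of such a wrapper, assuming TensorFlow/Keras and Spektral's disjoint-mode batch layout (x, a, e, i); GeneralGNNNoEdges and the base_model argument are my own names, not the maintainers' actual code:

```python
import tensorflow as tf

class GeneralGNNNoEdges(tf.keras.Model):
    """Hypothetical wrapper: forwards (x, a, i) to a base model,
    dropping the edge-feature tensor e from each batch."""

    def __init__(self, base_model, **kwargs):
        super().__init__(**kwargs)
        self.base = base_model

    def call(self, inputs):
        # Disjoint-mode batches from a dataset with edge features are
        # (x, a, e, i); the wrapped model expects only (x, a, i).
        x, a, e, i = inputs
        return self.base([x, a, i])
```

It could then be used as, e.g., GeneralGNNNoEdges(GeneralGNN(dataset.n_labels, activation="softmax")), so the loader can keep yielding edge features while the underlying model never sees them.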
Thanks. So, I ended up with the following code:
and I get the following error:
The problem here is that in the examples you supplied, the models are created based on three values: dataset.n_node_features, dataset.n_edge_features and dataset.n_labels. However, I can only pass dataset.n_labels as the output to the GeneralGNN constructor. If that's the case, can you please explain how I can feed those values to GeneralGNN?