
Performance of 8-layer GCN

See original GitHub issue

Hi DropEdge Team,

I am running experiments with an 8-layer GCN (using DropEdge) in the semi-supervised setting. I used the same default hyper-parameters as the 2-layer GCN and changed --nbaseblocklayer 0 to --nbaseblocklayer 6 in script/semi-supervised/. On the Cora dataset, the 2-layer GCN reaches 82.8%, while the 8-layer GCN reaches only 16%, not the reported 75.80%. Could you please tell me how I can reproduce the results shown in the README?

Thank you.
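For context, the DropEdge technique the issue refers to amounts to retaining only a random subset of edges at each training epoch. A minimal Python sketch of that sampling step, under the assumption that the repo's --sampling_percent flag means the fraction of edges kept (the helper below is illustrative, not the repo's actual code):

```python
import random

def drop_edge(edge_list, sampling_percent, rng=random):
    """DropEdge-style sampling: keep a random subset of edges.

    Assumption: sampling_percent is the fraction of edges RETAINED
    each epoch (how the repo's --sampling_percent flag appears to work).
    """
    k = int(len(edge_list) * sampling_percent)
    return rng.sample(edge_list, k)

# Toy graph: all 45 undirected edges of a 10-node complete graph.
edges = [(u, v) for u in range(10) for v in range(u + 1, 10)]
kept = drop_edge(edges, 0.7)
print(len(edges), len(kept))  # 45 31
```

Resampling a fresh subgraph every epoch is what gives DropEdge its regularizing effect on deeper GCNs; a fixed sparsified graph would not have the same benefit.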

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 8 (1 by maintainers)

Top GitHub Comments

InfluenceFunction commented, Aug 20, 2020

I have the same problem as you. It seems the hyper-parameters for the multi-layer GCN are not given, so you cannot reproduce the result simply by changing --nbaseblocklayer 0 to --nbaseblocklayer 6 in script/semi-supervised/.

klchai commented, Dec 23, 2020

You can try tuning parameters such as weight_decay, sampling_percent and dropout. For the multi-layer models with no parameters provided, I got good results by tuning these parameters, though there is still a small gap compared with the results in the paper.

What are your parameters? When I try to reproduce the multi-layer GCN, nhiddenlayer is fixed to 1 and I can only modify nbaseblocklayer. Thanks in advance.
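klchai's suggestion amounts to a small grid search over those three hyper-parameters. A hedged sketch of what that could look like (the value ranges are illustrative, not from the paper, and run_trial is a placeholder for launching the repo's training script with one configuration):

```python
from itertools import product

# Hypothetical search space over the parameters klchai mentions;
# these specific values are assumptions, not the paper's settings.
grid = {
    "weight_decay": [5e-4, 1e-3, 5e-3],
    "sampling_percent": [0.3, 0.5, 0.7],
    "dropout": [0.5, 0.8],
}

def run_trial(cfg):
    # Placeholder: in practice this would invoke the repo's training
    # entry point with cfg and return validation accuracy.
    return 0.0

# Enumerate every combination in the grid.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 18 combinations
best = max(configs, key=run_trial)
```

Even a coarse grid like this (18 runs) is often enough to recover most of the gap when a repo ships tuned parameters only for the shallow model.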


Top Results From Across the Web

  • Effect of the different graph convolutional network (GCN) ...
    "Method: We utilised a standard 8-layer CNN, then integrated two improvement techniques: (i) batch normalization (BN) and (ii) dropout (DO). Finally, we utilized ..."
  • Tackling Over-Smoothing for General Graph Convolutional ...
    "GCN-4+DropEdge. Fig. 1: Performance of GCNs on Cora. We implement 4-layer and 8-layer GCNs w and w/o DropEdge. GCN-4 gets stuck."
  • Cluster-GCN: An Efficient Algorithm for Training Deep and Large ...
    "We find that the efficiency of a mini-batch algorithm can be characterized by the notion of ... We show a detailed convergence of ..."
  • Going Deep: Graph Convolutional Ladder-Shape Networks
    "... classical and state-of-the-art methods to demonstrate the solid performance of GCLN on graph node classification. The experiments with 8-layer GCN and GAT ..."
  • pathGCN: Learning General Graph Spatial Operators from Paths
    "... pressive GCNs for improved performance. In this paper we propose pathGCN, a novel approach to learn the spatial operator from random paths on ..."
