project. This work has also been supported by the National Security Council Secretariat (NSCS), Government of India.