
Cooperative Graph Neural Networks (Part 4)


4. Cooperative Graph Neural Networks

CO-GNNs view each node in the graph as a player in a multiplayer game, with the node's representation (or state) as its core feature. Each node evolves its state through a two-phase update: first, it chooses a suitable action based on its own state and the states of its neighbors; then, in the next phase, it updates its state based on its current state and the states of the subset of neighbors determined by the actions chosen in the first phase. Through this multi-round interaction, the players can effectively coordinate their behavior and carry out information propagation.

A CO-GNN $(\pi, \eta)$ architecture consists of two cooperating GNNs: (i) an action network $\pi$ that chooses the best action, and (ii) an environment network $\eta$ that updates the node representations. A CO-GNN layer updates the representation $h_v^{(\ell)}$ of each node $v$ as follows. First, the action network $\pi$ predicts, for each node $v$, a probability distribution $p_v^{(\ell)} \in \mathbb{R}^4$ over the actions $\{\mathrm{S}, \mathrm{L}, \mathrm{B}, \mathrm{I}\}$ (STANDARD, LISTEN, BROADCAST, ISOLATE) available to $v$, based on the state of $v$ and the states of its neighbors $N_v$.

Then, for each node $v$, an action $a_v^{(\ell)}$ is sampled from $p_v^{(\ell)}$ using the Straight-through Gumbel-softmax estimator, and the environment network $\eta$ updates each node's state according to the sampled actions:

$$
h_v^{(\ell)} =
\begin{cases}
\eta^{(\ell)}\big(h_v^{(\ell-1)}, \{\{\}\}\big) & \text{if } a_v^{(\ell)} \in \{\mathrm{I}, \mathrm{B}\},\\
\eta^{(\ell)}\big(h_v^{(\ell-1)}, \{\{\, h_u^{(\ell-1)} \mid u \in N_v,\; a_u^{(\ell)} \in \{\mathrm{S}, \mathrm{B}\} \,\}\}\big) & \text{if } a_v^{(\ell)} \in \{\mathrm{L}, \mathrm{S}\}.
\end{cases}
$$

This is a single-layer update; by stacking $L \geq 1$ layers, we obtain the final representation $h_v^{(L)}$ of each node $v$.
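To make the two-phase update concrete, below is a minimal PyTorch sketch of a single CO-GNN layer and of stacking several of them. It assumes a dense adjacency matrix; the class names (`SumGNNLayer`, `CoGNNLayer`) and the layer internals are illustrative stand-ins rather than the authors' reference implementation. Only the four-action set, the straight-through Gumbel-softmax sampling, and the action-dependent edge masking follow the description above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Action indices for STANDARD, LISTEN, BROADCAST, ISOLATE.
S, L, B, I = 0, 1, 2, 3

class SumGNNLayer(nn.Module):
    """Simple sum-aggregation layer: combines a node's own state with the
    sum of the states of the neighbors it currently receives messages from."""
    def __init__(self, dim):
        super().__init__()
        self.self_lin = nn.Linear(dim, dim)
        self.nbr_lin = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # adj[v, u] = 1 iff a message u -> v is allowed.
        return torch.relu(self.self_lin(h) + self.nbr_lin(adj @ h))

class CoGNNLayer(nn.Module):
    """One CO-GNN layer: the action network picks an action per node, then
    the environment network updates states over the action-masked graph."""
    def __init__(self, action_net, env_net, dim):
        super().__init__()
        self.action_net = action_net           # pi: chooses actions
        self.action_head = nn.Linear(dim, 4)   # logits over {S, L, B, I}
        self.env_net = env_net                 # eta: updates node states

    def forward(self, h, adj):
        # 1) p_v: distribution over the four actions, from v and its neighbors.
        logits = self.action_head(self.action_net(h, adj))
        # 2) Straight-through Gumbel-softmax: one-hot sample in the forward
        #    pass, soft gradients in the backward pass.
        a = F.gumbel_softmax(logits, tau=1.0, hard=True)   # (n, 4), one-hot rows
        listens = a[:, S] + a[:, L]      # v receives iff it plays S or L
        broadcasts = a[:, S] + a[:, B]   # u sends iff it plays S or B
        # 3) Keep the edge u -> v only if v listens and u broadcasts.
        masked_adj = listens.unsqueeze(1) * adj * broadcasts.unsqueeze(0)
        # 4) eta updates every node's state on the masked graph; isolated or
        #    broadcast-only nodes see an empty neighbor multiset.
        return self.env_net(h, masked_adj)

# Stacking L >= 1 layers yields the final representations h_v^{(L)}.
dim, num_layers = 8, 3
layers = nn.ModuleList(
    CoGNNLayer(SumGNNLayer(dim), SumGNNLayer(dim), dim) for _ in range(num_layers)
)
h = torch.randn(5, dim)                    # 5 nodes with random features
adj = (torch.rand(5, 5) < 0.4).float()     # random directed adjacency
for layer in layers:
    h = layer(h, adj)
print(h.shape)  # torch.Size([5, 8])
```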

Cooperative Graph Neural Networks thus use an action network and an environment network as their core components. Under this architecture, each node can autonomously choose how to interact with its neighbors, and the resulting flexible message-passing mechanism enables efficient communication and information sharing.

The CO-GNN $(\pi, \eta)$ architecture applies to directed as well as undirected graphs, and arbitrary GNN architectures can serve as the action network $\pi$ and the environment network $\eta$.

To analyze the merits of this new message-passing paradigm in isolation, relatively simple architectures are adopted, such as SUMGNN, MEANGNN, GCN, GIN, and GAT, denoted by $\Sigma$, $\mu$, $*$, $\varepsilon$, and $\alpha$, respectively. For example, CO-GNN($\Sigma$, $\mu$) denotes the CO-GNN architecture that uses a SUMGNN as the action network and a MEANGNN as the environment network.
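Continuing the sketch above (and reusing its `SumGNNLayer` and `CoGNNLayer` classes), a CO-GNN($\Sigma$, $\mu$) instance would pair a sum-aggregating action network with a mean-aggregating environment network. The `MeanGNNLayer` below is a hypothetical minimal stand-in for MEANGNN, not the paper's exact layer:

```python
import torch
import torch.nn as nn

class MeanGNNLayer(nn.Module):
    """Mean-aggregation layer: like SumGNNLayer, but normalizes the neighbor
    sum by the (masked) in-degree."""
    def __init__(self, dim):
        super().__init__()
        self.self_lin = nn.Linear(dim, dim)
        self.nbr_lin = nn.Linear(dim, dim)

    def forward(self, h, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)  # avoid division by zero
        return torch.relu(self.self_lin(h) + self.nbr_lin((adj @ h) / deg))

# CO-GNN(sum, mean): a SumGNN chooses the actions, a MeanGNN updates the states.
layer = CoGNNLayer(action_net=SumGNNLayer(8), env_net=MeanGNNLayer(8), dim=8)
```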

Fundamentally, CO-GNNs update node states in a fine-grained manner. If a node $v$ chooses the ISOLATE or BROADCAST action, its update depends only on its own previous state, which can be seen as a node-wise update. If, on the other hand, $v$ chooses the LISTEN or STANDARD action, its state update depends not only on its previous state but also on the states of those neighbors that chose BROADCAST or STANDARD in the current layer.
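Here is a tiny worked example of these semantics on a hypothetical 3-node path graph: node 0 broadcasts, node 1 listens, and node 2 isolates, so the only surviving message is the one from node 0 to node 1:

```python
import torch

S, L, B, I = 0, 1, 2, 3
adj = torch.tensor([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])     # path graph 0 - 1 - 2; adj[v, u] = 1 iff u ~ v
actions = torch.tensor([B, L, I])      # node 0 broadcasts, 1 listens, 2 isolates
listens = ((actions == S) | (actions == L)).float()      # v receives iff S or L
broadcasts = ((actions == S) | (actions == B)).float()   # u sends iff S or B
masked = listens.unsqueeze(1) * adj * broadcasts.unsqueeze(0)
print(masked)  # only masked[1, 0] == 1: node 1 hears node 0; node 2 hears nothing
```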

In summary, this section has described how the CO-GNN framework can flexibly combine different GNN architectures so that they cooperate, and how each node's state is updated under each of the four actions.

