
Details

Author(s) / Contributors
Title
A learnable sampling method for scalable graph neural networks
Is part of
  • Neural networks, 2023-05, Vol.162, p.412-424
Place / Publisher
United States: Elsevier Ltd
Year of publication
2023
Link to full text
Source
Elsevier ScienceDirect Journals
Descriptions/Notes
  • With the development of graph neural networks, handling large-scale graph data has become an increasingly important topic. Most current graph neural network models that scale to large graphs rely on random sampling, but the sampling process in these models is detached from the forward propagation of the network. Moreover, many works design sampling based on statistical estimation methods for graph convolutional networks, where the message-passing weights of nodes are fixed; such sampling methods therefore do not transfer to message-passing networks with variable weights, such as graph attention networks. Exploiting the end-to-end learning capability of neural networks, we propose a learnable sampling method. It solves the problem that random sampling operations cannot propagate gradients, and it samples nodes with a non-fixed probability. In this way, the sampling process is dynamically combined with the forward propagation of the features, allowing the networks to be trained better, and the method generalizes to all message-passing models. In addition, we apply the learnable sampling method to GNNs and propose two models. Our method can be flexibly combined with different graph neural network models and achieves excellent accuracy on benchmark datasets with large graphs, while the loss converges to smaller values at a faster rate during training than with previous methods.
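The abstract states that the method makes node sampling differentiable so gradients can flow through it, but the record does not give the exact mechanism. A common way to achieve this is a Gumbel-Softmax relaxation of categorical sampling; the sketch below illustrates that general idea (all names and the use of Gumbel-Softmax are assumptions for illustration, not the paper's actual implementation):

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=0.5, seed=None):
    """Differentiable relaxation of categorical neighbor sampling.

    NOTE: illustrative assumption, not the paper's method. Instead of a
    hard, non-differentiable random pick of neighbors, this returns soft
    sampling weights; in a deep-learning framework, gradients could flow
    through them back to the learnable `logits`.
    """
    rng = np.random.default_rng(seed)
    # Gumbel(0, 1) noise; small epsilons guard against log(0)
    u = rng.uniform(size=logits.shape)
    gumbel_noise = -np.log(-np.log(u + 1e-20) + 1e-20)
    y = (logits + gumbel_noise) / temperature
    y = y - y.max()  # numerical stability before exponentiation
    weights = np.exp(y)
    return weights / weights.sum()

# Hypothetical learnable per-neighbor scores for one target node;
# in the full model these would be trained end to end.
logits = np.array([2.0, 0.5, 0.5, -1.0])
weights = gumbel_softmax_sample(logits, temperature=0.5, seed=0)
# `weights` is a valid probability distribution over the neighbors
```

Lowering `temperature` pushes the soft weights toward a one-hot (hard) sample, while keeping the operation differentiable, which matches the abstract's claim that sampling is combined with forward propagation during training.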
Language
English
Identifiers
ISSN: 0893-6080
eISSN: 1879-2782
DOI: 10.1016/j.neunet.2023.03.015
Titel-ID: cdi_proquest_miscellaneous_2791369460
