The document discusses the challenges that existing graph neural networks (GNNs) and graph transformers face when learning from general graph-structured data, including over-smoothing and noise introduced by aggregating information from neighbors. It introduces NAGphormer, a model that uses a Hop2Token module together with an attention-based readout function to better capture graph structure. Experiments on a range of graph datasets compare NAGphormer's performance against traditional GNNs and graph transformers.
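As a rough illustration of the Hop2Token idea and the attention-based readout mentioned above, the sketch below builds, for each node, a sequence of tokens from multi-hop aggregated features and then combines them with hop-level attention weights. This is a minimal sketch under my own assumptions: the function names, the symmetric normalization of the adjacency matrix, and the simple dot-product scoring against the hop-0 token are illustrative choices, not the authors' exact parameterized implementation.

```python
import numpy as np


def hop2token(adj, feats, num_hops):
    """Build per-node token sequences from multi-hop aggregated features.

    adj:   (N, N) adjacency matrix (assumed undirected, no self-loops)
    feats: (N, d) node feature matrix
    Returns (N, num_hops + 1, d): token k of node v is (A_hat^k X)_v,
    where A_hat is the symmetrically normalized adjacency with self-loops.
    """
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                                   # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_hat = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^-1/2 (A+I) D^-1/2

    tokens = [feats]
    h = feats
    for _ in range(num_hops):
        h = a_hat @ h                                          # propagate one more hop
        tokens.append(h)
    return np.stack(tokens, axis=1)                            # (N, num_hops + 1, d)


def attention_readout(tokens):
    """Combine each node's hop tokens into a single vector, with attention
    weights scored against the node's own hop-0 token (an illustrative
    readout, not necessarily the paper's exact formulation)."""
    node_token = tokens[:, :1, :]                              # (N, 1, d)
    scores = (tokens * node_token).sum(-1)                     # (N, K+1) dot-product scores
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)              # softmax over hops
    return (weights[..., None] * tokens).sum(axis=1)           # (N, d)


# Tiny usage example: a 3-node path graph with random 4-dimensional features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.random.rand(3, 4)
tokens = hop2token(adj, feats, num_hops=2)   # shape (3, 3, 4)
out = attention_readout(tokens)              # shape (3, 4)
```

In the full model, the per-node token sequence would be fed through a transformer encoder before the readout; the sketch only shows how the hop tokens are formed and pooled.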