Hyperbolic Graph Neural Networks Under the Microscope: The Role of Geometry-Task Alignment

arXiv:2602.01828v2 Announce Type: replace

Abstract: Many complex networks exhibit hierarchical, tree-like structure, making hyperbolic space a natural candidate in which to learn their representations. Based on this observation, Hyperbolic Graph Neural Networks (HGNNs) have been widely adopted as a principled choice for representation learning on tree-like graphs. In this work, we question this paradigm by proposing an additional condition, geometry--task alignment: whether the metric structure of the target follows that of the input graph. We demonstrate, both theoretically and empirically, that HGNNs can recover low-distortion representations on regression problems, and show that their geometric inductive bias becomes helpful when the problem requires preserving metric structure. By jointly analyzing predictive performance and embedding distortion, we further show that HGNNs gain an advantage on link prediction, a naturally geometry-aligned task, whereas this advantage largely disappears on standard node classification benchmarks, which are typically not geometry-aligned. Overall, our findings shift the focus from only asking "Is the graph hyperbolic?" to also asking "Is the task aligned with hyperbolic geometry?": HGNNs consistently outperform Euclidean models under such alignment, while their advantage vanishes otherwise.
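The abstract's notion of embedding distortion, how faithfully pairwise distances in the embedding space track graph distances, can be made concrete with a small sketch. The snippet below is illustrative only and is not the paper's implementation: it uses the standard Poincaré ball distance formula and a simple mean relative error as the distortion measure; the function names `poincare_dist` and `avg_distortion` and the toy points are my own.

```python
import numpy as np

def poincare_dist(u, v):
    # Hyperbolic distance in the Poincare ball model:
    # d(u, v) = arccosh(1 + 2||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq / denom))

def avg_distortion(emb, graph_dist):
    # Mean relative error |d_emb - d_graph| / d_graph over all node pairs;
    # lower values mean the embedding better preserves graph metric structure.
    n = len(emb)
    errs = []
    for i in range(n):
        for j in range(i + 1, n):
            d_e = poincare_dist(emb[i], emb[j])
            d_g = graph_dist[i, j]
            errs.append(abs(d_e - d_g) / d_g)
    return float(np.mean(errs))

# Toy example: three nodes on a path graph with unit edge lengths,
# embedded (arbitrarily) along one axis of the Poincare disk.
emb = np.array([[0.0, 0.0], [0.4, 0.0], [0.7, 0.0]])
graph_dist = np.array([[0.0, 1.0, 2.0],
                       [1.0, 0.0, 1.0],
                       [2.0, 1.0, 0.0]])
print(avg_distortion(emb, graph_dist))
```

A "geometry-aligned" task in the abstract's sense is one whose targets depend on these pairwise distances (e.g., link prediction), so low `avg_distortion` translates directly into predictive advantage.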
