GraphVec: Cross-Domain Graph Vectorization for Graph-Level Representation Learning

arXiv:2602.04244v2 Announce Type: replace

Abstract: Learning universal graph representations across heterogeneous domains is difficult because graph datasets differ in topology, node-attribute semantics, feature dimensions, and even attribute availability. We propose GraphVec, a language-model-free graph vectorization model that maps diverse graphs into transferable fixed-dimensional embeddings for graph-level tasks. Instead of directly using incomparable raw node attributes, GraphVec constructs multi-scale global graphs over all nodes in each dataset and extracts spectral embeddings to obtain domain-agnostic relational features. To make these spectral features comparable across datasets, we introduce a density-maximization mean alignment algorithm over orthogonal transformations and prove its monotonic convergence. GraphVec further combines a GIN--Graph Transformer backbone with a multi-layer reference distribution module, which preserves node-level distributional information beyond standard pooling. We also provide a generalization error bound for the proposed model. Experiments on 13 datasets with more than 15 comparison methods demonstrate that GraphVec consistently outperforms strong graph pretraining baselines in cross-domain few-shot graph classification and graph clustering. Beyond graph-level tasks, GraphVec also yields strong node-level representations, achieving competitive performance on few-shot node classification against representative graph prompt learning methods.
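The two ingredients the abstract names before the backbone, spectral embeddings as domain-agnostic relational features and an alignment over orthogonal transformations, can be illustrated with a minimal sketch. The paper's actual density-maximization mean alignment algorithm is not specified here, so the second function below substitutes a standard orthogonal Procrustes alignment as a hypothetical stand-in; `spectral_embedding` likewise assumes a plain normalized-Laplacian eigendecomposition, not necessarily the multi-scale construction used in GraphVec.

```python
import numpy as np

def spectral_embedding(adj, k=8):
    """Extract domain-agnostic relational features: the k smallest
    non-trivial eigenvectors of the normalized graph Laplacian.
    `adj` is a dense symmetric adjacency matrix."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(lap)          # eigenvalues sorted ascending
    return vecs[:, 1:k + 1]                   # skip the trivial constant mode

def orthogonal_alignment(src_emb, tgt_emb):
    """Hypothetical stand-in for GraphVec's density-maximization mean
    alignment: the orthogonal Q minimizing ||src_emb @ Q - tgt_emb||_F
    (orthogonal Procrustes via SVD). Assumes matched row counts."""
    u, _, vt = np.linalg.svd(src_emb.T @ tgt_emb)
    q = u @ vt
    return src_emb @ q, q
```

Because the alignment is constrained to orthogonal matrices, it changes only the orientation of the spectral feature space, not pairwise distances, which is what makes features from different datasets comparable without distorting their internal geometry.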
