Abstract

Graphs naturally model complex relationships and interactions across various domains, including social networks, biological systems, and recommender platforms. Graph Neural Networks (GNNs) have emerged as powerful tools for learning effective graph representations through iterative message passing, significantly improving performance in tasks such as node classification, link prediction, and graph classification. However, the success of GNNs largely depends on abundant labeled data, posing challenges in practical scenarios where labels are scarce or unavailable. This dissertation addresses these challenges by exploring few-shot and zero-shot learning within the graph domain. We first propose COLA, a self-supervised few-shot node classification method that exploits unlabeled graph structures. Next, we introduce OneForAll (OFA), a unified graph foundation model that trains across multiple datasets to improve few-shot generalization. Finally, we propose Generative OneForAll (GOFA), a generative foundation model integrating GNNs with large language model (LLM) architectures, specifically designed for zero-shot scenarios. Together, these contributions provide a principled path toward versatile and broadly applicable graph learning systems capable of operating effectively under minimal or no supervision.
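
For context on the iterative message passing the abstract refers to, here is a minimal sketch of the generic layer-wise update common in the GNN literature; the symbols h_v^{(k)}, \mathrm{UPDATE}, \mathrm{AGGREGATE}, and \mathcal{N}(v) are standard textbook notation assumed for illustration, not taken from the dissertation's specific models:

h_v^{(k)} = \mathrm{UPDATE}\!\left( h_v^{(k-1)},\ \mathrm{AGGREGATE}\left( \{\, h_u^{(k-1)} : u \in \mathcal{N}(v) \,\} \right) \right)

where \mathcal{N}(v) denotes the neighbors of node v. Stacking K such layers produces node representations that encode K-hop structural information, which downstream node classification, link prediction, and graph classification tasks then consume.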

Committee Chair

Nathan Jacobs

Committee Members

Chien-Ju Ho; Christopher Ryan King; Roman Garnett; Yixin Chen

Degree

Doctor of Philosophy (PhD)

Author's Department

Computer Science & Engineering

Author's School

McKelvey School of Engineering

Document Type

Dissertation

Date of Award

8-18-2025

Language

English (en)

Available for download on Sunday, February 15, 2026
