|LLM|TRANSFORMER|FOUNDATION MODEL|GRAPH|NETWORK|
Disconnected from the other modalities, graphs are still waiting for their AI revolution: is it coming?
“If the foundation is solid, everything else will follow.” – Unknown
“The loftier the building, the deeper must the foundation be laid.” – Thomas à Kempis
Foundation models have transformed artificial intelligence in recent years. A foundation model is a model trained on huge amounts of data (usually by unsupervised learning) that can be adapted to many different tasks. Models such as BERT and GPT brought about a revolution in which a single model could be adapted to virtually all tasks in a domain, simplifying access to AI and reducing the amount of data needed for any single task. We have foundation models for text and several other modalities, but not yet for modalities such as graphs and tabular data. In this paper we discuss why we do not have a foundation model for graphs and how we might get one. Specifically, we will answer the following questions:
- Why do we want a foundation model for graphs? Why do we not have one?