Overview
This interface presents GPTKB, a large general-domain knowledge base (KB) constructed entirely from a large language model (LLM). It demonstrates the feasibility of large-scale KB construction from LLMs, while highlighting specific challenges around entity recognition, entity and property canonicalization, and taxonomy construction.
Based on GPT-4o-mini, GPTKB contains 105 million triples for more than 2.9 million entities, at a cost roughly 100 times lower than previous knowledge base construction (KBC) projects.
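To make the scale claims concrete, a KB like GPTKB stores knowledge as subject-predicate-object triples, and the entity count corresponds to the distinct entities those triples describe. The following is a minimal sketch of this representation; the triples shown are illustrative examples, not actual GPTKB data.

```python
# Illustrative subject-predicate-object triples (hypothetical data,
# not drawn from GPTKB itself).
triples = [
    ("Marie_Curie", "wonAward", "Nobel_Prize_in_Physics"),
    ("Marie_Curie", "field", "physics"),
    ("Pierre_Curie", "spouse", "Marie_Curie"),
]

# Entities here are the distinct triple subjects; a full KB would
# typically also index objects and canonicalize entity names.
entities = {subject for subject, _, _ in triples}
print(f"{len(triples)} triples, {len(entities)} subject entities")
```

At GPTKB's scale (105 million triples), such data would live in a dedicated triple store rather than in memory, but the underlying model is the same.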
GPTKB is a landmark for two fields:
- For NLP, it provides, for the first time, constructive insights into the knowledge (or beliefs) of LLMs.
- For the Semantic Web, it shows novel ways forward for the long-standing challenge of general-domain KB construction.
Main paper
If you use this data, please cite the following paper:
GPTKB: Building Very Large Knowledge Bases from Language Models
arXiv, 2024