Join the conversation on Slack
Open source library based on TensorFlow that predicts links between concepts in a knowledge graph.
AmpliGraph is a suite of neural machine learning models for relational learning, a branch of machine learning that deals with supervised learning on knowledge graphs.
Use AmpliGraph if you need to:
- Discover new knowledge from an existing knowledge graph.
- Complete large knowledge graphs with missing statements.
- Generate stand-alone knowledge graph embeddings.
- Develop and evaluate a new relational model.
AmpliGraph's machine learning models generate knowledge graph embeddings, vector representations of concepts in a metric space. AmpliGraph then combines these embeddings with model-specific scoring functions to predict unseen and novel links.
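The snippet below is a minimal sketch of that workflow on a toy graph. It assumes the 1.x API, where models live in `ampligraph.latent_features`; the entity names, relation names, and hyperparameters are made up for illustration, so check the documentation for exact signatures.

```python
import numpy as np
from ampligraph.latent_features import ComplEx

# A toy knowledge graph: (subject, predicate, object) triples as strings.
X = np.array([['acme_inc', 'has_ceo', 'jane_doe'],
              ['acme_inc', 'based_in', 'dublin'],
              ['jane_doe', 'born_in', 'dublin'],
              ['globex', 'has_ceo', 'john_roe'],
              ['globex', 'based_in', 'berlin']])

# Learn embeddings for every entity and relation in the graph.
model = ComplEx(k=50, eta=5, epochs=100, batches_count=1, seed=0)
model.fit(X)

# Embeddings: one vector per concept in the metric space.
print(model.get_embeddings(['acme_inc', 'dublin'], embedding_type='entity').shape)

# Scoring function applied to unseen candidate links: higher means more plausible.
unseen = np.array([['john_roe', 'born_in', 'berlin'],
                   ['jane_doe', 'based_in', 'berlin']])
print(model.predict(unseen))
```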
- Intuitive APIs: AmpliGraph APIs are designed to reduce the amount of code required to learn models that predict links in knowledge graphs.
- GPU-Ready: AmpliGraph is based on TensorFlow and is designed to run seamlessly on CPU and GPU devices to speed up training.
- Extensible: Roll your own knowledge graph embedding model by extending AmpliGraph base estimators.
AmpliGraph includes the following submodules:
- Datasets: helper functions to load datasets (knowledge graphs).
- Models: knowledge graph embedding models. AmpliGraph contains TransE, DistMult, ComplEx, HolE, ConvE, ConvKB. (More to come!)
- Evaluation: metrics and evaluation protocols to assess the predictive power of the models.
- Discovery: high-level convenience APIs for knowledge discovery (discover new facts, cluster entities, predict near duplicates); a usage sketch follows this list.
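A rough sketch of how these submodules fit together, assuming the 1.x package layout (loaders under `ampligraph.datasets`, models under `ampligraph.latent_features`, discovery helpers under `ampligraph.discovery`). Hyperparameters are placeholders, not recommended settings.

```python
from ampligraph.datasets import load_wn18rr
from ampligraph.latent_features import DistMult
from ampligraph.discovery import query_topn

# Datasets: benchmark knowledge graphs come as dicts of train/valid/test triples.
X = load_wn18rr()

# Models: fit a knowledge graph embedding model on the training triples.
model = DistMult(k=100, eta=5, epochs=20, batches_count=50, seed=0, verbose=True)
model.fit(X['train'])

# Discovery: rank the most plausible objects for a (subject, predicate) pair.
# Any entity/relation from the training set works; here we take the first triple's.
head, rel = X['train'][0][0], X['train'][0][1]
triples, scores = query_topn(model, top_n=5, head=head, relation=rel)
print(triples)
print(scores)
```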
- Linux, macOS, Windows
- Python ≥ 3.6
Create and activate a virtual environment (conda)
conda create --name ampligraph python=3.7
conda activate ampligraph
AmpliGraph is built on TensorFlow 1.x. Install from pip or conda:
CPU-only
pip install "tensorflow>=1.14.0,<2.0"
or
conda install "tensorflow>=1.14.0,<2.0"
GPU support
pip install "tensorflow-gpu>=1.14.0,<2.0"
or
conda install "tensorflow-gpu>=1.14.0,<2.0"
Install the latest stable release from pip:
pip install ampligraph
If instead you want the most recent development version, you can clone the repository and install AmpliGraph from source (your local working copy will be on the latest commit of the develop branch). The code snippet below installs the library in editable mode (-e):
git clone https://github.com/Accenture/AmpliGraph.git
cd AmpliGraph
pip install -e .
>>> import ampligraph
>>> ampligraph.__version__
'1.3.1'
AmpliGraph includes implementations of TransE, DistMult, ComplEx, HolE, ConvE, and ConvKB. Their predictive power is reported below and compared against the state-of-the-art results in literature. More details available here.
| Model | FB15K-237 | WN18RR | YAGO3-10 | FB15k | WN18 |
|---|---|---|---|---|---|
| Literature Best | 0.35* | 0.48* | 0.49* | 0.84** | 0.95* |
| TransE (AmpliGraph) | 0.31 | 0.22 | 0.51 | 0.63 | 0.66 |
| DistMult (AmpliGraph) | 0.31 | 0.47 | 0.50 | 0.78 | 0.82 |
| ComplEx (AmpliGraph) | 0.32 | 0.51 | 0.49 | 0.80 | 0.94 |
| HolE (AmpliGraph) | 0.31 | 0.47 | 0.50 | 0.80 | 0.94 |
| ConvE (AmpliGraph) | 0.26 | 0.45 | 0.30 | 0.50 | 0.93 |
| ConvE (1-N, AmpliGraph) | 0.32 | 0.48 | 0.40 | 0.80 | 0.95 |
| ConvKB (AmpliGraph) | 0.23 | 0.39 | 0.30 | 0.65 | 0.80 |
** Kadlec, Rudolf, Ondrej Bajgar, and Jan Kleindienst. "Knowledge base completion: Baselines strike back." arXiv preprint arXiv:1705.10744 (2017).
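The figures above are ranking metrics; the sketch below shows how metrics such as filtered MRR and Hits@N can be computed with the evaluation module, again assuming the 1.x API. The hyperparameters are placeholders and are not the settings behind the reported results.

```python
import numpy as np
from ampligraph.datasets import load_wn18
from ampligraph.latent_features import ComplEx
from ampligraph.evaluation import evaluate_performance, mrr_score, hits_at_n_score

X = load_wn18()

# Placeholder hyperparameters: not the configuration used for the table above.
model = ComplEx(k=150, eta=10, epochs=100, batches_count=100, seed=0,
                loss='multiclass_nll',
                regularizer='LP', regularizer_params={'p': 3, 'lambda': 1e-5})
model.fit(X['train'])

# Filter known true triples so they do not penalise the rank of a test triple.
filter_triples = np.concatenate((X['train'], X['valid'], X['test']))

ranks = evaluate_performance(X['test'], model=model,
                             filter_triples=filter_triples, verbose=True)

print("MRR:     %.2f" % mrr_score(ranks))
print("Hits@10: %.2f" % hits_at_n_score(ranks, n=10))
```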
The project documentation can be built from your local working copy with:
cd docs
make clean autogen html
See the guidelines in the AmpliGraph documentation.
If you like AmpliGraph and you use it in your project, why not star the project on GitHub!
If you instead use AmpliGraph in an academic publication, cite it as:
@misc{ampligraph,
author = {Luca Costabello and
Sumit Pai and
Chan Le Van and
Rory McGrath and
Nicholas McCarthy and
Pedro Tabacof},
title = {{AmpliGraph: a Library for Representation Learning on Knowledge Graphs}},
month = mar,
year = 2019,
doi = {10.5281/zenodo.2595043},
url = {https://doi.org/10.5281/zenodo.2595043}
}
AmpliGraph is licensed under the Apache 2.0 License.