I tried knowledge graph completion using an open-source framework called OpenKE. As a memo for myself, I will write up the results here.
This article is intended for readers who fit any of the following:
- People who are interested in knowledge graphs
- People who want to know what Python can do
- People who want to use OpenKE
A knowledge graph represents connections between pieces of knowledge as a graph structure.
**Example)** (obama, born-in, Hawaii)
Data consisting of a subject, a relation, and an object, like the triple above, is called a knowledge graph.
If we write the subject, relation, and object as $s$, $r$, and $o$ respectively, the goal this time is to predict $o$ from the pair $(s, r)$, or $s$ from the pair $(o, r)$.
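In other words, this is the link prediction task: score every candidate entity with some function $f(s, r, o)$ and pick the best one. As a standard formulation (the specific scoring function depends on the model, which comes later):

$$
\hat{o} = \underset{o \in \mathcal{E}}{\operatorname{arg\,max}} \; f(s, r, o)
$$

where $\mathcal{E}$ is the set of all entities (and symmetrically $\hat{s}$ when predicting the subject).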
OpenKE is an open-source framework created by the Tsinghua University Natural Language Processing and Social Humanities Computing Lab (THUNLP).
It is a framework dedicated to knowledge graph embedding, written in C++ and Python, and it currently appears to support PyTorch and TensorFlow.
For details, please refer to the OpenKE homepage or the GitHub repository linked below. OpenKE Home Page OpenKE's github
Next, here is the program we will actually run. This time, we will use train_distmult_WN18.py from examples.
```python
import openke
from openke.config import Trainer, Tester
from openke.module.model import DistMult
from openke.module.loss import SoftplusLoss
from openke.module.strategy import NegativeSampling
from openke.data import TrainDataLoader, TestDataLoader

# dataloader for training
train_dataloader = TrainDataLoader(
    in_path = "./benchmarks/WN18RR/",
    nbatches = 100,
    threads = 8,
    sampling_mode = "normal",
    bern_flag = 1,
    filter_flag = 1,
    neg_ent = 25,
    neg_rel = 0
)

# dataloader for test
test_dataloader = TestDataLoader("./benchmarks/WN18RR/", "link")

# define the model
distmult = DistMult(
    ent_tot = train_dataloader.get_ent_tot(),
    rel_tot = train_dataloader.get_rel_tot(),
    dim = 200
)

# define the loss function
model = NegativeSampling(
    model = distmult,
    loss = SoftplusLoss(),
    batch_size = train_dataloader.get_batch_size(),
    regul_rate = 1.0
)

# train the model
trainer = Trainer(model = model, data_loader = train_dataloader, train_times = 2000, alpha = 0.5, use_gpu = True, opt_method = "adagrad")
trainer.run()
distmult.save_checkpoint('./checkpoint/distmult.ckpt')

# test the model
distmult.load_checkpoint('./checkpoint/distmult.ckpt')
tester = Tester(model = distmult, data_loader = test_dataloader, use_gpu = True)
tester.run_link_prediction(type_constrain = False)
```
The dataset passed to test_dataloader is "./benchmarks/WN18RR/", the model is distmult, and the loss function is SoftplusLoss(). The embedding dimension dim is set to 200. Everything is left exactly as it was when downloaded.
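For reference, DistMult scores a triple with a trilinear product of the embeddings. This formula is the standard definition from the DistMult paper rather than something spelled out in the OpenKE script; here $\mathbf{e}_s$, $\mathbf{w}_r$, and $\mathbf{e}_o$ are the $d$-dimensional embeddings of the subject, relation, and object, with $d = 200$ above:

$$
f(s, r, o) = \sum_{i=1}^{d} [\mathbf{e}_s]_i \, [\mathbf{w}_r]_i \, [\mathbf{e}_o]_i
$$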
There are several other runnable example scripts in examples.
There are three parts you can change: the dataset, the model, and the loss function.
Make sure that the "./benchmarks/WN18RR/" path passed to train_dataloader and test_dataloader is the same. You can use the datasets at the link below for the benchmarks. benchmarks
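As a quick sanity check before training, you can confirm that the benchmark directory contains the files OpenKE reads. The file names below follow the layout of the WN18RR benchmark; this is my own helper snippet, not part of OpenKE:

```python
import os

# Files the OpenKE data loaders expect, based on the WN18RR benchmark layout.
benchmark = "./benchmarks/WN18RR/"
expected = ["train2id.txt", "valid2id.txt", "test2id.txt",
            "entity2id.txt", "relation2id.txt"]

for name in expected:
    path = os.path.join(benchmark, name)
    print(f"{name}: {'found' if os.path.exists(path) else 'MISSING'}")
```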
The variables in TrainDataLoader can be changed freely. In addition to normal, cross can be selected for sampling_mode (using cross may require a few further changes, as sketched below).
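As a rough sketch of what the cross setting looks like, here is a TrainDataLoader modeled on other scripts in examples that use sampling_mode = "cross"; note that it takes batch_size instead of nbatches, which is presumably the "slight change" mentioned above. The parameter values are illustrative, not tuned:

```python
from openke.data import TrainDataLoader

# Cross sampling alternates between corrupting heads and tails.
# batch_size replaces nbatches here; all values are illustrative.
train_dataloader = TrainDataLoader(
    in_path = "./benchmarks/WN18RR/",
    batch_size = 2000,
    threads = 8,
    sampling_mode = "cross",
    bern_flag = 0,
    filter_flag = 1,
    neg_ent = 64,
    neg_rel = 0
)
```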
For the model, please refer to the link below. Available models
In addition to SoftplusLoss, MarginLoss and SigmoidLoss can be used for the loss.
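For example, swapping in MarginLoss would look something like the following; the margin value of 5.0 is borrowed from OpenKE's TransE example and is illustrative rather than tuned for DistMult:

```python
from openke.module.loss import MarginLoss
from openke.module.strategy import NegativeSampling

# Same NegativeSampling wrapper as above, with a margin-based ranking loss.
model = NegativeSampling(
    model = distmult,
    loss = MarginLoss(margin = 5.0),
    batch_size = train_dataloader.get_batch_size()
)
```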
The execution results are as follows. Since I don't have a GPU machine, I ran the script on Google Colaboratory.
Let's compare them with the Experiments table on GitHub. That table appears to report Hits@10 (filter).
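Hits@10 is the proportion of test triples for which the correct entity is ranked within the top 10 candidates; the "filter" variant removes other triples already known to be true from the ranking first. As a standard definition (my own summary, not taken from the OpenKE docs):

$$
\text{Hits@10} = \frac{1}{|T|} \sum_{(s, r, o) \in T} \mathbb{1}\big[\operatorname{rank}(o) \le 10\big]
$$

where $T$ is the set of test triples.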
The Hits@10 (filter) from my run averaged 0.463306, which is about 0.015 lower than the DistMult value reported on GitHub.
One improvement would be to adopt a different loss function. Another option is to change the values of neg_ent, neg_rel, and alpha.
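Concretely, those tweaks only touch a couple of lines in the script above. For instance, lowering the learning rate would mean rebuilding the trainer like this (alpha = 0.1 is a guess to illustrate the knob, not a tuned value; neg_ent and neg_rel would be changed in the TrainDataLoader call instead):

```python
from openke.config import Trainer

# Same setup as above, but with a smaller learning rate (alpha was 0.5).
trainer = Trainer(model = model, data_loader = train_dataloader,
                  train_times = 2000, alpha = 0.1,
                  use_gpu = True, opt_method = "adagrad")
trainer.run()
```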
In this post, I tried knowledge graph completion using OpenKE. I did not get the results I expected, but there is room for improvement, so I would like to work on the improvements listed above.
Thank you for reading until the end.