
opencog-dev team mailing list archive

Bengio's A Neural Knowledge Language Model


Using an RNN and a knowledge graph together. The new WikiFacts dataset
seems very interesting.

*A Neural Knowledge Language Model*

*Communicating knowledge is a primary purpose of language. However, current
language models have significant limitations in their ability to encode or
decode knowledge. This is mainly because they acquire knowledge based on
statistical co-occurrences, even if most of the knowledge words are rarely
observed named entities. In this paper, we propose a Neural Knowledge
Language Model (NKLM) which combines symbolic knowledge provided by
knowledge graphs with RNN language models. At each time step, the model
predicts a fact on which the observed word is supposed to be based. Then, a
word is either generated from the vocabulary or copied from the knowledge
graph. We train and test the model on a new dataset, WikiFacts. In
experiments, we show that the NKLM significantly improves the perplexity
while generating a much smaller number of unknown words. In addition, we
demonstrate that the sampled descriptions include named entities which
used to be unknown words in RNN language models.*

http://arxiv.org/pdf/1608.00318v1.pdf
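
For intuition, here is a minimal sketch in plain Python/NumPy of the
two-stage decoding step the abstract describes: predict a fact, then
either copy a word from it or generate from the vocabulary. Everything
here (the toy vocabulary, the FACTS dict, the weight matrices, and the
greedy argmax decoding) is a hypothetical simplification of mine, not
the paper's actual implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: hypothetical sizes, not taken from the paper.
VOCAB = ["the", "was", "born", "in", "<unk>"]   # global word vocabulary
FACTS = {                                        # tiny symbolic KG for one topic
    "birthplace": ["Honolulu"],
    "birthyear": ["1961"],
}
HIDDEN = 8

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nklm_step(h, W_fact, W_copy, W_vocab, fact_names):
    """One decoding step: pick a fact, then copy from it or generate."""
    # 1. Predict the fact the next word is supposed to be based on.
    fact_probs = softmax(W_fact @ h)
    fact = fact_names[int(fact_probs.argmax())]
    # 2. Binary gate: copy from the fact vs. generate from the vocabulary.
    p_copy = 1.0 / (1.0 + np.exp(-(W_copy @ h)))
    if p_copy > 0.5:
        # Copy: emit the selected fact's object, which can be a rare
        # named entity that a fixed vocabulary would map to <unk>.
        word = FACTS[fact][0]
    else:
        # Generate: ordinary softmax over the fixed vocabulary.
        word = VOCAB[int(softmax(W_vocab @ h).argmax())]
    return fact, word

fact_names = list(FACTS)
W_fact = rng.normal(size=(len(fact_names), HIDDEN))
W_copy = rng.normal(size=HIDDEN)
W_vocab = rng.normal(size=(len(VOCAB), HIDDEN))
h = rng.normal(size=HIDDEN)  # stand-in for the RNN hidden state
print(nklm_step(h, W_fact, W_copy, W_vocab, fact_names))

The point is just the two-stage decision: condition on a predicted
fact, then gate between copying that fact's object and generating from
the fixed vocabulary, which is how the model can emit named entities
that would otherwise be unknown words.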

-- 
*Murilo Saraiva de Queiroz, MSc*
*Hardware Engineer at NVIDIA*
