Learning to Understand Phrases by Embedding the Dictionary
TRANSCRIPT
Roelof Pieters (@graphific)
Learning to Understand Phrases by Embedding the Dictionary
11 April 2015 Deep Learning Reading Group
www.csc.kth.se/~roelof/
http://arxiv.org/abs/1504.00548
Review of
Felix Hill, Kyunghyun Cho, Anna Korhonen, Yoshua Bengio
Core Idea
• A model that learns useful representations of phrases and sentences
• Bridges the gap between lexical semantics (word meaning) and phrasal, or compositional, semantics (phrase/sentence meaning)
Model
• An RNN that maps dictionary definitions (phrases) to the (lexical) representations of the words those definitions define (see the sketch after this list)
• Two tasks:
1. reverse dictionary/concept finder
2. general-knowledge crossword question answerer
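A minimal sketch of this mapping, assuming a PyTorch implementation; the class name DefinitionEncoder and all dimensions are illustrative, not from the paper. The encoder reads a definition word by word and outputs a single vector in the same space as the pretrained embeddings of the defined words; both tasks are then answered by ranking candidate words against that vector.

```python
# Illustrative sketch (not the authors' code): an LSTM reads a definition
# and emits one vector in the space of the pretrained target word embeddings.
import torch
import torch.nn as nn

class DefinitionEncoder(nn.Module):          # hypothetical name
    def __init__(self, vocab_size, emb_dim=256, target_dim=500):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, target_dim, batch_first=True)

    def forward(self, definition_ids):
        # definition_ids: (batch, seq_len) indices of the definition's words
        _, (hidden, _) = self.lstm(self.embed(definition_ids))
        return hidden[-1]                    # (batch, target_dim) phrase vector

encoder = DefinitionEncoder(vocab_size=10_000)
phrase_vec = encoder(torch.randint(0, 10_000, (1, 12)))   # a 12-word definition
print(phrase_vec.shape)                                    # torch.Size([1, 500])
```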
RNN (Recurrent Neural Network)
1. Latent features are modelled as distributed, dense vectors in the hidden layers
2. Can operate on sequential data of variable length
3. Suffers from vanishing/exploding gradients (see the recurrence sketch below)
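A bare-bones recurrence in NumPy, purely to illustrate these three points; the weights and sizes here are made up. The same parameters are applied at every time step, so the loop handles any sequence length, and the repeated multiplication by W_h is exactly what makes gradients vanish or explode over long sequences.

```python
# Vanilla RNN step, for illustration only: a dense hidden state h is updated
# once per token with the same weights, so any sequence length works.
import numpy as np

def rnn_encode(x_seq, W_x, W_h, b):
    h = np.zeros(W_h.shape[0])
    for x_t in x_seq:                          # variable-length input
        h = np.tanh(W_x @ x_t + W_h @ h + b)   # distributed, dense latent features
    return h

rng = np.random.default_rng(0)
hid, emb = 8, 4
W_x, W_h, b = rng.normal(size=(hid, emb)), rng.normal(size=(hid, hid)), np.zeros(hid)
phrase = [rng.normal(size=emb) for _ in range(5)]   # five word vectors
print(rnn_encode(phrase, W_x, W_h, b).shape)        # (8,)
```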
LSTM (Long Short-Term Memory)
1. Scales connections between the memory cell and the input/output layers
2. Gates to control input/memory/outputs
3. Lessens the vanishing/exploding gradient problem (gate equations below)
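The standard LSTM gating equations, given in textbook form rather than copied from the paper: the input, forget and output gates i_t, f_t, o_t scale what enters, what persists in, and what leaves the memory cell c_t, and the additive cell update is what softens the vanishing/exploding gradient problem.

```latex
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
h_t = o_t \odot \tanh(c_t)
```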
[diagram: an RNN built from LSTM units]
I. Cross-Lingual Reverse Dictionaries
[diagram: training setup, dictionary definitions are fed to the model and trained against target word embeddings (Word2Vec CBOW, 8B words)]
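A sketch of one training step under these assumptions: the encoder's output for a definition is pushed toward the frozen, pretrained CBOW vector of the word it defines, here with a cosine loss as one natural choice for matching vectors in a shared space. The tensors below stand in for the encoder from the earlier sketch.

```python
# Illustrative training step (assumed setup): minimise cosine distance between
# the encoder's phrase vector and the frozen word2vec (CBOW) target embedding.
import torch
import torch.nn.functional as F

phrase_vec = torch.randn(32, 500, requires_grad=True)  # encoder output (see earlier sketch)
w2v_target = torch.randn(32, 500)                       # pretrained CBOW vectors, not updated

loss = (1 - F.cosine_similarity(phrase_vec, w2v_target)).mean()
loss.backward()          # gradients flow back into the RNN encoder parameters
```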
I. Cross-Lingual Reverse Dictionaries
[diagram: testing, an input phrase is passed through the RNN and mapped into the embedding space]
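At test time the same encoder maps an arbitrary input phrase into the embedding space, and candidate words are ranked by how close their pretrained embeddings lie to it. A small sketch of that lookup; the vocabulary, embeddings and query vector below are placeholders.

```python
# Illustrative reverse-dictionary lookup (assumed procedure): rank every word
# by cosine similarity between its pretrained embedding and the phrase vector.
import numpy as np

def nearest_words(phrase_vec, word_embeddings, words, k=5):
    E = word_embeddings / np.linalg.norm(word_embeddings, axis=1, keepdims=True)
    q = phrase_vec / np.linalg.norm(phrase_vec)
    scores = E @ q                          # cosine similarity to every candidate
    return [words[i] for i in np.argsort(-scores)[:k]]

words = ["valve", "prelude", "vessel", "darken"]      # toy vocabulary
E = np.random.randn(len(words), 500)                  # pretrained target embeddings
query = np.random.randn(500)                          # encoder output for the input phrase
print(nearest_words(query, E, words, k=2))
```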
II. General Knowledge (crossword) Question Answering