Sunday, March 10, 2019

Word Embeddings, Word2Vec, LSTM, Recurrent Neural Network, GRU Review and Notes - Udacity Deep Learning Nanodegree Part 2


Word embeddings use vector math to represent relations between words, such as man and woman, or work and worked

Embedding weights are learned during training
An embedding lookup finds the row in the embedding matrix that corresponds to a word's integer index

The embedding dimension is the number of hidden units
Each word is encoded as an integer, as in the sketch below
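A minimal PyTorch sketch of this lookup (the toy vocabulary, the 8-unit embedding size, and the sentence are made up for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical toy vocabulary: each word gets an integer index
vocab = {"i": 0, "drink": 1, "water": 2, "coffee": 3, "tea": 4}

# Embedding layer: one row per word, 8 hidden units per row
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

# Encode a sentence as integers, then look up the corresponding rows
tokens = torch.tensor([vocab[w] for w in ["i", "drink", "tea"]])
vectors = embedding(tokens)   # shape: (3, 8), one row per input word
print(vectors.shape)          # torch.Size([3, 8])
```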

The embedding matrix is a weight matrix
The embedding layer is a hidden layer


Each row of the learned embedding matrix is the vector representation of one input word
The number of columns is the embedding dimension (the number of hidden units), usually in the hundreds
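To make the shape concrete, a small sketch (the sizes are illustrative): the weight matrix of an embedding layer is (vocab_size, embedding_dim), and a lookup is exactly row selection.

```python
import torch
import torch.nn as nn

# Toy sizes for illustration: 5-word vocabulary, 8-dimensional embeddings
embedding = nn.Embedding(num_embeddings=5, embedding_dim=8)

# One row per word in the vocabulary, one column per hidden unit
print(embedding.weight.shape)   # torch.Size([5, 8])

# Looking up word index 3 just selects row 3 of the weight matrix
idx = torch.tensor(3)
assert torch.equal(embedding(idx), embedding.weight[3])
```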

Words that appear in similar contexts are expected to have similar embeddings. For example, across "I drink water throughout the day", "I drink coffee in the morning", and "I drink tea in the afternoon":

water, coffee, and tea appear in similar contexts
morning, afternoon, and throughout the day appear in similar contexts
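A sketch of how that similarity is usually measured, using cosine similarity (the vectors below are random placeholders; with trained embeddings, water, coffee, and tea would score high against each other):

```python
import torch
import torch.nn.functional as F

# Random placeholders standing in for rows of a trained embedding matrix
water, coffee = torch.randn(8), torch.randn(8)

# Cosine similarity: close to 1.0 for words used in similar contexts
sim = F.cosine_similarity(water, coffee, dim=0).item()
print(f"similarity(water, coffee) = {sim:.3f}")
```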

Vector arithmetic

mapping verb A from present to past tense
and mapping verb B from present to past tense
should correspond to the same vector transformation (the same offset in embedding space), as sketched below
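A sketch of that check with placeholder vectors (with trained embeddings, the two offsets below would be roughly equal, so their cosine similarity would approach 1.0; with the random vectors here the score is meaningless):

```python
import torch
import torch.nn.functional as F

# Placeholder vectors standing in for trained word embeddings
vec = {w: torch.randn(8) for w in ["work", "worked", "play", "played"]}

# If "present -> past" is one shared vector offset, these should match
offset_work = vec["worked"] - vec["work"]
offset_play = vec["played"] - vec["play"]

print(F.cosine_similarity(offset_work, offset_play, dim=0).item())
```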

