#### Using word embeddings
![](Images/31.png)
- In this problem, we encode each face into a vector and then check how similar these vectors are (a minimal similarity check is sketched after this list).
- The terms **encoding** and **embedding** have a similar meaning here.
- In the word embeddings task, we learn a representation for each word in our vocabulary (unlike in image encoding, where we have to map each new image to some n-dimensional vector). We will discuss the algorithm in the next sections.
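
A minimal sketch of the similarity check in Python with NumPy. The 4-dimensional vectors here are made up purely for illustration; real embeddings are learned from data and typically have far more dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embedding vectors, for illustration only;
# real embeddings are learned and usually have hundreds of dimensions.
e_a = np.array([0.8, 0.1, 0.7, 0.3])
e_b = np.array([0.8, 0.2, 0.6, 0.3])

def cosine_similarity(u, v):
    # Cosine of the angle between u and v: values close to 1.0 mean
    # the two vectors (and hence the two items) are very similar.
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(e_a, e_b))  # high value -> similar items
```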

#### Properties of word embeddings
- One of the most fascinating properties of word embeddings is that they can also help with analogy reasoning (as sketched below), which is one of the most important applications of NLP.
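
A minimal sketch of analogy reasoning with cosine similarity, again using made-up toy vectors (the words and values are illustrative, not taken from a trained model):

```python
import numpy as np

# Toy embeddings, illustrative only (real vectors come from a trained model).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.8, 0.9, 0.7, 0.3]),
    "man":   np.array([0.7, 0.1, 0.2, 0.4]),
    "woman": np.array([0.7, 0.9, 0.2, 0.4]),
    "apple": np.array([0.1, 0.5, 0.1, 0.9]),
}

def cosine_similarity(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# "man is to king as woman is to ?" -> find the word whose embedding
# is closest to e_king - e_man + e_woman.
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
answer = max(
    (w for w in embeddings if w not in {"king", "man", "woman"}),
    key=lambda w: cosine_similarity(target, embeddings[w]),
)
print(answer)  # queen
```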