- note: below, $O_{6257}$ is the classic one-hot encoded vector indicating that, out of the 10,000 words in our dictionary, we are choosing the 6257th one (which is "orange")
- the embedding matrix $E$ stores the [[Word Embeddings|Word Embedding Vector]] for each unique word, one per column
- if you compute $E \cdot O_{6257}$, you are picking out the 6257th column, retrieving that single word's embedding vector from the embedding matrix (see the sketch below the image)
![[CleanShot 2024-07-08 at [email protected]|400]]
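A minimal NumPy sketch of this column-selection idea. The 300-dimensional embedding size and the use of plain 0-based index `6257` (to mirror the $O_{6257}$ notation, ignoring the 0- vs 1-based distinction) are assumptions for illustration:

```python
import numpy as np

vocab_size = 10_000      # number of unique words in the dictionary
embedding_dim = 300      # assumed embedding size; the notes don't fix this number

# E: embedding matrix with one word-embedding vector per column
E = np.random.randn(embedding_dim, vocab_size)

# O_6257: one-hot vector selecting the word "orange"
o_orange = np.zeros(vocab_size)
o_orange[6257] = 1.0

# the matrix-vector product E . O_6257 picks out that single column of E ...
via_matmul = E @ o_orange

# ... which is exactly the same vector as indexing the column directly
via_index = E[:, 6257]

assert np.allclose(via_matmul, via_index)
```

In practice, frameworks implement this as a direct column (or row) lookup rather than an actual matrix multiply, since multiplying by a mostly-zero one-hot vector wastes computation.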