diff --git a/lessons/5-NLP/15-LanguageModeling/README.md b/lessons/5-NLP/15-LanguageModeling/README.md
index 06d73d1aff3df1a3e82c84fe4c6a63b2a60d66dc..dc5e4a513a997fe60290f0df4f0ad519a7ef8891 100644
--- a/lessons/5-NLP/15-LanguageModeling/README.md
+++ b/lessons/5-NLP/15-LanguageModeling/README.md
@@ -23,6 +23,7 @@ In our previous examples, we used pre-trained semantic embeddings, but it is int
 Continue your learning in the following notebooks:
 
 * [Training CBoW Word2Vec with TensorFlow](CBoW-TF.ipynb)
+* [Training CBoW Word2Vec with PyTorch](CBoW-PyTorch.ipynb)
 
 ## Conclusion