Implementation of the word2vec embedding model from scratch in C++

Description

Word2Vec is a popular word embedding model that is widely used in NLP. Instead of representing every word in a document as a one-hot vector, it learns a dense embedding for each word based on the context and semantics of the words around it.

For more information, see the original paper: https://arxiv.org/pdf/1301.3781.pdf
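A common way to train such embeddings is the skip-gram objective with negative sampling: for each (center, context) word pair drawn from a sliding window, the center word's embedding is pushed toward the context word's embedding and away from a few randomly sampled negative words. The sketch below illustrates a single such update. It is a minimal illustration under assumed data layouts (flat V x D float vectors) and hypothetical names such as Model and train_pair; it is not code taken from word2vec.cc.

#include <cmath>
#include <utility>
#include <vector>

struct Model {
    int dim;                     // embedding dimension D
    std::vector<float> in_vecs;  // input (center-word) embeddings, size V * D
    std::vector<float> out_vecs; // output (context-word) embeddings, size V * D
};

static float sigmoid(float x) { return 1.0f / (1.0f + std::exp(-x)); }

// One (center, context) skip-gram update with negative sampling.
void train_pair(Model& m, int center, int context,
                const std::vector<int>& negatives, float lr) {
    float* v_in = &m.in_vecs[center * m.dim];
    std::vector<float> grad_in(m.dim, 0.0f);

    // The positive (observed) pair has label 1, negative samples have label 0.
    std::vector<std::pair<int, float>> targets;
    targets.push_back({context, 1.0f});
    for (int neg : negatives) targets.push_back({neg, 0.0f});

    for (const auto& t : targets) {
        float* v_out = &m.out_vecs[t.first * m.dim];
        float dot = 0.0f;
        for (int i = 0; i < m.dim; ++i) dot += v_in[i] * v_out[i];
        float g = lr * (t.second - sigmoid(dot));  // scaled gradient of log-loss
        for (int i = 0; i < m.dim; ++i) {
            grad_in[i] += g * v_out[i];   // accumulate update for the center word
            v_out[i]   += g * v_in[i];    // update the context / negative word
        }
    }
    for (int i = 0; i < m.dim; ++i) v_in[i] += grad_in[i];
}

In practice, negative words are typically drawn from the unigram distribution raised to the 3/4 power, and the learning rate is decayed as training progresses over the corpus.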

Build

g++ -std=c++11 word2vec.cc -o word2vec
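Training loops like this are compute-heavy, so it may be worth enabling compiler optimizations, for example (assuming the same single source file):

g++ -std=c++11 -O3 word2vec.cc -o word2vec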

Remarks

Please note that this project was built for learning purposes and is not ready for production use. Use it at your own risk.

Author

Rohith Uppala
