rutikab12

Rutik Bhoyar

Posted on June 15, 2021

What is Word2Vec?

Why do we need Word2Vec?

  1. Both TF-IDF and Bag of Words (BOW) do not capture semantic information; TF-IDF only gives more weight to uncommon words.
  2. Because these representations are sparse and high-dimensional, there is also a real chance of overfitting.

What does Word2Vec do?

In the Word2Vec model, each word is represented as a dense vector of 32 or more dimensions instead of a single number.

Here the semantic information and the relations between different words are also preserved.

Word2Vec is usually trained on huge amounts of data.
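For intuition, here is a minimal sketch (not from the original post) that loads pre-trained vectors through gensim's downloader. The downloader module and the "word2vec-google-news-300" model name are assumptions on my part, and that model is a large (~1.6 GB) download.

import gensim.downloader as api

# Pre-trained Word2Vec vectors (300 dimensions, trained on Google News).
wv = api.load("word2vec-google-news-300")

print(wv["king"].shape)                 # (300,) -- each word is a dense vector
print(wv.similarity("king", "queen"))   # higher similarity for related words
print(wv.similarity("king", "banana"))  # lower similarity for unrelated words
print(wv.most_similar("king", topn=3))  # nearest neighbours in vector space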

How to use it?

from gensim.models import Word2Vec

After tokenizing the text and removing stopwords, write the following code:

model = Word2Vec(sentences, min_count=1)  # sentences is a list of token lists
words = model.wv                          # the trained word vectors (KeyedVectors)
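For completeness, here is a minimal end-to-end sketch on a toy corpus. The whitespace tokenization and the tiny hand-made stopword list are simplifications standing in for whatever tokenizer and stopword list you actually use, and with so little data the nearest neighbours are not meaningful; the flow is the same on a large corpus.

from gensim.models import Word2Vec

# Toy corpus; in practice Word2Vec is trained on millions of sentences.
raw_sentences = [
    "word2vec learns a dense vector for every word",
    "similar words end up with similar vectors",
    "the vectors preserve semantic relations between words",
]

# Simple whitespace tokenization plus a tiny hand-made stopword list.
stop_words = {"a", "the", "for", "with", "up", "between", "every", "end"}
sentences = [[w for w in s.lower().split() if w not in stop_words]
             for s in raw_sentences]

model = Word2Vec(sentences, vector_size=100, min_count=1)  # 100-dimensional vectors
words = model.wv

print(words["vectors"].shape)         # (100,) -- one dense vector per word
print(words.most_similar("vectors"))  # neighbours are only meaningful on a large corpus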


Thank You!
