Pretrained Word Embeddings

Hi everyone!
Is it possible to add pre-trained embeddings like GloVe?

What would you be using it for in this context? Also, why not one of the BERT variants?

Well, I don’t mind which embedding; BERT is also good.
My question is whether you can add a pretrained vocabulary to our model. So, instead of only using the vocabulary we built from the training data, we could use something like an external database to give it more vocab.
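For concreteness, here is a minimal sketch of one common way to do this, assuming PyTorch and a locally downloaded GloVe text file (the file path and the small `vocab` dict are just illustrative placeholders):

```python
# Sketch: initialize an embedding layer from pretrained GloVe vectors.
# Assumes "glove.6B.100d.txt" has been downloaded locally.
import numpy as np
import torch
import torch.nn as nn

def load_glove(path):
    """Parse a GloVe .txt file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

# Hypothetical vocabulary built from the training data.
vocab = {"<pad>": 0, "<unk>": 1, "the": 2, "model": 3}

glove = load_glove("glove.6B.100d.txt")
dim = 100

# Start with small random rows, then overwrite rows for words GloVe knows.
matrix = np.random.normal(scale=0.1, size=(len(vocab), dim)).astype(np.float32)
for word, idx in vocab.items():
    if word in glove:
        matrix[idx] = glove[word]

# freeze=False lets the pretrained vectors keep fine-tuning on the task data.
embedding = nn.Embedding.from_pretrained(torch.from_numpy(matrix), freeze=False)
```

In this sketch, extra words from an external source could be appended to `vocab` before the matrix is built, which is one way to give the model vocabulary beyond what appears in the training set.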

Add pretrained vocab to…?