BERT sentence embeddings from transformers

While Jindrich's existing answer is generally correct, it does not address the question entirely. The OP asked which layer they should use to calculate cosine similarity between sentence embeddings, and the short answer is: none. A metric like cosine similarity requires that the dimensions of the vector contribute equally …
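For reference, here is a minimal sketch of the setup the question describes — pooling one of BERT's hidden layers into a sentence vector and comparing vectors with cosine similarity — which is precisely what this answer cautions against using off the shelf. The model name and the mean-pooling choice are illustrative assumptions, not part of the original answer:

    # Sketch only: mean-pool the last hidden layer into a sentence vector
    # and compare with cosine similarity. The answer above argues raw BERT
    # layers are not suited to this; model/pooling choices are assumptions.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embed(sentence: str) -> torch.Tensor:
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            outputs = model(**inputs)
        # Average the token vectors of the last hidden layer
        return outputs.last_hidden_state.mean(dim=1).squeeze(0)

    a = embed("A man is playing guitar.")
    b = embed("Someone plays an instrument.")
    print(float(torch.cosine_similarity(a, b, dim=0)))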

How to change huggingface transformers default cache directory

You can specify the cache directory every time you load a model with .from_pretrained by setting the parameter cache_dir. You can define a default location by exporting the environment variable TRANSFORMERS_CACHE every time before you use the library (i.e., before importing it!).

Example for Python:

    import os
    os.environ['TRANSFORMERS_CACHE'] = '/blabla/cache/'

Example for bash:

    export TRANSFORMERS_CACHE=/blabla/cache/
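And a minimal sketch of the per-call option mentioned above, passing cache_dir directly to .from_pretrained (the model name and path are placeholders):

    # Per-call alternative: point this one download at a custom cache
    from transformers import AutoModel

    model = AutoModel.from_pretrained(
        "bert-base-uncased",            # placeholder model name
        cache_dir="/blabla/cache/",     # placeholder path
    )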