nomic-embed-text-v1-unsupervised: A Reproducible Long Context (8192) Text Embedder

nomic-embed-text-v1-unsupervised is an 8192-context-length text encoder. It is the checkpoint produced after the contrastive pretraining stage of the multi-stage contrastive training that yields the final model. We release this checkpoint to open-source the training artifacts described in our Nomic Embed Text technical report.

If you want a model for extracting embeddings, we suggest using nomic-embed-text-v1 instead.
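Text embedders in this family typically reduce the encoder's per-token outputs to a single vector by mean pooling over the attention mask, so padded positions do not dilute the embedding. A minimal sketch of that pooling step, using NumPy and dummy tensors (the helper name `mean_pool` is illustrative, not part of any released API):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over real tokens, ignoring padding.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len) with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(float)        # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)        # sum real tokens only
    counts = np.clip(mask.sum(axis=1), 1e-9, None)        # avoid divide-by-zero
    return summed / counts

# Dummy batch: 1 sequence, 4 positions, dim 3; last position is padding.
emb = np.array([[[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0],
                 [9.0, 9.0, 9.0]]])   # padding row, must be ignored
mask = np.array([[1, 1, 1, 0]])
print(mean_pool(emb, mask))  # → [[0.3333... 0.3333... 0.3333...]]
```

The padded row's large values never enter the average, which is the point of masking before pooling.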

Join the Nomic Community
