---
license: apache-2.0
language:
- en
tags:
- ColBERT
- PyLate
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:497901
- loss:Contrastive
base_model: google/bert_uncased_L-2_H-128_A-2
datasets:
- sentence-transformers/msmarco-bm25
pipeline_tag: sentence-similarity
library_name: PyLate
---

# Model card for PyLate BERT Tiny
This is a PyLate model finetuned from google/bert_uncased_L-2_H-128_A-2 on the msmarco-bm25 dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
This model is primarily designed for unit tests in limited compute environments such as GitHub Actions, but it does work to an extent for basic use cases.
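The MaxSim operator mentioned above scores a query–document pair by matching each query token embedding against its most similar document token embedding and summing those maxima. A minimal NumPy sketch of this late-interaction scoring (the function name and toy data are illustrative, not part of the PyLate API):

```python
import numpy as np

def maxsim_score(query_embeddings, doc_embeddings):
    """ColBERT-style MaxSim: for each query token vector, take the highest
    cosine similarity against any document token vector, then sum."""
    # Normalize token vectors so dot products equal cosine similarities.
    q = query_embeddings / np.linalg.norm(query_embeddings, axis=1, keepdims=True)
    d = doc_embeddings / np.linalg.norm(doc_embeddings, axis=1, keepdims=True)
    # (num_query_tokens, num_doc_tokens) similarity matrix.
    sim = q @ d.T
    # Best document token per query token, summed over query tokens.
    return sim.max(axis=1).sum()

# Toy example with random 128-dimensional token embeddings,
# matching this model's embedding size.
rng = np.random.default_rng(0)
query = rng.normal(size=(4, 128))   # 4 query tokens
doc = rng.normal(size=(20, 128))    # 20 document tokens
score = maxsim_score(query, doc)
```

In practice PyLate computes this interaction for you at retrieval time; the sketch only shows why the model outputs a *sequence* of 128-dimensional vectors per text rather than a single pooled vector.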