dariakryvosheieva committed
Commit d064537 · verified · 1 Parent(s): 01bf1da

Update README.md

Files changed (1): README.md (+45, -0)
README.md CHANGED
@@ -40,6 +40,51 @@ Summary of features:
  | Pooling Strategy | Last-token pooling |
  | Attention Mechanism | FlashAttention2 |

+ ## Usage
+
+ <details>
+ <summary>Requirements</summary>
+
+ The following Python packages are required:
+
+ - `transformers>=4.53.0`
+ - `torch>=2.7.1`
+
+ ### Optional / Recommended
+ - **flash-attention**: Installing [flash-attention](https://github.com/Dao-AILab/flash-attention) is recommended for improved inference speed and efficiency, but not mandatory.
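+   It is typically installed with `pip install flash-attn --no-build-isolation` (see the flash-attention README for build prerequisites; a CUDA toolchain and a matching torch build are assumed).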
+ </details>
+
+ <details>
+ <summary>via <a href="https://huggingface.co/docs/transformers/en/index">transformers</a></summary>
+
+ ```python
+ # !pip install "transformers>=4.53.0" "torch>=2.7.1"
+
+ from transformers import AutoModel
+ import torch
+
+ # Initialize the model
+ model = AutoModel.from_pretrained("jinaai/jina-embeddings-c1-0.5B", trust_remote_code=True)
+ model.to("cuda")
+
+ # Configure truncate_dim, max_length, batch_size in the encode function if needed
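+ # e.g. (hypothetical values; adjust to your data and hardware):
+ #   model.encode(texts, task="nl2code", prompt_name="query",
+ #                truncate_dim=256, max_length=8192, batch_size=8)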
+
+ # Encode query
+ query_embeddings = model.encode(
+     ["print hello world in python"],
+     task="nl2code",
+     prompt_name="query",
+ )
+
+ # Encode passage
+ passage_embeddings = model.encode(
+     ["print('Hello World!')"],
+     task="nl2code",
+     prompt_name="passage",
+ )
+ ```
+ </details>
+
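+ <details>
+ <summary>Computing similarities</summary>
+
+ A minimal sketch of scoring the passage against the query, continuing the snippet above. It assumes `encode` returns array-like embeddings that can be converted to torch tensors; the exact return type may differ:
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ q = torch.as_tensor(query_embeddings)    # assumed shape: (1, dim)
+ p = torch.as_tensor(passage_embeddings)  # assumed shape: (1, dim)
+
+ # Cosine similarity between the query and the passage
+ scores = F.cosine_similarity(q, p)
+ print(scores)
+ ```
+ </details>
+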
  ## Training & Evaluation

  Please refer to our jina-embeddings-c1 technical report for training details and benchmarks.