---
base_model: qingy2024/GRMR-2B-Instruct
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- openvino
- openvino-export
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---

This model was converted to the OpenVINO format from [`qingy2024/GRMR-2B-Instruct`](https://huggingface.co/qingy2024/GRMR-2B-Instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel)
via the [export](https://huggingface.co/spaces/echarlaix/openvino-export) space.

First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
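
If you want to run the conversion yourself instead of downloading this checkpoint, optimum-intel can also export the original model on the fly. This is a minimal sketch; the output directory name is an arbitrary choice:

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

# Export the original PyTorch checkpoint to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained("qingy2024/GRMR-2B-Instruct", export=True)
tokenizer = AutoTokenizer.from_pretrained("qingy2024/GRMR-2B-Instruct")

# Save the converted model and tokenizer locally (directory name is arbitrary)
model.save_pretrained("GRMR-2B-Instruct-openvino")
tokenizer.save_pretrained("GRMR-2B-Instruct-openvino")
```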

To load the model, you can do as follows:

```python
from transformers import AutoTokenizer, AutoConfig, pipeline
from optimum.intel.openvino import OVModelForCausalLM
import time

model_id = "santhosh/GRMR-2B-Instruct-openvino"
# The base model is a decoder-only Gemma 2 model, so load it with the causal-LM class
model = OVModelForCausalLM.from_pretrained(
    model_id,
    config=AutoConfig.from_pretrained(model_id),
    use_cache=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Create a text-generation pipeline
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    truncation=True,
    max_new_tokens=256,
)

# Example inputs that intentionally contain errors for the model to correct or rewrite
texts = [
    "Most of the course is about semantic or  content of language but there are also interesting topics to be learned from the servicefeatures except statistics in characters in documents.",
    "At this point, He introduces herself as his native English speaker and goes on to say that if you contine to work on social scnce",
    "He come after the event.",
    "When I grew up, I start to understand what he said is quite right",
    "Write this more formally: omg! i love that song im listening to right now",
    "Improve the grammaticality: As the number of people grows, the need of habitable environment is unquestionably essential.",
]
start_time = time.time()
for result in pipe(texts):
    print(result)
end_time = time.time()
duration = end_time - start_time
print(f"Correction completed in {duration:.2f} seconds.")
```
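
If you prefer to skip the pipeline wrapper, you can call `generate()` on the OpenVINO model directly. This is a minimal sketch; the prompt and the `max_new_tokens` value are illustrative choices, not settings from the original model card:

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

model_id = "santhosh/GRMR-2B-Instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Tokenize a single ungrammatical sentence and generate a correction
inputs = tokenizer("He come after the event.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```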