# Model Card
This model was trained using custom training code.
## Model Details
- Architecture: Decoder-only transformer
- Parameters: See config.json for details
- Training: Custom training loop
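Since the parameter count is only stated indirectly via `config.json`, one way to get a concrete number is to load the checkpoint and sum the tensor sizes. A minimal sketch (the repo id is taken from the usage example below; the printed total depends on the actual configuration in this repo):

```python
from transformers import AutoModelForCausalLM

# Load the checkpoint and count parameters directly from the weights;
# this reflects whatever architecture config.json specifies.
model = AutoModelForCausalLM.from_pretrained("omkaark/test1")
total = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {total:,}")
```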
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("omkaark/test1")
model = AutoModelForCausalLM.from_pretrained("omkaark/test1")

# Generate text
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50)
print(tokenizer.decode(outputs[0]))
```
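The call above uses greedy decoding and `max_length`, which counts the prompt tokens toward the limit. For more varied output, `generate` also supports sampling; a sketch with illustrative sampling parameters (these values are generic defaults, not settings tuned for this model):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("omkaark/test1")
model = AutoModelForCausalLM.from_pretrained("omkaark/test1")

inputs = tokenizer("Hello, world!", return_tensors="pt")
# do_sample=True draws from the model's distribution instead of
# greedy-decoding; max_new_tokens bounds only the continuation,
# not prompt + continuation.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    max_new_tokens=50,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```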