---
tags:
- text2text-generation
- t5
- parsing
- instruction-following
- custom-task
---
## Model Description
This is a fine-tuned T5-based model designed for parsing input instructions and converting them into structured outputs. It supports tasks such as:
- Log parsing
- Data transformation
- Instruction following for structured output generation
## Example Usage
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("aronip/SN-T5-Base-FT")
model = AutoModelForSeq2SeqLM.from_pretrained("aronip/SN-T5-Base-FT")

input_text = "Parse log entry: 2025-01-01T00:53:36.000000 WARN Chartered_accountant_Service: Restarting security module key0=d6c40d4c"
inputs = tokenizer(input_text, return_tensors="pt")

# Pass the attention mask along with the input IDs and cap the output length.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
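For building prompts from raw logs, the sample entry above can be split into its fields with a short regex. This is a sketch based only on the single example in this card (timestamp, level, service, message); real log formats may differ, and the model itself performs the actual structured parsing.

```python
import re

# Field layout inferred from the sample log entry in the usage example;
# adjust the pattern for other log formats.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\S+)\s+"   # e.g. 2025-01-01T00:53:36.000000
    r"(?P<level>[A-Z]+)\s+"    # e.g. WARN
    r"(?P<service>\S+):\s+"    # e.g. Chartered_accountant_Service
    r"(?P<message>.*)"         # remainder of the line
)

def split_log_entry(entry: str) -> dict:
    """Return the log entry's fields as a dict (empty dict if no match)."""
    m = LOG_PATTERN.match(entry)
    return m.groupdict() if m else {}

entry = ("2025-01-01T00:53:36.000000 WARN Chartered_accountant_Service: "
         "Restarting security module key0=d6c40d4c")
fields = split_log_entry(entry)
print(fields["level"])    # WARN
print(fields["service"])  # Chartered_accountant_Service
```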
## Model tree for aronip/SN-T5-Base-FT

Base model: google-t5/t5-base