RakutenAI-2.0-mini

Model Description

RakutenAI-2.0-mini is a lightweight Japanese language model trained from scratch using a transformer architecture, designed for efficient performance in resource-constrained environments. As a foundation model, it serves as the backbone for instruct models.

If you are looking for an instruct model, see RakutenAI-2.0-mini-instruct.

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "Rakuten/RakutenAI-2.0-mini"

# Load the tokenizer and the model; torch_dtype="auto" keeps the checkpoint's
# dtype (BF16) and device_map="auto" places weights on the available device(s).
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype="auto", device_map="auto")
model.eval()

requests = [
    "南硫黄島原生自然環境保全地域は、自然",
    "The capybara is a giant cavy rodent",
]

for req in requests:
    # Tokenize the prompt and move the tensors to the model's device.
    input_text = tokenizer(req, return_tensors="pt").to(device=model.device)
    # Sample up to 512 new tokens; use the EOS token id for padding,
    # since the tokenizer defines no dedicated pad token.
    tokens = model.generate(
        **input_text,
        max_new_tokens=512,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
    out = tokenizer.decode(tokens[0], skip_special_tokens=True)
    print("INPUT:\n" + req)
    print("OUTPUT:\n" + out)

Model Details

Limitations and Bias

The suite of RakutenAI-2.0 models is capable of generating human-like text on a wide range of topics. However, like all LLMs, they have limitations and can produce biased, inaccurate, or unsafe outputs. Please exercise caution and judgment while interacting with them.

Citation

For citing our work on the suite of RakutenAI-2.0 models, please use:

@misc{rakutengroup2025rakutenai2.0,
  author = {Rakuten Group, Inc.},
  title = {RakutenAI-2.0},
  year = {2025},
  publisher = {Hugging Face},
  url = {https://huggingface.co/Rakuten},
}
Model size: 1.53B parameters (Safetensors, BF16)