
Run the Hugging Face RWKV World Model

CPU

from transformers import AutoModelForCausalLM, AutoTokenizer

# The World tokenizer ships as custom code with the checkpoint,
# so loading it requires trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained("BBuf/RWKV-4-World-169M")
tokenizer = AutoTokenizer.from_pretrained("BBuf/RWKV-4-World-169M", trust_remote_code=True)

text = "\nIn a shocking finding, scientists discovered a herd of dragons living in a remote, previously unexplored valley in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese."
prompt = f'Question: {text.strip()}\n\nAnswer:'

inputs = tokenizer(prompt, return_tensors="pt")
# Greedy decoding (the generate default), up to 256 new tokens.
output = model.generate(inputs["input_ids"], max_new_tokens=256)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))

output:

Question: In a shocking finding, scientists discovered a herd of dragons living in a remote, previously unexplored valley in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese.

Answer: The researchers discovered a mysterious finding in a remote, undisclosed valley, in a remote, undisclosed valley.
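Greedy decoding, the generate default, tends to loop, as the repetitive answer above shows. Sampling usually helps. A minimal sketch using standard transformers generation arguments; the prompt, temperature, and top_p values are illustrative, not tuned for this model:

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("BBuf/RWKV-4-World-169M")
tokenizer = AutoTokenizer.from_pretrained("BBuf/RWKV-4-World-169M", trust_remote_code=True)

prompt = "Question: Tell me about dragons.\n\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
# do_sample=True switches from greedy decoding to multinomial sampling;
# temperature and top_p are illustrative values, not tuned.
output = model.generate(
    inputs["input_ids"],
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,
    top_p=0.5,
)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))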

GPU

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the weights in float32 and move the model to the first CUDA device (cuda:0).
model = AutoModelForCausalLM.from_pretrained("BBuf/RWKV-4-World-169M", torch_dtype=torch.float32).to(0)
tokenizer = AutoTokenizer.from_pretrained("BBuf/RWKV-4-World-169M", trust_remote_code=True)

text = "你叫什么名字?"  # "What is your name?"
prompt = f'Question: {text.strip()}\n\nAnswer:'

inputs = tokenizer(prompt, return_tensors="pt").to(0)  # inputs must be on the same device as the model
output = model.generate(inputs["input_ids"], max_new_tokens=40)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))

output:

Question: 你叫什么名字? (What is your name?)

Answer: 我是一个人工智能语言模型,没有具体的身份或者特征,也没有能力进行人类的任何任务 (I am an artificial intelligence language model; I have no specific identity or characteristics, and no ability to carry out any human task.)
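The float32 load above works on any CUDA device. On memory-constrained GPUs the weights can instead be loaded in half precision; a minimal sketch assuming the standard torch/transformers APIs (float16 may slightly change the generated text):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# float16 roughly halves GPU memory use compared with float32.
model = AutoModelForCausalLM.from_pretrained(
    "BBuf/RWKV-4-World-169M", torch_dtype=torch.float16
).to(0)
tokenizer = AutoTokenizer.from_pretrained("BBuf/RWKV-4-World-169M", trust_remote_code=True)

prompt = "Question: 你叫什么名字?\n\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=40)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))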