Usage
from transformers import AutoTokenizer, AutoModelForCausalLM

# Set nli_v2_model_name to this model's repository id; trust_remote_code is required for the custom chat code.
nli_tokenizer = AutoTokenizer.from_pretrained(nli_v2_model_name, trust_remote_code=True)
nli_model = AutoModelForCausalLM.from_pretrained(nli_v2_model_name, device_map="auto", trust_remote_code=True).eval()

# premise and hypothesis are the two sentences to compare; the Chinese prompt asks the model to decide
# whether they are clearly consistent (entailment), inconsistent (contradiction), or undecidable (neutral),
# and to answer with entailment (蕴含), neutral (中性), or contradiction (矛盾).
query = f"以下提供两个句子,你的工作是选择这两个句子是否明确一致(蕴含)、不一致(矛盾)或者是否无法确定(中立)。你的答案必须是entailment(蕴含)、neutral(中性)或contradiction(矛盾)。\n句子1:{premise}\n句子2:{hypothesis}"

response, history = nli_model.chat(nli_tokenizer, query, history=None)
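Since the prompt instructs the model to answer with one of the three labels in English, the free-text reply can be mapped back to a single label. Below is a minimal sketch of such post-processing; the helper name `parse_nli_label` and the keyword-matching rule are illustrative assumptions, not part of the original card:

```python
def parse_nli_label(response: str) -> str:
    """Map the model's free-text reply to an NLI label (illustrative helper, not from the card)."""
    text = response.lower()
    for label in ("entailment", "contradiction", "neutral"):
        if label in text:  # the prompt asks the model to reply with one of these keywords
            return label
    return "neutral"  # assumed fallback when no keyword is found

label = parse_nli_label(response)
```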