ONNX version of gyr66/bert-base-chinese-finetuned-ner

This model is a conversion of gyr66/bert-base-chinese-finetuned-ner to ONNX format using the 🤗 Optimum library.
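Because the repository ships an ONNX export produced with Optimum, it can presumably be loaded through Optimum's ONNX Runtime backend and used like any other token-classification model. The sketch below is an illustration, not documented usage from the model authors: it assumes the repository id protectai/gyr66-bert-base-chinese-finetuned-ner-onnx and a standard Optimum export layout, and the example sentence is made up.

```python
# Minimal sketch: load the ONNX export with Optimum's ONNX Runtime backend
# and run it through a standard token-classification pipeline.
# Assumes the repo id below and a default ONNX export layout.
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline

model_id = "protectai/gyr66-bert-base-chinese-finetuned-ner-onnx"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForTokenClassification.from_pretrained(model_id)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge word-piece predictions into entity spans
)

print(ner("王小明在北京的清华大学读书。"))  # example Chinese sentence for NER
```

A conversion of this kind is typically produced with the Optimum CLI, e.g. `optimum-cli export onnx --model gyr66/bert-base-chinese-finetuned-ner <output_dir>`, though the exact command used here is not stated in the card.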
