---
license: mit
base_model: meta-llama/Llama-3.2-3B
library_name: peft
tags:
  - llama-3.2
  - unsloth
  - lora
  - peft
  - fine-tuned
  - doctor
  - dental
  - medical
  - instruction-tuning
  - adapter
datasets:
  - BirdieByte1024/doctor-dental-llama-qa
---

# 🦷 doctor-dental-implant-LoRA-llama3.2-3B

This is a LoRA adapter trained on top of meta-llama/Llama-3.2-3B using Unsloth, to align the model with doctor–patient conversations and dental-implant Q&A.

The adapter improves the model's instruction-following and medical dialogue within the dental-implant domain (e.g., Straumann® surgical workflows).


## 🔧 Model Details

- **Base model:** meta-llama/Llama-3.2-3B
- **Adapter type:** LoRA via PEFT
- **Framework:** Unsloth
- **Quantization for training:** QLoRA (bnb 4-bit)
- **Training objective:** Instruction-tuning on domain-specific dialogue
- **Dataset:** BirdieByte1024/doctor-dental-llama-qa
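As a reminder of what a LoRA adapter actually stores: the base weight matrix `W` stays frozen, and only a low-rank update `B @ A`, scaled by `alpha / r`, is learned. A minimal pure-Python sketch (the rank, alpha, and shapes below are illustrative assumptions, not this adapter's actual configuration):

```python
# Illustrative LoRA update: W' = W + (alpha / r) * B @ A
# All shapes and values are hypothetical; they do not reflect this adapter's config.
r, alpha = 2, 4          # assumed low-rank dimension and scaling factor
d_out, d_in = 4, 4       # assumed weight-matrix dimensions

def matmul(X, Y):
    # Plain nested-list matrix multiply
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

W = [[1.0 if i == j else 0.0 for j in range(d_in)] for i in range(d_out)]  # frozen base weight
B = [[0.1] * r for _ in range(d_out)]   # trained down-projection (d_out x r)
A = [[0.2] * d_in for _ in range(r)]    # trained up-projection (r x d_in)

scale = alpha / r
delta = matmul(B, A)                    # low-rank update, rank <= r
W_adapted = [[W[i][j] + scale * delta[i][j] for j in range(d_in)] for i in range(d_out)]
```

Only `A` and `B` (a few megabytes) ship in this repository; the 3B base weights are downloaded separately.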

## 🧠 Dataset

Training data comes from [BirdieByte1024/doctor-dental-llama-qa](https://huggingface.co/datasets/BirdieByte1024/doctor-dental-llama-qa), a collection of doctor–patient conversations on dental-implant topics.

## 💬 Expected Prompt Format

```json
{
  "conversation": [
    { "from": "patient", "value": "What is the purpose of a healing abutment?" },
    { "from": "doctor", "value": "It helps shape the gum tissue and protect the implant site during healing." }
  ]
}
```
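The chat template used during training is not documented here, but a conversation record in the format above can be flattened into a plain-text prompt along these lines (the `Patient:`/`Doctor:` labels and the trailing `Doctor:` cue are assumptions for illustration):

```python
# Flatten a conversation record into a plain-text prompt.
# The "Patient:"/"Doctor:" labels are an assumed convention, not a documented template.
ROLE_LABELS = {"patient": "Patient", "doctor": "Doctor"}

def build_prompt(record):
    lines = []
    for turn in record["conversation"]:
        label = ROLE_LABELS.get(turn["from"], turn["from"].title())
        lines.append(f"{label}: {turn['value']}")
    lines.append("Doctor:")  # cue the model to answer in the doctor role
    return "\n".join(lines)

record = {"conversation": [
    {"from": "patient", "value": "What is the purpose of a healing abutment?"},
]}
print(build_prompt(record))
# Patient: What is the purpose of a healing abutment?
# Doctor:
```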

## 💻 How to Use the Adapter

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base model and tokenizer
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-3B")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")

# Attach the LoRA adapter
model = PeftModel.from_pretrained(base, "BirdieByte1024/doctor-dental-implant-LoRA-llama3.2-3B")
model.eval()

# Generate a reply
inputs = tokenizer("Patient: What is the purpose of a healing abutment?\nDoctor:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## ✅ Intended Use

- Domain adaptation for dental and clinical chatbots
- Offline inference for healthcare-specific assistants
- Safe instruction-following aligned with patient communication

## ⚠️ Limitations

- Not a diagnostic tool
- May hallucinate or oversimplify
- Based on non-clinical and synthetic data

## 🛠 Authors

Developed by BirdieByte1024.
Fine-tuned using Unsloth and PEFT.


## 📜 License

MIT