Model Description

This model is fine-tuned from Mistral 7B v0.3 to improve its coding capabilities, particularly in Python. It was trained on a dataset of 18,000 Python samples formatted with Alpaca-style prompt instructions.
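
As a rough illustration (an assumption, not the author's actual preprocessing script), each training row would have been rendered into a single string with the Alpaca template that also appears in the inference code below; the field names instruction, input, and output are hypothetical, based on the standard Alpaca format:

# Minimal sketch of how one training row could be formatted with the Alpaca
# template shown further below. Field names are assumed, not taken from the
# actual dataset.
def format_sample(sample, alpaca_prompt, eos_token):
    return alpaca_prompt.format(
        sample["instruction"],   # task description
        sample["input"],         # optional extra context (may be empty)
        sample["output"],        # reference Python solution
    ) + eos_token                # append EOS so the model learns when to stop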

Please refer to this repository when using the model.

To perform inference using these LoRA adapters, please use the following code:

# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "xformers<0.0.27" "trl<0.9.0" peft accelerate bitsandbytes
from unsloth import FastLanguageModel
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "MouezYazidi/PyMistral-7b-Genius_LoRA",
    max_seq_length = 2048,
    dtype = None,
    load_in_4bit = True,
)
FastLanguageModel.for_inference(model) # Enable native 2x faster inference

alpaca_prompt = """Below is an instruction describing a task, along with an input providing additional context. Your task is to generate a clear, concise, and accurate Python code response that fulfills the given request.

### Instruction:
{}

### Input:
{}

### Response:
{}"""

inputs = tokenizer(
[
    alpaca_prompt.format(
        "", # instruction
        """Write a Python function that generates and prints the first n rows of Pascal's Triangle. Ensure the function accepts a positive integer n as input and produces the rows in a well-formatted structure (e.g., lists within a list or as strings). If you use any external libraries, make sure to explicitly import them in your code.""", # input
        "", # output - leave this blank for generation!
    )
], return_tensors = "pt").to("cuda")

from transformers import TextStreamer
text_streamer = TextStreamer(tokenizer)
_ = model.generate(**inputs, streamer = text_streamer, max_new_tokens = 512)

The output is:

<s> Below is an instruction describing a task, along with an input providing additional context. Your task is to generate a clear, concise, and accurate Python code response that fulfills the given request.

### Instruction:


### Input:
Write a Python function that generates and prints the first n rows of Pascal's Triangle. Ensure the function accepts a positive integer n as input and produces the rows in a well-formatted structure (e.g., lists within a list or as strings). If you use any external libraries, make sure to explicitly import them in your code.

### Response:
def pascal_triangle(n):
   triangle = [[1]]
   for i in range(1, n):
       row = [1]
       for j in range(1, i):
           row.append(triangle[i-1][j-1] + triangle[i-1][j])
       row.append(1)
       triangle.append(row)
   return triangle

print(pascal_triangle(5))</s>
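
If you would rather capture the generated text than stream it, one option (not part of the original example) is to decode the generated ids and keep only what follows the ### Response: marker:

# Decode without a streamer and extract only the generated response
outputs = model.generate(**inputs, max_new_tokens = 512)
decoded = tokenizer.batch_decode(outputs, skip_special_tokens = True)[0]
response = decoded.split("### Response:")[-1].strip()
print(response)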

Uploaded model

  • Developed by: MouezYazidi
  • License: apache-2.0
  • Finetuned from model: unsloth/mistral-7b-v0.3-bnb-4bit

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
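
For reference, fine-tuning with Unsloth and TRL typically follows the pattern sketched below. This is an assumption about the training setup, not the exact script used to produce these adapters; the LoRA rank, hyperparameters, and the dataset placeholder are illustrative only.

from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments

# Load the 4-bit base model and attach LoRA adapters (illustrative settings)
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/mistral-7b-v0.3-bnb-4bit",
    max_seq_length = 2048,
    load_in_4bit = True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,                      # illustrative LoRA rank
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj"],
    lora_alpha = 16,
)

# dataset = ...  # placeholder: the 18k Alpaca-formatted Python samples,
#                # with the rendered prompts stored in a "text" column
trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    dataset_text_field = "text",
    max_seq_length = 2048,
    args = TrainingArguments(
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        learning_rate = 2e-4,
        num_train_epochs = 1,
        output_dir = "outputs",
    ),
)
trainer.train()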
