Model Card for Phi-3.5-mini-thinking-function_calling-V0
This model is a fine-tuned version of microsoft/Phi-3.5-mini-instruct, trained on the Jofthomas/hermes-function-calling-thinking-V1 dataset for function calling.
This toy model was trained as an alternative exercise for the Unit 1 bonus section of the AI Agents course.
💥 The training script used to adapt the given example notebook to Phi-3.5-mini-instruct can be found here.
Example usage
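Before running the prompt below, load the model and tokenizer. The following is a minimal loading sketch, not taken from the original card: it assumes a CUDA GPU and that the checkpoint can be loaded directly with transformers' AutoModelForCausalLM; if the repository only contains a PEFT adapter, load it with peft's AutoPeftModelForCausalLM instead.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; adjust if you use a local copy or a merged checkpoint.
model_name = "andreagemelli/Phi-3.5-mini-thinking-function_calling-V0"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # or torch.float16, depending on your GPU
).to("cuda")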
prompt = """<|user|>
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags.You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.Here are the available tools:<tools> [{'type': 'function', 'function': {'name': 'convert_currency', 'description': 'Convert from one currency to another', 'parameters': {'type': 'object', 'properties': {'amount': {'type': 'number', 'description': 'The amount to convert'}, 'from_currency': {'type': 'string', 'description': 'The currency to convert from'}, 'to_currency': {'type': 'string', 'description': 'The currency to convert to'}}, 'required': ['amount', 'from_currency', 'to_currency']}}}, {'type': 'function', 'function': {'name': 'calculate_distance', 'description': 'Calculate the distance between two locations', 'parameters': {'type': 'object', 'properties': {'start_location': {'type': 'string', 'description': 'The starting location'}, 'end_location': {'type': 'string', 'description': 'The ending location'}}, 'required': ['start_location', 'end_location']}}}] </tools>Use the following pydantic model json schema for each tool call you will make: {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{tool_call}
</tool_call>Also, before making a call to a function take the time to plan the function to take. Make that thinking process between <think>{your thoughts}</think>
Hi, I need to convert 500 USD to Euros. Can you help me with that?<|end|>
<|assistant|>
<think>"""
# Stop generation at the end-of-text token.
eos_token_id = tokenizer.encode('<|endoftext|>')[0]

inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)
inputs = {k: v.to("cuda") for k, v in inputs.items()}

outputs = model.generate(
    **inputs,
    max_new_tokens=300,  # adapt as necessary
    do_sample=True,
    top_p=0.95,
    temperature=0.01,    # near-greedy decoding
    repetition_penalty=1.0,
    eos_token_id=eos_token_id,
)
print(tokenizer.decode(outputs[0]))
You should expect a result similar to the following, in which the model successfully "thinks" and then calls a function to answer the user:
<|user|> You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags.You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.Here are the available tools:<tools> [{'type': 'function', 'function': {'name': 'convert_currency', 'description': 'Convert from one currency to another', 'parameters': {'type': 'object', 'properties': {'amount': {'type': 'number', 'description': 'The amount to convert'}, 'from_currency': {'type': 'string', 'description': 'The currency to convert from'}, 'to_currency': {'type': 'string', 'description': 'The currency to convert to'}}, 'required': ['amount', 'from_currency', 'to_currency']}}}, {'type': 'function', 'function': {'name': 'calculate_distance', 'description': 'Calculate the distance between two locations', 'parameters': {'type': 'object', 'properties': {'start_location': {'type': 'string', 'description': 'The starting location'}, 'end_location': {'type': 'string', 'description': 'The ending location'}}, 'required': ['start_location', 'end_location']}}}] </tools> Use the following pydantic model json schema for each tool call you will make: {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{tool_call}
</tool_call> Also, before making a call to a function take the time to plan the function to take. Make that thinking process between <think> {your thoughts}</think>
Hi, I need to convert 500 USD to Euros. Can you help me with that?<|end|><|assistant|><think> Okay, so the user is asking to convert 500 USD to Euros. I need to figure out which function to use. Looking at the available tools, there's a function called convert_currency that does exactly that. It requires the amount, the currency to convert from, and the currency to convert to. The user provided all these details: 500 USD to Euros. So, I'll call the convert_currency function with those parameters.
I should make sure to format the function call correctly, using the specified JSON schema. The function name is 'convert_currency', and the arguments are amount: 500, from_currency: 'USD', and to_currency: 'EUR'. That should do it.
Alright, I think that's all I need to do here. The user's request is straightforward, and the function is perfectly suited for this task.
<|tool|><|tool|>
{'name': 'convert_currency', 'arguments': {'amount': 500, 'from_currency': 'USD', 'to_currency': 'EUR'}}
<|tool|><|end|><|tool|>
<tool_response>
{'converted_amount': 425.5, 'from_currency': 'USD', 'to_currency': 'EUR'}
</tool_response><|end|><|assistant|> Sure, 500 USD is approximately 425.50 Euros.<|end|>
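To act on the generated tool call programmatically, you can extract the content between the tool-call tags and parse it. The helper below is an illustrative sketch, not part of the original card: it assumes the model emits the <tool_call>...</tool_call> tags requested by the prompt, and it uses ast.literal_eval rather than json.loads because the model prints Python-style dicts with single quotes.

import ast
import re

def extract_tool_calls(generated_text: str) -> list[dict]:
    """Hypothetical helper: pull every <tool_call>...</tool_call> block out of
    the decoded output and parse it into a Python dict."""
    calls = []
    for block in re.findall(r"<tool_call>(.*?)</tool_call>", generated_text, re.DOTALL):
        # literal_eval handles the single-quoted dicts the model produces.
        calls.append(ast.literal_eval(block.strip()))
    return calls

# e.g. extract_tool_calls(tokenizer.decode(outputs[0]))
# -> [{'name': 'convert_currency', 'arguments': {'amount': 500, 'from_currency': 'USD', 'to_currency': 'EUR'}}]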
Training procedure
This model was trained with supervised fine-tuning (SFT).
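For illustration only, a TRL SFT run on the dataset mentioned above looks roughly like the sketch below. The hyperparameters, split name, and omitted preprocessing are assumptions, not the exact settings used for this model; in particular, the conversations in the dataset need to be rendered into text with the Phi-3.5 chat template before training, which the linked training script handles.

from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Assumed split name; the real script also converts each conversation into a
# single training string with the chat template before this step.
dataset = load_dataset("Jofthomas/hermes-function-calling-thinking-V1", split="train")

training_args = SFTConfig(
    output_dir="Phi-3.5-mini-thinking-function_calling-V0",
    num_train_epochs=1,              # assumed value
    per_device_train_batch_size=2,   # assumed value
    learning_rate=1e-4,              # assumed value
    max_seq_length=1500,             # assumed value
)

trainer = SFTTrainer(
    model="microsoft/Phi-3.5-mini-instruct",  # base model named in this card
    args=training_args,
    train_dataset=dataset,
)
trainer.train()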
Framework versions
- TRL: 0.15.1
- Transformers: 4.48.3
- Pytorch: 2.5.1+cu124
- Datasets: 3.3.2
- Tokenizers: 0.21.0
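To approximate this environment, the listed versions can be pinned directly, for example:

pip install trl==0.15.1 transformers==4.48.3 torch==2.5.1 datasets==3.3.2 tokenizers==0.21.0

(The +cu124 suffix on the PyTorch version refers to the CUDA 12.4 build; install the wheel that matches your CUDA setup.)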
Citations
Cite TRL as:
@misc{vonwerra2022trl,
    title        = {{TRL: Transformer Reinforcement Learning}},
    author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
    year         = 2020,
    journal      = {GitHub repository},
    publisher    = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}