jeffreymeetkai committed on
Commit f510263 · verified · 1 Parent(s): 7ad2869

Update README.md

Files changed (1): README.md (+34 −13)
README.md CHANGED

@@ -28,8 +28,8 @@ We provide custom code for both converting tool definitions into the system prom
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-tokenizer = AutoTokenizer.from_pretrained("meetkai/functionary-small-v2.5", trust_remote_code=True)
-model = AutoModelForCausalLM.from_pretrained("meetkai/functionary-small-v2.5", device_map="auto", trust_remote_code=True)
+tokenizer = AutoTokenizer.from_pretrained("meetkai/functionary-medium-v3.1", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("meetkai/functionary-medium-v3.1", device_map="auto", trust_remote_code=True)
 
 tools = [
     {
@@ -61,9 +61,9 @@ print(tokenizer.decode(pred.cpu()[0]))
 
 ## Prompt Template
 
-We convert function definitions to a similar text to TypeScript definitions. Then we inject these definitions as system prompts. After that, we inject the default system prompt. Then we start the conversation messages.
+We convert function definitions to a similar text to Meta's Llama 3.1 definitions. Then we inject these definitions as system prompts. After that, we inject the default system prompt. Then we start the conversation messages.
 
-This formatting is also available via our vLLM server which we process the functions into Typescript definitions encapsulated in a system message and use a pre-defined Transformers chat template. This means that lists of messages can be formatted for you with the apply_chat_template() method within our server:
+This formatting is also available via our vLLM server which we process the functions into definitions encapsulated in a system message and use a pre-defined Transformers chat template. This means that lists of messages can be formatted for you with the apply_chat_template() method within our server:
 
 ```python
 from openai import OpenAI
@@ -101,21 +101,42 @@ will yield:
 ```
 <|start_header_id|>system<|end_header_id|>
 
-// Supported function definitions that should be called when necessary.
-namespace functions {
-
-// Get the current weather
-type get_current_weather = (_: {
-// The city and state, e.g. San Francisco, CA
-location: string,
-}) => any;
-
-} // namespace functions<|eot_id|><|start_header_id|>user<|end_header_id|>
+Environment: ipython
+
+Cutting Knowledge Date: December 2023
+
+
+You have access to the following functions:
+
+Use the function 'get_current_weather' to 'Get the current weather'
+{"name": "get_current_weather", "description": "Get the current weather", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}},"required": ["location"]}}
+
+
+Think very carefully before calling functions.
+If a you choose to call a function ONLY reply in the following format:
+<{start_tag}={function_name}>{parameters}{end_tag}
+where
+
+start_tag => `<function`
+parameters => a JSON dict with the function argument name as key and function argument value as value.
+end_tag => `</function>`
+
+Here is an example,
+<function=example_function_name>{"example_name": "example_value"}</function>
+
+Reminder:
+- If looking for real time information use relevant functions before falling back to brave_search
+- Function calls MUST follow the specified format, start with <function= and end with </function>
+- Required parameters MUST be specified
+- Only call one function at a time
+- Put the entire function call reply on one line
+
+<|eot_id|><|start_header_id|>user<|end_header_id|>
 
 What is the weather for Istanbul?
 ```
 
-A more detailed example is provided [here](https://github.com/MeetKai/functionary/blob/main/tests/prompt_test_v2.llama3.txt).
+A more detailed example is provided [here](https://github.com/MeetKai/functionary/blob/main/tests/prompt_test_v3-llama3.1.txt).
 
 ## Run the model
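
The new prompt template instructs the model to reply with calls of the form `<function=name>{json args}</function>`. A client consuming raw completions needs to detect and decode that shape; the sketch below is a minimal illustration of doing so (the helper name `parse_function_call` and the regex are my own, not part of the Functionary repo):

```python
import json
import re

# Matches the documented reply format:
#   <function=example_function_name>{"example_name": "example_value"}</function>
CALL_RE = re.compile(r"<function=(?P<name>[^>]+)>(?P<args>.*?)</function>", re.DOTALL)

def parse_function_call(reply: str):
    """Return (function_name, arguments_dict), or None if the reply is plain text."""
    match = CALL_RE.search(reply)
    if match is None:
        return None
    # The parameters are a JSON dict of argument name -> argument value.
    return match.group("name"), json.loads(match.group("args"))

print(parse_function_call('<function=get_current_weather>{"location": "Istanbul"}</function>'))
# → ('get_current_weather', {'location': 'Istanbul'})
```

Since the template requires the entire call on one line and only one call at a time, a single non-greedy search like this is sufficient; a production client would also want to validate the arguments against the tool's JSON schema.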