Is there a way to enable structured output? #22
Comments
Thank you for opening the issue! There is no such feature yet, but it's definitely interesting! Do you think you could provide an outline of how it should work? Like, what do you want to achieve?
A vanilla function calling implementation can be done by appending Python function signatures to the system prompts.
@rajaswa-postman can you post an example of how this would look?
Sure. For an OpenAI function call with the following example:

```json
"functions": [
  {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type": "string",
          "description": "The city and state, e.g. San Francisco, CA"
        },
        "unit": {
          "type": "string",
          "enum": ["celsius", "fahrenheit"]
        }
      },
      "required": ["location"]
    }
  }
]
```

We can have the following appended to the system prompt (note the outer string uses triple single quotes so the docstring's `"""` doesn't terminate it early):

```python
functions_string = '''
You can respond with a function call by giving out the function name and its arguments in a structured JSON format:
{
    "name": function_name,
    "arguments": { arg_name: arg_value }
}
You can choose the most suitable function from the following list of Python functions to respond with function calls:
def get_current_weather(location: str, unit: str = "celsius"):
    """
    Get the current weather in a given location.
    :param location: The city and state, e.g. San Francisco, CA.
    :param unit: The temperature unit to use, either "celsius" or "fahrenheit" (default is "celsius").
    :return: Current weather information for the specified location.
    """
'''
```
This will obviously be very limited, since we don't know how OpenAI has trained their model to respond with function calls.
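On the consuming side, a reply in the JSON format described above still has to be parsed and routed to the right function. A minimal dispatch sketch (the registry and stub function are illustrative, not from any library):

```python
import json

def get_current_weather(location: str, unit: str = "celsius"):
    # Stub for illustration; a real implementation would call a weather API.
    return f"Weather in {location} ({unit}): sunny"

# Map the names the model is allowed to call to real callables.
REGISTRY = {"get_current_weather": get_current_weather}

def dispatch(reply: str):
    """Parse a reply like {"name": ..., "arguments": {...}} and invoke
    the matching registered function, rejecting unknown names."""
    call = json.loads(reply)
    fn = REGISTRY.get(call["name"])
    if fn is None:
        raise ValueError(f"unknown function: {call['name']}")
    return fn(**call.get("arguments", {}))

result = dispatch(
    '{"name": "get_current_weather", "arguments": {"location": "San Francisco, CA"}}'
)
# result == "Weather in San Francisco, CA (celsius): sunny"
```

Validating the name against a registry (and letting `json.loads` fail loudly on malformed output) gives a basic guard against the model inventing functions.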
I am actually building a LlamaJsonformer right now using this library! Example:

```python
keywords = ['Linguistics', 'Medicine', 'Engineering', 'Music', 'Performing Arts',
            'Sociology', 'Women in STEM', 'Diversity', 'Political Science',
            'Automotive Repair', 'Multilingual Studies', 'Global Studies']

context = '''Sara is a second-year linguistics student at University College London. She is currently taking courses in phonetics, syntax, and sociolinguistics. Sara enjoys her phonetics lectures, where she is learning about the physical properties of speech sounds. For her syntax course, she is studying how words are put together to form phrases and sentences. Her sociolinguistics class examines how language varies based on social factors like gender, ethnicity, and socioeconomic status. While sociolinguistics is fascinating, Sara finds some of the course readings quite dense. Overall, she is glad to be pursuing her passion for the study of language and hopes to continue her linguistics studies after completing her bachelor's degree.'''

prompt = "Choose the best, most relevant keywords from the options list, based on the context. Generate factual information as json based on the following schema and context:"

output
```
Still hallucinating some stuff, but it's valid JSON! As I said, hope to release the edits soon :) B ✌️
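One cheap way to tame hallucinated keywords in this setup is a post-processing pass that filters the generated list against the allowed options. This is a sketch of a generic cleanup step (the `filter_keywords` helper is hypothetical, not part of jsonformer):

```python
allowed = {'Linguistics', 'Medicine', 'Engineering', 'Music'}

def filter_keywords(generated, allowed):
    # Keep only keywords that actually appear in the options list,
    # matching case-insensitively to absorb casing drift, and
    # dropping duplicates while preserving order.
    by_lower = {k.lower(): k for k in allowed}
    seen = []
    for kw in generated:
        canon = by_lower.get(kw.lower())
        if canon and canon not in seen:
            seen.append(canon)
    return seen

filter_keywords(['linguistics', 'Astrology', 'Music'], allowed)
# → ['Linguistics', 'Music']
```

A schema-level fix (e.g. an `enum` constraint on the keyword field, which jsonformer's schema format supports for strings) would prevent the hallucination at generation time rather than repairing it afterward.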
@bcarsley any updates on the LlamaJsonformer?
... like OpenAI function calling or jsonformer?