
Releases: alexeichhorn/typegpt

0.3.5

06 Aug 21:03

Support for the new, cheaper gpt-4o-2024-08-06 model

0.3.4

02 Aug 21:03
2d06316

Fixed a decoding issue

0.3.3

18 Jul 19:31

Full support for gpt-4o-mini

0.3.2

14 May 12:42
  • Support for gpt-4o (including the new tokenizer)
  • Fixed the maximum allowed tokens of the standard gpt-3.5-turbo model (now correctly 16K)
  • Thrown LLMException objects now contain valuable information, such as the compiled system and user prompts

0.3.1

17 Apr 09:41

Support for openai library version >= 1.20.0

0.3

11 Apr 16:56
7e48162
  • Support for easy and typed few-shot prompting. Just implement the few_shot_examples function inside your prompt:
class ExamplePrompt(PromptTemplate):

    class Output(BaseLLMResponse):
        class Ingredient(BaseLLMResponse):
            name: str
            quantity: int

        ingredients: list[Ingredient]

    def system_prompt(self) -> str:
        return "Given a recipe, extract the ingredients."

    def few_shot_examples(self) -> list[FewShotExample[Output]]:
        return [
            FewShotExample(
                input="Let's take two apples, three bananas, and four oranges.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="apple", quantity=2),
                    self.Output.Ingredient(name="banana", quantity=3),
                    self.Output.Ingredient(name="orange", quantity=4),
                ])
            ),
            FewShotExample(
                input="My recipe requires five eggs and two cups of flour.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="egg", quantity=5),
                    self.Output.Ingredient(name="flour cups", quantity=2),
                ])
            )
        ]

    def user_prompt(self) -> str:
        ...
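Conceptually, few-shot examples like these are typically injected as alternating user/assistant message pairs between the system prompt and the actual user prompt. A minimal, library-free sketch of that idea (the build_messages helper and its plain-string output format are illustrative assumptions, not typegpt's actual API):

```python
from dataclasses import dataclass

@dataclass
class FewShotExample:
    input: str
    output: str  # serialized example output, in the format the LLM should produce

def build_messages(system_prompt: str, examples: list[FewShotExample], user_prompt: str) -> list[dict]:
    # System prompt first, then each example as a user/assistant pair,
    # then the real user prompt last
    messages = [{"role": "system", "content": system_prompt}]
    for ex in examples:
        messages.append({"role": "user", "content": ex.input})
        messages.append({"role": "assistant", "content": ex.output})
    messages.append({"role": "user", "content": user_prompt})
    return messages

msgs = build_messages(
    "Given a recipe, extract the ingredients.",
    [FewShotExample(input="Let's take two apples.", output="apple x2")],
    "Five eggs and two cups of flour.",
)
```

The pairing matters: the model sees each example input answered immediately in the desired output format, which often reduces the need for explicit formatting instructions.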
  • You can disable the automatically added system instruction that tells the LLM how to format its output. Only do this if you use few-shot prompting, a fine-tuned model, or instruct the model yourself (not recommended). Use it like this:
class ExamplePrompt(PromptTemplate):

    settings = PromptSettings(disable_formatting_instructions=True)

    ...
  • Support for the new gpt-4-turbo-2024-04-09 / gpt-4-turbo model

0.2.2

01 Feb 22:13
a122d48

Support for the new gpt-3.5-turbo-0125 model version

0.2.1

25 Jan 19:53
10219c0

Fully supports the new gpt-4-0125-preview model and the new gpt-4-turbo-preview model alias.

0.2

18 Jan 21:54
df9150b

You can now nest response types. Note that you need to use BaseLLMArrayElement for classes you want to nest inside a list. To add instructions to an element of a BaseLLMArrayElement, you must use LLMArrayElementOutput instead of LLMOutput.

class Output(BaseLLMResponse):

    class Item(BaseLLMArrayElement):

        class Description(BaseLLMResponse):
            short: str | None
            long: str

        title: str
        description: Description
        price: float = LLMArrayElementOutput(instruction=lambda pos: f"The price of the {pos.ordinal} item")

    items: list[Item]
    count: int

Bug fix

12 Nov 17:28
8d84f28
Pre-release

Fixed an issue where TypeOpenAI was not importable