feat (provider/togetherai): Add TogetherAI provider. #3781
base: main
Conversation
Force-pushed from ec3e6b1 to feb0fe3
Force-pushed from fcffdd5 to 8c6420c
Force-pushed from 8c6420c to 10915e2
Force-pushed from 10915e2 to 6d44bf7
export function mapOpenAICompatibleCompletionLogProbs(
  logprobs: OpenAICompatibleCompletionLogProps | null | undefined,
): LanguageModelV1LogProbs | undefined {
  return logprobs?.tokens.map((token, index) => ({
    token,
    logprob: logprobs.token_logprobs[index],
    topLogprobs: logprobs.top_logprobs
      ? Object.entries(logprobs.top_logprobs[index]).map(
          ([token, logprob]) => ({
            token,
            logprob,
          }),
        )
      : [],
  }));
}
Let's remove logprobs from the compatible provider. I'm leaning towards moving it into a provider extension since it's rarely supported.
// TODO(shaper): This is really model-specific, move to config or elsewhere?
defaultObjectGenerationMode?: 'json' | 'tool' | undefined;
Can you make it a constructor parameter (in the options) for the model that is then defined in the providers (which would have that knowledge)?
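The suggestion above can be sketched roughly as follows: the default object generation mode moves out of the shared settings and into the per-model config that each concrete provider constructs. All names here are assumptions for illustration, not the actual implementation.

```typescript
// Sketch: the provider, which knows its models, supplies the default
// object generation mode via the config it passes to the constructor.
type ObjectGenerationMode = 'json' | 'tool' | undefined;

interface OpenAICompatibleChatConfig {
  provider: string;
  // Hypothetical field: provider-supplied knowledge about the model.
  defaultObjectGenerationMode?: ObjectGenerationMode;
}

class OpenAICompatibleChatLanguageModel {
  readonly defaultObjectGenerationMode: ObjectGenerationMode;

  constructor(
    readonly modelId: string,
    private readonly config: OpenAICompatibleChatConfig,
  ) {
    this.defaultObjectGenerationMode = config.defaultObjectGenerationMode;
  }
}

// A provider such as TogetherAI would then construct models with its
// own model-specific default:
const model = new OpenAICompatibleChatLanguageModel('some-model-id', {
  provider: 'togetherai.chat',
  defaultObjectGenerationMode: 'tool',
});
```

This keeps the shared `openai-compatible` code free of model-specific knowledge while letting each provider declare what its models support.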
// TODO(shaper): Review vs. OpenAI impl here.
throw new UnsupportedFunctionalityError({
  functionality: 'object-json mode',
});
Copy what OpenAI has, including schemas.
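As a hedged sketch of what "including schemas" could look like: the OpenAI-style approach requests strict schema-constrained output when a JSON schema is available, and falls back to plain JSON mode otherwise. The helper name and shape below are assumptions, not the actual implementation.

```typescript
// Sketch: choose the response_format for object-json mode, modeled on
// the OpenAI provider's behavior (names assumed for illustration).
type ResponseFormat =
  | { type: 'json_object' }
  | { type: 'json_schema'; json_schema: { name: string; schema: unknown } };

function jsonResponseFormat(
  schema: unknown | undefined,
  name = 'response',
): ResponseFormat {
  // With a schema, ask for schema-constrained JSON; without one,
  // fall back to unconstrained JSON object output.
  return schema != null
    ? { type: 'json_schema', json_schema: { name, schema } }
    : { type: 'json_object' };
}
```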
constructor(
  modelId: OpenAICompatibleChatModelId,
  settings: OpenAICompatibleChatSettings,
  config: OpenAICompatibleChatConfig,
include the default object generation mode in config
completionTokens: response.usage.completion_tokens,
},
finishReason: mapOpenAICompatibleFinishReason(choice.finish_reason),
logprobs: mapOpenAICompatibleCompletionLogProbs(choice.logprobs),
rm logprobs
logprobs: z
  .object({
    tokens: z.array(z.string()),
    token_logprobs: z.array(z.number()),
    top_logprobs: z.array(z.record(z.string(), z.number())).nullable(),
  })
  .nullish(),
rm logprobs
logprobs: z
  .object({
    tokens: z.array(z.string()),
    token_logprobs: z.array(z.number()),
    top_logprobs: z.array(z.record(z.string(), z.number())).nullable(),
  })
  .nullish(),
same
// 'openai-organization': 'test-organization',
// 'openai-project': 'test-project',
rm
// organization: 'test-organization',
// project: 'test-project',
rm
/**
Override the parallelism of embedding calls.
 */
supportsParallelCalls?: boolean;
move to config?
(first 2 options)
// TODO(shaper): Reconcile this with openai-error.ts. We derived from `xai`.

export const openAICompatibleErrorDataSchema = z.object({
  code: z.string(),
  error: z.string(),
});

export type OpenAICompatibleErrorData = z.infer<
  typeof openAICompatibleErrorDataSchema
>;

export const openAICompatibleFailedResponseHandler =
  createJsonErrorResponseHandler({
    errorSchema: openAICompatibleErrorDataSchema,
    errorToMessage: data => data.error,
  });
Make this provider-specific; different providers have different error structures. Provide a default that matches OpenAI.
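One way to read this suggestion, sketched without the zod plumbing: each provider supplies its own error structure (schema plus message extractor), and the shared package ships a default matching OpenAI's `{ error: { message } }` envelope. Every name below is an assumption for illustration.

```typescript
// Sketch: a per-provider error structure with an OpenAI-shaped default.
interface ProviderErrorStructure<T> {
  isError: (value: unknown) => value is T;
  errorToMessage: (data: T) => string;
}

// Default: OpenAI-style nested error envelope.
type OpenAIErrorData = { error: { message: string } };

const defaultOpenAIErrorStructure: ProviderErrorStructure<OpenAIErrorData> = {
  isError: (v): v is OpenAIErrorData =>
    typeof v === 'object' && v !== null && 'error' in v,
  errorToMessage: data => data.error.message,
};

// A provider with a flatter structure (e.g. the xai-derived { code, error }
// shape shown above) overrides the default:
type XaiErrorData = { code: string; error: string };

const xaiErrorStructure: ProviderErrorStructure<XaiErrorData> = {
  isError: (v): v is XaiErrorData =>
    typeof v === 'object' && v !== null && 'code' in v,
  errorToMessage: data => data.error,
};
```

In the real code this would presumably flow into `createJsonErrorResponseHandler` via the provider's config, with the OpenAI-shaped structure used when none is given.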
L extends string = string,
C extends string = string,
E extends string = string,
Prefer full upper-case words for generics, e.g. CONTEXT.
// TODO(shaper):
// - consider throwing if baseUrl, name, sufficient api key info not available
// - force only 'compatible' -- look into whether we can remove some 'strict' logic/configs entirely
Yes. The goal is that this works with compatible mode but is more flexible. Eventually the OpenAI provider should support only strict mode, and we can simplify.
Built on top of a new `openai-compatible` package for better code sharing as we add more top-level providers that follow this pattern.