support for passing "functions" to chat completions #17


Merged
merged 5 commits on Jun 19, 2023

Conversation

ChuckJonas
Contributor

@ChuckJonas ChuckJonas commented Jun 14, 2023

OpenAI just released support for "Functions". This is a first pass at supporting that behavior.

https://openai.com/blog/function-calling-and-other-api-updates

NOTE: this has been lightly tested and seems to be working

src/types.ts Outdated
parameters: ObjectSchema;
};

type JSONSchema = ObjectSchema | StringSchema | NumberSchema | BooleanSchema;
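For illustration, the union above might be built from schema shapes like the following. This is a hedged sketch, not the library's actual definitions: the property names follow JSON Schema conventions, and `weatherFn` is a hypothetical function definition used only as an example.

```typescript
// Hypothetical sketch of the schema types behind the JSONSchema union above.
// Property names follow JSON Schema conventions, not this library's real code.
type StringSchema = { type: "string"; description?: string; enum?: string[] };
type NumberSchema = { type: "number"; description?: string };
type BooleanSchema = { type: "boolean"; description?: string };
type ObjectSchema = {
  type: "object";
  properties: Record<string, JSONSchema>;
  required?: string[];
};
type JSONSchema = ObjectSchema | StringSchema | NumberSchema | BooleanSchema;

// Example function definition expressed with these types
const weatherFn: { name: string; description?: string; parameters: ObjectSchema } = {
  name: "get_current_weather",
  description: "Get the current weather for a location",
  parameters: {
    type: "object",
    properties: {
      location: { type: "string", description: "City and state, e.g. Boston, MA" },
      unit: { type: "string", enum: ["celsius", "fahrenheit"] },
    },
    required: ["location"],
  },
};
```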
Contributor Author

These might not be the full set of supported types and some of the types might be missing optional properties... but it's a start

Collaborator

Good start, especially given the lack of docs on this. This seems fine to me.

Contributor Author

Testing Array right now and it seems to work... The docs also mention anyOf, but I assume they also support allOf and oneOf. Might just wait and add these in the future as we have time to test that they actually work.

- "function_call" response property
src/types.ts Outdated
@@ -615,7 +660,11 @@ export interface ChatCompletion {
message: {
name?: string;
role: "system" | "assistant" | "user";
content: string;
content: string | null;
Contributor Author
@ChuckJonas ChuckJonas Jun 14, 2023

This is a somewhat "breaking" change, but with the introduction of functions, it seems this value can now be null.

OpenAI example response:

{
  "id": "chatcmpl-123",
  ...
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": null,
      "function_call": {
        "name": "get_current_weather",
        "arguments": "{ \"location\": \"Boston, MA\"}"
      }
    },
    "finish_reason": "function_call"
  }]
}
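As a sketch of what consuming such a response looks like, the handler below mirrors the shapes in the OpenAI example above; the interface names and the `handleMessage` helper are hypothetical, not part of this library. The key detail is that `arguments` arrives as a JSON-encoded string and has to be parsed.

```typescript
// Sketch: consuming a function_call response. Shapes mirror the OpenAI
// example above; these names are illustrative, not the library's exact types.
interface FunctionCall {
  name: string;
  arguments: string; // JSON-encoded string, not an object
}
interface ChoiceMessage {
  role: string;
  content: string | null;
  function_call?: FunctionCall;
}

function handleMessage(message: ChoiceMessage): string {
  if (message.function_call) {
    // arguments must be parsed before use
    const args = JSON.parse(message.function_call.arguments);
    return `call ${message.function_call.name} with ${JSON.stringify(args)}`;
  }
  // content can be null when a function call is returned instead
  return message.content ?? "";
}
```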

Collaborator

This feels really bad to me. I hate having to tell TypeScript to ignore the null every single time I need to extract the content.

I feel like we could do more intelligent typing by looking at the functions and function_call parameter to see if there is a possibility of the response being null. Would you be willing to try doing this?

Contributor Author

Ya, I had the same thought. Shouldn't be too hard to compose the types to get it to work that way.

Contributor Author

@lino-levan so... it is definitely possible, but I think it would require much more complicated typings with generics and conditional types...

Typescript Playground showing a POC

A simpler solution might just be to keep the type as is, but default the content property to an empty string if it's null...
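That simpler alternative is essentially a one-line normalization in the response handling. A minimal sketch, with a hypothetical helper name:

```typescript
// Sketch of the null-coalesce approach: the API returns content: null for
// function_call responses, so normalize it to "" before exposing it.
function normalizeContent(content: string | null): string {
  return content ?? "";
}
```

The trade-off, discussed below, is that the returned object then no longer matches the raw API response byte-for-byte.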

Collaborator

I think part of the philosophy of this library is to try to pass the raw response as often as possible. Transforming the response even for something as minor as this would mean that people couldn't rely on open ai docs. The complex typings look nice, I'd accept this PR if it were to be added.

Contributor Author
@ChuckJonas ChuckJonas Jun 15, 2023

Accidentally edited the wrong comment (sorry for the mess). This previously said something like:

This is possible, but the deeply nested properties make the conditional typing very complex.

async createChatCompletion<T extends ChatCompletionOptions>(
    options: T
  ): Promise<
    T extends { functions: any }
      ? Omit<ChatCompletion, "choices"> & {
          choices: Omit<ChatCompletionChoice, "message"> & {
            message: Omit<ChatCompletionChoiceMessage, "content"> & {
              content: string | null;
            };
          };
        }
      : ChatCompletion
  > {

I think this would have a negative impact on developer experience. IMO, the "null coalesce" to "" is the superior solution, but I respect whatever you want to do.

I may actually abandon the PR anyway, because I'm having issues with the error handling (throwError) of this library swallowing status code and other raw request details.

Collaborator

IMO, the "null coalesce" to "" is the superior solution, but I respect whatever you want to do.

Thanks for the insight. I see where you're coming from on the typing front. Maybe worth revisiting in the future, but after further pondering I think a "null coalesce" is more than reasonable.

I may actually abandon the PR anyway, because I'm having issues with the error handling (throwError) of this library swallowing status code and other raw request details.

Error handling was kind of hacked in after the fact 😅. If you have any ideas on how to improve it, I'm all ears!

Contributor Author
@ChuckJonas ChuckJonas Jun 15, 2023

Created ticket #18 to track

Collaborator

Sounds good, thanks for the issue.

@ChuckJonas ChuckJonas force-pushed the chat-function-support branch from 93c6b5a to 6779b07 Compare June 14, 2023 01:11
@lino-levan lino-levan self-requested a review June 14, 2023 01:59
@load1n9 load1n9 self-requested a review June 14, 2023 16:20
Owner
@load1n9 load1n9 left a comment

LGTM, can you add tests?

Collaborator
@lino-levan lino-levan left a comment

Overall the code looks fine, but the typing could use some work. Please refer to my comment above.

@load1n9 load1n9 requested a review from lino-levan June 14, 2023 18:14
@blakechambers
Contributor

Something to consider here: I played around with this some locally and ended up needing to extract a ChatCompletionMessage type. The problem I was seeing is there is some drift between the messages in ChatCompletion and ChatCompletionOptions – meaning I got some typing issues trying to pass message replies back in during subsequent requests.
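One way to resolve that drift is a single shared message type used by both the request options and the response, so a reply can be pushed straight back into the next request's messages array. A hedged sketch, with hypothetical type names that may not match the library's final API:

```typescript
// Sketch: one message type shared between requests and responses, so a
// reply from ChatCompletion round-trips into the next ChatCompletionOptions.
interface ChatCompletionMessage {
  role: "system" | "assistant" | "user";
  content: string | null;
  name?: string;
  function_call?: { name: string; arguments: string };
}

// Hypothetical slimmed-down options type using the shared message type
interface ChatCompletionOptionsSketch {
  model: string;
  messages: ChatCompletionMessage[];
}

// A response message flows back into the next request without casts:
const reply: ChatCompletionMessage = { role: "assistant", content: "Hello!" };
const next: ChatCompletionOptionsSketch = {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hi" }, reply],
};
```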

@lino-levan
Collaborator

I played around with this some locally and ended up needing to extract a ChatCompletionMessage type

Ah, very interesting. Seems like a good idea.

- null coalesce content.message to empty string
- added example
@ChuckJonas
Contributor Author

LGTM, can you add tests?

I don't see any actual unit tests in the project? I added an "example" file.

I also added the null coalesce... It's not the most beautiful solution, but as discussed, I'm not sure if there is a better option.

Let me know if there is anything else needed to get this out.

Collaborator
@lino-levan lino-levan left a comment

LGTM. Nice! Thanks for the comment explaining why we are null coalescing.

Owner
@load1n9 load1n9 left a comment

LGTM

@load1n9 load1n9 merged commit 38ffa5e into load1n9:main Jun 19, 2023
@lino-levan
Collaborator

Thanks for the PR @ChuckJonas!
