Enable passing secrets to action server. #11

Closed
bakar-io wants to merge 1 commit

Conversation

@bakar-io commented May 16, 2024

Related to Sema4AI/langchain#1

import base64
import json

toolkit = ActionServerToolkit(url=kwargs["url"], api_key=kwargs["api_key"])

def _build_context(secrets: dict) -> str:
    ctx = json.dumps({"secrets": secrets})
    return base64.b64encode(ctx.encode("utf-8")).decode("ascii")

Member commented:

What would actually be stored in the database? It might be easier to assume that secrets is already in the right format and to do the base64 encoding etc. before storing the data.

Author (bakar-io) replied:

This PR takes the approach of having the config look like this:

from typing import Optional

class ActionServerConfig(ToolConfig):
    url: str
    api_key: str
    secrets: Optional[dict]

I followed these docs for context encoding.

Here's an example assistant config in the database:

{
    "configurable": {
        "type": "agent",
        "type==agent/agent_type": "GPT 3.5 Turbo",
        "type==agent/interrupt_before_action": false,
        "type==agent/retrieval_description": "Can be used to look up information that was uploaded to this assistant.\nIf the user is referencing particular files, that is often a good hint that information may be here.\nIf the user asks a vague question, they are likely meaning to look up info from this retriever, and you should call it!",
        "type==agent/system_message": "You are a helpful assistant.",
        "type==agent/tools": [
            {
                "id": "f7015ef0-5a84-4da0-b2f1-f37ba09c548d",
                "type": "action_server_by_sema4ai",
                "name": "Action Server by Sema4.ai",
                "description": "Run AI actions with [Sema4.ai Action Server](https://github.com/Sema4AI/actions).",
                "config": {
                    "url": "https://sixty-five-chatty-rats.robocorp.link",
                    "api_key": "cbHz3L1yx8_0SMBsuqjcEtDvY4CZEm2HuJSMmT-U8oY",
                    "secrets": {
                        "mypwd": "green"
                    }
                }
            }
        ],
        "type==chat_retrieval/llm_type": "GPT 3.5 Turbo",
        "type==chat_retrieval/system_message": "You are a helpful assistant.",
        "type==chatbot/llm_type": "GPT 3.5 Turbo",
        "type==chatbot/system_message": "You are a helpful assistant."
    }
}
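
For illustration, here is what the secrets in this example config would encode to with _build_context above (a minimal sketch; the printed value is just the base64-encoded JSON payload):

import base64
import json

# Secrets taken from the example config above.
secrets = {"mypwd": "green"}
ctx = json.dumps({"secrets": secrets})
print(base64.b64encode(ctx.encode("utf-8")).decode("ascii"))
# eyJzZWNyZXRzIjogeyJteXB3ZCI6ICJncmVlbiJ9fQ==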

@mkorpela (Member) commented:

@bakar-io How about if we add an "additionalHeaders" (or similar) key to the config, with key:value pairs as the value? Then you can place whatever headers and their values there.

The value could then be stored in the db exactly as it is sent and handled on the calling side. This might also make it easier to later add testing capabilities for Action Server calls that are not directly linked to the OpenGPTs backend.
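
For comparison, a hedged sketch of what the tool config could look like under this alternative; the header name and encoded value below are assumptions for illustration, not something this PR or the Action Server docs prescribe:

config = {
    "url": "https://sixty-five-chatty-rats.robocorp.link",
    "api_key": "cbHz3L1yx8_0SMBsuqjcEtDvY4CZEm2HuJSMmT-U8oY",
    # Stored exactly as it will be sent; the backend just forwards these headers.
    "additionalHeaders": {
        # "x-action-context" is an assumed header name, used here only as an example.
        "x-action-context": "eyJzZWNyZXRzIjogeyJteXB3ZCI6ICJncmVlbiJ9fQ==",
    },
}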

@bakar-io (Author) replied:

> @bakar-io How about if we add an "additionalHeaders" (or similar) key to the config, with key:value pairs as the value? Then you can place whatever headers and their values there.
>
> The value could then be stored in the db exactly as it is sent and handled on the calling side. This might also make it easier to later add testing capabilities for Action Server calls that are not directly linked to the OpenGPTs backend.

That was one of the options considered. I spoke with Kari this morning and he was torn between these two options.

Can you elaborate on "The value could then be stored in the db exactly as it is sent and handled on the calling side"?

@mkorpela (Member) replied:

"The value could then be directly stored as it is sent in the db and handled on the calling side." === I would like as little logic to OpenGPTs backend as possible.

@bakar-io bakar-io marked this pull request as draft May 16, 2024 12:11
@bakar-io bakar-io closed this May 17, 2024
@bakar-io bakar-io deleted the action-server-secrets branch May 17, 2024 10:32