
Generation error: Response from LLM is not in JSON format #15

Open
theotherp opened this issue Jul 17, 2024 · 2 comments
Labels: bug (Something isn't working)

Comments


theotherp commented Jul 17, 2024

❯ magic-cli config list
Field: llm
Value: "openai"
Description: The LLM to use for generating responses. Supported values: "ollama", "openai"

Field: ollama.base_url
Value: "http://localhost:11434"
Description: The base URL of the Ollama API.

Field: ollama.embedding_model
Value: "nomic-embed-text:latest"
Description: The model to use for generating embeddings.

Field: ollama.model
Value: "codestral:latest"
Description: The model to use for generating responses.

Field: openai.api_key (secret)
Value: **********************************************************
Description: The API key for the OpenAI API.

Field: openai.embedding_model
Value: "gpt-4o"
Description: The model to use for generating embeddings.

Field: openai.model
Value: "gpt-4o"
Description: The model to use for generating responses.

Field: suggest.add_to_history
Value: false
Description: Whether to add the suggested command to the shell history.

Field: suggest.mode
Value: "clipboard"
Description: The mode to use for suggesting commands. Supported values: "clipboard" (copying command to clipboard), "unsafe-execution" (executing in the current shell session)
❯ magic-cli suggest "get kubernetes pods with most memory usage in all namespaces"
Generating suggested command for prompt "get kubernetes pods with most memory usage in all namespaces"...

Error: Generation error: Generation error: Response from LLM is not in JSON format
❯                                                                

Version: magic-cli 0.0.2

❯ magic-cli sys-info
System information as detected by the CLI:

OS: Windows
OS version: 10 (19045)
CPU architecture: x86_64
Shell: pwsh
guywaldman added the bug label on Jul 17, 2024
tnorthcutt commented

Ran into this myself as well. Here's my version, config, system info, and attempt:

» magic-cli --version  
magic-cli 0.0.2
» magic-cli config list
Field: llm 
Value: "openai"
Description: The LLM to use for generating responses. Supported values: "ollama", "openai"

Field: ollama.base_url 
Value: "http://localhost:11434"
Description: The base URL of the Ollama API.

Field: ollama.embedding_model 
Value: "nomic-embed-text:latest"
Description: The model to use for generating embeddings.

Field: ollama.model 
Value: "codestral:latest"
Description: The model to use for generating responses.

Field: openai.api_key (secret)
Value: **********************************************************
Description: The API key for the OpenAI API.

Field: openai.embedding_model 
Value: "text-embedding-ada-002"
Description: The model to use for generating embeddings.

Field: openai.model 
Value: "gpt-4o"
Description: The model to use for generating responses.

Field: suggest.add_to_history 
Value: false
Description: Whether to add the suggested command to the shell history.

Field: suggest.mode 
Value: "clipboard"
Description: The mode to use for suggesting commands. Supported values: "clipboard" (copying command to clipboard), "unsafe-execution" (executing in the current shell session)
» magic-cli sys-info                                             
System information as detected by the CLI:

OS: Darwin
OS version: 14.5
CPU architecture: arm64
Shell: zsh
» magic-cli suggest "convert all png to jpg in current directory"
Generating suggested command for prompt "convert all png to jpg in current directory"...

Error: Generation error: Generation error: Response from LLM is not in JSON format

guywaldman assigned guywaldman and Ronbalt and unassigned guywaldman and Ronbalt on Jul 17, 2024
guywaldman (Owner) commented

Thanks, folks. This is an "alignment" issue; I have some ideas around it and will fix it as soon as possible.

cc @Ronbalt
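
A note on the failure mode: since the CLI reports that the response "is not in JSON format", it presumably parses the model's reply as a bare JSON object, and a common cause of this error is the model wrapping its answer in a Markdown code fence or adding prose around it. Below is a minimal sketch, in Rust, of the kind of lenient parsing that can work around that; the helper name is hypothetical and this is not magic-cli's actual code:

```rust
use serde_json::Value;

/// Hypothetical helper: tolerate Markdown fences and surrounding prose
/// by extracting the outermost `{ ... }` span before parsing.
fn parse_lenient(raw: &str) -> Option<Value> {
    // Strip a leading ```json (or bare ```) fence and a trailing ``` if present.
    let trimmed = raw
        .trim()
        .trim_start_matches("```json")
        .trim_start_matches("```")
        .trim_end_matches("```")
        .trim();

    // Fall back to the substring between the first '{' and the last '}'.
    let start = trimmed.find('{')?;
    let end = trimmed.rfind('}')?;
    serde_json::from_str(&trimmed[start..=end]).ok()
}

fn main() {
    // A typical "misaligned" reply: prose plus a fenced JSON block.
    let reply = "Sure! Here's the command:\n```json\n{\"command\": \"kubectl top pods -A\"}\n```";
    assert!(parse_lenient(reply).is_some());
}
```

A stricter alternative is to tighten the prompt (or use the provider's JSON response mode, where available) so the model returns raw JSON in the first place; the lenient parse above is just a defensive fallback.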
