❯ magic-cli config list
Field: llm
Value: "openai"
Description: The LLM to use for generating responses. Supported values: "ollama", "openai"
Field: ollama.base_url
Value: "http://localhost:11434"
Description: The base URL of the Ollama API.
Field: ollama.embedding_model
Value: "nomic-embed-text:latest"
Description: The model to use for generating embeddings.
Field: ollama.model
Value: "codestral:latest"
Description: The model to use for generating responses.
Field: openai.api_key (secret)
Value: **********************************************************
Description: The API key for the OpenAI API.
Field: openai.embedding_model
Value: "gpt-4o"
Description: The model to use for generating embeddings.
Field: openai.model
Value: "gpt-4o"
Description: The model to use for generating responses.
Field: suggest.add_to_history
Value: false
Description: Whether to add the suggested command to the shell history.
Field: suggest.mode
Value: "clipboard"
Description: The mode to use for suggesting commands. Supported values: "clipboard" (copying command to clipboard), "unsafe-execution" (executing in the current shell session)
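One thing stands out in this config: `openai.embedding_model` is set to `"gpt-4o"`, which is a chat model, not an embedding model, so any call to OpenAI's embeddings endpoint with it would fail. That may or may not be related to this error, but it seems worth correcting first. Assuming the `config set` subcommand accepts the same keys that `config list` prints, something like:

❯ magic-cli config set openai.embedding_model "text-embedding-ada-002"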
❯ magic-cli suggest "get kubernetes pods with most memory usage in all namespaces"
Generating suggested command for prompt "get kubernetes pods with most memory usage in all namespaces"...
Error: Generation error: Generation error: Response from LLM is not in JSON format
❯
Version: magic-cli 0.0.2
❯ magic-cli sys-info
System information as detected by the CLI:
OS: Windows
OS version: 10 (19045)
CPU architecture: x86_64
Shell: pwsh
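A plausible root cause, given that gpt-4o is the configured model: chat models often wrap JSON answers in a markdown code fence, and a strict parse of the raw reply then fails with exactly this kind of "not in JSON format" error. Below is a minimal Rust repro of that failure mode; this is a guess at the mechanism, not magic-cli's actual parsing code, and the fenced reply shape is an assumption.

```rust
// Hypothetical repro of the suspected failure mode (requires serde_json).
use serde_json::Value;

/// Strip a ```json ... ``` (or plain ```) fence if the model added one.
fn extract_json(raw: &str) -> &str {
    let trimmed = raw.trim();
    trimmed
        .strip_prefix("```json")
        .or_else(|| trimmed.strip_prefix("```"))
        .and_then(|s| s.strip_suffix("```"))
        .unwrap_or(trimmed)
        .trim()
}

fn main() {
    // A typical fenced reply from the model (assumed shape).
    let reply = "```json\n{\"command\": \"kubectl top pods -A --sort-by=memory\"}\n```";

    // A strict parse of the raw reply fails, mirroring the reported error.
    assert!(serde_json::from_str::<Value>(reply).is_err());

    // Parsing the unfenced payload succeeds.
    let parsed: Value = serde_json::from_str(extract_json(reply)).expect("valid JSON");
    println!("{}", parsed["command"].as_str().unwrap_or(""));
}
```

If that is what is happening, stripping the fence before parsing (or requesting JSON mode from the API, as sketched further down) would make `suggest` resilient to it.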
Ran into this myself as well. Here's my version, config, system info, and attempt:
» magic-cli --version
magic-cli 0.0.2
» magic-cli config list
Field: llm
Value: "openai"
Description: The LLM to use for generating responses. Supported values: "ollama", "openai"
Field: ollama.base_url
Value: "http://localhost:11434"
Description: The base URL of the Ollama API.
Field: ollama.embedding_model
Value: "nomic-embed-text:latest"
Description: The model to use for generating embeddings.
Field: ollama.model
Value: "codestral:latest"
Description: The model to use for generating responses.
Field: openai.api_key (secret)
Value: **********************************************************
Description: The API key for the OpenAI API.
Field: openai.embedding_model
Value: "text-embedding-ada-002"
Description: The model to use for generating embeddings.
Field: openai.model
Value: "gpt-4o"
Description: The model to use for generating responses.
Field: suggest.add_to_history
Value: false
Description: Whether to add the suggested command to the shell history.
Field: suggest.mode
Value: "clipboard"
Description: The mode to use for suggesting commands. Supported values: "clipboard" (copying command to clipboard), "unsafe-execution" (executing in the current shell session)
» magic-cli sys-info
System information as detected by the CLI:
OS: Darwin
OS version: 14.5
CPU architecture: arm64
Shell: zsh
» magic-cli suggest "convert all png to jpg in current directory"
Generating suggested command for prompt "convert all png to jpg in current directory"...
Error: Generation error: Generation error: Response from LLM is not in JSON format
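Notably, this report uses the default `text-embedding-ada-002` and still fails, so the embedding model alone doesn't explain the error; that points back at how the suggestion response is parsed. For what it's worth, OpenAI's chat completions API has a JSON mode (`"response_format": {"type": "json_object"}`) that forces the model to return a bare JSON object, which would rule out fenced replies. A sketch of such a request, assuming a raw reqwest call rather than whatever client magic-cli actually uses; the prompt and response handling here are illustrative only.

```rust
// Sketch of forcing JSON output at the API level.
// Requires reqwest = { features = ["blocking", "json"] } and serde_json.
use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;
    let body = json!({
        "model": "gpt-4o",
        // JSON mode: the model must return a single valid JSON object.
        "response_format": { "type": "json_object" },
        "messages": [
            { "role": "system",
              "content": "Reply with a JSON object: {\"command\": \"<shell command>\"}." },
            { "role": "user",
              "content": "convert all png to jpg in current directory" }
        ]
    });

    let resp: Value = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .json()?;

    // With JSON mode the content string parses directly, no fence stripping.
    let content = resp["choices"][0]["message"]["content"].as_str().unwrap_or("");
    let parsed: Value = serde_json::from_str(content)?;
    println!("{}", parsed["command"].as_str().unwrap_or(""));
    Ok(())
}
```

Note that JSON mode requires the word "JSON" to appear somewhere in the messages; the system prompt above satisfies that.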