Issues: containers/ai-lab-recipes

Issues list

Typo in computer vision recipe
#654 opened Jul 2, 2024 by jeffmaury
Remove Deepspeed and VLLM
#574 opened Jun 17, 2024 by cooktheryan
MODEL_ENDPOINT hard coded for Makefile run command [enhancement]
#561 opened Jun 13, 2024 by HunterGerlach
llama-cpp-server broken [bug]
#547 opened Jun 11, 2024 by jeffmaury
build a RHEL based Milvus [enhancement]
#538 opened Jun 7, 2024 by cooktheryan
https://quay.io/repository/ai-lab/llamacpp-python-cuda is provided only for x86_64 [bug] [enhancement]
#525 opened Jun 3, 2024 by jeffmaury
ilab serve with vllm - error on 3x GPU system [bug]
#514 opened May 23, 2024 by markmc
Recipe idea: LLM agent framework [feature]
#500 opened May 17, 2024 by hemajv
[Feature] model evaluation + metrics [AI] [feature] [testing]
#356 opened Apr 28, 2024 by Gregory-Pereira