The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
The Extension for Scikit-learn is a seamless way to speed up your Scikit-learn application.
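A minimal sketch of how such an extension is typically enabled, assuming the project is Intel's scikit-learn-intelex (the `sklearnex` package); the dataset and estimator below are illustrative only:

```python
# Sketch assuming the sklearnex package; patching must happen before
# scikit-learn estimators are imported so they resolve to accelerated versions.
from sklearnex import patch_sklearn

patch_sklearn()  # transparently accelerates supported scikit-learn estimators

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Existing scikit-learn code stays unchanged after patching.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))
```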
oneAPI Data Analytics Library (oneDAL)
The easiest way to use Machine Learning. Mix and match underlying ML libraries and data set sources. Generate new datasets or modify existing ones with ease.
Client library to interact with various APIs used within Philips in a simple and uniform way
llama.cpp + ROCm + llama-swap
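Stacks like this usually expose an OpenAI-compatible HTTP endpoint; a minimal sketch of querying one, assuming a llama-swap proxy (or llama.cpp's llama-server) listening on localhost:8080 — the port, model name, and prompt are placeholders, not values from this repository:

```python
# Sketch assuming an OpenAI-compatible /v1/chat/completions endpoint on localhost:8080.
import json
import urllib.request

payload = {
    "model": "llama-3.1-8b-instruct",  # llama-swap selects/loads models by this name
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}
req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
print(body["choices"][0]["message"]["content"])
```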
No more Hugging Face cost leaks.
Add a description, image, and links to the ai-inference topic page so that developers can more easily learn about it.
To associate your repository with the ai-inference topic, visit your repo's landing page and select "manage topics."