👩🏻‍🍳 Haystack Cookbook

[Logo: Haystack, by deepset. Haystack 2.0 is live 🎉]

🧑‍🍳🍳 Discover the Haystack Cookbook here

A collection of example notebooks using Haystack 👇

You can use these examples as guidelines for working with different model providers, vector databases, retrieval techniques, and more with Haystack. Most of them are small, focused demos.
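Most of the notebooks below follow the same Haystack 2.x pattern: components wired into a `Pipeline`. The sketch below is a minimal, illustrative version assuming `haystack-ai` is installed and an OpenAI API key is available; the model, documents, and component choices are placeholders, and the individual cookbooks swap in other providers, document stores, and retrieval techniques.

```python
# Minimal Haystack 2.x RAG sketch (assumes `pip install haystack-ai` and OPENAI_API_KEY set).
from haystack import Document, Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# Index a couple of toy documents into an in-memory store.
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="Haystack is an open-source framework for building LLM applications."),
    Document(content="Haystack pipelines connect components such as retrievers and generators."),
])

# Prompt template that stuffs the retrieved documents into the context.
template = """Answer the question using only the context below.
Context:
{% for doc in documents %}- {{ doc.content }}
{% endfor %}
Question: {{ question }}
Answer:"""

# Wire retriever -> prompt builder -> generator into a pipeline.
rag = Pipeline()
rag.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
rag.add_component("prompt_builder", PromptBuilder(template=template))
rag.add_component("generator", OpenAIGenerator(model="gpt-4o-mini"))
rag.connect("retriever.documents", "prompt_builder.documents")
rag.connect("prompt_builder.prompt", "generator.prompt")

question = "What is Haystack?"
result = rag.run({"retriever": {"query": question}, "prompt_builder": {"question": question}})
print(result["generator"]["replies"][0])
```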

🧑‍🍳 Guidelines on How to Contribute a Cookbook

To learn more about how to use Haystack, please visit our Docs and official Tutorials.

For more examples, you may also find our Blog useful.

Note: Unless "(Haystack 1.x)" appears in the title, all of these examples use Haystack 2.0 or later.

| Name | Colab |
| ---- | ----- |
| Improving Retrieval with Auto-Merging | Open In Colab |
| Speaker Diarization with AssemblyAI | Open In Colab |
| Advanced Prompt Customization for Anthropic | Open In Colab |
| Advanced RAG: Query Decomposition and Reasoning | Open In Colab |
| Advanced RAG: Automated Structured Metadata Enrichment | Open In Colab |
| TechCrunch News Digest with Local LLMs using TitanML Takeoff | Open In Colab |
| Use Gemini Models with Vertex AI | Open In Colab |
| Gradient AI Embedders and Generators for RAG | Open In Colab |
| Mixtral 8x7B with Hugging Face TGI for Web QA | Open In Colab |
| Amazon Bedrock and OpenSearch for PDF QA | Open In Colab |
| Use Zephyr 7B Beta with Hugging Face for RAG | Open In Colab |
| Hacker News RAG with Custom Component | Open In Colab |
| Use Chroma for RAG and Indexing | Open In Colab |
| Using the Jina-embeddings-v2-base-en model in a Haystack RAG pipeline for legal document analysis | Open In Colab |
| Multilingual RAG from a podcast with Whisper, Qdrant and Mistral | Open In Colab |
| Improve retrieval by embedding meaningful metadata | Open In Colab |
| Advanced RAG: Query Expansion | Open In Colab |
| Information extraction via LLMs (Gorilla OpenFunctions) | Open In Colab |
| Information extraction via LLMs (NexusRaven) | Open In Colab |
| Using AstraDB as a data store in your Haystack pipelines | Open In Colab |
| Streaming model explorer: compare how different models handle the same prompt | Open In Colab |
| Function Calling with OpenAIChatGenerator | Open In Colab |
| Use the vLLM inference engine in Haystack 2.x | Open In Colab |
| Build with Google Gemma: chat and RAG | Open In Colab |
| Optimizing Retrieval with HyDE | Open In Colab |
| RAG pipeline using FastEmbed for embeddings generation | Open In Colab |
| Sparse Embedding Retrieval with Qdrant and FastEmbed | Open In Colab |
| Hybrid Retrieval: BM42 + Dense Retrieval (with Qdrant and FastEmbed) | Open In Colab |
| Air-Gapped RAG pipelines with NVIDIA NIMs | Open In Colab |
| Evaluate a RAG pipeline using Haystack-UpTrain integration | Open In Colab |
| RAG on the Oscars using Llama 3.1 models | Open In Colab |
| Chatting with SQL Databases | Open In Colab |
| Evaluate a RAG pipeline using DeepEval integration | Open In Colab |
| Evaluate a RAG pipeline using Ragas integration | Open In Colab |
| Extract Metadata Filters from a Query | Open In Colab |
| Run tasks concurrently within a custom component | Open In Colab |
| Prompt Optimization with DSPy | Open In Colab |
| RAG Evaluation with Prometheus 2 | Open In Colab |
| Build quizzes and adventures with Character Codex and llamafile | Open In Colab |
| Invoking APIs with OpenAPITool | Open In Colab |
| Extract and use website content for RAG with Apify | Open In Colab |
| Analyze Your Instagram Comments’ Vibe with Apify and Haystack | Open In Colab |
| Conversational RAG using Memory | Open In Colab |
| Evaluating RAG Pipelines with EvaluationHarness | Open In Colab |
| Define & Run Tools | Open In Colab |
| Agentic RAG with Llama 3.2 3B | Open In Colab |
| Create a Swarm of Agents | Open In Colab |
| Cohere for Multilingual QA (Haystack 1.x) | Open In Colab |
| GPT-4 and Weaviate for Custom Documentation QA (Haystack 1.x) | Open In Colab |
| Whisper Transcriber and Weaviate for YouTube video QA (Haystack 1.x) | Open In Colab |

How to Contribute a Cookbook

If you have an example that uses Haystack, you can add it to this repository by creating a PR. You can also create a PR directly from Colab: fork this repository, select "Save a Copy to GitHub" in Colab, add your example to your fork, and then open a PR against this repository.

  1. Add your notebook.
  2. Give your file a descriptive name that includes (if applicable) the model providers, databases, or other technologies used in your example, and/or the task the example completes.
  3. Add an entry for it to index.toml, including its title and topics (a rough sketch of an entry follows this list).
  4. Add a row to the table above 🎉
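As a rough illustration of step 3: an index.toml entry lists the notebook's title and topics. The field names below are an assumption rather than the authoritative schema, so mirror an existing entry in index.toml rather than this sketch.

```toml
# Hypothetical entry shape (field names are assumptions; copy an existing entry in index.toml).
[[cookbook]]
title = "RAG over Cooking Blogs with ExampleDB"       # descriptive title, shown in the table above
notebook = "rag_over_cooking_blogs_exampledb.ipynb"   # file name of your notebook
topics = ["RAG", "ExampleDB"]                         # topics used to group and filter cookbooks
```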