
Commit d0e6e93

Remove explicit mention of Haystack "2.x" in tutorials (#384)
* remove 2.x note at the top
* remove more 2.0 and 2.x mentions
* remove 1.x tutorials from readme overview page
* remove outdated multiplexer tutorial from overview in readme
1 parent 50b43c1 commit d0e6e93

13 files changed: +36 -89 lines changed

README.md

+17-46
Large diffs are not rendered by default.

tutorials/27_First_RAG_Pipeline.ipynb

+2-3
@@ -12,8 +12,7 @@
 "- **Time to complete**: 10 minutes\n",
 "- **Components Used**: [`InMemoryDocumentStore`](https://docs.haystack.deepset.ai/docs/inmemorydocumentstore), [`SentenceTransformersDocumentEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformersdocumentembedder), [`SentenceTransformersTextEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder), [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever), [`PromptBuilder`](https://docs.haystack.deepset.ai/docs/promptbuilder), [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/openaichatgenerator)\n",
 "- **Prerequisites**: You must have an [OpenAI API Key](https://platform.openai.com/api-keys).\n",
-"- **Goal**: After completing this tutorial, you'll have learned the new prompt syntax and how to use PromptBuilder and OpenAIChatGenerator to build a generative question-answering pipeline with retrieval-augmentation.\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro)."
+"- **Goal**: After completing this tutorial, you'll have learned the new prompt syntax and how to use PromptBuilder and OpenAIChatGenerator to build a generative question-answering pipeline with retrieval-augmentation."
 ]
 },
 {
@@ -24,7 +23,7 @@
 "source": [
 "## Overview\n",
 "\n",
-"This tutorial shows you how to create a generative question-answering pipeline using the retrieval-augmentation ([RAG](https://www.deepset.ai/blog/llms-retrieval-augmentation)) approach with Haystack 2.0. The process involves four main components: [SentenceTransformersTextEmbedder](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder) for creating an embedding for the user query, [InMemoryBM25Retriever](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever) for fetching relevant documents, [PromptBuilder](https://docs.haystack.deepset.ai/docs/promptbuilder) for creating a template prompt, and [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/openaichatgenerator) for generating responses.\n",
+"This tutorial shows you how to create a generative question-answering pipeline using the retrieval-augmentation ([RAG](https://www.deepset.ai/blog/llms-retrieval-augmentation)) approach with Haystack. The process involves four main components: [SentenceTransformersTextEmbedder](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder) for creating an embedding for the user query, [InMemoryBM25Retriever](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever) for fetching relevant documents, [PromptBuilder](https://docs.haystack.deepset.ai/docs/promptbuilder) for creating a template prompt, and [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/openaichatgenerator) for generating responses.\n",
 "\n",
 "For this tutorial, you'll use the Wikipedia pages of [Seven Wonders of the Ancient World](https://en.wikipedia.org/wiki/Wonders_of_the_World) as Documents, but you can replace them with any text you want.\n"
 ]

tutorials/28_Structured_Output_With_Loop.ipynb

+1-3
@@ -14,10 +14,8 @@
 "- **Components Used**: `PromptBuilder`, `OpenAIChatGenerator`, `OutputValidator` (Custom component)\n",
 "- **Goal**: After completing this tutorial, you will have built a system that extracts unstructured data, puts it in a JSON schema, and automatically corrects errors in the JSON output from a large language model (LLM) to make sure it follows the specified structure.\n",
 "\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro).\n",
-"\n",
 "## Overview\n",
-"This tutorial demonstrates how to use Haystack 2.0's advanced [looping pipelines](https://docs.haystack.deepset.ai/docs/pipelines#loops) with LLMs for more dynamic and flexible data processing. You'll learn how to extract structured data from unstructured data using an LLM, and to validate the generated output against a predefined schema.\n",
+"This tutorial demonstrates how to use Haystack's advanced [looping pipelines](https://docs.haystack.deepset.ai/docs/pipelines#loops) with LLMs for more dynamic and flexible data processing. You'll learn how to extract structured data from unstructured data using an LLM, and to validate the generated output against a predefined schema.\n",
 "\n",
 "This tutorial uses `gpt-4o-mini` to change unstructured passages into JSON outputs that follow the [Pydantic](https://github.com/pydantic/pydantic) schema. It uses a custom OutputValidator component to validate the JSON and loop back to make corrections, if necessary."
 ]

tutorials/29_Serializing_Pipelines.ipynb

+1-3
@@ -12,9 +12,7 @@
 "- **Time to complete**: 10 minutes\n",
 "- **Components Used**: [`HuggingFaceLocalChatGenerator`](https://docs.haystack.deepset.ai/docs/huggingfacelocalchatgenerator), [`ChatPromptBuilder`](https://docs.haystack.deepset.ai/docs/chatpromptbuilder)\n",
 "- **Prerequisites**: None\n",
-"- **Goal**: After completing this tutorial, you'll understand how to serialize and deserialize between YAML and Python code.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro)."
+"- **Goal**: After completing this tutorial, you'll understand how to serialize and deserialize between YAML and Python code."
 ]
 },
 {
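
The serialization workflow this notebook covers comes down to round-tripping a pipeline through YAML; a minimal sketch, with a trivial placeholder pipeline:

```python
# Minimal YAML round-trip sketch; the pipeline contents are placeholders, not the tutorial's.
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder

pipeline = Pipeline()
pipeline.add_component("prompt_builder", ChatPromptBuilder())

yaml_str = pipeline.dumps()          # serialize the pipeline to a YAML string
restored = Pipeline.loads(yaml_str)  # rebuild an equivalent pipeline from that string

# File-based variants also exist:
with open("pipeline.yaml", "w") as f:
    pipeline.dump(f)
with open("pipeline.yaml", "r") as f:
    restored_from_file = Pipeline.load(f)
```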

tutorials/30_File_Type_Preprocessing_Index_Pipeline.ipynb

-2
@@ -12,8 +12,6 @@
 "- **Time to complete**: 15 minutes\n",
 "- **Goal**: After completing this tutorial, you'll have learned how to build an indexing pipeline that will preprocess files based on their file type, using the `FileTypeRouter`.\n",
 "\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro).\n",
-"\n",
 "> 💡 (Optional): After creating the indexing pipeline in this tutorial, there is an optional section that shows you how to create a RAG pipeline on top of the document store you just created. You must have a [Hugging Face API Key](https://huggingface.co/settings/tokens) for this section\n",
 "\n",
 "## Components Used\n",

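A hedged sketch of the routing idea behind this indexing pipeline: `FileTypeRouter` splits incoming file paths by MIME type so that each converter only receives files it can handle. The converter choices, MIME types, and file names below are illustrative assumptions, and the cleaning, splitting, and writing steps of the full tutorial are omitted:

```python
# Illustrative sketch of routing files by type before conversion (not the notebook's full pipeline).
from haystack import Pipeline
from haystack.components.converters import MarkdownToDocument, PyPDFToDocument, TextFileToDocument
from haystack.components.joiners import DocumentJoiner
from haystack.components.routers import FileTypeRouter

indexing = Pipeline()
indexing.add_component("router", FileTypeRouter(mime_types=["text/plain", "application/pdf", "text/markdown"]))
indexing.add_component("text_converter", TextFileToDocument())
indexing.add_component("pdf_converter", PyPDFToDocument())
indexing.add_component("md_converter", MarkdownToDocument())
indexing.add_component("joiner", DocumentJoiner())

# Each MIME type becomes its own output edge on the router
indexing.connect("router.text/plain", "text_converter.sources")
indexing.connect("router.application/pdf", "pdf_converter.sources")
indexing.connect("router.text/markdown", "md_converter.sources")
indexing.connect("text_converter.documents", "joiner.documents")
indexing.connect("pdf_converter.documents", "joiner.documents")
indexing.connect("md_converter.documents", "joiner.documents")

# docs = indexing.run({"router": {"sources": ["notes.txt", "paper.pdf", "readme.md"]}})
```
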
tutorials/31_Metadata_Filtering.ipynb

+1-3
@@ -12,9 +12,7 @@
 "- **Time to complete**: 5 minutes\n",
 "- **Components Used**: [`InMemoryDocumentStore`](https://docs.haystack.deepset.ai/docs/inmemorydocumentstore), [`InMemoryBM25Retriever`](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever)\n",
 "- **Prerequisites**: None\n",
-"- **Goal**: Filter documents in a document store based on given metadata\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro)."
+"- **Goal**: Filter documents in a document store based on given metadata"
 ]
 },
 {
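
The filtering this notebook demonstrates relies on Haystack's filter syntax, where each condition names a metadata field, an operator, and a value; a small hedged sketch with invented documents and metadata:

```python
# Illustrative metadata filtering (documents and meta fields are invented for this sketch).
from haystack import Document
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="Use pip to install a basic version of Haystack.", meta={"version": 1.15}),
    Document(content="Use pip to install Haystack 2.0.", meta={"version": 2.0}),
])

retriever = InMemoryBM25Retriever(document_store=document_store)

# Only consider documents whose meta.version is greater than 1.21
result = retriever.run(
    query="Haystack installation",
    filters={"field": "meta.version", "operator": ">", "value": 1.21},
)
print(result["documents"])

# Conditions can also be combined with a logical operator
combined = {
    "operator": "AND",
    "conditions": [
        {"field": "meta.version", "operator": ">", "value": 1.21},
        {"field": "meta.version", "operator": "<", "value": 3.0},
    ],
}
```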

tutorials/32_Classifying_Documents_and_Queries_by_Language.ipynb

+2-4
@@ -12,9 +12,7 @@
 "- **Time to complete**: 15 minutes\n",
 "- **Components Used**: [`InMemoryDocumentStore`](https://docs.haystack.deepset.ai/docs/inmemorydocumentstore), [`DocumentLanguageClassifier`](https://docs.haystack.deepset.ai/docs/documentlanguageclassifier), [`MetadataRouter`](https://docs.haystack.deepset.ai/docs/metadatarouter), [`DocumentWriter`](https://docs.haystack.deepset.ai/docs/documentwriter), [`TextLanguageRouter`](https://docs.haystack.deepset.ai/docs/textlanguagerouter), [`DocumentJoiner`](https://docs.haystack.deepset.ai/docs/documentjoiner), [`InMemoryBM25Retriever`](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever), [`ChatPromptBuilder`](https://docs.haystack.deepset.ai/docs/chatpromptbuilder), [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/openaichatgenerator)\n",
 "- **Goal**: After completing this tutorial, you'll have learned how to build a Haystack pipeline to classify documents based on the (human) language they were written in.\n",
-"- Optionally, at the end you'll also incorporate language clasification and query routing into a RAG pipeline, so you can query documents based on the language a question was written in.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro).\n"
+"- Optionally, at the end you'll also incorporate language clasification and query routing into a RAG pipeline, so you can query documents based on the language a question was written in."
 ]
 },
 {
@@ -679,7 +677,7 @@
 "If you've been following along, now you know how to incorporate language detection into query and indexing Haystack piplines. Go forth and build the international application of your dreams. 🗺️\n",
 "\n",
 "\n",
-"If you liked this tutorial, there's more to learn about Haystack 2.0:\n",
+"If you liked this tutorial, there's more to learn about Haystack:\n",
 "- [Serializing Haystack Pipelines](https://haystack.deepset.ai/tutorials/29_serializing_pipelines)\n",
 "- [Generating Structured Output with Loop-Based Auto-Correction](https://haystack.deepset.ai/tutorials/28_structured_output_with_loop)\n",
 "- [Preprocessing Different File Types](https://haystack.deepset.ai/tutorials/30_file_type_preprocessing_index_pipeline)\n",

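As a rough illustration of the indexing side described above: `DocumentLanguageClassifier` writes the detected language into each document's metadata, and `MetadataRouter` then routes documents to a per-language store. The languages, rules, and example documents below are assumptions, not the notebook's exact setup:

```python
# Hedged sketch of language-based routing at indexing time (not the notebook's exact pipeline).
from haystack import Document, Pipeline
from haystack.components.classifiers import DocumentLanguageClassifier
from haystack.components.routers import MetadataRouter
from haystack.components.writers import DocumentWriter
from haystack.document_stores.in_memory import InMemoryDocumentStore

en_store, fr_store = InMemoryDocumentStore(), InMemoryDocumentStore()

indexing = Pipeline()
indexing.add_component("classifier", DocumentLanguageClassifier(languages=["en", "fr"]))
indexing.add_component("router", MetadataRouter(rules={
    "en": {"field": "meta.language", "operator": "==", "value": "en"},
    "fr": {"field": "meta.language", "operator": "==", "value": "fr"},
}))
indexing.add_component("en_writer", DocumentWriter(document_store=en_store))
indexing.add_component("fr_writer", DocumentWriter(document_store=fr_store))

indexing.connect("classifier.documents", "router.documents")
indexing.connect("router.en", "en_writer.documents")
indexing.connect("router.fr", "fr_writer.documents")

indexing.run({"classifier": {"documents": [
    Document(content="Super appartement, très bien situé en plein centre."),
    Document(content="Lovely flat, right in the city centre."),
]}})
```
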
tutorials/33_Hybrid_Retrieval.ipynb

+1-3
@@ -12,9 +12,7 @@
 "- **Time to complete**: 15 minutes\n",
 "- **Components Used**: [`DocumentSplitter`](https://docs.haystack.deepset.ai/docs/documentsplitter), [`SentenceTransformersDocumentEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformersdocumentembedder), [`DocumentJoiner`](https://docs.haystack.deepset.ai/docs/documentjoiner), [`InMemoryDocumentStore`](https://docs.haystack.deepset.ai/docs/inmemorydocumentstore), [`InMemoryBM25Retriever`](https://docs.haystack.deepset.ai/docs/inmemorybm25retriever), [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever), and [`TransformersSimilarityRanker`](https://docs.haystack.deepset.ai/docs/transformerssimilarityranker)\n",
 "- **Prerequisites**: None\n",
-"- **Goal**: After completing this tutorial, you will have learned about creating a hybrid retrieval and when it's useful.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro)."
+"- **Goal**: After completing this tutorial, you will have learned about creating a hybrid retrieval and when it's useful."
 ]
 },
 {
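
The hybrid retrieval pattern behind this notebook runs a BM25 retriever and an embedding retriever over the same store, joins their results, and re-ranks the merged list; a hedged sketch of the query side, with indexing assumed to have happened already and default models used as placeholders:

```python
# Illustrative hybrid retrieval query pipeline (documents with embeddings are assumed to be indexed already).
from haystack import Pipeline
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.joiners import DocumentJoiner
from haystack.components.rankers import TransformersSimilarityRanker
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever, InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()  # assume documents (with embeddings) are already written

hybrid = Pipeline()
hybrid.add_component("text_embedder", SentenceTransformersTextEmbedder())
hybrid.add_component("embedding_retriever", InMemoryEmbeddingRetriever(document_store=document_store))
hybrid.add_component("bm25_retriever", InMemoryBM25Retriever(document_store=document_store))
hybrid.add_component("joiner", DocumentJoiner())                # merges both result lists
hybrid.add_component("ranker", TransformersSimilarityRanker())  # cross-encoder re-ranks the merged list

hybrid.connect("text_embedder.embedding", "embedding_retriever.query_embedding")
hybrid.connect("embedding_retriever.documents", "joiner.documents")
hybrid.connect("bm25_retriever.documents", "joiner.documents")
hybrid.connect("joiner.documents", "ranker.documents")

query = "apnea in infants"
result = hybrid.run({"text_embedder": {"text": query}, "bm25_retriever": {"query": query}, "ranker": {"query": query}})
```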

tutorials/34_Extractive_QA_Pipeline.ipynb

+4-7
@@ -11,10 +11,7 @@
 "- **Level**: Beginner\n",
 "- **Time to complete**: 15 minutes\n",
 "- **Components Used**: [`ExtractiveReader`](https://docs.haystack.deepset.ai/docs/extractivereader), [`InMemoryDocumentStore`](https://docs.haystack.deepset.ai/docs/inmemorydocumentstore), [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever), [`DocumentWriter`](https://docs.haystack.deepset.ai/docs/documentwriter), [`SentenceTransformersDocumentEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformersdocumentembedder), [`SentenceTransformersTextEmbedder`](https://docs.haystack.deepset.ai/docs/sentencetransformerstextembedder)\n",
-"- **Goal**: After completing this tutorial, you'll have learned how to build a Haystack pipeline that uses an extractive model to display where the answer to your query is.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro).\n",
-"\n"
+"- **Goal**: After completing this tutorial, you'll have learned how to build a Haystack pipeline that uses an extractive model to display where the answer to your query is."
 ]
 },
 {
@@ -108,7 +105,7 @@
 "\n",
 "The data has already been cleaned and preprocessed, so turning it into Haystack `Documents` is fairly straightfoward.\n",
 "\n",
-"Using an `InMemoryDocumentStore` here keeps things simple. However, this general approach would work with [any document store that Haystack 2.0 supports](https://docs.haystack.deepset.ai/docs/document-store).\n",
+"Using an `InMemoryDocumentStore` here keeps things simple. However, this general approach would work with [any document store that Haystack supports](https://docs.haystack.deepset.ai/docs/document-store).\n",
 "\n",
 "The `SentenceTransformersDocumentEmbedder` transforms each `Document` into a vector. Here we've used [`sentence-transformers/multi-qa-mpnet-base-dot-v1`](https://huggingface.co/sentence-transformers/multi-qa-mpnet-base-dot-v1). You can substitute any embedding model you like, as long as you use the same one in your extractive pipeline.\n",
 "\n",
@@ -656,10 +653,10 @@
 "source": [
 "## Wrapping it up\n",
 "\n",
-"If you've been following along, now you know how to build an extractive question answering pipeline with Haystack 2.0. 🎉 Thanks for reading!\n",
+"If you've been following along, now you know how to build an extractive question answering pipeline with Haystack. 🎉 Thanks for reading!\n",
 "\n",
 "\n",
-"If you liked this tutorial, there's more to learn about Haystack 2.0:\n",
+"If you liked this tutorial, there's more to learn about Haystack:\n",
 "- [Classifying Documents & Queries by Language](https://haystack.deepset.ai/tutorials/32_classifying_documents_and_queries_by_language)\n",
 "- [Generating Structured Output with Loop-Based Auto-Correction](https://haystack.deepset.ai/tutorials/28_structured_output_with_loop)\n",
 "- [Preprocessing Different File Types](https://haystack.deepset.ai/tutorials/30_file_type_preprocessing_index_pipeline)\n",

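A hedged sketch of the indexing step described in the second hunk above: embed each `Document` with `SentenceTransformersDocumentEmbedder` and write it to the store. The example document is invented; the model name is the one quoted in the diff:

```python
# Illustrative indexing for the extractive-QA tutorial (the document below is a placeholder).
from haystack import Document, Pipeline
from haystack.components.embedders import SentenceTransformersDocumentEmbedder
from haystack.components.writers import DocumentWriter
from haystack.document_stores.in_memory import InMemoryDocumentStore

model = "sentence-transformers/multi-qa-mpnet-base-dot-v1"  # must match the text embedder used at query time
document_store = InMemoryDocumentStore()

indexing = Pipeline()
indexing.add_component("embedder", SentenceTransformersDocumentEmbedder(model=model))
indexing.add_component("writer", DocumentWriter(document_store=document_store))
indexing.connect("embedder.documents", "writer.documents")

indexing.run({"embedder": {"documents": [
    Document(content="The Colossus of Rhodes was a statue of the Greek sun god Helios."),
]}})
```
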
tutorials/35_Evaluating_RAG_Pipelines.ipynb

+1-3
@@ -12,9 +12,7 @@
 "- **Time to complete**: 15 minutes\n",
 "- **Components Used**: `InMemoryDocumentStore`, `InMemoryEmbeddingRetriever`, `ChatPromptBuilder`, `OpenAIChatGenerator`, `DocumentMRREvaluator`, `FaithfulnessEvaluator`, `SASEvaluator`\n",
 "- **Prerequisites**: You must have an API key from an active OpenAI account as this tutorial is using the gpt-4o-mini model by OpenAI: https://platform.openai.com/api-keys\n",
-"- **Goal**: After completing this tutorial, you'll have learned how to evaluate your RAG pipelines both with model-based, and statistical metrics available in the Haystack evaluation offering. You'll also see which other evaluation frameworks are integrated with Haystack.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro)."
+"- **Goal**: After completing this tutorial, you'll have learned how to evaluate your RAG pipelines both with model-based, and statistical metrics available in the Haystack evaluation offering. You'll also see which other evaluation frameworks are integrated with Haystack."
 ]
 },
 {
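
A hedged sketch of two of the evaluators named above; the questions, contexts, and answers are invented, default models are used as placeholders, and `FaithfulnessEvaluator` calls OpenAI, so it needs an API key:

```python
# Illustrative standalone use of two evaluators (inputs are invented; requires OPENAI_API_KEY for faithfulness).
from haystack.components.evaluators import FaithfulnessEvaluator, SASEvaluator

questions = ["Who created the Python language?"]
contexts = [["Python was created by Guido van Rossum and first released in 1991."]]
predicted_answers = ["Guido van Rossum"]
ground_truth_answers = ["Python was created by Guido van Rossum."]

# Model-based metric: does the predicted answer follow from the retrieved context?
faithfulness = FaithfulnessEvaluator()
print(faithfulness.run(questions=questions, contexts=contexts, predicted_answers=predicted_answers))

# Statistical metric: semantic answer similarity between prediction and ground truth
sas = SASEvaluator()  # uses the default sentence-transformers model
sas.warm_up()
print(sas.run(predicted_answers=predicted_answers, ground_truth_answers=ground_truth_answers))
```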

tutorials/36_Building_Fallbacks_with_Conditional_Routing.ipynb

+3-5
@@ -12,9 +12,7 @@
 "- **Time to complete**: 10 minutes\n",
 "- **Components Used**: [`ConditionalRouter`](https://docs.haystack.deepset.ai/docs/conditionalrouter), [`SerperDevWebSearch`](https://docs.haystack.deepset.ai/docs/serperdevwebsearch), [`ChatPromptBuilder`](https://docs.haystack.deepset.ai/docs/chatpromptbuilder), [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/openaichatgenerator)\n",
 "- **Prerequisites**: You must have an [OpenAI API Key](https://platform.openai.com/api-keys) and a [Serper API Key](https://serper.dev/api-key) for this tutorial\n",
-"- **Goal**: After completing this tutorial, you'll have learned how to create a pipeline with conditional routing that can fallback to websearch if the answer is not present in your dataset.\n",
-"\n",
-"> This tutorial uses the latest version of Haystack 2.x (`haystack-ai`). For more information on Haystack 2.0, read the [Haystack 2.0 announcement](https://haystack.deepset.ai/blog/haystack-2-release) or visit the [Haystack Documentation](https://docs.haystack.deepset.ai/docs/intro).\n"
+"- **Goal**: After completing this tutorial, you'll have learned how to create a pipeline with conditional routing that can fallback to websearch if the answer is not present in your dataset."
 ]
 },
 {
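
The conditional-routing idea this notebook builds on can be sketched with `ConditionalRouter`: each route carries a Jinja condition, an output expression, an output name, and an output type. The "no_answer" convention below mirrors the tutorial's fallback idea, but the exact strings and the use of plain string replies are assumptions:

```python
# Hedged sketch: route on the LLM's reply, falling back to web search when it says "no_answer".
from haystack.components.routers import ConditionalRouter

routes = [
    {
        "condition": "{{ 'no_answer' in replies[0] }}",  # the reply signals that the docs lack the answer
        "output": "{{ query }}",                         # pass the original query on to web search
        "output_name": "go_to_websearch",
        "output_type": str,
    },
    {
        "condition": "{{ 'no_answer' not in replies[0] }}",
        "output": "{{ replies[0] }}",                    # otherwise return the answer as-is
        "output_name": "answer",
        "output_type": str,
    },
]

router = ConditionalRouter(routes=routes)
print(router.run(replies=["no_answer"], query="Where is Munich?"))
# -> {'go_to_websearch': 'Where is Munich?'}
```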
@@ -540,9 +538,9 @@
 "source": [
 "## What's next\n",
 "\n",
-"🎉 Congratulations! You've built a pipeline with conditional routing! You can now customize the condition for your specific use case and create a custom Haystack 2.0 pipeline to meet your needs.\n",
+"🎉 Congratulations! You've built a pipeline with conditional routing! You can now customize the condition for your specific use case and create a custom Haystack pipeline to meet your needs.\n",
 "\n",
-"If you liked this tutorial, there's more to learn about Haystack 2.0:\n",
+"If you liked this tutorial, there's more to learn about Haystack:\n",
 "- [Creating Your First QA Pipeline with Retrieval-Augmentation](https://haystack.deepset.ai/tutorials/27_first_rag_pipeline)\n",
 "- [Model-Based Evaluation of RAG Pipelines](https://haystack.deepset.ai/tutorials/35_model_based_evaluation_of_rag_pipelines)\n",
 "\n",
