diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt
index 94d54fd4..673d84b4 100644
--- a/source/aggregation/stages.txt
+++ b/source/aggregation/stages.txt
@@ -1,4 +1,6 @@
 .. _csharp-aggregation-stages:
+.. _csharp-builders-aggregation:
+.. _csharp-linq:

 ===========================
 Aggregation Pipeline Stages
@@ -104,10 +106,9 @@ Aggregation Stage Methods
 -------------------------

 The following table lists the builder methods in the {+driver-short+} that correspond
-to stages in the aggregation pipeline. To learn more about an aggregation stage,
-follow the link from the method name to its reference page in the {+mdb-server+} manual.
-To learn more about a builder method, follow the link from the method name to its
-dedicated page.
+to stages in the aggregation pipeline. To learn more about an aggregation stage and
+see a code example for the equivalent C# method, follow the link from the stage name
+to its reference page in the {+mdb-server+} manual.

 If an aggregation stage isn't in the table, the driver doesn't provide a
 builder method for it. In this case, you must use the
@@ -125,7 +126,7 @@ to your pipeline.
    * - :manual:`$bucket `
      - Categorizes incoming documents into groups, called buckets,
        based on a specified expression and bucket boundaries.
-     - :ref:`Bucket() `
+     - ``Bucket()``

    * - :manual:`$bucketAuto `
      - Categorizes incoming documents into a specific number of
@@ -133,13 +134,13 @@ to your pipeline.
        Bucket boundaries are automatically determined in an attempt
        to evenly distribute the documents into the specified number
        of buckets.
-     - :ref:`BucketAuto() `
+     - ``BucketAuto()``

    * - :manual:`$changeStream `
      - Returns a change stream cursor for the collection. This stage
        can occur only once in an aggregation pipeline and it must
        occur as the first stage.
-     - :ref:`ChangeStream() `
+     - ``ChangeStream()``

    * - :manual:`$changeStreamSplitLargeEvent `
      - Splits large change stream events that exceed 16 MB into smaller fragments returned
@@ -147,20 +148,20 @@ to your pipeline.
        You can use ``$changeStreamSplitLargeEvent`` only in a ``$changeStream`` pipeline, and
        it must be the final stage in the pipeline.
-     - :ref:`ChangeStreamSplitLargeEvent() `
+     - ``ChangeStreamSplitLargeEvent()``

    * - :manual:`$count `
      - Returns a count of the number of documents at this stage of
        the aggregation pipeline.
-     - :ref:`Count() `
+     - ``Count()``

    * - :manual:`$densify `
      - Creates new documents in a sequence of documents where certain
        values in a field are missing.
-     - :ref:`Densify() `
+     - ``Densify()``

    * - :manual:`$documents `
      - Returns literal documents from input expressions.
-     - :ref:`Documents() `
+     - ``Documents()``

    * - :manual:`$facet `
      - Processes multiple aggregation pipelines
@@ -168,13 +169,13 @@ to your pipeline.
        of input documents. Enables the creation of multi-faceted
        aggregations capable of characterizing data across multiple
        dimensions, or facets, in a single stage.
-     - :ref:`Facet() `
+     - ``Facet()``

    * - :manual:`$graphLookup `
      - Performs a recursive search on a collection. This method adds
        a new array field to each output document that contains the
        traversal results of the recursive search for that document.
-     - :ref:`GraphLookup() `
+     - ``GraphLookup()``

    * - :manual:`$group `
      - Groups input documents by a specified identifier expression
@@ -183,27 +184,27 @@ to your pipeline.
        and applies the accumulator expression(s), if specified, to each group.
        Outputs one document per each distinct group. The output documents contain
        only the identifier field and, if specified, accumulated fields.
-     - :ref:`Group() `
+     - ``Group()``

    * - :manual:`$limit `
      - Passes the first *n* documents unmodified to the pipeline,
        where *n* is the specified limit. For each input document,
        outputs either one document (for the first *n* documents) or
        zero documents (after the first *n* documents).
-     - :ref:`Limit() `
+     - ``Limit()``

    * - :manual:`$lookup `
      - Performs a left outer join to another collection in the *same*
        database to filter in documents from the "joined" collection
        for processing.
-     - :ref:`Lookup() `
+     - ``Lookup()``

    * - :manual:`$match `
      - Filters the document stream to allow only matching documents
        to pass unmodified into the next pipeline stage. For each
        input document, outputs either one document (a match) or zero
        documents (no match).
-     - :ref:`Match() `
+     - ``Match()``

    * - :manual:`$merge `
      - Writes the resulting documents of the aggregation pipeline to
@@ -213,24 +214,24 @@ to your pipeline.
        custom update pipeline) the results into an output
        collection. To use this stage, it must be the last stage in
        the pipeline.
-     - :ref:`Merge() `
+     - ``Merge()``

    * - :manual:`$out `
      - Writes the resulting documents of the aggregation pipeline to
        a collection. To use this stage, it must be the last stage in
        the pipeline.
-     - :ref:`Out() `
+     - ``Out()``

    * - :manual:`$project `
      - Reshapes each document in the stream, such as by adding new
        fields or removing existing fields. For each input document,
        outputs one document.
-     - :ref:`Project() `
+     - ``Project()``

    * - :manual:`$rankFusion `
      - Uses a rank fusion algorithm to combine results from a
        Vector Search query and an Atlas Search query.
-     - :ref:`RankFusion() `
+     - ``RankFusion()``

    * - :manual:`$replaceRoot `
      - Replaces a document with the specified embedded document. The
@@ -240,7 +241,7 @@ to your pipeline.
        top level. The ``$replaceWith`` stage is an alias for the
        ``$replaceRoot`` stage.
-     - :ref:`ReplaceRoot() `
+     - ``ReplaceRoot()``

    * - :manual:`$replaceWith `
      - Replaces a document with the specified embedded document.
@@ -249,12 +250,12 @@ to your pipeline.
        the embedded document to the top level. The ``$replaceWith``
        stage is an alias for the ``$replaceRoot`` stage.
-     - :ref:`ReplaceWith() `
+     - ``ReplaceWith()``

    * - :manual:`$sample `
      - Randomly selects the specified number of documents from its
        input.
-     - :ref:`Sample() `
+     - ``Sample()``

    * - :manual:`$search `
      - Performs a full-text search of the field or fields in an
@@ -265,7 +266,7 @@ to your pipeline.
        available for self-managed deployments. To learn more, see
        :atlas:`Atlas Search Aggregation Pipeline Stages ` in the
        Atlas documentation.
-     - :ref:`Search() `
+     - ``Search()``

    * - :manual:`$searchMeta `
      - Returns different types of metadata result documents for the
@@ -277,7 +278,7 @@ to your pipeline.
        and is not available for self-managed deployments. To learn
        more, see :atlas:`Atlas Search Aggregation Pipeline Stages `
        in the Atlas documentation.
-     - :ref:`SearchMeta() `
+     - ``SearchMeta()``

    * - :manual:`$set `
      - Adds new fields to documents. Like the ``Project()`` method,
@@ -285,12 +286,12 @@ to your pipeline.
        document in the stream by adding new fields to output
        documents that contain both the existing fields from the
        input documents and the newly added fields.
-     - :ref:`Set() `
+     - ``Set()``

    * - :manual:`$setWindowFields `
      - Groups documents into windows and applies one or more
        operators to the documents in each window.
-     - :ref:`SetWindowFields() `
+     - ``SetWindowFields()``

    * - :manual:`$skip `
      - Skips the first *n* documents, where *n* is the specified skip
@@ -298,23 +299,23 @@ to your pipeline.
        pipeline. For each input document, outputs either zero
        documents (for the first *n* documents) or one document (if
        after the first *n* documents).
-     - :ref:`Skip() `
+     - ``Skip()``

    * - :manual:`$sort `
      - Reorders the document stream by a specified sort key. The
        documents remain unmodified. For each input document, outputs
        one document.
-     - :ref:`Sort() `
+     - ``Sort()``

    * - :manual:`$sortByCount `
      - Groups incoming documents based on the value of a specified
        expression, then computes the count of documents in each
        distinct group.
-     - :ref:`SortByCount() `
+     - ``SortByCount()``

    * - :manual:`$unionWith `
      - Combines pipeline results from two collections into a single
        result set.
-     - :ref:`UnionWith() `
+     - ``UnionWith()``

    * - :manual:`$unwind `
      - Deconstructs an array field from the input documents to
@@ -322,7 +323,7 @@ to your pipeline.
        replaces the array with an element value. For each input
        document, outputs *n* Documents, where *n* is the number of
        array elements. *n* can be zero for an empty array.
-     - :ref:`Unwind() `
+     - ``Unwind()``

    * - :manual:`$vectorSearch `
      - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or
@@ -333,7 +334,7 @@ to your pipeline.
        This stage is available only for MongoDB Atlas clusters, and
        is not available for self-managed deployments. To learn more,
        see :ref:`Atlas Vector Search `.
-     - :ref:`VectorSearch() `
+     - ``VectorSearch()``

 API Documentation
 -----------------
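
For a quick sense of how the builder methods listed above chain together on the driver's fluent aggregation interface, the following minimal sketch combines ``Match()``, ``Sort()``, and ``Limit()``. The connection string, the ``theater`` database, the ``movies`` collection, and the ``Movie`` class are hypothetical placeholders, not names taken from the page above.

.. code-block:: csharp

   using System;
   using MongoDB.Bson;
   using MongoDB.Driver;

   // Hypothetical POCO used only for this sketch
   public class Movie
   {
       public ObjectId Id { get; set; }
       public string Title { get; set; }
       public int Year { get; set; }
       public double ImdbRating { get; set; }
   }

   public class Program
   {
       public static void Main()
       {
           // Placeholder connection string, database, and collection names
           var client = new MongoClient("mongodb://localhost:27017");
           var collection = client.GetDatabase("theater")
                                  .GetCollection<Movie>("movies");

           // Match(), Sort(), and Limit() append the $match, $sort, and
           // $limit stages to the aggregation pipeline
           var topRecentMovies = collection.Aggregate()
               .Match(Builders<Movie>.Filter.Gte(m => m.Year, 2000))
               .Sort(Builders<Movie>.Sort.Descending(m => m.ImdbRating))
               .Limit(5)
               .ToList();

           foreach (var movie in topRecentMovies)
           {
               Console.WriteLine($"{movie.Title} ({movie.Year})");
           }
       }
   }

Under these assumptions, the chained calls produce a three-stage pipeline: ``$match``, then ``$sort``, then ``$limit``.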