feat: create metadata entries generator #272
Status: Open. araujogui wants to merge 3 commits into nodejs:main from araujogui:refactor-parsers.
New file (28 lines added): the `metadata` generator entry point.

```js
'use strict';

import { parseApiDoc } from './utils/parse.mjs';

/**
 * This generator generates a flattened list of metadata entries from an API doc
 *
 * @typedef {ParserOutput<import('mdast').Root>[]} Input
 *
 * @type {GeneratorMetadata<Input, ApiDocMetadataEntry[]>}
 */
export default {
  name: 'metadata',

  version: '1.0.0',

  description: 'generates a flattened list of API doc metadata entries',

  dependsOn: 'ast',

  /**
   * @param {Input} inputs
   * @returns {Promise<ApiDocMetadataEntry[]>}
   */
  async generate(inputs) {
    return inputs.flatMap(input => parseApiDoc(input));
  },
};
```
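For context on the generator contract this object fulfils: each generator declares the generator it `dependsOn`, and its `generate()` hook receives that generator's output. The runner below is a minimal sketch under that assumption (it is not the project's actual runner) showing how the `ast` output would flow into the `metadata` generator:

```js
// Minimal sketch of a dependency-chaining runner (an assumption for illustration;
// not this project's actual implementation).
const runGenerators = async (generators, astOutput) => {
  // Outputs are keyed by generator name; the parsed 'ast' output seeds the chain
  const outputs = { ast: astOutput };

  for (const generator of generators) {
    // Each generator consumes the output of the generator it depends on
    const input = outputs[generator.dependsOn];
    outputs[generator.name] = await generator.generate(input);
  }

  return outputs;
};

// With the metadata generator registered, `outputs.metadata` would be the
// flattened ApiDocMetadataEntry[] produced from every parsed API doc file.
```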
New file (148 lines added): the `parseApiDoc` utility the generator imports.

```js
'use strict';

import { u as createTree } from 'unist-builder';
import { findAfter } from 'unist-util-find-after';
import { remove } from 'unist-util-remove';
import { selectAll } from 'unist-util-select';
import { SKIP, visit } from 'unist-util-visit';

import createQueries from '../../../utils/queries/index.mjs';
import { getRemark } from '../../../utils/remark.mjs';
import { createNodeSlugger } from '../../../utils/slugger/index.mjs';
import createMetadata from '../../../metadata.mjs';

/**
 * Parses an API doc file into a flattened list of metadata entries
 *
 * @param {ParserOutput<import('mdast').Root>} input
 * @returns {ApiDocMetadataEntry[]}
 */
export const parseApiDoc = ({ file, tree }) => {
  /**
   * This holds references to all the Metadata entries for a given file;
   * it is used so we can traverse the AST tree and keep mutating things
   * and then stringify the whole API doc file at once without creating sub-traversals.
   *
   * Once the whole file is parsed, we can split the resulting string into sections,
   * seal the Metadata entries (`.create()`), and return the result to the caller of parse.
   *
   * @type {Array<ApiDocMetadataEntry>}
   */
  const metadataCollection = [];

  const {
    setHeadingMetadata,
    addYAMLMetadata,
    updateMarkdownLink,
    updateTypeReference,
    updateLinkReference,
    addStabilityMetadata,
  } = createQueries();

  // Creates an instance of the Remark processor with GFM support,
  // which is used for stringifying the AST tree back to Markdown
  const remarkProcessor = getRemark();

  // Creates a new Slugger instance for the current API doc file
  const nodeSlugger = createNodeSlugger();

  // Get all Markdown footnote definitions from the tree
  const markdownDefinitions = selectAll('definition', tree);

  // Get all Markdown heading entries from the tree
  const headingNodes = selectAll('heading', tree);

  // Handles Markdown link references and updates them to be plain links
  visit(tree, createQueries.UNIST.isLinkReference, node =>
    updateLinkReference(node, markdownDefinitions)
  );

  // Removes all the original definitions from the tree as they are no longer
  // needed, since all link references got updated to be plain links
  remove(tree, markdownDefinitions);

  // Normalises URLs that reference API doc files with a .md extension,
  // replacing .md with .html, since the API doc files eventually get compiled as HTML
  visit(tree, createQueries.UNIST.isMarkdownUrl, node =>
    updateMarkdownLink(node)
  );

  // If the document has no headings but has content, we add a fake heading at the top
  // so that our parsing logic works correctly and generates content for the whole file
  if (headingNodes.length === 0 && tree.children.length > 0) {
    tree.children.unshift(createTree('heading', { depth: 1 }, []));
  }

  // Handles iterating the tree and creating subtrees for each API doc entry,
  // where an API doc entry is defined by a Heading Node
  // (so all elements after a Heading until the next Heading).
  // It then creates and updates a Metadata entry for each API doc entry,
  // generates the final content for each entry, and pushes it to the collection.
  visit(tree, createQueries.UNIST.isHeading, (headingNode, index) => {
    // Creates a new Metadata entry for the current API doc file
    const apiEntryMetadata = createMetadata(nodeSlugger);

    // Adds the Metadata of the current Heading Node to the Metadata entry
    setHeadingMetadata(headingNode, apiEntryMetadata);

    // We retrieve the immediate next Heading if it exists.
    // This is used to ensure we don't include items that belong only to the
    // next heading in the current Heading's metadata.
    // Note that if there is no next heading, we use the current node as the next one.
    const nextHeadingNode =
      findAfter(tree, index, createQueries.UNIST.isHeading) ?? headingNode;

    // This is the cutover index of the subtree that we should take
    // from all the Nodes within the AST tree that belong to this section.
    // If `nextHeadingNode` equals the current heading, there is no next heading
    // and we have reached the end of the document, hence the cutover should be
    // the end of the document itself.
    const stop =
      headingNode === nextHeadingNode
        ? tree.children.length
        : tree.children.indexOf(nextHeadingNode);

    // Retrieves all the nodes that belong to the current API docs section,
    // from the current Heading Node (at `index`) up to, but not including, the next Heading
    const subTree = createTree('root', tree.children.slice(index, stop));

    // Visits all Stability Index nodes from the current subtree, if there are any,
    // and applies the Stability Index metadata to the current metadata entry
    visit(subTree, createQueries.UNIST.isStabilityNode, node =>
      addStabilityMetadata(node, apiEntryMetadata)
    );

    // Visits all HTML nodes from the current subtree and, for any that match
    // our YAML metadata structure, transforms them into YAML metadata
    // and applies that YAML metadata to the current Metadata entry
    visit(subTree, createQueries.UNIST.isYamlNode, node => {
      // TODO: Is there always only one YAML node?
      apiEntryMetadata.setYamlPosition(node.position);
      addYAMLMetadata(node, apiEntryMetadata);
    });

    // Visits all Text nodes from the current subtree and, for any that match
    // an API doc type reference, updates the type reference to be a Markdown link
    visit(subTree, createQueries.UNIST.isTextWithType, (node, _, parent) =>
      updateTypeReference(node, parent)
    );

    // Removes already parsed items from the subtree so that they aren't included in the final content
    remove(subTree, [createQueries.UNIST.isYamlNode]);

    // Applies the AST transformations to the subtree based on the API doc entry Metadata.
    // Note that running the transformation on the subtree isn't costly as it is a reduced tree
    // and the GFM transformations aren't that heavy.
    const parsedSubTree = remarkProcessor.runSync(subTree);

    // We seal and create the API doc entry Metadata...
    const parsedApiEntryMetadata = apiEntryMetadata.create(file, parsedSubTree);

    // ...and push the parsed API doc entry Metadata to the collection
    metadataCollection.push(parsedApiEntryMetadata);

    return SKIP;
  });

  return metadataCollection;
};
```
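To make the section-slicing step concrete, here is a small self-contained sketch of the same idea on a toy mdast-like tree (the tree and helper below are invented for illustration and are not part of the PR): each heading starts a section that runs up to, but not including, the next heading, and the last section runs to the end of the document.

```js
// Illustration only: a simplified version of the heading-to-heading slicing
// performed in parseApiDoc, using a hand-rolled tree instead of a real mdast parse.
const tree = {
  type: 'root',
  children: [
    { type: 'heading', depth: 1, value: 'fs' },
    { type: 'paragraph', value: 'File system APIs.' },
    { type: 'heading', depth: 2, value: 'fs.readFile()' },
    { type: 'paragraph', value: 'Reads a file.' },
  ],
};

const isHeading = node => node.type === 'heading';

const sections = tree.children
  .map((node, index) => ({ node, index }))
  .filter(({ node }) => isHeading(node))
  .map(({ node, index }) => {
    // Find the next heading after this one; if there is none, the section
    // runs to the end of the document (the same cutover rule as in parseApiDoc)
    const next = tree.children.findIndex(
      (candidate, i) => i > index && isHeading(candidate)
    );
    const stop = next === -1 ? tree.children.length : next;

    return { heading: node.value, nodes: tree.children.slice(index, stop) };
  });

console.log(sections.map(s => `${s.heading}: ${s.nodes.length} node(s)`));
// → [ 'fs: 2 node(s)', 'fs.readFile(): 2 node(s)' ]
```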
Changed file: the `// @ts-check` directive (and the blank line after it) is removed from an existing lint-issue module.

```diff
@@ -1,5 +1,3 @@
-// @ts-check
-
 /**
  * @type {import('../../types').LintIssue}
  */
```
Review comment: I'll fix this after #275 is merged, because we need the raw ASTs here.