diff --git a/CLAUDE.md b/CLAUDE.md index dba7ab2d4..d486c3eec 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -5,6 +5,7 @@ **vscode-dbt-power-user** is a comprehensive VSCode extension that makes VSCode seamlessly work with dbt (data build tool). It's an open-source project published by Altimate AI that extends VSCode with advanced dbt features including auto-completion, query preview, lineage visualization, documentation generation, and AI-powered features. ### Key Statistics + - **Version**: 0.57.3 - **Project Type**: VSCode Extension (TypeScript/React) - **License**: MIT @@ -25,16 +26,19 @@ The extension follows a **dependency injection pattern** using Inversify contain The extension operates across multiple processes: 1. **Main Extension Process** (Node.js/TypeScript) + - VSCode API integration - File system operations - dbt CLI interactions 2. **Webview Panels** (React/TypeScript) + - Modern React-based UI components - Located in `webview_panels/` directory - Built with Vite, uses Antd for UI components 3. **Python Bridge Integration** + - dbt core/cloud integration via Python scripts - Key files: `dbt_core_integration.py`, `dbt_cloud_integration.py` - Jupyter kernel for notebook functionality @@ -62,12 +66,14 @@ src/ ### 1. dbt Integration Support **Multiple Integration Types**: + - **dbt Core**: Direct Python integration via Python bridge - **dbt Cloud**: API-based integration with dbt Cloud services - **dbt Fusion**: Command-line integration with dbt-fusion CLI - **Core Command**: CLI wrapper integration for dbt core **Key Integration Files**: + - `src/dbt_client/dbtCoreIntegration.ts` - dbt Core Python integration - `src/dbt_client/dbtCloudIntegration.ts` - dbt Cloud API integration - `src/dbt_client/dbtFusionCommandIntegration.ts` - dbt Fusion CLI integration @@ -76,6 +82,7 @@ src/ ### 2. Language Server Features **Provider Architecture**: Each feature implemented as a separate provider: + - `autocompletion_provider/` - IntelliSense for dbt models, macros, sources - `definition_provider/` - Go-to-definition functionality - `hover_provider/` - Hover information @@ -85,12 +92,14 @@ src/ ### 3. Webview Panel System **Modern React Architecture** (`webview_panels/`): + - **Build System**: Vite + TypeScript + React 18 - **State Management**: Redux Toolkit - **UI Framework**: Antd + custom components - **Data Visualization**: Perspective.js, Plotly.js **Key Panels**: + - `modules/dataPilot/` - AI chat interface - `modules/queryPanel/` - Query results and analysis - `modules/lineage/` - Data lineage visualization @@ -100,12 +109,14 @@ src/ ### 4. AI and Advanced Features **DataPilot AI Integration**: + - Chat-based interface for dbt assistance - Query explanation and optimization - Documentation generation - Test suggestions **MCP Server Integration**: + - Tool calling for dbt operations - Integration with Claude and other AI models - Located in `src/mcp/server.ts` @@ -115,12 +126,14 @@ src/ ### 1. Multi-Stage Build Process **Main Extension Build** (Webpack): + ```bash npm run webpack # Development build npm run vscode:prepublish # Production build ``` **Webview Panels Build** (Vite): + ```bash npm run panel:webviews # Build React components ``` @@ -128,12 +141,14 @@ npm run panel:webviews # Build React components ### 2. 
Development Workflow **Key Scripts**: + - `npm run watch` - Development with hot reload - `npm run test` - Jest-based testing - `npm run lint` - ESLint + Prettier - `npm run build-vsix` - Package extension **Development Environment**: + - Uses VSCode's built-in debugger ("Launch Extension") - Hot reload for webview panels - Python environment auto-detection @@ -141,6 +156,7 @@ npm run panel:webviews # Build React components ### 3. Testing Strategy **Test Configuration** (`jest.config.js`): + - **Unit Tests**: Jest + ts-jest - **Mock System**: Custom VSCode API mocks - **Coverage**: Istanbul-based coverage reporting @@ -151,6 +167,7 @@ npm run panel:webviews # Build React components ### 1. VSCode Extension Dependencies **Required Extensions**: + - `samuelcolvin.jinjahtml` - Jinja templating support - `ms-python.python` - Python environment integration - `altimateai.vscode-altimate-mcp-server` - MCP server @@ -158,12 +175,14 @@ npm run panel:webviews # Build React components ### 2. Major Technical Dependencies **Backend (Node.js)**: + - `inversify` - Dependency injection - `python-bridge` - Python process communication - `zeromq` - Jupyter kernel communication - `@modelcontextprotocol/sdk` - MCP protocol **Frontend (React)**: + - `react` 18 + `react-dom` - `@reduxjs/toolkit` - State management - `antd` - UI component library @@ -172,6 +191,7 @@ npm run panel:webviews # Build React components ### 3. Python Integration **Python Scripts**: + - `dbt_core_integration.py` - Core dbt operations - `dbt_cloud_integration.py` - Cloud API operations - `dbt_healthcheck.py` - Project health analysis @@ -182,6 +202,7 @@ npm run panel:webviews # Build React components ### 1. Extension Configuration **Comprehensive Settings** (190+ configuration options): + - dbt integration mode selection - Query limits and templates - AI features and endpoints @@ -191,6 +212,7 @@ npm run panel:webviews # Build React components ### 2. Language Support **File Type Associations**: + - `jinja-sql` - Primary dbt model files - `jinja-yaml` - dbt configuration files - `jinja-md` - Documentation files @@ -199,6 +221,7 @@ npm run panel:webviews # Build React components ### 3. Command System **80+ Commands Available**: + - Model execution (`dbtPowerUser.runCurrentModel`) - Documentation generation (`dbtPowerUser.generateSchemaYML`) - Query analysis (`dbtPowerUser.sqlLineage`) @@ -209,6 +232,7 @@ npm run panel:webviews # Build React components ### 1. Multi-Platform Distribution **CI/CD Pipeline** (`.github/workflows/ci.yml`): + - **Build Matrix**: macOS, Ubuntu, Windows - **Visual Studio Marketplace**: Primary distribution - **OpenVSX Registry**: Open-source alternative @@ -217,6 +241,7 @@ npm run panel:webviews # Build React components ### 2. Release Process **Automated Release**: + - Git tag triggers release pipeline - Pre-release and stable channel support - Slack notifications for release status @@ -234,16 +259,19 @@ npm run panel:webviews # Build React components ### 2. Adding New Features **For Language Features**: + 1. Create provider in appropriate `*_provider/` directory 2. Register in `inversify.config.ts` 3. Wire up in `DBTPowerUserExtension` **For UI Features**: + 1. Add React component in `webview_panels/src/modules/` 2. Update routing in `AppRoutes.tsx` 3. Add state management slice if needed **For dbt Integration**: + 1. Extend appropriate dbt client (`dbtCoreIntegration.ts` etc.) 2. Add Python bridge function if needed 3. 
Update MCP server tools if AI-accessible @@ -258,15 +286,19 @@ npm run panel:webviews # Build React components ## Common Development Patterns ### 1. Manifest-Driven Architecture + The extension heavily relies on dbt's `manifest.json` for understanding project structure. Most features key off manifest parsing events. ### 2. Multi-Integration Support + Always consider how features work across dbt core, cloud, and other integration types. Use strategy pattern for integration-specific behavior. ### 3. Webview Communication + Uses VSCode's webview messaging system with typed message contracts. State is synchronized between extension and webview contexts. ### 4. Python Bridge Pattern + For dbt operations requiring Python, use the established bridge pattern with JSON serialization and error handling. This architecture enables the extension to provide comprehensive dbt development support while maintaining modularity and extensibility for future enhancements. @@ -280,6 +312,7 @@ This architecture enables the extension to provide comprehensive dbt development The dbt Power User extension accelerates dbt and SQL development by 3x through three key phases: ### 🔧 DEVELOP + - **SQL Visualizer**: Visual query builder and analyzer - **Query Explanation**: AI-powered SQL query explanation - **Auto-generation**: Generate dbt models from sources or raw SQL @@ -288,7 +321,8 @@ The dbt Power User extension accelerates dbt and SQL development by 3x through t - **Query Translation**: Translate SQL between different dialects - **Compiled SQL Preview**: View compiled dbt code before execution -### 🧪 TEST +### 🧪 TEST + - **Query Results Preview**: Execute and analyze query results with export capabilities - **Test Generation**: AI-powered test generation for dbt models - **Column Lineage**: Detailed data lineage with code visibility @@ -297,6 +331,7 @@ The dbt Power User extension accelerates dbt and SQL development by 3x through t - **Model Lineage**: Visual representation of model dependencies ### 🤝 COLLABORATE + - **Documentation Generation**: AI-powered documentation creation - **Code Collaboration**: Discussion threads on code and documentation - **Project Governance**: Automated checks for code quality and standards @@ -307,6 +342,7 @@ The dbt Power User extension accelerates dbt and SQL development by 3x through t ## DataMates AI Integration The extension includes **AI Teammates** through the DataMates Platform: + - **Coaching**: Personalize AI teammates for specific requirements - **Query Assistance**: AI-powered query explanation and optimization - **Documentation**: Automated documentation generation @@ -316,11 +352,13 @@ The extension includes **AI Teammates** through the DataMates Platform: ## Feature Availability **Free Extension Features**: + - SQL Visualizer, Model-level lineage, Auto-generation from sources - Auto-completion, Click to Run, Compiled SQL preview - Query results preview, Defer to production, SQL validation **With Altimate AI Key** (free signup at [app.myaltimate.com](https://app.myaltimate.com)): + - Column-level lineage, Query explanation AI, Query translation AI - Auto-generation from SQL, Test generation AI, Documentation generation AI - Code/documentation collaboration, Lineage export, SaaS UI @@ -333,39 +371,44 @@ The extension includes **AI Teammates** through the DataMates Platform: ## Installation Methods ### Native Installation + Install directly from [VS Code Marketplace](https://marketplace.visualstudio.com/items?itemName=innoverio.vscode-dbt-power-user) or via
VS Code: + 1. Open VS Code Extensions panel (`Ctrl+Shift+X`) 2. Search for "dbt Power User" 3. Click Install 4. Reload VS Code if prompted ### Dev Container Installation + Add to your `.devcontainer/devcontainer.json`: + ```json { "customizations": { "vscode": { "files.associations": { "*.yaml": "jinja-yaml", - "*.yml": "jinja-yaml", + "*.yml": "jinja-yaml", "*.sql": "jinja-sql", "*.md": "jinja-md" }, - "extensions": [ - "innoverio.vscode-dbt-power-user" - ] + "extensions": ["innoverio.vscode-dbt-power-user"] } } } ``` ### Cursor IDE Support + The extension is also available for [Cursor IDE](https://www.cursor.com/how-to-install-extension). Install the same way as VS Code. ## Required Configuration ### 1. dbt Integration Setup + Configure how the extension connects to dbt: + - **dbt Core**: For local dbt installations with Python bridge (default) - **dbt Cloud**: For dbt Cloud API integration - **dbt Fusion**: For dbt-fusion CLI integration @@ -374,22 +417,28 @@ Configure how the extension connects to dbt: Set via `dbt.dbtIntegration` setting. #### dbt Fusion Integration + dbt Fusion is a command-line interface that provides enhanced dbt functionality. When using fusion integration: + - Requires dbt-fusion CLI to be installed in your environment - Extension automatically detects fusion installation via `dbt --version` output - Provides full feature support including query execution, compilation, and catalog operations - Uses JSON log format for structured command output parsing ### 2. Python Environment + Ensure Python and dbt are properly installed and accessible. The extension will auto-detect your Python environment through the VS Code Python extension. ### 3. Optional: Altimate AI Key + For advanced AI features, get a free API key: + 1. Sign up at [app.myaltimate.com/register](https://app.myaltimate.com/register) 2. Add API key to `dbt.altimateAiKey` setting 3. Set instance name in `dbt.altimateInstanceName` setting ## Project Setup + 1. Open your dbt project folder in VS Code 2. Run the setup wizard: Select "dbt" in bottom status bar → "Setup Extension" 3. The extension will auto-install dbt dependencies if enabled @@ -402,52 +451,65 @@ For advanced AI features, get a free API key: ## Quick Diagnostics ### 1. Setup Wizard + Use the built-in setup wizard for automated issue detection: + - Click "dbt" or "dbt is not installed" in bottom status bar -- Select "Setup Extension" +- Select "Setup Extension" - Follow guided setup process ### 2. Diagnostics Command + Run comprehensive system diagnostics: + - Open Command Palette (`Cmd+Shift+P` / `Ctrl+Shift+P`) - Type "diagnostics" → Select "dbt Power User: Diagnostics" - Review output for environment issues, Python/dbt installation status, and connection problems ### 3. Problems Panel + Check VS Code Problems panel for dbt project issues: + - View → Problems (or `Ctrl+Shift+M`) - Look for dbt-related validation errors ## Debug Logging Enable detailed logging for troubleshooting: + 1. Command Palette → "Set Log Level" → "Debug" 2. View logs: Output panel → "Log" dropdown → "dbt" 3.
Reproduce the issue to capture debug information ## Developer Tools + For advanced debugging: + - Help → Toggle Developer Tools - Check console for JavaScript errors and detailed logs ## Common Issues **Extension not recognizing dbt project**: + - Verify `dbt_project.yml` exists in workspace root - Check Python environment has dbt installed - Run diagnostics command for detailed analysis **Python/dbt not found**: + - Configure Python interpreter via VS Code Python extension - Verify dbt is installed in selected Python environment - Set `dbt.dbtPythonPathOverride` if using custom Python path **Connection issues**: + - Verify database connection in dbt profiles - Check firewall/network settings - Review connection details in diagnostics output ## Getting Help + - Join [#tools-dbt-power-user](https://getdbt.slack.com/archives/C05KPDGRMDW) in dbt Community Slack - Contact support at [altimate.ai/support](https://www.altimate.ai/support) - Use in-extension feedback widgets for feature-specific issues @@ -459,38 +521,45 @@ For advanced debugging: ## Auto-completion and Navigation ### Model Auto-completion + - **Smart IntelliSense**: Auto-complete model names with `ref()` function - **Go-to-Definition**: Navigate directly to model files - **Hover Information**: View model details on hover -### Macro Support +### Macro Support + - **Macro Auto-completion**: IntelliSense for custom and built-in macros - **Parameter Hints**: Auto-complete macro parameters - **Definition Navigation**: Jump to macro definitions ### Source Integration + - **Source Auto-completion**: IntelliSense for configured sources - **Column Awareness**: Auto-complete source column names - **Schema Navigation**: Navigate to source definitions ### Documentation Blocks + - **Doc Block Auto-completion**: IntelliSense for documentation references - **Definition Linking**: Navigate to doc block definitions ## Query Development ### SQL Compilation and Preview + - **Compiled Code View**: See final SQL before execution - **Template Resolution**: Preview Jinja templating results - **Syntax Highlighting**: Enhanced SQL syntax highlighting for dbt files ### Query Execution + - **Preview Results**: Execute queries with `Cmd+Enter` / `Ctrl+Enter` - **Result Analysis**: Export results as CSV, copy as JSON - **Query History**: Track executed queries - **Configurable Limits**: Set row limits for query previews (default: 500 rows) ### SQL Formatting + - **Auto-formatting**: Integration with sqlfmt - **Custom Parameters**: Configure formatting rules - **Batch Processing**: Format multiple files @@ -498,17 +567,20 @@ For advanced debugging: ## AI-Powered Development ### Query Explanation + - **Natural Language**: Get plain English explanations of complex SQL - **Step-by-step Analysis**: Breakdown of query logic - **Performance Insights**: Query optimization suggestions ### Code Generation + - **Model from Source**: Generate base models from source tables - **Model from SQL**: Convert raw SQL to dbt models - **Test Generation**: AI-powered test suggestions - **Documentation Generation**: Auto-generate model documentation ### Query Translation + - **Cross-dialect Support**: Translate SQL between database dialects - **Syntax Adaptation**: Handle dialect-specific functions and syntax @@ -526,19 +598,23 @@ This is a MkDocs-based documentation site for the dbt Power User VSCode Extensio ### Architecture #### Content Organization + - `documentation/docs/` contains all documentation content in Markdown format - Content is organized by feature areas:
`setup/`, `develop/`, `test/`, `document/`, `govern/`, `discover/`, `teammates/`, `datamates/`, `arch/` - Images and assets are stored within feature-specific directories - `documentation/mkdocs.yml` contains all site configuration #### Key Configuration Files + - `documentation/mkdocs.yml`: Main site configuration including navigation, theme settings, and plugins - `documentation/requirements.txt`: Python dependencies for MkDocs and plugins - `documentation/docs/overrides/`: Custom theme overrides (currently empty) - `documentation/docs/javascripts/`: Custom JavaScript for enhanced functionality #### Theme Configuration + The site uses Material theme with: + - Custom Altimate AI branding and colors - Google Analytics integration (G-LXRSS3VK5N) - Git revision date tracking via plugin @@ -546,7 +622,9 @@ The site uses Material theme with: - Dark/light mode support #### Navigation Structure + Navigation follows a three-phase user journey: + 1. **Setup**: Installation and configuration 2. **Develop**: Core development features 3. **Test**: Testing and validation tools @@ -555,18 +633,21 @@ Navigation follows a three-phase user journey: ### Working with Content #### Adding New Pages + 1. Create `.md` files in the appropriate `docs/` subdirectory 2. Update the `nav` section in `mkdocs.yml` to include the new page 3. Follow existing naming conventions for consistency #### Images and Assets + - Store images in the same directory as the referencing markdown file - Use relative paths for image references - Common assets go in `docs/assets/` #### Internal Links + Use relative markdown links to reference other pages. The site has extensive cross-referencing between related features. ### Testing Changes -Always test locally with `mkdocs serve` before deploying. The development server provides live reload for content changes. \ No newline at end of file +Always test locally with `mkdocs serve` before deploying. The development server provides live reload for content changes. diff --git a/documentation/docs/setup/reqdConfigFusion.md b/documentation/docs/setup/reqdConfigFusion.md index a517ecdfd..66e8ebb50 100644 --- a/documentation/docs/setup/reqdConfigFusion.md +++ b/documentation/docs/setup/reqdConfigFusion.md @@ -11,6 +11,7 @@ type: tip dbt Fusion is a command-line interface that provides enhanced dbt functionality with improved performance and additional features. Unlike standard dbt Core, dbt Fusion is a standalone executable that doesn't require a Python environment, making it easier to install and manage. ### Key Benefits of dbt Fusion Integration: + - **Standalone Installation**: No Python environment required - **Enhanced Performance**: Optimized execution compared to standard dbt - **Cross-Platform Support**: Available for macOS, Linux, and Windows @@ -70,11 +71,13 @@ type: info #### Manual Installation **macOS and Linux:** + ```bash curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update ``` **Windows (PowerShell):** + ```powershell irm https://public.cdn.getdbt.com/fs/install/install.ps1 | iex ``` @@ -82,6 +85,7 @@ irm https://public.cdn.getdbt.com/fs/install/install.ps1 | iex #### Verify Installation After installation, verify that dbt Fusion is properly installed by running: + ```bash dbt --version ``` @@ -93,36 +97,42 @@ You should see output that includes "dbt-fusion" in the version information. Set the integration type to fusion in your VSCode settings: #### Method 1: Via VSCode Settings UI + 1. Open VSCode Settings (`Ctrl+,` / `Cmd+,`) 2. 
Search for "dbt integration" 3. Set "Dbt: Dbt Integration" to "fusion" #### Method 2: Via settings.json + Add the following to your VSCode settings.json: + ```json { - "dbt.dbtIntegration": "fusion" + "dbt.dbtIntegration": "fusion" } ``` -### Step 3: Associate *.sql files with jinja-sql +### Step 3: Associate \*.sql files with jinja-sql #### Method 1: Configure in Preferences > Settings + ![File Associations](images/associations.png) #### Method 2: Update settings.json directly + ```json { - "files.associations": { - "*.sql": "jinja-sql", - "*.yml": "jinja-yaml" - } + "files.associations": { + "*.sql": "jinja-sql", + "*.yml": "jinja-yaml" + } } ``` ### Step 4: Verify Configuration After configuration, check that: + 1. The bottom status bar shows "dbt fusion" with a checkmark 2. You can execute dbt commands through the extension 3. IntelliSense and syntax highlighting work in your dbt files @@ -155,6 +165,7 @@ Go to VSCode extension settings, and add API key and instance name there. dbt Fusion integration supports most extension features with some exceptions: ### ✅ Supported Features + - **Query Execution**: Execute models and preview results - **SQL Compilation**: View compiled SQL code - **Auto-completion**: IntelliSense for models, macros, and sources @@ -165,6 +176,7 @@ dbt Fusion integration supports most extension features with some exceptions: - **Query Explanation**: AI-powered SQL explanation ### ❌ Limited Features + - **Documentation Generation**: Not supported in dbt Fusion CLI - **Some Advanced Features**: May have limitations compared to dbt Core integration @@ -192,4 +204,4 @@ dbt Fusion supports macOS, Linux, and Windows. If you encounter installation iss #### Why do I need to add the Altimate API key? -The API key is necessary for advanced AI-powered features like query explanation, test generation, and column lineage. Basic dbt operations (execution, compilation) work without an API key. \ No newline at end of file +The API key is necessary for advanced AI-powered features like query explanation, test generation, and column lineage. Basic dbt operations (execution, compilation) work without an API key.
diff --git a/jest.config.js b/jest.config.js index 6b9713756..6c21d4d3b 100644 --- a/jest.config.js +++ b/jest.config.js @@ -16,6 +16,6 @@ module.exports = { coverageDirectory: "coverage", moduleNameMapper: { "^vscode$": "/src/test/mock/vscode.ts", - "^@lib$": "/src/test/mock/lib.ts", + "^@altimateai/extension-components$": "/src/test/mock/lib.ts", }, }; diff --git a/package-lock.json b/package-lock.json index 118924640..dd3d32c91 100644 --- a/package-lock.json +++ b/package-lock.json @@ -10,6 +10,7 @@ "hasInstallScript": true, "license": "MIT", "dependencies": { + "@altimateai/extension-components": "0.0.10-beta.3", "@jupyterlab/coreutils": "^6.2.4", "@jupyterlab/nbformat": "^4.2.4", "@jupyterlab/services": "^6.6.7", @@ -54,6 +55,7 @@ "@vscode/debugprotocol": "^1.68.0", "@vscode/test-electron": "^2.4.1", "@vscode/zeromq": "^0.2.1", + "babel-loader": "^10.0.0", "chai": "^4.3.10", "concurrently": "^8.2.2", "copy-webpack-plugin": "^11.0.0", @@ -88,6 +90,15 @@ "vscode": "^1.95.0" } }, + "node_modules/@altimateai/extension-components": { + "version": "0.0.10-beta.3", + "resolved": "https://registry.npmjs.org/@altimateai/extension-components/-/extension-components-0.0.10-beta.3.tgz", + "integrity": "sha512-PCa8qNaWjIMrreswOmDSFEj7pGm2AtqFWFANGDk7nRJXcAVodXC35mOFAwW1yg9jbJyl61psqGAj4pUkERnxNQ==", + "hasInstallScript": true, + "peerDependencies": { + "@jupyterlab/services": "^6.6.7" + } + }, "node_modules/@ampproject/remapping": { "version": "2.3.0", "resolved": "https://registry.npmjs.org/@ampproject/remapping/-/remapping-2.3.0.tgz", @@ -6198,6 +6209,23 @@ "node": ">=8" } }, + "node_modules/babel-loader": { + "version": "10.0.0", + "resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-10.0.0.tgz", + "integrity": "sha512-z8jt+EdS61AMw22nSfoNJAZ0vrtmhPRVi6ghL3rCeRZI8cdNYFiV5xeV3HbE7rlZZNmGH8BVccwWt8/ED0QOHA==", + "dev": true, + "license": "MIT", + "dependencies": { + "find-up": "^5.0.0" + }, + "engines": { + "node": "^18.20.0 || ^20.10.0 || >=22.0.0" + }, + "peerDependencies": { + "@babel/core": "^7.12.0", + "webpack": ">=5.61.0" + } + }, "node_modules/babel-plugin-istanbul": { "version": "6.1.1", "resolved": "https://registry.npmjs.org/babel-plugin-istanbul/-/babel-plugin-istanbul-6.1.1.tgz", diff --git a/package.json b/package.json index 0ec6a6a7b..6e511b295 100644 --- a/package.json +++ b/package.json @@ -1356,6 +1356,7 @@ "@vscode/debugprotocol": "^1.68.0", "@vscode/test-electron": "^2.4.1", "@vscode/zeromq": "^0.2.1", + "babel-loader": "^10.0.0", "chai": "^4.3.10", "concurrently": "^8.2.2", "copy-webpack-plugin": "^11.0.0", @@ -1392,6 +1393,7 @@ "altimateai.vscode-altimate-mcp-server" ], "dependencies": { + "@altimateai/extension-components": "0.0.10-beta.3", "@jupyterlab/coreutils": "^6.2.4", "@jupyterlab/nbformat": "^4.2.4", "@jupyterlab/services": "^6.6.7", diff --git a/postInstall.js b/postInstall.js index a5ba60700..b4f5f6211 100644 --- a/postInstall.js +++ b/postInstall.js @@ -1,51 +1,5 @@ // Copied from https://github.com/microsoft/vscode-jupyter/blob/main/build/ci/postInstall.js -const fs = require("fs"); const { downloadZMQ } = require("@vscode/zeromq"); -const path = require("path"); - -/** - * In order to get raw kernels working, we reuse the default kernel that jupyterlab ships. - * However it expects to be talking to a websocket which is serializing the messages to strings. - * Our raw kernel is not a web socket and needs to do its own serialization. To do so, we make a copy - * of the default kernel with the serialization stripped out. 
This is simpler than making a copy of the module - * at runtime. - */ -function createJupyterKernelWithoutSerialization() { - var relativePath = path.join( - "node_modules", - "@jupyterlab", - "services", - "lib", - "kernel", - "default.js", - ); - var filePath = path.join("", relativePath); - if (!fs.existsSync(filePath)) { - throw new Error( - "Jupyter lab default kernel not found '" + - filePath + - "' (Jupyter Extension post install script)", - ); - } - var fileContents = fs.readFileSync(filePath, { encoding: "utf8" }); - var replacedContents = fileContents - .replace( - /^const serialize =.*$/gm, - "const serialize = { serialize: (a) => a, deserialize: (a) => a };", - ) - .replace( - "const owned = team.session === this.clientId;", - "const owned = parentHeader.session === this.clientId;", - ); - if (replacedContents === fileContents) { - throw new Error( - "Jupyter lab default kernel cannot be made non serializing", - ); - } - var destPath = path.join(path.dirname(filePath), "nonSerializingKernel.js"); - fs.writeFileSync(destPath, replacedContents); - console.log(destPath + " file generated (by Jupyter VSC)"); -} async function downloadZmqBinaries() { // if (common.getBundleConfiguration() === common.bundleConfiguration.web) { @@ -55,7 +9,6 @@ async function downloadZmqBinaries() { await downloadZMQ(); } -createJupyterKernelWithoutSerialization(); downloadZmqBinaries() .then(() => process.exit(0)) .catch((ex) => { diff --git a/src/altimate.ts b/src/altimate.ts index c2ac7c901..1f227ab5b 100644 --- a/src/altimate.ts +++ b/src/altimate.ts @@ -1,5 +1,5 @@ import type { RequestInit } from "node-fetch"; -import { CommentThread, env, Uri, window, workspace } from "vscode"; +import { CommentThread, env, Range, Uri, window, workspace } from "vscode"; import { provideSingleton, processStreamResponse } from "./utils"; import { ColumnMetaData, NodeMetaData, SourceMetaData } from "./domain"; import { TelemetryService } from "./telemetry"; @@ -10,8 +10,11 @@ import { RateLimitException, ExecutionsExhaustedException } from "./exceptions"; import { DBTProject } from "./manifest/dbtProject"; import { DBTTerminal } from "./dbt_client/dbtTerminal"; import { PythonEnvironment } from "./manifest/pythonEnvironment"; -import { PreconfiguredNotebookItem, NotebookItem, NotebookSchema } from "@lib"; -import * as vscode from "vscode"; +import { + PreconfiguredNotebookItem, + NotebookItem, + NotebookSchema, +} from "@altimateai/extension-components"; export class NoCredentialsError extends Error {} @@ -333,8 +336,8 @@ export interface ConversationGroup { resource_type?: string; range: | { - end: vscode.Range["end"]; - start: vscode.Range["start"]; + end: Range["end"]; + start: Range["start"]; } | undefined; }; diff --git a/src/code_lens_provider/virtualSqlCodeLensProvider.ts b/src/code_lens_provider/virtualSqlCodeLensProvider.ts index 665886a72..34f1d8430 100644 --- a/src/code_lens_provider/virtualSqlCodeLensProvider.ts +++ b/src/code_lens_provider/virtualSqlCodeLensProvider.ts @@ -11,7 +11,7 @@ import { import { provideSingleton } from "../utils"; import { QueryManifestService } from "../services/queryManifestService"; import { DBTProjectContainer } from "../manifest/dbtProjectContainer"; -import { NotebookService } from "@lib"; +import { NotebookService } from "@altimateai/extension-components"; @provideSingleton(VirtualSqlCodeLensProvider) export class VirtualSqlCodeLensProvider diff --git a/src/commands/index.ts b/src/commands/index.ts index 6d65bc774..e044feb3f 100644 --- a/src/commands/index.ts +++ 
b/src/commands/index.ts @@ -47,7 +47,10 @@ import { DBTProject } from "../manifest/dbtProject"; import { SQLLineagePanel } from "../webview_provider/sqlLineagePanel"; import { QueryManifestService } from "../services/queryManifestService"; import { AltimateRequest } from "../altimate"; -import { DatapilotNotebookController, OpenNotebookRequest } from "@lib"; +import { + DatapilotNotebookController, + OpenNotebookRequest, +} from "@altimateai/extension-components"; import { NotebookQuickPick } from "../quickpick/notebookQuickPick"; import { CteInfo } from "../code_lens_provider/cteCodeLensProvider"; diff --git a/src/dbtPowerUserExtension.ts b/src/dbtPowerUserExtension.ts index 1027c09c1..85e9af350 100644 --- a/src/dbtPowerUserExtension.ts +++ b/src/dbtPowerUserExtension.ts @@ -15,7 +15,7 @@ import { HoverProviders } from "./hover_provider"; import { DbtPowerUserActionsCenter } from "./quickpick"; import { ValidationProvider } from "./validation_provider"; import { CommentProviders } from "./comment_provider"; -import { NotebookProviders } from "@lib"; +import { NotebookProviders } from "@altimateai/extension-components"; import { DbtPowerUserMcpServer } from "./mcp"; enum PromptAnswer { diff --git a/src/inversify.config.ts b/src/inversify.config.ts index cc3065376..aff3b2297 100755 --- a/src/inversify.config.ts +++ b/src/inversify.config.ts @@ -44,7 +44,10 @@ import { AltimateRequest } from "./altimate"; import { ValidationProvider } from "./validation_provider"; import { DeferToProdService } from "./services/deferToProdService"; import { SharedStateService } from "./services/sharedStateService"; -import { NotebookKernelClient, NotebookDependencies } from "@lib"; +import { + NotebookKernelClient, + NotebookDependencies, +} from "@altimateai/extension-components"; import { DBTCoreCommandProjectIntegration } from "./dbt_client/dbtCoreCommandIntegration"; import { DBTFusionCommandDetection, diff --git a/src/lib/index.d.ts b/src/lib/index.d.ts deleted file mode 100644 index 434d5cd5b..000000000 --- a/src/lib/index.d.ts +++ /dev/null @@ -1,443 +0,0 @@ -import { AltimateRequest } from "../dependencies.d.ts"; -import { CancellationToken } from "vscode"; -import { CommandProcessExecutionFactory } from "../../dependencies.d.ts"; -import { DBTCommandExecutionInfrastructure } from "../dependencies.d.ts"; -import { DBTCommandExecutionInfrastructure as DBTCommandExecutionInfrastructure_2 } from "../../dependencies.d.ts"; -import { DBTProject } from "../../dependencies.d.ts"; -import { DBTTerminal } from "../../dependencies.d.ts"; -import { DBTTerminal as DBTTerminal_2 } from "../dependencies.d.ts"; -import { Disposable as Disposable_2 } from "vscode"; -import { Event as Event_2 } from "vscode"; -import { ExecuteSQLResult } from "../dependencies.d.ts"; -import { FileChangeEvent } from "vscode"; -import { FileStat } from "vscode"; -import { FileSystemProvider } from "vscode"; -import { FileType } from "vscode"; -import { KernelConnection } from "@jupyterlab/services"; -import { NotebookCell } from "vscode"; -import { NotebookCellKind } from "vscode"; -import { NotebookCellOutput } from "vscode"; -import { NotebookData } from "vscode"; -import { NotebookSerializer } from "vscode"; -import { PythonEnvironment } from "../../dependencies.d.ts"; -import { QueryManifestService } from "../dependencies.d.ts"; -import { TelemetryService } from "../../dependencies.d.ts"; -import { TelemetryService as TelemetryService_2 } from "../dependencies.d.ts"; -import { Uri } from "vscode"; - -declare class ClientMapper { - 
private executionInfrastructure; - private notebookDependencies; - private dbtTerminal; - private clientMap; - constructor( - executionInfrastructure: DBTCommandExecutionInfrastructure, - notebookDependencies: NotebookDependencies, - dbtTerminal: DBTTerminal_2, - ); - initializeNotebookClient(notebookUri: Uri): Promise; - getNotebookClient(notebookUri: Uri): Promise; -} - -declare interface ColumnConfig { - name: string; - tests: string[]; - [key: string]: any; -} - -declare interface ConnectionSettings { - control_port: number; - hb_port: number; - iopub_port: number; - ip: string; - key: string; - kernel_name: string; - shell_port: number; - signature_scheme: string; - stdin_port: number; - transport: string; -} - -export declare const CustomNotebooks: { - notebooks: PreconfiguredNotebookItem[]; -}; - -export declare class DatapilotNotebookController implements Disposable_2 { - private clientMapper; - private queryManifestService; - private telemetry; - private dbtTerminal; - private notebookDependencies; - private altimate; - private readonly id; - private readonly label; - private _onNotebookCellEvent; - readonly onNotebookCellChangeEvent: Event_2; - private readonly disposables; - private associatedNotebooks; - private executionOrder; - private readonly controller; - constructor( - clientMapper: ClientMapper, - queryManifestService: QueryManifestService, - telemetry: TelemetryService_2, - dbtTerminal: DBTTerminal_2, - notebookDependencies: NotebookDependencies, - altimate: AltimateRequest, - ); - private getNotebookByTemplate; - modelTestSuggestions(args: any): Promise; - generateDbtSourceYaml(args: any): Promise; - generateDbtDbtModelSql(args: any): Promise; - generateDbtDbtModelYaml(args: any): Promise; - generateDbtDbtModelCTE(args: any): Promise; - extractExposuresFromMetabase(args: any): Promise; - extractExposuresFromTableau(args: any): Promise; - private getFileName; - createNotebook(args: OpenNotebookRequest | undefined): Promise; - private sendMessageToPreloadScript; - private getRandomString; - private genUniqueId; - private updateCellId; - private onNotebookClose; - private onDidChangeSelectedNotebooks; - private onNotebookOpen; - private waitForControllerAssociation; - private isControllerAssociatedWithNotebook; - dispose(): void; - private _executeAll; - private filterIPyWidgets; - private updateContextVariablesInKernel; - private _doExecution; -} - -declare class DatapilotNotebookSerializer - implements NotebookSerializer, Disposable_2 -{ - dispose(): void; - deserializeNotebook( - content: Uint8Array, - _token: CancellationToken, - ): Promise; - serializeNotebook( - data: NotebookData, - _token: CancellationToken, - ): Promise; -} - -declare interface DBColumn { - column: string; - dtype: string; -} - -export declare interface DbtConfig { - [key: string]: Model[]; -} - -export declare const getTestSuggestions: ({ - tableRelation, - sample, - limit, - resourceType, - columnConfig, - excludeTypes, - excludeCols, - tests, - uniquenessCompositeKeyLength, - acceptedValuesMaxCardinality, - rangeStddevs, - stringLengthStddevs, - recencyStddevs, - dbtConfig, - returnObject, - columnsInRelation, - adapter, - queryFn, -}: Props) => Promise; - -export declare interface IPyWidgetMessage { - type: string; - payload: any; -} - -export declare interface Model { - name: string; - columns: ColumnConfig[]; - tests?: any[]; -} - -export declare interface NotebookCellEvent { - cellId: string; - notebook: string; - result?: any; - event: "add" | "update" | "delete"; - fragment?: string; - 
languageId: string; -} - -export declare interface NotebookCellSchema { - source: string[]; - cell_type: NotebookCellKind; - languageId: string; - metadata?: Record; - outputs?: NotebookCellOutput[]; -} - -export declare class NotebookDependencies { - private readonly dbtTerminal; - private readonly telemetry; - private commandProcessExecutionFactory; - private pythonEnvironment; - constructor( - dbtTerminal: DBTTerminal, - telemetry: TelemetryService, - commandProcessExecutionFactory: CommandProcessExecutionFactory, - pythonEnvironment: PythonEnvironment, - ); - private checkPythonDependencies; - private checkDbtDependencies; - private installMissingPythonPackages; - private installMissingDbtPackages; - verifyAndInstallDependenciesIfNeeded( - dependencies: NotebookDependency[], - project: DBTProject, - ): Promise; - getDependenciesVersion(): Promise>; - validateAndInstallNotebookDependencies(): Promise; - private notebookDependenciesAreInstalled; -} - -export declare interface NotebookDependency { - type: "dbt" | "python"; - package: string; - name?: string; - version?: string; -} - -export declare class NotebookFileSystemProvider implements FileSystemProvider { - private dbtTerminal; - private altimate; - private _emitter; - readonly onDidChangeFile: Event_2; - private notebookDataMap; - constructor(dbtTerminal: DBTTerminal_2, altimate: AltimateRequest); - watch( - _uri: Uri, - _options: { - recursive: boolean; - excludes: string[]; - }, - ): Disposable_2; - stat(_uri: Uri): FileStat; - readDirectory(_uri: Uri): [string, FileType][]; - createDirectory(_uri: Uri): void; - private getNotebookData; - readFile(uri: Uri): Promise; - writeFile( - uri: Uri, - content: Uint8Array, - _options: { - create: boolean; - overwrite: boolean; - }, - ): Promise; - delete( - uri: Uri, - _options: { - recursive: boolean; - }, - ): void; - rename( - oldUri: Uri, - newUri: Uri, - _options: { - overwrite: boolean; - }, - ): void; - private getFileNameFromUri; - private customSave; - private saveNotebook; -} - -export declare interface NotebookItem { - id: number; - name: string; - data: NotebookSchema; - description: string; - created_on: string; - updated_on: string; - tags: { - id: number; - tag: string; - }[]; - privacy: boolean; -} - -export declare class NotebookKernelClient implements Disposable_2 { - private executionInfrastructure; - private notebookDependencies; - private dbtTerminal; - get postMessage(): Event_2; - private _postMessageEmitter; - private disposables; - private lastUsedStreamOutput?; - private static modelIdsOwnedByCells; - private python; - private kernel; - private isInitializing; - private readonly ownedCommIds; - private readonly commIdsMappedToWidgetOutputModels; - private readonly ownedRequestMsgIds; - private commIdsMappedToParentWidgetModel; - private streamsReAttachedToExecutingCell; - private registerdCommTargets; - private outputsAreSpecificToAWidget; - private versions?; - constructor( - notebookPath: string, - executionInfrastructure: DBTCommandExecutionInfrastructure_2, - notebookDependencies: NotebookDependencies, - dbtTerminal: DBTTerminal, - ); - isInitialized(): Promise; - dispose(): Promise; - get jupyterPackagesVersions(): Record | undefined; - private getDependenciesVersion; - getKernel(): Promise; - private initializeNotebookKernel; - onKernelSocketResponse(payload: { id: string }): void; - storeContext(context: any): Promise; - storeDataInKernel(cellId: string, data: any): Promise; - registerCommTarget(payload: string): Promise; - getPythonCodeByType(type: 
string, cellId: string): Promise; - executePython( - code: string, - cell: NotebookCell, - onOutput: (output: NotebookCellOutput) => void, - ): Promise; - private handleUpdateDisplayDataMessage; - private handleCommOpen; - private handleCommMsg; - private handleExecuteResult; - private addToCellData; - private canMimeTypeBeRenderedByWidgetManager; - private handleExecuteInput; - private handleStatusMessage; - private handleStreamMessage; - private handleDisplayData; - private handleClearOutput; - private handleError; -} - -export declare class NotebookProviders implements Disposable_2 { - private notebookProvider; - private notebookController; - private notebookFileSystemProvider; - private dbtTerminal; - private disposables; - constructor( - notebookProvider: DatapilotNotebookSerializer, - notebookController: DatapilotNotebookController, - notebookFileSystemProvider: NotebookFileSystemProvider, - dbtTerminal: DBTTerminal_2, - ); - private bindNotebookActions; - dispose(): void; -} - -export declare interface NotebookSchema { - cells: NotebookCellSchema[]; - metadata?: Record; -} - -export declare class NotebookService implements Disposable_2 { - private notebookKernel; - private disposables; - private cellByNotebookAutocompleteMap; - constructor(notebookKernel: DatapilotNotebookController); - dispose(): void; - getCellByNotebookAutocompleteMap(): Map< - string, - { - cellId: string; - fragment: string; - languageId: string; - }[] - >; - private onNotebookCellChanged; -} - -export declare interface OpenNotebookRequest { - notebookId?: string; - template?: string; - context?: Record; - notebookSchema?: NotebookSchema; -} - -export declare interface PreconfiguredNotebookItem { - name: string; - description: string; - created_at: string; - updated_at: string; - id: string; - tags: string[]; - data: NotebookSchema; -} - -declare interface Props { - tableRelation: string; - sample?: boolean; - limit?: number; - resourceType?: string; - columnConfig?: Record; - excludeTypes?: string[]; - excludeCols?: string[]; - tests?: ( - | "uniqueness" - | "accepted_values" - | "range" - | "string_length" - | "recency" - )[]; - uniquenessCompositeKeyLength?: number; - acceptedValuesMaxCardinality?: number; - rangeStddevs?: number; - stringLengthStddevs?: number; - recencyStddevs?: number; - dbtConfig?: Record; - returnObject?: boolean; - columnsInRelation: DBColumn[]; - adapter: string; - queryFn: QueryFn; -} - -declare type QueryFn = (query: string) => Promise; - -declare interface RawKernelType { - realKernel: KernelConnection; - socket: any; - kernelProcess: { - connection: ConnectionSettings; - pid: number; - }; -} - -export {}; - -export declare namespace Identifiers { - const GeneratedThemeName = "ipython-theme"; - const MatplotLibDefaultParams = "_VSCode_defaultMatplotlib_Params"; - const MatplotLibFigureFormats = "_VSCode_matplotLib_FigureFormats"; - const DefaultCodeCellMarker = "# %%"; - const DefaultCommTarget = "jupyter.widget"; - const ALL_VARIABLES = "ALL_VARIABLES"; - const KERNEL_VARIABLES = "KERNEL_VARIABLES"; - const DEBUGGER_VARIABLES = "DEBUGGER_VARIABLES"; - const PYTHON_VARIABLES_REQUESTER = "PYTHON_VARIABLES_REQUESTER"; - const MULTIPLEXING_DEBUGSERVICE = "MULTIPLEXING_DEBUGSERVICE"; - const RUN_BY_LINE_DEBUGSERVICE = "RUN_BY_LINE_DEBUGSERVICE"; - const REMOTE_URI = "https://remote/"; - const REMOTE_URI_ID_PARAM = "id"; - const REMOTE_URI_HANDLE_PARAM = "uriHandle"; - const REMOTE_URI_EXTENSION_ID_PARAM = "extensionId"; -} diff --git a/src/lib/index.js b/src/lib/index.js deleted file 
mode 100644 index f182d345c..000000000 --- a/src/lib/index.js +++ /dev/null @@ -1,3649 +0,0 @@ -"use strict"; -Object.defineProperty(exports, Symbol.toStringTag, { value: "Module" }); -const l = require("vscode"), - p = require("@extension"), - ie = require("python-bridge"), - ue = require("fs"), - de = require("@nteract/messaging/lib/wire-protocol"); -function pe(o) { - const e = Object.create(null, { [Symbol.toStringTag]: { value: "Module" } }); - if (o) { - for (const t in o) - if (t !== "default") { - const n = Object.getOwnPropertyDescriptor(o, t); - Object.defineProperty( - e, - t, - n.get ? n : { enumerable: !0, get: () => o[t] }, - ); - } - } - return (e.default = o), Object.freeze(e); -} -const te = pe(de), - he = (o) => ("getCells" in o ? o.getCells() : o.cells), - me = (o) => - o instanceof l.NotebookCellData ? o.value : o.document.getText(), - fe = (o) => - o instanceof l.NotebookCellData ? o.languageId : o.document.languageId, - X = (o, e, t) => { - var r; - const n = []; - for (const s of he(o)) - n.push({ - cell_type: s.kind, - source: me(s).split(/\r?\n/g), - languageId: fe(s), - metadata: s.metadata, - outputs: t ? s.outputs : void 0, - }); - return { - cells: n, - metadata: { - ...o.metadata, - name: e, - createdAt: - ((r = o.metadata) == null ? void 0 : r.createdAt) || - new Date().toISOString(), - updatedAt: new Date().toISOString(), - }, - }; - }, - A = () => Math.random().toString(36).substr(2, 9); -function be() { - const o = new Date(), - e = o.toLocaleDateString("en-GB").replace(/\//g, "-"), - t = o.toLocaleTimeString("en-GB", { hour12: !1 }).replace(/:/g, "-"); - return `${e}-${t}`; -} -var ge = function (o, e, t, n) { - var r = arguments.length, - s = - r < 3 ? e : n === null ? (n = Object.getOwnPropertyDescriptor(e, t)) : n, - a; - if (typeof Reflect == "object" && typeof Reflect.decorate == "function") - s = Reflect.decorate(o, e, t, n); - else - for (var i = o.length - 1; i >= 0; i--) - (a = o[i]) && (s = (r < 3 ? a(s) : r > 3 ? a(e, t, s) : a(e, t)) || s); - return r > 3 && s && Object.defineProperty(e, t, s), s; -}; -let U = class { - dispose() { - throw new Error("Method not implemented."); - } - async deserializeNotebook(e, t) { - const n = new TextDecoder().decode(e); - let r; - try { - r = JSON.parse(n); - } catch { - r = { cells: [] }; - } - const s = r.cells.map((i) => { - var u; - const c = new l.NotebookCellData( - i.cell_type, - (u = i.source) == null - ? 
void 0 - : u.join(` -`), - i.languageId, - ); - return (c.metadata = i.metadata), (c.outputs = i.outputs), c; - }), - a = new l.NotebookData(s); - return (a.metadata = r.metadata), a; - } - async serializeNotebook(e, t) { - const n = X(e); - return new TextEncoder().encode(JSON.stringify(n)); - } -}; -U = ge([p.provideSingleton(U)], U); -var P; -(function (o) { - (o.error = "application/vnd.code.notebook.error"), - (o.stderr = "application/vnd.code.notebook.stderr"), - (o.stdout = "application/vnd.code.notebook.stdout"); -})(P || (P = {})); -const ye = ["text/plain", "text/markdown", P.stderr, P.stdout], - ne = [ - "application/vnd.*", - "application/vdom.*", - "application/geo+json", - "application/x-nteract-model-debug+json", - "text/html", - "application/javascript", - "image/gif", - "text/latex", - "text/markdown", - "image/png", - "image/svg+xml", - "image/jpeg", - "application/json", - "text/plain", - ], - D = new Map(); -D.set("display_data", G); -D.set("error", Ce); -D.set("execute_result", G); -D.set("stream", Se); -D.set("update_display_data", G); -function Y(o) { - const e = D.get(o.output_type); - let t; - return ( - e - ? (t = e(o)) - : (console.warn( - `Unable to translate cell from ${o.output_type} to NotebookCellData for VS Code.`, - ), - (t = G(o))), - t - ); -} -function ee(o) { - const e = { outputType: o.output_type }; - switch ((o.transient && (e.transient = o.transient), o.output_type)) { - case "display_data": - case "execute_result": - case "update_display_data": { - (e.executionCount = o.execution_count), - (e.metadata = o.metadata ? JSON.parse(JSON.stringify(o.metadata)) : {}); - break; - } - } - return e; -} -function G(o) { - const e = ee(o); - ("image/svg+xml" in o.data || "image/png" in o.data) && - (e.__displayOpenPlotIcon = !0); - const t = []; - if (o.data) for (const n in o.data) t.push(ke(n, o.data[n])); - return new l.NotebookCellOutput(we(t), e); -} -function we(o) { - return o.sort((e, t) => { - const n = (a, i) => ( - a.endsWith(".*") && (a = a.substr(0, a.indexOf(".*"))), i.startsWith(a) - ); - let r = ne.findIndex((a) => n(a, e.mime)), - s = ne.findIndex((a) => n(a, t.mime)); - return ( - oe(e) && (r = -1), - oe(t) && (s = -1), - (r = r === -1 ? 100 : r), - (s = s === -1 ? 100 : s), - r - s - ); - }); -} -function oe(o) { - if (o.mime.startsWith("application/vnd.")) - try { - return new TextDecoder().decode(o.data).length === 0; - } catch {} - return !1; -} -function ke(o, e) { - if (!e) return l.NotebookCellOutputItem.text("", o); - try { - if ( - (o.startsWith("text/") || ye.includes(o)) && - (Array.isArray(e) || typeof e == "string") - ) { - const t = Array.isArray(e) ? z(e) : e; - return l.NotebookCellOutputItem.text(t, o); - } else - return o.startsWith("image/") && - typeof e == "string" && - o !== "image/svg+xml" - ? new l.NotebookCellOutputItem(_e(e), o) - : typeof e == "object" && e !== null && !Array.isArray(e) - ? l.NotebookCellOutputItem.text(JSON.stringify(e), o) - : ((e = Array.isArray(e) ? z(e) : e), - l.NotebookCellOutputItem.text(e, o)); - } catch (t) { - return ( - console.error( - `Failed to convert ${o} output to a buffer ${typeof e}, ${e}`, - t, - ), - l.NotebookCellOutputItem.text("") - ); - } -} -function _e(o) { - return typeof Buffer < "u" && typeof Buffer.from == "function" - ? Buffer.from(o, "base64") - : Uint8Array.from(atob(o), (e) => e.charCodeAt(0)); -} -function z(o) { - if (Array.isArray(o)) { - let e = ""; - for (let t = 0; t < o.length; t += 1) { - const n = o[t]; - t < o.length - 1 && - !n.endsWith(` -`) - ? 
(e = e.concat(`${n} -`)) - : (e = e.concat(n)); - } - return e; - } - return o.toString(); -} -function ve(o) { - let e = o; - do (o = e), (e = o.replace(/[^\n]\x08/gm, "")); - while (e.length < o.length); - return o; -} -function Te(o) { - for ( - o = o.replace( - /\r+\n/gm, - ` -`, - ); - o.search(/\r[^$]/g) > -1; - - ) { - const e = o.match(/^(.*)\r+/m)[1]; - let t = o.match(/\r+(.*)$/m)[1]; - (t = t + e.slice(t.length, e.length)), - (o = o.replace(/\r+.*$/m, "\r").replace(/^.*\r/m, t)); - } - return o; -} -function Ee(o) { - return Te(ve(o)); -} -function B(o) { - if (o.parent_header && "msg_id" in o.parent_header) - return o.parent_header.msg_id; -} -function Ne(o) { - if (o.hasOwnProperty("text/html")) { - const e = o["text/html"]; - typeof e == "string" && - e.includes('