Add databricks-execution-compute skill — serverless, classic cluster, and file execution #278
Open
GeorgeTheo99 wants to merge 5 commits into databricks-solutions:main
Conversation
Adds a new `run_code_on_serverless()` function that executes Python or SQL code on Databricks serverless compute using the Jobs API `runs/submit` endpoint. No interactive cluster is required.

The implementation:
- Uploads code as a temporary notebook to the workspace
- Submits a one-time run with serverless compute (environments + environment_key pattern)
- Waits for completion and retrieves output via get_run_output
- Cleans up temporary workspace files after execution
- Returns a typed ServerlessRunResult with output, error, run_id, run_url, and timing

New files and changes:
- databricks-tools-core: compute/serverless.py (core module)
- databricks-tools-core: compute/__init__.py (exports)
- databricks-mcp-server: tools/compute.py (MCP tool wrapper)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
… improve docs

- Retrieve actual Python traceback on failure instead of generic "Workload failed" message by fetching run output in the exception handler
- Fix Optional[str] type annotation for run_name in MCP wrapper
- Document SQL SELECT output limitation in all docstrings
- Reframe tool as Python-first; clarify SQL is niche (DDL/DML only, use execute_sql for queries — works with serverless SQL warehouses)
- Add databricks-serverless-compute skill file with decision matrix, output capture behavior, limitations, and examples
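The traceback-retrieval fix can be sketched as a small selector over the `runs/get-output` response. `error_trace` and `error` are real fields of that response; the helper name is illustrative:

```python
def extract_run_error(run_output: dict) -> str:
    """Pick the most specific failure message from a Jobs API
    get_run_output response, falling back in order of usefulness.
    """
    # Full Python traceback, when the platform captured one
    if run_output.get("error_trace"):
        return run_output["error_trace"]
    # Short error summary
    if run_output.get("error"):
        return run_output["error"]
    # Last-resort generic message
    return "Workload failed"
```

Calling this from the exception handler (after re-fetching the run output) is what replaces the generic message with the actual traceback.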
Auto-detects .ipynb JSON content and uploads via Databricks native Jupyter import (ImportFormat.JUPYTER), enabling users to run local Jupyter notebooks on serverless compute without conversion.
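A minimal sketch of the .ipynb auto-detection. The helper name and the string return values are illustrative (the real code would map to the SDK's `ImportFormat` enum), but the structural check matches the notebook format: a JSON object with `cells` and `nbformat` top-level keys:

```python
import json


def detect_import_format(content: str) -> str:
    """Return "JUPYTER" when the content is .ipynb JSON, else "SOURCE".

    A Jupyter notebook file is a JSON object with "cells" and "nbformat"
    top-level keys; anything else is treated as plain source code.
    """
    try:
        doc = json.loads(content)
    except (ValueError, TypeError):
        # Not valid JSON at all, e.g. a plain .py script
        return "SOURCE"
    if isinstance(doc, dict) and "cells" in doc and "nbformat" in doc:
        return "JUPYTER"
    return "SOURCE"
```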
…ge and workspace_path support

- Rename databricks-serverless-compute → databricks-execution-compute, covering all three execution tools (execute_databricks_command, run_file_on_databricks, run_code_on_serverless)
- Rename run_python_file_on_databricks → run_file_on_databricks with language auto-detection from file extension (.py, .scala, .sql, .r); old name kept as alias
- Add workspace_path param to run_file_on_databricks and run_code_on_serverless for persistent mode (saves notebook to workspace) vs ephemeral (default, temp cleanup)
- Add comprehensive integration tests (34 tests) covering classic cluster execution, serverless Python/SQL, ephemeral vs persistent modes, input validation, and error handling
- Update MCP tool layer with new params and empty-string-to-None coercion
- Update install_skills.sh with new skill name and description

Co-authored-by: Isaac
…le management

Core functions (manage.py): create/modify/terminate/delete clusters with opinionated defaults (auto-pick LTS DBR, reasonable node type, SINGLE_USER mode, 120 min auto-termination), plus create/modify/delete SQL warehouses. List helpers for node types and spark versions.

MCP tool wrappers with destructive-action warnings in docstrings. SKILL.md with decision matrix, tool reference tables, and examples. Integration tests validated against e2-demo-field-eng.

Co-authored-by: Isaac
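The auto-pick-LTS-DBR default could be sketched as a filter over the clusters spark-versions listing, whose entries carry `key` and `name` fields. The helper name and the exact variant filtering are assumptions, not the module's actual logic:

```python
import re


def pick_lts_spark_version(versions: list[dict]) -> str:
    """Given spark-versions entries like
    {"key": "15.4.x-scala2.12", "name": "15.4 LTS (includes Apache Spark 3.5.0)"},
    return the key of the newest LTS runtime, skipping ML/GPU variants."""

    def release(key: str) -> tuple[int, int]:
        # Parse the leading "<major>.<minor>" from the version key
        m = re.match(r"(\d+)\.(\d+)", key)
        return (int(m.group(1)), int(m.group(2))) if m else (0, 0)

    candidates = [
        v["key"]
        for v in versions
        if "LTS" in v.get("name", "")      # only long-term-support runtimes
        and "ML" not in v.get("name", "")  # skip ML runtime variants
        and "-gpu" not in v["key"]         # skip GPU variants
    ]
    if not candidates:
        raise ValueError("No LTS runtimes available")
    return max(candidates, key=release)
```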
Summary
Adds a unified databricks-execution-compute skill covering all three code execution tools, plus new capabilities for multi-language file execution and persistent workspace notebooks.
New skill: databricks-execution-compute

Consolidates what was previously databricks-serverless-compute into a single skill covering:
- execute_databricks_command
- run_file_on_databricks (new)
- run_code_on_serverless (new)

New: run_code_on_serverless
- Auto-detects .ipynb content and uploads via Databricks' native Jupyter import

New: run_file_on_databricks (renamed from run_python_file_on_databricks)
- language parameter for override

New: workspace_path — ephemeral vs persistent mode
- Both run_code_on_serverless and run_file_on_databricks support workspace_path to save the notebook in the Databricks workspace for project work (model training, ETL, etc.)

Cluster management helpers
- list_clusters, get_best_cluster, start_cluster, get_cluster_status — with actionable error messages when no cluster is available (suggests startable clusters, serverless alternatives)

Files changed
- databricks-tools-core/.../compute/serverless.py
- databricks-tools-core/.../compute/execution.py
- databricks-tools-core/.../compute/__init__.py
- databricks-mcp-server/.../tools/compute.py
- databricks-skills/databricks-execution-compute/SKILL.md
- databricks-skills/databricks-serverless-compute/
- databricks-skills/install_skills.sh
- tests/integration/compute/test_execution.py
- tests/integration/compute/test_serverless.py

Test plan
- run_file_on_databricks: Python/SQL auto-detect, language override, empty file, file not found
- run_file_on_databricks + workspace_path: notebook uploaded and verified in workspace
- run_python_file_on_databricks alias confirmed
- validate_skills.py passes (26 skills)

This pull request was AI-assisted by Isaac.