test: Add test that executes a colab notebook that uses spark connect… #153
base: main
@@ -1,2 +1,3 @@
 pytest>=8.0
 pytest-xdist>=3.0
+google-cloud-aiplatform>=1.119.0
@@ -0,0 +1,118 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from google.cloud import aiplatform_v1
from google.cloud.aiplatform_v1.types import JobState

import os
Review comment: May we add a doc comment to this test that describes how it works?
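One possible doc comment in response (a sketch only; it simply restates what the test below does):

"""End-to-end test: execute a Spark Connect Colab notebook on Vertex AI.

The test creates a NotebookExecutionJob through the Vertex AI NotebookService,
pointing it at a pre-provisioned NotebookRuntimeTemplate and a notebook stored
in GCS, waits for the long-running operation to finish, then lists the
execution jobs and asserts that the matching job reached JOB_STATE_SUCCEEDED.
"""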
import uuid
import pytest
import logging

LOGGER = logging.getLogger(__name__)

REPOSITORY_ID = "97193e1e-c5d1-4ce8-bc6f-cf206c701624"
TEMPLATE_ID = "6409629422399258624"
Review comment on lines +25 to +26: For better maintainability and flexibility, these hardcoded IDs should be configurable via environment variables, with the current values as defaults. This pattern is already used for project, region, and service account. This also applies to the GCS URIs hardcoded on lines 89 and 92. I recommend extracting them to module-level constants as well, for example:

GCS_NOTEBOOK_URI = os.getenv(
    "GCS_NOTEBOOK_URI",
    "gs://e2e-testing-bucket/input/notebooks/spark_connect_e2e_notebook_test.ipynb",
)
GCS_OUTPUT_URI = os.getenv("GCS_OUTPUT_URI", "gs://e2e-testing-bucket/output")

Reply: @fangyh20 I think we can ignore this comment but please LMK.
Review comment on lines +25 to +26: May we add a comment describing what these vars are?
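One way to document them (a sketch; the descriptions are assumptions inferred from how the constants are used further down in the test):

# Dataform repository backing the Colab Enterprise notebook
# (assumed purpose, inferred from dataform_repository_source below).
REPOSITORY_ID = "97193e1e-c5d1-4ce8-bc6f-cf206c701624"
# Pre-provisioned NotebookRuntimeTemplate that supplies the compute
# configuration for the execution job (assumed purpose, inferred from
# notebook_runtime_template_resource_name below).
TEMPLATE_ID = "6409629422399258624"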
@pytest.fixture
def test_project():
    return os.getenv("GOOGLE_CLOUD_PROJECT")


@pytest.fixture
def test_region():
    return os.getenv("GOOGLE_CLOUD_REGION")


@pytest.fixture
def test_service_account():
    return os.getenv("DATAPROC_SPARK_CONNECT_SERVICE_ACCOUNT")


@pytest.fixture
def test_template():
    return TEMPLATE_ID


@pytest.fixture
def test_repository():
    return REPOSITORY_ID


def test_executing_colab_notebook(
    test_project,
    test_region,
    test_service_account,
    test_template,
    test_repository,
):
    """Test executing a Colab notebook that uses Spark Connect."""
    test_api_endpoint = f"{test_region}-aiplatform.googleapis.com"
    test_parent = f"projects/{test_project}/locations/{test_region}"
    test_execution_display_name = (
        f"spark-connect-e2e-notebook-test-{uuid.uuid4().hex}"
    )

    LOGGER.info(
        f"Starting notebook execution job with display name: {test_execution_display_name}"
    )

    notebook_service_client = aiplatform_v1.NotebookServiceClient(
        client_options={
            "api_endpoint": test_api_endpoint,
        }
    )

    operation = notebook_service_client.create_notebook_execution_job(
        parent=test_parent,
        notebook_execution_job={
            "display_name": test_execution_display_name,
            # Specify a NotebookRuntimeTemplate to source compute configuration from
            "notebook_runtime_template_resource_name": f"projects/{test_project}/locations/{test_region}/notebookRuntimeTemplates/{test_template}",
Review comment: Do we need to preconfigure the project for this to work somehow? If so, let's document it.
            # Specify a Colab Enterprise notebook to run
            "dataform_repository_source": {
                "dataform_repository_resource_name": f"projects/{test_project}/locations/{test_region}/repositories/{test_repository}",
            },
            "gcs_notebook_source": {
                "uri": "gs://e2e-testing-bucket/input/notebooks/spark_connect_e2e_notebook_test.ipynb",
Review comment: Could we upload the test content from the repo for testing, so we have one place for managing all test dependencies? Ideally, we should be able to test existing integration test code and new test cases in the executor within the GitHub repo.

Review comment: This bucket name seems a little too generic for the global namespace; maybe we can use something like …
            },
            # Specify a Cloud Storage bucket to store output artifacts
            "gcs_output_uri": "gs://e2e-testing-bucket/output",
            # Run as the service account instead
            "service_account": f"{test_service_account}",
        },
    )
    LOGGER.info("Waiting for operation to complete...")

    result = operation.result()
    LOGGER.info(f"Notebook execution uri: {result}")

    notebook_execution_jobs = (
        notebook_service_client.list_notebook_execution_jobs(parent=test_parent)
    )
    executed_job = list(
        filter(
            lambda job: job.display_name == test_execution_display_name,
            notebook_execution_jobs,
        )
    )

    assert len(executed_job) == 1
    executed_job = executed_job[0]

    LOGGER.info(executed_job)

    LOGGER.info(f"Job status: {executed_job.job_state}")
    assert executed_job.job_state == JobState.JOB_STATE_SUCCEEDED
Review comment: nit: sort alphabetically.
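If the nit refers to the import block at the top of the test file (an assumption, since the comment's anchor line isn't shown here), the alphabetically sorted, PEP 8-grouped version would look like:

import logging
import os
import uuid

import pytest

from google.cloud import aiplatform_v1
from google.cloud.aiplatform_v1.types import JobState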