chore(hdfs): Update container images ahead of Stackable Release 25.7.0 #1077

Open · 7 of 14 tasks · Tracked by #1064
NickLarsenNZ opened this issue May 5, 2025

Part of #1064.

  • Remove support for 3.3.4 and 3.4.0
  • Remove support for 3.3.6

Caution

Hadoop 3.3.6 is still required by Hive and Spark in this release.

Tip

Please add the scheduled-for/20XX-XX label and add this issue to the Stackable Engineering project.

Update tasks

  • Update versions.py to reflect the agreed-upon versions in the spreadsheet, including the removal of old versions (see the sketch after this list).
  • Upload the new version (see hadoop/upload_new_hadoop_version.sh). Not applicable this release: no new versions.
  • Update versions.py to the latest supported JVM version (base and devel).
  • Update other dependencies if applicable (e.g. hdfs_utils, jmx_exporter, protobuf).
  • Check other operators (getting_started / kuttl / supported-versions) for usage of these versions. Add the PR(s) to the list below.
  • Update the version in demos. Add the PR(s) to the list below.
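
For orientation only, a minimal sketch of what the remaining hadoop/versions.py entries might look like after the removals. The key names and the concrete versions (3.3.6, 3.4.1, Java 11) are assumptions based on the context above, not the actual file contents; check the real file and the agreed-upon versions in the spreadsheet before editing.

# Hypothetical sketch of hadoop/versions.py after this change.
# Key names and version numbers are assumptions; verify against the
# real file and the release spreadsheet.
versions = [
    # 3.3.4 and 3.4.0 are dropped as part of this issue.
    {
        "product": "3.3.6",   # kept for now: still required by Hive and Spark
        "java-base": "11",    # bump to the latest supported JVM if agreed
        "java-devel": "11",
    },
    {
        "product": "3.4.1",   # hypothetical newer line, if agreed in the spreadsheet
        "java-base": "11",
        "java-devel": "11",
    },
]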

Related Pull Requests

Tip

Delete any items that do not apply so that all applicable items can be checked.
For example, if you add release notes to the documentation repository, you do not need the latter two criteria.

Acceptance

Tip

This list should be completed by the assignee(s) once the respective PRs have been merged. Once all items have been
checked, the issue can be moved into Development: Done.

  • Can build the image (either locally or in CI)
  • Kuttl smoke tests pass (either locally or in CI)
Testing instructions
# See the latest version at https://pypi.org/project/image-tools-stackabletech/
pip install image-tools-stackabletech==0.0.13

bake --product hadoop=x.y.z # where x.y.z is the new version added in this PR

kind load docker-image oci.stackable.tech/sdp/hadoop:x.y.z-stackable0.0.0-dev

# Change directory into the hdfs-operator repository and update the
# product version in tests/test-definition.yaml
./scripts/run-tests --test-suite smoke-latest # or similar

Please consider updating this template if these instructions are wrong or could be made clearer.

Projects
Status: Development: In Progress