
Conversation

Contributor

@dibryant dibryant commented Nov 14, 2025

Fixes for https://issues.redhat.com/browse/RHAIENG-1758

Description

Updated the GitHub Actions workflow to install the missing Pandoc dependency.

How Has This Been Tested?

Self checklist (all need to be checked):

  • Ensure that you have run make test (gmake on macOS) before asking for review
  • Changes to everything except Dockerfile.konflux files should be done in odh/notebooks and automatically synced to rhds/notebooks. For Konflux-specific changes, modify Dockerfile.konflux files directly in rhds/notebooks as these require special attention in the downstream repository and flow to the upcoming RHOAI release.

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work

Summary by CodeRabbit

  • Tests

    • Improved PDF export test reliability by broadening the skip condition to include an additional architecture and updating the skip message for clarity.
  • Chores

    • Updated commit references for multiple runtime and workbench components so images and components point to the latest builds across datascience, minimal, pytorch, llmcompressor, rocm, tensorflow, and various workbench variants.

✏️ Tip: You can customize this high-level summary in your review settings.

Contributor

coderabbitai bot commented Nov 14, 2025

Walkthrough

Add architecture-based skip for test_pdf_export (skip on s390x or ppc64le) and update its skip message; update multiple commit-hash entries in manifests/base/commit-latest.env to new values.
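
A minimal sketch of the approach described above, assuming the architecture comes from `uname -m`; the helper and test names here are illustrative, not the actual test code:

import subprocess

import pytest


def _machine_arch() -> str:
    # Hypothetical helper: read the machine architecture from `uname -m`.
    return subprocess.run(
        ["uname", "-m"], capture_output=True, text=True, check=True
    ).stdout.strip()


def test_pdf_export_sketch() -> None:
    arch = _machine_arch()
    if arch in ("s390x", "ppc64le"):
        # Matches the broadened skip condition: PDF export is unsupported on these platforms.
        pytest.skip(f"PDF export is not supported on {arch}")
    # ... the real test would launch the workbench container and export a notebook to PDF here ...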

Changes

  • Test platform-specific skip logic (tests/containers/workbenches/jupyterlab/jupyterlab_test.py): Compute architecture from uname output and skip test_pdf_export when arch is in (s390x, ppc64le); update skip message accordingly; no other control-flow changes.
  • Commit-hash updates (manifests/base/commit-latest.env): Replace existing commit-hash mappings with updated commit identifiers for multiple components (e.g., odh-pipeline-runtime variants and odh-workbench images including jupyter and rstudio); values changed, file structure unchanged.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • Review attention:
    • Verify uname parsing and membership check correctly detect s390x and ppc64le.
    • Confirm updated skip message text and CI behavior.
    • Validate commit-latest.env formatting and that each updated commit hash targets the intended component.

Pre-merge checks and finishing touches

❌ Failed checks (3 warnings)
  • Title check: ⚠️ Warning. The title references a ticket (RHAIENG-1758) and mentions 'Revise Tests for 2025b Onboarding: PandocMissing', but the actual changes shown in the raw_summary involve updating test skip conditions and commit hashes, not adding missing Pandoc or revising onboarding tests in an obvious way. Resolution: clarify the title to accurately reflect the main changes, either by specifying the commit hash updates and test skip logic changes or by verifying that the PR description matches the actual code changes shown in the summary.
  • Description check: ⚠️ Warning. The PR description is incomplete: the 'How Has This Been Tested?' section is empty, and all self-checklist items remain unchecked, indicating the developer has not confirmed running tests or verifying that the changes work as required by the template. Resolution: complete the testing section with details of how the changes were tested, check the self-checklist items after running make test, and verify that manual testing was performed.
  • Docstring coverage: ⚠️ Warning. Docstring coverage is 0.00%, which is insufficient; the required threshold is 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.
✨ Finishing touches
  • 📝 Generate docstrings
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment

📜 Recent review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 091eec1 and f2e4f50.

📒 Files selected for processing (2)
  • manifests/base/commit-latest.env (1 hunks)
  • tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • tests/containers/workbenches/jupyterlab/jupyterlab_test.py
🧰 Additional context used
🪛 GitHub Actions: Validation of image references (image SHAs) in params.env and runtime images
manifests/base/commit-latest.env

[error] 1-1: Variable definitions in commit-latest.env fail validation (see related commit.env errors)

🔇 Additional comments (1)
manifests/base/commit-latest.env (1)

1-20: File validation requirements need verification before action can be taken.

The file manifests/base/commit-latest.env currently contains 20 entries, but verify whether the stated requirement of 46 total records is accurate. Identify the validation schema or requirements document that defines which 26 additional component entries are needed. Check whether this represents a regression from the previous state or if this validation requirement is applied during the build process.
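
For illustration, a check along these lines could be run locally before pushing; the entry pattern and the 46-entry figure are assumptions taken from the comment above, not a confirmed schema:

import re
from pathlib import Path

ENTRY_RE = re.compile(r"^[A-Za-z0-9._-]+=[0-9a-f]{7,40}$")  # KEY=<abbreviated or full git hash>
EXPECTED_ENTRIES = 46  # assumption: the count referenced above, not verified against the validator


def validate_commit_env(path: str = "manifests/base/commit-latest.env") -> list[str]:
    """Return a list of problems found in the env file (an empty list means it looks fine)."""
    lines = [ln for ln in Path(path).read_text().splitlines() if ln.strip()]
    problems = [f"malformed entry: {ln}" for ln in lines if not ENTRY_RE.match(ln)]
    if len(lines) != EXPECTED_ENTRIES:
        problems.append(f"expected {EXPECTED_ENTRIES} entries, found {len(lines)}")
    return problems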


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@github-actions github-actions bot added the review-requested label (GitHub Bot creates notification on #pr-review-ai-ide-team slack channel) Nov 14, 2025
@openshift-ci openshift-ci bot requested review from atheo89 and daniellutz November 14, 2025 01:23
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-34: Remove sudo and pin Pandoc version for deterministic builds.

GitHub Actions runners have sufficient permissions; sudo is typically unnecessary and can cause issues on certain runners. Additionally, Pandoc's version should be pinned to ensure reproducible builds and prevent unexpected breaking changes.

Apply this diff to remove sudo and pin the version:

-      - name: Install Pandoc
-        run: sudo apt-get update && sudo apt-get install -y pandoc
+      - name: Install Pandoc
+        run: apt-get install -y pandoc=3.1.*

Alternatively, if you prefer not to pin the version and just want the latest available Pandoc:

-      - name: Install Pandoc
-        run: sudo apt-get update && sudo apt-get install -y pandoc
+      - name: Install Pandoc
+        run: |
+          apt-get update
+          apt-get install -y pandoc
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e9c915d and 1b7f86e.

📒 Files selected for processing (1)
  • .github/workflows/build-notebooks-push.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-34: Pandoc installation may be in the wrong job—verify placement and necessity.

The Pandoc installation step is added to the gen job, which only runs gen_gha_matrix_jobs.py to generate the build matrix. Pandoc is typically needed during actual notebook building, not matrix generation. Confirm whether:

  1. The Python matrix generation script actually requires Pandoc
  2. This step should instead be in the build job(s) or in ./.github/workflows/build-notebooks-TEMPLATE.yaml

If Pandoc is not needed for matrix generation, this step should be moved to avoid unnecessary overhead and potential ordering issues.
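
If Pandoc turns out not to be needed at matrix-generation time, one alternative (a sketch under that assumption, not what this PR implements) is to gate the affected test on Pandoc's presence in the environment that runs it:

import shutil

import pytest

# Hypothetical marker: skip when pandoc is not available on the PATH of the
# environment executing the test, rather than installing it in the gen job.
requires_pandoc = pytest.mark.skipif(
    shutil.which("pandoc") is None,
    reason="pandoc is not installed, so PDF export cannot be exercised",
)


@requires_pandoc
def test_pdf_export_sketch() -> None:
    """Placeholder body; the real test would drive the PDF export flow."""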

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
.github/workflows/build-notebooks-push.yaml (1)

33-36: Install Pandoc step looks good, minor clarity suggestion on sudo usage.

The placement and logic are correct—Pandoc is installed after checkout and before the matrix generation step that likely depends on it. The -y flag for automatic confirmation is appropriate for CI/CD.

Minor note: sudo is unnecessary in GitHub Actions ubuntu-latest runners, which typically run as root. You can simplify to just apt-get update && apt-get install -y pandoc for clarity.

       - name: Install Pandoc
         run: |
-          sudo apt-get update
-          sudo apt-get install -y pandoc
+          apt-get update
+          apt-get install -y pandoc
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4c3d496 and c9cfd1b.

📒 Files selected for processing (1)
  • .github/workflows/build-notebooks-push.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (31)
  • GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
  • GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rstudio-c9s-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
  • GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
  • GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
  • GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
  • GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
.github/workflows/build-notebooks-push.yaml (1)

20-47: Reminder: Complete the PR checklist before requesting review removal.

This PR is marked as WIP and the self-checklist indicates you should run make test (gmake on macOS) before requesting review, and ensure the merge criteria are met. Please complete these checks before changing the PR status from work-in-progress.

@openshift-ci openshift-ci bot added size/xs and removed size/xs labels Dec 4, 2025
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1)

60-66: Align the skip comment with the expanded architecture list

The runtime check now skips on both s390x and ppc64le, but the comment above still says “Skip if we're running on s390x architecture”. Consider updating it to mention both architectures (or describe it generically, e.g. “architectures where PDF export is not supported”) so the comment stays in sync if the list changes again.
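
For illustration, a hedged sketch of a comment and guard kept in sync; platform.machine() stands in for however the test actually derives the architecture, and the helper name is hypothetical:

import platform

import pytest


def _skip_if_pdf_export_unsupported() -> None:
    arch = platform.machine()
    # Skip on architectures where PDF export is not supported (currently s390x and ppc64le).
    if arch in ("s390x", "ppc64le"):
        pytest.skip(f"PDF export is not supported on {arch}")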

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b108875 and c9d2eda.

📒 Files selected for processing (1)
  • tests/containers/workbenches/jupyterlab/jupyterlab_test.py (1 hunks)

Contributor

openshift-ci bot commented Dec 5, 2025

@dibryant: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

  • ci/prow/notebook-jupyter-ubi9-python-3-12-pr-image-mirror (commit 5796b68, required): /test notebook-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror (commit 5796b68, required): /test notebook-rocm-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror (commit 5796b68, required): /test notebook-cuda-jupyter-ubi9-python-3-12-pr-image-mirror
  • ci/prow/notebooks-py312-ubi9-e2e-tests (commit 5796b68, required): /test notebooks-py312-ubi9-e2e-tests
  • ci/prow/images (commit f2e4f50, required): /test images

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.

@opendatahub-io opendatahub-io deleted a comment from coderabbitai bot Dec 5, 2025
Member

@jiridanek jiridanek left a comment


Please undo the updates to commit-latest.env because it's out of scope for this PR. It's true that the values are out of date, and if you bring that up on some meeting, I'll give you a +100, but addressing it one-off like this in an unrelated PR, I don't like that.

Also please update the comment that Rabbit flagged the way it suggests it.

With these two things done, /lgtm

    @allure.description("Check that PDF export is working correctly")
    def test_pdf_export(self, jupyterlab_image: conftest.Image) -> None:
        container = WorkbenchContainer(image=jupyterlab_image.name, user=4321, group_add=[0])
        # Skip if we're running on s390x architecture
Member


@dibryant please update the comment according to rabbit's suggestion

@@ -1,20 +1,20 @@
odh-pipeline-runtime-datascience-cpu-py312-ubi9-commit-n=d91bb6a
Member


this may be well, good, and true, but why update it in a PR that has nothing to do with this?

Contributor

openshift-ci bot commented Dec 5, 2025

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: jiridanek

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci openshift-ci bot added the approved label Dec 5, 2025

Labels

approved, lgtm, review-requested (GitHub Bot creates notification on #pr-review-ai-ide-team slack channel), size/m
