
WIP - do not merge - support parameter inputs in kfp-kubernetes #71

Conversation

gregsheremeta

Proof of concept for supporting parameter inputs in kfp-kubernetes. We've added support first to secret.py.

Ref: kubeflow#10534
Ref: kubeflow#10914

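To illustrate the intended usage, here is a hedged sketch of a pipeline that passes a pipeline parameter as the secret name. This is hypothetical: today `secret_name` must be a constant string, and accepting a parameter is exactly what this PR prototypes; the pipeline and component names are made up for illustration.

```python
from kfp import dsl, kubernetes

@dsl.component
def comp():
    import os
    print(os.environ.get('PASSWORD'))

@dsl.pipeline
def my_pipeline(secret_name: str = 'my-secret'):
    task = comp()
    # Today secret_name must be a hard-coded string; this PR prototypes
    # accepting the pipeline parameter above instead.
    kubernetes.use_secret_as_env(
        task,
        secret_name=secret_name,
        secret_key_to_env={'password': 'PASSWORD'},
    )
```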
@gregsheremeta

/hold


openshift-ci bot commented Aug 11, 2024

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please ask for approval from gregsheremeta. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. If these are not commits from upstream kfp, please ensure you adhere to the commit checker formatting.
commitchecker version unknown
Validating 1 commits between bbfb5897533403b38ad8956424931868c103b0b0...3844a58bf8b4196a202c36068475d02d65306da0

UPSTREAM commit 3844a58 has invalid summary WIP - do not merge - support parameter inputs in kfp-kubernetes.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert
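For reference, the acceptance check above can be reproduced with Python's stdlib `re` module. A minimal sketch follows; the character class `[\w-.-]` from the bot output is transcribed as the equivalent `[\w.-]` (dot and hyphen as literals).

```python
import re

# Prefix check sketching the commit checker's validation; the class
# [\w-.-] from the bot output is written here as the equivalent [\w.-].
UPSTREAM_RE = re.compile(
    r'^UPSTREAM: (revert: )?(([\w.-]+/[\w.-]+)?: )?(\d+:|<carry>:|<drop>:)'
)

def is_valid_summary(summary: str) -> bool:
    """Return True if a commit summary matches the UPSTREAM convention."""
    return UPSTREAM_RE.match(summary) is not None

# The PR title that the checker rejected:
is_valid_summary('WIP - do not merge - support parameter inputs in kfp-kubernetes')  # False
# A valid form:
is_valid_summary('UPSTREAM: <carry>: support parameter inputs in kfp-kubernetes')  # True
```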


@dsp-developers

A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-71
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-71
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-71
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-71
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-71
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-71

@dsp-developers

An OCP cluster where you are logged in as cluster admin is required.

The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named dspa.pr-71.yaml:

apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-71
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-71"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-71"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-71"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-71"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-71"
  mlmd:  
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component 
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-71"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'

Then run the following:

cd $(mktemp -d)
git clone [email protected]:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/71/head
git checkout -b pullrequest 3844a58bf8b4196a202c36068475d02d65306da0
oc apply -f dspa.pr-71.yaml

More instructions here on how to deploy and test a Data Science Pipelines Application.


@diegolovison left a comment


Without the code change that I suggested, I can't assess this PR. Please change the Python code and regenerate the YAMLs.


@dsl.component
def comp():
    pass

Replace pass with the following:

    with open('/mnt/my_secret') as f:
        print(f.read())
@gregsheremeta

per conversation with @chensun in KFP Community meeting, we'll pursue something called f-strings (Ref: https://github.com/kubeflow/pipelines/blob/4c955f4780839702dc4924f8f4e7c90aa251b826/sdk/python/kfp/dsl/pipeline_channel.py#L383)

We don't yet understand what f-strings are...

/close
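For context, the referenced pipeline_channel.py interpolates a pipeline channel into a string as a placeholder of the form `{{channel:task=<task>;name=<name>;type=<type>;}}` (that format is an assumption based on reading the linked source). A minimal stdlib sketch of extracting such placeholders from a string, with a hypothetical `extract_channels` helper:

```python
import re

# Assumed placeholder format for a pipeline channel interpolated into a
# string, based on the referenced pipeline_channel.py:
#   {{channel:task=<task>;name=<name>;type=<type>;}}
CHANNEL_RE = re.compile(r'\{\{channel:task=([^;]*);name=([^;]+);type=([^;]*);\}\}')

def extract_channels(text: str):
    """Return (task, name, type) tuples for each channel placeholder."""
    return CHANNEL_RE.findall(text)

extract_channels('secret-{{channel:task=;name=suffix;type=String;}}')
# → [('', 'suffix', 'String')]
```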

@openshift-ci openshift-ci bot closed this Sep 6, 2024

openshift-ci bot commented Sep 6, 2024

@gregsheremeta: Closed this PR.


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@dsp-developers

Commit Checker results:

commitchecker version unknown
Validating 0 commits between a8fbbd2020d280f4cab31245a9639a350207b3f9...3844a58bf8b4196a202c36068475d02d65306da0
