How to get the status of a pipeline run within a component, running on Vertex AI?
Previously, using Kubeflow Pipelines SDK v1, the status of a pipeline could be inferred during pipeline execution by passing an Argo placeholder, {{workflow.status}}, to the component, as shown below:
import kfp.dsl as dsl

component_1 = dsl.ContainerOp(
    name='An example component',
    image='eu.gcr.io/.../my-component-img',
    arguments=[
        'python3', 'main.py',
        '--status', '{{workflow.status}}'
    ]
)
This placeholder would take the value Succeeded or Failed when passed to the component. One use case for this would be to send a failure warning to, e.g., Slack, in combination with dsl.ExitHandler, as sketched below.
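For context, the pattern I have in mind looks roughly like this (a minimal sketch, not my actual code; the notifier image and script names are placeholders):

import kfp.dsl as dsl

@dsl.pipeline(name='example-pipeline')
def pipeline():
    # Exit-handler op that receives the Argo status placeholder; at runtime
    # it could post a Slack warning when the status is 'Failed'.
    exit_op = dsl.ContainerOp(
        name='notify-on-exit',
        image='eu.gcr.io/.../my-notifier-img',   # placeholder image
        arguments=[
            'python3', 'notify.py',              # hypothetical script
            '--status', '{{workflow.status}}',   # resolved by Argo at runtime
        ],
    )
    # Everything inside the ExitHandler context is the main workload;
    # exit_op runs afterwards regardless of success or failure.
    with dsl.ExitHandler(exit_op):
        dsl.ContainerOp(
            name='An example component',
            image='eu.gcr.io/.../my-component-img',
            arguments=['python3', 'main.py'],
        )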
However, when using Pipelines SDK version 2, kfp.v2, together with Vertex AI to compile and run the pipeline, the Argo placeholders no longer work, as described in this open issue. Because of this, I need another way to check the status of the pipeline within the component. I was thinking I could use the kfp.Client class, but I'm assuming this won't work with Vertex AI, since there is no real "host". Also, there seems to be a supported placeholder for passing the run id (dsl.PIPELINE_JOB_ID_PLACEHOLDER), as per this SO post and as illustrated below, but I can't find anything equivalent for the status.
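To illustrate, with kfp.v2 I can already pass the job id into a component like this (a minimal sketch using the @dsl.component decorator; component and parameter names are made up). What I'm missing is an equivalent placeholder for the run's status:

from kfp.v2 import dsl

@dsl.component(base_image='python:3.9')
def report_run(run_id: str):
    # At runtime run_id holds the concrete Vertex AI pipeline job id,
    # but there is no comparable way to receive the run's status here.
    print(f'Pipeline job id: {run_id}')

@dsl.pipeline(name='example-pipeline-v2')
def pipeline():
    # PIPELINE_JOB_ID_PLACEHOLDER is resolved by Vertex AI when the job runs.
    report_run(run_id=dsl.PIPELINE_JOB_ID_PLACEHOLDER)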
Any ideas how to get the status of a pipeline run within a component, running on Vertex AI?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow