DSL Static Type Checking

Statically check the component I/O types

This page describes how to integrate type information into your pipeline and how to use static type checking for faster development iterations.

Motivation

A pipeline is a workflow consisting of components, and each component has inputs and outputs. The DSL compiler supports static type checking to ensure type consistency among the component I/Os within the same pipeline. Static type checking helps you identify component I/O inconsistencies without running the pipeline, and it shortens development cycles by catching errors early. This feature is especially useful in two cases:

  • When the pipeline is huge and manually checking the types is infeasible;
  • When some components are shared and their type information is not immediately available.

Type system

In Kubeflow Pipelines, a type is defined as a type name with an OpenAPI Schema property, which defines the input parameter schema. Warning: the pipeline system does not currently check the input value against the schema when you submit a pipeline run; this feature is planned for the near future.

There is a set of core types defined in the pipeline SDK; you can use these core types or define your own custom types, as sketched below.
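For example, here is a minimal sketch, assuming the KFP v1 SDK, of importing a few core types and defining a custom type by subclassing BaseType (the CustomizedType class and its pattern are hypothetical, for illustration only):

    # A few of the core types that ship with the pipeline SDK (KFP v1).
    from kfp.dsl.types import Integer, Float, GCSPath, GCRPath

    # A custom type can be defined by subclassing BaseType; the class
    # attribute carries the OpenAPI Schema used to describe the parameter.
    from kfp.dsl.types import BaseType

    class CustomizedType(BaseType):
        openapi_schema_validator = {
            "type": "string",
            "pattern": "^custom://.*$",  # hypothetical pattern
        }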

In the component YAML, types are specified as a string or as a dictionary with the OpenAPI Schema, as illustrated below. “component a” expects an input of type Integer and emits three outputs with the types GCSPath, customized_type, and GCRPath. Among these, Integer, GCSPath, and GCRPath are core types predefined in the SDK, while customized_type is a user-defined type.

    name: component a
    description: component desc
    inputs:
    - {name: field_l, type: Integer}
    outputs:
    - {name: field_m, type: {GCSPath: {openapi_schema_validator: {type: string, pattern: "^gs://.*$" } }}}
    - {name: field_n, type: customized_type}
    - {name: field_o, type: GCRPath}
    implementation:
      container:
        image: gcr.io/ml-pipeline/component-a
        command: [python3, /pipelines/component/src/train.py]
        args: [
          --field-l, {inputValue: field_l},
        ]
        fileOutputs:
          field_m: /schema.txt
          field_n: /feature.txt
          field_o: /output.txt
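The SDK picks up this type information when the YAML is loaded into a task factory, so any pipeline that wires the component up is subject to static type checking at compile time. A minimal sketch, assuming the YAML above is saved as component_a.yaml (a hypothetical file name):

    import kfp.components as comp

    # The loaded task factory carries the I/O types declared in the YAML.
    task_factory_a = comp.load_component_from_file('component_a.yaml')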

Similarly, when you write a component with the decorator, you can annotate I/O with types in the function signature, as shown below.

    from kfp.dsl import component, ContainerOp
    from kfp.dsl.types import Integer, GCRPath

    @component
    def task_factory_a(field_l: Integer()) -> {
        'field_m': {
            'GCSPath': {
                'openapi_schema_validator':
                    '{"type": "string", "pattern": "^gs://.*$"}'
            }
        },
        'field_n': 'customized_type',
        'field_o': GCRPath()
    }:
        return ContainerOp(
            name='operator a',
            image='gcr.io/ml-pipeline/component-a',
            command=['python3', '/pipelines/component/src/train.py'],
            arguments=[
                '--field-l',
                field_l,
            ],
            file_outputs={
                'field_m': '/schema.txt',
                'field_n': '/feature.txt',
                'field_o': '/output.txt'
            })

You can also annotate pipeline inputs with types, and the inputs are checked against the component I/O types as well. For example:

    import kfp.dsl as dsl
    import kfp.compiler as compiler
    from kfp.dsl import component, ContainerOp
    from kfp.dsl.types import Integer, InconsistentTypeException

    @component
    def task_factory_a(
            field_m: {
                'GCSPath': {
                    'openapi_schema_validator':
                        '{"type": "string", "pattern": "^gs://.*$"}'
                }
            },
            field_o: 'Integer'):
        return ContainerOp(
            name='operator a',
            image='gcr.io/ml-pipeline/component-a',
            arguments=[
                '--field-l',
                field_m,
                '--field-o',
                field_o,
            ],
        )

    # Pipeline input types are also checked against the component I/O types.
    @dsl.pipeline(name='type_check', description='')
    def pipeline(
            a: {
                'GCSPath': {
                    'openapi_schema_validator':
                        '{"type": "string", "pattern": "^gs://.*$"}'
                }
            } = 'good',
            b: Integer() = 12):
        task_factory_a(field_m=a, field_o=b)

    try:
        compiler.Compiler().compile(pipeline, 'pipeline.tar.gz', type_check=True)
    except InconsistentTypeException as e:
        print(e)
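In this example the annotated types match, so compilation succeeds. A single mismatched annotation is enough to trigger the exception. For instance, in this hypothetical variation (reusing the definitions above), the pipeline input is annotated as 'Float' while the component expects 'Integer':

    # 'Float' (pipeline input) vs. 'Integer' (component input): a mismatch.
    @dsl.pipeline(name='type_check_fail', description='')
    def bad_pipeline(b: 'Float' = 12):
        task_factory_a(field_m='gs://ml-pipeline/data', field_o=b)

    try:
        compiler.Compiler().compile(bad_pipeline, 'pipeline.tar.gz', type_check=True)
    except InconsistentTypeException as e:
        print(e)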

How does the type checking work?

The basic checking criterion is equality: type checking passes only when the type name strings are equal and the corresponding OpenAPI Schema properties are equal. Examples of type checking failures are:

  • “GCSPath” vs. “GCRPath”
  • “Integer” vs. “Float”
  • {‘GCSPath’: {‘openapi_schema_validator’: ‘{“type”: “string”, “pattern”: “^gs://.*$”}’}} vs.
    {‘GCSPath’: {‘openapi_schema_validator’: ‘{“type”: “string”, “pattern”: “^gcs://.*$”}’}}
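Because the criterion is plain equality over the type name and its schema properties, the last failure above amounts to comparing two structures that differ in a single character of the pattern. An illustration in plain Python (this is the equality idea, not the SDK's internal code):

    type_a = {'GCSPath': {'openapi_schema_validator':
                          '{"type": "string", "pattern": "^gs://.*$"}'}}
    type_b = {'GCSPath': {'openapi_schema_validator':
                          '{"type": "string", "pattern": "^gcs://.*$"}'}}

    # Same type name, different schema property: the types are not equal,
    # so type checking fails.
    print(type_a == type_b)  # False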

If inconsistent types are detected, the compiler raises an InconsistentTypeException.

Type checking configuration

Type checking is enabled by default, and it can be disabled in two ways:

If you compile the pipeline programmatically:

    compiler.Compiler().compile(pipeline_a, 'pipeline_a.tar.gz', type_check=False)

If you compile the pipeline using the dsl-compiler tool:

    dsl-compiler --py pipeline.py --output pipeline.zip --disable-type-check

Fine-grained configuration

Sometimes you might want to keep type checking enabled while disabling it for certain arguments. For example, when the upstream component generates an output with type “Float” and the downstream component can ingest either “Float” or “Integer”, type checking would fail if you declared the downstream type as “Float_or_Integer”, because type names are compared for exact equality. For such cases, disabling the type check per argument is also supported, as shown below.

    import kfp.dsl as dsl
    import kfp.compiler as compiler

    # task_factory_a and task_factory_b are component factories defined elsewhere.
    @dsl.pipeline(name='type_check_a', description='')
    def pipeline():
        a = task_factory_a(field_l=12)
        # For each of the arguments, you can ignore the type by calling
        # the ignore_type function on the upstream output.
        b = task_factory_b(
            field_x=a.outputs['field_n'],
            field_y=a.outputs['field_o'],
            field_z=a.outputs['field_m'].ignore_type())

    compiler.Compiler().compile(pipeline, 'pipeline.tar.gz', type_check=True)

Missing types

The DSL compiler passes type checking when either the upstream or the downstream component lacks type information for a parameter; the effect is the same as ignoring the type for that parameter. However, type checking still fails if the I/O types that are present are incompatible, even when other I/Os lack type information. A sketch of the passing case follows.
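For illustration, here is a minimal sketch in which the upstream output carries no type annotation, so connecting it to a typed downstream input passes type checking. The producer and consumer components and their images are hypothetical:

    import kfp.dsl as dsl
    import kfp.compiler as compiler
    from kfp.dsl import component, ContainerOp
    from kfp.dsl.types import GCSPath

    @component
    def producer():
        # No output type information is declared here.
        return ContainerOp(
            name='producer',
            image='gcr.io/ml-pipeline/producer',  # hypothetical image
            file_outputs={'field_m': '/output.txt'})

    @component
    def consumer(field_m: GCSPath()):
        return ContainerOp(
            name='consumer',
            image='gcr.io/ml-pipeline/consumer',  # hypothetical image
            arguments=['--field-m', field_m])

    @dsl.pipeline(name='missing_types', description='')
    def pipeline():
        a = producer()
        # The upstream output carries no type, so this connection passes
        # type checking even though the downstream input is typed.
        consumer(field_m=a.outputs['field_m'])

    compiler.Compiler().compile(pipeline, 'pipeline.tar.gz', type_check=True)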

Next steps

Learn how to define a Kubeflow pipeline with the Python DSL and compile the pipeline with type checking: a Jupyter notebook demo.
