Standard workflows#
A standard workflow is defined by a Python function decorated with the @workflow
decorator.
The function body is written in a domain-specific language (DSL), a subset of Python syntax, that describes a directed acyclic graph (DAG) deployed and executed on Union.
The syntax of a standard workflow definition can only include the following (a minimal sketch follows this list):
- Calls to functions decorated with @task and assignment of variables to the returned values.
- Calls to other functions decorated with @workflow and assignment of variables to the returned values (see Subworkflows).
- Calls to LaunchPlan objects (see When to use sub-launch plans).
- Calls to functions decorated with @dynamic and assignment of variables to the returned values (see Dynamic workflows).
- Calls to functions decorated with @eager and assignment of variables to the returned values (see Eager workflows).
- The special conditional construct.
- Statements using the chaining operator >>.
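For example, the following is a minimal sketch of a valid standard workflow body, assuming the flytekit SDK (the union SDK exposes the same decorators); the task and workflow names are hypothetical:

```python
from flytekit import task, workflow

@task
def fetch(url: str) -> str:
    return f"data from {url}"

@task
def summarize(data: str) -> int:
    return len(data)

@workflow
def ingest(url: str) -> int:
    # The body contains only DSL constructs: task calls and assignment
    # of the returned values.
    data = fetch(url=url)
    return summarize(data=data)
```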
Evaluation of a standard workflow#
When a standard workflow is run locally in a Python environment, it is executed as a normal Python function.
However, when it is registered to Union, the top-level @workflow-decorated function is evaluated as follows:
- Inputs to the workflow are materialized as lazily evaluated promises, which are propagated to downstream tasks and subworkflows.
- All values returned by calls to functions decorated with @task, @dynamic, and @eager are also materialized as lazily evaluated promises.
The resulting structure is used to construct the DAG and deploy the required containers to the cluster. The actual evaluation of the promises occurs when the tasks (or dynamic or eager workflows) are executed in their respective containers.
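To make this evaluation model concrete, the sketch below (assuming flytekit; names are hypothetical) annotates where a promise appears:

```python
from flytekit import task, workflow

@task
def count_rows(path: str) -> int:
    return 42  # stand-in implementation

@workflow
def stats(path: str) -> int:
    # At registration time `n` is a promise, not an int. Plain Python
    # logic such as `if n > 10:` or `print(n)` would not see the real
    # value here; the promise is resolved only when count_rows runs in
    # its container. Use the conditional construct for branching.
    n = count_rows(path=path)
    return n
```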
Conditional construct#
Because standard workflows cannot directly include Python if statements, a special conditional construct is provided that allows you to define conditional logic in a workflow.
For details, see Conditionals.
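As a quick illustration, here is a minimal sketch of the conditional construct, assuming the flytekit SDK; the task and workflow names are hypothetical:

```python
from flytekit import conditional, task, workflow

@task
def double(x: int) -> int:
    return x * 2

@task
def negate(x: int) -> int:
    return -x

@workflow
def branch_wf(x: int) -> int:
    # Branch on the input promise without using a Python `if` statement.
    return (
        conditional("sign_check")
        .if_(x >= 0)
        .then(double(x=x))
        .else_()
        .then(negate(x=x))
    )
```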
Chaining operator#
When Union builds the DAG for a standard workflow, it uses the passing of values from one task to another to determine the dependency relationships between tasks.
There may be cases where you want to define a dependency between two tasks that is not based on the output of one task being passed as an input to another.
In that case, you can use the chaining operator >> to define the dependencies between tasks.
For details, see Chaining Flyte entities.
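As a minimal sketch (assuming flytekit; the task and workflow names are hypothetical), two tasks that share no data can still be ordered explicitly:

```python
from flytekit import task, workflow

@task
def create_table():
    print("creating table")

@task
def load_data():
    print("loading data")

@workflow
def etl():
    # Neither task passes a value to the other, so make the execution
    # order explicit with the chaining operator.
    create = create_table()
    load = load_data()
    create >> load
```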
Workflow decorator parameters#
The @workflow decorator can take the following parameters (a sketch follows this list):
- failure_policy: Use the options in flytekit.WorkflowFailurePolicy.
- on_failure: Invoke this workflow or task on failure. The workflow specified must have the same parameter signature as the current workflow, with an additional parameter called error.
- docs: A description entity for the workflow.
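The sketch below illustrates failure_policy and on_failure, assuming a recent flytekit release in which FlyteError is importable from flytekit.types.error; all names are hypothetical and the error-handler signature may vary by version:

```python
from flytekit import WorkflowFailurePolicy, task, workflow
from flytekit.types.error import FlyteError

@task
def risky_step(x: int) -> int:
    return 100 // (x - 1)  # fails when x == 1

@task
def notify(x: int, error: FlyteError) -> None:
    print(f"workflow failed for x={x}: {error.message}")

@workflow
def handle_failure(x: int, error: FlyteError):
    # Same parameter signature as the main workflow, plus `error`.
    notify(x=x, error=error)

@workflow(
    failure_policy=WorkflowFailurePolicy.FAIL_AFTER_EXECUTABLE_NODES_COMPLETE,
    on_failure=handle_failure,
)
def main_wf(x: int) -> int:
    return risky_step(x=x)
```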