Base task#
- flytekit.core.base_task.kwtypes(**kwargs)#
This is a small helper function that converts keyword arguments into an OrderedDict of types, for example:
kwtypes(a=int, b=str)
- Return type:
OrderedDict[str, Type]
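As an illustration (a hypothetical sketch, not taken from the docstring above; the image and command are placeholders), kwtypes is typically used to declare the interface of tasks that have no Python function body, such as a ContainerTask:

from flytekit import ContainerTask, kwtypes

# Declare inputs/outputs with kwtypes, since there is no Python function
# from which the types could be inferred.
calculate_ellipse_area = ContainerTask(
    name="ellipse-area",
    image="ghcr.io/flyteorg/rawcontainers-shell:v2",  # illustrative image
    input_data_dir="/var/inputs",
    output_data_dir="/var/outputs",
    inputs=kwtypes(a=float, b=float),
    outputs=kwtypes(area=float),
    command=["./calculate-ellipse-area.sh", "/var/inputs", "/var/outputs"],
)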
- class flytekit.core.base_task.PythonTask(*args, **kwargs)#
Base class for all tasks with a Python native Interface. This should be used directly for task types that do not have a Python function to be executed. Otherwise, refer to flytekit.PythonFunctionTask.
- compile(ctx, *args, **kwargs)#
Generates a node that encapsulates this task in a workflow definition.
- Parameters:
ctx (FlyteContext)
- Return type:
- construct_node_metadata()#
Used when constructing the node that encapsulates this task as part of a broader workflow definition.
- Return type:
NodeMetadata
- property deck_fields: List[DeckField]#
If not empty, this task will output a deck HTML file for the specified decks.
- property disable_deck: bool#
If true, this task will not output a deck HTML file.
- dispatch_execute(ctx, input_literal_map)#
This method translates Flyte’s Type system based input values, invokes the actual call to the executor, and is also invoked during runtime.
VoidPromise is returned when the task itself declares no outputs.
LiteralMap is returned when the task declares one or more outputs; individual outputs may be None.
DynamicJobSpec is returned when a dynamic workflow is executed.
- Parameters:
ctx (FlyteContext)
input_literal_map (LiteralMap)
- Return type:
LiteralMap | DynamicJobSpec | Coroutine
- property environment: Dict[str, str]#
Any environment variables that are supplied during the execution of the task.
- abstract execute(**kwargs)#
This method will be invoked to execute the task.
- Return type:
Any
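A minimal sketch (illustrative only, not from the docstring above; the class, task type string, and interface are assumptions) of a PythonTask subclass that overrides execute, which is how task types without a user-written Python function typically provide their run logic:

from typing import Any

from flytekit.core.base_task import PythonTask
from flytekit.core.interface import Interface


class EchoTask(PythonTask):
    """Hypothetical task type with no user Python function: it echoes its input."""

    def __init__(self, name: str, **kwargs):
        super().__init__(
            task_type="echo",  # custom task type string, illustrative
            name=name,
            interface=Interface(inputs={"msg": str}, outputs={"out": str}),
            **kwargs,
        )

    def execute(self, **kwargs) -> Any:
        # Inputs arrive as Python native values; the return value is converted
        # back to Flyte literals by dispatch_execute.
        return kwargs["msg"]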
- get_input_types()#
Returns the names and python types as a dictionary for the inputs of this task.
- Return type:
Dict[str, type]
- get_type_for_input_var(k, v)#
Returns the python type for an input variable by name.
- Parameters:
k (str)
v (Any)
- Return type:
Type[Any]
- get_type_for_output_var(k, v)#
Returns the python type for the specified output variable by name.
- Parameters:
k (str)
v (Any)
- Return type:
Type[Any]
- post_execute(user_params, rval)#
Post execute is called after the execution has completed, with the user_params, and can be used to clean up or alter the outputs to match the intended task outputs. If not overridden, this function is a no-op.
- Args:
rval: the value returned from the call to execute. user_params: the modified user params as created during the pre_execute step.
- Parameters:
user_params (ExecutionParameters | None)
rval (Any)
- Return type:
Any
- pre_execute(user_params)#
This method is invoked directly before executing the task method and before all the inputs are converted. One case where this is useful is when the context needs to be modified for the user process to get some user-space parameters. This also ensures that things like SparkSession are already correctly set up before the type transformers are called.
This should return either the same context or the mutated context.
- Parameters:
user_params (ExecutionParameters | None)
- Return type:
ExecutionParameters | None
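As an illustration (a hypothetical, partial sketch that omits __init__ and execute; the class name is an assumption), a plugin task can override pre_execute and post_execute to adjust the execution context and clean up afterwards:

from typing import Any, Optional

from flytekit.core.base_task import PythonTask
from flytekit.core.context_manager import ExecutionParameters


class MyPluginTask(PythonTask):
    """Hypothetical plugin task that tweaks the user context around execution."""

    def pre_execute(self, user_params: Optional[ExecutionParameters]) -> Optional[ExecutionParameters]:
        # Runs before input conversion; a good place to set up sessions or
        # inject user-space parameters. Return the (possibly mutated) context.
        return user_params

    def post_execute(self, user_params: Optional[ExecutionParameters], rval: Any) -> Any:
        # Runs after execute(); can clean up resources or reshape the outputs.
        return rval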
- property task_config: T | None#
Returns the user-specified task config which is used for plugin-specific handling of the task.
- class flytekit.core.base_task.Task(task_type, name, interface, metadata=None, task_type_version=0, security_ctx=None, docs=None, **kwargs)#
The base of all Tasks in flytekit. This class is closest to the FlyteIDL TaskTemplate: it captures the information in the FlyteIDL specification and does not have a Python native interface associated with it. Refer to the derived classes for examples of how to extend this class.
- Parameters:
task_type (str)
name (str)
interface (TypedInterface)
metadata (TaskMetadata | None)
security_ctx (SecurityContext | None)
docs (Documentation | None)
- abstract dispatch_execute(ctx, input_literal_map)#
This method translates Flyte’s Type system based input values, invokes the actual call to the executor, and is also invoked during runtime.
- Parameters:
ctx (FlyteContext)
input_literal_map (LiteralMap)
- Return type:
LiteralMap
- abstract execute(**kwargs)#
This method will be invoked to execute the task.
- Return type:
Any
- get_config(settings)#
Returns the task config as a serializable dictionary. This task config consists of metadata about the custom configuration defined for this task.
- Parameters:
settings (SerializationSettings)
- Return type:
Dict[str, str] | None
- get_container(settings)#
Returns the container definition (if any) that is used to run the task on hosted Flyte.
- Parameters:
settings (SerializationSettings)
- Return type:
Container | None
- get_custom(settings)#
Return additional plugin-specific custom data (if any) as a serializable dictionary.
- Parameters:
settings (SerializationSettings)
- Return type:
Dict[str, Any] | None
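A rough sketch (illustrative only; the class name and returned values are assumptions) of how a plugin task might override get_custom to ship plugin-specific settings to the backend:

from typing import Any, Dict, Optional

from flytekit.configuration import SerializationSettings
from flytekit.core.base_task import PythonTask


class SparkLikeTask(PythonTask):
    """Hypothetical task that serializes plugin-specific configuration."""

    def get_custom(self, settings: SerializationSettings) -> Optional[Dict[str, Any]]:
        # Whatever is returned here ends up in the TaskTemplate's custom field
        # and is interpreted by the matching backend plugin.
        return {"executor_count": 4, "driver_memory": "2Gi"}  # illustrative values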
- get_extended_resources(settings)#
Returns the extended resources to allocate to the task on hosted Flyte.
- Parameters:
settings (SerializationSettings)
- Return type:
ExtendedResources | None
- get_input_types()#
Returns Python native types for the inputs. Since this is not a Python native task (it is the base class), it returns None. The types could be deduced from the literal types, but that is not required here.
- Return type:
Dict[str, type] | None
- get_k8s_pod(settings)#
Returns the kubernetes pod definition (if any) that is used to run the task on hosted Flyte.
- Parameters:
settings (SerializationSettings)
- Return type:
K8sPod | None
- get_sql(settings)#
Returns the Sql definition (if any) that is used to run the task on hosted Flyte.
- Parameters:
settings (SerializationSettings)
- Return type:
Sql | None
- get_type_for_input_var(k, v)#
Returns the Python native type for the given input variable.
- Parameters:
k (str)
v (Any)
- Return type:
type
- get_type_for_output_var(k, v)#
Returns the Python native type for the given output variable.
- Parameters:
k (str)
v (Any)
- Return type:
type
- local_execute(ctx, **kwargs)#
This function is used only in the local execution path and is responsible for calling dispatch_execute. Use this function when calling a task with native values (or Promises containing Flyte literals derived from Python native values).
- Parameters:
ctx (FlyteContext)
- Return type:
- abstract pre_execute(user_params)#
This method is invoked directly before executing the task method and before all the inputs are converted. One case where this is useful is when the context needs to be modified for the user process to get some user-space parameters. This also ensures that things like SparkSession are already correctly set up before the type transformers are called.
This should return either the same context or the mutated context.
- Parameters:
user_params (ExecutionParameters)
- Return type:
- sandbox_execute(ctx, input_literal_map)#
Call dispatch_execute, in the context of a local sandbox execution. Not invoked during runtime.
- Parameters:
ctx (FlyteContext)
input_literal_map (LiteralMap)
- Return type:
LiteralMap
- class flytekit.core.base_task.TaskResolverMixin#
Flytekit tasks interact with the Union platform, very broadly, in two steps: they need to be uploaded to Admin, and then they are run by the user upon request (either as a single task execution or as part of a workflow). In either case, at execution time, for most tasks (that is, those that generate a container target), the container image containing the task needs to be spun up again, at which point the container needs to know which task it is supposed to run and how to rehydrate the task object.
For example, the serialization of a simple task
# in repo_root/workflows/example.py
@task
def t1(...) -> ...:
    ...
might result in a container with arguments like
pyflyte-execute --inputs s3://path/inputs.pb --output-prefix s3://outputs/location --raw-output-data-prefix /tmp/data --resolver flytekit.core.python_auto_container.default_task_resolver -- task-module repo_root.workflows.example task-name t1
At serialization time, the container created for the task will start out automatically with the pyflyte-execute bit, along with the requisite input/output args and the offloaded data prefix. Appended to that will be two things: the location of the task’s task resolver, followed by two dashes, followed by the arguments provided by calling the loader_args function below.
The default_task_resolver declared below knows that:
When loader_args is called on a task, it looks up the module the task is in and the name of the task (the key of the task in the module, either the function name or the variable it was assigned to).
When load_task is called, it interprets the first part of the command as the module to call importlib.import_module on, and then looks for a key t1.
This is just the default behavior. Users should feel free to implement their own resolvers.
- abstract get_all_tasks()#
Future-proof method that makes it easy to access all tasks (not required today, as they are auto-registered).
- Return type:
List[Task]
- abstract load_task(loader_args)#
Given the set of identifier keys, this should return one Python Task or raise an error if not found.
- Parameters:
loader_args (List[str])
- Return type:
Task
- abstract loader_args(settings, t)#
Return a list of strings that can help identify the Task passed in as a parameter.
- Parameters:
settings (SerializationSettings)
t (Task)
- Return type:
List[str]
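A minimal sketch (hypothetical; the in-memory registry and argument layout are assumptions, and depending on the flytekit version additional members such as a resolver name may also need to be implemented) of a custom resolver providing these three methods:

from typing import List

from flytekit.configuration import SerializationSettings
from flytekit.core.base_task import Task, TaskResolverMixin


class DictTaskResolver(TaskResolverMixin):
    """Hypothetical resolver that looks tasks up in an in-memory registry."""

    def __init__(self):
        self._registry = {}  # task name -> Task object

    def register(self, t: Task):
        self._registry[t.name] = t

    def get_all_tasks(self) -> List[Task]:
        return list(self._registry.values())

    def loader_args(self, settings: SerializationSettings, t: Task) -> List[str]:
        # These strings are appended to the container command after the "--"
        # and are handed back to load_task at execution time.
        return ["task-name", t.name]

    def load_task(self, loader_args: List[str]) -> Task:
        # loader_args looks like ["task-name", "<name>"] per loader_args above.
        return self._registry[loader_args[1]]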
- class flytekit.core.base_task.IgnoreOutputs#
This exception should be used to indicate that the outputs generated by this task can be safely ignored. This is useful in the case of distributed training or peer-to-peer parallel algorithms.
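For example (an illustrative sketch; the task body and the way the worker rank is obtained are assumptions), a non-primary worker in a distributed training task might raise IgnoreOutputs so that only the chief worker’s outputs are kept:

from flytekit import task
from flytekit.core.base_task import IgnoreOutputs


@task
def train(rank: int) -> float:
    loss = 0.123  # placeholder for real training logic
    if rank != 0:
        # Only the rank-0 worker reports outputs; everyone else's are ignored.
        raise IgnoreOutputs("Not the chief worker, skipping output upload.")
    return loss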