Definitions

class dagster.Definitions(assets=None, schedules=None, sensors=None, jobs=None, resources=None, executor=None, loggers=None, asset_checks=None, metadata=None)[source]

A set of definitions explicitly available and loadable by Dagster tools.

Parameters:
  • assets (Optional[Iterable[Union[AssetsDefinition, SourceAsset, CacheableAssetsDefinition]]]) – A list of assets. Assets can be created by annotating a function with @asset or @observable_source_asset, or by directly instantiating AssetsDefinition, SourceAsset, or CacheableAssetsDefinition.

  • asset_checks (Optional[Iterable[AssetChecksDefinition]]) – A list of asset checks.

  • schedules (Optional[Iterable[Union[ScheduleDefinition, UnresolvedPartitionedAssetScheduleDefinition]]]) – List of schedules.

  • sensors (Optional[Iterable[SensorDefinition]]) – List of sensors, typically created with @sensor.

  • jobs (Optional[Iterable[Union[JobDefinition, UnresolvedAssetJobDefinition]]]) – List of jobs. Typically created with define_asset_job or with @job for jobs defined in terms of ops directly. Jobs created with @job must already have resources bound at job creation time. They do not respect the resources argument here.

  • resources (Optional[Mapping[str, Any]]) – Dictionary of resources to bind to assets. The resources dictionary takes raw Python objects, not just instances of ResourceDefinition. If that raw object inherits from IOManager, it gets coerced to an IOManagerDefinition. Any other object is coerced to a ResourceDefinition. These resources will be automatically bound to any assets passed to this Definitions instance using with_resources. Assets passed to Definitions with resources already bound using with_resources will override this dictionary.

  • executor (Optional[Union[ExecutorDefinition, Executor]]) – Default executor for jobs. Individual jobs can override this and define their own executors by setting the executor on @job or define_asset_job explicitly. This executor will also be used for materializing assets directly outside of the context of jobs. If an Executor is passed, it is coerced into an ExecutorDefinition.

  • loggers (Optional[Mapping[str, LoggerDefinition]]) – Default loggers for jobs. Individual jobs can define their own loggers by setting them explicitly.

  • metadata (Optional[MetadataMapping]) – Arbitrary metadata for the Definitions. Not displayed in the UI but accessible on the Definitions instance at runtime.

Example usage:

defs = Definitions(
    assets=[asset_one, asset_two],
    schedules=[a_schedule],
    sensors=[a_sensor],
    jobs=[a_job],
    resources={
        "a_resource": some_resource,
    },
    asset_checks=[asset_one_check_one]
)

Dagster separates user-defined code from system tools such as the web server and the daemon. Rather than loading code directly into the process, a tool such as the web server interacts with user-defined code over a serialization boundary.

These tools must be able to locate and load this code when they start. Via CLI arguments or config, they specify a Python module to inspect.

A Python module is loadable by Dagster tools if there is a top-level variable that is an instance of Definitions.
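
For example, a module like the following is loadable (the module and asset names here are illustrative):

# my_project/definitions.py
from dagster import Definitions, asset


@asset
def my_asset():
    return 1


# Top-level variable holding the Definitions instance; Dagster tools discover and load it.
defs = Definitions(assets=[my_asset])

A tool such as the web server can then be pointed at the module, for example with dagster dev -m my_project.definitions.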

get_all_asset_specs()[source]

experimental: This API may break in future versions, even between dot releases.

Returns an AssetSpec object for every asset contained inside the Definitions object.
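
A minimal sketch of using the returned specs, assuming defs is a Definitions instance like the one above:

# Enumerate the keys of every asset known to this Definitions object.
all_specs = defs.get_all_asset_specs()
all_keys = [spec.key for spec in all_specs]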

get_asset_value_loader(instance=None)[source]

Returns an object that can load the contents of assets as Python objects.

Invokes load_input on the IOManager associated with the assets. Avoids spinning up resources separately for each asset.

Usage:

with defs.get_asset_value_loader() as loader:
    asset1 = loader.load_asset_value("asset1")
    asset2 = loader.load_asset_value("asset2")

get_job_def(name)[source]

Get a job definition by name. If you passed in an UnresolvedAssetJobDefinition (the return value of define_asset_job()), it will be resolved to a JobDefinition when returned from this function, with all resource dependencies fully resolved.
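
For example, assuming a job named "a_job" was passed to the Definitions object above:

a_job_def = defs.get_job_def("a_job")
# The returned JobDefinition is fully resolved and can be executed directly.
result = a_job_def.execute_in_process()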

get_schedule_def(name)[source]

Get a ScheduleDefinition by name. If your passed-in schedule had resource dependencies, or the job targeted by the schedule had resource dependencies, those resource dependencies will be fully resolved on the returned object.

get_sensor_def(name)[source]

Get a SensorDefinition by name. If your passed-in sensor had resource dependencies, or the job targeted by the sensor had resource dependencies, those resource dependencies will be fully resolved on the returned object.
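
A short sketch covering both get_schedule_def and get_sensor_def, assuming the schedule and sensor names from the example above:

a_schedule_def = defs.get_schedule_def("a_schedule")
a_sensor_def = defs.get_sensor_def("a_sensor")

# Both objects come back with their resource dependencies fully resolved.
assert a_schedule_def.name == "a_schedule"
assert a_sensor_def.name == "a_sensor"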

load_asset_value(asset_key, *, python_type=None, instance=None, partition_key=None, metadata=None)[source]

Load the contents of an asset as a Python object.

Invokes load_input on the IOManager associated with the asset.

If you want to load the values of multiple assets, it’s more efficient to use get_asset_value_loader(), which avoids spinning up resources separately for each asset.

Parameters:
  • asset_key (Union[AssetKey, Sequence[str], str]) – The key of the asset to load.

  • python_type (Optional[Type]) – The Python type to load the asset as. This is what will be returned inside load_input by context.dagster_type.typing_type.

  • partition_key (Optional[str]) – The partition of the asset to load.

  • metadata (Optional[Dict[str, Any]]) – Input metadata to pass to the IOManager (equivalent to setting the metadata argument in In or AssetIn).

Returns:

The contents of an asset as a Python object.
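
A minimal sketch, assuming an asset with key "asset1" and an IOManager attached (the key and partition key are illustrative):

value = defs.load_asset_value("asset1")

# For a partitioned asset, pass the partition to load.
value_for_partition = defs.load_asset_value("asset1", partition_key="2024-01-01")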

static merge(*def_sets)[source]

experimental: This API may break in future versions, even between dot releases.

Merges multiple Definitions objects into a single Definitions object.

The returned Definitions object has the union of all the definitions in the input Definitions objects.

Raises an error if the Definitions objects to be merged contain conflicting values for the same resource key or logger key, or if they have different executors defined.

Examples

import submodule1
import submodule2

defs = Definitions.merge(submodule1.defs, submodule2.defs)

Returns:

The merged definitions.

Return type:

Definitions

static validate_loadable(defs)[source]

Validates that the enclosed definitions will be loadable by Dagster:

  • No assets have conflicting keys.

  • No jobs, sensors, or schedules have conflicting names.

  • All asset jobs can be resolved.

  • All resource requirements are satisfied.

Meant to be used in unit tests.

Raises an error if any of the above are not true.
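
A typical unit test, assuming defs is importable from your project (the import path is illustrative):

from dagster import Definitions

from my_project.definitions import defs


def test_defs_can_load():
    # Fails the test if asset keys or job/sensor/schedule names conflict,
    # asset jobs cannot be resolved, or resource requirements are unsatisfied.
    Definitions.validate_loadable(defs)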

dagster.create_repository_using_definitions_args(name, assets=None, schedules=None, sensors=None, jobs=None, resources=None, executor=None, loggers=None, asset_checks=None)[source]

experimental: This API may break in future versions, even between dot releases.

Create a named repository using the same arguments as Definitions. In older versions of Dagster, repositories were the mechanism for organizing assets, schedules, sensors, and jobs. There could be many repositories per code location. This was a complicated ontology but gave users a way to organize code locations that contained large numbers of heterogeneous definitions.

As a stopgap for those who want to 1) use the new Definitions API but 2) still keep multiple logical groups of assets in the same code location, we have introduced this function.

Example usage:

named_repo = create_repository_using_definitions_args(
    name="a_repo",
    assets=[asset_one, asset_two],
    schedules=[a_schedule],
    sensors=[a_sensor],
    jobs=[a_job],
    resources={
        "a_resource": some_resource,
    }
)