ads.opctl.backend package#

Submodules#

ads.opctl.backend.ads_dataflow module#

class ads.opctl.backend.ads_dataflow.DataFlowBackend(config: Dict)[source]#

Bases: Backend

Initialize a DataFlowBackend object given a config dictionary.

Parameters:

config (dict) – dictionary of configurations

apply() Dict[source]#

Create DataFlow and DataFlow Run from YAML.

cancel()[source]#

Cancel DataFlow Run from OCID.

delete()[source]#

Delete DataFlow or DataFlow Run from OCID.

init(uri: str | None = None, overwrite: bool = False, runtime_type: str | None = None, **kwargs: Dict) str | None[source]#

Generates a starter YAML specification for a Data Flow Application.

Parameters:
  • overwrite ((bool, optional). Defaults to False.) – Overwrites the resulting specification YAML if it exists.

  • uri ((str, optional). Defaults to None.) – The filename to save the resulting specification template YAML.

  • runtime_type ((str, optional). Defaults to None.) – The resource runtime type.

  • **kwargs (Dict) – The optional arguments.

Returns:

The YAML specification for the given resource if uri was not provided. None otherwise.

Return type:

Union[str, None]

run() Dict[source]#

Create DataFlow and DataFlow Run from OCID or CLI parameters.

watch()[source]#

Watch DataFlow Run from OCID.
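
A minimal usage sketch. The config dictionary shown here is hypothetical; in practice it is assembled by ads opctl from the execution options (auth, OCI config, OCIDs), so the keys below are assumptions for illustration only.

    from ads.opctl.backend.ads_dataflow import DataFlowBackend

    # Hypothetical config; real keys come from the ads opctl execution section.
    config = {"execution": {"oci_config": "~/.oci/config", "oci_profile": "DEFAULT"}}

    backend = DataFlowBackend(config)

    # Write a starter Data Flow Application specification to a file.
    # init() returns None here because uri is provided.
    backend.init(uri="dataflow_starter.yaml", overwrite=True)

    # Once the YAML is filled in and referenced by the config:
    # backend.apply()   # create the DataFlow Application and a DataFlow Run
    # backend.watch()   # stream logs of the DataFlow Run referenced by OCID
    # backend.cancel()  # cancel the DataFlow Run referenced by OCID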

class ads.opctl.backend.ads_dataflow.DataFlowOperatorBackend(config: Dict, operator_info: OperatorInfo | None = None)[source]#

Bases: DataFlowBackend

Backend class to run an operator on a Data Flow Application.

runtime_config#

The runtime config for the operator.

Type:

(Dict)

operator_config#

The operator specification config.

Type:

(Dict)

operator_type#

The type of the operator.

Type:

str

operator_version#

The version of the operator.

Type:

str

job#

The Data Science Job.

Type:

Job

Instantiates the operator backend.

Parameters:
  • config ((Dict)) – The configuration file containing the operator’s specification details and the execution section.

  • operator_info ((OperatorInfo, optional)) – The operator’s detailed information extracted from the operator.__init__ file. If not provided, it will be extracted based on the operator type.

run(**kwargs: Dict) Dict | None[source]#

Runs the operator on the Data Flow service.
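
A sketch of running an operator through this backend, assuming a config dictionary in the operator format; the kind/type/spec/execution keys and their values below are illustrative assumptions.

    from ads.opctl.backend.ads_dataflow import DataFlowOperatorBackend

    # Hypothetical operator config carrying the specification and execution sections.
    config = {"kind": "operator", "type": "forecast", "spec": {}, "execution": {}}

    backend = DataFlowOperatorBackend(config)

    # Submits the operator to Data Flow; operator_info is resolved from the
    # operator type when it is not passed to the constructor.
    # run_info = backend.run()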

class ads.opctl.backend.ads_dataflow.DataFlowRuntimeFactory[source]#

Bases: RuntimeFactory

Data Flow runtime factory.

ads.opctl.backend.ads_ml_job module#

class ads.opctl.backend.ads_ml_job.JobRuntimeFactory[source]#

Bases: RuntimeFactory

Job runtime factory.

class ads.opctl.backend.ads_ml_job.MLJobBackend(config: Dict)[source]#

Bases: Backend

Initialize an MLJobBackend object given a config dictionary.

Parameters:

config (dict) – dictionary of configurations

apply() Dict[source]#

Create Job and Job Run from YAML.

cancel()[source]#

Cancel Job Run from OCID.

delete()[source]#

Delete Job or Job Run from OCID.

init(uri: str | None = None, overwrite: bool = False, runtime_type: str | None = None, **kwargs: Dict) str | None[source]#

Generates a starter YAML specification for a Data Science Job.

Parameters:
  • overwrite ((bool, optional). Defaults to False.) – Overwrites the resulting specification YAML if it exists.

  • uri ((str, optional). Defaults to None.) – The filename to save the resulting specification template YAML.

  • runtime_type ((str, optional). Defaults to None.) – The resource runtime type.

  • **kwargs (Dict) – The optional arguments.

Returns:

The YAML specification for the given resource if uri was not provided. None otherwise.

Return type:

Union[str, None]

run() Dict[source]#

Create Job and Job Run from OCID or CLI parameters.

watch()[source]#

Watch Job Run from OCID.
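
A minimal lifecycle sketch for this backend. The config keys and the runtime_type value are illustrative assumptions; only the methods documented above are used.

    from ads.opctl.backend.ads_ml_job import MLJobBackend

    # Hypothetical config; normally built by ads opctl from the execution options.
    config = {"execution": {"backend": "job"}}

    backend = MLJobBackend(config)

    # Write a starter Data Science Job specification (returns None because uri is given).
    backend.init(uri="job_starter.yaml", overwrite=True, runtime_type="python")

    # Once the config points at a populated YAML or an existing OCID:
    # backend.apply()   # create the Job and a Job Run from YAML
    # backend.watch()   # stream the Job Run logs
    # backend.cancel()  # cancel the Job Run
    # backend.delete()  # delete the Job or Job Run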

class ads.opctl.backend.ads_ml_job.MLJobDistributedBackend(config: Dict)[source]#

Bases: MLJobBackend

Initialize an MLJobDistributedBackend object given a config dictionary.

Parameters:

config (dict) – dictionary of configurations

DIAGNOSTIC_COMMAND = 'python -m ads.opctl.diagnostics -t distributed'#
static generate_worker_name(worker_jobrun_conf, i)[source]#
prepare_job_config(cluster_info)[source]#
run(cluster_info, dry_run=False) None[source]#
  • Creates a Job Definition and starts the main and worker job runs from that Job Definition.

  • The Job Definition will contain all the environment variables defined at the cluster/spec/config level, the environment variables defined by the user at the runtime/spec/env level, and the OCI__ variables derived from the YAML specification.

  • The Job Run will have the overrides provided by the user under the cluster/spec/{main|worker}/config section, plus `OCI__MODE`={MASTER|WORKER} depending on the run type (see the sketch below).
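
A sketch of how those pieces map onto the cluster specification. Only the paths quoted in the bullets above come from the documentation; every other key and value is an illustrative assumption, and the plain dictionary stands in for whatever cluster_info object run() actually receives.

    # Illustrative layering of environment variables (keys/values are assumptions):
    cluster_info = {
        "spec": {
            "cluster": {
                "spec": {
                    "config": {"env": [{"name": "SHARED_VAR", "value": "1"}]},               # -> Job Definition
                    "main": {"config": {"env": [{"name": "MAIN_ONLY", "value": "x"}]}},      # -> main Job Run override
                    "worker": {"config": {"env": [{"name": "WORKER_ONLY", "value": "y"}]}},  # -> worker Job Run override
                }
            },
            "runtime": {"spec": {"env": [{"name": "USER_VAR", "value": "abc"}]}},            # -> Job Definition
        }
    }
    # OCI__MODE=MASTER or OCI__MODE=WORKER is injected per Job Run by the backend.
    # backend.run(cluster_info, dry_run=True)  # preview what would be submitted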

run_diagnostics(cluster_info, dry_run=False, **kwargs)[source]#

Run the diagnostics check appropriate for the backend.

class ads.opctl.backend.ads_ml_job.MLJobOperatorBackend(config: Dict, operator_info: OperatorInfo | None = None)[source]#

Bases: MLJobBackend

Backend class to run an operator on Data Science Jobs. Two scenarios are currently supported:

  • Running operator within container runtime.

  • Running operator within python runtime.

runtime_config#

The runtime config for the operator.

Type:

(Dict)

operator_config#

The operator specification config.

Type:

(Dict)

operator_type#

The type of the operator.

Type:

str

operator_version#

The version of the operator.

Type:

str

operator_info#

The detailed information about the operator.

Type:

OperatorInfo

job#

The Data Science Job.

Type:

Job

Instantiates the operator backend.

Parameters:
  • config ((Dict)) – The configuration file containing the operator’s specification details and the execution section.

  • operator_info ((OperatorInfo, optional)) – The operator’s detailed information extracted from the operator.__init__ file. If not provided, it will be extracted based on the operator type.

run(**kwargs: Dict) Dict | None[source]#

Runs the operator on the Data Science Jobs service.

ads.opctl.backend.ads_ml_pipeline module#

class ads.opctl.backend.ads_ml_pipeline.PipelineBackend(config: Dict)[source]#

Bases: Backend

Initialize a PipelineBackend object given a config dictionary.

Parameters:

config (dict) – dictionary of configurations

apply() Dict[source]#

Create Pipeline and Pipeline Run from YAML.

cancel() None[source]#

Cancel Pipeline Run from OCID.

delete() None[source]#

Delete Pipeline or Pipeline Run from OCID.

init(uri: str | None = None, overwrite: bool = False, runtime_type: str | None = None, **kwargs: Dict) str | None[source]#

Generates a starter YAML specification for an MLPipeline.

Parameters:
  • overwrite ((bool, optional). Defaults to False.) – Overwrites the resulting specification YAML if it exists.

  • uri ((str, optional). Defaults to None.) – The filename to save the resulting specification template YAML.

  • runtime_type ((str, optional). Defaults to None.) – The resource runtime type.

  • **kwargs (Dict) – The optional arguments.

Returns:

The YAML specification for the given resource if uri was not provided. None otherwise.

Return type:

Union[str, None]

run() Dict[source]#

Create Pipeline and Pipeline Run from OCID.

watch() None[source]#

Watch Pipeline Run from OCID.
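
A sketch of operating on an existing Pipeline Run by OCID. The placement of the OCID inside the config is an assumption, and the placeholder OCID is not a real identifier.

    from ads.opctl.backend.ads_ml_pipeline import PipelineBackend

    # Hypothetical config pointing at an existing Pipeline Run.
    config = {"execution": {"ocid": "ocid1.datasciencepipelinerun.oc1..<unique_id>"}}

    backend = PipelineBackend(config)
    # backend.run()     # create a Pipeline and Pipeline Run from the OCID
    # backend.watch()   # stream the Pipeline Run logs
    # backend.cancel()  # cancel the Pipeline Run
    # backend.delete()  # delete the Pipeline or Pipeline Run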

ads.opctl.backend.base module#

class ads.opctl.backend.base.Backend(config: Dict)[source]#

Bases: object

Interface for a backend.

activate() None[source]#

Activate a remote service.

Return type:

None

apply() Dict[source]#

Initiate Data Science service from YAML.

Return type:

Dict

cancel() None[source]#

Cancel a remote run.

Return type:

None

deactivate() None[source]#

Deactivate a remote service.

Return type:

None

delete() None[source]#

Delete a remote run.

Return type:

None

init(uri: str | None = None, overwrite: bool = False, **kwargs: Dict) str | None[source]#

Generates a YAML specification for the resource.

Parameters:
  • overwrite ((bool, optional). Defaults to False.) – Overwrites the resulting specification YAML if it exists.

  • uri ((str, optional)) – The filename to save the resulting specification template YAML.

  • **kwargs (Dict) –

    The optional arguments.

    runtime_type: str

    The resource runtime type.

Returns:

The YAML specification for the given resource if uri was not provided. None otherwise.

Return type:

Union[str, None]

predict() None[source]#

Run model predict.

Return type:

None

abstract run() Dict[source]#

Initiate a run.

Return type:

Dict

run_diagnostics()[source]#

Run the diagnostics check appropriate for the backend.

watch() None[source]#

Stream logs from a remote run.

Return type:

None
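
Backend is the interface that the concrete backends above implement; only run() is abstract. A minimal sketch of a custom subclass (not part of the library):

    from typing import Dict

    from ads.opctl.backend.base import Backend

    class EchoBackend(Backend):
        """Toy backend used only to illustrate the interface."""

        def run(self) -> Dict:
            # A real backend would submit work to a service here and return
            # details of the created resources.
            return {"status": "submitted"}

    # Construction takes the same config dictionary as the other backends:
    # EchoBackend(config).run()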

class ads.opctl.backend.base.RuntimeFactory[source]#

Bases: object

Base factory for runtime.

classmethod get_runtime(key: str, *args, **kwargs)[source]#
exception ads.opctl.backend.base.UnsupportedRuntime(runtime_type: str)[source]#

Bases: Exception
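
A plausible way the factory and the exception fit together, assuming get_runtime looks up a runtime class by its type key and raises UnsupportedRuntime for unknown keys (an assumption; the behaviour is not documented here):

    from ads.opctl.backend.ads_ml_job import JobRuntimeFactory
    from ads.opctl.backend.base import UnsupportedRuntime

    try:
        # The key value is an illustrative assumption.
        runtime_cls = JobRuntimeFactory.get_runtime(key="python")
    except UnsupportedRuntime:
        # Assumed to be raised for runtime types the factory does not support.
        runtime_cls = None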

ads.opctl.backend.local module#

exception ads.opctl.backend.local.CondaPackNotFound[source]#

Bases: Exception

class ads.opctl.backend.local.LocalBackend(config: Dict)[source]#

Bases: Backend

Initialize a LocalBackend object with the given config.

Parameters:

config (dict) – dictionary of configurations

init_vscode_container() None[source]#

Create a .devcontainer.json file for development with VSCode.

Return type:

None

run()[source]#

Initiate a run.

Return type:

Dict
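
A minimal sketch, assuming the config dictionary describes the local execution environment (the keys shown are assumptions):

    from ads.opctl.backend.local import LocalBackend

    # Hypothetical config for a local run.
    config = {"execution": {"backend": "local"}}

    backend = LocalBackend(config)

    # Generate a .devcontainer.json for development with VSCode.
    # backend.init_vscode_container()

    # Execute the configured workload locally.
    # backend.run()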

class ads.opctl.backend.local.LocalBackendDistributed(config: Dict)[source]#

Bases: LocalBackend

Initialize a LocalBackendDistributed object with the given config. This backend serves local single-node (Docker) testing for distributed training.

Parameters:

config (dict) – dictionary of configurations

run()[source]#

Initiate a run.

Return type:

Dict

class ads.opctl.backend.local.LocalModelDeploymentBackend(config: Dict)[source]#

Bases: LocalBackend

Initialize a LocalModelDeploymentBackend object with the given config.

Parameters:

config (dict) – dictionary of configurations

predict() None[source]#

Conducts a local verification.

Returns:

Nothing.

Return type:

None
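
A sketch of a local verification call. The config is expected to identify the model and the payload to score; the exact keys below are assumptions.

    from ads.opctl.backend.local import LocalModelDeploymentBackend

    # Hypothetical config; "ocid" and "payload" are illustrative key names.
    config = {"execution": {"backend": "local", "ocid": "<model_ocid>", "payload": "[[1, 2, 3]]"}}

    # predict() runs the model locally as a verification step and returns None.
    # LocalModelDeploymentBackend(config).predict()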

class ads.opctl.backend.local.LocalOperatorBackend(config: Dict | None, operator_info: OperatorInfo | None = None)[source]#

Bases: Backend

The local operator backend to execute an operator in the local environment. Two scenarios are currently supported:

  • Running operator within local conda environment.

  • Running operator within local container.

runtime_config#

The runtime config for the operator.

Type:

(Dict)

operator_config#

The operator specification config.

Type:

(Dict)

operator_type#

The type of the operator.

Type:

str

operator_info#

The detailed information about the operator.

Type:

OperatorInfo

Instantiates the operator backend.

Parameters:
  • config ((Dict)) – The configuration file containing the operator’s specification details and the execution section.

  • operator_info ((OperatorInfo, optional)) – The operator’s detailed information extracted from the operator.__init__ file. If not provided, it will be extracted based on the operator type.

init(uri: str | None = None, overwrite: bool = False, runtime_type: str | None = None, **kwargs: Dict) str | None[source]#

Generates a starter YAML specification for the operator local runtime.

Parameters:
  • overwrite ((bool, optional). Defaults to False.) – Overwrites the resulting specification YAML if it exists.

  • uri ((str, optional). Defaults to None.) – The filename to save the resulting specification template YAML.

  • runtime_type ((str, optional). Defaults to None.) – The resource runtime type.

  • **kwargs (Dict) – The optional arguments.

Returns:

The YAML specification for the given resource if uri was not provided. None otherwise.

Return type:

Union[str, None]

run(**kwargs: Dict) Dict[source]#

Runs the operator.
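
A sketch of preparing and running an operator locally; the config keys and the runtime_type value are illustrative assumptions.

    from ads.opctl.backend.local import LocalOperatorBackend

    # Hypothetical operator config carrying the specification and execution sections.
    config = {"kind": "operator", "type": "forecast", "spec": {}, "execution": {}}

    backend = LocalOperatorBackend(config)

    # Write a starter specification for the operator's local runtime.
    backend.init(uri="local_backend.yaml", overwrite=True, runtime_type="container")

    # Execute the operator in a local conda environment or container,
    # depending on the runtime described in the config.
    # run_info = backend.run()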

class ads.opctl.backend.local.LocalPipelineBackend(config: Dict)[source]#

Bases: Backend

Initialize a LocalPipelineBackend object with the given config.

Parameters:

config (dict) – dictionary of configurations

DEFAULT_PARALLEL_CONTAINER_MAXIMUM = 4#
DEFAULT_STATUS_POLL_INTERVAL_SECONDS = 5#
LOG_PREFIX = 'Local Pipeline:'#
run() None[source]#

Initiate a run.

Return type:

None

Module contents#