OrderedJobArgs

data class OrderedJobArgs(
    val hadoopJob: Output<HadoopJobArgs>? = null,
    val hiveJob: Output<HiveJobArgs>? = null,
    val labels: Output<Map<String, String>>? = null,
    val pigJob: Output<PigJobArgs>? = null,
    val prerequisiteStepIds: Output<List<String>>? = null,
    val prestoJob: Output<PrestoJobArgs>? = null,
    val pysparkJob: Output<PySparkJobArgs>? = null,
    val scheduling: Output<JobSchedulingArgs>? = null,
    val sparkJob: Output<SparkJobArgs>? = null,
    val sparkRJob: Output<SparkRJobArgs>? = null,
    val sparkSqlJob: Output<SparkSqlJobArgs>? = null,
    val stepId: Output<String>,
    val trinoJob: Output<TrinoJobArgs>? = null
) : ConvertibleToJava<OrderedJobArgs>

A job executed by the workflow.

Constructors

fun OrderedJobArgs(
    hadoopJob: Output<HadoopJobArgs>? = null,
    hiveJob: Output<HiveJobArgs>? = null,
    labels: Output<Map<String, String>>? = null,
    pigJob: Output<PigJobArgs>? = null,
    prerequisiteStepIds: Output<List<String>>? = null,
    prestoJob: Output<PrestoJobArgs>? = null,
    pysparkJob: Output<PySparkJobArgs>? = null,
    scheduling: Output<JobSchedulingArgs>? = null,
    sparkJob: Output<SparkJobArgs>? = null,
    sparkRJob: Output<SparkRJobArgs>? = null,
    sparkSqlJob: Output<SparkSqlJobArgs>? = null,
    stepId: Output<String>,
    trinoJob: Output<TrinoJobArgs>? = null
)

Functions

open override fun toJava(): OrderedJobArgs

Properties

val hadoopJob: Output<HadoopJobArgs>? = null

Optional. Job is a Hadoop job.

val hiveJob: Output<HiveJobArgs>? = null

Optional. Job is a Hive job.

val labels: Output<Map<String, String>>? = null

Optional. The labels to associate with this job. Label keys must be between 1 and 63 characters long, and must conform to the following regular expression: \p{Ll}\p{Lo}{0,62}. Label values must be between 1 and 63 characters long, and must conform to the following regular expression: \p{Ll}\p{Lo}\p{N}_-{0,63}. No more than 32 labels can be associated with a given job.

val pigJob: Output<PigJobArgs>? = null

Optional. Job is a Pig job.

val prerequisiteStepIds: Output<List<String>>? = null

Optional. The list of prerequisite job step_ids. If not specified, the job will start at the beginning of the workflow.
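The prerequisite list effectively defines a dependency graph over the workflow's steps: a step runs only after all of its prerequisites have completed. The sketch below illustrates those ordering semantics only; the `Step` data class and step names are hypothetical and are not part of the Pulumi API.

```kotlin
// Illustrative model of step ordering, not Pulumi API code.
data class Step(val stepId: String, val prerequisiteStepIds: List<String> = emptyList())

// Kahn-style ordering: steps whose prerequisites have all completed run next.
fun executionOrder(steps: List<Step>): List<String> {
    val remaining = steps.associateBy { it.stepId }.toMutableMap()
    val order = mutableListOf<String>()
    while (remaining.isNotEmpty()) {
        val ready = remaining.values.filter { s -> s.prerequisiteStepIds.all { it in order } }
        require(ready.isNotEmpty()) { "cycle or missing prerequisite step_id" }
        ready.forEach { order += it.stepId; remaining -= it.stepId }
    }
    return order
}

fun main() {
    val steps = listOf(
        Step("load-data"),                                 // no prerequisites: starts immediately
        Step("transform", listOf("load-data")),
        Step("report", listOf("transform", "load-data")),
    )
    println(executionOrder(steps))  // [load-data, transform, report]
}
```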

val prestoJob: Output<PrestoJobArgs>? = null

Optional. Job is a Presto job.

val pysparkJob: Output<PySparkJobArgs>? = null

Optional. Job is a PySpark job.

val scheduling: Output<JobSchedulingArgs>? = null

Optional. Job scheduling configuration.

val sparkJob: Output<SparkJobArgs>? = null

Optional. Job is a Spark job.

val sparkRJob: Output<SparkRJobArgs>? = null

Optional. Job is a SparkR job.

val sparkSqlJob: Output<SparkSqlJobArgs>? = null

Optional. Job is a SparkSql job.

val stepId: Output<String>

The step id. The id must be unique among all jobs within the template. The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.
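The stated constraints can be expressed as a single regular expression: an alphanumeric character at each end, with letters, numbers, underscores, and hyphens in between, 3 to 50 characters total. A minimal sketch (the regex is derived from the documented rules; the helper name is my own, not part of the SDK):

```kotlin
// Hypothetical validator for the documented step id rules:
// only [A-Za-z0-9_-], no leading/trailing '_' or '-', length 3..50.
val STEP_ID = Regex("^[A-Za-z0-9][A-Za-z0-9_-]{1,48}[A-Za-z0-9]$")

fun isValidStepId(id: String): Boolean = STEP_ID.matches(id)

fun main() {
    println(isValidStepId("load-data"))  // true
    println(isValidStepId("_hidden"))    // false: begins with an underscore
    println(isValidStepId("ab"))         // false: shorter than 3 characters
}
```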

val trinoJob: Output<TrinoJobArgs>? = null

Optional. Job is a Trino job.