WorkflowTemplateJobArgs

data class WorkflowTemplateJobArgs(val hadoopJob: Output<WorkflowTemplateJobHadoopJobArgs>? = null, val hiveJob: Output<WorkflowTemplateJobHiveJobArgs>? = null, val labels: Output<Map<String, String>>? = null, val pigJob: Output<WorkflowTemplateJobPigJobArgs>? = null, val prerequisiteStepIds: Output<List<String>>? = null, val prestoJob: Output<WorkflowTemplateJobPrestoJobArgs>? = null, val pysparkJob: Output<WorkflowTemplateJobPysparkJobArgs>? = null, val scheduling: Output<WorkflowTemplateJobSchedulingArgs>? = null, val sparkJob: Output<WorkflowTemplateJobSparkJobArgs>? = null, val sparkRJob: Output<WorkflowTemplateJobSparkRJobArgs>? = null, val sparkSqlJob: Output<WorkflowTemplateJobSparkSqlJobArgs>? = null, val stepId: Output<String>) : ConvertibleToJava<WorkflowTemplateJobArgs>

Constructors

constructor(hadoopJob: Output<WorkflowTemplateJobHadoopJobArgs>? = null, hiveJob: Output<WorkflowTemplateJobHiveJobArgs>? = null, labels: Output<Map<String, String>>? = null, pigJob: Output<WorkflowTemplateJobPigJobArgs>? = null, prerequisiteStepIds: Output<List<String>>? = null, prestoJob: Output<WorkflowTemplateJobPrestoJobArgs>? = null, pysparkJob: Output<WorkflowTemplateJobPysparkJobArgs>? = null, scheduling: Output<WorkflowTemplateJobSchedulingArgs>? = null, sparkJob: Output<WorkflowTemplateJobSparkJobArgs>? = null, sparkRJob: Output<WorkflowTemplateJobSparkRJobArgs>? = null, sparkSqlJob: Output<WorkflowTemplateJobSparkSqlJobArgs>? = null, stepId: Output<String>)
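
For orientation, a minimal sketch of constructing one job step with this class in a Pulumi Kotlin program. The step id, label, and prerequisite step id are placeholder values, and the surrounding Pulumi GCP Kotlin SDK imports (the WorkflowTemplateJob*Args types) are assumed to be available; Output.of lifts plain values into Output:

```kotlin
import com.pulumi.core.Output

// Sketch only: assumes the Pulumi GCP Kotlin SDK types WorkflowTemplateJobArgs
// and WorkflowTemplateJobSparkSqlJobArgs are imported from the dataproc package.
val queryStep = WorkflowTemplateJobArgs(
    stepId = Output.of("run-report"),                     // required; unique within the template
    sparkSqlJob = Output.of(
        WorkflowTemplateJobSparkSqlJobArgs()              // query configuration elided
    ),
    prerequisiteStepIds = Output.of(listOf("load-data")), // runs after a hypothetical "load-data" step
    labels = Output.of(mapOf("team" to "analytics")),
)
```

Only one of the job-type fields (hadoopJob, hiveJob, pigJob, prestoJob, pysparkJob, sparkJob, sparkRJob, sparkSqlJob) should be set per step.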

Properties

val hadoopJob: Output<WorkflowTemplateJobHadoopJobArgs>? = null

Job is a Hadoop job.

val hiveJob: Output<WorkflowTemplateJobHiveJobArgs>? = null

Job is a Hive job.

val labels: Output<Map<String, String>>? = null

The labels to associate with this job. Label keys must be between 1 and 63 characters long, and must conform to the following regular expression: \p{Ll}\p{Lo}{0,62}. Label values must be between 1 and 63 characters long, and must conform to the following regular expression: \p{Ll}\p{Lo}\p{N}_-{0,63}. No more than 32 labels can be associated with a given job.

val pigJob: Output<WorkflowTemplateJobPigJobArgs>? = null

Job is a Pig job.

val prerequisiteStepIds: Output<List<String>>? = null

The optional list of prerequisite job step_ids. If not specified, the job will start at the beginning of the workflow.

val prestoJob: Output<WorkflowTemplateJobPrestoJobArgs>? = null

Job is a Presto job.

val pysparkJob: Output<WorkflowTemplateJobPysparkJobArgs>? = null

Job is a PySpark job.

val scheduling: Output<WorkflowTemplateJobSchedulingArgs>? = null

Job scheduling configuration.

val sparkJob: Output<WorkflowTemplateJobSparkJobArgs>? = null

Job is a Spark job.

val sparkRJob: Output<WorkflowTemplateJobSparkRJobArgs>? = null

Job is a SparkR job.

val sparkSqlJob: Output<WorkflowTemplateJobSparkSqlJobArgs>? = null

Job is a SparkSql job.

val stepId: Output<String>

Required. The step id. The id must be unique among all jobs within the template. The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-). It cannot begin or end with an underscore or hyphen, and must consist of between 3 and 50 characters.
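
The stepId constraints above can be checked up front with a single regular expression. This helper is purely illustrative and not part of the SDK:

```kotlin
// Hypothetical helper (not part of the Pulumi SDK): validates a candidate
// step id against the documented constraints for WorkflowTemplateJobArgs.stepId:
// letters, digits, underscores, and hyphens only; must not begin or end with
// an underscore or hyphen; 3 to 50 characters in total.
private val STEP_ID_REGEX = Regex("^[A-Za-z0-9][A-Za-z0-9_-]{1,48}[A-Za-z0-9]$")

fun isValidStepId(id: String): Boolean = STEP_ID_REGEX.matches(id)

fun main() {
    check(isValidStepId("teragen"))       // plain letters, length >= 3
    check(isValidStepId("spark-etl_01"))  // hyphens/underscores allowed in the middle
    check(!isValidStepId("ab"))           // too short (< 3 characters)
    check(!isValidStepId("-starts-bad"))  // cannot begin with a hyphen
    check(!isValidStepId("ends_bad_"))    // cannot end with an underscore
    check(!isValidStepId("a".repeat(51))) // too long (> 50 characters)
    println("all step id checks passed")
}
```

The middle group matches 1 to 48 characters, so the first and last character classes bring the total to the documented 3-50 range.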

Functions

open override fun toJava(): WorkflowTemplateJobArgs