Job

Import

Dataflow jobs can be imported using the job ID, e.g.

$ pulumi import gcp:dataflow/job:Job example 2022-07-31_06_25_42-11926927532632678660
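
The resource itself can be declared with the Pulumi Kotlin SDK's type-safe builder DSL. The following is a minimal sketch, not a definitive example: the resource name, bucket paths, template, and parameter names are placeholder assumptions.

    import com.pulumi.kotlin.Pulumi
    import com.pulumi.gcp.dataflow.kotlin.job

    fun main() {
        Pulumi.run { ctx ->
            // Launch a templated Dataflow job; all gs:// paths are placeholders.
            val pipeline = job("word-count") {
                args {
                    templateGcsPath("gs://my-bucket/templates/template_file")
                    tempGcsLocation("gs://my-bucket/tmp_dir")
                    parameters(mapOf("inputFile" to "gs://my-bucket/input.txt"))
                }
            }
            // The exported job ID is the value passed to `pulumi import` above.
            ctx.export("jobId", pipeline.jobId)
        }
    }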

Properties

val additionalExperiments: Output<List<String>>?

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
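
As a sketch, assuming the generated args builder accepts a plain Kotlin list, this could be set inside the args { } block of the example under Import:

    additionalExperiments(listOf("enable_stackdriver_agent_metrics"))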

val enableStreamingEngine: Output<Boolean>?

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

val id: Output<String>

val ipConfiguration: Output<String>?

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

val jobId: Output<String>

The unique ID of this job.

val kmsKeyName: Output<String>?

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY
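
For illustration, a value in that format with placeholder segments, set inside the args { } block sketched under Import:

    kmsKeyName("projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key")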

val labels: Output<Map<String, Any>>?

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.
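
A sketch, assuming the generated builder accepts a Kotlin map; the keys and values are placeholders:

    labels(mapOf("environment" to "staging", "team" to "data-eng"))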

val machineType: Output<String>?

The machine type to use for the job.

val maxWorkers: Output<Int>?

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

val name: Output<String>

A unique name for the resource, required by Dataflow.

val network: Output<String>?

The network to which VMs will be assigned. If it is not provided, "default" will be used.

val onDelete: Output<String>?

One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.

val parameters: Output<Map<String, Any>>?

Key/Value pairs to be passed to the Dataflow job (as used in the template).

val project: Output<String>

The project in which the resource belongs. If it is not provided, the provider project is used.

val region: Output<String>?

The region in which the created job should run.

val serviceAccountEmail: Output<String>?

The Service Account email used to create the job.

val skipWaitOnJobTermination: Output<Boolean>?

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

val state: Output<String>

The current state of the resource, selected from the JobState enum.

val subnetwork: Output<String>?

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "https://www.googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".
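
Both accepted forms, sketched with placeholder project, region, and subnetwork names:

    subnetwork("regions/us-central1/subnetworks/my-subnet")
    // Shared VPC requires the complete URL:
    subnetwork("https://www.googleapis.com/compute/v1/projects/host-project/regions/us-central1/subnetworks/my-subnet")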

val tempGcsLocation: Output<String>

A writeable location on GCS for the Dataflow job to dump its temporary data.

val templateGcsPath: Output<String>

The GCS path to the Dataflow job template.

val transformNameMapping: Output<Map<String, Any>>?

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.
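
A sketch of remapping one transform name prefix during an update; both prefixes are hypothetical:

    transformNameMapping(mapOf("oldPrefix" to "newPrefix"))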

val type: Output<String>

The type of this job, selected from the JobType enum.

val urn: Output<String>

val zone: Output<String>?

The zone in which the created job should run. If it is not provided, the provider zone is used.