SparkJobArgs

data class SparkJobArgs(val archives: Output<List<String>>? = null, val args: Output<String>? = null, val codeId: Output<String>, val componentId: Output<String>? = null, val computeId: Output<String>? = null, val conf: Output<Map<String, String>>? = null, val description: Output<String>? = null, val displayName: Output<String>? = null, val entry: Output<Either<SparkJobPythonEntryArgs, SparkJobScalaEntryArgs>>, val environmentId: Output<String>? = null, val environmentVariables: Output<Map<String, String>>? = null, val experimentName: Output<String>? = null, val files: Output<List<String>>? = null, val identity: Output<Any>? = null, val inputs: Output<Map<String, Any>>? = null, val isArchived: Output<Boolean>? = null, val jars: Output<List<String>>? = null, val jobType: Output<String>, val notificationSetting: Output<NotificationSettingArgs>? = null, val outputs: Output<Map<String, Any>>? = null, val properties: Output<Map<String, String>>? = null, val pyFiles: Output<List<String>>? = null, val queueSettings: Output<QueueSettingsArgs>? = null, val resources: Output<SparkResourceConfigurationArgs>? = null, val services: Output<Map<String, JobServiceArgs>>? = null, val tags: Output<Map<String, String>>? = null) : ConvertibleToJava<SparkJobArgs>

Spark job definition.

Constructors

constructor(archives: Output<List<String>>? = null, args: Output<String>? = null, codeId: Output<String>, componentId: Output<String>? = null, computeId: Output<String>? = null, conf: Output<Map<String, String>>? = null, description: Output<String>? = null, displayName: Output<String>? = null, entry: Output<Either<SparkJobPythonEntryArgs, SparkJobScalaEntryArgs>>, environmentId: Output<String>? = null, environmentVariables: Output<Map<String, String>>? = null, experimentName: Output<String>? = null, files: Output<List<String>>? = null, identity: Output<Any>? = null, inputs: Output<Map<String, Any>>? = null, isArchived: Output<Boolean>? = null, jars: Output<List<String>>? = null, jobType: Output<String>, notificationSetting: Output<NotificationSettingArgs>? = null, outputs: Output<Map<String, Any>>? = null, properties: Output<Map<String, String>>? = null, pyFiles: Output<List<String>>? = null, queueSettings: Output<QueueSettingsArgs>? = null, resources: Output<SparkResourceConfigurationArgs>? = null, services: Output<Map<String, JobServiceArgs>>? = null, tags: Output<Map<String, String>>? = null)
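
The following minimal sketch constructs a SparkJobArgs with only the required parameters (codeId, entry, jobType) plus a compute target. It is an illustration rather than canonical usage: the import paths, the Either.ofLeft factory, and the SparkJobPythonEntryArgs constructor (a file plus a sparkJobEntryType discriminator, mirroring the underlying REST schema) are assumptions to verify against your SDK version, and the ARM IDs are hypothetical placeholders.

import com.pulumi.core.Either
import com.pulumi.core.Output
// Assumed package layout; adjust to your SDK version.
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobArgs
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobPythonEntryArgs

// Only parameters shown in the constructor above are used; everything else keeps its null default.
val sparkJob = SparkJobArgs(
    codeId = Output.of("<code-asset-arm-id>"),      // hypothetical ARM ID of the code asset
    entry = Output.of(
        Either.ofLeft(
            SparkJobPythonEntryArgs(
                file = Output.of("main.py"),                          // script to run (assumed parameter)
                sparkJobEntryType = Output.of("SparkJobPythonEntry"), // discriminator (assumed parameter)
            )
        )
    ),
    jobType = Output.of("Spark"),                   // discriminator; expected value is 'Spark'
    computeId = Output.of("<spark-compute-arm-id>"), // hypothetical ARM ID of the compute resource
    experimentName = Output.of("spark-demo"),
)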

Properties

val archives: Output<List<String>>? = null

Archive files used in the job.

val args: Output<String>? = null

Arguments for the job.

val codeId: Output<String>

Required. ARM resource ID of the code asset.

val componentId: Output<String>? = null

ARM resource ID of the component resource.

val computeId: Output<String>? = null

ARM resource ID of the compute resource.

val conf: Output<Map<String, String>>? = null

Spark configuration properties.
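
As an illustration, conf is a plain string-to-string map; the keys below are standard Spark settings often used to size the driver and executors, not values required by this class.

import com.pulumi.core.Output

// Standard Spark settings passed through as strings; keys and values are illustrative.
val sparkConf: Output<Map<String, String>> = Output.of(
    mapOf(
        "spark.driver.cores" to "1",
        "spark.driver.memory" to "2g",
        "spark.executor.cores" to "2",
        "spark.executor.memory" to "2g",
        "spark.executor.instances" to "2",
    )
)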

val description: Output<String>? = null

The asset description text.

val displayName: Output<String>? = null

Display name of the job.

val entry: Output<Either<SparkJobPythonEntryArgs, SparkJobScalaEntryArgs>>

Required. The entry to execute on startup of the job.
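
Since entry wraps an Either, the left side selects a Python entry point and the right side a Scala one. The constructor sketch above shows the Python variant; a Scala counterpart might look like the sketch below, assuming SparkJobScalaEntryArgs takes a className and a sparkJobEntryType discriminator as in the underlying REST schema.

import com.pulumi.core.Either
import com.pulumi.core.Output
// Assumed package layout; adjust to your SDK version.
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobPythonEntryArgs
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobScalaEntryArgs

// Scala entry point: the right side of the Either. The explicit val type pins down the
// Python side of the Either even though no Python entry is supplied here.
val scalaEntry: Output<Either<SparkJobPythonEntryArgs, SparkJobScalaEntryArgs>> = Output.of(
    Either.ofRight(
        SparkJobScalaEntryArgs(
            className = Output.of("com.example.SparkApp"),       // main class inside one of the jars (assumed parameter)
            sparkJobEntryType = Output.of("SparkJobScalaEntry"), // discriminator (assumed parameter)
        )
    )
)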

val environmentId: Output<String>? = null

The ARM resource ID of the Environment specification for the job.

val environmentVariables: Output<Map<String, String>>? = null

Environment variables included in the job.

val experimentName: Output<String>? = null

The name of the experiment the job belongs to. If not set, the job is placed in the "Default" experiment.

val files: Output<List<String>>? = null

Files used in the job.

val identity: Output<Any>? = null

Identity configuration. If set, this should be one of AmlToken, ManagedIdentity, UserIdentity or null. Defaults to AmlToken if null.
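
Because identity is exposed here as Any, one option is to pass the raw discriminated shape used by the REST API, as sketched below; the identityType value is an assumption taken from that schema, and a typed identity args class from the SDK, if present in your version, would be the safer choice.

import com.pulumi.core.Output

// AmlToken identity expressed as a plain map, assuming the untyped `Any` input accepts
// JSON-like values that mirror the REST discriminator. Verify the discriminator string
// ("AMLToken") against the API version you target.
val identity: Output<Any> = Output.of(
    mapOf("identityType" to "AMLToken")
)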

val inputs: Output<Map<String, Any>>? = null

Mapping of input data bindings used in the job.

val isArchived: Output<Boolean>? = null

Is the asset archived?

val jars: Output<List<String>>? = null

Jar files used in the job.

val jobType: Output<String>

Enum to determine the type of job. Expected value is 'Spark'.

val notificationSetting: Output<NotificationSettingArgs>? = null

Notification setting for the job.

val outputs: Output<Map<String, Any>>? = null

Mapping of output data bindings used in the job.

val properties: Output<Map<String, String>>? = null

The asset property dictionary.

val pyFiles: Output<List<String>>? = null

Python files used in the job.

val queueSettings: Output<QueueSettingsArgs>? = null

Queue settings for the job.

val resources: Output<SparkResourceConfigurationArgs>? = null

Compute resource configuration for the job.
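
A short sketch of a resources value follows; the instanceType and runtimeVersion parameters of SparkResourceConfigurationArgs are assumed from the underlying REST schema, and the concrete values are illustrative.

import com.pulumi.core.Output
// Assumed package layout; adjust to your SDK version.
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkResourceConfigurationArgs

// Compute sizing for the Spark job; both parameters and values are illustrative.
val sparkResources = Output.of(
    SparkResourceConfigurationArgs(
        instanceType = Output.of("Standard_E4s_v3"),  // illustrative instance type
        runtimeVersion = Output.of("3.3"),            // illustrative Spark runtime version
    )
)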

val services: Output<Map<String, JobServiceArgs>>? = null

List of JobEndpoints. For local jobs, a job endpoint will have an endpoint value of FileStreamObject.

val tags: Output<Map<String, String>>? = null

Tag dictionary. Tags can be added, removed, and updated.

Functions

open override fun toJava(): SparkJobArgs