Spark Job Args
data class SparkJobArgs(
    val archives: Output&lt;List&lt;String&gt;&gt;? = null,
    val args: Output&lt;String&gt;? = null,
    val codeId: Output&lt;String&gt;,
    val componentId: Output&lt;String&gt;? = null,
    val computeId: Output&lt;String&gt;? = null,
    val conf: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    val description: Output&lt;String&gt;? = null,
    val displayName: Output&lt;String&gt;? = null,
    val entry: Output&lt;Either&lt;SparkJobPythonEntryArgs, SparkJobScalaEntryArgs&gt;&gt;,
    val environmentId: Output&lt;String&gt;? = null,
    val environmentVariables: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    val experimentName: Output&lt;String&gt;? = null,
    val files: Output&lt;List&lt;String&gt;&gt;? = null,
    val identity: Output&lt;Any&gt;? = null,
    val inputs: Output&lt;Map&lt;String, Any&gt;&gt;? = null,
    val isArchived: Output&lt;Boolean&gt;? = null,
    val jars: Output&lt;List&lt;String&gt;&gt;? = null,
    val jobType: Output&lt;String&gt;,
    val notificationSetting: Output&lt;NotificationSettingArgs&gt;? = null,
    val outputs: Output&lt;Map&lt;String, Any&gt;&gt;? = null,
    val properties: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    val pyFiles: Output&lt;List&lt;String&gt;&gt;? = null,
    val queueSettings: Output&lt;QueueSettingsArgs&gt;? = null,
    val resources: Output&lt;SparkResourceConfigurationArgs&gt;? = null,
    val services: Output&lt;Map&lt;String, JobServiceArgs&gt;&gt;? = null,
    val tags: Output&lt;Map&lt;String, String&gt;&gt;? = null
) : ConvertibleToJava&lt;SparkJobArgs&gt;
Spark job definition.
Constructors
constructor(
    archives: Output&lt;List&lt;String&gt;&gt;? = null,
    args: Output&lt;String&gt;? = null,
    codeId: Output&lt;String&gt;,
    componentId: Output&lt;String&gt;? = null,
    computeId: Output&lt;String&gt;? = null,
    conf: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    description: Output&lt;String&gt;? = null,
    displayName: Output&lt;String&gt;? = null,
    entry: Output&lt;Either&lt;SparkJobPythonEntryArgs, SparkJobScalaEntryArgs&gt;&gt;,
    environmentId: Output&lt;String&gt;? = null,
    environmentVariables: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    experimentName: Output&lt;String&gt;? = null,
    files: Output&lt;List&lt;String&gt;&gt;? = null,
    identity: Output&lt;Any&gt;? = null,
    inputs: Output&lt;Map&lt;String, Any&gt;&gt;? = null,
    isArchived: Output&lt;Boolean&gt;? = null,
    jars: Output&lt;List&lt;String&gt;&gt;? = null,
    jobType: Output&lt;String&gt;,
    notificationSetting: Output&lt;NotificationSettingArgs&gt;? = null,
    outputs: Output&lt;Map&lt;String, Any&gt;&gt;? = null,
    properties: Output&lt;Map&lt;String, String&gt;&gt;? = null,
    pyFiles: Output&lt;List&lt;String&gt;&gt;? = null,
    queueSettings: Output&lt;QueueSettingsArgs&gt;? = null,
    resources: Output&lt;SparkResourceConfigurationArgs&gt;? = null,
    services: Output&lt;Map&lt;String, JobServiceArgs&gt;&gt;? = null,
    tags: Output&lt;Map&lt;String, String&gt;&gt;? = null
)
Properties
componentId
ARM resource ID of the component resource.
description
The asset description text.
displayName
Display name of the job.
entry
Required. The entry to execute on startup of the job.
environmentId
The ARM resource ID of the Environment specification for the job.
environmentVariables
Environment variables included in the job.
experimentName
The name of the experiment the job belongs to. If not set, the job is placed in the "Default" experiment.
isArchived
Whether the asset is archived.
notificationSetting
Notification setting for the job.
properties
The asset property dictionary.
queueSettings
Queue settings for the job.
resources
Compute resource configuration for the job.
services
List of JobEndpoints. For local jobs, a job endpoint will have an endpoint value of FileStreamObject.
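A minimal sketch of constructing SparkJobArgs with its three required parameters (codeId, entry, jobType). The import paths, the SparkJobPythonEntryArgs field names, and all resource IDs below are assumptions for illustration and may differ in your SDK version; check the package layout of the pulumi-azure-native Kotlin SDK you have installed.

```kotlin
// Assumed import paths (hypothetical; verify against your SDK version).
import com.pulumi.core.Output
import com.pulumi.core.Either
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobArgs
import com.pulumi.azurenative.machinelearningservices.kotlin.inputs.SparkJobPythonEntryArgs

val sparkJob = SparkJobArgs(
    // Hypothetical code asset ID; substitute the ARM ID of your own code asset.
    codeId = Output.of("azureml:my-code:1"),
    jobType = Output.of("Spark"),
    // entry is an Either of the Python and Scala entry variants;
    // here the left (Python) side is chosen.
    entry = Output.of(
        Either.ofLeft(
            SparkJobPythonEntryArgs(file = Output.of("main.py"))
        )
    ),
    // Spark configuration key/value pairs passed through to the cluster.
    conf = Output.of(
        mapOf(
            "spark.driver.cores" to "1",
            "spark.driver.memory" to "2g"
        )
    ),
    experimentName = Output.of("my-experiment")
)
```

Parameters left null fall back to their defaults, so only the required fields and whatever you want to override need to be supplied.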