SparkJobResponse

data class SparkJobResponse(
    val archiveUris: List<String>,
    val args: List<String>,
    val fileUris: List<String>,
    val jarFileUris: List<String>,
    val loggingConfig: LoggingConfigResponse,
    val mainClass: String,
    val mainJarFileUri: String,
    val properties: Map<String, String>
)

A Dataproc job for running Apache Spark (http://spark.apache.org/) applications on YARN. The job's driver is specified by either the jar file that contains the main class or the main class name. To pass both a main jar and a main class in that jar, add the jar to jarFileUris and then specify the main class name in mainClass.
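Response types like this one are normally returned by the provider rather than built by hand, but constructing one illustrates how the driver fields relate. A minimal sketch, assuming LoggingConfigResponse wraps a driverLogLevels map (as in the underlying Dataproc API) and using hypothetical bucket paths and class names:

// Variant 1: the jar's manifest names the main class, so only
// mainJarFileUri is set and mainClass is left empty.
val byMainJar = SparkJobResponse(
    archiveUris = emptyList(),
    args = listOf("gs://my-bucket/input", "gs://my-bucket/output"), // hypothetical paths
    fileUris = emptyList(),
    jarFileUris = emptyList(),
    loggingConfig = LoggingConfigResponse(driverLogLevels = emptyMap()), // assumed shape
    mainClass = "",
    mainJarFileUri = "gs://my-bucket/jobs/wordcount.jar", // hypothetical URI
    properties = emptyMap()
)

// Variant 2: the main class is named explicitly, and the jar that
// contains it goes in jarFileUris instead of mainJarFileUri.
val byMainClass = byMainJar.copy(
    mainClass = "com.example.WordCount", // hypothetical class
    mainJarFileUri = "",
    jarFileUris = listOf("gs://my-bucket/jobs/wordcount.jar")
)

Because SparkJobResponse is a data class, copy is the idiomatic way to derive the second variant from the first.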

Constructors

fun SparkJobResponse(
    archiveUris: List<String>,
    args: List<String>,
    fileUris: List<String>,
    jarFileUris: List<String>,
    loggingConfig: LoggingConfigResponse,
    mainClass: String,
    mainJarFileUri: String,
    properties: Map<String, String>
)

Types

object Companion

Properties

val archiveUris: List<String>

Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

val args: List<String>

Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

val fileUris: List<String>

Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.

val jarFileUris: List<String>

Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.

val loggingConfig: LoggingConfigResponse

Optional. The runtime log config for job execution.

val mainClass: String

The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jarFileUris.

val mainJarFileUri: String

The HCFS URI of the jar file that contains the main class.

val properties: Map<String, String>

Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
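As the args and properties descriptions above indicate, engine settings that would otherwise be passed as --conf flags belong in properties, while only application-level arguments go in args. A short sketch of that split, with hypothetical values:

// Application arguments only; no --conf flags here.
val args = listOf("gs://my-bucket/input", "--mode", "batch") // hypothetical

// Spark engine settings, instead of --conf spark.executor.memory=4g etc.
val properties = mapOf(
    "spark.executor.memory" to "4g",
    "spark.dynamicAllocation.enabled" to "true"
)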