SparkPoolArgs

data class SparkPoolArgs(val autoPause: Output<SparkPoolAutoPauseArgs>? = null, val autoScale: Output<SparkPoolAutoScaleArgs>? = null, val cacheSize: Output<Int>? = null, val computeIsolationEnabled: Output<Boolean>? = null, val dynamicExecutorAllocationEnabled: Output<Boolean>? = null, val libraryRequirement: Output<SparkPoolLibraryRequirementArgs>? = null, val maxExecutors: Output<Int>? = null, val minExecutors: Output<Int>? = null, val name: Output<String>? = null, val nodeCount: Output<Int>? = null, val nodeSize: Output<String>? = null, val nodeSizeFamily: Output<String>? = null, val sessionLevelPackagesEnabled: Output<Boolean>? = null, val sparkConfig: Output<SparkPoolSparkConfigArgs>? = null, val sparkEventsFolder: Output<String>? = null, val sparkLogFolder: Output<String>? = null, val sparkVersion: Output<String>? = null, val synapseWorkspaceId: Output<String>? = null, val tags: Output<Map<String, String>>? = null) : ConvertibleToJava<SparkPoolArgs>

Manages a Synapse Spark Pool.
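A minimal construction sketch, assuming Pulumi Kotlin SDK package conventions (`com.pulumi.azure.synapse.kotlin` for the args class, `com.pulumi.core.Output` for wrapping values); the workspace ID below is a placeholder:

```kotlin
// Sketch only: package paths follow Pulumi Kotlin SDK conventions and may differ.
import com.pulumi.core.Output
import com.pulumi.azure.synapse.kotlin.SparkPoolArgs

// Placeholder workspace ID; in a real program this would come from a
// Synapse Workspace resource's `id` output.
val workspaceId = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.Synapse/workspaces/workspace1"

val poolArgs = SparkPoolArgs(
    name = Output.of("example"),
    synapseWorkspaceId = Output.of(workspaceId),
    nodeSizeFamily = Output.of("MemoryOptimized"),
    nodeSize = Output.of("Small"),
    nodeCount = Output.of(3),        // fixed size; mutually exclusive with autoScale
    sparkVersion = Output.of("3.2"),
)
```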

Import

Synapse Spark Pool can be imported using the resource id, e.g.

$ pulumi import azure:synapse/sparkPool:SparkPool example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.Synapse/workspaces/workspace1/bigDataPools/sparkPool1

Constructors

fun SparkPoolArgs(autoPause: Output<SparkPoolAutoPauseArgs>? = null, autoScale: Output<SparkPoolAutoScaleArgs>? = null, cacheSize: Output<Int>? = null, computeIsolationEnabled: Output<Boolean>? = null, dynamicExecutorAllocationEnabled: Output<Boolean>? = null, libraryRequirement: Output<SparkPoolLibraryRequirementArgs>? = null, maxExecutors: Output<Int>? = null, minExecutors: Output<Int>? = null, name: Output<String>? = null, nodeCount: Output<Int>? = null, nodeSize: Output<String>? = null, nodeSizeFamily: Output<String>? = null, sessionLevelPackagesEnabled: Output<Boolean>? = null, sparkConfig: Output<SparkPoolSparkConfigArgs>? = null, sparkEventsFolder: Output<String>? = null, sparkLogFolder: Output<String>? = null, sparkVersion: Output<String>? = null, synapseWorkspaceId: Output<String>? = null, tags: Output<Map<String, String>>? = null)

Functions

open override fun toJava(): SparkPoolArgs

Properties

val autoPause: Output<SparkPoolAutoPauseArgs>? = null

An auto_pause block as defined below.

val autoScale: Output<SparkPoolAutoScaleArgs>? = null

An auto_scale block as defined below. Exactly one of node_count or auto_scale must be specified.
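To illustrate the "exactly one of" constraint, an auto-scaled pool omits nodeCount and supplies an autoScale block instead. This is a sketch: the `SparkPoolAutoScaleArgs` field names below (`minNodeCount`, `maxNodeCount`) are assumed to mirror the provider's `min_node_count`/`max_node_count`, and `workspaceId` is a placeholder resource ID:

```kotlin
// Sketch: auto_scale in place of a fixed nodeCount (exactly one must be set).
val autoScaledArgs = SparkPoolArgs(
    synapseWorkspaceId = Output.of(workspaceId),
    nodeSizeFamily = Output.of("MemoryOptimized"),
    nodeSize = Output.of("Small"),
    autoScale = Output.of(
        SparkPoolAutoScaleArgs(
            minNodeCount = Output.of(3),
            maxNodeCount = Output.of(10),
        )
    ),
)
```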

val cacheSize: Output<Int>? = null

The cache size in the Spark Pool.

val computeIsolationEnabled: Output<Boolean>? = null

Indicates whether compute isolation is enabled or not. Defaults to false.

val dynamicExecutorAllocationEnabled: Output&lt;Boolean&gt;? = null

Indicates whether Dynamic Executor Allocation is enabled or not. Defaults to false.

val libraryRequirement: Output&lt;SparkPoolLibraryRequirementArgs&gt;? = null

A library_requirement block as defined below.

val maxExecutors: Output<Int>? = null

The maximum number of executors allocated. Only used when dynamic_executor_allocation_enabled is set to true.

val minExecutors: Output<Int>? = null

The minimum number of executors allocated. Only used when dynamic_executor_allocation_enabled is set to true.
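The executor bounds and the toggle go together; a sketch of a pool with dynamic executor allocation (placeholder `workspaceId` and sizing values):

```kotlin
// Sketch: minExecutors/maxExecutors only take effect when
// dynamicExecutorAllocationEnabled is true.
val dynamicArgs = SparkPoolArgs(
    synapseWorkspaceId = Output.of(workspaceId),
    nodeSizeFamily = Output.of("MemoryOptimized"),
    nodeSize = Output.of("Small"),
    nodeCount = Output.of(5),
    dynamicExecutorAllocationEnabled = Output.of(true),
    minExecutors = Output.of(1),
    maxExecutors = Output.of(4),
)
```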

val name: Output<String>? = null

The name which should be used for this Synapse Spark Pool. Changing this forces a new Synapse Spark Pool to be created.

val nodeCount: Output<Int>? = null

The number of nodes in the Spark Pool. Exactly one of node_count or auto_scale must be specified.

val nodeSize: Output<String>? = null

The size of the nodes in the Spark Pool. Possible values are Small, Medium, Large, None, XLarge, XXLarge and XXXLarge.

val nodeSizeFamily: Output<String>? = null

The kind of nodes that the Spark Pool provides. Possible values are HardwareAcceleratedFPGA, HardwareAcceleratedGPU, MemoryOptimized, and None.

val sessionLevelPackagesEnabled: Output&lt;Boolean&gt;? = null

Indicates whether session level packages are enabled or not. Defaults to false.

val sparkConfig: Output&lt;SparkPoolSparkConfigArgs&gt;? = null

A spark_config block as defined below.

val sparkEventsFolder: Output<String>? = null

The Spark events folder. Defaults to /events.

val sparkLogFolder: Output<String>? = null

The default folder where Spark logs will be written. Defaults to /logs.

val sparkVersion: Output<String>? = null

The Apache Spark version. Possible values are 2.4, 3.1, 3.2 and 3.3. Defaults to 2.4.

val synapseWorkspaceId: Output<String>? = null

The ID of the Synapse Workspace where the Synapse Spark Pool should exist. Changing this forces a new Synapse Spark Pool to be created.

val tags: Output<Map<String, String>>? = null

A mapping of tags which should be assigned to the Synapse Spark Pool.