GetBigDataPoolResult
data class GetBigDataPoolResult(
    val autoPause: AutoPausePropertiesResponse? = null,
    val autoScale: AutoScalePropertiesResponse? = null,
    val cacheSize: Int? = null,
    val creationDate: String,
    val customLibraries: List<LibraryInfoResponse>? = null,
    val defaultSparkLogFolder: String? = null,
    val dynamicExecutorAllocation: DynamicExecutorAllocationResponse? = null,
    val id: String,
    val isAutotuneEnabled: Boolean? = null,
    val isComputeIsolationEnabled: Boolean? = null,
    val lastSucceededTimestamp: String,
    val libraryRequirements: LibraryRequirementsResponse? = null,
    val location: String,
    val name: String,
    val nodeCount: Int? = null,
    val nodeSize: String? = null,
    val nodeSizeFamily: String? = null,
    val provisioningState: String? = null,
    val sessionLevelPackagesEnabled: Boolean? = null,
    val sparkConfigProperties: SparkConfigPropertiesResponse? = null,
    val sparkEventsFolder: String? = null,
    val sparkVersion: String? = null,
    val tags: Map<String, String>? = null,
    val type: String
)
A Big Data pool.
Constructors

constructor(
    autoPause: AutoPausePropertiesResponse? = null,
    autoScale: AutoScalePropertiesResponse? = null,
    cacheSize: Int? = null,
    creationDate: String,
    customLibraries: List<LibraryInfoResponse>? = null,
    defaultSparkLogFolder: String? = null,
    dynamicExecutorAllocation: DynamicExecutorAllocationResponse? = null,
    id: String,
    isAutotuneEnabled: Boolean? = null,
    isComputeIsolationEnabled: Boolean? = null,
    lastSucceededTimestamp: String,
    libraryRequirements: LibraryRequirementsResponse? = null,
    location: String,
    name: String,
    nodeCount: Int? = null,
    nodeSize: String? = null,
    nodeSizeFamily: String? = null,
    provisioningState: String? = null,
    sessionLevelPackagesEnabled: Boolean? = null,
    sparkConfigProperties: SparkConfigPropertiesResponse? = null,
    sparkEventsFolder: String? = null,
    sparkVersion: String? = null,
    tags: Map<String, String>? = null,
    type: String
)
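Since most fields of this result are nullable, reading an instance typically means null-safe access with sensible defaults. The sketch below uses hypothetical, trimmed-down stand-ins for the Response types and for the result class itself (only a handful of fields are kept), so it illustrates the access pattern rather than reproducing the SDK's actual definitions.

```kotlin
// Hypothetical stand-ins for the Response types referenced above,
// trimmed to a couple of fields each for illustration.
data class AutoPausePropertiesResponse(
    val delayInMinutes: Int? = null,
    val enabled: Boolean? = null
)

data class AutoScalePropertiesResponse(
    val enabled: Boolean? = null,
    val minNodeCount: Int? = null,
    val maxNodeCount: Int? = null
)

// A trimmed-down sketch of the result class; the real GetBigDataPoolResult
// carries the full field set shown in the declaration above.
data class BigDataPoolSketch(
    val autoPause: AutoPausePropertiesResponse? = null,
    val autoScale: AutoScalePropertiesResponse? = null,
    val id: String,
    val name: String,
    val location: String,
    val sparkVersion: String? = null,
    val type: String
)

fun main() {
    val pool = BigDataPoolSketch(
        autoPause = AutoPausePropertiesResponse(delayInMinutes = 15, enabled = true),
        autoScale = AutoScalePropertiesResponse(enabled = true, minNodeCount = 3, maxNodeCount = 10),
        id = "/subscriptions/.../bigDataPools/examplePool",
        name = "examplePool",
        location = "westus2",
        sparkVersion = "3.4",
        type = "Microsoft.Synapse/workspaces/bigDataPools"
    )

    // Optional fields are nullable: use safe calls and the elvis operator
    // to supply defaults when a property was not set on the pool.
    val pauseDelay = pool.autoPause?.delayInMinutes ?: 0
    val maxNodes = pool.autoScale?.maxNodeCount ?: 0
    println("${pool.name}: pauses after $pauseDelay min, scales up to $maxNodes nodes")
}
```

The same safe-call pattern applies to every nullable property of the real class (for example `result.dynamicExecutorAllocation?.enabled` or `result.tags?.get("env")`).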
Properties

autoPause
Auto-pausing properties.

autoScale
Auto-scaling properties.

creationDate
The time when the Big Data pool was created.

customLibraries
List of custom libraries/packages associated with the Spark pool.

defaultSparkLogFolder
The default folder where Spark logs will be written.

dynamicExecutorAllocation
Dynamic executor allocation properties.

isAutotuneEnabled
Whether autotune is enabled.

isComputeIsolationEnabled
Whether compute isolation is required.

lastSucceededTimestamp
The time when the Big Data pool was last updated successfully.

libraryRequirements
Library version requirements.

nodeSizeFamily
The kind of nodes that the Big Data pool provides.

provisioningState
The state of the Big Data pool.

sessionLevelPackagesEnabled
Whether session-level packages are enabled.

sparkConfigProperties
Spark configuration file to specify additional properties.

sparkEventsFolder
The Spark events folder.

sparkVersion
The Apache Spark version.