DataQualityJobDefinitionDataQualityJobInputBatchTransformInput
data class DataQualityJobDefinitionDataQualityJobInputBatchTransformInput(val dataCapturedDestinationS3Uri: String, val datasetFormat: DataQualityJobDefinitionDataQualityJobInputBatchTransformInputDatasetFormat, val localPath: String? = null, val s3DataDistributionType: String? = null, val s3InputMode: String? = null)
Constructors
constructor(dataCapturedDestinationS3Uri: String, datasetFormat: DataQualityJobDefinitionDataQualityJobInputBatchTransformInputDatasetFormat, localPath: String? = null, s3DataDistributionType: String? = null, s3InputMode: String? = null)
Properties
dataCapturedDestinationS3Uri
The Amazon S3 location being used to capture the data.
datasetFormat
The dataset format for your batch transform job. Fields are documented below.
s3DataDistributionType
Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Valid values are FullyReplicated or ShardedByS3Key. Defaults to FullyReplicated.
s3InputMode
Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets, while File mode is useful for small files that fit in memory. Valid values are Pipe or File. Defaults to File.
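As a hedged sketch of how this type might be constructed: the bucket URI is a hypothetical placeholder, and the datasetFormat value is assumed to be built from the companion DataQualityJobDefinitionDataQualityJobInputBatchTransformInputDatasetFormat type, whose fields are documented on its own reference page.

```kotlin
// Sketch only, not a definitive implementation. Assumes `datasetFormat`
// has been built from the companion DatasetFormat type per its own docs.
val batchTransformInput = DataQualityJobDefinitionDataQualityJobInputBatchTransformInput(
    // Hypothetical S3 location where captured data is stored.
    dataCapturedDestinationS3Uri = "s3://my-bucket/data-capture",
    datasetFormat = datasetFormat,
    // Optional: "FullyReplicated" (default) or "ShardedByS3Key".
    s3DataDistributionType = "FullyReplicated",
    // Optional: "File" (default) or "Pipe"; Pipe suits large datasets.
    s3InputMode = "File",
)
```

Only dataCapturedDestinationS3Uri and datasetFormat are required; the remaining properties default to null and fall back to the service-side defaults noted above.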