Etl Args
Data transformation (ETL) for Log Service is a hosted, highly available, and scalable data processing service. It is widely used for scenarios such as data normalization, enrichment, distribution, aggregation, and index rebuilding. Refer to the details.
NOTE: Available since v1.120.0.
Example Usage
Basic Usage
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.random.RandomInteger;
import com.pulumi.random.RandomIntegerArgs;
import com.pulumi.alicloud.log.Project;
import com.pulumi.alicloud.log.ProjectArgs;
import com.pulumi.alicloud.log.Store;
import com.pulumi.alicloud.log.StoreArgs;
import com.pulumi.alicloud.log.Etl;
import com.pulumi.alicloud.log.EtlArgs;
import com.pulumi.alicloud.log.inputs.EtlEtlSinkArgs;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        var default_ = new RandomInteger("default", RandomIntegerArgs.builder()
            .max(99999)
            .min(10000)
            .build());

        var exampleProject = new Project("exampleProject", ProjectArgs.builder()
            .description("terraform-example")
            .build());

        // Source logstore that the ETL job reads from.
        var exampleStore = new Store("exampleStore", StoreArgs.builder()
            .project(exampleProject.name())
            .retentionPeriod(3650)
            .shardCount(3)
            .autoSplit(true)
            .maxSplitShardCount(60)
            .appendMeta(true)
            .build());

        // Target logstores that the ETL job delivers results to.
        var example2 = new Store("example2", StoreArgs.builder()
            .project(exampleProject.name())
            .retentionPeriod(3650)
            .shardCount(3)
            .autoSplit(true)
            .maxSplitShardCount(60)
            .appendMeta(true)
            .build());

        var example3 = new Store("example3", StoreArgs.builder()
            .project(exampleProject.name())
            .retentionPeriod(3650)
            .shardCount(3)
            .autoSplit(true)
            .maxSplitShardCount(60)
            .appendMeta(true)
            .build());

        // The ETL job: runs the script against the source logstore and
        // writes the output to the two sink logstores.
        var exampleEtl = new Etl("exampleEtl", EtlArgs.builder()
            .etlName("terraform-example")
            .project(exampleProject.name())
            .displayName("terraform-example")
            .description("terraform-example")
            .accessKeyId("access_key_id")
            .accessKeySecret("access_key_secret")
            .script("e_set('new','key')")
            .logstore(exampleStore.name())
            .etlSinks(
                EtlEtlSinkArgs.builder()
                    .name("target_name")
                    .accessKeyId("example2_access_key_id")
                    .accessKeySecret("example2_access_key_secret")
                    .endpoint("cn-hangzhou.log.aliyuncs.com")
                    .project(exampleProject.name())
                    .logstore(example2.name())
                    .build(),
                EtlEtlSinkArgs.builder()
                    .name("target_name2")
                    .accessKeyId("example3_access_key_id")
                    .accessKeySecret("example3_access_key_secret")
                    .endpoint("cn-hangzhou.log.aliyuncs.com")
                    .project(exampleProject.name())
                    .logstore(example3.name())
                    .build())
            .build());
    }
}
Import
Log ETL jobs can be imported using the id, e.g.
$ pulumi import alicloud:log/etl:Etl example tf-log-project:tf-log-etl-name
Constructors
Functions
Properties
kms_encryption_access_key_id_context
A KMS encryption context used to decrypt kms_encrypted_access_key_id before creating or updating an instance with kms_encrypted_access_key_id. See Encryption Context. It is valid only when kms_encrypted_access_key_id is set. When it is changed, the instance reboots for the change to take effect.
kms_encryption_access_key_secret_context
A KMS encryption context used to decrypt kms_encrypted_access_key_secret before creating or updating an instance with kms_encrypted_access_key_secret. See Encryption Context. It is valid only when kms_encrypted_access_key_secret is set. When it is changed, the instance reboots for the change to take effect.
role_arn
STS role information for the delivery target logstore. Specify at most one of role_arn and the (access_key_id, access_key_secret) pair. If neither is provided, you must set (kms_encrypted_access_key_id, kms_encrypted_access_key_secret, kms_encryption_access_key_id_context, kms_encryption_access_key_secret_context) so that the key pair can be obtained from KMS.
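Taken together, this means each sink must carry exactly one credential source: a role ARN, a plaintext key pair, or a KMS-encrypted key pair. As a sketch only, a sink authorized through an STS role instead of a key pair might look like the fragment below; the role ARN is a placeholder, and roleArn is assumed to be the Java builder setter corresponding to the role_arn property.

```java
// Sketch: a delivery sink using an STS role instead of access keys.
// The ARN below is a hypothetical placeholder, not a real role.
var sink = EtlEtlSinkArgs.builder()
    .name("target_name")
    .endpoint("cn-hangzhou.log.aliyuncs.com")
    .project(exampleProject.name())
    .logstore(example2.name())
    .roleArn("acs:ram::123456789012:role/example-log-etl-role") // placeholder
    .build();
```

When role_arn is set, the access_key_id/access_key_secret and kms_encrypted_* fields for that sink are omitted, since at most one credential source may be supplied.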