Data Transfer Config Args
Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. To get more information about Config, see:
How-to Guides
Warning: All arguments including the following potentially sensitive values will be stored in the raw state as plain text: sensitive_params.secret_access_key. Read more about sensitive data in state.
Example Usage
Bigquerydatatransfer Config Scheduled Query
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.gcp.organizations.OrganizationsFunctions;
import com.pulumi.gcp.organizations.inputs.GetProjectArgs;
import com.pulumi.gcp.projects.IAMMember;
import com.pulumi.gcp.projects.IAMMemberArgs;
import com.pulumi.gcp.bigquery.Dataset;
import com.pulumi.gcp.bigquery.DatasetArgs;
import com.pulumi.gcp.bigquery.DataTransferConfig;
import com.pulumi.gcp.bigquery.DataTransferConfigArgs;
import com.pulumi.resources.CustomResourceOptions;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        final var project = OrganizationsFunctions.getProject();

        // Grant the BigQuery Data Transfer Service agent the token creator role;
        // the resources below depend on this binding being in place first.
        var permissions = new IAMMember("permissions", IAMMemberArgs.builder()
            .project(project.applyValue(getProjectResult -> getProjectResult.projectId()))
            .role("roles/iam.serviceAccountTokenCreator")
            .member(project.applyValue(getProjectResult -> String.format("serviceAccount:service-%s@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com", getProjectResult.number())))
            .build());

        // Destination dataset for the scheduled query.
        var myDataset = new Dataset("myDataset", DatasetArgs.builder()
            .datasetId("my_dataset")
            .friendlyName("foo")
            .description("bar")
            .location("asia-northeast1")
            .build(), CustomResourceOptions.builder()
                .dependsOn(permissions)
                .build());

        // Scheduled query that writes into the dataset above.
        var queryConfig = new DataTransferConfig("queryConfig", DataTransferConfigArgs.builder()
            .displayName("my-query")
            .location("asia-northeast1")
            .dataSourceId("scheduled_query")
            .schedule("first sunday of quarter 00:00")
            .destinationDatasetId(myDataset.datasetId())
            .params(Map.ofEntries(
                Map.entry("destination_table_name_template", "my_table"),
                Map.entry("write_disposition", "WRITE_APPEND"),
                Map.entry("query", "SELECT name FROM tabl WHERE x = 'y'")
            ))
            .build(), CustomResourceOptions.builder()
                .dependsOn(permissions)
                .build());
    }
}
Import
Config can be imported using the following format:
$ pulumi import gcp:bigquery/dataTransferConfig:DataTransferConfig default {{name}}
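Here {{name}} is the transfer config's full resource name, which typically has the form projects/{{project}}/locations/{{location}}/transferConfigs/{{config_id}}. As an illustration only (the project number, location, and config id below are placeholders):
$ pulumi import gcp:bigquery/dataTransferConfig:DataTransferConfig default projects/123456789/locations/asia-northeast1/transferConfigs/12345678-0000-0000-0000-000000000000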
Constructors
Properties
The number of days to look back to automatically refresh the data. For example, if dataRefreshWindowDays = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
The data source id. Cannot be changed once the transfer config is created.
The BigQuery target dataset id.
The user specified display name for the transfer config.
Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config. Structure is documented below.
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish.
Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq NOTE: If you are attempting to update a parameter that cannot be updated (due to API limitations), please force recreation of the resource.
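As an illustration only, a Cloud Storage transfer might be configured roughly as follows. The parameter keys (data_path_template, file_format, and so on) and their values are placeholders that should be checked against the Cloud Storage transfer documentation linked above, and the destination dataset is assumed to already exist:
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.gcp.bigquery.DataTransferConfig;
import com.pulumi.gcp.bigquery.DataTransferConfigArgs;
import java.util.Map;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        // Sketch of a Cloud Storage transfer into an existing dataset.
        // Bucket, dataset, and table names are placeholders.
        var gcsConfig = new DataTransferConfig("gcsConfig", DataTransferConfigArgs.builder()
            .displayName("my-gcs-transfer")
            .location("asia-northeast1")
            .dataSourceId("google_cloud_storage")
            .schedule("every 24 hours")
            .destinationDatasetId("my_dataset")
            .params(Map.ofEntries(
                Map.entry("data_path_template", "gs://my-bucket/*.csv"),
                Map.entry("destination_table_name_template", "my_table"),
                Map.entry("file_format", "CSV"),
                Map.entry("skip_leading_rows", "1")
            ))
            .build());
    }
}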
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan, jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: the granularity should be at least 8 hours, or less frequent.
Options customizing the data transfer schedule. Structure is documented below.
Different parameters are configured primarily using the params field on this resource. This block contains the parameters which contain secrets or passwords, so that they can be marked sensitive and hidden from plan output. The name of the field, e.g. secret_access_key, will be the key in the params map in the API request. Credentials may not be specified in both locations; doing so will cause an error. Changing from one location to a different credential configuration in the config will require an apply to update state. Structure is documented below.
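A minimal sketch of this split, assuming an Amazon S3 data source and the generated DataTransferConfigSensitiveParamsArgs input type; the parameter keys, stack configuration keys, bucket, and dataset names are placeholders, and the secret is read from stack configuration rather than hard-coded:
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.gcp.bigquery.DataTransferConfig;
import com.pulumi.gcp.bigquery.DataTransferConfigArgs;
import com.pulumi.gcp.bigquery.inputs.DataTransferConfigSensitiveParamsArgs;
import java.util.Map;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        var config = ctx.config();

        var s3Config = new DataTransferConfig("s3Config", DataTransferConfigArgs.builder()
            .displayName("my-s3-transfer")
            .location("asia-northeast1")
            .dataSourceId("amazon_s3")
            .schedule("every 24 hours")
            .destinationDatasetId("my_dataset")
            .params(Map.ofEntries(
                Map.entry("data_path", "s3://my-bucket/*.csv"),
                Map.entry("destination_table_name_template", "my_table"),
                Map.entry("file_format", "CSV"),
                Map.entry("access_key_id", config.require("awsAccessKeyId"))
            ))
            // secret_access_key goes in sensitive_params rather than params,
            // so it is marked sensitive and hidden from plan output.
            .sensitiveParams(DataTransferConfigSensitiveParamsArgs.builder()
                .secretAccessKey(config.requireSecret("awsSecretAccessKey"))
                .build())
            .build());
    }
}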
Service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the user calling this API has permission to act as this service account.