Project Sink
Import
Project-level logging sinks can be imported using their URI, e.g.
projects/{{project_id}}/sinks/{{name}}
When using the pulumi import command, project-level logging sinks can be imported using the format above. For example:
$ pulumi import gcp:logging/projectSink:ProjectSink default projects/{{project_id}}/sinks/{{name}}
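After a successful import, the sink must also be declared in the program so Pulumi can manage it going forward. A minimal TypeScript sketch matching the import command above (the resource name and destination bucket are hypothetical):

```typescript
import * as gcp from "@pulumi/gcp";

// Corresponds to: pulumi import gcp:logging/projectSink:ProjectSink default ...
const sink = new gcp.logging.ProjectSink("default", {
    // Where matching log entries are routed; a Cloud Storage bucket here.
    destination: "storage.googleapis.com/my-example-bucket",
});
```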
Properties
bigqueryOptions - Options that affect sinks exporting data to BigQuery. Structure is documented below.
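As a sketch of the BigQuery options, the nested block exposes a usePartitionedTables flag; the project and dataset names below are hypothetical:

```typescript
import * as gcp from "@pulumi/gcp";

const bqSink = new gcp.logging.ProjectSink("bq-sink", {
    destination: "bigquery.googleapis.com/projects/my-project/datasets/my_dataset",
    // Route into date-partitioned tables rather than tables suffixed by date.
    bigqueryOptions: {
        usePartitionedTables: true,
    },
    // bigqueryOptions requires a unique writer identity (see below).
    uniqueWriterIdentity: true,
});
```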
customWriterIdentity - A user-managed service account that will be used to write the log entries. The format must be serviceAccount:some@email. This field can only be specified if you are routing logs to a destination outside this sink's project. If not specified, a Logging service account will automatically be generated.
description - A description of this sink. The maximum length of the description is 8000 characters.
destination - The destination of the sink (or, in other words, where logs are written to). Can be a Cloud Storage bucket, a Pub/Sub topic, a BigQuery dataset, a Cloud Logging bucket, or a Google Cloud project. Examples:
storage.googleapis.com/[GCS_BUCKET]
bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[BIGQUERY_DATASET]
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]
logging.googleapis.com/projects/[PROJECT_ID]/locations/[LOCATION]/buckets/[BUCKET_ID]
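Each destination type uses a service-qualified URI. As an illustration, a sink routing to a Pub/Sub topic (the project and topic names are hypothetical):

```typescript
import * as gcp from "@pulumi/gcp";

const pubsubSink = new gcp.logging.ProjectSink("pubsub-sink", {
    // Pub/Sub destinations use the pubsub.googleapis.com prefix.
    destination: "pubsub.googleapis.com/projects/my-project/topics/my-log-topic",
});
```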
exclusions - Log entries that match any of the exclusion filters will not be exported. If a log entry is matched by both filter and one of exclusions.filter, it will not be exported. Can be repeated multiple times for multiple exclusions. Structure is documented below.
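A sketch combining a broad filter with an exclusion; the bucket name and filter expressions are illustrative:

```typescript
import * as gcp from "@pulumi/gcp";

const filteredSink = new gcp.logging.ProjectSink("filtered-sink", {
    destination: "storage.googleapis.com/my-audit-bucket",
    // Export everything at WARNING and above...
    filter: "severity >= WARNING",
    // ...except load-balancer health checks, which are dropped even though
    // they match the filter above.
    exclusions: [{
        name: "exclude-health-checks",
        filter: "resource.type = \"http_load_balancer\" AND httpRequest.requestUrl : \"/healthz\"",
    }],
});
```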
filter - The filter to apply when exporting logs. Only log entries that match the filter are exported. See Advanced Log Filters for information on how to write a filter.
uniqueWriterIdentity - Whether or not to create a unique identity associated with this sink. If false, the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true (the default), a unique service account is created and used for this sink. If you wish to publish logs across projects or use bigquery_options, you must set unique_writer_identity to true.
writerIdentity - The identity associated with this sink. This identity must be granted write access to the configured destination.
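Granting the writer identity that access is typically done in the same program. A sketch for a Cloud Storage destination (the bucket name is hypothetical):

```typescript
import * as gcp from "@pulumi/gcp";

const storageSink = new gcp.logging.ProjectSink("storage-sink", {
    destination: "storage.googleapis.com/my-log-bucket",
    // Create a dedicated service account for this sink.
    uniqueWriterIdentity: true,
});

// The sink's writer identity needs permission to create objects in the bucket.
new gcp.storage.BucketIAMMember("sink-writer", {
    bucket: "my-log-bucket",
    role: "roles/storage.objectCreator",
    member: storageSink.writerIdentity,
});
```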