Logpush Job Args
Example Usage
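A minimal TypeScript sketch, assuming the @pulumi/cloudflare SDK: a zone-scoped job that pushes HTTP request logs to an S3 bucket. The zone ID, bucket path, field list, and ownership token are placeholders, not defaults:

import * as cloudflare from "@pulumi/cloudflare";

// A zone-scoped job that pushes HTTP request logs to an S3 bucket.
// Replace the zone ID, bucket path, and ownership token with real values;
// the token comes from Cloudflare's ownership-challenge flow (see the
// ownership_challenge property below).
const job = new cloudflare.LogpushJob("example", {
    zoneId: "<zone_id>",
    dataset: "http_requests",
    enabled: true,
    destinationConf: "s3://my-logs-bucket/http?region=us-west-2",
    logpullOptions: "fields=RayID,ClientIP,EdgeStartTimestamp&timestamps=rfc3339",
    ownershipChallenge: "<ownership_challenge_token>",
});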
Import
Import an account-scoped job.
$ pulumi import cloudflare:index/logpushJob:LogpushJob example account/<account_id>/<job_id>
Import a zone-scoped job.
$ pulumi import cloudflare:index/logpushJob:LogpushJob example zone/<zone_id>/<job_id>
Constructors
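The TypeScript constructor follows the standard Pulumi custom-resource shape; a sketch of the signature, assuming the generated LogpushJobArgs type, not the verbatim SDK listing:

new LogpushJob(name: string, args: LogpushJobArgs, opts?: pulumi.CustomResourceOptions)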
Properties
The kind of the dataset to use with the logpush job. Available values: access_requests, casb_findings, firewall_events, http_requests, spectrum_events, nel_reports, audit_logs, gateway_dns, gateway_http, gateway_network, dns_logs, network_analytics_logs, workers_trace_events, device_posture_results, zero_trust_network_sessions, magic_ids_detections, page_shield_events, dlp_forensic_copies.
Uniquely identifies a resource (such as an S3 bucket) where data will be pushed. Additional configuration parameters supported by the destination may be included. See the Logpush destination documentation.
Configuration string for the Logshare API. It specifies things like requested fields and timestamp formats. See the Logpush options documentation.
The maximum uncompressed file size of a batch of logs. Value must be between 5 MB and 1 GB.
The maximum interval in seconds for log batches. Value must be between 30 and 300.
The maximum number of log lines per batch. Value must be between 1,000 and 1,000,000.
Structured replacement for logpull_options. When this field is included, the logpull_options field will be ignored.
Ownership challenge token to prove destination ownership, required when the destination is Amazon S3, Google Cloud Storage, Microsoft Azure, or Sumo Logic. See the Developer documentation.
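The token itself is produced outside of Pulumi: you ask Cloudflare to write a challenge file into the destination, then read the token out of that file. A sketch of that request, assuming Cloudflare's documented POST /logpush/ownership endpoint, a CLOUDFLARE_API_TOKEN environment variable, and Node 18+ for global fetch and top-level await (IDs and bucket are placeholders):

// Ask Cloudflare to drop an ownership-challenge file into the bucket.
const resp = await fetch(
    "https://api.cloudflare.com/client/v4/zones/<zone_id>/logpush/ownership",
    {
        method: "POST",
        headers: {
            Authorization: `Bearer ${process.env.CLOUDFLARE_API_TOKEN}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            destination_conf: "s3://my-logs-bucket/http?region=us-west-2",
        }),
    },
);
// The response names the file written into the bucket; the contents of
// that file are the token to pass as ownership_challenge.
const { result } = await resp.json();
console.log(result.filename);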