Configure the Google Cloud Storage target
Complete the procedure below to configure the Google Cloud Storage target and verify that you can write to the Google Cloud Storage bucket.
Note: For more information on Google IAM roles and permissions, see https://cloud.google.com/iam/docs/understanding-roles and https://cloud.google.com/iam/docs/creating-custom-roles.
- In the Google Cloud console, under IAM, give the GCP service account access to the Google Cloud Storage project. The minimum permissions are:
  - storage.buckets.get
  - storage.buckets.list
  - storage.objects.create
  - storage.objects.get
  - storage.objects.delete
- Alternatively, you can assign the GCP service account the roles/storage.admin role, which includes the minimum permissions (see the verification sketch after this step).
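If you want to confirm the grant before moving on, the following is a minimal sketch using the Python google-cloud-storage client; the key file and bucket name are placeholders for your own values. Because storage.buckets.list is a project-level permission, it cannot be tested against a single bucket and is omitted here.

```python
from google.cloud import storage

# Placeholders: replace with your service account key file and bucket name.
KEY_FILE = "gcp-service-account.json"
BUCKET_NAME = "my-target-bucket"

# Authenticate as the GCP service account that the target will use.
client = storage.Client.from_service_account_json(KEY_FILE)
bucket = client.bucket(BUCKET_NAME)

# Bucket-level permissions from the minimum set above.
# (storage.buckets.list is project-level and cannot be tested on a bucket.)
required = [
    "storage.buckets.get",
    "storage.objects.create",
    "storage.objects.get",
    "storage.objects.delete",
]

granted = bucket.test_iam_permissions(required)
missing = sorted(set(required) - set(granted))
print("Granted:", granted)
print("Missing:", missing or "none")
```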
- If BigQuery is configured, the GCP service account requires the following additional permissions:
  - bigquery.datasets.create
  - bigquery.datasets.get
  - bigquery.tables.create
  - bigquery.tables.list
  - bigquery.tables.delete
- Alternatively, if BigQuery is configured, you can assign the GCP service account the roles/bigquery.dataOwner role, which includes the minimum permissions (see the BigQuery sketch after this step).
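If BigQuery is configured, you can exercise the additional permissions in a similar way. The sketch below uses the Python google-cloud-bigquery client; the key file, project ID, and dataset and table names are placeholders, and the scratch table is removed at the end. Deleting the dataset itself would need bigquery.datasets.delete, which is not part of the minimum set above, so the empty dataset is left in place.

```python
from google.cloud import bigquery

# Placeholders: replace with your service account key file and project ID.
KEY_FILE = "gcp-service-account.json"
PROJECT_ID = "my-gcp-project"
DATASET_ID = "permission_check_dataset"
TABLE_ID = f"{PROJECT_ID}.{DATASET_ID}.permission_check_table"

client = bigquery.Client.from_service_account_json(KEY_FILE, project=PROJECT_ID)

# bigquery.datasets.create and bigquery.datasets.get
client.create_dataset(DATASET_ID, exists_ok=True)
client.get_dataset(DATASET_ID)

# bigquery.tables.create
schema = [bigquery.SchemaField("message", "STRING")]
client.create_table(bigquery.Table(TABLE_ID, schema=schema), exists_ok=True)

# bigquery.tables.list
print([table.table_id for table in client.list_tables(DATASET_ID)])

# bigquery.tables.delete; removing the dataset would also require
# bigquery.datasets.delete, which is not in the minimum set above.
client.delete_table(TABLE_ID, not_found_ok=True)
```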
- Create the Google Cloud Storage target before you configure it. For more information, see Add a publish target.
- On the Administration page, click the Targets tab. Then select the target in the Publish Targets list.
  Note: If you have already selected your target, skip to the next step.
- Refer to the following parameter descriptions and enter the required information. Click the buttons as they become enabled.
  - Data Storage Format: File format in which the data is stored. Default: Parquet. Other options are JSON, JSON Indent, and Text. If BigQuery is used, the only supported option is Text.
  - Compression: Compression type applied to message sets. Default: None. Other options are gzip and Snappy.
  - GCP Service Account Key File: JSON file that contains the Google Cloud Platform (GCP) service account credentials.
  - BigQuery Project ID: (Optional) Unique ID of your BigQuery project in GCP.
  - Include Header: (Optional) When selected, column names are added to the beginning of the file.
  - Field Delimiter: (Optional) Character(s) that separate the data fields in the rows. By default, a tab (\t) separates the fields.
  - Row Delimiter: (Optional) Character(s) that separate the data rows. By default, a new line separates the data rows. The characters that specify a new line are platform specific; the default automatically provides the correct characters for the environment (an illustration of these defaults follows this list).
  - Bucket Name: Google Cloud Storage bucket that data is uploaded to.
  - Folder Path: (Optional) Path to the Google Cloud Storage folder. Defaults to the root of the specified bucket.
  - Allow Nulls: When selected, null values are valid. Default: Null values are allowed.
  - Maximum Rows/Objects: (Optional) Maximum number of objects in a file. Default: 100,000 rows.
  - Maximum File Size (KB): (Optional) Maximum file size in kilobytes. Default: 10,000 KB.
  - Maximum Update Time (sec): (Optional) Maximum time to update the database, in seconds, before the writer times out. Default: 86,400 seconds (1 day).
  - BigQuery Dataset: (Optional) Top-level container that organizes and controls access to tables and views. The dataset is contained within a project and must be created before data can be streamed into BigQuery.
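As an illustration of the default delimiters described above, the snippet below joins a header row and two data rows with a tab field delimiter and the platform's newline. It is only a sketch of the general convention; the column names and values are made up and this is not output produced by the target.

```python
import os

# Default field delimiter is a tab; the default row delimiter is the
# platform's newline ("\r\n" on Windows, "\n" on Linux and macOS).
field_delimiter = "\t"
row_delimiter = os.linesep

header = ["id", "event", "value"]
rows = [["1", "login", "ok"], ["2", "upload", "ok"]]

lines = [field_delimiter.join(header)] + [field_delimiter.join(r) for r in rows]
print(repr(row_delimiter.join(lines)))
```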
- Click Verify GCP Storage Writer.
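Verify GCP Storage Writer runs the check from within the product. If you also want to confirm from your own environment that the key file can write to the configured bucket and folder path, the following is a minimal sketch with the Python google-cloud-storage client; the key file, bucket name, and folder path are placeholders that should match the values entered above.

```python
from google.cloud import storage

# Placeholders: match these to the values entered on the Targets tab.
KEY_FILE = "gcp-service-account.json"
BUCKET_NAME = "my-target-bucket"
FOLDER_PATH = "exports"  # leave empty to use the bucket root

client = storage.Client.from_service_account_json(KEY_FILE)
bucket = client.bucket(BUCKET_NAME)

# Write, read back, and delete a small test object.
object_name = f"{FOLDER_PATH}/write-check.txt" if FOLDER_PATH else "write-check.txt"
blob = bucket.blob(object_name)
blob.upload_from_string("write check")  # storage.objects.create
print(blob.download_as_text())          # storage.objects.get
blob.delete()                           # storage.objects.delete
```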
- Click Save Changes.
- Give users access to the Google Cloud Storage target. For more information, see Grant access to targets.