Google BigQuery
As a best practice, set up a dedicated Google Cloud Platform (GCP) Service Account user and assign it a specific role before configuring Connecty. This approach isolates your integration credentials and simplifies permission management, ensuring a seamless no-code connection.
Prerequisites
GCP Project Identifier of the project where your BigQuery is set up.
Dedicated Service Account (SA) user for Connecty.
Appropriate roles and permissions for the dedicated SA user.
The SA user access key.
Multi-region data location of your BigQuery dataset.
Project ID
Your Project ID uniquely identifies your BigQuery project. To get the Project ID from the console (a gcloud alternative is shown after the examples below):
Go to the Google Cloud Console.
In the top-left project selector dropdown, locate your project (the one under which your BigQuery instance is set up).
Copy the ID.
Examples:
✅ my-project-connecty - correct and expected identifier.
❌ 950600139040 - incorrect; this is usually a project number or an organisation ID.
Permissions
To enable Connecty AI to access and query data within your BigQuery service, the designated GCP role must contain the following GCP permissions:
Metadata Access
bigquery.datasets.get
bigquery.datasets.getIamPolicy
bigquery.models.list
bigquery.models.getMetadata
bigquery.routines.list
bigquery.routines.get
READ permissions
bigquery.tables.list
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.getIamPolicy
Query (job) history access
bigquery.jobs.list
bigquery.jobs.listAll
bigquery.jobs.get
bigquery.jobs.create
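For illustration, the job-history permissions listed above are what allow inspecting past query jobs, for example via BigQuery's INFORMATION_SCHEMA views. A minimal sketch using the bq CLI (the US multi-region qualifier and the PROJECT_ID variable are assumptions; adjust them to your setup):
# Illustrative only: list recent query jobs across the whole project
# (needs bigquery.jobs.listAll; the region qualifier assumes the US multi-region)
bq --project_id="$PROJECT_ID" query --use_legacy_sql=false \
  'SELECT job_id, user_email, total_bytes_processed
   FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
   WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
   LIMIT 10'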
While a combination of GCP built-in roles (BigQuery Data Viewer + BigQuery Job User + BigQuery Resource Viewer) inherently satisfies these requirements, it is best practice to create a custom role that includes only the minimum necessary privileges outlined above.
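If you nevertheless prefer the built-in roles, the following sketch grants all three to the service account (it assumes PROJECT_ID and SA_EMAIL are set as in the script below):
# Alternative to a custom role: bind the three built-in roles to the SA
for ROLE in roles/bigquery.dataViewer roles/bigquery.jobUser roles/bigquery.resourceViewer; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="$ROLE"
done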
Custom SA user and Role
The following script is an example of how to create a new dedicated Service Account user in GCP for Connecty AI and grant it a new custom role with all required permissions. The script assumes that gcloud is installed locally; if you don't have it, please follow the installation instructions.
⚠️ Please remember to replace PROJECT_ID and, if needed, ROLE_ID according to your GCP setup.
#!/usr/bin/env bash
set -euo pipefail
trap 'echo "❌ Error on line $LINENO" >&2' ERR
PROJECT_ID="<YOUR_PROJECT_ID>"
SA_ID="connecty-ai-sa-user"
SA_EMAIL="${SA_ID}@${PROJECT_ID}.iam.gserviceaccount.com"
SA_DISPLAY="Connecty AI Service Account User"
ROLE_ID="bigqueryConnectyAIRole" # you can change the new role ID
ROLE_NAME="projects/${PROJECT_ID}/roles/${ROLE_ID}"
ROLE_TITLE="BigQuery Connecty AI Role" # you can change the new role title
ROLE_DESC="BigQuery READ data access + READ metadata + READ query history"
PERMS=(
# Dataset-level enumeration & metadata
bigquery.datasets.get
bigquery.datasets.getIamPolicy
# Table-level enumeration & metadata
bigquery.tables.list
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.getIamPolicy
# INFORMATION_SCHEMA-style metadata for models & routines
bigquery.models.list
bigquery.models.getMetadata
bigquery.routines.list
bigquery.routines.get
# Query history (all users’ jobs)
bigquery.jobs.list
bigquery.jobs.listAll
bigquery.jobs.get
bigquery.jobs.create
)
PERMS_CSV=$(IFS=,; echo "${PERMS[*]}")
# 1. Create SA if it doesn't exist
if gcloud iam service-accounts list \
--project="$PROJECT_ID" \
--filter="email:${SA_EMAIL}" \
--format="value(email)" | grep -q .; then
echo "ℹ️ Service account already exists: ${SA_EMAIL}"
else
echo "➡️ Creating service account ${SA_EMAIL}…"
gcloud iam service-accounts create "$SA_ID" \
--project="$PROJECT_ID" \
--display-name="$SA_DISPLAY"
fi
# 2. Create custom role if it doesn't exist
if gcloud iam roles describe "${ROLE_ID}" \
--project="${PROJECT_ID}" \
--quiet &>/dev/null; then
echo "ℹ️ Role already exists, updating permissions…"
gcloud iam roles update "${ROLE_ID}" \
--project="${PROJECT_ID}" \
--permissions="${PERMS_CSV}" \
--stage="GA"
else
echo "➡️ Creating custom role ${ROLE_ID}…"
gcloud iam roles create "$ROLE_ID" \
--project="$PROJECT_ID" \
--title="$ROLE_TITLE" \
--description="$ROLE_DESC" \
--permissions="$PERMS_CSV" \
--stage="GA"
fi
# 3. Bind role
echo "➡️ Granting ${ROLE_NAME} to ${SA_EMAIL}…"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
--member="serviceAccount:${SA_EMAIL}" \
--role="${ROLE_NAME}"
echo "✅ Done."
SA user access key
To generate an access key for the Service Account user, you can use either the GCP Console or the gcloud CLI.
GCP Console
Go to the IAM service in the Google Cloud Console.
Go to Service Accounts and find the appropriate SA user.
Click on that user and open the Keys tab.
Click Add Key and then Create new key. Select JSON format for the key.
Download the generated access key file.
Command line interface
If you have gcloud installed locally and authenticated for your project in GCP, you can generate a new access key file for the SA user using the following command:
PROJECT_ID="<YOUR_PROJECT_ID>"
SA_ID="connecty-ai-sa-user"
SA_EMAIL="${SA_ID}@${PROJECT_ID}.iam.gserviceaccount.com"
gcloud iam service-accounts keys create ~/key_for_connectyai.json \
--iam-account="$SA_EMAIL" \
--project="$PROJECT_ID"
⚠️ If you've changed the SA user ID or email, please update the values in the script above. The values in the script correspond to the setup described in Custom SA user and Role.
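Optionally, you can sanity-check the downloaded key by authenticating with it and running a trivial query; a sketch that reuses the key path and PROJECT_ID from the command above:
# Authenticate as the SA using the downloaded key
# (note: this switches your active gcloud account to the SA)
gcloud auth activate-service-account --key-file="$HOME/key_for_connectyai.json"
# Run a trivial query to confirm the SA can create query jobs
bq --project_id="$PROJECT_ID" query --use_legacy_sql=false 'SELECT 1 AS ok'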
Multi-region data location
To fully set up the Connecty integration with BigQuery, we need to know in which BigQuery multi-region your datasets are stored. It can be either US or EU. The easiest way to find out which multi-region your dataset uses is to click on any object in that dataset and open the Details tab, as in the following screenshot example:
