Google BigQuery

As a best practice, set up a dedicated Google Cloud Platform (GCP) Service Account user and assign it specific roles before configuring Connecty. This approach isolates your integration credentials and simplifies permission management, ensuring a seamless no-code connection.

Prerequisites

  • GCP Project ID of the project where your BigQuery instance is set up.

  • Dedicated Service Account (SA) user for Connecty.

  • Appropriate roles and permissions for the dedicated SA user.

  • The SA user access key.

  • Multi-region data location of your BigQuery dataset.

Project ID

Your Project ID uniquely identifies your BigQuery project. To get the Project ID:

  • Go to Google Cloud Console.

  • In the top-left Project selector dropdown, locate your project (the one under which your BigQuery instance is set up).

  • Copy the ID.

Examples:

  • ✅ my-project-connecty - correct and expected identifier.

  • ❌ 950600139040 - incorrect. That is usually a project number or an organisation ID, not a Project ID.
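
If you have the gcloud CLI installed and authenticated, you can also look up the Project ID from the command line. A minimal sketch:

# List the projects you can access; the projectId column is the value
# Connecty expects (not the numeric project number)
gcloud projects list --format="table(projectId, name)"

# Or print the currently configured default project
gcloud config get-value project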

Permissions

To enable Connecty AI to access and query data within your BigQuery service, the designated GCP role must contain the following GCP permissions:

  1. Metadata Access

    1. bigquery.datasets.get

    2. bigquery.datasets.getIamPolicy

    3. bigquery.models.list

    4. bigquery.models.getMetadata

    5. bigquery.routines.list

    6. bigquery.routines.get

  2. READ permissions

    1. bigquery.tables.list

    2. bigquery.tables.get

    3. bigquery.tables.getData

    4. bigquery.tables.getIamPolicy

  3. Query (job) history access

    1. bigquery.jobs.list

    2. bigquery.jobs.listAll

    3. bigquery.jobs.get

    4. bigquery.jobs.create

While a combination of GCP built-in roles (BigQuery Data Viewer + BigQuery Job User + BigQuery Resource Viewer) inherently satisfies these requirements, it is best practice to create a custom role that includes only the minimum necessary privileges outlined above.
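
If you do opt for the built-in roles instead, you can inspect which permissions each of them bundles before granting anything. A quick check with gcloud (assuming it is installed and authenticated) might look like this:

# Print the permissions included in each built-in role, so you can
# confirm they cover the list above
for role in roles/bigquery.dataViewer roles/bigquery.jobUser roles/bigquery.resourceViewer; do
  echo "== ${role} =="
  gcloud iam roles describe "${role}" --format="value(includedPermissions)"
done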

Custom SA user and Role

The following script is an example of how one can create a new dedicated Service Account user in GCP for Connecty AI and grant it a new custom role with all required permissions. The script assumes that gcloud is installed locally; in case you don't have it, please follow the gcloud installation instructions.

⚠️ Please remember to replace PROJECT_ID and, potentially, ROLE_ID according to your GCP setup.

#!/usr/bin/env bash
set -euo pipefail
trap 'echo "❌ Error on line $LINENO" >&2' ERR

PROJECT_ID="<YOUR_PROJECT_ID>"
SA_ID="connecty-ai-sa-user"
SA_EMAIL="${SA_ID}@${PROJECT_ID}.iam.gserviceaccount.com"
SA_DISPLAY="Connecty AI Service Account User"
ROLE_ID="bigqueryConnectyAIRole"                     # you can change the new role ID
ROLE_NAME="projects/${PROJECT_ID}/roles/${ROLE_ID}"
ROLE_TITLE="BigQuery Connecty AI Role"               # you can change the new role title
ROLE_DESC="BigQuery READ data access + READ metadata + READ query history"
PERMS=(
  # Dataset-level enumeration & metadata
  bigquery.datasets.get
  bigquery.datasets.getIamPolicy

  # Table-level enumeration & metadata
  bigquery.tables.list
  bigquery.tables.get
  bigquery.tables.getData
  bigquery.tables.getIamPolicy

  # INFORMATION_SCHEMA-style metadata for models & routines
  bigquery.models.list
  bigquery.models.getMetadata
  bigquery.routines.list
  bigquery.routines.get

  # Query history (all users’ jobs)
  bigquery.jobs.list
  bigquery.jobs.listAll
  bigquery.jobs.get
  bigquery.jobs.create
)
PERMS_CSV=$(IFS=,; echo "${PERMS[*]}")

# 1. Create SA if it doesn't exist
if gcloud iam service-accounts list \
     --project="$PROJECT_ID" \
     --filter="email:${SA_EMAIL}" \
     --format="value(email)" | grep -q .; then
  echo "ℹ️  Service account already exists: ${SA_EMAIL}"
else
  echo "➡️  Creating service account ${SA_EMAIL}…"
  gcloud iam service-accounts create "$SA_ID" \
    --project="$PROJECT_ID" \
    --display-name="$SA_DISPLAY"
fi

# 2. Create custom role if it doesn't exist
if gcloud iam roles describe "${ROLE_ID}" \
     --project="${PROJECT_ID}" \
     --quiet &>/dev/null; then

  echo "ℹ️  Role already exists, updating permissions…"
  gcloud iam roles update "${ROLE_ID}" \
    --project="${PROJECT_ID}" \
    --permissions="${PERMS_CSV}" \
    --stage="GA"

else
  echo "➡️  Creating custom role ${ROLE_ID}…"
  gcloud iam roles create "$ROLE_ID" \
    --project="$PROJECT_ID" \
    --title="$ROLE_TITLE" \
    --description="$ROLE_DESC" \
    --permissions="$PERMS_CSV" \
    --stage="GA"
fi

# 3. Bind role
echo "➡️  Granting ${ROLE_NAME} to ${SA_EMAIL}…"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="${ROLE_NAME}"

echo "✅ Done."

SA user access key

To generate an access key for the Service Account user, you can use either the GCP Console or the gcloud CLI.

GCP Console

  1. Go to the IAM service in Google Cloud Console, open Service Accounts, and find the appropriate SA user.

  2. Click on that user and open the Keys tab.

  3. Click Add Key, then Create new key, and select the JSON format for the key.

  4. Download the generated access key file.

Command line interface

If you have gcloud installed locally and authenticated for your project in GCP, you can generate a new access key file for the SA user with the following command:

PROJECT_ID="<YOUR_PROJECT_ID>"
SA_ID="connecty-ai-sa-user"
SA_EMAIL="${SA_ID}@${PROJECT_ID}.iam.gserviceaccount.com"

gcloud iam service-accounts keys create ~/key_for_connectyai.json \
  --iam-account="$SA_EMAIL" \
  --project="$PROJECT_ID"

Multi-region data location

To fully set up the Connecty integration with BigQuery, we need to know the multi-region in which your BigQuery datasets are stored. It can be either US or EU. The easiest way to find out which multi-region your dataset uses is to click on any object in that dataset and open its Details tab, as in the following screenshot example:

BigQuery UI, object -> Details tab.
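
If you prefer the command line, the dataset location is also visible via the bq CLI. A minimal sketch, where <YOUR_PROJECT_ID> and <YOUR_DATASET> are placeholders for your own setup:

# Print dataset metadata; the "location" field shows the multi-region (e.g. US or EU)
bq show --format=prettyjson "<YOUR_PROJECT_ID>:<YOUR_DATASET>" | grep '"location"'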