# BigQuery

**URL:** https://heroiclabs.com/docs/satori/concepts/performance-monitoring/export-to-data-lakes/bigquery/
**Summary:** Configure and manage GCP BigQuery connection to Satori.
**Keywords:** bigquery, satori
**Categories:** satori, data-lakes, settings

---


# Connect to BigQuery

This page describes how to connect to BigQuery from Satori, enabling you to send data to your BigQuery project for analysis.

## Prerequisites

Before you can connect to BigQuery, you must have the following:

* [GCP project](https://cloud.google.com/resource-manager/docs/creating-managing-projects) with BigQuery enabled
* [Service account](#create-a-service-account) with the appropriate permissions

### Create a service account

To create a service account for Satori, navigate to the [Google Cloud Console](https://console.cloud.google.com/) of your project and follow these steps:

1. From the left-hand navigation menu, go to **IAM & Admin** > [**Service Accounts**](https://console.cloud.google.com/iam-admin/serviceaccounts):

{{< screenshot src="images/pages/satori/concepts/monitoring/monitoring_datalake-service-accounts.png" alt="Service Accounts" >}}

2. Click **+ Create Service Account** and, on the page displayed, enter the details for your new service account:

{{< screenshot src="images/pages/satori/concepts/monitoring/monitoring_datalake-create-service-account.png" alt="Create Service Account" >}}

3. Click **Create and Continue** and then select the **BigQuery Data Editor** role for your service account:

{{< screenshot src="images/pages/satori/concepts/monitoring/monitoring_datalake-bigquery-role.png" alt="BigQuery Data Editor Role" >}}

4. Click **Continue** to confirm the role and then click **Done** to complete the process. The service account is now created.

5. On the **Service Accounts** page, click the **Actions** button for your new service account and select **Manage Keys**.

6. On the **Keys** page, click **Add Key** and select **Create new key**.

{{< screenshot src="images/pages/satori/concepts/monitoring/monitoring_datalake-create-new-key.png" alt="Create New Key" >}}

7. Select **JSON** as the key type and click **Create**. A JSON file containing the service account credentials is downloaded to your machine.
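Before uploading the key to Satori, you can sanity-check the downloaded file. The following is a minimal sketch, assuming the file is a standard GCP service account key; the field names checked below are part of Google's key format, and `validate_key_file` is an illustrative helper, not part of Satori:

```python
import json

# Fields present in every GCP service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key_file(path: str) -> str:
    """Return the project ID if the file looks like a valid service
    account key; raise ValueError otherwise."""
    with open(path) as f:
        key = json.load(f)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']!r}")
    return key["project_id"]
```

The `project_id` in the key file is the project the service account belongs to, which is typically also the value you enter as **GCP Project ID** when configuring Satori below.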

## Configure BigQuery in Satori

From the Satori dashboard, navigate to **Settings** > **Data Lakes** > **BigQuery**:

{{< screenshot src="images/pages/satori/concepts/monitoring/monitoring_datalake_bigquery_integration.png" alt="BigQuery tab" >}}

1. Enter the corresponding details:
    * **GCP Project ID**: The ID of your GCP project.
    * **BigQuery Dataset ID**: The ID of the BigQuery dataset to which you want to send data. If the dataset doesn't exist, it will be created.
    * **Events Table Name**: The name of the table to which you want to send events. If the table doesn't exist, it will be created.
    * **GCP Service Account Credentials**: The JSON key file containing the service account credentials, downloaded above.
    * **Schema Version**: The version of the table schema. To migrate between versions, reconfigure the integration with the same table name.

2. Click **Save** to save the configuration.
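The dataset and table IDs you enter must follow BigQuery's naming rules. A quick local check can catch typos before saving; this is a sketch, and `is_safe_bigquery_id` is an illustrative helper, not part of Satori:

```python
import re

# BigQuery dataset IDs may contain letters, numbers, and underscores,
# up to 1024 characters. (Table names permit a wider character set,
# but this conservative pattern is valid for both.)
_ID_RE = re.compile(r"[A-Za-z0-9_]{1,1024}")

def is_safe_bigquery_id(identifier: str) -> bool:
    """Return True if the identifier is safe to use as a BigQuery
    dataset ID or table name."""
    return bool(_ID_RE.fullmatch(identifier))
```

For example, `satori_events` passes the check, while `satori-events` does not, because hyphens are not allowed in dataset IDs.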

The BigQuery connection is now configured, and Satori tests it by inserting a dummy event into the specified table. If the test succeeds, the connection status is displayed as **Enabled**.
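If you want to reproduce a similar connectivity check outside Satori, you can insert a test row yourself with the `google-cloud-bigquery` client library. The sketch below takes the client as a parameter so it depends only on the `project` attribute and `insert_rows_json` method that `google.cloud.bigquery.Client` exposes; the row payload is purely illustrative and is not Satori's actual event schema:

```python
def insert_test_row(client, dataset_id: str, table_name: str) -> list:
    """Insert one dummy row via the BigQuery streaming insert API.

    `client` is expected to expose `project` and `insert_rows_json`,
    as google.cloud.bigquery.Client does. Returns the list of per-row
    errors; an empty list means the insert succeeded.
    """
    table_ref = f"{client.project}.{dataset_id}.{table_name}"
    row = {"name": "satori_connection_test"}  # illustrative payload only
    return client.insert_rows_json(table_ref, [row])
```

With the real library this might be driven as `client = bigquery.Client.from_service_account_json("key.json")` followed by `insert_test_row(client, "my_dataset", "events")`; note the table must already exist with a schema that accepts the row.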
