Supply Chain & Retail Solutions user guide
Connecting a Snowflake data warehouse
Connecting your Snowflake data warehouse to Peak enables Peak to read data from and write data to your existing analytics infrastructure. Peak supports both basic authentication and OAuth for secure connections.
Prerequisites
Before connecting your Snowflake data warehouse, ensure you have the following:
- Access to Peak with permissions to configure Data Bridge.
- A data lake already configured in Peak, in the same region as your Snowflake account.
- A Snowflake account with a user, role, database, and schema prepared for Peak access. See Preparing your Snowflake account.
- Storage integration configured between your Snowflake account and the S3 data lake. See Setting up storage integration.
Preparing your Snowflake account
Before connecting to Peak, your Snowflake account must be configured with the correct user, role, schemas, and storage integration. If these have already been set up, skip to Steps.
Creating a user, role, and schema
Peak requires a dedicated user, role, and at least one schema with write access (the default schema) in your Snowflake account. You can optionally add read-only schemas for Peak to read input data from.
To create these resources, run the setup script provided by the Peak team in your Snowflake account using the ACCOUNTADMIN role. The script creates:
- A warehouse and resource monitor
- A database (or uses an existing one)
- A role with appropriate permissions
- A user with the role and warehouse assigned
- A default schema with write access for Peak
- An optional read-only schema
Save the generated username and password in a secure location for later use during the Data Bridge configuration.
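The Peak-provided script contains the exact commands for your account, but its effect can be sketched in Snowflake SQL. All object names below (PEAK_WH, PEAK_DB, PEAK_ROLE, and so on) are illustrative placeholders, not the names the script actually uses:

```sql
-- Run as ACCOUNTADMIN. All names are illustrative placeholders;
-- use the script supplied by the Peak team for the real setup.
USE ROLE ACCOUNTADMIN;

-- Dedicated warehouse, with a resource monitor to cap credit usage
CREATE RESOURCE MONITOR PEAK_MONITOR WITH CREDIT_QUOTA = 100;
CREATE WAREHOUSE PEAK_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE
  RESOURCE_MONITOR = PEAK_MONITOR;

-- Database (or reuse an existing one), role, and user for Peak
CREATE DATABASE IF NOT EXISTS PEAK_DB;
CREATE ROLE PEAK_ROLE;
CREATE USER PEAK_USER
  PASSWORD = '<generated-password>'
  DEFAULT_ROLE = PEAK_ROLE
  DEFAULT_WAREHOUSE = PEAK_WH;
GRANT ROLE PEAK_ROLE TO USER PEAK_USER;
GRANT USAGE ON WAREHOUSE PEAK_WH TO ROLE PEAK_ROLE;
GRANT USAGE ON DATABASE PEAK_DB TO ROLE PEAK_ROLE;

-- Default schema: Peak needs write access here
CREATE SCHEMA PEAK_DB.PEAK_OUTPUT;
GRANT ALL ON SCHEMA PEAK_DB.PEAK_OUTPUT TO ROLE PEAK_ROLE;

-- Optional read-only schema for input data
GRANT USAGE ON SCHEMA PEAK_DB.INPUT_DATA TO ROLE PEAK_ROLE;
GRANT SELECT ON ALL TABLES IN SCHEMA PEAK_DB.INPUT_DATA TO ROLE PEAK_ROLE;
```

The resource monitor and auto-suspend settings shown here are examples of how the script keeps Peak's compute spend bounded; the actual quotas are set by the Peak team.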
Setting up storage integration
Peak requires a storage integration between your Snowflake account and the S3 data lake to load and unload data efficiently. Once set up, the same integration can be reused across multiple stages.
To set up storage integration, you need the following details from your configured data lake (available in Manage > Data Bridge > Data Lake):
- S3 bucket name — the name of the data lake bucket
- Root path — the root path of the S3 bucket
- AWS IAM role ARN — in the format `arn:aws:iam::794236216820:role/prod-<tenant_name>-System`
Run the storage integration script provided by the Peak team in your Snowflake account using the ACCOUNTADMIN role.
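The Peak-provided script performs the integration setup, but at its core is a `CREATE STORAGE INTEGRATION` statement. The sketch below is illustrative: the integration and role names are placeholders, and the bucket name, root path, and role ARN should be the values from Manage > Data Bridge > Data Lake:

```sql
-- Run as ACCOUNTADMIN. Names are illustrative; substitute the bucket,
-- root path, and IAM role ARN shown in Manage > Data Bridge > Data Lake.
USE ROLE ACCOUNTADMIN;

CREATE STORAGE INTEGRATION PEAK_S3_INTEGRATION
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::794236216820:role/prod-<tenant_name>-System'
  STORAGE_ALLOWED_LOCATIONS = ('s3://<bucket_name>/<root_path>/');

-- Allow the Peak role (placeholder name) to use the integration
GRANT USAGE ON INTEGRATION PEAK_S3_INTEGRATION TO ROLE PEAK_ROLE;

-- Verify the integration and inspect its properties
DESC STORAGE INTEGRATION PEAK_S3_INTEGRATION;
```

The integration name you create here is the value to enter in the Storage integration field during the Data Bridge configuration.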
Steps
1. In Peak, open Manage and select Data Bridge.
2. Select Add data warehouse.
3. Enter a unique name for the connection and select Snowflake.
   - The name must be unique to your Peak organization.
   - Use only alphanumeric characters and underscores. No spaces or other special characters.
   - Minimum 3 characters, maximum 40 characters.
   - Must start and end with an alphanumeric character.
   - The name cannot be changed after the connection has been set up.
4. Select Next.
5. Select the cloud platform where your Snowflake account is hosted (AWS, Azure, or GCP).
6. Select the region where your Snowflake account is located. Ensure the region complies with your data localization requirements.
7. Select the authentication type:
   - Basic — authenticates using account ID, username, and password.
   - OAuth — authenticates using a Snowflake security integration. Follow the steps in Setting up OAuth before continuing.
8. Enter the required credentials for the selected authentication type. See Credential fields.
9. Select Test to validate the connection.
10. In the Data lake step, select the data lake connection to link to this data warehouse.
11. Review the configuration and select Finish.
Setting up OAuth
If you selected OAuth in step 7, complete the following before entering database credentials:
1. In Peak, copy the SQL command displayed under Create Security Integration.
2. In your Snowflake account, sign in using the ACCOUNTADMIN role and run the command to create the security integration.
3. Copy and run the SQL command under Get Client Details to retrieve the Client ID, Authorization URL, and Token URL.
4. Copy and run the SQL command under Get Client Secret to retrieve the Client Secret.
5. Save all retrieved values — you will enter them in the database credentials step.
6. Select the confirmation checkbox in Peak to confirm the security integration has been set up.
When you authorize the connection, Peak redirects you to the Snowflake login page. Sign in using the credentials of a Snowflake user who has access to the role specified in the Default role field.
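Peak displays the exact SQL for your tenant, but the commands follow the standard Snowflake custom OAuth pattern. The sketch below is illustrative only: the integration name and redirect URI are placeholders, and you should run the commands Peak shows rather than these:

```sql
-- Run as ACCOUNTADMIN. Name and redirect URI are placeholders; use the
-- exact command Peak displays under "Create Security Integration".
USE ROLE ACCOUNTADMIN;

CREATE SECURITY INTEGRATION PEAK_OAUTH
  TYPE = OAUTH
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = '<redirect-uri-shown-in-peak>'
  ENABLED = TRUE
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE;

-- "Get Client Details": the output properties include OAUTH_CLIENT_ID,
-- OAUTH_AUTHORIZATION_ENDPOINT, and OAUTH_TOKEN_ENDPOINT
DESC SECURITY INTEGRATION PEAK_OAUTH;

-- "Get Client Secret": returns the client ID and secrets as JSON
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('PEAK_OAUTH');
```

The four values retrieved here (Client ID, Client Secret, Authorization URL, Token URL) map directly to the OAuth fields in the Credential fields table below.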
Credential fields
Basic authentication
| Field | Description |
|---|---|
| Account ID | Your Snowflake account ID. |
| Password | The password for the Snowflake user. |
| Default user | The Snowflake username Peak will use to access the database. |
| Default role | The role assigned to the user for Peak access. |
| Database | The name of the Snowflake database. |
| Schema | The schema Peak will use for reading and writing data. |
| Warehouse | The Snowflake warehouse used for computations. |
| Storage integration | The name of the storage integration configured for the data lake. |
OAuth authentication
| Field | Description |
|---|---|
| Account ID | Your Snowflake account ID. |
| Client ID | Generated during security integration setup. |
| Client secret | Generated during security integration setup. |
| Authorization URL | Used to obtain the authorization code. |
| Token URL | Used to generate the access token. |
| Default role | The role assigned to the user for Peak access. |
| Database | The name of the Snowflake database. |
| Schema | The schema Peak will use for reading and writing data. |
| Warehouse | The Snowflake warehouse used for computations. |
| Storage integration | The name of the storage integration configured for the data lake. |
Result
The Snowflake data warehouse connection appears as Active in the Data Bridge list.