
Supply Chain & Retail Solutions user guide

Last updated Apr 16, 2026

Connecting a Snowflake data warehouse

Connecting your Snowflake data warehouse to Peak lets Peak read data from, and write data to, your existing analytics infrastructure. Peak supports both basic authentication and OAuth for secure connections.

Prerequisites

Before connecting your Snowflake data warehouse, ensure you have the following:

  • Access to Peak with permissions to configure Data Bridge.
  • A data lake already configured in Peak, in the same region as your Snowflake account.
  • A Snowflake account with a user, role, database, and schema prepared for Peak access. See Preparing your Snowflake account.
  • Storage integration configured between your Snowflake account and the S3 data lake. See Setting up storage integration.

Preparing your Snowflake account

Before connecting to Peak, your Snowflake account must be configured with the correct user, role, schemas, and storage integration. If these have already been set up, skip to Steps.

Creating a user, role, and schema

Peak requires a dedicated user, role, and at least one schema with write access (the default schema) in your Snowflake account. You can optionally add read-only schemas for Peak to read input data from.

To create these resources, run the setup script provided by the Peak team in your Snowflake account using the accountadmin role. The script creates:

  • A warehouse and resource monitor
  • A database (or uses an existing one)
  • A role with appropriate permissions
  • A user with the role and warehouse assigned
  • A default schema with write access for Peak
  • An optional read-only schema

Save the generated username and password in a secure location for later use during the Data Bridge configuration.
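The setup script supplied by the Peak team is the authoritative version, but a minimal sketch of the resources it creates might look like the following (all object names, the credit quota, and the warehouse size are illustrative placeholders):

```sql
-- Run as ACCOUNTADMIN. All names below are hypothetical placeholders;
-- use the script provided by the Peak team for the authoritative version.
USE ROLE ACCOUNTADMIN;

-- Warehouse and resource monitor
CREATE RESOURCE MONITOR peak_monitor WITH CREDIT_QUOTA = 100;
CREATE WAREHOUSE IF NOT EXISTS peak_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  RESOURCE_MONITOR = 'peak_monitor';

-- Database (or reuse an existing one) and role
CREATE DATABASE IF NOT EXISTS peak_db;
CREATE ROLE IF NOT EXISTS peak_role;
GRANT USAGE ON DATABASE peak_db TO ROLE peak_role;
GRANT USAGE ON WAREHOUSE peak_wh TO ROLE peak_role;

-- Default schema with write access for Peak
CREATE SCHEMA IF NOT EXISTS peak_db.peak_schema;
GRANT ALL ON SCHEMA peak_db.peak_schema TO ROLE peak_role;

-- Optional read-only schema for input data
GRANT USAGE ON SCHEMA peak_db.input_schema TO ROLE peak_role;
GRANT SELECT ON ALL TABLES IN SCHEMA peak_db.input_schema TO ROLE peak_role;

-- Dedicated user with the role and warehouse assigned
CREATE USER IF NOT EXISTS peak_user
  PASSWORD = '<generated_password>'
  DEFAULT_ROLE = peak_role
  DEFAULT_WAREHOUSE = peak_wh;
GRANT ROLE peak_role TO USER peak_user;
```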

Setting up storage integration

Peak requires a storage integration between your Snowflake account and the S3 data lake to load and unload data efficiently. Once set up, the same integration can be reused across multiple stages.

To set up storage integration, you need the following details from your configured data lake (available in Manage > Data Bridge > Data Lake):

  • S3 bucket name — the name of the data lake bucket
  • Root path — the root path of the S3 bucket
  • AWS IAM role ARN — in the format arn:aws:iam::794236216820:role/prod-<tenant_name>-System

Run the storage integration script provided by the Peak team in your Snowflake account using the accountadmin role.
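The script centers on a Snowflake `CREATE STORAGE INTEGRATION` statement. A sketch of what it typically contains, built from the data lake details listed above (the integration name and role grant are hypothetical placeholders):

```sql
-- Run as ACCOUNTADMIN. Names are placeholders; use the script provided
-- by the Peak team for the authoritative version.
USE ROLE ACCOUNTADMIN;

CREATE STORAGE INTEGRATION peak_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::794236216820:role/prod-<tenant_name>-System'
  STORAGE_ALLOWED_LOCATIONS = ('s3://<bucket_name>/<root_path>/');

-- Allow Peak's role to use the integration when creating stages
GRANT USAGE ON INTEGRATION peak_s3_integration TO ROLE peak_role;

-- Inspect the integration's properties if needed
DESC STORAGE INTEGRATION peak_s3_integration;
```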

Steps

  1. In Peak, open Manage and select Data Bridge.
  2. Select Add data warehouse.
  3. Enter a unique name for the connection and select Snowflake.
    • The name must be unique to your Peak organization.
    • Use only alphanumeric characters and underscores. No spaces or other special characters.
    • Minimum 3 characters, maximum 40 characters.
    • Must start and end with an alphanumeric character.
    • The name cannot be changed after the connection has been set up.
  4. Select Next.
  5. Select the cloud platform where your Snowflake account is hosted (AWS, Azure, or GCP).
  6. Select the region where your Snowflake account is located. Ensure the region complies with your data localization requirements.
  7. Select the authentication type:
    • Basic — authenticates using account ID, username, and password.
    • OAuth — authenticates using a Snowflake security integration. Follow the OAuth setup steps below before continuing.
  8. Enter the required credentials for the selected authentication type. See Credential fields.
  9. Select Test to validate the connection.
  10. In the Data lake step, select the data lake connection to link to this data warehouse.
  11. Review the configuration and select Finish.

Setting up OAuth

If you selected OAuth in step 7, complete the following before entering database credentials:

  1. In Peak, copy the SQL command displayed under Create Security Integration.
  2. In your Snowflake account, sign in using the ACCOUNTADMIN role and run the command to create the security integration.
  3. Copy and run the SQL command under Get Client Details to retrieve the Client ID, Authorization URL, and Token URL.
  4. Copy and run the SQL command under Get Client Secret to retrieve the Client Secret.
  5. Save all retrieved values — you will enter them in the database credentials step.
  6. Select the confirmation checkbox in Peak to confirm the security integration has been set up.

To authorize the connection, Peak redirects you to the Snowflake login page. Sign in using the credentials of a Snowflake user who has access to the role specified in the Default role field.
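The commands Peak displays follow Snowflake's standard custom OAuth pattern; a hedged sketch of what they typically contain (the integration name and redirect URI below are placeholders — use the exact commands shown in Peak):

```sql
-- Run as ACCOUNTADMIN. The exact command appears in Peak under
-- "Create Security Integration"; names and URIs here are placeholders.
USE ROLE ACCOUNTADMIN;

CREATE SECURITY INTEGRATION peak_oauth
  TYPE = OAUTH
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = '<redirect_uri_shown_in_peak>'
  ENABLED = TRUE;

-- "Get Client Details": the authorization and token endpoints appear in
-- the DESC output (OAUTH_AUTHORIZATION_ENDPOINT, OAUTH_TOKEN_ENDPOINT)
DESC SECURITY INTEGRATION peak_oauth;

-- "Get Client Secret": returns the client ID and client secret(s)
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('PEAK_OAUTH');
```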

Credential fields

Basic authentication

  • Account ID: Your Snowflake account ID.
  • Password: The password for the Snowflake user.
  • Default user: The Snowflake username Peak will use to access the database.
  • Default role: The role assigned to the user for Peak access.
  • Database: The name of the Snowflake database.
  • Schema: The schema Peak will use for reading and writing data.
  • Warehouse: The Snowflake warehouse used for computations.
  • Storage integration: The name of the storage integration configured for the data lake.

OAuth authentication

  • Account ID: Your Snowflake account ID.
  • Client ID: Generated during security integration setup.
  • Client secret: Generated during security integration setup.
  • Authorization URL: Used to obtain the authorization code.
  • Token URL: Used to generate the access token.
  • Default role: The role assigned to the user for Peak access.
  • Database: The name of the Snowflake database.
  • Schema: The schema Peak will use for reading and writing data.
  • Warehouse: The Snowflake warehouse used for computations.
  • Storage integration: The name of the storage integration configured for the data lake.

Result

The Snowflake data warehouse connection appears as Active in the Data Bridge list.
