@@ -17,7 +17,6 @@ In this guide we will walk through how to use Snowflake Data Change Management features
### Prerequisites
- Familiarity with Snowflake and GitHub
- A Snowflake user with [Key Pair authentication](https://docs.snowflake.com/en/user-guide/key-pair-auth) and the SYSADMIN role.
- An XSMALL Snowflake warehouse.
- Set up Snowflake [CLI connections](https://docs.snowflake.com/en/developer-guide/snowflake-cli/connecting/configure-connections) (see the connection sketch below).
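A minimal connection sketch with the Snowflake CLI is shown below; flag names can vary across CLI versions, so treat this as a starting point rather than the exact incantation:

```shell
# Register a key-pair connection (adjust the account and user; flags may differ by CLI version).
snow connection add --connection-name dcm_demo \
  --account <account_identifier> \
  --user DCM_DEMO \
  --authenticator SNOWFLAKE_JWT \
  --private-key-file ./snowflake_demo_key.p8

# Verify the connection before moving on.
snow connection test --connection dcm_demo
```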

### What You’ll Learn
@@ -62,7 +61,7 @@ In this guide we will walk through how to use Snowflake Data Change Management features

```shell
openssl genrsa -out snowflake_demo_key.pem 4096
-openssl rsa -in snowflake_demo_key -pubout -out snowflake_demo_key.pub
+openssl rsa -in snowflake_demo_key.pem -pubout -out snowflake_demo_key.pub
openssl pkcs8 -topk8 -nocrypt -in snowflake_demo_key.pem -out snowflake_demo_key.p8
```
+ You could also use an [online service](https://www.cryptool.org/en/cto/openssl/) to create the key pair.
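The CREATE USER statement in the next step expects the public key as a single base64 string without the PEM delimiters; one way to print it ready for pasting (a sketch, any equivalent works):

```shell
# Print the public key on one line, minus the -----BEGIN/END----- delimiters.
grep -v -- "-----" snowflake_demo_key.pub | tr -d '\n'; echo
```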
@@ -77,19 +76,33 @@ openssl pkcs8 -topk8 -nocrypt -in snowflake_demo_key.pem -out snowflake_demo_key.p8

```sql
USE ROLE ACCOUNTADMIN;
-CREATE OR REPLACE USER "DCM_DEMO" RSA_PUBLIC_KEY='RSA_PUBLIC_KEY_HERE' DEFAULT_ROLE=SYSADMIN MUST_CHANGE_PASSWORD=FALSE;
+CREATE OR REPLACE USER "DCM_DEMO" RSA_PUBLIC_KEY='RSA_PUBLIC_KEY_HERE' MUST_CHANGE_PASSWORD=FALSE;
GRANT ROLE SYSADMIN TO USER DCM_DEMO;
```
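Optionally, sanity-check that the key was registered (run this with whichever connection has privileges to describe the user; the RSA_PUBLIC_KEY_FP property should be populated):

```shell
snow sql -q "DESC USER DCM_DEMO"
```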

-3) Use role SYSADMIN.
+3) Create a warehouse and grant usage.

-4) Create database and schema to hold the objects that will be managed via DCM.
```sql
CREATE WAREHOUSE XSMALL
WITH WAREHOUSE_SIZE = XSMALL
AUTO_SUSPEND = 30
AUTO_RESUME = TRUE
COMMENT = 'This is a warehouse for dcm quickstart.';


GRANT USAGE ON WAREHOUSE XSMALL TO ROLE SYSADMIN;
```
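A quick way to confirm the warehouse exists with the intended size and auto-suspend settings:

```shell
snow sql -q "SHOW WAREHOUSES LIKE 'XSMALL'"
```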

+4) Use role SYSADMIN.

+5) Create database and schema to hold the objects that will be managed via DCM.

```sql
create database DCM_PROJECTS;
create schema DCM_PROJECTS.DCM_PROJECTS;
```

-5) Create Database, schema, tables for bronze layer. We will be using a subset of the tables/data.
+6) Create the database, schema, and tables for the bronze layer. We will use a subset of the tables/data.

```sql
create database bronze;
@@ -146,7 +159,7 @@ CREATE TABLE order_details (
);

```
-6) Load the data
+7) Load the data

```sql
INSERT INTO customers VALUES
@@ -349,11 +362,11 @@ jobs:
--no-interactive --default
- name: Create DCM Project if not exists
run: |
-cd ./data_project && snow dcm create --if-not-exists proj1
+cd ./dcm_project && snow dcm create --if-not-exists proj1

- name: Execute PLAN with config PROD
run: |
-cd ./data_project && snow dcm plan DCM.DCM.PROJ1 --configuration="prod"
+cd ./dcm_project && snow dcm plan DCM.DCM.PROJ1 --configuration="prod"

```
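Before wiring this into CI, you can smoke-test the same CLI calls locally; this assumes your default Snowflake CLI connection uses the DCM_DEMO key pair and SYSADMIN role from the setup steps:

```shell
cd ./dcm_project
snow dcm create --if-not-exists proj1
snow dcm plan DCM.DCM.PROJ1 --configuration="prod"
```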

@@ -402,13 +415,13 @@ jobs:
--no-interactive --default
- name: Create DCM Project if not exists
run: |
-cd ./data_project && snow dcm create --if-not-exists proj1
+cd ./dcm_project && snow dcm create --if-not-exists proj1
- name: Execute PLAN with config PROD
run: |
-cd ./data_project && snow dcm plan DCM.DCM.PROJ1 --configuration="prod"
+cd ./dcm_project && snow dcm plan DCM.DCM.PROJ1 --configuration="prod"
- name: Execute deploy with config prod
run: |
-cd ./data_project && snow dcm deploy DCM.DCM.PROJ1 --configuration="prod"
+cd ./dcm_project && snow dcm deploy DCM.DCM.PROJ1 --configuration="prod"

```
5) Add sandbox script and requirements.txt
@@ -486,7 +499,7 @@ SANITIZED_NAME=$(echo ${BRANCH_NAME//-/_} | tr '[:lower:]' '[:upper:]')

pushd $REPO_ROOT
snow sql -q "CREATE OR REPLACE STAGE ${SANITIZED_NAME}_FILES"
-snow stage copy --recursive ./data_project @${SANITIZED_NAME}_FILES/
+snow stage copy --recursive ./dcm_project @${SANITIZED_NAME}_FILES/
snow sql -q "EXECUTE DCM PROJECT ${SANITIZED_NAME} PLAN USING CONFIGURATION non_prod (target_db => '${SANITIZED_NAME}', target_schema => '${SANITIZED_NAME}') FROM @${SANITIZED_NAME}_FILES/;"
popd
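# Note: branch names like "PROJ-001" contain hyphens, which are not valid in unquoted
# Snowflake identifiers. The substitution above maps "PROJ-001" -> "PROJ_001", and that
# sanitized name is reused for the stage and for the DCM target database/schema.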

@@ -502,7 +515,7 @@ SANITIZED_NAME=$(echo ${BRANCH_NAME//-/_} | tr '[:lower:]' '[:upper:]')

pushd $REPO_ROOT
snow sql -q "CREATE OR REPLACE STAGE ${SANITIZED_NAME}_FILES"
-snow stage copy --recursive ./data_project @${SANITIZED_NAME}_FILES/
+snow stage copy --recursive ./dcm_project @${SANITIZED_NAME}_FILES/

snow sql -q "EXECUTE DCM PROJECT ${SANITIZED_NAME} DEPLOY USING CONFIGURATION non_prod (target_db => '${SANITIZED_NAME}', target_schema => '${SANITIZED_NAME}') FROM @${SANITIZED_NAME}_FILES/;"
popd
@@ -579,12 +592,12 @@ urllib3==2.5.0
wcwidth==0.2.13
```

-6) Create folder called data_project with manifest.yml
+6) Create a folder called dcm_project with manifest.yml

```shell
cd <your repo>
-mkdir -p data_project/definitions
-cd data_project
+mkdir -p dcm_project/definitions
+cd dcm_project
```
+ In your code editor, create the file manifest.yml and copy the following content into it.

@@ -616,11 +629,11 @@ git push origin main

## Set up DCM to manage the gold layer
1) Create gold database and schema
+ Add a sql file into the data_project/definitions folder called gold.sql.
+ Add a SQL file called gold.sql to the dcm_project/definitions folder.

```shell
git checkout -b setup_gold
-cd data_project/definitions
+cd dcm_project/definitions
```

+ In your code editor, create the file gold.sql with the following contents:
@@ -659,8 +672,8 @@ git pull
cd sandbox
./do.sh create PROJ-001
cd ..
-mkdir -p data_project/definitions
-cd data_project/definitions
+mkdir -p dcm_project/definitions
+cd dcm_project/definitions
```

+ do.sh creates a git branch named 'PROJ-001', and all changes are made in the context of this branch. It also creates a sandbox environment by cloning the gold layer (see the sketch below).
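Roughly, the create step amounts to something like the following; this is a sketch based on the description above (the CLONE statement is an assumption, the authoritative logic lives in sandbox/do.sh):

```shell
git checkout -b PROJ-001
# Clone the gold layer into a per-branch sandbox database (assumed statement).
snow sql -q "CREATE DATABASE IF NOT EXISTS PROJ_001 CLONE GOLD"
```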
@@ -723,8 +736,8 @@ git pull
cd sandbox
./do.sh create PROJ-002
cd ..
-mkdir -p data_project/definitions
-cd data_project/definitions
+mkdir -p dcm_project/definitions
+cd dcm_project/definitions
```

+ In your code editor, create the file order_fact.sql and copy the following content into it.
@@ -809,6 +822,7 @@ In this guide we targeted a subset of tables in the Northwind database.
+ You can map the main branch to the prod Snowflake account and an integration branch to the non-prod Snowflake account.
+ The sandbox creation script can be hosted in a CI system such as Jenkins.
+ You can enhance the workflow by attaching Expectations to tables in the project and using the TEST ALL command.


5) All scripts/shell commands in this quickstart have been tested on macOS.
@@ -25,6 +25,8 @@ This Quickstart showcases the complete Cortex AI Demo Framework with:
- **Production-ready applications** with professional UI/UX


![Architecture Diagram](assets/architecture_diagram.png)

### What You Will Build
- Complete 6-application integrated demo platform
- AI-powered synthetic data generation system using Cortex functions
@@ -1259,8 +1261,6 @@ Your demo is complete! You can:
- Edit the YAML to add more steps or visualizations
- Share the demo with colleagues by sharing the YAML file

**Return to Page 5** to explore other workflows or **continue to Page 12** for cleanup instructions.

<!-- ------------------------ -->
## YAML Wizard

@@ -1806,18 +1806,6 @@ Shows complete dataset in table format with sortable columns and CSV export options
- Compare entities side-by-side
- Export data for presentations

---

### Best Practices

**Explore systematically**: Start with Overview, then drill into specific tabs
**Use AI Assistant**: Natural language queries are powerful and intuitive
**Compare entities**: VS tab helps identify top performers
**Export insights**: Share findings via CSV export
**Adjust time windows**: Find the right time range for your analysis

---

### What's Next?

**For Persona 1 (Full-Stack Developer)**:
@@ -1838,8 +1826,6 @@ You now have an interactive analytics dashboard! You can:
- Compare products/customers/categories
- Export data for presentations

**Return to Page 5** to explore other workflows or **continue to Page 12** for cleanup instructions.

<!-- ------------------------ -->
## Clean Up Resources

@@ -35,14 +35,6 @@ In this lab, we will go through everything you need to know to get started with

We will be using Tasty Bytes data in this lab. Run the script [here](https://github.com/Snowflake-Labs/getting-started-with-dbt-on-snowflake/blob/main/tasty_bytes_dbt_demo/setup/tasty_bytes_setup.sql) in Snowsight to build the objects and data required for this lab.

Workspaces that you create in Snowflake are stored in the personal database associated with the active user. To use Workspaces, you must run the following SQL command to activate all secondary roles for your user.

``` sql
ALTER USER my_user SET DEFAULT_SECONDARY_ROLES = ('ALL');
```

Sign out of Snowsight and sign back in.

<!-- ------------------------ -->
## Introduction to Workspaces

@@ -96,7 +88,7 @@ Let's now clone an example dbt project we will use in the rest of this lab.
1. Repository URL: `https://github.com/Snowflake-Labs/getting-started-with-dbt-on-snowflake.git`
2. Workspace Name: `Example-dbt-Project`
3. API Integration: `GIT_INTEGRATION`. Note: the [API Integration](https://docs.snowflake.com/en/developer-guide/git/git-setting-up#label-integrating-git-repository-api-integration) has already been configured for you.
-4. Select Public Repository
+4. Select Public Repository. Note: private repos can be authenticated with personal access tokens, and GitHub users can authenticate with [OAuth](https://docs.snowflake.com/en/developer-guide/git/git-setting-up#configure-for-authenticating-with-oauth).
5. Click Create!

![create-dbt-project](assets/create-workspace-git.png)
@@ -168,7 +160,7 @@ Let's start by running `dbt deps` to pull in the dbt_utils package.
From the dbt toolbar, you get dropdowns for the project, target, and command. Clicking the play button will run the relevant command. You can also click the down arrow to override the arguments.

1. From the toolbar, select dev and deps.
-2. Click the dropdown arrow and enter `dbt_access_integration`. This [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) has already been configured for you.
+2. Click the dropdown arrow and select `dbt_access_integration`. This [external access integration](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) has already been configured for you.
3. Click the Deps button.

![dbt-deps](assets/dbt-deps.png)
@@ -228,23 +220,23 @@ Workspaces are fully git backed. To view changes and commit, click changes from
<!-- ------------------------ -->
## Orchestration and Monitoring

-### Monitor dbt Projects
+### Orchestrate with Tasks

-You can get an overview of dbt project status from the dbt Projects activity in Snowsight. Navigate to monitoring > dbt Projects to view overall status of dbt Projects and quickly jump to the deployed projects.
+Navigate to Catalog > Database Explorer > TASTY_BYTES_DBT_DB > RAW > dbt Projects > DBT_PROJECT to view the project details. From the Run History tab, you can view all runs associated with the project.

-![dbt-projects](assets/dbt-projects.png)

-### Orchestrate with Tasks
+![project-details](assets/dbt_project_details.png)

#### Create Scheduled dbt Tasks

Let's create tasks to regularly run and test our dbt project.

-1. Click dbt_project in the top right corner of Workspaces
-2. Click Create Schedule from the dropdown
+1. Navigate to the Project Details tab
+2. Click Create Schedule from the Schedules dropdown
3. Enter a name, schedule, and profile, then click Create.

![create-task](assets/create-task.png)
![create-schedule](assets/dbt-create-schedule.png)

![create-task](assets/create-schedule.png)

#### Complex Tasks and Alerts

@@ -263,18 +255,7 @@ CREATE OR REPLACE TASK tasty_bytes_dbt_db.raw.dbt_run_task
CREATE OR REPLACE TASK tasty_bytes_dbt_db.raw.dbt_test_task
WAREHOUSE=TASTY_BYTES_DBT_WH
AFTER tasty_bytes_dbt_db.raw.dbt_run_task
-AS
-DECLARE
-dbt_success BOOLEAN;
-dbt_exception STRING;
-my_exception EXCEPTION (-20002, 'My exception text');
-BEGIN
-EXECUTE DBT PROJECT "TASTY_BYTES_DBT_DB"."RAW"."DBT_PROJECT" args='test --target dev';
-SELECT SUCCESS, EXCEPTION into :dbt_success, :dbt_exception FROM TABLE(result_scan(last_query_id()));
-IF (NOT :dbt_success) THEN
-raise my_exception;
-END IF;
-END;
+AS EXECUTE DBT PROJECT "TASTY_BYTES_DBT_DB"."RAW"."DBT_PROJECT" args='test --target dev';

-- Run the tasks once
ALTER TASK tasty_bytes_dbt_db.raw.dbt_test_task RESUME;
@@ -326,6 +307,12 @@ You can view the status of running tasks by going to Monitoring > Task History.

![tasks](assets/tasks.png)
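The same run history is also queryable with SQL, which is handy for scripted checks; a sketch, assuming your connection can read the database's INFORMATION_SCHEMA:

```shell
snow sql -q "SELECT name, state, scheduled_time, error_message
FROM TABLE(tasty_bytes_dbt_db.INFORMATION_SCHEMA.TASK_HISTORY(RESULT_LIMIT => 10))
ORDER BY scheduled_time DESC"
```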

### Monitor dbt Projects

You can get an overview of dbt project status from the dbt Projects activity in Snowsight. Navigate to Monitoring > dbt Projects to view the overall status of dbt Projects and quickly jump to the deployed projects.

![dbt-projects](assets/dbt-projects.png)

### Tracing

dbt projects in Snowflake integrate with [Tracing and Logging](https://docs.snowflake.com/en/developer-guide/logging-tracing/logging-tracing-overview), allowing you to easily monitor and debug dbt projects. Tracing follows the OpenTelemetry standard and allows you to keep logs within a single platform.
@@ -0,0 +1,26 @@
# Getting Started with Snowflake Interactive Tables

## Overview

When it comes to near real-time (or sub-second) analytics, the ideal scenario involves achieving consistent, rapid query performance and managing costs effectively, even with large datasets and high user demand.

Snowflake's new Interactive Warehouses and Tables are designed to deliver on these needs. They provide a high-concurrency, low-latency serving layer for near real-time analytics, enabling consistent, sub-second query performance for live dashboards and APIs with strong price-for-performance. With this end-to-end solution, you can avoid operational complexity and tool sprawl.

Here's how interactive warehouses and tables fit into a typical data analytics pipeline:

![](assets/architecture.png)

### What You'll Learn
- The core concepts behind Snowflake's Interactive Warehouses and Tables and how they provide low-latency analytics.
- How to create and configure an Interactive Warehouse using SQL.
- The process of creating an Interactive Table from an existing standard table.
- How to attach a table to an Interactive Warehouse to pre-warm the data cache for faster queries.
- A methodology for benchmarking and comparing the query latency of an interactive setup versus a standard warehouse.

### What You'll Build

You will build a complete, functioning interactive data environment in Snowflake, including a dedicated Interactive Warehouse and an Interactive Table populated with data. You will also create a Python-based performance test that executes queries against both your new interactive setup and a standard configuration, culminating in a comparative bar chart that visually demonstrates the latency improvements.
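The guide's harness is Python-based, but the core comparison can be sketched with the CLI alone. The warehouse and table names below are placeholders rather than names from the guide, and client-side `time` includes network overhead, so treat the numbers as a rough signal:

```shell
# Run the same point lookup against an interactive and a standard warehouse.
# MY_INTERACTIVE_WH, MY_STANDARD_WH, and MY_TABLE are hypothetical names.
for wh in MY_INTERACTIVE_WH MY_STANDARD_WH; do
  echo "warehouse: ${wh}"
  time snow sql -q "USE WAREHOUSE ${wh}; SELECT COUNT(*) FROM MY_TABLE WHERE id = 42"
done
```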

## Step-By-Step Guide

For prerequisites, environment setup, step-by-step guide and instructions, please refer to the [QuickStart Guide](template.md).