Commit dafaa5d

yuuteng authored and dprophet committed
Add Snowflake JDBC Connector
***** Update ci and delete no use lines (#20)

Approved

Update according to reviews 11/01/2024

Various style fixes and cleanup (#15) (#17)

Co-authored-by: Martin Traverso <[email protected]>

Various style fixes and cleanup (#15)

Update the github CI (#12)

* Add Snowflake JDBC Connector
* Add snowflake in the ci

Add Snowflake JDBC Connector (#11)

Had to redo the connector because all the rebases caused havoc
1 parent acc40c9 commit dafaa5d

File tree

26 files changed: +2722 -0 lines

.github/workflows/ci.yml (+22)

@@ -373,6 +373,7 @@ jobs:
             !:trino-server,
             !:trino-server-rpm,
             !:trino-singlestore,
+            !:trino-snowflake,
             !:trino-sqlserver,
             !:trino-test-jdbc-compatibility-old-server,
             !:trino-tests,
@@ -475,6 +476,8 @@ jobs:
           - { modules: plugin/trino-redshift, profile: fte-tests }
           - { modules: plugin/trino-resource-group-managers }
           - { modules: plugin/trino-singlestore }
+          - { modules: plugin/trino-snowflake }
+          - { modules: plugin/trino-snowflake, profile: cloud-tests }
           - { modules: plugin/trino-sqlserver }
           - { modules: testing/trino-faulttolerant-tests, profile: default }
           - { modules: testing/trino-faulttolerant-tests, profile: test-fault-tolerant-delta }
@@ -651,6 +654,24 @@ jobs:
         if: matrix.modules == 'plugin/trino-bigquery' && !contains(matrix.profile, 'cloud-tests-2') && (env.CI_SKIP_SECRETS_PRESENCE_CHECKS != '' || env.BIGQUERY_CASE_INSENSITIVE_CREDENTIALS_KEY != '')
         run: |
           $MAVEN test ${MAVEN_TEST} -pl :trino-bigquery -Pcloud-tests-case-insensitive-mapping -Dbigquery.credentials-key="${BIGQUERY_CASE_INSENSITIVE_CREDENTIALS_KEY}"
+      - name: Cloud Snowflake Tests
+        env:
+          SNOWFLAKE_URL: ${{ secrets.SNOWFLAKE_URL }}
+          SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
+          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
+          SNOWFLAKE_DATABASE: ${{ secrets.SNOWFLAKE_DATABASE }}
+          SNOWFLAKE_ROLE: ${{ secrets.SNOWFLAKE_ROLE }}
+          SNOWFLAKE_WAREHOUSE: ${{ secrets.SNOWFLAKE_WAREHOUSE }}
+        if: matrix.modules == 'plugin/trino-snowflake' && !contains(matrix.profile, 'cloud-tests') && (env.SNOWFLAKE_URL != '' && env.SNOWFLAKE_USER != '' && env.SNOWFLAKE_PASSWORD != '')
+        run: |
+          $MAVEN test ${MAVEN_TEST} -pl :trino-snowflake -Pcloud-tests \
+            -Dconnector.name="snowflake" \
+            -Dsnowflake.test.server.url="${SNOWFLAKE_URL}" \
+            -Dsnowflake.test.server.user="${SNOWFLAKE_USER}" \
+            -Dsnowflake.test.server.password="${SNOWFLAKE_PASSWORD}" \
+            -Dsnowflake.test.server.database="${SNOWFLAKE_DATABASE}" \
+            -Dsnowflake.test.server.role="${SNOWFLAKE_ROLE}" \
+            -Dsnowflake.test.server.warehouse="${SNOWFLAKE_WAREHOUSE}"
       - name: Iceberg Cloud Tests
         id: tests-iceberg
         env:
@@ -842,6 +863,7 @@ jobs:
       - suite-clickhouse
       - suite-mysql
       - suite-iceberg
+      - suite-snowflake
       - suite-hudi
       - suite-ignite
       exclude:
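The new CI step can also be reproduced outside GitHub Actions. A sketch, assuming a checkout of the Trino repository with its Maven wrapper and the same six `SNOWFLAKE_*` credentials exported as environment variables (values are placeholders):

```shell
# Sketch: run the Snowflake cloud tests locally against a real account.
# All SNOWFLAKE_* values below are placeholders for valid credentials.
export SNOWFLAKE_URL="jdbc:snowflake://<account>.snowflakecomputing.com"
export SNOWFLAKE_USER="user"
export SNOWFLAKE_PASSWORD="secret"
export SNOWFLAKE_DATABASE="database"
export SNOWFLAKE_ROLE="role"
export SNOWFLAKE_WAREHOUSE="warehouse"

# Mirrors the `run:` block of the Cloud Snowflake Tests CI step
./mvnw test -pl :trino-snowflake -Pcloud-tests \
    -Dconnector.name="snowflake" \
    -Dsnowflake.test.server.url="${SNOWFLAKE_URL}" \
    -Dsnowflake.test.server.user="${SNOWFLAKE_USER}" \
    -Dsnowflake.test.server.password="${SNOWFLAKE_PASSWORD}" \
    -Dsnowflake.test.server.database="${SNOWFLAKE_DATABASE}" \
    -Dsnowflake.test.server.role="${SNOWFLAKE_ROLE}" \
    -Dsnowflake.test.server.warehouse="${SNOWFLAKE_WAREHOUSE}"
```

Without credentials, the `cloud-tests` profile has nothing to run against, which is why the CI step is guarded by the secrets-presence check in its `if:` condition.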

core/trino-server/src/main/provisio/trino.xml (+6)

@@ -296,6 +296,12 @@
         </artifact>
     </artifactSet>

+    <artifactSet to="plugin/snowflake">
+        <artifact id="${project.groupId}:trino-snowflake:zip:${project.version}">
+            <unpack />
+        </artifact>
+    </artifactSet>
+
     <artifactSet to="plugin/sqlserver">
         <artifact id="${project.groupId}:trino-sqlserver:zip:${project.version}">
             <unpack />

docs/src/main/sphinx/connector.md (+1)

@@ -38,6 +38,7 @@ Prometheus <connector/prometheus>
 Redis <connector/redis>
 Redshift <connector/redshift>
 SingleStore <connector/singlestore>
+Snowflake <connector/snowflake>
 SQL Server <connector/sqlserver>
 System <connector/system>
 Thrift <connector/thrift>
@@ -0,0 +1,94 @@ (new file)

# Snowflake connector

```{raw} html
<img src="../_static/img/snowflake.png" class="connector-logo">
```

The Snowflake connector allows querying and creating tables in an
external [Snowflake](https://www.snowflake.com/) account. This can be used to join data between
different systems like Snowflake and Hive, or between two different
Snowflake accounts.

## Configuration

To configure the Snowflake connector, create a catalog properties file
in `etc/catalog` named, for example, `example.properties`, to
mount the Snowflake connector as the `snowflake` catalog.
Create the file with the following contents, replacing the
connection properties as appropriate for your setup:

```none
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.account=account
snowflake.database=database
snowflake.role=role
snowflake.warehouse=warehouse
```
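With a catalog file like the one above in place, the catalog can be queried through the Trino CLI. A sketch, where `example` is the catalog name from the file above, the server address and the `public.orders` schema/table names are hypothetical, and a running Trino cluster is assumed:

```shell
# Hypothetical schema/table names; assumes a running Trino server
# with the `example` catalog mounted.
trino --server localhost:8080 --execute 'SHOW SCHEMAS FROM example'
trino --server localhost:8080 --execute 'SELECT * FROM example.public.orders LIMIT 10'
```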
### Arrow serialization support

This is an experimental feature which introduces support for using Apache Arrow
as the serialization format when reading from Snowflake. Note the following
caveat:

- Apache Arrow serialization is disabled by default. To enable it, add
  `--add-opens=java.base/java.nio=ALL-UNNAMED` to the Trino
  {ref}`jvm-config`.

### Multiple Snowflake databases or accounts

The Snowflake connector can only access a single database within
a Snowflake account. Thus, if you have multiple Snowflake databases,
or want to connect to multiple Snowflake accounts, you must configure
multiple instances of the Snowflake connector.
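For example, two databases in the same account (hypothetical file and database names) could be exposed as two catalogs by creating one properties file per catalog in `etc/catalog`, identical apart from the database:

```none
# etc/catalog/sales.properties (hypothetical)
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=sales_db
# remaining connection properties as in the single-catalog example
```

```none
# etc/catalog/analytics.properties (hypothetical)
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=analytics_db
# remaining connection properties as in the single-catalog example
```

Each file becomes its own catalog, so the two databases are queried as `sales` and `analytics` from Trino.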
% snowflake-type-mapping:

## Type mapping

Trino supports the following Snowflake data types:

| Snowflake Type | Trino Type     |
| -------------- | -------------- |
| `boolean`      | `boolean`      |
| `tinyint`      | `bigint`       |
| `smallint`     | `bigint`       |
| `byteint`      | `bigint`       |
| `int`          | `bigint`       |
| `integer`      | `bigint`       |
| `bigint`       | `bigint`       |
| `float`        | `real`         |
| `real`         | `real`         |
| `double`       | `double`       |
| `decimal`      | `decimal(P,S)` |
| `varchar(n)`   | `varchar(n)`   |
| `char(n)`      | `varchar(n)`   |
| `binary(n)`    | `varbinary`    |
| `varbinary`    | `varbinary`    |
| `date`         | `date`         |
| `time`         | `time`         |
| `timestampntz` | `timestamp`    |

A complete list of [Snowflake data types](https://docs.snowflake.com/en/sql-reference/intro-summary-data-types.html)
is available in the Snowflake documentation.

(snowflake-sql-support)=

## SQL support

The connector provides read access and write access to data and metadata in
a Snowflake database. In addition to the {ref}`globally available
<sql-globally-available>` and {ref}`read operation <sql-read-operations>`
statements, the connector supports the following features:

- {doc}`/sql/insert`
- {doc}`/sql/delete`
- {doc}`/sql/truncate`
- {doc}`/sql/create-table`
- {doc}`/sql/create-table-as`
- {doc}`/sql/drop-table`
- {doc}`/sql/alter-table`
- {doc}`/sql/create-schema`
- {doc}`/sql/drop-schema`
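The write statements listed above can be exercised end to end from the Trino CLI. A sketch with hypothetical schema and table names, assuming a running cluster with the `example` catalog from the configuration section:

```shell
# Hypothetical names; assumes a running Trino server with the
# `example` Snowflake catalog mounted.
trino --execute "CREATE SCHEMA example.sales"
trino --execute "CREATE TABLE example.sales.orders (id bigint, total decimal(18,2))"
trino --execute "INSERT INTO example.sales.orders VALUES (1, DECIMAL '19.99')"
trino --execute "DELETE FROM example.sales.orders WHERE id = 1"
trino --execute "DROP TABLE example.sales.orders"
trino --execute "DROP SCHEMA example.sales"
```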
(binary file, 91.3 KB, not rendered)
