6.23. Snowflake Connector
The Snowflake connector allows querying and creating tables in an external Snowflake account. This can be used to join data between different systems like Snowflake and Hive, or between two different Snowflake accounts.
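For example, assuming a catalog named snowflake (configured as described below) and an existing Hive catalog named hive, a federated query joining the two systems might look like the following. The schema, table, and column names here are purely illustrative:
SELECT o.order_id, c.customer_name
FROM snowflake.sales.orders o
JOIN hive.web.customers c
  ON o.customer_id = c.customer_id;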
Configuration
To configure the Snowflake connector, create a catalog properties file in etc/catalog named, for example, snowflake.properties, to mount the Snowflake connector as the snowflake catalog. Create the file with the following contents, replacing the connection properties as appropriate for your setup:
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com/?db=<database>
connection-user=root
connection-password=secret
scan-cache-enabled=true
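Once the catalog is mounted, it can be inspected and queried with standard Presto SQL. The schema and table names below are placeholders for objects in your Snowflake database:
SHOW SCHEMAS FROM snowflake;
SHOW TABLES FROM snowflake.public;
SELECT * FROM snowflake.public.orders;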
Transparent Scan Cache
The transparent scan cache caches data scanned from Snowflake in the in-memory Ampool connector. On subsequent access, the Snowflake table scan is replaced by a scan of the cached Ampool table, improving performance. A cache invalidation mechanism deletes cached tables based on data changes and Ampool resource consumption. The transparent scan cache is enabled by setting scan-cache-enabled=true in the Snowflake catalog properties file. In addition, the session property query_scancaching_enabled enables or disables the transparent scan cache for a particular session. Note that both the catalog property and the session property must be true for the transparent cache to be used.
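As a sketch, assuming the session property is exposed as a catalog session property on a catalog named snowflake, it could be toggled for the current session like this:
SET SESSION snowflake.query_scancaching_enabled = true;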
Multiple Snowflake Databases or Accounts
The Snowflake connector can only access a single database within a Snowflake account. Thus, if you have multiple Snowflake databases, or want to connect to multiple Snowflake accounts, you must configure multiple instances of the Snowflake connector.
To add another catalog, simply add another properties file to etc/catalog with a different name, making sure it ends in .properties. For example, if you name the property file sales.properties, Presto creates a catalog named sales using the configured connector.
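For instance, a second catalog for a different Snowflake database could be defined in etc/catalog/sales.properties with contents along these lines, where the account, database, and credentials are placeholders for your own values:
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com/?db=<sales_database>
connection-user=root
connection-password=secret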