Working with the Snowflake Data Source
Processing semantic model with source as Snowflake on AWS
To process the semantic model with Snowflake on AWS as the source:
In the semantic model advanced properties, provide the value of your secret key in the AWS_SECRET_KEY property.
In the semantic model advanced properties, provide the value of your access key in the AWS_KEY_ID property.
-or-
Configure a Snowflake storage integration to access Amazon S3.
Important
Refer to the Snowflake documentation for configuring a Snowflake storage integration to access Amazon S3. Before creating an external stage as per the Snowflake documentation, ensure that the URL is: s3://<your bucket name>/<kyvos work directory>/temp/
For example, 's3://kyvos-qa/user/engine_work/temp/'
For AWS Marketplace deployments, add the IAM role (created in the 'Step 2: Create the IAM Role in AWS' section of the Snowflake documentation) to the KMS key policy.
Note
This requires AWS console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.
On Snowflake, execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query with the role used in the Snowflake connection in Kyvos.
Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property to the Snowflake connection.
NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.
Update your bucket policy with the IAM role that you created while configuring the Snowflake storage integration to access Amazon S3 in the step above.
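Taken together, the storage-integration path above can be sketched in Snowflake SQL. This is a sketch only; the integration name s3_int, the role ARN, and the object names are placeholder assumptions, so substitute the values from your own AWS and Snowflake setup:

```sql
-- Sketch only: s3_int, the role ARN, and the bucket path are placeholders.
-- Storage integration pointing at the Kyvos work directory (see Snowflake docs).
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<account-id>:role/<snowflake-role>'
  STORAGE_ALLOWED_LOCATIONS = ('s3://<your bucket name>/<kyvos work directory>/temp/');

-- External stage on the same URL, then grant usage to the Kyvos connection role.
CREATE STAGE mydb.myschema.mystage
  STORAGE_INTEGRATION = s3_int
  URL = 's3://<your bucket name>/<kyvos work directory>/temp/';

GRANT USAGE ON STAGE mydb.myschema.mystage TO ROLE <role_used_on_Kyvos_SF_Connection>;
```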
Processing semantic model with source as Snowflake on Azure
This requires Azure portal access and a Snowflake account administrator (a user with the ACCOUNTADMIN role).
To get the SAS token from the storage account:
Go to the Azure Portal.
Navigate to your Storage Account.
Under Security + networking, click Shared access signature.
Choose the following:
Allowed services (Blob, File, etc.)
Allowed resource types (Service, Container, Object)
Permissions (Read, Write, Delete, List, etc.)
Start and expiry time
Allowed IP addresses (optional)
Allowed protocols (HTTPS recommended)
Click Generate SAS and connection string.
Copy the SAS Token or Blob URL with SAS Token.
Go to Snowflake and create a stage.
Copy the SAS token and enter it in the following command in the Snowflake worksheet:
CREATE OR REPLACE STAGE Sample_stage URL='azure://myaccount.blob.core.windows.net/mycontainer/load/files' CREDENTIALS=(AZURE_SAS_TOKEN='?sv=2016-05-31&ss=b&srt=sco&sp=rwdl&se=2018-06-27T10:05:50Z&st=2017-06-27T02:05:50Z&spr=https,http&sig=bgqQwoXwxzuD2GJfagRg7VOS8hzNr3QLT7rhS8OFRLQ%3D')
Execute the following command:
GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>;
Go to the Snowflake connection on the Connection page and add the kyvos.connection.snowflake.stage = @mydb.myschema.Sample_stage property.
Note
Here, mydb is the database name, myschema is the schema name, and Sample_stage is the stage name.
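Before processing, you can sanity-check that the SAS token and stage work by listing the stage contents in a Snowflake worksheet (assuming the database, schema, and stage names used above):

```sql
-- Lists the files visible through the stage; an authorization error here
-- usually means the SAS token has expired or lacks the Read/List permissions.
LIST @mydb.myschema.Sample_stage;
```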
Processing semantic model with source as Snowflake on GCP
To process the semantic model with Snowflake on GCP as the source,
Configure a Snowflake storage integration to access Google Cloud Storage.
Important
To configure a Snowflake storage integration to access Google Cloud Storage, refer to the Snowflake documentation.
Before creating an external stage as per the Snowflake documentation, ensure that the URL is:
gcs://<your bucket name>/<kyvos work directory>/temp/
For example, 'gcs://kyvos-qa/user/engine_work/temp/'
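As a sketch of the integration step the Snowflake documentation walks through (the integration name gcs_int is a placeholder; substitute your own bucket path):

```sql
-- Sketch only: gcs_int and the bucket path are placeholders.
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://<your bucket name>/<kyvos work directory>/temp/');

-- Returns the STORAGE_GCP_SERVICE_ACCOUNT that must be granted access to the bucket.
DESC STORAGE INTEGRATION gcs_int;
```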
Note
This requires GCP console access and a Snowflake account administrator (a user with the ACCOUNTADMIN role) or a role with the global CREATE INTEGRATION privilege.
On Snowflake, execute the GRANT USAGE ON STAGE <stagename> TO ROLE <role_used_on_Kyvos_SF_Connection>; query with the role used in the Snowflake connection in Kyvos.
Add the kyvos.connection.snowflake.stage = @mydb.myschema.mystage property to the Snowflake connection.
NOTE: Here, mydb is the database name, myschema is the schema name, and mystage is the stage name.
If a Customer Managed Key (CMK) or Bring Your Own Key (BYOK) is used in a Terraform-based deployment, use the following command to create the stage:
CREATE STAGE my_gcs_stage
STORAGE_INTEGRATION = gcs_int
URL = 'gcs://mybucket1/path1'
ENCRYPTION = (TYPE = 'GCS_SSE_KMS');
For more details, refer to the Snowflake documentation.