Supported environments and connections
Applies to: Kyvos Enterprise, Kyvos Cloud (SaaS on AWS), Kyvos AWS Marketplace, Kyvos Azure Marketplace, Kyvos GCP Marketplace, and Kyvos Single Node Installation (Kyvos SNI)
Kyvos SNI supports the following warehouse connections:
BigQuery
Teradata
Azure SQL Database
Oracle RDS
Snowflake
Generic
Databricks SQL Warehouse
Note
Kyvos SNI does not support Redshift connections.
Supported Platforms
Following are the supported platforms.
| Cloud Platform | Instance |
|---|---|
| AWS | EC2 Instance |
| GCP | VM Instance |
| On-premise | On-premise |
| Azure | VM Instance |
Supported Operating Systems
| Platform | Supported OS |
|---|---|
| On-premise | CentOS 7.x |
| AWS | Amazon Linux 2023 |
| Azure | RHEL 8.x |
| GCP | RHEL 8.x |
BigQuery
Prerequisites
You must create a service account. For details about creating a service account, see the Google Cloud documentation.
After creating the service account, assign the following permissions:
To the service account used by the Kyvos VMs, add the following roles on the BigQuery project:
BigQuery Data Viewer
BigQuery User
To access BigQuery views, add the following permissions to the Kyvos custom role (created above). To create a custom role, see the Google Cloud documentation.
bigquery.tables.create
bigquery.tables.delete
bigquery.tables.update
bigquery.tables.updateData
To use a BigQuery warehouse connection with Google Service Account authentication, place the private key JSON file in both olapengine/bin and queryengine/bin. To create a BigQuery connection:
1. From the Toolbox, click Setup, then Connections.
2. From the Actions menu ( ⋮ ), click Add Connection.
3. Enter a name for the connection, or select one from the Connection list.
4. Select Warehouse from the Category list.
5. For Providers, select BigQuery. (There may be more than one warehouse connection.)
6. Specify the Server used to access Google APIs. For example, https://www.googleapis.com.
7. Specify the Port to use.
8. Enter your Google Project ID for the Google Cloud Platform.
9. To generate temporary views in a separate dataset when performing validation/preview operations from Kyvos on Google BigQuery, provide the Materialization Project and Materialization Dataset names.
10. For Authentication Type, Google Service Account is displayed. Provide the following:
    Service Account Email: the email of the service account to be used for authentication.
    Service Account Private Key Path: the name of the private key JSON file to be used for authentication. Ensure that this file is placed in olapengine/bin and queryengine/bin.
11. By default, the Use as Source checkbox is selected, as this connection can only be used to read data (create datasets) on which the semantic models will be created.
12. Select the Is Default SQL Engine checkbox to use this as the default connection.
13. Enter the JDBC URL for the BigQuery connection.
14. Click the Test button at the top left to validate the connection settings.
15. If the connection is valid, click the Save button.
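The JDBC URL entered in the steps above is not spelled out in this guide; BigQuery JDBC drivers commonly follow the Simba URL format, where OAuthType=0 selects service-account authentication. The sketch below assembles such a URL purely as an illustration; the project ID, service-account email, and key file name are placeholder assumptions, not values from this guide:

```python
def bigquery_jdbc_url(project_id: str, sa_email: str, key_file: str) -> str:
    """Build a Simba-style BigQuery JDBC URL using service-account auth.

    OAuthType=0 selects Google Service Account authentication; the key
    file is referenced by name, matching the file placed in
    olapengine/bin and queryengine/bin.
    """
    return (
        "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
        f"ProjectId={project_id};"
        "OAuthType=0;"
        f"OAuthServiceAcctEmail={sa_email};"
        f"OAuthPvtKeyPath={key_file}"
    )

# Placeholder values for illustration only:
url = bigquery_jdbc_url(
    "my-project",
    "kyvos-sa@my-project.iam.gserviceaccount.com",
    "service-account.json",
)
print(url)
```

Verify the exact URL format against the driver version you upload; parameter names can differ between driver releases.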
Teradata
Prerequisites
The Teradata application must be up and running.
Azure SQL Database
Prerequisites
If Azure resources are accessed from other platforms, allow public Internet IP addresses to access your resource on the Azure SQL Server.
Generic
Kyvos SNI supports Generic JDBC connectivity with any source that offers a JDBC interface. To establish this connection, upload a compatible JDBC driver JAR from Kyvos Manager.
The following connections are certified through the Generic JDBC flow:
AWS-Databricks
Azure-Databricks: Databricks connectivity with Kyvos Single Node Installation is certified with Personal Access Token-based authentication.
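A Generic JDBC connection comes down to three pieces: the driver JAR uploaded through Kyvos Manager, the driver class name, and the JDBC URL. The sketch below shows these ingredients with a cheap URL sanity check; the PostgreSQL driver and URL are purely illustrative assumptions, not Kyvos defaults:

```python
# Sketch: the three ingredients of a Generic JDBC connection.
# Driver and URL values below are illustrative assumptions.
generic_connection = {
    "driver_jar": "postgresql-42.7.3.jar",    # uploaded via Kyvos Manager
    "driver_class": "org.postgresql.Driver",  # class inside that JAR
    "jdbc_url": "jdbc:postgresql://db.example.com:5432/sales",
}

def looks_like_jdbc_url(url: str) -> bool:
    """Cheap sanity check: every JDBC URL starts with 'jdbc:<subprotocol>:'."""
    parts = url.split(":", 2)
    return len(parts) == 3 and parts[0] == "jdbc" and parts[1] != ""

print(looks_like_jdbc_url(generic_connection["jdbc_url"]))
```

The driver class and URL syntax come from the source database's own JDBC driver documentation, so always check them against the JAR version you upload.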
Oracle RDS
Prerequisites
The Oracle RDS database must be up and running.
No additional prerequisites or permissions are required if the Oracle RDS database is available on the same platform and within the same account.
Snowflake
There are no prerequisites for Snowflake.
Important
Kyvos SNI setup does not support the following:
Windows and Mac Operating System
Disaster Recovery in case of node or disk failure.
SQL and HCatalog-based files
Redshift connection
SQL Engine – Athena and Presto
Graviton EC2 instances
Semantic models are processed locally on the same node. External process clusters are not supported.
Note that backup is supported only through EBS snapshots.
Databricks SQL Warehouse
Supported only with a premium workspace.
Supported only with Personal Access Token authentication.
Storage Blob Data Contributor rights are required for the user.
You must have permission to create (and map) Storage credentials and External Locations for the Unity Catalog.
For more information, see the Working with Databricks SQL warehouse for Azure section.
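Databricks SQL Warehouse JDBC URLs commonly use the Databricks driver format, where AuthMech=3 with the literal user name token carries the Personal Access Token required above. The sketch below assembles such a URL; the workspace host, warehouse ID, and token are placeholder assumptions:

```python
# Sketch: Databricks SQL Warehouse JDBC URL with Personal Access Token auth.
# Workspace host, warehouse ID, and token below are placeholder assumptions.
host = "adb-1234567890123456.7.azuredatabricks.net"
warehouse_id = "abcdef0123456789"
token = "dapiXXXXXXXX"  # PAT of a user with the rights listed above

url = (
    f"jdbc:databricks://{host}:443/default;"
    "transportMode=http;ssl=1;AuthMech=3;"        # AuthMech=3 = token auth
    f"httpPath=/sql/1.0/warehouses/{warehouse_id};"
    "UID=token;"                                   # literal user name 'token'
    f"PWD={token}"
)
print(url)
```

The httpPath value for a SQL warehouse is shown on the warehouse's Connection Details page in Databricks; confirm the URL parameters against the driver version in use.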