Configuring GenAI Custom Provider
An LLM connection is the configuration that enables a system to communicate with a Large Language Model (LLM), such as those provided by OpenAI, Azure OpenAI, AWS Bedrock, and so on. An LLM connection is used to:
- Translate natural language into analytical queries.
- Power chat-based or voice-based interfaces.
- Enhance the user experience through intelligent, language-based interaction with data.
An LLM connection allows:
- Sending natural language inputs (for example, user queries) to the LLM.
- Receiving generated outputs such as MDX/SQL queries, summaries, or explanations.
- Enabling features such as conversational analytics, natural language querying, or smart recommendations.
On the Custom Provider page, while configuring a custom provider, you must set the 'Applicable For' field to indicate whether the details apply to an LLM Connection, an Embedding Connection, or both.
To configure the GenAI LLM custom provider, perform the following steps:
1. On the navigation pane, click Kyvos and Ecosystem > GenAI Configuration. The page displays information about the GenAI Configuration connection details.
2. In the Configure GenAI custom provider details pane, click the plus icon to add new custom provider details. The connection that you create is listed in the Custom Providers pane. You can edit the information when needed.
| Parameter/Field | Description |
|---|---|
| Name | The name of the GenAI provider that the system will use to generate output. |
| Applicable For | Select the applicable type from the list: LLM Connection, Embedding Connection, or Both. |
| GenAI Provider Zip | Upload the GenAI provider zip file. The zip file should include two folders: |
| Callback Class Name for LLM Service | Provide the fully qualified class name (including the package name) of the class that implements the GenAICallbackService interface for the LLM service. |
| Callback Class Name for Embedding Service | Provide the fully qualified class name (including the package name) of the class that implements the GenAIEmbeddingCallbackService interface for embedding generation. |
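As an illustration of what the callback-class fields expect, a hedged sketch follows. The `GenAICallbackService` interface shown here is a hypothetical stand-in (the real interface ships with the Kyvos SDK and its method signatures may differ); the package name `com.example.genai` and class `MyLLMCallback` are likewise illustrative. The point is that the value entered in the field is the fully qualified name, `com.example.genai.MyLLMCallback`.

```java
package com.example.genai; // the package name is part of the fully qualified class name

// Hypothetical stand-in for the Kyvos-provided interface; the actual
// method names and signatures are defined by the Kyvos SDK.
interface GenAICallbackService {
    String generate(String prompt);
}

// The class you would bundle in the GenAI provider zip and register as
// "com.example.genai.MyLLMCallback" in the
// "Callback Class Name for LLM Service" field.
public class MyLLMCallback implements GenAICallbackService {
    @Override
    public String generate(String prompt) {
        // In a real provider, this would call your custom LLM endpoint;
        // echoed here only to keep the sketch self-contained.
        return "LLM response for: " + prompt;
    }

    public static void main(String[] args) {
        GenAICallbackService svc = new MyLLMCallback();
        System.out.println(svc.generate("total sales by region"));
    }
}
```

A class for the embedding field would follow the same pattern, implementing the `GenAIEmbeddingCallbackService` interface instead.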