Overview of the Generative AI functionality and LLM integrations in Hymalaia.
This section gives an overview of the Generative AI capabilities in Hymalaia and how Large Language Models (LLMs) are integrated into the system.
Hymalaia supports a wide range of cloud-based and self-hosted LLMs, including OpenAI, Anthropic, Azure OpenAI, Claude via Bedrock, and self-hosted models such as LLaMA 3.1. Hymalaia relies on the excellent LiteLLM and Langchain libraries to support these integrations.
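For a sense of how these libraries unify providers, here is a minimal LiteLLM sketch (illustrative only, not Hymalaia's internal code); the model string and prompt are placeholder assumptions:

```python
# Minimal, illustrative LiteLLM call: one OpenAI-style interface across many providers.
# The model name and prompt are placeholders, not Hymalaia defaults.
import litellm

response = litellm.completion(
    model="gpt-4",  # swap in any LiteLLM-supported model string
    messages=[{"role": "user", "content": "Summarize our vacation policy."}],
)
print(response.choices[0].message.content)
```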
LLMs are used to generate answers to user questions; this is the core of Hymalaia's AI Answering functionality.
Our default recommendations are:

- gpt-4 (OpenAI)
- Claude 3.5 Sonnet (Anthropic)

Other high-quality recommended options:

- Azure OpenAI
- Claude via Bedrock
- Self-hosted LLaMA 3.1 70B / 405B

These provide an excellent balance between quality, latency, and reliability.
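Expressed as LiteLLM-style model identifiers, the options above might look roughly like the following; the Azure deployment name, model versions, and the self-hosted tag are assumptions, not values Hymalaia ships with:

```python
# Illustrative LiteLLM-style model identifiers for the recommended options above.
# Deployment names, versions, and the Ollama tag are assumptions -- adjust to your setup.
RECOMMENDED_MODELS = {
    "openai": "gpt-4",
    "anthropic": "claude-3-5-sonnet-20240620",
    "azure_openai": "azure/<your-gpt-4-deployment>",  # hypothetical Azure deployment name
    "bedrock": "bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
    "self_hosted": "ollama/llama3.1:70b",  # assumes an Ollama-served LLaMA 3.1
}
```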
There are several reasons to customize your LLM provider, such as switching to a different model (e.g. gpt-4o).

Note: OpenAI and Azure OpenAI retain logs for 30 days for misuse monitoring.
🔐 Generative AI is the only part of Hymalaia that sends data to a third-party service.
You can avoid this by self-hosting a model — but note the potential performance tradeoffs.
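As a rough sketch of what self-hosting looks like through LiteLLM (assuming an Ollama server on its default port; the endpoint and model tag are assumptions):

```python
# Hedged sketch: routing a request to a self-hosted model instead of a cloud API.
# Assumes a locally running Ollama server with LLaMA 3.1 pulled; adjust api_base/model.
import litellm

response = litellm.completion(
    model="ollama/llama3.1",            # assumed local model tag
    api_base="http://localhost:11434",  # Ollama's default endpoint (assumption)
    messages=[{"role": "user", "content": "What is Hymalaia?"}],
)
print(response.choices[0].message.content)
```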
To configure LLMs, head to the Admin Panel. You can set up multiple LLM providers and assign them to different assistants, which lets you mix and match models based on your needs (for example, quality, latency, and reliability).
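Conceptually, that mapping might look like the sketch below; this is purely illustrative, and the assistant names and model choices are hypothetical rather than Hymalaia's actual configuration schema:

```python
# Purely illustrative: one way to picture assigning LLM providers to assistants.
# Assistant names and model choices are hypothetical.
ASSISTANT_LLMS = {
    "support-bot": "gpt-4",                  # prioritize answer quality
    "quick-lookup": "gpt-4o",                # prioritize latency
    "internal-docs": "ollama/llama3.1:70b",  # keep data on self-hosted infrastructure
}
```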
Check out the following examples for how to configure specific LLM providers, or go to the Admin Panel to get started.
🙋 Need help? The Hymalaia team is here — don’t hesitate to reach out!