UiPath Documentation
Version 2.2510

Private Test Cloud admin guide

Last updated May 11, 2026

Large Language Model Support

Large language models (LLMs) power the agentic and generative AI features in Automation Suite. The AI Trust Layer governs how these models connect to UiPath products across your organization, applying policy enforcement and audit logging to all AI interactions regardless of which model you use.

Automation Suite supports two broad integration modes: connecting to cloud-hosted models through your own subscriptions, and running open source models on your own infrastructure. In both cases, the AI Trust Layer remains the central governance layer. LLM integration is supported on AKS/EKS and OpenShift deployments.

Integration modes

Cloud models

Cloud models are LLMs hosted and managed by third-party providers. You can connect your own subscriptions from the following providers:

  • OpenAI — Available via direct API or Azure-hosted deployment (Microsoft Azure OpenAI).
  • Google Gemini — Available via Google Vertex AI.
  • Anthropic Claude — Available via the Amazon Web Services connector.

Cloud model connections give you two configuration options.

Bring Your Own Subscription (BYOS)

Replace a UiPath-managed model subscription with your own subscription for the same model family and version. This lets you route model calls through your own account while keeping the UiPath product experience intact.

Bring Your Own Model (BYOM)

Add a custom model not included in the UiPath-managed offering, as long as the model passes the AI Trust Layer probe validation test.

UiPath recommends Google Gemini for the broadest compatibility across agentic capabilities.

For configuration steps, see Configuring LLMs.

Self-hosted models

For organizations that require on-premises model deployments to meet data sovereignty or compliance requirements, Automation Suite supports a set of recommended open-source (OSS) models. Note that self-hosted models are compatible with only a subset of agentic capabilities.

Self-hosted models must expose an API that follows the OpenAI V1 standard. UiPath connects to them through the OpenAI V1 Compliant LLM connector in Integration Service, and each model must pass the AI Trust Layer probe validation test before it can be used.
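The OpenAI V1 contract the connector relies on can be sketched in a few lines. The snippet below shows the minimal request and response shapes for the `/v1/chat/completions` route; the model name `glm-4.6` and the prompt are illustrative only, not a required configuration.

```python
import json

# Minimal sketch of the OpenAI V1 chat-completions contract a self-hosted
# endpoint must honor. The model identifier "glm-4.6" is an illustrative
# placeholder; use whatever name your inference server registers.

def build_request(model: str, prompt: str) -> dict:
    """Body for POST {base_url}/v1/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def extract_reply(response: dict) -> str:
    """Read the assistant text back out of an OpenAI V1 response body."""
    return response["choices"][0]["message"]["content"]

request_body = build_request("glm-4.6", "Classify this log line.")
print(json.dumps(request_body))

# A compliant server replies with at least this structure:
sample_response = {
    "choices": [{"message": {"role": "assistant", "content": "INFO"}}]
}
print(extract_reply(sample_response))
```

Any endpoint that accepts the first shape and returns the second can be registered through the OpenAI V1 Compliant LLM connector, subject to the probe validation test.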

Running self-hosted models also requires a supported inference engine and appropriate hardware. Supported inference engines include vLLM and SGLang. For hardware requirements, refer to Kubernetes cluster and nodes.
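Before registering an endpoint in Integration Service, it can be useful to confirm that it answers a chat completion the way the connector will call it. The sketch below runs such a smoke test against a local stub that mimics an OpenAI V1 server of the kind vLLM or SGLang exposes; in practice you would point `chat_completion` at your real endpoint URL instead. All names here (the stub, the `glm-4.6` model identifier) are illustrative assumptions, not part of the UiPath product.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Stub standing in for a self-hosted inference server; replace its address
# with your real OpenAI-V1-compatible endpoint when testing for real.
class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        reply = {"choices": [{"message": {
            "role": "assistant",
            "content": f"echo: {body['messages'][-1]['content']}",
        }}]}
        data = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # silence per-request logging
        pass

def chat_completion(base_url: str, model: str, prompt: str) -> str:
    """POST an OpenAI V1 chat completion and return the reply text."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    req = Request(f"{base_url}/v1/chat/completions",
                  data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"
reply = chat_completion(base, "glm-4.6", "ping")  # hypothetical model name
print(reply)
server.shutdown()
```

If this round trip succeeds against your own endpoint, the API surface is in the right shape for the connector; the AI Trust Layer probe validation then performs the authoritative check.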

Provider and capability compatibility

The following table shows which providers are supported for each product and feature, including recommended self-hosted models where applicable. For specific model versions, refer to Configuring LLMs.

| Product | Feature | OpenAI | Google Gemini | Anthropic Claude | Self-hosted OSS |
| --- | --- | --- | --- | --- | --- |
| Agents | Agent execution | Supported | Supported | Supported | GLM 4.5, GLM 4.6, GLM 4.6 V, GPT-OSS 120B |
| Agents | Evaluations | Supported | Supported | Supported | GLM 4.5, GLM 4.6, GLM 4.6 V, GPT-OSS 120B |
| Agents | Simulations | Supported ¹ | Not supported | Not supported | Not supported |
| Autopilot | Generation | Not supported | Supported | Not supported | Not supported |
| Autopilot | Chat | Not supported | Supported | Supported | Not supported |
| Autopilot for Everyone | Chat | Supported | Not supported | Supported | Not supported |
| Coded Agents | Call LLM | Supported | Supported | Supported | Not supported |
| Context Grounding | Advanced Extraction | Supported ² | Supported | Not supported | Not supported |
| Context Grounding | Batch Transform | Supported | Supported | Not supported | GLM 4.6 V |
| Context Grounding | Batch Transform with Web Search | Not supported | Supported | Not supported | Not supported |
| Context Grounding | DeepRAG | Supported | Supported | Not supported | Not supported |
| Context Grounding | Embeddings | Supported | Supported | Not supported | Qwen 8B |
| GenAI Activities | Build, Test & Deploy | Supported | Supported | Supported | TBD |
| Healing Agent | AI-enhanced recovery | Supported | Supported | Not supported | GLM 4.5, GLM 4.6, Qwen3 235B |
| Healing Agent | Popup governance | Supported | Supported | Not supported | GLM 4.5, GLM 4.6, Qwen3 235B |
| UI Automation | ScreenPlay | Supported | Supported | Supported ³ | Not supported |
| UI Automation | Semantic selectors | Not supported | Supported | Not supported | Not supported |
| Test Manager | Autopilot | Supported | Supported | Supported | Not supported |

¹ GPT 4.1 mini only.

² Embeddings only. Advanced Extraction and DeepRAG require a cloud model provider.

³ ScreenPlay only. Semantic selectors require Google Gemini.

Governance

All LLM interactions, regardless of provider, pass through the AI Trust Layer: policy enforcement via Automation Ops™ and detailed audit logging apply to every model call.

Governance policies are specifically designed for UiPath-managed LLMs. If you disable a UiPath-managed model through a policy, that restriction does not extend to your own configured models of the same type.

