LLM Configuration: Advanced Properties

This appendix lists additional LLM configuration properties that you can set for the Agentic AI plugin steps. Some properties apply only to specific LLM providers, as noted in their descriptions. Use these properties to customize behavior and control how the system processes requests.

baseUrl
  Specifies the endpoint address used to send requests to the LLM provider. Use this property to control which service endpoint handles API calls, such as a regional or custom endpoint.
  Note: This property can also override the default LLM API base URL, or configure the AE Gateway base URL when isUseAEGateway is set to true, for the AWS Bedrock and Azure OpenAI providers.

apiEndpoint
  Specifies the API endpoint used to connect to the LLM provider. Use this property to define the regional service URL (for example, us-central1-aiplatform.googleapis.com) so that requests are routed to the correct location and meet data residency requirements.
  Note: This property can also override the default LLM API base URL, or configure the AE Gateway base URL when isUseAEGateway is set to true, for Google Vertex AI.
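As an illustration, the endpoint properties above might be set as follows. This is a hedged sketch: the property names come from this appendix, but the URLs, resource names, and the dict-based configuration shape are placeholders, not real endpoints or a documented API.

```python
# Hypothetical endpoint configuration for two providers.
# Property names are from this appendix; all values are placeholders.

# AWS Bedrock / Azure OpenAI style: custom or regional endpoint via baseUrl.
azure_openai_config = {
    "baseUrl": "https://my-resource.openai.azure.com",
}

# Google Vertex AI style: regional service URL via apiEndpoint,
# so requests stay in the region required for data residency.
vertex_ai_config = {
    "apiEndpoint": "us-central1-aiplatform.googleapis.com",
}
```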

AE Tool Property:

aeToolExecutionTimeout
  Specifies the maximum time (in seconds) that the system waits for a workflow (AE tool) to complete during asynchronous execution. The system checks the workflow status every 5 seconds until the workflow completes or the timeout elapses.
  The minimum allowed value is 30 seconds and the maximum is 600 seconds (10 minutes).
  If not provided, the step uses the default timeout of 600 seconds.
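The bounds and default described above can be sketched as a small validation helper. The function name `resolve_ae_tool_timeout` is illustrative, not part of the plugin API; the 30/600-second bounds, the 600-second default, and the 5-second poll interval are the values documented above.

```python
DEFAULT_AE_TOOL_TIMEOUT = 600  # seconds, used when the property is not set
MIN_AE_TOOL_TIMEOUT = 30       # seconds
MAX_AE_TOOL_TIMEOUT = 600      # seconds (10 minutes)
POLL_INTERVAL = 5              # seconds between workflow status checks

def resolve_ae_tool_timeout(value=None):
    """Return the effective aeToolExecutionTimeout in seconds.

    Falls back to the 600-second default when no value is given and
    rejects values outside the documented 30-600 second range.
    """
    if value is None:
        return DEFAULT_AE_TOOL_TIMEOUT
    if not MIN_AE_TOOL_TIMEOUT <= value <= MAX_AE_TOOL_TIMEOUT:
        raise ValueError(
            f"aeToolExecutionTimeout must be between {MIN_AE_TOOL_TIMEOUT} "
            f"and {MAX_AE_TOOL_TIMEOUT} seconds, got {value}"
        )
    return value
```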

AE Gateway Properties:

Use the following properties to connect to the AE Gateway. When enabled, configure AE Gateway details instead of LLM provider details.

Note
  • These properties are supported only for the following LLM providers: Google Vertex AI, AWS Bedrock, and Azure OpenAI.
  • Configure the AE Gateway base URL using the baseUrl or apiEndpoint property, depending on the LLM provider.
isUseAEGateway
  Enables or disables use of the AE Gateway in the LLM configuration.
  Allowed values:
  - true
  - false

aeGatewayToken
  Specifies the authentication token that your application uses to connect to the AE Gateway.
  Use this property to pass the required token for secure access; it ensures that only authorized requests can access the gateway services.
  Obtain this token by calling the AE Gateway authentication API.
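A gateway-enabled configuration might combine these properties as in the sketch below. The property names are from this appendix; the gateway URL and token value are placeholders, and the dict shape is an illustrative assumption.

```python
# Hypothetical AE Gateway configuration for an AWS Bedrock or
# Azure OpenAI provider. All values are placeholders.
ae_gateway_config = {
    "isUseAEGateway": True,
    # Gateway base URL goes in baseUrl for AWS Bedrock / Azure OpenAI;
    # for Google Vertex AI it would go in apiEndpoint instead.
    "baseUrl": "https://ae-gateway.example.com",
    # Token obtained from the AE Gateway authentication API.
    "aeGatewayToken": "<token-from-ae-gateway-auth-api>",
}
```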

Proxy Details Properties:

llmProxyHost
  Specifies the host name or IP address of the proxy server used to route LLM requests.

llmProxyPort
  Specifies the port number of the proxy server used for LLM communication.

llmProxyUsername
  Specifies the username required to authenticate with the proxy server, if proxy authentication is enabled.

llmProxyPassword
  Specifies the password used to authenticate with the proxy server. Store this value securely.
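The four proxy properties might be supplied together as in the following sketch. The property names are from this appendix; the host, port, and credential values are placeholders, and in practice the password should come from a secrets store rather than source code.

```python
# Hypothetical proxy configuration; all values are placeholders.
llm_proxy_config = {
    "llmProxyHost": "proxy.internal.example.com",
    "llmProxyPort": 8080,
    # The next two properties are needed only when proxy
    # authentication is enabled.
    "llmProxyUsername": "proxy-user",
    "llmProxyPassword": "<stored-securely>",  # keep out of source control
}
```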
Timeout Properties:

timeout
  Specifies the maximum time the application waits for a response from the LLM provider before failing the request.

connectTimeout
  Specifies the maximum time the application waits to establish a connection with the LLM provider.
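The two timeout properties might be combined as below. This is a sketch: the appendix does not state the units or defaults for these properties, so the second-based values here are assumptions for illustration only.

```python
# Hypothetical timeout configuration; units and values are assumptions.
llm_timeout_config = {
    "connectTimeout": 10,  # max wait to establish a connection
    "timeout": 120,        # max wait for a response before failing
}

# A connection timeout longer than the overall response timeout would
# be contradictory, so a sanity check like this may be useful.
assert llm_timeout_config["connectTimeout"] <= llm_timeout_config["timeout"]
```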

Transport Type for Vertex AI:

vertexAITransportType
  Specifies the transport mechanism used to communicate with Vertex AI, such as REST or gRPC.
  Note: This property is available only for the Google Vertex AI LLM provider.
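A small helper can illustrate how this provider-specific property might be checked before use. This is an assumption-laden sketch: the accepted value set ("rest", "grpc") is inferred from the "such as REST or gRPC" wording above, and `validate_vertex_transport` is a hypothetical name, not part of the plugin.

```python
# Accepted transports inferred from the description above; the actual
# plugin may accept different spellings or additional values.
VALID_VERTEX_TRANSPORTS = {"rest", "grpc"}

def validate_vertex_transport(value):
    """Normalize and check a vertexAITransportType value (illustrative).

    Remember this property applies only to the Google Vertex AI provider.
    """
    normalized = value.lower()
    if normalized not in VALID_VERTEX_TRANSPORTS:
        raise ValueError(f"Unsupported Vertex AI transport: {value!r}")
    return normalized
```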