
LLM Configuration: Advanced Properties

This appendix lists additional LLM configuration properties that you can set for the Agentic AI plugin steps. The following tables describe the properties that you may need to configure for the selected LLM provider. Use these properties to customize behavior and control how the system processes requests.

baseUrl
    Specifies the endpoint address used to send requests to the LLM provider. Use this property to control which service endpoint handles API calls, such as a regional or custom endpoint.
    Note: This property can also override the default LLM API base URL, or configure the AE Gateway base URL when isUseAEGateway is set to true for the AWS Bedrock and Azure OpenAI providers.
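As a sketch, baseUrl and isUseAEGateway might be combined as follows. The endpoint URL and the Python-dictionary representation are illustrative assumptions for this example, not the plugin's actual configuration format:

```python
# Illustrative LLM connection settings. The property names come from this
# appendix; the URL and the dictionary layout are assumptions for the example.
llm_config = {
    # Route API calls to a regional or custom endpoint instead of the default.
    "baseUrl": "https://eu.api.example-llm-provider.com/v1",
    # When true (AWS Bedrock and Azure OpenAI providers), baseUrl is treated
    # as the AE Gateway base URL rather than the provider's own API endpoint.
    "isUseAEGateway": False,
}
```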

Proxy Details Properties:

llmProxyHost
    Specifies the host name or IP address of the proxy server used to route LLM requests.
llmProxyPort
    Specifies the port number of the proxy server used for LLM communication.
llmProxyUsername
    Specifies the username required to authenticate with the proxy server, if proxy authentication is enabled.
llmProxyPassword
    Specifies the password used to authenticate with the proxy server. Store this value securely.
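For illustration, the four llmProxy* properties can be assembled into a standard proxy URL. This helper is a sketch for the example, not part of the plugin, and the host, port, and credential values shown in the test usage are made up:

```python
from urllib.parse import quote

def build_proxy_url(cfg: dict) -> str:
    """Assemble a proxy URL from the llmProxy* properties.

    Illustrative helper only. Credentials are URL-encoded in case they
    contain reserved characters such as '@' or ':'.
    """
    auth = ""
    if cfg.get("llmProxyUsername"):
        auth = quote(cfg["llmProxyUsername"], safe="")
        if cfg.get("llmProxyPassword"):
            auth += ":" + quote(cfg["llmProxyPassword"], safe="")
        auth += "@"
    return f"http://{auth}{cfg['llmProxyHost']}:{cfg['llmProxyPort']}"
```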

Timeout Properties:

timeout
    Specifies the maximum time the application waits for a response from the LLM provider before failing the request.
connectTimeout
    Specifies the maximum time the application waits to establish a connection with the LLM provider.
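The two timeouts govern different phases of a request: connectTimeout bounds connection establishment, while timeout bounds the wait for the LLM response. A minimal sketch of resolving them, assuming hypothetical default values of 10 and 120 seconds (the plugin's real defaults and units are not stated in this appendix):

```python
# Hypothetical defaults; the plugin's actual defaults and units are not
# documented here.
DEFAULTS = {"connectTimeout": 10, "timeout": 120}

def resolve_timeouts(config: dict) -> tuple:
    """Return (connect_timeout, response_timeout), applying the defaults.

    connectTimeout caps how long connection establishment may take;
    timeout caps how long to wait for the provider's response.
    """
    connect = config.get("connectTimeout", DEFAULTS["connectTimeout"])
    response = config.get("timeout", DEFAULTS["timeout"])
    return (connect, response)
```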

Transport Type for Vertex AI:

vertexAITransportType
    Specifies the transport mechanism used to communicate with Vertex AI, such as REST or gRPC.
    Note: This property is available only for the Google Vertex AI LLM provider type.
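Since the property accepts only the two transport mechanisms named above, a value can be normalized and checked before use. The validation helper below is illustrative only, not part of the plugin:

```python
# Accepted values per this appendix; the normalization helper is an assumption.
VALID_TRANSPORTS = {"rest", "grpc"}

def normalize_transport(value: str) -> str:
    """Lower-case a vertexAITransportType value and reject unknown transports."""
    transport = value.strip().lower()
    if transport not in VALID_TRANSPORTS:
        raise ValueError(
            f"vertexAITransportType must be one of {sorted(VALID_TRANSPORTS)}, got {value!r}"
        )
    return transport
```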