inferenceConfigurations
suspend fun inferenceConfigurations(value: Output<List<AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgs>>)
suspend fun inferenceConfigurations(value: List<AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgs>)
Parameters
value
Inference parameters to use when the agent invokes a foundation model in the part of the agent sequence defined by the prompt_type. For more information, see Inference parameters for foundation models. See the inference_configuration block for details.
suspend fun inferenceConfigurations(vararg values: Output<AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgs>)
suspend fun inferenceConfigurations(values: List<Output<AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgs>>)
suspend fun inferenceConfigurations(vararg values: AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgs)
Parameters
values
Inference parameters to use when the agent invokes a foundation model in the part of the agent sequence defined by the prompt_type. For more information, see Inference parameters for foundation models. See the inference_configuration block for details.
suspend fun inferenceConfigurations(argument: List<suspend AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgsBuilder.() -> Unit>)
suspend fun inferenceConfigurations(vararg argument: suspend AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgsBuilder.() -> Unit)
suspend fun inferenceConfigurations(argument: suspend AgentAgentPromptOverrideConfigurationPromptConfigurationInferenceConfigurationArgsBuilder.() -> Unit)
Parameters
argument
Inference parameters to use when the agent invokes a foundation model in the part of the agent sequence defined by the prompt_type. For more information, see Inference parameters for foundation models. See the inference_configuration block for details.
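The builder overloads are the most idiomatic way to set this value from the Kotlin DSL. The sketch below nests the call inside a hypothetical agentAgent resource builder; the surrounding builder names (promptOverrideConfigurations, promptConfigurations) and the setters (promptType, temperature, topP) are assumptions modelled on the Bedrock prompt_override_configuration block, not taken from this page.

val agent = agentAgent("example") {
    args {
        // ...other required agent arguments omitted...
        promptOverrideConfigurations {
            promptConfigurations {
                promptType("PRE_PROCESSING") // part of the agent sequence this override applies to
                inferenceConfigurations {
                    temperature(0.2) // hypothetical setter names
                    topP(0.9)
                }
            }
        }
    }
}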