FlowPromptModelInferenceConfigurationArgs

data class FlowPromptModelInferenceConfigurationArgs(val maxTokens: Output<Double>? = null, val stopSequences: Output<List<String>>? = null, val temperature: Output<Double>? = null, val topK: Output<Double>? = null, val topP: Output<Double>? = null) : ConvertibleToJava<FlowPromptModelInferenceConfigurationArgs>

Prompt model inference configuration

Constructors

constructor(maxTokens: Output<Double>? = null, stopSequences: Output<List<String>>? = null, temperature: Output<Double>? = null, topK: Output<Double>? = null, topP: Output<Double>? = null)
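
A minimal construction sketch follows. The import path and the parameter values are illustrative assumptions, since this page only documents the class itself; parameters that are omitted default to null.

import com.pulumi.core.Output
import com.pulumi.awsnative.bedrock.inputs.FlowPromptModelInferenceConfigurationArgs // assumed package path

// Build an inference configuration by wrapping plain values in Output.of(...).
fun exampleInferenceConfig(): FlowPromptModelInferenceConfigurationArgs =
    FlowPromptModelInferenceConfigurationArgs(
        maxTokens = Output.of(512.0),                    // cap the length of the generated output
        temperature = Output.of(0.7),                    // moderate randomness
        topK = Output.of(250.0),                         // sample from the 250 most likely next tokens
        topP = Output.of(0.9),                           // cumulative probability cutoff
        stopSequences = Output.of(listOf("\n\nHuman:"))  // stop generation at this sequence
    )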

Properties

val maxTokens: Output<Double>? = null

Maximum number of tokens in the generated output

val stopSequences: Output<List<String>>? = null

List of stop sequences

val temperature: Output<Double>? = null

Controls randomness; higher values increase diversity

val topK: Output<Double>? = null

Sample from the k most likely next tokens

val topP: Output<Double>? = null

Cumulative probability cutoff for token selection

Functions

open override fun toJava(): FlowPromptModelInferenceConfigurationArgs