Interface: LLMInferenceParams

Defined in: packages/codeboltjs/src/types/libFunctionTypes.ts:133

LLM inference request parameters

Properties

| Property | Type | Description | Defined in |
| --- | --- | --- | --- |
| `llmrole` | `string` | The LLM role to determine which model to use | packages/codeboltjs/src/types/libFunctionTypes.ts:141 |
| `max_tokens?` | `number` | Maximum number of tokens to generate | packages/codeboltjs/src/types/libFunctionTypes.ts:143 |
| `messages` | `Message[]` | Array of messages in the conversation | packages/codeboltjs/src/types/libFunctionTypes.ts:135 |
| `stream?` | `boolean` | Whether to stream the response | packages/codeboltjs/src/types/libFunctionTypes.ts:147 |
| `temperature?` | `number` | Temperature for response generation | packages/codeboltjs/src/types/libFunctionTypes.ts:145 |
| `tool_choice?` | `{ function: { name: string }; type: "function" } \| "auto" \| "none" \| "required"` | How the model should use tools | packages/codeboltjs/src/types/libFunctionTypes.ts:139 |
| `tools?` | `Tool[]` | Available tools for the model to use | packages/codeboltjs/src/types/libFunctionTypes.ts:137 |
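
Taken together, the property table corresponds to a request shape like the TypeScript sketch below. The `Message` and `Tool` shapes shown here are simplified assumptions (their real definitions live elsewhere in libFunctionTypes.ts), and the `search_docs` tool is a hypothetical example, not part of the library.

```typescript
// Simplified stand-ins for the Message and Tool types referenced above;
// the actual definitions are in libFunctionTypes.ts.
interface Message {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface Tool {
  type: "function";
  function: {
    name: string;
    description?: string;
    parameters?: Record<string, unknown>;
  };
}

// Shape implied by the property table (ordered by source line number).
interface LLMInferenceParams {
  messages: Message[];
  tools?: Tool[];
  tool_choice?:
    | { function: { name: string }; type: "function" }
    | "auto"
    | "none"
    | "required";
  llmrole: string;
  max_tokens?: number;
  temperature?: number;
  stream?: boolean;
}

// Example request: let the model decide whether to call the
// (hypothetical) "search_docs" tool while answering the user.
const params: LLMInferenceParams = {
  llmrole: "assistant",
  messages: [
    { role: "system", content: "You are a helpful coding assistant." },
    { role: "user", content: "How do I read a file with codeboltjs?" },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "search_docs",
        description: "Search the documentation",
        parameters: {
          type: "object",
          properties: { query: { type: "string" } },
        },
      },
    },
  ],
  tool_choice: "auto",
  temperature: 0.2,
  max_tokens: 1024,
  stream: false,
};
```

A value of this shape would be passed to the library's LLM inference call; only `messages` and `llmrole` are required, the remaining fields tune tool use and generation behavior.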