# Interface: LLMInferenceParams
Defined in: packages/codeboltjs/src/types/libFunctionTypes.ts:133
Parameters for an LLM inference request.
## Properties
| Property | Type | Description | Defined in |
|---|---|---|---|
| `llmrole` | `string` | The LLM role used to determine which model to use | packages/codeboltjs/src/types/libFunctionTypes.ts:141 |
| `max_tokens?` | `number` | Maximum number of tokens to generate | packages/codeboltjs/src/types/libFunctionTypes.ts:143 |
| `messages` | `Message[]` | Array of messages in the conversation | packages/codeboltjs/src/types/libFunctionTypes.ts:135 |
| `stream?` | `boolean` | Whether to stream the response | packages/codeboltjs/src/types/libFunctionTypes.ts:147 |
| `temperature?` | `number` | Temperature for response generation | packages/codeboltjs/src/types/libFunctionTypes.ts:145 |
| `tool_choice?` | `{ function: { name: string; }; type: "function"; }` \| `"auto"` \| `"none"` \| `"required"` | How the model should use tools | packages/codeboltjs/src/types/libFunctionTypes.ts:139 |
| `tools?` | `Tool[]` | Available tools for the model to use | packages/codeboltjs/src/types/libFunctionTypes.ts:137 |
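A minimal sketch of constructing a request with these parameters. The `Message` and `Tool` shapes below are local stand-ins assumed to follow the common OpenAI-style chat format; the real definitions live in `packages/codeboltjs/src/types/libFunctionTypes.ts` and may differ.

```typescript
// Local stand-ins for Message and Tool (assumptions, not the real
// codeboltjs definitions) so this sketch is self-contained.
interface Message {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface Tool {
  type: "function";
  function: { name: string; description?: string; parameters?: object };
}

interface LLMInferenceParams {
  llmrole: string;
  messages: Message[];
  tools?: Tool[];
  tool_choice?:
    | { function: { name: string }; type: "function" }
    | "auto"
    | "none"
    | "required";
  max_tokens?: number;
  temperature?: number;
  stream?: boolean;
}

// Example request: offer one tool and let the model decide
// whether to call it (tool_choice: "auto").
const params: LLMInferenceParams = {
  llmrole: "assistant",
  messages: [
    { role: "system", content: "You are a helpful coding assistant." },
    { role: "user", content: "What is the weather in Paris?" },
  ],
  tools: [
    {
      type: "function",
      function: { name: "get_weather", description: "Fetch current weather" },
    },
  ],
  tool_choice: "auto",
  max_tokens: 512,
  temperature: 0.7,
  stream: false,
};
```

To force a specific tool call instead, set `tool_choice` to the object form, e.g. `{ type: "function", function: { name: "get_weather" } }`.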