LLMChatOptions
Interface: LLMChatOptions
Defined in: packages/codeboltjs/src/types/libFunctionTypes.ts:1375
Properties
| Property | Type | Description | Defined in |
|---|---|---|---|
| `maxTokens?` | `number` | Maximum number of tokens to generate | packages/codeboltjs/src/types/libFunctionTypes.ts:1383 |
| `messages` | `Message[]` | Messages in the conversation | packages/codeboltjs/src/types/libFunctionTypes.ts:1377 |
| `model?` | `string` | Model to use | packages/codeboltjs/src/types/libFunctionTypes.ts:1379 |
| `stream?` | `boolean` | Whether to stream the response | packages/codeboltjs/src/types/libFunctionTypes.ts:1385 |
| `temperature?` | `number` | Sampling temperature (0–1) | packages/codeboltjs/src/types/libFunctionTypes.ts:1381 |
| `toolChoice?` | `"auto" \| "none" \| "required"` | Tool choice strategy | packages/codeboltjs/src/types/libFunctionTypes.ts:1389 |
| `tools?` | `Tool[]` | Available tools | packages/codeboltjs/src/types/libFunctionTypes.ts:1387 |
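The properties above can be sketched as a TypeScript interface with an example options object. Note that the exact shapes of `Message` and `Tool` are not defined in this section, so the minimal versions below are assumptions for illustration only:

```typescript
// Assumed minimal shapes for Message and Tool (not defined in this section).
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

interface Tool {
  name: string;
  description?: string;
}

// Sketch of LLMChatOptions as documented in the table above.
interface LLMChatOptions {
  messages: Message[];           // required: conversation history
  model?: string;                // optional: model identifier
  temperature?: number;          // optional: sampling temperature, 0–1
  maxTokens?: number;            // optional: cap on generated tokens
  stream?: boolean;              // optional: stream the response
  tools?: Tool[];                // optional: tools the model may call
  toolChoice?: "auto" | "none" | "required"; // optional: tool choice strategy
}

// Example: a non-streaming request with conservative sampling.
const options: LLMChatOptions = {
  messages: [{ role: "user", content: "Summarize this repository." }],
  temperature: 0.2,
  maxTokens: 512,
  stream: false,
  toolChoice: "none",
};

console.log(options.messages.length);
```

Only `messages` is required; every other property falls back to the library's defaults when omitted.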