LlmProvider API
The llmProvider module of @codebolt/plugin-sdk lets a plugin register itself as a custom LLM provider and handle completion, streaming, and login requests.
import plugin from '@codebolt/plugin-sdk';
Quick Reference
| Method | Description |
|---|---|
| onCompletionRequest | Subscribe to incoming non-streaming completion requests. |
| onLoginRequest | Subscribe to incoming login requests (triggered by the UI login button). |
| onStreamRequest | Subscribe to incoming streaming completion requests. |
| register | Register this plugin as a custom LLM provider on the server. |
| sendChunk | Send a streaming chunk for an in-flight stream request. |
| sendError | Send an error for a completion or stream request. |
| sendReply | Send the final reply for a completion or stream request. |
| unregister | Unregister this plugin's provider. |
Methods
onCompletionRequest
plugin.llmProvider.onCompletionRequest(callback: Function): void
Subscribe to incoming non-streaming completion requests. Reply with sendReply(requestId, response) or sendError(requestId, error).
| Parameter | Type | Required | Description |
|---|---|---|---|
| callback | Function | Yes | Handler invoked for each incoming completion request. |
Returns: void
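A minimal non-streaming handler might look like the sketch below. The request shape (a requestId field plus chat messages) and the callMyBackend helper are assumptions for illustration, not part of the SDK:

```typescript
import plugin from '@codebolt/plugin-sdk';

// Hypothetical backend call — replace with your provider's real client.
declare function callMyBackend(messages: unknown): Promise<unknown>;

plugin.llmProvider.onCompletionRequest(async (request: any) => {
  try {
    // Assumed request shape: { requestId, messages, ... }
    const response = await callMyBackend(request.messages);
    plugin.llmProvider.sendReply(request.requestId, response, true);
  } catch (err) {
    plugin.llmProvider.sendError(
      request.requestId,
      err instanceof Error ? err.message : String(err)
    );
  }
});
```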
onLoginRequest
plugin.llmProvider.onLoginRequest(callback: Function): void
Subscribe to incoming login requests (triggered by the UI login button). The plugin should run its authentication flow (e.g. OAuth) and then call sendReply(requestId, { authenticated: true }) or sendError(requestId, error).
| Parameter | Type | Required | Description |
|---|---|---|---|
| callback | Function | Yes | Handler invoked for each incoming login request. |
Returns: void
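A sketch of a login handler; runMyOAuthFlow is a hypothetical stand-in for your provider's real authentication flow:

```typescript
import plugin from '@codebolt/plugin-sdk';

// Hypothetical OAuth helper — resolves once the user has authenticated.
declare function runMyOAuthFlow(): Promise<void>;

plugin.llmProvider.onLoginRequest(async (request: any) => {
  try {
    await runMyOAuthFlow();
    plugin.llmProvider.sendReply(request.requestId, { authenticated: true }, true);
  } catch (err) {
    plugin.llmProvider.sendError(
      request.requestId,
      err instanceof Error ? err.message : String(err)
    );
  }
});
```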
onStreamRequest
plugin.llmProvider.onStreamRequest(callback: Function): void
Subscribe to incoming streaming completion requests. Stream tokens with sendChunk(requestId, chunk), then finalize with sendReply(requestId, finalResponse) or sendError(requestId, error).
| Parameter | Type | Required | Description |
|---|---|---|---|
| callback | Function | Yes | Handler invoked for each incoming stream request. |
Returns: void
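A streaming handler sketch. streamFromMyBackend is a hypothetical async iterator over backend tokens, and the chunk layout follows the multillm StreamChunk shape described under sendChunk; the model id is a placeholder:

```typescript
import plugin from '@codebolt/plugin-sdk';

// Hypothetical backend stream — yields content tokens one at a time.
declare function streamFromMyBackend(messages: unknown): AsyncIterable<string>;

plugin.llmProvider.onStreamRequest(async (request: any) => {
  try {
    let full = '';
    for await (const token of streamFromMyBackend(request.messages)) {
      full += token;
      plugin.llmProvider.sendChunk(request.requestId, {
        id: request.requestId,
        model: 'my-model-1', // placeholder model id
        choices: [{ delta: { content: token } }],
      });
    }
    // Finalize only after all chunks have been emitted.
    plugin.llmProvider.sendReply(
      request.requestId,
      { choices: [{ message: { role: 'assistant', content: full } }] },
      true
    );
  } catch (err) {
    plugin.llmProvider.sendError(
      request.requestId,
      err instanceof Error ? err.message : String(err)
    );
  }
});
```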
register
plugin.llmProvider.register(manifest: LlmProviderManifest): Promise<LlmProviderRegisterResponse>
Register this plugin as a custom LLM provider on the server. After registration, the provider appears in the provider list and can be selected by users like any built-in provider.
| Parameter | Type | Required | Description |
|---|---|---|---|
| manifest | LlmProviderManifest | Yes | Manifest describing the provider to register. |
Returns: Promise<LlmProviderRegisterResponse>
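Registration might look like the sketch below. The manifest fields shown (providerId, name, models) are illustrative assumptions — check the LlmProviderManifest type for the actual shape:

```typescript
import plugin from '@codebolt/plugin-sdk';

async function main() {
  // Field names below are assumptions, not the documented type.
  const manifest = {
    providerId: 'my-llm',
    name: 'My LLM',
    models: [{ id: 'my-model-1', name: 'My Model 1' }],
  };

  const res = await plugin.llmProvider.register(manifest as any);
  console.log('registered:', res);
}

main().catch(console.error);
```

Once register resolves, the provider appears in the provider list alongside the built-in ones.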
sendChunk
plugin.llmProvider.sendChunk(requestId: string, chunk: any): void
Send a streaming chunk for an in-flight stream request. The chunk shape should match multillm StreamChunk (id, model, choices: [{delta: {content, ...}}]).
| Parameter | Type | Required | Description |
|---|---|---|---|
| requestId | string | Yes | ID of the stream request being answered. |
| chunk | any | Yes | Chunk in multillm StreamChunk shape. |
Returns: void
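For reference, a chunk matching the multillm StreamChunk shape above can be built as plain data; the literal id and model values here are placeholders:

```typescript
// Build one StreamChunk-shaped delta carrying a single content token.
function makeChunk(requestId: string, token: string) {
  return {
    id: requestId,
    model: 'my-model-1', // placeholder model id
    choices: [{ delta: { content: token } }],
  };
}

const chunk = makeChunk('req-123', 'Hel');
// Each token would then go out via:
//   plugin.llmProvider.sendChunk('req-123', chunk);
console.log(chunk.choices[0].delta.content); // "Hel"
```

sendChunk is called once per token, followed by a final sendReply.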
sendError
plugin.llmProvider.sendError(requestId: string, error: string): void
Send an error for a completion or stream request.
| Parameter | Type | Required | Description |
|---|---|---|---|
| requestId | string | Yes | ID of the request being answered. |
| error | string | Yes | Error message to deliver. |
Returns: void
sendReply
plugin.llmProvider.sendReply(requestId: string, response: any, success: boolean): void
Send the final reply for a completion or stream request. For a non-streaming request this is the only message you send; for a streaming request, send it after all chunks have been emitted.
| Parameter | Type | Required | Description |
|---|---|---|---|
| requestId | string | Yes | ID of the request being answered. |
| response | any | Yes | Final response payload. |
| success | boolean | No | Whether the request succeeded (default: true). |
Returns: void
unregister
plugin.llmProvider.unregister(providerId: string): Promise<LlmProviderResponse>
Unregister this plugin's provider.
| Parameter | Type | Required | Description |
|---|---|---|---|
| providerId | string | Yes | ID of the provider to unregister. |
Returns: Promise<LlmProviderResponse>