LlmProvider API

The `llmProvider` module of `@codebolt/plugin-sdk` lets a plugin register itself as a custom LLM provider and answer completion requests.

```typescript
import plugin from '@codebolt/plugin-sdk';
```

Quick Reference

| Method | Description |
| --- | --- |
| `onCompletionRequest` | Subscribe to incoming non-streaming completion requests. |
| `onLoginRequest` | Subscribe to incoming login requests (triggered by the UI login button). |
| `onStreamRequest` | Subscribe to incoming streaming completion requests. |
| `register` | Register this plugin as a custom LLM provider on the server. |
| `sendChunk` | Send a streaming chunk for an in-flight stream request. |
| `sendError` | Send an error for a completion or stream request. |
| `sendReply` | Send the final reply for a completion or stream request. |
| `unregister` | Unregister this plugin's provider. |

Methods


onCompletionRequest

```typescript
plugin.llmProvider.onCompletionRequest(callback: Function): void
```

Subscribe to incoming non-streaming completion requests. Reply with sendReply(requestId, response) or sendError(requestId, error).

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `callback` | `Function` | Yes | Handler invoked for each incoming completion request. |

Returns: void

Full reference →
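A non-streaming handler might look like the sketch below. Only `onCompletionRequest`, `sendReply`, and `sendError` come from this reference; the request field names (`requestId`, `messages`) and the response shape are illustrative assumptions, so the SDK wiring is shown in comments and the response-building logic stays self-contained.

```typescript
// Payload shape we assume the callback receives (hypothetical).
interface CompletionRequest {
  requestId: string;
  messages: { role: string; content: string }[];
}

// Build an OpenAI-style completion response for a request.
function buildResponse(req: CompletionRequest, text: string) {
  return {
    id: req.requestId,
    choices: [
      {
        index: 0,
        message: { role: "assistant", content: text },
        finish_reason: "stop",
      },
    ],
  };
}

// SDK wiring (requires '@codebolt/plugin-sdk' at runtime):
//
// plugin.llmProvider.onCompletionRequest(async (req: CompletionRequest) => {
//   try {
//     const text = await callMyModel(req.messages); // your backend call
//     plugin.llmProvider.sendReply(req.requestId, buildResponse(req, text));
//   } catch (err) {
//     plugin.llmProvider.sendError(req.requestId, String(err));
//   }
// });

const demo = buildResponse(
  { requestId: "req-1", messages: [{ role: "user", content: "hi" }] },
  "Hello!"
);
```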


onLoginRequest

```typescript
plugin.llmProvider.onLoginRequest(callback: Function): void
```

Subscribe to incoming login requests (triggered by the UI login button). The plugin should run its authentication flow (e.g. OAuth) and then call sendReply(requestId, { authenticated: true }) or sendError().

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `callback` | `Function` | Yes | Handler invoked for each incoming login request. |

Returns: void

Full reference →
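A login handler might be sketched as follows. `runOAuthFlow` is a hypothetical stand-in for the plugin's own authentication logic; only the `{ authenticated: true }` reply payload comes from this reference, and the SDK wiring is shown in comments.

```typescript
// Hypothetical result of the plugin's own auth flow.
async function runOAuthFlow(): Promise<{ ok: boolean; reason?: string }> {
  // In a real plugin this would open a browser, exchange tokens, etc.
  return { ok: true };
}

// Build the reply payload documented above.
function loginReply(result: { ok: boolean }) {
  return { authenticated: result.ok };
}

// SDK wiring (commented out so the sketch stays self-contained):
//
// plugin.llmProvider.onLoginRequest(async (req: { requestId: string }) => {
//   const result = await runOAuthFlow();
//   if (result.ok) plugin.llmProvider.sendReply(req.requestId, loginReply(result));
//   else plugin.llmProvider.sendError(req.requestId, result.reason ?? "login failed");
// });
```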


onStreamRequest

```typescript
plugin.llmProvider.onStreamRequest(callback: Function): void
```

Subscribe to incoming streaming completion requests. Stream tokens with sendChunk(requestId, chunk), then finalize with sendReply(requestId, finalResponse) or sendError(requestId, error).

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `callback` | `Function` | Yes | Handler invoked for each incoming stream request. |

Returns: void

Full reference →
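A streaming handler can split its output into chunks shaped like the multillm `StreamChunk` described under `sendChunk`. The helper below is a self-contained sketch; the request field names and the final `{ done: true }` payload in the commented wiring are assumptions, not part of the documented API.

```typescript
// Chunk shape described under sendChunk (multillm StreamChunk).
interface StreamChunk {
  id: string;
  model: string;
  choices: { delta: { content?: string } }[];
}

// Split a full reply into StreamChunk-shaped pieces of `size` characters.
function toChunks(requestId: string, model: string, text: string, size = 4): StreamChunk[] {
  const chunks: StreamChunk[] = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push({
      id: requestId,
      model,
      choices: [{ delta: { content: text.slice(i, i + size) } }],
    });
  }
  return chunks;
}

// SDK wiring (commented; request fields and final payload are assumptions):
//
// plugin.llmProvider.onStreamRequest(async (req) => {
//   for (const chunk of toChunks(req.requestId, "my-model", await generate(req))) {
//     plugin.llmProvider.sendChunk(req.requestId, chunk);
//   }
//   plugin.llmProvider.sendReply(req.requestId, { done: true });
// });

const chunks = toChunks("req-2", "my-model", "Hello, world");
```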


register

```typescript
plugin.llmProvider.register(manifest: LlmProviderManifest): Promise<LlmProviderRegisterResponse>
```

Register this plugin as a custom LLM provider on the server. After registration, the provider appears in the provider list and can be selected by users like any built-in provider.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `manifest` | `LlmProviderManifest` | Yes | Manifest describing the provider to register. |

Returns: Promise<LlmProviderRegisterResponse>

Full reference →
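A registration call might look like the sketch below. The exact `LlmProviderManifest` fields are defined by the SDK's types; the field names here (`providerId`, `displayName`, `models`) are illustrative assumptions, and the actual call is shown in comments.

```typescript
// Illustrative manifest; consult the SDK's LlmProviderManifest type for the
// real field names.
const manifest = {
  providerId: "my-llm-provider",
  displayName: "My LLM Provider",
  models: [{ id: "my-model", name: "My Model" }],
};

// Registration (requires '@codebolt/plugin-sdk' at runtime):
//
// const res = await plugin.llmProvider.register(manifest as LlmProviderManifest);
// if (!res.success) console.error("registration failed");
```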


sendChunk

```typescript
plugin.llmProvider.sendChunk(requestId: string, chunk: any): void
```

Send a streaming chunk for an in-flight stream request. The chunk shape should match the multillm `StreamChunk` format (`id`, `model`, `choices: [{ delta: { content, ... } }]`).

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `requestId` | `string` | Yes | ID of the in-flight stream request. |
| `chunk` | `any` | Yes | Chunk in the multillm `StreamChunk` shape. |

Returns: void

Full reference →


sendError

```typescript
plugin.llmProvider.sendError(requestId: string, error: string): void
```

Send an error for a completion or stream request.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `requestId` | `string` | Yes | ID of the request being failed. |
| `error` | `string` | Yes | Error message to report. |

Returns: void

Full reference →


sendReply

```typescript
plugin.llmProvider.sendReply(requestId: string, response: any, success: boolean): void
```

Send the final reply for a completion or stream request. For non-streaming: this is the only message you send. For streaming: send after all chunks have been emitted.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `requestId` | `string` | Yes | ID of the request being answered. |
| `response` | `any` | Yes | Final response payload. |
| `success` | `boolean` | No | Whether the request succeeded (default: `true`). |

Returns: void

Full reference →


unregister

```typescript
plugin.llmProvider.unregister(providerId: string): Promise<LlmProviderResponse>
```

Unregister this plugin's provider.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `providerId` | `string` | Yes | ID of the provider to unregister. |

Returns: Promise<LlmProviderResponse>

Full reference →
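Putting the methods together, a provider's lifetime is register → handle requests → unregister. The sketch below illustrates that ordering against a minimal mock that only records calls; the real object is `plugin.llmProvider`, and the mock's return shapes (`{ success: true }`) are assumptions.

```typescript
// Minimal mock of the llmProvider surface, used only to illustrate call
// ordering; the real object comes from '@codebolt/plugin-sdk'.
const calls: string[] = [];
const mockProvider = {
  register: async (m: { providerId: string }) => {
    calls.push(`register:${m.providerId}`);
    return { success: true };
  },
  sendReply: (requestId: string, _response: unknown) => {
    calls.push(`reply:${requestId}`);
  },
  unregister: async (providerId: string) => {
    calls.push(`unregister:${providerId}`);
    return { success: true };
  },
};

// Register on startup, answer requests while running, unregister on shutdown.
async function lifecycle() {
  await mockProvider.register({ providerId: "my-llm-provider" });
  mockProvider.sendReply("req-9", { choices: [] }); // answer one request
  await mockProvider.unregister("my-llm-provider"); // clean up
  return calls;
}
```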