API Specification
Window API
The extension injects navigator.llm into web pages.
navigator.llm.request(config)
Low-level API for making requests.
```ts
navigator.llm.request(config: RequestConfig): Promise<Response>
```

RequestConfig
```ts
interface RequestConfig {
  action: string;        // Action type
  [key: string]: any;    // Action-specific parameters
}
```

Response
```ts
interface Response {
  content?: string;      // Generated content
  data?: any;            // Structured data
  usage?: Usage;         // Token usage
  metadata?: Metadata;   // Response metadata
}
```
```ts
interface Usage {
  inputTokens: number;
  outputTokens: number;
  cost?: number;
}
```
```ts
interface Metadata {
  provider: string;
  model: string;
  latency: number;
}
```

Standard Actions
Summarize
```ts
{
  action: 'summarize',
  input: string,
  maxLength?: number,
  style?: 'paragraph' | 'bullet-points' | 'tldr'
}
```

Response:
```ts
{
  content: string,
  usage: Usage,
  metadata: Metadata
}
```
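A minimal usage sketch; `articleText` is a placeholder for whatever text the page collects:

```js
// Usage sketch: articleText is a placeholder variable.
const result = await navigator.llm.request({
  action: 'summarize',
  input: articleText,
  maxLength: 200,
  style: 'bullet-points'
});

console.log(result.content);            // generated summary
console.log(result.usage.outputTokens); // tokens produced
console.log(result.metadata.provider);  // provider that handled the request
```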
Translate

```ts
{
  action: 'translate',
  input: string,
  targetLanguage: string,
  formal?: boolean,
  context?: string
}
```

Response:
```ts
{
  content: string,
  usage: Usage,
  metadata: Metadata
}
```
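A usage sketch; `phrase` is a placeholder, and the spec only requires `targetLanguage` to be a string:

```js
// Usage sketch: translate text with a formal register.
const result = await navigator.llm.request({
  action: 'translate',
  input: phrase,              // placeholder variable
  targetLanguage: 'German',   // any string the spec accepts
  formal: true
});

console.log(result.content);  // translated text
```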
Extract

```ts
{
  action: 'extract',
  input: string,
  schema: Schema
}
```

Schema format:
```ts
type Schema = {
  [key: string]: 'string' | 'number' | 'boolean' | 'string[]' | 'number[]' | Schema
}
```

Response:
```ts
{
  data: any,       // Matches schema structure
  usage: Usage,
  metadata: Metadata
}
```
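A usage sketch that pulls structured fields out of page text; the field names in the schema are illustrative:

```js
// Usage sketch: extract structured data using the documented schema types.
const result = await navigator.llm.request({
  action: 'extract',
  input: document.body.innerText,
  schema: {
    title: 'string',
    price: 'number',
    inStock: 'boolean',
    tags: 'string[]'
  }
});

console.log(result.data); // e.g. { title: '…', price: 19.99, inStock: true, tags: […] }
```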
Classify

```ts
{
  action: 'classify',
  input: string,
  categories: string[]
}
```

Response:
```ts
{
  data: {
    category: string,
    confidence: number
  },
  usage: Usage,
  metadata: Metadata
}
```
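A usage sketch for routing text into caller-supplied buckets; `feedbackText` and the category labels are placeholders:

```js
// Usage sketch: classify text against a fixed category list.
const result = await navigator.llm.request({
  action: 'classify',
  input: feedbackText,
  categories: ['bug report', 'feature request', 'praise', 'other']
});

console.log(result.data.category);   // e.g. 'feature request'
console.log(result.data.confidence); // e.g. 0.92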
Answer

```ts
{
  action: 'answer',
  question: string,
  context: string,
  concise?: boolean
}
```

Response:
```ts
{
  content: string,
  usage: Usage,
  metadata: Metadata
}
```
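A usage sketch that answers a question against on-page text:

```js
// Usage sketch: question answering over page content as context.
const result = await navigator.llm.request({
  action: 'answer',
  question: 'What is the return policy?',
  context: document.body.innerText,
  concise: true
});

console.log(result.content);
```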
Generate

```ts
{
  action: 'generate',
  prompt: string | Message[],
  systemPrompt?: string,
  maxTokens?: number,
  temperature?: number,
  stopSequences?: string[]
}
```

Message format:
```ts
interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}
```

Response:
```ts
{
  content: string,
  usage: Usage,
  metadata: Metadata
}
```
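A usage sketch that passes the prompt as a Message[] alongside a system prompt and sampling controls:

```js
// Usage sketch: open-ended generation with the documented parameters.
const result = await navigator.llm.request({
  action: 'generate',
  systemPrompt: 'You are a concise writing assistant.',
  prompt: [
    { role: 'user', content: 'Draft a two-sentence product announcement.' }
  ],
  maxTokens: 200,
  temperature: 0.7
});

console.log(result.content);
```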
Streaming API

For streaming responses, use the stream flag:
```ts
{
  action: 'generate',
  prompt: string,
  stream: true
}
```

Returns a ReadableStream:
```ts
interface StreamChunk {
  type: 'content' | 'done' | 'error';
  content?: string;
  error?: Error;
  usage?: Usage;
  metadata?: Metadata;
}
```

Example:
```js
const response = await navigator.llm.request({
  action: 'generate',
  prompt: 'Write a story',
  stream: true
});

const reader = response.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  const chunk = JSON.parse(new TextDecoder().decode(value));
  if (chunk.type === 'content') {
    console.log(chunk.content);
  }
}
```

Future Native API
Once standardized, the API is expected to look like this:
navigator.llm
```ts
interface NavigatorLLM {
  getProvider(name: string): Promise<LLMProvider>;
  listProviders(): Promise<ProviderInfo[]>;
}
```

LLMProvider
```ts
interface LLMProvider {
  readonly name: string;
  readonly version: string;
  readonly capabilities: ModelCapabilities;

  createSession(config: SessionConfig): Promise<LLMSession>;
  listModels(): Promise<ModelInfo[]>;
}
```

LLMSession
```ts
interface LLMSession {
  generate(request: GenerateRequest): Promise<GenerateResponse>;
  stream(request: GenerateRequest): Promise<ReadableStream<StreamChunk>>;
  abort(): void;
  close(): void;
}
```

ModelCapabilities
```ts
interface ModelCapabilities {
  maxTokens: number;
  supportedModalities: ('text' | 'image' | 'audio')[];
  supportsStreaming: boolean;
  supportsTools: boolean;
  contextWindow: number;
  pricing?: ModelPricing;
}
```

ModelInfo
```ts
interface ModelInfo {
  id: string;
  provider: string;
  name: string;
  description?: string;
  capabilities: ModelCapabilities;
  location: 'cloud' | 'local' | 'hybrid';
}
```
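A sketch of how a page might use this future shape. The provider name and the SessionConfig fields are assumptions, since neither is pinned down here:

```js
// Sketch only: this interface is not yet standardized or implemented.
const provider = await navigator.llm.getProvider('local');   // provider name is illustrative
const models = await provider.listModels();

const session = await provider.createSession({ model: models[0].id }); // SessionConfig fields assumed
const result = await session.generate({ prompt: 'Summarize this page.' });

console.log(result.content);
session.close();
```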
Error Codes

All errors include a code property:
Extension Errors
- EXTENSION_NOT_INSTALLED - Extension not detected
- EXTENSION_DISABLED - Extension installed but disabled
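A minimal sketch of guarding against these codes, assuming navigator.llm is simply absent when the extension is not installed; the helper and `text` variable are hypothetical:

```js
// Sketch: feature-detect the extension before calling it.
if (!('llm' in navigator)) {
  showFallbackUI(); // hypothetical helper: extension not installed
} else {
  try {
    await navigator.llm.request({ action: 'summarize', input: text }); // text is a placeholder
  } catch (error) {
    if (error.code === 'EXTENSION_DISABLED') {
      // Prompt the user to re-enable the extension.
    }
  }
}
```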
Permission Errors
- PERMISSION_DENIED - User denied permission
- PERMISSION_REQUIRED - Permission needs to be requested
- ORIGIN_BLOCKED - Origin is blocked in settings
Provider Errors
- NO_PROVIDERS - No providers configured
- ALL_PROVIDERS_FAILED - All providers failed
- PROVIDER_ERROR - Specific provider error
Request Errors
- INVALID_REQUEST - Malformed request
- INVALID_SCHEMA - Invalid extraction schema
- INVALID_ACTION - Unknown action type
Rate Limiting
- RATE_LIMITED - Too many requests
- QUOTA_EXCEEDED - Usage quota exceeded
Timeout
- TIMEOUT - Request timeout exceeded
- ABORTED - Request aborted by user
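One way callers might handle the transient codes above (RATE_LIMITED, TIMEOUT) is with retries; this helper and its backoff policy are illustrative, not part of the spec:

```js
// Sketch: retry transient failures with exponential backoff.
async function requestWithRetry(config, maxAttempts = 3) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await navigator.llm.request(config);
    } catch (error) {
      const transient = error.code === 'RATE_LIMITED' || error.code === 'TIMEOUT';
      if (!transient || attempt === maxAttempts) throw error;
      await new Promise(resolve => setTimeout(resolve, 1000 * 2 ** (attempt - 1))); // 1s, 2s, 4s…
    }
  }
}
```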
Error Format
```ts
interface WebLLMError extends Error {
  name: 'WebLLMError';
  code: string;
  message: string;
  details?: any;
  provider?: string;
}
```

Example:
```js
try {
  await navigator.llm.request({ action: 'summarize', input: text });
} catch (error) {
  console.error('Error code:', error.code);
  console.error('Message:', error.message);
  console.error('Details:', error.details);
}
```

Permission API
Check Permission
```js
const hasPermission = await navigator.llm.checkPermission();
// Returns: boolean
```

Request Permission
Permission is requested automatically on first use, or can be requested manually:
```js
const granted = await navigator.llm.requestPermission({
  reason: 'To summarize articles for you'
});
// Returns: boolean
```

Capability Detection
Check what’s available:
```js
const capabilities = await navigator.llm.getCapabilities();

// {
//   actions: ['summarize', 'translate', 'extract', 'classify', 'answer', 'generate'],
//   streaming: true,
//   providers: ['local', 'anthropic', 'openai'],
//   version: '1.0.0'
// }
```
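A sketch of using the reported capability list to degrade gracefully; the UI helpers are hypothetical:

```js
// Sketch: gate a feature on the actions the extension reports.
const capabilities = await navigator.llm.getCapabilities();

if (capabilities.actions.includes('translate')) {
  enableTranslateButton(); // hypothetical helper
} else {
  hideTranslateButton();   // hypothetical helper
}
```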
Usage Tracking

Get usage statistics:
```js
const stats = await navigator.llm.getUsageStats();

// {
//   today: {
//     requests: 42,
//     inputTokens: 15000,
//     outputTokens: 3000,
//     cost: 0.05
//   },
//   thisWeek: { ... },
//   thisMonth: { ... }
// }
```

WebIDL Specification
Future standard interface definition:
```
[Exposed=Window]
interface NavigatorLLM {
  Promise<LLMProvider> getProvider(DOMString name);
  Promise<sequence<ProviderInfo>> listProviders();
};

[Exposed=Window]
interface LLMProvider {
  readonly attribute DOMString name;
  readonly attribute DOMString version;
  readonly attribute ModelCapabilities capabilities;

  Promise<LLMSession> createSession(SessionConfig config);
  Promise<sequence<ModelInfo>> listModels();
};

[Exposed=Window]
interface LLMSession {
  Promise<GenerateResponse> generate(GenerateRequest request);
  Promise<ReadableStream> stream(GenerateRequest request);
  undefined abort();
  undefined close();
};

dictionary GenerateRequest {
  DOMString prompt;
  DOMString? systemPrompt;
  unsigned long? maxTokens;
  double? temperature;
  sequence<DOMString>? stopSequences;
};

dictionary GenerateResponse {
  DOMString content;
  Usage usage;
  Metadata metadata;
};
```

Versioning
The API follows semantic versioning:
- Major version - Breaking changes
- Minor version - New features (backward compatible)
- Patch version - Bug fixes
Current version: 1.0.0 (WIP)
Check version:
```js
const version = await navigator.llm.getVersion();
// Returns: "1.0.0"
```

Next Steps
- See Client Library API for the high-level SDK
- Read Extension Architecture for implementation details
- Check Getting Started for usage examples