
Getting Started for Users

Welcome! WebLLM lets you use AI on websites through a provider you control - whether that’s a free local model running on your own computer, or premium models like Claude or GPT-4 accessed with your own API keys.

Your data stays under your control. Use local models that never send data anywhere, or use your own API keys directly.

One AI subscription works across all WebLLM-enabled websites. No more paying per app.

Already paying for Claude Pro or ChatGPT Plus? Use those premium models on any WebLLM-enabled website.

Run AI completely free on your own computer. No subscriptions needed.

  1. Install WebLLM for Chrome (Chrome Web Store link)
  2. Click “Add to Chrome”
  3. Click “Add Extension” when prompted
  4. Pin the extension - click the puzzle icon in the Chrome toolbar, then click the pin next to WebLLM

Extension installed!

When you first click the WebLLM icon, you’ll see the setup wizard. You have three options:

Option 1: Free Local Model (Recommended for Getting Started)
  • Best for: Privacy, offline use, no costs
  • Setup time: 5-10 minutes (one-time download)
  • Capabilities: Good for summarization, translation, basic Q&A

Click “Download Local Model” and select Llama 3.2 1B (1.2GB download). This runs entirely on your computer.

Option 2: Use Your Own API Key (Cloud Providers)

  • Best for: Premium capabilities when you already have an API key
  • Setup time: 2 minutes
  • Capabilities: State-of-the-art AI (Claude, GPT-4, etc.)

Click “Add API Provider” and choose your provider (for example, Anthropic for Claude or OpenAI for GPT-4).

Paste your API key and save.

Option 3: Use Multiple Providers (Advanced)

  • Set up both local and cloud providers
  • WebLLM automatically chooses the best one for each task
  • Local for quick tasks, cloud for complex ones
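
If it helps to picture it, a multi-provider setup simply pairs each provider with the kind of work it handles best. The sketch below is illustrative only - the field names are made up, and the extension’s settings screen manages all of this for you:

```ts
// Illustrative sketch only: these names are hypothetical, not the extension's real schema.
// The idea is that each provider is tagged with the kinds of tasks it should handle.
const providers = [
  { name: "Llama 3.2 1B (local)", kind: "local", goodFor: ["summaries", "translation", "basic Q&A"] },
  { name: "Claude (your API key)", kind: "cloud", goodFor: ["long documents", "complex reasoning"] },
];
```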

Provider configured!

Visit a WebLLM-enabled demo site to test it:

  1. Go to webllm.org/demo
  2. Click “Summarize this article”
  3. WebLLM will ask for permission (first time only)
  4. Click “Allow”
  5. Watch the AI summarize the article!
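
Behind the scenes, the page asks the extension to do the work, and the extension routes the request to whichever provider you configured - which is why the site never needs your API key. The sketch below shows roughly what that looks like from the page’s side; the `webllm` global and its method names are assumptions for illustration, not a documented API:

```ts
// Illustrative only: what a WebLLM-enabled page might do behind its
// "Summarize this article" button. The `webllm` global and its methods are
// hypothetical names, not the extension's documented API.
const webllm = (window as any).webllm as
  | { requestPermission(): Promise<boolean>; generate(prompt: string): Promise<string> }
  | undefined;

async function summarizeArticle(articleText: string): Promise<string | null> {
  if (!webllm) return null;                              // extension not installed
  if (!(await webllm.requestPermission())) return null;  // the "Allow" prompt you see
  return webllm.generate(`Summarize this article:\n\n${articleText}`);
}
```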

You’re all set!

Now that you’re set up, your chosen provider will work on any WebLLM-enabled website you visit.

Frequently Asked Questions

How do I know if a website supports WebLLM?


Look for the “Powered by WebLLM” badge; the WebLLM icon in your toolbar also lights up when you visit a supported site.

Will it slow down my computer?

Local models use about as much power as watching a YouTube video. Cloud models don’t use your computer at all - they run on the provider’s servers.

Is my data private?

Yes! With local models, your data never leaves your computer. With cloud providers, your data goes directly from the extension to the provider using your API key - the website never sees it.

Can I use WebLLM without paying for anything?

Absolutely! Local models are completely free and don’t require any API keys or subscriptions.

  • WebLLM Extension: Free forever
  • Local models: Free forever
  • Cloud providers: You pay the provider directly (typically $0.01-0.10 per request, depending on model)
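
To get a feel for what that means in practice, here’s a back-of-the-envelope estimate. The usage and per-request price below are assumptions picked from the range above, not actual prices:

```ts
// Rough cost estimate using illustrative numbers from the range quoted above.
const requestsPerMonth = 200;   // assumed usage
const costPerRequest = 0.03;    // assumed mid-range price per request, in USD
const monthlyCost = requestsPerMonth * costPerRequest;
console.log(`Roughly $${monthlyCost.toFixed(2)} per month`); // Roughly $6.00 per month
```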

Can I change providers later?

Yes! You can add, remove, or change providers anytime from the extension settings.


Ready to experience AI on your terms? Install WebLLM and take control of your AI experience!