April 16, 2026 • Versions: 2026.4.5 - 2026.4.8

Unknown memory embedding provider: ollama — Regression in v2026.4.x

OpenClaw fails to initialize memory embeddings when configured with the ollama provider, throwing an 'Unknown memory embedding provider: ollama' error due to a provider registry regression.

πŸ” Symptoms

The OpenClaw memory subsystem fails to initialize when using ollama as the embedding provider. The error manifests identically across CLI and gateway runtime contexts.

  • CLI Execution Failure:
$ openclaw memory status

🦞 OpenClaw 2026.4.8 (9ece252) — iMessage green bubble energy, but for everyone.

[openclaw] Failed to start CLI: Error: Unknown memory embedding provider: ollama
    at getAdapter (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:317:22)
    at createEmbeddingProvider (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:369:25)
    at MemoryIndexManager.loadProviderResult (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2706:16)
    at file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2811:52
    at MemoryIndexManager.ensureProviderInitialized (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2819:5)
    at MemoryIndexManager.probeVectorAvailability (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:3168:14)
    at Object.run (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/cli.runtime-CN0Ckcb_.js:325:25)
  • Gateway Process Failure:
23:28:54+00:00 error Memory index failed (main): Unknown memory embedding provider: ollama
  • Diagnostic Indicators:
      • Installations with identical configuration worked in versions prior to 2026.4.5
      • Both the CLI memory status and memory index commands fail
      • Scheduled gateway memory index tasks fail, with the error surfacing only in logs
      • Other non-memory ollama operations (chat completions) continue to function normally

🧠 Root Cause

The stack trace reveals a provider registry lookup failure in the memory embedding initialization chain. The error originates from:

getAdapter (manager-BRmgtjii.js:317:22)
createEmbeddingProvider (manager-BRmgtjii.js:369:25)

Technical Analysis

  1. Provider Registry Isolation: In the affected versions, the embedding provider registry uses a separate lookup table from the LLM provider registry. The string "ollama" is valid for chat completions but was not registered in the embedding-specific adapter map.
  2. Silent Provider Mapping Regression: Prior to v2026.4.5, the memory subsystem performed implicit provider normalization, mapping the runtime provider name to the correct embedding adapter. That normalization was removed or refactored in v2026.4.5, breaking the mapping.
  3. Code Path Analysis:
MemoryIndexManager.probeVectorAvailability()
  → ensureProviderInitialized()
    → loadProviderResult()         // Line 2706
      → createEmbeddingProvider()  // Line 369 — FAILS HERE
        → getAdapter()             // Line 317 — Unknown provider lookup

The getAdapter() function performs a direct map lookup against ADAPTER_REGISTRY. Since "ollama" was never added to this registry (or was removed during refactoring), the lookup returns undefined, triggering the error.
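The failing lookup can be pictured with a minimal sketch. Bash stands in for the actual JavaScript here, and the registry keys are assumptions for illustration, not OpenClaw's real adapter map:

```shell
# Hypothetical mirror of the embedding adapter registry; "ollama" was
# registered for chat completions but never added to this map.
declare -A ADAPTER_REGISTRY=( [openai]=1 [openai-compat]=1 )

provider="ollama"
if [ -z "${ADAPTER_REGISTRY[$provider]:-}" ]; then
  # Mirrors the error string thrown by getAdapter()
  echo "Unknown memory embedding provider: $provider"
fi
```

Run as-is, the sketch prints the same error string the CLI reports, because the lookup for "ollama" comes back empty.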

Affected Configuration Patterns

Users with the following configuration patterns are affected:

# openclaw.yaml (an equivalent openclaw.json config is affected the same way)
memory:
  embedding:
    provider: ollama
    model: nomic-embed-text:latest
  # ...

The memory subsystem expects a provider that exists in the embedding adapter registry, but ollama was registered only in the LLM adapter registry, not the embedding-specific one.

πŸ› οΈ Step-by-Step Fix

Option A: Use OpenAI-Compat Provider (Recommended)

Ollama's API is OpenAI-compatible. Configure memory embeddings to use the openai-compat provider pointing to your local ollama endpoint:

Before (broken):

memory:
  embedding:
    provider: ollama
    model: nomic-embed-text:latest

After (fixed):

memory:
  embedding:
    provider: openai-compat
    model: nomic-embed-text:latest
    apiKey: not-required
    baseURL: http://localhost:11434/v1

Option B: Use Ollama Provider with Full Path

If your installation includes the ollama plugin, specify the full provider path:

memory:
  embedding:
    provider: @openclaw/provider-ollama
    model: nomic-embed-text:latest

Option C: Environment Variable Override

Override the provider at runtime without modifying configuration files:

$ OPENCLAW_MEMORY_EMBEDDING_PROVIDER=openai-compat \
  OPENCLAW_MEMORY_EMBEDDING_BASE_URL=http://localhost:11434/v1 \
  openclaw memory status
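To persist the Option C override for the gateway service, a systemd drop-in can carry the same variables. This is a sketch: the unit name openclaw-gateway comes from the Verification section below, and the drop-in path assumes a standard systemd layout:

```ini
# /etc/systemd/system/openclaw-gateway.service.d/embedding.conf
[Service]
Environment=OPENCLAW_MEMORY_EMBEDDING_PROVIDER=openai-compat
Environment=OPENCLAW_MEMORY_EMBEDDING_BASE_URL=http://localhost:11434/v1
```

Apply it with systemctl daemon-reload followed by a restart of the service.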

Configuration Verification

Ensure your complete memory configuration matches this structure:

memory:
  enabled: true
  embedding:
    provider: openai-compat
    model: nomic-embed-text:latest
    apiKey: not-required
    baseURL: http://localhost:11434/v1
    dimensions: 768  # Required for nomic-embed-text
  index:
    enabled: true
    chunkSize: 512
    overlap: 64

🧪 Verification

Step 1: Verify Provider Connectivity

$ curl -s http://localhost:11434/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model": "nomic-embed-text:latest", "input": "test"}' \
  | jq '.data[0].embedding[:5]'

Expected output:

[
  -0.02187623,
  0.04312847,
  -0.05898431,
  0.02345612,
  -0.03456789
]
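Beyond eyeballing the first few values, it is worth confirming the vector length (768 for nomic-embed-text). Against the live endpoint, a jq filter such as '.data[0].embedding | length' does this; the offline sketch below applies the same check to a canned three-value sample so the logic is visible without a running server:

```shell
# Canned sample response; a real nomic-embed-text reply carries 768 values
resp='{"data":[{"embedding":[-0.021,0.043,-0.058]}]}'

# Pull out the array body and count its comma-separated elements
dims=$(printf '%s' "$resp" |
  sed 's/.*"embedding":\[\([^]]*\)\].*/\1/' |
  awk -F, '{print NF}')

echo "dims=$dims"   # prints: dims=3
```

Against the live endpoint, the same count should come out as 768.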

Step 2: Test CLI Memory Status

$ openclaw memory status

🦞 OpenClaw 2026.4.8 (9ece252) — iMessage green bubble energy, but for everyone.

✅ Memory System Ready
   Provider: openai-compat
   Model: nomic-embed-text:latest
   Dimensions: 768
   Status: Connected

Expected: Exit code 0 with no errors.

Step 3: Verify Vector Availability

$ openclaw memory index --dry-run

✅ Embedding provider verified
   Vector dimensions: 768
   Average latency: 45ms

Step 4: Confirm Gateway Operation

Restart the gateway and check logs for successful memory initialization:

$ systemctl restart openclaw-gateway
$ journalctl -u openclaw-gateway -n 20 --no-pager | grep -i memory

Expected output should contain no errors and confirm provider initialization.
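Step 4 can be scripted. The helper below is illustrative (the function name is not part of OpenClaw); it fails whenever the known error string appears in whatever log text it is given:

```shell
# Returns 0 (success) when the supplied log text is free of the regression error
check_memory_init() {
  ! printf '%s\n' "$1" | grep -q 'Unknown memory embedding provider'
}

# Healthy log text passes the check
check_memory_init "memory index complete (main): 128 chunks" \
  && echo "memory init OK"   # prints: memory init OK
```

Feed it real logs with check_memory_init "$(journalctl -u openclaw-gateway -n 200 --no-pager)".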

⚠️ Common Pitfalls

  • Case Sensitivity: The provider name is case-sensitive. Use openai-compat (lowercase, hyphenated), not openaiCompat or OpenAI-Compat.
  • Port Conflicts: Ensure port 11434 is not in use by another service. Verify with ss -tlnp | grep 11434.
  • API Key Mismatch: Ollama does not require API keys. Using sk-... placeholders may cause authentication failures. Use apiKey: not-required or apiKey: "".
  • Model Name Mismatch: Ensure the model name matches exactly what ollama reports. Run curl http://localhost:11434/api/tags to list available models and verify the exact string.
  • Docker Networking: If OpenClaw runs in Docker and ollama on the host, use host.docker.internal (macOS/Windows) or the docker0 bridge address, typically 172.17.0.1 (Linux), instead of localhost.
  • Dimension Mismatch: nomic-embed-text outputs 768-dimensional vectors. If your vector database expects a different dimension count, specify dimensions: 768 explicitly.
  • Plugin Loading: If using Option B, ensure the plugin is installed: npm list @openclaw/provider-ollama. Install via npm install -g @openclaw/provider-ollama if missing.
  • Config File Location: Ensure configuration is in the correct location: ~/.openclaw/config.yaml or /etc/openclaw/config.yaml. Run openclaw config show to verify loaded configuration.

Related Error Codes

  • E_PROVIDER_NOT_FOUND — Generic provider resolution failure; often indicates a missing plugin or a typo in the provider name.
  • E_EMBEDDING_INIT_FAILED — The embedding provider initialized but failed during the first embedding generation (authentication, network, model not found).
  • E_VECTOR_DIMENSION_MISMATCH — Vector dimensions returned by the provider don't match the database schema expectations.

Related Reports

  • Issue #62282 — Prior report of the same error, allegedly fixed in 2026.4.8; users report the regression persists in both 2026.4.5 and 2026.4.8.
  • Unknown memory embedding provider: azure-openai — The same pattern affects Azure OpenAI embeddings; the workaround (openai-compat with the Azure endpoint) applies.
  • Unknown memory embedding provider: bedrock — AWS Bedrock embeddings are affected by the same registry isolation issue.
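The port-conflict pitfall above can also be checked without ss. A small helper, assuming bash's /dev/tcp support and coreutils timeout, reports whether anything is listening:

```shell
# Exit 0 when a TCP connection to host:port succeeds within one second
port_open() { timeout 1 bash -c ">/dev/tcp/$1/$2" 2>/dev/null; }

if port_open localhost 11434; then
  echo "something is listening on 11434 (likely ollama)"
else
  echo "nothing listening on 11434; start ollama first"
fi
```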

Evidence & Sources

This troubleshooting guide was automatically synthesized by the FixClaw Intelligence Pipeline from community discussions.