Unknown memory embedding provider: ollama – Regression in v2026.4.x
OpenClaw fails to initialize memory embeddings when configured with the ollama provider, throwing an 'Unknown memory embedding provider: ollama' error due to a provider registry regression.
🔍 Symptoms
The OpenClaw memory subsystem fails to initialize when using ollama as the embedding provider. The error manifests identically across CLI and gateway runtime contexts.
- CLI Execution Failure:

```
$ openclaw memory status
🦞 OpenClaw 2026.4.8 (9ece252) – iMessage green bubble energy, but for everyone.
[openclaw] Failed to start CLI: Error: Unknown memory embedding provider: ollama
    at getAdapter (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:317:22)
    at createEmbeddingProvider (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:369:25)
    at MemoryIndexManager.loadProviderResult (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2706:16)
    at file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2811:52
    at MemoryIndexManager.ensureProviderInitialized (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:2819:5)
    at MemoryIndexManager.probeVectorAvailability (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/manager-BRmgtjii.js:3168:14)
    at Object.run (file:///home/openclaw/.npm-global/lib/node_modules/openclaw/dist/cli.runtime-CN0Ckcb_.js:325:25)
```

- Gateway Process Failure:

```
23:28:54+00:00 error Memory index failed (main): Unknown memory embedding provider: ollama
```

- Diagnostic Indicators:
  - Installations with identical configuration worked in versions prior to 2026.4.5.
  - Both the CLI `memory status` and `memory index` commands fail.
  - Scheduled gateway memory index tasks fail quietly, surfacing only the error log line.
  - Non-memory ollama operations (chat completions) continue to function normally.
🔧 Root Cause
The stack trace reveals a provider registry lookup failure in the memory embedding initialization chain. The error originates from:
```
getAdapter (manager-BRmgtjii.js:317:22)
createEmbeddingProvider (manager-BRmgtjii.js:369:25)
```

Technical Analysis
- Provider Registry Isolation: In the affected versions, the embedding provider registry uses a separate lookup table from the LLM provider registry. The string `"ollama"` is valid for chat completions but was not registered in the embedding-specific adapter map.
- Silent Provider Mapping Regression: Prior to v2026.4.5, the memory subsystem performed implicit provider normalization, mapping the runtime provider name to the correct embedding adapter. This normalization was removed or refactored in the regression, breaking the connection.
- Code Path Analysis:

```
MemoryIndexManager.probeVectorAvailability()
  → ensureProviderInitialized()
    → loadProviderResult()          // Line 2706
      → createEmbeddingProvider()   // Line 369 – FAILS HERE
        → getAdapter()              // Line 317 – unknown provider lookup
```

The `getAdapter()` function performs a direct map lookup against `ADAPTER_REGISTRY`. Since `"ollama"` was never added to this registry (or was removed during refactoring), the lookup returns `undefined`, triggering the error.
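To make the failure mode concrete, here is a minimal sketch of the registry isolation described above. The names `ADAPTER_REGISTRY` and `getAdapter` come from the stack trace; `LLM_REGISTRY` and the adapter strings are hypothetical stand-ins, and the real bundled implementation may differ.

```javascript
// Hypothetical: the chat-side registry knows "ollama"...
const LLM_REGISTRY = new Map([
  ["openai", "OpenAIChatAdapter"],
  ["ollama", "OllamaChatAdapter"], // chat completions resolve fine
]);

// ...but the embedding-side registry was never given an "ollama" entry.
const ADAPTER_REGISTRY = new Map([
  ["openai", "OpenAIEmbeddingAdapter"],
  ["openai-compat", "OpenAICompatEmbeddingAdapter"],
]);

function getAdapter(name) {
  // Direct map lookup: a missing key yields undefined, which becomes the error.
  const adapter = ADAPTER_REGISTRY.get(name);
  if (adapter === undefined) {
    throw new Error(`Unknown memory embedding provider: ${name}`);
  }
  return adapter;
}

console.log(LLM_REGISTRY.has("ollama")); // true -- chat still works
console.log(getAdapter("openai-compat")); // OpenAICompatEmbeddingAdapter
try {
  getAdapter("ollama");
} catch (err) {
  console.log(err.message); // Unknown memory embedding provider: ollama
}
```

This also explains why chat completions keep working while memory indexing fails: the two code paths consult different tables.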
Affected Configuration Patterns
Users with the following configuration patterns are affected:
```yaml
# openclaw.yaml (the same keys apply in openclaw.json)
memory:
  embedding:
    provider: ollama
    model: nomic-embed-text:latest
  # ...
```
The memory subsystem expects a provider that exists in the embedding adapter registry, but `ollama` was registered only in the LLM adapter registry, not the embedding-specific one.
🛠️ Step-by-Step Fix
Option A: Use OpenAI-Compat Provider (Recommended)
Ollama's API is OpenAI-compatible. Configure memory embeddings to use the openai-compat provider pointing at your local ollama endpoint:
Before (broken):

```yaml
memory:
  embedding:
    provider: ollama
    model: nomic-embed-text:latest
```

After (fixed):

```yaml
memory:
  embedding:
    provider: openai-compat
    model: nomic-embed-text:latest
    apiKey: not-required
    baseURL: http://localhost:11434/v1
```

Option B: Use Ollama Provider with Full Path
If your installation includes the ollama plugin, specify the full provider path:
```yaml
memory:
  embedding:
    provider: "@openclaw/provider-ollama"  # quoted: a leading @ is reserved in YAML
    model: nomic-embed-text:latest
```

Option C: Environment Variable Override
Override the provider at runtime without modifying configuration files:
```shell
$ OPENCLAW_MEMORY_EMBEDDING_PROVIDER=openai-compat \
  OPENCLAW_MEMORY_EMBEDDING_BASE_URL=http://localhost:11434/v1 \
  openclaw memory status
```

Configuration Verification
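When both the Option C environment overrides and file configuration are present, the override should win. The following sketch illustrates that resolution order; the variable names match Option C, but the exact precedence logic inside OpenClaw is an assumption here.

```javascript
// Hypothetical config resolution: environment variables beat the file.
const fileConfig = { provider: "ollama", baseURL: undefined };

function resolveEmbeddingConfig(env, config) {
  return {
    // ?? falls back to the file value only when the env var is unset
    provider: env.OPENCLAW_MEMORY_EMBEDDING_PROVIDER ?? config.provider,
    baseURL: env.OPENCLAW_MEMORY_EMBEDDING_BASE_URL ?? config.baseURL,
  };
}

const resolved = resolveEmbeddingConfig(
  {
    OPENCLAW_MEMORY_EMBEDDING_PROVIDER: "openai-compat",
    OPENCLAW_MEMORY_EMBEDDING_BASE_URL: "http://localhost:11434/v1",
  },
  fileConfig
);
console.log(resolved.provider); // openai-compat
console.log(resolved.baseURL); // http://localhost:11434/v1
```

With no overrides set, the broken `ollama` value from the file would still be used, which is why Option C only helps for the current invocation.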
Ensure your complete memory configuration matches this structure:
```yaml
memory:
  enabled: true
  embedding:
    provider: openai-compat
    model: nomic-embed-text:latest
    apiKey: not-required
    baseURL: http://localhost:11434/v1
    dimensions: 768  # required for nomic-embed-text
  index:
    enabled: true
    chunkSize: 512
    overlap: 64
```

🧪 Verification
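As an aside on the `index` settings above: with `chunkSize: 512` and `overlap: 64`, each chunk starts 448 characters after the previous one, so consecutive chunks share 64 characters. A minimal character-based illustration follows; OpenClaw's real chunker may split on tokens rather than characters, so treat this as a sketch of the arithmetic only.

```javascript
// Illustrative chunker: fixed-size windows with a fixed overlap.
function chunkText(text, chunkSize = 512, overlap = 64) {
  const step = chunkSize - overlap; // 448: distance between chunk starts
  const chunks = [];
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // final chunk reached the end
  }
  return chunks;
}

// 1000 characters -> chunks at offsets 0, 448, and 896.
const sample = Array.from({ length: 1000 }, (_, i) => String(i % 10)).join("");
const chunks = chunkText(sample);
console.log(chunks.length); // 3
console.log(chunks.map((c) => c.length)); // [ 512, 512, 104 ]
```

The shared 64-character tail/head keeps sentences that straddle a boundary retrievable from either chunk.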
Step 1: Verify Provider Connectivity
```shell
$ curl -s http://localhost:11434/v1/embeddings \
    -H "Content-Type: application/json" \
    -d '{"model": "nomic-embed-text:latest", "input": "test"}' \
    | jq '.data[0].embedding[:5]'
```

Expected output:
```json
[
  -0.02187623,
  0.04312847,
  -0.05898431,
  0.02345612,
  -0.03456789
]
```

(Exact values vary by model build; expect five small floats.)

Step 2: Test CLI Memory Status
```
$ openclaw memory status
🦞 OpenClaw 2026.4.8 (9ece252) – iMessage green bubble energy, but for everyone.
✅ Memory System Ready
Provider: openai-compat
Model: nomic-embed-text:latest
Dimensions: 768
Status: Connected
```

Expected: exit code 0 with no errors.
Step 3: Verify Vector Availability
```
$ openclaw memory index --dry-run
✅ Embedding provider verified
Vector dimensions: 768
Average latency: 45ms
```

Step 4: Confirm Gateway Operation
Restart the gateway and check logs for successful memory initialization:
```shell
$ systemctl restart openclaw-gateway
$ journalctl -u openclaw-gateway -n 20 --no-pager | grep -i memory
```

The output should contain no errors and should confirm provider initialization.
⚠️ Common Pitfalls
- Case Sensitivity: The provider name is case-sensitive. Use `openai-compat` (lowercase, hyphenated), not `openaiCompat` or `OpenAI-Compat`.
- Port Conflicts: Ensure port 11434 is not in use by another service. Verify with `ss -tlnp | grep 11434`.
- API Key Mismatch: Ollama does not require API keys. Using `sk-...` placeholders may cause authentication failures. Use `apiKey: not-required` or `apiKey: ""`.
- Model Name Mismatch: Ensure the model name matches exactly what ollama reports. Run `curl http://localhost:11434/api/tags` to list available models and verify the exact string.
- Docker Networking: If running OpenClaw in Docker and ollama on the host, use `host.docker.internal` (macOS) or `172.17.0.1` (Linux) instead of `localhost`.
- Dimension Mismatch: `nomic-embed-text` outputs 768-dimensional vectors. If your vector database expects a different dimension count, specify `dimensions: 768` explicitly.
- Plugin Loading: If using Option B, ensure the plugin is installed: `npm list -g @openclaw/provider-ollama`. Install via `npm install -g @openclaw/provider-ollama` if missing.
- Config File Location: Ensure configuration is in the correct location: `~/.openclaw/config.yaml` or `/etc/openclaw/config.yaml`. Run `openclaw config show` to verify the loaded configuration.
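The dimension-mismatch pitfall above can be caught before indexing with a simple preflight check that compares the provider's actual vector length against the configured value. This is a hypothetical helper, not an OpenClaw API.

```javascript
// Hypothetical preflight guard: fail fast on a dimension mismatch
// instead of writing incompatible vectors into the index.
function assertDimensions(embedding, expected) {
  if (embedding.length !== expected) {
    throw new Error(
      `Vector dimension mismatch: provider returned ${embedding.length}, ` +
        `config expects ${expected}`
    );
  }
  return embedding;
}

const configuredDimensions = 768; // from memory.embedding.dimensions

const vector = new Array(768).fill(0); // stand-in for a real nomic-embed-text vector
assertDimensions(vector, configuredDimensions); // passes

try {
  // e.g. a 1536-dimensional vector from a different embedding model
  assertDimensions(new Array(1536).fill(0), configuredDimensions);
} catch (err) {
  console.log(err.message); // Vector dimension mismatch: provider returned 1536, config expects 768
}
```

Running a check like this once after any provider or model change is cheaper than rebuilding a corrupted index.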
🔗 Related Errors
- `E_PROVIDER_NOT_FOUND` – Generic provider resolution failure; often indicates a missing plugin or a typo in the provider name.
- `E_EMBEDDING_INIT_FAILED` – Embedding provider initialized but failed during the first embedding generation (authentication, network, model not found).
- `E_VECTOR_DIMENSION_MISMATCH` – Vector dimensions returned by the provider don't match database schema expectations.
- Issue #62282 – Prior report of the same error, allegedly fixed in 2026.4.8. Users report the regression persists in both 2026.4.5 and 2026.4.8.
- `Unknown memory embedding provider: azure-openai` – The same pattern affects Azure OpenAI embeddings; the workaround (using openai-compat with the Azure endpoint) applies.
- `Unknown memory embedding provider: bedrock` – AWS Bedrock embeddings are affected by the same registry isolation issue.