Implements a configurable default value for the infer parameter across
memory operations, allowing users to control whether LLM processing is
enabled by default.
Configuration:
- Added default_infer field to mem0 MemoryConfig (default: True)
- Added default_infer to OpenMemory API configuration schema
- Configuration UI toggle in settings page under "Default Memory Processing Settings"
- Loaded from database configuration with proper None/False handling
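The None/False distinction above matters because a missing database value and an explicitly stored False must behave differently. A minimal sketch of that loading logic — class, field, and helper names here are assumptions for illustration, not the actual mem0 source:

```python
# Hypothetical sketch of the config described above; names are assumptions.
from dataclasses import dataclass


@dataclass
class MemoryConfig:
    # Behavior used when a call does not pass `infer` explicitly.
    default_infer: bool = True


def load_default_infer(db_config: dict) -> bool:
    """Load default_infer from stored config, distinguishing None from False."""
    value = db_config.get("default_infer")
    if value is None:
        # Key absent or stored as null: fall back to the built-in default.
        return True
    # An explicitly stored False (or True) is respected as-is.
    return bool(value)
```

A naive `db_config.get("default_infer", True)` would behave the same way here, but the explicit `is None` check makes the intent readable and survives a stored `0`/`False` correctly.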
API Changes:
- MCP add_memories: infer parameter now optional (Optional[bool])
- REST create_memory: infer parameter now optional with Field documentation
- When infer=None, applies memory_client.config.default_infer
- Updated tool descriptions to document infer parameter behavior
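The fallback described above ("when infer=None, apply the configured default") reduces to a single resolution step. A hedged sketch — `resolve_infer` is an illustrative helper, not a function from the codebase, and the configured default stands in for `memory_client.config.default_infer`:

```python
# Illustrative helper; not the actual mem0/OpenMemory implementation.
from typing import Optional


def resolve_infer(infer: Optional[bool], configured_default: bool = True) -> bool:
    """Return the effective infer mode: explicit argument beats the default."""
    # None means "caller did not choose"; True/False are explicit choices.
    return configured_default if infer is None else infer
```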
Behavior:
- infer=True: LLM extracts semantic facts and deduplicates
- infer=False: Stores exact verbatim text without transformation
- infer=None: Uses configured default_infer value (default: True)
This allows users to set their preferred default behavior while still
being able to override it on a per-call basis.
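The default-plus-override semantics can be sketched end to end. This is a self-contained toy, assuming a simplified `add_memories` signature and using string return values purely to make the chosen mode observable; the real tool's signature and return type differ:

```python
# Toy end-to-end sketch of "configured default, per-call override".
# Signature and return values are illustrative assumptions.
from typing import Optional


def add_memories(text: str, infer: Optional[bool] = None,
                 default_infer: bool = True) -> str:
    # Explicit per-call value wins; None defers to the configured default.
    effective = default_infer if infer is None else infer
    if effective:
        return "llm-processed"   # LLM extracts facts and deduplicates
    return "verbatim"            # exact text stored without transformation
```

Calling `add_memories("note")` uses the configured default, while `add_memories("note", infer=False)` forces verbatim storage regardless of configuration.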
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
```diff
@@ -61 +61 @@
-@mcp.tool(description="Add a new memory. This method is called everytime the user informs anything about themselves, their preferences, or anything that has any relevant information which can be useful in the future conversation. This can also be called when the user asks you to remember something.")
-async def add_memories(text: str) -> str:
+@mcp.tool(description="Add a new memory. This method is called everytime the user informs anything about themselves, their preferences, or anything that has any relevant information which can be useful in the future conversation. This can also be called when the user asks you to remember something. The 'infer' parameter controls processing: True (default) = LLM extracts semantic facts and deduplicates; False = stores exact verbatim text without transformation.")
```