2 changes: 1 addition & 1 deletion AGENTS.md
@@ -111,7 +111,7 @@ Hook scripts in `src/hooks/` are standalone Node.js scripts (no iii-sdk import).

## Current Stats (v0.8.9)

-- 44 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)
+- 45 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

MCP tool count inconsistent with README and plugin.json.

AGENTS.md shows 45 MCP tools, but README.md and plugin.json both show 52 MCP tools. Per the review stack context, Layer 1 should update documentation to reflect 52 tools.

📝 Suggested fix
-- 45 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)
+- 52 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)

Based on learnings: When adding or removing MCP tools, update all seven locations: tools-registry.ts, server.ts, triggers/api.ts, index.ts, test/mcp-standalone.test.ts, README.md, and plugin/.claude-plugin/plugin.json.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-- 45 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)
+- 52 MCP tools (8 visible by default, `AGENTMEMORY_TOOLS=all` for all)
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@AGENTS.md` at line 114, AGENTS.md currently lists 45 MCP tools but README.md
and plugin.json list 52; update AGENTS.md to say 52 and ensure the MCP tool
count and any related lists are synchronized across all locations
(tools-registry.ts, server.ts, triggers/api.ts, index.ts,
test/mcp-standalone.test.ts, README.md, plugin/.claude-plugin/plugin.json and
AGENTS.md) so every reference and test reflects 52 tools.

- 104 REST endpoints
- 6 MCP resources, 3 MCP prompts
- 12 hooks, 4 skills
8 changes: 4 additions & 4 deletions README.md
@@ -28,7 +28,7 @@
<p align="center">
<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-recall.svg"><img src="assets/tags/stat-recall.svg" alt="95.2% retrieval R@5" height="38" /></picture>
<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-tokens.svg"><img src="assets/tags/stat-tokens.svg" alt="92% fewer tokens" height="38" /></picture>
-<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-tools.svg"><img src="assets/tags/stat-tools.svg" alt="51 MCP tools" height="38" /></picture>
+<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-tools.svg"><img src="assets/tags/stat-tools.svg" alt="52 MCP tools" height="38" /></picture>
<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-hooks.svg"><img src="assets/tags/stat-hooks.svg" alt="12 auto hooks" height="38" /></picture>
<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-deps.svg"><img src="assets/tags/stat-deps.svg" alt="0 external DBs" height="38" /></picture>
<picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/stat-tests.svg"><img src="assets/tags/stat-tests.svg" alt="827 tests passing" height="38" /></picture>
@@ -138,7 +138,7 @@ agentmemory works with any agent that supports hooks, MCP, or REST API. All agen
<sub>AgentSDKProvider</sub>
</td>
<td align="center" width="12.5%">
-<img src="https://img.shields.io/badge/104-endpoints-1f6feb?style=flat-square" alt="REST API" width="48" /><br/>
+<img src="https://img.shields.io/badge/108-endpoints-1f6feb?style=flat-square" alt="REST API" width="48" /><br/>
<strong>Any agent</strong><br/>
<sub>REST API</sub>
</td>
@@ -338,7 +338,7 @@ Implementation details live in `src/cli.ts` (see `runUpgrade` around the `src/cl
### Claude Code (one block, paste it)

```
-Install agentmemory: run `npx @agentmemory/agentmemory` in a separate terminal to start the memory server. Then run `/plugin marketplace add rohitg00/agentmemory` and `/plugin install agentmemory` — the plugin registers all 12 hooks, 4 skills, AND auto-wires the `@agentmemory/mcp` stdio server via its `.mcp.json`, so you get 51 MCP tools (memory_smart_search, memory_save, memory_sessions, memory_governance_delete, etc.) without any extra config step. Verify with `curl http://localhost:3111/agentmemory/health`. The real-time viewer is at http://localhost:3113.
+Install agentmemory: run `npx @agentmemory/agentmemory` in a separate terminal to start the memory server. Then run `/plugin marketplace add rohitg00/agentmemory` and `/plugin install agentmemory` — the plugin registers all 12 hooks, 4 skills, AND auto-wires the `@agentmemory/mcp` stdio server via its `.mcp.json`, so you get 52 MCP tools (memory_smart_search, memory_save, memory_sessions, memory_governance_delete, etc.) without any extra config step. Verify with `curl http://localhost:3111/agentmemory/health`. The real-time viewer is at http://localhost:3113.
```

<details>
@@ -954,7 +954,7 @@ Create `~/.agentmemory/.env`:

<h2 id="api"><picture><source media="(prefers-color-scheme: dark)" srcset="assets/tags/light/section-api.svg"><img src="assets/tags/section-api.svg" alt="API" height="32" /></picture></h2>

-107 endpoints on port `3111`. The REST API binds to `127.0.0.1` by default. Protected endpoints require `Authorization: Bearer <secret>` when `AGENTMEMORY_SECRET` is set, and mesh sync endpoints require `AGENTMEMORY_SECRET` on both peers.
+108 endpoints on port `3111`. The REST API binds to `127.0.0.1` by default. Protected endpoints require `Authorization: Bearer <secret>` when `AGENTMEMORY_SECRET` is set, and mesh sync endpoints require `AGENTMEMORY_SECRET` on both peers.

<details>
<summary>Key endpoints</summary>
2 changes: 1 addition & 1 deletion plugin/.claude-plugin/plugin.json
@@ -1,7 +1,7 @@
{
"name": "agentmemory",
"version": "0.9.4",
-"description": "Persistent memory for AI coding agents -- captures tool usage, compresses via LLM, injects context into future sessions. 12 hooks, 51 MCP tools, 4 skills, real-time viewer.",
+"description": "Persistent memory for AI coding agents -- captures tool usage, compresses via LLM, injects context into future sessions. 12 hooks, 52 MCP tools, 4 skills, real-time viewer.",
"author": {
"name": "Rohit Ghumare",
"url": "https://github.com/rohitg00"
8 changes: 4 additions & 4 deletions plugin/scripts/session-end.mjs
@@ -28,7 +28,7 @@ async function main() {
method: "POST",
headers: authHeaders(),
body: JSON.stringify({ sessionId }),
-signal: AbortSignal.timeout(5e3)
+signal: AbortSignal.timeout(3e4)
});
} catch {}
if (process.env["CONSOLIDATION_ENABLED"] === "true") {
@@ -37,7 +37,7 @@ async function main() {
method: "POST",
headers: authHeaders(),
body: JSON.stringify({ olderThanDays: 0 }),
-signal: AbortSignal.timeout(15e3)
+signal: AbortSignal.timeout(6e4)
});
} catch {}
try {
@@ -48,15 +48,15 @@
tier: "all",
force: true
}),
-signal: AbortSignal.timeout(3e4)
+signal: AbortSignal.timeout(12e4)
});
} catch {}
}
if (process.env["CLAUDE_MEMORY_BRIDGE"] === "true") try {
await fetch(`${REST_URL}/agentmemory/claude-bridge/sync`, {
method: "POST",
headers: authHeaders(),
-signal: AbortSignal.timeout(5e3)
+signal: AbortSignal.timeout(3e4)
});
} catch {}
}
2 changes: 1 addition & 1 deletion plugin/scripts/stop.mjs
@@ -28,7 +28,7 @@ async function main() {
method: "POST",
headers: authHeaders(),
body: JSON.stringify({ sessionId }),
-signal: AbortSignal.timeout(3e4)
+signal: AbortSignal.timeout(12e4)
});
} catch {}
}
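For readers decoding the minified literals in these two hook scripts: `AbortSignal.timeout()` takes milliseconds, and the bundler emitted the numbers in scientific notation. A quick standalone sanity check of the before/after budgets collected from the diffs above:

```typescript
// The AbortSignal.timeout() arguments in the hook-script diffs above are
// milliseconds in scientific notation; this just converts them to seconds
// to show how much each budget was raised.
const before = [5e3, 15e3, 3e4]; // old timeouts seen in the diffs
const after = [3e4, 6e4, 12e4];  // replacement timeouts

console.log(before.map((ms) => ms / 1000)); // [5, 15, 30] seconds
console.log(after.map((ms) => ms / 1000));  // [30, 60, 120] seconds
```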
65 changes: 65 additions & 0 deletions src/functions/concept-backfill.ts
@@ -0,0 +1,65 @@
import type { ISdk } from "iii-sdk";
import type { StateKV } from "../state/kv.js";
import type { Memory } from "../types.js";
import { KV } from "../state/schema.js";
import { recordAudit } from "./audit.js";
import { logger } from "../logger.js";

const CONFIG_KEY = "concept-backfill-done";

export function registerConceptBackfillFunction(sdk: ISdk, kv: StateKV): void {
sdk.registerFunction(
"mem::concept-backfill",
async () => {
const flag = await kv.get<{ done: boolean }>(KV.config, CONFIG_KEY);
if (flag?.done) {
return { success: true, skipped: true, reason: "already completed" };
}
Comment on lines +14 to +17

⚠️ Potential issue | 🟠 Major | 🏗️ Heavy lift

Backfill can be permanently marked “done” after partial failure and can double-run concurrently.

Line 36 swallows upsert failures, Line 39 still counts them as processed, and Line 42 sets done: true anyway. Also, the check at Line 14 and final set at Line 42 are non-atomic, so concurrent invocations can both process and reinforce the same edges.

Suggested direction
-      const flag = await kv.get<{ done: boolean }>(KV.config, CONFIG_KEY);
+      const flag = await kv.get<{ done: boolean; inProgress?: boolean }>(KV.config, CONFIG_KEY);
       if (flag?.done) {
         return { success: true, skipped: true, reason: "already completed" };
       }
+      if (flag?.inProgress) {
+        return { success: true, skipped: true, reason: "already running" };
+      }
+      await kv.set(KV.config, CONFIG_KEY, { done: false, inProgress: true });

       const memories = await kv.list<Memory>(KV.memories);
@@
-      let processed = 0;
+      let processed = 0;
+      let failed = 0;
@@
-        await Promise.all(
-          batch.map((m) =>
-            sdk
-              .trigger({
-                function_id: "mem::concept-edge-upsert",
-                payload: { concepts: m.concepts },
-              })
-              .catch(() => {}),
-          ),
-        );
-        processed += batch.length;
+        const settled = await Promise.allSettled(
+          batch.map((m) =>
+            sdk.trigger({
+              function_id: "mem::concept-edge-upsert",
+              payload: { concepts: m.concepts },
+            }),
+          ),
+        );
+        for (const r of settled) {
+          if (r.status === "fulfilled") processed += 1;
+          else failed += 1;
+        }
       }
 
-      await kv.set(KV.config, CONFIG_KEY, { done: true, completedAt: new Date().toISOString() });
+      const completedAt = new Date().toISOString();
+      await kv.set(KV.config, CONFIG_KEY, {
+        done: failed === 0,
+        inProgress: false,
+        completedAt,
+        failed,
+      });

Also applies to: 29-43

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/functions/concept-backfill.ts` around lines 14 - 17: the current backfill marks CONFIG_KEY done non-atomically and counts failed upserts as processed. Make the CONFIG_KEY updates atomic and only set done when all upserts succeed: implement a distributed lock/lease or CAS on CONFIG_KEY (using kv compare-and-set / put-if-absent semantics) so the initial kv.get flag check and the transition to in-progress/done are atomic and concurrent runs are prevented. Change the loop that triggers the upsert function (which currently swallows errors) to propagate or record failures instead of incrementing the processed count on exception, and only write { done: true } to CONFIG_KEY when no upsert errors remain (or persist a retryable cursor/state rather than done=true on partial failure). Update the references to CONFIG_KEY, kv.get, kv.put/CAS, and the upsert function accordingly.
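The atomic claim the prompt asks for can be sketched as follows. This is a minimal illustration, not the project's API: `SketchKV` and `putIfAbsent` are hypothetical stand-ins for a StateKV with put-if-absent support, which the real store would have to provide.

```typescript
// Minimal sketch of the atomic claim discussed above. `SketchKV` and
// `putIfAbsent` are hypothetical: a single-threaded stand-in for a StateKV
// offering put-if-absent semantics. In a real store this primitive must be
// atomic, or two concurrent backfills can both pass the flag check.
type Flag = { done: boolean; inProgress?: boolean };

class SketchKV {
  private store = new Map<string, Flag>();
  get(key: string): Flag | undefined {
    return this.store.get(key);
  }
  // Succeeds only when no value exists for the key.
  putIfAbsent(key: string, value: Flag): boolean {
    if (this.store.has(key)) return false;
    this.store.set(key, value);
    return true;
  }
}

function tryStartBackfill(kv: SketchKV, key: string): boolean {
  const flag = kv.get(key);
  if (flag?.done || flag?.inProgress) return false; // done or already running
  // The atomic claim: only one caller can transition absent -> inProgress.
  return kv.putIfAbsent(key, { done: false, inProgress: true });
}

const kv = new SketchKV();
console.log(tryStartBackfill(kv, "concept-backfill-done")); // true: claimed
console.log(tryStartBackfill(kv, "concept-backfill-done")); // false: already held
```

A rejected caller returns without touching edges, so the "both process and reinforce the same edges" race described in the finding cannot occur.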


const memories = await kv.list<Memory>(KV.memories);
const eligible = memories.filter(
(m) => m.isLatest !== false && m.concepts && m.concepts.length >= 2,
);

let processed = 0;
let errors = 0;
const batchSize = 50;

for (let i = 0; i < eligible.length; i += batchSize) {
const batch = eligible.slice(i, i + batchSize);
const results = await Promise.allSettled(
batch.map((m) =>
sdk.trigger({
function_id: "mem::concept-edge-upsert",
payload: { concepts: m.concepts },
})
),
);
for (const res of results) {
if (res.status === "rejected") errors++;
else processed++;
}
}

if (errors > 0) {
throw new Error(`Concept backfill failed to process ${errors} items.`);
}

await kv.set(KV.config, CONFIG_KEY, { done: true, completedAt: new Date().toISOString() });

try {
await recordAudit(kv, "concept_backfill", "mem::concept-backfill", [], {
memoriesProcessed: processed,
totalMemories: memories.length,
});
} catch {}

logger.info("Concept backfill completed", {
processed,
total: eligible.length,
});
Comment thread
coderabbitai[bot] marked this conversation as resolved.

return { success: true, processed, total: memories.length };
},
);
}
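The chunked `Promise.allSettled` loop in the final version above is a small bounded-batching pattern worth isolating: process items in fixed-size slices, let `allSettled` absorb per-item failures, and tally outcomes instead of letting one rejection abort a batch. A self-contained sketch with a stubbed async worker (the failure rule is invented for the demo):

```typescript
// Replay of the batching pattern used in the backfill above: fixed-size
// chunks, Promise.allSettled per chunk, fulfilled/rejected tallied.
async function runBatched<T>(
  items: T[],
  batchSize: number,
  work: (item: T) => Promise<void>,
): Promise<{ processed: number; errors: number }> {
  let processed = 0;
  let errors = 0;
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    const results = await Promise.allSettled(batch.map(work));
    for (const res of results) {
      if (res.status === "rejected") errors++;
      else processed++;
    }
  }
  return { processed, errors };
}

// Stub worker: every third item fails, to show failures are counted, not fatal.
async function demo() {
  const result = await runBatched([...Array(10).keys()], 4, async (n) => {
    if (n % 3 === 0) throw new Error(`item ${n} failed`);
  });
  console.log(result); // items 0, 3, 6, 9 fail => { processed: 6, errors: 4 }
}
demo();
```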
84 changes: 84 additions & 0 deletions src/functions/concept-edges.ts
@@ -0,0 +1,84 @@
import type { ISdk } from "iii-sdk";
import type { StateKV } from "../state/kv.js";
import type { ConceptEdge } from "../types.js";
import { KV, fingerprintId } from "../state/schema.js";
import { recordAudit } from "./audit.js";

function reinforceEdge(edge: ConceptEdge): void {
const now = new Date().toISOString();
edge.reinforcements++;
edge.strength = Math.min(
1.0,
edge.strength + 0.1 * (1 - edge.strength),
);
edge.lastSeenAt = now;
}

function generatePairs(concepts: string[]): Array<[string, string]> {
const normalized = [...new Set(concepts.map((c) => c.toLowerCase().trim()).filter(Boolean))];
const pairs: Array<[string, string]> = [];
for (let i = 0; i < normalized.length; i++) {
for (let j = i + 1; j < normalized.length; j++) {
const [a, b] = normalized[i] < normalized[j]
? [normalized[i], normalized[j]]
: [normalized[j], normalized[i]];
pairs.push([a, b]);
}
}
return pairs;
}

export function registerConceptEdgesFunctions(sdk: ISdk, kv: StateKV): void {
sdk.registerFunction(
"mem::concept-edge-upsert",
async (data: { concepts?: string[] }) => {
if (!data.concepts || !Array.isArray(data.concepts) || data.concepts.length < 2) {
return { success: false, error: "at least 2 concepts required" };
}

const pairs = generatePairs(data.concepts);
if (pairs.length === 0) {
return { success: true, created: 0, reinforced: 0 };
}

const now = new Date().toISOString();
let created = 0;
let reinforced = 0;

const edgeOps = pairs.map(async ([from, to]) => {
const id = fingerprintId("ce", `${from}|${to}`);
const existing = await kv.get<ConceptEdge>(KV.conceptEdges, id);

if (existing) {
reinforceEdge(existing);
await kv.set(KV.conceptEdges, id, existing);
reinforced++;
} else {
const edge: ConceptEdge = {
id,
from,
to,
strength: 0.5,
reinforcements: 0,
lastSeenAt: now,
createdAt: now,
};
await kv.set(KV.conceptEdges, id, edge);
created++;
}
});

await Promise.all(edgeOps);

try {
await recordAudit(kv, "concept_edge_upsert", "mem::concept-edge-upsert", [], {
pairs: pairs.length,
created,
reinforced,
});
} catch {}

return { success: true, created, reinforced };
},
);
}
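For intuition on `reinforceEdge` above: `strength + 0.1 * (1 - strength)` closes 10% of the remaining gap to 1.0 on each reinforcement, so strength rises monotonically from the 0.5 given to new edges and never overshoots the cap. Replayed standalone:

```typescript
// Standalone replay of the reinforcement rule from reinforceEdge above.
// Each call closes 10% of the remaining gap to 1.0, so from the 0.5
// initial strength the gap shrinks by a factor of 0.9 per reinforcement.
function reinforce(strength: number): number {
  return Math.min(1.0, strength + 0.1 * (1 - strength));
}

let s = 0.5;
const trajectory: number[] = [];
for (let i = 0; i < 5; i++) {
  s = reinforce(s);
  trajectory.push(Number(s.toFixed(4)));
}
console.log(trajectory); // [0.55, 0.595, 0.6355, 0.672, 0.7048] (rounded)
```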
114 changes: 114 additions & 0 deletions src/functions/concept-graph-search.ts
@@ -0,0 +1,114 @@
import type { ISdk } from "iii-sdk";
import type { StateKV } from "../state/kv.js";
import type { ConceptEdge, Memory } from "../types.js";
import { KV } from "../state/schema.js";

const MAX_BFS_DEPTH = 2;
const MAX_NEIGHBORS_PER_NODE = 10;

function decayedStrength(edge: ConceptEdge): number {
const timestamp = new Date(edge.lastSeenAt).getTime();
if (Number.isNaN(timestamp)) return 0.05;
const daysSinceLastSeen = (Date.now() - timestamp) / (1000 * 60 * 60 * 24);
const decay = edge.strength * 0.05 * (daysSinceLastSeen / 7);
return Math.max(0.05, edge.strength - decay);
}
Comment thread
coderabbitai[bot] marked this conversation as resolved.

export function registerConceptGraphSearchFunction(sdk: ISdk, kv: StateKV): void {
sdk.registerFunction(
"mem::concept-graph-search",
async (data: { concepts: string[]; depth?: number; limit?: number }) => {
if (!data.concepts || data.concepts.length === 0) {
return { success: false, error: "concepts array is required" };
}

const depth = data.depth ?? 2;
if (!Number.isInteger(depth) || depth < 1 || depth > MAX_BFS_DEPTH) {
return {
success: false,
error: "depth_out_of_range",
message: `BFS depth must be an integer between 1 and ${MAX_BFS_DEPTH}, got ${depth}`,
};
}
Comment thread
coderabbitai[bot] marked this conversation as resolved.

const limit = Math.max(1, Math.min(data.limit ?? 20, 100));
const allEdges = await kv.list<ConceptEdge>(KV.conceptEdges);

const adjacency = new Map<string, Array<{ concept: string; strength: number }>>();
for (const edge of allEdges) {
const strength = decayedStrength(edge);
if (strength <= 0.05) continue;

if (!adjacency.has(edge.from)) adjacency.set(edge.from, []);
if (!adjacency.has(edge.to)) adjacency.set(edge.to, []);
adjacency.get(edge.from)!.push({ concept: edge.to, strength });
adjacency.get(edge.to)!.push({ concept: edge.from, strength });
}

const seedConcepts = data.concepts.map((c) => c.toLowerCase().trim());
const visited = new Set<string>();
const conceptScores = new Map<string, number>();

let frontier = new Set<string>();
for (const seed of seedConcepts) {
visited.add(seed);
conceptScores.set(seed, 1.0);
frontier.add(seed);
}

for (let d = 0; d < depth; d++) {
const nextFrontier = new Set<string>();
for (const current of frontier) {
const neighbors = adjacency.get(current) || [];
const sorted = neighbors
.filter((n) => !visited.has(n.concept))
.sort((a, b) => b.strength - a.strength)
.slice(0, MAX_NEIGHBORS_PER_NODE);

for (const neighbor of sorted) {
if (visited.has(neighbor.concept)) continue;
visited.add(neighbor.concept);

const parentScore = conceptScores.get(current) || 0;
conceptScores.set(neighbor.concept, parentScore * neighbor.strength);
nextFrontier.add(neighbor.concept);
}
}
frontier = nextFrontier;
}

const expandedConcepts = [...conceptScores.keys()];

const allMemories = await kv.list<Memory>(KV.memories);
const results: Array<{ memoryId: string; score: number; matchedConcepts: string[] }> = [];

for (const memory of allMemories) {
if (memory.isLatest === false) continue;
const memoryConcepts = (memory.concepts ?? []).map((c) => c.toLowerCase());
const matched = memoryConcepts.filter((c) => expandedConcepts.includes(c));
if (matched.length === 0) continue;

let score = 0;
for (const mc of matched) {
score += conceptScores.get(mc) || 0;
}
score = score / matched.length;

results.push({
memoryId: memory.id,
score,
matchedConcepts: matched,
});
}

results.sort((a, b) => b.score - a.score);

return {
success: true,
results: results.slice(0, limit),
expandedConcepts,
depth,
};
},
);
}
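The `decayedStrength` helper above subtracts 5% of the stored strength per week since `lastSeenAt`, floored at 0.05. Replayed standalone with a fixed idle time instead of `Date.now()` so the numbers are deterministic:

```typescript
// Replay of decayedStrength from the diff above with an explicit idle
// duration: an edge loses 5% of its stored strength per 7 idle days and
// never drops below the 0.05 floor.
function decayed(strength: number, daysIdle: number): number {
  const decay = strength * 0.05 * (daysIdle / 7);
  return Math.max(0.05, strength - decay);
}

console.log(decayed(0.8, 0));   // 0.8 -- fresh edge, no decay
console.log(decayed(0.8, 7));   // ~0.76 -- one week idle: minus 5% of 0.8
console.log(decayed(0.8, 140)); // 0.05 -- 20 weeks idle hits the floor
```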
8 changes: 8 additions & 0 deletions src/functions/remember.ts
@@ -107,6 +107,14 @@ export function registerRememberFunction(sdk: ISdk, kv: StateKV): void {
});
}

if (memory.concepts.length >= 2) {
sdk.trigger({
function_id: "mem::concept-edge-upsert",
payload: { concepts: memory.concepts },
action: TriggerAction.Void(),
}).catch(() => {});
}

logger.info("Memory saved", {
memId: memory.id,
type: memory.type,