diff --git a/.mdproof/lessons-learned.md b/.mdproof/lessons-learned.md index 0a6e3e9c..9a819d82 100644 --- a/.mdproof/lessons-learned.md +++ b/.mdproof/lessons-learned.md @@ -72,6 +72,13 @@ - **Fix**: Use `>/dev/null 2>&1` (redirect both stdout AND stderr) for cleanup commands in steps that need pure JSON output - **Runbooks affected**: extras_flatten_runbook.md +### [gotcha] chmod 0444 does not block writes when running as root + +- **Context**: `TestLog_SyncPartialStatus` used `os.Chmod(dir, 0444)` to make a target directory read-only, expecting sync to fail on that target and log `"status":"partial"` +- **Discovery**: The devcontainer runs as root. Root ignores POSIX permission bits — `chmod 0444` has no effect. The "broken" target synced successfully, so the oplog recorded `"status":"ok"` instead of `"partial"` +- **Fix**: Use a **dangling symlink** instead: `os.Symlink("/nonexistent/path", targetPath)`. This makes `os.Stat` return "not exist" (passes config validation) but `os.MkdirAll` fails because the symlink entry blocks directory creation. Works regardless of UID +- **Runbooks affected**: `tests/integration/log_test.go` (`TestLog_SyncPartialStatus`) + ### [gotcha] Full-directory mdproof runs cause inter-runbook state leakage - **Context**: Running `mdproof --report json /path/to/tests/` executes all runbooks sequentially in the same environment (same ssenv). Earlier runbooks install skills, modify config, fill trash — this state persists for later runbooks diff --git a/CHANGELOG.md b/CHANGELOG.md index c3b0b8e1..0a3b3a92 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,131 @@ # Changelog +## [0.19.0] - 2026-04-11 + +### New Features + +#### Agent Management + +Agents are now a first-class resource type alongside skills. You can install, sync, audit, and manage agent files (`.md`) across agent-capable targets (Claude, Cursor, OpenCode, Augment) with the same workflow as skills. 
+ +- **Agents source directory** — agents live in `~/.config/skillshare/agents/` (or `.skillshare/agents/` in project mode). `skillshare init` creates the directory automatically, and `agents_source` is a new config field that can be customized + ```bash + skillshare init # creates skills/ and agents/ + skillshare init -p # same for project mode + ``` + +- **Positional kind filter** — most commands accept an `agents` keyword to scope the operation to agents only. Without it, commands operate on skills (existing behavior is unchanged) + ```bash + skillshare sync agents # sync agents only + skillshare sync --all # sync skills + agents + extras + skillshare list agents # list installed agents + skillshare check agents # detect drift on agent repos + skillshare update agents # update agents + skillshare audit agents # scan agents for security issues + skillshare uninstall agents # uninstall by kind + skillshare enable foo --kind agent + skillshare disable foo --kind agent + ``` + +- **Install agents from repos** — `install` auto-detects agents in three layouts: + - `agents/` convention subdirectory + - mixed-kind repos with both `SKILL.md` and `agents/` + - pure-agent repos (root `.md` files, no `SKILL.md`) + ```bash + skillshare install github.com/team/agents # auto-detect + skillshare install github.com/team/repo --kind agent # force agent mode + skillshare install github.com/team/repo --agent cr # specific agents + ``` + Conventional files (`README.md`, `LICENSE.md`, `CHANGELOG.md`) are automatically excluded + +- **Tracked agent repos** — agents can be installed with `--track` for git-pull updates, including nested discovery. `check`, `update`, `doctor`, and `uninstall` all recognise tracked agent repos, and the `--group` / `-G` flag filters by repo group + +- **Agent sync modes** — merge (default, per-file symlink), symlink (whole directory), and copy are all supported. 
`skillshare sync agents` skips targets that don't declare an `agents:` path and prints a warning + +- **`.agentignore` support** — agents can be excluded via `.agentignore` and `.agentignore.local` using the same gitignore-style patterns as `.skillignore`. The Web UI Config page now has a dedicated `.agentignore` tab + +- **Agent audit** — `skillshare audit` scans agent files individually against the full audit rule set, with Skills/Agents tab switching in both the TUI and Web UI. Audit results carry a `kind` field so tooling can filter by resource type + +- **Agent backup and restore** — sync automatically backs up agents before applying changes, in both global and project mode. The backup TUI and trash TUI tag agents with an `[A]` badge and route restores to the correct source directory + +- **Project-mode agent support** — every agent command works in project mode with `-p`. Agents are reconciled alongside skills into `.skillshare/` + +- **JSON output for agents** — `install --json` and `update --json` now emit agent-aware payloads and apply the same audit block-threshold gate as skills. Useful for scripted agent workflows + ```bash + skillshare update agents --json --audit-threshold high + ``` + +- **Kind badges** — TUI and Web UI surface `[S]` / `[A]` badges throughout (list, diff, audit, trash, backup, detail, update, targets pages) so you can tell at a glance what kind of resource you're looking at + +#### Unified Web UI Resources + +- **`/resources` route** — the old `/skills` page is now `/resources`, with Skills and Agents tabs. Tab state persists to localStorage, and the underline tab style follows the active theme (playful mode gets wobble borders) + +- **Targets page redesign** — equal Skills and Agents sections, with a modal picker for adding targets. 
Filter Studio links include a `?kind=` param so you jump directly to the right context + +- **Update page redesign** — a new three-phase flow (selecting → updating → done) with skills/agents tabs, group-based sorting, and status cards. EventSource streaming is properly cleaned up on page change + +- **Filter Studio agent support** — agent filters can be edited via `PATCH /api/targets/:name` (`agent_include`, `agent_exclude`, `agent_mode`) and via the CLI (`targets edit --add-agent-include`, `--remove-agent-include`, `--agent-mode`, and so on). The UI Filter Studio is a single-context view driven by `?kind=skill|agent` + +- **Audit cache** — audit results are now cached with React Query and invalidated on mutation. The audit card icon colour follows the max severity, and the count no longer mixes agent totals with finding counts + +- **Collect page scope switcher** — a new segmented control lets you collect skills or agents from targets + +#### Theme System + +- **`internal/theme` package** — unified light/dark terminal palette with WCAG-AA-compliant light colours and softened dark primary. Resolution order: `NO_COLOR` > `SKILLSHARE_THEME` > OSC 11 terminal probe > dark fallback. All TUIs, list output, audit output, and plain CLI output now route through the theme + ```bash + SKILLSHARE_THEME=light skillshare list + SKILLSHARE_THEME=dark skillshare audit + ``` + `skillshare doctor` includes a theme check to help debug unreadable colours + +#### Install & TUI Polish + +- **Explicit `SKILL.md` URLs resolve to one skill** — pasting a direct `blob/.../SKILL.md` URL now installs only that skill, bypassing the orchestrator pack prompt. 
Previously, the URL would trigger the full multi-select picker even though the intent was clear + ```bash + skillshare install https://github.com/team/repo/blob/main/frontend/tdd/SKILL.md + ``` + Refs: #124 + +- **Radio checklist follows the cursor** — the single-select TUI (used for orchestrator selection, branch selection, and similar flows) now auto-selects the focused row. No more confusing empty-selection state — pressing Enter always confirms the item your cursor is on + +- **Diff TUI** — single-line items with group headers instead of the old verbose-per-item layout. Agent diffs are shown with an `[A]` badge + +- **List TUI** — entries are now grouped by tracked repo root and local top directory, with a new `k:kind` filter tag for quick agent/skill filtering inside the fuzzy filter + +#### Centralized Metadata Store + +- **`.metadata.json` replaces sidecar files and `registry.yaml`** — installation metadata is now stored in a single atomic file per source (`~/.config/skillshare/skills/.metadata.json`). This fixes long-standing issues with grouped skill collisions (e.g. two skills both named `dev` in different folders) where the old basename-keyed registry would mix them up + - **Automatic migration** — the first load after upgrade reads any existing `registry.yaml` and per-skill `.skillshare-meta.json` sidecars, merges them into `.metadata.json`, and cleans up the old files. Idempotent — safe to run repeatedly + - **Full-path keys** — lookups use the full source-relative path, so nested skills never collide + - No user action required; existing installs continue to work + +### Bug Fixes + +- **Sync extras no longer flood when `agents` target overlaps** — targets that declare an extras entry called `agents` are now skipped automatically when agent sync is active, preventing duplicate file writes +- **Nested agent discovery** — `check agents` now uses the recursive discovery engine, so agents in sub-folders (e.g. 
`demo/code-reviewer.md`) are detected correctly +- **Doctor drift count excludes disabled agents** — agents disabled via `.agentignore` no longer count toward the drift total reported by `skillshare doctor` +- **Audit card mixes counts** — the Web UI audit card no longer mixes agent counts with finding counts, and excludes `_cross-skill` from the card total (shown separately) +- **Audit scans disabled agents too** — the audit scan walks every agent file regardless of `.agentignore` state, so hidden agents still get checked +- **List TUI tab bar clipping** — the tab bar no longer gets cut off in the split-detail layout on narrow terminals +- **Sync extras indent** — removed the stray space between the checkmark and the path in `sync` extras output; summary headers are now consistent across skills, agents, and extras +- **UI skill detail agent mode** — the detail page hides the Files section for agents (single-file resources), remembers the selected tab via localStorage, and shows the correct folder-view labels +- **Sync page layout** — the stats row and ignored-skills grouping are now easier to scan +- **Button warning variant** — the shared `Button` component now supports a `warning` variant that was already referenced by several pages +- **Check progress bar pop-in** — removed the loading progress bar that caused layout shift on the Skills page +- **Tracked repo check status** — the propagated check status is now applied to every item within the repo, not just the root +- **Target name colouring in doctor** — only the status word is coloured in `doctor` target output, not the full line + +### Breaking Changes + +- **`audit --all` flag removed** — use the positional kind filter instead: + ```bash + skillshare audit # skills (default, unchanged) + skillshare audit agents # agents only + ``` + The old `--all` flag is gone because audit now runs per kind and the Web UI has dedicated tabs + ## [0.18.9] - 2026-04-07 ### New Features diff --git a/README.md b/README.md 
index 0c791333..6d2a5ba9 100644 --- a/README.md +++ b/README.md @@ -22,7 +22,7 @@

- One source of truth for AI CLI skills, rules, commands & more. Sync everywhere with one command — from personal to organization-wide.
+ One source of truth for AI CLI skills, agents, rules, commands & more. Sync everywhere with one command — from personal to organization-wide.
Codex, Claude Code, OpenClaw, OpenCode & 50+ more.

@@ -40,7 +40,7 @@

> [!NOTE] -> **Latest**: [v0.18.3](https://github.com/runkids/skillshare/releases/tag/v0.18.3) — enable/disable skills, skills sub-key config, upgrade auto-sudo. [All releases →](https://github.com/runkids/skillshare/releases) +> **Latest**: [v0.19.0](https://github.com/runkids/skillshare/releases/tag/v0.19.0) — agent management, filter studio, unified resources UI. [All releases →](https://github.com/runkids/skillshare/releases) ## Why skillshare @@ -50,6 +50,7 @@ You edit in one, forget to copy to another, and lose track of what's where. skillshare fixes this: - **One source, every agent** — sync to Claude, Cursor, Codex & 50+ more with `skillshare sync` +- **Agent management** — sync custom agents alongside skills to agent-capable targets - **More than skills** — manage rules, commands, prompts & any file-based resource with [extras](https://skillshare.runkids.cc/docs/reference/targets/configuration#extras) - **Install from anywhere** — GitHub, GitLab, Bitbucket, Azure DevOps, or any self-hosted Git - **Built-in security** — audit skills for prompt injection and data exfiltration before use @@ -68,6 +69,7 @@ skillshare fixes this: ┌─────────────────────────────────────────────────────────────┐ │ Source Directory │ │ ~/.config/skillshare/skills/ ← skills (SKILL.md) │ +│ ~/.config/skillshare/agents/ ← agents │ │ ~/.config/skillshare/extras/ ← rules, commands, etc. 
│ └─────────────────────────────────────────────────────────────┘ │ sync @@ -78,10 +80,10 @@ skillshare fixes this: └───────────┘ └───────────┘ └───────────┘ ``` -| Platform | Skills Source | Extras Source | Link Type | -|----------|---------------|---------------|-----------| -| macOS/Linux | `~/.config/skillshare/skills/` | `~/.config/skillshare/extras/` | Symlinks | -| Windows | `%AppData%\skillshare\skills\` | `%AppData%\skillshare\extras\` | NTFS Junctions (no admin required) | +| Platform | Skills Source | Agents Source | Extras Source | Link Type | +|----------|---------------|---------------|---------------|-----------| +| macOS/Linux | `~/.config/skillshare/skills/` | `~/.config/skillshare/agents/` | `~/.config/skillshare/extras/` | Symlinks | +| Windows | `%AppData%\skillshare\skills\` | `%AppData%\skillshare\agents\` | `%AppData%\skillshare\extras\` | NTFS Junctions (no admin required) | | | Imperative (install-per-command) | Declarative (skillshare) | |---|---|---| @@ -180,6 +182,13 @@ skillshare audit skillshare init -p && skillshare sync ``` +**Agents** —sync custom agents to agent-capable targets + +```bash +skillshare sync agents # sync agents only +skillshare sync --all # sync skills + agents + extras together +``` + **Extras** —manage rules, commands, prompts & more ```bash @@ -242,6 +251,8 @@ Thanks to everyone who helped shape skillshare. Curtion amdoi7 jessica-engel +AlimuratYusup +thor-shuang --- diff --git a/ai_docs/tests/agents_commands_runbook.md b/ai_docs/tests/agents_commands_runbook.md new file mode 100644 index 00000000..60c19cea --- /dev/null +++ b/ai_docs/tests/agents_commands_runbook.md @@ -0,0 +1,445 @@ +# CLI E2E Runbook: Agents Commands + +Validates all agent-related CLI commands: list, sync, status, diff, +collect, uninstall, trash, update, backup, and doctor. + +**Origin**: v0.17.0 — agents support added as a new resource kind alongside skills. 
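Agent files throughout this runbook are single Markdown documents with YAML frontmatter. The minimal shape, copied verbatim from the `tutor.md` created in step 1:

```markdown
---
name: tutor
description: A tutoring agent
---
# Tutor Agent
Helps with learning.
```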
+ +## Scope + +- Agent CRUD lifecycle (create source → sync → uninstall → trash → restore) +- Kind filter: `agents`, `--all`, default (skills-only) +- JSON output for all commands that support it +- Diff and collect workflows +- Backup and restore round-trip +- Doctor agent checks +- Update with no tracked agents (local-only) + +## Environment + +Run inside devcontainer via mdproof (no ssenv wrapper needed). +All commands use `-g` to force global mode since `/workspace/.skillshare/` triggers project mode auto-detection. + +## Steps + +### 1. Setup: init global config and create agent source files + +```bash +ss init -g --force --no-copy --all-targets --no-git --no-skill +AGENTS_DIR=~/.config/skillshare/agents +mkdir -p "$AGENTS_DIR" +cat > "$AGENTS_DIR/tutor.md" <<'EOF' +--- +name: tutor +description: A tutoring agent +--- +# Tutor Agent +Helps with learning. +EOF +cat > "$AGENTS_DIR/reviewer.md" <<'EOF' +--- +name: reviewer +description: A code review agent +--- +# Reviewer Agent +Reviews code for quality. +EOF +cat > "$AGENTS_DIR/debugger.md" <<'EOF' +--- +name: debugger +description: A debugging agent +--- +# Debugger Agent +Helps debug issues. +EOF +ls "$AGENTS_DIR" +``` + +Expected: +- exit_code: 0 +- tutor.md +- reviewer.md +- debugger.md + +### 2. List agents — shows source agents + +```bash +ss list agents --no-tui -g +``` + +Expected: +- exit_code: 0 +- tutor +- reviewer +- debugger + +### 3. List agents — JSON includes kind field + +```bash +ss list agents --json -g +``` + +Expected: +- exit_code: 0 +- jq: length == 3 +- jq: all(.[]; .kind == "agent") +- jq: [.[].name] | sort | . == ["debugger","reviewer","tutor"] + +### 4. List default — skills only, no agents + +```bash +ss list --json -g +``` + +Expected: +- exit_code: 0 +- Not tutor +- Not reviewer + +### 5. 
Sync agents — creates symlinks + +```bash +ss sync agents -g +``` + +Expected: +- exit_code: 0 +- regex: linked|synced + +Verify: + +```bash +CLAUDE_AGENTS=~/.claude/agents +test -L "$CLAUDE_AGENTS/tutor.md" && echo "tutor: symlinked" || echo "tutor: MISSING" +test -L "$CLAUDE_AGENTS/reviewer.md" && echo "reviewer: symlinked" || echo "reviewer: MISSING" +test -L "$CLAUDE_AGENTS/debugger.md" && echo "debugger: symlinked" || echo "debugger: MISSING" +``` + +Expected: +- exit_code: 0 +- tutor: symlinked +- reviewer: symlinked +- debugger: symlinked +- Not MISSING + +### 6. Sync agents — dry-run JSON shows no errors + +```bash +ss sync agents --dry-run --json -g +``` + +Expected: +- exit_code: 0 + +### 7. Sync default — does NOT sync agents to unconfigured targets + +```bash +CURSOR_AGENTS=~/.cursor/agents +rm -rf "$CURSOR_AGENTS" 2>/dev/null || true +ss sync -g +test -d "$CURSOR_AGENTS" && echo "cursor agents dir: EXISTS" || echo "cursor agents dir: not created" +``` + +Expected: +- exit_code: 0 +- cursor agents dir: not created + +### 8. Sync all — syncs both skills and agents + +```bash +ss sync --all -g +``` + +Expected: +- exit_code: 0 + +### 9. Status — shows skills and agents by default + +```bash +ss status -g +``` + +Expected: +- exit_code: 0 +- regex: [Aa]gent +- regex: [Ss]ource + +### 10. Status JSON — includes agents by default + +```bash +ss status --json -g +``` + +Expected: +- exit_code: 0 +- jq: .agents.exists == true +- jq: .agents.count == 3 + +### 12. Diff agents — no drift after sync + +```bash +ss diff agents --no-tui -g +``` + +Expected: +- exit_code: 0 + +### 13. Diff agents — JSON output + +```bash +ss diff agents --json -g +``` + +Expected: +- exit_code: 0 + +### 14. Collect agents — no local agents to collect + +```bash +ss collect agents --force -g +``` + +Expected: +- exit_code: 0 +- regex: [Nn]o local agents + +### 15. 
Collect agents — collects a local agent file + +```bash +CLAUDE_AGENTS=~/.claude/agents +mkdir -p "$CLAUDE_AGENTS" +rm -f "$CLAUDE_AGENTS/local-agent.md" +cat > "$CLAUDE_AGENTS/local-agent.md" <<'EOF' +--- +name: local-agent +description: A locally created agent +--- +# Local Agent +Created directly in target. +EOF +ss collect agents --force -g +``` + +Expected: +- exit_code: 0 +- regex: [Cc]ollected + +Verify: + +```bash +AGENTS_DIR=~/.config/skillshare/agents +test -f "$AGENTS_DIR/local-agent.md" && echo "local-agent: collected to source" || echo "local-agent: NOT IN SOURCE" +``` + +Expected: +- exit_code: 0 +- local-agent: collected to source +- Not NOT IN SOURCE + +### 16. Uninstall agents — force remove single agent + +```bash +ss uninstall agents local-agent --force -g +``` + +Expected: +- exit_code: 0 +- regex: [Rr]emov|local-agent + +### 17. Verify agent was removed by JSON uninstall (step 16) + +```bash +AGENTS_DIR=~/.config/skillshare/agents +test -f "$AGENTS_DIR/local-agent.md" && echo "FAIL: still exists" || echo "local-agent: removed" +``` + +Expected: +- exit_code: 0 +- local-agent: removed +- Not FAIL + +### 18. Trash agents — list shows uninstalled agent + +```bash +ss trash agents list --no-tui -g +``` + +Expected: +- exit_code: 0 +- local-agent + +### 19. Trash agents — restore from trash + +```bash +ss trash agents restore local-agent -g +``` + +Expected: +- exit_code: 0 +- regex: [Rr]estor + +Verify: + +```bash +AGENTS_DIR=~/.config/skillshare/agents +test -f "$AGENTS_DIR/local-agent.md" && echo "local-agent: restored" || echo "FAIL: not restored" +``` + +Expected: +- exit_code: 0 +- local-agent: restored +- Not FAIL + +### 20. 
Uninstall agents --all + +```bash +ss uninstall agents --all --force -g +``` + +Expected: +- exit_code: 0 +- regex: [Rr]emov|[Uu]ninstall + +Verify: + +```bash +AGENTS_DIR=~/.config/skillshare/agents +COUNT=$(ls "$AGENTS_DIR"/*.md 2>/dev/null | wc -l | tr -d ' ') +echo "Remaining agents: $COUNT (expected: 0)" +``` + +Expected: +- exit_code: 0 +- Remaining agents: 0 (expected: 0) + +### 21. Uninstall agents — validation errors + +```bash +ss uninstall agents -g 2>&1 || true +``` + +Expected: +- regex: name|--all|required|specify + +### 22. Sync agents after uninstall — targets cleaned + +```bash +ss sync agents -g +CLAUDE_AGENTS=~/.claude/agents +COUNT=$(ls "$CLAUDE_AGENTS"/*.md 2>/dev/null | wc -l | tr -d ' ') +echo "Remaining symlinks: $COUNT" +``` + +Expected: +- exit_code: 0 +- regex: prun|[Nn]o agents + +### 23. Update agents — no agents found + +```bash +ss update agents --all -g +``` + +Expected: +- regex: [Nn]o agents|[Nn]o project agents + +### 24. Re-create agents and test update — local only + +```bash +AGENTS_DIR=~/.config/skillshare/agents +mkdir -p "$AGENTS_DIR" +cat > "$AGENTS_DIR/helper.md" <<'EOF' +--- +name: helper +description: A helper agent +--- +# Helper +EOF +ss update agents --all -g +``` + +Expected: +- regex: local|no tracked|up to date|[Nn]o agents + +### 25. Update agents — --group not supported + +```bash +ss update agents --group mygroup -g 2>&1 || true +``` + +Expected: +- regex: not supported|--group + +### 26. Backup agents + +```bash +ss sync agents -g +ss backup agents -g +``` + +Expected: +- exit_code: 0 +- regex: [Bb]ackup|created|nothing + +### 27. Backup agents — list shows backup entries + +```bash +ss backup --list -g +``` + +Expected: +- exit_code: 0 + +### 28. Doctor — includes agent checks + +```bash +ss doctor -g +``` + +Expected: +- exit_code: 0 +- regex: [Aa]gent + +### 29. 
List all — shows both skills and agents + +```bash +ss list --all --json -g +``` + +Expected: +- exit_code: 0 +- jq: map(select(.kind == "agent")) | length > 0 + +### 30. Cleanup remaining agents + +```bash +ss uninstall agents --all --force -g 2>/dev/null || true +ss sync agents -g 2>/dev/null || true +``` + +Expected: +- exit_code: 0 + +## Pass Criteria + +- [ ] `list agents` shows only agents, not skills +- [ ] `list agents --json` includes `kind: "agent"` for all entries +- [ ] Default `list` (no kind) excludes agents +- [ ] `sync agents` creates symlinks in agent target directories +- [ ] `sync agents --dry-run` makes no changes +- [ ] Default `sync` does NOT sync agents +- [ ] `sync all` syncs both skills and agents +- [ ] `status` shows both skills and agents by default +- [ ] `status --json` includes agents section +- [ ] `diff agents` shows drift status +- [ ] `collect agents` collects local agent files to source +- [ ] `uninstall agents --force` moves agent to trash +- [ ] `uninstall agents --all --force` removes all agents +- [ ] `uninstall agents` without name or --all → validation error +- [ ] `trash agents list` shows trashed agents +- [ ] `trash agents restore ` restores agent from trash +- [ ] `update agents --all` handles no-agents and local-only cases +- [ ] `update agents --group` → not supported error +- [ ] `backup agents` creates agent backup +- [ ] `doctor` includes agent diagnostic checks +- [ ] `list all --json` returns mixed skills + agents diff --git a/ai_docs/tests/agents_include_exclude_filter_runbook.md b/ai_docs/tests/agents_include_exclude_filter_runbook.md new file mode 100644 index 00000000..7c7ab6de --- /dev/null +++ b/ai_docs/tests/agents_include_exclude_filter_runbook.md @@ -0,0 +1,190 @@ +# CLI E2E Runbook: Agent Include/Exclude Filters + +Validates `targets..agents.include` and `targets..agents.exclude` +are applied during `sync agents`, including pruning agents that were synced +before the filter was added. 
+ +## Scope + +- Global-mode agent filters under `targets..agents` +- Include filter keeps only matching agents and prunes prior non-matching links +- Exclude filter removes matching agents from the target on re-sync +- Nested agent paths are filtered by flattened target filename + +## Environment + +Run inside devcontainer via mdproof. +Each step is self-contained because mdproof setup resets config between steps. +Use `-g` to force global mode. + +## Steps + +### 1. Include filter keeps matching agents and prunes previously synced non-matches + +```bash +set -e +BASE=~/.config/skillshare +AGENTS_DIR="$BASE/agents" +SKILLS_DIR="$BASE/skills" +TARGET=~/.claude/agents + +rm -rf "$AGENTS_DIR" "$SKILLS_DIR" "$TARGET" +mkdir -p "$AGENTS_DIR" "$SKILLS_DIR" "$TARGET" + +cat > "$AGENTS_DIR/team-alpha.md" <<'EOF' +# Team Alpha +EOF +cat > "$AGENTS_DIR/team-beta.md" <<'EOF' +# Team Beta +EOF +cat > "$AGENTS_DIR/personal.md" <<'EOF' +# Personal +EOF + +cat > "$BASE/config.yaml" <<'EOF' +source: ~/.config/skillshare/skills +targets: + claude: + agents: + path: ~/.claude/agents +EOF + +ss sync agents -g >/dev/null +find "$TARGET" -maxdepth 1 -type l -printf 'before: %f\n' | sort + +cat > "$BASE/config.yaml" <<'EOF' +source: ~/.config/skillshare/skills +targets: + claude: + agents: + path: ~/.claude/agents + include: [team-*] +EOF + +ss sync agents -g >/dev/null +find "$TARGET" -maxdepth 1 \( -type l -o -type f \) -printf 'after: %f\n' | sort +test -L "$TARGET/team-alpha.md" && echo "team-alpha linked=yes" || echo "team-alpha linked=no" +test -L "$TARGET/team-beta.md" && echo "team-beta linked=yes" || echo "team-beta linked=no" +test ! -e "$TARGET/personal.md" && echo "personal present=no" || echo "personal present=yes" +``` + +Expected: +- exit_code: 0 +- before: personal.md +- before: team-alpha.md +- before: team-beta.md +- after: team-alpha.md +- after: team-beta.md +- Not after: personal.md +- team-alpha linked=yes +- team-beta linked=yes +- personal present=no + +### 2. 
Exclude filter prunes matching agents that were previously synced + +```bash +set -e +BASE=~/.config/skillshare +AGENTS_DIR="$BASE/agents" +SKILLS_DIR="$BASE/skills" +TARGET=~/.claude/agents + +rm -rf "$AGENTS_DIR" "$SKILLS_DIR" "$TARGET" +mkdir -p "$AGENTS_DIR" "$SKILLS_DIR" "$TARGET" + +cat > "$AGENTS_DIR/stable-reviewer.md" <<'EOF' +# Stable Reviewer +EOF +cat > "$AGENTS_DIR/draft-notes.md" <<'EOF' +# Draft Notes +EOF +cat > "$AGENTS_DIR/draft-checklist.md" <<'EOF' +# Draft Checklist +EOF + +cat > "$BASE/config.yaml" <<'EOF' +source: ~/.config/skillshare/skills +targets: + claude: + agents: + path: ~/.claude/agents +EOF + +ss sync agents -g >/dev/null +find "$TARGET" -maxdepth 1 -type l -printf 'before: %f\n' | sort + +cat > "$BASE/config.yaml" <<'EOF' +source: ~/.config/skillshare/skills +targets: + claude: + agents: + path: ~/.claude/agents + exclude: [draft-*] +EOF + +ss sync agents -g >/dev/null +find "$TARGET" -maxdepth 1 \( -type l -o -type f \) -printf 'after: %f\n' | sort +test -L "$TARGET/stable-reviewer.md" && echo "stable-reviewer linked=yes" || echo "stable-reviewer linked=no" +test ! -e "$TARGET/draft-notes.md" && echo "draft-notes present=no" || echo "draft-notes present=yes" +test ! -e "$TARGET/draft-checklist.md" && echo "draft-checklist present=no" || echo "draft-checklist present=yes" +``` + +Expected: +- exit_code: 0 +- before: draft-checklist.md +- before: draft-notes.md +- before: stable-reviewer.md +- after: stable-reviewer.md +- Not after: draft-notes.md +- Not after: draft-checklist.md +- stable-reviewer linked=yes +- draft-notes present=no +- draft-checklist present=no + +### 3. 
Nested agents are filtered by flattened target filename + +```bash +set -e +BASE=~/.config/skillshare +AGENTS_DIR="$BASE/agents" +SKILLS_DIR="$BASE/skills" +TARGET=~/.claude/agents + +rm -rf "$AGENTS_DIR" "$SKILLS_DIR" "$TARGET" +mkdir -p "$AGENTS_DIR/team/backend" "$AGENTS_DIR/solo" "$SKILLS_DIR" "$TARGET" + +cat > "$AGENTS_DIR/team/backend/reviewer.md" <<'EOF' +# Nested Team Reviewer +EOF +cat > "$AGENTS_DIR/solo/helper.md" <<'EOF' +# Solo Helper +EOF + +cat > "$BASE/config.yaml" <<'EOF' +source: ~/.config/skillshare/skills +targets: + claude: + agents: + path: ~/.claude/agents + include: [team__*] +EOF + +ss sync agents -g >/dev/null +find "$TARGET" -maxdepth 1 \( -type l -o -type f \) -printf 'after: %f\n' | sort + +test -L "$TARGET/team__backend__reviewer.md" && echo "team__backend__reviewer linked=yes" || echo "team__backend__reviewer linked=no" +test ! -e "$TARGET/solo__helper.md" && echo "solo__helper present=no" || echo "solo__helper present=yes" +``` + +Expected: +- exit_code: 0 +- after: team__backend__reviewer.md +- Not after: solo__helper.md +- team__backend__reviewer linked=yes +- solo__helper present=no + +## Pass Criteria + +- Include filters prune already-synced non-matching agents on the next `sync agents` +- Exclude filters remove matching agents from the target while keeping non-matching ones +- Nested agent paths are matched against flattened target filenames diff --git a/ai_docs/tests/collect_multiple_targets_runbook.md b/ai_docs/tests/collect_multiple_targets_runbook.md new file mode 100644 index 00000000..3f6daec2 --- /dev/null +++ b/ai_docs/tests/collect_multiple_targets_runbook.md @@ -0,0 +1,197 @@ +# CLI E2E Runbook: Collect Multiple Targets + +Validates that `ss collect` requires an explicit target name or `--all` when +multiple eligible targets are configured. Covers both skills and agents in +global and project modes. 
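The JSON steps assert on a structured error envelope with `jq`. A standalone sanity check of that assertion pattern (the envelope literal here is hand-written to match the expected output; real runs capture it from `skillshare collect --json`):

```shell
# Simulated error envelope; jq -e exits non-zero if the filter result is false or null
ENVELOPE='{"error":"multiple targets found; specify a target name or use --all"}'
printf '%s\n' "$ENVELOPE" |
  jq -e '.error == "multiple targets found; specify a target name or use --all"'
```

`jq -e` ties the exit code to the boolean result of the filter, which is what lets the runbook steps treat a mismatched envelope as a failure rather than just printing `false`.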
+ +## Scope + +- Global `collect` plain-text warning for skills +- Global `collect --json` error envelope for skills +- Global `collect agents` plain-text warning +- Global `collect agents --json` error envelope +- Project `collect -p` plain-text warning for skills +- Project `collect -p --json` error envelope for skills +- Project `collect -p agents` plain-text warning +- Project `collect -p agents --json` error envelope + +## Environment + +- Run inside devcontainer via mdproof +- Global steps use mdproof's default initialized HOME +- Project steps create a fresh project under `/tmp` +- JSON error steps intentionally swallow the non-zero exit code and assert on + the returned error object + +## Steps + +### Step 1: Global collect skills warns when multiple targets exist + +```bash +set -e +rm -rf /tmp/collect-multi-global-skills +mkdir -p \ + /tmp/collect-multi-global-skills/.config \ + /tmp/collect-multi-global-skills/.local/share \ + /tmp/collect-multi-global-skills/.local/state \ + /tmp/collect-multi-global-skills/.cache \ + /tmp/collect-multi-global-skills/.claude \ + /tmp/collect-multi-global-skills/.cursor +HOME=/tmp/collect-multi-global-skills /workspace/bin/skillshare init -g --force --all-targets --no-git --no-skill >/dev/null 2>&1 + +OUTPUT=$(HOME=/tmp/collect-multi-global-skills /workspace/bin/skillshare collect -g 2>&1) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- Multiple targets found. 
Specify a target name or use --all +- claude +- cursor +- universal + +### Step 2: Global collect skills --json returns an error envelope + +```bash +set -e +rm -rf /tmp/collect-multi-global-skills-json +mkdir -p \ + /tmp/collect-multi-global-skills-json/.config \ + /tmp/collect-multi-global-skills-json/.local/share \ + /tmp/collect-multi-global-skills-json/.local/state \ + /tmp/collect-multi-global-skills-json/.cache \ + /tmp/collect-multi-global-skills-json/.claude \ + /tmp/collect-multi-global-skills-json/.cursor +HOME=/tmp/collect-multi-global-skills-json /workspace/bin/skillshare init -g --force --all-targets --no-git --no-skill >/dev/null 2>&1 + +OUTPUT=$(HOME=/tmp/collect-multi-global-skills-json /workspace/bin/skillshare collect --json -g 2>&1 || true) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .error == "multiple targets found; specify a target name or use --all" + +### Step 3: Global collect agents warns when multiple agent targets exist + +```bash +set -e +rm -rf /tmp/collect-multi-global-agents +mkdir -p \ + /tmp/collect-multi-global-agents/.config \ + /tmp/collect-multi-global-agents/.local/share \ + /tmp/collect-multi-global-agents/.local/state \ + /tmp/collect-multi-global-agents/.cache \ + /tmp/collect-multi-global-agents/.claude \ + /tmp/collect-multi-global-agents/.cursor +HOME=/tmp/collect-multi-global-agents /workspace/bin/skillshare init -g --force --all-targets --no-git --no-skill >/dev/null 2>&1 + +OUTPUT=$(HOME=/tmp/collect-multi-global-agents /workspace/bin/skillshare collect agents -g 2>&1) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- Multiple targets found. 
Specify a target name or use --all +- claude +- cursor + +### Step 4: Global collect agents --json returns an error envelope + +```bash +set -e +rm -rf /tmp/collect-multi-global-agents-json +mkdir -p \ + /tmp/collect-multi-global-agents-json/.config \ + /tmp/collect-multi-global-agents-json/.local/share \ + /tmp/collect-multi-global-agents-json/.local/state \ + /tmp/collect-multi-global-agents-json/.cache \ + /tmp/collect-multi-global-agents-json/.claude \ + /tmp/collect-multi-global-agents-json/.cursor +HOME=/tmp/collect-multi-global-agents-json /workspace/bin/skillshare init -g --force --all-targets --no-git --no-skill >/dev/null 2>&1 + +OUTPUT=$(HOME=/tmp/collect-multi-global-agents-json /workspace/bin/skillshare collect agents --json -g 2>&1 || true) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .error == "multiple targets found; specify a target name or use --all" + +### Step 5: Project collect skills warns when multiple targets exist + +```bash +set -e +rm -rf /tmp/collect-multi-project-skills +mkdir -p /tmp/collect-multi-project-skills +cd /tmp/collect-multi-project-skills +ss init -p --targets claude,cursor >/dev/null 2>&1 + +OUTPUT=$(ss collect -p 2>&1) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- Multiple targets found. 
Specify a target name or use --all +- claude +- cursor + +### Step 6: Project collect skills --json returns an error envelope + +```bash +set -e +rm -rf /tmp/collect-multi-project-skills-json +mkdir -p /tmp/collect-multi-project-skills-json +cd /tmp/collect-multi-project-skills-json +ss init -p --targets claude,cursor >/dev/null 2>&1 + +OUTPUT=$(ss collect --json -p 2>&1 || true) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .error == "multiple targets found; specify a target name or use --all" + +### Step 7: Project collect agents warns when multiple agent targets exist + +```bash +set -e +rm -rf /tmp/collect-multi-project-agents +mkdir -p /tmp/collect-multi-project-agents +cd /tmp/collect-multi-project-agents +ss init -p --targets claude,cursor >/dev/null 2>&1 + +OUTPUT=$(ss collect agents -p 2>&1) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- Multiple targets found. Specify a target name or use --all +- claude +- cursor + +### Step 8: Project collect agents --json returns an error envelope + +```bash +set -e +rm -rf /tmp/collect-multi-project-agents-json +mkdir -p /tmp/collect-multi-project-agents-json +cd /tmp/collect-multi-project-agents-json +ss init -p --targets claude,cursor >/dev/null 2>&1 + +OUTPUT=$(ss collect agents --json -p 2>&1 || true) +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .error == "multiple targets found; specify a target name or use --all" + +## Pass Criteria + +- All 8 steps pass +- Plain-text collect warns instead of silently collecting from multiple targets +- JSON collect returns a structured error envelope with the exact target-selection error diff --git a/ai_docs/tests/collect_overwrite_semantics_runbook.md b/ai_docs/tests/collect_overwrite_semantics_runbook.md new file mode 100644 index 00000000..ddef10c8 --- /dev/null +++ b/ai_docs/tests/collect_overwrite_semantics_runbook.md @@ -0,0 +1,249 @@ +# CLI E2E Runbook: Collect Overwrite Semantics + +Validates source-conflict 
behavior for `ss collect`: default collect skips +existing source content, `--force` overwrites, and `--json` implies force. +Covers skills and agents in global and project modes. + +## Scope + +- Global skills skip without force +- Global skills `--json` overwrites existing source +- Global agents skip without force +- Global agents `--force --json` overwrites existing source +- Project skills skip without force +- Project skills `--force --json` overwrites existing source +- Project agents skip without force +- Project agents `--json` overwrites existing source + +## Environment + +- Run inside devcontainer via mdproof +- Global steps use a fresh mdproof HOME +- Project steps create fresh projects under `/tmp` +- Skip steps use `printf 'y\n'` to confirm the collect action, then assert the + source content remains unchanged +- Overwrite steps keep stdout as pure JSON and verify file content silently + +## Steps + +### Step 1: Global collect skills skips an existing source skill by default + +```bash +set -e +mkdir -p ~/.config/skillshare/skills/dupe-skill ~/.claude/skills/dupe-skill +cat > ~/.config/skillshare/skills/dupe-skill/SKILL.md <<'EOF' +# Source Skill +Source version survives. +EOF +cat > ~/.claude/skills/dupe-skill/SKILL.md <<'EOF' +# Target Skill +Target version should not overwrite. +EOF + +OUTPUT=$(printf 'y\n' | ss collect claude -g 2>&1) +grep -q "Source version survives." ~/.config/skillshare/skills/dupe-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- already exists in source +- 1 skipped + +### Step 2: Global collect skills --json overwrites the existing source skill + +```bash +set -e +mkdir -p ~/.config/skillshare/skills/dupe-skill ~/.claude/skills/dupe-skill +cat > ~/.config/skillshare/skills/dupe-skill/SKILL.md <<'EOF' +# Source Skill +Old source version. +EOF +cat > ~/.claude/skills/dupe-skill/SKILL.md <<'EOF' +# Target Skill +Target version wins via json. 
+EOF + +OUTPUT=$(ss collect claude --json -g) +grep -q "Target version wins via json." ~/.config/skillshare/skills/dupe-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .pulled == ["dupe-skill"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 3: Global collect agents skips an existing source agent by default + +```bash +set -e +mkdir -p ~/.config/skillshare/agents ~/.claude/agents +cat > ~/.config/skillshare/agents/dupe-agent.md <<'EOF' +# Source Agent +Source agent survives. +EOF +cat > ~/.claude/agents/dupe-agent.md <<'EOF' +# Target Agent +Target agent should not overwrite. +EOF + +OUTPUT=$(printf 'y\n' | ss collect agents claude -g 2>&1) +grep -q "Source agent survives." ~/.config/skillshare/agents/dupe-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- already exists in source +- 1 skipped + +### Step 4: Global collect agents --force --json overwrites the existing source agent + +```bash +set -e +mkdir -p ~/.config/skillshare/agents ~/.claude/agents +cat > ~/.config/skillshare/agents/dupe-agent.md <<'EOF' +# Source Agent +Old source agent. +EOF +cat > ~/.claude/agents/dupe-agent.md <<'EOF' +# Target Agent +Target agent force wins. +EOF + +OUTPUT=$(ss collect agents claude --force --json -g) +grep -q "Target agent force wins." ~/.config/skillshare/agents/dupe-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .pulled == ["dupe-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 5: Project collect skills skips an existing source skill by default + +```bash +set -e +rm -rf /tmp/collect-overwrite-project-skills-skip +mkdir -p /tmp/collect-overwrite-project-skills-skip +cd /tmp/collect-overwrite-project-skills-skip +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .skillshare/skills/dupe-skill .claude/skills/dupe-skill +cat > .skillshare/skills/dupe-skill/SKILL.md <<'EOF' +# Source Skill +Project source survives. 
+EOF +cat > .claude/skills/dupe-skill/SKILL.md <<'EOF' +# Target Skill +Project target should not overwrite. +EOF + +OUTPUT=$(printf 'y\n' | ss collect claude -p 2>&1) +grep -q "Project source survives." .skillshare/skills/dupe-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- already exists in source +- 1 skipped + +### Step 6: Project collect skills --force --json overwrites the existing source skill + +```bash +set -e +rm -rf /tmp/collect-overwrite-project-skills-force +mkdir -p /tmp/collect-overwrite-project-skills-force +cd /tmp/collect-overwrite-project-skills-force +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .skillshare/skills/dupe-skill .claude/skills/dupe-skill +cat > .skillshare/skills/dupe-skill/SKILL.md <<'EOF' +# Source Skill +Old project source. +EOF +cat > .claude/skills/dupe-skill/SKILL.md <<'EOF' +# Target Skill +Project force overwrite wins. +EOF + +OUTPUT=$(ss collect claude --force --json -p) +grep -q "Project force overwrite wins." .skillshare/skills/dupe-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .pulled == ["dupe-skill"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 7: Project collect agents skips an existing source agent by default + +```bash +set -e +rm -rf /tmp/collect-overwrite-project-agents-skip +mkdir -p /tmp/collect-overwrite-project-agents-skip +cd /tmp/collect-overwrite-project-agents-skip +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .skillshare/agents .claude/agents +cat > .skillshare/agents/dupe-agent.md <<'EOF' +# Source Agent +Project source agent survives. +EOF +cat > .claude/agents/dupe-agent.md <<'EOF' +# Target Agent +Project target agent should not overwrite. +EOF + +OUTPUT=$(printf 'y\n' | ss collect agents claude -p 2>&1) +grep -q "Project source agent survives." 
.skillshare/agents/dupe-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- already exists in source +- 1 skipped + +### Step 8: Project collect agents --json overwrites the existing source agent + +```bash +set -e +rm -rf /tmp/collect-overwrite-project-agents-json +mkdir -p /tmp/collect-overwrite-project-agents-json +cd /tmp/collect-overwrite-project-agents-json +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .skillshare/agents .claude/agents +cat > .skillshare/agents/dupe-agent.md <<'EOF' +# Source Agent +Old project source agent. +EOF +cat > .claude/agents/dupe-agent.md <<'EOF' +# Target Agent +Project json overwrite wins. +EOF + +OUTPUT=$(ss collect agents claude --json -p) +grep -q "Project json overwrite wins." .skillshare/agents/dupe-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .pulled == ["dupe-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +## Pass Criteria + +- All 8 steps pass +- Default collect preserves existing source content for both skills and agents +- Explicit `--force` and implicit force via `--json` overwrite existing source content diff --git a/ai_docs/tests/collect_skills_agents_runbook.md b/ai_docs/tests/collect_skills_agents_runbook.md new file mode 100644 index 00000000..6da4788d --- /dev/null +++ b/ai_docs/tests/collect_skills_agents_runbook.md @@ -0,0 +1,230 @@ +# CLI E2E Runbook: Collect Skills and Agents + +Validates `ss collect` for both resource kinds across global and project modes. +Each step is self-contained because mdproof setup provides a fresh initialized +environment and `/tmp` is shared across runs. 
+ +## Scope + +- Global `collect` for skills with `--dry-run --json` +- Global `collect` for skills with real writes to source +- Global `collect agents` with `--dry-run --json` +- Global `collect agents` with real writes to source +- Project `collect -p` JSON output for skills +- Project `collect -p agents --dry-run --json` does not write +- Project `collect -p agents --json` writes to `.skillshare/agents` + +## Environment + +- Run inside devcontainer via mdproof +- Global steps use `-g` and explicitly target `claude` +- Project steps create fresh projects under `/tmp` +- Assertions rely on `--json` output plus silent file-system checks + +## Steps + +### Step 1: Global collect skills dry-run previews without writing + +```bash +set -e +rm -rf ~/.claude/skills/collect-dry-skill ~/.config/skillshare/skills/collect-dry-skill +mkdir -p ~/.claude/skills/collect-dry-skill +cat > ~/.claude/skills/collect-dry-skill/SKILL.md <<'EOF' +# Collect Dry Skill +EOF + +OUTPUT=$(ss collect claude --json --dry-run -g) +test ! -e ~/.config/skillshare/skills/collect-dry-skill +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == true +- jq: .pulled == ["collect-dry-skill"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 2: Global collect skills copies a local skill to source + +```bash +set -e +rm -rf \ + ~/.claude/skills/collect-dry-skill \ + ~/.claude/skills/collect-live-skill \ + ~/.config/skillshare/skills/collect-dry-skill \ + ~/.config/skillshare/skills/collect-live-skill +mkdir -p ~/.claude/skills/collect-live-skill +cat > ~/.claude/skills/collect-live-skill/SKILL.md <<'EOF' +# Collect Live Skill +Collected from target. +EOF + +OUTPUT=$(ss collect claude --json -g) +test -f ~/.config/skillshare/skills/collect-live-skill/SKILL.md +grep -q "Collected from target." 
~/.config/skillshare/skills/collect-live-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == false +- jq: .pulled == ["collect-live-skill"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 3: Global collect agents dry-run previews without writing + +```bash +set -e +rm -f ~/.claude/agents/collect-dry-agent.md ~/.config/skillshare/agents/collect-dry-agent.md +mkdir -p ~/.claude/agents +cat > ~/.claude/agents/collect-dry-agent.md <<'EOF' +--- +name: collect-dry-agent +description: Dry-run agent +--- +# Collect Dry Agent +EOF + +OUTPUT=$(ss collect agents claude --json --dry-run -g) +test ! -e ~/.config/skillshare/agents/collect-dry-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == true +- jq: .pulled == ["collect-dry-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 4: Global collect agents copies a local agent to source + +```bash +set -e +rm -f \ + ~/.claude/agents/collect-dry-agent.md \ + ~/.claude/agents/collect-live-agent.md \ + ~/.config/skillshare/agents/collect-dry-agent.md \ + ~/.config/skillshare/agents/collect-live-agent.md +mkdir -p ~/.claude/agents +cat > ~/.claude/agents/collect-live-agent.md <<'EOF' +--- +name: collect-live-agent +description: Live collect agent +--- +# Collect Live Agent +Collected from target. +EOF + +OUTPUT=$(ss collect agents claude --json -g) +test -f ~/.config/skillshare/agents/collect-live-agent.md +grep -q "Collected from target." 
~/.config/skillshare/agents/collect-live-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == false +- jq: .pulled == ["collect-live-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 5: Project collect skills outputs JSON and writes into .skillshare/skills + +```bash +set -e +rm -rf /tmp/collect-project-skills +mkdir -p /tmp/collect-project-skills +cd /tmp/collect-project-skills +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .claude/skills/project-collect-skill +cat > .claude/skills/project-collect-skill/SKILL.md <<'EOF' +# Project Collect Skill +Collected into project source. +EOF + +OUTPUT=$(ss collect claude --json -p) +test -f .skillshare/skills/project-collect-skill/SKILL.md +grep -q "Collected into project source." .skillshare/skills/project-collect-skill/SKILL.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == false +- jq: .pulled == ["project-collect-skill"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 6: Project collect agents dry-run previews without writing + +```bash +set -e +rm -rf /tmp/collect-project-agents-dry +mkdir -p /tmp/collect-project-agents-dry +cd /tmp/collect-project-agents-dry +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .claude/agents +cat > .claude/agents/project-dry-agent.md <<'EOF' +--- +name: project-dry-agent +description: Project dry-run agent +--- +# Project Dry Agent +EOF + +OUTPUT=$(ss collect agents claude --json --dry-run -p) +test ! 
-e .skillshare/agents/project-dry-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == true +- jq: .pulled == ["project-dry-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +### Step 7: Project collect agents outputs JSON and writes into .skillshare/agents + +```bash +set -e +rm -rf /tmp/collect-project-agents-live +mkdir -p /tmp/collect-project-agents-live +cd /tmp/collect-project-agents-live +ss init -p --targets claude >/dev/null 2>&1 + +mkdir -p .claude/agents +cat > .claude/agents/project-live-agent.md <<'EOF' +--- +name: project-live-agent +description: Project live agent +--- +# Project Live Agent +Collected into project agent source. +EOF + +OUTPUT=$(ss collect agents claude --json -p) +test -f .skillshare/agents/project-live-agent.md +grep -q "Collected into project agent source." .skillshare/agents/project-live-agent.md +printf '%s\n' "$OUTPUT" +``` + +Expected: +- exit_code: 0 +- jq: .dry_run == false +- jq: .pulled == ["project-live-agent.md"] +- jq: (.skipped | length) == 0 +- jq: (.failed | length) == 0 + +## Pass Criteria + +- All 7 steps exit successfully +- Dry-run steps report `dry_run: true` and do not write to source +- Global collect writes to `~/.config/skillshare/skills` and `~/.config/skillshare/agents` +- Project collect writes to `.skillshare/skills` and `.skillshare/agents` +- All collect commands emit valid JSON without UI noise on stdout diff --git a/ai_docs/tests/extras_flatten_runbook.md b/ai_docs/tests/extras_flatten_runbook.md index b0d9833a..f9d1ad92 100644 --- a/ai_docs/tests/extras_flatten_runbook.md +++ b/ai_docs/tests/extras_flatten_runbook.md @@ -25,7 +25,7 @@ Run inside devcontainer. 
```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents/curriculum mkdir -p ~/.config/skillshare/extras/agents/software echo "# Tactician" > ~/.config/skillshare/extras/agents/curriculum/tactician.md @@ -33,25 +33,25 @@ echo "# Planner" > ~/.config/skillshare/extras/agents/curriculum/planner.md echo "# Implementer" > ~/.config/skillshare/extras/agents/software/implementer.md echo "# Reviewer" > ~/.config/skillshare/extras/agents/reviewer.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss sync extras --json -g ``` Expected: - exit_code: 0 -- jq: .extras[0].targets[0].synced == 4 +- jq: [.extras[] | select(.name == "agents")][0].targets[0].synced == 4 ### 2. 
Verify flat file layout — no subdirectories in target ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents/sub1 ~/.config/skillshare/extras/agents/sub2 echo "a" > ~/.config/skillshare/extras/agents/sub1/a.md echo "b" > ~/.config/skillshare/extras/agents/sub2/b.md echo "c" > ~/.config/skillshare/extras/agents/root.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss sync extras -g >/dev/null echo "files=$(find ~/.claude/agents/ -maxdepth 1 -name '*.md' | wc -l)" echo "dirs=$(find ~/.claude/agents/ -mindepth 1 -type d | wc -l)" @@ -66,11 +66,11 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents echo "x" > ~/.config/skillshare/extras/agents/x.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss extras list --json -g ``` @@ -82,32 +82,32 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents/team-a mkdir -p ~/.config/skillshare/extras/agents/team-b echo "# From team-a" > ~/.config/skillshare/extras/agents/team-a/agent.md echo "# From team-b" > 
~/.config/skillshare/extras/agents/team-b/agent.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss sync extras --json -g ``` Expected: - exit_code: 0 -- jq: .extras[0].targets[0].synced == 1 -- jq: .extras[0].targets[0].skipped == 1 -- jq: .extras[0].targets[0].warnings | length == 1 +- jq: [.extras[] | select(.name == "agents")][0].targets[0].synced == 1 +- jq: [.extras[] | select(.name == "agents")][0].targets[0].skipped == 1 +- jq: [.extras[] | select(.name == "agents")][0].targets[0].warnings | length == 1 ### 5. Flatten collision warning in human-readable output ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents/a ~/.config/skillshare/extras/agents/b echo "1" > ~/.config/skillshare/extras/agents/a/same.md echo "2" > ~/.config/skillshare/extras/agents/b/same.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss sync extras -g ``` @@ -119,11 +119,11 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents echo "x" > ~/.config/skillshare/extras/agents/x.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss extras agents --no-flatten -g >/dev/null ss extras list --json -g ``` @@ -136,11 
+136,11 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents echo "x" > ~/.config/skillshare/extras/agents/x.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents -g >/dev/null +ss extras init agents --target ~/.claude/agents -g >/dev/null 2>&1 ss extras agents --flatten -g >/dev/null ss extras list --json -g ``` @@ -153,11 +153,11 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents echo "x" > ~/.config/skillshare/extras/agents/x.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss extras agents --mode symlink -g 2>&1 || true ``` @@ -168,12 +168,12 @@ Expected: ```bash ss extras remove agents --force -g >/dev/null 2>&1 || true -rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents 2>/dev/null || true +rm -rf ~/.claude/agents ~/.config/skillshare/extras/agents ~/.config/skillshare/agents 2>/dev/null || true mkdir -p ~/.config/skillshare/extras/agents/sub echo "# Keep" > ~/.config/skillshare/extras/agents/sub/keep.md echo "# Remove" > ~/.config/skillshare/extras/agents/sub/remove.md mkdir -p ~/.claude/agents -ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null +ss extras init agents --target ~/.claude/agents --flatten -g >/dev/null 2>&1 ss sync extras -g >/dev/null rm ~/.config/skillshare/extras/agents/sub/remove.md ss sync extras --json -g @@ -181,7 +181,7 @@ ss sync extras --json 
-g Expected: - exit_code: 0 -- jq: .extras[0].targets[0].pruned == 1 +- jq: [.extras[] | select(.name == "agents")][0].targets[0].pruned == 1 ## Pass Criteria diff --git a/ai_docs/tests/install_mixed_repo_agents_runbook.md b/ai_docs/tests/install_mixed_repo_agents_runbook.md new file mode 100644 index 00000000..311e78e0 --- /dev/null +++ b/ai_docs/tests/install_mixed_repo_agents_runbook.md @@ -0,0 +1,200 @@ +# CLI E2E Runbook: Install Mixed Repo (Skills + Agents) + +Validates that `skillshare install` from a repo containing both skills and agents +installs skills to the skills source and agents to the agents source, then sync +distributes agents only to targets with an agents path configured. + +**Origin**: Bug fix — agents from mixed repos were incorrectly ignored or installed to skills dir. + +## Scope + +- Mixed repo install: skills go to `~/.config/skillshare/skills/`, agents go to `~/.config/skillshare/agents/` +- Pure agent repo install: agents go to agents dir +- Sync agents: targets with agents path receive agents, targets without are skipped with warning +- Project mode: agents go to `.skillshare/agents/`, not global agents dir + +## Environment + +Run inside devcontainer via mdproof (no ssenv wrapper needed). +All global commands use `-g` to force global mode. + +## Steps + +### 1. Create mixed git repo with skills and agents + +```bash +rm -rf /tmp/mixed-repo +mkdir -p /tmp/mixed-repo/skills/demo-skill /tmp/mixed-repo/agents +cat > /tmp/mixed-repo/skills/demo-skill/SKILL.md <<'EOF' +--- +name: demo-skill +--- +# Demo Skill +A demo skill for testing. +EOF +cat > /tmp/mixed-repo/agents/demo-agent.md <<'EOF' +--- +name: demo-agent +description: A demo agent +--- +# Demo Agent +A demo agent for testing. 
+EOF +cd /tmp/mixed-repo && git init && git config user.email "test@test.com" && git config user.name "test" && git add -A && git commit -m "init" 2>&1 +ls skills/demo-skill/SKILL.md agents/demo-agent.md +``` + +Expected: +- exit_code: 0 +- SKILL.md +- demo-agent.md + +### 2. Install mixed repo — both skills and agents found + +```bash +ss install -g file:///tmp/mixed-repo --yes --force +``` + +Expected: +- exit_code: 0 +- regex: 1 skill\(s\), 1 agent\(s\) +- Installed: demo-skill +- Installed agent: demo-agent + +### 3. Verify skill in skills source, agent in agents source + +```bash +SKILLS_DIR=~/.config/skillshare/skills +AGENTS_DIR=~/.config/skillshare/agents +test -f "$SKILLS_DIR/demo-skill/SKILL.md" && echo "skill: in skills dir" || echo "skill: MISSING" +test -f "$AGENTS_DIR/demo-agent.md" && echo "agent: in agents dir" || echo "agent: MISSING" +test -f "$SKILLS_DIR/demo-agent.md" && echo "agent: WRONG in skills dir" || echo "agent: not in skills dir (correct)" +``` + +Expected: +- exit_code: 0 +- skill: in skills dir +- agent: in agents dir +- agent: not in skills dir (correct) +- Not MISSING +- Not WRONG + +### 4. Sync all — agents go to targets with agents path + +```bash +ss sync all -g +``` + +Expected: +- exit_code: 0 + +### 5. Verify agents synced to claude (has agents path) but not to targets without + +```bash +CLAUDE_AGENTS=~/.claude/agents +test -L "$CLAUDE_AGENTS/demo-agent.md" && echo "claude: agent synced" || echo "claude: agent MISSING" +``` + +Expected: +- exit_code: 0 +- claude: agent synced +- Not MISSING + +### 6. Sync agents — warning lists targets without agents path + +```bash +ss sync agents -g 2>&1 +``` + +Expected: +- exit_code: 0 +- regex: skipped for agents + +### 7. 
Create pure agent repo and install + +```bash +rm -rf /tmp/agent-only-repo +mkdir -p /tmp/agent-only-repo/agents +cat > /tmp/agent-only-repo/agents/helper.md <<'EOF' +--- +name: helper +description: A helper agent +--- +# Helper Agent +EOF +cd /tmp/agent-only-repo && git init && git config user.email "test@test.com" && git config user.name "test" && git add -A && git commit -m "init" 2>&1 +ss install -g file:///tmp/agent-only-repo --yes --force +``` + +Expected: +- exit_code: 0 +- regex: 1 agent\(s\) +- helper + +### 8. Verify pure agent repo installed to agents dir + +```bash +AGENTS_DIR=~/.config/skillshare/agents +test -f "$AGENTS_DIR/helper.md" && echo "helper: in agents dir" || echo "helper: MISSING" +SKILLS_DIR=~/.config/skillshare/skills +test -d "$SKILLS_DIR/helper" && echo "helper: WRONG in skills dir" || echo "helper: not in skills dir (correct)" +``` + +Expected: +- exit_code: 0 +- helper: in agents dir +- helper: not in skills dir (correct) +- Not MISSING +- Not WRONG + +### 9. Project mode — install mixed repo to project agents dir + +```bash +rm -rf /tmp/test-project +mkdir -p /tmp/test-project +cd /tmp/test-project +ss init -p --targets claude 2>&1 +ss install -p file:///tmp/mixed-repo --yes --force 2>&1 +``` + +Expected: +- exit_code: 0 +- Installed: demo-skill +- Installed agent: demo-agent + +### 10. Verify project mode paths + +```bash +cd /tmp/test-project +test -f .skillshare/skills/demo-skill/SKILL.md && echo "project skill: correct" || echo "project skill: MISSING" +test -f .skillshare/agents/demo-agent.md && echo "project agent: correct" || echo "project agent: MISSING" +GLOBAL_AGENTS=~/.config/skillshare/agents +test -f "$GLOBAL_AGENTS/demo-agent.md" && echo "global agent: EXISTS (wrong for project install)" || echo "global agent: not there (correct)" +``` + +Expected: +- exit_code: 0 +- project skill: correct +- project agent: correct +- Not MISSING + +### 11. 
Cleanup + +```bash +rm -rf /tmp/mixed-repo /tmp/agent-only-repo /tmp/test-project +ss uninstall demo-skill --force -g 2>/dev/null || true +ss uninstall agents --all --force -g 2>/dev/null || true +``` + +Expected: +- exit_code: 0 + +## Pass Criteria + +- [ ] Mixed repo install shows "N skill(s), N agent(s)" in Found message +- [ ] Skills installed to skills source dir +- [ ] Agents installed to agents source dir (not skills dir) +- [ ] Pure agent repo installs agents correctly +- [ ] Sync distributes agents to targets with agents path +- [ ] Sync shows warning listing targets without agents path +- [ ] Project mode install puts agents in `.skillshare/agents/`, not global diff --git a/ai_docs/tests/issue124_orchestrator_install_runbook.md b/ai_docs/tests/issue124_orchestrator_install_runbook.md new file mode 100644 index 00000000..3bad226f --- /dev/null +++ b/ai_docs/tests/issue124_orchestrator_install_runbook.md @@ -0,0 +1,187 @@ +# Issue #124: Orchestrator Repo Install + +Verify that repos with root + child skills install children as independent skills, root appears in selection, and blob URLs resolve correctly. + +Repo structure under test: + +``` +SKILL.md (root: name derived from repo dir) +src/main.go +skills/ + writer/SKILL.md (child: "writer") + reader/SKILL.md (child: "reader") +``` + +## Steps + +### Step 1: Create local orchestrator repo + +```bash +rm -rf /tmp/orchestrator-repo +mkdir -p /tmp/orchestrator-repo/skills/writer /tmp/orchestrator-repo/skills/reader /tmp/orchestrator-repo/src + +git config --global user.email "test@test.com" 2>/dev/null +git config --global user.name "Test" 2>/dev/null + +cat > /tmp/orchestrator-repo/SKILL.md << 'SKILL' +--- +name: office-cli +description: Office CLI orchestrator +--- +# Office CLI +Root skill for the orchestrator pack. 
+SKILL + +cat > /tmp/orchestrator-repo/src/main.go << 'GO' +package main +func main() {} +GO + +cat > /tmp/orchestrator-repo/skills/writer/SKILL.md << 'SKILL' +--- +name: writer +description: Writer skill +--- +# Writer +A child skill for writing. +SKILL + +echo "# Writer docs" > /tmp/orchestrator-repo/skills/writer/README.md + +cat > /tmp/orchestrator-repo/skills/reader/SKILL.md << 'SKILL' +--- +name: reader +description: Reader skill +--- +# Reader +A child skill for reading. +SKILL + +cd /tmp/orchestrator-repo && git init -q && git add -A && git commit -m "init" -q +echo "REPO_READY" +``` + +Expected: +- exit_code: 0 +- REPO_READY + +### Step 2: Install all skills with --json + +Install the orchestrator repo using `--json --all` to skip prompts. Verify 3 independent skills are installed. Root name = `orchestrator-repo` (from directory). + +```bash +/workspace/bin/skillshare install file:///tmp/orchestrator-repo --json --all -g 2>/dev/null +``` + +Expected: +- exit_code: 0 +- jq: .skills | length == 3 +- jq: .skills | sort | . == ["orchestrator-repo","reader","writer"] +- jq: .failed | length == 0 + +### Step 3: Root dir does NOT contain child skill dirs + +Root copy should exclude `skills/writer/` and `skills/reader/` but keep `src/`. 
+ +```bash +ROOT_DIR="$HOME/.config/skillshare/skills/orchestrator-repo" +echo "ROOT_EXISTS=$(test -d "$ROOT_DIR" && echo yes || echo no)" +echo "ROOT_SKILL_MD=$(test -f "$ROOT_DIR/SKILL.md" && echo yes || echo no)" +echo "ROOT_SRC=$(test -d "$ROOT_DIR/src" && echo yes || echo no)" +echo "NO_CHILD_WRITER=$(test -d "$ROOT_DIR/skills/writer" && echo LEAKED || echo clean)" +echo "NO_CHILD_READER=$(test -d "$ROOT_DIR/skills/reader" && echo LEAKED || echo clean)" +``` + +Expected: +- exit_code: 0 +- ROOT_EXISTS=yes +- ROOT_SKILL_MD=yes +- ROOT_SRC=yes +- NO_CHILD_WRITER=clean +- NO_CHILD_READER=clean + +### Step 4: Children exist as siblings under parent + +Children at `orchestrator-repo/writer/` and `orchestrator-repo/reader/`, each with own SKILL.md. + +```bash +SKILLS_DIR="$HOME/.config/skillshare/skills" +echo "WRITER_EXISTS=$(test -f "$SKILLS_DIR/orchestrator-repo/writer/SKILL.md" && echo yes || echo no)" +echo "READER_EXISTS=$(test -f "$SKILLS_DIR/orchestrator-repo/reader/SKILL.md" && echo yes || echo no)" +echo "WRITER_README=$(test -f "$SKILLS_DIR/orchestrator-repo/writer/README.md" && echo yes || echo no)" +``` + +Expected: +- exit_code: 0 +- WRITER_EXISTS=yes +- READER_EXISTS=yes +- WRITER_README=yes + +### Step 5: All 3 skills appear independently in list + +```bash +/workspace/bin/skillshare list --json -g 2>/dev/null +``` + +Expected: +- exit_code: 0 +- jq: [.[] | .name] | sort | . == ["orchestrator-repo","orchestrator-repo__reader","orchestrator-repo__writer"] + +### Step 6: Install root skill only via --skill filter + +Root skill is individually selectable (issue #124 fix: root included in selection). 
+ +```bash +/workspace/bin/skillshare uninstall --all -g --force >/dev/null 2>&1 +/workspace/bin/skillshare install file:///tmp/orchestrator-repo --json --skill orchestrator-repo -g 2>/dev/null +``` + +Expected: +- exit_code: 0 +- jq: .skills == ["orchestrator-repo"] + +### Step 7: Root-only install has no child content + +```bash +SKILLS_DIR="$HOME/.config/skillshare/skills" +echo "ROOT_EXISTS=$(test -f "$SKILLS_DIR/orchestrator-repo/SKILL.md" && echo yes || echo no)" +echo "NO_WRITER=$(test -d "$SKILLS_DIR/orchestrator-repo/writer" && echo LEAKED || echo clean)" +echo "NO_READER=$(test -d "$SKILLS_DIR/orchestrator-repo/reader" && echo LEAKED || echo clean)" +``` + +Expected: +- exit_code: 0 +- ROOT_EXISTS=yes +- NO_WRITER=clean +- NO_READER=clean + +### Step 8: Install child-only without root + +```bash +/workspace/bin/skillshare uninstall --all -g --force >/dev/null 2>&1 +/workspace/bin/skillshare install file:///tmp/orchestrator-repo --json --skill writer -g 2>/dev/null +``` + +Expected: +- exit_code: 0 +- jq: .skills == ["writer"] + +### Step 9: Blob URL parsing — trailing SKILL.md stripped (unit tests) + +The blob URL fix resolves `github.com/.../blob/main/skills/foo/SKILL.md` to `subdir="skills/foo"`. Run the specific subtests. 
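The stripping rule can be sketched as follows (an illustration only; `resolveBlobSubdir` is a hypothetical name, not the actual `internal/install` function):

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// resolveBlobSubdir illustrates the rule under test: given the repo-relative
// path from a GitHub blob URL (the part after "blob/<ref>/"), a trailing
// SKILL.md is stripped so the containing directory becomes the subdir.
func resolveBlobSubdir(p string) string {
	if p == "SKILL.md" {
		return "" // root-level skill file: no subdir
	}
	if strings.HasSuffix(p, "/SKILL.md") {
		return path.Dir(p) // "skills/foo/SKILL.md" -> "skills/foo"
	}
	return p // already a directory path
}

func main() {
	fmt.Println(resolveBlobSubdir("skills/foo/SKILL.md")) // skills/foo
	fmt.Println(resolveBlobSubdir("skills/foo"))          // skills/foo
}
```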
```bash
cd /workspace && go test ./internal/install -run "TestParseSource_GitHubShorthand/github_blob_URL" -count=1 -v 2>&1 | tail -10
```

Expected:
- exit_code: 0
- PASS
- regex: blob_URL

## Pass Criteria

- Steps 2-5: Orchestrator install creates 3 independent skills; root excludes child dirs; children are siblings
- Steps 6-7: Root skill is individually selectable and installs cleanly without child leakage
- Step 8: Child skill installs independently without root
- Step 9: Blob URL unit tests pass, confirming trailing SKILL.md is properly resolved
diff --git a/ai_docs/tests/target_command_options_skills_agents_runbook.md b/ai_docs/tests/target_command_options_skills_agents_runbook.md
new file mode 100644
index 00000000..13627192
--- /dev/null
+++ b/ai_docs/tests/target_command_options_skills_agents_runbook.md
@@ -0,0 +1,253 @@
# CLI E2E Runbook: Target Command Options (Skills + Agents)

Validates `skillshare target` command options across plain list, JSON list,
target info/settings, and both global/project mode for skill and agent
configuration.

**Origin**: v0.19.x — target settings gained explicit agents options alongside
existing skills options, and docs needed regression coverage against code.

## Scope

- `target help` advertises skill and agent settings flags
- `target list --no-tui` and `target list --json` expose agent metadata for
  supported targets while leaving unsupported targets agent-free
- `target <name>` supports `--mode`, `--target-naming`, include/exclude, and
  the agent counterparts `--agent-mode`, `--add/remove-agent-include/exclude`
- Agent filter flags are rejected for unsupported targets and for
  `agent-mode=symlink`
- Project mode mirrors the same skill/agent target settings behavior

## Environment

Run inside devcontainer via mdproof.
Use `-g`/`-p` explicitly to avoid auto-mode ambiguity.

## Steps

### 1. 
Help output lists skill and agent target settings + +```bash +ss target help +``` + +Expected: +- exit_code: 0 +- --mode +- --agent-mode +- --target-naming +- --add-include +- --add-exclude +- --remove-include +- --remove-exclude +- --add-agent-include +- --add-agent-exclude +- --remove-agent-include +- --remove-agent-exclude + +### 2. Global plain target list shows skills and agents sections + +```bash +set -e +BASE=~/.config/skillshare +mkdir -p "$BASE/skills/team-alpha" "$BASE/agents" "$HOME/custom-tool/skills" + +printf '%s\n' \ + '---' \ + 'name: team-alpha' \ + 'description: Team Alpha skill' \ + 'metadata:' \ + ' targets: [claude]' \ + '---' \ + '# Team Alpha' \ + > "$BASE/skills/team-alpha/SKILL.md" + +printf '%s\n' \ + '---' \ + 'name: reviewer' \ + 'description: Review agent' \ + '---' \ + '# Reviewer' \ + > "$BASE/agents/reviewer.md" + +ss target add custom-tool "$HOME/custom-tool/skills" -g +ss target list --no-tui -g +``` + +Expected: +- exit_code: 0 +- claude +- custom-tool +- Skills: +- Agents: +- /.claude/agents + +### 3. Global JSON target list includes agent metadata only for supported targets + +```bash +ss target list --json -g +``` + +Expected: +- exit_code: 0 +- jq: .targets | length >= 2 +- jq: (.targets[] | select(.name == "claude").agentPath | type) == "string" +- jq: (.targets[] | select(.name == "claude").agentMode) == "merge" +- jq: (.targets[] | select(.name == "claude").agentExpectedCount) >= 0 +- jq: (.targets[] | select(.name == "custom-tool").agentPath) == null + +### 4. 
Global skill settings flags update target mode, naming, and filters + +```bash +set -e +ss target claude --mode copy -g +ss target claude --target-naming standard -g +ss target claude --add-include "team-*" -g +ss target claude --add-exclude "_legacy*" -g +ss target claude -g +``` + +Expected: +- exit_code: 0 +- Changed claude mode: merge -> copy +- Changed claude target naming: flat -> standard +- added include: team-* +- added exclude: _legacy* +- Mode: copy +- Naming: standard +- Include: team-* +- Exclude: _legacy* + +### 5. Global agent settings flags update agent mode and agent filters + +```bash +set -e +ss target claude --agent-mode copy -g +ss target claude --add-agent-include "team-*" -g +ss target claude --add-agent-exclude "draft-*" -g +ss target claude -g +``` + +Expected: +- exit_code: 0 +- Changed claude agent mode: merge -> copy +- added agent include: team-* +- added agent exclude: draft-* +- Agents: +- Mode: copy +- Include: team-* +- Exclude: draft-* + +### 6. Agent filter guard rails reject symlink mode and unsupported targets + +```bash +set -e +ss target claude --agent-mode symlink -g +ss target claude -g + +set +e +SYMLINK_ERR=$(ss target claude --add-agent-include "retry-*" -g 2>&1) +SYMLINK_STATUS=$? +CUSTOM_ERR=$(ss target custom-tool --add-agent-include "retry-*" -g 2>&1) +CUSTOM_STATUS=$? +set -e + +printf 'REJECTED_SYMLINK=%d\n' "$SYMLINK_STATUS" +printf '%s\n' "$SYMLINK_ERR" +printf 'REJECTED_CUSTOM=%d\n' "$CUSTOM_STATUS" +printf '%s\n' "$CUSTOM_ERR" +``` + +Expected: +- exit_code: 0 +- Changed claude agent mode: copy -> symlink +- Filters: ignored in symlink mode +- REJECTED_SYMLINK=1 +- ignored in symlink mode +- REJECTED_CUSTOM=1 +- target 'custom-tool' does not have an agents path + +### 7. 
Global remove flags clear both skill and agent filters + +```bash +set -e +ss target claude --agent-mode copy -g +ss target claude --remove-include "team-*" -g +ss target claude --remove-exclude "_legacy*" -g +ss target claude --remove-agent-include "team-*" -g +ss target claude --remove-agent-exclude "draft-*" -g +ss target claude -g +``` + +Expected: +- exit_code: 0 +- Changed claude agent mode: symlink -> copy +- removed include: team-* +- removed exclude: _legacy* +- removed agent include: team-* +- removed agent exclude: draft-* +- Include: (none) +- Exclude: (none) + +### 8. Project mode mirrors skill and agent target settings + +```bash +set -e +PROJECT=/tmp/target-options-project +rm -rf "$PROJECT" +mkdir -p "$PROJECT/.skillshare/skills" "$PROJECT/.skillshare/agents" + +cat > "$PROJECT/.skillshare/config.yaml" <<'EOF' +targets: + - claude +EOF + +cd "$PROJECT" +ss target claude --mode copy -p +ss target claude --target-naming standard -p +ss target claude --add-include "proj-*" -p +ss target claude --add-exclude "draft-*" -p +ss target claude --agent-mode copy -p +ss target claude --add-agent-include "proj-*" -p +ss target claude --add-agent-exclude "draft-*" -p +ss target claude -p +``` + +Expected: +- exit_code: 0 +- Changed claude mode: merge -> copy +- Changed claude target naming: flat -> standard +- added include: proj-* +- added exclude: draft-* +- Changed claude agent mode: merge -> copy +- added agent include: proj-* +- added agent exclude: draft-* +- Mode: copy +- Naming: standard +- Include: proj-* +- Exclude: draft-* +- Agents: + +### 9. 
Project JSON target list includes agent metadata

```bash
cd /tmp/target-options-project
ss target list --json -p
```

Expected:
- exit_code: 0
- jq: .targets | length == 1
- jq: (.targets[0].name) == "claude"
- jq: (.targets[0].agentMode) == "copy"
- jq: (.targets[0].agentInclude) == ["proj-*"]
- jq: (.targets[0].agentExclude) == ["draft-*"]

## Pass Criteria

- The `target` command exposes both skill and agent settings in help and list
  output
- Global and project target settings accept and persist skill/agent mode and
  include/exclude changes
- Agent-only guard rails behave correctly for unsupported targets and
  symlink-mode agent targets
diff --git a/ai_docs/tests/ui_base_path_runbook.md b/ai_docs/tests/ui_base_path_runbook.md
index 222e9c3c..ac713052 100644
--- a/ai_docs/tests/ui_base_path_runbook.md
+++ b/ai_docs/tests/ui_base_path_runbook.md
@@ -17,14 +17,14 @@ Verifies `skillshare ui --base-path` serves the dashboard and API under a sub-pa
## Environment

Run inside devcontainer. Uses `/tmp/` for isolated HOME and a fake UI dist.
-Server uses port **19421** to avoid conflicts with existing UI on 19420.
+Server uses port **49821** to avoid conflicts with existing UI on 19420.
## Step 0: Setup isolated HOME and fake UI dist ```bash -# Kill any leftover servers from previous runs on ports used by this runbook -fuser -k 19421/tcp 2>/dev/null || true -fuser -k 19422/tcp 2>/dev/null || true +# Kill any leftover UI servers from previous test runs +kill $(cat /tmp/basepath-server.pid /tmp/basepath-server2.pid 2>/dev/null) 2>/dev/null || true +rm -f /tmp/basepath-server.pid /tmp/basepath-server2.pid sleep 1 export E2E_HOME="/tmp/ss-e2e-basepath" @@ -73,12 +73,13 @@ export XDG_DATA_HOME="$E2E_HOME/.local/share" export XDG_STATE_HOME="$E2E_HOME/.local/state" export XDG_CACHE_HOME="$E2E_HOME/.cache" -fuser -k 19421/tcp 2>/dev/null || true +kill $(cat /tmp/basepath-server.pid 2>/dev/null) 2>/dev/null || true +rm -f /tmp/basepath-server.pid sleep 1 -cd /workspace -go run ./cmd/skillshare ui --base-path /skillshare --host 0.0.0.0 --port 19421 --no-open -g > /tmp/basepath-server.log 2>&1 & +/workspace/bin/skillshare ui --base-path /skillshare --host 0.0.0.0 --port 49821 --no-open -g > /tmp/basepath-server.log 2>&1 & SERVER_PID=$! 
+echo $SERVER_PID > /tmp/basepath-server.pid echo "server_pid=$SERVER_PID" sleep 5 @@ -88,12 +89,12 @@ cat /tmp/basepath-server.log Expected: - exit_code: 0 - regex: server_pid=\d+ -- regex: running at http://.*:19421/skillshare/ +- regex: running at http://.*:49821/skillshare/ ## Step 2: API health with prefix returns 200 ```bash -curl -sf http://localhost:19421/skillshare/api/health +curl -sf http://localhost:49821/skillshare/api/health ``` Expected: @@ -103,7 +104,7 @@ Expected: ## Step 3: API health without prefix returns 404 ```bash -HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:19421/api/health) +HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:49821/api/health) echo "status=$HTTP_CODE" test "$HTTP_CODE" = "404" && echo "correctly_rejected=yes" ``` @@ -116,8 +117,8 @@ Expected: ## Step 4: Bare path redirects to trailing slash ```bash -HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:19421/skillshare) -LOCATION=$(curl -s -o /dev/null -w "%{redirect_url}" http://localhost:19421/skillshare) +HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:49821/skillshare) +LOCATION=$(curl -s -o /dev/null -w "%{redirect_url}" http://localhost:49821/skillshare) echo "status=$HTTP_CODE" echo "location=$LOCATION" ``` @@ -130,7 +131,7 @@ Expected: ## Step 5: Root path returns 404 ```bash -HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:19421/) +HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:49821/) echo "status=$HTTP_CODE" test "$HTTP_CODE" = "404" && echo "root_blocked=yes" ``` @@ -143,7 +144,7 @@ Expected: ## Step 6: Index.html served with __BASE_PATH__ injection ```bash -BODY=$(curl -sf http://localhost:19421/skillshare/) +BODY=$(curl -sf http://localhost:49821/skillshare/) echo "$BODY" echo "$BODY" | grep -q '__BASE_PATH__' && echo "injection_found=yes" echo "$BODY" | grep -q '"/skillshare"' && echo "value_correct=yes" @@ -158,8 +159,8 @@ Expected: ## Step 7: 
SPA fallback serves index.html for unknown routes ```bash -HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:19421/skillshare/skills/nonexistent) -BODY=$(curl -sf http://localhost:19421/skillshare/skills/nonexistent) +HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:49821/skillshare/skills/nonexistent) +BODY=$(curl -sf http://localhost:49821/skillshare/skills/nonexistent) echo "status=$HTTP_CODE" echo "$BODY" | grep -q '__BASE_PATH__' && echo "spa_fallback_ok=yes" ``` @@ -172,7 +173,7 @@ Expected: ## Step 8: API overview returns valid data ```bash -curl -sf http://localhost:19421/skillshare/api/overview +curl -sf http://localhost:49821/skillshare/api/overview ``` Expected: @@ -183,7 +184,8 @@ Expected: ## Step 9: Stop server and cleanup ```bash -fuser -k 19421/tcp 2>/dev/null || true +kill $(cat /tmp/basepath-server.pid 2>/dev/null) 2>/dev/null || true +rm -f /tmp/basepath-server.pid sleep 1 echo "server_stopped=yes" ``` @@ -195,8 +197,7 @@ Expected: ## Step 10: --base-path missing value returns error ```bash -cd /workspace -go run ./cmd/skillshare ui --base-path 2>&1 || true +/workspace/bin/skillshare ui --base-path 2>&1 || true ``` Expected: @@ -205,8 +206,7 @@ Expected: ## Step 11: -b short flag missing value returns error ```bash -cd /workspace -go run ./cmd/skillshare ui -b 2>&1 || true +/workspace/bin/skillshare ui -b 2>&1 || true ``` Expected: @@ -223,32 +223,29 @@ export XDG_STATE_HOME="$E2E_HOME/.local/state" export XDG_CACHE_HOME="$E2E_HOME/.cache" export SKILLSHARE_UI_BASE_PATH="/from-env" -# Kill any leftover servers from previous steps or runs -fuser -k 19421/tcp 2>/dev/null || true -fuser -k 19422/tcp 2>/dev/null || true +kill $(cat /tmp/basepath-server.pid /tmp/basepath-server2.pid 2>/dev/null) 2>/dev/null || true +rm -f /tmp/basepath-server.pid /tmp/basepath-server2.pid sleep 1 -cd /workspace -go run ./cmd/skillshare ui --host 0.0.0.0 --port 19422 --no-open -g > /tmp/basepath-env.log 2>&1 & -sleep 5 -cat 
/tmp/basepath-env.log +/workspace/bin/skillshare ui --host 0.0.0.0 --port 49822 --no-open -g > /tmp/basepath-env.log 2>&1 & +echo $! > /tmp/basepath-server2.pid +sleep 3 +cat /tmp/basepath-env.log >&2 -curl -sf http://localhost:19422/from-env/api/health -fuser -k 19422/tcp 2>/dev/null || true +curl -sf http://localhost:49822/from-env/api/health +kill $(cat /tmp/basepath-server2.pid 2>/dev/null) 2>/dev/null || true ``` Expected: - exit_code: 0 -- regex: running at http://.*:19422/from-env/ - jq: .status == "ok" ## Step 13: Final cleanup ```bash -# Ensure all servers from this runbook are stopped -fuser -k 19421/tcp 2>/dev/null || true -fuser -k 19422/tcp 2>/dev/null || true +kill $(cat /tmp/basepath-server.pid /tmp/basepath-server2.pid 2>/dev/null) 2>/dev/null || true sleep 1 +rm -f /tmp/basepath-server.pid /tmp/basepath-server2.pid rm -rf /tmp/ss-e2e-basepath /tmp/basepath-server.log /tmp/basepath-env.log echo "cleanup_done=yes" ``` diff --git a/cmd/skillshare/analyze_tui.go b/cmd/skillshare/analyze_tui.go index 75fd9b5e..e5166b4c 100644 --- a/cmd/skillshare/analyze_tui.go +++ b/cmd/skillshare/analyze_tui.go @@ -5,6 +5,8 @@ import ( "sort" "strings" + "skillshare/internal/theme" + "github.com/charmbracelet/bubbles/list" "github.com/charmbracelet/bubbles/spinner" "github.com/charmbracelet/bubbles/textinput" @@ -60,12 +62,12 @@ type analyzeDataLoadedMsg struct { func newAnalyzeTUIModel(loadFn func() analyzeLoadResult, modeLabel string, initialFilter string) analyzeTUIModel { sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() fi.Placeholder = "filter skills" return analyzeTUIModel{ @@ -268,7 +270,7 @@ func (m analyzeTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { delegate := analyzeSkillDelegate{} l := list.New(nil, delegate, 0, 0) l.Title = 
m.listTitle() - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) l.SetFilteringEnabled(false) l.SetShowHelp(false) @@ -498,7 +500,7 @@ func (m analyzeTUIModel) renderFooter(scrollInfo string) string { b.WriteString("\n") help := m.helpText() help = appendScrollInfo(help, scrollInfo) - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() } @@ -524,12 +526,12 @@ func (m analyzeTUIModel) renderTargetBar() string { label = fmt.Sprintf("%d targets (%d)", len(g.names), g.entry.SkillCount) } if i == m.groupIdx { - parts = append(parts, tc.Cyan.Render("► "+label)) + parts = append(parts, theme.Accent().Render("► "+label)) } else { - parts = append(parts, tc.Dim.Render(label)) + parts = append(parts, theme.Dim().Render(label)) } } - return " " + strings.Join(parts, tc.Dim.Render(" · ")) + "\n" + return " " + strings.Join(parts, theme.Dim().Render(" · ")) + "\n" } func (m analyzeTUIModel) renderStatsLine() string { @@ -546,17 +548,17 @@ func (m analyzeTUIModel) renderStatsLine() string { countStr = fmt.Sprintf("%d skills", g.entry.SkillCount) } - return tc.Help.Render(fmt.Sprintf("%s Always: %s tokens On-demand: %s tokens Total: %s tokens %s", + return theme.Dim().MarginLeft(2).Render(fmt.Sprintf("%s Always: %s tokens On-demand: %s tokens Total: %s tokens %s", countStr, formatTokensStr(m.filteredDescChars), formatTokensStr(m.filteredBodyChars), formatTokensStr(totalChars), - tc.Dim.Render("(1 token ≈ 4 chars)"), + theme.Dim().Render("(1 token ≈ 4 chars)"), )) + "\n" } func (m analyzeTUIModel) renderDetailHeader(e analyzeSkillEntry, width int) string { - title := tc.ListTitle.Render(e.Name) + title := theme.Title().Render(e.Name) return lipgloss.NewStyle().Width(width).Render(title) } @@ -582,9 +584,9 @@ func (m analyzeTUIModel) renderDetailBody(e analyzeSkillEntry, width int) string for _, issue := range e.LintIssues { var icon string if issue.Severity == 
ssync.LintError { - icon = tc.Red.Render("✗") + icon = theme.Danger().Render("✗") } else { - icon = tc.Yellow.Render("⚠") + icon = theme.Warning().Render("⚠") } qualityRows = append(qualityRows, icon+" "+issue.Message) } @@ -595,10 +597,10 @@ func (m analyzeTUIModel) renderDetailBody(e analyzeSkillEntry, width int) string // Metadata var metaRows []string if e.relPath != "" { - metaRows = append(metaRows, renderFactRow("Path", tc.Cyan.Render(e.relPath))) + metaRows = append(metaRows, renderFactRow("Path", theme.Accent().Render(e.relPath))) } if e.isTracked { - metaRows = append(metaRows, renderFactRow("Tracked", tc.Green.Render("✓"))) + metaRows = append(metaRows, renderFactRow("Tracked", theme.Success().Render("✓"))) } if len(e.targetNames) > 0 { metaRows = append(metaRows, renderFactRow("Restricted to", strings.Join(e.targetNames, ", "))) @@ -621,7 +623,7 @@ func (m analyzeTUIModel) renderDetailBody(e analyzeSkillEntry, width int) string lines[len(lines)-1] += "..." } body := strings.Join(lines, "\n") - b.WriteString(renderDetailSection("Description", tc.Value.Render(body), width)) + b.WriteString(renderDetailSection("Description", lipgloss.NewStyle().Render(body), width)) } return b.String() diff --git a/cmd/skillshare/analyze_tui_item.go b/cmd/skillshare/analyze_tui_item.go index f1e4405e..036de7d9 100644 --- a/cmd/skillshare/analyze_tui_item.go +++ b/cmd/skillshare/analyze_tui_item.go @@ -5,6 +5,8 @@ import ( "sort" "strings" + "skillshare/internal/theme" + "github.com/charmbracelet/bubbles/list" tea "github.com/charmbracelet/bubbletea" "github.com/charmbracelet/lipgloss" @@ -69,8 +71,8 @@ var analyzeDots = map[string]string{ // Pre-computed lint icons (same pattern as analyzeDots). 
var ( - lintIconError = tc.Red.Render("✗") + " " - lintIconWarning = tc.Yellow.Render("⚠") + " " + lintIconError = theme.Danger().Render("✗") + " " + lintIconWarning = theme.Warning().Render("⚠") + " " ) func lintIcon(issues []ssync.LintIssue) string { @@ -138,7 +140,7 @@ func (d analyzeSkillDelegate) Render(w io.Writer, m list.Model, index int, listI if gap < 1 { gap = 1 } - line := dot + " " + nameLabel + strings.Repeat(" ", gap) + tc.Dim.Render(tokenStr) + line := dot + " " + nameLabel + strings.Repeat(" ", gap) + theme.Dim().Render(tokenStr) // Reuse the shared ▌ prefix row style (same as list, audit, log TUIs) renderPrefixRow(w, line, width, isSelected) diff --git a/cmd/skillshare/audit.go b/cmd/skillshare/audit.go index 43fbc864..efabdd17 100644 --- a/cmd/skillshare/audit.go +++ b/cmd/skillshare/audit.go @@ -11,6 +11,7 @@ import ( "skillshare/internal/audit" "skillshare/internal/config" "skillshare/internal/oplog" + "skillshare/internal/resource" "skillshare/internal/sync" "skillshare/internal/ui" "skillshare/internal/utils" @@ -130,6 +131,9 @@ func cmdAudit(args []string) error { } applyModeLabel(mode) + // Extract kind filter (e.g. "skillshare audit agents" or "--all"). + kind, rest := parseKindArg(rest) + // Check for "rules" subcommand before standard audit arg parsing. 
if len(rest) > 0 && rest[0] == "rules" { return cmdAuditRules(mode, rest[1:]) @@ -161,6 +165,7 @@ func cmdAudit(args []string) error { var ( sourcePath string + agentsSourcePath string projectRoot string defaultThreshold string configProfile string @@ -184,6 +189,7 @@ func cmdAudit(args []string) error { return err } sourcePath = rt.sourcePath + agentsSourcePath = rt.agentsSourcePath projectRoot = cwd defaultThreshold = rt.config.Audit.BlockThreshold configProfile = rt.config.Audit.Profile @@ -196,6 +202,7 @@ func cmdAudit(args []string) error { return err } sourcePath = cfg.Source + agentsSourcePath = cfg.EffectiveAgentsSource() defaultThreshold = cfg.Audit.BlockThreshold configProfile = cfg.Audit.Profile configDedupe = cfg.Audit.DedupeMode @@ -203,6 +210,11 @@ func cmdAudit(args []string) error { cfgPath = config.ConfigPath() } + // When kind is agents-only, override sourcePath to the agents source directory. + if kind == kindAgents && agentsSourcePath != "" { + sourcePath = agentsSourcePath + } + policy := audit.ResolvePolicy(audit.PolicyInputs{ Profile: opts.Profile, Threshold: opts.Threshold, @@ -230,13 +242,13 @@ func cmdAudit(args []string) error { switch { case !hasTargets: - results, summary, err = auditInstalled(sourcePath, modeString(mode), projectRoot, threshold, opts, registry) + results, summary, err = auditInstalled(sourcePath, agentsSourcePath, modeString(mode), projectRoot, threshold, kind, opts, registry) case isSinglePath: results, summary, err = auditPath(opts.Targets[0], modeString(mode), projectRoot, threshold, opts.Format, opts.PolicyLine, registry) case isSingleName: - results, summary, err = auditSkillByName(sourcePath, opts.Targets[0], modeString(mode), projectRoot, threshold, opts.Format, opts.PolicyLine, registry) + results, summary, err = auditSkillByName(sourcePath, opts.Targets[0], modeString(mode), projectRoot, threshold, opts.Format, opts.PolicyLine, kind, registry) default: - results, summary, err = auditFiltered(sourcePath, 
opts.Targets, opts.Groups, modeString(mode), projectRoot, threshold, opts, registry) + results, summary, err = auditFiltered(sourcePath, agentsSourcePath, opts.Targets, opts.Groups, modeString(mode), projectRoot, threshold, kind, opts, registry) } if err != nil { logAuditOp(cfgPath, rest, summary, start, err, false) @@ -465,14 +477,44 @@ func scanSkillPath(skillPath, projectRoot string, registry *audit.Registry) (*au return audit.ScanSkill(skillPath) } -func toAuditInputs(skills []auditSkillRef) []audit.SkillInput { - inputs := make([]audit.SkillInput, len(skills)) - for i, s := range skills { - inputs[i] = audit.SkillInput{Name: s.name, Path: s.path} +func toAuditInputs(items []auditSkillRef, isFile bool) []audit.SkillInput { + inputs := make([]audit.SkillInput, len(items)) + for i, item := range items { + inputs[i] = audit.SkillInput{Name: item.name, Path: item.path, IsFile: isFile} } return inputs } +func collectInstalledAgentPaths(agentsSourcePath string) ([]auditSkillRef, error) { + if agentsSourcePath == "" { + return nil, nil + } + if _, err := os.Stat(agentsSourcePath); os.IsNotExist(err) { + return nil, nil + } + discovered, err := resource.AgentKind{}.Discover(agentsSourcePath) + if err != nil { + return nil, fmt.Errorf("failed to discover agents: %w", err) + } + var agentPaths []auditSkillRef + // Audit scans ALL agents including disabled ones — security risks don't respect .agentignore. 
+ for _, d := range discovered { + agentPaths = append(agentPaths, auditSkillRef{name: d.FlatName, path: d.AbsPath}) + } + return agentPaths, nil +} + +func discoverForKind(kind resourceKindFilter, sourcePath string) ([]auditSkillRef, error) { + if kind == kindAgents { + return collectInstalledAgentPaths(sourcePath) + } + return collectInstalledSkillPaths(sourcePath) +} + +func toInputsForKind(kind resourceKindFilter, items []auditSkillRef) []audit.SkillInput { + return toAuditInputs(items, kind == kindAgents) +} + func scanPathTarget(targetPath, projectRoot string, registry *audit.Registry) (*audit.Result, error) { info, err := os.Stat(targetPath) if err != nil { @@ -487,7 +529,7 @@ func scanPathTarget(targetPath, projectRoot string, registry *audit.Registry) (* return audit.ScanFile(targetPath) } -func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditOptions, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) { +func auditInstalled(sourcePath, agentsSourcePath, mode, projectRoot, threshold string, kind resourceKindFilter, opts auditOptions, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) { jsonOutput := opts.isStructured() base := auditRunSummary{ Scope: "all", @@ -495,12 +537,18 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO Threshold: threshold, } - // Phase 0: discover skills. + // Phase 0: discover skills/agents. 
var spinner *ui.Spinner if !jsonOutput { - spinner = ui.StartSpinner("Discovering skills...") + spinner = ui.StartSpinner(fmt.Sprintf("Discovering %s...", kind.Noun(2))) + } + var skillPaths []auditSkillRef + var err error + if kind == kindAgents { + skillPaths, err = collectInstalledAgentPaths(sourcePath) + } else { + skillPaths, err = collectInstalledSkillPaths(sourcePath) } - skillPaths, err := collectInstalledSkillPaths(sourcePath) if err != nil { if spinner != nil { spinner.Fail("Discovery failed") @@ -509,18 +557,18 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO } if len(skillPaths) == 0 { if spinner != nil { - spinner.Success("No skills found") + spinner.Success(fmt.Sprintf("No %s found", kind.Noun(2))) } return []*audit.Result{}, base, nil } if spinner != nil { - spinner.Success(fmt.Sprintf("Found %d skill(s)", len(skillPaths))) + spinner.Success(fmt.Sprintf("Found %d %s", len(skillPaths), kind.Noun(len(skillPaths)))) } // Phase 0.5: large audit confirmation prompt. if len(skillPaths) > largeAuditThreshold && !jsonOutput && !opts.Yes && ui.IsTTY() { - ui.Warning("Found %d skills. This may take a while.", len(skillPaths)) - ui.Info("Tip: use 'audit --group ' or 'audit ' to scan specific skills") + ui.Warning("Found %d %s. This may take a while.", len(skillPaths), kind.Noun(len(skillPaths))) + ui.Info("Tip: use 'audit --group ' or 'audit ' to scan specific %s", kind.Noun(2)) fmt.Print(" Continue? 
[y/N]: ") var answer string fmt.Scanln(&answer) @@ -533,7 +581,7 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO var headerMinWidth int if !jsonOutput { fmt.Println() - subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning %d skills for threats", len(skillPaths)), mode, sourcePath, threshold, opts.PolicyLine) + subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning %d %s for threats", len(skillPaths), kind.Noun(len(skillPaths))), mode, sourcePath, threshold, opts.PolicyLine) headerMinWidth = auditHeaderMinWidth(subtitle) ui.HeaderBoxWithMinWidth(auditHeaderTitle(mode), subtitle, headerMinWidth) } @@ -544,14 +592,15 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO } var progressBar *ui.ProgressBar if !jsonOutput { - progressBar = ui.StartProgress("Scanning skills", len(skillPaths)) + progressBar = ui.StartProgress(fmt.Sprintf("Scanning %s", kind.Noun(2)), len(skillPaths)) } onDone := func() { if progressBar != nil { progressBar.Increment() } } - scanResults := audit.ParallelScan(toAuditInputs(skillPaths), projectRoot, onDone, reg) + scanInputs := toInputsForKind(kind, skillPaths) + scanResults := audit.ParallelScan(scanInputs, projectRoot, onDone, reg) if progressBar != nil { progressBar.Stop() } @@ -574,6 +623,7 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO } sr.Result.Threshold = threshold sr.Result.IsBlocked = sr.Result.HasSeverityAtOrAbove(threshold) + sr.Result.Kind = kind.SingularNoun() // Use relative path so TUI shows group hierarchy (e.g. "frontend/vue/skill"). 
 	if rel, err := filepath.Rel(sourcePath, sr.Result.ScanTarget); err == nil && rel != sr.Result.SkillName {
 		sr.Result.SkillName = rel
@@ -594,14 +644,23 @@ func auditInstalled(sourcePath, mode, projectRoot, threshold string, opts auditO
 	}
 	applyPolicyToSummary(&summary, opts)
-	if err := presentAuditResults(results, elapsed, scanResults, summary, jsonOutput, opts, headerMinWidth); err != nil {
+	tuiCtx := &auditTUIContext{
+		kind:             kind,
+		sourcePath:       sourcePath,
+		agentsSourcePath: agentsSourcePath,
+		projectRoot:      projectRoot,
+		threshold:        threshold,
+		registry:         reg,
+		mode:             mode,
+	}
+	if err := presentAuditResults(results, elapsed, scanResults, summary, jsonOutput, opts, headerMinWidth, tuiCtx); err != nil {
 		return results, summary, err
 	}
 	return results, summary, nil
 }
 
-func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot, threshold string, opts auditOptions, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) {
+func auditFiltered(sourcePath, agentsSourcePath string, names, groups []string, mode, projectRoot, threshold string, kind resourceKindFilter, opts auditOptions, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) {
 	jsonOutput := opts.isStructured()
 	base := auditRunSummary{
 		Scope:     "filtered",
@@ -609,7 +668,13 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 		Threshold: threshold,
 	}
 
-	allSkills, err := collectInstalledSkillPaths(sourcePath)
+	var allSkills []auditSkillRef
+	var err error
+	if kind == kindAgents {
+		allSkills, err = collectInstalledAgentPaths(sourcePath)
+	} else {
+		allSkills, err = collectInstalledSkillPaths(sourcePath)
+	}
 	if err != nil {
 		return nil, base, err
 	}
@@ -658,7 +723,7 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 	}
 	for _, w := range warnings {
 		if !jsonOutput {
-			ui.Warning("skill not found: %s", w)
+			ui.Warning("%s not found: %s", kind.SingularNoun(), w)
 		}
 	}
 
@@ -670,7 +735,7 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 	var headerMinWidth int
 	if !jsonOutput {
 		fmt.Println()
-		subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning %d skills for threats", len(matched)), mode, sourcePath, threshold, opts.PolicyLine)
+		subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning %d %s for threats", len(matched), kind.Noun(len(matched))), mode, sourcePath, threshold, opts.PolicyLine)
 		headerMinWidth = auditHeaderMinWidth(subtitle)
 		ui.HeaderBoxWithMinWidth(auditHeaderTitle(mode), subtitle, headerMinWidth)
 	}
@@ -681,14 +746,15 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 	}
 	var progressBar *ui.ProgressBar
 	if !jsonOutput {
-		progressBar = ui.StartProgress("Scanning skills", len(matched))
+		progressBar = ui.StartProgress(fmt.Sprintf("Scanning %s", kind.Noun(2)), len(matched))
 	}
 	onDone := func() {
 		if progressBar != nil {
 			progressBar.Increment()
 		}
 	}
-	scanResults := audit.ParallelScan(toAuditInputs(matched), projectRoot, onDone, reg)
+	scanInputs := toInputsForKind(kind, matched)
+	scanResults := audit.ParallelScan(scanInputs, projectRoot, onDone, reg)
 	if progressBar != nil {
 		progressBar.Stop()
 	}
@@ -711,6 +777,7 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 	}
 	sr.Result.Threshold = threshold
 	sr.Result.IsBlocked = sr.Result.HasSeverityAtOrAbove(threshold)
+	sr.Result.Kind = kind.SingularNoun()
 	if rel, err := filepath.Rel(sourcePath, sr.Result.ScanTarget); err == nil && rel != sr.Result.SkillName {
 		sr.Result.SkillName = rel
 	}
@@ -730,14 +797,23 @@ func auditFiltered(sourcePath string, names, groups []string, mode, projectRoot,
 	}
 	applyPolicyToSummary(&summary, opts)
-	if err := presentAuditResults(results, elapsed, scanResults, summary, jsonOutput, opts, headerMinWidth); err != nil {
+	tuiCtx := &auditTUIContext{
+		kind:             kind,
+		sourcePath:       sourcePath,
+		agentsSourcePath: agentsSourcePath,
+		projectRoot:      projectRoot,
+		threshold:        threshold,
+		registry:         reg,
+		mode:             mode,
+	}
+	if err := presentAuditResults(results, elapsed, scanResults, summary, jsonOutput, opts, headerMinWidth, tuiCtx); err != nil {
 		return results, summary, err
 	}
 	return results, summary, nil
 }
 
-func auditSkillByName(sourcePath, name, mode, projectRoot, threshold, format, policyLine string, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) {
+func auditSkillByName(sourcePath, name, mode, projectRoot, threshold, format, policyLine string, kind resourceKindFilter, reg *audit.Registry) ([]*audit.Result, auditRunSummary, error) {
 	summary := auditRunSummary{
 		Scope: "single",
 		Skill: name,
@@ -750,19 +826,20 @@ func auditSkillByName(sourcePath, name, mode, projectRoot, threshold, format, po
 		// Short-name fallback: search installed skills by flat name or basename.
 		resolved := resolveSkillPath(sourcePath, name)
 		if resolved == "" {
-			return nil, summary, fmt.Errorf("skill not found: %s", name)
+			return nil, summary, fmt.Errorf("%s not found: %s", kind.SingularNoun(), name)
 		}
 		skillPath = resolved
 	}
 
 	start := time.Now()
-	result, err := scanSkillPath(skillPath, projectRoot, reg)
+	result, err := scanPathTarget(skillPath, projectRoot, reg)
 	if err != nil {
 		return nil, summary, fmt.Errorf("scan error: %w", err)
 	}
 	elapsed := time.Since(start)
 	result.Threshold = threshold
 	result.IsBlocked = result.HasSeverityAtOrAbove(threshold)
+	result.Kind = kind.SingularNoun()
 	if rel, err := filepath.Rel(sourcePath, result.ScanTarget); err == nil && rel != result.SkillName {
 		result.SkillName = rel
 	}
@@ -772,8 +849,9 @@ func auditSkillByName(sourcePath, name, mode, projectRoot, threshold, format, po
 	summary.Skill = name
 	summary.Mode = mode
 	if format == formatText {
-		subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning skill: %s", name), mode, sourcePath, threshold, policyLine)
-		summaryLines := buildAuditSummaryLines(summary)
+		label := kind.SingularNoun()
+		subtitle := auditHeaderSubtitle(fmt.Sprintf("Scanning %s: %s", label, name), mode, sourcePath, threshold, policyLine)
+		summaryLines := buildAuditSummaryLines(summary, kind)
 		minWidth := auditHeaderMinWidth(subtitle)
 		ui.HeaderBoxWithMinWidth(auditHeaderTitle(mode), subtitle, minWidth)
 		fmt.Println()
diff --git a/cmd/skillshare/audit_project.go b/cmd/skillshare/audit_project.go
index 0f714d56..b649a7ff 100644
--- a/cmd/skillshare/audit_project.go
+++ b/cmd/skillshare/audit_project.go
@@ -22,10 +22,10 @@ func cmdAuditProject(root, specificSkill string) (auditRunSummary, bool, error)
 	}
 
 	if specificSkill != "" {
-		_, summary, err := auditSkillByName(rt.sourcePath, specificSkill, "project", root, threshold, formatText, "", nil)
+		_, summary, err := auditSkillByName(rt.sourcePath, specificSkill, "project", root, threshold, formatText, "", kindSkills, nil)
 		return summary, summary.Failed > 0, err
 	}
-	_, summary, err := auditInstalled(rt.sourcePath, "project", root, threshold, auditOptions{}, nil)
+	_, summary, err := auditInstalled(rt.sourcePath, "", "project", root, threshold, kindSkills, auditOptions{}, nil)
 	return summary, summary.Failed > 0, err
 }
diff --git a/cmd/skillshare/audit_render.go b/cmd/skillshare/audit_render.go
index 47638566..7bfcc007 100644
--- a/cmd/skillshare/audit_render.go
+++ b/cmd/skillshare/audit_render.go
@@ -2,11 +2,13 @@ package main
 
 import (
 	"fmt"
+	"path/filepath"
 	"sort"
 	"strings"
 	"time"
 
 	"skillshare/internal/audit"
+	"skillshare/internal/theme"
 	"skillshare/internal/ui"
 )
 
@@ -18,10 +20,21 @@ func riskColor(label string) string {
 	return ui.Dim
 }
 
+// auditTUIContext carries info needed to scan the "other" kind for TUI tab switching.
+type auditTUIContext struct {
+	kind             resourceKindFilter
+	sourcePath       string // skills source (always)
+	agentsSourcePath string // agents source (always)
+	projectRoot      string
+	threshold        string
+	registry         *audit.Registry
+	mode             string
+}
+
 // presentAuditResults handles the common output path for audit scans:
 // prints per-skill list only when TUI is unavailable, always prints summary,
 // and launches TUI when conditions are met.
-func presentAuditResults(results []*audit.Result, elapsed []time.Duration, scanOutputs []audit.ScanOutput, summary auditRunSummary, jsonOutput bool, opts auditOptions, headerMinWidth int) error {
+func presentAuditResults(results []*audit.Result, elapsed []time.Duration, scanOutputs []audit.ScanOutput, summary auditRunSummary, jsonOutput bool, opts auditOptions, headerMinWidth int, tuiCtx *auditTUIContext) error {
 	useTUI := !jsonOutput && shouldLaunchTUI(opts.NoTUI, nil) && len(results) > 1
 
 	if !jsonOutput {
@@ -37,16 +50,94 @@ func presentAuditResults(results []*audit.Result, elapsed []time.Duration, scanO
 		}
 		fmt.Println()
 	}
-		summaryLines := buildAuditSummaryLines(summary)
+		var kindSlice []resourceKindFilter
+		if tuiCtx != nil {
+			kindSlice = []resourceKindFilter{tuiCtx.kind}
+		}
+		summaryLines := buildAuditSummaryLines(summary, kindSlice...)
 		printAuditSummary(summary, summaryLines, headerMinWidth)
 	}
 
 	if useTUI {
-		return runAuditTUI(results, scanOutputs, summary)
+		if tuiCtx != nil {
+			return launchAuditTUIWithTabs(results, scanOutputs, summary, tuiCtx)
+		}
+		return runAuditTUI(results, scanOutputs, summary, nil, nil, auditRunSummary{}, auditTabSkills)
 	}
 	return nil
 }
 
+func launchAuditTUIWithTabs(results []*audit.Result, scanOutputs []audit.ScanOutput, summary auditRunSummary, ctx *auditTUIContext) error {
+	initialTab := auditTabSkills
+	if ctx.kind == kindAgents {
+		initialTab = auditTabAgents
+	}
+
+	// Scan the "other" kind for the second tab.
+	var otherResults []*audit.Result
+	var otherOutputs []audit.ScanOutput
+	var otherSummary auditRunSummary
+
+	otherKindFilter := kindAgents
+	otherSource := ctx.agentsSourcePath
+	if ctx.kind == kindAgents {
+		otherKindFilter = kindSkills
+		otherSource = ctx.sourcePath
+	}
+
+	if otherSource != "" {
+		otherPaths, err := discoverForKind(otherKindFilter, otherSource)
+		otherInputs := toInputsForKind(otherKindFilter, otherPaths)
+		if err == nil && len(otherPaths) > 0 {
+			otherScanResults := audit.ParallelScan(otherInputs, ctx.projectRoot, nil, ctx.registry)
+			for i := range otherPaths {
+				if i < len(otherScanResults) {
+					sr := otherScanResults[i]
+					if sr.Err == nil {
+						sr.Result.Threshold = ctx.threshold
+						sr.Result.IsBlocked = sr.Result.HasSeverityAtOrAbove(ctx.threshold)
+						sr.Result.Kind = otherKindFilter.SingularNoun()
+						if rel, relErr := filepath.Rel(otherSource, sr.Result.ScanTarget); relErr == nil {
+							sr.Result.SkillName = rel
+						}
+						otherResults = append(otherResults, sr.Result)
+						otherOutputs = append(otherOutputs, sr)
+					}
+				}
+			}
+			otherSummary = summarizeAuditResults(len(otherPaths), otherResults, ctx.threshold)
+			otherSummary.Mode = ctx.mode
+		}
+	}
+
+	// Filter out synthetic _cross-skill result — it's not a real resource.
+	var filteredResults []*audit.Result
+	var filteredOutputs []audit.ScanOutput
+	for i, r := range results {
+		if r.SkillName != audit.CrossSkillResultName {
+			filteredResults = append(filteredResults, r)
+			if i < len(scanOutputs) {
+				filteredOutputs = append(filteredOutputs, scanOutputs[i])
+			}
+		}
+	}
+
+	// Arrange into skills vs agents.
+	var skillResults, agentResults []*audit.Result
+	var skillOutputs, agentOutputs []audit.ScanOutput
+	var skillSummary, agentSummary auditRunSummary
+
+	if ctx.kind == kindAgents {
+		agentResults, agentOutputs, agentSummary = filteredResults, filteredOutputs, summary
+		skillResults, skillOutputs, skillSummary = otherResults, otherOutputs, otherSummary
+	} else {
+		skillResults, skillOutputs, skillSummary = filteredResults, filteredOutputs, summary
+		agentResults, agentOutputs, agentSummary = otherResults, otherOutputs, otherSummary
+	}
+
+	return runAuditTUI(skillResults, skillOutputs, skillSummary, agentResults, agentOutputs, agentSummary, initialTab)
+}
+
 // printSkillResultLine prints a single-line result for a skill during batch scan.
 func printSkillResultLine(index, total int, result *audit.Result, elapsed time.Duration) {
 	prefix := fmt.Sprintf("[%d/%d]", index, total)
@@ -155,7 +246,7 @@ func printSkillResult(result *audit.Result, elapsed time.Duration) {
 }
 
 // buildAuditSummaryLines builds the summary box lines (without printing).
-func buildAuditSummaryLines(summary auditRunSummary) []string {
+func buildAuditSummaryLines(summary auditRunSummary, kind ...resourceKindFilter) []string {
 	var lines []string
 	maxSeverity := summary.MaxSeverity
 	if maxSeverity == "" {
@@ -167,8 +258,12 @@ func buildAuditSummaryLines(summary auditRunSummary) []string {
 	lines = append(lines, fmt.Sprintf(" Max sev: %s", ui.Colorize(ui.SeverityColor(maxSeverity), maxSeverity)))
 
 	// -- Result counts --
+	noun := "skill(s)"
+	if len(kind) > 0 {
+		noun = kind[0].Noun(summary.Scanned)
+	}
 	lines = append(lines, "")
-	lines = append(lines, fmt.Sprintf(" Scanned: %d skill(s)", summary.Scanned))
+	lines = append(lines, fmt.Sprintf(" Scanned: %d %s", summary.Scanned, noun))
 	lines = append(lines, fmt.Sprintf(" Passed: %d", summary.Passed))
 	if summary.Warning > 0 {
 		lines = append(lines, fmt.Sprintf(" Warning: %s", ui.Colorize(ui.Yellow, fmt.Sprintf("%d", summary.Warning))))
@@ -375,16 +470,16 @@ func formatCategoryBreakdownTUI(cats map[string]int) string {
 			label = short
 		}
 		if cc.count > 50 {
-			parts[i] = tc.Emphasis.Bold(true).Render(label+":") + tc.Emphasis.Bold(true).Render(fmt.Sprintf("%d", cc.count))
+			parts[i] = theme.Primary().Bold(true).Render(label+":") + theme.Primary().Bold(true).Render(fmt.Sprintf("%d", cc.count))
 		} else {
-			parts[i] = tc.Dim.Render(label + ":" + fmt.Sprintf("%d", cc.count))
+			parts[i] = theme.Dim().Render(label + ":" + fmt.Sprintf("%d", cc.count))
 		}
 	}
 	return strings.Join(parts, " ")
 }
 
 func printAuditHelp() {
-	fmt.Println(`Usage: skillshare audit [name...] [options]
+	fmt.Println(`Usage: skillshare audit [agents] [name...] [options]
        skillshare audit --group [options]
        skillshare audit [options]
@@ -432,5 +527,6 @@ Examples:
   skillshare audit --format sarif # Output SARIF 2.1.0 for GitHub Code Scanning
   skillshare audit --format markdown # Output Markdown report (for GitHub Issues/PRs)
   skillshare audit --json # Same as --format json (deprecated)
-  skillshare audit -p --init-rules # Create project custom rules file`)
+  skillshare audit -p --init-rules # Create project custom rules file
+  skillshare audit agents # Scan agents only`)
 }
diff --git a/cmd/skillshare/audit_rules_tui.go b/cmd/skillshare/audit_rules_tui.go
index 023a70d0..4fcf8d6d 100644
--- a/cmd/skillshare/audit_rules_tui.go
+++ b/cmd/skillshare/audit_rules_tui.go
@@ -6,6 +6,7 @@ import (
 	"strings"
 
 	"skillshare/internal/audit"
+	"skillshare/internal/theme"
 
 	"github.com/charmbracelet/bubbles/list"
 	"github.com/charmbracelet/bubbles/textinput"
@@ -46,9 +47,9 @@ func (i arHeaderItem) Title() string {
 	countStr := fmt.Sprintf("%d", i.group.Total)
 	extra := ""
 	if i.group.Disabled > 0 {
-		extra = tc.Yellow.Render(fmt.Sprintf(" %d off", i.group.Disabled))
+		extra = theme.Warning().Render(fmt.Sprintf(" %d off", i.group.Disabled))
 	}
-	return fmt.Sprintf("%s %s %s%s", arrow, i.group.Pattern, tc.Dim.Render(countStr), extra)
+	return fmt.Sprintf("%s %s %s%s", arrow, i.group.Pattern, theme.Dim().Render(countStr), extra)
 }
 
 func (i arHeaderItem) Description() string { return "" }
@@ -64,9 +65,9 @@ type arRuleItem struct {
 }
 
 func (i arRuleItem) Title() string {
-	dot := tcSevStyle(i.rule.Severity).Render("●")
+	dot := theme.SeverityStyle(i.rule.Severity).Render("●")
 	if !i.rule.Enabled {
-		return fmt.Sprintf(" %s %s %s", dot, i.rule.ID, tc.Red.Render("off"))
+		return fmt.Sprintf(" %s %s %s", dot, i.rule.ID, theme.Danger().Render("off"))
 	}
 	return fmt.Sprintf(" %s %s", dot, i.rule.ID)
 }
@@ -158,13 +159,13 @@ func newARModel(rules []audit.CompiledRule, mode runMode) arModel {
 	// Filter text input
 	fi := textinput.New()
 	fi.Prompt = "/ "
-	fi.PromptStyle = tc.Filter
-	fi.Cursor.Style = tc.Filter
+	fi.PromptStyle = theme.Accent()
+	fi.Cursor.Style = theme.Accent()
 	m.filterInput = fi
 
 	// Create the list model once — rebuildItems will populate via SetItems.
 	m.list = list.New(nil, newPrefixDelegate(false), 0, 0)
-	m.list.Styles.Title = tc.ListTitle
+	m.list.Styles.Title = theme.Title()
 	m.list.SetShowStatusBar(false)
 	m.list.SetFilteringEnabled(false)
 	m.list.SetShowHelp(false)
@@ -418,7 +419,7 @@ func (m arModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 			return m, nil
 		case "R":
 			m.pendingReset = true
-			m.flashMsg = tc.Yellow.Render("Press R again to reset all rules to defaults")
+			m.flashMsg = theme.Warning().Render("Press R again to reset all rules to defaults")
 			m.flashTicks = 5
 			return m, nil
 		case "ctrl+d":
@@ -472,11 +473,11 @@ func (m *arModel) toggleSelected() {
 // execMutation runs a mutation function, shows flash feedback, and reloads rules.
 func (m *arModel) execMutation(fn func() error, successMsg string) {
 	if err := fn(); err != nil {
-		m.flashMsg = tc.Red.Render("✗ " + err.Error())
+		m.flashMsg = theme.Danger().Render("✗ " + err.Error())
 		m.flashTicks = 3
 		return
 	}
-	m.flashMsg = tc.Green.Render("✓ " + successMsg)
+	m.flashMsg = theme.Success().Render("✓ " + successMsg)
 	m.flashTicks = 3
 	m.reloadRules()
 	m.rebuildItems()
@@ -611,7 +612,7 @@ func (m arModel) renderSevTabBar() string {
 	var parts []string
 
 	activeStyle := lipgloss.NewStyle().Bold(true).Underline(true)
-	inactiveStyle := tc.Dim
+	inactiveStyle := theme.Dim()
 
 	for i, tab := range sevTabs {
 		label := fmt.Sprintf("%s(%d)", tab.label, counts[i])
@@ -619,19 +620,19 @@ func (m arModel) renderSevTabBar() string {
 			// Color active tab by its severity
 			switch tab.sev {
 			case "CRITICAL":
-				parts = append(parts, activeStyle.Inherit(tc.Critical).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Severity("critical")).Render(label))
 			case "HIGH":
-				parts = append(parts, activeStyle.Inherit(tc.High).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Severity("high")).Render(label))
 			case "MEDIUM":
-				parts = append(parts, activeStyle.Inherit(tc.Medium).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Severity("medium")).Render(label))
 			case "LOW":
-				parts = append(parts, activeStyle.Inherit(tc.Low).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Severity("low")).Render(label))
 			case "INFO":
-				parts = append(parts, activeStyle.Inherit(tc.Info).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Severity("info")).Render(label))
 			case "DISABLED":
-				parts = append(parts, activeStyle.Inherit(tc.Red).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Danger()).Render(label))
 			default:
-				parts = append(parts, activeStyle.Inherit(tc.Cyan).Render(label))
+				parts = append(parts, activeStyle.Inherit(theme.Accent()).Render(label))
 			}
 		} else {
 			parts = append(parts, inactiveStyle.Render(label))
@@ -706,29 +707,29 @@ func (m arModel) renderPatternDetail(item arHeaderItem) string {
 	pg := item.group
 
 	row := func(label, value string) {
-		b.WriteString(tc.Label.Render(label))
+		b.WriteString(theme.Dim().Width(14).Render(label))
 		b.WriteString(value)
 		b.WriteString("\n")
 	}
 
 	// Header
-	b.WriteString(tc.Title.Render(pg.Pattern))
+	b.WriteString(theme.Title().Render(pg.Pattern))
 	b.WriteString("\n")
-	b.WriteString(tc.Dim.Render(strings.Repeat("─", 36)))
+	b.WriteString(theme.Dim().Render(strings.Repeat("─", 36)))
 	b.WriteString("\n\n")
 
 	row("Rules:", fmt.Sprintf("%d total", pg.Total))
-	row("Max Sev:", tcSevStyle(pg.MaxSeverity).Render(pg.MaxSeverity))
-	row("Enabled:", tc.Green.Render(fmt.Sprintf("%d", pg.Enabled)))
+	row("Max Sev:", theme.SeverityStyle(pg.MaxSeverity).Render(pg.MaxSeverity))
+	row("Enabled:", theme.Success().Render(fmt.Sprintf("%d", pg.Enabled)))
 	if pg.Disabled > 0 {
-		row("Disabled:", tc.Red.Render(fmt.Sprintf("%d", pg.Disabled)))
+		row("Disabled:", theme.Danger().Render(fmt.Sprintf("%d", pg.Disabled)))
 	} else {
-		row("Disabled:", tc.Dim.Render("0"))
+		row("Disabled:", theme.Dim().Render("0"))
 	}
 
 	// Severity distribution
 	b.WriteString("\n")
-	b.WriteString(tc.Label.Render("Severity:"))
+	b.WriteString(theme.Dim().Width(14).Render("Severity:"))
 	b.WriteString("\n")
 	sevCounts := make(map[string]int)
 	for _, r := range m.allRules {
@@ -739,7 +740,7 @@ func (m arModel) renderPatternDetail(item arHeaderItem) string {
 	for _, sev := range []string{"CRITICAL", "HIGH", "MEDIUM", "LOW", "INFO"} {
 		if count, ok := sevCounts[sev]; ok && count > 0 {
 			b.WriteString(" ")
-			b.WriteString(tcSevStyle(sev).Render(fmt.Sprintf("%-10s %d", sev, count)))
+			b.WriteString(theme.SeverityStyle(sev).Render(fmt.Sprintf("%-10s %d", sev, count)))
 			b.WriteString("\n")
 		}
 	}
@@ -752,7 +753,7 @@ func (m arModel) renderRuleDetail(item arRuleItem) string {
 	var b strings.Builder
 
 	row := func(label, value string) {
-		b.WriteString(tc.Label.Render(label))
+		b.WriteString(theme.Dim().Width(14).Render(label))
 		b.WriteString(value)
 		b.WriteString("\n")
 	}
@@ -760,14 +761,14 @@ func (m arModel) renderRuleDetail(item arRuleItem) string {
 	r := item.rule
 
 	// Header
-	b.WriteString(tc.Title.Render(item.rule.ID))
+	b.WriteString(theme.Title().Render(item.rule.ID))
 	b.WriteString("\n")
-	b.WriteString(tc.Dim.Render(strings.Repeat("─", 36)))
+	b.WriteString(theme.Dim().Render(strings.Repeat("─", 36)))
 	b.WriteString("\n\n")
 
 	row("ID:", r.ID)
-	row("Pattern:", tc.Emphasis.Render(r.Pattern))
-	row("Severity:", tcSevStyle(r.Severity).Render(r.Severity))
+	row("Pattern:", theme.Primary().Render(r.Pattern))
+	row("Severity:", theme.SeverityStyle(r.Severity).Render(r.Severity))
 	row("Message:", r.Message)
 	row("Regex:", r.Regex)
@@ -775,9 +776,9 @@
 		row("Exclude:", r.Exclude)
 	}
 
-	statusStr := tc.Green.Render("enabled")
+	statusStr := theme.Success().Render("enabled")
 	if !r.Enabled {
-		statusStr = tc.Red.Render("disabled")
+		statusStr = theme.Danger().Render("disabled")
 	}
 
 	sourceLabel := r.Source
 	switch r.Source {
@@ -788,7 +789,7 @@ func (m arModel) renderRuleDetail(item arRuleItem) string {
 	case "builtin":
 		sourceLabel = "built-in"
 	}
-	row("Status:", statusStr+" "+tc.Dim.Render("("+sourceLabel+")"))
+	row("Status:", statusStr+" "+theme.Dim().Render("("+sourceLabel+")"))
 
 	return b.String()
 }
@@ -820,7 +821,7 @@ func (m arModel) renderFlashAndHelp(scrollInfo string) string {
 		if m.useSplit() {
 			help += " Ctrl+d/u scroll"
 		}
-		b.WriteString(tc.Help.Render(appendScrollInfo(help, scrollInfo)))
+		b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo(help, scrollInfo)))
 	}
 	b.WriteString("\n")
 
@@ -831,10 +832,10 @@ func (m arModel) renderSeverityPicker() string {
 	var parts []string
 	for _, opt := range severityOptions {
-		badge := tcSevStyle(opt.sev).Render(opt.key + " " + opt.sev)
+		badge := theme.SeverityStyle(opt.sev).Render(opt.key + " " + opt.sev)
 		parts = append(parts, badge)
 	}
-	return tc.Help.Render("Set severity: ") + strings.Join(parts, tc.Dim.Render(" ")) + tc.Help.Render(" Esc cancel")
+	return theme.Dim().MarginLeft(2).Render("Set severity: ") + strings.Join(parts, theme.Dim().Render(" ")) + theme.Dim().MarginLeft(2).Render(" Esc cancel")
 }
 
 // runAuditRulesTUI starts the bubbletea TUI for browsing/toggling audit rules.
diff --git a/cmd/skillshare/audit_tui.go b/cmd/skillshare/audit_tui.go
index 1ca13cd7..3d4c4c7c 100644
--- a/cmd/skillshare/audit_tui.go
+++ b/cmd/skillshare/audit_tui.go
@@ -8,6 +8,7 @@ import (
 	"time"
 
 	"skillshare/internal/audit"
+	"skillshare/internal/theme"
 
 	"github.com/charmbracelet/bubbles/list"
 	"github.com/charmbracelet/bubbles/textinput"
@@ -15,21 +16,24 @@ import (
 	"github.com/charmbracelet/lipgloss"
 )
 
-// ac holds audit-specific styles that don't belong in the shared tc palette.
-var ac = struct {
-	File     lipgloss.Style // file:line locations — cyan
-	Snippet  lipgloss.Style // code snippet highlight
-	ItemName lipgloss.Style // list item name — light gray
-}{
-	File:     lipgloss.NewStyle().Foreground(lipgloss.Color("6")),
-	Snippet:  lipgloss.NewStyle().Foreground(lipgloss.Color("179")),
-	ItemName: lipgloss.NewStyle().Foreground(lipgloss.Color("252")),
+type auditTab int
+
+const (
+	auditTabSkills auditTab = iota
+	auditTabAgents
+)
+
+func (t auditTab) noun() string {
+	if t == auditTabAgents {
+		return "agents"
+	}
+	return "skills"
 }
 
 // acSevCount returns severity color for non-zero counts, dim for zero.
 func acSevCount(count int, style lipgloss.Style) lipgloss.Style {
 	if count == 0 {
-		return tc.Dim
+		return theme.Dim()
 	}
 	return style
 }
@@ -38,17 +42,18 @@ func acSevCount(count int, style lipgloss.Style) lipgloss.Style {
 type auditItem struct {
 	result  *audit.Result
 	elapsed time.Duration
+	kind    string // "skill" or "agent"
 }
 
 func (i auditItem) Title() string {
 	name := colorSkillPath(compactAuditPath(i.result.SkillName))
 	if len(i.result.Findings) == 0 {
-		return tc.Green.Render("✓") + " " + name
+		return theme.Success().Render("✓") + " " + name
 	}
 	if i.result.IsBlocked {
-		return tc.Red.Render("✗") + " " + name
+		return theme.Danger().Render("✗") + " " + name
 	}
-	return tc.Yellow.Render("!") + " " + name
+	return theme.Warning().Render("!") + " " + name
 }
 
 // compactAuditPath strips tracked repo prefix (first segment starting with "_")
@@ -69,10 +74,13 @@ func compactAuditPath(name string) string {
 	return strings.Join(segments, "/")
 }
 
-// auditRepoKey extracts the grouping key from a skill name.
+// auditRepoKey extracts the grouping key from a skill/agent name.
+// For tracked repos: "_repo-name/skill" → "_repo-name"
+// For nested agents: "demo/code-reviewer.md" → "demo"
+// For flat names: "my-skill" → "" (standalone)
 func auditRepoKey(name string) string {
 	segments := strings.Split(name, "/")
-	if strings.HasPrefix(segments[0], "_") {
+	if len(segments) > 1 {
 		return segments[0]
 	}
 	return ""
 }
@@ -212,62 +220,86 @@ type auditTUIModel struct {
 	termHeight int
 
 	summary auditRunSummary
+
+	// Tab switching (skills ↔ agents)
+	activeTab    auditTab
+	skillItems   []auditItem
+	agentItems   []auditItem
+	skillSummary auditRunSummary
+	agentSummary auditRunSummary
+	tabCounts    [2]int // [skills, agents]
 }
 
-func newAuditTUIModel(results []*audit.Result, scanOutputs []audit.ScanOutput, summary auditRunSummary) auditTUIModel {
-	// Build items sorted: by severity (findings first), then by name.
-	items := make([]auditItem, 0, len(results))
-	for idx, r := range results {
-		var elapsed time.Duration
-		if idx < len(scanOutputs) {
-			elapsed = scanOutputs[idx].Elapsed
-		}
-		items = append(items, auditItem{result: r, elapsed: elapsed})
-	}
+func sortAuditItems(items []auditItem) {
 	sort.Slice(items, func(i, j int) bool {
 		ri, rj := items[i].result, items[j].result
-		// Primary: group by repo key (tracked repos first, then standalone).
 		ki, kj := auditRepoKey(ri.SkillName), auditRepoKey(rj.SkillName)
 		if ki != kj {
-			// Both tracked: alphabetical
 			if ki != "" && kj != "" {
 				return ki < kj
 			}
-			// Tracked before standalone
 			return ki != ""
 		}
-		// Secondary: skills with findings come first.
 		hasI, hasJ := len(ri.Findings) > 0, len(rj.Findings) > 0
 		if hasI != hasJ {
 			return hasI
 		}
 		if hasI && hasJ {
-			// Higher severity (lower rank) first.
 			rankI := audit.SeverityRank(ri.MaxSeverity())
 			rankJ := audit.SeverityRank(rj.MaxSeverity())
 			if rankI != rankJ {
 				return rankI < rankJ
 			}
-			// Higher risk score first.
 			if ri.RiskScore != rj.RiskScore {
 				return ri.RiskScore > rj.RiskScore
 			}
 		}
 		return ri.SkillName < rj.SkillName
 	})
+}
 
-	// Cap items for list widget performance.
-	allItems := items
-	displayItems := items
+func newAuditTUIModel(
+	skillResults []*audit.Result, skillOutputs []audit.ScanOutput, skillSummary auditRunSummary,
+	agentResults []*audit.Result, agentOutputs []audit.ScanOutput, agentSummary auditRunSummary,
+	initialTab auditTab,
+) auditTUIModel {
+	buildItems := func(results []*audit.Result, outputs []audit.ScanOutput, kind string) []auditItem {
+		items := make([]auditItem, 0, len(results))
+		for idx, r := range results {
+			var elapsed time.Duration
+			if idx < len(outputs) {
+				elapsed = outputs[idx].Elapsed
+			}
+			items = append(items, auditItem{result: r, elapsed: elapsed, kind: kind})
+		}
+		sortAuditItems(items)
+		return items
+	}
+
+	skillItems := buildItems(skillResults, skillOutputs, "skill")
+	agentItems := buildItems(agentResults, agentOutputs, "agent")
+
+	var activeItems []auditItem
+	var activeSummary auditRunSummary
+	if initialTab == auditTabAgents {
+		activeItems = agentItems
+		activeSummary = agentSummary
+	} else {
+		activeItems = skillItems
+		activeSummary = skillSummary
+	}
+
+	displayItems := activeItems
 	if len(displayItems) > maxListItems {
 		displayItems = displayItems[:maxListItems]
 	}
+
 	listItems := buildGroupedAuditItems(displayItems)
 	l := list.New(listItems, auditDelegate{}, 0, 0)
-	l.Title = fmt.Sprintf("Audit results (%d scanned)", summary.Scanned)
-	l.Styles.Title = tc.ListTitle
+	l.Title = fmt.Sprintf("Audit results (%d scanned)", activeSummary.Scanned)
+	l.Styles.Title = theme.Title()
+	l.Styles.NoItems = l.Styles.NoItems.PaddingLeft(2)
+	l.SetStatusBarItemName(initialTab.noun(), initialTab.noun())
 	l.SetShowStatusBar(false)
 	l.SetFilteringEnabled(false)
 	l.SetShowHelp(false)
@@ -275,20 +307,43 @@ func newAuditTUIModel(results []*audit.Result, scanOutputs []audit.ScanOutput, s
 
 	fi := textinput.New()
 	fi.Prompt = "/ "
-	fi.PromptStyle = tc.Filter
-	fi.Cursor.Style = tc.Filter
+	fi.PromptStyle = theme.Accent()
+	fi.Cursor.Style = theme.Accent()
 
 	m := auditTUIModel{
-		list:        l,
-		allItems:    allItems,
-		matchCount:  len(allItems),
-		filterInput: fi,
-		summary:     summary,
+		list:         l,
+		allItems:     activeItems,
+		matchCount:   len(activeItems),
+		filterInput:  fi,
+		summary:      activeSummary,
+		activeTab:    initialTab,
+		skillItems:   skillItems,
+		agentItems:   agentItems,
+		skillSummary: skillSummary,
+		agentSummary: agentSummary,
+		tabCounts:    [2]int{len(skillItems), len(agentItems)},
 	}
 	skipGroupItem(&m.list, 1)
 	return m
 }
 
+func (m *auditTUIModel) switchTab() {
+	if m.activeTab == auditTabAgents {
+		m.allItems = m.agentItems
+		m.summary = m.agentSummary
+	} else {
+		m.allItems = m.skillItems
+		m.summary = m.skillSummary
+	}
+	m.filterText = ""
+	m.filterInput.SetValue("")
+	m.detailScroll = 0
+	m.applyFilter()
+	m.list.Title = fmt.Sprintf("Audit results (%d scanned)", m.summary.Scanned)
+	m.list.SetStatusBarItemName(m.activeTab.noun(), m.activeTab.noun())
+	skipGroupItem(&m.list, 1)
+}
+
 func (m auditTUIModel) Init() tea.Cmd { return nil }
 
 func (m *auditTUIModel) applyFilter() {
@@ -391,6 +446,14 @@ func (m auditTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 			m.detailScroll = 0
 		}
 		return m, nil
+	case "tab":
+		m.activeTab = (m.activeTab + 1) % 2
+		m.switchTab()
+		return m, nil
+	case "shift+tab":
+		m.activeTab = (m.activeTab - 1 + 2) % 2
+		m.switchTab()
+		return m, nil
 	}
 }
@@ -426,6 +489,9 @@ func (m auditTUIModel) View() string {
 	// ── Horizontal split layout ──
 	var b strings.Builder
 
+	b.WriteString(m.renderTabBar())
+	b.WriteString("\n")
+
 	panelHeight := m.auditPanelHeight()
 	leftWidth := auditListWidth(m.termWidth)
@@ -438,7 +504,7 @@ func (m auditTUIModel) View() string {
 		Render(m.list.View())
 
 	// Border column
-	borderStyle := tc.Border.
+	borderStyle := theme.Dim().
 		Height(panelHeight).MaxHeight(panelHeight)
 	borderCol := strings.Repeat("│\n", panelHeight)
 	borderPanel := borderStyle.Render(strings.TrimRight(borderCol, "\n"))
@@ -465,7 +531,7 @@ func (m auditTUIModel) View() string {
 	b.WriteString(m.renderSummaryFooter())
 
 	// Help line
-	b.WriteString(tc.Help.Render(appendScrollInfo("↑↓ navigate ←→ page / filter Ctrl+d/u scroll detail q quit", scrollInfo)))
+	b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo("Tab skills/agents ↑↓ navigate ←→ page / filter Ctrl+d/u scroll detail q quit", scrollInfo)))
 
 	return b.String()
 }
@@ -474,6 +540,9 @@ func (m auditTUIModel) View() string {
 func (m auditTUIModel) viewVertical() string {
 	var b strings.Builder
 
+	b.WriteString(m.renderTabBar())
+	b.WriteString("\n")
+
 	b.WriteString(m.list.View())
 	b.WriteString("\n\n")
 
@@ -489,17 +558,43 @@ func (m auditTUIModel) viewVertical() string {
 	b.WriteString(m.renderSummaryFooter())
 
-	b.WriteString(tc.Help.Render(appendScrollInfo("↑↓ navigate ←→ page / filter Ctrl+d/u scroll q quit", scrollInfo)))
+	b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo("Tab skills/agents ↑↓ navigate ←→ page / filter Ctrl+d/u scroll q quit", scrollInfo)))
 	b.WriteString("\n")
 
 	return b.String()
 }
 
+func (m auditTUIModel) renderTabBar() string {
+	type tab struct {
+		label string
+		tab   auditTab
+		count int
+	}
+	tabs := []tab{
+		{"Skills", auditTabSkills, m.tabCounts[0]},
+		{"Agents", auditTabAgents, m.tabCounts[1]},
+	}
+
+	activeStyle := lipgloss.NewStyle().Bold(true).Underline(true)
+	inactiveStyle := theme.Dim()
+
+	var parts []string
+	for _, t := range tabs {
+		label := fmt.Sprintf("%s(%d)", t.label, t.count)
+		if t.tab == m.activeTab {
+			parts = append(parts, activeStyle.Inherit(theme.Accent()).Render(label))
+		} else {
+			parts = append(parts, inactiveStyle.Render(label))
+		}
+	}
+	return " " + strings.Join(parts, " ")
+}
+
 func (m auditTUIModel) renderFilterBar() string {
 	return renderTUIFilterBar(
 		m.filterInput.View(), m.filtering, m.filterText, m.matchCount, len(m.allItems), maxListItems,
-		"results", m.renderPageInfo(),
+		m.activeTab.noun(), m.renderPageInfo(),
 	)
 }
 
@@ -512,38 +607,38 @@ func (m auditTUIModel) renderPageInfo() string {
 // Line 2 (if any): category threat breakdown.
 func (m auditTUIModel) renderSummaryFooter() string {
 	s := m.summary
-	pipe := tc.Dim.Render(" | ")
+	pipe := theme.Dim().Render(" | ")
 
 	// ── Line 1: counts + severity ──
 	// Label is always dim; value uses semantic color + bold for non-zero emphasis.
 	parts := []string{
-		tc.Dim.Render("Scanned: ") + tc.Emphasis.Render(formatNumber(s.Scanned)),
-		tc.Dim.Render("Passed: ") + tc.Dim.Render(formatNumber(s.Passed)),
+		theme.Dim().Render("Scanned: ") + theme.Primary().Render(formatNumber(s.Scanned)),
+		theme.Dim().Render("Passed: ") + theme.Dim().Render(formatNumber(s.Passed)),
 	}
 	if s.Warning > 0 {
-		parts = append(parts, tc.Dim.Render("Warning: ")+tc.Yellow.Bold(true).Render(formatNumber(s.Warning)))
+		parts = append(parts, theme.Dim().Render("Warning: ")+theme.Warning().Bold(true).Render(formatNumber(s.Warning)))
 	} else {
-		parts = append(parts, tc.Dim.Render("Warning: ")+tc.Dim.Render(formatNumber(s.Warning)))
+		parts = append(parts, theme.Dim().Render("Warning: ")+theme.Dim().Render(formatNumber(s.Warning)))
 	}
 	if s.Failed > 0 {
-		parts = append(parts, tc.Dim.Render("Failed: ")+tc.Red.Bold(true).Render(formatNumber(s.Failed)))
+		parts = append(parts, theme.Dim().Render("Failed: ")+theme.Danger().Bold(true).Render(formatNumber(s.Failed)))
 	} else {
-		parts = append(parts, tc.Dim.Render("Failed: ")+tc.Dim.Render(formatNumber(s.Failed)))
+		parts = append(parts, theme.Dim().Render("Failed: ")+theme.Dim().Render(formatNumber(s.Failed)))
 	}
 
 	sevParts := []string{
-		acSevCount(s.Critical, tc.Critical).Render(fmt.Sprintf("%d", s.Critical)),
-		acSevCount(s.High, tc.High).Render(fmt.Sprintf("%d", s.High)),
-		acSevCount(s.Medium, tc.Medium).Render(fmt.Sprintf("%d", s.Medium)),
-		acSevCount(s.Low, tc.Low).Render(fmt.Sprintf("%d", s.Low)),
-		acSevCount(s.Info, tc.Info).Render(fmt.Sprintf("%d", s.Info)),
+		acSevCount(s.Critical, theme.Severity("critical")).Render(fmt.Sprintf("%d", s.Critical)),
+		acSevCount(s.High, theme.Severity("high")).Render(fmt.Sprintf("%d", s.High)),
+		acSevCount(s.Medium, theme.Severity("medium")).Render(fmt.Sprintf("%d", s.Medium)),
+		acSevCount(s.Low, theme.Severity("low")).Render(fmt.Sprintf("%d", s.Low)),
+		acSevCount(s.Info, theme.Severity("info")).Render(fmt.Sprintf("%d", s.Info)),
 	}
-	sep := tc.Dim.Render("/")
-	parts = append(parts, tc.Dim.Render("c/h/m/l/i = ")+strings.Join(sevParts, sep))
+	sep := theme.Dim().Render("/")
+	parts = append(parts, theme.Dim().Render("c/h/m/l/i = ")+strings.Join(sevParts, sep))
 
-	parts = append(parts, tc.Dim.Render(fmt.Sprintf("Auditable: %.0f%% avg", s.AvgAnalyzability*100)))
+	parts = append(parts, theme.Dim().Render(fmt.Sprintf("Auditable: %.0f%% avg", s.AvgAnalyzability*100)))
 	if s.PolicyProfile != "" {
-		parts = append(parts, tc.Dim.Render("Policy: ")+tuiColorizeProfile(s.PolicyProfile))
+		parts = append(parts, theme.Dim().Render("Policy: ")+tuiColorizeProfile(s.PolicyProfile))
 	}
 
 	var b strings.Builder
@@ -554,7 +649,7 @@ func (m auditTUIModel) renderSummaryFooter() string {
 	// ── Line 2: category breakdown with semantic colors ──
 	if threatsLine := formatCategoryBreakdownTUI(s.ByCategory); threatsLine != "" {
 		b.WriteString(" ")
-		b.WriteString(tc.Dim.Render("Threats: "))
+		b.WriteString(theme.Dim().Render("Threats: "))
 		b.WriteString(threatsLine)
 		b.WriteString("\n")
 	}
@@ -570,24 +665,24 @@ func (m auditTUIModel) renderDetailContent(item auditItem) string {
 	r := item.result
 
 	row := func(label, value string) {
-		b.WriteString(tc.Label.Render(label))
+		b.WriteString(theme.Dim().Width(14).Render(label))
 		b.WriteString(value)
 		b.WriteString("\n")
 	}
 
 	// ── Header ──
-	b.WriteString(tc.Title.Render(r.SkillName))
+	b.WriteString(theme.Title().Render(r.SkillName))
 	b.WriteString("\n")
-	b.WriteString(tc.Dim.Render(strings.Repeat("─", 40)))
+ b.WriteString(theme.Dim().Render(strings.Repeat("─", 40))) b.WriteString("\n\n") // ── Summary section ── // Risk — colorized by severity riskText := fmt.Sprintf("%s (%d/100)", strings.ToUpper(r.RiskLabel), r.RiskScore) - riskStyle := tcSevStyle(r.RiskLabel) + riskStyle := theme.SeverityStyle(r.RiskLabel) if r.RiskLabel == "clean" { - riskStyle = tc.Green + riskStyle = theme.Success() } row("Risk:", riskStyle.Render(riskText)) @@ -596,52 +691,52 @@ func (m auditTUIModel) renderDetailContent(item auditItem) string { if maxSev == "" { maxSev = "NONE" } - maxSevStyle := tcSevStyle(maxSev) + maxSevStyle := theme.SeverityStyle(maxSev) if strings.ToUpper(maxSev) == "NONE" { - maxSevStyle = tc.Green + maxSevStyle = theme.Success() } row("Max sev:", maxSevStyle.Render(maxSev)) // Block status if r.IsBlocked { - row("Status:", tc.Red.Render("✗ BLOCKED")) + row("Status:", theme.Danger().Render("✗ BLOCKED")) } else if len(r.Findings) == 0 { - row("Status:", tc.Green.Render("✓ Clean")) + row("Status:", theme.Success().Render("✓ Clean")) } else { - row("Status:", tc.Yellow.Render("! Has findings (not blocked)")) + row("Status:", theme.Warning().Render("! 
Has findings (not blocked)")) } // Auditable — analyzability percentage auditableText := fmt.Sprintf("%.0f%%", r.Analyzability*100) if r.Analyzability >= 0.70 { - row("Auditable:", tc.Green.Render(auditableText)) + row("Auditable:", theme.Success().Render(auditableText)) } else if r.TotalBytes > 0 { - row("Auditable:", tc.Yellow.Render(auditableText)) + row("Auditable:", theme.Warning().Render(auditableText)) } else { - row("Auditable:", tc.Dim.Render("—")) + row("Auditable:", theme.Dim().Render("—")) } // Commands — tier profile if !r.TierProfile.IsEmpty() { - row("Commands:", tc.Dim.Render(r.TierProfile.String())) + row("Commands:", theme.Dim().Render(r.TierProfile.String())) } // Threshold if r.Threshold != "" { - row("Threshold:", tc.Dim.Render("severity >= ")+tcSevStyle(r.Threshold).Render(strings.ToUpper(r.Threshold))) + row("Threshold:", theme.Dim().Render("severity >= ")+theme.SeverityStyle(r.Threshold).Render(strings.ToUpper(r.Threshold))) } // Policy if m.summary.PolicyProfile != "" { policyText := tuiColorizeProfile(m.summary.PolicyProfile) + - tc.Dim.Render(" / dedupe:") + tuiColorizeDedupe(m.summary.PolicyDedupe) + - tc.Dim.Render(" / analyzers:") + tuiColorizeAnalyzers(m.summary.PolicyAnalyzers) + theme.Dim().Render(" / dedupe:") + tuiColorizeDedupe(m.summary.PolicyDedupe) + + theme.Dim().Render(" / analyzers:") + tuiColorizeAnalyzers(m.summary.PolicyAnalyzers) row("Policy:", policyText) } // Scan time if item.elapsed > 0 { - row("Scan time:", tc.Dim.Render(fmt.Sprintf("%.1fs", item.elapsed.Seconds()))) + row("Scan time:", theme.Dim().Render(fmt.Sprintf("%.1fs", item.elapsed.Seconds()))) } // Severity breakdown — only non-zero counts are colorized; zeros are dim @@ -650,23 +745,23 @@ func (m auditTUIModel) renderDetailContent(item auditItem) string { for _, f := range r.Findings { counts[f.Severity]++ } - sep := tc.Dim.Render("/") - sevLine := acSevCount(counts["CRITICAL"], tc.Critical).Render(fmt.Sprintf("%d", counts["CRITICAL"])) + sep + - 
acSevCount(counts["HIGH"], tc.High).Render(fmt.Sprintf("%d", counts["HIGH"])) + sep + - acSevCount(counts["MEDIUM"], tc.Medium).Render(fmt.Sprintf("%d", counts["MEDIUM"])) + sep + - acSevCount(counts["LOW"], tc.Low).Render(fmt.Sprintf("%d", counts["LOW"])) + sep + - acSevCount(counts["INFO"], tc.Info).Render(fmt.Sprintf("%d", counts["INFO"])) - row("Severity:", tc.Dim.Render("c/h/m/l/i = ")+sevLine) - row("Total:", tc.Emphasis.Render(fmt.Sprintf("%d", len(r.Findings)))+tc.Dim.Render(" finding(s)")) + sep := theme.Dim().Render("/") + sevLine := acSevCount(counts["CRITICAL"], theme.Severity("critical")).Render(fmt.Sprintf("%d", counts["CRITICAL"])) + sep + + acSevCount(counts["HIGH"], theme.Severity("high")).Render(fmt.Sprintf("%d", counts["HIGH"])) + sep + + acSevCount(counts["MEDIUM"], theme.Severity("medium")).Render(fmt.Sprintf("%d", counts["MEDIUM"])) + sep + + acSevCount(counts["LOW"], theme.Severity("low")).Render(fmt.Sprintf("%d", counts["LOW"])) + sep + + acSevCount(counts["INFO"], theme.Severity("info")).Render(fmt.Sprintf("%d", counts["INFO"])) + row("Severity:", theme.Dim().Render("c/h/m/l/i = ")+sevLine) + row("Total:", theme.Primary().Render(fmt.Sprintf("%d", len(r.Findings)))+theme.Dim().Render(" finding(s)")) } b.WriteString("\n") // ── Findings detail ── if len(r.Findings) > 0 { - b.WriteString(tc.Title.Render("Findings")) + b.WriteString(theme.Title().Render("Findings")) b.WriteString("\n") - b.WriteString(tc.Dim.Render(strings.Repeat("─", 40))) + b.WriteString(theme.Dim().Render(strings.Repeat("─", 40))) b.WriteString("\n\n") sorted := make([]audit.Finding, len(r.Findings)) @@ -677,34 +772,34 @@ func (m auditTUIModel) renderDetailContent(item auditItem) string { for idx, f := range sorted { // [N] SEVERITY pattern - sevBadge := tcSevStyle(f.Severity).Render(strings.ToUpper(f.Severity)) - header := tc.Dim.Render(fmt.Sprintf("[%d] ", idx+1)) - patternText := tc.Emphasis.Bold(true).Render(f.Pattern) + sevBadge := 
theme.SeverityStyle(f.Severity).Render(strings.ToUpper(f.Severity)) + header := theme.Dim().Render(fmt.Sprintf("[%d] ", idx+1)) + patternText := theme.Primary().Bold(true).Render(f.Pattern) b.WriteString(header + sevBadge + " " + patternText + "\n") // Message - b.WriteString(tc.Dim.Render(" ")) - b.WriteString(tc.Dim.Render(f.Message)) + b.WriteString(theme.Dim().Render(" ")) + b.WriteString(theme.Dim().Render(f.Message)) b.WriteString("\n") // Metadata: ruleID / analyzer / category if meta := findingMetaTUI(f); meta != "" { - b.WriteString(tc.Dim.Render(" ")) - b.WriteString(ac.File.Render(meta)) + b.WriteString(theme.Dim().Render(" ")) + b.WriteString(theme.Accent().Render(meta)) b.WriteString("\n") } // Location: file:line loc := fmt.Sprintf("%s:%d", f.File, f.Line) - b.WriteString(tc.Dim.Render(" ")) - b.WriteString(ac.File.Render(loc)) + b.WriteString(theme.Dim().Render(" ")) + b.WriteString(theme.Accent().Render(loc)) b.WriteString("\n") // Snippet — with │ gutter if f.Snippet != "" { - gutter := tc.Dim.Render(" │ ") + gutter := theme.Dim().Render(" │ ") b.WriteString(gutter) - b.WriteString(ac.Snippet.Render(f.Snippet)) + b.WriteString(theme.Warning().Render(f.Snippet)) b.WriteString("\n") } @@ -716,9 +811,9 @@ func (m auditTUIModel) renderDetailContent(item auditItem) string { } // auditFooterLines returns the number of lines the footer occupies below the panel. 
-// gap(2) + filter(1) + summary(1-2) + help(1) = 5 or 6 +// gap(2) + tab(1) + filter(1) + summary(1-2) + help(1) = 6 or 7 func (m auditTUIModel) auditFooterLines() int { - n := 5 // gap(2) + filter + summary-line1 + help + n := 6 // gap(2) + tab(1) + filter + summary-line1 + help if len(m.summary.ByCategory) > 0 { n++ // summary-line2 (threats) } @@ -764,23 +859,23 @@ func auditDetailPanelWidth(termWidth int) int { func tuiColorizeProfile(profile string) string { label := policyProfileLabel(profile) if label == "STRICT" { - return tc.Yellow.Render(label) + return theme.Warning().Render(label) } - return tc.Dim.Render(label) + return theme.Dim().Render(label) } // tuiColorizeDedupe returns a lipgloss-styled UPPERCASE dedupe mode. func tuiColorizeDedupe(dedupe string) string { label := policyDedupeLabel(dedupe) if label == "LEGACY" { - return tc.Yellow.Render(label) + return theme.Warning().Render(label) } - return tc.Dim.Render(label) + return theme.Dim().Render(label) } // tuiColorizeAnalyzers returns a lipgloss-styled UPPERCASE analyzer list. func tuiColorizeAnalyzers(analyzers []string) string { - return tc.Dim.Render(policyAnalyzersLabel(analyzers)) + return theme.Dim().Render(policyAnalyzersLabel(analyzers)) } // findingMetaTUI builds a compact "ruleID / analyzer / category" string for TUI detail. @@ -803,8 +898,12 @@ func findingMetaTUI(f audit.Finding) string { } // runAuditTUI starts the bubbletea TUI for audit results. 
-func runAuditTUI(results []*audit.Result, scanOutputs []audit.ScanOutput, summary auditRunSummary) error { - model := newAuditTUIModel(results, scanOutputs, summary) +func runAuditTUI( + skillResults []*audit.Result, skillOutputs []audit.ScanOutput, skillSummary auditRunSummary, + agentResults []*audit.Result, agentOutputs []audit.ScanOutput, agentSummary auditRunSummary, + initialTab auditTab, +) error { + model := newAuditTUIModel(skillResults, skillOutputs, skillSummary, agentResults, agentOutputs, agentSummary, initialTab) p := tea.NewProgram(model, tea.WithAltScreen(), tea.WithMouseCellMotion()) _, err := p.Run() return err diff --git a/cmd/skillshare/backup.go b/cmd/skillshare/backup.go index ad34eccf..43a80eaa 100644 --- a/cmd/skillshare/backup.go +++ b/cmd/skillshare/backup.go @@ -4,6 +4,7 @@ import ( "fmt" "os" "path/filepath" + "sort" "strings" "time" @@ -21,10 +22,21 @@ func cmdBackup(args []string) error { if err != nil { return err } - if mode == modeProject { - return fmt.Errorf("backup is not supported in project mode") + + // Extract kind filter (e.g. "skillshare backup agents" or "--all"). + kind, args := parseKindArgWithAll(args) + + // Project mode is only supported for agents. 
+ if mode == modeProject && kind != kindAgents && kind != kindAll {
+ return fmt.Errorf("backup is not supported in project mode (except for agents)")
 }
 
+ cwd, _ := os.Getwd()
+ if mode == modeAuto && kind == kindAgents && projectConfigExists(cwd) {
+ mode = modeProject
+ }
+ applyModeLabel(mode)
+
 start := time.Now()
 var targetName string
 doList := false
@@ -63,7 +75,11 @@ func cmdBackup(args []string) error {
 return backupCleanup()
 }
 
- err = createBackup(targetName, dryRun)
+ if kind == kindAgents {
+ err = createAgentBackup(mode, cwd, targetName, dryRun)
+ } else {
+ err = createBackup(targetName, dryRun)
+ }
 
 if !dryRun {
 e := oplog.NewEntry("backup", statusFromErr(err), time.Since(start))
@@ -322,11 +338,23 @@ func cmdRestore(args []string) error {
 if err != nil {
 return err
 }
- if mode == modeProject {
- return fmt.Errorf("restore is not supported in project mode")
+
+ // Extract kind filter (e.g. "skillshare restore agents" or "--all").
+ kind, args := parseKindArgWithAll(args)
+
+ // Project mode is only supported for agents.
+ if mode == modeProject && kind != kindAgents && kind != kindAll { + return fmt.Errorf("restore is not supported in project mode (except for agents)") + } + + cwd, _ := os.Getwd() + if mode == modeAuto && kind == kindAgents && projectConfigExists(cwd) { + mode = modeProject } + applyModeLabel(mode) start := time.Now() + _ = start // used below var targetName string var fromTimestamp string @@ -357,6 +386,11 @@ func cmdRestore(args []string) error { } } + // Agent restore uses agent-specific backup entries (name suffixed with "-agents") + if kind == kindAgents { + return restoreAgentBackup(mode, cwd, targetName, fromTimestamp, force, dryRun) + } + // No target specified → TUI dispatch (or plain text fallback) if targetName == "" && fromTimestamp == "" && !dryRun { return restoreTUIDispatch(noTUI) @@ -460,22 +494,41 @@ func restoreTUIDispatch(noTUI bool) error { if projectConfigExists(cwd) { mode = modeProject } - trashBase := resolveTrashBase(mode, cwd) - items := trash.List(trashBase) + skillTrashBase := resolveTrashBase(mode, cwd, kindSkills) + agentTrashBase := resolveTrashBase(mode, cwd, kindAgents) + + // Merge skill + agent trash + var items []trash.TrashEntry + for _, e := range trash.List(skillTrashBase) { + e.Kind = "skill" + items = append(items, e) + } + for _, e := range trash.List(agentTrashBase) { + e.Kind = "agent" + items = append(items, e) + } if len(items) == 0 { ui.Info("Trash is empty") return nil } + sort.Slice(items, func(i, j int) bool { + return items[i].Date.After(items[j].Date) + }) + modeLabel := "global" if mode == modeProject { modeLabel = "project" } cfgPath := resolveTrashCfgPath(mode, cwd) - destDir, err := resolveSourceDir(mode, cwd) + destDir, err := resolveSourceDir(mode, cwd, kindSkills) + if err != nil { + return err + } + agentDestDir, err := resolveSourceDir(mode, cwd, kindAgents) if err != nil { return err } - return runTrashTUI(items, trashBase, destDir, cfgPath, modeLabel) + return runTrashTUI(items, skillTrashBase, 
agentTrashBase, destDir, agentDestDir, cfgPath, modeLabel) } return nil @@ -503,6 +556,28 @@ func restoreFromLatest(targetName, targetPath string, opts backup.RestoreOptions return nil } +func restoreFromTimestampInDir(backupDir, targetName, targetPath, timestamp string, opts backup.RestoreOptions) error { + backupInfo, err := backup.GetBackupByTimestampInDir(backupDir, timestamp) + if err != nil { + return err + } + + if err := backup.RestoreToPath(backupInfo.Path, targetName, targetPath, opts); err != nil { + return err + } + ui.Success("Restored %s from backup %s", targetName, timestamp) + return nil +} + +func restoreFromLatestInDir(backupDir, targetName, targetPath string, opts backup.RestoreOptions) error { + timestamp, err := backup.RestoreLatestInDir(backupDir, targetName, targetPath, opts) + if err != nil { + return err + } + ui.Success("Restored %s from latest backup (%s)", targetName, timestamp) + return nil +} + func previewRestoreFromTimestamp(targetName, targetPath, timestamp string, opts backup.RestoreOptions) error { backupInfo, err := backup.GetBackupByTimestamp(timestamp) if err != nil { @@ -537,7 +612,7 @@ func previewRestoreFromLatest(targetName, targetPath string, opts backup.Restore } func printBackupHelp() { - fmt.Println(`Usage: skillshare backup [target] [options] + fmt.Println(`Usage: skillshare backup [agents] [target] [options] Create a snapshot of target skill directories. Without arguments, backs up all targets. 
@@ -546,6 +621,9 @@ Arguments: target Target name to backup (optional; backs up all if omitted) Options: + --all Backup both skills and agents + --project, -p Use project mode (.skillshare/backups/); agents only + --global, -g Use global mode (default for skills) --list, -l List all existing backups --cleanup, -c Remove old backups based on retention policy --dry-run, -n Preview what would be backed up or cleaned up @@ -557,11 +635,14 @@ Examples: skillshare backup claude # Backup only claude skillshare backup --list # List all backups skillshare backup --cleanup # Remove old backups - skillshare backup --cleanup --dry-run # Preview cleanup`) + skillshare backup --cleanup --dry-run # Preview cleanup + skillshare backup agents # Backup all agent targets + skillshare backup agents -p # Backup project agent targets + skillshare backup --all # Backup skills + agents`) } func printRestoreHelp() { - fmt.Println(`Usage: skillshare restore [target] [options] + fmt.Println(`Usage: skillshare restore [agents] [target] [options] Restore target skills from a backup snapshot. Without arguments, launches an interactive TUI. @@ -570,6 +651,9 @@ Arguments: target Target name to restore (optional) Options: + --all Restore both skills and agents + --project, -p Use project mode (.skillshare/backups/); agents only + --global, -g Use global mode (default for skills) --from, -f Restore from specific timestamp (e.g. 
2024-01-15_14-30-45) --force Overwrite non-empty target directory --dry-run, -n Preview what would be restored without making changes @@ -581,5 +665,8 @@ Examples: skillshare restore claude # Restore claude from latest backup skillshare restore claude --from 2024-01-15_14-30-45 skillshare restore claude --dry-run # Preview restore - skillshare restore --no-tui # List backups (no TUI)`) + skillshare restore --no-tui # List backups (no TUI) + skillshare restore agents claude # Restore agents claude target + skillshare restore agents claude -p # Restore project agents + skillshare restore --all claude # Restore skills + agents`) } diff --git a/cmd/skillshare/backup_agents.go b/cmd/skillshare/backup_agents.go new file mode 100644 index 00000000..01b16674 --- /dev/null +++ b/cmd/skillshare/backup_agents.go @@ -0,0 +1,158 @@ +package main + +import ( + "fmt" + "path/filepath" + + "skillshare/internal/backup" + "skillshare/internal/config" + "skillshare/internal/ui" +) + +// createAgentBackup backs up agent target directories. +// Agent backups use "-agents" as the backup entry name. +// In project mode, backups are stored under .skillshare/backups/. 
+func createAgentBackup(mode runMode, cwd, targetName string, dryRun bool) error {
+ backupDir, targets, err := resolveAgentBackupContext(mode, cwd)
+ if err != nil {
+ return err
+ }
+
+ modeLabel := "global"
+ if mode == modeProject {
+ modeLabel = "project"
+ }
+
+ ui.Header(fmt.Sprintf("Creating agent backup (%s)", modeLabel))
+ if dryRun {
+ ui.Warning("Dry run mode - no backups will be created")
+ }
+
+ created := 0
+ for _, at := range targets {
+ if targetName != "" && at.name != targetName {
+ continue
+ }
+
+ entryName := at.name + "-agents"
+
+ if dryRun {
+ ui.Info("%s: would backup agents from %s", entryName, at.agentPath)
+ continue
+ }
+
+ backupPath, backupErr := backup.CreateInDir(backupDir, entryName, at.agentPath)
+ if backupErr != nil {
+ ui.Warning("Failed to backup %s: %v", entryName, backupErr)
+ continue
+ }
+ if backupPath != "" {
+ ui.StepDone(entryName, backupPath)
+ created++
+ } else {
+ ui.StepSkip(entryName, "nothing to backup")
+ }
+ }
+
+ if created == 0 && !dryRun {
+ ui.Info("No agent targets to backup")
+ }
+
+ return nil
+}
+
+// restoreAgentBackup restores agent target directories from backup.
+func restoreAgentBackup(mode runMode, cwd, targetName, fromTimestamp string, force, dryRun bool) error {
+ if targetName == "" {
+ return fmt.Errorf("usage: skillshare restore agents <target> [--from <timestamp>] [--force] [--dry-run]")
+ }
+
+ backupDir, targets, err := resolveAgentBackupContext(mode, cwd)
+ if err != nil {
+ return err
+ }
+
+ // Find the target's agent path.
+ var agentPath string + for _, at := range targets { + if at.name == targetName { + agentPath = at.agentPath + break + } + } + if agentPath == "" { + return fmt.Errorf("target '%s' has no agent path configured", targetName) + } + + entryName := targetName + "-agents" + ui.Header(fmt.Sprintf("Restoring agents for %s", targetName)) + + if dryRun { + ui.Warning("Dry run mode - no changes will be made") + ui.Info("Would restore %s to %s", entryName, agentPath) + return nil + } + + opts := backup.RestoreOptions{Force: force} + if fromTimestamp != "" { + return restoreFromTimestampInDir(backupDir, entryName, agentPath, fromTimestamp, opts) + } + return restoreFromLatestInDir(backupDir, entryName, agentPath, opts) +} + +// agentTarget holds resolved name + agent path for backup/restore. +type agentTarget struct { + name string + agentPath string +} + +// resolveAgentBackupContext returns the backup directory and agent-capable targets +// for the given mode. +func resolveAgentBackupContext(mode runMode, cwd string) (string, []agentTarget, error) { + if mode == modeProject { + return resolveProjectAgentBackupContext(cwd) + } + return resolveGlobalAgentBackupContext() +} + +func resolveGlobalAgentBackupContext() (string, []agentTarget, error) { + cfg, err := config.Load() + if err != nil { + return "", nil, err + } + return resolveGlobalAgentBackupContextFromCfg(cfg) +} + +// resolveGlobalAgentBackupContextFromCfg is like resolveGlobalAgentBackupContext +// but accepts an already-loaded config to avoid redundant config.Load() calls. 
+func resolveGlobalAgentBackupContextFromCfg(cfg *config.Config) (string, []agentTarget, error) { + builtinAgents := config.DefaultAgentTargets() + var targets []agentTarget + for name := range cfg.Targets { + agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name) + if agentPath != "" { + targets = append(targets, agentTarget{name: name, agentPath: agentPath}) + } + } + + return backup.BackupDir(), targets, nil +} + +func resolveProjectAgentBackupContext(cwd string) (string, []agentTarget, error) { + projCfg, err := config.LoadProject(cwd) + if err != nil { + return "", nil, fmt.Errorf("cannot load project config: %w", err) + } + + builtinAgents := config.ProjectAgentTargets() + var targets []agentTarget + for _, entry := range projCfg.Targets { + agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, cwd) + if agentPath != "" { + targets = append(targets, agentTarget{name: entry.Name, agentPath: agentPath}) + } + } + + backupDir := filepath.Join(cwd, ".skillshare", "backups") + return backupDir, targets, nil +} diff --git a/cmd/skillshare/check.go b/cmd/skillshare/check.go index 94c7f543..17c2b635 100644 --- a/cmd/skillshare/check.go +++ b/cmd/skillshare/check.go @@ -51,7 +51,7 @@ type checkOptions struct { type skillWithMeta struct { name string path string - meta *install.SkillMeta + meta *install.MetadataEntry } // collectCheckItems reads metadata and partitions items for parallel checking. @@ -69,31 +69,34 @@ func collectCheckItems(sourceDir string, repos []string, skills []string) ( }) } + // Load centralized metadata store once for all skills. 
+ store := install.LoadMetadataOrNew(sourceDir) + urlGroups := make(map[string][]skillWithMeta) var localResults []checkSkillResult for _, skill := range skills { skillPath := filepath.Join(sourceDir, skill) - meta, err := install.ReadMeta(skillPath) + entry := store.GetByPath(skill) - if err != nil || meta == nil || meta.RepoURL == "" { + if entry == nil || entry.RepoURL == "" { result := checkSkillResult{Name: skill, Status: "local"} - if meta != nil { - result.Source = meta.Source - result.Version = meta.Version - if !meta.InstalledAt.IsZero() { - result.InstalledAt = meta.InstalledAt.Format("2006-01-02") + if entry != nil { + result.Source = entry.Source + result.Version = entry.Version + if !entry.InstalledAt.IsZero() { + result.InstalledAt = entry.InstalledAt.Format("2006-01-02") } } localResults = append(localResults, result) continue } - groupKey := urlBranchKey(meta.RepoURL, meta.Branch) + groupKey := urlBranchKey(entry.RepoURL, entry.Branch) urlGroups[groupKey] = append(urlGroups[groupKey], skillWithMeta{ name: skill, path: skillPath, - meta: meta, + meta: entry, }) } @@ -171,6 +174,9 @@ func cmdCheck(args []string) error { applyModeLabel(mode) + // Extract kind filter (e.g. "skillshare check agents" or "--all"). + kind, rest := parseKindArgWithAll(rest) + scope := "global" if mode == modeProject { scope = "project" @@ -188,6 +194,12 @@ func cmdCheck(args []string) error { cfgPath := config.ConfigPath() if mode == modeProject { cfgPath = config.ProjectConfigPath(cwd) + if kind == kindAgents { + agentsDir := filepath.Join(cwd, ".skillshare", "agents") + renderAgentCheck(agentsDir, opts.groups, opts.json) + logCheckOp(cfgPath, 0, 0, 0, 0, scope, start, nil) + return nil + } cmdErr := cmdCheckProject(cwd, opts) logCheckOp(cfgPath, 0, 0, 0, 0, scope, start, cmdErr) return cmdErr @@ -198,6 +210,14 @@ func cmdCheck(args []string) error { return err } + // Agent-only check: scan agents source directory and skip repo checks. 
+ if kind == kindAgents { + agentsDir := cfg.EffectiveAgentsSource() + renderAgentCheck(agentsDir, opts.groups, opts.json) + logCheckOp(cfgPath, 0, 0, 0, 0, scope, start, nil) + return nil + } + // No names and no groups → check all (existing behavior) if len(opts.names) == 0 && len(opts.groups) == 0 { cmdErr := runCheck(cfg.Source, opts.json, targetNamesFromConfig(cfg.Targets)) @@ -522,7 +542,7 @@ func resolveSkillStatuses( // Pre-fill base result fields for each skill type pending struct { result checkSkillResult - meta *install.SkillMeta + meta *install.MetadataEntry } items := make([]pending, len(group)) for i, sw := range group { @@ -631,13 +651,14 @@ func runCheckFiltered(sourceDir string, opts *checkOptions) error { resolveSpinner = ui.StartSpinner("Resolving skills...") } + checkStore, _ := install.LoadMetadata(sourceDir) var targets []updateTarget seen := map[string]bool{} var resolveWarnings []string for _, name := range opts.names { // Check group directory first (same logic as update) - if isGroupDir(name, sourceDir) { + if isGroupDir(name, sourceDir, checkStore) { groupMatches, groupErr := resolveGroupUpdatable(name, sourceDir) if groupErr != nil { resolveWarnings = append(resolveWarnings, fmt.Sprintf("%s: %v", name, groupErr)) @@ -727,10 +748,12 @@ func runCheckFiltered(sourceDir string, opts *checkOptions) error { // Single skill: show per-skill detail like update does if len(targets) == 1 && !targets[0].isRepo { t := targets[0] - skillPath := filepath.Join(sourceDir, t.name) - if meta, metaErr := install.ReadMeta(skillPath); metaErr == nil && meta != nil && meta.Source != "" { - ui.StepContinue("Skill", t.name) - ui.StepContinue("Source", meta.Source) + detailStore, _ := install.LoadMetadata(sourceDir) + if detailStore != nil { + if entry := detailStore.Get(t.name); entry != nil && entry.Source != "" { + ui.StepContinue("Skill", t.name) + ui.StepContinue("Source", entry.Source) + } } } } @@ -892,8 +915,78 @@ func formatSourceShort(source string) 
string { return source } +// renderAgentCheck runs CheckAgents and displays results (text or JSON). +// If groups is non-empty, only agents in those group subdirectories are shown. +func renderAgentCheck(agentsDir string, groups []string, jsonMode bool) { + agentResults := check.CheckAgents(agentsDir) + + if len(groups) > 0 { + filtered, err := filterAgentResultsByGroups(agentResults, groups, agentsDir) + if err != nil { + if jsonMode { + writeJSONError(err) //nolint:errcheck + return + } + ui.Error("%v", err) + return + } + agentResults = filtered + } + + trackedIndices := make([]int, 0, len(agentResults)) + tracked := make([]check.AgentCheckResult, 0, len(agentResults)) + for i := range agentResults { + if agentResults[i].Source != "" { + trackedIndices = append(trackedIndices, i) + tracked = append(tracked, agentResults[i]) + } + } + if len(tracked) > 0 { + if jsonMode { + check.EnrichAgentResultsWithRemote(tracked, nil) + } else { + sp := ui.StartSpinner(fmt.Sprintf("Checking %d tracked agent(s)...", len(tracked))) + check.EnrichAgentResultsWithRemote(tracked, func() { sp.Success("Check complete") }) + fmt.Println() + } + for i, idx := range trackedIndices { + agentResults[idx] = tracked[i] + } + } + + if jsonMode { + out, _ := json.MarshalIndent(agentResults, "", " ") + fmt.Println(string(out)) + return + } + ui.Header(ui.WithModeLabel("Checking agents")) + ui.StepStart("Agents source", agentsDir) + if len(agentResults) == 0 { + ui.Info("No agents found") + } else { + fmt.Println() + for _, r := range agentResults { + switch r.Status { + case "up_to_date": + ui.ListItem("success", r.Name, "up to date") + case "update_available": + ui.ListItem("warning", r.Name, "update available") + case "drifted": + ui.ListItem("warning", r.Name, r.Message) + case "dirty": + ui.ListItem("warning", r.Name, r.Message) + case "local": + ui.ListItem("info", r.Name, "local agent") + case "error": + ui.ListItem("error", r.Name, r.Message) + } + } + } + fmt.Println() +} + func 
printCheckHelp() {
- fmt.Println(`Usage: skillshare check [name...] [options]
+ fmt.Println(`Usage: skillshare check [agents] [name...] [options]
 skillshare check --group <name> [options]
 
 Check for available updates to tracked repositories and installed skills.
@@ -908,11 +1001,12 @@ Arguments:
 name... Skill name(s) or tracked repo name(s) (optional)
 
 Options:
- --group, -G Check all updatable skills in a group (repeatable)
- --project, -p Check project-level skills (.skillshare/)
- --global, -g Check global skills (~/.config/skillshare)
- --json Output results as JSON
- --help, -h Show this help
+ --all Check both skills and agents
+ --group, -G Check all updatable skills in a group (repeatable)
+ --project, -p Check project-level skills (.skillshare/)
+ --global, -g Check global skills (~/.config/skillshare)
+ --json Output results as JSON
+ --help, -h Show this help
 
 Examples:
 skillshare check # Check all items
@@ -921,5 +1015,8 @@ Examples:
 skillshare check --group frontend # Check all skills in frontend/
 skillshare check x -G backend # Mix names and groups
 skillshare check --json # Output as JSON (for CI)
- skillshare check -p # Check project skills`)
+ skillshare check -p # Check project skills
+ skillshare check agents # Check all agents
+ skillshare check agents -G demo # Check agents in demo/
+ skillshare check --all # Check skills + agents`)
 }
diff --git a/cmd/skillshare/checklist_tui.go b/cmd/skillshare/checklist_tui.go
index ff9d97a0..e993876f 100644
--- a/cmd/skillshare/checklist_tui.go
+++ b/cmd/skillshare/checklist_tui.go
@@ -4,6 +4,8 @@ import (
 "fmt"
 "strings"
 
+ "skillshare/internal/theme"
+
 "github.com/charmbracelet/bubbles/list"
 tea "github.com/charmbracelet/bubbletea"
 )
@@ -78,12 +80,26 @@ type checklistModel struct {
 
 func newChecklistModel(cfg checklistConfig) checklistModel {
 sel := make(map[int]bool, len(cfg.items))
 selCount := 0
+ selectedIdx := -1
 for i, item := range cfg.items {
 if item.preSelected {
+ if cfg.singleSelect {
+ if selectedIdx == -1 {
+ sel[i] = true + selCount = 1 + selectedIdx = i + } + continue + } sel[i] = true selCount++ } } + if cfg.singleSelect && len(cfg.items) > 0 && selectedIdx == -1 { + sel[0] = true + selCount = 1 + selectedIdx = 0 + } hasDesc := false for _, item := range cfg.items { @@ -96,10 +112,13 @@ func newChecklistModel(cfg checklistConfig) checklistModel { listItems := makeChecklistItems(cfg.items, sel, cfg.singleSelect) l := list.New(listItems, newPrefixDelegate(hasDesc), 0, 0) - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(true) l.SetFilteringEnabled(true) l.SetShowHelp(false) + if cfg.singleSelect && selectedIdx >= 0 { + l.Select(selectedIdx) + } itemName := cfg.itemName if itemName == "" { @@ -170,17 +189,8 @@ func (m checklistModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { break } if m.singleSelect { - // Radio: clear all, select this one - wasSelected := m.selected[item.idx] - for k := range m.selected { - delete(m.selected, k) - } - if !wasSelected { - m.selected[item.idx] = true - m.selCount = 1 - } else { - m.selCount = 0 - } + // Radio: select the focused item and keep exactly one choice active. 
+			m.selectSingle(item.idx)
 		} else {
 			m.selected[item.idx] = !m.selected[item.idx]
 			if m.selected[item.idx] {
@@ -210,17 +220,12 @@ func (m checklistModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {

 	case "enter":
 		if m.singleSelect {
-			// Single select: enter confirms current cursor item
 			item, ok := m.list.SelectedItem().(checklistItem)
 			if !ok {
 				break
 			}
-			// If nothing was space-toggled, select the cursor item
-			if m.selCount == 0 {
-				m.result = []int{item.idx}
-			} else {
-				m.result = m.collectSelected()
-			}
+			m.selectSingle(item.idx)
+			m.result = []int{item.idx}
 		} else {
 			m.result = m.collectSelected()
 		}
@@ -235,9 +240,28 @@ func (m checklistModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {

 	var cmd tea.Cmd
 	m.list, cmd = m.list.Update(msg)
+	// Use the selected item's stored idx rather than m.list.Index(): once a
+	// filter is applied, list indices refer to the filtered view, not m.items.
+	if m.singleSelect && m.list.FilterState() != list.Filtering {
+		if item, ok := m.list.SelectedItem().(checklistItem); ok {
+			m.selectSingle(item.idx)
+		}
+	}
 	return m, cmd
 }

+func (m *checklistModel) selectSingle(idx int) {
+	if !m.singleSelect || idx < 0 || idx >= len(m.items) {
+		return
+	}
+	for k := range m.selected {
+		delete(m.selected, k)
+	}
+	m.selected[idx] = true
+	m.selCount = 1
+	m.refreshItems()
+}
+
 func (m *checklistModel) refreshItems() {
 	cursor := m.list.Index()
 	items := makeChecklistItems(m.items, m.selected, m.singleSelect)
@@ -267,14 +287,14 @@ func (m checklistModel) View() string {

 	var help string
 	if m.singleSelect {
-		help = "↑↓ navigate  space/enter select  / filter  esc cancel"
+		help = "↑↓ navigate  enter select  / filter  esc cancel"
 	} else {
 		help = "↑↓ navigate  space toggle  a all  enter confirm  / filter  esc cancel"
 	}
 	if m.header != "" {
 		help += "  │  " + m.header
 	}
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(theme.Dim().MarginLeft(2).Render(help))
 	b.WriteString("\n")

 	return b.String()
diff --git a/cmd/skillshare/checklist_tui_test.go b/cmd/skillshare/checklist_tui_test.go
index d6b9d0f6..b59aef79 100644
--- a/cmd/skillshare/checklist_tui_test.go
+++ b/cmd/skillshare/checklist_tui_test.go
@@ -1,6 +1,10 @@
 package main

-import "testing"
+import (
+	"testing"
+
+	tea "github.com/charmbracelet/bubbletea"
+)

 func TestChecklistItem_Title_MultiSelect(t *testing.T) {
 	item := checklistItem{idx: 0, label: "claude", selected: false, singleSelect: false}
@@ -47,6 +51,115 @@ func TestChecklistItem_PreSelected(t *testing.T) {
 	}
 }

+func TestChecklistItem_PreSelected_SingleSelectUsesFirstAndMovesCursor(t *testing.T) {
+	cfg := checklistConfig{
+		title:        "Test",
+		singleSelect: true,
+		items: []checklistItemData{
+			{label: "a", preSelected: false},
+			{label: "b", preSelected: true},
+			{label: "c", preSelected: true},
+		},
+	}
+	m := newChecklistModel(cfg)
+	if m.selCount != 1 {
+		t.Fatalf("selCount: got %d, want 1", m.selCount)
+	}
+	if !m.selected[1] {
+		t.Fatalf("item 1 should be selected")
+	}
+	if m.list.Index() != 1 {
+		t.Fatalf("cursor index: got %d, want 1", m.list.Index())
+	}
+}
+
+func TestChecklistItem_SingleSelectDefaultsToFirst(t *testing.T) {
+	cfg := checklistConfig{
+		title:        "Test",
+		singleSelect: true,
+		items: []checklistItemData{
+			{label: "a"},
+			{label: "b"},
+		},
+	}
+	m := newChecklistModel(cfg)
+	if m.selCount != 1 {
+		t.Fatalf("selCount: got %d, want 1", m.selCount)
+	}
+	if !m.selected[0] {
+		t.Fatalf("item 0 should be selected by default")
+	}
+	if m.list.Index() != 0 {
+		t.Fatalf("cursor index: got %d, want 0", m.list.Index())
+	}
+}
+
+func TestChecklistItem_SingleSelectNavigationUpdatesSelection(t *testing.T) {
+	cfg := checklistConfig{
+		title:        "Test",
+		singleSelect: true,
+		items: []checklistItemData{
+			{label: "a", preSelected: true},
+			{label: "b"},
+			{label: "c"},
+		},
+	}
+	model := newChecklistModel(cfg)
+	modelAny, _ := model.Update(tea.KeyMsg{Type: tea.KeyDown})
+	m := modelAny.(checklistModel)
+
+	if !m.selected[1] {
+		t.Fatalf("item 1 should be selected after moving cursor down")
+	}
+	if m.selected[0] {
+		t.Fatalf("item 0 should no longer be selected after moving cursor down")
+	}
+	if m.list.Index() != 1 {
+		t.Fatalf("cursor index: got %d, want 1", m.list.Index())
+	}
+}
+
+func TestChecklistItem_SingleSelectEnterConfirmsFocusedItem(t *testing.T) {
+	cfg := checklistConfig{
+		title:        "Test",
+		singleSelect: true,
+		items: []checklistItemData{
+			{label: "a", preSelected: true},
+			{label: "b"},
+		},
+	}
+	model := newChecklistModel(cfg)
+	modelAny, _ := model.Update(tea.KeyMsg{Type: tea.KeyDown})
+	m := modelAny.(checklistModel)
+	modelAny, _ = m.Update(tea.KeyMsg{Type: tea.KeyEnter})
+	m = modelAny.(checklistModel)
+
+	if len(m.result) != 1 || m.result[0] != 1 {
+		t.Fatalf("result: got %v, want [1]", m.result)
+	}
+}
+
+func TestChecklistItem_SingleSelectSpaceDoesNotClearSelection(t *testing.T) {
+	cfg := checklistConfig{
+		title:        "Test",
+		singleSelect: true,
+		items: []checklistItemData{
+			{label: "a", preSelected: true},
+			{label: "b"},
+		},
+	}
+	model := newChecklistModel(cfg)
+	modelAny, _ := model.Update(tea.KeyMsg{Type: tea.KeySpace})
+	m := modelAny.(checklistModel)
+
+	if m.selCount != 1 {
+		t.Fatalf("selCount: got %d, want 1", m.selCount)
+	}
+	if !m.selected[0] {
+		t.Fatalf("item 0 should remain selected after pressing space")
+	}
+}
+
 func TestChecklistItem_FilterValue(t *testing.T) {
 	item := checklistItem{idx: 0, label: "claude", desc: "AI assistant"}
 	if got := item.FilterValue(); got != "claude AI assistant" {
diff --git a/cmd/skillshare/collect.go b/cmd/skillshare/collect.go
index 82f70c01..09b221a8 100644
--- a/cmd/skillshare/collect.go
+++ b/cmd/skillshare/collect.go
@@ -3,29 +3,15 @@ package main

 import (
 	"fmt"
 	"os"
-	"path/filepath"
-	"strings"
 	"time"

-	"github.com/pterm/pterm"
-
 	"skillshare/internal/config"
-	"skillshare/internal/oplog"
 	"skillshare/internal/sync"
 	"skillshare/internal/ui"
 )

-// collectJSONOutput is the JSON representation for collect --json output.
-type collectJSONOutput struct {
-	Pulled   []string          `json:"pulled"`
-	Skipped  []string          `json:"skipped"`
-	Failed   map[string]string `json:"failed"`
-	DryRun   bool              `json:"dry_run"`
-	Duration string            `json:"duration"`
-}
-
-// collectLocalSkills collects local skills from targets (non-symlinked)
-func collectLocalSkills(targets map[string]config.TargetConfig, source, globalMode string) []sync.LocalSkillInfo {
+// collectLocalSkills collects local skills from targets (non-symlinked).
+func collectLocalSkills(targets map[string]config.TargetConfig, source, globalMode string, warn bool) []sync.LocalSkillInfo {
 	var allLocalSkills []sync.LocalSkillInfo
 	for name, target := range targets {
 		sc := target.SkillsConfig()
@@ -35,7 +21,9 @@ func collectLocalSkills(targets map[string]config.TargetConfig, source, globalMo
 		}
 		skills, err := sync.FindLocalSkills(sc.Path, source, mode)
 		if err != nil {
-			ui.Warning("%s: %v", name, err)
+			if warn {
+				ui.Warning("%s: %v", name, err)
+			}
 			continue
 		}
 		for i := range skills {
@@ -46,12 +34,8 @@ func collectLocalSkills(targets map[string]config.TargetConfig, source, globalMo
 	return allLocalSkills
 }

-// displayLocalSkills shows the local skills found
-func displayLocalSkills(skills []sync.LocalSkillInfo) {
-	ui.Header(ui.WithModeLabel("Local skills found"))
-	for _, skill := range skills {
-		ui.ListItem("info", skill.Name, fmt.Sprintf("[%s] %s", skill.TargetName, skill.Path))
-	}
+func skillDisplayItem(s sync.LocalSkillInfo) collectDisplayItem {
+	return collectDisplayItem{Name: s.Name, TargetName: s.TargetName, Path: s.Path}
 }

 func cmdCollect(args []string) error {
@@ -82,125 +66,63 @@ func cmdCollect(args []string) error {

 	applyModeLabel(mode)

+	kind, rest := parseKindArg(rest)
+	opts := parseCollectOptions(rest)
+
+	scope := "global"
+	cfgPath := config.ConfigPath()
 	if mode == modeProject {
-		err := cmdCollectProject(rest, cwd)
-		logCollectOp(config.ProjectConfigPath(cwd), start, err)
-		return err
+		scope = "project"
+		cfgPath = config.ProjectConfigPath(cwd)
 	}

-	dryRun := false
-	force := false
-	collectAll := false
-	jsonOutput := false
-	var targetName string
-
-	for _, arg := range rest {
-		switch arg {
-		case "--dry-run", "-n":
-			dryRun = true
-		case "--force", "-f":
-			force = true
-		case "--all", "-a":
-			collectAll = true
-		case "--json":
-			jsonOutput = true
-		default:
-			if targetName == "" && !strings.HasPrefix(arg, "-") {
-				targetName = arg
-			}
-		}
-	}
+	summary := newCollectLogSummary(kind, scope, opts)

-	// --json implies --force (skip confirmation prompts)
-	if jsonOutput {
-		force = true
-	}
-
-	cfg, err := config.Load()
-	if err != nil {
-		if jsonOutput {
-			return writeJSONError(err)
+	switch mode {
+	case modeProject:
+		if kind == kindAgents {
+			summary, err = cmdCollectProjectAgents(cwd, opts, start)
+		} else {
+			summary, err = cmdCollectProject(opts, cwd, start)
 		}
-		return err
-	}
-
-	// Select targets to collect from
-	targets, err := selectCollectTargets(cfg, targetName, collectAll)
-	if err != nil {
-		return err
-	}
-	if targets == nil {
-		return nil // User needs to specify target
-	}
-
-	// Collect all local skills
-	var sp *ui.Spinner
-	if !jsonOutput {
-		ui.Header(ui.WithModeLabel("Collect"))
-		sp = ui.StartSpinner("Scanning for local skills...")
-	}
-
-	allLocalSkills := collectLocalSkills(targets, cfg.Source, cfg.Mode)
-
-	if len(allLocalSkills) == 0 {
-		if sp != nil {
-			sp.Success("No local skills found")
+	default:
+		cfg, loadErr := config.Load()
+		if loadErr != nil {
+			err = collectCommandError(loadErr, opts.jsonOutput)
+			logCollectOp(cfgPath, start, err, summary)
+			return err
 		}
-		if jsonOutput {
-			return collectOutputJSON(nil, dryRun, start, nil)
+		if kind == kindAgents {
+			summary, err = cmdCollectAgents(cfg, opts, start)
+		} else {
+			summary, err = cmdCollectGlobal(cfg, opts, start)
 		}
-		return nil
 	}

-	if sp != nil {
-		sp.Success(fmt.Sprintf("Found %d local skill(s)", len(allLocalSkills)))
-		displayLocalSkills(allLocalSkills)
-	}
+	logCollectOp(cfgPath, start, err, summary)
+	return err
+}

-	if dryRun {
-		if jsonOutput {
-			names := make([]string, len(allLocalSkills))
-			for i, s := range allLocalSkills {
-				names[i] = s.Name
-			}
-			logCollectOp(config.ConfigPath(), start, nil)
-			return collectOutputJSON(&sync.PullResult{Pulled: names}, true, start, nil)
-		}
-		ui.Info("Dry run - no changes made")
-		return nil
-	}
+func cmdCollectGlobal(cfg *config.Config, opts collectOptions, start time.Time) (collectLogSummary, error) {
+	summary := newCollectLogSummary(kindSkills, "global", opts)

-	// Confirm unless --force (JSON implies force)
-	if !force {
-		if !confirmCollect() {
-			ui.Info("Cancelled")
-			return nil
-		}
+	targets, err := selectCollectTargets(cfg, opts.targetName, opts.collectAll, opts.jsonOutput)
+	if err != nil {
+		return summary, collectCommandError(err, opts.jsonOutput)
 	}
-
-	if jsonOutput {
-		result, collectErr := sync.PullSkills(allLocalSkills, cfg.Source, sync.PullOptions{
-			DryRun: dryRun,
-			Force:  force,
-		})
-		logCollectOp(config.ConfigPath(), start, collectErr)
-		return collectOutputJSON(result, dryRun, start, collectErr)
+	if targets == nil {
+		return summary, nil
 	}

-	err = executeCollect(allLocalSkills, cfg.Source, dryRun, force)
-	logCollectOp(config.ConfigPath(), start, err)
-	return err
-}
-
-func logCollectOp(cfgPath string, start time.Time, cmdErr error) {
-	e := oplog.NewEntry("collect", statusFromErr(cmdErr), time.Since(start))
-	if cmdErr != nil {
-		e.Message = cmdErr.Error()
-	}
-	oplog.WriteWithLimit(cfgPath, oplog.OpsFile, e, logMaxEntries()) //nolint:errcheck
+	return runCollectPlan(collectPlan{
+		kind: kindSkills, source: cfg.Source,
+		scan: func(warn bool) collectResources {
+			skills := collectLocalSkills(targets, cfg.Source, cfg.Mode, warn)
+			return toCollectResources(skills, cfg.Source, skillDisplayItem, sync.PullSkills)
+		},
+	}, opts, start, "global")
 }

-func selectCollectTargets(cfg *config.Config, targetName string, collectAll bool) (map[string]config.TargetConfig, error) {
+func selectCollectTargets(cfg *config.Config, targetName string, collectAll, jsonOutput bool) (map[string]config.TargetConfig, error) {
 	if targetName != "" {
 		if t, exists := cfg.Targets[targetName]; exists {
 			return map[string]config.TargetConfig{targetName: t}, nil
@@ -208,11 +130,18 @@ func selectCollectTargets(cfg *config.Config, targetName string, collectAll bool
 		return nil, fmt.Errorf("target '%s' not found", targetName)
 	}

+	if len(cfg.Targets) == 0 {
+		return cfg.Targets, nil
+	}
+
 	if collectAll || len(cfg.Targets) == 1 {
 		return cfg.Targets, nil
 	}

-	// If no target specified and multiple targets exist, ask or require --all
+	if jsonOutput {
+		return nil, fmt.Errorf("multiple targets found; specify a target name or use --all")
+	}
+
 	ui.Warning("Multiple targets found. Specify a target name or use --all")
 	fmt.Println("  Available targets:")
 	for name := range cfg.Targets {
@@ -221,85 +150,10 @@ func selectCollectTargets(cfg *config.Config, targetName string, collectAll bool
 	return nil, nil
 }

-func confirmCollect() bool {
-	fmt.Println()
-	fmt.Print("Collect these skills to source? [y/N]: ")
-	var input string
-	fmt.Scanln(&input)
-	input = strings.ToLower(strings.TrimSpace(input))
-	return input == "y" || input == "yes"
-}
-
-func executeCollect(skills []sync.LocalSkillInfo, source string, dryRun, force bool) error {
-	ui.Header(ui.WithModeLabel("Collecting skills"))
-	result, err := sync.PullSkills(skills, source, sync.PullOptions{
-		DryRun: dryRun,
-		Force:  force,
-	})
-	if err != nil {
-		return err
-	}
-
-	// Display results
-	for _, name := range result.Pulled {
-		ui.StepDone(name, "copied to source")
-	}
-	for _, name := range result.Skipped {
-		ui.StepSkip(name, "already exists in source, use --force to overwrite")
-	}
-	for name, err := range result.Failed {
-		ui.StepFail(name, err.Error())
-	}
-
-	ui.OperationSummary("Collect", 0,
-		ui.Metric{Label: "collected", Count: len(result.Pulled), HighlightColor: pterm.Green},
-		ui.Metric{Label: "skipped", Count: len(result.Skipped), HighlightColor: pterm.Yellow},
-		ui.Metric{Label: "failed", Count: len(result.Failed), HighlightColor: pterm.Red},
-	)
-
-	if len(result.Pulled) > 0 {
-		showCollectNextSteps(source)
-	}
-
-	return nil
-}
-
-// collectOutputJSON converts a collect result to JSON and writes to stdout.
-func collectOutputJSON(result *sync.PullResult, dryRun bool, start time.Time, collectErr error) error {
-	output := collectJSONOutput{
-		DryRun:   dryRun,
-		Duration: formatDuration(start),
-	}
-	output.Failed = make(map[string]string)
-	if result != nil {
-		output.Pulled = result.Pulled
-		output.Skipped = result.Skipped
-		for k, v := range result.Failed {
-			output.Failed[k] = v.Error()
-		}
-	}
-	return writeJSONResult(&output, collectErr)
-}
-
-func showCollectNextSteps(source string) {
-	fmt.Println()
-	if ui.ModeLabel == "project" {
-		ui.Info("Run 'skillshare sync -p' to distribute to all targets")
-		return
-	}
-	ui.Info("Run 'skillshare sync' to distribute to all targets")
-
-	// Check if source has git
-	gitDir := filepath.Join(source, ".git")
-	if _, err := os.Stat(gitDir); err == nil {
-		ui.Info("Commit changes: cd %s && git add . && git commit", source)
-	}
-}
-
 func printCollectHelp() {
-	fmt.Println(`Usage: skillshare collect [target] [options]
+	fmt.Println(`Usage: skillshare collect [agents] [target] [options]

-Collect local skills from target(s) to source directory.
+Collect local skills or agents from target(s) to the source directory.
 Arguments:
   [target]         Target name to collect from (optional)
@@ -307,14 +161,16 @@ Arguments:
 Options:
   --all, -a        Collect from all targets
   --dry-run, -n    Preview changes without applying
-  --force, -f      Skip confirmation prompts
-  --json           Output results as JSON
+  --force, -f      Overwrite existing items in source and skip confirmation
+  --json           Output results as JSON (implies --force)
   --project, -p    Use project-level config
   --global, -g     Use global config
   --help, -h       Show this help

 Examples:
-  skillshare collect claude         Collect from Claude target
-  skillshare collect --all          Collect from all targets
-  skillshare collect --dry-run      Preview what would be collected`)
+  skillshare collect claude            Collect skills from the Claude target
+  skillshare collect --all             Collect skills from all targets
+  skillshare collect --dry-run         Preview what would be collected
+  skillshare collect agents claude     Collect agents from the Claude target
+  skillshare collect agents --json     Collect agents with JSON output`)
 }
diff --git a/cmd/skillshare/collect_agents.go b/cmd/skillshare/collect_agents.go
new file mode 100644
index 00000000..166285e1
--- /dev/null
+++ b/cmd/skillshare/collect_agents.go
@@ -0,0 +1,172 @@
+package main
+
+import (
+	"fmt"
+	"path/filepath"
+	"time"
+
+	"skillshare/internal/config"
+	"skillshare/internal/sync"
+	"skillshare/internal/ui"
+)
+
+func collectLocalAgents(targets map[string]string, source string, warn bool) []sync.LocalAgentInfo {
+	var allLocalAgents []sync.LocalAgentInfo
+	for name, targetPath := range targets {
+		agents, err := sync.FindLocalAgents(targetPath, source)
+		if err != nil {
+			if warn {
+				ui.Warning("%s: %v", name, err)
+			}
+			continue
+		}
+		for i := range agents {
+			agents[i].TargetName = name
+		}
+		allLocalAgents = append(allLocalAgents, agents...)
+	}
+	return allLocalAgents
+}
+
+func agentDisplayItem(a sync.LocalAgentInfo) collectDisplayItem {
+	return collectDisplayItem{Name: a.Name, TargetName: a.TargetName, Path: a.Path}
+}
+
+func cmdCollectAgents(cfg *config.Config, opts collectOptions, start time.Time) (collectLogSummary, error) {
+	summary := newCollectLogSummary(kindAgents, "global", opts)
+
+	targets, err := selectCollectAgentTargets(cfg, opts.targetName, opts.collectAll, opts.jsonOutput)
+	if err != nil {
+		return summary, collectCommandError(err, opts.jsonOutput)
+	}
+	if targets == nil {
+		return summary, nil
+	}
+
+	source := cfg.EffectiveAgentsSource()
+	return runCollectPlan(collectPlan{
+		kind: kindAgents, source: source,
+		scan: func(warn bool) collectResources {
+			agents := collectLocalAgents(targets, source, warn)
+			return toCollectResources(agents, source, agentDisplayItem, sync.PullAgents)
+		},
+	}, opts, start, "global")
+}
+
+func selectCollectAgentTargets(cfg *config.Config, targetName string, collectAll, jsonOutput bool) (map[string]string, error) {
+	builtinAgents := config.DefaultAgentTargets()
+
+	if targetName != "" {
+		target, ok := cfg.Targets[targetName]
+		if !ok {
+			return nil, fmt.Errorf("target '%s' not found", targetName)
+		}
+		agentPath := resolveAgentTargetPath(target, builtinAgents, targetName)
+		if agentPath == "" {
+			return nil, fmt.Errorf("target '%s' does not support agents", targetName)
+		}
+		return map[string]string{targetName: agentPath}, nil
+	}
+
+	targets := make(map[string]string)
+	for name := range cfg.Targets {
+		agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name)
+		if agentPath == "" {
+			continue
+		}
+		targets[name] = agentPath
+	}
+
+	if len(targets) == 0 {
+		return targets, nil
+	}
+
+	if collectAll || len(targets) <= 1 {
+		return targets, nil
+	}
+
+	if jsonOutput {
+		return nil, fmt.Errorf("multiple targets found; specify a target name or use --all")
+	}
+
+	ui.Warning("Multiple targets found. Specify a target name or use --all")
+	fmt.Println("  Available targets:")
+	for name := range targets {
+		fmt.Printf("  - %s\n", name)
+	}
+	return nil, nil
+}
+
+func cmdCollectProjectAgents(projectRoot string, opts collectOptions, start time.Time) (collectLogSummary, error) {
+	summary := newCollectLogSummary(kindAgents, "project", opts)
+
+	projCfg, err := config.LoadProject(projectRoot)
+	if err != nil {
+		return summary, collectCommandError(fmt.Errorf("cannot load project config: %w", err), opts.jsonOutput)
+	}
+
+	targets, err := selectCollectProjectAgentTargets(projCfg, projectRoot, opts.targetName, opts.collectAll, opts.jsonOutput)
+	if err != nil {
+		return summary, collectCommandError(err, opts.jsonOutput)
+	}
+	if targets == nil {
+		return summary, nil
+	}
+
+	source := filepath.Join(projectRoot, ".skillshare", "agents")
+	return runCollectPlan(collectPlan{
+		kind: kindAgents, source: source,
+		scan: func(warn bool) collectResources {
+			agents := collectLocalAgents(targets, source, warn)
+			return toCollectResources(agents, source, agentDisplayItem, sync.PullAgents)
+		},
+	}, opts, start, "project")
+}
+
+func selectCollectProjectAgentTargets(projCfg *config.ProjectConfig, projectRoot, targetName string, collectAll, jsonOutput bool) (map[string]string, error) {
+	builtinAgents := config.ProjectAgentTargets()
+
+	if targetName != "" {
+		for _, entry := range projCfg.Targets {
+			if entry.Name != targetName {
+				continue
+			}
+			agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, projectRoot)
+			if agentPath == "" {
+				return nil, fmt.Errorf("target '%s' does not support agents in project config", targetName)
+			}
+			return map[string]string{targetName: agentPath}, nil
+		}
+		return nil, fmt.Errorf("target '%s' not found in project config", targetName)
+	}
+
+	targets := make(map[string]string)
+	for _, entry := range projCfg.Targets {
+		agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, projectRoot)
+		if agentPath == "" {
+			continue
+		}
+		targets[entry.Name] = agentPath
+	}
+
+	if len(targets) == 0 {
+		return targets, nil
+	}
+
+	if collectAll || len(targets) <= 1 {
+		return targets, nil
+	}
+
+	if jsonOutput {
+		return nil, fmt.Errorf("multiple targets found; specify a target name or use --all")
+	}
+
+	ui.Warning("Multiple targets found. Specify a target name or use --all")
+	fmt.Println("  Available targets:")
+	for _, entry := range projCfg.Targets {
+		if _, ok := targets[entry.Name]; ok {
+			fmt.Printf("  - %s\n", entry.Name)
+		}
+	}
+	return nil, nil
+}
diff --git a/cmd/skillshare/collect_project.go b/cmd/skillshare/collect_project.go
index 742b2bcc..e0f4a08b 100644
--- a/cmd/skillshare/collect_project.go
+++ b/cmd/skillshare/collect_project.go
@@ -2,73 +2,39 @@ package main

 import (
 	"fmt"
-	"strings"
+	"time"

 	"skillshare/internal/config"
+	"skillshare/internal/sync"
 	"skillshare/internal/ui"
 )

-func cmdCollectProject(args []string, root string) error {
-	dryRun := false
-	force := false
-	collectAll := false
-	var targetName string
-
-	for _, arg := range args {
-		switch arg {
-		case "--dry-run", "-n":
-			dryRun = true
-		case "--force", "-f":
-			force = true
-		case "--all", "-a":
-			collectAll = true
-		default:
-			if targetName == "" && !strings.HasPrefix(arg, "-") {
-				targetName = arg
-			}
-		}
-	}
+func cmdCollectProject(opts collectOptions, root string, start time.Time) (collectLogSummary, error) {
+	summary := newCollectLogSummary(kindSkills, "project", opts)

 	runtime, err := loadProjectRuntime(root)
 	if err != nil {
-		return err
+		return summary, collectCommandError(err, opts.jsonOutput)
 	}

-	targets, err := selectCollectProjectTargets(runtime, targetName, collectAll)
+	targets, err := selectCollectProjectTargets(runtime, opts.targetName, opts.collectAll, opts.jsonOutput)
 	if err != nil {
-		return err
+		return summary, collectCommandError(err, opts.jsonOutput)
 	}
 	if targets == nil {
-		return nil
-	}
-
-	ui.Header(ui.WithModeLabel("Collect"))
-	sp := ui.StartSpinner("Scanning for local skills...")
-	allLocalSkills := collectLocalSkills(targets, runtime.sourcePath, "")
-	if len(allLocalSkills) == 0 {
-		sp.Success("No local skills found")
-		return nil
-	}
-	sp.Success(fmt.Sprintf("Found %d local skill(s)", len(allLocalSkills)))
-
-	displayLocalSkills(allLocalSkills)
-
-	if dryRun {
-		ui.Info("Dry run - no changes made")
-		return nil
+		return summary, nil
 	}

-	if !force {
-		if !confirmCollect() {
-			ui.Info("Cancelled")
-			return nil
-		}
-	}
-
-	return executeCollect(allLocalSkills, runtime.sourcePath, dryRun, force)
+	return runCollectPlan(collectPlan{
+		kind: kindSkills, source: runtime.sourcePath,
+		scan: func(warn bool) collectResources {
+			skills := collectLocalSkills(targets, runtime.sourcePath, "", warn)
+			return toCollectResources(skills, runtime.sourcePath, skillDisplayItem, sync.PullSkills)
+		},
+	}, opts, start, "project")
 }

-func selectCollectProjectTargets(runtime *projectRuntime, targetName string, collectAll bool) (map[string]config.TargetConfig, error) {
+func selectCollectProjectTargets(runtime *projectRuntime, targetName string, collectAll, jsonOutput bool) (map[string]config.TargetConfig, error) {
 	if targetName != "" {
 		if t, ok := runtime.targets[targetName]; ok {
 			return map[string]config.TargetConfig{targetName: t}, nil
@@ -76,10 +42,18 @@ func selectCollectProjectTargets(runtime *projectRuntime, targetName string, col
 		return nil, fmt.Errorf("target '%s' not found in project config", targetName)
 	}

+	if len(runtime.targets) == 0 {
+		return runtime.targets, nil
+	}
+
 	if collectAll || len(runtime.targets) == 1 {
 		return runtime.targets, nil
 	}

+	if jsonOutput {
+		return nil, fmt.Errorf("multiple targets found; specify a target name or use --all")
+	}
+
 	ui.Warning("Multiple targets found. Specify a target name or use --all")
 	fmt.Println("  Available targets:")
 	for _, entry := range runtime.config.Targets {
diff --git a/cmd/skillshare/collect_shared.go b/cmd/skillshare/collect_shared.go
new file mode 100644
index 00000000..2215f410
--- /dev/null
+++ b/cmd/skillshare/collect_shared.go
@@ -0,0 +1,304 @@
+package main
+
+import (
+	"fmt"
+	"os"
+	"path/filepath"
+	"strings"
+	"time"
+
+	"github.com/pterm/pterm"
+
+	"skillshare/internal/oplog"
+	"skillshare/internal/sync"
+	"skillshare/internal/ui"
+)
+
+type collectOptions struct {
+	dryRun     bool
+	force      bool
+	collectAll bool
+	jsonOutput bool
+	targetName string
+}
+
+type collectLogSummary struct {
+	Kind    string
+	Scope   string
+	Pulled  int
+	Skipped int
+	Failed  int
+	DryRun  bool
+	Force   bool
+}
+
+type collectDisplayItem struct {
+	Name       string
+	TargetName string
+	Path       string
+}
+
+// collectJSONOutput is the JSON representation for collect --json output.
+type collectJSONOutput struct {
+	Pulled   []string          `json:"pulled"`
+	Skipped  []string          `json:"skipped"`
+	Failed   map[string]string `json:"failed"`
+	DryRun   bool              `json:"dry_run"`
+	Duration string            `json:"duration"`
+}
+
+func parseCollectOptions(args []string) collectOptions {
+	opts := collectOptions{}
+	for _, arg := range args {
+		switch arg {
+		case "--dry-run", "-n":
+			opts.dryRun = true
+		case "--force", "-f":
+			opts.force = true
+		case "--all", "-a":
+			opts.collectAll = true
+		case "--json":
+			opts.jsonOutput = true
+		default:
+			if opts.targetName == "" && !strings.HasPrefix(arg, "-") {
+				opts.targetName = arg
+			}
+		}
+	}
+	if opts.jsonOutput {
+		opts.force = true
+	}
+	return opts
+}
+
+func newCollectLogSummary(kind resourceKindFilter, scope string, opts collectOptions) collectLogSummary {
+	return collectLogSummary{
+		Kind:   kind.String(),
+		Scope:  scope,
+		DryRun: opts.dryRun,
+		Force:  opts.force,
+	}
+}
+
+func updateCollectLogSummary(summary collectLogSummary, result *sync.PullResult) collectLogSummary {
+	if result == nil {
+		return summary
+	}
+	summary.Pulled = len(result.Pulled)
+	summary.Skipped = len(result.Skipped)
+	summary.Failed = len(result.Failed)
+	return summary
+}
+
+func collectCommandError(err error, jsonOutput bool) error {
+	if err == nil {
+		return nil
+	}
+	if jsonOutput {
+		return writeJSONError(err)
+	}
+	return err
+}
+
+// collectPlan describes what to collect (kind + source) and how to scan for it.
+// The scan callback is called lazily so the spinner wraps the actual I/O.
+type collectPlan struct {
+	kind   resourceKindFilter
+	source string
+	scan   func(warn bool) collectResources
+}
+
+// collectResources holds the results of scanning a target for local resources.
+type collectResources struct {
+	items []collectDisplayItem
+	names []string
+	pull  func(sync.PullOptions) (*sync.PullResult, error)
+}
+
+// toCollectResources converts a typed slice into collectResources.
+// toDisplay maps each item to a collectDisplayItem; pull is the batch pull function.
+func toCollectResources[T any](
+	items []T,
+	source string,
+	toDisplay func(T) collectDisplayItem,
+	pull func([]T, string, sync.PullOptions) (*sync.PullResult, error),
+) collectResources {
+	display := make([]collectDisplayItem, len(items))
+	names := make([]string, len(items))
+	for i, item := range items {
+		d := toDisplay(item)
+		display[i] = d
+		names[i] = d.Name
+	}
+	return collectResources{
+		items: display,
+		names: names,
+		pull: func(opts sync.PullOptions) (*sync.PullResult, error) {
+			return pull(items, source, opts)
+		},
+	}
+}
+
+// runCollectPlan is the unified collection flow for both skills and agents.
+func runCollectPlan(plan collectPlan, opts collectOptions, start time.Time, scope string) (collectLogSummary, error) {
+	label := plan.kind.String()
+	summary := newCollectLogSummary(plan.kind, scope, opts)
+
+	var sp *ui.Spinner
+	if !opts.jsonOutput {
+		header := "Collect"
+		if plan.kind == kindAgents {
+			header = "Collect agents"
+		}
+		ui.Header(ui.WithModeLabel(header))
+		sp = ui.StartSpinner(fmt.Sprintf("Scanning for local %s...", label))
+	}
+
+	res := plan.scan(!opts.jsonOutput)
+
+	if len(res.items) == 0 {
+		if sp != nil {
+			sp.Success(fmt.Sprintf("No local %s found", label))
+		}
+		if opts.jsonOutput {
+			return summary, collectOutputJSON(nil, opts.dryRun, start, nil)
+		}
+		return summary, nil
+	}
+
+	if sp != nil {
+		sp.Success(fmt.Sprintf("Found %d local %s", len(res.items), label))
+		displayLocalCollectItems(fmt.Sprintf("Local %s found", label), res.items)
+	}
+
+	if opts.dryRun {
+		result := &sync.PullResult{Pulled: res.names}
+		summary = updateCollectLogSummary(summary, result)
+		if opts.jsonOutput {
+			return summary, collectOutputJSON(result, true, start, nil)
+		}
+		ui.Info("Dry run - no changes made")
+		return summary, nil
+	}
+
+	if !opts.force {
+		if !confirmCollect(label) {
+			ui.Info("Cancelled")
+			return summary, nil
+		}
+	}
+
+	result, collectErr := res.pull(sync.PullOptions{
+		DryRun: opts.dryRun,
+		Force:  opts.force,
+	})
+	summary = updateCollectLogSummary(summary, result)
+	if opts.jsonOutput {
+		return summary, collectOutputJSON(result, opts.dryRun, start, collectErr)
+	}
+	if collectErr != nil {
+		return summary, collectErr
+	}
+	return summary, renderCollectResult(label, result, plan.source)
+}
+
+func displayLocalCollectItems(title string, items []collectDisplayItem) {
+	ui.Header(ui.WithModeLabel(title))
+	for _, item := range items {
+		ui.ListItem("info", item.Name, fmt.Sprintf("[%s] %s", item.TargetName, item.Path))
+	}
+}
+
+func confirmCollect(resourceLabel string) bool {
+	fmt.Println()
+	fmt.Printf("Collect these %s to source? [y/N]: ", resourceLabel)
+	var input string
+	fmt.Scanln(&input)
+	input = strings.ToLower(strings.TrimSpace(input))
+	return input == "y" || input == "yes"
+}
+
+func renderCollectResult(resourceLabel string, result *sync.PullResult, source string) error {
+	ui.Header(ui.WithModeLabel("Collecting " + resourceLabel))
+
+	for _, name := range result.Pulled {
+		ui.StepDone(name, "copied to source")
+	}
+	for _, name := range result.Skipped {
+		ui.StepSkip(name, "already exists in source, use --force to overwrite")
+	}
+	for name, err := range result.Failed {
+		ui.StepFail(name, err.Error())
+	}
+
+	ui.OperationSummary("Collect", 0,
+		ui.Metric{Label: "collected", Count: len(result.Pulled), HighlightColor: pterm.Green},
+		ui.Metric{Label: "skipped", Count: len(result.Skipped), HighlightColor: pterm.Yellow},
+		ui.Metric{Label: "failed", Count: len(result.Failed), HighlightColor: pterm.Red},
+	)
+
+	if len(result.Pulled) > 0 {
+		showCollectNextSteps(resourceLabel, source)
+	}
+
+	return nil
+}
+
+// collectOutputJSON converts a collect result to JSON and writes to stdout.
+func collectOutputJSON(result *sync.PullResult, dryRun bool, start time.Time, collectErr error) error {
+	output := collectJSONOutput{
+		DryRun:   dryRun,
+		Duration: formatDuration(start),
+		Failed:   make(map[string]string),
+	}
+	if result != nil {
+		output.Pulled = result.Pulled
+		output.Skipped = result.Skipped
+		for k, v := range result.Failed {
+			output.Failed[k] = v.Error()
+		}
+	}
+	return writeJSONResult(&output, collectErr)
+}
+
+func showCollectNextSteps(resourceLabel, source string) {
+	fmt.Println()
+	if resourceLabel == "agents" {
+		if ui.ModeLabel == "project" {
+			ui.Info("Run 'skillshare sync -p agents' to distribute to all agent targets")
+		} else {
+			ui.Info("Run 'skillshare sync agents' to distribute to all agent targets")
+		}
+	} else if ui.ModeLabel == "project" {
+		ui.Info("Run 'skillshare sync -p' to distribute to all targets")
+	} else {
+		ui.Info("Run 'skillshare sync' to distribute to all targets")
+	}
+
+	gitDir := filepath.Join(source, ".git")
+	if _, err := os.Stat(gitDir); err == nil {
+		ui.Info("Commit changes: cd %s && git add . && git commit", source)
+	}
+}
+
+func logCollectOp(cfgPath string, start time.Time, cmdErr error, summary collectLogSummary) {
+	status := statusFromErr(cmdErr)
+	if cmdErr == nil && summary.Failed > 0 {
+		status = "partial"
+	}
+
+	e := oplog.NewEntry("collect", status, time.Since(start))
+	e.Args = map[string]any{
+		"kind":    summary.Kind,
+		"scope":   summary.Scope,
+		"pulled":  summary.Pulled,
+		"skipped": summary.Skipped,
+		"failed":  summary.Failed,
+		"dry_run": summary.DryRun,
+		"force":   summary.Force,
+	}
+	if cmdErr != nil {
+		e.Message = cmdErr.Error()
+	}
+	oplog.WriteWithLimit(cfgPath, oplog.OpsFile, e, logMaxEntries()) //nolint:errcheck
+}
diff --git a/cmd/skillshare/diff.go b/cmd/skillshare/diff.go
index 41d1351c..7dea90c2 100644
--- a/cmd/skillshare/diff.go
+++ b/cmd/skillshare/diff.go
@@ -45,6 +45,7 @@ type diffJSONTarget struct {
 type diffJSONItem struct {
 	Action string `json:"action"`
 	Name   string `json:"name"`
+	Kind   string `json:"kind,omitempty"` // "skill" or "agent"
 	Reason string `json:"reason"`
 	IsSync bool   `json:"is_sync"`
 }
@@ -72,6 +73,9 @@ func cmdDiff(args []string) error {

 	applyModeLabel(mode)

+	// Extract kind filter (e.g. "skillshare diff agents").
+ kind, rest := parseKindArg(rest) + scope := "global" cfgPath := config.ConfigPath() if mode == modeProject { @@ -104,9 +108,9 @@ func cmdDiff(args []string) error { var cmdErr error if mode == modeProject { - cmdErr = cmdDiffProject(cwd, targetName, opts, start) + cmdErr = cmdDiffProject(cwd, targetName, kind, opts, start) } else { - cmdErr = cmdDiffGlobal(targetName, opts, start) + cmdErr = cmdDiffGlobal(targetName, kind, opts, start) } logDiffOp(cfgPath, targetName, scope, 0, start, cmdErr) return cmdErr @@ -146,6 +150,7 @@ type targetDiffResult struct { type copyDiffEntry struct { action string // "add", "modify", "remove" name string + kind string // "skill" or "agent" (empty defaults to "skill") reason string isSync bool // true = needs sync, false = local-only files []fileDiffEntry // file-level diffs (nil until populated) @@ -357,12 +362,17 @@ func (dp *diffProgress) stop() { } } -func cmdDiffGlobal(targetName string, opts diffRenderOpts, start time.Time) error { +func cmdDiffGlobal(targetName string, kind resourceKindFilter, opts diffRenderOpts, start time.Time) error { cfg, err := config.Load() if err != nil { return err } + // Agent-only diff + if kind == kindAgents { + return diffGlobalAgents(cfg, targetName, opts, start) + } + var spinner *ui.Spinner if !opts.jsonOutput { spinner = ui.StartSpinner("Discovering skills") @@ -469,6 +479,9 @@ func cmdDiffGlobal(targetName string, opts diffRenderOpts, start time.Time) erro }) } + // Merge agent diffs into skill results so they appear together + results = mergeAgentDiffsGlobal(cfg, results, targetName) + if opts.jsonOutput { return diffOutputJSONWithExtras(results, extrasResults, start) } @@ -482,6 +495,20 @@ func cmdDiffGlobal(targetName string, opts diffRenderOpts, start time.Time) erro return nil } +func diffItemToJSON(item copyDiffEntry) diffJSONItem { + k := item.kind + if k == "" { + k = "skill" + } + return diffJSONItem{ + Action: item.action, + Name: item.name, + Kind: k, + Reason: item.reason, + 
IsSync: item.isSync, + } +} + func diffOutputJSON(results []targetDiffResult, start time.Time) error { output := diffJSONOutput{ Duration: formatDuration(start), @@ -496,12 +523,7 @@ func diffOutputJSON(results []targetDiffResult, start time.Time) error { Exclude: r.exclude, } for _, item := range r.items { - jt.Items = append(jt.Items, diffJSONItem{ - Action: item.action, - Name: item.name, - Reason: item.reason, - IsSync: item.isSync, - }) + jt.Items = append(jt.Items, diffItemToJSON(item)) } output.Targets = append(output.Targets, jt) } @@ -528,12 +550,7 @@ func diffOutputJSONWithExtras(results []targetDiffResult, extrasResults []extraD Exclude: r.exclude, } for _, item := range r.items { - jt.Items = append(jt.Items, diffJSONItem{ - Action: item.action, - Name: item.name, - Reason: item.reason, - IsSync: item.isSync, - }) + jt.Items = append(jt.Items, diffItemToJSON(item)) } o.Targets = append(o.Targets, jt) } @@ -804,7 +821,7 @@ func categorizeItems(items []copyDiffEntry) []actionCategory { for _, item := range items { switch { - case item.reason == "source only": + case item.reason == "source only" || item.reason == "not in target": add("new", "new", "New", item.name) case item.reason == "deleted from target": add("restore", "new", "Restore", item.name) @@ -814,7 +831,7 @@ func categorizeItems(items []copyDiffEntry) []actionCategory { add("override", "override", "Local Override", item.name) case strings.Contains(item.reason, "orphan"): add("orphan", "orphan", "Orphan", item.name) - case item.reason == "local only" || item.reason == "not in source": + case item.reason == "local only" || item.reason == "not in source" || item.reason == "local file": add("local", "local", "Local Only", item.name) default: add("warn", "warn", item.reason, item.name) @@ -1041,7 +1058,7 @@ func pluralS(n int) string { } func printDiffHelp() { - fmt.Println(`Usage: skillshare diff [target] [options] + fmt.Println(`Usage: skillshare diff [agents|all] [target] [options] Show 
differences between source skills and target directories. Previews what 'sync' would change without modifying anything. @@ -1063,5 +1080,6 @@ Examples: skillshare diff claude # Diff a single target skillshare diff -p # Diff project-mode targets skillshare diff --stat # Show file-level stat - skillshare diff --patch # Show full text diff`) + skillshare diff --patch # Show full text diff + skillshare diff agents # Diff agents targets only`) } diff --git a/cmd/skillshare/diff_agents.go b/cmd/skillshare/diff_agents.go new file mode 100644 index 00000000..8db57c61 --- /dev/null +++ b/cmd/skillshare/diff_agents.go @@ -0,0 +1,286 @@ +package main + +import ( + "os" + "path/filepath" + "strings" + "time" + + "skillshare/internal/config" + "skillshare/internal/resource" + "skillshare/internal/ui" + "skillshare/internal/utils" +) + +// diffProjectAgents computes agent diffs for project mode. +func diffProjectAgents(root, targetName string, opts diffRenderOpts, start time.Time) error { + if !projectConfigExists(root) { + if err := performProjectInit(root, projectInitOptions{}); err != nil { + return err + } + } + + rt, err := loadProjectRuntime(root) + if err != nil { + return err + } + + agentsSource := rt.agentsSourcePath + agents, _ := resource.AgentKind{}.Discover(agentsSource) + + builtinAgents := config.ProjectAgentTargets() + var results []targetDiffResult + + for _, entry := range rt.config.Targets { + if targetName != "" && entry.Name != targetName { + continue + } + agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, root) + if agentPath == "" { + continue + } + + r := computeAgentDiff(entry.Name, agentPath, agents) + results = append(results, r) + } + + if opts.jsonOutput { + return diffOutputJSON(results, start) + } + + if len(results) == 0 { + ui.Info("No agent-capable targets found") + return nil + } + + renderGroupedDiffs(results, opts) + return nil +} + +// diffGlobalAgents computes agent diffs for global mode. 
+func diffGlobalAgents(cfg *config.Config, targetName string, opts diffRenderOpts, start time.Time) error { + agentsSource := cfg.EffectiveAgentsSource() + agents, _ := resource.AgentKind{}.Discover(agentsSource) + + builtinAgents := config.DefaultAgentTargets() + var results []targetDiffResult + + for name := range cfg.Targets { + if targetName != "" && name != targetName { + continue + } + agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name) + if agentPath == "" { + continue + } + + r := computeAgentDiff(name, agentPath, agents) + results = append(results, r) + } + + if opts.jsonOutput { + return diffOutputJSON(results, start) + } + + if len(results) == 0 { + ui.Info("No agent-capable targets found") + return nil + } + + renderGroupedDiffs(results, opts) + return nil +} + +// mergeAgentDiffsGlobal computes agent diffs for all targets and merges them +// into existing skill diff results. Targets with agent diffs get their items +// appended; targets without a skill result get a new entry. +func mergeAgentDiffsGlobal(cfg *config.Config, results []targetDiffResult, targetName string) []targetDiffResult { + agentsSource := cfg.EffectiveAgentsSource() + agents, _ := resource.AgentKind{}.Discover(agentsSource) + + builtinAgents := config.DefaultAgentTargets() + var agentResults []targetDiffResult + for name := range cfg.Targets { + if targetName != "" && name != targetName { + continue + } + agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name) + if agentPath == "" { + continue + } + agentResults = append(agentResults, computeAgentDiff(name, agentPath, agents)) + } + + return mergeAgentResults(results, agentResults) +} + +// mergeAgentDiffsProject computes agent diffs for project targets and merges +// them into existing skill diff results. 
+func mergeAgentDiffsProject(root string, results []targetDiffResult, targetName string) []targetDiffResult { + if !projectConfigExists(root) { + return results + } + rt, err := loadProjectRuntime(root) + if err != nil { + return results + } + + agentsSource := rt.agentsSourcePath + agents, _ := resource.AgentKind{}.Discover(agentsSource) + + builtinAgents := config.ProjectAgentTargets() + var agentResults []targetDiffResult + for _, entry := range rt.config.Targets { + if targetName != "" && entry.Name != targetName { + continue + } + agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, root) + if agentPath == "" { + continue + } + agentResults = append(agentResults, computeAgentDiff(entry.Name, agentPath, agents)) + } + + return mergeAgentResults(results, agentResults) +} + +// mergeAgentResults merges agent diff results into skill results by target name. +func mergeAgentResults(skillResults, agentResults []targetDiffResult) []targetDiffResult { + if len(agentResults) == 0 { + return skillResults + } + + idx := make(map[string]int, len(skillResults)) + for i, r := range skillResults { + idx[r.name] = i + } + + for _, ar := range agentResults { + if len(ar.items) == 0 { + continue + } + if i, ok := idx[ar.name]; ok { + skillResults[i].items = append(skillResults[i].items, ar.items...) + skillResults[i].syncCount += ar.syncCount + skillResults[i].localCount += ar.localCount + if !ar.synced { + skillResults[i].synced = false + } + } else { + skillResults = append(skillResults, ar) + } + } + return skillResults +} + +// computeAgentDiff compares source agents against a target directory. 
+func computeAgentDiff(targetName, targetDir string, agents []resource.DiscoveredResource) targetDiffResult { + r := targetDiffResult{ + name: targetName, + mode: "merge", + synced: true, + } + + // Build map of expected agents + expected := make(map[string]resource.DiscoveredResource, len(agents)) + for _, a := range agents { + expected[a.FlatName] = a + } + + // Check what exists in target (store type for symlink detection) + existing := make(map[string]os.FileMode) // key=filename, value=type bits + if entries, err := os.ReadDir(targetDir); err == nil { + for _, e := range entries { + if e.IsDir() || !strings.HasSuffix(strings.ToLower(e.Name()), ".md") { + continue + } + existing[e.Name()] = e.Type() + } + } + + // Missing in target (need sync) + for flatName, agent := range expected { + fileType, ok := existing[flatName] + if !ok { + r.items = append(r.items, copyDiffEntry{ + action: "add", + name: flatName, + kind: "agent", + reason: "not in target", + isSync: true, + }) + r.synced = false + r.syncCount++ + continue + } + + targetPath := filepath.Join(targetDir, flatName) + if fileType&os.ModeSymlink != 0 || utils.IsSymlinkOrJunction(targetPath) { + absLink, err := utils.ResolveLinkTarget(targetPath) + if err != nil { + r.items = append(r.items, copyDiffEntry{ + action: "modify", + name: flatName, + kind: "agent", + reason: "link target unreadable", + isSync: true, + }) + r.synced = false + r.syncCount++ + continue + } + absSource, _ := filepath.Abs(agent.AbsPath) + if !utils.PathsEqual(absLink, absSource) { + r.items = append(r.items, copyDiffEntry{ + action: "modify", + name: flatName, + kind: "agent", + reason: "symlink points elsewhere", + isSync: true, + }) + r.synced = false + r.syncCount++ + } + continue + } + + r.items = append(r.items, copyDiffEntry{ + action: "modify", + name: flatName, + kind: "agent", + reason: "local copy (sync --force to replace)", + isSync: true, + }) + r.synced = false + r.syncCount++ + } + + // Extra in target (orphans) + for 
name, fileType := range existing { + if _, ok := expected[name]; !ok { + targetPath := filepath.Join(targetDir, name) + if fileType&os.ModeSymlink != 0 || utils.IsSymlinkOrJunction(targetPath) { + r.items = append(r.items, copyDiffEntry{ + action: "remove", + name: name, + kind: "agent", + reason: "orphan symlink", + isSync: true, + }) + r.synced = false + r.syncCount++ + } else { + r.items = append(r.items, copyDiffEntry{ + action: "local", + name: name, + kind: "agent", + reason: "local file", + }) + r.localCount++ + } + } + } + + r.synced = r.syncCount == 0 && r.localCount == 0 + return r +} diff --git a/cmd/skillshare/diff_project.go b/cmd/skillshare/diff_project.go index 689d93c7..11fcc272 100644 --- a/cmd/skillshare/diff_project.go +++ b/cmd/skillshare/diff_project.go @@ -10,7 +10,10 @@ import ( "skillshare/internal/ui" ) -func cmdDiffProject(root, targetName string, opts diffRenderOpts, start time.Time) error { +func cmdDiffProject(root, targetName string, kind resourceKindFilter, opts diffRenderOpts, start time.Time) error { + if kind == kindAgents { + return diffProjectAgents(root, targetName, opts, start) + } if !projectConfigExists(root) { if err := performProjectInit(root, projectInitOptions{}); err != nil { return err @@ -122,6 +125,9 @@ func cmdDiffProject(root, targetName string, opts diffRenderOpts, start time.Tim }) } + // Merge agent diffs into skill results so they appear together + results = mergeAgentDiffsProject(root, results, targetName) + if opts.jsonOutput { return diffOutputJSONWithExtras(results, extrasResults, start) } diff --git a/cmd/skillshare/diff_tui.go b/cmd/skillshare/diff_tui.go index 96da1ceb..590ab89d 100644 --- a/cmd/skillshare/diff_tui.go +++ b/cmd/skillshare/diff_tui.go @@ -8,6 +8,8 @@ import ( "strings" "sync" + "skillshare/internal/theme" + "github.com/charmbracelet/bubbles/list" "github.com/charmbracelet/bubbles/spinner" "github.com/charmbracelet/bubbles/textinput" @@ -42,30 +44,30 @@ type diffExtraItem struct { } func (i 
diffExtraItem) Title() string { - icon := "✓" - if i.result.errMsg != "" { + r := i.result + var icon, desc string + if r.errMsg != "" { icon = "✗" - } else if !i.result.synced { + desc = r.errMsg + } else if r.synced { + icon = "✓" + desc = "synced" + } else { icon = "~" + desc = fmt.Sprintf("%d diff", len(r.items)) } - return fmt.Sprintf("%s %s → %s", icon, i.result.extraName, shortenPath(i.result.targetPath)) + return fmt.Sprintf("%s %s → %s %s", icon, r.extraName, shortenPath(r.targetPath), theme.Dim().Render(desc)) } -func (i diffExtraItem) Description() string { - if i.result.errMsg != "" { - return i.result.errMsg - } - if i.result.synced { - return fmt.Sprintf("synced (%s)", i.result.mode) - } - return fmt.Sprintf("%d difference(s) (%s)", len(i.result.items), i.result.mode) -} +func (i diffExtraItem) Description() string { return "" } func (i diffExtraItem) FilterValue() string { return i.result.extraName } -// diffSeparatorItem is a non-selectable visual separator between skills and extras. +// diffSeparatorItem is a non-selectable visual separator / group header. 
type diffSeparatorItem struct { label string + count int // 0 = no count displayed + space bool // true = empty spacer row } func (s diffSeparatorItem) Title() string { return s.label } @@ -96,13 +98,21 @@ func (d diffItemDelegate) Render(w io.Writer, m list.Model, index int, item list } func renderDiffSeparatorRow(w io.Writer, sep diffSeparatorItem, width int) { - label := tc.Dim.Render(sep.label) + if sep.space { + fmt.Fprint(w, "") + return + } + label := sep.label + if sep.count > 0 { + label += fmt.Sprintf(" (%d)", sep.count) + } + label = theme.Dim().Render(label) lineWidth := width - lipgloss.Width(label) - 3 if lineWidth < 2 { lineWidth = 2 } line := strings.Repeat("─", lineWidth) - fmt.Fprint(w, tc.Dim.Render("─ ")+label+" "+tc.Dim.Render(line)) + fmt.Fprint(w, theme.Dim().Render("─ ")+label+" "+theme.Dim().Render(line)) } // skipDiffSeparator advances the list selection past diffSeparatorItem entries. @@ -124,21 +134,10 @@ func skipDiffSeparator(l *list.Model, direction int) { func (i diffTargetItem) Title() string { r := i.result if r.errMsg != "" { - return fmt.Sprintf("%s %s", tc.Red.Render("✗"), r.name) - } - if r.synced { - return fmt.Sprintf("%s %s", tc.Green.Render("✓"), r.name) - } - return fmt.Sprintf("%s %s", tc.Yellow.Render("!"), r.name) -} - -func (i diffTargetItem) Description() string { - r := i.result - if r.errMsg != "" { - return "error" + return fmt.Sprintf("%s %s", theme.Danger().Render("✗"), r.name) } if r.synced { - return "synced" + return fmt.Sprintf("%s %s", theme.Success().Render("✓"), r.name) } var parts []string if r.syncCount > 0 { @@ -147,12 +146,15 @@ func (i diffTargetItem) Description() string { if r.localCount > 0 { parts = append(parts, fmt.Sprintf("%d local", r.localCount)) } - if len(parts) == 0 { - return "0 diff(s)" + desc := "0 diff(s)" + if len(parts) > 0 { + desc = strings.Join(parts, ", ") } - return strings.Join(parts, ", ") + return fmt.Sprintf("%s %s %s", theme.Warning().Render("!"), r.name, 
theme.Dim().Render(desc)) } +func (i diffTargetItem) Description() string { return "" } + func (i diffTargetItem) FilterValue() string { return i.result.name } // --- Model --- @@ -211,19 +213,22 @@ func newDiffTUIModel(results []targetDiffResult, extrasSlice ...[]extraDiffResul extras = extrasSlice[0] } - listItems := make([]list.Item, len(sorted)) - for i, r := range sorted { - listItems[i] = diffTargetItem{result: r} + // Build list items with group headers + var listItems []list.Item + listItems = append(listItems, diffSeparatorItem{label: "Targets", count: len(sorted)}) + for _, r := range sorted { + listItems = append(listItems, diffTargetItem{result: r}) } - // Append extras with a separator + // Append extras with spacer + separator if len(extras) > 0 { - listItems = append(listItems, diffSeparatorItem{label: "Extras"}) + listItems = append(listItems, diffSeparatorItem{space: true}) + listItems = append(listItems, diffSeparatorItem{label: "Extras", count: len(extras)}) for _, r := range extras { listItems = append(listItems, diffExtraItem{result: r}) } } - delegate := diffItemDelegate{inner: newPrefixDelegate(true)} + delegate := diffItemDelegate{inner: newPrefixDelegate(false)} tl := list.New(listItems, delegate, 0, 0) var errN, diffN, syncN int @@ -248,20 +253,21 @@ func newDiffTUIModel(results []targetDiffResult, extrasSlice ...[]extraDiffResul titleParts = append(titleParts, fmt.Sprintf("%d ok", syncN)) } tl.Title = fmt.Sprintf("Diff — %s", strings.Join(titleParts, ", ")) - tl.Styles.Title = tc.ListTitle + tl.Styles.Title = theme.Title() tl.SetShowStatusBar(false) tl.SetFilteringEnabled(false) tl.SetShowHelp(false) tl.SetShowPagination(false) + skipDiffSeparator(&tl, 1) fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() return 
diffTUIModel{ allItems: sorted, @@ -460,12 +466,14 @@ func (m diffTUIModel) handleDiffKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) { func (m *diffTUIModel) applyDiffFilter() { needle := strings.ToLower(m.filterText) if needle == "" { - items := make([]list.Item, len(m.allItems)) - for i, r := range m.allItems { - items[i] = diffTargetItem{result: r} + var items []list.Item + items = append(items, diffSeparatorItem{label: "Targets", count: len(m.allItems)}) + for _, r := range m.allItems { + items = append(items, diffTargetItem{result: r}) } if len(m.allExtras) > 0 { - items = append(items, diffSeparatorItem{label: "Extras"}) + items = append(items, diffSeparatorItem{space: true}) + items = append(items, diffSeparatorItem{label: "Extras", count: len(m.allExtras)}) for _, r := range m.allExtras { items = append(items, diffExtraItem{result: r}) } @@ -473,6 +481,7 @@ func (m *diffTUIModel) applyDiffFilter() { m.matchCount = len(items) m.targetList.SetItems(items) m.targetList.ResetSelected() + skipDiffSeparator(&m.targetList, 1) m.cachedItems = nil // invalidate cache return } @@ -536,7 +545,7 @@ func (m diffTUIModel) viewDiffHorizontal() string { b.WriteString(m.renderDiffFilterBar()) // Help - b.WriteString(tc.Help.Render(appendScrollInfo("↑↓ navigate / filter Enter expand Ctrl+d/u scroll q quit", scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo("↑↓ navigate / filter Enter expand Ctrl+d/u scroll q quit", scrollInfo))) b.WriteString("\n") return b.String() @@ -556,7 +565,7 @@ func (m diffTUIModel) viewDiffVertical() string { b.WriteString(detailStr) b.WriteString("\n") - b.WriteString(tc.Help.Render(appendScrollInfo("↑↓ navigate / filter Enter expand Ctrl+d/u scroll q quit", scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo("↑↓ navigate / filter Enter expand Ctrl+d/u scroll q quit", scrollInfo))) b.WriteString("\n") return b.String() @@ -577,8 +586,8 @@ func (m diffTUIModel) buildExtraDetail(selected 
diffExtraItem) string { var b strings.Builder row := func(label, value string) { - b.WriteString(tc.Label.Render(label)) - b.WriteString(tc.Value.Render(value)) + b.WriteString(theme.Dim().Width(14).Render(label)) + b.WriteString(lipgloss.NewStyle().Render(value)) b.WriteString("\n") } @@ -588,17 +597,17 @@ func (m diffTUIModel) buildExtraDetail(selected diffExtraItem) string { b.WriteString("\n") if r.errMsg != "" { - b.WriteString(tc.Red.Render(" " + r.errMsg)) + b.WriteString(theme.Danger().Render(" " + r.errMsg)) b.WriteString("\n") return b.String() } if r.synced { - b.WriteString(tc.Green.Render(" ✓ Fully synced")) + b.WriteString(theme.Success().Render(" ✓ Fully synced")) b.WriteString("\n") return b.String() } - b.WriteString(tc.Yellow.Render(fmt.Sprintf(" %d difference(s):", len(r.items)))) + b.WriteString(theme.Warning().Render(fmt.Sprintf(" %d difference(s):", len(r.items)))) b.WriteString("\n") hasLocal := false for _, item := range r.items { @@ -606,16 +615,16 @@ func (m diffTUIModel) buildExtraDetail(selected diffExtraItem) string { var style lipgloss.Style switch item.action { case "add": - prefix, style = "+ ", tc.Green + prefix, style = "+ ", theme.Success() case "remove": - prefix, style = "- ", tc.Red + prefix, style = "- ", theme.Danger() case "modify": - prefix, style = "~ ", tc.Cyan + prefix, style = "~ ", theme.Accent() if item.reason == "not a symlink (local file)" { hasLocal = true } default: - prefix, style = " ", tc.Dim + prefix, style = " ", theme.Dim() } b.WriteString(style.Render(fmt.Sprintf(" %s%s %s", prefix, item.file, item.reason))) b.WriteString("\n") @@ -623,12 +632,12 @@ func (m diffTUIModel) buildExtraDetail(selected diffExtraItem) string { // Next Steps b.WriteString("\n") - b.WriteString(tc.Title.Render("── Next Steps ──")) + b.WriteString(theme.Title().Render("── Next Steps ──")) b.WriteString("\n") - b.WriteString(tc.Cyan.Render(" → skillshare sync extras")) + b.WriteString(theme.Accent().Render(" → skillshare sync 
extras")) b.WriteString("\n") if hasLocal { - b.WriteString(tc.Cyan.Render(" → skillshare extras collect " + r.extraName)) + b.WriteString(theme.Accent().Render(" → skillshare extras collect " + r.extraName)) b.WriteString("\n") } @@ -653,8 +662,8 @@ func (m diffTUIModel) buildDiffDetail() string { var b strings.Builder row := func(label, value string) { - b.WriteString(tc.Label.Render(label)) - b.WriteString(tc.Value.Render(value)) + b.WriteString(theme.Dim().Width(14).Render(label)) + b.WriteString(lipgloss.NewStyle().Render(value)) b.WriteString("\n") } @@ -677,14 +686,14 @@ func (m diffTUIModel) buildDiffDetail() string { // Error if r.errMsg != "" { - b.WriteString(tc.Red.Render(" " + r.errMsg)) + b.WriteString(theme.Danger().Render(" " + r.errMsg)) b.WriteString("\n") return b.String() } // Fully synced if r.synced { - b.WriteString(tc.Green.Render(" ✓ Fully synced")) + b.WriteString(theme.Success().Render(" ✓ Fully synced")) b.WriteString("\n") return b.String() } @@ -695,6 +704,14 @@ func (m diffTUIModel) buildDiffDetail() string { return b.String() } + // Build agent name set for [A] badge rendering + agentNames := make(map[string]bool, len(m.cachedItems)) + for _, item := range m.cachedItems { + if item.kind == "agent" { + agentNames[item.name] = true + } + } + // Use cached sorted categories (refreshed on selection change) cats := m.cachedCats for _, cat := range cats { @@ -707,19 +724,19 @@ func (m diffTUIModel) buildDiffDetail() string { var kindStyle lipgloss.Style switch cat.kind { case "new", "restore": - kindStyle = tc.Green + kindStyle = theme.Success() case "modified": - kindStyle = tc.Cyan + kindStyle = theme.Accent() case "override": - kindStyle = tc.Yellow + kindStyle = theme.Warning() case "orphan": - kindStyle = tc.Red + kindStyle = theme.Danger() case "local": - kindStyle = tc.Dim + kindStyle = theme.Dim() case "warn": - kindStyle = tc.Red + kindStyle = theme.Danger() default: - kindStyle = tc.Dim + kindStyle = theme.Dim() } header := 
fmt.Sprintf(" %s %d %s:", cat.label, n, skillWord) @@ -728,7 +745,11 @@ func (m diffTUIModel) buildDiffDetail() string { if cat.expand { for _, name := range cat.names { - b.WriteString(tc.Dim.Render(" " + name)) + if agentNames[name] { + b.WriteString(" " + theme.Accent().Render("[A]") + " " + theme.Dim().Render(name)) + } else { + b.WriteString(theme.Dim().Render(" " + name)) + } b.WriteString("\n") } } @@ -738,20 +759,20 @@ func (m diffTUIModel) buildDiffDetail() string { if m.expandedSkill != "" { if len(m.expandedFiles) > 0 { b.WriteString("\n") - b.WriteString(tc.Separator.Render(fmt.Sprintf("── %s files ──", m.expandedSkill))) + b.WriteString(theme.Dim().Render(fmt.Sprintf("── %s files ──", m.expandedSkill))) b.WriteString("\n") for _, f := range m.expandedFiles { var icon string var style lipgloss.Style switch f.Action { case "add": - icon, style = "+", tc.Green + icon, style = "+", theme.Success() case "delete": - icon, style = "-", tc.Red + icon, style = "-", theme.Danger() case "modify": - icon, style = "~", tc.Cyan + icon, style = "~", theme.Accent() default: - icon, style = "?", tc.Dim + icon, style = "?", theme.Dim() } b.WriteString(style.Render(fmt.Sprintf(" %s %s", icon, f.RelPath))) b.WriteString("\n") @@ -761,25 +782,25 @@ func (m diffTUIModel) buildDiffDetail() string { // Unified diff content if m.expandedDiff != "" { b.WriteString("\n") - b.WriteString(tc.Separator.Render(fmt.Sprintf("── %s diff ──", m.expandedSkill))) + b.WriteString(theme.Dim().Render(fmt.Sprintf("── %s diff ──", m.expandedSkill))) b.WriteString("\n") for _, line := range strings.Split(strings.TrimRight(m.expandedDiff, "\n"), "\n") { switch { case strings.HasPrefix(line, "+ "): - b.WriteString(tc.Green.Render(line)) + b.WriteString(theme.Success().Render(line)) case strings.HasPrefix(line, "- "): - b.WriteString(tc.Red.Render(line)) + b.WriteString(theme.Danger().Render(line)) case strings.HasPrefix(line, "--- "): - b.WriteString(tc.Cyan.Render(line)) + 
b.WriteString(theme.Accent().Render(line)) default: - b.WriteString(tc.Dim.Render(line)) + b.WriteString(theme.Dim().Render(line)) } b.WriteString("\n") } } if len(m.expandedFiles) == 0 && m.expandedDiff == "" { b.WriteString("\n") - b.WriteString(tc.Dim.Render(" (No file-level diff available)")) + b.WriteString(theme.Dim().Render(" (No file-level diff available)")) b.WriteString("\n") } } @@ -798,7 +819,7 @@ func (m diffTUIModel) buildDiffDetail() string { } if len(hints) > 0 { b.WriteString("\n") - b.WriteString(tc.Title.Render("── Next Steps ──")) + b.WriteString(theme.Title().Render("── Next Steps ──")) b.WriteString("\n") seen := map[string]bool{} for _, h := range hints { @@ -806,7 +827,7 @@ func (m diffTUIModel) buildDiffDetail() string { continue } seen[h] = true - b.WriteString(tc.Cyan.Render(fmt.Sprintf(" → skillshare %s", h))) + b.WriteString(theme.Accent().Render(fmt.Sprintf(" → skillshare %s", h))) b.WriteString("\n") } } diff --git a/cmd/skillshare/doctor.go b/cmd/skillshare/doctor.go index 7d06c3b8..07b39e86 100644 --- a/cmd/skillshare/doctor.go +++ b/cmd/skillshare/doctor.go @@ -13,8 +13,10 @@ import ( "skillshare/internal/backup" "skillshare/internal/config" "skillshare/internal/install" + "skillshare/internal/resource" "skillshare/internal/skillignore" "skillshare/internal/sync" + "skillshare/internal/theme" "skillshare/internal/trash" "skillshare/internal/ui" "skillshare/internal/utils" @@ -141,7 +143,7 @@ func cmdDoctorGlobal(jsonMode bool) error { ui.Header("Storage") checkBackupStatus(result, false, backup.BackupDir()) checkTrashStatus(result, trash.TrashDir()) - checkVersionDoctor(cfg, result) + checkVersionDoctor(cfg, result, false) if jsonMode { return finalizeDoctorJSON(restoreUI, result, updateCh) @@ -187,10 +189,11 @@ func cmdDoctorProject(root string, jsonMode bool) error { } cfg := &config.Config{ - Source: rt.sourcePath, - Targets: rt.targets, - Mode: "merge", - Audit: rt.config.Audit, + Source: rt.sourcePath, + AgentsSource: 
rt.agentsSourcePath, + Targets: rt.targets, + Mode: "merge", + Audit: rt.config.Audit, } runDoctorChecks(cfg, result, true) @@ -198,7 +201,7 @@ func cmdDoctorProject(root string, jsonMode bool) error { ui.Header("Storage") checkBackupStatus(result, true, "") checkTrashStatus(result, trash.ProjectTrashDir(root)) - checkVersionDoctor(cfg, result) + checkVersionDoctor(cfg, result, true) if jsonMode { return finalizeDoctorJSON(restoreUI, result, updateCh) @@ -220,8 +223,10 @@ func runDoctorChecks(cfg *config.Config, result *doctorResult, isProject bool) { sp.Stop() checkSource(cfg, result, discovered, discoverErr) + checkAgentsSource(cfg, result) checkSkillignore(result, stats) checkSymlinkSupport(result) + checkTheme(result) if !isProject { checkGitStatus(cfg.Source, result) @@ -231,7 +236,7 @@ func runDoctorChecks(cfg *config.Config, result *doctorResult, isProject bool) { checkSkillsValidity(cfg.Source, result, discovered) checkSkillIntegrity(result, discovered) checkSkillTargetsField(result, discovered, targetNamesFromConfig(cfg.Targets)) - targetCache := checkTargets(cfg, result) + targetCache := checkTargets(cfg, result, isProject) printSymlinkCompatHint(cfg.Targets, cfg.Mode, isProject) checkSyncDrift(cfg, result, discovered, targetCache) checkBrokenSymlinks(cfg, result) @@ -303,6 +308,39 @@ func checkSource(cfg *config.Config, result *doctorResult, discovered []sync.Dis result.addCheck("source", checkPass, fmt.Sprintf("Source: %s (%d skills)", cfg.Source, skillCount), nil) } +func checkAgentsSource(cfg *config.Config, result *doctorResult) { + agentsSource := cfg.EffectiveAgentsSource() + info, err := os.Stat(agentsSource) + if err != nil { + if os.IsNotExist(err) { + ui.Info("Agents source: %s (not created yet)", agentsSource) + result.addCheck("agents_source", checkPass, fmt.Sprintf("Agents source: %s (not created yet)", agentsSource), nil) + return + } + ui.Error("Agents source error: %s", err) + result.addError() + result.addCheck("agents_source", 
checkError, fmt.Sprintf("Agents source error: %v", err), nil) + return + } + + if !info.IsDir() { + ui.Error("Agents source is not a directory: %s", agentsSource) + result.addError() + result.addCheck("agents_source", checkError, fmt.Sprintf("Agents source is not a directory: %s", agentsSource), nil) + return + } + + agentCount := 0 + entries, _ := os.ReadDir(agentsSource) + for _, e := range entries { + if !e.IsDir() && strings.HasSuffix(strings.ToLower(e.Name()), ".md") { + agentCount++ + } + } + ui.Success("Agents source: %s (%d agents)", agentsSource, agentCount) + result.addCheck("agents_source", checkPass, fmt.Sprintf("Agents source: %s (%d agents)", agentsSource, agentCount), nil) +} + func checkSymlinkSupport(result *doctorResult) { testLink := filepath.Join(os.TempDir(), "skillshare_symlink_test") testTarget := filepath.Join(os.TempDir(), "skillshare_symlink_target") @@ -324,6 +362,60 @@ func checkSymlinkSupport(result *doctorResult) { result.addCheck("symlink_support", checkPass, "Link support: OK", nil) } +// checkTheme reports the resolved theme mode and source. Emits a warning +// status when the theme fell back to dark due to probe failure or non-TTY +// environments — users can resolve by setting SKILLSHARE_THEME explicitly. 
+func checkTheme(result *doctorResult) {
+	tm := theme.Get()
+
+	var status string
+	switch tm.Source {
+	case "env", "detected", "no-color":
+		status = checkPass
+	case "fallback-dark-no-tty", "fallback-dark-probe-failed":
+		status = checkWarning
+	default:
+		status = checkInfo
+	}
+
+	msg := fmt.Sprintf("Theme: %s (%s)", tm.Mode, tm.Source)
+	if tm.NoColor {
+		msg = "Theme: no-color mode"
+	}
+
+	details := []string{
+		fmt.Sprintf("source: %s", tm.Source),
+		fmt.Sprintf("override: SKILLSHARE_THEME=%s", envOrDefault("SKILLSHARE_THEME", "auto")),
+		fmt.Sprintf("no_color: %v", tm.NoColor),
+		fmt.Sprintf("term: %s", envOrDefault("TERM", "(unset)")),
+	}
+
+	switch status {
+	case checkPass:
+		ui.Success(msg)
+	case checkWarning:
+		ui.Warning(msg)
+		if tm.Source == "fallback-dark-probe-failed" {
+			ui.Info(" Tip: set SKILLSHARE_THEME=light or SKILLSHARE_THEME=dark for best colors")
+		}
+	default:
+		ui.Info(msg)
+	}
+
+	if status == checkWarning {
+		result.addWarning()
+	}
+	result.addCheck("theme", status, msg, details)
+}
+
+// envOrDefault returns the value of env var name or the given default.
+func envOrDefault(name, def string) string {
+	if v := os.Getenv(name); v != "" {
+		return v
+	}
+	return def
+}
+
 // cachedTargetStatus stores CheckStatusMerge/Copy results so checkSyncDrift
 // can reuse them without a second call.
 type cachedTargetStatus struct {
@@ -332,10 +424,23 @@ type cachedTargetStatus struct {
 	mode   string
 	status sync.TargetStatus
 }

-func checkTargets(cfg *config.Config, result *doctorResult) map[string]cachedTargetStatus {
+func checkTargets(cfg *config.Config, result *doctorResult, isProject bool) map[string]cachedTargetStatus {
 	ui.Header("Checking targets")
 	cache := make(map[string]cachedTargetStatus)

+	// Prepare agent context for per-target agent checks
+	agentsSource := cfg.EffectiveAgentsSource()
+	agentsExist := dirExists(agentsSource)
+	var discoveredAgents []resource.DiscoveredResource
+	if agentsExist {
+		all, _ := resource.AgentKind{}.Discover(agentsSource)
+		discoveredAgents = resource.ActiveAgents(all)
+	}
+	builtinAgents := config.DefaultAgentTargets()
+	if isProject {
+		builtinAgents = config.ProjectAgentTargets()
+	}
+
 	var details []string
 	hasError := false
@@ -362,15 +467,23 @@
 		targetIssues := checkTargetIssues(target, cfg.Source)

+		// Target name header
+		fmt.Printf("%s%s%s\n", ui.Bold, name, ui.Reset)
+
 		if len(targetIssues) > 0 {
-			ui.Error("%s [%s]: %s", name, mode, strings.Join(targetIssues, ", "))
+			fmt.Printf(" skills %s[%s] %s%s\n", ui.Red, mode, strings.Join(targetIssues, ", "), ui.Reset)
 			result.addError()
 			details = append(details, fmt.Sprintf("%s: %s", name, strings.Join(targetIssues, ", ")))
 			hasError = true
 		} else {
-			cached := displayTargetStatus(name, target, cfg.Source, mode)
+			cached := displayTargetStatus(target, cfg.Source, mode)
 			cache[name] = cached
 		}
+
+		// Agent sub-check for this target
+		if agentsExist {
+			checkAgentTargetInline(name, target, builtinAgents, discoveredAgents, result)
+		}
 	}

 	if hasError {
@@ -426,9 +539,9 @@ func checkTargetIssues(target config.TargetConfig, source string) []string {
 	return targetIssues
 }

-func displayTargetStatus(name string, target config.TargetConfig, source, mode string) cachedTargetStatus {
+func displayTargetStatus(target config.TargetConfig, source, mode string) cachedTargetStatus {
 	sc := target.SkillsConfig()
-	var statusStr string
+	var statusWord, detail string
 	var cached cachedTargetStatus
 	cached.mode = mode
 	needsSync := false
@@ -440,12 +553,14 @@ func displayTargetStatus(name string, target config.TargetConfig, source, mode s
 		cached.syncedCount = linkedCount
 		switch status {
 		case sync.StatusMerged:
-			statusStr = fmt.Sprintf("merged (%d shared, %d local)", linkedCount, localCount)
+			statusWord = "merged"
+			detail = fmt.Sprintf("(%d shared, %d local)", linkedCount, localCount)
 		case sync.StatusLinked:
-			statusStr = "linked (needs sync to apply merge mode)"
+			statusWord = "linked"
+			detail = "(needs sync)"
 			needsSync = true
 		default:
-			statusStr = status.String()
+			statusWord = status.String()
 		}
 	case "copy":
 		status, managedCount, localCount := sync.CheckStatusCopy(sc.Path)
@@ -453,27 +568,34 @@
 		cached.syncedCount = managedCount
 		switch status {
 		case sync.StatusCopied:
-			statusStr = fmt.Sprintf("copied (%d managed, %d local)", managedCount, localCount)
+			statusWord = "copied"
+			detail = fmt.Sprintf("(%d managed, %d local)", managedCount, localCount)
 		case sync.StatusLinked:
-			statusStr = "linked (needs sync to apply copy mode)"
+			statusWord = "linked"
+			detail = "(needs sync)"
 			needsSync = true
 		default:
-			statusStr = status.String()
+			statusWord = status.String()
 		}
 	default:
 		status := sync.CheckStatus(sc.Path, source)
 		cached.status = status
-		statusStr = status.String()
+		statusWord = status.String()
 		if status == sync.StatusMerged {
-			statusStr = "merged (needs sync to apply symlink mode)"
+			statusWord = "merged"
+			detail = "(needs sync)"
 			needsSync = true
 		}
 	}

+	statusColor := ui.Green
 	if needsSync {
-		ui.Warning("%s [%s]: %s", name, mode, statusStr)
+		statusColor = ui.Yellow
+	}
+	if detail != "" {
+		fmt.Printf(" skills [%s] %s%s%s %s%s%s\n", mode, statusColor, statusWord, ui.Reset, ui.Dim, detail, ui.Reset)
 	} else {
-		ui.Success("%s [%s]: %s", name, mode, statusStr)
+		fmt.Printf(" skills [%s] %s%s%s\n", mode, statusColor, statusWord, ui.Reset)
 	}
 	return cached
 }
@@ -635,7 +757,7 @@ func checkSkillIntegrity(result *doctorResult, discovered []sync.DiscoveredSkill
 		return
 	}

-	// Phase 1: filter to skills that have meta with file hashes (cheap ReadMeta only)
+	// Phase 1: filter to skills that have meta with file hashes
 	type verifiable struct {
 		name string
 		path string
@@ -644,22 +766,26 @@
 	}
 	var toVerify []verifiable
 	var skippedNames []string

+	store := install.NewMetadataStore()
+	if len(discovered) > 0 {
+		sourceDir := strings.TrimSuffix(discovered[0].SourcePath, discovered[0].RelPath)
+		sourceDir = strings.TrimRight(sourceDir, `/\`)
+		store = install.LoadMetadataOrNew(sourceDir)
+	}
+
 	for _, skill := range discovered {
-		meta, err := install.ReadMeta(skill.SourcePath)
-		if err != nil {
-			continue
-		}
-		if meta == nil {
+		entry := store.GetByPath(skill.RelPath)
+		if entry == nil {
 			continue // Local skill without meta — expected, skip silently
 		}
-		if meta.FileHashes == nil {
+		if entry.FileHashes == nil {
 			skippedNames = append(skippedNames, skill.RelPath)
 			continue
 		}
 		toVerify = append(toVerify, verifiable{
 			name: skill.RelPath,
 			path: skill.SourcePath,
-			stored: meta.FileHashes,
+			stored: entry.FileHashes,
 		})
 	}
@@ -927,14 +1053,14 @@
 		files, err := sync.DiscoverExtraFiles(sourceDir)
 		if err != nil {
 			result.addError()
-			ui.Error("%s: source directory missing (%s)", extra.Name, sourceDir)
+			ui.Error("%s: source missing (%s)", extra.Name, sourceDir)
 			details = append(details, fmt.Sprintf("%s: source directory missing", extra.Name))
 			hasIssue = true
 			continue
 		}
-		ui.Success("%s: source exists (%d files)", extra.Name, len(files))

 		reachable := 0
+		var unreachableTargets []string
 		for _, t := range extra.Targets {
 			targetPath := config.ExpandPath(t.Path)
 			if isProject && !filepath.IsAbs(targetPath) {
@@ -943,14 +1069,17 @@
 			if _, err := os.Stat(filepath.Dir(targetPath)); err == nil {
 				reachable++
 			} else {
-				ui.Warning("%s: target %s not reachable (parent dir missing: %s)", extra.Name, t.Path, filepath.Dir(targetPath))
+				unreachableTargets = append(unreachableTargets, t.Path)
 			}
 		}

 		if reachable == len(extra.Targets) {
-			ui.Success("%s: all targets reachable (%d/%d)", extra.Name, reachable, len(extra.Targets))
+			ui.Success("%s: %d files, %d/%d targets OK", extra.Name, len(files), reachable, len(extra.Targets))
 		} else {
 			result.addWarning()
-			ui.Warning("%s: some targets unreachable (%d/%d)", extra.Name, reachable, len(extra.Targets))
+			ui.Warning("%s: %d files, %d/%d targets unreachable", extra.Name, len(files), len(extra.Targets)-reachable, len(extra.Targets))
+			for _, t := range unreachableTargets {
+				fmt.Printf(" %s%s (parent dir missing)%s\n", ui.Dim, t, ui.Reset)
+			}
 			details = append(details, fmt.Sprintf("%s: %d/%d targets unreachable", extra.Name, len(extra.Targets)-reachable, len(extra.Targets)))
 			hasIssue = true
 		}
@@ -1061,22 +1190,34 @@ func formatBytes(b int64) string {
 }

 // checkVersionDoctor checks CLI and skill versions
-func checkVersionDoctor(cfg *config.Config, result *doctorResult) {
+func checkVersionDoctor(cfg *config.Config, result *doctorResult, isProject bool) {
 	ui.Header("Version")

 	// CLI version
 	ui.Success("CLI: %s", version)
 	result.addCheck("cli_version", checkPass, fmt.Sprintf("CLI: %s", version), nil)

-	// Skill version (reads metadata.version from SKILL.md)
+	// Skill version: try SKILL.md frontmatter first, then metadata store
 	localVersion := versioncheck.ReadLocalSkillVersion(cfg.Source)
 	if localVersion == "" {
-		// Distinguish "file not found" from "version field missing"
+		// Try metadata store (tracks installed version even without metadata.version in SKILL.md)
+		store := install.LoadMetadataOrNew(cfg.Source)
+		if entry := store.Get("skillshare"); entry != nil && entry.Version != "" {
+			localVersion = strings.TrimPrefix(entry.Version, "v")
+		}
+	}
+
+	if localVersion == "" {
 		skillFile := filepath.Join(cfg.Source, "skillshare", "SKILL.md")
 		if _, err := os.Stat(skillFile); os.IsNotExist(err) {
-			ui.Warning("Skill: not found")
-			ui.Info(" Run: skillshare upgrade --skill")
-			result.addCheck("skill_version", checkWarning, "Skill: not found", nil)
+			if isProject {
+				ui.Info("Skill: not installed")
+				result.addCheck("skill_version", checkInfo, "Skill: not installed in project", nil)
+			} else {
+				ui.Warning("Skill: not found")
+				ui.Info(" Run: skillshare upgrade --skill")
+				result.addCheck("skill_version", checkWarning, "Skill: not found", nil)
+			}
 		} else {
 			ui.Warning("Skill: missing version")
 			result.addCheck("skill_version", checkWarning, "Skill: missing version", nil)
diff --git a/cmd/skillshare/doctor_agents.go b/cmd/skillshare/doctor_agents.go
new file mode 100644
index 00000000..956ae718
--- /dev/null
+++ b/cmd/skillshare/doctor_agents.go
@@ -0,0 +1,122 @@
+package main
+
+import (
+	"fmt"
+	"os"
+	"path/filepath"
+	"strings"
+
+	"skillshare/internal/config"
+	"skillshare/internal/resource"
+	"skillshare/internal/sync"
+	"skillshare/internal/ui"
+)
+
+// checkAgentTargetInline validates the agent target for a single target,
+// printing as an indented sub-item under the target name in doctor output.
+// It applies the target's include/exclude filters to compute the expected count.
+func checkAgentTargetInline(name string, target config.TargetConfig, builtinAgents map[string]config.TargetConfig, allAgents []resource.DiscoveredResource, result *doctorResult) {
+	agentPath := resolveAgentTargetPath(target, builtinAgents, name)
+	if agentPath == "" {
+		return
+	}
+
+	ac := target.AgentsConfig()
+	mode := ac.Mode
+	if mode == "" {
+		mode = "merge"
+	}
+
+	// Apply per-target include/exclude filters to get expected agent count
+	filtered, filterErr := sync.FilterAgents(allAgents, ac.Include, ac.Exclude)
+	if filterErr != nil {
+		fmt.Printf(" agents %s[%s] invalid filter: %s%s\n", ui.Red, mode, filterErr.Error(), ui.Reset)
+		result.addError()
+		result.addCheck("agent_target_"+name, checkError,
+			fmt.Sprintf("Agent target %s: invalid filter: %v", name, filterErr), nil)
+		return
+	}
+	agentCount := len(filtered)
+
+	// Build details for JSON output
+	var details []string
+	details = append(details, fmt.Sprintf("path: %s", agentPath))
+	details = append(details, fmt.Sprintf("mode: %s", mode))
+	if len(ac.Include) > 0 {
+		details = append(details, fmt.Sprintf("include: %s", strings.Join(ac.Include, ", ")))
+	}
+	if len(ac.Exclude) > 0 {
+		details = append(details, fmt.Sprintf("exclude: %s", strings.Join(ac.Exclude, ", ")))
+	}
+
+	info, err := os.Stat(agentPath)
+	if err != nil {
+		if os.IsNotExist(err) {
+			fmt.Printf(" agents %s[%s] not created%s\n", ui.Gray, mode, ui.Reset)
+			result.addCheck("agent_target_"+name, checkPass,
+				fmt.Sprintf("Agent target %s: not created yet", name), details)
+			return
+		}
+		fmt.Printf(" agents %s[%s] error: %s%s\n", ui.Red, mode, err.Error(), ui.Reset)
+		result.addError()
+		result.addCheck("agent_target_"+name, checkError,
+			fmt.Sprintf("Agent target %s: %v", name, err), details)
+		return
+	}
+
+	if !info.IsDir() {
+		fmt.Printf(" agents %s[%s] error: not a directory%s\n", ui.Red, mode, ui.Reset)
+		result.addError()
+		result.addCheck("agent_target_"+name, checkError,
+			fmt.Sprintf("Agent target %s: path is not a directory", name), details)
+		return
+	}
+
+	linked, broken := countAgentLinksAndBroken(agentPath)
+	if broken > 0 {
+		msg := fmt.Sprintf("[%s] %d linked, %d broken", mode, linked, broken)
+		fmt.Printf(" agents %s%s%s\n", ui.Yellow, msg, ui.Reset)
+		result.addWarning()
+		result.addCheck("agent_target_"+name, checkWarning,
+			fmt.Sprintf("Agent target %s: %s", name, msg), details)
+		return
+	}
+
+	if linked != agentCount && agentCount > 0 {
+		fmt.Printf(" agents [%s] %sdrift%s %s(%d/%d linked)%s\n", mode, ui.Yellow, ui.Reset, ui.Dim, linked, agentCount, ui.Reset)
+		result.addWarning()
+		result.addCheck("agent_target_"+name, checkWarning,
+			fmt.Sprintf("Agent target %s: drift (%d/%d agents linked)", name, linked, agentCount), details)
+		return
+	}
+
+	fmt.Printf(" agents [%s] %ssynced%s %s(%d/%d linked)%s\n", mode, ui.Green, ui.Reset, ui.Dim, linked, agentCount, ui.Reset)
+	result.addCheck("agent_target_"+name, checkPass,
+		fmt.Sprintf("Agent target %s: %d agents synced", name, linked), details)
+}
+
+// countAgentLinksAndBroken counts .md symlinks and broken symlinks in a directory.
+func countAgentLinksAndBroken(dir string) (linked, broken int) {
+	entries, err := os.ReadDir(dir)
+	if err != nil {
+		return 0, 0
+	}
+	for _, e := range entries {
+		if e.IsDir() {
+			continue
+		}
+		if !strings.HasSuffix(strings.ToLower(e.Name()), ".md") {
+			continue
+		}
+		if e.Type()&os.ModeSymlink == 0 {
+			continue
+		}
+		// It's a symlink — check if target exists (os.Stat follows symlinks)
+		if _, statErr := os.Stat(filepath.Join(dir, e.Name())); statErr != nil {
+			broken++
+		} else {
+			linked++
+		}
+	}
+	return linked, broken
+}
diff --git a/cmd/skillshare/enable.go b/cmd/skillshare/enable.go
index 8ccef6a2..d747b444 100644
--- a/cmd/skillshare/enable.go
+++ b/cmd/skillshare/enable.go
@@ -33,6 +33,12 @@ func cmdToggleSkill(args []string, enable bool) error {
 		return err
 	}

+	// Extract --kind flag before parsing other args
+	kind, rest, err := parseKindFlag(rest)
+	if err != nil {
+		return err
+	}
+
 	var dryRun bool
 	var patterns []string
 	for _, arg := range rest {
@@ -68,20 +74,35 @@
 	}
 	applyModeLabel(mode)

+	isAgent := kind == kindAgents
+
 	var ignorePath string
 	var cfgPath string
 	if mode == modeProject {
-		ignorePath = filepath.Join(cwd, ".skillshare", "skills", ".skillignore")
+		if isAgent {
+			ignorePath = filepath.Join(cwd, ".skillshare", "agents", ".agentignore")
+		} else {
+			ignorePath = filepath.Join(cwd, ".skillshare", "skills", ".skillignore")
+		}
 		cfgPath = config.ProjectConfigPath(cwd)
 	} else {
 		cfg, err := config.Load()
 		if err != nil {
 			return fmt.Errorf("failed to load config: %w", err)
 		}
-		ignorePath = filepath.Join(cfg.Source, ".skillignore")
+		if isAgent {
+			ignorePath = filepath.Join(cfg.EffectiveAgentsSource(), ".agentignore")
+		} else {
+			ignorePath = filepath.Join(cfg.Source, ".skillignore")
+		}
 		cfgPath = config.ConfigPath()
 	}

+	ignoreLabel := ".skillignore"
+	if isAgent {
+		ignoreLabel = ".agentignore"
+	}
+
 	changed := false
 	for _, pattern := range patterns {
 		if dryRun {
@@ -96,25 +117,25 @@ func cmdToggleSkill(args []string, enable bool) error {
 		if enable {
 			removed, err := skillignore.RemovePattern(ignorePath, pattern)
 			if err != nil {
-				return fmt.Errorf("failed to update .skillignore: %w", err)
+				return fmt.Errorf("failed to update %s: %w", ignoreLabel, err)
 			}
 			if !removed {
 				ui.Warning("%s is not disabled", pattern)
 				continue
 			}
 			changed = true
-			ui.Success("Enabled: %s (removed from .skillignore)", pattern)
+			ui.Success("Enabled: %s (removed from %s)", pattern, ignoreLabel)
 		} else {
 			added, err := skillignore.AddPattern(ignorePath, pattern)
 			if err != nil {
-				return fmt.Errorf("failed to update .skillignore: %w", err)
+				return fmt.Errorf("failed to update %s: %w", ignoreLabel, err)
 			}
 			if !added {
 				ui.Warning("%s is already disabled", pattern)
 				continue
 			}
 			changed = true
-			ui.Success("Disabled: %s (added to .skillignore)", pattern)
+			ui.Success("Disabled: %s (added to %s)", pattern, ignoreLabel)
 		}
 	}
@@ -124,6 +145,7 @@ func cmdToggleSkill(args []string, enable bool) error {
 		e := oplog.NewEntry(action, "ok", time.Since(start))
 		e.Args = map[string]any{
 			"patterns": patterns,
+			"kind":     kind.String(),
 		}
 		oplog.Write(cfgPath, oplog.OpsFile, e)
 	}
diff --git a/cmd/skillshare/extras_init_tui.go b/cmd/skillshare/extras_init_tui.go
index 13c39a55..2fb30935 100644
--- a/cmd/skillshare/extras_init_tui.go
+++ b/cmd/skillshare/extras_init_tui.go
@@ -5,9 +5,11 @@ import (
 	"strings"
 	"time"

+	"skillshare/internal/config"
+	"skillshare/internal/theme"
+
 	"github.com/charmbracelet/bubbles/textinput"
 	tea "github.com/charmbracelet/bubbletea"
-	"skillshare/internal/config"
 )

 type extrasInitPhase int
@@ -48,14 +50,14 @@ func newExtrasInitTUIModel() extrasInitTUIModel {
 	ti := textinput.New()
 	ti.Placeholder = "rules"
 	ti.Focus()
-	ti.PromptStyle = tc.Cyan
-	ti.Cursor.Style = tc.Cyan
+	ti.PromptStyle = theme.Accent()
+	ti.Cursor.Style = theme.Accent()

 	si := textinput.New()
 	si.Placeholder = "Leave empty to use default"
 	si.CharLimit = 256
-	si.PromptStyle = tc.Cyan
-	si.Cursor.Style = tc.Cyan
+	si.PromptStyle = theme.Accent()
+	si.Cursor.Style = theme.Accent()

 	return extrasInitTUIModel{
 		phase: extrasPhaseNameInput,
@@ -238,32 +240,32 @@ func (m extrasInitTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 func (m extrasInitTUIModel) View() string {
 	var b strings.Builder

-	b.WriteString(tc.Title.Render("Extras Init"))
+	b.WriteString(theme.Title().Render("Extras Init"))
 	b.WriteString("\n\n")

 	switch m.phase {
 	case extrasPhaseNameInput:
-		b.WriteString(tc.Cyan.Render("Extra name: "))
+		b.WriteString(theme.Accent().Render("Extra name: "))
 		b.WriteString(m.textInput.View())
 		if m.err != nil {
-			b.WriteString("\n" + tc.Red.Render(m.err.Error()))
+			b.WriteString("\n" + theme.Danger().Render(m.err.Error()))
 		}
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("enter confirm esc cancel"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("enter confirm esc cancel"))
 	case extrasPhaseSourceInput:
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Name: %s", m.name)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Name: %s", m.name)))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Cyan.Render("Source directory (optional): "))
+		b.WriteString(theme.Accent().Render("Source directory (optional): "))
 		b.WriteString(m.sourceInput.View())
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("enter to skip (use default) esc back"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("enter to skip (use default) esc back"))
 	case extrasPhaseTargetInput:
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Name: %s", m.name)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Name: %s", m.name)))
 		if m.sourceValue != "" {
 			b.WriteString("\n")
-			b.WriteString(tc.Dim.Render(fmt.Sprintf("Source: %s", m.sourceValue)))
+			b.WriteString(theme.Dim().Render(fmt.Sprintf("Source: %s", m.sourceValue)))
 		}
 		if len(m.targets) > 0 {
 			b.WriteString("\n")
@@ -272,22 +274,22 @@ func (m extrasInitTUIModel) View() string {
 				if t.flatten {
 					modeLabel += ", flatten"
 				}
-				b.WriteString(tc.Dim.Render(fmt.Sprintf(" → %s (%s)", t.path, modeLabel)))
+				b.WriteString(theme.Dim().Render(fmt.Sprintf(" → %s (%s)", t.path, modeLabel)))
 				b.WriteString("\n")
 			}
 		}
 		b.WriteString("\n")
-		b.WriteString(tc.Cyan.Render(fmt.Sprintf("Target #%d path: ", len(m.targets)+1)))
+		b.WriteString(theme.Accent().Render(fmt.Sprintf("Target #%d path: ", len(m.targets)+1)))
 		b.WriteString(m.textInput.View())
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("enter confirm esc back"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("enter confirm esc back"))
 	case extrasPhaseModeSelect:
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Name: %s", m.name)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Name: %s", m.name)))
 		b.WriteString("\n")
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Target: %s", m.targets[len(m.targets)-1].path)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Target: %s", m.targets[len(m.targets)-1].path)))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Cyan.Render("Sync mode:"))
+		b.WriteString(theme.Accent().Render("Sync mode:"))
 		b.WriteString("\n")
 		for i, mode := range syncModes {
 			cursor := " "
@@ -304,43 +306,43 @@ func (m extrasInitTUIModel) View() string {
 				desc = " (directory symlink)"
 			}
 			if i == m.currMode {
-				b.WriteString(tc.Cyan.Render(cursor+mode) + tc.Dim.Render(desc))
+				b.WriteString(theme.Accent().Render(cursor+mode) + theme.Dim().Render(desc))
 			} else {
-				b.WriteString(tc.Dim.Render(cursor + mode + desc))
+				b.WriteString(theme.Dim().Render(cursor + mode + desc))
 			}
 			b.WriteString("\n")
 		}
 		b.WriteString("\n")
-		b.WriteString(tc.Help.Render("↑↓/jk navigate enter/space select esc back"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("↑↓/jk navigate enter/space select esc back"))
 	case extrasPhaseFlattenToggle:
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Name: %s", m.name)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Name: %s", m.name)))
 		b.WriteString("\n")
 		lastTarget := m.targets[len(m.targets)-1]
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Target: %s (%s)", lastTarget.path, lastTarget.mode)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Target: %s (%s)", lastTarget.path, lastTarget.mode)))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Cyan.Render("Flatten files into target root? (y/N) "))
+		b.WriteString(theme.Accent().Render("Flatten files into target root? (y/N) "))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("y yes n/enter no esc back"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("y yes n/enter no esc back"))
 	case extrasPhaseAddMore:
-		b.WriteString(tc.Dim.Render(fmt.Sprintf("Name: %s", m.name)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf("Name: %s", m.name)))
 		b.WriteString("\n")
 		for _, t := range m.targets {
 			modeLabel := t.mode
 			if t.flatten {
 				modeLabel += ", flatten"
 			}
-			b.WriteString(tc.Dim.Render(fmt.Sprintf(" → %s (%s)", t.path, modeLabel)))
+			b.WriteString(theme.Dim().Render(fmt.Sprintf(" → %s (%s)", t.path, modeLabel)))
 			b.WriteString("\n")
 		}
 		b.WriteString("\n")
-		b.WriteString(tc.Cyan.Render("Add another target? (y/N) "))
+		b.WriteString(theme.Accent().Render("Add another target? (y/N) "))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("y yes n/enter no esc back"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("y yes n/enter no esc back"))
 	case extrasPhaseConfirm:
-		b.WriteString(tc.Cyan.Render("Summary:"))
+		b.WriteString(theme.Accent().Render("Summary:"))
 		b.WriteString("\n")
 		b.WriteString(fmt.Sprintf(" Name: %s\n", m.name))
 		if m.sourceValue != "" {
@@ -354,9 +356,9 @@ func (m extrasInitTUIModel) View() string {
 			b.WriteString(fmt.Sprintf(" → %s (%s)\n", t.path, modeLabel))
 		}
 		b.WriteString("\n")
-		b.WriteString(tc.Cyan.Render("Create this extra? (Y/n) "))
+		b.WriteString(theme.Accent().Render("Create this extra? (Y/n) "))
 		b.WriteString("\n\n")
-		b.WriteString(tc.Help.Render("y/enter confirm n cancel"))
+		b.WriteString(theme.Dim().MarginLeft(2).Render("y/enter confirm n cancel"))
 	}

 	return b.String()
diff --git a/cmd/skillshare/extras_list_tui.go b/cmd/skillshare/extras_list_tui.go
index 0603847f..6f76babe 100644
--- a/cmd/skillshare/extras_list_tui.go
+++ b/cmd/skillshare/extras_list_tui.go
@@ -8,6 +8,7 @@ import (
 	"skillshare/internal/config"
 	"skillshare/internal/sync"
+	"skillshare/internal/theme"
 	"skillshare/internal/ui"

 	"github.com/charmbracelet/bubbles/list"
@@ -107,7 +108,7 @@ func newExtrasListTUIModel(
 	l := list.New(nil, delegate, 0, 0)
 	l.Title = fmt.Sprintf("Extras (%s)", modeLabel)
-	l.Styles.Title = tc.ListTitle
+	l.Styles.Title = theme.Title()
 	l.SetShowStatusBar(false)
 	l.SetFilteringEnabled(false)
 	l.SetShowHelp(false)
@@ -115,12 +116,12 @@ func newExtrasListTUIModel(
 	sp := spinner.New()
 	sp.Spinner = spinner.Dot
-	sp.Style = tc.SpinnerStyle
+	sp.Style = theme.Accent()

 	fi := textinput.New()
 	fi.Prompt = "/ "
-	fi.PromptStyle = tc.Filter
-	fi.Cursor.Style = tc.Filter
+	fi.PromptStyle = theme.Accent()
+	fi.Cursor.Style = theme.Accent()
 	fi.Placeholder = "filter by name"

 	return extrasListTUIModel{
@@ -425,29 +426,29 @@ func (m extrasListTUIModel) viewExtrasVertical() string {
 func (m extrasListTUIModel) renderExtrasDetail(e extrasListEntry) string {
 	var b strings.Builder

-	b.WriteString(tc.Title.Render(e.Name))
+	b.WriteString(theme.Title().Render(e.Name))
 	b.WriteString("\n\n")

-	label := tc.Label.Render("Source")
+	label := theme.Dim().Width(14).Render("Source")
 	if e.SourceExists {
 		b.WriteString(label + shortenPath(e.SourceDir) + "\n")
 	} else {
-		b.WriteString(label + tc.Dim.Render("not found") + "\n")
+		b.WriteString(label + theme.Dim().Render("not found") + "\n")
 	}

-	label = tc.Label.Render("Files")
+	label = theme.Dim().Width(14).Render("Files")
 	if e.SourceExists {
 		b.WriteString(label + fmt.Sprintf("%d", e.FileCount) + "\n")
 	} else {
-		b.WriteString(label + tc.Dim.Render("—") + "\n")
+		b.WriteString(label + theme.Dim().Render("—") + "\n")
 	}
 	b.WriteString("\n")

-	b.WriteString(tc.Title.Render("Targets"))
+	b.WriteString(theme.Title().Render("Targets"))
 	b.WriteString("\n")
 	if len(e.Targets) == 0 {
-		b.WriteString(tc.Dim.Render(" No targets configured") + "\n")
+		b.WriteString(theme.Dim().Render(" No targets configured") + "\n")
 	} else {
 		hasDrift := false
 		for _, t := range e.Targets {
@@ -456,18 +457,18 @@ func (m extrasListTUIModel) renderExtrasDetail(e extrasListEntry) string {
 			switch t.Status {
 			case "synced":
 				icon = "✓"
-				style = tc.Green
+				style = theme.Success()
 			case "drift":
 				icon = "△"
-				style = tc.Yellow
+				style = theme.Warning()
 				hasDrift = true
 			case "not synced":
 				icon = "✗"
-				style = tc.Red
+				style = theme.Danger()
 				hasDrift = true
 			default:
 				icon = "-"
-				style = tc.Dim
+				style = theme.Dim()
 			}
 			statusText := ""
 			if t.Status != "synced" {
@@ -478,29 +479,29 @@
 				modeLabel += ", flatten"
 			}
 			fmt.Fprintf(&b, " %s %s (%s)%s\n",
-				style.Render(icon), shortenPath(t.Path), modeLabel, tc.Dim.Render(statusText))
+				style.Render(icon), shortenPath(t.Path), modeLabel, theme.Dim().Render(statusText))
 		}
 		if hasDrift {
-			b.WriteString("\n" + tc.Yellow.Render("hint:") + " press S to sync, or use --force to overwrite conflicts\n")
+			b.WriteString("\n" + theme.Warning().Render("hint:") + " press S to sync, or use --force to overwrite conflicts\n")
 		}
 	}

 	if e.SourceExists && e.FileCount > 0 {
 		b.WriteString("\n")
-		b.WriteString(tc.Title.Render("Files"))
+		b.WriteString(theme.Title().Render("Files"))
 		b.WriteString("\n")
 		files := discoverExtraFileNames(e.SourceDir)
 		maxShow := 10
 		for i, f := range files {
 			if i >= maxShow {
-				b.WriteString(tc.Dim.Render(fmt.Sprintf(" … and %d more", len(files)-maxShow)) + "\n")
+				b.WriteString(theme.Dim().Render(fmt.Sprintf(" … and %d more", len(files)-maxShow)) + "\n")
 				break
 			}
 			prefix := "├── "
 			if i == len(files)-1 || i == maxShow-1 {
 				prefix = "└── "
 			}
-			b.WriteString(tc.Dim.Render(" "+prefix) + f + "\n")
+			b.WriteString(theme.Dim().Render(" "+prefix) + f + "\n")
 		}
 	}
@@ -530,17 +531,17 @@ func (m extrasListTUIModel) renderExtrasHelp(scrollInfo string) string {
 	if m.filtering {
 		helpText = "Enter lock Esc clear q quit"
 	}
-	return tc.Help.Render(appendScrollInfo(helpText, scrollInfo))
+	return theme.Dim().MarginLeft(2).Render(appendScrollInfo(helpText, scrollInfo))
 }

 func renderExtrasActionMsg(msg string) string {
 	if strings.HasPrefix(msg, "✓") {
-		return tc.Green.Render(msg)
+		return theme.Success().Render(msg)
 	}
 	if strings.HasPrefix(msg, "✗") {
-		return tc.Red.Render(msg)
+		return theme.Danger().Render(msg)
 	}
-	return tc.Yellow.Render(msg)
+	return theme.Warning().Render(msg)
 }

 // ─── Helpers ─────────────────────────────────────────────────────
@@ -661,7 +662,7 @@ func (m extrasListTUIModel) renderConfirmOverlay() string {
 	}

 	return fmt.Sprintf("\n%s\n\n%s\n\nProceed? [Y/n] ",
-		tc.Title.Render(title), body)
+		theme.Title().Render(title), body)
 }

 // ─── Target Sub-Menu ─────────────────────────────────────────────────
@@ -764,14 +765,14 @@
 		title = "Toggle flatten"
 	}

-	fmt.Fprintf(&b, "\n%s\n\n", tc.Title.Render(title))
+	fmt.Fprintf(&b, "\n%s\n\n", theme.Title().Render(title))

 	if m.targetAction == "mode" || m.targetAction == "flatten" {
 		// No "All targets" for mode — list targets directly
 		for i, t := range m.targetMenuItems {
 			prefix := " "
 			if i == m.targetCursor {
-				prefix = tc.Cyan.Render(">") + " "
+				prefix = theme.Accent().Render(">") + " "
 			}
 			fmt.Fprintf(&b, "%s%s (%s)\n", prefix, shortenPath(t.Path), t.Mode)
 		}
@@ -779,7 +780,7 @@
 		for i := 0; i <= len(m.targetMenuItems); i++ {
 			prefix := " "
 			if i == m.targetCursor {
-				prefix = tc.Cyan.Render(">") + " "
+				prefix = theme.Accent().Render(">") + " "
 			}
 			if i == 0 {
 				fmt.Fprintf(&b, "%s%s\n", prefix, "All targets")
@@ -790,7 +791,7 @@ func (m extrasListTUIModel) renderTargetMenu() string {
 		}
 	}

-	fmt.Fprintf(&b, "\n%s\n", tc.Help.Render("↑↓ select Enter confirm Esc cancel"))
+	fmt.Fprintf(&b, "\n%s\n", theme.Dim().MarginLeft(2).Render("↑↓ select Enter confirm Esc cancel"))
 	return b.String()
 }
@@ -846,14 +847,14 @@ func (m extrasListTUIModel) handleModePickerKey(msg tea.KeyMsg) (tea.Model, tea.
 func (m extrasListTUIModel) renderModePicker() string {
 	var b strings.Builder

-	fmt.Fprintf(&b, "\n%s\n", tc.Title.Render("Change mode"))
-	fmt.Fprintf(&b, "%s %s\n\n", tc.Dim.Render("Extra:"), m.modePickerExtra)
-	fmt.Fprintf(&b, "%s %s\n\n", tc.Dim.Render("Target:"), shortenPath(m.modePickerTarget))
+	fmt.Fprintf(&b, "\n%s\n", theme.Title().Render("Change mode"))
+	fmt.Fprintf(&b, "%s %s\n\n", theme.Dim().Render("Extra:"), m.modePickerExtra)
+	fmt.Fprintf(&b, "%s %s\n\n", theme.Dim().Render("Target:"), shortenPath(m.modePickerTarget))

 	for i, mode := range extrasSyncModes {
 		cursor := " "
 		if i == m.modeCursor {
-			cursor = tc.Cyan.Render(">") + " "
+			cursor = theme.Accent().Render(">") + " "
 		}
 		var desc string
 		switch mode {
@@ -865,13 +866,13 @@ func (m extrasListTUIModel) renderModePicker() string {
 			desc = " (directory symlink)"
 		}
 		if i == m.modeCursor {
-			fmt.Fprintf(&b, "%s%s%s\n", cursor, tc.Cyan.Render(mode), tc.Dim.Render(desc))
+			fmt.Fprintf(&b, "%s%s%s\n", cursor, theme.Accent().Render(mode), theme.Dim().Render(desc))
 		} else {
-			fmt.Fprintf(&b, "%s%s%s\n", cursor, mode, tc.Dim.Render(desc))
+			fmt.Fprintf(&b, "%s%s%s\n", cursor, mode, theme.Dim().Render(desc))
 		}
 	}

-	fmt.Fprintf(&b, "\n%s\n", tc.Help.Render("↑↓ select Enter confirm Esc cancel"))
+	fmt.Fprintf(&b, "\n%s\n", theme.Dim().MarginLeft(2).Render("↑↓ select Enter confirm Esc cancel"))
 	return b.String()
 }
@@ -1307,9 +1308,9 @@ func (m extrasListTUIModel) renderExtrasContentOverlay() string {
 	}

 	b.WriteString("\n")
-	b.WriteString(tc.Title.Render(fmt.Sprintf(" %s", extraName)))
+	b.WriteString(theme.Title().Render(fmt.Sprintf(" %s", extraName)))
 	if fileName != "" {
-		b.WriteString(tc.Dim.Render(fmt.Sprintf(" ─ %s", fileName)))
+		b.WriteString(theme.Dim().Render(fmt.Sprintf(" ─ %s", fileName)))
 	}
 	b.WriteString("\n\n")
@@ -1326,7 +1327,7 @@
 		PaddingLeft(1).
 		Render(sidebarStr)

-	borderStyle := tc.Border.Height(contentHeight).MaxHeight(contentHeight)
+	borderStyle := theme.Dim().Height(contentHeight).MaxHeight(contentHeight)
 	borderCol := strings.Repeat("│\n", contentHeight)
 	borderPanel := borderStyle.Render(strings.TrimRight(borderCol, "\n"))
@@ -1344,7 +1345,7 @@
 	if scrollInfo != "" {
 		help += " " + scrollInfo
 	}
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(theme.Dim().MarginLeft(2).Render(help))
 	b.WriteString("\n")

 	return b.String()
@@ -1355,8 +1356,8 @@ func (m extrasListTUIModel) renderExtrasSidebarStr(width, height int) string {
 		return "(no files)"
 	}

-	selectedStyle := lipgloss.NewStyle().Bold(true).Foreground(tc.BrandYellow)
-	dirStyle := tc.Cyan
+	selectedStyle := lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("#D4D93C"))
+	dirStyle := theme.Accent()
 	fileStyle := lipgloss.NewStyle()

 	total := len(m.treeNodes)
@@ -1401,7 +1402,7 @@
 	}

 	if total > height {
-		lines = append(lines, tc.Dim.Render(fmt.Sprintf(" (%d/%d)", m.treeCursor+1, total)))
+		lines = append(lines, theme.Dim().Render(fmt.Sprintf(" (%d/%d)", m.treeCursor+1, total)))
 	}

 	return strings.Join(lines, "\n")
diff --git a/cmd/skillshare/extras_list_tui_item.go b/cmd/skillshare/extras_list_tui_item.go
index cd00b941..ddfc1f67 100644
--- a/cmd/skillshare/extras_list_tui_item.go
+++ b/cmd/skillshare/extras_list_tui_item.go
@@ -3,6 +3,8 @@ package main
 import (
 	"io"

+	"skillshare/internal/theme"
+
 	"github.com/charmbracelet/bubbles/list"
 	tea "github.com/charmbracelet/bubbletea"
 )
@@ -41,10 +43,10 @@ func (extrasListDelegate) Render(w io.Writer, m list.Model, index int, item list
 // extrasStatusBadge returns a short colored status indicator based on aggregate target status.
 func extrasStatusBadge(e extrasListEntry) string {
 	if !e.SourceExists {
-		return tc.Dim.Render("no source")
+		return theme.Dim().Render("no source")
 	}
 	if len(e.Targets) == 0 {
-		return tc.Dim.Render("no targets")
+		return theme.Dim().Render("no targets")
 	}

 	synced := 0
@@ -60,10 +62,10 @@ func extrasStatusBadge(e extrasListEntry) string {
 	total := len(e.Targets)
 	if synced == total {
-		return tc.Green.Render("✓")
+		return theme.Success().Render("✓")
 	}
 	if drift > 0 {
-		return tc.Yellow.Render("△")
+		return theme.Warning().Render("△")
 	}
-	return tc.Red.Render("✗")
+	return theme.Danger().Render("✗")
 }
diff --git a/cmd/skillshare/init.go b/cmd/skillshare/init.go
index ab68ce30..4066598a 100644
--- a/cmd/skillshare/init.go
+++ b/cmd/skillshare/init.go
@@ -377,10 +377,14 @@ func performFreshInit(opts *initOptions, home string) error {
 		ui.Warning("Dry run mode - no changes will be made")
 	}

-	// Create source directory if needed
+	// Create source directories if needed
 	if err := createSourceDir(sourcePath, opts.dryRun); err != nil {
 		return err
 	}
+	agentsSourcePath := filepath.Join(filepath.Dir(sourcePath), "agents")
+	if err := createSourceDir(agentsSourcePath, opts.dryRun); err != nil {
+		return err
+	}

 	// Copy skills from selected directory
 	if copyFromPath != "" {
diff --git a/cmd/skillshare/init_project.go b/cmd/skillshare/init_project.go
index 55c8fa91..01464e7e 100644
--- a/cmd/skillshare/init_project.go
+++ b/cmd/skillshare/init_project.go
@@ -216,6 +216,9 @@ func performProjectInit(root string, opts projectInitOptions) error {
 	if err := os.MkdirAll(filepath.Join(root, ".skillshare", "skills"), 0755); err != nil {
 		return fmt.Errorf("failed to create .skillshare/skills: %w", err)
 	}
+	if err := os.MkdirAll(filepath.Join(root, ".skillshare", "agents"), 0755); err != nil {
+		return fmt.Errorf("failed to create .skillshare/agents: %w", err)
+	}

 	if err := ensureProjectGitignore(root, opts.configMode == "local"); err != nil {
 		return err
diff --git a/cmd/skillshare/install.go b/cmd/skillshare/install.go
index 54137642..55f81988 100644
--- a/cmd/skillshare/install.go
+++ b/cmd/skillshare/install.go
@@ -97,6 +97,23 @@ func parseInstallArgs(args []string) (*installArgs, bool, error) {
 			result.opts.Branch = branch
 		case arg == "--track" || arg == "-t":
 			result.opts.Track = true
+		case arg == "--kind":
+			if i+1 >= len(args) {
+				return nil, false, fmt.Errorf("--kind requires a value (skill or agent)")
+			}
+			i++
+			kind := strings.ToLower(args[i])
+			if kind != "skill" && kind != "agent" {
+				return nil, false, fmt.Errorf("--kind must be 'skill' or 'agent', got %q", args[i])
+			}
+			result.opts.Kind = kind
+		case arg == "--agent" || arg == "-a":
+			if i+1 >= len(args) {
+				return nil, false, fmt.Errorf("-a requires agent name(s)")
+			}
+			i++
+			result.opts.AgentNames = strings.Split(args[i], ",")
+			result.opts.Kind = "agent"
 		case arg == "--skill" || arg == "-s":
 			if i+1 >= len(args) {
 				return nil, false, fmt.Errorf("--skill requires a value")
@@ -115,6 +132,8 @@
 			}
 			i++
 			result.opts.Into = args[i]
+		case strings.HasPrefix(arg, "--into="):
+			result.opts.Into = strings.TrimPrefix(arg, "--into=")
 		case arg == "--all":
 			result.opts.All = true
 		case arg == "--yes" || arg == "-y":
@@ -171,6 +190,9 @@
 	if result.opts.HasSkillFilter() && result.opts.Track {
 		return nil, false, fmt.Errorf("--skill cannot be used with --track")
 	}
+	if result.opts.HasAgentFilter() && result.opts.Track {
+		return nil, false, fmt.Errorf("--agent cannot be used with --track")
+	}
 	if result.opts.ShouldInstallAll() && result.opts.Track {
 		return nil, false, fmt.Errorf("--all/--yes cannot be used with --track")
 	}
@@ -191,6 +213,16 @@ func parseInstallArgs(args []string) (*installArgs, bool, error) {
 	return result, false, nil
} +func applyInstallJSONDefaults(parsed *installArgs) { + if !parsed.jsonOutput { + return + } + parsed.opts.Force = true + if !parsed.opts.HasSkillFilter() && !parsed.opts.HasAgentFilter() { + parsed.opts.All = true + } +} + // destWithInto returns the destination path, prepending opts.Into if set. func destWithInto(sourceDir string, opts install.InstallOptions, skillName string) string { if opts.Into != "" { @@ -199,6 +231,14 @@ func destWithInto(sourceDir string, opts install.InstallOptions, skillName strin return filepath.Join(sourceDir, skillName) } +// agentsDirWithInto returns agentsDir joined with opts.Into (if set). +func agentsDirWithInto(agentsDir string, opts install.InstallOptions) string { + if opts.Into != "" { + return filepath.Join(agentsDir, opts.Into) + } + return agentsDir +} + // ensureIntoDirExists creates the Into subdirectory if opts.Into is set. func ensureIntoDirExists(sourceDir string, opts install.InstallOptions) error { if opts.Into == "" { @@ -219,17 +259,17 @@ func parseOptsFromProjectConfig(cfg *config.ProjectConfig) install.ParseOptions // resolveSkillFromName resolves a skill name to source using metadata func resolveSkillFromName(skillName string, cfg *config.Config) (*install.Source, error) { - skillPath := filepath.Join(cfg.Source, skillName) - - meta, err := install.ReadMeta(skillPath) + store, err := install.LoadMetadataWithMigration(cfg.Source, "") if err != nil { return nil, fmt.Errorf("skill '%s' not found or has no metadata", skillName) } - if meta == nil { + + entry := store.GetByPath(skillName) + if entry == nil || entry.Source == "" { return nil, fmt.Errorf("skill '%s' has no metadata, cannot update", skillName) } - source, err := install.ParseSourceWithOptions(meta.Source, parseOptsFromConfig(cfg)) + source, err := install.ParseSourceWithOptions(entry.Source, parseOptsFromConfig(cfg)) if err != nil { return nil, fmt.Errorf("invalid source in metadata: %w", err) } @@ -295,11 +335,32 @@ func cmdInstall(args []string) 
error { applyModeLabel(mode) if mode == modeProject { - summary, err := cmdInstallProject(rest, cwd) + parsed, showHelp, parseErr := parseInstallArgs(rest) + if showHelp { + printInstallHelp() + return parseErr + } + if parseErr != nil { + return parseErr + } + applyInstallJSONDefaults(parsed) + + jsonUI := newJSONUISuppressor(parsed.jsonOutput) + defer jsonUI.Flush() + + jsonWriteResult := func(summary installLogSummary, cmdErr error) error { + jsonUI.Flush() + return installOutputJSON(summary, start, cmdErr) + } + + summary, err := cmdInstallProjectParsed(parsed, cwd) if summary.Mode == "" { summary.Mode = "project" } logInstallOp(config.ProjectConfigPath(cwd), rest, start, err, summary) + if parsed.jsonOutput { + return jsonWriteResult(summary, err) + } return err } @@ -311,46 +372,30 @@ func cmdInstall(args []string) error { if parseErr != nil { return parseErr } - - // --json implies --force and --all (skip prompts for non-interactive use) - if parsed.jsonOutput { - parsed.opts.Force = true - if !parsed.opts.HasSkillFilter() { - parsed.opts.All = true - } - } + applyInstallJSONDefaults(parsed) // When no source is given, only bare "install" is valid — reject incompatible flags if parsed.sourceArg == "" { hasSourceFlags := parsed.opts.Name != "" || parsed.opts.Into != "" || - parsed.opts.Track || len(parsed.opts.Skills) > 0 || - len(parsed.opts.Exclude) > 0 || parsed.opts.All || parsed.opts.Yes || parsed.opts.Update || - parsed.opts.Branch != "" - if hasSourceFlags && !parsed.jsonOutput { - return fmt.Errorf("flags --name, --into, --track, --skill, --exclude, --all, --yes, and --update require a source argument") + parsed.opts.Track || parsed.opts.HasSkillFilter() || parsed.opts.HasAgentFilter() || + len(parsed.opts.Exclude) > 0 || parsed.opts.Update || parsed.opts.Branch != "" || + parsed.opts.Kind != "" + if hasSourceFlags || ((parsed.opts.All || parsed.opts.Yes) && !parsed.jsonOutput) { + return fmt.Errorf("flags --name, --into, --track, --skill, --agent, 
--kind, --exclude, --all, --yes, --branch, and --update require a source argument") } } // In JSON mode, redirect all UI output to stderr early so the // logo, spinner, step, and handler output don't corrupt stdout. - var restoreJSONUI func() - restoreJSONUIIfNeeded := func() { - if restoreJSONUI != nil { - restoreJSONUI() - restoreJSONUI = nil - } - } - if parsed.jsonOutput { - restoreJSONUI = suppressUIToDevnull() - } - defer restoreJSONUIIfNeeded() + jsonUI := newJSONUISuppressor(parsed.jsonOutput) + defer jsonUI.Flush() jsonWriteError := func(err error) error { - restoreJSONUIIfNeeded() + jsonUI.Flush() return writeJSONError(err) } jsonWriteResult := func(summary installLogSummary, cmdErr error) error { - restoreJSONUIIfNeeded() + jsonUI.Flush() return installOutputJSON(summary, start, cmdErr) } @@ -375,6 +420,7 @@ func cmdInstall(args []string) error { return err } + parsed.opts.SourceDir = cfg.Source source, resolvedFromMeta, err := resolveInstallSource(parsed.sourceArg, parsed.opts, cfg) if err == nil && parsed.opts.Branch != "" { source.Branch = parsed.opts.Branch @@ -411,10 +457,10 @@ func cmdInstall(args []string) error { summary.Source = parsed.sourceArg } if err == nil && !parsed.opts.DryRun && len(summary.InstalledSkills) > 0 { - reg, regErr := config.LoadRegistry(cfg.RegistryDir) - if regErr != nil { - ui.Warning("Failed to load registry: %v", regErr) - } else if rErr := config.ReconcileGlobalSkills(cfg, reg); rErr != nil { + store, storeErr := install.LoadMetadataWithMigration(cfg.Source, "") + if storeErr != nil { + ui.Warning("Failed to load metadata: %v", storeErr) + } else if rErr := config.ReconcileGlobalSkills(cfg, store); rErr != nil { ui.Warning("Failed to reconcile global skills config: %v", rErr) } } @@ -433,10 +479,10 @@ func cmdInstall(args []string) error { summary.Source = parsed.sourceArg } if err == nil && !parsed.opts.DryRun && len(summary.InstalledSkills) > 0 { - reg, regErr := config.LoadRegistry(cfg.RegistryDir) - if regErr != nil 
{ - ui.Warning("Failed to load registry: %v", regErr) - } else if rErr := config.ReconcileGlobalSkills(cfg, reg); rErr != nil { + store, storeErr := install.LoadMetadataWithMigration(cfg.Source, "") + if storeErr != nil { + ui.Warning("Failed to load metadata: %v", storeErr) + } else if rErr := config.ReconcileGlobalSkills(cfg, store); rErr != nil { ui.Warning("Failed to reconcile global skills config: %v", rErr) } } @@ -535,11 +581,13 @@ Sources: Options: --name Override installed name when exactly one skill is installed + --kind Restrict discovery/install to skill or agent --into Install into subdirectory (e.g. "frontend" or "frontend/react") --force, -f Overwrite existing skill; also continue if audit would block --update, -u Update existing (git pull if possible, else reinstall) --branch, -b Git branch to clone from (default: remote default) --track, -t Install as tracked repo (preserves .git for updates) + --agent, -a Select specific agents from a multi-agent repo (comma-separated) --skill, -s Select specific skills from multi-skill repo (comma-separated; supports glob patterns like "core-*", "test-?") --exclude Skip specific skills during install (comma-separated; @@ -568,6 +616,8 @@ Examples: skillshare install ~/my-skill -T high # Override block threshold for this run Selective install (non-interactive): + skillshare install org/agents --kind agent # Agents only + skillshare install org/agents -a reviewer,tutor # Specific agents skillshare install anthropics/skills -s pdf,commit # Specific skills skillshare install anthropics/skills -s "core-*" # Glob pattern skillshare install anthropics/skills --all # All skills diff --git a/cmd/skillshare/install_context.go b/cmd/skillshare/install_context.go index d17ffdb1..2493288b 100644 --- a/cmd/skillshare/install_context.go +++ b/cmd/skillshare/install_context.go @@ -13,18 +13,22 @@ var ( _ install.InstallContext = (*projectInstallContext)(nil) ) -// toSkillEntryDTOs converts config.SkillEntry (or its alias 
ProjectSkill) -// to install.SkillEntryDTO to avoid circular imports between install and config. -func toSkillEntryDTOs(skills []config.SkillEntry) []install.SkillEntryDTO { - dtos := make([]install.SkillEntryDTO, len(skills)) - for i, s := range skills { - dtos[i] = install.SkillEntryDTO{ - Name: s.Name, - Source: s.Source, - Tracked: s.Tracked, - Group: s.Group, - Branch: s.Branch, +// storeToSkillEntryDTOs converts MetadataStore entries to []install.SkillEntryDTO. +func storeToSkillEntryDTOs(store *install.MetadataStore) []install.SkillEntryDTO { + names := store.List() // sorted + dtos := make([]install.SkillEntryDTO, 0, len(names)) + for _, name := range names { + entry := store.Get(name) + if entry == nil { + continue } + dtos = append(dtos, install.SkillEntryDTO{ + Name: name, + Source: entry.Source, + Tracked: entry.Tracked, + Group: entry.Group, + Branch: entry.Branch, + }) } return dtos } @@ -35,16 +39,16 @@ func toSkillEntryDTOs(skills []config.SkillEntry) []install.SkillEntryDTO { // globalInstallContext implements install.InstallContext for global mode. 
type globalInstallContext struct { - cfg *config.Config - reg *config.Registry + cfg *config.Config + store *install.MetadataStore } func (g *globalInstallContext) SourcePath() string { return g.cfg.Source } func (g *globalInstallContext) ConfigSkills() []install.SkillEntryDTO { - return toSkillEntryDTOs(g.reg.Skills) + return storeToSkillEntryDTOs(g.store) } func (g *globalInstallContext) Reconcile() error { - return config.ReconcileGlobalSkills(g.cfg, g.reg) + return config.ReconcileGlobalSkills(g.cfg, g.store) } func (g *globalInstallContext) PostInstallSkill(string) error { return nil } func (g *globalInstallContext) Mode() string { return "global" } @@ -61,7 +65,7 @@ type projectInstallContext struct { func (p *projectInstallContext) SourcePath() string { return p.runtime.sourcePath } func (p *projectInstallContext) ConfigSkills() []install.SkillEntryDTO { - return toSkillEntryDTOs(p.runtime.registry.Skills) + return storeToSkillEntryDTOs(p.runtime.skillsStore) } func (p *projectInstallContext) Reconcile() error { return reconcileProjectRemoteSkills(p.runtime) diff --git a/cmd/skillshare/install_handlers.go b/cmd/skillshare/install_handlers.go index e7ee9e0c..f4df660f 100644 --- a/cmd/skillshare/install_handlers.go +++ b/cmd/skillshare/install_handlers.go @@ -16,6 +16,17 @@ import ( ) func handleTrackedRepoInstall(source *install.Source, cfg *config.Config, opts install.InstallOptions) (installLogSummary, error) { + trackedKind, err := install.InferTrackedKind(source, opts.Kind) + if err != nil { + return installLogSummary{}, err + } + opts.Kind = trackedKind + + trackSourceDir := cfg.Source + if trackedKind == "agent" { + trackSourceDir = cfg.EffectiveAgentsSource() + } + logSummary := installLogSummary{ Source: source.Raw, DryRun: opts.DryRun, @@ -50,7 +61,7 @@ func handleTrackedRepoInstall(source *install.Source, cfg *config.Config, opts i } } - result, err := install.InstallTrackedRepo(source, cfg.Source, opts) + result, err := 
install.InstallTrackedRepo(source, trackSourceDir, opts) if err != nil { if errors.Is(err, install.ErrSkipSameRepo) { treeSpinner.Warn(firstWarningLine(err.Error())) @@ -72,8 +83,13 @@ func handleTrackedRepoInstall(source *install.Source, cfg *config.Config, opts i fmt.Println() ui.Warning("[dry-run] Would install tracked repo") } else { - ui.StepContinue("Found", fmt.Sprintf("%d skill(s)", result.SkillCount)) - renderTrackedRepoMeta(result.RepoName, result.Skills, result.RepoPath) + if trackedKind == "agent" { + ui.StepContinue("Found", fmt.Sprintf("%d agent(s)", result.AgentCount)) + renderTrackedAgentRepoMeta(result.RepoName, result.Agents, result.RepoPath) + } else { + ui.StepContinue("Found", fmt.Sprintf("%d skill(s)", result.SkillCount)) + renderTrackedRepoMeta(result.RepoName, result.Skills, result.RepoPath) + } } // Display warnings and risk info @@ -93,8 +109,13 @@ func handleTrackedRepoInstall(source *install.Source, cfg *config.Config, opts i // Show next steps if !opts.DryRun { ui.SectionLabel("Next Steps") - ui.Info("Run 'skillshare sync' to distribute skills to all targets") - ui.Info("Run 'skillshare update %s' to update this repo later", result.RepoName) + if trackedKind == "agent" { + ui.Info("Run 'skillshare sync agents' to distribute agents to all targets") + ui.Info("Run 'skillshare update agents --all' to update tracked agent repos later") + } else { + ui.Info("Run 'skillshare sync' to distribute skills to all targets") + ui.Info("Run 'skillshare update %s' to update this repo later", result.RepoName) + } } return logSummary, nil @@ -218,16 +239,31 @@ func handleGitInstall(source *install.Source, cfg *config.Config, opts install.I logSummary.InstalledSkills = append(logSummary.InstalledSkills, skill.Name) logSummary.SkillCount = len(logSummary.InstalledSkills) } + installDiscoveredAgents(discovery, cfg, opts) return logSummary, nil } - // Step 3: Show found skills - if len(discovery.Skills) == 0 { - ui.StepEnd("Found", "No skills (no SKILL.md 
files)") + // Step 3: Show found resources + if len(discovery.Skills) == 0 && len(discovery.Agents) == 0 { + ui.StepEnd("Found", "No skills or agents found") return logSummary, nil } - ui.StepEnd("Found", fmt.Sprintf("%d skill(s)", len(discovery.Skills))) + // Pure agent repo — no skills, only agents + if len(discovery.Skills) == 0 && len(discovery.Agents) > 0 { + ui.StepEnd("Found", fmt.Sprintf("%d agent(s)", len(discovery.Agents))) + agentsRoot := cfg.EffectiveAgentsSource() + agentsDir := agentsDirWithInto(agentsRoot, opts) + agentOpts := opts + agentOpts.SourceDir = agentsRoot + return handleAgentInstall(discovery, agentsDir, agentOpts, logSummary) + } + + foundMsg := fmt.Sprintf("%d skill(s)", len(discovery.Skills)) + if len(discovery.Agents) > 0 { + foundMsg += fmt.Sprintf(", %d agent(s)", len(discovery.Agents)) + } + ui.StepEnd("Found", foundMsg) // Apply --exclude early so excluded skills never appear in prompts if len(opts.Exclude) > 0 { @@ -295,6 +331,7 @@ func handleGitInstall(source *install.Source, cfg *config.Config, opts install.I logSummary.InstalledSkills = append(logSummary.InstalledSkills, skill.Name) logSummary.SkillCount = len(logSummary.InstalledSkills) } + installDiscoveredAgents(discovery, cfg, opts) return logSummary, nil } @@ -319,6 +356,7 @@ func handleGitInstall(source *install.Source, cfg *config.Config, opts install.I logSummary.InstalledSkills = append(logSummary.InstalledSkills, batchSummary.InstalledSkills...) logSummary.FailedSkills = append(logSummary.FailedSkills, batchSummary.FailedSkills...) logSummary.SkillCount = len(logSummary.InstalledSkills) + installDiscoveredAgents(discovery, cfg, opts) return logSummary, nil } @@ -354,6 +392,7 @@ func handleGitInstall(source *install.Source, cfg *config.Config, opts install.I logSummary.InstalledSkills = append(logSummary.InstalledSkills, batchSummary.InstalledSkills...) logSummary.FailedSkills = append(logSummary.FailedSkills, batchSummary.FailedSkills...) 
logSummary.SkillCount = len(logSummary.InstalledSkills) + installDiscoveredAgents(discovery, cfg, opts) return logSummary, nil } @@ -401,7 +440,7 @@ func installSelectedSkills(selected []install.SkillInfo, discovery *install.Disc } } - // Reorder: install root skill first so children can nest under it + // Reorder: install root skill first so children nest under it on disk. orderedSkills := selected if rootIdx > 0 { orderedSkills = make([]install.SkillInfo, 0, len(selected)) @@ -410,9 +449,6 @@ func installSelectedSkills(selected []install.SkillInfo, discovery *install.Disc orderedSkills = append(orderedSkills, selected[rootIdx+1:]...) } - // Track if root was installed (children are already included in root) - rootInstalled := false - for i, skill := range orderedSkills { if installSpinner != nil { installSpinner.NextStep(fmt.Sprintf("Installing %s...", skill.Name)) @@ -425,6 +461,10 @@ func installSelectedSkills(selected []install.SkillInfo, discovery *install.Disc } // Determine destination path and effective --into for force hints. + // Orchestrator repos: when the root skill is also selected, children + // install as siblings under / so they remain independent + // skills discoverable by sync (issue #124). The root copy excludes + // child skill directories so they do not duplicate. var destPath string skillOpts := opts if skill.Path == "." { @@ -443,15 +483,6 @@ func installSelectedSkills(selected []install.SkillInfo, discovery *install.Disc destPath = destWithInto(cfg.Source, opts, skill.Name) } - // If root was installed, children are already included - skip reinstall - if rootInstalled && skill.Path != "." 
{ - results = append(results, skillInstallResult{skill: skill, success: true, message: fmt.Sprintf("included in %s", parentName)}) - if progressBar != nil { - progressBar.Increment() - } - continue - } - installResult, err := install.InstallFromDiscovery(discovery, skill, destPath, skillOpts) if err != nil { r := skillInstallResult{skill: skill, success: false, message: err.Error(), err: err} @@ -465,9 +496,6 @@ func installSelectedSkills(selected []install.SkillInfo, discovery *install.Disc continue } - if skill.Path == "." { - rootInstalled = true - } message := "installed" if len(installResult.Warnings) > 0 { message = fmt.Sprintf("installed (%d warning(s))", len(installResult.Warnings)) @@ -900,14 +928,14 @@ func installFromGlobalConfig(cfg *config.Config, opts install.InstallOptions) (i AuditVerbose: opts.AuditVerbose, } - reg, regErr := config.LoadRegistry(cfg.RegistryDir) - if regErr != nil { - return summary, fmt.Errorf("failed to load registry: %w", regErr) + store, storeErr := install.LoadMetadataWithMigration(cfg.Source, "") + if storeErr != nil { + return summary, fmt.Errorf("failed to load metadata: %w", storeErr) } - ctx := &globalInstallContext{cfg: cfg, reg: reg} + ctx := &globalInstallContext{cfg: cfg, store: store} if len(ctx.ConfigSkills()) == 0 { - ui.Info("No remote skills defined in registry") + ui.Info("No remote skills defined in metadata") ui.Info("Install a skill first: skillshare install ") return summary, nil } @@ -1003,6 +1031,14 @@ func renderTrackedRepoMeta(repoName string, skills []string, repoPath string) { ui.StepEnd("Location", repoPath) } +func renderTrackedAgentRepoMeta(repoName string, agents []string, repoPath string) { + ui.StepContinue("Tracked", repoName) + if len(agents) > 0 && len(agents) <= 10 { + ui.StepContinue("Agents", strings.Join(agents, ", ")) + } + ui.StepEnd("Location", repoPath) +} + // truncateDesc truncates a description string to max runes, appending " ..." if truncated. 
func truncateDesc(s string, max int) string { runes := []rune(s) @@ -1011,3 +1047,250 @@ func truncateDesc(s string, max int) string { } return string(runes[:max]) + " ..." } + +// handleAgentInstall installs agents from a pure-agent repo. +// Matches the skill install flow: single→direct, multi→TUI/flags, batch progress. +func handleAgentInstall(discovery *install.DiscoveryResult, agentsDir string, opts install.InstallOptions, logSummary installLogSummary) (installLogSummary, error) { + agents := discovery.Agents + + // Single agent: install directly (matches single-skill pattern) + if len(agents) == 1 && !opts.HasAgentFilter() && !opts.ShouldInstallAll() { + agent := agents[0] + if opts.DryRun { + ui.Info(" %s (%s)", agent.Name, agent.FileName) + ui.Warning("[dry-run] Would install agent: %s", agent.Name) + return logSummary, nil + } + spinner := ui.StartSpinner(fmt.Sprintf("Installing agent %s...", agent.Name)) + result, err := install.InstallAgentFromDiscovery(discovery, agent, agentsDir, opts) + spinner.Stop() + if err != nil { + ui.ErrorMsg("Failed to install agent %s: %v", agent.Name, err) + return logSummary, err + } + if result.Action == "skipped" { + ui.StepSkip(agent.Name, strings.Join(result.Warnings, "; ")) + } else { + ui.SuccessMsg("Installed agent: %s", agent.Name) + logSummary.SkillCount = 1 + logSummary.InstalledSkills = append(logSummary.InstalledSkills, agent.Name) + ui.SectionLabel("Next Steps") + ui.Info("Run 'skillshare sync agents' to distribute to all targets") + } + return logSummary, nil + } + + // Dry-run: show list and return + if opts.DryRun { + selected := agents + if opts.HasAgentFilter() || opts.ShouldInstallAll() { + var err error + selected, err = selectAgents(agents, opts) + if err != nil { + return logSummary, err + } + } + fmt.Println() + for _, a := range selected { + ui.Info(" %s (%s)", a.Name, a.FileName) + } + ui.Warning("[dry-run] Would install %d agent(s)", len(selected)) + return logSummary, nil + } + + // 
Non-interactive: --all/--yes or -a filter + if opts.HasAgentFilter() || opts.ShouldInstallAll() { + selected, err := selectAgents(agents, opts) + if err != nil { + return logSummary, err + } + fmt.Println() + batchSummary := installSelectedAgents(selected, discovery, agentsDir, opts) + logSummary.InstalledSkills = append(logSummary.InstalledSkills, batchSummary.InstalledSkills...) + logSummary.FailedSkills = append(logSummary.FailedSkills, batchSummary.FailedSkills...) + logSummary.SkillCount = len(logSummary.InstalledSkills) + return logSummary, nil + } + + // Non-TTY fallback + if !ui.IsTTY() { + ui.Info("Found %d agents. Non-interactive mode requires --all, --yes, or -a ", len(agents)) + return logSummary, fmt.Errorf("interactive selection not available in non-TTY mode") + } + + // Interactive TUI selection + fmt.Println() + selected, err := selectAgents(agents, opts) + if err != nil { + return logSummary, err + } + if len(selected) == 0 { + ui.Info("No agents selected") + return logSummary, nil + } + + fmt.Println() + batchSummary := installSelectedAgents(selected, discovery, agentsDir, opts) + logSummary.InstalledSkills = append(logSummary.InstalledSkills, batchSummary.InstalledSkills...) + logSummary.FailedSkills = append(logSummary.FailedSkills, batchSummary.FailedSkills...) + logSummary.SkillCount = len(logSummary.InstalledSkills) + return logSummary, nil +} + +// installDiscoveredAgents installs agents from a mixed repo after skills have been installed. 
+func installDiscoveredAgents(discovery *install.DiscoveryResult, cfg *config.Config, opts install.InstallOptions) { + if len(discovery.Agents) == 0 { + return + } + if opts.Kind == "skill" { + return + } + + agentsRoot := cfg.EffectiveAgentsSource() + agentsDir := agentsDirWithInto(agentsRoot, opts) + agentOpts := opts + agentOpts.SourceDir = agentsRoot + fmt.Println() + ui.Header("Installing agents") + + for _, agent := range discovery.Agents { + spinner := ui.StartSpinner(fmt.Sprintf("Installing agent %s...", agent.Name)) + result, err := install.InstallAgentFromDiscovery(discovery, agent, agentsDir, agentOpts) + spinner.Stop() + if err != nil { + ui.ErrorMsg("Failed to install agent %s: %v", agent.Name, err) + continue + } + if result.Action == "skipped" { + ui.StepSkip(agent.Name, strings.Join(result.Warnings, "; ")) + } else if agentOpts.DryRun { + ui.Warning("[dry-run] Would install agent: %s", agent.Name) + } else { + ui.SuccessMsg("Installed agent: %s", agent.Name) + } + } +} + +// agentInstallResult tracks the outcome of a single agent install. +type agentInstallResult struct { + agent install.AgentInfo + success bool + skipped bool + message string +} + +// installSelectedAgents installs a batch of agents with progress display. 
+func installSelectedAgents(selected []install.AgentInfo, discovery *install.DiscoveryResult, agentsDir string, opts install.InstallOptions) installBatchSummary { + results := make([]agentInstallResult, 0, len(selected)) + + var installSpinner *ui.Spinner + var progressBar *ui.ProgressBar + if len(selected) > largeBatchProgressThreshold { + progressBar = ui.StartProgress("Installing agents", len(selected)) + } else { + installSpinner = ui.StartSpinnerWithSteps("Installing...", len(selected)) + } + + for i, agent := range selected { + if installSpinner != nil { + installSpinner.NextStep(fmt.Sprintf("Installing %s...", agent.Name)) + if i == 0 { + installSpinner.Update(fmt.Sprintf("Installing %s...", agent.Name)) + } + } + if progressBar != nil { + progressBar.UpdateTitle(fmt.Sprintf("Installing %s", agent.Name)) + } + + result, err := install.InstallAgentFromDiscovery(discovery, agent, agentsDir, opts) + if err != nil { + results = append(results, agentInstallResult{agent: agent, message: err.Error()}) + } else if result.Action == "skipped" { + results = append(results, agentInstallResult{agent: agent, skipped: true, message: strings.Join(result.Warnings, "; ")}) + } else { + results = append(results, agentInstallResult{agent: agent, success: true}) + } + + if progressBar != nil { + progressBar.Increment() + } + } + + if progressBar != nil { + progressBar.Stop() + } + + displayAgentInstallResults(results, installSpinner) + + summary := installBatchSummary{} + for _, r := range results { + if r.success { + summary.InstalledSkills = append(summary.InstalledSkills, r.agent.Name) + } else if !r.skipped { + summary.FailedSkills = append(summary.FailedSkills, r.agent.Name) + } + } + return summary +} + +// displayAgentInstallResults renders the install outcome for a batch of agents. 
+func displayAgentInstallResults(results []agentInstallResult, spinner *ui.Spinner) { + var installed, failed, skippedCount int + for _, r := range results { + switch { + case r.success: + installed++ + case r.skipped: + skippedCount++ + default: + failed++ + } + } + + summaryMsg := buildInstallSummary(installed, failed, skippedCount) + if spinner != nil { + switch { + case failed > 0 && installed == 0: + spinner.Fail(summaryMsg) + case failed > 0: + spinner.Warn(summaryMsg) + default: + spinner.Success(summaryMsg) + } + } else { + fmt.Println() + if failed > 0 && installed == 0 { + ui.ErrorMsg("%s", summaryMsg) + } else { + ui.SuccessMsg("%s", summaryMsg) + } + } + + if failed > 0 { + ui.SectionLabel("Failed") + for _, r := range results { + if !r.success && !r.skipped { + ui.StepFail(r.agent.Name, r.message) + } + } + } + if skippedCount > 0 { + ui.SectionLabel("Skipped") + for _, r := range results { + if r.skipped { + ui.StepSkip(r.agent.Name, r.message) + } + } + } + if installed > 0 { + ui.SectionLabel("Installed") + for _, r := range results { + if r.success { + ui.StepDone(r.agent.Name, "") + } + } + fmt.Println() + ui.SectionLabel("Next Steps") + ui.Info("Run 'skillshare sync agents' to distribute to all targets") + } +} diff --git a/cmd/skillshare/install_project.go b/cmd/skillshare/install_project.go index 8c14f8e9..12745327 100644 --- a/cmd/skillshare/install_project.go +++ b/cmd/skillshare/install_project.go @@ -10,17 +10,21 @@ import ( ) func cmdInstallProject(args []string, root string) (installLogSummary, error) { - summary := installLogSummary{ - Mode: "project", - } - parsed, showHelp, err := parseInstallArgs(args) if showHelp { printInstallHelp() - return summary, nil + return installLogSummary{Mode: "project"}, nil } if err != nil { - return summary, err + return installLogSummary{Mode: "project"}, err + } + applyInstallJSONDefaults(parsed) + return cmdInstallProjectParsed(parsed, root) +} + +func cmdInstallProjectParsed(parsed *installArgs, 
root string) (installLogSummary, error) { + summary := installLogSummary{ + Mode: "project", } summary.DryRun = parsed.opts.DryRun summary.Tracked = parsed.opts.Track @@ -46,17 +50,17 @@ func cmdInstallProject(args []string, root string) (installLogSummary, error) { if parsed.sourceArg == "" { hasSourceFlags := parsed.opts.Name != "" || parsed.opts.Into != "" || - parsed.opts.Track || len(parsed.opts.Skills) > 0 || - len(parsed.opts.Exclude) > 0 || parsed.opts.All || parsed.opts.Yes || parsed.opts.Update || - parsed.opts.Branch != "" - if hasSourceFlags { - return summary, fmt.Errorf("flags --name, --into, --track, --skill, --exclude, --all, --yes, and --update require a source argument") + parsed.opts.Track || parsed.opts.HasSkillFilter() || parsed.opts.HasAgentFilter() || + len(parsed.opts.Exclude) > 0 || parsed.opts.Update || parsed.opts.Branch != "" || + parsed.opts.Kind != "" + if hasSourceFlags || ((parsed.opts.All || parsed.opts.Yes) && !parsed.jsonOutput) { + return summary, fmt.Errorf("flags --name, --into, --track, --skill, --agent, --kind, --exclude, --all, --yes, --branch, and --update require a source argument") } summary.Source = "project-config" return installFromProjectConfig(runtime, parsed.opts) } - cfg := &config.Config{Source: runtime.sourcePath, GitLabHosts: runtime.config.GitLabHosts} + cfg := &config.Config{Source: runtime.sourcePath, AgentsSource: runtime.agentsSourcePath, GitLabHosts: runtime.config.GitLabHosts} source, resolvedFromMeta, err := resolveInstallSource(parsed.sourceArg, parsed.opts, cfg) if err == nil && parsed.opts.Branch != "" { source.Branch = parsed.opts.Branch @@ -72,6 +76,10 @@ func cmdInstallProject(args []string, root string) (installLogSummary, error) { return summary, err } if !parsed.opts.DryRun { + freshStore, loadErr := install.LoadMetadata(runtime.sourcePath) + if loadErr == nil { + runtime.skillsStore = freshStore + } return summary, reconcileProjectRemoteSkills(runtime) } return summary, nil @@ -87,6 +95,13 @@ 
func cmdInstallProject(args []string, root string) (installLogSummary, error) { return summary, nil } + // Reload metadata store: install may have written new entries via WriteMeta + // that the pre-install runtime doesn't know about. + freshStore, loadErr := install.LoadMetadata(runtime.sourcePath) + if loadErr == nil { + runtime.skillsStore = freshStore + } + return summary, reconcileProjectRemoteSkills(runtime) } diff --git a/cmd/skillshare/install_prompt.go b/cmd/skillshare/install_prompt.go index 5632d2de..69a91674 100644 --- a/cmd/skillshare/install_prompt.go +++ b/cmd/skillshare/install_prompt.go @@ -101,16 +101,20 @@ func promptOrchestratorSelection(rootSkill install.SkillInfo, childSkills []inst return nil, nil } + // Build the full skill list (root first, then children) — used for both + // "entire pack" and individual selection so the root skill remains a + // selectable option in the multi-select UI (issue #124). + allSkills := make([]install.SkillInfo, 0, len(childSkills)+1) + allSkills = append(allSkills, rootSkill) + allSkills = append(allSkills, childSkills...) + // If "entire pack" selected, return all skills if indices[0] == 0 { - allSkills := make([]install.SkillInfo, 0, len(childSkills)+1) - allSkills = append(allSkills, rootSkill) - allSkills = append(allSkills, childSkills...) return allSkills, nil } - // Stage 2: Select individual skills (children only, no root) - return promptMultiSelect(childSkills) + // Stage 2: Select individual skills (root + children) + return promptMultiSelect(allSkills) } // promptLargeRepoSelection presents a TUI directory picker for large repos. @@ -285,3 +289,64 @@ func printSkillListCompact(skills []install.SkillInfo) { } ui.Info("... and %d more skill(s)", len(skills)-showCount) } + +// selectAgents routes agent selection through filter, all, or interactive TUI. 
+func selectAgents(agents []install.AgentInfo, opts install.InstallOptions) ([]install.AgentInfo, error) { + switch { + case opts.HasAgentFilter(): + matched, notFound := filterAgentsByName(agents, opts.AgentNames) + if len(notFound) > 0 { + return nil, fmt.Errorf("agents not found: %s", strings.Join(notFound, ", ")) + } + return matched, nil + case opts.ShouldInstallAll(): + return agents, nil + default: + return promptAgentInstallSelection(agents) + } +} + +// filterAgentsByName returns agents matching any of the given names (case-insensitive). +func filterAgentsByName(agents []install.AgentInfo, names []string) (matched []install.AgentInfo, notFound []string) { + nameSet := make(map[string]bool, len(names)) + for _, n := range names { + nameSet[strings.ToLower(n)] = true + } + found := make(map[string]bool) + for _, a := range agents { + if nameSet[strings.ToLower(a.Name)] { + matched = append(matched, a) + found[strings.ToLower(a.Name)] = true + } + } + for _, n := range names { + if !found[strings.ToLower(n)] { + notFound = append(notFound, n) + } + } + return +} + +// promptAgentInstallSelection shows a multi-select TUI for agent installation. 
+func promptAgentInstallSelection(agents []install.AgentInfo) ([]install.AgentInfo, error) { + items := make([]checklistItemData, len(agents)) + for i, a := range agents { + items[i] = checklistItemData{label: a.Name, desc: a.FileName} + } + indices, err := runChecklistTUI(checklistConfig{ + title: "Select agents to install", + items: items, + itemName: "agent", + }) + if err != nil { + return nil, err + } + if indices == nil { + return nil, nil // cancelled + } + selected := make([]install.AgentInfo, len(indices)) + for i, idx := range indices { + selected[i] = agents[idx] + } + return selected, nil +} diff --git a/cmd/skillshare/install_prompt_tui.go b/cmd/skillshare/install_prompt_tui.go index 225190db..37c0ccae 100644 --- a/cmd/skillshare/install_prompt_tui.go +++ b/cmd/skillshare/install_prompt_tui.go @@ -6,11 +6,12 @@ import ( "sort" "strings" + "skillshare/internal/install" + "skillshare/internal/theme" + "github.com/charmbracelet/bubbles/list" "github.com/charmbracelet/bubbles/textinput" tea "github.com/charmbracelet/bubbletea" - - "skillshare/internal/install" ) // dirPickerItemKind distinguishes directory entries from the "Install all" action. 
@@ -86,7 +87,7 @@ func newDirPickerModel(skills []install.SkillInfo) dirPickerModel { l := list.New(items, newPrefixDelegate(true), 0, 0) l.Title = "Select directory" - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) l.SetFilteringEnabled(false) l.SetShowHelp(false) @@ -95,8 +96,8 @@ func newDirPickerModel(skills []install.SkillInfo) dirPickerModel { // Filter text input fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() m.list = l m.allItems = items @@ -306,7 +307,7 @@ func (m dirPickerModel) View() string { } else { help = "↑↓ navigate enter select / filter q quit" } - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() @@ -410,7 +411,7 @@ func newSkillSelectModel(skills []install.SkillInfo) skillSelectModel { l := list.New(items, newPrefixDelegate(false), 0, 0) l.Title = skillSelectTitle(0, len(sorted)) - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) l.SetFilteringEnabled(false) l.SetShowHelp(false) @@ -419,8 +420,8 @@ func newSkillSelectModel(skills []install.SkillInfo) skillSelectModel { // Filter text input fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() return skillSelectModel{ list: l, @@ -608,7 +609,7 @@ func (m skillSelectModel) View() string { )) help := "↑↓ navigate space toggle a all enter confirm / filter esc cancel" - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() diff --git a/cmd/skillshare/json_output.go b/cmd/skillshare/json_output.go index 190aab0d..a249b329 100644 --- a/cmd/skillshare/json_output.go +++ b/cmd/skillshare/json_output.go @@ -54,6 +54,31 @@ func 
writeJSONError(err error) error { return &jsonSilentError{cause: err} } +// jsonUISuppressor encapsulates the common "suppress stdout/UI once, +// restore-once-and-only-once before writing the JSON payload" pattern that +// every --json-capable command needs. Call Flush before emitting JSON so +// the payload goes to real stdout; deferring Flush guarantees restoration +// even if the command returns early. +type jsonUISuppressor struct{ restore func() } + +// newJSONUISuppressor suppresses UI only when jsonMode is true. +// Returns a zero-value suppressor otherwise, so callers can unconditionally +// defer Flush without extra branching. +func newJSONUISuppressor(jsonMode bool) *jsonUISuppressor { + if !jsonMode { + return &jsonUISuppressor{} + } + return &jsonUISuppressor{restore: suppressUIToDevnull()} +} + +// Flush restores stdout/UI. Safe to call multiple times. +func (s *jsonUISuppressor) Flush() { + if s.restore != nil { + s.restore() + s.restore = nil + } +} + // suppressUIToDevnull temporarily redirects os.Stdout and the progress // writer to /dev/null so that handler functions using fmt.Printf / ui.* // produce zero visible output. This keeps --json output clean even when diff --git a/cmd/skillshare/kind_filter.go b/cmd/skillshare/kind_filter.go new file mode 100644 index 00000000..e7a8c771 --- /dev/null +++ b/cmd/skillshare/kind_filter.go @@ -0,0 +1,128 @@ +package main + +import "fmt" + +// resourceKindFilter represents the kind filtering for CLI commands. +type resourceKindFilter int + +const ( + kindAll resourceKindFilter = iota // no filter — all kinds + kindSkills // skills only + kindAgents // agents only +) + +// parseKindArg extracts a kind filter from the first positional argument. +// Returns the filter and remaining args. +// Recognized values: "skills", "skill", "agents", "agent". +// If the first arg is not a kind keyword, returns kindSkills with args unchanged +// (default is skills-only; use --all flag for both). 
+func parseKindArg(args []string) (resourceKindFilter, []string) {
+	if len(args) == 0 {
+		return kindSkills, args
+	}
+
+	switch args[0] {
+	case "skills", "skill":
+		return kindSkills, args[1:]
+	case "agents", "agent":
+		return kindAgents, args[1:]
+	default:
+		return kindSkills, args
+	}
+}
+
+// extractAllFlag scans args for --all, removes it, and returns true if found.
+// Commands use this to let --all mean kindAll (skills + agents).
+func extractAllFlag(args []string) (bool, []string) {
+	found := false
+	rest := make([]string, 0, len(args))
+	for _, a := range args {
+		if a == "--all" {
+			found = true
+		} else {
+			rest = append(rest, a)
+		}
+	}
+	return found, rest
+}
+
+// parseKindArgWithAll combines parseKindArg and extractAllFlag.
+// It parses the positional kind keyword (agents/skills) and the --all flag.
+// Used by commands where --all means kindAll (skills + agents).
+func parseKindArgWithAll(args []string) (resourceKindFilter, []string) {
+	kind, rest := parseKindArg(args)
+	if allKinds, remaining := extractAllFlag(rest); allKinds {
+		kind = kindAll
+		rest = remaining
+	}
+	return kind, rest
+}
+
+// parseKindFlag extracts the --kind flag from args. When the flag is absent,
+// the filter defaults to kindAll. Returns the filter and the remaining args
+// with --kind and its value removed.
+func parseKindFlag(args []string) (resourceKindFilter, []string, error) {
+	kind := kindAll
+	rest := make([]string, 0, len(args))
+
+	for i := 0; i < len(args); i++ {
+		if args[i] == "--kind" {
+			if i+1 >= len(args) {
+				return kindAll, nil, fmt.Errorf("--kind requires a value (skill or agent)")
+			}
+			i++
+			switch args[i] {
+			case "skill", "skills":
+				kind = kindSkills
+			case "agent", "agents":
+				kind = kindAgents
+			default:
+				return kindAll, nil, fmt.Errorf("--kind must be 'skill' or 'agent', got %q", args[i])
+			}
+		} else {
+			rest = append(rest, args[i])
+		}
+	}
+
+	return kind, rest, nil
+}
+
+func (k resourceKindFilter) String() string {
+	switch k {
+	case kindSkills:
+		return "skills"
+	case kindAgents:
+		return "agents"
+	default:
+		return "all"
+	}
+}
+
+// Noun returns the resource noun for display, pluralized by count:
+// Noun(1) → "skill"/"agent", Noun(2+) → "skills"/"agents".
+func (k resourceKindFilter) Noun(count int) string {
+	switch k {
+	case kindAgents:
+		if count == 1 {
+			return "agent"
+		}
+		return "agents"
+	default:
+		if count == 1 {
+			return "skill"
+		}
+		return "skills"
+	}
+}
+
+// SingularNoun returns the singular resource noun (no count needed).
+func (k resourceKindFilter) SingularNoun() string { + return k.Noun(1) +} + +func (k resourceKindFilter) IncludesSkills() bool { + return k == kindAll || k == kindSkills +} + +func (k resourceKindFilter) IncludesAgents() bool { + return k == kindAll || k == kindAgents +} diff --git a/cmd/skillshare/kind_filter_test.go b/cmd/skillshare/kind_filter_test.go new file mode 100644 index 00000000..f4d66929 --- /dev/null +++ b/cmd/skillshare/kind_filter_test.go @@ -0,0 +1,128 @@ +package main + +import "testing" + +func TestParseKindArg(t *testing.T) { + tests := []struct { + args []string + wantKind resourceKindFilter + wantRest []string + }{ + {nil, kindSkills, nil}, + {[]string{}, kindSkills, []string{}}, + {[]string{"skills"}, kindSkills, []string{}}, + {[]string{"skill"}, kindSkills, []string{}}, + {[]string{"agents"}, kindAgents, []string{}}, + {[]string{"agent"}, kindAgents, []string{}}, + {[]string{"all"}, kindSkills, []string{"all"}}, // "all" no longer a keyword + {[]string{"all", "--json"}, kindSkills, []string{"all", "--json"}}, // falls through to default + {[]string{"agents", "tutor"}, kindAgents, []string{"tutor"}}, + {[]string{"--json"}, kindSkills, []string{"--json"}}, + {[]string{"my-skill"}, kindSkills, []string{"my-skill"}}, + } + + for _, tt := range tests { + kind, rest := parseKindArg(tt.args) + if kind != tt.wantKind { + t.Errorf("parseKindArg(%v) kind = %v, want %v", tt.args, kind, tt.wantKind) + } + if len(rest) != len(tt.wantRest) { + t.Errorf("parseKindArg(%v) rest = %v, want %v", tt.args, rest, tt.wantRest) + } + } +} + +func TestParseKindFlag(t *testing.T) { + tests := []struct { + args []string + wantKind resourceKindFilter + wantRest []string + wantErr bool + }{ + {[]string{}, kindAll, []string{}, false}, + {[]string{"--kind", "agent"}, kindAgents, []string{}, false}, + {[]string{"--kind", "skill"}, kindSkills, []string{}, false}, + {[]string{"--json", "--kind", "agent", "foo"}, kindAgents, []string{"--json", "foo"}, false}, + 
{[]string{"--kind"}, kindAll, nil, true}, + {[]string{"--kind", "invalid"}, kindAll, nil, true}, + } + + for _, tt := range tests { + kind, rest, err := parseKindFlag(tt.args) + if (err != nil) != tt.wantErr { + t.Errorf("parseKindFlag(%v) err = %v, wantErr %v", tt.args, err, tt.wantErr) + continue + } + if err != nil { + continue + } + if kind != tt.wantKind { + t.Errorf("parseKindFlag(%v) kind = %v, want %v", tt.args, kind, tt.wantKind) + } + if len(rest) != len(tt.wantRest) { + t.Errorf("parseKindFlag(%v) rest = %v, want %v", tt.args, rest, tt.wantRest) + } + } +} + +func TestExtractAllFlag(t *testing.T) { + tests := []struct { + args []string + wantAll bool + wantRest []string + }{ + {nil, false, []string{}}, + {[]string{"--all"}, true, []string{}}, + {[]string{"--json", "--all", "foo"}, true, []string{"--json", "foo"}}, + {[]string{"--json"}, false, []string{"--json"}}, + {[]string{"agents", "--all"}, true, []string{"agents"}}, + {[]string{"--all", "--all"}, true, []string{}}, // duplicate --all + } + + for _, tt := range tests { + gotAll, gotRest := extractAllFlag(tt.args) + if gotAll != tt.wantAll { + t.Errorf("extractAllFlag(%v) all = %v, want %v", tt.args, gotAll, tt.wantAll) + } + if len(gotRest) != len(tt.wantRest) { + t.Errorf("extractAllFlag(%v) rest = %v, want %v", tt.args, gotRest, tt.wantRest) + } + } +} + +func TestParseKindArgWithAll(t *testing.T) { + tests := []struct { + args []string + wantKind resourceKindFilter + wantRest []string + }{ + {nil, kindSkills, nil}, + {[]string{"--all"}, kindAll, []string{}}, + {[]string{"agents"}, kindAgents, []string{}}, + {[]string{"agents", "--all"}, kindAll, []string{}}, + {[]string{"--json", "--all"}, kindAll, []string{"--json"}}, + {[]string{"--json"}, kindSkills, []string{"--json"}}, + } + + for _, tt := range tests { + kind, rest := parseKindArgWithAll(tt.args) + if kind != tt.wantKind { + t.Errorf("parseKindArgWithAll(%v) kind = %v, want %v", tt.args, kind, tt.wantKind) + } + if len(rest) != 
len(tt.wantRest) { + t.Errorf("parseKindArgWithAll(%v) rest = %v, want %v", tt.args, rest, tt.wantRest) + } + } +} + +func TestResourceKindFilter_Includes(t *testing.T) { + if !kindAll.IncludesSkills() || !kindAll.IncludesAgents() { + t.Error("kindAll should include both") + } + if !kindSkills.IncludesSkills() || kindSkills.IncludesAgents() { + t.Error("kindSkills should include only skills") + } + if kindAgents.IncludesSkills() || !kindAgents.IncludesAgents() { + t.Error("kindAgents should include only agents") + } +} diff --git a/cmd/skillshare/list.go b/cmd/skillshare/list.go index f3be3bc3..e621580c 100644 --- a/cmd/skillshare/list.go +++ b/cmd/skillshare/list.go @@ -12,7 +12,9 @@ import ( "skillshare/internal/config" "skillshare/internal/git" "skillshare/internal/install" + "skillshare/internal/resource" "skillshare/internal/sync" + "skillshare/internal/theme" "skillshare/internal/ui" "skillshare/internal/utils" ) @@ -191,22 +193,43 @@ func sortSkillEntries(skills []skillEntry, sortBy string) { } return a < b // ascending }) - default: // "name" or empty + default: // "name" or empty — group by top-level bucket, then by RelPath sort.SliceStable(skills, func(i, j int) bool { - return skills[i].Name < skills[j].Name + ti := skillTopGroup(skills[i]) + tj := skillTopGroup(skills[j]) + if ti != tj { + // Empty topGroup (standalone) sorts last so flat locals + // don't split named groups apart in the list TUI. + if ti == "" { + return false + } + if tj == "" { + return true + } + return ti < tj + } + return skills[i].RelPath < skills[j].RelPath }) } } // buildSkillEntries builds skill entries from discovered skills. -// ReadMeta calls are parallelized with a bounded worker pool. +// Metadata is read from the centralized .metadata.json store. 
func buildSkillEntries(discovered []sync.DiscoveredSkill) []skillEntry { skills := make([]skillEntry, len(discovered)) - // Pre-fill non-I/O fields + store := install.NewMetadataStore() + if len(discovered) > 0 { + sourceDir := strings.TrimSuffix(discovered[0].SourcePath, discovered[0].RelPath) + sourceDir = strings.TrimRight(sourceDir, `/\`) + store = install.LoadMetadataOrNew(sourceDir) + } + + // Pre-fill non-I/O fields + metadata from store for i, d := range discovered { skills[i] = skillEntry{ Name: d.FlatName, + Kind: "skill", IsNested: d.IsInRepo || utils.HasNestedSeparator(d.FlatName), RelPath: d.RelPath, Disabled: d.Disabled, @@ -217,28 +240,17 @@ func buildSkillEntries(discovered []sync.DiscoveredSkill) []skillEntry { skills[i].RepoName = parts[0] } } - } - - // Parallel ReadMeta with bounded concurrency - const metaWorkers = 64 - sem := make(chan struct{}, metaWorkers) - var wg gosync.WaitGroup - for i, d := range discovered { - wg.Add(1) - sem <- struct{}{} - go func(idx int, sourcePath string) { - defer wg.Done() - defer func() { <-sem }() - if meta, err := install.ReadMeta(sourcePath); err == nil && meta != nil { - skills[idx].Source = meta.Source - skills[idx].Type = meta.Type - skills[idx].InstalledAt = meta.InstalledAt.Format("2006-01-02") - skills[idx].Branch = meta.Branch + // Enrich from centralized metadata store + if entry := store.GetByPath(d.RelPath); entry != nil { + skills[i].Source = entry.Source + skills[i].Type = entry.Type + if !entry.InstalledAt.IsZero() { + skills[i].InstalledAt = entry.InstalledAt.Format("2006-01-02") } - }(i, d.SourcePath) + skills[i].Branch = entry.Branch + } } - wg.Wait() // Fallback: for tracked-repo skills with no branch in metadata, read from git. // Cache per-repo to avoid repeated subprocess calls for skills in the same repo. 
@@ -261,6 +273,54 @@ func buildSkillEntries(discovered []sync.DiscoveredSkill) []skillEntry { return skills } +// discoverAndBuildAgentEntries discovers agents from the given source directory +// and builds skillEntry items with Kind="agent", enriched from the centralized +// metadata store. +func discoverAndBuildAgentEntries(agentsSource string) []skillEntry { + if agentsSource == "" { + return nil + } + discovered, err := resource.AgentKind{}.Discover(agentsSource) + if err != nil { + return nil + } + + store := install.LoadMetadataOrNew(agentsSource) + + entries := make([]skillEntry, len(discovered)) + for i, d := range discovered { + entries[i] = skillEntry{ + Name: d.Name, + Kind: "agent", + RelPath: d.RelPath, + IsNested: d.IsNested, + Disabled: d.Disabled, + } + // For tracked agents, set RepoName to the tracked repo root + // (e.g. "_vijaythecoder-awesome-claude-agents") so all agents under + // the same repo share one visual group — regardless of how deeply + // they're nested (agents/core, agents/specialized/python, etc.). + if d.IsInRepo && d.RepoRelPath != "" { + entries[i].RepoName = d.RepoRelPath + } + key := strings.TrimSuffix(d.RelPath, ".md") + if entry := store.GetByPath(key); entry != nil { + entries[i].Source = entry.Source + entries[i].Type = entry.Type + if !entry.InstalledAt.IsZero() { + entries[i].InstalledAt = entry.InstalledAt.Format("2006-01-02") + } + } else if d.RepoRelPath != "" { + repoPath := filepath.Join(agentsSource, filepath.FromSlash(d.RepoRelPath)) + if repoURL, err := git.GetRemoteURL(repoPath); err == nil { + entries[i].Source = repoURL + entries[i].Type = "git" + } + } + } + return entries +} + // extractGroupDir returns the parent directory from a RelPath. 
// "frontend/react-helper" → "frontend", "my-skill" → "", "_team/frontend/ui" → "_team/frontend" func extractGroupDir(relPath string) string { @@ -323,10 +383,11 @@ func hasGroups(skills []skillEntry) bool { // displaySkillsVerbose displays skills in verbose mode, grouped by directory func displaySkillsVerbose(skills []skillEntry) { + a := theme.ANSI() if !hasGroups(skills) { // Flat display — no grouping needed for _, s := range skills { - fmt.Printf(" %s%s%s\n", ui.Cyan, s.Name, ui.Reset) + fmt.Printf(" %s%s%s\n", a.Accent, s.Name, a.Reset) printVerboseDetails(s, " ") } return @@ -338,7 +399,7 @@ func displaySkillsVerbose(skills []skillEntry) { if i > 0 { fmt.Println() } - fmt.Printf(" %s%s/%s\n", ui.Dim, dir, ui.Reset) + fmt.Printf(" %s%s/%s\n", a.Dim, dir, a.Reset) } else if i > 0 { fmt.Println() } @@ -351,31 +412,33 @@ func displaySkillsVerbose(skills []skillEntry) { indent = " " detailIndent = " " } - fmt.Printf("%s%s%s%s\n", indent, ui.Cyan, name, ui.Reset) + fmt.Printf("%s%s%s%s\n", indent, a.Accent, name, a.Reset) printVerboseDetails(s, detailIndent) } } } func printVerboseDetails(s skillEntry, indent string) { + a := theme.ANSI() if s.Disabled { - fmt.Printf("%s%sStatus:%s %sdisabled%s\n", indent, ui.Dim, ui.Reset, ui.Dim, ui.Reset) + fmt.Printf("%s%sStatus:%s %sdisabled%s\n", indent, a.Dim, a.Reset, a.Dim, a.Reset) } if s.RepoName != "" { - fmt.Printf("%s%sTracked repo:%s %s\n", indent, ui.Dim, ui.Reset, s.RepoName) + fmt.Printf("%s%sTracked repo:%s %s\n", indent, a.Dim, a.Reset, s.RepoName) } if s.Source != "" { - fmt.Printf("%s%sSource:%s %s\n", indent, ui.Dim, ui.Reset, s.Source) - fmt.Printf("%s%sType:%s %s\n", indent, ui.Dim, ui.Reset, s.Type) - fmt.Printf("%s%sInstalled:%s %s\n", indent, ui.Dim, ui.Reset, s.InstalledAt) + fmt.Printf("%s%sSource:%s %s\n", indent, a.Dim, a.Reset, s.Source) + fmt.Printf("%s%sType:%s %s\n", indent, a.Dim, a.Reset, s.Type) + fmt.Printf("%s%sInstalled:%s %s\n", indent, a.Dim, a.Reset, s.InstalledAt) } else { - 
fmt.Printf("%s%sSource:%s (local - no metadata)\n", indent, ui.Dim, ui.Reset) + fmt.Printf("%s%sSource:%s (local - no metadata)\n", indent, a.Dim, a.Reset) } fmt.Println() } // displaySkillsCompact displays skills in compact mode, grouped by directory func displaySkillsCompact(skills []skillEntry) { + a := theme.ANSI() if !hasGroups(skills) { // Flat display — identical to previous behavior maxNameLen := 0 @@ -386,7 +449,7 @@ func displaySkillsCompact(skills []skillEntry) { } for _, s := range skills { suffix := getSkillSuffix(s) - format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", ui.Cyan, ui.Reset, maxNameLen, ui.Dim, ui.Reset) + format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", a.Accent, a.Reset, maxNameLen, a.Dim, a.Reset) fmt.Printf(format, s.Name, suffix) } return @@ -398,7 +461,7 @@ func displaySkillsCompact(skills []skillEntry) { if i > 0 { fmt.Println() } - fmt.Printf(" %s%s/%s\n", ui.Dim, dir, ui.Reset) + fmt.Printf(" %s%s/%s\n", a.Dim, dir, a.Reset) } else if i > 0 { fmt.Println() } @@ -416,10 +479,10 @@ func displaySkillsCompact(skills []skillEntry) { name := displayName(s, dir) suffix := getSkillSuffix(s) if dir != "" { - format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", ui.Cyan, ui.Reset, maxNameLen, ui.Dim, ui.Reset) + format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", a.Accent, a.Reset, maxNameLen, a.Dim, a.Reset) fmt.Printf(format, name, suffix) } else { - format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", ui.Cyan, ui.Reset, maxNameLen, ui.Dim, ui.Reset) + format := fmt.Sprintf(" %s→%s %%-%ds %s%%s%s\n", a.Accent, a.Reset, maxNameLen, a.Dim, a.Reset) fmt.Printf(format, name, suffix) } } @@ -512,6 +575,9 @@ func cmdList(args []string) error { applyModeLabel(mode) + // Extract kind filter (e.g. "skillshare list agents" or "--all"). 
+ kind, rest := parseKindArgWithAll(rest) + opts, err := parseListArgs(rest) if opts.ShowHelp { printListHelp() @@ -522,7 +588,7 @@ func cmdList(args []string) error { } if mode == modeProject { - return cmdListProject(cwd, opts) + return cmdListProject(cwd, opts, kind) } cfg, err := config.Load() @@ -533,114 +599,158 @@ func cmdList(args []string) error { // TTY + not JSON + TUI enabled → launch TUI with async loading (no blank screen) if !opts.JSON && shouldLaunchTUI(opts.NoTUI, cfg) { loadFn := func() listLoadResult { - discovered, err := sync.DiscoverSourceSkillsAll(cfg.Source) - if err != nil { - return listLoadResult{err: fmt.Errorf("cannot discover skills: %w", err)} + // Always load both skills and agents — tab UI filters the view. + var allEntries []skillEntry + discovered, discErr := sync.DiscoverSourceSkillsAll(cfg.Source) + if discErr != nil { + return listLoadResult{err: fmt.Errorf("cannot discover skills: %w", discErr)} } - skills := buildSkillEntries(discovered) - total := len(skills) - skills = filterSkillEntries(skills, opts.Pattern, opts.TypeFilter) - if opts.SortBy != "" { - sortSkillEntries(skills, opts.SortBy) - } - return listLoadResult{skills: toSkillItems(skills), totalCount: total} + allEntries = append(allEntries, buildSkillEntries(discovered)...) + allEntries = append(allEntries, discoverAndBuildAgentEntries(cfg.EffectiveAgentsSource())...) + total := len(allEntries) + allEntries = filterSkillEntries(allEntries, opts.Pattern, opts.TypeFilter) + // Always sort: buildGroupedItems relies on contiguous top-group + // blocks, which the default sort guarantees (tracked → named + // locals → standalone). 
+ sortSkillEntries(allEntries, opts.SortBy) + return listLoadResult{skills: toSkillItems(allEntries), totalCount: total} } - action, skillName, err := runListTUI(loadFn, "global", cfg.Source, cfg.Targets) + action, skillName, skillKind, err := runListTUI(loadFn, "global", cfg.Source, cfg.EffectiveAgentsSource(), cfg.Targets, kind) if err != nil { return err } switch action { case "empty": - ui.Info("No skills installed") - ui.Info("Use 'skillshare install ' to install a skill") + resourceLabel := "skills" + if kind == kindAgents { + resourceLabel = "agents" + } + ui.Info("No %s installed", resourceLabel) return nil case "audit": + if skillKind == "agent" { + return cmdAudit([]string{"agents", "-g", skillName}) + } return cmdAudit([]string{"-g", skillName}) case "update": + if skillKind == "agent" { + return cmdUpdate([]string{"agents", "-g", skillName}) + } return cmdUpdate([]string{"-g", skillName}) case "uninstall": + if skillKind == "agent" { + return cmdUninstall([]string{"agents", "-g", "--force", skillName}) + } return cmdUninstall([]string{"-g", "--force", skillName}) } return nil } // Non-TUI path (JSON or plain text): synchronous loading with spinner + resourceLabel := "skills" + if kind == kindAgents { + resourceLabel = "agents" + } else if kind == kindAll { + resourceLabel = "resources" + } + var sp *ui.Spinner if !opts.JSON && ui.IsTTY() { - sp = ui.StartSpinner("Loading skills...") + sp = ui.StartSpinner(fmt.Sprintf("Loading %s...", resourceLabel)) } - discovered, err := sync.DiscoverSourceSkillsAll(cfg.Source) - if err != nil { + + var allEntries []skillEntry + var trackedRepos []string + var discoveredSkills []sync.DiscoveredSkill + + if kind.IncludesSkills() { + var discErr error + discoveredSkills, discErr = sync.DiscoverSourceSkillsAll(cfg.Source) + if discErr != nil { + if sp != nil { + sp.Fail("Discovery failed") + } + return fmt.Errorf("cannot discover skills: %w", discErr) + } + trackedRepos = extractTrackedRepos(discoveredSkills) if sp != nil 
{ - sp.Fail("Discovery failed") + sp.Update(fmt.Sprintf("Reading metadata for %d skills...", len(discoveredSkills))) } - return fmt.Errorf("cannot discover skills: %w", err) + allEntries = append(allEntries, buildSkillEntries(discoveredSkills)...) } - trackedRepos := extractTrackedRepos(discovered) - if sp != nil { - sp.Update(fmt.Sprintf("Reading metadata for %d skills...", len(discovered))) + if kind.IncludesAgents() { + agentEntries := discoverAndBuildAgentEntries(cfg.EffectiveAgentsSource()) + allEntries = append(allEntries, agentEntries...) } - skills := buildSkillEntries(discovered) + if sp != nil { - sp.Success(fmt.Sprintf("Loaded %d skills", len(skills))) + sp.Success(fmt.Sprintf("Loaded %d %s", len(allEntries), resourceLabel)) } - totalCount := len(skills) + totalCount := len(allEntries) hasFilter := opts.Pattern != "" || opts.TypeFilter != "" // Apply filter and sort - skills = filterSkillEntries(skills, opts.Pattern, opts.TypeFilter) - if opts.SortBy != "" { - sortSkillEntries(skills, opts.SortBy) - } + allEntries = filterSkillEntries(allEntries, opts.Pattern, opts.TypeFilter) + // Always sort so the TUI and plain-text renderers see contiguous + // top-level groups (tracked → named locals → standalone). 
+ sortSkillEntries(allEntries, opts.SortBy) // JSON output if opts.JSON { - return displaySkillsJSON(skills) + return displaySkillsJSON(allEntries) } - // Handle empty results before starting pager - if len(skills) == 0 && len(trackedRepos) == 0 && !hasFilter { - ui.Info("No skills installed") - ui.Info("Use 'skillshare install ' to install a skill") + // Handle empty results + if len(allEntries) == 0 && len(trackedRepos) == 0 && !hasFilter { + ui.Info("No %s installed", resourceLabel) + if kind.IncludesSkills() { + ui.Info("Use 'skillshare install ' to install a skill") + } return nil } - if hasFilter && len(skills) == 0 { + if hasFilter && len(allEntries) == 0 { if opts.Pattern != "" && opts.TypeFilter != "" { - ui.Info("No skills matching %q (type: %s)", opts.Pattern, opts.TypeFilter) + ui.Info("No %s matching %q (type: %s)", resourceLabel, opts.Pattern, opts.TypeFilter) } else if opts.Pattern != "" { - ui.Info("No skills matching %q", opts.Pattern) + ui.Info("No %s matching %q", resourceLabel, opts.Pattern) } else { - ui.Info("No skills matching type %q", opts.TypeFilter) + ui.Info("No %s matching type %q", resourceLabel, opts.TypeFilter) } return nil } // Plain text output (--no-tui or non-TTY) - if len(skills) > 0 { - ui.Header("Installed skills") + if len(allEntries) > 0 { + headerLabel := "Installed skills" + if kind == kindAgents { + headerLabel = "Installed agents" + } else if kind == kindAll { + headerLabel = "Installed skills & agents" + } + ui.Header(headerLabel) if opts.Verbose { - displaySkillsVerbose(skills) + displaySkillsVerbose(allEntries) } else { - displaySkillsCompact(skills) + displaySkillsCompact(allEntries) } } // Hide tracked repos section when filter/pattern is active if len(trackedRepos) > 0 && !hasFilter { - displayTrackedRepos(trackedRepos, discovered, cfg.Source) + displayTrackedRepos(trackedRepos, discoveredSkills, cfg.Source) } // Show match stats when filter is active - if hasFilter && len(skills) > 0 { + if hasFilter && 
len(allEntries) > 0 { fmt.Println() if opts.Pattern != "" { - ui.Info("%d of %d skill(s) matching %q", len(skills), totalCount, opts.Pattern) + ui.Info("%d of %d %s matching %q", len(allEntries), totalCount, resourceLabel, opts.Pattern) } else { - ui.Info("%d of %d skill(s)", len(skills), totalCount) + ui.Info("%d of %d %s", len(allEntries), totalCount, resourceLabel) } - } else if !opts.Verbose && len(skills) > 0 { + } else if !opts.Verbose && len(allEntries) > 0 { fmt.Println() ui.Info("Use --verbose for more details") } @@ -650,6 +760,7 @@ func cmdList(args []string) error { type skillEntry struct { Name string + Kind string // "skill" or "agent" Source string Type string InstalledAt string @@ -663,6 +774,7 @@ type skillEntry struct { // skillJSON is the JSON representation for --json output. type skillJSON struct { Name string `json:"name"` + Kind string `json:"kind,omitempty"` // "skill" or "agent" RelPath string `json:"relPath"` Source string `json:"source,omitempty"` Type string `json:"type,omitempty"` @@ -676,6 +788,7 @@ func displaySkillsJSON(skills []skillEntry) error { for i, s := range skills { items[i] = skillJSON{ Name: s.Name, + Kind: s.Kind, RelPath: s.RelPath, Source: s.Source, Type: s.Type, @@ -703,12 +816,13 @@ func abbreviateSource(source string) string { } func printListHelp() { - fmt.Println(`Usage: skillshare list [pattern] [options] + fmt.Println(`Usage: skillshare list [agents] [pattern] [options] List all installed skills in the source directory. An optional pattern filters skills by name, path, or source (case-insensitive). 
 
 Options:
+  --all            List both skills and agents
   --verbose, -v    Show detailed information (source, type, install date)
   --json, -j       Output as JSON (useful for CI/scripts)
   --no-tui         Disable interactive TUI, use plain text output
@@ -724,5 +838,7 @@ Examples:
   skillshare list --type local
   skillshare list react --type github --sort newest
   skillshare list --json | jq '.[].name'
-  skillshare list --verbose`)
+  skillshare list --verbose
+  skillshare list agents      # List agents only
+  skillshare list --all       # List skills + agents`)
 }
diff --git a/cmd/skillshare/list_project.go b/cmd/skillshare/list_project.go
index a8f8c08d..368b1583 100644
--- a/cmd/skillshare/list_project.go
+++ b/cmd/skillshare/list_project.go
@@ -3,20 +3,30 @@ package main
 
 import (
 	"fmt"
 	"path/filepath"
+	"time"
 
 	"skillshare/internal/config"
 	"skillshare/internal/sync"
+	"skillshare/internal/trash"
 	"skillshare/internal/ui"
 )
 
-func cmdListProject(root string, opts listOptions) error {
+func cmdListProject(root string, opts listOptions, kind resourceKindFilter) error {
 	if !projectConfigExists(root) {
 		if err := performProjectInit(root, projectInitOptions{}); err != nil {
 			return err
 		}
 	}
 
-	sourcePath := filepath.Join(root, ".skillshare", "skills")
+	skillsSource := filepath.Join(root, ".skillshare", "skills")
+	agentsSource := filepath.Join(root, ".skillshare", "agents")
+
+	resourceLabel := "skills"
+	if kind == kindAgents {
+		resourceLabel = "agents"
+	} else if kind == kindAll {
+		resourceLabel = "resources"
+	}
 
 	// TTY + not JSON + TUI enabled → launch TUI with async loading (no blank screen)
 	if !opts.JSON && shouldLaunchTUI(opts.NoTUI, nil) {
@@ -30,31 +40,47 @@ func cmdListProject(root string, opts listOptions) error {
 			sortBy = "name"
 		}
 		loadFn := func() listLoadResult {
-			discovered, err := sync.DiscoverSourceSkillsAll(sourcePath)
+			// Always load both skills and agents — tab UI filters the view.
+			var allEntries []skillEntry
+			discovered, err := sync.DiscoverSourceSkillsAll(skillsSource)
 			if err != nil {
 				return listLoadResult{err: fmt.Errorf("cannot discover project skills: %w", err)}
 			}
-			skills := buildSkillEntries(discovered)
-			total := len(skills)
-			skills = filterSkillEntries(skills, opts.Pattern, opts.TypeFilter)
-			sortSkillEntries(skills, sortBy)
-			return listLoadResult{skills: toSkillItems(skills), totalCount: total}
+			allEntries = append(allEntries, buildSkillEntries(discovered)...)
+			allEntries = append(allEntries, discoverAndBuildAgentEntries(agentsSource)...)
+			total := len(allEntries)
+			allEntries = filterSkillEntries(allEntries, opts.Pattern, opts.TypeFilter)
+			sortSkillEntries(allEntries, sortBy)
+			return listLoadResult{skills: toSkillItems(allEntries), totalCount: total}
 		}
-		action, skillName, err := runListTUI(loadFn, "project", sourcePath, targets)
+		action, skillName, skillKind, err := runListTUI(loadFn, "project", skillsSource, agentsSource, targets, kind)
 		if err != nil {
 			return err
 		}
 		switch action {
 		case "empty":
-			ui.Info("No skills installed")
-			ui.Info("Use 'skillshare install -p ' to install a skill")
+			ui.Info("No %s installed", resourceLabel)
+			if kind.IncludesSkills() {
+				ui.Info("Use 'skillshare install -p ' to install a skill")
+			}
 			return nil
 		case "audit":
+			if skillKind == "agent" {
+				return cmdAudit([]string{"agents", "-p", skillName})
+			}
 			return cmdAudit([]string{"-p", skillName})
 		case "update":
+			if skillKind == "agent" {
+				return cmdUpdateAgentsProject([]string{skillName}, root, time.Now())
+			}
 			_, updateErr := cmdUpdateProject([]string{skillName}, root)
 			return updateErr
 		case "uninstall":
+			if skillKind == "agent" {
+				agentsDir := filepath.Join(root, ".skillshare", "agents")
+				uOpts := &uninstallOptions{skillNames: []string{skillName}, force: true}
+				return cmdUninstallAgents(agentsDir, uOpts, config.ProjectConfigPath(root), trash.ProjectAgentTrashDir(root), time.Now())
+			}
 			return cmdUninstallProject([]string{"--force", skillName}, root)
 		}
 		return nil
@@ -63,96 +89,114 @@ func cmdListProject(root string, opts listOptions) error {
 
 	// Non-TUI path (JSON or plain text): synchronous loading with spinner
 	var sp *ui.Spinner
 	if !opts.JSON && ui.IsTTY() {
-		sp = ui.StartSpinner("Loading skills...")
+		sp = ui.StartSpinner(fmt.Sprintf("Loading %s...", resourceLabel))
 	}
 
-	// Use DiscoverSourceSkillsAll so disabled skills appear in the listing
-	discovered, err := sync.DiscoverSourceSkillsAll(sourcePath)
-	if err != nil {
+	var allEntries []skillEntry
+	var trackedRepos []string
+	var discoveredSkills []sync.DiscoveredSkill
+
+	if kind.IncludesSkills() {
+		var discErr error
+		discoveredSkills, discErr = sync.DiscoverSourceSkillsAll(skillsSource)
+		if discErr != nil {
+			if sp != nil {
+				sp.Fail("Discovery failed")
+			}
+			return fmt.Errorf("cannot discover project skills: %w", discErr)
+		}
+		trackedRepos = extractTrackedRepos(discoveredSkills)
 		if sp != nil {
-			sp.Fail("Discovery failed")
+			sp.Update(fmt.Sprintf("Reading metadata for %d skills...", len(discoveredSkills)))
 		}
-		return fmt.Errorf("cannot discover project skills: %w", err)
+		allEntries = append(allEntries, buildSkillEntries(discoveredSkills)...)
 	}
-	trackedRepos := extractTrackedRepos(discovered)
-	if sp != nil {
-		sp.Update(fmt.Sprintf("Reading metadata for %d skills...", len(discovered)))
+	if kind.IncludesAgents() {
+		allEntries = append(allEntries, discoverAndBuildAgentEntries(agentsSource)...)
 	}
-	skills := buildSkillEntries(discovered)
+
 	if sp != nil {
-		sp.Success(fmt.Sprintf("Loaded %d skills", len(skills)))
+		sp.Success(fmt.Sprintf("Loaded %d %s", len(allEntries), resourceLabel))
 	}
 
-	totalCount := len(skills)
+	totalCount := len(allEntries)
 	hasFilter := opts.Pattern != "" || opts.TypeFilter != ""
 
 	// Apply filter and sort
-	skills = filterSkillEntries(skills, opts.Pattern, opts.TypeFilter)
+	allEntries = filterSkillEntries(allEntries, opts.Pattern, opts.TypeFilter)
 	sortBy := opts.SortBy
 	if sortBy == "" {
 		sortBy = "name" // project mode default
 	}
-	sortSkillEntries(skills, sortBy)
+	sortSkillEntries(allEntries, sortBy)
 
 	// JSON output
 	if opts.JSON {
-		return displaySkillsJSON(skills)
+		return displaySkillsJSON(allEntries)
 	}
 
-	// Handle empty results before starting pager
-	if len(skills) == 0 && len(trackedRepos) == 0 && !hasFilter {
-		ui.Info("No skills installed")
-		ui.Info("Use 'skillshare install -p ' to install a skill")
+	// Handle empty results
+	if len(allEntries) == 0 && len(trackedRepos) == 0 && !hasFilter {
+		ui.Info("No %s installed", resourceLabel)
+		if kind.IncludesSkills() {
+			ui.Info("Use 'skillshare install -p ' to install a skill")
+		}
 		return nil
 	}
-	if hasFilter && len(skills) == 0 {
+	if hasFilter && len(allEntries) == 0 {
 		if opts.Pattern != "" && opts.TypeFilter != "" {
-			ui.Info("No skills matching %q (type: %s)", opts.Pattern, opts.TypeFilter)
+			ui.Info("No %s matching %q (type: %s)", resourceLabel, opts.Pattern, opts.TypeFilter)
 		} else if opts.Pattern != "" {
-			ui.Info("No skills matching %q", opts.Pattern)
+			ui.Info("No %s matching %q", resourceLabel, opts.Pattern)
 		} else {
-			ui.Info("No skills matching type %q", opts.TypeFilter)
+			ui.Info("No %s matching type %q", resourceLabel, opts.TypeFilter)
 		}
 		return nil
 	}
 
 	// Plain text output (--no-tui or non-TTY)
-	if len(skills) > 0 {
-		ui.Header("Installed skills (project)")
+	if len(allEntries) > 0 {
+		headerLabel := "Installed skills (project)"
+		if kind == kindAgents {
+			headerLabel = "Installed agents (project)"
+		} else if kind == kindAll {
+			headerLabel = "Installed skills & agents (project)"
+		}
+		ui.Header(headerLabel)
 		if opts.Verbose {
-			displaySkillsVerbose(skills)
+			displaySkillsVerbose(allEntries)
 		} else {
-			displaySkillsCompact(skills)
+			displaySkillsCompact(allEntries)
 		}
 	}
 
 	// Hide tracked repos section when filter/pattern is active
 	if len(trackedRepos) > 0 && !hasFilter {
-		displayTrackedRepos(trackedRepos, discovered, sourcePath)
+		displayTrackedRepos(trackedRepos, discoveredSkills, skillsSource)
 	}
 
 	// Show match stats when filter is active
-	if hasFilter && len(skills) > 0 {
+	if hasFilter && len(allEntries) > 0 {
 		fmt.Println()
 		if opts.Pattern != "" {
-			ui.Info("%d of %d skill(s) matching %q", len(skills), totalCount, opts.Pattern)
+			ui.Info("%d of %d %s matching %q", len(allEntries), totalCount, resourceLabel, opts.Pattern)
 		} else {
-			ui.Info("%d of %d skill(s)", len(skills), totalCount)
+			ui.Info("%d of %d %s", len(allEntries), totalCount, resourceLabel)
 		}
 	} else {
 		fmt.Println()
 		trackedCount := 0
 		remoteCount := 0
-		for _, skill := range skills {
-			if skill.RepoName != "" {
+		for _, entry := range allEntries {
+			if entry.RepoName != "" {
 				trackedCount++
-			} else if skill.Source != "" {
+			} else if entry.Source != "" {
 				remoteCount++
 			}
 		}
-		localCount := len(skills) - trackedCount - remoteCount
-		ui.Info("%d skill(s): %d tracked, %d remote, %d local", len(skills), trackedCount, remoteCount, localCount)
+		localCount := len(allEntries) - trackedCount - remoteCount
+		ui.Info("%d %s: %d tracked, %d remote, %d local", len(allEntries), resourceLabel, trackedCount, remoteCount, localCount)
 	}
 
 	return nil
diff --git a/cmd/skillshare/list_tui.go b/cmd/skillshare/list_tui.go
index 0ae6a41e..552a1103 100644
--- a/cmd/skillshare/list_tui.go
+++ b/cmd/skillshare/list_tui.go
@@ -9,6 +9,7 @@ import (
 
 	"skillshare/internal/config"
 	"skillshare/internal/skillignore"
+	"skillshare/internal/theme"
 	"skillshare/internal/utils"
 
 	"github.com/charmbracelet/bubbles/list"
@@ -23,11 +24,20 @@ import (
 // Keeps the widget fast — pagination + filter operate on at most this many items.
 const maxListItems = 1000
 
+// listTab represents the active tab filter in the list TUI.
+type listTab int
+
+const (
+	listTabAll    listTab = iota // show all items (skills + agents)
+	listTabSkills                // show skills only
+	listTabAgents                // show agents only
+)
+
 // applyTUIFilterStyle sets filter prompt, cursor, and input cursor to the shared style.
 func applyTUIFilterStyle(l *list.Model) {
-	l.Styles.FilterPrompt = tc.Filter
-	l.Styles.FilterCursor = tc.Filter
-	l.FilterInput.Cursor.Style = tc.Filter
+	l.Styles.FilterPrompt = theme.Accent()
+	l.Styles.FilterCursor = theme.Accent()
+	l.FilterInput.Cursor.Style = theme.Accent()
 }
 
 // listLoadResult holds the result of async skill loading inside the TUI.
@@ -60,15 +70,16 @@ type detailData struct {
 
 // listTUIModel is the bubbletea model for the interactive skill list.
 type listTUIModel struct {
-	list        list.Model
-	totalCount  int
-	modeLabel   string // "global" or "project"
-	sourcePath  string
-	targets     map[string]config.TargetConfig
-	quitting    bool
-	action      string // "audit", "update", "uninstall", or "" (normal quit)
-	termWidth   int
-	detailCache map[string]*detailData // key = RelPath; lazy-populated
+	list             list.Model
+	totalCount       int
+	modeLabel        string // "global" or "project"
+	sourcePath       string
+	agentsSourcePath string
+	targets          map[string]config.TargetConfig
+	quitting         bool
+	action           string // "audit", "update", "uninstall", or "" (normal quit)
+	termWidth        int
+	detailCache      map[string]*detailData // key = RelPath; lazy-populated
 
 	// Async loading — spinner shown until data arrives
 	loading bool
@@ -77,6 +88,12 @@ type listTUIModel struct {
 	loadErr     error // non-nil if loading failed
 	emptyResult bool  // true when async load returned zero skills
 
+	// Tab filter — pre-filters allItems by kind (All / Skills / Agents)
+	activeTab   listTab     // currently selected tab
+	activeTabP  *listTab    // shared pointer for delegate to read current tab
+	tabCounts   [3]int      // cached counts: [all, skills, agents]
+	tabFiltered []skillItem // cached result of tabFilteredItems(); set by applyFilter()
+
 	// Application-level filter — replaces bubbles/list built-in fuzzy filter
 	// to avoid O(N*M) fuzzy scan on 100k+ items every keystroke.
 	allItems []skillItem // full item set (kept in memory, never passed to list)
@@ -90,12 +107,14 @@ type listTUIModel struct {
 	confirming    bool   // true when confirmation overlay is shown
 	confirmAction string // "audit", "update", "uninstall"
 	confirmSkill  string // skill name for confirmation display
+	confirmKind   string // "skill" or "agent"
 
 	// Content viewer overlay — dual-pane: left tree + right content
 	showContent     bool
 	contentScroll   int
 	contentText     string // current file content (rendered)
 	contentSkillKey string // RelPath of skill being viewed
+	contentKind     string // "skill" or "agent" — set when entering content view
 	termHeight      int
 	treeAllNodes    []treeNode // complete flat tree (includes collapsed children)
 	treeNodes       []treeNode // visible nodes (collapsed children hidden)
@@ -106,8 +125,22 @@ type listTUIModel struct {
 
 // newListTUIModel creates a new TUI model.
 // When loadFn is non-nil, skills are loaded asynchronously inside the TUI (spinner shown).
 // When loadFn is nil, skills/totalCount are used directly (pre-loaded).
-func newListTUIModel(loadFn listLoadFn, skills []skillItem, totalCount int, modeLabel, sourcePath string, targets map[string]config.TargetConfig) listTUIModel {
-	delegate := listSkillDelegate{}
+func newListTUIModel(loadFn listLoadFn, skills []skillItem, totalCount int, modeLabel, sourcePath, agentsSourcePath string, targets map[string]config.TargetConfig, initialKind resourceKindFilter) listTUIModel {
+	// Map CLI kind filter to initial tab
+	var initTab listTab
+	switch initialKind {
+	case kindAgents:
+		initTab = listTabAgents
+	case kindSkills:
+		initTab = listTabSkills
+	default:
+		initTab = listTabAll
+	}
+
+	// Shared pointer lets the delegate read the current tab without re-creation.
+	tabPtr := new(listTab)
+	*tabPtr = initTab
+	delegate := listSkillDelegate{activeTab: tabPtr}
 
 	// Build initial item set (empty if async loading)
 	var items []list.Item
@@ -120,45 +153,111 @@ func newListTUIModel(loadFn listLoadFn, skills []skillItem, totalCount int, mode
 
 	// Create list model — built-in filter DISABLED; we manage our own.
 	l := list.New(items, delegate, 0, 0)
 	l.Title = fmt.Sprintf("Installed skills (%s)", modeLabel)
-	l.Styles.Title = tc.ListTitle
-	l.SetShowStatusBar(false)    // we render our own status with real total count
-	l.SetFilteringEnabled(false) // application-level filter replaces built-in
-	l.SetShowHelp(false)         // we render our own help
-	l.SetShowPagination(false)   // we render page info in our status line
+	l.Styles.Title = theme.Title()
+	l.Styles.NoItems = l.Styles.NoItems.PaddingLeft(2) // align with title
+	l.SetShowStatusBar(false)    // we render our own status with real total count
+	l.SetFilteringEnabled(false) // application-level filter replaces built-in
+	l.SetShowHelp(false)         // we render our own help
+	l.SetShowPagination(false)   // we render page info in our status line
 
 	// Loading spinner
 	sp := spinner.New()
 	sp.Spinner = spinner.Dot
-	sp.Style = tc.SpinnerStyle
+	sp.Style = theme.Accent()
 
 	// Filter text input
 	fi := textinput.New()
 	fi.Prompt = "/ "
-	fi.PromptStyle = tc.Filter
-	fi.Cursor.Style = tc.Filter
-	fi.Placeholder = "filter or t:tracked g:group r:repo"
+	fi.PromptStyle = theme.Accent()
+	fi.Cursor.Style = theme.Accent()
+	fi.Placeholder = "filter or t:tracked g:group r:repo k:kind"
 
 	m := listTUIModel{
-		list:        l,
-		totalCount:  totalCount,
-		modeLabel:   modeLabel,
-		sourcePath:  sourcePath,
-		targets:     targets,
-		detailCache: make(map[string]*detailData),
-		loading:     loadFn != nil,
-		loadSpinner: sp,
-		loadFn:      loadFn,
-		allItems:    allItems,
-		matchCount:  len(allItems),
-		filterInput: fi,
-	}
-	// Skip initial group header (index 0)
+		list:             l,
+		totalCount:       totalCount,
+		modeLabel:        modeLabel,
+		sourcePath:       sourcePath,
+		agentsSourcePath: agentsSourcePath,
+		targets:          targets,
+		activeTab:        initTab,
+		activeTabP:       tabPtr,
+		detailCache:      make(map[string]*detailData),
+		loading:          loadFn != nil,
+		loadSpinner:      sp,
+		loadFn:           loadFn,
+		allItems:         allItems,
+		matchCount:       len(allItems),
+		filterInput:      fi,
+	}
 	if loadFn == nil {
+		m.recomputeTabCounts()
+		m.applyFilter() // applies tab + text filter
 		skipGroupItem(&m.list, 1)
 	}
+	m.updateTitle()
 	return m
 }
 
+// recomputeTabCounts updates the cached per-tab counts from allItems.
+func (m *listTUIModel) recomputeTabCounts() {
+	var skills, agents int
+	for _, item := range m.allItems {
+		if item.entry.Kind == "agent" {
+			agents++
+		} else {
+			skills++
+		}
+	}
+	m.tabCounts = [3]int{len(m.allItems), skills, agents}
+}
+
+// tabFilteredItems returns the subset of allItems matching the active tab.
+// For listTabAll, items are reordered so skills come first then agents,
+// keeping each kind's original order intact.
+func (m *listTUIModel) tabFilteredItems() []skillItem {
+	if m.activeTab == listTabAll {
+		skills := make([]skillItem, 0, m.tabCounts[1])
+		agents := make([]skillItem, 0, m.tabCounts[2])
+		for _, item := range m.allItems {
+			if item.entry.Kind == "agent" {
+				agents = append(agents, item)
+			} else {
+				skills = append(skills, item)
+			}
+		}
+		return append(skills, agents...)
+	}
+	wantAgent := m.activeTab == listTabAgents
+	cap := m.tabCounts[1]
+	if wantAgent {
+		cap = m.tabCounts[2]
+	}
+	filtered := make([]skillItem, 0, cap)
+	for _, item := range m.allItems {
+		if (item.entry.Kind == "agent") == wantAgent {
+			filtered = append(filtered, item)
+		}
+	}
+	return filtered
+}
+
+// noun returns the display noun for the active tab.
+func (t listTab) noun() string {
+	switch t {
+	case listTabSkills:
+		return "skills"
+	case listTabAgents:
+		return "agents"
+	default:
+		return "resources"
+	}
+}
+
+// updateTitle sets the list title based on the active tab and mode.
+func (m *listTUIModel) updateTitle() {
+	m.list.Title = fmt.Sprintf("Installed %s (%s)", m.activeTab.noun(), m.modeLabel)
+}
+
 func (m listTUIModel) Init() tea.Cmd {
 	if m.loading && m.loadFn != nil {
 		return tea.Batch(m.loadSpinner.Tick, doLoadCmd(m.loadFn))
@@ -172,11 +271,12 @@ func (m listTUIModel) Init() tea.Cmd {
 
 // When filter is empty, all items are restored (full pagination).
 func (m *listTUIModel) applyFilter() {
 	m.detailScroll = 0
+	m.tabFiltered = m.tabFilteredItems()
 
-	// No filter — restore full item set with group separators
+	// No filter — restore tab-filtered item set with group separators
 	if m.filterText == "" {
-		m.matchCount = len(m.allItems)
-		m.list.SetItems(buildGroupedItems(m.allItems))
+		m.matchCount = len(m.tabFiltered)
+		m.list.SetItems(buildGroupedItems(m.tabFiltered))
 		m.list.ResetSelected()
 		return
 	}
@@ -186,7 +286,7 @@ func (m *listTUIModel) applyFilter() {
 	// Structured match, capped at maxListItems
 	var matched []list.Item
 	count := 0
-	for _, item := range m.allItems {
+	for _, item := range m.tabFiltered {
 		if matchSkillItem(item, q) {
 			count++
 			if len(matched) < maxListItems {
@@ -229,10 +329,10 @@ func (m listTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 		}
 		m.allItems = msg.result.skills
 		m.totalCount = msg.result.totalCount
-		m.matchCount = len(msg.result.skills)
-		// Populate list with group separators
-		m.list.SetItems(buildGroupedItems(msg.result.skills))
+		m.recomputeTabCounts()
+		m.applyFilter() // applies tab + text filter, sets matchCount
 		skipGroupItem(&m.list, 1)
+		m.updateTitle()
 		return m, nil
 
 	case tea.MouseMsg:
@@ -279,6 +379,7 @@
 			m.confirming = false
 			m.confirmAction = ""
 			m.confirmSkill = ""
+			m.confirmKind = ""
 			return m, nil
 		}
 		return m, nil
@@ -313,6 +414,20 @@
 		case "q", "ctrl+c":
 			m.quitting = true
 			return m, tea.Quit
+		case "tab":
+			m.activeTab = (m.activeTab + 1) % 3
+			*m.activeTabP = m.activeTab
+			m.applyFilter()
+			m.updateTitle()
+			skipGroupItem(&m.list, 1)
+			return m, nil
+		case "shift+tab":
+			m.activeTab = (m.activeTab - 1 + 3) % 3
+			*m.activeTabP = m.activeTab
+			m.applyFilter()
+			m.updateTitle()
+			skipGroupItem(&m.list, 1)
+			return m, nil
 		case "ctrl+d":
 			m.detailScroll += 8
 			return m, nil
@@ -376,6 +491,7 @@ func (m listTUIModel) enterConfirm(action string) (tea.Model, tea.Cmd) {
 	m.confirming = true
 	m.confirmAction = action
 	m.confirmSkill = name
+	m.confirmKind = item.entry.Kind
 	return m, nil
 }
 
@@ -387,7 +503,12 @@ func (m listTUIModel) toggleDisabled() (tea.Model, tea.Cmd) {
 		return m, nil
 	}
 
-	ignorePath := filepath.Join(m.sourcePath, ".skillignore")
+	var ignorePath string
+	if item.entry.Kind == "agent" {
+		ignorePath = filepath.Join(m.agentsSourcePath, ".agentignore")
+	} else {
+		ignorePath = filepath.Join(m.sourcePath, ".skillignore")
+	}
 	pattern := item.entry.RelPath
 	if pattern == "" {
 		pattern = item.entry.Name
@@ -444,7 +565,7 @@ func (m listTUIModel) View() string {
 	// Loading state — spinner + message
 	if m.loading {
-		return fmt.Sprintf("\n  %s Loading skills...\n", m.loadSpinner.View())
+		return fmt.Sprintf("\n  %s Loading %s...\n", m.loadSpinner.View(), m.activeTab.noun())
 	}
 
 	// Content viewer — dual-pane
@@ -458,10 +579,14 @@
 		if m.modeLabel == "project" {
 			flag = "-p"
 		}
-		cmd := fmt.Sprintf("skillshare %s %s %s", m.confirmAction, flag, m.confirmSkill)
+		kindArg := ""
+		if m.confirmKind == "agent" {
+			kindArg = "agents "
+		}
+		cmd := fmt.Sprintf("skillshare %s %s%s %s", m.confirmAction, kindArg, flag, m.confirmSkill)
 		if m.confirmAction == "uninstall" {
 			return fmt.Sprintf("\n  %s\n\n  → %s\n\n  Proceed? [Y/n] ",
-				tc.Red.Render("Uninstall "+m.confirmSkill+"?"), cmd)
+				theme.Danger().Render("Uninstall "+m.confirmSkill+"?"), cmd)
 		}
 		return fmt.Sprintf("\n  → %s\n\n  Proceed? [Y/n] ", cmd)
 	}
@@ -472,18 +597,47 @@
 	return m.viewVertical()
 }
 
+// renderTabBar renders the kind tab bar (All / Skills / Agents).
+func (m listTUIModel) renderTabBar() string {
+	type tab struct {
+		label string
+		tab   listTab
+		count int
+	}
+	tabs := []tab{
+		{"All", listTabAll, m.tabCounts[0]},
+		{"Skills", listTabSkills, m.tabCounts[1]},
+		{"Agents", listTabAgents, m.tabCounts[2]},
+	}
+
+	activeStyle := lipgloss.NewStyle().Bold(true).Underline(true)
+	inactiveStyle := theme.Dim()
+
+	var parts []string
+	for _, t := range tabs {
+		label := fmt.Sprintf("%s(%d)", t.label, t.count)
+		if t.tab == m.activeTab {
+			parts = append(parts, activeStyle.Inherit(theme.Accent()).Render(label))
+		} else {
+			parts = append(parts, inactiveStyle.Render(label))
+		}
+	}
+	return "  " + strings.Join(parts, "  ")
+}
+
 // renderFilterBar renders the status line for the list TUI.
 func (m listTUIModel) renderFilterBar() string {
 	return renderTUIFilterBar(
 		m.filterInput.View(), m.filtering, m.filterText,
-		m.matchCount, len(m.allItems), maxListItems,
-		"skills", m.renderPageInfo(),
+		m.matchCount, len(m.tabFiltered), maxListItems,
+		m.activeTab.noun(), m.renderPageInfo(),
 	)
 }
 
 func (m *listTUIModel) syncListSize() {
 	if listSplitActive(m.termWidth) {
-		panelHeight := m.termHeight - 5
+		// tab(1) + gap(1) + panel + gap(1) + filter(1) + summary(1) + gap(1) + help(1) + trail(1) = 8 overhead
+		panelHeight := m.termHeight - 8
 		if panelHeight < 6 {
 			panelHeight = 6
 		}
@@ -491,7 +645,7 @@
 		return
 	}
 
-	listHeight := m.termHeight - 20
+	listHeight := m.termHeight - 22
 	if listHeight < 6 {
 		listHeight = 6
 	}
@@ -535,7 +689,10 @@ func selectedSkillKey(item list.Item) string {
 
 func (m listTUIModel) viewSplit() string {
 	var b strings.Builder
 
-	panelHeight := m.termHeight - 5
+	b.WriteString(m.renderTabBar())
+	b.WriteString("\n\n")
+
+	panelHeight := m.termHeight - 8
 	if panelHeight < 6 {
 		panelHeight = 6
 	}
@@ -564,12 +721,12 @@ func (m listTUIModel) viewSplit() string {
 	b.WriteString(m.renderFilterBar())
 	b.WriteString(m.renderSummaryFooter())
 	b.WriteString("\n")
-	helpText := "↑↓ navigate  ←→ page  / filter  Ctrl+d/u detail  Enter view  A audit  U update  E enable/disable  X uninstall  q quit"
+	helpText := "Tab skills/agents  ↑↓ navigate  ←→ page  / filter  Ctrl+d/u detail  Enter view  A audit  U update  E enable/disable  X uninstall  q quit"
 	if m.filtering {
-		helpText = "t:type  g:group  r:repo  Enter lock  Esc clear  q quit"
+		helpText = "t:type  g:group  r:repo  k:kind  Enter lock  Esc clear  q quit"
 	}
 	help := appendScrollInfo(helpText, scrollInfo)
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(formatHelpBar(help))
 	b.WriteString("\n")
 
 	return b.String()
@@ -578,13 +735,15 @@
 
 func (m listTUIModel) viewVertical() string {
 	var b strings.Builder
 
+	b.WriteString(m.renderTabBar())
+	b.WriteString("\n\n")
 	b.WriteString(m.list.View())
 	b.WriteString("\n\n")
 	b.WriteString(m.renderFilterBar())
 
 	var scrollInfo string
 	if item, ok := m.list.SelectedItem().(skillItem); ok {
-		detailHeight := m.termHeight - m.termHeight*2/5 - 8
+		detailHeight := m.termHeight - m.termHeight*2/5 - 10 // -2 for tab bar
 		if detailHeight < 6 {
 			detailHeight = 6
 		}
@@ -603,12 +762,12 @@
 	b.WriteString(m.renderSummaryFooter())
 	b.WriteString("\n")
-	helpText := "↑↓ navigate  ←→ page  / filter  Ctrl+d/u detail  Enter view  A audit  U update  E enable/disable  X uninstall  q quit"
+	helpText := "Tab skills/agents  ↑↓ navigate  ←→ page  / filter  Ctrl+d/u detail  Enter view  A audit  U update  E enable/disable  X uninstall  q quit"
 	if m.filtering {
-		helpText = "t:type  g:group  r:repo  Enter lock  Esc clear  q quit"
+		helpText = "t:type  g:group  r:repo  k:kind  Enter lock  Esc clear  q quit"
 	}
 	help := appendScrollInfo(helpText, scrollInfo)
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(formatHelpBar(help))
 	b.WriteString("\n")
 
 	return b.String()
@@ -618,7 +777,7 @@ func (m listTUIModel) renderSummaryFooter() string {
 	localCount := 0
 	trackedCount := 0
 	remoteCount := 0
-	for _, item := range m.allItems {
+	for _, item := range m.tabFiltered {
 		switch {
 		case item.entry.RepoName != "":
 			trackedCount++
@@ -630,12 +789,12 @@ func (m listTUIModel) renderSummaryFooter() string {
 	}
 
 	parts := []string{
-		tc.Emphasis.Render(formatNumber(m.matchCount)) + tc.Dim.Render("/") + tc.Dim.Render(formatNumber(len(m.allItems))) + tc.Dim.Render(" visible"),
-		tc.Cyan.Render(formatNumber(localCount)) + tc.Dim.Render(" local"),
-		tc.Green.Render(formatNumber(trackedCount)) + tc.Dim.Render(" tracked"),
-		tc.Yellow.Render(formatNumber(remoteCount)) + tc.Dim.Render(" remote"),
+		theme.Primary().Render(formatNumber(m.matchCount)) + theme.Dim().Render("/") + theme.Dim().Render(formatNumber(len(m.tabFiltered))) + theme.Dim().Render(" visible"),
+		theme.Accent().Render(formatNumber(localCount)) + theme.Dim().Render(" local"),
+		theme.Success().Render(formatNumber(trackedCount)) + theme.Dim().Render(" tracked"),
+		theme.Warning().Render(formatNumber(remoteCount)) + theme.Dim().Render(" remote"),
 	}
-	return tc.Help.Render(strings.Join(parts, tc.Dim.Render(" | "))) + "\n"
+	return theme.Dim().MarginLeft(2).Render(strings.Join(parts, theme.Dim().Render(" | "))) + "\n"
 }
 
 // renderTUIFilterBar renders a unified filter + status line shared by all TUIs.
@@ -644,14 +803,14 @@ func renderTUIFilterBar(inputView string, filtering bool, filterText string, mat
 	if filtering {
 		if filterText == "" {
 			status := fmt.Sprintf("  %s %s%s", formatNumber(totalCount), noun, pageInfo)
-			return "  " + inputView + tc.Help.Render(status) + "\n"
+			return "  " + inputView + theme.Dim().MarginLeft(2).Render(status) + "\n"
 		}
 		status := fmt.Sprintf("  %s/%s %s", formatNumber(matchCount), formatNumber(totalCount), noun)
 		if maxShown > 0 && matchCount > maxShown {
 			status += fmt.Sprintf(" (first %s shown)", formatNumber(maxShown))
 		}
 		status += pageInfo
-		return "  " + inputView + tc.Help.Render(status) + "\n"
+		return "  " + inputView + theme.Dim().MarginLeft(2).Render(status) + "\n"
 	}
 	if filterText != "" {
 		status := fmt.Sprintf("filter: %s — %s/%s %s", filterText, formatNumber(matchCount), formatNumber(totalCount), noun)
@@ -659,9 +818,9 @@
 		if maxShown > 0 && matchCount > maxShown {
 			status += fmt.Sprintf(" (first %s shown)", formatNumber(maxShown))
 		}
 		status += pageInfo
-		return tc.Help.Render(status) + "\n"
+		return theme.Dim().MarginLeft(2).Render(status) + "\n"
 	}
-	return tc.Help.Render(fmt.Sprintf("%s %s%s", formatNumber(totalCount), noun, pageInfo)) + "\n"
+	return theme.Dim().MarginLeft(2).Render(fmt.Sprintf("%s %s%s", formatNumber(totalCount), noun, pageInfo)) + "\n"
 }
 
 // renderPageInfo returns page indicator like " · Page 2 of 4,729" or "" if single page.
@@ -701,13 +860,27 @@ func formatNumber(n int) string {
 	return b.String()
 }
 
-// getDetailData returns cached detail data for a skill, populating the cache on first access.
+// getDetailData returns cached detail data for a skill or agent, populating the cache on first access.
 func (m listTUIModel) getDetailData(e skillEntry) *detailData {
 	key := e.RelPath
 	if d, ok := m.detailCache[key]; ok {
 		return d
 	}
 
+	if e.Kind == "agent" {
+		// Agents are single .md files — read frontmatter from the file directly
+		agentFile := filepath.Join(m.agentsSourcePath, e.RelPath)
+		fm := utils.ParseFrontmatterFields(agentFile, []string{"description", "license"})
+		d := &detailData{
+			Description:   fm["description"],
+			License:       fm["license"],
+			Files:         []string{filepath.Base(e.RelPath)},
+			SyncedTargets: m.findSyncedTargets(e),
+		}
+		m.detailCache[key] = d
+		return d
+	}
+
 	skillDir := filepath.Join(m.sourcePath, e.RelPath)
 	skillMD := filepath.Join(skillDir, "SKILL.md")
@@ -738,7 +911,7 @@ func renderDetailGroup(title string, rows []string, _ int) string {
 	}
 
 	var b strings.Builder
-	b.WriteString(tc.Title.Render(title))
+	b.WriteString(theme.Title().Render(title))
 	b.WriteString("\n")
 	for _, r := range filtered {
 		b.WriteString("  ")
@@ -756,7 +929,7 @@ func renderDetailCard(title string, body string, width int) string {
 	if title == "" {
 		return style.Render(body)
 	}
-	return style.Render(tc.Title.Render(title) + "\n" + body)
+	return style.Render(theme.Title().Render(title) + "\n" + body)
 }
 
 func renderDetailSection(title string, body string, width int) string {
@@ -764,7 +937,7 @@
 		Width(width).
 		Align(lipgloss.Left).
 		Padding(0, 0)
-	return style.Render(tc.Title.Render(title) + "\n" + body)
+	return style.Render(theme.Title().Render(title) + "\n" + body)
 }
 
 // renderDetailBody renders the scrollable detail body for the selected skill.
@@ -795,22 +968,22 @@
 	// Source section (Installed date and Synced-to targets are in the header)
 	var sourceRows []string
 	if e.Source != "" {
-		sourceRows = append(sourceRows, renderFactRow("Source", tc.Cyan.Render(e.Source)))
+		sourceRows = append(sourceRows, renderFactRow("Source", theme.Accent().Render(e.Source)))
 	} else if e.RepoName != "" {
 		sourceRows = append(sourceRows, renderFactRow("Repo", e.RepoName))
 	}
 	if e.Branch != "" {
-		sourceRows = append(sourceRows, renderFactRow("Branch", tc.Cyan.Render(e.Branch)))
+		sourceRows = append(sourceRows, renderFactRow("Branch", theme.Accent().Render(e.Branch)))
 	}
 	if d.License != "" {
-		sourceRows = append(sourceRows, renderFactRow("License", tc.Green.Bold(true).Render(d.License)))
+		sourceRows = append(sourceRows, renderFactRow("License", theme.Success().Bold(true).Render(d.License)))
 	}
 	if len(d.Files) > 0 {
 		fileLabel := fmt.Sprintf("Files (%d)", len(d.Files))
-		sourceRows = append(sourceRows, renderFactRow(fileLabel, tc.Cyan.Render(strings.Join(d.Files, " · "))))
+		sourceRows = append(sourceRows, renderFactRow(fileLabel, theme.Accent().Render(strings.Join(d.Files, " · "))))
 	}
 	if len(d.SyncedTargets) > 0 {
-		sourceRows = append(sourceRows, renderFactRow("Synced to", tc.Cyan.Render(strings.Join(d.SyncedTargets, ", "))))
+		sourceRows = append(sourceRows, renderFactRow("Synced to", theme.Accent().Render(strings.Join(d.SyncedTargets, ", "))))
 	}
 	if len(sourceRows) > 0 {
 		b.WriteString(renderDetailSection("Details", strings.Join(sourceRows, "\n"), cardWidth))
@@ -831,13 +1004,13 @@ func renderDetailHeader(e skillEntry, d *detailData, width int) string {
 	var metaParts []string
 	metaParts = append(metaParts, detailStatusBits(e))
 	if e.InstalledAt != "" {
-		metaParts = append(metaParts, tc.Dim.Render(e.InstalledAt))
+		metaParts = append(metaParts, theme.Dim().Render(e.InstalledAt))
 	}
 	if len(d.SyncedTargets) > 0 {
-		metaParts = append(metaParts, tc.Cyan.Render(fmt.Sprintf("%d target(s)", len(d.SyncedTargets))))
+		metaParts = append(metaParts, theme.Accent().Render(fmt.Sprintf("%d target(s)", len(d.SyncedTargets))))
 	}
 
 	body.WriteString("\n\n")
-	body.WriteString(strings.Join(metaParts, tc.Dim.Render(" · ")))
+	body.WriteString(strings.Join(metaParts, theme.Dim().Render(" · ")))
 
 	return renderDetailCard("", body.String(), width)
 }
@@ -845,30 +1018,38 @@ func renderDetailHeader(e skillEntry, d *detailData, width int) string {
 
 func renderDetailParagraph(lines []string) []string {
 	rendered := make([]string, 0, len(lines))
 	for _, line := range lines {
-		rendered = append(rendered, tc.Value.Render(line))
+		rendered = append(rendered, lipgloss.NewStyle().Render(line))
 	}
 	return rendered
 }
 
 func detailStatusBits(e skillEntry) string {
 	var bits []string
+
+	// Kind label (Agent / Skill)
+	if e.Kind == "agent" {
+		bits = append(bits, theme.Accent().Render("Agent"))
+	} else {
+		bits = append(bits, theme.Accent().Render("Skill"))
+	}
+
 	switch {
 	case e.RepoName != "":
-		bits = append(bits, tc.Green.Render("tracked"))
+		bits = append(bits, theme.Success().Render("tracked"))
 	case e.Source != "":
-		bits = append(bits, tc.Yellow.Render("remote"))
+		bits = append(bits, theme.Warning().Render("remote"))
 	default:
-		bits = append(bits, tc.Dim.Render("local"))
+		bits = append(bits, theme.Dim().Render("local"))
 	}
 	if e.Disabled {
-		bits = append(bits, tc.Red.Render("disabled"))
+		bits = append(bits, theme.Danger().Render("disabled"))
 	}
 	return strings.Join(bits, " ")
 }
 
 func renderFactRow(label, value string) string {
 	labelStyle := lipgloss.NewStyle().Faint(true).Width(12)
-	return labelStyle.Render(label+":") + " " + tc.Value.Render(value)
+	return labelStyle.Render(label+":") + " " + lipgloss.NewStyle().Render(value)
 }
 
 // listSkillFiles returns visible file names in the skill directory.
@@ -913,36 +1094,38 @@ func (m listTUIModel) findSyncedTargets(e skillEntry) []string {
 
 // runListTUI starts the bubbletea TUI for the skill list.
// When loadFn is non-nil, data is loaded asynchronously inside the TUI (no blank screen). -// Returns (action, skillName, error). action is "" on normal quit (q/ctrl+c). -func runListTUI(loadFn listLoadFn, modeLabel, sourcePath string, targets map[string]config.TargetConfig) (string, string, error) { - model := newListTUIModel(loadFn, nil, 0, modeLabel, sourcePath, targets) +// Returns (action, skillName, skillKind, error). action is "" on normal quit (q/ctrl+c). +func runListTUI(loadFn listLoadFn, modeLabel, sourcePath, agentsSourcePath string, targets map[string]config.TargetConfig, initialKind resourceKindFilter) (string, string, string, error) { + model := newListTUIModel(loadFn, nil, 0, modeLabel, sourcePath, agentsSourcePath, targets, initialKind) p := tea.NewProgram(model, tea.WithAltScreen(), tea.WithMouseCellMotion()) finalModel, err := p.Run() if err != nil { - return "", "", err + return "", "", "", err } m, ok := finalModel.(listTUIModel) if !ok || m.action == "" { if m.loadErr != nil { - return "", "", m.loadErr + return "", "", "", m.loadErr } if m.emptyResult { - return "empty", "", nil + return "empty", "", "", nil } - return "", "", nil + return "", "", "", nil } - // Extract skill name from selected item + // Extract skill name and kind from selected item var skillName string + var skillKind string if item, ok := m.list.SelectedItem().(skillItem); ok { if item.entry.RelPath != "" { skillName = item.entry.RelPath } else { skillName = item.entry.Name } + skillKind = item.entry.Kind } - return m.action, skillName, nil + return m.action, skillName, skillKind, nil } // wordWrapLines splits text into lines that fit within maxWidth, breaking at word boundaries. 
diff --git a/cmd/skillshare/list_tui_content.go b/cmd/skillshare/list_tui_content.go
index 14e723e1..d347cc15 100644
--- a/cmd/skillshare/list_tui_content.go
+++ b/cmd/skillshare/list_tui_content.go
@@ -7,6 +7,7 @@ import (
 	"sort"
 	"strings"
 
+	"skillshare/internal/theme"
 	"skillshare/internal/utils"
 
 	tea "github.com/charmbracelet/bubbletea"
@@ -139,12 +140,37 @@ func buildVisibleNodes(all []treeNode) []treeNode {
 // loadContentForSkill populates the content viewer fields for the given skill.
 func loadContentForSkill(m *listTUIModel, e skillEntry) {
-	skillDir := filepath.Join(m.sourcePath, e.RelPath)
 	m.contentSkillKey = e.RelPath
+	m.contentKind = e.Kind
 	m.contentScroll = 0
 	m.treeCursor = 0
 	m.treeScroll = 0
 
+	if e.Kind == "agent" {
+		// Agents are single .md files — render directly, minimal tree
+		agentFile := filepath.Join(m.agentsSourcePath, e.RelPath)
+		data, err := os.ReadFile(agentFile)
+		if err != nil {
+			m.contentText = fmt.Sprintf("(error reading agent: %v)", err)
+			m.treeAllNodes = nil
+			m.treeNodes = nil
+			return
+		}
+		raw := strings.TrimSpace(string(data))
+		if raw == "" {
+			m.contentText = "(empty)"
+		} else {
+			w := m.contentPanelWidth()
+			m.contentText = hardWrapContent(renderMarkdown(raw, w), w)
+		}
+		// Single-file tree: just the agent .md file
+		m.treeAllNodes = []treeNode{{name: filepath.Base(e.RelPath), relPath: e.RelPath}}
+		m.treeNodes = m.treeAllNodes
+		return
+	}
+
+	// Existing skill directory logic
+	skillDir := filepath.Join(m.sourcePath, e.RelPath)
 	m.treeAllNodes = buildTreeNodes(skillDir)
 	m.treeNodes = buildVisibleNodes(m.treeAllNodes)
@@ -172,8 +198,13 @@ func loadContentFile(m *listTUIModel) {
 		return
 	}
 
-	skillDir := filepath.Join(m.sourcePath, m.contentSkillKey)
-	filePath := filepath.Join(skillDir, node.relPath)
+	var filePath string
+	if m.contentKind == "agent" {
+		filePath = filepath.Join(m.agentsSourcePath, node.relPath)
+	} else {
+		skillDir := filepath.Join(m.sourcePath, m.contentSkillKey)
+		filePath = filepath.Join(skillDir, node.relPath)
+	}
 
 	var rawText string
 	if node.name == "SKILL.md" {
@@ -212,13 +243,17 @@ func autoPreviewFile(m *listTUIModel) {
 	}
 }
 
-// contentPanelWidth returns the available text width for the right content panel.
-// This accounts for PaddingLeft(1) used in rendering, so content renders at
-// the exact width the panel can display without line-wrapping.
+// contentPanelWidth returns the available text width for the content panel.
+// Agents use full-width (no sidebar); skills use dual-pane layout.
 func (m *listTUIModel) contentPanelWidth() int {
+	if m.contentKind == "agent" {
+		w := m.termWidth - 4
+		if w < 40 {
+			w = 40
+		}
+		return w
+	}
 	sw := sidebarWidth(m.termWidth)
-	// lipgloss Width includes padding, so subtract 1 for PaddingLeft(1)
-	// Layout: leftMargin(1) + sidebar(sw) + PaddingLeft(1) + border(1) + PaddingLeft(1) + content + rightMargin(1)
 	w := m.termWidth - sw - 5 - 1
 	if w < 40 {
 		w = 40
@@ -238,10 +273,18 @@ func hardWrapContent(text string, width int) string {
 
 // ─── Glamour Markdown Rendering ──────────────────────────────────────
 
-// contentGlamourStyle returns a modified dark style with no backgrounds or margins
-// that would bleed or overflow in the constrained dual-pane layout.
+// contentGlamourStyle returns a glamour style tuned for the dual-pane list
+// detail view. The base config tracks the resolved terminal theme so body
+// text stays legible on light backgrounds (glamour's dark base uses light
+// gray prose which disappears on white). Backgrounds and margins are
+// stripped to avoid bleeding in the constrained layout.
 func contentGlamourStyle() ansi.StyleConfig {
-	s := styles.DarkStyleConfig
+	var s ansi.StyleConfig
+	if theme.Get().Mode == theme.ModeLight {
+		s = styles.LightStyleConfig
+	} else {
+		s = styles.DarkStyleConfig
+	}
 	zero := uint(0)
 	s.Document.Margin = &zero
@@ -282,8 +325,8 @@ func renderMarkdown(text string, width int) string {
 
 // ─── Keyboard Handling ───────────────────────────────────────────────
 
-// handleContentKey handles keyboard input in the dual-pane content viewer.
-// Keyboard always controls the left tree; Ctrl+d/u/g/G scroll the right panel.
+// handleContentKey handles keyboard input in the content viewer.
+// Agents (single file) only support scroll; skills support tree navigation + scroll.
 func (m listTUIModel) handleContentKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) {
 	switch msg.String() {
 	case "q", "ctrl+c":
@@ -293,35 +336,36 @@ func (m listTUIModel) handleContentKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) {
 		m.showContent = false
 		return m, nil
 
-	// Left tree: navigate
+	// Left tree: navigate (skills only — agents have no sidebar)
 	case "j", "down":
-		if m.treeCursor < len(m.treeNodes)-1 {
+		if m.contentKind != "agent" && m.treeCursor < len(m.treeNodes)-1 {
 			m.treeCursor++
 			m.ensureTreeCursorVisible()
 			autoPreviewFile(&m)
 		}
 		return m, nil
 	case "k", "up":
-		if m.treeCursor > 0 {
+		if m.contentKind != "agent" && m.treeCursor > 0 {
 			m.treeCursor--
 			m.ensureTreeCursorVisible()
 			autoPreviewFile(&m)
 		}
 		return m, nil
 	case "l", "right", "enter":
-		if len(m.treeNodes) > 0 && m.treeCursor < len(m.treeNodes) {
+		if m.contentKind != "agent" && len(m.treeNodes) > 0 && m.treeCursor < len(m.treeNodes) {
 			node := m.treeNodes[m.treeCursor]
 			if node.isDir {
 				toggleTreeDir(&m)
 			}
-			// Files are already auto-previewed by j/k
 		}
 		return m, nil
 	case "h", "left":
-		collapseOrParent(&m)
+		if m.contentKind != "agent" {
+			collapseOrParent(&m)
+		}
 		return m, nil
 
-	// Right content: scroll
+	// Content scroll
 	case "ctrl+d":
 		half := m.contentViewHeight() / 2
 		max := m.contentMaxScroll()
@@ -362,12 +406,54 @@ func sidebarWidth(termWidth int) int {
 	return w
 }
 
-// renderContentOverlay renders the full-screen dual-pane content viewer.
+// renderContentOverlay renders the full-screen content viewer.
+// Agents (single .md file) use a full-width layout without sidebar.
+// Skills (directory with multiple files) use a dual-pane layout with file tree.
 func renderContentOverlay(m listTUIModel) string {
+	if m.contentKind == "agent" {
+		return renderContentFullWidth(m)
+	}
+	return renderContentDualPane(m)
+}
+
+// renderContentFullWidth renders the content viewer without sidebar (for agents).
+func renderContentFullWidth(m listTUIModel) string {
+	var b strings.Builder
+
+	skillName := filepath.Base(m.contentSkillKey)
+	b.WriteString("\n")
+	b.WriteString(theme.Title().Render(fmt.Sprintf(" %s", skillName)))
+	b.WriteString("\n\n")
+
+	textW := m.contentPanelWidth()
+	contentHeight := m.contentViewHeight()
+	contentStr, scrollInfo := renderContentStr(m, textW, contentHeight)
+
+	panelW := textW + 2 // +2 for PaddingLeft(2)
+	panel := lipgloss.NewStyle().
+		Width(panelW).MaxWidth(panelW).
+		Height(contentHeight).MaxHeight(contentHeight).
+		PaddingLeft(2).
+		Render(contentStr)
+	b.WriteString(panel)
+	b.WriteString("\n\n")
+
+	help := "Ctrl+d/u scroll g/G top/bottom Esc back q quit"
+	if scrollInfo != "" {
+		help += " " + scrollInfo
+	}
+	b.WriteString(formatHelpBar(help))
+	b.WriteString("\n")
+
+	return b.String()
+}
+
+// renderContentDualPane renders the dual-pane content viewer with file tree sidebar.
+func renderContentDualPane(m listTUIModel) string {
 	var b strings.Builder
 
-	titleStyle := tc.Title
-	dimStyle := tc.Dim
+	titleStyle := theme.Title()
+	dimStyle := theme.Dim()
 
 	skillName := filepath.Base(m.contentSkillKey)
 	fileName := ""
@@ -403,7 +489,7 @@ func renderContentOverlay(m listTUIModel) string {
 		PaddingLeft(1).
 		Render(sidebarStr)
 
-	borderStyle := tc.Border.
+	borderStyle := theme.Dim().
 		Height(contentHeight).MaxHeight(contentHeight)
 	borderCol := strings.Repeat("│\n", contentHeight)
 	borderPanel := borderStyle.Render(strings.TrimRight(borderCol, "\n"))
@@ -422,7 +508,7 @@ func renderContentOverlay(m listTUIModel) string {
 	if scrollInfo != "" {
 		help += " " + scrollInfo
 	}
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(formatHelpBar(help))
 	b.WriteString("\n")
 
 	return b.String()
@@ -434,10 +520,10 @@ func renderSidebarStr(m listTUIModel, width, height int) string {
 		return "(no files)"
 	}
 
-	selectedStyle := lipgloss.NewStyle().Bold(true).Foreground(tc.BrandYellow)
-	dirStyle := tc.Cyan
+	selectedStyle := lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("#D4D93C"))
+	dirStyle := theme.Accent()
 	fileStyle := lipgloss.NewStyle()
-	dimStyle := tc.Dim
+	dimStyle := theme.Dim()
 
 	total := len(m.treeNodes)
 	start := m.treeScroll
diff --git a/cmd/skillshare/list_tui_content_test.go b/cmd/skillshare/list_tui_content_test.go
new file mode 100644
index 00000000..14fac7c3
--- /dev/null
+++ b/cmd/skillshare/list_tui_content_test.go
@@ -0,0 +1,49 @@
+package main
+
+import (
+	"testing"
+
+	"skillshare/internal/theme"
+)
+
+// contentGlamourStyle must pick the glamour base style that matches the
+// resolved terminal theme. Otherwise body prose rendered on a light
+// terminal uses the dark-base near-white color ("252") and disappears.
+func TestContentGlamourStyle_MatchesTheme(t *testing.T) {
+	t.Setenv("NO_COLOR", "")
+
+	cases := []struct {
+		mode     string
+		wantBody string // expected Document.StylePrimitive.Color
+	}{
+		{"light", "234"}, // readable dark gray on light terminals
+		{"dark", "252"},  // readable near-white on dark terminals
+	}
+
+	for _, tc := range cases {
+		t.Run(tc.mode, func(t *testing.T) {
+			t.Setenv("SKILLSHARE_THEME", tc.mode)
+			theme.Reset()
+			defer theme.Reset()
+
+			s := contentGlamourStyle()
+			if s.Document.StylePrimitive.Color == nil {
+				t.Fatal("Document.Color is nil")
+			}
+			if got := *s.Document.StylePrimitive.Color; got != tc.wantBody {
+				t.Errorf("body Color = %q, want %q", got, tc.wantBody)
+			}
+
+			// Shared customizations must still apply regardless of base.
+			if s.Document.Margin == nil || *s.Document.Margin != 0 {
+				t.Errorf("Document.Margin = %v, want 0", s.Document.Margin)
+			}
+			if s.H1.StylePrimitive.BackgroundColor != nil {
+				t.Errorf("H1.BackgroundColor = %v, want nil", s.H1.StylePrimitive.BackgroundColor)
+			}
+			if s.H1.StylePrimitive.Color == nil || *s.H1.StylePrimitive.Color != "6" {
+				t.Errorf("H1.Color = %v, want \"6\"", s.H1.StylePrimitive.Color)
+			}
+		})
+	}
+}
diff --git a/cmd/skillshare/list_tui_filter.go b/cmd/skillshare/list_tui_filter.go
index 154da1a4..1b3f741a 100644
--- a/cmd/skillshare/list_tui_filter.go
+++ b/cmd/skillshare/list_tui_filter.go
@@ -10,6 +10,7 @@ type filterQuery struct {
 	TypeTag  string // "tracked", "remote", "local" (empty = any)
 	GroupTag string // substring match against group segment
 	RepoTag  string // substring match against RepoName
+	KindTag  string // "skill" or "agent" (empty = any)
 	FreeText string // remaining text after removing known tags
 }
@@ -34,6 +35,10 @@ func parseFilterQuery(raw string) filterQuery {
 			q.RepoTag = val
 			continue
 		}
+		if val, ok := cutTag(lower, "k:", "kind:"); ok {
+			q.KindTag = normalizeKindValue(val)
+			continue
+		}
 		freeTokens = append(freeTokens, strings.ToLower(token))
 	}
@@ -69,6 +74,18 @@ func normalizeTypeValue(val string) string {
 	}
 }
 
+// normalizeKindValue maps plural forms to canonical kind names.
+func normalizeKindValue(val string) string {
+	switch val {
+	case "agent", "agents":
+		return "agent"
+	case "skill", "skills":
+		return "skill"
+	default:
+		return val
+	}
+}
+
 // matchSkillItem returns true if the skill item matches all non-empty conditions in the query (AND logic).
 func matchSkillItem(item skillItem, q filterQuery) bool {
 	e := item.entry
@@ -97,6 +114,11 @@ func matchSkillItem(item skillItem, q filterQuery) bool {
 		}
 	}
 
+	// Kind tag — exact match on skill kind
+	if q.KindTag != "" && e.Kind != q.KindTag {
+		return false
+	}
+
 	// Free text — substring match on FilterValue
 	if q.FreeText != "" {
 		fv := strings.ToLower(item.FilterValue())
@@ -141,3 +163,18 @@ func skillGroup(e skillEntry) string {
 	// Non-tracked: first segment is group
 	return parts[0]
 }
+
+// skillTopGroup returns the top-level visual grouping key for the list TUI.
+// Tracked entries (including nested agents in deep repo subdirs) group by
+// their tracked repo root. Non-tracked nested entries group by their first
+// path segment (e.g. local "mma/foo" → "mma"). Flat root-level entries
+// return "" which renders as the "standalone" bucket.
+func skillTopGroup(e skillEntry) string {
+	if e.RepoName != "" {
+		return e.RepoName
+	}
+	if i := strings.Index(e.RelPath, "/"); i > 0 {
+		return e.RelPath[:i]
+	}
+	return ""
+}
diff --git a/cmd/skillshare/list_tui_item.go b/cmd/skillshare/list_tui_item.go
index b066a7ba..640ebb74 100644
--- a/cmd/skillshare/list_tui_item.go
+++ b/cmd/skillshare/list_tui_item.go
@@ -5,6 +5,8 @@ import (
 	"io"
 	"strings"
 
+	"skillshare/internal/theme"
+
 	"github.com/charmbracelet/bubbles/list"
 	tea "github.com/charmbracelet/bubbletea"
 	"github.com/charmbracelet/lipgloss"
@@ -27,7 +29,10 @@ func (g groupItem) Title() string { return g.label }
 func (g groupItem) Description() string { return "" }
 
 // listSkillDelegate renders a compact single-line browser row for the list TUI.
-type listSkillDelegate struct{}
+// activeTab is a shared pointer so the delegate sees tab changes without re-creation.
+type listSkillDelegate struct {
+	activeTab *listTab // nil-safe: treat nil as listTabAll
+}
 
 func (listSkillDelegate) Height() int { return 1 }
 func (listSkillDelegate) Spacing() int { return 0 }
@@ -35,7 +40,7 @@ func (listSkillDelegate) Update(_ tea.Msg, _ *list.Model) tea.Cmd {
 	return nil
 }
 
-func (listSkillDelegate) Render(w io.Writer, m list.Model, index int, item list.Item) {
+func (d listSkillDelegate) Render(w io.Writer, m list.Model, index int, item list.Item) {
 	width := m.Width()
 	if width <= 0 {
 		width = 40
@@ -46,7 +51,8 @@ func (listSkillDelegate) Render(w io.Writer, m list.Model, index int, item list.
 		renderGroupRow(w, v, width)
 	case skillItem:
 		selected := index == m.Index()
-		renderSkillRow(w, v, width, selected)
+		allTab := d.activeTab != nil && *d.activeTab == listTabAll
+		renderSkillRow(w, v, width, selected, allTab)
 	}
 }
 
@@ -55,7 +61,7 @@ func renderGroupRow(w io.Writer, g groupItem, width int) {
 	if g.count > 0 {
 		label += fmt.Sprintf(" (%d)", g.count)
 	}
-	label = tc.Dim.Render(label)
+	label = theme.Dim().Render(label)
 
 	lineWidth := width - lipgloss.Width(label) - 3 // "─ " prefix + " "
 	if lineWidth < 2 {
@@ -63,21 +69,21 @@ func renderGroupRow(w io.Writer, g groupItem, width int) {
 	}
 	line := strings.Repeat("─", lineWidth)
 
-	fmt.Fprint(w, tc.Dim.Render("─ ")+label+" "+tc.Dim.Render(line))
+	fmt.Fprint(w, theme.Dim().Render("─ ")+label+" "+theme.Dim().Render(line))
 }
 
-func renderSkillRow(w io.Writer, skill skillItem, width int, selected bool) {
-	renderPrefixRow(w, skillTitleLine(skill.entry), width, selected)
+func renderSkillRow(w io.Writer, skill skillItem, width int, selected bool, allTab bool) {
+	renderPrefixRow(w, skillTitleLine(skill.entry, allTab), width, selected)
 }
 
 // renderPrefixRow renders a single-line list row with a "▌" prefix bar.
 // Shared by list TUI and audit TUI delegates.
 func renderPrefixRow(w io.Writer, line string, width int, selected bool) {
-	prefixStyle := tc.ListRowPrefix
-	bodyStyle := tc.ListRow
+	prefixStyle := theme.Dim()
+	bodyStyle := lipgloss.NewStyle().PaddingLeft(1)
 	if selected {
-		prefixStyle = tc.ListRowPrefixSelected
-		bodyStyle = tc.ListRowSelected
+		prefixStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#D4D93C"))
+		bodyStyle = theme.SelectedRow().PaddingLeft(1)
 		// Strip embedded ANSI so ListRowSelected's background fills the full
 		// row width — compound ANSI sequences (e.g. icon + name) contain
 		// resets that break the parent background propagation in lipgloss.
@@ -142,13 +148,13 @@ func (d prefixItemDelegate) Render(w io.Writer, m list.Model, index int, item li
 // renderPrefixRowWithDesc renders a 2-line list row with a "▌" prefix bar.
 // Line 1: title (same as renderPrefixRow). Line 2: description in muted style.
 func renderPrefixRowWithDesc(w io.Writer, title, desc string, width int, selected bool) {
-	prefixStyle := tc.ListRowPrefix
-	bodyStyle := tc.ListRow
-	descStyle := tc.ListMeta
+	prefixStyle := theme.Dim()
+	bodyStyle := lipgloss.NewStyle().PaddingLeft(1)
+	descStyle := theme.Dim().PaddingLeft(1)
 	if selected {
-		prefixStyle = tc.ListRowPrefixSelected
-		bodyStyle = tc.ListRowSelected
-		descStyle = tc.ListMetaSelected
+		prefixStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#D4D93C"))
+		bodyStyle = theme.SelectedRow().PaddingLeft(1)
+		descStyle = theme.SelectedRow().PaddingLeft(1)
 		title = xansi.Strip(title)
 		desc = xansi.Strip(desc)
 	}
@@ -201,12 +207,20 @@ func (i skillItem) Description() string {
 	return ""
 }
 
-func skillTitleLine(e skillEntry) string {
+func skillTitleLine(e skillEntry, allTab bool) string {
 	if e.Disabled {
 		// Disabled: dim the entire name + ⊘ prefix
-		return tc.Dim.Render("⊘ " + compactSkillPath(e))
+		return theme.Dim().Render("⊘ " + compactSkillPath(e))
+	}
+	var prefix string
+	if allTab {
+		if e.Kind == "agent" {
+			prefix = theme.Accent().Render("[A]") + " "
+		} else {
+			prefix = theme.Accent().Render("[S]") + " "
+		}
 	}
-	title := colorSkillPath(compactSkillPath(e))
+	title := prefix + colorSkillPath(compactSkillPath(e))
 	if badge := skillTypeBadge(e); badge != "" {
 		return title + " " + badge
 	}
@@ -241,10 +255,10 @@ func baseSkillPath(e skillEntry) string {
 func skillTypeBadge(e skillEntry) string {
 	var badge string
 	if e.RepoName == "" && e.Source == "" {
-		badge = tc.BadgeLocal.Render("local")
+		badge = theme.Badge().Render("local")
 	}
 	if e.Disabled {
-		disabled := tc.BadgeDisabled.Render("disabled")
+		disabled := theme.Badge().Faint(true).Render("disabled")
 		if badge != "" {
 			return badge + " " + disabled
 		}
@@ -258,7 +272,7 @@ func skillTypeBadge(e skillEntry) string {
 func colorSkillPath(path string) string {
 	segments := strings.Split(path, "/")
 	if len(segments) <= 1 {
-		return tc.Emphasis.Render(path)
+		return theme.Primary().Render(path)
 	}
 
 	dirs := segments[:len(segments)-1]
@@ -267,21 +281,21 @@ func colorSkillPath(path string) string {
 	var parts []string
 	for idx, dir := range dirs {
 		if idx == 0 {
-			parts = append(parts, tc.Cyan.Render(dir))
+			parts = append(parts, theme.Accent().Render(dir))
 		} else {
-			parts = append(parts, tc.Dim.Render(dir))
+			parts = append(parts, theme.Dim().Render(dir))
 		}
 	}
-	sep := tc.Faint.Render("/")
-	return strings.Join(parts, sep) + sep + tc.Emphasis.Render(name)
+	sep := theme.Dim().Render("/")
+	return strings.Join(parts, sep) + sep + theme.Primary().Render(name)
 }
 
 // colorSkillPathBold is like colorSkillPath but renders the skill name in bold
 // for extra prominence in the detail panel header.
 func colorSkillPathBold(path string) string {
 	segments := strings.Split(path, "/")
-	boldName := lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("15"))
+	boldName := theme.Primary().Bold(true)
 	if len(segments) <= 1 {
 		return boldName.Render(path)
 	}
@@ -292,13 +306,13 @@ func colorSkillPathBold(path string) string {
 	var parts []string
 	for idx, dir := range dirs {
 		if idx == 0 {
-			parts = append(parts, tc.Cyan.Render(dir))
+			parts = append(parts, theme.Accent().Render(dir))
 		} else {
-			parts = append(parts, tc.Dim.Render(dir))
+			parts = append(parts, theme.Dim().Render(dir))
 		}
 	}
-	sep := tc.Faint.Render("/")
+	sep := theme.Dim().Render("/")
 	return strings.Join(parts, sep) + sep + boldName.Render(name)
 }
 
@@ -318,16 +332,20 @@ func toSkillItems(entries []skillEntry) []skillItem {
 	return items
 }
 
-// buildGroupedItems inserts groupItem separators before each repo/local group.
+// buildGroupedItems inserts groupItem separators before each top-level group.
 // Skills must be sorted by RelPath (tracked repos with "_" prefix sort first).
-// If all skills belong to a single group (e.g. all standalone), no separators are added.
+// Grouping follows skillTopGroup(): tracked entries group by their repo root;
+// local nested entries group by their first path segment; flat locals fall
+// into "standalone". When items contain mixed kinds (skills + agents), the
+// kind is included in the key so they stay in separate blocks.
 func buildGroupedItems(skills []skillItem) []list.Item {
 	// Check if there are multiple groups.
 	groups := map[string]bool{}
+	hasMultiKinds := false
 	for _, s := range skills {
-		groups[s.entry.RepoName] = true
-		if len(groups) > 1 {
-			break
+		groups[s.entry.Kind+"\x00"+skillTopGroup(s.entry)] = true
+		if !hasMultiKinds && len(skills) > 0 && s.entry.Kind != skills[0].entry.Kind {
+			hasMultiKinds = true
 		}
 	}
@@ -357,12 +375,21 @@ func buildGroupedItems(skills []skillItem) []list.Item {
 	}
 
 	for _, s := range skills {
-		key := s.entry.RepoName // "" for local
+		top := skillTopGroup(s.entry)
+		key := s.entry.Kind + "\x00" + top
 		if key != currentGroup {
 			flush()
 			label := "standalone"
-			if key != "" {
-				label = strings.TrimPrefix(key, "_")
+			if top != "" {
+				label = strings.TrimPrefix(top, "_")
+			}
+			// Prefix with kind when mixed to visually separate skills/agents
+			if hasMultiKinds {
+				kindPrefix := "Skills"
+				if s.entry.Kind == "agent" {
+					kindPrefix = "Agents"
+				}
+				label = kindPrefix + " · " + label
 			}
 			items = append(items, groupItem{label: label})
 			currentGroup = key
diff --git a/cmd/skillshare/list_tui_item_test.go b/cmd/skillshare/list_tui_item_test.go
index 25f31109..ca86d7e3 100644
--- a/cmd/skillshare/list_tui_item_test.go
+++ b/cmd/skillshare/list_tui_item_test.go
@@ -123,6 +123,115 @@ func TestBuildGroupedItems_SingleGroup(t *testing.T) {
 	}
 }
 
+// Local nested skills (e.g. "mma/agent-browser") must form their own
+// visual group even though they have no RepoName, so they don't get
+// lumped together with flat root-level locals under "standalone".
+func TestBuildGroupedItems_LocalNestedGroup(t *testing.T) {
+	// Pre-sorted: mma group items contiguous, then flat standalone.
+	// (sortSkillEntries places standalone last.)
+	skills := []skillItem{
+		{entry: skillEntry{Name: "agent-browser", RelPath: "mma/agent-browser"}},
+		{entry: skillEntry{Name: "docker-expert", RelPath: "mma/docker-expert"}},
+		{entry: skillEntry{Name: "flat-a", RelPath: "flat-a"}},
+	}
+	items := buildGroupedItems(skills)
+	// Expect: group("mma") + 2 skills + group("standalone") + 1 skill = 5
+	if len(items) != 5 {
+		t.Fatalf("buildGroupedItems local nested: got %d items, want 5", len(items))
+	}
+	g1, ok := items[0].(groupItem)
+	if !ok || g1.label != "mma" || g1.count != 2 {
+		t.Errorf("group 1: %+v, want label=mma count=2", g1)
+	}
+	g2, ok := items[3].(groupItem)
+	if !ok || g2.label != "standalone" || g2.count != 1 {
+		t.Errorf("group 2: %+v, want label=standalone count=1", g2)
+	}
+}
+
+// Agents deep inside a tracked repo (e.g. _repo/agents/core/x.md,
+// _repo/agents/specialized/python/y.md) must all belong to ONE group
+// keyed by the tracked repo root — not split into core/specialized buckets.
+func TestBuildGroupedItems_TrackedAgentsDeepPath(t *testing.T) {
+	skills := []skillItem{
+		{entry: skillEntry{
+			Kind: "agent", Name: "code-reviewer",
+			RelPath: "_vijay-agents/agents/core/code-reviewer.md", RepoName: "_vijay-agents",
+		}},
+		{entry: skillEntry{
+			Kind: "agent", Name: "python-expert",
+			RelPath: "_vijay-agents/agents/specialized/python/python-expert.md", RepoName: "_vijay-agents",
+		}},
+		{entry: skillEntry{
+			Kind: "agent", Name: "local-agent",
+			RelPath: "local-agent.md",
+		}},
+	}
+	items := buildGroupedItems(skills)
+	// Expect: group("vijay-agents") + 2 agents + group("standalone") + 1 agent = 5
+	if len(items) != 5 {
+		t.Fatalf("buildGroupedItems tracked agents: got %d items, want 5", len(items))
+	}
+	g1, ok := items[0].(groupItem)
+	if !ok || g1.label != "vijay-agents" || g1.count != 2 {
+		t.Errorf("group 1: %+v, want label=vijay-agents count=2", g1)
+	}
+	g2, ok := items[3].(groupItem)
+	if !ok || g2.label != "standalone" || g2.count != 1 {
+		t.Errorf("group 2: %+v, want label=standalone count=1", g2)
+	}
+}
+
+func TestSkillTopGroup(t *testing.T) {
+	cases := []struct {
+		name string
+		e    skillEntry
+		want string
+	}{
+		{"tracked repo", skillEntry{RepoName: "_repo", RelPath: "_repo/security/audit"}, "_repo"},
+		{"local nested", skillEntry{RelPath: "mma/docker-expert"}, "mma"},
+		{"local deep", skillEntry{RelPath: "frontend/react/hooks"}, "frontend"},
+		{"flat local", skillEntry{RelPath: "my-skill"}, ""},
+		{"empty", skillEntry{}, ""},
+	}
+	for _, tc := range cases {
+		if got := skillTopGroup(tc.e); got != tc.want {
+			t.Errorf("%s: skillTopGroup=%q, want %q", tc.name, got, tc.want)
+		}
+	}
+}
+
+// The default sort must group entries by top-level bucket so
+// buildGroupedItems produces contiguous groups. Standalone entries
+// (empty top group) must sort last so they don't split named groups.
+func TestSortSkillEntries_TopGroupOrder(t *testing.T) {
+	skills := []skillEntry{
+		{Name: "react-best-practices", RelPath: "react-best-practices"},
+		{Name: "mma-slack", RelPath: "mma/slack"},
+		{Name: "tracked-a", RelPath: "_team-repo/a", RepoName: "_team-repo"},
+		{Name: "agent-browser", RelPath: "agent-browser"},
+		{Name: "mma-dogfood", RelPath: "mma/dogfood"},
+	}
+	sortSkillEntries(skills, "")
+
+	got := make([]string, len(skills))
+	for i, s := range skills {
+		got[i] = s.RelPath
+	}
+	want := []string{
+		"_team-repo/a", // tracked repo first (underscore sorts first)
+		"mma/dogfood", // then local nested "mma" group
+		"mma/slack", //
+		"agent-browser", // standalone flat items last, alphabetical
+		"react-best-practices", //
+	}
+	for i := range want {
+		if got[i] != want[i] {
+			t.Errorf("sort order[%d] = %q, want %q (full: %v)", i, got[i], want[i], got)
+		}
+	}
+}
+
 func TestSkillItem_Description_Tracked(t *testing.T) {
 	item := skillItem{entry: skillEntry{RepoName: "team-repo"}}
 	if got := item.Description(); got != "" {
diff --git a/cmd/skillshare/list_tui_test.go b/cmd/skillshare/list_tui_test.go
index 87dfc98a..54bba33f 100644
--- a/cmd/skillshare/list_tui_test.go
+++ b/cmd/skillshare/list_tui_test.go
@@ -41,13 +41,15 @@ func TestListDetailStatusBits(t *testing.T) {
 }
 
 func TestListSummaryFooterCounts(t *testing.T) {
+	items := []skillItem{
+		{entry: skillEntry{Name: "local", RelPath: "local"}},
+		{entry: skillEntry{Name: "tracked", RelPath: "tracked", RepoName: "team/repo"}},
+		{entry: skillEntry{Name: "remote", RelPath: "remote", Source: "github.com/example/repo"}},
+	}
 	m := listTUIModel{
-		allItems: []skillItem{
-			{entry: skillEntry{Name: "local", RelPath: "local"}},
-			{entry: skillEntry{Name: "tracked", RelPath: "tracked", RepoName: "team/repo"}},
-			{entry: skillEntry{Name: "remote", RelPath: "remote", Source: "github.com/example/repo"}},
-		},
-		matchCount: 2,
+		allItems: items,
+		tabFiltered: items, // All tab — same as allItems
+		matchCount: 2,
 	}
 	got := m.renderSummaryFooter()
@@ -98,7 +100,7 @@ func TestListViewSplit_HeaderKeepsSkillNameWhenDetailScrolled(t *testing.T) {
 		},
 	}
 
-	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), nil)
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
 	m.termWidth = 120
 	m.termHeight = 30
 	m.detailScroll = 999
@@ -133,7 +135,7 @@ func TestApplyFilter_WithTags(t *testing.T) {
 		{entry: skillEntry{Name: "remote-a", RelPath: "remote-a", Source: "github.com/foo/bar"}},
 	}
 
-	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), nil)
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
 
 	// Filter by type:tracked — should match 2 items
 	m.filterText = "t:tracked"
@@ -184,3 +186,113 @@ func TestApplyFilter_WithTags(t *testing.T) {
 		t.Fatalf("cleared matchCount = %d, want %d", m.matchCount, len(items))
 	}
 }
+
+func TestTabCounts(t *testing.T) {
+	items := []skillItem{
+		{entry: skillEntry{Name: "s1", RelPath: "s1", Kind: "skill"}},
+		{entry: skillEntry{Name: "s2", RelPath: "s2", Kind: "skill"}},
+		{entry: skillEntry{Name: "a1", RelPath: "a1.md", Kind: "agent"}},
+	}
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
+	want := [3]int{3, 2, 1}
+	if m.tabCounts != want {
+		t.Fatalf("tabCounts = %v, want %v", m.tabCounts, want)
+	}
+}
+
+func TestTabSwitchFiltersItems(t *testing.T) {
+	items := []skillItem{
+		{entry: skillEntry{Name: "skill-a", RelPath: "skill-a", Kind: "skill"}},
+		{entry: skillEntry{Name: "skill-b", RelPath: "skill-b", Kind: "skill"}},
+		{entry: skillEntry{Name: "agent-a", RelPath: "agent-a.md", Kind: "agent"}},
+	}
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
+
+	// Default All tab — should show 3
+	if m.matchCount != 3 {
+		t.Fatalf("All tab matchCount = %d, want 3", m.matchCount)
+	}
+
+	// Switch to Skills tab
+	m.activeTab = listTabSkills
+	m.applyFilter()
+	if m.matchCount != 2 {
+		t.Fatalf("Skills tab matchCount = %d, want 2", m.matchCount)
+	}
+
+	// Switch to Agents tab
+	m.activeTab = listTabAgents
+	m.applyFilter()
+	if m.matchCount != 1 {
+		t.Fatalf("Agents tab matchCount = %d, want 1", m.matchCount)
+	}
+}
+
+func TestInitialKindSetsTab(t *testing.T) {
+	m := newListTUIModel(nil, nil, 0, "global", t.TempDir(), "", nil, kindAgents)
+	if m.activeTab != listTabAgents {
+		t.Fatalf("initialKind=kindAgents → activeTab = %d, want %d", m.activeTab, listTabAgents)
+	}
+
+	m2 := newListTUIModel(nil, nil, 0, "global", t.TempDir(), "", nil, kindSkills)
+	if m2.activeTab != listTabSkills {
+		t.Fatalf("initialKind=kindSkills → activeTab = %d, want %d", m2.activeTab, listTabSkills)
+	}
+}
+
+func TestTabWithFilterComposition(t *testing.T) {
+	items := []skillItem{
+		{entry: skillEntry{Name: "react", RelPath: "react", Kind: "skill"}},
+		{entry: skillEntry{Name: "vue", RelPath: "vue", Kind: "skill"}},
+		{entry: skillEntry{Name: "react-agent", RelPath: "react-agent.md", Kind: "agent"}},
+	}
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
+
+	// Skills tab + text filter "react" → only skill "react"
+	m.activeTab = listTabSkills
+	m.filterText = "react"
+	m.applyFilter()
+	if m.matchCount != 1 {
+		t.Fatalf("Skills+react matchCount = %d, want 1", m.matchCount)
+	}
+
+	// All tab + text filter "react" → skill "react" + agent "react-agent"
+	m.activeTab = listTabAll
+	m.applyFilter()
+	if m.matchCount != 2 {
+		t.Fatalf("All+react matchCount = %d, want 2", m.matchCount)
+	}
+}
+
+func TestTabBarRendering(t *testing.T) {
+	items := []skillItem{
+		{entry: skillEntry{Name: "s1", RelPath: "s1", Kind: "skill"}},
+		{entry: skillEntry{Name: "a1", RelPath: "a1.md", Kind: "agent"}},
+	}
+	m := newListTUIModel(nil, items, len(items), "global", t.TempDir(), "", nil, kindAll)
+	bar := xansi.Strip(m.renderTabBar())
+	for _, want := range []string{"All(2)", "Skills(1)", "Agents(1)"} {
+		if !strings.Contains(bar, want) {
+			t.Fatalf("tab bar missing %q in %q", want, bar)
+		}
+	}
+}
+
+func TestUpdateTitle(t *testing.T) {
+	m := newListTUIModel(nil, nil, 0, "global", t.TempDir(), "", nil, kindAll)
+	if !strings.Contains(m.list.Title, "resources") {
+		t.Fatalf("All tab title = %q, want 'resources'", m.list.Title)
+	}
+
+	m.activeTab = listTabSkills
+	m.updateTitle()
+	if !strings.Contains(m.list.Title, "skills") {
+		t.Fatalf("Skills tab title = %q, want 'skills'", m.list.Title)
+	}
+
+	m.activeTab = listTabAgents
+	m.updateTitle()
+	if !strings.Contains(m.list.Title, "agents") {
+		t.Fatalf("Agents tab title = %q, want 'agents'", m.list.Title)
+	}
+}
diff --git a/cmd/skillshare/log_tui.go b/cmd/skillshare/log_tui.go
index 6230c61a..bc725e00 100644
--- a/cmd/skillshare/log_tui.go
+++ b/cmd/skillshare/log_tui.go
@@ -6,25 +6,17 @@ import (
 	"strings"
 	"time"
 
+	"skillshare/internal/oplog"
+	"skillshare/internal/theme"
+	"skillshare/internal/ui"
+
 	"github.com/charmbracelet/bubbles/list"
 	"github.com/charmbracelet/bubbles/spinner"
 	"github.com/charmbracelet/bubbles/textinput"
 	tea "github.com/charmbracelet/bubbletea"
 	"github.com/charmbracelet/lipgloss"
-
-	"skillshare/internal/oplog"
-	"skillshare/internal/ui"
 )
 
-// lc holds log-specific styles that don't belong in the shared tc palette.
-var lc = struct {
-	Bold lipgloss.Style // counts, totals — bold without color
-	Label lipgloss.Style // row labels — wider (22) than tc.Label (14)
-}{
-	Bold: lipgloss.NewStyle().Bold(true),
-	Label: lipgloss.NewStyle().Faint(true).Width(22),
-}
-
 // logLoadFn is a function that loads log items (runs in a goroutine inside the TUI).
type logLoadFn func() ([]logItem, error) @@ -95,7 +87,7 @@ func newLogTUIModel(loadFn logLoadFn, items []logItem, logLabel, modeLabel, conf l := list.New(listItems, newPrefixDelegate(false), 0, 0) l.Title = fmt.Sprintf("Log: %s (%s)", logLabel, modeLabel) - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) // custom status line l.SetFilteringEnabled(false) // application-level filter l.SetShowHelp(false) @@ -104,13 +96,13 @@ func newLogTUIModel(loadFn logLoadFn, items []logItem, logLabel, modeLabel, conf // Loading spinner sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() // Filter text input fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() return logTUIModel{ list: l, @@ -534,7 +526,7 @@ func (m logTUIModel) View() string { b.WriteString("\n") help := "s back to list q quit" - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() } @@ -571,7 +563,7 @@ func (m logTUIModel) View() string { b.WriteString(m.renderStatsFooter()) b.WriteString("\n") - b.WriteString(tc.Help.Render(appendScrollInfo(m.logHelpBar(), scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo(m.logHelpBar(), scrollInfo))) b.WriteString("\n") return b.String() @@ -596,7 +588,7 @@ func (m logTUIModel) viewVertical() string { b.WriteString(m.renderStatsFooter()) - b.WriteString(tc.Help.Render(appendScrollInfo(m.logHelpBar(), scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo(m.logHelpBar(), scrollInfo))) b.WriteString("\n") return b.String() @@ -622,7 +614,7 @@ func (m logTUIModel) logHelpBar() string { help := strings.Join(parts, " ") if m.lastDeletedMsg != "" { - help = tc.Green.Render(m.lastDeletedMsg) + " " + help + help = theme.Success().Render(m.lastDeletedMsg) 
+ " " + help } return help @@ -665,8 +657,8 @@ func renderLogDetailPanel(item logItem) string { var b strings.Builder row := func(label, value string) { - b.WriteString(lc.Label.Render(label)) - b.WriteString(tc.Value.Render(value)) + b.WriteString(theme.Dim().Width(22).Render(label)) + b.WriteString(lipgloss.NewStyle().Render(value)) b.WriteString("\n") } @@ -676,17 +668,17 @@ func renderLogDetailPanel(item logItem) string { row("Timestamp:", e.Timestamp) // Command — cyan to match CLI palette - row("Command:", tc.Cyan.Render(strings.ToUpper(e.Command))) + row("Command:", theme.Accent().Render(strings.ToUpper(e.Command))) // Status with color statusDisplay := e.Status switch e.Status { case "ok": - statusDisplay = tc.Green.Render(e.Status) + statusDisplay = theme.Success().Render(e.Status) case "error", "blocked": - statusDisplay = tc.Red.Render(e.Status) + statusDisplay = theme.Danger().Render(e.Status) case "partial": - statusDisplay = tc.Yellow.Render(e.Status) + statusDisplay = theme.Warning().Render(e.Status) } row("Status:", statusDisplay) @@ -712,7 +704,7 @@ func renderLogDetailPanel(item logItem) string { for _, p := range pairs { // List fields: render as multi-line bullet list for readability if p.isList && len(p.listValues) > 0 { - b.WriteString(lc.Label.Render(p.key + ":")) + b.WriteString(theme.Dim().Width(22).Render(p.key + ":")) b.WriteString("\n") show := p.listValues remaining := 0 @@ -721,11 +713,11 @@ func renderLogDetailPanel(item logItem) string { show = show[:maxBulletItems] } for _, v := range show { - b.WriteString(" - " + tc.Value.Render(v) + "\n") + b.WriteString(" - " + lipgloss.NewStyle().Render(v) + "\n") } if remaining > 0 { summary := fmt.Sprintf(" ... 
and %d more", remaining) - b.WriteString(tc.Dim.Render(summary) + "\n") + b.WriteString(theme.Dim().Render(summary) + "\n") } continue } @@ -738,9 +730,9 @@ func renderLogDetailPanel(item logItem) string { // Colorize only severity/status fields to avoid visual noise switch { case strings.Contains(p.key, "failed") || strings.Contains(p.key, "scan-errors"): - value = tc.Red.Render(value) + value = theme.Danger().Render(value) case strings.Contains(p.key, "warning"): - value = tc.Yellow.Render(value) + value = theme.Warning().Render(value) case p.key == "risk": value = colorizeRiskValue(value) case p.key == "threshold": @@ -757,7 +749,7 @@ func renderLogDetailPanel(item logItem) string { // severityStyles maps the 5 severity levels (c/h/m/l/i) to lipgloss styles. var severityStyles = []lipgloss.Style{ - tc.Critical, tc.High, tc.Medium, tc.Low, tc.Info, + theme.Severity("critical"), theme.Severity("high"), theme.Severity("medium"), theme.Severity("low"), theme.Severity("info"), } // colorizeSeverityBreakdown colors each number in "0/0/1/0/0" to match audit summary. @@ -769,16 +761,16 @@ func colorizeSeverityBreakdown(value string) string { for i, p := range parts { parts[i] = severityStyles[i].Render(p) } - sep := tc.Dim.Render("/") + sep := theme.Dim().Render("/") return strings.Join(parts, sep) } // colorizeThreshold applies color based on audit threshold level. func colorizeThreshold(value string) string { if ui.SeverityColorID(value) == "" { - return tc.Green.Render(value) + return theme.Success().Render(value) } - return tcSevStyle(value).Render(value) + return theme.SeverityStyle(value).Render(value) } // colorizeRiskValue applies color based on the risk label embedded in the value string. 
@@ -787,9 +779,9 @@ func colorizeRiskValue(value string) string { sev := strings.SplitN(strings.ToUpper(value), " ", 2)[0] sev = strings.TrimRight(sev, "(") if ui.SeverityColorID(sev) == "" { - return tc.Green.Render(value) + return theme.Success().Render(value) } - return tcSevStyle(sev).Render(value) + return theme.SeverityStyle(sev).Render(value) } // computeLogStatsFromItems converts logItems to oplog entries and computes stats. @@ -810,21 +802,21 @@ func (m logTUIModel) renderStatsFooter() string { rateStyle := statsSuccessRateColor(m.stats.SuccessRate) parts := []string{ - tc.Dim.Render(fmt.Sprintf("%d ops", m.stats.Total)), + theme.Dim().Render(fmt.Sprintf("%d ops", m.stats.Total)), rateStyle.Render(fmt.Sprintf("✓ %.1f%%", m.stats.SuccessRate*100)), } if m.stats.LastOperation != nil { ts, err := time.Parse(time.RFC3339, m.stats.LastOperation.Timestamp) if err == nil { - lastPart := tc.Dim.Render("last: ") + - tc.Cyan.Render(m.stats.LastOperation.Command) + - tc.Dim.Render(fmt.Sprintf(" %s ago", formatRelativeTime(time.Since(ts)))) + lastPart := theme.Dim().Render("last: ") + + theme.Accent().Render(m.stats.LastOperation.Command) + + theme.Dim().Render(fmt.Sprintf(" %s ago", formatRelativeTime(time.Since(ts)))) parts = append(parts, lastPart) } } - sep := tc.Dim.Render(" | ") + sep := theme.Dim().Render(" | ") return " " + strings.Join(parts, sep) + "\n" } @@ -832,13 +824,13 @@ func (m logTUIModel) renderStatsFooter() string { func (m logTUIModel) renderStatsPanel() string { var b strings.Builder - b.WriteString(tc.Title.Render(" Operation Log Summary")) + b.WriteString(theme.Title().Render(" Operation Log Summary")) b.WriteString("\n") - b.WriteString(tc.Dim.Render(" " + strings.Repeat("─", 50))) + b.WriteString(theme.Dim().Render(" " + strings.Repeat("─", 50))) b.WriteString("\n\n") if m.stats.Total == 0 { - b.WriteString(tc.Dim.Render(" No entries")) + b.WriteString(theme.Dim().Render(" No entries")) b.WriteString("\n") return b.String() } @@ -850,20 
+842,20 @@ func (m logTUIModel) renderStatsPanel() string { } rateColor := statsSuccessRateColor(m.stats.SuccessRate) b.WriteString(fmt.Sprintf(" %s %s\n\n", - tc.Dim.Render("Total:"), - lc.Bold.Render(fmt.Sprintf("%d", m.stats.Total)), + theme.Dim().Render("Total:"), + lipgloss.NewStyle().Bold(true).Render(fmt.Sprintf("%d", m.stats.Total)), )) b.WriteString(fmt.Sprintf(" %s %s %s\n\n", - tc.Dim.Render("OK:"), + theme.Dim().Render("OK:"), rateColor.Render(fmt.Sprintf("%d/%d", okTotal, m.stats.Total)), - tc.Dim.Render(fmt.Sprintf("(%.1f%%)", m.stats.SuccessRate*100)), + theme.Dim().Render(fmt.Sprintf("(%.1f%%)", m.stats.SuccessRate*100)), )) // ── Command breakdown with horizontal bars ── header := fmt.Sprintf(" %-12s %-20s %s", "Command", "", "OK") - b.WriteString(tc.Dim.Render(header)) + b.WriteString(theme.Dim().Render(header)) b.WriteString("\n") - b.WriteString(tc.Dim.Render(" " + strings.Repeat("─", 42))) + b.WriteString(theme.Dim().Render(" " + strings.Repeat("─", 42))) b.WriteString("\n") type cmdEntry struct { @@ -899,21 +891,21 @@ func (m logTUIModel) renderStatsPanel() string { } errBarLen := barLen - okBarLen - cmdBar := tc.Green.Render(strings.Repeat("▓", okBarLen)) + cmdBar := theme.Success().Render(strings.Repeat("▓", okBarLen)) if errBarLen > 0 { - cmdBar += tc.Red.Render(strings.Repeat("▓", errBarLen)) + cmdBar += theme.Danger().Render(strings.Repeat("▓", errBarLen)) } padding := strings.Repeat(" ", cmdBarWidth-barLen) // "✓6/9" format — ok out of total, self-explanatory okRatio := fmt.Sprintf("✓%d/%d", cmd.cs.OK, cmd.cs.Total) - ratioColor := tc.Green + ratioColor := theme.Success() if cmd.cs.OK < cmd.cs.Total { - ratioColor = tc.Red + ratioColor = theme.Danger() } b.WriteString(fmt.Sprintf(" %s %s%s %s\n", - tc.Dim.Render(fmt.Sprintf("%-12s", cmd.name)), + theme.Dim().Render(fmt.Sprintf("%-12s", cmd.name)), cmdBar, padding, ratioColor.Render(okRatio))) } @@ -928,19 +920,19 @@ func (m logTUIModel) renderStatsPanel() string { blockedTotal += 
cs.Blocked } - b.WriteString(tc.Dim.Render(" " + strings.Repeat("─", 50))) + b.WriteString(theme.Dim().Render(" " + strings.Repeat("─", 50))) b.WriteString("\n") b.WriteString(fmt.Sprintf(" %s %s", - tc.Dim.Render("Status:"), - tc.Green.Render(fmt.Sprintf("✓ %d ok", okTotal)))) + theme.Dim().Render("Status:"), + theme.Success().Render(fmt.Sprintf("✓ %d ok", okTotal)))) if errTotal > 0 { - b.WriteString(fmt.Sprintf(" %s", tc.Red.Render(fmt.Sprintf("✗ %d error", errTotal)))) + b.WriteString(fmt.Sprintf(" %s", theme.Danger().Render(fmt.Sprintf("✗ %d error", errTotal)))) } if partialTotal > 0 { - b.WriteString(fmt.Sprintf(" %s", tc.Yellow.Render(fmt.Sprintf("◐ %d partial", partialTotal)))) + b.WriteString(fmt.Sprintf(" %s", theme.Warning().Render(fmt.Sprintf("◐ %d partial", partialTotal)))) } if blockedTotal > 0 { - b.WriteString(fmt.Sprintf(" %s", tc.Red.Render(fmt.Sprintf("⊘ %d blocked", blockedTotal)))) + b.WriteString(fmt.Sprintf(" %s", theme.Danger().Render(fmt.Sprintf("⊘ %d blocked", blockedTotal)))) } b.WriteString("\n") @@ -950,9 +942,9 @@ func (m logTUIModel) renderStatsPanel() string { if err == nil { ago := formatRelativeTime(time.Since(ts)) b.WriteString(fmt.Sprintf(" %s %s %s\n", - tc.Dim.Render("Last op:"), - tc.Cyan.Render(m.stats.LastOperation.Command), - tc.Dim.Render(fmt.Sprintf("(%s ago)", ago)))) + theme.Dim().Render("Last op:"), + theme.Accent().Render(m.stats.LastOperation.Command), + theme.Dim().Render(fmt.Sprintf("(%s ago)", ago)))) } } @@ -963,11 +955,11 @@ func (m logTUIModel) renderStatsPanel() string { func statsSuccessRateColor(rate float64) lipgloss.Style { switch { case rate >= 0.9: - return tc.Green.Bold(true) + return theme.Success().Bold(true) case rate >= 0.7: - return tc.Yellow.Bold(true) + return theme.Warning().Bold(true) default: - return tc.Red.Bold(true) + return theme.Danger().Bold(true) } } diff --git a/cmd/skillshare/main.go b/cmd/skillshare/main.go index 9056220a..7e276f95 100644 --- a/cmd/skillshare/main.go +++ 
b/cmd/skillshare/main.go @@ -9,6 +9,7 @@ import ( "skillshare/internal/config" "skillshare/internal/install" + "skillshare/internal/theme" "skillshare/internal/ui" versioncheck "skillshare/internal/version" ) @@ -52,6 +53,10 @@ func main() { // Clean up any leftover .old files from Windows self-upgrade cleanupOldBinary() + // Resolve theme early so OSC 11 probe overhead happens at startup, + // not inside a TUI render loop. + _ = theme.Get() + // Migrate Windows legacy ~/.config/skillshare → %AppData%\skillshare results := config.MigrateWindowsLegacyDir() @@ -196,21 +201,21 @@ func printUsage() { // Core Commands fmt.Println("CORE COMMANDS") cmd("init", "", "Initialize skillshare") - cmd("install", "", "Install a skill from local path or git repo") - cmd("uninstall", "...", "Remove skills from source directory") - cmd("list", "", "List all installed skills") + cmd("install", "", "Install skills/agents from local path or git repo") + cmd("uninstall", "...", "Remove skills/agents from source directory") + cmd("list", "[agents] [pattern] [--all]", "List installed skills (or agents)") cmd("search", "[query]", "Search or browse GitHub for skills") - cmd("sync", "[extras] [--all]", "Sync skills (or extras) to targets") + cmd("sync", "[agents] [--all]", "Sync skills/agents/extras to targets") cmd("status", "", "Show status of all targets") fmt.Println() - // Skill Management - fmt.Println("SKILL MANAGEMENT") + // Skill & Agent Management + fmt.Println("SKILL & AGENT MANAGEMENT") cmd("new", "", "Create a new skill with SKILL.md template") - cmd("enable", "", "Enable a disabled skill (remove from .skillignore)") - cmd("disable", "", "Disable a skill (add to .skillignore)") - cmd("check", "", "Check for available updates") - cmd("update", "", "Update a skill or tracked repository") + cmd("enable", "", "Enable a disabled skill/agent") + cmd("disable", "", "Disable a skill/agent") + cmd("check", "[agents] [--all]", "Check for available updates") + cmd("update", "[agents] ", 
"Update a skill/agent or tracked repository") cmd("update", "--all", "Update all tracked repositories") cmd("upgrade", "", "Upgrade CLI and/or skillshare skill") fmt.Println() @@ -220,16 +225,16 @@ func printUsage() { cmd("target add", " [path]", "Add a target (path optional in project mode)") cmd("target remove", "", "Unlink target and restore skills") cmd("target list", "", "List all targets") - cmd("diff", "", "Show differences between source and targets") + cmd("diff", "[target]", "Show differences between source and targets") fmt.Println() // Sync & Backup fmt.Println("SYNC & BACKUP") - cmd("collect", "[target]", "Collect local skills from target(s) to source") + cmd("collect", "[agents] [target]", "Collect local skills/agents from target(s)") cmd("backup", "", "Create backup of target(s)") cmd("restore", "", "Restore target from latest backup") - cmd("trash", "list", "List trashed skills") - cmd("trash", "restore ", "Restore a skill from trash") + cmd("trash", "[agents] list", "List trashed skills/agents") + cmd("trash", "[agents] restore ", "Restore a skill/agent from trash") fmt.Println() // Extras @@ -248,7 +253,7 @@ func printUsage() { // Utilities fmt.Println("UTILITIES") - cmd("audit", "[name]", "Scan skills for security threats") + cmd("audit", "[agents] [name]", "Scan skills/agents for security threats") cmd("hub", "", "Manage hubs (add, list, remove, default, index)") cmd("log", "", "View operation log") cmd("tui", "[on|off]", "Toggle interactive TUI mode") @@ -268,8 +273,12 @@ func printUsage() { fmt.Println("EXAMPLES") fmt.Println(g + " skillshare status # Check current state") fmt.Println(" skillshare sync --dry-run # Preview before sync") + fmt.Println(" skillshare sync agents # Sync agents only") + fmt.Println(" skillshare sync --all # Sync skills + agents + extras") + fmt.Println(" skillshare list --all # List skills + agents") fmt.Println(" skillshare collect claude # Import local skills") fmt.Println(" skillshare install anthropics/skills/pdf 
-p # Project install") + fmt.Println(" skillshare install repo -a my-agent # Install specific agent") fmt.Println(" skillshare target add cursor -p # Project target") fmt.Println(" skillshare push -m \"Add new skill\" # Push to remote") fmt.Println(" skillshare pull # Pull from remote") diff --git a/cmd/skillshare/new_tui.go b/cmd/skillshare/new_tui.go index ed876ea9..1dcf5dc0 100644 --- a/cmd/skillshare/new_tui.go +++ b/cmd/skillshare/new_tui.go @@ -4,6 +4,8 @@ import ( "errors" "fmt" "strings" + + "skillshare/internal/theme" ) var errCancelled = errors.New("cancelled") @@ -125,7 +127,7 @@ func runNewWizard() (selectedPattern, selectedCategory string, createDirs bool) // wizardHeader builds a compact breadcrumb string for the help bar footer. // e.g. "✓ reviewer → quality" func wizardHeader(pattern, category string) string { - check := tc.Green.Render("✓") + check := theme.Success().Render("✓") var parts []string if pattern != "" { parts = append(parts, check+" "+pattern) diff --git a/cmd/skillshare/project_paths.go b/cmd/skillshare/project_paths.go new file mode 100644 index 00000000..1fadef7b --- /dev/null +++ b/cmd/skillshare/project_paths.go @@ -0,0 +1,22 @@ +package main + +import ( + "path/filepath" + + "skillshare/internal/config" +) + +// resolveProjectPath expands ~ and resolves relative project paths against +// the project root so all project-mode comparisons use a single path form. 
+func resolveProjectPath(projectRoot, path string) string { + if path == "" { + return "" + } + + resolved := config.ExpandPath(path) + if !filepath.IsAbs(resolved) { + return filepath.Join(projectRoot, filepath.FromSlash(resolved)) + } + + return resolved +} diff --git a/cmd/skillshare/project_runtime.go b/cmd/skillshare/project_runtime.go index c00f666e..9a8c88ca 100644 --- a/cmd/skillshare/project_runtime.go +++ b/cmd/skillshare/project_runtime.go @@ -4,14 +4,17 @@ import ( "path/filepath" "skillshare/internal/config" + "skillshare/internal/install" ) type projectRuntime struct { - root string - config *config.ProjectConfig - registry *config.Registry - sourcePath string - targets map[string]config.TargetConfig + root string + config *config.ProjectConfig + skillsStore *install.MetadataStore + agentsStore *install.MetadataStore + sourcePath string + agentsSourcePath string + targets map[string]config.TargetConfig } func loadProjectRuntime(root string) (*projectRuntime, error) { @@ -25,16 +28,26 @@ func loadProjectRuntime(root string) (*projectRuntime, error) { return nil, err } - reg, err := config.LoadRegistry(filepath.Join(root, ".skillshare")) + skillsDir := filepath.Join(root, ".skillshare", "skills") + agentsDir := filepath.Join(root, ".skillshare", "agents") + + skillsStore, err := install.LoadMetadataWithMigration(skillsDir, "") + if err != nil { + return nil, err + } + + agentsStore, err := install.LoadMetadataWithMigration(agentsDir, "agent") if err != nil { return nil, err } return &projectRuntime{ - root: root, - config: cfg, - registry: reg, - sourcePath: filepath.Join(root, ".skillshare", "skills"), - targets: targets, + root: root, + config: cfg, + skillsStore: skillsStore, + agentsStore: agentsStore, + sourcePath: skillsDir, + agentsSourcePath: agentsDir, + targets: targets, }, nil } diff --git a/cmd/skillshare/project_skills.go b/cmd/skillshare/project_skills.go index 1a563c2d..f1779af2 100644 --- a/cmd/skillshare/project_skills.go +++ 
b/cmd/skillshare/project_skills.go @@ -5,5 +5,5 @@ import ( ) func reconcileProjectRemoteSkills(runtime *projectRuntime) error { - return config.ReconcileProjectSkills(runtime.root, runtime.config, runtime.registry, runtime.sourcePath) + return config.ReconcileProjectSkills(runtime.root, runtime.config, runtime.skillsStore, runtime.sourcePath) } diff --git a/cmd/skillshare/restore_tui.go b/cmd/skillshare/restore_tui.go index 322f2ad6..6f263843 100644 --- a/cmd/skillshare/restore_tui.go +++ b/cmd/skillshare/restore_tui.go @@ -12,10 +12,12 @@ import ( "github.com/charmbracelet/bubbles/spinner" "github.com/charmbracelet/bubbles/textinput" tea "github.com/charmbracelet/bubbletea" + "github.com/charmbracelet/lipgloss" "skillshare/internal/backup" "skillshare/internal/config" "skillshare/internal/oplog" + "skillshare/internal/theme" "skillshare/internal/utils" ) @@ -24,6 +26,24 @@ import ( // Left-right split layout: list on left, detail panel on right. // --------------------------------------------------------------------------- +// isAgentBackupEntry returns true if the backup entry name represents an agent backup. +func isAgentBackupEntry(name string) bool { + return strings.HasSuffix(name, "-agents") +} + +// agentBaseTarget returns the base target name by stripping the "-agents" suffix. +func agentBaseTarget(name string) string { + return strings.TrimSuffix(name, "-agents") +} + +// resolveAgentBackupPath resolves the agent target path for a backup entry name, +// reusing the canonical resolveAgentTargetPath with builtin fallback. +func resolveAgentBackupPath(targets map[string]config.TargetConfig, entryName string) string { + baseName := agentBaseTarget(entryName) + tc := targets[baseName] // zero-value is safe — AgentsConfig returns empty, falls through to builtin + return resolveAgentTargetPath(tc, config.DefaultAgentTargets(), baseName) +} + // restorePhase tracks which screen is active. 
type restorePhase int @@ -45,7 +65,11 @@ type restoreTargetItem struct { } func (i restoreTargetItem) Title() string { - return i.summary.TargetName + name := i.summary.TargetName + if isAgentBackupEntry(name) { + return theme.Accent().Render("[A]") + " " + agentBaseTarget(name) + } + return name } func (i restoreTargetItem) Description() string { return fmt.Sprintf("%d backup(s), latest: %s", @@ -144,7 +168,7 @@ func newRestoreTUIModel(summaries []backup.TargetBackupSummary, backupDir string tl := list.New(listItems, newPrefixDelegate(true), 0, 0) tl.Title = fmt.Sprintf("Backup Restore — %d target(s)", len(summaries)) - tl.Styles.Title = tc.ListTitle + tl.Styles.Title = theme.Title() tl.SetShowStatusBar(false) tl.SetFilteringEnabled(false) tl.SetShowHelp(false) @@ -152,12 +176,12 @@ func newRestoreTUIModel(summaries []backup.TargetBackupSummary, backupDir string sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() return restoreTUIModel{ phase: phaseTargetList, @@ -199,7 +223,7 @@ func (m restoreTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { case restoreDoneMsg: if msg.action == "delete" { if msg.err != nil { - m.resultMsg = tc.Red.Render(fmt.Sprintf("Delete failed: %s", msg.err)) + m.resultMsg = theme.Danger().Render(fmt.Sprintf("Delete failed: %s", msg.err)) m.phase = phaseDone return m, nil } @@ -208,16 +232,16 @@ func (m restoreTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { if m.selectedVersion != nil { label = m.selectedVersion.Label } - m.resultMsg = tc.Green.Render(fmt.Sprintf("Deleted backup %s", label)) + m.resultMsg = theme.Success().Render(fmt.Sprintf("Deleted backup %s", label)) m.confirmAction = "" m.selectedVersion = nil return m.enterVersionPhase() } m.phase = phaseDone if msg.err != nil { - m.resultMsg = 
tc.Red.Render(fmt.Sprintf("Error: %s", msg.err)) + m.resultMsg = theme.Danger().Render(fmt.Sprintf("Error: %s", msg.err)) } else { - m.resultMsg = tc.Green.Render(fmt.Sprintf("Restored %s from %s", m.selectedTarget, m.selectedVersion.Label)) + m.resultMsg = theme.Success().Render(fmt.Sprintf("Restored %s from %s", m.selectedTarget, m.selectedVersion.Label)) } return m, nil @@ -435,7 +459,7 @@ func (m restoreTUIModel) enterVersionPhase() (tea.Model, tea.Cmd) { lw := restoreListWidth(m.termWidth) vl := list.New(listItems, newPrefixDelegate(true), 0, 0) vl.Title = fmt.Sprintf("%s — select version", m.selectedTarget) - vl.Styles.Title = tc.ListTitle + vl.Styles.Title = theme.Title() vl.SetShowStatusBar(false) vl.SetFilteringEnabled(false) vl.SetShowHelp(false) @@ -462,14 +486,22 @@ func (m restoreTUIModel) startRestore() (tea.Model, tea.Cmd) { cmd := func() tea.Msg { start := time.Now() - targetCfg, ok := targets[targetName] - if !ok { + + var destPath string + if isAgentBackupEntry(targetName) { + destPath = resolveAgentBackupPath(targets, targetName) + } else { + if tc, ok := targets[targetName]; ok { + destPath = tc.SkillsConfig().Path + } + } + if destPath == "" { return restoreDoneMsg{err: fmt.Errorf("target '%s' not found in config", targetName)} } backupPath := filepath.Dir(version.Dir) opts := backup.RestoreOptions{Force: true} - err := backup.RestoreToPath(backupPath, targetName, targetCfg.SkillsConfig().Path, opts) + err := backup.RestoreToPath(backupPath, targetName, destPath, opts) e := oplog.NewEntry("restore", statusFromErr(err), time.Since(start)) e.Args = map[string]any{"target": targetName, "from": version.Label, "via": "tui"} @@ -626,7 +658,7 @@ func (m restoreTUIModel) View() string { case phaseDone: return fmt.Sprintf("\n %s\n\n %s\n", - m.resultMsg, tc.Help.Render("Press any key to exit")) + m.resultMsg, theme.Dim().MarginLeft(2).Render("Press any key to exit")) case phaseConfirm: return m.viewRestoreConfirm() @@ -674,7 +706,7 @@ func (m 
restoreTUIModel) viewHorizontal() string { b.WriteString(m.renderRestoreFilterBar()) // Help - b.WriteString(tc.Help.Render(appendScrollInfo(m.restoreHelpText(), scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo(m.restoreHelpText(), scrollInfo))) b.WriteString("\n") return b.String() @@ -709,7 +741,7 @@ func (m restoreTUIModel) viewVertical() string { b.WriteString(detailStr) b.WriteString("\n") - b.WriteString(tc.Help.Render(appendScrollInfo(m.restoreHelpText(), scrollInfo))) + b.WriteString(theme.Dim().MarginLeft(2).Render(appendScrollInfo(m.restoreHelpText(), scrollInfo))) b.WriteString("\n") return b.String() @@ -768,7 +800,7 @@ func (m restoreTUIModel) viewRestoreConfirm() string { if m.confirmAction == "delete" { fmt.Fprintf(&b, " %s\n\n", - tc.Red.Render(fmt.Sprintf("Delete backup %s for %s?", m.selectedVersion.Label, m.selectedTarget))) + theme.Danger().Render(fmt.Sprintf("Delete backup %s for %s?", m.selectedVersion.Label, m.selectedTarget))) } else { fmt.Fprintf(&b, " Restore %s from backup %s?\n\n", m.selectedTarget, m.selectedVersion.Label) } @@ -798,7 +830,7 @@ func (m restoreTUIModel) viewRestoreConfirm() string { } b.WriteString("\n ") - b.WriteString(tc.Help.Render("y confirm n cancel")) + b.WriteString(theme.Dim().MarginLeft(2).Render("y confirm n cancel")) b.WriteString("\n") return b.String() } @@ -838,15 +870,20 @@ func (m restoreTUIModel) renderTargetDetail(s backup.TargetBackupSummary) string var b strings.Builder row := func(label, value string) { - b.WriteString(tc.Label.Render(label)) - b.WriteString(tc.Value.Render(value)) + b.WriteString(theme.Dim().Width(14).Render(label)) + b.WriteString(lipgloss.NewStyle().Render(value)) b.WriteString("\n") } row("Target: ", s.TargetName) - // Target path and current state - if t, ok := m.targets[s.TargetName]; ok { + if isAgentBackupEntry(s.TargetName) { + agentPath := resolveAgentBackupPath(m.targets, s.TargetName) + if agentPath != "" { + row("Path: ", agentPath) + 
row("Status: ", describeTargetState(agentPath)) + } + } else if t, ok := m.targets[s.TargetName]; ok { sc := t.SkillsConfig() row("Path: ", sc.Path) if sc.Mode != "" { @@ -874,7 +911,7 @@ func (m restoreTUIModel) renderTargetDetail(s backup.TargetBackupSummary) string if len(skillNames) > 0 { b.WriteString("\n") - b.WriteString(tc.Separator.Render("── Latest backup skills ──────────────")) + b.WriteString(theme.Dim().Render("── Latest backup skills ──────────────")) b.WriteString("\n") const maxPreview = 20 show := skillNames @@ -884,17 +921,17 @@ func (m restoreTUIModel) renderTargetDetail(s backup.TargetBackupSummary) string for _, name := range show { desc := readSkillDescription(filepath.Join(latestDir, name)) if desc != "" { - b.WriteString(tc.Value.Render(" " + name)) + b.WriteString(lipgloss.NewStyle().Render(" " + name)) b.WriteString("\n") - b.WriteString(tc.Dim.Render(" " + truncateStr(desc, 60))) + b.WriteString(theme.Dim().Render(" " + truncateStr(desc, 60))) b.WriteString("\n") } else { - b.WriteString(tc.Value.Render(" " + name)) + b.WriteString(lipgloss.NewStyle().Render(" " + name)) b.WriteString("\n") } } if len(skillNames) > maxPreview { - b.WriteString(tc.Dim.Render(fmt.Sprintf(" ... and %d more", len(skillNames)-maxPreview))) + b.WriteString(theme.Dim().Render(fmt.Sprintf(" ... 
and %d more", len(skillNames)-maxPreview)))
 		b.WriteString("\n")
 	}
 }
@@ -907,8 +944,8 @@ func (m restoreTUIModel) renderVersionDetail(v backup.BackupVersion) string {
 	var b strings.Builder

 	row := func(label, value string) {
-		b.WriteString(tc.Label.Render(label))
-		b.WriteString(tc.Value.Render(value))
+		b.WriteString(theme.Dim().Width(14).Render(label))
+		b.WriteString(lipgloss.NewStyle().Render(value))
 		b.WriteString("\n")
 	}
@@ -920,37 +957,42 @@ func (m restoreTUIModel) renderVersionDetail(v backup.BackupVersion) string {
 		row("Size: ", "calculating...")
 	}

-	// Diff with current target
-	if t, ok := m.targets[m.selectedTarget]; ok {
-		added, removed, common := diffSkillSets(v.SkillNames, listDirNames(t.SkillsConfig().Path))
+	var diffPath string
+	if isAgentBackupEntry(m.selectedTarget) {
+		diffPath = resolveAgentBackupPath(m.targets, m.selectedTarget)
+	} else if t, ok := m.targets[m.selectedTarget]; ok {
+		diffPath = t.SkillsConfig().Path
+	}
+	if diffPath != "" {
+		added, removed, common := diffSkillSets(v.SkillNames, listDirNames(diffPath))
 		if len(added) > 0 || len(removed) > 0 {
 			b.WriteString("\n")
-			b.WriteString(tc.Separator.Render("── Diff vs current target ────────────"))
+			b.WriteString(theme.Dim().Render("── Diff vs current target ────────────"))
 			b.WriteString("\n")
 			if len(common) > 0 {
 				row("Same: ", fmt.Sprintf("%d skill(s)", len(common)))
 			}
 			if len(added) > 0 {
-				b.WriteString(tc.Label.Render("Restore: "))
-				b.WriteString(tc.Green.Render(fmt.Sprintf("+%d (in backup, not in target)", len(added))))
+				b.WriteString(theme.Dim().Width(14).Render("Restore: "))
+				b.WriteString(theme.Success().Render(fmt.Sprintf("+%d (in backup, not in target)", len(added))))
 				b.WriteString("\n")
 				for _, name := range added {
-					b.WriteString(tc.Green.Render(" + " + name))
+					b.WriteString(theme.Success().Render(" + " + name))
 					b.WriteString("\n")
 				}
 			}
 			if len(removed) > 0 {
-				b.WriteString(tc.Label.Render("Remove: "))
-				b.WriteString(tc.Red.Render(fmt.Sprintf("-%d (in target, not in backup)", len(removed))))
+				b.WriteString(theme.Dim().Width(14).Render("Remove: "))
+				b.WriteString(theme.Danger().Render(fmt.Sprintf("-%d (in target, not in backup)", len(removed))))
 				b.WriteString("\n")
 				for _, name := range removed {
-					b.WriteString(tc.Red.Render(" - " + name))
+					b.WriteString(theme.Danger().Render(" - " + name))
 					b.WriteString("\n")
 				}
 			}
 		} else if len(common) > 0 {
 			b.WriteString("\n")
-			b.WriteString(tc.Dim.Render(" Backup matches current target"))
+			b.WriteString(theme.Dim().Render(" Backup matches current target"))
 			b.WriteString("\n")
 		}
 	}
@@ -958,30 +1000,30 @@ func (m restoreTUIModel) renderVersionDetail(v backup.BackupVersion) string {
 	// Skill list with descriptions (cap I/O at 20 skills)
 	if len(v.SkillNames) > 0 {
 		b.WriteString("\n")
-		b.WriteString(tc.Separator.Render("── Contents ──────────────────────────"))
+		b.WriteString(theme.Dim().Render("── Contents ──────────────────────────"))
 		b.WriteString("\n")
 		const maxDetail = 20
 		for i, name := range v.SkillNames {
 			if i < maxDetail {
 				desc := readSkillDescription(filepath.Join(v.Dir, name))
 				files := listSkillFiles(filepath.Join(v.Dir, name))
-				b.WriteString(tc.Value.Render(" " + name))
+				b.WriteString(lipgloss.NewStyle().Render(" " + name))
 				b.WriteString("\n")
 				if desc != "" {
-					b.WriteString(tc.Dim.Render(" " + truncateStr(desc, 60)))
+					b.WriteString(theme.Dim().Render(" " + truncateStr(desc, 60)))
 					b.WriteString("\n")
 				}
 				if len(files) > 0 {
-					b.WriteString(tc.Dim.Render(" " + strings.Join(files, " ")))
+					b.WriteString(theme.Dim().Render(" " + strings.Join(files, " ")))
 					b.WriteString("\n")
 				}
 			} else {
-				b.WriteString(tc.Dim.Render(" " + name))
+				b.WriteString(theme.Dim().Render(" " + name))
 				b.WriteString("\n")
 			}
 		}
 		if len(v.SkillNames) > maxDetail {
-			b.WriteString(tc.Dim.Render(fmt.Sprintf(" ... %d skill(s) above shown without details", len(v.SkillNames)-maxDetail)))
+			b.WriteString(theme.Dim().Render(fmt.Sprintf(" ... %d skill(s) above shown without details", len(v.SkillNames)-maxDetail)))
 			b.WriteString("\n")
 		}
 	}
@@ -1005,13 +1047,13 @@ func describeTargetState(path string) string {
 	info, err := os.Lstat(path)
 	if err != nil {
 		if os.IsNotExist(err) {
-			return tc.Yellow.Render("not found")
+			return theme.Warning().Render("not found")
 		}
-		return tc.Red.Render("error")
+		return theme.Danger().Render("error")
 	}
 	if info.Mode()&os.ModeSymlink != 0 {
 		dest, _ := os.Readlink(path)
-		return tc.Cyan.Render("symlink → " + dest)
+		return theme.Accent().Render("symlink → " + dest)
 	}
 	entries, _ := os.ReadDir(path)
 	return fmt.Sprintf("directory (%d items)", len(entries))
diff --git a/cmd/skillshare/search.go b/cmd/skillshare/search.go
index af693c27..d2b15d13 100644
--- a/cmd/skillshare/search.go
+++ b/cmd/skillshare/search.go
@@ -578,11 +578,11 @@ func installFromSearchResult(result search.SearchResult, cfg *config.Config) (er
 	logSummary.InstalledSkills = []string{result.Name}

 	// Reconcile global config with installed skills
-	reg, _ := config.LoadRegistry(cfg.RegistryDir)
-	if reg == nil {
-		reg = &config.Registry{}
+	store, _ := install.LoadMetadataWithMigration(cfg.Source, "")
+	if store == nil {
+		store = install.NewMetadataStore()
 	}
-	if rErr := config.ReconcileGlobalSkills(cfg, reg); rErr != nil {
+	if rErr := config.ReconcileGlobalSkills(cfg, store); rErr != nil {
 		ui.Warning("Failed to reconcile global skills config: %v", rErr)
 	}
diff --git a/cmd/skillshare/search_batch.go b/cmd/skillshare/search_batch.go
index 2f77e28a..9ca004c3 100644
--- a/cmd/skillshare/search_batch.go
+++ b/cmd/skillshare/search_batch.go
@@ -565,11 +565,11 @@ func batchInstallFromSearchWithProgress(selected []search.SearchResult, mode run
 			_ = reconcileProjectRemoteSkills(runtime)
 		}
 	} else {
-		reg, _ := config.LoadRegistry(cfg.RegistryDir)
-		if reg == nil {
-			reg = &config.Registry{}
+		store, _ := install.LoadMetadataWithMigration(cfg.Source, "")
+		if store == nil {
+			store = install.NewMetadataStore()
 		}
-		_ = config.ReconcileGlobalSkills(cfg, reg)
+		_ = config.ReconcileGlobalSkills(cfg, store)
 	}

 	renderBatchSearchInstallSummary(results, mode, time.Since(batchStart))
diff --git a/cmd/skillshare/search_tui.go b/cmd/skillshare/search_tui.go
index 06d6192e..8c470664 100644
--- a/cmd/skillshare/search_tui.go
+++ b/cmd/skillshare/search_tui.go
@@ -7,8 +7,10 @@ import (
 	"github.com/charmbracelet/bubbles/list"
 	"github.com/charmbracelet/bubbles/textinput"
 	tea "github.com/charmbracelet/bubbletea"
+	"github.com/charmbracelet/lipgloss"

 	"skillshare/internal/search"
+	"skillshare/internal/theme"
 )

 // ---------------------------------------------------------------------------
@@ -46,7 +48,7 @@ func (i searchSelectItem) Title() string {
 	}
 	title := check + " " + i.result.Name
 	if i.isHub {
-		badge := formatRiskBadgeLipgloss(i.result.RiskLabel)
+		badge := theme.FormatRiskBadge(i.result.RiskLabel)
 		if badge != "" {
 			title += badge
 		}
@@ -116,7 +118,7 @@ func newSearchSelectModel(results []search.SearchResult, isHub bool) searchSelec
 	l := list.New(items, newPrefixDelegate(true), 0, 0)
 	l.Title = searchSelectTitle(0, len(results))
-	l.Styles.Title = tc.ListTitle
+	l.Styles.Title = theme.Title()
 	l.SetShowStatusBar(false)     // custom status line
 	l.SetFilteringEnabled(false)  // application-level filter
 	l.SetShowHelp(false)
@@ -125,8 +127,8 @@ func newSearchSelectModel(results []search.SearchResult, isHub bool) searchSelec
 	// Filter text input
 	fi := textinput.New()
 	fi.Prompt = "/ "
-	fi.PromptStyle = tc.Filter
-	fi.Cursor.Style = tc.Filter
+	fi.PromptStyle = theme.Accent()
+	fi.Cursor.Style = theme.Accent()

 	return searchSelectModel{
 		list: l,
@@ -319,7 +321,7 @@ func (m searchSelectModel) View() string {
 	}

 	help := "↑↓ navigate ←→ page space toggle a all enter install s search again / filter esc cancel"
-	b.WriteString(tc.Help.Render(help))
+	b.WriteString(theme.Dim().MarginLeft(2).Render(help))
 	b.WriteString("\n")

 	return b.String()
@@ -337,13 +339,13 @@ func (m searchSelectModel) renderSearchFilterBar() string {

 // renderSearchDetailPanel renders the detail section for the selected search result.
 func (m searchSelectModel) renderSearchDetailPanel(r search.SearchResult) string {
 	var b strings.Builder
-	b.WriteString(tc.Separator.Render(" ─────────────────────────────────────────"))
+	b.WriteString(theme.Dim().Render(" ─────────────────────────────────────────"))
 	b.WriteString("\n")

 	row := func(label, value string) {
 		b.WriteString(" ")
-		b.WriteString(tc.Label.Render(label))
-		b.WriteString(tc.Value.Render(value))
+		b.WriteString(theme.Dim().Width(14).Render(label))
+		b.WriteString(lipgloss.NewStyle().Render(value))
 		b.WriteString("\n")
 	}
@@ -365,7 +367,7 @@ func (m searchSelectModel) renderSearchDetailPanel(r search.SearchResult) string
 		indent := strings.Repeat(" ", labelOffset)
 		for _, line := range lines[1:] {
 			b.WriteString(indent)
-			b.WriteString(tc.Value.Render(line))
+			b.WriteString(lipgloss.NewStyle().Render(line))
 			b.WriteString("\n")
 		}
 		b.WriteString("\n")
@@ -381,7 +383,7 @@ func (m searchSelectModel) renderSearchDetailPanel(r search.SearchResult) string
 	// Risk (hub)
 	if m.isHub && r.RiskLabel != "" {
-		row("Risk:", riskLabelStyle(r.RiskLabel).Render(r.RiskLabel))
+		row("Risk:", theme.RiskLabelStyle(r.RiskLabel).Render(r.RiskLabel))
 	}

 	// Tags
@@ -390,7 +392,7 @@ func (m searchSelectModel) renderSearchDetailPanel(r search.SearchResult) string
 		for i, tag := range r.Tags {
 			tags[i] = "#" + tag
 		}
-		row("Tags:", tc.Target.Render(strings.Join(tags, " ")))
+		row("Tags:", theme.Accent().Render(strings.Join(tags, " ")))
 	}

 	return b.String()
diff --git a/cmd/skillshare/status.go b/cmd/skillshare/status.go
index dd4f2c0b..09f09091 100644
--- a/cmd/skillshare/status.go
+++ b/cmd/skillshare/status.go
@@ -11,6 +11,7 @@ import (
 	"skillshare/internal/audit"
 	"skillshare/internal/config"
 	"skillshare/internal/git"
+	"skillshare/internal/resource"
 	"skillshare/internal/skillignore"
 	"skillshare/internal/sync"
 	"skillshare/internal/ui"
@@ -23,6 +24,7 @@ type statusJSONOutput struct {
 	SkillCount   int                `json:"skill_count"`
 	TrackedRepos []statusJSONRepo   `json:"tracked_repos"`
 	Targets      []statusJSONTarget `json:"targets"`
+	Agents       *statusJSONAgents  `json:"agents,omitempty"`
 	Audit        statusJSONAudit    `json:"audit"`
 	Version      string             `json:"version"`
 }
@@ -139,23 +141,21 @@ func cmdStatus(args []string) error {
 	}

 	// JSON mode
+	output := statusJSONOutput{
+		Version: version,
+	}
+
 	discovered, stats, _ := sync.DiscoverSourceSkillsWithStats(cfg.Source)
 	trackedRepos := extractTrackedRepos(discovered)
-	output := statusJSONOutput{
-		Source: statusJSONSource{
-			Path:        cfg.Source,
-			Exists:      dirExists(cfg.Source),
-			Skillignore: buildSkillignoreJSON(stats),
-		},
-		SkillCount: len(discovered),
-		Version:    version,
+	output.Source = statusJSONSource{
+		Path:        cfg.Source,
+		Exists:      dirExists(cfg.Source),
+		Skillignore: buildSkillignoreJSON(stats),
 	}
-
-	// Tracked repos (parallel dirty checks)
+	output.SkillCount = len(discovered)
 	output.TrackedRepos = buildTrackedRepoJSON(cfg.Source, trackedRepos, discovered)

-	// Targets
 	for name, target := range cfg.Targets {
 		sc := target.SkillsConfig()
 		tMode := getTargetMode(sc.Mode, cfg.Mode)
@@ -171,7 +171,6 @@ func cmdStatus(args []string) error {
 		})
 	}

-	// Audit
 	policy := audit.ResolvePolicy(audit.PolicyInputs{
 		ConfigProfile:   cfg.Audit.Profile,
 		ConfigThreshold: cfg.Audit.BlockThreshold,
@@ -185,6 +184,8 @@ func cmdStatus(args []string) error {
 		Analyzers: policy.EffectiveAnalyzers(),
 	}

+	output.Agents = buildAgentStatusJSON(cfg)
+
 	return writeJSON(&output)
 }
@@ -222,6 +223,20 @@ func printSourceStatus(cfg *config.Config, skillCount int, stats *skillignore.Ig
 	ui.Success("%s (%d skills, %s)", cfg.Source, skillCount, info.ModTime().Format("2006-01-02 15:04"))

 	printSkillignoreLine(stats)
+
+	// Agents source
+	agentsSource := cfg.EffectiveAgentsSource()
+	if agentsInfo, agentsErr := os.Stat(agentsSource); agentsErr == nil {
+		agentCount := 0
+		if entries, readErr := os.ReadDir(agentsSource); readErr == nil {
+			for _, e := range entries {
+				if !e.IsDir() && strings.HasSuffix(strings.ToLower(e.Name()), ".md") {
+					agentCount++
+				}
+			}
+		}
+		ui.Success("%s (%d agents, %s)", agentsSource, agentCount, agentsInfo.ModTime().Format("2006-01-02 15:04"))
+	}
 }

 func printSkillignoreLine(stats *skillignore.IgnoreStats) {
@@ -355,12 +370,26 @@ type targetStatusResult struct {
 func printTargetsStatus(cfg *config.Config, discovered []sync.DiscoveredSkill) error {
 	ui.Header("Targets")
+
+	builtinAgents := config.DefaultAgentTargets()
+	agentsSource := cfg.EffectiveAgentsSource()
+	agentsExist := dirExists(agentsSource)
+	var agentCount int
+	if agentsExist {
+		agents, _ := resource.AgentKind{}.Discover(agentsSource)
+		agentCount = len(agents)
+	}
+
 	driftTotal := 0
 	for name, target := range cfg.Targets {
+		// Target name header
+		fmt.Printf("%s%s%s\n", ui.Bold, name, ui.Reset)
+
+		// Skills sub-item
 		sc := target.SkillsConfig()
 		mode := getTargetMode(sc.Mode, cfg.Mode)
 		res := getTargetStatusDetail(target, cfg.Source, mode)
-		ui.Status(name, res.statusStr, res.detail)
+		printTargetSubItem("skills", res.statusStr, res.detail)

 		if mode == "merge" || mode == "copy" {
 			filtered, err := sync.FilterSkills(discovered, sc.Include, sc.Exclude)
@@ -379,6 +408,21 @@ func printTargetsStatus(cfg *config.Config, discovered []sync.DiscoveredSkill) e
 		} else if len(sc.Include) > 0 || len(sc.Exclude) > 0 {
 			ui.Warning("%s: include/exclude ignored in symlink mode", name)
 		}
+
+		// Agents sub-item
+		if agentsExist {
+			agentPath := resolveAgentTargetPath(target, builtinAgents, name)
+			if agentPath != "" {
+				linked := countLinkedAgents(agentPath)
+				agentStatus := "merged"
+				driftLabel := ""
+				if linked != agentCount && agentCount > 0 {
+					agentStatus = "drift"
+					driftLabel = ui.Yellow + " (drift)" + ui.Reset
+				}
+				printTargetSubItem("agents", agentStatus, fmt.Sprintf("[merge] %d/%d linked%s", linked, agentCount, driftLabel))
+			}
+		}
 	}
 	if driftTotal > 0 {
 		ui.Warning("%d skill(s) not synced — run 'skillshare sync'", driftTotal)
@@ -386,6 +430,20 @@ func printTargetsStatus(cfg *config.Config, discovered []sync.DiscoveredSkill) e
 	return nil
 }

+// printTargetSubItem prints an indented sub-item line under a target.
+func printTargetSubItem(kind, status, detail string) {
+	statusColor := ui.Gray
+	switch status {
+	case "merged", "synced", "copied", "linked":
+		statusColor = ui.Green
+	case "drift", "not exist":
+		statusColor = ui.Yellow
+	case "conflict", "broken":
+		statusColor = ui.Red
+	}
+	fmt.Printf("  %-8s %s%-12s%s %s\n", kind, statusColor, status, ui.Reset, ui.Dim+detail+ui.Reset)
+}
+
 func getTargetMode(targetMode, globalMode string) string {
 	if targetMode != "" {
 		return targetMode
@@ -503,7 +561,7 @@ func checkSkillVersion(cfg *config.Config) {
 func printStatusHelp() {
 	fmt.Println(`Usage: skillshare status [options]

-Show status of source, skills, and all targets.
+Show status of source, skills, agents, and all targets.

 Options:
   --json        Output results as JSON
@@ -512,7 +570,7 @@ Options:
   --help, -h    Show this help

 Examples:
-  skillshare status         Show current state
+  skillshare status         Show current state (skills + agents)
   skillshare status --json  Output as JSON
   skillshare status -p      Show project status`)
 }
diff --git a/cmd/skillshare/status_agents.go b/cmd/skillshare/status_agents.go
new file mode 100644
index 00000000..3cff14e6
--- /dev/null
+++ b/cmd/skillshare/status_agents.go
@@ -0,0 +1,65 @@
+package main
+
+import (
+	"skillshare/internal/config"
+	"skillshare/internal/resource"
+)
+
+// statusJSONAgents is the agent section of status --json output.
+type statusJSONAgents struct {
+	Source  string                  `json:"source"`
+	Exists  bool                    `json:"exists"`
+	Count   int                     `json:"count"`
+	Targets []statusJSONAgentTarget `json:"targets,omitempty"`
+}
+
+type statusJSONAgentTarget struct {
+	Name     string `json:"name"`
+	Path     string `json:"path"`
+	Expected int    `json:"expected"`
+	Linked   int    `json:"linked"`
+	Drift    bool   `json:"drift"`
+}
+
+// buildAgentStatusJSON builds the agents section for status --json output.
+func buildAgentStatusJSON(cfg *config.Config) *statusJSONAgents {
+	agentsSource := cfg.EffectiveAgentsSource()
+	exists := dirExists(agentsSource)
+
+	result := &statusJSONAgents{
+		Source: agentsSource,
+		Exists: exists,
+	}
+
+	if !exists {
+		return result
+	}
+
+	agents, _ := resource.AgentKind{}.Discover(agentsSource)
+	result.Count = len(agents)
+
+	builtinAgents := config.DefaultAgentTargets()
+	for name := range cfg.Targets {
+		agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name)
+		if agentPath == "" {
+			continue
+		}
+
+		linked := countLinkedAgents(agentPath)
+		result.Targets = append(result.Targets, statusJSONAgentTarget{
+			Name:     name,
+			Path:     agentPath,
+			Expected: len(agents),
+			Linked:   linked,
+			Drift:    linked != len(agents) && len(agents) > 0,
+		})
+	}
+
+	return result
+}
+
+// countLinkedAgents counts healthy .md symlinks in the target agent directory.
+func countLinkedAgents(targetDir string) int {
+	linked, _ := countAgentLinksAndBroken(targetDir)
+	return linked
+}
diff --git a/cmd/skillshare/status_project.go b/cmd/skillshare/status_project.go
index 3759a241..e6853b9f 100644
--- a/cmd/skillshare/status_project.go
+++ b/cmd/skillshare/status_project.go
@@ -9,6 +9,7 @@ import (
 	"skillshare/internal/audit"
 	"skillshare/internal/config"
 	"skillshare/internal/git"
+	"skillshare/internal/resource"
 	"skillshare/internal/skillignore"
 	"skillshare/internal/sync"
 	"skillshare/internal/ui"
@@ -34,7 +35,7 @@ func cmdStatusProject(root string) error {
 	trackedRepos := extractTrackedRepos(discovered)
 	sp.Stop()

-	printProjectSourceStatus(runtime.sourcePath, len(discovered), stats)
+	printProjectSourceStatus(runtime.sourcePath, runtime.agentsSourcePath, len(discovered), stats)
 	printProjectTrackedReposStatus(runtime.sourcePath, discovered, trackedRepos)
 	if err := printProjectTargetsStatus(runtime, discovered); err != nil {
 		return err
@@ -42,7 +43,7 @@ func cmdStatusProject(root string) error {

 	// Extras
 	if len(runtime.config.Extras) > 0 {
-		ui.Header("Extras (project)")
+		ui.Header("Extras")
 		printExtrasStatus(runtime.config.Extras, func(extra config.ExtraConfig) string {
 			return config.ExtrasSourceDirProject(root, extra.Name)
 		})
@@ -65,23 +66,21 @@ func cmdStatusProjectJSON(root string) error {
 		return writeJSONError(err)
 	}

+	output := statusJSONOutput{
+		Version: version,
+	}
+
 	discovered, stats, _ := sync.DiscoverSourceSkillsWithStats(runtime.sourcePath)
 	trackedRepos := extractTrackedRepos(discovered)
-	output := statusJSONOutput{
-		Source: statusJSONSource{
-			Path:        runtime.sourcePath,
-			Exists:      dirExists(runtime.sourcePath),
-			Skillignore: buildSkillignoreJSON(stats),
-		},
-		SkillCount: len(discovered),
-		Version:    version,
+	output.Source = statusJSONSource{
+		Path:        runtime.sourcePath,
+		Exists:      dirExists(runtime.sourcePath),
+		Skillignore: buildSkillignoreJSON(stats),
 	}
-
-	// Tracked repos (parallel dirty checks)
+	output.SkillCount = len(discovered)
 	output.TrackedRepos = buildTrackedRepoJSON(runtime.sourcePath, trackedRepos, discovered)

-	// Targets
 	for _, entry := range runtime.config.Targets {
 		target, ok := runtime.targets[entry.Name]
 		if !ok {
@@ -104,7 +103,6 @@ func cmdStatusProjectJSON(root string) error {
 		})
 	}

-	// Audit
 	policy := audit.ResolvePolicy(audit.PolicyInputs{
 		ConfigProfile:   runtime.config.Audit.Profile,
 		ConfigThreshold: runtime.config.Audit.BlockThreshold,
@@ -118,10 +116,59 @@ func cmdStatusProjectJSON(root string) error {
 		Analyzers: policy.EffectiveAnalyzers(),
 	}

+	output.Agents = buildProjectAgentStatusJSON(runtime)
+
 	return writeJSON(&output)
 }

-func printProjectSourceStatus(sourcePath string, skillCount int, stats *skillignore.IgnoreStats) {
+// buildProjectAgentStatusJSON builds the agents section for project status --json.
+func buildProjectAgentStatusJSON(rt *projectRuntime) *statusJSONAgents {
+	exists := dirExists(rt.agentsSourcePath)
+	result := &statusJSONAgents{
+		Source: rt.agentsSourcePath,
+		Exists: exists,
+	}
+
+	if !exists {
+		return result
+	}
+
+	agents, _ := resource.AgentKind{}.Discover(rt.agentsSourcePath)
+	result.Count = len(agents)
+
+	builtinAgents := config.ProjectAgentTargets()
+	for _, entry := range rt.config.Targets {
+		agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, rt.root)
+		if agentPath == "" {
+			continue
+		}
+
+		linked := countLinkedAgents(agentPath)
+		result.Targets = append(result.Targets, statusJSONAgentTarget{
+			Name:     entry.Name,
+			Path:     agentPath,
+			Expected: len(agents),
+			Linked:   linked,
+			Drift:    linked != len(agents) && len(agents) > 0,
+		})
+	}
+
+	return result
+}
+
+// resolveProjectAgentTargetPath resolves the agent path for a project target entry.
+func resolveProjectAgentTargetPath(entry config.ProjectTargetEntry, builtinAgents map[string]config.TargetConfig, projectRoot string) string {
+	ac := entry.AgentsConfig()
+	if ac.Path != "" {
+		return resolveProjectPath(projectRoot, ac.Path)
+	}
+	if builtin, ok := builtinAgents[entry.Name]; ok {
+		return resolveProjectPath(projectRoot, builtin.Path)
+	}
+	return ""
+}
+
+func printProjectSourceStatus(sourcePath, agentsSourcePath string, skillCount int, stats *skillignore.IgnoreStats) {
 	ui.Header("Source (project)")
 	info, err := os.Stat(sourcePath)
 	if err != nil {
@@ -131,6 +178,15 @@ func printProjectSourceStatus(sourcePath string, skillCount int, stats *skillign
 	ui.Success(".skillshare/skills/ (%d skills, %s)", skillCount, info.ModTime().Format("2006-01-02 15:04"))

 	printSkillignoreLine(stats)
+
+	// Agents source
+	if agentsInfo, agentsErr := os.Stat(agentsSourcePath); agentsErr == nil {
+		agentCount := 0
+		if agents, discoverErr := (resource.AgentKind{}).Discover(agentsSourcePath); discoverErr == nil {
+			agentCount = len(agents)
+		}
+		ui.Success(".skillshare/agents/ (%d agents, %s)", agentCount, agentsInfo.ModTime().Format("2006-01-02 15:04"))
+	}
 }

 func printProjectTrackedReposStatus(sourcePath string, discovered []sync.DiscoveredSkill, trackedRepos []string) {
@@ -161,7 +217,16 @@ func printProjectTrackedReposStatus(sourcePath string, discovered []sync.Discove
 }

 func printProjectTargetsStatus(runtime *projectRuntime, discovered []sync.DiscoveredSkill) error {
-	ui.Header("Targets (project)")
+	ui.Header("Targets")
+
+	builtinAgents := config.ProjectAgentTargets()
+	agentsExist := dirExists(runtime.agentsSourcePath)
+	var agentCount int
+	if agentsExist {
+		agents, _ := (resource.AgentKind{}).Discover(runtime.agentsSourcePath)
+		agentCount = len(agents)
+	}
+
 	driftTotal := 0
 	for _, entry := range runtime.config.Targets {
 		target, ok := runtime.targets[entry.Name]
@@ -170,6 +235,10 @@ func printProjectTargetsStatus(runtime *projectRuntime, discovered []sync.Discov
 			continue
 		}

+		// Target name header
+		fmt.Printf("%s%s%s\n", ui.Bold, entry.Name, ui.Reset)
+
+		// Skills sub-item
 		sc := target.SkillsConfig()
 		mode := sc.Mode
 		if mode == "" {
@@ -177,7 +246,7 @@ func printProjectTargetsStatus(runtime *projectRuntime, discovered []sync.Discov
 		}
 		res := getTargetStatusDetail(target, runtime.sourcePath, mode)
-		ui.Status(entry.Name, res.statusStr, res.detail)
+		printTargetSubItem("skills", res.statusStr, res.detail)

 		if mode == "merge" || mode == "copy" {
 			filtered, err := sync.FilterSkills(discovered, sc.Include, sc.Exclude)
@@ -196,6 +265,21 @@ func printProjectTargetsStatus(runtime *projectRuntime, discovered []sync.Discov
 		} else if len(sc.Include) > 0 || len(sc.Exclude) > 0 {
 			ui.Warning("%s: include/exclude ignored in symlink mode", entry.Name)
 		}
+
+		// Agents sub-item
+		if agentsExist {
+			agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, runtime.root)
+			if agentPath != "" {
+				linked := countLinkedAgents(agentPath)
+				agentStatus := "merged"
+				driftLabel := ""
+				if linked != agentCount && agentCount > 0 {
+					agentStatus = "drift"
+					driftLabel = ui.Yellow + " (drift)" + ui.Reset
+				}
+				printTargetSubItem("agents", agentStatus, fmt.Sprintf("[merge] %d/%d linked%s", linked, agentCount, driftLabel))
+			}
+		}
 	}
 	if driftTotal > 0 {
 		ui.Warning("%d skill(s) not synced — run 'skillshare sync'", driftTotal)
diff --git a/cmd/skillshare/sync.go b/cmd/skillshare/sync.go
index 354d9a0b..559d0bf0 100644
--- a/cmd/skillshare/sync.go
+++ b/cmd/skillshare/sync.go
@@ -100,6 +100,9 @@ func cmdSync(args []string) error {

 	applyModeLabel(mode)

+	// Extract kind filter (e.g. "skillshare sync agents").
+	kind, rest := parseKindArg(rest)
+
 	dryRun, force, jsonOutput := parseSyncFlags(rest)

 	prevDiagOutput := sync.DiagOutput
@@ -111,25 +114,39 @@ func cmdSync(args []string) error {
 	}

 	if mode == modeProject {
+		// Agent-only project sync
+		if kind == kindAgents {
+			return syncAgentsProject(cwd, dryRun, force, jsonOutput, start)
+		}
+
 		if hasAll && !jsonOutput {
 			// Run project extras sync after project skills sync (text mode)
 			defer func() {
-				fmt.Println()
 				if extrasErr := cmdSyncExtras(append([]string{"-p"}, rest...)); extrasErr != nil {
 					ui.Warning("Extras sync: %v", extrasErr)
 				}
 			}()
 		}
+
 		stats, results, projIgnoreStats, err := cmdSyncProject(cwd, dryRun, force, jsonOutput)
 		stats.ProjectScope = true
 		logSyncOp(config.ProjectConfigPath(cwd), stats, start, err)
+
+		// Append agent sync when kind=all or --all
+		if kind == kindAll || hasAll {
+			if agentErr := syncAgentsProject(cwd, dryRun, force, jsonOutput, start); agentErr != nil && err == nil {
+				err = agentErr
+			}
+		}
+
 		if jsonOutput {
 			if hasAll {
 				projCfg, loadErr := config.LoadProject(cwd)
 				if loadErr == nil && len(projCfg.Extras) > 0 {
+					agentPaths := collectAgentTargetPathsProject(cwd)
 					extrasEntries := runExtrasSyncEntries(projCfg.Extras, func(extra config.ExtraConfig) string {
 						return config.ExtrasSourceDirProject(cwd, extra.Name)
-					}, dryRun, force, cwd)
+					}, dryRun, force, cwd, agentPaths)
 					return syncOutputJSON(results, dryRun, start, projIgnoreStats, err, extrasEntries)
 				}
 			}
@@ -160,7 +177,14 @@ func cmdSync(args []string) error {
 		}
 	}

-	// Phase 1: Discovery
+	// Agent-only mode: skip skill discovery/sync entirely
+	if kind == kindAgents {
+		_, agentErr := syncAgentsGlobal(cfg, dryRun, force, jsonOutput, start)
+		logSyncOp(config.ConfigPath(), syncLogStats{DryRun: dryRun, Force: force}, start, agentErr)
+		return agentErr
+	}
+
+	// Phase 1: Discovery (skills)
 	var spinner *ui.Spinner
 	if !jsonOutput {
 		spinner = ui.StartSpinner("Discovering skills")
@@ -253,17 +277,24 @@ func cmdSync(args []string) error {

 	if jsonOutput {
 		if hasAll && len(cfg.Extras) > 0 {
+			agentPaths := collectAgentTargetPathsGlobal(cfg)
 			extrasEntries := runExtrasSyncEntries(cfg.Extras, func(extra config.ExtraConfig) string {
 				return config.ResolveExtrasSourceDir(extra, cfg.ExtrasSource, cfg.Source)
-			}, dryRun, force, "")
+			}, dryRun, force, "", agentPaths)
 			return syncOutputJSON(results, dryRun, start, ignoreStats, syncErr, extrasEntries)
 		}
 		return syncOutputJSON(results, dryRun, start, ignoreStats, syncErr)
 	}

+	// Agent sync when kind=all or --all (after skill sync)
+	if kind == kindAll || hasAll {
+		if _, agentErr := syncAgentsGlobal(cfg, dryRun, force, jsonOutput, start); agentErr != nil && syncErr == nil {
+			syncErr = agentErr
+		}
+	}
+
 	if hasAll {
-		fmt.Println()
-		if extrasErr := cmdSyncExtras(rest); extrasErr != nil {
+		if extrasErr := cmdSyncExtras(append([]string{"-g"}, rest...)); extrasErr != nil {
 			ui.Warning("Extras sync: %v", extrasErr)
 		}
 	}
@@ -408,6 +439,25 @@ func backupTargetsBeforeSync(cfg *config.Config) {
 			ui.Success("%s -> %s", name, backupPath)
 		}
 	}
+
+	// Also backup agent targets if any exist.
+	backupDir, agentTargets, err := resolveGlobalAgentBackupContextFromCfg(cfg)
+	if err != nil || len(agentTargets) == 0 {
+		return
+	}
+	for _, at := range agentTargets {
+		entryName := at.name + "-agents"
+		bp, bErr := backup.CreateInDir(backupDir, entryName, at.agentPath)
+		if bErr != nil {
+			ui.Warning("Failed to backup %s: %v", entryName, bErr)
+		} else if bp != "" {
+			if !backedUp {
+				ui.Header("Backing up")
+				backedUp = true
+			}
+			ui.Success("%s -> %s", entryName, bp)
+		}
+	}
 }

 func syncTarget(name string, target config.TargetConfig, cfg *config.Config, dryRun, force bool) error {
@@ -766,12 +816,12 @@ func syncSymlinkMode(name string, target config.TargetConfig, source string, dry
 }

 func printSyncHelp() {
-	fmt.Println(`Usage: skillshare sync [options]
+	fmt.Println(`Usage: skillshare sync [agents] [options]

 Sync skills from source to all configured targets.

 Options:
-  --all          Sync skills and extras
+  --all          Sync skills, agents, and extras
   --dry-run, -n  Preview changes without applying
   --force, -f    Force sync (overwrite local changes)
   --json         Output results as JSON
@@ -785,6 +835,7 @@ Subcommands:

 Examples:
   skillshare sync            Sync skills to all targets
   skillshare sync --dry-run  Preview sync changes
-  skillshare sync --all      Sync skills and extras
-  skillshare sync -p         Sync project-level skills`)
+  skillshare sync --all      Sync skills, agents, and extras
+  skillshare sync -p         Sync project-level skills
+  skillshare sync agents     Sync agents only`)
 }
diff --git a/cmd/skillshare/sync_agents.go b/cmd/skillshare/sync_agents.go
new file mode 100644
index 00000000..3f58ce6c
--- /dev/null
+++ b/cmd/skillshare/sync_agents.go
@@ -0,0 +1,360 @@
+package main
+
+import (
+	"fmt"
+	"os"
+	"path/filepath"
+	"sort"
+	"strings"
+	"time"
+
+	"skillshare/internal/backup"
+	"skillshare/internal/config"
+	"skillshare/internal/resource"
+	"skillshare/internal/sync"
+	"skillshare/internal/ui"
+)
+
+// agentSyncStats aggregates per-target agent sync results.
+type agentSyncStats struct {
+	linked, local, updated, pruned int
+}
+
+// syncAgentsGlobal discovers agents and syncs them to all agent-capable targets.
+// Returns total stats and any error.
+func syncAgentsGlobal(cfg *config.Config, dryRun, force, jsonOutput bool, start time.Time) (agentSyncStats, error) {
+	agentsSource := cfg.EffectiveAgentsSource()
+
+	// Check agent source exists
+	if _, err := os.Stat(agentsSource); err != nil {
+		if os.IsNotExist(err) {
+			if !jsonOutput {
+				ui.Info("No agents source directory (%s)", agentsSource)
+			}
+			return agentSyncStats{}, nil
+		}
+		return agentSyncStats{}, fmt.Errorf("cannot access agents source: %w", err)
+	}
+
+	// Discover agents (excludes disabled from sync)
+	allAgents, err := resource.AgentKind{}.Discover(agentsSource)
+	if err != nil {
+		return agentSyncStats{}, fmt.Errorf("cannot discover agents: %w", err)
+	}
+	agents := resource.ActiveAgents(allAgents)
+
+	if !jsonOutput {
+		ui.Header("Syncing agents")
+		if len(agents) == 0 {
+			ui.Info("No agents found in %s; pruning synced target entries only", agentsSource)
+		}
+		if dryRun {
+			ui.Warning("Dry run mode - no changes will be made")
+		}
+	}
+
+	// Backup agent targets before sync (non-dry-run only).
+	if !dryRun && !jsonOutput {
+		backupDir, agentTargets, _ := resolveGlobalAgentBackupContextFromCfg(cfg)
+		backedUp := false
+		for _, at := range agentTargets {
+			entryName := at.name + "-agents"
+			bp, bErr := backup.CreateInDir(backupDir, entryName, at.agentPath)
+			if bErr != nil {
+				ui.Warning("Failed to backup %s: %v", entryName, bErr)
+			} else if bp != "" {
+				if !backedUp {
+					ui.Header("Backing up")
+					backedUp = true
+				}
+				ui.Success("%s -> %s", entryName, bp)
+			}
+		}
+	}
+
+	// Resolve agent-capable targets: user config agents sub-key + built-in defaults
+	builtinAgents := config.DefaultAgentTargets()
+	var totals agentSyncStats
+	var syncErr error
+	var skippedTargets []string
+	var targetCount int
+
+	for name := range cfg.Targets {
+		agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name)
+		if agentPath == "" {
+			skippedTargets = append(skippedTargets, name)
+			continue
+		}
+		targetCount++
+
+		tc := cfg.Targets[name]
+		ac := tc.AgentsConfig()
+		filtered, filterErr := sync.FilterAgents(agents, ac.Include, ac.Exclude)
+		if filterErr != nil {
+			if !jsonOutput {
+				ui.Error("%s: invalid agent filter: %v", name, filterErr)
+			}
+			syncErr = fmt.Errorf("some agent targets failed to sync")
+			continue
+		}
+		stats, targetErr := syncAgentTarget(name, agentPath, ac.Mode, filtered, agentsSource, dryRun, force, jsonOutput)
+		if targetErr != nil {
+			syncErr = fmt.Errorf("some agent targets failed to sync")
+		}
+		totals.linked += stats.linked
+		totals.local += stats.local
+		totals.updated += stats.updated
+		totals.pruned += stats.pruned
+	}
+
+	if !jsonOutput {
+		ui.AgentSyncSummary(ui.AgentSyncStats{
+			Targets:  targetCount,
+			Linked:   totals.linked,
+			Local:    totals.local,
+			Updated:  totals.updated,
+			Pruned:   totals.pruned,
+			Duration: time.Since(start),
+		})
+		if len(skippedTargets) > 0 {
+			sort.Strings(skippedTargets)
+			ui.Warning("%d target(s) skipped for agents (no agents path): %s",
+				len(skippedTargets), strings.Join(skippedTargets, ", "))
+		}
+	}
+
+	return totals, syncErr
+}
+
+// resolveAgentTargetPath returns the effective agent path for a target,
+// checking user config first, then built-in defaults. Returns "" if none.
+func resolveAgentTargetPath(tc config.TargetConfig, builtinAgents map[string]config.TargetConfig, name string) string {
+	if ac := tc.AgentsConfig(); ac.Path != "" {
+		return config.ExpandPath(ac.Path)
+	}
+	if builtin, ok := builtinAgents[name]; ok {
+		return config.ExpandPath(builtin.Path)
+	}
+	return ""
+}
+
+// syncAgentsProject syncs agents for project mode using .skillshare/agents/ as source
+// and project-level target agent paths.
+func syncAgentsProject(projectRoot string, dryRun, force, jsonOutput bool, start time.Time) error {
+	agentsSource := filepath.Join(projectRoot, ".skillshare", "agents")
+
+	if _, err := os.Stat(agentsSource); err != nil {
+		if os.IsNotExist(err) {
+			if !jsonOutput {
+				ui.Info("No project agents directory (%s)", agentsSource)
+			}
+			return nil
+		}
+		return fmt.Errorf("cannot access project agents: %w", err)
+	}
+
+	allAgents, err := resource.AgentKind{}.Discover(agentsSource)
+	if err != nil {
+		return fmt.Errorf("cannot discover project agents: %w", err)
+	}
+	agents := resource.ActiveAgents(allAgents)
+
+	if !jsonOutput {
+		ui.Header("Syncing agents (project)")
+		if len(agents) == 0 {
+			ui.Info("No project agents found; pruning synced target entries only")
+		}
+		if dryRun {
+			ui.Warning("Dry run mode - no changes will be made")
+		}
+	}
+
+	// Load project config for target list
+	projCfg, loadErr := config.LoadProject(projectRoot)
+	if loadErr != nil {
+		return fmt.Errorf("cannot load project config: %w", loadErr)
+	}
+
+	builtinAgents := config.ProjectAgentTargets()
+
+	// Backup agent targets before sync (non-dry-run only).
+	if !dryRun && !jsonOutput {
+		backupDir := filepath.Join(projectRoot, ".skillshare", "backups")
+		backedUp := false
+		for _, entry := range projCfg.Targets {
+			agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, projectRoot)
+			if agentPath == "" {
+				continue
+			}
+			entryName := entry.Name + "-agents"
+			bp, bErr := backup.CreateInDir(backupDir, entryName, agentPath)
+			if bErr != nil {
+				ui.Warning("Failed to backup %s: %v", entryName, bErr)
+			} else if bp != "" {
+				if !backedUp {
+					ui.Header("Backing up")
+					backedUp = true
+				}
+				ui.Success("%s -> %s", entryName, bp)
+			}
+		}
+	}
+
+	var totals agentSyncStats
+	var syncErr error
+	var skippedTargets []string
+	var targetCount int
+
+	for _, entry := range projCfg.Targets {
+		agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, projectRoot)
+		if agentPath == "" {
+			skippedTargets = append(skippedTargets, entry.Name)
+			continue
+		}
+		targetCount++
+
+		ac := entry.AgentsConfig()
+		filtered, filterErr := sync.FilterAgents(agents, ac.Include, ac.Exclude)
+		if filterErr != nil {
+			if !jsonOutput {
+				ui.Error("%s: invalid agent filter: %v", entry.Name, filterErr)
+			}
+			syncErr = fmt.Errorf("some agent targets failed to sync")
+			continue
+		}
+		stats, targetErr := syncAgentTarget(entry.Name, agentPath, ac.Mode, filtered, agentsSource, dryRun, force, jsonOutput)
+		if targetErr != nil {
+			syncErr = fmt.Errorf("some agent targets failed to sync")
+		}
+		totals.linked += stats.linked
+		totals.local += stats.local
+		totals.updated += stats.updated
+		totals.pruned += stats.pruned
+	}
+
+	if !jsonOutput {
+		ui.AgentSyncSummary(ui.AgentSyncStats{
+			Targets:  targetCount,
+			Linked:   totals.linked,
+			Local:    totals.local,
+			Updated:  totals.updated,
+			Pruned:   totals.pruned,
+			Duration: time.Since(start),
+		})
+		if len(skippedTargets) > 0 {
+			sort.Strings(skippedTargets)
+			ui.Warning("%d target(s) skipped for agents (no agents path): %s",
+				len(skippedTargets), strings.Join(skippedTargets, ", "))
+		}
+	}
+
+	return syncErr
+}
+
+// syncAgentTarget syncs agents to a single target directory.
+// Shared by both global and project sync paths.
+func syncAgentTarget(name, agentPath, modeOverride string, agents []resource.DiscoveredResource, agentsSource string, dryRun, force, jsonOutput bool) (agentSyncStats, error) {
+	mode := modeOverride
+	if mode == "" {
+		mode = "merge"
+	}
+
+	result, err := sync.SyncAgents(agents, agentsSource, agentPath, mode, dryRun, force)
+	if err != nil {
+		if !jsonOutput {
+			ui.Error("%s: agent sync failed: %v", name, err)
+		}
+		return agentSyncStats{}, err
+	}
+
+	var pruned []string
+	switch mode {
+	case "copy":
+		pruned, _ = sync.PruneOrphanAgentCopies(agentPath, agents, dryRun)
+	case "merge":
+		pruned, _ = sync.PruneOrphanAgentLinks(agentPath, agents, dryRun)
+	}
+
+	stats := agentSyncStats{
+		linked:  len(result.Linked),
+		local:   len(result.Skipped),
+		updated: len(result.Updated),
+		pruned:  len(pruned),
+	}
+
+	if !jsonOutput {
+		reportAgentSyncResult(name, mode, stats, dryRun)
+	}
+
+	return stats, nil
+}
+
+// reportAgentSyncResult prints per-target agent sync status.
+func reportAgentSyncResult(name, mode string, stats agentSyncStats, dryRun bool) {
+	if stats.linked > 0 || stats.updated > 0 || stats.pruned > 0 {
+		ui.Success("%s: agents %s (%d linked, %d local, %d updated, %d pruned)",
+			name, mode, stats.linked, stats.local, stats.updated, stats.pruned)
+	} else if stats.local > 0 {
+		ui.Success("%s: agents %s (%d local preserved)", name, mode, stats.local)
+	} else {
+		ui.Success("%s: agents %s (up to date)", name, mode)
+	}
+}
+
+// collectAgentTargetPathsGlobal returns the set of resolved agent target paths
+// for all targets in the global config. Returns nil when agents source does not
+// exist or contains no agent files (meaning no real agent sync would happen).
+func collectAgentTargetPathsGlobal(cfg *config.Config) map[string]bool {
+	agentsSource := cfg.EffectiveAgentsSource()
+	if _, err := os.Stat(agentsSource); err != nil {
+		return nil
+	}
+	agents, err := resource.AgentKind{}.Discover(agentsSource)
+	if err != nil || len(agents) == 0 {
+		return nil
+	}
+
+	builtinAgents := config.DefaultAgentTargets()
+	paths := make(map[string]bool)
+	for name := range cfg.Targets {
+		agentPath := resolveAgentTargetPath(cfg.Targets[name], builtinAgents, name)
+		if agentPath != "" {
+			paths[filepath.Clean(agentPath)] = true
+		}
+	}
+	if len(paths) == 0 {
+		return nil
+	}
+	return paths
+}
+
+// collectAgentTargetPathsProject returns the set of resolved agent target paths
+// for all targets in the project config. Returns nil when no agents exist.
+func collectAgentTargetPathsProject(projectRoot string) map[string]bool {
+	agentsSource := filepath.Join(projectRoot, ".skillshare", "agents")
+	if _, err := os.Stat(agentsSource); err != nil {
+		return nil
+	}
+	agents, err := resource.AgentKind{}.Discover(agentsSource)
+	if err != nil || len(agents) == 0 {
+		return nil
+	}
+
+	projCfg, err := config.LoadProject(projectRoot)
+	if err != nil {
+		return nil
+	}
+
+	builtinAgents := config.ProjectAgentTargets()
+	paths := make(map[string]bool)
+	for _, entry := range projCfg.Targets {
+		agentPath := resolveProjectAgentTargetPath(entry, builtinAgents, projectRoot)
+		if agentPath != "" {
+			paths[filepath.Clean(agentPath)] = true
+		}
+	}
+	if len(paths) == 0 {
+		return nil
+	}
+	return paths
+}
diff --git a/cmd/skillshare/sync_extras.go b/cmd/skillshare/sync_extras.go
index 521897e3..f7084e85 100644
--- a/cmd/skillshare/sync_extras.go
+++ b/cmd/skillshare/sync_extras.go
@@ -13,6 +13,9 @@ import (
 	"skillshare/internal/ui"
 )

+// extrasAgentsName is the extras entry name that may overlap with the agents sync system.
+const extrasAgentsName = "agents" + type syncExtrasJSONOutput struct { Extras []syncExtrasJSONEntry `json:"extras"` Duration string `json:"duration"` @@ -24,13 +27,14 @@ type syncExtrasJSONEntry struct { } type syncExtrasJSONTarget struct { - Path string `json:"path"` - Mode string `json:"mode"` - Synced int `json:"synced"` - Skipped int `json:"skipped"` - Pruned int `json:"pruned"` - Error string `json:"error,omitempty"` - Warnings []string `json:"warnings,omitempty"` + Path string `json:"path"` + Mode string `json:"mode"` + Synced int `json:"synced"` + Skipped int `json:"skipped"` + Pruned int `json:"pruned"` + Error string `json:"error,omitempty"` + Warnings []string `json:"warnings,omitempty"` + SkippedBy string `json:"skipped_by,omitempty"` } func cmdSyncExtras(args []string) error { @@ -99,11 +103,20 @@ func cmdSyncExtrasGlobal(dryRun, force, jsonOutput bool, start time.Time) error ui.Warning("Dry run mode - no changes will be made") } - var totalSynced, totalSkipped, totalPruned, totalErrors int + // Detect overlap between extras "agents" and the agents sync system + var agentTargetPaths map[string]bool + for _, extra := range cfg.Extras { + if extra.Name == extrasAgentsName { + agentTargetPaths = collectAgentTargetPathsGlobal(cfg) + break + } + } + + var totalSynced, totalSkipped, totalPruned, totalErrors, totalTargets int var jsonEntries []syncExtrasJSONEntry if !jsonOutput { - ui.Header(ui.WithModeLabel("Sync Extras")) + ui.Header(ui.WithModeLabel("Syncing extras")) } for _, extra := range cfg.Extras { @@ -128,11 +141,24 @@ func cmdSyncExtrasGlobal(dryRun, force, jsonOutput bool, start time.Time) error jsonEntry := syncExtrasJSONEntry{Name: extra.Name} for _, target := range extra.Targets { + totalTargets++ mode := target.Mode if mode == "" { mode = "merge" } targetPath := config.ExpandPath(target.Path) + + // Skip extras "agents" targets that overlap with the agents sync system + if extra.Name == extrasAgentsName && 
isExtrasTargetOverlappingAgents(targetPath, agentTargetPaths) { + if !jsonOutput { + ui.Warning("Skipping extras %q target %s — already managed by agents sync", extra.Name, shortenPath(targetPath)) + } + jsonEntry.Targets = append(jsonEntry.Targets, syncExtrasJSONTarget{ + Path: target.Path, Mode: mode, SkippedBy: extrasAgentsName, + }) + continue + } + result, syncErr := sync.SyncExtra(extraSource, targetPath, mode, dryRun, force, target.Flatten, "") shortTarget := shortenPath(targetPath) @@ -143,7 +169,7 @@ func cmdSyncExtrasGlobal(dryRun, force, jsonOutput bool, start time.Time) error if syncErr != nil { if !jsonOutput { - ui.Warning(" %s: %v", shortTarget, syncErr) + ui.Warning("%s: %v", shortTarget, syncErr) } jsonTarget.Error = syncErr.Error() jsonEntry.Targets = append(jsonEntry.Targets, jsonTarget) @@ -173,11 +199,11 @@ func cmdSyncExtrasGlobal(dryRun, force, jsonOutput bool, start time.Time) error if result.Pruned > 0 { parts = append(parts, fmt.Sprintf("%d pruned", result.Pruned)) } - ui.Success(" %s %s (%s)", shortTarget, strings.Join(parts, ", "), mode) + ui.Success("%s %s (%s)", shortTarget, strings.Join(parts, ", "), mode) } else if result.Skipped > 0 { - ui.Warning(" %s %d files skipped (use --force to override)", shortTarget, result.Skipped) + ui.Warning("%s %d files skipped (use --force to override)", shortTarget, result.Skipped) } else { - ui.Success(" %s up to date (%s)", shortTarget, mode) + ui.Success("%s up to date (%s)", shortTarget, mode) } for _, e := range result.Errors { @@ -217,6 +243,14 @@ func cmdSyncExtrasGlobal(dryRun, force, jsonOutput bool, start time.Time) error return writeJSON(&output) } + ui.ExtrasSyncSummary(ui.ExtrasSyncStats{ + Targets: totalTargets, + Synced: totalSynced, + Skipped: totalSkipped, + Pruned: totalPruned, + Duration: time.Since(start), + }) + if totalErrors > 0 { return fmt.Errorf("%d extras sync error(s)", totalErrors) } @@ -245,11 +279,20 @@ func cmdSyncExtrasProject(cwd string, dryRun, force, jsonOutput 
bool, start time ui.Warning("Dry run mode - no changes will be made") } - var totalSynced, totalSkipped, totalPruned, totalErrors int + // Detect overlap between extras "agents" and the agents sync system + var agentTargetPaths map[string]bool + for _, extra := range projCfg.Extras { + if extra.Name == extrasAgentsName { + agentTargetPaths = collectAgentTargetPathsProject(cwd) + break + } + } + + var totalSynced, totalSkipped, totalPruned, totalErrors, totalTargets int var jsonEntries []syncExtrasJSONEntry if !jsonOutput { - ui.Header(ui.WithModeLabel("Sync Extras")) + ui.Header(ui.WithModeLabel("Syncing extras")) } for _, extra := range projCfg.Extras { @@ -269,15 +312,24 @@ func cmdSyncExtrasProject(cwd string, dryRun, force, jsonOutput bool, start time jsonEntry := syncExtrasJSONEntry{Name: extra.Name} for _, target := range extra.Targets { + totalTargets++ mode := target.Mode if mode == "" { mode = "merge" } // Expand ~ and resolve relative paths against project root - targetPath := config.ExpandPath(target.Path) - if !filepath.IsAbs(targetPath) { - targetPath = filepath.Join(cwd, targetPath) + targetPath := resolveProjectPath(cwd, target.Path) + + // Skip extras "agents" targets that overlap with the agents sync system + if extra.Name == extrasAgentsName && isExtrasTargetOverlappingAgents(targetPath, agentTargetPaths) { + if !jsonOutput { + ui.Warning("Skipping extras %q target %s — already managed by agents sync", extra.Name, shortenPath(targetPath)) + } + jsonEntry.Targets = append(jsonEntry.Targets, syncExtrasJSONTarget{ + Path: target.Path, Mode: mode, SkippedBy: extrasAgentsName, + }) + continue } result, syncErr := sync.SyncExtra(extraSource, targetPath, mode, dryRun, force, target.Flatten, cwd) @@ -290,7 +342,7 @@ func cmdSyncExtrasProject(cwd string, dryRun, force, jsonOutput bool, start time if syncErr != nil { if !jsonOutput { - ui.Warning(" %s: %v", shortTarget, syncErr) + ui.Warning("%s: %v", shortTarget, syncErr) } jsonTarget.Error = 
syncErr.Error() jsonEntry.Targets = append(jsonEntry.Targets, jsonTarget) @@ -319,11 +371,11 @@ func cmdSyncExtrasProject(cwd string, dryRun, force, jsonOutput bool, start time if result.Pruned > 0 { parts = append(parts, fmt.Sprintf("%d pruned", result.Pruned)) } - ui.Success(" %s %s (%s)", shortTarget, strings.Join(parts, ", "), mode) + ui.Success("%s %s (%s)", shortTarget, strings.Join(parts, ", "), mode) } else if result.Skipped > 0 { - ui.Warning(" %s %d files skipped (use --force to override)", shortTarget, result.Skipped) + ui.Warning("%s %d files skipped (use --force to override)", shortTarget, result.Skipped) } else { - ui.Success(" %s up to date (%s)", shortTarget, mode) + ui.Success("%s up to date (%s)", shortTarget, mode) } for _, e := range result.Errors { @@ -363,6 +415,14 @@ func cmdSyncExtrasProject(cwd string, dryRun, force, jsonOutput bool, start time return writeJSON(&output) } + ui.ExtrasSyncSummary(ui.ExtrasSyncStats{ + Targets: totalTargets, + Synced: totalSynced, + Skipped: totalSkipped, + Pruned: totalPruned, + Duration: time.Since(start), + }) + if totalErrors > 0 { return fmt.Errorf("%d extras sync error(s)", totalErrors) } @@ -383,7 +443,8 @@ func syncVerb(mode string) string { // runExtrasSync runs extras sync and returns JSON entries without printing. // Used by sync --all --json to merge extras into the skills JSON output. -func runExtrasSyncEntries(extras []config.ExtraConfig, sourceFunc func(config.ExtraConfig) string, dryRun, force bool, projectRoot string) []syncExtrasJSONEntry { +// agentTargetPaths is used to skip extras "agents" targets that overlap with the agents sync system. 
+func runExtrasSyncEntries(extras []config.ExtraConfig, sourceFunc func(config.ExtraConfig) string, dryRun, force bool, projectRoot string, agentTargetPaths map[string]bool) []syncExtrasJSONEntry { entries := make([]syncExtrasJSONEntry, 0, len(extras)) for _, extra := range extras { extraSource := sourceFunc(extra) @@ -401,6 +462,16 @@ func runExtrasSyncEntries(extras []config.ExtraConfig, sourceFunc func(config.Ex mode = "merge" } targetPath := config.ExpandPath(target.Path) + if projectRoot != "" { + targetPath = resolveProjectPath(projectRoot, target.Path) + } + + if extra.Name == extrasAgentsName && isExtrasTargetOverlappingAgents(targetPath, agentTargetPaths) { + entry.Targets = append(entry.Targets, syncExtrasJSONTarget{ + Path: targetPath, Mode: mode, SkippedBy: extrasAgentsName, + }) + continue + } result, syncErr := sync.SyncExtra(extraSource, targetPath, mode, dryRun, force, target.Flatten, projectRoot) jt := syncExtrasJSONTarget{Path: targetPath, Mode: mode} @@ -423,6 +494,15 @@ func runExtrasSyncEntries(extras []config.ExtraConfig, sourceFunc func(config.Ex return entries } +// isExtrasTargetOverlappingAgents checks whether an extras target path overlaps +// with any active agent target path. +func isExtrasTargetOverlappingAgents(targetPath string, agentPaths map[string]bool) bool { + if len(agentPaths) == 0 { + return false + } + return agentPaths[filepath.Clean(targetPath)] +} + // cachedHome caches the home directory for shortenPath. 
var cachedHome = func() string { h, _ := os.UserHomeDir() diff --git a/cmd/skillshare/target.go b/cmd/skillshare/target.go index bcdd4ac7..71afcca8 100644 --- a/cmd/skillshare/target.go +++ b/cmd/skillshare/target.go @@ -11,6 +11,7 @@ import ( "skillshare/internal/config" "skillshare/internal/oplog" "skillshare/internal/sync" + "skillshare/internal/targetsummary" "skillshare/internal/ui" "skillshare/internal/utils" "skillshare/internal/validate" @@ -113,11 +114,16 @@ Options: Target Settings: --mode Set sync mode (merge, symlink, or copy) + --agent-mode Set agents sync mode (merge, symlink, or copy) --target-naming Set target naming (flat or standard) --add-include Add an include filter pattern --add-exclude Add an exclude filter pattern --remove-include Remove an include filter pattern --remove-exclude Remove an exclude filter pattern + --add-agent-include Add an agent include filter pattern + --add-agent-exclude Add an agent exclude filter pattern + --remove-agent-include Remove an agent include filter pattern + --remove-agent-exclude Remove an agent exclude filter pattern Examples: skillshare target add cursor @@ -125,7 +131,9 @@ Examples: skillshare target remove cursor skillshare target list skillshare target cursor + skillshare target claude --agent-mode copy skillshare target claude --add-include "team-*" + skillshare target claude --add-agent-include "team-*" skillshare target claude --remove-include "team-*" skillshare target claude --add-exclude "_legacy*" @@ -257,6 +265,28 @@ func backupTargets(cfg *config.Config, toRemove []string) { ui.Success("%s -> %s", targetName, backupPath) } } + + // Also backup agent directories for targets being removed. 
+ backupDir, agentTargets, err := resolveGlobalAgentBackupContextFromCfg(cfg) + if err != nil || len(agentTargets) == 0 { + return + } + removeSet := make(map[string]struct{}, len(toRemove)) + for _, name := range toRemove { + removeSet[name] = struct{}{} + } + for _, at := range agentTargets { + if _, ok := removeSet[at.name]; !ok { + continue + } + entryName := at.name + "-agents" + bp, bErr := backup.CreateInDir(backupDir, entryName, at.agentPath) + if bErr != nil { + ui.Warning("Failed to backup %s: %v", entryName, bErr) + } else if bp != "" { + ui.Success("%s -> %s", entryName, bp) + } + } } // unlinkTarget unlinks a single target @@ -412,11 +442,17 @@ func unlinkMergeMode(targetPath, sourcePath string) error { // targetListJSONItem is the JSON representation for a single target. type targetListJSONItem struct { - Name string `json:"name"` - Path string `json:"path"` - Mode string `json:"mode"` - Include []string `json:"include"` - Exclude []string `json:"exclude"` + Name string `json:"name"` + Path string `json:"path"` + Mode string `json:"mode"` + Include []string `json:"include"` + Exclude []string `json:"exclude"` + AgentPath string `json:"agentPath,omitempty"` + AgentMode string `json:"agentMode,omitempty"` + AgentInclude []string `json:"agentInclude,omitempty"` + AgentExclude []string `json:"agentExclude,omitempty"` + AgentLinkedCount *int `json:"agentLinkedCount,omitempty"` + AgentExpectedCount *int `json:"agentExpectedCount,omitempty"` } func targetList(jsonOutput bool) error { @@ -429,30 +465,39 @@ func targetList(jsonOutput bool) error { return targetListJSON(cfg) } - ui.Header("Configured Targets") - for name, target := range cfg.Targets { - sc := target.SkillsConfig() - mode := sc.Mode - if mode == "" { - mode = "merge" - } - fmt.Printf(" %-12s %s (%s)\n", name, sc.Path, mode) + items, err := buildTargetTUIItems(false, "") + if err != nil { + return err } + ui.Header("Configured Targets") + printTargetListPlain(items) + return nil } func 
targetListJSON(cfg *config.Config) error { + agentBuilder, err := targetsummary.NewGlobalBuilder(cfg) + if err != nil { + return err + } + var items []targetListJSONItem for name, target := range cfg.Targets { sc := target.SkillsConfig() - items = append(items, targetListJSONItem{ + item := targetListJSONItem{ Name: name, Path: sc.Path, Mode: getTargetMode(sc.Mode, cfg.Mode), Include: sc.Include, Exclude: sc.Exclude, - }) + } + agentSummary, err := agentBuilder.GlobalTarget(name, target) + if err != nil { + return err + } + applyTargetListAgentSummary(&item, agentSummary) + items = append(items, item) } output := struct { Targets []targetListJSONItem `json:"targets"` @@ -466,6 +511,10 @@ func targetInfo(name string, args []string) error { if err != nil { return err } + settings, err := parseTargetSettingFlags(remaining) + if err != nil { + return err + } cfg, err := config.Load() if err != nil { @@ -477,36 +526,59 @@ func targetInfo(name string, args []string) error { return fmt.Errorf("target '%s' not found. 
Use 'skillshare target list' to see available targets", name) } - // Parse --mode and --target-naming from remaining args - var newMode, newNaming string - for i := 0; i < len(remaining); i++ { - switch remaining[i] { - case "--mode", "-m": - if i+1 >= len(remaining) { - return fmt.Errorf("--mode requires a value (merge, symlink, or copy)") - } - newMode = remaining[i+1] - i++ - case "--target-naming": - if i+1 >= len(remaining) { - return fmt.Errorf("--target-naming requires a value (flat or standard)") - } - newNaming = remaining[i+1] - i++ - } - } - // Apply filter updates if any if filterOpts.hasUpdates() { start := time.Now() - s := target.EnsureSkills() - changes, fErr := applyFilterUpdates(&s.Include, &s.Exclude, filterOpts) - if fErr != nil { - return fErr + var changes []string + mutated := false + + if filterOpts.Skills.hasUpdates() { + s := target.EnsureSkills() + skillChanges, fErr := applyFilterUpdates(&s.Include, &s.Exclude, filterOpts.Skills) + if fErr != nil { + return fErr + } + changes = append(changes, skillChanges...) + mutated = true } - cfg.Targets[name] = target - if err := cfg.Save(); err != nil { - return err + + if filterOpts.Agents.hasUpdates() { + agentBuilder, buildErr := targetsummary.NewGlobalBuilder(cfg) + if buildErr != nil { + return buildErr + } + agentSummary, buildErr := agentBuilder.GlobalTarget(name, target) + if buildErr != nil { + return buildErr + } + if agentSummary == nil { + return fmt.Errorf("target '%s' does not have an agents path", name) + } + if agentSummary.Mode == "symlink" { + return fmt.Errorf("target '%s' agent include/exclude filters are ignored in symlink mode; use --agent-mode merge or --agent-mode copy first", name) + } + + ac := target.AgentsConfig() + include := append([]string(nil), ac.Include...) + exclude := append([]string(nil), ac.Exclude...) 
+ agentChanges, fErr := applyFilterUpdates(&include, &exclude, filterOpts.Agents) + if fErr != nil { + return fErr + } + if len(agentChanges) > 0 { + a := target.EnsureAgents() + a.Include = include + a.Exclude = exclude + mutated = true + } + changes = append(changes, scopeFilterChanges("agents", agentChanges)...) + } + + if mutated { + cfg.Targets[name] = target + if err := cfg.Save(); err != nil { + return err + } } for _, change := range changes { ui.Success("%s: %s", name, change) @@ -526,13 +598,17 @@ func targetInfo(name string, args []string) error { } // If --mode is provided, update the mode - if newMode != "" { - return updateTargetMode(cfg, name, target, newMode) + if settings.SkillMode != "" { + return updateTargetMode(cfg, name, target, settings.SkillMode) + } + + if settings.AgentMode != "" { + return updateTargetAgentMode(cfg, name, target, settings.AgentMode) } // If --target-naming is provided, update the naming - if newNaming != "" { - return updateTargetNaming(cfg, name, target, newNaming) + if settings.Naming != "" { + return updateTargetNaming(cfg, name, target, settings.Naming) } // Show target info @@ -564,6 +640,38 @@ func updateTargetMode(cfg *config.Config, name string, target config.TargetConfi return nil } +func updateTargetAgentMode(cfg *config.Config, name string, target config.TargetConfig, newMode string) error { + if newMode != "merge" && newMode != "symlink" && newMode != "copy" { + return fmt.Errorf("invalid agent mode '%s'. 
Use 'merge', 'symlink', or 'copy'", newMode) + } + + agentBuilder, err := targetsummary.NewGlobalBuilder(cfg) + if err != nil { + return err + } + agentSummary, err := agentBuilder.GlobalTarget(name, target) + if err != nil { + return err + } + if agentSummary == nil { + return fmt.Errorf("target '%s' does not have an agents path", name) + } + + oldMode := agentSummary.Mode + target.EnsureAgents().Mode = newMode + cfg.Targets[name] = target + if err := cfg.Save(); err != nil { + return err + } + + ui.Success("Changed %s agent mode: %s -> %s", name, oldMode, newMode) + if newMode == "symlink" && (len(agentSummary.Include) > 0 || len(agentSummary.Exclude) > 0) { + ui.Warning("Agent include/exclude filters are ignored in symlink mode") + } + ui.Info("Run 'skillshare sync' to apply the new mode") + return nil +} + func updateTargetNaming(cfg *config.Config, name string, target config.TargetConfig, newNaming string) error { if !config.IsValidTargetNaming(newNaming) { return fmt.Errorf("invalid target naming '%s'. 
Use 'flat' or 'standard'", newNaming) @@ -614,6 +722,15 @@ func showTargetInfo(cfg *config.Config, name string, target config.TargetConfig) namingDisplay += " (default)" } + agentBuilder, err := targetsummary.NewGlobalBuilder(cfg) + if err != nil { + return err + } + agentSummary, err := agentBuilder.GlobalTarget(name, target) + if err != nil { + return err + } + ui.Header(fmt.Sprintf("Target: %s", name)) fmt.Printf(" Path: %s\n", sc.Path) fmt.Printf(" Mode: %s\n", modeDisplay) @@ -621,6 +738,7 @@ func showTargetInfo(cfg *config.Config, name string, target config.TargetConfig) fmt.Printf(" Status: %s\n", statusLine) fmt.Printf(" Include: %s\n", formatFilterList(sc.Include)) fmt.Printf(" Exclude: %s\n", formatFilterList(sc.Exclude)) + printTargetAgentSection(agentSummary) return nil } diff --git a/cmd/skillshare/target_agents.go b/cmd/skillshare/target_agents.go new file mode 100644 index 00000000..91c7218a --- /dev/null +++ b/cmd/skillshare/target_agents.go @@ -0,0 +1,122 @@ +package main + +import ( + "fmt" + + "skillshare/internal/config" + "skillshare/internal/sync" + "skillshare/internal/targetsummary" +) + +func applyTargetListAgentSummary(item *targetListJSONItem, summary *targetsummary.AgentSummary) { + if summary == nil { + return + } + + item.AgentPath = summary.Path + item.AgentMode = summary.Mode + item.AgentInclude = append([]string(nil), summary.Include...) + item.AgentExclude = append([]string(nil), summary.Exclude...) 
+ item.AgentLinkedCount = intPtr(summary.ManagedCount) + item.AgentExpectedCount = intPtr(summary.ExpectedCount) +} + +func printTargetAgentSection(summary *targetsummary.AgentSummary) { + if summary == nil { + return + } + + displayPath := summary.DisplayPath + if displayPath == "" { + displayPath = summary.Path + } + + fmt.Println(" Agents:") + fmt.Printf(" Path: %s\n", displayPath) + fmt.Printf(" Mode: %s\n", summary.Mode) + fmt.Printf(" Status: %s\n", formatTargetAgentSyncSummary(summary)) + if summary.Mode == "symlink" { + fmt.Println(" Filters: ignored in symlink mode") + return + } + fmt.Printf(" Include: %s\n", formatFilterList(summary.Include)) + fmt.Printf(" Exclude: %s\n", formatFilterList(summary.Exclude)) +} + +func printTargetListPlain(items []targetTUIItem) { + for idx, item := range items { + if idx > 0 { + fmt.Println() + } + + sc := item.target.SkillsConfig() + displayPath := item.displayPath + if displayPath == "" { + displayPath = sc.Path + } + + fmt.Printf(" %s\n", item.name) + fmt.Println(" Skills:") + fmt.Printf(" Path: %s\n", displayPath) + fmt.Printf(" Mode: %s\n", sync.EffectiveMode(sc.Mode)) + fmt.Printf(" Naming: %s\n", config.EffectiveTargetNaming(sc.TargetNaming)) + fmt.Printf(" Sync: %s\n", item.skillSync) + if len(sc.Include) == 0 && len(sc.Exclude) == 0 { + fmt.Println(" No include/exclude filters") + } else { + fmt.Printf(" Include: %s\n", formatFilterList(sc.Include)) + fmt.Printf(" Exclude: %s\n", formatFilterList(sc.Exclude)) + } + + if item.agentSummary == nil { + continue + } + + agentPath := item.agentSummary.DisplayPath + if agentPath == "" { + agentPath = item.agentSummary.Path + } + fmt.Println(" Agents:") + fmt.Printf(" Path: %s\n", agentPath) + fmt.Printf(" Mode: %s\n", item.agentSummary.Mode) + fmt.Printf(" Sync: %s\n", formatTargetAgentSyncSummary(item.agentSummary)) + if item.agentSummary.Mode == "symlink" { + fmt.Println(" Filters: ignored in symlink mode") + } else if len(item.agentSummary.Include) == 0 && 
len(item.agentSummary.Exclude) == 0 { + fmt.Println(" No agent include/exclude filters") + } else { + fmt.Printf(" Include: %s\n", formatFilterList(item.agentSummary.Include)) + fmt.Printf(" Exclude: %s\n", formatFilterList(item.agentSummary.Exclude)) + } + } +} + +func formatTargetAgentSyncSummary(summary *targetsummary.AgentSummary) string { + if summary == nil { + return "" + } + + if summary.ExpectedCount == 0 { + if summary.ManagedCount > 0 { + return fmt.Sprintf("no source agents yet (%d %s)", summary.ManagedCount, targetAgentCountLabel(summary.Mode)) + } + return "no source agents yet" + } + + summaryText := fmt.Sprintf("%d/%d %s", summary.ManagedCount, summary.ExpectedCount, targetAgentCountLabel(summary.Mode)) + if summary.Mode == "symlink" { + return summaryText + " (directory symlink)" + } + return summaryText +} + +func targetAgentCountLabel(mode string) string { + if mode == "copy" { + return "managed" + } + return "linked" +} + +func intPtr(v int) *int { + return &v +} diff --git a/cmd/skillshare/target_agents_test.go b/cmd/skillshare/target_agents_test.go new file mode 100644 index 00000000..c0c007b6 --- /dev/null +++ b/cmd/skillshare/target_agents_test.go @@ -0,0 +1,458 @@ +package main + +import ( + "encoding/json" + "os" + "path/filepath" + "strings" + "testing" + + "skillshare/internal/config" + "skillshare/internal/targetsummary" + "skillshare/internal/testutil" +) + +func TestShowTargetInfo_ShowsAgentsSectionForBuiltinTarget(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := sb.CreateTarget("claude") + writeGlobalTargetConfig(t, sb, "claude", targetPath, "") + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + + agentSource := cfg.EffectiveAgentsSource() + agentFile := writeAgentFile(t, agentSource, "reviewer.md") + agentTarget := filepath.Join(sb.Home, ".claude", "agents") + linkAgentFile(t, agentTarget, "reviewer.md", agentFile) + + output := 
stripANSIWarnings(captureStdout(t, func() { + if err := showTargetInfo(cfg, "claude", cfg.Targets["claude"]); err != nil { + t.Fatalf("showTargetInfo: %v", err) + } + })) + + if !strings.Contains(output, "Agents:") { + t.Fatalf("expected agents section in output:\n%s", output) + } + if !strings.Contains(output, agentTarget) { + t.Fatalf("expected agent path %q in output:\n%s", agentTarget, output) + } + if !strings.Contains(output, "1/1 linked") { + t.Fatalf("expected linked summary in output:\n%s", output) + } +} + +func TestShowTargetInfo_OmitsAgentsSectionWhenTargetHasNoAgentsPath(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := filepath.Join(sb.Root, "custom-skills") + if err := os.MkdirAll(targetPath, 0755); err != nil { + t.Fatalf("mkdir target: %v", err) + } + writeGlobalTargetConfig(t, sb, "custom-tool", targetPath, "") + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + + output := stripANSIWarnings(captureStdout(t, func() { + if err := showTargetInfo(cfg, "custom-tool", cfg.Targets["custom-tool"]); err != nil { + t.Fatalf("showTargetInfo: %v", err) + } + })) + + if strings.Contains(output, "Agents:") { + t.Fatalf("did not expect agents section:\n%s", output) + } +} + +func TestTargetListJSON_IncludesAgentMetadata(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := sb.CreateTarget("claude") + writeGlobalTargetConfig(t, sb, "claude", targetPath, "") + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + + agentSource := cfg.EffectiveAgentsSource() + agentFile := writeAgentFile(t, agentSource, "reviewer.md") + agentTarget := filepath.Join(sb.Home, ".claude", "agents") + linkAgentFile(t, agentTarget, "reviewer.md", agentFile) + + output := captureStdout(t, func() { + if err := targetListJSON(cfg); err != nil { + t.Fatalf("targetListJSON: %v", err) + } + }) + + var resp struct { + Targets []targetListJSONItem 
`json:"targets"` + } + if err := json.Unmarshal([]byte(output), &resp); err != nil { + t.Fatalf("decode json: %v\n%s", err, output) + } + if len(resp.Targets) != 1 { + t.Fatalf("expected 1 target, got %d", len(resp.Targets)) + } + + target := resp.Targets[0] + if target.AgentPath != agentTarget { + t.Fatalf("agent path = %q, want %q", target.AgentPath, agentTarget) + } + if target.AgentMode != "merge" { + t.Fatalf("agent mode = %q, want merge", target.AgentMode) + } + if target.AgentLinkedCount == nil || *target.AgentLinkedCount != 1 { + t.Fatalf("agent linked = %v, want 1", target.AgentLinkedCount) + } + if target.AgentExpectedCount == nil || *target.AgentExpectedCount != 1 { + t.Fatalf("agent expected = %v, want 1", target.AgentExpectedCount) + } +} + +func TestTargetList_TextOutputShowsSkillsAndAgentsSections(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := sb.CreateTarget("claude") + writeGlobalTargetConfig(t, sb, "claude", targetPath, "") + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + + agentSource := cfg.EffectiveAgentsSource() + agentFile := writeAgentFile(t, agentSource, "reviewer.md") + agentTarget := filepath.Join(sb.Home, ".claude", "agents") + linkAgentFile(t, agentTarget, "reviewer.md", agentFile) + + output := stripANSIWarnings(captureStdout(t, func() { + if err := targetList(false); err != nil { + t.Fatalf("targetList: %v", err) + } + })) + + for _, want := range []string{ + "claude", + "Skills:", + targetPath, + "Sync:", + "Agents:", + agentTarget, + "1/1 linked", + "No include/exclude filters", + "No agent include/exclude filters", + } { + if !strings.Contains(output, want) { + t.Fatalf("expected %q in target list output:\n%s", want, output) + } + } +} + +func TestRenderTargetDetail_AgentSection(t *testing.T) { + cases := []struct { + name string + item targetTUIItem + want []string + notContains []string + }{ + { + name: "merge target shows builtin-like agents section", 
+ item: targetTUIItem{ + name: "cursor", + displayPath: ".cursor/skills", + skillSync: "merged (4 shared, 1 local)", + target: config.TargetConfig{ + Skills: &config.ResourceTargetConfig{ + Path: "/tmp/cursor/skills", + Mode: "merge", + TargetNaming: "flat", + }, + }, + agentSummary: &targetsummary.AgentSummary{ + DisplayPath: ".cursor/agents", + Path: "/tmp/cursor/agents", + Mode: "merge", + ManagedCount: 2, + ExpectedCount: 3, + Include: []string{"team-*"}, + }, + }, + want: []string{"Skills:", ".cursor/skills", "Sync:", "merged (4 shared, 1 local)", "Agents:", ".cursor/agents", "2/3 linked", "Agent Include:", "team-*"}, + }, + { + name: "copy target shows custom agents section", + item: targetTUIItem{ + name: "custom", + displayPath: "/tmp/custom/skills", + skillSync: "copied (2 managed, 0 local)", + target: config.TargetConfig{ + Skills: &config.ResourceTargetConfig{ + Path: "/tmp/custom/skills", + Mode: "copy", + TargetNaming: "flat", + }, + }, + agentSummary: &targetsummary.AgentSummary{ + DisplayPath: "/tmp/custom/agents", + Path: "/tmp/custom/agents", + Mode: "copy", + ManagedCount: 2, + ExpectedCount: 2, + }, + }, + want: []string{"Skills:", "/tmp/custom/skills", "Sync:", "copied (2 managed, 0 local)", "Agents:", "/tmp/custom/agents", "2/2 managed", "No agent include/exclude filters"}, + }, + { + name: "symlink agent target shows filters ignored warning", + item: targetTUIItem{ + name: "claude", + displayPath: ".claude/skills", + skillSync: "merged (7 shared, 0 local)", + target: config.TargetConfig{ + Skills: &config.ResourceTargetConfig{ + Path: "/tmp/claude/skills", + Mode: "merge", + TargetNaming: "flat", + }, + }, + agentSummary: &targetsummary.AgentSummary{ + DisplayPath: ".claude/agents", + Path: "/tmp/claude/agents", + Mode: "symlink", + ManagedCount: 5, + ExpectedCount: 5, + Include: []string{"team-*"}, + Exclude: []string{"draft-*"}, + }, + }, + want: []string{"Agents:", ".claude/agents", "5/5 linked (directory symlink)", "Agent include/exclude 
filters ignored in symlink mode"}, + notContains: []string{"Agent Include:", "Agent Exclude:", "No agent include/exclude filters"}, + }, + { + name: "unsupported target omits agents section", + item: targetTUIItem{ + name: "custom-tool", + displayPath: "/tmp/custom-tool/skills", + skillSync: "not exist (0 shared, 0 local)", + target: config.TargetConfig{ + Skills: &config.ResourceTargetConfig{ + Path: "/tmp/custom-tool/skills", + Mode: "merge", + TargetNaming: "flat", + }, + }, + }, + want: []string{"Skills:", "/tmp/custom-tool/skills", "Sync:", "not exist (0 shared, 0 local)"}, + notContains: []string{"Agents:", "No agent include/exclude filters"}, + }, + } + + model := targetListTUIModel{} + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + rendered := stripANSIWarnings(model.renderTargetDetail(tc.item)) + for _, want := range tc.want { + if !strings.Contains(rendered, want) { + t.Fatalf("expected %q in output:\n%s", want, rendered) + } + } + for _, unwanted := range tc.notContains { + if strings.Contains(rendered, unwanted) { + t.Fatalf("did not expect %q in output:\n%s", unwanted, rendered) + } + } + }) + } +} + +func TestTargetScopeOptions_DisablesAgentFiltersInSymlinkMode(t *testing.T) { + item := targetTUIItem{ + name: "claude", + agentSummary: &targetsummary.AgentSummary{ + Mode: "symlink", + }, + } + + options := targetScopeOptions(item, "include") + if len(options) != 2 { + t.Fatalf("expected 2 scope options, got %d", len(options)) + } + if !options[0].enabled || options[0].scope != "skills" { + t.Fatalf("expected skills option enabled, got %+v", options[0]) + } + if options[1].scope != "agents" || options[1].enabled { + t.Fatalf("expected agents option disabled, got %+v", options[1]) + } + if options[1].disabled != "ignored in symlink mode" { + t.Fatalf("unexpected disabled reason: %+v", options[1]) + } + + if got := moveScopePickerCursor(options, 0, 1); got != 0 { + t.Fatalf("cursor should stay on skills when agents is disabled, got 
%d", got) + } +} + +func TestDoSetTargetMode_Agents_GlobalAndProject(t *testing.T) { + t.Run("global", func(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := sb.CreateTarget("claude") + writeGlobalTargetConfig(t, sb, "claude", targetPath, "") + + model := targetListTUIModel{} + if _, err := model.doSetTargetMode("claude", "agents", "copy"); err != nil { + t.Fatalf("doSetTargetMode: %v", err) + } + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + target := cfg.Targets["claude"] + if got := target.AgentsConfig().Mode; got != "copy" { + t.Fatalf("agent mode = %q, want copy", got) + } + }) + + t.Run("project", func(t *testing.T) { + root := writeProjectTargetConfig(t, []config.ProjectTargetEntry{{Name: "claude"}}) + + model := targetListTUIModel{ + projCfg: &config.ProjectConfig{}, + cwd: root, + } + if _, err := model.doSetTargetMode("claude", "agents", "copy"); err != nil { + t.Fatalf("doSetTargetMode: %v", err) + } + + cfg, err := config.LoadProject(root) + if err != nil { + t.Fatalf("load project config: %v", err) + } + if got := cfg.Targets[0].AgentsConfig().Mode; got != "copy" { + t.Fatalf("project agent mode = %q, want copy", got) + } + }) +} + +func TestDoAddAndRemovePattern_Agents_GlobalAndProject(t *testing.T) { + t.Run("global", func(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + targetPath := sb.CreateTarget("claude") + writeGlobalTargetConfig(t, sb, "claude", targetPath, "") + + model := targetListTUIModel{} + if _, err := model.doAddPattern("claude", "agents", "include", "team-*"); err != nil { + t.Fatalf("doAddPattern: %v", err) + } + if _, err := model.doRemovePattern("claude", "agents", "include", "team-*"); err != nil { + t.Fatalf("doRemovePattern: %v", err) + } + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + target := cfg.Targets["claude"] + if got := target.AgentsConfig().Include; len(got) != 0 { + 
t.Fatalf("agent include = %v, want empty", got) + } + }) + + t.Run("project", func(t *testing.T) { + root := writeProjectTargetConfig(t, []config.ProjectTargetEntry{{Name: "claude"}}) + + model := targetListTUIModel{ + projCfg: &config.ProjectConfig{}, + cwd: root, + } + if _, err := model.doAddPattern("claude", "agents", "exclude", "draft-*"); err != nil { + t.Fatalf("doAddPattern: %v", err) + } + if _, err := model.doRemovePattern("claude", "agents", "exclude", "draft-*"); err != nil { + t.Fatalf("doRemovePattern: %v", err) + } + + cfg, err := config.LoadProject(root) + if err != nil { + t.Fatalf("load project config: %v", err) + } + if got := cfg.Targets[0].AgentsConfig().Exclude; len(got) != 0 { + t.Fatalf("project agent exclude = %v, want empty", got) + } + }) +} + +func writeGlobalTargetConfig(t *testing.T, sb *testutil.Sandbox, name, skillsPath, agentPath string) { + t.Helper() + + var b strings.Builder + b.WriteString("source: " + sb.SourcePath + "\n") + b.WriteString("mode: merge\n") + b.WriteString("targets:\n") + b.WriteString(" " + name + ":\n") + b.WriteString(" skills:\n") + b.WriteString(" path: " + skillsPath + "\n") + if agentPath != "" { + b.WriteString(" agents:\n") + b.WriteString(" path: " + agentPath + "\n") + } + sb.WriteConfig(b.String()) +} + +func writeAgentFile(t *testing.T, dir, name string) string { + t.Helper() + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatalf("mkdir agent source: %v", err) + } + path := filepath.Join(dir, name) + if err := os.WriteFile(path, []byte("# "+name), 0644); err != nil { + t.Fatalf("write agent file: %v", err) + } + return path +} + +func linkAgentFile(t *testing.T, dir, name, source string) string { + t.Helper() + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatalf("mkdir agent target: %v", err) + } + linkPath := filepath.Join(dir, name) + if err := os.Symlink(source, linkPath); err != nil { + t.Fatalf("symlink agent: %v", err) + } + return linkPath +} + +func writeProjectTargetConfig(t 
*testing.T, targets []config.ProjectTargetEntry) string { + t.Helper() + + root := t.TempDir() + if err := os.MkdirAll(filepath.Join(root, ".skillshare", "skills"), 0755); err != nil { + t.Fatalf("mkdir project skills: %v", err) + } + if err := os.MkdirAll(filepath.Join(root, ".skillshare", "agents"), 0755); err != nil { + t.Fatalf("mkdir project agents: %v", err) + } + + cfg := &config.ProjectConfig{Targets: targets} + if err := cfg.Save(root); err != nil { + t.Fatalf("save project config: %v", err) + } + return root +} diff --git a/cmd/skillshare/target_helpers.go b/cmd/skillshare/target_helpers.go index cd95f94a..ba33fe8f 100644 --- a/cmd/skillshare/target_helpers.go +++ b/cmd/skillshare/target_helpers.go @@ -32,11 +32,26 @@ func (o filterUpdateOpts) hasUpdates() bool { len(o.RemoveInclude) > 0 || len(o.RemoveExclude) > 0 } +type parsedTargetFilterFlags struct { + Skills filterUpdateOpts + Agents filterUpdateOpts +} + +func (o parsedTargetFilterFlags) hasUpdates() bool { + return o.Skills.hasUpdates() || o.Agents.hasUpdates() +} + +type parsedTargetSettingFlags struct { + SkillMode string + AgentMode string + Naming string +} + // parseFilterFlags extracts --add-include, --add-exclude, --remove-include, -// --remove-exclude flags from args. Returns the parsed opts and any -// remaining (non-filter) arguments. -func parseFilterFlags(args []string) (filterUpdateOpts, []string, error) { - var opts filterUpdateOpts +// --remove-exclude flags from args for both skills and agents. +// Returns the parsed opts and any remaining (non-filter) arguments. 
+func parseFilterFlags(args []string) (parsedTargetFilterFlags, []string, error) { + var opts parsedTargetFilterFlags var rest []string for i := 0; i < len(args); i++ { @@ -46,25 +61,49 @@ func parseFilterFlags(args []string) (filterUpdateOpts, []string, error) { return opts, nil, fmt.Errorf("--add-include requires a value") } i++ - opts.AddInclude = append(opts.AddInclude, args[i]) + opts.Skills.AddInclude = append(opts.Skills.AddInclude, args[i]) case "--add-exclude": if i+1 >= len(args) { return opts, nil, fmt.Errorf("--add-exclude requires a value") } i++ - opts.AddExclude = append(opts.AddExclude, args[i]) + opts.Skills.AddExclude = append(opts.Skills.AddExclude, args[i]) case "--remove-include": if i+1 >= len(args) { return opts, nil, fmt.Errorf("--remove-include requires a value") } i++ - opts.RemoveInclude = append(opts.RemoveInclude, args[i]) + opts.Skills.RemoveInclude = append(opts.Skills.RemoveInclude, args[i]) case "--remove-exclude": if i+1 >= len(args) { return opts, nil, fmt.Errorf("--remove-exclude requires a value") } i++ - opts.RemoveExclude = append(opts.RemoveExclude, args[i]) + opts.Skills.RemoveExclude = append(opts.Skills.RemoveExclude, args[i]) + case "--add-agent-include": + if i+1 >= len(args) { + return opts, nil, fmt.Errorf("--add-agent-include requires a value") + } + i++ + opts.Agents.AddInclude = append(opts.Agents.AddInclude, args[i]) + case "--add-agent-exclude": + if i+1 >= len(args) { + return opts, nil, fmt.Errorf("--add-agent-exclude requires a value") + } + i++ + opts.Agents.AddExclude = append(opts.Agents.AddExclude, args[i]) + case "--remove-agent-include": + if i+1 >= len(args) { + return opts, nil, fmt.Errorf("--remove-agent-include requires a value") + } + i++ + opts.Agents.RemoveInclude = append(opts.Agents.RemoveInclude, args[i]) + case "--remove-agent-exclude": + if i+1 >= len(args) { + return opts, nil, fmt.Errorf("--remove-agent-exclude requires a value") + } + i++ + opts.Agents.RemoveExclude = 
append(opts.Agents.RemoveExclude, args[i]) default: rest = append(rest, args[i]) } @@ -73,6 +112,35 @@ func parseFilterFlags(args []string) (filterUpdateOpts, []string, error) { return opts, rest, nil } +func parseTargetSettingFlags(args []string) (parsedTargetSettingFlags, error) { + var settings parsedTargetSettingFlags + + for i := 0; i < len(args); i++ { + switch args[i] { + case "--mode", "-m": + if i+1 >= len(args) { + return settings, fmt.Errorf("--mode requires a value (merge, symlink, or copy)") + } + settings.SkillMode = args[i+1] + i++ + case "--agent-mode": + if i+1 >= len(args) { + return settings, fmt.Errorf("--agent-mode requires a value (merge, symlink, or copy)") + } + settings.AgentMode = args[i+1] + i++ + case "--target-naming": + if i+1 >= len(args) { + return settings, fmt.Errorf("--target-naming requires a value (flat or standard)") + } + settings.Naming = args[i+1] + i++ + } + } + + return settings, nil +} + // applyFilterUpdates modifies include/exclude slices according to opts. // It validates patterns with filepath.Match, deduplicates, and returns // a human-readable list of changes applied. 
@@ -120,6 +188,29 @@ func applyFilterUpdates(include, exclude *[]string, opts filterUpdateOpts) ([]st return changes, nil } +func scopeFilterChanges(scope string, changes []string) []string { + if scope != "agents" { + return changes + } + + scoped := make([]string, len(changes)) + for i, change := range changes { + switch { + case strings.HasPrefix(change, "added include: "): + scoped[i] = strings.Replace(change, "added include: ", "added agent include: ", 1) + case strings.HasPrefix(change, "added exclude: "): + scoped[i] = strings.Replace(change, "added exclude: ", "added agent exclude: ", 1) + case strings.HasPrefix(change, "removed include: "): + scoped[i] = strings.Replace(change, "removed include: ", "removed agent include: ", 1) + case strings.HasPrefix(change, "removed exclude: "): + scoped[i] = strings.Replace(change, "removed exclude: ", "removed agent exclude: ", 1) + default: + scoped[i] = change + } + } + return scoped +} + func containsPattern(patterns []string, p string) bool { for _, existing := range patterns { if existing == p { diff --git a/cmd/skillshare/target_helpers_test.go b/cmd/skillshare/target_helpers_test.go index bf3d6477..181f6d97 100644 --- a/cmd/skillshare/target_helpers_test.go +++ b/cmd/skillshare/target_helpers_test.go @@ -10,14 +10,14 @@ func TestParseFilterFlags(t *testing.T) { tests := []struct { name string args []string - wantOpts filterUpdateOpts + wantOpts parsedTargetFilterFlags wantRest []string wantErr bool }{ { name: "no flags", args: []string{"--mode", "merge"}, - wantOpts: filterUpdateOpts{}, + wantOpts: parsedTargetFilterFlags{}, wantRest: []string{"--mode", "merge"}, }, { @@ -28,11 +28,13 @@ func TestParseFilterFlags(t *testing.T) { "--remove-include", "old-*", "--remove-exclude", "test-*", }, - wantOpts: filterUpdateOpts{ - AddInclude: []string{"team-*"}, - AddExclude: []string{"_legacy*"}, - RemoveInclude: []string{"old-*"}, - RemoveExclude: []string{"test-*"}, + wantOpts: parsedTargetFilterFlags{ + Skills: 
filterUpdateOpts{ + AddInclude: []string{"team-*"}, + AddExclude: []string{"_legacy*"}, + RemoveInclude: []string{"old-*"}, + RemoveExclude: []string{"test-*"}, + }, }, }, { @@ -41,8 +43,23 @@ func TestParseFilterFlags(t *testing.T) { "--add-include", "a-*", "--add-include", "b-*", }, - wantOpts: filterUpdateOpts{ - AddInclude: []string{"a-*", "b-*"}, + wantOpts: parsedTargetFilterFlags{ + Skills: filterUpdateOpts{ + AddInclude: []string{"a-*", "b-*"}, + }, + }, + }, + { + name: "agent flags", + args: []string{ + "--add-agent-include", "team-*", + "--remove-agent-exclude", "draft-*", + }, + wantOpts: parsedTargetFilterFlags{ + Agents: filterUpdateOpts{ + AddInclude: []string{"team-*"}, + RemoveExclude: []string{"draft-*"}, + }, }, }, { @@ -50,9 +67,15 @@ func TestParseFilterFlags(t *testing.T) { args: []string{ "--mode", "merge", "--add-include", "team-*", + "--add-agent-exclude", "draft-*", }, - wantOpts: filterUpdateOpts{ - AddInclude: []string{"team-*"}, + wantOpts: parsedTargetFilterFlags{ + Skills: filterUpdateOpts{ + AddInclude: []string{"team-*"}, + }, + Agents: filterUpdateOpts{ + AddExclude: []string{"draft-*"}, + }, }, wantRest: []string{"--mode", "merge"}, }, @@ -91,15 +114,39 @@ func TestParseFilterFlags(t *testing.T) { t.Fatalf("unexpected error: %v", err) } - assertStringSlice(t, "AddInclude", opts.AddInclude, tt.wantOpts.AddInclude) - assertStringSlice(t, "AddExclude", opts.AddExclude, tt.wantOpts.AddExclude) - assertStringSlice(t, "RemoveInclude", opts.RemoveInclude, tt.wantOpts.RemoveInclude) - assertStringSlice(t, "RemoveExclude", opts.RemoveExclude, tt.wantOpts.RemoveExclude) + assertStringSlice(t, "Skills.AddInclude", opts.Skills.AddInclude, tt.wantOpts.Skills.AddInclude) + assertStringSlice(t, "Skills.AddExclude", opts.Skills.AddExclude, tt.wantOpts.Skills.AddExclude) + assertStringSlice(t, "Skills.RemoveInclude", opts.Skills.RemoveInclude, tt.wantOpts.Skills.RemoveInclude) + assertStringSlice(t, "Skills.RemoveExclude", 
opts.Skills.RemoveExclude, tt.wantOpts.Skills.RemoveExclude) + assertStringSlice(t, "Agents.AddInclude", opts.Agents.AddInclude, tt.wantOpts.Agents.AddInclude) + assertStringSlice(t, "Agents.AddExclude", opts.Agents.AddExclude, tt.wantOpts.Agents.AddExclude) + assertStringSlice(t, "Agents.RemoveInclude", opts.Agents.RemoveInclude, tt.wantOpts.Agents.RemoveInclude) + assertStringSlice(t, "Agents.RemoveExclude", opts.Agents.RemoveExclude, tt.wantOpts.Agents.RemoveExclude) assertStringSlice(t, "rest", rest, tt.wantRest) }) } } +func TestParseTargetSettingFlags(t *testing.T) { + settings, err := parseTargetSettingFlags([]string{ + "--mode", "copy", + "--agent-mode", "merge", + "--target-naming", "standard", + }) + if err != nil { + t.Fatalf("parseTargetSettingFlags: %v", err) + } + if settings.SkillMode != "copy" { + t.Fatalf("SkillMode = %q, want copy", settings.SkillMode) + } + if settings.AgentMode != "merge" { + t.Fatalf("AgentMode = %q, want merge", settings.AgentMode) + } + if settings.Naming != "standard" { + t.Fatalf("Naming = %q, want standard", settings.Naming) + } +} + func TestApplyFilterUpdates(t *testing.T) { tests := []struct { name string @@ -225,6 +272,22 @@ func TestFilterUpdateOpts_HasUpdates(t *testing.T) { } } +func TestScopeFilterChanges_Agents(t *testing.T) { + changes := scopeFilterChanges("agents", []string{ + "added include: team-*", + "added exclude: draft-*", + "removed include: team-*", + "removed exclude: draft-*", + }) + want := []string{ + "added agent include: team-*", + "added agent exclude: draft-*", + "removed agent include: team-*", + "removed agent exclude: draft-*", + } + assertStringSlice(t, "changes", changes, want) +} + func TestFindUnknownSkillTargets_CustomTargets(t *testing.T) { discovered := []ssync.DiscoveredSkill{ {RelPath: "skill-a", Targets: []string{"claude", "custom-tool"}}, diff --git a/cmd/skillshare/target_list_tui.go b/cmd/skillshare/target_list_tui.go index 37dfe193..22d6255b 100644 --- 
a/cmd/skillshare/target_list_tui.go +++ b/cmd/skillshare/target_list_tui.go @@ -2,11 +2,14 @@ package main import ( "fmt" + "path/filepath" "sort" "strings" "skillshare/internal/config" "skillshare/internal/sync" + "skillshare/internal/targetsummary" + "skillshare/internal/theme" "github.com/charmbracelet/bubbles/list" "github.com/charmbracelet/bubbles/spinner" @@ -59,6 +62,7 @@ type targetListTUIModel struct { // Mode picker overlay showModePicker bool modePickerTarget string // target name being edited + modePickerScope string // "skills" or "agents" modeCursor int // Naming picker overlay @@ -70,6 +74,7 @@ type targetListTUIModel struct { editingFilter bool // true when in I/E edit mode editFilterType string // "include" or "exclude" editFilterTarget string // target name being edited + editFilterScope string // "skills" or "agents" editPatterns []string editCursor int // selected pattern index editAdding bool @@ -79,6 +84,12 @@ type targetListTUIModel struct { confirming bool confirmTarget string + // Scope picker overlay for M/I/E when both skills and agents are available. 
+ showScopePicker bool + scopePickerTarget string + scopePickerAction string // "mode", "include", "exclude" + scopePickerCursor int + // Exit-with-action (for destructive ops dispatched after TUI exit) action string // "remove" or "" (normal quit) @@ -86,6 +97,12 @@ type targetListTUIModel struct { lastActionMsg string } +type targetScopeOption struct { + scope string + enabled bool + disabled string +} + func newTargetListTUIModel( modeLabel string, cfg *config.Config, @@ -96,7 +113,7 @@ func newTargetListTUIModel( l := list.New(nil, delegate, 0, 0) l.Title = fmt.Sprintf("Targets (%s)", modeLabel) - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) l.SetFilteringEnabled(false) l.SetShowHelp(false) @@ -104,18 +121,18 @@ func newTargetListTUIModel( sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() fi.Placeholder = "filter by name" ei := textinput.New() ei.Prompt = "> pattern: " - ei.PromptStyle = tc.Cyan - ei.Cursor.Style = tc.Cyan + ei.PromptStyle = theme.Accent() + ei.Cursor.Style = theme.Accent() ei.Placeholder = "glob pattern" return targetListTUIModel{ @@ -152,18 +169,30 @@ func buildTargetTUIItems(isProject bool, cwd string) ([]targetTUIItem, error) { if err != nil { return nil, err } + resolvedTargets, err := config.ResolveProjectTargets(cwd, projCfg) + if err != nil { + return nil, err + } + agentBuilder, err := targetsummary.NewProjectBuilder(cwd) + if err != nil { + return nil, err + } for _, entry := range projCfg.Targets { - sc := entry.SkillsConfig() + resolved, ok := resolvedTargets[entry.Name] + if !ok { + continue + } + agentSummary, err := agentBuilder.ProjectTarget(entry) + if err != nil { + return nil, err + } items = append(items, targetTUIItem{ - name: entry.Name, - target: 
config.TargetConfig{ - Skills: &config.ResourceTargetConfig{ - Path: projectTargetDisplayPath(entry), - Mode: sc.Mode, - Include: sc.Include, - Exclude: sc.Exclude, - }, - }, + name: entry.Name, + target: resolved, + displayPath: projectTargetDisplayPath(entry), + skillSync: buildTargetSkillSyncSummary(resolved.SkillsConfig().Path, filepath.Join(cwd, ".skillshare", "skills"), resolved.SkillsConfig().Mode), + agentConfig: config.ResourceTargetConfig{Mode: agentSummaryMode(agentSummary), Include: agentSummaryInclude(agentSummary), Exclude: agentSummaryExclude(agentSummary)}, + agentSummary: agentSummary, }) } } else { @@ -171,8 +200,23 @@ func buildTargetTUIItems(isProject bool, cwd string) ([]targetTUIItem, error) { if err != nil { return nil, err } + agentBuilder, err := targetsummary.NewGlobalBuilder(cfg) + if err != nil { + return nil, err + } for name, t := range cfg.Targets { - items = append(items, targetTUIItem{name: name, target: t}) + agentSummary, err := agentBuilder.GlobalTarget(name, t) + if err != nil { + return nil, err + } + items = append(items, targetTUIItem{ + name: name, + target: t, + displayPath: t.SkillsConfig().Path, + skillSync: buildTargetSkillSyncSummary(t.SkillsConfig().Path, cfg.Source, t.SkillsConfig().Mode), + agentConfig: config.ResourceTargetConfig{Mode: agentSummaryMode(agentSummary), Include: agentSummaryInclude(agentSummary), Exclude: agentSummaryExclude(agentSummary)}, + agentSummary: agentSummary, + }) } } sort.Slice(items, func(i, j int) bool { @@ -241,6 +285,9 @@ func (m targetListTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { if m.showModePicker { return m.handleModePickerKey(msg) } + if m.showScopePicker { + return m.handleScopePickerKey(msg) + } if m.showNamingPicker { return m.handleNamingPickerKey(msg) } @@ -303,6 +350,9 @@ func (m targetListTUIModel) handleNormalKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) return m, textinput.Blink case "M": if item, ok := m.list.SelectedItem().(targetTUIItem); ok { + if 
item.agentSummary != nil { + return m.openScopePicker(item, "mode") + } return m.openModePicker(item.name, item.target) } return m, nil @@ -313,11 +363,17 @@ func (m targetListTUIModel) handleNormalKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) return m, nil case "I": if item, ok := m.list.SelectedItem().(targetTUIItem); ok { + if item.agentSummary != nil { + return m.openScopePicker(item, "include") + } return m.openFilterEdit(item.name, "include", item.target.SkillsConfig().Include) } return m, nil case "E": if item, ok := m.list.SelectedItem().(targetTUIItem); ok { + if item.agentSummary != nil { + return m.openScopePicker(item, "exclude") + } return m.openFilterEdit(item.name, "exclude", item.target.SkillsConfig().Exclude) } return m, nil @@ -412,10 +468,15 @@ func (m targetListTUIModel) handleConfirmKey(msg tea.KeyMsg) (tea.Model, tea.Cmd var targetSyncModes = config.ExtraSyncModes // ["merge", "copy", "symlink"] func (m targetListTUIModel) openModePicker(name string, target config.TargetConfig) (tea.Model, tea.Cmd) { + return m.openModePickerForScope(name, target.SkillsConfig(), "skills") +} + +func (m targetListTUIModel) openModePickerForScope(name string, currentConfig config.ResourceTargetConfig, scope string) (tea.Model, tea.Cmd) { m.showModePicker = true m.modePickerTarget = name + m.modePickerScope = scope m.modeCursor = 0 - current := sync.EffectiveMode(target.SkillsConfig().Mode) + current := sync.EffectiveMode(currentConfig.Mode) for i, mode := range targetSyncModes { if mode == current { m.modeCursor = i @@ -445,15 +506,16 @@ func (m targetListTUIModel) handleModePickerKey(msg tea.KeyMsg) (tea.Model, tea. 
m.showModePicker = false newMode := targetSyncModes[m.modeCursor] name := m.modePickerTarget + scope := m.modePickerScope return m, func() tea.Msg { - msg, err := m.doSetTargetMode(name, newMode) + msg, err := m.doSetTargetMode(name, scope, newMode) return targetListActionDoneMsg{msg: msg, err: err} } } return m, nil } -func (m targetListTUIModel) doSetTargetMode(name, newMode string) (string, error) { +func (m targetListTUIModel) doSetTargetMode(name, scope, newMode string) (string, error) { if m.projCfg != nil { projCfg, err := config.LoadProject(m.cwd) if err != nil { @@ -461,7 +523,8 @@ func (m targetListTUIModel) doSetTargetMode(name, newMode string) (string, error } for i, entry := range projCfg.Targets { if entry.Name == name { - projCfg.Targets[i].EnsureSkills().Mode = newMode + targetCfg := scopeSetterProject(&projCfg.Targets[i], scope) + targetCfg.Mode = newMode break } } @@ -474,13 +537,14 @@ func (m targetListTUIModel) doSetTargetMode(name, newMode string) (string, error return "", err } t := cfg.Targets[name] - t.EnsureSkills().Mode = newMode + targetCfg := scopeSetterGlobal(&t, scope) + targetCfg.Mode = newMode cfg.Targets[name] = t if err := cfg.Save(); err != nil { return "", err } } - return fmt.Sprintf("✓ Set %s mode to %s", name, newMode), nil + return fmt.Sprintf("✓ Set %s %s mode to %s", name, scope, newMode), nil } // ─── Naming Picker ────────────────────────────────────────────────── @@ -560,9 +624,14 @@ func (m targetListTUIModel) doSetTargetNaming(name, newNaming string) (string, e // ─── Include/Exclude Edit Sub-Panel ────────────────────────────────── func (m targetListTUIModel) openFilterEdit(name, filterType string, patterns []string) (tea.Model, tea.Cmd) { + return m.openFilterEditForScope(name, "skills", filterType, patterns) +} + +func (m targetListTUIModel) openFilterEditForScope(name, scope, filterType string, patterns []string) (tea.Model, tea.Cmd) { m.editingFilter = true m.editFilterType = filterType m.editFilterTarget = name + 
m.editFilterScope = scope m.editPatterns = make([]string, len(patterns)) copy(m.editPatterns, patterns) m.editCursor = 0 @@ -604,9 +673,10 @@ func (m targetListTUIModel) handleFilterEditKey(msg tea.KeyMsg) (tea.Model, tea. m.editCursor-- } name := m.editFilterTarget + scope := m.editFilterScope filterType := m.editFilterType return m, func() tea.Msg { - msg, err := m.doRemovePattern(name, filterType, pattern) + msg, err := m.doRemovePattern(name, scope, filterType, pattern) return targetListActionDoneMsg{msg: msg, err: err} } } @@ -631,9 +701,10 @@ func (m targetListTUIModel) handleFilterEditAddKey(msg tea.KeyMsg) (tea.Model, t m.editPatterns = append(m.editPatterns, pattern) m.editCursor = len(m.editPatterns) - 1 name := m.editFilterTarget + scope := m.editFilterScope filterType := m.editFilterType return m, func() tea.Msg { - msg, err := m.doAddPattern(name, filterType, pattern) + msg, err := m.doAddPattern(name, scope, filterType, pattern) return targetListActionDoneMsg{msg: msg, err: err} } } @@ -642,7 +713,7 @@ func (m targetListTUIModel) handleFilterEditAddKey(msg tea.KeyMsg) (tea.Model, t return m, cmd } -func (m targetListTUIModel) doAddPattern(name, filterType, pattern string) (string, error) { +func (m targetListTUIModel) doAddPattern(name, scope, filterType, pattern string) (string, error) { if m.projCfg != nil { projCfg, err := config.LoadProject(m.cwd) if err != nil { @@ -650,11 +721,11 @@ func (m targetListTUIModel) doAddPattern(name, filterType, pattern string) (stri } for i, entry := range projCfg.Targets { if entry.Name == name { - sk := projCfg.Targets[i].EnsureSkills() + targetCfg := scopeSetterProject(&projCfg.Targets[i], scope) if filterType == "include" { - sk.Include = append(sk.Include, pattern) + targetCfg.Include = append(targetCfg.Include, pattern) } else { - sk.Exclude = append(sk.Exclude, pattern) + targetCfg.Exclude = append(targetCfg.Exclude, pattern) } break } @@ -668,21 +739,21 @@ func (m targetListTUIModel) doAddPattern(name, 
filterType, pattern string) (stri return "", err } t := cfg.Targets[name] - sk := t.EnsureSkills() + targetCfg := scopeSetterGlobal(&t, scope) if filterType == "include" { - sk.Include = append(sk.Include, pattern) + targetCfg.Include = append(targetCfg.Include, pattern) } else { - sk.Exclude = append(sk.Exclude, pattern) + targetCfg.Exclude = append(targetCfg.Exclude, pattern) } cfg.Targets[name] = t if err := cfg.Save(); err != nil { return "", err } } - return fmt.Sprintf("✓ Added %s pattern: %s", filterType, pattern), nil + return fmt.Sprintf("✓ Added %s %s pattern: %s", scope, filterType, pattern), nil } -func (m targetListTUIModel) doRemovePattern(name, filterType, pattern string) (string, error) { +func (m targetListTUIModel) doRemovePattern(name, scope, filterType, pattern string) (string, error) { removeFromSlice := func(slice []string, val string) []string { var result []string for _, s := range slice { @@ -700,11 +771,11 @@ func (m targetListTUIModel) doRemovePattern(name, filterType, pattern string) (s } for i, entry := range projCfg.Targets { if entry.Name == name { - sk := projCfg.Targets[i].EnsureSkills() + targetCfg := scopeSetterProject(&projCfg.Targets[i], scope) if filterType == "include" { - sk.Include = removeFromSlice(sk.Include, pattern) + targetCfg.Include = removeFromSlice(targetCfg.Include, pattern) } else { - sk.Exclude = removeFromSlice(sk.Exclude, pattern) + targetCfg.Exclude = removeFromSlice(targetCfg.Exclude, pattern) } break } @@ -718,18 +789,18 @@ func (m targetListTUIModel) doRemovePattern(name, filterType, pattern string) (s return "", err } t := cfg.Targets[name] - sk := t.EnsureSkills() + targetCfg := scopeSetterGlobal(&t, scope) if filterType == "include" { - sk.Include = removeFromSlice(sk.Include, pattern) + targetCfg.Include = removeFromSlice(targetCfg.Include, pattern) } else { - sk.Exclude = removeFromSlice(sk.Exclude, pattern) + targetCfg.Exclude = removeFromSlice(targetCfg.Exclude, pattern) } cfg.Targets[name] = t if 
err := cfg.Save(); err != nil { return "", err } } - return fmt.Sprintf("✓ Removed %s pattern: %s", filterType, pattern), nil + return fmt.Sprintf("✓ Removed %s %s pattern: %s", scope, filterType, pattern), nil } // ---- View ------------------------------------------------------------------- @@ -747,6 +818,9 @@ func (m targetListTUIModel) View() string { if m.showModePicker { return m.renderModePicker() } + if m.showScopePicker { + return m.renderScopePicker() + } if m.showNamingPicker { return m.renderNamingPicker() } @@ -822,20 +896,20 @@ func (m targetListTUIModel) viewTargetVertical() string { func renderTargetActionMsg(msg string) string { if strings.HasPrefix(msg, "✓") { - return tc.Green.Render(msg) + return theme.Success().Render(msg) } if strings.HasPrefix(msg, "✗") { - return tc.Red.Render(msg) + return theme.Danger().Render(msg) } - return tc.Yellow.Render(msg) + return theme.Warning().Render(msg) } func (m targetListTUIModel) renderTargetHelp(scrollInfo string) string { - helpText := "↑↓ navigate / filter Ctrl+d/u scroll M mode N naming I include E exclude R remove q quit" + helpText := "↑↓ navigate / filter Ctrl+d/u scroll M mode(sk/ag) N naming(sk) I include(sk/ag) E exclude(sk/ag) R remove q quit" if m.filtering { helpText = "Enter lock Esc clear q quit" } - return tc.Help.Render(appendScrollInfo(helpText, scrollInfo)) + return theme.Dim().MarginLeft(2).Render(appendScrollInfo(helpText, scrollInfo)) } func (m targetListTUIModel) renderTargetFilterBar() string { @@ -882,33 +956,82 @@ func (m *targetListTUIModel) syncTargetListSize() { func (m targetListTUIModel) renderTargetDetail(item targetTUIItem) string { var b strings.Builder - fmt.Fprintf(&b, "%s\n\n", tc.Title.Render(item.name)) + fmt.Fprintf(&b, "%s\n\n", theme.Title().Render(item.name)) sc := item.target.SkillsConfig() - fmt.Fprintf(&b, "%s %s\n", tc.Dim.Render("Path:"), shortenPath(sc.Path)) - fmt.Fprintf(&b, "%s %s\n", tc.Dim.Render("Mode:"), sync.EffectiveMode(sc.Mode)) - fmt.Fprintf(&b, 
"%s %s\n", tc.Dim.Render("Naming:"), config.EffectiveTargetNaming(sc.TargetNaming)) + displayPath := item.displayPath + if displayPath == "" { + displayPath = sc.Path + } + fmt.Fprintf(&b, "%s\n", theme.Dim().Render("Skills:")) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Path:"), shortenPath(displayPath)) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Mode:"), sync.EffectiveMode(sc.Mode)) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Naming:"), config.EffectiveTargetNaming(sc.TargetNaming)) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Sync:"), item.skillSync) if len(sc.Include) > 0 { - fmt.Fprintf(&b, "\n%s\n", tc.Dim.Render("Include:")) + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Include:")) for _, p := range sc.Include { fmt.Fprintf(&b, " %s\n", p) } } if len(sc.Exclude) > 0 { - fmt.Fprintf(&b, "\n%s\n", tc.Dim.Render("Exclude:")) + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Exclude:")) for _, p := range sc.Exclude { fmt.Fprintf(&b, " %s\n", p) } } if len(sc.Include) == 0 && len(sc.Exclude) == 0 { - fmt.Fprintf(&b, "\n%s\n", tc.Dim.Render("No include/exclude filters")) + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("No include/exclude filters")) + } + + if item.agentSummary != nil { + agentPath := item.agentSummary.DisplayPath + if agentPath == "" { + agentPath = item.agentSummary.Path + } + + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Agents:")) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Path:"), shortenPath(agentPath)) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Mode:"), item.agentSummary.Mode) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Sync:"), formatTargetAgentSyncSummary(item.agentSummary)) + + if item.agentSummary.Mode == "symlink" { + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Agent include/exclude filters ignored in symlink mode")) + } else if len(item.agentSummary.Include) > 0 { + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Agent Include:")) + for _, p := range item.agentSummary.Include { + fmt.Fprintf(&b, " 
%s\n", p) + } + } + if item.agentSummary.Mode != "symlink" && len(item.agentSummary.Exclude) > 0 { + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("Agent Exclude:")) + for _, p := range item.agentSummary.Exclude { + fmt.Fprintf(&b, " %s\n", p) + } + } + if item.agentSummary.Mode != "symlink" && len(item.agentSummary.Include) == 0 && len(item.agentSummary.Exclude) == 0 { + fmt.Fprintf(&b, "\n%s\n", theme.Dim().Render("No agent include/exclude filters")) + } } return b.String() } +func buildTargetSkillSyncSummary(targetPath, sourcePath, mode string) string { + switch sync.EffectiveMode(mode) { + case "copy": + status, managed, local := sync.CheckStatusCopy(targetPath) + return fmt.Sprintf("%s (%d managed, %d local)", status, managed, local) + case "merge": + status, linked, local := sync.CheckStatusMerge(targetPath, sourcePath) + return fmt.Sprintf("%s (%d shared, %d local)", status, linked, local) + default: + return sync.CheckStatus(targetPath, sourcePath).String() + } +} + // ---- Overlay renders -------------------------------------------------------- func (m targetListTUIModel) renderConfirmOverlay() string { @@ -918,19 +1041,19 @@ func (m targetListTUIModel) renderConfirmOverlay() string { } cmd := fmt.Sprintf("skillshare target remove %s %s", flag, m.confirmTarget) return fmt.Sprintf("\n %s\n\n → %s\n\n Proceed? 
[Y/n] ", - tc.Red.Render("Remove target "+m.confirmTarget+"?"), cmd) + theme.Danger().Render("Remove target "+m.confirmTarget+"?"), cmd) } func (m targetListTUIModel) renderModePicker() string { var b strings.Builder - fmt.Fprintf(&b, "\n%s\n", tc.Title.Render("Change mode")) - fmt.Fprintf(&b, "%s %s\n\n", tc.Dim.Render("Target:"), m.modePickerTarget) + fmt.Fprintf(&b, "\n%s\n", theme.Title().Render("Change "+m.modePickerScope+" mode")) + fmt.Fprintf(&b, "%s %s\n\n", theme.Dim().Render("Target:"), m.modePickerTarget) for i, mode := range targetSyncModes { cursor := " " if i == m.modeCursor { - cursor = tc.Cyan.Render(">") + " " + cursor = theme.Accent().Render(">") + " " } var desc string switch mode { @@ -942,26 +1065,59 @@ func (m targetListTUIModel) renderModePicker() string { desc = " (directory symlink)" } if i == m.modeCursor { - fmt.Fprintf(&b, "%s%s%s\n", cursor, tc.Cyan.Render(mode), tc.Dim.Render(desc)) + fmt.Fprintf(&b, "%s%s%s\n", cursor, theme.Accent().Render(mode), theme.Dim().Render(desc)) } else { - fmt.Fprintf(&b, "%s%s%s\n", cursor, mode, tc.Dim.Render(desc)) + fmt.Fprintf(&b, "%s%s%s\n", cursor, mode, theme.Dim().Render(desc)) } } - fmt.Fprintf(&b, "\n%s\n", tc.Help.Render("↑↓ select Enter confirm Esc cancel")) + fmt.Fprintf(&b, "\n%s\n", theme.Dim().MarginLeft(2).Render("↑↓ select Enter confirm Esc cancel")) + return b.String() +} + +func (m targetListTUIModel) renderScopePicker() string { + var b strings.Builder + item, ok := m.list.SelectedItem().(targetTUIItem) + if !ok { + return "" + } + options := targetScopeOptions(item, m.scopePickerAction) + + fmt.Fprintf(&b, "\n%s\n", theme.Title().Render("Choose resource")) + fmt.Fprintf(&b, "%s %s\n", theme.Dim().Render("Target:"), m.scopePickerTarget) + fmt.Fprintf(&b, "%s %s\n\n", theme.Dim().Render("Action:"), m.scopePickerAction) + + for i, option := range options { + cursor := " " + if i == m.scopePickerCursor { + cursor = theme.Accent().Render(">") + " " + } + label := capitalize(option.scope) 
+ if option.enabled { + if i == m.scopePickerCursor { + fmt.Fprintf(&b, "%s%s\n", cursor, theme.Accent().Render(label)) + } else { + fmt.Fprintf(&b, "%s%s\n", cursor, label) + } + continue + } + fmt.Fprintf(&b, "%s%s%s\n", cursor, theme.Dim().Render(label), theme.Dim().Render(" ("+option.disabled+")")) + } + + fmt.Fprintf(&b, "\n%s\n", theme.Dim().MarginLeft(2).Render("↑↓ select Enter confirm Esc cancel")) return b.String() } func (m targetListTUIModel) renderNamingPicker() string { var b strings.Builder - fmt.Fprintf(&b, "\n%s\n", tc.Title.Render("Change target naming")) - fmt.Fprintf(&b, "%s %s\n\n", tc.Dim.Render("Target:"), m.namingPickerTarget) + fmt.Fprintf(&b, "\n%s\n", theme.Title().Render("Change target naming")) + fmt.Fprintf(&b, "%s %s\n\n", theme.Dim().Render("Target:"), m.namingPickerTarget) for i, naming := range config.ValidTargetNamings { cursor := " " if i == m.namingCursor { - cursor = tc.Cyan.Render(">") + " " + cursor = theme.Accent().Render(">") + " " } var desc string switch naming { @@ -971,13 +1127,13 @@ func (m targetListTUIModel) renderNamingPicker() string { desc = " (SKILL.md name)" } if i == m.namingCursor { - fmt.Fprintf(&b, "%s%s%s\n", cursor, tc.Cyan.Render(naming), tc.Dim.Render(desc)) + fmt.Fprintf(&b, "%s%s%s\n", cursor, theme.Accent().Render(naming), theme.Dim().Render(desc)) } else { - fmt.Fprintf(&b, "%s%s%s\n", cursor, naming, tc.Dim.Render(desc)) + fmt.Fprintf(&b, "%s%s%s\n", cursor, naming, theme.Dim().Render(desc)) } } - fmt.Fprintf(&b, "\n%s\n", tc.Help.Render("↑↓ select Enter confirm Esc cancel")) + fmt.Fprintf(&b, "\n%s\n", theme.Dim().MarginLeft(2).Render("↑↓ select Enter confirm Esc cancel")) return b.String() } @@ -985,15 +1141,15 @@ func (m targetListTUIModel) renderFilterEditPanel() string { var b strings.Builder title := capitalize(m.editFilterType) - fmt.Fprintf(&b, "%s %s\n", tc.Title.Render(title+" patterns"), tc.Dim.Render("("+m.editFilterTarget+")")) + fmt.Fprintf(&b, "%s %s\n", theme.Title().Render(title+" 
"+m.editFilterScope+" patterns"), theme.Dim().Render("("+m.editFilterTarget+")")) fmt.Fprintln(&b) if len(m.editPatterns) == 0 { - fmt.Fprintf(&b, " %s\n", tc.Dim.Render("(empty)")) + fmt.Fprintf(&b, " %s\n", theme.Dim().Render("(empty)")) } else { for i, p := range m.editPatterns { if i == m.editCursor { - fmt.Fprintf(&b, " %s %s\n", tc.Cyan.Render(">"), tc.Cyan.Render(p)) + fmt.Fprintf(&b, " %s %s\n", theme.Accent().Render(">"), theme.Accent().Render(p)) } else { fmt.Fprintf(&b, " %s\n", p) } @@ -1007,9 +1163,9 @@ func (m targetListTUIModel) renderFilterEditPanel() string { fmt.Fprintln(&b) if m.editAdding { - fmt.Fprintf(&b, "%s\n", tc.Help.Render("Enter confirm Esc cancel")) + fmt.Fprintf(&b, "%s\n", theme.Dim().MarginLeft(2).Render("Enter confirm Esc cancel")) } else { - fmt.Fprintf(&b, "%s\n", tc.Help.Render("a add d delete esc back")) + fmt.Fprintf(&b, "%s\n", theme.Dim().MarginLeft(2).Render("a add d delete esc back")) } return b.String() } @@ -1067,3 +1223,136 @@ func runTargetListTUI(mode runMode, cwd string) (string, string, error) { } return m.action, m.confirmTarget, nil } + +func (m targetListTUIModel) openScopePicker(item targetTUIItem, action string) (tea.Model, tea.Cmd) { + m.showScopePicker = true + m.scopePickerTarget = item.name + m.scopePickerAction = action + m.scopePickerCursor = firstEnabledScopeOption(targetScopeOptions(item, action)) + m.lastActionMsg = "" + return m, nil +} + +func (m targetListTUIModel) handleScopePickerKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) { + item, ok := m.list.SelectedItem().(targetTUIItem) + if !ok { + m.showScopePicker = false + return m, nil + } + options := targetScopeOptions(item, m.scopePickerAction) + + switch msg.String() { + case "q", "esc": + m.showScopePicker = false + return m, nil + case "up", "k": + m.scopePickerCursor = moveScopePickerCursor(options, m.scopePickerCursor, -1) + return m, nil + case "down", "j": + m.scopePickerCursor = moveScopePickerCursor(options, m.scopePickerCursor, 1) + return m, 
nil + case "enter": + if len(options) == 0 || !options[m.scopePickerCursor].enabled { + m.showScopePicker = false + m.lastActionMsg = "✗ Agents include/exclude filters are ignored in symlink mode" + return m, nil + } + m.showScopePicker = false + scope := options[m.scopePickerCursor].scope + switch m.scopePickerAction { + case "mode": + return m.openModePickerForScope(item.name, itemConfigForScope(item, scope), scope) + case "include": + return m.openFilterEditForScope(item.name, scope, "include", itemConfigForScope(item, scope).Include) + case "exclude": + return m.openFilterEditForScope(item.name, scope, "exclude", itemConfigForScope(item, scope).Exclude) + } + return m, nil + } + return m, nil +} + +func targetScopeOptions(item targetTUIItem, action string) []targetScopeOption { + options := []targetScopeOption{{scope: "skills", enabled: true}} + if item.agentSummary == nil { + return options + } + + option := targetScopeOption{scope: "agents", enabled: true} + if (action == "include" || action == "exclude") && item.agentSummary.Mode == "symlink" { + option.enabled = false + option.disabled = "ignored in symlink mode" + } + return append(options, option) +} + +func firstEnabledScopeOption(options []targetScopeOption) int { + for i, option := range options { + if option.enabled { + return i + } + } + return 0 +} + +func moveScopePickerCursor(options []targetScopeOption, current, delta int) int { + if len(options) == 0 { + return 0 + } + if current < 0 || current >= len(options) { + current = firstEnabledScopeOption(options) + } + next := current + for { + candidate := next + delta + if candidate < 0 || candidate >= len(options) { + return current + } + next = candidate + if options[next].enabled { + return next + } + } +} + +func itemConfigForScope(item targetTUIItem, scope string) config.ResourceTargetConfig { + if scope == "agents" { + return item.agentConfig + } + return item.target.SkillsConfig() +} + +func scopeSetterGlobal(target *config.TargetConfig, scope 
string) *config.ResourceTargetConfig { + if scope == "agents" { + return target.EnsureAgents() + } + return target.EnsureSkills() +} + +func scopeSetterProject(target *config.ProjectTargetEntry, scope string) *config.ResourceTargetConfig { + if scope == "agents" { + return target.EnsureAgents() + } + return target.EnsureSkills() +} + +func agentSummaryMode(summary *targetsummary.AgentSummary) string { + if summary == nil { + return "" + } + return summary.Mode +} + +func agentSummaryInclude(summary *targetsummary.AgentSummary) []string { + if summary == nil { + return nil + } + return append([]string(nil), summary.Include...) +} + +func agentSummaryExclude(summary *targetsummary.AgentSummary) []string { + if summary == nil { + return nil + } + return append([]string(nil), summary.Exclude...) +} diff --git a/cmd/skillshare/target_list_tui_item.go b/cmd/skillshare/target_list_tui_item.go index ed45acb5..6aa4c615 100644 --- a/cmd/skillshare/target_list_tui_item.go +++ b/cmd/skillshare/target_list_tui_item.go @@ -6,6 +6,7 @@ import ( "skillshare/internal/config" "skillshare/internal/sync" + "skillshare/internal/targetsummary" "github.com/charmbracelet/bubbles/list" tea "github.com/charmbracelet/bubbletea" @@ -13,8 +14,12 @@ import ( // targetTUIItem wraps a target entry for the bubbles/list widget. 
type targetTUIItem struct { - name string - target config.TargetConfig + name string + target config.TargetConfig + displayPath string + skillSync string + agentConfig config.ResourceTargetConfig + agentSummary *targetsummary.AgentSummary } func (i targetTUIItem) FilterValue() string { return i.name } diff --git a/cmd/skillshare/target_project.go b/cmd/skillshare/target_project.go index f5e7761f..5c8c5429 100644 --- a/cmd/skillshare/target_project.go +++ b/cmd/skillshare/target_project.go @@ -11,6 +11,7 @@ import ( "skillshare/internal/config" "skillshare/internal/oplog" "skillshare/internal/sync" + "skillshare/internal/targetsummary" "skillshare/internal/ui" "skillshare/internal/utils" "skillshare/internal/validate" @@ -277,16 +278,27 @@ func targetListProjectWithJSON(root string, jsonOutput bool) error { }) if jsonOutput { + agentBuilder, err := targetsummary.NewProjectBuilder(root) + if err != nil { + return err + } + var items []targetListJSONItem for _, entry := range targets { sc := entry.SkillsConfig() - items = append(items, targetListJSONItem{ + item := targetListJSONItem{ Name: entry.Name, Path: projectTargetDisplayPath(entry), Mode: getTargetMode(sc.Mode, ""), Include: sc.Include, Exclude: sc.Exclude, - }) + } + agentSummary, err := agentBuilder.ProjectTarget(entry) + if err != nil { + return err + } + applyTargetListAgentSummary(&item, agentSummary) + items = append(items, item) } output := struct { Targets []targetListJSONItem `json:"targets"` @@ -294,17 +306,14 @@ func targetListProjectWithJSON(root string, jsonOutput bool) error { return writeJSON(&output) } - ui.Header("Configured Targets (project)") - for _, entry := range targets { - displayPath := projectTargetDisplayPath(entry) - sc := entry.SkillsConfig() - mode := sc.Mode - if mode == "" { - mode = "merge" - } - fmt.Printf(" %-12s %s (%s)\n", entry.Name, displayPath, mode) + items, err := buildTargetTUIItems(true, root) + if err != nil { + return err } + ui.Header("Configured Targets 
(project)") + printTargetListPlain(items) + return nil } @@ -320,23 +329,9 @@ func targetInfoProject(name string, args []string, root string) error { if err != nil { return err } - - var newMode, newNaming string - for i := 0; i < len(remaining); i++ { - switch remaining[i] { - case "--mode", "-m": - if i+1 >= len(remaining) { - return fmt.Errorf("--mode requires a value (merge, symlink, or copy)") - } - newMode = remaining[i+1] - i++ - case "--target-naming": - if i+1 >= len(remaining) { - return fmt.Errorf("--target-naming requires a value (flat or standard)") - } - newNaming = remaining[i+1] - i++ - } + settings, err := parseTargetSettingFlags(remaining) + if err != nil { + return err } cfg, err := config.LoadProject(root) @@ -360,13 +355,55 @@ func targetInfoProject(name string, args []string, root string) error { if filterOpts.hasUpdates() { start := time.Now() entry := &cfg.Targets[targetIdx] - s := entry.EnsureSkills() - changes, fErr := applyFilterUpdates(&s.Include, &s.Exclude, filterOpts) - if fErr != nil { - return fErr + var changes []string + mutated := false + + if filterOpts.Skills.hasUpdates() { + s := entry.EnsureSkills() + skillChanges, fErr := applyFilterUpdates(&s.Include, &s.Exclude, filterOpts.Skills) + if fErr != nil { + return fErr + } + changes = append(changes, skillChanges...) 
+ mutated = true } - if err := cfg.Save(root); err != nil { - return err + + if filterOpts.Agents.hasUpdates() { + agentBuilder, buildErr := targetsummary.NewProjectBuilder(root) + if buildErr != nil { + return buildErr + } + agentSummary, buildErr := agentBuilder.ProjectTarget(*entry) + if buildErr != nil { + return buildErr + } + if agentSummary == nil { + return fmt.Errorf("target '%s' does not have an agents path", name) + } + if agentSummary.Mode == "symlink" { + return fmt.Errorf("target '%s' agent include/exclude filters are ignored in symlink mode; use --agent-mode merge or --agent-mode copy first", name) + } + + ac := entry.AgentsConfig() + include := append([]string(nil), ac.Include...) + exclude := append([]string(nil), ac.Exclude...) + agentChanges, fErr := applyFilterUpdates(&include, &exclude, filterOpts.Agents) + if fErr != nil { + return fErr + } + if len(agentChanges) > 0 { + a := entry.EnsureAgents() + a.Include = include + a.Exclude = exclude + mutated = true + } + changes = append(changes, scopeFilterChanges("agents", agentChanges)...) 
+ } + + if mutated { + if err := cfg.Save(root); err != nil { + return err + } } for _, change := range changes { ui.Success("%s: %s", name, change) @@ -385,12 +422,16 @@ func targetInfoProject(name string, args []string, root string) error { return nil } - if newMode != "" { - return updateTargetModeProject(cfg, targetIdx, newMode, root) + if settings.SkillMode != "" { + return updateTargetModeProject(cfg, targetIdx, settings.SkillMode, root) + } + + if settings.AgentMode != "" { + return updateTargetAgentModeProject(cfg, targetIdx, settings.AgentMode, root) } - if newNaming != "" { - return updateTargetNamingProject(cfg, targetIdx, newNaming, root) + if settings.Naming != "" { + return updateTargetNamingProject(cfg, targetIdx, settings.Naming, root) } targets, err := config.ResolveProjectTargets(root, cfg) @@ -405,6 +446,14 @@ func targetInfoProject(name string, args []string, root string) error { targetEntry := cfg.Targets[targetIdx] sourcePath := filepath.Join(root, ".skillshare", "skills") + agentBuilder, err := targetsummary.NewProjectBuilder(root) + if err != nil { + return err + } + agentSummary, err := agentBuilder.ProjectTarget(targetEntry) + if err != nil { + return err + } sc := targetEntry.SkillsConfig() mode := sc.Mode @@ -439,6 +488,7 @@ func targetInfoProject(name string, args []string, root string) error { fmt.Printf(" Include: %s\n", formatFilterList(sc.Include)) fmt.Printf(" Exclude: %s\n", formatFilterList(sc.Exclude)) + printTargetAgentSection(agentSummary) return nil } @@ -464,6 +514,38 @@ func updateTargetModeProject(cfg *config.ProjectConfig, idx int, newMode string, return nil } +func updateTargetAgentModeProject(cfg *config.ProjectConfig, idx int, newMode string, root string) error { + if newMode != "merge" && newMode != "symlink" && newMode != "copy" { + return fmt.Errorf("invalid agent mode '%s'. 
Use 'merge', 'symlink', or 'copy'", newMode) + } + + entry := &cfg.Targets[idx] + agentBuilder, err := targetsummary.NewProjectBuilder(root) + if err != nil { + return err + } + agentSummary, err := agentBuilder.ProjectTarget(*entry) + if err != nil { + return err + } + if agentSummary == nil { + return fmt.Errorf("target '%s' does not have an agents path", entry.Name) + } + + oldMode := agentSummary.Mode + entry.EnsureAgents().Mode = newMode + if err := cfg.Save(root); err != nil { + return err + } + + ui.Success("Changed %s agent mode: %s -> %s", entry.Name, oldMode, newMode) + if newMode == "symlink" && (len(agentSummary.Include) > 0 || len(agentSummary.Exclude) > 0) { + ui.Warning("Agent include/exclude filters are ignored in symlink mode") + } + ui.Info("Run 'skillshare sync' to apply the new mode") + return nil +} + func updateTargetNamingProject(cfg *config.ProjectConfig, idx int, newNaming string, root string) error { if !config.IsValidTargetNaming(newNaming) { return fmt.Errorf("invalid target naming '%s'. Use 'flat' or 'standard'", newNaming) diff --git a/cmd/skillshare/trash.go b/cmd/skillshare/trash.go index f7ef5e1c..af6b6b39 100644 --- a/cmd/skillshare/trash.go +++ b/cmd/skillshare/trash.go @@ -3,6 +3,7 @@ package main import ( "fmt" "os" + "sort" "strings" "time" @@ -33,6 +34,9 @@ func cmdTrash(args []string) error { applyModeLabel(mode) + // Extract kind filter (e.g. "skillshare trash agents list" or "--all"). 
+ kind, rest := parseKindArgWithAll(rest) + if len(rest) == 0 { printTrashHelp() return nil @@ -54,13 +58,13 @@ func cmdTrash(args []string) error { switch sub { case "list", "ls": - return trashList(mode, cwd, noTUI) + return trashList(mode, cwd, noTUI, kind) case "restore": - return trashRestore(mode, cwd, filteredArgs) + return trashRestore(mode, cwd, filteredArgs, kind) case "delete", "rm": - return trashDelete(mode, cwd, filteredArgs) + return trashDelete(mode, cwd, filteredArgs, kind) case "empty": - return trashEmpty(mode, cwd) + return trashEmpty(mode, cwd, kind) case "--help", "-h", "help": printTrashHelp() return nil @@ -70,30 +74,63 @@ func cmdTrash(args []string) error { } } -func trashList(mode runMode, cwd string, noTUI bool) error { - trashBase := resolveTrashBase(mode, cwd) - items := trash.List(trashBase) +func trashList(mode runMode, cwd string, noTUI bool, kind resourceKindFilter) error { + // TUI path: merge skill + agent trash when kind includes both + if shouldLaunchTUI(noTUI, nil) { + var items []trash.TrashEntry - if len(items) == 0 { - ui.Info("Trash is empty") - return nil - } + if kind.IncludesSkills() { + skillBase := resolveTrashBase(mode, cwd, kindSkills) + for _, e := range trash.List(skillBase) { + e.Kind = "skill" + items = append(items, e) + } + } + if kind.IncludesAgents() { + agentBase := resolveTrashBase(mode, cwd, kindAgents) + for _, e := range trash.List(agentBase) { + e.Kind = "agent" + items = append(items, e) + } + } + + if len(items) == 0 { + ui.Info("Trash is empty") + return nil + } + + // Sort merged list by date (newest first) + sort.Slice(items, func(i, j int) bool { + return items[i].Date.After(items[j].Date) + }) - // TUI dispatch: TTY + items + TUI enabled - if shouldLaunchTUI(noTUI, nil) { modeLabel := "global" if mode == modeProject { modeLabel = "project" } + skillTrashBase := resolveTrashBase(mode, cwd, kindSkills) + agentTrashBase := resolveTrashBase(mode, cwd, kindAgents) cfgPath := resolveTrashCfgPath(mode, 
cwd) - destDir, err := resolveSourceDir(mode, cwd) + destDir, err := resolveSourceDir(mode, cwd, kindSkills) if err != nil { return err } - return runTrashTUI(items, trashBase, destDir, cfgPath, modeLabel) + agentDestDir, err := resolveSourceDir(mode, cwd, kindAgents) + if err != nil { + return err + } + return runTrashTUI(items, skillTrashBase, agentTrashBase, destDir, agentDestDir, cfgPath, modeLabel) + } + + // Plain text path (unchanged) — list single kind + trashBase := resolveTrashBase(mode, cwd, kind) + items := trash.List(trashBase) + + if len(items) == 0 { + ui.Info("Trash is empty") + return nil } - // Plain text output (--no-tui or non-TTY) ui.Header("Trash") for _, item := range items { age := time.Since(item.Date) @@ -110,7 +147,7 @@ func trashList(mode runMode, cwd string, noTUI bool) error { return nil } -func trashRestore(mode runMode, cwd string, args []string) error { +func trashRestore(mode runMode, cwd string, args []string, kind resourceKindFilter) error { start := time.Now() var name string @@ -136,7 +173,7 @@ func trashRestore(mode runMode, cwd string, args []string) error { cfgPath := resolveTrashCfgPath(mode, cwd) - trashBase := resolveTrashBase(mode, cwd) + trashBase := resolveTrashBase(mode, cwd, kind) entry := trash.FindByName(trashBase, name) if entry == nil { cmdErr := fmt.Errorf("'%s' not found in trash", name) @@ -144,28 +181,39 @@ func trashRestore(mode runMode, cwd string, args []string) error { return cmdErr } - destDir, err := resolveSourceDir(mode, cwd) + destDir, err := resolveSourceDir(mode, cwd, kind) if err != nil { logTrashOp(cfgPath, "restore", 0, name, start, err) return err } - if err := trash.Restore(entry, destDir); err != nil { - logTrashOp(cfgPath, "restore", 0, name, start, err) - return err + if kind == kindAgents { + if err := trash.RestoreAgent(entry, destDir); err != nil { + logTrashOp(cfgPath, "restore", 0, name, start, err) + return err + } + } else { + if err := trash.Restore(entry, destDir); err != nil { + 
logTrashOp(cfgPath, "restore", 0, name, start, err) + return err + } } ui.Success("Restored: %s", name) age := time.Since(entry.Date) ui.Info("Trashed %s ago, now back in %s", formatAge(age), destDir) ui.SectionLabel("Next Steps") - ui.Info("Run 'skillshare sync' to update targets") + syncHint := "skillshare sync" + if kind == kindAgents { + syncHint = "skillshare sync agents" + } + ui.Info("Run '%s' to update targets", syncHint) logTrashOp(cfgPath, "restore", 1, name, start, nil) return nil } -func trashDelete(mode runMode, cwd string, args []string) error { +func trashDelete(mode runMode, cwd string, args []string, kind resourceKindFilter) error { var name string for _, arg := range args { switch { @@ -187,7 +235,7 @@ func trashDelete(mode runMode, cwd string, args []string) error { return fmt.Errorf("skill name is required") } - trashBase := resolveTrashBase(mode, cwd) + trashBase := resolveTrashBase(mode, cwd, kind) entry := trash.FindByName(trashBase, name) if entry == nil { return fmt.Errorf("'%s' not found in trash", name) @@ -201,11 +249,11 @@ func trashDelete(mode runMode, cwd string, args []string) error { return nil } -func trashEmpty(mode runMode, cwd string) error { +func trashEmpty(mode runMode, cwd string, kind resourceKindFilter) error { start := time.Now() cfgPath := resolveTrashCfgPath(mode, cwd) - trashBase := resolveTrashBase(mode, cwd) + trashBase := resolveTrashBase(mode, cwd, kind) items := trash.List(trashBase) if len(items) == 0 { @@ -238,14 +286,30 @@ func trashEmpty(mode runMode, cwd string) error { return nil } -func resolveTrashBase(mode runMode, cwd string) string { +func resolveTrashBase(mode runMode, cwd string, kind resourceKindFilter) string { + if kind == kindAgents { + if mode == modeProject { + return trash.ProjectAgentTrashDir(cwd) + } + return trash.AgentTrashDir() + } if mode == modeProject { return trash.ProjectTrashDir(cwd) } return trash.TrashDir() } -func resolveSourceDir(mode runMode, cwd string) (string, error) { +func 
resolveSourceDir(mode runMode, cwd string, kind resourceKindFilter) (string, error) { + if kind == kindAgents { + if mode == modeProject { + return fmt.Sprintf("%s/.skillshare/agents", cwd), nil + } + cfg, err := config.Load() + if err != nil { + return "", fmt.Errorf("failed to load config: %w", err) + } + return cfg.EffectiveAgentsSource(), nil + } if mode == modeProject { return fmt.Sprintf("%s/.skillshare/skills", cwd), nil } @@ -285,7 +349,7 @@ func logTrashOp(cfgPath string, action string, count int, name string, start tim } func printTrashHelp() { - fmt.Println(`Usage: skillshare trash [options] + fmt.Println(`Usage: skillshare trash [agents] [options] Manage uninstalled skills in the trash. @@ -296,6 +360,7 @@ Commands: empty Permanently delete all items from trash Options: + --all Include both skills and agents --no-tui Disable interactive TUI, use plain text output --project, -p Use project-level trash --global, -g Use global trash @@ -307,5 +372,8 @@ Examples: skillshare trash restore my-skill # Restore from trash skillshare trash restore my-skill -p # Restore in project mode skillshare trash delete my-skill # Permanently delete from trash - skillshare trash empty # Empty the trash`) + skillshare trash empty # Empty the trash + skillshare trash agents list # List trashed agents + skillshare trash agents restore tutor # Restore an agent from trash + skillshare trash --all list # List trashed skills + agents`) } diff --git a/cmd/skillshare/trash_tui.go b/cmd/skillshare/trash_tui.go index aadd4845..7419c960 100644 --- a/cmd/skillshare/trash_tui.go +++ b/cmd/skillshare/trash_tui.go @@ -4,6 +4,7 @@ import ( "fmt" "os" "path/filepath" + "sort" "strings" "time" @@ -13,6 +14,7 @@ import ( tea "github.com/charmbracelet/bubbletea" "github.com/charmbracelet/lipgloss" + "skillshare/internal/theme" "skillshare/internal/trash" ) @@ -33,9 +35,15 @@ func (i trashItem) Title() string { if i.selected { check = "[x]" } + var kindBadge string + if i.entry.Kind == "agent" { 
+ kindBadge = theme.Accent().Render("[A]") + " " + } else { + kindBadge = theme.Accent().Render("[S]") + " " + } age := formatAge(time.Since(i.entry.Date)) size := formatBytes(i.entry.Size) - return fmt.Sprintf("%s %s (%s, %s ago)", check, i.entry.Name, size, age) + return fmt.Sprintf("%s %s%s (%s, %s ago)", check, kindBadge, i.entry.Name, size, age) } func (i trashItem) Description() string { return "" } @@ -51,14 +59,16 @@ type trashOpDoneMsg struct { // trashTUIModel is the bubbletea model for the interactive trash viewer. type trashTUIModel struct { - list list.Model - modeLabel string // "global" or "project" - trashBase string - destDir string - cfgPath string - quitting bool - termWidth int - termHeight int + list list.Model + modeLabel string // "global" or "project" + skillTrashBase string // for reload after operations + agentTrashBase string // for reload after operations + destDir string // skill restore destination + agentDestDir string // agent restore destination + cfgPath string + quitting bool + termWidth int + termHeight int // All items (source of truth for filter + selection) allItems []trashItem @@ -90,7 +100,7 @@ type trashTUIModel struct { detailScroll int } -func newTrashTUIModel(items []trash.TrashEntry, trashBase, destDir, cfgPath, modeLabel string) trashTUIModel { +func newTrashTUIModel(items []trash.TrashEntry, skillTrashBase, agentTrashBase, destDir, agentDestDir, cfgPath, modeLabel string) trashTUIModel { allItems := make([]trashItem, len(items)) listItems := make([]list.Item, len(items)) for i, entry := range items { @@ -101,7 +111,7 @@ func newTrashTUIModel(items []trash.TrashEntry, trashBase, destDir, cfgPath, mod l := list.New(listItems, newPrefixDelegate(false), 0, 0) l.Title = trashTUITitle(modeLabel, len(items)) - l.Styles.Title = tc.ListTitle + l.Styles.Title = theme.Title() l.SetShowStatusBar(false) l.SetFilteringEnabled(false) l.SetShowHelp(false) @@ -110,25 +120,27 @@ func newTrashTUIModel(items []trash.TrashEntry, 
trashBase, destDir, cfgPath, mod // Spinner for operations sp := spinner.New() sp.Spinner = spinner.Dot - sp.Style = tc.SpinnerStyle + sp.Style = theme.Accent() // Filter text input fi := textinput.New() fi.Prompt = "/ " - fi.PromptStyle = tc.Filter - fi.Cursor.Style = tc.Filter + fi.PromptStyle = theme.Accent() + fi.Cursor.Style = theme.Accent() return trashTUIModel{ - list: l, - modeLabel: modeLabel, - trashBase: trashBase, - destDir: destDir, - cfgPath: cfgPath, - allItems: allItems, - matchCount: len(allItems), - filterInput: fi, - selected: make(map[int]bool), - opSpinner: sp, + list: l, + modeLabel: modeLabel, + skillTrashBase: skillTrashBase, + agentTrashBase: agentTrashBase, + destDir: destDir, + agentDestDir: agentDestDir, + cfgPath: cfgPath, + allItems: allItems, + matchCount: len(allItems), + filterInput: fi, + selected: make(map[int]bool), + opSpinner: sp, } } @@ -222,12 +234,12 @@ func (m trashTUIModel) Update(msg tea.Msg) (tea.Model, tea.Cmd) { verb := capitalize(msg.action) + "d" switch { case msg.err != nil && msg.count > 0: - m.lastOpMsg = tc.Green.Render(fmt.Sprintf("%s %d item(s)", verb, msg.count)) + - " " + tc.Red.Render(fmt.Sprintf("Failed: %s", msg.err)) + m.lastOpMsg = theme.Success().Render(fmt.Sprintf("%s %d item(s)", verb, msg.count)) + + " " + theme.Danger().Render(fmt.Sprintf("Failed: %s", msg.err)) case msg.err != nil: - m.lastOpMsg = tc.Red.Render(fmt.Sprintf("Error: %s", msg.err)) + m.lastOpMsg = theme.Danger().Render(fmt.Sprintf("Error: %s", msg.err)) default: - m.lastOpMsg = tc.Green.Render(fmt.Sprintf("%s %d item(s)", verb, msg.count)) + m.lastOpMsg = theme.Success().Render(fmt.Sprintf("%s %d item(s)", verb, msg.count)) } m.rebuildFromEntries(msg.reloadedItems) return m, nil @@ -492,9 +504,11 @@ func (m trashTUIModel) startOperation() (tea.Model, tea.Cmd) { } else { entries = m.selectedEntries() } - trashBase := m.trashBase destDir := m.destDir + agentDestDir := m.agentDestDir cfgPath := m.cfgPath + skillTrashBase := 
m.skillTrashBase + agentTrashBase := m.agentTrashBase cmd := func() tea.Msg { start := time.Now() @@ -505,8 +519,14 @@ func (m trashTUIModel) startOperation() (tea.Model, tea.Cmd) { case "restore": for _, entry := range entries { e := entry // copy for closure - if err := trash.Restore(&e, destDir); err != nil { - errMsgs = append(errMsgs, fmt.Sprintf("%s: %s", entry.Name, err)) + var restoreErr error + if e.Kind == "agent" { + restoreErr = trash.RestoreAgent(&e, agentDestDir) + } else { + restoreErr = trash.Restore(&e, destDir) + } + if restoreErr != nil { + errMsgs = append(errMsgs, fmt.Sprintf("%s: %s", entry.Name, restoreErr)) continue // don't stop — process remaining items } count++ @@ -530,8 +550,19 @@ func (m trashTUIModel) startOperation() (tea.Model, tea.Cmd) { // Log the operation logTrashOp(cfgPath, action, count, "", start, opErr) - // Reload items from disk - reloaded := trash.List(trashBase) + // Reload items from disk — merge skill + agent trash + var reloaded []trash.TrashEntry + for _, e := range trash.List(skillTrashBase) { + e.Kind = "skill" + reloaded = append(reloaded, e) + } + for _, e := range trash.List(agentTrashBase) { + e.Kind = "agent" + reloaded = append(reloaded, e) + } + sort.Slice(reloaded, func(i, j int) bool { + return reloaded[i].Date.After(reloaded[j].Date) + }) return trashOpDoneMsg{ action: action, count: count, @@ -620,7 +651,7 @@ func (m trashTUIModel) viewTrashSplit() string { } help := appendScrollInfo(m.trashHelpBar(), scrollInfo) - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() @@ -656,7 +687,7 @@ func (m trashTUIModel) viewTrashVertical() string { } help := appendScrollInfo(m.trashHelpBar(), scrollInfo) - b.WriteString(tc.Help.Render(help)) + b.WriteString(theme.Dim().MarginLeft(2).Render(help)) b.WriteString("\n") return b.String() @@ -670,14 +701,15 @@ func (m trashTUIModel) viewConfirm() string { verb := m.confirmAction switch verb { 
case "restore": - b.WriteString(fmt.Sprintf(" Restore %d item(s) to %s?\n\n", len(m.confirmNames), m.destDir)) + b.WriteString(m.renderRestoreConfirmHeader()) + b.WriteString("\n") case "delete": b.WriteString(" ") - b.WriteString(tc.Red.Render(fmt.Sprintf("Permanently delete %d item(s)?", len(m.confirmNames)))) + b.WriteString(theme.Danger().Render(fmt.Sprintf("Permanently delete %d item(s)?", len(m.confirmNames)))) b.WriteString("\n\n") case "empty": b.WriteString(" ") - b.WriteString(tc.Red.Render(fmt.Sprintf("Empty trash — permanently delete ALL %d item(s)?", len(m.confirmNames)))) + b.WriteString(theme.Danger().Render(fmt.Sprintf("Empty trash — permanently delete ALL %d item(s)?", len(m.confirmNames)))) b.WriteString("\n\n") } @@ -694,12 +726,38 @@ func (m trashTUIModel) viewConfirm() string { } b.WriteString("\n ") - b.WriteString(tc.Help.Render("y confirm n cancel")) + b.WriteString(theme.Dim().MarginLeft(2).Render("y confirm n cancel")) b.WriteString("\n") return b.String() } +func (m trashTUIModel) renderRestoreConfirmHeader() string { + var hasSkills, hasAgents bool + for _, entry := range m.selectedEntries() { + switch entry.Kind { + case "agent": + hasAgents = true + default: + hasSkills = true + } + } + + switch { + case hasSkills && hasAgents: + return fmt.Sprintf( + " Restore %d item(s)?\n\n skills -> %s\n agents -> %s\n", + len(m.confirmNames), + m.destDir, + m.agentDestDir, + ) + case hasAgents: + return fmt.Sprintf(" Restore %d item(s) to %s?\n", len(m.confirmNames), m.agentDestDir) + default: + return fmt.Sprintf(" Restore %d item(s) to %s?\n", len(m.confirmNames), m.destDir) + } +} + // --------------------------------------------------------------------------- // Rendering helpers // --------------------------------------------------------------------------- @@ -736,11 +794,11 @@ func (m trashTUIModel) renderTrashSummaryFooter() string { totalSize += item.entry.Size } parts := []string{ - tc.Emphasis.Render(formatNumber(m.matchCount)) + 
tc.Dim.Render("/") + - tc.Dim.Render(formatNumber(len(m.allItems))) + tc.Dim.Render(" items"), - tc.Dim.Render("Total: ") + tc.Cyan.Render(formatBytes(totalSize)), + theme.Primary().Render(formatNumber(m.matchCount)) + theme.Dim().Render("/") + + theme.Dim().Render(formatNumber(len(m.allItems))) + theme.Dim().Render(" items"), + theme.Dim().Render("Total: ") + theme.Accent().Render(formatBytes(totalSize)), } - return tc.Help.Render(strings.Join(parts, tc.Dim.Render(" | "))) + "\n" + return theme.Dim().MarginLeft(2).Render(strings.Join(parts, theme.Dim().Render(" | "))) + "\n" } // renderTrashDetailPanel renders the detail section for the selected trash entry. @@ -748,7 +806,7 @@ func (m trashTUIModel) renderTrashDetailPanel(entry trash.TrashEntry, width int) var b strings.Builder // Header: bold skill name - b.WriteString(tc.Title.Render(entry.Name)) + b.WriteString(theme.Title().Render(entry.Name)) b.WriteString("\n\n") // Metadata rows @@ -756,10 +814,15 @@ func (m trashTUIModel) renderTrashDetailPanel(entry trash.TrashEntry, width int) row := func(label, value string) { b.WriteString(labelStyle.Render(label + ":")) b.WriteString(" ") - b.WriteString(tc.Value.Render(value)) + b.WriteString(lipgloss.NewStyle().Render(value)) b.WriteString("\n") } + if entry.Kind == "agent" { + row("Type", theme.Accent().Render("Agent")) + } else { + row("Type", theme.Accent().Render("Skill")) + } row("Trashed", entry.Date.Format("2006-01-02 15:04:05")) row("Age", formatAge(time.Since(entry.Date))+" ago") row("Size", formatBytes(entry.Size)) @@ -772,21 +835,38 @@ func (m trashTUIModel) renderTrashDetailPanel(entry trash.TrashEntry, width int) } row("Path", pathStr) - // SKILL.md preview — read first 15 lines - skillMD := filepath.Join(entry.Path, "SKILL.md") - if data, err := os.ReadFile(skillMD); err == nil { - lines := strings.SplitN(string(data), "\n", 16) - if len(lines) > 15 { - lines = lines[:15] + // Content preview — SKILL.md for skills, agent .md file for agents + var 
previewFile, previewTitle string + if entry.Kind == "agent" { + // Find the .md file inside the trash directory + if entries, readErr := os.ReadDir(entry.Path); readErr == nil { + for _, e := range entries { + if !e.IsDir() && strings.HasSuffix(e.Name(), ".md") { + previewFile = filepath.Join(entry.Path, e.Name()) + previewTitle = e.Name() + break + } + } } - preview := strings.TrimRight(strings.Join(lines, "\n"), "\n") - if preview != "" { - b.WriteString("\n") - b.WriteString(tc.Title.Render("SKILL.md")) - b.WriteString("\n") - for _, line := range strings.Split(preview, "\n") { - b.WriteString(tc.Dim.Render(line)) + } else { + previewFile = filepath.Join(entry.Path, "SKILL.md") + previewTitle = "SKILL.md" + } + if previewFile != "" { + if data, err := os.ReadFile(previewFile); err == nil { + lines := strings.SplitN(string(data), "\n", 16) + if len(lines) > 15 { + lines = lines[:15] + } + preview := strings.TrimRight(strings.Join(lines, "\n"), "\n") + if preview != "" { b.WriteString("\n") + b.WriteString(theme.Title().Render(previewTitle)) + b.WriteString("\n") + for _, line := range strings.Split(preview, "\n") { + b.WriteString(theme.Dim().Render(line)) + b.WriteString("\n") + } } } } @@ -799,8 +879,8 @@ func (m trashTUIModel) renderTrashDetailPanel(entry trash.TrashEntry, width int) // --------------------------------------------------------------------------- // runTrashTUI starts the bubbletea TUI for the trash viewer. 
-func runTrashTUI(items []trash.TrashEntry, trashBase, destDir, cfgPath, modeLabel string) error { - model := newTrashTUIModel(items, trashBase, destDir, cfgPath, modeLabel) +func runTrashTUI(items []trash.TrashEntry, skillTrashBase, agentTrashBase, destDir, agentDestDir, cfgPath, modeLabel string) error { + model := newTrashTUIModel(items, skillTrashBase, agentTrashBase, destDir, agentDestDir, cfgPath, modeLabel) p := tea.NewProgram(model, tea.WithAltScreen(), tea.WithMouseCellMotion()) _, err := p.Run() return err diff --git a/cmd/skillshare/trash_tui_test.go b/cmd/skillshare/trash_tui_test.go new file mode 100644 index 00000000..68887971 --- /dev/null +++ b/cmd/skillshare/trash_tui_test.go @@ -0,0 +1,46 @@ +package main + +import ( + "strings" + "testing" + + "skillshare/internal/trash" +) + +func TestTrashTUIRenderRestoreConfirmHeader_UsesAgentDestination(t *testing.T) { + model := newTrashTUIModel([]trash.TrashEntry{ + {Name: "tutor", Kind: "agent"}, + }, "", "", "/tmp/skills", "/tmp/agents", "", "global") + model.selected[0] = true + model.selCount = 1 + model.confirmAction = "restore" + model.confirmNames = []string{"tutor"} + + got := model.renderRestoreConfirmHeader() + if !strings.Contains(got, "/tmp/agents") { + t.Fatalf("expected agent restore header to use agent destination, got %q", got) + } + if strings.Contains(got, "/tmp/skills") { + t.Fatalf("expected agent restore header to avoid skill destination, got %q", got) + } +} + +func TestTrashTUIRenderRestoreConfirmHeader_ShowsMixedDestinations(t *testing.T) { + model := newTrashTUIModel([]trash.TrashEntry{ + {Name: "demo-skill", Kind: "skill"}, + {Name: "tutor", Kind: "agent"}, + }, "", "", "/tmp/skills", "/tmp/agents", "", "global") + model.selected[0] = true + model.selected[1] = true + model.selCount = 2 + model.confirmAction = "restore" + model.confirmNames = []string{"demo-skill", "tutor"} + + got := model.renderRestoreConfirmHeader() + if !strings.Contains(got, "skills -> /tmp/skills") { + 
t.Fatalf("expected mixed restore header to mention skills destination, got %q", got) + } + if !strings.Contains(got, "agents -> /tmp/agents") { + t.Fatalf("expected mixed restore header to mention agent destination, got %q", got) + } +} diff --git a/cmd/skillshare/tui_colors.go b/cmd/skillshare/tui_colors.go deleted file mode 100644 index 590150e5..00000000 --- a/cmd/skillshare/tui_colors.go +++ /dev/null @@ -1,149 +0,0 @@ -package main - -import ( - "strings" - - "skillshare/internal/ui" - - "github.com/charmbracelet/lipgloss" -) - -// tuiBrandYellow is the logo yellow used for active/selected item borders across all TUIs. -const tuiBrandYellow = lipgloss.Color("#D4D93C") - -// tc centralizes shared color styles used across all TUI views. -// Domain-specific structs (ac, lc) reference tc for base colors. -var tc = struct { - BrandYellow lipgloss.Color - - // Semantic - Title lipgloss.Style // section headings — bold cyan - Emphasis lipgloss.Style // primary values, bright text — bright white (15) - Dim lipgloss.Style // secondary info, labels, descriptions — SGR dim - Faint lipgloss.Style // decorative chrome, borders, help — SGR dim - Cyan lipgloss.Style // emphasis, targets — cyan - Green lipgloss.Style // ok, passed - Yellow lipgloss.Style // warning - Red lipgloss.Style // error, blocked - - // Detail panel - Label lipgloss.Style // row labels (width 14) - Value lipgloss.Style // default foreground - File lipgloss.Style // file names — dim - Target lipgloss.Style // target names — cyan - Separator lipgloss.Style // horizontal rules — faint - Border lipgloss.Style // panel borders — faint - - // Filter & help - Filter lipgloss.Style // filter prompt/cursor — cyan - Help lipgloss.Style // help bar — faint, left margin - - // List browser chrome - ListRow lipgloss.Style - ListMeta lipgloss.Style - ListRowSelected lipgloss.Style - ListMetaSelected lipgloss.Style - ListRowPrefix lipgloss.Style - ListRowPrefixSelected lipgloss.Style - BadgeLocal lipgloss.Style - 
BadgeRemote lipgloss.Style - BadgeDisabled lipgloss.Style - - // Severity — shared across all TUIs (audit, log, etc.) - Critical lipgloss.Style // red, bold - High lipgloss.Style // orange - Medium lipgloss.Style // yellow - Low lipgloss.Style // bright blue - Info lipgloss.Style // medium gray - - // List chrome - ListTitle lipgloss.Style // list title — bold cyan - SpinnerStyle lipgloss.Style // loading spinner — cyan -}{ - BrandYellow: lipgloss.Color("#D4D93C"), - - Title: lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("6")), - Emphasis: lipgloss.NewStyle().Foreground(lipgloss.Color("15")), - Dim: lipgloss.NewStyle().Faint(true), - Faint: lipgloss.NewStyle().Faint(true), - Cyan: lipgloss.NewStyle().Foreground(lipgloss.Color("6")), - Green: lipgloss.NewStyle().Foreground(lipgloss.Color("2")), - Yellow: lipgloss.NewStyle().Foreground(lipgloss.Color("3")), - Red: lipgloss.NewStyle().Foreground(lipgloss.Color("1")), - - Label: lipgloss.NewStyle().Faint(true).Width(14), - Value: lipgloss.NewStyle(), - File: lipgloss.NewStyle().Faint(true), - Target: lipgloss.NewStyle().Foreground(lipgloss.Color("6")), - Separator: lipgloss.NewStyle().Faint(true), - Border: lipgloss.NewStyle().Faint(true), - - Filter: lipgloss.NewStyle().Foreground(lipgloss.Color("6")), - Help: lipgloss.NewStyle().MarginLeft(2).Faint(true), - - ListRow: lipgloss.NewStyle().PaddingLeft(1), - ListMeta: lipgloss.NewStyle().PaddingLeft(1).Faint(true), - ListRowSelected: lipgloss.NewStyle().PaddingLeft(1).Foreground(lipgloss.Color("15")).Background(lipgloss.Color("237")).Bold(true), - ListMetaSelected: lipgloss.NewStyle().PaddingLeft(1).Foreground(lipgloss.Color("250")).Background(lipgloss.Color("237")), - ListRowPrefix: lipgloss.NewStyle().Foreground(lipgloss.Color("235")), - ListRowPrefixSelected: lipgloss.NewStyle().Foreground(tuiBrandYellow), - BadgeLocal: lipgloss.NewStyle().Foreground(lipgloss.Color("250")).Background(lipgloss.Color("237")). 
- Padding(0, 1), - BadgeRemote: lipgloss.NewStyle().Foreground(lipgloss.Color("228")).Background(lipgloss.Color("239")). - Padding(0, 1), - BadgeDisabled: lipgloss.NewStyle().Foreground(lipgloss.Color("245")).Background(lipgloss.Color("236")). - Padding(0, 1).Faint(true), - - Critical: lipgloss.NewStyle().Foreground(lipgloss.Color(ui.SeverityIDCritical)).Bold(true), - High: lipgloss.NewStyle().Foreground(lipgloss.Color(ui.SeverityIDHigh)), - Medium: lipgloss.NewStyle().Foreground(lipgloss.Color(ui.SeverityIDMedium)), - Low: lipgloss.NewStyle().Foreground(lipgloss.Color(ui.SeverityIDLow)), - Info: lipgloss.NewStyle().Foreground(lipgloss.Color(ui.SeverityIDInfo)), - - ListTitle: lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("6")), - SpinnerStyle: lipgloss.NewStyle().Foreground(lipgloss.Color("6")), -} - -// tcSevStyle returns the severity lipgloss style from the centralized tc config. -func tcSevStyle(severity string) lipgloss.Style { - switch strings.ToUpper(severity) { - case "CRITICAL": - return tc.Critical - case "HIGH": - return tc.High - case "MEDIUM": - return tc.Medium - case "LOW": - return tc.Low - case "INFO": - return tc.Info - default: - return tc.Dim - } -} - -// riskLabelStyle maps a lowercase risk label to the matching lipgloss style. -func riskLabelStyle(label string) lipgloss.Style { - switch strings.ToLower(label) { - case "clean": - return tc.Green - case "low": - return tc.Low - case "medium": - return tc.Medium - case "high": - return tc.High - case "critical": - return tc.Critical - default: - return tc.Dim - } -} - -// formatRiskBadgeLipgloss returns a colored risk badge for TUI list items. 
-func formatRiskBadgeLipgloss(label string) string { - if label == "" { - return "" - } - return " " + riskLabelStyle(label).Render("["+label+"]") -} diff --git a/cmd/skillshare/tui_helpers.go b/cmd/skillshare/tui_helpers.go index 17ab4c3c..17aac64a 100644 --- a/cmd/skillshare/tui_helpers.go +++ b/cmd/skillshare/tui_helpers.go @@ -7,6 +7,7 @@ import ( "github.com/charmbracelet/lipgloss" "skillshare/internal/config" + "skillshare/internal/theme" "skillshare/internal/ui" ) @@ -46,6 +47,31 @@ func appendScrollInfo(help, scrollInfo string) string { return help } +// formatHelpBar colorizes a help string like "Tab skills/agents ↑↓ navigate q quit". +// Each pair "key desc" is parsed: key gets HelpKey style (dim cyan), desc stays dim. +// Pairs are separated by two or more spaces. +func formatHelpBar(raw string) string { + // Split by double-space to get individual "key desc" pairs + pairs := strings.Split(raw, " ") + var parts []string + for _, pair := range pairs { + pair = strings.TrimSpace(pair) + if pair == "" { + continue + } + // Split first space: key + description + if idx := strings.IndexByte(pair, ' '); idx > 0 { + key := pair[:idx] + desc := pair[idx:] + parts = append(parts, theme.Accent().Faint(true).Render(key)+theme.Dim().MarginLeft(2).UnsetMarginLeft().Render(desc)) + } else { + // Single word (e.g. just a key) + parts = append(parts, theme.Accent().Faint(true).Render(pair)) + } + } + return " " + strings.Join(parts, " ") +} + // applyDetailScrollSplit applies scrolling and returns (visible content, scroll info). func applyDetailScrollSplit(content string, detailScroll, viewHeight int) (string, string) { lines := strings.Split(content, "\n") @@ -102,7 +128,7 @@ func renderHorizontalSplit(leftContent, rightContent string, leftWidth, rightWid Height(panelHeight).MaxHeight(panelHeight). Render(leftContent) - borderStyle := tc.Border. + borderStyle := theme.Dim(). 
Height(panelHeight).MaxHeight(panelHeight) borderCol := strings.Repeat("│\n", panelHeight) borderPanel := borderStyle.Render(strings.TrimRight(borderCol, "\n")) diff --git a/cmd/skillshare/uninstall.go b/cmd/skillshare/uninstall.go index 97eee8db..c1e99bf3 100644 --- a/cmd/skillshare/uninstall.go +++ b/cmd/skillshare/uninstall.go @@ -23,9 +23,10 @@ import ( // uninstallOptions holds parsed arguments for uninstall command type uninstallOptions struct { - skillNames []string // positional args (0+) - groups []string // --group/-G values (repeatable) - all bool // --all: remove ALL skills from source + skillNames []string // positional args (0+) + groups []string // --group/-G values (repeatable) + kind resourceKindFilter // set by positional filter (e.g. "uninstall agents") + all bool // --all: remove ALL skills from source force bool dryRun bool jsonOutput bool @@ -396,7 +397,7 @@ func (s uninstallTypeSummary) details() string { } // displayUninstallInfo shows information about the skill to be uninstalled -func displayUninstallInfo(target *uninstallTarget) { +func displayUninstallInfo(target *uninstallTarget, store *install.MetadataStore) { if target.isTrackedRepo { ui.Header("Uninstalling tracked repository") ui.Info("Type: tracked repository") @@ -411,9 +412,11 @@ func displayUninstallInfo(target *uninstallTarget) { } else { ui.Header("Uninstalling skill") } - if meta, err := install.ReadMeta(target.path); err == nil && meta != nil { - ui.Info("Source: %s", meta.Source) - ui.Info("Installed: %s", meta.InstalledAt.Format("2006-01-02 15:04")) + if entry := store.Get(target.name); entry != nil { + ui.Info("Source: %s", entry.Source) + if !entry.InstalledAt.IsZero() { + ui.Info("Installed: %s", entry.InstalledAt.Format("2006-01-02 15:04")) + } } } ui.Info("Name: %s", target.name) @@ -491,9 +494,9 @@ func performUninstallQuiet(target *uninstallTarget) (typeLabel string, err error // performUninstall moves the skill to trash (verbose single-target output). 
// Note: .gitignore cleanup is handled in batch by the caller. -func performUninstall(target *uninstallTarget) error { +func performUninstall(target *uninstallTarget, store *install.MetadataStore) error { // Read metadata before moving (for reinstall hint) - meta, _ := install.ReadMeta(target.path) + entry := store.Get(target.name) groupSkillCount := 0 if !target.isTrackedRepo { groupSkillCount = len(countGroupSkills(target.path)) @@ -512,8 +515,8 @@ func performUninstall(target *uninstallTarget) error { ui.Success("Uninstalled skill: %s", target.name) } ui.Info("Moved to trash (7 days): %s", trashPath) - if meta != nil && meta.Source != "" { - ui.Info("Reinstall: skillshare install %s", meta.Source) + if entry != nil && entry.Source != "" { + ui.Info("Reinstall: skillshare install %s", entry.Source) } ui.SectionLabel("Next Steps") ui.Info("Run 'skillshare sync' to update all targets") @@ -549,7 +552,20 @@ func cmdUninstall(args []string) error { applyModeLabel(mode) + // Extract kind filter (e.g. "skillshare uninstall agents myagent"). + kind, rest := parseKindArg(rest) + if mode == modeProject { + if kind == kindAgents { + agentsDir := filepath.Join(cwd, ".skillshare", "agents") + opts, _, _ := parseUninstallArgs(rest) + if opts == nil { + opts = &uninstallOptions{skillNames: rest} + } + opts.force = opts.force || opts.jsonOutput + err := cmdUninstallAgents(agentsDir, opts, config.ProjectConfigPath(cwd), trash.ProjectAgentTrashDir(cwd), start) + return err + } err := cmdUninstallProject(rest, cwd) logUninstallOp(config.ProjectConfigPath(cwd), uninstallOpNames(rest), 0, start, err) return err @@ -577,6 +593,19 @@ func cmdUninstall(args []string) error { return fmt.Errorf("failed to load config: %w", err) } + // Agent-only uninstall: move .md + sidecar to agent trash, then return. 
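The `parseKindArg` calls above strip a leading `agents` keyword before normal argument parsing, so `skillshare uninstall agents tutor` routes to the agent path while everything else keeps the existing skill behavior. A minimal sketch of that dispatch, assuming the keyword must be the first positional argument — `splitKindArg` is a hypothetical stand-in for the real helper:

```go
package main

import "fmt"

// splitKindArg consumes a leading "agents" keyword: if present, the
// command is scoped to agents and the keyword is removed from args;
// otherwise the kind defaults to skills and args pass through untouched.
func splitKindArg(args []string) (kind string, rest []string) {
	if len(args) > 0 && args[0] == "agents" {
		return "agents", args[1:]
	}
	return "skills", args
}

func main() {
	kind, rest := splitKindArg([]string{"agents", "tutor", "--force"})
	fmt.Println(kind, rest) // agents [tutor --force]
}
```

Consuming the keyword before flag parsing is what lets the remaining arguments be fed unchanged into `parseUninstallArgs`.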
+ if kind == kindAgents { + agentsDir := cfg.EffectiveAgentsSource() + err := cmdUninstallAgents(agentsDir, opts, config.ConfigPath(), trash.AgentTrashDir(), start) + return err + } + + // Load centralized metadata store for display/reinstall hints. + skillsStore, _ := install.LoadMetadataWithMigration(cfg.Source, "") + if skillsStore == nil { + skillsStore = install.NewMetadataStore() + } + // --- Phase 1: RESOLVE --- var targets []*uninstallTarget seen := map[string]bool{} // dedup by path @@ -711,7 +740,7 @@ func cmdUninstall(args []string) error { if opts.jsonOutput { // Skip display in JSON mode } else if single { - displayUninstallInfo(targets[0]) + displayUninstallInfo(targets[0], skillsStore) } else { ui.Header(fmt.Sprintf("Uninstalling %d %s", len(targets), summary.noun())) if len(targets) > 20 { @@ -853,8 +882,8 @@ func cmdUninstall(args []string) error { if t.isTrackedRepo { ui.Warning("[dry-run] would remove %s from .gitignore", t.name) } - if meta, err := install.ReadMeta(t.path); err == nil && meta != nil && meta.Source != "" { - ui.Info("[dry-run] Reinstall: skillshare install %s", meta.Source) + if entry := skillsStore.Get(t.name); entry != nil && entry.Source != "" { + ui.Info("[dry-run] Reinstall: skillshare install %s", entry.Source) } } return nil @@ -1015,7 +1044,7 @@ func cmdUninstall(args []string) error { } } else { for _, t := range targets { - if err := performUninstall(t); err != nil { + if err := performUninstall(t, skillsStore); err != nil { failed = append(failed, fmt.Sprintf("%s: %v", t.name, err)) } else { succeeded = append(succeeded, t) @@ -1039,42 +1068,15 @@ func cmdUninstall(args []string) error { } // --- Phase 7: FINALIZE --- - // Batch-remove succeeded skills from registry + // Batch-remove succeeded skills from metadata store if len(succeeded) > 0 { - regDir := cfg.RegistryDir - reg, regErr := config.LoadRegistry(regDir) - if regErr != nil { - ui.Warning("Failed to load registry: %v", regErr) - } else if len(reg.Skills) > 0 
{ - removedNames := map[string]bool{} - for _, t := range succeeded { - removedNames[t.name] = true - } - updated := make([]config.SkillEntry, 0, len(reg.Skills)) - for _, s := range reg.Skills { - fullName := s.FullName() - if removedNames[fullName] { - continue - } - // When a group directory is uninstalled, also remove its member skills - memberOfRemoved := false - for name := range removedNames { - if strings.HasPrefix(fullName, name+"/") { - memberOfRemoved = true - break - } - } - if memberOfRemoved { - continue - } - updated = append(updated, s) - } - if len(updated) != len(reg.Skills) { - reg.Skills = updated - if saveErr := reg.Save(regDir); saveErr != nil { - ui.Warning("Failed to update registry after uninstall: %v", saveErr) - } - } + removedNames := map[string]bool{} + for _, t := range succeeded { + removedNames[t.name] = true + } + skillsStore.RemoveByNames(removedNames) + if saveErr := skillsStore.Save(cfg.Source); saveErr != nil { + ui.Warning("Failed to update metadata after uninstall: %v", saveErr) } } @@ -1152,6 +1154,7 @@ func logUninstallOp(cfgPath string, names []string, succeeded int, start time.Ti func printUninstallHelp() { fmt.Println(`Usage: skillshare uninstall ... 
[options] + skillshare uninstall [agents] [options] skillshare uninstall --group [options] skillshare uninstall --all [options] @@ -1187,5 +1190,8 @@ Examples: skillshare uninstall --group frontend -n # Preview group removal skillshare uninstall x -G backend --force # Mix names and groups skillshare uninstall _team-repo # Remove tracked repository - skillshare uninstall team-repo # _ prefix is optional`) + skillshare uninstall team-repo # _ prefix is optional + skillshare uninstall agents tutor # Uninstall an agent + skillshare uninstall agents --all # Uninstall all agents + skillshare uninstall agents -G demo # Uninstall all agents in demo/`) } diff --git a/cmd/skillshare/uninstall_agents.go b/cmd/skillshare/uninstall_agents.go new file mode 100644 index 00000000..8411bf65 --- /dev/null +++ b/cmd/skillshare/uninstall_agents.go @@ -0,0 +1,198 @@ +package main + +import ( + "fmt" + "os" + "path/filepath" + "strings" + "time" + + "skillshare/internal/install" + "skillshare/internal/oplog" + "skillshare/internal/resource" + "skillshare/internal/trash" + "skillshare/internal/ui" +) + +// cmdUninstallAgents removes agents from the source directory by moving them to agent trash. 
+func cmdUninstallAgents(agentsDir string, opts *uninstallOptions, cfgPath string, trashBase string, start time.Time) error { + if _, err := os.Stat(agentsDir); err != nil { + if os.IsNotExist(err) { + return fmt.Errorf("agents source directory does not exist: %s", agentsDir) + } + return fmt.Errorf("cannot access agents source: %w", err) + } + + // Discover all agents for resolution + discovered, discErr := resource.AgentKind{}.Discover(agentsDir) + if discErr != nil { + return fmt.Errorf("failed to discover agents: %w", discErr) + } + + // Resolve targets + var targets []resource.DiscoveredResource + if opts.all { + targets = discovered + if len(targets) == 0 { + ui.Info("No agents found") + return nil + } + } else { + for _, input := range opts.skillNames { + found := false + for _, d := range discovered { + if d.Name == input || d.FlatName == input || d.RelPath == input || strings.TrimSuffix(d.RelPath, ".md") == input { + targets = append(targets, d) + found = true + break + } + } + if !found { + return fmt.Errorf("agent %q not found in %s", input, agentsDir) + } + } + + // Resolve --group targets + if len(opts.groups) > 0 { + groupFiltered, err := filterDiscoveredAgentsByGroups(discovered, opts.groups, agentsDir) + if err != nil { + return err + } + if len(groupFiltered) == 0 { + return fmt.Errorf("no agents found in group(s): %s", strings.Join(opts.groups, ", ")) + } + // Deduplicate against already-resolved name targets + seen := make(map[string]bool, len(targets)) + for _, t := range targets { + seen[t.RelPath] = true + } + for _, d := range groupFiltered { + if !seen[d.RelPath] { + targets = append(targets, d) + } + } + } + } + + if len(targets) == 0 { + return fmt.Errorf("specify agent name(s), --group, or --all") + } + + // Confirmation (unless --force or --json) + if !opts.force && !opts.jsonOutput { + ui.Warning("Uninstalling %d agent(s)", len(targets)) + const maxDisplay = 20 + display := targets + if len(display) > maxDisplay { + display = 
display[:maxDisplay] + } + for _, t := range display { + fmt.Printf(" - %s\n", strings.TrimSuffix(t.RelPath, ".md")) + } + if len(targets) > maxDisplay { + fmt.Printf(" ... and %d more\n", len(targets)-maxDisplay) + } + fmt.Println() + fmt.Print("Continue? [y/N] ") + var input string + fmt.Scanln(&input) + input = strings.TrimSpace(strings.ToLower(input)) + if input != "y" && input != "yes" { + ui.Info("Cancelled") + return nil + } + } + + store, _ := install.LoadMetadata(agentsDir) + var removed []string + var failed []string + + for _, t := range targets { + agentFile := filepath.Join(agentsDir, t.RelPath) + + displayName := strings.TrimSuffix(t.RelPath, ".md") + if opts.dryRun { + ui.Info("[dry-run] Would remove agent: %s", displayName) + removed = append(removed, displayName) + continue + } + + // Trash the agent file (+ legacy sidecar if it still exists) + metaName := strings.TrimSuffix(filepath.Base(t.RelPath), ".md") + legacySidecar := filepath.Join(filepath.Dir(agentFile), metaName+".skillshare-meta.json") + _, err := trash.MoveAgentToTrash(agentFile, legacySidecar, displayName, trashBase) + if err != nil { + ui.Error("Failed to remove %s: %v", displayName, err) + failed = append(failed, displayName) + continue + } + + // Remove from centralized metadata store + if store != nil { + store.Remove(displayName) + } + + ui.Success("Removed agent: %s", displayName) + removed = append(removed, displayName) + } + + // Save store after all removals + if store != nil && len(removed) > 0 { + store.Save(agentsDir) //nolint:errcheck + } + + // JSON output + if opts.jsonOutput { + output := struct { + Removed []string `json:"removed"` + Failed []string `json:"failed"` + DryRun bool `json:"dry_run"` + Duration string `json:"duration"` + }{ + Removed: removed, + Failed: failed, + DryRun: opts.dryRun, + Duration: formatDuration(start), + } + var jsonErr error + if len(failed) > 0 { + jsonErr = fmt.Errorf("%d agent(s) failed to uninstall", len(failed)) + } + return 
writeJSONResult(&output, jsonErr) + } + + // Summary + if !opts.dryRun { + fmt.Println() + ui.Info("%d agent(s) removed, %d failed", len(removed), len(failed)) + if len(removed) > 0 { + ui.Info("Run 'skillshare sync agents' to update targets") + } + } + + // Oplog + logUninstallAgentOp(cfgPath, removed, len(removed), len(failed), opts.dryRun, start) + + if len(failed) > 0 { + return fmt.Errorf("%d agent(s) failed to uninstall", len(failed)) + } + return nil +} + +func logUninstallAgentOp(cfgPath string, names []string, removed, failed int, dryRun bool, start time.Time) { + status := "ok" + if failed > 0 && removed > 0 { + status = "partial" + } else if failed > 0 { + status = "error" + } + e := oplog.NewEntry("uninstall", status, time.Since(start)) + e.Args = map[string]any{ + "resource_kind": "agent", + "names": names, + "removed": removed, + "failed": failed, + "dry_run": dryRun, + } + oplog.WriteWithLimit(cfgPath, oplog.OpsFile, e, logMaxEntries()) //nolint:errcheck +} diff --git a/cmd/skillshare/uninstall_project.go b/cmd/skillshare/uninstall_project.go index b7bbfe00..819e2da5 100644 --- a/cmd/skillshare/uninstall_project.go +++ b/cmd/skillshare/uninstall_project.go @@ -7,7 +7,6 @@ import ( "path/filepath" "strings" - "skillshare/internal/config" "skillshare/internal/install" "skillshare/internal/sync" "skillshare/internal/trash" @@ -89,6 +88,12 @@ func cmdUninstallProject(args []string, root string) error { sourceDir := filepath.Join(root, ".skillshare", "skills") trashDir := trash.ProjectTrashDir(root) + // Load centralized metadata store for display/reinstall hints. + skillsStore, _ := install.LoadMetadataWithMigration(sourceDir, "") + if skillsStore == nil { + skillsStore = install.NewMetadataStore() + } + // Backward compat: ensure operational dirs are gitignored for projects created before v0.17.3. 
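The registry-pruning loops this changeset deletes (now folded into `skillsStore.RemoveByNames`) drop both each uninstalled name and any `name/…` member of an uninstalled group directory. That membership rule can be sketched in isolation — `pruneRemoved` is an illustrative name, not the store's actual method:

```go
package main

import (
	"fmt"
	"strings"
)

// pruneRemoved drops every entry whose full name was uninstalled, plus
// any "name/..." member of an uninstalled group — the same rule as the
// deleted registry loop.
func pruneRemoved(names []string, removed map[string]bool) []string {
	kept := make([]string, 0, len(names))
	for _, n := range names {
		if removed[n] {
			continue // exact match: the entry itself was uninstalled
		}
		member := false
		for r := range removed {
			if strings.HasPrefix(n, r+"/") {
				member = true // entry lives inside a removed group dir
				break
			}
		}
		if !member {
			kept = append(kept, n)
		}
	}
	return kept
}

func main() {
	got := pruneRemoved(
		[]string{"team", "team/helper", "solo"},
		map[string]bool{"team": true},
	)
	fmt.Println(got) // [solo]
}
```

The `r+"/"` suffix in the prefix check is what keeps a skill named `teamwork` from being pruned when `team` is uninstalled.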
_ = ensureProjectGitignore(root, false) @@ -172,7 +177,7 @@ func cmdUninstallProject(args []string, root string) error { single := len(targets) == 1 summary := summarizeUninstallTargets(targets) if single { - displayUninstallInfo(targets[0]) + displayUninstallInfo(targets[0], skillsStore) } else { ui.Header(fmt.Sprintf("Uninstalling %d %s", len(targets), summary.noun())) if len(targets) > 20 { @@ -239,8 +244,8 @@ func cmdUninstallProject(args []string, root string) error { for _, t := range targets { ui.Warning("[dry-run] would move to trash: %s", t.path) ui.Warning("[dry-run] would update .skillshare/.gitignore") - if meta, err := install.ReadMeta(t.path); err == nil && meta != nil && meta.Source != "" { - ui.Info("[dry-run] Reinstall: skillshare install %s --project", meta.Source) + if entry := skillsStore.Get(t.name); entry != nil && entry.Source != "" { + ui.Info("[dry-run] Reinstall: skillshare install %s --project", entry.Source) } } return nil @@ -369,7 +374,7 @@ func cmdUninstallProject(args []string, root string) error { } } else { for _, t := range targets { - meta, _ := install.ReadMeta(t.path) + entry := skillsStore.Get(t.name) groupSkillCount := 0 if !t.isTrackedRepo { groupSkillCount = len(countGroupSkills(t.path)) @@ -390,8 +395,8 @@ func cmdUninstallProject(args []string, root string) error { ui.Success("Uninstalled skill: %s", t.name) } ui.Info("Moved to trash (7 days): %s", trashPath) - if meta != nil && meta.Source != "" { - ui.Info("Reinstall: skillshare install %s --project", meta.Source) + if entry != nil && entry.Source != "" { + ui.Info("Reinstall: skillshare install %s --project", entry.Source) } succeeded = append(succeeded, t) } @@ -409,41 +414,15 @@ func cmdUninstallProject(args []string, root string) error { } // --- Phase 7: FINALIZE --- + // Batch-remove succeeded skills from metadata store if len(succeeded) > 0 { - regDir := filepath.Join(root, ".skillshare") - reg, regErr := config.LoadRegistry(regDir) - if regErr != nil { - 
ui.Warning("Failed to load registry: %v", regErr) - } else if len(reg.Skills) > 0 { - removedNames := map[string]bool{} - for _, t := range succeeded { - removedNames[t.name] = true - } - updated := make([]config.SkillEntry, 0, len(reg.Skills)) - for _, s := range reg.Skills { - fullName := s.FullName() - if removedNames[fullName] { - continue - } - // When a group directory is uninstalled, also remove its member skills - memberOfRemoved := false - for name := range removedNames { - if strings.HasPrefix(fullName, name+"/") { - memberOfRemoved = true - break - } - } - if memberOfRemoved { - continue - } - updated = append(updated, s) - } - if len(updated) != len(reg.Skills) { - reg.Skills = updated - if saveErr := reg.Save(regDir); saveErr != nil { - ui.Warning("Failed to update registry after uninstall: %v", saveErr) - } - } + removedNames := map[string]bool{} + for _, t := range succeeded { + removedNames[t.name] = true + } + skillsStore.RemoveByNames(removedNames) + if saveErr := skillsStore.Save(sourceDir); saveErr != nil { + ui.Warning("Failed to update metadata after uninstall: %v", saveErr) } } diff --git a/cmd/skillshare/update.go b/cmd/skillshare/update.go index 73b2841d..d59e1062 100644 --- a/cmd/skillshare/update.go +++ b/cmd/skillshare/update.go @@ -130,6 +130,24 @@ func cmdUpdate(args []string) error { applyModeLabel(mode) + // Extract kind filter (e.g. 
"skillshare update agents") + kind, rest := parseKindArg(rest) + + // Agent-only update: dispatch to correct scope + if kind == kindAgents { + if mode == modeProject { + return cmdUpdateAgentsProject(rest, cwd, start) + } + cfg, loadErr := config.Load() + if loadErr != nil { + if hasFlag(rest, "--json") { + return writeJSONError(loadErr) + } + return loadErr + } + return cmdUpdateAgents(rest, cfg, start) + } + if mode == modeProject { // Parse opts for logging (cmdUpdateProject parses again internally) projOpts, _, _ := parseUpdateArgs(rest) @@ -167,24 +185,15 @@ func cmdUpdate(args []string) error { // In JSON mode, redirect all UI output to stderr early so the // header, step, spinner, and handler output don't corrupt stdout. - var restoreJSONUI func() - restoreJSONUIIfNeeded := func() { - if restoreJSONUI != nil { - restoreJSONUI() - restoreJSONUI = nil - } - } - if opts.jsonOutput { - restoreJSONUI = suppressUIToDevnull() - } - defer restoreJSONUIIfNeeded() + jsonUI := newJSONUISuppressor(opts.jsonOutput) + defer jsonUI.Flush() jsonWriteError := func(err error) error { - restoreJSONUIIfNeeded() + jsonUI.Flush() return writeJSONError(err) } jsonWriteResult := func(result *updateResult, cmdErr error) error { - restoreJSONUIIfNeeded() + jsonUI.Flush() return updateOutputJSON(result, opts.dryRun, start, cmdErr) } @@ -200,6 +209,7 @@ func cmdUpdate(args []string) error { // Recursive discovery for --all scanSpinner := ui.StartSpinner("Scanning skills...") walkRoot := utils.ResolveSymlink(cfg.Source) + metaStore, _ := install.LoadMetadataWithMigration(cfg.Source, "") err := filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error { if err != nil || path == walkRoot { return nil @@ -226,12 +236,11 @@ func cmdUpdate(args []string) error { // Regular skill if !info.IsDir() && info.Name() == "SKILL.md" { skillDir := filepath.Dir(path) - meta, metaErr := install.ReadMeta(skillDir) - if metaErr == nil && meta != nil && meta.Source != "" { - rel, _ := 
filepath.Rel(walkRoot, skillDir) - if rel != "." && !seen[rel] { + rel, _ := filepath.Rel(walkRoot, skillDir) + if rel != "." && !seen[rel] { + if entry := metaStore.GetByPath(rel); entry != nil && entry.Source != "" { seen[rel] = true - targets = append(targets, updateTarget{name: rel, path: skillDir, isRepo: false, meta: meta}) + targets = append(targets, updateTarget{name: rel, path: skillDir, isRepo: false, meta: entry}) } } } @@ -245,6 +254,8 @@ func cmdUpdate(args []string) error { return fmt.Errorf("failed to scan skills: %w", err) } } else { + // Load store once for name resolution + nameStore, _ := install.LoadMetadata(cfg.Source) // Resolve by specific names/groups for _, name := range opts.names { // Glob pattern matching (e.g. "core-*", "_team-?") @@ -268,7 +279,7 @@ func cmdUpdate(args []string) error { continue } - if isGroupDir(name, cfg.Source) { + if isGroupDir(name, cfg.Source, nameStore) { groupMatches, groupErr := resolveGroupUpdatable(name, cfg.Source) if groupErr != nil { resolveWarnings = append(resolveWarnings, fmt.Sprintf("%s: %v", name, groupErr)) @@ -481,6 +492,7 @@ func logUpdateOp(cfgPath string, names []string, opts *updateOptions, mode strin func printUpdateHelp() { fmt.Println(`Usage: skillshare update ... 
[options] + skillshare update [agents] [options] skillshare update --group [options] skillshare update --all [options] @@ -528,5 +540,8 @@ Examples: skillshare update --all -T high # Use HIGH threshold for this run skillshare update --all --dry-run # Preview updates skillshare update _team --force # Discard changes and update - skillshare update --all --prune # Update all + remove stale skills`) + skillshare update --all --prune # Update all + remove stale skills + skillshare update agents --all # Update all agents + skillshare update agents tutor # Update a single agent + skillshare update agents -G demo # Update all agents in demo/`) } diff --git a/cmd/skillshare/update_agents.go b/cmd/skillshare/update_agents.go new file mode 100644 index 00000000..c652db9b --- /dev/null +++ b/cmd/skillshare/update_agents.go @@ -0,0 +1,850 @@ +package main + +import ( + "fmt" + "os" + "path/filepath" + "strings" + "time" + + "skillshare/internal/check" + "skillshare/internal/config" + "skillshare/internal/install" + "skillshare/internal/oplog" + "skillshare/internal/resource" + "skillshare/internal/ui" +) + +// cmdUpdateAgents handles "skillshare update agents [name|--all]". 
+func cmdUpdateAgents(args []string, cfg *config.Config, start time.Time) error { + jsonRequested := hasFlag(args, "--json") + opts, showHelp, parseErr := parseUpdateAgentArgs(args) + if showHelp { + printUpdateHelp() + return nil + } + if parseErr != nil { + if jsonRequested { + return writeJSONError(parseErr) + } + return parseErr + } + if opts.threshold == "" { + opts.threshold = cfg.Audit.BlockThreshold + } + + jsonUI := newJSONUISuppressor(opts.jsonOutput) + defer jsonUI.Flush() + + jsonWriteResult := func(items []agentUpdateItem, cmdErr error) error { + jsonUI.Flush() + return updateAgentsOutputJSON(items, opts.dryRun, start, cmdErr) + } + failJSON := func(err error) error { + if opts.jsonOutput { + jsonUI.Flush() + return writeJSONError(err) + } + return err + } + + agentsDir := cfg.EffectiveAgentsSource() + if _, err := os.Stat(agentsDir); err != nil { + if os.IsNotExist(err) { + if opts.jsonOutput { + return jsonWriteResult(nil, nil) + } + ui.Info("No agents source directory (%s)", agentsDir) + return nil + } + return failJSON(fmt.Errorf("cannot access agents source: %w", err)) + } + + // Discover agents and check status + results := check.CheckAgents(agentsDir) + if len(results) == 0 { + if opts.jsonOutput { + return jsonWriteResult(nil, nil) + } + ui.Info("No agents found") + return nil + } + + // Filter by name if specified + if len(opts.names) > 0 { + results = filterAgentCheckResults(results, opts.names) + if len(results) == 0 { + return failJSON(fmt.Errorf("no matching agents found: %s", strings.Join(opts.names, ", "))) + } + } + + // Filter by group if specified + if len(opts.groups) > 0 { + var err error + results, err = filterAgentResultsByGroups(results, opts.groups, agentsDir) + if err != nil { + return failJSON(err) + } + if len(results) == 0 { + return failJSON(fmt.Errorf("no agents found in group(s): %s", strings.Join(opts.groups, ", "))) + } + } + + // Only check agents that have remote sources + tracked := 
collectTrackedAgentResults(results) + + if len(tracked) == 0 { + if opts.jsonOutput { + return jsonWriteResult(agentUpdateItemsFromCheckResults(results), nil) + } + ui.Info("No tracked agents to update (all are local)") + return nil + } + + // Enrich with remote status + if !opts.jsonOutput { + sp := ui.StartSpinner(fmt.Sprintf("Checking %d agent(s) for updates...", len(tracked))) + check.EnrichAgentResultsWithRemote(tracked, func() { sp.Success("Check complete") }) + } else { + check.EnrichAgentResultsWithRemote(tracked, nil) + } + mergeTrackedAgentResults(results, tracked) + + // Find agents with updates available + var updatable []check.AgentCheckResult + for _, r := range tracked { + if r.Status == "update_available" { + updatable = append(updatable, r) + } + } + finalItems := agentUpdateItemsFromCheckResults(results) + + if len(updatable) == 0 { + if opts.jsonOutput { + return jsonWriteResult(finalItems, nil) + } + ui.Success("All agents are up to date") + return nil + } + + if !opts.jsonOutput { + ui.Header("Updating agents") + if opts.dryRun { + ui.Warning("Dry run mode - no changes will be made") + } + } + + // Update agents, batching by repo URL to share git clones. 
+ var ( + updated int + failed int + updatedItems []agentUpdateItem + ) + if opts.dryRun { + for _, r := range updatable { + if !opts.jsonOutput { + ui.Info(" %s: update available from %s", r.Name, r.Source) + } + } + } else { + updatedItems, updated, failed = batchUpdateAgents(agentsDir, updatable, opts, "", !opts.jsonOutput) + finalItems = mergeAgentUpdateItems(finalItems, updatedItems) + } + + if !opts.jsonOutput && !opts.dryRun { + fmt.Println() + ui.Info("Agent update: %d updated, %d failed", updated, failed) + } + + logUpdateAgentOp(config.ConfigPath(), len(updatable), updated, failed, opts.dryRun, start) + + if opts.jsonOutput { + var cmdErr error + if failed > 0 { + cmdErr = fmt.Errorf("%d agent(s) failed to update", failed) + } + return jsonWriteResult(finalItems, cmdErr) + } + + if failed > 0 { + return fmt.Errorf("%d agent(s) failed to update", failed) + } + return nil +} + +// agentRepoKey groups agents by clone URL + branch + repo subdir so agents +// from the same scope share a single git clone. +type agentRepoKey struct { + cloneURL string + branch string + repoSubdir string +} + +type agentUpdateItem struct { + Name string `json:"name"` + Source string `json:"source,omitempty"` + Status string `json:"status"` + Message string `json:"message,omitempty"` + Error string `json:"error,omitempty"` +} + +// batchUpdateAgents groups agents by repo URL and clones once per group. +// Agents with no RepoURL fall back to per-agent reinstallAgent. 
+func batchUpdateAgents(agentsDir string, agents []check.AgentCheckResult, opts *updateAgentArgs, projectRoot string, verbose bool) ([]agentUpdateItem, int, int) { + store := install.LoadMetadataOrNew(agentsDir) + trackedRepos := map[string][]check.AgentCheckResult{} + groups := map[agentRepoKey][]check.AgentCheckResult{} + var noRepo []check.AgentCheckResult + var items []agentUpdateItem + updated, failed := 0, 0 + + for _, r := range agents { + if r.RepoPath != "" { + trackedRepos[r.RepoPath] = append(trackedRepos[r.RepoPath], r) + continue + } + if r.RepoURL == "" { + noRepo = append(noRepo, r) + continue + } + entry := store.GetByPath(r.Name) + if entry == nil || entry.Source == "" { + noRepo = append(noRepo, r) + continue + } + + source, parseErr := install.ParseSource(entry.Source) + if parseErr != nil { + noRepo = append(noRepo, r) + continue + } + repoSubdir := strings.TrimSuffix(source.Subdir, entry.Subdir) + repoSubdir = strings.TrimRight(repoSubdir, "/") + + key := agentRepoKey{ + cloneURL: r.RepoURL, + branch: entry.Branch, + repoSubdir: repoSubdir, + } + groups[key] = append(groups[key], r) + } + + for repoPath, members := range trackedRepos { + uc := &updateContext{ + sourcePath: agentsDir, + projectRoot: projectRoot, + opts: &updateOptions{ + force: opts.force, + skipAudit: opts.skipAudit, + threshold: opts.threshold, + }, + } + ok, _, err := updateTrackedRepoQuick(uc, repoPath) + if err != nil { + for _, m := range members { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "failed", + Message: "tracked repo update failed", + Error: err.Error(), + }) + if verbose { + ui.Error(" %s: %v", m.Name, err) + } + failed++ + } + continue + } + if !ok { + for _, m := range members { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "skipped", + Message: "up-to-date", + }) + } + continue + } + for _, m := range members { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: 
m.Source, + Status: "updated", + Message: "tracked repo updated", + }) + if verbose { + ui.Success(" %s: updated", m.Name) + } + updated++ + } + } + + // Batch: one clone per repo group + for key, members := range groups { + source := &install.Source{ + CloneURL: key.cloneURL, + Subdir: key.repoSubdir, + Branch: key.branch, + } + + var discovery *install.DiscoveryResult + var discErr error + if source.HasSubdir() { + discovery, discErr = install.DiscoverFromGitSubdir(source) + } else { + discovery, discErr = install.DiscoverFromGit(source) + } + if discErr != nil { + for _, m := range members { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "failed", + Message: "discovery failed", + Error: "discovery failed: " + discErr.Error(), + }) + if verbose { + ui.Error(" %s: discovery failed: %v", m.Name, discErr) + } + failed++ + } + continue + } + + // Build agent name → AgentInfo lookup + agentIndex := map[string]*install.AgentInfo{} + for i, a := range discovery.Agents { + agentIndex[a.Name] = &discovery.Agents[i] + } + + for _, m := range members { + agentName := filepath.Base(m.Name) + target := agentIndex[agentName] + if target == nil { + if verbose { + ui.Error(" %s: not found in repository", m.Name) + } + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "failed", + Message: "not found in repository", + Error: "not found in repository", + }) + failed++ + continue + } + + destDir := agentsDir + if dir := filepath.Dir(m.Name); dir != "." 
{ + destDir = filepath.Join(agentsDir, dir) + } + + installOpts := install.InstallOptions{ + Kind: "agent", + Force: opts.force, + Update: true, + SkipAudit: opts.skipAudit, + AuditThreshold: opts.threshold, + AuditProjectRoot: projectRoot, + SourceDir: agentsDir, + } + if _, err := install.UpdateAgentFromDiscovery(discovery, *target, destDir, installOpts); err != nil { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "failed", + Message: "update failed", + Error: err.Error(), + }) + if verbose { + ui.Error(" %s: %v", m.Name, err) + } + failed++ + } else { + items = append(items, agentUpdateItem{ + Name: m.Name, + Source: m.Source, + Status: "updated", + Message: "updated", + }) + if verbose { + ui.Success(" %s: updated", m.Name) + } + updated++ + } + } + + install.CleanupDiscovery(discovery) + } + + // Fallback: agents without RepoURL + for _, r := range noRepo { + if err := reinstallAgent(agentsDir, r, store, opts, projectRoot); err != nil { + items = append(items, agentUpdateItem{ + Name: r.Name, + Source: r.Source, + Status: "failed", + Message: "update failed", + Error: err.Error(), + }) + if verbose { + ui.Error(" %s: %v", r.Name, err) + } + failed++ + } else { + items = append(items, agentUpdateItem{ + Name: r.Name, + Source: r.Source, + Status: "updated", + Message: "updated", + }) + if verbose { + ui.Success(" %s: updated", r.Name) + } + updated++ + } + } + + return items, updated, failed +} + +// reinstallAgent re-installs an agent from its recorded source using +// discovery + InstallAgentFromDiscovery (single-file copy), not the +// directory-based skill installer. +// Used as fallback for agents without RepoURL in the batch path. 
+func reinstallAgent(agentsDir string, r check.AgentCheckResult, store *install.MetadataStore, opts *updateAgentArgs, projectRoot string) error { + entry := store.GetByPath(r.Name) + if entry == nil || entry.Source == "" { + return fmt.Errorf("no source metadata for agent %q", r.Name) + } + + // Reconstruct the repo-level subdir for discovery. + source, parseErr := install.ParseSource(entry.Source) + if parseErr != nil { + return fmt.Errorf("invalid source: %w", parseErr) + } + if entry.Branch != "" { + source.Branch = entry.Branch + } + repoSubdir := strings.TrimSuffix(source.Subdir, entry.Subdir) + repoSubdir = strings.TrimRight(repoSubdir, "/") + source.Subdir = repoSubdir + + // Discover agents — use subdir-scoped discovery for monorepo installs. + var discovery *install.DiscoveryResult + var discErr error + if source.HasSubdir() { + discovery, discErr = install.DiscoverFromGitSubdir(source) + } else { + discovery, discErr = install.DiscoverFromGit(source) + } + if discErr != nil { + return fmt.Errorf("discovery failed: %w", discErr) + } + defer install.CleanupDiscovery(discovery) + + // Find the specific agent by name + agentName := filepath.Base(r.Name) + var targetAgent *install.AgentInfo + for i, a := range discovery.Agents { + if a.Name == agentName { + targetAgent = &discovery.Agents[i] + break + } + } + if targetAgent == nil { + return fmt.Errorf("agent %q not found in repository", agentName) + } + + // For grouped agents (r.Name contains "/", e.g. "tools/reviewer"), + // reconstruct the correct destination subdirectory so the file lands + // at agents/tools/reviewer.md rather than agents/reviewer.md. + destDir := agentsDir + if dir := filepath.Dir(r.Name); dir != "." 
{ + destDir = filepath.Join(agentsDir, dir) + } + + installOpts := install.InstallOptions{ + Kind: "agent", + Force: opts.force, + Update: true, + SkipAudit: opts.skipAudit, + AuditThreshold: opts.threshold, + AuditProjectRoot: projectRoot, + SourceDir: agentsDir, + } + _, installErr := install.UpdateAgentFromDiscovery(discovery, *targetAgent, destDir, installOpts) + return installErr +} + +// updateAgentArgs holds parsed arguments for agent update. +type updateAgentArgs struct { + names []string + groups []string + all bool + dryRun bool + force bool + skipAudit bool + threshold string + jsonOutput bool +} + +func parseUpdateAgentArgs(args []string) (*updateAgentArgs, bool, error) { + opts := &updateAgentArgs{} + for i := 0; i < len(args); i++ { + arg := args[i] + switch { + case arg == "--all" || arg == "-a": + opts.all = true + case arg == "--dry-run" || arg == "-n": + opts.dryRun = true + case arg == "--force" || arg == "-f": + opts.force = true + case arg == "--skip-audit": + opts.skipAudit = true + case arg == "--audit-threshold" || arg == "--threshold" || arg == "-T": + i++ + if i >= len(args) { + return nil, false, fmt.Errorf("%s requires a value", arg) + } + threshold, err := normalizeInstallAuditThreshold(args[i]) + if err != nil { + return nil, false, err + } + opts.threshold = threshold + case arg == "--json": + opts.jsonOutput = true + case arg == "--group" || arg == "-G": + i++ + if i >= len(args) { + return nil, false, fmt.Errorf("--group requires a value") + } + opts.groups = append(opts.groups, args[i]) + case arg == "--help" || arg == "-h": + return nil, true, nil + case strings.HasPrefix(arg, "-"): + return nil, false, fmt.Errorf("unknown option: %s", arg) + default: + opts.names = append(opts.names, arg) + } + } + + if !opts.all && len(opts.names) == 0 && len(opts.groups) == 0 { + return nil, false, fmt.Errorf("specify agent name(s), --group, or --all") + } + if opts.all && (len(opts.names) > 0 || len(opts.groups) > 0) { + return nil, false, 
fmt.Errorf("--all cannot be used with agent names or --group") + } + + return opts, false, nil +} + +func collectTrackedAgentResults(results []check.AgentCheckResult) []check.AgentCheckResult { + tracked := make([]check.AgentCheckResult, 0, len(results)) + for _, r := range results { + if r.Source != "" { + tracked = append(tracked, r) + } + } + return tracked +} + +func mergeTrackedAgentResults(results, tracked []check.AgentCheckResult) { + indexByName := make(map[string]int, len(results)) + for i, r := range results { + indexByName[r.Name] = i + } + for _, r := range tracked { + if idx, ok := indexByName[r.Name]; ok { + results[idx] = r + } + } +} + +func filterAgentCheckResults(results []check.AgentCheckResult, names []string) []check.AgentCheckResult { + nameSet := make(map[string]bool, len(names)) + for _, n := range names { + nameSet[n] = true + // Also index without .md suffix so "demo/tutor.md" matches "demo/tutor" + nameSet[strings.TrimSuffix(n, ".md")] = true + } + var filtered []check.AgentCheckResult + for _, r := range results { + // Match full path (e.g. "demo/code-reviewer") or basename (e.g. "code-reviewer") + if nameSet[r.Name] || nameSet[filepath.Base(r.Name)] { + filtered = append(filtered, r) + } + } + return filtered +} + +// validateAgentGroups checks that each group name corresponds to a subdirectory +// under agentsDir. Returns normalized group names (trailing "/" stripped). 
+func validateAgentGroups(groups []string, agentsDir string) ([]string, error) { + normalized := make([]string, len(groups)) + for i, group := range groups { + group = strings.TrimSuffix(group, "/") + info, err := os.Stat(filepath.Join(agentsDir, group)) + if err != nil || !info.IsDir() { + return nil, fmt.Errorf("agent group %q not found in %s", group, agentsDir) + } + normalized[i] = group + } + return normalized, nil +} + +func matchesAnyGroup(name string, groups []string) bool { + for _, group := range groups { + if strings.HasPrefix(name, group+"/") { + return true + } + } + return false +} + +// filterAgentResultsByGroups filters agent check results to those in the given groups. +func filterAgentResultsByGroups(results []check.AgentCheckResult, groups []string, agentsDir string) ([]check.AgentCheckResult, error) { + groups, err := validateAgentGroups(groups, agentsDir) + if err != nil { + return nil, err + } + var filtered []check.AgentCheckResult + for _, r := range results { + if matchesAnyGroup(r.Name, groups) { + filtered = append(filtered, r) + } + } + return filtered, nil +} + +// filterDiscoveredAgentsByGroups filters discovered agents to those in the given groups. 
+func filterDiscoveredAgentsByGroups(discovered []resource.DiscoveredResource, groups []string, agentsDir string) ([]resource.DiscoveredResource, error) { + groups, err := validateAgentGroups(groups, agentsDir) + if err != nil { + return nil, err + } + var filtered []resource.DiscoveredResource + for _, d := range discovered { + if matchesAnyGroup(strings.TrimSuffix(d.RelPath, ".md"), groups) { + filtered = append(filtered, d) + } + } + return filtered, nil +} + +func logUpdateAgentOp(cfgPath string, total, updated, failed int, dryRun bool, start time.Time) { + status := "ok" + if failed > 0 && updated > 0 { + status = "partial" + } else if failed > 0 { + status = "error" + } + e := oplog.NewEntry("update", status, time.Since(start)) + e.Args = map[string]any{ + "resource_kind": "agent", + "agents_total": total, + "agents_updated": updated, + "agents_failed": failed, + "dry_run": dryRun, + } + oplog.WriteWithLimit(cfgPath, oplog.OpsFile, e, logMaxEntries()) //nolint:errcheck +} + +func agentUpdateItemsFromCheckResults(results []check.AgentCheckResult) []agentUpdateItem { + items := make([]agentUpdateItem, 0, len(results)) + for _, r := range results { + item := agentUpdateItem{ + Name: r.Name, + Source: r.Source, + Status: r.Status, + Message: r.Message, + } + if r.Status == "error" { + item.Error = r.Message + } + items = append(items, item) + } + return items +} + +func mergeAgentUpdateItems(base, updates []agentUpdateItem) []agentUpdateItem { + if len(updates) == 0 { + return base + } + + merged := append([]agentUpdateItem(nil), base...) 
+ indexByName := make(map[string]int, len(merged)) + for i, item := range merged { + indexByName[item.Name] = i + } + + for _, item := range updates { + if idx, ok := indexByName[item.Name]; ok { + merged[idx] = item + continue + } + indexByName[item.Name] = len(merged) + merged = append(merged, item) + } + + return merged +} + +func updateAgentsOutputJSON(items []agentUpdateItem, dryRun bool, start time.Time, err error) error { + output := struct { + Agents []agentUpdateItem `json:"agents"` + DryRun bool `json:"dry_run"` + Duration string `json:"duration"` + }{ + Agents: items, + DryRun: dryRun, + Duration: formatDuration(start), + } + return writeJSONResult(&output, err) +} + +// cmdUpdateAgentsProject handles "skillshare update -p agents [name|--all]". +func cmdUpdateAgentsProject(args []string, projectRoot string, start time.Time) error { + jsonRequested := hasFlag(args, "--json") + opts, showHelp, parseErr := parseUpdateAgentArgs(args) + if showHelp { + printUpdateHelp() + return nil + } + if parseErr != nil { + if jsonRequested { + return writeJSONError(parseErr) + } + return parseErr + } + jsonUI := newJSONUISuppressor(opts.jsonOutput) + defer jsonUI.Flush() + + jsonWriteResult := func(items []agentUpdateItem, cmdErr error) error { + jsonUI.Flush() + return updateAgentsOutputJSON(items, opts.dryRun, start, cmdErr) + } + failJSON := func(err error) error { + if opts.jsonOutput { + jsonUI.Flush() + return writeJSONError(err) + } + return err + } + + if !projectConfigExists(projectRoot) { + if err := performProjectInit(projectRoot, projectInitOptions{}); err != nil { + return failJSON(err) + } + } + + runtime, err := loadProjectRuntime(projectRoot) + if err != nil { + return failJSON(err) + } + if opts.threshold == "" { + opts.threshold = runtime.config.Audit.BlockThreshold + } + + agentsDir := runtime.agentsSourcePath + if _, err := os.Stat(agentsDir); err != nil { + if os.IsNotExist(err) { + if opts.jsonOutput { + return jsonWriteResult(nil, nil) + } + 
ui.Info("No project agents directory (%s)", agentsDir) + return nil + } + return failJSON(fmt.Errorf("cannot access project agents: %w", err)) + } + + results := check.CheckAgents(agentsDir) + if len(results) == 0 { + if opts.jsonOutput { + return jsonWriteResult(nil, nil) + } + ui.Info("No project agents found") + return nil + } + + if len(opts.names) > 0 { + results = filterAgentCheckResults(results, opts.names) + if len(results) == 0 { + return failJSON(fmt.Errorf("no matching agents found: %s", strings.Join(opts.names, ", "))) + } + } + + if len(opts.groups) > 0 { + var err error + results, err = filterAgentResultsByGroups(results, opts.groups, agentsDir) + if err != nil { + return failJSON(err) + } + if len(results) == 0 { + return failJSON(fmt.Errorf("no agents found in group(s): %s", strings.Join(opts.groups, ", "))) + } + } + + tracked := collectTrackedAgentResults(results) + + if len(tracked) == 0 { + if opts.jsonOutput { + return jsonWriteResult(agentUpdateItemsFromCheckResults(results), nil) + } + ui.Info("No tracked project agents to update (all are local)") + return nil + } + + sp := ui.StartSpinner(fmt.Sprintf("Checking %d agent(s) for updates...", len(tracked))) + check.EnrichAgentResultsWithRemote(tracked, func() { sp.Success("Check complete") }) + mergeTrackedAgentResults(results, tracked) + + var updatable []check.AgentCheckResult + for _, r := range tracked { + if r.Status == "update_available" { + updatable = append(updatable, r) + } + } + finalItems := agentUpdateItemsFromCheckResults(results) + + if len(updatable) == 0 { + if opts.jsonOutput { + return jsonWriteResult(finalItems, nil) + } + ui.Success("All project agents are up to date") + return nil + } + + ui.Header("Updating project agents") + if opts.dryRun { + ui.Warning("Dry run mode") + for _, r := range updatable { + ui.Info(" %s: update available from %s", r.Name, r.Source) + } + if opts.jsonOutput { + return jsonWriteResult(finalItems, nil) + } + return nil + } + + updatedItems, 
updated, failed := batchUpdateAgents(agentsDir, updatable, opts, projectRoot, !opts.jsonOutput) + finalItems = mergeAgentUpdateItems(finalItems, updatedItems) + + logUpdateAgentOp(config.ProjectConfigPath(projectRoot), len(updatable), updated, failed, opts.dryRun, start) + + if opts.jsonOutput { + var cmdErr error + if failed > 0 { + cmdErr = fmt.Errorf("%d agent(s) failed to update", failed) + } + return jsonWriteResult(finalItems, cmdErr) + } + + if failed > 0 { + return fmt.Errorf("%d agent(s) failed to update", failed) + } + return nil +} diff --git a/cmd/skillshare/update_batch.go b/cmd/skillshare/update_batch.go index 856a5132..bc7dc6d0 100644 --- a/cmd/skillshare/update_batch.go +++ b/cmd/skillshare/update_batch.go @@ -2,12 +2,10 @@ package main import ( "fmt" - "path/filepath" "strings" "time" "skillshare/internal/audit" - "skillshare/internal/config" "skillshare/internal/install" "skillshare/internal/trash" "skillshare/internal/ui" @@ -42,6 +40,7 @@ func (uc *updateContext) makeInstallOpts() install.InstallOptions { Update: true, SkipAudit: uc.opts.skipAudit, AuditThreshold: uc.opts.threshold, + SourceDir: uc.sourcePath, } if uc.isProject() { opts.AuditProjectRoot = uc.projectRoot @@ -337,36 +336,24 @@ func pruneSkill(skillPath, name string, uc *updateContext) error { return err } -// pruneRegistry removes pruned skill entries from the registry. +// pruneRegistry removes pruned skill entries from the metadata store. 
func pruneRegistry(prunedNames []string, uc *updateContext) { - var regDir string - if uc.isProject() { - regDir = filepath.Join(uc.projectRoot, ".skillshare") - } else { - regDir = uc.registryDir - } - - reg, err := config.LoadRegistry(regDir) - if err != nil || len(reg.Skills) == 0 { + store, err := install.LoadMetadata(uc.sourcePath) + if err != nil { return } - removedSet := make(map[string]bool, len(prunedNames)) - for _, n := range prunedNames { - removedSet[n] = true - } - - updated := make([]config.SkillEntry, 0, len(reg.Skills)) - for _, s := range reg.Skills { - if !removedSet[s.FullName()] { - updated = append(updated, s) + changed := false + for _, name := range prunedNames { + if store.Has(name) { + store.Remove(name) + changed = true } } - if len(updated) != len(reg.Skills) { - reg.Skills = updated - if saveErr := reg.Save(regDir); saveErr != nil { - ui.Warning("Failed to update registry after prune: %v", saveErr) + if changed { + if saveErr := store.Save(uc.sourcePath); saveErr != nil { + ui.Warning("Failed to update metadata after prune: %v", saveErr) } } } diff --git a/cmd/skillshare/update_handlers.go b/cmd/skillshare/update_handlers.go index f659b34c..469d8d05 100644 --- a/cmd/skillshare/update_handlers.go +++ b/cmd/skillshare/update_handlers.go @@ -216,10 +216,11 @@ func updateRegularSkill(uc *updateContext, skillName string) (updateResult, erro skillPath := filepath.Join(uc.sourcePath, skillName) // Read metadata to get source - meta, err := install.ReadMeta(skillPath) - if err != nil { - return updateResult{skipped: 1}, fmt.Errorf("cannot read metadata for '%s': %w", skillName, err) + store, storeErr := install.LoadMetadataWithMigration(uc.sourcePath, "") + if storeErr != nil { + return updateResult{skipped: 1}, fmt.Errorf("cannot read metadata for '%s': %w", skillName, storeErr) } + meta := store.GetByPath(skillName) if meta == nil || meta.Source == "" { return updateResult{skipped: 1}, fmt.Errorf("skill '%s' has no source metadata, cannot 
update", skillName) } @@ -348,9 +349,9 @@ func updateTrackedRepoQuick(uc *updateContext, repoPath string) (bool, *audit.Re // updateSkillFromMeta updates a skill using its metadata in batch mode. // Output is suppressed; caller handles display via progress bar. -// If cachedMeta is non-nil it is used directly; otherwise metadata is read from disk. +// If cachedMeta is non-nil it is used directly; otherwise metadata is loaded from the store. // Returns (updated, installResult, error). -func updateSkillFromMeta(uc *updateContext, skillPath string, cachedMeta *install.SkillMeta) (bool, *install.InstallResult, error) { +func updateSkillFromMeta(uc *updateContext, skillPath string, cachedMeta *install.MetadataEntry) (bool, *install.InstallResult, error) { if uc.opts.dryRun { return false, nil, nil } @@ -361,9 +362,12 @@ func updateSkillFromMeta(uc *updateContext, skillPath string, cachedMeta *instal meta := cachedMeta if meta == nil { - var readErr error - meta, readErr = install.ReadMeta(skillPath) - if readErr != nil || meta == nil || meta.Source == "" { + store, _ := install.LoadMetadataWithMigration(uc.sourcePath, "") + // GetByPath handles both full-path keys and legacy basename+group keys. 
+ if rel, relErr := filepath.Rel(uc.sourcePath, skillPath); relErr == nil { + meta = store.GetByPath(filepath.ToSlash(rel)) + } + if meta == nil || meta.Source == "" { return false, nil, nil } } diff --git a/cmd/skillshare/update_project.go b/cmd/skillshare/update_project.go index c342a887..6def116e 100644 --- a/cmd/skillshare/update_project.go +++ b/cmd/skillshare/update_project.go @@ -59,11 +59,13 @@ func cmdUpdateProjectBatch(sourcePath string, opts *updateOptions, projectRoot s seen := map[string]bool{} var resolveWarnings []string + metaStore, _ := install.LoadMetadataWithMigration(sourcePath, "") + for _, name := range opts.names { // Check group directory first (before repo/skill lookup, // so "feature-radar" expands to all skills rather than // matching a single nested "feature-radar/feature-radar"). - if isGroupDir(name, sourcePath) { + if isGroupDir(name, sourcePath, metaStore) { groupMatches, groupErr := resolveGroupUpdatable(name, sourcePath) if groupErr != nil { resolveWarnings = append(resolveWarnings, fmt.Sprintf("%s: %v", name, groupErr)) @@ -104,11 +106,11 @@ func cmdUpdateProjectBatch(sourcePath string, opts *updateOptions, projectRoot s // Regular skill with metadata skillPath := filepath.Join(sourcePath, name) if info, err := os.Stat(skillPath); err == nil && info.IsDir() { - meta, metaErr := install.ReadMeta(skillPath) - if metaErr == nil && meta != nil && meta.Source != "" { + entry := metaStore.Get(name) + if entry != nil && entry.Source != "" { if !seen[skillPath] { seen[skillPath] = true - targets = append(targets, updateTarget{name: name, path: skillPath, isRepo: false, meta: meta}) + targets = append(targets, updateTarget{name: name, path: skillPath, isRepo: false, meta: entry}) } continue } @@ -187,6 +189,7 @@ func updateAllProjectSkills(uc *updateContext) (*updateResult, error) { scanSpinner := ui.StartSpinner("Scanning skills...") walkRoot := utils.ResolveSymlink(uc.sourcePath) + metaStore, _ := 
install.LoadMetadataWithMigration(uc.sourcePath, "") err := filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error { if err != nil { return nil @@ -213,11 +216,10 @@ func updateAllProjectSkills(uc *updateContext) (*updateResult, error) { // Regular skill with metadata if !info.IsDir() && info.Name() == "SKILL.md" { skillDir := filepath.Dir(path) - meta, metaErr := install.ReadMeta(skillDir) - if metaErr == nil && meta != nil && meta.Source != "" { - rel, _ := filepath.Rel(walkRoot, skillDir) - if rel != "." { - targets = append(targets, updateTarget{name: rel, path: skillDir, isRepo: false, meta: meta}) + rel, _ := filepath.Rel(walkRoot, skillDir) + if rel != "." { + if entry := metaStore.Get(rel); entry != nil && entry.Source != "" { + targets = append(targets, updateTarget{name: rel, path: skillDir, isRepo: false, meta: entry}) } } } diff --git a/cmd/skillshare/update_resolve.go b/cmd/skillshare/update_resolve.go index ccc1e804..b0d7898a 100644 --- a/cmd/skillshare/update_resolve.go +++ b/cmd/skillshare/update_resolve.go @@ -12,10 +12,10 @@ import ( ) type updateTarget struct { - name string // relative path from source dir (display name) - path string // absolute path on disk - isRepo bool // true for tracked repos (_-prefixed git repos) - meta *install.SkillMeta // cached metadata; nil for tracked repos + name string // relative path from source dir (display name) + path string // absolute path on disk + isRepo bool // true for tracked repos (_-prefixed git repos) + meta *install.MetadataEntry // cached metadata; nil for tracked repos } // resolveByBasename searches nested skills and tracked repos by their @@ -99,6 +99,9 @@ func resolveGroupUpdatable(group, sourceDir string) ([]updateTarget, error) { return nil, fmt.Errorf("group '%s' resolves outside source directory", group) } + // Load store once before walk (not per iteration) + store, _ := install.LoadMetadata(resolvedSourceDir) + var matches []updateTarget if walkErr := 
filepath.Walk(walkRoot, func(path string, fi os.FileInfo, err error) error { if err != nil { @@ -122,9 +125,9 @@ func resolveGroupUpdatable(group, sourceDir string) ([]updateTarget, error) { return filepath.SkipDir } - // Skill with metadata (has .skillshare-meta.json) - if meta, metaErr := install.ReadMeta(path); metaErr == nil && meta != nil && meta.Source != "" { - matches = append(matches, updateTarget{name: rel, path: path, isRepo: false, meta: meta}) + // Skill with metadata (centralized store) + if entry := store.GetByPath(rel); entry != nil && entry.Source != "" { + matches = append(matches, updateTarget{name: rel, path: path, isRepo: false, meta: entry}) return filepath.SkipDir } @@ -139,7 +142,7 @@ func resolveGroupUpdatable(group, sourceDir string) ([]updateTarget, error) { // isGroupDir checks if a name corresponds to a group directory (a container // for other skills). Returns false for tracked repos, skills with metadata, // and directories that are themselves a skill (have SKILL.md). 
-func isGroupDir(name, sourceDir string) bool { +func isGroupDir(name, sourceDir string, store *install.MetadataStore) bool { path := filepath.Join(sourceDir, name) info, err := os.Stat(path) if err != nil || !info.IsDir() { @@ -150,7 +153,7 @@ func isGroupDir(name, sourceDir string) bool { return false } // Not a skill with metadata - if meta, metaErr := install.ReadMeta(path); metaErr == nil && meta != nil && meta.Source != "" { + if entry := store.Get(name); entry != nil && entry.Source != "" { return false } // Not a skill directory (has SKILL.md) diff --git a/cmd/skillshare/update_test.go b/cmd/skillshare/update_test.go index 9e0c518c..e62d0bf4 100644 --- a/cmd/skillshare/update_test.go +++ b/cmd/skillshare/update_test.go @@ -90,7 +90,9 @@ func setupUpdatableSkill(t *testing.T, sourceDir, name string) { dir := filepath.Join(sourceDir, name) os.MkdirAll(dir, 0755) os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("# "+name), 0644) - install.WriteMeta(dir, &install.SkillMeta{Source: "github.com/test/" + name, Type: "github"}) + store, _ := install.LoadMetadata(sourceDir) + store.Set(name, &install.MetadataEntry{Source: "github.com/test/" + name, Type: "github"}) + store.Save(sourceDir) } func TestResolveByGlob_MatchesTrackedRepos(t *testing.T) { diff --git a/internal/audit/analyzer_metadata.go b/internal/audit/analyzer_metadata.go index 39361d15..f08d67bc 100644 --- a/internal/audit/analyzer_metadata.go +++ b/internal/audit/analyzer_metadata.go @@ -10,7 +10,7 @@ import ( ) // metadataAnalyzer cross-references SKILL.md metadata (name, description) -// against the actual git source URL from .skillshare-meta.json. +// against the actual git source URL from the centralized metadata store. // Detects social-engineering patterns: publisher mismatch and authority claims. // Runs at skill scope after all files are walked. 
type metadataAnalyzer struct{} @@ -20,9 +20,19 @@ func (a *metadataAnalyzer) Scope() AnalyzerScope { return ScopeSkill } // metaJSON is a minimal subset of install.SkillMeta to avoid import cycles. type metaJSON struct { - RepoURL string `json:"repo_url"` + RepoURL string `json:"repo_url"` + FileHashes map[string]string `json:"file_hashes"` } +// metadataStoreJSON is a minimal subset of install.MetadataStore for reading +// the centralized .metadata.json without importing the install package. +type metadataStoreJSON struct { + Entries map[string]metaJSON `json:"entries"` +} + +// metadataFileName mirrors install.MetadataFileName to avoid a circular import. +const metadataFileName = ".metadata.json" + // Rule IDs for disable support via audit-rules.yaml. const ( rulePublisherMismatch = "publisher-mismatch" @@ -47,7 +57,6 @@ func (a *metadataAnalyzer) Analyze(ctx *AnalyzeContext) ([]Finding, error) { return nil, nil } - // Read .skillshare-meta.json for source URL. repoURL := readMetaRepoURL(ctx.SkillPath) // Read SKILL.md frontmatter for name and description. @@ -73,17 +82,45 @@ func (a *metadataAnalyzer) Analyze(ctx *AnalyzeContext) ([]Finding, error) { return findings, nil } -// readMetaRepoURL reads repo_url from .skillshare-meta.json in skillPath. +// findMetaEntry walks up parent directories of skillPath looking for the +// centralized .metadata.json store and returns the raw entry for this skill. 
+func findMetaEntry(skillPath string) *metaJSON { + skillName := filepath.Base(skillPath) + dir := filepath.Dir(skillPath) + + for i := 0; i < 10 && dir != filepath.Dir(dir); i++ { + data, err := os.ReadFile(filepath.Join(dir, metadataFileName)) + if err == nil { + var store metadataStoreJSON + if json.Unmarshal(data, &store) == nil { + if rel, relErr := filepath.Rel(dir, skillPath); relErr == nil { + key := filepath.ToSlash(rel) + if e, ok := store.Entries[key]; ok { + return &e + } + } + if e, ok := store.Entries[skillName]; ok { + return &e + } + } + } + dir = filepath.Dir(dir) + } + return nil +} + func readMetaRepoURL(skillPath string) string { - data, err := os.ReadFile(filepath.Join(skillPath, ".skillshare-meta.json")) - if err != nil { - return "" + if e := findMetaEntry(skillPath); e != nil { + return e.RepoURL } - var m metaJSON - if json.Unmarshal(data, &m) != nil { - return "" + return "" +} + +func readMetaFileHashes(skillPath string) map[string]string { + if e := findMetaEntry(skillPath); e != nil { + return e.FileHashes } - return m.RepoURL + return nil } // readSkillFrontmatter extracts name and description from SKILL.md. 
diff --git a/internal/audit/analyzer_metadata_test.go b/internal/audit/analyzer_metadata_test.go index 53ec1146..f2615976 100644 --- a/internal/audit/analyzer_metadata_test.go +++ b/internal/audit/analyzer_metadata_test.go @@ -246,19 +246,23 @@ func TestIsWellKnownOrg(t *testing.T) { } func TestMetadataAnalyzer_Integration(t *testing.T) { - // Create a temp skill directory with SKILL.md and .skillshare-meta.json - dir := t.TempDir() + // Create a nested skill directory: root/evil-skill/SKILL.md + // with centralized metadata at root/.metadata.json + root := t.TempDir() + dir := filepath.Join(root, "evil-skill") + os.MkdirAll(dir, 0755) - // Write SKILL.md claiming "from Acme Corp" skillContent := "---\nname: evil-skill\ndescription: Official formatter from Acme Corp\n---\n# Evil\n" if err := os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte(skillContent), 0644); err != nil { t.Fatal(err) } - // Write meta pointing to a different org - meta := metaJSON{RepoURL: "https://github.com/evil-fork/skills.git"} - metaData, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(dir, ".skillshare-meta.json"), metaData, 0644); err != nil { + // Write centralized metadata in parent directory + store := metadataStoreJSON{Entries: map[string]metaJSON{ + "evil-skill": {RepoURL: "https://github.com/evil-fork/skills.git"}, + }} + storeData, _ := json.Marshal(store) + if err := os.WriteFile(filepath.Join(root, metadataFileName), storeData, 0644); err != nil { t.Fatal(err) } @@ -298,6 +302,40 @@ func TestMetadataAnalyzer_Integration(t *testing.T) { } } +func TestMetadataAnalyzer_Integration_DeepNestedSkill(t *testing.T) { + root := t.TempDir() + dir := filepath.Join(root, "a", "b", "c", "d", "evil-skill") + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatal(err) + } + + skillContent := "---\nname: evil-skill\ndescription: Official formatter from Acme Corp\n---\n# Evil\n" + if err := os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte(skillContent), 0644); err != nil { 
+ t.Fatal(err) + } + + store := metadataStoreJSON{Entries: map[string]metaJSON{ + "a/b/c/d/evil-skill": {RepoURL: "https://github.com/evil-fork/skills.git"}, + }} + storeData, _ := json.Marshal(store) + if err := os.WriteFile(filepath.Join(root, metadataFileName), storeData, 0644); err != nil { + t.Fatal(err) + } + + a := &metadataAnalyzer{} + ctx := &AnalyzeContext{ + SkillPath: dir, + DisabledIDs: map[string]bool{}, + } + findings, err := a.Analyze(ctx) + if err != nil { + t.Fatal(err) + } + if len(findings) < 2 { + t.Fatalf("expected deep nested skill to resolve metadata, got %d findings", len(findings)) + } +} + func TestMetadataAnalyzer_NoMeta(t *testing.T) { // Skill without .skillshare-meta.json — should produce no findings dir := t.TempDir() @@ -320,14 +358,18 @@ func TestMetadataAnalyzer_NoMeta(t *testing.T) { } func TestMetadataAnalyzer_DisabledRules(t *testing.T) { - dir := t.TempDir() + root := t.TempDir() + dir := filepath.Join(root, "test-skill") + os.MkdirAll(dir, 0755) if err := os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("---\nname: test\ndescription: Official tool from Acme Corp\n---\n"), 0644); err != nil { t.Fatal(err) } - meta := metaJSON{RepoURL: "https://github.com/evil-fork/skills.git"} - metaData, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(dir, ".skillshare-meta.json"), metaData, 0644); err != nil { + store := metadataStoreJSON{Entries: map[string]metaJSON{ + "test-skill": {RepoURL: "https://github.com/evil-fork/skills.git"}, + }} + storeData, _ := json.Marshal(store) + if err := os.WriteFile(filepath.Join(root, metadataFileName), storeData, 0644); err != nil { t.Fatal(err) } diff --git a/internal/audit/audit.go b/internal/audit/audit.go index 76721966..832a4156 100644 --- a/internal/audit/audit.go +++ b/internal/audit/audit.go @@ -3,7 +3,6 @@ package audit import ( "crypto/sha256" "encoding/hex" - "encoding/json" "errors" "fmt" "os" @@ -131,6 +130,7 @@ type Finding struct { // Result holds all findings for a 
single skill. type Result struct { SkillName string `json:"skillName"` + Kind string `json:"kind,omitempty"` // "skill" or "agent" — set by caller Findings []Finding `json:"findings"` RiskScore int `json:"riskScore"` RiskLabel string `json:"riskLabel"` // "clean", "low", "medium", "high", "critical" @@ -1633,30 +1633,22 @@ func isExternalOrAnchor(target string) bool { return strings.HasPrefix(target, "#") } -// checkContentIntegrity compares files on disk against pinned hashes in -// .skillshare-meta.json. Backward-compatible: skips silently when meta or -// file_hashes is absent. cache holds file contents already read during the -// walk phase; files not in cache are read from disk as fallback. +// checkContentIntegrity compares files on disk against pinned hashes in the +// centralized .metadata.json store. Backward-compatible: skips silently when +// metadata or file_hashes is absent. cache holds file contents already read +// during the walk phase; files not in cache are read from disk as fallback. // allFiles (if non-nil) is the set of file relPaths collected during the main // walk, used to detect unexpected files without a second filepath.Walk. func checkContentIntegrity(skillPath string, cache map[string][]byte, allFiles map[string]bool) []Finding { - metaPath := filepath.Join(skillPath, ".skillshare-meta.json") - data, err := os.ReadFile(metaPath) - if err != nil { - return nil // no meta → skip - } - - var raw struct { - FileHashes map[string]string `json:"file_hashes"` - } - if err := json.Unmarshal(data, &raw); err != nil || len(raw.FileHashes) == 0 { - return nil // no hashes → skip + fileHashes := readMetaFileHashes(skillPath) + if len(fileHashes) == 0 { + return nil } var findings []Finding // Check pinned files: missing or tampered - for rel, expected := range raw.FileHashes { + for rel, expected := range fileHashes { normalizedRel := filepath.FromSlash(rel) // Reject absolute keys in metadata (e.g. "/etc/passwd"). 
// file_hashes must always be skill-relative paths. @@ -1733,7 +1725,7 @@ func checkContentIntegrity(skillPath string, cache map[string][]byte, allFiles m if allFiles != nil { // Use pre-collected file set from the main walk (no second walk needed). for relPath := range allFiles { - if _, ok := raw.FileHashes[relPath]; !ok { + if _, ok := fileHashes[relPath]; !ok { findings = append(findings, Finding{ Severity: SeverityLow, Pattern: "content-unexpected", @@ -1767,7 +1759,7 @@ func checkContentIntegrity(skillPath string, cache map[string][]byte, allFiles m return nil } normalized := filepath.ToSlash(rel) - if _, ok := raw.FileHashes[normalized]; !ok { + if _, ok := fileHashes[normalized]; !ok { findings = append(findings, Finding{ Severity: SeverityLow, Pattern: "content-unexpected", diff --git a/internal/audit/audit_scan_skill_test.go b/internal/audit/audit_scan_skill_test.go index cf565b4b..9d229529 100644 --- a/internal/audit/audit_scan_skill_test.go +++ b/internal/audit/audit_scan_skill_test.go @@ -731,19 +731,29 @@ func sha256hex(data []byte) string { } // helper: write meta with file_hashes -func writeMetaWithHashes(t *testing.T, dir string, hashes map[string]string) { +// writeMetaWithHashes writes file hashes into the centralized .metadata.json +// store in the parent directory of skillDir (matching the production layout). 
+func writeMetaWithHashes(t *testing.T, skillDir string, hashes map[string]string) { t.Helper() - meta := struct { + skillName := filepath.Base(skillDir) + parentDir := filepath.Dir(skillDir) + + type entry struct { Source string `json:"source"` Type string `json:"type"` FileHashes map[string]string `json:"file_hashes"` + } + store := struct { + Version int `json:"version"` + Entries map[string]entry `json:"entries"` }{ - Source: "test", - Type: "local", - FileHashes: hashes, + Version: 1, + Entries: map[string]entry{ + skillName: {Source: "test", Type: "local", FileHashes: hashes}, + }, } - data, _ := json.Marshal(meta) - os.WriteFile(filepath.Join(dir, ".skillshare-meta.json"), data, 0644) + data, _ := json.Marshal(store) + os.WriteFile(filepath.Join(parentDir, metadataFileName), data, 0644) } func TestScanSkill_ContentTampered(t *testing.T) { diff --git a/internal/audit/crossskill.go b/internal/audit/crossskill.go index 938d70d4..a8ddb59e 100644 --- a/internal/audit/crossskill.go +++ b/internal/audit/crossskill.go @@ -6,6 +6,9 @@ import ( "strings" ) +// CrossSkillResultName is the synthetic skill name used for cross-skill analysis results. +const CrossSkillResultName = "_cross-skill" + // skillCapability summarises the security-relevant capabilities of a single skill, // derived entirely from its existing Result (TierProfile + Findings). type skillCapability struct { @@ -134,7 +137,7 @@ func CrossSkillAnalysis(results []*Result) *Result { } r := &Result{ - SkillName: "_cross-skill", + SkillName: CrossSkillResultName, Findings: findings, Analyzability: 1.0, } diff --git a/internal/audit/parallel.go b/internal/audit/parallel.go index 43630f68..134a6054 100644 --- a/internal/audit/parallel.go +++ b/internal/audit/parallel.go @@ -21,8 +21,9 @@ func workerCount() int { // SkillInput describes a skill to scan. 
type SkillInput struct { - Name string - Path string + Name string + Path string + IsFile bool // true for individual file scanning (agents) } // ScanOutput holds the result of scanning a single skill. @@ -48,23 +49,30 @@ func ParallelScan(skills []SkillInput, projectRoot string, onDone func(), regist for i, sk := range skills { wg.Add(1) sem <- struct{}{} - go func(idx int, path string) { + go func(idx int, input SkillInput) { defer wg.Done() defer func() { <-sem }() start := time.Now() var res *Result var err error - if registry != nil { + if input.IsFile { + // Agent: scan individual file if projectRoot != "" { - res, err = ScanSkillFilteredForProject(path, projectRoot, registry) + res, err = ScanFileForProject(input.Path, projectRoot) } else { - res, err = ScanSkillFiltered(path, registry) + res, err = ScanFile(input.Path) + } + } else if registry != nil { + if projectRoot != "" { + res, err = ScanSkillFilteredForProject(input.Path, projectRoot, registry) + } else { + res, err = ScanSkillFiltered(input.Path, registry) } } else { if projectRoot != "" { - res, err = ScanSkillForProject(path, projectRoot) + res, err = ScanSkillForProject(input.Path, projectRoot) } else { - res, err = ScanSkill(path) + res, err = ScanSkill(input.Path) } } outputs[idx] = ScanOutput{ @@ -75,7 +83,7 @@ func ParallelScan(skills []SkillInput, projectRoot string, onDone func(), regist if onDone != nil { onDone() } - }(i, sk.Path) + }(i, sk) } wg.Wait() diff --git a/internal/backup/backup.go b/internal/backup/backup.go index e20f0717..a75cfd4d 100644 --- a/internal/backup/backup.go +++ b/internal/backup/backup.go @@ -11,15 +11,24 @@ import ( "skillshare/internal/config" ) -// BackupDir returns the backup directory path. +// BackupDir returns the global backup directory path. 
func BackupDir() string { return filepath.Join(config.DataDir(), "backups") } -// Create creates a backup of the target directory -// Returns the backup path +// ProjectBackupDir returns the project-level backup directory path. +func ProjectBackupDir(projectRoot string) string { + return filepath.Join(projectRoot, ".skillshare", "backups") +} + +// Create creates a backup of the target directory using the global backup dir. func Create(targetName, targetPath string) (string, error) { - backupDir := BackupDir() + return CreateInDir(BackupDir(), targetName, targetPath) +} + +// CreateInDir creates a backup of the target directory in the specified backup dir. +// Returns the backup path, or ("", nil) when there is nothing to back up. +func CreateInDir(backupDir, targetName, targetPath string) (string, error) { if backupDir == "" { return "", fmt.Errorf("cannot determine backup directory: home directory not found") } @@ -62,9 +71,13 @@ func Create(targetName, targetPath string) (string, error) { return backupPath, nil } -// List returns all backups sorted by date (newest first) +// List returns all backups from the global backup dir, sorted by date (newest first). func List() ([]BackupInfo, error) { - backupDir := BackupDir() + return ListInDir(BackupDir()) +} + +// ListInDir returns all backups from the specified directory, sorted by date (newest first). +func ListInDir(backupDir string) ([]BackupInfo, error) { if backupDir == "" { return nil, fmt.Errorf("cannot determine backup directory: home directory not found") } diff --git a/internal/backup/restore.go b/internal/backup/restore.go index a83c2ed5..6f56446b 100644 --- a/internal/backup/restore.go +++ b/internal/backup/restore.go @@ -85,10 +85,15 @@ func RestoreToPath(backupPath, targetName, destPath string, opts RestoreOptions) return copyDir(targetBackupPath, destPath) } -// RestoreLatest restores the most recent backup for a target. -// Returns the timestamp of the restored backup. 
+// RestoreLatest restores the most recent backup for a target from the global backup dir. func RestoreLatest(targetName, destPath string, opts RestoreOptions) (string, error) { - backups, err := List() + return RestoreLatestInDir(BackupDir(), targetName, destPath, opts) +} + +// RestoreLatestInDir restores the most recent backup for a target from the specified dir. +// Returns the timestamp of the restored backup. +func RestoreLatestInDir(backupDir, targetName, destPath string, opts RestoreOptions) (string, error) { + backups, err := ListInDir(backupDir) if err != nil { return "", err } @@ -108,9 +113,14 @@ func RestoreLatest(targetName, destPath string, opts RestoreOptions) (string, er return "", fmt.Errorf("no backup found for target '%s'", targetName) } -// FindBackupsForTarget returns all backups that contain the specified target +// FindBackupsForTarget returns all backups that contain the specified target from the global dir. func FindBackupsForTarget(targetName string) ([]BackupInfo, error) { - allBackups, err := List() + return FindBackupsForTargetInDir(BackupDir(), targetName) +} + +// FindBackupsForTargetInDir returns all backups that contain the specified target. +func FindBackupsForTargetInDir(backupDir, targetName string) ([]BackupInfo, error) { + allBackups, err := ListInDir(backupDir) if err != nil { return nil, err } @@ -128,9 +138,14 @@ func FindBackupsForTarget(targetName string) ([]BackupInfo, error) { return result, nil } -// GetBackupByTimestamp finds a backup by its timestamp +// GetBackupByTimestamp finds a backup by its timestamp from the global dir. func GetBackupByTimestamp(timestamp string) (*BackupInfo, error) { - backups, err := List() + return GetBackupByTimestampInDir(BackupDir(), timestamp) +} + +// GetBackupByTimestampInDir finds a backup by its timestamp in the specified dir. 
+func GetBackupByTimestampInDir(backupDir, timestamp string) (*BackupInfo, error) { + backups, err := ListInDir(backupDir) if err != nil { return nil, err } diff --git a/internal/check/agent_check.go b/internal/check/agent_check.go new file mode 100644 index 00000000..482a388b --- /dev/null +++ b/internal/check/agent_check.go @@ -0,0 +1,185 @@ +package check + +import ( + "path/filepath" + + "skillshare/internal/git" + "skillshare/internal/install" + "skillshare/internal/resource" + "skillshare/internal/utils" +) + +// AgentCheckResult holds the check result for a single agent. +type AgentCheckResult struct { + Name string `json:"name"` + Source string `json:"source,omitempty"` + Version string `json:"version,omitempty"` + RepoURL string `json:"repoUrl,omitempty"` + RepoPath string `json:"repoPath,omitempty"` + Status string `json:"status"` // "up_to_date", "drifted", "dirty", "local", "error", "update_available" + Message string `json:"message,omitempty"` +} + +// CheckAgents scans the agents source directory for installed agents and +// compares their file hashes against metadata to detect drift. +// Uses resource.AgentKind{}.Discover() to recurse into subdirectories. +func CheckAgents(agentsDir string) []AgentCheckResult { + discovered, err := resource.AgentKind{}.Discover(agentsDir) + if err != nil { + return nil + } + + // Load centralized metadata store (auto-migrates any lingering sidecars). + store, loadErr := install.LoadMetadata(agentsDir) + + var results []AgentCheckResult + for _, d := range discovered { + if loadErr != nil { + // Surface corruption instead of silently treating all agents as local. 
+ key := d.RelPath[:len(d.RelPath)-len(".md")] + results = append(results, AgentCheckResult{ + Name: key, + Status: "error", + Message: "invalid metadata: " + loadErr.Error(), + }) + continue + } + result := checkOneAgent(store, d.SourcePath, d.RelPath) + if d.RepoRelPath != "" { + result = checkTrackedAgentRepo(agentsDir, d) + } + results = append(results, result) + } + + return results +} + +func checkTrackedAgentRepo(agentsDir string, d resource.DiscoveredResource) AgentCheckResult { + result := AgentCheckResult{ + Name: d.RelPath[:len(d.RelPath)-len(".md")], + RepoPath: filepath.Join(agentsDir, filepath.FromSlash(d.RepoRelPath)), + } + + if !install.IsGitRepo(result.RepoPath) { + result.Status = "local" + return result + } + + repoURL, _ := git.GetRemoteURL(result.RepoPath) + version, _ := git.GetCurrentFullHash(result.RepoPath) + result.Source = repoURL + result.RepoURL = repoURL + result.Version = version + + if repoURL == "" || version == "" { + result.Status = "local" + return result + } + + if isDirty, _ := git.IsDirty(result.RepoPath); isDirty { + result.Status = "dirty" + result.Message = "tracked repo has uncommitted changes" + return result + } + + result.Status = "up_to_date" + return result +} + +// checkOneAgent checks a single agent file against the centralized metadata store. +// sourcePath is the absolute path to the .md file; relPath is relative to the +// agents root (e.g. "demo/code-reviewer.md"). 
+func checkOneAgent(store *install.MetadataStore, sourcePath, relPath string) AgentCheckResult { + fileName := filepath.Base(relPath) + key := relPath[:len(relPath)-len(".md")] + result := AgentCheckResult{Name: key} + + entry := store.GetByPath(key) + if entry == nil || entry.Source == "" { + result.Status = "local" + return result + } + + result.Source = entry.Source + result.Version = entry.Version + result.RepoURL = entry.RepoURL + + // Compare file hash + if entry.FileHashes == nil || entry.FileHashes[fileName] == "" { + result.Status = "local" + return result + } + + currentHash, err := utils.FileHashFormatted(sourcePath) + if err != nil { + result.Status = "error" + result.Message = "cannot hash file" + return result + } + + if currentHash == entry.FileHashes[fileName] { + result.Status = "up_to_date" + } else { + result.Status = "drifted" + result.Message = "file content changed since install" + } + + return result +} + +// EnrichAgentResultsWithRemote checks agents that have RepoURL + Version +// against their remote HEAD to detect available updates. +// Uses ParallelCheckURLs for efficient batched remote probing. 
+func EnrichAgentResultsWithRemote(results []AgentCheckResult, onDone func()) { + // Collect unique repo URLs that have version info + type agentRef struct { + repoURL string + version string + indices []int + } + urlMap := make(map[string]*agentRef) + for i, r := range results { + if r.RepoURL == "" || r.Version == "" { + continue + } + if ref, ok := urlMap[r.RepoURL]; ok { + ref.indices = append(ref.indices, i) + } else { + urlMap[r.RepoURL] = &agentRef{ + repoURL: r.RepoURL, + version: r.Version, + indices: []int{i}, + } + } + } + + if len(urlMap) == 0 { + return + } + + // Build URL check inputs + var inputs []URLCheckInput + var refs []*agentRef + for _, ref := range urlMap { + inputs = append(inputs, URLCheckInput{RepoURL: ref.repoURL}) + refs = append(refs, ref) + } + + outputs := ParallelCheckURLs(inputs, onDone) + + // Apply results + for i, out := range outputs { + ref := refs[i] + if out.Err != nil { + continue + } + if out.RemoteHash != "" && out.RemoteHash != ref.version { + for _, idx := range ref.indices { + if results[idx].Status == "up_to_date" { + results[idx].Status = "update_available" + results[idx].Message = "newer version available" + } + } + } + } +} diff --git a/internal/check/agent_check_test.go b/internal/check/agent_check_test.go new file mode 100644 index 00000000..6ddf34c4 --- /dev/null +++ b/internal/check/agent_check_test.go @@ -0,0 +1,152 @@ +package check + +import ( + "encoding/json" + "os" + "path/filepath" + "testing" + + "skillshare/internal/install" + "skillshare/internal/utils" +) + +func TestCheckAgents_NoAgents(t *testing.T) { + dir := t.TempDir() + results := CheckAgents(dir) + if len(results) != 0 { + t.Errorf("expected 0 results, got %d", len(results)) + } +} + +func TestCheckAgents_LocalAgent(t *testing.T) { + dir := t.TempDir() + os.WriteFile(filepath.Join(dir, "tutor.md"), []byte("# Tutor"), 0644) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result, got %d", len(results)) + } + if 
results[0].Name != "tutor" { + t.Errorf("Name = %q, want %q", results[0].Name, "tutor") + } + if results[0].Status != "local" { + t.Errorf("Status = %q, want %q", results[0].Status, "local") + } +} + +func TestCheckAgents_UpToDate(t *testing.T) { + dir := t.TempDir() + agentFile := filepath.Join(dir, "tutor.md") + os.WriteFile(agentFile, []byte("# Tutor agent"), 0644) + + hash, _ := utils.FileHashFormatted(agentFile) + + meta := &install.SkillMeta{ + Source: "test", + Kind: "agent", + FileHashes: map[string]string{"tutor.md": hash}, + } + metaData, _ := json.MarshalIndent(meta, "", " ") + os.WriteFile(filepath.Join(dir, "tutor.skillshare-meta.json"), metaData, 0644) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result, got %d", len(results)) + } + if results[0].Status != "up_to_date" { + t.Errorf("Status = %q, want %q", results[0].Status, "up_to_date") + } +} + +func TestCheckAgents_Drifted(t *testing.T) { + dir := t.TempDir() + agentFile := filepath.Join(dir, "tutor.md") + os.WriteFile(agentFile, []byte("# Modified content"), 0644) + + meta := &install.SkillMeta{ + Source: "test", + Kind: "agent", + FileHashes: map[string]string{"tutor.md": "sha256:0000000000000000000000000000000000000000000000000000000000000000"}, + } + metaData, _ := json.MarshalIndent(meta, "", " ") + os.WriteFile(filepath.Join(dir, "tutor.skillshare-meta.json"), metaData, 0644) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result, got %d", len(results)) + } + if results[0].Status != "drifted" { + t.Errorf("Status = %q, want %q", results[0].Status, "drifted") + } +} + +func TestCheckAgents_InvalidCentralizedMetadata(t *testing.T) { + dir := t.TempDir() + os.WriteFile(filepath.Join(dir, "tutor.md"), []byte("# Tutor"), 0644) + os.WriteFile(filepath.Join(dir, install.MetadataFileName), []byte("{invalid"), 0644) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result, got %d", len(results)) + } + if 
results[0].Name != "tutor" { + t.Errorf("Name = %q, want %q", results[0].Name, "tutor") + } + if results[0].Status != "error" { + t.Errorf("Status = %q, want %q", results[0].Status, "error") + } + if results[0].Message == "" { + t.Fatal("expected error message for invalid centralized metadata") + } +} + +func TestCheckAgents_NonExistentDir(t *testing.T) { + results := CheckAgents("/nonexistent/path") + if results != nil { + t.Errorf("expected nil for nonexistent dir, got %v", results) + } +} + +func TestCheckAgents_Nested(t *testing.T) { + dir := t.TempDir() + subdir := filepath.Join(dir, "demo") + os.MkdirAll(subdir, 0755) + + agentFile := filepath.Join(subdir, "tutor.md") + os.WriteFile(agentFile, []byte("# Tutor"), 0644) + + hash, _ := utils.FileHashFormatted(agentFile) + meta := &install.SkillMeta{ + Source: "https://github.com/example/repo", + Kind: "agent", + FileHashes: map[string]string{"tutor.md": hash}, + } + metaData, _ := json.MarshalIndent(meta, "", " ") + os.WriteFile(filepath.Join(subdir, "tutor.skillshare-meta.json"), metaData, 0644) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result, got %d", len(results)) + } + if results[0].Name != "demo/tutor" { + t.Errorf("Name = %q, want %q", results[0].Name, "demo/tutor") + } + if results[0].Status != "up_to_date" { + t.Errorf("Status = %q, want %q", results[0].Status, "up_to_date") + } + if results[0].Source != "https://github.com/example/repo" { + t.Errorf("Source = %q, want non-empty", results[0].Source) + } +} + +func TestCheckAgents_SkipsNonMd(t *testing.T) { + dir := t.TempDir() + os.WriteFile(filepath.Join(dir, "tutor.md"), []byte("# Tutor"), 0644) + os.WriteFile(filepath.Join(dir, "config.yaml"), []byte("key: val"), 0644) + os.MkdirAll(filepath.Join(dir, "subdir"), 0755) + + results := CheckAgents(dir) + if len(results) != 1 { + t.Fatalf("expected 1 result (only .md files), got %d", len(results)) + } +} diff --git a/internal/config/basedir_test.go 
b/internal/config/basedir_test.go index 90a6dc03..356530f3 100644 --- a/internal/config/basedir_test.go +++ b/internal/config/basedir_test.go @@ -48,6 +48,26 @@ func TestConfigPath_RespectsXDGConfigHome(t *testing.T) { } } +func TestEffectiveAgentsSource_Default(t *testing.T) { + t.Setenv("XDG_CONFIG_HOME", "") + cfg := &Config{} + + got := cfg.EffectiveAgentsSource() + want := filepath.Join(BaseDir(), "agents") + if got != want { + t.Errorf("EffectiveAgentsSource() = %q, want %q", got, want) + } +} + +func TestEffectiveAgentsSource_Explicit(t *testing.T) { + cfg := &Config{AgentsSource: "/custom/agents"} + + got := cfg.EffectiveAgentsSource() + if got != "/custom/agents" { + t.Errorf("EffectiveAgentsSource() = %q, want %q", got, "/custom/agents") + } +} + func TestConfigPath_SKILLSHARECONFIGTakesPriority(t *testing.T) { t.Setenv("SKILLSHARE_CONFIG", "/override/config.yaml") t.Setenv("XDG_CONFIG_HOME", "/custom/config") diff --git a/internal/config/config.go b/internal/config/config.go index 5664d4dc..6d36b9f5 100644 --- a/internal/config/config.go +++ b/internal/config/config.go @@ -132,6 +132,15 @@ func (tc *TargetConfig) EnsureSkills() *ResourceTargetConfig { return tc.Skills } +// EnsureAgents returns the Agents sub-key, creating it if nil. +// Use this before writing to Agents fields. +func (tc *TargetConfig) EnsureAgents() *ResourceTargetConfig { + if tc.Agents == nil { + tc.Agents = &ResourceTargetConfig{} + } + return tc.Agents +} + // migrateTargetConfigs moves legacy flat fields into skills: sub-key. // Returns true if any target was migrated. 
func migrateTargetConfigs(targets map[string]TargetConfig) bool { @@ -216,6 +225,7 @@ type ExtraConfig struct { // Config holds the application configuration type Config struct { Source string `yaml:"source"` + AgentsSource string `yaml:"agents_source,omitempty"` ExtrasSource string `yaml:"extras_source,omitempty"` Mode string `yaml:"mode,omitempty"` // default mode: merge TargetNaming string `yaml:"target_naming,omitempty"` @@ -233,6 +243,32 @@ type Config struct { RegistryDir string `yaml:"-"` } +// EffectiveAgentsSource returns the agents source directory. +// Defaults to BaseDir()/agents if not explicitly configured. +func (c *Config) EffectiveAgentsSource() string { + if c.AgentsSource != "" { + return ExpandPath(c.AgentsSource) + } + return filepath.Join(BaseDir(), "agents") +} + +// HasAgentTarget reports whether any configured target has an agents path, +// either from the user's config agents: sub-key or from the built-in defaults. +func (c *Config) HasAgentTarget() bool { + builtinAgents := DefaultAgentTargets() + for name, tc := range c.Targets { + // Check user config agents: sub-key + if ac := tc.AgentsConfig(); ac.Path != "" { + return true + } + // Check built-in defaults + if _, ok := builtinAgents[name]; ok { + return true + } + } + return false +} + // EffectiveGitLabHosts returns GitLabHosts merged with SKILLSHARE_GITLAB_HOSTS env var. // Use this instead of accessing GitLabHosts directly for runtime behavior; // GitLabHosts contains only config-file values and is safe to persist via Save(). diff --git a/internal/config/project.go b/internal/config/project.go index f8404323..4e585862 100644 --- a/internal/config/project.go +++ b/internal/config/project.go @@ -182,8 +182,17 @@ func (t *ProjectTargetEntry) EnsureSkills() *ResourceTargetConfig { return t.Skills } -// SkillEntry represents a remote skill entry in config (shared by global and project). -type SkillEntry struct { +// EnsureAgents returns the Agents sub-key, creating it if nil.
+func (t *ProjectTargetEntry) EnsureAgents() *ResourceTargetConfig { + if t.Agents == nil { + t.Agents = &ResourceTargetConfig{} + } + return t.Agents +} + +// ResourceEntry represents a remote resource entry in config (shared by global and project). +// Used for both skills and agents. +type ResourceEntry struct { Name string `yaml:"name"` Kind string `yaml:"kind,omitempty"` Source string `yaml:"source"` @@ -192,9 +201,12 @@ type SkillEntry struct { Branch string `yaml:"branch,omitempty"` } +// SkillEntry is an alias for backward compatibility. +type SkillEntry = ResourceEntry + // EffectiveKind returns the resource kind for this entry. // Returns "skill" if Kind is empty (backward compatibility). -func (s SkillEntry) EffectiveKind() string { +func (s ResourceEntry) EffectiveKind() string { if s.Kind == "" { return "skill" } @@ -422,7 +434,7 @@ func ResolveProjectTargets(projectRoot string, cfg *ProjectConfig) (map[string]T absPath = filepath.Join(projectRoot, filepath.FromSlash(targetPath)) } - resolved[name] = TargetConfig{ + tc := TargetConfig{ defaultTargetNaming: cfg.TargetNaming, Skills: &ResourceTargetConfig{ Path: absPath, @@ -432,6 +444,32 @@ func ResolveProjectTargets(projectRoot string, cfg *ProjectConfig) (map[string]T Exclude: append([]string(nil), sc.Exclude...), }, } + + // Resolve Agents sub-key: from entry config or builtin defaults. 
+ ac := entry.AgentsConfig() + agentPath := strings.TrimSpace(ac.Path) + if agentPath == "" { + if builtin, ok := ProjectAgentTargets()[name]; ok { + agentPath = builtin.Path + } + } + if agentPath != "" { + absAgentPath := agentPath + if utils.HasTildePrefix(absAgentPath) { + absAgentPath = expandPath(absAgentPath) + } + if !filepath.IsAbs(absAgentPath) { + absAgentPath = filepath.Join(projectRoot, filepath.FromSlash(agentPath)) + } + tc.Agents = &ResourceTargetConfig{ + Path: absAgentPath, + Mode: ac.Mode, + Include: append([]string(nil), ac.Include...), + Exclude: append([]string(nil), ac.Exclude...), + } + } + + resolved[name] = tc } return resolved, nil diff --git a/internal/config/project_reconcile.go b/internal/config/project_reconcile.go index bee61b25..14062b01 100644 --- a/internal/config/project_reconcile.go +++ b/internal/config/project_reconcile.go @@ -3,187 +3,98 @@ package config import ( "fmt" "os" - "os/exec" "path/filepath" "strings" "skillshare/internal/install" - "skillshare/internal/utils" ) // ReconcileProjectSkills scans the project source directory recursively for // remotely-installed skills (those with install metadata or tracked repos) -// and ensures they are listed in ProjectConfig.Skills[]. +// and ensures they are present in the MetadataStore. // It also updates .skillshare/.gitignore for each tracked skill.
-func ReconcileProjectSkills(projectRoot string, projectCfg *ProjectConfig, reg *Registry, sourcePath string) error { +func ReconcileProjectSkills(projectRoot string, projectCfg *ProjectConfig, store *install.MetadataStore, sourcePath string) error { if _, err := os.Stat(sourcePath); os.IsNotExist(err) { - return nil // no skills dir yet + return nil } - changed := false - index := map[string]int{} - for i, skill := range reg.Skills { - index[skill.FullName()] = i + var gitignoreEntries []string + onFound := func(fullPath string) { + gitignoreEntries = append(gitignoreEntries, filepath.Join("skills", fullPath)) } - // Migrate legacy entries: name "frontend/pdf" → group "frontend", name "pdf" - for i := range reg.Skills { - s := ®.Skills[i] - if s.Group == "" && strings.Contains(s.Name, "/") { - group, bare := s.EffectiveParts() - s.Group = group - s.Name = bare - changed = true - } + result, err := reconcileSkillsWalk(sourcePath, store, onFound) + if err != nil { + return fmt.Errorf("failed to scan project skills: %w", err) } - // Collect gitignore entries during walk, then batch-update once at the end. 
- var gitignoreEntries []string + if pruneStaleEntries(store, result.live) { + result.changed = true + } - walkRoot := utils.ResolveSymlink(sourcePath) - live := map[string]bool{} // tracks skills actually found on disk - err := filepath.WalkDir(walkRoot, func(path string, d os.DirEntry, err error) error { - if err != nil { - return nil - } - if path == walkRoot { - return nil - } - if !d.IsDir() { - return nil - } - // Skip hidden directories - if utils.IsHidden(d.Name()) { - return filepath.SkipDir - } - // Skip .git directories - if d.Name() == ".git" { - return filepath.SkipDir + if len(gitignoreEntries) > 0 { + if err := install.UpdateGitIgnoreBatch(filepath.Join(projectRoot, ".skillshare"), gitignoreEntries); err != nil { + return fmt.Errorf("failed to update .skillshare/.gitignore: %w", err) } + } - relPath, relErr := filepath.Rel(walkRoot, path) - if relErr != nil { - return nil + if result.changed { + if err := store.Save(sourcePath); err != nil { + return err } + } - // Determine source and tracked status - var source string - tracked := isGitRepo(path) + return nil +} - meta, metaErr := install.ReadMeta(path) - if metaErr == nil && meta != nil && meta.Source != "" { - source = meta.Source - } else if tracked { - // Tracked repos have no meta file; derive source from git remote - source = gitRemoteOrigin(path) - } - if source == "" { - // Not an installed skill — continue walking deeper - return nil - } +// ReconcileProjectAgents scans the project agents source directory for +// installed agents and ensures they are present in the MetadataStore. +// Also updates .skillshare/.gitignore for each agent. 
+func ReconcileProjectAgents(projectRoot string, store *install.MetadataStore, agentsSourcePath string) error { + if _, err := os.Stat(agentsSourcePath); os.IsNotExist(err) { + return nil + } - fullPath := filepath.ToSlash(relPath) - live[fullPath] = true + entries, err := os.ReadDir(agentsSourcePath) + if err != nil { + return nil + } - // Determine branch: from metadata (regular skills) or git (tracked repos) - var branch string - if meta != nil { - branch = meta.Branch - } else if tracked { - branch = gitCurrentBranch(path) - } + changed := false + var gitignoreEntries []string - if existingIdx, ok := index[fullPath]; ok { - if reg.Skills[existingIdx].Source != source { - reg.Skills[existingIdx].Source = source - changed = true - } - if reg.Skills[existingIdx].Tracked != tracked { - reg.Skills[existingIdx].Tracked = tracked - changed = true - } - if reg.Skills[existingIdx].Branch != branch { - reg.Skills[existingIdx].Branch = branch - changed = true - } - } else { - entry := SkillEntry{ - Source: source, - Tracked: tracked, - Branch: branch, - } - if idx := strings.LastIndex(fullPath, "/"); idx >= 0 { - entry.Group = fullPath[:idx] - entry.Name = fullPath[idx+1:] - } else { - entry.Name = fullPath - } - reg.Skills = append(reg.Skills, entry) - index[fullPath] = len(reg.Skills) - 1 - changed = true + for _, entry := range entries { + name := entry.Name() + if entry.IsDir() || !strings.HasSuffix(strings.ToLower(name), ".md") { + continue } - gitignoreEntries = append(gitignoreEntries, filepath.Join("skills", fullPath)) + agentName := strings.TrimSuffix(name, ".md") - // If it's a tracked repo (has .git), don't recurse into it - if tracked { - return filepath.SkipDir + existing := store.Get(agentName) + if existing == nil || existing.Source == "" { + continue } - // If it has metadata, it's a leaf skill — don't recurse - if meta != nil && meta.Source != "" { - return filepath.SkipDir + if existing.Kind != "agent" { + existing.Kind = "agent" + changed = true } - 
return nil - }) - if err != nil { - return fmt.Errorf("failed to scan project skills: %w", err) + gitignoreEntries = append(gitignoreEntries, filepath.Join("agents", name)) } - // Prune stale skill entries (not on disk). Preserve non-skill entries (agents). - var pruneChanged bool - reg.Skills, pruneChanged = PruneStaleSkills(reg.Skills, live, true) - changed = changed || pruneChanged - - // Batch-update .gitignore once (reads/writes the file only once instead of per-skill). if len(gitignoreEntries) > 0 { if err := install.UpdateGitIgnoreBatch(filepath.Join(projectRoot, ".skillshare"), gitignoreEntries); err != nil { - return fmt.Errorf("failed to update .skillshare/.gitignore: %w", err) + return fmt.Errorf("failed to update .skillshare/.gitignore for agents: %w", err) } } if changed { - if err := reg.Save(filepath.Join(projectRoot, ".skillshare")); err != nil { + if err := store.Save(agentsSourcePath); err != nil { return err } } return nil } - -// isGitRepo checks if the given path is a git repository (has .git/ directory or file). -func isGitRepo(path string) bool { - _, err := os.Stat(filepath.Join(path, ".git")) - return err == nil -} - -// gitCurrentBranch returns the current branch name for a git repo, or "" on failure. -func gitCurrentBranch(repoPath string) string { - cmd := exec.Command("git", "-C", repoPath, "rev-parse", "--abbrev-ref", "HEAD") - out, err := cmd.Output() - if err != nil { - return "" - } - return strings.TrimSpace(string(out)) -} - -// gitRemoteOrigin returns the "origin" remote URL for a git repo, or "" on failure. 
-func gitRemoteOrigin(repoPath string) string { - cmd := exec.Command("git", "-C", repoPath, "remote", "get-url", "origin") - out, err := cmd.Output() - if err != nil { - return "" - } - return strings.TrimSpace(string(out)) -} diff --git a/internal/config/project_reconcile_test.go b/internal/config/project_reconcile_test.go index a1d6f62f..3779fe6d 100644 --- a/internal/config/project_reconcile_test.go +++ b/internal/config/project_reconcile_test.go @@ -1,44 +1,40 @@ package config import ( - "encoding/json" "os" "path/filepath" "testing" + + "skillshare/internal/install" ) func TestReconcileProjectSkills_AddsNewSkill(t *testing.T) { root := t.TempDir() skillsDir := filepath.Join(root, ".skillshare", "skills") - // Create a skill with install metadata + // Create a skill directory on disk skillPath := filepath.Join(skillsDir, "my-skill") if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/repo"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfg := &ProjectConfig{ Targets: []ProjectTargetEntry{{Name: "claude"}}, } - reg := &Registry{} + // Pre-populate store with the entry (simulating post-install state) + store := install.NewMetadataStore() + store.Set("my-skill", &install.MetadataEntry{Source: "github.com/user/repo"}) - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + if !store.Has("my-skill") { + t.Fatal("expected store to have 'my-skill'") } - if reg.Skills[0].Name != "my-skill" { - t.Errorf("expected skill name 'my-skill', got %q", reg.Skills[0].Name) - } - if reg.Skills[0].Source != "github.com/user/repo" { - t.Errorf("expected source 
'github.com/user/repo', got %q", reg.Skills[0].Source) + entry := store.Get("my-skill") + if entry.Source != "github.com/user/repo" { + t.Errorf("expected source 'github.com/user/repo', got %q", entry.Source) } } @@ -50,28 +46,23 @@ func TestReconcileProjectSkills_UpdatesExistingSource(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/repo-v2"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfg := &ProjectConfig{ Targets: []ProjectTargetEntry{{Name: "claude"}}, } - reg := &Registry{ - Skills: []SkillEntry{{Name: "my-skill", Source: "github.com/user/repo-v1"}}, - } + store := install.NewMetadataStore() + store.Set("my-skill", &install.MetadataEntry{Source: "github.com/user/repo-v1"}) - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + entry := store.Get("my-skill") + if entry == nil { + t.Fatal("expected store to have 'my-skill'") } - if reg.Skills[0].Source != "github.com/user/repo-v2" { - t.Errorf("expected updated source 'github.com/user/repo-v2', got %q", reg.Skills[0].Source) + if entry.Source != "github.com/user/repo-v1" { + t.Errorf("expected source 'github.com/user/repo-v1', got %q", entry.Source) } } @@ -79,12 +70,11 @@ func TestReconcileProjectSkills_SkipsNoMeta(t *testing.T) { root := t.TempDir() skillsDir := filepath.Join(root, ".skillshare", "skills") - // Create a skill directory without metadata + // Create a skill directory without metadata in the store skillPath := filepath.Join(skillsDir, "local-skill") if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - // Write a SKILL.md but no meta file if err := 
os.WriteFile(filepath.Join(skillPath, "SKILL.md"), []byte("# Local skill"), 0644); err != nil { t.Fatal(err) } @@ -92,14 +82,14 @@ func TestReconcileProjectSkills_SkipsNoMeta(t *testing.T) { cfg := &ProjectConfig{ Targets: []ProjectTargetEntry{{Name: "claude"}}, } - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 0 { - t.Errorf("expected 0 skills (no meta), got %d", len(reg.Skills)) + if len(store.List()) != 0 { + t.Errorf("expected 0 entries (no meta), got %d", len(store.List())) } } @@ -111,14 +101,14 @@ func TestReconcileProjectSkills_EmptyDir(t *testing.T) { } cfg := &ProjectConfig{} - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 0 { - t.Errorf("expected 0 skills, got %d", len(reg.Skills)) + if len(store.List()) != 0 { + t.Errorf("expected 0 entries, got %d", len(store.List())) } } @@ -127,9 +117,9 @@ func TestReconcileProjectSkills_MissingDir(t *testing.T) { skillsDir := filepath.Join(root, ".skillshare", "skills") // does not exist cfg := &ProjectConfig{} - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills should not fail for missing dir: %v", err) } } @@ -143,33 +133,35 @@ func TestReconcileProjectSkills_NestedSkillSetsGroup(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/repo"} - data, _ := 
json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfg := &ProjectConfig{ Targets: []ProjectTargetEntry{{Name: "claude"}}, } - reg := &Registry{} + store := install.NewMetadataStore() + store.Set("my-skill", &install.MetadataEntry{ + Source: "github.com/user/repo", + Group: "tools", + }) - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + // After reconcile, nested skills use full-path keys (e.g. "tools/my-skill"). + entry := store.Get("tools/my-skill") + if entry == nil { + t.Fatal("expected store to have 'tools/my-skill'") } - if reg.Skills[0].Name != "my-skill" { - t.Errorf("expected bare name 'my-skill', got %q", reg.Skills[0].Name) + if entry.Group != "tools" { + t.Errorf("expected group 'tools', got %q", entry.Group) } - if reg.Skills[0].Group != "tools" { - t.Errorf("expected group 'tools', got %q", reg.Skills[0].Group) + // Legacy basename key should be removed after migration. 
+ if store.Has("my-skill") { + t.Error("expected legacy basename key 'my-skill' to be removed") } } -func TestReconcileProjectSkills_PrunesStalePreservesAgents(t *testing.T) { +func TestReconcileProjectSkills_PrunesStaleEntries(t *testing.T) { root := t.TempDir() skillsDir := filepath.Join(root, ".skillshare", "skills") @@ -178,80 +170,26 @@ func TestReconcileProjectSkills_PrunesStalePreservesAgents(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/alive"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfg := &ProjectConfig{ Targets: []ProjectTargetEntry{{Name: "claude"}}, } - // Registry: alive skill + stale skill + agent (should survive prune) - reg := &Registry{ - Skills: []SkillEntry{ - {Name: "alive-skill", Source: "github.com/user/alive"}, - {Name: "deleted-skill", Source: "github.com/user/deleted"}, - {Name: "my-agent", Kind: "agent", Source: "github.com/user/agent"}, - }, - } + store := install.NewMetadataStore() + store.Set("alive-skill", &install.MetadataEntry{Source: "github.com/user/alive"}) + store.Set("deleted-skill", &install.MetadataEntry{Source: "github.com/user/deleted"}) - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { + if err := ReconcileProjectSkills(root, cfg, store, skillsDir); err != nil { t.Fatalf("ReconcileProjectSkills failed: %v", err) } - if len(reg.Skills) != 2 { - t.Fatalf("expected 2 entries (alive-skill + agent), got %d: %+v", len(reg.Skills), reg.Skills) + names := store.List() + if len(names) != 1 { + t.Fatalf("expected 1 entry after prune, got %d: %v", len(names), names) } - - names := map[string]bool{} - for _, s := range reg.Skills { - names[s.Name] = true - } - if !names["alive-skill"] { + if !store.Has("alive-skill") { t.Error("expected alive-skill to survive prune") } - if !names["my-agent"] { - 
t.Error("expected agent entry to survive prune") - } - if names["deleted-skill"] { + if store.Has("deleted-skill") { t.Error("expected deleted-skill to be pruned") } } - -func TestReconcileProjectSkills_MigratesLegacySlashName(t *testing.T) { - root := t.TempDir() - skillsDir := filepath.Join(root, ".skillshare", "skills") - - skillPath := filepath.Join(skillsDir, "tools", "my-skill") - if err := os.MkdirAll(skillPath, 0755); err != nil { - t.Fatal(err) - } - meta := map[string]string{"source": "github.com/user/repo"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } - - // Start with legacy format - cfg := &ProjectConfig{ - Targets: []ProjectTargetEntry{{Name: "claude"}}, - } - reg := &Registry{ - Skills: []SkillEntry{{Name: "tools/my-skill", Source: "github.com/user/repo"}}, - } - - if err := ReconcileProjectSkills(root, cfg, reg, skillsDir); err != nil { - t.Fatalf("ReconcileProjectSkills failed: %v", err) - } - - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) - } - if reg.Skills[0].Name != "my-skill" { - t.Errorf("expected migrated name 'my-skill', got %q", reg.Skills[0].Name) - } - if reg.Skills[0].Group != "tools" { - t.Errorf("expected migrated group 'tools', got %q", reg.Skills[0].Group) - } -} diff --git a/internal/config/reconcile.go b/internal/config/reconcile.go index 8afc10e1..79fe09b0 100644 --- a/internal/config/reconcile.go +++ b/internal/config/reconcile.go @@ -3,142 +3,30 @@ package config import ( "fmt" "os" - "path/filepath" - "strings" "skillshare/internal/install" - "skillshare/internal/utils" ) // ReconcileGlobalSkills scans the global source directory for remotely-installed // skills (those with install metadata or tracked repos) and ensures they are -// listed in Config.Skills[]. This is the global-mode counterpart of -// ReconcileProjectSkills. 
-func ReconcileGlobalSkills(cfg *Config, reg *Registry) error { +// present in the MetadataStore. +func ReconcileGlobalSkills(cfg *Config, store *install.MetadataStore) error { sourcePath := cfg.Source if _, err := os.Stat(sourcePath); os.IsNotExist(err) { - return nil // no skills dir yet - } - - changed := false - index := map[string]int{} - for i, skill := range reg.Skills { - index[skill.FullName()] = i - } - - // Migrate legacy entries: name "frontend/pdf" → group "frontend", name "pdf" - for i := range reg.Skills { - s := ®.Skills[i] - if s.Group == "" && strings.Contains(s.Name, "/") { - group, bare := s.EffectiveParts() - s.Group = group - s.Name = bare - changed = true - } + return nil } - walkRoot := utils.ResolveSymlink(sourcePath) - live := map[string]bool{} // tracks skills actually found on disk - err := filepath.WalkDir(walkRoot, func(path string, d os.DirEntry, err error) error { - if err != nil { - return nil - } - if path == walkRoot { - return nil - } - if !d.IsDir() { - return nil - } - if utils.IsHidden(d.Name()) { - return filepath.SkipDir - } - if d.Name() == ".git" { - return filepath.SkipDir - } - - relPath, relErr := filepath.Rel(walkRoot, path) - if relErr != nil { - return nil - } - - var source string - tracked := isGitRepo(path) - - meta, metaErr := install.ReadMeta(path) - if metaErr == nil && meta != nil && meta.Source != "" { - source = meta.Source - } else if tracked { - source = gitRemoteOrigin(path) - } - if source == "" { - return nil - } - - fullPath := filepath.ToSlash(relPath) - live[fullPath] = true - - // Determine branch: from metadata (regular skills) or git (tracked repos) - var branch string - if meta != nil { - branch = meta.Branch - } else if tracked { - branch = gitCurrentBranch(path) - } - - if existingIdx, ok := index[fullPath]; ok { - if reg.Skills[existingIdx].Source != source { - reg.Skills[existingIdx].Source = source - changed = true - } - if reg.Skills[existingIdx].Tracked != tracked { - 
reg.Skills[existingIdx].Tracked = tracked - changed = true - } - if reg.Skills[existingIdx].Branch != branch { - reg.Skills[existingIdx].Branch = branch - changed = true - } - } else { - entry := SkillEntry{ - Source: source, - Tracked: tracked, - Branch: branch, - } - if idx := strings.LastIndex(fullPath, "/"); idx >= 0 { - entry.Group = fullPath[:idx] - entry.Name = fullPath[idx+1:] - } else { - entry.Name = fullPath - } - reg.Skills = append(reg.Skills, entry) - index[fullPath] = len(reg.Skills) - 1 - changed = true - } - - if tracked { - return filepath.SkipDir - } - if meta != nil && meta.Source != "" { - return filepath.SkipDir - } - - return nil - }) + result, err := reconcileSkillsWalk(sourcePath, store, nil) if err != nil { return fmt.Errorf("failed to scan global skills: %w", err) } - // Prune stale entries: skills in registry but no longer on disk - var pruneChanged bool - reg.Skills, pruneChanged = PruneStaleSkills(reg.Skills, live, false) - changed = changed || pruneChanged + if pruneStaleEntries(store, result.live) { + result.changed = true + } - if changed { - regDir := cfg.RegistryDir - if regDir == "" { - regDir = SourceRoot(cfg.Source) - } - if err := reg.Save(regDir); err != nil { + if result.changed { + if err := store.Save(sourcePath); err != nil { return err } } diff --git a/internal/config/reconcile_core.go b/internal/config/reconcile_core.go new file mode 100644 index 00000000..195f8b51 --- /dev/null +++ b/internal/config/reconcile_core.go @@ -0,0 +1,161 @@ +package config + +import ( + "os" + "os/exec" + "path/filepath" + "strings" + + "skillshare/internal/install" + "skillshare/internal/utils" +) + +// reconcileResult holds the output of a reconcile walk. +type reconcileResult struct { + live map[string]bool + changed bool +} + +// reconcileSkillsWalk walks sourcePath for installed skills (those with metadata +// or tracked repos) and ensures they are present in the MetadataStore. 
+// onFound is called for each discovered installed skill; pass nil to skip. +func reconcileSkillsWalk(sourcePath string, store *install.MetadataStore, onFound func(fullPath string)) (reconcileResult, error) { + result := reconcileResult{live: map[string]bool{}} + + walkRoot := utils.ResolveSymlink(sourcePath) + err := filepath.WalkDir(walkRoot, func(path string, d os.DirEntry, walkErr error) error { + if walkErr != nil { + return nil + } + if path == walkRoot { + return nil + } + if !d.IsDir() { + return nil + } + if utils.IsHidden(d.Name()) { + return filepath.SkipDir + } + if d.Name() == ".git" { + return filepath.SkipDir + } + + relPath, relErr := filepath.Rel(walkRoot, path) + if relErr != nil { + return nil + } + + fullPath := filepath.ToSlash(relPath) + + group := "" + if idx := strings.LastIndex(fullPath, "/"); idx >= 0 { + group = fullPath[:idx] + } + + var source string + tracked := isGitRepo(path) + + existing := store.GetByPath(fullPath) + if existing != nil && existing.Source != "" { + source = existing.Source + } else if tracked { + source = gitRemoteOrigin(path) + } + if source == "" { + return nil + } + + result.live[fullPath] = true + + var branch string + if existing != nil && existing.Branch != "" { + branch = existing.Branch + } else if tracked { + branch = gitCurrentBranch(path) + } + + if existing != nil { + if store.MigrateLegacyKey(fullPath, existing) { + result.changed = true + } + if existing.Source != source { + existing.Source = source + result.changed = true + } + if existing.Tracked != tracked { + existing.Tracked = tracked + result.changed = true + } + if existing.Branch != branch { + existing.Branch = branch + result.changed = true + } + if existing.Group != group { + existing.Group = group + result.changed = true + } + } else { + entry := &install.MetadataEntry{ + Source: source, + Tracked: tracked, + Branch: branch, + Group: group, + } + store.Set(fullPath, entry) + result.changed = true + } + + if onFound != nil { + 
onFound(fullPath) + } + + if tracked { + return filepath.SkipDir + } + if existing != nil && existing.Source != "" { + return filepath.SkipDir + } + + return nil + }) + + return result, err +} + +// pruneStaleEntries removes store entries not present in the live set. +func pruneStaleEntries(store *install.MetadataStore, live map[string]bool) bool { + changed := false + for _, name := range store.List() { + if !live[name] { + store.Remove(name) + changed = true + } + } + return changed +} + +// isGitRepo checks if the given path is a git repository (has .git/ directory or file). +func isGitRepo(path string) bool { + _, err := os.Stat(filepath.Join(path, ".git")) + return err == nil +} + +// gitCurrentBranch returns the current branch name for a git repo, or "" on failure. +func gitCurrentBranch(repoPath string) string { + cmd := exec.Command("git", "-C", repoPath, "rev-parse", "--abbrev-ref", "HEAD") + out, err := cmd.Output() + if err != nil { + return "" + } + return strings.TrimSpace(string(out)) +} + +// gitRemoteOrigin returns the "origin" remote URL for a git repo, or "" on failure. 
+func gitRemoteOrigin(repoPath string) string { + cmd := exec.Command("git", "-C", repoPath, "remote", "get-url", "origin") + out, err := cmd.Output() + if err != nil { + return "" + } + return strings.TrimSpace(string(out)) +} diff --git a/internal/config/reconcile_test.go b/internal/config/reconcile_test.go index 0ae66412..f4885ea2 100644 --- a/internal/config/reconcile_test.go +++ b/internal/config/reconcile_test.go @@ -1,11 +1,12 @@ package config import ( - "encoding/json" "os" "path/filepath" "testing" + "skillshare/internal/install" + "gopkg.in/yaml.v3" ) @@ -14,18 +15,12 @@ func TestReconcileGlobalSkills_AddsNewSkill(t *testing.T) { sourceDir := filepath.Join(root, "skills") configPath := filepath.Join(root, "config.yaml") - // Create a skill with install metadata + // Create a skill directory on disk skillPath := filepath.Join(sourceDir, "my-skill") if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/repo"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } - // Write initial config (needed for ConfigPath() resolution) cfgData, _ := yaml.Marshal(&Config{Source: sourceDir}) if err := os.WriteFile(configPath, cfgData, 0644); err != nil { t.Fatal(err) @@ -33,20 +28,20 @@ func TestReconcileGlobalSkills_AddsNewSkill(t *testing.T) { t.Setenv("SKILLSHARE_CONFIG", configPath) cfg := &Config{Source: sourceDir} - reg := &Registry{} + // Pre-populate store with the entry (simulating post-install state) + store := install.NewMetadataStore() + store.Set("my-skill", &install.MetadataEntry{Source: "github.com/user/repo"}) - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + if !store.Has("my-skill") { 
+ t.Fatal("expected store to have 'my-skill'") } - if reg.Skills[0].Name != "my-skill" { - t.Errorf("expected skill name 'my-skill', got %q", reg.Skills[0].Name) - } - if reg.Skills[0].Source != "github.com/user/repo" { - t.Errorf("expected source 'github.com/user/repo', got %q", reg.Skills[0].Source) + entry := store.Get("my-skill") + if entry.Source != "github.com/user/repo" { + t.Errorf("expected source 'github.com/user/repo', got %q", entry.Source) } } @@ -59,11 +54,6 @@ func TestReconcileGlobalSkills_UpdatesExistingSource(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/repo-v2"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfgData, _ := yaml.Marshal(&Config{Source: sourceDir}) if err := os.WriteFile(configPath, cfgData, 0644); err != nil { @@ -72,19 +62,20 @@ func TestReconcileGlobalSkills_UpdatesExistingSource(t *testing.T) { t.Setenv("SKILLSHARE_CONFIG", configPath) cfg := &Config{Source: sourceDir} - reg := &Registry{ - Skills: []SkillEntry{{Name: "my-skill", Source: "github.com/user/repo-v1"}}, - } + store := install.NewMetadataStore() + store.Set("my-skill", &install.MetadataEntry{Source: "github.com/user/repo-v1"}) - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + entry := store.Get("my-skill") + if entry == nil { + t.Fatal("expected store to have 'my-skill'") } - if reg.Skills[0].Source != "github.com/user/repo-v2" { - t.Errorf("expected updated source 'github.com/user/repo-v2', got %q", reg.Skills[0].Source) + // Source should remain as-is since reconcile reads from the existing store entry + if entry.Source != "github.com/user/repo-v1" { + 
t.Errorf("expected source 'github.com/user/repo-v1', got %q", entry.Source) } } @@ -93,7 +84,7 @@ func TestReconcileGlobalSkills_SkipsNoMeta(t *testing.T) { sourceDir := filepath.Join(root, "skills") configPath := filepath.Join(root, "config.yaml") - // Create a skill directory without metadata + // Create a skill directory without metadata in the store skillPath := filepath.Join(sourceDir, "local-skill") if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) @@ -109,14 +100,14 @@ func TestReconcileGlobalSkills_SkipsNoMeta(t *testing.T) { t.Setenv("SKILLSHARE_CONFIG", configPath) cfg := &Config{Source: sourceDir} - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 0 { - t.Errorf("expected 0 skills (no meta), got %d", len(reg.Skills)) + if len(store.List()) != 0 { + t.Errorf("expected 0 entries (no meta), got %d", len(store.List())) } } @@ -128,14 +119,14 @@ func TestReconcileGlobalSkills_EmptyDir(t *testing.T) { } cfg := &Config{Source: sourceDir} - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 0 { - t.Errorf("expected 0 skills, got %d", len(reg.Skills)) + if len(store.List()) != 0 { + t.Errorf("expected 0 entries, got %d", len(store.List())) } } @@ -144,9 +135,9 @@ func TestReconcileGlobalSkills_MissingDir(t *testing.T) { sourceDir := filepath.Join(root, "skills") // does not exist cfg := &Config{Source: sourceDir} - reg := &Registry{} + store := install.NewMetadataStore() - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills should not fail for missing dir: 
%v", err) } } @@ -161,11 +152,6 @@ func TestReconcileGlobalSkills_NestedSkillSetsGroup(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "anthropics/skills/skills/pdf"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfgData, _ := yaml.Marshal(&Config{Source: sourceDir}) if err := os.WriteFile(configPath, cfgData, 0644); err != nil { @@ -174,23 +160,27 @@ func TestReconcileGlobalSkills_NestedSkillSetsGroup(t *testing.T) { t.Setenv("SKILLSHARE_CONFIG", configPath) cfg := &Config{Source: sourceDir} - reg := &Registry{} + store := install.NewMetadataStore() + store.Set("pdf", &install.MetadataEntry{ + Source: "anthropics/skills/skills/pdf", + Group: "frontend", + }) - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) + // After reconcile, nested skills use full-path keys (e.g. "frontend/pdf"). + entry := store.Get("frontend/pdf") + if entry == nil { + t.Fatal("expected store to have 'frontend/pdf'") } - if reg.Skills[0].Name != "pdf" { - t.Errorf("expected bare name 'pdf', got %q", reg.Skills[0].Name) + if entry.Group != "frontend" { + t.Errorf("expected group 'frontend', got %q", entry.Group) } - if reg.Skills[0].Group != "frontend" { - t.Errorf("expected group 'frontend', got %q", reg.Skills[0].Group) - } - if reg.Skills[0].FullName() != "frontend/pdf" { - t.Errorf("expected FullName 'frontend/pdf', got %q", reg.Skills[0].FullName()) + // Legacy basename key should be removed after migration. 
+ if store.Has("pdf") { + t.Error("expected legacy basename key 'pdf' to be removed") } } @@ -204,11 +194,6 @@ func TestReconcileGlobalSkills_PrunesStaleEntries(t *testing.T) { if err := os.MkdirAll(skillPath, 0755); err != nil { t.Fatal(err) } - meta := map[string]string{"source": "github.com/user/alive"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } cfgData, _ := yaml.Marshal(&Config{Source: sourceDir}) if err := os.WriteFile(configPath, cfgData, 0644); err != nil { @@ -216,67 +201,21 @@ func TestReconcileGlobalSkills_PrunesStaleEntries(t *testing.T) { } t.Setenv("SKILLSHARE_CONFIG", configPath) - // Registry has both the alive skill and a stale one (not on disk) cfg := &Config{Source: sourceDir} - reg := &Registry{ - Skills: []SkillEntry{ - {Name: "alive-skill", Source: "github.com/user/alive"}, - {Name: "deleted-skill", Source: "github.com/user/deleted"}, - {Group: "frontend", Name: "gone-skill", Source: "github.com/user/gone"}, - }, - } + store := install.NewMetadataStore() + store.Set("alive-skill", &install.MetadataEntry{Source: "github.com/user/alive"}) + store.Set("deleted-skill", &install.MetadataEntry{Source: "github.com/user/deleted"}) + store.Set("frontend/gone-skill", &install.MetadataEntry{Source: "github.com/user/gone", Group: "frontend"}) - if err := ReconcileGlobalSkills(cfg, reg); err != nil { + if err := ReconcileGlobalSkills(cfg, store); err != nil { t.Fatalf("ReconcileGlobalSkills failed: %v", err) } - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill after prune, got %d: %+v", len(reg.Skills), reg.Skills) - } - if reg.Skills[0].Name != "alive-skill" { - t.Errorf("expected surviving skill 'alive-skill', got %q", reg.Skills[0].Name) - } -} - -func TestReconcileGlobalSkills_MigratesLegacySlashName(t *testing.T) { - root := t.TempDir() - sourceDir := filepath.Join(root, "skills") - configPath := filepath.Join(root, "config.yaml") - - // 
Create nested skill on disk - skillPath := filepath.Join(sourceDir, "frontend", "pdf") - if err := os.MkdirAll(skillPath, 0755); err != nil { - t.Fatal(err) - } - meta := map[string]string{"source": "anthropics/skills/skills/pdf"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillPath, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatal(err) - } - - cfgData, _ := yaml.Marshal(&Config{Source: sourceDir}) - if err := os.WriteFile(configPath, cfgData, 0644); err != nil { - t.Fatal(err) - } - t.Setenv("SKILLSHARE_CONFIG", configPath) - - // Start with legacy format: name contains slash, no group - cfg := &Config{Source: sourceDir} - reg := &Registry{ - Skills: []SkillEntry{{Name: "frontend/pdf", Source: "anthropics/skills/skills/pdf"}}, - } - - if err := ReconcileGlobalSkills(cfg, reg); err != nil { - t.Fatalf("ReconcileGlobalSkills failed: %v", err) - } - - if len(reg.Skills) != 1 { - t.Fatalf("expected 1 skill, got %d", len(reg.Skills)) - } - if reg.Skills[0].Name != "pdf" { - t.Errorf("expected migrated name 'pdf', got %q", reg.Skills[0].Name) + names := store.List() + if len(names) != 1 { + t.Fatalf("expected 1 entry after prune, got %d: %v", len(names), names) } - if reg.Skills[0].Group != "frontend" { - t.Errorf("expected migrated group 'frontend', got %q", reg.Skills[0].Group) + if !store.Has("alive-skill") { + t.Errorf("expected surviving entry 'alive-skill'") } } diff --git a/internal/config/registry.go b/internal/config/registry.go index 256ec7d5..a186ed93 100644 --- a/internal/config/registry.go +++ b/internal/config/registry.go @@ -127,3 +127,62 @@ func (r *Registry) Save(dir string) error { return nil } + +// LoadUnifiedRegistry merges registries from both skills and agents source +// directories into a single Registry. Used in global mode where skills and +// agents have separate registry files. +// Skills entries get Kind="" (backward compat), agent entries get Kind="agent". 
+func LoadUnifiedRegistry(skillsDir, agentsDir string) (*Registry, error) { + skillsReg, err := LoadRegistry(skillsDir) + if err != nil { + return nil, fmt.Errorf("failed to load skills registry: %w", err) + } + + agentsReg, err := LoadRegistry(agentsDir) + if err != nil { + return nil, fmt.Errorf("failed to load agents registry: %w", err) + } + + // Ensure agent entries have Kind set + for i := range agentsReg.Skills { + if agentsReg.Skills[i].Kind == "" { + agentsReg.Skills[i].Kind = "agent" + } + } + + unified := &Registry{ + Skills: make([]SkillEntry, 0, len(skillsReg.Skills)+len(agentsReg.Skills)), + } + unified.Skills = append(unified.Skills, skillsReg.Skills...) + unified.Skills = append(unified.Skills, agentsReg.Skills...) + + return unified, nil +} + +// SaveSplitByKind splits the unified registry by kind and saves each to +// the appropriate directory. Skills (Kind="" or "skill") go to skillsDir, +// agents (Kind="agent") go to agentsDir. +func (r *Registry) SaveSplitByKind(skillsDir, agentsDir string) error { + skillsReg := &Registry{} + agentsReg := &Registry{} + + for _, entry := range r.Skills { + if entry.EffectiveKind() == "agent" { + agentsReg.Skills = append(agentsReg.Skills, entry) + } else { + skillsReg.Skills = append(skillsReg.Skills, entry) + } + } + + if err := skillsReg.Save(skillsDir); err != nil { + return fmt.Errorf("failed to save skills registry: %w", err) + } + + if len(agentsReg.Skills) > 0 { + if err := agentsReg.Save(agentsDir); err != nil { + return fmt.Errorf("failed to save agents registry: %w", err) + } + } + + return nil +} diff --git a/internal/config/registry_test.go b/internal/config/registry_test.go index 08a089cd..ca8cbef6 100644 --- a/internal/config/registry_test.go +++ b/internal/config/registry_test.go @@ -130,6 +130,127 @@ func TestMigrateGlobalSkills_NoMigrationWhenRegistryExists(t *testing.T) { } } +func TestLoadUnifiedRegistry_MergesBoth(t *testing.T) { + skillsDir := t.TempDir() + agentsDir := t.TempDir() + + // 
Write skills registry + skillsReg := &Registry{ + Skills: []SkillEntry{ + {Name: "my-skill", Source: "github.com/user/skills-repo"}, + }, + } + if err := skillsReg.Save(skillsDir); err != nil { + t.Fatalf("Save skills: %v", err) + } + + // Write agents registry + agentsReg := &Registry{ + Skills: []SkillEntry{ + {Name: "my-agent", Source: "github.com/user/agents-repo"}, + }, + } + if err := agentsReg.Save(agentsDir); err != nil { + t.Fatalf("Save agents: %v", err) + } + + unified, err := LoadUnifiedRegistry(skillsDir, agentsDir) + if err != nil { + t.Fatalf("LoadUnifiedRegistry: %v", err) + } + + if len(unified.Skills) != 2 { + t.Fatalf("expected 2 entries, got %d", len(unified.Skills)) + } + + // Agent entry should have Kind="agent" + for _, e := range unified.Skills { + if e.Name == "my-agent" && e.EffectiveKind() != "agent" { + t.Errorf("agent entry should have kind=agent, got %q", e.Kind) + } + if e.Name == "my-skill" && e.EffectiveKind() != "skill" { + t.Errorf("skill entry should have kind=skill, got %q", e.Kind) + } + } +} + +func TestLoadUnifiedRegistry_EmptyAgents(t *testing.T) { + skillsDir := t.TempDir() + agentsDir := t.TempDir() // no registry file + + skillsReg := &Registry{ + Skills: []SkillEntry{ + {Name: "s1", Source: "test"}, + }, + } + skillsReg.Save(skillsDir) + + unified, err := LoadUnifiedRegistry(skillsDir, agentsDir) + if err != nil { + t.Fatalf("LoadUnifiedRegistry: %v", err) + } + if len(unified.Skills) != 1 { + t.Fatalf("expected 1 entry, got %d", len(unified.Skills)) + } +} + +func TestSaveSplitByKind_RoundTrip(t *testing.T) { + skillsDir := t.TempDir() + agentsDir := t.TempDir() + + unified := &Registry{ + Skills: []SkillEntry{ + {Name: "skill-a", Source: "s1"}, + {Name: "agent-b", Source: "s2", Kind: "agent"}, + {Name: "skill-c", Source: "s3", Kind: "skill"}, + }, + } + + if err := unified.SaveSplitByKind(skillsDir, agentsDir); err != nil { + t.Fatalf("SaveSplitByKind: %v", err) + } + + // Load back separately + skillsReg, err := 
LoadRegistry(skillsDir) + if err != nil { + t.Fatalf("LoadRegistry skills: %v", err) + } + if len(skillsReg.Skills) != 2 { + t.Fatalf("expected 2 skill entries, got %d", len(skillsReg.Skills)) + } + + agentsReg, err := LoadRegistry(agentsDir) + if err != nil { + t.Fatalf("LoadRegistry agents: %v", err) + } + if len(agentsReg.Skills) != 1 { + t.Fatalf("expected 1 agent entry, got %d", len(agentsReg.Skills)) + } + if agentsReg.Skills[0].Name != "agent-b" { + t.Errorf("agent name = %q, want %q", agentsReg.Skills[0].Name, "agent-b") + } +} + +func TestSaveSplitByKind_NoAgents_SkipsAgentFile(t *testing.T) { + skillsDir := t.TempDir() + agentsDir := t.TempDir() + + unified := &Registry{ + Skills: []SkillEntry{ + {Name: "only-skill", Source: "s1"}, + }, + } + + if err := unified.SaveSplitByKind(skillsDir, agentsDir); err != nil { + t.Fatalf("SaveSplitByKind: %v", err) + } + + // Agents dir should not have registry file + if _, err := os.Stat(filepath.Join(agentsDir, "registry.yaml")); err == nil { + t.Error("expected no agents registry.yaml when no agent entries") + } +} + func TestSourceRoot_NoGit(t *testing.T) { dir := t.TempDir() got := SourceRoot(dir) diff --git a/internal/hub/index.go b/internal/hub/index.go index 0650b35a..2bc3657d 100644 --- a/internal/hub/index.go +++ b/internal/hub/index.go @@ -64,24 +64,27 @@ func BuildIndex(sourcePath string, full bool, auditSkills bool) (*Index, error) return nil, err } + // Load centralized metadata store once for all skills. + store := install.LoadMetadataOrNew(sourcePath) + entries := make([]SkillEntry, len(discovered)) for i, d := range discovered { item := SkillEntry{ Name: filepath.Base(d.SourcePath), } - // Determine source: prefer meta.Source (remote origin), fallback to relPath. + // Determine source: prefer entry.Source (remote origin), fallback to relPath. 
source := d.RelPath - if meta, _ := install.ReadMeta(d.SourcePath); meta != nil { - if meta.Source != "" { - source = meta.Source + if entry := store.GetByPath(d.RelPath); entry != nil { + if entry.Source != "" { + source = entry.Source } if full { - item.Type = meta.Type - item.RepoURL = meta.RepoURL - item.Version = meta.Version - if !meta.InstalledAt.IsZero() { - item.InstalledAt = meta.InstalledAt.UTC().Format(time.RFC3339) + item.Type = entry.Type + item.RepoURL = entry.RepoURL + item.Version = entry.Version + if !entry.InstalledAt.IsZero() { + item.InstalledAt = entry.InstalledAt.UTC().Format(time.RFC3339) } } } diff --git a/internal/install/copy.go b/internal/install/copy.go index a3d71bbe..ca6496c4 100644 --- a/internal/install/copy.go +++ b/internal/install/copy.go @@ -6,7 +6,17 @@ import ( "path/filepath" ) +// copyDir recursively copies src into dst, skipping any `.git` directory. func copyDir(src, dst string) error { + return copyDirExcluding(src, dst, nil) +} + +// copyDirExcluding recursively copies src into dst, skipping any `.git` +// directory and any subdirectory whose slash-normalized path relative to src +// appears in excludes. The excludes keys must use forward slashes (e.g. +// "skills/officecli-pptx"). Passing a nil or empty map is equivalent to +// copyDir. +func copyDirExcluding(src, dst string, excludes map[string]bool) error { return filepath.Walk(src, func(path string, info os.FileInfo, err error) error { if err != nil { return err @@ -20,6 +30,15 @@ func copyDir(src, dst string) error { return filepath.SkipDir } + // Skip excluded subdirectories (e.g. child skill dirs when copying + // the root of an orchestrator repo so they do not duplicate into the + // root install). + if len(excludes) > 0 && info.IsDir() && relPath != "." 
{ + if excludes[filepath.ToSlash(relPath)] { + return filepath.SkipDir + } + } + if info.IsDir() { return os.MkdirAll(dstPath, info.Mode()) } diff --git a/internal/install/copy_test.go b/internal/install/copy_test.go index 491d9c7a..b816ebc4 100644 --- a/internal/install/copy_test.go +++ b/internal/install/copy_test.go @@ -61,6 +61,78 @@ func TestCopyDir_SkipsGit(t *testing.T) { } } +func TestCopyDirExcluding_SkipsChildSkillDirs(t *testing.T) { + src := t.TempDir() + dst := filepath.Join(t.TempDir(), "dest") + + // Orchestrator-style repo layout (issue #124): + // SKILL.md (root skill) + // README.md + // src/helper.go (root-owned asset, must remain) + // skills/child-a/SKILL.md (child skill, must be excluded) + // skills/child-b/SKILL.md (child skill, must be excluded) + // skills/shared/README.md (NOT a skill dir, must remain) + os.WriteFile(filepath.Join(src, "SKILL.md"), []byte("# Root"), 0644) + os.WriteFile(filepath.Join(src, "README.md"), []byte("root readme"), 0644) + os.MkdirAll(filepath.Join(src, "src"), 0755) + os.WriteFile(filepath.Join(src, "src", "helper.go"), []byte("package main"), 0644) + os.MkdirAll(filepath.Join(src, "skills", "child-a"), 0755) + os.WriteFile(filepath.Join(src, "skills", "child-a", "SKILL.md"), []byte("# A"), 0644) + os.MkdirAll(filepath.Join(src, "skills", "child-b"), 0755) + os.WriteFile(filepath.Join(src, "skills", "child-b", "SKILL.md"), []byte("# B"), 0644) + os.MkdirAll(filepath.Join(src, "skills", "shared"), 0755) + os.WriteFile(filepath.Join(src, "skills", "shared", "README.md"), []byte("shared"), 0644) + + excludes := map[string]bool{ + "skills/child-a": true, + "skills/child-b": true, + } + if err := copyDirExcluding(src, dst, excludes); err != nil { + t.Fatal(err) + } + + // Files that MUST exist in the copy + mustExist := []string{ + "SKILL.md", + "README.md", + filepath.Join("src", "helper.go"), + filepath.Join("skills", "shared", "README.md"), + } + for _, rel := range mustExist { + if _, err := 
os.Stat(filepath.Join(dst, rel)); err != nil { + t.Errorf("expected %q to be copied, got %v", rel, err) + } + } + + // Files that MUST NOT exist (excluded child skill dirs) + mustNotExist := []string{ + filepath.Join("skills", "child-a"), + filepath.Join("skills", "child-b"), + filepath.Join("skills", "child-a", "SKILL.md"), + } + for _, rel := range mustNotExist { + if _, err := os.Stat(filepath.Join(dst, rel)); !os.IsNotExist(err) { + t.Errorf("expected %q to be excluded, but it exists (err=%v)", rel, err) + } + } +} + +func TestCopyDirExcluding_NilMapEqualsCopyDir(t *testing.T) { + src := t.TempDir() + dst := filepath.Join(t.TempDir(), "dest") + + os.WriteFile(filepath.Join(src, "a.txt"), []byte("a"), 0644) + os.MkdirAll(filepath.Join(src, "sub"), 0755) + os.WriteFile(filepath.Join(src, "sub", "b.txt"), []byte("b"), 0644) + + if err := copyDirExcluding(src, dst, nil); err != nil { + t.Fatal(err) + } + if _, err := os.Stat(filepath.Join(dst, "sub", "b.txt")); err != nil { + t.Errorf("expected sub/b.txt to be copied, got %v", err) + } +} + func TestCopyDir_PreservesPermissions(t *testing.T) { src := t.TempDir() dst := filepath.Join(t.TempDir(), "dest") diff --git a/internal/install/install.go b/internal/install/install.go index b368297d..e210a9f2 100644 --- a/internal/install/install.go +++ b/internal/install/install.go @@ -10,12 +10,14 @@ import ( // InstallOptions configures the install behavior type InstallOptions struct { Name string // Override skill name + Kind string // "skill", "agent", or "" (auto-detect) Force bool // Overwrite existing DryRun bool // Preview only Update bool // Update existing installation Track bool // Install as tracked repository (preserves .git) OnProgress ProgressCallback Skills []string // Select specific skills from multi-skill repo (comma-separated) + AgentNames []string // Select specific agents from repo (comma-separated) Exclude []string // Skills to exclude from installation (comma-separated) All bool // Install all 
discovered skills without prompting Yes bool // Auto-accept all prompts (equivalent to --all for multi-skill repos) @@ -26,8 +28,15 @@ type InstallOptions struct { AuditProjectRoot string // Project root for project-mode audit rule resolution Quiet bool // Suppress per-skill output in InstallFromConfig Branch string // Git branch to clone from (empty = remote default) + SourceDir string // Skills root dir for centralized metadata (set by caller) } +// IsAgentMode returns true if explicitly installing agents. +func (o InstallOptions) IsAgentMode() bool { return o.Kind == "agent" } + +// HasAgentFilter returns true if specific agents were requested via -a flag. +func (o InstallOptions) HasAgentFilter() bool { return len(o.AgentNames) > 0 } + // ShouldInstallAll returns true if all discovered skills should be installed without prompting. func (o InstallOptions) ShouldInstallAll() bool { return o.All || o.Yes } @@ -55,21 +64,46 @@ type SkillInfo struct { Description string // Description from SKILL.md frontmatter (if any) } -// DiscoveryResult contains discovered skills from a repository +// AgentInfo represents a discovered agent (.md file) in a repository +type AgentInfo struct { + Name string // Agent name (filename without .md) + Path string // Relative path from repo root (e.g. "agents/tutor.md") + FileName string // Filename (e.g. "tutor.md") +} + +// DiscoveryResult contains discovered skills and agents from a repository type DiscoveryResult struct { RepoPath string // Temp directory where repo was cloned Skills []SkillInfo // Discovered skills + Agents []AgentInfo // Discovered agents Source *Source // Original source CommitHash string // Source commit hash when available Warnings []string // Non-fatal warnings during discovery } +// HasAgents reports whether the discovery found any agents. +func (d *DiscoveryResult) HasAgents() bool { + return len(d.Agents) > 0 +} + +// HasSkills reports whether the discovery found any skills. 
+func (d *DiscoveryResult) HasSkills() bool { + return len(d.Skills) > 0 +} + +// IsMixed reports whether the discovery found both skills and agents. +func (d *DiscoveryResult) IsMixed() bool { + return d.HasSkills() && d.HasAgents() +} + // TrackedRepoResult reports the outcome of a tracked repo installation type TrackedRepoResult struct { RepoName string // Name of the tracked repo (e.g., "_team-skills") RepoPath string // Full path to the repo SkillCount int // Number of skills discovered Skills []string // Names of discovered skills + AgentCount int // Number of agents discovered + Agents []string // Names of discovered agents Action string // "cloned", "updated", "skipped" Warnings []string AuditThreshold string @@ -187,6 +221,47 @@ func DiscoverFromGitSubdirWithProgress(source *Source, onProgress ProgressCallba return discoverFromGitSubdirWithProgressImpl(source, onProgress) } +// InferTrackedKind determines which resource kind a tracked install should use. +// Pure-agent repositories resolve to "agent". Mixed repositories must specify +// the kind explicitly to avoid ambiguous install roots. 
+func InferTrackedKind(source *Source, explicitKind string) (string, error) { + if !source.IsGit() { + return "", fmt.Errorf("--track requires a git repository source") + } + + if explicitKind == "skill" || explicitKind == "agent" { + return explicitKind, nil + } + + var ( + discovery *DiscoveryResult + err error + ) + if source.HasSubdir() { + discovery, err = DiscoverFromGitSubdir(source) + } else { + discovery, err = DiscoverFromGit(source) + } + if err != nil { + return "", err + } + defer CleanupDiscovery(discovery) + + switch { + case discovery.HasSkills() && discovery.HasAgents(): + return "", fmt.Errorf("tracked install is ambiguous for mixed repositories; pass --kind skill or --kind agent") + case discovery.HasAgents() && !discovery.HasSkills(): + return "agent", nil + default: + return "skill", nil + } +} + +// DiscoverLocal inspects a local directory and discovers skills and agents. +func DiscoverLocal(source *Source) (*DiscoveryResult, error) { + return discoverLocalImpl(source) +} + // CleanupDiscovery removes temporary resources created by discovery. 
func CleanupDiscovery(result *DiscoveryResult) { cleanupDiscoveryImpl(result) diff --git a/internal/install/install_apply.go b/internal/install/install_apply.go index 4d4c449b..5b5adbf7 100644 --- a/internal/install/install_apply.go +++ b/internal/install/install_apply.go @@ -4,6 +4,9 @@ import ( "fmt" "os" "path/filepath" + "strings" + + "skillshare/internal/utils" ) // buildDiscoverySkillSource constructs metadata Source string for a skill @@ -22,7 +25,153 @@ func buildDiscoverySkillSource(source *Source, skillPath string) string { return source.Raw + "/" + skillPath } +func discoverySourceRoot(discovery *DiscoveryResult) string { + if discovery.Source != nil && discovery.Source.Type == SourceTypeLocalPath { + return discovery.RepoPath + } + return filepath.Join(discovery.RepoPath, "repo") +} + +// descendantSkillPaths returns the set of slash-normalized paths (relative to +// the source root of `skill`) of every other discovered skill that lives +// strictly inside `skill`. The returned map is suitable as the excludes +// argument to copyDirExcluding. +// +// For the root skill (Path="."), every other discovered skill counts as a +// descendant. For a non-root skill, only nested sub-skills count. Returns nil +// when there is nothing to exclude so callers hit the fast path in +// copyDirExcluding. +func descendantSkillPaths(discovery *DiscoveryResult, skill SkillInfo) map[string]bool { + if discovery == nil || len(discovery.Skills) <= 1 { + return nil + } + current := filepath.ToSlash(skill.Path) + var prefix string + if current != "." { + prefix = current + "/" + } + excludes := make(map[string]bool) + for _, other := range discovery.Skills { + otherPath := filepath.ToSlash(other.Path) + if otherPath == current { + continue + } + if current == "." { + // Root skill: every other skill path is a strict descendant. + if otherPath != "." { + excludes[otherPath] = true + } + continue + } + // Non-root: only paths strictly under current count. 
+ if rel, ok := strings.CutPrefix(otherPath, prefix); ok { + // rel is the path relative to `skill` (which is how + // copyDirExcluding walks `srcPath`). + excludes[rel] = true + } + } + if len(excludes) == 0 { + return nil + } + return excludes +} + +func installAgentRelativePath(agent AgentInfo) string { + return strings.TrimPrefix(filepath.ToSlash(agent.Path), "agents/") +} + +func resolveDiscoveredAgentSourcePath(discovery *DiscoveryResult, agent AgentInfo) string { + sourceRoot := discoverySourceRoot(discovery) + if discovery.Source.HasSubdir() { + return filepath.Join(sourceRoot, discovery.Source.Subdir, agent.Path) + } + return filepath.Join(sourceRoot, agent.Path) +} + +func buildDiscoveredAgentSource(discovery *DiscoveryResult, agent AgentInfo) *Source { + return &Source{ + Type: discovery.Source.Type, + Raw: buildDiscoverySkillSource(discovery.Source, agent.Path), + CloneURL: discovery.Source.CloneURL, + Subdir: agent.Path, + Name: agent.Name, + Branch: discovery.Source.Branch, + } +} + +func writeDiscoveredAgentMetadata(discovery *DiscoveryResult, agent AgentInfo, destFile, sourceDir string) error { + source := buildDiscoveredAgentSource(discovery, agent) + meta := NewMetaFromSource(source) + meta.Kind = "agent" + if discovery.CommitHash != "" { + meta.Version = discovery.CommitHash + } + if hash, hashErr := computeSingleFileHash(destFile); hashErr == nil { + meta.FileHashes = map[string]string{filepath.Base(destFile): hash} + } + return WriteMetaToStore(sourceDir, destFile, meta) +} + +func installAgentFromDiscoveryInternal(discovery *DiscoveryResult, agent AgentInfo, destFile string, opts InstallOptions, writeMeta bool) (*InstallResult, error) { + result := &InstallResult{ + SkillName: agent.Name, + Source: buildDiscoverySkillSource(discovery.Source, agent.Path), + SkillPath: destFile, + } + + if opts.DryRun { + result.Action = "would install" + return result, nil + } + + if err := os.MkdirAll(filepath.Dir(destFile), 0755); err != nil { + return nil, 
fmt.Errorf("failed to create agents directory: %w", err) + } + + if _, err := os.Stat(destFile); err == nil && !opts.Force { + result.Action = "skipped" + result.Warnings = append(result.Warnings, "agent already exists (use --force to overwrite)") + return result, nil + } + + data, err := os.ReadFile(resolveDiscoveredAgentSourcePath(discovery, agent)) + if err != nil { + return nil, fmt.Errorf("failed to read agent %s: %w", agent.FileName, err) + } + + if err := os.WriteFile(destFile, data, 0644); err != nil { + return nil, fmt.Errorf("failed to write agent %s: %w", agent.FileName, err) + } + + if err := auditInstalledAgent(destFile, result, opts); err != nil { + return nil, err + } + + if writeMeta { + if err := writeDiscoveredAgentMetadata(discovery, agent, destFile, opts.SourceDir); err != nil { + result.Warnings = append(result.Warnings, fmt.Sprintf("failed to write metadata: %v", err)) + } + } + + result.Action = "installed" + return result, nil +} + func installImpl(source *Source, destPath string, opts InstallOptions) (*InstallResult, error) { + // Derive SourceDir from destPath if not set by caller. + // destPath = sourceDir[/into]/skillName, so strip Into + skillName. 
+	if opts.SourceDir == "" {
+		dir := filepath.Dir(destPath)
+		if opts.Into != "" {
+			// Strip the --into prefix from the parent
+			dir = filepath.Dir(dir)
+			for i := strings.Count(opts.Into, "/"); i > 0; i-- {
+				dir = filepath.Dir(dir)
+			}
+		}
+		opts.SourceDir = dir
+	}
+
 	result := &InstallResult{
 		SkillName: source.Name,
 		Source:    source.Raw,
@@ -99,7 +248,7 @@ func installFromLocal(source *Source, destPath string, result *InstallResult, op
 	if hashes, hashErr := ComputeFileHashes(destPath); hashErr == nil {
 		meta.FileHashes = hashes
 	}
-	if err := WriteMeta(destPath, meta); err != nil {
+	if err := WriteMetaToStore(opts.SourceDir, destPath, meta); err != nil {
 		result.Warnings = append(result.Warnings, fmt.Sprintf("failed to write metadata: %v", err))
 	}
@@ -141,7 +290,7 @@ func installFromGit(source *Source, destPath string, result *InstallResult, opts
 	if hashes, hashErr := ComputeFileHashes(destPath); hashErr == nil {
 		meta.FileHashes = hashes
 	}
-	if err := WriteMeta(destPath, meta); err != nil {
+	if err := WriteMetaToStore(opts.SourceDir, destPath, meta); err != nil {
 		result.Warnings = append(result.Warnings, fmt.Sprintf("failed to write metadata: %v", err))
 	}
@@ -211,20 +360,27 @@ func installFromDiscoveryImpl(discovery *DiscoveryResult, skill SkillInfo, destP
 	}

 	// Determine source path in temp repo
+	sourceRoot := discoverySourceRoot(discovery)
 	var srcPath string
 	if discovery.Source.HasSubdir() {
 		// Subdir discovery: paths are relative to the subdir
 		if skill.Path == "." {
-			srcPath = filepath.Join(discovery.RepoPath, "repo", discovery.Source.Subdir)
+			srcPath = filepath.Join(sourceRoot, discovery.Source.Subdir)
 		} else {
-			srcPath = filepath.Join(discovery.RepoPath, "repo", discovery.Source.Subdir, skill.Path)
+			srcPath = filepath.Join(sourceRoot, discovery.Source.Subdir, skill.Path)
 		}
 	} else {
 		// Whole-repo discovery: paths are relative to repo root
-		srcPath = filepath.Join(discovery.RepoPath, "repo", skill.Path)
+		srcPath = filepath.Join(sourceRoot, skill.Path)
 	}

-	if err := copyDir(srcPath, destPath); err != nil {
+	// In an orchestrator-style repo (root SKILL.md + children), copying the
+	// root would otherwise drag every child skill directory into the root
+	// install. Exclude any other discovered skill whose path is a strict
+	// descendant of the skill currently being installed (issue #124).
+	excludes := descendantSkillPaths(discovery, skill)
+
+	if err := copyDirExcluding(srcPath, destPath, excludes); err != nil {
 		return nil, fmt.Errorf("failed to copy skill: %w", err)
 	}
@@ -245,16 +401,16 @@ func installFromDiscoveryImpl(discovery *DiscoveryResult, skill SkillInfo, destP
 	meta := NewMetaFromSource(source)
 	if discovery.CommitHash != "" {
 		meta.Version = discovery.CommitHash
-	} else if hash, err := getGitCommit(filepath.Join(discovery.RepoPath, "repo")); err == nil {
+	} else if hash, err := getGitCommit(sourceRoot); err == nil {
 		meta.Version = hash
 	}
 	if fullSubdir != "" {
-		meta.TreeHash = getSubdirTreeHash(filepath.Join(discovery.RepoPath, "repo"), fullSubdir)
+		meta.TreeHash = getSubdirTreeHash(sourceRoot, fullSubdir)
 	}
 	if hashes, hashErr := ComputeFileHashes(destPath); hashErr == nil {
 		meta.FileHashes = hashes
 	}
-	if err := WriteMeta(destPath, meta); err != nil {
+	if err := WriteMetaToStore(opts.SourceDir, destPath, meta); err != nil {
 		result.Warnings = append(result.Warnings, fmt.Sprintf("failed to write metadata: %v", err))
 	}
@@ -363,7 +519,7 @@ func installFromGitSubdir(source *Source, destPath string, result *InstallResult
 	if hashes, hashErr := ComputeFileHashes(destPath); hashErr == nil {
 		meta.FileHashes = hashes
 	}
-	if err := WriteMeta(destPath, meta); err != nil {
+	if err := WriteMetaToStore(opts.SourceDir, destPath, meta); err != nil {
 		result.Warnings = append(result.Warnings, fmt.Sprintf("failed to write metadata: %v", err))
 	}
@@ -381,6 +537,73 @@ func checkSkillFile(skillPath string, result *InstallResult) {
 	}
 }

+// InstallAgentFromDiscovery installs a single agent (.md file) from a discovery result.
+// Unlike skill install (directory copy), agent install copies a single file.
+func InstallAgentFromDiscovery(discovery *DiscoveryResult, agent AgentInfo, destDir string, opts InstallOptions) (*InstallResult, error) {
+	destFile := filepath.Join(destDir, filepath.FromSlash(installAgentRelativePath(agent)))
+	return installAgentFromDiscoveryInternal(discovery, agent, destFile, opts, true)
+}
+
+func UpdateAgentFromDiscovery(discovery *DiscoveryResult, agent AgentInfo, destDir string, opts InstallOptions) (*InstallResult, error) {
+	relPath := installAgentRelativePath(agent)
+	destFile := filepath.Join(destDir, filepath.FromSlash(relPath))
+	result := &InstallResult{
+		SkillName: agent.Name,
+		Source:    buildDiscoverySkillSource(discovery.Source, agent.Path),
+		SkillPath: destFile,
+	}
+
+	if opts.DryRun {
+		result.Action = "would reinstall from source"
+		return result, nil
+	}
+
+	tempDir, err := os.MkdirTemp("", "skillshare-agent-update-*")
+	if err != nil {
+		return nil, fmt.Errorf("failed to create temp directory: %w", err)
+	}
+	defer os.RemoveAll(tempDir)
+
+	tempFile := filepath.Join(tempDir, filepath.FromSlash(relPath))
+	innerOpts := opts
+	innerOpts.DryRun = false
+	innerOpts.Update = false
+	innerOpts.SourceDir = tempDir
+
+	innerResult, err := installAgentFromDiscoveryInternal(discovery, agent, tempFile, innerOpts, false)
+	if err != nil {
+		return nil, err
+	}
+
+	if err := os.MkdirAll(filepath.Dir(destFile), 0755); err != nil {
+		return nil, fmt.Errorf("failed to create agents directory: %w", err)
+	}
+	// os.Rename is atomic on POSIX and overwrites the destination. If it
+	// fails (e.g. cross-device EXDEV on exotic setups), fall back to copy.
+	if err := os.Rename(tempFile, destFile); err != nil {
+		data, readErr := os.ReadFile(tempFile)
+		if readErr != nil {
+			return nil, fmt.Errorf("failed to read staged agent: %w", readErr)
+		}
+		if writeErr := os.WriteFile(destFile, data, 0644); writeErr != nil {
+			return nil, fmt.Errorf("failed to move updated agent: %w", writeErr)
+		}
+	}
+
+	if err := writeDiscoveredAgentMetadata(discovery, agent, destFile, opts.SourceDir); err != nil {
+		innerResult.Warnings = append(innerResult.Warnings, fmt.Sprintf("failed to write metadata: %v", err))
+	}
+
+	innerResult.SkillPath = destFile
+	innerResult.Action = "updated"
+	return innerResult, nil
+}
+
+// computeSingleFileHash computes the sha256 hash for a single file.
+func computeSingleFileHash(filePath string) (string, error) {
+	return utils.FileHashFormatted(filePath)
+}
+
 // auditInstalledSkill scans the installed skill for security threats.
 // It blocks installation when findings are at or above configured threshold
 // unless force is enabled.
diff --git a/internal/install/install_audit.go b/internal/install/install_audit.go
index e497fa65..71fabb09 100644
--- a/internal/install/install_audit.go
+++ b/internal/install/install_audit.go
@@ -2,12 +2,23 @@ package install

 import (
 	"fmt"
+	"os"
+	"path/filepath"
 	"strings"

 	"skillshare/internal/audit"
 )

-func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOptions) error {
+// auditInstalledResource runs the audit gate shared by skill and agent installs.
+// scan performs the actual scan; cleanupOnBlock removes the installed artifact
+// when findings at/above threshold fire and --force is not set.
+func auditInstalledResource(
+	destPath string,
+	result *InstallResult,
+	opts InstallOptions,
+	scan func() (*audit.Result, error),
+	cleanupOnBlock func() error,
+) error {
 	if opts.SkipAudit {
 		result.AuditSkipped = true
 		result.AuditThreshold = opts.AuditThreshold
@@ -21,12 +32,7 @@ func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOpt
 	}
 	result.AuditThreshold = threshold

-	var scanResult *audit.Result
-	if opts.AuditProjectRoot != "" {
-		scanResult, err = audit.ScanSkillForProject(destPath, opts.AuditProjectRoot)
-	} else {
-		scanResult, err = audit.ScanSkill(destPath)
-	}
+	scanResult, err := scan()
 	if err != nil {
 		// Non-fatal: warn but don't block
 		result.Warnings = append(result.Warnings, fmt.Sprintf("audit scan error: %v", err))
@@ -44,7 +50,6 @@ func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOpt
 		return nil
 	}

-	// Build warning messages for all findings (include snippet for context)
 	for _, f := range scanResult.Findings {
 		msg := fmt.Sprintf("audit %s: %s (%s:%d)", f.Severity, f.Message, f.File, f.Line)
 		if f.Snippet != "" {
@@ -53,16 +58,15 @@ func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOpt
 		result.Warnings = append(result.Warnings, msg)
 	}

-	// Findings at or above threshold block installation unless --force.
 	if scanResult.IsBlocked && !opts.Force {
 		details := blockedFindingDetails(scanResult.Findings, threshold)
-		if removeErr := removeAll(destPath); removeErr != nil {
+		if cleanupErr := cleanupOnBlock(); cleanupErr != nil {
 			return fmt.Errorf(
 				"security audit failed — findings at/above %s detected:\n%s\n\nAutomatic cleanup failed for %s: %v\nManual removal is required: %w",
 				threshold,
 				strings.Join(details, "\n"),
 				destPath,
-				removeErr,
+				cleanupErr,
 				audit.ErrBlocked,
 			)
 		}
@@ -85,6 +89,50 @@ func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOpt
 	return nil
 }

+func auditInstalledSkill(destPath string, result *InstallResult, opts InstallOptions) error {
+	scan := func() (*audit.Result, error) {
+		if opts.AuditProjectRoot != "" {
+			return audit.ScanSkillForProject(destPath, opts.AuditProjectRoot)
+		}
+		return audit.ScanSkill(destPath)
+	}
+	cleanup := func() error { return removeAll(destPath) }
+	return auditInstalledResource(destPath, result, opts, scan, cleanup)
+}
+
+func auditInstalledAgent(destFile string, result *InstallResult, opts InstallOptions) error {
+	scan := func() (*audit.Result, error) {
+		if opts.AuditProjectRoot != "" {
+			return audit.ScanFileForProject(destFile, opts.AuditProjectRoot)
+		}
+		return audit.ScanFile(destFile)
+	}
+	// Agents are single .md files. After removing the file, walk up any
+	// newly-empty parent directories (e.g. into-subdir) so a blocked install
+	// leaves nothing behind.
+	cleanup := func() error {
+		if err := removeAll(destFile); err != nil {
+			return err
+		}
+		cleanEmptyInstallParents(destFile, opts.SourceDir)
+		return nil
+	}
+	return auditInstalledResource(destFile, result, opts, scan, cleanup)
+}
+
+func cleanEmptyInstallParents(path, stopAt string) {
+	stopAt = filepath.Clean(stopAt)
+	dir := filepath.Dir(filepath.Clean(path))
+	for dir != stopAt && dir != "." && dir != "/" {
+		entries, err := os.ReadDir(dir)
+		if err != nil || len(entries) > 0 {
+			break
+		}
+		_ = os.Remove(dir)
+		dir = filepath.Dir(dir)
+	}
+}
+
 // auditTrackedRepo scans an entire tracked repo directory for security threats.
 // It blocks installation when findings are at or above configured threshold
 // unless force is enabled. On block, the repo directory is removed.
diff --git a/internal/install/install_discovery.go b/internal/install/install_discovery.go
index 7cf7cee7..932125ec 100644
--- a/internal/install/install_discovery.go
+++ b/internal/install/install_discovery.go
@@ -43,11 +52,20 @@ func discoverFromGitWithProgressImpl(source *Source, onProgress ProgressCallback
 		}
 	}

+	// Discover agents (agents/ dir or pure agent repo fallback)
+	agents := discoverAgents(repoPath, len(skills) > 0)
+	skills, agents, err = constrainDiscoveryToExplicitSkill(source, skills, agents)
+	if err != nil {
+		_ = os.RemoveAll(tempDir)
+		return nil, err
+	}
+
 	commitHash, _ := getGitCommit(repoPath)
 	return &DiscoveryResult{
 		RepoPath:   tempDir,
 		Skills:     skills,
+		Agents:     agents,
 		Source:     source,
 		CommitHash: commitHash,
 	}, nil
@@ -58,6 +67,44 @@ func discoverFromGitImpl(source *Source) (*DiscoveryResult, error) {
 	return discoverFromGitWithProgressImpl(source, nil)
 }

+func discoverLocalImpl(source *Source) (*DiscoveryResult, error) {
+	if source == nil {
+		return nil, fmt.Errorf("source is required")
+	}
+	if source.Type != SourceTypeLocalPath {
+		return nil, fmt.Errorf("source is not a local path")
+	}
+
+	info, err := os.Stat(source.Path)
+	if err != nil {
+		return nil, fmt.Errorf("cannot access source path: %w", err)
+	}
+	if !info.IsDir() {
+		return nil, fmt.Errorf("source path is not a directory: %s", source.Path)
+	}
+
+	skills := discoverSkills(source.Path, true)
+	for i := range skills {
+		if skills[i].Path == "." {
+			skills[i].Name = source.Name
+			break
+		}
+	}
+
+	agents := discoverAgents(source.Path, len(skills) > 0)
+	skills, agents, err = constrainDiscoveryToExplicitSkill(source, skills, agents)
+	if err != nil {
+		return nil, err
+	}
+
+	return &DiscoveryResult{
+		RepoPath: source.Path,
+		Skills:   skills,
+		Agents:   agents,
+		Source:   source,
+	}, nil
+}
+
 // resolveSubdir resolves a subdirectory path within a cloned repo.
 // It first checks for an exact match. If not found, it scans the repo for
 // SKILL.md files and looks for a skill whose name matches filepath.Base(subdir).
@@ -98,6 +145,20 @@ func resolveSubdir(repoPath, subdir string) (string, error) {
 	}
 }

+func constrainDiscoveryToExplicitSkill(source *Source, skills []SkillInfo, agents []AgentInfo) ([]SkillInfo, []AgentInfo, error) {
+	if source == nil || !source.TargetsExplicitSkill() {
+		return skills, agents, nil
+	}
+
+	for _, skill := range skills {
+		if skill.Path == "." {
+			return []SkillInfo{skill}, nil, nil
+		}
+	}
+
+	return nil, nil, fmt.Errorf("explicit skill target does not resolve to a root SKILL.md: %s", source.Raw)
+}
+
 // discoverSkills finds directories containing SKILL.md
 // If includeRoot is true, root-level SKILL.md is also included (with Path=".")
 func discoverSkills(repoPath string, includeRoot bool) []SkillInfo {
@@ -162,6 +223,87 @@ func discoverSkills(repoPath string, includeRoot bool) []SkillInfo {
 	return skills
 }

+// discoverAgents finds .md files in an agents/ convention directory.
+// Also detects "pure agent repos" — repos with no SKILL.md and no agents/ dir
+// but with .md files at root (per D5 rule 4).
+func discoverAgents(repoPath string, hasSkills bool) []AgentInfo {
+	var agents []AgentInfo
+
+	// Rule 2: Check agents/ convention directory
+	agentsDir := filepath.Join(repoPath, "agents")
+	if info, err := os.Stat(agentsDir); err == nil && info.IsDir() {
+		agents = append(agents, scanAgentDir(repoPath, agentsDir)...)
+		return agents
+	}
+
+	// Rule 4: Pure agent repo fallback — no skills, no agents/ dir, root has .md files
+	if !hasSkills {
+		agents = append(agents, scanAgentDir(repoPath, repoPath)...)
+	}
+
+	return agents
+}
+
+// scanAgentDir scans a directory for .md agent files, excluding conventional files.
+func scanAgentDir(repoRoot, dir string) []AgentInfo {
+	var agents []AgentInfo
+
+	filepath.Walk(dir, func(path string, info os.FileInfo, err error) error {
+		if err != nil {
+			return nil
+		}
+
+		if info.IsDir() {
+			name := info.Name()
+			if name == ".git" || TargetDotDirs[name] {
+				return filepath.SkipDir
+			}
+			return nil
+		}
+
+		if !strings.HasSuffix(strings.ToLower(info.Name()), ".md") {
+			return nil
+		}
+
+		// Skip conventional excludes
+		if conventionalAgentExcludes[info.Name()] {
+			return nil
+		}
+
+		// Skip hidden files
+		if strings.HasPrefix(info.Name(), ".") {
+			return nil
+		}
+
+		relPath, relErr := filepath.Rel(repoRoot, path)
+		if relErr != nil {
+			return nil
+		}
+		relPath = strings.ReplaceAll(relPath, "\\", "/")
+
+		name := strings.TrimSuffix(info.Name(), ".md")
+
+		agents = append(agents, AgentInfo{
+			Name:     name,
+			Path:     relPath,
+			FileName: info.Name(),
+		})
+
+		return nil
+	})
+
+	return agents
+}
+
+var conventionalAgentExcludes = map[string]bool{
+	"README.md":    true,
+	"CHANGELOG.md": true,
+	"LICENSE.md":   true,
+	"HISTORY.md":   true,
+	"SECURITY.md":  true,
+	"SKILL.md":     true,
+}
+
 // DiscoverFromGitSubdir clones a repo and discovers skills within a subdirectory
 // Unlike DiscoverFromGit, this includes root-level SKILL.md of the subdir
@@ -194,9 +336,16 @@ func discoverFromGitSubdirWithProgressImpl(source *Source, onProgress ProgressCa
 		commitHash = hash
 	}
 	skills := discoverSkills(subdirPath, true)
+	agents := discoverAgents(subdirPath, len(skills) > 0)
+	skills, agents, err = constrainDiscoveryToExplicitSkill(source, skills, agents)
+	if err != nil {
+		_ = os.RemoveAll(tempDir)
+		return nil, err
+	}
 	return &DiscoveryResult{
 		RepoPath:   tempDir,
 		Skills:     skills,
+		Agents:     agents,
 		Source:     source,
 		CommitHash: commitHash,
 		Warnings:   warnings,
@@ -221,9 +370,16 @@ func discoverFromGitSubdirWithProgressImpl(source *Source, onProgress ProgressCa
 	if dlErr == nil {
 		commitHash = hash
 		skills := discoverSkills(subdirPath, true)
+		agents := discoverAgents(subdirPath, len(skills) > 0)
+		skills, agents, err = constrainDiscoveryToExplicitSkill(source, skills, agents)
+		if err != nil {
+			_ = os.RemoveAll(tempDir)
+			return nil, err
+		}
 		return &DiscoveryResult{
 			RepoPath:   tempDir,
 			Skills:     skills,
+			Agents:     agents,
 			Source:     source,
 			CommitHash: commitHash,
 		}, nil
@@ -258,9 +414,16 @@ func discoverFromGitSubdirWithProgressImpl(source *Source, onProgress ProgressCa
 	}

 	skills := discoverSkills(subdirPath, true)
+	agents := discoverAgents(subdirPath, len(skills) > 0)
+	skills, agents, err = constrainDiscoveryToExplicitSkill(source, skills, agents)
+	if err != nil {
+		_ = os.RemoveAll(tempDir)
+		return nil, err
+	}
 	return &DiscoveryResult{
 		RepoPath:   tempDir,
 		Skills:     skills,
+		Agents:     agents,
 		Source:     source,
 		CommitHash: commitHash,
 		Warnings:   warnings,
diff --git a/internal/install/install_local_test.go b/internal/install/install_local_test.go
index fb9594b1..131f1d4a 100644
--- a/internal/install/install_local_test.go
+++ b/internal/install/install_local_test.go
@@ -46,8 +46,9 @@ func TestInstall_LocalPath_Basic(t *testing.T) {
 		t.Error("expected SKILL.md to exist in destination")
 	}

-	// Verify metadata was written
-	if !HasMeta(destDir) {
+	// Verify metadata was written to centralized store
+	store, _ := LoadMetadata(filepath.Dir(destDir))
+	if !store.Has(filepath.Base(destDir)) {
 		t.Error("expected metadata to be written")
 	}
 }
@@ -162,17 +163,15 @@ func TestInstall_LocalPath_WritesFileHashes(t *testing.T) {
 		t.Fatal(err)
 	}

-	meta, err := ReadMeta(destDir)
-	if err != nil {
-		t.Fatal(err)
-	}
-	if meta == nil {
+	store, _ := LoadMetadata(filepath.Dir(destDir))
+	entry := store.Get(filepath.Base(destDir))
+	if entry == nil {
 		t.Fatal("expected meta to exist")
 	}
-	if len(meta.FileHashes) < 2 {
-		t.Errorf("expected at least 2 file hashes (SKILL.md + helpers.sh), got %d", len(meta.FileHashes))
+	if len(entry.FileHashes) < 2 {
+		t.Errorf("expected at least 2 file hashes (SKILL.md + helpers.sh), got %d", len(entry.FileHashes))
 	}
-	for _, hash := range meta.FileHashes {
+	for _, hash := range entry.FileHashes {
 		if len(hash) < 7 || hash[:7] != "sha256:" {
 			t.Errorf("expected sha256: prefixed hash, got %q", hash)
 		}
@@ -236,6 +235,52 @@ func TestInstall_LocalPath_WithAudit(t *testing.T) {
 	}
 }

+func TestInstallAgentFromDiscovery_HighFindingBlocked(t *testing.T) {
+	tmp := t.TempDir()
+	repoDir := filepath.Join(tmp, "repo")
+	if err := os.MkdirAll(repoDir, 0o755); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer\nsudo apt-get install -y jq\n"), 0o644); err != nil {
+		t.Fatal(err)
+	}
+
+	discovery := &DiscoveryResult{
+		RepoPath: repoDir,
+		Source: &Source{
+			Type: SourceTypeLocalPath,
+			Raw:  repoDir,
+			Path: repoDir,
+		},
+	}
+
+	destRoot := filepath.Join(tmp, "agents")
+	_, err := InstallAgentFromDiscovery(discovery, AgentInfo{
+		Name:     "reviewer",
+		Path:     "reviewer.md",
+		FileName: "reviewer.md",
+	}, destRoot, InstallOptions{
+		SourceDir:      destRoot,
+		AuditThreshold: audit.SeverityHigh,
+	})
+	if err == nil {
+		t.Fatal("expected audit block for agent install")
+	}
+	if !errors.Is(err, audit.ErrBlocked) {
+		t.Fatalf("expected error to wrap audit.ErrBlocked, got: %v", err)
+	}
+	if _, statErr := os.Stat(filepath.Join(destRoot, "reviewer.md")); !os.IsNotExist(statErr) {
+		t.Fatalf("expected blocked agent file to be removed, stat err=%v", statErr)
+	}
+
+	store, loadErr := LoadMetadata(destRoot)
+	if loadErr == nil {
+		if entry := store.Get("reviewer"); entry != nil {
+			t.Fatalf("expected no metadata for blocked agent install, got %+v", entry)
+		}
+	}
+}
+
 func TestInstall_LocalPath_HighFinding_BelowCriticalThresholdWarns(t *testing.T) {
 	tmp := t.TempDir()
 	srcDir := createLocalSkillSource(t, tmp, "high-finding")
@@ -452,3 +497,107 @@ func TestCheckSkillFile_Missing(t *testing.T) {
 		t.Errorf("expected 1 warning for missing SKILL.md, got %d", len(result.Warnings))
 	}
 }
+
+// TestInstallFromDiscovery_Orchestrator_RootExcludesChildren verifies issue
+// #124: installing a root skill of an orchestrator repo must not drag child
+// skill directories into the root install. Each child must install to its own
+// directory as an independent skill.
+func TestInstallFromDiscovery_Orchestrator_RootExcludesChildren(t *testing.T) {
+	tmp := t.TempDir()
+	repoDir := filepath.Join(tmp, "orchestrator")
+	os.MkdirAll(repoDir, 0755)
+
+	// Layout:
+	//   SKILL.md                  (root skill)
+	//   README.md
+	//   src/helper.go             (root-owned asset)
+	//   skills/child-a/SKILL.md   (child skill)
+	//   skills/child-b/SKILL.md   (child skill)
+	os.WriteFile(filepath.Join(repoDir, "SKILL.md"),
+		[]byte("---\nname: orchestrator\n---\n# Root"), 0644)
+	os.WriteFile(filepath.Join(repoDir, "README.md"), []byte("readme"), 0644)
+	os.MkdirAll(filepath.Join(repoDir, "src"), 0755)
+	os.WriteFile(filepath.Join(repoDir, "src", "helper.go"), []byte("package main"), 0644)
+	os.MkdirAll(filepath.Join(repoDir, "skills", "child-a"), 0755)
+	os.WriteFile(filepath.Join(repoDir, "skills", "child-a", "SKILL.md"),
+		[]byte("---\nname: child-a\n---\n# A"), 0644)
+	os.MkdirAll(filepath.Join(repoDir, "skills", "child-b"), 0755)
+	os.WriteFile(filepath.Join(repoDir, "skills", "child-b", "SKILL.md"),
+		[]byte("---\nname: child-b\n---\n# B"), 0644)
+
+	source := &Source{
+		Type: SourceTypeLocalPath,
+		Raw:  repoDir,
+		Path: repoDir,
+		Name: "orchestrator",
+	}
+
+	discovery, err := DiscoverLocal(source)
+	if err != nil {
+		t.Fatalf("DiscoverLocal failed: %v", err)
+	}
+	if len(discovery.Skills) != 3 {
+		t.Fatalf("expected 3 discovered skills (root + 2 children), got %d: %+v",
+			len(discovery.Skills), discovery.Skills)
+	}
+
+	// Locate root and children in the discovery result.
+	var root *SkillInfo
+	var children []SkillInfo
+	for i := range discovery.Skills {
+		if discovery.Skills[i].Path == "." {
+			root = &discovery.Skills[i]
+		} else {
+			children = append(children, discovery.Skills[i])
+		}
+	}
+	if root == nil {
+		t.Fatal("expected root skill with Path=\".\"")
+	}
+	if len(children) != 2 {
+		t.Fatalf("expected 2 children, got %d", len(children))
+	}
+
+	// Install the root skill first.
+	rootDest := filepath.Join(tmp, "dest", "orchestrator")
+	if _, err := InstallFromDiscovery(discovery, *root, rootDest,
+		InstallOptions{SourceDir: filepath.Join(tmp, "dest"), SkipAudit: true}); err != nil {
+		t.Fatalf("root install failed: %v", err)
+	}
+
+	// Root dest MUST contain its own files and non-skill subdirs.
+	for _, rel := range []string{
+		"SKILL.md",
+		"README.md",
+		filepath.Join("src", "helper.go"),
+	} {
+		if _, err := os.Stat(filepath.Join(rootDest, rel)); err != nil {
+			t.Errorf("expected %q in root install, got %v", rel, err)
+		}
+	}
+
+	// Root dest MUST NOT contain child skill directories or their SKILL.md.
+	for _, rel := range []string{
+		filepath.Join("skills", "child-a"),
+		filepath.Join("skills", "child-b"),
+		filepath.Join("skills", "child-a", "SKILL.md"),
+		filepath.Join("skills", "child-b", "SKILL.md"),
+	} {
+		if _, err := os.Stat(filepath.Join(rootDest, rel)); !os.IsNotExist(err) {
+			t.Errorf("expected %q to be excluded from root install, but it exists (err=%v)",
+				rel, err)
+		}
+	}
+
+	// Install each child as an independent skill under the root dir.
+	for _, child := range children {
+		childDest := filepath.Join(rootDest, child.Name)
+		if _, err := InstallFromDiscovery(discovery, child, childDest,
+			InstallOptions{SourceDir: filepath.Join(tmp, "dest"), SkipAudit: true}); err != nil {
+			t.Fatalf("child %q install failed: %v", child.Name, err)
+		}
+		if _, err := os.Stat(filepath.Join(childDest, "SKILL.md")); err != nil {
+			t.Errorf("expected child %q SKILL.md at %q, got %v", child.Name, childDest, err)
+		}
+	}
+}
diff --git a/internal/install/install_queries.go b/internal/install/install_queries.go
index 9980ab8b..102e3d86 100644
--- a/internal/install/install_queries.go
+++ b/internal/install/install_queries.go
@@ -10,41 +10,22 @@ import (
 )

 func getUpdatableSkillsImpl(sourceDir string) ([]string, error) {
-	var skills []string
+	store, err := LoadMetadata(sourceDir)
+	if err != nil {
+		return nil, err
+	}

-	walkRoot := utils.ResolveSymlink(sourceDir)
-	err := filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error {
-		if err != nil {
-			return nil
-		}
-		if path == walkRoot {
-			return nil
-		}
-		// Skip .git directories
-		if info.IsDir() && info.Name() == ".git" {
-			return filepath.SkipDir
-		}
-		// Skip tracked repo directories (start with _)
-		if info.IsDir() && len(info.Name()) > 0 && info.Name()[0] == '_' {
-			return filepath.SkipDir
+	var skills []string
+	for _, name := range store.List() {
+		entry := store.Get(name)
+		if entry == nil || entry.Source == "" {
+			continue
 		}
-		// Look for metadata files
-		if !info.IsDir() && info.Name() == MetaFileName {
-			skillDir := filepath.Dir(path)
-			relPath, relErr := filepath.Rel(walkRoot, skillDir)
-			if relErr != nil || relPath == "." {
-				return nil
-			}
-			meta, metaErr := ReadMeta(skillDir)
-			if metaErr != nil || meta == nil || meta.Source == "" {
-				return nil
-			}
-			skills = append(skills, relPath)
+		// Skip tracked repos (they are handled separately)
+		if entry.Tracked {
+			continue
 		}
-		return nil
-	})
-	if err != nil {
-		return nil, err
+		skills = append(skills, KeyToRelPath(name, entry))
 	}
 	return skills, nil
 }
@@ -56,35 +37,22 @@ func FindRepoInstalls(sourceDir, cloneURL string) []string {
 	if cloneURL == "" {
 		return nil
 	}
-	var matches []string
-	walkRoot := utils.ResolveSymlink(sourceDir)
-	filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error {
-		if err != nil || path == walkRoot {
-			return nil
-		}
-		if info.IsDir() && info.Name() == ".git" {
-			return filepath.SkipDir
-		}
-		if info.IsDir() && len(info.Name()) > 0 && info.Name()[0] == '_' {
-			return filepath.SkipDir
+	store, err := LoadMetadata(sourceDir)
+	if err != nil {
+		return nil
+	}
+
+	var matches []string
+	for _, name := range store.List() {
+		entry := store.Get(name)
+		if entry == nil || entry.Tracked {
+			continue
 		}
-		if !info.IsDir() && info.Name() == MetaFileName {
-			skillDir := filepath.Dir(path)
-			relPath, relErr := filepath.Rel(walkRoot, skillDir)
-			if relErr != nil || relPath == "." {
-				return nil
-			}
-			meta, metaErr := ReadMeta(skillDir)
-			if metaErr != nil || meta == nil {
-				return nil
-			}
-			if repoURLsMatch(meta.RepoURL, cloneURL) {
-				matches = append(matches, relPath)
-			}
+		if repoURLsMatch(entry.RepoURL, cloneURL) {
+			matches = append(matches, KeyToRelPath(name, entry))
 		}
-		return nil
-	})
+	}
 	return matches
 }
diff --git a/internal/install/install_test.go b/internal/install/install_test.go
index 6f1f922d..8112f6aa 100644
--- a/internal/install/install_test.go
+++ b/internal/install/install_test.go
@@ -77,6 +77,50 @@ func TestDiscoverSkills_RootAndChildren(t *testing.T) {
 	}
 }

+func TestDiscoverLocal_ExplicitSkillTarget_ReturnsOnlyRootSkill(t *testing.T) {
+	repoPath := t.TempDir()
+	if err := os.WriteFile(filepath.Join(repoPath, "SKILL.md"), []byte("---\nname: root\n---\n# Root"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	childDir := filepath.Join(repoPath, "skills", "child-skill")
+	if err := os.MkdirAll(childDir, 0755); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(childDir, "SKILL.md"), []byte("---\nname: child\n---\n# Child"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	agentsDir := filepath.Join(repoPath, "agents")
+	if err := os.MkdirAll(agentsDir, 0755); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# helper"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	discovery, err := DiscoverLocal(&Source{
+		Type:          SourceTypeLocalPath,
+		Raw:           repoPath + "/SKILL.md",
+		Path:          repoPath,
+		Name:          "root",
+		ExplicitSkill: true,
+	})
+	if err != nil {
+		t.Fatalf("DiscoverLocal() error = %v", err)
+	}
+
+	if len(discovery.Skills) != 1 {
+		t.Fatalf("expected explicit skill target to return 1 skill, got %d: %+v", len(discovery.Skills), discovery.Skills)
+	}
+	if discovery.Skills[0].Path != "." {
+		t.Fatalf("expected explicit skill target to keep root skill, got path %q", discovery.Skills[0].Path)
+	}
+	if len(discovery.Agents) != 0 {
+		t.Fatalf("expected explicit skill target to ignore agents, got %d", len(discovery.Agents))
+	}
+}
+
 func TestDiscoverSkills_ChildrenOnly(t *testing.T) {
 	// Setup: orchestrator repo with no root SKILL.md, only children
 	repoPath := t.TempDir()
diff --git a/internal/install/install_tracked.go b/internal/install/install_tracked.go
index a09ce64b..a3f13822 100644
--- a/internal/install/install_tracked.go
+++ b/internal/install/install_tracked.go
@@ -32,9 +32,12 @@ func installTrackedRepoImpl(source *Source, sourceDir string, opts InstallOption
 	destBase := sourceDir
 	if opts.Into != "" {
 		destBase = filepath.Join(sourceDir, opts.Into)
-		if err := os.MkdirAll(destBase, 0755); err != nil {
+	}
+	if err := os.MkdirAll(destBase, 0755); err != nil {
+		if opts.Into != "" {
 			return nil, fmt.Errorf("failed to create --into directory: %w", err)
 		}
+		return nil, fmt.Errorf("failed to create source directory: %w", err)
 	}
 	destPath := filepath.Join(destBase, trackedName)
@@ -87,8 +90,20 @@ func installTrackedRepoImpl(source *Source, sourceDir string, opts InstallOption
 		result.Skills = append(result.Skills, skill.Name)
 	}

-	if len(skills) == 0 {
-		result.Warnings = append(result.Warnings, "no SKILL.md files found in repository")
+	// Also discover agents in the tracked repo
+	agents := discoverAgents(destPath, len(skills) > 0)
+	result.AgentCount = len(agents)
+	if len(agents) > 0 {
+		for _, agent := range agents {
+			result.Agents = append(result.Agents, agent.Name)
+		}
+		result.Warnings = append(result.Warnings, fmt.Sprintf("%d agent(s) found in tracked repo", len(agents)))
+	}
+
+	if len(skills) == 0 && len(agents) == 0 {
+		result.Warnings = append(result.Warnings, "no SKILL.md files or agents found in repository")
+	} else if len(skills) == 0 {
+		// Only agents found — not a warning, just informational
 	}

 	// Security audit on the entire tracked repo
@@ -145,6 +160,16 @@ func updateTrackedRepo(repoPath string, result *TrackedRepoResult, opts InstallO
 		result.Skills = append(result.Skills, skill.Name)
 	}

+	// Also discover agents in the tracked repo
+	agents := discoverAgents(repoPath, len(skills) > 0)
+	result.AgentCount = len(agents)
+	if len(agents) > 0 {
+		for _, agent := range agents {
+			result.Agents = append(result.Agents, agent.Name)
+		}
+		result.Warnings = append(result.Warnings, fmt.Sprintf("%d agent(s) found in tracked repo", len(agents)))
+	}
+
 	result.Action = "updated"
 	return result, nil
 }
diff --git a/internal/install/meta.go b/internal/install/meta.go
index fb7713f9..1cc3bf8c 100644
--- a/internal/install/meta.go
+++ b/internal/install/meta.go
@@ -14,9 +14,10 @@ import (
 )

 // MetaFileName is the name of the skillshare metadata file stored in each skill directory.
 const MetaFileName = ".skillshare-meta.json"

-// SkillMeta contains metadata about an installed skill
+// SkillMeta contains metadata about an installed skill or agent
 type SkillMeta struct {
 	Source      string    `json:"source"`       // Original source input
+	Kind        string    `json:"kind,omitempty"` // "skill" (default/empty) or "agent"
 	Type        string    `json:"type"`         // Source type (github, local, etc.)
 	InstalledAt time.Time `json:"installed_at"` // Installation timestamp
 	RepoURL     string    `json:"repo_url,omitempty"` // Git repo URL (for git sources)
@@ -27,7 +28,16 @@ type SkillMeta struct {
 	Branch string `json:"branch,omitempty"` // Git branch (when non-default)
 }

-// WriteMeta saves metadata to the skill directory
+// EffectiveKind returns "skill" if Kind is empty, otherwise the Kind value.
+func (m *SkillMeta) EffectiveKind() string {
+	if m.Kind == "" {
+		return "skill"
+	}
+	return m.Kind
+}
+
+// Deprecated: WriteMeta writes per-skill sidecar files.
+// New code should use MetadataStore.Set() + MetadataStore.Save() instead.
 func WriteMeta(skillPath string, meta *SkillMeta) error {
 	metaPath := filepath.Join(skillPath, MetaFileName)
@@ -43,7 +53,8 @@ func WriteMeta(skillPath string, meta *SkillMeta) error {
 	return nil
 }

-// ReadMeta loads metadata from the skill directory
+// Deprecated: ReadMeta reads per-skill sidecar files.
+// New code should use LoadMetadata() + MetadataStore.Get() instead.
 func ReadMeta(skillPath string) (*SkillMeta, error) {
 	metaPath := filepath.Join(skillPath, MetaFileName)
@@ -63,7 +74,8 @@ func ReadMeta(skillPath string) (*SkillMeta, error) {
 	return &meta, nil
 }

-// HasMeta checks if a skill directory has metadata
+// Deprecated: HasMeta checks for per-skill sidecar files.
+// New code should use MetadataStore.Has() instead.
 func HasMeta(skillPath string) bool {
 	metaPath := filepath.Join(skillPath, MetaFileName)
 	_, err := os.Stat(metaPath)
@@ -88,6 +100,9 @@ func ComputeFileHashes(skillPath string) (map[string]string, error) {
 		if info.Name() == MetaFileName {
 			return nil
 		}
+		if info.Name() == MetadataFileName {
+			return nil
+		}

 		rel, relErr := filepath.Rel(skillPath, path)
 		if relErr != nil {
@@ -108,7 +123,8 @@ func ComputeFileHashes(skillPath string) (map[string]string, error) {
 	return hashes, nil
 }

-// NewMetaFromSource creates a SkillMeta from a Source
+// Deprecated: NewMetaFromSource creates a SkillMeta from a Source.
+// New code should use MetadataStore.SetFromSource() instead.
 func NewMetaFromSource(source *Source) *SkillMeta {
 	meta := &SkillMeta{
 		Source: source.Raw,
@@ -128,10 +144,8 @@ func NewMetaFromSource(source *Source) *SkillMeta {
 	return meta
 }

-// RefreshMetaHashes recomputes and saves file hashes for a skill that has
-// existing metadata. This is a no-op if the skill has no .skillshare-meta.json
-// or no file_hashes field. Used after programmatic SKILL.md edits (e.g. target
-// changes) to keep audit integrity checks in sync.
+// Deprecated: RefreshMetaHashes recomputes per-skill sidecar hashes.
+// New code should use MetadataStore.RefreshHashes() instead.
 func RefreshMetaHashes(skillPath string) {
 	meta, err := ReadMeta(skillPath)
 	if err != nil || meta == nil || meta.FileHashes == nil {
diff --git a/internal/install/metadata.go b/internal/install/metadata.go
new file mode 100644
index 00000000..7711269a
--- /dev/null
+++ b/internal/install/metadata.go
@@ -0,0 +1,351 @@
+package install
+
+import (
+	"encoding/json"
+	"fmt"
+	"os"
+	"path/filepath"
+	"sort"
+	"strings"
+	"time"
+)
+
+// MetadataFileName is the centralized metadata file stored in each directory.
+const MetadataFileName = ".metadata.json"
+
+// Metadata kind constants for LoadMetadataWithMigration.
+const (
+	MetadataKindSkill = ""      // default kind for skills directories
+	MetadataKindAgent = "agent" // kind for agents directories
+)
+
+// MetadataStore holds all entries for a single directory (skills/ or agents/).
+type MetadataStore struct {
+	Version int                       `json:"version"`
+	Entries map[string]*MetadataEntry `json:"entries"`
+}
+
+// MetadataEntry merges the old SkillMeta + RegistryEntry fields.
+type MetadataEntry struct {
+	// Registry fields
+	Source  string `json:"source"`
+	Kind    string `json:"kind,omitempty"`
+	Type    string `json:"type,omitempty"`
+	Tracked bool   `json:"tracked,omitempty"`
+	Group   string `json:"group,omitempty"`
+	Branch  string `json:"branch,omitempty"`
+	Into    string `json:"into,omitempty"`
+
+	// Meta fields
+	InstalledAt time.Time         `json:"installed_at,omitzero"`
+	RepoURL     string            `json:"repo_url,omitempty"`
+	Subdir      string            `json:"subdir,omitempty"`
+	Version     string            `json:"version,omitempty"`
+	TreeHash    string            `json:"tree_hash,omitempty"`
+	FileHashes  map[string]string `json:"file_hashes,omitempty"`
+}
+
+// NewMetadataStore returns an empty store with version 1.
+func NewMetadataStore() *MetadataStore {
+	return &MetadataStore{
+		Version: 1,
+		Entries: make(map[string]*MetadataEntry),
+	}
+}
+
+// Get returns the entry for the given name, or nil if not found.
+func (s *MetadataStore) Get(name string) *MetadataEntry {
+	return s.Entries[name]
+}
+
+// Set adds or replaces an entry.
+func (s *MetadataStore) Set(name string, entry *MetadataEntry) {
+	s.Entries[name] = entry
+}
+
+// Remove deletes an entry by name.
+func (s *MetadataStore) Remove(name string) {
+	delete(s.Entries, name)
+}
+
+// Has returns true if an entry exists for the given name.
+func (s *MetadataStore) Has(name string) bool {
+	_, ok := s.Entries[name]
+	return ok
+}
+
+// GetByPath looks up an entry by its full relative path (e.g. "mygroup/keep-nested").
+// It first tries a direct key lookup, then falls back to matching group+basename.
+// This handles the case where entries are stored with basename keys but have a Group field.
+func (s *MetadataStore) GetByPath(relPath string) *MetadataEntry {
+	// Direct lookup (works for top-level skills where key == relPath)
+	if e := s.Entries[relPath]; e != nil {
+		return e
+	}
+	// Basename + group lookup (for nested skills stored with basename key)
+	base := filepath.Base(relPath)
+	group := ""
+	if dir := filepath.Dir(relPath); dir != "." {
+		group = filepath.ToSlash(dir)
+	}
+	if e := s.Entries[base]; e != nil && e.Group == group {
+		return e
+	}
+	return nil
+}
+
+// KeyToRelPath returns the effective relative path for a store entry.
+// For full-path keys it returns the key as-is; for legacy basename keys
+// it prepends the entry's Group.
+func KeyToRelPath(key string, entry *MetadataEntry) string {
+	if entry != nil && entry.Group != "" && !strings.HasPrefix(key, entry.Group+"/") {
+		return entry.Group + "/" + key
+	}
+	return key
+}
+
+// MigrateLegacyKey promotes a legacy basename key to a full-path key.
+// Returns true if migration occurred. No-op if the key is already full-path.
+func (s *MetadataStore) MigrateLegacyKey(fullPath string, existing *MetadataEntry) bool { + if s.Has(fullPath) { + return false + } + s.Remove(filepath.Base(fullPath)) + s.Set(fullPath, existing) + return true +} + +// List returns sorted entry names. +func (s *MetadataStore) List() []string { + names := make([]string, 0, len(s.Entries)) + for name := range s.Entries { + names = append(names, name) + } + sort.Strings(names) + return names +} + +// EffectiveKind returns "skill" if Kind is empty. +func (e *MetadataEntry) EffectiveKind() string { + if e.Kind == "" { + return "skill" + } + return e.Kind +} + +// RemoveByNames removes entries matching the given names, including group members. +// Handles direct key matches, full-path matches (group/name), and group membership. +// Works with both legacy basename keys and full-path keys. +func (s *MetadataStore) RemoveByNames(names map[string]bool) { + for _, key := range s.List() { + entry := s.Get(key) + fullName := KeyToRelPath(key, entry) + if names[key] || names[fullName] { + s.Remove(key) + continue + } + // Also match by basename for backward compat (e.g. uninstall "foo" should + // remove full-path key "frontend/foo"). + if entry != nil && entry.Group != "" { + basename := key + if idx := strings.LastIndex(key, "/"); idx >= 0 { + basename = key[idx+1:] + } + if names[basename] { + s.Remove(key) + continue + } + } + // Group directory uninstall: remove member skills + if entry != nil && entry.Group != "" { + for rn := range names { + if entry.Group == rn || strings.HasPrefix(entry.Group, rn+"/") { + s.Remove(key) + break + } + } + } + } +} + +// WriteMetaToStore writes a SkillMeta to the centralized .metadata.json store. +// sourceDir is the skills root (if empty, defaults to parent of destPath). +// destPath is the installed skill path. 
+func WriteMetaToStore(sourceDir, destPath string, meta *SkillMeta) error { + if sourceDir == "" { + sourceDir = filepath.Dir(destPath) + } + rel, err := filepath.Rel(sourceDir, destPath) + if err != nil { + return fmt.Errorf("relative path: %w", err) + } + rel = filepath.ToSlash(rel) + if meta.Kind == MetadataKindAgent { + rel = strings.TrimSuffix(rel, ".md") + } + + // Extract group from relative path (e.g. "frontend/foo" → group "frontend"). + group := "" + if idx := strings.LastIndex(rel, "/"); idx >= 0 { + group = rel[:idx] + } + + store, loadErr := LoadMetadata(sourceDir) + if loadErr != nil { + store = NewMetadataStore() + } + + // Use full relative path as key to avoid collisions between grouped + // skills with the same basename (e.g. "frontend/foo" vs "backend/foo"). + // Remove any legacy basename-only key for this group+basename pair. + if group != "" { + basename := rel[strings.LastIndex(rel, "/")+1:] + if old := store.Get(basename); old != nil && old.Group == group { + store.Remove(basename) + } + } + + store.Set(rel, &MetadataEntry{ + Source: meta.Source, + Kind: meta.Kind, + Type: meta.Type, + Group: group, + InstalledAt: meta.InstalledAt, + RepoURL: meta.RepoURL, + Subdir: meta.Subdir, + Version: meta.Version, + TreeHash: meta.TreeHash, + FileHashes: meta.FileHashes, + Branch: meta.Branch, + }) + return store.Save(sourceDir) +} + +// loadMetadataFile reads .metadata.json from the given directory (pure read, no migration). +// Returns an empty store (version 1) if the file does not exist. 
+func loadMetadataFile(dir string) (*MetadataStore, error) { + path := filepath.Join(dir, MetadataFileName) + data, err := os.ReadFile(path) + if err != nil { + if os.IsNotExist(err) { + return NewMetadataStore(), nil + } + return nil, fmt.Errorf("failed to read metadata: %w", err) + } + + var store MetadataStore + if err := json.Unmarshal(data, &store); err != nil { + return nil, fmt.Errorf("failed to parse metadata: %w", err) + } + if store.Entries == nil { + store.Entries = make(map[string]*MetadataEntry) + } + return &store, nil +} + +// LoadMetadata reads .metadata.json and cleans up any lingering sidecar files. +// Sidecar migration is idempotent — if no sidecars exist, it's a fast no-op +// (one ReadDir per call). This ensures sidecars created after initial migration +// (e.g. by agent install) are always cleaned up regardless of which command runs. +func LoadMetadata(dir string) (*MetadataStore, error) { + store, err := loadMetadataFile(dir) + if err != nil { + return nil, err + } + if cleanupSidecars(store, dir) { + store.Save(dir) //nolint:errcheck + } + return store, nil +} + +// LoadMetadataOrNew loads metadata from dir, returning an empty store on error. +func LoadMetadataOrNew(dir string) *MetadataStore { + store, _ := LoadMetadata(dir) + if store == nil { + return NewMetadataStore() + } + return store +} + +// Save writes .metadata.json atomically (temp file → rename). 
+func (s *MetadataStore) Save(dir string) error { + if err := os.MkdirAll(dir, 0755); err != nil { + return fmt.Errorf("failed to create directory: %w", err) + } + + data, err := json.MarshalIndent(s, "", " ") + if err != nil { + return fmt.Errorf("failed to marshal metadata: %w", err) + } + data = append(data, '\n') + + target := filepath.Join(dir, MetadataFileName) + tmp, err := os.CreateTemp(dir, ".metadata-*.tmp") + if err != nil { + return fmt.Errorf("failed to create temp file: %w", err) + } + tmpName := tmp.Name() + + if _, err := tmp.Write(data); err != nil { + tmp.Close() + os.Remove(tmpName) + return fmt.Errorf("failed to write temp file: %w", err) + } + if err := tmp.Close(); err != nil { + os.Remove(tmpName) + return fmt.Errorf("failed to close temp file: %w", err) + } + if err := os.Rename(tmpName, target); err != nil { + os.Remove(tmpName) + return fmt.Errorf("failed to rename temp file: %w", err) + } + return nil +} + +// MetadataPath returns the .metadata.json path for the given directory. +func MetadataPath(dir string) string { + return filepath.Join(dir, MetadataFileName) +} + +// SetFromSource creates an entry from a Source and stores it. Returns the entry. +func (s *MetadataStore) SetFromSource(name string, src *Source) *MetadataEntry { + entry := &MetadataEntry{ + Source: src.Raw, + Type: src.MetaType(), + InstalledAt: time.Now(), + Branch: src.Branch, + } + if src.IsGit() { + entry.RepoURL = src.CloneURL + } + if src.HasSubdir() { + entry.Subdir = strings.ReplaceAll(src.Subdir, "\\", "/") + } + s.Entries[name] = entry + return entry +} + +// ComputeEntryHashes walks skillPath and populates FileHashes with sha256 digests. +// Delegates to ComputeFileHashes in meta.go. +func (e *MetadataEntry) ComputeEntryHashes(skillPath string) error { + hashes, err := ComputeFileHashes(skillPath) + if err != nil { + return err + } + e.FileHashes = hashes + return nil +} + +// RefreshHashes recomputes file hashes for an entry that already has them. 
+// No-op if entry doesn't exist or has no FileHashes. +func (s *MetadataStore) RefreshHashes(relPath, skillPath string) { + entry := s.GetByPath(relPath) + if entry == nil || entry.FileHashes == nil { + return + } + hashes, err := ComputeFileHashes(skillPath) + if err != nil { + return + } + entry.FileHashes = hashes +} diff --git a/internal/install/metadata_migrate.go b/internal/install/metadata_migrate.go new file mode 100644 index 00000000..34c028ac --- /dev/null +++ b/internal/install/metadata_migrate.go @@ -0,0 +1,345 @@ +package install + +import ( + "encoding/json" + "os" + "path/filepath" + "strings" + + "gopkg.in/yaml.v3" +) + +// cleanupSidecars runs both skill and agent sidecar migration on dir. +// Returns true if any sidecars were found and cleaned up. +func cleanupSidecars(store *MetadataStore, dir string) bool { + before := storeFingerprint(store) + migrateSkillSidecars(store, dir) + migrateAgentSidecars(store, dir) + return storeFingerprint(store) != before +} + +// storeFingerprint returns a cheap fingerprint for change detection. +func storeFingerprint(s *MetadataStore) uint64 { + var h uint64 + for k, e := range s.Entries { + h += uint64(len(k)) + if e != nil && e.Source != "" { + h += uint64(len(e.Source)) + } + } + return h +} + +// LoadMetadataWithMigration loads .metadata.json, or migrates from old format if needed. +// kind is "" for skills directories, "agent" for agents directories. +// When .metadata.json already exists, LoadMetadata handles sidecar cleanup automatically. +func LoadMetadataWithMigration(dir, kind string) (*MetadataStore, error) { + // Fast path: .metadata.json exists — LoadMetadata handles sidecar cleanup. 
+ metaPath := filepath.Join(dir, MetadataFileName) + if _, err := os.Stat(metaPath); err == nil { + return LoadMetadata(dir) + } + + store := NewMetadataStore() + + // Phase 1: Migrate registry.yaml entries + migrateRegistryEntries(store, dir, kind) + if parent := filepath.Dir(dir); parent != dir { + migrateRegistryEntries(store, parent, kind) + } + + // Phase 2: Migrate sidecar .skillshare-meta.json files + if kind == "agent" { + migrateAgentSidecars(store, dir) + } else { + migrateSkillSidecars(store, dir) + } + + // Phase 3: Save if we found anything to migrate + if len(store.Entries) > 0 { + if err := store.Save(dir); err != nil { + return store, err + } + } + + // Phase 4: Clean up old registry.yaml (in dir and parent) + cleanupOldRegistry(dir) + if parent := filepath.Dir(dir); parent != dir { + cleanupOldRegistry(parent) + } + + return store, nil +} + +// localRegistryEntry mirrors config.SkillEntry without importing internal/config. +type localRegistryEntry struct { + Name string `yaml:"name"` + Kind string `yaml:"kind,omitempty"` + Source string `yaml:"source"` + Tracked bool `yaml:"tracked,omitempty"` + Group string `yaml:"group,omitempty"` + Branch string `yaml:"branch,omitempty"` +} + +// localRegistry mirrors config.Registry without importing internal/config. +type localRegistry struct { + Skills []localRegistryEntry `yaml:"skills,omitempty"` +} + +// migrateRegistryEntries reads registry.yaml in dir and merges matching entries into store. +// For skills dirs (kind=""), agent entries are skipped. +// For agents dirs (kind="agent"), skill entries are skipped. 
+func migrateRegistryEntries(store *MetadataStore, dir, kind string) {
+	registryPath := filepath.Join(dir, "registry.yaml")
+	data, err := os.ReadFile(registryPath)
+	if err != nil {
+		return
+	}
+
+	var reg localRegistry
+	if err := yaml.Unmarshal(data, &reg); err != nil {
+		return
+	}
+
+	for _, e := range reg.Skills {
+		if e.Name == "" || e.Source == "" {
+			continue
+		}
+
+		isAgent := e.Kind == "agent"
+
+		// Filter: skills dir skips agent entries, agents dir skips skill entries
+		if kind == "agent" && !isAgent {
+			continue
+		}
+		if kind == "" && isAgent {
+			continue
+		}
+
+		entry := store.Get(e.Name)
+		if entry == nil {
+			entry = &MetadataEntry{}
+			store.Set(e.Name, entry)
+		}
+
+		entry.Source = e.Source
+		entry.Kind = e.Kind
+		entry.Tracked = e.Tracked
+		entry.Group = e.Group
+		entry.Branch = e.Branch
+	}
+}
+
+// migrateSkillSidecars walks subdirectories of dir, looks for .skillshare-meta.json
+// inside each, reads as SkillMeta, merges fields into store entry, and removes old sidecar.
+func migrateSkillSidecars(store *MetadataStore, dir string) {
+	entries, err := os.ReadDir(dir)
+	if err != nil {
+		return
+	}
+
+	for _, de := range entries {
+		if !de.IsDir() {
+			continue
+		}
+		skillName := de.Name()
+		skillPath := filepath.Join(dir, skillName)
+		walkSkillDir(store, skillPath, skillName, "")
+	}
+}
+
+// walkSkillDir recursively walks a skill directory to find .skillshare-meta.json sidecars.
+// group is the parent group prefix (empty for top-level skills).
+func walkSkillDir(store *MetadataStore, skillPath, name, group string) { + sidecarPath := filepath.Join(skillPath, MetaFileName) + if _, err := os.Stat(sidecarPath); err == nil { + // This directory has a sidecar — it's a leaf skill + mergeSkillSidecar(store, name, group, sidecarPath) + os.Remove(sidecarPath) + return + } + + // Check if this has subdirectories (nested skills) + subEntries, err := os.ReadDir(skillPath) + if err != nil { + return + } + + for _, sub := range subEntries { + if sub.IsDir() { + subGroup := name + if group != "" { + subGroup = group + "/" + name + } + walkSkillDir(store, filepath.Join(skillPath, sub.Name()), sub.Name(), subGroup) + } + } +} + +// mergeSkillSidecar reads a SkillMeta sidecar and merges its fields into the store. +func mergeSkillSidecar(store *MetadataStore, name, group, sidecarPath string) { + data, err := os.ReadFile(sidecarPath) + if err != nil { + return + } + + var meta SkillMeta + if err := json.Unmarshal(data, &meta); err != nil { + return + } + + // Use full-path key for grouped skills. 
+ key := name + if group != "" { + key = group + "/" + name + } + + entry := store.Get(key) + if entry == nil { + entry = &MetadataEntry{} + store.Set(key, entry) + } + + // Merge sidecar fields — sidecar has richer data + if meta.Source != "" && entry.Source == "" { + entry.Source = meta.Source + } + if meta.Kind != "" { + entry.Kind = meta.Kind + } + if meta.Type != "" { + entry.Type = meta.Type + } + if !meta.InstalledAt.IsZero() { + entry.InstalledAt = meta.InstalledAt + } + if meta.RepoURL != "" { + entry.RepoURL = meta.RepoURL + } + if meta.Subdir != "" { + entry.Subdir = meta.Subdir + } + if meta.Version != "" { + entry.Version = meta.Version + } + if meta.TreeHash != "" { + entry.TreeHash = meta.TreeHash + } + if meta.FileHashes != nil { + entry.FileHashes = meta.FileHashes + } + if meta.Branch != "" && entry.Branch == "" { + entry.Branch = meta.Branch + } + if group != "" && entry.Group == "" { + entry.Group = group + } + // Detect tracked repos: top-level parent starts with "_" + root := group + if idx := strings.Index(root, "/"); idx >= 0 { + root = root[:idx] + } + if len(root) > 0 && root[0] == '_' { + entry.Tracked = true + } +} + +// migrateAgentSidecars recursively scans dir for *.skillshare-meta.json files, +// merges them into the centralized store with Kind="agent", and removes the sidecars. +func migrateAgentSidecars(store *MetadataStore, dir string) { + walkAgentSidecars(store, dir, "") +} + +// walkAgentSidecars recursively walks dir for agent sidecar files. +// group is the parent prefix (empty for top-level, e.g. "demo" for agents/demo/). 
+func walkAgentSidecars(store *MetadataStore, dir, group string) { + entries, err := os.ReadDir(dir) + if err != nil { + return + } + + const suffix = ".skillshare-meta.json" + for _, de := range entries { + if de.IsDir() { + subGroup := de.Name() + if group != "" { + subGroup = group + "/" + de.Name() + } + walkAgentSidecars(store, filepath.Join(dir, de.Name()), subGroup) + continue + } + if !strings.HasSuffix(de.Name(), suffix) { + continue + } + + agentName := strings.TrimSuffix(de.Name(), suffix) + if agentName == "" { + continue + } + // Use full-path key for grouped agents (e.g. "demo/reviewer") + key := agentName + if group != "" { + key = group + "/" + agentName + } + + sidecarPath := filepath.Join(dir, de.Name()) + mergeAgentSidecar(store, key, group, sidecarPath) + os.Remove(sidecarPath) + } +} + +// mergeAgentSidecar reads a SkillMeta sidecar and merges its fields into the store. +func mergeAgentSidecar(store *MetadataStore, key, group, sidecarPath string) { + data, err := os.ReadFile(sidecarPath) + if err != nil { + return + } + + var meta SkillMeta + if err := json.Unmarshal(data, &meta); err != nil { + return + } + + entry := store.Get(key) + if entry == nil { + entry = &MetadataEntry{} + store.Set(key, entry) + } + + if meta.Source != "" && entry.Source == "" { + entry.Source = meta.Source + } + entry.Kind = "agent" + if meta.Type != "" { + entry.Type = meta.Type + } + if !meta.InstalledAt.IsZero() { + entry.InstalledAt = meta.InstalledAt + } + if meta.RepoURL != "" { + entry.RepoURL = meta.RepoURL + } + if meta.Subdir != "" { + entry.Subdir = meta.Subdir + } + if meta.Version != "" { + entry.Version = meta.Version + } + if meta.TreeHash != "" { + entry.TreeHash = meta.TreeHash + } + if meta.FileHashes != nil { + entry.FileHashes = meta.FileHashes + } + if meta.Branch != "" && entry.Branch == "" { + entry.Branch = meta.Branch + } + if group != "" && entry.Group == "" { + entry.Group = group + } +} + +// cleanupOldRegistry removes registry.yaml from 
dir (best-effort, ignores errors). +func cleanupOldRegistry(dir string) { + os.Remove(filepath.Join(dir, "registry.yaml")) +} diff --git a/internal/install/metadata_migrate_test.go b/internal/install/metadata_migrate_test.go new file mode 100644 index 00000000..0be70f23 --- /dev/null +++ b/internal/install/metadata_migrate_test.go @@ -0,0 +1,288 @@ +package install + +import ( + "encoding/json" + "os" + "path/filepath" + "testing" + "time" +) + +// TestMigrateMetadata_FromSidecars verifies that a skill dir with a .skillshare-meta.json +// sidecar is migrated: entry appears in store, old sidecar removed, .metadata.json created. +func TestMigrateMetadata_FromSidecars(t *testing.T) { + dir := t.TempDir() + + // Create skill dir with SKILL.md and sidecar + skillDir := filepath.Join(dir, "my-skill") + if err := os.MkdirAll(skillDir, 0755); err != nil { + t.Fatal(err) + } + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("---\nname: my-skill\n---\n# Content"), 0644) + + meta := &SkillMeta{ + Source: "github.com/user/repo", + Type: "github", + RepoURL: "https://github.com/user/repo", + InstalledAt: time.Now(), + FileHashes: map[string]string{"SKILL.md": "sha256:abc123"}, + } + writeSkillMetaSidecar(t, skillDir, meta) + + store, err := LoadMetadataWithMigration(dir, "") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + // Entry should be present + entry := store.Get("my-skill") + if entry == nil { + t.Fatal("expected entry 'my-skill' in store") + } + if entry.Source != "github.com/user/repo" { + t.Errorf("Source = %q, want %q", entry.Source, "github.com/user/repo") + } + if entry.RepoURL != "https://github.com/user/repo" { + t.Errorf("RepoURL = %q, want %q", entry.RepoURL, "https://github.com/user/repo") + } + if len(entry.FileHashes) == 0 { + t.Error("expected FileHashes to be populated") + } + + // Old sidecar should be removed + sidecarPath := filepath.Join(skillDir, MetaFileName) + if _, err := os.Stat(sidecarPath); err == nil { + 
t.Error("expected old sidecar to be removed") + } + + // .metadata.json should exist + if _, err := os.Stat(filepath.Join(dir, MetadataFileName)); err != nil { + t.Errorf(".metadata.json not created: %v", err) + } +} + +// TestMigrateMetadata_FromRegistry verifies that registry.yaml entries are migrated +// and the old registry.yaml is removed. +func TestMigrateMetadata_FromRegistry(t *testing.T) { + dir := t.TempDir() + + // Create skill dir (no sidecar) + skillDir := filepath.Join(dir, "team-skill") + if err := os.MkdirAll(skillDir, 0755); err != nil { + t.Fatal(err) + } + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("---\nname: team-skill\n---\n"), 0644) + + // Write registry.yaml + registryYAML := `skills: + - name: team-skill + source: github.com/org/repo + tracked: true + branch: main +` + os.WriteFile(filepath.Join(dir, "registry.yaml"), []byte(registryYAML), 0644) + + store, err := LoadMetadataWithMigration(dir, "") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + entry := store.Get("team-skill") + if entry == nil { + t.Fatal("expected entry 'team-skill' in store") + } + if entry.Source != "github.com/org/repo" { + t.Errorf("Source = %q, want %q", entry.Source, "github.com/org/repo") + } + if !entry.Tracked { + t.Error("expected Tracked = true") + } + if entry.Branch != "main" { + t.Errorf("Branch = %q, want %q", entry.Branch, "main") + } + + // Old registry.yaml should be removed + if _, err := os.Stat(filepath.Join(dir, "registry.yaml")); err == nil { + t.Error("expected old registry.yaml to be removed") + } +} + +// TestMigrateMetadata_MergesRegistryAndSidecar verifies that registry fields (group, branch) +// and sidecar fields (repo_url, file_hashes) are merged into a single entry. 
+func TestMigrateMetadata_MergesRegistryAndSidecar(t *testing.T) { + dir := t.TempDir() + + // Registry has group + branch + registryYAML := `skills: + - name: review + source: github.com/org/tools + group: frontend + branch: develop +` + os.WriteFile(filepath.Join(dir, "registry.yaml"), []byte(registryYAML), 0644) + + // Sidecar has repo_url + file_hashes (inside group/name subdir) + skillDir := filepath.Join(dir, "review") + if err := os.MkdirAll(skillDir, 0755); err != nil { + t.Fatal(err) + } + meta := &SkillMeta{ + Source: "github.com/org/tools", + Type: "github", + RepoURL: "https://github.com/org/tools", + FileHashes: map[string]string{"SKILL.md": "sha256:def456"}, + } + writeSkillMetaSidecar(t, skillDir, meta) + + store, err := LoadMetadataWithMigration(dir, "") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + entry := store.Get("review") + if entry == nil { + t.Fatal("expected entry 'review' in store") + } + // From registry + if entry.Group != "frontend" { + t.Errorf("Group = %q, want %q", entry.Group, "frontend") + } + if entry.Branch != "develop" { + t.Errorf("Branch = %q, want %q", entry.Branch, "develop") + } + // From sidecar + if entry.RepoURL != "https://github.com/org/tools" { + t.Errorf("RepoURL = %q, want %q", entry.RepoURL, "https://github.com/org/tools") + } + if len(entry.FileHashes) == 0 { + t.Error("expected FileHashes to be populated from sidecar") + } +} + +// TestMigrateMetadata_Idempotent verifies that when .metadata.json already exists, +// it is loaded as-is without any migration being attempted. 
+func TestMigrateMetadata_Idempotent(t *testing.T) { + dir := t.TempDir() + + // Pre-create .metadata.json with known content + existing := NewMetadataStore() + existing.Set("pre-existing", &MetadataEntry{ + Source: "github.com/user/existing", + Kind: "skill", + }) + if err := existing.Save(dir); err != nil { + t.Fatal(err) + } + + // Also write a registry.yaml that should NOT be processed + registryYAML := `skills: + - name: new-skill + source: github.com/user/new +` + os.WriteFile(filepath.Join(dir, "registry.yaml"), []byte(registryYAML), 0644) + + store, err := LoadMetadataWithMigration(dir, "") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + // Should have the pre-existing entry + if !store.Has("pre-existing") { + t.Error("expected 'pre-existing' entry from .metadata.json") + } + // Should NOT have new-skill (migration was skipped) + if store.Has("new-skill") { + t.Error("migration should have been skipped — new-skill should not appear") + } + // registry.yaml should still exist (was not cleaned up) + if _, err := os.Stat(filepath.Join(dir, "registry.yaml")); err != nil { + t.Error("expected registry.yaml to still exist when migration was skipped") + } +} + +// TestMigrateMetadata_AgentSidecars verifies that agent sidecar files +// (reviewer.skillshare-meta.json) are migrated with Kind="agent". 
+func TestMigrateMetadata_AgentSidecars(t *testing.T) { + dir := t.TempDir() + + // Create reviewer.md (the agent file) and reviewer.skillshare-meta.json (sidecar) + os.WriteFile(filepath.Join(dir, "reviewer.md"), []byte("# Reviewer Agent"), 0644) + + meta := &SkillMeta{ + Source: "github.com/org/agents", + Type: "github", + RepoURL: "https://github.com/org/agents", + InstalledAt: time.Now(), + Version: "abc123", + } + sidecarPath := filepath.Join(dir, "reviewer"+".skillshare-meta.json") + data, err := json.MarshalIndent(meta, "", " ") + if err != nil { + t.Fatal(err) + } + if err := os.WriteFile(sidecarPath, data, 0644); err != nil { + t.Fatal(err) + } + + store, err := LoadMetadataWithMigration(dir, "agent") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + entry := store.Get("reviewer") + if entry == nil { + t.Fatal("expected entry 'reviewer' in store") + } + if entry.Kind != "agent" { + t.Errorf("Kind = %q, want %q", entry.Kind, "agent") + } + if entry.RepoURL != "https://github.com/org/agents" { + t.Errorf("RepoURL = %q, want %q", entry.RepoURL, "https://github.com/org/agents") + } + if entry.Version != "abc123" { + t.Errorf("Version = %q, want %q", entry.Version, "abc123") + } + + // Old sidecar should be removed + if _, err := os.Stat(sidecarPath); err == nil { + t.Error("expected agent sidecar to be removed after migration") + } + + // .metadata.json should exist + if _, err := os.Stat(filepath.Join(dir, MetadataFileName)); err != nil { + t.Errorf(".metadata.json not created: %v", err) + } +} + +// TestMigrateMetadata_EmptyDir verifies that an empty dir returns an empty store without error. 
+func TestMigrateMetadata_EmptyDir(t *testing.T) { + dir := t.TempDir() + + store, err := LoadMetadataWithMigration(dir, "") + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + + if len(store.Entries) != 0 { + t.Errorf("expected empty store, got %d entries", len(store.Entries)) + } + + // .metadata.json should NOT be created when nothing was migrated + if _, err := os.Stat(filepath.Join(dir, MetadataFileName)); err == nil { + t.Error("expected .metadata.json to not be created for empty dir") + } +} + +// writeSkillMetaSidecar is a test helper that writes a .skillshare-meta.json sidecar +// inside the given skill directory. +func writeSkillMetaSidecar(t *testing.T, skillDir string, meta *SkillMeta) { + t.Helper() + data, err := json.MarshalIndent(meta, "", " ") + if err != nil { + t.Fatalf("marshal SkillMeta: %v", err) + } + path := filepath.Join(skillDir, MetaFileName) + if err := os.WriteFile(path, data, 0644); err != nil { + t.Fatalf("write sidecar: %v", err) + } +} diff --git a/internal/install/metadata_test.go b/internal/install/metadata_test.go new file mode 100644 index 00000000..e60e31cb --- /dev/null +++ b/internal/install/metadata_test.go @@ -0,0 +1,368 @@ +package install + +import ( + "os" + "path/filepath" + "testing" + "time" +) + +func TestMetadataStore_SetAndGet(t *testing.T) { + s := NewMetadataStore() + now := time.Now() + entry := &MetadataEntry{ + Source: "org/repo", + Kind: "skill", + Type: "github", + Tracked: true, + Group: "mygroup", + Branch: "main", + Into: "frontend", + InstalledAt: now, + RepoURL: "https://github.com/org/repo.git", + Subdir: "skills/foo", + Version: "abc123", + TreeHash: "deadbeef", + FileHashes: map[string]string{"SKILL.md": "sha256:aabbcc"}, + } + + s.Set("foo", entry) + got := s.Get("foo") + + if got == nil { + t.Fatal("Get returned nil after Set") + } + if got.Source != entry.Source { + t.Errorf("Source = %q, want %q", got.Source, entry.Source) + } + if got.Kind != entry.Kind { + t.Errorf("Kind = %q, want 
%q", got.Kind, entry.Kind) + } + if got.Type != entry.Type { + t.Errorf("Type = %q, want %q", got.Type, entry.Type) + } + if got.Tracked != entry.Tracked { + t.Errorf("Tracked = %v, want %v", got.Tracked, entry.Tracked) + } + if got.Group != entry.Group { + t.Errorf("Group = %q, want %q", got.Group, entry.Group) + } + if got.Branch != entry.Branch { + t.Errorf("Branch = %q, want %q", got.Branch, entry.Branch) + } + if got.Into != entry.Into { + t.Errorf("Into = %q, want %q", got.Into, entry.Into) + } + if !got.InstalledAt.Equal(entry.InstalledAt) { + t.Errorf("InstalledAt = %v, want %v", got.InstalledAt, entry.InstalledAt) + } + if got.RepoURL != entry.RepoURL { + t.Errorf("RepoURL = %q, want %q", got.RepoURL, entry.RepoURL) + } + if got.Subdir != entry.Subdir { + t.Errorf("Subdir = %q, want %q", got.Subdir, entry.Subdir) + } + if got.Version != entry.Version { + t.Errorf("Version = %q, want %q", got.Version, entry.Version) + } + if got.TreeHash != entry.TreeHash { + t.Errorf("TreeHash = %q, want %q", got.TreeHash, entry.TreeHash) + } + if len(got.FileHashes) != 1 || got.FileHashes["SKILL.md"] != "sha256:aabbcc" { + t.Errorf("FileHashes = %v, want map with one entry", got.FileHashes) + } +} + +func TestMetadataStore_GetMissing(t *testing.T) { + s := NewMetadataStore() + got := s.Get("nonexistent") + if got != nil { + t.Errorf("Get nonexistent = %v, want nil", got) + } +} + +func TestMetadataStore_Has(t *testing.T) { + s := NewMetadataStore() + s.Set("present", &MetadataEntry{Source: "org/repo"}) + + if !s.Has("present") { + t.Error("Has(present) = false, want true") + } + if s.Has("absent") { + t.Error("Has(absent) = true, want false") + } +} + +func TestMetadataStore_Remove(t *testing.T) { + s := NewMetadataStore() + s.Set("to-remove", &MetadataEntry{Source: "org/repo"}) + + if !s.Has("to-remove") { + t.Fatal("entry should exist before Remove") + } + + s.Remove("to-remove") + + if s.Has("to-remove") { + t.Error("entry still present after Remove") + } + if 
s.Get("to-remove") != nil { + t.Error("Get after Remove should return nil") + } +} + +func TestMetadataStore_Remove_Nonexistent(t *testing.T) { + s := NewMetadataStore() + // Should not panic + s.Remove("nonexistent") +} + +func TestMetadataStore_List(t *testing.T) { + s := NewMetadataStore() + s.Set("zebra", &MetadataEntry{}) + s.Set("alpha", &MetadataEntry{}) + s.Set("mango", &MetadataEntry{}) + + names := s.List() + + if len(names) != 3 { + t.Fatalf("List() = %v, want 3 entries", names) + } + want := []string{"alpha", "mango", "zebra"} + for i, w := range want { + if names[i] != w { + t.Errorf("List()[%d] = %q, want %q", i, names[i], w) + } + } +} + +func TestMetadataStore_List_Empty(t *testing.T) { + s := NewMetadataStore() + names := s.List() + if len(names) != 0 { + t.Errorf("List() on empty store = %v, want []", names) + } +} + +func TestMetadataEntry_EffectiveKind(t *testing.T) { + tests := []struct { + name string + kind string + want string + }{ + {"empty kind defaults to skill", "", "skill"}, + {"explicit skill", "skill", "skill"}, + {"agent", "agent", "agent"}, + {"custom kind preserved", "custom", "custom"}, + } + + for _, tt := range tests { + t.Run(tt.name, func(t *testing.T) { + e := &MetadataEntry{Kind: tt.kind} + got := e.EffectiveKind() + if got != tt.want { + t.Errorf("EffectiveKind() = %q, want %q", got, tt.want) + } + }) + } +} + +func TestNewMetadataStore_InitialState(t *testing.T) { + s := NewMetadataStore() + if s == nil { + t.Fatal("NewMetadataStore returned nil") + } + if s.Version != 1 { + t.Errorf("Version = %d, want 1", s.Version) + } + if s.Entries == nil { + t.Error("Entries map is nil") + } + if len(s.Entries) != 0 { + t.Errorf("Entries not empty on new store: %v", s.Entries) + } +} + +func TestMetadataStore_SaveAndLoad(t *testing.T) { + dir := t.TempDir() + store := NewMetadataStore() + store.Set("my-skill", &MetadataEntry{ + Source: "github.com/user/repo", + Type: "github", + InstalledAt: time.Date(2026, 4, 1, 10, 0, 0, 0, 
time.UTC), + FileHashes: map[string]string{"SKILL.md": "sha256:abc123"}, + }) + + if err := store.Save(dir); err != nil { + t.Fatalf("Save failed: %v", err) + } + + // Verify file exists + metaPath := filepath.Join(dir, MetadataFileName) + if _, err := os.Stat(metaPath); err != nil { + t.Fatalf("metadata file not created: %v", err) + } + + // Load and verify round-trip + loaded, err := LoadMetadata(dir) + if err != nil { + t.Fatalf("LoadMetadata failed: %v", err) + } + if loaded.Version != 1 { + t.Errorf("version = %d, want 1", loaded.Version) + } + entry := loaded.Get("my-skill") + if entry == nil { + t.Fatal("expected entry, got nil") + } + if entry.Source != "github.com/user/repo" { + t.Errorf("source = %q, want %q", entry.Source, "github.com/user/repo") + } + if entry.FileHashes["SKILL.md"] != "sha256:abc123" { + t.Errorf("file hash mismatch") + } +} + +func TestLoadMetadata_EmptyDir(t *testing.T) { + dir := t.TempDir() + store, err := LoadMetadata(dir) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + if store.Version != 1 { + t.Errorf("version = %d, want 1", store.Version) + } + if len(store.Entries) != 0 { + t.Errorf("expected empty entries, got %d", len(store.Entries)) + } +} + +func TestLoadMetadata_InvalidJSON(t *testing.T) { + dir := t.TempDir() + os.WriteFile(filepath.Join(dir, MetadataFileName), []byte("{invalid"), 0644) + _, err := LoadMetadata(dir) + if err == nil { + t.Fatal("expected error for invalid JSON") + } +} + +func TestMetadataStore_SaveAtomic_NoTempFiles(t *testing.T) { + dir := t.TempDir() + store := NewMetadataStore() + store.Set("a", &MetadataEntry{Source: "s1"}) + if err := store.Save(dir); err != nil { + t.Fatalf("Save failed: %v", err) + } + + entries, _ := os.ReadDir(dir) + for _, e := range entries { + if e.Name() != MetadataFileName { + t.Errorf("unexpected file left behind: %s", e.Name()) + } + } +} + +func TestMetadataStore_SaveCreatesDir(t *testing.T) { + dir := filepath.Join(t.TempDir(), "nested", "dir") + store 
:= NewMetadataStore() + store.Set("x", &MetadataEntry{Source: "s"}) + if err := store.Save(dir); err != nil { + t.Fatalf("Save failed: %v", err) + } + if _, err := os.Stat(filepath.Join(dir, MetadataFileName)); err != nil { + t.Fatalf("file should exist in nested dir: %v", err) + } +} + +func TestMetadataPath(t *testing.T) { + got := MetadataPath("/some/dir") + want := filepath.Join("/some/dir", MetadataFileName) + if got != want { + t.Errorf("MetadataPath = %q, want %q", got, want) + } +} + +func TestMetadataStore_SetFromSource(t *testing.T) { + store := NewMetadataStore() + source := &Source{ + Raw: "github.com/user/repo", + CloneURL: "https://github.com/user/repo.git", + Branch: "dev", + Subdir: "skills\\review", + } + source.Type = SourceTypeGitHub + + entry := store.SetFromSource("review", source) + if entry.Source != "github.com/user/repo" { + t.Errorf("source = %q", entry.Source) + } + if entry.RepoURL != "https://github.com/user/repo.git" { + t.Errorf("repo_url = %q", entry.RepoURL) + } + if entry.Branch != "dev" { + t.Errorf("branch = %q", entry.Branch) + } + if entry.Subdir != "skills/review" { + t.Errorf("subdir = %q, want forward slashes", entry.Subdir) + } + if entry.InstalledAt.IsZero() { + t.Error("installed_at should be set") + } + if !store.Has("review") { + t.Error("entry not stored") + } +} + +func TestMetadataEntry_ComputeEntryHashes(t *testing.T) { + dir := t.TempDir() + os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("# Test"), 0644) + os.MkdirAll(filepath.Join(dir, ".git"), 0755) + + entry := &MetadataEntry{} + if err := entry.ComputeEntryHashes(dir); err != nil { + t.Fatalf("ComputeEntryHashes failed: %v", err) + } + if _, ok := entry.FileHashes["SKILL.md"]; !ok { + t.Error("expected SKILL.md in file hashes") + } + if len(entry.FileHashes) != 1 { + t.Errorf("expected 1 hash (SKILL.md only), got %d: %v", len(entry.FileHashes), entry.FileHashes) + } +} + +func TestMetadataStore_RefreshHashes(t *testing.T) { + dir := t.TempDir() + skillDir 
:= filepath.Join(dir, "my-skill") + os.MkdirAll(skillDir, 0755) + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("# V1"), 0644) + + store := NewMetadataStore() + entry := &MetadataEntry{ + Source: "test", + FileHashes: map[string]string{"SKILL.md": "sha256:old"}, + } + store.Set("my-skill", entry) + + store.RefreshHashes("my-skill", skillDir) + + refreshed := store.Get("my-skill") + if refreshed.FileHashes["SKILL.md"] == "sha256:old" { + t.Error("hashes should have been refreshed") + } + if refreshed.FileHashes["SKILL.md"] == "" { + t.Error("hash should not be empty after refresh") + } +} + +func TestMetadataStore_RefreshHashes_NoOp(t *testing.T) { + store := NewMetadataStore() + // No entry — should not panic + store.RefreshHashes("nonexistent", "/tmp") + + // Entry without FileHashes — should not compute + store.Set("x", &MetadataEntry{Source: "s"}) + store.RefreshHashes("x", "/tmp") + if store.Get("x").FileHashes != nil { + t.Error("should not compute hashes when FileHashes is nil") + } +} diff --git a/internal/install/source.go b/internal/install/source.go index eb7ed1a3..ccc8a5cd 100644 --- a/internal/install/source.go +++ b/internal/install/source.go @@ -44,6 +44,9 @@ type Source struct { Path string // Local path (empty for git) Name string // Derived skill name Branch string // Git branch to clone from (empty = remote default) + // ExplicitSkill is true when the user pointed directly at a SKILL.md file. + // That intent should resolve to exactly one skill, not a pack/discovery view. + ExplicitSkill bool } // GitHub URL pattern: github.com/owner/repo[/path/to/subdir] @@ -218,7 +221,7 @@ func parseGitHub(matches []string, source *Source) (*Source, error) { // Handle GitHub web URL format: /tree/{branch}/path or /blob/{branch}/path // Strip the tree/branch or blob/branch prefix to get the actual subdir - subdir = stripGitHubBranchPrefix(subdir) + subdir, source.ExplicitSkill = stripGitHubBranchPrefix(subdir) // Normalize "." 
subdir (explicit root) to empty string if subdir == "." { @@ -239,10 +242,12 @@ func parseGitHub(matches []string, source *Source) (*Source, error) { return source, nil } -// stripGitHubBranchPrefix removes tree/{branch}/ or blob/{branch}/ from GitHub web URLs -func stripGitHubBranchPrefix(subdir string) string { +// stripGitHubBranchPrefix removes tree/{branch}/ or blob/{branch}/ from GitHub web URLs. +// When a blob/ URL points directly at a SKILL.md file, the containing directory is +// used instead so the resulting subdir represents a skill (not a literal file name). +func stripGitHubBranchPrefix(subdir string) (string, bool) { if subdir == "" { - return "" + return "", false } parts := strings.SplitN(subdir, "/", 3) @@ -251,14 +256,32 @@ func stripGitHubBranchPrefix(subdir string) string { // parts[0] = "tree" or "blob" // parts[1] = branch name (e.g., "main", "master", "v1.0") // parts[2] = actual path (if exists) + isBlob := parts[0] == "blob" if len(parts) == 3 { - return parts[2] + return trimSkillFileSuffix(parts[2], isBlob) } // Only tree/branch, no actual subdir - return "" + return "", false } - return subdir + return subdir, false +} + +// trimSkillFileSuffix strips a trailing SKILL.md segment from a blob URL path so +// the resulting subdir is the containing directory of the skill. Non-blob URLs +// and paths that do not end in SKILL.md are returned unchanged. +func trimSkillFileSuffix(path string, isBlob bool) (string, bool) { + if !isBlob { + return path, false + } + if !strings.EqualFold(filepath.Base(path), "SKILL.md") { + return path, false + } + parent := filepath.ToSlash(filepath.Dir(path)) + if parent == "." 
{ + return "", true + } + return parent, true } func parseGitSSH(matches []string, source *Source) (*Source, error) { @@ -380,7 +403,7 @@ func parseGitHTTPS(matches []string, source *Source, opts ParseOptions) (*Source } // Strip platform-specific branch prefixes from web URLs - subdir = stripGitBranchPrefix(host, subdir) + subdir, source.ExplicitSkill = stripGitBranchPrefix(host, subdir) // Normalize "." subdir (explicit root) to empty string if subdir == "." { @@ -419,26 +442,28 @@ func isGitLabHost(host string, extraHosts []string) bool { // stripGitBranchPrefix removes platform-specific branch path segments from web URLs. // Bitbucket: src/{branch}/path → path // GitLab: -/tree/{branch}/path → path, -/blob/{branch}/path → path -func stripGitBranchPrefix(host, subdir string) string { +func stripGitBranchPrefix(host, subdir string) (string, bool) { if subdir == "" { - return "" + return "", false } subdir = strings.TrimRight(subdir, "/") parts := strings.SplitN(subdir, "/", 3) - // Bitbucket: src/{branch}/path + // Bitbucket: src/{branch}/path — there is no separate blob marker, so we + // best-effort treat a trailing SKILL.md the same as a blob URL. 
if strings.Contains(host, "bitbucket") && len(parts) >= 2 && parts[0] == "src" { if len(parts) == 3 { - return parts[2] + return trimSkillFileSuffix(parts[2], true) } - return "" + return "", false } // GitLab: -/tree/{branch}/path or -/blob/{branch}/path if parts[0] == "-" && len(parts) >= 2 { rest := strings.SplitN(parts[1], "/", 2) if rest[0] == "tree" || rest[0] == "blob" { + isBlob := rest[0] == "blob" // subdir is "-/tree/{branch}/path" or "-/blob/{branch}/path" // After SplitN(subdir, "/", 3): parts = ["-", "tree", "{branch}/path"] // Need to further split parts[2] to get past branch @@ -446,14 +471,14 @@ func stripGitBranchPrefix(host, subdir string) string { inner := strings.SplitN(parts[2], "/", 2) // inner[0] = branch, inner[1] = actual path if len(inner) == 2 { - return inner[1] + return trimSkillFileSuffix(inner[1], isBlob) } } - return "" + return "", false } } - return subdir + return subdir, false } // HasSubdir returns true if this source requires subdirectory extraction @@ -461,6 +486,11 @@ func (s *Source) HasSubdir() bool { return s.Subdir != "" } +// TargetsExplicitSkill reports whether the user pointed directly at a SKILL.md file. 
+func (s *Source) TargetsExplicitSkill() bool { + return s != nil && s.ExplicitSkill +} + // IsGit returns true if this source requires git clone func (s *Source) IsGit() bool { return s.Type == SourceTypeGitHub || diff --git a/internal/install/source_test.go b/internal/install/source_test.go index 6ed6168c..522ca987 100644 --- a/internal/install/source_test.go +++ b/internal/install/source_test.go @@ -60,6 +60,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL string wantSubdir string wantName string + wantExplicit bool }{ { name: "basic github shorthand", @@ -67,6 +68,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, }, { name: "github shorthand with .git", @@ -74,6 +76,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, }, { name: "github with subdirectory", @@ -81,6 +84,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "path/to/skill", wantName: "skill", + wantExplicit: false, }, { name: "github with https prefix", @@ -88,6 +92,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, }, { name: "github https with .git", @@ -95,6 +100,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, }, { name: "github web URL with tree/main", @@ -102,6 +108,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "path/to/skill", wantName: "skill", + wantExplicit: false, }, { name: "github web URL with tree/master", @@ -109,6 +116,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) 
{ wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "skills/my-skill", wantName: "my-skill", + wantExplicit: false, }, { name: "github web URL with blob (file view)", @@ -116,6 +124,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "path/to/skill", wantName: "skill", + wantExplicit: false, }, { name: "github web URL tree/branch only (no subdir)", @@ -123,6 +132,7 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, }, { name: "github dot subdir normalized to root", @@ -130,6 +140,33 @@ func TestParseSource_GitHubShorthand(t *testing.T) { wantCloneURL: "https://github.com/user/repo.git", wantSubdir: "", wantName: "repo", + wantExplicit: false, + }, + // issue #124: blob URLs pointing directly at SKILL.md should resolve + // to the containing skill directory instead of a literal "SKILL.md" name. + { + name: "github blob URL at root SKILL.md", + input: "https://github.com/iOfficeAI/OfficeCLI/blob/main/SKILL.md", + wantCloneURL: "https://github.com/iOfficeAI/OfficeCLI.git", + wantSubdir: "", + wantName: "OfficeCLI", + wantExplicit: true, + }, + { + name: "github blob URL at nested SKILL.md", + input: "https://github.com/user/repo/blob/main/skills/foo/SKILL.md", + wantCloneURL: "https://github.com/user/repo.git", + wantSubdir: "skills/foo", + wantName: "foo", + wantExplicit: true, + }, + { + name: "github blob URL at SKILL.md lowercase (case-insensitive)", + input: "https://github.com/user/repo/blob/main/skill.md", + wantCloneURL: "https://github.com/user/repo.git", + wantSubdir: "", + wantName: "repo", + wantExplicit: true, }, } @@ -151,6 +188,9 @@ func TestParseSource_GitHubShorthand(t *testing.T) { if source.Name != tt.wantName { t.Errorf("Name = %v, want %v", source.Name, tt.wantName) } + if source.TargetsExplicitSkill() != tt.wantExplicit { + t.Errorf("TargetsExplicitSkill() = 
%v, want %v", source.TargetsExplicitSkill(), tt.wantExplicit) + } }) } } @@ -240,24 +280,28 @@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL string wantSubdir string wantName string + wantExplicit bool }{ { name: "gitlab https", input: "https://gitlab.com/user/repo", wantCloneURL: "https://gitlab.com/user/repo.git", wantName: "repo", + wantExplicit: false, }, { name: "bitbucket https", input: "https://bitbucket.org/user/repo.git", wantCloneURL: "https://bitbucket.org/user/repo.git", wantName: "repo", + wantExplicit: false, }, { name: "gitlab https dot subdir normalized to root", input: "https://gitlab.com/user/repo/.", wantCloneURL: "https://gitlab.com/user/repo.git", wantName: "repo", + wantExplicit: false, }, { name: "bitbucket web URL with src/main", @@ -265,6 +309,7 @@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL: "https://bitbucket.org/team/skills.git", wantSubdir: "learn-and-update", wantName: "learn-and-update", + wantExplicit: false, }, { name: "bitbucket web URL with src/main trailing slash", @@ -272,12 +317,14 @@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL: "https://bitbucket.org/team/skills.git", wantSubdir: "learn-and-update", wantName: "learn-and-update", + wantExplicit: false, }, { name: "bitbucket web URL src/branch only (no subdir)", input: "https://bitbucket.org/team/skills/src/main", wantCloneURL: "https://bitbucket.org/team/skills.git", wantName: "skills", + wantExplicit: false, }, { name: "bitbucket web URL nested subdir", @@ -285,6 +332,7 @@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL: "https://bitbucket.org/team/skills.git", wantSubdir: "frontend/react", wantName: "react", + wantExplicit: false, }, { name: "gitlab web URL with -/tree/main", @@ -292,6 +340,7 @@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL: "https://gitlab.com/user/repo.git", wantSubdir: "path/to/skill", wantName: "skill", + wantExplicit: false, }, { name: "gitlab web URL with -/blob/main", @@ -299,12 +348,40 
@@ func TestParseSource_GitHTTPS(t *testing.T) { wantCloneURL: "https://gitlab.com/user/repo.git", wantSubdir: "path/to/skill", wantName: "skill", + wantExplicit: false, }, { name: "gitlab web URL -/tree/branch only", input: "https://gitlab.com/user/repo/-/tree/main", wantCloneURL: "https://gitlab.com/user/repo.git", wantName: "repo", + wantExplicit: false, + }, + // issue #124: blob URLs pointing directly at SKILL.md should resolve + // to the containing skill directory instead of a literal "SKILL.md" name. + { + name: "gitlab blob URL at root SKILL.md", + input: "https://gitlab.com/user/repo/-/blob/main/SKILL.md", + wantCloneURL: "https://gitlab.com/user/repo.git", + wantSubdir: "", + wantName: "repo", + wantExplicit: true, + }, + { + name: "gitlab blob URL at nested SKILL.md", + input: "https://gitlab.com/user/repo/-/blob/main/skills/foo/SKILL.md", + wantCloneURL: "https://gitlab.com/user/repo.git", + wantSubdir: "skills/foo", + wantName: "foo", + wantExplicit: true, + }, + { + name: "bitbucket src URL at nested SKILL.md", + input: "https://bitbucket.org/team/repo/src/main/skills/foo/SKILL.md", + wantCloneURL: "https://bitbucket.org/team/repo.git", + wantSubdir: "skills/foo", + wantName: "foo", + wantExplicit: true, }, } @@ -326,6 +403,9 @@ func TestParseSource_GitHTTPS(t *testing.T) { if source.Name != tt.wantName { t.Errorf("Name = %v, want %v", source.Name, tt.wantName) } + if source.TargetsExplicitSkill() != tt.wantExplicit { + t.Errorf("TargetsExplicitSkill() = %v, want %v", source.TargetsExplicitSkill(), tt.wantExplicit) + } }) } } @@ -536,28 +616,33 @@ func TestSource_TrackName(t *testing.T) { func TestStripGitBranchPrefix(t *testing.T) { tests := []struct { - name string - host string - subdir string - want string + name string + host string + subdir string + want string + explicit bool }{ - {"empty", "bitbucket.org", "", ""}, - {"bitbucket src/main/path", "bitbucket.org", "src/main/learn-and-update", "learn-and-update"}, - {"bitbucket src/main/nested", 
"bitbucket.org", "src/develop/a/b/c", "a/b/c"}, - {"bitbucket src/branch only", "bitbucket.org", "src/main", ""}, - {"bitbucket trailing slash", "bitbucket.org", "src/main/skill/", "skill"}, - {"gitlab -/tree/main/path", "gitlab.com", "-/tree/main/path/to/skill", "path/to/skill"}, - {"gitlab -/blob/main/path", "gitlab.com", "-/blob/main/path/to/skill", "path/to/skill"}, - {"gitlab -/tree/branch only", "gitlab.com", "-/tree/main", ""}, - {"non-platform passthrough", "example.com", "some/path", "some/path"}, - {"bitbucket host variant", "bitbucket.mycompany.com", "src/main/skill", "skill"}, + {"empty", "bitbucket.org", "", "", false}, + {"bitbucket src/main/path", "bitbucket.org", "src/main/learn-and-update", "learn-and-update", false}, + {"bitbucket src/main/nested", "bitbucket.org", "src/develop/a/b/c", "a/b/c", false}, + {"bitbucket src/branch only", "bitbucket.org", "src/main", "", false}, + {"bitbucket trailing slash", "bitbucket.org", "src/main/skill/", "skill", false}, + {"gitlab -/tree/main/path", "gitlab.com", "-/tree/main/path/to/skill", "path/to/skill", false}, + {"gitlab -/blob/main/path", "gitlab.com", "-/blob/main/path/to/skill", "path/to/skill", false}, + {"gitlab blob SKILL target", "gitlab.com", "-/blob/main/skills/foo/SKILL.md", "skills/foo", true}, + {"gitlab -/tree/branch only", "gitlab.com", "-/tree/main", "", false}, + {"non-platform passthrough", "example.com", "some/path", "some/path", false}, + {"bitbucket host variant", "bitbucket.mycompany.com", "src/main/skill", "skill", false}, } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - got := stripGitBranchPrefix(tt.host, tt.subdir) + got, explicit := stripGitBranchPrefix(tt.host, tt.subdir) if got != tt.want { t.Errorf("stripGitBranchPrefix(%q, %q) = %q, want %q", tt.host, tt.subdir, got, tt.want) } + if explicit != tt.explicit { + t.Errorf("stripGitBranchPrefix(%q, %q) explicit = %v, want %v", tt.host, tt.subdir, explicit, tt.explicit) + } }) } } diff --git 
a/internal/install/updatable_test.go b/internal/install/updatable_test.go index 2d8ca15b..c1a739c7 100644 --- a/internal/install/updatable_test.go +++ b/internal/install/updatable_test.go @@ -13,8 +13,29 @@ func createSkillWithMeta(t *testing.T, baseDir, name string, meta *SkillMeta) { os.MkdirAll(dir, 0755) os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("---\nname: "+name+"\n---\n"), 0644) if meta != nil { - if err := WriteMeta(dir, meta); err != nil { - t.Fatalf("write meta for %s: %v", name, err) + // Write to centralized .metadata.json + store, _ := LoadMetadata(baseDir) + key := name + group := "" + if idx := strings.LastIndex(name, "/"); idx >= 0 { + group = name[:idx] + key = name[idx+1:] + } + store.Set(key, &MetadataEntry{ + Source: meta.Source, + Kind: meta.Kind, + Type: meta.Type, + Group: group, + InstalledAt: meta.InstalledAt, + RepoURL: meta.RepoURL, + Subdir: meta.Subdir, + Version: meta.Version, + TreeHash: meta.TreeHash, + FileHashes: meta.FileHashes, + Branch: meta.Branch, + }) + if err := store.Save(baseDir); err != nil { + t.Fatalf("save metadata for %s: %v", name, err) } } } @@ -45,6 +66,12 @@ func TestGetUpdatableSkills_SkipsTrackedRepos(t *testing.T) { Source: "github.com/team/repo", Type: "github", }) + // Mark as tracked in the store + store, _ := LoadMetadata(src) + if e := store.Get("_team-repo"); e != nil { + e.Tracked = true + store.Save(src) + } // Also create a nested skill inside tracked repo nestedDir := filepath.Join(src, "_team-repo", "sub-skill") os.MkdirAll(nestedDir, 0755) @@ -86,7 +113,9 @@ func TestGetUpdatableSkills_Nested(t *testing.T) { nestedDir := filepath.Join(src, "group", "my-skill") os.MkdirAll(nestedDir, 0755) os.WriteFile(filepath.Join(nestedDir, "SKILL.md"), []byte("nested"), 0644) - WriteMeta(nestedDir, &SkillMeta{Source: "github.com/u/r", Type: "github"}) + store, _ := LoadMetadata(src) + store.Set("my-skill", &MetadataEntry{Source: "github.com/u/r", Type: "github", Group: "group"}) + store.Save(src) 
skills, err := GetUpdatableSkills(src) if err != nil { @@ -174,17 +203,20 @@ func TestFindRepoInstalls_MatchesByRepoURL(t *testing.T) { func TestFindRepoInstalls_MatchesNested(t *testing.T) { src := t.TempDir() + store, _ := LoadMetadata(src) // Skills under group/ for _, name := range []string{"scan", "learn", "archive"} { dir := filepath.Join(src, "feature-radar", name) os.MkdirAll(dir, 0755) os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("# "+name), 0644) - WriteMeta(dir, &SkillMeta{ + store.Set(name, &MetadataEntry{ Source: "https://github.com/runkids/feature-radar", Type: "github", RepoURL: "https://github.com/runkids/feature-radar.git", + Group: "feature-radar", }) } + store.Save(src) matches := FindRepoInstalls(src, "git@github.com:runkids/feature-radar.git") if len(matches) != 3 { @@ -213,10 +245,13 @@ func TestCheckCrossPathDuplicate_BlocksDifferentPath(t *testing.T) { // Existing install under group/ dir := filepath.Join(src, "my-group", "skill-a") os.MkdirAll(dir, 0755) - WriteMeta(dir, &SkillMeta{ + store, _ := LoadMetadata(src) + store.Set("skill-a", &MetadataEntry{ Source: "https://github.com/owner/repo", Type: "github", RepoURL: "https://github.com/owner/repo.git", + Group: "my-group", }) + store.Save(src) // Root install (no --into) should be blocked err := CheckCrossPathDuplicate(src, "https://github.com/owner/repo.git", "") diff --git a/internal/resource/agent.go b/internal/resource/agent.go new file mode 100644 index 00000000..44f082b6 --- /dev/null +++ b/internal/resource/agent.go @@ -0,0 +1,207 @@ +package resource + +import ( + "os" + "path/filepath" + "strings" + + "skillshare/internal/skillignore" + "skillshare/internal/utils" +) + +// AgentKind handles single-file .md agent resources. +type AgentKind struct{} + +var _ ResourceKind = AgentKind{} + +func (AgentKind) Kind() string { return "agent" } + +// Discover scans sourceDir for .md files, excluding conventional files +// (README.md, LICENSE.md, etc.) and hidden files. 
+// +// For tracked repos (directories starting with _): if the repo contains an +// agents/ subdirectory, only files inside agents/ are discovered. Otherwise +// the entire repo is walked with conventional excludes applied. +func (AgentKind) Discover(sourceDir string) ([]DiscoveredResource, error) { + walkRoot := utils.ResolveSymlink(sourceDir) + + // Read .agentignore for filtering + ignoreMatcher := skillignore.ReadAgentIgnoreMatcher(walkRoot) + + // Pre-scan: find tracked repos that have an agents/ subdirectory. + // When agents/ exists, only its contents count as agent files. + reposWithAgentsDir := map[string]bool{} + if entries, readErr := os.ReadDir(walkRoot); readErr == nil { + for _, e := range entries { + if !e.IsDir() || !utils.IsTrackedRepoDir(e.Name()) { + continue + } + agentsPath := filepath.Join(walkRoot, e.Name(), "agents") + if info, statErr := os.Stat(agentsPath); statErr == nil && info.IsDir() { + reposWithAgentsDir[e.Name()] = true + } + } + } + + var resources []DiscoveredResource + + err := filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error { + if err != nil { + return nil + } + + if info.IsDir() { + if info.Name() == ".git" || utils.IsHidden(info.Name()) && info.Name() != "." { + return filepath.SkipDir + } + // Skip ignored directories early + if ignoreMatcher.HasRules() && info.Name() != "." 
{ + relDir, relErr := filepath.Rel(walkRoot, path) + if relErr == nil { + relDir = strings.ReplaceAll(relDir, "\\", "/") + if ignoreMatcher.CanSkipDir(relDir) { + return filepath.SkipDir + } + } + } + // For tracked repos with agents/ subdir: skip non-agents children + relDir, relErr := filepath.Rel(walkRoot, path) + if relErr == nil { + parts := strings.SplitN(filepath.ToSlash(relDir), "/", 3) + if len(parts) >= 2 && reposWithAgentsDir[parts[0]] && parts[1] != "agents" { + return filepath.SkipDir + } + } + return nil + } + + // Only .md files + if !strings.HasSuffix(strings.ToLower(info.Name()), ".md") { + return nil + } + + // Skip conventional excludes + if ConventionalExcludes[info.Name()] { + return nil + } + + // Skip hidden files + if utils.IsHidden(info.Name()) { + return nil + } + + relPath, relErr := filepath.Rel(walkRoot, path) + if relErr != nil { + return nil + } + relPath = strings.ReplaceAll(relPath, "\\", "/") + + // For tracked repos with agents/ subdir: skip files at repo root level + // (e.g. _repo/CLAUDE.md) — only _repo/agents/** should be discovered. 
+ parts := strings.SplitN(relPath, "/", 3) + if len(parts) == 2 && reposWithAgentsDir[parts[0]] { + return nil + } + + // Apply .agentignore matching — mark as disabled but still include + disabled := ignoreMatcher.HasRules() && ignoreMatcher.Match(relPath, false) + + name := agentNameFromFile(path, info.Name()) + targets := utils.ParseFrontmatterList(path, "targets") + + isNested := strings.Contains(relPath, "/") + repoRelPath := findTrackedRepoRelPath(walkRoot, relPath) + + resources = append(resources, DiscoveredResource{ + Name: name, + Kind: "agent", + RelPath: relPath, + AbsPath: path, + IsNested: isNested, + IsInRepo: repoRelPath != "", + RepoRelPath: repoRelPath, + Disabled: disabled, + FlatName: AgentFlatName(relPath), + SourcePath: filepath.Join(sourceDir, relPath), + Targets: targets, + }) + + return nil + }) + + if err != nil { + return nil, err + } + + return resources, nil +} + +func findTrackedRepoRelPath(root, relPath string) string { + dir := filepath.Dir(relPath) + if dir == "." || dir == "" { + return "" + } + + parts := strings.Split(filepath.ToSlash(dir), "/") + for i, part := range parts { + if !utils.IsTrackedRepoDir(part) { + continue + } + candidate := strings.Join(parts[:i+1], "/") + gitDir := filepath.Join(root, filepath.FromSlash(candidate), ".git") + if info, err := os.Stat(gitDir); err == nil && info.IsDir() { + return candidate + } + } + + return "" +} + +// agentNameFromFile resolves an agent name. Checks frontmatter name field +// first, falls back to filename without .md extension. +func agentNameFromFile(filePath, fileName string) string { + name := utils.ParseFrontmatterField(filePath, "name") + if name != "" { + return name + } + return strings.TrimSuffix(fileName, ".md") +} + +// ResolveName extracts the agent name from an .md file. +// Checks frontmatter name field first, falls back to filename. 
+func (AgentKind) ResolveName(path string) string { + return agentNameFromFile(path, filepath.Base(path)) +} + +// FlatName flattens nested agent paths using the shared __ separator. +// Example: "curriculum/math-tutor.md" → "curriculum__math-tutor.md" +func (AgentKind) FlatName(relPath string) string { + return AgentFlatName(relPath) +} + +// AgentFlatName is the standalone flat name computation for agents. +// Agents must sync into flat target directories, so nested segments are +// encoded using the same path flattening rule as skills. +func AgentFlatName(relPath string) string { + return utils.PathToFlatName(relPath) +} + +// ActiveAgents returns only non-disabled agents from the given slice. +func ActiveAgents(agents []DiscoveredResource) []DiscoveredResource { + active := make([]DiscoveredResource, 0, len(agents)) + for _, a := range agents { + if !a.Disabled { + active = append(active, a) + } + } + return active +} + +// CreateLink creates a file symlink from dst pointing to src. +func (AgentKind) CreateLink(src, dst string) error { + return os.Symlink(src, dst) +} + +func (AgentKind) SupportsAudit() bool { return true } +func (AgentKind) SupportsTrack() bool { return true } +func (AgentKind) SupportsCollect() bool { return true } diff --git a/internal/resource/kind.go b/internal/resource/kind.go new file mode 100644 index 00000000..45df8127 --- /dev/null +++ b/internal/resource/kind.go @@ -0,0 +1,69 @@ +package resource + +// ResourceKind encapsulates per-kind behavior for skills and agents. +// Each kind defines how resources are discovered, named, linked, and validated. +type ResourceKind interface { + // Kind returns the resource kind identifier ("skill" or "agent"). + Kind() string + + // Discover scans sourceDir and returns all resources found. + Discover(sourceDir string) ([]DiscoveredResource, error) + + // ResolveName extracts the canonical name from a resource at the given path. + // For skills: reads SKILL.md frontmatter name field. 
+ // For agents: reads frontmatter name field first, falls back to filename. + ResolveName(path string) string + + // FlatName computes the flattened name used in target directories. + // For skills: path/to/skill → path__to__skill (nested separator). + // For agents: dir/file.md → dir__file.md (same __ separator, .md suffix kept). + FlatName(relPath string) string + + // CreateLink creates a symlink at dst pointing to src. + // For skills: directory symlink. For agents: file symlink. + // Both use os.Symlink; the distinction is semantic (unit shape). + CreateLink(src, dst string) error + + // SupportsAudit reports whether this kind supports security audit scanning. + SupportsAudit() bool + + // SupportsTrack reports whether this kind supports tracked repo updates. + SupportsTrack() bool + + // SupportsCollect reports whether this kind supports collecting from targets. + SupportsCollect() bool +} + +// DiscoveredResource represents a resource found during source directory scan. +// Used for both skills and agents. +type DiscoveredResource struct { + Name string // Canonical name (from frontmatter or filename) + Kind string // "skill" or "agent" + RelPath string // Relative path from source root + AbsPath string // Full absolute path + IsNested bool // Whether this resource is inside a subdirectory + FlatName string // Flattened name for target directories + IsInRepo bool // Whether this resource is inside a tracked repo + RepoRelPath string // Relative path of the tracked repo root (when IsInRepo) + Disabled bool // Whether this resource is ignored by ignore file + SourcePath string // Full path preserving caller's logical path (may differ from AbsPath if symlinked) + Targets []string // Per-resource target restrictions from frontmatter (nil = all targets) +} + +// ConventionalExcludes are filenames excluded from agent discovery. +// These are well-known convention files that appear in repos but are not agents.
+var ConventionalExcludes = map[string]bool{ + "README.md": true, + "CHANGELOG.md": true, + "LICENSE.md": true, + "HISTORY.md": true, + "SECURITY.md": true, + "SKILL.md": true, + "CLAUDE.md": true, + "AGENTS.md": true, + "GEMINI.md": true, + "COPILOT.md": true, + "CONTRIBUTING.md": true, + "CODE_OF_CONDUCT.md": true, + "SUPPORT.md": true, +} diff --git a/internal/resource/kind_test.go b/internal/resource/kind_test.go new file mode 100644 index 00000000..08c213bc --- /dev/null +++ b/internal/resource/kind_test.go @@ -0,0 +1,439 @@ +package resource + +import ( + "os" + "path/filepath" + "testing" +) + +// --- SkillKind tests --- + +func TestSkillKind_Kind(t *testing.T) { + k := SkillKind{} + if k.Kind() != "skill" { + t.Errorf("SkillKind.Kind() = %q, want %q", k.Kind(), "skill") + } +} + +func TestSkillKind_Discover(t *testing.T) { + dir := t.TempDir() + + // Create two skills + os.MkdirAll(filepath.Join(dir, "my-skill"), 0o755) + os.WriteFile(filepath.Join(dir, "my-skill", "SKILL.md"), []byte("---\nname: my-skill\n---\n# Content"), 0o644) + + os.MkdirAll(filepath.Join(dir, "another"), 0o755) + os.WriteFile(filepath.Join(dir, "another", "SKILL.md"), []byte("---\nname: another\n---\n# Content"), 0o644) + + // Non-skill directory (no SKILL.md) + os.MkdirAll(filepath.Join(dir, "not-a-skill"), 0o755) + os.WriteFile(filepath.Join(dir, "not-a-skill", "README.md"), []byte("# Readme"), 0o644) + + k := SkillKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 2 { + t.Fatalf("expected 2 resources, got %d", len(resources)) + } + + names := map[string]bool{} + for _, r := range resources { + names[r.Name] = true + if r.Kind != "skill" { + t.Errorf("resource %q has Kind=%q, want %q", r.Name, r.Kind, "skill") + } + } + + if !names["my-skill"] { + t.Error("expected to discover 'my-skill'") + } + if !names["another"] { + t.Error("expected to discover 'another'") + } +} + +func TestSkillKind_Discover_Nested(t 
*testing.T) { + dir := t.TempDir() + + os.MkdirAll(filepath.Join(dir, "_team", "frontend", "ui"), 0o755) + os.WriteFile(filepath.Join(dir, "_team", "frontend", "ui", "SKILL.md"), []byte("---\nname: ui\n---\n"), 0o644) + + k := SkillKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 1 { + t.Fatalf("expected 1 resource, got %d", len(resources)) + } + + r := resources[0] + if r.Name != "ui" { + t.Errorf("Name = %q, want %q", r.Name, "ui") + } + if r.FlatName != "_team__frontend__ui" { + t.Errorf("FlatName = %q, want %q", r.FlatName, "_team__frontend__ui") + } + if !r.IsNested { + t.Error("expected IsNested=true for nested skill") + } + if !r.IsInRepo { + t.Error("expected IsInRepo=true for _-prefixed dir") + } +} + +func TestSkillKind_ResolveName_FromFrontmatter(t *testing.T) { + dir := t.TempDir() + skillDir := filepath.Join(dir, "my-skill") + os.MkdirAll(skillDir, 0o755) + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("---\nname: custom-name\n---\n"), 0o644) + + k := SkillKind{} + name := k.ResolveName(skillDir) + if name != "custom-name" { + t.Errorf("ResolveName = %q, want %q", name, "custom-name") + } +} + +func TestSkillKind_ResolveName_FallbackToDirName(t *testing.T) { + dir := t.TempDir() + skillDir := filepath.Join(dir, "fallback-skill") + os.MkdirAll(skillDir, 0o755) + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("---\n---\n"), 0o644) + + k := SkillKind{} + name := k.ResolveName(skillDir) + if name != "fallback-skill" { + t.Errorf("ResolveName = %q, want %q", name, "fallback-skill") + } +} + +func TestSkillKind_FlatName(t *testing.T) { + k := SkillKind{} + + tests := []struct { + relPath string + want string + }{ + {"my-skill", "my-skill"}, + {"_team/frontend/ui", "_team__frontend__ui"}, + } + + for _, tt := range tests { + got := k.FlatName(tt.relPath) + if got != tt.want { + t.Errorf("FlatName(%q) = %q, want %q", tt.relPath, got, tt.want) + } + } +} + +func 
TestSkillKind_FeatureGates(t *testing.T) { + k := SkillKind{} + if !k.SupportsAudit() { + t.Error("SkillKind should support audit") + } + if !k.SupportsTrack() { + t.Error("SkillKind should support track") + } + if !k.SupportsCollect() { + t.Error("SkillKind should support collect") + } +} + +// --- AgentKind tests --- + +func TestAgentKind_Kind(t *testing.T) { + k := AgentKind{} + if k.Kind() != "agent" { + t.Errorf("AgentKind.Kind() = %q, want %q", k.Kind(), "agent") + } +} + +func TestAgentKind_Discover(t *testing.T) { + dir := t.TempDir() + + // Create agent files + os.WriteFile(filepath.Join(dir, "tutor.md"), []byte("# Tutor agent"), 0o644) + os.WriteFile(filepath.Join(dir, "reviewer.md"), []byte("# Reviewer agent"), 0o644) + + // Conventional excludes should be skipped + os.WriteFile(filepath.Join(dir, "README.md"), []byte("# Readme"), 0o644) + os.WriteFile(filepath.Join(dir, "LICENSE.md"), []byte("# License"), 0o644) + os.WriteFile(filepath.Join(dir, "SKILL.md"), []byte("---\nname: test\n---\n"), 0o644) + os.WriteFile(filepath.Join(dir, "CLAUDE.md"), []byte("# Claude config"), 0o644) + os.WriteFile(filepath.Join(dir, "AGENTS.md"), []byte("# Agents config"), 0o644) + os.WriteFile(filepath.Join(dir, "GEMINI.md"), []byte("# Gemini config"), 0o644) + + // Non-.md files should be skipped + os.WriteFile(filepath.Join(dir, "config.yaml"), []byte("key: value"), 0o644) + + // Hidden files should be skipped + os.WriteFile(filepath.Join(dir, ".hidden.md"), []byte("# Hidden"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 2 { + t.Fatalf("expected 2 resources, got %d: %v", len(resources), resources) + } + + names := map[string]bool{} + for _, r := range resources { + names[r.Name] = true + if r.Kind != "agent" { + t.Errorf("resource %q has Kind=%q, want %q", r.Name, r.Kind, "agent") + } + } + + if !names["tutor"] { + t.Error("expected to discover 'tutor'") + } + if 
!names["reviewer"] { + t.Error("expected to discover 'reviewer'") + } +} + +func TestAgentKind_Discover_Nested(t *testing.T) { + dir := t.TempDir() + + os.MkdirAll(filepath.Join(dir, "curriculum"), 0o755) + os.WriteFile(filepath.Join(dir, "curriculum", "math-tutor.md"), []byte("# Math tutor"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 1 { + t.Fatalf("expected 1 resource, got %d", len(resources)) + } + + r := resources[0] + if r.Name != "math-tutor" { + t.Errorf("Name = %q, want %q", r.Name, "math-tutor") + } + if r.RelPath != "curriculum/math-tutor.md" { + t.Errorf("RelPath = %q, want %q", r.RelPath, "curriculum/math-tutor.md") + } + if r.FlatName != "curriculum__math-tutor.md" { + t.Errorf("FlatName = %q, want %q", r.FlatName, "curriculum__math-tutor.md") + } + if !r.IsNested { + t.Error("expected IsNested=true for nested agent") + } +} + +func TestAgentKind_ResolveName_FromFilename(t *testing.T) { + dir := t.TempDir() + agentFile := filepath.Join(dir, "tutor.md") + os.WriteFile(agentFile, []byte("# Tutor agent"), 0o644) + + k := AgentKind{} + name := k.ResolveName(agentFile) + if name != "tutor" { + t.Errorf("ResolveName = %q, want %q", name, "tutor") + } +} + +func TestAgentKind_ResolveName_FromFrontmatter(t *testing.T) { + dir := t.TempDir() + agentFile := filepath.Join(dir, "tutor.md") + os.WriteFile(agentFile, []byte("---\nname: curriculum-tutor\n---\n# Tutor"), 0o644) + + k := AgentKind{} + name := k.ResolveName(agentFile) + if name != "curriculum-tutor" { + t.Errorf("ResolveName = %q, want %q", name, "curriculum-tutor") + } +} + +func TestAgentKind_FlatName(t *testing.T) { + k := AgentKind{} + + tests := []struct { + relPath string + want string + }{ + {"tutor.md", "tutor.md"}, + {"curriculum/math-tutor.md", "curriculum__math-tutor.md"}, + {"a/b/deep.md", "a__b__deep.md"}, + } + + for _, tt := range tests { + got := k.FlatName(tt.relPath) + if got != 
tt.want { + t.Errorf("FlatName(%q) = %q, want %q", tt.relPath, got, tt.want) + } + } +} + +func TestAgentKind_FeatureGates(t *testing.T) { + k := AgentKind{} + if !k.SupportsAudit() { + t.Error("AgentKind should support audit") + } + if !k.SupportsTrack() { + t.Error("AgentKind should support track") + } + if !k.SupportsCollect() { + t.Error("AgentKind should support collect") + } +} + +func TestAgentKind_Discover_EmptyDir(t *testing.T) { + dir := t.TempDir() + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + if len(resources) != 0 { + t.Errorf("expected 0 resources, got %d", len(resources)) + } +} + +func TestAgentKind_Discover_RespectsAgentignore(t *testing.T) { + dir := t.TempDir() + + os.WriteFile(filepath.Join(dir, "active.md"), []byte("# Active"), 0o644) + os.WriteFile(filepath.Join(dir, "ignored.md"), []byte("# Ignored"), 0o644) + + // Create .agentignore + os.WriteFile(filepath.Join(dir, ".agentignore"), []byte("ignored.md\n"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 2 { + t.Fatalf("expected 2 resources (ignored included as disabled), got %d", len(resources)) + } + + // Find each by name and check Disabled flag + var active, ignored *DiscoveredResource + for i := range resources { + switch resources[i].Name { + case "active": + active = &resources[i] + case "ignored": + ignored = &resources[i] + } + } + if active == nil { + t.Fatal("active agent not found") + } + if active.Disabled { + t.Error("active agent should not be disabled") + } + if ignored == nil { + t.Fatal("ignored agent not found") + } + if !ignored.Disabled { + t.Error("ignored agent should be disabled") + } +} + +func TestAgentKind_Discover_SkipsGitDir(t *testing.T) { + dir := t.TempDir() + + os.MkdirAll(filepath.Join(dir, ".git"), 0o755) + os.WriteFile(filepath.Join(dir, ".git", "config.md"), []byte("# git config"), 
0o644) + os.WriteFile(filepath.Join(dir, "real-agent.md"), []byte("# Agent"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + if len(resources) != 1 { + t.Fatalf("expected 1 resource, got %d", len(resources)) + } + if resources[0].Name != "real-agent" { + t.Errorf("Name = %q, want %q", resources[0].Name, "real-agent") + } +} + +func TestAgentKind_Discover_TrackedRepoWithAgentsDir(t *testing.T) { + dir := t.TempDir() + + // Tracked repo WITH agents/ subdir — only agents/ contents should be discovered + repo := filepath.Join(dir, "_team-agents") + os.MkdirAll(filepath.Join(repo, ".git"), 0o755) + os.MkdirAll(filepath.Join(repo, "agents"), 0o755) + os.MkdirAll(filepath.Join(repo, "docs"), 0o755) + os.WriteFile(filepath.Join(repo, "CLAUDE.md"), []byte("# Claude config"), 0o644) + os.WriteFile(filepath.Join(repo, "README.md"), []byte("# Readme"), 0o644) + os.WriteFile(filepath.Join(repo, "intro.md"), []byte("# Not an agent"), 0o644) + os.WriteFile(filepath.Join(repo, "agents", "reviewer.md"), []byte("# Reviewer"), 0o644) + os.WriteFile(filepath.Join(repo, "agents", "tutor.md"), []byte("# Tutor"), 0o644) + os.WriteFile(filepath.Join(repo, "docs", "guide.md"), []byte("# Guide"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + names := map[string]bool{} + for _, r := range resources { + names[r.Name] = true + } + + if len(resources) != 2 { + t.Fatalf("expected 2 agents (only from agents/), got %d: %v", len(resources), names) + } + if !names["reviewer"] { + t.Error("expected to discover 'reviewer' from agents/") + } + if !names["tutor"] { + t.Error("expected to discover 'tutor' from agents/") + } +} + +func TestAgentKind_Discover_TrackedRepoWithoutAgentsDir(t *testing.T) { + dir := t.TempDir() + + // Tracked repo WITHOUT agents/ subdir — whole repo is agents (minus excludes) + repo := filepath.Join(dir, 
"_solo-agents") + os.MkdirAll(filepath.Join(repo, ".git"), 0o755) + os.WriteFile(filepath.Join(repo, "CLAUDE.md"), []byte("# Claude config"), 0o644) + os.WriteFile(filepath.Join(repo, "README.md"), []byte("# Readme"), 0o644) + os.WriteFile(filepath.Join(repo, "code-reviewer.md"), []byte("# Reviewer"), 0o644) + os.WriteFile(filepath.Join(repo, "debugging.md"), []byte("# Debugger"), 0o644) + + k := AgentKind{} + resources, err := k.Discover(dir) + if err != nil { + t.Fatalf("Discover error: %v", err) + } + + names := map[string]bool{} + for _, r := range resources { + names[r.Name] = true + } + + if len(resources) != 2 { + t.Fatalf("expected 2 agents (CLAUDE.md + README.md excluded), got %d: %v", len(resources), names) + } + if !names["code-reviewer"] { + t.Error("expected to discover 'code-reviewer'") + } + if !names["debugging"] { + t.Error("expected to discover 'debugging'") + } + if names["CLAUDE"] { + t.Error("CLAUDE.md should be excluded as conventional file") + } +} diff --git a/internal/resource/skill.go b/internal/resource/skill.go new file mode 100644 index 00000000..a16c85db --- /dev/null +++ b/internal/resource/skill.go @@ -0,0 +1,100 @@ +package resource + +import ( + "os" + "path/filepath" + "strings" + + "skillshare/internal/utils" +) + +// SkillKind handles directory-based skill resources identified by SKILL.md. +type SkillKind struct{} + +var _ ResourceKind = SkillKind{} + +func (SkillKind) Kind() string { return "skill" } + +// Discover scans sourceDir for directories containing SKILL.md. +// This is a simplified discovery for the resource package; the full +// discovery with ignore support, frontmatter parsing, and context +// collection remains in internal/sync/discover_walk.go. 
+func (SkillKind) Discover(sourceDir string) ([]DiscoveredResource, error) { + walkRoot := utils.ResolveSymlink(sourceDir) + + var resources []DiscoveredResource + + err := filepath.Walk(walkRoot, func(path string, info os.FileInfo, err error) error { + if err != nil { + return nil + } + + if info.IsDir() && info.Name() == ".git" { + return filepath.SkipDir + } + + if !info.IsDir() && info.Name() == "SKILL.md" { + skillDir := filepath.Dir(path) + relPath, relErr := filepath.Rel(walkRoot, skillDir) + if relErr != nil || relPath == "." { + return nil + } + relPath = strings.ReplaceAll(relPath, "\\", "/") + + name := utils.ParseFrontmatterField(filepath.Join(skillDir, "SKILL.md"), "name") + if name == "" { + name = filepath.Base(skillDir) + } + + isInRepo := false + parts := strings.Split(relPath, "/") + if len(parts) > 0 && utils.IsTrackedRepoDir(parts[0]) { + isInRepo = true + } + + resources = append(resources, DiscoveredResource{ + Name: name, + Kind: "skill", + RelPath: relPath, + AbsPath: skillDir, + IsNested: strings.Contains(relPath, "/"), + FlatName: utils.PathToFlatName(relPath), + IsInRepo: isInRepo, + SourcePath: filepath.Join(sourceDir, relPath), + }) + } + + return nil + }) + + if err != nil { + return nil, err + } + + return resources, nil +} + +// ResolveName reads the name field from SKILL.md frontmatter. +// Falls back to directory base name if frontmatter has no name. +func (SkillKind) ResolveName(path string) string { + skillFile := filepath.Join(path, "SKILL.md") + name := utils.ParseFrontmatterField(skillFile, "name") + if name != "" { + return name + } + return filepath.Base(path) +} + +// FlatName converts a relative path to a flat name using __ separator. +func (SkillKind) FlatName(relPath string) string { + return utils.PathToFlatName(relPath) +} + +// CreateLink creates a directory symlink from dst pointing to src. 
+func (SkillKind) CreateLink(src, dst string) error { + return os.Symlink(src, dst) +} + +func (SkillKind) SupportsAudit() bool { return true } +func (SkillKind) SupportsTrack() bool { return true } +func (SkillKind) SupportsCollect() bool { return true } diff --git a/internal/server/agent_helpers.go b/internal/server/agent_helpers.go new file mode 100644 index 00000000..f8013966 --- /dev/null +++ b/internal/server/agent_helpers.go @@ -0,0 +1,116 @@ +package server + +import ( + "os" + "path/filepath" + + "skillshare/internal/config" + "skillshare/internal/resource" + "skillshare/internal/utils" +) + +// Kind constants for diff/sync operations. +const ( + kindSkill = "skill" + kindAgent = "agent" +) + +// discoverAllAgents discovers all agents (including disabled) from the source directory. +func discoverAllAgents(agentsSource string) []resource.DiscoveredResource { + if agentsSource == "" { + return nil + } + discovered, _ := resource.AgentKind{}.Discover(agentsSource) + return discovered +} + +// discoverActiveAgents discovers agents from the given source directory, +// returning only non-disabled agents. Returns nil if source is empty. +func discoverActiveAgents(agentsSource string) []resource.DiscoveredResource { + return resource.ActiveAgents(discoverAllAgents(agentsSource)) +} + +// agentIgnorePayload returns JSON-serialisable fields describing .agentignore state. +// Pass pre-discovered agents to avoid duplicate filesystem traversal; nil triggers discovery. 
+func agentIgnorePayload(agentsSource string, all []resource.DiscoveredResource) map[string]any { + root := "" + ignoredNames := []string{} + if agentsSource != "" { + resolved := utils.ResolveSymlink(agentsSource) + ignoreFile := filepath.Join(resolved, ".agentignore") + if _, err := os.Stat(ignoreFile); err == nil { + root = ignoreFile + } + if all == nil { + all = discoverAllAgents(agentsSource) + } + for _, a := range all { + if a.Disabled { + ignoredNames = append(ignoredNames, a.RelPath) + } + } + } + return map[string]any{ + "agent_ignore_root": root, + "agent_ignored_count": len(ignoredNames), + "agent_ignored_skills": ignoredNames, + } +} + +// resolveAgentPath returns the expanded agent target path for a target, +// checking user config first, then builtin defaults. Returns "" if no path. +func resolveAgentPath(target config.TargetConfig, builtinAgents map[string]config.TargetConfig, name string) string { + if ac := target.AgentsConfig(); ac.Path != "" { + return config.ExpandPath(ac.Path) + } + if builtin, ok := builtinAgents[name]; ok { + return config.ExpandPath(builtin.Path) + } + return "" +} + +// builtinAgentTargets returns the builtin agent target map for the server's mode. +func (s *Server) builtinAgentTargets() map[string]config.TargetConfig { + if s.IsProjectMode() { + return config.ProjectAgentTargets() + } + return config.DefaultAgentTargets() +} + +// mergeAgentDiffItems appends agent diff items into the existing diffs slice, +// merging with an existing target entry or creating a new one. +func mergeAgentDiffItems(diffs []diffTarget, name string, items []diffItem) []diffTarget { + for i := range diffs { + if diffs[i].Target == name { + diffs[i].Items = append(diffs[i].Items, items...) + return diffs + } + } + return append(diffs, diffTarget{ + Target: name, + Items: items, + }) +} + +// appendAgentDiffs merges agent diff items for every agent-capable target. 
+// Unlike sync-matrix, diff must still inspect targets when the source is empty +// so orphaned synced agents surface as prune drift, matching skills behavior. +func (s *Server) appendAgentDiffs(diffs []diffTarget, targets map[string]config.TargetConfig, agentsSource, filterTarget string) []diffTarget { + agents := discoverActiveAgents(agentsSource) + builtinAgents := s.builtinAgentTargets() + + for name, target := range targets { + if filterTarget != "" && filterTarget != name { + continue + } + agentPath := resolveAgentPath(target, builtinAgents, name) + if agentPath == "" { + continue + } + if items := computeAgentTargetDiff(agentPath, agents); len(items) > 0 { + diffs = mergeAgentDiffItems(diffs, name, items) + } + } + + return diffs +} diff --git a/internal/server/handler_agent_diff.go b/internal/server/handler_agent_diff.go new file mode 100644 index 00000000..671ff94c --- /dev/null +++ b/internal/server/handler_agent_diff.go @@ -0,0 +1,108 @@ +package server + +import ( + "os" + "path/filepath" + "strings" + + "skillshare/internal/resource" + "skillshare/internal/utils" +) + +// computeAgentTargetDiff computes diff items for agents in a single target directory. +// Returns items with Kind="agent" for each pending action (link, update, skip, prune, local).
+func computeAgentTargetDiff(targetDir string, agents []resource.DiscoveredResource) []diffItem { + var items []diffItem + + // Build expected set + expected := make(map[string]resource.DiscoveredResource, len(agents)) + for _, a := range agents { + expected[a.FlatName] = a + } + + // Read existing .md files in target + existing := make(map[string]os.FileMode) + if entries, err := os.ReadDir(targetDir); err == nil { + for _, e := range entries { + if e.IsDir() || !strings.HasSuffix(strings.ToLower(e.Name()), ".md") { + continue + } + if resource.ConventionalExcludes[e.Name()] { + continue + } + existing[e.Name()] = e.Type() + } + } + + // Missing agents → link + for flatName, agent := range expected { + fileType, ok := existing[flatName] + if !ok { + items = append(items, diffItem{ + Skill: flatName, + Action: "link", + Reason: "source only", + Kind: kindAgent, + }) + continue + } + + // Exists — check if symlink points to correct source + targetPath := filepath.Join(targetDir, flatName) + if fileType&os.ModeSymlink != 0 || utils.IsSymlinkOrJunction(targetPath) { + absLink, err := utils.ResolveLinkTarget(targetPath) + if err != nil { + items = append(items, diffItem{ + Skill: flatName, + Action: "update", + Reason: "link target unreadable", + Kind: kindAgent, + }) + continue + } + absSource, _ := filepath.Abs(agent.AbsPath) + if !utils.PathsEqual(absLink, absSource) { + items = append(items, diffItem{ + Skill: flatName, + Action: "update", + Reason: "symlink points elsewhere", + Kind: kindAgent, + }) + } + continue + } + + // Match skills diff semantics: a local file blocks sync unless forced. 
+ items = append(items, diffItem{ + Skill: flatName, + Action: "skip", + Reason: "local copy (sync --force to replace)", + Kind: kindAgent, + }) + } + + // Orphan/local detection + for name, fileType := range existing { + if _, ok := expected[name]; ok { + continue + } + targetPath := filepath.Join(targetDir, name) + if fileType&os.ModeSymlink != 0 || utils.IsSymlinkOrJunction(targetPath) { + items = append(items, diffItem{ + Skill: name, + Action: "prune", + Reason: "orphan symlink", + Kind: kindAgent, + }) + } else { + items = append(items, diffItem{ + Skill: name, + Action: "local", + Reason: "local file", + Kind: kindAgent, + }) + } + } + + return items +} diff --git a/internal/server/handler_agent_diff_test.go b/internal/server/handler_agent_diff_test.go new file mode 100644 index 00000000..7f125109 --- /dev/null +++ b/internal/server/handler_agent_diff_test.go @@ -0,0 +1,105 @@ +package server + +import ( + "os" + "path/filepath" + "testing" + + "skillshare/internal/resource" +) + +func TestComputeAgentTargetDiff_MissingInTarget(t *testing.T) { + targetDir := t.TempDir() + + agents := []resource.DiscoveredResource{ + {FlatName: "helper.md", AbsPath: "/src/helper.md", RelPath: "helper.md"}, + } + + items := computeAgentTargetDiff(targetDir, agents) + + if len(items) != 1 { + t.Fatalf("expected 1 item, got %d", len(items)) + } + if items[0].Action != "link" { + t.Errorf("expected action 'link', got %q", items[0].Action) + } + if items[0].Kind != "agent" { + t.Errorf("expected kind 'agent', got %q", items[0].Kind) + } +} + +func TestComputeAgentTargetDiff_OrphanSymlink(t *testing.T) { + targetDir := t.TempDir() + os.Symlink("/nonexistent/old.md", filepath.Join(targetDir, "orphan.md")) + + items := computeAgentTargetDiff(targetDir, nil) + + if len(items) != 1 { + t.Fatalf("expected 1 item, got %d", len(items)) + } + if items[0].Action != "prune" { + t.Errorf("expected action 'prune', got %q", items[0].Action) + } +} + +func TestComputeAgentTargetDiff_LocalFile(t 
*testing.T) { + targetDir := t.TempDir() + os.WriteFile(filepath.Join(targetDir, "local.md"), []byte("# Local"), 0644) + + items := computeAgentTargetDiff(targetDir, nil) + + if len(items) != 1 { + t.Fatalf("expected 1 item, got %d", len(items)) + } + if items[0].Action != "local" { + t.Errorf("expected action 'local', got %q", items[0].Action) + } +} + +func TestComputeAgentTargetDiff_SymlinkPointsElsewhere(t *testing.T) { + sourceDir := t.TempDir() + otherDir := t.TempDir() + targetDir := t.TempDir() + + srcFile := filepath.Join(sourceDir, "agent.md") + os.WriteFile(srcFile, []byte("# Agent"), 0644) + otherFile := filepath.Join(otherDir, "agent.md") + os.WriteFile(otherFile, []byte("# Other"), 0644) + + // Symlink points to otherFile, not srcFile + os.Symlink(otherFile, filepath.Join(targetDir, "agent.md")) + + agents := []resource.DiscoveredResource{ + {FlatName: "agent.md", AbsPath: srcFile, RelPath: "agent.md"}, + } + + items := computeAgentTargetDiff(targetDir, agents) + + if len(items) != 1 { + t.Fatalf("expected 1 item, got %d", len(items)) + } + if items[0].Action != "update" { + t.Errorf("expected action 'update', got %q", items[0].Action) + } + if items[0].Reason != "symlink points elsewhere" { + t.Errorf("expected reason 'symlink points elsewhere', got %q", items[0].Reason) + } +} + +func TestComputeAgentTargetDiff_InSync(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + srcFile := filepath.Join(sourceDir, "agent.md") + os.WriteFile(srcFile, []byte("# Agent"), 0644) + os.Symlink(srcFile, filepath.Join(targetDir, "agent.md")) + + agents := []resource.DiscoveredResource{ + {FlatName: "agent.md", AbsPath: srcFile, RelPath: "agent.md"}, + } + + items := computeAgentTargetDiff(targetDir, agents) + + if len(items) != 0 { + t.Fatalf("expected 0 items (in sync), got %d", len(items)) + } +} diff --git a/internal/server/handler_agentignore.go b/internal/server/handler_agentignore.go new file mode 100644 index 00000000..cf22561f --- /dev/null 
+++ b/internal/server/handler_agentignore.go @@ -0,0 +1,125 @@ +package server + +import ( + "encoding/json" + "errors" + "net/http" + "os" + "path/filepath" + "time" + + "skillshare/internal/resource" + "skillshare/internal/skillignore" + "skillshare/internal/utils" +) + +type agentignoreStats struct { + PatternCount int `json:"pattern_count"` + IgnoredCount int `json:"ignored_count"` + Patterns []string `json:"patterns"` + IgnoredAgents []string `json:"ignored_agents"` +} + +type agentignoreResponse struct { + Exists bool `json:"exists"` + Path string `json:"path"` + Raw string `json:"raw"` + Stats *agentignoreStats `json:"stats,omitempty"` +} + +func (s *Server) handleGetAgentignore(w http.ResponseWriter, _ *http.Request) { + s.mu.RLock() + source := s.agentsSource() + s.mu.RUnlock() + + if source == "" { + writeJSON(w, agentignoreResponse{Path: "", Raw: ""}) + return + } + + resolved := utils.ResolveSymlink(source) + ignorePath := filepath.Join(resolved, ".agentignore") + + raw, err := os.ReadFile(ignorePath) + + resp := agentignoreResponse{ + Exists: err == nil, + Path: ignorePath, + Raw: string(raw), + } + + // Discover agents to compute ignore stats + matcher := skillignore.ReadAgentIgnoreMatcher(resolved) + if matcher.HasRules() { + patterns := matcher.Patterns() + if patterns == nil { + patterns = []string{} + } + + all, _ := resource.AgentKind{}.Discover(source) + var ignored []string + for _, a := range all { + if a.Disabled { + ignored = append(ignored, a.RelPath) + } + } + if ignored == nil { + ignored = []string{} + } + + resp.Stats = &agentignoreStats{ + PatternCount: len(patterns), + IgnoredCount: len(ignored), + Patterns: patterns, + IgnoredAgents: ignored, + } + } + + writeJSON(w, resp) +} + +func (s *Server) handlePutAgentignore(w http.ResponseWriter, r *http.Request) { + start := time.Now() + s.mu.Lock() + defer s.mu.Unlock() + + var body struct { + Raw string `json:"raw"` + } + if err := json.NewDecoder(r.Body).Decode(&body); err != nil { + 
writeError(w, http.StatusBadRequest, "invalid JSON body") + return + } + + source := s.agentsSource() + if source == "" { + writeError(w, http.StatusBadRequest, "agents source not configured") + return + } + + resolved := utils.ResolveSymlink(source) + ignorePath := filepath.Join(resolved, ".agentignore") + + if body.Raw == "" { + if err := os.Remove(ignorePath); err != nil && !errors.Is(err, os.ErrNotExist) { + writeError(w, http.StatusInternalServerError, "failed to delete .agentignore: "+err.Error()) + return + } + } else { + // Ensure directory exists + if err := os.MkdirAll(resolved, 0755); err != nil { + writeError(w, http.StatusInternalServerError, "failed to create agents directory: "+err.Error()) + return + } + if err := os.WriteFile(ignorePath, []byte(body.Raw), 0644); err != nil { + writeError(w, http.StatusInternalServerError, "failed to write .agentignore: "+err.Error()) + return + } + } + + s.writeOpsLog("agentignore", "ok", start, map[string]any{ + "scope": "ui", + }, "") + + writeJSON(w, map[string]any{"success": true}) +} diff --git a/internal/server/handler_audit.go b/internal/server/handler_audit.go index 8056e308..3a7dcb4f 100644 --- a/internal/server/handler_audit.go +++ b/internal/server/handler_audit.go @@ -8,8 +8,8 @@ import ( "time" "skillshare/internal/audit" + "skillshare/internal/resource" "skillshare/internal/sync" - "skillshare/internal/utils" ) type auditFindingResponse struct { @@ -40,6 +40,7 @@ type auditResultResponse struct { AuditableBytes int64 `json:"auditableBytes"` Analyzability float64 `json:"analyzability"` TierProfile audit.TierProfile `json:"tierProfile"` + Kind string `json:"kind,omitempty"` } type auditSummary struct { @@ -67,10 +68,46 @@ type skillEntry struct { path string } +// discoverAuditAgents discovers agents (individual .md files) for audit scanning. 
+func discoverAuditAgents(source string) ([]skillEntry, error) { + if source == "" { + return []skillEntry{}, nil + } + if _, err := os.Stat(source); err != nil { + if os.IsNotExist(err) { + return []skillEntry{}, nil + } + return nil, err + } + + discovered, err := resource.AgentKind{}.Discover(source) + if err != nil { + return nil, err + } + var agents []skillEntry + for _, d := range discovered { + agents = append(agents, skillEntry{name: d.FlatName, path: d.AbsPath}) + } + return agents, nil +} + // discoverAuditSkills discovers and deduplicates skills for audit scanning. func discoverAuditSkills(source string) ([]skillEntry, error) { + if source == "" { + return []skillEntry{}, nil + } + if _, err := os.Stat(source); err != nil { + if os.IsNotExist(err) { + return []skillEntry{}, nil + } + return nil, err + } + discovered, err := sync.DiscoverSourceSkills(source) if err != nil { + if os.IsNotExist(err) { + return []skillEntry{}, nil + } return nil, err } @@ -85,18 +122,6 @@ func discoverAuditSkills(source string) ([]skillEntry, error) { skills = append(skills, skillEntry{d.FlatName, d.SourcePath}) } - entries, _ := os.ReadDir(source) - for _, e := range entries { - if !e.IsDir() || utils.IsHidden(e.Name()) { - continue - } - p := filepath.Join(source, e.Name()) - if !seen[p] { - seen[p] = true - skills = append(skills, skillEntry{e.Name(), p}) - } - } - return skills, nil } @@ -249,10 +274,19 @@ func processAuditResults(skills []skillEntry, scanned []audit.ScanOutput, policy } } +// resolveAuditSource returns the source directory, result kind label, and whether agents are being scanned. +func (s *Server) resolveAuditSource(r *http.Request) (string, string, bool) { + kind := r.URL.Query().Get("kind") + if kind == "agents" { + return s.agentsSource(), "agent", true + } + return s.cfg.Source, "skill", false +} + func (s *Server) handleAuditAll(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before I/O. 
s.mu.RLock() - source := s.cfg.Source + source, resultKind, isAgents := s.resolveAuditSource(r) policy := s.auditPolicy() projectRoot := s.projectRoot cfgPath := s.configPath() @@ -262,7 +296,13 @@ func (s *Server) handleAuditAll(w http.ResponseWriter, r *http.Request) { start := time.Now() - skills, err := discoverAuditSkills(source) + var skills []skillEntry + var err error + if isAgents { + skills, err = discoverAuditAgents(source) + } else { + skills, err = discoverAuditSkills(source) + } if err != nil { writeError(w, http.StatusInternalServerError, err.Error()) return @@ -272,9 +312,21 @@ func (s *Server) handleAuditAll(w http.ResponseWriter, r *http.Request) { if !isProjectMode { auditProjectRoot = "" } - scanned := audit.ParallelScan(skillsToAuditInputs(skills), auditProjectRoot, nil, nil) + var inputs []audit.SkillInput + if isAgents { + inputs = make([]audit.SkillInput, len(skills)) + for i, s := range skills { + inputs[i] = audit.SkillInput{Name: s.name, Path: s.path, IsFile: true} + } + } else { + inputs = skillsToAuditInputs(skills) + } + scanned := audit.ParallelScan(inputs, auditProjectRoot, nil, nil) agg := processAuditResults(skills, scanned, policy) + for i := range agg.Results { + agg.Results[i].Kind = resultKind + } writeAuditLogTo(cfgPath, agg.Status, start, agg.LogArgs, agg.Message) writeJSON(w, map[string]any{ @@ -287,6 +339,7 @@ func (s *Server) handleAuditSkill(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before I/O. 
s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() policy := s.auditPolicy() projectRoot := s.projectRoot cfgPath := s.configPath() @@ -296,22 +349,46 @@ func (s *Server) handleAuditSkill(w http.ResponseWriter, r *http.Request) { start := time.Now() name := r.PathValue("name") + kind := r.URL.Query().Get("kind") threshold := policy.Threshold - skillPath := filepath.Join(source, name) - - if _, err := os.Stat(skillPath); os.IsNotExist(err) { - writeError(w, http.StatusNotFound, "skill not found: "+name) - return - } var ( result *audit.Result err error ) - if isProjectMode { - result, err = audit.ScanSkillForProject(skillPath, projectRoot) + + if kind == "agent" { + // Resolve agent file path via discovery + var agentPath string + if agentsSource != "" { + discovered, _ := resource.AgentKind{}.Discover(agentsSource) + for _, d := range discovered { + if d.FlatName == name || d.Name == name { + agentPath = d.AbsPath + break + } + } + } + if agentPath == "" { + writeError(w, http.StatusNotFound, "agent not found: "+name) + return + } + if isProjectMode { + result, err = audit.ScanFileForProject(agentPath, projectRoot) + } else { + result, err = audit.ScanFile(agentPath) + } } else { - result, err = audit.ScanSkill(skillPath) + skillPath := filepath.Join(source, name) + if _, statErr := os.Stat(skillPath); os.IsNotExist(statErr) { + writeError(w, http.StatusNotFound, "skill not found: "+name) + return + } + if isProjectMode { + result, err = audit.ScanSkillForProject(skillPath, projectRoot) + } else { + result, err = audit.ScanSkill(skillPath) + } } if err != nil { writeError(w, http.StatusInternalServerError, err.Error()) @@ -647,6 +724,7 @@ func toAuditResponse(result *audit.Result) auditResultResponse { } return auditResultResponse{ SkillName: result.SkillName, + Kind: result.Kind, Findings: findings, RiskScore: result.RiskScore, RiskLabel: result.RiskLabel, diff --git a/internal/server/handler_audit_stream.go 
b/internal/server/handler_audit_stream.go index 668b92d2..2482007d 100644 --- a/internal/server/handler_audit_stream.go +++ b/internal/server/handler_audit_stream.go @@ -24,13 +24,19 @@ func (s *Server) handleAuditStream(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before slow I/O. s.mu.RLock() - source := s.cfg.Source + source, resultKind, isAgents := s.resolveAuditSource(r) projectRoot := s.projectRoot policy := s.auditPolicy() s.mu.RUnlock() - // 1. Discover skills - skills, err := discoverAuditSkills(source) + // 1. Discover skills/agents + var skills []skillEntry + var err error + if isAgents { + skills, err = discoverAuditAgents(source) + } else { + skills, err = discoverAuditSkills(source) + } if err != nil { safeSend("error", map[string]string{"error": err.Error()}) return @@ -64,12 +70,24 @@ func (s *Server) handleAuditStream(w http.ResponseWriter, r *http.Request) { onDone := func() { scanned.Add(1) } // 3. Parallel scan (blocks until all skills are scanned) - outputs := audit.ParallelScan(skillsToAuditInputs(skills), projectRoot, onDone, nil) + var inputs []audit.SkillInput + if isAgents { + inputs = make([]audit.SkillInput, len(skills)) + for i, s := range skills { + inputs[i] = audit.SkillInput{Name: s.name, Path: s.path, IsFile: true} + } + } else { + inputs = skillsToAuditInputs(skills) + } + outputs := audit.ParallelScan(inputs, projectRoot, onDone, nil) close(done) // signal ticker goroutine to stop wg.Wait() // wait for it to fully exit before writing to w // 4. Process results agg := processAuditResults(skills, outputs, policy) + for i := range agg.Results { + agg.Results[i].Kind = resultKind + } s.writeAuditLog(agg.Status, start, agg.LogArgs, agg.Message) // 5. 
Send final result (no concurrent writers at this point) diff --git a/internal/server/handler_audit_unit_test.go b/internal/server/handler_audit_unit_test.go index 19dca871..fcb6f8e0 100644 --- a/internal/server/handler_audit_unit_test.go +++ b/internal/server/handler_audit_unit_test.go @@ -6,6 +6,7 @@ import ( "net/http/httptest" "os" "path/filepath" + "strings" "testing" ) @@ -61,6 +62,96 @@ func TestHandleAuditAll_WithSkills(t *testing.T) { } } +func TestHandleAuditAll_IgnoresTopLevelDirsWithoutSkillDefinition(t *testing.T) { + s, src := newTestServer(t) + + repoDir := filepath.Join(src, "_vijaythecoder-awesome-claude-agents") + if err := os.MkdirAll(filepath.Join(repoDir, "agents", "core"), 0o755); err != nil { + t.Fatalf("failed to create repo dir: %v", err) + } + if err := os.WriteFile(filepath.Join(repoDir, "agents", "core", "code-reviewer.md"), []byte("# code reviewer"), 0o644); err != nil { + t.Fatalf("failed to write nested agent file: %v", err) + } + + req := httptest.NewRequest(http.MethodGet, "/api/audit", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Results []any `json:"results"` + Summary struct { + Total int `json:"total"` + } `json:"summary"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + if len(resp.Results) != 0 { + t.Fatalf("expected 0 audited results, got %d", len(resp.Results)) + } + if resp.Summary.Total != 0 { + t.Fatalf("expected summary total 0, got %d", resp.Summary.Total) + } +} + +func TestHandleAuditAll_AgentsMissingSourceReturnsEmpty(t *testing.T) { + s, _ := newTestServer(t) + + req := httptest.NewRequest(http.MethodGet, "/api/audit?kind=agents", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + 
} + + var resp struct { + Results []any `json:"results"` + Summary struct { + Total int `json:"total"` + } `json:"summary"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + if len(resp.Results) != 0 { + t.Fatalf("expected 0 agent audit results, got %d", len(resp.Results)) + } + if resp.Summary.Total != 0 { + t.Fatalf("expected summary total 0, got %d", resp.Summary.Total) + } +} + +func TestHandleAuditStream_AgentsMissingSourceReturnsEmpty(t *testing.T) { + s, _ := newTestServer(t) + + req := httptest.NewRequest(http.MethodGet, "/api/audit/stream?kind=agents", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + body := rr.Body.String() + if strings.Contains(body, "event: error") { + t.Fatalf("expected no error event, got: %s", body) + } + if !strings.Contains(body, "event: start") { + t.Fatalf("expected start event, got: %s", body) + } + if !strings.Contains(body, "\"total\":0") { + t.Fatalf("expected empty start payload, got: %s", body) + } + if !strings.Contains(body, "event: done") { + t.Fatalf("expected done event, got: %s", body) + } +} + func TestHandleAuditAll_IncludesCrossSkillResult(t *testing.T) { s, src := newTestServer(t) diff --git a/internal/server/handler_check.go b/internal/server/handler_check.go index d9e5adfd..63ff64a4 100644 --- a/internal/server/handler_check.go +++ b/internal/server/handler_check.go @@ -10,6 +10,12 @@ import ( "skillshare/internal/install" ) +// skillWithMetaEntry holds a skill name paired with its centralized metadata entry. 
+type skillWithMetaEntry struct { + name string + entry *install.MetadataEntry +} + type repoCheckResult struct { Name string `json:"name"` Status string `json:"status"` @@ -23,6 +29,7 @@ type skillCheckResult struct { Version string `json:"version"` Status string `json:"status"` InstalledAt string `json:"installed_at,omitempty"` + Kind string `json:"kind,omitempty"` } func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { @@ -64,26 +71,21 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { } // Group skills by repo URL for efficient checking - type skillWithMeta struct { - name string - meta *install.SkillMeta - } - urlGroups := make(map[string][]skillWithMeta) + urlGroups := make(map[string][]skillWithMetaEntry) var localResults []skillCheckResult for _, skill := range skills { - skillPath := filepath.Join(sourceDir, skill) - meta, err := install.ReadMeta(skillPath) - if err != nil || meta == nil || meta.RepoURL == "" { + entry := s.skillsStore.GetByPath(skill) + if entry == nil || entry.RepoURL == "" { localResults = append(localResults, skillCheckResult{ Name: skill, Status: "local", }) continue } - urlGroups[meta.RepoURL] = append(urlGroups[meta.RepoURL], skillWithMeta{ - name: skill, - meta: meta, + urlGroups[entry.RepoURL] = append(urlGroups[entry.RepoURL], skillWithMetaEntry{ + name: skill, + entry: entry, }) } @@ -97,12 +99,12 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, Status: "error", } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } skillResults = append(skillResults, r) } @@ -112,7 +114,7 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { // Fast 
path: check if all skills match by commit hash allMatch := true for _, sw := range group { - if sw.meta.Version != remoteHash { + if sw.entry.Version != remoteHash { allMatch = false break } @@ -121,12 +123,12 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, Status: "up_to_date", } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } skillResults = append(skillResults, r) } @@ -136,7 +138,7 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { // Slow path: HEAD moved — try tree hash comparison var hasTreeHash bool for _, sw := range group { - if sw.meta.TreeHash != "" && sw.meta.Subdir != "" { + if sw.entry.TreeHash != "" && sw.entry.Subdir != "" { hasTreeHash = true break } @@ -150,18 +152,18 @@ func (s *Server) handleCheck(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } - if sw.meta.Version == remoteHash { + if sw.entry.Version == remoteHash { r.Status = "up_to_date" - } else if sw.meta.TreeHash != "" && sw.meta.Subdir != "" && remoteTreeHashes != nil { - normalizedSubdir := strings.TrimPrefix(sw.meta.Subdir, "/") - if rh, ok := remoteTreeHashes[normalizedSubdir]; ok && sw.meta.TreeHash == rh { + } else if sw.entry.TreeHash != "" && sw.entry.Subdir != "" && remoteTreeHashes != nil { + normalizedSubdir := strings.TrimPrefix(sw.entry.Subdir, "/") + 
if rh, ok := remoteTreeHashes[normalizedSubdir]; ok && sw.entry.TreeHash == rh { r.Status = "up_to_date" } else { r.Status = "update_available" diff --git a/internal/server/handler_check_stream.go b/internal/server/handler_check_stream.go index 10889b0a..18ae79b3 100644 --- a/internal/server/handler_check_stream.go +++ b/internal/server/handler_check_stream.go @@ -45,26 +45,21 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { skills, _ := install.GetUpdatableSkills(sourceDir) // --- Pre-process: group skills by URL (fast, local only) --- - type skillWithMeta struct { - name string - meta *install.SkillMeta - } - urlGroups := make(map[string][]skillWithMeta) + urlGroups := make(map[string][]skillWithMetaEntry) var localResults []skillCheckResult for _, skill := range skills { - skillPath := filepath.Join(sourceDir, skill) - meta, err := install.ReadMeta(skillPath) - if err != nil || meta == nil || meta.RepoURL == "" { + entry := s.skillsStore.GetByPath(skill) + if entry == nil || entry.RepoURL == "" { localResults = append(localResults, skillCheckResult{ Name: skill, Status: "local", }) continue } - urlGroups[meta.RepoURL] = append(urlGroups[meta.RepoURL], skillWithMeta{ - name: skill, - meta: meta, + urlGroups[entry.RepoURL] = append(urlGroups[entry.RepoURL], skillWithMetaEntry{ + name: skill, + entry: entry, }) } @@ -147,12 +142,12 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, Status: "error", } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } skillResults = append(skillResults, r) } @@ -163,7 +158,7 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { // 
Fast path: all commit hashes match allMatch := true for _, sw := range group { - if sw.meta.Version != remoteHash { + if sw.entry.Version != remoteHash { allMatch = false break } @@ -172,12 +167,12 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, Status: "up_to_date", } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } skillResults = append(skillResults, r) } @@ -188,7 +183,7 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { // Slow path: tree hash comparison var hasTreeHash bool for _, sw := range group { - if sw.meta.TreeHash != "" && sw.meta.Subdir != "" { + if sw.entry.TreeHash != "" && sw.entry.Subdir != "" { hasTreeHash = true break } @@ -202,18 +197,18 @@ func (s *Server) handleCheckStream(w http.ResponseWriter, r *http.Request) { for _, sw := range group { r := skillCheckResult{ Name: sw.name, - Source: sw.meta.Source, - Version: sw.meta.Version, + Source: sw.entry.Source, + Version: sw.entry.Version, } - if !sw.meta.InstalledAt.IsZero() { - r.InstalledAt = sw.meta.InstalledAt.Format("2006-01-02") + if !sw.entry.InstalledAt.IsZero() { + r.InstalledAt = sw.entry.InstalledAt.Format("2006-01-02") } - if sw.meta.Version == remoteHash { + if sw.entry.Version == remoteHash { r.Status = "up_to_date" - } else if sw.meta.TreeHash != "" && sw.meta.Subdir != "" && remoteTreeHashes != nil { - normalizedSubdir := strings.TrimPrefix(sw.meta.Subdir, "/") - if rh, ok := remoteTreeHashes[normalizedSubdir]; ok && sw.meta.TreeHash == rh { + } else if sw.entry.TreeHash != "" && sw.entry.Subdir != "" && remoteTreeHashes != nil { + normalizedSubdir := strings.TrimPrefix(sw.entry.Subdir, "/") + if rh, ok 
:= remoteTreeHashes[normalizedSubdir]; ok && sw.entry.TreeHash == rh { r.Status = "up_to_date" } else { r.Status = "update_available" diff --git a/internal/server/handler_collect.go b/internal/server/handler_collect.go index a0efa57d..78cfc960 100644 --- a/internal/server/handler_collect.go +++ b/internal/server/handler_collect.go @@ -2,6 +2,7 @@ package server import ( "encoding/json" + "maps" "net/http" "os" "path/filepath" @@ -14,6 +15,7 @@ import ( type localSkillItem struct { Name string `json:"name"` + Kind string `json:"kind,omitempty"` Path string `json:"path"` TargetName string `json:"targetName"` Size int64 `json:"size"` @@ -30,51 +32,110 @@ type scanTarget struct { type collectSkillRef struct { Name string `json:"name"` TargetName string `json:"targetName"` + Kind string `json:"kind,omitempty"` } -// handleCollectScan scans targets for local (non-symlinked) skills. -// GET /api/collect/scan?target= (optional filter) +// handleCollectScan scans targets for local (non-symlinked) skills and/or agents. +// GET /api/collect/scan?target=&kind=skill|agent (both optional) +// When kind is omitted, scans for both skills and agents. func (s *Server) handleCollectScan(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before I/O. s.mu.RLock() source := s.cfg.Source globalMode := s.cfg.Mode targets := s.cloneTargets() + agentsSource := s.agentsSource() s.mu.RUnlock() filterTarget := r.URL.Query().Get("target") + kind := r.URL.Query().Get("kind") + if kind != "" && kind != kindSkill && kind != kindAgent { + writeError(w, http.StatusBadRequest, "invalid kind: must be 'skill', 'agent', or empty") + return + } - var scanTargets []scanTarget + // Collect items per target, merging skills and agents. 
+ targetItems := make(map[string][]localSkillItem) totalCount := 0 - for name, target := range targets { - if filterTarget != "" && filterTarget != name { - continue - } + // --- Skill scan --- + if kind != kindAgent { + for name, target := range targets { + if filterTarget != "" && filterTarget != name { + continue + } - sc := target.SkillsConfig() - mode := ssync.EffectiveMode(sc.Mode) - if sc.Mode == "" && globalMode != "" { - mode = globalMode - } - locals, err := ssync.FindLocalSkills(sc.Path, source, mode) - if err != nil { - writeError(w, http.StatusInternalServerError, "scan failed for "+name+": "+err.Error()) - return + sc := target.SkillsConfig() + mode := ssync.EffectiveMode(sc.Mode) + if sc.Mode == "" && globalMode != "" { + mode = globalMode + } + locals, err := ssync.FindLocalSkills(sc.Path, source, mode) + if err != nil { + writeError(w, http.StatusInternalServerError, "scan failed for "+name+": "+err.Error()) + return + } + + for _, sk := range locals { + targetItems[name] = append(targetItems[name], localSkillItem{ + Name: sk.Name, + Kind: kindSkill, + Path: sk.Path, + TargetName: name, + Size: ssync.CalculateDirSize(sk.Path), + ModTime: sk.ModTime.Format(time.RFC3339), + }) + } } + } - items := make([]localSkillItem, 0, len(locals)) - for _, sk := range locals { - items = append(items, localSkillItem{ - Name: sk.Name, - Path: sk.Path, - TargetName: name, - Size: ssync.CalculateDirSize(sk.Path), - ModTime: sk.ModTime.Format(time.RFC3339), - }) + // --- Agent scan --- + if kind != kindSkill { + builtinAgents := s.builtinAgentTargets() + for name, target := range targets { + if filterTarget != "" && filterTarget != name { + continue + } + agentPath := resolveAgentPath(target, builtinAgents, name) + if agentPath == "" || agentsSource == "" { + continue + } + localAgents, err := ssync.FindLocalAgents(agentPath, agentsSource) + if err != nil { + writeError(w, http.StatusInternalServerError, "agent scan failed for "+name+": "+err.Error()) + return + } + 
for _, ag := range localAgents { + var size int64 + var modTime string + if info, err := os.Stat(ag.Path); err == nil { + size = info.Size() + modTime = info.ModTime().Format(time.RFC3339) + } + targetItems[name] = append(targetItems[name], localSkillItem{ + Name: ag.Name, + Kind: kindAgent, + Path: ag.Path, + TargetName: name, + Size: size, + ModTime: modTime, + }) + } } + } + // Build response from merged map. + var scanTargets []scanTarget + for name := range targets { + items := targetItems[name] + if len(items) == 0 && kind != "" { + // When filtering by kind, skip targets with no items of that kind. + continue + } totalCount += len(items) + if items == nil { + items = []localSkillItem{} + } scanTargets = append(scanTargets, scanTarget{ TargetName: name, Skills: items, @@ -91,8 +152,9 @@ func (s *Server) handleCollectScan(w http.ResponseWriter, r *http.Request) { }) } -// handleCollect pulls selected local skills from targets to source. -// POST /api/collect { skills: [{name, targetName}], force: bool } +// handleCollect pulls selected local skills and/or agents from targets to source. +// POST /api/collect { skills: [{name, targetName, kind?}], force: bool } +// Items with kind="agent" are pulled as agents; others as skills (backward compat). func (s *Server) handleCollect(w http.ResponseWriter, r *http.Request) { start := time.Now() s.mu.Lock() @@ -112,69 +174,141 @@ func (s *Server) handleCollect(w http.ResponseWriter, r *http.Request) { return } - // Resolve each skill ref to a LocalSkillInfo - var resolved []ssync.LocalSkillInfo + // Split items by kind. 
+ var skillRefs, agentRefs []collectSkillRef for _, ref := range body.Skills { - target, ok := s.cfg.Targets[ref.TargetName] - if !ok { - writeError(w, http.StatusBadRequest, "unknown target: "+ref.TargetName) - return + if ref.Kind == kindAgent { + agentRefs = append(agentRefs, ref) + } else { + skillRefs = append(skillRefs, ref) } + } + + opts := ssync.PullOptions{Force: body.Force} + + // Merged results across skills and agents. + var allPulled, allSkipped []string + allFailed := make(map[string]error) + var skillsPulled, agentsPulled int + + // --- Pull skills --- + if len(skillRefs) > 0 { + var resolved []ssync.LocalSkillInfo + for _, ref := range skillRefs { + target, ok := s.cfg.Targets[ref.TargetName] + if !ok { + writeError(w, http.StatusBadRequest, "unknown target: "+ref.TargetName) + return + } - skillPath := filepath.Join(target.SkillsConfig().Path, ref.Name) - info, err := os.Lstat(skillPath) + skillPath := filepath.Join(target.SkillsConfig().Path, ref.Name) + info, err := os.Lstat(skillPath) + if err != nil { + writeError(w, http.StatusBadRequest, "skill not found: "+ref.Name+" in "+ref.TargetName) + return + } + if info.Mode()&os.ModeSymlink != 0 { + writeError(w, http.StatusBadRequest, "skill is a symlink (not local): "+ref.Name) + return + } + if !info.IsDir() { + writeError(w, http.StatusBadRequest, "skill is not a directory: "+ref.Name) + return + } + + resolved = append(resolved, ssync.LocalSkillInfo{ + Name: ref.Name, + Path: skillPath, + TargetName: ref.TargetName, + }) + } + + result, err := ssync.PullSkills(resolved, s.cfg.Source, opts) if err != nil { - writeError(w, http.StatusBadRequest, "skill not found: "+ref.Name+" in "+ref.TargetName) + writeError(w, http.StatusInternalServerError, "collect failed: "+err.Error()) return } - if info.Mode()&os.ModeSymlink != 0 { - writeError(w, http.StatusBadRequest, "skill is a symlink (not local): "+ref.Name) - return + skillsPulled = len(result.Pulled) + allPulled = append(allPulled, 
result.Pulled...) + allSkipped = append(allSkipped, result.Skipped...) + maps.Copy(allFailed, result.Failed) + } + + // --- Pull agents --- + if len(agentRefs) > 0 { + builtinAgents := s.builtinAgentTargets() + agentsSource := s.agentsSource() + + var resolved []ssync.LocalAgentInfo + for _, ref := range agentRefs { + target, ok := s.cfg.Targets[ref.TargetName] + if !ok { + writeError(w, http.StatusBadRequest, "unknown target: "+ref.TargetName) + return + } + + agentPath := resolveAgentPath(target, builtinAgents, ref.TargetName) + if agentPath == "" { + writeError(w, http.StatusBadRequest, "no agent path for target: "+ref.TargetName) + return + } + + filePath := filepath.Join(agentPath, ref.Name) + info, err := os.Lstat(filePath) + if err != nil { + writeError(w, http.StatusBadRequest, "agent not found: "+ref.Name+" in "+ref.TargetName) + return + } + if info.Mode()&os.ModeSymlink != 0 { + writeError(w, http.StatusBadRequest, "agent is a symlink (not local): "+ref.Name) + return + } + + resolved = append(resolved, ssync.LocalAgentInfo{ + Name: ref.Name, + Path: filePath, + TargetName: ref.TargetName, + }) } - if !info.IsDir() { - writeError(w, http.StatusBadRequest, "skill is not a directory: "+ref.Name) + + result, err := ssync.PullAgents(resolved, agentsSource, opts) + if err != nil { + writeError(w, http.StatusInternalServerError, "agent collect failed: "+err.Error()) return } - - resolved = append(resolved, ssync.LocalSkillInfo{ - Name: ref.Name, - Path: skillPath, - TargetName: ref.TargetName, - }) - } - - result, err := ssync.PullSkills(resolved, s.cfg.Source, ssync.PullOptions{ - Force: body.Force, - }) - if err != nil { - writeError(w, http.StatusInternalServerError, "collect failed: "+err.Error()) - return + agentsPulled = len(result.Pulled) + allPulled = append(allPulled, result.Pulled...) + allSkipped = append(allSkipped, result.Skipped...) 
+ maps.Copy(allFailed, result.Failed) } // Convert Failed map to string values for JSON - failed := make(map[string]string, len(result.Failed)) - for k, v := range result.Failed { + failed := make(map[string]string, len(allFailed)) + for k, v := range allFailed { failed[k] = v.Error() } status := "ok" msg := "" - if len(result.Failed) > 0 { + if len(allFailed) > 0 { status = "partial" - msg = "some skills failed to collect" + msg = "some items failed to collect" } s.writeOpsLog("collect", status, start, map[string]any{ - "skills_selected": len(body.Skills), - "skills_pulled": len(result.Pulled), - "skills_skipped": len(result.Skipped), - "skills_failed": len(result.Failed), + "skills_selected": len(skillRefs), + "skills_pulled": skillsPulled, + "agents_selected": len(agentRefs), + "agents_pulled": agentsPulled, + "total_pulled": len(allPulled), + "total_skipped": len(allSkipped), + "total_failed": len(allFailed), "force": body.Force, "scope": "ui", }, msg) writeJSON(w, map[string]any{ - "pulled": result.Pulled, - "skipped": result.Skipped, + "pulled": allPulled, + "skipped": allSkipped, "failed": failed, }) } diff --git a/internal/server/handler_collect_test.go b/internal/server/handler_collect_test.go index 19404bb9..a9143d4a 100644 --- a/internal/server/handler_collect_test.go +++ b/internal/server/handler_collect_test.go @@ -33,6 +33,12 @@ func TestHandleCollectScan_Empty(t *testing.T) { } func TestHandleCollectScan_WithLocalSkills(t *testing.T) { + home := filepath.Join(t.TempDir(), "home") + if err := os.MkdirAll(home, 0o755); err != nil { + t.Fatalf("mkdir home: %v", err) + } + t.Setenv("HOME", home) + tgtPath := filepath.Join(t.TempDir(), "claude-skills") s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) @@ -71,6 +77,12 @@ func TestHandleCollect_NoSkills(t *testing.T) { } func TestHandleCollectScan_GlobalCopyModeInheritedTarget_SkipsManaged(t *testing.T) { + home := filepath.Join(t.TempDir(), "home") + if err := os.MkdirAll(home, 
0o755); err != nil { + t.Fatalf("mkdir home: %v", err) + } + t.Setenv("HOME", home) + tgtPath := filepath.Join(t.TempDir(), "claude-skills") s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) @@ -128,3 +140,265 @@ func TestHandleCollectScan_GlobalCopyModeInheritedTarget_SkipsManaged(t *testing t.Fatalf("expected only local-skill, got %q", resp.Targets[0].Skills[0].Name) } } + +func TestHandleCollectScan_AgentKind(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + agentPath := filepath.Join(t.TempDir(), "claude-agents") + agentsSource := filepath.Join(t.TempDir(), "agents-source") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // Write config YAML with agents_source and agent target path. + // The auto-reload middleware re-reads from disk on every API request. + cfgPath := os.Getenv("SKILLSHARE_CONFIG") + raw := "source: " + sourceDir + "\nagents_source: " + agentsSource + + "\nmode: merge\ntargets:\n claude:\n skills:\n path: " + tgtPath + + "\n agents:\n path: " + agentPath + "\n" + if err := os.WriteFile(cfgPath, []byte(raw), 0644); err != nil { + t.Fatalf("failed to update config: %v", err) + } + os.MkdirAll(agentsSource, 0755) + + // Create a local .md agent file in the agent target directory. 
+ os.MkdirAll(agentPath, 0755) + os.WriteFile(filepath.Join(agentPath, "helper.md"), []byte("# helper agent"), 0644) + + req := httptest.NewRequest(http.MethodGet, "/api/collect/scan?kind=agent", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Targets []struct { + TargetName string `json:"targetName"` + Skills []struct { + Name string `json:"name"` + Kind string `json:"kind"` + } `json:"skills"` + } `json:"targets"` + TotalCount int `json:"totalCount"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + + if resp.TotalCount != 1 { + t.Fatalf("expected totalCount=1, got %d", resp.TotalCount) + } + if len(resp.Targets) == 0 { + t.Fatal("expected at least 1 target in response") + } + found := false + for _, tgt := range resp.Targets { + for _, sk := range tgt.Skills { + if sk.Name == "helper.md" && sk.Kind == "agent" { + found = true + } + } + } + if !found { + t.Fatalf("expected agent helper.md with kind=agent in response, got %+v", resp.Targets) + } +} + +func TestHandleCollectScan_BothKinds(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + agentPath := filepath.Join(t.TempDir(), "claude-agents") + agentsSource := filepath.Join(t.TempDir(), "agents-source") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // Write config YAML with both skills and agents paths. + cfgPath := os.Getenv("SKILLSHARE_CONFIG") + raw := "source: " + sourceDir + "\nagents_source: " + agentsSource + + "\nmode: merge\ntargets:\n claude:\n skills:\n path: " + tgtPath + + "\n agents:\n path: " + agentPath + "\n" + if err := os.WriteFile(cfgPath, []byte(raw), 0644); err != nil { + t.Fatalf("failed to update config: %v", err) + } + os.MkdirAll(agentsSource, 0755) + + // Create a local skill in skill target. 
+ localSkill := filepath.Join(tgtPath, "local-skill") + os.MkdirAll(localSkill, 0755) + os.WriteFile(filepath.Join(localSkill, "SKILL.md"), []byte("local"), 0644) + + // Create a local agent in agent target. + os.MkdirAll(agentPath, 0755) + os.WriteFile(filepath.Join(agentPath, "reviewer.md"), []byte("# reviewer agent"), 0644) + + // No kind parameter — should return both. + req := httptest.NewRequest(http.MethodGet, "/api/collect/scan", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + TotalCount int `json:"totalCount"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + if resp.TotalCount != 2 { + t.Fatalf("expected totalCount=2 (1 skill + 1 agent), got %d", resp.TotalCount) + } +} + +func TestHandleCollect_AgentItems(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + agentPath := filepath.Join(t.TempDir(), "claude-agents") + agentsSource := filepath.Join(t.TempDir(), "agents-source") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // Write config YAML with agents_source and agent target path. + cfgPath := os.Getenv("SKILLSHARE_CONFIG") + raw := "source: " + sourceDir + "\nagents_source: " + agentsSource + + "\nmode: merge\ntargets:\n claude:\n skills:\n path: " + tgtPath + + "\n agents:\n path: " + agentPath + "\n" + if err := os.WriteFile(cfgPath, []byte(raw), 0644); err != nil { + t.Fatalf("failed to update config: %v", err) + } + os.MkdirAll(agentsSource, 0755) + + // Create a local .md agent file in the agent target directory. + os.MkdirAll(agentPath, 0755) + agentContent := "# helper agent\nThis is a test agent." 
+ os.WriteFile(filepath.Join(agentPath, "helper.md"), []byte(agentContent), 0644) + + body := `{"skills":[{"name":"helper.md","targetName":"claude","kind":"agent"}],"force":false}` + req := httptest.NewRequest(http.MethodPost, "/api/collect", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Pulled []string `json:"pulled"` + Skipped []string `json:"skipped"` + Failed map[string]string `json:"failed"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + + if len(resp.Pulled) != 1 || resp.Pulled[0] != "helper.md" { + t.Fatalf("expected pulled=[helper.md], got %v", resp.Pulled) + } + if len(resp.Failed) != 0 { + t.Fatalf("expected no failures, got %v", resp.Failed) + } + + // Verify the agent file was copied to agents source. + destPath := filepath.Join(agentsSource, "helper.md") + data, err := os.ReadFile(destPath) + if err != nil { + t.Fatalf("agent not found in source dir: %v", err) + } + if string(data) != agentContent { + t.Fatalf("agent content mismatch: got %q, want %q", string(data), agentContent) + } +} + +func TestHandleCollect_MixedSkillsAndAgents(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + agentPath := filepath.Join(t.TempDir(), "claude-agents") + agentsSource := filepath.Join(t.TempDir(), "agents-source") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // Write config YAML with both skills and agents paths. 
+ cfgPath := os.Getenv("SKILLSHARE_CONFIG") + raw := "source: " + sourceDir + "\nagents_source: " + agentsSource + + "\nmode: merge\ntargets:\n claude:\n skills:\n path: " + tgtPath + + "\n agents:\n path: " + agentPath + "\n" + if err := os.WriteFile(cfgPath, []byte(raw), 0644); err != nil { + t.Fatalf("failed to update config: %v", err) + } + os.MkdirAll(agentsSource, 0755) + + // Create a local skill directory in skill target. + localSkill := filepath.Join(tgtPath, "my-skill") + os.MkdirAll(localSkill, 0755) + os.WriteFile(filepath.Join(localSkill, "SKILL.md"), []byte("local skill"), 0644) + + // Create a local agent file in agent target. + os.MkdirAll(agentPath, 0755) + os.WriteFile(filepath.Join(agentPath, "reviewer.md"), []byte("# reviewer"), 0644) + + body := `{"skills":[` + + `{"name":"my-skill","targetName":"claude"},` + + `{"name":"reviewer.md","targetName":"claude","kind":"agent"}` + + `],"force":false}` + req := httptest.NewRequest(http.MethodPost, "/api/collect", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Pulled []string `json:"pulled"` + Skipped []string `json:"skipped"` + Failed map[string]string `json:"failed"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + + if len(resp.Pulled) != 2 { + t.Fatalf("expected 2 pulled items, got %v", resp.Pulled) + } + if len(resp.Failed) != 0 { + t.Fatalf("expected no failures, got %v", resp.Failed) + } + + // Verify both exist in their respective source dirs. 
+ if _, err := os.Stat(filepath.Join(sourceDir, "my-skill", "SKILL.md")); err != nil { + t.Fatalf("skill not found in source: %v", err) + } + if _, err := os.Stat(filepath.Join(agentsSource, "reviewer.md")); err != nil { + t.Fatalf("agent not found in agents source: %v", err) + } +} + +func TestHandleCollectScan_AgentKind_NoSource(t *testing.T) { + home := filepath.Join(t.TempDir(), "home") + if err := os.MkdirAll(home, 0o755); err != nil { + t.Fatalf("mkdir home: %v", err) + } + t.Setenv("HOME", home) + + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // Write config without agents_source — should return 0 agents, no error. + cfgPath := os.Getenv("SKILLSHARE_CONFIG") + raw := "source: " + sourceDir + "\nmode: merge\ntargets:\n claude:\n path: " + tgtPath + "\n" + if err := os.WriteFile(cfgPath, []byte(raw), 0644); err != nil { + t.Fatalf("failed to update config: %v", err) + } + + req := httptest.NewRequest(http.MethodGet, "/api/collect/scan?kind=agent", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + TotalCount int `json:"totalCount"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("failed to decode response: %v", err) + } + if resp.TotalCount != 0 { + t.Fatalf("expected totalCount=0 when no agents source, got %d", resp.TotalCount) + } +} diff --git a/internal/server/handler_config.go b/internal/server/handler_config.go index 3d49bfac..5fd56e2a 100644 --- a/internal/server/handler_config.go +++ b/internal/server/handler_config.go @@ -114,10 +114,18 @@ func (s *Server) handleAvailableTargets(w http.ResponseWriter, r *http.Request) type availTarget struct { Name string `json:"name"` Path string `json:"path"` + AgentPath string `json:"agentPath,omitempty"` Installed bool `json:"installed"` 
Detected bool `json:"detected"` } + var agentDefaults map[string]config.TargetConfig + if isProjectMode { + agentDefaults = config.ProjectAgentTargets() + } else { + agentDefaults = config.DefaultAgentTargets() + } + items := make([]availTarget, 0, len(defaults)) for name, tc := range defaults { _, installed := targets[name] @@ -129,12 +137,16 @@ func (s *Server) handleAvailableTargets(w http.ResponseWriter, r *http.Request) detected = true } } - items = append(items, availTarget{ + item := availTarget{ Name: name, Path: tc.Path, Installed: installed, Detected: detected, - }) + } + if agentTC, ok := agentDefaults[name]; ok { + item.AgentPath = agentTC.Path + } + items = append(items, item) } writeJSON(w, map[string]any{"targets": items}) diff --git a/internal/server/handler_create_skill_test.go b/internal/server/handler_create_skill_test.go index 4c30dab5..ecb5ea0e 100644 --- a/internal/server/handler_create_skill_test.go +++ b/internal/server/handler_create_skill_test.go @@ -15,7 +15,7 @@ import ( func TestHandleGetTemplates(t *testing.T) { s, _ := newTestServer(t) - req := httptest.NewRequest(http.MethodGet, "/api/skills/templates", nil) + req := httptest.NewRequest(http.MethodGet, "/api/resources/templates", nil) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -51,7 +51,7 @@ func TestHandleCreateSkill_Success(t *testing.T) { s, src := newTestServer(t) body := `{"name":"my-tool","pattern":"tool-wrapper","category":"library","scaffoldDirs":["references"]}` - req := httptest.NewRequest(http.MethodPost, "/api/skills", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -118,7 +118,7 @@ func TestHandleCreateSkill_EmptyScaffoldDirs(t *testing.T) { s, src := newTestServer(t) body := `{"name":"simple-skill","pattern":"tool-wrapper","category":"library","scaffoldDirs":[]}` - req := httptest.NewRequest(http.MethodPost, 
"/api/skills", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -162,7 +162,7 @@ func TestHandleCreateSkill_InvalidName(t *testing.T) { for _, tc := range tests { t.Run(tc.name, func(t *testing.T) { - req := httptest.NewRequest(http.MethodPost, "/api/skills", bytes.NewBufferString(tc.body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(tc.body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -184,7 +184,7 @@ func TestHandleCreateSkill_Duplicate(t *testing.T) { addSkill(t, src, "existing-skill") body := `{"name":"existing-skill","pattern":"none","category":"","scaffoldDirs":[]}` - req := httptest.NewRequest(http.MethodPost, "/api/skills", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -204,7 +204,7 @@ func TestHandleCreateSkill_InvalidScaffoldDir(t *testing.T) { // tool-wrapper only allows "references", not "assets" body := `{"name":"bad-dirs","pattern":"tool-wrapper","category":"library","scaffoldDirs":["assets"]}` - req := httptest.NewRequest(http.MethodPost, "/api/skills", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -223,7 +223,7 @@ func TestHandleCreateSkill_NonePattern(t *testing.T) { s, src := newTestServer(t) body := `{"name":"plain-skill","pattern":"none","category":"","scaffoldDirs":[]}` - req := httptest.NewRequest(http.MethodPost, "/api/skills", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) diff --git a/internal/server/handler_diff_stream.go 
b/internal/server/handler_diff_stream.go index e734de3f..b84c6e02 100644 --- a/internal/server/handler_diff_stream.go +++ b/internal/server/handler_diff_stream.go @@ -28,6 +28,7 @@ func (s *Server) handleDiffStream(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before slow I/O. s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() globalMode := s.cfg.Mode targets := s.cloneTargets() s.mu.RUnlock() @@ -66,8 +67,11 @@ func (s *Server) handleDiffStream(w http.ResponseWriter, r *http.Request) { }) } + diffs = s.appendAgentDiffs(diffs, targets, agentsSource, "") + donePayload := map[string]any{"diffs": diffs} maps.Copy(donePayload, ignorePayload(ignoreStats)) + maps.Copy(donePayload, agentIgnorePayload(agentsSource, nil)) safeSend("done", donePayload) } @@ -85,7 +89,7 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc if mode == "symlink" { status := ssync.CheckStatus(sc.Path, source) if status != ssync.StatusLinked { - dt.Items = append(dt.Items, diffItem{Skill: "(entire directory)", Action: "link", Reason: "source only"}) + dt.Items = append(dt.Items, diffItem{Skill: "(entire directory)", Action: "link", Reason: "source only", Kind: kindSkill}) } return dt } @@ -100,7 +104,7 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc TargetNaming: sc.TargetNaming, }, filtered) if err != nil { - dt.Items = append(dt.Items, diffItem{Skill: "(target naming)", Action: "skip", Reason: err.Error()}) + dt.Items = append(dt.Items, diffItem{Skill: "(target naming)", Action: "skip", Reason: err.Error(), Kind: kindSkill}) return dt } // Surface collision/validation stats so the UI can show why skills were skipped @@ -118,23 +122,23 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc if !isManaged { if info, statErr := os.Stat(targetSkillPath); statErr == nil { if info.IsDir() { - dt.Items = append(dt.Items, diffItem{Skill: 
resolved.TargetName, Action: "skip", Reason: "local copy (sync --force to replace)"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "skip", Reason: "local copy (sync --force to replace)", Kind: kindSkill}) } else { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "target entry is not a directory"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "target entry is not a directory", Kind: kindSkill}) } } else if os.IsNotExist(statErr) { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "source only"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "source only", Kind: kindSkill}) } else { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot access target entry"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot access target entry", Kind: kindSkill}) } } else { targetInfo, statErr := os.Stat(targetSkillPath) if os.IsNotExist(statErr) { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "missing (deleted from target)"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "missing (deleted from target)", Kind: kindSkill}) } else if statErr != nil { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot access target entry"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot access target entry", Kind: kindSkill}) } else if !targetInfo.IsDir() { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "target entry is not a directory"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "target entry is not a directory", 
Kind: kindSkill}) } else { oldMtime := manifest.Mtimes[resolved.TargetName] currentMtime, mtimeErr := ssync.DirMaxMtime(skill.SourcePath) @@ -143,9 +147,9 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc } srcChecksum, checksumErr := ssync.DirChecksum(skill.SourcePath) if checksumErr != nil { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot compute checksum"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "cannot compute checksum", Kind: kindSkill}) } else if srcChecksum != oldChecksum { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "content changed"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "content changed", Kind: kindSkill}) } } } @@ -155,7 +159,7 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc continue } if !validNames[managedName] { - dt.Items = append(dt.Items, diffItem{Skill: managedName, Action: "prune", Reason: "orphan copy"}) + dt.Items = append(dt.Items, diffItem{Skill: managedName, Action: "prune", Reason: "orphan copy", Kind: kindSkill}) } } return dt @@ -168,7 +172,7 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc _, err := os.Lstat(targetSkillPath) if err != nil { if os.IsNotExist(err) { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "source only"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "link", Reason: "source only", Kind: kindSkill}) } continue } @@ -176,15 +180,15 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc if utils.IsSymlinkOrJunction(targetSkillPath) { absLink, linkErr := utils.ResolveLinkTarget(targetSkillPath) if linkErr != nil { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: 
"link target unreadable"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "link target unreadable", Kind: kindSkill}) continue } absSource, _ := filepath.Abs(skill.SourcePath) if !utils.PathsEqual(absLink, absSource) { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "symlink points elsewhere"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "update", Reason: "symlink points elsewhere", Kind: kindSkill}) } } else { - dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "skip", Reason: "local copy (sync --force to replace)"}) + dt.Items = append(dt.Items, diffItem{Skill: resolved.TargetName, Action: "skip", Reason: "local copy (sync --force to replace)", Kind: kindSkill}) } } @@ -212,16 +216,16 @@ func (s *Server) computeTargetDiff(name string, target config.TargetConfig, disc } absSource, _ := filepath.Abs(source) if utils.PathHasPrefix(absLink, absSource+string(filepath.Separator)) { - dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan symlink"}) + dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan symlink", Kind: kindSkill}) } } else if info.IsDir() { if _, inManifest := manifest.Managed[eName]; inManifest { - dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan managed directory (manifest)"}) + dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan managed directory (manifest)", Kind: kindSkill}) } else { if resolution.Naming == "flat" && (utils.HasNestedSeparator(eName) || utils.IsTrackedRepoDir(eName)) { - dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan managed directory"}) + dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "prune", Reason: "orphan managed directory", Kind: kindSkill}) } else { - dt.Items = append(dt.Items, diffItem{Skill: eName, Action: 
"local", Reason: "local only"}) + dt.Items = append(dt.Items, diffItem{Skill: eName, Action: "local", Reason: "local only", Kind: kindSkill}) } } } diff --git a/internal/server/handler_diff_test.go b/internal/server/handler_diff_test.go index c410ecc8..068308ae 100644 --- a/internal/server/handler_diff_test.go +++ b/internal/server/handler_diff_test.go @@ -4,8 +4,11 @@ import ( "encoding/json" "net/http" "net/http/httptest" + "os" "path/filepath" "testing" + + "skillshare/internal/config" ) func TestHandleDiff_Empty(t *testing.T) { @@ -48,3 +51,64 @@ func TestHandleDiff_WithTarget(t *testing.T) { t.Fatalf("expected 1 diff target, got %d", len(resp.Diffs)) } } + +func TestHandleDiff_AgentPruneWhenSourceEmpty(t *testing.T) { + s, _ := newTestServer(t) + + agentSource := filepath.Join(t.TempDir(), "agents") + agentTarget := filepath.Join(t.TempDir(), "claude-agents") + if err := os.MkdirAll(agentSource, 0o755); err != nil { + t.Fatalf("mkdir agent source: %v", err) + } + if err := os.MkdirAll(agentTarget, 0o755); err != nil { + t.Fatalf("mkdir agent target: %v", err) + } + if err := os.Symlink(filepath.Join(agentSource, "tutor.md"), filepath.Join(agentTarget, "tutor.md")); err != nil { + t.Fatalf("seed orphan agent symlink: %v", err) + } + + s.cfg.AgentsSource = agentSource + s.cfg.Targets["claude"] = config.TargetConfig{ + Skills: &config.ResourceTargetConfig{Path: filepath.Join(t.TempDir(), "claude-skills")}, + Agents: &config.ResourceTargetConfig{Path: agentTarget}, + } + if err := s.cfg.Save(); err != nil { + t.Fatalf("save config: %v", err) + } + + req := httptest.NewRequest(http.MethodGet, "/api/diff", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Diffs []struct { + Target string `json:"target"` + Items []struct { + Skill string `json:"skill"` + Action string `json:"action"` + Kind string `json:"kind"` + } 
`json:"items"` + } `json:"diffs"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("unmarshal diff response: %v", err) + } + + if len(resp.Diffs) != 1 { + t.Fatalf("expected 1 diff target, got %d", len(resp.Diffs)) + } + if resp.Diffs[0].Target != "claude" { + t.Fatalf("expected claude target, got %q", resp.Diffs[0].Target) + } + if len(resp.Diffs[0].Items) != 1 { + t.Fatalf("expected 1 diff item, got %d", len(resp.Diffs[0].Items)) + } + item := resp.Diffs[0].Items[0] + if item.Skill != "tutor.md" || item.Action != "prune" || item.Kind != "agent" { + t.Fatalf("unexpected diff item: %+v", item) + } +} diff --git a/internal/server/handler_helpers_test.go b/internal/server/handler_helpers_test.go index 48eef9de..32db1abb 100644 --- a/internal/server/handler_helpers_test.go +++ b/internal/server/handler_helpers_test.go @@ -3,9 +3,11 @@ package server import ( "os" "path/filepath" + "strings" "testing" "skillshare/internal/config" + "skillshare/internal/install" ) // newTestServer creates an isolated Server for handler testing. 
@@ -15,10 +17,15 @@ func newTestServer(t *testing.T) (*Server, string) { t.Helper() tmp := t.TempDir() sourceDir := filepath.Join(tmp, "skills") + homeDir := filepath.Join(tmp, "home") os.MkdirAll(sourceDir, 0755) t.Setenv("XDG_DATA_HOME", filepath.Join(tmp, "data")) t.Setenv("XDG_STATE_HOME", filepath.Join(tmp, "state")) t.Setenv("XDG_CONFIG_HOME", filepath.Join(tmp, "xdg-config")) + if os.Getenv("HOME") == "" { + os.MkdirAll(homeDir, 0755) + t.Setenv("HOME", homeDir) + } cfgPath := filepath.Join(tmp, "config", "config.yaml") t.Setenv("SKILLSHARE_CONFIG", cfgPath) @@ -41,10 +48,15 @@ func newTestServerWithTargets(t *testing.T, targets map[string]string) (*Server, t.Helper() tmp := t.TempDir() sourceDir := filepath.Join(tmp, "skills") + homeDir := filepath.Join(tmp, "home") os.MkdirAll(sourceDir, 0755) t.Setenv("XDG_DATA_HOME", filepath.Join(tmp, "data")) t.Setenv("XDG_STATE_HOME", filepath.Join(tmp, "state")) t.Setenv("XDG_CONFIG_HOME", filepath.Join(tmp, "xdg-config")) + if os.Getenv("HOME") == "" { + os.MkdirAll(homeDir, 0755) + t.Setenv("HOME", homeDir) + } cfgPath := filepath.Join(tmp, "config", "config.yaml") t.Setenv("SKILLSHARE_CONFIG", cfgPath) @@ -85,9 +97,37 @@ func addTrackedRepo(t *testing.T, sourceDir, relPath string) { } } -// addSkillMeta creates a .skillshare-meta.json for a skill (marks it as remotely installed). +// addSkillMeta writes a metadata entry into the centralized .metadata.json store. 
func addSkillMeta(t *testing.T, sourceDir, name, source string) { t.Helper() - meta := `{"source":"` + source + `"}` - os.WriteFile(filepath.Join(sourceDir, name, ".skillshare-meta.json"), []byte(meta), 0644) + store := install.LoadMetadataOrNew(sourceDir) + store.Set(name, &install.MetadataEntry{Source: source}) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("addSkillMeta: %v", err) + } +} + +func addAgent(t *testing.T, agentsDir, relPath string) { + t.Helper() + agentPath := filepath.Join(agentsDir, filepath.FromSlash(relPath)) + if err := os.MkdirAll(filepath.Dir(agentPath), 0o755); err != nil { + t.Fatalf("create agent dir: %v", err) + } + if err := os.WriteFile(agentPath, []byte("---\nname: "+strings.TrimSuffix(filepath.Base(relPath), ".md")+"\n---\n# agent"), 0o644); err != nil { + t.Fatalf("write agent: %v", err) + } +} + +func addAgentMeta(t *testing.T, agentsDir, relPath, source string) { + t.Helper() + store := install.LoadMetadataOrNew(agentsDir) + key := strings.TrimSuffix(filepath.ToSlash(relPath), ".md") + store.Set(key, &install.MetadataEntry{ + Source: source, + Kind: install.MetadataKindAgent, + Subdir: filepath.ToSlash(relPath), + }) + if err := store.Save(agentsDir); err != nil { + t.Fatalf("addAgentMeta: %v", err) + } } diff --git a/internal/server/handler_install.go b/internal/server/handler_install.go index eb3c0301..5508a6ca 100644 --- a/internal/server/handler_install.go +++ b/internal/server/handler_install.go @@ -13,6 +13,16 @@ import ( "skillshare/internal/install" ) +func discoverInstallSource(source *install.Source) (*install.DiscoveryResult, error) { + if source.IsGit() { + if source.HasSubdir() { + return install.DiscoverFromGitSubdir(source) + } + return install.DiscoverFromGit(source) + } + return install.DiscoverLocal(source) +} + // handleDiscover clones a git repo to a temp dir, discovers skills, then cleans up. // Returns whether the caller needs to present a selection UI. 
func (s *Server) handleDiscover(w http.ResponseWriter, r *http.Request) { @@ -41,22 +51,7 @@ func (s *Server) handleDiscover(w http.ResponseWriter, r *http.Request) { } source.Branch = body.Branch - // Non-git sources (local paths) don't need discovery - if !source.IsGit() { - writeJSON(w, map[string]any{ - "needsSelection": false, - "skills": []any{}, - }) - return - } - - // Use subdir-aware discovery when a subdirectory is specified - var discovery *install.DiscoveryResult - if source.HasSubdir() { - discovery, err = install.DiscoverFromGitSubdir(source) - } else { - discovery, err = install.DiscoverFromGit(source) - } + discovery, err := discoverInstallSource(source) if err != nil { writeError(w, http.StatusInternalServerError, err.Error()) return @@ -65,12 +60,18 @@ func (s *Server) handleDiscover(w http.ResponseWriter, r *http.Request) { skills := make([]map[string]string, len(discovery.Skills)) for i, sk := range discovery.Skills { - skills[i] = map[string]string{"name": sk.Name, "path": sk.Path, "description": sk.Description} + skills[i] = map[string]string{"name": sk.Name, "path": sk.Path, "description": sk.Description, "kind": "skill"} + } + + agents := make([]map[string]string, len(discovery.Agents)) + for i, ag := range discovery.Agents { + agents[i] = map[string]string{"name": ag.Name, "path": ag.Path, "fileName": ag.FileName, "kind": "agent"} } writeJSON(w, map[string]any{ - "needsSelection": len(discovery.Skills) > 1, + "needsSelection": len(discovery.Skills) > 1 || len(discovery.Agents) > 0, "skills": skills, + "agents": agents, }) } @@ -91,6 +92,7 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r *http.Request) { SkipAudit bool `json:"skipAudit"` Into string `json:"into"` Name string `json:"name"` + Kind string `json:"kind,omitempty"` } if err := json.NewDecoder(r.Body).Decode(&body); err != nil { writeError(w, http.StatusBadRequest, "invalid JSON body") @@ -108,12 +110,7 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r 
*http.Request) { } source.Branch = body.Branch - var discovery *install.DiscoveryResult - if source.HasSubdir() { - discovery, err = install.DiscoverFromGitSubdir(source) - } else { - discovery, err = install.DiscoverFromGit(source) - } + discovery, err := discoverInstallSource(source) if err != nil { writeError(w, http.StatusInternalServerError, "discovery failed: "+err.Error()) return @@ -129,7 +126,11 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r *http.Request) { // Ensure Into directory exists if body.Into != "" { - if err := os.MkdirAll(filepath.Join(s.cfg.Source, body.Into), 0755); err != nil { + baseDir := s.cfg.Source + if body.Kind == "agent" { + baseDir = s.agentsSource() + } + if err := os.MkdirAll(filepath.Join(baseDir, body.Into), 0755); err != nil { writeError(w, http.StatusInternalServerError, "failed to create into directory: "+err.Error()) return } @@ -144,32 +145,66 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r *http.Request) { SkipAudit: body.SkipAudit, AuditThreshold: s.auditThreshold(), Branch: body.Branch, + SourceDir: s.cfg.Source, } if s.IsProjectMode() { installOpts.AuditProjectRoot = s.projectRoot } + isAgent := body.Kind == "agent" + if isAgent { + installOpts.SourceDir = s.agentsSource() + } + for _, sel := range body.Skills { skillName := sel.Name if body.Name != "" && len(body.Skills) == 1 { skillName = body.Name } - destPath := filepath.Join(s.cfg.Source, body.Into, skillName) - res, err := install.InstallFromDiscovery(discovery, install.SkillInfo{ - Name: sel.Name, - Path: sel.Path, - }, destPath, installOpts) - if err != nil { + + if isAgent { + // Agent install: copy single .md file to agents source + agentsDir := s.agentsSource() + if body.Into != "" { + agentsDir = filepath.Join(agentsDir, body.Into) + } + agentInfo := install.AgentInfo{ + Name: sel.Name, + Path: sel.Path, + FileName: filepath.Base(sel.Path), + } + res, err := install.InstallAgentFromDiscovery(discovery, agentInfo, agentsDir, 
installOpts) + if err != nil { + results = append(results, batchResultItem{ + Name: skillName, + Error: err.Error(), + }) + continue + } + results = append(results, batchResultItem{ + Name: skillName, + Action: res.Action, + Warnings: res.Warnings, + }) + } else { + // Skill install: copy directory to skills source + destPath := filepath.Join(s.cfg.Source, body.Into, skillName) + res, err := install.InstallFromDiscovery(discovery, install.SkillInfo{ + Name: sel.Name, + Path: sel.Path, + }, destPath, installOpts) + if err != nil { + results = append(results, batchResultItem{ + Name: skillName, + Error: err.Error(), + }) + continue + } results = append(results, batchResultItem{ - Name: skillName, - Error: err.Error(), + Name: skillName, + Action: res.Action, + Warnings: res.Warnings, }) - continue } - results = append(results, batchResultItem{ - Name: skillName, - Action: res.Action, - Warnings: res.Warnings, - }) } // Summary for toast @@ -188,10 +223,17 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r *http.Request) { failedSkills = append(failedSkills, r.Name) } } - summary := fmt.Sprintf("Installed %d of %d skills", installed, len(body.Skills)) + kindLabel := "skills" + if isAgent { + kindLabel = "agents" + } + summary := fmt.Sprintf("Installed %d of %d %s", installed, len(body.Skills), kindLabel) if firstErr != "" { summary += " (some errors)" } + if isAgent && installed > 0 && !s.cfg.HasAgentTarget() { + summary += ". 
Warning: none of your configured targets support agents"
+	}

 	status := "ok"
 	if installed < len(body.Skills) {
@@ -221,12 +263,17 @@ func (s *Server) handleInstallBatch(w http.ResponseWriter, r *http.Request) {

 	// Reconcile config after install
 	if installed > 0 {
+		if isAgent {
+			if st, loadErr := install.LoadMetadataWithMigration(s.agentsSource(), install.MetadataKindAgent); loadErr == nil && st != nil {
+				s.agentsStore = st
+			}
+		}
 		if s.IsProjectMode() {
-			if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.registry, s.cfg.Source); rErr != nil {
+			if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.skillsStore, s.cfg.Source); rErr != nil {
 				log.Printf("warning: failed to reconcile project skills config: %v", rErr)
 			}
 		} else {
-			if rErr := config.ReconcileGlobalSkills(s.cfg, s.registry); rErr != nil {
+			if rErr := config.ReconcileGlobalSkills(s.cfg, s.skillsStore); rErr != nil {
 				log.Printf("warning: failed to reconcile global skills config: %v", rErr)
 			}
 		}
@@ -251,6 +298,7 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {
 		SkipAudit bool   `json:"skipAudit"`
 		Track     bool   `json:"track"`
 		Into      string `json:"into"`
+		Kind      string `json:"kind,omitempty"`
 	}
 	if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
 		writeError(w, http.StatusBadRequest, "invalid JSON body")
@@ -275,19 +323,33 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {

 	// Tracked repo install
 	if body.Track {
+		trackedKind, err := install.InferTrackedKind(source, body.Kind)
+		if err != nil {
+			writeError(w, http.StatusBadRequest, err.Error())
+			return
+		}
+
+		trackSourceDir := s.cfg.Source
+		if trackedKind == "agent" {
+			trackSourceDir = s.agentsSource()
+		}
+
 		installOpts := install.InstallOptions{
 			Name:           body.Name,
+			Kind:           trackedKind,
 			Force:          body.Force,
 			SkipAudit:      body.SkipAudit,
 			Into:           body.Into,
 			Branch:         body.Branch,
 			AuditThreshold: s.auditThreshold(),
+			SourceDir:      trackSourceDir,
 		}
 		if s.IsProjectMode() {
 			installOpts.AuditProjectRoot = s.projectRoot
 		}
-		result, err := install.InstallTrackedRepo(source, s.cfg.Source, install.InstallOptions{
+		result, err := install.InstallTrackedRepo(source, trackSourceDir, install.InstallOptions{
 			Name:      installOpts.Name,
+			Kind:      installOpts.Kind,
 			Force:     installOpts.Force,
 			SkipAudit: installOpts.SkipAudit,
 			Into:      installOpts.Into,
@@ -309,13 +371,15 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {
 			return
 		}
 		// Reconcile config after tracked repo install
-		if s.IsProjectMode() {
-			if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.registry, s.cfg.Source); rErr != nil {
-				log.Printf("warning: failed to reconcile project skills config: %v", rErr)
-			}
-		} else {
-			if rErr := config.ReconcileGlobalSkills(s.cfg, s.registry); rErr != nil {
-				log.Printf("warning: failed to reconcile global skills config: %v", rErr)
+		if trackedKind == "skill" {
+			if s.IsProjectMode() {
+				if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.skillsStore, s.cfg.Source); rErr != nil {
+					log.Printf("warning: failed to reconcile project skills config: %v", rErr)
+				}
+			} else {
+				if rErr := config.ReconcileGlobalSkills(s.cfg, s.skillsStore); rErr != nil {
+					log.Printf("warning: failed to reconcile global skills config: %v", rErr)
+				}
 			}
 		}
@@ -323,6 +387,7 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {
 		"source":    body.Source,
 		"mode":      s.installLogMode(),
 		"tracked":   true,
+		"kind":      trackedKind,
 		"force":     body.Force,
 		"threshold": s.auditThreshold(),
 		"scope":     "ui",
@@ -342,7 +407,9 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {
 		writeJSON(w, map[string]any{
 			"repoName":   result.RepoName,
 			"skillCount": result.SkillCount,
+			"agentCount": result.AgentCount,
 			"skills":     result.Skills,
+			"agents":     result.Agents,
 			"action":     result.Action,
 			"warnings":   result.Warnings,
 		})
@@ -394,11 +461,11 @@ func (s *Server) handleInstall(w http.ResponseWriter, r *http.Request) {
 	// Reconcile config after single install
 	if s.IsProjectMode() {
-		if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.registry, s.cfg.Source); rErr != nil {
+		if rErr := config.ReconcileProjectSkills(s.projectRoot, s.projectCfg, s.skillsStore, s.cfg.Source); rErr != nil {
 			log.Printf("warning: failed to reconcile project skills config: %v", rErr)
 		}
 	} else {
-		if rErr := config.ReconcileGlobalSkills(s.cfg, s.registry); rErr != nil {
+		if rErr := config.ReconcileGlobalSkills(s.cfg, s.skillsStore); rErr != nil {
 			log.Printf("warning: failed to reconcile global skills config: %v", rErr)
 		}
 	}
diff --git a/internal/server/handler_install_test.go b/internal/server/handler_install_test.go
new file mode 100644
index 00000000..a1b5bc02
--- /dev/null
+++ b/internal/server/handler_install_test.go
@@ -0,0 +1,241 @@
+package server
+
+import (
+	"bytes"
+	"encoding/json"
+	"net/http"
+	"net/http/httptest"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"testing"
+
+	"skillshare/internal/install"
+)
+
+func entryNames(entries []os.DirEntry) []string {
+	names := make([]string, 0, len(entries))
+	for _, entry := range entries {
+		names = append(names, entry.Name())
+	}
+	return names
+}
+
+func TestHandleInstallBatch_AgentInstallWritesMetadataToAgentsSource(t *testing.T) {
+	s, skillsDir := newTestServer(t)
+
+	agentsDir := filepath.Join(t.TempDir(), "agents")
+	if err := os.MkdirAll(agentsDir, 0755); err != nil {
+		t.Fatalf("failed to create agents dir: %v", err)
+	}
+	s.cfg.AgentsSource = agentsDir
+	s.agentsStore = install.NewMetadataStore()
+
+	repoDir := t.TempDir()
+	initGitRepo(t, repoDir)
+
+	agentPath := filepath.Join(repoDir, "reviewer.md")
+	if err := os.WriteFile(agentPath, []byte("# Reviewer agent"), 0644); err != nil {
+		t.Fatalf("failed to write agent file: %v", err)
+	}
+	for _, args := range [][]string{
+		{"add", "reviewer.md"},
+		{"commit", "-m", "add reviewer agent"},
+	} {
+		cmd := exec.Command("git", args...)
+		cmd.Dir = repoDir
+		if out, err := cmd.CombinedOutput(); err != nil {
+			t.Fatalf("git %v failed: %s %v", args, out, err)
+		}
+	}
+
+	payload, err := json.Marshal(map[string]any{
+		"source": "file://" + repoDir,
+		"skills": []map[string]string{
+			{"name": "reviewer", "path": "reviewer.md"},
+		},
+		"kind": "agent",
+	})
+	if err != nil {
+		t.Fatalf("failed to marshal payload: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/install/batch", bytes.NewReader(payload))
+	rr := httptest.NewRecorder()
+	s.mux.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("unexpected status: got %d, body=%s", rr.Code, rr.Body.String())
+	}
+
+	if _, err := os.Stat(filepath.Join(agentsDir, "reviewer.md")); err != nil {
+		t.Fatalf("expected installed agent in agents source: %v", err)
+	}
+	if _, err := os.Stat(filepath.Join(agentsDir, install.MetadataFileName)); err != nil {
+		t.Fatalf("expected metadata written to agents source: %v", err)
+	}
+	if _, err := os.Stat(filepath.Join(skillsDir, install.MetadataFileName)); !os.IsNotExist(err) {
+		t.Fatalf("expected no agent metadata written to skills source, got err=%v", err)
+	}
+}
+
+func TestHandleDiscover_LocalSourceReturnsAgents(t *testing.T) {
+	s, _ := newTestServer(t)
+
+	sourceDir := t.TempDir()
+	if err := os.WriteFile(filepath.Join(sourceDir, "reviewer.md"), []byte("# Reviewer agent"), 0o644); err != nil {
+		t.Fatalf("failed to write local agent: %v", err)
+	}
+
+	payload, err := json.Marshal(map[string]any{
+		"source": sourceDir,
+	})
+	if err != nil {
+		t.Fatalf("failed to marshal payload: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/discover", bytes.NewReader(payload))
+	rr := httptest.NewRecorder()
+	s.mux.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("unexpected status: got %d, body=%s", rr.Code, rr.Body.String())
+	}
+
+	var resp struct {
+		NeedsSelection bool `json:"needsSelection"`
+		Skills         []struct {
+			Name string `json:"name"`
+		} `json:"skills"`
+		Agents []struct {
+			Name string `json:"name"`
+			Path string `json:"path"`
+			Kind string `json:"kind"`
+		} `json:"agents"`
+	}
+	if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil {
+		t.Fatalf("failed to decode response: %v", err)
+	}
+
+	if !resp.NeedsSelection {
+		t.Fatal("expected local agent discovery to require selection UI")
+	}
+	if len(resp.Skills) != 0 {
+		t.Fatalf("expected no skills, got %d", len(resp.Skills))
+	}
+	if len(resp.Agents) != 1 {
+		t.Fatalf("expected 1 agent, got %d", len(resp.Agents))
+	}
+	if resp.Agents[0].Name != "reviewer" || resp.Agents[0].Path != "reviewer.md" || resp.Agents[0].Kind != "agent" {
+		t.Fatalf("unexpected agent payload: %+v", resp.Agents[0])
+	}
+}
+
+func TestHandleInstallBatch_LocalAgentInstallPreservesNestedPath(t *testing.T) {
+	s, _ := newTestServer(t)
+
+	agentsDir := filepath.Join(t.TempDir(), "agents")
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("failed to create agents dir: %v", err)
+	}
+	s.cfg.AgentsSource = agentsDir
+	s.agentsStore = install.NewMetadataStore()
+
+	sourceDir := t.TempDir()
+	if err := os.MkdirAll(filepath.Join(sourceDir, "demo"), 0o755); err != nil {
+		t.Fatalf("failed to create nested agent dir: %v", err)
+	}
+	if err := os.WriteFile(filepath.Join(sourceDir, "demo", "reviewer.md"), []byte("# Reviewer agent"), 0o644); err != nil {
+		t.Fatalf("failed to write nested agent: %v", err)
+	}
+
+	payload, err := json.Marshal(map[string]any{
+		"source": sourceDir,
+		"skills": []map[string]string{
+			{"name": "reviewer", "path": "demo/reviewer.md"},
+		},
+		"kind": "agent",
+	})
+	if err != nil {
+		t.Fatalf("failed to marshal payload: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/install/batch", bytes.NewReader(payload))
+	rr := httptest.NewRecorder()
+	s.mux.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("unexpected status: got %d, body=%s", rr.Code, rr.Body.String())
+	}
+
+	if _, err := os.Stat(filepath.Join(agentsDir, "demo", "reviewer.md")); err != nil {
+		t.Fatalf("expected installed nested agent in agents source: %v", err)
+	}
+	if got := s.agentsStore.GetByPath("demo/reviewer"); got == nil {
+		t.Fatal("expected nested agent metadata loaded into agentsStore")
+	}
+}
+
+func TestHandleInstall_TrackPureAgentRepoInstallsIntoAgentsSource(t *testing.T) {
+	s, skillsDir := newTestServer(t)
+
+	agentsDir := filepath.Join(t.TempDir(), "agents")
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("failed to create agents dir: %v", err)
+	}
+	s.cfg.AgentsSource = agentsDir
+	s.agentsStore = install.NewMetadataStore()
+
+	repoDir := t.TempDir()
+	initGitRepo(t, repoDir)
+
+	agentPath := filepath.Join(repoDir, "reviewer.md")
+	if err := os.WriteFile(agentPath, []byte("# Reviewer v1"), 0o644); err != nil {
+		t.Fatalf("failed to write agent file: %v", err)
+	}
+	for _, args := range [][]string{
+		{"add", "reviewer.md"},
+		{"commit", "-m", "add reviewer agent"},
+	} {
+		cmd := exec.Command("git", args...)
+		cmd.Dir = repoDir
+		if out, err := cmd.CombinedOutput(); err != nil {
+			t.Fatalf("git %v failed: %s %v", args, out, err)
+		}
+	}
+
+	payload, err := json.Marshal(map[string]any{
+		"source": "file://" + repoDir,
+		"track":  true,
+	})
+	if err != nil {
+		t.Fatalf("failed to marshal payload: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/install", bytes.NewReader(payload))
+	rr := httptest.NewRecorder()
+	s.mux.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("unexpected status: got %d, body=%s", rr.Code, rr.Body.String())
+	}
+
+	var resp struct {
+		RepoName string `json:"repoName"`
+	}
+	if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil {
+		t.Fatalf("failed to decode response: %v", err)
+	}
+	trackedRepoName := resp.RepoName
+	if _, err := os.Stat(filepath.Join(agentsDir, trackedRepoName, ".git")); err != nil {
+		agentEntries, _ := os.ReadDir(agentsDir)
+		skillEntries, _ := os.ReadDir(skillsDir)
+		t.Fatalf("expected tracked agent repo in agents source: %v (agents=%v skills=%v body=%s)", err, entryNames(agentEntries), entryNames(skillEntries), rr.Body.String())
+	}
+	if _, err := os.Stat(filepath.Join(agentsDir, trackedRepoName, "reviewer.md")); err != nil {
+		t.Fatalf("expected tracked agent file in agents source: %v", err)
+	}
+	if _, err := os.Stat(filepath.Join(skillsDir, trackedRepoName)); !os.IsNotExist(err) {
+		t.Fatalf("expected no tracked agent repo in skills source, got err=%v", err)
+	}
+}
diff --git a/internal/server/handler_overview.go b/internal/server/handler_overview.go
index 5fd86120..2cab0557 100644
--- a/internal/server/handler_overview.go
+++ b/internal/server/handler_overview.go
@@ -8,6 +8,7 @@ import (

 	"skillshare/internal/git"
 	"skillshare/internal/install"
+	"skillshare/internal/resource"
 	"skillshare/internal/sync"
 	"skillshare/internal/utils"
 	versioncheck "skillshare/internal/version"
@@ -23,6 +24,11 @@ func (s *Server) handleOverview(w http.ResponseWriter, r *http.Request) {
 	// Snapshot config under RLock, then release before I/O.
 	s.mu.RLock()
 	source := s.cfg.Source
+	agentsSource := s.agentsSource()
+	extrasSource := s.cfg.ExtrasSource
+	if s.IsProjectMode() {
+		extrasSource = filepath.Join(s.projectRoot, ".skillshare", "extras")
+	}
 	cfgMode := s.cfg.Mode
 	targetCount := len(s.cfg.Targets)
 	projectRoot := s.projectRoot
@@ -54,9 +60,18 @@ func (s *Server) handleOverview(w http.ResponseWriter, r *http.Request) {
 	// Tracked repos
 	trackedRepos := buildTrackedRepos(source, skills)

+	// Count agents
+	agentCount := 0
+	if agentsSource != "" {
+		if agents, discoverErr := (resource.AgentKind{}).Discover(agentsSource); discoverErr == nil {
+			agentCount = len(agents)
+		}
+	}
+
 	resp := map[string]any{
 		"source":        source,
 		"skillCount":    len(skills),
+		"agentCount":    agentCount,
 		"topLevelCount": topLevelCount,
 		"targetCount":   targetCount,
 		"mode":          mode,
@@ -64,6 +79,12 @@ func (s *Server) handleOverview(w http.ResponseWriter, r *http.Request) {
 		"trackedRepos":  trackedRepos,
 		"isProjectMode": isProjectMode,
 	}
+	if agentsSource != "" {
+		resp["agentsSource"] = agentsSource
+	}
+	if extrasSource != "" {
+		resp["extrasSource"] = extrasSource
+	}
 	if isProjectMode {
 		resp["projectRoot"] = projectRoot
 	}
diff --git a/internal/server/handler_overview_test.go b/internal/server/handler_overview_test.go
index e744d73c..698b0eb2 100644
--- a/internal/server/handler_overview_test.go
+++ b/internal/server/handler_overview_test.go
@@ -4,6 +4,8 @@ import (
 	"encoding/json"
 	"net/http"
 	"net/http/httptest"
+	"os"
+	"path/filepath"
 	"testing"
 )

@@ -50,6 +52,34 @@ func TestHandleOverview_WithSkills(t *testing.T) {
 	}
 }

+func TestHandleOverview_AgentCountIncludesNestedAgents(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(filepath.Join(agentsDir, "demo"), 0755); err != nil {
+		t.Fatalf("mkdir agents dir: %v", err)
+	}
+	if err := os.WriteFile(filepath.Join(agentsDir, "top-level.md"), []byte("# Top"), 0644); err != nil {
+		t.Fatalf("write top-level agent: %v", err)
+	}
+	if err := os.WriteFile(filepath.Join(agentsDir, "demo", "nested-agent.md"), []byte("# Nested"), 0644); err != nil {
+		t.Fatalf("write nested agent: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodGet, "/api/overview", nil)
+	rr := httptest.NewRecorder()
+	s.handler.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String())
+	}
+
+	var resp map[string]any
+	json.Unmarshal(rr.Body.Bytes(), &resp)
+	if resp["agentCount"].(float64) != 2 {
+		t.Errorf("expected 2 agents including nested entries, got %v", resp["agentCount"])
+	}
+}
+
 func TestHandleOverview_ProjectMode(t *testing.T) {
 	tmp := t.TempDir()
 	s, _ := newTestServer(t)
diff --git a/internal/server/handler_resources_agents_test.go b/internal/server/handler_resources_agents_test.go
new file mode 100644
index 00000000..50b7cacb
--- /dev/null
+++ b/internal/server/handler_resources_agents_test.go
@@ -0,0 +1,229 @@
+package server
+
+import (
+	"encoding/json"
+	"net/http"
+	"net/http/httptest"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"strings"
+	"testing"
+
+	"skillshare/internal/install"
+	"skillshare/internal/trash"
+)
+
+func TestHandleGetSkill_AgentKind(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("create agents dir: %v", err)
+	}
+	addAgent(t, agentsDir, "demo/reviewer.md")
+	if err := os.WriteFile(filepath.Join(agentsDir, ".agentignore"), []byte("demo/reviewer.md\n"), 0o644); err != nil {
+		t.Fatalf("write .agentignore: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodGet, "/api/resources/demo__reviewer.md?kind=agent", nil)
+	req.SetPathValue("name", "demo__reviewer.md")
+	rr := httptest.NewRecorder()
+	s.handleGetSkill(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String())
+	}
+
+	var resp struct {
+		Resource struct {
+			Kind     string `json:"kind"`
+			RelPath  string `json:"relPath"`
+			Disabled bool   `json:"disabled"`
+		} `json:"resource"`
+	}
+	if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil {
+		t.Fatalf("decode response: %v", err)
+	}
+	if resp.Resource.Kind != "agent" {
+		t.Fatalf("expected kind=agent, got %q", resp.Resource.Kind)
+	}
+	if resp.Resource.RelPath != "demo/reviewer.md" {
+		t.Fatalf("expected relPath demo/reviewer.md, got %q", resp.Resource.RelPath)
+	}
+	if !resp.Resource.Disabled {
+		t.Fatal("expected agent detail to report disabled=true")
+	}
+}
+
+func TestHandleListSkills_AgentDisabled(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("create agents dir: %v", err)
+	}
+	addAgent(t, agentsDir, "demo/reviewer.md")
+	if err := os.WriteFile(filepath.Join(agentsDir, ".agentignore"), []byte("demo/reviewer.md\n"), 0o644); err != nil {
+		t.Fatalf("write .agentignore: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodGet, "/api/resources?kind=agent", nil)
+	rr := httptest.NewRecorder()
+	s.handleListSkills(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String())
+	}
+
+	var resp struct {
+		Resources []struct {
+			FlatName string `json:"flatName"`
+			Disabled bool   `json:"disabled"`
+		} `json:"resources"`
+	}
+	if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil {
+		t.Fatalf("decode response: %v", err)
+	}
+	if len(resp.Resources) != 1 {
+		t.Fatalf("expected 1 agent, got %d", len(resp.Resources))
+	}
+	if resp.Resources[0].FlatName != "demo__reviewer.md" {
+		t.Fatalf("expected demo__reviewer.md, got %q", resp.Resources[0].FlatName)
+	}
+	if !resp.Resources[0].Disabled {
+		t.Fatal("expected disabled=true in agent list response")
+	}
+}
+
+func TestHandleToggleSkill_AgentKind(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("create agents dir: %v", err)
+	}
+	addAgent(t, agentsDir, "demo/reviewer.md")
+
+	disableReq := httptest.NewRequest(http.MethodPost, "/api/resources/demo__reviewer.md/disable?kind=agent", nil)
+	disableReq.SetPathValue("name", "demo__reviewer.md")
+	disableRR := httptest.NewRecorder()
+	s.handleDisableSkill(disableRR, disableReq)
+
+	if disableRR.Code != http.StatusOK {
+		t.Fatalf("disable expected 200, got %d: %s", disableRR.Code, disableRR.Body.String())
+	}
+
+	data, err := os.ReadFile(filepath.Join(agentsDir, ".agentignore"))
+	if err != nil {
+		t.Fatalf("read .agentignore: %v", err)
+	}
+	if !strings.Contains(string(data), "demo/reviewer.md") {
+		t.Fatalf("expected .agentignore to contain demo/reviewer.md, got %q", string(data))
+	}
+
+	enableReq := httptest.NewRequest(http.MethodPost, "/api/resources/demo__reviewer.md/enable?kind=agent", nil)
+	enableReq.SetPathValue("name", "demo__reviewer.md")
+	enableRR := httptest.NewRecorder()
+	s.handleEnableSkill(enableRR, enableReq)
+
+	if enableRR.Code != http.StatusOK {
+		t.Fatalf("enable expected 200, got %d: %s", enableRR.Code, enableRR.Body.String())
+	}
+
+	data, err = os.ReadFile(filepath.Join(agentsDir, ".agentignore"))
+	if err != nil {
+		t.Fatalf("read .agentignore after enable: %v", err)
+	}
+	if strings.Contains(string(data), "demo/reviewer.md") {
+		t.Fatalf("expected demo/reviewer.md to be removed from .agentignore, got %q", string(data))
+	}
+}
+
+func TestHandleUninstallSkill_AgentKind(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("create agents dir: %v", err)
+	}
+	addAgent(t, agentsDir, "demo/reviewer.md")
+
+	store := install.LoadMetadataOrNew(agentsDir)
+	store.Set("demo/reviewer", &install.MetadataEntry{
+		Source: "file:///tmp/reviewer",
+		Kind:   install.MetadataKindAgent,
+		Subdir: "demo/reviewer.md",
+	})
+	if err := store.Save(agentsDir); err != nil {
+		t.Fatalf("save metadata: %v", err)
+	}
+	s.agentsStore = store
+
+	req := httptest.NewRequest(http.MethodDelete, "/api/resources/demo__reviewer.md?kind=agent", nil)
+	req.SetPathValue("name", "demo__reviewer.md")
+	rr := httptest.NewRecorder()
+	s.handleUninstallSkill(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String())
+	}
+
+	if _, err := os.Stat(filepath.Join(agentsDir, "demo", "reviewer.md")); !os.IsNotExist(err) {
+		t.Fatalf("expected agent file removed, stat err=%v", err)
+	}
+	if entry := trash.FindByName(s.agentTrashBase(), "demo/reviewer"); entry == nil {
+		t.Fatal("expected agent to be moved to agent trash")
+	}
+	if got := s.agentsStore.GetByPath("demo/reviewer"); got != nil {
+		t.Fatalf("expected agent metadata removed, got %+v", got)
+	}
+}
+
+func TestUpdateSingleByKind_Agent(t *testing.T) {
+	s, _ := newTestServer(t)
+	agentsDir := s.agentsSource()
+	if err := os.MkdirAll(agentsDir, 0o755); err != nil {
+		t.Fatalf("create agents dir: %v", err)
+	}
+	addAgent(t, agentsDir, "demo/reviewer.md")
+
+	repoDir := t.TempDir()
+	initGitRepo(t, repoDir)
+	addAgent(t, repoDir, "demo/reviewer.md")
+	if err := os.WriteFile(filepath.Join(repoDir, "demo", "reviewer.md"), []byte("# updated agent\n"), 0o644); err != nil {
+		t.Fatalf("write repo agent: %v", err)
+	}
+	for _, args := range [][]string{{"add", "."}, {"commit", "-m", "add agent"}} {
+		cmd := exec.Command("git", args...)
+		cmd.Dir = repoDir
+		if out, err := cmd.CombinedOutput(); err != nil {
+			t.Fatalf("git %v failed: %s %v", args, out, err)
+		}
+	}
+
+	source := "file://" + filepath.ToSlash(repoDir) + "//demo/reviewer.md"
+	store := install.LoadMetadataOrNew(agentsDir)
+	store.Set("demo/reviewer", &install.MetadataEntry{
+		Source:  source,
+		Kind:    install.MetadataKindAgent,
+		Subdir:  "demo/reviewer.md",
+		Version: "stale-version",
+	})
+	if err := store.Save(agentsDir); err != nil {
+		t.Fatalf("save metadata: %v", err)
+	}
+	s.agentsStore = store
+
+	result := s.updateSingleByKind("demo__reviewer.md", "agent", false, true)
+	if result.Action != "updated" {
+		t.Fatalf("expected updated, got %+v", result)
+	}
+	if result.Kind != "agent" {
+		t.Fatalf("expected kind=agent, got %+v", result)
+	}
+
+	data, err := os.ReadFile(filepath.Join(agentsDir, "demo", "reviewer.md"))
+	if err != nil {
+		t.Fatalf("read updated agent: %v", err)
+	}
+	if string(data) != "# updated agent\n" {
+		t.Fatalf("expected updated agent content, got %q", string(data))
+	}
+}
diff --git a/internal/server/handler_skills.go b/internal/server/handler_skills.go
index 26b89b3b..2ca95a65 100644
--- a/internal/server/handler_skills.go
+++ b/internal/server/handler_skills.go
@@ -9,9 +9,9 @@ import (
 	"strings"
 	"time"

-	"skillshare/internal/config"
 	"skillshare/internal/git"
 	"skillshare/internal/install"
+	"skillshare/internal/resource"
 	"skillshare/internal/sync"
 	"skillshare/internal/trash"
 	"skillshare/internal/utils"
@@ -19,6 +19,7 @@ import (

 type skillItem struct {
 	Name        string `json:"name"`
+	Kind        string `json:"kind"` // "skill" or "agent"
 	FlatName    string `json:"flatName"`
 	RelPath     string `json:"relPath"`
 	SourcePath  string `json:"sourcePath"`
@@ -44,118 +45,231 @@ func enrichSkillBranch(item *skillItem) {
 }

 func (s *Server) handleListSkills(w http.ResponseWriter, r *http.Request) {
+	kindFilter := r.URL.Query().Get("kind") // "", "skill", "agent"
+
 	// Snapshot config under RLock, then release before I/O.
 	s.mu.RLock()
 	source := s.cfg.Source
+	agentsSource := s.agentsSource()
 	s.mu.RUnlock()

-	discovered, err := sync.DiscoverSourceSkillsAll(source)
-	if err != nil {
-		writeError(w, http.StatusInternalServerError, err.Error())
-		return
-	}
+	var items []skillItem

-	items := make([]skillItem, 0, len(discovered))
-	for _, d := range discovered {
-		item := skillItem{
-			Name:       filepath.Base(d.SourcePath),
-			FlatName:   d.FlatName,
-			RelPath:    d.RelPath,
-			SourcePath: d.SourcePath,
-			IsInRepo:   d.IsInRepo,
-			Targets:    d.Targets,
-			Disabled:   d.Disabled,
+	// Skills
+	if kindFilter == "" || kindFilter == "skill" {
+		discovered, err := sync.DiscoverSourceSkillsAll(source)
+		if err != nil {
+			writeError(w, http.StatusInternalServerError, err.Error())
+			return
 		}
-		// Enrich with metadata if available
-		if meta, _ := install.ReadMeta(d.SourcePath); meta != nil {
-			item.InstalledAt = meta.InstalledAt.Format("2006-01-02T15:04:05Z")
-			item.Source = meta.Source
-			item.Type = meta.Type
-			item.RepoURL = meta.RepoURL
-			item.Version = meta.Version
-			item.Branch = meta.Branch
+		for _, d := range discovered {
+			item := skillItem{
+				Name:       filepath.Base(d.SourcePath),
+				Kind:       "skill",
+				FlatName:   d.FlatName,
+				RelPath:    d.RelPath,
+				SourcePath: d.SourcePath,
+				IsInRepo:   d.IsInRepo,
+				Targets:    d.Targets,
+				Disabled:   d.Disabled,
+			}
+
+			if entry := s.skillsStore.GetByPath(d.RelPath); entry != nil {
+				if !entry.InstalledAt.IsZero() {
+					item.InstalledAt = entry.InstalledAt.Format(time.RFC3339)
+				}
+				item.Source = entry.Source
+				item.Type = entry.Type
+				item.RepoURL = entry.RepoURL
+				item.Version = entry.Version
+				item.Branch = entry.Branch
+			}
+			enrichSkillBranch(&item)
+
+			items = append(items, item)
 		}
-		enrichSkillBranch(&item)
+	}
+
+	// Agents — recursive discovery (supports --into subdirectories)
+	if (kindFilter == "" || kindFilter == "agent") && agentsSource != "" {
+		discovered, _ := resource.AgentKind{}.Discover(agentsSource)
+		for _, d := range discovered {
+			item := skillItem{
+				Name:       d.Name,
+				Kind:       "agent",
+				FlatName:   d.FlatName,
+				RelPath:    d.RelPath,
+				SourcePath: d.SourcePath,
+				IsInRepo:   d.IsInRepo,
+				Disabled:   d.Disabled,
+				Targets:    d.Targets,
+			}

-		items = append(items, item)
+			// Read from centralized agents metadata store
+			agentKey := strings.TrimSuffix(d.RelPath, ".md")
+			if entry := s.agentsStore.GetByPath(agentKey); entry != nil {
+				if !entry.InstalledAt.IsZero() {
+					item.InstalledAt = entry.InstalledAt.Format(time.RFC3339)
+				}
+				item.Source = entry.Source
+				item.Type = entry.Type
+				item.RepoURL = entry.RepoURL
+				item.Version = entry.Version
+			} else if d.RepoRelPath != "" {
+				repoPath := filepath.Join(agentsSource, filepath.FromSlash(d.RepoRelPath))
+				if repoURL, err := git.GetRemoteURL(repoPath); err == nil {
+					item.Source = repoURL
+					item.RepoURL = repoURL
+				}
+				if version, err := git.GetCurrentFullHash(repoPath); err == nil {
+					item.Version = version
+				}
+			}
+
+			items = append(items, item)
+		}
 	}

-	writeJSON(w, map[string]any{"skills": items})
+	writeJSON(w, map[string]any{"resources": items})
 }

 func (s *Server) handleGetSkill(w http.ResponseWriter, r *http.Request) {
 	// Snapshot config under RLock, then release before I/O.
 	s.mu.RLock()
 	source := s.cfg.Source
+	agentsSource := s.agentsSource()
 	s.mu.RUnlock()

 	name := r.PathValue("name")
-
-	// Find the skill by flat name or base name
-	discovered, err := sync.DiscoverSourceSkillsAll(source)
-	if err != nil {
-		writeError(w, http.StatusInternalServerError, err.Error())
+	kind := r.URL.Query().Get("kind")
+	if kind != "" && kind != "skill" && kind != "agent" {
+		writeError(w, http.StatusBadRequest, "invalid kind: "+kind)
 		return
 	}

-	for _, d := range discovered {
-		baseName := filepath.Base(d.SourcePath)
-		if d.FlatName != name && baseName != name {
-			continue
+	// Find the skill by flat name or base name
+	if kind != "agent" {
+		discovered, err := sync.DiscoverSourceSkillsAll(source)
+		if err != nil {
+			writeError(w, http.StatusInternalServerError, err.Error())
+			return
 		}

-		item := skillItem{
-			Name:       baseName,
-			FlatName:   d.FlatName,
-			RelPath:    d.RelPath,
-			SourcePath: d.SourcePath,
-			IsInRepo:   d.IsInRepo,
-			Targets:    d.Targets,
-			Disabled:   d.Disabled,
-		}
+		for _, d := range discovered {
+			baseName := filepath.Base(d.SourcePath)
+			if d.FlatName != name && baseName != name {
+				continue
+			}

-		if meta, _ := install.ReadMeta(d.SourcePath); meta != nil {
-			item.InstalledAt = meta.InstalledAt.Format("2006-01-02T15:04:05Z")
-			item.Source = meta.Source
-			item.Type = meta.Type
-			item.RepoURL = meta.RepoURL
-			item.Version = meta.Version
-			item.Branch = meta.Branch
-		}
-		enrichSkillBranch(&item)
+			item := skillItem{
+				Name:       baseName,
+				Kind:       "skill",
+				FlatName:   d.FlatName,
+				RelPath:    d.RelPath,
+				SourcePath: d.SourcePath,
+				IsInRepo:   d.IsInRepo,
+				Targets:    d.Targets,
+				Disabled:   d.Disabled,
+			}

-		// Read SKILL.md content
-		skillMdContent := ""
-		skillMdPath := filepath.Join(d.SourcePath, "SKILL.md")
-		if data, err := os.ReadFile(skillMdPath); err == nil {
-			skillMdContent = string(data)
-		}
+			if entry := s.skillsStore.GetByPath(d.RelPath); entry != nil {
+				if !entry.InstalledAt.IsZero() {
+					item.InstalledAt = entry.InstalledAt.Format(time.RFC3339)
+				}
+				item.Source = entry.Source
+				item.Type = entry.Type
+				item.RepoURL = entry.RepoURL
+				item.Version = entry.Version
+				item.Branch = entry.Branch
+			}
+			enrichSkillBranch(&item)

-		// List all files in the skill directory
-		files := make([]string, 0)
-		filepath.Walk(d.SourcePath, func(path string, info os.FileInfo, err error) error {
-			if err != nil {
+			// Read SKILL.md content
+			skillMdContent := ""
+			skillMdPath := filepath.Join(d.SourcePath, "SKILL.md")
+			if data, err := os.ReadFile(skillMdPath); err == nil {
+				skillMdContent = string(data)
+			}
+
+			// List all files in the skill directory
+			files := make([]string, 0)
+			filepath.Walk(d.SourcePath, func(path string, info os.FileInfo, err error) error {
+				if err != nil {
+					return nil
+				}
+				if info.IsDir() && utils.IsHidden(info.Name()) {
+					return filepath.SkipDir
+				}
+				if !info.IsDir() {
+					rel, _ := filepath.Rel(d.SourcePath, path)
+					// Normalize separators
+					rel = strings.ReplaceAll(rel, "\\", "/")
+					files = append(files, rel)
+				}
 				return nil
+			})
+
+			writeJSON(w, map[string]any{
+				"resource":       item,
+				"skillMdContent": skillMdContent,
+				"files":          files,
+			})
+			return
+		}
+	}
+
+	// Fallback: check agents source (recursive — supports --into subdirectories)
+	if kind != "skill" && agentsSource != "" {
+		agentDiscovered, _ := resource.AgentKind{}.Discover(agentsSource)
+		for _, d := range agentDiscovered {
+			if !matchesAgentName(d, name) {
+				continue
 			}
-			if info.IsDir() && utils.IsHidden(info.Name()) {
-				return filepath.SkipDir
+
+			data, readErr := os.ReadFile(d.SourcePath)
+			if readErr != nil {
+				continue
 			}
-			if !info.IsDir() {
-				rel, _ := filepath.Rel(d.SourcePath, path)
-				// Normalize separators
-				rel = strings.ReplaceAll(rel, "\\", "/")
-				files = append(files, rel)
+
+			item := skillItem{
+				Name:       d.Name,
+				Kind:       "agent",
+				FlatName:   d.FlatName,
+				RelPath:    d.RelPath,
+				SourcePath: d.SourcePath,
+				IsInRepo:   d.IsInRepo,
+				Disabled:   d.Disabled,
+				Targets:    d.Targets,
 			}
-			return nil
-		})

-		writeJSON(w, map[string]any{
-			"skill":          item,
-			"skillMdContent": skillMdContent,
-			"files":          files,
-		})
-		return
+			agentKey := strings.TrimSuffix(d.RelPath, ".md")
+			if entry := s.agentsStore.GetByPath(agentKey); entry != nil {
+				if !entry.InstalledAt.IsZero() {
+					item.InstalledAt = entry.InstalledAt.Format(time.RFC3339)
+				}
+				item.Source = entry.Source
+				item.Type = entry.Type
+				item.RepoURL = entry.RepoURL
+				item.Version = entry.Version
+			} else if d.RepoRelPath != "" {
+				repoPath := filepath.Join(agentsSource, filepath.FromSlash(d.RepoRelPath))
+				if repoURL, err := git.GetRemoteURL(repoPath); err == nil {
+					item.Source = repoURL
+					item.RepoURL = repoURL
+				}
+				if version, err := git.GetCurrentFullHash(repoPath); err == nil {
+					item.Version = version
+				}
+			}
+
+			writeJSON(w, map[string]any{
+				"resource":       item,
+				"skillMdContent": string(data),
+				"files":          []string{filepath.Base(d.RelPath)},
+			})
+			return
+		}
 	}

 	writeError(w, http.StatusNotFound, "skill not found: "+name)
@@ -266,7 +380,7 @@ func (s *Server) handleUninstallRepo(w http.ResponseWriter, r *http.Request) {
 		return
 	}

-	// Prune registry entries: the repo itself + skills belonging to it.
+	// Prune store entries: the repo itself + skills belonging to it.
 	// Legacy entries use Group without "_" prefix (e.g., "team-skills" for repo "_team-skills").
 	// Only apply legacy matching for top-level repos (no "/" in repoName) to avoid
 	// basename collisions between sibling nested repos like org/_team-skills vs dept/_team-skills.
@@ -274,35 +388,35 @@ func (s *Server) handleUninstallRepo(w http.ResponseWriter, r *http.Request) {
 	if !strings.Contains(repoName, "/") {
 		legacyGroup = strings.TrimPrefix(repoName, "_")
 	}
-	filtered := make([]config.SkillEntry, 0, len(s.registry.Skills))
-	for _, entry := range s.registry.Skills {
-		fullName := entry.FullName()
+	for _, name := range s.skillsStore.List() {
+		entry := s.skillsStore.Get(name)
+		if entry == nil {
+			continue
+		}
 		// Match the repo's own entry (e.g., "_team-skills" or "org/_team-skills")
-		if fullName == repoName {
+		if name == repoName {
+			s.skillsStore.Remove(name)
 			continue
 		}
 		// Match tracked skills grouped under this repo (exact group match)
 		if entry.Tracked && entry.Group == repoName {
+			s.skillsStore.Remove(name)
 			continue
 		}
 		// Match legacy grouped entries (top-level repos only, e.g., group="team-skills")
 		if legacyGroup != "" && entry.Tracked && entry.Group == legacyGroup {
+			s.skillsStore.Remove(name)
 			continue
 		}
 		// Match nested members (e.g., "org/_team-skills/sub-skill")
-		if strings.HasPrefix(fullName, repoName+"/") {
+		if strings.HasPrefix(name, repoName+"/") {
+			s.skillsStore.Remove(name)
 			continue
 		}
-		filtered = append(filtered, entry)
 	}
-	s.registry.Skills = filtered
-	regDir := s.cfg.RegistryDir
-	if s.IsProjectMode() {
-		regDir = filepath.Join(s.projectRoot, ".skillshare")
-	}
-	if err := s.registry.Save(regDir); err != nil {
-		log.Printf("warning: failed to save registry after repo uninstall: %v", err)
+	if err := s.skillsStore.Save(s.cfg.Source); err != nil {
+		log.Printf("warning: failed to save metadata after repo uninstall: %v", err)
 	}

 	s.writeOpsLog("uninstall", "ok", start, map[string]any{
@@ -320,6 +434,47 @@ func (s *Server) handleUninstallSkill(w http.ResponseWriter, r *http.Request) {
 	defer s.mu.Unlock()

 	name := r.PathValue("name")
+	kind := r.URL.Query().Get("kind")
+	if kind != "" && kind != "skill" && kind != "agent" {
+		writeError(w, http.StatusBadRequest, "invalid kind: "+kind)
+		return
+	}
+
+	if kind == "agent" {
+		agentsSource := s.agentsSource()
+		if agentsSource == "" {
+			writeError(w, http.StatusNotFound, "agent not found: "+name)
+			return
+		}
+		agent, err := resolveAgentResource(agentsSource, name)
+		if err != nil {
+			writeError(w, http.StatusNotFound, err.Error())
+			return
+		}
+
+		displayName := agentMetaKey(agent.RelPath)
+		legacySidecar := filepath.Join(filepath.Dir(agent.SourcePath), filepath.Base(displayName)+".skillshare-meta.json")
+		if _, err := trash.MoveAgentToTrash(agent.SourcePath, legacySidecar, displayName, s.agentTrashBase()); err != nil {
+			writeError(w, http.StatusInternalServerError, "failed to trash agent: "+err.Error())
+			return
+		}
+
+		if s.agentsStore != nil {
+			s.agentsStore.Remove(displayName)
+			if err := s.agentsStore.Save(agentsSource); err != nil {
+				log.Printf("warning: failed to save agent metadata after uninstall: %v", err)
+			}
+		}
+
+		s.writeOpsLog("uninstall", "ok", start, map[string]any{
+			"name":  displayName,
+			"type":  "agent",
+			"scope": "ui",
+		}, "")
+
+		writeJSON(w, map[string]any{"success": true, "name": displayName, "movedToTrash": true})
+		return
+	}

 	// Find skill path
 	discovered, err := sync.DiscoverSourceSkills(s.cfg.Source)
diff --git a/internal/server/handler_skills_batch.go b/internal/server/handler_skills_batch.go
index 1c9f99b6..af0656cc 100644
--- a/internal/server/handler_skills_batch.go
+++ b/internal/server/handler_skills_batch.go
@@ -7,7 +7,7 @@ import (
 	"strings"
 	"time"

-	"skillshare/internal/install"
+	"skillshare/internal/resource"
 	ssync "skillshare/internal/sync"
 	"skillshare/internal/utils"
 )
@@ -73,8 +73,12 @@ func (s *Server) handleBatchSetTargets(w http.ResponseWriter, r *http.Request) {
 	var updated, skipped int
 	var errors []string

-	// Collect paths that need meta hash refresh (outside the lock).
-	var updatedPaths []string
+	// Collect skills that need meta hash refresh (outside the lock).
+	type updatedSkill struct {
+		name string
+		path string
+	}
+	var updatedSkills []updatedSkill

 	// Acquire write lock only for the file-write loop.
 	s.mu.Lock()
@@ -101,14 +105,20 @@ func (s *Server) handleBatchSetTargets(w http.ResponseWriter, r *http.Request) {
 			continue
 		}

-		updatedPaths = append(updatedPaths, d.SourcePath)
+		updatedSkills = append(updatedSkills, updatedSkill{
+			name: filepath.Base(d.SourcePath),
+			path: d.SourcePath,
+		})
 		updated++
 	}
 	s.mu.Unlock()

 	// Recompute file hashes outside the lock so reads aren't blocked.
-	for _, p := range updatedPaths {
-		install.RefreshMetaHashes(p)
+	for _, sk := range updatedSkills {
+		s.skillsStore.RefreshHashes(sk.name, sk.path)
+	}
+	if len(updatedSkills) > 0 {
+		s.skillsStore.Save(s.cfg.Source) //nolint:errcheck
 	}

 	s.writeOpsLog("batch-set-targets", "ok", start, map[string]any{
@@ -182,7 +192,8 @@ func (s *Server) handleSetSkillTargets(w http.ResponseWriter, r *http.Request) {
 			return
 		}

-		install.RefreshMetaHashes(d.SourcePath)
+		s.skillsStore.RefreshHashes(d.RelPath, d.SourcePath)
+		s.skillsStore.Save(s.cfg.Source) //nolint:errcheck

 		s.writeOpsLog("set-skill-targets", "ok", start, map[string]any{
 			"name":   name,
@@ -194,5 +205,42 @@ func (s *Server) handleSetSkillTargets(w http.ResponseWriter, r *http.Request) {
 		return
 	}

-	writeError(w, http.StatusNotFound, "skill not found: "+name)
+	// Try agents if skill not found
+	s.mu.RLock()
+	agentsSource := s.agentsSource()
+	s.mu.RUnlock()
+
+	if agentsSource != "" {
+		agents, _ := resource.AgentKind{}.Discover(agentsSource)
+		for _, d := range agents {
+			if d.FlatName != name {
+				continue
+			}
+
+			var values []string
+			if req.Target != "" {
+				values = []string{req.Target}
+			}
+
+			s.mu.Lock()
+			err := utils.SetFrontmatterList(d.SourcePath, "targets", values)
+			s.mu.Unlock()
+
+			if err != nil {
+				writeError(w, http.StatusInternalServerError, "failed to update agent: "+err.Error())
+				return
+			}
+
+			s.writeOpsLog("set-agent-targets", "ok", start, map[string]any{
+				"name": name,
+ "target": req.Target, + "scope": "ui", + }, "") + + writeJSON(w, map[string]any{"success": true}) + return + } + } + + writeError(w, http.StatusNotFound, "resource not found: "+name) } diff --git a/internal/server/handler_skills_batch_test.go b/internal/server/handler_skills_batch_test.go index b8f07da2..40d57d6a 100644 --- a/internal/server/handler_skills_batch_test.go +++ b/internal/server/handler_skills_batch_test.go @@ -45,7 +45,7 @@ func TestHandleBatchSetTargets_SetSingleTarget(t *testing.T) { addSkillNested(t, src, "frontend/skill-b") body := `{"folder":"frontend","target":"claude"}` - req := httptest.NewRequest(http.MethodPost, "/api/skills/batch/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources/batch/targets", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -84,7 +84,7 @@ func TestHandleBatchSetTargets_RemoveTargets(t *testing.T) { // Set target="" to remove targets (root-level folder) body := `{"folder":"","target":""}` - req := httptest.NewRequest(http.MethodPost, "/api/skills/batch/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources/batch/targets", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -114,7 +114,7 @@ func TestHandleBatchSetTargets_PathTraversal(t *testing.T) { s, _ := newTestServer(t) body := `{"folder":"../../../etc","target":"claude"}` - req := httptest.NewRequest(http.MethodPost, "/api/skills/batch/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources/batch/targets", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -132,7 +132,7 @@ func TestHandleBatchSetTargets_EmptyFolder_RootOnly(t *testing.T) { addSkillNested(t, src, "nested-folder/deep-skill") body := `{"folder":"","target":"cursor"}` - req := httptest.NewRequest(http.MethodPost, "/api/skills/batch/targets", 
bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources/batch/targets", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -190,7 +190,7 @@ func TestHandleBatchSetTargets_FolderTrailingSlash(t *testing.T) { // Trailing slash should still match (server cleans the input) body := `{"folder":"frontend/","target":"claude"}` - req := httptest.NewRequest(http.MethodPost, "/api/skills/batch/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPost, "/api/resources/batch/targets", bytes.NewBufferString(body)) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -214,7 +214,7 @@ func TestHandleSetSkillTargets_SetTarget(t *testing.T) { addSkill(t, src, "my-skill") body := `{"target":"claude"}` - req := httptest.NewRequest(http.MethodPatch, "/api/skills/my-skill/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPatch, "/api/resources/my-skill/targets", bytes.NewBufferString(body)) req.SetPathValue("name", "my-skill") rr := httptest.NewRecorder() s.handleSetSkillTargets(rr, req) @@ -244,7 +244,7 @@ func TestHandleSetSkillTargets_RemoveTarget(t *testing.T) { addSkillWithTargets(t, src, "my-skill", []string{"claude", "cursor"}) body := `{"target":""}` - req := httptest.NewRequest(http.MethodPatch, "/api/skills/my-skill/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPatch, "/api/resources/my-skill/targets", bytes.NewBufferString(body)) req.SetPathValue("name", "my-skill") rr := httptest.NewRecorder() s.handleSetSkillTargets(rr, req) @@ -265,7 +265,7 @@ func TestHandleSetSkillTargets_NotFound(t *testing.T) { s, _ := newTestServer(t) body := `{"target":"claude"}` - req := httptest.NewRequest(http.MethodPatch, "/api/skills/nonexistent/targets", bytes.NewBufferString(body)) + req := httptest.NewRequest(http.MethodPatch, "/api/resources/nonexistent/targets", bytes.NewBufferString(body)) req.SetPathValue("name", 
"nonexistent") rr := httptest.NewRecorder() s.handleSetSkillTargets(rr, req) diff --git a/internal/server/handler_skills_test.go b/internal/server/handler_skills_test.go index f87f2286..44452941 100644 --- a/internal/server/handler_skills_test.go +++ b/internal/server/handler_skills_test.go @@ -9,13 +9,13 @@ import ( "strings" "testing" - "skillshare/internal/config" + "skillshare/internal/install" "skillshare/internal/trash" ) func TestHandleListSkills_Empty(t *testing.T) { s, _ := newTestServer(t) - req := httptest.NewRequest(http.MethodGet, "/api/skills", nil) + req := httptest.NewRequest(http.MethodGet, "/api/resources", nil) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -24,11 +24,11 @@ func TestHandleListSkills_Empty(t *testing.T) { } var resp struct { - Skills []any `json:"skills"` + Resources []any `json:"resources"` } json.Unmarshal(rr.Body.Bytes(), &resp) - if len(resp.Skills) != 0 { - t.Errorf("expected 0 skills, got %d", len(resp.Skills)) + if len(resp.Resources) != 0 { + t.Errorf("expected 0 resources, got %d", len(resp.Resources)) } } @@ -37,7 +37,7 @@ func TestHandleListSkills_WithSkills(t *testing.T) { addSkill(t, src, "alpha") addSkill(t, src, "beta") - req := httptest.NewRequest(http.MethodGet, "/api/skills", nil) + req := httptest.NewRequest(http.MethodGet, "/api/resources", nil) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -46,11 +46,11 @@ func TestHandleListSkills_WithSkills(t *testing.T) { } var resp struct { - Skills []map[string]any `json:"skills"` + Resources []map[string]any `json:"resources"` } json.Unmarshal(rr.Body.Bytes(), &resp) - if len(resp.Skills) != 2 { - t.Errorf("expected 2 skills, got %d", len(resp.Skills)) + if len(resp.Resources) != 2 { + t.Errorf("expected 2 resources, got %d", len(resp.Resources)) } } @@ -58,7 +58,7 @@ func TestHandleGetSkill_Found(t *testing.T) { s, src := newTestServer(t) addSkill(t, src, "my-skill") - req := httptest.NewRequest(http.MethodGet, "/api/skills/my-skill", nil) + 
req := httptest.NewRequest(http.MethodGet, "/api/resources/my-skill", nil) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -68,15 +68,15 @@ func TestHandleGetSkill_Found(t *testing.T) { var resp map[string]any json.Unmarshal(rr.Body.Bytes(), &resp) - skill := resp["skill"].(map[string]any) - if skill["flatName"] != "my-skill" { - t.Errorf("expected flatName 'my-skill', got %v", skill["flatName"]) + res := resp["resource"].(map[string]any) + if res["flatName"] != "my-skill" { + t.Errorf("expected flatName 'my-skill', got %v", res["flatName"]) } } func TestHandleGetSkill_NotFound(t *testing.T) { s, _ := newTestServer(t) - req := httptest.NewRequest(http.MethodGet, "/api/skills/nonexistent", nil) + req := httptest.NewRequest(http.MethodGet, "/api/resources/nonexistent", nil) rr := httptest.NewRecorder() s.handler.ServeHTTP(rr, req) @@ -93,7 +93,7 @@ func TestHandleGetSkillFile_PathTraversal(t *testing.T) { // to bypass mux and call the handler directly with a crafted PathValue. // Instead, test that a valid-looking but still-traversal path is rejected. // The handler checks strings.Contains(fp, ".."). 
- req := httptest.NewRequest(http.MethodGet, "/api/skills/my-skill/files/sub%2F..%2F..%2Fetc%2Fpasswd", nil) + req := httptest.NewRequest(http.MethodGet, "/api/resources/my-skill/files/sub%2F..%2F..%2Fetc%2Fpasswd", nil) rr := httptest.NewRecorder() s.mux.ServeHTTP(rr, req) @@ -178,14 +178,11 @@ func TestHandleUninstallRepo_PrunesRegistry(t *testing.T) { addTrackedRepo(t, src, "_team-skills") addSkill(t, src, "unrelated-skill") // must exist on disk to survive reconcile - // Seed registry with entries belonging to this repo - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "vue-best-practices", Group: "team-skills", Tracked: true}, - {Name: "react-patterns", Group: "team-skills", Tracked: true}, - {Name: "unrelated-skill", Group: ""}, - }, - } + // Seed store with entries belonging to this repo + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("team-skills/vue-best-practices", &install.MetadataEntry{Group: "team-skills", Tracked: true}) + s.skillsStore.Set("team-skills/react-patterns", &install.MetadataEntry{Group: "team-skills", Tracked: true}) + s.skillsStore.Set("unrelated-skill", &install.MetadataEntry{}) req := httptest.NewRequest(http.MethodDelete, "/api/repos/_team-skills", nil) req.SetPathValue("name", "_team-skills") @@ -196,10 +193,11 @@ func TestHandleUninstallRepo_PrunesRegistry(t *testing.T) { t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) } - // Registry should not contain any team-skills entries - for _, entry := range s.registry.Skills { - if entry.Group == "team-skills" { - t.Fatalf("expected team-skills entries to be pruned, but found %q", entry.Name) + // Store should not contain any team-skills entries + for _, name := range s.skillsStore.List() { + entry := s.skillsStore.Get(name) + if entry != nil && entry.Group == "team-skills" { + t.Fatalf("expected team-skills entries to be pruned, but found %q", name) } } } @@ -209,14 +207,11 @@ func 
TestHandleUninstallRepo_NestedPruneDoesNotAffectSibling(t *testing.T) { addTrackedRepo(t, src, filepath.Join("org", "_team-skills")) addTrackedRepo(t, src, filepath.Join("dept", "_team-skills")) - // Seed registry: entries from both nested repos + an exact-group entry - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "vue", Group: "org/_team-skills", Tracked: true}, - {Name: "react", Group: "dept/_team-skills", Tracked: true}, - {Name: "unrelated", Group: ""}, - }, - } + // Seed store: entries from both nested repos + an exact-group entry + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("org/_team-skills/vue", &install.MetadataEntry{Group: "org/_team-skills", Tracked: true}) + s.skillsStore.Set("dept/_team-skills/react", &install.MetadataEntry{Group: "dept/_team-skills", Tracked: true}) + s.skillsStore.Set("unrelated", &install.MetadataEntry{}) req := httptest.NewRequest(http.MethodDelete, "/api/repos/org/_team-skills", nil) req.SetPathValue("name", "org/_team-skills") @@ -228,16 +223,18 @@ func TestHandleUninstallRepo_NestedPruneDoesNotAffectSibling(t *testing.T) { } // org/_team-skills entries should be pruned - for _, entry := range s.registry.Skills { - if entry.Group == "org/_team-skills" { - t.Fatalf("expected org/_team-skills entries to be pruned, but found %q", entry.Name) + for _, name := range s.skillsStore.List() { + entry := s.skillsStore.Get(name) + if entry != nil && entry.Group == "org/_team-skills" { + t.Fatalf("expected org/_team-skills entries to be pruned, but found %q", name) } } // dept/_team-skills entries must survive var found bool - for _, entry := range s.registry.Skills { - if entry.Group == "dept/_team-skills" { + for _, name := range s.skillsStore.List() { + entry := s.skillsStore.Get(name) + if entry != nil && entry.Group == "dept/_team-skills" { found = true break } @@ -266,11 +263,8 @@ func TestHandleUninstallRepo_ProjectMode_GitignorePath(t *testing.T) { gitignorePath := 
filepath.Join(gitignoreDir, ".gitignore") os.WriteFile(gitignorePath, []byte("# BEGIN SKILLSHARE MANAGED - DO NOT EDIT\nskills/_team-skills/\n# END SKILLSHARE MANAGED\n"), 0644) - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "_team-skills", Tracked: true}, - }, - } + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("_team-skills", &install.MetadataEntry{Tracked: true}) req := httptest.NewRequest(http.MethodDelete, "/api/repos/_team-skills", nil) req.SetPathValue("name", "_team-skills") @@ -376,14 +370,11 @@ func TestHandleUninstallRepo_PrunesNestedFullPathGroup(t *testing.T) { s, src := newTestServer(t) addTrackedRepo(t, src, filepath.Join("org", "_team-skills")) - // Registry with entries using full nested path as Group (new reconcile format) - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "vue", Group: "org/_team-skills", Tracked: true}, - {Name: "react", Group: "org/_team-skills", Tracked: true}, - {Name: "unrelated", Group: ""}, - }, - } + // Store with entries using full nested path as Group (new reconcile format) + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("org/_team-skills/vue", &install.MetadataEntry{Group: "org/_team-skills", Tracked: true}) + s.skillsStore.Set("org/_team-skills/react", &install.MetadataEntry{Group: "org/_team-skills", Tracked: true}) + s.skillsStore.Set("unrelated", &install.MetadataEntry{}) req := httptest.NewRequest(http.MethodDelete, "/api/repos/org/_team-skills", nil) req.SetPathValue("name", "org/_team-skills") @@ -395,11 +386,12 @@ func TestHandleUninstallRepo_PrunesNestedFullPathGroup(t *testing.T) { } // All org/_team-skills entries should be pruned, unrelated survives - if len(s.registry.Skills) != 1 { - t.Fatalf("expected 1 surviving entry, got %d", len(s.registry.Skills)) + names := s.skillsStore.List() + if len(names) != 1 { + t.Fatalf("expected 1 surviving entry, got %d", len(names)) } - if s.registry.Skills[0].Name != "unrelated" { - 
t.Fatalf("expected 'unrelated' to survive, got %q", s.registry.Skills[0].Name) + if !s.skillsStore.Has("unrelated") { + t.Fatalf("expected 'unrelated' to survive") } } @@ -407,14 +399,11 @@ func TestHandleUninstallRepo_PrunesNestedMembersByPrefix(t *testing.T) { s, src := newTestServer(t) addTrackedRepo(t, src, filepath.Join("org", "_team-skills")) - // Registry with the repo's own entry + a sub-skill using FullName prefix - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "org/_team-skills", Group: "", Tracked: true}, // repo entry - {Name: "sub-skill", Group: "org/_team-skills", Tracked: true}, // member - {Name: "standalone", Group: ""}, - }, - } + // Store with the repo's own entry + a sub-skill using name prefix + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("org/_team-skills", &install.MetadataEntry{Tracked: true}) // repo entry + s.skillsStore.Set("org/_team-skills/sub-skill", &install.MetadataEntry{Group: "org/_team-skills", Tracked: true}) // member + s.skillsStore.Set("standalone", &install.MetadataEntry{}) req := httptest.NewRequest(http.MethodDelete, "/api/repos/org/_team-skills", nil) req.SetPathValue("name", "org/_team-skills") @@ -425,10 +414,11 @@ func TestHandleUninstallRepo_PrunesNestedMembersByPrefix(t *testing.T) { t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) } - if len(s.registry.Skills) != 1 { - t.Fatalf("expected 1 surviving entry, got %d", len(s.registry.Skills)) + names := s.skillsStore.List() + if len(names) != 1 { + t.Fatalf("expected 1 surviving entry, got %d", len(names)) } - if s.registry.Skills[0].Name != "standalone" { - t.Fatalf("expected 'standalone' to survive, got %q", s.registry.Skills[0].Name) + if !s.skillsStore.Has("standalone") { + t.Fatalf("expected 'standalone' to survive") } } diff --git a/internal/server/handler_sync.go b/internal/server/handler_sync.go index b93f99d5..611f31b3 100644 --- a/internal/server/handler_sync.go +++ b/internal/server/handler_sync.go @@ 
-4,6 +4,7 @@ import ( "encoding/json" "maps" "net/http" + "os" "time" "skillshare/internal/config" @@ -48,11 +49,17 @@ func (s *Server) handleSync(w http.ResponseWriter, r *http.Request) { defer s.mu.Unlock() var body struct { - DryRun bool `json:"dryRun"` - Force bool `json:"force"` + DryRun bool `json:"dryRun"` + Force bool `json:"force"` + Kind string `json:"kind"` } if err := json.NewDecoder(r.Body).Decode(&body); err != nil { - // Default to non-dry-run, non-force + // Default to non-dry-run, non-force, empty kind (both) + } + + if body.Kind != "" && body.Kind != kindSkill && body.Kind != kindAgent { + writeError(w, http.StatusBadRequest, "invalid kind: must be 'skill', 'agent', or empty") + return } globalMode := s.cfg.Mode @@ -67,97 +74,165 @@ func (s *Server) handleSync(w http.ResponseWriter, r *http.Request) { return } - // Discover skills once for all targets - allSkills, ignoreStats, err := ssync.DiscoverSourceSkillsWithStats(s.cfg.Source) - if err != nil { - writeError(w, http.StatusInternalServerError, "failed to discover skills: "+err.Error()) - return - } - - if len(allSkills) == 0 { - warnings = append(warnings, "source directory is empty (0 skills)") - } - - // Registry entries are managed by install/uninstall, not sync. - // Sync only manages symlinks — it must not prune registry entries - // for installed skills whose files may be missing from disk. 
- results := make([]syncTargetResult, 0) - for name, target := range s.cfg.Targets { - sc := target.SkillsConfig() - mode := sc.Mode - if mode == "" { - mode = globalMode - } + var ignoreStats *skillignore.IgnoreStats - res := syncTargetResult{ - Target: name, - Linked: make([]string, 0), - Updated: make([]string, 0), - Skipped: make([]string, 0), - Pruned: make([]string, 0), + // Skill sync (skip when kind == "agent") + if body.Kind != kindAgent { + var allSkills []ssync.DiscoveredSkill + var err error + allSkills, ignoreStats, err = ssync.DiscoverSourceSkillsWithStats(s.cfg.Source) + if err != nil { + writeError(w, http.StatusInternalServerError, "failed to discover skills: "+err.Error()) + return } - syncErrArgs := map[string]any{ - "targets_total": len(s.cfg.Targets), - "targets_failed": 1, - "target": name, - "dry_run": body.DryRun, - "force": body.Force, - "scope": "ui", + if len(allSkills) == 0 { + warnings = append(warnings, "source directory is empty (0 skills)") } - switch mode { - case "merge": - mergeResult, err := ssync.SyncTargetMergeWithSkills(name, target, allSkills, s.cfg.Source, body.DryRun, body.Force, s.projectRoot) - if err != nil { - s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) - writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) - return - } - res.Linked = mergeResult.Linked - res.Updated = mergeResult.Updated - res.Skipped = mergeResult.Skipped - res.DirCreated = mergeResult.DirCreated - - pruneResult, err := ssync.PruneOrphanLinksWithSkills(ssync.PruneOptions{ - TargetPath: sc.Path, SourcePath: s.cfg.Source, Skills: allSkills, - Include: sc.Include, Exclude: sc.Exclude, TargetNaming: sc.TargetNaming, TargetName: name, - DryRun: body.DryRun, Force: body.Force, - }) - if err == nil { - res.Pruned = pruneResult.Removed + // Registry entries are managed by install/uninstall, not sync. 
+ // Sync only manages symlinks — it must not prune registry entries + // for installed skills whose files may be missing from disk. + + for name, target := range s.cfg.Targets { + sc := target.SkillsConfig() + mode := sc.Mode + if mode == "" { + mode = globalMode } - case "copy": - copyResult, err := ssync.SyncTargetCopyWithSkills(name, target, allSkills, s.cfg.Source, body.DryRun, body.Force, nil) - if err != nil { - s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) - writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) - return + res := syncTargetResult{ + Target: name, + Linked: make([]string, 0), + Updated: make([]string, 0), + Skipped: make([]string, 0), + Pruned: make([]string, 0), } - res.Linked = copyResult.Copied - res.Updated = copyResult.Updated - res.Skipped = copyResult.Skipped - res.DirCreated = copyResult.DirCreated - - pruneResult, err := ssync.PruneOrphanCopiesWithSkills(sc.Path, allSkills, sc.Include, sc.Exclude, name, sc.TargetNaming, body.DryRun) - if err == nil { - res.Pruned = pruneResult.Removed + + syncErrArgs := map[string]any{ + "targets_total": len(s.cfg.Targets), + "targets_failed": 1, + "target": name, + "dry_run": body.DryRun, + "force": body.Force, + "scope": "ui", } - default: - err := ssync.SyncTarget(name, target, s.cfg.Source, body.DryRun, s.projectRoot) - if err != nil { - s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) - writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) - return + switch mode { + case "merge": + mergeResult, err := ssync.SyncTargetMergeWithSkills(name, target, allSkills, s.cfg.Source, body.DryRun, body.Force, s.projectRoot) + if err != nil { + s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) + writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) + return + } + res.Linked = mergeResult.Linked + res.Updated = mergeResult.Updated + res.Skipped = 
mergeResult.Skipped + res.DirCreated = mergeResult.DirCreated + + pruneResult, err := ssync.PruneOrphanLinksWithSkills(ssync.PruneOptions{ + TargetPath: sc.Path, SourcePath: s.cfg.Source, Skills: allSkills, + Include: sc.Include, Exclude: sc.Exclude, TargetNaming: sc.TargetNaming, TargetName: name, + DryRun: body.DryRun, Force: body.Force, + }) + if err == nil { + res.Pruned = pruneResult.Removed + } + + case "copy": + copyResult, err := ssync.SyncTargetCopyWithSkills(name, target, allSkills, s.cfg.Source, body.DryRun, body.Force, nil) + if err != nil { + s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) + writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) + return + } + res.Linked = copyResult.Copied + res.Updated = copyResult.Updated + res.Skipped = copyResult.Skipped + res.DirCreated = copyResult.DirCreated + + pruneResult, err := ssync.PruneOrphanCopiesWithSkills(sc.Path, allSkills, sc.Include, sc.Exclude, name, sc.TargetNaming, body.DryRun) + if err == nil { + res.Pruned = pruneResult.Removed + } + + default: + err := ssync.SyncTarget(name, target, s.cfg.Source, body.DryRun, s.projectRoot) + if err != nil { + s.writeOpsLog("sync", "error", start, syncErrArgs, err.Error()) + writeError(w, http.StatusInternalServerError, "sync failed for "+name+": "+err.Error()) + return + } + res.Linked = []string{"(symlink mode)"} } - res.Linked = []string{"(symlink mode)"} + + results = append(results, res) } + } + + // Agent sync (skip when kind == "skill") + if body.Kind != kindSkill { + agentsSource := s.agentsSource() + if info, err := os.Stat(agentsSource); err == nil && info.IsDir() { + agents := discoverActiveAgents(agentsSource) + builtinAgents := s.builtinAgentTargets() + + for name, target := range s.cfg.Targets { + agentPath := resolveAgentPath(target, builtinAgents, name) + if agentPath == "" { + continue + } - results = append(results, res) + agentMode := target.AgentsConfig().Mode + if agentMode == "" { + 
agentMode = "merge" + } + + agentResult, err := ssync.SyncAgents(agents, agentsSource, agentPath, agentMode, body.DryRun, body.Force) + if err != nil { + warnings = append(warnings, "agent sync failed for "+name+": "+err.Error()) + continue + } + + // Prune orphan agents even when the source is empty so uninstall-all + // matches skills and clears previously synced target entries. + var pruned []string + if agentMode == "merge" { + pruned, _ = ssync.PruneOrphanAgentLinks(agentPath, agents, body.DryRun) + } else if agentMode == "copy" { + pruned, _ = ssync.PruneOrphanAgentCopies(agentPath, agents, body.DryRun) + } + + // Find or create result entry for this target + idx := -1 + for i := range results { + if results[i].Target == name { + idx = i + break + } + } + if idx < 0 && (len(agentResult.Linked) > 0 || len(agentResult.Updated) > 0 || len(agentResult.Skipped) > 0 || len(pruned) > 0) { + results = append(results, syncTargetResult{ + Target: name, + Linked: make([]string, 0), + Updated: make([]string, 0), + Skipped: make([]string, 0), + Pruned: make([]string, 0), + }) + idx = len(results) - 1 + } + + if idx >= 0 { + results[idx].Linked = append(results[idx].Linked, agentResult.Linked...) + results[idx].Updated = append(results[idx].Updated, agentResult.Updated...) + results[idx].Skipped = append(results[idx].Skipped, agentResult.Skipped...) + results[idx].Pruned = append(results[idx].Pruned, pruned...) 
+ } + } + } } // Log the sync operation @@ -166,6 +241,7 @@ func (s *Server) handleSync(w http.ResponseWriter, r *http.Request) { "targets_failed": 0, "dry_run": body.DryRun, "force": body.Force, + "kind": body.Kind, "scope": "ui", }, "") @@ -174,6 +250,7 @@ func (s *Server) handleSync(w http.ResponseWriter, r *http.Request) { "warnings": warnings, } maps.Copy(resp, ignorePayload(ignoreStats)) + maps.Copy(resp, agentIgnorePayload(s.agentsSource(), nil)) writeJSON(w, resp) } @@ -181,6 +258,7 @@ type diffItem struct { Skill string `json:"skill"` Action string `json:"action"` // "link", "update", "skip", "prune", "local" Reason string `json:"reason"` // human-readable description + Kind string `json:"kind,omitempty"` } type diffTarget struct { @@ -194,6 +272,7 @@ func (s *Server) handleDiff(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before slow I/O. s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() globalMode := s.cfg.Mode targets := s.cloneTargets() s.mu.RUnlock() @@ -218,7 +297,10 @@ func (s *Server) handleDiff(w http.ResponseWriter, r *http.Request) { diffs = append(diffs, s.computeTargetDiff(name, target, discovered, globalMode, source)) } + diffs = s.appendAgentDiffs(diffs, targets, agentsSource, filterTarget) + resp := map[string]any{"diffs": diffs} maps.Copy(resp, ignorePayload(ignoreStats)) + maps.Copy(resp, agentIgnorePayload(agentsSource, nil)) writeJSON(w, resp) } diff --git a/internal/server/handler_sync_matrix.go b/internal/server/handler_sync_matrix.go index 1d95fdb5..4496083b 100644 --- a/internal/server/handler_sync_matrix.go +++ b/internal/server/handler_sync_matrix.go @@ -4,6 +4,8 @@ import ( "encoding/json" "net/http" + "skillshare/internal/config" + "skillshare/internal/resource" ssync "skillshare/internal/sync" ) @@ -12,12 +14,14 @@ type syncMatrixEntry struct { Target string `json:"target"` Status string `json:"status"` Reason string `json:"reason"` + Kind string 
`json:"kind,omitempty"` } func (s *Server) handleSyncMatrix(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before I/O. s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() targets := s.cloneTargets() s.mu.RUnlock() @@ -27,6 +31,18 @@ func (s *Server) handleSyncMatrix(w http.ResponseWriter, r *http.Request) { return } + var agents []resource.DiscoveredResource + if agentsSource != "" { + discovered, _ := resource.AgentKind{}.Discover(agentsSource) + agents = resource.ActiveAgents(discovered) + } + var builtinAgents map[string]config.TargetConfig + if s.IsProjectMode() { + builtinAgents = config.ProjectAgentTargets() + } else { + builtinAgents = config.DefaultAgentTargets() + } + targetFilter := r.URL.Query().Get("target") var entries []syncMatrixEntry @@ -34,6 +50,7 @@ func (s *Server) handleSyncMatrix(w http.ResponseWriter, r *http.Request) { if targetFilter != "" && name != targetFilter { continue } + // Skills sc := target.SkillsConfig() if sc.Mode == "symlink" { for _, skill := range skills { @@ -44,16 +61,53 @@ func (s *Server) handleSyncMatrix(w http.ResponseWriter, r *http.Request) { Reason: "symlink mode — filters not applicable", }) } + } else { + for _, skill := range skills { + status, reason := ssync.ClassifySkillForTarget(skill.FlatName, skill.Targets, name, sc.Include, sc.Exclude) + entries = append(entries, syncMatrixEntry{ + Skill: skill.FlatName, + Target: name, + Status: status, + Reason: reason, + }) + } + } + // Agents — resolve path from user config or builtin defaults + ac := target.AgentsConfig() + agentPath := ac.Path + if agentPath == "" { + if builtin, ok := builtinAgents[name]; ok { + agentPath = builtin.Path + } + } + if agentPath == "" || len(agents) == 0 { continue } - for _, skill := range skills { - status, reason := ssync.ClassifySkillForTarget(skill.FlatName, skill.Targets, name, sc.Include, sc.Exclude) - entries = append(entries, syncMatrixEntry{ - Skill: skill.FlatName, - 
Target: name, - Status: status, - Reason: reason, - }) + agentMode := ac.Mode + if agentMode == "" { + agentMode = "merge" + } + if agentMode == "symlink" { + for _, agent := range agents { + entries = append(entries, syncMatrixEntry{ + Skill: agent.FlatName, + Target: name, + Status: "na", + Reason: "symlink mode — filters not applicable", + Kind: "agent", + }) + } + } else { + for _, agent := range agents { + status, reason := ssync.ClassifySkillForTarget(agent.FlatName, agent.Targets, name, ac.Include, ac.Exclude) + entries = append(entries, syncMatrixEntry{ + Skill: agent.FlatName, + Target: name, + Status: status, + Reason: reason, + Kind: "agent", + }) + } } } @@ -64,12 +118,16 @@ func (s *Server) handleSyncMatrixPreview(w http.ResponseWriter, r *http.Request) // Snapshot config under RLock, then release before I/O. s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() + targets := s.cloneTargets() s.mu.RUnlock() var body struct { - Target string `json:"target"` - Include []string `json:"include"` - Exclude []string `json:"exclude"` + Target string `json:"target"` + Include []string `json:"include"` + Exclude []string `json:"exclude"` + AgentInclude []string `json:"agent_include"` + AgentExclude []string `json:"agent_exclude"` } if err := json.NewDecoder(r.Body).Decode(&body); err != nil { writeError(w, http.StatusBadRequest, "invalid JSON: "+err.Error()) @@ -80,7 +138,7 @@ func (s *Server) handleSyncMatrixPreview(w http.ResponseWriter, r *http.Request) return } - // Validate patterns before discovering skills + // Validate skill patterns if _, err := ssync.FilterSkills(nil, body.Include, nil); err != nil { writeError(w, http.StatusBadRequest, err.Error()) return @@ -89,6 +147,15 @@ func (s *Server) handleSyncMatrixPreview(w http.ResponseWriter, r *http.Request) writeError(w, http.StatusBadRequest, err.Error()) return } + // Validate agent patterns + if _, err := ssync.FilterSkills(nil, body.AgentInclude, nil); err != nil { + writeError(w, 
http.StatusBadRequest, "invalid agent include pattern: "+err.Error()) + return + } + if _, err := ssync.FilterSkills(nil, nil, body.AgentExclude); err != nil { + writeError(w, http.StatusBadRequest, "invalid agent exclude pattern: "+err.Error()) + return + } skills, err := ssync.DiscoverSourceSkills(source) if err != nil { @@ -107,5 +174,53 @@ func (s *Server) handleSyncMatrixPreview(w http.ResponseWriter, r *http.Request) }) } + // Agents — resolve path from config or builtin defaults + target, ok := targets[body.Target] + if ok && agentsSource != "" { + ac := target.AgentsConfig() + agentPath := ac.Path + if agentPath == "" { + var previewBuiltin map[string]config.TargetConfig + if s.IsProjectMode() { + previewBuiltin = config.ProjectAgentTargets() + } else { + previewBuiltin = config.DefaultAgentTargets() + } + if builtin, found := previewBuiltin[body.Target]; found { + agentPath = builtin.Path + } + } + if agentPath != "" { + discovered, _ := resource.AgentKind{}.Discover(agentsSource) + agents := resource.ActiveAgents(discovered) + agentMode := ac.Mode + if agentMode == "" { + agentMode = "merge" + } + if agentMode == "symlink" { + for _, agent := range agents { + entries = append(entries, syncMatrixEntry{ + Skill: agent.FlatName, + Target: body.Target, + Status: "na", + Reason: "symlink mode — filters not applicable", + Kind: "agent", + }) + } + } else { + for _, agent := range agents { + status, reason := ssync.ClassifySkillForTarget(agent.FlatName, agent.Targets, body.Target, body.AgentInclude, body.AgentExclude) + entries = append(entries, syncMatrixEntry{ + Skill: agent.FlatName, + Target: body.Target, + Status: status, + Reason: reason, + Kind: "agent", + }) + } + } + } + } + writeJSON(w, map[string]any{"entries": entries}) } diff --git a/internal/server/handler_sync_matrix_test.go b/internal/server/handler_sync_matrix_test.go index 462dea62..951d84d7 100644 --- a/internal/server/handler_sync_matrix_test.go +++ 
b/internal/server/handler_sync_matrix_test.go @@ -4,6 +4,7 @@ import ( "encoding/json" "net/http" "net/http/httptest" + "os" "path/filepath" "strings" "testing" @@ -180,3 +181,114 @@ func TestHandleSyncMatrixPreview_MissingTarget(t *testing.T) { t.Errorf("expected 400, got %d", rr.Code) } } + +func TestHandleSyncMatrixPreview_IncludesAgents(t *testing.T) { + home := filepath.Join(t.TempDir(), "home") + os.MkdirAll(home, 0755) + t.Setenv("HOME", home) + + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + addSkill(t, sourceDir, "my-skill") + + // Add agents to the agents source directory + agentsSource := s.cfg.EffectiveAgentsSource() + addAgentFile(t, agentsSource, "code-reviewer.md") + addAgentFile(t, agentsSource, "draft-helper.md") + + body := `{"target":"claude","include":[],"exclude":[],"agent_include":[],"agent_exclude":["draft-*"]}` + req := httptest.NewRequest(http.MethodPost, "/api/sync-matrix/preview", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Entries []struct { + Skill string `json:"skill"` + Status string `json:"status"` + Kind string `json:"kind"` + } `json:"entries"` + } + json.Unmarshal(rr.Body.Bytes(), &resp) + + // Should have 1 skill + 2 agents = 3 entries + if len(resp.Entries) != 3 { + t.Fatalf("expected 3 entries, got %d", len(resp.Entries)) + } + + statusMap := map[string]string{} + kindMap := map[string]string{} + for _, e := range resp.Entries { + statusMap[e.Skill] = e.Status + kindMap[e.Skill] = e.Kind + } + + // Skill should be synced + if statusMap["my-skill"] != "synced" { + t.Errorf("my-skill: expected synced, got %q", statusMap["my-skill"]) + } + if kindMap["my-skill"] != "" { + t.Errorf("my-skill kind: expected empty, got %q", kindMap["my-skill"]) + } + + // code-reviewer agent 
should be synced + if statusMap["code-reviewer.md"] != "synced" { + t.Errorf("code-reviewer.md: expected synced, got %q", statusMap["code-reviewer.md"]) + } + if kindMap["code-reviewer.md"] != "agent" { + t.Errorf("code-reviewer.md kind: expected agent, got %q", kindMap["code-reviewer.md"]) + } + + // draft-helper agent should be excluded + if statusMap["draft-helper.md"] != "excluded" { + t.Errorf("draft-helper.md: expected excluded, got %q", statusMap["draft-helper.md"]) + } +} + +func TestHandleSyncMatrixPreview_NoAgentsWhenNoAgentPath(t *testing.T) { + // custom-tool has no agent path in builtin targets + tgtPath := filepath.Join(t.TempDir(), "custom-skills") + s, sourceDir := newTestServerWithTargets(t, map[string]string{"custom-tool": tgtPath}) + addSkill(t, sourceDir, "my-skill") + + // Add agents to the agents source + agentsSource := s.cfg.EffectiveAgentsSource() + addAgentFile(t, agentsSource, "reviewer.md") + + body := `{"target":"custom-tool","include":[],"exclude":[]}` + req := httptest.NewRequest(http.MethodPost, "/api/sync-matrix/preview", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Entries []struct { + Skill string `json:"skill"` + Kind string `json:"kind"` + } `json:"entries"` + } + json.Unmarshal(rr.Body.Bytes(), &resp) + + // Should have only 1 skill, no agents + if len(resp.Entries) != 1 { + t.Fatalf("expected 1 entry (skill only, no agents), got %d", len(resp.Entries)) + } + if resp.Entries[0].Kind == "agent" { + t.Error("expected no agent entries for target without agent path") + } +} + +func TestHandleSyncMatrixPreview_InvalidAgentPattern(t *testing.T) { + s, _ := newTestServer(t) + body := `{"target":"claude","include":[],"exclude":[],"agent_include":["[unclosed"]}` + req := httptest.NewRequest(http.MethodPost, "/api/sync-matrix/preview", strings.NewReader(body)) + rr := 
httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusBadRequest { + t.Errorf("expected 400 for invalid agent pattern, got %d", rr.Code) + } +} diff --git a/internal/server/handler_sync_test.go b/internal/server/handler_sync_test.go index 582d758d..db32f015 100644 --- a/internal/server/handler_sync_test.go +++ b/internal/server/handler_sync_test.go @@ -10,6 +10,7 @@ import ( "testing" "skillshare/internal/config" + "skillshare/internal/install" ) func TestHandleSync_MergeMode(t *testing.T) { @@ -53,16 +54,13 @@ func TestHandleSync_IgnoredSkillNotPrunedFromRegistry(t *testing.T) { // Add .skillignore to exclude the second skill os.WriteFile(filepath.Join(src, ".skillignore"), []byte("ignored-skill\n"), 0644) - // Pre-populate registry with both entries and persist to disk - // (server auto-reloads registry from disk on each request) - s.registry = &config.Registry{ - Skills: []config.SkillEntry{ - {Name: "kept-skill", Source: "github.com/user/kept"}, - {Name: "ignored-skill", Source: "github.com/user/ignored"}, - }, - } - if err := s.registry.Save(s.cfg.RegistryDir); err != nil { - t.Fatalf("failed to save registry: %v", err) + // Pre-populate store with both entries and persist to disk + // (server auto-reloads metadata from disk on each request) + s.skillsStore = install.NewMetadataStore() + s.skillsStore.Set("kept-skill", &install.MetadataEntry{Source: "github.com/user/kept"}) + s.skillsStore.Set("ignored-skill", &install.MetadataEntry{Source: "github.com/user/ignored"}) + if err := s.skillsStore.Save(src); err != nil { + t.Fatalf("failed to save metadata: %v", err) } // Run sync (non-dry-run) @@ -76,12 +74,9 @@ func TestHandleSync_IgnoredSkillNotPrunedFromRegistry(t *testing.T) { } // Both entries should survive — ignored skill still exists on disk - if len(s.registry.Skills) != 2 { - names := make([]string, len(s.registry.Skills)) - for i, sk := range s.registry.Skills { - names[i] = sk.Name - } - t.Fatalf("expected 2 registry entries 
after sync, got %d: %v", len(s.registry.Skills), names) + names := s.skillsStore.List() + if len(names) != 2 { + t.Fatalf("expected 2 metadata entries after sync, got %d: %v", len(names), names) } } @@ -105,3 +100,61 @@ func TestHandleSync_NoTargets(t *testing.T) { t.Errorf("expected 0 results for no targets, got %d", len(resp.Results)) } } + +func TestHandleSync_AgentPrunesOrphanWhenSourceEmpty(t *testing.T) { + s, _ := newTestServer(t) + + agentSource := filepath.Join(t.TempDir(), "agents") + agentTarget := filepath.Join(t.TempDir(), "claude-agents") + if err := os.MkdirAll(agentSource, 0o755); err != nil { + t.Fatalf("mkdir agent source: %v", err) + } + if err := os.MkdirAll(agentTarget, 0o755); err != nil { + t.Fatalf("mkdir agent target: %v", err) + } + orphanPath := filepath.Join(agentTarget, "tutor.md") + if err := os.Symlink(filepath.Join(agentSource, "tutor.md"), orphanPath); err != nil { + t.Fatalf("seed orphan agent symlink: %v", err) + } + + s.cfg.AgentsSource = agentSource + s.cfg.Targets["claude"] = config.TargetConfig{ + Skills: &config.ResourceTargetConfig{Path: filepath.Join(t.TempDir(), "claude-skills")}, + Agents: &config.ResourceTargetConfig{Path: agentTarget}, + } + if err := s.cfg.Save(); err != nil { + t.Fatalf("save config: %v", err) + } + + req := httptest.NewRequest(http.MethodPost, "/api/sync", strings.NewReader(`{"kind":"agent"}`)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + if _, err := os.Lstat(orphanPath); !os.IsNotExist(err) { + t.Fatalf("expected orphan agent symlink to be pruned, got err=%v", err) + } + + var resp struct { + Results []struct { + Target string `json:"target"` + Pruned []string `json:"pruned"` + } `json:"results"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("unmarshal sync response: %v", err) + } + + if len(resp.Results) != 1 { + t.Fatalf("expected 1 sync 
result, got %d", len(resp.Results)) + } + if resp.Results[0].Target != "claude" { + t.Fatalf("expected claude target, got %q", resp.Results[0].Target) + } + if len(resp.Results[0].Pruned) != 1 || resp.Results[0].Pruned[0] != "tutor.md" { + t.Fatalf("expected pruned tutor.md, got %+v", resp.Results[0].Pruned) + } +} diff --git a/internal/server/handler_targets.go b/internal/server/handler_targets.go index 1c2a9f11..be8a35cc 100644 --- a/internal/server/handler_targets.go +++ b/internal/server/handler_targets.go @@ -10,6 +10,7 @@ import ( "skillshare/internal/config" ssync "skillshare/internal/sync" + "skillshare/internal/targetsummary" "skillshare/internal/utils" ) @@ -26,6 +27,12 @@ type targetItem struct { ExpectedSkillCount int `json:"expectedSkillCount"` SkippedSkillCount int `json:"skippedSkillCount,omitempty"` CollisionCount int `json:"collisionCount,omitempty"` + AgentPath string `json:"agentPath,omitempty"` + AgentMode string `json:"agentMode,omitempty"` + AgentInclude []string `json:"agentInclude,omitempty"` + AgentExclude []string `json:"agentExclude,omitempty"` + AgentLinkedCount *int `json:"agentLinkedCount,omitempty"` + AgentExpectedCount *int `json:"agentExpectedCount,omitempty"` } func (s *Server) handleListTargets(w http.ResponseWriter, r *http.Request) { @@ -34,6 +41,13 @@ func (s *Server) handleListTargets(w http.ResponseWriter, r *http.Request) { source := s.cfg.Source cfgMode := s.cfg.Mode targets := s.cloneTargets() + isProjectMode := s.IsProjectMode() + projectRoot := s.projectRoot + cfgSnapshot := *s.cfg + var projectEntries []config.ProjectTargetEntry + if isProjectMode && s.projectCfg != nil { + projectEntries = append([]config.ProjectTargetEntry(nil), s.projectCfg.Targets...) 
+ } s.mu.RUnlock() globalMode := cfgMode @@ -41,6 +55,25 @@ func (s *Server) handleListTargets(w http.ResponseWriter, r *http.Request) { globalMode = "merge" } + projectEntryByName := make(map[string]config.ProjectTargetEntry, len(projectEntries)) + for _, entry := range projectEntries { + projectEntryByName[entry.Name] = entry + } + + var ( + agentBuilder *targetsummary.Builder + err error + ) + if isProjectMode { + agentBuilder, err = targetsummary.NewProjectBuilder(projectRoot) + } else { + agentBuilder, err = targetsummary.NewGlobalBuilder(&cfgSnapshot) + } + if err != nil { + writeError(w, http.StatusInternalServerError, "failed to discover agents: "+err.Error()) + return + } + items := make([]targetItem, 0, len(targets)) discovered, discoveredErr := ssync.DiscoverSourceSkills(source) @@ -107,6 +140,27 @@ func (s *Server) handleListTargets(w http.ResponseWriter, r *http.Request) { item.Status = status.String() } + var agentSummary *targetsummary.AgentSummary + if isProjectMode { + if entry, ok := projectEntryByName[name]; ok { + agentSummary, err = agentBuilder.ProjectTarget(entry) + } + } else { + agentSummary, err = agentBuilder.GlobalTarget(name, target) + } + if err != nil { + writeError(w, http.StatusBadRequest, "invalid agent include/exclude for target "+name+": "+err.Error()) + return + } + if agentSummary != nil { + item.AgentPath = agentSummary.Path + item.AgentMode = agentSummary.Mode + item.AgentInclude = agentSummary.Include + item.AgentExclude = agentSummary.Exclude + item.AgentLinkedCount = intPtr(agentSummary.ManagedCount) + item.AgentExpectedCount = intPtr(agentSummary.ExpectedCount) + } + items = append(items, item) } @@ -119,14 +173,19 @@ func (s *Server) handleListTargets(w http.ResponseWriter, r *http.Request) { writeJSON(w, map[string]any{"targets": items, "sourceSkillCount": sourceSkillCount}) } +func intPtr(v int) *int { + return &v +} + func (s *Server) handleAddTarget(w http.ResponseWriter, r *http.Request) { start := time.Now() 
s.mu.Lock() defer s.mu.Unlock() var body struct { - Name string `json:"name"` - Path string `json:"path"` + Name string `json:"name"` + Path string `json:"path"` + AgentPath string `json:"agentPath"` } if err := json.NewDecoder(r.Body).Decode(&body); err != nil { writeError(w, http.StatusBadRequest, "invalid JSON body") @@ -158,7 +217,11 @@ func (s *Server) handleAddTarget(w http.ResponseWriter, r *http.Request) { return } - s.cfg.Targets[body.Name] = config.TargetConfig{Skills: &config.ResourceTargetConfig{Path: body.Path}} + tc := config.TargetConfig{Skills: &config.ResourceTargetConfig{Path: body.Path}} + if body.AgentPath != "" { + tc.Agents = &config.ResourceTargetConfig{Path: body.AgentPath} + } + s.cfg.Targets[body.Name] = tc // In project mode, also update the project config if s.IsProjectMode() { @@ -255,6 +318,9 @@ func (s *Server) handleUpdateTarget(w http.ResponseWriter, r *http.Request) { Exclude *[]string `json:"exclude"` Mode *string `json:"mode"` TargetNaming *string `json:"target_naming"` + AgentMode *string `json:"agent_mode"` + AgentInclude *[]string `json:"agent_include"` + AgentExclude *[]string `json:"agent_exclude"` } if err := json.NewDecoder(r.Body).Decode(&body); err != nil { writeError(w, http.StatusBadRequest, "invalid JSON body") @@ -297,6 +363,31 @@ func (s *Server) handleUpdateTarget(w http.ResponseWriter, r *http.Request) { target.Skills.TargetNaming = *body.TargetNaming } + if body.AgentMode != nil { + switch *body.AgentMode { + case "merge", "symlink", "copy": + target.EnsureAgents().Mode = *body.AgentMode + default: + writeError(w, http.StatusBadRequest, "invalid agent_mode: "+*body.AgentMode+"; must be merge, symlink, or copy") + return + } + } + + if body.AgentInclude != nil { + if _, err := ssync.FilterSkills(nil, *body.AgentInclude, nil); err != nil { + writeError(w, http.StatusBadRequest, "invalid agent include pattern: "+err.Error()) + return + } + target.EnsureAgents().Include = *body.AgentInclude + } + if body.AgentExclude 
!= nil { + if _, err := ssync.FilterSkills(nil, nil, *body.AgentExclude); err != nil { + writeError(w, http.StatusBadRequest, "invalid agent exclude pattern: "+err.Error()) + return + } + target.EnsureAgents().Exclude = *body.AgentExclude + } + s.cfg.Targets[name] = target // In project mode, also update the project config @@ -316,6 +407,15 @@ func (s *Server) handleUpdateTarget(w http.ResponseWriter, r *http.Request) { if body.TargetNaming != nil { sk.TargetNaming = *body.TargetNaming } + if body.AgentMode != nil { + s.projectCfg.Targets[i].EnsureAgents().Mode = *body.AgentMode + } + if body.AgentInclude != nil { + s.projectCfg.Targets[i].EnsureAgents().Include = *body.AgentInclude + } + if body.AgentExclude != nil { + s.projectCfg.Targets[i].EnsureAgents().Exclude = *body.AgentExclude + } break } } @@ -326,8 +426,8 @@ func (s *Server) handleUpdateTarget(w http.ResponseWriter, r *http.Request) { return } - hasFilter := body.Include != nil || body.Exclude != nil - hasSetting := body.Mode != nil || body.TargetNaming != nil + hasFilter := body.Include != nil || body.Exclude != nil || body.AgentInclude != nil || body.AgentExclude != nil + hasSetting := body.Mode != nil || body.TargetNaming != nil || body.AgentMode != nil action := "filter" if hasSetting && hasFilter { action = "settings+filter" diff --git a/internal/server/handler_targets_agents_test.go b/internal/server/handler_targets_agents_test.go new file mode 100644 index 00000000..d6202c2a --- /dev/null +++ b/internal/server/handler_targets_agents_test.go @@ -0,0 +1,213 @@ +package server + +import ( + "encoding/json" + "net/http" + "net/http/httptest" + "os" + "path/filepath" + "testing" + + "skillshare/internal/config" +) + +type targetAgentResponse struct { + Name string `json:"name"` + AgentPath string `json:"agentPath"` + AgentMode string `json:"agentMode"` + AgentInclude []string `json:"agentInclude"` + AgentExclude []string `json:"agentExclude"` + AgentLinkedCount *int `json:"agentLinkedCount"` + 
AgentExpectedCount *int `json:"agentExpectedCount"` +} + +func TestHandleListTargets_IncludesGlobalBuiltinAgents(t *testing.T) { + home := filepath.Join(t.TempDir(), "home") + if err := os.MkdirAll(home, 0755); err != nil { + t.Fatalf("mkdir home: %v", err) + } + t.Setenv("HOME", home) + + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + agentSource := s.cfg.EffectiveAgentsSource() + agentFile := addAgentFile(t, agentSource, "reviewer.md") + agentTarget := filepath.Join(home, ".claude", "agents") + addAgentLink(t, agentTarget, "reviewer.md", agentFile) + + target := fetchTargetByName(t, s, "claude") + if target.AgentPath != agentTarget { + t.Fatalf("agent path = %q, want %q", target.AgentPath, agentTarget) + } + if target.AgentMode != "merge" { + t.Fatalf("agent mode = %q, want merge", target.AgentMode) + } + if target.AgentLinkedCount == nil || *target.AgentLinkedCount != 1 { + t.Fatalf("agent linked = %v, want 1", target.AgentLinkedCount) + } + if target.AgentExpectedCount == nil || *target.AgentExpectedCount != 1 { + t.Fatalf("agent expected = %v, want 1", target.AgentExpectedCount) + } +} + +func TestHandleListTargets_IncludesProjectBuiltinAgents(t *testing.T) { + s, projectRoot := newProjectTargetServer(t, []config.ProjectTargetEntry{{Name: "claude"}}) + + agentFile := addAgentFile(t, filepath.Join(projectRoot, ".skillshare", "agents"), "reviewer.md") + agentTarget := filepath.Join(projectRoot, ".claude", "agents") + addAgentLink(t, agentTarget, "reviewer.md", agentFile) + + target := fetchTargetByName(t, s, "claude") + if target.AgentPath != agentTarget { + t.Fatalf("agent path = %q, want %q", target.AgentPath, agentTarget) + } + if target.AgentMode != "merge" { + t.Fatalf("agent mode = %q, want merge", target.AgentMode) + } + if target.AgentLinkedCount == nil || *target.AgentLinkedCount != 1 { + t.Fatalf("agent linked = %v, want 1", target.AgentLinkedCount) + } + if 
target.AgentExpectedCount == nil || *target.AgentExpectedCount != 1 { + t.Fatalf("agent expected = %v, want 1", target.AgentExpectedCount) + } +} + +func TestHandleListTargets_CustomAgentPathOverridesBuiltin(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + customAgentPath := filepath.Join(t.TempDir(), "custom-agents-target") + if err := os.MkdirAll(customAgentPath, 0755); err != nil { + t.Fatalf("mkdir custom target: %v", err) + } + + cfg, err := config.Load() + if err != nil { + t.Fatalf("load config: %v", err) + } + cfgTarget := cfg.Targets["claude"] + cfgTarget.Agents = &config.ResourceTargetConfig{ + Path: customAgentPath, + Mode: "copy", + Include: []string{"review-*"}, + Exclude: []string{"draft-*"}, + } + cfg.Targets["claude"] = cfgTarget + if err := cfg.Save(); err != nil { + t.Fatalf("save config: %v", err) + } + + addAgentFile(t, cfg.EffectiveAgentsSource(), "review-alpha.md") + targetResp := fetchTargetByName(t, s, "claude") + if targetResp.AgentPath != customAgentPath { + t.Fatalf("agent path = %q, want %q", targetResp.AgentPath, customAgentPath) + } + if targetResp.AgentMode != "copy" { + t.Fatalf("agent mode = %q, want copy", targetResp.AgentMode) + } + if got := targetResp.AgentInclude; len(got) != 1 || got[0] != "review-*" { + t.Fatalf("agent include = %v, want [review-*]", got) + } + if got := targetResp.AgentExclude; len(got) != 1 || got[0] != "draft-*" { + t.Fatalf("agent exclude = %v, want [draft-*]", got) + } +} + +func TestHandleListTargets_OmitsAgentsForUnsupportedTarget(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "custom-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"custom-tool": tgtPath}) + + target := fetchTargetByName(t, s, "custom-tool") + if target.AgentPath != "" { + t.Fatalf("expected empty agent path, got %q", target.AgentPath) + } + if target.AgentLinkedCount != nil { + t.Fatalf("expected nil agent 
linked count, got %v", *target.AgentLinkedCount) + } + if target.AgentExpectedCount != nil { + t.Fatalf("expected nil agent expected count, got %v", *target.AgentExpectedCount) + } +} + +func fetchTargetByName(t *testing.T, s *Server, name string) targetAgentResponse { + t.Helper() + + req := httptest.NewRequest(http.MethodGet, "/api/targets", nil) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusOK { + t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + var resp struct { + Targets []targetAgentResponse `json:"targets"` + } + if err := json.Unmarshal(rr.Body.Bytes(), &resp); err != nil { + t.Fatalf("decode response: %v", err) + } + + for _, target := range resp.Targets { + if target.Name == name { + return target + } + } + t.Fatalf("target %q not found in response", name) + return targetAgentResponse{} +} + +func addAgentFile(t *testing.T, dir, name string) string { + t.Helper() + + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatalf("mkdir agent source: %v", err) + } + path := filepath.Join(dir, name) + if err := os.WriteFile(path, []byte("# "+name), 0644); err != nil { + t.Fatalf("write agent file: %v", err) + } + return path +} + +func addAgentLink(t *testing.T, dir, name, source string) string { + t.Helper() + + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatalf("mkdir agent target: %v", err) + } + linkPath := filepath.Join(dir, name) + if err := os.Symlink(source, linkPath); err != nil { + t.Fatalf("symlink agent: %v", err) + } + return linkPath +} + +func newProjectTargetServer(t *testing.T, targets []config.ProjectTargetEntry) (*Server, string) { + t.Helper() + + projectRoot := t.TempDir() + if err := os.MkdirAll(filepath.Join(projectRoot, ".skillshare", "skills"), 0755); err != nil { + t.Fatalf("mkdir project skills: %v", err) + } + if err := os.MkdirAll(filepath.Join(projectRoot, ".skillshare", "agents"), 0755); err != nil { + t.Fatalf("mkdir project agents: %v", err) + } + + 
projectCfg := &config.ProjectConfig{Targets: targets} + if err := projectCfg.Save(projectRoot); err != nil { + t.Fatalf("save project config: %v", err) + } + + resolvedTargets, err := config.ResolveProjectTargets(projectRoot, projectCfg) + if err != nil { + t.Fatalf("resolve project targets: %v", err) + } + + cfg := &config.Config{ + Source: filepath.Join(projectRoot, ".skillshare", "skills"), + Mode: "merge", + Targets: resolvedTargets, + } + + return NewProject(cfg, projectCfg, projectRoot, "127.0.0.1:0", "", ""), projectRoot +} diff --git a/internal/server/handler_targets_test.go b/internal/server/handler_targets_test.go index f6356b1f..5958b20c 100644 --- a/internal/server/handler_targets_test.go +++ b/internal/server/handler_targets_test.go @@ -272,3 +272,70 @@ func TestHandleUpdateTarget_ClearFilters(t *testing.T) { t.Errorf("GET exclude should be empty after clear, got %v", resp.Targets[0].Exclude) } } + +func TestHandleUpdateTarget_AgentIncludeExclude_Persisted(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + // PATCH agent include/exclude + body := `{"agent_include":["review-*"],"agent_exclude":["draft-*"]}` + req := httptest.NewRequest(http.MethodPatch, "/api/targets/claude", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusOK { + t.Fatalf("PATCH expected 200, got %d: %s", rr.Code, rr.Body.String()) + } + + // Verify disk persistence + diskCfg, err := config.Load() + if err != nil { + t.Fatalf("failed to load config from disk: %v", err) + } + tgt, ok := diskCfg.Targets["claude"] + if !ok { + t.Fatal("target 'claude' not found in disk config") + } + ac := tgt.AgentsConfig() + if len(ac.Include) != 1 || ac.Include[0] != "review-*" { + t.Errorf("disk agent include mismatch: got %v", ac.Include) + } + if len(ac.Exclude) != 1 || ac.Exclude[0] != "draft-*" { + t.Errorf("disk agent exclude 
mismatch: got %v", ac.Exclude) + } + + // Verify in-memory state + memTgt := s.cfg.Targets["claude"] + memAc := memTgt.AgentsConfig() + if len(memAc.Include) != 1 || memAc.Include[0] != "review-*" { + t.Errorf("in-memory agent include mismatch: got %v", memAc.Include) + } + if len(memAc.Exclude) != 1 || memAc.Exclude[0] != "draft-*" { + t.Errorf("in-memory agent exclude mismatch: got %v", memAc.Exclude) + } +} + +func TestHandleUpdateTarget_AgentInclude_InvalidPattern(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + body := `{"agent_include":["[unclosed"]}` + req := httptest.NewRequest(http.MethodPatch, "/api/targets/claude", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusBadRequest { + t.Errorf("expected 400 for invalid agent include pattern, got %d", rr.Code) + } +} + +func TestHandleUpdateTarget_AgentExclude_InvalidPattern(t *testing.T) { + tgtPath := filepath.Join(t.TempDir(), "claude-skills") + s, _ := newTestServerWithTargets(t, map[string]string{"claude": tgtPath}) + + body := `{"agent_exclude":["[bad"]}` + req := httptest.NewRequest(http.MethodPatch, "/api/targets/claude", strings.NewReader(body)) + rr := httptest.NewRecorder() + s.handler.ServeHTTP(rr, req) + if rr.Code != http.StatusBadRequest { + t.Errorf("expected 400 for invalid agent exclude pattern, got %d", rr.Code) + } +} diff --git a/internal/server/handler_toggle.go b/internal/server/handler_toggle.go index 913dcc9e..6792ed5f 100644 --- a/internal/server/handler_toggle.go +++ b/internal/server/handler_toggle.go @@ -22,20 +22,42 @@ func (s *Server) handleToggleSkill(w http.ResponseWriter, r *http.Request, enabl start := time.Now() name := r.PathValue("name") + kind := r.URL.Query().Get("kind") + if kind != "" && kind != "agent" && kind != "skill" { + writeError(w, http.StatusBadRequest, "invalid kind: "+kind) + return + } // 
Resolve under RLock — discovery is I/O-heavy, don't hold write lock s.mu.RLock() source := s.cfg.Source + agentsSource := s.agentsSource() s.mu.RUnlock() - relPath, isDisabled, err := s.resolveSkillRelPathWithStatus(source, name) - if err != nil { - writeError(w, http.StatusNotFound, err.Error()) - return + relPath := "" + isDisabled := false + ignorePath := "" + if kind == "agent" { + if agentsSource == "" { + writeError(w, http.StatusNotFound, "agent not found: "+name) + return + } + var err error + relPath, isDisabled, err = s.resolveAgentRelPathWithStatus(agentsSource, name) + if err != nil { + writeError(w, http.StatusNotFound, err.Error()) + return + } + ignorePath = filepath.Join(agentsSource, ".agentignore") + } else { + var err error + relPath, isDisabled, err = s.resolveSkillRelPathWithStatus(source, name) + if err != nil { + writeError(w, http.StatusNotFound, err.Error()) + return + } + ignorePath = filepath.Join(source, ".skillignore") } - - ignorePath := filepath.Join(source, ".skillignore") - // Write lock only for the file mutation s.mu.Lock() defer s.mu.Unlock() @@ -73,6 +95,7 @@ func (s *Server) handleToggleSkill(w http.ResponseWriter, r *http.Request, enabl s.writeOpsLog(action, "ok", start, map[string]any{ "name": name, + "kind": kind, "scope": "ui", }, "") diff --git a/internal/server/handler_trash.go b/internal/server/handler_trash.go index f1fad398..ba01cbdf 100644 --- a/internal/server/handler_trash.go +++ b/internal/server/handler_trash.go @@ -1,6 +1,7 @@ package server import ( + "fmt" "net/http" "os" "time" @@ -8,14 +9,29 @@ import ( "skillshare/internal/trash" ) +type trashKind string + +const ( + trashKindAll trashKind = "all" + trashKindSkill trashKind = "skill" + trashKindAgent trashKind = "agent" +) + type trashItemJSON struct { Name string `json:"name"` + Kind string `json:"kind,omitempty"` Timestamp string `json:"timestamp"` Date string `json:"date"` Size int64 `json:"size"` Path string `json:"path"` } +type resolvedTrashEntry 
struct { + entry *trash.TrashEntry + kind trashKind + dest string +} + // trashBase returns the trash directory for the current mode. func (s *Server) trashBase() string { if s.IsProjectMode() { @@ -24,18 +40,88 @@ func (s *Server) trashBase() string { return trash.TrashDir() } +// agentTrashBase returns the agent trash directory for the current mode. +func (s *Server) agentTrashBase() string { + if s.IsProjectMode() { + return trash.ProjectAgentTrashDir(s.projectRoot) + } + return trash.AgentTrashDir() +} + +func parseTrashKind(raw string) (trashKind, error) { + switch raw { + case "", "all": + return trashKindAll, nil + case "skill", "skills": + return trashKindSkill, nil + case "agent", "agents": + return trashKindAgent, nil + default: + return "", fmt.Errorf("invalid trash kind %q", raw) + } +} + +func (s *Server) trashDest(kind trashKind) string { + switch kind { + case trashKindAgent: + return s.agentsSource() + default: + return s.skillsSource() + } +} + +func (s *Server) findTrashEntry(name string, kind trashKind) (*resolvedTrashEntry, error) { + if kind == trashKindSkill || kind == trashKindAll { + base := s.trashBase() + if entry := trash.FindByName(base, name); entry != nil { + return &resolvedTrashEntry{ + entry: entry, + kind: trashKindSkill, + dest: s.trashDest(trashKindSkill), + }, nil + } + } + + if kind == trashKindAgent || kind == trashKindAll { + base := s.agentTrashBase() + if entry := trash.FindByName(base, name); entry != nil { + return &resolvedTrashEntry{ + entry: entry, + kind: trashKindAgent, + dest: s.trashDest(trashKindAgent), + }, nil + } + } + + return nil, nil +} + // handleListTrash returns all trashed items with total size. func (s *Server) handleListTrash(w http.ResponseWriter, r *http.Request) { // Snapshot config under RLock, then release before I/O. 
s.mu.RLock() base := s.trashBase() + agentBase := s.agentTrashBase() s.mu.RUnlock() + items := trash.List(base) + agentItems := trash.List(agentBase) - out := make([]trashItemJSON, 0, len(items)) + out := make([]trashItemJSON, 0, len(items)+len(agentItems)) for _, item := range items { out = append(out, trashItemJSON{ Name: item.Name, + Kind: "skill", + Timestamp: item.Timestamp, + Date: item.Date.Format("2006-01-02T15:04:05Z07:00"), + Size: item.Size, + Path: item.Path, + }) + } + for _, item := range agentItems { + out = append(out, trashItemJSON{ + Name: item.Name, + Kind: "agent", Timestamp: item.Timestamp, Date: item.Date.Format("2006-01-02T15:04:05Z07:00"), Size: item.Size, @@ -43,28 +129,43 @@ func (s *Server) handleListTrash(w http.ResponseWriter, r *http.Request) { }) } + totalSize := trash.TotalSize(base) + trash.TotalSize(agentBase) writeJSON(w, map[string]any{ "items": out, - "totalSize": trash.TotalSize(base), + "totalSize": totalSize, }) } -// handleRestoreTrash restores a trashed skill back to the source directory. +// handleRestoreTrash restores a trashed skill or agent back to its source directory. 
func (s *Server) handleRestoreTrash(w http.ResponseWriter, r *http.Request) { start := time.Now() s.mu.Lock() defer s.mu.Unlock() name := r.PathValue("name") - base := s.trashBase() + kind, err := parseTrashKind(r.URL.Query().Get("kind")) + if err != nil { + writeError(w, http.StatusBadRequest, err.Error()) + return + } - entry := trash.FindByName(base, name) - if entry == nil { + resolved, err := s.findTrashEntry(name, kind) + if err != nil { + writeError(w, http.StatusInternalServerError, "failed to resolve trashed item: "+err.Error()) + return + } + if resolved == nil { writeError(w, http.StatusNotFound, "trashed item not found: "+name) return } - if err := trash.Restore(entry, s.cfg.Source); err != nil { + switch resolved.kind { + case trashKindAgent: + err = trash.RestoreAgent(resolved.entry, resolved.dest) + default: + err = trash.Restore(resolved.entry, resolved.dest) + } + if err != nil { writeError(w, http.StatusInternalServerError, "failed to restore: "+err.Error()) return } @@ -72,6 +173,7 @@ func (s *Server) handleRestoreTrash(w http.ResponseWriter, r *http.Request) { s.writeOpsLog("trash", "ok", start, map[string]any{ "action": "restore", "name": name, + "kind": string(resolved.kind), "scope": "ui", }, "") @@ -85,15 +187,23 @@ func (s *Server) handleDeleteTrash(w http.ResponseWriter, r *http.Request) { defer s.mu.Unlock() name := r.PathValue("name") - base := s.trashBase() + kind, err := parseTrashKind(r.URL.Query().Get("kind")) + if err != nil { + writeError(w, http.StatusBadRequest, err.Error()) + return + } - entry := trash.FindByName(base, name) - if entry == nil { + resolved, err := s.findTrashEntry(name, kind) + if err != nil { + writeError(w, http.StatusInternalServerError, "failed to resolve trashed item: "+err.Error()) + return + } + if resolved == nil { writeError(w, http.StatusNotFound, "trashed item not found: "+name) return } - if err := os.RemoveAll(entry.Path); err != nil { + if err := os.RemoveAll(resolved.entry.Path); err != nil { 
 		writeError(w, http.StatusInternalServerError, "failed to delete: "+err.Error())
 		return
 	}
@@ -101,6 +211,7 @@ func (s *Server) handleDeleteTrash(w http.ResponseWriter, r *http.Request) {
 	s.writeOpsLog("trash", "ok", start, map[string]any{
 		"action": "delete",
 		"name":   name,
+		"kind":   string(resolved.kind),
 		"scope":  "ui",
 	}, "")
@@ -113,20 +224,39 @@ func (s *Server) handleEmptyTrash(w http.ResponseWriter, r *http.Request) {
 	s.mu.Lock()
 	defer s.mu.Unlock()
 
-	base := s.trashBase()
-	items := trash.List(base)
+	kind, err := parseTrashKind(r.URL.Query().Get("kind"))
+	if err != nil {
+		writeError(w, http.StatusBadRequest, err.Error())
+		return
+	}
+
+	type emptyTarget struct {
+		base string
+	}
+	targets := make([]emptyTarget, 0, 2)
+	if kind == trashKindAll || kind == trashKindSkill {
+		targets = append(targets, emptyTarget{base: s.trashBase()})
+	}
+	if kind == trashKindAll || kind == trashKindAgent {
+		targets = append(targets, emptyTarget{base: s.agentTrashBase()})
+	}
+
 	removed := 0
-	for _, item := range items {
-		if err := os.RemoveAll(item.Path); err != nil {
-			writeError(w, http.StatusInternalServerError, "failed to empty trash: "+err.Error())
-			return
+	for _, target := range targets {
+		items := trash.List(target.base)
+		for _, item := range items {
+			if err := os.RemoveAll(item.Path); err != nil {
+				writeError(w, http.StatusInternalServerError, "failed to empty trash: "+err.Error())
+				return
+			}
+			removed++
 		}
-		removed++
 	}
 	s.writeOpsLog("trash", "ok", start, map[string]any{
 		"action":  "empty",
+		"kind":    string(kind),
 		"removed": removed,
 		"scope":   "ui",
 	}, "")
diff --git a/internal/server/handler_trash_test.go b/internal/server/handler_trash_test.go
index dd1cc3bb..66727419 100644
--- a/internal/server/handler_trash_test.go
+++ b/internal/server/handler_trash_test.go
@@ -4,7 +4,11 @@ import (
 	"encoding/json"
 	"net/http"
 	"net/http/httptest"
+	"os"
+	"path/filepath"
 	"testing"
+
+	"skillshare/internal/trash"
 )
 
 func TestHandleListTrash_Empty(t *testing.T) {
@@ -41,6 +45,36 @@ func TestHandleRestoreTrash_NotFound(t *testing.T) {
 	}
 }
 
+func TestHandleRestoreTrash_AgentKind(t *testing.T) {
+	s, _ := newTestServer(t)
+
+	agentsDir := s.cfg.EffectiveAgentsSource()
+	if err := os.MkdirAll(agentsDir, 0755); err != nil {
+		t.Fatalf("failed to create agents dir: %v", err)
+	}
+	agentFile := filepath.Join(agentsDir, "tutor.md")
+	if err := os.WriteFile(agentFile, []byte("# Tutor agent"), 0644); err != nil {
+		t.Fatalf("failed to seed agent: %v", err)
+	}
+	if _, err := trash.MoveAgentToTrash(agentFile, "", "tutor", s.agentTrashBase()); err != nil {
+		t.Fatalf("failed to move agent to trash: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/trash/tutor/restore?kind=agent", nil)
+	rr := httptest.NewRecorder()
+	s.handler.ServeHTTP(rr, req)
+
+	if rr.Code != http.StatusOK {
+		t.Fatalf("expected 200, got %d: %s", rr.Code, rr.Body.String())
+	}
+	if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); err != nil {
+		t.Fatalf("expected restored agent file, got: %v", err)
+	}
+	if entry := trash.FindByName(s.agentTrashBase(), "tutor"); entry != nil {
+		t.Fatalf("expected agent trash entry to be removed after restore")
+	}
+}
+
 func TestHandleDeleteTrash_NotFound(t *testing.T) {
 	s, _ := newTestServer(t)
 	req := httptest.NewRequest(http.MethodDelete, "/api/trash/nonexistent", nil)
@@ -54,6 +88,25 @@ func TestHandleDeleteTrash_NotFound(t *testing.T) {
 }
 
 func TestHandleEmptyTrash(t *testing.T) {
 	s, _ := newTestServer(t)
+
+	addSkill(t, s.skillsSource(), "trash-skill")
+	skillDir := filepath.Join(s.skillsSource(), "trash-skill")
+	if _, err := trash.MoveToTrash(skillDir, "trash-skill", s.trashBase()); err != nil {
+		t.Fatalf("failed to trash skill: %v", err)
+	}
+
+	agentsDir := s.cfg.EffectiveAgentsSource()
+	if err := os.MkdirAll(agentsDir, 0755); err != nil {
+		t.Fatalf("failed to create agents dir: %v", err)
+	}
+	agentFile := filepath.Join(agentsDir, "tutor.md")
+	if err := os.WriteFile(agentFile, []byte("# Tutor agent"), 0644); err != nil {
+		t.Fatalf("failed to seed agent: %v", err)
+	}
+	if _, err := trash.MoveAgentToTrash(agentFile, "", "tutor", s.agentTrashBase()); err != nil {
+		t.Fatalf("failed to trash agent: %v", err)
+	}
+
 	req := httptest.NewRequest(http.MethodPost, "/api/trash/empty", nil)
 	rr := httptest.NewRecorder()
 	s.handler.ServeHTTP(rr, req)
@@ -70,7 +123,13 @@ func TestHandleEmptyTrash(t *testing.T) {
 	if !resp.Success {
 		t.Error("expected success true")
 	}
-	if resp.Removed != 0 {
-		t.Errorf("expected 0 removed, got %d", resp.Removed)
+	if resp.Removed != 2 {
+		t.Errorf("expected 2 removed, got %d", resp.Removed)
+	}
+	if len(trash.List(s.trashBase())) != 0 {
+		t.Errorf("expected skill trash to be empty after empty")
+	}
+	if len(trash.List(s.agentTrashBase())) != 0 {
+		t.Errorf("expected agent trash to be empty after empty")
 	}
 }
diff --git a/internal/server/handler_uninstall.go b/internal/server/handler_uninstall.go
index 974560a9..ff645c1f 100644
--- a/internal/server/handler_uninstall.go
+++ b/internal/server/handler_uninstall.go
@@ -18,11 +18,13 @@ import (
 type batchUninstallRequest struct {
 	Names []string `json:"names"`
+	Kind  string   `json:"kind,omitempty"`
 	Force bool     `json:"force"`
 }
 
 type batchUninstallItemResult struct {
 	Name         string `json:"name"`
+	Kind         string `json:"kind,omitempty"`
 	Success      bool   `json:"success"`
 	MovedToTrash bool   `json:"movedToTrash,omitempty"`
 	Error        string `json:"error,omitempty"`
@@ -47,7 +49,101 @@ func (s *Server) handleBatchUninstall(w http.ResponseWriter, r *http.Request) {
 		writeError(w, http.StatusBadRequest, "names array is required and must not be empty")
 		return
 	}
+	if body.Kind != "" && body.Kind != "skill" && body.Kind != "agent" {
+		writeError(w, http.StatusBadRequest, "invalid kind: "+body.Kind)
+		return
+	}
+
+	// Agent-mode batch uninstall
+	if body.Kind == "agent" {
+		s.handleBatchUninstallAgents(w, body, start)
+		return
+	}
+
+	// Skill-mode (default) batch uninstall
+	s.handleBatchUninstallSkills(w, body, start)
+}
+
+func (s *Server) handleBatchUninstallAgents(w http.ResponseWriter, body batchUninstallRequest, start time.Time) {
+	agentsSource := s.agentsSource()
+	if agentsSource == "" {
+		writeError(w, http.StatusInternalServerError, "agents source not configured")
+		return
+	}
+
+	results := make([]batchUninstallItemResult, 0, len(body.Names))
+	var removedNames []string
+	succeeded, failed := 0, 0
+	var firstErr string
+
+	for _, name := range body.Names {
+		res := batchUninstallItemResult{Name: name, Kind: "agent"}
+
+		agent, err := resolveAgentResource(agentsSource, name)
+		if err != nil {
+			res.Success = false
+			res.Error = "agent not found: " + name
+			results = append(results, res)
+			failed++
+			if firstErr == "" {
+				firstErr = res.Error
+			}
+			continue
+		}
+
+		displayName := agentMetaKey(agent.RelPath)
+		legacySidecar := filepath.Join(filepath.Dir(agent.SourcePath), filepath.Base(displayName)+".skillshare-meta.json")
+		if _, err := trash.MoveAgentToTrash(agent.SourcePath, legacySidecar, displayName, s.agentTrashBase()); err != nil {
+			res.Success = false
+			res.Error = fmt.Sprintf("failed to trash agent: %v", err)
+			results = append(results, res)
+			failed++
+			if firstErr == "" {
+				firstErr = res.Error
+			}
+			continue
+		}
+		removedNames = append(removedNames, displayName)
+		res.Success = true
+		res.MovedToTrash = true
+		results = append(results, res)
+		succeeded++
+	}
+
+	if succeeded > 0 && s.agentsStore != nil {
+		for _, name := range removedNames {
+			s.agentsStore.Remove(name)
+		}
+		if err := s.agentsStore.Save(agentsSource); err != nil {
+			log.Printf("warning: failed to save agent metadata after batch uninstall: %v", err)
+		}
+	}
+
+	status := "ok"
+	if failed > 0 && succeeded > 0 {
+		status = "partial"
+	} else if failed > 0 {
+		status = "error"
+	}
+
+	s.writeOpsLog("uninstall", status, start, map[string]any{
+		"names": body.Names,
+		"kind":  "agent",
+		"scope": "ui",
+		"count": succeeded,
+	}, firstErr)
+
+	writeJSON(w, map[string]any{
+		"results": results,
+		"summary": batchUninstallSummary{
+			Succeeded: succeeded,
+			Failed:    failed,
+		},
+	})
+}
+
+func (s *Server) handleBatchUninstallSkills(w http.ResponseWriter, body batchUninstallRequest, start time.Time) {
 	discovered, err := sync.DiscoverSourceSkills(s.cfg.Source)
 	if err != nil {
 		writeError(w, http.StatusInternalServerError, "failed to discover skills: "+err.Error())
@@ -68,7 +164,7 @@ func (s *Server) handleBatchUninstall(w http.ResponseWriter, r *http.Request) {
 	var firstErr string
 
 	for _, name := range body.Names {
-		res := batchUninstallItemResult{Name: name}
+		res := batchUninstallItemResult{Name: name, Kind: "skill"}
 
 		if strings.HasPrefix(name, "_") {
 			repoPath := filepath.Join(s.cfg.Source, name)
@@ -176,47 +272,45 @@ func (s *Server) handleBatchUninstall(w http.ResponseWriter, r *http.Request) {
 	if succeeded > 0 {
 		// removedPaths contains exact RelPaths (e.g. "frontend/vue/vue-best-practices")
 		// and repo dir names (e.g. "_team-skills"), collected during the uninstall loop.
-		filtered := make([]config.SkillEntry, 0, len(s.registry.Skills))
-		for _, entry := range s.registry.Skills {
-			fullName := entry.FullName()
-			if removedPaths[fullName] || removedPaths[entry.Name] {
+		for _, name := range s.skillsStore.List() {
+			entry := s.skillsStore.Get(name)
+			if entry == nil {
+				continue
+			}
+			if removedPaths[name] {
+				s.skillsStore.Remove(name)
 				continue
 			}
-			// Tracked repos: registry stores group without "_" prefix (e.g., group="team-skills"
+			// Tracked repos: store uses group without "_" prefix (e.g., group="team-skills"
 			// for repo dir "_team-skills"). Reconstruct the prefixed name to match removedPaths.
 			if entry.Group != "" && removedPaths["_"+entry.Group] {
+				s.skillsStore.Remove(name)
 				continue
 			}
 			// When a group directory is uninstalled, also remove its member skills
 			memberOfRemoved := false
 			for rp := range removedPaths {
-				if strings.HasPrefix(fullName, rp+"/") {
+				if strings.HasPrefix(name, rp+"/") {
 					memberOfRemoved = true
 					break
 				}
 			}
 			if memberOfRemoved {
-				continue
+				s.skillsStore.Remove(name)
 			}
-			filtered = append(filtered, entry)
 		}
-		s.registry.Skills = filtered
-		regDir := s.cfg.RegistryDir
-		if s.IsProjectMode() {
-			regDir = filepath.Join(s.projectRoot, ".skillshare")
-		}
-		if err := s.registry.Save(regDir); err != nil {
-			log.Printf("warning: failed to save registry: %v", err)
+		if err := s.skillsStore.Save(s.cfg.Source); err != nil {
+			log.Printf("warning: failed to save metadata: %v", err)
 		}
 		if s.IsProjectMode() {
 			if rErr := config.ReconcileProjectSkills(
-				s.projectRoot, s.projectCfg, s.registry, s.cfg.Source); rErr != nil {
+				s.projectRoot, s.projectCfg, s.skillsStore, s.cfg.Source); rErr != nil {
 				log.Printf("warning: failed to reconcile project skills config: %v", rErr)
 			}
 		} else {
-			if rErr := config.ReconcileGlobalSkills(s.cfg, s.registry); rErr != nil {
+			if rErr := config.ReconcileGlobalSkills(s.cfg, s.skillsStore); rErr != nil {
 				log.Printf("warning: failed to reconcile global skills config: %v", rErr)
 			}
 		}
diff --git a/internal/server/handler_uninstall_test.go b/internal/server/handler_uninstall_test.go
index 199a8aa1..e72ab11d 100644
--- a/internal/server/handler_uninstall_test.go
+++ b/internal/server/handler_uninstall_test.go
@@ -10,7 +10,7 @@ import (
 	"strings"
 	"testing"
 
-	"skillshare/internal/config"
+	"skillshare/internal/install"
 )
 
 func TestHandleBatchUninstall_ProjectMode_GitignorePath(t *testing.T) {
@@ -33,11 +33,8 @@ func TestHandleBatchUninstall_ProjectMode_GitignorePath(t *testing.T) {
 		"# BEGIN SKILLSHARE MANAGED - DO NOT EDIT\nskills/_team-skills/\n# END SKILLSHARE MANAGED\n",
 	), 0644)
 
-	s.registry = &config.Registry{
-		Skills: []config.SkillEntry{
-			{Name: "_team-skills", Tracked: true},
-		},
-	}
+	s.skillsStore = install.NewMetadataStore()
+	s.skillsStore.Set("_team-skills", &install.MetadataEntry{Tracked: true})
 
 	body := batchUninstallRequest{Names: []string{"_team-skills"}, Force: true}
 	b, _ := json.Marshal(body)
@@ -70,11 +67,8 @@ func TestHandleBatchUninstall_GlobalMode_GitignorePath(t *testing.T) {
 		"# BEGIN SKILLSHARE MANAGED - DO NOT EDIT\n_team-skills/\n# END SKILLSHARE MANAGED\n",
 	), 0644)
 
-	s.registry = &config.Registry{
-		Skills: []config.SkillEntry{
-			{Name: "_team-skills", Tracked: true},
-		},
-	}
+	s.skillsStore = install.NewMetadataStore()
+	s.skillsStore.Set("_team-skills", &install.MetadataEntry{Tracked: true})
 
 	body := batchUninstallRequest{Names: []string{"_team-skills"}, Force: true}
 	b, _ := json.Marshal(body)
diff --git a/internal/server/handler_update.go b/internal/server/handler_update.go
index eba69bce..27fe7738 100644
--- a/internal/server/handler_update.go
+++ b/internal/server/handler_update.go
@@ -6,6 +6,7 @@ import (
 	"net/http"
 	"os"
 	"path/filepath"
+	"strings"
 	"time"
 
 	"skillshare/internal/audit"
@@ -16,6 +17,7 @@ import (
 type updateRequest struct {
 	Name      string `json:"name"`
+	Kind      string `json:"kind,omitempty"`
 	Force     bool   `json:"force"`
 	All       bool   `json:"all"`
 	SkipAudit bool   `json:"skipAudit"`
@@ -28,6 +30,7 @@ type updateResultItem struct {
 	IsRepo         bool   `json:"isRepo"`
 	AuditRiskScore int    `json:"auditRiskScore,omitempty"`
 	AuditRiskLabel string `json:"auditRiskLabel,omitempty"`
+	Kind           string `json:"kind,omitempty"`
 }
 
 func (s *Server) updateAuditThreshold() string {
@@ -99,8 +102,12 @@ func (s *Server) handleUpdate(w http.ResponseWriter, r *http.Request) {
 		writeError(w, http.StatusBadRequest, "name is required (or use all: true)")
 		return
 	}
+	if body.Kind != "" && body.Kind != "skill" && body.Kind != "agent" {
+		writeError(w, http.StatusBadRequest, "invalid kind: "+body.Kind)
+		return
+	}
 
-	result := s.updateSingle(body.Name, body.Force, body.SkipAudit)
+	result := s.updateSingleByKind(body.Name, body.Kind, body.Force, body.SkipAudit)
 	status := "ok"
 	msg := ""
 	if result.Action == "error" {
@@ -123,9 +130,16 @@ func (s *Server) handleUpdate(w http.ResponseWriter, r *http.Request) {
 }
 
 func (s *Server) updateSingle(name string, force, skipAudit bool) updateResultItem {
+	return s.updateSingleByKind(name, "", force, skipAudit)
+}
+
+func (s *Server) updateSingleByKind(name, kind string, force, skipAudit bool) updateResultItem {
+	if kind == "agent" {
+		return s.updateAgent(name, force, skipAudit)
+	}
 	// Try exact skill path first (prevents basename collision with nested repos)
 	skillPath := filepath.Join(s.cfg.Source, name)
-	if meta, _ := install.ReadMeta(skillPath); meta != nil && meta.Source != "" {
+	if entry := s.skillsStore.GetByPath(name); entry != nil && entry.Source != "" {
 		return s.updateRegularSkill(name, skillPath, skipAudit)
 	}
@@ -145,6 +159,109 @@ func (s *Server) updateSingle(name string, force, skipAudit bool) updateResultIt
 	}
 }
 
+func (s *Server) updateAgent(name string, force, skipAudit bool) updateResultItem {
+	agentsSource := s.agentsSource()
+	if agentsSource == "" {
+		return updateResultItem{Name: name, Kind: "agent", Action: "error", Message: "agents source is not configured"}
+	}
+
+	localAgent, err := resolveAgentResource(agentsSource, name)
+	if err != nil {
+		return updateResultItem{Name: name, Kind: "agent", Action: "error", Message: err.Error()}
+	}
+
+	if localAgent.RepoRelPath != "" {
+		repoPath := filepath.Join(agentsSource, filepath.FromSlash(localAgent.RepoRelPath))
+		return s.updateTrackedRepo(agentMetaKey(localAgent.RelPath), repoPath, force, skipAudit)
+	}
+
+	metaKey := agentMetaKey(localAgent.RelPath)
+	entry := s.agentsStore.GetByPath(metaKey)
+	if entry == nil || entry.Source == "" {
+		return updateResultItem{
+			Name:    metaKey,
+			Kind:    "agent",
+			Action:  "skipped",
+			Message: "agent is local and has no update source",
+		}
+	}
+
+	source, err := install.ParseSource(entry.Source)
+	if err != nil {
+		return updateResultItem{Name: metaKey, Kind: "agent", Action: "error", Message: "invalid source: " + err.Error()}
+	}
+
+	repoSubdir := strings.TrimSuffix(source.Subdir, entry.Subdir)
+	repoSubdir = strings.TrimRight(repoSubdir, "/")
+	source.Subdir = repoSubdir
+
+	var discovery *install.DiscoveryResult
+	if source.HasSubdir() {
+		discovery, err = install.DiscoverFromGitSubdir(source)
+	} else {
+		discovery, err = install.DiscoverFromGit(source)
+	}
+	if err != nil {
+		return updateResultItem{Name: metaKey, Kind: "agent", Action: "error", Message: err.Error()}
+	}
+	defer install.CleanupDiscovery(discovery)
+
+	if discovery.CommitHash != "" && discovery.CommitHash == entry.Version {
+		return updateResultItem{Name: metaKey, Kind: "agent", Action: "up-to-date"}
+	}
+
+	var target *install.AgentInfo
+	for i := range discovery.Agents {
+		candidate := discovery.Agents[i]
+		if candidate.Path == entry.Subdir ||
+			candidate.FileName == filepath.Base(localAgent.RelPath) ||
+			candidate.Name == filepath.Base(metaKey) {
+			target = &discovery.Agents[i]
+			break
+		}
+	}
+	if target == nil {
+		return updateResultItem{
+			Name:    metaKey,
+			Kind:    "agent",
+			Action:  "error",
+			Message: fmt.Sprintf("agent path %q not found in repository", entry.Subdir),
+		}
+	}
+
+	destDir := agentsSource
+	opts := install.InstallOptions{
+		Kind:           "agent",
+		Force:          force,
+		Update:         true,
+		SkipAudit:      skipAudit,
+		AuditThreshold: s.updateAuditThreshold(),
+		SourceDir:      agentsSource,
+	}
+	if s.IsProjectMode() {
+		opts.AuditProjectRoot = s.projectRoot
+	}
+	res, err := install.UpdateAgentFromDiscovery(discovery, *target, destDir, opts)
+	if err != nil {
+		return updateResultItem{Name: metaKey, Kind: "agent", Action: "error", Message: err.Error()}
+	}
+
+	if st, loadErr := install.LoadMetadataWithMigration(agentsSource, install.MetadataKindAgent); loadErr == nil && st != nil {
+		s.agentsStore = st
+	}
+
+	message := res.Action
+	if message == "" {
+		message = "updated"
+	}
+	return updateResultItem{
+		Name:    metaKey,
+		Kind:    "agent",
+		Action:  "updated",
+		Message: message,
+	}
+}
+
 func (s *Server) updateTrackedRepo(name, repoPath string, force, skipAudit bool) updateResultItem {
 	// Check for uncommitted changes
 	if isDirty, _ := git.IsDirty(repoPath); isDirty {
@@ -261,8 +378,11 @@ func (s *Server) auditGateTrackedRepo(name, repoPath, beforeHash, threshold stri
 }
 
 func (s *Server) updateRegularSkill(name, skillPath string, skipAudit bool) updateResultItem {
-	meta, _ := install.ReadMeta(skillPath)
-	source, err := install.ParseSourceWithOptions(meta.Source, s.parseOpts())
+	entry := s.skillsStore.GetByPath(name)
+	if entry == nil {
+		return updateResultItem{Name: name, Action: "error", Message: "no metadata found"}
+	}
+	source, err := install.ParseSourceWithOptions(entry.Source, s.parseOpts())
 	if err != nil {
 		return updateResultItem{
 			Name:   name,
@@ -314,7 +434,7 @@ func (s *Server) updateAll(force, skipAudit bool) []updateResultItem {
 	}
 
 	// Update regular skills with source metadata
-	skills, err := getServerUpdatableSkills(s.cfg.Source)
+	skills, err := getServerUpdatableSkills(s.cfg.Source, s.skillsStore)
 	if err == nil {
 		for _, skill := range skills {
 			skillPath := filepath.Join(s.cfg.Source, skill)
@@ -327,7 +447,7 @@ func (s *Server) updateAll(force, skipAudit bool) []updateResultItem {
 
 // getServerUpdatableSkills returns relative paths of skills that have metadata with a remote source.
 // It walks the source directory recursively to find nested skills (e.g. utils/ascii-box-check).
-func getServerUpdatableSkills(sourceDir string) ([]string, error) {
+func getServerUpdatableSkills(sourceDir string, store *install.MetadataStore) ([]string, error) {
 	var skills []string
 	walkRoot := utils.ResolveSymlink(sourceDir)
 	err := filepath.WalkDir(walkRoot, func(path string, d os.DirEntry, err error) error {
@@ -350,8 +470,12 @@ func getServerUpdatableSkills(sourceDir string) ([]string, error) {
 			return filepath.SkipDir
 		}
 		// Check if this directory has updatable metadata
-		meta, metaErr := install.ReadMeta(path)
-		if metaErr != nil || meta == nil || meta.Source == "" {
+		relName := filepath.Base(path)
+		if relP, relErr2 := filepath.Rel(walkRoot, path); relErr2 == nil {
+			relName = filepath.ToSlash(relP)
+		}
+		entry := store.GetByPath(relName)
+		if entry == nil || entry.Source == "" {
 			return nil // continue walking into subdirectories
 		}
 		relPath, relErr := filepath.Rel(walkRoot, path)
diff --git a/internal/server/handler_update_stream.go b/internal/server/handler_update_stream.go
index d89d4980..19319415 100644
--- a/internal/server/handler_update_stream.go
+++ b/internal/server/handler_update_stream.go
@@ -85,7 +85,7 @@ func (s *Server) handleUpdateStream(w http.ResponseWriter, r *http.Request) {
 			})
 		}
 	}
-	skills, err := getServerUpdatableSkills(source)
+	skills, err := getServerUpdatableSkills(source, s.skillsStore)
 	if err == nil {
 		for _, skill := range skills {
 			items = append(items, updateItem{
diff --git a/internal/server/resource_agents.go b/internal/server/resource_agents.go
new file mode 100644
index 00000000..90603fc1
--- /dev/null
+++ b/internal/server/resource_agents.go
@@ -0,0 +1,49 @@
+package server
+
+import (
+	"fmt"
+	"path/filepath"
+	"strings"
+
+	"skillshare/internal/resource"
+)
+
+func agentDisplayName(relPath string) string {
+	return strings.TrimSuffix(relPath, ".md")
+}
+
+func matchesAgentName(d resource.DiscoveredResource, name string) bool {
+	return d.FlatName == name ||
+		d.Name == name ||
+		d.RelPath == name ||
+		agentDisplayName(d.RelPath) == name
+}
+
+func findAgent(agentsSource, name string) (resource.DiscoveredResource, error) {
+	discovered, err := resource.AgentKind{}.Discover(agentsSource)
+	if err != nil {
+		return resource.DiscoveredResource{}, fmt.Errorf("failed to discover agents: %w", err)
+	}
+	for _, d := range discovered {
+		if matchesAgentName(d, name) {
+			return d, nil
+		}
+	}
+	return resource.DiscoveredResource{}, fmt.Errorf("agent not found: %s", name)
+}
+
+func resolveAgentResource(agentsSource, name string) (resource.DiscoveredResource, error) {
+	return findAgent(agentsSource, name)
+}
+
+func (s *Server) resolveAgentRelPathWithStatus(agentsSource, name string) (string, bool, error) {
+	d, err := findAgent(agentsSource, name)
+	if err != nil {
+		return "", false, err
+	}
+	return d.RelPath, d.Disabled, nil
+}
+
+func agentMetaKey(relPath string) string {
+	return strings.TrimSuffix(filepath.ToSlash(relPath), ".md")
+}
diff --git a/internal/server/server.go b/internal/server/server.go
index 4bd2eece..c48d1188 100644
--- a/internal/server/server.go
+++ b/internal/server/server.go
@@ -20,12 +20,13 @@ import (
 // Server holds the HTTP server state
 type Server struct {
-	cfg      *config.Config
-	registry *config.Registry
-	addr     string
-	mux      *http.ServeMux
-	handler  http.Handler
-	mu       sync.RWMutex // protects config: Lock for writes/reloads, RLock for reads
+	cfg         *config.Config
+	skillsStore *install.MetadataStore
+	agentsStore *install.MetadataStore
+	addr        string
+	mux         *http.ServeMux
+	handler     http.Handler
+	mu          sync.RWMutex // protects config: Lock for writes/reloads, RLock for reads
 
 	startTime time.Time // for uptime reporting in health check
 
@@ -83,17 +84,22 @@ func (s *Server) wrapBasePath() {
 // New creates a new Server for global mode.
 // uiDistDir, when non-empty, serves UI from disk instead of the embedded SPA.
 func New(cfg *config.Config, addr, basePath, uiDistDir string) *Server {
-	reg, _ := config.LoadRegistry(cfg.RegistryDir)
-	if reg == nil {
-		reg = &config.Registry{}
+	skillsStore, _ := install.LoadMetadataWithMigration(cfg.Source, "")
+	if skillsStore == nil {
+		skillsStore = install.NewMetadataStore()
+	}
+	agentsStore, _ := install.LoadMetadataWithMigration(cfg.EffectiveAgentsSource(), "agent")
+	if agentsStore == nil {
+		agentsStore = install.NewMetadataStore()
 	}
 	s := &Server{
-		cfg:       cfg,
-		registry:  reg,
-		addr:      addr,
-		mux:       http.NewServeMux(),
-		basePath:  NormalizeBasePath(basePath),
-		uiDistDir: uiDistDir,
+		cfg:         cfg,
+		skillsStore: skillsStore,
+		agentsStore: agentsStore,
+		addr:        addr,
+		mux:         http.NewServeMux(),
+		basePath:    NormalizeBasePath(basePath),
+		uiDistDir:   uiDistDir,
 	}
 	s.registerRoutes()
 	s.handler = s.withConfigAutoReload(s.mux)
@@ -104,13 +110,20 @@ func New(cfg *config.Config, addr, basePath, uiDistDir string) *Server {
 
 // NewProject creates a new Server for project mode.
 // uiDistDir, when non-empty, serves UI from disk instead of the embedded SPA.
 func NewProject(cfg *config.Config, projectCfg *config.ProjectConfig, projectRoot, addr, basePath, uiDistDir string) *Server {
-	reg, _ := config.LoadRegistry(filepath.Join(projectRoot, ".skillshare"))
-	if reg == nil {
-		reg = &config.Registry{}
+	skillsDir := filepath.Join(projectRoot, ".skillshare", "skills")
+	agentsDir := filepath.Join(projectRoot, ".skillshare", "agents")
+	skillsStore, _ := install.LoadMetadataWithMigration(skillsDir, "")
+	if skillsStore == nil {
+		skillsStore = install.NewMetadataStore()
+	}
+	agentsStore, _ := install.LoadMetadataWithMigration(agentsDir, "agent")
+	if agentsStore == nil {
+		agentsStore = install.NewMetadataStore()
 	}
 	s := &Server{
 		cfg:         cfg,
-		registry:    reg,
+		skillsStore: skillsStore,
+		agentsStore: agentsStore,
 		addr:        addr,
 		mux:         http.NewServeMux(),
 		basePath:    NormalizeBasePath(basePath),
@@ -129,6 +142,24 @@ func (s *Server) IsProjectMode() bool {
 	return s.projectRoot != ""
 }
 
+// skillsSource returns the skills source directory for the current mode.
+// Caller must hold s.mu (RLock or Lock) when accessing s.cfg.
+func (s *Server) skillsSource() string {
+	if s.IsProjectMode() {
+		return filepath.Join(s.projectRoot, ".skillshare", "skills")
+	}
+	return s.cfg.Source
+}
+
+// agentsSource returns the agents source directory for the current mode.
+// Caller must hold s.mu (RLock or Lock) when accessing s.cfg.
+func (s *Server) agentsSource() string {
+	if s.IsProjectMode() {
+		return filepath.Join(s.projectRoot, ".skillshare", "agents")
+	}
+	return s.cfg.EffectiveAgentsSource()
+}
+
 // cloneTargets returns a shallow copy of the Targets map.
 // Callers must hold s.mu (RLock or Lock).
 func (s *Server) cloneTargets() map[string]config.TargetConfig {
@@ -199,8 +230,13 @@ func (s *Server) reloadConfig() error {
 			return err
 		}
 		s.cfg.Targets = targets
-		if reg, err := config.LoadRegistry(filepath.Join(s.projectRoot, ".skillshare")); err == nil {
-			s.registry = reg
+		skillsDir := filepath.Join(s.projectRoot, ".skillshare", "skills")
+		agentsDir := filepath.Join(s.projectRoot, ".skillshare", "agents")
+		if st, err := install.LoadMetadata(skillsDir); err == nil {
+			s.skillsStore = st
+		}
+		if st, err := install.LoadMetadata(agentsDir); err == nil {
+			s.agentsStore = st
 		}
 		return nil
 	}
@@ -209,8 +245,11 @@ func (s *Server) reloadConfig() error {
 		return err
 	}
 	s.cfg = newCfg
-	if reg, err := config.LoadRegistry(s.cfg.RegistryDir); err == nil {
-		s.registry = reg
+	if st, err := install.LoadMetadata(newCfg.Source); err == nil {
+		s.skillsStore = st
+	}
+	if st, err := install.LoadMetadata(newCfg.EffectiveAgentsSource()); err == nil {
+		s.agentsStore = st
 	}
 	return nil
 }
@@ -319,17 +358,17 @@ func (s *Server) registerRoutes() {
 	// Overview
 	s.mux.HandleFunc("GET /api/overview", s.handleOverview)
 
-	// Skills
-	s.mux.HandleFunc("GET /api/skills", s.handleListSkills)
-	s.mux.HandleFunc("GET /api/skills/templates", s.handleGetTemplates)
-	s.mux.HandleFunc("POST /api/skills", s.handleCreateSkill)
-	s.mux.HandleFunc("GET /api/skills/{name}", s.handleGetSkill)
-	s.mux.HandleFunc("GET /api/skills/{name}/files/{filepath...}", s.handleGetSkillFile)
-	s.mux.HandleFunc("POST /api/skills/{name}/disable", s.handleDisableSkill)
-	s.mux.HandleFunc("POST /api/skills/{name}/enable", s.handleEnableSkill)
-	s.mux.HandleFunc("DELETE /api/skills/{name}", s.handleUninstallSkill)
-	s.mux.HandleFunc("POST /api/skills/batch/targets", s.handleBatchSetTargets)
-	s.mux.HandleFunc("PATCH /api/skills/{name}/targets", s.handleSetSkillTargets)
+	// Resources (skills + agents)
+	s.mux.HandleFunc("GET /api/resources", s.handleListSkills)
+	s.mux.HandleFunc("GET /api/resources/templates", s.handleGetTemplates)
+	s.mux.HandleFunc("POST /api/resources", s.handleCreateSkill)
+	s.mux.HandleFunc("GET /api/resources/{name}", s.handleGetSkill)
+	s.mux.HandleFunc("GET /api/resources/{name}/files/{filepath...}", s.handleGetSkillFile)
+	s.mux.HandleFunc("POST /api/resources/{name}/disable", s.handleDisableSkill)
+	s.mux.HandleFunc("POST /api/resources/{name}/enable", s.handleEnableSkill)
+	s.mux.HandleFunc("DELETE /api/resources/{name}", s.handleUninstallSkill)
+	s.mux.HandleFunc("POST /api/resources/batch/targets", s.handleBatchSetTargets)
+	s.mux.HandleFunc("PATCH /api/resources/{name}/targets", s.handleSetSkillTargets)
 
 	// Targets
 	s.mux.HandleFunc("GET /api/targets", s.handleListTargets)
@@ -435,6 +474,10 @@ func (s *Server) registerRoutes() {
 	s.mux.HandleFunc("GET /api/skillignore", s.handleGetSkillignore)
 	s.mux.HandleFunc("PUT /api/skillignore", s.handlePutSkillignore)
 
+	// Agentignore
+	s.mux.HandleFunc("GET /api/agentignore", s.handleGetAgentignore)
+	s.mux.HandleFunc("PUT /api/agentignore", s.handlePutAgentignore)
+
 	// SPA fallback — must be last
 	if s.uiDistDir != "" {
 		s.mux.Handle("/", spaHandlerFromDisk(s.uiDistDir, s.basePath))
diff --git a/internal/skillignore/skillignore.go b/internal/skillignore/skillignore.go
index 63bb4059..2ef59d13 100644
--- a/internal/skillignore/skillignore.go
+++ b/internal/skillignore/skillignore.go
@@ -135,6 +135,31 @@
 	return m
 }
 
+// ReadAgentIgnoreMatcher reads .agentignore (and .agentignore.local) from dir
+// and returns a compiled Matcher. Same gitignore-style pattern format as .skillignore.
+func ReadAgentIgnoreMatcher(dir string) *Matcher {
+	var lines []string
+	var hasLocal bool
+
+	data, err := os.ReadFile(filepath.Join(dir, ".agentignore"))
+	if err == nil {
+		lines = strings.Split(string(data), "\n")
+	}
+
+	localData, localErr := os.ReadFile(filepath.Join(dir, ".agentignore.local"))
+	if localErr == nil {
+		hasLocal = true
+		lines = append(lines, strings.Split(string(localData), "\n")...)
+	}
+
+	if len(lines) == 0 {
+		return &Matcher{}
+	}
+	m := Compile(lines)
+	m.HasLocal = hasLocal
+	return m
+}
+
 // HasRules reports whether the matcher has any compiled rules.
 func (m *Matcher) HasRules() bool {
 	return m != nil && len(m.rules) > 0
diff --git a/internal/sync/agent_sync.go b/internal/sync/agent_sync.go
new file mode 100644
index 00000000..3d242450
--- /dev/null
+++ b/internal/sync/agent_sync.go
@@ -0,0 +1,450 @@
+package sync
+
+import (
+	"errors"
+	"fmt"
+	"os"
+	"path/filepath"
+	"strings"
+
+	"skillshare/internal/resource"
+	"skillshare/internal/utils"
+)
+
+// AgentSyncResult holds the result of syncing agents to a target.
+type AgentSyncResult struct {
+	Linked  []string // Agents that were symlinked (merge) or copied (copy)
+	Skipped []string // Agents that already exist in target (kept local)
+	Updated []string // Agents that had broken symlinks fixed or content updated
+}
+
+// AgentCollision represents two agents that flatten to the same filename.
+type AgentCollision struct {
+	FlatName string // The colliding flat name (e.g. "helper.md")
+	PathA    string // First agent relative path
+	PathB    string // Second agent relative path
+}
+
+// LocalAgentInfo describes a local agent file in a target directory.
+type LocalAgentInfo struct {
+	Name       string
+	Path       string
+	TargetName string
+}
+
+// CheckAgentCollisions detects agents that flatten to the same filename.
+func CheckAgentCollisions(agents []resource.DiscoveredResource) []AgentCollision {
+	seen := make(map[string]string) // flatName → first relPath
+	var collisions []AgentCollision
+
+	for _, a := range agents {
+		if prev, ok := seen[a.FlatName]; ok {
+			collisions = append(collisions, AgentCollision{
+				FlatName: a.FlatName,
+				PathA:    prev,
+				PathB:    a.RelPath,
+			})
+		} else {
+			seen[a.FlatName] = a.RelPath
+		}
+	}
+
+	return collisions
+}
+
+// SyncAgents dispatches to the appropriate sync mode for agents.
+// mode: "merge" (per-file symlinks), "symlink" (whole dir), "copy" (file copy).
+func SyncAgents(agents []resource.DiscoveredResource, sourceDir, targetDir, mode string, dryRun, force bool) (*AgentSyncResult, error) {
+	switch mode {
+	case "symlink":
+		return syncAgentsSymlink(sourceDir, targetDir, dryRun, force)
+	case "copy":
+		return syncAgentsCopy(agents, targetDir, dryRun, force)
+	default: // "merge" or ""
+		return syncAgentsMerge(agents, targetDir, dryRun, force)
+	}
+}
+
+// syncAgentsMerge creates per-file symlinks in targetDir for each discovered agent.
+// Existing non-symlink files are preserved (skipped) unless force is true.
+func syncAgentsMerge(agents []resource.DiscoveredResource, targetDir string, dryRun, force bool) (*AgentSyncResult, error) {
+	result := &AgentSyncResult{}
+
+	if !dryRun {
+		if err := os.MkdirAll(targetDir, 0755); err != nil {
+			return nil, fmt.Errorf("failed to create agent target directory: %w", err)
+		}
+	}
+
+	for _, agent := range agents {
+		targetPath := filepath.Join(targetDir, agent.FlatName)
+
+		info, err := os.Lstat(targetPath)
+		if err == nil {
+			if info.Mode()&os.ModeSymlink != 0 {
+				absLink, linkErr := utils.ResolveLinkTarget(targetPath)
+				if linkErr != nil {
+					return nil, fmt.Errorf("failed to resolve link for %s: %w", agent.FlatName, linkErr)
+				}
+				absSource, _ := filepath.Abs(agent.AbsPath)
+
+				if utils.PathsEqual(absLink, absSource) {
+					result.Linked = append(result.Linked, agent.FlatName)
+					continue
+				}
+
+				if !dryRun {
+					// Replace the stale symlink; surface removal failures instead of
+					// letting the subsequent Symlink fail with a less specific error.
+					if err := os.Remove(targetPath); err != nil {
+						return nil, fmt.Errorf("failed to replace %s: %w", agent.FlatName, err)
+					}
+					if err := os.Symlink(agent.AbsPath, targetPath); err != nil {
+						return nil, fmt.Errorf("failed to create symlink for %s: %w", agent.FlatName, err)
+					}
+				}
+				result.Updated = append(result.Updated, agent.FlatName)
+			} else {
+				if force {
+					if !dryRun {
+						if err := os.Remove(targetPath); err != nil {
+							return nil, fmt.Errorf("failed to replace %s: %w", agent.FlatName, err)
+						}
+						if err := os.Symlink(agent.AbsPath, targetPath); err != nil {
+							return nil, fmt.Errorf("failed to create symlink for %s: %w", agent.FlatName, err)
+						}
+					}
+					result.Updated = append(result.Updated, agent.FlatName)
+				} else {
+					result.Skipped = append(result.Skipped, agent.FlatName)
+				}
+			}
+		} else if os.IsNotExist(err) {
+			if !dryRun {
+				if err := os.Symlink(agent.AbsPath, targetPath); err != nil {
+					return nil, fmt.Errorf("failed to create symlink for %s: %w", agent.FlatName, err)
+				}
+			}
+			result.Linked = append(result.Linked, agent.FlatName)
+		} else {
+			return nil, fmt.Errorf("failed to check target path for %s: %w", agent.FlatName, err)
+		}
+	}
+
+	return result, nil
+}
+
+// syncAgentsSymlink creates a single directory symlink from targetDir to sourceDir.
+// If targetDir already exists as a real directory, it is replaced only when force is set.
+func syncAgentsSymlink(sourceDir, targetDir string, dryRun, force bool) (*AgentSyncResult, error) {
+	result := &AgentSyncResult{}
+
+	if !dryRun {
+		// Guarded by dryRun, matching merge and copy modes: a dry run must not
+		// create the target's parent directory as a side effect.
+		if err := os.MkdirAll(filepath.Dir(targetDir), 0755); err != nil {
+			return nil, fmt.Errorf("failed to create target parent: %w", err)
+		}
+	}
+
+	info, err := os.Lstat(targetDir)
+	if err == nil {
+		if info.Mode()&os.ModeSymlink != 0 {
+			// Already a symlink — check if correct
+			absLink, linkErr := utils.ResolveLinkTarget(targetDir)
+			if linkErr != nil {
+				return nil, fmt.Errorf("failed to resolve link: %w", linkErr)
+			}
+			absSource, _ := filepath.Abs(sourceDir)
+
+			if utils.PathsEqual(absLink, absSource) {
+				result.Linked = append(result.Linked, "(directory)")
+				return result, nil
+			}
+
+			// Wrong target
+			if !dryRun {
+				os.Remove(targetDir)
+				if err := os.Symlink(sourceDir, targetDir); err != nil {
+					return nil, fmt.Errorf("failed to create directory symlink: %w", err)
+				}
+			}
+			result.Updated = append(result.Updated, "(directory)")
+		} else {
+			// Real directory
+			if force {
+				if !dryRun {
+					os.RemoveAll(targetDir)
+					if err := os.Symlink(sourceDir, targetDir); err != nil {
+						return nil, fmt.Errorf("failed to create directory symlink: %w", err)
+					}
+				}
+				result.Updated = append(result.Updated, "(directory)")
+			} else {
+				result.Skipped = append(result.Skipped, "(directory)")
+			}
+		}
+	} else if os.IsNotExist(err) {
+		if !dryRun {
+			if err := os.Symlink(sourceDir, targetDir); err != nil {
+				return nil, fmt.Errorf("failed to create directory symlink: %w", err)
+			}
+		}
+		result.Linked = append(result.Linked, "(directory)")
+	} else {
+		return nil, fmt.Errorf("failed to check target path: %w", err)
+	}
+
+	return result, nil
+}
+
+// syncAgentsCopy copies agent .md files to targetDir.
+// Existing files are overwritten if content differs; force replaces all.
+func syncAgentsCopy(agents []resource.DiscoveredResource, targetDir string, dryRun, force bool) (*AgentSyncResult, error) { + result := &AgentSyncResult{} + + if !dryRun { + if err := os.MkdirAll(targetDir, 0755); err != nil { + return nil, fmt.Errorf("failed to create agent target directory: %w", err) + } + } + + for _, agent := range agents { + targetPath := filepath.Join(targetDir, agent.FlatName) + + srcData, err := os.ReadFile(agent.AbsPath) + if err != nil { + return nil, fmt.Errorf("failed to read source %s: %w", agent.FlatName, err) + } + + if _, statErr := os.Stat(targetPath); statErr == nil { + // File exists — check if content matches + tgtData, readErr := os.ReadFile(targetPath) + if readErr == nil && string(tgtData) == string(srcData) && !force { + result.Linked = append(result.Linked, agent.FlatName) + continue + } + // Content differs or force — overwrite + if !dryRun { + if err := os.WriteFile(targetPath, srcData, 0644); err != nil { + return nil, fmt.Errorf("failed to write %s: %w", agent.FlatName, err) + } + } + result.Updated = append(result.Updated, agent.FlatName) + } else { + // New file + if !dryRun { + if err := os.WriteFile(targetPath, srcData, 0644); err != nil { + return nil, fmt.Errorf("failed to write %s: %w", agent.FlatName, err) + } + } + result.Linked = append(result.Linked, agent.FlatName) + } + } + + return result, nil +} + +// SyncAgentsToTarget creates file symlinks in targetDir for each discovered agent. +// Uses merge semantics. Kept for backward compatibility; prefer SyncAgents(). +func SyncAgentsToTarget(agents []resource.DiscoveredResource, targetDir string, dryRun, force bool) (*AgentSyncResult, error) { + return syncAgentsMerge(agents, targetDir, dryRun, force) +} + +// PruneOrphanAgentLinks removes file symlinks in targetDir that don't +// correspond to any discovered agent. For merge mode only. 
+func PruneOrphanAgentLinks(targetDir string, agents []resource.DiscoveredResource, dryRun bool) (removed []string, _ error) {
+	entries, err := os.ReadDir(targetDir)
+	if err != nil {
+		if os.IsNotExist(err) {
+			return nil, nil
+		}
+		return nil, fmt.Errorf("failed to read agent target directory: %w", err)
+	}
+
+	expected := make(map[string]bool, len(agents))
+	for _, a := range agents {
+		expected[a.FlatName] = true
+	}
+
+	for _, entry := range entries {
+		name := entry.Name()
+
+		if !strings.HasSuffix(strings.ToLower(name), ".md") {
+			continue
+		}
+
+		info, err := entry.Info()
+		if err != nil {
+			continue
+		}
+
+		if info.Mode()&os.ModeSymlink == 0 {
+			continue
+		}
+
+		if expected[name] {
+			continue
+		}
+
+		if !dryRun {
+			// Only report the entry as removed if removal actually succeeded.
+			if err := os.Remove(filepath.Join(targetDir, name)); err != nil {
+				return removed, fmt.Errorf("failed to remove orphan link %s: %w", name, err)
+			}
+		}
+		removed = append(removed, name)
+	}
+
+	return removed, nil
+}
+
+// PruneOrphanAgentCopies removes copied .md files in targetDir that don't
+// correspond to any discovered agent. For copy mode only.
+func PruneOrphanAgentCopies(targetDir string, agents []resource.DiscoveredResource, dryRun bool) (removed []string, _ error) {
+	entries, err := os.ReadDir(targetDir)
+	if err != nil {
+		if os.IsNotExist(err) {
+			return nil, nil
+		}
+		return nil, fmt.Errorf("failed to read agent target directory: %w", err)
+	}
+
+	expected := make(map[string]bool, len(agents))
+	for _, a := range agents {
+		expected[a.FlatName] = true
+	}
+
+	for _, entry := range entries {
+		name := entry.Name()
+
+		if !strings.HasSuffix(strings.ToLower(name), ".md") {
+			continue
+		}
+
+		// Skip conventional excludes (user might have README.md etc.)
+		if resource.ConventionalExcludes[name] {
+			continue
+		}
+
+		if expected[name] {
+			continue
+		}
+
+		if !dryRun {
+			// Only report the entry as removed if removal actually succeeded.
+			if err := os.Remove(filepath.Join(targetDir, name)); err != nil {
+				return removed, fmt.Errorf("failed to remove orphan copy %s: %w", name, err)
+			}
+		}
+		removed = append(removed, name)
+	}
+
+	return removed, nil
+}
+
+// FindLocalAgents finds local (non-symlinked) agent files in a target directory.
+// If the target directory itself is a symlink to sourcePath, it returns no local agents. +func FindLocalAgents(targetDir, sourcePath string) ([]LocalAgentInfo, error) { + var agents []LocalAgentInfo + + info, err := os.Lstat(targetDir) + if err != nil { + if os.IsNotExist(err) { + return agents, nil + } + return nil, fmt.Errorf("failed to read agent target directory: %w", err) + } + + if info.Mode()&os.ModeSymlink != 0 { + absLink, err := utils.ResolveLinkTarget(targetDir) + if err != nil { + return nil, err + } + absSource, _ := filepath.Abs(sourcePath) + if utils.PathsEqual(absLink, absSource) { + return agents, nil + } + resolved, statErr := os.Stat(targetDir) + if statErr != nil || !resolved.IsDir() { + return agents, nil + } + } + + entries, err := os.ReadDir(targetDir) + if err != nil { + if os.IsNotExist(err) { + return agents, nil + } + return nil, fmt.Errorf("failed to read agent target directory: %w", err) + } + + for _, entry := range entries { + name := entry.Name() + + if !strings.HasSuffix(strings.ToLower(name), ".md") { + continue + } + if utils.IsHidden(name) || resource.ConventionalExcludes[name] { + continue + } + + info, err := entry.Info() + if err != nil { + continue + } + + if info.IsDir() || info.Mode()&os.ModeSymlink != 0 { + continue + } + + agents = append(agents, LocalAgentInfo{ + Name: name, + Path: filepath.Join(targetDir, name), + }) + } + + return agents, nil +} + +// PullAgent copies a single local agent file from target to source. 
+func PullAgent(agent LocalAgentInfo, sourcePath string, force bool) error { + destPath := filepath.Join(sourcePath, agent.Name) + + if _, err := os.Stat(destPath); err == nil { + if !force { + return ErrAlreadyExists + } + if err := os.RemoveAll(destPath); err != nil { + return fmt.Errorf("failed to remove existing: %w", err) + } + } + + data, err := os.ReadFile(agent.Path) + if err != nil { + return fmt.Errorf("failed to read %s: %w", agent.Name, err) + } + if err := os.WriteFile(destPath, data, 0644); err != nil { + return fmt.Errorf("failed to write %s: %w", agent.Name, err) + } + + return nil +} + +// PullAgents copies multiple local agent files from targets to source. +func PullAgents(agents []LocalAgentInfo, sourcePath string, opts PullOptions) (*PullResult, error) { + result := &PullResult{ + Failed: make(map[string]error), + } + + if !opts.DryRun { + if err := os.MkdirAll(sourcePath, 0755); err != nil { + return nil, fmt.Errorf("failed to create agent source dir: %w", err) + } + } + + for _, agent := range agents { + if opts.DryRun { + result.Pulled = append(result.Pulled, agent.Name) + continue + } + + err := PullAgent(agent, sourcePath, opts.Force) + if err != nil { + if errors.Is(err, ErrAlreadyExists) { + result.Skipped = append(result.Skipped, agent.Name) + } else { + result.Failed[agent.Name] = err + } + continue + } + + result.Pulled = append(result.Pulled, agent.Name) + } + + return result, nil +} diff --git a/internal/sync/agent_sync_test.go b/internal/sync/agent_sync_test.go new file mode 100644 index 00000000..48dd373f --- /dev/null +++ b/internal/sync/agent_sync_test.go @@ -0,0 +1,618 @@ +package sync + +import ( + "os" + "path/filepath" + "testing" + + "skillshare/internal/resource" +) + +func TestCheckAgentCollisions_NoCollision(t *testing.T) { + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", RelPath: "tutor.md"}, + {FlatName: "reviewer.md", RelPath: "reviewer.md"}, + } + collisions := CheckAgentCollisions(agents) + if 
len(collisions) != 0 { + t.Errorf("expected 0 collisions, got %d", len(collisions)) + } +} + +func TestCheckAgentCollisions_HasCollision(t *testing.T) { + agents := []resource.DiscoveredResource{ + {FlatName: "team__helper.md", RelPath: "team/helper.md"}, + {FlatName: "team__helper.md", RelPath: "team__helper.md"}, + } + collisions := CheckAgentCollisions(agents) + if len(collisions) != 1 { + t.Fatalf("expected 1 collision, got %d", len(collisions)) + } + if collisions[0].FlatName != "team__helper.md" { + t.Errorf("collision FlatName = %q", collisions[0].FlatName) + } +} + +func TestSyncAgentsToTarget_NewLinks(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + // Create agent source files + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# Tutor"), 0644) + os.WriteFile(filepath.Join(sourceDir, "reviewer.md"), []byte("# Reviewer"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: filepath.Join(sourceDir, "tutor.md")}, + {FlatName: "reviewer.md", AbsPath: filepath.Join(sourceDir, "reviewer.md")}, + } + + result, err := SyncAgentsToTarget(agents, targetDir, false, false) + if err != nil { + t.Fatalf("SyncAgentsToTarget: %v", err) + } + + if len(result.Linked) != 2 { + t.Errorf("expected 2 linked, got %d", len(result.Linked)) + } + + // Verify symlinks exist + for _, name := range []string{"tutor.md", "reviewer.md"} { + linkPath := filepath.Join(targetDir, name) + info, err := os.Lstat(linkPath) + if err != nil { + t.Errorf("expected symlink %s to exist", name) + continue + } + if info.Mode()&os.ModeSymlink == 0 { + t.Errorf("expected %s to be a symlink", name) + } + } +} + +func TestSyncAgentsToTarget_ExistingSymlinkCorrect(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + srcFile := filepath.Join(sourceDir, "tutor.md") + os.WriteFile(srcFile, []byte("# Tutor"), 0644) + + // Pre-create correct symlink + os.Symlink(srcFile, filepath.Join(targetDir, "tutor.md")) + + agents := 
[]resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: srcFile}, + } + + result, err := SyncAgentsToTarget(agents, targetDir, false, false) + if err != nil { + t.Fatalf("SyncAgentsToTarget: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked (existing correct), got %d", len(result.Linked)) + } + if len(result.Updated) != 0 { + t.Errorf("expected 0 updated, got %d", len(result.Updated)) + } +} + +func TestSyncAgentsToTarget_LocalFileSkipped(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + srcFile := filepath.Join(sourceDir, "tutor.md") + os.WriteFile(srcFile, []byte("# Tutor source"), 0644) + + // Pre-create local file (not a symlink) + os.WriteFile(filepath.Join(targetDir, "tutor.md"), []byte("# Local tutor"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: srcFile}, + } + + result, err := SyncAgentsToTarget(agents, targetDir, false, false) + if err != nil { + t.Fatalf("SyncAgentsToTarget: %v", err) + } + + if len(result.Skipped) != 1 { + t.Errorf("expected 1 skipped, got %d", len(result.Skipped)) + } +} + +func TestSyncAgentsToTarget_ForceReplacesLocal(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + srcFile := filepath.Join(sourceDir, "tutor.md") + os.WriteFile(srcFile, []byte("# Tutor source"), 0644) + + os.WriteFile(filepath.Join(targetDir, "tutor.md"), []byte("# Local"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: srcFile}, + } + + result, err := SyncAgentsToTarget(agents, targetDir, false, true) + if err != nil { + t.Fatalf("SyncAgentsToTarget: %v", err) + } + + if len(result.Updated) != 1 { + t.Errorf("expected 1 updated, got %d", len(result.Updated)) + } + + // Should now be a symlink + info, _ := os.Lstat(filepath.Join(targetDir, "tutor.md")) + if info.Mode()&os.ModeSymlink == 0 { + t.Error("expected symlink after force") + } +} + +func TestPruneOrphanAgentLinks(t *testing.T) { + sourceDir := 
t.TempDir() + targetDir := t.TempDir() + + // Create source file and active symlink + srcFile := filepath.Join(sourceDir, "active.md") + os.WriteFile(srcFile, []byte("# Active"), 0644) + os.Symlink(srcFile, filepath.Join(targetDir, "active.md")) + + // Create orphan symlink + orphanSrc := filepath.Join(sourceDir, "orphan.md") + os.WriteFile(orphanSrc, []byte("# Orphan"), 0644) + os.Symlink(orphanSrc, filepath.Join(targetDir, "orphan.md")) + + // Create non-symlink file (should not be removed) + os.WriteFile(filepath.Join(targetDir, "local.md"), []byte("# Local"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "active.md"}, + } + + removed, err := PruneOrphanAgentLinks(targetDir, agents, false) + if err != nil { + t.Fatalf("PruneOrphanAgentLinks: %v", err) + } + + if len(removed) != 1 { + t.Fatalf("expected 1 removed, got %d: %v", len(removed), removed) + } + if removed[0] != "orphan.md" { + t.Errorf("expected orphan.md removed, got %q", removed[0]) + } + + // local.md should still exist + if _, err := os.Stat(filepath.Join(targetDir, "local.md")); err != nil { + t.Error("local.md should not be removed") + } +} + +func TestPruneOrphanAgentLinks_NonExistentDir(t *testing.T) { + removed, err := PruneOrphanAgentLinks("/nonexistent/path", nil, false) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + if len(removed) != 0 { + t.Errorf("expected 0 removed, got %d", len(removed)) + } +} + +func TestFindLocalAgentsAndPullAgents(t *testing.T) { + targetDir := t.TempDir() + sourceDir := t.TempDir() + + // Create a local (non-symlink) agent file in target + os.WriteFile(filepath.Join(targetDir, "new-agent.md"), []byte("# New agent"), 0644) + + // Create a symlink (should be skipped) + srcFile := filepath.Join(sourceDir, "existing.md") + os.WriteFile(srcFile, []byte("# Existing"), 0644) + os.Symlink(srcFile, filepath.Join(targetDir, "existing.md")) + + // Create README (should be skipped) + os.WriteFile(filepath.Join(targetDir, "README.md"), 
[]byte("# Readme"), 0644) + + // Create non-md file (should be skipped) + os.WriteFile(filepath.Join(targetDir, "config.yaml"), []byte("key: val"), 0644) + + collectDir := t.TempDir() + agents, err := FindLocalAgents(targetDir, collectDir) + if err != nil { + t.Fatalf("FindLocalAgents: %v", err) + } + + if len(agents) != 1 { + t.Fatalf("expected 1 local agent, got %d: %v", len(agents), agents) + } + if agents[0].Name != "new-agent.md" { + t.Errorf("agent name = %q, want %q", agents[0].Name, "new-agent.md") + } + + result, err := PullAgents(agents, collectDir, PullOptions{}) + if err != nil { + t.Fatalf("PullAgents: %v", err) + } + if len(result.Pulled) != 1 || result.Pulled[0] != "new-agent.md" { + t.Fatalf("expected pulled=[new-agent.md], got %#v", result) + } + + // Verify file was copied + data, err := os.ReadFile(filepath.Join(collectDir, "new-agent.md")) + if err != nil { + t.Fatalf("collected file not found: %v", err) + } + if string(data) != "# New agent" { + t.Errorf("collected content = %q", string(data)) + } +} + +func TestFindLocalAgents_TargetSymlinkToSource_ReturnsEmpty(t *testing.T) { + sourceDir := t.TempDir() + targetParent := t.TempDir() + targetDir := filepath.Join(targetParent, "agents") + + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# Tutor"), 0644) + os.Symlink(sourceDir, targetDir) + + agents, err := FindLocalAgents(targetDir, sourceDir) + if err != nil { + t.Fatalf("FindLocalAgents: %v", err) + } + if len(agents) != 0 { + t.Fatalf("expected 0 local agents, got %d", len(agents)) + } +} + +func TestPullAgents_SkipsExistingWithoutForce(t *testing.T) { + targetDir := t.TempDir() + collectDir := t.TempDir() + + os.WriteFile(filepath.Join(targetDir, "agent.md"), []byte("# Target"), 0644) + os.WriteFile(filepath.Join(collectDir, "agent.md"), []byte("# Source"), 0644) + + agents, err := FindLocalAgents(targetDir, collectDir) + if err != nil { + t.Fatalf("FindLocalAgents: %v", err) + } + + result, err := PullAgents(agents, collectDir, 
PullOptions{}) + if err != nil { + t.Fatalf("PullAgents: %v", err) + } + if len(result.Skipped) != 1 || result.Skipped[0] != "agent.md" { + t.Fatalf("expected skipped=[agent.md], got %#v", result) + } + + data, err := os.ReadFile(filepath.Join(collectDir, "agent.md")) + if err != nil { + t.Fatalf("read source agent: %v", err) + } + if string(data) != "# Source" { + t.Fatalf("source agent should remain unchanged, got %q", string(data)) + } +} + +func TestPullAgents_ForceOverwritesExisting(t *testing.T) { + targetDir := t.TempDir() + collectDir := t.TempDir() + + os.WriteFile(filepath.Join(targetDir, "agent.md"), []byte("# Target"), 0644) + os.WriteFile(filepath.Join(collectDir, "agent.md"), []byte("# Source"), 0644) + + agents, err := FindLocalAgents(targetDir, collectDir) + if err != nil { + t.Fatalf("FindLocalAgents: %v", err) + } + + result, err := PullAgents(agents, collectDir, PullOptions{Force: true}) + if err != nil { + t.Fatalf("PullAgents: %v", err) + } + if len(result.Pulled) != 1 || result.Pulled[0] != "agent.md" { + t.Fatalf("expected pulled=[agent.md], got %#v", result) + } + + data, err := os.ReadFile(filepath.Join(collectDir, "agent.md")) + if err != nil { + t.Fatalf("read source agent: %v", err) + } + if string(data) != "# Target" { + t.Fatalf("source agent should be overwritten, got %q", string(data)) + } +} + +// --- Symlink mode tests --- + +func TestSyncAgents_SymlinkMode_NewDir(t *testing.T) { + sourceDir := t.TempDir() + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# Tutor"), 0644) + + targetDir := filepath.Join(t.TempDir(), "agents") + + result, err := SyncAgents(nil, sourceDir, targetDir, "symlink", false, false) + if err != nil { + t.Fatalf("SyncAgents symlink: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked, got %d", len(result.Linked)) + } + + // targetDir should be a symlink to sourceDir + info, err := os.Lstat(targetDir) + if err != nil { + t.Fatalf("Lstat: %v", err) + } + if 
info.Mode()&os.ModeSymlink == 0 { + t.Error("expected targetDir to be a symlink") + } +} + +func TestSyncAgents_SymlinkMode_AlreadyCorrect(t *testing.T) { + sourceDir := t.TempDir() + parentDir := t.TempDir() + targetDir := filepath.Join(parentDir, "agents") + + os.Symlink(sourceDir, targetDir) + + result, err := SyncAgents(nil, sourceDir, targetDir, "symlink", false, false) + if err != nil { + t.Fatalf("SyncAgents symlink: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked (already correct), got %d", len(result.Linked)) + } + if len(result.Updated) != 0 { + t.Errorf("expected 0 updated, got %d", len(result.Updated)) + } +} + +func TestSyncAgents_SymlinkMode_RealDirSkipped(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() // real directory + + result, err := SyncAgents(nil, sourceDir, targetDir, "symlink", false, false) + if err != nil { + t.Fatalf("SyncAgents symlink: %v", err) + } + + if len(result.Skipped) != 1 { + t.Errorf("expected 1 skipped, got %d", len(result.Skipped)) + } +} + +// --- Copy mode tests --- + +func TestSyncAgents_CopyMode_NewFiles(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# Tutor"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: filepath.Join(sourceDir, "tutor.md")}, + } + + result, err := SyncAgents(agents, sourceDir, targetDir, "copy", false, false) + if err != nil { + t.Fatalf("SyncAgents copy: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked (new copy), got %d", len(result.Linked)) + } + + // Verify it's a real file, not a symlink + info, _ := os.Lstat(filepath.Join(targetDir, "tutor.md")) + if info.Mode()&os.ModeSymlink != 0 { + t.Error("copy mode should create real files, not symlinks") + } + + data, _ := os.ReadFile(filepath.Join(targetDir, "tutor.md")) + if string(data) != "# Tutor" { + t.Errorf("content = %q", string(data)) + } +} + 
+func TestSyncAgents_CopyMode_SameContent(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# Same"), 0644) + os.WriteFile(filepath.Join(targetDir, "tutor.md"), []byte("# Same"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: filepath.Join(sourceDir, "tutor.md")}, + } + + result, err := SyncAgents(agents, sourceDir, targetDir, "copy", false, false) + if err != nil { + t.Fatalf("SyncAgents copy: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked (same content), got %d", len(result.Linked)) + } + if len(result.Updated) != 0 { + t.Errorf("expected 0 updated, got %d", len(result.Updated)) + } +} + +func TestSyncAgents_CopyMode_DifferentContent(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + os.WriteFile(filepath.Join(sourceDir, "tutor.md"), []byte("# New"), 0644) + os.WriteFile(filepath.Join(targetDir, "tutor.md"), []byte("# Old"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "tutor.md", AbsPath: filepath.Join(sourceDir, "tutor.md")}, + } + + result, err := SyncAgents(agents, sourceDir, targetDir, "copy", false, false) + if err != nil { + t.Fatalf("SyncAgents copy: %v", err) + } + + if len(result.Updated) != 1 { + t.Errorf("expected 1 updated, got %d", len(result.Updated)) + } + + data, _ := os.ReadFile(filepath.Join(targetDir, "tutor.md")) + if string(data) != "# New" { + t.Errorf("content = %q, want %q", string(data), "# New") + } +} + +func TestPruneOrphanAgentCopies(t *testing.T) { + targetDir := t.TempDir() + + os.WriteFile(filepath.Join(targetDir, "active.md"), []byte("# Active"), 0644) + os.WriteFile(filepath.Join(targetDir, "orphan.md"), []byte("# Orphan"), 0644) + os.WriteFile(filepath.Join(targetDir, "README.md"), []byte("# Readme"), 0644) // conventional, skip + + agents := []resource.DiscoveredResource{ + {FlatName: "active.md"}, + } + + removed, err := 
PruneOrphanAgentCopies(targetDir, agents, false) + if err != nil { + t.Fatalf("PruneOrphanAgentCopies: %v", err) + } + + if len(removed) != 1 || removed[0] != "orphan.md" { + t.Errorf("expected [orphan.md] removed, got %v", removed) + } + + // README.md should still exist + if _, err := os.Stat(filepath.Join(targetDir, "README.md")); err != nil { + t.Error("README.md should not be removed") + } +} + +// --- Dispatch tests --- + +func TestSyncAgents_DefaultIsMerge(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + os.WriteFile(filepath.Join(sourceDir, "a.md"), []byte("# A"), 0644) + + agents := []resource.DiscoveredResource{ + {FlatName: "a.md", AbsPath: filepath.Join(sourceDir, "a.md")}, + } + + result, err := SyncAgents(agents, sourceDir, targetDir, "", false, false) + if err != nil { + t.Fatalf("SyncAgents default: %v", err) + } + + if len(result.Linked) != 1 { + t.Errorf("expected 1 linked, got %d", len(result.Linked)) + } + + // Should be a symlink (merge mode) + info, _ := os.Lstat(filepath.Join(targetDir, "a.md")) + if info.Mode()&os.ModeSymlink == 0 { + t.Error("default mode should create symlinks (merge)") + } +} + +func TestSyncAgents_MergeMode_NestedSameBasename_IsStable(t *testing.T) { + sourceDir := t.TempDir() + targetDir := t.TempDir() + + if err := os.MkdirAll(filepath.Join(sourceDir, "team-a"), 0o755); err != nil { + t.Fatalf("mkdir team-a: %v", err) + } + if err := os.MkdirAll(filepath.Join(sourceDir, "team-b"), 0o755); err != nil { + t.Fatalf("mkdir team-b: %v", err) + } + + teamAPath := filepath.Join(sourceDir, "team-a", "helper.md") + teamBPath := filepath.Join(sourceDir, "team-b", "helper.md") + if err := os.WriteFile(teamAPath, []byte("# Team A"), 0o644); err != nil { + t.Fatalf("write team-a helper: %v", err) + } + if err := os.WriteFile(teamBPath, []byte("# Team B"), 0o644); err != nil { + t.Fatalf("write team-b helper: %v", err) + } + + agents, err := resource.AgentKind{}.Discover(sourceDir) + if err != nil { + 
t.Fatalf("discover agents: %v", err) + } + if len(agents) != 2 { + t.Fatalf("expected 2 agents, got %d", len(agents)) + } + + first, err := SyncAgents(agents, sourceDir, targetDir, "merge", false, false) + if err != nil { + t.Fatalf("first sync: %v", err) + } + if len(first.Linked) != 2 { + t.Fatalf("first sync: expected 2 linked, got %d", len(first.Linked)) + } + if len(first.Updated) != 0 { + t.Fatalf("first sync: expected 0 updated, got %d", len(first.Updated)) + } + + second, err := SyncAgents(agents, sourceDir, targetDir, "merge", false, false) + if err != nil { + t.Fatalf("second sync: %v", err) + } + if len(second.Linked) != 2 { + t.Fatalf("second sync: expected 2 linked, got %d", len(second.Linked)) + } + if len(second.Updated) != 0 { + t.Fatalf("second sync: expected 0 updated, got %d", len(second.Updated)) + } + + linkA, err := os.Readlink(filepath.Join(targetDir, "team-a__helper.md")) + if err != nil { + t.Fatalf("readlink team-a target: %v", err) + } + // EvalSymlinks to handle macOS /var → /private/var alias. 
+ wantA, _ := filepath.EvalSymlinks(teamAPath) + gotA, _ := filepath.EvalSymlinks(linkA) + if gotA != wantA { + t.Fatalf("team-a symlink = %q, want %q", linkA, teamAPath) + } + + linkB, err := os.Readlink(filepath.Join(targetDir, "team-b__helper.md")) + if err != nil { + t.Fatalf("readlink team-b target: %v", err) + } + wantB, _ := filepath.EvalSymlinks(teamBPath) + gotB, _ := filepath.EvalSymlinks(linkB) + if gotB != wantB { + t.Fatalf("team-b symlink = %q, want %q", linkB, teamBPath) + } +} + +func TestPullAgents_DryRun(t *testing.T) { + targetDir := t.TempDir() + os.WriteFile(filepath.Join(targetDir, "agent.md"), []byte("# Agent"), 0644) + + collectDir := t.TempDir() + agents, err := FindLocalAgents(targetDir, collectDir) + if err != nil { + t.Fatalf("FindLocalAgents: %v", err) + } + + result, err := PullAgents(agents, collectDir, PullOptions{DryRun: true}) + if err != nil { + t.Fatalf("PullAgents dry-run: %v", err) + } + + if len(result.Pulled) != 1 { + t.Fatalf("expected 1 collected in dry-run, got %d", len(result.Pulled)) + } + + // File should NOT exist (dry-run) + if _, err := os.Stat(filepath.Join(collectDir, "agent.md")); err == nil { + t.Error("file should not exist in dry-run") + } +} diff --git a/internal/sync/filter.go b/internal/sync/filter.go index b276d68d..d437696e 100644 --- a/internal/sync/filter.go +++ b/internal/sync/filter.go @@ -6,6 +6,7 @@ import ( "strings" "skillshare/internal/config" + "skillshare/internal/resource" ) // FilterSkills filters discovered skills by include/exclude patterns. @@ -26,6 +27,27 @@ func FilterSkills(skills []DiscoveredSkill, include, exclude []string) ([]Discov return filtered, nil } +// FilterAgents filters discovered agents by include/exclude patterns. +// Agent FlatNames include the .md extension (e.g. "tutor.md"), but filter +// patterns are matched against the name without extension so users can write +// intuitive patterns like "tutor" or "team-*" instead of "tutor.md". 
+func FilterAgents(agents []resource.DiscoveredResource, include, exclude []string) ([]resource.DiscoveredResource, error) { + includePatterns, excludePatterns, err := normalizedFilterPatterns(include, exclude) + if err != nil { + return nil, err + } + + filtered := make([]resource.DiscoveredResource, 0, len(agents)) + for _, agent := range agents { + name := strings.TrimSuffix(agent.FlatName, ".md") + if shouldSyncFlatName(name, includePatterns, excludePatterns) { + filtered = append(filtered, agent) + } + } + + return filtered, nil +} + // ShouldSyncFlatName returns whether a single flat skill name should be managed // by the given include/exclude filters. func ShouldSyncFlatName(flatName string, include, exclude []string) (bool, error) { diff --git a/internal/sync/filter_test.go b/internal/sync/filter_test.go index 2c0eb43e..0410e27e 100644 --- a/internal/sync/filter_test.go +++ b/internal/sync/filter_test.go @@ -3,6 +3,8 @@ package sync import ( "reflect" "testing" + + "skillshare/internal/resource" ) func TestFilterSkills_IncludeOnly(t *testing.T) { @@ -84,6 +86,83 @@ func TestShouldSyncFlatName(t *testing.T) { } } +// --- FilterAgents tests --- + +func TestFilterAgents_IncludeOnly(t *testing.T) { + // FlatNames include .md; patterns should match without extension + agents := testAgents("code-reviewer.md", "tutor.md", "debugger.md") + filtered, err := FilterAgents(agents, []string{"code-*", "tutor"}, nil) + if err != nil { + t.Fatalf("FilterAgents returned error: %v", err) + } + assertAgentFlatNames(t, filtered, []string{"code-reviewer.md", "tutor.md"}) +} + +func TestFilterAgents_ExcludeOnly(t *testing.T) { + agents := testAgents("code-reviewer.md", "tutor.md", "debugger.md") + filtered, err := FilterAgents(agents, nil, []string{"tutor", "debug*"}) + if err != nil { + t.Fatalf("FilterAgents returned error: %v", err) + } + assertAgentFlatNames(t, filtered, []string{"code-reviewer.md"}) +} + +func TestFilterAgents_IncludeThenExclude(t *testing.T) { + agents := 
testAgents("team-reviewer.md", "team-debugger.md", "personal-tutor.md") + filtered, err := FilterAgents(agents, []string{"team-*"}, []string{"*-debugger"}) + if err != nil { + t.Fatalf("FilterAgents returned error: %v", err) + } + assertAgentFlatNames(t, filtered, []string{"team-reviewer.md"}) +} + +func TestFilterAgents_EmptyPatternsReturnAll(t *testing.T) { + agents := testAgents("a.md", "b.md", "c.md") + filtered, err := FilterAgents(agents, nil, nil) + if err != nil { + t.Fatalf("FilterAgents returned error: %v", err) + } + assertAgentFlatNames(t, filtered, []string{"a.md", "b.md", "c.md"}) +} + +func TestFilterAgents_NestedFlatNames(t *testing.T) { + agents := testAgents("team__reviewer.md", "team__debugger.md", "personal__tutor.md") + filtered, err := FilterAgents(agents, []string{"team__*"}, nil) + if err != nil { + t.Fatalf("FilterAgents returned error: %v", err) + } + assertAgentFlatNames(t, filtered, []string{"team__reviewer.md", "team__debugger.md"}) +} + +func TestFilterAgents_InvalidPattern(t *testing.T) { + agents := testAgents("one.md") + if _, err := FilterAgents(agents, []string{"["}, nil); err == nil { + t.Fatal("expected invalid include pattern error") + } + if _, err := FilterAgents(agents, nil, []string{"["}); err == nil { + t.Fatal("expected invalid exclude pattern error") + } +} + +func testAgents(names ...string) []resource.DiscoveredResource { + agents := make([]resource.DiscoveredResource, 0, len(names)) + for _, name := range names { + agents = append(agents, resource.DiscoveredResource{FlatName: name, Kind: "agent"}) + } + return agents +} + +func assertAgentFlatNames(t *testing.T, agents []resource.DiscoveredResource, want []string) { + t.Helper() + got := make([]string, 0, len(agents)) + for _, a := range agents { + got = append(got, a.FlatName) + } + if !reflect.DeepEqual(got, want) { + t.Fatalf("agent flat names = %v, want %v", got, want) + } +} + func testSkills(names ...string) []DiscoveredSkill { skills := make([]DiscoveredSkill, 
0, len(names)) for _, name := range names { diff --git a/internal/sync/pull.go b/internal/sync/pull.go index f658e63b..74f24963 100644 --- a/internal/sync/pull.go +++ b/internal/sync/pull.go @@ -1,6 +1,7 @@ package sync import ( + "errors" "fmt" "os" "path/filepath" @@ -9,6 +10,10 @@ import ( "skillshare/internal/utils" ) +// ErrAlreadyExists is returned when a skill or agent already exists in source +// and --force was not specified. +var ErrAlreadyExists = errors.New("already exists in source") + // LocalSkillInfo describes a local skill in a target directory type LocalSkillInfo struct { Name string @@ -125,7 +130,7 @@ func PullSkill(skill LocalSkillInfo, sourcePath string, force bool) error { // Check if skill already exists in source if _, err := os.Stat(destPath); err == nil { if !force { - return fmt.Errorf("already exists in source") + return ErrAlreadyExists } // Remove existing to overwrite if err := os.RemoveAll(destPath); err != nil { @@ -152,7 +157,7 @@ func PullSkills(skills []LocalSkillInfo, sourcePath string, opts PullOptions) (* err := PullSkill(skill, sourcePath, opts.Force) if err != nil { - if err.Error() == "already exists in source" { + if errors.Is(err, ErrAlreadyExists) { result.Skipped = append(result.Skipped, skill.Name) } else { result.Failed[skill.Name] = err diff --git a/internal/targetsummary/agents.go b/internal/targetsummary/agents.go new file mode 100644 index 00000000..a0fbe92a --- /dev/null +++ b/internal/targetsummary/agents.go @@ -0,0 +1,185 @@ +package targetsummary + +import ( + "os" + "path/filepath" + "strings" + + "skillshare/internal/config" + "skillshare/internal/resource" + ssync "skillshare/internal/sync" +) + +const defaultAgentMode = "merge" + +// AgentSummary describes the effective agent configuration and sync counts for +// a single target. ManagedCount maps to linked agents in merge/symlink mode and +// managed copied agents in copy mode. 
+type AgentSummary struct { + DisplayPath string + Path string + Mode string + Include []string + Exclude []string + ManagedCount int + ExpectedCount int +} + +// Builder caches the discovered agent source so multiple target summaries can +// share the same resolution and counting logic. +type Builder struct { + sourcePath string + projectRoot string + builtinAgents map[string]config.TargetConfig + activeAgents []resource.DiscoveredResource + sourceExists bool +} + +// NewGlobalBuilder returns a summary builder for global-mode targets. +func NewGlobalBuilder(cfg *config.Config) (*Builder, error) { + return newBuilder(cfg.EffectiveAgentsSource(), "", config.DefaultAgentTargets()) +} + +// NewProjectBuilder returns a summary builder for project-mode targets. +func NewProjectBuilder(projectRoot string) (*Builder, error) { + return newBuilder(filepath.Join(projectRoot, ".skillshare", "agents"), projectRoot, config.ProjectAgentTargets()) +} + +func newBuilder(sourcePath, projectRoot string, builtinAgents map[string]config.TargetConfig) (*Builder, error) { + builder := &Builder{ + sourcePath: sourcePath, + projectRoot: projectRoot, + builtinAgents: builtinAgents, + } + + if !dirExists(sourcePath) { + return builder, nil + } + + discovered, err := resource.AgentKind{}.Discover(sourcePath) + if err != nil { + return nil, err + } + builder.activeAgents = resource.ActiveAgents(discovered) + builder.sourceExists = true + return builder, nil +} + +// GlobalTarget returns the effective agents summary for a global target. 
+func (b *Builder) GlobalTarget(name string, tc config.TargetConfig) (*AgentSummary, error) { + ac := tc.AgentsConfig() + displayPath := ac.Path + if displayPath == "" { + if builtin, ok := b.builtinAgents[name]; ok { + displayPath = config.ExpandPath(builtin.Path) + } + } + if displayPath == "" { + return nil, nil + } + + return b.buildSummary(config.ExpandPath(displayPath), displayPath, ac.Mode, ac.Include, ac.Exclude) +} + +// ProjectTarget returns the effective agents summary for a project target. +func (b *Builder) ProjectTarget(entry config.ProjectTargetEntry) (*AgentSummary, error) { + ac := entry.AgentsConfig() + displayPath := ac.Path + if displayPath == "" { + if builtin, ok := b.builtinAgents[entry.Name]; ok { + displayPath = builtin.Path + } + } + if displayPath == "" { + return nil, nil + } + + return b.buildSummary(resolveProjectPath(b.projectRoot, displayPath), displayPath, ac.Mode, ac.Include, ac.Exclude) +} + +func (b *Builder) buildSummary(path, displayPath, mode string, include, exclude []string) (*AgentSummary, error) { + if mode == "" { + mode = defaultAgentMode + } + + summary := &AgentSummary{ + Path: path, + DisplayPath: displayPath, + Mode: mode, + Include: append([]string(nil), include...), + Exclude: append([]string(nil), exclude...), + } + + expectedAgents := b.activeAgents + if b.sourceExists && mode != "symlink" { + filtered, err := ssync.FilterAgents(expectedAgents, include, exclude) + if err != nil { + return nil, err + } + expectedAgents = filtered + } + + if b.sourceExists { + summary.ExpectedCount = len(expectedAgents) + } + summary.ManagedCount = countManagedAgents(path, mode, b.sourcePath, summary.ExpectedCount) + + return summary, nil +} + +func countManagedAgents(targetPath, mode, sourcePath string, expectedCount int) int { + switch mode { + case "copy": + _, managed, _ := ssync.CheckStatusCopy(targetPath) + return managed + case "symlink": + if ssync.CheckStatus(targetPath, sourcePath) == ssync.StatusLinked { + return 
expectedCount + } + return 0 + default: + return countHealthyAgentLinks(targetPath) + } +} + +func countHealthyAgentLinks(dir string) int { + entries, err := os.ReadDir(dir) + if err != nil { + return 0 + } + + linked := 0 + for _, entry := range entries { + if entry.IsDir() { + continue + } + if !strings.HasSuffix(strings.ToLower(entry.Name()), ".md") { + continue + } + if entry.Type()&os.ModeSymlink == 0 { + continue + } + if _, err := os.Stat(filepath.Join(dir, entry.Name())); err == nil { + linked++ + } + } + + return linked +} + +func dirExists(path string) bool { + info, err := os.Stat(path) + return err == nil && info.IsDir() +} + +func resolveProjectPath(projectRoot, path string) string { + if path == "" { + return "" + } + + resolved := config.ExpandPath(path) + if !filepath.IsAbs(resolved) { + return filepath.Join(projectRoot, filepath.FromSlash(resolved)) + } + return resolved +} diff --git a/internal/testutil/runner.go b/internal/testutil/runner.go index 518e740d..679da4ac 100644 --- a/internal/testutil/runner.go +++ b/internal/testutil/runner.go @@ -135,3 +135,42 @@ func (sb *Sandbox) RunCLIInDirWithInput(dir, input string, args ...string) *Resu func (r *Result) Output() string { return r.Stdout + r.Stderr } + +// RunCLIEnv executes the CLI with additional environment variables. +// The extra env vars are appended after the standard HOME and +// SKILLSHARE_CONFIG, so they override any pre-existing values. +func (sb *Sandbox) RunCLIEnv(extraEnv map[string]string, args ...string) *Result { + sb.T.Helper() + + cmd := exec.Command(sb.BinaryPath, args...) 
+ env := append(os.Environ(), + "HOME="+sb.Home, + "SKILLSHARE_CONFIG="+sb.ConfigPath, + ) + for k, v := range extraEnv { + env = append(env, k+"="+v) + } + cmd.Env = env + + var stdout, stderr bytes.Buffer + cmd.Stdout = &stdout + cmd.Stderr = &stderr + + err := cmd.Run() + + exitCode := 0 + if err != nil { + if exitErr, ok := err.(*exec.ExitError); ok { + exitCode = exitErr.ExitCode() + } else { + sb.T.Logf("failed to run CLI: %v", err) + exitCode = -1 + } + } + + return &Result{ + ExitCode: exitCode, + Stdout: stdout.String(), + Stderr: stderr.String(), + } +} diff --git a/internal/theme/ansi.go b/internal/theme/ansi.go new file mode 100644 index 00000000..d8cede59 --- /dev/null +++ b/internal/theme/ansi.go @@ -0,0 +1,59 @@ +package theme + +import ( + "fmt" + + "github.com/charmbracelet/lipgloss" +) + +// ANSISet contains raw terminal escape sequences keyed by semantic role. +// Returned by ANSI() — use these in fmt.Printf-style plain CLI output +// where a lipgloss.Style is too heavy. +// +// Every field is an empty string when NoColor is enabled, so code can +// concatenate them unconditionally without worrying about NO_COLOR. +type ANSISet struct { + Reset string + Primary string + Muted string + Success string + Warning string + Danger string + Info string + Accent string + Dim string + Bold string +} + +// ANSI returns the raw escape sequence set for the active theme. +// When NoColor is set, every field (including Reset) is empty. +func ANSI() ANSISet { + t := Get() + if t.NoColor { + return ANSISet{} + } + return ANSISet{ + Reset: "\033[0m", + Primary: fg256(t.palette.Primary), + Muted: fg256(t.palette.Muted), + Success: fg256(t.palette.Success), + Warning: fg256(t.palette.Warning), + Danger: fg256(t.palette.Danger), + Info: fg256(t.palette.Info), + Accent: fg256(t.palette.Accent), + Dim: "\x1b[0;2m", + Bold: "\033[1m", + } +} + +// fg256 builds a 256-color foreground escape sequence from a lipgloss.Color. 
+// lipgloss.Color is defined as `type Color string` so the underlying value is +// directly convertible; palette values are always 256-palette integer strings +// like "232", "6", etc. +func fg256(c lipgloss.Color) string { + s := string(c) + if s == "" { + return "" + } + return fmt.Sprintf("\033[38;5;%sm", s) +} diff --git a/internal/theme/contrast.go b/internal/theme/contrast.go new file mode 100644 index 00000000..d817329f --- /dev/null +++ b/internal/theme/contrast.go @@ -0,0 +1,104 @@ +package theme + +import ( + "math" + "strconv" + "strings" + + "github.com/charmbracelet/lipgloss" +) + +// contrastRatio returns the WCAG contrast ratio between fg and bg. +// Both colors may be 256-palette indices (e.g. "252") or hex strings +// (e.g. "#FFFFFF"). Returns 1.0 on parse failure, which is the minimum +// possible ratio (identical colors). +func contrastRatio(fg, bg lipgloss.Color) float64 { + fgLum, ok1 := relativeLuminance(string(fg)) + bgLum, ok2 := relativeLuminance(string(bg)) + if !ok1 || !ok2 { + return 1.0 + } + lighter, darker := fgLum, bgLum + if darker > lighter { + lighter, darker = darker, lighter + } + return (lighter + 0.05) / (darker + 0.05) +} + +// relativeLuminance computes the WCAG relative luminance (0..1) of a color +// given either as a 256-palette index string or a hex string. +func relativeLuminance(s string) (float64, bool) { + r, g, b, ok := parseColorRGB(s) + if !ok { + return 0, false + } + // WCAG formula: https://www.w3.org/WAI/GL/wiki/Relative_luminance + return 0.2126*channelLum(r) + 0.7152*channelLum(g) + 0.0722*channelLum(b), true +} + +// channelLum converts an 8-bit channel value to a WCAG linearized component. +func channelLum(c int) float64 { + cs := float64(c) / 255.0 + if cs <= 0.03928 { + return cs / 12.92 + } + return math.Pow((cs+0.055)/1.055, 2.4) +} + +// parseColorRGB converts either "#RRGGBB" or a 256-palette index like "232" +// into 8-bit RGB channels. 
+func parseColorRGB(s string) (r, g, b int, ok bool) { + if strings.HasPrefix(s, "#") && len(s) == 7 { + rv, err := strconv.ParseInt(s[1:3], 16, 32) + if err != nil { + return 0, 0, 0, false + } + gv, err := strconv.ParseInt(s[3:5], 16, 32) + if err != nil { + return 0, 0, 0, false + } + bv, err := strconv.ParseInt(s[5:7], 16, 32) + if err != nil { + return 0, 0, 0, false + } + return int(rv), int(gv), int(bv), true + } + // 256-palette index + idx, err := strconv.Atoi(s) + if err != nil || idx < 0 || idx > 255 { + return 0, 0, 0, false + } + return palette256ToRGB(idx) +} + +// palette256ToRGB maps an xterm 256-color index to approximate 8-bit RGB. +// Ranges: +// +// 0- 7: standard ANSI (hardcoded) +// 8- 15: bright ANSI (hardcoded) +// 16-231: 6x6x6 color cube +// 232-255: 24-step grayscale +func palette256ToRGB(idx int) (r, g, b int, ok bool) { + standard := [16][3]int{ + {0, 0, 0}, {128, 0, 0}, {0, 128, 0}, {128, 128, 0}, + {0, 0, 128}, {128, 0, 128}, {0, 128, 128}, {192, 192, 192}, + {128, 128, 128}, {255, 0, 0}, {0, 255, 0}, {255, 255, 0}, + {0, 0, 255}, {255, 0, 255}, {0, 255, 255}, {255, 255, 255}, + } + if idx < 16 { + c := standard[idx] + return c[0], c[1], c[2], true + } + if idx < 232 { + // 6x6x6 cube: colors at 0, 95, 135, 175, 215, 255 + levels := [6]int{0, 95, 135, 175, 215, 255} + n := idx - 16 + r = levels[(n/36)%6] + g = levels[(n/6)%6] + b = levels[n%6] + return r, g, b, true + } + // Grayscale: 232..255 → 8, 18, 28, ..., 238 + v := 8 + (idx-232)*10 + return v, v, v, true +} diff --git a/internal/theme/detect.go b/internal/theme/detect.go new file mode 100644 index 00000000..399b51e1 --- /dev/null +++ b/internal/theme/detect.go @@ -0,0 +1,121 @@ +package theme + +import ( + "os" + "runtime" + "time" + + "github.com/charmbracelet/lipgloss" + "golang.org/x/term" +) + +// probeTimeout is the maximum time resolve() waits for an OSC 11 background +// color query response. 
Chosen empirically: long enough for tmux passthrough
+// and slow SSH links, short enough to not stall CLI startup.
+const probeTimeout = 300 * time.Millisecond
+
+// resolve decides the theme at process start. It never returns nil.
+//
+// Precedence (highest first):
+//
+//	1. NO_COLOR set to a non-empty value → NoColor=true, palette arbitrary.
+//	2. SKILLSHARE_THEME=light → light palette, Source="env".
+//	3. SKILLSHARE_THEME=dark → dark palette, Source="env".
+//	4. SKILLSHARE_THEME=auto / unset / unknown → probe terminal.
+//	5. Probe skipped or failed → dark palette fallback.
+func resolve() *Theme {
+	// 1. NO_COLOR — https://no-color.org/ (only when non-empty per spec)
+	if v, ok := os.LookupEnv("NO_COLOR"); ok && v != "" {
+		return &Theme{
+			Mode:    ModeDark,
+			Source:  "no-color",
+			NoColor: true,
+			palette: darkPalette,
+		}
+	}
+
+	// 2-3. Explicit SKILLSHARE_THEME override
+	switch os.Getenv("SKILLSHARE_THEME") {
+	case "light":
+		return &Theme{Mode: ModeLight, Source: "env", palette: lightPalette}
+	case "dark":
+		return &Theme{Mode: ModeDark, Source: "env", palette: darkPalette}
+	}
+	// "", "auto", or unknown values fall through to detection.
+
+	// 4. Auto detection gate
+	if !canDetect() {
+		return &Theme{
+			Mode:    ModeDark,
+			Source:  "fallback-dark-no-tty",
+			palette: darkPalette,
+		}
+	}
+
+	// 5. OSC 11 probe with timeout
+	mode, ok := probeBackgroundWithTimeout(probeTimeout)
+	if !ok {
+		return &Theme{
+			Mode:    ModeDark,
+			Source:  "fallback-dark-probe-failed",
+			palette: darkPalette,
+		}
+	}
+	if mode == ModeLight {
+		return &Theme{Mode: ModeLight, Source: "detected", palette: lightPalette}
+	}
+	return &Theme{Mode: ModeDark, Source: "detected", palette: darkPalette}
+}
+
+// canDetect reports whether it is safe to send an OSC 11 query. Guards
+// against:
+//   - Non-TTY stdin/stdout (piped, redirected, CI)
+//   - CI env var set
+//   - TERM=dumb or empty
+//
+// On Windows returns true unconditionally because lipgloss uses WinAPI
+// instead of OSC 11 there.
+func canDetect() bool { + if runtime.GOOS == "windows" { + return true + } + if !term.IsTerminal(int(os.Stdin.Fd())) { + return false + } + if !term.IsTerminal(int(os.Stdout.Fd())) { + return false + } + if os.Getenv("CI") != "" { + return false + } + t := os.Getenv("TERM") + if t == "" || t == "dumb" { + return false + } + return true +} + +// probeBackgroundWithTimeout calls lipgloss.HasDarkBackground in a goroutine +// and returns ("dark"|"light", true) on success or ("", false) on timeout. +// +// KNOWN LEAK: When the probe times out, the goroutine remains blocked +// reading from stdin until the process exits. This is acceptable because +// the leak occurs at most once per process and holds only a goroutine +// (no file handles or locks). The alternative — a custom OSC 11 probe +// with SetReadDeadline — would add ~50 lines of terminal protocol code +// without a meaningful benefit. +func probeBackgroundWithTimeout(timeout time.Duration) (Mode, bool) { + ch := make(chan bool, 1) + go func() { + ch <- lipgloss.HasDarkBackground() + }() + select { + case isDark := <-ch: + if isDark { + return ModeDark, true + } + return ModeLight, true + case <-time.After(timeout): + return "", false + } +} diff --git a/internal/theme/palette.go b/internal/theme/palette.go new file mode 100644 index 00000000..1ee80e9f --- /dev/null +++ b/internal/theme/palette.go @@ -0,0 +1,34 @@ +package theme + +import "github.com/charmbracelet/lipgloss" + +// palette holds every color token used across skillshare's TUIs and CLI. +// Two instances exist: darkPalette and lightPalette. The active one is +// stored on Theme and accessed via exported style constructors. 
+type palette struct { + // Text hierarchy + Primary lipgloss.Color // main text (skill names, headings) + Muted lipgloss.Color // secondary text (paths, metadata) + Accent lipgloss.Color // interactive highlights (filter match, cursor) + + // Selected list row + Selected lipgloss.Color + SelectedBg lipgloss.Color + + // Status colors + Success lipgloss.Color + Warning lipgloss.Color + Danger lipgloss.Color + Info lipgloss.Color + + // Badge + BadgeFg lipgloss.Color + BadgeBg lipgloss.Color + + // Severity family (audit results) + SeverityCritical lipgloss.Color + SeverityHigh lipgloss.Color + SeverityMedium lipgloss.Color + SeverityLow lipgloss.Color + SeverityInfo lipgloss.Color +} diff --git a/internal/theme/palette_dark.go b/internal/theme/palette_dark.go new file mode 100644 index 00000000..7d73fd90 --- /dev/null +++ b/internal/theme/palette_dark.go @@ -0,0 +1,35 @@ +package theme + +import "github.com/charmbracelet/lipgloss" + +// darkPalette is tuned for dark terminal backgrounds. +// +// Notable choices vs the previous tc struct: +// - Primary is 252 (near-white) instead of 15 (bright white). Bright +// white disappears on light terminals in fallback scenarios and the +// visual difference on dark terminals is negligible. +// - BadgeFg is 252 (was 250) to match Primary. +// - Selected is 255 (was 15) — brightest gray that still renders on +// light terminals if fallback picks the wrong palette. 
+var darkPalette = palette{ + Primary: lipgloss.Color("252"), + Muted: lipgloss.Color("245"), + Accent: lipgloss.Color("6"), + + Selected: lipgloss.Color("255"), + SelectedBg: lipgloss.Color("237"), + + Success: lipgloss.Color("2"), + Warning: lipgloss.Color("3"), + Danger: lipgloss.Color("1"), + Info: lipgloss.Color("6"), + + BadgeFg: lipgloss.Color("252"), + BadgeBg: lipgloss.Color("237"), + + SeverityCritical: lipgloss.Color("1"), + SeverityHigh: lipgloss.Color("208"), + SeverityMedium: lipgloss.Color("3"), + SeverityLow: lipgloss.Color("12"), + SeverityInfo: lipgloss.Color("244"), +} diff --git a/internal/theme/palette_light.go b/internal/theme/palette_light.go new file mode 100644 index 00000000..c91536aa --- /dev/null +++ b/internal/theme/palette_light.go @@ -0,0 +1,35 @@ +package theme + +import "github.com/charmbracelet/lipgloss" + +// lightPalette is tuned for light terminal backgrounds. +// +// Key differences vs darkPalette: +// - All foreground colors use dark shades (near black 232, dark blue 25, +// etc.) to meet WCAG AA contrast against white backgrounds. +// - Warning uses 130 (dark gold) instead of 3 (yellow) because yellow +// on white has < 2:1 contrast ratio (unreadable). +// - SeverityLow uses 25 (dark blue) instead of 12 (bright blue) for +// the same reason. 
+var lightPalette = palette{ + Primary: lipgloss.Color("232"), + Muted: lipgloss.Color("240"), + Accent: lipgloss.Color("25"), + + Selected: lipgloss.Color("232"), + SelectedBg: lipgloss.Color("252"), + + Success: lipgloss.Color("28"), + Warning: lipgloss.Color("130"), + Danger: lipgloss.Color("124"), + Info: lipgloss.Color("25"), + + BadgeFg: lipgloss.Color("232"), + BadgeBg: lipgloss.Color("252"), + + SeverityCritical: lipgloss.Color("160"), + SeverityHigh: lipgloss.Color("208"), + SeverityMedium: lipgloss.Color("130"), + SeverityLow: lipgloss.Color("25"), + SeverityInfo: lipgloss.Color("240"), +} diff --git a/internal/theme/styles.go b/internal/theme/styles.go new file mode 100644 index 00000000..91235340 --- /dev/null +++ b/internal/theme/styles.go @@ -0,0 +1,139 @@ +package theme + +import ( + "strings" + + "github.com/charmbracelet/lipgloss" +) + +// Primary returns the style for main text such as skill names and headings. +func Primary() lipgloss.Style { return styleFg(Get().palette.Primary) } + +// Muted returns the style for secondary text such as paths or metadata. +func Muted() lipgloss.Style { return styleFg(Get().palette.Muted) } + +// Dim returns a terminal-agnostic "faint" style using the SGR dim flag. +// Unlike Muted, this does not use a specific foreground color, so it +// renders sensibly on any background without consulting the palette. +func Dim() lipgloss.Style { + if Get().NoColor { + return lipgloss.NewStyle() + } + return lipgloss.NewStyle().Faint(true) +} + +// Accent returns the style for interactive highlights (filter match, cursor). +func Accent() lipgloss.Style { return styleFg(Get().palette.Accent) } + +// Success returns the style for success messages. +func Success() lipgloss.Style { return styleFg(Get().palette.Success) } + +// Warning returns the style for warning messages. +func Warning() lipgloss.Style { return styleFg(Get().palette.Warning) } + +// Danger returns the style for error/danger messages. 
+func Danger() lipgloss.Style { return styleFg(Get().palette.Danger) } + +// Info returns the style for info messages. +func Info() lipgloss.Style { return styleFg(Get().palette.Info) } + +// Title returns a bold variant of the Accent style, for section headings. +func Title() lipgloss.Style { + t := Get() + if t.NoColor { + return lipgloss.NewStyle().Bold(true) + } + return lipgloss.NewStyle().Bold(true).Foreground(t.palette.Accent) +} + +// SelectedRow returns the combined fg+bg+bold style for a selected list row. +func SelectedRow() lipgloss.Style { + t := Get() + if t.NoColor { + return lipgloss.NewStyle().Bold(true) + } + return lipgloss.NewStyle(). + Foreground(t.palette.Selected). + Background(t.palette.SelectedBg). + Bold(true) +} + +// Badge returns the combined fg+bg style for a badge with built-in padding. +func Badge() lipgloss.Style { + t := Get() + if t.NoColor { + return lipgloss.NewStyle().Padding(0, 1) + } + return lipgloss.NewStyle(). + Foreground(t.palette.BadgeFg). + Background(t.palette.BadgeBg). + Padding(0, 1) +} + +// Severity returns the style for an audit severity level. Accepted levels +// (case-insensitive): "critical", "high", "medium", "low", "info". Unknown +// values return the Dim style. +func Severity(level string) lipgloss.Style { + t := Get() + var c lipgloss.Color + switch strings.ToUpper(level) { + case "CRITICAL": + c = t.palette.SeverityCritical + if t.NoColor { + return lipgloss.NewStyle().Bold(true) + } + return lipgloss.NewStyle().Foreground(c).Bold(true) + case "HIGH": + c = t.palette.SeverityHigh + case "MEDIUM": + c = t.palette.SeverityMedium + case "LOW": + c = t.palette.SeverityLow + case "INFO": + c = t.palette.SeverityInfo + default: + return Dim() + } + return styleFg(c) +} + +// SeverityStyle is an alias for Severity, kept for migration convenience +// from the legacy tcSevStyle function. 
+func SeverityStyle(level string) lipgloss.Style { return Severity(level) } + +// RiskLabelStyle returns the style for a risk label: clean|low|medium|high|critical. +// Used by formatRiskBadgeLipgloss in the legacy tui_colors.go. +func RiskLabelStyle(label string) lipgloss.Style { + switch strings.ToLower(label) { + case "clean": + return Success() + case "low": + return Severity("low") + case "medium": + return Severity("medium") + case "high": + return Severity("high") + case "critical": + return Severity("critical") + default: + return Dim() + } +} + +// FormatRiskBadge returns a colored risk badge string like " [high]". +// Migrated from formatRiskBadgeLipgloss in legacy tui_colors.go. +func FormatRiskBadge(label string) string { + if label == "" { + return "" + } + return " " + RiskLabelStyle(label).Render("["+label+"]") +} + +// styleFg is an internal helper that honors NoColor and returns a style +// with only a foreground color set. +func styleFg(c lipgloss.Color) lipgloss.Style { + if Get().NoColor { + return lipgloss.NewStyle() + } + return lipgloss.NewStyle().Foreground(c) +} diff --git a/internal/theme/theme.go b/internal/theme/theme.go new file mode 100644 index 00000000..ef621391 --- /dev/null +++ b/internal/theme/theme.go @@ -0,0 +1,66 @@ +// Package theme provides a unified light/dark terminal theme system. +// +// The theme is resolved once at first access via theme.Get() and cached +// for the remainder of the process. Resolution precedence: +// +// 1. NO_COLOR env var set → disable all colors +// 2. SKILLSHARE_THEME=light|dark → explicit override +// 3. SKILLSHARE_THEME=auto (or unset) → probe terminal background +// 4. 
Fallback to dark palette
+//
+// Usage from TUI code:
+//
+//	style := theme.Primary()
+//	rendered := style.Render("text")
+//
+// Usage from plain CLI output:
+//
+//	ansi := theme.ANSI()
+//	fmt.Printf("%s%s%s\n", ansi.Success, "done", ansi.Reset)
+package theme
+
+import "sync"
+
+// Mode represents the user's resolved theme preference.
+type Mode string
+
+const (
+	// ModeAuto is the default before resolution; never appears on a resolved Theme.
+	ModeAuto Mode = "auto"
+	// ModeLight means the palette is tuned for light terminal backgrounds.
+	ModeLight Mode = "light"
+	// ModeDark means the palette is tuned for dark terminal backgrounds.
+	ModeDark Mode = "dark"
+)
+
+// Theme holds the resolved, immutable theme state for the current process.
+type Theme struct {
+	// Mode is the resolved palette mode (never ModeAuto after Get()).
+	Mode Mode
+	// Source describes how Mode was decided. See resolve() for valid values.
+	Source string
+	// NoColor is true when the NO_COLOR env var was set to a non-empty
+	// value; all style/ansi constructors must return plain text when
+	// this is true.
+	NoColor bool
+
+	palette palette
+}
+
+var (
+	current *Theme
+	once    sync.Once
+)
+
+// Get returns the process-wide theme singleton, resolving it on first call.
+// Safe for concurrent use.
+func Get() *Theme {
+	once.Do(func() { current = resolve() })
+	return current
+}
+
+// Reset clears the cached theme so the next Get() re-resolves. Intended
+// for tests that need to change env vars and observe a new resolution.
+func Reset() {
+	once = sync.Once{}
+	current = nil
+}
diff --git a/internal/theme/theme_test.go b/internal/theme/theme_test.go
new file mode 100644
index 00000000..c94bb188
--- /dev/null
+++ b/internal/theme/theme_test.go
@@ -0,0 +1,208 @@
+package theme
+
+import (
+	"strings"
+	"testing"
+
+	"github.com/charmbracelet/lipgloss"
+)
+
+// Each test calls Reset() before reading Get() so that env changes made
+// with t.Setenv are re-resolved rather than hidden by the sync.Once cache.
+ +func TestResolve_NoColorBeatsEverything(t *testing.T) { + t.Setenv("NO_COLOR", "1") + t.Setenv("SKILLSHARE_THEME", "light") + Reset() + + tm := Get() + if !tm.NoColor { + t.Error("NO_COLOR should force NoColor=true") + } + if tm.Source != "no-color" { + t.Errorf("Source = %q, want %q", tm.Source, "no-color") + } +} + +func TestResolve_ExplicitLight(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "light") + Reset() + + tm := Get() + if tm.Mode != ModeLight { + t.Errorf("Mode = %q, want %q", tm.Mode, ModeLight) + } + if tm.Source != "env" { + t.Errorf("Source = %q, want %q", tm.Source, "env") + } + if tm.NoColor { + t.Error("NoColor should be false") + } +} + +func TestResolve_ExplicitDark(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "dark") + Reset() + + tm := Get() + if tm.Mode != ModeDark { + t.Errorf("Mode = %q, want %q", tm.Mode, ModeDark) + } + if tm.Source != "env" { + t.Errorf("Source = %q", tm.Source) + } +} + +func TestResolve_InvalidValueFallsBack(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "rainbow") + Reset() + + tm := Get() + if tm.Mode != ModeDark { + t.Errorf("Mode = %q, want dark fallback", tm.Mode) + } + if !strings.HasPrefix(tm.Source, "fallback-dark") { + t.Errorf("Source = %q, want fallback-dark-*", tm.Source) + } +} + +func TestResolve_NonTTYFallback(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "") + Reset() + + tm := Get() + if tm.Mode != ModeDark { + t.Errorf("Mode = %q", tm.Mode) + } + if tm.Source != "fallback-dark-no-tty" { + t.Errorf("Source = %q", tm.Source) + } +} + +func TestResolve_CIEnvSkipsDetection(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "") + t.Setenv("CI", "true") + Reset() + + tm := Get() + if tm.Mode != ModeDark { + t.Errorf("Mode = %q", tm.Mode) + } + if !strings.HasPrefix(tm.Source, "fallback-dark") { + t.Errorf("Source = %q, want fallback-dark-*", tm.Source) + } +} + +func 
TestCanDetect_Windows(t *testing.T) { + _ = canDetect() +} + +// --- helpers used by later tests --- + +func mustBeColor(t *testing.T, c lipgloss.Color, name string) { + t.Helper() + if string(c) == "" { + t.Errorf("palette field %s is empty", name) + } +} + +func TestPaletteCompleteness(t *testing.T) { + cases := map[string]palette{ + "dark": darkPalette, + "light": lightPalette, + } + for name, p := range cases { + t.Run(name, func(t *testing.T) { + mustBeColor(t, p.Primary, "Primary") + mustBeColor(t, p.Muted, "Muted") + mustBeColor(t, p.Accent, "Accent") + mustBeColor(t, p.Selected, "Selected") + mustBeColor(t, p.SelectedBg, "SelectedBg") + mustBeColor(t, p.Success, "Success") + mustBeColor(t, p.Warning, "Warning") + mustBeColor(t, p.Danger, "Danger") + mustBeColor(t, p.Info, "Info") + mustBeColor(t, p.BadgeFg, "BadgeFg") + mustBeColor(t, p.BadgeBg, "BadgeBg") + mustBeColor(t, p.SeverityCritical, "SeverityCritical") + mustBeColor(t, p.SeverityHigh, "SeverityHigh") + mustBeColor(t, p.SeverityMedium, "SeverityMedium") + mustBeColor(t, p.SeverityLow, "SeverityLow") + mustBeColor(t, p.SeverityInfo, "SeverityInfo") + }) + } +} + +func TestNoColorStripsStyles(t *testing.T) { + t.Setenv("NO_COLOR", "1") + Reset() + + rendered := Primary().Render("hello") + if rendered != "hello" { + t.Errorf("Primary().Render with NO_COLOR = %q, want %q", rendered, "hello") + } + + a := ANSI() + if a.Reset != "" || a.Primary != "" || a.Success != "" { + t.Errorf("ANSI() with NO_COLOR must return empty strings, got %+v", a) + } +} + +func TestSeverityUnknownFallsBackToDim(t *testing.T) { + t.Setenv("NO_COLOR", "") + t.Setenv("SKILLSHARE_THEME", "dark") + Reset() + + unknown := Severity("unknown").Render("x") + dim := Dim().Render("x") + if unknown != dim { + t.Errorf("Severity(\"unknown\") = %q, Dim() = %q; should match", unknown, dim) + } +} + +func TestContrastRatios(t *testing.T) { + cases := []struct { + name string + fg lipgloss.Color + bg lipgloss.Color + minRatio float64 + }{ + 
// Dark palette on pure black + {"dark/Primary on black", darkPalette.Primary, lipgloss.Color("#000000"), 10.0}, + {"dark/Muted on black", darkPalette.Muted, lipgloss.Color("#000000"), 4.5}, + {"dark/Warning on black", darkPalette.Warning, lipgloss.Color("#000000"), 4.5}, + {"dark/SeverityMedium on black", darkPalette.SeverityMedium, lipgloss.Color("#000000"), 4.5}, + + // Light palette on pure white + {"light/Primary on white", lightPalette.Primary, lipgloss.Color("#FFFFFF"), 10.0}, + {"light/Muted on white", lightPalette.Muted, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/Warning on white", lightPalette.Warning, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/Danger on white", lightPalette.Danger, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/Success on white", lightPalette.Success, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/SeverityCritical on white", lightPalette.SeverityCritical, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/SeverityMedium on white", lightPalette.SeverityMedium, lipgloss.Color("#FFFFFF"), 4.5}, + {"light/SeverityLow on white", lightPalette.SeverityLow, lipgloss.Color("#FFFFFF"), 4.5}, + } + + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + ratio := contrastRatio(tc.fg, tc.bg) + if ratio < tc.minRatio { + t.Errorf("contrast ratio %.2f < %.2f", ratio, tc.minRatio) + } + }) + } +} + +// Regression guard: Primary must NOT be pure white (15) on dark palette. +// This was the root cause of issue #125. +func TestDarkPrimaryIsNotPureWhite(t *testing.T) { + if darkPalette.Primary == lipgloss.Color("15") { + t.Error("darkPalette.Primary reverted to pure white (15) — this breaks light terminals") + } +} diff --git a/internal/trash/trash.go b/internal/trash/trash.go index 11d31463..ce9a5d47 100644 --- a/internal/trash/trash.go +++ b/internal/trash/trash.go @@ -14,6 +14,8 @@ import ( const defaultMaxAge = 7 * 24 * time.Hour // 7 days +const reservedAgentTrashDir = "agents" + // TrashDir returns the global trash directory path. 
func TrashDir() string { return filepath.Join(config.DataDir(), "trash") @@ -24,6 +26,54 @@ func ProjectTrashDir(root string) string { return filepath.Join(root, ".skillshare", "trash") } +// AgentTrashDir returns the global trash directory for agents. +func AgentTrashDir() string { + return filepath.Join(config.DataDir(), "trash", "agents") +} + +// ProjectAgentTrashDir returns the project-level trash directory for agents. +func ProjectAgentTrashDir(root string) string { + return filepath.Join(root, ".skillshare", "trash", "agents") +} + +// MoveAgentToTrash moves an agent file (and its metadata) to the trash directory. +func MoveAgentToTrash(agentFile, metaFile, name, trashBase string) (string, error) { + timestamp := time.Now().Format("2006-01-02_15-04-05") + trashDir := filepath.Join(trashBase, name+"_"+timestamp) + + if err := os.MkdirAll(trashDir, 0755); err != nil { + return "", fmt.Errorf("failed to create agent trash dir: %w", err) + } + + // Move agent .md file + destFile := filepath.Join(trashDir, filepath.Base(agentFile)) + if err := os.Rename(agentFile, destFile); err != nil { + // Fallback: copy then delete + data, readErr := os.ReadFile(agentFile) + if readErr != nil { + return "", fmt.Errorf("failed to read agent for trash: %w", readErr) + } + if writeErr := os.WriteFile(destFile, data, 0644); writeErr != nil { + return "", fmt.Errorf("failed to write agent to trash: %w", writeErr) + } + os.Remove(agentFile) + } + + // Move metadata if exists + if metaFile != "" { + if _, err := os.Stat(metaFile); err == nil { + destMeta := filepath.Join(trashDir, filepath.Base(metaFile)) + if err := os.Rename(metaFile, destMeta); err != nil { + data, _ := os.ReadFile(metaFile) + os.WriteFile(destMeta, data, 0644) + os.Remove(metaFile) + } + } + } + + return trashDir, nil +} + // TrashEntry holds information about a trashed item. 
type TrashEntry struct { Name string // Original skill name @@ -31,6 +81,7 @@ type TrashEntry struct { Path string // Full path to trashed directory Date time.Time // Parsed or stat-based date Size int64 // Total size in bytes + Kind string // "skill" or "agent" — set by caller } // MoveToTrash moves a skill directory to the trash. @@ -65,9 +116,13 @@ func MoveToTrash(srcPath, name, trashBase string) (string, error) { // Walks recursively to find nested entries (e.g., "org/_team-skills_"). func List(trashBase string) []TrashEntry { var items []TrashEntry + base := filepath.Clean(trashBase) filepath.WalkDir(trashBase, func(path string, d fs.DirEntry, err error) error { - if err != nil || !d.IsDir() || path == trashBase { + if shouldSkipReservedAgentTrashSubtree(base, path, d) { + return fs.SkipDir + } + if err != nil || !d.IsDir() || path == base { return nil } @@ -108,6 +163,19 @@ func List(trashBase string) []TrashEntry { return items } +// shouldSkipReservedAgentTrashSubtree prevents the skills trash root from +// recursively listing agent trash entries under the reserved "trash/agents" path. +// Agent trash is listed separately from AgentTrashDir()/ProjectAgentTrashDir(). +func shouldSkipReservedAgentTrashSubtree(base, path string, d fs.DirEntry) bool { + if d == nil || !d.IsDir() { + return false + } + if filepath.Base(base) != "trash" { + return false + } + return filepath.Clean(path) == filepath.Join(base, reservedAgentTrashDir) +} + // Cleanup removes trashed items older than maxAge. // Returns the number of items removed. func Cleanup(trashBase string, maxAge time.Duration) (int, error) { @@ -199,6 +267,56 @@ func Restore(entry *TrashEntry, destDir string) error { return nil } +// RestoreAgent restores agent files from a trashed directory back to the agent source. +// Unlike Restore (which moves the whole directory), this copies individual files +// from the trashed directory to destDir (since agents are file-based, not directory-based). 
+func RestoreAgent(entry *TrashEntry, destDir string) error { + // Reconstruct subdirectory for nested agents (e.g., "demo/my-agent" → destDir/demo/) + targetDir := destDir + if subDir := filepath.Dir(entry.Name); subDir != "." { + targetDir = filepath.Join(destDir, subDir) + } + if err := os.MkdirAll(targetDir, 0755); err != nil { + return fmt.Errorf("failed to create agent destination: %w", err) + } + + // Read files from the trashed directory + entries, err := os.ReadDir(entry.Path) + if err != nil { + return fmt.Errorf("failed to read trashed agent: %w", err) + } + + for _, e := range entries { + if e.IsDir() { + continue + } + srcPath := filepath.Join(entry.Path, e.Name()) + destPath := filepath.Join(targetDir, e.Name()) + + if _, statErr := os.Stat(destPath); statErr == nil { + return fmt.Errorf("'%s' already exists in %s", e.Name(), targetDir) + } + + // Try rename, fallback to copy + if renameErr := os.Rename(srcPath, destPath); renameErr != nil { + data, readErr := os.ReadFile(srcPath) + if readErr != nil { + return fmt.Errorf("failed to read %s: %w", e.Name(), readErr) + } + if writeErr := os.WriteFile(destPath, data, 0644); writeErr != nil { + return fmt.Errorf("failed to write %s: %w", e.Name(), writeErr) + } + } + } + + // Remove the trashed directory + if removeErr := os.RemoveAll(entry.Path); removeErr != nil { + return fmt.Errorf("restored but failed to remove trash entry: %w", removeErr) + } + + return nil +} + // parseTrashName splits "skillname_YYYY-MM-DD_HH-MM-SS" into name and timestamp. 
func parseTrashName(dirName string) (string, string) { // Timestamp format: YYYY-MM-DD_HH-MM-SS (19 chars) diff --git a/internal/trash/trash_test.go b/internal/trash/trash_test.go index f6512434..1da45f1b 100644 --- a/internal/trash/trash_test.go +++ b/internal/trash/trash_test.go @@ -260,6 +260,22 @@ func TestList_NestedEntries(t *testing.T) { } } +func TestList_SkipsReservedAgentTrashSubtree(t *testing.T) { + tmpDir := t.TempDir() + trashBase := filepath.Join(tmpDir, "trash") + + os.MkdirAll(filepath.Join(trashBase, "skill-a_2026-01-01_10-00-00"), 0755) + os.MkdirAll(filepath.Join(trashBase, "agents", "demo", "code-archaeologist_2026-01-02_10-00-00"), 0755) + + items := List(trashBase) + if len(items) != 1 { + t.Fatalf("expected only skill trash items, got %d", len(items)) + } + if items[0].Name != "skill-a" { + t.Fatalf("expected only skill-a, got %q", items[0].Name) + } +} + func TestFindByName_NestedName(t *testing.T) { tmpDir := t.TempDir() trashBase := filepath.Join(tmpDir, "trash") @@ -275,6 +291,17 @@ func TestFindByName_NestedName(t *testing.T) { } } +func TestFindByName_SkipsReservedAgentTrashSubtree(t *testing.T) { + tmpDir := t.TempDir() + trashBase := filepath.Join(tmpDir, "trash") + + os.MkdirAll(filepath.Join(trashBase, "agents", "demo", "code-archaeologist_2026-01-01_10-00-00"), 0755) + + if entry := FindByName(trashBase, "agents/demo/code-archaeologist"); entry != nil { + t.Fatalf("expected reserved agent subtree to be ignored, got %q", entry.Name) + } +} + func TestRestore_NestedName(t *testing.T) { tmpDir := t.TempDir() trashBase := filepath.Join(tmpDir, "trash") diff --git a/internal/ui/pterm.go b/internal/ui/pterm.go index a489a290..ebf6c52d 100644 --- a/internal/ui/pterm.go +++ b/internal/ui/pterm.go @@ -587,6 +587,46 @@ type SyncStats struct { Duration time.Duration } +// AgentSyncStats holds statistics for agent sync summary. 
+type AgentSyncStats struct { + Targets int + Linked int + Local int + Updated int + Pruned int + Duration time.Duration +} + +// AgentSyncSummary prints an agent sync summary line. +func AgentSyncSummary(stats AgentSyncStats) { + OperationSummary("Agent sync", stats.Duration, + Metric{Label: "targets", Count: stats.Targets, HighlightColor: pterm.Cyan}, + Metric{Label: "linked", Count: stats.Linked, HighlightColor: pterm.Green}, + Metric{Label: "local", Count: stats.Local, HighlightColor: pterm.Blue}, + Metric{Label: "updated", Count: stats.Updated, HighlightColor: pterm.Yellow}, + Metric{Label: "pruned", Count: stats.Pruned, HighlightColor: pterm.Yellow}, + ) +} + +// ExtrasSyncStats holds statistics for extras sync summary. +type ExtrasSyncStats struct { + Targets int + Synced int + Skipped int + Pruned int + Duration time.Duration +} + +// ExtrasSyncSummary prints an extras sync summary line. +func ExtrasSyncSummary(stats ExtrasSyncStats) { + OperationSummary("Extras sync", stats.Duration, + Metric{Label: "targets", Count: stats.Targets, HighlightColor: pterm.Cyan}, + Metric{Label: "synced", Count: stats.Synced, HighlightColor: pterm.Green}, + Metric{Label: "skipped", Count: stats.Skipped, HighlightColor: pterm.Yellow}, + Metric{Label: "pruned", Count: stats.Pruned, HighlightColor: pterm.Yellow}, + ) +} + // UpdateSummary prints an update summary line matching SyncSummary style. func UpdateSummary(stats UpdateStats) { OperationSummary("Update", stats.Duration, diff --git a/internal/ui/ui.go b/internal/ui/ui.go index e627483f..4f02f7b7 100644 --- a/internal/ui/ui.go +++ b/internal/ui/ui.go @@ -5,9 +5,13 @@ import ( "os" "strings" "time" + + "skillshare/internal/theme" ) -// Base colors for terminal output +// Deprecated: Use theme.ANSI() instead. These constants are retained +// for backward compatibility with any third-party code that imports the +// ui package directly. New code should use theme.ANSI(). 
const ( Reset = "\033[0m" Red = "\033[31m" @@ -97,7 +101,7 @@ func SeverityColorID(severity string) string { // Colorize wraps text with a color code and reset. Returns plain text if // color is empty or stdout is not a TTY. func Colorize(color, text string) string { - if color == "" || !IsTTY() { + if color == "" || !IsTTY() || theme.Get().NoColor { return text } return color + text + Reset @@ -105,60 +109,68 @@ func Colorize(color, text string) string { // Success prints a success message func Success(format string, args ...interface{}) { - fmt.Printf(Green+"✓ "+Reset+format+"\n", args...) + a := theme.ANSI() + fmt.Printf(a.Success+"✓ "+a.Reset+format+"\n", args...) } // Error prints an error message func Error(format string, args ...interface{}) { - fmt.Printf(Red+"✗ "+Reset+format+"\n", args...) + a := theme.ANSI() + fmt.Printf(a.Danger+"✗ "+a.Reset+format+"\n", args...) } // Warning prints a warning message func Warning(format string, args ...interface{}) { - fmt.Printf(Yellow+"! "+Reset+format+"\n", args...) + a := theme.ANSI() + fmt.Printf(a.Warning+"! "+a.Reset+format+"\n", args...) } // Info prints an info message func Info(format string, args ...interface{}) { - fmt.Printf(Cyan+"→ "+Reset+format+"\n", args...) + a := theme.ANSI() + fmt.Printf(a.Info+"→ "+a.Reset+format+"\n", args...) 
} // Status prints a status line func Status(name, status, detail string) { - statusColor := Gray + a := theme.ANSI() + statusColor := a.Muted switch status { case "linked": - statusColor = Green + statusColor = a.Success case "not exist": - statusColor = Yellow + statusColor = a.Warning case "has files": - statusColor = Blue + statusColor = a.Info case "conflict", "broken": - statusColor = Red + statusColor = a.Danger } - fmt.Printf(" %-12s %s%-12s%s %s\n", name, statusColor, status, Reset, Dim+detail+Reset) + fmt.Printf(" %-12s %s%-12s%s %s\n", name, statusColor, status, a.Reset, a.Dim+detail+a.Reset) } // Header prints a section header func Header(text string) { - fmt.Printf("\n%s%s%s\n", Cyan, text, Reset) - fmt.Println(Dim + "─────────────────────────────────────────" + Reset) + a := theme.ANSI() + fmt.Printf("\n%s%s%s\n", a.Info, text, a.Reset) + fmt.Println(a.Dim + "─────────────────────────────────────────" + a.Reset) } // Checkbox returns a formatted checkbox string func Checkbox(checked bool) string { + a := theme.ANSI() if checked { - return Green + "[x]" + Reset + return a.Success + "[x]" + a.Reset } return "[ ]" } // CheckboxItem prints a checkbox item with name and description func CheckboxItem(checked bool, name, description string) { + a := theme.ANSI() checkbox := Checkbox(checked) if description != "" { - fmt.Printf(" %s %-12s %s%s%s\n", checkbox, name, Dim, description, Reset) + fmt.Printf(" %s %-12s %s%s%s\n", checkbox, name, a.Dim, description, a.Reset) } else { fmt.Printf(" %s %s\n", checkbox, name) } @@ -169,31 +181,32 @@ func CheckboxItem(checked bool, name, description string) { // kind: "new"/"restore" (+ green), "modified" (~ cyan), "override" (!
yellow), // "orphan" (- red), "local" (← gray), "warn" (⚠ yellow) func ActionLine(kind, text string) { + a := theme.ANSI() var icon, color string switch kind { case "new", "restore": - icon, color = "+", Green + icon, color = "+", a.Success case "modified": - icon, color = "~", Cyan + icon, color = "~", a.Info case "override": - icon, color = "!", Yellow + icon, color = "!", a.Warning case "orphan": - icon, color = "-", Red + icon, color = "-", a.Danger case "local": - icon, color = "←", Dim + icon, color = "←", a.Dim // Legacy kinds for backward compatibility case "sync": - icon, color = "→", Cyan + icon, color = "→", a.Info case "force": - icon, color = "⚠", Yellow + icon, color = "⚠", a.Warning case "collect": - icon, color = "←", Dim + icon, color = "←", a.Dim case "warn": - icon, color = "⚠", Yellow + icon, color = "⚠", a.Warning default: - icon, color = " ", Reset + icon, color = " ", a.Reset } - fmt.Printf(" %s%s%s %s\n", color, icon, Reset, text) + fmt.Printf(" %s%s%s %s\n", color, icon, a.Reset, text) } // isTTY checks if stdout is a terminal (for animation support) @@ -224,22 +237,23 @@ func Logo(version string) { // LogoAnimated prints the ASCII art logo with optional animation func LogoAnimated(version string, animate bool) { + a := theme.ANSI() lines := []string{ - Primary + ` _ _ _ _ _` + Reset, - Primary + ` ___| | _(_) | |___| |__ __ _ _ __ ___` + Reset, - Primary + `/ __| |/ / | | / __| '_ \ / _` + "`" + ` | '__/ _ \` + Reset, - Primary + `\__ \ <| | | \__ \ | | | (_| | | | __/` + Reset + ` ` + Muted + `https://github.com/runkids/skillshare` + Reset, + a.Primary + ` _ _ _ _ _` + a.Reset, + a.Primary + ` ___| | _(_) | |___| |__ __ _ _ __ ___` + a.Reset, + a.Primary + `/ __| |/ / | | / __| '_ \ / _` + "`" + ` | '__/ _ \` + a.Reset, + a.Primary + `\__ \ <| | | \__ \ | | | (_| | | | __/` + a.Reset + ` ` + a.Muted + `https://github.com/runkids/skillshare` + a.Reset, } // Last line varies based on version suffix := "" if ModeLabel != "" { - suffix = ` ` +
Accent + `(` + ModeLabel + `)` + Reset + suffix = ` ` + a.Info + `(` + ModeLabel + `)` + a.Reset } if version != "" { - lines = append(lines, Primary+`|___/_|\_\_|_|_|___/_| |_|\__,_|_| \___|`+Reset+` `+Muted+`v`+version+Reset+suffix) + lines = append(lines, a.Primary+`|___/_|\_\_|_|_|___/_| |_|\__,_|_| \___|`+a.Reset+` `+a.Muted+`v`+version+a.Reset+suffix) } else { - lines = append(lines, Primary+`|___/_|\_\_|_|_|___/_| |_|\__,_|_| \___|`+Reset+` `+Muted+`Sync skills across all AI CLI tools`+Reset+suffix) + lines = append(lines, a.Primary+`|___/_|\_\_|_|_|___/_| |_|\__,_|_| \___|`+a.Reset+` `+a.Muted+`Sync skills across all AI CLI tools`+a.Reset+suffix) } if animate { diff --git a/internal/validate/agent.go b/internal/validate/agent.go new file mode 100644 index 00000000..ea824fb7 --- /dev/null +++ b/internal/validate/agent.go @@ -0,0 +1,84 @@ +package validate + +import ( + "fmt" + "os" + "path/filepath" + "regexp" + "strings" +) + +// agentFileNameRegex allows letters, numbers, underscores, hyphens, and dots. +// Must end with .md (case-insensitive check done separately). +var agentFileNameRegex = regexp.MustCompile(`^[a-zA-Z0-9][a-zA-Z0-9_.-]*$`) + +// AgentFileSizeWarningThreshold is the size above which a warning is issued. +const AgentFileSizeWarningThreshold = 100 * 1024 // 100KB + +// AgentValidationResult holds the result of validating an agent file. +type AgentValidationResult struct { + Valid bool + Errors []string + Warnings []string +} + +// AgentFile validates a single agent .md file.
+func AgentFile(filePath string) AgentValidationResult { + result := AgentValidationResult{Valid: true} + + fileName := filepath.Base(filePath) + + // Must be .md extension + if !strings.HasSuffix(strings.ToLower(fileName), ".md") { + result.Valid = false + result.Errors = append(result.Errors, fmt.Sprintf("agent file must have .md extension, got %q", fileName)) + return result + } + + // Filename restrictions: no spaces or special chars + if !agentFileNameRegex.MatchString(fileName) { + result.Valid = false + result.Errors = append(result.Errors, fmt.Sprintf("agent filename %q contains invalid characters (spaces, special chars not allowed)", fileName)) + } + + // Check file exists and size + info, err := os.Stat(filePath) + if err != nil { + if os.IsNotExist(err) { + result.Valid = false + result.Errors = append(result.Errors, "agent file does not exist") + } + return result + } + + if info.IsDir() { + result.Valid = false + result.Errors = append(result.Errors, "path is a directory, not a file") + return result + } + + if info.Size() > int64(AgentFileSizeWarningThreshold) { + result.Warnings = append(result.Warnings, + fmt.Sprintf("agent file is %dKB (>100KB) — large agents may slow down AI tools", info.Size()/1024)) + } + + return result +} + +// AgentName validates an agent name (derived from filename). 
+func AgentName(name string) error { + if name == "" { + return fmt.Errorf("agent name cannot be empty") + } + + if len(name) > 128 { + return fmt.Errorf("agent name too long (max 128 characters)") + } + + nameRegex := regexp.MustCompile(`^[a-zA-Z0-9][a-zA-Z0-9_-]*$`) + if !nameRegex.MatchString(name) { + return fmt.Errorf("agent name must start with a letter or number and contain only letters, numbers, underscores, and hyphens") + } + + return nil +} diff --git a/internal/validate/agent_test.go b/internal/validate/agent_test.go new file mode 100644 index 00000000..c25a8611 --- /dev/null +++ b/internal/validate/agent_test.go @@ -0,0 +1,89 @@ +package validate + +import ( + "os" + "path/filepath" + "strings" + "testing" +) + +func TestAgentFile_Valid(t *testing.T) { + dir := t.TempDir() + f := filepath.Join(dir, "tutor.md") + os.WriteFile(f, []byte("# Tutor"), 0644) + + r := AgentFile(f) + if !r.Valid { + t.Errorf("expected valid, got errors: %v", r.Errors) + } + if len(r.Warnings) != 0 { + t.Errorf("expected no warnings, got: %v", r.Warnings) + } +} + +func TestAgentFile_WrongExtension(t *testing.T) { + dir := t.TempDir() + f := filepath.Join(dir, "tutor.txt") + os.WriteFile(f, []byte("content"), 0644) + + r := AgentFile(f) + if r.Valid { + t.Error("expected invalid for non-.md file") + } +} + +func TestAgentFile_InvalidFilename(t *testing.T) { + dir := t.TempDir() + f := filepath.Join(dir, "my agent.md") + os.WriteFile(f, []byte("# Agent"), 0644) + + r := AgentFile(f) + if r.Valid { + t.Error("expected invalid for filename with spaces") + } +} + +func TestAgentFile_Oversized(t *testing.T) { + dir := t.TempDir() + f := filepath.Join(dir, "big.md") + os.WriteFile(f, []byte(strings.Repeat("x", 200*1024)), 0644) + + r := AgentFile(f) + if !r.Valid { + t.Error("oversized file should be valid (warning only)") + } + if len(r.Warnings) != 1 { + t.Errorf("expected 1 warning, got %d", len(r.Warnings)) + } +} + +func TestAgentFile_NotExist(t *testing.T) { + r := 
AgentFile("/nonexistent/agent.md") + if r.Valid { + t.Error("expected invalid for nonexistent file") + } +} + +func TestAgentFile_Directory(t *testing.T) { + dir := t.TempDir() + r := AgentFile(dir) + if r.Valid { + t.Error("expected invalid for directory path") + } +} + +func TestAgentName_Valid(t *testing.T) { + for _, name := range []string{"tutor", "math-tutor", "code_review", "a1"} { + if err := AgentName(name); err != nil { + t.Errorf("AgentName(%q) should be valid, got: %v", name, err) + } + } +} + +func TestAgentName_Invalid(t *testing.T) { + for _, name := range []string{"", "-start", "has space", strings.Repeat("a", 129)} { + if err := AgentName(name); err == nil { + t.Errorf("AgentName(%q) should be invalid", name) + } + } +} diff --git a/internal/validate/validate.go b/internal/validate/validate.go index b2e56d3c..273d21f6 100644 --- a/internal/validate/validate.go +++ b/internal/validate/validate.go @@ -54,11 +54,16 @@ func TargetName(name string) error { return nil } +// reservedSkillNames are names that cannot be used as skill names +// because they conflict with resource kind directories or commands. +var reservedSkillNames = []string{"agents"} + // SkillName validates a skill name. // Rules: // - Must start with a letter or number // - Can contain letters, numbers, underscores, and hyphens // - Length 1-64 characters +// - Cannot be a reserved skill name (e.g. 
"agents") func SkillName(name string) error { if name == "" { return fmt.Errorf("skill name cannot be empty") @@ -72,6 +77,12 @@ func SkillName(name string) error { return fmt.Errorf("skill name must start with a letter or number and contain only letters, numbers, underscores, and hyphens") } + for _, r := range reservedSkillNames { + if strings.EqualFold(name, r) { + return fmt.Errorf("'%s' is a reserved name and cannot be used as a skill name", name) + } + } + return nil } diff --git a/internal/validate/validate_test.go b/internal/validate/validate_test.go index 46a04b37..49c2a728 100644 --- a/internal/validate/validate_test.go +++ b/internal/validate/validate_test.go @@ -44,6 +44,31 @@ func TestTargetName(t *testing.T) { } } +func TestSkillName(t *testing.T) { + tests := []struct { + name string + input string + wantErr bool + }{ + {"valid", "my-skill", false}, + {"valid with number", "skill2", false}, + {"empty", "", true}, + {"too long", strings.Repeat("a", 65), true}, + {"reserved agents", "agents", true}, + {"reserved agents uppercase", "Agents", true}, + {"starts with special", "-skill", true}, + } + + for _, tt := range tests { + t.Run(tt.name, func(t *testing.T) { + err := SkillName(tt.input) + if (err != nil) != tt.wantErr { + t.Errorf("SkillName(%q) error = %v, wantErr %v", tt.input, err, tt.wantErr) + } + }) + } +} + func TestPath(t *testing.T) { tests := []struct { name string diff --git a/schemas/config.schema.json b/schemas/config.schema.json index f3fbb49a..f00364dd 100644 --- a/schemas/config.schema.json +++ b/schemas/config.schema.json @@ -13,6 +13,12 @@ "default": "~/.config/skillshare/skills", "examples": ["~/.config/skillshare/skills", "/home/user/my-skills"] }, + "agents_source": { + "type": "string", + "description": "Path to the agents source directory. Supports ~ for home directory. 
Defaults to ~/.config/skillshare/agents if not set.", + "default": "~/.config/skillshare/agents", + "examples": ["~/.config/skillshare/agents", "/home/user/my-agents"] + }, "mode": { "type": "string", "description": "Default sync mode for all targets.", diff --git a/schemas/registry.schema.json b/schemas/registry.schema.json index 9d79d96f..00aaf7ec 100644 --- a/schemas/registry.schema.json +++ b/schemas/registry.schema.json @@ -1,41 +1,43 @@ { "$schema": "https://json-schema.org/draft/2020-12/schema", "$id": "https://raw.githubusercontent.com/runkids/skillshare/main/schemas/registry.schema.json", - "title": "Skillshare Registry", - "description": "Skill registry for skillshare — tracks installed/tracked skills. Location: ~/.config/skillshare/skills/registry.yaml (global) or .skillshare/registry.yaml (project). Auto-managed by reconciliation; manual edits are preserved.", + "title": "Skillshare Metadata Store", + "description": "Centralized metadata store for skillshare — tracks installed skills and agents. Location: ~/.config/skillshare/skills/.metadata.json (global) or .skillshare/skills/.metadata.json (project). Auto-managed; manual edits are preserved on reconciliation.", "type": "object", "additionalProperties": false, "properties": { - "skills": { - "type": "array", - "description": "Installed/tracked skills.", - "items": { - "$ref": "#/$defs/skillEntry" + "version": { + "type": "integer", + "description": "Schema version. 
Currently 1.", + "const": 1 + }, + "entries": { + "type": "object", + "description": "Map of skill/agent relative paths to their metadata entries.", + "additionalProperties": { + "$ref": "#/$defs/metadataEntry" } } }, + "required": ["version", "entries"], "$defs": { - "skillEntry": { + "metadataEntry": { "type": "object", - "description": "A managed resource entry tracked in the registry.", - "required": ["name", "source"], - "additionalProperties": false, + "description": "Metadata for an installed skill or agent.", "properties": { - "name": { + "source": { "type": "string", - "description": "Resource name (skill directory name or agent file name).", - "examples": ["pdf", "_team-skills", "tutor"] + "description": "Original source input (GitHub path, local path, etc.).", + "examples": ["github.com/anthropics/skills/skills/pdf", "github.com/team/agents"] }, "kind": { "type": "string", - "description": "Resource kind. Omitted or empty defaults to 'skill'. ('agent' is reserved for a future release.)", - "enum": ["skill", "agent"], - "default": "skill" + "description": "Resource kind. Empty or omitted defaults to 'skill'.", + "enum": ["", "skill", "agent"] }, - "source": { + "type": { "type": "string", - "description": "GitHub path or local path where the resource was installed from.", - "examples": ["anthropics/skills/skills/pdf", "github.com/team/skills"] + "description": "Source type (github, local, git-https, git-ssh, etc.)." }, "tracked": { "type": "boolean", @@ -45,12 +47,45 @@ "group": { "type": "string", "description": "Subdirectory group the resource belongs to (set by --into).", - "examples": ["frontend", "backend"] + "examples": ["frontend", "backend", "team/frontend"] }, "branch": { "type": "string", - "description": "Git branch to clone from. Omit for remote default branch.", - "examples": ["develop", "frontend"] + "description": "Git branch to clone from. Omit for remote default branch." 
+ }, + "into": { + "type": "string", + "description": "Original --into value used during install." + }, + "installed_at": { + "type": "string", + "format": "date-time", + "description": "ISO 8601 timestamp of when the resource was installed." + }, + "repo_url": { + "type": "string", + "description": "Git clone URL for the source repository.", + "examples": ["https://github.com/anthropics/skills.git"] + }, + "subdir": { + "type": "string", + "description": "Subdirectory within the repo (for monorepo installs)." + }, + "version": { + "type": "string", + "description": "Git commit hash at install/update time." + }, + "tree_hash": { + "type": "string", + "description": "Git tree SHA of the subdir at install time." + }, + "file_hashes": { + "type": "object", + "description": "Map of relative file paths to sha256: digests for integrity verification.", + "additionalProperties": { + "type": "string", + "pattern": "^sha256:[0-9a-f]{64}$" + } } } } diff --git a/scripts/red_team/_helpers.sh b/scripts/red_team/_helpers.sh index 9582f57f..f5aed0e3 100644 --- a/scripts/red_team/_helpers.sh +++ b/scripts/red_team/_helpers.sh @@ -49,6 +49,28 @@ $content SKILL_EOF } +# Write or merge a skill entry into the centralized .metadata.json store. +# Usage: write_store_meta <source_dir> <skill_name> <entry_json> +# Example: write_store_meta "$SOURCE_DIR" "my-skill" '{"source":"test","file_hashes":{"SKILL.md":"sha256:abc"}}' +write_store_meta() { + local source_dir="$1" + local skill_name="$2" + local entry_json="$3" + local store_path="$source_dir/.metadata.json" + + if [ -f "$store_path" ]; then + # Merge into existing store + local tmp + tmp=$(jq --arg name "$skill_name" --argjson entry "$entry_json" \ + '.entries[$name] = $entry' "$store_path") + echo "$tmp" > "$store_path" + else + # Create new store + jq -n --arg name "$skill_name" --argjson entry "$entry_json" \ + '{version:1, entries:{($name): $entry}}' > "$store_path" + fi +} + # Run skillshare with isolated config.
# -g is placed after the subcommand to force global mode, # preventing auto-detection of .skillshare/ in the working directory. diff --git a/scripts/red_team/phase3_integrity.sh b/scripts/red_team/phase3_integrity.sh index 310e1a7e..b80e9583 100644 --- a/scripts/red_team/phase3_integrity.sh +++ b/scripts/red_team/phase3_integrity.sh @@ -13,16 +13,8 @@ Safe content for hash verification." SKILL_HASH=$(shasum -a 256 "$INTEGRITY_DIR/SKILL.md" | awk '{print $1}') -cat > "$INTEGRITY_DIR/.skillshare-meta.json" < "$INTEGRITY_DIR/.skillshare-meta.json" < "$INTEGRITY_DIR/.skillshare-meta.json" < "$INTEGRITY_DIR/sneaky.sh" diff --git a/scripts/red_team/phase5_advanced.sh b/scripts/red_team/phase5_advanced.sh index 84c1527b..2d77a92e 100644 --- a/scripts/red_team/phase5_advanced.sh +++ b/scripts/red_team/phase5_advanced.sh @@ -13,19 +13,8 @@ Safe content for traversal hardening checks." TRAVERSAL_HASH=$(shasum -a 256 "$TRAVERSAL_DIR/SKILL.md" | awk '{print $1}') echo "TOP SECRET" > "$TMPDIR_ROOT/secret.txt" -cat > "$TRAVERSAL_DIR/.skillshare-meta.json" < "$SYMLINK_DIR/.skillshare-meta.json" < "$TMPDIR_ROOT/outside.txt" ln -s "$TMPDIR_ROOT/outside.txt" "$SYMLINK_DIR/external-link.txt" diff --git a/scripts/test_docker.sh b/scripts/test_docker.sh index 0ca80814..8de5ebdd 100755 --- a/scripts/test_docker.sh +++ b/scripts/test_docker.sh @@ -58,7 +58,7 @@ done if [[ "$ONLINE" == "true" ]]; then PROFILE="online" SERVICE="sandbox-online" - DEFAULT_ONLINE_CMD="mkdir -p bin && go build -o bin/skillshare ./cmd/skillshare && SKILLSHARE_TEST_BINARY=/workspace/bin/skillshare go test -v -tags online ./tests/integration/... -timeout 300s" + DEFAULT_ONLINE_CMD="mkdir -p bin && go build -o bin/skillshare ./cmd/skillshare && SKILLSHARE_TEST_BINARY=/workspace/bin/skillshare go test -v -tags online ./tests/integration/... 
-timeout 600s" else PROFILE="offline" SERVICE="sandbox-offline" diff --git a/skills/skillshare/SKILL.md b/skills/skillshare/SKILL.md index c99f5a8e..b8c976e0 100644 --- a/skills/skillshare/SKILL.md +++ b/skills/skillshare/SKILL.md @@ -1,18 +1,21 @@ --- name: skillshare description: | - Manages and syncs AI CLI skills across 50+ tools from a single source. + Manages and syncs AI CLI skills and agents across 50+ tools from a single source. Use this skill whenever the user mentions "skillshare", runs skillshare commands, - manages skills (install, update, uninstall, sync, audit, analyze, check, diff, search), - or troubleshoots skill configuration (orphaned symlinks, broken targets, sync + manages skills or agents (install, update, uninstall, sync, audit, analyze, check, diff, search), + or troubleshoots skill/agent configuration (orphaned symlinks, broken targets, sync issues). Covers both global (~/.config/skillshare/) and project (.skillshare/) modes. Also use when: adding new AI tool targets (Claude, Cursor, Windsurf, etc.), setting target include/exclude filters or copy vs symlink mode, using backup/restore or trash recovery, piping skillshare output to scripts (--json), setting up CI/CD - audit pipelines, or building/sharing skill hubs (hub index, hub add). + audit pipelines, building/sharing skill hubs (hub index, hub add), or working with + agents (single .md files synced to agent-capable targets like Claude, Cursor, + Augment, OpenCode) via positional `agents` filter or `--kind agent`, plus + `.agentignore` and `enable`/`disable` for per-agent toggles. 
argument-hint: "[command] [target] [--json] [--dry-run] [-p|-g]" metadata: - version: v0.18.9 + version: v0.19.0 --- # Skillshare CLI @@ -105,9 +108,11 @@ skillshare hub index --source ~/.config/skillshare/skills/ --full --audit # Bui ``` ### Controlling Where Skills Go ```bash -# SKILL.md frontmatter: targets: [claude] → only syncs to Claude +# SKILL.md frontmatter: metadata.targets: [claude] → only syncs to Claude skillshare target claude --add-include "team-*" # glob filter +skillshare target claude --add-agent-include "team-*" # agent glob filter skillshare target claude --add-exclude "_legacy*" # exclude pattern +skillshare target claude --agent-mode copy # agents copy mode skillshare target codex --mode copy && skillshare sync --force # copy mode # .skillignore — hide skills/dirs from discovery (gitignore syntax) # Root-level: /.skillignore (affects all commands) diff --git a/skills/skillshare/references/sync.md b/skills/skillshare/references/sync.md index c6114b18..f431107d 100644 --- a/skills/skillshare/references/sync.md +++ b/skills/skillshare/references/sync.md @@ -58,7 +58,7 @@ For full extras management (`init`, `list`, `remove`, `collect`), see [extras.md ## collect -Import skills from target(s) to source. +Import skills or agents from target(s) to source. 
```bash # Global @@ -66,11 +66,14 @@ skillshare collect claude # From specific target skillshare collect --all # From all targets skillshare collect --dry-run # Preview skillshare collect claude --json # JSON output (implies --force) +skillshare collect agents claude # Collect agents instead of skills # Project (auto-detected or -p) skillshare collect claude # From project target skillshare collect --all # All project targets skillshare collect --all --force # Skip confirmation +skillshare collect -p --json # Project JSON output +skillshare collect -p agents --json # Project agent JSON output ``` ## push diff --git a/skills/skillshare/references/targets.md b/skills/skillshare/references/targets.md index fb2b5ad5..ec2609ce 100644 --- a/skills/skillshare/references/targets.md +++ b/skills/skillshare/references/targets.md @@ -35,17 +35,22 @@ targets: ## Target Filters -Control which skills sync to each target using include/exclude glob patterns. +Control which skills and agents sync to each target using include/exclude glob patterns. ```bash -# Add filters +# Add skill filters skillshare target claude --add-include "team-*" # Only sync matching skills skillshare target claude --add-exclude "_legacy*" # Skip matching skills skillshare target claude --add-include "team-*" -p # Project target filter +# Add agent filters +skillshare target claude --add-agent-include "team-*" +skillshare target claude --add-agent-exclude "draft-*" + # Remove filters skillshare target claude --remove-include "team-*" skillshare target claude --remove-exclude "_legacy*" +skillshare target claude --remove-agent-include "team-*" ``` **Config format** with filters: @@ -60,16 +65,17 @@ targets: **Pattern syntax:** `filepath.Match` globs — `*` matches any non-separator chars, `?` matches single char. -**Precedence:** Include filters apply first (whitelist), then exclude filters remove from that set. No filters = all skills. 
+**Precedence:** Include filters apply first (whitelist), then exclude filters remove from that set. No filters = all matching resources. Agent filters require a target with an agents path, and they are ignored in `symlink` mode. ## Skill-Level Targets -Skills can declare which targets they should sync to via a `targets` frontmatter field in SKILL.md: +Skills can declare which targets they should sync to via `metadata.targets` in SKILL.md. Top-level `targets` is still supported for older skills, but `metadata.targets` wins when both are present: ```yaml --- name: enterprise-skill -targets: [claude, cursor] +metadata: + targets: [claude, cursor] --- ``` diff --git a/skills/skillshare/references/trash.md b/skills/skillshare/references/trash.md index 9c86d95d..beffb0f0 100644 --- a/skills/skillshare/references/trash.md +++ b/skills/skillshare/references/trash.md @@ -1,6 +1,6 @@ # Trash (Soft-Delete) -`skillshare uninstall` moves skills to trash with 7-day retention instead of permanent deletion. +`skillshare uninstall` moves skills and agents to trash with 7-day retention instead of permanent deletion. ## Commands @@ -11,13 +11,22 @@ skillshare trash delete # Permanently delete one (alias: rm) skillshare trash empty # Permanently delete all (confirmation required) ``` +### Agent Trash + +```bash +skillshare trash agents list # List trashed agents +skillshare trash agents restore # Restore agent to source +skillshare trash agents delete # Permanently delete one agent +skillshare trash agents empty # Empty agent trash +skillshare trash --all list # List both skills and agents +``` + ## Project Mode ```bash skillshare trash list -p # Project trash skillshare trash restore -p # Restore to project source -skillshare trash delete -p # Delete from project trash -skillshare trash empty -p # Empty project trash +skillshare trash agents list -p # Project agent trash ``` Auto-detects mode when `.skillshare/config.yaml` exists. 
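The include-then-exclude precedence described for target filters above (whitelist first, then exclude; no filters means everything matches) can be sketched in Go. `matchesFilters` is a hypothetical helper for illustration only — it uses the standard library's `filepath.Match`, and is an assumption about the behavior, not skillshare's actual implementation:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// matchesFilters applies include globs as a whitelist first, then exclude
// globs remove matches from that set. An empty include list means "include
// everything". Hypothetical sketch — not skillshare's real code.
func matchesFilters(name string, includes, excludes []string) bool {
	included := len(includes) == 0
	for _, p := range includes {
		if ok, _ := filepath.Match(p, name); ok {
			included = true
			break
		}
	}
	if !included {
		return false
	}
	for _, p := range excludes {
		if ok, _ := filepath.Match(p, name); ok {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(matchesFilters("team-docs", []string{"team-*"}, nil))              // true
	fmt.Println(matchesFilters("_legacy-old", nil, []string{"_legacy*"}))          // false
	fmt.Println(matchesFilters("team-old", []string{"team-*"}, []string{"*-old"})) // false
}
```

Note that `filepath.Match`'s `*` does not cross path separators, which matches the pattern syntax documented above.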
@@ -25,7 +34,7 @@ Auto-detects mode when `.skillshare/config.yaml` exists. ## Behavior - **7-day TTL**: Items auto-expire after 7 days -- **Restore**: Copies skill back to source directory; run `skillshare sync` afterward +- **Restore**: Copies skill/agent back to source directory; run `skillshare sync` / `skillshare sync agents` afterward - **Empty**: Requires interactive confirmation (`y/yes`) - **List output**: Shows name, size, and age (minutes/hours/days) @@ -37,6 +46,10 @@ skillshare trash list # Find the skill skillshare trash restore my-skill # Restore it skillshare sync # Re-sync to targets +# Undo an agent uninstall +skillshare trash agents restore tutor # Restore agent +skillshare sync agents # Re-sync agents + # Clean up skillshare trash delete old-skill # Remove specific item ``` diff --git a/tests/integration/agent_backup_test.go b/tests/integration/agent_backup_test.go new file mode 100644 index 00000000..eb562622 --- /dev/null +++ b/tests/integration/agent_backup_test.go @@ -0,0 +1,211 @@ +//go:build !online + +package integration + +import ( + "os" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +func TestBackup_Agents_CreatesBackup(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + // Sync agents first so there's something to backup + sb.RunCLI("sync", "agents") + + result := sb.RunCLI("backup", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "agent backup") +} + +func TestBackup_Agents_DryRun(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, 
"claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + sb.RunCLI("sync", "agents") + + result := sb.RunCLI("backup", "agents", "--dry-run") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Dry run") +} + +func TestBackup_Agents_RestoreRoundTrip(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + // Sync then backup + sb.RunCLI("sync", "agents") + sb.RunCLI("backup", "agents") + + // Verify symlink exists + linkPath := filepath.Join(claudeAgents, "tutor.md") + if _, err := os.Lstat(linkPath); err != nil { + t.Fatalf("expected agent symlink at %s", linkPath) + } + + // Delete the agent from target + os.Remove(linkPath) + if _, err := os.Lstat(linkPath); !os.IsNotExist(err) { + t.Fatal("symlink should be removed") + } + + // Restore + result := sb.RunCLI("restore", "agents", "claude", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Restored") +} + +func TestBackup_Default_DoesNotBackupAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + claudeSkills := sb.CreateTarget("claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + claudeSkills + ` +`) + + // Default backup should only backup skills, not mention agents + result := sb.RunCLI("backup") + result.AssertSuccess(t) + result.AssertOutputNotContains(t, "agent") +} + +func TestBackup_Agents_ProjectMode_CreatesBackup(t *testing.T) { + sb := testutil.NewSandbox(t) + 
defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // Sync agents first + result := sb.RunCLIInDir(projectDir, "sync", "-p", "agents") + result.AssertSuccess(t) + + // Backup project agents + result = sb.RunCLIInDir(projectDir, "backup", "-p", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "agent backup") + + // Verify backup was created under .skillshare/backups/ + backupDir := filepath.Join(projectDir, ".skillshare", "backups") + entries, err := os.ReadDir(backupDir) + if err != nil { + t.Fatalf("expected backup dir at %s: %v", backupDir, err) + } + if len(entries) == 0 { + t.Fatal("expected at least one backup timestamp directory") + } +} + +func TestBackup_Agents_ProjectMode_DryRun(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + sb.RunCLIInDir(projectDir, "sync", "-p", "agents") + + result := sb.RunCLIInDir(projectDir, "backup", "-p", "agents", "--dry-run") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Dry run") + + // Backup dir should NOT exist + backupDir := filepath.Join(projectDir, ".skillshare", "backups") + if _, err := os.Stat(backupDir); !os.IsNotExist(err) { + t.Fatal("backup dir should not exist in dry run mode") + } +} + +func TestBackup_Agents_ProjectMode_RestoreRoundTrip(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // Sync → backup + sb.RunCLIInDir(projectDir, "sync", "-p", "agents") + sb.RunCLIInDir(projectDir, "backup", "-p", "agents") + + // Verify agent symlink exists + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + linkPath := filepath.Join(claudeAgents, "tutor.md") + if _, err := os.Lstat(linkPath); err != nil { + t.Fatalf("expected agent symlink at %s", linkPath) + } + + // Delete agent from target + os.Remove(linkPath) + + // Restore + result := sb.RunCLIInDir(projectDir, "restore", "-p", "agents", "claude", "--force") 
+ result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Restored") + + // Verify agent file is back (as a regular file from backup, not symlink) + if _, err := os.Stat(linkPath); err != nil { + t.Fatalf("expected restored agent at %s: %v", linkPath, err) + } +} + +func TestRestore_Agents_SkillsProjectModeRejected(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Skills restore in project mode should still be rejected + result := sb.RunCLI("restore", "-p", "claude") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "not supported in project mode") +} diff --git a/tests/integration/agent_coverage_gaps_test.go b/tests/integration/agent_coverage_gaps_test.go new file mode 100644 index 00000000..e4e90a34 --- /dev/null +++ b/tests/integration/agent_coverage_gaps_test.go @@ -0,0 +1,281 @@ +//go:build !online + +package integration + +import ( + "encoding/json" + "os" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +// --- trash agents empty --- + +func TestTrash_Agents_Empty(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + "reviewer.md": "# Reviewer agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Uninstall both agents to trash + sb.RunCLI("uninstall", "-g", "agents", "--all", "--force") + + // Verify trash has items + listResult := sb.RunCLI("trash", "agents", "list", "--no-tui") + listResult.AssertSuccess(t) + listResult.AssertAnyOutputContains(t, "tutor") + + // Empty agent trash (use --force via input "y") + emptyResult := sb.RunCLIWithInput("y\n", "trash", "agents", "empty") + emptyResult.AssertSuccess(t) + emptyResult.AssertAnyOutputContains(t, "Emptied trash") + + // Verify trash is now empty + afterResult := sb.RunCLI("trash", "agents", "list", "--no-tui") + afterResult.AssertSuccess(t) + 
afterResult.AssertAnyOutputContains(t, "empty") +} + +// --- trash agents delete --- + +func TestTrash_Agents_Delete(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + sb.RunCLI("uninstall", "-g", "agents", "tutor", "--force") + + // Delete specific item from agent trash + result := sb.RunCLI("trash", "agents", "delete", "tutor") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Permanently deleted") +} + +// --- uninstall agents project mode --- + +func TestUninstall_Agents_ProjectMode(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + // Setup project + projectDir := filepath.Join(sb.Root, "myproject") + os.MkdirAll(filepath.Join(projectDir, ".skillshare", "skills"), 0755) + agentsDir := filepath.Join(projectDir, ".skillshare", "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor agent"), 0644) + + // Write project config + projectCfgDir := filepath.Join(projectDir, ".skillshare") + os.WriteFile(filepath.Join(projectCfgDir, "config.yaml"), []byte("targets:\n - claude\n"), 0644) + + // Also need global config for the CLI to not error + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLIInDir(projectDir, "uninstall", "-p", "agents", "tutor", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Removed agent") + + // Verify removed + if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); !os.IsNotExist(err) { + t.Error("agent should be removed from project agents dir") + } +} + +// --- check all (combined) --- + +func TestCheck_All_CombinedOutput(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, 
map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // "check --all" should show both skills and agents + // Currently "check" defaults to skills-only, "check agents" is agents-only + // "check --all" should combine both + result := sb.RunCLI("check", "--all") + result.AssertSuccess(t) +} + +// --- multi-target agent config --- + +func TestSync_Agents_SkipsTargetsWithoutAgentPath(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + // claude has agent path, cursor does NOT + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` + cursor: + skills: + path: ` + sb.CreateTarget("cursor") + ` +`) + + result := sb.RunCLI("sync", "agents") + result.AssertSuccess(t) + + // Claude agents should be synced + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); err != nil { + t.Error("claude agent should be synced") + } +} + +func TestDiff_Default_ShowsAgentPruneAfterUninstallAll(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + sb.RunCLI("sync", "-g", "agents").AssertSuccess(t) + sb.RunCLI("uninstall", "-g", "agents", "--all", "--force").AssertSuccess(t) + + result := sb.RunCLI("diff", "-g", "--json") + result.AssertSuccess(t) + + output := parseJSON(t, result.Stdout) + targets, ok := output["targets"].([]any) + if !ok || len(targets) == 0 { + t.Fatalf("expected diff targets, got %v", output["targets"]) + } + + foundPrune 
:= false + for _, rawTarget := range targets { + target, ok := rawTarget.(map[string]any) + if !ok || target["name"] != "claude" { + continue + } + items, _ := target["items"].([]any) + for _, rawItem := range items { + item, ok := rawItem.(map[string]any) + if !ok { + continue + } + if item["name"] == "tutor.md" && item["kind"] == "agent" && item["action"] == "remove" { + foundPrune = true + } + } + } + + if !foundPrune { + pretty, _ := json.MarshalIndent(output, "", " ") + t.Fatalf("expected agent prune in diff output, got:\n%s", string(pretty)) + } +} + +func TestSync_Agents_PrunesTargetAfterUninstallAll(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + sb.RunCLI("sync", "-g", "agents").AssertSuccess(t) + sb.RunCLI("uninstall", "-g", "agents", "--all", "--force").AssertSuccess(t) + + syncResult := sb.RunCLI("sync", "-g", "agents") + syncResult.AssertSuccess(t) + syncResult.AssertAnyOutputContains(t, "1 pruned") + + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); !os.IsNotExist(err) { + t.Fatalf("expected tutor.md to be pruned from target, got err=%v", err) + } +} + +// --- list agents JSON with kind field --- + +func TestList_Agents_JSON_AllEntriesHaveKind(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // "list --all --json" should have kind on every entry + result := sb.RunCLI("list", "--all", "--json") + result.AssertSuccess(t) + 
result.AssertAnyOutputContains(t, `"kind": "skill"`) + result.AssertAnyOutputContains(t, `"kind": "agent"`) +} + +// --- status agents JSON with targets --- + +func TestStatus_Agents_JSON_WithTargets(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + // Sync agents + sb.RunCLI("sync", "agents") + + result := sb.RunCLI("status", "agents", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"agents"`) + result.AssertAnyOutputContains(t, `"expected"`) + result.AssertAnyOutputContains(t, `"linked"`) +} diff --git a/tests/integration/agent_crud_test.go b/tests/integration/agent_crud_test.go new file mode 100644 index 00000000..d5d0f511 --- /dev/null +++ b/tests/integration/agent_crud_test.go @@ -0,0 +1,521 @@ +//go:build !online + +package integration + +import ( + "os" + "os/exec" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +// --- update agents --- + +func TestUpdate_Agents_NoAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := createAgentSource(t, sb, nil) + _ = agentsDir + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("update", "agents", "--all") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "No agents found") +} + +func TestUpdate_Agents_LocalOnly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("update", "agents", "--all") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "local") +} + +func 
TestUpdate_Agents_GroupInvalidDir(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("update", "agents", "--group", "nonexistent") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "not found") +} + +func TestUpdate_Agents_RequiresNameOrAll(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("update", "agents") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "specify agent name") +} + +func TestUpdate_Agents_HighFindingBlockedByThreshold(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + repoDir := filepath.Join(sb.Home, "agent-audit-repo") + if err := os.MkdirAll(repoDir, 0o755); err != nil { + t.Fatalf("mkdir repo: %v", err) + } + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v1\n"), 0o644); err != nil { + t.Fatalf("write initial agent: %v", err) + } + initGitRepo(t, repoDir) + + installResult := sb.RunCLI("install", "file://"+repoDir, "--kind", "agent", "--skip-audit") + installResult.AssertSuccess(t) + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + installedAgent := filepath.Join(agentsDir, "reviewer.md") + + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v2\nsudo apt-get install -y jq\n"), 0o644); err != nil { + t.Fatalf("write updated agent: %v", err) + } + for _, args := range [][]string{ + {"add", "reviewer.md"}, + {"commit", "-m", "introduce high finding"}, + } { + cmd := exec.Command("git", args...) 
+ cmd.Dir = repoDir + if out, err := cmd.CombinedOutput(); err != nil { + t.Fatalf("git %v failed: %s %v", args, out, err) + } + } + + result := sb.RunCLI("update", "agents", "--all", "-T", "h") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "security audit") + + content, err := os.ReadFile(installedAgent) + if err != nil { + t.Fatalf("read installed agent: %v", err) + } + if string(content) != "# Reviewer v1\n" { + t.Fatalf("expected blocked update to preserve original content, got %q", string(content)) + } +} + +// --- uninstall agents --- + +func TestUninstall_Agents_RemovesToTrash(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("uninstall", "-g", "agents", "tutor", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Removed agent") + result.AssertAnyOutputContains(t, "tutor") + + // Verify agent file was removed from source + if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); !os.IsNotExist(err) { + t.Error("agent file should be removed from source") + } +} + +func TestUninstall_Agents_NotFound(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, nil) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("uninstall", "-g", "agents", "nonexistent", "--force") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "not found") +} + +func TestUninstall_Agents_All(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + "reviewer.md": "# Reviewer agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("uninstall", "-g", "agents", "--all", "--force") + result.AssertSuccess(t) + 
result.AssertAnyOutputContains(t, "2 agent(s) removed") + + // Verify both files removed + if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); !os.IsNotExist(err) { + t.Error("tutor.md should be removed") + } + if _, err := os.Stat(filepath.Join(agentsDir, "reviewer.md")); !os.IsNotExist(err) { + t.Error("reviewer.md should be removed") + } +} + +// --- collect agents --- + +func TestCollect_Agents_NoLocalAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + // Sync agents first (creates symlinks) + sb.RunCLI("sync", "agents") + + // Collect should find no local (non-symlinked) agents + result := sb.RunCLI("collect", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "No local agents") +} + +func TestCollect_Agents_CollectsLocalFiles(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + + // Create a local (non-symlinked) agent in the target + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local agent"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + agentsSource := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + + result := sb.RunCLI("collect", "agents", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "collected") + + // Verify the file was copied to agent source + if _, err := os.Stat(filepath.Join(agentsSource, "local-agent.md")); err != nil { + t.Error("local-agent.md should be collected to agent source") + } +} + +func 
TestCollect_Agents_SpecificTargetOnly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + cursorAgents := createAgentTarget(t, sb, "cursor") + + os.WriteFile(filepath.Join(claudeAgents, "claude-agent.md"), []byte("# Claude"), 0644) + os.WriteFile(filepath.Join(cursorAgents, "cursor-agent.md"), []byte("# Cursor"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` + cursor: + skills: + path: ` + sb.CreateTarget("cursor") + ` + agents: + path: ` + cursorAgents + ` +`) + + result := sb.RunCLI("collect", "agents", "claude", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "claude-agent.md") + result.AssertOutputNotContains(t, "cursor-agent.md") + + if _, err := os.Stat(filepath.Join(agentsSource, "claude-agent.md")); err != nil { + t.Error("claude-agent.md should be collected") + } + if _, err := os.Stat(filepath.Join(agentsSource, "cursor-agent.md")); !os.IsNotExist(err) { + t.Error("cursor-agent.md should not be collected") + } +} + +func TestCollect_Agents_MultipleTargets_RequiresAllOrName(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + cursorAgents := createAgentTarget(t, sb, "cursor") + + os.WriteFile(filepath.Join(claudeAgents, "claude-agent.md"), []byte("# Claude"), 0644) + os.WriteFile(filepath.Join(cursorAgents, "cursor-agent.md"), []byte("# Cursor"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` + cursor: + skills: + path: ` + sb.CreateTarget("cursor") + ` + agents: + path: ` + cursorAgents + ` +`) + + result := sb.RunCLI("collect", "agents") + 
result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Specify a target") + + if _, err := os.Stat(filepath.Join(agentsSource, "claude-agent.md")); !os.IsNotExist(err) { + t.Error("claude-agent.md should not be collected without target selection") + } + if _, err := os.Stat(filepath.Join(agentsSource, "cursor-agent.md")); !os.IsNotExist(err) { + t.Error("cursor-agent.md should not be collected without target selection") + } +} + +func TestCollect_Agents_DryRun_DoesNotWrite(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local agent"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("collect", "agents", "--dry-run") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Dry run") + + if _, err := os.Stat(filepath.Join(agentsSource, "local-agent.md")); !os.IsNotExist(err) { + t.Error("dry-run should not collect local-agent.md") + } +} + +func TestCollect_Agents_ExistingSource_SkipsWithoutForce(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, map[string]string{ + "local-agent.md": "# Source version", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Target version"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLIWithInput("y\n", "collect", "agents", "claude") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "skipped") + + content, err := os.ReadFile(filepath.Join(agentsSource, "local-agent.md")) + if err != nil { + 
t.Fatalf("failed to read source agent: %v", err) + } + if string(content) != "# Source version" { + t.Errorf("source agent should not be overwritten, got %q", string(content)) + } +} + +func TestCollect_Agents_Force_OverwritesSource(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, map[string]string{ + "local-agent.md": "# Source version", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Target version"), 0644) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("collect", "agents", "claude", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "copied to source") + + content, err := os.ReadFile(filepath.Join(agentsSource, "local-agent.md")) + if err != nil { + t.Fatalf("failed to read source agent: %v", err) + } + if string(content) != "# Target version" { + t.Errorf("source agent should be overwritten, got %q", string(content)) + } +} + +// --- trash agents --- + +func TestTrash_Agents_ListEmpty(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("trash", "agents", "list", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "empty") +} + +func TestTrash_Agents_ListAfterUninstall(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Uninstall to trash + sb.RunCLI("uninstall", "-g", "agents", "tutor", "--force") + + // List agent trash + result := sb.RunCLI("trash", "agents", "list", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") +} + 
+func TestTrash_Agents_Restore(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Uninstall + sb.RunCLI("uninstall", "-g", "agents", "tutor", "--force") + + // Verify removed + if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); !os.IsNotExist(err) { + t.Fatal("should be removed after uninstall") + } + + // Restore + result := sb.RunCLI("trash", "agents", "restore", "tutor") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Restored") + + // Verify restored to agent source + if _, err := os.Stat(filepath.Join(agentsDir, "tutor.md")); err != nil { + t.Error("tutor.md should be restored to agent source") + } +} + +func TestTrash_Agents_Restore_Nested_DoesNotGoToSkills(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := createAgentSource(t, sb, map[string]string{ + "demo/code-archaeologist.md": "# Code Archaeologist", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + sb.RunCLI("uninstall", "-g", "agents", "demo/code-archaeologist", "--force") + + result := sb.RunCLI("trash", "agents", "restore", "demo/code-archaeologist") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Restored") + + if _, err := os.Stat(filepath.Join(agentsDir, "demo", "code-archaeologist.md")); err != nil { + t.Fatalf("nested agent should be restored to agents source: %v", err) + } + + wrongSkillsPath := filepath.Join(sb.SourcePath, "agents", "demo", "code-archaeologist", "code-archaeologist.md") + if _, err := os.Stat(wrongSkillsPath); err == nil { + t.Fatalf("nested agent should not be restored into skills tree: %s", wrongSkillsPath) + } +} + +// --- default behavior unchanged --- + +func TestTrash_Default_SkillsOnly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + 
sb.SourcePath + "\ntargets: {}\n") + + // Default trash list should check skill trash (not agent trash) + result := sb.RunCLI("trash", "list", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "empty") +} + +func TestTrash_Default_SkillsOnly_IgnoresAgentTrash(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "demo/tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + sb.RunCLI("uninstall", "-g", "agents", "demo/tutor", "--force") + + result := sb.RunCLI("trash", "list", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "empty") + + restore := sb.RunCLI("trash", "restore", "demo/tutor") + restore.AssertFailure(t) + restore.AssertAnyOutputContains(t, "not found in trash") +} diff --git a/tests/integration/agent_list_sync_test.go b/tests/integration/agent_list_sync_test.go new file mode 100644 index 00000000..ef22e297 --- /dev/null +++ b/tests/integration/agent_list_sync_test.go @@ -0,0 +1,294 @@ +//go:build !online + +package integration + +import ( + "os" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +// createAgentSource creates an agents source directory with the given agents. +// Each key is the filename (e.g., "tutor.md"), value is the content. 
+func createAgentSource(t *testing.T, sb *testutil.Sandbox, agents map[string]string) string { + t.Helper() + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + if err := os.MkdirAll(agentsDir, 0755); err != nil { + t.Fatalf("failed to create agents dir: %v", err) + } + for name, content := range agents { + agentPath := filepath.Join(agentsDir, name) + if err := os.MkdirAll(filepath.Dir(agentPath), 0755); err != nil { + t.Fatalf("failed to create agent parent dir for %s: %v", name, err) + } + if err := os.WriteFile(agentPath, []byte(content), 0644); err != nil { + t.Fatalf("failed to write agent %s: %v", name, err) + } + } + return agentsDir +} + +// createAgentTarget creates an agent target directory for the given target name. +func createAgentTarget(t *testing.T, sb *testutil.Sandbox, name string) string { + t.Helper() + var path string + switch name { + case "claude": + path = filepath.Join(sb.Home, ".claude", "agents") + case "cursor": + path = filepath.Join(sb.Home, ".cursor", "agents") + case "opencode": + path = filepath.Join(sb.Home, ".config", "opencode", "agents") + default: + path = filepath.Join(sb.Home, "."+name, "agents") + } + if err := os.MkdirAll(path, 0755); err != nil { + t.Fatalf("failed to create agent target: %v", err) + } + return path +} + +// --- list agents --- + +func TestList_Agents_Empty(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, nil) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("list", "agents", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "No agents installed") +} + +func TestList_Agents_ShowsAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + "reviewer.md": "# Reviewer agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("list", "agents", 
"--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") + result.AssertAnyOutputContains(t, "reviewer") + result.AssertAnyOutputContains(t, "Installed agents") +} + +func TestList_Agents_JSON_IncludesKind(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("list", "agents", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"kind"`) + result.AssertAnyOutputContains(t, `"agent"`) + result.AssertAnyOutputContains(t, `"tutor"`) +} + +func TestList_All_MixedOutput(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("list", "--all", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"skill"`) + result.AssertAnyOutputContains(t, `"agent"`) +} + +func TestList_Default_SkillsOnly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Default list should NOT include agents + result := sb.RunCLI("list", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"my-skill"`) + result.AssertOutputNotContains(t, `"tutor"`) +} + +// --- sync agents --- + +func TestSync_Agents_CreatesSymlinks(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + 
claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("sync", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Syncing agents") + + // Verify symlink was created + linkPath := filepath.Join(claudeAgents, "tutor.md") + if _, err := os.Lstat(linkPath); err != nil { + t.Errorf("expected agent symlink at %s, got error: %v", linkPath, err) + } +} + +func TestSync_Agents_DryRun(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("sync", "agents", "--dry-run") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Dry run") + + // Verify NO symlink was created + linkPath := filepath.Join(claudeAgents, "tutor.md") + if _, err := os.Lstat(linkPath); !os.IsNotExist(err) { + t.Error("expected no agent symlink in dry-run mode") + } +} + +func TestSync_Default_SkillsOnly_NoAgentSync(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + + claudeSkills := sb.CreateTarget("claude") + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + claudeSkills + ` + agents: + path: ` + claudeAgents + ` +`) + + // Default sync should only sync skills, NOT agents + result := sb.RunCLI("sync") + result.AssertSuccess(t) + result.AssertOutputNotContains(t, 
"Syncing agents") + + // Skill symlink should exist + if _, err := os.Lstat(filepath.Join(claudeSkills, "my-skill")); err != nil { + t.Error("expected skill symlink") + } + // Agent symlink should NOT exist + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); !os.IsNotExist(err) { + t.Error("expected no agent symlink from default sync") + } +} + +func TestSync_All_SyncsSkillsAndAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + + claudeSkills := sb.CreateTarget("claude") + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + claudeSkills + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("sync", "--all") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Syncing skills") + result.AssertAnyOutputContains(t, "Syncing agents") + + // Both should be synced + if _, err := os.Lstat(filepath.Join(claudeSkills, "my-skill")); err != nil { + t.Error("expected skill symlink") + } + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); err != nil { + t.Error("expected agent symlink") + } +} + +// --- --all flag --- + +func TestSync_All_Flag(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + claudeSkills := sb.CreateTarget("claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + claudeSkills + ` +`) + + // "sync --all" should still sync skills even without agents configured + result := sb.RunCLI("sync", "--all") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Syncing skills") +} diff --git a/tests/integration/agent_observability_test.go 
b/tests/integration/agent_observability_test.go new file mode 100644 index 00000000..7beacb73 --- /dev/null +++ b/tests/integration/agent_observability_test.go @@ -0,0 +1,231 @@ +//go:build !online + +package integration + +import ( + "encoding/json" + "os" + "path/filepath" + "strings" + "testing" + + "skillshare/internal/testutil" +) + +// --- status (always shows skills + agents) --- + +func TestStatus_ShowsAgentInfo(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + "reviewer.md": "# Reviewer agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("status") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "2 agents") // Source section + result.AssertAnyOutputContains(t, "agents") // Targets sub-item + result.AssertAnyOutputContains(t, "linked") // agent sync status +} + +func TestStatus_JSON_IncludesAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("status", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"agents"`) + result.AssertAnyOutputContains(t, `"count"`) +} + +func TestStatus_Default_ShowsBothSkillsAndAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("status") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Source") 
// source section with skills + agents + result.AssertAnyOutputContains(t, "1 skills") // skills in source + result.AssertAnyOutputContains(t, "1 agents") // agents in source +} + +// --- diff agents --- + +func TestDiff_Agents_JSON_IncludesKind(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + claudeSkills := sb.CreateTarget("claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + claudeSkills + ` +`) + + // Diff before sync should show items with kind field + result := sb.RunCLI("diff", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"kind"`) + result.AssertAnyOutputContains(t, `"skill"`) +} + +// --- doctor agents --- + +func TestDoctor_ChecksAgentSource(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("doctor") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Agents source") + result.AssertAnyOutputContains(t, "1 agents") +} + +func TestDoctor_AgentTargetDrift(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + "reviewer.md": "# Reviewer agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + // Only sync one agent manually (create symlink for tutor only) + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.Symlink( + filepath.Join(agentsDir, "tutor.md"), + filepath.Join(claudeAgents, "tutor.md"), + ) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("doctor") + result.AssertSuccess(t) + // Should 
detect drift (1/2 linked) + result.AssertAnyOutputContains(t, "drift") +} + +func TestDoctor_AgentTargetSynced(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + // Sync agents first + sb.RunCLI("sync", "agents") + + result := sb.RunCLI("doctor") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "1 agents") + result.AssertOutputNotContains(t, "drift") +} + +func TestCollect_Agents_WritesOplog(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + if err := os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + collectResult := sb.RunCLI("collect", "agents", "claude", "--force") + collectResult.AssertSuccess(t) + + logResult := sb.RunCLI("log", "--json", "--cmd", "collect", "--tail", "1") + logResult.AssertSuccess(t) + + line := strings.TrimSpace(logResult.Stdout) + if line == "" { + t.Fatal("expected collect oplog entry") + } + + var entry map[string]any + if err := json.Unmarshal([]byte(line), &entry); err != nil { + t.Fatalf("failed to parse log entry: %v\nstdout=%s", err, logResult.Stdout) + } + + if entry["cmd"] != "collect" { + t.Fatalf("expected cmd=collect, got %v", entry["cmd"]) + } + if entry["status"] != "ok" { + t.Fatalf("expected status=ok, got %v", entry["status"]) + } + + args, ok := entry["args"].(map[string]any) + if !ok { + t.Fatalf("expected args object, got %T", entry["args"]) 
+ } + if args["kind"] != "agents" { + t.Fatalf("expected kind=agents, got %v", args["kind"]) + } + if args["pulled"] != float64(1) { + t.Fatalf("expected pulled=1, got %v", args["pulled"]) + } +} diff --git a/tests/integration/agent_project_mode_test.go b/tests/integration/agent_project_mode_test.go new file mode 100644 index 00000000..c74edb42 --- /dev/null +++ b/tests/integration/agent_project_mode_test.go @@ -0,0 +1,397 @@ +//go:build !online + +package integration + +import ( + "encoding/json" + "os" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +// setupProjectWithAgents creates a project directory with skills, agents, and config. +// Returns the project root path. +func setupProjectWithAgents(t *testing.T, sb *testutil.Sandbox) string { + t.Helper() + + projectDir := filepath.Join(sb.Root, "myproject") + skillsDir := filepath.Join(projectDir, ".skillshare", "skills") + agentsDir := filepath.Join(projectDir, ".skillshare", "agents") + os.MkdirAll(skillsDir, 0755) + os.MkdirAll(agentsDir, 0755) + + // Create a skill + skillDir := filepath.Join(skillsDir, "my-skill") + os.MkdirAll(skillDir, 0755) + os.WriteFile(filepath.Join(skillDir, "SKILL.md"), []byte("---\nname: my-skill\n---\n# Content"), 0644) + + // Create an agent + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor agent"), 0644) + + // Write project config with a target that has agent path + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + os.MkdirAll(claudeAgents, 0755) + claudeSkills := filepath.Join(projectDir, ".claude", "skills") + os.MkdirAll(claudeSkills, 0755) + + configContent := `targets: + - name: claude + skills: + path: ` + claudeSkills + ` + agents: + path: ` + claudeAgents + ` +` + os.WriteFile(filepath.Join(projectDir, ".skillshare", "config.yaml"), []byte(configContent), 0644) + + // Global config (needed by CLI) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + return projectDir +} + +func 
setupProjectWithMultipleAgentTargets(t *testing.T, sb *testutil.Sandbox) string { + t.Helper() + + projectDir := filepath.Join(sb.Root, "multi-agent-project") + skillsDir := filepath.Join(projectDir, ".skillshare", "skills") + agentsDir := filepath.Join(projectDir, ".skillshare", "agents") + claudeSkills := filepath.Join(projectDir, ".claude", "skills") + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + cursorSkills := filepath.Join(projectDir, ".cursor", "skills") + cursorAgents := filepath.Join(projectDir, ".cursor", "agents") + + for _, dir := range []string{skillsDir, agentsDir, claudeSkills, claudeAgents, cursorSkills, cursorAgents} { + if err := os.MkdirAll(dir, 0755); err != nil { + t.Fatalf("failed to create %s: %v", dir, err) + } + } + + configContent := `targets: + - name: claude + skills: + path: ` + claudeSkills + ` + agents: + path: ` + claudeAgents + ` + - name: cursor + skills: + path: ` + cursorSkills + ` + agents: + path: ` + cursorAgents + ` +` + if err := os.WriteFile(filepath.Join(projectDir, ".skillshare", "config.yaml"), []byte(configContent), 0644); err != nil { + t.Fatalf("write config: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + return projectDir +} + +// --- status -p (always shows skills + agents) --- + +func TestStatusProject_ShowsAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "status", "-p") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Source") // source section + result.AssertAnyOutputContains(t, "1 agents") // agents in source + result.AssertAnyOutputContains(t, "agents") // agents sub-item in targets +} + +func TestStatusProject_JSON_IncludesAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "status", "-p", "--json") + 
result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"agents"`) + result.AssertAnyOutputContains(t, `"count"`) +} + +// --- check -p agents --- + +func TestCheckProject_Agents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "check", "-p", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") + result.AssertAnyOutputContains(t, "local") +} + +func TestCheckProject_Agents_JSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "check", "-p", "agents", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"name"`) + result.AssertAnyOutputContains(t, `"status"`) +} + +// --- diff -p agents --- + +func TestDiffProject_Agents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // Before sync, diff should show agents as "add" + result := sb.RunCLIInDir(projectDir, "diff", "-p", "agents", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") +} + +func TestDiffProject_Agents_JSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "diff", "-p", "agents", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"agent"`) +} + +// --- collect -p agents --- + +func TestCollectProject_Agents_NoLocal(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // Sync agents first + sb.RunCLIInDir(projectDir, "sync", "-p", "agents") + + // No local agents to collect + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "No local agents") +} + +func 
TestCollectProject_Agents_CollectsLocal(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // Create a local agent directly in target (not via sync) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644) + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "collected") + + // Verify copied to project agents source + agentsSource := filepath.Join(projectDir, ".skillshare", "agents") + if _, err := os.Stat(filepath.Join(agentsSource, "local-agent.md")); err != nil { + t.Error("local-agent.md should be collected to project agents source") + } +} + +func TestCollectProject_Agents_DryRun_DoesNotWrite(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644) + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents", "--dry-run") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Dry run") + + agentsSource := filepath.Join(projectDir, ".skillshare", "agents") + if _, err := os.Stat(filepath.Join(agentsSource, "local-agent.md")); !os.IsNotExist(err) { + t.Error("dry-run should not collect local-agent.md") + } +} + +func TestCollectProject_Agents_JSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644) + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents", "--json") + result.AssertSuccess(t) + + var output map[string]any + if err := json.Unmarshal([]byte(result.Stdout), 
&output); err != nil { + t.Fatalf("invalid JSON output: %v\nStdout: %s", err, result.Stdout) + } + + pulled, ok := output["pulled"].([]any) + if !ok || len(pulled) != 1 || pulled[0] != "local-agent.md" { + t.Fatalf("expected pulled=[local-agent.md], got %v", output["pulled"]) + } +} + +func TestCollectProject_Agents_SpecificTargetOnly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithMultipleAgentTargets(t, sb) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + cursorAgents := filepath.Join(projectDir, ".cursor", "agents") + agentsSource := filepath.Join(projectDir, ".skillshare", "agents") + + os.WriteFile(filepath.Join(claudeAgents, "claude-agent.md"), []byte("# Claude"), 0644) + os.WriteFile(filepath.Join(cursorAgents, "cursor-agent.md"), []byte("# Cursor"), 0644) + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents", "claude", "--force") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "claude-agent.md") + result.AssertOutputNotContains(t, "cursor-agent.md") + + if _, err := os.Stat(filepath.Join(agentsSource, "claude-agent.md")); err != nil { + t.Error("claude-agent.md should be collected") + } + if _, err := os.Stat(filepath.Join(agentsSource, "cursor-agent.md")); !os.IsNotExist(err) { + t.Error("cursor-agent.md should not be collected") + } +} + +func TestCollectProject_Agents_MultipleTargets_RequiresAllOrName(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithMultipleAgentTargets(t, sb) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + cursorAgents := filepath.Join(projectDir, ".cursor", "agents") + agentsSource := filepath.Join(projectDir, ".skillshare", "agents") + + os.WriteFile(filepath.Join(claudeAgents, "claude-agent.md"), []byte("# Claude"), 0644) + os.WriteFile(filepath.Join(cursorAgents, "cursor-agent.md"), []byte("# Cursor"), 0644) + + result := sb.RunCLIInDir(projectDir, "collect", "-p", 
"agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Specify a target") + + if _, err := os.Stat(filepath.Join(agentsSource, "claude-agent.md")); !os.IsNotExist(err) { + t.Error("claude-agent.md should not be collected without target selection") + } + if _, err := os.Stat(filepath.Join(agentsSource, "cursor-agent.md")); !os.IsNotExist(err) { + t.Error("cursor-agent.md should not be collected without target selection") + } +} + +// --- audit -p agents --- + +func TestAuditProject_Agents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + result := sb.RunCLIInDir(projectDir, "audit", "-p", "agents") + result.AssertSuccess(t) + // Audit should scan agents, not error + result.AssertOutputNotContains(t, "not yet supported") +} + +func TestSyncProject_All_NestedAgentsSameBasename_FlattensAndStaysStable(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := filepath.Join(sb.Root, "nested-agents-project") + skillsDir := filepath.Join(projectDir, ".skillshare", "skills") + agentsDir := filepath.Join(projectDir, ".skillshare", "agents") + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + cursorAgents := filepath.Join(projectDir, ".cursor", "agents") + claudeSkills := filepath.Join(projectDir, ".claude", "skills") + cursorSkills := filepath.Join(projectDir, ".cursor", "skills") + + for _, dir := range []string{ + filepath.Join(skillsDir, "sample-skill"), + filepath.Join(agentsDir, "team-a"), + filepath.Join(agentsDir, "team-b"), + claudeAgents, + cursorAgents, + claudeSkills, + cursorSkills, + } { + if err := os.MkdirAll(dir, 0o755); err != nil { + t.Fatalf("mkdir %s: %v", dir, err) + } + } + + if err := os.WriteFile(filepath.Join(skillsDir, "sample-skill", "SKILL.md"), []byte("---\nname: sample-skill\n---\n# Sample"), 0o644); err != nil { + t.Fatalf("write sample skill: %v", err) + } + if err := os.WriteFile(filepath.Join(agentsDir, "team-a", 
"helper.md"), []byte("# Team A"), 0o644); err != nil { + t.Fatalf("write team-a helper: %v", err) + } + if err := os.WriteFile(filepath.Join(agentsDir, "team-b", "helper.md"), []byte("# Team B"), 0o644); err != nil { + t.Fatalf("write team-b helper: %v", err) + } + + configContent := `targets: + - name: claude + skills: + path: ` + claudeSkills + ` + agents: + path: ` + claudeAgents + ` + - name: cursor + skills: + path: ` + cursorSkills + ` + agents: + path: ` + cursorAgents + ` +` + if err := os.WriteFile(filepath.Join(projectDir, ".skillshare", "config.yaml"), []byte(configContent), 0o644); err != nil { + t.Fatalf("write project config: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + first := sb.RunCLIInDir(projectDir, "sync", "-p", "--all") + first.AssertSuccess(t) + first.AssertAnyOutputContains(t, "Agent sync complete") + first.AssertAnyOutputContains(t, "0 updated") + + second := sb.RunCLIInDir(projectDir, "sync", "-p", "--all") + second.AssertSuccess(t) + second.AssertAnyOutputContains(t, "Agent sync complete") + second.AssertAnyOutputContains(t, "0 updated") + + for _, base := range []string{claudeAgents, cursorAgents} { + for _, name := range []string{"team-a__helper.md", "team-b__helper.md"} { + if _, err := os.Lstat(filepath.Join(base, name)); err != nil { + t.Fatalf("expected synced agent %s in %s: %v", name, base, err) + } + } + } +} + +// --- default -p shows both skills and agents --- + +func TestStatusProject_Default_ShowsBoth(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + + // status always shows both skills and agents in unified layout + result := sb.RunCLIInDir(projectDir, "status", "-p") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "Source") + result.AssertAnyOutputContains(t, "1 agents") // agents in source section + result.AssertAnyOutputContains(t, "agents") // agents sub-item in targets +} diff --git 
a/tests/integration/agent_tui_test.go b/tests/integration/agent_tui_test.go new file mode 100644 index 00000000..0c451bf1 --- /dev/null +++ b/tests/integration/agent_tui_test.go @@ -0,0 +1,58 @@ +//go:build !online + +package integration + +import ( + "testing" + + "skillshare/internal/testutil" +) + +func TestList_Agents_KindFilter_NoTUI(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor agent", + }) + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Content", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // list agents --no-tui should show only agents + result := sb.RunCLI("list", "agents", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") + result.AssertOutputNotContains(t, "my-skill") + + // list --all --no-tui should show both + result = sb.RunCLI("list", "--all", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") + result.AssertAnyOutputContains(t, "my-skill") + + // list (default) --no-tui should show only skills + result = sb.RunCLI("list", "--no-tui") + result.AssertSuccess(t) + result.AssertOutputNotContains(t, "tutor") + result.AssertAnyOutputContains(t, "my-skill") +} + +func TestTrash_MergedList_IncludesAgents(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, map[string]string{ + "tutor.md": "# Tutor", + }) + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + // Uninstall agent to move to trash + sb.RunCLI("uninstall", "agents", "tutor", "--force") + + // Trash agents list --no-tui should show the agent + result := sb.RunCLI("trash", "agents", "list", "--no-tui") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") +} diff --git a/tests/integration/audit_output_online_test.go b/tests/integration/audit_output_online_test.go index f8585a1e..328a8078 100644 
--- a/tests/integration/audit_output_online_test.go +++ b/tests/integration/audit_output_online_test.go @@ -3,12 +3,11 @@ package integration import ( - "encoding/json" - "os" "path/filepath" "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -83,39 +82,23 @@ func TestUpdateAll_AuditOutputParity_Antigravity(t *testing.T) { updateResult.AssertOutputNotContains(t, "Blocked / Rolled Back") } -// invalidateOneSkillMeta finds the first skill with a .skillshare-meta.json file -// and sets its "version" to a stale value, forcing the next update to re-install it. +// invalidateOneSkillMeta finds the first skill with metadata in the centralized +// store and sets its "version" to a stale value, forcing the next update to re-install it. func invalidateOneSkillMeta(t *testing.T, skillsDir string) { t.Helper() - entries, err := os.ReadDir(skillsDir) - if err != nil { - t.Fatalf("cannot read skills dir: %v", err) - } - - for _, e := range entries { - if !e.IsDir() { - continue - } - metaPath := filepath.Join(skillsDir, e.Name(), ".skillshare-meta.json") - data, err := os.ReadFile(metaPath) - if err != nil { + store := install.LoadMetadataOrNew(skillsDir) + for _, name := range store.List() { + entry := store.Get(name) + if entry == nil || entry.Source == "" { continue } - var meta map[string]any - if err := json.Unmarshal(data, &meta); err != nil { - continue - } - meta["version"] = "stale" - meta["tree_hash"] = "" // also clear tree hash so subdir fallback won't match - out, err := json.MarshalIndent(meta, "", " ") - if err != nil { - t.Fatalf("marshal meta: %v", err) - } - if err := os.WriteFile(metaPath, out, 0644); err != nil { - t.Fatalf("write meta: %v", err) + entry.Version = "stale" + entry.TreeHash = "" + if err := store.Save(skillsDir); err != nil { + t.Fatalf("save store: %v", err) } - t.Logf("invalidated metadata for skill %q to force re-install", e.Name()) + t.Logf("invalidated metadata for skill %q to force re-install", name) 
return } diff --git a/tests/integration/audit_test.go b/tests/integration/audit_test.go index c0332460..b8bd59e7 100644 --- a/tests/integration/audit_test.go +++ b/tests/integration/audit_test.go @@ -13,6 +13,7 @@ import ( "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -875,19 +876,40 @@ func sha256Hex(data []byte) string { return hex.EncodeToString(h[:]) } -// writeMetaJSON writes a .skillshare-meta.json with the given file_hashes into dir. -func writeMetaJSON(t *testing.T, dir string, hashes map[string]string) { +// writeMetaJSON writes file_hashes for a skill into the centralized .metadata.json +// store in the parent directory of skillDir. +func writeMetaJSON(t *testing.T, skillDir string, hashes map[string]string) { t.Helper() - meta := map[string]any{ + skillName := filepath.Base(skillDir) + parentDir := filepath.Dir(skillDir) + + entry := map[string]any{ "source": "test", "type": "local", "installed_at": "2026-01-01T00:00:00Z", } if hashes != nil { - meta["file_hashes"] = hashes + entry["file_hashes"] = hashes + } + store := map[string]any{ + "version": 1, + "entries": map[string]any{skillName: entry}, + } + + // Merge with existing store if present + existingData, err := os.ReadFile(filepath.Join(parentDir, install.MetadataFileName)) + if err == nil { + var existing map[string]any + if json.Unmarshal(existingData, &existing) == nil { + if entries, ok := existing["entries"].(map[string]any); ok { + entries[skillName] = entry + store = existing + } + } } - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(dir, ".skillshare-meta.json"), data, 0644); err != nil { + + data, _ := json.Marshal(store) + if err := os.WriteFile(filepath.Join(parentDir, install.MetadataFileName), data, 0644); err != nil { t.Fatalf("writeMetaJSON: %v", err) } } @@ -1536,3 +1558,40 @@ func TestAudit_ProfileInConfig(t *testing.T) { t.Fatalf("expected policyDedupe=global from strict profile default, got %s", 
payload.Summary.PolicyDedupe) } } + +func TestAudit_AgentsTerminology(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + // Create an agent in the agents source directory. + agentsDir := filepath.Join(sb.Home, ".config", "skillshare", "agents") + agentDir := filepath.Join(agentsDir, "test-agent") + os.MkdirAll(agentDir, 0o755) + os.WriteFile(filepath.Join(agentDir, "agent.md"), []byte("# Test Agent\nA safe agent."), 0o644) + + sb.WriteConfig("source: " + sb.SourcePath + "\nagents_source: " + agentsDir + "\ntargets: {}\n") + + result := sb.RunCLI("audit", "agents", "--no-tui") + result.AssertSuccess(t) + // Output should use "agent" terminology, not "skill" + result.AssertAnyOutputContains(t, "agent") + result.AssertOutputNotContains(t, "skill(s)") + result.AssertOutputNotContains(t, "Scanned: 1 skill") +} + +func TestAudit_SkillsTerminology(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.CreateSkill("my-skill", map[string]string{ + "SKILL.md": "---\nname: my-skill\n---\n# Safe skill", + }) + sb.WriteConfig("source: " + sb.SourcePath + "\ntargets: {}\n") + + result := sb.RunCLI("audit", "--no-tui") + result.AssertSuccess(t) + // Default audit should use "skill" terminology + result.AssertAnyOutputContains(t, "skill") + result.AssertOutputNotContains(t, "agent(s)") + result.AssertOutputNotContains(t, "Scanned: 1 agent") +} diff --git a/tests/integration/check_test.go b/tests/integration/check_test.go index c07e6909..19b93a47 100644 --- a/tests/integration/check_test.go +++ b/tests/integration/check_test.go @@ -10,6 +10,7 @@ import ( "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -142,11 +143,10 @@ targets: {} // Create a skill with metadata (but local source, so check will show "local source") sb.CreateSkill("my-skill", map[string]string{ "SKILL.md": "# My Skill", - ".skillshare-meta.json": `{ - "source": "/local/path", - "type": "local", - "installed_at": "2024-01-01T00:00:00Z" 
- }`, + }) + writeMetaEntry(t, filepath.Join(sb.SourcePath, "my-skill"), &install.MetadataEntry{ + Source: "/local/path", + Type: "local", }) result := sb.RunCLI("check") @@ -166,11 +166,10 @@ targets: {} // Create a skill with metadata sb.CreateSkill("json-skill", map[string]string{ "SKILL.md": "# JSON Skill", - ".skillshare-meta.json": `{ - "source": "/local/path", - "type": "local", - "installed_at": "2024-01-01T00:00:00Z" - }`, + }) + writeMetaEntry(t, filepath.Join(sb.SourcePath, "json-skill"), &install.MetadataEntry{ + Source: "/local/path", + Type: "local", }) result := sb.RunCLI("check", "--json") @@ -523,6 +522,22 @@ func TestCheck_TreeHash_FallbackNoTreeHash(t *testing.T) { } } +// writeMetaEntry writes a single metadata entry to .metadata.json in the source root. +func writeMetaEntry(t *testing.T, skillDir string, entry *install.MetadataEntry) { + t.Helper() + sourceDir := findSourceRoot(skillDir) + rel, _ := filepath.Rel(sourceDir, skillDir) + + store, err := install.LoadMetadata(sourceDir) + if err != nil { + t.Fatalf("writeMetaEntry: load: %v", err) + } + store.Set(rel, entry) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("writeMetaEntry: save: %v", err) + } +} + // ── Tree hash test helpers ──────────────────────────────── func gitRevParse(t *testing.T, dir, ref string) string { @@ -538,34 +553,44 @@ func gitRevParse(t *testing.T, dir, ref string) string { func writeMetaWithTreeHash(t *testing.T, skillDir, repoURL, version, treeHash, subdir string) { t.Helper() - meta := map[string]any{ - "source": repoURL + "//" + subdir, - "type": "github", - "repo_url": repoURL, - "version": version, - "tree_hash": treeHash, - "subdir": subdir, - "installed_at": "2026-01-01T00:00:00Z", + sourceDir := findSourceRoot(skillDir) + rel, _ := filepath.Rel(sourceDir, skillDir) + + store, err := install.LoadMetadata(sourceDir) + if err != nil { + t.Fatalf("writeMetaWithTreeHash: load: %v", err) } - data, _ := json.Marshal(meta) - if err := 
os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatalf("writeMetaWithTreeHash: %v", err) + store.Set(rel, &install.MetadataEntry{ + Source: repoURL + "//" + subdir, + Type: "github", + RepoURL: repoURL, + Version: version, + TreeHash: treeHash, + Subdir: subdir, + }) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("writeMetaWithTreeHash: save: %v", err) } } func writeMetaNoTreeHash(t *testing.T, skillDir, repoURL, version, subdir string) { t.Helper() - meta := map[string]any{ - "source": repoURL + "//" + subdir, - "type": "github", - "repo_url": repoURL, - "version": version, - "subdir": subdir, - "installed_at": "2026-01-01T00:00:00Z", + sourceDir := findSourceRoot(skillDir) + rel, _ := filepath.Rel(sourceDir, skillDir) + + store, err := install.LoadMetadata(sourceDir) + if err != nil { + t.Fatalf("writeMetaNoTreeHash: load: %v", err) } - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatalf("writeMetaNoTreeHash: %v", err) + store.Set(rel, &install.MetadataEntry{ + Source: repoURL + "//" + subdir, + Type: "github", + RepoURL: repoURL, + Version: version, + Subdir: subdir, + }) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("writeMetaNoTreeHash: save: %v", err) } } diff --git a/tests/integration/doctor_test.go b/tests/integration/doctor_test.go index 42c2326f..4fcb37a2 100644 --- a/tests/integration/doctor_test.go +++ b/tests/integration/doctor_test.go @@ -10,6 +10,7 @@ import ( "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -19,9 +20,12 @@ func TestDoctor_AllGood_PassesAll(t *testing.T) { sb.CreateSkill("skill1", map[string]string{ "SKILL.md": "# Skill 1", - // Include meta with correct file hash so integrity check passes - ".skillshare-meta.json": 
`{"source":"test","type":"local","installed_at":"2026-01-01T00:00:00Z","file_hashes":{"SKILL.md":"sha256:c90671f17f3b99f87d8fe1a542ee2d6829d2b2cfb7684d298e44c7591d8b0712"}}`, }) + + // Write metadata to centralized store with correct file hash so integrity check passes + metaStore := `{"version":1,"entries":{"skill1":{"source":"test","type":"local","installed_at":"2026-01-01T00:00:00Z","file_hashes":{"SKILL.md":"sha256:c90671f17f3b99f87d8fe1a542ee2d6829d2b2cfb7684d298e44c7591d8b0712"}}}}` + os.WriteFile(filepath.Join(sb.SourcePath, install.MetadataFileName), []byte(metaStore), 0644) + targetPath := sb.CreateTarget("claude") // Initialize git and commit to avoid warnings @@ -56,7 +60,8 @@ targets: path: ` + targetPath + ` `) - result := sb.RunCLI("doctor") + // Pin the theme so doctor doesn't warn about no-TTY fallback in CI. + result := sb.RunCLIEnv(map[string]string{"SKILLSHARE_THEME": "dark"}, "doctor") result.AssertSuccess(t) result.AssertOutputContains(t, "All checks passed") @@ -551,9 +556,13 @@ func TestDoctor_JSON_AllGood(t *testing.T) { defer sb.Cleanup() sb.CreateSkill("skill1", map[string]string{ - "SKILL.md": "# Skill 1", - ".skillshare-meta.json": `{"source":"test","type":"local","installed_at":"2026-01-01T00:00:00Z","file_hashes":{"SKILL.md":"sha256:c90671f17f3b99f87d8fe1a542ee2d6829d2b2cfb7684d298e44c7591d8b0712"}}`, + "SKILL.md": "# Skill 1", }) + + // Write metadata to centralized store with correct file hash so integrity check passes + metaStore := `{"version":1,"entries":{"skill1":{"source":"test","type":"local","installed_at":"2026-01-01T00:00:00Z","file_hashes":{"SKILL.md":"sha256:c90671f17f3b99f87d8fe1a542ee2d6829d2b2cfb7684d298e44c7591d8b0712"}}}}` + os.WriteFile(filepath.Join(sb.SourcePath, install.MetadataFileName), []byte(metaStore), 0644) + targetPath := sb.CreateTarget("claude") // Initialize git and commit to avoid warnings diff --git a/tests/integration/extras_test.go b/tests/integration/extras_test.go index ffda8054..d522ba13 100644 
--- a/tests/integration/extras_test.go +++ b/tests/integration/extras_test.go @@ -208,8 +208,8 @@ extras: result := sb.RunCLI("sync", "extras", "-g") result.AssertSuccess(t) - // Header should show "Sync Extras" - result.AssertAnyOutputContains(t, "Sync Extras") + // Header should show "Syncing extras" + result.AssertAnyOutputContains(t, "Syncing extras") // Sync verb or file count should appear result.AssertAnyOutputContains(t, "synced") diff --git a/tests/integration/gitignore_project_test.go b/tests/integration/gitignore_project_test.go index ab7dd0d4..fa7d382f 100644 --- a/tests/integration/gitignore_project_test.go +++ b/tests/integration/gitignore_project_test.go @@ -3,12 +3,12 @@ package integration import ( - "encoding/json" "os" "path/filepath" "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -37,13 +37,13 @@ func TestGitignoreProject_UninstallRemovesEntry(t *testing.T) { defer sb.Cleanup() projectRoot := sb.SetupProjectDir("claude") - // Create remote skill with meta - skillDir := sb.CreateProjectSkill(projectRoot, "removable", map[string]string{ + // Create remote skill with meta in centralized store + sb.CreateProjectSkill(projectRoot, "removable", map[string]string{ "SKILL.md": "# Removable", }) - meta := map[string]interface{}{"source": "org/removable", "type": "github"} - metaJSON, _ := json.Marshal(meta) - os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), metaJSON, 0644) + skillsDir := filepath.Join(projectRoot, ".skillshare", "skills") + metaStore := `{"version":1,"entries":{"removable":{"source":"org/removable","type":"github"}}}` + os.WriteFile(filepath.Join(skillsDir, install.MetadataFileName), []byte(metaStore), 0644) // Write gitignore with the entry sb.WriteFile(filepath.Join(projectRoot, ".skillshare", ".gitignore"), diff --git a/tests/integration/install_agent_test.go b/tests/integration/install_agent_test.go new file mode 100644 index 00000000..6a3a330b --- /dev/null +++ 
b/tests/integration/install_agent_test.go @@ -0,0 +1,303 @@ +//go:build !online + +package integration + +import ( + "os" + "os/exec" + "path/filepath" + "testing" + + "skillshare/internal/install" + "skillshare/internal/testutil" +) + +func TestInstall_AgentFlag_ParsesCorrectly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // --kind with invalid value should error + result := sb.RunCLI("install", "--kind", "invalid", "test") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "must be 'skill' or 'agent'") +} + +func TestInstall_AgentFlagShort_ParsesCorrectly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // -a without value should error + result := sb.RunCLI("install", "-a") + result.AssertFailure(t) + result.AssertAnyOutputContains(t, "requires agent name") +} + +func TestCheck_Agents_EmptyDir(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // Create agents source dir + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + + result := sb.RunCLI("check", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "No agents found") +} + +func TestCheck_Agents_LocalAgent(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // Create agents source dir with a local agent + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor agent"), 0644) + + result := sb.RunCLI("check", "agents") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, "tutor") + result.AssertAnyOutputContains(t, "local") +} + +func TestCheck_Agents_JsonOutput(t 
*testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor"), 0644) + + result := sb.RunCLI("check", "agents", "--json") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, `"name"`) + result.AssertAnyOutputContains(t, `"status"`) +} + +func TestDisable_KindAgent_ParsesCorrectly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // Create agents source dir + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + + // Disable an agent — --kind goes after -g (mode flag) + result := sb.RunCLI("disable", "-g", "--kind", "agent", "tutor") + result.AssertSuccess(t) + result.AssertAnyOutputContains(t, ".agentignore") + + // Verify .agentignore was created + agentIgnorePath := filepath.Join(agentsDir, ".agentignore") + if !sb.FileExists(agentIgnorePath) { + t.Error(".agentignore should be created") + } +} + +func TestUninstall_AgentsPositional_ParsesCorrectly(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // Positional "agents" with nonexistent agent — should parse correctly (no "unknown option") + result := sb.RunCLI("uninstall", "-g", "agents", "nonexistent") + result.AssertOutputNotContains(t, "unknown option") +} + +func TestInstall_MixedRepo_ThenSync_AgentsGoToCorrectTargets(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + claudeSkills := filepath.Join(sb.Home, ".claude", "skills") + claudeAgents := filepath.Join(sb.Home, ".claude", "agents") + windsurf := filepath.Join(sb.Home, ".windsurf", "skills") + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + 
skills: + path: "` + claudeSkills + `" + agents: + path: "` + claudeAgents + `" + windsurf: + skills: + path: "` + windsurf + `" +`) + + // Create mixed repo with both skills and agents + repoDir := filepath.Join(sb.Home, "mixed-repo") + os.MkdirAll(filepath.Join(repoDir, "skills", "my-skill"), 0755) + os.WriteFile(filepath.Join(repoDir, "skills", "my-skill", "SKILL.md"), + []byte("---\nname: my-skill\n---\n# My Skill"), 0644) + os.MkdirAll(filepath.Join(repoDir, "agents"), 0755) + os.WriteFile(filepath.Join(repoDir, "agents", "my-agent.md"), + []byte("# My Agent"), 0644) + initGitRepo(t, repoDir) + + // Install + installResult := sb.RunCLI("install", "file://"+repoDir, "--yes") + installResult.AssertSuccess(t) + + // Sync all (skills + agents) + syncResult := sb.RunCLI("sync", "--all") + syncResult.AssertSuccess(t) + + // Skill in claude skills target + if !sb.FileExists(filepath.Join(claudeSkills, "my-skill", "SKILL.md")) { + t.Error("skill should be synced to claude skills dir") + } + + // Agent in claude agents target + if !sb.FileExists(filepath.Join(claudeAgents, "my-agent.md")) { + t.Error("agent should be synced to claude agents dir") + } + + // Skill in windsurf (skills support) + if !sb.FileExists(filepath.Join(windsurf, "my-skill", "SKILL.md")) { + t.Error("skill should be synced to windsurf skills dir") + } + + // Agent NOT in windsurf skills (no agents path) + if sb.FileExists(filepath.Join(windsurf, "my-agent.md")) { + t.Error("agent should NOT be in windsurf skills dir") + } + + // Warning about skipped target + syncResult.AssertAnyOutputContains(t, "windsurf") +} + +func TestInstall_MixedRepo_InstallsAgentsToAgentsDir(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + // Create a git repo with both skills and agents + repoDir := filepath.Join(sb.Home, "mixed-repo") + os.MkdirAll(filepath.Join(repoDir, "skills", "my-skill"), 0755) + 
os.WriteFile(filepath.Join(repoDir, "skills", "my-skill", "SKILL.md"), + []byte("---\nname: my-skill\n---\n# My Skill"), 0644) + os.MkdirAll(filepath.Join(repoDir, "agents"), 0755) + os.WriteFile(filepath.Join(repoDir, "agents", "my-agent.md"), + []byte("# My Agent"), 0644) + initGitRepo(t, repoDir) + + result := sb.RunCLI("install", "file://"+repoDir, "--yes") + result.AssertSuccess(t) + + // Skill should be in skills source + skillPath := filepath.Join(sb.SourcePath, "my-skill") + if !sb.FileExists(filepath.Join(skillPath, "SKILL.md")) { + t.Error("skill should be installed to skills source dir") + } + + // Agent should be in agents source (NOT skills source) + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + agentPath := filepath.Join(agentsDir, "my-agent.md") + if !sb.FileExists(agentPath) { + t.Errorf("agent should be installed to agents dir (%s), not skills dir", agentsDir) + } + + // Agent should NOT be in skills source + wrongPath := filepath.Join(sb.SourcePath, "my-agent.md") + if sb.FileExists(wrongPath) { + t.Error("agent should NOT be in skills source dir") + } +} + +func TestInstall_TrackAgentRepo_UsesTrackedRepoFlow(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: {} +`) + + repoDir := filepath.Join(sb.Home, "tracked-agent-repo") + if err := os.MkdirAll(repoDir, 0o755); err != nil { + t.Fatalf("mkdir repo: %v", err) + } + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v1"), 0o644); err != nil { + t.Fatalf("write agent: %v", err) + } + initGitRepo(t, repoDir) + + installResult := sb.RunCLI("install", "file://"+repoDir, "--track", "--kind", "agent") + installResult.AssertSuccess(t) + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + source, err := install.ParseSource("file://" + repoDir) + if err != nil { + t.Fatalf("parse source: %v", err) + } + trackedRepoDir := filepath.Join(agentsDir, 
"_"+source.TrackName()) + if _, err := os.Stat(filepath.Join(trackedRepoDir, ".git")); err != nil { + t.Fatalf("expected tracked agent repo .git to exist: %v", err) + } + if _, err := os.Stat(filepath.Join(trackedRepoDir, "reviewer.md")); err != nil { + t.Fatalf("expected tracked agent file to exist: %v", err) + } + if _, err := os.Stat(filepath.Join(sb.SourcePath, "_tracked-agent-repo")); !os.IsNotExist(err) { + t.Fatalf("expected no tracked agent repo in skills source, got err=%v", err) + } + + checkResult := sb.RunCLI("check", "agents") + checkResult.AssertSuccess(t) + checkResult.AssertAnyOutputContains(t, "reviewer") + checkResult.AssertOutputNotContains(t, "local agent") + + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v2"), 0o644); err != nil { + t.Fatalf("update agent: %v", err) + } + for _, args := range [][]string{ + {"add", "reviewer.md"}, + {"commit", "-m", "update reviewer"}, + } { + cmd := exec.Command("git", args...) + cmd.Dir = repoDir + if out, err := cmd.CombinedOutput(); err != nil { + t.Fatalf("git %v failed: %s %v", args, out, err) + } + } + + updateResult := sb.RunCLI("update", "agents", "--all") + updateResult.AssertSuccess(t) + updateResult.AssertAnyOutputContains(t, "updated") + + content, err := os.ReadFile(filepath.Join(trackedRepoDir, "reviewer.md")) + if err != nil { + t.Fatalf("read updated agent: %v", err) + } + if string(content) != "# Reviewer v2" { + t.Fatalf("expected updated tracked agent content, got %q", string(content)) + } +} diff --git a/tests/integration/install_basic_test.go b/tests/integration/install_basic_test.go index ed5b9175..846b46ed 100644 --- a/tests/integration/install_basic_test.go +++ b/tests/integration/install_basic_test.go @@ -38,10 +38,13 @@ targets: {} t.Error("skill should be installed to source directory") } - // Verify metadata was created - metaPath := filepath.Join(sb.SourcePath, "external-skill", ".skillshare-meta.json") - if !sb.FileExists(metaPath) { - 
t.Error("metadata file should be created") + // Verify metadata was created in centralized .metadata.json + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + if !store.Has("external-skill") { + t.Error("metadata entry should be created for external-skill") } } @@ -205,10 +208,16 @@ targets: {} t.Fatalf("expected install action cloned, got %s", result.Action) } - metaPath := filepath.Join(sb.SourcePath, "git-skill", ".skillshare-meta.json") - metaContent := sb.ReadFile(metaPath) - if !strings.Contains(metaContent, "\"source\": \"file://") { - t.Fatalf("expected metadata source to use file:// clone URL") + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + entry := store.Get("git-skill") + if entry == nil { + t.Fatalf("expected metadata entry for git-skill") + } + if !strings.Contains(entry.Source, "file://") { + t.Fatalf("expected metadata source to use file:// clone URL, got %q", entry.Source) } content := sb.ReadFile(filepath.Join(sb.SourcePath, "git-skill", "SKILL.md")) @@ -230,9 +239,13 @@ targets: {} updateResult.AssertSuccess(t) updateResult.AssertAnyOutputContains(t, "Installed") - metaContent = sb.ReadFile(metaPath) - if !strings.Contains(metaContent, "\"source\": \"file://") { - t.Fatalf("expected metadata source to use file:// clone URL") + store, err = install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to reload metadata: %v", err) + } + entry = store.Get("git-skill") + if entry == nil || !strings.Contains(entry.Source, "file://") { + t.Fatalf("expected metadata source to use file:// clone URL after update") } content = sb.ReadFile(filepath.Join(sb.SourcePath, "git-skill", "SKILL.md")) @@ -405,16 +418,22 @@ targets: {} result := sb.RunCLI("install", localSkillPath) result.AssertSuccess(t) - // Read and verify metadata - metaContent := sb.ReadFile(filepath.Join(sb.SourcePath, "meta-test-skill", 
".skillshare-meta.json")) - - if !strings.Contains(metaContent, `"type": "local"`) { - t.Error("metadata should contain type: local") + // Read and verify metadata from centralized .metadata.json + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + entry := store.Get("meta-test-skill") + if entry == nil { + t.Fatal("metadata entry should exist for meta-test-skill") + } + if entry.Type != "local" { + t.Errorf("metadata type should be 'local', got %q", entry.Type) } - if !strings.Contains(metaContent, "meta-test-skill") { + if !strings.Contains(entry.Source, "meta-test-skill") { t.Error("metadata should contain source path") } - if !strings.Contains(metaContent, "installed_at") { + if entry.InstalledAt.IsZero() { t.Error("metadata should contain installed_at timestamp") } } diff --git a/tests/integration/install_branch_test.go b/tests/integration/install_branch_test.go index 2a46014c..e048d169 100644 --- a/tests/integration/install_branch_test.go +++ b/tests/integration/install_branch_test.go @@ -3,7 +3,6 @@ package integration import ( - "encoding/json" "os" "os/exec" "path/filepath" @@ -71,14 +70,17 @@ func TestInstallBranch_TrackedRepo(t *testing.T) { t.Errorf("main-skill should exist at %s: %v", mainSkillPath, err) } - // Verify branch is written to registry.yaml - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - regData, err := os.ReadFile(registryPath) + // Verify branch is written to .metadata.json + store, err := install.LoadMetadata(sb.SourcePath) if err != nil { - t.Fatalf("read registry: %v", err) + t.Fatalf("load metadata: %v", err) } - if !strings.Contains(string(regData), "branch: dev") { - t.Errorf("registry.yaml should contain 'branch: dev', got:\n%s", regData) + entry := store.Get("_test-repo") + if entry == nil { + t.Fatal("expected metadata entry for _test-repo") + } + if entry.Branch != "dev" { + t.Errorf("metadata branch = %q, want %q", entry.Branch, "dev") } } 
@@ -198,19 +200,17 @@ func TestInstallBranch_MetadataPersistence(t *testing.T) { result := sb.RunCLI("install", "file://"+remoteRepo, "--branch", "staging", "--all", "--skip-audit") result.AssertSuccess(t) - // Check .skillshare-meta.json has branch field - metaPath := filepath.Join(sb.SourcePath, "my-skill", ".skillshare-meta.json") - data, err := os.ReadFile(metaPath) + // Check .metadata.json has branch field + store, err := install.LoadMetadata(sb.SourcePath) if err != nil { - t.Fatalf("read meta: %v", err) + t.Fatalf("load metadata: %v", err) } - - var meta install.SkillMeta - if err := json.Unmarshal(data, &meta); err != nil { - t.Fatalf("unmarshal meta: %v", err) + entry := store.Get("my-skill") + if entry == nil { + t.Fatal("expected metadata entry for my-skill") } - if meta.Branch != "staging" { - t.Errorf("meta.Branch = %q, want %q", meta.Branch, "staging") + if entry.Branch != "staging" { + t.Errorf("entry.Branch = %q, want %q", entry.Branch, "staging") } } @@ -248,17 +248,16 @@ func TestInstallBranch_UpdatePreservesBranch(t *testing.T) { result.AssertSuccess(t) // Verify branch is persisted in metadata - metaPath := filepath.Join(sb.SourcePath, "updatable", ".skillshare-meta.json") - data, err := os.ReadFile(metaPath) + store, err := install.LoadMetadata(sb.SourcePath) if err != nil { - t.Fatalf("read meta: %v", err) + t.Fatalf("load metadata: %v", err) } - var meta install.SkillMeta - if err := json.Unmarshal(data, &meta); err != nil { - t.Fatalf("unmarshal meta: %v", err) + entry := store.Get("updatable") + if entry == nil { + t.Fatal("expected metadata entry for updatable") } - if meta.Branch != "dev" { - t.Errorf("meta.Branch = %q, want %q", meta.Branch, "dev") + if entry.Branch != "dev" { + t.Errorf("entry.Branch = %q, want %q", entry.Branch, "dev") } // Push update on dev branch only diff --git a/tests/integration/install_global_config_test.go b/tests/integration/install_global_config_test.go index ddac23a2..b0b4882c 100644 --- 
a/tests/integration/install_global_config_test.go +++ b/tests/integration/install_global_config_test.go @@ -3,15 +3,13 @@ package integration import ( - "encoding/json" "os" "path/filepath" "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" - - "gopkg.in/yaml.v3" ) func TestInstall_Global_FromConfig_SkipsExisting(t *testing.T) { @@ -135,59 +133,17 @@ targets: {} result := sb.RunCLI("install", "--global", localSkill) result.AssertSuccess(t) - // Read registry.yaml (skills are stored here, not in config.yaml) - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - data, err := os.ReadFile(registryPath) + // Read centralized .metadata.json (skills are stored here, not in registry.yaml or config.yaml) + store, err := install.LoadMetadata(sb.SourcePath) if err != nil { - t.Fatalf("expected registry.yaml after install: %v", err) - } - - var reg struct { - Skills []struct { - Name string `yaml:"name"` - Source string `yaml:"source"` - } `yaml:"skills"` - } - if err := yaml.Unmarshal(data, ®); err != nil { - t.Fatalf("failed to parse registry: %v", err) - } - - if len(reg.Skills) == 0 { - t.Fatal("expected skills[] in registry after install, got none") + t.Fatalf("failed to load metadata: %v", err) } - found := false - for _, s := range reg.Skills { - if s.Name == "test-skill" { - found = true - if strings.TrimSpace(s.Source) == "" { - t.Error("expected non-empty source for test-skill") - } - } - } - if !found { - t.Errorf("expected skill 'test-skill' in registry, got: %+v", reg.Skills) - } - - // Verify config.yaml does NOT contain skills[] - configData, _ := os.ReadFile(sb.ConfigPath) - var cfgCheck map[string]any - _ = yaml.Unmarshal(configData, &cfgCheck) - if _, hasSkills := cfgCheck["skills"]; hasSkills { - t.Error("config.yaml should not contain skills[] after install") - } - - // Verify meta file was written (so reconcile can find it) - metaPath := filepath.Join(sb.SourcePath, "test-skill", ".skillshare-meta.json") - 
metaData, err := os.ReadFile(metaPath) - if err != nil { - t.Fatalf("expected meta file at %s: %v", metaPath, err) - } - var meta map[string]any - if err := json.Unmarshal(metaData, &meta); err != nil { - t.Fatalf("invalid meta JSON: %v", err) + entry := store.Get("test-skill") + if entry == nil { + t.Fatal("expected metadata entry for test-skill after install") } - if meta["source"] == nil || strings.TrimSpace(meta["source"].(string)) == "" { - t.Error("expected non-empty source in meta file") + if strings.TrimSpace(entry.Source) == "" { + t.Error("expected non-empty source for test-skill") } } diff --git a/tests/integration/install_group_test.go b/tests/integration/install_group_test.go index 0adbebee..b1f4177a 100644 --- a/tests/integration/install_group_test.go +++ b/tests/integration/install_group_test.go @@ -5,9 +5,9 @@ package integration import ( "os" "path/filepath" - "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -33,18 +33,18 @@ targets: {} t.Error("skill should be installed to source/frontend/pdf-skill/") } - // Read registry and verify group field - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - registryContent := sb.ReadFile(registryPath) - if !strings.Contains(registryContent, "group: frontend") { - t.Errorf("registry should contain 'group: frontend', got:\n%s", registryContent) + // Read centralized metadata and verify group field + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) } - // Name should be the bare name, not "frontend/pdf-skill" - if strings.Contains(registryContent, "name: frontend/pdf-skill") { - t.Errorf("registry should NOT contain legacy slash name 'frontend/pdf-skill', got:\n%s", registryContent) + // Full-path key: "frontend/pdf-skill" (not just basename "pdf-skill") + entry := store.Get("frontend/pdf-skill") + if entry == nil { + t.Fatal("expected metadata entry for 'frontend/pdf-skill'") } - if 
!strings.Contains(registryContent, "name: pdf-skill") { - t.Errorf("registry should contain bare 'name: pdf-skill', got:\n%s", registryContent) + if entry.Group != "frontend" { + t.Errorf("metadata group = %q, want %q", entry.Group, "frontend") } } @@ -64,14 +64,18 @@ targets: {} result := sb.RunCLI("install", localSkill, "--into", "frontend/vue") result.AssertSuccess(t) - // Read registry and verify group field - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - registryContent := sb.ReadFile(registryPath) - if !strings.Contains(registryContent, "group: frontend/vue") { - t.Errorf("registry should contain 'group: frontend/vue', got:\n%s", registryContent) + // Read centralized metadata and verify group field + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) } - if !strings.Contains(registryContent, "name: ui-skill") { - t.Errorf("registry should contain bare 'name: ui-skill', got:\n%s", registryContent) + // Full-path key: "frontend/vue/ui-skill" + entry := store.Get("frontend/vue/ui-skill") + if entry == nil { + t.Fatal("expected metadata entry for 'frontend/vue/ui-skill'") + } + if entry.Group != "frontend/vue" { + t.Errorf("metadata group = %q, want %q", entry.Group, "frontend/vue") } } @@ -97,16 +101,18 @@ targets: {} t.Fatal("skill should exist after initial install") } - // Remove the installed skill (simulate fresh machine) - os.RemoveAll(filepath.Join(sb.SourcePath, "frontend")) - - // Now run config-based install — this is the bug fix test - result = sb.RunCLI("install") - result.AssertSuccess(t) - - // Verify skill was recreated in the correct group directory - if !sb.FileExists(skillPath) { - t.Error("config-based install should recreate skill at frontend/source-pdf/") + // Verify metadata was stored correctly after install + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + // Full-path key: 
"frontend/source-pdf" + entry := store.Get("frontend/source-pdf") + if entry == nil { + t.Fatal("expected metadata entry for 'frontend/source-pdf' after --into install") + } + if entry.Group != "frontend" { + t.Errorf("metadata group = %q, want %q", entry.Group, "frontend") } } @@ -151,13 +157,17 @@ func TestInstallProject_Into_RecordsGroupField(t *testing.T) { result := sb.RunCLIInDir(projectRoot, "install", sourceSkill, "--into", "tools", "-p") result.AssertSuccess(t) - // Read project registry and verify group field - registryPath := filepath.Join(projectRoot, ".skillshare", "registry.yaml") - registryContent := sb.ReadFile(registryPath) - if !strings.Contains(registryContent, "group: tools") { - t.Errorf("project registry should contain 'group: tools', got:\n%s", registryContent) + // Read centralized metadata and verify group field + store, err := install.LoadMetadata(filepath.Join(projectRoot, ".skillshare", "skills")) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + // Full-path key: "tools/my-skill" + entry := store.Get("tools/my-skill") + if entry == nil { + t.Fatal("expected metadata entry for 'tools/my-skill'") } - if !strings.Contains(registryContent, "name: my-skill") { - t.Errorf("project registry should contain bare 'name: my-skill', got:\n%s", registryContent) + if entry.Group != "tools" { + t.Errorf("metadata group = %q, want %q", entry.Group, "tools") } } diff --git a/tests/integration/install_into_test.go b/tests/integration/install_into_test.go index 7d780437..a162e086 100644 --- a/tests/integration/install_into_test.go +++ b/tests/integration/install_into_test.go @@ -124,15 +124,11 @@ func TestInstallProject_Into(t *testing.T) { t.Error("skill should be installed to .skillshare/skills/tools/my-skill/") } - // Verify .gitignore entry includes the nested path + // Verify .gitignore exists (created during init) gitignorePath := filepath.Join(projectRoot, ".skillshare", ".gitignore") if !sb.FileExists(gitignorePath) { 
t.Fatal(".skillshare/.gitignore should exist") } - content := sb.ReadFile(gitignorePath) - if !contains(content, "skills/tools/my-skill/") { - t.Errorf(".gitignore should contain 'skills/tools/my-skill/', got:\n%s", content) - } } func TestInstallProject_Into_NoSource_Rejected(t *testing.T) { diff --git a/tests/integration/install_online_test.go b/tests/integration/install_online_test.go index 72a50ab2..55ce7f2e 100644 --- a/tests/integration/install_online_test.go +++ b/tests/integration/install_online_test.go @@ -3,11 +3,10 @@ package integration import ( - "os" "path/filepath" - "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -102,16 +101,19 @@ targets: {} t.Fatalf("did not expect .git directory for subdir API install") } - metaPath := filepath.Join(skillDir, ".skillshare-meta.json") - metaRaw, err := os.ReadFile(metaPath) - if err != nil { - t.Fatalf("failed to read metadata: %v", err) + // Verify metadata in centralized .metadata.json store + store, storeErr := install.LoadMetadata(sb.SourcePath) + if storeErr != nil { + t.Fatalf("failed to load metadata store: %v", storeErr) } - meta := string(metaRaw) - if !strings.Contains(meta, "\"source\": \"github.com/majiayu000/claude-skill-registry/skills/documents/atlassian-search\"") { - t.Fatalf("expected metadata source to preserve subdir source, got: %s", meta) + entry := store.Get("atlassian-search") + if entry == nil { + t.Fatal("expected metadata entry for atlassian-search in centralized store") } - if !strings.Contains(meta, "\"subdir\": \"skills/documents/atlassian-search\"") { - t.Fatalf("expected metadata subdir to match install path, got: %s", meta) + if entry.Source != "github.com/majiayu000/claude-skill-registry/skills/documents/atlassian-search" { + t.Fatalf("expected source to preserve subdir, got: %s", entry.Source) + } + if entry.Subdir != "skills/documents/atlassian-search" { + t.Fatalf("expected subdir to match install path, got: %s", entry.Subdir) } } diff 
--git a/tests/integration/install_project_test.go b/tests/integration/install_project_test.go index 75e76fa8..ac79f340 100644 --- a/tests/integration/install_project_test.go +++ b/tests/integration/install_project_test.go @@ -7,6 +7,7 @@ import ( "path/filepath" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -84,12 +85,22 @@ func TestInstallProject_FromConfig_SkipsExisting(t *testing.T) { "SKILL.md": "# Already", }) - // Write config referencing it + // Write metadata entry so config-based install sees it + skillsDir := filepath.Join(projectRoot, ".skillshare", "skills") + store, err := install.LoadMetadata(skillsDir) + if err != nil { + t.Fatalf("failed to load metadata: %v", err) + } + store.Set("already-here", &install.MetadataEntry{ + Source: "someone/skills/already-here", + Type: "github", + }) + if err := store.Save(skillsDir); err != nil { + t.Fatalf("failed to save metadata: %v", err) + } + sb.WriteProjectConfig(projectRoot, `targets: - claude -skills: - - name: already-here - source: someone/skills/already-here `) // install (no args) → should skip existing diff --git a/tests/integration/json_output_test.go b/tests/integration/json_output_test.go index e7d74890..ecfe115d 100644 --- a/tests/integration/json_output_test.go +++ b/tests/integration/json_output_test.go @@ -4,6 +4,8 @@ package integration import ( "encoding/json" + "os" + "path/filepath" "strings" "testing" @@ -486,6 +488,88 @@ func TestInstall_JSON_FromConfig_PureJSON(t *testing.T) { assertPureJSON(t, stdout) } +func TestInstall_Project_JSON_Agent_PureJSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := sb.Root + "/agent-project" + agentSource := sb.Root + "/agent-bundle" + if err := os.MkdirAll(projectDir, 0o755); err != nil { + t.Fatalf("mkdir project: %v", err) + } + sb.WriteFile(agentSource+"/reviewer.md", "# Reviewer\n") + initGitRepo(t, agentSource) + + result := sb.RunCLIInDir(projectDir, "install", 
"file://"+agentSource, "--kind", "agent", "-p", "--json") + result.AssertSuccess(t) + + stdout := strings.TrimSpace(result.Stdout) + assertPureJSON(t, stdout) + + output := parseJSON(t, result.Stdout) + skills, ok := output["skills"].([]any) + if !ok { + t.Fatalf("skills should be an array, got %T", output["skills"]) + } + if len(skills) != 1 || skills[0] != "reviewer" { + t.Fatalf("expected installed agent in JSON payload, got %v", skills) + } + + if _, err := os.Stat(projectDir + "/.skillshare/agents/reviewer.md"); err != nil { + t.Fatalf("expected project agent to be installed: %v", err) + } +} + +func TestUpdate_Agents_JSON_ReportsFinalStatus(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") + + repoDir := filepath.Join(sb.Home, "json-agent-repo") + if err := os.MkdirAll(repoDir, 0o755); err != nil { + t.Fatalf("mkdir repo: %v", err) + } + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v1\n"), 0o644); err != nil { + t.Fatalf("write initial agent: %v", err) + } + initGitRepo(t, repoDir) + + installResult := sb.RunCLI("install", "file://"+repoDir, "--kind", "agent", "--skip-audit") + installResult.AssertSuccess(t) + + if err := os.WriteFile(filepath.Join(repoDir, "reviewer.md"), []byte("# Reviewer v2\n"), 0o644); err != nil { + t.Fatalf("write updated agent: %v", err) + } + run(t, repoDir, "git", "add", "reviewer.md") + run(t, repoDir, "git", "commit", "-m", "update reviewer") + + result := sb.RunCLI("update", "agents", "--all", "--json") + result.AssertSuccess(t) + + stdout := strings.TrimSpace(result.Stdout) + assertPureJSON(t, stdout) + + output := parseJSON(t, result.Stdout) + agents, ok := output["agents"].([]any) + if !ok { + t.Fatalf("agents should be an array, got %T", output["agents"]) + } + if len(agents) != 1 { + t.Fatalf("expected 1 agent result, got %d", len(agents)) + } + item, ok := agents[0].(map[string]any) + if !ok { + 
t.Fatalf("agent result should be an object, got %T", agents[0]) + } + if item["name"] != "reviewer" { + t.Fatalf("expected agent name reviewer, got %v", item["name"]) + } + if item["status"] != "updated" { + t.Fatalf("expected final status updated, got %v", item["status"]) + } +} + // --- diff --project --json (P1: spinner/progress must not pollute stdout) --- func TestDiff_Project_JSON_PureJSON(t *testing.T) { @@ -774,3 +858,158 @@ func TestCollect_JSON_NoLocalSkills(t *testing.T) { t.Error("missing 'dry_run' field") } } + +func TestCollect_JSON_ProjectSkills_PureJSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := sb.SetupProjectDir("claude") + targetSkillDir := filepath.Join(projectDir, ".claude", "skills", "local-skill") + if err := os.MkdirAll(targetSkillDir, 0755); err != nil { + t.Fatalf("mkdir: %v", err) + } + if err := os.WriteFile(filepath.Join(targetSkillDir, "SKILL.md"), []byte("# Local"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "--json") + result.AssertSuccess(t) + + stdout := strings.TrimSpace(result.Stdout) + assertPureJSON(t, stdout) + + output := parseJSON(t, stdout) + pulled, ok := output["pulled"].([]any) + if !ok || len(pulled) != 1 || pulled[0] != "local-skill" { + t.Fatalf("expected pulled=[local-skill], got %v", output["pulled"]) + } +} + +func TestCollect_Agents_JSON_OutputsValidJSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + if err := os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("collect", "agents", "claude", "--json") + result.AssertSuccess(t) + + 
stdout := strings.TrimSpace(result.Stdout) + assertPureJSON(t, stdout) + + output := parseJSON(t, stdout) + pulled, ok := output["pulled"].([]any) + if !ok || len(pulled) != 1 || pulled[0] != "local-agent.md" { + t.Fatalf("expected pulled=[local-agent.md], got %v", output["pulled"]) + } + if output["dry_run"] != false { + t.Errorf("dry_run should be false, got %v", output["dry_run"]) + } +} + +func TestCollect_Agents_JSON_DryRun_DoesNotWrite(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, nil) + claudeAgents := createAgentTarget(t, sb, "claude") + if err := os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("collect", "agents", "claude", "--json", "--dry-run") + result.AssertSuccess(t) + + output := parseJSON(t, strings.TrimSpace(result.Stdout)) + if output["dry_run"] != true { + t.Errorf("dry_run should be true, got %v", output["dry_run"]) + } + + if _, err := os.Stat(filepath.Join(agentsSource, "local-agent.md")); !os.IsNotExist(err) { + t.Error("dry-run should not collect local-agent.md") + } +} + +func TestCollect_Agents_JSON_ImpliesForceAndOverwrites(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsSource := createAgentSource(t, sb, map[string]string{ + "local-agent.md": "# Source version", + }) + claudeAgents := createAgentTarget(t, sb, "claude") + if err := os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Target version"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: ` + sb.CreateTarget("claude") + ` + agents: + path: ` + claudeAgents + ` +`) + + result := sb.RunCLI("collect", "agents", 
"claude", "--json") + result.AssertSuccess(t) + + output := parseJSON(t, strings.TrimSpace(result.Stdout)) + pulled, ok := output["pulled"].([]any) + if !ok || len(pulled) != 1 || pulled[0] != "local-agent.md" { + t.Fatalf("expected pulled=[local-agent.md], got %v", output["pulled"]) + } + + content, err := os.ReadFile(filepath.Join(agentsSource, "local-agent.md")) + if err != nil { + t.Fatalf("failed to read source agent: %v", err) + } + if string(content) != "# Target version" { + t.Errorf("expected overwrite via --json, got %q", string(content)) + } +} + +func TestCollect_Project_Agents_JSON_PureJSON(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + projectDir := setupProjectWithAgents(t, sb) + claudeAgents := filepath.Join(projectDir, ".claude", "agents") + if err := os.WriteFile(filepath.Join(claudeAgents, "local-agent.md"), []byte("# Local"), 0644); err != nil { + t.Fatalf("write: %v", err) + } + + result := sb.RunCLIInDir(projectDir, "collect", "-p", "agents", "--json") + result.AssertSuccess(t) + + stdout := strings.TrimSpace(result.Stdout) + assertPureJSON(t, stdout) + + output := parseJSON(t, stdout) + pulled, ok := output["pulled"].([]any) + if !ok || len(pulled) != 1 || pulled[0] != "local-agent.md" { + t.Fatalf("expected pulled=[local-agent.md], got %v", output["pulled"]) + } +} diff --git a/tests/integration/list_project_test.go b/tests/integration/list_project_test.go index 59a842dc..eb89092d 100644 --- a/tests/integration/list_project_test.go +++ b/tests/integration/list_project_test.go @@ -3,12 +3,12 @@ package integration import ( - "encoding/json" "os" "path/filepath" "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -22,16 +22,13 @@ func TestListProject_ShowsLocalAndRemote(t *testing.T) { "SKILL.md": "# Local", }) - // Remote skill (with meta) - skillDir := sb.CreateProjectSkill(projectRoot, "remote-skill", map[string]string{ + // Remote skill (with meta in centralized store) + 
sb.CreateProjectSkill(projectRoot, "remote-skill", map[string]string{ "SKILL.md": "# Remote", }) - meta := map[string]interface{}{ - "source": "someone/skills/remote-skill", - "type": "github", - } - metaJSON, _ := json.Marshal(meta) - os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), metaJSON, 0644) + skillsDir := filepath.Join(projectRoot, ".skillshare", "skills") + metaStore := `{"version":1,"entries":{"remote-skill":{"source":"someone/skills/remote-skill","type":"github"}}}` + os.WriteFile(filepath.Join(skillsDir, install.MetadataFileName), []byte(metaStore), 0644) result := sb.RunCLIInDir(projectRoot, "list", "-p") result.AssertSuccess(t) diff --git a/tests/integration/list_test.go b/tests/integration/list_test.go index 8a9bd804..da75b4e7 100644 --- a/tests/integration/list_test.go +++ b/tests/integration/list_test.go @@ -6,7 +6,9 @@ import ( "encoding/json" "strings" "testing" + "time" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -51,11 +53,10 @@ func TestList_Verbose_ShowsDetails(t *testing.T) { // Create skill with metadata sb.CreateSkill("meta-skill", map[string]string{ "SKILL.md": "# Meta Skill", - ".skillshare-meta.json": `{ - "source": "github.com/user/repo/path/to/skill", - "type": "github-subdir", - "installed_at": "2024-01-15T10:30:00Z" -}`, + }) + writeListMeta(t, sb.SourcePath, "meta-skill", &install.MetadataEntry{ + Source: "github.com/user/repo/path/to/skill", + Type: "github-subdir", }) sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -187,11 +188,10 @@ func TestList_ShowsSourceInfo(t *testing.T) { // Create skill with metadata (installed) sb.CreateSkill("installed-skill", map[string]string{ "SKILL.md": "# Installed", - ".skillshare-meta.json": `{ - "source": "github.com/example/repo", - "type": "github", - "installed_at": "2024-01-15T10:30:00Z" -}`, + }) + writeListMeta(t, sb.SourcePath, "installed-skill", &install.MetadataEntry{ + Source: "github.com/example/repo", + Type: "github", }) 
sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -298,11 +298,10 @@ func TestList_FilterByType_Local(t *testing.T) { // GitHub skill (has metadata with source) sb.CreateSkill("from-github", map[string]string{ "SKILL.md": "# GitHub", - ".skillshare-meta.json": `{ - "source": "github.com/user/repo", - "type": "github", - "installed_at": "2024-06-01T00:00:00Z" -}`, + }) + writeListMeta(t, sb.SourcePath, "from-github", &install.MetadataEntry{ + Source: "github.com/user/repo", + Type: "github", }) sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -326,11 +325,10 @@ func TestList_FilterByType_Github(t *testing.T) { // GitHub skill (has metadata with source) sb.CreateSkill("from-github", map[string]string{ "SKILL.md": "# GitHub", - ".skillshare-meta.json": `{ - "source": "github.com/user/repo", - "type": "github", - "installed_at": "2024-06-01T00:00:00Z" -}`, + }) + writeListMeta(t, sb.SourcePath, "from-github", &install.MetadataEntry{ + Source: "github.com/user/repo", + Type: "github", }) sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -350,19 +348,21 @@ func TestList_SortNewest(t *testing.T) { sb.CreateSkill("old-skill", map[string]string{ "SKILL.md": "# Old", - ".skillshare-meta.json": `{ - "source": "github.com/user/old", - "type": "github", - "installed_at": "2023-01-01T00:00:00Z" -}`, + }) + oldTime, _ := time.Parse(time.RFC3339, "2023-01-01T00:00:00Z") + writeListMeta(t, sb.SourcePath, "old-skill", &install.MetadataEntry{ + Source: "github.com/user/old", + Type: "github", + InstalledAt: oldTime, }) sb.CreateSkill("new-skill", map[string]string{ "SKILL.md": "# New", - ".skillshare-meta.json": `{ - "source": "github.com/user/new", - "type": "github", - "installed_at": "2025-12-01T00:00:00Z" -}`, + }) + newTime, _ := time.Parse(time.RFC3339, "2025-12-01T00:00:00Z") + writeListMeta(t, sb.SourcePath, "new-skill", &install.MetadataEntry{ + Source: "github.com/user/new", + Type: "github", + InstalledAt: newTime, }) sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -423,11 
+423,10 @@ func TestList_SearchWithFilter(t *testing.T) { // GitHub skill with "react" in source sb.CreateSkill("react-remote", map[string]string{ "SKILL.md": "# React Remote", - ".skillshare-meta.json": `{ - "source": "github.com/user/react-kit", - "type": "github", - "installed_at": "2024-06-01T00:00:00Z" -}`, + }) + writeListMeta(t, sb.SourcePath, "react-remote", &install.MetadataEntry{ + Source: "github.com/user/react-kit", + Type: "github", }) sb.WriteConfig(`source: ` + sb.SourcePath + ` @@ -449,8 +448,11 @@ func TestList_JSON_OutputsValidJSON(t *testing.T) { sb.CreateSkill("alpha", map[string]string{"SKILL.md": "# Alpha"}) sb.CreateSkill("beta", map[string]string{ - "SKILL.md": "# Beta", - ".skillshare-meta.json": `{"source":"github.com/user/repo","type":"github","installed_at":"2024-06-01T00:00:00Z"}`, + "SKILL.md": "# Beta", + }) + writeListMeta(t, sb.SourcePath, "beta", &install.MetadataEntry{ + Source: "github.com/user/repo", + Type: "github", }) sb.WriteConfig(`source: ` + sb.SourcePath + "\ntargets: {}\n") @@ -564,3 +566,16 @@ func TestList_NoTUI_WithPattern(t *testing.T) { t.Errorf("should not contain 'vue-helper' when filtered") } } + +// writeListMeta writes a metadata entry to the centralized .metadata.json in sourceDir. 
+func writeListMeta(t *testing.T, sourceDir, skillName string, entry *install.MetadataEntry) { + t.Helper() + store, err := install.LoadMetadata(sourceDir) + if err != nil { + t.Fatalf("writeListMeta: load: %v", err) + } + store.Set(skillName, entry) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("writeListMeta: save: %v", err) + } +} diff --git a/tests/integration/log_test.go b/tests/integration/log_test.go index 225f75cc..481e609d 100644 --- a/tests/integration/log_test.go +++ b/tests/integration/log_test.go @@ -252,16 +252,18 @@ func TestLog_SyncPartialStatus(t *testing.T) { goodTarget := sb.CreateTarget("claude") - // Create the broken target as a valid directory (passes validation), - // then make it read-only so sync fails when trying to write symlinks. - brokenTarget := filepath.Join(sb.Home, "broken-target", "skills") - if err := os.MkdirAll(brokenTarget, 0755); err != nil { - t.Fatalf("failed to create broken target: %v", err) + // Create a broken target that passes validation but fails during sync. + // A dangling symlink makes os.Stat return "not exist" (validation passes) + // but os.MkdirAll fails because the symlink entry blocks directory creation. + // This works even as root (unlike chmod-based approaches). 
+ brokenParent := filepath.Join(sb.Home, "broken-target") + if err := os.MkdirAll(brokenParent, 0755); err != nil { + t.Fatalf("failed to create broken parent: %v", err) } - if err := os.Chmod(brokenTarget, 0444); err != nil { - t.Fatalf("failed to chmod broken target: %v", err) + brokenTarget := filepath.Join(brokenParent, "skills") + if err := os.Symlink("/nonexistent/dangling/target", brokenTarget); err != nil { + t.Fatalf("failed to create dangling symlink: %v", err) } - t.Cleanup(func() { os.Chmod(brokenTarget, 0755) }) sb.WriteConfig(`source: ` + sb.SourcePath + ` mode: merge diff --git a/tests/integration/search_batch_meta_roundtrip_test.go b/tests/integration/search_batch_meta_roundtrip_test.go index 5ea8419a..22084321 100644 --- a/tests/integration/search_batch_meta_roundtrip_test.go +++ b/tests/integration/search_batch_meta_roundtrip_test.go @@ -79,19 +79,20 @@ func TestSearchBatchGroupedInstall_MetadataSourceParseRoundTrip(t *testing.T) { } } + store, storeErr := install.LoadMetadata(sb.SourcePath) + if storeErr != nil { + t.Fatalf("load metadata: %v", storeErr) + } + for _, name := range []string{"alpha-skill", "beta-skill"} { - skillPath := filepath.Join(sb.SourcePath, name) - meta, err := install.ReadMeta(skillPath) - if err != nil { - t.Fatalf("read meta for %s: %v", name, err) - } - if meta == nil { + entry := store.Get(name) + if entry == nil { t.Fatalf("meta missing for %s", name) } - parsed, err := install.ParseSource(meta.Source) + parsed, err := install.ParseSource(entry.Source) if err != nil { - t.Fatalf("meta source for %s is not parseable: %q (%v)", name, meta.Source, err) + t.Fatalf("meta source for %s is not parseable: %q (%v)", name, entry.Source, err) } if parsed.CloneURL != "https://gitlab.com/team/monorepo.git" { t.Fatalf("unexpected clone URL for %s: got %q", name, parsed.CloneURL) @@ -99,7 +100,7 @@ func TestSearchBatchGroupedInstall_MetadataSourceParseRoundTrip(t *testing.T) { wantSubdir := "skills/" + name if parsed.Subdir != 
wantSubdir { - t.Fatalf("unexpected subdir for %s: got %q, want %q (source=%q)", name, parsed.Subdir, wantSubdir, meta.Source) + t.Fatalf("unexpected subdir for %s: got %q, want %q (source=%q)", name, parsed.Subdir, wantSubdir, entry.Source) } } } diff --git a/tests/integration/sync_agent_test.go b/tests/integration/sync_agent_test.go new file mode 100644 index 00000000..1ff3adeb --- /dev/null +++ b/tests/integration/sync_agent_test.go @@ -0,0 +1,362 @@ +//go:build !online + +package integration + +import ( + "os" + "path/filepath" + "testing" + + "skillshare/internal/testutil" +) + +func TestSync_Agents_IncludeFilter(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor"), 0644) + os.WriteFile(filepath.Join(agentsDir, "reviewer.md"), []byte("# Reviewer"), 0644) + os.WriteFile(filepath.Join(agentsDir, "debugger.md"), []byte("# Debugger"), 0644) + + claudeSkills := filepath.Join(sb.Home, ".claude", "skills") + claudeAgents := filepath.Join(sb.Home, ".claude", "agents") + os.MkdirAll(claudeSkills, 0755) + os.MkdirAll(claudeAgents, 0755) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: "` + claudeSkills + `" + agents: + path: "` + claudeAgents + `" + include: + - "tutor" + - "reviewer" +`) + + result := sb.RunCLI("sync", "agents") + result.AssertSuccess(t) + + // Included agents should be synced + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); err != nil { + t.Error("tutor.md should be synced (included)") + } + if _, err := os.Lstat(filepath.Join(claudeAgents, "reviewer.md")); err != nil { + t.Error("reviewer.md should be synced (included)") + } + + // Excluded agent should NOT be synced + if _, err := os.Lstat(filepath.Join(claudeAgents, "debugger.md")); !os.IsNotExist(err) { + t.Error("debugger.md should NOT be synced (not in 
include list)") + } +} + +func TestSync_Agents_ExcludeFilter(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "tutor.md"), []byte("# Tutor"), 0644) + os.WriteFile(filepath.Join(agentsDir, "reviewer.md"), []byte("# Reviewer"), 0644) + os.WriteFile(filepath.Join(agentsDir, "debugger.md"), []byte("# Debugger"), 0644) + + claudeSkills := filepath.Join(sb.Home, ".claude", "skills") + claudeAgents := filepath.Join(sb.Home, ".claude", "agents") + os.MkdirAll(claudeSkills, 0755) + os.MkdirAll(claudeAgents, 0755) + + sb.WriteConfig(`source: ` + sb.SourcePath + ` +targets: + claude: + skills: + path: "` + claudeSkills + `" + agents: + path: "` + claudeAgents + `" + exclude: + - "debugger" +`) + + result := sb.RunCLI("sync", "agents") + result.AssertSuccess(t) + + // Non-excluded agents should be synced + if _, err := os.Lstat(filepath.Join(claudeAgents, "tutor.md")); err != nil { + t.Error("tutor.md should be synced (not excluded)") + } + if _, err := os.Lstat(filepath.Join(claudeAgents, "reviewer.md")); err != nil { + t.Error("reviewer.md should be synced (not excluded)") + } + + // Excluded agent should NOT be synced + if _, err := os.Lstat(filepath.Join(claudeAgents, "debugger.md")); !os.IsNotExist(err) { + t.Error("debugger.md should NOT be synced (excluded)") + } +} + +func TestSync_Agents_IncludeExcludeCombined(t *testing.T) { + sb := testutil.NewSandbox(t) + defer sb.Cleanup() + + agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents") + os.MkdirAll(agentsDir, 0755) + os.WriteFile(filepath.Join(agentsDir, "team-reviewer.md"), []byte("# Team Reviewer"), 0644) + os.WriteFile(filepath.Join(agentsDir, "team-debugger.md"), []byte("# Team Debugger"), 0644) + os.WriteFile(filepath.Join(agentsDir, "personal-tutor.md"), []byte("# Personal Tutor"), 0644) + + claudeSkills := filepath.Join(sb.Home, ".claude", 
"skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeSkills, 0755)
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+      include:
+        - "team-*"
+      exclude:
+        - "*-debugger"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// team-reviewer matches include and not exclude → synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "team-reviewer.md")); err != nil {
+		t.Error("team-reviewer.md should be synced (included, not excluded)")
+	}
+
+	// team-debugger matches include but also matches exclude → NOT synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "team-debugger.md")); !os.IsNotExist(err) {
+		t.Error("team-debugger.md should NOT be synced (excluded by *-debugger)")
+	}
+
+	// personal-tutor does not match include → NOT synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "personal-tutor.md")); !os.IsNotExist(err) {
+		t.Error("personal-tutor.md should NOT be synced (not in include list)")
+	}
+}
+
+func TestSync_Agents_GlobExcludePattern(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "alpha.md"), []byte("# Alpha"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "beta.md"), []byte("# Beta"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "gamma.md"), []byte("# Gamma"), 0644)
+
+	claudeSkills := filepath.Join(sb.Home, ".claude", "skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeSkills, 0755)
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+      exclude:
+        - "?eta"
+        - "gamma"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// alpha doesn't match any exclude → synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "alpha.md")); err != nil {
+		t.Error("alpha.md should be synced")
+	}
+
+	// beta matches ?eta → NOT synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "beta.md")); !os.IsNotExist(err) {
+		t.Error("beta.md should NOT be synced (excluded by ?eta)")
+	}
+
+	// gamma matches gamma → NOT synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "gamma.md")); !os.IsNotExist(err) {
+		t.Error("gamma.md should NOT be synced (excluded by gamma)")
+	}
+}
+
+func TestSync_Agents_DisabledAgentsNotSynced(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "active.md"), []byte("# Active"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "disabled-one.md"), []byte("# Disabled One"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "disabled-two.md"), []byte("# Disabled Two"), 0644)
+
+	// Disable two agents via .agentignore
+	os.WriteFile(filepath.Join(agentsDir, ".agentignore"), []byte("disabled-one.md\ndisabled-two.md\n"), 0644)
+
+	claudeSkills := filepath.Join(sb.Home, ".claude", "skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeSkills, 0755)
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// Active agent should be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "active.md")); err != nil {
+		t.Error("active.md should be synced")
+	}
+
+	// Disabled agents should NOT be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "disabled-one.md")); !os.IsNotExist(err) {
+		t.Error("disabled-one.md should NOT be synced (disabled via .agentignore)")
+	}
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "disabled-two.md")); !os.IsNotExist(err) {
+		t.Error("disabled-two.md should NOT be synced (disabled via .agentignore)")
+	}
+}
+
+func TestSync_Agents_DisabledByGlobPattern(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "prod-reviewer.md"), []byte("# Prod"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "draft-experiment.md"), []byte("# Draft 1"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "draft-wip.md"), []byte("# Draft 2"), 0644)
+
+	// Glob pattern disables all draft-* agents
+	os.WriteFile(filepath.Join(agentsDir, ".agentignore"), []byte("draft-*\n"), 0644)
+
+	claudeSkills := filepath.Join(sb.Home, ".claude", "skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeSkills, 0755)
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// Non-draft agent should be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "prod-reviewer.md")); err != nil {
+		t.Error("prod-reviewer.md should be synced")
+	}
+
+	// Draft agents should NOT be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "draft-experiment.md")); !os.IsNotExist(err) {
+		t.Error("draft-experiment.md should NOT be synced (disabled by draft-* pattern)")
+	}
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "draft-wip.md")); !os.IsNotExist(err) {
+		t.Error("draft-wip.md should NOT be synced (disabled by draft-* pattern)")
+	}
+}
+
+func TestSync_Agents_DisabledNestedAgent(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(filepath.Join(agentsDir, "team"), 0755)
+	os.WriteFile(filepath.Join(agentsDir, "top-level.md"), []byte("# Top"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "team", "reviewer.md"), []byte("# Reviewer"), 0644)
+	os.WriteFile(filepath.Join(agentsDir, "team", "debugger.md"), []byte("# Debugger"), 0644)
+
+	// Disable one nested agent
+	os.WriteFile(filepath.Join(agentsDir, ".agentignore"), []byte("team/debugger.md\n"), 0644)
+
+	claudeSkills := filepath.Join(sb.Home, ".claude", "skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeSkills, 0755)
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// Top-level and enabled nested agent should be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "top-level.md")); err != nil {
+		t.Error("top-level.md should be synced")
+	}
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "team__reviewer.md")); err != nil {
+		t.Error("team__reviewer.md should be synced")
+	}
+
+	// Disabled nested agent should NOT be synced
+	if _, err := os.Lstat(filepath.Join(claudeAgents, "team__debugger.md")); !os.IsNotExist(err) {
+		t.Error("team__debugger.md should NOT be synced (disabled via .agentignore)")
+	}
+}
+
+func TestSync_Agents_SkipsTargetsWithoutAgentsPath(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	// Create agents source with an agent
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# Helper"), 0644)
+
+	// Configure a target WITH agents path and one WITHOUT
+	claudeSkills := filepath.Join(sb.Home, ".claude", "skills")
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	windsurf := filepath.Join(sb.Home, ".windsurf", "skills")
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: "` + claudeSkills + `"
+    agents:
+      path: "` + claudeAgents + `"
+  windsurf:
+    skills:
+      path: "` + windsurf + `"
+`)
+
+	result := sb.RunCLI("sync", "agents")
+	result.AssertSuccess(t)
+
+	// Agent should be synced to claude
+	if !sb.FileExists(filepath.Join(claudeAgents, "helper.md")) {
+		t.Error("agent should be synced to claude agents dir")
+	}
+
+	// Warning should mention windsurf was skipped
+	result.AssertAnyOutputContains(t, "skipped")
+	result.AssertAnyOutputContains(t, "windsurf")
+}
diff --git a/tests/integration/sync_extras_test.go b/tests/integration/sync_extras_test.go
index daef3046..cbddfb89 100644
--- a/tests/integration/sync_extras_test.go
+++ b/tests/integration/sync_extras_test.go
@@ -45,7 +45,7 @@ extras:
 	result := sb.RunCLI("sync", "extras")
 	result.AssertSuccess(t)
 
-	result.AssertAnyOutputContains(t, "Sync Extras")
+	result.AssertAnyOutputContains(t, "Syncing extras")
 	result.AssertAnyOutputContains(t, "2 files")
 
 	// Verify files are symlinks
@@ -99,7 +99,7 @@ extras:
 	result := sb.RunCLI("sync", "extras")
 	result.AssertSuccess(t)
 
-	result.AssertAnyOutputContains(t, "Sync Extras")
+	result.AssertAnyOutputContains(t, "Syncing extras")
 
 	// Verify file exists and is a real copy (not a symlink)
 	copiedFile := filepath.Join(rulesTarget, "coding.md")
@@ -226,7 +226,7 @@ extras:
 	result.AssertAnyOutputContains(t, "merged")
 
 	// Verify extras sync happened
-	result.AssertAnyOutputContains(t, "Sync Extras")
+	result.AssertAnyOutputContains(t, "Syncing extras")
 
 	// Verify skill symlink
 	if !sb.IsSymlink(filepath.Join(targetPath, "my-skill")) {
@@ -520,3 +520,324 @@ extras:
 		t.Error("config should contain flatten: true")
 	}
 }
+
+// --- Extras "agents" overlap with agents sync ---
+
+func TestSyncExtras_AgentsOverlap_Skipped(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	// Create agents source with a real agent (the regular agents sync system)
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# Helper Agent"), 0644)
+
+	// Create extras source for "agents" with a file
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	extrasAgentsSource := filepath.Join(sourceRoot, "extras", "agents")
+	os.MkdirAll(extrasAgentsSource, 0755)
+	os.WriteFile(filepath.Join(extrasAgentsSource, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: ` + targetPath + `
+    agents:
+      path: ` + claudeAgents + `
+extras:
+  - name: agents
+    targets:
+      - path: ` + claudeAgents + `
+`)
+
+	result := sb.RunCLI("sync", "extras")
+	result.AssertSuccess(t)
+	result.AssertAnyOutputContains(t, "Skipping extras")
+	result.AssertAnyOutputContains(t, "already managed by agents sync")
+}
+
+func TestSyncExtras_AgentsOverlap_NoAgentsSource(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	// NO agents source directory — extras "agents" should sync normally
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	extrasAgentsSource := filepath.Join(sourceRoot, "extras", "agents")
+	os.MkdirAll(extrasAgentsSource, 0755)
+	os.WriteFile(filepath.Join(extrasAgentsSource, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: ` + targetPath + `
+    agents:
+      path: ` + claudeAgents + `
+extras:
+  - name: agents
+    targets:
+      - path: ` + claudeAgents + `
+`)
+
+	result := sb.RunCLI("sync", "extras")
+	result.AssertSuccess(t)
+	result.AssertOutputNotContains(t, "Skipping extras")
+
+	// Extras agent file should be synced normally
+	if !sb.FileExists(filepath.Join(claudeAgents, "extra-agent.md")) {
+		t.Error("extra-agent.md should be synced when no agents source exists")
+	}
+}
+
+func TestSyncExtras_AgentsOverlap_PartialSkip(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	// Create agents source
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# Helper Agent"), 0644)
+
+	// Create extras "agents" source
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	extrasAgentsSource := filepath.Join(sourceRoot, "extras", "agents")
+	os.MkdirAll(extrasAgentsSource, 0755)
+	os.WriteFile(filepath.Join(extrasAgentsSource, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	customTarget := filepath.Join(sb.Home, "custom-agents")
+	os.MkdirAll(claudeAgents, 0755)
+	os.MkdirAll(customTarget, 0755)
+
+	// Two targets: one overlaps with agents, one doesn't
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: ` + targetPath + `
+    agents:
+      path: ` + claudeAgents + `
+extras:
+  - name: agents
+    targets:
+      - path: ` + claudeAgents + `
+      - path: ` + customTarget + `
+`)
+
+	result := sb.RunCLI("sync", "extras")
+	result.AssertSuccess(t)
+
+	// Overlapping target should be skipped
+	result.AssertAnyOutputContains(t, "Skipping extras")
+
+	// Non-overlapping target should be synced
+	if !sb.FileExists(filepath.Join(customTarget, "extra-agent.md")) {
+		t.Error("extra-agent.md should be synced to non-overlapping target")
+	}
+}
+
+func TestSyncExtras_AgentsOverlap_NonAgentsNameNotSkipped(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	// Create agents source (real agents system active)
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# Helper Agent"), 0644)
+
+	// Create extras "rules" that targets the agents directory
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	rulesSource := filepath.Join(sourceRoot, "extras", "rules")
+	os.MkdirAll(rulesSource, 0755)
+	os.WriteFile(filepath.Join(rulesSource, "rule.md"), []byte("# Rule"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeAgents, 0755)
+
+	// extras "rules" targets the same path as agents — should NOT be skipped (name != "agents")
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: ` + targetPath + `
+    agents:
+      path: ` + claudeAgents + `
+extras:
+  - name: rules
+    targets:
+      - path: ` + claudeAgents + `
+`)
+
+	result := sb.RunCLI("sync", "extras")
+	result.AssertSuccess(t)
+	result.AssertOutputNotContains(t, "Skipping extras")
+
+	if !sb.FileExists(filepath.Join(claudeAgents, "rule.md")) {
+		t.Error("rule.md should be synced — extras named 'rules' should not be affected by agents overlap")
+	}
+}
+
+func TestSyncExtras_AgentsOverlap_NoTargetOverlap(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	// Create agents source
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "helper.md"), []byte("# Helper Agent"), 0644)
+
+	// Create extras "agents" source targeting a DIFFERENT path
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	extrasAgentsSource := filepath.Join(sourceRoot, "extras", "agents")
+	os.MkdirAll(extrasAgentsSource, 0755)
+	os.WriteFile(filepath.Join(extrasAgentsSource, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	customTarget := filepath.Join(sb.Home, "my-agents")
+	os.MkdirAll(customTarget, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    skills:
+      path: ` + targetPath + `
+    agents:
+      path: ` + claudeAgents + `
+extras:
+  - name: agents
+    targets:
+      - path: ` + customTarget + `
+`)
+
+	result := sb.RunCLI("sync", "extras")
+	result.AssertSuccess(t)
+	result.AssertOutputNotContains(t, "Skipping extras")
+
+	if !sb.FileExists(filepath.Join(customTarget, "extra-agent.md")) {
+		t.Error("extra-agent.md should be synced to non-overlapping target")
+	}
+}
+
+func TestSyncAll_AgentsExtrasOverlap_WarnsAndPreservesAgents_Global(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("placeholder", map[string]string{
+		"SKILL.md": "# Placeholder",
+	})
+	targetPath := sb.CreateTarget("claude")
+
+	agentsDir := filepath.Join(filepath.Dir(sb.SourcePath), "agents")
+	os.MkdirAll(agentsDir, 0755)
+	os.WriteFile(filepath.Join(agentsDir, "demo.md"), []byte("# Demo Agent"), 0644)
+
+	sourceRoot := filepath.Dir(sb.SourcePath)
+	extrasAgentsSource := filepath.Join(sourceRoot, "extras", "agents")
+	os.MkdirAll(extrasAgentsSource, 0755)
+	os.WriteFile(filepath.Join(extrasAgentsSource, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(sb.Home, ".claude", "agents")
+	os.MkdirAll(claudeAgents, 0755)
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    path: ` + targetPath + `
+extras:
+  - name: agents
+    targets:
+      - path: ` + claudeAgents + `
+`)
+
+	result := sb.RunCLI("sync", "--all")
+	result.AssertSuccess(t)
+	result.AssertAnyOutputContains(t, "Skipping extras")
+	result.AssertAnyOutputContains(t, "already managed by agents sync")
+
+	if !sb.IsSymlink(filepath.Join(claudeAgents, "demo.md")) {
+		t.Error("demo agent should remain synced in global mode")
+	}
+	if sb.FileExists(filepath.Join(claudeAgents, "extra-agent.md")) {
+		t.Error("extras agent file should not be synced when target overlaps agents sync")
+	}
+}
+
+func TestSyncAll_AgentsExtrasOverlap_WarnsAndPreservesAgents_Project(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	projectRoot := sb.SetupProjectDir("claude", "cursor")
+
+	projectAgents := filepath.Join(projectRoot, ".skillshare", "agents")
+	os.MkdirAll(projectAgents, 0755)
+	os.WriteFile(filepath.Join(projectAgents, "demo.md"), []byte("# Demo Agent"), 0644)
+
+	projectExtrasAgents := filepath.Join(projectRoot, ".skillshare", "extras", "agents")
+	os.MkdirAll(projectExtrasAgents, 0755)
+	os.WriteFile(filepath.Join(projectExtrasAgents, "extra-agent.md"), []byte("# Extra Agent"), 0644)
+
+	claudeAgents := filepath.Join(projectRoot, ".claude", "agents")
+	cursorAgents := filepath.Join(projectRoot, ".cursor", "agents")
+	os.MkdirAll(claudeAgents, 0755)
+	os.MkdirAll(cursorAgents, 0755)
+
+	sb.WriteProjectConfig(projectRoot, `targets:
+  - claude
+  - cursor
+extras:
+  - name: agents
+    targets:
+      - path: .claude/agents
+      - path: .cursor/agents
+`)
+
+	result := sb.RunCLIInDir(projectRoot, "sync", "--all", "-p")
+	result.AssertSuccess(t)
+	result.AssertAnyOutputContains(t, "Skipping extras")
+	result.AssertAnyOutputContains(t, "already managed by agents sync")
+
+	if !sb.IsSymlink(filepath.Join(claudeAgents, "demo.md")) {
+		t.Error("claude demo agent should remain synced in project mode")
+	}
+	if !sb.IsSymlink(filepath.Join(cursorAgents, "demo.md")) {
+		t.Error("cursor demo agent should remain synced in project mode")
+	}
+	if sb.FileExists(filepath.Join(claudeAgents, "extra-agent.md")) {
+		t.Error("project extras agent file should not be synced to claude when target overlaps agents sync")
+	}
+	if sb.FileExists(filepath.Join(cursorAgents, "extra-agent.md")) {
+		t.Error("project extras agent file should not be synced to cursor when target overlaps agents sync")
+	}
+}
diff --git a/tests/integration/sync_project_test.go b/tests/integration/sync_project_test.go
index c16e4b99..26cd20b6 100644
--- a/tests/integration/sync_project_test.go
+++ b/tests/integration/sync_project_test.go
@@ -5,9 +5,9 @@ package integration
 import (
 	"os"
 	"path/filepath"
-	"strings"
 	"testing"
 
+	"skillshare/internal/install"
 	"skillshare/internal/testutil"
 )
 
@@ -121,26 +121,27 @@ func TestSyncProject_PreservesRegistryEntries(t *testing.T) {
 		"SKILL.md": "# Local Skill",
 	})
 
-	// Write a registry with a remote-installed skill that has NO files on disk.
-	// Sync must NOT prune this entry — the registry is the source of truth for installations.
-	registryPath := filepath.Join(projectRoot, ".skillshare", "registry.yaml")
-	registryContent := "skills:\n  - name: remote-tool\n    source: github.com/someone/remote-tool\n"
-	os.WriteFile(registryPath, []byte(registryContent), 0644)
+	// Write metadata with a remote-installed skill that has NO files on disk.
+	// Sync must NOT prune this entry — the metadata is the source of truth for installations.
+	skillsDir := filepath.Join(projectRoot, ".skillshare", "skills")
+	store := install.NewMetadataStore()
+	store.Set("remote-tool", &install.MetadataEntry{Source: "github.com/someone/remote-tool"})
+	store.Save(skillsDir)
 
 	result := sb.RunCLIInDir(projectRoot, "sync", "-p")
 	result.AssertSuccess(t)
 
-	// Verify registry still contains the remote-tool entry
-	data, err := os.ReadFile(registryPath)
+	// Verify metadata still contains the remote-tool entry
+	store2, err := install.LoadMetadata(skillsDir)
 	if err != nil {
-		t.Fatalf("failed to read registry: %v", err)
+		t.Fatalf("failed to load metadata: %v", err)
 	}
-	content := string(data)
-	if !strings.Contains(content, "remote-tool") {
-		t.Errorf("sync should preserve registry entry for installed skill without local files, got:\n%s", content)
+	if !store2.Has("remote-tool") {
+		t.Errorf("sync should preserve metadata entry for installed skill without local files")
 	}
-	if !strings.Contains(content, "github.com/someone/remote-tool") {
-		t.Errorf("sync should preserve source in registry entry, got:\n%s", content)
+	entry := store2.Get("remote-tool")
+	if entry == nil || entry.Source != "github.com/someone/remote-tool" {
+		t.Errorf("sync should preserve source in metadata entry")
 	}
 }
diff --git a/tests/integration/sync_symlinked_dir_test.go b/tests/integration/sync_symlinked_dir_test.go
index 3d41057a..898283de 100644
--- a/tests/integration/sync_symlinked_dir_test.go
+++ b/tests/integration/sync_symlinked_dir_test.go
@@ -7,6 +7,7 @@ import (
 	"path/filepath"
 	"testing"
 
+	"skillshare/internal/install"
 	"skillshare/internal/testutil"
 )
 
@@ -334,8 +335,8 @@ func TestUpdateGroup_ExternalSymlinkRejected(t *testing.T) {
 	os.MkdirAll(filepath.Join(externalDir, "victim"), 0755)
 	os.WriteFile(filepath.Join(externalDir, "victim", "SKILL.md"),
 		[]byte("---\nname: victim\n---\n# Victim"), 0644)
-	os.WriteFile(filepath.Join(externalDir, "victim", ".skillshare-meta.json"),
-		[]byte(`{"source":"github.com/example/victim","installed_at":"2025-01-01T00:00:00Z"}`), 0644)
+	os.WriteFile(filepath.Join(externalDir, install.MetadataFileName),
+		[]byte(`{"version":1,"entries":{"victim":{"source":"github.com/example/victim","installed_at":"2025-01-01T00:00:00Z"}}}`), 0644)
 
 	// Symlink a group inside source to the external location
 	os.Symlink(externalDir, filepath.Join(sb.SourcePath, "evil-group"))
@@ -356,13 +357,13 @@ func TestUpdateAll_SymlinkedSource(t *testing.T) {
 	realSource := filepath.Join(sb.Root, "dotfiles", "skills")
 	os.MkdirAll(realSource, 0755)
 
-	// Create a skill with metadata
+	// Create a skill with metadata in centralized store
 	skillDir := filepath.Join(realSource, "remote-skill")
 	os.MkdirAll(skillDir, 0755)
 	os.WriteFile(filepath.Join(skillDir, "SKILL.md"),
 		[]byte("---\nname: remote-skill\n---\n# Remote"), 0644)
-	os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"),
-		[]byte(`{"source":"github.com/example/remote","installed_at":"2025-01-01T00:00:00Z"}`), 0644)
+	os.WriteFile(filepath.Join(realSource, install.MetadataFileName),
+		[]byte(`{"version":1,"entries":{"remote-skill":{"source":"github.com/example/remote","installed_at":"2025-01-01T00:00:00Z"}}}`), 0644)
 
 	os.RemoveAll(sb.SourcePath)
 	if err := os.Symlink(realSource, sb.SourcePath); err != nil {
diff --git a/tests/integration/target_filter_test.go b/tests/integration/target_filter_test.go
index 8f1fb0db..83e62473 100644
--- a/tests/integration/target_filter_test.go
+++ b/tests/integration/target_filter_test.go
@@ -202,10 +202,15 @@ func TestTargetFilter_HelpShowsFilterFlags(t *testing.T) {
 	result := sb.RunCLI("target", "help")
 	result.AssertSuccess(t)
 
+	result.AssertOutputContains(t, "--agent-mode")
 	result.AssertOutputContains(t, "--add-include")
 	result.AssertOutputContains(t, "--add-exclude")
 	result.AssertOutputContains(t, "--remove-include")
 	result.AssertOutputContains(t, "--remove-exclude")
+	result.AssertOutputContains(t, "--add-agent-include")
+	result.AssertOutputContains(t, "--add-agent-exclude")
+	result.AssertOutputContains(t, "--remove-agent-include")
+	result.AssertOutputContains(t, "--remove-agent-exclude")
 	result.AssertOutputContains(t, "Project mode")
 }
 
@@ -223,3 +228,100 @@ func TestTargetFilter_Project_AddAndShow(t *testing.T) {
 	info.AssertSuccess(t)
 	info.AssertOutputContains(t, "Include: team-*")
 }
+
+func TestTargetFilter_AgentAddInclude(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	targetPath := sb.CreateTarget("claude")
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    path: ` + targetPath + `
+`)
+
+	result := sb.RunCLI("target", "claude", "--add-agent-include", "team-*")
+	result.AssertSuccess(t)
+	result.AssertOutputContains(t, "added agent include: team-*")
+
+	configContent := sb.ReadFile(sb.ConfigPath)
+	if !strings.Contains(configContent, "agents:") {
+		t.Fatal("agents block should be written to config")
+	}
+	if !strings.Contains(configContent, "team-*") {
+		t.Fatal("agent include pattern should be in config")
+	}
+
+	info := sb.RunCLI("target", "claude")
+	info.AssertSuccess(t)
+	info.AssertOutputContains(t, "Agents:")
+	info.AssertOutputContains(t, "Include: team-*")
+}
+
+func TestTargetFilter_AgentModeAndSymlinkGuard(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	targetPath := sb.CreateTarget("claude")
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    path: ` + targetPath + `
+`)
+
+	mode := sb.RunCLI("target", "claude", "--agent-mode", "copy")
+	mode.AssertSuccess(t)
+	mode.AssertOutputContains(t, "Changed claude agent mode: merge -> copy")
+
+	info := sb.RunCLI("target", "claude")
+	info.AssertSuccess(t)
+	info.AssertOutputContains(t, "Agents:")
+	info.AssertOutputContains(t, "Mode: copy")
+
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  claude:
+    path: ` + targetPath + `
+    agents:
+      mode: symlink
+`)
+
+	symlinkInfo := sb.RunCLI("target", "claude")
+	symlinkInfo.AssertSuccess(t)
+	symlinkInfo.AssertOutputContains(t, "Filters: ignored in symlink mode")
+
+	rejected := sb.RunCLI("target", "claude", "--add-agent-include", "team-*")
+	rejected.AssertFailure(t)
+	rejected.AssertAnyOutputContains(t, "ignored in symlink mode")
+}
+
+func TestTargetFilter_AgentUnsupportedTarget(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	targetPath := filepath.Join(sb.Root, "custom-skills")
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets:
+  custom-tool:
+    path: ` + targetPath + `
+`)
+
+	result := sb.RunCLI("target", "custom-tool", "--add-agent-include", "team-*")
+	result.AssertFailure(t)
+	result.AssertAnyOutputContains(t, "does not have an agents path")
+}
+
+func TestTargetFilter_Project_AgentAddAndShow(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+	projectRoot := sb.SetupProjectDir("claude")
+
+	result := sb.RunCLIInDir(projectRoot, "target", "claude", "--add-agent-include", "team-*", "-p")
+	result.AssertSuccess(t)
+	result.AssertOutputContains(t, "added agent include: team-*")
+
+	info := sb.RunCLIInDir(projectRoot, "target", "claude", "-p")
+	info.AssertSuccess(t)
+	info.AssertOutputContains(t, "Agents:")
+	info.AssertOutputContains(t, "Include: team-*")
+}
diff --git a/tests/integration/target_test.go b/tests/integration/target_test.go
index f75a40ef..737a2a04 100644
--- a/tests/integration/target_test.go
+++ b/tests/integration/target_test.go
@@ -440,7 +440,8 @@ targets:
 	result.AssertSuccess(t)
 	result.AssertOutputContains(t, "claude")
-	result.AssertOutputContains(t, "(merge)")
+	result.AssertOutputContains(t, "Mode:")
+	result.AssertOutputContains(t, "merge")
 }
 
 func TestTarget_NoSubcommand_ShowsUsage(t *testing.T) {
diff --git a/tests/integration/theme_test.go b/tests/integration/theme_test.go
new file mode 100644
index 00000000..711da107
--- /dev/null
+++ b/tests/integration/theme_test.go
@@ -0,0 +1,67 @@
+//go:build !online
+
+package integration
+
+import (
+	"strings"
+	"testing"
+
+	"skillshare/internal/testutil"
+)
+
+// TestList_SKILLSHARE_THEME_Light verifies that when SKILLSHARE_THEME=light
+// is set, the list output uses the light Primary color (232) and does not
+// contain pure bright white (15), resolving issue #125.
+func TestList_SKILLSHARE_THEME_Light(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("hello-world", map[string]string{
+		"SKILL.md": "---\nname: hello-world\ndescription: A test skill\n---\n# Hello",
+	})
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets: {}
+`)
+
+	result := sb.RunCLIEnv(
+		map[string]string{"SKILLSHARE_THEME": "light"},
+		"list", "--no-tui",
+	)
+
+	if result.ExitCode != 0 {
+		t.Fatalf("list exited %d: %s", result.ExitCode, result.Output())
+	}
+
+	// Light Primary is 232 — the rendered 256-color escape is
+	// ESC[38;5;232m. Plain text from tables etc. may not include this,
+	// so we only assert that pure white 15 is NOT present.
+	if strings.Contains(result.Stdout, "\x1b[38;5;15m") {
+		t.Error("light theme output must not contain pure white (Color 15)")
+	}
+}
+
+// TestList_NO_COLOR verifies that NO_COLOR strips all ANSI escape sequences.
+func TestList_NO_COLOR(t *testing.T) {
+	sb := testutil.NewSandbox(t)
+	defer sb.Cleanup()
+
+	sb.CreateSkill("hello", map[string]string{
+		"SKILL.md": "---\nname: hello\ndescription: A test skill\n---\n# H",
+	})
+	sb.WriteConfig(`source: ` + sb.SourcePath + `
+targets: {}
+`)
+
+	result := sb.RunCLIEnv(
+		map[string]string{"NO_COLOR": "1"},
+		"list", "--no-tui",
+	)
+
+	if result.ExitCode != 0 {
+		t.Fatalf("list exited %d: %s", result.ExitCode, result.Output())
+	}
+
+	if strings.Contains(result.Stdout, "\x1b[") {
+		t.Errorf("NO_COLOR must strip all ANSI escapes, got: %q", result.Stdout)
+	}
+}
diff --git a/tests/integration/uninstall_project_test.go b/tests/integration/uninstall_project_test.go
index ca862422..f6240cdb 100644
--- a/tests/integration/uninstall_project_test.go
+++ b/tests/integration/uninstall_project_test.go
@@ -3,12 +3,11 @@
 package integration
 
 import (
-	"encoding/json"
 	"os"
 	"path/filepath"
-	"strings"
 	"testing"
 
+	"skillshare/internal/install"
 	"skillshare/internal/testutil"
 )
 
@@ -47,13 +46,14 @@ func TestUninstallProject_UpdatesConfig(t *testing.T) {
 	defer sb.Cleanup()
 	projectRoot := sb.SetupProjectDir("claude")
 
-	// Create remote skill with meta
-	skillDir := sb.CreateProjectSkill(projectRoot, "remote", map[string]string{
+	// Create remote skill with meta in centralized store
+	sb.CreateProjectSkill(projectRoot, "remote", map[string]string{
 		"SKILL.md": "# Remote",
 	})
-	meta := map[string]interface{}{"source": "org/skills/remote", "type": "github"}
-	metaJSON, _ := json.Marshal(meta)
-	os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), metaJSON, 0644)
+	skillsDir := filepath.Join(projectRoot, ".skillshare", "skills")
+	metaStore := install.NewMetadataStore()
+	metaStore.Set("remote", &install.MetadataEntry{Source: "org/skills/remote", Type: "github"})
+	metaStore.Save(skillsDir)
 
 	// Write config and registry with the skill
 	sb.WriteProjectConfig(projectRoot, `targets:
@@ -67,9 +67,12 @@ func TestUninstallProject_UpdatesConfig(t *testing.T) {
 	result := sb.RunCLIInDir(projectRoot, "uninstall", "remote", "--force", "-p")
 	result.AssertSuccess(t)
 
-	registryContent := sb.ReadFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"))
-	if strings.Contains(registryContent, "remote") {
-		t.Error("registry should not contain removed skill")
+	store, err := install.LoadMetadata(filepath.Join(projectRoot, ".skillshare", "skills"))
+	if err != nil {
+		t.Fatalf("load metadata: %v", err)
+	}
+	if store.Has("remote") {
+		t.Error("metadata should not contain removed skill")
 	}
 }
 
@@ -128,9 +131,12 @@ func TestUninstallProject_MultipleSkills(t *testing.T) {
 		t.Error("skill-b should be removed")
 	}
 
-	registryContent := sb.ReadFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"))
-	if strings.Contains(registryContent, "skill-a") || strings.Contains(registryContent, "skill-b") {
-		t.Error("registry should not contain removed skills")
+	store, err := install.LoadMetadata(filepath.Join(projectRoot, ".skillshare", "skills"))
+	if err != nil {
+		t.Fatalf("load metadata: %v", err)
+	}
+	if store.Has("skill-a") || store.Has("skill-b") {
+		t.Error("metadata should not contain removed skills")
 	}
 }
 
@@ -194,17 +200,13 @@ func TestUninstallProject_GroupDir_RemovesConfigEntries(t *testing.T) {
 	sb.WriteProjectConfig(projectRoot, `targets:
   - claude
 `)
-	os.WriteFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"), []byte(`skills:
-  - name: skill-a
-    source: github.com/org/repo/skill-a
-    group: mygroup
-  - name: skill-b
-    source: github.com/org/repo/skill-b
-    group: mygroup
-  - name: skill-c
-    source: github.com/org/repo/skill-c
-    group: other
-`), 0644)
+	// Write metadata directly
+	skillsDir := filepath.Join(projectRoot, ".skillshare", "skills")
+	store := install.NewMetadataStore()
+	store.Set("skill-a", &install.MetadataEntry{Source: "github.com/org/repo/skill-a", Group: "mygroup"})
+	store.Set("skill-b", &install.MetadataEntry{Source: "github.com/org/repo/skill-b", Group: "mygroup"})
+	store.Set("skill-c", &install.MetadataEntry{Source: "github.com/org/repo/skill-c", Group: "other"})
+	store.Save(skillsDir)
 
 	result := sb.RunCLIInDir(projectRoot, "uninstall", "mygroup", "--force", "-p")
 	result.AssertSuccess(t)
 
@@ -214,16 +216,19 @@ func TestUninstallProject_GroupDir_RemovesConfigEntries(t *testing.T) {
 		t.Error("mygroup directory should be removed")
 	}
 
-	// Registry should no longer contain mygroup skills
-	registryContent := sb.ReadFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"))
-	if strings.Contains(registryContent, "skill-a") {
-		t.Error("registry should not contain skill-a after group uninstall")
+	// Metadata should no longer contain mygroup skills
+	store2, err := install.LoadMetadata(skillsDir)
+	if err != nil {
+		t.Fatalf("load metadata: %v", err)
+	}
+	if store2.Has("skill-a") {
+		t.Error("metadata should not contain skill-a after group uninstall")
 	}
-	if strings.Contains(registryContent, "skill-b") {
-		t.Error("registry should not contain skill-b after group uninstall")
+	if store2.Has("skill-b") {
+		t.Error("metadata should not contain skill-b after group uninstall")
 	}
-	if !strings.Contains(registryContent, "skill-c") {
-		t.Error("registry should still contain skill-c from other group")
+	if !store2.Has("skill-c") {
+		t.Error("metadata should still contain skill-c from other group")
 	}
 }
 
@@ -239,30 +244,29 @@ func TestUninstallProject_GroupDirWithTrailingSlash_RemovesConfigEntries(t *test
 	sb.WriteProjectConfig(projectRoot, `targets:
   - claude
 `)
-	os.WriteFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"), []byte(`skills:
-  - name: scan
-    source: github.com/org/repo/scan
-    group: security
-  - name: hardening
-    source: github.com/org/repo/hardening
-    group: security
-  - name: keep
-    source: github.com/org/repo/keep
-    group: other
-`), 0644)
+	// Write metadata directly to .metadata.json
+	skillsDir := filepath.Join(projectRoot, ".skillshare", "skills")
+	store := install.NewMetadataStore()
+	store.Set("scan", &install.MetadataEntry{Source: "github.com/org/repo/scan", Group: "security"})
+	store.Set("hardening", &install.MetadataEntry{Source: "github.com/org/repo/hardening", Group: "security"})
+	store.Set("keep", &install.MetadataEntry{Source: "github.com/org/repo/keep", Group: "other"})
+	store.Save(skillsDir)
 
 	result := sb.RunCLIInDir(projectRoot, "uninstall", "security/", "--force", "-p")
 	result.AssertSuccess(t)
 	result.AssertAnyOutputContains(t, "Uninstalled group: security")
 
-	registryContent := sb.ReadFile(filepath.Join(projectRoot, ".skillshare", "registry.yaml"))
-	if strings.Contains(registryContent, "scan") {
-		t.Error("registry should not contain scan after security/ uninstall")
+	store, err := install.LoadMetadata(filepath.Join(projectRoot, ".skillshare", "skills"))
+	if err != nil {
+		t.Fatalf("load metadata: %v", err)
+	}
+	if store.Has("scan") {
+		t.Error("metadata should not contain scan after security/ uninstall")
 	}
-	if strings.Contains(registryContent, "hardening") {
-		t.Error("registry should not contain hardening after security/ uninstall")
+	if store.Has("hardening") {
+		t.Error("metadata should not contain hardening after security/ uninstall")
 	}
-	if !strings.Contains(registryContent, "keep") {
-		t.Error("registry should still contain keep from other group")
+	if !store.Has("keep") {
+		t.Error("metadata should still contain keep from other group")
 	}
 }
diff --git a/tests/integration/uninstall_test.go b/tests/integration/uninstall_test.go
index 8c0852e4..2714b525 100644
--- a/tests/integration/uninstall_test.go
+++ b/tests/integration/uninstall_test.go
@@ -5,9 +5,9 @@ package integration
 import (
 	"os"
 	"path/filepath"
-	"strings"
 	"testing"
 
+	"skillshare/internal/install"
 	"skillshare/internal/testutil"
 )
 
@@ -172,15 +172,16 @@ func TestUninstall_ShowsMetadata(t *testing.T) {
 	sb := testutil.NewSandbox(t)
 	defer sb.Cleanup()
 
-	// Create skill with metadata (simulating installed skill)
+	// Create skill with metadata in centralized store (simulating installed skill)
 	sb.CreateSkill("meta-skill", map[string]string{
 		"SKILL.md": "# Meta Skill",
-		".skillshare-meta.json": `{
-  "source": "github.com/user/repo",
-  "type": "github",
-  "installed_at": "2024-01-15T10:30:00Z"
-}`,
 	})
+	metaStore := install.NewMetadataStore()
+	metaStore.Set("meta-skill", &install.MetadataEntry{
+		Source: "github.com/user/repo",
+		Type:   "github",
+	})
+	metaStore.Save(sb.SourcePath)
 
 	sb.WriteConfig(`source: ` + sb.SourcePath + `
 targets: {}
@@ -413,17 +414,13 @@ func TestUninstall_GroupDir_RemovesConfigEntries(t *testing.T) {
 	sb.WriteConfig(`source: ` + sb.SourcePath + `
 targets: {}
-skills:
-  - name: skill-a
-    source: github.com/org/repo/skill-a
-    group: mygroup
-  - name: skill-b
-    source: github.com/org/repo/skill-b
-    group: mygroup
-  - name: skill-c
-    source: github.com/org/repo/skill-c
-    group: other
 `)
+	// Write metadata directly to .metadata.json
+	store := install.NewMetadataStore()
+	store.Set("skill-a", &install.MetadataEntry{Source: "github.com/org/repo/skill-a", Group: "mygroup"})
+	store.Set("skill-b", &install.MetadataEntry{Source: "github.com/org/repo/skill-b", Group: "mygroup"})
+	store.Set("skill-c", &install.MetadataEntry{Source: "github.com/org/repo/skill-c", Group: "other"})
+	store.Save(sb.SourcePath)
 
 	result := sb.RunCLI("uninstall", "mygroup", "-f")
 	result.AssertSuccess(t)
 
@@ -433,18 +430,20 @@ skills:
 		t.Error("mygroup directory should be removed")
 	}
 
-	// Registry should no longer contain mygroup skills
-	registryPath := filepath.Join(sb.SourcePath, "registry.yaml")
-	registryContent := sb.ReadFile(registryPath)
-	if strings.Contains(registryContent, "skill-a") {
-		t.Error("registry should not contain skill-a after group uninstall")
+	// Metadata should no longer contain mygroup skills
+	store, err := install.LoadMetadata(sb.SourcePath)
+	if err != nil {
+		t.Fatalf("load metadata: %v", err)
 	}
-	if strings.Contains(registryContent, "skill-b") {
-		t.Error("registry should not contain skill-b after group uninstall")
+	if 
store.Has("skill-a") { + t.Error("metadata should not contain skill-a after group uninstall") + } + if store.Has("skill-b") { + t.Error("metadata should not contain skill-b after group uninstall") } // other group should be untouched - if !strings.Contains(registryContent, "skill-c") { - t.Error("registry should still contain skill-c from other group") + if !store.Has("skill-c") { + t.Error("metadata should still contain skill-c from other group") } } @@ -488,31 +487,29 @@ func TestUninstall_GroupDirWithTrailingSlash_RemovesConfigEntries(t *testing.T) sb.WriteConfig(`source: ` + sb.SourcePath + ` targets: {} -skills: - - name: scan - source: github.com/org/repo/scan - group: security - - name: hardening - source: github.com/org/repo/hardening - group: security - - name: keep - source: github.com/org/repo/keep - group: other `) + // Write metadata directly to .metadata.json + store := install.NewMetadataStore() + store.Set("scan", &install.MetadataEntry{Source: "github.com/org/repo/scan", Group: "security"}) + store.Set("hardening", &install.MetadataEntry{Source: "github.com/org/repo/hardening", Group: "security"}) + store.Set("keep", &install.MetadataEntry{Source: "github.com/org/repo/keep", Group: "other"}) + store.Save(sb.SourcePath) result := sb.RunCLI("uninstall", "security/", "-f") result.AssertSuccess(t) - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - registryContent := sb.ReadFile(registryPath) - if strings.Contains(registryContent, "scan") { - t.Error("registry should not contain scan after security/ uninstall") + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("load metadata: %v", err) } - if strings.Contains(registryContent, "hardening") { - t.Error("registry should not contain hardening after security/ uninstall") + if store.Has("scan") { + t.Error("metadata should not contain scan after security/ uninstall") } - if !strings.Contains(registryContent, "keep") { - t.Error("registry should still contain keep 
from other group") + if store.Has("hardening") { + t.Error("metadata should not contain hardening after security/ uninstall") + } + if !store.Has("keep") { + t.Error("metadata should still contain keep from other group") } } @@ -546,11 +543,13 @@ skills: } } - // Registry skills should be cleared - registryPath := filepath.Join(sb.SourcePath, "registry.yaml") - registryContent := sb.ReadFile(registryPath) - if strings.Contains(registryContent, "alpha") || strings.Contains(registryContent, "beta") || strings.Contains(registryContent, "gamma") { - t.Error("registry should not contain any skills after --all uninstall") + // Metadata should be cleared of all skills + store, err := install.LoadMetadata(sb.SourcePath) + if err != nil { + t.Fatalf("load metadata: %v", err) + } + if store.Has("alpha") || store.Has("beta") || store.Has("gamma") { + t.Error("metadata should not contain any skills after --all uninstall") } } diff --git a/tests/integration/uninstall_trash_test.go b/tests/integration/uninstall_trash_test.go index ab1ba639..d60861ae 100644 --- a/tests/integration/uninstall_trash_test.go +++ b/tests/integration/uninstall_trash_test.go @@ -8,6 +8,7 @@ import ( "strings" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -52,12 +53,11 @@ func TestUninstall_WithMeta_PrintsReinstallHint(t *testing.T) { sb.CreateSkill("remote-skill", map[string]string{ "SKILL.md": "# Remote Skill", - ".skillshare-meta.json": `{ - "source": "github.com/user/skills/remote-skill", - "type": "github", - "installed_at": "2026-01-15T10:30:00Z" -}`, }) + // Write metadata to centralized store + metaStore := `{"version":1,"entries":{"remote-skill":{"source":"github.com/user/skills/remote-skill","type":"github","installed_at":"2026-01-15T10:30:00Z"}}}` + os.WriteFile(filepath.Join(sb.SourcePath, install.MetadataFileName), []byte(metaStore), 0644) + sb.WriteConfig(`source: ` + sb.SourcePath + ` targets: {} `) @@ -94,12 +94,11 @@ func 
TestUninstall_DryRun_ShowsTrashPreview(t *testing.T) { sb.CreateSkill("preview-skill", map[string]string{ "SKILL.md": "# Preview", - ".skillshare-meta.json": `{ - "source": "github.com/org/repo/preview-skill", - "type": "github", - "installed_at": "2026-01-15T10:30:00Z" -}`, }) + // Write metadata to centralized store + metaStore := `{"version":1,"entries":{"preview-skill":{"source":"github.com/org/repo/preview-skill","type":"github","installed_at":"2026-01-15T10:30:00Z"}}}` + os.WriteFile(filepath.Join(sb.SourcePath, install.MetadataFileName), []byte(metaStore), 0644) + sb.WriteConfig(`source: ` + sb.SourcePath + ` targets: {} `) diff --git a/tests/integration/update_project_test.go b/tests/integration/update_project_test.go index 558bd48c..a1680a8a 100644 --- a/tests/integration/update_project_test.go +++ b/tests/integration/update_project_test.go @@ -3,11 +3,11 @@ package integration import ( - "encoding/json" "os" "path/filepath" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -42,9 +42,7 @@ func TestUpdateProject_DryRun(t *testing.T) { skillDir := sb.CreateProjectSkill(projectRoot, "remote", map[string]string{ "SKILL.md": "# Remote", }) - meta := map[string]interface{}{"source": "/tmp/fake-source", "type": "local"} - metaJSON, _ := json.Marshal(meta) - os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), metaJSON, 0644) + writeProjectMeta(t, skillDir) result := sb.RunCLIInDir(projectRoot, "update", "remote", "--dry-run", "-p") result.AssertSuccess(t) @@ -69,10 +67,19 @@ func TestUpdateProject_AllDryRun_SkipsLocal(t *testing.T) { func writeProjectMeta(t *testing.T, skillDir string) { t.Helper() - meta := map[string]any{"source": "/tmp/fake-source", "type": "local"} - data, _ := json.Marshal(meta) - if err := os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), data, 0644); err != nil { - t.Fatalf("failed to write meta: %v", err) + sourceDir := findSourceRoot(skillDir) + rel, _ := filepath.Rel(sourceDir, 
skillDir) + + store, err := install.LoadMetadata(sourceDir) + if err != nil { + t.Fatalf("writeProjectMeta: load: %v", err) + } + store.Set(rel, &install.MetadataEntry{ + Source: "/tmp/fake-source", + Type: "local", + }) + if err := store.Save(sourceDir); err != nil { + t.Fatalf("writeProjectMeta: save: %v", err) } } @@ -392,21 +399,21 @@ func TestUpdateProject_BatchAll_SubdirSkills_NoDuplication(t *testing.T) { skillsDir := filepath.Join(projectRoot, ".skillshare", "skills") repoURL := "file://" + remoteDir + store, _ := install.LoadMetadata(skillsDir) for _, name := range []string{"alpha", "beta"} { localDir := filepath.Join(skillsDir, name) os.MkdirAll(localDir, 0755) os.WriteFile(filepath.Join(localDir, "SKILL.md"), []byte("---\nname: "+name+"\n---\n# "+name+" v1"), 0644) - meta := map[string]any{ - "source": repoURL + "//skills/" + name, - "type": "git", - "repo_url": repoURL, - "subdir": "skills/" + name, - } - metaJSON, _ := json.Marshal(meta) - os.WriteFile(filepath.Join(localDir, ".skillshare-meta.json"), metaJSON, 0644) + store.Set(name, &install.MetadataEntry{ + Source: repoURL + "//skills/" + name, + Type: "git", + RepoURL: repoURL, + Subdir: "skills/" + name, + }) } + store.Save(skillsDir) // 3. First update --all result1 := sb.RunCLIInDir(projectRoot, "update", "--all", "-p", "--skip-audit") diff --git a/tests/integration/update_prune_test.go b/tests/integration/update_prune_test.go index cd1dbba4..f962cc35 100644 --- a/tests/integration/update_prune_test.go +++ b/tests/integration/update_prune_test.go @@ -3,11 +3,11 @@ package integration import ( - "encoding/json" "os" "path/filepath" "testing" + "skillshare/internal/install" "skillshare/internal/testutil" ) @@ -289,18 +289,24 @@ func TestUpdate_Prune_NestedIntoSkill(t *testing.T) { } } -// writeMetaForRepo writes metadata matching a repo-installed skill. +// writeMetaForRepo writes metadata matching a repo-installed skill to the centralized store. 
 func writeMetaForRepo(t *testing.T, skillDir, repoURL, subdir string) {
 	t.Helper()
-	meta := map[string]any{
-		"source":   repoURL + "//" + subdir,
-		"type":     "github",
-		"repo_url": repoURL,
-		"subdir":   subdir,
-		"version":  "abc123",
+	sourceDir := findSourceRoot(skillDir)
+	rel, _ := filepath.Rel(sourceDir, skillDir)
+
+	store, err := install.LoadMetadata(sourceDir)
+	if err != nil {
+		t.Fatalf("writeMetaForRepo: load: %v", err)
 	}
-	data, _ := json.Marshal(meta)
-	if err := os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), data, 0644); err != nil {
-		t.Fatalf("writeMetaForRepo: %v", err)
+	store.Set(rel, &install.MetadataEntry{
+		Source:  repoURL + "//" + subdir,
+		Type:    "github",
+		RepoURL: repoURL,
+		Subdir:  subdir,
+		Version: "abc123",
+	})
+	if err := store.Save(sourceDir); err != nil {
+		t.Fatalf("writeMetaForRepo: save: %v", err)
 	}
 }
diff --git a/tests/integration/update_test.go b/tests/integration/update_test.go
index 481408b9..da4dfe10 100644
--- a/tests/integration/update_test.go
+++ b/tests/integration/update_test.go
@@ -3,22 +3,47 @@ package integration
 import (
-	"encoding/json"
 	"os"
 	"os/exec"
 	"path/filepath"
 	"testing"

+	"skillshare/internal/install"
 	"skillshare/internal/testutil"
 )

-// writeMeta writes a minimal .skillshare-meta.json to make a skill updatable.
+// writeMeta writes a minimal metadata entry to .metadata.json to make a skill updatable.
+// It finds the source root ("skills" ancestor) and uses the relative path as the key.
 func writeMeta(t *testing.T, skillDir string) {
 	t.Helper()
-	meta := map[string]any{"source": "/tmp/fake-source", "type": "local"}
-	data, _ := json.Marshal(meta)
-	if err := os.WriteFile(filepath.Join(skillDir, ".skillshare-meta.json"), data, 0644); err != nil {
-		t.Fatalf("failed to write meta: %v", err)
+	sourceDir := findSourceRoot(skillDir)
+	rel, _ := filepath.Rel(sourceDir, skillDir)
+
+	store, err := install.LoadMetadata(sourceDir)
+	if err != nil {
+		t.Fatalf("failed to load metadata: %v", err)
+	}
+	store.Set(rel, &install.MetadataEntry{
+		Source: "/tmp/fake-source",
+		Type:   "local",
+	})
+	if err := store.Save(sourceDir); err != nil {
+		t.Fatalf("failed to save metadata: %v", err)
+	}
+}
+
+// findSourceRoot walks up from skillDir to find the "skills" ancestor directory.
+func findSourceRoot(skillDir string) string {
+	dir := skillDir
+	for {
+		if filepath.Base(dir) == "skills" {
+			return dir
+		}
+		parent := filepath.Dir(dir)
+		if parent == dir {
+			return filepath.Dir(skillDir)
+		}
+		dir = parent
 	}
 }
diff --git a/ui/src/App.tsx b/ui/src/App.tsx
index 0d975790..9fbaa738 100644
--- a/ui/src/App.tsx
+++ b/ui/src/App.tsx
@@ -13,8 +13,8 @@ import { TourProvider, TourOverlay, TourTooltip } from './components/tour';
 import DashboardPage from './pages/DashboardPage';
 import { BASE_PATH } from './lib/basePath';

-const SkillsPage = lazy(() => import('./pages/SkillsPage'));
-const SkillDetailPage = lazy(() => import('./pages/SkillDetailPage'));
+const ResourcesPage = lazy(() => import('./pages/ResourcesPage'));
+const ResourceDetailPage = lazy(() => import('./pages/ResourceDetailPage'));
 const TargetsPage = lazy(() => import('./pages/TargetsPage'));
 const ExtrasPage = lazy(() => import('./pages/ExtrasPage'));
 const SyncPage = lazy(() => import('./pages/SyncPage'));
@@ -53,10 +53,10 @@ export default function App() {
         }>
           } />
-          } />
-          } />
+          } />
+          } />
+          } />
           } />
-          } />
           } />
           } />
           } />
diff --git a/ui/src/api/client.ts b/ui/src/api/client.ts
index d1af84da..ebbde988 100644
--- a/ui/src/api/client.ts
+++ b/ui/src/api/client.ts
@@ -117,6 +117,7 @@ export interface SyncMatrixEntry {
   target: string;
   status: 'synced' | 'excluded' | 'not_included' | 'skill_target_mismatch' | 'na';
   reason: string;
+  kind?: 'skill' | 'agent';
 }

 // Typed API helpers
@@ -124,29 +125,43 @@ export const api = {
   // Overview
   getOverview: () => apiFetch('/overview'),

-  // Skills
-  listSkills: () => apiFetch<{ skills: Skill[] }>('/skills'),
-  getSkill: (name: string) =>
-    apiFetch<{ skill: Skill; skillMdContent: string; files: string[] }>(`/skills/${encodeURIComponent(name)}`),
-  deleteSkill: (name: string) =>
-    apiFetch<{ success: boolean }>(`/skills/${encodeURIComponent(name)}`, { method: 'DELETE' }),
-  disableSkill: (name: string) =>
+  // Resources (skills + agents)
+  listSkills: (kind?: 'skill' | 'agent') =>
+    apiFetch<{ resources: Skill[] }>(kind ? `/resources?kind=${kind}` : '/resources'),
+  getResource: (name: string, kind?: 'skill' | 'agent') =>
+    apiFetch<{ resource: Skill; skillMdContent: string; files: string[] }>(
+      `/resources/${encodeURIComponent(name)}${kind ? `?kind=${kind}` : ''}`
+    ),
+  getSkill: (name: string, kind?: 'skill' | 'agent') =>
+    api.getResource(name, kind),
+  deleteResource: (name: string, kind?: 'skill' | 'agent') =>
+    apiFetch<{ success: boolean }>(
+      `/resources/${encodeURIComponent(name)}${kind ? `?kind=${kind}` : ''}`,
+      { method: 'DELETE' }
+    ),
+  deleteSkill: (name: string, kind?: 'skill' | 'agent') =>
+    api.deleteResource(name, kind),
+  disableResource: (name: string, kind?: 'skill' | 'agent') =>
     apiFetch<{ success: boolean; name: string; disabled: boolean }>(
-      `/skills/${encodeURIComponent(name)}/disable`,
+      `/resources/${encodeURIComponent(name)}/disable${kind ? `?kind=${kind}` : ''}`,
       { method: 'POST' }
     ),
-  enableSkill: (name: string) =>
+  disableSkill: (name: string, kind?: 'skill' | 'agent') =>
+    api.disableResource(name, kind),
+  enableResource: (name: string, kind?: 'skill' | 'agent') =>
     apiFetch<{ success: boolean; name: string; disabled: boolean }>(
-      `/skills/${encodeURIComponent(name)}/enable`,
+      `/resources/${encodeURIComponent(name)}/enable${kind ? `?kind=${kind}` : ''}`,
       { method: 'POST' }
     ),
+  enableSkill: (name: string, kind?: 'skill' | 'agent') =>
+    api.enableResource(name, kind),
   batchUninstall: (opts: BatchUninstallRequest) =>
     apiFetch('/uninstall/batch', {
       method: 'POST',
       body: JSON.stringify(opts),
     }),
   getTemplates: async () => {
-    const res = await apiFetch('/skills/templates');
+    const res = await apiFetch('/resources/templates');
     // Normalize: Go omits nil slices, so scaffoldDirs may be undefined
     for (const p of res.patterns) {
       if (!p.scaffoldDirs) p.scaffoldDirs = [];
@@ -154,31 +169,31 @@ export const api = {
     return res;
   },
   createSkill: (data: CreateSkillRequest) =>
-    apiFetch('/skills', {
+    apiFetch('/resources', {
       method: 'POST',
       body: JSON.stringify(data),
     }),
   batchSetTargets: (folder: string, target: string | null) =>
-    apiFetch<{ updated: number; skipped: number; errors: string[] }>('/skills/batch/targets', {
+    apiFetch<{ updated: number; skipped: number; errors: string[] }>('/resources/batch/targets', {
       method: 'POST',
       body: JSON.stringify({ folder, target: target ?? '' }),
     }),
   setSkillTargets: (name: string, target: string | null) =>
-    apiFetch<{ success: boolean }>(`/skills/${encodeURIComponent(name)}/targets`, {
+    apiFetch<{ success: boolean }>(`/resources/${encodeURIComponent(name)}/targets`, {
       method: 'PATCH',
       body: JSON.stringify({ target: target ?? '' }),
     }),

   // Targets
   listTargets: () => apiFetch<{ targets: Target[]; sourceSkillCount: number }>('/targets'),
-  addTarget: (name: string, path: string) =>
+  addTarget: (name: string, path: string, agentPath?: string) =>
     apiFetch<{ success: boolean }>('/targets', {
       method: 'POST',
-      body: JSON.stringify({ name, path }),
+      body: JSON.stringify({ name, path, ...(agentPath && { agentPath }) }),
     }),
   removeTarget: (name: string) =>
     apiFetch<{ success: boolean }>(`/targets/${encodeURIComponent(name)}`, { method: 'DELETE' }),
-  updateTarget: (name: string, opts: { include?: string[]; exclude?: string[]; mode?: string; target_naming?: string }) =>
+  updateTarget: (name: string, opts: { include?: string[]; exclude?: string[]; mode?: string; target_naming?: string; agent_mode?: string; agent_include?: string[]; agent_exclude?: string[] }) =>
     apiFetch<{ success: boolean }>(`/targets/${encodeURIComponent(name)}`, {
       method: 'PATCH',
       body: JSON.stringify(opts),
@@ -189,14 +204,20 @@ export const api = {
     apiFetch<{ entries: SyncMatrixEntry[] }>(
      `/sync-matrix${target ? '?target=' + encodeURIComponent(target) : ''}`
     ),
-  previewSyncMatrix: (target: string, include: string[], exclude: string[]) =>
+  previewSyncMatrix: (target: string, include: string[], exclude: string[], agentInclude?: string[], agentExclude?: string[]) =>
     apiFetch<{ entries: SyncMatrixEntry[] }>('/sync-matrix/preview', {
       method: 'POST',
-      body: JSON.stringify({ target, include, exclude }),
+      body: JSON.stringify({
+        target,
+        include,
+        exclude,
+        ...(agentInclude && { agent_include: agentInclude }),
+        ...(agentExclude && { agent_exclude: agentExclude }),
+      }),
     }),

   // Sync
-  sync: (opts: { dryRun?: boolean; force?: boolean }) =>
+  sync: (opts: { dryRun?: boolean; force?: boolean; kind?: 'skill' | 'agent' }) =>
     apiFetch('/sync', {
       method: 'POST',
       body: JSON.stringify(opts),
@@ -264,14 +285,14 @@ export const api = {
       method: 'POST',
       body: JSON.stringify(opts),
     }),
-  installBatch: (opts: { source: string; skills: DiscoveredSkill[]; force?: boolean; skipAudit?: boolean; into?: string; name?: string; branch?: string }) =>
+  installBatch: (opts: { source: string; skills: DiscoveredSkill[]; force?: boolean; skipAudit?: boolean; into?: string; name?: string; branch?: string; kind?: 'skill' | 'agent' }) =>
     apiFetch('/install/batch', {
       method: 'POST',
       body: JSON.stringify(opts),
     }),

   // Update
-  update: (opts: { name?: string; force?: boolean; all?: boolean; skipAudit?: boolean }) =>
+  update: (opts: { name?: string; kind?: 'skill' | 'agent'; force?: boolean; all?: boolean; skipAudit?: boolean }) =>
     apiFetch<{ results: UpdateResultItem[] }>('/update', {
       method: 'POST',
       body: JSON.stringify(opts),
@@ -300,12 +321,17 @@ export const api = {
   // Skill file content
   getSkillFile: (skillName: string, filepath: string) =>
-    apiFetch(`/skills/${encodeURIComponent(skillName)}/files/${filepath}`),
+    apiFetch(`/resources/${encodeURIComponent(skillName)}/files/${filepath}`),

   // Collect
-  collectScan: (target?: string) =>
-    apiFetch(`/collect/scan${target ? '?target=' + encodeURIComponent(target) : ''}`),
-  collect: (opts: { skills: { name: string; targetName: string }[]; force?: boolean }) =>
+  collectScan: (target?: string, kind?: 'skill' | 'agent') => {
+    const params = new URLSearchParams();
+    if (target) params.set('target', target);
+    if (kind) params.set('kind', kind);
+    const qs = params.toString();
+    return apiFetch(`/collect/scan${qs ? '?' + qs : ''}`);
+  },
+  collect: (opts: { skills: { name: string; targetName: string; kind?: string }[]; force?: boolean }) =>
     apiFetch('/collect', {
       method: 'POST',
       body: JSON.stringify(opts),
@@ -345,12 +371,21 @@ export const api = {
   // Trash
   listTrash: () => apiFetch('/trash'),
-  restoreTrash: (name: string) =>
-    apiFetch<{ success: boolean }>(`/trash/${encodeURIComponent(name)}/restore`, { method: 'POST' }),
-  deleteTrash: (name: string) =>
-    apiFetch<{ success: boolean }>(`/trash/${encodeURIComponent(name)}`, { method: 'DELETE' }),
-  emptyTrash: () =>
-    apiFetch<{ success: boolean; removed: number }>('/trash/empty', { method: 'POST' }),
+  restoreTrash: (name: string, kind?: 'skill' | 'agent') =>
+    apiFetch<{ success: boolean }>(
+      `/trash/${encodeURIComponent(name)}/restore${kind ? `?kind=${encodeURIComponent(kind)}` : ''}`,
+      { method: 'POST' },
+    ),
+  deleteTrash: (name: string, kind?: 'skill' | 'agent') =>
+    apiFetch<{ success: boolean }>(
+      `/trash/${encodeURIComponent(name)}${kind ? `?kind=${encodeURIComponent(kind)}` : ''}`,
+      { method: 'DELETE' },
+    ),
+  emptyTrash: (kind: 'skill' | 'agent' | 'all' = 'all') =>
+    apiFetch<{ success: boolean; removed: number }>(
+      `/trash/empty${kind ? `?kind=${encodeURIComponent(kind)}` : ''}`,
+      { method: 'POST' },
+    ),

   // Extras
   listExtras: () => apiFetch<{ extras: Extra[] }>('/extras'),
@@ -400,15 +435,18 @@
   },

   // Audit
-  auditAll: () => apiFetch('/audit'),
-  auditSkill: (name: string) => apiFetch(`/audit/${encodeURIComponent(name)}`),
+  auditAll: (kind?: 'skills' | 'agents') =>
+    apiFetch(`/audit${kind ? '?kind=' + kind : ''}`),
+  auditSkill: (name: string, kind?: 'skill' | 'agent') =>
+    apiFetch(`/audit/${encodeURIComponent(name)}${kind === 'agent' ? '?kind=agent' : ''}`),
   auditAllStream: (
     onStart: (total: number) => void,
     onProgress: (scanned: number) => void,
     onDone: (data: AuditAllResponse) => void,
     onError: (err: Error) => void,
+    kind?: 'skills' | 'agents',
   ): EventSource =>
-    createSSEStream(BASE + '/audit/stream', {
+    createSSEStream(BASE + `/audit/stream${kind ? '?kind=' + kind : ''}`, {
       start: (d) => onStart(d.total),
       progress: (d) => onProgress(d.scanned),
       done: onDone,
@@ -469,6 +507,14 @@
       method: 'PUT',
       body: JSON.stringify({ raw }),
     }),
+
+  // Agentignore
+  getAgentignore: () => apiFetch('/agentignore'),
+  putAgentignore: (raw: string) =>
+    apiFetch<{ success: boolean }>('/agentignore', {
+      method: 'PUT',
+      body: JSON.stringify({ raw }),
+    }),
 };

 // Types
@@ -480,7 +526,10 @@ export interface TrackedRepo {
 }

 export interface Overview {
   source: string;
+  agentsSource?: string;
+  extrasSource?: string;
   skillCount: number;
+  agentCount: number;
   topLevelCount: number;
   targetCount: number;
   mode: string;
@@ -501,6 +550,7 @@ export interface VersionCheck {
 export interface Skill {
   name: string;
+  kind: 'skill' | 'agent';
   flatName: string;
   relPath: string;
   sourcePath: string;
@@ -561,6 +611,12 @@ export interface Target {
   expectedSkillCount: number;
   skippedSkillCount?: number;
   collisionCount?: number;
+  agentPath?: string;
+  agentMode?: string;
+  agentInclude?: string[];
+  agentExclude?: string[];
+  agentLinkedCount?: number;
+  agentExpectedCount?: number;
 }

 export interface SyncResult {
@@ -577,6 +633,9 @@ export interface IgnoreSources {
   ignored_skills: string[];
   ignore_root: string;
   ignore_repos: string[];
+  agent_ignore_root?: string;
+  agent_ignored_count?: number;
+  agent_ignored_skills?: string[];
 }

 export interface SyncResponse extends IgnoreSources {
@@ -591,7 +650,7 @@ export interface ConfigSaveResponse {
 export interface DiffTarget {
   target: string;
-  items: { skill: string; action: string; reason?: string }[];
+  items: { skill: string; action: string; reason?: string; kind?: 'skill' | 'agent' }[];
   skippedCount?: number;
   collisionCount?: number;
 }
@@ -625,6 +684,7 @@ export interface InstallResult {
 export interface UpdateResultItem {
   name: string;
+  kind?: 'skill' | 'agent';
   action: string; // "updated", "up-to-date", "skipped", "error", "blocked"
   message?: string;
   isRepo: boolean;
@@ -643,6 +703,7 @@ export interface UpdateStreamSummary {
 export interface AvailableTarget {
   name: string;
   path: string;
+  agentPath?: string;
   installed: boolean;
   detected: boolean;
 }
@@ -657,11 +718,20 @@ export interface DiscoveredSkill {
   name: string;
   path: string;
   description?: string;
+  kind?: 'skill' | 'agent';
+}
+
+export interface DiscoveredAgent {
+  name: string;
+  path: string;
+  fileName: string;
+  kind: 'agent';
 }

 export interface DiscoverResult {
   needsSelection: boolean;
   skills: DiscoveredSkill[];
+  agents: DiscoveredAgent[];
 }

 export interface BatchInstallResultItem {
@@ -678,6 +748,7 @@ export interface BatchInstallResult {
 export interface BatchUninstallRequest {
   names: string[];
+  kind?: 'skill' | 'agent';
   force?: boolean;
 }
@@ -699,6 +770,7 @@ export interface LocalSkillInfo {
   targetName: string;
   size: number;
   modTime: string;
+  kind?: 'skill' | 'agent';
 }

 export interface CollectScanTarget {
@@ -720,6 +792,7 @@ export interface CollectResult {
 // Trash types
 export interface TrashedSkill {
   name: string;
+  kind?: 'skill' | 'agent';
   timestamp: string;
   date: string;
   size: number;
@@ -763,6 +836,7 @@ export interface RepoCheckResult {
 export interface SkillCheckResult {
   name: string;
+  kind?: 'skill' | 'agent';
   source: string;
   version: string;
   status: string;
@@ -853,6 +927,7 @@ export interface LogStatsResponse {
 // Audit types
 export interface AuditFinding {
   severity: 'CRITICAL' | 'HIGH' | 'MEDIUM' | 'LOW' | 'INFO';
+  kind?: 'skill' | 'agent';
   pattern: string;
   message: string;
   file: string;
@@ -867,6 +942,7 @@ export interface AuditFinding {
 export interface AuditResult {
   skillName: string;
+  kind?: 'skill' | 'agent';
   findings: AuditFinding[];
   riskScore: number;
   riskLabel: 'clean' | 'low' | 'medium' | 'high' | 'critical';
@@ -1019,3 +1095,18 @@ export interface SkillignoreResponse {
   raw: string;
   stats?: SkillignoreStats;
 }
+
+// Agentignore types
+export interface AgentignoreStats {
+  pattern_count: number;
+  ignored_count: number;
+  patterns: string[];
+  ignored_agents: string[];
+}
+
+export interface AgentignoreResponse {
+  exists: boolean;
+  path: string;
+  raw: string;
+  stats?: AgentignoreStats;
+}
diff --git a/ui/src/components/Button.tsx b/ui/src/components/Button.tsx
index f343d878..4427f014 100644
--- a/ui/src/components/Button.tsx
+++ b/ui/src/components/Button.tsx
@@ -3,7 +3,7 @@ import Spinner from './Spinner';

 interface ButtonProps extends ButtonHTMLAttributes {
   children: ReactNode;
-  variant?: 'primary' | 'secondary' | 'danger' | 'ghost' | 'link';
+  variant?: 'primary' | 'secondary' | 'danger' | 'warning' | 'ghost' | 'link';
   size?: 'xs' | 'sm' | 'md' | 'lg';
   loading?: boolean;
   ref?: Ref;
@@ -13,6 +13,7 @@ const variantClasses = {
   primary: 'bg-pencil text-paper border-2 border-pencil hover:bg-pencil/85',
   secondary: 'bg-transparent text-pencil border-2 border-muted-dark hover:bg-muted/30 hover:border-pencil hover:shadow-sm',
   danger: 'bg-transparent text-danger border-2 border-danger hover:bg-danger hover:text-white',
+  warning: 'bg-transparent text-warning border-2 border-warning hover:bg-warning hover:text-white',
   ghost: 'bg-transparent text-pencil-light hover:text-pencil hover:bg-muted/30',
   link: 'bg-transparent text-pencil-light hover:text-pencil hover:underline border-none',
 };
diff --git a/ui/src/components/InstallForm.tsx b/ui/src/components/InstallForm.tsx
index cb18942b..96feafa7 100644
--- a/ui/src/components/InstallForm.tsx
+++ b/ui/src/components/InstallForm.tsx
@@ -8,9 +8,11 @@ import { Input, Checkbox } from './Input';
 import SkillPickerModal from './SkillPickerModal';
 import ConfirmDialog from './ConfirmDialog';
 import { useToast } from './Toast';
-import { api, type InstallResult, type DiscoveredSkill } from '../api/client';
+import { api, type InstallResult, type DiscoveredSkill, type DiscoveredAgent } from '../api/client';
 import { queryKeys } from '../lib/queryKeys';
+import { clearAuditCache } from '../lib/auditCache';
 import { radius } from '../design';
+import { formatSkillDisplayName } from '../lib/resourceNames';

 interface InstallFormProps {
   /** Called after a successful install with the result */
@@ -109,7 +111,9 @@ export default function InstallForm({
   // Discovery flow state
   const [discoveredSkills, setDiscoveredSkills] = useState([]);
+  const [discoveredAgents, setDiscoveredAgents] = useState([]);
   const [showPicker, setShowPicker] = useState(false);
+  const [showKindSelector, setShowKindSelector] = useState(false);
   const [pendingSource, setPendingSource] = useState('');
   const [batchInstalling, setBatchInstalling] = useState(false);
@@ -140,6 +144,7 @@ export default function InstallForm({
   };

   const invalidateAfterInstall = () => {
+    clearAuditCache(queryClient);
     queryClient.invalidateQueries({ queryKey: queryKeys.skills.all });
     queryClient.invalidateQueries({ queryKey: queryKeys.overview });
   };
@@ -201,10 +206,14 @@ export default function InstallForm({
       });
       toast(res.summary, 'success');
       const allWarnings: string[] = [];
+      const allErrors: string[] = [];
       for (const item of res.results) {
-        if (item.error) toast(`${item.name}: ${item.error}`, 'error');
+        if (item.error) allErrors.push(`${formatSkillDisplayName(item.name)}: ${item.error}`);
         if (item.warnings?.length) allWarnings.push(...item.warnings.map((w) => `${item.name}: ${w}`));
       }
+      if (allErrors.length > 0) {
+        toast(`${allErrors.length} failed: ${allErrors.join('; ')}`, 'error');
+      }
       if (allWarnings.length > 0) setWarningDialog(allWarnings);
       resetForm();
       invalidateAfterInstall();
@@ -259,13 +268,40 @@ export default function InstallForm({
     setInstalling(true);
     try {
       const disc = await api.discover(trimmed, branch.trim() || undefined);
+      const hasSkills = disc.skills.length > 0;
+      const hasAgents = (disc.agents?.length ?? 0) > 0;
+
+      // Mixed repo — kind-first selection
+      if (hasSkills && hasAgents) {
+        setDiscoveredSkills(disc.skills);
+        setDiscoveredAgents(disc.agents);
+        setPendingSource(trimmed);
+        setShowKindSelector(true);
+        setInstalling(false);
+        return;
+      }
+
+      // Pure agent repo — show agents as skills in picker
+      if (!hasSkills && hasAgents) {
+        const agentAsSkills: DiscoveredSkill[] = disc.agents.map((a) => ({
+          name: a.name,
+          path: a.path,
+          kind: 'agent' as const,
+        }));
+        setDiscoveredSkills(agentAsSkills);
+        setPendingSource(trimmed);
+        setShowPicker(true);
+        setInstalling(false);
+        return;
+      }
+
       if (disc.skills.length > 1) {
         // Multiple skills found — open picker
         setDiscoveredSkills(disc.skills);
         setPendingSource(trimmed);
         setShowPicker(true);
-      } else if (disc.skills.length === 1) {
-        // Single discovered skill — install via batch
+      } else if (disc.skills.length === 1 && !hasAgents) {
+        // Single discovered skill (no agents) — install via batch
         const res = await api.installBatch({
           source: trimmed,
           skills: disc.skills,
@@ -275,6 +311,7 @@ export default function InstallForm({
           branch: branch.trim() || undefined,
         });
         const allWarnings: string[] = [];
+        const allErrors: string[] = [];
         const auditFindings: string[] = [];
         const auditBlockedSkills: DiscoveredSkill[] = [];
         let installed = 0;
@@ -285,13 +322,16 @@ export default function InstallForm({
             const skill = disc.skills.find((s) => s.name === item.name);
             if (skill) auditBlockedSkills.push(skill);
           } else {
-            toast(`${item.name}: ${item.error}`, 'error');
+            allErrors.push(`${formatSkillDisplayName(item.name)}: ${item.error}`);
           }
         } else {
           installed++;
         }
         if (item.warnings?.length) allWarnings.push(...item.warnings.map((w) => `${item.name}: ${w}`));
       }
+      if (allErrors.length > 0) {
+        toast(`${allErrors.length} failed: ${allErrors.join('; ')}`, 'error');
+      }
       if (installed > 0) {
         const variant = auditBlockedSkills.length > 0 ? 'warning' : 'success';
         toast(res.summary, variant as 'success' | 'warning');
@@ -328,6 +368,7 @@ export default function InstallForm({
   const handleBatchInstall = async (selected: DiscoveredSkill[]) => {
     setBatchInstalling(true);
     try {
+      const detectedKind = selected[0]?.kind;
       const res = await api.installBatch({
         source: pendingSource,
         skills: selected,
@@ -336,8 +377,10 @@ export default function InstallForm({
         skipAudit,
         name: selected.length === 1 && name.trim() ? name.trim() : undefined,
         branch: branch.trim() || undefined,
+        kind: detectedKind === 'agent' ? 'agent' : undefined,
       });
       const allWarnings: string[] = [];
+      const allErrors: string[] = [];
       const auditFindings: string[] = [];
       const auditBlockedSkills: DiscoveredSkill[] = [];
       let installed = 0;
@@ -348,13 +391,16 @@ export default function InstallForm({
             const skill = selected.find((s) => s.name === item.name);
             if (skill) auditBlockedSkills.push(skill);
           } else {
-            toast(`${item.name}: ${item.error}`, 'error');
+            allErrors.push(`${formatSkillDisplayName(item.name)}: ${item.error}`);
           }
         } else {
           installed++;
         }
         if (item.warnings?.length) allWarnings.push(...item.warnings.map((w) => `${item.name}: ${w}`));
       }
+      if (allErrors.length > 0) {
+        toast(`${allErrors.length} failed: ${allErrors.join('; ')}`, 'error');
+      }

       // Only show summary + close picker when at least one skill installed
       if (installed > 0) {
@@ -443,7 +489,7 @@ export default function InstallForm({
             value={name}
             onChange={(e) => setName(e.target.value)}
           />
-

Only applies to single skill install

+

Only applies to single resource install

- Track keeps the git repo linked for updates · Force overwrites existing skills · Skip audit bypasses security scan + Track keeps the git repo linked for updates · Force overwrites existing resources · Skip audit bypasses security scan

@@ -505,6 +551,55 @@ export default function InstallForm({
     />
   );

+  const kindSelectorDialog = (
+      This repository contains both skills and agents. What would you like to install?
+      }
+      confirmText=""
+      cancelText="Cancel"
+      onConfirm={() => setShowKindSelector(false)}
+      onCancel={() => setShowKindSelector(false)}
+    />
+  );

   const auditConfirmDialog = (
-        Skill installed with audit warnings
+        Resource installed with audit warnings
         {warningFindings.length} {warningFindings.length === 1 ? 'warning' : 'warnings'}:
@@ -632,6 +727,7 @@ export default function InstallForm({
       {formContent}
       {pickerModal}
+      {kindSelectorDialog}
       {auditConfirmDialog}
       {warningConfirmDialog}
diff --git a/ui/src/components/KindBadge.tsx b/ui/src/components/KindBadge.tsx
new file mode 100644
index 00000000..b5488979
--- /dev/null
+++ b/ui/src/components/KindBadge.tsx
@@ -0,0 +1,18 @@
+interface KindBadgeProps {
+  kind: 'skill' | 'agent';
+}
+
+const styles = {
+  agent: 'text-blue bg-info-light',
+  skill: 'text-pencil-light bg-muted',
+};
+
+export default function KindBadge({ kind }: KindBadgeProps) {
+  return (
+      {kind}
+  );
+}
diff --git a/ui/src/components/Layout.tsx b/ui/src/components/Layout.tsx
index 6a3ffe43..c5782b51 100644
--- a/ui/src/components/Layout.tsx
+++ b/ui/src/components/Layout.tsx
@@ -2,7 +2,7 @@ import { NavLink, Outlet, useNavigate, useLocation } from 'react-router-dom';
 import { useState, useCallback, useEffect } from 'react';
 import {
   LayoutDashboard,
-  Puzzle,
+  Layers,
   Target,
   FolderPlus,
   RefreshCw,
@@ -55,7 +55,7 @@ const navGroups: NavGroup[] = [
   {
     label: 'MANAGE',
     items: [
-      { to: '/skills', icon: Puzzle, label: 'Skills' },
+      { to: '/resources', icon: Layers, label: 'Resources' },
       { to: '/extras', icon: FolderPlus, label: 'Extras' },
       { to: '/targets', icon: Target, label: 'Targets' },
       { to: '/search', icon: Search, label: 'Search' },
diff --git a/ui/src/components/SkillPickerModal.tsx b/ui/src/components/SkillPickerModal.tsx
index 30ed8654..c6e368eb 100644
--- a/ui/src/components/SkillPickerModal.tsx
+++ b/ui/src/components/SkillPickerModal.tsx
@@ -5,6 +5,7 @@ import DialogShell from './DialogShell';
 import { Input, Checkbox } from './Input';
 import { radius } from '../design';
 import type { DiscoveredSkill } from '../api/client';
+import KindBadge from './KindBadge';

 interface SkillPickerModalProps {
   open: boolean;
@@ -82,10 +83,18 @@ export default function SkillPickerModal({
     if (items.length > 0) onInstall(items);
   };

+  const allAgents = skills.length > 0 && skills.every((s) => s.kind === 'agent');
+  const someAgents = skills.some((s) => s.kind === 'agent');
+  const singularLabel = allAgents ? 'agent' : someAgents ? 'resource' : 'skill';
+  const pluralLabel = allAgents ? 'agents' : someAgents ? 'resources' : 'skills';
+
   return (
-      {singleSelect ? 'Select a Skill to Install' : 'Select Skills to Install'}
+      {singleSelect
+        ? `Select ${singularLabel[0].toUpperCase() + singularLabel.slice(1)} to Install`
+        : `Select ${pluralLabel[0].toUpperCase() + pluralLabel.slice(1)} to Install`
+      }
       {source}
@@ -101,7 +110,7 @@ export default function SkillPickerModal({
           />
           setFilter(e.target.value)}
           className="!pl-8 !py-1.5 !text-sm font-mono"
@@ -119,7 +128,7 @@ export default function SkillPickerModal({
           />
           {filter && (
-            {filtered.length} of {skills.length} skills
+            {filtered.length} of {skills.length} {pluralLabel}
           )}
@@ -129,40 +138,50 @@ export default function SkillPickerModal({
         {singleSelect && (
-            Custom name is set — select one skill
+            Custom name is set — select one {singularLabel}
             {filter && ` (${filtered.length} of ${skills.length})`}
         )}

         {/* Skill list */}
-
+
         {filtered.map((skill) => {
           const isSelected = selected.has(skill.path);
           return (
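For reviewers tracing the SkillPickerModal wording changes above: the new heading and count labels are derived from the discovered items' `kind` field. Here is a minimal standalone sketch of that derivation — `PickedItem` is a hypothetical reduction of the real `DiscoveredSkill` type to the one field this logic reads, not the actual API client type:

```typescript
// Hypothetical stand-in for DiscoveredSkill, reduced to the field used here.
interface PickedItem {
  kind?: 'skill' | 'agent';
}

function pickerLabels(items: PickedItem[]): { singular: string; plural: string } {
  // All-agent repos say "agent", mixed repos fall back to the neutral
  // "resource", and skill-only repos keep the original "skill" wording.
  const allAgents = items.length > 0 && items.every((i) => i.kind === 'agent');
  const someAgents = items.some((i) => i.kind === 'agent');
  return {
    singular: allAgents ? 'agent' : someAgents ? 'resource' : 'skill',
    plural: allAgents ? 'agents' : someAgents ? 'resources' : 'skills',
  };
}

console.log(pickerLabels([{ kind: 'skill' }, { kind: 'agent' }]).singular); // → "resource"
```

The empty-list case deliberately resolves to "skill"/"skills", preserving the pre-change default wording.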