
Add embeddable API, convo persistence & cancel#14

Open
jhd3197 wants to merge 2 commits into `main` from `dev`

Conversation

jhd3197 (Owner) commented Mar 1, 2026

Introduce an embeddable AgentSite API and conversation persistence so projects can be generated and resumed in-process.

- Add README docs for the embeddable component and its async API (generate_website, regenerate_page, load_project).
- Export new types and functions from the agentsite and engine packages.
- Add dataclasses ConversationMessage, PageState, and ProjectState, and implement load_project/delete_project in engine.component.
- Extend GenerationConfig with review/cancellation/context fields and record conversation messages during runs; persist messages to messages.json via ProjectManager.append_message/load_messages.
- Pipeline enhancements: accept review/cancel/context parameters, emit events (review_feedback, site_plan_ready, style_spec_ready), and short-circuit on cancellation.
- Add .pr to .gitignore and include a .claude skill for creating PR descriptions.

These changes enable iterative, host-driven workflows (resume, cancel, preview design/review feedback) when embedding AgentSite as a library.

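For hosts wiring up an event callback, the new events can be handled with a small dispatcher. This is a sketch: the event names come from the bullet list above, and the `parsed` flag appears in the pipeline diff later in this thread, but the other payload keys (such as `feedback`) are assumptions, not verified API.

```python
from typing import Any

def describe_event(event_type: str, data: dict[str, Any]) -> str:
    """Turn the new pipeline events into one-line log entries.

    Event names come from the PR description; payload keys other than
    style_spec_ready's 'parsed' flag are assumptions for illustration.
    """
    if event_type == "site_plan_ready":
        return "site plan ready"
    if event_type == "style_spec_ready":
        return f"style spec ready (parsed={data.get('parsed')})"
    if event_type == "review_feedback":
        # Truncate long feedback so log lines stay readable
        return f"review feedback: {data.get('feedback', '')[:60]}"
    return event_type
```

A host could pass a callback like `lambda e: print(describe_event(e.type, e.data))` to the generation API's event hook.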
Copilot AI review requested due to automatic review settings March 1, 2026 04:37

Copilot AI left a comment


Pull request overview

This PR introduces an embeddable (in-process) AgentSite API with disk-backed project/conversation persistence and host-driven controls (review config + cancellation), enabling “generate → resume → iterate” workflows without running the server/UI.

Changes:

- Add conversation persistence via messages.json and expose project lifecycle helpers (load_project, delete_project) for restoring/resuming work.
- Extend the generation pipeline/component API with review parameters, conversation context, new events (review_feedback, site_plan_ready, style_spec_ready), and cancellation checks.
- Update package exports and README documentation to reflect the new embeddable usage.

Reviewed changes

Copilot reviewed 7 out of 8 changed files in this pull request and generated 7 comments.

| File | Description |
| --- | --- |
| `agentsite/engine/project_manager.py` | Adds `messages.json` persistence helpers (`append_message`, `load_messages`) and updates the documented on-disk layout. |
| `agentsite/engine/pipeline.py` | Adds new pipeline events, review feedback emission, conversation context plumbing, and cancellation short-circuit points. |
| `agentsite/engine/component.py` | Introduces `ConversationMessage`/`ProjectState` dataclasses, `load_project`/`delete_project`, and records messages during runs. |
| `agentsite/engine/__init__.py` | Re-exports new component APIs and state/message types from the engine package. |
| `agentsite/__init__.py` | Re-exports new embeddable APIs/types at the top-level package. |
| `README.md` | Adds "Embeddable Component" docs and usage examples for in-process generation and persistence. |
| `.gitignore` | Ignores the `/.pr` directory used for PR-description tooling artifacts. |
| `.claude/skills/create-pr/SKILL.md` | Adds a Claude skill doc for generating PR titles/descriptions into `.pr/`. |
Comments suppressed due to low confidence (1)

`agentsite/engine/component.py:526`

If `pipeline.generate(...)` returns `success=False` (e.g., cancellation after PM/Designer), this code still persists an assistant message claiming generation succeeded (`success: True`) and returns a `GenerationResult` with `success=False` but no error. Consider branching on `result.success` (and/or `cancel_event.is_set()`) to persist an accurate message and set a meaningful error such as "Cancelled by host".
```python
        # Persist assistant message
        pm.append_message(
            project.id,
            ConversationMessage(
                role="assistant",
                content=f"Generated '{slug}' page (v{version}) with {len(files)} files",
                timestamp=datetime.now(timezone.utc).isoformat(),
                meta={"slug": slug, "version": version, "files": files, "success": True},
            ),
        )

        return GenerationResult(
            project_id=project.id,
            slug=slug,
            version=version,
            files=files,
            files_content=files_content,
            output_dir=pm.version_dir(project.id, slug, version),
            usage=getattr(result, "aggregate_usage", {}),
            agent_runs=[r.model_dump() for r in pipeline.agent_runs],
            style_spec=parsed_ss or project.style_spec,
            site_plan_raw=pipeline.site_plan_text,
            success=getattr(result, "success", True),
        )
```


Comment on lines 599 to +710
```python
@@ -585,6 +625,7 @@ async def _on_round_complete(round_number: int) -> None:
        "design_system_guide": design_system_guide,
        "architecture_guide": architecture_guide,
        "tech_stack": tech_stack.model_dump_json(),
        "conversation_context": conversation_context,
    }

    if "designer" in required_agents:
@@ -622,6 +663,20 @@ async def _on_round_complete(round_number: int) -> None:
        style_spec_text = designer_result.shared_state.get("style_spec", "")
        self.style_spec_text = style_spec_text

        # Emit style_spec_ready so hosts can preview design before dev starts
        _style_parsed = False
        if style_spec_text:
            try:
                from prompture import clean_json_text as _cjt
                json.loads(_cjt(style_spec_text))
                _style_parsed = True
            except Exception:
                pass
        await self._emit("style_spec_ready", data={
            "style_spec": style_spec_text,
            "parsed": _style_parsed,
        })

        initial_state["style_spec"] = style_spec_text

        # Merge designer usage into pm_result for later aggregation
@@ -642,6 +697,18 @@ async def _on_round_complete(round_number: int) -> None:
    else:
        initial_state["style_spec"] = StyleSpec().model_dump_json()

    # --- Cancellation check after Designer ---
    if cancel_event and cancel_event.is_set():
        await self._emit("generation_complete", data={
            "success": False, "slug": slug, "version": version_number,
            "files": [], "error": "Cancelled by host",
        })
        return GroupResult(
            agent_results=[], aggregate_usage={},
            shared_state={}, elapsed_ms=0, timeline=[], errors=[],
            success=False,
        )
```

Copilot AI Mar 1, 2026


Cancellation is only checked after the PM and Designer phases. If cancel_event is set while the build/review pipeline is running, generation will continue to completion because the event isn't propagated into Prompture groups. If the intent is host-driven cancellation, consider wiring cancellation into the agent/group execution (if Prompture supports it) or adding additional checks between major steps so cancel requests take effect promptly.
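The point above can be made concrete with a stage loop that re-checks the event at every boundary, not just after PM/Designer. The `run_stages` helper and its result dict are illustrative, not AgentSite API; only the "Cancelled by host" error string mirrors the diff.

```python
import threading
from typing import Any, Callable, Optional

def run_stages(
    stages: list[tuple[str, Callable[[], None]]],
    cancel_event: Optional[threading.Event] = None,
) -> dict[str, Any]:
    """Run pipeline stages in order, checking the cancel event at every
    stage boundary so a host-set cancel takes effect promptly."""
    completed: list[str] = []
    for name, step in stages:
        # Boundary check before each stage, mirroring the pipeline's
        # post-PM/post-Designer checks but applied uniformly.
        if cancel_event is not None and cancel_event.is_set():
            return {"success": False, "error": "Cancelled by host", "completed": completed}
        step()
        completed.append(name)
    return {"success": True, "error": None, "completed": completed}
```

With checks at every boundary, a cancel set during the build stage stops the run before review rather than after full completion.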

Comment thread README.md Outdated
Comment on lines +190 to +213
```python
from agentsite import generate_website, regenerate_page, GenerationConfig

# Generate a site from a prompt
result = await generate_website(
    "A dark portfolio site with projects and contact page",
    output_dir=Path("./websites"),
    config=GenerationConfig(
        model="openai/gpt-4o",
        provider_keys={"openai": os.environ["OPENAI_API_KEY"]},
        max_cost=0.50,
    ),
    on_event=lambda e: print(f"{e.agent}: {e.type}"),
)

for path, html in result.files_content.items():
    print(f"{path}: {len(html)} bytes")

# Iterate on the same project with new feedback
v2 = await regenerate_page(
    "Make the hero section taller and add a testimonials page",
    output_dir=Path("./websites"),
    project_id=result.project_id,
    config=GenerationConfig(model="openai/gpt-4o"),
)
```

Copilot AI Mar 1, 2026


The embeddable API example uses Path and os.environ but doesn't import Path or os, and uses await at top-level (which will error outside an async context). Adding the missing imports and indicating that this runs inside async def (or showing asyncio.run(...)) would make the snippet copy/pasteable.

Suggested change

Removed:

```python
from agentsite import generate_website, regenerate_page, GenerationConfig

# Generate a site from a prompt
result = await generate_website(
    "A dark portfolio site with projects and contact page",
    output_dir=Path("./websites"),
    config=GenerationConfig(
        model="openai/gpt-4o",
        provider_keys={"openai": os.environ["OPENAI_API_KEY"]},
        max_cost=0.50,
    ),
    on_event=lambda e: print(f"{e.agent}: {e.type}"),
)

for path, html in result.files_content.items():
    print(f"{path}: {len(html)} bytes")

# Iterate on the same project with new feedback
v2 = await regenerate_page(
    "Make the hero section taller and add a testimonials page",
    output_dir=Path("./websites"),
    project_id=result.project_id,
    config=GenerationConfig(model="openai/gpt-4o"),
)
```

Added:

```python
import os
import asyncio
from pathlib import Path

from agentsite import generate_website, regenerate_page, GenerationConfig

async def main() -> None:
    # Generate a site from a prompt
    result = await generate_website(
        "A dark portfolio site with projects and contact page",
        output_dir=Path("./websites"),
        config=GenerationConfig(
            model="openai/gpt-4o",
            provider_keys={"openai": os.environ["OPENAI_API_KEY"]},
            max_cost=0.50,
        ),
        on_event=lambda e: print(f"{e.agent}: {e.type}"),
    )

    for path, html in result.files_content.items():
        print(f"{path}: {len(html)} bytes")

    # Iterate on the same project with new feedback
    v2 = await regenerate_page(
        "Make the hero section taller and add a testimonials page",
        output_dir=Path("./websites"),
        project_id=result.project_id,
        config=GenerationConfig(model="openai/gpt-4o"),
    )

if __name__ == "__main__":
    asyncio.run(main())
```

Comment thread README.md
| `agent_configs` | `dict[str, AgentConfig] \| None` | `None` | Per-agent overrides |
| `style_spec` | `StyleSpec \| None` | `None` | Pre-defined design tokens |
| `logo_url` | `str` | `""` | Logo URL for the site |
| `icon_url` | `str` | `""` | Favicon URL |

Copilot AI Mar 1, 2026


GenerationConfig gained new fields (max_review_iterations, review_threshold, cancel_event, conversation_context) but the README table doesn't list them. Please document these fields (and defaults/behavior) so embedders know how to enable review gating and cancellation/context features.

Suggested change
| `icon_url` | `str` | `""` | Favicon URL |
| `icon_url` | `str` | `""` | Favicon URL |
| `max_review_iterations` | `int` | `0` | Maximum number of automated review/fix cycles per page. `0` disables review gating and accepts the first draft. |
| `review_threshold` | `float` | `0.0` | Minimum review score (0.0–1.0) required to accept a page when review gating is enabled. Only used when `max_review_iterations > 0`. |
| `cancel_event` | `Any \| None` | `None` | Optional cooperative-cancellation flag (e.g. a `threading.Event`). When set during generation, the run aborts as soon as possible. |
| `conversation_context` | `dict[str, Any] \| None` | `None` | Extra context injected into all agent prompts (e.g. user/session/site metadata). Must be JSON-serializable. |
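For illustration, a stand-alone dataclass mirroring the proposed rows. The field names match the PR, and the defaults and types are the ones suggested in the table above; none of this is verified against the real `GenerationConfig`.

```python
import threading
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class GenerationConfigSketch:
    """Sketch of the new GenerationConfig fields; defaults per the table above."""
    model: str = "openai/gpt-4o"
    max_review_iterations: int = 0          # 0 disables review gating
    review_threshold: float = 0.0           # minimum score to accept a page
    cancel_event: Optional[threading.Event] = None   # cooperative cancellation flag
    conversation_context: Optional[dict[str, Any]] = None  # JSON-serializable extras

# Enable review gating, cancellation, and extra prompt context
cfg = GenerationConfigSketch(
    max_review_iterations=2,
    review_threshold=0.8,
    cancel_event=threading.Event(),
    conversation_context={"brand": "Acme", "tone": "playful"},
)
```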

Comment thread agentsite/engine/project_manager.py Outdated
Comment on lines +191 to +207
def append_message(self, project_id: str, message: object) -> None:
"""Append a ConversationMessage (dataclass) to messages.json."""
import dataclasses

path = self._messages_path(project_id)
messages: list[dict[str, Any]] = []
if path.exists():
messages = json.loads(path.read_text(encoding="utf-8"))
messages.append(dataclasses.asdict(message))
path.write_text(json.dumps(messages, indent=2), encoding="utf-8")

def load_messages(self, project_id: str) -> list[dict[str, Any]]:
"""Load conversation messages from disk. Returns [] if file is missing."""
path = self._messages_path(project_id)
if not path.exists():
return []
return json.loads(path.read_text(encoding="utf-8"))

Copilot AI Mar 1, 2026


append_message/load_messages call json.loads on messages.json without handling JSONDecodeError or validating the parsed type. A partially-written/corrupted file (or manual edits) will crash generation/load. Consider catching decode errors, defaulting to [], and (optionally) validating that the loaded value is a list of dicts before appending/returning.

Suggested change

Removed:

```python
def append_message(self, project_id: str, message: object) -> None:
    """Append a ConversationMessage (dataclass) to messages.json."""
    import dataclasses

    path = self._messages_path(project_id)
    messages: list[dict[str, Any]] = []
    if path.exists():
        messages = json.loads(path.read_text(encoding="utf-8"))
    messages.append(dataclasses.asdict(message))
    path.write_text(json.dumps(messages, indent=2), encoding="utf-8")

def load_messages(self, project_id: str) -> list[dict[str, Any]]:
    """Load conversation messages from disk. Returns [] if file is missing."""
    path = self._messages_path(project_id)
    if not path.exists():
        return []
    return json.loads(path.read_text(encoding="utf-8"))
```

Added:

```python
def _load_messages_file(self, path: Path) -> list[dict[str, Any]]:
    """Safely load messages from a JSON file, returning [] on errors."""
    if not path.exists():
        return []
    try:
        raw = json.loads(path.read_text(encoding="utf-8"))
    except (json.JSONDecodeError, OSError, UnicodeDecodeError):
        # Corrupted, partially written, or unreadable file: treat as empty.
        return []
    if not isinstance(raw, list):
        return []
    # Ensure we only return list elements that are dict-like.
    messages: list[dict[str, Any]] = []
    for item in raw:
        if isinstance(item, dict):
            messages.append(item)
    return messages

def append_message(self, project_id: str, message: object) -> None:
    """Append a ConversationMessage (dataclass) to messages.json."""
    import dataclasses

    path = self._messages_path(project_id)
    messages: list[dict[str, Any]] = self._load_messages_file(path)
    messages.append(dataclasses.asdict(message))
    path.write_text(json.dumps(messages, indent=2), encoding="utf-8")

def load_messages(self, project_id: str) -> list[dict[str, Any]]:
    """Load conversation messages from disk. Returns [] if file is missing."""
    path = self._messages_path(project_id)
    return self._load_messages_file(path)
```

Comment thread agentsite/engine/project_manager.py Outdated
Comment on lines +191 to +200
def append_message(self, project_id: str, message: object) -> None:
"""Append a ConversationMessage (dataclass) to messages.json."""
import dataclasses

path = self._messages_path(project_id)
messages: list[dict[str, Any]] = []
if path.exists():
messages = json.loads(path.read_text(encoding="utf-8"))
messages.append(dataclasses.asdict(message))
path.write_text(json.dumps(messages, indent=2), encoding="utf-8")

Copilot AI Mar 1, 2026


append_message uses a read-modify-write cycle on messages.json with no synchronization, so concurrent generations on the same project_id can lose messages or write invalid JSON. If concurrent use is expected for the embeddable API, consider a file lock, atomic write via temp+rename, or switching to an append-friendly format (e.g., JSONL).
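A minimal sketch of the temp-file-plus-rename approach the comment suggests, written against a plain `Path` rather than `ProjectManager` (the function name is illustrative). `os.replace` publishes the new file atomically, so readers never observe a half-written `messages.json`, though a cross-process file lock would still be needed to prevent lost updates between concurrent writers.

```python
import json
import os
import tempfile
from pathlib import Path
from typing import Any

def append_message_atomic(path: Path, message: dict[str, Any]) -> None:
    """Read-modify-write a messages.json file, publishing via a temp file
    plus os.replace so readers never see a partially written file."""
    messages: list[dict[str, Any]] = []
    if path.exists():
        try:
            raw = json.loads(path.read_text(encoding="utf-8"))
            if isinstance(raw, list):
                # Keep only dict-shaped entries; drop anything malformed.
                messages = [m for m in raw if isinstance(m, dict)]
        except json.JSONDecodeError:
            pass  # corrupted file: start fresh rather than crash
    messages.append(message)

    # Write to a sibling temp file, then atomically swap it into place.
    fd, tmp = tempfile.mkstemp(dir=path.parent, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(messages, f, indent=2)
        os.replace(tmp, path)  # atomic on both POSIX and Windows
    except BaseException:
        os.unlink(tmp)
        raise
```

The JSONL alternative the comment mentions would avoid the read-modify-write cycle entirely, at the cost of changing the on-disk format.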

Comment on lines 17 to 26
"""Manages project directories and files on disk.

Filesystem layout::

{base}/{project_id}/
├── project.json
├── messages.json
├── assets/
├── guides/
└── pages/

Copilot AI Mar 1, 2026


The filesystem layout docstring now lists guides/ and messages.json, but create() only creates pages/ and assets/. Either create those paths during create() or adjust the documented layout so it matches what a newly-created project directory actually contains.

Comment thread agentsite/engine/project_manager.py Outdated
Comment on lines +203 to +208
"""Load conversation messages from disk. Returns [] if file is missing."""
path = self._messages_path(project_id)
if not path.exists():
return []
return json.loads(path.read_text(encoding="utf-8"))


Copilot AI Mar 1, 2026


New message persistence behavior (append_message/load_messages) is not covered by tests. Since ProjectManager already has a dedicated test module, adding tests for create→append→load (and corrupt/empty file handling) would help prevent regressions.

Suggested change

Removed:

```python
    """Load conversation messages from disk. Returns [] if file is missing."""
    path = self._messages_path(project_id)
    if not path.exists():
        return []
    return json.loads(path.read_text(encoding="utf-8"))
```

Added:

```python
    """Load conversation messages from disk. Returns [] if file is missing or invalid."""
    path = self._messages_path(project_id)
    if not path.exists():
        return []
    try:
        data = json.loads(path.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        # Corrupt or empty JSON; treat as no messages
        return []
    if not isinstance(data, list):
        # Unexpected structure; be defensive and return no messages
        return []
    return data
```
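A stand-alone sketch of the edge-case tests the comment asks for, written against a local copy of the defensive loader rather than `ProjectManager` itself (the method shape on `ProjectManager` is assumed, so in the real test module these assertions would go through `create` → `append_message` → `load_messages`).

```python
import json
import tempfile
from pathlib import Path
from typing import Any

def load_messages_safe(path: Path) -> list[dict[str, Any]]:
    """Stand-alone version of the defensive loader suggested above."""
    if not path.exists():
        return []
    try:
        data = json.loads(path.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        return []
    return data if isinstance(data, list) else []

def test_load_messages_edge_cases() -> None:
    d = Path(tempfile.mkdtemp())
    p = d / "messages.json"
    assert load_messages_safe(p) == []                  # missing file
    p.write_text("{broken", encoding="utf-8")
    assert load_messages_safe(p) == []                  # corrupt JSON
    p.write_text('{"not": "a list"}', encoding="utf-8")
    assert load_messages_safe(p) == []                  # wrong top-level shape
    p.write_text('[{"role": "user"}]', encoding="utf-8")
    assert load_messages_safe(p) == [{"role": "user"}]  # happy path

test_load_messages_edge_cases()
```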

Introduce cooperative cancellation to the generation pipeline, and harden ProjectManager message persistence with safe reads and atomic writes.

Changes:

- pipeline: check cancel_event after build/review, emit generation_complete with error and return a failed GroupResult when cancelled.
- project_manager: create a guides/ directory for new projects; add _safe_read_messages to tolerate missing/corrupt/non-list JSON and filter non-dict items; make append_message use atomic write (temp file + replace) to avoid corruption; load_messages now uses the safe reader.
- tests: add coverage for guides directory, message append/load, and multiple edge cases (nonexistent, corrupted JSON, non-list, filtering non-dict items).
- README: replace top-level await snippet with an asyncio.run example and add new API docs for max_review_iterations, review_threshold, cancel_event, and conversation_context; small import fixes.

These changes improve reliability for concurrent/interrupted writes, provide a cooperative cancellation mechanism, and update docs/tests to reflect the new behaviors.