diff --git a/.github/AUTO_PR_WORKFLOW.md b/.github/AUTO_PR_WORKFLOW.md
new file mode 100644
index 000000000..e55643366
--- /dev/null
+++ b/.github/AUTO_PR_WORKFLOW.md
@@ -0,0 +1,156 @@
+# Auto PR from Issues Workflow
+
+## Overview
+
+This workflow automatically scans open issues carrying the configured auto-PR label (`ai-draft-pr` in the workflow configuration) and creates pull requests with the required documentation updates for KDF (Komodo DeFi Framework) methods.
+
+## How It Works
+
+### 1. Issue Scanning
+- Runs daily at 4:30 AM UTC
+- Can be triggered manually via GitHub Actions UI
+- Scans for open issues with the label set in `AUTO_PR_LABEL` (`ai-draft-pr`)
+
+### 2. Content Parsing
+The script parses issue content to extract:
+- **Method names**: KDF method names (e.g., `task::enable_utxo::init`)
+- **File paths**: Specific documentation paths if mentioned
+- **Categories**: Method categories (lightning, wallet, swap, etc.)
+
+### 3. Documentation Generation
+- Converts KDF method names to filesystem paths following the naming convention
+- Determines appropriate API version directory (legacy, v20, v20-dev)
+- Generates MDX documentation using:
+ - AI-powered content generation (if OpenAI API key is available)
+ - Template-based generation (fallback)
+
+### 4. PR Creation
+- Creates a new branch: `auto-pr/issue-{number}`
+- Commits generated documentation files
+- Creates pull request targeting the `dev` branch
+- Adds appropriate labels and links back to the original issue
+
+## Issue Format Requirements
+
+For the workflow to process an issue correctly, include:
+
+```markdown
+**Method**: `task::enable_utxo::init`
+**Type**: wallet activation
+**Description**: Enable UTXO coin activation method
+
+Additional details about the method functionality...
+```
+
+### Supported Patterns
+
+The script recognizes these patterns in issue content:
+- `method: method_name`
+- `function: function_name`
+- `rpc: rpc_name`
+- `path: file/path`
+- `type: category`
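+
+Below is a simplified sketch of how the parser applies these patterns (it mirrors the regexes in `scripts/auto-pr-from-issues.js`); the sample issue body is illustrative:
+
+```js
+const body = "**Method**: `task::enable_utxo::init`\n**Type**: wallet activation";
+
+const methodPattern = /(?:method|function|rpc)\**[:\s]*`?([a-zA-Z0-9_:]+)`?/gi;
+const typePattern = /(?:type|category)\**[:\s]*`?([a-zA-Z0-9_\-]+)`?/gi;
+
+const methods = [...body.matchAll(methodPattern)].map((m) => m[1]);
+const types = [...body.matchAll(typePattern)].map((m) => m[1]);
+
+console.log(methods); // [ 'task::enable_utxo::init' ]
+console.log(types);   // [ 'wallet' ]
+```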
+
+## Naming Conventions
+
+The workflow follows the KDF naming conventions:
+
+- **Canonical form**: `task::enable_utxo::init`
+- **File/folder name**: `task-enable_utxo-init` (:: → -, underscores preserved)
+- **API versioning** (rules applied in order):
+  - Methods containing `::` → `v20-dev`
+  - Methods starting with `lightning` or `task` (without `::`) → `v20`
+  - All other methods → `legacy`
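+
+Condensed from `scripts/auto-pr-from-issues.js`, the conversion rules look like this:
+
+```js
+// Sketch of the conversion helpers used by the auto-PR script
+function convertMethodToPath(method) {
+  // "task::enable_utxo::init" -> "task-enable_utxo-init"
+  return method.replace(/::/g, '-');
+}
+
+function determineApiVersion(method) {
+  if (method.includes('::')) return 'v20-dev'; // new-format methods
+  if (method.startsWith('lightning') || method.startsWith('task')) return 'v20';
+  return 'legacy';
+}
+
+console.log(convertMethodToPath('task::enable_utxo::init')); // task-enable_utxo-init
+console.log(determineApiVersion('task::enable_utxo::init')); // v20-dev
+```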
+
+## Configuration
+
+### Environment Variables
+
+| Variable | Description | Required |
+|----------|-------------|----------|
+| `GITHUB_TOKEN` | GitHub Actions token | Yes |
+| `OPENAI_API_KEY` | OpenAI API key for AI generation | No |
+| `TARGET_OWNER` | Repository owner | Yes |
+| `TARGET_REPO` | Repository name | Yes |
+| `AUTO_PR_LABEL` | Label to scan for | Yes |
+
+### Manual Execution
+
+```bash
+# Dry run mode
+DRY_RUN=true node scripts/auto-pr-from-issues.js
+
+# Process specific issues
+SPECIFIC_ISSUES="123,456" node scripts/auto-pr-from-issues.js
+
+# Normal execution
+node scripts/auto-pr-from-issues.js
+```
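+
+A local run also needs the required variables from the configuration table above; for example (illustrative values matching the workflow defaults, with a placeholder token):
+
+```bash
+export GITHUB_TOKEN="<token with repo scope>"
+export TARGET_OWNER="KomodoPlatform"
+export TARGET_REPO="komodo-docs-mdx"
+export AUTO_PR_LABEL="ai-draft-pr"
+DRY_RUN=true node scripts/auto-pr-from-issues.js
+```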
+
+## Generated Documentation Structure
+
+The workflow creates documentation files following this structure:
+
+```
+src/pages/komodo-defi-framework/api/
+├── legacy/
+├── v20/
+│ ├── lightning/
+│ ├── wallet/
+│ ├── task_managed/
+│ └── ...
+└── v20-dev/
+ ├── lightning/
+ ├── wallet/
+ └── ...
+```
+
+## Example Workflow
+
+1. **Issue Created**: A developer creates an issue with the `ai-draft-pr` label
+2. **Daily Scan**: Workflow runs and detects the issue
+3. **Parsing**: Extracts method `lightning::channels::open_channel`
+4. **Generation**: Creates documentation in `v20-dev/lightning/`
+5. **PR Creation**: Creates PR with generated documentation
+6. **Review**: Team reviews and merges the PR
+7. **Issue Closure**: PR merge automatically closes the issue
+
+## Features
+
+### AI-Powered Generation
+- Uses OpenAI GPT-4o-mini for comprehensive documentation
+- Generates request/response tables, examples, and descriptions
+- Falls back to template generation if API unavailable
+
+### Error Handling
+- Robust error handling with detailed logging
+- Automatic cleanup of failed branches
+- Comprehensive statistics reporting
+
+### Repository Integration
+- Follows existing Git flow (targets `dev` branch)
+- Uses established labeling conventions
+- Integrates with existing Python virtual environment
+
+## Monitoring
+
+The workflow provides detailed logs and summaries:
+- Issues processed/skipped/errored
+- Generated files and their locations
+- PR creation status and links
+
+## Troubleshooting
+
+### Common Issues
+
+1. **No methods detected**: Ensure issue content follows the expected patterns
+2. **Git conflicts**: Workflow automatically pulls latest `dev` branch
+3. **API rate limits**: Built-in throttling and retry mechanisms
+4. **File permissions**: Workflow runs with appropriate GitHub permissions
+
+### Debug Mode
+
+Enable debug output by setting environment variables:
+```bash
+DEBUG=true DRY_RUN=true node scripts/auto-pr-from-issues.js
+```
diff --git a/.github/package.json b/.github/package.json
index ca97c062b..49b857074 100644
--- a/.github/package.json
+++ b/.github/package.json
@@ -4,7 +4,8 @@
"description": "Dependencies for docs sync workflow",
"main": "scripts/sync-docs-issues.js",
"scripts": {
- "sync": "node scripts/sync-docs-issues.js"
+ "sync": "node scripts/sync-docs-issues.js",
+ "auto-pr": "node scripts/auto-pr-from-issues.js"
},
"dependencies": {
"@octokit/core": "^5.0.0",
diff --git a/.github/scripts/auto-pr-from-issues.js b/.github/scripts/auto-pr-from-issues.js
new file mode 100644
index 000000000..0f118d143
--- /dev/null
+++ b/.github/scripts/auto-pr-from-issues.js
@@ -0,0 +1,752 @@
+#!/usr/bin/env node
+
+/**
+ * Auto PR from Issues Script
+ * Scans issues labeled with "auto PR" and creates pull requests with required documentation updates
+ *
+ * IMPORTANT: This script follows the AI Agent Reference Guide for KDF documentation.
+ *
+ * Key principles:
+ * 1. NEVER edit auto-generated files (src/pages/komodo-defi-framework/api/index.mdx, filepathSlugs.json)
+ * 2. Include required {{label}} attributes for method parsing
+ * 3. Use proper directory structure and naming conventions
+ * 4. Let build scripts handle slug generation and API index updates
+ * 5. Manual updates: sidebar.json, changelog, and documentation files only
+ */
+
+const { Octokit } = require("@octokit/core");
+const { retry } = require("@octokit/plugin-retry");
+const { throttling } = require("@octokit/plugin-throttling");
+const { execSync } = require('child_process');
+const fs = require('fs');
+const path = require('path');
+
+// Enhanced Octokit with retry and throttling
+const MyOctokit = Octokit.plugin(retry, throttling);
+
+class AutoPRManager {
+ constructor() {
+ this.config = this.validateConfig();
+ this.octokit = this.initializeOctokit();
+ this.stats = {
+ processed: 0,
+ created: 0,
+ skipped: 0,
+ errors: 0
+ };
+ this.repoRoot = process.cwd().replace('/.github', '');
+ }
+
+ validateConfig() {
+ const {
+ TARGET_OWNER, TARGET_REPO, AUTO_PR_LABEL,
+ DRY_RUN, SPECIFIC_ISSUES
+ } = process.env;
+
+    const requiredVars = { TARGET_OWNER, TARGET_REPO, AUTO_PR_LABEL };
+    const missing = Object.entries(requiredVars)
+      .filter(([, value]) => !value)
+      .map(([name]) => name);
+
+ if (missing.length > 0) {
+ throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
+ }
+
+ return {
+ targetOwner: TARGET_OWNER,
+ targetRepo: TARGET_REPO,
+ autoPrLabel: AUTO_PR_LABEL,
+ dryRun: DRY_RUN === 'true',
+ specificIssues: SPECIFIC_ISSUES ? SPECIFIC_ISSUES.split(',').map(n => parseInt(n.trim())).filter(n => !isNaN(n)) : null
+ };
+ }
+
+ initializeOctokit() {
+ const auth = process.env.GITHUB_TOKEN;
+ if (!auth) {
+ throw new Error("No authentication token available");
+ }
+
+ return new MyOctokit({
+ auth,
+ throttle: {
+ onRateLimit: (retryAfter, options, octokit) => {
+ console.warn(`Request quota exhausted for request ${options.method} ${options.url}`);
+ if (options.request.retryCount === 0) {
+ console.info(`Retrying after ${retryAfter} seconds!`);
+ return true;
+ }
+ },
+ onSecondaryRateLimit: (retryAfter, options, octokit) => {
+ console.warn(`Secondary rate limit hit for request ${options.method} ${options.url}`);
+ },
+ },
+ retry: {
+ doNotRetry: ["abuse"],
+ },
+ });
+ }
+
+ async listCandidateIssues() {
+ try {
+ // If specific issues are requested, fetch them directly
+ if (this.config.specificIssues) {
+ const issues = [];
+ for (const issueNumber of this.config.specificIssues) {
+ try {
+ const issue = await this.octokit.request("GET /repos/{owner}/{repo}/issues/{issue_number}", {
+ owner: this.config.targetOwner,
+ repo: this.config.targetRepo,
+ issue_number: issueNumber
+ });
+
+ // Check if issue has the required label
+ const hasLabel = issue.data.labels?.some(l =>
+ (l.name || "").toLowerCase() === this.config.autoPrLabel.toLowerCase()
+ );
+
+ if (hasLabel && issue.data.state === "open") {
+ issues.push(issue.data);
+ } else {
+ console.info(`Issue #${issueNumber} does not have required label or is not open`);
+ }
+ } catch (error) {
+ console.warn(`Error fetching issue #${issueNumber}: ${error.message}`);
+ }
+ }
+ return issues;
+ }
+
+ // Otherwise, search for issues with the label
+ const q = [
+ `repo:${this.config.targetOwner}/${this.config.targetRepo}`,
+ "is:issue",
+ "is:open",
+ `label:"${this.config.autoPrLabel}"`
+ ].join(" ");
+
+ let page = 1;
+ const all = [];
+ const maxPages = 10;
+
+ while (page <= maxPages) {
+ const res = await this.octokit.request("GET /search/issues", {
+ q,
+ sort: "updated",
+ order: "desc",
+ per_page: 50,
+ page,
+ headers: {
+ 'X-GitHub-Api-Version': '2022-11-28'
+ }
+ });
+
+ all.push(...res.data.items);
+
+ if (!res.data.items.length || all.length >= res.data.total_count) {
+ break;
+ }
+ page++;
+ }
+
+ return all;
+ } catch (error) {
+ console.error(`Error listing candidate issues: ${error.message}`);
+ return [];
+ }
+ }
+
+ parseIssueContent(issue) {
+ const body = issue.body || "";
+ const title = issue.title || "";
+
+    // Extract KDF method information from issue content (tolerates markdown bold, e.g. "**Method**: `name`")
+    const methodPattern = /(?:method|function|rpc)\**[:\s]*`?([a-zA-Z0-9_:]+)`?/gi;
+    const pathPattern = /(?:path|file)\**[:\s]*`?([a-zA-Z0-9_\/\-\.]+)`?/gi;
+    const typePattern = /(?:type|category)\**[:\s]*`?([a-zA-Z0-9_\-]+)`?/gi;
+
+ let methods = [];
+ let paths = [];
+ let types = [];
+
+ let match;
+ while ((match = methodPattern.exec(body)) !== null) {
+ methods.push(match[1]);
+ }
+
+ while ((match = pathPattern.exec(body)) !== null) {
+ paths.push(match[1]);
+ }
+
+ while ((match = typePattern.exec(body)) !== null) {
+ types.push(match[1]);
+ }
+
+    // Try to extract a method name from the title as well (backticked or ::-separated)
+    const titleMethodMatch = title.match(/`([a-zA-Z0-9_:]+)`/) ||
+      title.match(/\b([a-zA-Z0-9_]+(?:::[a-zA-Z0-9_]+)+)\b/);
+    if (titleMethodMatch && !methods.includes(titleMethodMatch[1])) {
+      methods.push(titleMethodMatch[1]);
+    }
+
+ // Extract structured method information if available (from enhanced sync script)
+ const structuredMethods = this.extractStructuredMethods(body);
+ const codeExamples = this.extractCodeExamples(body);
+
+ return {
+ methods: [...new Set(methods)],
+ paths: [...new Set(paths)],
+ types: [...new Set(types)],
+ description: body,
+ title: title,
+ structuredMethods: structuredMethods,
+ codeExamples: codeExamples
+ };
+ }
+
+ /**
+ * Extract structured method information from issue body (created by sync script)
+ */
+ extractStructuredMethods(body) {
+ const methods = [];
+ const methodSections = body.split('#### Method:').slice(1);
+
+ methodSections.forEach(section => {
+ const lines = section.split('\n');
+ const methodMatch = lines[0].match(/`([^`]+)`/);
+ if (!methodMatch) return;
+
+ const methodName = methodMatch[1];
+ const method = { name: methodName, parameters: [] };
+
+ let inParams = false;
+ let currentExample = '';
+ let inExample = false;
+
+ lines.forEach(line => {
+ if (line.includes('**Request Parameters:**')) {
+ inParams = true;
+ } else if (line.includes('**Example Request:**')) {
+ inParams = false;
+ inExample = true;
+ } else if (line.startsWith('```json')) {
+ currentExample = '';
+ } else if (line.startsWith('```') && inExample) {
+ try {
+ method.example = JSON.parse(currentExample);
+ } catch (e) {
+ // Invalid JSON, skip
+ }
+ inExample = false;
+ } else if (inExample) {
+ currentExample += line + '\n';
+ } else if (inParams && line.startsWith('- ')) {
+ const paramMatch = line.match(/- `([^`]+)` \(([^)]+)\): Example value `([^`]+)`/);
+ if (paramMatch) {
+ method.parameters.push({
+ name: paramMatch[1],
+ type: paramMatch[2],
+ example: paramMatch[3]
+ });
+ }
+ }
+ });
+
+ methods.push(method);
+ });
+
+ return methods;
+ }
+
+ /**
+ * Extract JSON code examples from text
+ */
+ extractCodeExamples(text) {
+ const examples = [];
+ const codeBlockPattern = /```json\s*\n([\s\S]*?)\n```/gi;
+
+ let match;
+ while ((match = codeBlockPattern.exec(text)) !== null) {
+ try {
+ const parsed = JSON.parse(match[1]);
+ examples.push(parsed);
+ } catch (e) {
+ // Skip invalid JSON
+ }
+ }
+
+ return examples;
+ }
+
+ convertMethodToPath(method) {
+    // Convert KDF method format to filesystem path
+ // e.g., "task::enable_utxo::init" -> "task-enable_utxo-init"
+ return method.replace(/::/g, '-');
+ }
+
+ determineApiVersion(method) {
+ // Determine API version based on method pattern
+ if (method.includes('::')) {
+ return 'v20-dev'; // New format methods go to v20-dev
+ } else if (method.startsWith('lightning') || method.startsWith('task')) {
+ return 'v20';
+ } else {
+ return 'legacy';
+ }
+ }
+
+ async generateMethodDocumentation(issueData, method) {
+ const apiVersion = this.determineApiVersion(method);
+ const methodPath = this.convertMethodToPath(method);
+
+ // Determine category from method name
+ let category = 'misc';
+ if (method.includes('lightning')) category = 'lightning';
+ else if (method.includes('task')) category = 'task_managed';
+ else if (method.includes('wallet')) category = 'wallet';
+ else if (method.includes('swap')) category = 'swap';
+ else if (method.includes('orderbook')) category = 'orderbook';
+
+ const fileName = `${methodPath}.mdx`;
+ const dirPath = path.join(this.repoRoot, 'src', 'pages', 'komodo-defi-framework', 'api', apiVersion, category);
+ const filePath = path.join(dirPath, fileName);
+
+ // Generate method documentation using AI if available
+ const aiContent = await this.generateAIDocumentation(issueData, method, category);
+
+ const content = aiContent || this.generateTemplateDocumentation(method, issueData, category);
+
+ return {
+ path: filePath,
+ content: content,
+ category: category,
+ apiVersion: apiVersion,
+ methodName: method
+ };
+ }
+
+ async generateAIDocumentation(issueData, method, category = 'misc') {
+ const apiKey = process.env.OPENAI_API_KEY;
+ if (!apiKey) {
+ console.info("No OpenAI API key provided, using template documentation");
+ return null;
+ }
+
+ try {
+ const prompt = `Generate comprehensive MDX documentation for the KDF method "${method}" based on this issue description:
+
+Title: ${issueData.title}
+Description: ${issueData.description}
+
+CRITICAL REQUIREMENTS (AI Agent Reference Guide):
+1. MUST include {{label : '${method}', tag : 'API-v2'}} in the main heading
+2. MUST include label="${method}" in the CodeGroup component
+3. MUST use CompactTable components for request/response parameter tables
+4. MUST include the CompactTable import statement
+5. Follow this exact structure:
+
+export const title = "...";
+export const description = "...";
+import CompactTable from '@/components/mdx/CompactTable';
+
+# ${method}
+
+## ${method} {{label : '${method}', tag : 'API-v2'}}
+
+Description...
+
+### Request Parameters
+
+
+
+### Response Parameters
+
+
+
+#### 📌 Examples
+
+
+6. The {{label}} attributes are CRITICAL for auto-generation scripts
+7. Use realistic KDF method parameters and response structures
+8. Be comprehensive and production-ready
+9. CompactTable components reference JSON schema files in src/data/tables/
+
+This documentation will be parsed by scripts to auto-generate API indexes and navigation.`;
+
+ const fetch = (await import('node-fetch')).default;
+ const response = await fetch("https://api.openai.com/v1/chat/completions", {
+ method: "POST",
+ headers: {
+ "Authorization": `Bearer ${apiKey}`,
+ "Content-Type": "application/json"
+ },
+ body: JSON.stringify({
+ model: "gpt-4o-mini",
+ messages: [
+ {
+ role: "system",
+ content: "You are a technical documentation expert specializing in API documentation. Generate complete, accurate MDX documentation following the provided patterns."
+ },
+ {
+ role: "user",
+ content: prompt
+ }
+ ],
+ max_tokens: 2000,
+ temperature: 0.3
+ })
+ });
+
+ if (!response.ok) {
+ console.warn(`OpenAI API failed: ${response.status} ${response.statusText}`);
+ return null;
+ }
+
+ const data = await response.json();
+ console.info("✅ Using OpenAI API for documentation generation");
+ return data.choices?.[0]?.message?.content?.trim() || null;
+
+ } catch (error) {
+ console.warn(`OpenAI API error: ${error.message}`);
+ return null;
+ }
+ }
+
+ generateTemplateDocumentation(method, issueData, category = 'misc') {
+ const methodTitle = method.replace(/::/g, ' ').replace(/_/g, ' ').toLowerCase()
+ .replace(/\b\w/g, l => l.toUpperCase());
+
+ // Find structured method info for this method
+ const structuredMethod = issueData.structuredMethods?.find(m => m.name === method);
+ const exampleFromIssue = structuredMethod?.example ||
+ issueData.codeExamples?.find(ex => ex.method === method);
+
+ // Generate parameter documentation from extracted data
+ let parameterDocs = '';
+ if (structuredMethod?.parameters?.length > 0) {
+ parameterDocs = '\n**Extracted Parameters:**\n\n';
+ structuredMethod.parameters.forEach(param => {
+ parameterDocs += `- **${param.name}** (${param.type}): Example: \`${param.example}\`\n`;
+ });
+ parameterDocs += '\n';
+ }
+
+ // Use real example if available, otherwise generate template
+ const exampleRequest = exampleFromIssue || {
+ "userpass": "RPC_UserP@SSW0RD",
+ "mmrpc": "2.0",
+ "method": method,
+ "params": {},
+ "id": 42
+ };
+
+ // Note: This template follows the AI Agent Reference Guide requirements
+ // The {{label}} attribute is CRITICAL for script parsing and auto-generation
+ return `export const title = "Komodo DeFi Framework Method: ${methodTitle}";
+export const description = "Documentation for the ${method} method of the Komodo DeFi Framework.";
+import CompactTable from '@/components/mdx/CompactTable';
+
+# ${method}
+
+## ${method} {{label : '${method}', tag : 'API-v2'}}
+
+The \`${method}\` method ${this.extractMethodDescription(issueData, method)}
+${parameterDocs}
+### Request Parameters
+
+
+
+### Response Parameters
+
+
+
+#### 📌 Examples
+
+
+ \`\`\`json
+ ${JSON.stringify(exampleRequest, null, 2)}
+ \`\`\`
+
+
+
+ This documentation was auto-generated from issue #${issueData.number || 'N/A'}${structuredMethod ? ' with extracted parameter information from the source PR' : ''}. Please review and update as needed.
+`;
+ }
+
+ /**
+ * Extract method description from issue data
+ */
+ extractMethodDescription(issueData, method) {
+ const description = issueData.description || '';
+
+ // Try to find method-specific description
+ const methodSection = description.split(`#### Method: \`${method}\``)[1];
+ if (methodSection) {
+ const lines = methodSection.split('\n').slice(1, 5); // Take first few lines
+ return lines.join(' ').replace(/\*\*.*?\*\*/g, '').trim().slice(0, 200) + '...';
+ }
+
+ // Fall back to summary or generic description
+ const summaryMatch = description.match(/### Summary\s*\n([^\n#]+)/);
+ if (summaryMatch) {
+ return summaryMatch[1].trim().slice(0, 200) + '...';
+ }
+
+ return 'provides functionality for the Komodo DeFi Framework.';
+ }
+
+ async createBranchAndCommit(issue, generatedDocs) {
+ const branchName = `auto-pr/issue-${issue.number}`;
+ const commitMessage = `docs: Add documentation for issue #${issue.number}`;
+
+ try {
+ // Create and checkout new branch
+ execSync(`git checkout -b ${branchName}`, {
+ cwd: this.repoRoot,
+ stdio: 'inherit'
+ });
+
+ // Create directories and files
+ for (const doc of generatedDocs) {
+ const dir = path.dirname(doc.path);
+ if (!fs.existsSync(dir)) {
+ fs.mkdirSync(dir, { recursive: true });
+ }
+
+ fs.writeFileSync(doc.path, doc.content);
+ console.info(`📝 Created documentation: ${doc.path}`);
+ }
+
+ // Run build scripts to auto-generate API index and slugs
+ console.info("🔧 Running build scripts to generate API index and file path slugs...");
+ try {
+ execSync(`source utils/py/.venv/bin/activate && ./utils/gen_api_methods_table.py`, {
+ cwd: this.repoRoot,
+ stdio: 'inherit',
+ shell: '/bin/bash'
+ });
+ console.info("✅ Auto-generation completed successfully");
+ } catch (buildError) {
+ console.error(`❌ Build script failed: ${buildError.message}`);
+ throw buildError;
+ }
+
+ // Stage only specific files (exclude auto-generated files that shouldn't be committed manually)
+ const filesToCommit = [
+ ...generatedDocs.map(doc => doc.path.replace(this.repoRoot + '/', '')),
+ 'src/data/sidebar.json', // Only if it was updated
+ 'src/pages/komodo-defi-framework/changelog/index.mdx' // Only if it was updated
+ ];
+
+ for (const file of filesToCommit) {
+ try {
+ execSync(`git add "${file}"`, { cwd: this.repoRoot, stdio: 'inherit' });
+ } catch (addError) {
+ console.warn(`Could not add ${file} (may not exist or not modified)`);
+ }
+ }
+
+ execSync(`git commit -m "${commitMessage}"`, {
+ cwd: this.repoRoot,
+ stdio: 'inherit'
+ });
+
+ if (!this.config.dryRun) {
+ // Push branch
+ execSync(`git push origin ${branchName}`, {
+ cwd: this.repoRoot,
+ stdio: 'inherit'
+ });
+ }
+
+ return branchName;
+ } catch (error) {
+ console.error(`Git operations failed: ${error.message}`);
+ throw error;
+ }
+ }
+
+ async createPullRequest(issue, branchName, generatedDocs) {
+ if (this.config.dryRun) {
+ console.info(`📝 [DRY RUN] Would create PR for issue #${issue.number}`);
+ console.info(`🔗 [DRY RUN] Branch: ${branchName}`);
+ console.info(`📄 [DRY RUN] Files: ${generatedDocs.map(d => d.path).join(', ')}`);
+ return { data: { number: 'DRY-RUN', html_url: 'https://github.com/test/dry-run' } };
+ }
+
+ const title = `docs: Add documentation for issue #${issue.number} - ${issue.title}`;
+ const methodList = generatedDocs.map(doc => `- \`${doc.methodName}\``).join('\n');
+
+ const body = `## 🤖 Auto-generated PR
+
+This PR was automatically created from issue #${issue.number}.
+
+### 📝 Changes
+
+Added documentation for the following methods:
+${methodList}
+
+### 📂 Files Created/Modified
+
+${generatedDocs.map(doc => `- \`${doc.path}\``).join('\n')}
+
+### 🔗 Related Issue
+
+Closes #${issue.number}
+
+---
+
+> This PR was generated automatically. Please review the content and make any necessary adjustments before merging.`;
+
+ try {
+ const pr = await this.octokit.request("POST /repos/{owner}/{repo}/pulls", {
+ owner: this.config.targetOwner,
+ repo: this.config.targetRepo,
+ title,
+ head: branchName,
+ base: "dev", // Using dev branch as base per repository conventions
+ body,
+ draft: false
+ });
+
+ // Add labels to the PR
+ await this.octokit.request("POST /repos/{owner}/{repo}/issues/{issue_number}/labels", {
+ owner: this.config.targetOwner,
+ repo: this.config.targetRepo,
+ issue_number: pr.data.number,
+ labels: ["docs", "auto-generated", "status: pending review"]
+ });
+
+ console.info(`🔗 Created PR: ${pr.data.html_url}`);
+ return pr;
+ } catch (error) {
+ console.error(`Failed to create PR: ${error.message}`);
+ throw error;
+ }
+ }
+
+ async processIssue(issue) {
+ try {
+ this.stats.processed++;
+
+ console.info(`\n🔍 Processing issue #${issue.number}: ${issue.title}`);
+
+ // Verify issue still has the required label and is open
+ const hasLabel = issue.labels?.some(l =>
+ (l.name || "").toLowerCase() === this.config.autoPrLabel.toLowerCase()
+ );
+
+ if (!hasLabel || issue.state !== "open") {
+ console.info(`Skipping issue #${issue.number}: missing label or not open`);
+ this.stats.skipped++;
+ return;
+ }
+
+ // Parse issue content
+ const issueData = this.parseIssueContent(issue);
+ issueData.number = issue.number;
+
+ if (issueData.methods.length === 0) {
+ console.warn(`No methods found in issue #${issue.number}`);
+ this.stats.skipped++;
+ return;
+ }
+
+ console.info(`📋 Found methods: ${issueData.methods.join(', ')}`);
+
+ // Generate documentation for each method
+ const generatedDocs = [];
+ for (const method of issueData.methods) {
+ try {
+ const doc = await this.generateMethodDocumentation(issueData, method);
+ generatedDocs.push(doc);
+ } catch (error) {
+ console.warn(`Failed to generate docs for method ${method}: ${error.message}`);
+ }
+ }
+
+ if (generatedDocs.length === 0) {
+ console.warn(`No documentation generated for issue #${issue.number}`);
+ this.stats.skipped++;
+ return;
+ }
+
+ // Create branch and commit changes
+ const branchName = await this.createBranchAndCommit(issue, generatedDocs);
+
+ // Create pull request
+ const pr = await this.createPullRequest(issue, branchName, generatedDocs);
+
+ console.info(`✅ Created PR #${pr.data.number} for issue #${issue.number}`);
+ this.stats.created++;
+
+ // Comment on the original issue
+ if (!this.config.dryRun) {
+ await this.octokit.request("POST /repos/{owner}/{repo}/issues/{issue_number}/comments", {
+ owner: this.config.targetOwner,
+ repo: this.config.targetRepo,
+ issue_number: issue.number,
+ body: `🤖 **Auto PR Created**: ${pr.data.html_url}\n\nThis pull request contains the requested documentation updates.`
+ });
+ }
+
+ } catch (error) {
+ console.error(`Error processing issue #${issue.number}: ${error.message}`);
+ this.stats.errors++;
+
+ // Try to cleanup any created branch
+ try {
+ execSync(`git checkout dev && git branch -D auto-pr/issue-${issue.number}`, {
+ cwd: this.repoRoot,
+ stdio: 'ignore'
+ });
+ } catch {}
+ }
+ }
+
+ async run() {
+ try {
+ console.info(`🚀 Starting auto PR creation - DRY RUN: ${this.config.dryRun}`);
+
+ // Ensure we're on the dev branch
+ execSync(`git checkout dev`, { cwd: this.repoRoot, stdio: 'inherit' });
+ execSync(`git pull origin dev`, { cwd: this.repoRoot, stdio: 'inherit' });
+
+ const issues = await this.listCandidateIssues();
+ console.info(`📋 Found ${issues.length} candidate issue(s) with label "${this.config.autoPrLabel}"`);
+
+ // Process issues with some concurrency control
+ const concurrency = 2; // Lower concurrency for PR creation
+ for (let i = 0; i < issues.length; i += concurrency) {
+ const batch = issues.slice(i, i + concurrency);
+ await Promise.all(batch.map(issue => this.processIssue(issue)));
+ }
+
+ // Output final summary
+ console.info(`\n📊 === AUTO PR SUMMARY ===`);
+ console.info(`📈 Issues Processed: ${this.stats.processed}`);
+ console.info(`✅ PRs Created: ${this.stats.created}`);
+ console.info(`⏭️ Issues Skipped: ${this.stats.skipped}`);
+ console.info(`❌ Errors: ${this.stats.errors}`);
+
+ if (this.stats.errors > 0) {
+ console.error(`❌ Auto PR creation completed with ${this.stats.errors} errors`);
+ process.exit(1);
+ } else {
+ console.info(`🎉 Auto PR creation completed successfully!`);
+ }
+
+ } catch (error) {
+ console.error(`💥 Auto PR creation failed: ${error.message}`);
+ process.exit(1);
+ }
+ }
+}
+
+// Execute if run directly
+if (require.main === module) {
+ const manager = new AutoPRManager();
+ manager.run().catch(error => {
+ console.error('Unhandled error:', error);
+ process.exit(1);
+ });
+}
+
+module.exports = AutoPRManager;
diff --git a/.github/scripts/sync-docs-issues.js b/.github/scripts/sync-docs-issues.js
index c44c028b2..a67862a74 100644
--- a/.github/scripts/sync-docs-issues.js
+++ b/.github/scripts/sync-docs-issues.js
@@ -196,6 +196,114 @@ class DocsSyncManager {
}
}
+ /**
+ * Extract JSON code blocks from PR description and comments
+ */
+ extractCodeExamples(pr) {
+ const text = pr.body || "";
+ const codeBlocks = [];
+
+ // Match JSON code blocks with various markdown formats
+ const patterns = [
+ /```json\s*\n([\s\S]*?)\n```/gi,
+ /```\s*\n(\{[\s\S]*?\})\s*\n```/gi,
+ /`{[^`]*}`/gi
+ ];
+
+ patterns.forEach(pattern => {
+ let match;
+ while ((match = pattern.exec(text)) !== null) {
+ try {
+          const jsonText = match[1] || match[0];
+          // Clean up the JSON (strip inline-code backticks, remove comments, fix common issues)
+          let cleanJson = jsonText
+            .replace(/^`+|`+$/g, '') // Strip backticks captured by the inline-code pattern
+            .replace(/\/\/.*$/gm, '') // Remove line comments
+            .replace(/,(\s*[}\]])/g, '$1') // Remove trailing commas
+            .trim(); // Remove extra whitespace
+
+ // Handle template variables more carefully
+ cleanJson = cleanJson.replace(/"{{[^}]*}}"/g, '"{{placeholder}}"');
+
+ const parsed = JSON.parse(cleanJson);
+ codeBlocks.push({
+ raw: match[0],
+ json: parsed,
+ text: cleanJson
+ });
+ } catch (e) {
+ // Not valid JSON, skip
+ }
+ }
+ });
+
+ return codeBlocks;
+ }
+
+ /**
+ * Derive RPC method information from code examples
+ */
+ deriveMethodInfo(codeBlocks) {
+ const methods = [];
+
+ codeBlocks.forEach(block => {
+ if (block.json && block.json.method) {
+ const method = {
+ name: block.json.method,
+ params: block.json.params || {},
+ example: block.json,
+ hasUserpass: !!block.json.userpass,
+ hasMmrpc: !!block.json.mmrpc
+ };
+
+ // Extract parameter structure
+ if (method.params && typeof method.params === 'object') {
+ method.requestParams = this.extractParameterStructure(method.params);
+ }
+
+ methods.push(method);
+ }
+ });
+
+ return methods;
+ }
+
+ /**
+ * Extract parameter structure for documentation
+ */
+ extractParameterStructure(params, prefix = '') {
+ const structure = [];
+
+ Object.entries(params).forEach(([key, value]) => {
+ const param = {
+ name: prefix ? `${prefix}.${key}` : key,
+ type: this.inferType(value),
+ example: value
+ };
+
+ if (typeof value === 'object' && value !== null && !Array.isArray(value)) {
+ param.type = 'object';
+ param.properties = this.extractParameterStructure(value, param.name);
+ }
+
+ structure.push(param);
+ });
+
+ return structure;
+ }
+
+ /**
+ * Infer parameter type from example value
+ */
+ inferType(value) {
+ if (value === null) return 'null';
+ if (typeof value === 'boolean') return 'boolean';
+ if (typeof value === 'number') return Number.isInteger(value) ? 'integer' : 'number';
+ if (typeof value === 'string') return 'string';
+ if (Array.isArray(value)) return 'array';
+ if (typeof value === 'object') return 'object';
+ return 'unknown';
+ }
+
async generateAISummary(pr, files) {
// Use OpenAI API for AI summaries
const apiKey = process.env.OPENAI_API_KEY;
@@ -254,7 +362,7 @@ ${prBody}`;
return data.choices?.[0]?.message?.content?.trim() || null;
} catch (error) {
- console.warn(`OpenAI API error: ${error.message}`);
+      console.warn(`OpenAI API error: ${error.message}`);
return null;
}
}
@@ -266,6 +374,10 @@ ${prBody}`;
`- \`${f.filename}\`${f.status !== 'modified' ? ` (${f.status})` : ""}`
).join("\n");
+ // Extract code examples and method information
+ const codeBlocks = this.extractCodeExamples(pr);
+ const methods = this.deriveMethodInfo(codeBlocks);
+
const sections = [
marker,
`### Source`,
@@ -274,17 +386,63 @@ ${prBody}`;
`- Author: @${pr.user?.login}`,
``,
aiSummary ? `### Summary\n${aiSummary}\n` : "",
+ ];
+
+ // Add extracted method information
+ if (methods.length > 0) {
+ sections.push(`### Detected RPC Methods`);
+ methods.forEach(method => {
+ sections.push(`#### Method: \`${method.name}\``);
+ sections.push(`**Request Parameters:**`);
+
+ if (method.requestParams && method.requestParams.length > 0) {
+ method.requestParams.forEach(param => {
+ sections.push(`- \`${param.name}\` (${param.type}): Example value \`${JSON.stringify(param.example)}\``);
+ if (param.properties) {
+ param.properties.forEach(prop => {
+ sections.push(` - \`${prop.name}\` (${prop.type}): Example value \`${JSON.stringify(prop.example)}\``);
+ });
+ }
+ });
+ } else {
+ sections.push(`- No parameters detected`);
+ }
+
+ sections.push(`**Example Request:**`);
+ sections.push('```json');
+ sections.push(JSON.stringify(method.example, null, 2));
+ sections.push('```');
+ sections.push('');
+ });
+ }
+
+ sections.push(
`### Suggested docs tasks`,
`- [ ] Update relevant pages`,
`- [ ] Add/adjust RPC docs: purpose, parameters (types/defaults), and examples`,
`- [ ] Provide JSON-RPC request/response samples`,
`- [ ] Update changelog/What's New (if applicable)`,
- ``,
+ ``
+ );
+
+ if (methods.length > 0) {
+ sections.push(`### Method Documentation Requirements`);
+ methods.forEach(method => {
+ sections.push(`- [ ] Create documentation file for \`${method.name}\``);
+ sections.push(`- [ ] Define parameter types and validation rules`);
+ sections.push(`- [ ] Document response structure`);
+ sections.push(`- [ ] Add CompactTable components for request/response`);
+ sections.push(`- [ ] Include working code examples`);
+ });
+ sections.push('');
+ }
+
+ sections.push(
`### Changed files (for scoping)`,
changed || "_No file listing_",
``,
- `> _This issue was generated automatically; edit as needed._`
- ];
+      `> _This issue was generated automatically from PR analysis; code examples and parameters were extracted from the PR description._`
+ );
return sections.filter(Boolean).join("\n");
}
diff --git a/.github/workflows/auto-pr-from-issues.yml b/.github/workflows/auto-pr-from-issues.yml
new file mode 100644
index 000000000..5dbba1d30
--- /dev/null
+++ b/.github/workflows/auto-pr-from-issues.yml
@@ -0,0 +1,109 @@
+# Auto PR from Issues Workflow
+#
+# This workflow automatically creates pull requests for documentation updates
+# based on issues labeled with "ai-draft-pr".
+#
+# IMPORTANT: Follows AI Agent Reference Guide principles:
+# - Creates documentation files with proper {{label}} attributes
+# - Updates sidebar.json manually
+# - Lets build scripts auto-generate API index and file path slugs
+# - Validates documentation structure with test suite
+#
+# See: AI_AGENT_REFERENCE_GUIDE.md for complete documentation requirements
+
+name: Auto PR from Issues
+
+on:
+ schedule:
+ - cron: "30 4 * * *" # Daily at 4:30 AM UTC
+ workflow_dispatch: # Allow manual triggering
+ inputs:
+ dry_run:
+ description: 'Run in dry-run mode (no PRs created)'
+ required: false
+ default: false
+ type: boolean
+ issue_numbers:
+ description: 'Specific issue numbers to process (comma-separated)'
+ required: false
+ type: string
+
+permissions:
+ contents: write
+ issues: write
+ pull-requests: write
+
+env:
+ TARGET_OWNER: KomodoPlatform
+ TARGET_REPO: komodo-docs-mdx
+ AUTO_PR_LABEL: "ai-draft-pr"
+
+jobs:
+ auto-pr:
+ runs-on: ubuntu-latest
+ timeout-minutes: 45
+
+ steps:
+ - name: Checkout repository
+ uses: actions/checkout@v4
+ with:
+ token: ${{ secrets.GITHUB_TOKEN }}
+ fetch-depth: 0
+
+ - name: Setup Node.js
+ uses: actions/setup-node@v4
+ with:
+ node-version: '20'
+ cache: 'npm'
+ cache-dependency-path: '.github/package-lock.json'
+
+ - name: Install dependencies
+ working-directory: .github
+ run: npm ci --only=production
+
+ - name: Setup Python
+ uses: actions/setup-python@v4
+ with:
+ python-version: '3.11'
+
+      - name: Set up Python venv and install dependencies
+        run: |
+          # Create the venv if it does not already exist, then install dependencies
+          if [ ! -d utils/py/.venv ]; then
+            python -m venv utils/py/.venv
+          fi
+          source utils/py/.venv/bin/activate
+          pip install --upgrade pip
+          if [ -f utils/py/requirements.txt ]; then
+            pip install -r utils/py/requirements.txt
+          fi
+
+ - name: Configure Git
+ run: |
+ git config --global user.name "github-actions[bot]"
+ git config --global user.email "github-actions[bot]@users.noreply.github.com"
+
+ - name: Run auto PR creation
+ env:
+ GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+ # OpenAI API key for AI-generated content (optional)
+ OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY || '' }}
+ DRY_RUN: ${{ inputs.dry_run || 'false' }}
+ SPECIFIC_ISSUES: ${{ inputs.issue_numbers || '' }}
+ run: |
+ # Run the auto-PR creation script from .github directory
+ cd .github
+ node scripts/auto-pr-from-issues.js
+
+ # Return to root and validate generated documentation
+ cd ..
+ echo "Validating generated documentation..."
+ source utils/py/.venv/bin/activate
+ ./utils/run_tests.sh
+
+ - name: Generate summary
+ if: always()
+ run: |
+ echo "## Auto PR Results" >> $GITHUB_STEP_SUMMARY
+ echo "| Metric | Value |" >> $GITHUB_STEP_SUMMARY
+ echo "|--------|-------|" >> $GITHUB_STEP_SUMMARY
+ echo "| Mode | ${{ inputs.dry_run == 'true' && 'DRY RUN (testing)' || 'WET RUN (creates PRs)' }}" >> $GITHUB_STEP_SUMMARY
+ echo "| Trigger | ${{ github.event_name }}" >> $GITHUB_STEP_SUMMARY
+ echo "| Branch | ${{ github.ref_name }}" >> $GITHUB_STEP_SUMMARY
+ echo "| Repository | ${{ env.TARGET_OWNER }}/${{ env.TARGET_REPO }}" >> $GITHUB_STEP_SUMMARY
diff --git a/AI_AGENT_REFERENCE_GUIDE.md b/AI_AGENT_REFERENCE_GUIDE.md
new file mode 100644
index 000000000..08c28192b
--- /dev/null
+++ b/AI_AGENT_REFERENCE_GUIDE.md
@@ -0,0 +1,199 @@
+# AI Agent Reference Guide for Komodo DeFi Framework Documentation
+
+## Overview
+
+This guide explains the automated documentation system used in the komodo-docs-mdx repository, specifically designed for AI agents creating documentation for KDF (Komodo DeFi Framework) API methods.
+
+## Critical Concepts
+
+### 1. Auto-Generated Files (DO NOT MANUALLY EDIT OR COMMIT)
+
+These files are automatically generated by scripts and should NEVER be manually edited or committed:
+
+- `src/pages/komodo-defi-framework/api/index.mdx` - The main API methods table
+- `filepathSlugs.json` - Contains anchor slugs for all documentation files
+
+### 2. File Structure and Naming Conventions
+
+#### API Version Directories
+- `src/pages/komodo-defi-framework/api/legacy/` - Legacy RPC methods
+- `src/pages/komodo-defi-framework/api/v20/` - Version 2.0 stable methods
+- `src/pages/komodo-defi-framework/api/v20-dev/` - Version 2.0 development methods
+
+#### Method Path Conversion
+KDF methods use a specific naming convention:
+- **Canonical form**: `task::enable_utxo::init` (uses `::` separators)
+- **File/folder name**: `task-enable_utxo-init` (replace `::` with `-`, preserve `_`)
+
+### 3. Documentation File Requirements
+
+Every KDF method documentation file MUST contain:
+
+#### Required Elements for Script Parsing
+
+1. **Method Label in Heading**:
+ ```mdx
+ ## method_name {{label : 'method_name', tag : 'API-v2'}}
+ ```
+
+2. **CodeGroup with label attribute**:
+ ```mdx
+
+ ```
+
+3. **File Structure**:
+ ```
+ method_directory/
+ └── index.mdx
+ ```
+
+#### Document Structure Template
+
+```mdx
+export const title = "Komodo DeFi Framework Method: Method Name";
+export const description = "Description of what this method does.";
+
+# method_name
+
+## method_name {{label : 'method_name', tag : 'API-v2'}}
+
+Description of the method functionality.
+
+### Arguments
+
+| Structure | Type | Description |
+|-----------|------|-------------|
+| param1 | string | Parameter description |
+
+### Response
+
+| Structure | Type | Description |
+|-----------|------|-------------|
+| result | object | Response description |
+
+### 📌 Examples
+
+
+ ```json
+ {
+ "userpass": "RPC_UserP@SSW0RD",
+ "method": "method_name",
+ "params": {}
+ }
+ ```
+
+```
+
+### 4. Build Process Flow
+
+When documentation is created, this sequence occurs:
+
+1. **Documentation files created** by AI agent in appropriate directories
+2. **`utils/gen_api_methods_table.py`** runs and:
+ - Scans all KDF documentation files
+ - Extracts method names from `label` attributes
+ - Generates anchor slugs from headings and content
+ - Updates `filepathSlugs.json` with new entries
+ - Regenerates `src/pages/komodo-defi-framework/api/index.mdx`
+3. **Navigation files updated** (sidebar.json) manually by AI agent
+4. **Tests validate** the structure
+
+### 5. Slug Generation Rules
+
+Slugs are auto-generated from:
+- Main method heading (becomes primary slug)
+- Section headings (arguments, response, examples, etc.)
+- Collapsible section titles
+- Error response headings
+
+Slug transformation:
+- Convert to lowercase
+- Replace spaces and underscores with hyphens
+- Remove special characters except hyphens
+- Example: "Response (success)" → "response-success"
+
+### 6. Navigation Integration
+
+#### Sidebar Updates
+Manual updates required in `src/data/sidebar.json`:
+```json
+{
+ "title": "method_name",
+ "href": "/komodo-defi-framework/api/legacy/method_name/"
+}
+```
+
+#### Changelog Updates
+Add new methods to `src/pages/komodo-defi-framework/changelog/index.mdx`.
+
+## AI Agent Instructions
+
+### DO
+1. ✅ Create properly structured documentation files
+2. ✅ Use correct label attributes in headings and CodeGroups
+3. ✅ Follow the directory structure conventions
+4. ✅ Update sidebar.json manually
+5. ✅ Update changelog manually
+6. ✅ Let the build process generate slugs and API index
+
+### DON'T
+1. ❌ Edit `src/pages/komodo-defi-framework/api/index.mdx` manually
+2. ❌ Edit `filepathSlugs.json` manually
+3. ❌ Commit auto-generated files
+4. ❌ Skip the label attributes in headings/CodeGroups
+5. ❌ Use inconsistent naming conventions
+
+### Validation Process
+
+Always run the test suite after creating documentation:
+```bash
+source utils/py/.venv/bin/activate
+./utils/run_tests.sh
+```
+
+This validates:
+- File structure integrity
+- Method extraction from labels
+- Slug generation
+- Internal link consistency
+- Compact table data
+- Changelog updates
+
+### Common Pitfalls
+
+1. **Missing labels**: Method won't be detected by scripts
+2. **Wrong directory structure**: Files not found by glob patterns
+3. **Inconsistent naming**: Breaks the method path conversion logic
+4. **Manual slug editing**: Gets overwritten by auto-generation
+5. **Committing auto-generated files**: Creates merge conflicts
+
+## Example: Complete Process
+
+For a new method `consolidate_utxos`:
+
+1. **Create documentation**: `src/pages/komodo-defi-framework/api/legacy/consolidate_utxos/index.mdx`
+2. **Include required elements**:
+ ```mdx
+ ## consolidate_utxos {{label : 'consolidate_utxos', tag : 'API-v2'}}
+
+
+ ```
+3. **Update sidebar**: Add entry to `src/data/sidebar.json`
+4. **Update changelog**: Add feature description
+5. **Run tests**: `./utils/run_tests.sh` - this auto-generates:
+ - Slugs in `filepathSlugs.json`
+ - Method table in `src/pages/komodo-defi-framework/api/index.mdx`
+6. **Commit only** (NEVER commit auto-generated files):
+ - ✅ Documentation file (`src/pages/komodo-defi-framework/api/legacy/consolidate_utxos/index.mdx`)
+ - ✅ Sidebar.json updates (`src/data/sidebar.json`)
+ - ✅ Changelog updates (`src/pages/komodo-defi-framework/changelog/index.mdx`)
+ - ❌ **NEVER COMMIT**: `src/pages/komodo-defi-framework/api/index.mdx` (auto-generated)
+ - ❌ **NEVER COMMIT**: `filepathSlugs.json` (auto-generated)
+
+### Files That Should NEVER Be Manually Edited or Committed
+
+- `src/pages/komodo-defi-framework/api/index.mdx` - Generated by `gen_api_methods_table.py`
+- `filepathSlugs.json` - Generated by build scripts parsing documentation files
+- Any file marked with "auto-generated" comments
+
+This approach ensures consistency, reduces conflicts, and maintains the automated build system.
diff --git a/src/data/sidebar.json b/src/data/sidebar.json
index 639c2b85e..2f34feb3e 100644
--- a/src/data/sidebar.json
+++ b/src/data/sidebar.json
@@ -1136,6 +1136,10 @@
"title": "convertaddress",
"href": "/komodo-defi-framework/api/legacy/convertaddress/"
},
+ {
+ "title": "consolidate_utxos",
+ "href": "/komodo-defi-framework/api/legacy/consolidate_utxos/"
+ },
{
"title": "convert_utxo_address",
"href": "/komodo-defi-framework/api/legacy/convert_utxo_address/"
diff --git a/src/pages/komodo-defi-framework/api/index.mdx b/src/pages/komodo-defi-framework/api/index.mdx
index 073edc5fe..29406ce4e 100644
--- a/src/pages/komodo-defi-framework/api/index.mdx
+++ b/src/pages/komodo-defi-framework/api/index.mdx
@@ -31,6 +31,7 @@ Below is a table of the currently available legacy, v2.0 and v2.0 (Dev) methods:
| | [clear\_nft\_db](/komodo-defi-framework/api/v20/non_fungible_tokens/clear_nft_db/#clear-nft-db) | |
| | [close\_channel](/komodo-defi-framework/api/v20/lightning/channels/#close-channel) | |
| [coins\_needed\_for\_kick\_start](/komodo-defi-framework/api/legacy/coins_needed_for_kick_start/#coins-needed-for-kick-start) | | |
+| [consolidate\_utxos](/komodo-defi-framework/api/legacy/consolidate_utxos/#consolidate-utxos) | | |
| [convert\_utxo\_address](/komodo-defi-framework/api/legacy/convert_utxo_address/#convert-utxo-address) | | |
| [convertaddress](/komodo-defi-framework/api/legacy/convertaddress/#convertaddress) | | |
| | [delete\_wallet](/komodo-defi-framework/api/v20/wallet/delete_wallet/#delete-wallet) | |
diff --git a/src/pages/komodo-defi-framework/api/legacy/consolidate_utxos/index.mdx b/src/pages/komodo-defi-framework/api/legacy/consolidate_utxos/index.mdx
new file mode 100644
index 000000000..261899c33
--- /dev/null
+++ b/src/pages/komodo-defi-framework/api/legacy/consolidate_utxos/index.mdx
@@ -0,0 +1,166 @@
+export const title = "Komodo DeFi Framework Method: Consolidate UTXOs";
+export const description = "The consolidate_utxos method consolidates unspent transaction outputs (UTXOs) for a specified UTXO-based coin to optimize transaction efficiency and reduce wallet fragmentation.";
+
+# consolidate\_utxos
+
+**consolidate\_utxos coin merge\_conditions**
+
+The `consolidate_utxos` method consolidates unspent transaction outputs (UTXOs) for a specified UTXO-based coin. This operation helps optimize transaction efficiency by reducing wallet fragmentation and lowering the number of UTXOs, which can improve transaction creation speed and reduce fees.
+
+
+ This method is specifically designed for UTXO-based coins (Bitcoin, Litecoin, KMD, etc.) and will not work with account-based coins like Ethereum or ERC-20 tokens.
+
+
+## Arguments
+
+| Structure | Type | Description |
+| -------------------------------------- | ------- | ------------------------------------------------------------------------------------------ |
+| coin | string | The name of the UTXO-based coin to consolidate |
+| merge\_conditions | object | Optional. Configuration object for consolidation behavior |
+| merge\_conditions.merge\_at | integer | Optional. Minimum number of lone UTXOs that triggers automatic consolidation. Default: 100 |
+| merge\_conditions.max\_merge\_at\_once | integer | Optional. Maximum number of UTXOs to merge in a single transaction. Default: 50 |
+
+## Response
+
+| Structure | Type | Description |
+| ------------------- | ---------------- | -------------------------------------------------------------------------- |
+| tx\_hash | string | The hash of the consolidation transaction |
+| tx\_hex | string | The raw transaction hex that can be broadcast using `send_raw_transaction` |
+| from | array of strings | Source addresses (UTXOs) that were consolidated |
+| to | array of strings | Destination address where consolidated funds were sent |
+| total\_amount | string (numeric) | Total amount consolidated |
+| spent\_by\_me | string (numeric) | Total amount spent including fees |
+| received\_by\_me | string (numeric) | Amount received after consolidation |
+| my\_balance\_change | string (numeric) | Net balance change (typically negative due to transaction fees) |
+| fee\_details | object | Fee information for the consolidation transaction |
+| coin | string | The coin that was consolidated |
+| utxos\_merged | integer | Number of UTXOs that were successfully merged |
+
+#### 📌 Examples
+
+#### Basic UTXO Consolidation
+
+
+ ```json
+ {
+ "userpass": "RPC_UserP@SSW0RD",
+ "method": "consolidate_utxos",
+ "coin": "KMD"
+ }
+ ```
+
+
+
+ #### Response (success)
+
+ ```json
+ {
+ "tx_hash": "7e0e1b0c8a2f1d9e3c4b5a6789012345678901234567890123456789abcdef01",
+ "tx_hex": "0400008085202f890312345...abcdef",
+ "from": [
+ "RGVd8VTPgKXWo9JGjUSdkkDKFzCUKKXVDJ",
+ "RGVd8VTPgKXWo9JGjUSdkkDKFzCUKKXVDJ",
+ "RGVd8VTPgKXWo9JGjUSdkkDKFzCUKKXVDJ"
+ ],
+ "to": [
+ "RGVd8VTPgKXWo9JGjUSdkkDKFzCUKKXVDJ"
+ ],
+ "total_amount": "15.75",
+ "spent_by_me": "15.75001",
+ "received_by_me": "15.74999",
+ "my_balance_change": "-0.00001",
+ "fee_details": {
+ "type": "Utxo",
+ "amount": "0.00001"
+ },
+ "coin": "KMD",
+ "utxos_merged": 25
+ }
+ ```
+
+
+#### UTXO Consolidation with Custom Merge Conditions
+
+
+ ```json
+ {
+ "userpass": "RPC_UserP@SSW0RD",
+ "method": "consolidate_utxos",
+ "coin": "BTC",
+ "merge_conditions": {
+ "merge_at": 50,
+ "max_merge_at_once": 25
+ }
+ }
+ ```
+
+
+
+ #### Response (success)
+
+ ```json
+ {
+ "tx_hash": "8f1f2e3d4c5b6a7890123456789012345678901234567890123456789abcdef02",
+ "tx_hex": "0100000012345...abcdef",
+ "from": [
+ "1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa",
+ "1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"
+ ],
+ "to": [
+ "1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"
+ ],
+ "total_amount": "0.5",
+ "spent_by_me": "0.50001",
+ "received_by_me": "0.49999",
+ "my_balance_change": "-0.00001",
+ "fee_details": {
+ "type": "Utxo",
+ "amount": "0.00001"
+ },
+ "coin": "BTC",
+ "utxos_merged": 25
+ }
+ ```
+
+
+#### Response (error - coin not activated)
+
+
+ ```json
+ {
+ "error": "rpc:174] dispatcher_legacy:155] lp_coins:1668] Coin BTC is not activated"
+ }
+ ```
+
+
+#### Response (error - insufficient UTXOs to consolidate)
+
+
+ ```json
+ {
+ "error": "Not enough UTXOs to consolidate. Current UTXO count: 5, minimum required: 50"
+ }
+ ```
+
+
+#### Response (error - non-UTXO coin)
+
+
+ ```json
+ {
+ "error": "UTXO consolidation is not supported for ETH. This method only works with UTXO-based coins."
+ }
+ ```
+
+
+## Additional Notes
+
+* **Transaction Broadcasting**: The `consolidate_utxos` method only creates the consolidation transaction. You must use the [`send_raw_transaction`](/komodo-defi-framework/api/legacy/send_raw_transaction/) method to broadcast the transaction to the network.
+
+* **Fee Optimization**: Consolidating UTXOs can reduce future transaction fees by minimizing the number of inputs required for subsequent transactions.
+
+* **Timing Considerations**: It's generally best to consolidate UTXOs during periods of low network congestion to minimize transaction fees.
+
+* **Privacy Impact**: Consolidation transactions can link multiple addresses/UTXOs together, which may impact privacy. Consider this when deciding whether to consolidate.
+
+* **Automatic Triggering**: If you don't specify `merge_conditions`, the default behavior will trigger consolidation when you have 100 or more UTXOs, merging up to 50 at once.
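+
+For illustration, omitting `merge_conditions` is equivalent to passing the documented defaults explicitly (values taken from the Arguments table above):
+
+```json
+{
+  "userpass": "RPC_UserP@SSW0RD",
+  "method": "consolidate_utxos",
+  "coin": "KMD",
+  "merge_conditions": {
+    "merge_at": 100,
+    "max_merge_at_once": 50
+  }
+}
+```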
diff --git a/src/pages/komodo-defi-framework/changelog/index.mdx b/src/pages/komodo-defi-framework/changelog/index.mdx
index 1f3c63612..12ebb9043 100644
--- a/src/pages/komodo-defi-framework/changelog/index.mdx
+++ b/src/pages/komodo-defi-framework/changelog/index.mdx
@@ -4,6 +4,18 @@ export const description =
# Change Log
+## Komodo DeFi Framework v2.6.0-beta (Upcoming)
+
+### UTXO Management Enhancement
+
+This upcoming release introduces improved UTXO management capabilities to help users optimize their wallet performance and transaction efficiency.
+
+#### 🔧 UTXO Optimization
+
+* **feat(utxo): introduce a utxo consolidation rpc** [PR #2587](https://github.com/KomodoPlatform/komodo-defi-framework/pull/2587): A new [`consolidate_utxos`](/komodo-defi-framework/api/legacy/consolidate_utxos/) RPC method that allows users to consolidate unspent transaction outputs (UTXOs) for UTXO-based coins. This feature helps reduce wallet fragmentation, improve transaction creation speed, and potentially lower transaction fees by merging multiple small UTXOs into fewer larger ones. The method includes configurable merge conditions such as minimum UTXO count thresholds and maximum UTXOs per consolidation transaction.
+
+***
+
## Komodo DeFi Framework v2.5.1-beta
### Offline Private Key Export and First AI Agent Contribution