
Help us spread the word — context-mode needs your voice #134

@mksglu


The ask

I'm the solo maintainer of context-mode. I built it, I maintain it, I write the docs, I fix the bugs, and I try to tell people about it — but that last part is where I'm stuck.

I have no marketing budget. No DevRel team. No growth hacker. Just a good tool that solves a real problem, and a growing community of developers who use it every day. If you've found context-mode useful, I'm asking for your help to get it in front of more people.

What context-mode does (the short version)

AI coding agents (Claude Code, Cursor, Copilot, Gemini CLI, etc.) burn through their context windows fast. A single Playwright snapshot eats 56 KB. Twenty GitHub issues cost 59 KB. One access log wipes 45 KB. Your agent starts forgetting things, code quality drops, and you restart the session.

context-mode fixes this. It sandboxes all tool output — command results, API responses, file analysis, web pages — into a local FTS5 knowledge base. The agent searches it instead of reading raw data. Result: ~98% context savings with hooks enabled. Sessions go from ~30 minutes to ~3 hours.
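To make the mechanism concrete, here is a minimal sketch of the general technique (plain Python with SQLite's FTS5 extension, not context-mode's actual code): bulky tool output is stored in a local full-text-searchable table, and the agent retrieves only what matches a query instead of reading the raw dump.

```python
# Minimal sketch of the sandboxing idea: store bulky tool output in a
# local SQLite FTS5 table, then search it instead of reading raw data.
# This is an illustration of the general technique, not context-mode's code.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE tool_output USING fts5(source, content)")

# Imagine this is a 56 KB Playwright snapshot; only a slice is shown here.
snapshot = "\n".join([
    "button 'Submit' [ref=s1e42]",
    "textbox 'Email address' [ref=s1e17]",
    "link 'Forgot password?' [ref=s1e23]",
])
db.execute("INSERT INTO tool_output VALUES (?, ?)", ("playwright", snapshot))

# The agent issues an intent-driven query instead of re-reading the snapshot.
rows = db.execute(
    "SELECT content FROM tool_output WHERE tool_output MATCH ?", ("email",)
).fetchall()
print(len(rows))  # matching documents come back, not the full 56 KB
```

The context saving comes from the asymmetry: the full output is written once to local storage, but only the few matching rows ever re-enter the agent's context window.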

  • 12 supported platforms: Claude Code, Gemini CLI, VS Code Copilot, Cursor, OpenCode, OpenClaw, Codex CLI, Antigravity, Kiro, KiloCode, Zed, Pi
  • Privacy-first: Nothing leaves your machine. No telemetry, no cloud, no accounts, no tracking. Your code stays yours.
  • ELv2 licensed: Source-available. You can read every line, self-host, and modify for your own use.
  • Free: No per-seat pricing. No usage limits.

Current traction

We've had some good moments, but sustaining that momentum takes community effort:

  • Hacker News: Had a successful Show HN post. Unfortunately, subsequent updates can't be shared there due to moderation constraints (I've spoken with a mod directly).
  • YouTube: This walkthrough video reached a solid audience and continues to bring in new users.
  • X (Twitter) & LinkedIn: I share updates actively, but breaking through the noise as a solo developer is hard. The right people — engineering leads, developer advocates, AI-focused teams — aren't seeing it yet.
  • npm: Available as context-mode — one install, works everywhere.
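If you want to try it before recommending it, the package is on npm. Assuming the standard global-install flow (check the project README for the exact per-platform setup steps, which may differ):

```shell
# Standard npm global install; per-platform configuration
# after installing is covered in the project README.
npm install -g context-mode
```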

How you can help

Pick whatever feels natural to you. Even one small action makes a difference.

Share on social media

Post about context-mode on your platform of choice — Twitter/X, LinkedIn, Reddit, Mastodon, Bluesky, Dev.to, Medium, Hashnode, whatever you use. Tag me so I can amplify it.

Write about it

Blog posts, tutorials, comparisons, "how I set up my AI coding workflow" articles — anything that shows context-mode in action. If you write something, open a PR to add it to our README and I'll happily link it.

Make a video

YouTube tutorials, shorts, screen recordings of your setup process, before/after comparisons showing context savings. Visual proof is the most convincing format.

Recommend it at work

If your team uses AI coding agents, suggest context-mode in your next standup or tooling discussion. It takes 2 minutes to install and the savings are measurable from the first session.

Connect us with the right people

  • Developer advocates and content creators who cover AI coding tools
  • Engineering teams evaluating AI agent infrastructure
  • VCs or companies investing in AI developer tooling
  • Community managers of relevant Discord servers, Slack groups, or forums
  • Conference organizers looking for talks about AI developer experience

Suggest communities

Know a subreddit, Discord server, forum, or newsletter where developers discuss AI coding tools? Drop it in the comments. I'll engage genuinely — not spam.

Ready-to-use sharing templates

Feel free to copy, modify, or use these as a starting point.

Twitter/X

I've been using context-mode — a source-available MCP plugin that saves ~98% of context window for AI coding agents.

Works with Claude Code, Gemini CLI, Copilot, Cursor, and 8 more platforms. Free, self-hosted, no telemetry.

Before: sessions die in 30 min. After: 3+ hours.

https://github.com/mksglu/context-mode

LinkedIn

If your engineering team uses AI coding agents, you're probably burning through context windows faster than you think.

context-mode is a source-available MCP plugin that sandboxes tool output into a local knowledge base. The AI searches instead of reading raw data. Result: ~98% context savings — sessions go from ~30 minutes to 3+ hours.

It supports 12 platforms (Claude Code, Gemini CLI, VS Code Copilot, Cursor, OpenCode, OpenClaw, Codex CLI, Antigravity, Kiro, KiloCode, Zed, Pi), runs entirely on your machine, and collects zero telemetry. ELv2 licensed, free to use.

Worth evaluating if your team is hitting context limits during code reviews, debugging, or repo research.

https://github.com/mksglu/context-mode

Reddit / Forum post

context-mode — source-available MCP plugin that saves ~98% of context window for AI coding agents

I've been using this tool that changes how AI agents handle context. Instead of dumping raw command output, API responses, and file contents into the context window, it sandboxes everything into a local FTS5 database. The agent searches it with intent-driven queries instead of reading megabytes of raw text.

Real numbers: a Playwright snapshot goes from 56 KB to 299 bytes. 20 GitHub issues go from 59 KB to 1.1 KB. Sessions extend from ~30 min to ~3 hours.

Works with Claude Code, Gemini CLI, VS Code Copilot, Cursor, OpenCode, OpenClaw, Codex CLI, Antigravity, Kiro, KiloCode, Zed, and Pi. Privacy-first — everything stays local, no cloud, no telemetry.

GitHub: https://github.com/mksglu/context-mode

Key talking points

If someone asks you "why should I care?" — here's what matters:

  • Context savings: ~98% with hooks enabled, ~60% without
  • Session extension: ~30 min → ~3 hours
  • Platforms supported: 12 (Claude Code, Gemini CLI, VS Code Copilot, Cursor, OpenCode, OpenClaw, Codex CLI, Antigravity, Kiro, KiloCode, Zed, Pi)
  • Privacy: zero telemetry, fully local, no cloud, no account
  • License: ELv2 (source-available, free to use)
  • Install time: under 2 minutes
  • Playwright snapshot: 56.2 KB → 299 B (99.5% saved)
  • 20 GitHub issues: 58.9 KB → 1.1 KB (98.1% saved)
  • 500-row CSV analysis: 85.5 KB → 222 B (99.7% saved)
  • Full session (real usage): 315 KB → 5.4 KB (98.3% saved)
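The savings percentages follow directly from the before/after sizes, so anyone repeating these claims can verify the arithmetic themselves:

```python
# Arithmetic check of the savings figures quoted above.
def pct_saved(before_bytes: float, after_bytes: float) -> float:
    """Percentage of the original size eliminated from the context window."""
    return 100 * (1 - after_bytes / before_bytes)

print(round(pct_saved(56_200, 299), 1))    # Playwright snapshot → 99.5
print(round(pct_saved(58_900, 1_100), 1))  # 20 GitHub issues    → 98.1
print(round(pct_saved(85_500, 222), 1))    # 500-row CSV         → 99.7
```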

A note on why this matters

Context window management isn't a nice-to-have — it's the bottleneck that determines whether AI coding agents are useful for real work or just demos. Every developer hitting "context limit reached" mid-debugging session feels this pain. context-mode exists because I felt it too, and I think every developer using AI agents deserves a solution that's free, private, and open.

If this tool has saved you time or frustration, paying it forward with a share or recommendation means more than you know. Thank you.


Maintained by @mksglu. Every contribution to spreading the word helps keep this project alive and growing.
