# Prompter
Code smarter with AI—no more messy copy-pasting. Prompter structures your prompts and applies AI changes seamlessly, streamlining your coding workflow.
## Why Prompter?
- Too much bloat in your repo? Stop zipping everything—send only the files that matter.
- LLM underperforming? Cut the noise for sharper, more accurate responses.
- Better AI coding? Select just the right context to optimize results.
Prompter empowers you to work efficiently with AI, reducing token waste and improving clarity.
## Features
- **Advanced File Selection & Token Estimation:** Precisely filter files and estimate token usage instantly for optimized, cost-effective prompts.
- **Optimized XML Prompts:** Structured file trees, CodeMaps, content, and instructions in XML for maximum LLM clarity.
- **Structured XML Diffs:** Converts LLM-generated XML edits into precise, reviewable diffs—works at any file size.
- **Codemap Extraction:** Scans files locally to extract classes, functions, and references, minimizing tokens and hallucinations. Auto-detects referenced types.
- **Mac-Native Performance:** Built for macOS with native speed and responsiveness—because performance matters.
- **Clipboard Integration:** Copy structured prompts into any AI chat app—your data stays local, no external API needed.
- **Works with Any Model:** Compatible with OpenAI, Anthropic, DeepSeek, Gemini, Azure, OpenRouter, and local models—private and offline when you need it.
- **Privacy First:** Local models, offline scanning, and direct clipboard use—no intermediaries required.
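As a rough illustration of how these pieces fit together, a structured prompt combining a file tree, a CodeMap, file contents, and instructions might look like the following. The tag names here are hypothetical, chosen for this sketch only; Prompter's actual output schema may differ.

```xml
<prompt>
  <!-- Compact tree of the selected files -->
  <file_tree>
src/
  index.ts
  utils.ts
  </file_tree>

  <!-- CodeMap: signatures only, to save tokens -->
  <codemap file="src/utils.ts">
function estimateTokens(text: string): number
  </codemap>

  <!-- Full contents only for files that need editing -->
  <file path="src/index.ts"><![CDATA[
// ...file contents...
]]></file>

  <instructions>
Rename estimateTokens to countTokens and update all call sites.
  </instructions>
</prompt>
```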
## Installation

> Note: these steps assume a standard VS Code extension development setup; adjust as needed for your environment.

- Clone the repository: `git clone <repository-url>`
- Open the project in VS Code.
- Install dependencies: `npm install`
- Build the extension: `npm run compile`
- Press `F5` in VS Code to launch the extension in a development window.
## Usage
- Open your project in VS Code.
- Use the Prompter interface to select files and estimate tokens.
- Generate a structured XML prompt via the clipboard.
- Paste into your preferred AI model (e.g., ChatGPT, Claude, or a local LLM).
- Apply the returned XML diffs directly through Prompter for seamless integration.
Contributing
We welcome contributions! To get started:
- Fork the repository.
- Create a feature branch:
git checkout -b my-feature. - Commit your changes:
git commit -m "Add my feature". - Push to your branch:
git push origin my-feature. - Open a pull request.
See vsc-extension-quickstart.md for development setup and testing details.
Built with ❤️ by the Prompter team.
Code smarter with AI—no more messy copy-pasting. Prompter structures your prompts and applies AI changes seamlessly, streamlining your coding workflow.
Why Prompter?
- Too much bloat in your repo? Stop zipping everything—send only the files that matter.
- LLM underperforming? Cut the noise for sharper, more accurate responses.
- Better AI coding? Select just the right context to optimize results.
Prompter empowers you to work efficiently with AI, reducing token waste and improving clarity.
Features
-
Advanced File Selection & Token Estimation Precisely filter files and estimate token usage instantly for optimized, cost-effective prompts.
-
Optimized XML Prompt Structured file trees, CodeMaps, content, and instructions in XML for maximum LLM clarity.
-
Structured XML Diffs Converts LLM-generated XML edits into precise, reviewable diffs—works at any file size.
-
Codemap Extraction Scans files locally to extract classes, functions, and references, minimizing tokens and hallucinations. Auto-detects referenced types.
-
Mac-Native Performance Built for macOS with native speed and responsiveness—because performance matters.
-
Clipboard Integration Copy structured prompts into any AI chat app—your data stays local, no external API needed.
-
Works with Any Model Compatible with OpenAI, Anthropic, DeepSeek, Gemini, Azure, OpenRouter, and local models—private and offline when you need it.
-
Privacy First Local models, offline scanning, and direct clipboard use—no intermediaries required.
Installation
(Note: Installation steps are assumed based on the VS Code context from other files. Adjust as needed.)
- Clone the repository:
git clone <repository-url> - Open the project in VS Code.
- Install dependencies:
npm install - Build the extension:
npm run compile - Press
F5in VS Code to launch the extension in a development window.
Usage
- Open your project in VS Code.
- Use the Prompter interface to select files and estimate tokens.
- Generate a structured XML prompt via the clipboard.
- Paste into your preferred AI model (e.g., ChatGPT, Claude, or a local LLM).
- Apply the returned XML diffs directly through Prompter for seamless integration.
Contributing
We welcome contributions! To get started:
- Fork the repository.
- Create a feature branch:
git checkout -b my-feature. - Commit your changes:
git commit -m "Add my feature". - Push to your branch:
git push origin my-feature. - Open a pull request.
See vsc-extension-quickstart.md for development setup and testing details.
Built with ❤️ by the Prompter team.