I've created a Node.js prompt version control system with a CLI, a web dashboard, and an AI playground. It uses "bubbles" as its organizational unit and supports multiple LLM providers. Key features:
## Core Components
### Dashboard & Web Interface
- Main Dashboard: Overview with statistics, bubble management, and recent activity
- Bubble Detail View: Manage prompts within bubbles with version history
- AI Playground: Test prompts across different models with parameter tuning
### CLI Tool
- Bubble Management: Create, list, and switch between bubbles
- Prompt Operations: Create, list, and version prompts
- Testing: Direct prompt testing from command line
- Status Tracking: Monitor current bubble and system state
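The CLI operations above can be sketched as a small command layer over an in-memory store. The function names (`createBubble`, `useBubble`, `status`) and the state shape are illustrative, not the tool's actual API:

```javascript
// Illustrative sketch of the CLI's core commands over an in-memory store.
const state = { bubbles: {}, current: null };

// `bubble create <name>`
function createBubble(name) {
  if (state.bubbles[name]) throw new Error(`bubble "${name}" already exists`);
  state.bubbles[name] = { name, prompts: {}, createdAt: new Date().toISOString() };
  return state.bubbles[name];
}

// `bubble use <name>` — switch the active bubble
function useBubble(name) {
  if (!state.bubbles[name]) throw new Error(`no such bubble: ${name}`);
  state.current = name;
  return state.current;
}

// `status` — report the current bubble and system state
function status() {
  return {
    currentBubble: state.current,
    bubbleCount: Object.keys(state.bubbles).length,
  };
}
```

A real CLI would wrap these in an argument parser and persist `state` to disk between invocations.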
### REST API
- Bubble Endpoints: CRUD operations for bubble management
- Prompt Management: Version control and history tracking
- Playground Integration: Execute prompts through the AI SDK
- Statistics: System-wide metrics and activity feeds
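The bubble CRUD endpoints can be sketched as framework-agnostic handlers; the route paths in the comments (`/api/bubbles`, `/api/bubbles/:name`) are assumptions about the API surface, not the documented routes:

```javascript
// Illustrative sketch of the bubble CRUD handlers, decoupled from any web framework.
const db = { bubbles: new Map() };

const handlers = {
  // POST /api/bubbles
  createBubble(body) {
    if (db.bubbles.has(body.name)) return { status: 409, error: 'exists' };
    const bubble = { name: body.name, prompts: [] };
    db.bubbles.set(body.name, bubble);
    return { status: 201, data: bubble };
  },
  // GET /api/bubbles
  listBubbles() {
    return { status: 200, data: [...db.bubbles.values()] };
  },
  // DELETE /api/bubbles/:name
  deleteBubble(name) {
    return db.bubbles.delete(name)
      ? { status: 204 }
      : { status: 404, error: 'not found' };
  },
};
```

Keeping handlers as plain functions like this makes them easy to unit-test before mounting them on an HTTP server.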
"Bubbles" Concept
Instead of traditional repositories, the system uses "bubbles" to organize prompts:
- Logical Grouping: Related prompts stay together (e.g., "Content Generation", "Code Assistant")
- Context Isolation: Different use cases remain separate
- Easy Navigation: Switch between contexts seamlessly
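Context isolation falls out naturally if every prompt lookup is scoped to a bubble, so the same prompt name can exist in two bubbles without colliding. A minimal sketch (the helper names are hypothetical):

```javascript
// Illustrative sketch of bubble-scoped prompt storage.
const bubbles = new Map();

function getBubble(name) {
  if (!bubbles.has(name)) bubbles.set(name, { name, prompts: new Map() });
  return bubbles.get(name);
}

// Prompts are always read and written through a bubble, never globally.
function setPrompt(bubbleName, promptName, content) {
  getBubble(bubbleName).prompts.set(promptName, content);
}

function getPrompt(bubbleName, promptName) {
  return getBubble(bubbleName).prompts.get(promptName);
}
```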
## Version Control Features
- Semantic Versioning: Track changes with Major.Minor.Patch format
- Change History: Detailed changelogs for each version
- Rollback Capability: Revert to any previous version
- Experiment Tracking: Save successful playground sessions
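The Major.Minor.Patch scheme above amounts to a small bump function; this is a sketch of the idea, not the system's actual implementation:

```javascript
// Illustrative semantic-version bump: major resets minor and patch,
// minor resets patch, patch increments in place.
function bumpVersion(version, change) {
  const [major, minor, patch] = version.split('.').map(Number);
  switch (change) {
    case 'major': return `${major + 1}.0.0`;
    case 'minor': return `${major}.${minor + 1}.0`;
    case 'patch': return `${major}.${minor}.${patch + 1}`;
    default: throw new Error(`unknown change type: ${change}`);
  }
}
```

For example, `bumpVersion('1.2.3', 'minor')` yields `'1.3.0'`.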
## Multi-LLM Support
The system integrates with the AI SDK to support:
- OpenAI GPT models (GPT-4o, GPT-4o Mini)
- Anthropic Claude (Claude 3.5 Sonnet)
- Google Gemini Pro
- Extensible architecture for adding new providers
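The extensibility point can be sketched as a provider registry: each provider registers its model list and a call function behind one interface, so adding a new provider is a single registration. The `call` signature here is an assumption for illustration, not the AI SDK's actual API:

```javascript
// Illustrative provider registry for multi-LLM support.
const providers = new Map();

function registerProvider(id, { models, call }) {
  providers.set(id, { models, call });
}

// Dispatch a prompt to whichever provider owns the requested model.
async function runPrompt(providerId, model, prompt, params = {}) {
  const provider = providers.get(providerId);
  if (!provider) throw new Error(`unknown provider: ${providerId}`);
  if (!provider.models.includes(model)) throw new Error(`unknown model: ${model}`);
  return provider.call(model, prompt, params);
}

// Adding a provider is one call; the stubbed `call` stands in for a real API client.
registerProvider('openai', {
  models: ['gpt-4o', 'gpt-4o-mini'],
  call: async (model, prompt) => ({ model, text: `stubbed reply to: ${prompt}` }),
});
```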
## AI Playground
- Real-time Testing: Execute prompts with different models
- Parameter Tuning: Adjust temperature, max tokens, top-p, etc.
- Experiment History: Track all test runs
- Save to Prompts: Convert successful experiments to versioned prompts
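Parameter tuning implies validating user input before a run; a sketch of that step, where the defaults and ranges are assumptions chosen for illustration:

```javascript
// Illustrative normalization of playground parameters:
// fill in defaults, then clamp each value to its allowed range.
const PARAM_SPEC = {
  temperature: { min: 0, max: 2, default: 0.7 },
  maxTokens:   { min: 1, max: 4096, default: 1024 },
  topP:        { min: 0, max: 1, default: 1 },
};

function normalizeParams(input = {}) {
  const out = {};
  for (const [name, spec] of Object.entries(PARAM_SPEC)) {
    const value = input[name] ?? spec.default;
    out[name] = Math.min(spec.max, Math.max(spec.min, value));
  }
  return out;
}
```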
The system provides a complete solution for prompt lifecycle management, from creation and versioning to testing and deployment across multiple LLM providers.