
IntelliLLM Playground

A modern, extensible playground for prompt engineering and testing against OpenRouter-hosted LLMs.
Beautiful UI, real-time web search augmentation, and seamless containerized deployment.


🚀 What is IntelliLLM Playground?

IntelliLLM Playground is a powerful, modern tool for prompt engineers, data scientists, and developers to create, test, and refine prompts for large language models. With a beautiful UI and rich feature set, it helps you get the most out of your LLM interactions.

[Screenshot: IntelliLLM Playground UI]

✨ Key Features

  • 🖥️ Modern UI: Multi-tab prompt editing, collapsible system prompt editor, and theme support
  • 🤖 LLM Integration: Supports all OpenRouter models with dynamic per-prompt configuration
  • 📂 Prompt Management: Import/export prompts, folder organization, and GitHub Gist integration
  • 🔄 Prompt Parametrization: Create template prompts with {{ParameterName}} syntax (see the example prompt file after this list)
  • 🪄 Prompt Augmentation: Use AI to automatically enhance your prompts, making them more detailed and effective
  • 🧠 Web Search Augmentation: Real-time web search for all models via DuckDuckGo for up-to-date information
  • 📎 File Attachments: Upload images and documents to include with your prompts
  • 🐳 Dockerized: Easy to build, run, and deploy anywhere
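
Since prompts are stored as Markdown files with YAML frontmatter, a parametrized prompt might look like the example below. This is an illustrative sketch only: the frontmatter fields shown (title, model, temperature) are assumptions about the per-prompt configuration, not the project's actual schema.

# prompts/summarise-article.md (illustrative example)
---
title: Summarise Article
model: openai/gpt-4o
temperature: 0.7
---
Summarise the following article in a {{Style}} tone, using no more than {{WordLimit}} words:

{{ArticleText}}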

See all features →

⚡ Quickstart (Docker Recommended)

The fastest and most reliable way to use IntelliLLM Playground is via Docker.

# 1. Clone the repo
$ git clone https://github.com/rawveg/intellillm-playground.git
$ cd intellillm-playground

# 2. Build the Docker image
$ docker build -t intellillm-playground .

# 3. Run the container (serves on port 3000)
$ docker run -p 3000:3000 -v /path/to/your/prompts:/app/prompts intellillm-playground

Tip: Mount your prompt directory (-v /path/to/your/prompts:/app/prompts) for persistent prompt storage.

Open http://localhost:3000 in your browser.
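
If you prefer Docker Compose, the docker run command above translates roughly to the following docker-compose.yml (a minimal sketch; the service name is arbitrary, and you should point the volume at your own prompts directory):

# docker-compose.yml
services:
  intellillm-playground:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - /path/to/your/prompts:/app/prompts

# Start it with:
$ docker compose up --build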

Alternative setup options →

🏗️ Architecture Overview

Layer      | Technology / Notes
-----------|------------------------------------------------------
Frontend   | Next.js, React, TailwindCSS, Radix UI, Monaco Editor
Backend    | Next.js API routes (TypeScript)
LLM Access | OpenRouter API
Web Search | DuckDuckGo search (@pikisoft/duckduckgo-search)
Prompts    | Markdown files with YAML frontmatter (/prompts)
Container  | Docker (multi-stage build)
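
To illustrate the LLM Access layer, a Next.js API route that proxies a prompt to OpenRouter might look roughly like the sketch below. This is not the project's actual code: the route path, request shape, and the OPENROUTER_API_KEY variable name are assumptions; only the public OpenRouter chat-completions endpoint is taken as given.

// app/api/chat/route.ts -- hypothetical route, for illustration only
export async function POST(req: Request): Promise<Response> {
  const { model, prompt, systemPrompt } = await req.json();

  // Forward the prompt to OpenRouter's chat completions endpoint.
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`, // assumed env var name
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model, // any OpenRouter model id, e.g. "openai/gpt-4o"
      messages: [
        ...(systemPrompt ? [{ role: "system", content: systemPrompt }] : []),
        { role: "user", content: prompt },
      ],
    }),
  });

  const data = await res.json();
  // Return only the assistant's reply text to the caller.
  return Response.json({ reply: data.choices?.[0]?.message?.content ?? "" });
}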

📚 Documentation

📜 License

This project is licensed under the GNU Affero General Public License v3.0. See the LICENSE file for details.

🙏 Acknowledgements

Made with ❤️ for prompt engineers and LLM enthusiasts.