A serverless AI agent built with Cloudflare Workers, KV storage, and LLaMA 3 inference. Offers endpoints for health, Q&A, webhooks, and external API routing. Uses KV for lightweight memory and state. Deployed automatically via GitHub Actions. Minimal, fast, and production-grade edge deployment. Ideal for building AI-enabled microservices at the edge.
A modular Cloudflare Worker application demonstrating how to build fast, scalable edge agents with AI, storage, and automation built-in.
This lab project includes:
- ✅ Health endpoint for monitoring
- 🤖 AI Q&A using Cloudflare’s LLaMA 3 inference
- 📬 Webhook listener for external integrations
- 🌐 External API proxy for mashups and edge routing
- 💾 KV data store for memory/stateful actions
- ⚙️ CI/CD via GitHub Actions
URL: https://cloudflare-agent-lab.harelabs.workers.dev
| Endpoint | Method | Purpose |
|---|---|---|
| / | GET | Health check |
| /ask?q=question | GET | AI response using Cloudflare AI Inference |
| /webhook | POST | Accepts JSON payloads for processing |
| /external | GET | Returns API results from a public endpoint |
| /kv?store=value | GET | Stores a string in KV |
| /kv?get | GET | Retrieves the last stored string from KV |
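All of this routing fits in a single module Worker. The sketch below shows how the /, /ask, and /kv endpoints might be wired up; the binding names (AI, AGENT_KV), the KV key, and the exact model ID are illustrative assumptions rather than the project's actual source, and the types come from @cloudflare/workers-types. The /webhook route is sketched separately under the usage examples.

```ts
// src/index.ts — a minimal sketch, not the project's actual code.
export interface Env {
  AI: Ai;                  // Workers AI binding
  AGENT_KV: KVNamespace;   // Workers KV namespace
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    switch (url.pathname) {
      case "/":
        // Health check
        return new Response("ok");

      case "/ask": {
        const q = url.searchParams.get("q") ?? "Hello";
        // Run LLaMA 3 through the Workers AI binding (model ID is illustrative).
        const answer = await env.AI.run("@cf/meta/llama-3-8b-instruct", {
          messages: [{ role: "user", content: q }],
        });
        return Response.json(answer);
      }

      case "/kv": {
        const value = url.searchParams.get("store");
        if (value !== null) {
          await env.AGENT_KV.put("last", value);      // /kv?store=value
          return new Response("stored");
        }
        const last = await env.AGENT_KV.get("last");  // /kv?get
        return new Response(last ?? "empty");
      }

      default:
        return new Response("Not found", { status: 404 });
    }
  },
};
```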
- Cloudflare Workers
- Cloudflare AI Inference
- Workers KV Storage
- GitHub Actions CI/CD
- TypeScript
npm install -g wrangler
wrangler login
wrangler dev
wrangler deploy
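For the AI and KV bindings to be available at runtime, they have to be declared in wrangler.toml. A possible configuration, assuming the binding names used in the sketch above (the project name, entry path, compatibility date, and namespace ID are placeholders):

```toml
name = "cloudflare-agent-lab"
main = "src/index.ts"
compatibility_date = "2024-06-01"

# Workers AI binding, exposed as env.AI
[ai]
binding = "AI"

# Workers KV namespace, exposed as env.AGENT_KV
[[kv_namespaces]]
binding = "AGENT_KV"
id = "<your-kv-namespace-id>"
```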
Automatically deploys on every push to main.
Create a Cloudflare API token with the following permissions:
- Workers Scripts: Edit
- Workers KV Storage: Edit
Add the token to GitHub:
Settings → Secrets → Actions → New Repository Secret
Name: CLOUDFLARE_API_TOKEN
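With the secret in place, a deploy workflow along these lines would publish the Worker on every push to main; the file name and action versions here are assumptions, so adjust them to match the repository's actual workflow:

```yaml
# .github/workflows/deploy.yml
name: Deploy Worker
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: cloudflare/wrangler-action@v3
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
```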
curl "https://cloudflare-agent-lab.harelabs.workers.dev/ask?q=what+is+cloudflare%3F"
curl -X POST https://cloudflare-agent-lab.harelabs.workers.dev/webhook \
-H "Content-Type: application/json" \
-d '{"event":"test","source":"github"}'
MIT License © 2025 [HareLabs]