# 🧠 Antigravity Brain - Enterprise AI Crew System
A modular **CrewAI** application with shared memory, 26 specialized AI agents, and a web interface powered by **Chainlit**.
![Python](https://img.shields.io/badge/Python-3.11+-blue)
![CrewAI](https://img.shields.io/badge/CrewAI-0.80+-green)
![Docker](https://img.shields.io/badge/Docker-Ready-blue)
![License](https://img.shields.io/badge/License-Private-red)
## 🚀 Features
- **26 Specialized AI Agents** - From infrastructure (Arthur Mendes, Gus Fring) to sales (Ari Gold, Don Draper) to crisis management (Olivia Pope, Saul Goodman)
- **Shared Memory** - Agents share knowledge via Mem0 + Qdrant vector database
- **Smart Routing** - Automatic request classification to the appropriate crew
- **Multi-Provider LLM Support** - Works with Gemini, OpenAI, Anthropic, or local Ollama
- **Web Interface** - Beautiful chat UI powered by Chainlit
- **Docker Ready** - One-command deployment with docker-compose
## 📁 Project Structure
```
minions-da-itguys/
├── src/
│   ├── app.py              # Chainlit entry point
│   ├── config.py           # LLM & Memory configuration
│   ├── router.py           # Smart request routing
│   ├── agents/
│   │   ├── factory.py      # Agent instantiation
│   │   └── personas/       # 26 agent personality files (.md)
│   ├── crews/
│   │   └── definitions.py  # Crew assembly logic
│   ├── knowledge/
│   │   └── standards/      # Corporate knowledge base
│   ├── memory/
│   │   └── wrapper.py      # Mem0 integration with rate limiting
│   └── tools/              # Custom tools (Zabbix, Evolution, etc.)
├── docker-compose.yml      # Container orchestration
├── Dockerfile              # App container
├── requirements.txt        # Python dependencies
└── .env                    # API keys & configuration
```
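The `memory/wrapper.py` module throttles Mem0 calls. As a rough illustration of the idea (the class and method names here are hypothetical, not the actual implementation), a sliding-window rate limiter might look like:

```python
import time
from collections import deque


class RateLimiter:
    """Allow at most `max_calls` within a sliding window of `period` seconds.

    Illustrative sketch of the throttling applied around shared-memory
    writes; the real wrapper.py may use a different strategy.
    """

    def __init__(self, max_calls: int = 10, period: float = 60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls: deque[float] = deque()

    def acquire(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window.
        while self.calls and now - self.calls[0] > self.period:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Wait until the oldest call in the window expires.
            time.sleep(self.period - (now - self.calls[0]))
            self.calls.popleft()
        self.calls.append(time.monotonic())
```

Each agent call that touches shared memory would invoke `acquire()` first, so bursts of agent activity cannot exhaust the Mem0/Qdrant backend's quota.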
## 🛠️ Installation
### Prerequisites
- Docker & Docker Compose
- Python 3.11+ (for local development)
- Gemini/OpenAI API Key
### Quick Start (Docker)
```bash
# 1. Clone the repository
git clone https://github.com/your-org/minions-da-itguys.git
cd minions-da-itguys

# 2. Configure environment
cp .env.example .env
# Edit .env with your API keys

# 3. Start the application
docker-compose up -d

# 4. Access the web interface
open http://localhost:8000
```
### Local Development
```bash
# Install dependencies
pip install -r requirements.txt
# Run Chainlit
chainlit run src/app.py --port 8000
```
## ⚙️ Configuration
Edit `.env` to configure the AI backend:
```env
# LLM Provider: gemini, openai, anthropic, ollama
LLM_PROVIDER=gemini
LLM_MODEL_FAST=gemini-2.5-flash-lite-preview-06-17
LLM_MODEL_SMART=gemini-2.5-flash-lite-preview-06-17
GEMINI_API_KEY=your-api-key
# Memory: qdrant (local) or mem0 (cloud)
MEMORY_PROVIDER=qdrant
MEMORY_EMBEDDING_PROVIDER=local
```
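Since the app supports multiple providers through LiteLLM, `config.py` presumably maps these variables to a provider-prefixed model id. A minimal sketch of that mapping (function and dictionary names are illustrative assumptions, not the actual code):

```python
import os

# Hypothetical provider-to-prefix mapping for LiteLLM-style model ids.
PROVIDER_PREFIXES = {
    "gemini": "gemini/",
    "openai": "openai/",
    "anthropic": "anthropic/",
    "ollama": "ollama/",
}


def resolve_model(tier: str = "FAST") -> str:
    """Build a model id like 'gemini/gemini-2.5-flash-lite-preview-06-17'
    from LLM_PROVIDER and LLM_MODEL_<tier> environment variables."""
    provider = os.getenv("LLM_PROVIDER", "gemini")
    model = os.getenv(f"LLM_MODEL_{tier}", "gemini-2.5-flash-lite-preview-06-17")
    return PROVIDER_PREFIXES.get(provider, "") + model
```

With the `.env` above, `resolve_model("SMART")` would yield `gemini/gemini-2.5-flash-lite-preview-06-17`; switching `LLM_PROVIDER` changes only the prefix.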
## 🤖 Available Crews
| Crew | Agents | Purpose |
|------|--------|---------|
| **Infra Engineering** | Arthur Mendes, Gus Fring | Zabbix templates, monitoring |
| **Security Audit** | Elliot Alderson, Devil | Vulnerability assessment |
| **HR & Evolution** | The Architect, Sherlock | Create agents, learn policies |
| **Sales Growth** | Ari Gold, Chris Gardner, Don Draper | Pipeline management |
| **Business Strategy** | Harvey Specter, Kevin O'Leary | Compliance, ROI analysis |
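Each agent in the table is instantiated by `agents/factory.py` from a persona file in `agents/personas/`. As a hedged sketch of how such a loader could work (the file-format convention and function name here are assumptions):

```python
from pathlib import Path


def load_persona(name: str, base: Path = Path("src/agents/personas")) -> dict:
    """Read a persona .md file: assume the first line is the agent's role
    heading and the remainder is the backstory. Illustrative only."""
    text = (base / f"{name}.md").read_text(encoding="utf-8")
    role, _, backstory = text.partition("\n")
    return {
        "role": role.lstrip("# ").strip(),
        "backstory": backstory.strip(),
    }
```

The resulting dict maps naturally onto CrewAI's `Agent(role=..., backstory=...)` constructor, which is presumably what the factory does after parsing.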
## 🧪 Usage Examples
```
User: "Validate this Zabbix template"
→ Routes to: Infra Engineering (Zabbix)
→ Arthur validates YAML, fixes UUIDs, Gus reviews
User: "Create a new agent named Bob for DevOps"
→ Routes to: HR & Evolution
→ The Architect spawns new persona file
User: "Analyze security of our login page"
→ Routes to: Security Audit
→ Elliot performs reconnaissance
```
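The routing shown above can be sketched as a simple keyword classifier. The real `router.py` may well use an LLM to classify requests; this is only a minimal illustration, and the crew names and keyword lists are assumptions:

```python
# Hypothetical keyword-based routing table; the production router may
# use an LLM classifier instead.
CREW_KEYWORDS = {
    "infra_engineering": ("zabbix", "template", "monitoring"),
    "security_audit": ("security", "vulnerability", "pentest"),
    "hr_evolution": ("create a new agent", "persona", "policy"),
    "sales_growth": ("pipeline", "lead", "deal"),
    "business_strategy": ("compliance", "roi", "contract"),
}


def route(request: str) -> str:
    """Return the first crew whose keywords match, with a default fallback."""
    text = request.lower()
    for crew, keywords in CREW_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return crew
    return "business_strategy"  # fallback crew for unmatched requests
```

A keyword table is cheap and deterministic, but an LLM classifier handles paraphrased requests better, which is likely why the routing is described as "smart."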
## 📦 Dependencies
- `crewai` - Multi-agent orchestration
- `chainlit` - Web UI
- `mem0ai` - Shared memory
- `qdrant-client` - Vector database
- `litellm` - Multi-provider LLM support
- `sentence-transformers` - Local embeddings
## 🔒 Security Notes
- Never commit `.env` with real API keys
- The `.env.example` contains safe placeholder values
- Memory is persisted in Docker volume `qdrant_data`
## 📝 License
Private - ITGuys Internal Use Only
---
Built with ❤️ by ITGuys