Create beautiful websites with AI through a simple chat interface
Hugo AI Studio is a modern web application that lets users create complete websites by simply describing what they want in natural language. The AI analyzes the description and generates a fully functional Hugo website with custom content.
- Simple Chat Interface - Just describe what website you want
- AI-Powered Creation - Uses Ollama LLM to understand and create
- Instant Preview - See your website immediately
- Download Ready - Get complete website files as a ZIP
- Database Storage - All websites stored persistently
- Docker Containerized - Easy deployment and scaling
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│    React Chat    │     │   FastAPI + AI   │     │    Ollama LLM    │
│    Port: 3001    │────►│    Port: 8000    │────►│   Port: 11434    │
└──────────────────┘     └──────────────────┘     └──────────────────┘
         │                        │                        │
         └────────────────────────┼────────────────────────┘
                                  ▼
                   ┌──────────────────┐     ┌──────────────────┐
                   │    SQLite DB     │     │      Nginx       │
                   │   (Persistent)   │     │    Port: 8080    │
                   └──────────────────┘     └──────────────────┘
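In rough terms, the frontend posts the user's description to the FastAPI backend, the backend prompts the Ollama LLM for content, writes a Hugo site into generated-sites/, records it in SQLite, and Nginx serves the result. The snippet below is a minimal sketch of the backend-to-Ollama step only, using Ollama's standard /api/generate endpoint; the actual logic lives in backend/main.py and may differ.

```python
# Minimal sketch of the backend -> Ollama step (illustrative only;
# the real implementation is in backend/main.py and may differ).
import requests

OLLAMA_URL = "http://ollama:11434"  # see the OLLAMA_URL environment variable


def generate_site_content(description: str) -> str:
    """Ask the LLM to draft Hugo front matter and Markdown for the site."""
    prompt = (
        "You are generating content for a Hugo static site.\n"
        f"Website description: {description}\n"
        "Return Markdown pages with Hugo front matter."
    )
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": "llama3.2", "prompt": prompt, "stream": False},
        timeout=120,  # content generation can take a minute or more
    )
    resp.raise_for_status()
    return resp.json()["response"]  # Ollama returns the completion under "response"
```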
- Docker and Docker Compose
- 4GB+ RAM (for Ollama LLM)
- Internet connection (for initial setup)
git clone https://github.com/your-username/hugo-ai-studio.git
cd hugo-ai-studio
docker-compose -f compose.yml up -d
docker exec hugo-ai-studio-ollama-1 ollama pull llama3.2
- Chat Interface: http://localhost:3001
- API Documentation: http://localhost:8000/docs
- Generated Sites: http://localhost:8080/sites/{site-id}/
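Once the containers are up, a quick reachability check can save debugging time. A minimal sketch, assuming the default ports listed above and checking only HTTP status codes:

```python
# Quick reachability check for the local services (assumes default ports).
import requests

checks = {
    "backend health": "http://localhost:8000/health",
    "ollama": "http://localhost:11434/api/tags",  # lists locally available models
    "nginx": "http://localhost:8080/",
}

for name, url in checks.items():
    try:
        status = requests.get(url, timeout=5).status_code
        print(f"{name}: HTTP {status}")
    except requests.RequestException as exc:
        print(f"{name}: unreachable ({exc})")
```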
- Open the chat interface at http://localhost:3001
- Describe your website in natural language:
- "Create a tech blog about artificial intelligence"
- "Build a portfolio website for a photographer"
- "Make a business website for a coffee shop"
- Wait for AI to create your website (30-60 seconds)
- Preview your site using the provided link
- Download ZIP file with all website files
hugo-ai-studio/
├── frontend/              # React Chat Interface
│   ├── src/
│   │   ├── App.js         # Main chat component
│   │   └── index.js       # React entry point
│   ├── public/
│   │   └── index.html     # HTML template
│   ├── package.json       # Dependencies
│   └── Dockerfile         # Frontend container
├── backend/               # FastAPI Backend
│   ├── main.py            # API endpoints & AI logic
│   ├── requirements.txt   # Python dependencies
│   └── Dockerfile         # Backend container
├── compose.yml            # Docker Compose configuration
├── nginx.conf             # Nginx configuration
├── data/                  # SQLite database storage
├── generated-sites/       # Generated website files
└── ollama-data/           # LLM model storage
- REACT_APP_API_URL: Backend API URL (default: http://localhost:8000)
- OLLAMA_URL: Ollama service URL (default: http://ollama:11434)
- DATABASE_PATH: SQLite database path (default: /app/data/sites.db)
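For reference, a backend would typically read these variables with the documented defaults as fallbacks. A minimal sketch, using the variable names and defaults listed above:

```python
# Sketch of reading the documented environment variables with their defaults.
import os

OLLAMA_URL = os.environ.get("OLLAMA_URL", "http://ollama:11434")
DATABASE_PATH = os.environ.get("DATABASE_PATH", "/app/data/sites.db")
# REACT_APP_API_URL is consumed by the React build, not the Python backend.
API_URL = os.environ.get("REACT_APP_API_URL", "http://localhost:8000")
```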
- 3001: React Frontend (Chat Interface)
- 8000: FastAPI Backend (API)
- 8080: Nginx (Generated Sites)
- 11434: Ollama (LLM Service)
# Frontend
cd frontend
npm install
npm start
# Backend
cd backend
pip install -r requirements.txt
uvicorn main:app --reload
# Ollama (separate terminal)
ollama serve
ollama pull llama3.2
- GET /health - Health check
- POST /api/create-website - Create website from description
- GET /api/download/{site_id} - Download website ZIP
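As an alternative to the chat UI, the API can be called directly. The sketch below assumes that POST /api/create-website accepts a JSON body with a description field and returns a JSON object containing a site_id; check http://localhost:8000/docs for the actual request and response schema.

```python
# Create a website via the API and download the ZIP.
# Assumes a JSON body with a "description" field and a "site_id" in the
# response; verify the real schema at http://localhost:8000/docs.
import requests

API_URL = "http://localhost:8000"

resp = requests.post(
    f"{API_URL}/api/create-website",
    json={"description": "Create a tech blog about artificial intelligence"},
    timeout=300,  # generation can take 30-60 seconds or more
)
resp.raise_for_status()
site_id = resp.json()["site_id"]
print(f"Preview: http://localhost:8080/sites/{site_id}/")

# Download the generated site as a ZIP archive.
zip_resp = requests.get(f"{API_URL}/api/download/{site_id}", timeout=60)
zip_resp.raise_for_status()
with open(f"{site_id}.zip", "wb") as fh:
    fh.write(zip_resp.content)
```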
"Create a tech blog about artificial intelligence and machine learning with posts about recent developments"
"Build a portfolio website for a graphic designer showcasing creative work and client testimonials"
"Make a professional website for a local coffee shop with menu, location, and contact information"
# Check service status
docker-compose -f compose.yml ps
# View logs
docker-compose -f compose.yml logs frontend
docker-compose -f compose.yml logs backend
# Verify model is downloaded
docker exec hugo-ai-studio-ollama-1 ollama list
# Re-download model
docker exec hugo-ai-studio-ollama-1 ollama pull llama3.2
# Stop all services
docker-compose -f compose.yml down
# Check port usage
netstat -tulpn | grep :3001
- CPU: 2+ cores recommended
- RAM: 4GB minimum, 8GB recommended
- Storage: 10GB for models and generated sites
- Network: Internet connection for initial setup
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Hugo - Static site generator
- Ollama - Local LLM inference
- React - Frontend framework
- FastAPI - Backend framework
- Docker - Containerization
Made with ❤️ for easy website creation