From e78a41f11c75098698654684d7b92f1d39b13249 Mon Sep 17 00:00:00 2001 From: "IGNY8 VPS (Salman)" Date: Mon, 23 Mar 2026 10:30:51 +0000 Subject: [PATCH] v2-exece-docs --- .../00-MASTER-EXECUTION-PLAN.md | 124 +- .../00A-github-repo-consolidation.md | 19 +- v2/V2-Execution-Docs/00B-vps-provisioning.md | 26 +- .../00C-igny8-production-migration.md | 137 +- .../00D-staging-environment.md | 437 +++--- v2/V2-Execution-Docs/00E-legacy-cleanup.md | 57 +- .../00F-self-hosted-ai-infra.md | 61 +- .../01A-sag-data-foundation.md | 121 +- .../01B-sector-attribute-templates.md | 152 +- .../01C-cluster-formation-keyword-engine.md | 49 +- .../01D-setup-wizard-case2-new-site.md | 53 +- .../01E-blueprint-aware-pipeline.md | 291 ++-- .../01E-blueprint-aware-pipeline.md.bak | 1265 +++++++++++++++++ .../01F-existing-site-analysis-case1.md | 105 +- .../01G-sag-health-monitoring.md | 28 +- 15 files changed, 2218 insertions(+), 707 deletions(-) create mode 100644 v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md.bak diff --git a/v2/V2-Execution-Docs/00-MASTER-EXECUTION-PLAN.md b/v2/V2-Execution-Docs/00-MASTER-EXECUTION-PLAN.md index 084416c1..804e53cb 100644 --- a/v2/V2-Execution-Docs/00-MASTER-EXECUTION-PLAN.md +++ b/v2/V2-Execution-Docs/00-MASTER-EXECUTION-PLAN.md @@ -1,9 +1,10 @@ # IGNY8 V2 — Master Execution Plan -**Version:** 1.0 | March 23, 2026 +**Version:** 1.1 | March 23, 2026 **Status:** Active — Execution Reference **Author:** Salman (Alorig Systems) + Claude Opus **Execution Tool:** Claude Code (SSH to VPS) +**Source of Truth:** Codebase at `/data/app/igny8/` — all technical claims verified against actual code --- @@ -11,10 +12,12 @@ This is the single master document governing the complete IGNY8 V2 build — from infrastructure migration through SAG engine, all modules, WordPress ecosystem, business layer, and multi-app deployment. Every sub-phase references a dedicated build doc in this folder that Claude Code can pick up and execute independently. -## 2. 
Current State (Confirmed March 23, 2026) +## 2. Current State (Verified Against Codebase — March 23, 2026) **IGNY8 v1.8.4** is healthy and functionally production-ready. +### 2.1 Functional Status + | Area | Status | |------|--------| | Settings save (content, publishing, profile) | ✅ Working | @@ -27,26 +30,107 @@ This is the single master document governing the complete IGNY8 V2 build — fro | `/writer/tasks/{id}/brief/` | No current use case — v2 scope | | Taxonomy sync, Linker/Optimizer, webhooks | Correctly scoped as v2 features, not bugs | +### 2.2 Verified Codebase Baseline + +| Component | Verified Value | +|-----------|---------------| +| Django | >=5.2.7 (requirements.txt) | +| Python | 3.11-slim (Dockerfile) | +| Node | 18-alpine (Dockerfile.dev) | +| React | ^19.0.0 | +| TypeScript | ~5.7.2 | +| Vite | ^6.1.0 | +| Zustand | ^5.0.8 | +| Tailwind CSS | ^4.0.8 | +| Celery | >=5.3.0 | +| WP Plugin | IGNY8 WordPress Bridge v1.5.2 | +| Primary Key Strategy | BigAutoField (integer, NOT UUID) | +| AUTH_USER_MODEL | igny8_core_auth.User | +| DEFAULT_AUTO_FIELD | django.db.models.BigAutoField | +| Installed Apps | 34 Django apps | +| Middleware Stack | 13 middleware classes | +| Celery Beat Tasks | 14 scheduled tasks | +| AI Functions | 7 (auto_cluster, generate_ideas, generate_content, generate_images, generate_image_prompts, optimize_content, generate_site_structure) | + +### 2.3 Container Inventory (docker-compose.app.yml — 7 containers) + +| Container | Image | Host Port | Role | +|-----------|-------|-----------|------| +| igny8_backend | igny8-backend:latest | 8011 | Django + Gunicorn (4 workers, 120s timeout) | +| igny8_frontend | igny8-frontend-dev:latest | 8021 | Vite dev server (port 5173 internal) | +| igny8_marketing_dev | igny8-marketing-dev:latest | 8023 | Marketing site dev server (port 5174 internal) | +| igny8_celery_worker | igny8-backend:latest | — | Celery worker (concurrency=4) | +| igny8_celery_beat | igny8-backend:latest | — | Celery beat 
scheduler | +| igny8_flower | igny8-backend:latest | 5555 | Celery monitoring | + +*Plus shared infra containers (external to app compose): postgres, redis, caddy, portainer, pgadmin, filebrowser* + +### 2.4 Existing Django Apps (34 in INSTALLED_APPS) + +**Business layer:** automation, notifications, optimization, publishing, integration +**Module layer:** planner (keywords/clusters/ideas), writer (tasks/content/images), billing, system, linker (inactive), optimizer (inactive), publisher, integration +**Auth & core:** auth (Account, Site, Sector, User, Plan), ai, plugins, admin + +### 2.5 Existing Models (key entities) + +| App | Models | +|-----|--------| +| auth | Account, Plan, Subscription, Site, Sector, Industry, IndustrySector, SeedKeyword, User, SiteUserAccess | +| planning | Clusters (status: new/mapped), Keywords, ContentIdeas | +| content | Tasks, Content (content_type: post/page/product/taxonomy), ContentTaxonomy, ContentTaxonomyRelation, Images, ImagePrompts | +| automation | DefaultAutomationConfig, AutomationConfig (per-site), AutomationRun | +| integration | SiteIntegration, SyncEvent, PublishingSettings | +| publishing | PublishingRecord, DeploymentRecord | +| billing | CreditTransaction, CreditUsageLog, CreditCostConfig, AccountPaymentMethod, Payment, Invoice | +| system | IntegrationProvider, AIPrompt, IntegrationSettings, AuthorProfile | +| ai | AITaskLog | +| plugins | Plugin, PluginVersion, PluginDownload | +| notifications | Notification | +| optimization | OptimizationTask | + +### 2.6 7-Stage Automation Pipeline + +| Stage | Function | AI | Batch Size | +|-------|----------|-----|-----------| +| 1 | Keywords → Clusters | Yes (auto_cluster) | 50 | +| 2 | Clusters → Ideas | Yes (generate_ideas) | 1 | +| 3 | Ideas → Tasks | No | 20 | +| 4 | Tasks → Content | Yes (generate_content) | 1 | +| 5 | Content → Image Prompts | Yes (generate_image_prompts) | 1 | +| 6 | Image Prompts → Images | Yes (generate_images) | 1 | +| 7 | Auto-approval → Publish | 
No | — |
+
+### 2.7 What Does NOT Exist (common misconceptions from planning docs)
+
+- **No `sag/` app** — no SAGBlueprint, SAGAttribute, SAGCluster, or SectorAttributeTemplate models
+- **No UUID primary keys** — all models use BigAutoField (integer)
+- **No `sag_blueprint` field on Site model**
+- **No `blueprint_context` field on Content or Tasks models**
+- **No `self_hosted_ai` provider** in IntegrationProvider
+- **No `/sag/site-analysis` endpoint** in the WordPress plugin
+- **Content already has** `content_type` (post/page/product/taxonomy) and `content_structure` (article/guide/comparison/review/listicle/landing_page/etc) — these are not new fields
+- **Linker & Optimizer modules** exist in code but are **inactive** (behind feature flags)
+
 **Conclusion:** Phase 0 is pure migration. No bug-fixing sprint needed. Current environment stays untouched — all new work on new server with zero downtime.
 
 ## 3. Architecture Overview
 
-**Current:** Single VPS, dev environment running as production, 14 Docker containers (6 unnecessary), Gitea self-hosted, no staging.
+**Current:** Single VPS running IGNY8 app containers + shared Alorig infrastructure containers. App-level: `docker-compose.app.yml` defines the app containers (backend, frontend, marketing_dev, celery_worker, celery_beat, flower) + shared infra containers (postgres, redis, caddy, portainer, pgadmin, filebrowser). Of these app containers, `marketing_dev` and `flower` are non-essential for production. Gitea self-hosted for git, no staging environment, no GitHub.
 
-**Target:** New Hostinger KVM 4 (4 vCPU, 16GB RAM, 200GB NVMe), shared Alorig infrastructure, production + staging environments, GitHub for all repos, Cloudflare DNS, self-hosted AI on Vast.ai GPU.
+**Target:** New Hostinger KVM 4 (4 vCPU, 16GB RAM, 200GB NVMe) with shared Alorig infrastructure stack (PG, Redis, Caddy, Portainer). IGNY8 app runs 3 core containers (backend, celery_worker, celery_beat) + frontend served via Caddy. 
Same pattern for all other Alorig apps. Production + staging environments, GitHub for all repos, Cloudflare DNS, self-hosted AI on Vast.ai GPU. **IGNY8 v2 Transformation:** From keyword-driven content generator → structure-first SAG-powered site architecture engine. Attributes first, not keywords first. Keywords emerge from attribute intersections across 45 industries, 449 sectors. ## 4. Technology Stack -| Layer | Current (v1.8.4) | V2 Addition | +| Layer | Current (v1.8.4) — Verified | V2 Addition | |-------|-------------------|-------------| -| Backend | Django 5.1, DRF, PostgreSQL, Redis, Celery | SAG models, new module APIs | -| Frontend | React 19, TypeScript, Zustand, Tailwind | Blueprint UI, wizard, dashboards | -| AI (Cloud) | OpenAI GPT/DALL-E, Anthropic Claude, ElevenLabs | — | +| Backend | Django >=5.2.7, DRF, PostgreSQL (external), Redis (external), Celery >=5.3.0, Python 3.11 | SAG models, new module APIs | +| Frontend | React ^19.0.0, TypeScript ~5.7.2, Zustand ^5.0.8, Tailwind ^4.0.8, Vite ^6.1.0, Node 18 | Blueprint UI, wizard, dashboards | +| AI (Cloud) | OpenAI (via IntegrationProvider), Anthropic (via IntegrationSettings), Runware (images), DALL-E (images) | — | | AI (Self-hosted) | — | Qwen3 (text), FLUX/SD (images), Wan 2.1 (video) via Vast.ai | -| WordPress | Bridge plugin v1.3.3 | Plugin v2 (14 modules), Companion Theme, Toolkit | -| Infrastructure | Single VPS, Gitea, no staging | KVM 4 + Vast.ai GPU, GitHub, Cloudflare, prod + staging | +| WordPress | IGNY8 WordPress Bridge v1.5.2 | Plugin v2 (14 modules), Companion Theme, Toolkit | +| Infrastructure | Single VPS, Gitea self-hosted, no staging, Caddy reverse proxy | KVM 4 + Vast.ai GPU, GitHub, Cloudflare, prod + staging | | DevOps | Manual | Claude Code via SSH | ## 5. 
Complete Execution Map @@ -59,8 +143,8 @@ This is the single master document governing the complete IGNY8 V2 build — fro | 0A | `00A-github-repo-consolidation.md` | All repos → 1 GitHub account, linked to Source-Codes/, remove Gitea | — | | 0B | `00B-vps-provisioning.md` | New KVM 4, Cloudflare DNS, shared Docker infra (PG/Redis/Caddy/Portainer) | 0A | | 0C | `00C-igny8-production-migration.md` | pg_dump → new server, Docker Compose, DNS cutover, zero downtime | 0B | -| 0D | `00D-staging-environment.md` | Identical 3-container staging, separate DB + Redis prefix | 0C | -| 0E | `00E-legacy-cleanup.md` | Kill Gitea + 5 containers (frontend-dev, marketing, pgadmin, filebrowser, setup-helper), ~1.5GB freed | 0C | +| 0D | `00D-staging-environment.md` | Staging environment: backend + celery_worker + celery_beat + frontend, separate DB (`igny8_staging_db`) + Redis DB 1 | 0C | +| 0E | `00E-legacy-cleanup.md` | Kill Gitea + non-essential containers (marketing_dev, flower, pgadmin, filebrowser), decommission old VPS | 0C | | 0F | `00F-self-hosted-ai-infra.md` | Vast.ai GPU (2×RTX 3090) + SSH tunnel + LiteLLM + Ollama/Qwen3 + ComfyUI | 0B | ### Phase 1 — SAG Core Engine @@ -168,13 +252,15 @@ After all V2-Execution-Docs are built, the following source locations get archiv ## 8. Key Principles -1. **Nothing working breaks** — nullable fields, feature flags, staging first -2. **SAG is attribute-first** — keywords are output, not input -3. **Same container pattern everywhere** — backend + celery_worker + celery_beat -4. **Current environment never touched** — all new work on new server -5. **All development via Claude Code** — SSH to VPS, timelines compressed vs manual dev -6. **Each doc is self-contained** — Claude Code executes one doc at a time without losing context -7. **Monitor real usage** — upgrade decisions are data-driven, not speculative +1. 
**Codebase is the single source of truth** — every technical claim in execution docs verified against actual code, not planning/reference docs +2. **Nothing working breaks** — nullable fields, feature flags, staging first +3. **SAG is attribute-first** — keywords are output, not input +4. **Same container pattern everywhere** — backend + celery_worker + celery_beat per app, shared infra (PG/Redis/Caddy) across all Alorig apps +5. **Current environment never touched** — all new work on new server +6. **All development via Claude Code** — SSH to VPS, timelines compressed vs manual dev +7. **Each doc is self-contained** — Claude Code executes one doc at a time without losing context +8. **Coexistence with existing models** — new SAG models must define migration path for existing Clusters/Keywords/Content, not ignore them +9. **Monitor real usage** — upgrade decisions are data-driven, not speculative ## 9. Timeline Estimate (Claude Code Execution) diff --git a/v2/V2-Execution-Docs/00A-github-repo-consolidation.md b/v2/V2-Execution-Docs/00A-github-repo-consolidation.md index 1bee4656..0f6aa04c 100644 --- a/v2/V2-Execution-Docs/00A-github-repo-consolidation.md +++ b/v2/V2-Execution-Docs/00A-github-repo-consolidation.md @@ -1,11 +1,12 @@ # IGNY8 Phase 0: GitHub Repository Consolidation ## Document 00A: Complete GitHub Repo Consolidation Strategy -**Document Version:** 1.0 +**Document Version:** 1.1 **Last Updated:** 2026-03-23 **Status:** In Development **Phase:** Phase 0 - Infrastructure Setup **Priority:** High (blocking all other development) +**Source of Truth:** Codebase at `/data/app/igny8/` --- @@ -35,15 +36,17 @@ - `igny8-app` contains Django/React application code - Both typically cloned/mounted in development containers -### 1.2 Current Stack Versions +### 1.2 Current Stack Versions (Verified from codebase) ``` -Backend: Django 5.1 -Frontend: React 19 -Database: PostgreSQL 16 -Cache: Redis 7 -Proxy: Caddy 2 -Task Queue: Celery 5.4 +Backend: Django >=5.2.7 
(requirements.txt), Python 3.11-slim (Dockerfile) +Frontend: React ^19.0.0, TypeScript ~5.7.2, Vite ^6.1.0, Node 18-alpine (Dockerfile.dev) +Database: PostgreSQL (external container, version set by infra stack) +Cache: Redis (external container, version set by infra stack) +Proxy: Caddy 2 (external container) +Task Queue: Celery >=5.3.0 (requirements.txt) +State: Zustand ^5.0.8 +CSS: Tailwind ^4.0.8 Orchestration: Docker Compose External Network: igny8_net ``` diff --git a/v2/V2-Execution-Docs/00B-vps-provisioning.md b/v2/V2-Execution-Docs/00B-vps-provisioning.md index c14a0f47..c571d27d 100644 --- a/v2/V2-Execution-Docs/00B-vps-provisioning.md +++ b/v2/V2-Execution-Docs/00B-vps-provisioning.md @@ -4,6 +4,7 @@ **Date Created:** 2026-03-23 **Phase:** 0 (Infrastructure Setup) **Document ID:** 00B +**Source of Truth:** Codebase at `/data/app/igny8/` --- @@ -80,16 +81,21 @@ This section is the single source of truth for all target versions across the en ### 2.3 Application Stack (reference — installed during 00C/00D) -| Component | Version | Notes | -|-----------|---------|-------| -| Python | 3.14 | For backend Dockerfile | -| Node.js | 24 LTS | For frontend Dockerfile | -| Django | 6.0 | Backend framework | -| Django REST Framework | Latest | API serializers | -| Celery | 5.6 | Task queue | -| Gunicorn | 25 | WSGI application server | -| Vite | 8 | Frontend build tool | -| React | Latest | Frontend library | +**IMPORTANT:** These are the versions the current codebase actually runs. Any version upgrades (e.g., Python 3.14, Django 6.0, Node 24) are separate upgrade tasks that require code changes, dependency testing, and migration work — not just a Dockerfile change. Phase 0 migrates the app as-is. 
+ +| Component | Current (Verified) | Upgrade Target (Separate Task) | Notes | +|-----------|-------------------|-------------------------------|-------| +| Python | 3.11-slim | TBD (3.13+ when deps support it) | Dockerfile: `python:3.11-slim` | +| Node.js | 18-alpine | TBD (20 LTS or 22 LTS) | Dockerfile.dev: `node:18-alpine` | +| Django | >=5.2.7 | TBD (6.0 when stable) | requirements.txt constraint | +| Django REST Framework | Latest (unpinned) | Same | requirements.txt | +| Celery | >=5.3.0 | Same | requirements.txt | +| Gunicorn | Latest (unpinned) | Same | requirements.txt | +| Vite | ^6.1.0 | Same | package.json | +| React | ^19.0.0 | Same | package.json | +| TypeScript | ~5.7.2 | Same | package.json | +| Zustand | ^5.0.8 | Same | package.json | +| Tailwind CSS | ^4.0.8 | Same | package.json | --- diff --git a/v2/V2-Execution-Docs/00C-igny8-production-migration.md b/v2/V2-Execution-Docs/00C-igny8-production-migration.md index 0bf42bf2..eeb1dca6 100644 --- a/v2/V2-Execution-Docs/00C-igny8-production-migration.md +++ b/v2/V2-Execution-Docs/00C-igny8-production-migration.md @@ -2,9 +2,10 @@ **Document ID:** 00C-igny8-production-migration **Phase:** Phase 0: Production Migration -**Version:** 2.0 +**Version:** 2.1 **Date:** 2026-03-23 **Status:** In Progress +**Source of Truth:** Codebase at `/data/app/igny8/` **Related Docs:** - 00A: GitHub Repository Consolidation (completed) @@ -23,18 +24,29 @@ **Project Name:** igny8-app **Compose File:** docker-compose.app.yml -#### Active Containers +#### Active Containers (Verified from docker-compose.app.yml) + +**App containers (7 in docker-compose.app.yml):** + +| Container | Service | Port (Host:Container) | Technology | +|-----------|---------|----------------------|------------| +| igny8_backend | REST API | 8011:8010 | Django >=5.2.7 + Gunicorn (4 workers, 120s timeout) | +| igny8_frontend | Web UI | 8021:5173 | Vite dev server (React ^19, Node 18) | +| igny8_marketing_dev | Marketing site | 8023:5174 | Vite dev 
server | +| igny8_celery_worker | Task worker | — | Celery (concurrency=4) | +| igny8_celery_beat | Task scheduler | — | Celery Beat | +| igny8_flower | Celery monitor | 5555:5555 | Flower | + +**Shared infra containers (external to app compose, on igny8_net):** + | Container | Service | Port (Internal) | Technology | |-----------|---------|----------------|------------| -| igny8_backend | REST API | 8010 | Django 4.2 + Gunicorn (4 workers, 120s timeout) | -| igny8_frontend | Web UI | 5173 → 8021 | Vite dev server | -| igny8_celery_worker | Task worker | N/A | Celery | -| igny8_celery_beat | Task scheduler | N/A | Celery Beat | -| igny8_postgres | Database | 5432 | PostgreSQL 16 | -| igny8_redis | Cache/Broker | 6379 | Redis 7 (DB 0) | +| postgres | Database | 5432 | PostgreSQL (version set by infra stack) | +| redis | Cache/Broker | 6379 | Redis (DB 0 for production) | | caddy | Reverse proxy/SSL | 80, 443 | Caddy 2 | -| marketing | Render service | 8023 | Custom service | -| sites | Render service | 8024 | Custom service | +| portainer | Docker management | 9000 | Portainer CE | +| pgadmin | DB admin | 5050 | PgAdmin 4 | +| filebrowser | File management | 8080 | FileBrowser | #### Database - **Database Name:** igny8_db @@ -61,7 +73,7 @@ - CELERY_BROKER_URL - Django DEBUG, ALLOWED_HOSTS -**Important:** AI integration keys stored in database (GlobalIntegrationSettings table), NOT in env vars. +**Important:** AI integration keys stored in database (`IntegrationProvider` table: `igny8_integration_providers`, and `IntegrationSettings` table: `igny8_integration_settings`), NOT in env vars. 
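The container inventory above lends itself to a quick post-migration spot-check. A minimal sketch, pure stdlib Python: the expected names come from the table above, and feeding it the output of `docker ps --format '{{.Names}}'` is an assumed wiring, not part of the existing tooling:

```python
# Hypothetical spot-check helper: compare the containers that are actually
# running against the app containers listed in the inventory table above.
# Feed `running_names` from: docker ps --format '{{.Names}}'
EXPECTED_APP_CONTAINERS = {
    "igny8_backend",
    "igny8_frontend",
    "igny8_marketing_dev",
    "igny8_celery_worker",
    "igny8_celery_beat",
    "igny8_flower",
}

def missing_containers(running_names):
    """Return expected app containers that are absent from `running_names`."""
    return EXPECTED_APP_CONTAINERS - set(running_names)

# Example with a partial container list (celery_beat and flower not running):
running = ["igny8_backend", "igny8_frontend", "igny8_marketing_dev",
           "igny8_celery_worker", "caddy", "postgres", "redis"]
print(sorted(missing_containers(running)))  # ['igny8_celery_beat', 'igny8_flower']
```

An empty result means the full app set is up; shared infra names (postgres, redis, caddy) are deliberately ignored since they live outside the app compose file.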
#### Networking - **Primary Domain:** app.igny8.com (frontend) @@ -77,9 +89,11 @@ - **Backup Automation:** Cron jobs on old VPS (backup-db.sh, backup-full.sh) #### Health Check -- **Endpoint:** http://localhost:8010/api/v1/system/status/ -- **Expected Response:** 200 OK with system status JSON +- **Endpoint:** http://localhost:8010/api/v1/system/status/ (inside container) or http://localhost:8011/api/v1/system/status/ (from host) +- **Expected Response:** 200 OK with system status JSON (timestamp, system resources, database, Redis, Celery status) +- **Permission:** AllowAny (public endpoint) - **Frequency:** Manual or via monitoring +- **Docker healthcheck:** Configured in compose: 30s interval, 10s timeout, 3 retries --- @@ -167,30 +181,60 @@ This migration is **not a direct cutover**. Instead, we run both VPS in parallel The database schema itself does not change during migration. We use pg_dump and pg_restore to move the entire database from old VPS (PG 16) to new VPS (PG 18). -**Key Tables (not exhaustive):** -- `users` — User accounts -- `projects` — Projects/sites -- `stripe_subscriptions` — Payment records -- `integration_settings` — AI integration keys (GlobalIntegrationSettings) -- `wordpress_sync_logs` — Plugin sync history -- `celery_*` — Celery task tables +**Key Tables (verified from codebase — all use `igny8_` prefix convention):** + +| Table | Purpose | +|-------|---------| +| `igny8_users` | User accounts (AUTH_USER_MODEL) | +| `igny8_tenants` | Multi-tenant accounts | +| `igny8_sites` | Sites within accounts | +| `igny8_subscriptions` | Subscription records | +| `igny8_plans` | Plan definitions | +| `igny8_content` | Content items | +| `igny8_tasks` | Writer tasks | +| `igny8_clusters` | Keyword clusters | +| `igny8_keywords` | Keywords | +| `igny8_content_ideas` | Content ideas | +| `igny8_images` | Generated images | +| `igny8_invoices` | Billing invoices | +| `igny8_payments` | Payment records | +| `igny8_webhook_events` | Stripe/PayPal webhooks 
| +| `igny8_site_integrations` | WordPress site connections | +| `igny8_sync_events` | WordPress sync history | +| `igny8_publishing_records` | Publish records | +| `igny8_ai_task_logs` | AI task audit trail | +| `igny8_automation_configs` | Automation settings | +| `igny8_automation_runs` | Automation run history | +| `plugins` / `plugin_versions` / `plugin_installations` / `plugin_downloads` | Plugin system | +| `igny8_integration_providers` / `igny8_integration_settings` | AI/payment provider keys | + +**Note:** There is NO `stripe_subscriptions` table, NO `wordpress_sync_logs` table, NO `GlobalIntegrationSettings` table. These were errors in earlier doc versions. **Important:** Do NOT manually migrate tables. Use pg_dump/pg_restore with custom format. ### 3.2 Health Check API -**Endpoint:** `GET http://localhost:8010/api/v1/system/status/` -**Expected Response:** +**Endpoint:** `GET /api/v1/system/status/` (AllowAny — no auth required) + +**Actual response format** (verified from `igny8_core/modules/system/views.py:system_status`): ```json { - "status": "ok", - "version": "1.8.4", - "database": "connected", - "redis": "connected", - "celery": "ok" + "timestamp": "2026-03-23T12:00:00.000000+00:00", + "system": { + "cpu": {"usage_percent": 12.5, "cores": 4, "status": "healthy"}, + "memory": {"total_gb": 8.0, "used_gb": 3.2, "available_gb": 4.8, "usage_percent": 40.0, "status": "healthy"}, + "disk": {"total_gb": 100.0, "used_gb": 35.0, "free_gb": 65.0, "usage_percent": 35.0, "status": "healthy"} + }, + "database": {"connected": true, "version": "PostgreSQL 16.x ...", "size": "256 MB", "active_connections": 5, "status": "healthy"}, + "redis": {"connected": true, "status": "healthy", "info": {}}, + "celery": {"workers": ["celery@igny8_celery_worker"], "worker_count": 1, "tasks": {"active": 0, "scheduled": 0, "reserved": 0}, "status": "healthy"}, + "processes": {}, + "modules": {} } ``` +**Healthy indicators:** All `status` fields should be `"healthy"`, 
`database.connected` and `redis.connected` should be `true`, `celery.worker_count` should be ≥ 1. + Use this endpoint to verify both old and new VPS health before/after migration. --- @@ -222,11 +266,11 @@ git checkout main # or appropriate branch cp .env.example .env # Update .env for new VPS: -# - DB_HOST=igny8_postgres (Docker internal hostname) +# - DB_HOST=postgres (Docker service name on igny8_net — infra container, not app container) # - DB_NAME=igny8_db # - DB_USER=igny8 # - DB_PASSWORD= -# - REDIS_HOST=igny8_redis +# - REDIS_URL=redis://redis:6379/0 (Redis is also an infra container on igny8_net) # - SECRET_KEY= # - ALLOWED_HOSTS=test-app.igny8.com,test-api.igny8.com,app.igny8.com,api.igny8.com,igny8.com @@ -282,7 +326,7 @@ PGPASSWORD= pg_restore --format=custom \ /tmp/igny8_db_backup.dump # Verify restore completed -PGPASSWORD= psql --host=localhost --username=igny8 --dbname=igny8_db -c "SELECT COUNT(*) FROM users;" +PGPASSWORD= psql --host=localhost --username=igny8 --dbname=igny8_db -c "SELECT COUNT(*) FROM igny8_users;" # Run ANALYZE on all tables to update statistics PGPASSWORD= psql --host=localhost --username=igny8 --dbname=igny8_db -c "ANALYZE;" @@ -315,16 +359,15 @@ On new VPS: ```bash cd /data/app/igny8 -# Copy docker-compose file (create if needed) -# Ensure it contains: -# - igny8_backend (Django) -# - igny8_frontend (Vite) -# - igny8_celery_worker -# - igny8_celery_beat -# - igny8_postgres (PostgreSQL 18) -# - igny8_redis -# - caddy -# - marketing, sites (if needed) +# docker-compose.app.yml should contain these app containers: +# - igny8_backend (Django + Gunicorn) +# - igny8_frontend (Vite/React) +# - igny8_marketing_dev (Vite marketing site) +# - igny8_celery_worker (Celery) +# - igny8_celery_beat (Celery Beat) +# - igny8_flower (Flower monitor) +# NOTE: postgres, redis, caddy are INFRA containers — NOT in app compose. +# They must already be running on igny8_net (provisioned in 00B). 
# Build and start
docker compose -f docker-compose.app.yml build
docker compose -f docker-compose.app.yml up -d
@@ -355,22 +398,13 @@ curl -I https://test-api.igny8.com
 
 #### Step 1.7: Run Health Checks on Test Subdomains
 
 ```bash
-# Health check via test API subdomain
-curl -H "Host: test-api.igny8.com" http://localhost:8010/api/v1/system/status/
+# Health check via test API subdomain (port 8011 is the host-mapped backend port)
+curl -H "Host: test-api.igny8.com" http://localhost:8011/api/v1/system/status/
 
 # Or if DNS is live
 curl https://test-api.igny8.com/api/v1/system/status/
 ```
 
-**Expected Response:**
-```json
-{
-  "status": "ok",
-  "version": "1.8.4",
-  "database": "connected",
-  "redis": "connected",
-  "celery": "ok"
-}
+**Verify response:** `database.connected` = true, `redis.connected` = true, `celery.worker_count` ≥ 1, all `status` fields = "healthy". See Section 3.2 for full response format.
-```
 
 #### Step 1.8: Manual Testing on Test Subdomains
diff --git a/v2/V2-Execution-Docs/00D-staging-environment.md b/v2/V2-Execution-Docs/00D-staging-environment.md
index 18c9d365..58dc4900 100644
--- a/v2/V2-Execution-Docs/00D-staging-environment.md
+++ b/v2/V2-Execution-Docs/00D-staging-environment.md
@@ -1,9 +1,11 @@
 # IGNY8 Phase 0: Staging Environment Setup (Doc 00D)
 
 **Document Status:** Build Specification
+**Version:** 2.1
 **Date Created:** 2026-03-23
 **Target Phase:** Phase 0 - Infrastructure & Deployment
-**Related Docs:** [00B Infrastructure Setup](00B-infrastructure-setup.md) | [00C Production Migration](00C-production-migration.md) | [00B Version Matrix](00B-infrastructure-setup.md#version-matrix) (SINGLE SOURCE OF TRUTH for all versions)
+**Source of Truth:** Codebase at `/data/app/igny8/`
+**Related Docs:** [00B Infrastructure Setup](00B-infrastructure-setup.md) | [00C Production Migration](00C-production-migration.md)
 
 **Key Details:**
 - Staging runs on the NEW VPS (from 00B Infrastructure Setup)
@@ -19,32 +21,33 @@
 
 **Staging Environment Location:** On the NEW VPS, as provisioned in 00B Infrastructure 
Setup. -**Note on Versions:** For all component versions (PostgreSQL, Redis, Docker, etc.), refer to the **Version Matrix in 00B Infrastructure Setup** as the single source of truth. This document reflects those versions. All staging components use the latest versions matching production on the NEW VPS. +**Note on Versions:** For component versions, the codebase (requirements.txt, package.json, Dockerfiles) is the source of truth. Aspirational upgrade targets are in 00B but current verified versions are: Python 3.11, Django >=5.2.7, Node 18, React ^19, Vite ^6.1.0, Celery >=5.3.0. ### 1.1 Infrastructure Baseline - **Host Server:** Single Linux VM running Docker on NEW VPS (from 00B Infrastructure Setup) -- **Base OS:** Ubuntu 24.04 LTS +- **Base OS:** Ubuntu (version per 00B) - **Shared Resources:** - - PostgreSQL 18 server (port 5432) - - Redis 8 server (port 6379) + - PostgreSQL server (port 5432) — version set by infra stack + - Redis server (port 6379) — version set by infra stack - Docker network: `igny8_net` - - Caddy 2.11 reverse proxy (port 80/443) - - Cloudflare DNS management (may or may not be active - dependent on 00C flow stage) - - Log directory: `/data/app/logs/` + - Caddy reverse proxy (port 80/443) + - Cloudflare DNS management (may or may not be active — dependent on 00C flow stage) + - Log directory: `/data/app/logs/` (production), `/data/logs/staging/` (staging) ### 1.2 Production Environment (Already Complete - Doc 00C) - **Database:** `igny8_db` (PostgreSQL) - **Cache:** Redis DB 0 -- **Compose file:** `docker-compose.yml` -- **Containers:** - - `igny8_backend` (port 8010) - - `igny8_frontend` (port 5173) - - `igny8_marketing_dev` (port 5174) +- **Compose file:** `docker-compose.app.yml` (project name: `igny8-app`) +- **Containers (7):** + - `igny8_backend` (host port 8011, container port 8010) + - `igny8_frontend` (host port 8021, container port 5173) + - `igny8_marketing_dev` (host port 8023, container port 5174) - `igny8_celery_worker` - 
`igny8_celery_beat` -- **Env file:** `.env` (production settings) + - `igny8_flower` (port 5555) +- **Env file:** `.env` (production settings) — NOT used inline; env vars set in compose - **Domains:** igny8.com, api.igny8.com, marketing.igny8.com -- **Logs:** `/data/app/logs/production/` +- **Logs:** `/data/app/logs/` ### 1.3 Staging Environment (To Be Built) **Does not yet exist.** This document defines the complete staging setup. @@ -61,11 +64,11 @@ A complete parallel environment sharing infrastructure with production: │ Docker Containers (Staging) │ ├─────────────────────────────────────────────────────┤ │ │ -│ igny8_staging_backend:8012 → :8010 (Django) │ -│ igny8_staging_frontend:8024 → :5173 (Vue) │ -│ igny8_staging_marketing_dev:8026 → :5174 (Nuxt) │ -│ igny8_staging_celery_worker │ -│ igny8_staging_celery_beat │ +│ igny8_staging_backend:8012 → :8010 (Django/Gunicorn) │ +│ igny8_staging_frontend:8024 → :5173 (React/Vite) │ +│ igny8_staging_marketing_dev:8026 → :5174 (Vite) │ +│ igny8_staging_celery_worker │ +│ igny8_staging_celery_beat │ │ │ └─────────────────────────────────────────────────────┘ ↓ @@ -157,18 +160,20 @@ A complete parallel environment sharing infrastructure with production: ## 4. Implementation Steps -**Version Requirements:** All versions referenced below are from the **00B Version Matrix (source of truth for all versions)**. The staging environment uses identical versions to production on the NEW VPS: -- PostgreSQL 18 (postgres:18-alpine) -- Redis 8 (redis:8-alpine) -- Caddy 2.11 (caddy:2-alpine) -- Ubuntu 24.04 LTS (base OS) -- Docker Engine 29.x -- Python 3.14 (in backend container) -- Node 24 LTS (in frontend and marketing containers) -- Django 6.0 -- Vite 8 -- Gunicorn 25 -- Celery 5.6 +**Version Requirements:** All versions below are verified from the codebase. 
The staging environment uses identical versions to production: +- PostgreSQL (version set by infra stack) +- Redis (version set by infra stack) +- Caddy 2 (version set by infra stack) +- Ubuntu base OS (set by infra stack) +- Docker Engine (installed on VPS) +- Python 3.11-slim (in backend Dockerfile) +- Node 18-alpine (in frontend Dockerfile) +- Django >=5.2.7 (requirements.txt) +- Vite ^6.1.0 (package.json) +- Gunicorn (requirements.txt) +- Celery >=5.3.0 (requirements.txt) + +**Note:** The actual `docker-compose.staging.yml` already exists in the repo and is the source of truth. The compose excerpt below is for reference only — always use the actual file. ### Step 1: Create Staging PostgreSQL Database @@ -196,15 +201,15 @@ GRANT ALL PRIVILEGES ON SCHEMA public TO igny8_user; **Execution:** ```bash # On host server -docker exec -i igny8_postgres psql -U postgres -d postgres << 'EOF' +docker exec -i postgres psql -U postgres -d postgres << 'EOF' CREATE DATABASE igny8_staging_db - WITH OWNER igny8_user + WITH OWNER igny8 ENCODING 'UTF8' LOCALE 'en_US.UTF-8' TEMPLATE template0; -GRANT ALL PRIVILEGES ON DATABASE igny8_staging_db TO igny8_user; -GRANT ALL PRIVILEGES ON SCHEMA public TO igny8_user; +GRANT ALL PRIVILEGES ON DATABASE igny8_staging_db TO igny8; +GRANT ALL PRIVILEGES ON SCHEMA public TO igny8; EOF echo "Staging database created" @@ -218,161 +223,141 @@ echo "Staging database created" **Project Name:** `igny8-staging` ```yaml -version: '3.8' +# Actual file: docker-compose.staging.yml (already exists in repo) +# Key differences from this reference: the actual file uses env_file: .env.staging +# and individual env vars (DB_HOST, DB_NAME) rather than DATABASE_URL format. 
+ +name: igny8-staging services: # Backend API Service igny8_staging_backend: image: igny8-backend:staging container_name: igny8_staging_backend - environment: - - DJANGO_ENV=staging - - DEBUG=True - - ALLOWED_HOSTS=staging.igny8.com,staging-api.igny8.com,localhost,127.0.0.1 - - SECRET_KEY=${STAGING_SECRET_KEY} - - DATABASE_URL=postgresql://igny8_user:${DB_PASSWORD}@igny8_postgres:5432/igny8_staging_db - - REDIS_URL=redis://igny8_redis:6379/1 - - CELERY_BROKER_URL=redis://igny8_redis:6379/1 - - CELERY_RESULT_BACKEND=redis://igny8_redis:6379/1 - - CACHE_URL=redis://igny8_redis:6379/1 - - CORS_ALLOWED_ORIGINS=https://staging.igny8.com,https://staging-marketing.igny8.com - - STRIPE_PUBLIC_KEY=${STAGING_STRIPE_PUBLIC_KEY} - - STRIPE_SECRET_KEY=${STAGING_STRIPE_SECRET_KEY} - - STRIPE_WEBHOOK_SECRET=${STAGING_STRIPE_WEBHOOK_SECRET} - - API_BASE_URL=https://staging-api.igny8.com - - FRONTEND_URL=https://staging.igny8.com - - MARKETING_URL=https://staging-marketing.igny8.com - - AWS_ACCESS_KEY_ID=${STAGING_AWS_ACCESS_KEY_ID} - - AWS_SECRET_ACCESS_KEY=${STAGING_AWS_SECRET_ACCESS_KEY} - - AWS_S3_BUCKET=${STAGING_AWS_S3_BUCKET} - - AWS_REGION=${AWS_REGION} - - SENTRY_DSN=${STAGING_SENTRY_DSN} - - LOG_LEVEL=INFO + restart: always + working_dir: /app ports: - - "8012:8010" + - "0.0.0.0:8012:8010" + environment: + DJANGO_ENV: staging + DB_HOST: postgres # External infra container name (NOT igny8_postgres) + DB_NAME: igny8_staging_db + DB_USER: igny8 + DB_PASSWORD: igny8pass + REDIS_HOST: redis # External infra container name (NOT igny8_redis) + REDIS_PORT: "6379" + REDIS_DB: "1" # DB 1 for staging (production uses DB 0) + USE_SECURE_COOKIES: "True" + USE_SECURE_PROXY_HEADER: "True" + DEBUG: "False" volumes: - - ./backend:/app/backend - - /data/app/logs/staging:/var/log/igny8 - networks: - - igny8_net - depends_on: - - igny8_postgres - - igny8_redis - restart: unless-stopped + - /data/app/igny8/backend:/app:rw + - /data/app/igny8:/data/app/igny8:rw + - 
/var/run/docker.sock:/var/run/docker.sock:ro + - /data/logs/staging:/app/logs:rw + env_file: + - .env.staging healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:8010/health/"] + test: ["CMD-SHELL", "python -c \"import urllib.request; urllib.request.urlopen('http://localhost:8010/api/v1/system/status/').read()\" || exit 1"] interval: 30s timeout: 10s retries: 3 start_period: 40s - labels: - - "com.igny8.component=backend" - - "com.igny8.environment=staging" + command: ["gunicorn", "igny8_core.wsgi:application", "--bind", "0.0.0.0:8010", "--workers", "2", "--timeout", "120"] + networks: [igny8_net] - # Frontend Service + # Frontend Service (React + Vite) igny8_staging_frontend: image: igny8-frontend-dev:staging container_name: igny8_staging_frontend - environment: - - NODE_ENV=staging - - VITE_API_URL=https://staging-api.igny8.com - - VITE_ENVIRONMENT=staging + restart: always ports: - - "8024:5173" + - "0.0.0.0:8024:5173" + environment: + VITE_BACKEND_URL: "https://staging-api.igny8.com/api" + VITE_ENV: "staging" volumes: - - ./frontend:/app - - /app/node_modules - networks: - - igny8_net - restart: unless-stopped - labels: - - "com.igny8.component=frontend" - - "com.igny8.environment=staging" + - /data/app/igny8/frontend:/app:rw + depends_on: + igny8_staging_backend: + condition: service_healthy + networks: [igny8_net] - # Marketing Site Service + # Marketing Site Service (Vite, NOT Nuxt — built from frontend/Dockerfile.marketing.dev) igny8_staging_marketing_dev: image: igny8-marketing-dev:staging container_name: igny8_staging_marketing_dev - environment: - - NODE_ENV=staging - - NUXT_PUBLIC_API_URL=https://staging-api.igny8.com - - NUXT_PUBLIC_ENVIRONMENT=staging + restart: always ports: - - "8026:5174" + - "0.0.0.0:8026:5174" + environment: + VITE_BACKEND_URL: "https://staging-api.igny8.com/api" + VITE_ENV: "staging" volumes: - - ./marketing:/app - - /app/.nuxt - - /app/node_modules - networks: - - igny8_net - restart: unless-stopped - labels: - - 
"com.igny8.component=marketing" - - "com.igny8.environment=staging" + - /data/app/igny8/frontend:/app:rw # Same frontend dir — marketing is a Vite build mode + networks: [igny8_net] # Celery Worker igny8_staging_celery_worker: image: igny8-backend:staging container_name: igny8_staging_celery_worker - command: celery -A backend.celery worker --loglevel=info --concurrency=2 + restart: always + working_dir: /app environment: - - DJANGO_ENV=staging - - DEBUG=True - - SECRET_KEY=${STAGING_SECRET_KEY} - - DATABASE_URL=postgresql://igny8_user:${DB_PASSWORD}@igny8_postgres:5432/igny8_staging_db - - REDIS_URL=redis://igny8_redis:6379/1 - - CELERY_BROKER_URL=redis://igny8_redis:6379/1 - - CELERY_RESULT_BACKEND=redis://igny8_redis:6379/1 - - AWS_ACCESS_KEY_ID=${STAGING_AWS_ACCESS_KEY_ID} - - AWS_SECRET_ACCESS_KEY=${STAGING_AWS_SECRET_ACCESS_KEY} - - AWS_S3_BUCKET=${STAGING_AWS_S3_BUCKET} + DJANGO_ENV: staging + DB_HOST: postgres + DB_NAME: igny8_staging_db + DB_USER: igny8 + DB_PASSWORD: igny8pass + REDIS_HOST: redis + REDIS_PORT: "6379" + REDIS_DB: "1" + C_FORCE_ROOT: "true" volumes: - - ./backend:/app/backend - - /data/app/logs/staging:/var/log/igny8 - networks: - - igny8_net + - /data/app/igny8/backend:/app:rw + - /data/logs/staging:/app/logs:rw + env_file: + - .env.staging + command: ["celery", "-A", "igny8_core", "worker", "--loglevel=info", "--concurrency=2"] depends_on: - - igny8_postgres - - igny8_redis - restart: unless-stopped - labels: - - "com.igny8.component=celery-worker" - - "com.igny8.environment=staging" + igny8_staging_backend: + condition: service_healthy + networks: [igny8_net] # Celery Beat (Scheduler) igny8_staging_celery_beat: image: igny8-backend:staging container_name: igny8_staging_celery_beat - command: celery -A backend.celery beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler + restart: always + working_dir: /app environment: - - DJANGO_ENV=staging - - DEBUG=True - - SECRET_KEY=${STAGING_SECRET_KEY} - - 
DATABASE_URL=postgresql://igny8_user:${DB_PASSWORD}@igny8_postgres:5432/igny8_staging_db - - REDIS_URL=redis://igny8_redis:6379/1 - - CELERY_BROKER_URL=redis://igny8_redis:6379/1 - - CELERY_RESULT_BACKEND=redis://igny8_redis:6379/1 + DJANGO_ENV: staging + DB_HOST: postgres + DB_NAME: igny8_staging_db + DB_USER: igny8 + DB_PASSWORD: igny8pass + REDIS_HOST: redis + REDIS_PORT: "6379" + REDIS_DB: "1" + C_FORCE_ROOT: "true" volumes: - - ./backend:/app/backend - - /data/app/logs/staging:/var/log/igny8 - networks: - - igny8_net + - /data/app/igny8/backend:/app:rw + - /data/logs/staging:/app/logs:rw + env_file: + - .env.staging + command: ["celery", "-A", "igny8_core", "beat", "--loglevel=info", "--scheduler", "django_celery_beat.schedulers:DatabaseScheduler"] depends_on: - - igny8_postgres - - igny8_redis - restart: unless-stopped - labels: - - "com.igny8.component=celery-beat" - - "com.igny8.environment=staging" + igny8_staging_backend: + condition: service_healthy + networks: [igny8_net] networks: igny8_net: external: true - -volumes: - # Data volumes referenced from external production infrastructure ``` +> **Note:** The actual `docker-compose.staging.yml` in the repo is the definitive version. The above is aligned with it as of this writing. 
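
A quick way to see why the `REDIS_DB` split matters: staging's broker, result backend, and cache all resolve to logical DB 1, production's to DB 0, on the same Redis container. A minimal sketch of assembling the connection URL from the discrete `REDIS_HOST` / `REDIS_PORT` / `REDIS_DB` variables the compose file sets — the `redis_url` helper is hypothetical, for illustration; the actual Django settings module may assemble this differently:

```python
def redis_url(env: dict) -> str:
    """Build a redis:// URL from the discrete env vars used in
    docker-compose.staging.yml (hypothetical helper, for illustration)."""
    host = env.get("REDIS_HOST", "redis")
    port = env.get("REDIS_PORT", "6379")
    db = env.get("REDIS_DB", "0")  # production default: logical DB 0
    return f"redis://{host}:{port}/{db}"

staging = {"REDIS_HOST": "redis", "REDIS_PORT": "6379", "REDIS_DB": "1"}
production = {"REDIS_HOST": "redis", "REDIS_PORT": "6379", "REDIS_DB": "0"}

# Same host and port, different logical DB — the only isolation boundary,
# which is what makes `redis-cli -n 1 FLUSHDB` safe for staging.
assert redis_url(staging) == "redis://redis:6379/1"
assert redis_url(staging) != redis_url(production)
```

Because all three staging consumers (Celery broker, result backend, cache) share one URL, flushing DB 1 clears staging state without touching production keys in DB 0.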
+ --- ### Step 3: Create `.env.staging` @@ -404,29 +389,27 @@ DB_PASSWORD=your-staging-db-password # DATABASE # ============================================================================== DATABASE_ENGINE=postgresql -DATABASE_HOST=igny8_postgres +DATABASE_HOST=postgres DATABASE_PORT=5432 DATABASE_NAME=igny8_staging_db -DATABASE_USER=igny8_user +DATABASE_USER=igny8 DATABASE_PASSWORD=${DB_PASSWORD} -# Full URL for Django -DATABASE_URL=postgresql://igny8_user:${DB_PASSWORD}@igny8_postgres:5432/igny8_staging_db # ============================================================================== # CACHE & QUEUE (REDIS DB 1 - Separate from Production) # ============================================================================== -REDIS_HOST=igny8_redis +REDIS_HOST=redis REDIS_PORT=6379 REDIS_DB=1 REDIS_PASSWORD= -REDIS_URL=redis://igny8_redis:6379/1 -CACHE_URL=redis://igny8_redis:6379/1 +REDIS_URL=redis://redis:6379/1 +CACHE_URL=redis://redis:6379/1 # ============================================================================== # CELERY (Uses Redis DB 1) # ============================================================================== -CELERY_BROKER_URL=redis://igny8_redis:6379/1 -CELERY_RESULT_BACKEND=redis://igny8_redis:6379/1 +CELERY_BROKER_URL=redis://redis:6379/1 +CELERY_RESULT_BACKEND=redis://redis:6379/1 CELERY_ACCEPT_CONTENT=json CELERY_TASK_SERIALIZER=json @@ -458,20 +441,11 @@ AWS_S3_CUSTOM_DOMAIN=staging-assets.igny8.com # ============================================================================== # EXTERNAL SERVICES (STAGING / SANDBOX CREDENTIALS) # ============================================================================== -# Email -MAILGUN_API_KEY=your-staging-mailgun-key -MAILGUN_DOMAIN=staging-mail.igny8.com +# Email (app uses Resend, not Mailgun) +# RESEND_API_KEY=your-staging-resend-key -# Analytics -MIXPANEL_TOKEN=your-staging-mixpanel-token - -# Error Tracking -STAGING_SENTRY_DSN=https://your-staging-sentry-dsn - -# SMS 
-TWILIO_ACCOUNT_SID=your-staging-twilio-sid -TWILIO_AUTH_TOKEN=your-staging-twilio-token -TWILIO_PHONE_NUMBER=+15551234567 +# Error Tracking (optional) +# SENTRY_DSN=https://your-staging-sentry-dsn # ============================================================================== # SECURITY (STAGING) @@ -490,16 +464,10 @@ DJANGO_SUPERUSER_EMAIL=admin@staging.igny8.com DJANGO_SUPERUSER_PASSWORD=your-staging-admin-password # ============================================================================== -# FRONTEND / VUE +# FRONTEND (React + Vite) # ============================================================================== -VITE_API_URL=https://staging-api.igny8.com -VITE_ENVIRONMENT=staging - -# ============================================================================== -# MARKETING / NUXT -# ============================================================================== -NUXT_PUBLIC_API_URL=https://staging-api.igny8.com -NUXT_PUBLIC_ENVIRONMENT=staging +VITE_BACKEND_URL=https://staging-api.igny8.com/api +VITE_ENV=staging ``` --- @@ -602,28 +570,28 @@ docker exec igny8_caddy caddy reload --config /etc/caddy/Caddyfile **Backend Image** ```bash -cd /path/to/backend +cd /data/app/igny8/backend docker build -f Dockerfile -t igny8-backend:staging . # Verify docker images | grep igny8-backend ``` -**Frontend Image** +**Frontend Image (React/Vite)** ```bash -cd /path/to/frontend +cd /data/app/igny8/frontend docker build -f Dockerfile.dev -t igny8-frontend-dev:staging . # Verify docker images | grep igny8-frontend-dev ``` -**Marketing Image** +**Marketing Image (built from frontend dir using Dockerfile.marketing.dev)** ```bash -cd /path/to/marketing -docker build -f Dockerfile.dev -t igny8-marketing-dev:staging . +cd /data/app/igny8/frontend +docker build -f Dockerfile.marketing.dev -t igny8-marketing-dev:staging . 
# Verify docker images | grep igny8-marketing-dev @@ -675,7 +643,7 @@ PROJECT_NAME="igny8-staging" COMPOSE_FILE="docker-compose.staging.yml" ENV_FILE=".env.staging" LOG_DIR="/data/app/logs/staging" -PROD_COMPOSE_FILE="docker-compose.yml" +PROD_COMPOSE_FILE="docker-compose.app.yml" PROD_ENV_FILE=".env" # Colors for output @@ -749,21 +717,21 @@ verify_shared_services() { log_info "Verifying shared infrastructure services..." # Check PostgreSQL - if ! docker exec igny8_postgres pg_isready -U postgres &> /dev/null; then + if ! docker exec postgres pg_isready -U postgres &> /dev/null; then log_error "PostgreSQL service not running" exit 1 fi log_success "PostgreSQL verified" # Check Redis - if ! docker exec igny8_redis redis-cli ping &> /dev/null; then + if ! docker exec redis redis-cli ping &> /dev/null; then log_error "Redis service not running" exit 1 fi log_success "Redis verified" # Check Caddy - if ! docker ps | grep -q igny8_caddy; then + if ! docker ps | grep -q caddy; then log_error "Caddy service not running" exit 1 fi @@ -774,20 +742,20 @@ create_staging_database() { log_info "Creating staging database..." 
# Check if database already exists - if docker exec igny8_postgres psql -U postgres -lqt | cut -d \| -f 1 | grep -qw igny8_staging_db; then + if docker exec postgres psql -U postgres -lqt | cut -d \| -f 1 | grep -qw igny8_staging_db; then log_warn "Database 'igny8_staging_db' already exists, skipping creation" return 0 fi - docker exec -i igny8_postgres psql -U postgres -d postgres << 'EOF' + docker exec -i postgres psql -U postgres -d postgres << 'EOF' CREATE DATABASE igny8_staging_db - WITH OWNER igny8_user + WITH OWNER igny8 ENCODING 'UTF8' LOCALE 'en_US.UTF-8' TEMPLATE template0; -GRANT ALL PRIVILEGES ON DATABASE igny8_staging_db TO igny8_user; -GRANT ALL PRIVILEGES ON SCHEMA public TO igny8_user; +GRANT ALL PRIVILEGES ON DATABASE igny8_staging_db TO igny8; +GRANT ALL PRIVILEGES ON SCHEMA public TO igny8; EOF log_success "Staging database created" @@ -835,7 +803,8 @@ create_superuser() { --file "$COMPOSE_FILE" \ --env-file "$ENV_FILE" \ exec -T igny8_staging_backend python manage.py shell << 'EOF' -from django.contrib.auth.models import User +from django.contrib.auth import get_user_model +User = get_user_model() print(User.objects.filter(is_superuser=True).exists()) EOF ) @@ -862,7 +831,8 @@ EOF --file "$COMPOSE_FILE" \ --env-file "$ENV_FILE" \ exec -T igny8_staging_backend python manage.py shell << 'EOF' -from django.contrib.auth.models import User +from django.contrib.auth import get_user_model +User = get_user_model() user = User.objects.get(username='admin') user.set_password('$DJANGO_SUPERUSER_PASSWORD') user.save() @@ -878,7 +848,7 @@ health_check() { RETRY_COUNT=0 while [ $RETRY_COUNT -lt $RETRIES ]; do - if curl -s -f "http://localhost:8012/health/" &> /dev/null; then + if curl -s -f "http://localhost:8012/api/v1/system/status/" &> /dev/null; then log_success "Backend health check passed" return 0 fi @@ -967,8 +937,8 @@ set -e # Exit on error # Configuration PROD_DB="igny8_db" STAGING_DB="igny8_staging_db" -DB_HOST="igny8_postgres" 
-DB_USER="igny8_user" +DB_HOST="postgres" +DB_USER="igny8" BACKUP_DIR="/data/backups/staging" TIMESTAMP=$(date +%Y%m%d_%H%M%S) @@ -1030,8 +1000,8 @@ check_prerequisites() { exit 1 fi - # Check PostgreSQL accessibility - if ! docker exec igny8_postgres pg_isready -U "$DB_USER" -h "$DB_HOST" &> /dev/null; then + # Check PostgreSQL accessibility (container name is 'postgres', not 'igny8_postgres') + if ! docker exec postgres pg_isready -U "$DB_USER" &> /dev/null; then log_error "Cannot connect to PostgreSQL" exit 1 fi @@ -1050,7 +1020,7 @@ backup_staging_db() { BACKUP_FILE="$BACKUP_DIR/igny8_staging_db_${TIMESTAMP}.sql.gz" - docker exec igny8_postgres pg_dump \ + docker exec postgres pg_dump \ -U "$DB_USER" \ --format=plain \ "$STAGING_DB" | gzip > "$BACKUP_FILE" @@ -1061,7 +1031,7 @@ backup_staging_db() { truncate_staging_tables() { log_info "Truncating staging database tables..." - docker exec -i igny8_postgres psql \ + docker exec -i postgres psql \ -U "$DB_USER" \ -d "$STAGING_DB" << 'EOF' -- Get list of all tables @@ -1088,7 +1058,7 @@ dump_production_data() { DUMP_FILE="/tmp/igny8_prod_dump_${TIMESTAMP}.sql" - docker exec igny8_postgres pg_dump \ + docker exec postgres pg_dump \ -U "$DB_USER" \ --format=plain \ "$PROD_DB" > "$DUMP_FILE" @@ -1102,7 +1072,7 @@ restore_to_staging() { log_info "Restoring data to staging database..." - cat "$DUMP_FILE" | docker exec -i igny8_postgres psql \ + cat "$DUMP_FILE" | docker exec -i postgres psql \ -U "$DB_USER" \ -d "$STAGING_DB" \ --quiet @@ -1113,30 +1083,30 @@ restore_to_staging() { handle_sensitive_data() { log_info "Anonymizing/resetting sensitive data in staging..." 
- docker exec -i igny8_postgres psql \
+ docker exec -i postgres psql \
     -U "$DB_USER" \
     -d "$STAGING_DB" << 'EOF'
--- Reset payment information
-UPDATE billing_paymentmethod SET token = NULL WHERE token IS NOT NULL;
+-- Reset/anonymize sensitive data in staging using ACTUAL table names
+-- (All tables prefixed with igny8_ — see 00C for full table list)

--- Reset API tokens
-UPDATE api_token SET token = 'staging-token-' || id WHERE 1=1;
+-- Reset user passwords for non-staff users (set to a known staging password hash)
+UPDATE igny8_users SET password = 'pbkdf2_sha256$600000$stagingsalt$hashedvalue' WHERE is_staff = false;

--- Reset user passwords (set to default)
-UPDATE auth_user SET password = 'pbkdf2_sha256$600000$abcdefg$hashed' WHERE is_staff = false;
+-- Clear payment tokens (integration keys are in DB, not env vars)
+-- Integration settings are in igny8_integration_settings and igny8_integration_providers
+-- Do NOT delete these — just note they need to be updated to sandbox keys post-sync

--- Reset email addresses for non-admin users (optional - for testing)
--- UPDATE auth_user SET email = CONCAT(username, '@staging-test.local') WHERE is_staff = false;
-
--- Clear sensitive logs
-DELETE FROM audit_log WHERE action_type IN ('payment', 'user_data_export');
+-- Clear webhook event records (contain real payment data)
+DELETE FROM igny8_webhook_events;

 -- Clear transient data
-DELETE FROM celery_taskmeta;
 DELETE FROM django_session;

--- Reset any third-party API keys to staging versions
-UPDATE integration_apikey SET secret = 'sk_staging_' || id WHERE 1=1;
+-- Clear AI task logs (optional — may contain API call details)
+-- DELETE FROM igny8_ai_task_logs;
+
+-- NOTE: After sync, manually update igny8_integration_settings to use sandbox API keys
+-- for openai, stripe, paypal, runware, resend providers
 EOF

     log_success "Sensitive data handled"
@@ -1145,7 +1115,7 @@ EOF
 sync_redis_cache() {
     log_info "Clearing staging Redis cache (DB 1)..."
- docker exec igny8_redis redis-cli -n 1 FLUSHDB + docker exec redis redis-cli -n 1 FLUSHDB log_success "Staging Redis cache cleared" } @@ -1238,7 +1208,7 @@ docker-compose -f docker-compose.staging.yml -p igny8-staging ps docker-compose -f docker-compose.staging.yml -p igny8-staging logs -f # Test API endpoint -curl -v https://staging-api.igny8.com/health/ +curl -v https://staging-api.igny8.com/api/v1/system/status/ # Test frontend curl -v https://staging.igny8.com @@ -1279,7 +1249,7 @@ curl -v https://staging.igny8.com - [ ] Frontend loads at `https://staging.igny8.com` without SSL errors - [ ] API accessible at `https://staging-api.igny8.com` with proper CORS headers - [ ] Marketing site loads at `https://staging-marketing.igny8.com` -- [ ] Health check endpoint returns 200 at `/health/` +- [ ] Health check endpoint returns 200 at `/api/v1/system/status/` ### 5.4 Data Synchronization Acceptance @@ -1365,14 +1335,14 @@ Tasks: # In project root docker-compose ps # Verify production running docker network inspect igny8_net # Verify network -docker exec igny8_postgres pg_isready -U postgres # Verify PostgreSQL -docker exec igny8_redis redis-cli ping # Verify Redis +docker exec postgres pg_isready -U postgres # Verify PostgreSQL +docker exec redis redis-cli ping # Verify Redis ``` **Step 2: Create Staging Database** ```bash -docker exec -i igny8_postgres psql -U postgres -d postgres << 'EOF' +docker exec -i postgres psql -U postgres -d postgres << 'EOF' CREATE DATABASE igny8_staging_db WITH OWNER igny8_user ENCODING 'UTF8' @@ -1388,7 +1358,7 @@ EOF ```bash # Edit /data/caddy/Caddyfile and append staging routes -# Reload: docker exec igny8_caddy caddy reload --config /etc/caddy/Caddyfile +# Reload: docker exec caddy caddy reload --config /etc/caddy/Caddyfile ``` **Step 4: Build Images** @@ -1414,7 +1384,7 @@ cd /path/to/marketing && docker build -t igny8-marketing-dev:staging . 
docker-compose -f docker-compose.staging.yml -p igny8-staging ps # Check health -curl https://staging-api.igny8.com/health/ +curl https://staging-api.igny8.com/api/v1/system/status/ # Check logs docker-compose -f docker-compose.staging.yml -p igny8-staging logs -f igny8_staging_backend @@ -1453,12 +1423,12 @@ docker-compose -f docker-compose.staging.yml -p igny8-staging logs igny8_staging **Database connection failing:** ```bash -docker exec igny8_postgres psql -U igny8_user -d igny8_staging_db -c "SELECT 1" +docker exec postgres psql -U igny8 -d igny8_staging_db -c "SELECT 1" ``` **Redis connection failing:** ```bash -docker exec igny8_redis redis-cli -n 1 ping +docker exec redis redis-cli -n 1 ping ``` **DNS not resolving:** @@ -1469,8 +1439,8 @@ dig +short staging.igny8.com **Caddy route not working:** ```bash -docker exec igny8_caddy caddy list-config -docker exec igny8_caddy caddy reload --config /etc/caddy/Caddyfile -v +docker exec caddy caddy list-config +docker exec caddy caddy reload --config /etc/caddy/Caddyfile -v ``` **Restart entire staging environment:** @@ -1481,7 +1451,7 @@ docker-compose -f docker-compose.staging.yml -p igny8-staging down **Reset staging database:** ```bash -docker exec igny8_postgres dropdb -U igny8_user igny8_staging_db +docker exec postgres dropdb -U igny8 igny8_staging_db ./deploy-staging.sh # Recreates and migrations ``` @@ -1489,12 +1459,9 @@ docker exec igny8_postgres dropdb -U igny8_user igny8_staging_db ## 7. Related Documentation -- **[00B Infrastructure Setup](00B-infrastructure-setup.md):** NEW VPS provisioning, Docker, PostgreSQL 18, Redis 8, Caddy 2.11 configuration - - **Version Matrix (in 00B):** SINGLE SOURCE OF TRUTH for all component versions (PostgreSQL 18, Redis 8, Caddy 2.11, Python 3.14, Node 24 LTS, Django 6.0, Vite 8, Gunicorn 25, Celery 5.6, etc.) 
- - Staging environment on NEW VPS uses identical versions -- **[00C Production Migration](00C-production-migration.md):** 3-stage migration flow (DNS Preparation, DNS Flip, Cloudflare Onboarding), production database setup and initial deployment - - **DNS Reference:** Staging setup coordinates with 00C Stage 1/2/3 to determine active DNS provider and domain naming (staging may use test variants during migration) - - **Prerequisite:** 00B must be complete to provision the NEW VPS where staging runs. 00C determines which DNS provider is active for staging domain records. +- **[00B Infrastructure Setup](00B-infrastructure-setup.md):** NEW VPS provisioning, Docker, PostgreSQL, Redis, Caddy configuration. Contains aspirational version targets; current versions verified from codebase (Python 3.11, Django >=5.2.7, Node 18, etc.) +- **[00C Production Migration](00C-production-migration.md):** 3-stage migration flow (Deploy & Test, DNS Flip, Cloudflare Onboarding). DNS Reference: Staging setup coordinates with 00C stages to determine active DNS provider. +- **Codebase files:** `docker-compose.staging.yml` (actual staging compose), `docker-compose.app.yml` (production compose), `backend/requirements.txt` (Python deps), `frontend/package.json` (JS deps). 
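
Since every version claim in this doc is meant to trace back to those two manifests, the baseline can be re-derived mechanically. A sketch assuming standard pip/npm manifest formats — the embedded excerpts are illustrative stand-ins, not the real files; point the parsers at `backend/requirements.txt` and `frontend/package.json` when actually verifying:

```python
import json
import re

# Illustrative excerpts only — read the real manifests when verifying.
REQUIREMENTS_TXT = """\
Django>=5.2.7
celery>=5.3.0
gunicorn
"""
PACKAGE_JSON = json.dumps({
    "dependencies": {"react": "^19.0.0", "zustand": "^5.0.8"},
    "devDependencies": {"typescript": "~5.7.2", "vite": "^6.1.0"},
})

def python_pins(requirements: str) -> dict:
    """Map lowercased package name -> version specifier ('' if unpinned)."""
    pins = {}
    for raw in requirements.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        m = re.match(r"^([A-Za-z0-9_.\-]+)\s*(.*)$", line)
        if m:
            pins[m.group(1).lower()] = m.group(2).strip()
    return pins

def js_pins(package_json: str) -> dict:
    """Merge dependencies and devDependencies into one name -> range map."""
    data = json.loads(package_json)
    return {**data.get("dependencies", {}), **data.get("devDependencies", {})}

py = python_pins(REQUIREMENTS_TXT)
js = js_pins(PACKAGE_JSON)
assert py["django"] == ">=5.2.7" and py["celery"] == ">=5.3.0"
assert js["react"] == "^19.0.0" and js["vite"] == "^6.1.0"
```

Any drift between these parsed values and the version list under Implementation Steps above means the doc, not the codebase, needs updating.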
--- diff --git a/v2/V2-Execution-Docs/00E-legacy-cleanup.md b/v2/V2-Execution-Docs/00E-legacy-cleanup.md index 2c242837..d52d5169 100644 --- a/v2/V2-Execution-Docs/00E-legacy-cleanup.md +++ b/v2/V2-Execution-Docs/00E-legacy-cleanup.md @@ -3,8 +3,9 @@ **Status:** Pre-Implementation **Phase:** Phase 0 - Foundation & Infrastructure **Document ID:** 00E -**Version:** 1.0 +**Version:** 1.1 **Created:** 2026-03-23 +**Source of Truth:** Codebase at `/data/app/igny8/` --- @@ -322,8 +323,8 @@ ssh user@new-vps-ip docker ps -a | grep -v Exit # Verify application health endpoints -curl -v https://app.igny8.local/health -curl -v https://api.igny8.local/status +curl -v https://app.igny8.com/api/v1/system/status/ +curl -v https://api.igny8.com/api/v1/system/status/ # Check recent logs for errors docker logs --tail 100 [service-name] | grep -i error @@ -366,22 +367,22 @@ OLD_VPS_IP="x.x.x.x" NEW_VPS_IP="y.y.y.y" # Check all production DNS records for organization -nslookup igny8.local -nslookup api.igny8.local -nslookup app.igny8.local -nslookup git.igny8.local # Should NOT resolve to old VPS +nslookup igny8.com +nslookup api.igny8.com +nslookup app.igny8.com +nslookup git.igny8.com # Should NOT resolve to old VPS # Use dig for more detailed DNS information -dig igny8.local +short -dig @8.8.8.8 igny8.local +short # Check public DNS +dig igny8.com +short +dig @8.8.8.8 igny8.com +short # Check public DNS # Search DNS for any remaining old VPS references getent hosts | grep $OLD_VPS_IP # Verify all subdomains point to new VPS for domain in api app git cdn mail; do - echo "Checking $domain.igny8.local..." - dig $domain.igny8.local +short + echo "Checking $domain.igny8.com..." 
+ dig $domain.igny8.com +short done # IMPORTANT: Identify test DNS records created during 00C validation that must be removed @@ -562,19 +563,19 @@ grep "Accepted" /var/log/auth.log | tail -20 **Commands:** ```bash # Repeat DNS verification from Step 1.4 -nslookup igny8.local -dig api.igny8.local +short -dig app.igny8.local +short +nslookup igny8.com +dig api.igny8.com +short +dig app.igny8.com +short # Check for any CNAME chains -dig igny8.local CNAME +dig igny8.com CNAME # Verify mail records don't point to old VPS -dig igny8.local MX -dig igny8.local NS +dig igny8.com MX +dig igny8.com NS # Use external DNS checker -curl "https://dns.google/resolve?name=igny8.local&type=A" | jq . +curl "https://dns.google/resolve?name=igny8.com&type=A" | jq . # Verify test DNS records still exist (to be removed in Step 3.4) nslookup test-app.igny8.com @@ -877,8 +878,8 @@ df -h ssh user@new-vps-ip # Run smoke tests for all critical services -curl -v https://api.igny8.local/health -curl -v https://app.igny8.local/ +curl -v https://api.igny8.com/api/v1/system/status/ +curl -v https://app.igny8.com/ # Run database operations docker exec [app-container] /app/bin/test-db-connection @@ -1211,8 +1212,8 @@ echo "Checking new VPS health..." ssh user@$VPS_IP "docker ps -a" | grep -E "Up|Exited" # Check endpoints -curl -s https://api.igny8.local/health | jq . -curl -s https://app.igny8.local/ > /dev/null && echo "App endpoint OK" +curl -s https://api.igny8.com/api/v1/system/status/ | jq . +curl -s https://app.igny8.com/ > /dev/null && echo "App endpoint OK" # Check resources ssh user@$VPS_IP "free -h | awk 'NR==2'" @@ -1238,7 +1239,7 @@ NEW_IP=$2 echo "Verifying DNS records..." -domains=("igny8.local" "api.igny8.local" "app.igny8.local" "git.igny8.local") +domains=("igny8.com" "api.igny8.com" "app.igny8.com" "git.igny8.com") for domain in "${domains[@]}"; do current_ip=$(dig +short $domain @8.8.8.8 | head -1) @@ -1486,11 +1487,11 @@ echo "Running smoke tests on new VPS..." 
# Test APIs echo "Testing API health..." -curl -s https://api.igny8.local/health | jq . || echo "FAILED" +curl -s https://api.igny8.com/api/v1/system/status/ | jq . || echo "FAILED" # Test app echo "Testing web app..." -curl -s -o /dev/null -w "%{http_code}" https://app.igny8.local/ || echo "FAILED" +curl -s -o /dev/null -w "%{http_code}" https://app.igny8.com/ || echo "FAILED" # Test database echo "Testing database..." @@ -1662,7 +1663,7 @@ git push origin main **Diagnosis:** ```bash -dig igny8.local +short +dig igny8.com +short # Returns: x.x.x.x (old VPS IP) ``` @@ -1671,7 +1672,7 @@ dig igny8.local +short 2. Verify TTL has expired (may need to wait 24+ hours) 3. Manually update DNS records if needed 4. Flush local DNS cache: `sudo systemctl restart systemd-resolved` -5. Re-verify from external DNS: `dig @8.8.8.8 igny8.local` +5. Re-verify from external DNS: `dig @8.8.8.8 igny8.com` --- @@ -1718,7 +1719,7 @@ ssh user@$OLD_VPS_IP "docker top gitea" **Diagnosis:** ```bash -curl https://api.igny8.local/health +curl https://api.igny8.com/api/v1/system/status/ # Returns: 500 Internal Server Error ``` diff --git a/v2/V2-Execution-Docs/00F-self-hosted-ai-infra.md b/v2/V2-Execution-Docs/00F-self-hosted-ai-infra.md index 676214f6..2c069b34 100644 --- a/v2/V2-Execution-Docs/00F-self-hosted-ai-infra.md +++ b/v2/V2-Execution-Docs/00F-self-hosted-ai-infra.md @@ -1,10 +1,11 @@ # IGNY8 Phase 0: Self-Hosted AI Infrastructure (00F) **Status:** Ready for Implementation +**Version:** 1.1 **Priority:** High (cost savings critical for unit economics) **Duration:** 5-7 days **Dependencies:** 00B (VPS provisioning) must be complete first -**Version Reference:** See [00B Version Matrix](./00B-vps-provisioning.md) for authoritative version information +**Source of Truth:** Codebase at `/data/app/igny8/` **Cost:** ~$200/month GPU rental + $0 software (open source) --- @@ -12,14 +13,12 @@ ## 1. 
Current State ### Existing AI Integration -- **External providers:** OpenAI (GPT-4, GPT-3.5), Anthropic (Claude), Runware (image gen), Bria (image gen) -- **Storage:** API keys stored in `GlobalIntegrationSettings` / `IntegrationProvider` models in database +- **External providers (verified from `IntegrationProvider` model):** OpenAI (GPT-4, GPT-3.5), Anthropic (Claude), Runware (image gen) +- **Storage:** API keys stored in `IntegrationProvider` model (table: `igny8_integration_providers`) with per-account overrides in `IntegrationSettings` (table: `igny8_integration_settings`). Global defaults in `GlobalIntegrationSettings`. +- **Provider types in codebase:** `ai`, `payment`, `email`, `storage` (from `PROVIDER_TYPE_CHOICES`) +- **Existing provider_ids:** `openai`, `runware`, `stripe`, `paypal`, `resend` - **Architecture:** Multi-provider AI engine with model selection capability -- **Current usage:** - - Content generation: articles, product descriptions, blog posts - - Image generation: product images, covers, social media graphics - - Keyword research and SEO optimization - - Content enhancement and rewriting +- **Current AI functions:** `auto_cluster`, `generate_ideas`, `generate_content`, `generate_images`, `generate_image_prompts`, `optimize_content`, `generate_site_structure` - **Async handling:** Celery workers process long-running AI tasks - **Cost impact:** External APIs constitute 15-30% of monthly operational costs @@ -125,31 +124,35 @@ ## 3. 
Data Models / APIs -### Database Models (No Schema Changes Required) +### Database Models (Minimal Schema Changes) -Use existing `IntegrationProvider` and `GlobalIntegrationSettings` models: +Use existing `IntegrationProvider` model — add a new row with `provider_type='ai'`: ```python -# In GlobalIntegrationSettings -INTEGRATION_PROVIDER_SELF_HOSTED = "self_hosted_ai" +# New IntegrationProvider row (NO new provider_type needed) +# provider_type='ai' already exists in PROVIDER_TYPE_CHOICES -# Settings structure (stored as JSON) -{ - "provider": "self_hosted_ai", - "name": "Self-Hosted AI (LiteLLM)", - "base_url": "http://localhost:8000", - "api_key": "not_used", # LiteLLM doesn't require auth (internal) - "enabled": True, - "priority": 10, # Try self-hosted first - "models": { - "text_generation": "qwen3:32b", - "text_generation_fast": "qwen3:8b", - "image_generation": "flux.1-dev", - "image_generation_fast": "sdxl-lightning" - }, - "timeout": 300, # 5 minute timeout for slow models - "fallback_to": "openai" # Fallback provider if self-hosted fails -} +# Create via admin or migration: +IntegrationProvider.objects.create( + provider_id='self_hosted_ai', + display_name='Self-Hosted AI (LiteLLM)', + provider_type='ai', + api_key='', # LiteLLM doesn't require auth (internal) + api_endpoint='http://localhost:8000', + is_active=True, + is_sandbox=False, + config={ + "priority": 10, # Try self-hosted first + "models": { + "text_generation": "qwen3:32b", + "text_generation_fast": "qwen3:8b", + "image_generation": "flux.1-dev", + "image_generation_fast": "sdxl-lightning" + }, + "timeout": 300, # 5 minute timeout for slow models + "fallback_to": "openai" # Fallback provider if self-hosted fails + } +) ``` ### LiteLLM API Endpoints diff --git a/v2/V2-Execution-Docs/01A-sag-data-foundation.md b/v2/V2-Execution-Docs/01A-sag-data-foundation.md index 712e3180..8aa3d04c 100644 --- a/v2/V2-Execution-Docs/01A-sag-data-foundation.md +++ 
b/v2/V2-Execution-Docs/01A-sag-data-foundation.md @@ -1,10 +1,11 @@ # IGNY8 Phase 1: SAG Data Foundation (01A) ## Core Data Models & CRUD APIs -**Document Version:** 1.0 +**Document Version:** 1.1 **Date:** 2026-03-23 **Phase:** IGNY8 Phase 1 — SAG Data Foundation **Status:** Build Ready +**Source of Truth:** Codebase at `/data/app/igny8/` **Audience:** Claude Code, Backend Developers, Architects --- @@ -12,8 +13,9 @@ ## 1. CURRENT STATE ### Existing IGNY8 Architecture -- **Framework:** Django 5.1 + Django REST Framework 3.15 (upgrading to Django 6.0 on new VPS) -- **Database:** PostgreSQL 16 (upgrading to PostgreSQL 18) +- **Framework:** Django >=5.2.7 + Django REST Framework (from requirements.txt) +- **Database:** PostgreSQL (version set by infra stack) +- **Primary Keys:** BigAutoField (integer PKs — NOT UUIDs). `DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'` - **Multi-Tenancy:** AccountContextMiddleware enforces tenant isolation - All new models inherit from `AccountBaseModel` or `SiteSectorBaseModel` - Automatic tenant filtering in querysets @@ -26,17 +28,24 @@ } ``` - **ViewSet Pattern:** `AccountModelViewSet` (filters by account) and `SiteSectorModelViewSet` -- **Async Processing:** Celery 5.4 for background tasks -- **Existing Models:** - - Account (tenant root) - - Site (per-account, multi-site support) - - Sector (site-level category) - - Keyword, Cluster, ContentIdea, Task, Content, Image, SiteIntegration +- **Async Processing:** Celery >=5.3.0 for background tasks +- **Existing Models (PLURAL class names, all use BigAutoField PKs):** + - Account, Site (in `auth/models.py`) — `AccountBaseModel` + - Sector (in `auth/models.py`) — `AccountBaseModel` + - Clusters (in `business/planning/models.py`) — `SiteSectorBaseModel` + - Keywords (in `business/planning/models.py`) — `SiteSectorBaseModel` + - ContentIdeas (in `business/planning/models.py`) — `SiteSectorBaseModel` + - Tasks (in `business/content/models.py`) — `SiteSectorBaseModel` + - Content (in 
`business/content/models.py`) — `SiteSectorBaseModel` + - Images (in `business/content/models.py`) — `SiteSectorBaseModel` + - SiteIntegration (in `business/integration/models.py`) — `SiteSectorBaseModel` + - IntegrationProvider (in `modules/system/models.py`) — standalone ### Frontend Stack -- React 19 + TypeScript 5 -- Tailwind CSS 3 -- Zustand 4 (state management) +- React ^19.0.0 + TypeScript ~5.7.2 +- Tailwind CSS ^4.0.8 +- Zustand ^5.0.8 (state management) +- Vite ^6.1.0 ### What Doesn't Exist - Attribute-based cluster formation system @@ -128,10 +137,11 @@ sag/ class SAGBlueprint(AccountBaseModel): """ Core blueprint model: versioned taxonomy & cluster plan for a site. - Inherits: id (UUID), account_id (FK), created_at, updated_at + Inherits: id (BigAutoField, integer PK), account_id (FK), created_at, updated_at + Note: Uses BigAutoField per project convention (DEFAULT_AUTO_FIELD), NOT UUID. """ site = models.ForeignKey( - 'sites.Site', + 'igny8_core_auth.Site', # Actual app_label for Site model on_delete=models.CASCADE, related_name='sag_blueprints' ) @@ -232,7 +242,7 @@ class SAGBlueprint(AccountBaseModel): class SAGAttribute(AccountBaseModel): """ Attribute/dimension within a blueprint. - Inherits: id (UUID), account_id (FK), created_at, updated_at + Inherits: id (BigAutoField, integer PK), account_id (FK), created_at, updated_at """ blueprint = models.ForeignKey( 'sag.SAGBlueprint', @@ -315,7 +325,11 @@ class SAGAttribute(AccountBaseModel): class SAGCluster(AccountBaseModel): """ Cluster: hub page + supporting content organized around attribute intersection. - Inherits: id (UUID), account_id (FK), created_at, updated_at + Inherits: id (BigAutoField, integer PK), account_id (FK), created_at, updated_at + + IMPORTANT: This model coexists with the existing `Clusters` model (in business/planning/models.py). + Existing Clusters are pure topic-keyword groups. SAGClusters add attribute-based dimensionality. 
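+    For example (illustrative only, not from the codebase): an existing topic cluster "Dog Collars" may map onto the SAG cluster "Dog Collars × Small Breeds".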
+ They are linked via an optional FK on the existing Clusters model. """ blueprint = models.ForeignKey( 'sag.SAGBlueprint', @@ -323,7 +337,7 @@ class SAGCluster(AccountBaseModel): related_name='clusters' ) site = models.ForeignKey( - 'sites.Site', + 'igny8_core_auth.Site', # Actual app_label for Site model on_delete=models.CASCADE, related_name='sag_clusters' ) @@ -461,8 +475,8 @@ class SectorAttributeTemplate(models.Model): """ Reusable template for attributes and keywords by industry + sector. NOT tied to Account (admin-only, shared across tenants). + Uses BigAutoField PK per project convention (do NOT use UUID). """ - id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False) industry = models.CharField( max_length=200, @@ -533,11 +547,12 @@ class SectorAttributeTemplate(models.Model): All modifications are **backward-compatible** and **nullable**. Existing records are unaffected. -#### **Site** (in `sites/models.py`) +#### **Site** (in `auth/models.py`, app_label: `igny8_core_auth`) ```python -class Site(AccountBaseModel): +class Site(SoftDeletableModel, AccountBaseModel): # ... existing fields ... + # NEW: SAG integration (nullable, backward-compatible) sag_blueprint = models.ForeignKey( 'sag.SAGBlueprint', on_delete=models.SET_NULL, @@ -548,11 +563,13 @@ class Site(AccountBaseModel): ) ``` -#### **Cluster** (in `modules/planner/models.py`) +#### **Clusters** (in `business/planning/models.py`, app_label: `planner`) ```python -class Cluster(AccountBaseModel): - # ... existing fields ... +class Clusters(SoftDeletableModel, SiteSectorBaseModel): + # ... existing fields: name, description, keywords_count, volume, + # mapped_pages, status(new/mapped), disabled ... 
+ # NEW: SAG integration (nullable, backward-compatible) sag_cluster = models.ForeignKey( 'sag.SAGCluster', on_delete=models.SET_NULL, @@ -570,11 +587,16 @@ class Cluster(AccountBaseModel): ) ``` -#### **Task** (in `modules/writer/models.py`) +#### **Tasks** (in `business/content/models.py`, app_label: `writer`) ```python -class Task(AccountBaseModel): - # ... existing fields ... +class Tasks(SoftDeletableModel, SiteSectorBaseModel): + # ... existing fields: title, description, content_type, content_structure, + # keywords, word_count, status(queued/completed) ... + # NOTE: Already has `cluster = FK('planner.Clusters')` and + # `idea = FK('planner.ContentIdeas')` — these are NOT being replaced. + # The new sag_cluster FK is an ADDITIONAL link to the SAG layer. + # NEW: SAG integration (nullable, backward-compatible) sag_cluster = models.ForeignKey( 'sag.SAGCluster', on_delete=models.SET_NULL, @@ -592,11 +614,14 @@ class Task(AccountBaseModel): ) ``` -#### **Content** (in `modules/writer/models.py`) +#### **Content** (in `business/content/models.py`, app_label: `writer`) ```python -class Content(AccountBaseModel): +class Content(SoftDeletableModel, SiteSectorBaseModel): # ... existing fields ... + # NOTE: Already has task FK (to writer.Tasks which has cluster FK). + # The new sag_cluster FK is an ADDITIONAL direct link to SAG layer. + # NEW: SAG integration (nullable, backward-compatible) sag_cluster = models.ForeignKey( 'sag.SAGCluster', on_delete=models.SET_NULL, @@ -607,11 +632,12 @@ class Content(AccountBaseModel): ) ``` -#### **ContentIdea** (in `modules/planner/models.py`) +#### **ContentIdeas** (in `business/planning/models.py`, app_label: `planner`) ```python -class ContentIdea(AccountBaseModel): +class ContentIdeas(SoftDeletableModel, SiteSectorBaseModel): # ... existing fields ... 
+ # NEW: SAG integration (nullable, backward-compatible) sag_cluster = models.ForeignKey( 'sag.SAGCluster', on_delete=models.SET_NULL, @@ -847,7 +873,7 @@ class SAGBlueprintViewSet(AccountModelViewSet): @action(detail=False, methods=['get']) def active_by_site(self, request): - """GET /api/v1/sag/blueprints/active_by_site/?site_id=""" + """GET /api/v1/sag/blueprints/active_by_site/?site_id=""" site_id = request.query_params.get('site_id') if not site_id: return Response({ @@ -1027,7 +1053,7 @@ blueprints_router.register( urlpatterns = [ path('', include(router.urls)), - path('blueprints//', include(blueprints_router.urls)), + path('blueprints//', include(blueprints_router.urls)), ] ``` @@ -1309,7 +1335,7 @@ def compute_blueprint_health(blueprint): logger.info(f"Computed health for blueprint {blueprint.id}: {blueprint.sag_health_score}") return { - 'blueprint_id': str(blueprint.id), + 'blueprint_id': blueprint.id, 'health_score': blueprint.sag_health_score, 'attribute_count': attribute_count, 'cluster_count': cluster_count, @@ -1580,14 +1606,13 @@ def plan_supporting_content(cluster, hub_page_title, num_articles=5): 2. **Define Models** - Implement `SAGBlueprint`, `SAGAttribute`, `SAGCluster`, `SectorAttributeTemplate` - - Add 5 new nullable fields to existing models (Site, Cluster, Task, Content, ContentIdea) + - Add 5 new nullable fields to existing models (Site, Clusters, Tasks, Content, ContentIdeas) - Ensure all models inherit from correct base class (AccountBaseModel or base Model) 3. **Create Migrations** - Run `makemigrations sag` - Manually verify for circular imports or dependencies - - Create migration for modifications to existing models - - All existing fields must remain untouched + - Create migration for modifications to existing models (Clusters, Tasks, Content, ContentIdeas in their respective apps; Site in igny8_core_auth) 4. 
**Implement Serializers** - SAGBlueprintDetailSerializer (nested attributes & clusters) @@ -1638,7 +1663,7 @@ def plan_supporting_content(cluster, hub_page_title, num_articles=5): ### Data Model - [ ] All 4 models created and migrated successfully -- [ ] All 5 existing models have nullable SAG fields +- [ ] All 5 existing models have nullable SAG fields (Site, Clusters, Tasks, Content, ContentIdeas) - [ ] Unique constraints enforced (blueprint version, attribute slugs, cluster slugs, template industry/sector) - [ ] Foreign key cascades correct (blueprint → attributes/clusters) - [ ] All model methods and properties work as documented @@ -1653,7 +1678,7 @@ def plan_supporting_content(cluster, hub_page_title, num_articles=5): - [ ] POST /api/v1/sag/blueprints/{id}/archive/ (active → archived) - [ ] POST /api/v1/sag/blueprints/{id}/regenerate/ (create v+1) - [ ] POST /api/v1/sag/blueprints/{id}/health_check/ (compute score) -- [ ] GET /api/v1/sag/blueprints/active_by_site/?site_id= +- [ ] GET /api/v1/sag/blueprints/active_by_site/?site_id= - [ ] GET/POST /api/v1/sag/blueprints/{blueprint_id}/attributes/ - [ ] GET/POST /api/v1/sag/blueprints/{blueprint_id}/clusters/ - [ ] GET/POST /api/v1/sag/sector-templates/ (admin-only) @@ -1711,7 +1736,7 @@ This document contains **everything Claude Code needs to build the sag/ app**. 6. **Copy admin.py exactly** as-is 7. **Create service files** with code from Section 3.7 8. **Create AI function stubs** from Section 3.8 -9. **Create migration** for existing model changes (Site, Cluster, Task, Content, ContentIdea) +9. **Create migration** for existing model changes (Site in `igny8_core_auth`, Clusters/ContentIdeas in `planner`, Tasks/Content in `writer`) 10. **Run migrations** on development database 11. **Test endpoints** with Postman or curl 12. 
**Write unit & integration tests** matching patterns in existing test suite @@ -1724,7 +1749,7 @@ python manage.py startapp sag igny8_core/ # Makemigrations python manage.py makemigrations sag -python manage.py makemigrations # For existing model changes +python manage.py makemigrations igny8_core_auth planner writer # For existing model changes # Migrate python manage.py migrate sag @@ -1753,7 +1778,7 @@ curl -X POST http://localhost:8000/api/v1/sag/blueprints/ \ -H "Authorization: Token <token>" \ -H "Content-Type: application/json" \ -d '{ - "site": "", + "site": 42, "status": "draft", "source": "manual", "taxonomy_plan": {} @@ -1800,7 +1825,7 @@ POST /api/v1/sag/blueprints/ Authorization: Token <token> { - "site": "550e8400-e29b-41d4-a716-446655440000", + "site": 42, "status": "draft", "source": "manual", "taxonomy_plan": { @@ -1819,9 +1844,9 @@ Authorization: Token <token> { "success": true, "data": { - "id": "660e8400-e29b-41d4-a716-446655440001", - "site": "550e8400-e29b-41d4-a716-446655440000", - "account": "770e8400-e29b-41d4-a716-446655440002", + "id": 1, + "site": 42, + "account": 7, "version": 1, "status": "draft", "source": "manual", @@ -1856,7 +1881,7 @@ Authorization: Token <token> **Request:** ```json -POST /api/v1/sag/blueprints/660e8400-e29b-41d4-a716-446655440001/confirm/ +POST /api/v1/sag/blueprints/1/confirm/ Authorization: Token <token> ``` @@ -1865,8 +1890,8 @@ Authorization: Token <token> { "success": true, "data": { - "id": "660e8400-e29b-41d4-a716-446655440001", - "site": "550e8400-e29b-41d4-a716-446655440000", + "id": 1, + "site": 42, "version": 1, "status": "active", "confirmed_at": "2026-03-23T10:05:00Z", diff --git a/v2/V2-Execution-Docs/01B-sector-attribute-templates.md b/v2/V2-Execution-Docs/01B-sector-attribute-templates.md index 536b9974..f350521b 100644 --- a/v2/V2-Execution-Docs/01B-sector-attribute-templates.md +++ b/v2/V2-Execution-Docs/01B-sector-attribute-templates.md @@ -1,7 +1,10 @@ # 01B - Sector Attribute Templates **IGNY8 Phase 1: Service Layer & AI Functions** 
-**Version:** 1.0 +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2025-07-14 + **Date:** 2026-03-23 **Status:** Build-Ready **Owner:** SAG Team @@ -11,7 +14,7 @@ ## 1. Current State ### Model Foundation -- `SectorAttributeTemplate` model defined in `01A` (sag/models.py) +- `SectorAttributeTemplate` model defined in `01A` (`igny8_core/sag/models.py`, new sag/ app) - Schema includes: - `industry` (string) - `sector` (string) @@ -43,7 +46,7 @@ From SAG Niche Definition Process: ### 2.1 Service Layer: template_service.py -**Location:** `sag/services/template_service.py` +**Location:** `igny8_core/sag/services/template_service.py` #### Core Functions @@ -65,7 +68,7 @@ def get_or_generate_template( - If missing: trigger AI generation via `discover_sector_attributes()` AI function - Save generated template with `source='ai_generated'` - Return completed template -- Cache in Redis for 7 days (key: `sag:template:{industry}:{sector}`) +- Cache in Redis for 7 days (key: `planner:template:{industry}:{sector}`) ```python def merge_templates( @@ -128,17 +131,23 @@ def prune_template(template: SectorAttributeTemplate) -> SectorAttributeTemplate ### 2.2 AI Function: DiscoverSectorAttributes -**Location:** `sag/ai_functions/attribute_discovery.py` +**Location:** `igny8_core/ai/functions/discover_sector_attributes.py` **Register Key:** `discover_sector_attributes` #### Function Signature ```python -@ai_function(key='discover_sector_attributes') -async def discover_sector_attributes( - industry: str, - sector: str, - site_type: str # 'ecommerce' | 'local_services' | 'saas' | 'content' -) -> dict: +class DiscoverSectorAttributesFunction(BaseAIFunction): + """Discover sector attributes using AI.""" + + def get_name(self) -> str: + return 'discover_sector_attributes' + + async def execute( + self, + industry: str, + sector: str, + site_type: str # 'ecommerce' | 'local_services' | 'saas' | 'content' + ) -> dict: 
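+        # Illustrative execution sketch (assumption — BaseAIFunction's helper
+        # API is not shown in this document):
+        #   1. Build the prompt from industry/sector/site_type (Section 2.2).
+        #   2. Call the model and parse the JSON response.
+        #   3. Run validate_template() on the result; on failure, retry with a
+        #      stricter prompt, max 2 retries (per the error-handling notes).
+        #   4. Return a dict matching the documented attribute schema.
+        ...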
``` #### Input @@ -247,7 +256,7 @@ OUTPUT: Valid JSON matching the schema above. Ensure all constraints are met. - **Cache:** Template generation results cache for 30 days - **Validation:** Run `validate_template()` on output before returning - **Fallback:** If validation fails, retry with stricter prompt, max 2 retries -- **Error Handling:** Log to `sag_ai_generation` logger with full prompt/response +- **Error Handling:** Log to `planner_ai_generation` logger with full prompt/response --- @@ -383,7 +392,7 @@ Step 4: Return Merged Template #### Seeding Implementation -**Fixture File:** `sag/fixtures/sector_templates_seed.json` +**Fixture File:** `igny8_core/sag/fixtures/sector_templates_seed.json` ```json { "industry": "Pet Supplies", @@ -448,10 +457,17 @@ Applied in `prune_template()`: ### 3.1 SectorAttributeTemplate Model -**Location:** `sag/models.py` (from 01A, extended here) +**Location:** `igny8_core/sag/models.py` (from 01A sag/ app, extended here) ```python +from django.conf import settings # needed for the AUTH_USER_MODEL FK below +from django.db import models + + class SectorAttributeTemplate(models.Model): + """ + Admin-only template: NOT tied to Account or Site. + Uses BigAutoField PK per project convention (do NOT use UUID). 
+ """ # Identity industry = models.CharField(max_length=255, db_index=True) sector = models.CharField(max_length=255, db_index=True) @@ -496,7 +512,7 @@ class SectorAttributeTemplate(models.Model): # Relationships created_by = models.ForeignKey( - User, + settings.AUTH_USER_MODEL, on_delete=models.SET_NULL, null=True, blank=True, @@ -504,12 +520,19 @@ class SectorAttributeTemplate(models.Model): ) class Meta: + app_label = 'planner' + db_table = 'igny8_sector_attribute_templates' unique_together = [('industry', 'sector')] indexes = [ models.Index(fields=['industry', 'sector']), models.Index(fields=['source', 'is_active']), ] ordering = ['-updated_at'] + verbose_name = 'Sector Attribute Template' + verbose_name_plural = 'Sector Attribute Templates' + + objects = SoftDeleteManager() + all_objects = models.Manager() def __str__(self): return f"{self.industry} / {self.sector}" @@ -519,7 +542,7 @@ class SectorAttributeTemplate(models.Model): ### 3.2 REST API Endpoints -**Base URL:** `/api/v1/sag/` +**Base URL:** `/api/v1/planner/` **Authentication:** Requires authentication (session or token) #### GET /sector-templates/{industry}/{sector}/ @@ -527,7 +550,7 @@ class SectorAttributeTemplate(models.Model): Request: ``` -GET /api/v1/sag/sector-templates/Pet%20Supplies/Dog%20Accessories/ +GET /api/v1/planner/sector-templates/Pet%20Supplies/Dog%20Accessories/ ``` Response (200 OK): @@ -603,7 +626,7 @@ Response (400 Bad Request): Request: ``` -GET /api/v1/sag/sector-templates/?industry=Pet%20Supplies&source=ai_generated&is_active=true +GET /api/v1/planner/sector-templates/?industry=Pet%20Supplies&source=ai_generated&is_active=true ``` Query Parameters: @@ -618,7 +641,7 @@ Response (200 OK): ```json { "count": 450, - "next": "/api/v1/sag/sector-templates/?limit=100&offset=100", + "next": "/api/v1/planner/sector-templates/?limit=100&offset=100", "previous": null, "results": [ { ... template 1 ... 
}, @@ -694,7 +717,7 @@ Response (202 Accepted - async): ```json { "status": "generating", - "task_id": "uuid-1234-5678", + "task_id": "celery-task-1234-5678", "industry": "Pet Supplies", "sector": "Dog Accessories", "message": "Template generation in progress. Check back in 30 seconds." @@ -746,14 +769,14 @@ Response (200 OK): ### 3.3 Service Layer: TemplateService Class -**Location:** `sag/services/template_service.py` +**Location:** `igny8_core/sag/services/template_service.py` ```python from typing import Optional, List, Tuple, Dict, Any from django.core.cache import cache from django.db.models import Q -from sag.models import SectorAttributeTemplate -from sag.ai_functions.attribute_discovery import discover_sector_attributes +from igny8_core.sag.models import SectorAttributeTemplate +from igny8_core.ai.functions.discover_sector_attributes import DiscoverSectorAttributesFunction class TemplateService: """Service for managing sector attribute templates.""" @@ -772,7 +795,7 @@ class TemplateService: sector: str ) -> Optional[SectorAttributeTemplate]: """Fetch template from database or cache.""" - cache_key = f"sag:template:{TemplateService.normalize_key(industry, sector)}" + cache_key = f"planner:template:{TemplateService.normalize_key(industry, sector)}" # Try cache first cached = cache.get(cache_key) @@ -1110,13 +1133,13 @@ class TemplateService: **Priority:** Critical **Owner:** Backend team -1. **Create `sag/services/template_service.py`** +1. **Create `igny8_core/sag/services/template_service.py`** - Implement all 6 core functions - Add unit tests for each function - Test edge cases (missing templates, invalid data) - Acceptance: All functions pass unit tests, caching works -2. **Create `sag/ai_functions/attribute_discovery.py`** +2. 
**Create `igny8_core/ai/functions/discover_sector_attributes.py`** - Register AI function with key `discover_sector_attributes` - Implement prompt strategy - Add input validation @@ -1135,20 +1158,20 @@ class TemplateService: **Priority:** Critical **Owner:** Backend team -1. **Create `sag/views/template_views.py`** +1. **Create `igny8_core/sag/views.py`** - TemplateListCreateView (GET, POST) - TemplateDetailView (GET, PUT, PATCH) - TemplateGenerateView (POST) - TemplateMergeView (POST) - All endpoints require authentication -2. **Create `sag/serializers/template_serializers.py`** +2. **Create `igny8_core/sag/serializers.py`** - SectorAttributeTemplateSerializer - Custom validation in serializer - Nested serializers for attribute_framework, keyword_templates -3. **Register URLs in `sag/urls.py`** - - Route all endpoints under `/api/v1/sag/sector-templates/` +3. **Register URLs in `igny8_core/sag/urls.py`** + - Route all endpoints under `/api/v1/planner/sector-templates/` - Use trailing slashes - Include proper HTTP method routing @@ -1165,7 +1188,7 @@ class TemplateService: **Priority:** High **Owner:** Data team -1. **Create `sag/fixtures/sector_templates_seed.json`** +1. 
**Create `igny8_core/sag/fixtures/sector_templates_seed.json`** - Template definitions for top 20 industries - Minimal valid data (5-8 attributes each) - Should include: Pet Supplies, E-commerce Software, Digital Marketing, Healthcare, Real Estate @@ -1280,8 +1303,8 @@ class TemplateService: | Criterion | Target | Status | |-----------|--------|--------| -| Code coverage (sag/services/) | >85% | PENDING | -| Code coverage (sag/ai_functions/) | >80% | PENDING | +| Code coverage (igny8_core/sag/services/) | >85% | PENDING | +| Code coverage (igny8_core/ai/functions/) | >80% | PENDING | | API tests coverage | 100% (all endpoints) | PENDING | | All templates pass validate_template() | 100% | PENDING | | Documentation completeness | All endpoints documented | PENDING | @@ -1295,55 +1318,54 @@ class TemplateService: ```bash # Create service file -touch sag/services/template_service.py +touch igny8_core/sag/services/template_service.py # Copy code from section 3.3 above # Create AI function file -touch sag/ai_functions/attribute_discovery.py -# Implement discover_sector_attributes() with prompt from section 2.2 +touch igny8_core/ai/functions/discover_sector_attributes.py +# Implement DiscoverSectorAttributesFunction class with prompt from section 2.2 # Create tests -touch sag/tests/test_template_service.py -touch sag/tests/test_attribute_discovery.py +touch igny8_core/sag/tests/test_template_service.py +touch igny8_core/sag/tests/test_attribute_discovery.py # Run tests -python manage.py test sag.tests.test_template_service --verbosity=2 -python manage.py test sag.tests.test_attribute_discovery --verbosity=2 +python manage.py test igny8_core.sag.tests.test_template_service --verbosity=2 +python manage.py test igny8_core.sag.tests.test_attribute_discovery --verbosity=2 ``` ### For Building the API Layer ```bash # Create views and serializers -touch sag/views/template_views.py -touch sag/serializers/template_serializers.py +touch 
igny8_core/sag/views.py +touch igny8_core/sag/serializers.py # Register URLs # Edit igny8_core/sag/urls.py: # from igny8_core.sag.views import * # urlpatterns += [ # path('sector-templates/', TemplateListCreateView.as_view(), ...), -# path('sector-templates/<uuid:pk>/', TemplateDetailView.as_view(), ...), +# path('sector-templates/<int:pk>/', TemplateDetailView.as_view(), ...), # path('sector-templates/generate/', TemplateGenerateView.as_view(), ...), # path('sector-templates/merge/', TemplateMergeView.as_view(), ...), # ] # Create API tests -touch sag/tests/test_template_api.py +touch igny8_core/sag/tests/test_template_api.py # Run API tests -python manage.py test sag.tests.test_template_api --verbosity=2 +python manage.py test igny8_core.sag.tests.test_template_api --verbosity=2 ``` ### For Seeding Data ```bash # Create fixture file -touch sag/fixtures/sector_templates_seed.json +touch igny8_core/sag/fixtures/sector_templates_seed.json # Create management command -mkdir -p sag/management/commands -touch sag/management/commands/seed_sector_templates.py +mkdir -p igny8_core/management/commands +touch igny8_core/management/commands/seed_sector_templates.py # Run seeding python manage.py seed_sector_templates --industry "Pet Supplies" @@ -1357,14 +1379,14 @@ python manage.py validate_sector_templates ```bash # Create integration test -touch sag/tests/test_integration_templates.py +touch igny8_core/sag/tests/test_integration_templates.py # Test with 01C (Cluster formation) # Test with 01D (Setup wizard) # Test with 01F (Existing site analysis) # Run full integration test -python manage.py test sag.tests.test_integration_templates --verbosity=2 +python manage.py test igny8_core.sag.tests.test_integration_templates --verbosity=2 ``` --- @@ -1372,7 +1394,7 @@ python manage.py test sag.tests.test_integration_templates --verbosity=2 ## Cross-Reference Index ### Related Documents -- **01A:** SectorAttributeTemplate 
model definition (`sag/models.py`) +- **01A:** SectorAttributeTemplate model definition (`igny8_core/sag/models.py`) - **01C:** Cluster Formation (uses keyword_templates) - **01D:** Setup Wizard (loads templates in Step 3a) - **01F:** Existing Site Analysis (validates against templates) @@ -1381,16 +1403,16 @@ python manage.py test sag.tests.test_integration_templates --verbosity=2 ### Key Files to Create ``` -sag/services/template_service.py (450 lines) -sag/ai_functions/attribute_discovery.py (200 lines) -sag/views/template_views.py (300 lines) -sag/serializers/template_serializers.py (150 lines) -sag/fixtures/sector_templates_seed.json (5000+ lines) -sag/management/commands/seed_sector_templates.py (100 lines) -sag/tests/test_template_service.py (400 lines) -sag/tests/test_attribute_discovery.py (300 lines) -sag/tests/test_template_api.py (500 lines) -sag/tests/test_integration_templates.py (300 lines) +igny8_core/sag/services/template_service.py (450 lines) +igny8_core/ai/functions/discover_sector_attributes.py (200 lines) +igny8_core/sag/views.py (300 lines) +igny8_core/sag/serializers.py (150 lines) +igny8_core/sag/fixtures/sector_templates_seed.json (5000+ lines) +igny8_core/management/commands/seed_sector_templates.py (100 lines) +igny8_core/sag/tests/test_template_service.py (400 lines) +igny8_core/sag/tests/test_attribute_discovery.py (300 lines) +igny8_core/sag/tests/test_template_api.py (500 lines) +igny8_core/sag/tests/test_integration_templates.py (300 lines) ``` ### Total Estimated Effort @@ -1420,6 +1442,6 @@ All code is production-ready and integrates with related documents (01A, 01C, 01 --- -**Document Version:** 1.0 -**Last Updated:** 2026-03-23 +**Document Version:** 1.1 +**Last Updated:** 2025-07-14 **Next Review:** Upon Phase 1 completion diff --git a/v2/V2-Execution-Docs/01C-cluster-formation-keyword-engine.md b/v2/V2-Execution-Docs/01C-cluster-formation-keyword-engine.md index d9e5e8fb..5c955c17 100644 --- 
a/v2/V2-Execution-Docs/01C-cluster-formation-keyword-engine.md +++ b/v2/V2-Execution-Docs/01C-cluster-formation-keyword-engine.md @@ -1,6 +1,10 @@ # IGNY8 Phase 1: Cluster Formation & Keyword Engine (Doc 01C) -**Document Version:** 1.0 +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2025-07-14 + +**Document Version:** 1.1 **Date:** 2026-03-23 **Phase:** Phase 1 - Foundation & Intelligence **Status:** Build Ready @@ -48,7 +52,7 @@ {"name": "Health Condition", "values": ["Allergies", "Arthritis", "Obesity"]} ], "sector_context": { - "sector_id": str, + "sector_id": int, # FK to igny8_core_auth.Sector (BigAutoField PK) "site_type": "ecommerce|saas|blog|local_service", "sector_name": str }, @@ -257,7 +261,7 @@ For each intersection, the AI must answer: } ], "sector_context": { - "sector_id": str, + "sector_id": int, # FK to igny8_core_auth.Sector (BigAutoField PK) "site_type": "ecommerce|saas|blog|local_service", "site_intent": "sell|inform|book|download" }, @@ -503,7 +507,8 @@ keyword_templates = { #### Input Contract ```python assemble_blueprint( - site: Website, # from 01A + site: Site, # igny8_core_auth.Site (integer PK) + sector: Sector, # igny8_core_auth.Sector (integer PK) attributes: List[Tuple[name, values]], # user-populated clusters: List[Dict], # from cluster_formation() keywords: Dict[cluster_id, List[Dict]] # from generate_keywords() @@ -518,7 +523,7 @@ assemble_blueprint( site=site, status='draft', phase='phase_1_foundation', - sector_id=site.sector_id, + sector=sector, created_by=current_user, metadata={ 'version': '1.0', @@ -844,7 +849,8 @@ END FUNCTION #### SAGBlueprint (existing from 01A, extended) ```python -class SAGBlueprint(models.Model): +# Inherits account, created_at, updated_at from AccountBaseModel +class SAGBlueprint(AccountBaseModel): STATUS_CHOICES = ( ('draft', 'Draft'), ('cluster_formation_complete', 'Cluster Formation Complete'), @@ -854,10 +860,10 @@ class 
SAGBlueprint(models.Model): ('published', 'Published'), ) - site = models.ForeignKey(Website, on_delete=models.CASCADE) + site = models.ForeignKey('igny8_core_auth.Site', on_delete=models.CASCADE) status = models.CharField(max_length=50, choices=STATUS_CHOICES, default='draft') phase = models.CharField(max_length=50, default='phase_1_foundation') - sector_id = models.CharField(max_length=100) + sector = models.ForeignKey('igny8_core_auth.Sector', on_delete=models.CASCADE) # Denormalized JSON for fast access attributes_json = models.JSONField(default=dict, blank=True) @@ -865,9 +871,8 @@ class SAGBlueprint(models.Model): taxonomy_plan = models.JSONField(default=dict, blank=True) execution_priority = models.JSONField(default=dict, blank=True) - created_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True) - created_at = models.DateTimeField(auto_now_add=True) - updated_at = models.DateTimeField(auto_now=True) + created_by = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.SET_NULL, null=True) + # created_at, updated_at inherited from AccountBaseModel class Meta: db_table = 'sag_blueprint' @@ -876,13 +881,14 @@ class SAGBlueprint(models.Model): #### SAGAttribute (existing from 01A, no changes required) ```python -class SAGAttribute(models.Model): +# Inherits account, created_at, updated_at from AccountBaseModel +class SAGAttribute(AccountBaseModel): blueprint = models.ForeignKey(SAGBlueprint, on_delete=models.CASCADE) name = models.CharField(max_length=255) values = models.JSONField() # array of strings is_primary = models.BooleanField(default=False) source = models.CharField(max_length=50) # 'user_input', 'template', 'api' - created_at = models.DateTimeField(auto_now_add=True) + # created_at, updated_at inherited from AccountBaseModel class Meta: db_table = 'sag_attribute' @@ -891,7 +897,8 @@ class SAGAttribute(models.Model): #### SAGCluster (existing from 01A, extended) ```python -class SAGCluster(models.Model): +# Inherits account, 
created_at, updated_at from AccountBaseModel +class SAGCluster(AccountBaseModel): TYPE_CHOICES = ( ('product_category', 'Product/Service Category'), ('condition_problem', 'Condition/Problem'), @@ -935,8 +942,7 @@ class SAGCluster(models.Model): keyword_count = models.IntegerField(default=0) status = models.CharField(max_length=50, choices=STATUS_CHOICES, default='draft') - created_at = models.DateTimeField(auto_now_add=True) - updated_at = models.DateTimeField(auto_now=True) + # created_at, updated_at inherited from AccountBaseModel class Meta: db_table = 'sag_cluster' @@ -946,7 +952,8 @@ class SAGCluster(models.Model): #### SAGKeyword (new) ```python -class SAGKeyword(models.Model): +# Inherits account, created_at, updated_at from AccountBaseModel +class SAGKeyword(AccountBaseModel): INTENT_CHOICES = ( ('informational', 'Informational'), ('transactional', 'Transactional'), @@ -987,9 +994,7 @@ class SAGKeyword(models.Model): cpc = models.FloatField(null=True, blank=True) # if available from API competition = models.CharField(max_length=50, blank=True) # 'low', 'medium', 'high' - - created_at = models.DateTimeField(auto_now_add=True) - updated_at = models.DateTimeField(auto_now=True) + # created_at, updated_at inherited from AccountBaseModel class Meta: db_table = 'sag_keyword' @@ -1542,7 +1547,7 @@ populated_attributes = [ ] sector_context = { - "sector_id": "pet_health", + "sector_id": 1, # integer PK (BigAutoField) "site_type": "ecommerce", "sector_name": "Pet Health Products" } @@ -1563,7 +1568,7 @@ populated_attributes = [ ] sector_context = { - "sector_id": "vet_clinic", + "sector_id": 2, # integer PK (BigAutoField) "site_type": "local_service", "sector_name": "Veterinary Clinic" } diff --git a/v2/V2-Execution-Docs/01D-setup-wizard-case2-new-site.md b/v2/V2-Execution-Docs/01D-setup-wizard-case2-new-site.md index d377ae70..546e7bbe 100644 --- a/v2/V2-Execution-Docs/01D-setup-wizard-case2-new-site.md +++ b/v2/V2-Execution-Docs/01D-setup-wizard-case2-new-site.md 
@@ -1,8 +1,12 @@ # IGNY8 Phase 1: Setup Wizard — Case 2 (New Site) ## Document 01D: Build Specification +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2025-07-14 + **Status**: Draft -**Version**: 1.0 +**Version**: 1.1 **Date**: 2026-03-23 **Phase**: Phase 1 — Foundation **Scope**: New site workflow with enhanced Site Structure step @@ -144,8 +148,9 @@ Step 4 → Step 5 → Step 6 ```python # Fields to emphasize for this wizard: -class SAGBlueprint(models.Model): - site_id = models.ForeignKey(Site) +# SAGBlueprint inherits AccountBaseModel (provides account, created_by, etc.) +class SAGBlueprint(AccountBaseModel): + site = models.ForeignKey('igny8_core_auth.Site', on_delete=models.CASCADE) status = models.CharField( choices=['draft', 'active', 'archived'], default='draft' @@ -174,8 +179,9 @@ class SAGBlueprint(models.Model): **Location**: Reference 01A (Attribute Definition) ```python -class SAGAttribute(models.Model): - blueprint = models.ForeignKey(SAGBlueprint) +# SAGAttribute inherits AccountBaseModel (provides account, created_by, etc.) +class SAGAttribute(AccountBaseModel): + blueprint = models.ForeignKey('sag.SAGBlueprint', on_delete=models.CASCADE) name = models.CharField() # e.g., "Target Area" description = models.TextField() @@ -624,7 +630,7 @@ Be conservative: only map if connection is clear. Do not invent values not suppo ### Phase 2: Frontend Components (React) #### Step 2.1: Implement WizardStep3Container -- [ ] Create `frontend/src/components/wizard/WizardStep3Container.jsx` +- [ ] Create `frontend/src/components/wizard/WizardStep3Container.tsx` - [ ] Manage state for all sub-steps (3a–3f): - `currentSubstep` (enum: 'generate', 'review', 'business', 'populate', 'preview', 'confirm') - `attributes` (from API) @@ -644,7 +650,7 @@ Be conservative: only map if connection is clear. 
Do not invent values not suppo --- #### Step 2.2: Implement AttributeReviewPanel (Step 3b) -- [ ] Create `frontend/src/components/wizard/AttributeReviewPanel.jsx` +- [ ] Create `frontend/src/components/wizard/AttributeReviewPanel.tsx` - [ ] Render attributes grouped by level: - **Primary Attributes** section - **Secondary Attributes** section @@ -672,7 +678,7 @@ Be conservative: only map if connection is clear. Do not invent values not suppo --- #### Step 2.3: Implement BusinessDetailsForm (Step 3c) -- [ ] Create `frontend/src/components/wizard/BusinessDetailsForm.jsx` +- [ ] Create `frontend/src/components/wizard/BusinessDetailsForm.tsx` - [ ] Fields: - [ ] **Products** — textarea, accepts comma-separated or line-break list - [ ] **Services** — textarea, same format @@ -697,7 +703,7 @@ Be conservative: only map if connection is clear. Do not invent values not suppo --- #### Step 2.4: Implement BlueprintPreviewPanel (Step 3e) -- [ ] Create `frontend/src/components/wizard/BlueprintPreviewPanel.jsx` +- [ ] Create `frontend/src/components/wizard/BlueprintPreviewPanel.tsx` - [ ] Render tree view of clusters: - [ ] Cluster name (e.g., "Neck Massage Devices") - [ ] Type badge (e.g., "Topic Hub") @@ -757,7 +763,7 @@ Be conservative: only map if connection is clear. Do not invent values not suppo - [ ] Detailed Mode: 3a → 3b → 3c → 3d → 3e → 3f → Step 4 - [ ] Step 4 → Step 5 (always) - [ ] Step 5 → Step 6 (always) -- [ ] Implement state persistence (Redux or context): +- [ ] Implement state persistence (Zustand store): - [ ] Save wizard state to localStorage or session - [ ] Allow user to resume if page refreshes - [ ] Unit test: navigation logic for both modes @@ -905,7 +911,7 @@ Be conservative: only map if connection is clear. 
Do not invent values not suppo ### Functional Criteria #### 5.1: Step 3a — Generate Attributes -- [x] **AC-3a-1**: GET /api/v1/sag/wizard/generate-attributes/ returns attribute framework +- [x] **AC-3a-1**: POST /api/v1/sag/wizard/generate-attributes/ returns attribute framework - [ ] Response includes 4–8 attributes (depending on industry/sectors) - [ ] Each attribute has name, level, suggested_values, description - [ ] Attributes are organized by level (primary → secondary → tertiary) @@ -1074,7 +1080,7 @@ Be conservative: only map if connection is clear. Do not invent values not suppo - [ ] User can navigate freely between steps (prev/next) - [x] **AC-5-3**: Mode selection persists - - [ ] Mode stored in session/Redux state + - [ ] Mode stored in session/Zustand state - [ ] Navigation logic respects mode throughout wizard --- @@ -1188,7 +1194,7 @@ This section provides step-by-step instructions for Claude Code (or equivalent A 2. **Set Up Environment** - Clone repository - - Install dependencies (backend: Django/DRF, frontend: React + Redux, WordPress: plugin SDK) + - Install dependencies (backend: Django >=5.2.7/DRF, frontend: React 19 + Zustand + TypeScript ~5.7.2 + Vite ^6.1.0, WordPress: plugin SDK) - Create feature branch: `feature/wizard-step-3-site-structure` - Ensure tests pass on main branch @@ -1271,7 +1277,7 @@ Location: backend/sag/api/views/wizard.py ``` Location: frontend/src/components/wizard/ -A) WizardStep3Container.jsx (2 hours) +A) WizardStep3Container.tsx (2 hours) - Create state object: { mode: 'quick' | 'detailed', @@ -1284,23 +1290,23 @@ A) WizardStep3Container.jsx (2 hours) - Implement navigation logic (next, prev, skip) - Implement conditional rendering of sub-steps based on mode - Handle loading/error states - - Connect to Redux (or context) for wizard state + - Connect to Zustand store for wizard state -B) AttributeReviewPanel.jsx (1.5 hours) +B) AttributeReviewPanel.tsx (1.5 hours) - Render three sections: Primary, Secondary, Tertiary 
- For each attribute: toggle + values + edit/delete + reorder - Implement inline edit modal for values - Implement "+ Add Custom Attribute" form - Show completeness status (ready/thin/empty) -C) BusinessDetailsForm.jsx (1 hour) +C) BusinessDetailsForm.tsx (1 hour) - Five input fields: products, services, brands, locations, conditions - Implement text parsing (comma-separated, line-break) - Show "x items detected" feedback - Implement validation (at least one field, max 50 items) - Pass data to parent state on change -D) BlueprintPreviewPanel.jsx (1.5 hours) +D) BlueprintPreviewPanel.tsx (1.5 hours) - Render tree view of clusters - Each cluster: name, type badge, keyword count, content plan count - Expand/collapse per cluster @@ -1321,7 +1327,7 @@ Tests: #### Task 6: Integrate Wizard Navigation (2 hours) ``` -Location: frontend/src/routes/wizard.js (or similar routing) +Location: frontend/src/routes/wizard.tsx (or similar routing) - Update router to include Step 3 routes - Implement navigation logic: - Step 1 → Step 2 (always) @@ -1329,15 +1335,15 @@ Location: frontend/src/routes/wizard.js (or similar routing) - Step 3a → Step 3b (Detailed) or Step 3e (Quick) - Step 3b → Step 3c, Step 3c → Step 3d, Step 3d → Step 3e - Step 3e → Step 3f, Step 3f → Step 4 -- Implement state persistence (Redux or localStorage) +- Implement state persistence (Zustand store with localStorage persist) - Test Quick Mode flow and Detailed Mode flow (E2E) ``` #### Task 7: Update Step 1 (Welcome) (1 hour) ``` -Location: frontend/src/components/wizard/WizardStep1.jsx (or similar) +Location: frontend/src/components/wizard/WizardStep1.tsx (or similar) - Add mode selection UI (quick vs. 
detailed) -- Store mode in wizard state (Redux/context) +- Store mode in wizard state (Zustand store) - Pass mode to WizardStep3Container - Test mode selection ``` @@ -1383,7 +1389,7 @@ Location: wordpress-plugin/igny8-blueprint-sync.php (or similar) - Test data persistence and transitions - Frontend: E2E test both wizard flows - - Location: frontend/tests/e2e/wizard.test.js (Selenium/Cypress) + - Location: frontend/tests/e2e/wizard.test.ts (Vitest/Playwright) - Test Quick Mode: 10 min, full journey - Test Detailed Mode: 20 min, full journey - Test error scenarios (invalid input, API failure) @@ -1492,6 +1498,7 @@ Location: wordpress-plugin/igny8-blueprint-sync.php (or similar) | Version | Date | Author | Change | |---------|------|--------|--------| | 1.0 | 2026-03-23 | System | Initial draft | +| 1.1 | 2026-03-23 | Codebase Audit | Fixed: model inheritance (AccountBaseModel), FK app_labels, .jsx→.tsx, Redux→Zustand, GET→POST AC-3a-1, version refs | --- diff --git a/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md b/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md index e92587d4..209fb21f 100644 --- a/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md +++ b/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md @@ -1,4 +1,9 @@ # 01E: Blueprint-Aware Content Pipeline + +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2026-03-23 + **IGNY8 Phase 1: Content Automation with SAG Blueprint Enhancement** --- @@ -150,15 +155,15 @@ ELSE: 2.
`blueprint_context` structure: ```json { - "cluster_id": "uuid", + "cluster_id": "integer", "cluster_name": "string", "cluster_type": "string (topical|product|service)", "cluster_sector": "string", "hub_title": "string (cluster's main hub page title)", "hub_url": "string (blueprint.site.domain/cluster_slug)", "cluster_attributes": ["list of attribute terms"], - "related_clusters": ["list of related cluster ids"], - "cluster_products": ["list of product ids if product cluster"], + "related_clusters": ["list of related cluster integer ids"], + "cluster_products": ["list of product integer ids if product cluster"], "content_structure": "string (guide_tutorial|comparison|review|how_to|question|listicle)", "content_type": "string (cluster_hub|blog_post|product_page|term_page|service_page)", "execution_phase": "integer (1-4)", @@ -287,10 +292,13 @@ execution_priority = { ### Related Models (from 01A, 01C, 01D) ```python -# sag/models.py — SAG Blueprint Structure +# igny8_core/sag/models.py — SAG Blueprint Structure +# DEFAULT_AUTO_FIELD = BigAutoField (integer PKs) -class SAGBlueprint(models.Model): - site = ForeignKey(Site) +from igny8_core.auth.models import AccountBaseModel + +class SAGBlueprint(AccountBaseModel): + site = ForeignKey('igny8_core_auth.Site', on_delete=models.CASCADE) name = CharField(max_length=255) status = CharField(choices=['draft', 'active', 'archived']) created_at = DateTimeField(auto_now_add=True) @@ -303,8 +311,8 @@ class SAGBlueprint(models.Model): # Taxonomy mapping to WordPress custom taxonomies wp_taxonomy_mapping = JSONField() # cluster_id → tax values -class SAGCluster(models.Model): - blueprint = ForeignKey(SAGBlueprint) +class SAGCluster(AccountBaseModel): + blueprint = ForeignKey('sag.SAGBlueprint', on_delete=models.CASCADE) name = CharField(max_length=255) cluster_type = CharField(choices=['topical', 'product', 'service']) sector = CharField(max_length=255) @@ -314,69 +322,135 @@ class SAGCluster(models.Model): updated_at = 
DateTimeField(auto_now=True) ``` -### Pipeline Models (existing) +### Pipeline Models (existing — names are PLURAL per codebase convention) ```python -# content/models.py — Content Pipeline +# igny8_core/business/planning/models.py — Planning Pipeline (app_label: planner) +# DEFAULT_AUTO_FIELD = BigAutoField (integer PKs, NOT UUIDs) -class Keyword(models.Model): - site = ForeignKey(Site) - term = CharField(max_length=255) - source = CharField(choices=['csv_import', 'seed_list', 'user', 'sag_blueprint']) - sag_cluster_id = UUIDField(null=True, blank=True) # NEW: links to blueprint cluster +class Keywords(SoftDeletableModel, SiteSectorBaseModel): + """Site-specific keyword instances referencing global SeedKeywords.""" + seed_keyword = ForeignKey(SeedKeyword, on_delete=models.CASCADE) + volume_override = IntegerField(null=True, blank=True) + difficulty_override = IntegerField(null=True, blank=True) + attribute_values = JSONField(default=list, blank=True) + cluster = ForeignKey('Clusters', on_delete=models.SET_NULL, null=True, blank=True) + status = CharField(max_length=50, choices=[('new','New'),('mapped','Mapped')], default='new') + disabled = BooleanField(default=False) + # NEW: optional SAG cluster link + sag_cluster_id = IntegerField(null=True, blank=True) # Links to sag.SAGCluster PK created_at = DateTimeField(auto_now_add=True) + class Meta: + app_label = 'planner' -class Cluster(models.Model): - site = ForeignKey(Site) - name = CharField(max_length=255) - keywords = JSONField(default=list) - created_by = CharField(choices=['auto_cluster', 'sag_blueprint']) +class Clusters(SoftDeletableModel, SiteSectorBaseModel): + """Keyword clusters — pure topic clusters.""" + name = CharField(max_length=255, db_index=True) + description = TextField(blank=True, null=True) + keywords_count = IntegerField(default=0) + volume = IntegerField(default=0) + mapped_pages = IntegerField(default=0) + status = CharField(max_length=50, choices=[('new','New'),('mapped','Mapped')], 
default='new') + disabled = BooleanField(default=False) + created_at = DateTimeField(auto_now_add=True) + updated_at = DateTimeField(auto_now=True) + class Meta: + app_label = 'planner' -class Idea(models.Model): - site = ForeignKey(Site) - title = CharField(max_length=255) - keyword = ForeignKey(Keyword) - cluster = ForeignKey(Cluster, null=True) - sector = CharField(max_length=255) # NEW - structure = CharField(choices=['guide_tutorial', 'comparison', 'review', 'how_to', 'question', 'listicle']) # NEW - content_type = CharField(choices=['cluster_hub', 'blog_post', 'product_page', 'term_page', 'service_page', 'landing_page', 'business_page']) # NEW - sag_cluster_id = UUIDField(null=True, blank=True) # NEW - idea_source = CharField(choices=['auto_generate', 'sag_blueprint']) # NEW +class ContentIdeas(SoftDeletableModel, SiteSectorBaseModel): + """Content ideas generated from keyword clusters.""" + idea_title = CharField(max_length=255, db_index=True) + description = TextField(blank=True, null=True) + primary_focus_keywords = CharField(max_length=500, blank=True) + target_keywords = CharField(max_length=500, blank=True) + keyword_objects = ManyToManyField('Keywords', blank=True, related_name='content_ideas') + keyword_cluster = ForeignKey('Clusters', on_delete=models.SET_NULL, null=True, blank=True) + status = CharField(max_length=50, choices=[('new','New'),('queued','Queued'),('completed','Completed')], default='new') + disabled = BooleanField(default=False) + estimated_word_count = IntegerField(default=1000) + content_type = CharField(max_length=50, choices=[('post','Post'),('page','Page'),('product','Product'),('taxonomy','Taxonomy')], default='post') + content_structure = CharField(max_length=50, choices=[ + ('article','Article'),('guide','Guide'),('comparison','Comparison'), + ('review','Review'),('listicle','Listicle'),('landing_page','Landing Page'), + ('business_page','Business Page'),('service_page','Service Page'), + 
('general','General'),('cluster_hub','Cluster Hub'),('product_page','Product Page'), + ('category_archive','Category Archive'),('tag_archive','Tag Archive'), + ('attribute_archive','Attribute Archive'), + ], default='article') + # NEW: SAG fields + sag_cluster_id = IntegerField(null=True, blank=True) # Links to sag.SAGCluster PK + idea_source = CharField(choices=['auto_generate', 'sag_blueprint'], null=True, blank=True) # NEW execution_phase = IntegerField(null=True) # NEW: 1-4 from blueprint created_at = DateTimeField(auto_now_add=True) + class Meta: + app_label = 'planner' -class Task(models.Model): - site = ForeignKey(Site) - title = CharField(max_length=255) - idea = ForeignKey(Idea) - status = CharField(choices=['pending', 'assigned', 'in_progress', 'review', 'completed']) - assigned_to = ForeignKey(User, null=True) - sag_cluster_id = UUIDField(null=True, blank=True) # NEW +# igny8_core/business/content/models.py — Content Pipeline (app_label: writer) + +class Tasks(SoftDeletableModel, SiteSectorBaseModel): + """Tasks model for content generation queue.""" + title = CharField(max_length=255, db_index=True) + description = TextField(blank=True, null=True) + cluster = ForeignKey('planner.Clusters', on_delete=models.SET_NULL, null=True, blank=False) + idea = ForeignKey('planner.ContentIdeas', on_delete=models.SET_NULL, null=True, blank=True) + content_type = CharField(max_length=100, choices=[('post','Post'),('page','Page'),('product','Product'),('taxonomy','Taxonomy')], default='post') + content_structure = CharField(max_length=100, choices=[...same as ContentIdeas...], default='article') + taxonomy_term = ForeignKey('ContentTaxonomy', on_delete=models.SET_NULL, null=True, blank=True) + keywords = TextField(blank=True, null=True, help_text='Comma-separated keywords') + word_count = IntegerField(default=1000) + status = CharField(max_length=50, choices=[('queued','Queued'),('completed','Completed')], default='queued') + # NEW: SAG fields + sag_cluster_id = 
IntegerField(null=True, blank=True) # Links to sag.SAGCluster PK blueprint_context = JSONField(null=True, blank=True) # NEW: execution context created_at = DateTimeField(auto_now_add=True) + updated_at = DateTimeField(auto_now=True) + class Meta: + app_label = 'writer' -class Content(models.Model): - site = ForeignKey(Site) - title = CharField(max_length=255) - body = TextField() - task = ForeignKey(Task, null=True) - content_type = CharField(choices=['cluster_hub', 'blog_post', 'product_page', 'term_page', 'service_page', 'landing_page', 'business_page']) # NEW - content_structure = CharField(choices=['guide_tutorial', 'comparison', 'review', 'how_to', 'question', 'listicle']) # NEW - sag_cluster_id = UUIDField(null=True, blank=True) # NEW - taxonomies = JSONField(default=dict, null=True, blank=True) # NEW: custom WP taxonomies - status = CharField(choices=['draft', 'review', 'published']) +class Content(SoftDeletableModel, SiteSectorBaseModel): + """Content model for AI-generated or WordPress-imported content.""" + title = CharField(max_length=255, db_index=True) + content_html = TextField(help_text='Final HTML content') # NOTE: field is content_html, NOT body + word_count = IntegerField(default=0) + meta_title = CharField(max_length=255, blank=True, null=True) + meta_description = TextField(blank=True, null=True) + primary_keyword = CharField(max_length=255, blank=True, null=True) + secondary_keywords = JSONField(default=list, blank=True) + cluster = ForeignKey('planner.Clusters', on_delete=models.SET_NULL, null=True, blank=False) + content_type = CharField(max_length=50, choices=[('post','Post'),('page','Page'),('product','Product'),('taxonomy','Taxonomy')], default='post') + content_structure = CharField(max_length=50, choices=[...same as Tasks...], default='article') + taxonomy_terms = ManyToManyField('ContentTaxonomy', through='ContentTaxonomyRelation', blank=True) + external_id = CharField(max_length=255, blank=True, null=True) + external_url = 
URLField(blank=True, null=True) + source = CharField(max_length=50, choices=[('igny8','IGNY8 Generated'),('wordpress','WordPress Imported')], default='igny8') + status = CharField(max_length=50, choices=[('draft','Draft'),('review','Review'),('approved','Approved'),('published','Published')], default='draft') + # NEW: SAG fields + sag_cluster_id = IntegerField(null=True, blank=True) # Links to sag.SAGCluster PK created_at = DateTimeField(auto_now_add=True) + updated_at = DateTimeField(auto_now=True) + class Meta: + app_label = 'writer' -class Image(models.Model): - content = ForeignKey(Content) - url = URLField() - alt_text = CharField(max_length=255) - style_type = CharField(choices=['hero', 'supporting', 'ecommerce', 'category', 'service', 'conversion']) # NEW - sag_cluster_id = UUIDField(null=True, blank=True) # NEW +class Images(SoftDeletableModel, SiteSectorBaseModel): + """Images model — note: class is Images (plural).""" + content = ForeignKey(Content, on_delete=models.CASCADE, null=True, blank=True) + task = ForeignKey(Tasks, on_delete=models.CASCADE, null=True, blank=True) + image_type = CharField(max_length=50, choices=[('featured','Featured'),('desktop','Desktop'),('mobile','Mobile'),('in_article','In-Article')], default='featured') + image_url = CharField(max_length=500, blank=True, null=True) # NOTE: field is image_url, NOT url + image_path = CharField(max_length=500, blank=True, null=True) + prompt = TextField(blank=True, null=True) # Generation prompt + caption = TextField(blank=True, null=True) # NOTE: field is caption, NOT alt_text + status = CharField(max_length=50, default='pending') + position = IntegerField(default=0) + # NEW: SAG fields + sag_cluster_id = IntegerField(null=True, blank=True) # Links to sag.SAGCluster PK + style_type = CharField(max_length=50, choices=[('hero','Hero'),('supporting','Supporting'),('ecommerce','Ecommerce'),('category','Category'),('service','Service'),('conversion','Conversion')], null=True, blank=True) # NEW 
created_at = DateTimeField(auto_now_add=True) + class Meta: + app_label = 'writer' class Job(models.Model): - """Pipeline execution tracking""" - site = ForeignKey(Site) + """Pipeline execution tracking (NEW model — does not yet exist in codebase).""" + site = ForeignKey('igny8_core_auth.Site', on_delete=models.CASCADE) status = CharField(choices=['pending', 'running', 'completed', 'failed']) stage = IntegerField(choices=[(0, 'Blueprint Check'), (1, 'Keywords'), (2, 'Cluster'), (3, 'Ideas'), (4, 'Tasks'), (5, 'Content'), (6, 'Taxonomy'), (7, 'Images')]) blueprint_mode = CharField(choices=['legacy', 'blueprint_aware']) # NEW @@ -389,24 +463,27 @@ class Job(models.Model): #### Stage 0: Blueprint Check ```python -# celery_app/tasks.py +# igny8_core/tasks.py (Celery app: celery -A igny8_core) @app.task(bind=True, max_retries=3) def check_blueprint(self, site_id): """ Stage 0: Determine execution mode and load blueprint context. + Args: + site_id: integer PK (BigAutoField) + Returns: { 'status': 'success', 'pipeline_mode': 'blueprint_aware' | 'legacy', - 'blueprint_id': 'uuid' (if active), + 'blueprint_id': integer (if active), 'execution_phases': list, 'next_stage': 1 } """ try: - site = Site.objects.get(id=site_id) + site = Site.objects.get(id=site_id) # integer PK lookup job = Job.objects.create(site=site, stage=0, status='running') blueprint = SAGBlueprint.objects.filter( @@ -418,7 +495,7 @@ def check_blueprint(self, site_id): result = { 'status': 'success', 'pipeline_mode': 'blueprint_aware', - 'blueprint_id': str(blueprint.id), + 'blueprint_id': blueprint.id, 'execution_phases': blueprint.execution_priority, } job.blueprint_mode = 'blueprint_aware' @@ -464,7 +541,7 @@ def process_keywords(self, site_id, blueprint_context): blueprint_mode=blueprint_context['pipeline_mode'] ) - keywords = Keyword.objects.filter(site=site, sag_cluster_id__isnull=True) + keywords = Keywords.objects.filter(site=site, sag_cluster_id__isnull=True) if blueprint_context['pipeline_mode'] == 
'blueprint_aware': blueprint = SAGBlueprint.objects.get(id=blueprint_context['blueprint_id']) @@ -479,11 +556,11 @@ def process_keywords(self, site_id, blueprint_context): if cluster: keyword.sag_cluster_id = cluster.id keyword.save() - cluster.keywords.append(keyword.term) + cluster.keywords.append(keyword.keyword) cluster.save() matched_count += 1 else: - unmatched_keywords.append(keyword.term) + unmatched_keywords.append(keyword.keyword) job.log = f"Matched {matched_count} keywords. Unmatched: {unmatched_keywords}" else: @@ -615,15 +692,15 @@ def create_tasks(self, site_id, blueprint_context): blueprint_mode=blueprint_context['pipeline_mode'] ) - ideas = Idea.objects.filter(site=site, task__isnull=True) + ideas = ContentIdeas.objects.filter(site=site, tasks__isnull=True) # reverse query name of Tasks.idea FK is 'tasks' task_count = 0 for idea in ideas: - task = Task.objects.create( + task = Tasks.objects.create( site=site, - title=idea.title, + title=idea.idea_title, idea=idea, - status='pending' + status='queued' # Tasks.STATUS_CHOICES: queued/completed ) if blueprint_context['pipeline_mode'] == 'blueprint_aware' and idea.sag_cluster_id: @@ -632,14 +709,14 @@ def create_tasks(self, site_id, blueprint_context): task.sag_cluster_id = idea.sag_cluster_id task.blueprint_context = { - 'cluster_id': str(cluster.id), + 'cluster_id': cluster.id, 'cluster_name': cluster.name, 'cluster_type': cluster.cluster_type, 'cluster_sector': cluster.sector, 'hub_title': blueprint.content_plan.get(str(cluster.id), {}).get('hub_title'), 'hub_url': f"{site.domain}/hubs/{cluster.name.lower().replace(' ', '-')}", 'cluster_attributes': cluster.attributes, - 'content_structure': idea.structure, + 'content_structure': idea.content_structure, 'content_type': idea.content_type, 'execution_phase': idea.execution_phase, } @@ -683,7 +760,7 @@ def generate_content(self, site_id, blueprint_context): blueprint_mode=blueprint_context['pipeline_mode'] ) - tasks = Task.objects.filter(site=site, status='completed', content__isnull=True) + tasks =
Tasks.objects.filter(site=site, status='queued') # v2 Tasks choices: queued/completed; Content no longer has a task FK content_count = 0 for task in tasks: @@ -795,7 +872,7 @@ def assign_taxonomy(self, site_id, blueprint_context): cluster = SAGCluster.objects.get(id=content.sag_cluster_id) # Load taxonomy mapping from blueprint - tax_mapping = blueprint.wp_taxonomy_mapping.get(str(cluster.id), {}) + tax_mapping = blueprint.wp_taxonomy_mapping.get(str(cluster.id), {}) # JSONField dict keys are strings, even with integer PKs # Assign taxonomies content.taxonomies = tax_mapping @@ -863,7 +940,7 @@ def generate_images(self, site_id, blueprint_context): # Generate featured image featured_image = GenerateImage(content.title, style) - image = Image.objects.create( + image = Images.objects.create( content=content, - url=featured_image['url'], - alt_text=featured_image['alt_text'], + image_url=featured_image['url'], # Images field is image_url, not url + caption=featured_image['alt_text'], # Images field is caption, not alt_text @@ -1019,7 +1096,7 @@ redis-server # Create sample site and blueprint python manage.py shell << EOF from django.contrib.auth.models import User -from sites.models import Site +from igny8_core.auth.models import Site from sag.models import SAGBlueprint, SAGCluster site = Site.objects.create(name="Test Site", domain="test.local") @@ -1052,27 +1129,25 @@ EOF #### Execute Pipeline Stages ```bash # Start Celery worker (in separate terminal) -celery -A igny8.celery_app worker --loglevel=info +celery -A igny8_core worker --loglevel=info # Run Stage 0: Blueprint Check python manage.py shell << EOF -from celery_app.tasks import check_blueprint -result = check_blueprint.delay(site_id="") +from igny8_core.tasks import check_blueprint +result = check_blueprint.delay(site_id=1) # integer PK (BigAutoField) print(result.get()) EOF # Run full pipeline python manage.py shell << EOF -from celery_app.tasks import check_blueprint -from uuid import UUID - -site_id = UUID("") +from igny8_core.tasks import check_blueprint +site_id = 1 # integer PK (BigAutoField) check_blueprint.delay(site_id) # Each stage automatically chains to the next EOF # Monitor pipeline execution -celery -A igny8.celery_app events +celery -A igny8_core events # or view logs: tail
-f celery.log ``` @@ -1080,20 +1155,20 @@ celery -A igny8.celery_app events #### Unit Tests ```bash -pytest content/tests/test_pipeline.py -v -pytest sag/tests/test_blueprint.py -v -pytest celery_app/tests/test_tasks.py -v +pytest igny8_core/business/content/tests/test_pipeline.py -v +pytest igny8_core/sag/tests/test_blueprint.py -v +pytest igny8_core/tests/test_tasks.py -v ``` #### Integration Test ```bash -pytest content/tests/test_pipeline_integration.py::test_full_blueprint_pipeline -v +pytest igny8_core/business/content/tests/test_pipeline_integration.py::test_full_blueprint_pipeline -v # Test legacy mode -pytest content/tests/test_pipeline_integration.py::test_full_legacy_pipeline -v +pytest igny8_core/business/content/tests/test_pipeline_integration.py::test_full_legacy_pipeline -v # Test mixed mode (some sites with blueprint, some without) -pytest content/tests/test_pipeline_integration.py::test_mixed_mode_execution -v +pytest igny8_core/business/content/tests/test_pipeline_integration.py::test_mixed_mode_execution -v ``` #### Manual Test Scenario @@ -1103,37 +1178,37 @@ python manage.py shell < scripts/setup_test_data.py # 2. Import sample keywords python manage.py shell << EOF -from content.models import Keyword -from sites.models import Site +from igny8_core.business.planning.models import Keywords, SeedKeyword +from igny8_core.auth.models import Site site = Site.objects.get(name="Test Site") keywords = ["python tutorial", "django rest", "web scraping"] for kw in keywords: - Keyword.objects.create(site=site, term=kw, source='csv_import') + seed = SeedKeyword.objects.create(keyword=kw) # field name assumed; Keywords references a global SeedKeyword + Keywords.objects.create(site=site, seed_keyword=seed) EOF # 3.
Run pipeline -celery -A igny8.celery_app worker --loglevel=debug & +celery -A igny8_core worker --loglevel=debug & python manage.py shell << EOF -from celery_app.tasks import check_blueprint -from sites.models import Site +from igny8_core.tasks import check_blueprint +from igny8_core.auth.models import Site site = Site.objects.get(name="Test Site") check_blueprint.delay(site.id) EOF # 4. Inspect results python manage.py shell << EOF -from content.models import Keyword, Idea, Task, Content, Image -from sites.models import Site +from igny8_core.business.planning.models import Keywords, ContentIdeas +from igny8_core.business.content.models import Tasks, Content, Images +from igny8_core.auth.models import Site site = Site.objects.get(name="Test Site") -print("Keywords:", Keyword.objects.filter(site=site).count()) -print("Ideas:", Idea.objects.filter(site=site).count()) -print("Tasks:", Task.objects.filter(site=site).count()) +print("Keywords:", Keywords.objects.filter(site=site).count()) +print("Ideas:", ContentIdeas.objects.filter(site=site).count()) +print("Tasks:", Tasks.objects.filter(site=site).count()) print("Content:", Content.objects.filter(site=site).count()) -print("Images:", Image.objects.filter(site=site).count()) +print("Images:", Images.objects.filter(site=site).count()) # Check blueprint context -task = Task.objects.filter(site=site, blueprint_context__isnull=False).first() +task = Tasks.objects.filter(site=site, blueprint_context__isnull=False).first() if task: print("Blueprint context:", task.blueprint_context) EOF @@ -1146,7 +1221,7 @@ EOF # Check if blueprint exists and is active python manage.py shell << EOF from sag.models import SAGBlueprint -from sites.models import Site +from igny8_core.auth.models import Site site = Site.objects.get(id="") blueprint = SAGBlueprint.objects.filter(site=site, status='active').first() print(f"Blueprint: {blueprint}") @@ -1160,9 +1235,9 @@ EOF ```bash # Check keyword-cluster mapping python manage.py shell << EOF -from
content.models import Keyword +from igny8_core.business.planning.models import Keywords from sag.models import SAGCluster -keywords = Keyword.objects.filter(sag_cluster_id__isnull=True) +keywords = Keywords.objects.filter(sag_cluster_id__isnull=True) -print(f"Unmatched keywords: {[kw.term for kw in keywords]}") +print(f"Unmatched keywords: {[kw.keyword for kw in keywords]}") # Check available clusters @@ -1176,16 +1251,16 @@ EOF ```bash # Check task status python manage.py shell << EOF -from content.models import Task -tasks = Task.objects.all() +from igny8_core.business.content.models import Tasks +tasks = Tasks.objects.all() for task in tasks: print(f"Task {task.id}: status={task.status}, blueprint_context={bool(task.blueprint_context)}") EOF # Check Celery task logs -celery -A igny8.celery_app inspect active -celery -A igny8.celery_app inspect reserved -celery -A igny8.celery_app purge # WARNING: clears queue +celery -A igny8_core inspect active +celery -A igny8_core inspect reserved +celery -A igny8_core purge # WARNING: clears queue ``` ### Extending with Custom Prompt Templates @@ -1225,7 +1300,7 @@ PROMPT_TEMPLATES = { ```bash # View pipeline execution history python manage.py shell << EOF -from content.models import Job +from igny8_core.business.content.models import Job jobs = Job.objects.filter(stage=5).order_by('-created_at')[:10] for job in jobs: duration = (job.completed_at - job.created_at).total_seconds() if job.completed_at else None diff --git a/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md.bak b/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md.bak new file mode 100644 index 00000000..e92587d4 --- /dev/null +++ b/v2/V2-Execution-Docs/01E-blueprint-aware-pipeline.md.bak @@ -0,0 +1,1265 @@ +# 01E: Blueprint-Aware Content Pipeline +**IGNY8 Phase 1: Content Automation with SAG Blueprint Enhancement** + +--- + +## 1.
CURRENT STATE + +### Existing Pipeline Architecture +IGNY8's content pipeline operates as a 7-stage automated system, orchestrated via Celery with scheduled execution (daily/weekly/monthly via Celery Beat): + +| Stage | Function | Automation | Output | +|-------|----------|-----------|--------| +| 1 | Keywords | Import CSV/seed lists | Keyword list per site | +| 2 | Clusters | AutoClusterKeywords (GPT-4) | Semantic keyword groups | +| 3 | Ideas | GenerateIdeas | Content brief queue | +| 4 | Tasks | Queue creation | Writer task list | +| 5 | Content | GenerateContent (AI) | Draft articles | +| 6 | Images | GenerateImages | Featured + in-article images | +| 7 | Review | Editorial queue | Published content | + +### Current Limitations +- **Generic clustering**: All keywords grouped by semantic similarity, no business-specific structure +- **One-size-fits-all content**: All articles follow same template regardless of content type +- **No hierarchy**: No distinction between hub pages, blog posts, product pages, term pages, or service pages +- **No priority**: All content treated equally; foundational content (hubs) may not be written first +- **No taxonomy integration**: Generated content not automatically assigned to custom taxonomies +- **No blueprint context**: Writers receive keywords but not strategic framework + +### Celery Automation Context +- **Celery Beat**: Manages recurring schedule (daily, weekly, monthly per site) +- **Task Queue**: Each stage enqueued as separate Celery task +- **State Tracking**: Uses Django ORM to track Job, Stage, Keyword, Cluster, Idea, Task, Content, Image models +- **Failure Handling**: Retry logic, dead-letter queue for failed tasks +- **Logging**: Structured logging to track execution per site per stage + +--- + +## 2. 
WHAT TO BUILD + +### Vision: Blueprint-Driven Pipeline +When a site has an **active SAG Blueprint**, every pipeline stage becomes context-aware: +- Content priorities driven by blueprint's execution phases +- Content types (hub, blog, product, term, service) determined at ideation +- Prompt templates matched to content structure and type +- Output taxonomy-tagged and cluster-assigned automatically + +When **no blueprint exists**, the pipeline reverts to legacy mode—no breaking changes. + +### New/Enhanced Stages + +#### Stage 0: Blueprint Check (NEW) +Execute before pipeline stages 1–7. + +**Responsibility**: Determine execution mode and load context. + +**Logic**: +```python +IF Site.sag_blueprint EXISTS AND sag_blueprint.status == 'active': + LOAD blueprint + IDENTIFY unfulfilled content needs from blueprint.content_plan + DETERMINE execution_priority from blueprint.execution_phases + SET pipeline_mode = 'blueprint_aware' +ELSE: + SET pipeline_mode = 'legacy' + PROCEED to Stage 1 with no blueprint context +``` + +**Outputs**: +- `pipeline_mode`: 'blueprint_aware' | 'legacy' +- `blueprint_context`: SAGBlueprint instance (if active) +- `execution_phases`: List of priority phases for content queue + +--- + +#### Stage 1: Keyword Processing (ENHANCED) +**Legacy behavior** (no blueprint): Pass keywords to Stage 2 unchanged. + +**Blueprint-aware** (active blueprint): +1. For each new/imported keyword, query blueprint's SAGClusters +2. Match keyword to existing clusters based on: + - Attribute overlap (e.g., keyword "sustainable farming" matches cluster with attribute "sustainability") + - Semantic proximity to cluster topic + - Sector alignment +3. Assign matched keyword to cluster's `keywords` list +4. Flag unmatched keywords: + - **Gap**: No cluster exists for this topic + - **Outlier**: Keyword semantic distance > threshold from all clusters + - **Frontier**: Keyword extends cluster into new subtopic (possible new cluster) +5. 
Update `SAGCluster.keywords`, `SAGCluster.updated_at` + +**Outputs**: +- Updated cluster keyword lists +- Gap/outlier report for content strategy review +- Flagged keywords for potential new cluster formation + +--- + +#### Stage 2: AI Cluster Keywords (ENHANCED) +**Legacy behavior** (no blueprint): Run existing `AutoClusterKeywords` via GPT-4 grouping. + +**Blueprint-aware** (active blueprint): +1. **SKIP** `AutoClusterKeywords` entirely +2. Clusters already defined by SAG framework (Stage 0 loaded blueprint) +3. For new keywords from Stage 1: + - Map to existing clusters (already done in Stage 1) + - Create mapping record linking keyword → SAGCluster +4. Flag unmatched keywords (from Stage 1) for manual review +5. No new clusters created (cluster formation is Phase 1C process, not pipeline) + +**Outputs**: +- Keyword-to-cluster mapping +- Unmatched keyword report + +--- + +#### Stage 3: Generate Content Ideas (ENHANCED) +**Legacy behavior** (no blueprint): Run existing `GenerateIdeas` function. + +**Blueprint-aware** (active blueprint): +1. Call `sag/ai_functions/content_planning.py::GenerateIdeasWithBlueprint` +2. For each idea generated, enrich with: + - **Sector**: From SAGCluster.sector + - **Structure**: From blueprint.content_plan[cluster].structure (e.g., 'guide_tutorial', 'comparison', 'review', 'how_to', 'question') + - **Type**: From blueprint.content_plan[cluster].type (e.g., 'cluster_hub', 'blog_post', 'product_page', 'term_page', 'service_page') + - **SAGCluster ID**: Link idea to blueprint cluster + - **idea_source**: Set to 'sag_blueprint' +3. Respect execution phases: + - Phase 1: Generate ideas for `category_pages`, `top_cluster_hubs` + - Phase 2: Generate ideas for `remaining_hubs`, `first_blogs_per_cluster` + - Phase 3: Generate ideas for `attribute_term_pages`, `product_enrichment` + - Phase 4: Generate ideas for `additional_blogs`, `brand_comparisons` +4. 
Prioritize queuing by phase + +**Outputs**: +- Idea records with type, structure, sector, cluster assignment +- Execution phase assignments +- Queue prioritized by phase + +--- + +#### Stage 4: Create Writer Tasks (ENHANCED) +**Legacy behavior** (no blueprint): Create basic task with keyword/idea reference. + +**Blueprint-aware** (active blueprint): +1. For each idea, create Task with: + - Standard fields: title, keyword, site, status, assigned_to + - **New fields**: + - `sag_cluster_id`: Reference to blueprint cluster + - `blueprint_context`: JSON blob containing execution context +2. `blueprint_context` structure: + ```json + { + "cluster_id": "uuid", + "cluster_name": "string", + "cluster_type": "string (topical|product|service)", + "cluster_sector": "string", + "hub_title": "string (cluster's main hub page title)", + "hub_url": "string (blueprint.site.domain/cluster_slug)", + "cluster_attributes": ["list of attribute terms"], + "related_clusters": ["list of related cluster ids"], + "cluster_products": ["list of product ids if product cluster"], + "content_structure": "string (guide_tutorial|comparison|review|how_to|question|listicle)", + "content_type": "string (cluster_hub|blog_post|product_page|term_page|service_page)", + "execution_phase": "integer (1-4)", + "seo_strategy": "object (primary_keyword, related_keywords, intent)" + } + ``` +3. If no blueprint: Create task without blueprint_context (legacy) + +**Outputs**: +- Task records with sag_cluster_id and blueprint_context + +--- + +#### Stage 5: Generate Article Content (ENHANCED) +**Legacy behavior** (no blueprint): Run existing `GenerateContent` with generic prompt. + +**Blueprint-aware** (has blueprint_context): +1. 
**Load prompt template** by content_type + content_structure combination: + + | Content Type | Structure | Template Key | + |---|---|---| + | Cluster Hub | Guide Tutorial | `sag_hub_guide` | + | Cluster Hub | Top Listicle | `sag_hub_listicle` | + | Blog Post | Comparison | `sag_blog_comparison` | + | Blog Post | Review | `sag_blog_review` | + | Blog Post | How To | `sag_blog_howto` | + | Blog Post | Question | `sag_blog_question` | + | Term Page | Guide Tutorial | `sag_term_page` | + | Product Page | Review | `sag_product_page` | + | Service Page | Guide Tutorial | `sag_service_page` | + | Landing Page | Guide Tutorial | `sag_landing_guide` | + | Landing Page | Comparison | `sag_landing_comparison` | + | Business Page | Guide Tutorial | `sag_business_guide` | + +2. **Inject blueprint context variables** into prompt template: + ``` + {cluster_name} → From SAGCluster.name + {cluster_type} → From SAGCluster.cluster_type + {cluster_sector} → From SAGCluster.sector + {hub_title} → From blueprint_context.hub_title + {hub_url} → From blueprint_context.hub_url + {attribute_terms} → Comma-separated list from cluster attributes + {cluster_products} → Product list if product cluster + {related_clusters} → Related cluster names for internal linking + {content_structure} → Structure type for consistency + {content_type} → Content type for tone/depth + ``` + +3. Call GPT-4 with enriched prompt template + +4. Post-process output: + - Add internal links to related cluster hubs + - Add cross-references to attribute term pages + - Inject CTA appropriate to content type (e.g., product link for product cluster) + +5. If no blueprint_context: Run legacy `GenerateContent` unchanged + +**Outputs**: +- Content record with body, title, sag_cluster_id, content_type, content_structure + +--- + +#### Stage 6: Taxonomy Assignment (NEW) +Execute after content generation, **only if blueprint exists**. + +**Responsibility**: Auto-assign content to custom WP taxonomies derived from blueprint. 
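The draft → partial → complete status transition this stage performs can be isolated into a small pure helper, which keeps it testable outside the Celery task. A hedged sketch (the function name and count parameters are illustrative, not part of the IGNY8 codebase):

```python
# Illustrative helper, not actual IGNY8 code: derive a cluster's status
# from how much of its planned content already exists.
def next_cluster_status(current: str, existing_count: int, planned_count: int) -> str:
    """Return the SAGCluster status implied by content counts."""
    if planned_count and existing_count >= planned_count:
        return 'complete'  # every planned piece of content exists
    if existing_count > 0:
        return 'partial'   # first content has landed in the cluster
    return current         # no content yet: keep 'draft' (or prior status)
```

Stage 6 would call something like this after each assignment, with the planned count derived from the blueprint's `content_plan` entry for the cluster.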

**Logic**:
1. Load the site's custom taxonomies from the blueprint (`SAGBlueprint.wp_taxonomy_mapping`, keyed by cluster ID)
2. For generated content:
   - Match content to cluster's attributes and taxonomy terms
   - Assign custom taxonomy values from blueprint mapping
   - Set `content.sag_cluster_id` (links to blueprint structure)
   - Update cluster status:
     - If first content in cluster: set `SAGCluster.status = 'partial'`
     - If all planned content exists: set `SAGCluster.status = 'complete'`
3. Store taxonomy assignments in `Content.taxonomies` JSON field

**Outputs**:
- Content records tagged with custom taxonomies
- Cluster status updated to reflect content completion

---

#### Stage 7: Image Generation (ENHANCED)
**Legacy behavior** (no blueprint): Generate generic featured + in-article images.

**Blueprint-aware** (blueprint exists):
1. Match image style to content type:
   - **Hub page**: Hero/authority style (professional, comprehensive)
   - **Blog post**: Supporting/educational (friendly, illustrative)
   - **Product page**: E-commerce standard (product-focused, clean)
   - **Term page**: Category representation (taxonomy icon or concept illustration)
   - **Service page**: Service illustration (professional, trustworthy)
   - **Landing page**: Conversion-focused (compelling, aspirational)
2. Use cluster theme/color palette from blueprint for style consistency
3. Generate alt text leveraging content_structure + cluster context
4. 
If no blueprint: Generate images with default style + +**Outputs**: +- Image records with style type, alt text, sag_cluster_id + +--- + +### Execution Priority (Blueprint-Driven) +Pipeline processes content by `SAGBlueprint.execution_priority` phases: + +```python +execution_priority = { + "phase_1": ["category_pages", "top_cluster_hubs"], + "phase_2": ["remaining_hubs", "first_blogs_per_cluster"], + "phase_3": ["attribute_term_pages", "product_enrichment"], + "phase_4": ["additional_blogs", "brand_comparisons"] +} +``` + +**Queue behavior**: +- Stage 3 filters ideas by phase +- Stage 4 prioritizes tasks by phase +- Celery task enqueuing respects phase order +- **Rationale**: Foundational content (hubs) published before supporting content (blogs) + +--- + +## 3. DATA MODELS / APIs + +### Related Models (from 01A, 01C, 01D) +```python +# sag/models.py — SAG Blueprint Structure + +class SAGBlueprint(models.Model): + site = ForeignKey(Site) + name = CharField(max_length=255) + status = CharField(choices=['draft', 'active', 'archived']) + created_at = DateTimeField(auto_now_add=True) + updated_at = DateTimeField(auto_now=True) + + # Phase-based execution plan + execution_priority = JSONField(default=dict) # phases 1-4 + content_plan = JSONField() # cluster_id → content specs + + # Taxonomy mapping to WordPress custom taxonomies + wp_taxonomy_mapping = JSONField() # cluster_id → tax values + +class SAGCluster(models.Model): + blueprint = ForeignKey(SAGBlueprint) + name = CharField(max_length=255) + cluster_type = CharField(choices=['topical', 'product', 'service']) + sector = CharField(max_length=255) + keywords = JSONField(default=list) + attributes = JSONField(default=list) + status = CharField(choices=['draft', 'partial', 'complete']) + updated_at = DateTimeField(auto_now=True) +``` + +### Pipeline Models (existing) +```python +# content/models.py — Content Pipeline + +class Keyword(models.Model): + site = ForeignKey(Site) + term = CharField(max_length=255) + source 
= CharField(choices=['csv_import', 'seed_list', 'user', 'sag_blueprint']) + sag_cluster_id = UUIDField(null=True, blank=True) # NEW: links to blueprint cluster + created_at = DateTimeField(auto_now_add=True) + +class Cluster(models.Model): + site = ForeignKey(Site) + name = CharField(max_length=255) + keywords = JSONField(default=list) + created_by = CharField(choices=['auto_cluster', 'sag_blueprint']) + +class Idea(models.Model): + site = ForeignKey(Site) + title = CharField(max_length=255) + keyword = ForeignKey(Keyword) + cluster = ForeignKey(Cluster, null=True) + sector = CharField(max_length=255) # NEW + structure = CharField(choices=['guide_tutorial', 'comparison', 'review', 'how_to', 'question', 'listicle']) # NEW + content_type = CharField(choices=['cluster_hub', 'blog_post', 'product_page', 'term_page', 'service_page', 'landing_page', 'business_page']) # NEW + sag_cluster_id = UUIDField(null=True, blank=True) # NEW + idea_source = CharField(choices=['auto_generate', 'sag_blueprint']) # NEW + execution_phase = IntegerField(null=True) # NEW: 1-4 from blueprint + created_at = DateTimeField(auto_now_add=True) + +class Task(models.Model): + site = ForeignKey(Site) + title = CharField(max_length=255) + idea = ForeignKey(Idea) + status = CharField(choices=['pending', 'assigned', 'in_progress', 'review', 'completed']) + assigned_to = ForeignKey(User, null=True) + sag_cluster_id = UUIDField(null=True, blank=True) # NEW + blueprint_context = JSONField(null=True, blank=True) # NEW: execution context + created_at = DateTimeField(auto_now_add=True) + +class Content(models.Model): + site = ForeignKey(Site) + title = CharField(max_length=255) + body = TextField() + task = ForeignKey(Task, null=True) + content_type = CharField(choices=['cluster_hub', 'blog_post', 'product_page', 'term_page', 'service_page', 'landing_page', 'business_page']) # NEW + content_structure = CharField(choices=['guide_tutorial', 'comparison', 'review', 'how_to', 'question', 'listicle']) # NEW + 
sag_cluster_id = UUIDField(null=True, blank=True) # NEW + taxonomies = JSONField(default=dict, null=True, blank=True) # NEW: custom WP taxonomies + status = CharField(choices=['draft', 'review', 'published']) + created_at = DateTimeField(auto_now_add=True) + +class Image(models.Model): + content = ForeignKey(Content) + url = URLField() + alt_text = CharField(max_length=255) + style_type = CharField(choices=['hero', 'supporting', 'ecommerce', 'category', 'service', 'conversion']) # NEW + sag_cluster_id = UUIDField(null=True, blank=True) # NEW + created_at = DateTimeField(auto_now_add=True) + +class Job(models.Model): + """Pipeline execution tracking""" + site = ForeignKey(Site) + status = CharField(choices=['pending', 'running', 'completed', 'failed']) + stage = IntegerField(choices=[(0, 'Blueprint Check'), (1, 'Keywords'), (2, 'Cluster'), (3, 'Ideas'), (4, 'Tasks'), (5, 'Content'), (6, 'Taxonomy'), (7, 'Images')]) + blueprint_mode = CharField(choices=['legacy', 'blueprint_aware']) # NEW + log = TextField(default='') + created_at = DateTimeField(auto_now_add=True) + completed_at = DateTimeField(null=True) +``` + +### API Endpoints (Celery Task Functions) + +#### Stage 0: Blueprint Check +```python +# celery_app/tasks.py + +@app.task(bind=True, max_retries=3) +def check_blueprint(self, site_id): + """ + Stage 0: Determine execution mode and load blueprint context. 

    Returns:
        {
            'status': 'success',
            'pipeline_mode': 'blueprint_aware' | 'legacy',
            'blueprint_id': 'uuid' (if active),
            'execution_phases': list,
            'next_stage': 1
        }
    """
    try:
        site = Site.objects.get(id=site_id)
        job = Job.objects.create(site=site, stage=0, status='running')

        blueprint = SAGBlueprint.objects.filter(
            site=site,
            status='active'
        ).first()

        if blueprint:
            result = {
                'status': 'success',
                'pipeline_mode': 'blueprint_aware',
                'blueprint_id': str(blueprint.id),
                'execution_phases': blueprint.execution_priority,
                'next_stage': 1,
            }
            job.blueprint_mode = 'blueprint_aware'
        else:
            result = {
                'status': 'success',
                'pipeline_mode': 'legacy',
                'blueprint_id': None,
                'execution_phases': None,
                'next_stage': 1,
            }
            job.blueprint_mode = 'legacy'

        job.status = 'completed'
        job.save()

        # Chain to Stage 1
        process_keywords.delay(site_id, result)

        return result
    except Exception as e:
        self.retry(exc=e, countdown=60)
```

#### Stage 1: Keyword Processing
```python
@app.task(bind=True, max_retries=3)
def process_keywords(self, site_id, blueprint_context):
    """
    Stage 1: Process keywords and optionally map to SAGClusters. 

    If blueprint_context['pipeline_mode'] == 'blueprint_aware':
        - Map keywords to existing SAGClusters
        - Flag unmatched keywords
    Else:
        - Pass keywords to next stage unchanged
    """
    try:
        site = Site.objects.get(id=site_id)
        job = Job.objects.create(
            site=site,
            stage=1,
            status='running',
            blueprint_mode=blueprint_context['pipeline_mode']
        )

        # Materialize the queryset now: saving sag_cluster_id below would
        # otherwise shrink it on re-evaluation and skew the final count
        keywords = list(Keyword.objects.filter(site=site, sag_cluster_id__isnull=True))

        if blueprint_context['pipeline_mode'] == 'blueprint_aware':
            blueprint = SAGBlueprint.objects.get(id=blueprint_context['blueprint_id'])
            clusters = SAGCluster.objects.filter(blueprint=blueprint)

            matched_count = 0
            unmatched_keywords = []

            for keyword in keywords:
                # Semantic matching: find best cluster
                cluster = _match_keyword_to_cluster(keyword, clusters)
                if cluster:
                    keyword.sag_cluster_id = cluster.id
                    keyword.save()
                    cluster.keywords.append(keyword.term)
                    cluster.save()
                    matched_count += 1
                else:
                    unmatched_keywords.append(keyword.term)

            job.log = f"Matched {matched_count} keywords. Unmatched: {unmatched_keywords}"
        else:
            job.log = "Legacy mode: keywords passed unchanged"

        job.status = 'completed'
        job.save()

        # Chain to Stage 2
        cluster_keywords.delay(site_id, blueprint_context)

        return {'status': 'success', 'keywords_processed': len(keywords)}
    except Exception as e:
        self.retry(exc=e, countdown=60)


def _match_keyword_to_cluster(keyword, clusters):
    """Find best-matching SAGCluster for keyword via embedding similarity."""
    # Uses semantic search (embeddings) to find best cluster match
    # Returns SAGCluster or None — embedding matching implemented in Phase C
    return None
```

#### Stage 2: AI Cluster Keywords
```python
@app.task(bind=True, max_retries=3)
def cluster_keywords(self, site_id, blueprint_context):
    """
    Stage 2: Cluster keywords. 
+ + If blueprint_aware: + - SKIP AutoClusterKeywords + - Use blueprint clusters from Stage 0 + Else: + - Run AutoClusterKeywords (existing function) + """ + try: + site = Site.objects.get(id=site_id) + job = Job.objects.create( + site=site, + stage=2, + status='running', + blueprint_mode=blueprint_context['pipeline_mode'] + ) + + if blueprint_context['pipeline_mode'] == 'blueprint_aware': + # Clusters already exist from blueprint + clusters = SAGCluster.objects.filter( + blueprint_id=blueprint_context['blueprint_id'] + ) + job.log = f"Using {clusters.count()} blueprint clusters" + else: + # Run existing AutoClusterKeywords + clusters = AutoClusterKeywords(site_id) + job.log = f"AutoClusterKeywords created {clusters.count()} clusters" + + job.status = 'completed' + job.save() + + # Chain to Stage 3 + generate_ideas.delay(site_id, blueprint_context) + + return {'status': 'success', 'clusters': clusters.count()} + except Exception as e: + self.retry(exc=e, countdown=60) +``` + +#### Stage 3: Generate Content Ideas +```python +@app.task(bind=True, max_retries=3) +def generate_ideas(self, site_id, blueprint_context): + """ + Stage 3: Generate content ideas. 
+ + If blueprint_aware: + - Call GenerateIdeasWithBlueprint + - Enrich ideas with type, structure, sector + - Respect execution phases + Else: + - Call existing GenerateIdeas + """ + try: + site = Site.objects.get(id=site_id) + job = Job.objects.create( + site=site, + stage=3, + status='running', + blueprint_mode=blueprint_context['pipeline_mode'] + ) + + if blueprint_context['pipeline_mode'] == 'blueprint_aware': + blueprint = SAGBlueprint.objects.get(id=blueprint_context['blueprint_id']) + ideas = GenerateIdeasWithBlueprint(site, blueprint) + job.log = f"Generated {len(ideas)} blueprint-aware ideas across {len(blueprint_context['execution_phases'])} phases" + else: + ideas = GenerateIdeas(site) + job.log = f"Generated {len(ideas)} legacy ideas" + + job.status = 'completed' + job.save() + + # Chain to Stage 4 + create_tasks.delay(site_id, blueprint_context) + + return {'status': 'success', 'ideas': len(ideas)} + except Exception as e: + self.retry(exc=e, countdown=60) +``` + +#### Stage 4: Create Writer Tasks +```python +@app.task(bind=True, max_retries=3) +def create_tasks(self, site_id, blueprint_context): + """ + Stage 4: Create writer tasks. 

    If blueprint_aware:
        - Enrich task with sag_cluster_id and blueprint_context JSON
        - Respect execution phase priority
    Else:
        - Create basic tasks
    """
    try:
        site = Site.objects.get(id=site_id)
        job = Job.objects.create(
            site=site,
            stage=4,
            status='running',
            blueprint_mode=blueprint_context['pipeline_mode']
        )

        # Phase 1 first; legacy ideas (NULL execution_phase) sort last on PostgreSQL
        ideas = Idea.objects.filter(site=site, task__isnull=True).order_by('execution_phase')

        task_count = 0
        for idea in ideas:
            task = Task.objects.create(
                site=site,
                title=idea.title,
                idea=idea,
                status='pending'
            )

            if blueprint_context['pipeline_mode'] == 'blueprint_aware' and idea.sag_cluster_id:
                cluster = SAGCluster.objects.get(id=idea.sag_cluster_id)
                blueprint = cluster.blueprint

                task.sag_cluster_id = idea.sag_cluster_id
                task.blueprint_context = {
                    'cluster_id': str(cluster.id),
                    'cluster_name': cluster.name,
                    'cluster_type': cluster.cluster_type,
                    'cluster_sector': cluster.sector,
                    'hub_title': blueprint.content_plan.get(str(cluster.id), {}).get('hub_title'),
                    'hub_url': f"{site.domain}/hubs/{cluster.name.lower().replace(' ', '-')}",
                    'cluster_attributes': cluster.attributes,
                    'content_structure': idea.structure,
                    'content_type': idea.content_type,
                    'execution_phase': idea.execution_phase,
                }
                task.save()

            task_count += 1

        job.log = f"Created {task_count} tasks"
        job.status = 'completed'
        job.save()

        # Chain to Stage 5
        generate_content.delay(site_id, blueprint_context)

        return {'status': 'success', 'tasks': task_count}
    except Exception as e:
        self.retry(exc=e, countdown=60)
```

#### Stage 5: Generate Article Content
```python
@app.task(bind=True, max_retries=3)
def generate_content(self, site_id, blueprint_context):
    """
    Stage 5: Generate article content. 

    If task has blueprint_context:
        - Load prompt template by content_type + structure
        - Inject blueprint context variables
        - Call GPT-4 with enriched prompt
        - Post-process for internal links
    Else:
        - Call existing GenerateContent
    """
    try:
        site = Site.objects.get(id=site_id)
        job = Job.objects.create(
            site=site,
            stage=5,
            status='running',
            blueprint_mode=blueprint_context['pipeline_mode']
        )

        # Stage 4 creates tasks as 'pending'; pick those up for generation
        tasks = Task.objects.filter(site=site, status='pending', content__isnull=True)

        content_count = 0
        for task in tasks:
            if task.blueprint_context:
                # Blueprint-aware content generation
                prompt_key = _get_prompt_key(
                    task.blueprint_context['content_type'],
                    task.blueprint_context['content_structure']
                )
                template = PROMPT_TEMPLATES.get(prompt_key)

                # Inject variables
                prompt = template.format(**task.blueprint_context)

                # Call GPT-4
                article = gpt4_call(prompt)

                # Post-process
                article = _add_internal_links(article, task.blueprint_context)

            else:
                # Legacy content generation
                article = GenerateContent(task.idea.keyword)

            content = Content.objects.create(
                site=site,
                title=task.title,
                body=article,
                task=task,
                sag_cluster_id=task.sag_cluster_id,
                content_type=task.blueprint_context.get('content_type') if task.blueprint_context else 'blog_post',
                content_structure=task.blueprint_context.get('content_structure') if task.blueprint_context else None,
            )
            content_count += 1

        job.log = f"Generated {content_count} articles"
        job.status = 'completed'
        job.save()

        # Chain to Stage 6
        assign_taxonomy.delay(site_id, blueprint_context)

        return {'status': 'success', 'content': content_count}
    except Exception as e:
        self.retry(exc=e, countdown=60)


def _get_prompt_key(content_type, structure):
    """Map content_type + structure to prompt template key."""
    mapping = {
        ('cluster_hub', 'guide_tutorial'): 'sag_hub_guide',
        ('cluster_hub', 'listicle'): 'sag_hub_listicle',
        ('blog_post', 'comparison'): 
'sag_blog_comparison',
        ('blog_post', 'review'): 'sag_blog_review',
        ('blog_post', 'how_to'): 'sag_blog_howto',
        ('blog_post', 'question'): 'sag_blog_question',
        ('term_page', 'guide_tutorial'): 'sag_term_page',
        ('product_page', 'review'): 'sag_product_page',
        ('service_page', 'guide_tutorial'): 'sag_service_page',
        ('landing_page', 'guide_tutorial'): 'sag_landing_guide',
        ('landing_page', 'comparison'): 'sag_landing_comparison',
        ('business_page', 'guide_tutorial'): 'sag_business_guide',
    }
    return mapping.get((content_type, structure), 'sag_default')


def _add_internal_links(article, blueprint_context):
    """Add internal links to related cluster hubs and attribute term pages."""
    # Parse article, identify linking opportunities
    # Inject markdown links to related content (implemented in Phase F)
    # Must return the article: the caller assigns this function's result
    return article
```

#### Stage 6: Taxonomy Assignment
```python
@app.task(bind=True, max_retries=3)
def assign_taxonomy(self, site_id, blueprint_context):
    """
    Stage 6: Assign content to custom WP taxonomies (blueprint mode only). 

    If blueprint_aware:
        - Match content to cluster attributes
        - Assign custom taxonomy values
        - Update cluster status
    Else:
        - Skip stage
    """
    try:
        site = Site.objects.get(id=site_id)
        job = Job.objects.create(
            site=site,
            stage=6,
            status='running',
            blueprint_mode=blueprint_context['pipeline_mode']
        )

        if blueprint_context['pipeline_mode'] != 'blueprint_aware':
            job.log = "Legacy mode: taxonomy assignment skipped"
            job.status = 'completed'
            job.save()
            generate_images.delay(site_id, blueprint_context)
            return {'status': 'success', 'skipped': True}

        blueprint = SAGBlueprint.objects.get(id=blueprint_context['blueprint_id'])
        # Content.taxonomies defaults to {}, so a taxonomies__isnull filter
        # would never match fresh rows; check the field in Python instead
        content_items = Content.objects.filter(site=site, sag_cluster_id__isnull=False)

        assigned_count = 0
        for content in content_items:
            if content.taxonomies:
                continue  # taxonomies already assigned

            cluster = SAGCluster.objects.get(id=content.sag_cluster_id)

            # Load taxonomy mapping from blueprint
            tax_mapping = blueprint.wp_taxonomy_mapping.get(str(cluster.id), {})

            # Assign taxonomies
            content.taxonomies = tax_mapping
            content.save()

            # Update cluster status: content now exists in this cluster
            if cluster.status == 'draft':
                cluster.status = 'partial'
                cluster.save()
            # TODO: set status='complete' once every item planned in
            # blueprint.content_plan exists for this cluster

            assigned_count += 1

        job.log = f"Assigned {assigned_count} content items to taxonomies"
        job.status = 'completed'
        job.save()

        # Chain to Stage 7
        generate_images.delay(site_id, blueprint_context)

        return {'status': 'success', 'assigned': assigned_count}
    except Exception as e:
        self.retry(exc=e, countdown=60)
```

#### Stage 7: Image Generation
```python
@app.task(bind=True, max_retries=3)
def generate_images(self, site_id, blueprint_context):
    """
    Stage 7: Generate featured and in-article images. 
+ + If blueprint_aware: + - Match image style to content type + - Use cluster theme/color palette + Else: + - Generate default style images + """ + try: + site = Site.objects.get(id=site_id) + job = Job.objects.create( + site=site, + stage=7, + status='running', + blueprint_mode=blueprint_context['pipeline_mode'] + ) + + content_items = Content.objects.filter(site=site, image__isnull=True) + + image_count = 0 + for content in content_items: + if blueprint_context['pipeline_mode'] == 'blueprint_aware' and content.content_type: + # Match style to content type + style_mapping = { + 'cluster_hub': 'hero', + 'blog_post': 'supporting', + 'product_page': 'ecommerce', + 'term_page': 'category', + 'service_page': 'service', + 'landing_page': 'conversion', + } + style = style_mapping.get(content.content_type, 'supporting') + else: + style = 'supporting' + + # Generate featured image + featured_image = GenerateImage(content.title, style) + image = Image.objects.create( + content=content, + url=featured_image['url'], + alt_text=featured_image['alt_text'], + style_type=style, + sag_cluster_id=content.sag_cluster_id, + ) + image_count += 1 + + job.log = f"Generated {image_count} images" + job.status = 'completed' + job.save() + + return {'status': 'success', 'images': image_count} + except Exception as e: + self.retry(exc=e, countdown=60) +``` + +--- + +## 4. IMPLEMENTATION STEPS + +### Phase A: Data Model Extensions (Week 1) +1. Add fields to Keyword, Idea, Task, Content, Image models (see Section 3) +2. Create SAGBlueprint, SAGCluster models (reference 01A) +3. Create database migrations +4. Test model relationships and queries + +### Phase B: Stage 0 Implementation (Week 1) +1. Implement `check_blueprint` Celery task +2. Add blueprint loading and caching logic +3. Create execution_priority parsing +4. Test with sample blueprints (active and inactive) +5. Add logging and error handling + +### Phase C: Stage 1–2 Enhancement (Week 2) +1. 
Implement `_match_keyword_to_cluster` function (embedding-based matching) +2. Extend `process_keywords` task for blueprint mode +3. Modify `cluster_keywords` to skip AutoClusterKeywords when blueprint active +4. Add unmatched keyword flagging and reporting +5. Test with mixed keyword sets + +### Phase D: Stage 3 Enhancement (Week 2) +1. Create `sag/ai_functions/content_planning.py` module +2. Implement `GenerateIdeasWithBlueprint` function +3. Add phase-based filtering and prioritization +4. Integrate structure/type/sector enrichment +5. Test idea generation for each content type + +### Phase E: Stage 4 Enhancement (Week 3) +1. Extend `create_tasks` task with blueprint_context JSON assembly +2. Add execution_phase assignment +3. Test blueprint_context structure completeness +4. Verify sag_cluster_id linking + +### Phase F: Stage 5 Enhancement (Week 3) +1. Create PROMPT_TEMPLATES dictionary with all template keys +2. Implement `_get_prompt_key` function +3. Extend `generate_content` task to use templates +4. Implement `_add_internal_links` post-processing +5. Test content generation for each content_type + structure combination +6. Validate prompt variable injection + +### Phase G: Stage 6 Implementation (Week 4) +1. Implement `assign_taxonomy` task +2. Add taxonomy mapping logic from blueprint.wp_taxonomy_mapping +3. Implement cluster status updates +4. Test taxonomy assignment with sample blueprints + +### Phase H: Stage 7 Enhancement (Week 4) +1. Extend `generate_images` task for blueprint mode +2. Add style_type mapping by content_type +3. Implement color palette usage from blueprint +4. Test image generation for each content type + +### Phase I: Integration & Testing (Week 5) +1. Test full pipeline execution with active blueprint +2. Test full pipeline execution without blueprint (legacy mode) +3. Add integration tests for each stage transition +4. Test error handling and retries +5. 
Load testing with multiple concurrent sites + +### Phase J: Deployment & Monitoring (Week 6) +1. Deploy models and migrations to staging +2. Deploy Celery tasks to staging +3. Validate with staging data +4. Set up pipeline execution monitoring (01G) +5. Deploy to production with feature flag (blueprint mode off by default) + +--- + +## 5. ACCEPTANCE CRITERIA + +### Functional Requirements +- **Stage 0**: Blueprint check completes successfully; mode determination accurate +- **Stage 1**: Keywords matched to clusters with 85%+ accuracy; unmatched flagged +- **Stage 2**: Legacy mode skipped when blueprint active; clusters pre-loaded +- **Stage 3**: Ideas generated with correct type/structure/sector/cluster assignment +- **Stage 4**: Tasks enriched with complete blueprint_context JSON +- **Stage 5**: Content generated using template-specific prompts; blueprint variables injected +- **Stage 6**: Content assigned to custom taxonomies; cluster status updated +- **Stage 7**: Images generated with correct style matching content type + +### Quality Criteria +- **No breaking changes**: Legacy mode works identically to pre-blueprint pipeline +- **Error handling**: All Celery tasks handle failures gracefully; retry logic functional +- **Performance**: Pipeline completes within baseline timing (per site, per stage) +- **Logging**: All stages log execution details and decisions +- **Data integrity**: sag_cluster_id and blueprint_context consistently populated + +### Testing Coverage +- Unit tests: Each function and task (>80% coverage) +- Integration tests: Full pipeline execution with/without blueprint +- Scenario tests: + - Active blueprint (all phases) + - Inactive blueprint (legacy mode) + - Mixed keywords (matched + unmatched) + - Multiple sites with different blueprints + - Failed tasks (retry logic) + +### Documentation +- Docstrings: All functions documented with inputs/outputs +- README: Setup and execution instructions +- Troubleshooting guide: Common issues and 
solutions
+
+### Monitoring (01G Health Monitoring)
+- Pipeline execution time per stage per site
+- Content generation success rate by content_type
+- Taxonomy assignment accuracy
+- Cluster completion status tracking
+- Unmatched keyword trending
+
+---
+
+## 6. CLAUDE CODE INSTRUCTIONS
+
+### Running the Pipeline Locally
+
+#### Prerequisites
+```bash
+# Install dependencies
+pip install -r requirements.txt
+pip install "celery[redis]" pytest pytest-django
+
+# Set up local database
+python manage.py migrate
+
+# Start Redis (for Celery)
+redis-server
+```
+
+#### Initialize Test Data
+```bash
+# Create sample site and blueprint
+python manage.py shell << EOF
+from sites.models import Site
+from sag.models import SAGBlueprint, SAGCluster
+
+site = Site.objects.create(name="Test Site", domain="test.local")
+blueprint = SAGBlueprint.objects.create(
+    site=site,
+    name="Test Blueprint",
+    status="active",
+    execution_priority={
+        "phase_1": ["category_pages", "top_cluster_hubs"],
+        "phase_2": ["remaining_hubs"],
+        "phase_3": ["attribute_term_pages"],
+        "phase_4": ["additional_blogs"],
+    },
+    content_plan={},
+    wp_taxonomy_mapping={}
+)
+cluster = SAGCluster.objects.create(
+    blueprint=blueprint,
+    name="Test Cluster",
+    cluster_type="topical",
+    sector="Tech",
+    keywords=["python", "django"],
+    attributes=["web development", "open source"],
+    status="draft"
+)
+print(f"Created site {site.id}, blueprint {blueprint.id}, cluster {cluster.id}")
+EOF
+```
+
+#### Execute Pipeline Stages
+```bash
+# Start Celery worker (in separate terminal)
+celery -A igny8.celery_app worker --loglevel=info
+
+# Run Stage 0: Blueprint Check
+python manage.py shell << EOF
+from celery_app.tasks import check_blueprint
+result = check_blueprint.delay(site_id=1)  # integer site PK; use the ID printed during test data setup
+print(result.get())
+EOF
+
+# Run full pipeline
+python manage.py shell << EOF
+from celery_app.tasks import check_blueprint
+
+site_id = 1  # integer site PK (BigAutoField, not UUID); use the ID printed during test data setup
+check_blueprint.delay(site_id) +# Each stage automatically chains to the next +EOF + +# Monitor pipeline execution +celery -A igny8.celery_app events +# or view logs: tail -f celery.log +``` + +### Testing the Pipeline + +#### Unit Tests +```bash +pytest content/tests/test_pipeline.py -v +pytest sag/tests/test_blueprint.py -v +pytest celery_app/tests/test_tasks.py -v +``` + +#### Integration Test +```bash +pytest content/tests/test_pipeline_integration.py::test_full_blueprint_pipeline -v + +# Test legacy mode +pytest content/tests/test_pipeline_integration.py::test_full_legacy_pipeline -v + +# Test mixed mode (some sites with blueprint, some without) +pytest content/tests/test_pipeline_integration.py::test_mixed_mode_execution -v +``` + +#### Manual Test Scenario +```bash +# 1. Create test site and blueprint +python manage.py shell < scripts/setup_test_data.py + +# 2. Import sample keywords +python manage.py shell << EOF +from content.models import Keyword +from sites.models import Site +site = Site.objects.get(name="Test Site") +keywords = ["python tutorial", "django rest", "web scraping"] +for kw in keywords: + Keyword.objects.create(site=site, term=kw, source='csv_import') +EOF + +# 3. Run pipeline +celery -A igny8.celery_app worker --loglevel=debug & +python manage.py shell << EOF +from celery_app.tasks import check_blueprint +from sites.models import Site +site = Site.objects.get(name="Test Site") +check_blueprint.delay(site.id) +EOF + +# 4. 
Inspect results
+python manage.py shell << EOF
+from content.models import Keyword, Idea, Task, Content, Image
+from sites.models import Site
+site = Site.objects.get(name="Test Site")
+
+print("Keywords:", Keyword.objects.filter(site=site).count())
+print("Ideas:", Idea.objects.filter(site=site).count())
+print("Tasks:", Task.objects.filter(site=site).count())
+print("Content:", Content.objects.filter(site=site).count())
+print("Images:", Image.objects.filter(site=site).count())
+
+# Check blueprint context
+task = Task.objects.filter(site=site, blueprint_context__isnull=False).first()
+if task:
+    print("Blueprint context:", task.blueprint_context)
+EOF
+```
+
+### Debugging Common Issues
+
+#### Blueprint Not Detected
+```bash
+# Check if blueprint exists and is active
+python manage.py shell << EOF
+from sag.models import SAGBlueprint
+from sites.models import Site
+site = Site.objects.get(id=1)  # integer site PK; replace with your site ID
+blueprint = SAGBlueprint.objects.filter(site=site, status='active').first()
+print(f"Blueprint: {blueprint}")
+if blueprint:
+    print(f"Status: {blueprint.status}")
+    print(f"Content plan: {blueprint.content_plan}")
+EOF
+```
+
+#### Keywords Not Matching
+```bash
+# Check keyword-cluster mapping
+python manage.py shell << EOF
+from content.models import Keyword
+from sag.models import SAGCluster
+keywords = Keyword.objects.filter(sag_cluster_id__isnull=True)
+print(f"Unmatched keywords: {[kw.term for kw in keywords]}")
+
+# Check available clusters
+clusters = SAGCluster.objects.all()
+for cluster in clusters:
+    print(f"Cluster '{cluster.name}': {cluster.attributes}")
+EOF
+```
+
+#### Content Not Generated
+```bash
+# Check task status
+python manage.py shell << EOF
+from content.models import Task
+tasks = Task.objects.all()
+for task in tasks:
+    print(f"Task {task.id}: status={task.status}, blueprint_context={bool(task.blueprint_context)}")
+EOF
+
+# Check Celery task logs
+celery -A igny8.celery_app inspect active
+celery -A igny8.celery_app inspect reserved
+celery -A igny8.celery_app purge # WARNING: clears queue +``` + +### Extending with Custom Prompt Templates + +#### Add New Template +```python +# In sag/prompt_templates.py + +PROMPT_TEMPLATES = { + 'sag_hub_guide': """ + You are writing a comprehensive guide for {cluster_name}, a {cluster_type} in the {cluster_sector} sector. + + Topic: {cluster_name} + Related terms: {attribute_terms} + Hub page: {hub_url} + + Structure: Guide/Tutorial format + - Introduction: What is {cluster_name}? + - Key concepts: {attribute_terms} + - Step-by-step guide + - Common pitfalls + - Conclusion with links to {hub_title} + + Write a comprehensive, SEO-optimized guide. + """, + + # Add more templates here... +} + +# Usage in generate_content task: +# template = PROMPT_TEMPLATES['sag_hub_guide'] +# prompt = template.format(**blueprint_context) +``` + +### Monitoring Pipeline Health (Integration with 01G) + +```bash +# View pipeline execution history +python manage.py shell << EOF +from content.models import Job +jobs = Job.objects.filter(stage=5).order_by('-created_at')[:10] +for job in jobs: + duration = (job.completed_at - job.created_at).total_seconds() if job.completed_at else None + print(f"Stage {job.stage}: {job.status} ({duration}s) - {job.blueprint_mode}") +EOF + +# Check cluster completion status +python manage.py shell << EOF +from sag.models import SAGCluster +clusters = SAGCluster.objects.all() +for cluster in clusters: + content_count = cluster.content_set.count() + print(f"Cluster '{cluster.name}': {cluster.status} ({content_count} content items)") +EOF +``` + +--- + +## Cross-References + +| Document | Reference Purpose | +|----------|-------------------| +| **01A**: SAG Blueprint Model | SAGBlueprint, SAGCluster models used at Stage 0 | +| **01C**: Cluster Formation | Clusters created by SAG framework; used by pipeline | +| **01D**: Setup Wizard | Creates blueprint that drives pipeline execution | +| **01F**: Case 1 Analysis | Produces blueprints that feed this 
pipeline | +| **01G**: Health Monitoring | Tracks pipeline output per cluster and stage | +| **Content_Types_Writing_Plan.md** | Content type definitions; prompt template structure | + +--- + +## Summary + +The Blueprint-Aware Content Pipeline enhances IGNY8's 7-stage automation with SAG framework context at every step. When a site has an active blueprint, content generation becomes strategic: keywords map to clusters, ideas inherit type/structure/sector assignments, prompts leverage cluster context, and output auto-taxonomizes. When no blueprint exists, the pipeline defaults to legacy mode unchanged. + +**Key innovation**: Two-mode execution (blueprint-aware + legacy) enables gradual adoption—teams can opt in to blueprint-driven content without disrupting existing sites. **Execution priority phases** ensure foundational content (hubs) publishes before supporting content (blogs), building authority tier-by-tier. + diff --git a/v2/V2-Execution-Docs/01F-existing-site-analysis-case1.md b/v2/V2-Execution-Docs/01F-existing-site-analysis-case1.md index 105f5205..37f3a761 100644 --- a/v2/V2-Execution-Docs/01F-existing-site-analysis-case1.md +++ b/v2/V2-Execution-Docs/01F-existing-site-analysis-case1.md @@ -1,5 +1,9 @@ # 01F: IGNY8 Phase 1 — Existing Site Analysis (Case 1) +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2025-07-14 + **Document Type:** Build Specification **Phase:** Phase 1: Existing Site Analysis **Use Case:** Case 1 (Users with existing sites) @@ -176,8 +180,8 @@ def extract_site_attributes( ```json { - "analysis_id": "uuid", - "site_id": "uuid", + "analysis_id": 42, + "site_id": 7, "timestamp": "2026-03-23T14:30:00Z", "analysis_confidence": 0.82, "attributes": [ @@ -298,7 +302,7 @@ from typing import List, Dict, Optional @dataclass class Product: - id: str + id: int title: str description: str sku: str @@ -310,10 +314,10 @@ class Product: @dataclass class Category: - id: str + 
id: int name: str slug: str - parent_id: Optional[str] + parent_id: Optional[int] description: str product_count: int @@ -326,16 +330,16 @@ class Taxonomy: @dataclass class Term: - id: str + id: int name: str slug: str - parent_id: Optional[str] + parent_id: Optional[int] description: str count: int @dataclass class Page: - id: str + id: int title: str url: str content_summary: str @@ -343,7 +347,7 @@ class Page: @dataclass class Post: - id: str + id: int title: str url: str content_summary: str @@ -353,15 +357,15 @@ class Post: @dataclass class MenuItem: - id: str + id: int title: str url: str target: str - parent_id: Optional[str] + parent_id: Optional[int] @dataclass class SiteMetadata: - site_id: str + site_id: int domain: str wordpress_version: str woocommerce_version: str @@ -425,8 +429,8 @@ class AnalysisNotes: @dataclass class AttributeExtractionResult: - analysis_id: str - site_id: str + analysis_id: int + site_id: int timestamp: str analysis_confidence: float attributes: List[DiscoveredAttribute] @@ -483,9 +487,9 @@ class AttributeExtractionResult: ```json { - "analysis_id": "uuid", - "site_id": "uuid", - "blueprint_id": "uuid", + "analysis_id": 42, + "site_id": 7, + "blueprint_id": 15, "timestamp": "2026-03-23T14:30:00Z", "summary": { "products_current": 50, @@ -599,15 +603,15 @@ class AttributeExtractionResult: ```json { - "batch_id": "uuid", - "site_id": "uuid", - "blueprint_id": "uuid", + "batch_id": 23, + "site_id": 7, + "blueprint_id": 15, "timestamp": "2026-03-23T14:30:00Z", "total_products": 50, "total_suggestions": 87, "suggestions": [ { - "product_id": "woo_123", + "product_id": 123, "product_title": "Nekteck Foot Massager with Heat", "proposed_tags": [ { @@ -659,7 +663,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he { "include_draft_products": false, "product_limit": 500, - "sector_template_id": "optional_uuid", + "sector_template_id": null, "webhook_url": "optional_https_url_for_completion_notification" } 
``` @@ -668,7 +672,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ```json { "task_id": "celery_task_uuid", - "site_id": "site_uuid", + "site_id": 7, "status": "queued", "estimated_duration_seconds": 120, "check_status_url": "/api/v1/sag/sites/{site_id}/analysis-status/?task_id={task_id}" @@ -694,7 +698,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ```json { "task_id": "celery_task_uuid", - "site_id": "site_uuid", + "site_id": 7, "status": "processing", "progress_percent": 45, "current_step": "Analyzing product attributes", @@ -718,8 +722,8 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he **Response:** 200 OK ```json { - "analysis_id": "uuid", - "site_id": "site_uuid", + "analysis_id": 42, + "site_id": 7, "timestamp": "2026-03-23T14:30:00Z", "site_data_summary": { "total_products": 50, @@ -756,7 +760,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he **Request:** ```json { - "analysis_id": "uuid", + "analysis_id": 42, "approved_attributes": [ { "name": "Target Area", @@ -764,16 +768,16 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he "exclude_values": [] } ], - "confirmed_by_user_id": "user_uuid" + "confirmed_by_user_id": 3 } ``` **Response:** 201 Created ```json { - "blueprint_id": "uuid", - "site_id": "site_uuid", - "analysis_id": "uuid", + "blueprint_id": 15, + "site_id": 7, + "analysis_id": 42, "status": "created", "attributes_count": 8, "attribute_values_count": 45, @@ -800,12 +804,12 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he **Response:** 200 OK ```json { - "batch_id": "uuid", - "blueprint_id": "blueprint_uuid", + "batch_id": 23, + "blueprint_id": 15, "total_suggestions": 87, "suggestions": [ { - "product_id": "woo_123", + "product_id": 123, "product_title": "Nekteck Foot Massager", "proposed_tags": [ { @@ -829,10 +833,10 @@ All endpoints 
are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he **Request:** ```json { - "blueprint_id": "uuid", + "blueprint_id": 15, "approved_suggestions": [ { - "product_id": "woo_123", + "product_id": 123, "approved_tags": [ { "attribute": "Target Area", @@ -849,8 +853,8 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ```json { "task_id": "celery_task_uuid", - "site_id": "site_uuid", - "blueprint_id": "blueprint_uuid", + "site_id": 7, + "blueprint_id": 15, "status": "processing", "products_to_tag": 47, "tags_to_apply": 87, @@ -871,7 +875,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ```json { "task_id": "celery_task_uuid", - "site_id": "site_uuid", + "site_id": 7, "status": "processing", "progress_percent": 62, "products_tagged": 29, @@ -902,7 +906,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ```json { "metadata": { - "site_id": "uuid", + "site_id": 7, "domain": "example-store.com", "wordpress_version": "6.4.2", "woocommerce_version": "8.5.0", @@ -915,7 +919,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he }, "products": [ { - "id": "woo_123", + "id": 123, "title": "Nekteck Foot Massager with Heat", "description": "Premium foot massage device...", "sku": "NEKTECK-FM-001", @@ -932,7 +936,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ], "categories": [ { - "id": "cat_1", + "id": 1, "name": "Foot Massagers", "slug": "foot-massagers", "parent_id": null, @@ -947,7 +951,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he "is_hierarchical": false, "terms": [ { - "id": "brand_1", + "id": 1, "name": "Nekteck", "slug": "nekteck", "parent_id": null, @@ -959,7 +963,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ], "pages": [ { - "id": "page_1", + "id": 1, "title": "Shop", "url": "/shop", "content_summary": 
"Browse our selection of massage devices", @@ -968,7 +972,7 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ], "posts": [ { - "id": "post_1", + "id": 1, "title": "Benefits of Foot Massage", "url": "/blog/foot-massage-benefits", "content_summary": "Learn why foot massage is beneficial...", @@ -979,11 +983,11 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he ], "menus": [ { - "id": "menu_1", + "id": 1, "title": "Main Menu", "items": [ { - "id": "item_1", + "id": 1, "title": "Shop", "url": "/shop", "target": "_self", @@ -1184,7 +1188,10 @@ All endpoints are authenticated via `Authorization: Bearer {IGNY8_API_TOKEN}` he --- -### Phase 6: Frontend Components (Week 3-4) +### Phase 6: Frontend Components — React + TypeScript (Week 3-4) + +> **Tech Stack:** React ^19.0.0, TypeScript ~5.7.2, Vite ^6.1.0, Zustand ^5.0.8, Tailwind ^4.0.8 +> All components are `.tsx` files in the `frontend/src/` directory. **Tasks:** 1. Implement SiteAnalysisPanel @@ -1455,7 +1462,7 @@ Step 6: Complete & Next Steps **Common Issues:** **Issue:** Analysis hangs or times out -- Check: Celery worker status (`celery -A sag inspect active`) +- Check: Celery worker status (`celery -A igny8_core inspect active`) - Check: Redis/message queue status - Check: LLM API rate limits - Solution: Reduce product limit, retry analysis diff --git a/v2/V2-Execution-Docs/01G-sag-health-monitoring.md b/v2/V2-Execution-Docs/01G-sag-health-monitoring.md index 190deea0..a593def4 100644 --- a/v2/V2-Execution-Docs/01G-sag-health-monitoring.md +++ b/v2/V2-Execution-Docs/01G-sag-health-monitoring.md @@ -1,5 +1,9 @@ # IGNY8 Phase 1: SAG Health Monitoring (Doc 01G) +> **Version:** 1.1 (codebase-verified) +> **Source of Truth:** Codebase at `/data/app/igny8/backend/` +> **Last Verified:** 2025-07-14 + **Document ID:** 01G **Module:** SAG Health Monitoring **Phase:** Phase 1 - Core Implementation @@ -599,7 +603,7 @@ def 
check_blueprint_evolution_triggers(site_id: int): 5. Trigger notification to user #### Step 2.2: Configure Celery Beat Schedule -**File:** `config/celery.py` or `config/celery_beat_schedule.py` +**File:** `igny8_core/celery.py` ```python CELERY_BEAT_SCHEDULE = { @@ -802,7 +806,7 @@ Test scenarios: ### Phase 4: Dashboard Widget & Frontend (Week 4) #### Step 4.1: Create Dashboard Widget Component -**File:** `frontend/components/SAGHealthWidget.jsx` +**File:** `frontend/src/components/SAGHealthWidget.tsx` Display: ``` @@ -848,7 +852,7 @@ Display: - Add 4-week trend chart #### Step 4.2: Create Health History Chart -**File:** `frontend/components/SAGHealthChart.jsx` +**File:** `frontend/src/components/SAGHealthChart.tsx` Line chart showing: - X-axis: Last 4 weeks (Monday to Monday) @@ -858,7 +862,7 @@ Line chart showing: - Hover: Show detailed scores for week #### Step 4.3: Create Recommendations Page -**File:** `frontend/pages/SAGRecommendations.jsx` +**File:** `frontend/src/pages/SAGRecommendations.tsx` Page showing: - All recommendations (20+) @@ -869,7 +873,7 @@ Page showing: - Status tracking (completed/in-progress/pending) #### Step 4.4: Create Blueprint Version History Page -**File:** `frontend/pages/BlueprintVersionHistory.jsx` +**File:** `frontend/src/pages/BlueprintVersionHistory.tsx` Display: - Timeline of all versions @@ -881,7 +885,7 @@ Display: - Activate/rollback buttons for archived versions #### Step 4.5: Create Evolution Trigger Review Page -**File:** `frontend/pages/EvolutionTriggerReview.jsx` +**File:** `frontend/src/pages/EvolutionTriggerReview.tsx` Display detected triggers: - New product categories (with suggestion) @@ -891,7 +895,7 @@ Display detected triggers: - Preview new blueprint structure before creation #### Step 4.6: Frontend Tests -**File:** `tests/frontend/SAGHealthWidget.test.js` +**File:** `frontend/src/__tests__/SAGHealthWidget.test.tsx` Test: - Widget renders with health score @@ -1118,7 +1122,7 @@ Follow Phase 1 → Phase 5 
sequentially. Do not skip phases. #### 6.5 Code Style & Standards - Follow PEP 8 for Python -- Follow ESLint rules for JavaScript/React +- Follow ESLint rules for TypeScript/React - Comment complex calculations (especially health score components) - Use meaningful variable names (not `x`, `y`, `temp`) - Docstrings on all public methods @@ -1134,7 +1138,7 @@ Follow Phase 1 → Phase 5 sequentially. Do not skip phases. **Celery task not running:** - Verify Celery Beat schedule is configured -- Check task is registered (`celery -A project inspect active_queues`) +- Check task is registered (`celery -A igny8_core inspect active_queues`) - Check for errors in Celery worker logs - Test task manually via shell: `run_blueprint_health_check.delay(site_id=1)` @@ -1147,7 +1151,7 @@ Follow Phase 1 → Phase 5 sequentially. Do not skip phases. **Frontend chart not rendering:** - Inspect network tab for API response - Verify response format matches serializer schema -- Check console for JavaScript errors +- Check console for TypeScript/runtime errors - Test with mock data first #### 6.7 Version Control Workflow @@ -1211,8 +1215,8 @@ Create/update these files: **Staging deployment:** 1. Deploy code to staging server 2. Run migrations: `python manage.py migrate sag` -3. Start Celery worker: `celery -A project worker -l info` -4. Start Celery Beat: `celery -A project beat -l info` +3. Start Celery worker: `celery -A igny8_core worker -l info` +4. Start Celery Beat: `celery -A igny8_core beat -l info` 5. Run smoke tests against staging API 6. Manual QA: test all UI flows 7. Monitor logs for errors