Compare commits

**2 Commits:** f28f641fd5 ... phase-0-fo

| Author | SHA1 | Date |
|--------|------|------|
|        | 67283ad3e7 |  |
|        | 72a31b2edb |  |
.gitignore (vendored, 5 lines changed)
```
@@ -45,11 +45,6 @@ backend/.venv/
dist/
*.egg

# Celery scheduler database (binary file, regenerated by celery beat)
celerybeat-schedule
**/celerybeat-schedule
backend/celerybeat-schedule

# Environment variables
.env
.env.local
```
.rules (224 lines removed)
@@ -1,224 +0,0 @@
# IGNY8 AI Agent Rules

**Version:** 1.1.3 | **Updated:** December 27, 2025

---

## 🚀 Quick Start for AI Agents

**BEFORE any change, read these docs in order:**
1. [docs/INDEX.md](docs/INDEX.md) - Quick navigation to any module/feature
2. Module doc for the feature you're modifying (see INDEX.md for paths)
3. [CHANGELOG.md](CHANGELOG.md) - Recent changes and version history

---

## 📁 Project Structure

| Layer | Path | Purpose |
|-------|------|---------|
| Backend | `backend/igny8_core/` | Django REST API |
| Frontend | `frontend/src/` | React + TypeScript SPA |
| Docs | `docs/` | Technical documentation |
| AI Engine | `backend/igny8_core/ai/` | AI functions (use this, NOT `utils/ai_processor.py`) |

**Module → File Quick Reference:** See [docs/INDEX.md](docs/INDEX.md#module--file-quick-reference)

---

## ⚠️ Module Status

| Module | Status | Notes |
|--------|--------|-------|
| Planner | ✅ Active | Keywords, Clusters, Ideas |
| Writer | ✅ Active | Tasks, Content, Images |
| Automation | ✅ Active | 7-stage pipeline |
| Billing | ✅ Active | Credits, Plans |
| Publisher | ✅ Active | WordPress publishing |
| **Linker** | ⏸️ Inactive | Exists but disabled - Phase 2 |
| **Optimizer** | ⏸️ Inactive | Exists but disabled - Phase 2 |
| **SiteBuilder** | ❌ Removed | Code exists but NOT part of app - mark for removal in TODOS.md |

**Important:**
- Do NOT work on Linker/Optimizer unless specifically requested
- SiteBuilder code is deprecated - if found, add to `TODOS.md` for cleanup

---

## 🐳 Docker Commands (IMPORTANT!)

**Container Names:**

| Container | Name | Purpose |
|-----------|------|---------|
| Backend | `igny8_backend` | Django API server |
| Frontend | `igny8_frontend` | React dev server |
| Celery Worker | `igny8_celery_worker` | Background tasks |
| Celery Beat | `igny8_celery_beat` | Scheduled tasks |

**Run commands INSIDE containers:**
```bash
# ✅ CORRECT - Run Django management commands
docker exec -it igny8_backend python manage.py migrate
docker exec -it igny8_backend python manage.py makemigrations
docker exec -it igny8_backend python manage.py shell

# ✅ CORRECT - Run npm commands
docker exec -it igny8_frontend npm install
docker exec -it igny8_frontend npm run build

# ✅ CORRECT - View logs
docker logs igny8_backend -f
docker logs igny8_celery_worker -f

# ❌ WRONG - Don't use docker-compose for commands
# docker-compose exec backend python manage.py migrate
```

---

## 📊 Data Scoping (CRITICAL!)

**Understand which data is scoped where:**

| Scope | Models | Notes |
|-------|--------|-------|
| **Global (Platform-wide)** | `GlobalIntegrationSettings`, `GlobalAIPrompt`, `GlobalAuthorProfile`, `GlobalStrategy`, `GlobalModuleSettings`, `Industry`, `SeedKeyword` | Admin-only, shared by ALL accounts |
| **Account-scoped** | `Account`, `User`, `Plan`, `IntegrationSettings`, `ModuleEnableSettings`, `AISettings`, `AIPrompt`, `AuthorProfile` | Filter by `account` |
| **Site+Sector-scoped** | `Keywords`, `Clusters`, `ContentIdeas`, `Tasks`, `Content`, `Images` | Filter by `site` AND optionally `sector` |

**Key Rules:**
- Global settings: NO account filtering (platform-wide, admin managed)
- Account models: Use `AccountBaseModel`, filter by `request.user.account`
- Site/Sector models: Use `SiteSectorBaseModel`, filter by `site` and `sector`
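The three scoping rules above can be sketched as a single filter. This is a pure-Python illustration only, not the actual `AccountBaseModel`/`SiteSectorBaseModel` code; the `Row` type and `scope_filter` helper are hypothetical stand-ins for the real Django querysets:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for a record in any of the three scopes.
@dataclass
class Row:
    model: str
    account_id: Optional[int] = None   # set on Account-scoped rows
    site_id: Optional[int] = None      # set on Site+Sector-scoped rows
    sector_id: Optional[int] = None

def scope_filter(rows, *, account_id=None, site_id=None, sector_id=None):
    """Apply the scoping rules: Global rows pass untouched; Account rows
    must match the account; Site/Sector rows must match the site (and the
    sector when one is given)."""
    out = []
    for r in rows:
        if r.account_id is None and r.site_id is None:
            out.append(r)                      # Global: no filtering
        elif r.site_id is not None:
            if r.site_id == site_id and (sector_id is None or r.sector_id == sector_id):
                out.append(r)                  # Site+Sector scope
        elif r.account_id == account_id:
            out.append(r)                      # Account scope
    return out

rows = [
    Row("GlobalAIPrompt"),
    Row("AIPrompt", account_id=1),
    Row("AIPrompt", account_id=2),
    Row("Keywords", site_id=10, sector_id=3),
    Row("Keywords", site_id=11, sector_id=3),
]
visible = scope_filter(rows, account_id=1, site_id=10, sector_id=3)
# visible: the global prompt, account 1's prompt, and site 10's keywords
```

Note how the global row is never filtered: adding account filtering there is exactly the mistake the "Don't Do" list warns about.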
---

## ✅ Rules (One Line Each)

### Before Coding
1. **Read docs first** - Always read the relevant module doc from `docs/10-MODULES/` before changing code
2. **Check existing patterns** - Search the codebase for similar implementations before creating new ones
3. **Use existing components** - Never duplicate; reuse components from `frontend/src/components/`
4. **Check data scope** - Know if your model is Global, Account, or Site/Sector scoped (see table above)

### During Coding
5. **Use correct base class** - Global: `models.Model`, Account: `AccountBaseModel`, Site: `SiteSectorBaseModel`
6. **Use AI framework** - Use `backend/igny8_core/ai/` for AI operations, NOT legacy `utils/ai_processor.py`
7. **Follow service pattern** - Business logic in `backend/igny8_core/business/*/services/`
8. **Check permissions** - Use `IsAuthenticatedAndActive`, `HasTenantAccess` in views
9. **Use TypeScript types** - All frontend code must be typed
10. **Use TailwindCSS** - No inline styles; follow `frontend/DESIGN_SYSTEM.md`

### After Coding
11. **Update CHANGELOG.md** - Every commit needs a changelog entry with a git reference
12. **Increment version** - PATCH for fixes, MINOR for features, MAJOR for breaking changes
13. **Update docs** - If you changed APIs or architecture, update relevant docs in `docs/`
14. **Run migrations** - After model changes: `docker exec -it igny8_backend python manage.py makemigrations`
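Rule 12 follows semantic versioning. As a quick worked example (a hypothetical helper for illustration, not part of the codebase):

```python
def bump(version: str, level: str) -> str:
    """Increment a MAJOR.MINOR.PATCH version string per rule 12."""
    major, minor, patch = (int(p) for p in version.split("."))
    if level == "major":    # breaking change: reset minor and patch
        return f"{major + 1}.0.0"
    if level == "minor":    # new feature: reset patch
        return f"{major}.{minor + 1}.0"
    if level == "patch":    # bug fix
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown level: {level}")

print(bump("1.1.3", "patch"))  # 1.1.4
```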
---

## 📝 Changelog Format

```markdown
## v1.1.1 - December 27, 2025

### Fixed
- Description here (git: abc1234)

### Added
- Description here (git: def5678)

### Changed
- Description here (git: ghi9012)
```

---

## 🔗 Key Documentation

| I want to... | Go to |
|--------------|-------|
| Find any module | [docs/INDEX.md](docs/INDEX.md) |
| Understand architecture | [docs/00-SYSTEM/ARCHITECTURE.md](docs/00-SYSTEM/ARCHITECTURE.md) |
| Find an API endpoint | [docs/20-API/ENDPOINTS.md](docs/20-API/ENDPOINTS.md) |
| See all models | [docs/90-REFERENCE/MODELS.md](docs/90-REFERENCE/MODELS.md) |
| Understand AI functions | [docs/90-REFERENCE/AI-FUNCTIONS.md](docs/90-REFERENCE/AI-FUNCTIONS.md) |
| See frontend pages | [docs/30-FRONTEND/PAGES.md](docs/30-FRONTEND/PAGES.md) |
| See recent changes | [CHANGELOG.md](CHANGELOG.md) |

---

## 🚫 Don't Do

- ❌ Skip reading docs before coding
- ❌ Create duplicate components
- ❌ Use `docker-compose` for exec commands (use `docker exec`)
- ❌ Use legacy `utils/ai_processor.py`
- ❌ Add account filtering to Global models (they're platform-wide!)
- ❌ Forget site/sector filtering on content models
- ❌ Forget to update CHANGELOG
- ❌ Use inline styles (use TailwindCSS)
- ❌ Hardcode values (use settings/constants)
- ❌ Work on Linker/Optimizer (inactive modules - Phase 2)
- ❌ Use any SiteBuilder code (deprecated - mark for removal)

---

## 📊 API Base URLs

| Module | Base URL |
|--------|----------|
| Auth | `/api/v1/auth/` |
| Planner | `/api/v1/planner/` |
| Writer | `/api/v1/writer/` |
| Billing | `/api/v1/billing/` |
| Integration | `/api/v1/integration/` |
| System | `/api/v1/system/` |

**API Docs:** https://api.igny8.com/api/docs/
**Admin:** https://api.igny8.com/admin/
**App:** https://app.igny8.com/

---

## 📄 Documentation Rules

**Root folder MD files allowed:**
- `CHANGELOG.md` - Version history
- `README.md` - Project quickstart
- `IGNY8-APP.md` - Executive summary
- `TODOS.md` - Cleanup tracking

**All other docs go in `/docs/` folder:**
```
docs/
├── INDEX.md          # Master navigation
├── 00-SYSTEM/        # Architecture, auth, tenancy
├── 10-MODULES/       # One file per module
├── 20-API/           # API endpoints
├── 30-FRONTEND/      # Pages, stores
├── 40-WORKFLOWS/     # Cross-module flows
└── 90-REFERENCE/     # Models, AI functions
```

**When updating docs:**

| Change Type | Update These Files |
|-------------|-------------------|
| New endpoint | Module doc + `docs/20-API/ENDPOINTS.md` |
| New model | Module doc + `docs/90-REFERENCE/MODELS.md` |
| New page | Module doc + `docs/30-FRONTEND/PAGES.md` |
| New module | Create module doc + update `docs/INDEX.md` |

**DO NOT** create random MD files - update existing docs instead.

---

## 🎯 Quick Checklist Before Commit

- [ ] Read relevant module docs
- [ ] Used existing components/patterns
- [ ] Correct data scope (Global/Account/Site)
- [ ] Updated CHANGELOG.md with git reference
- [ ] Updated version number
- [ ] Ran migrations if model changed
- [ ] Tested locally
CHANGELOG.md (1657 lines changed)

File diff suppressed because it is too large.
DESIGN-GUIDE.md (183 lines removed)
@@ -1,183 +0,0 @@
# IGNY8 Design System Guide

> **Single Source of Truth for UI Components**
>
> This guide ensures consistent, maintainable frontend code across the entire application.

---

## Quick Links

| Resource | Path | Description |
|----------|------|-------------|
| **Component System** | [docs/30-FRONTEND/COMPONENT-SYSTEM.md](docs/30-FRONTEND/COMPONENT-SYSTEM.md) | Full component reference with props, examples, and usage |
| **ESLint Plugin** | [frontend/eslint/](frontend/eslint/) | Custom rules enforcing the design system |
| **Live Demo** | `/ui-elements` route | Interactive component showcase |
| **Design Tokens** | [frontend/src/styles/design-system.css](frontend/src/styles/design-system.css) | CSS variables and tokens |
| **Icons** | [frontend/src/icons/](frontend/src/icons/) | All SVG icons |

---

## Core Principles

### 1. Use Components, Never Raw HTML

```tsx
// ❌ NEVER
<button className="...">Click</button>
<input type="text" className="..." />
<select className="...">...</select>
<textarea className="..."></textarea>

// ✅ ALWAYS
<Button variant="primary">Click</Button>
<InputField type="text" label="Name" />
<Select options={options} />
<TextArea rows={4} />
```

### 2. Import Icons from Central Location

```tsx
// ❌ NEVER
import { XIcon } from '@heroicons/react/24/outline';
import { Trash } from 'lucide-react';

// ✅ ALWAYS
import { CloseIcon, TrashBinIcon } from '../../icons';
```

### 3. Consistent Sizing

```tsx
// Icons in buttons/badges
<Icon className="w-4 h-4" />

// Standalone icons
<Icon className="w-5 h-5" />

// Large/header icons
<Icon className="w-6 h-6" />
```

---

## Component Quick Reference

| Need | Component | Import |
|------|-----------|--------|
| Action button | `Button` | `components/ui/button/Button` |
| Icon-only button | `IconButton` | `components/ui/button/IconButton` |
| Text input | `InputField` | `components/form/input/InputField` |
| Checkbox | `Checkbox` | `components/form/input/Checkbox` |
| Radio | `Radio` | `components/form/input/Radio` |
| Dropdown | `Select` | `components/form/Select` |
| Multi-line text | `TextArea` | `components/form/input/TextArea` |
| Toggle | `Switch` | `components/form/switch/Switch` |
| Status label | `Badge` | `components/ui/badge/Badge` |
| Container | `Card` | `components/ui/card/Card` |
| Popup | `Modal` | `components/ui/modal` |
| Loading | `Spinner` | `components/ui/spinner/Spinner` |
| Notification | `useToast` | `components/ui/toast/ToastContainer` |

**→ See [COMPONENT-SYSTEM.md](docs/30-FRONTEND/COMPONENT-SYSTEM.md) for full props and examples**

---

## ESLint Enforcement

### Rules

| Rule | Level | Action |
|------|-------|--------|
| `no-raw-button` | warn → error | Use `Button` or `IconButton` |
| `no-raw-input` | warn → error | Use `InputField`, `Checkbox`, `Radio` |
| `no-raw-select` | warn → error | Use `Select` or `SelectDropdown` |
| `no-raw-textarea` | warn → error | Use `TextArea` |
| `no-restricted-imports` | error | Block external icon libraries |

### Check Violations

```bash
cd frontend
npm run lint
```

### Plugin Location

The custom ESLint plugin is at: `frontend/eslint/eslint-plugin-igny8-design-system.cjs`

---

## For AI Agents

When working on this codebase:

1. **Read first**: [docs/30-FRONTEND/COMPONENT-SYSTEM.md](docs/30-FRONTEND/COMPONENT-SYSTEM.md)
2. **Never use**: `<button>`, `<input>`, `<select>`, `<textarea>` tags
3. **Import icons from**: `src/icons` only
4. **Verify after changes**: `npm run lint`
5. **Reference pages**: Planner and Writer modules use correct patterns

### Correct Import Paths

```tsx
// From a page in src/pages/
import Button from '../components/ui/button/Button';
import IconButton from '../components/ui/button/IconButton';
import InputField from '../components/form/input/InputField';
import { PlusIcon, CloseIcon } from '../icons';

// From a component in src/components/
import Button from '../../components/ui/button/Button';
import { PlusIcon } from '../../icons';

// From a nested component
// Adjust ../ based on depth
```

---

## File Structure

```
frontend/
├── eslint/
│   └── eslint-plugin-igny8-design-system.cjs   # Custom rules
├── src/
│   ├── components/
│   │   ├── ui/                 # Display components
│   │   │   ├── button/         # Button, IconButton
│   │   │   ├── badge/          # Badge
│   │   │   ├── card/           # Card
│   │   │   ├── modal/          # Modal
│   │   │   └── ...
│   │   └── form/               # Form components
│   │       ├── input/          # InputField, Checkbox, Radio, TextArea
│   │       ├── switch/         # Switch
│   │       ├── Select.tsx
│   │       └── ...
│   ├── icons/                  # All SVG icons
│   │   └── index.ts            # Export all icons
│   └── styles/
│       └── design-system.css   # Design tokens
docs/
└── 30-FRONTEND/
    └── COMPONENT-SYSTEM.md     # Full component documentation
```

---

## Migration Checklist

When fixing violations:

- [ ] Replace `<button>` with `Button` or `IconButton`
- [ ] Replace `<input type="text/email/password/number">` with `InputField`
- [ ] Replace `<input type="checkbox">` with `Checkbox`
- [ ] Replace `<input type="radio">` with `Radio`
- [ ] Replace `<select>` with `Select` or `SelectDropdown`
- [ ] Replace `<textarea>` with `TextArea`
- [ ] Replace external icon imports with `src/icons`
- [ ] Run `npm run lint` to verify
- [ ] Run `npm run build` to confirm no errors
IGNY8-APP.md (383 lines removed)
@@ -1,383 +0,0 @@
# IGNY8 - AI-Powered SEO Content Platform

**Version:** 1.1.0
**Last Updated:** December 25, 2025
**Status:** Production Ready

---

## What is IGNY8?

IGNY8 is an enterprise-grade AI content platform that turns keyword research into published, SEO-optimized articles at scale. The platform automates the entire content lifecycle, from discovering keywords to publishing polished articles with AI-generated images, reducing days of manual work to hours.

**The Problem:** Creating SEO content at scale requires keyword research, content planning, writing, image creation, optimization, and publishing: a process that is labor-intensive and inconsistent.

**The Solution:** IGNY8 provides an end-to-end automated pipeline where AI handles clustering, ideation, writing, and image generation while you maintain editorial control.

---

## Platform Architecture

| Aspect | Details |
|--------|---------|
| **Type** | Full-stack SaaS platform |
| **Architecture** | Multi-tenant with complete data isolation |
| **Target Users** | Content marketers, SEO agencies, digital publishers |
| **Deployment** | Docker-based, cloud-hosted |
| **Pricing Model** | Credits + monthly subscription plans |

---

## Core Workflow

IGNY8 follows a structured 8-stage content pipeline:

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  SETUP       WORKFLOW                                                       │
│  ─────       ────────                                                       │
│  Sites → Keywords → Clusters → Ideas → Tasks → Content → Images → Published │
│    ↑                    ↑                         ↑                         │
│ Configure            AI Groups                AI Writes                     │
│ WordPress            Related                  Full Articles                 │
│                      Keywords                 + SEO Meta                    │
└─────────────────────────────────────────────────────────────────────────────┘
```

### Manual Mode
Walk through each stage with full control: review and edit at every step.

### Automation Mode
Configure once, then let IGNY8 process all seven stages automatically on a schedule (daily, weekly, or monthly). Content lands in your review queue ready for approval.

---

## Feature Deep-Dive

### 1. Keyword Management (Planner)

**Import Options:**
- Upload a CSV with thousands of keywords
- Browse 50+ industries of pre-curated seed keywords
- Filter by country, difficulty, search volume

**AI Clustering:**
- GPT-4 analyzes keyword intent and relationships
- Groups related keywords into topical clusters
- Enables one comprehensive article per cluster (instead of keyword stuffing)

**Metrics Tracked:**
- Search volume (monthly)
- Keyword difficulty (0-100)
- CPC (cost-per-click)
- Intent classification (informational, commercial, transactional)

---

### 2. Content Ideation (Planner)

**AI-Generated Ideas:**
- Each cluster becomes a content brief
- Suggested titles, angles, and outlines
- Word-count recommendations based on competition
- Priority scoring by SEO potential

**Bulk Operations:**
- Generate ideas for multiple clusters at once
- Queue ideas directly to the Writer module
- Batch status updates

---

### 3. Content Generation (Writer)

**AI Writing Engine:**
- Powered by GPT-4/GPT-4 Turbo
- Produces structured articles (H2s, H3s, lists, paragraphs)
- SEO-optimized meta titles and descriptions
- Configurable length: 500 to 5,000+ words

**Content Types:**
- Blog posts and articles
- How-to guides
- Product comparisons
- Reviews and roundups

**Customization:**
- Custom prompt additions (appended to all AI prompts)
- Default tone selection (professional, casual, authoritative, etc.)
- Default article-length preferences

**Workflow States:**
- Queue → Draft → Review → Published
- Each stage has dedicated management views

---

### 4. Image Generation (Writer)

**AI Providers (OpenAI DALL-E and Runware):**

| Provider | Quality Tier | Best For |
|----------|--------------|----------|
| DALL-E 2 | Standard | Fast, economical |
| DALL-E 3 | Premium | High quality, detailed |
| Runware | Best | Alternative variety |

**Image Types:**
- Featured images (hero/thumbnail)
- In-article images (embedded in content)
- Desktop and mobile sizes (responsive)

**Smart Features:**
- AI extracts image prompts from article content
- Negative prompts for style control
- Multiple format support (WebP, JPG, PNG)
- Configurable max images per article

---

### 5. Automation Pipeline

**7-Stage Automated Workflow:**
```
Stage 1: Process new keywords
Stage 2: AI cluster keywords
Stage 3: Generate content ideas
Stage 4: Create writer tasks
Stage 5: Generate article content
Stage 6: Extract image prompts
Stage 7: Generate images → Review queue
```

**Configuration:**
- Schedule: Daily, weekly, or monthly runs
- Run controls: Start, pause, resume
- Credit estimation before running
- Real-time progress tracking
- Activity log and run history

---

### 6. WordPress Integration

**Publishing:**
- One-click publish from the Review tab
- Direct WordPress REST API integration
- Supports multiple WordPress sites per account

**Synchronization:**
- Two-way content sync (import/export)
- Category and tag mapping
- Featured image upload
- Post type configuration (posts, pages, custom)

**Setup:**
- Site URL and REST API authentication
- Content type mapping in Site Settings
- Auto-publish toggle (skip review step)
- Auto-sync toggle (keep WordPress updated)
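The publishing step above maps naturally onto the standard WordPress REST API (`POST /wp-json/wp/v2/posts`). The sketch below only builds the request payload; it is not IGNY8's actual publisher code, and the article field names (`meta_title`, `html`, etc.) are hypothetical:

```python
# Sketch of the payload a WordPress publish call would send.
# The endpoint and core fields (title, content, excerpt, status,
# categories, tags) are the standard WordPress REST API; everything
# IGNY8-specific here is illustrative.

def build_post_payload(article: dict) -> dict:
    """Map a reviewed article onto WordPress post fields."""
    return {
        "title": article["meta_title"],
        "content": article["html"],
        "excerpt": article["meta_description"],  # meta description reused as excerpt
        "status": "publish" if article.get("auto_publish") else "draft",
        "categories": article.get("category_ids", []),
        "tags": article.get("tag_ids", []),
    }

payload = build_post_payload({
    "meta_title": "How to Cluster Keywords",
    "html": "<h2>Why cluster?</h2><p>...</p>",
    "meta_description": "A practical guide to keyword clustering.",
    "auto_publish": False,   # auto-publish off, so the post lands as a draft
    "category_ids": [12],
})

# The actual request would then be something like:
#   requests.post(f"{site_url}/wp-json/wp/v2/posts", json=payload,
#                 auth=(username, application_password))
```

With the auto-publish toggle off, `status` stays `"draft"`, which is what keeps the review step in the loop.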
---

### 7. Team & Account Management

**User Roles:**

| Role | Permissions |
|------|-------------|
| Admin | Full access, billing, team management |
| Manager | Content + billing view, no team management |
| Editor | AI content, clusters, tasks |
| Viewer | Read-only dashboards |

**Team Features:**
- Invite team members by email
- Remove members
- Role display (editing roles coming soon)

**Account Settings:**
- Organization name and billing address
- Tax ID / VAT number
- Billing email

---

### 8. Usage & Billing

**Credit System:**

| Operation | Typical Credits |
|-----------|-----------------|
| Keyword clustering (batch) | 10 |
| Content idea generation | 2 |
| Article (per 100 words) | 5 |
| Image (standard) | 3 |
| Image (premium/best) | 5 |
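Using the typical rates above, the cost of one automated article can be worked out directly. This is a back-of-envelope sketch assuming credits scale linearly with word count; the platform's real estimator may differ:

```python
# Typical credit rates from the table above.
RATES = {
    "idea": 2,                   # per content idea
    "article_per_100_words": 5,  # per 100 words of article
    "image_standard": 3,
    "image_premium": 5,
}

def estimate_article_credits(words: int, images: int, premium_images: bool = False) -> int:
    """Rough credit estimate for one idea + article + its images."""
    image_rate = RATES["image_premium"] if premium_images else RATES["image_standard"]
    return (
        RATES["idea"]
        + (words // 100) * RATES["article_per_100_words"]
        + images * image_rate
    )

# A 1,500-word article with 3 standard images:
# 2 + 15 * 5 + 3 * 3 = 86 credits
print(estimate_article_credits(1500, 3))  # 86
```

At that rate, a Starter plan's 1,000 monthly credits cover roughly a dozen such articles.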
**Usage Tracking:**
- Real-time credit balance
- Monthly usage vs. limits
- Transaction history
- Hard limits (sites, users, keywords, clusters)
- Monthly limits (ideas, words, images)

**Subscription Plans:**

| Plan | Sites | Users | Credits/Month | Best For |
|------|-------|-------|---------------|----------|
| Free | 1 | 1 | 100 | Trial/Evaluation |
| Starter | 3 | 3 | 1,000 | Individual creators |
| Growth | 10 | 10 | 5,000 | Small teams |
| Scale | Unlimited | Unlimited | 25,000 | Agencies |

---

## Module Status

| Module | Status | Location |
|--------|--------|----------|
| **Dashboard** | ✅ Active | `/` |
| **Add Keywords** | ✅ Active | `/setup/add-keywords` |
| **Content Settings** | ✅ Active | `/account/content-settings` |
| **Sites** | ✅ Active | `/sites` |
| **Thinker** | ✅ Active (Admin) | `/thinker/prompts` |
| **Planner** | ✅ Active | `/planner/keywords` |
| **Writer** | ✅ Active | `/writer/tasks` |
| **Automation** | ✅ Active | `/automation` |
| **Account Settings** | ✅ Active | `/account/settings` |
| **Plans & Billing** | ✅ Active | `/account/plans` |
| **Usage** | ✅ Active | `/account/usage` |
| **AI Models** | ✅ Active (Admin) | `/settings/integration` |
| **Help** | ✅ Active | `/help` |
| **SiteBuilder** | ❌ Deprecated | Removed - was for site structure generation |
| **Linker** | ⏸️ Phase 2 | Internal linking suggestions (disabled by default) |
| **Optimizer** | ⏸️ Phase 2 | Content optimization (disabled by default) |

### Module Status Details

| Module | Status | Notes |
|--------|--------|-------|
| **SiteBuilder** | ❌ Deprecated | Code exists but the feature is removed. Marked for cleanup. |
| **Linker** | ⏸️ Phase 2 | Feature flag: `linker_enabled`. Available but disabled by default. |
| **Optimizer** | ⏸️ Phase 2 | Feature flag: `optimizer_enabled`. Available but disabled by default. |

To enable Phase 2 modules, update via Django Admin:
- `GlobalModuleSettings` (pk=1) for platform-wide settings
- `ModuleEnableSettings` for per-account settings

---

## Navigation Structure

```
Sidebar Menu
├── Dashboard
├── SETUP
│   ├── Add Keywords
│   ├── Content Settings
│   ├── Sites (if enabled)
│   └── Thinker (admin only, if enabled)
├── WORKFLOW
│   ├── Planner (Keywords → Clusters → Ideas)
│   ├── Writer (Queue → Drafts → Images → Review → Published)
│   ├── Automation
│   ├── Linker (if enabled)
│   └── Optimizer (if enabled)
├── ACCOUNT
│   ├── Account Settings (Account → Profile → Team)
│   ├── Plans & Billing (Plan → Upgrade → History)
│   ├── Usage (Limits → Credit History → API Activity)
│   └── AI Models (admin only)
└── HELP
    └── Help & Docs
```

---

## Technical Stack

| Layer | Technology |
|-------|------------|
| **Backend** | Django 5.x, Django REST Framework, Python 3.11+ |
| **Frontend** | React 19, TypeScript, Vite, TailwindCSS |
| **Database** | PostgreSQL 15+ |
| **Cache/Sessions** | Redis |
| **Task Queue** | Celery + Celery Beat |
| **AI Services** | OpenAI GPT-4, DALL-E 3, Runware |
| **Deployment** | Docker, Docker Compose |

---

## Security

- **Data Isolation:** Complete multi-tenant separation at the database level
- **Authentication:** JWT tokens with Redis session management
- **Encryption:** Data encrypted at rest and in transit
- **Access Control:** Role-based permissions per account
- **Session Security:** Secure cookie handling, session integrity checks

---

## Current Limitations & Known Issues

**Payment Processing:**
- Stripe/PayPal pending production credentials
- Manual payment methods available

**Pending Backend Implementation:**
- Content Generation settings (append prompt, tone, length) - UI exists, API pending
- Publishing settings (auto-publish, sync) - UI exists, API pending
- Profile settings save - UI exists, API pending
- Password change functionality
- API Activity tracking (currently placeholder data)

**Disabled Modules:**
- Linker (internal linking) - Available but disabled
- Optimizer (content optimization) - Available but disabled

---

## Getting Started

### For New Users
1. Create an account and verify your email
2. Create your first site (industry + sectors)
3. Connect WordPress (optional)
4. Add keywords from the seed database or import a CSV
5. Run AI clustering on keywords
6. Generate content ideas from clusters
7. Queue ideas to Writer
8. Generate content and images
9. Review and publish

### For Admins
1. Configure AI Models (OpenAI API key, Runware key)
2. Customize prompts in the Thinker module
3. Set up global module settings
4. Configure automation schedules

---

## Documentation

| Document | Location |
|----------|----------|
| Technical Docs | `/docs/INDEX.md` |
| API Reference | `/docs/20-API/` |
| Pre-Launch Audit | `/PRE-LAUNCH-AUDIT.md` |
| Changelog | `/CHANGELOG.md` |

---

## Version History

| Version | Date | Highlights |
|---------|------|------------|
| **1.1.0** | Dec 25, 2025 | UX overhaul, page consolidation, pre-launch audit |
| 1.0.5 | Dec 12, 2025 | Purchase Credits tab |
| 1.0.4 | Dec 12, 2025 | Credit Activity tab |
| 1.0.3 | Dec 12, 2025 | Usage Limits improvements |
| 1.0.2 | Dec 12, 2025 | Design system enforcement |
| 1.0.1 | Dec 12, 2025 | Plan limits UI |
| 1.0.0 | Dec 12, 2025 | Initial production release |

---

*For detailed technical implementation, see the `/docs` folder. For known issues and the improvement roadmap, see `/PRE-LAUNCH-AUDIT.md`.*
README.md (651 lines changed)
@@ -1,386 +1,385 @@
# IGNY8 - AI-Powered SEO Content Platform
# IGNY8 Platform

**Version:** 1.0.5
**License:** Proprietary
**Website:** https://igny8.com
Full-stack SaaS platform for SEO keyword management and AI-driven content generation, built with Django REST Framework and React.

**Last Updated:** 2025-01-XX

---

## Quick Links
## 🏗️ Architecture

| Document | Description |
|----------|-------------|
| [IGNY8-APP.md](IGNY8-APP.md) | Executive summary (non-technical) |
| [docs/INDEX.md](docs/INDEX.md) | Full documentation index |
| [CHANGELOG.md](CHANGELOG.md) | Version history |
| [RULES.md](RULES.md) | Documentation maintenance rules |
- **Backend**: Django 5.2+ with Django REST Framework (Port 8010/8011)
- **Frontend**: React 19 with TypeScript and Vite (Port 5173/8021)
- **Database**: PostgreSQL 15
- **Task Queue**: Celery with Redis
- **Reverse Proxy**: Caddy (HTTPS on port 443)
- **Deployment**: Docker-based containerization

---

## What is IGNY8?

IGNY8 is a full-stack SaaS platform that combines AI-powered content generation with intelligent SEO management. It helps content creators, marketers, and agencies streamline their content workflow from keyword research to published articles.

### Key Features

- 🔍 **Smart Keyword Management** - Import, cluster, and organize keywords with AI
- ✍️ **AI Content Generation** - Generate SEO-optimized blog posts using GPT-4
- 🖼️ **AI Image Creation** - Auto-generate featured and in-article images
- 🔗 **Internal Linking** - AI-powered link suggestions (coming soon)
- 📊 **Content Optimization** - Analyze and score content quality (coming soon)
- 🔄 **WordPress Integration** - Bidirectional sync with WordPress sites
- 📈 **Usage-Based Billing** - Credit system for AI operations
- 👥 **Multi-Tenancy** - Manage multiple sites and teams

---

## Repository Structure
## 📁 Project Structure

```
igny8/
├── README.md                 # This file
├── CHANGELOG.md              # Version history
├── IGNY8-APP.md              # Executive summary
├── RULES.md                  # Documentation rules
├── backend/                  # Django REST API + Celery
├── frontend/                 # React + Vite SPA
├── docs/                     # Full documentation
│   ├── INDEX.md              # Documentation navigation
│   ├── 00-SYSTEM/            # Architecture & auth
│   ├── 10-MODULES/           # Module documentation
│   ├── 20-API/               # API endpoints
│   ├── 30-FRONTEND/          # Frontend pages & stores
│   ├── 40-WORKFLOWS/         # Cross-module workflows
│   ├── 50-DEPLOYMENT/        # Deployment guides
│   └── 90-REFERENCE/         # Models & AI functions
├── backend/                  # Django backend
│   ├── igny8_core/           # Django project
│   │   ├── modules/          # Feature modules (Planner, Writer, System, Billing, Auth)
│   │   ├── ai/               # AI framework
│   │   ├── api/              # API base classes
│   │   └── middleware/       # Custom middleware
│   ├── Dockerfile
│   └── requirements.txt
├── frontend/                 # React frontend
│   ├── src/
│   │   ├── pages/            # Page components
│   │   ├── services/         # API clients
│   │   ├── components/       # UI components
│   │   ├── config/           # Configuration files
│   │   └── stores/           # Zustand stores
│   ├── Dockerfile
│   ├── Dockerfile.dev        # Development mode
│   └── vite.config.ts
├── docs/                     # Complete documentation
│   ├── 00-DOCUMENTATION-MANAGEMENT.md   # Documentation & changelog management (READ FIRST)
│   ├── 01-TECH-STACK-AND-INFRASTRUCTURE.md
│   ├── 02-APPLICATION-ARCHITECTURE.md
│   ├── 03-FRONTEND-ARCHITECTURE.md
│   ├── 04-BACKEND-IMPLEMENTATION.md
│   ├── 05-AI-FRAMEWORK-IMPLEMENTATION.md
│   ├── 06-FUNCTIONAL-BUSINESS-LOGIC.md
│   ├── API-COMPLETE-REFERENCE.md        # Complete unified API documentation
│   ├── planning/                        # Architecture & implementation planning documents
│   │   ├── IGNY8-HOLISTIC-ARCHITECTURE-PLAN.md  # Complete architecture plan
│   │   ├── IGNY8-IMPLEMENTATION-PLAN.md         # Step-by-step implementation plan
│   │   ├── Igny8-phase-2-plan.md                # Phase 2 feature specifications
│   │   ├── CONTENT-WORKFLOW-DIAGRAM.md          # Content workflow diagrams
│   │   ├── ARCHITECTURE_CONTEXT.md              # Architecture context reference
│   │   └── sample-usage-limits-credit-system    # Credit system specification
│   └── refactor/                        # Refactoring plans and documentation
├── CHANGELOG.md              # Version history and changes (only updated after user confirmation)
└── docker-compose.app.yml
```
**Separate Repository:**
- [igny8-wp-integration](https://github.com/alorig/igny8-wp-integration) - WordPress bridge plugin

---

## Quick Start
## 🚀 Quick Start

### Prerequisites

- **Python 3.11+**
- **Node.js 18+**
- **PostgreSQL 14+**
- **Redis 7+**
- **Docker** (optional, recommended for local development)
- Docker & Docker Compose
- Node.js 18+ (for local development)
- Python 3.11+ (for local development)

### Local Development with Docker
### Development Setup

1. **Clone the repository**
   ```powershell
   git clone https://github.com/alorig/igny8-app.git
   cd igny8
   ```
1. **Navigate to the project directory:**
   ```bash
   cd /data/app/igny8
   ```

2. **Set environment variables**

   Create `.env` file in `backend/` directory:
   ```env
   SECRET_KEY=your-secret-key-here
   DEBUG=True
   DATABASE_URL=postgresql://postgres:postgres@db:5432/igny8
   REDIS_URL=redis://redis:6379/0
   OPENAI_API_KEY=your-openai-key
   RUNWARE_API_KEY=your-runware-key
   ```
2. **Backend Setup:**
   ```bash
   cd backend
   pip install -r requirements.txt
   python manage.py migrate
   python manage.py createsuperuser
   python manage.py runserver
   ```

3. **Start services**
   ```powershell
   docker-compose -f docker-compose.app.yml up --build
   ```
3. **Frontend Setup:**
   ```bash
   cd frontend
   npm install
   npm run dev
   ```

4. **Access applications**
4. **Access:**
   - Frontend: http://localhost:5173
   - Backend API: http://localhost:8000
   - API Docs: http://localhost:8000/api/docs/
   - Django Admin: http://localhost:8000/admin/
   - Backend API: http://localhost:8011/api/
   - Admin: http://localhost:8011/admin/

### Manual Setup (Without Docker)
### Docker Setup

#### Backend Setup
```bash
# Build images
docker build -f backend/Dockerfile -t igny8-backend ./backend
docker build -f frontend/Dockerfile.dev -t igny8-frontend-dev ./frontend
```

```powershell
cd backend

# Create virtual environment
python -m venv .venv
.\.venv\Scripts\Activate.ps1

# Install dependencies
pip install -r requirements.txt

# Run migrations
python manage.py migrate

# Create superuser
python manage.py createsuperuser

# Run development server
python manage.py runserver
```

```bash
# Run with docker-compose
docker-compose -f docker-compose.app.yml up
```

In separate terminals, start Celery:

```powershell
# Celery worker
celery -A igny8_core worker -l info

# Celery beat (scheduled tasks)
celery -A igny8_core beat -l info
```

#### Frontend Setup

```powershell
cd frontend

# Install dependencies
npm install

# Start dev server
npm run dev
```

For complete installation guide, see [docs/01-TECH-STACK-AND-INFRASTRUCTURE.md](docs/01-TECH-STACK-AND-INFRASTRUCTURE.md).

---
## Project Architecture
## 📚 Features

### System Overview
### ✅ Implemented

```
User Interface (React)
          ↓
   REST API (Django)
          ↓
  ┌───────┴────────┐
  │                │
Database        AI Engine
(PostgreSQL)  (Celery + OpenAI)
                   │
          WordPress Plugin
        (Bidirectional Sync)
```

- **Foundation**: Multi-tenancy system, Authentication (login/register), RBAC permissions
- **Planner Module**: Keywords, Clusters, Content Ideas (full CRUD, filtering, pagination, bulk operations, CSV import/export, AI clustering)
- **Writer Module**: Tasks, Content, Images (full CRUD, AI content generation, AI image generation)
- **Thinker Module**: Prompts, Author Profiles, Strategies, Image Testing
- **System Module**: Settings, Integrations (OpenAI, Runware), AI Prompts
- **Billing Module**: Credits, Transactions, Usage Logs
- **AI Functions**: 5 AI operations (Auto Cluster, Generate Ideas, Generate Content, Generate Image Prompts, Generate Images)
- **Frontend**: Complete component library, 4 master templates, config-driven UI system
- **Backend**: REST API with tenant isolation, Site > Sector hierarchy, Celery async tasks
- **WordPress Integration**: Direct publishing to WordPress sites
- **Development**: Docker Compose setup, hot reload, TypeScript + React

### 🚧 In Progress

- Planner Dashboard enhancement with KPIs
- Automation & CRON tasks
- Advanced analytics

### 🔄 Planned

- Analytics module enhancements
- Advanced scheduling features
- Additional AI model integrations

---

## 🔗 API Documentation

### Interactive Documentation

- **Swagger UI**: `https://api.igny8.com/api/docs/`
- **ReDoc**: `https://api.igny8.com/api/redoc/`
- **OpenAPI Schema**: `https://api.igny8.com/api/schema/`

### API Complete Reference

**[API Complete Reference](docs/API-COMPLETE-REFERENCE.md)** - Comprehensive unified API documentation (single source of truth)
- Complete endpoint reference (100+ endpoints across all modules)
- Authentication & authorization guide
- Response format standards (unified format: `{success, data, message, errors, request_id}`)
- Error handling
- Rate limiting (scoped by operation type)
- Pagination
- Roles & permissions
- Tenant/site/sector scoping
- Integration examples (Python, JavaScript, cURL, PHP)
- Testing & debugging
- Change management

### API Standard Features

- ✅ **Unified Response Format** - Consistent JSON structure for all endpoints
- ✅ **Layered Authorization** - Authentication → Tenant → Role → Site/Sector
- ✅ **Centralized Error Handling** - All errors in unified format with request_id
- ✅ **Scoped Rate Limiting** - Different limits per operation type (10-100/min)
- ✅ **Tenant Isolation** - Account/site/sector scoping
- ✅ **Request Tracking** - Unique request ID for debugging
- ✅ **100% Implemented** - All endpoints use unified format

### Quick API Example

```bash
# Login
curl -X POST https://api.igny8.com/api/v1/auth/login/ \
  -H "Content-Type: application/json" \
  -d '{"email":"user@example.com","password":"password"}'

# Get keywords (with token)
curl -X GET https://api.igny8.com/api/v1/planner/keywords/ \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json"
```
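Every endpoint returns the unified `{success, data, message, errors, request_id}` envelope described above, so client code can unwrap it in one place. A minimal sketch of such a helper — the `unwrap` function and the sample payload are illustrative, not part of the IGNY8 codebase:

```python
# Minimal sketch of handling the unified response format
# {success, data, message, errors, request_id}.
# The unwrap() helper and the sample payload are illustrative only.

def unwrap(payload: dict):
    """Return payload['data'] on success, raise with details otherwise."""
    if payload.get("success"):
        return payload.get("data")
    raise RuntimeError(
        f"API error (request_id={payload.get('request_id')}): "
        f"{payload.get('message')} {payload.get('errors')}"
    )

sample = {
    "success": True,
    "data": {"results": [{"id": 1, "keyword": "seo tools"}]},
    "message": "OK",
    "errors": None,
    "request_id": "abc-123",
}
print(unwrap(sample)["results"][0]["keyword"])  # seo tools
```

Because failures carry `request_id`, surfacing it in the raised error makes server-side debugging straightforward.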
### Tech Stack
### Additional API Guides

- **[Authentication Guide](docs/AUTHENTICATION-GUIDE.md)** - Detailed JWT authentication guide
- **[Error Codes Reference](docs/ERROR-CODES.md)** - Complete error code reference
- **[Rate Limiting Guide](docs/RATE-LIMITING.md)** - Rate limiting and throttling details
- **[Migration Guide](docs/MIGRATION-GUIDE.md)** - Migrating to API v1.0
- **[WordPress Plugin Integration](docs/WORDPRESS-PLUGIN-INTEGRATION.md)** - WordPress integration guide

For backend implementation details, see [docs/04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md).

---

## 📖 Documentation

All documentation is consolidated in the `/docs/` folder.

**⚠️ IMPORTANT FOR AI AGENTS**: Before making any changes, read:
1. **[00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md)** - Versioning, changelog, and DRY principles
2. **[CHANGELOG.md](CHANGELOG.md)** - Current version and change history

### Core Documentation

0. **[00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md)** ⚠️ **READ FIRST**
   - Documentation and changelog management system
   - Versioning system (Semantic Versioning)
   - Changelog update rules (only after user confirmation)
   - DRY principles and standards
   - AI agent instructions

1. **[01-TECH-STACK-AND-INFRASTRUCTURE.md](docs/01-TECH-STACK-AND-INFRASTRUCTURE.md)**
   - Technology stack overview
   - Infrastructure components
   - Docker deployment architecture
   - Fresh installation guide
   - External service integrations

2. **[02-APPLICATION-ARCHITECTURE.md](docs/02-APPLICATION-ARCHITECTURE.md)**
   - IGNY8 application architecture
   - System hierarchy and relationships
   - User roles and access control
   - Module organization
   - Complete workflows
   - Data models and relationships
   - Multi-tenancy architecture
   - API architecture
   - Security architecture

3. **[03-FRONTEND-ARCHITECTURE.md](docs/03-FRONTEND-ARCHITECTURE.md)**
   - Frontend architecture
   - Project structure
   - Routing system
   - Template system
   - Component library
   - State management
   - API integration
   - Configuration system
   - All pages and features

4. **[04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md)**
   - Backend architecture
   - Project structure
   - Models and relationships
   - ViewSets and API endpoints
   - Serializers
   - Celery tasks
   - Middleware
   - All modules (Planner, Writer, System, Billing, Auth)

5. **[05-AI-FRAMEWORK-IMPLEMENTATION.md](docs/05-AI-FRAMEWORK-IMPLEMENTATION.md)**
   - AI framework architecture and code structure
   - All 5 AI functions (technical implementation)
   - AI function execution flow
   - Progress tracking
   - Cost tracking
   - Prompt management
   - Model configuration

6. **[06-FUNCTIONAL-BUSINESS-LOGIC.md](docs/06-FUNCTIONAL-BUSINESS-LOGIC.md)**
   - Complete functional and business logic documentation
   - All workflows and processes
   - All features and functions
   - How the application works from a business perspective
   - Credit system details
   - WordPress integration
   - Data flow and state management

### Quick Start Guide

**For AI Agents**: Start with [00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md) to understand versioning, changelog, and DRY principles.

1. **New to IGNY8?** Start with [01-TECH-STACK-AND-INFRASTRUCTURE.md](docs/01-TECH-STACK-AND-INFRASTRUCTURE.md) for a technology overview
2. **Understanding the System?** Read [02-APPLICATION-ARCHITECTURE.md](docs/02-APPLICATION-ARCHITECTURE.md) for the complete architecture
3. **Frontend Development?** See [03-FRONTEND-ARCHITECTURE.md](docs/03-FRONTEND-ARCHITECTURE.md) for all frontend details
4. **Backend Development?** See [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) for all backend details
5. **Working with AI?** See [05-AI-FRAMEWORK-IMPLEMENTATION.md](docs/05-AI-FRAMEWORK-IMPLEMENTATION.md) for the AI framework implementation
6. **Understanding Business Logic?** See [06-FUNCTIONAL-BUSINESS-LOGIC.md](docs/06-FUNCTIONAL-BUSINESS-LOGIC.md) for complete workflows and features
7. **What's New?** Check [CHANGELOG.md](CHANGELOG.md) for recent changes

### Finding Information

**By Topic:**
- **API Documentation**: [API-COMPLETE-REFERENCE.md](docs/API-COMPLETE-REFERENCE.md) - Complete unified API reference (single source of truth)
- **Infrastructure & Deployment**: [01-TECH-STACK-AND-INFRASTRUCTURE.md](docs/01-TECH-STACK-AND-INFRASTRUCTURE.md)
- **Application Architecture**: [02-APPLICATION-ARCHITECTURE.md](docs/02-APPLICATION-ARCHITECTURE.md)
- **Frontend Development**: [03-FRONTEND-ARCHITECTURE.md](docs/03-FRONTEND-ARCHITECTURE.md)
- **Backend Development**: [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md)
- **AI Framework Implementation**: [05-AI-FRAMEWORK-IMPLEMENTATION.md](docs/05-AI-FRAMEWORK-IMPLEMENTATION.md)
- **Business Logic & Workflows**: [06-FUNCTIONAL-BUSINESS-LOGIC.md](docs/06-FUNCTIONAL-BUSINESS-LOGIC.md)
- **Changes & Updates**: [CHANGELOG.md](CHANGELOG.md)
- **Documentation Management**: [00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md) ⚠️ **For AI Agents**

**By Module:**
- **Planner**: See [02-APPLICATION-ARCHITECTURE.md](docs/02-APPLICATION-ARCHITECTURE.md) (Module Organization) and [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) (Planner Module)
- **Writer**: See [02-APPLICATION-ARCHITECTURE.md](docs/02-APPLICATION-ARCHITECTURE.md) (Module Organization) and [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) (Writer Module)
- **Thinker**: See [03-FRONTEND-ARCHITECTURE.md](docs/03-FRONTEND-ARCHITECTURE.md) (Thinker Pages) and [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) (System Module)
- **System**: See [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) (System Module)
- **Billing**: See [04-BACKEND-IMPLEMENTATION.md](docs/04-BACKEND-IMPLEMENTATION.md) (Billing Module)

---

## 🛠️ Development

### Technology Stack

**Backend:**
- Django 5.2+ (Python web framework)
- Django REST Framework (API)
- PostgreSQL (Database)
- Celery (Async task queue)
- Redis (Message broker)
- OpenAI API (Content generation)
- Django 5.2+
- Django REST Framework
- PostgreSQL 15
- Celery 5.3+
- Redis 7

**Frontend:**
- React 19 (UI library)
- Vite 6 (Build tool)
- Zustand (State management)
- React Router v7 (Routing)
- Tailwind CSS 4 (Styling)
- React 19
- TypeScript 5.7+
- Vite 6.1+
- Tailwind CSS 4.0+
- Zustand 5.0+

**WordPress Plugin:**
- PHP 7.4+ (WordPress compatibility)
- REST API integration
- Bidirectional sync

**Infrastructure:**
- Docker & Docker Compose
- Caddy (Reverse Proxy)
- Portainer (Container Management)

### System Capabilities

- **Multi-Tenancy**: Complete account isolation with automatic filtering
- **Planner Module**: Keywords, Clusters, Content Ideas management
- **Writer Module**: Tasks, Content, Images generation and management
- **Thinker Module**: Prompts, Author Profiles, Strategies, Image Testing
- **System Module**: Settings, Integrations, AI Prompts
- **Billing Module**: Credits, Transactions, Usage Logs
- **AI Functions**: 5 AI operations (Auto Cluster, Generate Ideas, Generate Content, Generate Image Prompts, Generate Images)

---

## How IGNY8 Works
---

### Content Creation Workflow
## 🔒 Documentation & Changelog Management

```
1. Import Keywords
        ↓
2. AI Clusters Keywords
        ↓
3. Generate Content Ideas
        ↓
4. Create Writer Tasks
        ↓
5. AI Generates Content
        ↓
6. AI Creates Images
        ↓
7. Publish to WordPress
        ↓
8. Sync Status Back
```
### Versioning System

### WordPress Integration
- **Format**: Semantic Versioning (MAJOR.MINOR.PATCH)
- **Current Version**: `1.0.0`
- **Location**: `CHANGELOG.md` (root directory)
- **Rules**: Only updated after user confirmation that fix/feature is complete

The WordPress bridge plugin (`igny8-wp-integration`) creates a bidirectional connection:
### Changelog Management

- **IGNY8 → WordPress:** Publish AI-generated content to WordPress
- **WordPress → IGNY8:** Sync post status updates back to IGNY8
- **Location**: `CHANGELOG.md` (root directory)
- **Rules**: Only updated after user confirmation
- **Structure**: Added, Changed, Fixed, Deprecated, Removed, Security
- **For Details**: See [00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md)

**Setup:**
1. Install WordPress plugin on your site
2. Generate API key in IGNY8 app
3. Connect plugin using email, password, and API key
4. Plugin syncs automatically
### DRY Principles

**Core Principle**: Always use existing, predefined, standardized components, utilities, functions, and configurations.

**Frontend**: Use existing templates, components, stores, contexts, utilities, and Tailwind CSS
**Backend**: Use existing base classes, AI framework, services, and middleware

**For Complete Guidelines**: See [00-DOCUMENTATION-MANAGEMENT.md](docs/00-DOCUMENTATION-MANAGEMENT.md)

**⚠️ For AI Agents**: Read `docs/00-DOCUMENTATION-MANAGEMENT.md` at the start of every session.

---

## Documentation
## 📝 License

Start here: [docs/README.md](./docs/README.md) (index of all topics).

Common entry points:
- App architecture: `docs/igny8-app/IGNY8-APP-ARCHITECTURE.md`
- Backend architecture: `docs/backend/IGNY8-BACKEND-ARCHITECTURE.md`
- Planner backend detail: `docs/backend/IGNY8-PLANNER-BACKEND.md`
- Writer backend detail: `docs/backend/IGNY8-WRITER-BACKEND.md`
- Automation: `docs/automation/AUTOMATION-REFERENCE.md`
- Tech stack: `docs/tech-stack/00-SYSTEM-ARCHITECTURE-MASTER-REFERENCE.md`
- API: `docs/API/API-COMPLETE-REFERENCE-LATEST.md`
- Billing & Credits: `docs/billing/billing-account-final-plan-2025-12-05.md`
- App guides: `docs/igny8-app/` (planner/writer workflows, taxonomy, feature modification)
- WordPress: `docs/wp/` (plugin integration and sync)
- Docs changelog: `docs/CHANGELOG.md`
[Add license information]

---

## Development Workflow
## 📞 Support

### Running Tests

```powershell
# Backend tests
cd backend
python manage.py test

# Frontend tests
cd frontend
npm run test
```

### Code Quality

```powershell
# Frontend linting
cd frontend
npm run lint
```

### Building for Production

```powershell
# Backend
cd backend
python manage.py collectstatic

# Frontend
cd frontend
npm run build
```

---

## API Overview

**Base URL:** `https://api.igny8.com/api/v1/`

**Authentication:** JWT Bearer token

**Key Endpoints:**
- `/auth/login/` - User authentication
- `/planner/keywords/` - Keyword management
- `/planner/clusters/` - Keyword clusters
- `/writer/tasks/` - Content tasks
- `/writer/content/` - Generated content
- `/integration/integrations/` - WordPress integrations

**Interactive Docs:**
- Swagger UI: https://api.igny8.com/api/docs/
- ReDoc: https://api.igny8.com/api/redoc/

See [API-COMPLETE-REFERENCE.md](./master-docs/API-COMPLETE-REFERENCE.md) for full documentation.

---

## Multi-Tenancy

IGNY8 supports complete account isolation:

```
Account (Organization)
├── Users (with roles: owner, admin, editor, viewer)
├── Sites (multiple WordPress sites)
└── Sectors (content categories)
    └── Keywords, Clusters, Content
```

All data is automatically scoped to the authenticated user's account.
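A minimal sketch of what that scoping looks like in practice, with plain Python standing in for a Django queryset filter — the record shape and `scope_to_account` helper are hypothetical, not the actual IGNY8 code:

```python
# Illustrative account-scoping filter: every query is restricted to the
# authenticated user's account, mirroring the hierarchy above.
# The record shape and scope_to_account() helper are hypothetical.

keywords = [
    {"id": 1, "account_id": "acct_a", "keyword": "seo audit"},
    {"id": 2, "account_id": "acct_b", "keyword": "link building"},
    {"id": 3, "account_id": "acct_a", "keyword": "content brief"},
]

def scope_to_account(rows, account_id):
    """Return only rows belonging to the given account."""
    return [r for r in rows if r["account_id"] == account_id]

print([r["id"] for r in scope_to_account(keywords, "acct_a")])  # [1, 3]
```

In the Django backend the equivalent is a queryset filter applied automatically (e.g. by middleware or a base manager) rather than in view code.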

---

## Contributing

This is a private repository. For internal development:

1. Create feature branch: `git checkout -b feature/your-feature`
2. Make changes and test thoroughly
3. Commit: `git commit -m "Add your feature"`
4. Push: `git push origin feature/your-feature`
5. Create Pull Request

---

## Deployment

### Production Deployment

1. **Set production environment variables**
2. **Build frontend:** `npm run build`
3. **Collect static files:** `python manage.py collectstatic`
4. **Run migrations:** `python manage.py migrate`
5. **Use docker-compose:** `docker-compose -f docker-compose.app.yml up -d`

### Environment Variables

Required for production:

```env
SECRET_KEY=<random-secret-key>
DEBUG=False
ALLOWED_HOSTS=api.igny8.com,app.igny8.com
DATABASE_URL=postgresql://user:pass@host:5432/dbname
REDIS_URL=redis://host:6379/0
OPENAI_API_KEY=<openai-key>
RUNWARE_API_KEY=<runware-key>
USE_SECURE_COOKIES=True
```
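`DATABASE_URL` packs the whole connection into one string; a minimal sketch of how such a URL decomposes, using only the standard library (how IGNY8's settings actually consume it may differ):

```python
# Illustrative parse of a DATABASE_URL-style connection string using
# the standard library. IGNY8's actual settings loading may differ.
from urllib.parse import urlparse

url = urlparse("postgresql://user:pass@host:5432/dbname")
config = {
    "ENGINE": "django.db.backends.postgresql",
    "NAME": url.path.lstrip("/"),   # path carries the database name
    "USER": url.username,
    "PASSWORD": url.password,
    "HOST": url.hostname,
    "PORT": url.port,
}
print(config["NAME"], config["HOST"], config["PORT"])  # dbname host 5432
```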

---

## Support

For support and questions:
- Check [MASTER_REFERENCE.md](./MASTER_REFERENCE.md) for detailed documentation
- Review API docs at `/api/docs/`
- Contact development team

---

## License

Proprietary. All rights reserved.

---

## Changelog

See [CHANGELOG.md](./CHANGELOG.md) for version history and updates.

---

**Built with ❤️ by the IGNY8 team**
# Test commit - Mon Dec 15 07:18:54 UTC 2025
For questions or clarifications about the documentation, refer to the specific document in the `/docs/` folder or contact the development team.
37
backend/=0.27.0
Normal file
@@ -0,0 +1,37 @@
Collecting drf-spectacular
  Downloading drf_spectacular-0.29.0-py3-none-any.whl.metadata (14 kB)
Requirement already satisfied: Django>=2.2 in /usr/local/lib/python3.11/site-packages (from drf-spectacular) (5.2.8)
Requirement already satisfied: djangorestframework>=3.10.3 in /usr/local/lib/python3.11/site-packages (from drf-spectacular) (3.16.1)
Collecting uritemplate>=2.0.0 (from drf-spectacular)
  Downloading uritemplate-4.2.0-py3-none-any.whl.metadata (2.6 kB)
Collecting PyYAML>=5.1 (from drf-spectacular)
  Downloading pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (2.4 kB)
Collecting jsonschema>=2.6.0 (from drf-spectacular)
  Downloading jsonschema-4.25.1-py3-none-any.whl.metadata (7.6 kB)
Collecting inflection>=0.3.1 (from drf-spectacular)
  Downloading inflection-0.5.1-py2.py3-none-any.whl.metadata (1.7 kB)
Requirement already satisfied: asgiref>=3.8.1 in /usr/local/lib/python3.11/site-packages (from Django>=2.2->drf-spectacular) (3.10.0)
Requirement already satisfied: sqlparse>=0.3.1 in /usr/local/lib/python3.11/site-packages (from Django>=2.2->drf-spectacular) (0.5.3)
Collecting attrs>=22.2.0 (from jsonschema>=2.6.0->drf-spectacular)
  Downloading attrs-25.4.0-py3-none-any.whl.metadata (10 kB)
Collecting jsonschema-specifications>=2023.03.6 (from jsonschema>=2.6.0->drf-spectacular)
  Downloading jsonschema_specifications-2025.9.1-py3-none-any.whl.metadata (2.9 kB)
Collecting referencing>=0.28.4 (from jsonschema>=2.6.0->drf-spectacular)
  Downloading referencing-0.37.0-py3-none-any.whl.metadata (2.8 kB)
Collecting rpds-py>=0.7.1 (from jsonschema>=2.6.0->drf-spectacular)
  Downloading rpds_py-0.28.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.1 kB)
Requirement already satisfied: typing-extensions>=4.4.0 in /usr/local/lib/python3.11/site-packages (from referencing>=0.28.4->jsonschema>=2.6.0->drf-spectacular) (4.15.0)
Downloading drf_spectacular-0.29.0-py3-none-any.whl (105 kB)
Downloading inflection-0.5.1-py2.py3-none-any.whl (9.5 kB)
Downloading jsonschema-4.25.1-py3-none-any.whl (90 kB)
Downloading attrs-25.4.0-py3-none-any.whl (67 kB)
Downloading jsonschema_specifications-2025.9.1-py3-none-any.whl (18 kB)
Downloading pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (806 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 806.6/806.6 kB 36.0 MB/s 0:00:00
Downloading referencing-0.37.0-py3-none-any.whl (26 kB)
Downloading rpds_py-0.28.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (382 kB)
Downloading uritemplate-4.2.0-py3-none-any.whl (11 kB)
Installing collected packages: uritemplate, rpds-py, PyYAML, inflection, attrs, referencing, jsonschema-specifications, jsonschema, drf-spectacular

Successfully installed PyYAML-6.0.3 attrs-25.4.0 drf-spectacular-0.29.0 inflection-0.5.1 jsonschema-4.25.1 jsonschema-specifications-2025.9.1 referencing-0.37.0 rpds-py-0.28.0 uritemplate-4.2.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.
@@ -22,10 +22,6 @@ RUN pip install --upgrade pip \
# Copy full project
COPY . /app/

# Copy startup script
COPY container_startup.sh /app/
RUN chmod +x /app/container_startup.sh

# Collect static files for WhiteNoise (skip during build if DB not available)
# Will be run during container startup if needed
RUN python manage.py collectstatic --noinput || echo "Skipping collectstatic during build"
@@ -36,7 +32,5 @@ ENV DJANGO_SETTINGS_MODULE=igny8_core.settings
# Expose port for Gunicorn (matches Portainer docker-compose config)
EXPOSE 8010

# Use startup script as entrypoint to log container lifecycle
# Start using Gunicorn (matches Portainer docker-compose config)
ENTRYPOINT ["/app/container_startup.sh"]
CMD ["gunicorn", "igny8_core.wsgi:application", "--bind", "0.0.0.0:8010"]

BIN
backend/celerybeat-schedule
Normal file
Binary file not shown.
@@ -1,47 +0,0 @@
#!/bin/bash
# Container Startup Logger
# Logs container lifecycle events for debugging restarts

set -e

echo "=========================================="
echo "[CONTAINER-STARTUP] $(date '+%Y-%m-%d %H:%M:%S')"
echo "Container: igny8_backend"
echo "Hostname: $(hostname)"
echo "PID: $$"
echo "=========================================="

# Log environment info
echo "[INFO] Python version: $(python --version 2>&1)"
echo "[INFO] Django settings: ${DJANGO_SETTINGS_MODULE:-igny8_core.settings}"
echo "[INFO] Debug mode: ${DEBUG:-False}"
echo "[INFO] Database host: ${DB_HOST:-not set}"

# Check if this is a restart (look for previous process artifacts)
if [ -f /tmp/container_pid ]; then
    PREV_PID=$(cat /tmp/container_pid)
    echo "[WARNING] Previous container PID found: $PREV_PID"
    echo "[WARNING] This appears to be a RESTART event"
    echo "[WARNING] Check Docker logs for SIGTERM/SIGKILL signals"
else
    echo "[INFO] First startup (no previous PID file found)"
fi

# Save current PID
echo $$ > /tmp/container_pid

# Run database migrations (will skip if up to date)
echo "[INFO] Running database migrations..."
python manage.py migrate --noinput || echo "[WARNING] Migration failed or skipped"

# Collect static files (skip if already done)
echo "[INFO] Collecting static files..."
python manage.py collectstatic --noinput || echo "[WARNING] Collectstatic failed or skipped"

echo "=========================================="
echo "[CONTAINER-STARTUP] Initialization complete"
echo "[CONTAINER-STARTUP] Starting Gunicorn..."
echo "=========================================="

# Execute the CMD passed to the script (Gunicorn command)
exec "$@"
@@ -1,61 +0,0 @@
|
||||
#!/usr/bin/env python
|
||||
"""Script to create admin permission groups"""
|
||||
import os
|
||||
import django
|
||||
|
||||
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'igny8_core.settings')
|
||||
django.setup()
|
||||
|
||||
from django.contrib.auth.models import Group, Permission
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
|
||||
groups_permissions = {
|
||||
'Content Manager': {
|
||||
'models': [
|
||||
('writer', 'content'), ('writer', 'tasks'), ('writer', 'images'),
|
||||
('planner', 'keywords'), ('planner', 'clusters'), ('planner', 'contentideas'),
|
||||
],
|
||||
'permissions': ['add', 'change', 'view'],
|
||||
},
|
||||
'Billing Admin': {
|
||||
'models': [
|
||||
('billing', 'payment'), ('billing', 'invoice'), ('billing', 'credittransaction'),
|
||||
('billing', 'creditusagelog'), ('igny8_core_auth', 'account'),
|
||||
],
|
||||
'permissions': ['add', 'change', 'view', 'delete'],
|
||||
},
|
||||
'Support Agent': {
|
||||
'models': [
|
||||
('writer', 'content'), ('writer', 'tasks'),
|
||||
('igny8_core_auth', 'account'), ('igny8_core_auth', 'site'),
|
||||
],
|
||||
'permissions': ['view'],
|
||||
},
|
||||
}
|
||||
|
||||
print('Creating admin permission groups...\n')
|
||||
|
||||
for group_name, config in groups_permissions.items():
|
||||
group, created = Group.objects.get_or_create(name=group_name)
|
||||
status = 'Created' if created else 'Updated'
|
||||
print(f'✓ {status} group: {group_name}')
|
||||
|
||||
group.permissions.clear()
|
||||
added = 0
|
||||
|
||||
for app_label, model_name in config['models']:
|
||||
try:
|
||||
ct = ContentType.objects.get(app_label=app_label, model=model_name)
|
||||
for perm_type in config['permissions']:
|
||||
try:
|
||||
perm = Permission.objects.get(content_type=ct, codename=f'{perm_type}_{model_name}')
|
||||
group.permissions.add(perm)
|
||||
added += 1
|
||||
except Permission.DoesNotExist:
|
||||
print(f' ! Permission not found: {perm_type}_{model_name}')
|
||||
except ContentType.DoesNotExist:
|
||||
print(f' ! ContentType not found: {app_label}.{model_name}')
|
||||
|
||||
print(f' Added {added} permissions')
|
||||
|
||||
print('\n✓ Permission groups created successfully!')
|
||||
187
backend/create_test_users.py
Normal file
@@ -0,0 +1,187 @@
#!/usr/bin/env python
"""
Script to create 3 real users with 3 paid packages (Starter, Growth, Scale).
All accounts will be active and properly configured.
Email format: plan-name@igny8.com
"""
import os
import sys
import django

# Setup Django
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'igny8_core.settings')
django.setup()

from django.db import transaction
from django.utils.text import slugify
from igny8_core.auth.models import Plan, Account, User

# User data - 3 users with 3 different paid plans
# Email format: plan-name@igny8.com
USERS_DATA = [
    {
        "email": "starter@igny8.com",
        "username": "starter",
        "first_name": "Starter",
        "last_name": "Account",
        "password": "SecurePass123!@#",
        "plan_slug": "starter",  # $89/month
        "account_name": "Starter Account",
    },
    {
        "email": "growth@igny8.com",
        "username": "growth",
        "first_name": "Growth",
        "last_name": "Account",
        "password": "SecurePass123!@#",
        "plan_slug": "growth",  # $139/month
        "account_name": "Growth Account",
    },
    {
        "email": "scale@igny8.com",
        "username": "scale",
        "first_name": "Scale",
        "last_name": "Account",
        "password": "SecurePass123!@#",
        "plan_slug": "scale",  # $229/month
        "account_name": "Scale Account",
    },
]


def create_user_with_plan(user_data):
    """Create a user with account and assigned plan."""
    try:
        with transaction.atomic():
            # Get the plan
            try:
                plan = Plan.objects.get(slug=user_data['plan_slug'], is_active=True)
            except Plan.DoesNotExist:
                print(f"❌ ERROR: Plan '{user_data['plan_slug']}' not found or inactive!")
                return None

            # Check if user already exists
            if User.objects.filter(email=user_data['email']).exists():
                print(f"⚠️  User {user_data['email']} already exists. Updating...")
                existing_user = User.objects.get(email=user_data['email'])
                if existing_user.account:
                    existing_user.account.plan = plan
                    existing_user.account.status = 'active'
                    existing_user.account.save()
                    print(f"   ✅ Updated account plan to {plan.name} and set status to active")
                return existing_user

            # Generate unique account slug
            base_slug = slugify(user_data['account_name'])
            account_slug = base_slug
            counter = 1
            while Account.objects.filter(slug=account_slug).exists():
                account_slug = f"{base_slug}-{counter}"
                counter += 1

            # Create user first (without account)
            user = User.objects.create_user(
                username=user_data['username'],
                email=user_data['email'],
                password=user_data['password'],
                first_name=user_data['first_name'],
                last_name=user_data['last_name'],
                account=None,  # Will be set after account creation
                role='owner'
            )

            # Create account with user as owner and assigned plan
            account = Account.objects.create(
                name=user_data['account_name'],
                slug=account_slug,
                owner=user,
                plan=plan,
                status='active',  # Set to active
                credits=plan.included_credits or 0,  # Set initial credits from plan
            )

            # Update user to reference the new account
            user.account = account
            user.save()

            print(f"✅ Created user: {user.email}")
            print(f"   - Name: {user.get_full_name()}")
            print(f"   - Username: {user.username}")
            print(f"   - Account: {account.name} (slug: {account.slug})")
            print(f"   - Plan: {plan.name} (${plan.price}/month)")
            print(f"   - Status: {account.status}")
            print(f"   - Credits: {account.credits}")
            print(f"   - Max Sites: {plan.max_sites}")
            print(f"   - Max Users: {plan.max_users}")
            print()

            return user

    except Exception as e:
        print(f"❌ ERROR creating user {user_data['email']}: {e}")
        import traceback
        traceback.print_exc()
        return None


def main():
    """Main function to create all users."""
    print("=" * 80)
    print("Creating 3 Users with Paid Plans")
    print("=" * 80)
    print()

    # Verify plans exist
    print("Checking available plans...")
    plans = Plan.objects.filter(is_active=True).order_by('price')
    if plans.count() < 3:
        print(f"⚠️  WARNING: Only {plans.count()} active plan(s) found. Need at least 3.")
        print("Available plans:")
        for p in plans:
            print(f"  - {p.slug} (${p.price})")
        print()
        print("Please run import_plans.py first to create the plans.")
        return

    print("✅ Found plans:")
    for p in plans:
        print(f"  - {p.name} ({p.slug}): ${p.price}/month")
    print()

    # Create users
    created_users = []
    for user_data in USERS_DATA:
        user = create_user_with_plan(user_data)
        if user:
            created_users.append(user)

    # Summary
    print("=" * 80)
    print("SUMMARY")
    print("=" * 80)
    print(f"Total users created/updated: {len(created_users)}")
    print()
    print("User Login Credentials:")
    print("-" * 80)
    for user_data in USERS_DATA:
        print(f"Email: {user_data['email']}")
        print(f"Password: {user_data['password']}")
        print(f"Plan: {user_data['plan_slug'].title()}")
        print()

    print("✅ All users created successfully!")
    print()
    print("You can now log in with any of these accounts at:")
    print("https://app.igny8.com/login")


if __name__ == '__main__':
    try:
        main()
    except Exception as e:
        print(f"❌ Fatal error: {e}", file=sys.stderr)
        import traceback
        traceback.print_exc()
        sys.exit(1)
BIN
backend/db.sqlite3
Normal file
Binary file not shown.

After Width: | Height: | Size: 164 KiB
@@ -1,7 +1,7 @@
"""
Admin module for IGNY8
"""
# Note: Igny8ModelAdmin is imported by individual admin modules as needed to avoid circular imports
from .base import AccountAdminMixin, SiteSectorAdminMixin

__all__ = []
__all__ = ['AccountAdminMixin', 'SiteSectorAdminMixin']
@@ -1,122 +0,0 @@
|
||||
"""
|
||||
Admin Alert System
|
||||
"""
|
||||
from django.utils import timezone
|
||||
from datetime import timedelta
|
||||
|
||||
|
||||
class AdminAlerts:
|
||||
"""System for admin alerts and notifications"""
|
||||
|
||||
@staticmethod
|
||||
def get_alerts():
|
||||
"""Get all active alerts"""
|
||||
alerts = []
|
||||
today = timezone.now().date()
|
||||
|
||||
# Check for pending payments
|
||||
from igny8_core.business.billing.models import Payment
|
||||
pending_payments = Payment.objects.filter(status='pending_approval').count()
|
||||
if pending_payments > 0:
|
||||
alerts.append({
|
||||
'level': 'warning',
|
||||
'icon': '⚠️',
|
||||
'message': f'{pending_payments} payment(s) awaiting approval',
|
||||
'url': '/admin/billing/payment/?status=pending_approval',
|
||||
'action': 'Review Payments'
|
||||
})
|
||||
|
||||
# Check for low credit accounts
|
||||
from igny8_core.auth.models import Account
|
||||
low_credit_accounts = Account.objects.filter(
|
||||
status='active',
|
||||
credits__lt=100
|
||||
).count()
|
||||
if low_credit_accounts > 0:
|
||||
alerts.append({
|
||||
'level': 'info',
|
||||
'icon': 'ℹ️',
|
||||
'message': f'{low_credit_accounts} account(s) with low credits',
|
||||
'url': '/admin/igny8_core_auth/account/?credits__lt=100',
|
||||
'action': 'View Accounts'
|
||||
})
|
||||
|
||||
# Check for very low credits (critical)
|
||||
critical_credit_accounts = Account.objects.filter(
|
||||
status='active',
|
||||
credits__lt=10
|
||||
).count()
|
||||
if critical_credit_accounts > 0:
|
||||
alerts.append({
|
||||
'level': 'error',
|
||||
'icon': '🔴',
|
||||
'message': f'{critical_credit_accounts} account(s) with critical low credits (< 10)',
|
||||
'url': '/admin/igny8_core_auth/account/?credits__lt=10',
|
||||
'action': 'Urgent Review'
|
||||
})
|
||||
|
||||
# Check for failed automations
|
||||
from igny8_core.business.automation.models import AutomationRun
|
||||
failed_today = AutomationRun.objects.filter(
|
||||
status='failed',
|
||||
started_at__date=today
|
||||
).count()
|
||||
if failed_today > 0:
|
||||
alerts.append({
|
||||
'level': 'error',
|
||||
'icon': '🔴',
|
||||
'message': f'{failed_today} automation(s) failed today',
|
||||
'url': '/admin/automation/automationrun/?status=failed',
|
||||
'action': 'Review Failures'
|
||||
})
|
||||
|
||||
# Check for failed syncs
|
||||
from igny8_core.business.integration.models import SyncEvent
|
||||
failed_syncs = SyncEvent.objects.filter(
|
||||
success=False,
|
||||
created_at__date=today
|
||||
).count()
|
||||
if failed_syncs > 5: # Only alert if more than 5
|
||||
alerts.append({
|
||||
'level': 'warning',
|
||||
'icon': '⚠️',
|
||||
'message': f'{failed_syncs} WordPress sync failures today',
|
||||
'url': '/admin/integration/syncevent/?success=False',
|
||||
'action': 'Review Syncs'
|
||||
})
|
||||
|
||||
# Check for failed Celery tasks
|
||||
try:
|
||||
from django_celery_results.models import TaskResult
|
||||
celery_failed = TaskResult.objects.filter(
|
||||
status='FAILURE',
|
||||
date_created__date=today
|
||||
).count()
|
||||
if celery_failed > 0:
|
||||
alerts.append({
|
||||
'level': 'error',
|
||||
'icon': '🔴',
|
||||
'message': f'{celery_failed} Celery task(s) failed today',
|
||||
'url': '/admin/django_celery_results/taskresult/?status=FAILURE',
|
||||
'action': 'Review Tasks'
|
||||
})
|
||||
except:
|
||||
pass
|
||||
|
||||
# Check for stale pending tasks (older than 24 hours)
|
||||
from igny8_core.modules.writer.models import Tasks
|
||||
yesterday = today - timedelta(days=1)
|
||||
stale_tasks = Tasks.objects.filter(
|
||||
status='pending',
|
||||
created_at__date__lte=yesterday
|
||||
).count()
|
||||
if stale_tasks > 10:
|
||||
alerts.append({
|
||||
'level': 'info',
|
||||
'icon': 'ℹ️',
|
||||
'message': f'{stale_tasks} tasks pending for more than 24 hours',
|
||||
'url': '/admin/writer/tasks/?status=pending',
|
||||
'action': 'Review Tasks'
|
||||
})
|
||||
|
||||
return alerts
|
||||
@@ -1,93 +1,8 @@
from django.contrib import admin
from django.contrib.admin.apps import AdminConfig


class ReadOnlyAdmin(admin.ModelAdmin):
    """Generic read-only admin for system tables."""

    def has_add_permission(self, request):
        return False

    def has_change_permission(self, request, obj=None):
        return False

    def has_delete_permission(self, request, obj=None):
        return False


def _safe_register(model, model_admin):
    try:
        admin.site.register(model, model_admin)
    except admin.sites.AlreadyRegistered:
        pass


class Igny8AdminConfig(AdminConfig):
    default_site = 'igny8_core.admin.site.Igny8AdminSite'
    name = 'django.contrib.admin'

    def ready(self):
        super().ready()

        # Replace default admin.site with our custom Igny8AdminSite
        # IMPORTANT: Must copy all registrations from old site to new site
        # because models register themselves before ready() is called
        from igny8_core.admin.site import admin_site
        import django.contrib.admin as admin_module

        # Copy all model registrations from the default site to our custom site
        old_site = admin_module.site
        admin_site._registry = old_site._registry.copy()
        admin_site._actions = old_site._actions.copy()
        admin_site._global_actions = old_site._global_actions.copy()

        # Now replace the default site
        admin_module.site = admin_site
        admin_module.sites.site = admin_site

        # Import Unfold AFTER apps are ready
        from unfold.admin import ModelAdmin as UnfoldModelAdmin

        # Register Django internals in admin (read-only where appropriate)
        from django.contrib.admin.models import LogEntry
        from django.contrib.auth.models import Group, Permission
        from django.contrib.contenttypes.models import ContentType
        from django.contrib.sessions.models import Session

        _safe_register(LogEntry, ReadOnlyAdmin)
        _safe_register(Permission, UnfoldModelAdmin)
        _safe_register(Group, UnfoldModelAdmin)
        _safe_register(ContentType, ReadOnlyAdmin)
        _safe_register(Session, ReadOnlyAdmin)

        # Import and setup enhanced Celery task monitoring
        self._setup_celery_admin()

    def _setup_celery_admin(self):
        """Setup enhanced Celery admin with proper unregister/register"""
        try:
            from django_celery_results.models import TaskResult, GroupResult
            from igny8_core.admin.celery_admin import CeleryTaskResultAdmin, CeleryGroupResultAdmin

            # Unregister the default TaskResult admin
            try:
                admin.site.unregister(TaskResult)
            except admin.sites.NotRegistered:
                pass

            # Unregister the default GroupResult admin
            try:
                admin.site.unregister(GroupResult)
            except admin.sites.NotRegistered:
                pass

            # Register our enhanced versions
            admin.site.register(TaskResult, CeleryTaskResultAdmin)
            admin.site.register(GroupResult, CeleryGroupResultAdmin)
        except Exception as e:
            # Log the error but don't crash the app
            import logging
            logger = logging.getLogger(__name__)
            logger.warning(f"Could not setup enhanced Celery admin: {e}")
@@ -107,77 +107,3 @@ class SiteSectorAdminMixin:
            return obj.site in accessible_sites
        return super().has_delete_permission(request, obj)


# ============================================================================
# Custom ModelAdmin for Sidebar Fix
# ============================================================================

from unfold.admin import ModelAdmin as UnfoldModelAdmin


class Igny8ModelAdmin(UnfoldModelAdmin):
    """
    Custom ModelAdmin that ensures sidebar_navigation is set correctly on ALL pages.

    Django's ModelAdmin views don't call AdminSite.each_context(),
    so we override them to inject our custom sidebar.
    """

    def _inject_sidebar_context(self, request, extra_context=None):
        """Helper to inject custom sidebar into context"""
        if extra_context is None:
            extra_context = {}

        # Get our custom sidebar from the admin site
        from igny8_core.admin.site import admin_site

        # CRITICAL: Get the full Unfold context (includes all branding, form classes, etc.)
        # This is what makes the logo/title appear properly
        unfold_context = admin_site.each_context(request)

        # Get the current path to detect active group
        current_path = request.path

        sidebar_navigation = admin_site.get_sidebar_list(request)

        # Detect active group and expand it by setting collapsible=False
        for group in sidebar_navigation:
            group_is_active = False
            for item in group.get('items', []):
                item_link = item.get('link', '')
                # Check if current path matches this item's link
                if item_link and current_path.startswith(item_link):
                    item['active'] = True
                    group_is_active = True

            # If any item in this group is active, expand the group
            if group_is_active:
                group['collapsible'] = False  # Expanded state
            else:
                group['collapsible'] = True  # Collapsed state

        # Merge Unfold context with our custom sidebar
        unfold_context['sidebar_navigation'] = sidebar_navigation
        unfold_context['available_apps'] = admin_site.get_app_list(request, app_label=None)
        unfold_context['app_list'] = unfold_context['available_apps']

        # Merge with any existing extra_context
        unfold_context.update(extra_context)

        return unfold_context

    def changelist_view(self, request, extra_context=None):
        """Override to inject custom sidebar"""
        extra_context = self._inject_sidebar_context(request, extra_context)
        return super().changelist_view(request, extra_context)

    def change_view(self, request, object_id, form_url='', extra_context=None):
        """Override to inject custom sidebar"""
        extra_context = self._inject_sidebar_context(request, extra_context)
        return super().change_view(request, object_id, form_url, extra_context)

    def add_view(self, request, form_url='', extra_context=None):
        """Override to inject custom sidebar"""
        extra_context = self._inject_sidebar_context(request, extra_context)
        return super().add_view(request, form_url, extra_context)
@@ -1,213 +0,0 @@
|
||||
"""
|
||||
Celery Task Monitoring Admin - Unfold Style
|
||||
"""
|
||||
from django.contrib import admin
|
||||
from django.utils.html import format_html
|
||||
from django.contrib import messages
|
||||
from django_celery_results.models import TaskResult, GroupResult
|
||||
from unfold.admin import ModelAdmin
|
||||
from unfold.contrib.filters.admin import RangeDateFilter
|
||||
from celery import current_app
|
||||
|
||||
|
||||
class CeleryTaskResultAdmin(ModelAdmin):
|
||||
"""Admin interface for monitoring Celery tasks with Unfold styling"""
|
||||
|
||||
list_display = [
|
||||
'task_id',
|
||||
'task_name',
|
||||
'colored_status',
|
||||
'date_created',
|
||||
'date_done',
|
||||
'execution_time',
|
||||
]
|
||||
list_filter = [
|
||||
'status',
|
||||
'task_name',
|
||||
('date_created', RangeDateFilter),
|
||||
('date_done', RangeDateFilter),
|
||||
]
|
||||
search_fields = ['task_id', 'task_name', 'task_args']
|
||||
readonly_fields = [
|
||||
'task_id', 'task_name', 'task_args', 'task_kwargs',
|
||||
'result', 'traceback', 'date_created', 'date_done',
|
||||
'colored_status', 'execution_time'
|
||||
]
|
||||
date_hierarchy = 'date_created'
|
||||
ordering = ['-date_created']
|
||||
|
||||
actions = ['retry_failed_tasks', 'clear_old_tasks']
|
||||
|
||||
fieldsets = (
|
||||
('Task Information', {
|
||||
'fields': ('task_id', 'task_name', 'colored_status')
|
||||
}),
|
||||
('Execution Details', {
|
||||
'fields': ('date_created', 'date_done', 'execution_time')
|
||||
}),
|
||||
('Task Arguments', {
|
||||
'fields': ('task_args', 'task_kwargs'),
|
||||
'classes': ('collapse',)
|
||||
}),
|
||||
('Result & Errors', {
|
||||
'fields': ('result', 'traceback'),
|
||||
'classes': ('collapse',)
|
||||
}),
|
||||
)
|
||||
|
||||
def colored_status(self, obj):
|
||||
"""Display status with color coding"""
|
||||
colors = {
|
||||
'SUCCESS': '#0bbf87', # IGNY8 success green
|
||||
'FAILURE': '#ef4444', # IGNY8 danger red
|
||||
'PENDING': '#ff7a00', # IGNY8 warning orange
|
||||
'STARTED': '#0693e3', # IGNY8 primary blue
|
||||
'RETRY': '#5d4ae3', # IGNY8 purple
|
||||
}
|
||||
color = colors.get(obj.status, '#64748b') # Default gray
|
||||
|
||||
return format_html(
|
||||
'<span style="color: {}; font-weight: bold; font-size: 14px;">{}</span>',
|
||||
color,
|
||||
obj.status
|
||||
)
|
||||
colored_status.short_description = 'Status'
|
||||
|
||||
def execution_time(self, obj):
|
||||
"""Calculate and display execution time"""
|
||||
if obj.date_done and obj.date_created:
|
||||
duration = obj.date_done - obj.date_created
|
||||
seconds = duration.total_seconds()
|
||||
|
||||
if seconds < 1:
|
||||
time_str = f'{seconds * 1000:.2f}ms'
|
||||
return format_html('<span style="color: #0bbf87;">{}</span>', time_str)
|
||||
elif seconds < 60:
|
||||
time_str = f'{seconds:.2f}s'
|
||||
return format_html('<span style="color: #0693e3;">{}</span>', time_str)
|
||||
else:
|
||||
minutes = seconds / 60
|
||||
time_str = f'{minutes:.1f}m'
|
||||
return format_html('<span style="color: #ff7a00;">{}</span>', time_str)
|
||||
return '-'
|
||||
execution_time.short_description = 'Duration'
|
||||
|
||||
def retry_failed_tasks(self, request, queryset):
|
||||
"""Retry failed celery tasks by re-queuing them"""
|
||||
from igny8_core.celery import app
|
||||
import json
|
||||
|
||||
failed_tasks = queryset.filter(status='FAILURE')
|
||||
count = 0
|
||||
errors = []
|
||||
|
||||
for task in failed_tasks:
|
||||
try:
|
||||
# Get task function
|
||||
task_func = current_app.tasks.get(task.task_name)
|
||||
if task_func:
|
||||
# Parse task args and kwargs
|
||||
import ast
|
||||
try:
|
||||
args = ast.literal_eval(task.task_args) if task.task_args else []
|
||||
kwargs = ast.literal_eval(task.task_kwargs) if task.task_kwargs else {}
|
||||
except:
|
||||
args = []
|
||||
kwargs = {}
|
||||
|
||||
# Retry the task
|
||||
task_func.apply_async(args=args, kwargs=kwargs)
|
||||
count += 1
|
||||
else:
|
||||
errors.append(f'Task function not found: {task.task_name}')
|
||||
except Exception as e:
|
||||
errors.append(f'Error retrying {task.task_id}: {str(e)}')
|
||||
|
||||
if count > 0:
|
||||
self.message_user(request, f'Successfully queued {count} task(s) for retry.', 'SUCCESS')
|
||||
|
||||
if errors:
|
||||
for error in errors[:5]: # Show max 5 errors
|
||||
self.message_user(request, f'Error: {error}', 'WARNING')
|
||||
|
||||
retry_failed_tasks.short_description = 'Retry Failed Tasks'
|
||||
|
||||
def clear_old_tasks(self, request, queryset):
|
||||
"""Clear old completed tasks"""
|
||||
from datetime import timedelta
|
||||
from django.utils import timezone
|
||||
|
||||
# Delete tasks older than 30 days
|
||||
cutoff_date = timezone.now() - timedelta(days=30)
|
||||
old_tasks = queryset.filter(
|
||||
date_created__lt=cutoff_date,
|
||||
status__in=['SUCCESS', 'FAILURE']
|
||||
)
|
||||
|
||||
count = old_tasks.count()
|
||||
old_tasks.delete()
|
||||
|
||||
self.message_user(request, f'Cleared {count} old task(s)', messages.SUCCESS)
|
||||
|
||||
clear_old_tasks.short_description = 'Clear Old Tasks (30+ days)'
|
||||
|
||||
def has_add_permission(self, request):
|
||||
"""Disable manual task creation"""
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
"""Make read-only"""
|
||||
return False
|
||||
|
||||
|
||||
class CeleryGroupResultAdmin(ModelAdmin):
|
||||
"""Admin interface for monitoring Celery group results with Unfold styling"""
|
||||
|
||||
list_display = [
|
||||
'group_id',
|
||||
'date_created',
|
||||
'date_done',
|
||||
'result_count',
|
||||
]
|
||||
list_filter = [
|
||||
('date_created', RangeDateFilter),
|
||||
('date_done', RangeDateFilter),
|
||||
]
|
||||
search_fields = ['group_id', 'result']
|
||||
readonly_fields = [
|
||||
'group_id', 'date_created', 'date_done', 'content_type',
|
||||
'content_encoding', 'result'
|
||||
]
|
||||
date_hierarchy = 'date_created'
|
||||
ordering = ['-date_created']
|
||||
|
||||
fieldsets = (
|
||||
('Group Information', {
|
||||
'fields': ('group_id', 'date_created', 'date_done')
|
||||
}),
|
||||
('Result Details', {
|
||||
'fields': ('content_type', 'content_encoding', 'result'),
|
||||
'classes': ('collapse',)
|
||||
}),
|
||||
)
|
||||
|
||||
def result_count(self, obj):
|
||||
"""Count tasks in the group"""
|
||||
if obj.result:
|
||||
try:
|
||||
import json
|
||||
result_data = json.loads(obj.result) if isinstance(obj.result, str) else obj.result
|
||||
if isinstance(result_data, list):
|
||||
return len(result_data)
|
||||
except:
|
||||
pass
|
||||
return '-'
|
||||
result_count.short_description = 'Task Count'
|
||||
|
||||
def has_add_permission(self, request):
|
||||
"""Disable manual group result creation"""
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
"""Make read-only"""
|
||||
return False
|
||||
@@ -1,189 +0,0 @@
|
||||
"""
|
||||
Custom Admin Dashboard with Key Metrics
|
||||
"""
|
||||
from django.contrib import admin
|
||||
from django.shortcuts import render
|
||||
from django.db.models import Count, Sum, Q
|
||||
from django.utils import timezone
|
||||
from datetime import timedelta
|
||||
|
||||
|
||||
def admin_dashboard(request):
|
||||
"""Custom admin dashboard with operational metrics"""
|
||||
|
||||
# Date ranges
|
||||
today = timezone.now().date()
|
||||
week_ago = today - timedelta(days=7)
|
||||
month_ago = today - timedelta(days=30)
|
||||
|
||||
# Account metrics
|
||||
from igny8_core.auth.models import Account, Site
|
||||
total_accounts = Account.objects.count()
|
||||
active_accounts = Account.objects.filter(status='active').count()
|
||||
low_credit_accounts = Account.objects.filter(
|
||||
status='active',
|
||||
credits__lt=100
|
||||
).count()
|
||||
critical_credit_accounts = Account.objects.filter(
|
||||
status='active',
|
||||
credits__lt=10
|
||||
).count()
|
||||
|
||||
# Site metrics
|
||||
total_sites = Site.objects.count()
|
||||
active_sites = Site.objects.filter(is_active=True, status='active').count()
|
||||
|
||||
# Content metrics
|
||||
from igny8_core.modules.writer.models import Content, Tasks
|
||||
content_this_week = Content.objects.filter(created_at__gte=week_ago).count()
|
||||
content_this_month = Content.objects.filter(created_at__gte=month_ago).count()
|
||||
tasks_pending = Tasks.objects.filter(status='pending').count()
|
||||
tasks_in_progress = Tasks.objects.filter(status='in_progress').count()
|
||||
|
||||
# Billing metrics
|
||||
from igny8_core.business.billing.models import Payment, CreditTransaction
|
||||
pending_payments = Payment.objects.filter(status='pending_approval').count()
|
||||
payments_this_month = Payment.objects.filter(
|
||||
created_at__gte=month_ago,
|
||||
status='succeeded'
|
||||
).aggregate(total=Sum('amount'))['total'] or 0
|
||||
|
||||
credit_usage_this_month = CreditTransaction.objects.filter(
|
||||
created_at__gte=month_ago,
|
||||
transaction_type='deduction'
|
||||
).aggregate(total=Sum('amount'))['total'] or 0
|
||||
|
||||
# Automation metrics
|
||||
from igny8_core.business.automation.models import AutomationRun
|
||||
automation_running = AutomationRun.objects.filter(status='running').count()
|
||||
automation_failed = AutomationRun.objects.filter(
|
||||
status='failed',
|
||||
started_at__gte=week_ago
|
||||
).count()
|
||||
|
||||
# Calculate success rate
|
||||
total_runs = AutomationRun.objects.filter(started_at__gte=week_ago).count()
|
||||
if total_runs > 0:
|
||||
success_runs = AutomationRun.objects.filter(
|
||||
started_at__gte=week_ago,
|
||||
status='completed'
|
||||
).count()
|
||||
automation_success_rate = round((success_runs / total_runs) * 100, 1)
|
||||
else:
|
||||
automation_success_rate = 0
|
||||
|
||||
# WordPress sync metrics
|
||||
from igny8_core.business.integration.models import SyncEvent
|
||||
sync_failed_today = SyncEvent.objects.filter(
|
||||
success=False,
|
||||
created_at__date=today
|
||||
).count()
|
||||
sync_success_today = SyncEvent.objects.filter(
|
||||
success=True,
|
||||
created_at__date=today
|
||||
).count()
|
||||
|
||||
# Celery task metrics
|
||||
try:
|
||||
from django_celery_results.models import TaskResult
|
||||
        celery_failed = TaskResult.objects.filter(
            status='FAILURE',
            date_created__gte=week_ago
        ).count()
        celery_pending = TaskResult.objects.filter(status='PENDING').count()
    except Exception:
        celery_failed = 0
        celery_pending = 0

    # Generate alerts
    alerts = []

    if critical_credit_accounts > 0:
        alerts.append({
            'level': 'error',
            'message': f'{critical_credit_accounts} account(s) have CRITICAL low credits (< 10)',
            'action': 'Review Accounts',
            'url': '/admin/igny8_core_auth/account/?credits__lt=10'
        })

    if low_credit_accounts > 0:
        alerts.append({
            'level': 'warning',
            'message': f'{low_credit_accounts} account(s) have low credits (< 100)',
            'action': 'Review Accounts',
            'url': '/admin/igny8_core_auth/account/?credits__lt=100'
        })

    if pending_payments > 0:
        alerts.append({
            'level': 'warning',
            'message': f'{pending_payments} payment(s) awaiting approval',
            'action': 'Approve Payments',
            'url': '/admin/billing/payment/?status__exact=pending_approval'
        })

    if automation_failed > 5:
        alerts.append({
            'level': 'error',
            'message': f'{automation_failed} automation runs failed this week',
            'action': 'View Failed Runs',
            'url': '/admin/automation/automationrun/?status__exact=failed'
        })

    if sync_failed_today > 0:
        alerts.append({
            'level': 'warning',
            'message': f'{sync_failed_today} WordPress sync failure(s) today',
            'action': 'View Sync Events',
            'url': '/admin/integration/syncevent/?success__exact=0'
        })

    if celery_failed > 10:
        alerts.append({
            'level': 'error',
            'message': f'{celery_failed} Celery tasks failed this week',
            'action': 'View Failed Tasks',
            'url': '/admin/django_celery_results/taskresult/?status__exact=FAILURE'
        })

    context = {
        'title': 'IGNY8 Dashboard',
        'site_title': 'IGNY8 Admin',
        'site_header': 'IGNY8 Administration',
        # Account metrics
        'total_accounts': total_accounts,
        'active_accounts': active_accounts,
        'low_credit_accounts': low_credit_accounts,
        'critical_credit_accounts': critical_credit_accounts,
        # Site metrics
        'total_sites': total_sites,
        'active_sites': active_sites,
        # Content metrics
        'content_this_week': content_this_week,
        'content_this_month': content_this_month,
        'tasks_pending': tasks_pending,
        'tasks_in_progress': tasks_in_progress,
        # Billing metrics
        'pending_payments': pending_payments,
        'payments_this_month': float(payments_this_month),
        'credit_usage_this_month': abs(float(credit_usage_this_month)),
        # Automation metrics
        'automation_running': automation_running,
        'automation_failed': automation_failed,
        'automation_success_rate': automation_success_rate,
        # Integration metrics
        'sync_failed_today': sync_failed_today,
        'sync_success_today': sync_success_today,
        # Celery metrics
        'celery_failed': celery_failed,
        'celery_pending': celery_pending,
        # Alerts
        'alerts': alerts,
    }

    # Merge with admin context to get sidebar and header
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/dashboard.html', context)
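The dashboard builds its alert list from a series of near-identical threshold checks. That pattern can be factored into a pure, unit-testable function; the sketch below is illustrative only — `build_alerts` and its rule table are not part of the codebase, and only a few of the view's thresholds are mirrored here.

```python
# Illustrative sketch: the dashboard's threshold checks as a data-driven
# pure function. Names and rules mirror the view above but are hypothetical.

def build_alerts(metrics: dict) -> list:
    """Map raw metric counts to alert dicts when a threshold is exceeded."""
    rules = [
        ('critical_credit_accounts', 0, 'error',
         '{n} account(s) have CRITICAL low credits (< 10)'),
        ('pending_payments', 0, 'warning',
         '{n} payment(s) awaiting approval'),
        ('celery_failed', 10, 'error',
         '{n} Celery tasks failed this week'),
    ]
    alerts = []
    for key, threshold, level, template in rules:
        n = metrics.get(key, 0)
        if n > threshold:
            alerts.append({'level': level, 'message': template.format(n=n)})
    return alerts

alerts = build_alerts({'critical_credit_accounts': 2, 'celery_failed': 3})
```

Keeping thresholds in one table also makes them easy to review or move into settings later.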
@@ -1,406 +0,0 @@
"""
Admin Monitoring Module - System Health, API Monitor, Debug Console

Provides read-only monitoring and debugging tools for Django Admin
"""
from django.shortcuts import render
from django.contrib.admin.views.decorators import staff_member_required
from django.utils import timezone
from django.db import connection
from django.conf import settings
import time
import os


@staff_member_required
def system_health_dashboard(request):
    """
    System infrastructure health monitoring

    Checks: Database, Redis, Celery, File System
    """
    context = {
        'page_title': 'System Health Monitor',
        'checked_at': timezone.now(),
        'checks': []
    }

    # Database Check
    db_check = {
        'name': 'PostgreSQL Database',
        'status': 'unknown',
        'message': '',
        'details': {}
    }
    try:
        start = time.time()
        with connection.cursor() as cursor:
            cursor.execute("SELECT version()")
            version = cursor.fetchone()[0]
            cursor.execute("SELECT COUNT(*) FROM django_session")
            session_count = cursor.fetchone()[0]

        elapsed = (time.time() - start) * 1000
        db_check.update({
            'status': 'healthy',
            'message': f'Connected ({elapsed:.2f}ms)',
            'details': {
                'version': version.split('\n')[0],
                'response_time': f'{elapsed:.2f}ms',
                'active_sessions': session_count
            }
        })
    except Exception as e:
        db_check.update({
            'status': 'error',
            'message': f'Connection failed: {str(e)}'
        })
    context['checks'].append(db_check)

    # Redis Check
    redis_check = {
        'name': 'Redis Cache',
        'status': 'unknown',
        'message': '',
        'details': {}
    }
    try:
        import redis
        r = redis.Redis(
            host=settings.CACHES['default']['LOCATION'].split(':')[0] if ':' in settings.CACHES['default'].get('LOCATION', '') else 'redis',
            port=6379,
            db=0,
            socket_connect_timeout=2
        )
        start = time.time()
        r.ping()
        elapsed = (time.time() - start) * 1000

        info = r.info()
        redis_check.update({
            'status': 'healthy',
            'message': f'Connected ({elapsed:.2f}ms)',
            'details': {
                'version': info.get('redis_version', 'unknown'),
                'uptime': f"{info.get('uptime_in_seconds', 0) // 3600}h",
                'connected_clients': info.get('connected_clients', 0),
                'used_memory': f"{info.get('used_memory_human', 'unknown')}",
                'response_time': f'{elapsed:.2f}ms'
            }
        })
    except Exception as e:
        redis_check.update({
            'status': 'error',
            'message': f'Connection failed: {str(e)}'
        })
    context['checks'].append(redis_check)

    # Celery Workers Check
    celery_check = {
        'name': 'Celery Workers',
        'status': 'unknown',
        'message': '',
        'details': {}
    }
    try:
        from igny8_core.celery import app
        inspect = app.control.inspect(timeout=2)
        stats = inspect.stats()
        active = inspect.active()

        if stats:
            worker_count = len(stats)
            total_tasks = sum(len(tasks) for tasks in active.values()) if active else 0
            celery_check.update({
                'status': 'healthy',
                'message': f'{worker_count} worker(s) active',
                'details': {
                    'workers': worker_count,
                    'active_tasks': total_tasks,
                    'worker_names': list(stats.keys())
                }
            })
        else:
            celery_check.update({
                'status': 'warning',
                'message': 'No workers responding'
            })
    except Exception as e:
        celery_check.update({
            'status': 'error',
            'message': f'Check failed: {str(e)}'
        })
    context['checks'].append(celery_check)

    # File System Check
    fs_check = {
        'name': 'File System',
        'status': 'unknown',
        'message': '',
        'details': {}
    }
    try:
        import shutil
        media_root = settings.MEDIA_ROOT
        static_root = settings.STATIC_ROOT

        media_stat = shutil.disk_usage(media_root) if os.path.exists(media_root) else None

        if media_stat:
            free_gb = media_stat.free / (1024**3)
            total_gb = media_stat.total / (1024**3)
            used_percent = (media_stat.used / media_stat.total) * 100

            fs_check.update({
                'status': 'healthy' if used_percent < 90 else 'warning',
                'message': f'{free_gb:.1f}GB free of {total_gb:.1f}GB',
                'details': {
                    'media_root': media_root,
                    'free_space': f'{free_gb:.1f}GB',
                    'total_space': f'{total_gb:.1f}GB',
                    'used_percent': f'{used_percent:.1f}%'
                }
            })
        else:
            fs_check.update({
                'status': 'warning',
                'message': 'Media directory not found'
            })
    except Exception as e:
        fs_check.update({
            'status': 'error',
            'message': f'Check failed: {str(e)}'
        })
    context['checks'].append(fs_check)

    # Overall system status
    statuses = [check['status'] for check in context['checks']]
    if 'error' in statuses:
        context['overall_status'] = 'error'
        context['overall_message'] = 'System has errors'
    elif 'warning' in statuses:
        context['overall_status'] = 'warning'
        context['overall_message'] = 'System has warnings'
    else:
        context['overall_status'] = 'healthy'
        context['overall_message'] = 'All systems operational'

    return render(request, 'admin/monitoring/system_health.html', context)

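The Redis check derives the host by splitting `CACHES['default']['LOCATION']` on `:`, which misparses URL-style locations such as `redis://cache:6379/1` (it would yield `redis` as the host). A more robust parse can be sketched with the standard library's `urllib.parse`; `parse_redis_location` and its fallback defaults are hypothetical, not project settings.

```python
# Sketch: parse a django-redis style LOCATION ('redis://host:port/db') into
# (host, port, db), falling back to assumed defaults for bare 'host:port'
# strings. Hypothetical helper, not part of the codebase.
from urllib.parse import urlparse

def parse_redis_location(location: str, default_host: str = 'redis'):
    if '://' in location:
        parsed = urlparse(location)
        db = (parsed.path or '/0').lstrip('/') or '0'
        return (parsed.hostname or default_host, parsed.port or 6379, int(db))
    # Bare 'host', 'host:port', or empty string
    host, _, port = location.partition(':')
    return (host or default_host, int(port) if port else 6379, 0)
```

This keeps the health check working whether `LOCATION` is a full Redis URL or a bare hostname.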
@staff_member_required
def api_monitor_dashboard(request):
    """
    API endpoint health monitoring

    Tests key endpoints and displays response times
    """
    from django.test.client import Client

    context = {
        'page_title': 'API Monitor',
        'checked_at': timezone.now(),
        'endpoint_groups': []
    }

    # Define endpoint groups to check
    endpoint_configs = [
        {
            'name': 'Authentication',
            'endpoints': [
                {'path': '/api/v1/auth/check/', 'method': 'GET', 'auth_required': False},
            ]
        },
        {
            'name': 'System Settings',
            'endpoints': [
                {'path': '/api/v1/system/health/', 'method': 'GET', 'auth_required': False},
            ]
        },
        {
            'name': 'Planner Module',
            'endpoints': [
                {'path': '/api/v1/planner/keywords/', 'method': 'GET', 'auth_required': True},
            ]
        },
        {
            'name': 'Writer Module',
            'endpoints': [
                {'path': '/api/v1/writer/tasks/', 'method': 'GET', 'auth_required': True},
            ]
        },
        {
            'name': 'Billing',
            'endpoints': [
                {'path': '/api/v1/billing/credits/balance/', 'method': 'GET', 'auth_required': True},
            ]
        },
    ]

    client = Client()

    for group_config in endpoint_configs:
        group_results = {
            'name': group_config['name'],
            'endpoints': []
        }

        for endpoint in group_config['endpoints']:
            result = {
                'path': endpoint['path'],
                'method': endpoint['method'],
                'status': 'unknown',
                'status_code': None,
                'response_time': None,
                'message': ''
            }

            try:
                start = time.time()

                if endpoint['method'] == 'GET':
                    response = client.get(endpoint['path'])
                else:
                    response = client.post(endpoint['path'])

                elapsed = (time.time() - start) * 1000

                result.update({
                    'status_code': response.status_code,
                    'response_time': f'{elapsed:.2f}ms',
                })

                # Determine status
                if response.status_code < 300:
                    result['status'] = 'healthy'
                    result['message'] = 'OK'
                elif response.status_code == 401 and endpoint.get('auth_required'):
                    result['status'] = 'healthy'
                    result['message'] = 'Auth required (expected)'
                elif response.status_code < 500:
                    result['status'] = 'warning'
                    result['message'] = 'Client error'
                else:
                    result['status'] = 'error'
                    result['message'] = 'Server error'

            except Exception as e:
                result.update({
                    'status': 'error',
                    'message': str(e)[:100]
                })

            group_results['endpoints'].append(result)

        context['endpoint_groups'].append(group_results)

    # Calculate overall stats
    all_endpoints = [ep for group in context['endpoint_groups'] for ep in group['endpoints']]
    total = len(all_endpoints)
    healthy = len([ep for ep in all_endpoints if ep['status'] == 'healthy'])
    warnings = len([ep for ep in all_endpoints if ep['status'] == 'warning'])
    errors = len([ep for ep in all_endpoints if ep['status'] == 'error'])

    context['stats'] = {
        'total': total,
        'healthy': healthy,
        'warnings': warnings,
        'errors': errors,
        'health_percentage': (healthy / total * 100) if total > 0 else 0
    }

    return render(request, 'admin/monitoring/api_monitor.html', context)

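The status-code branching inside the monitor's loop is self-contained logic that can be pulled into a small pure function for testing. The sketch below mirrors that branching (a 401 on an auth-protected endpoint counts as healthy, since it proves auth is enforced); `classify` is an illustrative name, not project code.

```python
# Sketch of the endpoint-status classification used by the API monitor above.
# Hypothetical helper mirroring the view's if/elif chain.

def classify(status_code: int, auth_required: bool = False):
    if status_code < 300:
        return 'healthy', 'OK'
    if status_code == 401 and auth_required:
        return 'healthy', 'Auth required (expected)'
    if status_code < 500:
        return 'warning', 'Client error'
    return 'error', 'Server error'
```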
@staff_member_required
def debug_console(request):
    """
    System debug information (read-only)

    Shows environment, database config, cache config, etc.
    """
    context = {
        'page_title': 'Debug Console',
        'checked_at': timezone.now(),
        'sections': []
    }

    # Environment Variables Section
    env_section = {
        'title': 'Environment',
        'items': {
            'DEBUG': settings.DEBUG,
            'ENVIRONMENT': os.getenv('ENVIRONMENT', 'not set'),
            'DJANGO_SETTINGS_MODULE': os.getenv('DJANGO_SETTINGS_MODULE', 'not set'),
            'ALLOWED_HOSTS': settings.ALLOWED_HOSTS,
            'TIME_ZONE': settings.TIME_ZONE,
            'USE_TZ': settings.USE_TZ,
        }
    }
    context['sections'].append(env_section)

    # Database Configuration
    db_config = settings.DATABASES.get('default', {})
    db_section = {
        'title': 'Database Configuration',
        'items': {
            'ENGINE': db_config.get('ENGINE', 'not set'),
            'NAME': db_config.get('NAME', 'not set'),
            'HOST': db_config.get('HOST', 'not set'),
            'PORT': db_config.get('PORT', 'not set'),
            'CONN_MAX_AGE': db_config.get('CONN_MAX_AGE', 'not set'),
        }
    }
    context['sections'].append(db_section)

    # Cache Configuration
    cache_config = settings.CACHES.get('default', {})
    cache_section = {
        'title': 'Cache Configuration',
        'items': {
            'BACKEND': cache_config.get('BACKEND', 'not set'),
            'LOCATION': cache_config.get('LOCATION', 'not set'),
            'KEY_PREFIX': cache_config.get('KEY_PREFIX', 'not set'),
        }
    }
    context['sections'].append(cache_section)

    # Celery Configuration
    celery_section = {
        'title': 'Celery Configuration',
        'items': {
            'BROKER_URL': getattr(settings, 'CELERY_BROKER_URL', 'not set'),
            'RESULT_BACKEND': getattr(settings, 'CELERY_RESULT_BACKEND', 'not set'),
            'TASK_ALWAYS_EAGER': getattr(settings, 'CELERY_TASK_ALWAYS_EAGER', False),
        }
    }
    context['sections'].append(celery_section)

    # Media & Static Files
    files_section = {
        'title': 'Media & Static Files',
        'items': {
            'MEDIA_ROOT': settings.MEDIA_ROOT,
            'MEDIA_URL': settings.MEDIA_URL,
            'STATIC_ROOT': settings.STATIC_ROOT,
            'STATIC_URL': settings.STATIC_URL,
        }
    }
    context['sections'].append(files_section)

    # Installed Apps (count)
    apps_section = {
        'title': 'Installed Applications',
        'items': {
            'Total Apps': len(settings.INSTALLED_APPS),
            'Custom Apps': len([app for app in settings.INSTALLED_APPS if app.startswith('igny8_')]),
        }
    }
    context['sections'].append(apps_section)

    # Middleware (count)
    middleware_section = {
        'title': 'Middleware',
        'items': {
            'Total Middleware': len(settings.MIDDLEWARE),
        }
    }
    context['sections'].append(middleware_section)

    return render(request, 'admin/monitoring/debug_console.html', context)
@@ -1,617 +0,0 @@
"""
Analytics & Reporting Views for IGNY8 Admin
"""
from django.contrib.admin.views.decorators import staff_member_required
from django.shortcuts import render
from django.db.models import Count, Sum, Avg, Q
from django.utils import timezone
from datetime import timedelta
import json


@staff_member_required
def revenue_report(request):
    """Revenue and billing analytics"""
    from igny8_core.business.billing.models import Payment
    from igny8_core.auth.models import Plan

    # Date ranges
    today = timezone.now()
    months = []
    monthly_revenue = []

    for i in range(6):
        month_start = today.replace(day=1) - timedelta(days=30*i)
        month_end = month_start.replace(day=28) + timedelta(days=4)

        revenue = Payment.objects.filter(
            status='succeeded',
            processed_at__gte=month_start,
            processed_at__lt=month_end
        ).aggregate(total=Sum('amount'))['total'] or 0

        months.insert(0, month_start.strftime('%b %Y'))
        monthly_revenue.insert(0, float(revenue))

    # Plan distribution
    plan_distribution = Plan.objects.annotate(
        account_count=Count('accounts')
    ).values('name', 'account_count')

    # Payment method breakdown
    payment_methods = Payment.objects.filter(
        status='succeeded'
    ).values('payment_method').annotate(
        count=Count('id'),
        total=Sum('amount')
    ).order_by('-total')

    # Total revenue all time
    total_revenue = Payment.objects.filter(
        status='succeeded'
    ).aggregate(total=Sum('amount'))['total'] or 0

    context = {
        'title': 'Revenue Report',
        'months': json.dumps(months),
        'monthly_revenue': json.dumps(monthly_revenue),
        'plan_distribution': list(plan_distribution),
        'payment_methods': list(payment_methods),
        'total_revenue': float(total_revenue),
    }

    # Merge with admin context
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/reports/revenue.html', context)

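`revenue_report` approximates month boundaries with `timedelta(days=30*i)` plus a `day=28` trick, which can drift across runs of long and short months. Exact month arithmetic needs only plain `datetime`; the `month_window` helper below is a hypothetical sketch of that calculation, not project code.

```python
# Sketch: walk back `months_back` calendar months from a date and return the
# exact [month_start, next_month_start) window -- no 30-day approximation.
from datetime import date

def month_window(today: date, months_back: int):
    year, month = today.year, today.month - months_back
    while month < 1:
        month += 12
        year -= 1
    start = date(year, month, 1)
    end = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
    return start, end
```

Filtering with `processed_at__gte=start, processed_at__lt=end` against such windows would attribute every payment to exactly one month.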
@staff_member_required
def usage_report(request):
    """Credit usage and AI operations analytics"""
    from igny8_core.business.billing.models import CreditUsageLog

    # Usage by operation type
    usage_by_operation = CreditUsageLog.objects.values(
        'operation_type'
    ).annotate(
        total_credits=Sum('credits_used'),
        total_cost=Sum('cost_usd'),
        operation_count=Count('id')
    ).order_by('-total_credits')

    # Format operation types as Title Case
    for usage in usage_by_operation:
        usage['operation_type'] = usage['operation_type'].replace('_', ' ').title() if usage['operation_type'] else 'Unknown'

    # Top credit consumers
    top_consumers = CreditUsageLog.objects.values(
        'account__name'
    ).annotate(
        total_credits=Sum('credits_used'),
        operation_count=Count('id')
    ).order_by('-total_credits')[:10]

    # Model usage distribution
    model_usage = CreditUsageLog.objects.values(
        'model_used'
    ).annotate(
        usage_count=Count('id')
    ).order_by('-usage_count')

    # Total credits used
    total_credits = CreditUsageLog.objects.aggregate(
        total=Sum('credits_used')
    )['total'] or 0

    context = {
        'title': 'Usage Report',
        'usage_by_operation': list(usage_by_operation),
        'top_consumers': list(top_consumers),
        'model_usage': list(model_usage),
        'total_credits': int(total_credits),
    }

    # Merge with admin context
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/reports/usage.html', context)

@staff_member_required
def content_report(request):
    """Content production analytics"""
    from igny8_core.modules.writer.models import Content, Tasks

    # Content by type
    content_by_type = Content.objects.values(
        'content_type'
    ).annotate(count=Count('id')).order_by('-count')

    # Production timeline (last 30 days)
    days = []
    daily_counts = []
    for i in range(30):
        day = timezone.now().date() - timedelta(days=i)
        count = Content.objects.filter(created_at__date=day).count()
        days.insert(0, day.strftime('%m/%d'))
        daily_counts.insert(0, count)

    # Average word count by content type
    avg_words = Content.objects.values('content_type').annotate(
        avg_words=Avg('word_count')
    ).order_by('-avg_words')

    # Task completion rate
    total_tasks = Tasks.objects.count()
    completed_tasks = Tasks.objects.filter(status='completed').count()
    completion_rate = (completed_tasks / total_tasks * 100) if total_tasks > 0 else 0

    # Total content produced
    total_content = Content.objects.count()

    context = {
        'title': 'Content Production Report',
        'content_by_type': list(content_by_type),
        'days': json.dumps(days),
        'daily_counts': json.dumps(daily_counts),
        'avg_words': list(avg_words),
        'completion_rate': round(completion_rate, 1),
        'total_content': total_content,
        'total_tasks': total_tasks,
        'completed_tasks': completed_tasks,
    }

    # Merge with admin context
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/reports/content.html', context)

@staff_member_required
def data_quality_report(request):
    """Check data quality and integrity"""
    issues = []

    # Orphaned content (no site)
    from igny8_core.modules.writer.models import Content
    orphaned_content = Content.objects.filter(site__isnull=True).count()
    if orphaned_content > 0:
        issues.append({
            'severity': 'warning',
            'type': 'Orphaned Records',
            'count': orphaned_content,
            'description': 'Content items without assigned site',
            'action_url': '/admin/writer/content/?site__isnull=True'
        })

    # Tasks without clusters
    from igny8_core.modules.writer.models import Tasks
    tasks_no_cluster = Tasks.objects.filter(cluster__isnull=True).count()
    if tasks_no_cluster > 0:
        issues.append({
            'severity': 'info',
            'type': 'Missing Relationships',
            'count': tasks_no_cluster,
            'description': 'Tasks without assigned cluster',
            'action_url': '/admin/writer/tasks/?cluster__isnull=True'
        })

    # Accounts with negative credits
    from igny8_core.auth.models import Account
    negative_credits = Account.objects.filter(credits__lt=0).count()
    if negative_credits > 0:
        issues.append({
            'severity': 'error',
            'type': 'Data Integrity',
            'count': negative_credits,
            'description': 'Accounts with negative credit balance',
            'action_url': '/admin/igny8_core_auth/account/?credits__lt=0'
        })

    # Duplicate keywords
    from igny8_core.modules.planner.models import Keywords
    duplicates = Keywords.objects.values('seed_keyword', 'site', 'sector').annotate(
        count=Count('id')
    ).filter(count__gt=1).count()
    if duplicates > 0:
        issues.append({
            'severity': 'warning',
            'type': 'Duplicates',
            'count': duplicates,
            'description': 'Duplicate keywords for same site/sector',
            'action_url': '/admin/planner/keywords/'
        })

    # Content without SEO data
    no_seo = Content.objects.filter(
        Q(meta_title__isnull=True) | Q(meta_title='') |
        Q(meta_description__isnull=True) | Q(meta_description='')
    ).count()
    if no_seo > 0:
        issues.append({
            'severity': 'info',
            'type': 'Incomplete Data',
            'count': no_seo,
            'description': 'Content missing SEO metadata',
            'action_url': '/admin/writer/content/'
        })

    context = {
        'title': 'Data Quality Report',
        'issues': issues,
        'total_issues': len(issues),
    }

    # Merge with admin context
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/reports/data_quality.html', context)

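The duplicate-keywords query groups rows on the composite key `(seed_keyword, site, sector)` and counts groups with more than one row. The same grouping, sketched in plain Python to make the semantics concrete (`duplicate_group_count` is an illustrative helper, not part of the codebase):

```python
# Sketch: count duplicate groups the way the ORM query above does --
# group rows by a composite key and count keys that occur more than once.
from collections import Counter

def duplicate_group_count(rows, key_fields=('seed_keyword', 'site', 'sector')):
    counts = Counter(tuple(row[f] for f in key_fields) for row in rows)
    return sum(1 for n in counts.values() if n > 1)

rows = [
    {'seed_keyword': 'seo', 'site': 1, 'sector': 'tech'},
    {'seed_keyword': 'seo', 'site': 1, 'sector': 'tech'},
    {'seed_keyword': 'seo', 'site': 2, 'sector': 'tech'},
]
```

Note the report counts duplicate *groups*, not duplicate *rows*: three identical rows still count as one issue.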
@staff_member_required
def token_usage_report(request):
    """Comprehensive token usage analytics with multi-dimensional insights"""
    from igny8_core.business.billing.models import CreditUsageLog
    from igny8_core.auth.models import Account
    from decimal import Decimal

    # Date filter setup
    days_filter = request.GET.get('days', '30')
    try:
        days = int(days_filter)
    except ValueError:
        days = 30

    start_date = timezone.now() - timedelta(days=days)

    # Base queryset - include all records (tokens may be 0 for historical data)
    logs = CreditUsageLog.objects.filter(
        created_at__gte=start_date
    )

    # Total statistics
    total_tokens_input = logs.aggregate(total=Sum('tokens_input'))['total'] or 0
    total_tokens_output = logs.aggregate(total=Sum('tokens_output'))['total'] or 0
    total_tokens = total_tokens_input + total_tokens_output
    total_calls = logs.count()
    avg_tokens_per_call = total_tokens / total_calls if total_calls > 0 else 0

    # Token usage by model
    token_by_model = logs.values('model_used').annotate(
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output'),
        call_count=Count('id'),
        total_cost=Sum('cost_usd')
    ).order_by('-total_tokens_input')[:10]

    # Add total_tokens to each model and sort by total
    for model in token_by_model:
        model['total_tokens'] = (model['total_tokens_input'] or 0) + (model['total_tokens_output'] or 0)
        model['avg_tokens'] = model['total_tokens'] / model['call_count'] if model['call_count'] > 0 else 0
        model['model'] = model['model_used']  # Add alias for template
    token_by_model = sorted(token_by_model, key=lambda x: x['total_tokens'], reverse=True)

    # Token usage by function/operation
    token_by_function = logs.values('operation_type').annotate(
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output'),
        call_count=Count('id'),
        total_cost=Sum('cost_usd')
    ).order_by('-total_tokens_input')[:10]

    # Add total_tokens to each function and sort by total
    for func in token_by_function:
        func['total_tokens'] = (func['total_tokens_input'] or 0) + (func['total_tokens_output'] or 0)
        func['avg_tokens'] = func['total_tokens'] / func['call_count'] if func['call_count'] > 0 else 0
        # Format operation_type as Title Case
        func['function'] = func['operation_type'].replace('_', ' ').title() if func['operation_type'] else 'Unknown'
    token_by_function = sorted(token_by_function, key=lambda x: x['total_tokens'], reverse=True)

    # Token usage by account (top consumers)
    token_by_account = logs.values('account__name', 'account_id').annotate(
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output'),
        call_count=Count('id'),
        total_cost=Sum('cost_usd')
    ).order_by('-total_tokens_input')[:15]

    # Add total_tokens to each account and sort by total
    for account in token_by_account:
        account['total_tokens'] = (account['total_tokens_input'] or 0) + (account['total_tokens_output'] or 0)
    token_by_account = sorted(token_by_account, key=lambda x: x['total_tokens'], reverse=True)[:15]

    # Daily token trends (time series)
    daily_data = []
    daily_labels = []
    for i in range(days):
        day = timezone.now().date() - timedelta(days=days-i-1)
        day_logs = logs.filter(created_at__date=day)
        day_tokens_input = day_logs.aggregate(total=Sum('tokens_input'))['total'] or 0
        day_tokens_output = day_logs.aggregate(total=Sum('tokens_output'))['total'] or 0
        day_tokens = day_tokens_input + day_tokens_output
        daily_labels.append(day.strftime('%m/%d'))
        daily_data.append(int(day_tokens))

    # Token efficiency metrics (CreditUsageLog doesn't have error field, so assume all successful)
    success_rate = 100.0
    successful_tokens = total_tokens
    wasted_tokens = 0

    # Create tokens_by_status for template compatibility
    tokens_by_status = [{
        'error': None,
        'total_tokens': total_tokens,
        'call_count': total_calls,
        'avg_tokens': avg_tokens_per_call
    }]

    # Peak usage times (hour of day)
    # NOTE: EXTRACT(hour FROM ...) via .extra() is PostgreSQL-specific raw SQL
    hourly_usage = logs.extra(
        select={'hour': "EXTRACT(hour FROM created_at)"}
    ).values('hour').annotate(
        token_input=Sum('tokens_input'),
        token_output=Sum('tokens_output'),
        call_count=Count('id')
    ).order_by('hour')

    # Add total token_count for each hour
    for hour_data in hourly_usage:
        hour_data['token_count'] = (hour_data['token_input'] or 0) + (hour_data['token_output'] or 0)

    # Cost efficiency
    total_cost = logs.aggregate(total=Sum('cost_usd'))['total'] or Decimal('0.00')
    cost_per_1k_tokens = float(total_cost) / (total_tokens / 1000) if total_tokens > 0 else 0.0

    context = {
        'title': 'Token Usage Report',
        'days_filter': days,
        'total_tokens': int(total_tokens),
        'total_calls': total_calls,
        'avg_tokens_per_call': round(avg_tokens_per_call, 2),
        'token_by_model': list(token_by_model),
        'token_by_function': list(token_by_function),
        'token_by_account': list(token_by_account),
        'daily_labels': json.dumps(daily_labels),
        'daily_data': json.dumps(daily_data),
        'tokens_by_status': list(tokens_by_status),
        'success_rate': round(success_rate, 2),
        'successful_tokens': int(successful_tokens),
        'wasted_tokens': int(wasted_tokens),
        'hourly_usage': list(hourly_usage),
        'total_cost': float(total_cost),
        'cost_per_1k_tokens': float(cost_per_1k_tokens),
        'current_app': '_reports',  # For active menu state
    }

    # Merge with admin context
    from igny8_core.admin.site import admin_site
    admin_context = admin_site.each_context(request)
    context.update(admin_context)

    return render(request, 'admin/reports/token_usage.html', context)

@staff_member_required
def ai_cost_analysis(request):
    """Multi-dimensional AI cost analysis with model pricing, trends, and predictions"""
    from igny8_core.business.billing.models import CreditUsageLog
    from igny8_core.auth.models import Account
    from decimal import Decimal

    # Date filter setup
    days_filter = request.GET.get('days', '30')
    try:
        days = int(days_filter)
    except ValueError:
        days = 30

    start_date = timezone.now() - timedelta(days=days)

    # Base queryset - filter for records with cost data
    logs = CreditUsageLog.objects.filter(
        created_at__gte=start_date,
        cost_usd__isnull=False
    )

    # Overall cost metrics
    total_cost = logs.aggregate(total=Sum('cost_usd'))['total'] or Decimal('0.00')
    total_calls = logs.count()
    avg_cost_per_call = logs.aggregate(avg=Avg('cost_usd'))['avg'] or Decimal('0.00')
    total_tokens_input = logs.aggregate(total=Sum('tokens_input'))['total'] or 0
    total_tokens_output = logs.aggregate(total=Sum('tokens_output'))['total'] or 0
    total_tokens = total_tokens_input + total_tokens_output

    # Revenue & Margin calculation
    from igny8_core.business.billing.models import BillingConfiguration
    billing_config = BillingConfiguration.get_config()
    total_credits_charged = logs.aggregate(total=Sum('credits_used'))['total'] or 0
    total_revenue = Decimal(total_credits_charged) * billing_config.default_credit_price_usd
    total_margin = total_revenue - total_cost
    margin_percentage = float((total_margin / total_revenue * 100) if total_revenue > 0 else 0)

    # Per-unit margins
    # Calculate per 1M tokens (margin per million tokens)
    margin_per_1m_tokens = float(total_margin) / (total_tokens / 1_000_000) if total_tokens > 0 else 0
    # Calculate per 1K credits (margin per thousand credits)
    margin_per_1k_credits = float(total_margin) / (total_credits_charged / 1000) if total_credits_charged > 0 else 0

    # Cost by model with efficiency metrics
    cost_by_model = logs.values('model_used').annotate(
        total_cost=Sum('cost_usd'),
        call_count=Count('id'),
        avg_cost=Avg('cost_usd'),
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output')
    ).order_by('-total_cost')

    # Add cost efficiency and margin for each model
    for model in cost_by_model:
        model['total_tokens'] = (model['total_tokens_input'] or 0) + (model['total_tokens_output'] or 0)
        model['avg_tokens'] = model['total_tokens'] / model['call_count'] if model['call_count'] > 0 else 0
        model['model'] = model['model_used']  # Add alias for template
        if model['total_tokens'] and model['total_tokens'] > 0:
            model['cost_per_1k_tokens'] = float(model['total_cost']) / (model['total_tokens'] / 1000)
        else:
            model['cost_per_1k_tokens'] = 0

        # Calculate margin for this model
        model_credits = logs.filter(model_used=model['model_used']).aggregate(total=Sum('credits_used'))['total'] or 0
        model_revenue = Decimal(model_credits) * billing_config.default_credit_price_usd
        model_margin = model_revenue - model['total_cost']
        model['revenue'] = float(model_revenue)
        model['margin'] = float(model_margin)
        model['margin_percentage'] = float((model_margin / model_revenue * 100) if model_revenue > 0 else 0)

    # Cost by account (top spenders)
    cost_by_account = logs.values('account__name', 'account_id').annotate(
        total_cost=Sum('cost_usd'),
        call_count=Count('id'),
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output'),
        avg_cost=Avg('cost_usd')
    ).order_by('-total_cost')[:15]

    # Add total_tokens to each account
    for account in cost_by_account:
        account['total_tokens'] = (account['total_tokens_input'] or 0) + (account['total_tokens_output'] or 0)

    # Cost by function/operation
    cost_by_function = logs.values('operation_type').annotate(
        total_cost=Sum('cost_usd'),
        call_count=Count('id'),
        avg_cost=Avg('cost_usd'),
        total_tokens_input=Sum('tokens_input'),
        total_tokens_output=Sum('tokens_output')
    ).order_by('-total_cost')[:10]

    # Add total_tokens, function alias, and margin
    for func in cost_by_function:
        func['total_tokens'] = (func['total_tokens_input'] or 0) + (func['total_tokens_output'] or 0)
# Format operation_type as Title Case
|
||||
func['function'] = func['operation_type'].replace('_', ' ').title() if func['operation_type'] else 'Unknown'
|
||||
|
||||
# Calculate margin for this operation
|
||||
func_credits = logs.filter(operation_type=func['operation_type']).aggregate(total=Sum('credits_used'))['total'] or 0
|
||||
func_revenue = Decimal(func_credits) * billing_config.default_credit_price_usd
|
||||
func_margin = func_revenue - func['total_cost']
|
||||
func['revenue'] = float(func_revenue)
|
||||
func['margin'] = float(func_margin)
|
||||
func['margin_percentage'] = float((func_margin / func_revenue * 100) if func_revenue > 0 else 0)
|
||||
|
||||
# Daily cost trends (time series)
|
||||
daily_cost_data = []
|
||||
daily_cost_labels = []
|
||||
daily_call_data = []
|
||||
|
||||
for i in range(days):
|
||||
day = timezone.now().date() - timedelta(days=days-i-1)
|
||||
day_logs = logs.filter(created_at__date=day)
|
||||
day_cost = day_logs.aggregate(total=Sum('cost_usd'))['total'] or Decimal('0.00')
|
||||
day_calls = day_logs.count()
|
||||
|
||||
daily_cost_labels.append(day.strftime('%m/%d'))
|
||||
daily_cost_data.append(float(day_cost))
|
||||
daily_call_data.append(day_calls)
|
||||
|
||||
# Cost prediction (simple linear extrapolation)
|
||||
if len(daily_cost_data) > 7:
|
||||
recent_avg_daily = sum(daily_cost_data[-7:]) / 7
|
||||
projected_monthly = recent_avg_daily * 30
|
||||
else:
|
||||
projected_monthly = 0
|
||||
|
||||
# Failed requests cost (CreditUsageLog doesn't track errors, so no failed cost)
|
||||
failed_cost = Decimal('0.00')
|
||||
|
||||
# Cost anomalies (calls costing > 3x average)
|
||||
if avg_cost_per_call > 0:
|
||||
anomaly_threshold = float(avg_cost_per_call) * 3
|
||||
anomalies = logs.filter(cost_usd__gt=anomaly_threshold).values(
|
||||
'model_used', 'operation_type', 'account__name', 'cost_usd', 'tokens_input', 'tokens_output', 'created_at'
|
||||
).order_by('-cost_usd')[:10]
|
||||
# Add aliases and calculate total tokens for each anomaly
|
||||
for anomaly in anomalies:
|
||||
anomaly['model'] = anomaly['model_used']
|
||||
# Format operation_type as Title Case
|
||||
anomaly['function'] = anomaly['operation_type'].replace('_', ' ').title() if anomaly['operation_type'] else 'Unknown'
|
||||
anomaly['cost'] = anomaly['cost_usd']
|
||||
anomaly['tokens'] = (anomaly['tokens_input'] or 0) + (anomaly['tokens_output'] or 0)
|
||||
else:
|
||||
anomalies = []
|
||||
|
||||
# Model comparison matrix
|
||||
model_comparison = []
|
||||
for model_data in cost_by_model:
|
||||
model_name = model_data['model']
|
||||
model_comparison.append({
|
||||
'model': model_name,
|
||||
'total_cost': float(model_data['total_cost']),
|
||||
'calls': model_data['call_count'],
|
||||
'avg_cost': float(model_data['avg_cost']),
|
||||
'total_tokens': model_data['total_tokens'],
|
||||
'cost_per_1k': model_data['cost_per_1k_tokens'],
|
||||
})
|
||||
|
||||
# Cost distribution percentages
|
||||
if total_cost > 0:
|
||||
for item in cost_by_model:
|
||||
item['cost_percentage'] = float((item['total_cost'] / total_cost) * 100)
|
||||
|
||||
# Peak cost hours
|
||||
hourly_cost = logs.extra(
|
||||
select={'hour': "EXTRACT(hour FROM created_at)"}
|
||||
).values('hour').annotate(
|
||||
total_cost=Sum('cost_usd'),
|
||||
call_count=Count('id')
|
||||
).order_by('hour')
|
||||
|
||||
# Cost efficiency score (CreditUsageLog doesn't track errors, assume all successful)
|
||||
successful_cost = total_cost
|
||||
efficiency_score = 100.0
|
||||
|
||||
context = {
|
||||
'title': 'AI Cost & Margin Analysis',
|
||||
'days_filter': days,
|
||||
'total_cost': float(total_cost),
|
||||
'total_revenue': float(total_revenue),
|
||||
'total_margin': float(total_margin),
|
||||
'margin_percentage': round(margin_percentage, 2),
|
||||
'margin_per_1m_tokens': round(margin_per_1m_tokens, 4),
|
||||
'margin_per_1k_credits': round(margin_per_1k_credits, 4),
|
||||
'total_credits_charged': total_credits_charged,
|
||||
'credit_price': float(billing_config.default_credit_price_usd),
|
||||
'total_calls': total_calls,
|
||||
'avg_cost_per_call': float(avg_cost_per_call),
|
||||
'total_tokens': int(total_tokens),
|
||||
'cost_by_model': list(cost_by_model),
|
||||
'cost_by_account': list(cost_by_account),
|
||||
'cost_by_function': list(cost_by_function),
|
||||
'daily_cost_labels': json.dumps(daily_cost_labels),
|
||||
'daily_cost_data': json.dumps(daily_cost_data),
|
||||
'daily_call_data': json.dumps(daily_call_data),
|
||||
'projected_monthly': round(projected_monthly, 2),
|
||||
'failed_cost': float(failed_cost),
|
||||
'wasted_percentage': float((failed_cost / total_cost * 100) if total_cost > 0 else 0),
|
||||
'anomalies': list(anomalies),
|
||||
'model_comparison': model_comparison,
|
||||
'hourly_cost': list(hourly_cost),
|
||||
'efficiency_score': round(efficiency_score, 2),
|
||||
'successful_cost': float(successful_cost),
|
||||
'current_app': '_reports', # For active menu state
|
||||
}
|
||||
|
||||
# Merge with admin context
|
||||
from igny8_core.admin.site import admin_site
|
||||
admin_context = admin_site.each_context(request)
|
||||
context.update(admin_context)
|
||||
|
||||
return render(request, 'admin/reports/ai_cost_analysis.html', context)
|
||||
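The revenue and margin arithmetic used throughout the report above (revenue = credits charged × credit price; margin % = margin / revenue) can be sketched in isolation. This is a minimal standalone illustration, not code from the repository; the function name and the sample figures are invented:

```python
from decimal import Decimal

def margin_summary(credits_charged: int, credit_price_usd: Decimal, cost_usd: Decimal) -> dict:
    """Compute revenue, absolute margin, and margin percentage for a usage window."""
    revenue = Decimal(credits_charged) * credit_price_usd
    margin = revenue - cost_usd
    # Guard against division by zero, as the report does with its inline conditionals
    margin_pct = float(margin / revenue * 100) if revenue > 0 else 0.0
    return {"revenue": revenue, "margin": margin, "margin_percentage": margin_pct}

# 10,000 credits sold at $0.01 each against $60 of provider cost
summary = margin_summary(10_000, Decimal("0.01"), Decimal("60.00"))
# → revenue 100.00, margin 40.00, margin_percentage 40.0
```

Keeping the money values in `Decimal` until the final `float()` conversion mirrors the view's own pattern and avoids binary floating-point drift in the intermediate sums.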
@@ -1,354 +1,134 @@
"""
Custom AdminSite for IGNY8 to organize models into proper groups using Unfold
NO EMOJIS - Unfold handles all icons via Material Design
Custom AdminSite for IGNY8 to organize models into proper groups
"""
from django.contrib import admin
from django.contrib.admin.apps import AdminConfig
from django.apps import apps
from django.urls import path, reverse_lazy
from django.shortcuts import redirect
from django.contrib.admin import sites
from unfold.admin import ModelAdmin as UnfoldModelAdmin
from unfold.sites import UnfoldAdminSite


class Igny8AdminSite(UnfoldAdminSite):
class Igny8AdminSite(admin.AdminSite):
    """
    Custom AdminSite based on Unfold that organizes models into the planned groups
    Custom AdminSite that organizes models into the planned groups:
    1. Billing & Tenancy
    2. Sites & Users
    3. Global Reference Data
    4. Planner
    5. Writer Module
    6. Thinker Module
    7. System Configuration
    """
    site_header = 'IGNY8 Administration'
    site_title = 'IGNY8 Admin'
    index_title = 'IGNY8 Administration'

    def get_urls(self):
        """Get admin URLs with dashboard, reports, and monitoring pages available"""
        from django.urls import path
        from .dashboard import admin_dashboard
        from .reports import (
            revenue_report, usage_report, content_report, data_quality_report,
            token_usage_report, ai_cost_analysis
        )
        from .monitoring import (
            system_health_dashboard, api_monitor_dashboard, debug_console
        )

        urls = super().get_urls()
        custom_urls = [
            # Dashboard
            path('dashboard/', self.admin_view(admin_dashboard), name='dashboard'),

            # Reports
            path('reports/revenue/', self.admin_view(revenue_report), name='report_revenue'),
            path('reports/usage/', self.admin_view(usage_report), name='report_usage'),
            path('reports/content/', self.admin_view(content_report), name='report_content'),
            path('reports/data-quality/', self.admin_view(data_quality_report), name='report_data_quality'),
            path('reports/token-usage/', self.admin_view(token_usage_report), name='report_token_usage'),
            path('reports/ai-cost-analysis/', self.admin_view(ai_cost_analysis), name='report_ai_cost_analysis'),

            # Monitoring (NEW)
            path('monitoring/system-health/', self.admin_view(system_health_dashboard), name='monitoring_system_health'),
            path('monitoring/api-monitor/', self.admin_view(api_monitor_dashboard), name='monitoring_api_monitor'),
            path('monitoring/debug-console/', self.admin_view(debug_console), name='monitoring_debug_console'),
        ]
        return custom_urls + urls

    def index(self, request, extra_context=None):
        """Redirect to custom dashboard"""
        from django.shortcuts import redirect
        return redirect('admin:dashboard')

    def get_sidebar_list(self, request):
        """
        Override Unfold's get_sidebar_list to return our custom app groups
        Convert Django app_list format to Unfold sidebar navigation format
        """
        # Get our custom Django app list
        django_apps = self.get_app_list(request, app_label=None)

        # Convert to Unfold navigation format: {title, items: [{title, link, icon}]}
        sidebar_groups = []

        for app in django_apps:
            group = {
                'title': app['name'],
                'collapsible': True,
                'items': []
            }

            # Convert each model to navigation item
            for model in app.get('models', []):
                if model.get('perms', {}).get('view', False) or model.get('perms', {}).get('change', False):
                    item = {
                        'title': model['name'],
                        'link': model['admin_url'],
                        'icon': None,  # Unfold will use default
                        'has_permission': True,  # CRITICAL: Template checks this
                    }
                    group['items'].append(item)

            # Only add groups that have items
            if group['items']:
                sidebar_groups.append(group)

        return sidebar_groups

    def each_context(self, request):
        """
        Override context to ensure our custom app_list is always used
        This is called by all admin templates for sidebar rendering

        CRITICAL FIX: Force custom sidebar on ALL pages including model detail/list views
        """
        # CRITICAL: Must call parent to get sidebar_navigation set
        context = super().each_context(request)

        # DEBUGGING: Print to console what parent returned
        print(f"\n=== DEBUG each_context for {request.path} ===")
        print(f"sidebar_navigation length from parent: {len(context.get('sidebar_navigation', []))}")
        if context.get('sidebar_navigation'):
            print(f"First sidebar group: {context['sidebar_navigation'][0].get('title', 'NO TITLE')}")

        # Force our custom app list to be used everywhere - IGNORE app_label parameter
        custom_apps = self.get_app_list(request, app_label=None)
        context['available_apps'] = custom_apps
        context['app_list'] = custom_apps  # Also set app_list for compatibility

        # CRITICAL FIX: Ensure sidebar_navigation is using our custom sidebar
        # Parent's each_context already called get_sidebar_list(), which returns our custom sidebar
        # So sidebar_navigation should already be correct, but let's verify
        if not context.get('sidebar_navigation') or len(context.get('sidebar_navigation', [])) == 0:
            # If sidebar_navigation is empty, force it
            print("WARNING: sidebar_navigation was empty, forcing it!")
            context['sidebar_navigation'] = self.get_sidebar_list(request)

        print(f"Final sidebar_navigation length: {len(context['sidebar_navigation'])}")
        print("=== END DEBUG ===\n")

        return context

    def get_app_list(self, request, app_label=None):
    def get_app_list(self, request):
        """
        Customize the app list to organize models into logical groups
        NO EMOJIS - Unfold handles all icons via Material Design

        Args:
            request: The HTTP request
            app_label: IGNORED - Always return full custom sidebar for consistency
        Customize the app list to organize models into proper groups
        """
        # CRITICAL: Always build full app_dict (ignore app_label) for consistent sidebar
        app_dict = self._build_app_dict(request, None)
        # Get the default app list
        app_dict = self._build_app_dict(request)

        # Define our custom groups with their models (using object_name)
        # Organized by business function - Material icons configured in Unfold
        custom_groups = {
            'Accounts & Tenancy': {
            'Billing & Tenancy': {
                'models': [
                    ('igny8_core_auth', 'Plan'),
                    ('igny8_core_auth', 'Account'),
                    ('igny8_core_auth', 'User'),
                    ('igny8_core_auth', 'Site'),
                    ('igny8_core_auth', 'Sector'),
                    ('igny8_core_auth', 'SiteUserAccess'),
                    ('igny8_core_auth', 'Subscription'),
                    ('billing', 'CreditTransaction'),
                    ('billing', 'CreditUsageLog'),
                ],
            },
            'Global Resources': {
            'Sites & Users': {
                'models': [
                    ('igny8_core_auth', 'Site'),
                    ('igny8_core_auth', 'User'),
                    ('igny8_core_auth', 'SiteUserAccess'),
                    ('igny8_core_auth', 'PasswordResetToken'),
                ],
            },
            'Global Reference Data': {
                'models': [
                    ('igny8_core_auth', 'Industry'),
                    ('igny8_core_auth', 'IndustrySector'),
                    ('igny8_core_auth', 'SeedKeyword'),
                ],
            },
            'Global Settings': {
                'models': [
                    ('system', 'GlobalIntegrationSettings'),
                    ('system', 'GlobalModuleSettings'),
                    ('system', 'GlobalAIPrompt'),
                    ('system', 'GlobalAuthorProfile'),
                    ('system', 'GlobalStrategy'),
                ],
            },
            'Plans and Billing': {
                'models': [
                    ('igny8_core_auth', 'Plan'),
                    ('igny8_core_auth', 'Subscription'),
                    ('billing', 'BillingConfiguration'),
                    ('billing', 'Invoice'),
                    ('billing', 'Payment'),
                    ('billing', 'CreditPackage'),
                    ('billing', 'PaymentMethodConfig'),
                    ('billing', 'AccountPaymentMethod'),
                ],
            },
            'Credits': {
                'models': [
                    ('billing', 'CreditTransaction'),
                    ('billing', 'CreditUsageLog'),
                    ('billing', 'CreditCostConfig'),
                    ('billing', 'PlanLimitUsage'),
                ],
            },
            'Content Planning': {
            'Planner': {
                'models': [
                    ('planner', 'Keywords'),
                    ('planner', 'Clusters'),
                    ('planner', 'ContentIdeas'),
                ],
            },
            'Content Generation': {
            'Writer Module': {
                'models': [
                    ('writer', 'Tasks'),
                    ('writer', 'Content'),
                    ('writer', 'Images'),
                    ('writer', 'ImagePrompts'),
                ],
            },
            'Taxonomy & Organization': {
            'Thinker Module': {
                'models': [
                    ('writer', 'ContentTaxonomy'),
                    ('writer', 'ContentTaxonomyRelation'),
                    ('writer', 'ContentClusterMap'),
                    ('writer', 'ContentAttribute'),
                    ('system', 'AIPrompt'),
                    ('system', 'AuthorProfile'),
                    ('system', 'Strategy'),
                ],
            },
            'Publishing & Integration': {
                'models': [
                    ('integration', 'SiteIntegration'),
                    ('integration', 'SyncEvent'),
                    ('publishing', 'PublishingRecord'),
                    ('system', 'PublishingChannel'),
                    ('publishing', 'DeploymentRecord'),
                ],
            },
            'AI & Automation': {
            'System Configuration': {
                'models': [
                    ('system', 'IntegrationSettings'),
                    ('system', 'AIPrompt'),
                    ('system', 'Strategy'),
                    ('system', 'AuthorProfile'),
                    ('system', 'APIKey'),
                    ('system', 'WebhookConfig'),
                    ('automation', 'AutomationConfig'),
                    ('automation', 'AutomationRun'),
                ],
            },
            'System Settings': {
                'models': [
                    ('contenttypes', 'ContentType'),
                    ('system', 'ContentTemplate'),
                    ('system', 'TaxonomyConfig'),
                    ('system', 'SystemSetting'),
                    ('system', 'ContentTypeConfig'),
                    ('system', 'NotificationConfig'),
                ],
            },
            'Django Admin': {
                'models': [
                    ('auth', 'Group'),
                    ('auth', 'Permission'),
                    ('igny8_core_auth', 'PasswordResetToken'),
                    ('sessions', 'Session'),
                ],
            },
            'Tasks & Logging': {
                'models': [
                    ('ai', 'AITaskLog'),
                    ('system', 'AuditLog'),
                    ('admin', 'LogEntry'),
                    ('django_celery_results', 'TaskResult'),
                    ('django_celery_results', 'GroupResult'),
                    ('system', 'SystemLog'),
                    ('system', 'SystemStatus'),
                    ('system', 'SystemSettings'),
                    ('system', 'AccountSettings'),
                    ('system', 'UserSettings'),
                    ('system', 'ModuleSettings'),
                    ('system', 'AISettings'),
                ],
            },
        }

        # ALWAYS build and return our custom organized app list
        # regardless of app_label parameter (for consistent sidebar on all pages)
        organized_apps = []

        # Add Dashboard link as first item
        organized_apps.append({
            'name': '📊 Dashboard',
            'app_label': '_dashboard',
            'app_url': '/admin/dashboard/',
            'has_module_perms': True,
            'models': [],
        })

        # Add Reports section with links to all reports
        organized_apps.append({
            'name': 'Reports & Analytics',
            'app_label': '_reports',
            'app_url': '#',
            'has_module_perms': True,
            'models': [
                {
                    'name': 'Revenue Report',
                    'object_name': 'RevenueReport',
                    'admin_url': '/admin/reports/revenue/',
                    'view_only': True,
                    'perms': {'view': True},
                },
                {
                    'name': 'Usage Report',
                    'object_name': 'UsageReport',
                    'admin_url': '/admin/reports/usage/',
                    'view_only': True,
                    'perms': {'view': True},
                },
                {
                    'name': 'Content Report',
                    'object_name': 'ContentReport',
                    'admin_url': '/admin/reports/content/',
                    'view_only': True,
                    'perms': {'view': True},
                },
                {
                    'name': 'Data Quality Report',
                    'object_name': 'DataQualityReport',
                    'admin_url': '/admin/reports/data-quality/',
                    'view_only': True,
                    'perms': {'view': True},
                },
                {
                    'name': 'Token Usage Report',
                    'object_name': 'TokenUsageReport',
                    'admin_url': '/admin/reports/token-usage/',
                    'view_only': True,
                    'perms': {'view': True},
                },
                {
                    'name': 'AI Cost Analysis',
                    'object_name': 'AICostAnalysis',
                    'admin_url': '/admin/reports/ai-cost-analysis/',
                    'view_only': True,
                    'perms': {'view': True},
                },
            ],
        })
        # Build the custom app list
        app_list = []

        for group_name, group_config in custom_groups.items():
            group_models = []

            for app_label, model_name in group_config['models']:
                # Find the model in app_dict
                for app in app_dict.values():
                    if app['app_label'] == app_label:
                        for model in app.get('models', []):
                            if model['object_name'] == model_name:
                                group_models.append(model)
                                break
                if app_label in app_dict:
                    app_data = app_dict[app_label]
                    # Look for the model in the app's models
                    for model in app_data.get('models', []):
                        if model['object_name'] == model_name:
                            group_models.append(model)
                            break

            # Only add the group if it has models
            if group_models:
                # Get the first model's app_label to use as the real app_label
                first_model_app_label = group_config['models'][0][0]
                organized_apps.append({
                app_list.append({
                    'name': group_name,
                    'app_label': first_model_app_label,  # Use real app_label, not fake one
                    'app_url': f'/admin/{first_model_app_label}/',  # Real URL, not '#'
                    'app_label': group_name.lower().replace(' ', '_').replace('&', ''),
                    'app_url': None,
                    'has_module_perms': True,
                    'models': group_models,
                })

        return organized_apps
        # Sort the app list by our custom order
        order = [
            'Billing & Tenancy',
            'Sites & Users',
            'Global Reference Data',
            'Planner',
            'Writer Module',
            'Thinker Module',
            'System Configuration',
        ]

        app_list.sort(key=lambda x: order.index(x['name']) if x['name'] in order else 999)

        return app_list


# Instantiate custom admin site
admin_site = Igny8AdminSite(name='admin')
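The `get_sidebar_list` override in the hunk above converts Django's `app_list` structure into Unfold's `{title, items}` sidebar format, keeping only models the user may view or change and dropping groups that end up empty. A minimal framework-free sketch of that conversion, with the function name and sample data invented for illustration:

```python
def to_sidebar_groups(django_apps: list) -> list:
    """Convert Django admin app_list entries into {title, items} sidebar groups,
    keeping only models the user may view or change, and dropping empty groups."""
    groups = []
    for app in django_apps:
        items = [
            {"title": m["name"], "link": m["admin_url"], "icon": None, "has_permission": True}
            for m in app.get("models", [])
            # mirror the perms check: view OR change grants a sidebar entry
            if m.get("perms", {}).get("view", False) or m.get("perms", {}).get("change", False)
        ]
        if items:  # mirror "only add groups that have items"
            groups.append({"title": app["name"], "collapsible": True, "items": items})
    return groups

apps = [
    {"name": "Planner", "models": [
        {"name": "Keywords", "admin_url": "/admin/planner/keywords/", "perms": {"view": True}},
        {"name": "Clusters", "admin_url": "/admin/planner/clusters/", "perms": {}},
    ]},
    {"name": "Hidden App", "models": [
        {"name": "Secret", "admin_url": "#", "perms": {}},
    ]},
]
# → one "Planner" group containing only the "Keywords" item
```

Setting `has_permission` on every emitted item matters because Unfold's sidebar template checks that key before rendering a link, which is what the "CRITICAL: Template checks this" comment in the original refers to.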
@@ -1,179 +0,0 @@
"""
Custom AdminSite for IGNY8 to organize models into proper groups using Unfold
"""
from django.contrib import admin
from django.contrib.admin.apps import AdminConfig
from django.apps import apps
from django.urls import path, reverse_lazy
from django.shortcuts import redirect
from unfold.admin import ModelAdmin as UnfoldModelAdmin
from unfold.sites import UnfoldAdminSite


class Igny8AdminSite(UnfoldAdminSite):
    """
    Custom AdminSite based on Unfold that organizes models into the planned groups
    """
    site_header = 'IGNY8 Administration'
    site_title = 'IGNY8 Admin'
    index_title = 'IGNY8 Administration'

    def get_urls(self):
        """Get admin URLs without custom dashboard"""
        urls = super().get_urls()
        return urls

    def get_app_list(self, request):
        """
        Customize the app list to organize models into logical groups
        """
        # Get the default app list
        app_dict = self._build_app_dict(request)

        # Define our custom groups with their models (using object_name)
        # Organized by business function with emoji icons for visual recognition
        custom_groups = {
            '💰 Billing & Accounts': {
                'models': [
                    ('igny8_core_auth', 'Plan'),
                    ('billing', 'PlanLimitUsage'),
                    ('igny8_core_auth', 'Account'),
                    ('igny8_core_auth', 'Subscription'),
                    ('billing', 'Invoice'),
                    ('billing', 'Payment'),
                    ('billing', 'CreditTransaction'),
                    ('billing', 'CreditUsageLog'),
                    ('billing', 'CreditPackage'),
                    ('billing', 'PaymentMethodConfig'),
                    ('billing', 'AccountPaymentMethod'),
                    ('billing', 'CreditCostConfig'),
                ],
            },
            '👥 Sites & Users': {
                'models': [
                    ('igny8_core_auth', 'Site'),
                    ('igny8_core_auth', 'Sector'),
                    ('igny8_core_auth', 'User'),
                    ('igny8_core_auth', 'SiteUserAccess'),
                    ('igny8_core_auth', 'PasswordResetToken'),
                ],
            },
            '📚 Content Management': {
                'models': [
                    ('writer', 'Content'),
                    ('writer', 'Tasks'),
                    ('writer', 'Images'),
                    ('writer', 'ContentTaxonomy'),
                    ('writer', 'ContentAttribute'),
                    ('writer', 'ContentTaxonomyRelation'),
                    ('writer', 'ContentClusterMap'),
                ],
            },
            '🎯 Planning & Strategy': {
                'models': [
                    ('planner', 'Clusters'),
                    ('planner', 'Keywords'),
                    ('planner', 'ContentIdeas'),
                    ('system', 'Strategy'),
                ],
            },
            '🔗 Integrations & Publishing': {
                'models': [
                    ('integration', 'SiteIntegration'),
                    ('integration', 'SyncEvent'),
                    ('publishing', 'PublishingRecord'),
                    ('publishing', 'DeploymentRecord'),
                ],
            },
            '🤖 AI & Automation': {
                'models': [
                    ('ai', 'AITaskLog'),
                    ('system', 'AIPrompt'),
                    ('automation', 'AutomationConfig'),
                    ('automation', 'AutomationRun'),
                    ('optimization', 'OptimizationTask'),
                ],
            },
            '🌍 Global Reference Data': {
                'models': [
                    ('igny8_core_auth', 'Industry'),
                    ('igny8_core_auth', 'IndustrySector'),
                    ('igny8_core_auth', 'SeedKeyword'),
                ],
            },
            '⚙️ System Configuration': {
                'models': [
                    ('system', 'IntegrationSettings'),
                    ('system', 'AuthorProfile'),
                    ('system', 'SystemSettings'),
                    ('system', 'AccountSettings'),
                    ('system', 'UserSettings'),
                    ('system', 'ModuleSettings'),
                    ('system', 'AISettings'),
                    ('system', 'ModuleEnableSettings'),
                    ('system', 'SystemLog'),
                    ('system', 'SystemStatus'),
                ],
            },
            'Monitoring & Tasks': {
                'models': [
                    ('django_celery_results', 'TaskResult'),
                    ('django_celery_results', 'GroupResult'),
                ],
            },
            '🔧 Django System': {
                'models': [
                    ('admin', 'LogEntry'),
                    ('auth', 'Group'),
                    ('auth', 'Permission'),
                    ('contenttypes', 'ContentType'),
                    ('sessions', 'Session'),
                ],
            },
        }

        # Build the custom app list
        app_list = []

        for group_name, group_config in custom_groups.items():
            group_models = []

            for app_label, model_name in group_config['models']:
                # Find the model in app_dict
                if app_label in app_dict:
                    app_data = app_dict[app_label]
                    # Look for the model in the app's models
                    for model in app_data.get('models', []):
                        if model['object_name'] == model_name:
                            group_models.append(model)
                            break

            # Only add the group if it has models
            if group_models:
                app_list.append({
                    'name': group_name,
                    'app_label': group_name.lower().replace(' ', '_').replace('&', '').replace('emoji', ''),
                    'app_url': None,
                    'has_module_perms': True,
                    'models': group_models,
                })

        # Sort the app list by our custom order
        order = [
            '💰 Billing & Accounts',
            '👥 Sites & Users',
            '📚 Content Management',
            '🎯 Planning & Strategy',
            '🔗 Integrations & Publishing',
            '🤖 AI & Automation',
            '🌍 Global Reference Data',
            '⚙️ System Configuration',
            '🔧 Django System',
        ]

        app_list.sort(key=lambda x: order.index(x['name']) if x['name'] in order else 999)

        return app_list
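The sort at the end of the deleted file above relies on a small idiom: `order.index(x['name']) if x['name'] in order else 999` ranks groups by their position in a curated list and sinks unknown group names to the end. A standalone sketch of that key function, with hypothetical names and data:

```python
def sort_by_custom_order(app_list: list, order: list) -> list:
    """Sort admin groups by their position in `order`;
    names absent from `order` get rank 999 and sink to the end."""
    return sorted(app_list, key=lambda x: order.index(x["name"]) if x["name"] in order else 999)

order = ["Billing", "Planner", "Writer"]
apps = [{"name": "Writer"}, {"name": "Mystery"}, {"name": "Billing"}]
result = sort_by_custom_order(apps, order)
# → Billing, Writer, Mystery
```

Because `sorted` (and `list.sort`) is stable, any groups sharing the fallback rank 999 keep their original relative order, so unlisted groups are appended rather than shuffled.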
@@ -1,179 +0,0 @@
|
||||
"""
|
||||
Custom AdminSite for IGNY8 to organize models into proper groups using Unfold
|
||||
"""
|
||||
from django.contrib import admin
|
||||
from django.contrib.admin.apps import AdminConfig
|
||||
from django.apps import apps
|
||||
from django.urls import path, reverse_lazy
|
||||
from django.shortcuts import redirect
|
||||
from unfold.admin import ModelAdmin as UnfoldModelAdmin
|
||||
from unfold.sites import UnfoldAdminSite
|
||||
|
||||
|
||||
class Igny8AdminSite(UnfoldAdminSite):
|
||||
"""
|
||||
Custom AdminSite based on Unfold that organizes models into the planned groups
|
||||
"""
|
||||
site_header = 'IGNY8 Administration'
|
||||
site_title = 'IGNY8 Admin'
|
||||
index_title = 'IGNY8 Administration'
|
||||
|
||||
def get_urls(self):
|
||||
"""Get admin URLs without custom dashboard"""
|
||||
urls = super().get_urls()
|
||||
return urls
|
||||
|
||||
def get_app_list(self, request):
|
||||
"""
|
||||
Customize the app list to organize models into logical groups
|
||||
"""
|
||||
# Get the default app list
|
||||
app_dict = self._build_app_dict(request)
|
||||
|
||||
# Define our custom groups with their models (using object_name)
|
||||
# Organized by business function with emoji icons for visual recognition
|
||||
custom_groups = {
|
||||
'💰 Billing & Accounts': {
|
||||
'models': [
|
||||
('igny8_core_auth', 'Plan'),
|
||||
('billing', 'PlanLimitUsage'),
|
||||
('igny8_core_auth', 'Account'),
|
||||
('igny8_core_auth', 'Subscription'),
|
||||
('billing', 'Invoice'),
|
||||
('billing', 'Payment'),
|
||||
('billing', 'CreditTransaction'),
|
||||
('billing', 'CreditUsageLog'),
|
||||
('billing', 'CreditPackage'),
|
||||
('billing', 'PaymentMethodConfig'),
|
||||
('billing', 'AccountPaymentMethod'),
|
||||
('billing', 'CreditCostConfig'),
|
||||
],
|
||||
},
|
||||
'👥 Sites & Users': {
|
||||
'models': [
|
||||
('igny8_core_auth', 'Site'),
|
||||
('igny8_core_auth', 'Sector'),
|
||||
('igny8_core_auth', 'User'),
|
||||
('igny8_core_auth', 'SiteUserAccess'),
|
||||
('igny8_core_auth', 'PasswordResetToken'),
|
||||
],
|
||||
},
|
||||
'📚 Content Management': {
|
||||
'models': [
|
||||
('writer', 'Content'),
|
||||
('writer', 'Tasks'),
|
||||
('writer', 'Images'),
|
||||
('writer', 'ContentTaxonomy'),
|
||||
('writer', 'ContentAttribute'),
|
||||
('writer', 'ContentTaxonomyRelation'),
|
||||
('writer', 'ContentClusterMap'),
|
||||
],
|
||||
},
|
||||
'🎯 Planning & Strategy': {
|
||||
'models': [
|
||||
('planner', 'Clusters'),
|
||||
('planner', 'Keywords'),
|
||||
('planner', 'ContentIdeas'),
|
||||
('system', 'Strategy'),
|
||||
],
|
||||
},
|
||||
'🔗 Integrations & Publishing': {
|
||||
'models': [
|
||||
('integration', 'SiteIntegration'),
|
||||
('integration', 'SyncEvent'),
|
||||
('publishing', 'PublishingRecord'),
|
||||
('publishing', 'DeploymentRecord'),
|
||||
],
|
||||
},
|
||||
'🤖 AI & Automation': {
|
||||
'models': [
|
||||
('ai', 'AITaskLog'),
|
||||
('system', 'AIPrompt'),
|
||||
('automation', 'AutomationConfig'),
|
||||
('automation', 'AutomationRun'),
|
||||
('optimization', 'OptimizationTask'),
|
||||
],
|
||||
},
|
||||
'🌍 Global Reference Data': {
|
||||
'models': [
|
||||
('igny8_core_auth', 'Industry'),
|
||||
('igny8_core_auth', 'IndustrySector'),
|
||||
('igny8_core_auth', 'SeedKeyword'),
|
||||
],
|
||||
},
|
||||
'⚙️ System Configuration': {
|
||||
'models': [
|
||||
('system', 'IntegrationSettings'),
|
||||
('system', 'AuthorProfile'),
|
||||
('system', 'SystemSettings'),
|
||||
('system', 'AccountSettings'),
|
||||
('system', 'UserSettings'),
|
||||
('system', 'ModuleSettings'),
|
||||
('system', 'AISettings'),
|
||||
('system', 'ModuleEnableSettings'),
|
||||
('system', 'SystemLog'),
|
||||
('system', 'SystemStatus'),
|
||||
],
|
||||
},
|
||||
'<EFBFBD> Monitoring & Tasks': {
|
||||
'models': [
|
||||
('django_celery_results', 'TaskResult'),
|
||||
('django_celery_results', 'GroupResult'),
|
||||
],
|
||||
},
|
||||
'<EFBFBD>🔧 Django System': {
|
||||
'models': [
|
||||
('admin', 'LogEntry'),
|
||||
('auth', 'Group'),
|
||||
('auth', 'Permission'),
|
||||
('contenttypes', 'ContentType'),
|
||||
('sessions', 'Session'),
|
||||
],
|
||||
},
|
||||
}

# Build the custom app list
app_list = []

for group_name, group_config in custom_groups.items():
    group_models = []

    for app_label, model_name in group_config['models']:
        # Find the model in app_dict
        if app_label in app_dict:
            app_data = app_dict[app_label]
            # Look for the model in the app's models
            for model in app_data.get('models', []):
                if model['object_name'] == model_name:
                    group_models.append(model)
                    break

    # Only add the group if it has models
    if group_models:
        app_list.append({
            'name': group_name,
            'app_label': group_name.lower().replace(' ', '_').replace('&', '').replace('emoji', ''),
            'app_url': None,
            'has_module_perms': True,
            'models': group_models,
        })

# Sort the app list by our custom order
order = [
    '💰 Billing & Accounts',
    '👥 Sites & Users',
    '📚 Content Management',
    '🎯 Planning & Strategy',
    '🔗 Integrations & Publishing',
    '🤖 AI & Automation',
    '🌍 Global Reference Data',
    '⚙️ System Configuration',
    '🔧 Django System',
]

app_list.sort(key=lambda x: order.index(x['name']) if x['name'] in order else 999)

return app_list

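The sort at the end of the block above uses an index-or-sentinel lambda so that known groups keep their position in `order` while anything unlisted sinks to the end. A minimal standalone sketch of that pattern (group names here are illustrative, not the real admin groups):

```python
# Known names keep their position in `order`; unlisted names sort last
# via the 999 sentinel, mirroring the admin app_list sort above.
order = ['Billing', 'Users', 'Content']  # illustrative names

app_list = [
    {'name': 'Content'},
    {'name': 'Monitoring'},  # not in `order` -> sorts last
    {'name': 'Billing'},
]

app_list.sort(key=lambda x: order.index(x['name']) if x['name'] in order else 999)
print([g['name'] for g in app_list])
```

Note the sentinel only needs to exceed `len(order)`; any unlisted groups keep their relative insertion order because `list.sort` is stable.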
@@ -2,27 +2,11 @@
Admin configuration for AI models
"""
from django.contrib import admin
from unfold.admin import ModelAdmin
from igny8_core.admin.base import Igny8ModelAdmin
from igny8_core.ai.models import AITaskLog

from import_export.admin import ExportMixin
from import_export import resources


class AITaskLogResource(resources.ModelResource):
    """Resource class for exporting AI Task Logs"""
    class Meta:
        model = AITaskLog
        fields = ('id', 'function_name', 'account__name', 'status', 'phase',
                  'cost', 'tokens', 'duration', 'created_at')
        export_order = fields


@admin.register(AITaskLog)
class AITaskLogAdmin(ExportMixin, Igny8ModelAdmin):
    resource_class = AITaskLogResource
class AITaskLogAdmin(admin.ModelAdmin):
    """Admin interface for AI task logs"""
    list_display = [
        'function_name',
@@ -64,10 +48,6 @@ class AITaskLogAdmin(ExportMixin, Igny8ModelAdmin):
        'created_at',
        'updated_at'
    ]
    actions = [
        'bulk_delete_old_logs',
        'bulk_mark_reviewed',
    ]

    def has_add_permission(self, request):
        """Logs are created automatically, no manual creation"""
@@ -76,22 +56,4 @@ class AITaskLogAdmin(ExportMixin, Igny8ModelAdmin):
    def has_change_permission(self, request, obj=None):
        """Logs are read-only"""
        return False

    def bulk_delete_old_logs(self, request, queryset):
        """Delete AI task logs older than 90 days"""
        from django.utils import timezone
        from datetime import timedelta

        cutoff_date = timezone.now() - timedelta(days=90)
        old_logs = queryset.filter(created_at__lt=cutoff_date)
        count = old_logs.count()
        old_logs.delete()
        self.message_user(request, f'{count} old AI task log(s) deleted (older than 90 days).', messages.SUCCESS)
    bulk_delete_old_logs.short_description = 'Delete old logs (>90 days)'

    def bulk_mark_reviewed(self, request, queryset):
        """Mark selected AI task logs as reviewed"""
        count = queryset.count()
        self.message_user(request, f'{count} AI task log(s) marked as reviewed.', messages.SUCCESS)
    bulk_mark_reviewed.short_description = 'Mark as reviewed'

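The `bulk_delete_old_logs` action removed in this hunk is built around a 90-day cutoff computed with `timezone.now() - timedelta(days=90)`. A minimal sketch of that date arithmetic without the Django ORM (`logs` is a plain list standing in for the queryset; the helper name is illustrative):

```python
from datetime import datetime, timedelta, timezone

# Sketch of the 90-day cutoff used by bulk_delete_old_logs: anything
# created before `now - max_age_days` counts as an old log.
def old_log_count(logs, now, max_age_days=90):
    cutoff = now - timedelta(days=max_age_days)
    return sum(1 for created_at in logs if created_at < cutoff)

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
logs = [now - timedelta(days=d) for d in (10, 91, 200)]
print(old_log_count(logs, now))  # 2 of the 3 logs are older than 90 days
```

In the real action the same comparison is pushed into the database as `queryset.filter(created_at__lt=cutoff_date)` rather than evaluated in Python.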
@@ -40,30 +40,39 @@ class AICore:
        self.account = account
        self._openai_api_key = None
        self._runware_api_key = None
        self._bria_api_key = None
        self._anthropic_api_key = None
        self._load_account_settings()

    def _load_account_settings(self):
        """Load API keys from GlobalIntegrationSettings (platform-wide, used by ALL accounts)"""
        try:
            from igny8_core.modules.system.global_settings_models import GlobalIntegrationSettings

            # Get global settings - single instance used by ALL accounts
            global_settings = GlobalIntegrationSettings.get_instance()

            # Load API keys from global settings (platform-wide)
            self._openai_api_key = global_settings.openai_api_key
            self._runware_api_key = global_settings.runware_api_key
            self._bria_api_key = getattr(global_settings, 'bria_api_key', None)
            self._anthropic_api_key = getattr(global_settings, 'anthropic_api_key', None)

        except Exception as e:
            logger.error(f"Could not load GlobalIntegrationSettings: {e}", exc_info=True)
            self._openai_api_key = None
            self._runware_api_key = None
            self._bria_api_key = None
            self._anthropic_api_key = None
        """Load API keys and model from IntegrationSettings or Django settings"""
        if self.account:
            try:
                from igny8_core.modules.system.models import IntegrationSettings

                # Load OpenAI settings
                openai_settings = IntegrationSettings.objects.filter(
                    integration_type='openai',
                    account=self.account,
                    is_active=True
                ).first()
                if openai_settings and openai_settings.config:
                    self._openai_api_key = openai_settings.config.get('apiKey')

                # Load Runware settings
                runware_settings = IntegrationSettings.objects.filter(
                    integration_type='runware',
                    account=self.account,
                    is_active=True
                ).first()
                if runware_settings and runware_settings.config:
                    self._runware_api_key = runware_settings.config.get('apiKey')
            except Exception as e:
                logger.warning(f"Could not load account settings: {e}", exc_info=True)

        # Fallback to Django settings for API keys only (no model fallback)
        if not self._openai_api_key:
            self._openai_api_key = getattr(settings, 'OPENAI_API_KEY', None)
        if not self._runware_api_key:
            self._runware_api_key = getattr(settings, 'RUNWARE_API_KEY', None)

    def get_api_key(self, integration_type: str = 'openai') -> Optional[str]:
        """Get API key for integration type"""
@@ -71,10 +80,6 @@ class AICore:
            return self._openai_api_key
        elif integration_type == 'runware':
            return self._runware_api_key
        elif integration_type == 'bria':
            return self._bria_api_key
        elif integration_type == 'anthropic':
            return self._anthropic_api_key
        return None

    def get_model(self, integration_type: str = 'openai') -> str:
@@ -92,18 +97,18 @@ class AICore:
        self,
        prompt: str,
        model: str,
        max_tokens: int = 8192,
        max_tokens: int = 4000,
        temperature: float = 0.7,
        response_format: Optional[Dict] = None,
        api_key: Optional[str] = None,
        function_name: str = 'ai_request',
        prompt_prefix: Optional[str] = None,
        function_id: Optional[str] = None,
        tracker: Optional[ConsoleStepTracker] = None
    ) -> Dict[str, Any]:
        """
        Centralized AI request handler with console logging.
        All AI text generation requests go through this method.

        Args:
            prompt: Prompt text
            model: Model name (required - must be provided from IntegrationSettings)
@@ -112,13 +117,12 @@ class AICore:
            response_format: Optional response format dict (for JSON mode)
            api_key: Optional API key override
            function_name: Function name for logging (e.g., 'cluster_keywords')
            prompt_prefix: Optional prefix to add before prompt (e.g., '##GP01-Clustering')
            tracker: Optional ConsoleStepTracker instance for logging

        Returns:
            Dict with 'content', 'input_tokens', 'output_tokens', 'total_tokens',
            'model', 'cost', 'error', 'api_id'

        Raises:
            ValueError: If model is not provided
        """
@@ -169,24 +173,20 @@ class AICore:
        logger.info(f" - Model used in request: {active_model}")
        tracker.ai_call(f"Using model: {active_model}")

        # Use ModelRegistry for validation with fallback to constants
        from igny8_core.ai.model_registry import ModelRegistry
        if not ModelRegistry.validate_model(active_model):
            # Fallback check against constants for backward compatibility
            if active_model not in MODEL_RATES:
                error_msg = f"Model '{active_model}' is not supported. Supported models: {list(MODEL_RATES.keys())}"
                logger.error(f"[AICore] {error_msg}")
                tracker.error('ConfigurationError', error_msg)
                return {
                    'content': None,
                    'error': error_msg,
                    'input_tokens': 0,
                    'output_tokens': 0,
                    'total_tokens': 0,
                    'model': active_model,
                    'cost': 0.0,
                    'api_id': None,
                }
        if active_model not in MODEL_RATES:
            error_msg = f"Model '{active_model}' is not supported. Supported models: {list(MODEL_RATES.keys())}"
            logger.error(f"[AICore] {error_msg}")
            tracker.error('ConfigurationError', error_msg)
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': active_model,
                'cost': 0.0,
                'api_id': None,
            }

        tracker.ai_call(f"Using model: {active_model}")

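Both sides of the hunk above follow the same validate-then-short-circuit shape: an unsupported model returns a structured error dict with zeroed token counts instead of raising. A minimal sketch of that pattern (the helper name and trimmed rate table are illustrative):

```python
# Sketch of the validation short-circuit above: an unknown model yields
# a structured error dict; a known model yields None so the caller
# proceeds with the request.
MODEL_RATES = {'gpt-4.1': {'input': 2.00, 'output': 8.00}}  # trimmed copy

def check_model(model):
    if model not in MODEL_RATES:
        return {
            'content': None,
            'error': f"Model '{model}' is not supported. Supported models: {list(MODEL_RATES.keys())}",
            'input_tokens': 0,
            'output_tokens': 0,
            'total_tokens': 0,
            'cost': 0.0,
        }
    return None  # no error

print(check_model('gpt-4.1'))  # -> None
print(check_model('bad-model')['error'])
```

Returning the same dict shape for success and failure paths lets callers treat every `run_ai_request` result uniformly by checking the `error` key.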
@@ -199,16 +199,16 @@ class AICore:
        else:
            tracker.ai_call("Using text response format")

        # Step 4: Validate prompt length and add prompt_prefix
        # Step 4: Validate prompt length and add function_id
        prompt_length = len(prompt)
        tracker.ai_call(f"Prompt length: {prompt_length} characters")

        # Add prompt_prefix to prompt if provided (for tracking)
        # Format: ##GP01-Clustering or ##CP01-Clustering

        # Add function_id to prompt if provided (for tracking)
        final_prompt = prompt
        if prompt_prefix:
            final_prompt = f'{prompt_prefix}\n\n{prompt}'
            tracker.ai_call(f"Added prompt prefix: {prompt_prefix}")
        if function_id:
            function_id_prefix = f'function_id: "{function_id}"\n\n'
            final_prompt = function_id_prefix + prompt
            tracker.ai_call(f"Added function_id to prompt: {function_id}")

        # Step 5: Build request payload
        url = 'https://api.openai.com/v1/chat/completions'
@@ -223,12 +223,8 @@ class AICore:
            'temperature': temperature,
        }

        # GPT-5.1 and GPT-5.2 use max_completion_tokens instead of max_tokens
        if max_tokens:
            if active_model in ['gpt-5.1', 'gpt-5.2']:
                body_data['max_completion_tokens'] = max_tokens
            else:
                body_data['max_tokens'] = max_tokens
        body_data['max_tokens'] = max_tokens

        if response_format:
            body_data['response_format'] = response_format
@@ -240,7 +236,7 @@ class AICore:
        request_start = time.time()

        try:
            response = requests.post(url, headers=headers, json=body_data, timeout=180)
            response = requests.post(url, headers=headers, json=body_data, timeout=60)
            request_duration = time.time() - request_start
            tracker.ai_call(f"Received response in {request_duration:.2f}s (status={response.status_code})")

@@ -305,17 +301,9 @@ class AICore:
            tracker.parse(f"Received {total_tokens} tokens (input: {input_tokens}, output: {output_tokens})")
            tracker.parse(f"Content length: {len(content)} characters")

            # Step 10: Calculate cost using ModelRegistry (with fallback to constants)
            from igny8_core.ai.model_registry import ModelRegistry
            cost = float(ModelRegistry.calculate_cost(
                active_model,
                input_tokens=input_tokens,
                output_tokens=output_tokens
            ))
            # Fallback to constants if ModelRegistry returns 0
            if cost == 0:
                rates = MODEL_RATES.get(active_model, {'input': 2.00, 'output': 8.00})
                cost = (input_tokens * rates['input'] + output_tokens * rates['output']) / 1_000_000
            # Step 10: Calculate cost
            rates = MODEL_RATES.get(active_model, {'input': 2.00, 'output': 8.00})
            cost = (input_tokens * rates['input'] + output_tokens * rates['output']) / 1_000_000
            tracker.parse(f"Cost calculated: ${cost:.6f}")

            tracker.done("Request completed successfully")

@@ -347,8 +335,8 @@ class AICore:
            }

        except requests.exceptions.Timeout:
            error_msg = 'Request timeout (180s exceeded)'
            tracker.timeout(180)
            error_msg = 'Request timeout (60s exceeded)'
            tracker.timeout(60)
            logger.error(error_msg)
            return {
                'content': None,
@@ -390,289 +378,6 @@ class AICore:
                'api_id': None,
            }

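The Step 10 cost line in the hunk above prices a request from per-million-token rates. A worked example of that arithmetic (the rate table is a trimmed copy of the constants shown later in this diff):

```python
# Worked example of the per-million-token cost formula used in Step 10:
# cost = (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000
MODEL_RATES = {'gpt-4.1': {'input': 2.00, 'output': 8.00}}  # USD per 1M tokens

def request_cost(model, input_tokens, output_tokens):
    rates = MODEL_RATES.get(model, {'input': 2.00, 'output': 8.00})
    return (input_tokens * rates['input'] + output_tokens * rates['output']) / 1_000_000

# 10k input + 2k output tokens on gpt-4.1:
# (10_000 * 2.00 + 2_000 * 8.00) / 1_000_000 = 0.036
print(f"${request_cost('gpt-4.1', 10_000, 2_000):.6f}")  # -> $0.036000
```

Unknown models fall back to the gpt-4.1 rates rather than failing, matching the `MODEL_RATES.get(..., default)` call in the diff.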
    def run_anthropic_request(
        self,
        prompt: str,
        model: str,
        max_tokens: int = 8192,
        temperature: float = 0.7,
        api_key: Optional[str] = None,
        function_name: str = 'anthropic_request',
        prompt_prefix: Optional[str] = None,
        tracker: Optional[ConsoleStepTracker] = None,
        system_prompt: Optional[str] = None,
    ) -> Dict[str, Any]:
        """
        Anthropic (Claude) AI request handler with console logging.
        Alternative to OpenAI for text generation.

        Args:
            prompt: Prompt text
            model: Claude model name (required - must be provided from IntegrationSettings)
            max_tokens: Maximum tokens
            temperature: Temperature (0-1)
            api_key: Optional API key override
            function_name: Function name for logging (e.g., 'cluster_keywords')
            prompt_prefix: Optional prefix to add before prompt
            tracker: Optional ConsoleStepTracker instance for logging
            system_prompt: Optional system prompt for Claude

        Returns:
            Dict with 'content', 'input_tokens', 'output_tokens', 'total_tokens',
            'model', 'cost', 'error', 'api_id'

        Raises:
            ValueError: If model is not provided
        """
        # Use provided tracker or create a new one
        if tracker is None:
            tracker = ConsoleStepTracker(function_name)

        tracker.ai_call("Preparing Anthropic request...")

        # Step 1: Validate model is provided
        if not model:
            error_msg = "Model is required. Ensure IntegrationSettings is configured for the account."
            tracker.error('ConfigurationError', error_msg)
            logger.error(f"[AICore][Anthropic] {error_msg}")
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': None,
                'cost': 0.0,
                'api_id': None,
            }

        # Step 2: Validate API key
        api_key = api_key or self._anthropic_api_key
        if not api_key:
            error_msg = 'Anthropic API key not configured'
            tracker.error('ConfigurationError', error_msg)
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': model,
                'cost': 0.0,
                'api_id': None,
            }

        active_model = model

        # Debug logging: Show model used
        logger.info(f"[AICore][Anthropic] Model Configuration:")
        logger.info(f" - Model parameter passed: {model}")
        logger.info(f" - Model used in request: {active_model}")
        tracker.ai_call(f"Using Anthropic model: {active_model}")

        # Add prompt_prefix to prompt if provided (for tracking)
        final_prompt = prompt
        if prompt_prefix:
            final_prompt = f'{prompt_prefix}\n\n{prompt}'
            tracker.ai_call(f"Added prompt prefix: {prompt_prefix}")

        # Step 5: Build request payload using Anthropic Messages API
        url = 'https://api.anthropic.com/v1/messages'
        headers = {
            'x-api-key': api_key,
            'anthropic-version': '2023-06-01',
            'Content-Type': 'application/json',
        }

        body_data = {
            'model': active_model,
            'max_tokens': max_tokens,
            'messages': [{'role': 'user', 'content': final_prompt}],
        }

        # Only add temperature if it's less than 1.0 (Claude's default)
        if temperature < 1.0:
            body_data['temperature'] = temperature

        # Add system prompt if provided
        if system_prompt:
            body_data['system'] = system_prompt

        tracker.ai_call(f"Request payload prepared (model={active_model}, max_tokens={max_tokens}, temp={temperature})")

        # Step 6: Send request
        tracker.ai_call("Sending request to Anthropic API...")
        request_start = time.time()

        try:
            response = requests.post(url, headers=headers, json=body_data, timeout=180)
            request_duration = time.time() - request_start
            tracker.ai_call(f"Received response in {request_duration:.2f}s (status={response.status_code})")

            # Step 7: Validate HTTP response
            if response.status_code != 200:
                error_data = response.json() if response.headers.get('content-type', '').startswith('application/json') else {}
                error_message = f"HTTP {response.status_code} error"

                if isinstance(error_data, dict) and 'error' in error_data:
                    if isinstance(error_data['error'], dict) and 'message' in error_data['error']:
                        error_message += f": {error_data['error']['message']}"

                # Check for rate limit
                if response.status_code == 429:
                    retry_after = response.headers.get('retry-after', '60')
                    tracker.rate_limit(retry_after)
                    error_message += f" (Rate limit - retry after {retry_after}s)"
                else:
                    tracker.error('HTTPError', error_message)

                logger.error(f"Anthropic API HTTP error {response.status_code}: {error_message}")

                return {
                    'content': None,
                    'error': error_message,
                    'input_tokens': 0,
                    'output_tokens': 0,
                    'total_tokens': 0,
                    'model': active_model,
                    'cost': 0.0,
                    'api_id': None,
                }

            # Step 8: Parse response JSON
            try:
                data = response.json()
            except json.JSONDecodeError as e:
                error_msg = f'Failed to parse JSON response: {str(e)}'
                tracker.malformed_json(str(e))
                logger.error(error_msg)
                return {
                    'content': None,
                    'error': error_msg,
                    'input_tokens': 0,
                    'output_tokens': 0,
                    'total_tokens': 0,
                    'model': active_model,
                    'cost': 0.0,
                    'api_id': None,
                }

            api_id = data.get('id')

            # Step 9: Extract content (Anthropic format)
            # Claude returns content as array: [{"type": "text", "text": "..."}]
            if 'content' in data and len(data['content']) > 0:
                # Extract text from first content block
                content_blocks = data['content']
                content = ''
                for block in content_blocks:
                    if block.get('type') == 'text':
                        content += block.get('text', '')

                usage = data.get('usage', {})
                input_tokens = usage.get('input_tokens', 0)
                output_tokens = usage.get('output_tokens', 0)
                total_tokens = input_tokens + output_tokens

                tracker.parse(f"Received {total_tokens} tokens (input: {input_tokens}, output: {output_tokens})")
                tracker.parse(f"Content length: {len(content)} characters")

                # Step 10: Calculate cost using ModelRegistry (with fallback)
                # Claude pricing as of 2024:
                # claude-3-5-sonnet: $3/1M input, $15/1M output
                # claude-3-opus: $15/1M input, $75/1M output
                # claude-3-haiku: $0.25/1M input, $1.25/1M output
                from igny8_core.ai.model_registry import ModelRegistry
                cost = float(ModelRegistry.calculate_cost(
                    active_model,
                    input_tokens=input_tokens,
                    output_tokens=output_tokens
                ))
                # Fallback to hardcoded rates if ModelRegistry returns 0
                if cost == 0:
                    anthropic_rates = {
                        'claude-3-5-sonnet-20241022': {'input': 3.00, 'output': 15.00},
                        'claude-3-5-haiku-20241022': {'input': 1.00, 'output': 5.00},
                        'claude-3-opus-20240229': {'input': 15.00, 'output': 75.00},
                        'claude-3-sonnet-20240229': {'input': 3.00, 'output': 15.00},
                        'claude-3-haiku-20240307': {'input': 0.25, 'output': 1.25},
                    }
                    rates = anthropic_rates.get(active_model, {'input': 3.00, 'output': 15.00})
                    cost = (input_tokens * rates['input'] + output_tokens * rates['output']) / 1_000_000
                tracker.parse(f"Cost calculated: ${cost:.6f}")

                tracker.done("Anthropic request completed successfully")

                return {
                    'content': content,
                    'input_tokens': input_tokens,
                    'output_tokens': output_tokens,
                    'total_tokens': total_tokens,
                    'model': active_model,
                    'cost': cost,
                    'error': None,
                    'api_id': api_id,
                    'duration': request_duration,
                }
            else:
                error_msg = 'No content in Anthropic response'
                tracker.error('EmptyResponse', error_msg)
                logger.error(error_msg)
                return {
                    'content': None,
                    'error': error_msg,
                    'input_tokens': 0,
                    'output_tokens': 0,
                    'total_tokens': 0,
                    'model': active_model,
                    'cost': 0.0,
                    'api_id': api_id,
                }

        except requests.exceptions.Timeout:
            error_msg = 'Request timeout (180s exceeded)'
            tracker.timeout(180)
            logger.error(error_msg)
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': active_model,
                'cost': 0.0,
                'api_id': None,
            }
        except requests.exceptions.RequestException as e:
            error_msg = f'Request exception: {str(e)}'
            tracker.error('RequestException', error_msg, e)
            logger.error(f"Anthropic API error: {error_msg}", exc_info=True)
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': active_model,
                'cost': 0.0,
                'api_id': None,
            }
        except Exception as e:
            error_msg = f'Unexpected error: {str(e)}'
            logger.error(f"[AI][{function_name}][Anthropic][Error] {error_msg}", exc_info=True)
            if tracker:
                tracker.error('UnexpectedError', error_msg, e)
            return {
                'content': None,
                'error': error_msg,
                'input_tokens': 0,
                'output_tokens': 0,
                'total_tokens': 0,
                'model': active_model,
                'cost': 0.0,
                'api_id': None,
            }

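Step 9 of `run_anthropic_request` above concatenates only the `text` blocks from Claude's `content` array. A minimal standalone sketch of that extraction (the helper name and sample payload are illustrative; the payload shape mirrors the Messages API response handled above):

```python
# Sketch of Step 9 above: Claude returns `content` as a list of typed
# blocks; only blocks with type == 'text' contribute to the final string.
def join_text_blocks(data):
    return ''.join(
        block.get('text', '')
        for block in data.get('content', [])
        if block.get('type') == 'text'
    )

response = {
    'content': [
        {'type': 'text', 'text': 'Hello '},
        {'type': 'tool_use', 'id': 'toolu_x'},  # non-text blocks are skipped
        {'type': 'text', 'text': 'world'},
    ]
}
print(join_text_blocks(response))  # -> Hello world
```

Iterating the whole list rather than taking `content[0]` keeps the extraction correct when a response interleaves text with other block types.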
    def extract_json(self, response_text: str) -> Optional[Dict]:
        """
        Extract JSON from response text.
@@ -746,8 +451,6 @@ class AICore:
            return self._generate_image_openai(prompt, model, size, n, api_key, negative_prompt, function_name)
        elif provider == 'runware':
            return self._generate_image_runware(prompt, model, size, n, api_key, negative_prompt, function_name)
        elif provider == 'bria':
            return self._generate_image_bria(prompt, model, size, n, api_key, negative_prompt, function_name)
        else:
            error_msg = f'Unknown provider: {provider}'
            print(f"[AI][{function_name}][Error] {error_msg}")
@@ -886,11 +589,7 @@ class AICore:
        image_url = image_data.get('url')
        revised_prompt = image_data.get('revised_prompt')

        # Use ModelRegistry for image cost (with fallback to constants)
        from igny8_core.ai.model_registry import ModelRegistry
        cost = float(ModelRegistry.calculate_cost(model, num_images=n))
        if cost == 0:
            cost = IMAGE_MODEL_RATES.get(model, 0.040) * n
        cost = IMAGE_MODEL_RATES.get(model, 0.040) * n
        print(f"[AI][{function_name}] Step 5: Image generated successfully")
        print(f"[AI][{function_name}] Step 6: Cost: ${cost:.4f}")
        print(f"[AI][{function_name}][Success] Image generation completed")
@@ -1125,196 +824,23 @@ class AICore:
            'error': error_msg,
        }

    def _generate_image_bria(
        self,
        prompt: str,
        model: Optional[str],
        size: str,
        n: int,
        api_key: Optional[str],
        negative_prompt: Optional[str],
        function_name: str
    ) -> Dict[str, Any]:
        """
        Generate image using Bria AI.

        Bria API Reference: https://docs.bria.ai/reference/text-to-image
        """
        print(f"[AI][{function_name}] Provider: Bria AI")

        api_key = api_key or self._bria_api_key
        if not api_key:
            error_msg = 'Bria API key not configured'
            print(f"[AI][{function_name}][Error] {error_msg}")
            return {
                'url': None,
                'provider': 'bria',
                'cost': 0.0,
                'error': error_msg,
            }

        bria_model = model or 'bria-2.3'
        print(f"[AI][{function_name}] Step 2: Using model: {bria_model}, size: {size}")

        # Parse size
        try:
            width, height = map(int, size.split('x'))
        except ValueError:
            error_msg = f"Invalid size format: {size}. Expected format: WIDTHxHEIGHT"
            print(f"[AI][{function_name}][Error] {error_msg}")
            return {
                'url': None,
                'provider': 'bria',
                'cost': 0.0,
                'error': error_msg,
            }

        # Bria API endpoint
        url = 'https://engine.prod.bria-api.com/v1/text-to-image/base'
        headers = {
            'api_token': api_key,
            'Content-Type': 'application/json'
        }

        payload = {
            'prompt': prompt,
            'num_results': n,
            'sync': True,  # Wait for result
            'model_version': bria_model.replace('bria-', ''),  # e.g., '2.3'
        }

        # Add negative prompt if provided
        if negative_prompt:
            payload['negative_prompt'] = negative_prompt

        # Add size constraints if not default
        if width and height:
            # Bria uses aspect ratio or fixed sizes
            payload['width'] = width
            payload['height'] = height

        print(f"[AI][{function_name}] Step 3: Sending request to Bria API...")

        request_start = time.time()
        try:
            response = requests.post(url, json=payload, headers=headers, timeout=150)
            request_duration = time.time() - request_start
            print(f"[AI][{function_name}] Step 4: Received response in {request_duration:.2f}s (status={response.status_code})")

            if response.status_code != 200:
                error_msg = f"HTTP {response.status_code} error: {response.text[:200]}"
                print(f"[AI][{function_name}][Error] {error_msg}")
                return {
                    'url': None,
                    'provider': 'bria',
                    'cost': 0.0,
                    'error': error_msg,
                }

            body = response.json()
            print(f"[AI][{function_name}] Bria response keys: {list(body.keys()) if isinstance(body, dict) else type(body)}")

            # Bria returns { "result": [ { "urls": ["..."] } ] }
            image_url = None
            error_msg = None

            if isinstance(body, dict):
                if 'result' in body and isinstance(body['result'], list) and len(body['result']) > 0:
                    first_result = body['result'][0]
                    if 'urls' in first_result and isinstance(first_result['urls'], list) and len(first_result['urls']) > 0:
                        image_url = first_result['urls'][0]
                    elif 'url' in first_result:
                        image_url = first_result['url']
                elif 'error' in body:
                    error_msg = body['error']
                elif 'message' in body:
                    error_msg = body['message']

            if error_msg:
                print(f"[AI][{function_name}][Error] Bria API error: {error_msg}")
                return {
                    'url': None,
                    'provider': 'bria',
                    'cost': 0.0,
                    'error': error_msg,
                }

            if image_url:
                # Cost based on model
                cost_per_image = {
                    'bria-2.3': 0.015,
                    'bria-2.3-fast': 0.010,
                    'bria-2.2': 0.012,
                }.get(bria_model, 0.015)
                cost = cost_per_image * n

                print(f"[AI][{function_name}] Step 5: Image generated successfully")
                print(f"[AI][{function_name}] Step 6: Cost: ${cost:.4f}")
                print(f"[AI][{function_name}][Success] Image generation completed")

                return {
                    'url': image_url,
                    'provider': 'bria',
                    'cost': cost,
                    'error': None,
                }
            else:
                error_msg = f'No image data in Bria response'
                print(f"[AI][{function_name}][Error] {error_msg}")
                logger.error(f"[AI][{function_name}] Full Bria response: {json.dumps(body, indent=2) if isinstance(body, dict) else str(body)}")
                return {
                    'url': None,
                    'provider': 'bria',
                    'cost': 0.0,
                    'error': error_msg,
                }

        except requests.exceptions.Timeout:
            error_msg = 'Request timeout (150s exceeded)'
            print(f"[AI][{function_name}][Error] {error_msg}")
            return {
                'url': None,
                'provider': 'bria',
                'cost': 0.0,
                'error': error_msg,
            }
        except Exception as e:
            error_msg = f'Unexpected error: {str(e)}'
            print(f"[AI][{function_name}][Error] {error_msg}")
            logger.error(error_msg, exc_info=True)
            return {
                'url': None,
                'provider': 'bria',
                'cost': 0.0,
                'error': error_msg,
            }

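The Bria handler above walks a nested response shape, `{"result": [{"urls": ["..."]}]}`, with `url` as a per-result fallback key. A standalone sketch of that defensive walk (the helper name is illustrative; the sample bodies mirror the shapes handled above):

```python
# Sketch of the Bria response walk above: prefer result[0]['urls'][0],
# fall back to result[0]['url'], and return None for anything else.
def extract_bria_url(body):
    if not isinstance(body, dict):
        return None
    results = body.get('result')
    if isinstance(results, list) and results:
        first = results[0]
        urls = first.get('urls')
        if isinstance(urls, list) and urls:
            return urls[0]
        return first.get('url')
    return None

body = {'result': [{'urls': ['https://example.com/img.png']}]}
print(extract_bria_url(body))  # -> https://example.com/img.png
print(extract_bria_url({'error': 'quota exceeded'}))  # -> None
```

Every access is type-checked before indexing, so a malformed or error-shaped body degrades to `None` instead of raising `KeyError`/`IndexError`.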
    def calculate_cost(self, model: str, input_tokens: int, output_tokens: int, model_type: str = 'text') -> float:
        """Calculate cost for API call using ModelRegistry with fallback to constants"""
        from igny8_core.ai.model_registry import ModelRegistry

        """Calculate cost for API call"""
        if model_type == 'text':
            cost = float(ModelRegistry.calculate_cost(model, input_tokens=input_tokens, output_tokens=output_tokens))
            if cost == 0:
                # Fallback to constants
                rates = MODEL_RATES.get(model, {'input': 2.00, 'output': 8.00})
                input_cost = (input_tokens / 1_000_000) * rates['input']
                output_cost = (output_tokens / 1_000_000) * rates['output']
                return input_cost + output_cost
            return cost
            rates = MODEL_RATES.get(model, {'input': 2.00, 'output': 8.00})
            input_cost = (input_tokens / 1_000_000) * rates['input']
            output_cost = (output_tokens / 1_000_000) * rates['output']
            return input_cost + output_cost
        elif model_type == 'image':
            cost = float(ModelRegistry.calculate_cost(model, num_images=1))
            if cost == 0:
                rate = IMAGE_MODEL_RATES.get(model, 0.040)
                return rate * 1
            return cost
            rate = IMAGE_MODEL_RATES.get(model, 0.040)
            return rate * 1
        return 0.0

    # Legacy method names for backward compatibility
    def call_openai(self, prompt: str, model: Optional[str] = None, max_tokens: int = 8192,
    def call_openai(self, prompt: str, model: Optional[str] = None, max_tokens: int = 4000,
                    temperature: float = 0.7, response_format: Optional[Dict] = None,
                    api_key: Optional[str] = None) -> Dict[str, Any]:
        """DEPRECATED: Legacy method - redirects to run_ai_request(). Use run_ai_request() directly."""
        """Legacy method - redirects to run_ai_request()"""
        return self.run_ai_request(
            prompt=prompt,
            model=model,

@@ -6,8 +6,6 @@ MODEL_RATES = {
|
||||
'gpt-4.1': {'input': 2.00, 'output': 8.00},
|
||||
'gpt-4o-mini': {'input': 0.15, 'output': 0.60},
|
||||
'gpt-4o': {'input': 2.50, 'output': 10.00},
|
||||
'gpt-5.1': {'input': 1.25, 'output': 10.00},
|
||||
'gpt-5.2': {'input': 1.75, 'output': 14.00},
|
||||
}
|
||||
|
||||
# Image model pricing (per image) - EXACT from reference plugin
|
||||
@@ -35,7 +33,7 @@ VALID_SIZES_BY_MODEL = {
|
||||
DEFAULT_AI_MODEL = 'gpt-4.1'
|
||||
|
||||
# JSON mode supported models
|
||||
JSON_MODE_MODELS = ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo-preview', 'gpt-5.1', 'gpt-5.2']
|
||||
JSON_MODE_MODELS = ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo-preview']
|
||||
|
||||
# Debug mode - controls console logging
|
||||
# Set to False in production to disable verbose logging
|
||||
|
||||
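The per-token cost math in the hunk above can be sketched as a standalone helper. The `MODEL_RATES` values are copied from the diff; the helper name and the default fallback rates are assumptions for illustration, not the project's API:

```python
# Minimal sketch of the text-model cost calculation shown in the diff.
# Rates are USD per 1,000,000 tokens, so token counts are divided by 1_000_000.
MODEL_RATES = {
    'gpt-4.1': {'input': 2.00, 'output': 8.00},
    'gpt-4o-mini': {'input': 0.15, 'output': 0.60},
    'gpt-4o': {'input': 2.50, 'output': 10.00},
}

def estimate_text_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    # Unknown models fall back to gpt-4.1-level rates, as in the diff
    rates = MODEL_RATES.get(model, {'input': 2.00, 'output': 8.00})
    input_cost = (input_tokens / 1_000_000) * rates['input']
    output_cost = (output_tokens / 1_000_000) * rates['output']
    return input_cost + output_cost
```

For example, 500k input tokens plus 250k output tokens on `gpt-4.1` would come to $1.00 + $2.00 = $3.00.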
@@ -31,15 +31,9 @@ class AIEngine:
elif function_name == 'generate_ideas':
return f"{count} cluster{'s' if count != 1 else ''}"
elif function_name == 'generate_content':
return f"{count} article{'s' if count != 1 else ''}"
return f"{count} task{'s' if count != 1 else ''}"
elif function_name == 'generate_images':
return f"{count} image{'s' if count != 1 else ''}"
elif function_name == 'generate_image_prompts':
return f"{count} image prompt{'s' if count != 1 else ''}"
elif function_name == 'optimize_content':
return f"{count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_site_structure':
return "site blueprint"
return f"{count} task{'s' if count != 1 else ''}"
return f"{count} item{'s' if count != 1 else ''}"

def _build_validation_message(self, function_name: str, payload: dict, count: int, input_description: str) -> str:
@@ -55,22 +49,12 @@ class AIEngine:
remaining = count - len(keyword_list)
if remaining > 0:
keywords_text = ', '.join(keyword_list)
return f"Validating {count} keywords for clustering"
return f"Validating {keywords_text} and {remaining} more keyword{'s' if remaining != 1 else ''}"
else:
keywords_text = ', '.join(keyword_list)
return f"Validating {keywords_text}"
except Exception as e:
logger.warning(f"Failed to load keyword names for validation message: {e}")
elif function_name == 'generate_ideas':
return f"Analyzing {count} clusters for content opportunities"
elif function_name == 'generate_content':
return f"Preparing {count} article{'s' if count != 1 else ''} for generation"
elif function_name == 'generate_image_prompts':
return f"Analyzing content for image opportunities"
elif function_name == 'generate_images':
return f"Queuing {count} image{'s' if count != 1 else ''} for generation"
elif function_name == 'optimize_content':
return f"Analyzing {count} article{'s' if count != 1 else ''} for optimization"

# Fallback to simple count message
return f"Validating {input_description}"
@@ -78,147 +62,83 @@ class AIEngine:
def _get_prep_message(self, function_name: str, count: int, data: Any) -> str:
"""Get user-friendly prep message"""
if function_name == 'auto_cluster':
return f"Analyzing keyword relationships for {count} keyword{'s' if count != 1 else ''}"
return f"Loading {count} keyword{'s' if count != 1 else ''}"
elif function_name == 'generate_ideas':
# Count keywords in clusters if available
keyword_count = 0
if isinstance(data, dict) and 'cluster_data' in data:
for cluster in data['cluster_data']:
keyword_count += len(cluster.get('keywords', []))
if keyword_count > 0:
return f"Mapping {keyword_count} keywords to topic briefs"
return f"Mapping keywords to topic briefs for {count} cluster{'s' if count != 1 else ''}"
return f"Loading {count} cluster{'s' if count != 1 else ''}"
elif function_name == 'generate_content':
return f"Building content brief{'s' if count != 1 else ''} with target keywords"
return f"Preparing {count} content idea{'s' if count != 1 else ''}"
elif function_name == 'generate_images':
return f"Preparing AI image generation ({count} image{'s' if count != 1 else ''})"
return f"Extracting image prompts from {count} task{'s' if count != 1 else ''}"
elif function_name == 'generate_image_prompts':
# Extract max_images from data if available
if isinstance(data, list) and len(data) > 0:
max_images = data[0].get('max_images')
max_images = data[0].get('max_images', 2)
total_images = 1 + max_images # 1 featured + max_images in-article
return f"Identifying 1 featured + {max_images} in-article image slots"
return f"Mapping Content for {total_images} Image Prompts"
elif isinstance(data, dict) and 'max_images' in data:
max_images = data.get('max_images')
max_images = data.get('max_images', 2)
total_images = 1 + max_images
return f"Identifying 1 featured + {max_images} in-article image slots"
return f"Identifying featured and in-article image slots"
elif function_name == 'optimize_content':
return f"Analyzing SEO factors for {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_site_structure':
blueprint_name = ''
if isinstance(data, dict):
blueprint = data.get('blueprint')
if blueprint and getattr(blueprint, 'name', None):
blueprint_name = f'"{blueprint.name}"'
return f"Preparing site blueprint {blueprint_name}".strip()
return f"Mapping Content for {total_images} Image Prompts"
return f"Mapping Content for Image Prompts"
return f"Preparing {count} item{'s' if count != 1 else ''}"

def _get_ai_call_message(self, function_name: str, count: int) -> str:
"""Get user-friendly AI call message"""
if function_name == 'auto_cluster':
return f"Grouping {count} keywords by search intent"
return f"Grouping {count} keyword{'s' if count != 1 else ''} into clusters"
elif function_name == 'generate_ideas':
return f"Generating content ideas for {count} cluster{'s' if count != 1 else ''}"
elif function_name == 'generate_content':
return f"Writing {count} article{'s' if count != 1 else ''} with AI"
return f"Writing article{'s' if count != 1 else ''} with AI"
elif function_name == 'generate_images':
return f"Generating image{'s' if count != 1 else ''} with AI"
elif function_name == 'generate_image_prompts':
return f"Creating optimized prompts for {count} image{'s' if count != 1 else ''}"
elif function_name == 'optimize_content':
return f"Optimizing {count} article{'s' if count != 1 else ''} for SEO"
elif function_name == 'generate_site_structure':
return "Designing complete site architecture"
return f"Creating image{'s' if count != 1 else ''} with AI"
return f"Processing with AI"

def _get_parse_message(self, function_name: str) -> str:
"""Get user-friendly parse message"""
if function_name == 'auto_cluster':
return "Organizing semantic clusters"
return "Organizing clusters"
elif function_name == 'generate_ideas':
return "Structuring article outlines"
return "Structuring outlines"
elif function_name == 'generate_content':
return "Formatting HTML content and metadata"
return "Formatting content"
elif function_name == 'generate_images':
return "Processing generated images"
elif function_name == 'generate_image_prompts':
return "Refining contextual image descriptions"
elif function_name == 'optimize_content':
return "Compiling optimization scores"
elif function_name == 'generate_site_structure':
return "Compiling site map"
return "Processing images"
return "Processing results"

def _get_parse_message_with_count(self, function_name: str, count: int) -> str:
"""Get user-friendly parse message with count"""
if function_name == 'auto_cluster':
return f"Organizing {count} semantic cluster{'s' if count != 1 else ''}"
return f"{count} cluster{'s' if count != 1 else ''} created"
elif function_name == 'generate_ideas':
return f"Structuring {count} article outline{'s' if count != 1 else ''}"
return f"{count} idea{'s' if count != 1 else ''} created"
elif function_name == 'generate_content':
return f"Formatting {count} article{'s' if count != 1 else ''}"
return f"{count} article{'s' if count != 1 else ''} created"
elif function_name == 'generate_images':
return f"Processing {count} generated image{'s' if count != 1 else ''}"
return f"{count} image{'s' if count != 1 else ''} created"
elif function_name == 'generate_image_prompts':
# Count is total prompts, in-article is count - 1 (subtract featured)
in_article_count = max(0, count - 1)
if in_article_count > 0:
return f"Refining {in_article_count} in-article image description{'s' if in_article_count != 1 else ''}"
return "Refining image descriptions"
elif function_name == 'optimize_content':
return f"Compiling scores for {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_site_structure':
return f"{count} page blueprint{'s' if count != 1 else ''} mapped"
return f"Writing {in_article_count} In-article Image Prompts"
return "Writing In-article Image Prompts"
return f"{count} item{'s' if count != 1 else ''} processed"

def _get_save_message(self, function_name: str, count: int) -> str:
"""Get user-friendly save message"""
if function_name == 'auto_cluster':
return f"Saving {count} cluster{'s' if count != 1 else ''} with keywords"
return f"Saving {count} cluster{'s' if count != 1 else ''}"
elif function_name == 'generate_ideas':
return f"Saving {count} idea{'s' if count != 1 else ''} with outlines"
return f"Saving {count} idea{'s' if count != 1 else ''}"
elif function_name == 'generate_content':
return f"Saving {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_images':
return f"Uploading {count} image{'s' if count != 1 else ''} to media library"
return f"Saving {count} image{'s' if count != 1 else ''}"
elif function_name == 'generate_image_prompts':
in_article = max(0, count - 1)
return f"Assigning {count} prompts (1 featured + {in_article} in-article)"
elif function_name == 'optimize_content':
return f"Saving optimization scores for {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_site_structure':
return f"Publishing {count} page blueprint{'s' if count != 1 else ''}"
# Count is total prompts created
return f"Assigning {count} Prompts to Dedicated Slots"
return f"Saving {count} item{'s' if count != 1 else ''}"

def _get_done_message(self, function_name: str, result: dict) -> str:
"""Get user-friendly completion message with counts"""
count = result.get('count', 0)

if function_name == 'auto_cluster':
keyword_count = result.get('keywords_clustered', 0)
return f"✓ Organized {keyword_count} keywords into {count} semantic cluster{'s' if count != 1 else ''}"
elif function_name == 'generate_ideas':
return f"✓ Created {count} content idea{'s' if count != 1 else ''} with detailed outlines"
elif function_name == 'generate_content':
total_words = result.get('total_words', 0)
if total_words > 0:
return f"✓ Generated {count} article{'s' if count != 1 else ''} ({total_words:,} words)"
return f"✓ Generated {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_images':
return f"✓ Generated and saved {count} AI image{'s' if count != 1 else ''}"
elif function_name == 'generate_image_prompts':
in_article = max(0, count - 1)
return f"✓ Created {count} image prompt{'s' if count != 1 else ''} (1 featured + {in_article} in-article)"
elif function_name == 'optimize_content':
avg_score = result.get('average_score', 0)
if avg_score > 0:
return f"✓ Optimized {count} article{'s' if count != 1 else ''} (avg score: {avg_score}%)"
return f"✓ Optimized {count} article{'s' if count != 1 else ''}"
elif function_name == 'generate_site_structure':
return f"✓ Created {count} page blueprint{'s' if count != 1 else ''}"
return f"✓ {count} item{'s' if count != 1 else ''} completed"

def execute(self, fn: BaseAIFunction, payload: dict) -> dict:
"""
Unified execution pipeline for all AI functions.
@@ -272,31 +192,6 @@ class AIEngine:
self.step_tracker.add_request_step("PREP", "success", prep_message)
self.tracker.update("PREP", 25, prep_message, meta=self.step_tracker.get_meta())

# Phase 2.5: CREDIT CHECK - Check credits before AI call (25%)
if self.account:
try:
from igny8_core.business.billing.services.credit_service import CreditService
from igny8_core.business.billing.exceptions import InsufficientCreditsError

# Map function name to operation type
operation_type = self._get_operation_type(function_name)

# Calculate estimated cost
estimated_amount = self._get_estimated_amount(function_name, data, payload)

# Check credits BEFORE AI call
CreditService.check_credits(self.account, operation_type, estimated_amount)

logger.info(f"[AIEngine] Credit check passed: {operation_type}, estimated amount: {estimated_amount}")
except InsufficientCreditsError as e:
error_msg = str(e)
error_type = 'InsufficientCreditsError'
logger.error(f"[AIEngine] {error_msg}")
return self._handle_error(error_msg, fn, error_type=error_type)
except Exception as e:
logger.warning(f"[AIEngine] Failed to check credits: {e}", exc_info=True)
# Don't fail the operation if credit check fails (for backward compatibility)

# Phase 3: AI_CALL - Provider API Call (25-70%)
# Validate account exists before proceeding
if not self.account:
@@ -306,13 +201,12 @@ class AIEngine:

ai_core = AICore(account=self.account)
function_name = fn.get_name()

# Generate prompt prefix for tracking (e.g., ##GP01-Clustering or ##CP01-Clustering)
# This replaces function_id and indicates whether prompt is global or custom
from igny8_core.ai.prompts import get_prompt_prefix_for_function
prompt_prefix = get_prompt_prefix_for_function(function_name, account=self.account)
logger.info(f"[AIEngine] Using prompt prefix: {prompt_prefix}")

# Generate function_id for tracking (ai-{function_name}-01)
# Normalize underscores to hyphens to match frontend tracking IDs
function_id_base = function_name.replace('_', '-')
function_id = f"ai-{function_id_base}-01-desktop"

# Get model config from settings (requires account)
# This will raise ValueError if IntegrationSettings not configured
try:
@@ -350,7 +244,7 @@ class AIEngine:
temperature=model_config.get('temperature'),
response_format=model_config.get('response_format'),
function_name=function_name,
prompt_prefix=prompt_prefix # Pass prompt prefix for tracking (replaces function_id)
function_id=function_id # Pass function_id for tracking
)
except Exception as e:
error_msg = f"AI call failed: {str(e)}"
@@ -431,60 +325,46 @@ class AIEngine:
# Store save_msg for use in DONE phase
final_save_msg = save_msg

# Phase 5.5: DEDUCT CREDITS - Deduct credits after successful save
# Track credit usage after successful save
if self.account and raw_response:
try:
from igny8_core.business.billing.services.credit_service import CreditService
from igny8_core.business.billing.exceptions import InsufficientCreditsError
from igny8_core.modules.billing.services import CreditService
from igny8_core.modules.billing.models import CreditUsageLog

# Map function name to operation type
operation_type = self._get_operation_type(function_name)
# Calculate credits used (based on tokens or fixed cost)
credits_used = self._calculate_credits_for_clustering(
keyword_count=len(data.get('keywords', [])) if isinstance(data, dict) else len(data) if isinstance(data, list) else 1,
tokens=raw_response.get('total_tokens', 0),
cost=raw_response.get('cost', 0)
)

# Get actual token usage from response (AI returns 'input_tokens' and 'output_tokens')
tokens_input = raw_response.get('input_tokens', 0)
tokens_output = raw_response.get('output_tokens', 0)

# Deduct credits based on actual token usage
CreditService.deduct_credits_for_operation(
# Log credit usage (don't deduct from account.credits, just log)
CreditUsageLog.objects.create(
account=self.account,
operation_type=operation_type,
tokens_input=tokens_input,
tokens_output=tokens_output,
operation_type='clustering',
credits_used=credits_used,
cost_usd=raw_response.get('cost'),
model_used=raw_response.get('model', ''),
related_object_type=self._get_related_object_type(function_name),
related_object_id=save_result.get('id') or save_result.get('cluster_id') or save_result.get('task_id'),
tokens_input=raw_response.get('tokens_input', 0),
tokens_output=raw_response.get('tokens_output', 0),
related_object_type='cluster',
metadata={
'function_name': function_name,
'clusters_created': clusters_created,
'keywords_updated': keywords_updated,
'count': count,
**save_result
'function_name': function_name
}
)

logger.info(
f"[AIEngine] Credits deducted: {operation_type}, "
f"tokens: {tokens_input + tokens_output} ({tokens_input} in, {tokens_output} out)"
)
except InsufficientCreditsError as e:
# This shouldn't happen since we checked before, but log it
logger.error(f"[AIEngine] Insufficient credits during deduction: {e}")
except Exception as e:
logger.warning(f"[AIEngine] Failed to deduct credits: {e}", exc_info=True)
# Don't fail the operation if credit deduction fails (for backward compatibility)
logger.warning(f"Failed to log credit usage: {e}", exc_info=True)

# Phase 6: DONE - Finalization (98-100%)
done_msg = self._get_done_message(function_name, save_result)
self.step_tracker.add_request_step("DONE", "success", done_msg)
self.tracker.update("DONE", 100, done_msg, meta=self.step_tracker.get_meta())
success_msg = f"Task completed: {final_save_msg}" if 'final_save_msg' in locals() else "Task completed successfully"
self.step_tracker.add_request_step("DONE", "success", "Task completed successfully")
self.tracker.update("DONE", 100, "Task complete!", meta=self.step_tracker.get_meta())

# Log to database
self._log_to_database(fn, payload, parsed, save_result)

# Create notification for successful completion
self._create_success_notification(function_name, save_result, payload)

return {
'success': True,
**save_result,
@@ -528,9 +408,6 @@ class AIEngine:

self._log_to_database(fn, None, None, None, error=error)

# Create notification for failure
self._create_failure_notification(function_name, error)

return {
'success': False,
'error': error,
@@ -576,186 +453,18 @@ class AIEngine:
# Don't fail the task if logging fails
logger.warning(f"Failed to log to database: {e}")

def _get_operation_type(self, function_name):
"""Map function name to operation type for credit system"""
mapping = {
'auto_cluster': 'clustering',
'generate_ideas': 'idea_generation',
'generate_content': 'content_generation',
'generate_image_prompts': 'image_prompt_extraction',
'generate_images': 'image_generation',
'generate_site_structure': 'site_structure_generation',
}
return mapping.get(function_name, function_name)

def _get_estimated_amount(self, function_name, data, payload):
"""Get estimated amount for credit calculation (before operation)"""
if function_name == 'generate_content':
# Estimate word count - tasks don't have word_count field, use default
# data is a list of Task objects
if isinstance(data, list) and len(data) > 0:
# Multiple tasks - estimate 1000 words per task
return len(data) * 1000
return 1000 # Default estimate for single item
elif function_name == 'generate_images':
# Count images to generate
if isinstance(payload, dict):
image_ids = payload.get('image_ids', [])
return len(image_ids) if image_ids else 1
return 1
elif function_name == 'generate_ideas':
# Count clusters
if isinstance(data, dict) and 'cluster_data' in data:
return len(data['cluster_data'])
return 1
# For fixed cost operations (clustering, image_prompt_extraction), return None
return None

def _get_actual_amount(self, function_name, save_result, parsed, data):
"""Get actual amount for credit calculation (after operation)"""
if function_name == 'generate_content':
# Get actual word count from saved content
if isinstance(save_result, dict):
word_count = save_result.get('word_count')
if word_count and word_count > 0:
return word_count
# Fallback: estimate from parsed content
if isinstance(parsed, dict) and 'content' in parsed:
content = parsed['content']
return len(content.split()) if isinstance(content, str) else 1000
# Fallback: estimate from html_content if available
if isinstance(parsed, dict) and 'html_content' in parsed:
html_content = parsed['html_content']
if isinstance(html_content, str):
# Strip HTML tags for word count
import re
text = re.sub(r'<[^>]+>', '', html_content)
return len(text.split())
return 1000
elif function_name == 'generate_images':
# Count successfully generated images
count = save_result.get('count', 0)
if count > 0:
return count
return 1
elif function_name == 'generate_ideas':
# Count ideas generated
count = save_result.get('count', 0)
if count > 0:
return count
return 1
# For fixed cost operations, return None
return None

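The `html_content` fallback in `_get_actual_amount` above strips markup with a regex before counting words. A minimal sketch of that step (the function name here is assumed; the regex is the one the diff uses):

```python
import re

def word_count_from_html(html_content: str) -> int:
    # Remove anything that looks like an HTML tag, as in the diff,
    # then count the remaining whitespace-separated words
    text = re.sub(r'<[^>]+>', '', html_content)
    return len(text.split())
```

Note that this naive tag-stripping can fuse adjacent words when tags carry no surrounding whitespace (e.g. `<h1>One</h1><p>two</p>` yields "Onetwo"), which is acceptable for a billing estimate but not for exact counts.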
def _get_related_object_type(self, function_name):
"""Get related object type for credit logging"""
mapping = {
'auto_cluster': 'cluster',
'generate_ideas': 'content_idea',
'generate_content': 'content',
'generate_image_prompts': 'image',
'generate_images': 'image',
'generate_site_structure': 'site_blueprint',
}
return mapping.get(function_name, 'unknown')

def _create_success_notification(self, function_name: str, save_result: dict, payload: dict):
"""Create notification for successful AI task completion"""
if not self.account:
return
def _calculate_credits_for_clustering(self, keyword_count, tokens, cost):
"""Calculate credits used for clustering operation"""
# Use plan's cost per request if available, otherwise calculate from tokens
if self.account and hasattr(self.account, 'plan') and self.account.plan:
plan = self.account.plan
# Check if plan has ai_cost_per_request config
if hasattr(plan, 'ai_cost_per_request') and plan.ai_cost_per_request:
cluster_cost = plan.ai_cost_per_request.get('cluster', 0)
if cluster_cost:
return int(cluster_cost)

# Lazy import to avoid circular dependency and Django app loading issues
from igny8_core.business.notifications.services import NotificationService

# Get site from payload if available
site = None
site_id = payload.get('site_id')
if site_id:
try:
from igny8_core.auth.models import Site
site = Site.objects.get(id=site_id, account=self.account)
except:
pass

try:
# Map function to appropriate notification method
if function_name == 'auto_cluster':
NotificationService.notify_clustering_complete(
account=self.account,
site=site,
cluster_count=save_result.get('clusters_created', 0),
keyword_count=save_result.get('keywords_updated', 0)
)
elif function_name == 'generate_ideas':
NotificationService.notify_ideas_complete(
account=self.account,
site=site,
idea_count=save_result.get('count', 0),
cluster_count=len(payload.get('ids', []))
)
elif function_name == 'generate_content':
NotificationService.notify_content_complete(
account=self.account,
site=site,
article_count=save_result.get('count', 0),
word_count=save_result.get('word_count', 0)
)
elif function_name == 'generate_image_prompts':
NotificationService.notify_prompts_complete(
account=self.account,
site=site,
prompt_count=save_result.get('count', 0)
)
elif function_name == 'generate_images':
NotificationService.notify_images_complete(
account=self.account,
site=site,
image_count=save_result.get('count', 0)
)

logger.info(f"[AIEngine] Created success notification for {function_name}")
except Exception as e:
# Don't fail the task if notification creation fails
logger.warning(f"[AIEngine] Failed to create success notification: {e}", exc_info=True)

def _create_failure_notification(self, function_name: str, error: str):
"""Create notification for failed AI task"""
if not self.account:
return

# Lazy import to avoid circular dependency and Django app loading issues
from igny8_core.business.notifications.services import NotificationService

try:
# Map function to appropriate failure notification method
if function_name == 'auto_cluster':
NotificationService.notify_clustering_failed(
account=self.account,
error=error
)
elif function_name == 'generate_ideas':
NotificationService.notify_ideas_failed(
account=self.account,
error=error
)
elif function_name == 'generate_content':
NotificationService.notify_content_failed(
account=self.account,
error=error
)
elif function_name == 'generate_image_prompts':
NotificationService.notify_prompts_failed(
account=self.account,
error=error
)
elif function_name == 'generate_images':
NotificationService.notify_images_failed(
account=self.account,
error=error
)

logger.info(f"[AIEngine] Created failure notification for {function_name}")
except Exception as e:
# Don't fail the task if notification creation fails
logger.warning(f"[AIEngine] Failed to create failure notification: {e}", exc_info=True)
# Fallback: 1 credit per 30 keywords (minimum 1)
credits = max(1, int(keyword_count / 30))
return credits

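The clustering credit logic in this diff has two tiers: a per-plan override (`ai_cost_per_request['cluster']`) wins when set, otherwise the fallback of 1 credit per 30 keywords with a minimum of 1 applies. A minimal sketch of that decision (the standalone function name is an assumption; the project implements this as a method on `AIEngine`):

```python
from typing import Optional

def clustering_credits(keyword_count: int, plan_cost: Optional[int] = None) -> int:
    # Plan-level fixed cost takes precedence when configured
    if plan_cost:
        return int(plan_cost)
    # Fallback: 1 credit per 30 keywords, minimum 1 (matches the diff)
    return max(1, int(keyword_count / 30))
```

So 10, 30, or even 1 keyword all cost a single credit, 90 keywords cost 3, and a plan override of 5 applies regardless of keyword count.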
@@ -40,7 +40,6 @@ class AutoClusterFunction(BaseAIFunction):
def validate(self, payload: dict, account=None) -> Dict:
"""Custom validation for clustering"""
from igny8_core.ai.validators import validate_ids, validate_keywords_exist
from igny8_core.ai.validators.cluster_validators import validate_minimum_keywords

# Base validation (no max_items limit)
result = validate_ids(payload, max_items=None)
@@ -53,21 +52,6 @@ class AutoClusterFunction(BaseAIFunction):
if not keywords_result['valid']:
return keywords_result

# NEW: Validate minimum keywords (5 required for meaningful clustering)
min_validation = validate_minimum_keywords(
keyword_ids=ids,
account=account,
min_required=5
)

if not min_validation['valid']:
logger.warning(f"[AutoCluster] Validation failed: {min_validation['error']}")
return min_validation

logger.info(
f"[AutoCluster] Validation passed: {min_validation['count']} keywords available (min: {min_validation['required']})"
)

# Removed plan limits check

return {'valid': True}
@@ -97,6 +81,7 @@ class AutoClusterFunction(BaseAIFunction):
'keyword': kw.keyword,
'volume': kw.volume,
'difficulty': kw.difficulty,
'intent': kw.intent,
}
for kw in keywords
],
@@ -110,7 +95,7 @@ class AutoClusterFunction(BaseAIFunction):

# Format keywords
keywords_text = '\n'.join([
f"- {kw['keyword']} (Volume: {kw['volume']}, Difficulty: {kw['difficulty']})"
f"- {kw['keyword']} (Volume: {kw['volume']}, Difficulty: {kw['difficulty']}, Intent: {kw['intent']})"
for kw in keyword_data
])

@@ -264,7 +249,7 @@ class AutoClusterFunction(BaseAIFunction):
sector=sector,
defaults={
'description': cluster_data.get('description', ''),
'status': 'new', # FIXED: Changed from 'active' to 'new'
'status': 'active',
}
)
else:
@@ -275,7 +260,7 @@ class AutoClusterFunction(BaseAIFunction):
sector__isnull=True,
defaults={
'description': cluster_data.get('description', ''),
'status': 'new', # FIXED: Changed from 'active' to 'new'
'status': 'active',
'sector': None,
}
)
@@ -307,10 +292,9 @@ class AutoClusterFunction(BaseAIFunction):
else:
keyword_filter = keyword_filter.filter(sector__isnull=True)

# FIXED: Ensure keywords status updates from 'new' to 'mapped'
updated_count = keyword_filter.update(
cluster=cluster,
status='mapped' # Status changes from 'new' to 'mapped'
status='mapped'
)
keywords_updated += updated_count

@@ -1,14 +1,13 @@
"""
Generate Content AI Function
STAGE 3: Updated to use final Stage 1 Content schema
Extracted from modules/writer/tasks.py
"""
import logging
import re
from typing import Dict, List, Any
from django.db import transaction
from igny8_core.ai.base import BaseAIFunction
from igny8_core.modules.writer.models import Tasks, Content
from igny8_core.business.content.models import ContentTaxonomy
from igny8_core.modules.writer.models import Tasks, Content as TaskContent
from igny8_core.ai.ai_core import AICore
from igny8_core.ai.validators import validate_tasks_exist
from igny8_core.ai.prompts import PromptRegistry
@@ -63,9 +62,9 @@ class GenerateContentFunction(BaseAIFunction):
if account:
queryset = queryset.filter(account=account)

# STAGE 3: Preload relationships - taxonomy_term instead of taxonomy
# Preload all relationships to avoid N+1 queries
tasks = list(queryset.select_related(
'account', 'site', 'sector', 'cluster', 'taxonomy_term'
'account', 'site', 'sector', 'cluster', 'idea'
))

if not tasks:
@@ -74,8 +73,9 @@ class GenerateContentFunction(BaseAIFunction):
return tasks

def build_prompt(self, data: Any, account=None) -> str:
"""STAGE 3: Build content generation prompt using final Task schema"""
"""Build content generation prompt for a single task using registry"""
if isinstance(data, list):
# For now, handle single task (will be called per task)
if not data:
raise ValueError("No tasks provided")
task = data[0]
@@ -89,9 +89,33 @@ class GenerateContentFunction(BaseAIFunction):
|
||||
if task.description:
|
||||
idea_data += f"Description: {task.description}\n"
|
||||
|
||||
# Add content type and structure from task
|
||||
idea_data += f"Content Type: {task.content_type or 'post'}\n"
|
||||
idea_data += f"Content Structure: {task.content_structure or 'article'}\n"
|
||||
# Handle idea description (might be JSON or plain text)
|
||||
if task.idea and task.idea.description:
|
||||
description = task.idea.description
|
||||
try:
|
||||
import json
|
||||
parsed_desc = json.loads(description)
|
||||
if isinstance(parsed_desc, dict):
|
||||
formatted_desc = "Content Outline:\n\n"
|
||||
if 'H2' in parsed_desc:
|
||||
for h2_section in parsed_desc['H2']:
|
||||
formatted_desc += f"## {h2_section.get('heading', '')}\n"
|
||||
if 'subsections' in h2_section:
|
||||
for h3_section in h2_section['subsections']:
|
||||
formatted_desc += f"### {h3_section.get('subheading', '')}\n"
|
||||
formatted_desc += f"Content Type: {h3_section.get('content_type', '')}\n"
|
||||
formatted_desc += f"Details: {h3_section.get('details', '')}\n\n"
|
||||
description = formatted_desc
|
||||
except (json.JSONDecodeError, TypeError):
|
||||
pass # Use as plain text
|
||||
|
||||
idea_data += f"Outline: {description}\n"
|
||||
|
||||
if task.idea:
|
||||
idea_data += f"Structure: {task.idea.content_structure or task.content_structure or 'blog_post'}\n"
|
||||
idea_data += f"Type: {task.idea.content_type or task.content_type or 'blog_post'}\n"
|
||||
if task.idea.estimated_word_count:
|
||||
idea_data += f"Estimated Word Count: {task.idea.estimated_word_count}\n"
|
||||
|
||||
# Build cluster data string
|
||||
cluster_data = ''
|
||||
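The new outline handling above assumes the idea description may be a JSON object of the shape `{"H2": [{"heading": ..., "subsections": [{"subheading": ..., "content_type": ..., "details": ...}]}]}` and flattens it into a markdown-style outline, falling back to plain text otherwise. A standalone sketch of that transformation (function name `format_outline` is illustrative, not from the codebase):

```python
import json

def format_outline(description: str) -> str:
    """Mirror of the diff's outline handling: a JSON description becomes a
    markdown-style outline; anything non-JSON is passed through as-is."""
    try:
        parsed = json.loads(description)
    except (json.JSONDecodeError, TypeError):
        return description  # plain-text description, use unchanged
    if not isinstance(parsed, dict):
        return description
    out = "Content Outline:\n\n"
    for h2 in parsed.get('H2', []):
        out += f"## {h2.get('heading', '')}\n"
        for h3 in h2.get('subsections', []):
            out += f"### {h3.get('subheading', '')}\n"
            out += f"Content Type: {h3.get('content_type', '')}\n"
            out += f"Details: {h3.get('details', '')}\n\n"
    return out

sample = json.dumps({
    "H2": [{"heading": "Intro", "subsections": [
        {"subheading": "Why", "content_type": "paragraph", "details": "Hook"}]}]
})
print(format_outline(sample))
```

Note the `except (json.JSONDecodeError, TypeError)` guard: a non-string description raises `TypeError` from `json.loads`, so both failure modes fall back to the raw value.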
@@ -99,18 +123,12 @@ class GenerateContentFunction(BaseAIFunction):
            cluster_data = f"Cluster Name: {task.cluster.name or ''}\n"
            if task.cluster.description:
                cluster_data += f"Description: {task.cluster.description}\n"
            cluster_data += f"Status: {task.cluster.status or 'active'}\n"

        # STAGE 3: Build taxonomy context (from taxonomy_term FK)
        taxonomy_data = ''
        if task.taxonomy_term:
            taxonomy_data = f"Taxonomy: {task.taxonomy_term.name or ''}\n"
            if task.taxonomy_term.taxonomy_type:
                taxonomy_data += f"Type: {task.taxonomy_term.get_taxonomy_type_display()}\n"

        # STAGE 3: Build keywords context (from keywords TextField)
        keywords_data = ''
        if task.keywords:
            keywords_data = f"Keywords: {task.keywords}\n"
        # Build keywords string
        keywords_data = task.keywords or ''
        if not keywords_data and task.idea:
            keywords_data = task.idea.target_keywords or ''

        # Get prompt from registry with context
        prompt = PromptRegistry.get_prompt(
@@ -120,7 +138,6 @@ class GenerateContentFunction(BaseAIFunction):
            context={
                'IDEA': idea_data,
                'CLUSTER': cluster_data,
                'TAXONOMY': taxonomy_data,
                'KEYWORDS': keywords_data,
            }
        )
@@ -159,11 +176,7 @@ class GenerateContentFunction(BaseAIFunction):
        progress_tracker=None,
        step_tracker=None
    ) -> Dict:
        """
        STAGE 3: Save content using final Stage 1 Content model schema.
        Creates independent Content record (no OneToOne to Task).
        Handles tags and categories from AI response.
        """
        """Save content to task - handles both JSON and plain text responses"""
        if isinstance(original_data, list):
            task = original_data[0] if original_data else None
        else:
@@ -177,158 +190,113 @@ class GenerateContentFunction(BaseAIFunction):
            # JSON response with structured fields
            content_html = parsed.get('content', '')
            title = parsed.get('title') or task.title
            meta_title = parsed.get('meta_title') or parsed.get('seo_title') or title
            meta_description = parsed.get('meta_description') or parsed.get('seo_description')
            primary_keyword = parsed.get('primary_keyword') or parsed.get('focus_keyword')
            secondary_keywords = parsed.get('secondary_keywords') or parsed.get('keywords', [])
            # Extract tags and categories from AI response
            tags_from_response = parsed.get('tags', [])
            categories_from_response = parsed.get('categories', [])

            # DEBUG: Log the full parsed response to see what we're getting
            logger.info(f"===== GENERATE CONTENT DEBUG =====")
            logger.info(f"Full parsed response keys: {list(parsed.keys())}")
            logger.info(f"Tags from response (type: {type(tags_from_response)}): {tags_from_response}")
            logger.info(f"Categories from response (type: {type(categories_from_response)}): {categories_from_response}")
            logger.info(f"==================================")
            meta_title = parsed.get('meta_title') or title or task.title
            meta_description = parsed.get('meta_description', '')
            word_count = parsed.get('word_count', 0)
            primary_keyword = parsed.get('primary_keyword', '')
            secondary_keywords = parsed.get('secondary_keywords', [])
            tags = parsed.get('tags', [])
            categories = parsed.get('categories', [])
            # Content status should always be 'draft' for newly generated content
            # Status can only be changed manually to 'review' or 'publish'
            content_status = 'draft'
        else:
            # Plain text response
            # Plain text response (legacy)
            content_html = str(parsed)
            title = task.title
            meta_title = title
            meta_description = None
            primary_keyword = None
            meta_title = task.meta_title or task.title
            meta_description = task.meta_description or (task.description or '')[:160] if task.description else ''
            word_count = 0
            primary_keyword = ''
            secondary_keywords = []
            tags_from_response = []
            categories_from_response = []
            tags = []
            categories = []
            content_status = 'draft'

        # Calculate word count
        word_count = 0
        if content_html:
        # Calculate word count if not provided
        if not word_count and content_html:
            text_for_counting = re.sub(r'<[^>]+>', '', content_html)
            word_count = len(text_for_counting.split())

        # STAGE 3: Create independent Content record using final schema
        content_record = Content.objects.create(
            # Core fields
            title=title,
            content_html=content_html or '',
            word_count=word_count,
            # SEO fields
            meta_title=meta_title,
            meta_description=meta_description,
            primary_keyword=primary_keyword,
            secondary_keywords=secondary_keywords if isinstance(secondary_keywords, list) else [],
            # Structure
            cluster=task.cluster,
            content_type=task.content_type,
            content_structure=task.content_structure,
            # Source and status
            source='igny8',
            status='draft',
            # Site/Sector/Account
            account=task.account,
            site=task.site,
            sector=task.sector,

        # Ensure related content record exists
        content_record, _created = TaskContent.objects.get_or_create(
            task=task,
            defaults={
                'account': task.account,
                'site': task.site,
                'sector': task.sector,
                'html_content': content_html or '',
                'word_count': word_count or 0,
                'status': 'draft',
            },
        )

        logger.info(f"Created content record ID: {content_record.id}")
        logger.info(f"Processing taxonomies - Tags: {len(tags_from_response) if tags_from_response else 0}, Categories: {len(categories_from_response) if categories_from_response else 0}")

        # Link taxonomy terms from task if available
        if task.taxonomy_term:
            content_record.taxonomy_terms.add(task.taxonomy_term)
            logger.info(f"Added task taxonomy term: {task.taxonomy_term.name}")

        # Process tags from AI response
        logger.info(f"Starting tag processing: {tags_from_response}")
        if tags_from_response and isinstance(tags_from_response, list):
            from django.utils.text import slugify
            for tag_name in tags_from_response:
                logger.info(f"Processing tag: '{tag_name}' (type: {type(tag_name)})")
                if tag_name and isinstance(tag_name, str):
                    tag_name = tag_name.strip()
                    if tag_name:
                        try:
                            tag_slug = slugify(tag_name)
                            logger.info(f"Creating/finding tag: name='{tag_name}', slug='{tag_slug}'")
                            # Get or create tag taxonomy term using site + slug + type for uniqueness
                            tag_obj, created = ContentTaxonomy.objects.get_or_create(
                                site=task.site,
                                slug=tag_slug,
                                taxonomy_type='tag',
                                defaults={
                                    'name': tag_name,
                                    'sector': task.sector,
                                    'account': task.account,
                                    'description': '',
                                    'external_taxonomy': '',
                                    'sync_status': '',
                                    'count': 0,
                                    'metadata': {},
                                }
                            )
                            content_record.taxonomy_terms.add(tag_obj)
                            logger.info(f"✅ {'Created' if created else 'Found'} and linked tag: {tag_name} (ID: {tag_obj.id}, Slug: {tag_slug})")
                        except Exception as e:
                            logger.error(f"❌ Failed to add tag '{tag_name}': {e}", exc_info=True)
                else:
                    logger.warning(f"Skipping invalid tag: '{tag_name}' (type: {type(tag_name)})")

        # Update content fields
        if content_html:
            content_record.html_content = content_html
        content_record.word_count = word_count or content_record.word_count or 0
        content_record.title = title
        content_record.meta_title = meta_title
        content_record.meta_description = meta_description
        content_record.primary_keyword = primary_keyword or ''
        if isinstance(secondary_keywords, list):
            content_record.secondary_keywords = secondary_keywords
        elif secondary_keywords:
            content_record.secondary_keywords = [secondary_keywords]
        else:
            logger.info(f"No tags to process or tags_from_response is not a list: {type(tags_from_response)}")

        # Process categories from AI response
        logger.info(f"Starting category processing: {categories_from_response}")
        if categories_from_response and isinstance(categories_from_response, list):
            from django.utils.text import slugify
            for category_name in categories_from_response:
                logger.info(f"Processing category: '{category_name}' (type: {type(category_name)})")
                if category_name and isinstance(category_name, str):
                    category_name = category_name.strip()
                    if category_name:
                        try:
                            category_slug = slugify(category_name)
                            logger.info(f"Creating/finding category: name='{category_name}', slug='{category_slug}'")
                            # Get or create category taxonomy term using site + slug + type for uniqueness
                            category_obj, created = ContentTaxonomy.objects.get_or_create(
                                site=task.site,
                                slug=category_slug,
                                taxonomy_type='category',
                                defaults={
                                    'name': category_name,
                                    'sector': task.sector,
                                    'account': task.account,
                                    'description': '',
                                    'external_taxonomy': '',
                                    'sync_status': '',
                                    'count': 0,
                                    'metadata': {},
                                }
                            )
                            content_record.taxonomy_terms.add(category_obj)
                            logger.info(f"✅ {'Created' if created else 'Found'} and linked category: {category_name} (ID: {category_obj.id}, Slug: {category_slug})")
                        except Exception as e:
                            logger.error(f"❌ Failed to add category '{category_name}': {e}", exc_info=True)
                else:
                    logger.warning(f"Skipping invalid category: '{category_name}' (type: {type(category_name)})")
            content_record.secondary_keywords = []
        if isinstance(tags, list):
            content_record.tags = tags
        elif tags:
            content_record.tags = [tags]
        else:
            logger.info(f"No categories to process or categories_from_response is not a list: {type(categories_from_response)}")

        # STAGE 3: Update task status to completed
            content_record.tags = []
        if isinstance(categories, list):
            content_record.categories = categories
        elif categories:
            content_record.categories = [categories]
        else:
            content_record.categories = []

        # Always set status to 'draft' for newly generated content
        # Status can only be: draft, review, published (changed manually)
        content_record.status = 'draft'

        # Merge any extra fields into metadata (non-standard keys)
        if isinstance(parsed, dict):
            excluded_keys = {
                'content',
                'title',
                'meta_title',
                'meta_description',
                'primary_keyword',
                'secondary_keywords',
                'tags',
                'categories',
                'word_count',
                'status',
            }
            extra_meta = {k: v for k, v in parsed.items() if k not in excluded_keys}
            existing_meta = content_record.metadata or {}
            existing_meta.update(extra_meta)
            content_record.metadata = existing_meta

        # Align foreign keys to ensure consistency
        content_record.account = task.account
        content_record.site = task.site
        content_record.sector = task.sector
        content_record.task = task

        content_record.save()

        # Update task status - keep task data intact but mark as completed
        task.status = 'completed'
        task.save(update_fields=['status', 'updated_at'])

        # NEW: Auto-sync idea status from task status
        if hasattr(task, 'idea') and task.idea:
            task.idea.status = 'completed'
            task.idea.save(update_fields=['status', 'updated_at'])
            logger.info(f"Updated related idea ID {task.idea.id} to completed")

        return {
            'count': 1,
            'content_id': content_record.id,
            'task_id': task.id,
            'word_count': word_count,
            'tasks_updated': 1,
            'word_count': content_record.word_count,
        }

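The tag/category handling above keys `ContentTaxonomy.objects.get_or_create` on `(site, slug, taxonomy_type)`, so re-running generation reuses an existing term whose name slugifies to the same value instead of creating a duplicate. A framework-free sketch of that idempotent lookup; the hand-rolled `slugify` only approximates `django.utils.text.slugify` for ASCII input, and the in-memory store stands in for the ORM:

```python
import re

def slugify(value: str) -> str:
    # Simplified stand-in for django.utils.text.slugify (ASCII input only).
    value = re.sub(r'[^\w\s-]', '', value.lower()).strip()
    return re.sub(r'[-\s]+', '-', value)

# In-memory stand-in for the ORM table, keyed on the same uniqueness
# triple the diff uses: (site, slug, taxonomy_type).
_store: dict = {}

def get_or_create_term(site: str, name: str, taxonomy_type: str):
    key = (site, slugify(name), taxonomy_type)
    if key in _store:
        return _store[key], False  # existing row, created=False
    _store[key] = {'name': name, 'slug': key[1], 'taxonomy_type': taxonomy_type}
    return _store[key], True

term, created = get_or_create_term('site-1', 'Content Marketing', 'tag')
again, created2 = get_or_create_term('site-1', 'content marketing', 'tag')
# Both names slugify to 'content-marketing', so the second call reuses the row.
```

This is why the diff slugifies before the lookup: matching on the raw name would treat casing or spacing variants from the AI response as distinct terms.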
@@ -208,16 +208,12 @@ class GenerateIdeasFunction(BaseAIFunction):
            # Handle target_keywords
            target_keywords = idea_data.get('covered_keywords', '') or idea_data.get('target_keywords', '')

            # Direct mapping - no conversion needed
            content_type = idea_data.get('content_type', 'post')
            content_structure = idea_data.get('content_structure', 'article')

            # Create ContentIdeas record
            ContentIdeas.objects.create(
                idea_title=idea_data.get('title', 'Untitled Idea'),
                description=description,  # Stored as JSON string
                content_type=content_type,
                content_structure=content_structure,
                description=description,
                content_type=idea_data.get('content_type', 'blog_post'),
                content_structure=idea_data.get('content_structure', 'supporting_page'),
                target_keywords=target_keywords,
                keyword_cluster=cluster,
                estimated_word_count=idea_data.get('estimated_word_count', 1500),
@@ -227,11 +223,6 @@ class GenerateIdeasFunction(BaseAIFunction):
                sector=cluster.sector,
            )
            ideas_created += 1

        # Update cluster status to 'mapped' after ideas are generated
        if cluster and cluster.status == 'new':
            cluster.status = 'mapped'
            cluster.save()

        return {
            'count': ideas_created,

@@ -63,7 +63,7 @@ class GenerateImagePromptsFunction(BaseAIFunction):
        if account:
            queryset = queryset.filter(account=account)

        contents = list(queryset.select_related('account', 'site', 'sector', 'cluster'))
        contents = list(queryset.select_related('task', 'account', 'site', 'sector'))

        if not contents:
            raise ValueError("No content records found")
@@ -93,7 +93,7 @@ class GenerateImagePromptsFunction(BaseAIFunction):
            data = data[0]

        extracted = data['extracted']
        max_images = data.get('max_images')
        max_images = data.get('max_images', 2)

        # Format content for prompt
        content_text = self._format_content_for_prompt(extracted)
@@ -112,7 +112,7 @@ class GenerateImagePromptsFunction(BaseAIFunction):
        return prompt

    def parse_response(self, response: str, step_tracker=None) -> Dict:
        """Parse AI response with new structure including captions"""
        """Parse AI response - same pattern as other functions"""
        ai_core = AICore(account=getattr(self, 'account', None))
        json_data = ai_core.extract_json(response)

@@ -123,28 +123,9 @@ class GenerateImagePromptsFunction(BaseAIFunction):
        if 'featured_prompt' not in json_data:
            raise ValueError("Missing 'featured_prompt' in AI response")

        if 'featured_caption' not in json_data:
            raise ValueError("Missing 'featured_caption' in AI response")

        if 'in_article_prompts' not in json_data:
            raise ValueError("Missing 'in_article_prompts' in AI response")

        # Validate in_article_prompts structure (should be list of objects with prompt & caption)
        in_article_prompts = json_data.get('in_article_prompts', [])
        if in_article_prompts:
            for idx, item in enumerate(in_article_prompts):
                if isinstance(item, dict):
                    if 'prompt' not in item:
                        raise ValueError(f"Missing 'prompt' in in_article_prompts[{idx}]")
                    if 'caption' not in item:
                        raise ValueError(f"Missing 'caption' in in_article_prompts[{idx}]")
                else:
                    # Legacy format (just string) - convert to new format
                    in_article_prompts[idx] = {
                        'prompt': str(item),
                        'caption': ''  # Empty caption for legacy data
                    }

        return json_data

    def save_output(
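The validation loop removed above normalizes a mixed `in_article_prompts` list: dicts must carry both `prompt` and `caption`, while legacy bare strings are wrapped with an empty caption. The same normalization as a standalone function (the name `normalize_prompts` is illustrative):

```python
def normalize_prompts(items):
    """Mirror of the removed parse_response check: plain strings become
    {'prompt': ..., 'caption': ''}; dicts must carry both keys."""
    normalized = []
    for idx, item in enumerate(items):
        if isinstance(item, dict):
            if 'prompt' not in item:
                raise ValueError(f"Missing 'prompt' in in_article_prompts[{idx}]")
            if 'caption' not in item:
                raise ValueError(f"Missing 'caption' in in_article_prompts[{idx}]")
            normalized.append(item)
        else:
            # Legacy format: bare string prompt, no caption available
            normalized.append({'prompt': str(item), 'caption': ''})
    return normalized

mixed = ['a rooftop at dusk', {'prompt': 'tools on a bench', 'caption': 'Workshop'}]
result = normalize_prompts(mixed)
```

Normalizing at parse time lets `save_output` treat every entry uniformly instead of re-checking types when building `Images` rows.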
@@ -165,47 +146,36 @@ class GenerateImagePromptsFunction(BaseAIFunction):

        content = original_data['content']
        extracted = original_data['extracted']
        max_images = original_data.get('max_images')
        max_images = original_data.get('max_images', 2)

        prompts_created = 0

        with transaction.atomic():
            # Save featured image prompt with caption
            # Save featured image prompt - use content instead of task
            Images.objects.update_or_create(
                content=content,
                image_type='featured',
                defaults={
                    'prompt': parsed['featured_prompt'],
                    'caption': parsed.get('featured_caption', ''),
                    'status': 'pending',
                    'position': 0,
                }
            )
            prompts_created += 1

            # Save in-article image prompts with captions
            # Save in-article image prompts
            in_article_prompts = parsed.get('in_article_prompts', [])
            h2_headings = extracted.get('h2_headings', [])

            for idx, prompt_data in enumerate(in_article_prompts[:max_images]):
                # Handle both new format (dict with prompt & caption) and legacy format (string)
                if isinstance(prompt_data, dict):
                    prompt_text = prompt_data.get('prompt', '')
                    caption_text = prompt_data.get('caption', '')
                else:
                    # Legacy format - just a string prompt
                    prompt_text = str(prompt_data)
                    caption_text = ''

                heading = h2_headings[idx] if idx < len(h2_headings) else f"Section {idx}"
            for idx, prompt_text in enumerate(in_article_prompts[:max_images]):
                heading = h2_headings[idx] if idx < len(h2_headings) else f"Section {idx + 1}"

                Images.objects.update_or_create(
                    content=content,
                    image_type='in_article',
                    position=idx,  # 0-based position matching section array indices
                    position=idx + 1,
                    defaults={
                        'prompt': prompt_text,
                        'caption': caption_text,
                        'status': 'pending',
                    }
                )
@@ -218,45 +188,26 @@ class GenerateImagePromptsFunction(BaseAIFunction):

    # Helper methods
    def _get_max_in_article_images(self, account) -> int:
        """
        Get max_in_article_images from settings.
        Uses account's IntegrationSettings override, or GlobalIntegrationSettings.
        """
        from igny8_core.modules.system.models import IntegrationSettings
        from igny8_core.modules.system.global_settings_models import GlobalIntegrationSettings

        # Try account-specific override first
        """Get max_in_article_images from IntegrationSettings"""
        try:
            from igny8_core.modules.system.models import IntegrationSettings
            settings = IntegrationSettings.objects.get(
                account=account,
                integration_type='image_generation',
                is_active=True
                integration_type='image_generation'
            )
            max_images = settings.config.get('max_in_article_images')

            if max_images is not None:
                max_images = int(max_images)
                logger.info(f"Using max_in_article_images={max_images} from account {account.id} IntegrationSettings override")
                return max_images
            return settings.config.get('max_in_article_images', 2)
        except IntegrationSettings.DoesNotExist:
            logger.debug(f"No IntegrationSettings override for account {account.id}, using GlobalIntegrationSettings")

        # Use GlobalIntegrationSettings default
        global_settings = GlobalIntegrationSettings.get_instance()
        max_images = global_settings.max_in_article_images
        logger.info(f"Using max_in_article_images={max_images} from GlobalIntegrationSettings (account {account.id})")
        return max_images
            return 2  # Default

    def _extract_content_elements(self, content: Content, max_images: int) -> Dict:
        """Extract title, intro paragraphs, and H2 headings from content HTML"""
        from bs4 import BeautifulSoup

        html_content = content.content_html or ''
        html_content = content.html_content or ''
        soup = BeautifulSoup(html_content, 'html.parser')

        # Extract title
        # Get content title (task field was removed in refactor)
        title = content.title or ''
        title = content.title or content.task.title or ''

        # Extract first 1-2 intro paragraphs (skip italic hook if present)
        paragraphs = soup.find_all('p')

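The `_get_max_in_article_images` variants above differ only in precedence: the old code checks the account's `IntegrationSettings` row and falls back to `GlobalIntegrationSettings`, the new code hardcodes a default of 2. The shared resolution logic, sketched without the ORM (dicts stand in for the settings rows; `resolve_max_images` is an illustrative name):

```python
def resolve_max_images(account_config, global_default=2):
    """Account-level override wins when the row exists and the key is set
    to a non-None value; otherwise fall back to the wider default — the
    same precedence the helpers above implement with ORM lookups."""
    if account_config is not None:  # account has an integration row
        value = account_config.get('max_in_article_images')
        if value is not None:
            return int(value)  # config values may arrive as strings
    return global_default

a = resolve_max_images({'max_in_article_images': '4'})  # override present
b = resolve_max_images({})                              # row exists, key unset
c = resolve_max_images(None, 3)                         # no row at all
```

The `int(...)` cast matters because `config` is a JSON blob, so a UI-entered value may be a string rather than a number.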
@@ -68,39 +68,33 @@ class GenerateImagesFunction(BaseAIFunction):
            raise ValueError("No tasks found")

        # Get image generation settings
        # Try account-specific override, otherwise use GlobalIntegrationSettings
        from igny8_core.modules.system.models import IntegrationSettings
        from igny8_core.modules.system.global_settings_models import GlobalIntegrationSettings

        image_settings = {}
        try:
            integration = IntegrationSettings.objects.get(
                account=account,
                integration_type='image_generation',
                is_active=True
            )
            image_settings = integration.config or {}
            logger.info(f"Using image settings from account {account.id} IntegrationSettings override")
        except IntegrationSettings.DoesNotExist:
            logger.info(f"No IntegrationSettings override for account {account.id}, using GlobalIntegrationSettings")
        if account:
            try:
                from igny8_core.modules.system.models import IntegrationSettings
                integration = IntegrationSettings.objects.get(
                    account=account,
                    integration_type='image_generation',
                    is_active=True
                )
                image_settings = integration.config or {}
            except Exception:
                pass

        # Use GlobalIntegrationSettings for missing values
        global_settings = GlobalIntegrationSettings.get_instance()

        # Extract settings with defaults from global settings
        provider = image_settings.get('provider') or image_settings.get('service') or global_settings.default_image_service
        # Extract settings with defaults
        provider = image_settings.get('provider') or image_settings.get('service', 'openai')
        if provider == 'runware':
            model = image_settings.get('model') or image_settings.get('runwareModel') or global_settings.runware_model
            model = image_settings.get('model') or image_settings.get('runwareModel', 'runware:97@1')
        else:
            model = image_settings.get('model') or global_settings.dalle_model
            model = image_settings.get('model', 'dall-e-3')

        return {
            'tasks': tasks,
            'account': account,
            'provider': provider,
            'model': model,
            'image_type': image_settings.get('image_type') or global_settings.image_style,
            'max_in_article_images': int(image_settings.get('max_in_article_images') or global_settings.max_in_article_images),
            'image_type': image_settings.get('image_type', 'realistic'),
            'max_in_article_images': int(image_settings.get('max_in_article_images', 2)),
            'desktop_enabled': image_settings.get('desktop_enabled', True),
            'mobile_enabled': image_settings.get('mobile_enabled', True),
        }
@@ -108,7 +102,7 @@ class GenerateImagesFunction(BaseAIFunction):
    def build_prompt(self, data: Dict, account=None) -> Dict:
        """Extract image prompts from task content"""
        task = data.get('task')
        max_images = data.get('max_in_article_images')
        max_images = data.get('max_in_article_images', 2)

        if not task or not task.content:
            raise ValueError("Task has no content")

@@ -1,167 +0,0 @@
"""
Optimize Content AI Function
Phase 4 – Linker & Optimizer
"""
import json
import logging
from typing import Any, Dict

from igny8_core.ai.base import BaseAIFunction
from igny8_core.ai.prompts import PromptRegistry
from igny8_core.business.content.models import Content

logger = logging.getLogger(__name__)


class OptimizeContentFunction(BaseAIFunction):
    """AI function that optimizes content for SEO, readability, and engagement."""

    def get_name(self) -> str:
        return 'optimize_content'

    def get_metadata(self) -> Dict:
        metadata = super().get_metadata()
        metadata.update({
            'display_name': 'Optimize Content',
            'description': 'Optimize content for SEO, readability, and engagement.',
            'phases': {
                'INIT': 'Validating content data…',
                'PREP': 'Preparing content context…',
                'AI_CALL': 'Optimizing content with AI…',
                'PARSE': 'Parsing optimized content…',
                'SAVE': 'Saving optimized content…',
                'DONE': 'Content optimized!'
            }
        })
        return metadata

    def validate(self, payload: dict, account=None) -> Dict[str, Any]:
        if not payload.get('ids'):
            return {'valid': False, 'error': 'Content ID is required'}
        return {'valid': True}

    def prepare(self, payload: dict, account=None) -> Dict[str, Any]:
        content_ids = payload.get('ids', [])
        queryset = Content.objects.filter(id__in=content_ids)
        if account:
            queryset = queryset.filter(account=account)
        content = queryset.select_related('account', 'site', 'sector').first()
        if not content:
            raise ValueError("Content not found")

        # Get current scores from analyzer
        from igny8_core.business.optimization.services.analyzer import ContentAnalyzer
        analyzer = ContentAnalyzer()
        scores_before = analyzer.analyze(content)

        return {
            'content': content,
            'scores_before': scores_before,
            'html_content': content.html_content or '',
            'meta_title': content.meta_title or '',
            'meta_description': content.meta_description or '',
            'primary_keyword': content.primary_keyword or '',
        }

    def build_prompt(self, data: Dict[str, Any], account=None) -> str:
        content: Content = data['content']
        scores_before = data.get('scores_before', {})

        context = {
            'CONTENT_TITLE': content.title or 'Untitled',
            'HTML_CONTENT': data.get('html_content', ''),
            'META_TITLE': data.get('meta_title', ''),
            'META_DESCRIPTION': data.get('meta_description', ''),
            'PRIMARY_KEYWORD': data.get('primary_keyword', ''),
            'WORD_COUNT': str(content.word_count or 0),
            'CURRENT_SCORES': json.dumps(scores_before, indent=2),
            'SOURCE': content.source,
            'INTERNAL_LINKS_COUNT': str(len(content.internal_links) if content.internal_links else 0),
        }

        return PromptRegistry.get_prompt(
            'optimize_content',
            account=account or content.account,
            context=context
        )

    def parse_response(self, response: str, step_tracker=None) -> Dict[str, Any]:
        if not response:
            raise ValueError("AI response is empty")

        response = response.strip()
        try:
            return self._ensure_dict(json.loads(response))
        except json.JSONDecodeError:
            logger.warning("Response not valid JSON, attempting to extract JSON object")
            cleaned = self._extract_json_object(response)
            if cleaned:
                return self._ensure_dict(json.loads(cleaned))
            raise ValueError("Unable to parse AI response into JSON")

    def save_output(
        self,
        parsed: Dict[str, Any],
        original_data: Dict[str, Any],
        account=None,
        progress_tracker=None,
        step_tracker=None
    ) -> Dict[str, Any]:
        content: Content = original_data['content']

        # Extract optimized content
        optimized_html = parsed.get('html_content') or parsed.get('content') or content.html_content
        optimized_meta_title = parsed.get('meta_title') or content.meta_title
        optimized_meta_description = parsed.get('meta_description') or content.meta_description

        # Update content
        content.html_content = optimized_html
        if optimized_meta_title:
            content.meta_title = optimized_meta_title
        if optimized_meta_description:
            content.meta_description = optimized_meta_description

        # Recalculate word count
        from igny8_core.business.content.services.content_generation_service import ContentGenerationService
        content_service = ContentGenerationService()
        content.word_count = content_service._count_words(optimized_html)

        # Increment optimizer version
        content.optimizer_version += 1

        # Get scores after optimization
        from igny8_core.business.optimization.services.analyzer import ContentAnalyzer
        analyzer = ContentAnalyzer()
        scores_after = analyzer.analyze(content)
        content.optimization_scores = scores_after

        content.save(update_fields=[
            'html_content', 'meta_title', 'meta_description',
            'word_count', 'optimizer_version', 'optimization_scores', 'updated_at'
        ])

        return {
            'success': True,
            'content_id': content.id,
            'scores_before': original_data.get('scores_before', {}),
            'scores_after': scores_after,
            'word_count_before': original_data.get('word_count', 0),
            'word_count_after': content.word_count,
            'html_content': optimized_html,
            'meta_title': optimized_meta_title,
            'meta_description': optimized_meta_description,
        }

    # Helper methods
    def _ensure_dict(self, data: Any) -> Dict[str, Any]:
        if isinstance(data, dict):
            return data
        raise ValueError("AI response must be a JSON object")

    def _extract_json_object(self, text: str) -> str:
        start = text.find('{')
        end = text.rfind('}')
        if start != -1 and end != -1 and end > start:
            return text[start:end + 1]
        return ''

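The deleted `parse_response`/`_extract_json_object` pair above recovers a JSON object from a chatty model reply by slicing from the first `{` to the last `}` before retrying `json.loads`. The same fallback as a standalone function:

```python
import json

def extract_json_object(text: str) -> str:
    """Same slice the deleted helper uses: first '{' through last '}'."""
    start = text.find('{')
    end = text.rfind('}')
    if start != -1 and end != -1 and end > start:
        return text[start:end + 1]
    return ''

raw = 'Here is the optimized draft:\n{"meta_title": "New"}\nThanks!'
cleaned = extract_json_object(raw)
data = json.loads(cleaned)
```

The slice is deliberately greedy (`rfind`), so nested objects survive, but it will also swallow any stray `}` after the payload; the caller guards against that by letting `json.loads` raise on the cleaned string.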
@@ -1,2 +0,0 @@
|
||||
# AI functions tests
|
||||
|
||||
@@ -1,179 +0,0 @@
"""
Tests for OptimizeContentFunction
"""
from unittest.mock import Mock, patch, MagicMock
from django.test import TestCase
from igny8_core.business.content.models import Content
from igny8_core.ai.functions.optimize_content import OptimizeContentFunction
from igny8_core.api.tests.test_integration_base import IntegrationTestBase


class OptimizeContentFunctionTests(IntegrationTestBase):
    """Tests for OptimizeContentFunction"""

    def setUp(self):
        super().setUp()
        self.function = OptimizeContentFunction()

        # Create test content
        self.content = Content.objects.create(
            account=self.account,
            site=self.site,
            sector=self.sector,
            title="Test Content",
            html_content="<p>This is test content.</p>",
            meta_title="Test Title",
            meta_description="Test description",
            primary_keyword="test keyword",
            word_count=500,
            status='draft'
        )

    def test_function_validation_phase(self):
        """Test validation phase"""
        # Valid payload
        result = self.function.validate({'ids': [self.content.id]}, self.account)
        self.assertTrue(result['valid'])

        # Invalid payload - missing ids
        result = self.function.validate({}, self.account)
        self.assertFalse(result['valid'])
        self.assertIn('error', result)

    def test_function_prep_phase(self):
        """Test prep phase"""
        payload = {'ids': [self.content.id]}

        data = self.function.prepare(payload, self.account)

        self.assertIn('content', data)
        self.assertIn('scores_before', data)
        self.assertIn('html_content', data)
        self.assertEqual(data['content'].id, self.content.id)

    def test_function_prep_phase_content_not_found(self):
        """Test prep phase with non-existent content"""
        payload = {'ids': [99999]}

        with self.assertRaises(ValueError):
            self.function.prepare(payload, self.account)

    @patch('igny8_core.ai.functions.optimize_content.PromptRegistry.get_prompt')
    def test_function_build_prompt(self, mock_get_prompt):
        """Test prompt building"""
        mock_get_prompt.return_value = "Test prompt"

        data = {
            'content': self.content,
            'html_content': '<p>Test</p>',
            'meta_title': 'Title',
            'meta_description': 'Description',
            'primary_keyword': 'keyword',
            'scores_before': {'overall_score': 50.0}
        }

        prompt = self.function.build_prompt(data, self.account)

        self.assertEqual(prompt, "Test prompt")
        mock_get_prompt.assert_called_once()
        # Check that context was passed
        call_args = mock_get_prompt.call_args
        self.assertIn('context', call_args.kwargs)

    def test_function_parse_response_valid_json(self):
        """Test parsing valid JSON response"""
        response = '{"html_content": "<p>Optimized</p>", "meta_title": "New Title"}'

        parsed = self.function.parse_response(response)

        self.assertIn('html_content', parsed)
        self.assertEqual(parsed['html_content'], "<p>Optimized</p>")
        self.assertEqual(parsed['meta_title'], "New Title")

    def test_function_parse_response_invalid_json(self):
        """Test parsing invalid JSON response"""
        response = "This is not JSON"

        with self.assertRaises(ValueError):
            self.function.parse_response(response)

    def test_function_parse_response_extracts_json_object(self):
        """Test that JSON object is extracted from text"""
        response = 'Some text {"html_content": "<p>Optimized</p>"} more text'

        parsed = self.function.parse_response(response)

        self.assertIn('html_content', parsed)
        self.assertEqual(parsed['html_content'], "<p>Optimized</p>")

    @patch('igny8_core.business.optimization.services.analyzer.ContentAnalyzer.analyze')
    @patch('igny8_core.business.content.services.content_generation_service.ContentGenerationService._count_words')
    def test_function_save_phase(self, mock_count_words, mock_analyze):
        """Test save phase updates content"""
        mock_count_words.return_value = 600
        mock_analyze.return_value = {
            'seo_score': 75.0,
            'readability_score': 80.0,
            'engagement_score': 70.0,
            'overall_score': 75.0
        }

        parsed = {
            'html_content': '<p>Optimized content.</p>',
            'meta_title': 'Optimized Title',
            'meta_description': 'Optimized Description'
        }

        original_data = {
            'content': self.content,
            'scores_before': {'overall_score': 50.0},
            'word_count': 500
        }

        result = self.function.save_output(parsed, original_data, self.account)

        self.assertTrue(result['success'])
        self.assertEqual(result['content_id'], self.content.id)

        # Refresh content from DB
        self.content.refresh_from_db()
        self.assertEqual(self.content.html_content, '<p>Optimized content.</p>')
        self.assertEqual(self.content.optimizer_version, 1)
        self.assertIsNotNone(self.content.optimization_scores)

    def test_function_handles_invalid_content_id(self):
        """Test that function handles invalid content ID"""
        payload = {'ids': [99999]}

        with self.assertRaises(ValueError):
            self.function.prepare(payload, self.account)

    def test_function_respects_account_isolation(self):
        """Test that function respects account isolation"""
        from igny8_core.auth.models import Account
        other_account = Account.objects.create(
            name="Other Account",
            slug="other",
            plan=self.plan,
            owner=self.user
        )

        payload = {'ids': [self.content.id]}

        # Should not find content from different account
        with self.assertRaises(ValueError):
            self.function.prepare(payload, other_account)

    def test_get_name(self):
        """Test get_name method"""
        self.assertEqual(self.function.get_name(), 'optimize_content')

    def test_get_metadata(self):
        """Test get_metadata method"""
        metadata = self.function.get_metadata()

        self.assertIn('display_name', metadata)
        self.assertIn('description', metadata)
        self.assertIn('phases', metadata)
        self.assertEqual(metadata['display_name'], 'Optimize Content')
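The `@patch` decorators in `test_function_save_phase` patch each name where it is *looked up*, not where it is defined. A minimal standalone sketch of the same pattern (hypothetical `count_words`/`save_output` stand-ins, not the project's real services):

```python
from unittest.mock import patch

def count_words(html: str) -> int:
    # Stand-in for the real word counter (e.g. a service method).
    return len(html.split())

def save_output(html: str) -> dict:
    # The code under test resolves count_words through this module's
    # namespace, so patching f'{__name__}.count_words' intercepts the call.
    return {'word_count': count_words(html)}

with patch(f'{__name__}.count_words', return_value=600):
    result = save_output('<p>short</p>')

print(result)  # {'word_count': 600}
```

Outside the `with` block the original function is restored, which is why each test can mock independently without tearDown bookkeeping.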
@@ -1,39 +0,0 @@
# Generated by Django 5.2.8 on 2025-11-20 23:27

from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='AITaskLog',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('task_id', models.CharField(blank=True, db_index=True, max_length=255, null=True)),
                ('function_name', models.CharField(db_index=True, max_length=100)),
                ('phase', models.CharField(default='INIT', max_length=50)),
                ('message', models.TextField(blank=True)),
                ('status', models.CharField(choices=[('success', 'Success'), ('error', 'Error'), ('pending', 'Pending')], default='pending', max_length=20)),
                ('duration', models.IntegerField(blank=True, help_text='Duration in milliseconds', null=True)),
                ('cost', models.DecimalField(decimal_places=6, default=0.0, max_digits=10)),
                ('tokens', models.IntegerField(default=0)),
                ('request_steps', models.JSONField(blank=True, default=list)),
                ('response_steps', models.JSONField(blank=True, default=list)),
                ('error', models.TextField(blank=True, null=True)),
                ('payload', models.JSONField(blank=True, null=True)),
                ('result', models.JSONField(blank=True, null=True)),
            ],
            options={
                'db_table': 'igny8_ai_task_logs',
                'ordering': ['-created_at'],
            },
        ),
    ]
@@ -1,34 +0,0 @@
# Generated by Django 5.2.8 on 2025-11-20 23:27

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('ai', '0001_initial'),
        ('igny8_core_auth', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='aitasklog',
            name='account',
            field=models.ForeignKey(db_column='tenant_id', on_delete=django.db.models.deletion.CASCADE, related_name='%(class)s_set', to='igny8_core_auth.account'),
        ),
        migrations.AddIndex(
            model_name='aitasklog',
            index=models.Index(fields=['task_id'], name='igny8_ai_ta_task_id_310356_idx'),
        ),
        migrations.AddIndex(
            model_name='aitasklog',
            index=models.Index(fields=['function_name', 'account'], name='igny8_ai_ta_functio_0e5a30_idx'),
        ),
        migrations.AddIndex(
            model_name='aitasklog',
            index=models.Index(fields=['status', 'created_at'], name='igny8_ai_ta_status_ed93b5_idx'),
        ),
    ]
@@ -1,339 +0,0 @@
"""
Model Registry Service
Central registry for AI model configurations with caching.
Replaces hardcoded MODEL_RATES and IMAGE_MODEL_RATES from constants.py

This service provides:
- Database-driven model configuration (from AIModelConfig)
- Fallback to constants.py for backward compatibility
- Caching for performance
- Cost calculation methods

Usage:
    from igny8_core.ai.model_registry import ModelRegistry

    # Get model config
    model = ModelRegistry.get_model('gpt-4o-mini')

    # Get rate
    input_rate = ModelRegistry.get_rate('gpt-4o-mini', 'input')

    # Calculate cost
    cost = ModelRegistry.calculate_cost('gpt-4o-mini', input_tokens=1000, output_tokens=500)
"""
import logging
from decimal import Decimal
from typing import Optional, Dict, Any
from django.core.cache import cache

logger = logging.getLogger(__name__)

# Cache TTL in seconds (5 minutes)
MODEL_CACHE_TTL = 300

# Cache key prefix
CACHE_KEY_PREFIX = 'ai_model_'


class ModelRegistry:
    """
    Central registry for AI model configurations with caching.
    Uses AIModelConfig from database with fallback to constants.py
    """

    @classmethod
    def _get_cache_key(cls, model_id: str) -> str:
        """Generate cache key for model"""
        return f"{CACHE_KEY_PREFIX}{model_id}"

    @classmethod
    def _get_from_db(cls, model_id: str) -> Optional[Any]:
        """Get model config from database"""
        try:
            from igny8_core.business.billing.models import AIModelConfig
            return AIModelConfig.objects.filter(
                model_name=model_id,
                is_active=True
            ).first()
        except Exception as e:
            logger.debug(f"Could not fetch model {model_id} from DB: {e}")
            return None

    @classmethod
    def _get_from_constants(cls, model_id: str) -> Optional[Dict[str, Any]]:
        """
        Get model config from constants.py as fallback.
        Returns a dict mimicking AIModelConfig attributes.
        """
        from igny8_core.ai.constants import MODEL_RATES, IMAGE_MODEL_RATES

        # Check text models first
        if model_id in MODEL_RATES:
            rates = MODEL_RATES[model_id]
            return {
                'model_name': model_id,
                'display_name': model_id,
                'model_type': 'text',
                'provider': 'openai',
                'input_cost_per_1m': Decimal(str(rates.get('input', 0))),
                'output_cost_per_1m': Decimal(str(rates.get('output', 0))),
                'cost_per_image': None,
                'is_active': True,
                '_from_constants': True
            }

        # Check image models
        if model_id in IMAGE_MODEL_RATES:
            cost = IMAGE_MODEL_RATES[model_id]
            return {
                'model_name': model_id,
                'display_name': model_id,
                'model_type': 'image',
                'provider': 'openai' if 'dall-e' in model_id else 'runware',
                'input_cost_per_1m': None,
                'output_cost_per_1m': None,
                'cost_per_image': Decimal(str(cost)),
                'is_active': True,
                '_from_constants': True
            }

        return None

    @classmethod
    def get_model(cls, model_id: str) -> Optional[Any]:
        """
        Get model configuration by model_id.

        Order of lookup:
        1. Cache
        2. Database (AIModelConfig)
        3. constants.py fallback

        Args:
            model_id: The model identifier (e.g., 'gpt-4o-mini', 'dall-e-3')

        Returns:
            AIModelConfig instance or dict with model config, None if not found
        """
        cache_key = cls._get_cache_key(model_id)

        # Try cache first
        cached = cache.get(cache_key)
        if cached is not None:
            return cached

        # Try database
        model_config = cls._get_from_db(model_id)

        if model_config:
            cache.set(cache_key, model_config, MODEL_CACHE_TTL)
            return model_config

        # Fallback to constants
        fallback = cls._get_from_constants(model_id)
        if fallback:
            cache.set(cache_key, fallback, MODEL_CACHE_TTL)
            return fallback

        logger.warning(f"Model {model_id} not found in DB or constants")
        return None

    @classmethod
    def get_rate(cls, model_id: str, rate_type: str) -> Decimal:
        """
        Get specific rate for a model.

        Args:
            model_id: The model identifier
            rate_type: 'input', 'output' (for text models) or 'image' (for image models)

        Returns:
            Decimal rate value, 0 if not found
        """
        model = cls.get_model(model_id)
        if not model:
            return Decimal('0')

        # Handle dict (from constants fallback)
        if isinstance(model, dict):
            if rate_type == 'input':
                return model.get('input_cost_per_1m') or Decimal('0')
            elif rate_type == 'output':
                return model.get('output_cost_per_1m') or Decimal('0')
            elif rate_type == 'image':
                return model.get('cost_per_image') or Decimal('0')
            return Decimal('0')

        # Handle AIModelConfig instance
        if rate_type == 'input':
            return model.input_cost_per_1m or Decimal('0')
        elif rate_type == 'output':
            return model.output_cost_per_1m or Decimal('0')
        elif rate_type == 'image':
            return model.cost_per_image or Decimal('0')

        return Decimal('0')

    @classmethod
    def calculate_cost(cls, model_id: str, input_tokens: int = 0, output_tokens: int = 0, num_images: int = 0) -> Decimal:
        """
        Calculate cost for model usage.

        For text models: Uses input/output token counts
        For image models: Uses num_images

        Args:
            model_id: The model identifier
            input_tokens: Number of input tokens (for text models)
            output_tokens: Number of output tokens (for text models)
            num_images: Number of images (for image models)

        Returns:
            Decimal cost in USD
        """
        model = cls.get_model(model_id)
        if not model:
            return Decimal('0')

        # Determine model type
        model_type = model.get('model_type') if isinstance(model, dict) else model.model_type

        if model_type == 'text':
            input_rate = cls.get_rate(model_id, 'input')
            output_rate = cls.get_rate(model_id, 'output')

            cost = (
                (Decimal(input_tokens) * input_rate) +
                (Decimal(output_tokens) * output_rate)
            ) / Decimal('1000000')

            return cost

        elif model_type == 'image':
            image_rate = cls.get_rate(model_id, 'image')
            return image_rate * Decimal(num_images)

        return Decimal('0')

    @classmethod
    def get_default_model(cls, model_type: str = 'text') -> Optional[str]:
        """
        Get the default model for a given type.

        Args:
            model_type: 'text' or 'image'

        Returns:
            model_id string or None
        """
        try:
            from igny8_core.business.billing.models import AIModelConfig
            default = AIModelConfig.objects.filter(
                model_type=model_type,
                is_active=True,
                is_default=True
            ).first()

            if default:
                return default.model_name
        except Exception as e:
            logger.debug(f"Could not get default {model_type} model from DB: {e}")

        # Fallback to constants
        from igny8_core.ai.constants import DEFAULT_AI_MODEL
        if model_type == 'text':
            return DEFAULT_AI_MODEL
        elif model_type == 'image':
            return 'dall-e-3'

        return None

    @classmethod
    def list_models(cls, model_type: Optional[str] = None, provider: Optional[str] = None) -> list:
        """
        List all available models, optionally filtered by type or provider.

        Args:
            model_type: Filter by 'text', 'image', or 'embedding'
            provider: Filter by 'openai', 'anthropic', 'runware', etc.

        Returns:
            List of model configs
        """
        models = []

        try:
            from igny8_core.business.billing.models import AIModelConfig
            queryset = AIModelConfig.objects.filter(is_active=True)

            if model_type:
                queryset = queryset.filter(model_type=model_type)
            if provider:
                queryset = queryset.filter(provider=provider)

            models = list(queryset.order_by('sort_order', 'model_name'))
        except Exception as e:
            logger.debug(f"Could not list models from DB: {e}")

        # Add models from constants if not in DB
        if not models:
            from igny8_core.ai.constants import MODEL_RATES, IMAGE_MODEL_RATES

            if model_type in (None, 'text'):
                for model_id in MODEL_RATES:
                    fallback = cls._get_from_constants(model_id)
                    if fallback:
                        models.append(fallback)

            if model_type in (None, 'image'):
                for model_id in IMAGE_MODEL_RATES:
                    fallback = cls._get_from_constants(model_id)
                    if fallback:
                        models.append(fallback)

        return models

    @classmethod
    def clear_cache(cls, model_id: Optional[str] = None):
        """
        Clear model cache.

        Args:
            model_id: Clear specific model cache, or all if None
        """
        if model_id:
            cache.delete(cls._get_cache_key(model_id))
        else:
            # Clear all model caches - use pattern if available
            try:
                from django.core.cache import caches
                default_cache = caches['default']
                if hasattr(default_cache, 'delete_pattern'):
                    default_cache.delete_pattern(f"{CACHE_KEY_PREFIX}*")
                else:
                    # Fallback: clear known models
                    from igny8_core.ai.constants import MODEL_RATES, IMAGE_MODEL_RATES
                    for model_id in list(MODEL_RATES.keys()) + list(IMAGE_MODEL_RATES.keys()):
                        cache.delete(cls._get_cache_key(model_id))
            except Exception as e:
                logger.warning(f"Could not clear all model caches: {e}")

    @classmethod
    def validate_model(cls, model_id: str) -> bool:
        """
        Check if a model ID is valid and active.

        Args:
            model_id: The model identifier to validate

        Returns:
            True if model exists and is active, False otherwise
        """
        model = cls.get_model(model_id)
        if not model:
            return False

        # Check if active
        if isinstance(model, dict):
            return model.get('is_active', True)
        return model.is_active
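The per-1M-token arithmetic in `calculate_cost` can be checked in isolation; a minimal sketch with hypothetical rates (illustrative values, not the project's real `MODEL_RATES`):

```python
from decimal import Decimal

def calculate_text_cost(input_tokens, output_tokens, input_rate_per_1m, output_rate_per_1m):
    # Rates are USD per 1,000,000 tokens, mirroring the
    # input_cost_per_1m / output_cost_per_1m fields above.
    return (
        Decimal(input_tokens) * input_rate_per_1m
        + Decimal(output_tokens) * output_rate_per_1m
    ) / Decimal('1000000')

# e.g. 1000 input tokens at $0.15/1M plus 500 output tokens at $0.60/1M
cost = calculate_text_cost(1000, 500, Decimal('0.15'), Decimal('0.60'))
print(cost)  # 0.00045
```

Using `Decimal` rather than `float` keeps the six-decimal-place cost values exact, which matters once per-call costs are summed into the `cost` DecimalField on `AITaskLog`.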
@@ -1,9 +1,9 @@
 """
 Prompt Registry - Centralized prompt management with override hierarchy
-Supports: task-level overrides → DB prompts → GlobalAIPrompt (REQUIRED)
+Supports: task-level overrides → DB prompts → default fallbacks
 """
 import logging
-from typing import Dict, Any, Optional, Tuple
+from typing import Dict, Any, Optional
 from django.db import models

 logger = logging.getLogger(__name__)
@@ -14,12 +14,259 @@ class PromptRegistry:
    Centralized prompt registry with hierarchical resolution:
    1. Task-level prompt_override (if exists)
    2. DB prompt for (account, function)
-   3. GlobalAIPrompt (REQUIRED - no hardcoded fallbacks)
+   3. Default fallback from registry
    """

    # Default prompts stored in registry
    DEFAULT_PROMPTS = {
        'clustering': """You are a semantic strategist and SEO architecture engine. Your task is to analyze the provided keyword list and group them into meaningful, intent-driven topic clusters that reflect how real users search, think, and act online.

-# Removed ALL hardcoded prompts - GlobalAIPrompt is now the ONLY source of default prompts
-# To add/modify prompts, use Django admin: /admin/system/globalaiprompt/
Return a single JSON object with a "clusters" array. Each cluster must follow this structure:

{
  "name": "[Descriptive cluster name — natural, SEO-relevant, clearly expressing the topic]",
  "description": "[1–2 concise sentences explaining what this cluster covers and why these keywords belong together]",
  "keywords": ["keyword 1", "keyword 2", "keyword 3", "..."]
}

CLUSTERING STRATEGY:

1. Keyword-first, structure-follows:
   - Do NOT rely on assumed categories or existing content structures.
   - Begin purely from the meaning, intent, and behavioral connection between keywords.

2. Use multi-dimensional grouping logic:
   - Group keywords by these behavioral dimensions:
     • Search Intent → informational, commercial, transactional, navigational
     • Use-Case or Problem → what the user is trying to achieve or solve
     • Function or Feature → how something works or what it does
     • Persona or Audience → who the content or product serves
     • Context → location, time, season, platform, or device
   - Combine 2–3 dimensions naturally where they make sense.

3. Model real search behavior:
   - Favor clusters that form natural user journeys such as:
     • Problem ➝ Solution
     • General ➝ Specific
     • Product ➝ Use-case
     • Buyer ➝ Benefit
     • Tool ➝ Function
     • Task ➝ Method
   - Each cluster should feel like a real topic hub users would explore in depth.

4. Avoid superficial groupings:
   - Do not cluster keywords just because they share words.
   - Do not force-fit outliers or unrelated keywords.
   - Exclude keywords that don't logically connect to any cluster.

5. Quality rules:
   - Each cluster should include between 3–10 strongly related keywords.
   - Never duplicate a keyword across multiple clusters.
   - Prioritize semantic strength, search intent, and usefulness for SEO-driven content structure.
   - It's better to output fewer, high-quality clusters than many weak or shallow ones.

INPUT FORMAT:
{
  "keywords": [IGNY8_KEYWORDS]
}

OUTPUT FORMAT:
Return ONLY the final JSON object in this format:
{
  "clusters": [
    {
      "name": "...",
      "description": "...",
      "keywords": ["...", "...", "..."]
    }
  ]
}

Do not include any explanations, text, or commentary outside the JSON output.
""",

        'ideas': """Generate SEO-optimized, high-quality content ideas and outlines for each keyword cluster.
Input:
Clusters: [IGNY8_CLUSTERS]
Keywords: [IGNY8_CLUSTER_KEYWORDS]

Output: JSON with "ideas" array.
Each cluster → 1 cluster_hub + 2–4 supporting ideas.
Each idea must include:
title, description, content_type, content_structure, cluster_id, estimated_word_count (1500–2200), and covered_keywords.

Outline Rules:

Intro: 1 hook (30–40 words) + 2 intro paragraphs (50–60 words each).

5–8 H2 sections, each with 2–3 H3s.

Each H2 ≈ 250–300 words, mixed content (paragraphs, lists, tables, blockquotes).

Vary section format and tone; no bullets or lists at start.

Tables have columns; blockquotes = expert POV or data insight.

Use depth, examples, and real context.

Avoid repetitive structure.

Tone: Professional editorial flow. No generic phrasing. Use varied sentence openings and realistic examples.

Output JSON Example:

{
  "ideas": [
    {
      "title": "Best Organic Cotton Duvet Covers for All Seasons",
      "description": {
        "introduction": {
          "hook": "Transform your sleep with organic cotton that blends comfort and sustainability.",
          "paragraphs": [
            {"content_type": "paragraph", "details": "Overview of organic cotton's rise in bedding industry."},
            {"content_type": "paragraph", "details": "Why consumers prefer organic bedding over synthetic alternatives."}
          ]
        },
        "H2": [
          {
            "heading": "Why Choose Organic Cotton for Bedding?",
            "subsections": [
              {"subheading": "Health and Skin Benefits", "content_type": "paragraph", "details": "Discuss hypoallergenic and chemical-free aspects."},
              {"subheading": "Environmental Sustainability", "content_type": "list", "details": "Eco benefits like low water use, no pesticides."},
              {"subheading": "Long-Term Cost Savings", "content_type": "table", "details": "Compare durability and pricing over time."}
            ]
          }
        ]
      },
      "content_type": "post",
      "content_structure": "review",
      "cluster_id": 12,
      "estimated_word_count": 1800,
      "covered_keywords": "organic duvet covers, eco-friendly bedding, sustainable sheets"
    }
  ]
}""",

        'content_generation': """You are an editorial content strategist. Your task is to generate a complete JSON response object that includes all the fields listed below, based on the provided content idea, keyword cluster, and keyword list.

Only the `content` field should contain HTML inside JSON object.

==================
Generate a complete JSON response object matching this structure:
==================

{
  "title": "[Blog title using the primary keyword — full sentence case]",
  "meta_title": "[Meta title under 60 characters — natural, optimized, and compelling]",
  "meta_description": "[Meta description under 160 characters — clear and enticing summary]",
  "content": "[HTML content — full editorial structure with <p>, <h2>, <h3>, <ul>, <ol>, <table>]",
  "word_count": [Exact integer — word count of HTML body only],
  "primary_keyword": "[Single primary keyword used in title and first paragraph]",
  "secondary_keywords": [
    "[Keyword 1]",
    "[Keyword 2]",
    "[Keyword 3]"
  ],
  "tags": [
    "[2–4 word lowercase tag 1]",
    "[2–4 word lowercase tag 2]",
    "[2–4 word lowercase tag 3]",
    "[2–4 word lowercase tag 4]",
    "[2–4 word lowercase tag 5]"
  ],
  "categories": [
    "[Parent Category > Child Category]",
    "[Optional Second Category > Optional Subcategory]"
  ]
}

===========================
CONTENT FLOW RULES
===========================

**INTRODUCTION:**
- Start with 1 italicized hook (30–40 words)
- Follow with 2 narrative paragraphs (each 50–60 words; 2–3 sentences max)
- No headings allowed in intro

**H2 SECTIONS (5–8 total):**
Each section should be 250–300 words and follow this format:
1. Two narrative paragraphs (80–120 words each, 2–3 sentences)
2. One list or table (must come *after* a paragraph)
3. Optional closing paragraph (40–60 words)
4. Insert 2–3 subsections naturally after main paragraphs

**Formatting Rules:**
- Vary use of unordered lists, ordered lists, and tables across sections
- Never begin any section or sub-section with a list or table

===========================
KEYWORD & SEO RULES
===========================

- **Primary keyword** must appear in:
  - The title
  - First paragraph of the introduction
  - At least 2 H2 headings

- **Secondary keywords** must be used naturally, not forced

- **Tone & style guidelines:**
  - No robotic or passive voice
  - Avoid generic intros like "In today's world…"
  - Don't repeat heading in opening sentence
  - Vary sentence structure and length

===========================
INPUT VARIABLES
===========================

CONTENT IDEA DETAILS:
[IGNY8_IDEA]

KEYWORD CLUSTER:
[IGNY8_CLUSTER]

ASSOCIATED KEYWORDS:
[IGNY8_KEYWORDS]

===========================
OUTPUT FORMAT
===========================

Return ONLY the final JSON object.
Do NOT include any comments, formatting, or explanations.""",

        'image_prompt_extraction': """Extract image prompts from the following article content.

ARTICLE TITLE: {title}

ARTICLE CONTENT:
{content}

Extract image prompts for:
1. Featured Image: One main image that represents the article topic
2. In-Article Images: Up to {max_images} images that would be useful within the article content

Return a JSON object with this structure:
{{
  "featured_prompt": "Detailed description of the featured image",
  "in_article_prompts": [
    "Description of first in-article image",
    "Description of second in-article image",
    ...
  ]
}}

Make sure each prompt is detailed enough for image generation, describing the visual elements, style, mood, and composition.""",

        'image_prompt_template': 'Create a high-quality {image_type} image to use as a featured photo for a blog post titled "{post_title}". The image should visually represent the theme, mood, and subject implied by the image prompt: {image_prompt}. Focus on a realistic, well-composed scene that naturally communicates the topic without text or logos. Use balanced lighting, pleasing composition, and photographic detail suitable for lifestyle or editorial web content. Avoid adding any visible or readable text, brand names, or illustrative effects. **And make sure image is not blurry.**',

        'negative_prompt': 'text, watermark, logo, overlay, title, caption, writing on walls, writing on objects, UI, infographic elements, post title',
    }

    # Mapping from function names to prompt types
    FUNCTION_TO_PROMPT_TYPE = {
        'auto_cluster': 'clustering',
@@ -28,121 +275,8 @@ class PromptRegistry:
        'generate_images': 'image_prompt_extraction',
        'extract_image_prompts': 'image_prompt_extraction',
        'generate_image_prompts': 'image_prompt_extraction',
        'generate_site_structure': 'site_structure_generation',
        'optimize_content': 'optimize_content',
        # Phase 8: Universal Content Types
        'generate_product_content': 'product_generation',
        'generate_service_page': 'service_generation',
        'generate_taxonomy': 'taxonomy_generation',
    }

    # Mapping of prompt types to their prefix numbers and display names
    # Format: {prompt_type: (number, display_name)}
    # GP = Global Prompt, CP = Custom Prompt
    PROMPT_PREFIX_MAP = {
        'clustering': ('01', 'Clustering'),
        'ideas': ('02', 'Ideas'),
        'content_generation': ('03', 'ContentGen'),
        'image_prompt_extraction': ('04', 'ImagePrompts'),
        'site_structure_generation': ('05', 'SiteStructure'),
        'optimize_content': ('06', 'OptimizeContent'),
        'product_generation': ('07', 'ProductGen'),
        'service_generation': ('08', 'ServiceGen'),
        'taxonomy_generation': ('09', 'TaxonomyGen'),
        'image_prompt_template': ('10', 'ImageTemplate'),
        'negative_prompt': ('11', 'NegativePrompt'),
    }

    @classmethod
    def get_prompt_prefix(cls, prompt_type: str, is_custom: bool) -> str:
        """
        Generate prompt prefix for tracking.

        Args:
            prompt_type: The prompt type (e.g., 'clustering', 'ideas')
            is_custom: True if using custom/account-specific prompt, False if global

        Returns:
            Prefix string like "##GP01-Clustering" or "##CP01-Clustering"
        """
        prefix_info = cls.PROMPT_PREFIX_MAP.get(prompt_type, ('00', prompt_type.title()))
        number, display_name = prefix_info
        prefix_type = 'CP' if is_custom else 'GP'
        return f"##{prefix_type}{number}-{display_name}"

    @classmethod
    def get_prompt_with_metadata(
        cls,
        function_name: str,
        account: Optional[Any] = None,
        task: Optional[Any] = None,
        context: Optional[Dict[str, Any]] = None
    ) -> Tuple[str, bool, str]:
        """
        Get prompt for a function with metadata about its source.

        Priority:
        1. task.prompt_override (if task provided and has override)
        2. DB prompt for (account, function) - marked as custom if is_customized=True
        3. GlobalAIPrompt (REQUIRED - no hardcoded fallbacks)

        Args:
            function_name: AI function name (e.g., 'auto_cluster', 'generate_ideas')
            account: Account object (optional)
            task: Task object with optional prompt_override (optional)
            context: Additional context for prompt rendering (optional)

        Returns:
            Tuple of (prompt_string, is_custom, prompt_type)
            - prompt_string: The rendered prompt
            - is_custom: True if using a custom/account prompt, False if global
            - prompt_type: The prompt type identifier
        """
        # Step 1: Resolve the prompt type
        prompt_type = cls.FUNCTION_TO_PROMPT_TYPE.get(function_name, function_name)

        # Step 2: Check task-level override (always considered custom)
        if task and hasattr(task, 'prompt_override') and task.prompt_override:
            logger.info(f"Using task-level prompt override for {function_name}")
            prompt = task.prompt_override
            return cls._render_prompt(prompt, context or {}), True, prompt_type

        # Step 3: Try account-specific DB prompt
        if account:
            try:
                from igny8_core.modules.system.models import AIPrompt
                db_prompt = AIPrompt.objects.get(
                    account=account,
                    prompt_type=prompt_type,
                    is_active=True
                )
                # Check whether the prompt has been customized
                is_custom = db_prompt.is_customized
                logger.info(f"Using {'customized' if is_custom else 'default'} account prompt for {function_name} (account {account.id})")
                prompt = db_prompt.prompt_value
                return cls._render_prompt(prompt, context or {}), is_custom, prompt_type
            except Exception as e:
                logger.debug(f"No account-specific prompt found for {function_name}: {e}")

        # Step 4: Fall back to GlobalAIPrompt (platform-wide default) - REQUIRED
        try:
            from igny8_core.modules.system.global_settings_models import GlobalAIPrompt
            global_prompt = GlobalAIPrompt.objects.get(
                prompt_type=prompt_type,
                is_active=True
            )
            logger.info(f"Using global default prompt for {function_name} from GlobalAIPrompt")
            prompt = global_prompt.prompt_value
            return cls._render_prompt(prompt, context or {}), False, prompt_type
        except Exception as e:
            error_msg = (
                f"ERROR: Global prompt '{prompt_type}' not found for function '{function_name}'. "
                f"Please configure it in Django admin at: /admin/system/globalaiprompt/. "
                f"Error: {e}"
            )
            logger.error(error_msg)
            raise ValueError(error_msg)

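The three-tier resolution above (task override → account prompt → global prompt) can be sketched without Django, using plain values in place of model instances — the names here are illustrative, not the project's actual API:

```python
from typing import Optional, Tuple

def resolve_prompt(task_override: Optional[str],
                   account_prompt: Optional[Tuple[str, bool]],
                   global_prompt: Optional[str]) -> Tuple[str, bool]:
    """Return (prompt, is_custom) following the registry's priority order."""
    # 1. A task-level override always wins and is always treated as custom.
    if task_override:
        return task_override, True
    # 2. An account prompt is custom only if it was actually customized.
    if account_prompt is not None:
        value, is_customized = account_prompt
        return value, is_customized
    # 3. The global prompt is required; there is no hardcoded fallback.
    if global_prompt is None:
        raise ValueError("Global prompt not configured")
    return global_prompt, False

print(resolve_prompt(None, ('account text', True), 'global text'))  # ('account text', True)
print(resolve_prompt(None, None, 'global text'))                    # ('global text', False)
```

Note the design choice the sketch mirrors: an account prompt that exists but is not customized still resolves, yet reports `is_custom=False`, so tracking prefixes stay `##GP…`.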
    @classmethod
    def get_prompt(
        cls,
@@ -153,23 +287,51 @@ class PromptRegistry:
    ) -> str:
        """
        Get prompt for a function with hierarchical resolution.

        Priority:
        1. task.prompt_override (if task provided and has override)
        2. DB prompt for (account, function)
        3. GlobalAIPrompt (REQUIRED - no hardcoded fallbacks)
        3. Default fallback from registry

        Args:
            function_name: AI function name (e.g., 'auto_cluster', 'generate_ideas')
            account: Account object (optional)
            task: Task object with optional prompt_override (optional)
            context: Additional context for prompt rendering (optional)

        Returns:
            Prompt string ready for formatting
        """
        prompt, _, _ = cls.get_prompt_with_metadata(function_name, account, task, context)
        return prompt
        # Step 1: Check task-level override
        if task and hasattr(task, 'prompt_override') and task.prompt_override:
            logger.info(f"Using task-level prompt override for {function_name}")
            prompt = task.prompt_override
            return cls._render_prompt(prompt, context or {})

        # Step 2: Get prompt type
        prompt_type = cls.FUNCTION_TO_PROMPT_TYPE.get(function_name, function_name)

        # Step 3: Try DB prompt
        if account:
            try:
                from igny8_core.modules.system.models import AIPrompt
                db_prompt = AIPrompt.objects.get(
                    account=account,
                    prompt_type=prompt_type,
                    is_active=True
                )
                logger.info(f"Using DB prompt for {function_name} (account {account.id})")
                prompt = db_prompt.prompt_value
                return cls._render_prompt(prompt, context or {})
            except Exception as e:
                logger.debug(f"No DB prompt found for {function_name}: {e}")

        # Step 4: Use default fallback
        prompt = cls.DEFAULT_PROMPTS.get(prompt_type, '')
        if not prompt:
            logger.warning(f"No default prompt found for {prompt_type}, using empty string")

        return cls._render_prompt(prompt, context or {})

    @classmethod
    def _render_prompt(cls, prompt_template: str, context: Dict[str, Any]) -> str:
@@ -208,7 +370,7 @@ class PromptRegistry:
        if '{' in rendered and '}' in rendered:
            try:
                rendered = rendered.format(**normalized_context)
            except (KeyError, ValueError, IndexError) as e:
            except (KeyError, ValueError) as e:
                # If .format() fails, log a warning but keep the [IGNY8_*] replacements
                logger.warning(f"Failed to format prompt with .format(): {e}. Using [IGNY8_*] replacements only.")

@@ -235,17 +397,8 @@ class PromptRegistry:
        except Exception:
            pass

        # Try GlobalAIPrompt
        try:
            from igny8_core.modules.system.global_settings_models import GlobalAIPrompt
            global_prompt = GlobalAIPrompt.objects.get(
                prompt_type=prompt_type,
                is_active=True
            )
            return global_prompt.prompt_value
        except Exception:
            # Fallback for image_prompt_template
            return '{image_type} image for blog post titled "{post_title}": {image_prompt}'
        # Use default
        return cls.DEFAULT_PROMPTS.get(prompt_type, '')

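The fallback template above carries three placeholders — `{image_type}`, `{post_title}`, `{image_prompt}` — and rendering it is plain `str.format`, as a quick sketch shows:

```python
# Rendering the fallback image prompt template with str.format.
template = '{image_type} image for blog post titled "{post_title}": {image_prompt}'
rendered = template.format(
    image_type='realistic',
    post_title='Desert Hiking 101',
    image_prompt='a hiker at sunrise on a ridge',
)
print(rendered)
# realistic image for blog post titled "Desert Hiking 101": a hiker at sunrise on a ridge
```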
    @classmethod
    def get_negative_prompt(cls, account: Optional[Any] = None) -> str:
@@ -268,17 +421,8 @@ class PromptRegistry:
        except Exception:
            pass

        # Try GlobalAIPrompt
        try:
            from igny8_core.modules.system.global_settings_models import GlobalAIPrompt
            global_prompt = GlobalAIPrompt.objects.get(
                prompt_type=prompt_type,
                is_active=True
            )
            return global_prompt.prompt_value
        except Exception:
            # Fallback for negative_prompt
            return 'text, watermark, logo, overlay, title, caption, writing on walls, writing on objects, UI, infographic elements, post title'
        # Use default
        return cls.DEFAULT_PROMPTS.get(prompt_type, '')

# Convenience function for backward compatibility
@@ -286,61 +430,3 @@ def get_prompt(function_name: str, account=None, task=None, context=None) -> str
    """Get prompt using registry"""
    return PromptRegistry.get_prompt(function_name, account=account, task=task, context=context)


def get_prompt_with_prefix(function_name: str, account=None, task=None, context=None) -> Tuple[str, str]:
    """
    Get prompt with its tracking prefix.

    Args:
        function_name: AI function name
        account: Account object (optional)
        task: Task object with optional prompt_override (optional)
        context: Additional context for prompt rendering (optional)

    Returns:
        Tuple of (prompt_string, prefix_string)
        - prompt_string: The rendered prompt
        - prefix_string: The tracking prefix (e.g., '##GP01-Clustering' or '##CP01-Clustering')
    """
    prompt, is_custom, prompt_type = PromptRegistry.get_prompt_with_metadata(
        function_name, account=account, task=task, context=context
    )
    prefix = PromptRegistry.get_prompt_prefix(prompt_type, is_custom)
    return prompt, prefix


def get_prompt_prefix_for_function(function_name: str, account=None, task=None) -> str:
    """
    Get just the prefix for a function without fetching the full prompt.
    Useful when the prompt was already fetched elsewhere.

    Args:
        function_name: AI function name
        account: Account object (optional)
        task: Task object with optional prompt_override (optional)

    Returns:
        The tracking prefix (e.g., '##GP01-Clustering' or '##CP01-Clustering')
    """
    prompt_type = PromptRegistry.FUNCTION_TO_PROMPT_TYPE.get(function_name, function_name)

    # Check for task-level override (always custom)
    if task and hasattr(task, 'prompt_override') and task.prompt_override:
        return PromptRegistry.get_prompt_prefix(prompt_type, is_custom=True)

    # Check for account-specific prompt
    if account:
        try:
            from igny8_core.modules.system.models import AIPrompt
            db_prompt = AIPrompt.objects.get(
                account=account,
                prompt_type=prompt_type,
                is_active=True
            )
            return PromptRegistry.get_prompt_prefix(prompt_type, is_custom=db_prompt.is_customized)
        except Exception:
            pass

    # Fall back to global (not custom)
    return PromptRegistry.get_prompt_prefix(prompt_type, is_custom=False)

@@ -94,15 +94,9 @@ def _load_generate_image_prompts():
    from igny8_core.ai.functions.generate_image_prompts import GenerateImagePromptsFunction
    return GenerateImagePromptsFunction

def _load_optimize_content():
    """Lazy loader for optimize_content function"""
    from igny8_core.ai.functions.optimize_content import OptimizeContentFunction
    return OptimizeContentFunction

register_lazy_function('auto_cluster', _load_auto_cluster)
register_lazy_function('generate_ideas', _load_generate_ideas)
register_lazy_function('generate_content', _load_generate_content)
register_lazy_function('generate_images', _load_generate_images)
register_lazy_function('generate_image_prompts', _load_generate_image_prompts)
register_lazy_function('optimize_content', _load_optimize_content)

@@ -1,6 +1,6 @@
"""
AI Settings - Centralized model configurations and limits
Uses global settings with optional per-account overrides.
Uses IntegrationSettings only - no hardcoded defaults or fallbacks.
"""
from typing import Dict, Any
import logging
@@ -19,23 +19,18 @@ FUNCTION_ALIASES = {

def get_model_config(function_name: str, account) -> Dict[str, Any]:
    """
    Get model configuration for an AI function.

    Architecture:
    - API keys: ALWAYS from GlobalIntegrationSettings (platform-wide)
    - Model/params: From IntegrationSettings if account has override, else from global
    - Free plan: Cannot override, uses global defaults
    - Starter/Growth/Scale: Can override model, temperature, max_tokens, etc.
    Get model configuration from IntegrationSettings only.
    No fallbacks - the account must have IntegrationSettings configured.

    Args:
        function_name: Name of the AI function
        account: Account instance (required)

    Returns:
        dict: Model configuration with 'model', 'max_tokens', 'temperature', 'api_key'
        dict: Model configuration with 'model', 'max_tokens', 'temperature'

    Raises:
        ValueError: If account not provided or settings not configured
        ValueError: If account not provided or IntegrationSettings not configured
    """
    if not account:
        raise ValueError("Account is required for model configuration")
@@ -43,57 +38,28 @@ def get_model_config(function_name: str, account) -> Dict[str, Any]:
    # Resolve function alias
    actual_name = FUNCTION_ALIASES.get(function_name, function_name)

    # Get IntegrationSettings for OpenAI
    try:
        from igny8_core.modules.system.global_settings_models import GlobalIntegrationSettings
        from igny8_core.modules.system.models import IntegrationSettings

        # Get global settings (for API keys and defaults)
        global_settings = GlobalIntegrationSettings.get_instance()

        if not global_settings.openai_api_key:
            raise ValueError(
                "Platform OpenAI API key not configured. "
                "Please configure GlobalIntegrationSettings in Django admin."
            )

        # Start with global defaults
        model = global_settings.openai_model
        temperature = global_settings.openai_temperature
        max_tokens = global_settings.openai_max_tokens
        api_key = global_settings.openai_api_key  # ALWAYS from global

        # Check if the account has overrides (only for Starter/Growth/Scale plans)
        # Free plan users cannot create IntegrationSettings records
        try:
            account_settings = IntegrationSettings.objects.get(
                account=account,
                integration_type='openai',
                is_active=True
            )

            config = account_settings.config or {}

            # Override model if specified (NULL = use global)
            if config.get('model'):
                model = config['model']

            # Override temperature if specified
            if config.get('temperature') is not None:
                temperature = config['temperature']

            # Override max_tokens if specified
            if config.get('max_tokens'):
                max_tokens = config['max_tokens']

        except IntegrationSettings.DoesNotExist:
            # No account override, use global defaults (already set above)
            pass

    except Exception as e:
        logger.error(f"Could not load OpenAI settings for account {account.id}: {e}")
        integration_settings = IntegrationSettings.objects.get(
            integration_type='openai',
            account=account,
            is_active=True
        )
    except IntegrationSettings.DoesNotExist:
        raise ValueError(
            f"Could not load OpenAI configuration for account {account.id}. "
            f"Please configure GlobalIntegrationSettings."
            f"OpenAI IntegrationSettings not configured for account {account.id}. "
            f"Please configure OpenAI settings in the integration page."
        )

    config = integration_settings.config or {}

    # Get model from config
    model = config.get('model')
    if not model:
        raise ValueError(
            f"Model not configured in IntegrationSettings for account {account.id}. "
            f"Please set 'model' in OpenAI integration settings."
        )

    # Validate model is in our supported list (optional validation)
@@ -105,8 +71,13 @@ def get_model_config(function_name: str, account) -> Dict[str, Any]:
            f"Supported models: {list(MODEL_RATES.keys())}"
        )
    except ImportError:
        # MODEL_RATES not available - skip validation
        pass

    # Get max_tokens and temperature from config (with reasonable defaults for API limits)
    max_tokens = config.get('max_tokens', 4000)  # Reasonable default for API limits
    temperature = config.get('temperature', 0.7)  # Reasonable default

    # Build response format based on model (JSON mode for supported models)
    response_format = None
    try:
@@ -114,6 +85,7 @@ def get_model_config(function_name: str, account) -> Dict[str, Any]:
        if model in JSON_MODE_MODELS:
            response_format = {"type": "json_object"}
    except ImportError:
        # JSON_MODE_MODELS not available - skip
        pass

    return {

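The new strict lookup can be summarized as a pure function over the `IntegrationSettings.config` dict: the model is required, while `max_tokens` and `temperature` get API-safe defaults. A minimal sketch (illustrative helper name, not the project's API):

```python
def build_model_config(config: dict) -> dict:
    """Sketch of the strict config read: 'model' is required;
    max_tokens and temperature fall back to reasonable defaults."""
    model = config.get('model')
    if not model:
        raise ValueError("Model not configured in IntegrationSettings")
    return {
        'model': model,
        'max_tokens': config.get('max_tokens', 4000),
        'temperature': config.get('temperature', 0.7),
    }

print(build_model_config({'model': 'gpt-4o'}))
# {'model': 'gpt-4o', 'max_tokens': 4000, 'temperature': 0.7}
```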
@@ -181,84 +181,82 @@ def process_image_generation_queue(self, image_ids: list, account_id: int = None
    failed = 0
    results = []

    # Get image generation settings
    # Try account-specific override, otherwise use GlobalIntegrationSettings
    # Get image generation settings from IntegrationSettings
    logger.info("[process_image_generation_queue] Step 1: Loading image generation settings")
    from igny8_core.modules.system.global_settings_models import GlobalIntegrationSettings

    config = {}
    try:
        image_settings = IntegrationSettings.objects.get(
            account=account,
            integration_type='image_generation',
            is_active=True
        )
        logger.info(f"[process_image_generation_queue] Using account {account.id} IntegrationSettings override")
        config = image_settings.config or {}
        logger.info(f"[process_image_generation_queue] Image generation settings found. Config keys: {list(config.keys())}")
        logger.info(f"[process_image_generation_queue] Full config: {config}")

        # Get provider and model from config (respect user settings)
        provider = config.get('provider', 'openai')
        # Get model - try 'model' first, then 'imageModel' as fallback
        model = config.get('model') or config.get('imageModel') or 'dall-e-3'
        logger.info(f"[process_image_generation_queue] Using PROVIDER: {provider}, MODEL: {model} from settings")
        image_type = config.get('image_type', 'realistic')
        image_format = config.get('image_format', 'webp')
        desktop_enabled = config.get('desktop_enabled', True)
        mobile_enabled = config.get('mobile_enabled', True)
        # Get image sizes from config, with fallback defaults
        featured_image_size = config.get('featured_image_size') or ('1280x832' if provider == 'runware' else '1024x1024')
        desktop_image_size = config.get('desktop_image_size') or '1024x1024'
        in_article_image_size = config.get('in_article_image_size') or '512x512'  # Default to 512x512

        logger.info(f"[process_image_generation_queue] Settings loaded:")
        logger.info(f"  - Provider: {provider}")
        logger.info(f"  - Model: {model}")
        logger.info(f"  - Image type: {image_type}")
        logger.info(f"  - Image format: {image_format}")
        logger.info(f"  - Desktop enabled: {desktop_enabled}")
        logger.info(f"  - Mobile enabled: {mobile_enabled}")
    except IntegrationSettings.DoesNotExist:
        logger.info(f"[process_image_generation_queue] No IntegrationSettings override for account {account.id}, using GlobalIntegrationSettings")
        logger.error("[process_image_generation_queue] ERROR: Image generation settings not found")
        logger.error(f"[process_image_generation_queue] Account: {account.id if account else 'None'}, integration_type: 'image_generation'")
        return {'success': False, 'error': 'Image generation settings not found'}
    except Exception as e:
        logger.error(f"[process_image_generation_queue] ERROR loading image generation settings: {e}", exc_info=True)
        return {'success': False, 'error': f'Error loading image generation settings: {str(e)}'}

    # Use GlobalIntegrationSettings for missing values
    global_settings = GlobalIntegrationSettings.get_instance()

    logger.info(f"[process_image_generation_queue] Image generation settings loaded. Config keys: {list(config.keys())}")
    logger.info(f"[process_image_generation_queue] Full config: {config}")

    # Get provider and model from config with global fallbacks
    provider = config.get('provider') or global_settings.default_image_service
    if provider == 'runware':
        model = config.get('model') or config.get('imageModel') or global_settings.runware_model
    else:
        model = config.get('model') or config.get('imageModel') or global_settings.dalle_model

    logger.info(f"[process_image_generation_queue] Using PROVIDER: {provider}, MODEL: {model} from settings")
    image_type = config.get('image_type') or global_settings.image_style
    image_format = config.get('image_format', 'webp')
    desktop_enabled = config.get('desktop_enabled', True)
    mobile_enabled = config.get('mobile_enabled', True)
    # Get image sizes from config, with fallback defaults
    featured_image_size = config.get('featured_image_size') or ('1280x832' if provider == 'runware' else '1024x1024')
    desktop_image_size = config.get('desktop_image_size') or global_settings.desktop_image_size
    in_article_image_size = config.get('in_article_image_size') or '512x512'  # Default to 512x512

    logger.info(f"[process_image_generation_queue] Settings loaded:")
    logger.info(f"  - Provider: {provider}")
    logger.info(f"  - Model: {model}")
    logger.info(f"  - Image type: {image_type}")
    logger.info(f"  - Image format: {image_format}")
    logger.info(f"  - Desktop enabled: {desktop_enabled}")
    logger.info(f"  - Mobile enabled: {mobile_enabled}")

    # Get provider API key
    # API keys are ALWAYS from GlobalIntegrationSettings (accounts cannot override API keys)
    # Account IntegrationSettings only store provider preference, NOT API keys
    logger.info(f"[process_image_generation_queue] Step 2: Loading {provider.upper()} API key from GlobalIntegrationSettings")

    # Get API key from GlobalIntegrationSettings
    if provider == 'runware':
        api_key = global_settings.runware_api_key
    elif provider == 'openai':
        api_key = global_settings.dalle_api_key or global_settings.openai_api_key
    else:
        api_key = None

    if not api_key:
        logger.error(f"[process_image_generation_queue] {provider.upper()} API key not configured in GlobalIntegrationSettings")
        return {'success': False, 'error': f'{provider.upper()} API key not configured in GlobalIntegrationSettings'}

    # Log API key presence (but not the actual key, for security)
    api_key_preview = f"{api_key[:10]}...{api_key[-4:]}" if len(api_key) > 14 else "***"
    logger.info(f"[process_image_generation_queue] {provider.upper()} API key retrieved successfully (length: {len(api_key)}, preview: {api_key_preview})")
    # Get provider API key (using the same approach as test image generation)
    # Note: the API key is stored as 'apiKey' (camelCase) in IntegrationSettings.config
    logger.info(f"[process_image_generation_queue] Step 2: Loading {provider.upper()} API key")
    try:
        provider_settings = IntegrationSettings.objects.get(
            account=account,
            integration_type=provider,  # Use the provider from settings
            is_active=True
        )
        logger.info(f"[process_image_generation_queue] {provider.upper()} integration settings found")
        logger.info(f"[process_image_generation_queue] {provider.upper()} config keys: {list(provider_settings.config.keys()) if provider_settings.config else 'None'}")

        api_key = provider_settings.config.get('apiKey') if provider_settings.config else None
        if not api_key:
            logger.error(f"[process_image_generation_queue] {provider.upper()} API key not found in config")
            logger.error(f"[process_image_generation_queue] {provider.upper()} config: {provider_settings.config}")
            return {'success': False, 'error': f'{provider.upper()} API key not configured'}

        # Log API key presence (but not the actual key, for security)
        api_key_preview = f"{api_key[:10]}...{api_key[-4:]}" if len(api_key) > 14 else "***"
        logger.info(f"[process_image_generation_queue] {provider.upper()} API key retrieved successfully (length: {len(api_key)}, preview: {api_key_preview})")
    except IntegrationSettings.DoesNotExist:
        logger.error(f"[process_image_generation_queue] ERROR: {provider.upper()} integration settings not found")
        logger.error(f"[process_image_generation_queue] Account: {account.id if account else 'None'}, integration_type: '{provider}'")
        return {'success': False, 'error': f'{provider.upper()} integration not found or not active'}
    except Exception as e:
        logger.error(f"[process_image_generation_queue] ERROR getting {provider.upper()} API key: {e}", exc_info=True)
        return {'success': False, 'error': f'Error retrieving {provider.upper()} API key: {str(e)}'}

    # Get image prompt template (has placeholders: {image_type}, {post_title}, {image_prompt})
    try:
        image_prompt_template = PromptRegistry.get_image_prompt_template(account)
    except Exception as e:
        logger.warning(f"Failed to get image prompt template: {e}, using fallback")
        image_prompt_template = '{image_type} image for blog post titled "{post_title}": {image_prompt}'
        image_prompt_template = 'Create a high-quality {image_type} image for a blog post titled "{post_title}". Image prompt: {image_prompt}'

    # Get negative prompt for Runware (only needed for the Runware provider)
    negative_prompt = None
@@ -709,25 +707,6 @@ def process_image_generation_queue(self, image_ids: list, account_id: int = None
        })
        failed += 1

    # Check if all images for the content are generated and update status to 'review'
    if content_id and completed > 0:
        try:
            from igny8_core.business.content.models import Content, Images

            content = Content.objects.get(id=content_id)

            # Check if all images for this content are now generated
            all_images = Images.objects.filter(content=content)
            pending_images = all_images.filter(status='pending').count()

            # If no pending images and content is still in draft, move to review
            if pending_images == 0 and content.status == 'draft':
                content.status = 'review'
                content.save(update_fields=['status'])
                logger.info(f"[process_image_generation_queue] Content #{content_id} status updated to 'review' (all images generated)")
        except Exception as e:
            logger.error(f"[process_image_generation_queue] Error updating content status: {str(e)}", exc_info=True)

    # Final state
    logger.info("=" * 80)
    logger.info(f"process_image_generation_queue COMPLETED")

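The content-status rollover in that hunk reduces to a small pure function — content moves from `draft` to `review` once no images are left pending, and is otherwise untouched. A sketch (illustrative helper name, not the project's API):

```python
def next_content_status(current_status: str, pending_images: int) -> str:
    """Pure-function sketch of the draft -> review rollover:
    promote only when every image has finished generating."""
    if pending_images == 0 and current_status == 'draft':
        return 'review'
    return current_status

print(next_content_status('draft', 0))   # review
print(next_content_status('draft', 2))   # draft
print(next_content_status('review', 0))  # review
```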
@@ -1,86 +0,0 @@
from __future__ import annotations

from igny8_core.ai.functions.generate_site_structure import GenerateSiteStructureFunction
from igny8_core.business.site_building.models import PageBlueprint
from igny8_core.business.site_building.tests.base import SiteBuilderTestBase


class GenerateSiteStructureFunctionTests(SiteBuilderTestBase):
    """Covers parsing + persistence logic for the Site Builder AI function."""

    def setUp(self):
        super().setUp()
        self.function = GenerateSiteStructureFunction()

    def test_parse_response_extracts_json_object(self):
        noisy_response = """
        Thoughts about the request…
        {
            "site": {"name": "Acme Robotics"},
            "pages": [{"slug": "home", "title": "Home"}]
        }
        Extra commentary that should be ignored.
        """
        parsed = self.function.parse_response(noisy_response)
        self.assertEqual(parsed['site']['name'], 'Acme Robotics')
        self.assertEqual(parsed['pages'][0]['slug'], 'home')

    def test_save_output_updates_structure_and_syncs_pages(self):
        # Existing page to prove update/delete flows.
        legacy_page = PageBlueprint.objects.create(
            site_blueprint=self.blueprint,
            slug='legacy',
            title='Legacy Page',
            type='custom',
            blocks_json=[],
            order=5,
        )

        parsed = {
            'site': {'name': 'Future Robotics'},
            'pages': [
                {
                    'slug': 'home',
                    'title': 'Homepage',
                    'type': 'home',
                    'status': 'ready',
                    'blocks': [{'type': 'hero', 'heading': 'Build faster'}],
                },
                {
                    'slug': 'about',
                    'title': 'About Us',
                    'type': 'about',
                    'blocks': [],
                },
            ],
        }

        result = self.function.save_output(parsed, {'blueprint': self.blueprint})

        self.blueprint.refresh_from_db()
        self.assertEqual(self.blueprint.status, 'ready')
        self.assertEqual(self.blueprint.structure_json['site']['name'], 'Future Robotics')
        self.assertEqual(result['pages_created'], 1)
        self.assertEqual(result['pages_updated'], 1)
        self.assertEqual(result['pages_deleted'], 1)

        slugs = set(self.blueprint.pages.values_list('slug', flat=True))
        self.assertIn('home', slugs)
        self.assertIn('about', slugs)
        self.assertNotIn(legacy_page.slug, slugs)

    def test_build_prompt_includes_existing_pages(self):
        # Convert structure to JSON to ensure template rendering stays stable.
        data = self.function.prepare(
            payload={'ids': [self.blueprint.id]},
            account=self.account,
        )
        prompt = self.function.build_prompt(data, account=self.account)
        self.assertIn(self.blueprint.name, prompt)
        self.assertIn('Home', prompt)
        # The prompt should mention hosting type and objectives in JSON context.
        self.assertIn(self.blueprint.hosting_type, prompt)
        for objective in self.blueprint.config_json.get('objectives', []):
            self.assertIn(objective, prompt)

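The deleted test above exercised pulling a JSON object out of a noisy LLM response. The general technique is brace matching over the raw text; a minimal sketch (hypothetical helper, not the project's `parse_response`):

```python
import json

def extract_json_object(text: str) -> dict:
    """Extract the first balanced top-level JSON object from noisy text.
    Naive brace matching; does not account for braces inside strings."""
    start = text.index('{')
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == '{':
            depth += 1
        elif ch == '}':
            depth -= 1
            if depth == 0:
                return json.loads(text[start:i + 1])
    raise ValueError("No complete JSON object found")

noisy = 'Thoughts... {"site": {"name": "Acme Robotics"}} trailing commentary'
print(extract_json_object(noisy))  # {'site': {'name': 'Acme Robotics'}}
```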
116 backend/igny8_core/ai/tests/test_run.py (Normal file)
@@ -0,0 +1,116 @@
"""
Test script for AI functions
Run this to verify all AI functions work with console logging
"""
import os
import sys
import django

# Setup Django
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '../../../../'))
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'igny8.settings')
django.setup()

from igny8_core.ai.functions.auto_cluster import AutoClusterFunction
from igny8_core.ai.functions.generate_images import generate_images_core
from igny8_core.ai.ai_core import AICore


def test_ai_core():
    """Test AICore.run_ai_request() directly"""
    print("\n" + "="*80)
    print("TEST 1: AICore.run_ai_request() - Direct API Call")
    print("="*80)

    ai_core = AICore()
    result = ai_core.run_ai_request(
        prompt="Say 'Hello, World!' in JSON format: {\"message\": \"your message\"}",
        max_tokens=100,
        function_name='test_ai_core'
    )

    if result.get('error'):
        print(f"❌ Error: {result['error']}")
    else:
        print(f"✅ Success! Content: {result.get('content', '')[:100]}")
        print(f"   Tokens: {result.get('total_tokens')}, Cost: ${result.get('cost', 0):.6f}")


def test_auto_cluster():
    """Test auto cluster function"""
    print("\n" + "="*80)
    print("TEST 2: Auto Cluster Function")
    print("="*80)
    print("Note: This requires actual keyword IDs in the database")
    print("Skipping - requires database setup")
    # Uncomment to test with real data:
    # fn = AutoClusterFunction()
    # result = fn.validate({'ids': [1, 2, 3]})
    # print(f"Validation result: {result}")


def test_generate_content():
    """Test generate content function"""
    print("\n" + "="*80)
    print("TEST 3: Generate Content Function")
    print("="*80)
    print("Note: This requires actual task IDs in the database")
    print("Skipping - requires database setup")


def test_generate_images():
    """Test generate images function"""
    print("\n" + "="*80)
    print("TEST 4: Generate Images Function")
    print("="*80)
    print("Note: This requires actual task IDs in the database")
    print("Skipping - requires database setup")
    # Uncomment to test with real data:
    # result = generate_images_core(task_ids=[1], account_id=1)
    # print(f"Result: {result}")


def test_json_extraction():
    """Test JSON extraction"""
|
||||
print("\n" + "="*80)
|
||||
print("TEST 5: JSON Extraction")
|
||||
print("="*80)
|
||||
|
||||
ai_core = AICore()
|
||||
|
||||
# Test 1: Direct JSON
|
||||
json_text = '{"clusters": [{"name": "Test", "keywords": ["test"]}]}'
|
||||
result = ai_core.extract_json(json_text)
|
||||
print(f"✅ Direct JSON: {result is not None}")
|
||||
|
||||
# Test 2: JSON in markdown
|
||||
json_markdown = '```json\n{"clusters": [{"name": "Test"}]}\n```'
|
||||
result = ai_core.extract_json(json_markdown)
|
||||
print(f"✅ JSON in markdown: {result is not None}")
|
||||
|
||||
# Test 3: Invalid JSON
|
||||
invalid_json = "This is not JSON"
|
||||
result = ai_core.extract_json(invalid_json)
|
||||
print(f"✅ Invalid JSON handled: {result is None}")
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
print("\n" + "="*80)
|
||||
print("AI FUNCTIONS TEST SUITE")
|
||||
print("="*80)
|
||||
print("Testing all AI functions with console logging enabled")
|
||||
print("="*80)
|
||||
|
||||
# Run tests
|
||||
test_ai_core()
|
||||
test_json_extraction()
|
||||
test_auto_cluster()
|
||||
test_generate_content()
|
||||
test_generate_images()
|
||||
|
||||
print("\n" + "="*80)
|
||||
print("TEST SUITE COMPLETE")
|
||||
print("="*80)
|
||||
print("\nAll console logging should be visible above.")
|
||||
print("Check for [AI][function_name] Step X: messages")
|
||||
|
||||
@@ -5,7 +5,6 @@ import time
import logging
from typing import List, Dict, Any, Optional, Callable
from datetime import datetime
from decimal import Decimal
from igny8_core.ai.constants import DEBUG_MODE

logger = logging.getLogger(__name__)
@@ -196,35 +195,24 @@ class CostTracker:
    """Tracks API costs and token usage"""

    def __init__(self):
        self.total_cost = Decimal('0.0')
        self.total_cost = 0.0
        self.total_tokens = 0
        self.operations = []

    def record(self, function_name: str, cost, tokens: int, model: str = None):
        """Record an API call cost

        Args:
            function_name: Name of the AI function
            cost: Cost value (can be float or Decimal)
            tokens: Number of tokens used
            model: Model name
        """
        # Convert cost to Decimal if it's a float to avoid type mixing
        if not isinstance(cost, Decimal):
            cost = Decimal(str(cost))

    def record(self, function_name: str, cost: float, tokens: int, model: str = None):
        """Record an API call cost"""
        self.total_cost += cost
        self.total_tokens += tokens
        self.operations.append({
            'function': function_name,
            'cost': float(cost),  # Store as float for JSON serialization
            'cost': cost,
            'tokens': tokens,
            'model': model
        })

    def get_total(self):
        """Get total cost (returns float for JSON serialization)"""
        return float(self.total_cost)
    def get_total(self) -> float:
        """Get total cost"""
        return self.total_cost

    def get_total_tokens(self) -> int:
        """Get total tokens"""
@@ -135,7 +135,7 @@ def validate_api_key(api_key: Optional[str], integration_type: str = 'openai') -

def validate_model(model: str, model_type: str = 'text') -> Dict[str, Any]:
    """
    Validate that model is in supported list using database.
    Validate that model is in supported list.

    Args:
        model: Model name to validate
@@ -144,59 +144,27 @@ def validate_model(model: str, model_type: str = 'text') -> Dict[str, Any]:
    Returns:
        Dict with 'valid' (bool) and optional 'error' (str)
    """
    try:
        # Try database first
        from igny8_core.business.billing.models import AIModelConfig

        exists = AIModelConfig.objects.filter(
            model_name=model,
            model_type=model_type,
            is_active=True
        ).exists()

        if not exists:
            # Get available models for better error message
            available = list(AIModelConfig.objects.filter(
                model_type=model_type,
                is_active=True
            ).values_list('model_name', flat=True))

            if available:
                return {
                    'valid': False,
                    'error': f'Model "{model}" is not active or not found. Available {model_type} models: {", ".join(available)}'
                }
            else:
                return {
                    'valid': False,
                    'error': f'Model "{model}" is not found in database'
                }

        return {'valid': True}

    except Exception:
        # Fallback to constants if database fails
        from .constants import MODEL_RATES, VALID_OPENAI_IMAGE_MODELS

        if model_type == 'text':
            if model not in MODEL_RATES:
                return {
                    'valid': False,
                    'error': f'Model "{model}" is not in supported models list'
                }
        elif model_type == 'image':
            if model not in VALID_OPENAI_IMAGE_MODELS:
                return {
                    'valid': False,
                    'error': f'Model "{model}" is not valid for OpenAI image generation. Only {", ".join(VALID_OPENAI_IMAGE_MODELS)} are supported.'
                }

        return {'valid': True}
    from .constants import MODEL_RATES, VALID_OPENAI_IMAGE_MODELS

    if model_type == 'text':
        if model not in MODEL_RATES:
            return {
                'valid': False,
                'error': f'Model "{model}" is not in supported models list'
            }
    elif model_type == 'image':
        if model not in VALID_OPENAI_IMAGE_MODELS:
            return {
                'valid': False,
                'error': f'Model "{model}" is not valid for OpenAI image generation. Only {", ".join(VALID_OPENAI_IMAGE_MODELS)} are supported.'
            }

    return {'valid': True}


def validate_image_size(size: str, model: str) -> Dict[str, Any]:
    """
    Validate that image size is valid for the selected model using database.
    Validate that image size is valid for the selected model.

    Args:
        size: Image size (e.g., '1024x1024')
@@ -205,40 +173,14 @@ def validate_image_size(size: str, model: str) -> Dict[str, Any]:
    Returns:
        Dict with 'valid' (bool) and optional 'error' (str)
    """
    try:
        # Try database first
        from igny8_core.business.billing.models import AIModelConfig

        model_config = AIModelConfig.objects.filter(
            model_name=model,
            model_type='image',
            is_active=True
        ).first()

        if model_config:
            if not model_config.validate_size(size):
                valid_sizes = model_config.valid_sizes or []
                return {
                    'valid': False,
                    'error': f'Image size "{size}" is not valid for model "{model}". Valid sizes are: {", ".join(valid_sizes)}'
                }
            return {'valid': True}
        else:
            return {
                'valid': False,
                'error': f'Image model "{model}" not found in database'
            }

    except Exception:
        # Fallback to constants if database fails
        from .constants import VALID_SIZES_BY_MODEL

        valid_sizes = VALID_SIZES_BY_MODEL.get(model, [])
        if size not in valid_sizes:
            return {
                'valid': False,
                'error': f'Image size "{size}" is not valid for model "{model}". Valid sizes are: {", ".join(valid_sizes)}'
            }

        return {'valid': True}
    from .constants import VALID_SIZES_BY_MODEL

    valid_sizes = VALID_SIZES_BY_MODEL.get(model, [])
    if size not in valid_sizes:
        return {
            'valid': False,
            'error': f'Image size "{size}" is not valid for model "{model}". Valid sizes are: {", ".join(valid_sizes)}'
        }

    return {'valid': True}
@@ -1,52 +0,0 @@
"""
AI Validators Package
Shared validation logic for AI functions
"""
from .cluster_validators import validate_minimum_keywords, validate_keyword_selection

# The codebase also contains a module-level file `ai/validators.py` which defines
# common validator helpers (e.g. `validate_ids`). Because there is both a
# package directory `ai/validators/` and a module file `ai/validators.py`, Python
# will resolve `igny8_core.ai.validators` to the package and not the module file.
# To avoid changing many imports across the project, load the module file here
# and re-export the commonly used functions.
import importlib.util
import os

_module_path = os.path.normpath(os.path.join(os.path.dirname(__file__), '..', 'validators.py'))
if os.path.exists(_module_path):
    spec = importlib.util.spec_from_file_location('igny8_core.ai._validators_module', _module_path)
    _validators_mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(_validators_mod)
    # Re-export commonly used functions from the module file
    validate_ids = getattr(_validators_mod, 'validate_ids', None)
    validate_keywords_exist = getattr(_validators_mod, 'validate_keywords_exist', None)
    validate_cluster_limits = getattr(_validators_mod, 'validate_cluster_limits', None)
    validate_cluster_exists = getattr(_validators_mod, 'validate_cluster_exists', None)
    validate_tasks_exist = getattr(_validators_mod, 'validate_tasks_exist', None)
    validate_api_key = getattr(_validators_mod, 'validate_api_key', None)
    validate_model = getattr(_validators_mod, 'validate_model', None)
    validate_image_size = getattr(_validators_mod, 'validate_image_size', None)
else:
    # Module file missing - keep names defined if cluster validators provide them
    validate_ids = None
    validate_keywords_exist = None
    validate_cluster_limits = None
    validate_cluster_exists = None
    validate_tasks_exist = None
    validate_api_key = None
    validate_model = None
    validate_image_size = None

__all__ = [
    'validate_minimum_keywords',
    'validate_keyword_selection',
    'validate_ids',
    'validate_keywords_exist',
    'validate_cluster_limits',
    'validate_cluster_exists',
    'validate_tasks_exist',
    'validate_api_key',
    'validate_model',
    'validate_image_size',
]
@@ -1,105 +0,0 @@
"""
Cluster-specific validators
Shared between auto-cluster function and automation pipeline
"""
import logging
from typing import Dict, List

logger = logging.getLogger(__name__)


def validate_minimum_keywords(
    keyword_ids: List[int],
    account=None,
    min_required: int = 5
) -> Dict:
    """
    Validate that sufficient keywords are available for clustering

    Args:
        keyword_ids: List of keyword IDs to cluster
        account: Account object for filtering
        min_required: Minimum number of keywords required (default: 5)

    Returns:
        Dict with 'valid' (bool) and 'error' (str) or 'count' (int)
    """
    from igny8_core.modules.planner.models import Keywords

    # Build queryset
    queryset = Keywords.objects.filter(id__in=keyword_ids, status='new')

    if account:
        queryset = queryset.filter(account=account)

    # Count available keywords
    count = queryset.count()

    # Validate minimum
    if count < min_required:
        return {
            'valid': False,
            'error': f'Insufficient keywords for clustering. Need at least {min_required} keywords, but only {count} available.',
            'count': count,
            'required': min_required
        }

    return {
        'valid': True,
        'count': count,
        'required': min_required
    }


def validate_keyword_selection(
    selected_ids: List[int],
    available_count: int,
    min_required: int = 5
) -> Dict:
    """
    Validate keyword selection (for frontend validation)

    Args:
        selected_ids: List of selected keyword IDs
        available_count: Total count of available keywords
        min_required: Minimum required

    Returns:
        Dict with validation result
    """
    selected_count = len(selected_ids)

    # Check if any keywords selected
    if selected_count == 0:
        return {
            'valid': False,
            'error': 'No keywords selected',
            'type': 'NO_SELECTION'
        }

    # Check if enough selected
    if selected_count < min_required:
        return {
            'valid': False,
            'error': f'Please select at least {min_required} keywords. Currently selected: {selected_count}',
            'type': 'INSUFFICIENT_SELECTION',
            'selected': selected_count,
            'required': min_required
        }

    # Check if enough available (even if not all selected)
    if available_count < min_required:
        return {
            'valid': False,
            'error': f'Not enough keywords available. Need at least {min_required} keywords, but only {available_count} exist.',
            'type': 'INSUFFICIENT_AVAILABLE',
            'available': available_count,
            'required': min_required
        }

    return {
        'valid': True,
        'selected': selected_count,
        'available': available_count,
        'required': min_required
    }
@@ -1,37 +0,0 @@
"""
Account API URLs
"""
from django.urls import path
from igny8_core.api.account_views import (
    AccountSettingsViewSet,
    TeamManagementViewSet,
    UsageAnalyticsViewSet,
    DashboardStatsViewSet
)

urlpatterns = [
    # Account Settings
    path('settings/', AccountSettingsViewSet.as_view({
        'get': 'retrieve',
        'patch': 'partial_update'
    }), name='account-settings'),

    # Team Management
    path('team/', TeamManagementViewSet.as_view({
        'get': 'list',
        'post': 'create'
    }), name='team-list'),
    path('team/<int:pk>/', TeamManagementViewSet.as_view({
        'delete': 'destroy'
    }), name='team-detail'),

    # Usage Analytics
    path('usage/analytics/', UsageAnalyticsViewSet.as_view({
        'get': 'overview'
    }), name='usage-analytics'),

    # Dashboard Stats (real data for home page)
    path('dashboard/stats/', DashboardStatsViewSet.as_view({
        'get': 'stats'
    }), name='dashboard-stats'),
]
@@ -1,458 +0,0 @@
"""
Account Management API Views
Handles account settings, team management, and usage analytics
"""
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from django.contrib.auth import get_user_model
from django.db.models import Q, Count, Sum
from django.utils import timezone
from datetime import timedelta
from decimal import Decimal
from drf_spectacular.utils import extend_schema, extend_schema_view

from igny8_core.auth.models import Account
from igny8_core.business.billing.models import CreditTransaction

User = get_user_model()


@extend_schema_view(
    retrieve=extend_schema(tags=['Account']),
    partial_update=extend_schema(tags=['Account']),
)
class AccountSettingsViewSet(viewsets.ViewSet):
    """Account settings management"""
    permission_classes = [IsAuthenticated]

    def retrieve(self, request):
        """Get account settings"""
        account = request.user.account

        return Response({
            'id': account.id,
            'name': account.name,
            'slug': account.slug,
            'billing_address_line1': account.billing_address_line1 or '',
            'billing_address_line2': account.billing_address_line2 or '',
            'billing_city': account.billing_city or '',
            'billing_state': account.billing_state or '',
            'billing_postal_code': account.billing_postal_code or '',
            'billing_country': account.billing_country or '',
            'tax_id': account.tax_id or '',
            'billing_email': account.billing_email or '',
            'credits': account.credits,
            'created_at': account.created_at.isoformat(),
            'updated_at': account.updated_at.isoformat(),
        })

    def partial_update(self, request):
        """Update account settings"""
        account = request.user.account

        # Update allowed fields
        allowed_fields = [
            'name', 'billing_address_line1', 'billing_address_line2',
            'billing_city', 'billing_state', 'billing_postal_code',
            'billing_country', 'tax_id', 'billing_email'
        ]

        for field in allowed_fields:
            if field in request.data:
                setattr(account, field, request.data[field])

        account.save()

        return Response({
            'message': 'Account settings updated successfully',
            'account': {
                'id': account.id,
                'name': account.name,
                'slug': account.slug,
                'billing_address_line1': account.billing_address_line1,
                'billing_address_line2': account.billing_address_line2,
                'billing_city': account.billing_city,
                'billing_state': account.billing_state,
                'billing_postal_code': account.billing_postal_code,
                'billing_country': account.billing_country,
                'tax_id': account.tax_id,
                'billing_email': account.billing_email,
            }
        })


@extend_schema_view(
    list=extend_schema(tags=['Account']),
    create=extend_schema(tags=['Account']),
    destroy=extend_schema(tags=['Account']),
)
class TeamManagementViewSet(viewsets.ViewSet):
    """Team members management"""
    permission_classes = [IsAuthenticated]

    def list(self, request):
        """List team members"""
        account = request.user.account
        users = User.objects.filter(account=account)

        return Response({
            'results': [
                {
                    'id': user.id,
                    'email': user.email,
                    'first_name': user.first_name,
                    'last_name': user.last_name,
                    'is_active': user.is_active,
                    'is_staff': user.is_staff,
                    'date_joined': user.date_joined.isoformat(),
                    'last_login': user.last_login.isoformat() if user.last_login else None,
                }
                for user in users
            ],
            'count': users.count()
        })

    def create(self, request):
        """Invite new team member"""
        account = request.user.account
        email = request.data.get('email')

        if not email:
            return Response(
                {'error': 'Email is required'},
                status=status.HTTP_400_BAD_REQUEST
            )

        # Check if user already exists
        if User.objects.filter(email=email).exists():
            return Response(
                {'error': 'User with this email already exists'},
                status=status.HTTP_400_BAD_REQUEST
            )

        # Create user (simplified - in production, send invitation email)
        user = User.objects.create_user(
            email=email,
            first_name=request.data.get('first_name', ''),
            last_name=request.data.get('last_name', ''),
            account=account
        )

        return Response({
            'message': 'Team member invited successfully',
            'user': {
                'id': user.id,
                'email': user.email,
                'first_name': user.first_name,
                'last_name': user.last_name,
            }
        }, status=status.HTTP_201_CREATED)

    def destroy(self, request, pk=None):
        """Remove team member"""
        account = request.user.account

        try:
            user = User.objects.get(id=pk, account=account)

            # Prevent removing yourself
            if user.id == request.user.id:
                return Response(
                    {'error': 'Cannot remove yourself'},
                    status=status.HTTP_400_BAD_REQUEST
                )

            user.is_active = False
            user.save()

            return Response({
                'message': 'Team member removed successfully'
            })
        except User.DoesNotExist:
            return Response(
                {'error': 'User not found'},
                status=status.HTTP_404_NOT_FOUND
            )


@extend_schema_view(
    overview=extend_schema(tags=['Account']),
)
class UsageAnalyticsViewSet(viewsets.ViewSet):
    """Usage analytics and statistics"""
    permission_classes = [IsAuthenticated]

    @action(detail=False, methods=['get'])
    def overview(self, request):
        """Get usage analytics overview"""
        account = request.user.account

        # Get date range (default: last 30 days)
        days = int(request.query_params.get('days', 30))
        start_date = timezone.now() - timedelta(days=days)

        # Get transactions in period
        transactions = CreditTransaction.objects.filter(
            account=account,
            created_at__gte=start_date
        )

        # Calculate totals by type
        usage_by_type = transactions.filter(
            amount__lt=0
        ).values('transaction_type').annotate(
            total=Sum('amount'),
            count=Count('id')
        )

        purchases_by_type = transactions.filter(
            amount__gt=0
        ).values('transaction_type').annotate(
            total=Sum('amount'),
            count=Count('id')
        )

        # Daily usage
        daily_usage = []
        for i in range(days):
            date = start_date + timedelta(days=i)
            day_txns = transactions.filter(
                created_at__date=date.date()
            )

            usage = day_txns.filter(amount__lt=0).aggregate(Sum('amount'))['amount__sum'] or 0
            purchases = day_txns.filter(amount__gt=0).aggregate(Sum('amount'))['amount__sum'] or 0

            daily_usage.append({
                'date': date.date().isoformat(),
                'usage': abs(usage),
                'purchases': purchases,
                'net': purchases + usage
            })

        return Response({
            'period_days': days,
            'start_date': start_date.isoformat(),
            'end_date': timezone.now().isoformat(),
            'current_balance': account.credits,
            'usage_by_type': list(usage_by_type),
            'purchases_by_type': list(purchases_by_type),
            'daily_usage': daily_usage,
            'total_usage': abs(transactions.filter(amount__lt=0).aggregate(Sum('amount'))['amount__sum'] or 0),
            'total_purchases': transactions.filter(amount__gt=0).aggregate(Sum('amount'))['amount__sum'] or 0,
        })


@extend_schema_view(
    stats=extend_schema(tags=['Account']),
)
class DashboardStatsViewSet(viewsets.ViewSet):
    """Dashboard statistics - real data for home page widgets"""
    permission_classes = [IsAuthenticated]

    @action(detail=False, methods=['get'])
    def stats(self, request):
        """
        Get dashboard statistics for the home page.

        Query params:
        - site_id: Filter by site (optional, defaults to all sites)
        - days: Number of days for AI operations (default: 7)

        Returns:
        - ai_operations: Real credit usage by operation type
        - recent_activity: Recent notifications
        - content_velocity: Content created this week/month
        - images_count: Actual total images count
        - published_count: Actual published content count
        """
        account = request.user.account
        site_id = request.query_params.get('site_id')
        days = int(request.query_params.get('days', 7))

        # Import models here to avoid circular imports
        from igny8_core.modules.writer.models import Images, Content
        from igny8_core.modules.planner.models import Keywords, Clusters, ContentIdeas
        from igny8_core.business.notifications.models import Notification
        from igny8_core.business.billing.models import CreditUsageLog
        from igny8_core.auth.models import Site

        # Build base filter for site
        site_filter = {}
        if site_id:
            try:
                site_filter['site_id'] = int(site_id)
            except (ValueError, TypeError):
                pass

        # ========== AI OPERATIONS (from CreditUsageLog) ==========
        start_date = timezone.now() - timedelta(days=days)
        usage_query = CreditUsageLog.objects.filter(
            account=account,
            created_at__gte=start_date
        )

        # Get operations grouped by type
        operations_data = usage_query.values('operation_type').annotate(
            count=Count('id'),
            credits=Sum('credits_used')
        ).order_by('-credits')

        # Calculate totals
        total_ops = usage_query.count()
        total_credits = usage_query.aggregate(total=Sum('credits_used'))['total'] or 0

        # Format operations for frontend
        operations = []
        for op in operations_data:
            op_type = op['operation_type'] or 'other'
            operations.append({
                'type': op_type,
                'count': op['count'] or 0,
                'credits': op['credits'] or 0,
            })

        ai_operations = {
            'period': f'{days}d',
            'operations': operations,
            'totals': {
                'count': total_ops,
                'credits': total_credits,
                'successRate': 98.5,  # TODO: calculate from actual success/failure
                'avgCreditsPerOp': round(total_credits / total_ops, 1) if total_ops > 0 else 0,
            }
        }

        # ========== RECENT ACTIVITY (from Notifications) ==========
        recent_notifications = Notification.objects.filter(
            account=account
        ).order_by('-created_at')[:10]

        recent_activity = []
        for notif in recent_notifications:
            # Map notification type to activity type
            activity_type_map = {
                'ai_clustering_complete': 'clustering',
                'ai_ideas_complete': 'ideas',
                'ai_content_complete': 'content',
                'ai_images_complete': 'images',
                'ai_prompts_complete': 'images',
                'content_published': 'published',
                'wp_sync_success': 'published',
            }
            activity_type = activity_type_map.get(notif.notification_type, 'system')

            # Map notification type to href
            href_map = {
                'clustering': '/planner/clusters',
                'ideas': '/planner/ideas',
                'content': '/writer/content',
                'images': '/writer/images',
                'published': '/writer/published',
            }

            recent_activity.append({
                'id': str(notif.id),
                'type': activity_type,
                'title': notif.title,
                'description': notif.message[:100] if notif.message else '',
                'timestamp': notif.created_at.isoformat(),
                'href': href_map.get(activity_type, '/dashboard'),
            })

        # ========== CONTENT COUNTS ==========
        content_base = Content.objects.filter(account=account)
        if site_filter:
            content_base = content_base.filter(**site_filter)

        total_content = content_base.count()
        draft_content = content_base.filter(status='draft').count()
        review_content = content_base.filter(status='review').count()
        published_content = content_base.filter(status='published').count()

        # ========== IMAGES COUNT (actual images, not content with images) ==========
        images_base = Images.objects.filter(account=account)
        if site_filter:
            images_base = images_base.filter(**site_filter)

        total_images = images_base.count()
        generated_images = images_base.filter(status='generated').count()
        pending_images = images_base.filter(status='pending').count()

        # ========== CONTENT VELOCITY ==========
        now = timezone.now()
        week_ago = now - timedelta(days=7)
        month_ago = now - timedelta(days=30)

        # This week's content
        week_content = content_base.filter(created_at__gte=week_ago).count()
        week_images = images_base.filter(created_at__gte=week_ago).count()

        # This month's content
        month_content = content_base.filter(created_at__gte=month_ago).count()
        month_images = images_base.filter(created_at__gte=month_ago).count()

        # Estimate words (avg 1500 per article)
        content_velocity = {
            'thisWeek': {
                'articles': week_content,
                'words': week_content * 1500,
                'images': week_images,
            },
            'thisMonth': {
                'articles': month_content,
                'words': month_content * 1500,
                'images': month_images,
            },
            'total': {
                'articles': total_content,
                'words': total_content * 1500,
                'images': total_images,
            },
            'trend': 0,  # TODO: calculate actual trend
        }

        # ========== PIPELINE COUNTS ==========
        keywords_base = Keywords.objects.filter(account=account)
        clusters_base = Clusters.objects.filter(account=account)
        ideas_base = ContentIdeas.objects.filter(account=account)

        if site_filter:
            keywords_base = keywords_base.filter(**site_filter)
            clusters_base = clusters_base.filter(**site_filter)
            ideas_base = ideas_base.filter(**site_filter)

        # Get site count
        sites_count = Site.objects.filter(account=account, is_active=True).count()

        pipeline = {
            'sites': sites_count,
            'keywords': keywords_base.count(),
            'clusters': clusters_base.count(),
            'ideas': ideas_base.count(),
            'tasks': ideas_base.filter(status='queued').count() + ideas_base.filter(status='completed').count(),
            'drafts': draft_content + review_content,
            'published': published_content,
        }

        return Response({
            'ai_operations': ai_operations,
            'recent_activity': recent_activity,
            'content_velocity': content_velocity,
            'pipeline': pipeline,
            'counts': {
                'content': {
                    'total': total_content,
                    'draft': draft_content,
                    'review': review_content,
                    'published': published_content,
                },
                'images': {
                    'total': total_images,
                    'generated': generated_images,
                    'pending': pending_images,
                },
            }
        })
@@ -67,10 +67,16 @@ class JWTAuthentication(BaseAuthentication):
            try:
                account = Account.objects.get(id=account_id)
            except Account.DoesNotExist:
                # Account from token doesn't exist - don't fallback, set to None
                pass

        if not account:
            try:
                account = getattr(user, 'account', None)
            except (AttributeError, Exception):
                # If account access fails, set to None
                account = None

        # Set account on request (only if account_id was in token and account exists)
        # Set account on request
        request.account = account

        return (user, token)
@@ -83,79 +89,3 @@ class JWTAuthentication(BaseAuthentication):
        # This allows session authentication to work if JWT fails
        return None


class APIKeyAuthentication(BaseAuthentication):
    """
    API Key authentication for WordPress integration.
    Validates API keys stored in Site.wp_api_key field.
    """
    def authenticate(self, request):
        """
        Authenticate using WordPress API key.
        Returns (user, api_key) tuple if valid.
        """
        auth_header = request.META.get('HTTP_AUTHORIZATION', '')

        if not auth_header.startswith('Bearer '):
            return None  # Not an API key request

        api_key = auth_header.split(' ')[1] if len(auth_header.split(' ')) > 1 else None
        if not api_key or len(api_key) < 20:  # API keys should be at least 20 chars
            return None

        # Don't try to authenticate JWT tokens (they start with 'ey')
        if api_key.startswith('ey'):
            return None  # Let JWTAuthentication handle it

        try:
            from igny8_core.auth.models import Site, User
            from igny8_core.auth.utils import validate_account_and_plan
            from rest_framework.exceptions import AuthenticationFailed

            # Find site by API key
            site = Site.objects.select_related('account', 'account__owner', 'account__plan').filter(
                wp_api_key=api_key,
                is_active=True
            ).first()

            if not site:
                return None  # API key not found or site inactive

            # Get account and validate it
            account = site.account
            if not account:
                raise AuthenticationFailed('No account associated with this API key.')

            # CRITICAL FIX: Validate account and plan status
            is_valid, error_message, http_status = validate_account_and_plan(account)
            if not is_valid:
                raise AuthenticationFailed(error_message)

            # Get user (prefer owner but gracefully fall back)
            user = account.owner
            if not user or not getattr(user, 'is_active', False):
                # Fall back to any active developer/owner/admin in the account
                user = account.users.filter(
                    is_active=True,
                    role__in=['developer', 'owner', 'admin']
                ).order_by('role').first() or account.users.filter(is_active=True).first()

            if not user:
                raise AuthenticationFailed('No active user available for this account.')
            if not user.is_active:
                raise AuthenticationFailed('User account is disabled.')

            # Set account on request for tenant isolation
            request.account = account

            # Set site on request for WordPress integration context
            request.site = site

            return (user, api_key)

        except Exception as e:
            # Log the error but return None to allow other auth classes to try
            import logging
            logger = logging.getLogger(__name__)
            logger.debug(f'APIKeyAuthentication error: {str(e)}')
            return None
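The key-format checks in the hunk above (minimum length, and skipping JWTs, which base64url-encode a JSON header and therefore always begin with `ey`) can be illustrated as a standalone sketch; the helper name below is hypothetical, not part of the codebase:

```python
def looks_like_wp_api_key(token: str) -> bool:
    """Heuristic mirroring the checks above: reject short keys and JWTs.

    A JWT's first segment is base64url('{"alg"...'), which always starts
    with 'ey', so such tokens are left for JWT authentication to handle.
    """
    if not token or len(token) < 20:
        return False  # API keys should be at least 20 chars
    if token.startswith('ey'):
        return False  # let JWTAuthentication handle it
    return True


print(looks_like_wp_api_key('eyJhbGciOiJIUzI1NiJ9.x.y'))  # JWT-shaped -> False
print(looks_like_wp_api_key('wp_' + 'a' * 32))            # long opaque key -> True
```

Note this is only a routing heuristic: a passing key is still looked up against `Site.wp_api_key` before anything is authenticated.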
@@ -19,21 +19,34 @@ class AccountModelViewSet(viewsets.ModelViewSet):
        # Filter by account if model has account field
        if hasattr(queryset.model, 'account'):
            user = getattr(self.request, 'user', None)

            # ADMIN/DEV/SYSTEM ACCOUNT OVERRIDE: Skip account filtering for:
            # - Admins and developers (by role)
            # - Users in system accounts (aws-admin, default-account)
            if user and hasattr(user, 'is_authenticated') and user.is_authenticated:
                try:
                    account = getattr(self.request, 'account', None)
                    if not account and hasattr(self.request, 'user') and self.request.user and hasattr(self.request.user, 'is_authenticated') and self.request.user.is_authenticated:
                        user_account = getattr(self.request.user, 'account', None)
                        if user_account:
                            account = user_account

                    if account:
                        queryset = queryset.filter(account=account)
                    # Check if user has admin/developer privileges
                    is_admin_or_dev = (hasattr(user, 'is_admin_or_developer') and user.is_admin_or_developer()) if user else False
                    is_system_user = (hasattr(user, 'is_system_account_user') and user.is_system_account_user()) if user else False

                    if is_admin_or_dev or is_system_user:
                        # Skip account filtering - allow all accounts
                        pass
                    else:
                        # No account context -> block access
                        return queryset.none()
                except (AttributeError, TypeError):
                    # Get account from request (set by middleware)
                    account = getattr(self.request, 'account', None)
                    if account:
                        queryset = queryset.filter(account=account)
                    elif hasattr(self.request, 'user') and self.request.user and hasattr(self.request.user, 'is_authenticated') and self.request.user.is_authenticated:
                        # Fallback to user's account
                        try:
                            user_account = getattr(self.request.user, 'account', None)
                            if user_account:
                                queryset = queryset.filter(account=user_account)
                        except (AttributeError, Exception):
                            # If account access fails (e.g., column mismatch), skip account filtering
                            pass
                except (AttributeError, TypeError) as e:
                    # If there's an error accessing user attributes, return empty queryset
                    return queryset.none()
            else:
@@ -48,11 +61,11 @@ class AccountModelViewSet(viewsets.ModelViewSet):
        try:
            account = getattr(self.request.user, 'account', None)
        except (AttributeError, Exception):
            # If account access fails (e.g., column mismatch), set to None
            account = None

        if hasattr(serializer.Meta.model, 'account'):
            if not account:
                raise PermissionDenied("Account context is required to create this object.")

        # If model has account field, set it
        if account and hasattr(serializer.Meta.model, 'account'):
            serializer.save(account=account)
        else:
            serializer.save()
@@ -168,26 +181,7 @@ class AccountModelViewSet(viewsets.ModelViewSet):
        """
        try:
            instance = self.get_object()
            # Protect system account
            if hasattr(instance, 'slug') and getattr(instance, 'slug', '') == 'aws-admin':
                from django.core.exceptions import PermissionDenied
                raise PermissionDenied("System account cannot be deleted.")

            if hasattr(instance, 'soft_delete'):
                user = getattr(request, 'user', None)
                retention_days = None
                account = getattr(instance, 'account', None)
                if account and hasattr(account, 'deletion_retention_days'):
                    retention_days = account.deletion_retention_days
                elif hasattr(instance, 'deletion_retention_days'):
                    retention_days = getattr(instance, 'deletion_retention_days', None)
                instance.soft_delete(
                    user=user if getattr(user, 'is_authenticated', False) else None,
                    retention_days=retention_days,
                    reason='api_delete'
                )
            else:
                self.perform_destroy(instance)
            self.perform_destroy(instance)
            return success_response(
                data=None,
                message='Deleted successfully',
@@ -240,16 +234,24 @@ class SiteSectorModelViewSet(AccountModelViewSet):
        # Check if user is authenticated and is a proper User instance (not AnonymousUser)
        if user and hasattr(user, 'is_authenticated') and user.is_authenticated and hasattr(user, 'get_accessible_sites'):
            try:
                # Get user's accessible sites
                accessible_sites = user.get_accessible_sites()

                # If no accessible sites, return empty queryset
                if not accessible_sites.exists():
                    queryset = queryset.none()
                # ADMIN/DEV/SYSTEM ACCOUNT OVERRIDE: Developers, admins, and system account users
                # can see all data regardless of site/sector
                if (hasattr(user, 'is_admin_or_developer') and user.is_admin_or_developer()) or \
                   (hasattr(user, 'is_system_account_user') and user.is_system_account_user()):
                    # Skip site/sector filtering for admins, developers, and system account users
                    # But still respect optional query params if provided
                    pass
                else:
                    # Filter by accessible sites
                    queryset = queryset.filter(site__in=accessible_sites)
            except (AttributeError, TypeError):
                # Get user's accessible sites
                accessible_sites = user.get_accessible_sites()

                # If no accessible sites, return empty queryset (unless admin/developer/system account)
                if not accessible_sites.exists():
                    queryset = queryset.none()
                else:
                    # Filter by accessible sites
                    queryset = queryset.filter(site__in=accessible_sites)
            except (AttributeError, TypeError) as e:
                # If there's an error accessing user attributes, return empty queryset
                queryset = queryset.none()
        else:
@@ -263,9 +265,9 @@ class SiteSectorModelViewSet(AccountModelViewSet):
            if query_params is None:
                # Fallback for non-DRF requests
                query_params = getattr(self.request, 'GET', {})
                site_id = query_params.get('site_id') or query_params.get('site')
                site_id = query_params.get('site_id')
            else:
                site_id = query_params.get('site_id') or query_params.get('site')
                site_id = query_params.get('site_id')
        except AttributeError:
            site_id = None

@@ -274,14 +276,21 @@ class SiteSectorModelViewSet(AccountModelViewSet):
            # Convert site_id to int if it's a string
            site_id_int = int(site_id) if site_id else None
            if site_id_int:
                # ADMIN/DEV/SYSTEM ACCOUNT OVERRIDE: Admins, developers, and system account users
                # can filter by any site, others must verify access
                if user and hasattr(user, 'is_authenticated') and user.is_authenticated and hasattr(user, 'get_accessible_sites'):
                    try:
                        accessible_sites = user.get_accessible_sites()
                        if accessible_sites.filter(id=site_id_int).exists():
                        if (hasattr(user, 'is_admin_or_developer') and user.is_admin_or_developer()) or \
                           (hasattr(user, 'is_system_account_user') and user.is_system_account_user()):
                            # Admin/Developer/System Account User can filter by any site
                            queryset = queryset.filter(site_id=site_id_int)
                        else:
                            queryset = queryset.none()  # Site not accessible
                    except (AttributeError, TypeError):
                        accessible_sites = user.get_accessible_sites()
                        if accessible_sites.filter(id=site_id_int).exists():
                            queryset = queryset.filter(site_id=site_id_int)
                        else:
                            queryset = queryset.none()  # Site not accessible
                    except (AttributeError, TypeError) as e:
                        # If there's an error accessing user attributes, return empty queryset
                        queryset = queryset.none()
                else:
@@ -341,10 +350,14 @@ class SiteSectorModelViewSet(AccountModelViewSet):

        if user and hasattr(user, 'is_authenticated') and user.is_authenticated and site:
            try:
                if hasattr(user, 'get_accessible_sites'):
                    accessible_sites = user.get_accessible_sites()
                    if not accessible_sites.filter(id=site.id).exists():
                        raise PermissionDenied("You do not have access to this site")
                # ADMIN/DEV/SYSTEM ACCOUNT OVERRIDE: Admins, developers, and system account users
                # can create in any site, others must verify access
                if not ((hasattr(user, 'is_admin_or_developer') and user.is_admin_or_developer()) or
                        (hasattr(user, 'is_system_account_user') and user.is_system_account_user())):
                    if hasattr(user, 'get_accessible_sites'):
                        accessible_sites = user.get_accessible_sites()
                        if not accessible_sites.filter(id=site.id).exists():
                            raise PermissionDenied("You do not have access to this site")

        # Verify sector belongs to site
        if sector and hasattr(sector, 'site') and sector.site != site:

@@ -12,23 +12,13 @@ class IsAuthenticatedAndActive(permissions.BasePermission):
    Base permission for most endpoints
    """
    def has_permission(self, request, view):
        import logging
        logger = logging.getLogger(__name__)

        if not request.user or not request.user.is_authenticated:
            logger.warning(f"[IsAuthenticatedAndActive] DENIED: User not authenticated")
            return False

        # Check if user is active
        if hasattr(request.user, 'is_active'):
            is_active = request.user.is_active
            if is_active:
                logger.info(f"[IsAuthenticatedAndActive] ALLOWED: User {request.user.email} is active")
            else:
                logger.warning(f"[IsAuthenticatedAndActive] DENIED: User {request.user.email} is inactive")
            return is_active
            return request.user.is_active

        logger.info(f"[IsAuthenticatedAndActive] ALLOWED: User {request.user.email} (no is_active check)")
        return True

@@ -36,41 +26,45 @@ class HasTenantAccess(permissions.BasePermission):
    """
    Permission class that requires user to belong to the tenant/account
    Ensures tenant isolation
    Superusers, developers, and system account users bypass this check.

    CRITICAL: Every authenticated user MUST have an account.
    The middleware sets request.account from request.user.account.
    If a user doesn't have an account, it's a data integrity issue.
    """
    def has_permission(self, request, view):
        import logging
        logger = logging.getLogger(__name__)

        if not request.user or not request.user.is_authenticated:
            logger.warning(f"[HasTenantAccess] DENIED: User not authenticated")
            return False

        # SIMPLIFIED LOGIC: Every authenticated user MUST have an account
        # Middleware already set request.account from request.user.account
        # Just verify it exists
        if not hasattr(request.user, 'account'):
            logger.warning(f"[HasTenantAccess] DENIED: User {request.user.email} has no account attribute")
            return False
        # Get account from request (set by middleware)
        account = getattr(request, 'account', None)

        try:
            # Access the account to trigger any lazy loading
            user_account = request.user.account
            if not user_account:
                logger.warning(f"[HasTenantAccess] DENIED: User {request.user.email} has NULL account")
                return False

            # Success - user has a valid account
            logger.info(f"[HasTenantAccess] ALLOWED: User {request.user.email} has account {user_account.name} (ID: {user_account.id})")
            return True
        except (AttributeError, Exception) as e:
            # User doesn't have account relationship - data integrity issue
            logger.warning(f"[HasTenantAccess] DENIED: User {request.user.email} account access failed: {e}")
            return False
        # If no account in request, try to get from user
        if not account and hasattr(request.user, 'account'):
            try:
                account = request.user.account
            except (AttributeError, Exception):
                pass

        # Admin/Developer/System account users bypass tenant check
        if request.user and hasattr(request.user, 'is_authenticated') and request.user.is_authenticated:
            try:
                is_admin_or_dev = (hasattr(request.user, 'is_admin_or_developer') and
                                   request.user.is_admin_or_developer()) if request.user else False
                is_system_user = (hasattr(request.user, 'is_system_account_user') and
                                  request.user.is_system_account_user()) if request.user else False

                if is_admin_or_dev or is_system_user:
                    return True
            except (AttributeError, TypeError):
                pass

        # Regular users must have account access
        if account:
            # Check if user belongs to this account
            if hasattr(request.user, 'account'):
                try:
                    user_account = request.user.account
                    return user_account == account or user_account.id == account.id
                except (AttributeError, Exception):
                    pass

        return False


class IsViewerOrAbove(permissions.BasePermission):
@@ -79,26 +73,28 @@ class IsViewerOrAbove(permissions.BasePermission):
    For read-only operations
    """
    def has_permission(self, request, view):
        import logging
        logger = logging.getLogger(__name__)

        if not request.user or not request.user.is_authenticated:
            logger.warning(f"[IsViewerOrAbove] DENIED: User not authenticated")
            return False

        # Admin/Developer/System account users always have access
        try:
            is_admin_or_dev = (hasattr(request.user, 'is_admin_or_developer') and
                               request.user.is_admin_or_developer()) if request.user else False
            is_system_user = (hasattr(request.user, 'is_system_account_user') and
                              request.user.is_system_account_user()) if request.user else False

            if is_admin_or_dev or is_system_user:
                return True
        except (AttributeError, TypeError):
            pass

        # Check user role
        if hasattr(request.user, 'role'):
            role = request.user.role
            # viewer, editor, admin, owner all have access
            allowed = role in ['viewer', 'editor', 'admin', 'owner']
            if allowed:
                logger.info(f"[IsViewerOrAbove] ALLOWED: User {request.user.email} has role {role}")
            else:
                logger.warning(f"[IsViewerOrAbove] DENIED: User {request.user.email} has invalid role {role}")
            return allowed
            return role in ['viewer', 'editor', 'admin', 'owner']

        # If no role system, allow authenticated users
        logger.info(f"[IsViewerOrAbove] ALLOWED: User {request.user.email} (no role system)")
        return True

@@ -111,6 +107,18 @@ class IsEditorOrAbove(permissions.BasePermission):
        if not request.user or not request.user.is_authenticated:
            return False

        # Admin/Developer/System account users always have access
        try:
            is_admin_or_dev = (hasattr(request.user, 'is_admin_or_developer') and
                               request.user.is_admin_or_developer()) if request.user else False
            is_system_user = (hasattr(request.user, 'is_system_account_user') and
                              request.user.is_system_account_user()) if request.user else False

            if is_admin_or_dev or is_system_user:
                return True
        except (AttributeError, TypeError):
            pass

        # Check user role
        if hasattr(request.user, 'role'):
            role = request.user.role
@@ -130,6 +138,18 @@ class IsAdminOrOwner(permissions.BasePermission):
        if not request.user or not request.user.is_authenticated:
            return False

        # Admin/Developer/System account users always have access
        try:
            is_admin_or_dev = (hasattr(request.user, 'is_admin_or_developer') and
                               request.user.is_admin_or_developer()) if request.user else False
            is_system_user = (hasattr(request.user, 'is_system_account_user') and
                              request.user.is_system_account_user()) if request.user else False

            if is_admin_or_dev or is_system_user:
                return True
        except (AttributeError, TypeError):
            pass

        # Check user role
        if hasattr(request.user, 'role'):
            role = request.user.role
@@ -138,3 +158,5 @@ class IsAdminOrOwner(permissions.BasePermission):

        # If no role system, deny by default for security
        return False

@@ -5,8 +5,6 @@ Provides consistent response format across all endpoints
from rest_framework.response import Response
from rest_framework import status
import uuid
from typing import Any
from django.http import HttpRequest


def get_request_id(request):
@@ -76,28 +74,6 @@ def error_response(error=None, errors=None, status_code=status.HTTP_400_BAD_REQU
        'success': False,
    }

    # Backwards compatibility: some callers used positional args in the order
    # (error, status_code, request) which maps to (error, errors, status_code=request)
    # causing `status_code` to be a Request object and raising TypeError.
    # Detect this misuse and normalize arguments:
    try:
        if request is None and status_code is not None:
            # If status_code appears to be a Request object, shift arguments
            if isinstance(status_code, HttpRequest) or hasattr(status_code, 'META'):
                # original call looked like: error_response(msg, status.HTTP_400_BAD_REQUEST, request)
                # which resulted in: errors = status.HTTP_400..., status_code = request
                request = status_code
                # If `errors` holds an int-like HTTP status, use it as status_code
                if isinstance(errors, int):
                    status_code = errors
                    errors = None
                else:
                    # fallback to default 400
                    status_code = status.HTTP_400_BAD_REQUEST
    except Exception:
        # Defensive: if introspection fails, continue with provided args
        pass

    if error:
        response_data['error'] = error
    elif status_code == status.HTTP_400_BAD_REQUEST:
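The defensive block above guards against callers using the legacy positional order `error_response(msg, status_code, request)`, which lands the request object in the `status_code` slot. A minimal standalone sketch of that same argument shift, with hypothetical names and no DRF dependency:

```python
def normalize_error_args(error, errors=None, status_code=400, request=None):
    """Detect the legacy positional call (error, status_code, request)
    and shift the arguments back into their intended slots."""
    # A request-like object in the status_code slot signals the legacy order.
    if request is None and hasattr(status_code, 'META'):
        request = status_code
        if isinstance(errors, int):      # `errors` actually held the HTTP status
            status_code, errors = errors, None
        else:
            status_code = 400            # fall back to Bad Request
    return error, errors, status_code, request


class FakeRequest:
    """Stand-in for a Django/DRF request (only META is inspected)."""
    META = {}


# Legacy call shape: error_response("boom", 400, request)
print(normalize_error_args("boom", 400, FakeRequest()))
```

The first two return values come out as `("boom", None)` and the status is restored to `400`, matching what the real helper does before building the response payload.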
@@ -8,20 +8,7 @@ from drf_spectacular.utils import extend_schema, OpenApiResponse
from rest_framework import status

# Explicit tags we want to keep (from SPECTACULAR_SETTINGS)
EXPLICIT_TAGS = {
    'Authentication',
    'Planner',
    'Writer',
    'System',
    'Billing',
    'Account',
    'Automation',
    'Linker',
    'Optimizer',
    'Publisher',
    'Integration',
    'Admin Billing',
}
EXPLICIT_TAGS = {'Authentication', 'Planner', 'Writer', 'System', 'Billing'}


def postprocess_schema_filter_tags(result, generator, request, public):
@@ -34,11 +21,6 @@ def postprocess_schema_filter_tags(result, generator, request, public):
    for path, methods in result['paths'].items():
        for method, operation in methods.items():
            if isinstance(operation, dict) and 'tags' in operation:
                # Explicitly exclude system webhook from tagging/docs grouping
                if '/system/webhook' in path:
                    operation['tags'] = []
                    continue

                # Keep only explicit tags from the operation
                filtered_tags = [
                    tag for tag in operation['tags']
@@ -59,20 +41,6 @@ def postprocess_schema_filter_tags(result, generator, request, public):
                    filtered_tags = ['System']
                elif '/billing/' in path or '/api/v1/billing/' in path:
                    filtered_tags = ['Billing']
                elif '/account/' in path or '/api/v1/account/' in path:
                    filtered_tags = ['Account']
                elif '/automation/' in path or '/api/v1/automation/' in path:
                    filtered_tags = ['Automation']
                elif '/linker/' in path or '/api/v1/linker/' in path:
                    filtered_tags = ['Linker']
                elif '/optimizer/' in path or '/api/v1/optimizer/' in path:
                    filtered_tags = ['Optimizer']
                elif '/publisher/' in path or '/api/v1/publisher/' in path:
                    filtered_tags = ['Publisher']
                elif '/integration/' in path or '/api/v1/integration/' in path:
                    filtered_tags = ['Integration']
                elif '/admin/' in path or '/api/v1/admin/' in path:
                    filtered_tags = ['Admin Billing']

                operation['tags'] = filtered_tags

backend/igny8_core/api/tests/run_tests.py (new file, 25 lines)
@@ -0,0 +1,25 @@
#!/usr/bin/env python
"""
Test runner script for API tests
Run all tests: python manage.py test igny8_core.api.tests
Run specific test: python manage.py test igny8_core.api.tests.test_response
"""
import os
import sys
import django

# Setup Django
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'igny8_core.settings')
django.setup()

from django.core.management import execute_from_command_line

if __name__ == '__main__':
    # Run all API tests
    if len(sys.argv) > 1:
        # Custom test specified
        execute_from_command_line(['manage.py', 'test'] + sys.argv[1:])
    else:
        # Run all API tests
        execute_from_command_line(['manage.py', 'test', 'igny8_core.api.tests', '--verbosity=2'])

@@ -140,7 +140,7 @@ class GetModelConfigTestCase(TestCase):

    def test_get_model_config_json_mode_models(self):
        """Test get_model_config() sets response_format for JSON mode models"""
        json_models = ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo-preview', 'gpt-5.1', 'gpt-5.2']
        json_models = ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo-preview']

        for model in json_models:
            IntegrationSettings.objects.filter(account=self.account).delete()

@@ -79,7 +79,7 @@ class IntegrationTestBase(TestCase):
            sector=self.industry_sector,
            volume=1000,
            difficulty=50,
            country="US"
            intent="informational"
        )

        # Authenticate client

@@ -21,24 +21,33 @@ class DebugScopedRateThrottle(ScopedRateThrottle):

    def allow_request(self, request, view):
        """
        Check if request should be throttled.
        DISABLED - Always allow all requests.
        """
        return True
        Check if request should be throttled

        # OLD CODE BELOW (DISABLED)
        Bypasses throttling if:
        - DEBUG mode is True
        - IGNY8_DEBUG_THROTTLE environment variable is True
        - User belongs to aws-admin or other system accounts
        - User is admin/developer role
        """
        # Check if throttling should be bypassed
        debug_bypass = getattr(settings, 'DEBUG', False)
        env_bypass = getattr(settings, 'IGNY8_DEBUG_THROTTLE', False)

        # Bypass for public blueprint list requests (Sites Renderer fallback)
        public_blueprint_bypass = False
        if hasattr(view, 'action') and view.action == 'list':
            if hasattr(request, 'query_params') and request.query_params.get('site'):
                if not request.user or not hasattr(request.user, 'is_authenticated') or not request.user.is_authenticated:
                    public_blueprint_bypass = True
        # Bypass for system account users (aws-admin, default-account, etc.)
        system_account_bypass = False
        if hasattr(request, 'user') and request.user and hasattr(request.user, 'is_authenticated') and request.user.is_authenticated:
            try:
                # Check if user is in system account (aws-admin, default-account, default)
                if hasattr(request.user, 'is_system_account_user') and request.user.is_system_account_user():
                    system_account_bypass = True
                # Also bypass for admin/developer roles
                elif hasattr(request.user, 'is_admin_or_developer') and request.user.is_admin_or_developer():
                    system_account_bypass = True
            except (AttributeError, Exception):
                # If checking fails, continue with normal throttling
                pass

        if debug_bypass or env_bypass or public_blueprint_bypass:
        if debug_bypass or env_bypass or system_account_bypass:
            # In debug mode or for system accounts, still set throttle headers but don't actually throttle
            # This allows testing throttle headers without blocking requests
            if hasattr(self, 'get_rate'):
@@ -59,27 +68,9 @@ class DebugScopedRateThrottle(ScopedRateThrottle):
            }
            return True

        # Normal throttling with per-account keying
        # Normal throttling behavior
        return super().allow_request(request, view)

    def get_cache_key(self, request, view):
        """
        Override to add account-based throttle keying.
        Keys by (scope, account.id) instead of just user.
        """
        if not self.scope:
            return None

        # Get account from request
        account = getattr(request, 'account', None)
        if not account and hasattr(request, 'user') and request.user and request.user.is_authenticated:
            account = getattr(request.user, 'account', None)

        account_id = account.id if account else 'anon'

        # Build throttle key: scope:account_id
        return f'{self.scope}:{account_id}'

    def get_rate(self):
        """
        Get rate for the current scope

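The per-account keying removed in the `get_cache_key` hunk above can be illustrated in isolation; the helper below is a hypothetical reduction of that logic with the request plumbing stripped out:

```python
def account_throttle_key(scope, account_id=None):
    """Build a throttle cache key of the form 'scope:account_id',
    falling back to 'anon' when no account is attached to the request.

    Keying by account (rather than per-user) means every user in a
    tenant shares one rate-limit bucket per scope.
    """
    if not scope:
        return None  # mirrors ScopedRateThrottle: no scope -> no throttling
    return f"{scope}:{account_id if account_id is not None else 'anon'}"


print(account_throttle_key('planner', 42))   # planner:42
print(account_throttle_key('planner'))       # planner:anon
```

Under the real DRF `ScopedRateThrottle`, this string becomes the cache key whose request-timestamp history enforces the scope's rate.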
@@ -1,30 +0,0 @@
"""
URL patterns for account management API
"""
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .account_views import (
    AccountSettingsViewSet,
    TeamManagementViewSet,
    UsageAnalyticsViewSet,
    DashboardStatsViewSet
)

router = DefaultRouter()

urlpatterns = [
    # Account settings (non-router endpoints for simplified access)
    path('settings/', AccountSettingsViewSet.as_view({'get': 'retrieve', 'patch': 'partial_update'}), name='account-settings'),

    # Team management
    path('team/', TeamManagementViewSet.as_view({'get': 'list', 'post': 'create'}), name='team-list'),
    path('team/<int:pk>/', TeamManagementViewSet.as_view({'delete': 'destroy'}), name='team-detail'),

    # Usage analytics
    path('usage/analytics/', UsageAnalyticsViewSet.as_view({'get': 'overview'}), name='usage-analytics'),

    # Dashboard stats (real data for home page)
    path('dashboard/stats/', DashboardStatsViewSet.as_view({'get': 'stats'}), name='dashboard-stats'),

    path('', include(router.urls)),
]
@@ -1,400 +0,0 @@
"""
WordPress Publishing API Views

Handles manual content publishing to WordPress sites
"""
from rest_framework import status
from rest_framework.decorators import api_view, permission_classes
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from django.shortcuts import get_object_or_404
from django.utils import timezone
from typing import Dict, Any, List

from igny8_core.models import ContentPost, SiteIntegration
from igny8_core.tasks.wordpress_publishing import (
    publish_content_to_wordpress,
    bulk_publish_content_to_wordpress
)


@api_view(['POST'])
@permission_classes([IsAuthenticated])
def publish_single_content(request, content_id: int) -> Response:
    """
    Publish a single content item to WordPress

    POST /api/v1/content/{content_id}/publish-to-wordpress/

    Body:
    {
        "site_integration_id": 123,  // Optional - will use default if not provided
        "force": false               // Optional - force republish even if already published
    }
    """
    try:
        content = get_object_or_404(ContentPost, id=content_id)

        # Check permissions
        if not request.user.has_perm('content.change_contentpost'):
            return Response(
                {
                    'success': False,
                    'message': 'Permission denied',
                    'error': 'insufficient_permissions'
                },
                status=status.HTTP_403_FORBIDDEN
            )

        # Get site integration
        site_integration_id = request.data.get('site_integration_id')
        force = request.data.get('force', False)

        if site_integration_id:
            site_integration = get_object_or_404(SiteIntegration, id=site_integration_id)
        else:
            # Get default WordPress integration for user's organization
            site_integration = SiteIntegration.objects.filter(
                platform='wordpress',
                is_active=True,
                # Add organization filter if applicable
            ).first()

        if not site_integration:
            return Response(
                {
                    'success': False,
                    'message': 'No WordPress integration found',
                    'error': 'no_integration'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Check if already published (unless force is true)
        if not force and content.wordpress_sync_status == 'success':
            return Response(
                {
                    'success': True,
                    'message': 'Content already published to WordPress',
                    'data': {
                        'content_id': content.id,
                        'wordpress_post_id': content.wordpress_post_id,
                        'wordpress_post_url': content.wordpress_post_url,
                        'status': 'already_published'
                    }
                }
            )

        # Check if currently syncing
        if content.wordpress_sync_status == 'syncing':
            return Response(
                {
                    'success': False,
                    'message': 'Content is currently being published to WordPress',
                    'error': 'sync_in_progress'
                },
                status=status.HTTP_409_CONFLICT
            )

        # Validate content is ready for publishing
        if not content.title or not (content.content_html or content.content):
            return Response(
                {
                    'success': False,
                    'message': 'Content is incomplete - missing title or content',
                    'error': 'incomplete_content'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Set status to pending and queue the task
        content.wordpress_sync_status = 'pending'
        content.save(update_fields=['wordpress_sync_status'])

        # Get task_id if content is associated with a writer task
        task_id = None
        if hasattr(content, 'writer_task'):
            task_id = content.writer_task.id

        # Queue the publishing task
        task_result = publish_content_to_wordpress.delay(
            content.id,
            site_integration.id,
            task_id
        )

        return Response(
            {
                'success': True,
                'message': 'Content queued for WordPress publishing',
                'data': {
                    'content_id': content.id,
                    'site_integration_id': site_integration.id,
                    'task_id': task_result.id,
                    'status': 'queued'
                }
            },
            status=status.HTTP_202_ACCEPTED
        )

    except Exception as e:
        return Response(
            {
                'success': False,
                'message': f'Error queuing content for WordPress publishing: {str(e)}',
                'error': 'server_error'
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR
        )

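The guard clauses in `publish_single_content` above amount to a small state machine over `wordpress_sync_status` and the `force` flag. A minimal sketch of that decision logic as a standalone function (the helper name is hypothetical and not part of the module):

```python
def decide_publish_action(sync_status: str, force: bool) -> str:
    """Mirror the endpoint's guard clauses as a pure function.

    Returns which branch publish_single_content would take for a
    given sync status and force flag.
    """
    if sync_status == 'success' and not force:
        return 'already_published'   # skip republish unless forced
    if sync_status == 'syncing':
        return 'sync_in_progress'    # reject concurrent publish (409)
    return 'queue'                   # set status to pending, queue task
```

Note that `force` only bypasses the already-published check; a sync already in progress is rejected either way.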
@api_view(['POST'])
@permission_classes([IsAuthenticated])
def bulk_publish_content(request) -> Response:
    """
    Bulk publish multiple content items to WordPress

    POST /api/v1/content/bulk-publish-to-wordpress/

    Body:
    {
        "content_ids": [1, 2, 3, 4],
        "site_integration_id": 123,  // Optional
        "force": false               // Optional
    }
    """
    try:
        content_ids = request.data.get('content_ids', [])
        site_integration_id = request.data.get('site_integration_id')
        force = request.data.get('force', False)

        if not content_ids:
            return Response(
                {
                    'success': False,
                    'message': 'No content IDs provided',
                    'error': 'missing_content_ids'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Check permissions
        if not request.user.has_perm('content.change_contentpost'):
            return Response(
                {
                    'success': False,
                    'message': 'Permission denied',
                    'error': 'insufficient_permissions'
                },
                status=status.HTTP_403_FORBIDDEN
            )

        # Get site integration
        if site_integration_id:
            site_integration = get_object_or_404(SiteIntegration, id=site_integration_id)
        else:
            site_integration = SiteIntegration.objects.filter(
                platform='wordpress',
                is_active=True,
            ).first()

        if not site_integration:
            return Response(
                {
                    'success': False,
                    'message': 'No WordPress integration found',
                    'error': 'no_integration'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Validate content items
        content_items = ContentPost.objects.filter(id__in=content_ids)

        if content_items.count() != len(content_ids):
            return Response(
                {
                    'success': False,
                    'message': 'Some content items not found',
                    'error': 'content_not_found'
                },
                status=status.HTTP_404_NOT_FOUND
            )

        # Queue bulk publishing task
        task_result = bulk_publish_content_to_wordpress.delay(
            content_ids,
            site_integration.id
        )

        return Response(
            {
                'success': True,
                'message': f'{len(content_ids)} content items queued for WordPress publishing',
                'data': {
                    'content_count': len(content_ids),
                    'site_integration_id': site_integration.id,
                    'task_id': task_result.id,
                    'status': 'queued'
                }
            },
            status=status.HTTP_202_ACCEPTED
        )

    except Exception as e:
        return Response(
            {
                'success': False,
                'message': f'Error queuing bulk WordPress publishing: {str(e)}',
                'error': 'server_error'
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR
        )

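The request-body checks in `bulk_publish_content` above (empty ID list, and the count comparison that detects missing items) can be restated as a pure function for illustration. This is a hypothetical standalone form, not code from the module:

```python
def validate_bulk_request(data: dict, existing_ids: set) -> str:
    """Return the error code the bulk endpoint would emit, or '' if OK.

    existing_ids stands in for the set of ContentPost IDs found in the
    database (the endpoint compares queryset count against len(content_ids)).
    """
    content_ids = data.get('content_ids', [])
    if not content_ids:
        return 'missing_content_ids'   # 400: no IDs provided
    if not set(content_ids) <= existing_ids:
        return 'content_not_found'     # 404: some items do not exist
    return ''
```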
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def get_wordpress_status(request, content_id: int) -> Response:
    """
    Get WordPress publishing status for a content item

    GET /api/v1/content/{content_id}/wordpress-status/
    """
    try:
        content = get_object_or_404(ContentPost, id=content_id)

        return Response(
            {
                'success': True,
                'data': {
                    'content_id': content.id,
                    'wordpress_sync_status': content.wordpress_sync_status,
                    'wordpress_post_id': content.wordpress_post_id,
                    'wordpress_post_url': content.wordpress_post_url,
                    'wordpress_sync_attempts': content.wordpress_sync_attempts,
                    'last_wordpress_sync': content.last_wordpress_sync.isoformat() if content.last_wordpress_sync else None,
                }
            }
        )

    except Exception as e:
        return Response(
            {
                'success': False,
                'message': f'Error getting WordPress status: {str(e)}',
                'error': 'server_error'
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR
        )


@api_view(['GET'])
@permission_classes([IsAuthenticated])
def get_wordpress_integrations(request) -> Response:
    """
    Get available WordPress integrations for publishing

    GET /api/v1/wordpress-integrations/
    """
    try:
        integrations = SiteIntegration.objects.filter(
            platform='wordpress',
            is_active=True,
            # Add organization filter if applicable
        ).values(
            'id', 'site_name', 'site_url', 'is_active',
            'created_at', 'last_sync_at'
        )

        return Response(
            {
                'success': True,
                'data': list(integrations)
            }
        )

    except Exception as e:
        return Response(
            {
                'success': False,
                'message': f'Error getting WordPress integrations: {str(e)}',
                'error': 'server_error'
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR
        )


@api_view(['POST'])
@permission_classes([IsAuthenticated])
def retry_failed_wordpress_sync(request, content_id: int) -> Response:
    """
    Retry a failed WordPress sync

    POST /api/v1/content/{content_id}/retry-wordpress-sync/
    """
    try:
        content = get_object_or_404(ContentPost, id=content_id)

        if content.wordpress_sync_status != 'failed':
            return Response(
                {
                    'success': False,
                    'message': 'Content is not in failed status',
                    'error': 'invalid_status'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Get default WordPress integration
        site_integration = SiteIntegration.objects.filter(
            platform='wordpress',
            is_active=True,
        ).first()

        if not site_integration:
            return Response(
                {
                    'success': False,
                    'message': 'No WordPress integration found',
                    'error': 'no_integration'
                },
                status=status.HTTP_400_BAD_REQUEST
            )

        # Reset status and retry
        content.wordpress_sync_status = 'pending'
        content.save(update_fields=['wordpress_sync_status'])

        # Get task_id if available
        task_id = None
        if hasattr(content, 'writer_task'):
            task_id = content.writer_task.id

        # Queue the publishing task
        task_result = publish_content_to_wordpress.delay(
            content.id,
            site_integration.id,
            task_id
        )

        return Response(
            {
                'success': True,
                'message': 'WordPress sync retry queued',
                'data': {
                    'content_id': content.id,
                    'task_id': task_result.id,
                    'status': 'queued'
                }
            },
            status=status.HTTP_202_ACCEPTED
        )

    except Exception as e:
        return Response(
            {
                'success': False,
                'message': f'Error retrying WordPress sync: {str(e)}',
                'error': 'server_error'
            },
            status=status.HTTP_500_INTERNAL_SERVER_ERROR
        )
File diff suppressed because it is too large
@@ -1,35 +0,0 @@
"""
Custom Authentication Backend - No Caching
Prevents cross-request user contamination by disabling Django's default user caching
"""
from django.contrib.auth.backends import ModelBackend


class NoCacheModelBackend(ModelBackend):
    """
    Custom authentication backend that disables user object caching.

    Django's default ModelBackend caches the user object in thread-local storage,
    which can cause cross-request contamination when the same worker process
    handles requests from different users.

    This backend forces a fresh DB query on EVERY request to prevent user swapping.
    """

    def get_user(self, user_id):
        """
        Get user from database WITHOUT caching.

        This overrides the default behavior which caches user objects
        at the process level, causing session contamination.
        """
        from django.contrib.auth import get_user_model
        UserModel = get_user_model()

        try:
            # CRITICAL: Use select_related to load account/plan in ONE query
            # But do NOT cache the result - return fresh object every time
            user = UserModel.objects.select_related('account', 'account__plan').get(pk=user_id)
            return user
        except UserModel.DoesNotExist:
            return None
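For a backend like `NoCacheModelBackend` to take effect it must be listed in `AUTHENTICATION_BACKENDS`. A sketch of the settings entry, assuming the module lives at `igny8_core.auth.backends` (the file path is not shown in this diff, so the dotted path is a guess):

```python
# settings.py sketch — the dotted module path below is an assumption;
# the diff does not show which file defines NoCacheModelBackend.
AUTHENTICATION_BACKENDS = [
    'igny8_core.auth.backends.NoCacheModelBackend',
]
```

Django tries backends in order, so replacing the default `django.contrib.auth.backends.ModelBackend` entry (rather than appending) is what makes the no-cache behavior apply to every session lookup.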
@@ -8,7 +8,7 @@ from django.db.models import Q
from igny8_core.auth.models import Account, User, Site, Sector
from igny8_core.modules.planner.models import Keywords, Clusters, ContentIdeas
from igny8_core.modules.writer.models import Tasks, Images, Content
from igny8_core.business.billing.models import CreditTransaction, CreditUsageLog
from igny8_core.modules.billing.models import CreditTransaction, CreditUsageLog
from igny8_core.modules.system.models import AIPrompt, IntegrationSettings, AuthorProfile, Strategy
from igny8_core.modules.system.settings_models import AccountSettings, UserSettings, ModuleSettings, AISettings

@@ -1,82 +0,0 @@
"""
Management command to clean up expired and orphaned sessions
Helps prevent session contamination and reduces DB bloat
"""
from django.core.management.base import BaseCommand
from django.contrib.sessions.models import Session
from django.contrib.auth import get_user_model
from datetime import datetime, timedelta

User = get_user_model()


class Command(BaseCommand):
    help = 'Clean up expired sessions and detect session contamination'

    def add_arguments(self, parser):
        parser.add_argument(
            '--dry-run',
            action='store_true',
            help='Show what would be deleted without actually deleting',
        )
        parser.add_argument(
            '--days',
            type=int,
            default=7,
            help='Delete sessions older than X days (default: 7)',
        )

    def handle(self, *args, **options):
        dry_run = options['dry_run']
        days = options['days']
        cutoff_date = datetime.now() - timedelta(days=days)

        # Get all sessions
        all_sessions = Session.objects.all()
        expired_sessions = Session.objects.filter(expire_date__lt=datetime.now())
        old_sessions = Session.objects.filter(expire_date__lt=cutoff_date)

        self.stdout.write(f"\n📊 Session Statistics:")
        self.stdout.write(f"  Total sessions: {all_sessions.count()}")
        self.stdout.write(f"  Expired sessions: {expired_sessions.count()}")
        self.stdout.write(f"  Sessions older than {days} days: {old_sessions.count()}")

        # Count sessions by user
        user_sessions = {}
        for session in all_sessions:
            try:
                data = session.get_decoded()
                user_id = data.get('_auth_user_id')
                if user_id:
                    user = User.objects.get(id=user_id)
                    key = f"{user.username} ({user.account.slug if user.account else 'no-account'})"
                    user_sessions[key] = user_sessions.get(key, 0) + 1
            except:
                pass

        if user_sessions:
            self.stdout.write(f"\n📈 Active sessions by user:")
            for user_key, count in sorted(user_sessions.items(), key=lambda x: x[1], reverse=True)[:10]:
                indicator = "⚠️ " if count > 20 else "  "
                self.stdout.write(f"{indicator}{user_key}: {count} sessions")

        # Delete expired sessions
        if expired_sessions.exists():
            if dry_run:
                self.stdout.write(self.style.WARNING(f"\n[DRY RUN] Would delete {expired_sessions.count()} expired sessions"))
            else:
                count = expired_sessions.delete()[0]
                self.stdout.write(self.style.SUCCESS(f"\n✓ Deleted {count} expired sessions"))
        else:
            self.stdout.write(f"\n✓ No expired sessions to clean")

        # Detect potential contamination
        warnings = []
        for user_key, count in user_sessions.items():
            if count > 50:
                warnings.append(f"User '{user_key}' has {count} active sessions (potential proliferation)")

        if warnings:
            self.stdout.write(self.style.WARNING(f"\n⚠️ Contamination Warnings:"))
            for warning in warnings:
                self.stdout.write(self.style.WARNING(f"  {warning}"))
            self.stdout.write(f"\n💡 Consider running: python manage.py clearsessions")
@@ -1,57 +0,0 @@
"""
Management command to create or update the Free Trial plan
"""
from django.core.management.base import BaseCommand
from igny8_core.auth.models import Plan


class Command(BaseCommand):
    help = 'Create or update the Free Trial plan for signup'

    def handle(self, *args, **options):
        self.stdout.write('Creating/updating Free Trial plan...')

        plan, created = Plan.objects.update_or_create(
            slug='free-trial',
            defaults={
                'name': 'Free Trial',
                'price': 0.00,
                'billing_cycle': 'monthly',
                'included_credits': 2000,  # 2000 credits for trial
                'credits_per_month': 2000,  # Legacy field
                'max_sites': 1,
                'max_users': 1,
                'max_industries': 3,  # 3 sectors per site
                'max_author_profiles': 2,
                'is_active': True,
                'features': ['ai_writer', 'planner', 'basic_support'],
                'allow_credit_topup': False,  # No top-up during trial
                'extra_credit_price': 0.00,
            }
        )

        if created:
            self.stdout.write(self.style.SUCCESS(
                f'✓ Created Free Trial plan (ID: {plan.id})'
            ))
        else:
            self.stdout.write(self.style.SUCCESS(
                f'✓ Updated Free Trial plan (ID: {plan.id})'
            ))

        self.stdout.write(self.style.SUCCESS(
            f'  - Credits: {plan.included_credits}'
        ))
        self.stdout.write(self.style.SUCCESS(
            f'  - Max Sites: {plan.max_sites}'
        ))
        self.stdout.write(self.style.SUCCESS(
            f'  - Max Sectors: {plan.max_industries}'
        ))
        self.stdout.write(self.style.SUCCESS(
            f'  - Status: {"Active" if plan.is_active else "Inactive"}'
        ))

        self.stdout.write(self.style.SUCCESS(
            '\nFree Trial plan is ready for signup!'
        ))
@@ -1,42 +0,0 @@
from django.core.management.base import BaseCommand
from django.utils import timezone

from igny8_core.auth.models import Account, Site, Sector
from igny8_core.business.planning.models import Clusters, Keywords, ContentIdeas
from igny8_core.business.content.models import Tasks, Content, Images


class Command(BaseCommand):
    help = "Permanently delete soft-deleted records whose retention window has expired."

    def handle(self, *args, **options):
        now = timezone.now()
        total_deleted = 0

        models = [
            Account,
            Site,
            Sector,
            Clusters,
            Keywords,
            ContentIdeas,
            Tasks,
            Content,
            Images,
        ]

        for model in models:
            qs = model.all_objects.filter(is_deleted=True, restore_until__lt=now)
            if model is Account:
                qs = qs.exclude(slug='aws-admin')
            count = qs.count()
            if count:
                qs.delete()
                total_deleted += count
                self.stdout.write(self.style.SUCCESS(f"Purged {count} {model.__name__} record(s)."))

        if total_deleted == 0:
            self.stdout.write("No expired soft-deleted records to purge.")
        else:
            self.stdout.write(self.style.SUCCESS(f"Total purged: {total_deleted}"))

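The purge command above applies one retention rule per record: a row is permanently deleted only if it is soft-deleted and its restore window has passed. A sketch of that rule as a standalone predicate (hypothetical helper, shown for illustration):

```python
from datetime import datetime

def is_purgeable(is_deleted: bool, restore_until: datetime, now: datetime) -> bool:
    """Mirror the queryset filter: is_deleted=True, restore_until__lt=now."""
    return is_deleted and restore_until < now
```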
@@ -2,27 +2,9 @@
"""
Multi-Account Middleware
Extracts account from JWT token and injects into request context
"""
import logging
from django.utils.deprecation import MiddlewareMixin
from django.http import JsonResponse
from django.contrib.auth import logout
from rest_framework import status
import json
from datetime import datetime

logger = logging.getLogger('auth.middleware')

# Logout reason codes for precise tracking
LOGOUT_REASONS = {
    'SESSION_ACCOUNT_MISMATCH': 'Session contamination: account ID mismatch',
    'SESSION_USER_MISMATCH': 'Session contamination: user ID mismatch',
    'ACCOUNT_MISSING': 'Account not configured for this user',
    'ACCOUNT_SUSPENDED': 'Account is suspended',
    'ACCOUNT_CANCELLED': 'Account is cancelled',
    'PLAN_MISSING': 'No subscription plan assigned',
    'PLAN_INACTIVE': 'Subscription plan is inactive',
    'USER_INACTIVE': 'User account is inactive',
}

try:
    import jwt
@@ -48,25 +30,30 @@ class AccountContextMiddleware(MiddlewareMixin):
        # First, try to get user from Django session (cookie-based auth)
        # This handles cases where frontend uses credentials: 'include' with session cookies
        if hasattr(request, 'user') and request.user and request.user.is_authenticated:
            # CRITICAL FIX: Never query DB again or mutate request.user
            # Django's AuthenticationMiddleware already loaded the user correctly
            # Just use it directly and set request.account from the ALREADY LOADED relationship
            # User is authenticated via session - refresh from DB to get latest account/plan data
            # This ensures changes to account/plan are reflected immediately without re-login
            try:
                # Validate account/plan - but use the user object already set by Django
                validation_error = self._validate_account_and_plan(request, request.user)
                if validation_error:
                    return validation_error

                # Set request.account from the user's account relationship
                # This is already loaded, no need to query DB again
                request.account = getattr(request.user, 'account', None)

                # REMOVED: Session contamination checks on every request
                # These were causing random logouts - session integrity handled by Django

                return None
            except (AttributeError, Exception):
                # If anything fails, just set account to None and continue
                from .models import User as UserModel
                # Refresh user from DB with account and plan relationships to get latest data
                # This is important so account/plan changes are reflected immediately
                user = UserModel.objects.select_related('account', 'account__plan').get(id=request.user.id)
                # Update request.user with fresh data
                request.user = user
                # Get account from refreshed user
                user_account = getattr(user, 'account', None)
                if user_account:
                    request.account = user_account
                return None
            except (AttributeError, UserModel.DoesNotExist, Exception):
                # If refresh fails, fallback to cached account
                try:
                    user_account = getattr(request.user, 'account', None)
                    if user_account:
                        request.account = user_account
                        return None
                except (AttributeError, Exception):
                    pass
                # If account access fails (e.g., column mismatch), set to None
                request.account = None
                return None

@@ -89,6 +76,7 @@ class AccountContextMiddleware(MiddlewareMixin):
        if not JWT_AVAILABLE:
            # JWT library not installed yet - skip for now
            request.account = None
            request.user = None
            return None

        # Decode JWT token with signature verification
@@ -106,92 +94,42 @@
            if user_id:
                from .models import User, Account
                try:
                    # Get user from DB (but don't set request.user - let DRF authentication handle that)
                    # Only set request.account for account context
                    # Refresh user from DB with account and plan relationships to get latest data
                    # This ensures changes to account/plan are reflected immediately without re-login
                    user = User.objects.select_related('account', 'account__plan').get(id=user_id)
                    validation_error = self._validate_account_and_plan(request, user)
                    if validation_error:
                        return validation_error
                    request.user = user
                    if account_id:
                        # Verify account still exists
                        try:
                            account = Account.objects.get(id=account_id)
                        # Verify account still exists and matches user
                        account = Account.objects.get(id=account_id)
                        # If user's account changed, use the new one from user object
                        if user.account and user.account.id != account_id:
                            request.account = user.account
                        else:
                            request.account = account
                        except Account.DoesNotExist:
                            # Account from token doesn't exist - don't fallback, set to None
                            request.account = None
                    else:
                        # No account_id in token - set to None (don't fallback to user.account)
                        request.account = None
                        try:
                            user_account = getattr(user, 'account', None)
                            if user_account:
                                request.account = user_account
                            else:
                                request.account = None
                        except (AttributeError, Exception):
                            # If account access fails (e.g., column mismatch), set to None
                            request.account = None
                except (User.DoesNotExist, Account.DoesNotExist):
                    request.account = None
                    request.user = None
            else:
                request.account = None
                request.user = None

        except jwt.InvalidTokenError:
            request.account = None
            request.user = None
        except Exception:
            # Fail silently for now - allow unauthenticated access
            request.account = None
            request.user = None

        return None

    def _validate_account_and_plan(self, request, user):
        """
        Ensure the authenticated user has an account and an active plan.
        Uses shared validation helper for consistency.
        """
        from .utils import validate_account_and_plan

        is_valid, error_message, http_status = validate_account_and_plan(user)

        if not is_valid:
            return self._deny_request(request, error_message, http_status)

        return None

    def _deny_request(self, request, error, status_code):
        """Logout session users (if any) and return a consistent JSON error with detailed tracking."""
        # Determine logout reason code based on error message
        reason_code = 'UNKNOWN'
        if 'Account not configured' in error or 'Account not found' in error:
            reason_code = 'ACCOUNT_MISSING'
        elif 'suspended' in error.lower():
            reason_code = 'ACCOUNT_SUSPENDED'
        elif 'cancelled' in error.lower():
            reason_code = 'ACCOUNT_CANCELLED'
        elif 'No subscription plan' in error or 'plan assigned' in error.lower():
            reason_code = 'PLAN_MISSING'
        elif 'plan is inactive' in error.lower() or 'Active subscription required' in error:
            reason_code = 'PLAN_INACTIVE'
        elif 'inactive' in error.lower():
            reason_code = 'USER_INACTIVE'

        try:
            if hasattr(request, 'user') and request.user and request.user.is_authenticated:
                logger.warning(
                    f"[AUTO-LOGOUT] {reason_code}: {error}. "
                    f"User={request.user.id}, Account={getattr(request, 'account', None)}, "
                    f"Path={request.path}, IP={request.META.get('REMOTE_ADDR')}, "
                    f"Status={status_code}, Timestamp={datetime.now().isoformat()}"
                )
                logout(request)
        except Exception as e:
            logger.error(f"[AUTO-LOGOUT] Error during logout: {e}")

        return JsonResponse(
            {
                'success': False,
                'error': error,
                'logout_reason': reason_code,
                'logout_message': LOGOUT_REASONS.get(reason_code, error),
                'logout_path': request.path,
                'logout_context': {
                    'user_id': request.user.id if hasattr(request, 'user') and request.user and request.user.is_authenticated else None,
                    'account_id': getattr(request, 'account', None).id if hasattr(request, 'account') and getattr(request, 'account', None) else None,
                    'status_code': status_code,
                }
            },
            status=status_code,
        )

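The `_deny_request` method above classifies a free-text error message into a logout reason code by substring matching, where the order of the checks matters (e.g. "plan is inactive" must be tested before the bare "inactive" fallback). A standalone restatement of that mapping (hypothetical function name, logic copied from the elif chain):

```python
def classify_logout_reason(error: str) -> str:
    """Map an error message to a logout reason code, as in _deny_request.

    Checks run in the same order as the original elif chain; earlier,
    more specific matches win over later, broader ones.
    """
    if 'Account not configured' in error or 'Account not found' in error:
        return 'ACCOUNT_MISSING'
    if 'suspended' in error.lower():
        return 'ACCOUNT_SUSPENDED'
    if 'cancelled' in error.lower():
        return 'ACCOUNT_CANCELLED'
    if 'No subscription plan' in error or 'plan assigned' in error.lower():
        return 'PLAN_MISSING'
    if 'plan is inactive' in error.lower() or 'Active subscription required' in error:
        return 'PLAN_INACTIVE'
    if 'inactive' in error.lower():
        return 'USER_INACTIVE'
    return 'UNKNOWN'
```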
@@ -1,4 +1,4 @@
# Generated by Django 5.2.8 on 2025-11-20 23:27
# Generated by Django 5.2.7 on 2025-11-02 21:42

import django.contrib.auth.models
import django.contrib.auth.validators
@@ -25,22 +25,12 @@ class Migration(migrations.Migration):
                ('name', models.CharField(max_length=255)),
                ('slug', models.SlugField(max_length=255, unique=True)),
                ('price', models.DecimalField(decimal_places=2, max_digits=10)),
                ('billing_cycle', models.CharField(choices=[('monthly', 'Monthly'), ('annual', 'Annual')], default='monthly', max_length=20)),
                ('features', models.JSONField(blank=True, default=list, help_text="Plan features as JSON array (e.g., ['ai_writer', 'image_gen', 'auto_publish'])")),
                ('credits_per_month', models.IntegerField(default=0, validators=[django.core.validators.MinValueValidator(0)])),
                ('max_sites', models.IntegerField(default=1, help_text='Maximum number of sites allowed (1-10)', validators=[django.core.validators.MinValueValidator(1), django.core.validators.MaxValueValidator(10)])),
                ('features', models.JSONField(default=dict, help_text='Plan features as JSON')),
                ('stripe_price_id', models.CharField(blank=True, max_length=255, null=True)),
                ('is_active', models.BooleanField(default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('max_users', models.IntegerField(default=1, help_text='Total users allowed per account', validators=[django.core.validators.MinValueValidator(1)])),
                ('max_sites', models.IntegerField(default=1, help_text='Maximum number of sites allowed', validators=[django.core.validators.MinValueValidator(1)])),
                ('max_industries', models.IntegerField(blank=True, default=None, help_text='Optional limit for industries/sectors', null=True, validators=[django.core.validators.MinValueValidator(1)])),
                ('max_author_profiles', models.IntegerField(default=5, help_text='Limit for saved writing styles', validators=[django.core.validators.MinValueValidator(0)])),
                ('included_credits', models.IntegerField(default=0, help_text='Monthly credits included', validators=[django.core.validators.MinValueValidator(0)])),
                ('extra_credit_price', models.DecimalField(decimal_places=2, default=0.01, help_text='Price per additional credit', max_digits=10)),
                ('allow_credit_topup', models.BooleanField(default=True, help_text='Can user purchase more credits?')),
                ('auto_credit_topup_threshold', models.IntegerField(blank=True, default=None, help_text='Auto top-up trigger point (optional)', null=True, validators=[django.core.validators.MinValueValidator(0)])),
                ('auto_credit_topup_amount', models.IntegerField(blank=True, default=None, help_text='How many credits to auto-buy', null=True, validators=[django.core.validators.MinValueValidator(1)])),
                ('stripe_product_id', models.CharField(blank=True, help_text='For Stripe plan sync', max_length=255, null=True)),
                ('stripe_price_id', models.CharField(blank=True, help_text='Monthly price ID for Stripe', max_length=255, null=True)),
                ('credits_per_month', models.IntegerField(default=0, help_text='DEPRECATED: Use included_credits instead', validators=[django.core.validators.MinValueValidator(0)])),
            ],
            options={
                'db_table': 'igny8_plans',
@@ -60,7 +50,7 @@ class Migration(migrations.Migration):
                ('is_staff', models.BooleanField(default=False, help_text='Designates whether the user can log into this admin site.', verbose_name='staff status')),
                ('is_active', models.BooleanField(default=True, help_text='Designates whether this user should be treated as active. Unselect this instead of deleting accounts.', verbose_name='active')),
                ('date_joined', models.DateTimeField(default=django.utils.timezone.now, verbose_name='date joined')),
                ('role', models.CharField(choices=[('developer', 'Developer / Super Admin'), ('owner', 'Owner'), ('admin', 'Admin'), ('editor', 'Editor'), ('viewer', 'Viewer'), ('system_bot', 'System Bot')], default='viewer', max_length=20)),
                ('role', models.CharField(choices=[('owner', 'Owner'), ('admin', 'Admin'), ('editor', 'Editor'), ('viewer', 'Viewer'), ('system_bot', 'System Bot')], default='viewer', max_length=20)),
                ('email', models.EmailField(max_length=254, unique=True, verbose_name='email address')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
@@ -75,7 +65,7 @@ class Migration(migrations.Migration):
            ],
        ),
        migrations.CreateModel(
            name='Account',
            name='Tenant',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255)),
@@ -85,93 +75,28 @@ class Migration(migrations.Migration):
                ('status', models.CharField(choices=[('active', 'Active'), ('suspended', 'Suspended'), ('trial', 'Trial'), ('cancelled', 'Cancelled')], default='trial', max_length=20)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('owner', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='owned_accounts', to=settings.AUTH_USER_MODEL)),
                ('plan', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='accounts', to='igny8_core_auth.plan')),
                ('owner', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='owned_tenants', to=settings.AUTH_USER_MODEL)),
                ('plan', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='tenants', to='igny8_core_auth.plan')),
            ],
            options={
                'verbose_name': 'Account',
                'verbose_name_plural': 'Accounts',
                'db_table': 'igny8_tenants',
            },
        ),
        migrations.AddField(
            model_name='user',
            name='account',
            field=models.ForeignKey(blank=True, db_column='tenant_id', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='users', to='igny8_core_auth.account'),
        ),
        migrations.CreateModel(
            name='Industry',
            name='Subscription',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255, unique=True)),
                ('slug', models.SlugField(max_length=255, unique=True)),
                ('description', models.TextField(blank=True, null=True)),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('stripe_subscription_id', models.CharField(max_length=255, unique=True)),
|
||||
('status', models.CharField(choices=[('active', 'Active'), ('past_due', 'Past Due'), ('canceled', 'Canceled'), ('trialing', 'Trialing')], max_length=20)),
|
||||
('current_period_start', models.DateTimeField()),
|
||||
('current_period_end', models.DateTimeField()),
|
||||
('cancel_at_period_end', models.BooleanField(default=False)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('tenant', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='subscription', to='igny8_core_auth.tenant')),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'Industry',
|
||||
'verbose_name_plural': 'Industries',
|
||||
'db_table': 'igny8_industries',
|
||||
'ordering': ['name'],
|
||||
'indexes': [models.Index(fields=['slug'], name='igny8_indus_slug_2f8769_idx'), models.Index(fields=['is_active'], name='igny8_indus_is_acti_146d41_idx')],
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='IndustrySector',
|
||||
fields=[
|
||||
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('name', models.CharField(max_length=255)),
|
||||
('slug', models.SlugField(max_length=255)),
|
||||
('description', models.TextField(blank=True, null=True)),
|
||||
('suggested_keywords', models.JSONField(default=list, help_text='List of suggested keywords for this sector template')),
|
||||
('is_active', models.BooleanField(db_index=True, default=True)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('industry', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='sectors', to='igny8_core_auth.industry')),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'Industry Sector',
|
||||
'verbose_name_plural': 'Industry Sectors',
|
||||
'db_table': 'igny8_industry_sectors',
|
||||
'ordering': ['industry', 'name'],
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='PasswordResetToken',
|
||||
fields=[
|
||||
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('token', models.CharField(db_index=True, max_length=255, unique=True)),
|
||||
('expires_at', models.DateTimeField()),
|
||||
('used', models.BooleanField(default=False)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='password_reset_tokens', to=settings.AUTH_USER_MODEL)),
|
||||
],
|
||||
options={
|
||||
'db_table': 'igny8_password_reset_tokens',
|
||||
'ordering': ['-created_at'],
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='SeedKeyword',
|
||||
fields=[
|
||||
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('keyword', models.CharField(db_index=True, max_length=255)),
|
||||
('volume', models.IntegerField(default=0, help_text='Search volume estimate')),
|
||||
('difficulty', models.IntegerField(default=0, help_text='Keyword difficulty (0-100)', validators=[django.core.validators.MinValueValidator(0), django.core.validators.MaxValueValidator(100)])),
|
||||
('intent', models.CharField(choices=[('informational', 'Informational'), ('navigational', 'Navigational'), ('commercial', 'Commercial'), ('transactional', 'Transactional')], default='informational', max_length=50)),
|
||||
('is_active', models.BooleanField(db_index=True, default=True)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('industry', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='seed_keywords', to='igny8_core_auth.industry')),
|
||||
('sector', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='seed_keywords', to='igny8_core_auth.industrysector')),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'Seed Keyword',
|
||||
'verbose_name_plural': 'Seed Keywords',
|
||||
'db_table': 'igny8_seed_keywords',
|
||||
'ordering': ['keyword'],
|
||||
'db_table': 'igny8_subscriptions',
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
@@ -186,18 +111,13 @@ class Migration(migrations.Migration):
|
||||
('status', models.CharField(choices=[('active', 'Active'), ('inactive', 'Inactive'), ('suspended', 'Suspended')], default='active', max_length=20)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('wp_url', models.URLField(blank=True, help_text='WordPress site URL (legacy - use SiteIntegration)', null=True)),
|
||||
('wp_url', models.URLField(blank=True, help_text='WordPress site URL', null=True)),
|
||||
('wp_username', models.CharField(blank=True, max_length=255, null=True)),
|
||||
('wp_app_password', models.CharField(blank=True, max_length=255, null=True)),
|
||||
('site_type', models.CharField(choices=[('marketing', 'Marketing Site'), ('ecommerce', 'Ecommerce Site'), ('blog', 'Blog'), ('portfolio', 'Portfolio'), ('corporate', 'Corporate')], db_index=True, default='marketing', help_text='Type of site', max_length=50)),
|
||||
('hosting_type', models.CharField(choices=[('igny8_sites', 'IGNY8 Sites'), ('wordpress', 'WordPress'), ('shopify', 'Shopify'), ('multi', 'Multi-Destination')], db_index=True, default='igny8_sites', help_text='Target hosting platform', max_length=50)),
|
||||
('seo_metadata', models.JSONField(blank=True, default=dict, help_text='SEO metadata: meta tags, Open Graph, Schema.org')),
|
||||
('account', models.ForeignKey(db_column='tenant_id', on_delete=django.db.models.deletion.CASCADE, related_name='%(class)s_set', to='igny8_core_auth.account')),
|
||||
('industry', models.ForeignKey(blank=True, help_text='Industry this site belongs to', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='sites', to='igny8_core_auth.industry')),
|
||||
('tenant', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='%(class)s_set', to='igny8_core_auth.tenant')),
|
||||
],
|
||||
options={
|
||||
'db_table': 'igny8_sites',
|
||||
'ordering': ['-created_at'],
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
@@ -211,14 +131,18 @@ class Migration(migrations.Migration):
|
||||
('status', models.CharField(choices=[('active', 'Active'), ('inactive', 'Inactive')], default='active', max_length=20)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('account', models.ForeignKey(db_column='tenant_id', on_delete=django.db.models.deletion.CASCADE, related_name='%(class)s_set', to='igny8_core_auth.account')),
|
||||
('industry_sector', models.ForeignKey(blank=True, help_text='Reference to the industry sector template', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='site_sectors', to='igny8_core_auth.industrysector')),
|
||||
('site', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='sectors', to='igny8_core_auth.site')),
|
||||
('tenant', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='%(class)s_set', to='igny8_core_auth.tenant')),
|
||||
],
|
||||
options={
|
||||
'db_table': 'igny8_sectors',
|
||||
},
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='user',
|
||||
name='tenant',
|
||||
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='users', to='igny8_core_auth.tenant'),
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='SiteUserAccess',
|
||||
fields=[
|
||||
@@ -229,111 +153,34 @@ class Migration(migrations.Migration):
|
||||
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='site_access', to=settings.AUTH_USER_MODEL)),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'Site User Access',
|
||||
'verbose_name_plural': 'Site User Access',
|
||||
'db_table': 'igny8_site_user_access',
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='Subscription',
|
||||
fields=[
|
||||
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('stripe_subscription_id', models.CharField(max_length=255, unique=True)),
|
||||
('status', models.CharField(choices=[('active', 'Active'), ('past_due', 'Past Due'), ('canceled', 'Canceled'), ('trialing', 'Trialing')], max_length=20)),
|
||||
('current_period_start', models.DateTimeField()),
|
||||
('current_period_end', models.DateTimeField()),
|
||||
('cancel_at_period_end', models.BooleanField(default=False)),
|
||||
('created_at', models.DateTimeField(auto_now_add=True)),
|
||||
('updated_at', models.DateTimeField(auto_now=True)),
|
||||
('account', models.OneToOneField(db_column='tenant_id', on_delete=django.db.models.deletion.CASCADE, related_name='subscription', to='igny8_core_auth.account')),
|
||||
],
|
||||
options={
|
||||
'db_table': 'igny8_subscriptions',
|
||||
'indexes': [models.Index(fields=['user', 'site'], name='igny8_site__user_id_61951e_idx')],
|
||||
'unique_together': {('user', 'site')},
|
||||
},
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='user',
|
||||
index=models.Index(fields=['account', 'role'], name='igny8_users_tenant__0ab02b_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='user',
|
||||
index=models.Index(fields=['email'], name='igny8_users_email_fd61ff_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='industrysector',
|
||||
index=models.Index(fields=['industry', 'is_active'], name='igny8_indus_industr_00b524_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='industrysector',
|
||||
index=models.Index(fields=['slug'], name='igny8_indus_slug_101d63_idx'),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='industrysector',
|
||||
unique_together={('industry', 'slug')},
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='passwordresettoken',
|
||||
index=models.Index(fields=['token'], name='igny8_passw_token_0eaf0c_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='passwordresettoken',
|
||||
index=models.Index(fields=['user', 'used'], name='igny8_passw_user_id_320c02_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='passwordresettoken',
|
||||
index=models.Index(fields=['expires_at'], name='igny8_passw_expires_c9aa03_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='account',
|
||||
model_name='tenant',
|
||||
index=models.Index(fields=['slug'], name='igny8_tenan_slug_f25e97_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='account',
|
||||
model_name='tenant',
|
||||
index=models.Index(fields=['status'], name='igny8_tenan_status_5dc02a_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='seedkeyword',
|
||||
index=models.Index(fields=['keyword'], name='igny8_seed__keyword_efa089_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='seedkeyword',
|
||||
index=models.Index(fields=['industry', 'sector'], name='igny8_seed__industr_c41841_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='seedkeyword',
|
||||
index=models.Index(fields=['industry', 'sector', 'is_active'], name='igny8_seed__industr_da0030_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='seedkeyword',
|
||||
index=models.Index(fields=['intent'], name='igny8_seed__intent_15020d_idx'),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='seedkeyword',
|
||||
unique_together={('keyword', 'industry', 'sector')},
|
||||
model_name='subscription',
|
||||
index=models.Index(fields=['status'], name='igny8_subsc_status_2fa897_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='site',
|
||||
index=models.Index(fields=['account', 'is_active'], name='igny8_sites_tenant__e0f31d_idx'),
|
||||
index=models.Index(fields=['tenant', 'is_active'], name='igny8_sites_tenant__e0f31d_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='site',
|
||||
index=models.Index(fields=['account', 'status'], name='igny8_sites_tenant__a20275_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='site',
|
||||
index=models.Index(fields=['industry'], name='igny8_sites_industr_66e004_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='site',
|
||||
index=models.Index(fields=['site_type'], name='igny8_sites_site_ty_0dfbc3_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='site',
|
||||
index=models.Index(fields=['hosting_type'], name='igny8_sites_hosting_c484c2_idx'),
|
||||
index=models.Index(fields=['tenant', 'status'], name='igny8_sites_tenant__a20275_idx'),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='site',
|
||||
unique_together={('account', 'slug')},
|
||||
unique_together={('tenant', 'slug')},
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='sector',
|
||||
@@ -341,26 +188,18 @@ class Migration(migrations.Migration):
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='sector',
|
||||
index=models.Index(fields=['account', 'site'], name='igny8_secto_tenant__af54ae_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='sector',
|
||||
index=models.Index(fields=['industry_sector'], name='igny8_secto_industr_1cf990_idx'),
|
||||
index=models.Index(fields=['tenant', 'site'], name='igny8_secto_tenant__af54ae_idx'),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='sector',
|
||||
unique_together={('site', 'slug')},
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='siteuseraccess',
|
||||
index=models.Index(fields=['user', 'site'], name='igny8_site__user_id_61951e_idx'),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='siteuseraccess',
|
||||
unique_together={('user', 'site')},
|
||||
model_name='user',
|
||||
index=models.Index(fields=['tenant', 'role'], name='igny8_users_tenant__0ab02b_idx'),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name='subscription',
|
||||
index=models.Index(fields=['status'], name='igny8_subsc_status_2fa897_idx'),
|
||||
model_name='user',
|
||||
index=models.Index(fields=['email'], name='igny8_users_email_fd61ff_idx'),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -0,0 +1,13 @@
# Generated by Django 5.2.7 on 2025-11-02 22:27

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0001_initial'),
    ]

    operations = [
    ]
@@ -1,19 +0,0 @@
# Generated manually for adding wp_api_key to Site model

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='site',
            name='wp_api_key',
            field=models.CharField(blank=True, help_text='API key for WordPress integration via IGNY8 WP Bridge plugin', max_length=255, null=True),
        ),
    ]
@@ -1,17 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-01 00:05

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0002_add_wp_api_key_to_site'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='seedkeyword',
            options={'ordering': ['keyword'], 'verbose_name': 'Seed Keyword', 'verbose_name_plural': 'Global Keywords Database'},
        ),
    ]
18
backend/igny8_core/auth/migrations/0003_alter_user_role.py
Normal file
@@ -0,0 +1,18 @@
# Generated by Django 5.2.7 on 2025-11-03 13:22

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0002_add_developer_role'),
    ]

    operations = [
        migrations.AlterField(
            model_name='user',
            name='role',
            field=models.CharField(choices=[('developer', 'Developer / Super Admin'), ('owner', 'Owner'), ('admin', 'Admin'), ('editor', 'Editor'), ('viewer', 'Viewer'), ('system_bot', 'System Bot')], default='viewer', max_length=20),
        ),
    ]
@@ -0,0 +1,75 @@
# Generated migration for Industry and IndustrySector models

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0003_alter_user_role'),
    ]

    operations = [
        migrations.CreateModel(
            name='Industry',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255, unique=True)),
                ('slug', models.SlugField(db_index=True, max_length=255, unique=True)),
                ('description', models.TextField(blank=True, null=True)),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={
                'db_table': 'igny8_industries',
                'ordering': ['name'],
            },
        ),
        migrations.CreateModel(
            name='IndustrySector',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255)),
                ('slug', models.SlugField(db_index=True, max_length=255)),
                ('description', models.TextField(blank=True, null=True)),
                ('suggested_keywords', models.JSONField(default=list, help_text='List of suggested keywords for this sector template')),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('industry', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='sectors', to='igny8_core_auth.industry')),
            ],
            options={
                'db_table': 'igny8_industry_sectors',
                'ordering': ['industry', 'name'],
                'unique_together': {('industry', 'slug')},
            },
        ),
        migrations.AddField(
            model_name='sector',
            name='industry_sector',
            field=models.ForeignKey(blank=True, help_text='Reference to the industry sector template', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='site_sectors', to='igny8_core_auth.industrysector'),
        ),
        migrations.AddIndex(
            model_name='industry',
            index=models.Index(fields=['slug'], name='igny8_indu_slug_idx'),
        ),
        migrations.AddIndex(
            model_name='industry',
            index=models.Index(fields=['is_active'], name='igny8_indu_is_acti_idx'),
        ),
        migrations.AddIndex(
            model_name='industrysector',
            index=models.Index(fields=['industry', 'is_active'], name='igny8_indu_industr_idx'),
        ),
        migrations.AddIndex(
            model_name='industrysector',
            index=models.Index(fields=['slug'], name='igny8_indu_slug_1_idx'),
        ),
        migrations.AddIndex(
            model_name='sector',
            index=models.Index(fields=['industry_sector'], name='igny8_sect_industr_idx'),
        ),
    ]
@@ -1,53 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-04 23:35

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0003_add_sync_event_model'),
    ]

    operations = [
        migrations.AddField(
            model_name='account',
            name='billing_address_line1',
            field=models.CharField(blank=True, help_text='Street address', max_length=255),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_address_line2',
            field=models.CharField(blank=True, help_text='Apt, suite, etc.', max_length=255),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_city',
            field=models.CharField(blank=True, max_length=100),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_country',
            field=models.CharField(blank=True, help_text='ISO 2-letter country code', max_length=2),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_email',
            field=models.EmailField(blank=True, help_text='Email for billing notifications', max_length=254, null=True),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_postal_code',
            field=models.CharField(blank=True, max_length=20),
        ),
        migrations.AddField(
            model_name='account',
            name='billing_state',
            field=models.CharField(blank=True, help_text='State/Province/Region', max_length=100),
        ),
        migrations.AddField(
            model_name='account',
            name='tax_id',
            field=models.CharField(blank=True, help_text='VAT/Tax ID number', max_length=100),
        ),
    ]
@@ -1,23 +0,0 @@
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):
    dependencies = [
        ('igny8_core_auth', '0004_add_invoice_payment_models'),
    ]

    operations = [
        migrations.AlterField(
            model_name='account',
            name='owner',
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.SET_NULL,
                related_name='owned_accounts',
                to='igny8_core_auth.user',
            ),
        ),
    ]
@@ -0,0 +1,31 @@
# Migration to add industry field to Site model

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0004_add_industry_models'),
    ]

    operations = [
        migrations.AddField(
            model_name='site',
            name='industry',
            field=models.ForeignKey(
                blank=True,
                help_text='Industry this site belongs to',
                null=True,
                on_delete=django.db.models.deletion.PROTECT,
                related_name='sites',
                to='igny8_core_auth.industry'
            ),
        ),
        migrations.AddIndex(
            model_name='site',
            index=models.Index(fields=['industry'], name='igny8_site_industr_idx'),
        ),
    ]
@@ -1,93 +0,0 @@
from django.db import migrations, models
import django.db.models.deletion
from django.core.validators import MinValueValidator, MaxValueValidator


class Migration(migrations.Migration):
    dependencies = [
        ('igny8_core_auth', '0005_account_owner_nullable'),
    ]

    operations = [
        migrations.AddField(
            model_name='account',
            name='delete_reason',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='account',
            name='deleted_at',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
        migrations.AddField(
            model_name='account',
            name='deleted_by',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='igny8_core_auth.user'),
        ),
        migrations.AddField(
            model_name='account',
            name='deletion_retention_days',
            field=models.PositiveIntegerField(default=14, help_text='Retention window (days) before soft-deleted items are purged', validators=[MinValueValidator(1), MaxValueValidator(365)]),
        ),
        migrations.AddField(
            model_name='account',
            name='is_deleted',
            field=models.BooleanField(db_index=True, default=False),
        ),
        migrations.AddField(
            model_name='account',
            name='restore_until',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
        migrations.AddField(
            model_name='sector',
            name='delete_reason',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='sector',
            name='deleted_at',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
        migrations.AddField(
            model_name='sector',
            name='deleted_by',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='igny8_core_auth.user'),
        ),
        migrations.AddField(
            model_name='sector',
            name='is_deleted',
            field=models.BooleanField(db_index=True, default=False),
        ),
        migrations.AddField(
            model_name='sector',
            name='restore_until',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
        migrations.AddField(
            model_name='site',
            name='delete_reason',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='site',
            name='deleted_at',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
        migrations.AddField(
            model_name='site',
            name='deleted_by',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='igny8_core_auth.user'),
        ),
        migrations.AddField(
            model_name='site',
            name='is_deleted',
            field=models.BooleanField(db_index=True, default=False),
        ),
        migrations.AddField(
            model_name='site',
            name='restore_until',
            field=models.DateTimeField(blank=True, db_index=True, null=True),
        ),
    ]
@@ -1,105 +0,0 @@
# Generated manually based on FINAL-IMPLEMENTATION-REQUIREMENTS.md
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0006_soft_delete_and_retention'),
    ]

    operations = [
        # Add payment_method to Account
        migrations.AddField(
            model_name='account',
            name='payment_method',
            field=models.CharField(
                max_length=30,
                choices=[
                    ('stripe', 'Stripe'),
                    ('paypal', 'PayPal'),
                    ('bank_transfer', 'Bank Transfer'),
                ],
                default='stripe',
                help_text='Payment method used for this account'
            ),
        ),
        # Add payment_method to Subscription
        migrations.AddField(
            model_name='subscription',
            name='payment_method',
            field=models.CharField(
                max_length=30,
                choices=[
                    ('stripe', 'Stripe'),
                    ('paypal', 'PayPal'),
                    ('bank_transfer', 'Bank Transfer'),
                ],
                default='stripe',
                help_text='Payment method for this subscription'
            ),
        ),
        # Add external_payment_id to Subscription
        migrations.AddField(
            model_name='subscription',
            name='external_payment_id',
            field=models.CharField(
                max_length=255,
                blank=True,
                null=True,
                help_text='External payment reference (bank transfer ref, PayPal transaction ID)'
            ),
        ),
        # Make stripe_subscription_id nullable
        migrations.AlterField(
            model_name='subscription',
            name='stripe_subscription_id',
            field=models.CharField(
                max_length=255,
                blank=True,
                null=True,
                db_index=True,
                help_text='Stripe subscription ID (when using Stripe)'
            ),
        ),
        # Add pending_payment status to Account
        migrations.AlterField(
            model_name='account',
            name='status',
            field=models.CharField(
                max_length=20,
                choices=[
                    ('active', 'Active'),
                    ('suspended', 'Suspended'),
                    ('trial', 'Trial'),
                    ('cancelled', 'Cancelled'),
                    ('pending_payment', 'Pending Payment'),
                ],
                default='trial'
            ),
        ),
        # Add pending_payment status to Subscription
        migrations.AlterField(
            model_name='subscription',
            name='status',
            field=models.CharField(
                max_length=20,
                choices=[
                    ('active', 'Active'),
                    ('past_due', 'Past Due'),
                    ('canceled', 'Canceled'),
                    ('trialing', 'Trialing'),
                    ('pending_payment', 'Pending Payment'),
                ]
            ),
        ),
        # Add index on payment_method
        migrations.AddIndex(
            model_name='account',
            index=models.Index(fields=['payment_method'], name='auth_acc_payment_idx'),
        ),
        migrations.AddIndex(
            model_name='subscription',
            index=models.Index(fields=['payment_method'], name='auth_sub_payment_idx'),
        ),
    ]
151 backend/igny8_core/auth/migrations/0007_expand_plan_limits.py Normal file
@@ -0,0 +1,151 @@
"""Add extended plan configuration fields"""
from decimal import Decimal

from django.core.validators import MinValueValidator
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0006_add_industry_to_site'),
    ]

    operations = [
        migrations.AddField(
            model_name='plan',
            name='ai_cost_per_request',
            field=models.JSONField(default=dict, help_text="Cost per request type (e.g., {'cluster': 2, 'idea': 3, 'content': 5, 'image': 1})"),
        ),
        migrations.AddField(
            model_name='plan',
            name='allow_credit_topup',
            field=models.BooleanField(default=True, help_text='Can user purchase more credits?'),
        ),
        migrations.AddField(
            model_name='plan',
            name='billing_cycle',
            field=models.CharField(choices=[('monthly', 'Monthly'), ('annual', 'Annual')], default='monthly', max_length=20),
        ),
        migrations.AddField(
            model_name='plan',
            name='daily_ai_request_limit',
            field=models.IntegerField(default=100, help_text='Global daily AI request cap', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='daily_ai_requests',
            field=models.IntegerField(default=50, help_text='Total AI executions (content + idea + image) allowed per day', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='daily_cluster_limit',
            field=models.IntegerField(default=10, help_text='Max clusters that can be created per day', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='daily_content_tasks',
            field=models.IntegerField(default=10, help_text='Max number of content tasks (blogs) per day', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='daily_keyword_import_limit',
            field=models.IntegerField(default=100, help_text='SeedKeywords import limit per day', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='extra_credit_price',
            field=models.DecimalField(decimal_places=2, default=Decimal('0.01'), help_text='Price per additional credit', max_digits=10),
        ),
        migrations.AddField(
            model_name='plan',
            name='image_model_choices',
            field=models.JSONField(default=list, help_text="Allowed image models (e.g., ['dalle3', 'hidream'])"),
        ),
        migrations.AddField(
            model_name='plan',
            name='included_credits',
            field=models.IntegerField(default=0, help_text='Monthly credits included', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_author_profiles',
            field=models.IntegerField(default=5, help_text='Limit for saved writing styles', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_clusters',
            field=models.IntegerField(default=100, help_text='Total clusters allowed (global)', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_images_per_task',
            field=models.IntegerField(default=4, help_text='Max images per content task', validators=[MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_industries',
            field=models.IntegerField(blank=True, default=None, help_text='Optional limit for industries/sectors', null=True, validators=[MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_keywords',
            field=models.IntegerField(default=1000, help_text='Total keywords allowed (global limit)', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_users',
            field=models.IntegerField(default=1, help_text='Total users allowed per account', validators=[MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_ai_credit_limit',
            field=models.IntegerField(default=500, help_text='Unified credit ceiling per month (all AI functions)', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_cluster_ai_credits',
            field=models.IntegerField(default=50, help_text='AI credits allocated for clustering', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_content_ai_credits',
            field=models.IntegerField(default=200, help_text='AI credit pool for content generation', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_image_ai_credits',
            field=models.IntegerField(default=100, help_text='AI credit pool for image generation', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_image_count',
            field=models.IntegerField(default=100, help_text='Max images per month', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='monthly_word_count_limit',
            field=models.IntegerField(default=50000, help_text='Monthly word limit (for generated content)', validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='auto_credit_topup_threshold',
            field=models.IntegerField(blank=True, default=None, help_text='Auto top-up trigger point (optional)', null=True, validators=[MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='auto_credit_topup_amount',
            field=models.IntegerField(blank=True, default=None, help_text='How many credits to auto-buy', null=True, validators=[MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='stripe_product_id',
            field=models.CharField(blank=True, help_text='For Stripe plan sync', max_length=255, null=True),
        ),
        migrations.AlterField(
            model_name='plan',
            name='features',
            field=models.JSONField(default=list, help_text="Plan features as JSON array (e.g., ['ai_writer', 'image_gen', 'auto_publish'])"),
        ),
    ]
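The `auto_credit_topup_threshold` and `auto_credit_topup_amount` fields in the migration above are both nullable, so NULL is the natural "top-up disabled" signal. As a hedged sketch of how billing code might consume the pair (hypothetical helper, not part of this changeset):

```python
from typing import Optional


def credits_to_topup(balance: int, threshold: Optional[int], amount: Optional[int]) -> int:
    """Return how many credits to auto-purchase for an account.

    Mirrors the nullable schema: if either auto_credit_topup_threshold or
    auto_credit_topup_amount is NULL, auto top-up is treated as disabled.
    """
    if threshold is None or amount is None:
        return 0  # top-up not configured on this plan
    # Trigger only once the balance falls to or below the threshold.
    return amount if balance <= threshold else 0
```

The MinValueValidator(1) on the amount field guarantees any configured top-up purchases at least one credit.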
@@ -1,26 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-08 13:01

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0007_add_payment_method_fields'),
    ]

    operations = [
        migrations.RemoveIndex(
            model_name='account',
            name='auth_acc_payment_idx',
        ),
        migrations.RemoveIndex(
            model_name='subscription',
            name='auth_sub_payment_idx',
        ),
        migrations.AddField(
            model_name='plan',
            name='is_internal',
            field=models.BooleanField(default=False, help_text='Internal-only plan (Free/Internal) - hidden from public plan listings'),
        ),
    ]
@@ -0,0 +1,108 @@
# Generated by Django 5.2.8 on 2025-11-07 10:06

import django.core.validators
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0007_expand_plan_limits'),
    ]

    operations = [
        migrations.CreateModel(
            name='PasswordResetToken',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('token', models.CharField(db_index=True, max_length=255, unique=True)),
                ('expires_at', models.DateTimeField()),
                ('used', models.BooleanField(default=False)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
            ],
            options={
                'db_table': 'igny8_password_reset_tokens',
                'ordering': ['-created_at'],
            },
        ),
        migrations.AlterModelOptions(
            name='industry',
            options={'ordering': ['name'], 'verbose_name': 'Industry', 'verbose_name_plural': 'Industries'},
        ),
        migrations.AlterModelOptions(
            name='industrysector',
            options={'ordering': ['industry', 'name'], 'verbose_name': 'Industry Sector', 'verbose_name_plural': 'Industry Sectors'},
        ),
        migrations.AlterModelOptions(
            name='site',
            options={'ordering': ['-created_at']},
        ),
        migrations.AlterModelOptions(
            name='siteuseraccess',
            options={'verbose_name': 'Site User Access', 'verbose_name_plural': 'Site User Access'},
        ),
        migrations.RenameIndex(
            model_name='industry',
            new_name='igny8_indus_slug_2f8769_idx',
            old_name='igny8_indu_slug_idx',
        ),
        migrations.RenameIndex(
            model_name='industry',
            new_name='igny8_indus_is_acti_146d41_idx',
            old_name='igny8_indu_is_acti_idx',
        ),
        migrations.RenameIndex(
            model_name='industrysector',
            new_name='igny8_indus_industr_00b524_idx',
            old_name='igny8_indu_industr_idx',
        ),
        migrations.RenameIndex(
            model_name='industrysector',
            new_name='igny8_indus_slug_101d63_idx',
            old_name='igny8_indu_slug_1_idx',
        ),
        migrations.RenameIndex(
            model_name='sector',
            new_name='igny8_secto_industr_1cf990_idx',
            old_name='igny8_sect_industr_idx',
        ),
        migrations.RenameIndex(
            model_name='site',
            new_name='igny8_sites_industr_66e004_idx',
            old_name='igny8_site_industr_idx',
        ),
        migrations.AlterField(
            model_name='plan',
            name='credits_per_month',
            field=models.IntegerField(default=0, help_text='DEPRECATED: Use included_credits instead', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AlterField(
            model_name='plan',
            name='extra_credit_price',
            field=models.DecimalField(decimal_places=2, default=0.01, help_text='Price per additional credit', max_digits=10),
        ),
        migrations.AlterField(
            model_name='plan',
            name='stripe_price_id',
            field=models.CharField(blank=True, help_text='Monthly price ID for Stripe', max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='passwordresettoken',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='password_reset_tokens', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddIndex(
            model_name='passwordresettoken',
            index=models.Index(fields=['token'], name='igny8_passw_token_0eaf0c_idx'),
        ),
        migrations.AddIndex(
            model_name='passwordresettoken',
            index=models.Index(fields=['user', 'used'], name='igny8_passw_user_id_320c02_idx'),
        ),
        migrations.AddIndex(
            model_name='passwordresettoken',
            index=models.Index(fields=['expires_at'], name='igny8_passw_expires_c9aa03_idx'),
        ),
    ]
@@ -1,36 +0,0 @@
# Generated manually

from django.db import migrations, models
import django.core.validators


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0008_add_plan_is_internal'),
    ]

    operations = [
        migrations.AddField(
            model_name='plan',
            name='annual_discount_percent',
            field=models.DecimalField(
                decimal_places=2,
                default=15.0,
                help_text='Annual subscription discount percentage (default 15%)',
                max_digits=5,
                validators=[
                    django.core.validators.MinValueValidator(0),
                    django.core.validators.MaxValueValidator(100)
                ]
            ),
        ),
        migrations.AddField(
            model_name='plan',
            name='is_featured',
            field=models.BooleanField(
                default=False,
                help_text='Highlight this plan as popular/recommended'
            ),
        ),
    ]
@@ -0,0 +1,88 @@
from django.db import migrations


def forward_fix_admin_log_fk(apps, schema_editor):
    if schema_editor.connection.vendor != "postgresql":
        return
    schema_editor.execute(
        """
        ALTER TABLE django_admin_log
        DROP CONSTRAINT IF EXISTS django_admin_log_user_id_c564eba6_fk_auth_user_id;
        """
    )
    schema_editor.execute(
        """
        UPDATE django_admin_log
        SET user_id = sub.new_user_id
        FROM (
            SELECT id AS new_user_id
            FROM igny8_users
            ORDER BY id
            LIMIT 1
        ) AS sub
        WHERE django_admin_log.user_id NOT IN (
            SELECT id FROM igny8_users
        );
        """
    )
    schema_editor.execute(
        """
        DO $$
        BEGIN
            IF NOT EXISTS (
                SELECT 1 FROM pg_constraint
                WHERE conname = 'django_admin_log_user_id_c564eba6_fk_igny8_users_id'
            ) THEN
                ALTER TABLE django_admin_log
                ADD CONSTRAINT django_admin_log_user_id_c564eba6_fk_igny8_users_id
                FOREIGN KEY (user_id) REFERENCES igny8_users(id) DEFERRABLE INITIALLY DEFERRED;
            END IF;
        END $$;
        """
    )


def reverse_fix_admin_log_fk(apps, schema_editor):
    if schema_editor.connection.vendor != "postgresql":
        return
    schema_editor.execute(
        """
        ALTER TABLE django_admin_log
        DROP CONSTRAINT IF EXISTS django_admin_log_user_id_c564eba6_fk_igny8_users_id;
        """
    )
    schema_editor.execute(
        """
        UPDATE django_admin_log
        SET user_id = sub.old_user_id
        FROM (
            SELECT id AS old_user_id
            FROM auth_user
            ORDER BY id
            LIMIT 1
        ) AS sub
        WHERE django_admin_log.user_id NOT IN (
            SELECT id FROM auth_user
        );
        """
    )
    schema_editor.execute(
        """
        ALTER TABLE django_admin_log
        ADD CONSTRAINT django_admin_log_user_id_c564eba6_fk_auth_user_id
        FOREIGN KEY (user_id) REFERENCES auth_user(id) DEFERRABLE INITIALLY DEFERRED;
        """
    )


class Migration(migrations.Migration):

    dependencies = [
        ("igny8_core_auth", "0008_passwordresettoken_alter_industry_options_and_more"),
    ]

    operations = [
        migrations.RunPython(forward_fix_admin_log_fk, reverse_fix_admin_log_fk),
    ]
38 backend/igny8_core/auth/migrations/0010_add_seed_keyword.py Normal file
@@ -0,0 +1,38 @@
# Generated by Django 5.2.8 on 2025-11-07 11:34

import django.core.validators
import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0009_fix_admin_log_user_fk'),
    ]

    operations = [
        migrations.CreateModel(
            name='SeedKeyword',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('keyword', models.CharField(db_index=True, max_length=255)),
                ('volume', models.IntegerField(default=0, help_text='Search volume estimate')),
                ('difficulty', models.IntegerField(default=0, help_text='Keyword difficulty (0-100)', validators=[django.core.validators.MinValueValidator(0), django.core.validators.MaxValueValidator(100)])),
                ('intent', models.CharField(choices=[('informational', 'Informational'), ('navigational', 'Navigational'), ('commercial', 'Commercial'), ('transactional', 'Transactional')], default='informational', max_length=50)),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('industry', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='seed_keywords', to='igny8_core_auth.industry')),
                ('sector', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='seed_keywords', to='igny8_core_auth.industrysector')),
            ],
            options={
                'verbose_name': 'Seed Keyword',
                'verbose_name_plural': 'Seed Keywords',
                'db_table': 'igny8_seed_keywords',
                'ordering': ['keyword'],
                'indexes': [models.Index(fields=['keyword'], name='igny8_seed__keyword_efa089_idx'), models.Index(fields=['industry', 'sector'], name='igny8_seed__industr_c41841_idx'), models.Index(fields=['industry', 'sector', 'is_active'], name='igny8_seed__industr_da0030_idx'), models.Index(fields=['intent'], name='igny8_seed__intent_15020d_idx')],
                'unique_together': {('keyword', 'industry', 'sector')},
            },
        ),
    ]
@@ -1,25 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-08 22:42

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0009_add_plan_annual_discount_and_featured'),
    ]

    operations = [
        migrations.AddField(
            model_name='subscription',
            name='plan',
            field=models.ForeignKey(blank=True, help_text='Subscription plan (tracks historical plan even if account changes plan)', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='subscriptions', to='igny8_core_auth.plan'),
        ),
        migrations.AlterField(
            model_name='site',
            name='industry',
            field=models.ForeignKey(default=21, help_text='Industry this site belongs to (required for sector creation)', on_delete=django.db.models.deletion.PROTECT, related_name='sites', to='igny8_core_auth.industry'),
            preserve_default=False,
        ),
    ]
@@ -0,0 +1,29 @@
# Generated by Django 5.2.7 on 2025-11-07 11:45

import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0010_add_seed_keyword'),
    ]

    operations = [
        migrations.AddField(
            model_name='plan',
            name='daily_image_generation_limit',
            field=models.IntegerField(default=25, help_text='Max images that can be generated per day', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_content_ideas',
            field=models.IntegerField(default=300, help_text='Total content ideas allowed (global limit)', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AlterField(
            model_name='plan',
            name='max_sites',
            field=models.IntegerField(default=1, help_text='Maximum number of sites allowed', validators=[django.core.validators.MinValueValidator(1)]),
        ),
    ]
@@ -1,17 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-08 22:52

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0010_add_subscription_plan_and_require_site_industry'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='subscription',
            name='payment_method',
        ),
    ]
@@ -0,0 +1,28 @@
# Generated by Django 5.2.7 on 2025-11-07 11:56

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0011_add_plan_fields_and_fix_constraints'),
    ]

    operations = [
        migrations.AlterField(
            model_name='plan',
            name='ai_cost_per_request',
            field=models.JSONField(blank=True, default=dict, help_text="Cost per request type (e.g., {'cluster': 2, 'idea': 3, 'content': 5, 'image': 1})"),
        ),
        migrations.AlterField(
            model_name='plan',
            name='features',
            field=models.JSONField(blank=True, default=list, help_text="Plan features as JSON array (e.g., ['ai_writer', 'image_gen', 'auto_publish'])"),
        ),
        migrations.AlterField(
            model_name='plan',
            name='image_model_choices',
            field=models.JSONField(blank=True, default=list, help_text="Allowed image models (e.g., ['dalle3', 'hidream'])"),
        ),
    ]
@@ -1,47 +0,0 @@
# Generated migration to fix subscription constraints

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0011_remove_subscription_payment_method'),
    ]

    operations = [
        # Add unique constraint on tenant_id at database level
        migrations.RunSQL(
            sql="""
            CREATE UNIQUE INDEX IF NOT EXISTS igny8_subscriptions_tenant_id_unique
            ON igny8_subscriptions(tenant_id);
            """,
            reverse_sql="""
            DROP INDEX IF EXISTS igny8_subscriptions_tenant_id_unique;
            """
        ),

        # Make plan field required (non-nullable)
        # First set default plan (ID 1 - Free Plan) for any null values
        migrations.RunSQL(
            sql="""
            UPDATE igny8_subscriptions
            SET plan_id = 1
            WHERE plan_id IS NULL;
            """,
            reverse_sql=migrations.RunSQL.noop
        ),

        # Now alter the field to be non-nullable
        migrations.AlterField(
            model_name='subscription',
            name='plan',
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.PROTECT,
                related_name='subscriptions',
                to='igny8_core_auth.plan',
                help_text='Subscription plan (tracks historical plan even if account changes plan)'
            ),
        ),
    ]
@@ -1,49 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-12 11:26

import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0012_fix_subscription_constraints'),
    ]

    operations = [
        migrations.AddField(
            model_name='plan',
            name='max_clusters',
            field=models.IntegerField(default=100, help_text='Maximum AI keyword clusters allowed (hard limit)', validators=[django.core.validators.MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_content_ideas',
            field=models.IntegerField(default=300, help_text='Maximum AI content ideas per month', validators=[django.core.validators.MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_content_words',
            field=models.IntegerField(default=100000, help_text='Maximum content words per month (e.g., 100000 = 100K words)', validators=[django.core.validators.MinValueValidator(1)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_image_prompts',
            field=models.IntegerField(default=300, help_text='Maximum image prompts per month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_images_basic',
            field=models.IntegerField(default=300, help_text='Maximum basic AI images per month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_images_premium',
            field=models.IntegerField(default=60, help_text='Maximum premium AI images per month (DALL-E)', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='plan',
            name='max_keywords',
            field=models.IntegerField(default=1000, help_text='Maximum total keywords allowed (hard limit)', validators=[django.core.validators.MinValueValidator(1)]),
        ),
    ]
@@ -0,0 +1,17 @@
# Generated by Django 5.2.7 on 2025-11-07 12:01

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0012_allow_blank_json_fields'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='plan',
            name='ai_cost_per_request',
        ),
    ]
@@ -1,49 +0,0 @@
# Generated by Django 5.2.8 on 2025-12-12 12:24

import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0013_plan_max_clusters_plan_max_content_ideas_and_more'),
    ]

    operations = [
        migrations.AddField(
            model_name='account',
            name='usage_content_ideas',
            field=models.IntegerField(default=0, help_text='Content ideas generated this month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_content_words',
            field=models.IntegerField(default=0, help_text='Content words generated this month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_image_prompts',
            field=models.IntegerField(default=0, help_text='Image prompts this month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_images_basic',
            field=models.IntegerField(default=0, help_text='Basic AI images this month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_images_premium',
            field=models.IntegerField(default=0, help_text='Premium AI images this month', validators=[django.core.validators.MinValueValidator(0)]),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_period_end',
            field=models.DateTimeField(blank=True, help_text='Current billing period end', null=True),
        ),
        migrations.AddField(
            model_name='account',
            name='usage_period_start',
            field=models.DateTimeField(blank=True, help_text='Current billing period start', null=True),
        ),
    ]
@@ -1,24 +0,0 @@
# Generated manually

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0014_add_usage_tracking_to_account'),
    ]

    operations = [
        migrations.AddField(
            model_name='plan',
            name='original_price',
            field=models.DecimalField(
                blank=True,
                decimal_places=2,
                help_text='Original price (before discount) - shows as crossed out price. Leave empty if no discount.',
                max_digits=10,
                null=True
            ),
        ),
    ]
@@ -1,19 +0,0 @@
# Generated by Django 5.2.9 on 2025-12-13 20:31

import django.core.validators
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0015_add_plan_original_price'),
    ]

    operations = [
        migrations.AlterField(
            model_name='plan',
            name='annual_discount_percent',
            field=models.IntegerField(default=15, help_text='Annual subscription discount percentage (default 15%)', validators=[django.core.validators.MinValueValidator(0), django.core.validators.MaxValueValidator(100)]),
        ),
    ]
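The `annual_discount_percent` field is stored as a whole-number percentage (0-100). A minimal sketch of how a billing layer might apply it when quoting an annual price (hypothetical helper and pricing logic, not part of this changeset):

```python
from decimal import Decimal, ROUND_HALF_UP


def annual_price(monthly_price: Decimal, annual_discount_percent: int) -> Decimal:
    """Twelve months at the monthly rate, reduced by the plan's integer
    discount percent, rounded to cents with banker's-shop rounding."""
    gross = monthly_price * 12
    net = gross * (Decimal(100) - Decimal(annual_discount_percent)) / Decimal(100)
    return net.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
```

Using Decimal end to end avoids the float rounding drift that the earlier DecimalField-based schema was presumably meant to prevent.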
@@ -1,66 +0,0 @@
# Generated by Django 5.2.9 on 2025-12-15 01:28

import django.core.validators
import django.db.models.deletion
import simple_history.models
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0016_alter_plan_annual_discount_percent'),
    ]

    operations = [
        migrations.CreateModel(
            name='HistoricalAccount',
            fields=[
                ('id', models.BigIntegerField(auto_created=True, blank=True, db_index=True, verbose_name='ID')),
                ('is_deleted', models.BooleanField(db_index=True, default=False)),
                ('deleted_at', models.DateTimeField(blank=True, db_index=True, null=True)),
                ('restore_until', models.DateTimeField(blank=True, db_index=True, null=True)),
                ('delete_reason', models.CharField(blank=True, max_length=255, null=True)),
                ('name', models.CharField(max_length=255)),
                ('slug', models.SlugField(max_length=255)),
                ('stripe_customer_id', models.CharField(blank=True, max_length=255, null=True)),
                ('credits', models.IntegerField(default=0, validators=[django.core.validators.MinValueValidator(0)])),
                ('status', models.CharField(choices=[('active', 'Active'), ('suspended', 'Suspended'), ('trial', 'Trial'), ('cancelled', 'Cancelled'), ('pending_payment', 'Pending Payment')], default='trial', max_length=20)),
                ('payment_method', models.CharField(choices=[('stripe', 'Stripe'), ('paypal', 'PayPal'), ('bank_transfer', 'Bank Transfer')], default='stripe', help_text='Payment method used for this account', max_length=30)),
                ('deletion_retention_days', models.PositiveIntegerField(default=14, help_text='Retention window (days) before soft-deleted items are purged', validators=[django.core.validators.MinValueValidator(1), django.core.validators.MaxValueValidator(365)])),
                ('billing_email', models.EmailField(blank=True, help_text='Email for billing notifications', max_length=254, null=True)),
                ('billing_address_line1', models.CharField(blank=True, help_text='Street address', max_length=255)),
                ('billing_address_line2', models.CharField(blank=True, help_text='Apt, suite, etc.', max_length=255)),
                ('billing_city', models.CharField(blank=True, max_length=100)),
                ('billing_state', models.CharField(blank=True, help_text='State/Province/Region', max_length=100)),
                ('billing_postal_code', models.CharField(blank=True, max_length=20)),
                ('billing_country', models.CharField(blank=True, help_text='ISO 2-letter country code', max_length=2)),
                ('tax_id', models.CharField(blank=True, help_text='VAT/Tax ID number', max_length=100)),
                ('usage_content_ideas', models.IntegerField(default=0, help_text='Content ideas generated this month', validators=[django.core.validators.MinValueValidator(0)])),
                ('usage_content_words', models.IntegerField(default=0, help_text='Content words generated this month', validators=[django.core.validators.MinValueValidator(0)])),
                ('usage_images_basic', models.IntegerField(default=0, help_text='Basic AI images this month', validators=[django.core.validators.MinValueValidator(0)])),
                ('usage_images_premium', models.IntegerField(default=0, help_text='Premium AI images this month', validators=[django.core.validators.MinValueValidator(0)])),
                ('usage_image_prompts', models.IntegerField(default=0, help_text='Image prompts this month', validators=[django.core.validators.MinValueValidator(0)])),
                ('usage_period_start', models.DateTimeField(blank=True, help_text='Current billing period start', null=True)),
                ('usage_period_end', models.DateTimeField(blank=True, help_text='Current billing period end', null=True)),
                ('created_at', models.DateTimeField(blank=True, editable=False)),
                ('updated_at', models.DateTimeField(blank=True, editable=False)),
                ('history_id', models.AutoField(primary_key=True, serialize=False)),
                ('history_date', models.DateTimeField(db_index=True)),
                ('history_change_reason', models.CharField(max_length=100, null=True)),
                ('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
                ('deleted_by', models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to=settings.AUTH_USER_MODEL)),
                ('history_user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL)),
                ('owner', models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to=settings.AUTH_USER_MODEL)),
                ('plan', models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='igny8_core_auth.plan')),
            ],
            options={
                'verbose_name': 'historical Account',
                'verbose_name_plural': 'historical Accounts',
                'ordering': ('-history_date', '-history_id'),
                'get_latest_by': ('history_date', 'history_id'),
            },
            bases=(simple_history.models.HistoricalChanges, models.Model),
        ),
    ]
@@ -1,30 +0,0 @@
# Generated by Django 5.2.9 on 2025-12-17 06:04

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('igny8_core_auth', '0017_add_history_tracking'),
    ]

    operations = [
        migrations.RemoveIndex(
            model_name='seedkeyword',
            name='igny8_seed__intent_15020d_idx',
        ),
        migrations.RemoveField(
            model_name='seedkeyword',
            name='intent',
        ),
        migrations.AddField(
            model_name='seedkeyword',
            name='country',
            field=models.CharField(choices=[('US', 'United States'), ('CA', 'Canada'), ('GB', 'United Kingdom'), ('AE', 'United Arab Emirates'), ('AU', 'Australia'), ('IN', 'India'), ('PK', 'Pakistan')], default='US', help_text='Target country for this keyword', max_length=2),
        ),
        migrations.AddIndex(
            model_name='seedkeyword',
            index=models.Index(fields=['country'], name='igny8_seed__country_4127a5_idx'),
        ),
    ]
@@ -5,8 +5,6 @@ from django.db import models
from django.contrib.auth.models import AbstractUser
from django.utils.translation import gettext_lazy as _
from django.core.validators import MinValueValidator, MaxValueValidator
from igny8_core.common.soft_delete import SoftDeletableModel, SoftDeleteManager
from simple_history.models import HistoricalRecords


class AccountBaseModel(models.Model):
@@ -54,7 +52,7 @@ class SiteSectorBaseModel(AccountBaseModel):
        super().save(*args, **kwargs)


class Account(SoftDeletableModel):
class Account(models.Model):
    """
    Account/Organization model for multi-account support.
    """
@@ -63,64 +61,17 @@ class Account(SoftDeletableModel):
        ('suspended', 'Suspended'),
        ('trial', 'Trial'),
        ('cancelled', 'Cancelled'),
        ('pending_payment', 'Pending Payment'),
    ]

    PAYMENT_METHOD_CHOICES = [
        ('stripe', 'Stripe'),
        ('paypal', 'PayPal'),
        ('bank_transfer', 'Bank Transfer'),
    ]

    name = models.CharField(max_length=255)
    slug = models.SlugField(unique=True, max_length=255)
    owner = models.ForeignKey(
        'igny8_core_auth.User',
        on_delete=models.SET_NULL,
        null=True,
        blank=True,
        related_name='owned_accounts',
    )
    owner = models.ForeignKey('igny8_core_auth.User', on_delete=models.PROTECT, related_name='owned_accounts')
    stripe_customer_id = models.CharField(max_length=255, blank=True, null=True)
    plan = models.ForeignKey('igny8_core_auth.Plan', on_delete=models.PROTECT, related_name='accounts')
    credits = models.IntegerField(default=0, validators=[MinValueValidator(0)])
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='trial')
    payment_method = models.CharField(
        max_length=30,
        choices=PAYMENT_METHOD_CHOICES,
        default='stripe',
        help_text='Payment method used for this account'
    )
    deletion_retention_days = models.PositiveIntegerField(
        default=14,
        validators=[MinValueValidator(1), MaxValueValidator(365)],
        help_text="Retention window (days) before soft-deleted items are purged",
    )

    # Billing information
    billing_email = models.EmailField(blank=True, null=True, help_text="Email for billing notifications")
    billing_address_line1 = models.CharField(max_length=255, blank=True, help_text="Street address")
    billing_address_line2 = models.CharField(max_length=255, blank=True, help_text="Apt, suite, etc.")
    billing_city = models.CharField(max_length=100, blank=True)
    billing_state = models.CharField(max_length=100, blank=True, help_text="State/Province/Region")
    billing_postal_code = models.CharField(max_length=20, blank=True)
    billing_country = models.CharField(max_length=2, blank=True, help_text="ISO 2-letter country code")
    tax_id = models.CharField(max_length=100, blank=True, help_text="VAT/Tax ID number")

    # Monthly usage tracking (reset on billing cycle)
    usage_content_ideas = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Content ideas generated this month")
    usage_content_words = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Content words generated this month")
    usage_images_basic = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Basic AI images this month")
    usage_images_premium = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Premium AI images this month")
    usage_image_prompts = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Image prompts this month")
    usage_period_start = models.DateTimeField(null=True, blank=True, help_text="Current billing period start")
    usage_period_end = models.DateTimeField(null=True, blank=True, help_text="Current billing period end")

    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    # History tracking
    history = HistoricalRecords()

    class Meta:
        db_table = 'igny8_tenants'
@@ -131,46 +82,19 @@ class Account(SoftDeletableModel):
            models.Index(fields=['status']),
        ]

    objects = SoftDeleteManager()
    all_objects = models.Manager()

    def __str__(self):
        return self.name

    @property
    def default_payment_method(self):
        """Get default payment method from AccountPaymentMethod table"""
        try:
            from igny8_core.business.billing.models import AccountPaymentMethod
            method = AccountPaymentMethod.objects.filter(
                account=self,
                is_default=True,
                is_enabled=True
            ).first()
            return method.type if method else self.payment_method
        except Exception:
            # Fallback to field if table doesn't exist or error
            return self.payment_method

    def is_system_account(self):
        """Check if this account is a system account with highest access level."""
        # System accounts bypass all filtering restrictions
        return self.slug in ['aws-admin', 'default-account', 'default']

    def soft_delete(self, user=None, reason=None, retention_days=None):
        if self.is_system_account():
            from django.core.exceptions import PermissionDenied
            raise PermissionDenied("System account cannot be deleted.")
        return super().soft_delete(user=user, reason=reason, retention_days=retention_days)

    def delete(self, using=None, keep_parents=False):
        return self.soft_delete()


class Plan(models.Model):
    """
    Subscription plan model - Phase 0: Credit-only system.
    Plans define credits, billing, and account management limits only.
    Subscription plan model with comprehensive limits and features.
    Plans define limits for users, sites, content generation, AI usage, and billing.
    """
    BILLING_CYCLE_CHOICES = [
        ('monthly', 'Monthly'),
@@ -181,26 +105,12 @@ class Plan(models.Model):
    name = models.CharField(max_length=255)
    slug = models.SlugField(unique=True, max_length=255)
    price = models.DecimalField(max_digits=10, decimal_places=2)
    original_price = models.DecimalField(
        max_digits=10,
        decimal_places=2,
        null=True,
        blank=True,
        help_text="Original price (before discount) - shows as crossed out price. Leave empty if no discount."
    )
    billing_cycle = models.CharField(max_length=20, choices=BILLING_CYCLE_CHOICES, default='monthly')
    annual_discount_percent = models.IntegerField(
        default=15,
        validators=[MinValueValidator(0), MaxValueValidator(100)],
        help_text="Annual subscription discount percentage (default 15%)"
    )
    is_featured = models.BooleanField(default=False, help_text="Highlight this plan as popular/recommended")
    features = models.JSONField(default=list, blank=True, help_text="Plan features as JSON array (e.g., ['ai_writer', 'image_gen', 'auto_publish'])")
    is_active = models.BooleanField(default=True)
    is_internal = models.BooleanField(default=False, help_text="Internal-only plan (Free/Internal) - hidden from public plan listings")
    created_at = models.DateTimeField(auto_now_add=True)

    # Account Management Limits (kept - not operation limits)
    # User / Site / Scope Limits
    max_users = models.IntegerField(default=1, validators=[MinValueValidator(1)], help_text="Total users allowed per account")
    max_sites = models.IntegerField(
        default=1,
@@ -210,46 +120,32 @@ class Plan(models.Model):
    max_industries = models.IntegerField(default=None, null=True, blank=True, validators=[MinValueValidator(1)], help_text="Optional limit for industries/sectors")
    max_author_profiles = models.IntegerField(default=5, validators=[MinValueValidator(0)], help_text="Limit for saved writing styles")

    # Hard Limits (Persistent - user manages within limit)
    max_keywords = models.IntegerField(
        default=1000,
        validators=[MinValueValidator(1)],
        help_text="Maximum total keywords allowed (hard limit)"
    )
    max_clusters = models.IntegerField(
        default=100,
        validators=[MinValueValidator(1)],
        help_text="Maximum AI keyword clusters allowed (hard limit)"
    )
    # Planner Limits
    max_keywords = models.IntegerField(default=1000, validators=[MinValueValidator(0)], help_text="Total keywords allowed (global limit)")
    max_clusters = models.IntegerField(default=100, validators=[MinValueValidator(0)], help_text="Total clusters allowed (global)")
    max_content_ideas = models.IntegerField(default=300, validators=[MinValueValidator(0)], help_text="Total content ideas allowed (global limit)")
    daily_cluster_limit = models.IntegerField(default=10, validators=[MinValueValidator(0)], help_text="Max clusters that can be created per day")
    daily_keyword_import_limit = models.IntegerField(default=100, validators=[MinValueValidator(0)], help_text="SeedKeywords import limit per day")
    monthly_cluster_ai_credits = models.IntegerField(default=50, validators=[MinValueValidator(0)], help_text="AI credits allocated for clustering")

    # Monthly Limits (Reset on billing cycle)
    max_content_ideas = models.IntegerField(
        default=300,
        validators=[MinValueValidator(1)],
        help_text="Maximum AI content ideas per month"
    )
    max_content_words = models.IntegerField(
        default=100000,
        validators=[MinValueValidator(1)],
        help_text="Maximum content words per month (e.g., 100000 = 100K words)"
    )
    max_images_basic = models.IntegerField(
        default=300,
        validators=[MinValueValidator(0)],
        help_text="Maximum basic AI images per month"
    )
    max_images_premium = models.IntegerField(
        default=60,
        validators=[MinValueValidator(0)],
        help_text="Maximum premium AI images per month (DALL-E)"
    )
    max_image_prompts = models.IntegerField(
        default=300,
        validators=[MinValueValidator(0)],
        help_text="Maximum image prompts per month"
    )
    # Writer Limits
    daily_content_tasks = models.IntegerField(default=10, validators=[MinValueValidator(0)], help_text="Max number of content tasks (blogs) per day")
    daily_ai_requests = models.IntegerField(default=50, validators=[MinValueValidator(0)], help_text="Total AI executions (content + idea + image) allowed per day")
    monthly_word_count_limit = models.IntegerField(default=50000, validators=[MinValueValidator(0)], help_text="Monthly word limit (for generated content)")
    monthly_content_ai_credits = models.IntegerField(default=200, validators=[MinValueValidator(0)], help_text="AI credit pool for content generation")

    # Billing & Credits (Phase 0: Credit-only system)
    # Image Generation Limits
    monthly_image_count = models.IntegerField(default=100, validators=[MinValueValidator(0)], help_text="Max images per month")
    daily_image_generation_limit = models.IntegerField(default=25, validators=[MinValueValidator(0)], help_text="Max images that can be generated per day")
    monthly_image_ai_credits = models.IntegerField(default=100, validators=[MinValueValidator(0)], help_text="AI credit pool for image generation")
    max_images_per_task = models.IntegerField(default=4, validators=[MinValueValidator(1)], help_text="Max images per content task")
    image_model_choices = models.JSONField(default=list, blank=True, help_text="Allowed image models (e.g., ['dalle3', 'hidream'])")

    # AI Request Controls
    daily_ai_request_limit = models.IntegerField(default=100, validators=[MinValueValidator(0)], help_text="Global daily AI request cap")
    monthly_ai_credit_limit = models.IntegerField(default=500, validators=[MinValueValidator(0)], help_text="Unified credit ceiling per month (all AI functions)")

    # Billing & Add-ons
    included_credits = models.IntegerField(default=0, validators=[MinValueValidator(0)], help_text="Monthly credits included")
    extra_credit_price = models.DecimalField(max_digits=10, decimal_places=2, default=0.01, help_text="Price per additional credit")
    allow_credit_topup = models.BooleanField(default=True, help_text="Can user purchase more credits?")
@@ -285,56 +181,23 @@ class Plan(models.Model):

class Subscription(models.Model):
    """
    Account subscription model supporting multiple payment methods.
    Account subscription model linking to Stripe.
    """
    STATUS_CHOICES = [
        ('active', 'Active'),
        ('past_due', 'Past Due'),
        ('canceled', 'Canceled'),
        ('trialing', 'Trialing'),
        ('pending_payment', 'Pending Payment'),
    ]

    PAYMENT_METHOD_CHOICES = [
        ('stripe', 'Stripe'),
        ('paypal', 'PayPal'),
        ('bank_transfer', 'Bank Transfer'),
    ]

    account = models.OneToOneField('igny8_core_auth.Account', on_delete=models.CASCADE, related_name='subscription', db_column='tenant_id')
    plan = models.ForeignKey(
        'igny8_core_auth.Plan',
        on_delete=models.PROTECT,
        related_name='subscriptions',
        help_text='Subscription plan (tracks historical plan even if account changes plan)'
    )
    stripe_subscription_id = models.CharField(
        max_length=255,
        blank=True,
        null=True,
        db_index=True,
        help_text='Stripe subscription ID (when using Stripe)'
    )
    external_payment_id = models.CharField(
        max_length=255,
        blank=True,
        null=True,
        help_text='External payment reference (bank transfer ref, PayPal transaction ID)'
    )
    stripe_subscription_id = models.CharField(max_length=255, unique=True)
    status = models.CharField(max_length=20, choices=STATUS_CHOICES)
    current_period_start = models.DateTimeField()
    current_period_end = models.DateTimeField()
    cancel_at_period_end = models.BooleanField(default=False)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    @property
    def payment_method(self):
        """Get payment method from account's default payment method"""
        if hasattr(self.account, 'default_payment_method'):
            return self.account.default_payment_method
        # Fallback to account.payment_method field if property doesn't exist yet
        return getattr(self.account, 'payment_method', 'stripe')

    class Meta:
        db_table = 'igny8_subscriptions'
@@ -347,7 +210,7 @@ class Subscription(models.Model):



class Site(SoftDeletableModel, AccountBaseModel):
class Site(AccountBaseModel):
    """
    Site model - Each account can have multiple sites based on their plan.
    Each site belongs to ONE industry and can have 1-5 sectors from that industry.
@@ -366,60 +229,19 @@ class Site(SoftDeletableModel, AccountBaseModel):
        'igny8_core_auth.Industry',
        on_delete=models.PROTECT,
        related_name='sites',
        help_text="Industry this site belongs to (required for sector creation)"
        null=True,
        blank=True,
        help_text="Industry this site belongs to"
    )
    is_active = models.BooleanField(default=True, db_index=True)
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='active')
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    # WordPress integration fields (legacy - use SiteIntegration instead)
    wp_url = models.URLField(blank=True, null=True, help_text="WordPress site URL (legacy - use SiteIntegration)")
    # WordPress integration fields
    wp_url = models.URLField(blank=True, null=True, help_text="WordPress site URL")
    wp_username = models.CharField(max_length=255, blank=True, null=True)
    wp_app_password = models.CharField(max_length=255, blank=True, null=True)
    wp_api_key = models.CharField(max_length=255, blank=True, null=True, help_text="API key for WordPress integration via IGNY8 WP Bridge plugin")

    # Site type and hosting (Phase 6)
    SITE_TYPE_CHOICES = [
        ('marketing', 'Marketing Site'),
        ('ecommerce', 'Ecommerce Site'),
        ('blog', 'Blog'),
        ('portfolio', 'Portfolio'),
        ('corporate', 'Corporate'),
    ]

    HOSTING_TYPE_CHOICES = [
        ('igny8_sites', 'IGNY8 Sites'),
        ('wordpress', 'WordPress'),
        ('shopify', 'Shopify'),
        ('multi', 'Multi-Destination'),
    ]

    site_type = models.CharField(
        max_length=50,
        choices=SITE_TYPE_CHOICES,
        default='marketing',
        db_index=True,
        help_text="Type of site"
    )

    hosting_type = models.CharField(
        max_length=50,
        choices=HOSTING_TYPE_CHOICES,
        default='igny8_sites',
        db_index=True,
        help_text="Target hosting platform"
    )

    # SEO metadata (Phase 7)
    seo_metadata = models.JSONField(
        default=dict,
        blank=True,
        help_text="SEO metadata: meta tags, Open Graph, Schema.org"
    )

    objects = SoftDeleteManager()
    all_objects = models.Manager()

    class Meta:
        db_table = 'igny8_sites'
@@ -429,8 +251,6 @@ class Site(SoftDeletableModel, AccountBaseModel):
            models.Index(fields=['account', 'is_active']),
            models.Index(fields=['account', 'status']),
            models.Index(fields=['industry']),
            models.Index(fields=['site_type']),
            models.Index(fields=['hosting_type']),
        ]

    def __str__(self):
@@ -517,14 +337,11 @@ class SeedKeyword(models.Model):
    These are canonical keywords that can be imported into account-specific Keywords.
    Non-deletable global reference data.
    """
    COUNTRY_CHOICES = [
        ('US', 'United States'),
        ('CA', 'Canada'),
        ('GB', 'United Kingdom'),
        ('AE', 'United Arab Emirates'),
        ('AU', 'Australia'),
        ('IN', 'India'),
        ('PK', 'Pakistan'),
    INTENT_CHOICES = [
        ('informational', 'Informational'),
        ('navigational', 'Navigational'),
        ('commercial', 'Commercial'),
        ('transactional', 'Transactional'),
    ]

    keyword = models.CharField(max_length=255, db_index=True)
@@ -536,7 +353,7 @@ class SeedKeyword(models.Model):
        validators=[MinValueValidator(0), MaxValueValidator(100)],
        help_text='Keyword difficulty (0-100)'
    )
    country = models.CharField(max_length=2, choices=COUNTRY_CHOICES, default='US', help_text='Target country for this keyword')
    intent = models.CharField(max_length=50, choices=INTENT_CHOICES, default='informational')
    is_active = models.BooleanField(default=True, db_index=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)
@@ -545,12 +362,12 @@ class SeedKeyword(models.Model):
        db_table = 'igny8_seed_keywords'
        unique_together = [['keyword', 'industry', 'sector']]
        verbose_name = 'Seed Keyword'
        verbose_name_plural = 'Global Keywords Database'
        verbose_name_plural = 'Seed Keywords'
        indexes = [
            models.Index(fields=['keyword']),
            models.Index(fields=['industry', 'sector']),
            models.Index(fields=['industry', 'sector', 'is_active']),
            models.Index(fields=['country']),
            models.Index(fields=['intent']),
        ]
        ordering = ['keyword']
@@ -558,7 +375,7 @@ class SeedKeyword(models.Model):
        return f"{self.keyword} ({self.industry.name} - {self.sector.name})"


class Sector(SoftDeletableModel, AccountBaseModel):
class Sector(AccountBaseModel):
    """
    Sector model - Each site can have 1-5 sectors.
    Sectors are site-specific instances that reference an IndustrySector template.
@@ -585,9 +402,6 @@ class Sector(SoftDeletableModel, AccountBaseModel):
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='active')
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    objects = SoftDeleteManager()
    all_objects = models.Manager()

    class Meta:
        db_table = 'igny8_sectors'
@@ -732,7 +546,8 @@ class User(AbstractUser):
        return self.role == 'developer' or self.is_superuser

    def is_admin_or_developer(self):
        """Check if user is admin or developer."""
        """Check if user is admin or developer with override privileges."""
        # ADMIN/DEV OVERRIDE: Both admin and developer roles bypass account/site/sector restrictions
        return self.role in ['admin', 'developer'] or self.is_superuser

    def is_system_account_user(self):
@@ -745,17 +560,29 @@ class User(AbstractUser):

    def get_accessible_sites(self):
        """Get all sites the user can access."""
        # System account users can access all sites across all accounts
        if self.is_system_account_user():
            return Site.objects.filter(is_active=True).distinct()

        # Developers/super admins can access all sites across all accounts
        # ADMIN/DEV OVERRIDE: Admins also bypass account restrictions (see is_admin_or_developer)
        if self.is_developer():
            return Site.objects.filter(is_active=True).distinct()

        try:
            if not self.account:
                return Site.objects.none()

            base_sites = Site.objects.filter(account=self.account)

            if self.role in ['owner', 'admin', 'developer'] or self.is_superuser or self.is_system_account_user():
                return base_sites
        # Owners and admins can access all sites in their account
        if self.role in ['owner', 'admin']:
            return Site.objects.filter(account=self.account, is_active=True)

            # Other users can only access sites explicitly granted via SiteUserAccess
            return base_sites.filter(user_access__user=self).distinct()
        return Site.objects.filter(
            account=self.account,
            is_active=True,
            user_access__user=self
        ).distinct()
        except (AttributeError, Exception):
            # If account access fails (e.g., column mismatch), return empty queryset
            return Site.objects.none()
Some files were not shown because too many files have changed in this diff.