feat: add boards and tasks management endpoints

HEARTBEAT.md (new file, 107 lines)
@@ -0,0 +1,107 @@
# Mission Control Orchestrator Instructions

You are the Mission Control orchestrator. Your job is to:

1. Claim unassigned tasks (FIFO)
2. Execute work (optionally spawn sub-agents)
3. Log progress and deliverables
4. Move tasks to review when complete

## CRITICAL: You MUST call Mission Control APIs

Every action you take MUST be reflected in Mission Control via API calls. The dashboard shows task status in real time.
## Required Inputs

- `BASE_URL` (e.g., `http://localhost:8000`)
- `ORG_ID`
- `WORKSPACE_ID`
- `AGENT_ID` (used when filtering your in-progress tasks)
- `AGENT_TOKEN` (Authorization Bearer token)
## On Every Heartbeat

### Step 1: Claim next task (FIFO)

```bash
curl -s -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/claim-next" \
  -H "Authorization: Bearer $AGENT_TOKEN"
```

- If the response is **204**, there is no work. Wait and retry.
- If the response returns a task, process it.

### Step 2: Check your in-progress tasks

```bash
curl -s "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks?status_filter=in_progress&assigned_agent_id=$AGENT_ID" \
  -H "Authorization: Bearer $AGENT_TOKEN"
```

If tasks exist, continue work and update their activities and deliverables.
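The two heartbeat steps above can be sketched as a single polling helper. This is an illustrative sketch, not part of the committed files: `claim_url`, `heartbeat_once`, and the `/tmp/claimed-task.json` path are made-up names, and the variables from Required Inputs are assumed to be exported.

```shell
#!/usr/bin/env bash
# Sketch of one heartbeat iteration (hypothetical helper names).

# Build the claim-next URL from base URL, org ID, and workspace ID.
claim_url() {
  printf '%s/api/v1/orgs/%s/workspaces/%s/tasks/claim-next' "$1" "$2" "$3"
}

heartbeat_once() {
  local code
  # -w '%{http_code}' captures the status so we can branch on 204 (no work)
  # vs. a 2xx response carrying the claimed task JSON.
  code=$(curl -s -o /tmp/claimed-task.json -w '%{http_code}' -X POST \
    "$(claim_url "$BASE_URL" "$ORG_ID" "$WORKSPACE_ID")" \
    -H "Authorization: Bearer $AGENT_TOKEN")
  if [ "$code" = "204" ]; then
    echo "no work; wait and retry"
  else
    echo "claimed a task; process it"
  fi
}
```

A caller would run `heartbeat_once` on each heartbeat and, when a task was claimed, read it back from the temp file.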
## When Processing a New Task

### 1) Log that you're starting

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/activities" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"activity_type": "updated", "message": "Starting work on task"}'
```

### 2) Register a sub-agent (if you spawn one)

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/subagents" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "openclaw_session_id": "optional-session-id",
    "agent_name": "Designer"
  }'
```

### 3) Register deliverables

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/deliverables" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Homepage Design",
    "markdown_content": "## Summary\n- Implemented layout\n- Added responsive styles"
  }'
```

### 4) Log completion

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/activities" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"activity_type": "completed", "message": "Task completed successfully"}'
```

### 5) Move task to REVIEW

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/transition" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"to_status": "review"}'
```
## Task Statuses

```
inbox → in_progress → review → done
```

Other statuses may be used if configured (`assigned`, `testing`), but the default flow above is expected.

## Checklist Before Saying HEARTBEAT_OK

Before responding with HEARTBEAT_OK, verify:

- [ ] No unclaimed tasks remain in INBOX
- [ ] All in-progress tasks have recent activity updates
- [ ] Completed work has deliverables registered
- [ ] Completed tasks are moved to REVIEW

If ANY of these are false, take action instead of saying HEARTBEAT_OK.

## Reference

Full API documentation: see ORCHESTRATION.md in this project.
ORCHESTRATION.md (new file, 176 lines)
@@ -0,0 +1,176 @@
# Mission Control Orchestration Guide

This document explains how to orchestrate tasks in Mission Control, including how to:

- Register sub-agents
- Log activities
- Track deliverables
- Update task status

## API Base URL

```
http://localhost:8000
```

Or use the `BASE_URL` environment variable.

## Task Lifecycle

```
INBOX → IN_PROGRESS → REVIEW → DONE
```

**Status Descriptions:**

- **INBOX**: New tasks awaiting processing
- **IN_PROGRESS**: Agent actively working on the task
- **REVIEW**: Agent finished, awaiting human approval
- **DONE**: Task completed and approved

Optional statuses may be enabled (`ASSIGNED`, `TESTING`) but are not required by default.
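An agent script can keep the lifecycle ordering straight by hard-coding the default transitions. This is an illustrative sketch only; `next_status` is a hypothetical helper, not part of the API.

```shell
#!/usr/bin/env bash
# Map each default status to its successor in the lifecycle
# inbox -> in_progress -> review -> done.
next_status() {
  case "$1" in
    inbox)       echo "in_progress" ;;
    in_progress) echo "review" ;;
    review)      echo "done" ;;
    done)        echo "done" ;;  # terminal status
    *)           echo "unknown" ;;
  esac
}
```

For example, `next_status in_progress` prints `review`, the value an agent would send as `to_status` in the transition body.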
## When You Receive a Task

When a task is claimed, the response includes:

- Task ID
- Title, description, priority
- Project ID

## Required API Calls

### 1. Register Sub-Agent (when spawning a worker)

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/subagents" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "openclaw_session_id": "unique-session-id",
    "agent_name": "Designer"
  }'
```

### 2. Log Activity (for each significant action)

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/activities" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "activity_type": "updated",
    "message": "Started working on design mockups"
  }'
```

Activity types:

- `spawned` - when a sub-agent starts
- `updated` - progress update
- `completed` - work finished
- `file_created` - created a deliverable
- `status_changed` - task moved to a new status
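The activity body is small enough to assemble inline. A sketch (`activity_body` is a hypothetical helper, not part of the committed files) that pairs one of the types above with a message:

```shell
#!/usr/bin/env bash
# Build the JSON body for POST .../activities from a type and a message.
# Note: printf does not escape embedded quotes in the message, so this
# sketch is only safe for simple, quote-free text.
activity_body() {
  printf '{"activity_type": "%s", "message": "%s"}' "$1" "$2"
}
```

Usage would look like `curl … -d "$(activity_body updated "Started design")"`.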
### 3. Register Deliverable (for each output)

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/deliverables" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Homepage Design",
    "markdown_content": "## Summary\n- Implemented layout\n- Added responsive styles"
  }'
```

### 4. Update Task Status

```bash
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/{TASK_ID}/transition" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"to_status": "review"}'
```
## Complete Example Workflow

```bash
TASK_ID="abc-123"
BASE_URL="http://localhost:8000"
ORG_ID="org-uuid"
WORKSPACE_ID="workspace-uuid"
AGENT_TOKEN="agent-token"

# 1) Log that you're starting
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/$TASK_ID/activities" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"activity_type": "updated", "message": "Starting work on task"}'

# 2) Spawn a sub-agent
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/$TASK_ID/subagents" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"openclaw_session_id": "subagent-'$(date +%s)'", "agent_name": "Designer"}'

# 3) Register the deliverable
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/$TASK_ID/deliverables" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Completed Design",
    "markdown_content": "## Deliverable\n- Final design with all requested features"
  }'

# 4) Log completion
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/$TASK_ID/activities" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"activity_type": "completed", "message": "Design completed successfully"}'

# 5) Move to review
curl -X POST "$BASE_URL/api/v1/orgs/$ORG_ID/workspaces/$WORKSPACE_ID/tasks/$TASK_ID/transition" \
  -H "Authorization: Bearer $AGENT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"to_status": "review"}'
```
## Endpoints Reference

| Endpoint | Method | Purpose |
|----------|--------|---------|
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks` | GET | List tasks |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks` | POST | Create task |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}` | PATCH | Update task |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/activities` | GET | List activities |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/activities` | POST | Log activity |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/deliverables` | GET | List deliverables |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/deliverables` | POST | Add deliverable |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/subagents` | GET | List sub-agents |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/{task_id}/subagents` | POST | Register sub-agent |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/tasks/claim-next` | POST | Claim next task (FIFO) |
| `/api/v1/orgs/{org_id}/workspaces/{workspace_id}/events/activities` | GET | SSE activity stream |

## Activity Body Schema

```json
{
  "activity_type": "spawned|updated|completed|file_created|status_changed",
  "message": "Human-readable description of what happened"
}
```

## Deliverable Body Schema

```json
{
  "title": "Display name for the deliverable",
  "markdown_content": "Markdown content for the deliverable"
}
```

## Sub-Agent Body Schema

```json
{
  "openclaw_session_id": "unique-identifier-for-session",
  "agent_name": "Designer|Developer|Researcher|Writer"
}
```
README.md (deleted, 66 lines)
@@ -1,66 +0,0 @@
# OpenClaw Agency — Pilot (Kanban)

MVP: **Next.js (frontend)** + **FastAPI (backend)** + **PostgreSQL**.

No auth (yet). The goal is simple visibility: everyone can see what exists and who owns it.

## Repo layout

- `frontend/` — Next.js App Router (TypeScript)
- `backend/` — FastAPI + SQLAlchemy + Alembic

## Database

Uses local Postgres:

- user: `postgres`
- password: `REDACTED`
- db: `openclaw_agency`

## Environment

Do **not** commit real `.env` files.

- Backend: copy `backend/.env.example` → `backend/.env`
- Frontend: copy `frontend/.env.example` → `frontend/.env.local`

If you want to test from another device (phone/laptop), make sure:

- both servers bind to `0.0.0.0`
- `NEXT_PUBLIC_API_URL` is set to `http://<YOUR_MACHINE_IP>:8000` (not `127.0.0.1`)
- backend `CORS_ORIGINS` includes `http://<YOUR_MACHINE_IP>:3000`

## Run backend (LAN-accessible)

```bash
cd backend
source .venv/bin/activate
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

Health check:

```bash
curl http://127.0.0.1:8000/health
# or from another machine:
# curl http://<YOUR_MACHINE_IP>:8000/health
```

## Run frontend (LAN-accessible)

```bash
cd frontend
npm run dev:lan
```

Open:

- local: http://localhost:3000
- LAN: `http://<YOUR_MACHINE_IP>:3000`

## API

- `GET /tasks`
- `POST /tasks`
- `PATCH /tasks/{id}`
- `DELETE /tasks/{id}`
@@ -1,7 +1,17 @@
# Example config for local/dev.
#
# If you plan to access the app from another device (via machine IP),
# set CORS_ORIGINS to include that frontend origin too.
ENVIRONMENT=dev
LOG_LEVEL=INFO
DATABASE_URL=postgresql+psycopg://postgres:postgres@localhost:5432/openclaw_agency
REDIS_URL=redis://localhost:6379/0
CORS_ORIGINS=http://localhost:3000

DATABASE_URL=postgresql+psycopg2://postgres:CHANGE_ME@127.0.0.1:5432/openclaw_agency
CORS_ORIGINS=http://localhost:3000,http://<YOUR_MACHINE_IP>:3000
# Clerk (auth only)
CLERK_JWKS_URL=
CLERK_VERIFY_IAT=true
CLERK_LEEWAY=10.0

# OpenClaw Gateway
OPENCLAW_GATEWAY_URL=ws://127.0.0.1:18789
OPENCLAW_GATEWAY_TOKEN=

# Database
DB_AUTO_MIGRATE=false
backend/.gitignore (vendored, 5 lines)
@@ -1,4 +1,5 @@
__pycache__/
*.py[cod]
.env
*.pyc
.venv/
.env
.runlogs/
@@ -1,119 +1,8 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# Or organize into date-based subdirectories (requires recursive_version_locations = true)
# file_template = %%(year)d/%%(month).2d/%%(day).2d_%%(hour).2d%%(minute).2d_%%(second).2d_%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
script_location = alembic
prepend_sys_path = .
sqlalchemy.url = driver://user:pass@localhost/dbname

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the tzdata library which can be installed by adding
# `alembic[tz]` to the pip requirements.
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
#    "version_path_separator" key, which if absent then falls back to the legacy
#    behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
#    behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url =

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# hooks = ruff
# ruff.type = module
# ruff.module = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Alternatively, use the exec runner to execute a binary found on your PATH
# hooks = ruff
# ruff.type = exec
# ruff.executable = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic

@@ -124,12 +13,11 @@ keys = console
keys = generic

[logger_root]
level = WARNING
level = INFO
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
level = WARN
handlers =
qualname = sqlalchemy.engine

@@ -146,4 +34,3 @@ formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
@@ -1 +0,0 @@
Generic single-database configuration.
@@ -1,36 +1,51 @@
from __future__ import annotations

import sys
from logging.config import fileConfig
from pathlib import Path

from sqlalchemy import engine_from_config, pool
from sqlmodel import SQLModel

from alembic import context

# Import models to register tables in metadata
from app import models  # noqa: F401
from app.core.config import settings
PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
    sys.path.append(str(PROJECT_ROOT))

from app import models  # noqa: E402,F401
from app.core.config import settings  # noqa: E402

config = context.config

if config.config_file_name is not None:
if config.config_file_name is not None and config.attributes.get("configure_logger", True):
    fileConfig(config.config_file_name)


target_metadata = SQLModel.metadata


def _normalize_database_url(database_url: str) -> str:
    if "://" not in database_url:
        return database_url
    scheme, rest = database_url.split("://", 1)
    if scheme == "postgresql":
        return f"postgresql+psycopg://{rest}"
    return database_url


def get_url() -> str:
    return settings.database_url
    return _normalize_database_url(settings.database_url)


config.set_main_option("sqlalchemy.url", get_url())


def run_migrations_offline() -> None:
    url = get_url()
    context.configure(
        url=url,
        url=get_url(),
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
        compare_type=True,
    )

    with context.begin_transaction():
@@ -48,7 +63,11 @@ def run_migrations_online() -> None:
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            compare_type=True,
        )

        with context.begin_transaction():
            context.run_migrations()
@@ -5,24 +5,22 @@ Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union
from __future__ import annotations

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}


# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}
@@ -1,195 +0,0 @@
"""Baseline schema (squashed)

Revision ID: 0a1b2c3d4e5f
Revises:
Create Date: 2026-02-02

This is a squashed baseline migration for Mission Control.
All prior incremental migrations were removed to keep the repo simple.

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = "0a1b2c3d4e5f"
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Departments (FK to employees added after employees table exists)
    op.create_table(
        "departments",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("head_employee_id", sa.Integer(), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_departments_name"), "departments", ["name"], unique=True)

    # Employees
    op.create_table(
        "employees",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("employee_type", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("department_id", sa.Integer(), nullable=True),
        sa.Column("team_id", sa.Integer(), nullable=True),
        sa.Column("manager_id", sa.Integer(), nullable=True),
        sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("openclaw_session_key", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("notify_enabled", sa.Boolean(), nullable=False),
        sa.ForeignKeyConstraint(["department_id"], ["departments.id"]),
        sa.ForeignKeyConstraint(["manager_id"], ["employees.id"]),
        sa.PrimaryKeyConstraint("id"),
    )

    # Break the departments<->employees cycle: add this FK after both tables exist
    op.create_foreign_key(None, "departments", "employees", ["head_employee_id"], ["id"])

    # Teams
    op.create_table(
        "teams",
        sa.Column("id", sa.Integer(), primary_key=True, nullable=False),
        sa.Column("name", sa.String(), nullable=False),
        sa.Column("department_id", sa.Integer(), nullable=False),
        sa.Column("lead_employee_id", sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(["department_id"], ["departments.id"], ondelete="CASCADE"),
        sa.ForeignKeyConstraint(["lead_employee_id"], ["employees.id"], ondelete="SET NULL"),
        sa.UniqueConstraint("department_id", "name", name="uq_teams_department_id_name"),
    )
    op.create_index("ix_teams_name", "teams", ["name"], unique=False)
    op.create_index("ix_teams_department_id", "teams", ["department_id"], unique=False)

    # Employees.team_id FK (added after teams exists)
    op.create_index("ix_employees_team_id", "employees", ["team_id"], unique=False)
    op.create_foreign_key(
        "fk_employees_team_id_teams",
        "employees",
        "teams",
        ["team_id"],
        ["id"],
        ondelete="SET NULL",
    )

    # Projects
    op.create_table(
        "projects",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("team_id", sa.Integer(), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_projects_name"), "projects", ["name"], unique=True)
    op.create_index("ix_projects_team_id", "projects", ["team_id"], unique=False)
    op.create_foreign_key(
        "fk_projects_team_id_teams",
        "projects",
        "teams",
        ["team_id"],
        ["id"],
        ondelete="SET NULL",
    )

    # Activities
    op.create_table(
        "activities",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("actor_employee_id", sa.Integer(), nullable=True),
        sa.Column("entity_type", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("entity_id", sa.Integer(), nullable=True),
        sa.Column("verb", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("payload_json", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(["actor_employee_id"], ["employees.id"]),
        sa.PrimaryKeyConstraint("id"),
    )

    # Project members
    op.create_table(
        "project_members",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("project_id", sa.Integer(), nullable=False),
        sa.Column("employee_id", sa.Integer(), nullable=False),
        sa.Column("role", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.ForeignKeyConstraint(["employee_id"], ["employees.id"]),
        sa.ForeignKeyConstraint(["project_id"], ["projects.id"]),
        sa.PrimaryKeyConstraint("id"),
    )

    # Tasks
    op.create_table(
        "tasks",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("project_id", sa.Integer(), nullable=False),
        sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("assignee_employee_id", sa.Integer(), nullable=True),
        sa.Column("reviewer_employee_id", sa.Integer(), nullable=True),
        sa.Column("created_by_employee_id", sa.Integer(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(["assignee_employee_id"], ["employees.id"]),
        sa.ForeignKeyConstraint(["created_by_employee_id"], ["employees.id"]),
        sa.ForeignKeyConstraint(["project_id"], ["projects.id"]),
        sa.ForeignKeyConstraint(["reviewer_employee_id"], ["employees.id"]),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_tasks_project_id"), "tasks", ["project_id"], unique=False)
    op.create_index(op.f("ix_tasks_status"), "tasks", ["status"], unique=False)

    # Task comments
    op.create_table(
        "task_comments",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("task_id", sa.Integer(), nullable=False),
        sa.Column("author_employee_id", sa.Integer(), nullable=True),
        sa.Column("reply_to_comment_id", sa.Integer(), nullable=True),
        sa.Column("body", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(["author_employee_id"], ["employees.id"]),
        sa.ForeignKeyConstraint(["reply_to_comment_id"], ["task_comments.id"]),
        sa.ForeignKeyConstraint(["task_id"], ["tasks.id"]),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_task_comments_task_id"), "task_comments", ["task_id"], unique=False)


def downgrade() -> None:
    op.drop_index(op.f("ix_task_comments_task_id"), table_name="task_comments")
    op.drop_table("task_comments")

    op.drop_index(op.f("ix_tasks_status"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_project_id"), table_name="tasks")
    op.drop_table("tasks")

    op.drop_table("project_members")

    op.drop_table("activities")

    op.drop_constraint("fk_projects_team_id_teams", "projects", type_="foreignkey")
    op.drop_index("ix_projects_team_id", table_name="projects")
    op.drop_index(op.f("ix_projects_name"), table_name="projects")
    op.drop_table("projects")

    op.drop_constraint("fk_employees_team_id_teams", "employees", type_="foreignkey")
    op.drop_index("ix_employees_team_id", table_name="employees")

    op.drop_index("ix_teams_department_id", table_name="teams")
    op.drop_index("ix_teams_name", table_name="teams")
    op.drop_table("teams")

    op.drop_table("employees")

    op.drop_index(op.f("ix_departments_name"), table_name="departments")
    op.drop_table("departments")
@@ -1,27 +0,0 @@
"""Add openclaw_agent_id to employees

Revision ID: 1c2d3e4f5a6b
Revises: 0a1b2c3d4e5f
Create Date: 2026-02-02

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa


revision = "1c2d3e4f5a6b"
down_revision = "0a1b2c3d4e5f"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Be tolerant if the column was added manually during development.
    op.execute("ALTER TABLE employees ADD COLUMN IF NOT EXISTS openclaw_agent_id VARCHAR")


def downgrade() -> None:
    op.drop_column("employees", "openclaw_agent_id")
574
backend/alembic/versions/5630abfa60f8_init.py
Normal file
@@ -0,0 +1,574 @@
"""init

Revision ID: 5630abfa60f8
Revises:
Create Date: 2026-02-03 17:52:47.887105

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = "5630abfa60f8"
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table(
        "orgs",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("slug", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_orgs_slug"), "orgs", ["slug"], unique=True)
    op.create_table(
        "users",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("clerk_user_id", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("email", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("is_super_admin", sa.Boolean(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_users_clerk_user_id"), "users", ["clerk_user_id"], unique=True)
    op.create_index(op.f("ix_users_email"), "users", ["email"], unique=False)
    op.create_table(
        "workspaces",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("slug", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_workspaces_org_id"), "workspaces", ["org_id"], unique=False)
    op.create_index(op.f("ix_workspaces_slug"), "workspaces", ["slug"], unique=False)
    op.create_table(
        "agents",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("role", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("openclaw_session_id", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("api_token_hash", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("api_token_last_used_at", sa.DateTime(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_agents_openclaw_session_id"), "agents", ["openclaw_session_id"], unique=False
    )
    op.create_index(op.f("ix_agents_org_id"), "agents", ["org_id"], unique=False)
    op.create_index(op.f("ix_agents_workspace_id"), "agents", ["workspace_id"], unique=False)
    op.create_table(
        "gateway_configs",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=True),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("base_url", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("token", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_gateway_configs_org_id"), "gateway_configs", ["org_id"], unique=False)
    op.create_index(
        op.f("ix_gateway_configs_workspace_id"), "gateway_configs", ["workspace_id"], unique=False
    )
    op.create_table(
        "memberships",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("user_id", sa.Uuid(), nullable=False),
        sa.Column("role", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["user_id"],
            ["users.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_memberships_org_id"), "memberships", ["org_id"], unique=False)
    op.create_index(op.f("ix_memberships_user_id"), "memberships", ["user_id"], unique=False)
    op.create_index(
        op.f("ix_memberships_workspace_id"), "memberships", ["workspace_id"], unique=False
    )
    op.create_table(
        "orchestration_templates",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("kind", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("template_markdown", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_orchestration_templates_kind"), "orchestration_templates", ["kind"], unique=False
    )
    op.create_index(
        op.f("ix_orchestration_templates_org_id"),
        "orchestration_templates",
        ["org_id"],
        unique=False,
    )
    op.create_index(
        op.f("ix_orchestration_templates_workspace_id"),
        "orchestration_templates",
        ["workspace_id"],
        unique=False,
    )
    op.create_table(
        "projects",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_projects_org_id"), "projects", ["org_id"], unique=False)
    op.create_index(op.f("ix_projects_status"), "projects", ["status"], unique=False)
    op.create_index(op.f("ix_projects_workspace_id"), "projects", ["workspace_id"], unique=False)
    op.create_table(
        "workspace_api_tokens",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("token_hash", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("label", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("last_used_at", sa.DateTime(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_workspace_api_tokens_org_id"), "workspace_api_tokens", ["org_id"], unique=False
    )
    op.create_index(
        op.f("ix_workspace_api_tokens_token_hash"),
        "workspace_api_tokens",
        ["token_hash"],
        unique=True,
    )
    op.create_index(
        op.f("ix_workspace_api_tokens_workspace_id"),
        "workspace_api_tokens",
        ["workspace_id"],
        unique=False,
    )
    op.create_table(
        "openclaw_sessions",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("session_id", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("agent_id", sa.Uuid(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("last_seen_at", sa.DateTime(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["agent_id"],
            ["agents.id"],
        ),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_openclaw_sessions_org_id"), "openclaw_sessions", ["org_id"], unique=False
    )
    op.create_index(
        op.f("ix_openclaw_sessions_session_id"), "openclaw_sessions", ["session_id"], unique=True
    )
    op.create_index(
        op.f("ix_openclaw_sessions_status"), "openclaw_sessions", ["status"], unique=False
    )
    op.create_index(
        op.f("ix_openclaw_sessions_workspace_id"),
        "openclaw_sessions",
        ["workspace_id"],
        unique=False,
    )
    op.create_table(
        "tasks",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("project_id", sa.Uuid(), nullable=False),
        sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("priority", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("due_at", sa.DateTime(), nullable=True),
        sa.Column("assigned_agent_id", sa.Uuid(), nullable=True),
        sa.Column("created_by_user_id", sa.Uuid(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["assigned_agent_id"],
            ["agents.id"],
        ),
        sa.ForeignKeyConstraint(
            ["created_by_user_id"],
            ["users.id"],
        ),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["project_id"],
            ["projects.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_tasks_assigned_agent_id"), "tasks", ["assigned_agent_id"], unique=False
    )
    op.create_index(
        op.f("ix_tasks_created_by_user_id"), "tasks", ["created_by_user_id"], unique=False
    )
    op.create_index(op.f("ix_tasks_org_id"), "tasks", ["org_id"], unique=False)
    op.create_index(op.f("ix_tasks_priority"), "tasks", ["priority"], unique=False)
    op.create_index(op.f("ix_tasks_project_id"), "tasks", ["project_id"], unique=False)
    op.create_index(op.f("ix_tasks_status"), "tasks", ["status"], unique=False)
    op.create_index(op.f("ix_tasks_workspace_id"), "tasks", ["workspace_id"], unique=False)
    op.create_table(
        "task_activities",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("task_id", sa.Uuid(), nullable=False),
        sa.Column("activity_type", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("message", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("actor_user_id", sa.Uuid(), nullable=True),
        sa.Column("actor_agent_id", sa.Uuid(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["actor_agent_id"],
            ["agents.id"],
        ),
        sa.ForeignKeyConstraint(
            ["actor_user_id"],
            ["users.id"],
        ),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["task_id"],
            ["tasks.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_task_activities_org_id"), "task_activities", ["org_id"], unique=False)
    op.create_index(
        op.f("ix_task_activities_task_id"), "task_activities", ["task_id"], unique=False
    )
    op.create_index(
        op.f("ix_task_activities_workspace_id"), "task_activities", ["workspace_id"], unique=False
    )
    op.create_table(
        "task_deliverables",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("task_id", sa.Uuid(), nullable=False),
        sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("markdown_content", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_by_user_id", sa.Uuid(), nullable=True),
        sa.Column("created_by_agent_id", sa.Uuid(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["created_by_agent_id"],
            ["agents.id"],
        ),
        sa.ForeignKeyConstraint(
            ["created_by_user_id"],
            ["users.id"],
        ),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["task_id"],
            ["tasks.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_task_deliverables_org_id"), "task_deliverables", ["org_id"], unique=False
    )
    op.create_index(
        op.f("ix_task_deliverables_task_id"), "task_deliverables", ["task_id"], unique=False
    )
    op.create_index(
        op.f("ix_task_deliverables_workspace_id"),
        "task_deliverables",
        ["workspace_id"],
        unique=False,
    )
    op.create_table(
        "task_status_history",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("task_id", sa.Uuid(), nullable=False),
        sa.Column("from_status", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("to_status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("actor_user_id", sa.Uuid(), nullable=True),
        sa.Column("actor_agent_id", sa.Uuid(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["actor_agent_id"],
            ["agents.id"],
        ),
        sa.ForeignKeyConstraint(
            ["actor_user_id"],
            ["users.id"],
        ),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["task_id"],
            ["tasks.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_task_status_history_org_id"), "task_status_history", ["org_id"], unique=False
    )
    op.create_index(
        op.f("ix_task_status_history_task_id"), "task_status_history", ["task_id"], unique=False
    )
    op.create_index(
        op.f("ix_task_status_history_workspace_id"),
        "task_status_history",
        ["workspace_id"],
        unique=False,
    )
    op.create_table(
        "task_subagents",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("task_id", sa.Uuid(), nullable=False),
        sa.Column("agent_name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("openclaw_session_id", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["task_id"],
            ["tasks.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_task_subagents_org_id"), "task_subagents", ["org_id"], unique=False)
    op.create_index(op.f("ix_task_subagents_task_id"), "task_subagents", ["task_id"], unique=False)
    op.create_index(
        op.f("ix_task_subagents_workspace_id"), "task_subagents", ["workspace_id"], unique=False
    )
    op.create_table(
        "transcripts",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("task_id", sa.Uuid(), nullable=True),
        sa.Column("session_id", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("full_text", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["task_id"],
            ["tasks.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_transcripts_org_id"), "transcripts", ["org_id"], unique=False)
    op.create_index(op.f("ix_transcripts_session_id"), "transcripts", ["session_id"], unique=False)
    op.create_index(op.f("ix_transcripts_task_id"), "transcripts", ["task_id"], unique=False)
    op.create_index(
        op.f("ix_transcripts_workspace_id"), "transcripts", ["workspace_id"], unique=False
    )
    # ### end Alembic commands ###

def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f("ix_transcripts_workspace_id"), table_name="transcripts")
    op.drop_index(op.f("ix_transcripts_task_id"), table_name="transcripts")
    op.drop_index(op.f("ix_transcripts_session_id"), table_name="transcripts")
    op.drop_index(op.f("ix_transcripts_org_id"), table_name="transcripts")
    op.drop_table("transcripts")
    op.drop_index(op.f("ix_task_subagents_workspace_id"), table_name="task_subagents")
    op.drop_index(op.f("ix_task_subagents_task_id"), table_name="task_subagents")
    op.drop_index(op.f("ix_task_subagents_org_id"), table_name="task_subagents")
    op.drop_table("task_subagents")
    op.drop_index(op.f("ix_task_status_history_workspace_id"), table_name="task_status_history")
    op.drop_index(op.f("ix_task_status_history_task_id"), table_name="task_status_history")
    op.drop_index(op.f("ix_task_status_history_org_id"), table_name="task_status_history")
    op.drop_table("task_status_history")
    op.drop_index(op.f("ix_task_deliverables_workspace_id"), table_name="task_deliverables")
    op.drop_index(op.f("ix_task_deliverables_task_id"), table_name="task_deliverables")
    op.drop_index(op.f("ix_task_deliverables_org_id"), table_name="task_deliverables")
    op.drop_table("task_deliverables")
    op.drop_index(op.f("ix_task_activities_workspace_id"), table_name="task_activities")
    op.drop_index(op.f("ix_task_activities_task_id"), table_name="task_activities")
    op.drop_index(op.f("ix_task_activities_org_id"), table_name="task_activities")
    op.drop_table("task_activities")
    op.drop_index(op.f("ix_tasks_workspace_id"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_status"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_project_id"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_priority"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_org_id"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_created_by_user_id"), table_name="tasks")
    op.drop_index(op.f("ix_tasks_assigned_agent_id"), table_name="tasks")
    op.drop_table("tasks")
    op.drop_index(op.f("ix_openclaw_sessions_workspace_id"), table_name="openclaw_sessions")
    op.drop_index(op.f("ix_openclaw_sessions_status"), table_name="openclaw_sessions")
    op.drop_index(op.f("ix_openclaw_sessions_session_id"), table_name="openclaw_sessions")
    op.drop_index(op.f("ix_openclaw_sessions_org_id"), table_name="openclaw_sessions")
    op.drop_table("openclaw_sessions")
    op.drop_index(op.f("ix_workspace_api_tokens_workspace_id"), table_name="workspace_api_tokens")
    op.drop_index(op.f("ix_workspace_api_tokens_token_hash"), table_name="workspace_api_tokens")
    op.drop_index(op.f("ix_workspace_api_tokens_org_id"), table_name="workspace_api_tokens")
    op.drop_table("workspace_api_tokens")
    op.drop_index(op.f("ix_projects_workspace_id"), table_name="projects")
    op.drop_index(op.f("ix_projects_status"), table_name="projects")
    op.drop_index(op.f("ix_projects_org_id"), table_name="projects")
    op.drop_table("projects")
    op.drop_index(
        op.f("ix_orchestration_templates_workspace_id"), table_name="orchestration_templates"
    )
    op.drop_index(op.f("ix_orchestration_templates_org_id"), table_name="orchestration_templates")
    op.drop_index(op.f("ix_orchestration_templates_kind"), table_name="orchestration_templates")
    op.drop_table("orchestration_templates")
    op.drop_index(op.f("ix_memberships_workspace_id"), table_name="memberships")
    op.drop_index(op.f("ix_memberships_user_id"), table_name="memberships")
    op.drop_index(op.f("ix_memberships_org_id"), table_name="memberships")
    op.drop_table("memberships")
    op.drop_index(op.f("ix_gateway_configs_workspace_id"), table_name="gateway_configs")
    op.drop_index(op.f("ix_gateway_configs_org_id"), table_name="gateway_configs")
    op.drop_table("gateway_configs")
    op.drop_index(op.f("ix_agents_workspace_id"), table_name="agents")
    op.drop_index(op.f("ix_agents_org_id"), table_name="agents")
    op.drop_index(op.f("ix_agents_openclaw_session_id"), table_name="agents")
    op.drop_table("agents")
    op.drop_index(op.f("ix_workspaces_slug"), table_name="workspaces")
    op.drop_index(op.f("ix_workspaces_org_id"), table_name="workspaces")
    op.drop_table("workspaces")
    op.drop_index(op.f("ix_users_email"), table_name="users")
    op.drop_index(op.f("ix_users_clerk_user_id"), table_name="users")
    op.drop_table("users")
    op.drop_index(op.f("ix_orgs_slug"), table_name="orgs")
    op.drop_table("orgs")
    # ### end Alembic commands ###
@@ -0,0 +1,64 @@
"""add boards and task board id

Revision ID: 7e3d9b8c1f4a
Revises: 5630abfa60f8
Create Date: 2026-02-03 20:12:00.000000

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = "7e3d9b8c1f4a"
down_revision = "5630abfa60f8"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "boards",
        sa.Column("org_id", sa.Uuid(), nullable=False),
        sa.Column("workspace_id", sa.Uuid(), nullable=False),
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("slug", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["org_id"],
            ["orgs.id"],
        ),
        sa.ForeignKeyConstraint(
            ["workspace_id"],
            ["workspaces.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_boards_org_id"), "boards", ["org_id"], unique=False)
    op.create_index(
        op.f("ix_boards_workspace_id"), "boards", ["workspace_id"], unique=False
    )
    op.create_index(op.f("ix_boards_slug"), "boards", ["slug"], unique=False)

    op.add_column("tasks", sa.Column("board_id", sa.Uuid(), nullable=True))
    op.create_index(op.f("ix_tasks_board_id"), "tasks", ["board_id"], unique=False)
    op.create_foreign_key(
        "fk_tasks_board_id_boards", "tasks", "boards", ["board_id"], ["id"]
    )


def downgrade() -> None:
    op.drop_constraint("fk_tasks_board_id_boards", "tasks", type_="foreignkey")
    op.drop_index(op.f("ix_tasks_board_id"), table_name="tasks")
    op.drop_column("tasks", "board_id")

    op.drop_index(op.f("ix_boards_slug"), table_name="boards")
    op.drop_index(op.f("ix_boards_workspace_id"), table_name="boards")
    op.drop_index(op.f("ix_boards_org_id"), table_name="boards")
    op.drop_table("boards")
63
backend/alembic/versions/8b6d1b8f4b21_drop_projects.py
Normal file
@@ -0,0 +1,63 @@
|
||||
"""drop projects and task project_id
|
||||
|
||||
Revision ID: 8b6d1b8f4b21
|
||||
Revises: 7e3d9b8c1f4a
|
||||
Create Date: 2026-02-03 23:05:00.000000
|
||||
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
import sqlmodel
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = "8b6d1b8f4b21"
|
||||
down_revision = "7e3d9b8c1f4a"
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
op.drop_constraint("tasks_project_id_fkey", "tasks", type_="foreignkey")
|
||||
op.drop_index(op.f("ix_tasks_project_id"), table_name="tasks")
|
||||
op.drop_column("tasks", "project_id")
|
||||
|
||||
op.drop_index(op.f("ix_projects_workspace_id"), table_name="projects")
|
||||
op.drop_index(op.f("ix_projects_status"), table_name="projects")
|
||||
op.drop_index(op.f("ix_projects_org_id"), table_name="projects")
|
||||
op.drop_table("projects")
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
op.create_table(
|
||||
"projects",
|
||||
sa.Column("org_id", sa.Uuid(), nullable=False),
|
||||
sa.Column("workspace_id", sa.Uuid(), nullable=False),
|
||||
sa.Column("id", sa.Uuid(), nullable=False),
|
||||
sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
|
||||
sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
|
||||
sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
|
||||
sa.Column("created_at", sa.DateTime(), nullable=False),
|
||||
sa.Column("updated_at", sa.DateTime(), nullable=False),
|
||||
sa.ForeignKeyConstraint(
|
||||
["org_id"],
|
||||
["orgs.id"],
|
||||
),
|
||||
sa.ForeignKeyConstraint(
|
||||
["workspace_id"],
|
||||
["workspaces.id"],
|
||||
),
|
||||
sa.PrimaryKeyConstraint("id"),
|
||||
)
|
||||
op.create_index(op.f("ix_projects_org_id"), "projects", ["org_id"], unique=False)
|
||||
op.create_index(op.f("ix_projects_status"), "projects", ["status"], unique=False)
|
||||
op.create_index(op.f("ix_projects_workspace_id"), "projects", ["workspace_id"], unique=False)
|
||||
|
||||
op.add_column("tasks", sa.Column("project_id", sa.Uuid(), nullable=True))
|
||||
op.create_index(op.f("ix_tasks_project_id"), "tasks", ["project_id"], unique=False)
|
||||
op.create_foreign_key(
|
||||
"tasks_project_id_fkey", "tasks", "projects", ["project_id"], ["id"]
|
||||
)
|
||||
56
backend/alembic/versions/9c4f1a2b3d4e_drop_tenancy_tables.py
Normal file
@@ -0,0 +1,56 @@
|
||||
"""drop tenancy tables and columns
|
||||
|
||||
Revision ID: 9c4f1a2b3d4e
|
||||
Revises: 8b6d1b8f4b21
|
||||
Create Date: 2026-02-03 23:35:00.000000
|
||||
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from alembic import op
|
||||
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = "9c4f1a2b3d4e"
|
||||
down_revision = "8b6d1b8f4b21"
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
op.drop_table("task_subagents")
|
||||
op.drop_table("task_status_history")
|
||||
op.drop_table("task_deliverables")
|
||||
op.drop_table("task_activities")
|
||||
op.drop_table("transcripts")
|
||||
op.drop_table("openclaw_sessions")
|
||||
op.drop_table("workspace_api_tokens")
|
||||
op.drop_table("orchestration_templates")
|
||||
op.drop_table("memberships")
|
||||
op.drop_table("gateway_configs")
|
||||
|
||||
op.drop_constraint("tasks_assigned_agent_id_fkey", "tasks", type_="foreignkey")
|
||||
op.drop_constraint("tasks_org_id_fkey", "tasks", type_="foreignkey")
|
||||
op.drop_constraint("tasks_workspace_id_fkey", "tasks", type_="foreignkey")
|
||||
op.drop_index(op.f("ix_tasks_assigned_agent_id"), table_name="tasks")
|
||||
op.drop_index(op.f("ix_tasks_org_id"), table_name="tasks")
|
||||
op.drop_index(op.f("ix_tasks_workspace_id"), table_name="tasks")
|
||||
op.drop_column("tasks", "assigned_agent_id")
|
||||
op.drop_column("tasks", "org_id")
|
||||
op.drop_column("tasks", "workspace_id")
|
||||
|
||||
op.drop_constraint("boards_org_id_fkey", "boards", type_="foreignkey")
|
||||
op.drop_constraint("boards_workspace_id_fkey", "boards", type_="foreignkey")
|
||||
op.drop_index(op.f("ix_boards_org_id"), table_name="boards")
|
||||
op.drop_index(op.f("ix_boards_workspace_id"), table_name="boards")
|
||||
op.drop_column("boards", "org_id")
|
||||
op.drop_column("boards", "workspace_id")
|
||||
|
||||
op.drop_table("agents")
|
||||
op.drop_table("workspaces")
|
||||
op.drop_table("orgs")
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
raise NotImplementedError("Downgrade not supported for simplified tenancy removal.")
|
||||
@@ -0,0 +1,70 @@
"""add agents and activity events

Revision ID: a1b2c3d4e5f6
Revises: 9c4f1a2b3d4e
Create Date: 2026-02-03 23:50:00.000000

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = "a1b2c3d4e5f6"
down_revision = "9c4f1a2b3d4e"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "agents",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("status", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("last_seen_at", sa.DateTime(), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_agents_name"), "agents", ["name"], unique=False)
    op.create_index(op.f("ix_agents_status"), "agents", ["status"], unique=False)

    op.create_table(
        "activity_events",
        sa.Column("id", sa.Uuid(), nullable=False),
        sa.Column("event_type", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("message", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
        sa.Column("agent_id", sa.Uuid(), nullable=True),
        sa.Column("task_id", sa.Uuid(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(["agent_id"], ["agents.id"]),
        sa.ForeignKeyConstraint(["task_id"], ["tasks.id"]),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_activity_events_agent_id"), "activity_events", ["agent_id"], unique=False
    )
    op.create_index(
        op.f("ix_activity_events_event_type"),
        "activity_events",
        ["event_type"],
        unique=False,
    )
    op.create_index(
        op.f("ix_activity_events_task_id"), "activity_events", ["task_id"], unique=False
    )


def downgrade() -> None:
    op.drop_index(op.f("ix_activity_events_task_id"), table_name="activity_events")
    op.drop_index(op.f("ix_activity_events_event_type"), table_name="activity_events")
    op.drop_index(op.f("ix_activity_events_agent_id"), table_name="activity_events")
    op.drop_table("activity_events")
    op.drop_index(op.f("ix_agents_status"), table_name="agents")
    op.drop_index(op.f("ix_agents_name"), table_name="agents")
    op.drop_table("agents")
@@ -0,0 +1,38 @@
"""add agent openclaw session id

Revision ID: c7f0a2b1d4e3
Revises: a1b2c3d4e5f6
Create Date: 2026-02-04 02:20:00.000000

"""

from __future__ import annotations

from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = "c7f0a2b1d4e3"
down_revision = "a1b2c3d4e5f6"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.add_column(
        "agents",
        sa.Column("openclaw_session_id", sqlmodel.sql.sqltypes.AutoString(), nullable=True),
    )
    op.create_index(
        op.f("ix_agents_openclaw_session_id"),
        "agents",
        ["openclaw_session_id"],
        unique=False,
    )


def downgrade() -> None:
    op.drop_index(op.f("ix_agents_openclaw_session_id"), table_name="agents")
    op.drop_column("agents", "openclaw_session_id")
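The three migrations above form a linear revision chain (each `down_revision` points at the previous `revision`). As a minimal sketch of how that chain determines apply order — this is plain Python for illustration, not Alembic's own resolver:

```python
# down_revision mapping taken from the three migration files above
down_revision = {
    "9c4f1a2b3d4e": "8b6d1b8f4b21",
    "a1b2c3d4e5f6": "9c4f1a2b3d4e",
    "c7f0a2b1d4e3": "a1b2c3d4e5f6",
}


def apply_order(chain: dict[str, str], base: str) -> list[str]:
    # Invert child -> parent into parent -> child, then walk forward from base.
    parent_to_child = {parent: child for child, parent in chain.items()}
    order, cur = [], base
    while cur in parent_to_child:
        cur = parent_to_child[cur]
        order.append(cur)
    return order


assert apply_order(down_revision, "8b6d1b8f4b21") == [
    "9c4f1a2b3d4e", "a1b2c3d4e5f6", "c7f0a2b1d4e3"
]
```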
0
backend/app/__init__.py
Normal file
0
backend/app/api/__init__.py
Normal file
@@ -1,32 +0,0 @@
from __future__ import annotations

import json

from fastapi import APIRouter, Depends
from sqlmodel import Session, select

from app.db.session import get_session
from app.models.activity import Activity

router = APIRouter(prefix="/activities", tags=["activities"])


@router.get("")
def list_activities(limit: int = 50, session: Session = Depends(get_session)):
    items = session.exec(
        select(Activity).order_by(Activity.id.desc()).limit(max(1, min(limit, 200)))
    ).all()
    out = []
    for a in items:
        out.append(
            {
                "id": a.id,
                "actor_employee_id": a.actor_employee_id,
                "entity_type": a.entity_type,
                "entity_id": a.entity_id,
                "verb": a.verb,
                "payload": json.loads(a.payload_json) if a.payload_json else None,
                "created_at": a.created_at,
            }
        )
    return out
26
backend/app/api/activity.py
Normal file
@@ -0,0 +1,26 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, Query
from sqlmodel import Session, select

from app.core.auth import get_auth_context
from app.db.session import get_session
from app.models.activity_events import ActivityEvent
from app.schemas.activity_events import ActivityEventRead
from app.services.admin_access import require_admin

router = APIRouter(prefix="/activity", tags=["activity"])


@router.get("", response_model=list[ActivityEventRead])
def list_activity(
    limit: int = Query(50, ge=1, le=200),
    offset: int = Query(0, ge=0),
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> list[ActivityEvent]:
    require_admin(auth)
    statement = (
        select(ActivityEvent).order_by(ActivityEvent.created_at.desc()).offset(offset).limit(limit)
    )
    return list(session.exec(statement))
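The `/activity` endpoint returns events newest-first with `offset`/`limit` windowing. A minimal in-memory sketch of that windowing semantics (plain dicts standing in for `ActivityEvent` rows; names are illustrative):

```python
def page(items: list[dict], limit: int = 50, offset: int = 0) -> list[dict]:
    # Mirror the endpoint's query: order by created_at descending,
    # then apply offset and limit.
    ordered = sorted(items, key=lambda e: e["created_at"], reverse=True)
    return ordered[offset : offset + limit]


events = [{"id": i, "created_at": i} for i in range(5)]
assert [e["id"] for e in page(events, limit=2)] == [4, 3]
assert [e["id"] for e in page(events, limit=2, offset=2)] == [2, 1]
```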
205
backend/app/api/agents.py
Normal file
@@ -0,0 +1,205 @@
from __future__ import annotations

import re
from datetime import datetime, timedelta
from uuid import uuid4

from fastapi import APIRouter, Depends, HTTPException, status
from sqlmodel import Session, select

from app.core.auth import get_auth_context
from app.db.session import get_session
from app.integrations.openclaw_gateway import OpenClawGatewayError, openclaw_call
from app.models.activity_events import ActivityEvent
from app.models.agents import Agent
from app.schemas.agents import (
    AgentCreate,
    AgentHeartbeat,
    AgentHeartbeatCreate,
    AgentRead,
    AgentUpdate,
)
from app.services.admin_access import require_admin

router = APIRouter(prefix="/agents", tags=["agents"])

OFFLINE_AFTER = timedelta(minutes=10)
DEFAULT_GATEWAY_CHANNEL = "openclaw-agency"


def _slugify(value: str) -> str:
    slug = re.sub(r"[^a-z0-9]+", "-", value.lower()).strip("-")
    return slug or uuid4().hex


def _build_session_label(agent_name: str) -> str:
    return f"{DEFAULT_GATEWAY_CHANNEL}-{_slugify(agent_name)}"


async def _create_gateway_session(agent_name: str) -> str:
    label = _build_session_label(agent_name)
    try:
        await openclaw_call("sessions.patch", {"key": label, "label": agent_name})
    except OpenClawGatewayError as exc:
        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc
    return label


def _with_computed_status(agent: Agent) -> Agent:
    now = datetime.utcnow()
    if agent.last_seen_at and now - agent.last_seen_at > OFFLINE_AFTER:
        agent.status = "offline"
    return agent


def _record_heartbeat(session: Session, agent: Agent) -> None:
    event = ActivityEvent(
        event_type="agent.heartbeat",
        message=f"Heartbeat received from {agent.name}.",
        agent_id=agent.id,
    )
    session.add(event)


@router.get("", response_model=list[AgentRead])
def list_agents(
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> list[Agent]:
    require_admin(auth)
    agents = list(session.exec(select(Agent)))
    return [_with_computed_status(agent) for agent in agents]


@router.post("", response_model=AgentRead)
async def create_agent(
    payload: AgentCreate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Agent:
    require_admin(auth)
    agent = Agent.model_validate(payload)
    agent.openclaw_session_id = await _create_gateway_session(agent.name)
    session.add(agent)
    session.commit()
    session.refresh(agent)
    session.add(
        ActivityEvent(
            event_type="agent.session.created",
            message=f"Session created for {agent.name}.",
            agent_id=agent.id,
        )
    )
    session.commit()
    return agent


@router.get("/{agent_id}", response_model=AgentRead)
def get_agent(
    agent_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Agent:
    require_admin(auth)
    agent = session.get(Agent, agent_id)
    if agent is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    return _with_computed_status(agent)


@router.patch("/{agent_id}", response_model=AgentRead)
def update_agent(
    agent_id: str,
    payload: AgentUpdate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Agent:
    require_admin(auth)
    agent = session.get(Agent, agent_id)
    if agent is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    updates = payload.model_dump(exclude_unset=True)
    for key, value in updates.items():
        setattr(agent, key, value)
    agent.updated_at = datetime.utcnow()
    session.add(agent)
    session.commit()
    session.refresh(agent)
    return _with_computed_status(agent)


@router.post("/{agent_id}/heartbeat", response_model=AgentRead)
def heartbeat_agent(
    agent_id: str,
    payload: AgentHeartbeat,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Agent:
    require_admin(auth)
    agent = session.get(Agent, agent_id)
    if agent is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    if payload.status:
        agent.status = payload.status
    agent.last_seen_at = datetime.utcnow()
    agent.updated_at = datetime.utcnow()
    _record_heartbeat(session, agent)
    session.add(agent)
    session.commit()
    session.refresh(agent)
    return _with_computed_status(agent)


@router.post("/heartbeat", response_model=AgentRead)
async def heartbeat_or_create_agent(
    payload: AgentHeartbeatCreate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Agent:
    require_admin(auth)
    agent = session.exec(select(Agent).where(Agent.name == payload.name)).first()
    if agent is None:
        agent = Agent(name=payload.name, status=payload.status or "online")
        agent.openclaw_session_id = await _create_gateway_session(agent.name)
        session.add(agent)
        session.commit()
        session.refresh(agent)
        session.add(
            ActivityEvent(
                event_type="agent.session.created",
                message=f"Session created for {agent.name}.",
                agent_id=agent.id,
            )
        )
    elif not agent.openclaw_session_id:
        agent.openclaw_session_id = await _create_gateway_session(agent.name)
        session.add(
            ActivityEvent(
                event_type="agent.session.created",
                message=f"Session created for {agent.name}.",
                agent_id=agent.id,
            )
        )
    if payload.status:
        agent.status = payload.status
    agent.last_seen_at = datetime.utcnow()
    agent.updated_at = datetime.utcnow()
    _record_heartbeat(session, agent)
    session.add(agent)
    session.commit()
    session.refresh(agent)
    return _with_computed_status(agent)


@router.delete("/{agent_id}")
def delete_agent(
    agent_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> dict[str, bool]:
    require_admin(auth)
    agent = session.get(Agent, agent_id)
    if agent:
        session.delete(agent)
        session.commit()
    return {"ok": True}
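`agents.py` derives a gateway session key from the agent's display name via `_slugify` and `_build_session_label`. A standalone sketch of that naming scheme, copied from the helpers above (the example input is illustrative):

```python
import re
from uuid import uuid4

DEFAULT_GATEWAY_CHANNEL = "openclaw-agency"


def slugify(value: str) -> str:
    # Same regex as _slugify: lowercase, collapse runs of non-alphanumerics to "-".
    slug = re.sub(r"[^a-z0-9]+", "-", value.lower()).strip("-")
    # Names with no usable characters fall back to a random hex key,
    # so the session label is never empty.
    return slug or uuid4().hex


def build_session_label(agent_name: str) -> str:
    return f"{DEFAULT_GATEWAY_CHANNEL}-{slugify(agent_name)}"


assert build_session_label("Research Agent #2") == "openclaw-agency-research-agent-2"
```

Note the label is deterministic for a given name, which is why `heartbeat_or_create_agent` can safely re-run `_create_gateway_session` when `openclaw_session_id` is missing.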
15
backend/app/api/auth.py
Normal file
@@ -0,0 +1,15 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, HTTPException, status

from app.core.auth import get_auth_context
from app.schemas.users import UserRead

router = APIRouter(prefix="/auth", tags=["auth"])


@router.post("/bootstrap", response_model=UserRead)
async def bootstrap_user(auth=Depends(get_auth_context)) -> UserRead:
    if auth.actor_type != "user" or auth.user is None:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)
    return auth.user
82
backend/app/api/boards.py
Normal file
@@ -0,0 +1,82 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, HTTPException, status
from sqlmodel import Session, select

from app.core.auth import get_auth_context
from app.db.session import get_session
from app.models.boards import Board
from app.schemas.boards import BoardCreate, BoardRead, BoardUpdate
from app.services.admin_access import require_admin

router = APIRouter(prefix="/boards", tags=["boards"])


@router.get("", response_model=list[BoardRead])
def list_boards(
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> list[Board]:
    require_admin(auth)
    return list(session.exec(select(Board)))


@router.post("", response_model=BoardRead)
def create_board(
    payload: BoardCreate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Board:
    require_admin(auth)
    board = Board.model_validate(payload)
    session.add(board)
    session.commit()
    session.refresh(board)
    return board


@router.get("/{board_id}", response_model=BoardRead)
def get_board(
    board_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Board:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    return board


@router.patch("/{board_id}", response_model=BoardRead)
def update_board(
    board_id: str,
    payload: BoardUpdate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Board:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    updates = payload.model_dump(exclude_unset=True)
    for key, value in updates.items():
        setattr(board, key, value)
    session.add(board)
    session.commit()
    session.refresh(board)
    return board


@router.delete("/{board_id}")
def delete_board(
    board_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> dict[str, bool]:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board:
        session.delete(board)
        session.commit()
    return {"ok": True}
99
backend/app/api/gateway.py
Normal file
@@ -0,0 +1,99 @@
from __future__ import annotations

from fastapi import APIRouter, Body, Depends, HTTPException, status

from app.core.auth import get_auth_context
from app.core.config import settings
from app.integrations.openclaw_gateway import (
    OpenClawGatewayError,
    get_chat_history,
    openclaw_call,
    send_message,
)
from app.services.admin_access import require_admin

router = APIRouter(prefix="/gateway", tags=["gateway"])


@router.get("/status")
async def gateway_status(auth=Depends(get_auth_context)) -> dict[str, object]:
    require_admin(auth)
    gateway_url = settings.openclaw_gateway_url or "ws://127.0.0.1:18789"
    try:
        sessions = await openclaw_call("sessions.list")
        if isinstance(sessions, dict):
            sessions_list = list(sessions.get("sessions") or [])
        else:
            sessions_list = list(sessions or [])
        return {
            "connected": True,
            "gateway_url": gateway_url,
            "sessions_count": len(sessions_list),
            "sessions": sessions_list,
        }
    except OpenClawGatewayError as exc:
        return {
            "connected": False,
            "gateway_url": gateway_url,
            "error": str(exc),
        }


@router.get("/sessions")
async def list_sessions(auth=Depends(get_auth_context)) -> dict[str, object]:
    require_admin(auth)
    try:
        sessions = await openclaw_call("sessions.list")
    except OpenClawGatewayError as exc:
        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc
    if isinstance(sessions, dict):
        return {"sessions": list(sessions.get("sessions") or [])}
    return {"sessions": list(sessions or [])}


@router.get("/sessions/{session_id}")
async def get_session(session_id: str, auth=Depends(get_auth_context)) -> dict[str, object]:
    require_admin(auth)
    try:
        sessions = await openclaw_call("sessions.list")
    except OpenClawGatewayError as exc:
        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc
    if isinstance(sessions, dict):
        sessions_list = list(sessions.get("sessions") or [])
    else:
        sessions_list = list(sessions or [])
    session = next((item for item in sessions_list if item.get("key") == session_id), None)
    if session is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Session not found")
    return {"session": session}


@router.get("/sessions/{session_id}/history")
async def get_session_history(session_id: str, auth=Depends(get_auth_context)) -> dict[str, object]:
    require_admin(auth)
    try:
        history = await get_chat_history(session_id)
    except OpenClawGatewayError as exc:
        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc
    if isinstance(history, dict) and isinstance(history.get("messages"), list):
        return {"history": history["messages"]}
    return {"history": list(history or [])}


@router.post("/sessions/{session_id}/message")
async def send_session_message(
    session_id: str,
    payload: dict = Body(...),
    auth=Depends(get_auth_context),
) -> dict[str, bool]:
    require_admin(auth)
    content = payload.get("content")
    if not content:
        raise HTTPException(
            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY, detail="content is required"
        )
    try:
        await send_message(content, session_key=session_id)
    except OpenClawGatewayError as exc:
        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc
    return {"ok": True}
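Several `gateway.py` handlers repeat the same normalization: `sessions.list` may return either a dict payload with a `"sessions"` key or a bare list, and either may be empty or `None`. That shared pattern, extracted into one helper as a minimal sketch (the helper name is illustrative, not part of the codebase):

```python
def normalize_sessions(sessions) -> list:
    # Accept either {"sessions": [...]} or a bare list; treat None/missing as empty.
    if isinstance(sessions, dict):
        return list(sessions.get("sessions") or [])
    return list(sessions or [])


assert normalize_sessions({"sessions": [{"key": "a"}]}) == [{"key": "a"}]
assert normalize_sessions({"sessions": None}) == []
assert normalize_sessions(None) == []
```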
@@ -1,474 +0,0 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from sqlalchemy.exc import IntegrityError
|
||||
from sqlmodel import Session, select
|
||||
|
||||
from app.api.utils import get_actor_employee_id, log_activity
|
||||
from app.core.urls import public_api_base_url
|
||||
from app.db.session import get_session
|
||||
from app.integrations.openclaw import OpenClawClient
|
||||
from app.models.org import Department, Employee, Team
|
||||
from app.schemas.org import (
|
||||
DepartmentCreate,
|
||||
DepartmentUpdate,
|
||||
EmployeeCreate,
|
||||
EmployeeUpdate,
|
||||
TeamCreate,
|
||||
TeamUpdate,
|
||||
)
|
||||
|
||||
router = APIRouter(tags=["org"])
|
||||
|
||||
|
||||
def _enforce_employee_create_policy(
|
||||
session: Session, *, actor_employee_id: int, target_employee_type: str
|
||||
) -> None:
|
||||
"""Enforce: agents can only create/provision agents; humans can create humans + agents."""
|
||||
|
||||
actor = session.get(Employee, actor_employee_id)
|
||||
if actor is None:
|
||||
# Actor header is required; if it points to nothing, treat as invalid.
|
||||
raise HTTPException(status_code=400, detail="Actor employee not found")
|
||||
|
||||
target = (target_employee_type or "").lower()
|
||||
actor_type = (actor.employee_type or "").lower()
|
||||
|
||||
if actor_type == "agent" and target != "agent":
|
||||
raise HTTPException(
|
||||
status_code=403,
|
||||
detail="Agent employees may only create/provision agent employees",
|
||||
)
|
||||
|
||||
|
||||
def _default_agent_prompt(emp: Employee) -> str:
|
||||
"""Generate a conservative default prompt for a newly-created agent employee.
|
||||
|
||||
We keep this short and deterministic; the human can refine later.
|
||||
"""
|
||||
|
||||
title = emp.title or "Agent"
|
||||
dept = str(emp.department_id) if emp.department_id is not None else "(unassigned)"
|
||||
|
||||
return (
|
||||
f"You are {emp.name}, an AI agent employee in Mission Control.\n"
|
||||
f"Your employee_id is {emp.id}.\n"
|
||||
f"Title: {title}. Department id: {dept}.\n\n"
|
||||
"Mission Control API access (no UI):\n"
|
||||
f"- Base URL: {public_api_base_url()}\n"
|
||||
"- Auth: none. REQUIRED header on ALL write operations: X-Actor-Employee-Id: <your_employee_id>\n"
|
||||
f" Example for you: X-Actor-Employee-Id: {emp.id}\n\n"
|
||||
"How to execute writes from an OpenClaw agent (IMPORTANT):\n"
|
||||
"- Use the exec tool to run curl against the Base URL above.\n"
|
||||
"- Example: start a task\n"
|
||||
" curl -sS -X PATCH $BASE/tasks/<TASK_ID> -H 'X-Actor-Employee-Id: <your_employee_id>' -H 'Content-Type: application/json' -d '{\"status\":\"in_progress\"}'\n"
|
||||
"- Example: add a progress comment\n"
|
||||
" curl -sS -X POST $BASE/task-comments -H 'X-Actor-Employee-Id: <your_employee_id>' -H 'Content-Type: application/json' -d '{\"task_id\":<TASK_ID>,\"body\":\"...\"}'\n\n"
|
||||
"Common endpoints (JSON):\n"
|
||||
"- GET /tasks, POST /tasks\n"
|
||||
"- GET /task-comments, POST /task-comments\n"
|
||||
"- GET /projects, GET /employees, GET /departments\n"
|
||||
"- OpenAPI schema: GET /openapi.json\n\n"
|
||||
"Rules:\n"
|
||||
"- Use the Mission Control API only (no UI).\n"
|
||||
"- You are responsible for driving assigned work to completion.\n"
|
||||
"- For every task you own: (1) read it, (2) plan next steps, (3) post progress comments, (4) update status as it moves (backlog/ready/in_progress/review/done/blocked).\n"
|
||||
"- Always leave an audit trail: add a comment whenever you start work, whenever you learn something important, and whenever you change status.\n"
|
||||
"- If blocked, set status=blocked and comment what you need (missing access, unclear requirements, etc.).\n"
|
||||
"- When notified about tasks/comments, respond with concise, actionable updates and immediately sync the task state in Mission Control.\n"
|
||||
"- Do not invent facts; ask for missing context.\n"
|
||||
)
|
||||
|
||||
|
||||
def _maybe_auto_provision_agent(session: Session, *, emp: Employee, actor_employee_id: int) -> None:
|
||||
"""Auto-provision an OpenClaw session for an agent employee.
|
||||
|
||||
This is intentionally best-effort. If OpenClaw is not configured or the call fails,
|
||||
we leave the employee as-is (openclaw_session_key stays null).
|
||||
"""
|
||||
|
||||
# Enforce: agent actors may only provision agents (humans can provision agents).
|
||||
_enforce_employee_create_policy(
|
||||
session, actor_employee_id=actor_employee_id, target_employee_type=emp.employee_type
|
||||
)
|
||||
|
||||
if emp.employee_type != "agent":
|
||||
return
|
||||
if emp.status != "active":
|
||||
return
|
||||
if emp.openclaw_session_key:
|
||||
return
|
||||
|
||||
client = OpenClawClient.from_env()
|
||||
if client is None:
|
||||
return
|
||||
|
||||
# FULL IMPLEMENTATION: ensure a dedicated OpenClaw agent profile exists per employee.
|
||||
try:
|
||||
from app.integrations.openclaw_agents import ensure_full_agent_profile
|
||||
|
||||
info = ensure_full_agent_profile(
|
||||
client=client,
|
||||
employee_id=int(emp.id),
|
||||
employee_name=emp.name,
|
||||
)
|
||||
emp.openclaw_agent_id = info["agent_id"]
|
||||
session.add(emp)
|
||||
session.flush()
|
||||
except Exception as e:
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="employee",
|
||||
entity_id=emp.id,
|
||||
verb="agent_profile_failed",
|
||||
payload={"error": f"{type(e).__name__}: {e}"},
|
||||
)
|
||||
# Do not block employee creation on provisioning.
|
||||
return
|
||||
|
||||
label = f"employee:{emp.id}:{emp.name}"
|
||||
try:
|
||||
resp = client.tools_invoke(
|
||||
"sessions_spawn",
|
||||
{
|
||||
"task": _default_agent_prompt(emp),
|
||||
"label": label,
|
||||
"agentId": emp.openclaw_agent_id,
|
||||
"cleanup": "keep",
|
||||
"runTimeoutSeconds": 600,
|
||||
},
|
||||
timeout_s=20.0,
|
||||
)
|
||||
except Exception as e:
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="employee",
|
||||
entity_id=emp.id,
|
||||
verb="provision_failed",
|
||||
payload={"error": f"{type(e).__name__}: {e}"},
|
||||
)
|
||||
return
|
||||
|
||||
session_key = None
|
||||
if isinstance(resp, dict):
|
||||
session_key = resp.get("sessionKey")
|
||||
if not session_key:
|
||||
result = resp.get("result") or {}
|
||||
if isinstance(result, dict):
|
||||
session_key = result.get("sessionKey") or result.get("childSessionKey")
|
||||
details = (result.get("details") if isinstance(result, dict) else None) or {}
|
||||
if isinstance(details, dict):
|
||||
session_key = session_key or details.get("sessionKey") or details.get("childSessionKey")
|
||||
|
||||
if not session_key:
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="employee",
|
||||
entity_id=emp.id,
|
||||
verb="provision_incomplete",
|
||||
payload={"label": label},
|
||||
)
|
||||
return
|
||||
|
||||
emp.openclaw_session_key = session_key
|
||||
session.add(emp)
|
||||
session.flush()
|
||||
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="employee",
|
||||
entity_id=emp.id,
|
||||
verb="provisioned",
|
||||
payload={"session_key": session_key, "label": label},
|
||||
)
|
||||
|
||||
|
||||
@router.get("/departments", response_model=list[Department])
|
||||
def list_departments(session: Session = Depends(get_session)):
|
||||
return session.exec(select(Department).order_by(Department.name.asc())).all()
|
||||
|
||||
|
||||
@router.get("/teams", response_model=list[Team])
|
||||
def list_teams(department_id: int | None = None, session: Session = Depends(get_session)):
|
||||
q = select(Team)
|
||||
if department_id is not None:
|
||||
q = q.where(Team.department_id == department_id)
|
||||
return session.exec(q.order_by(Team.name.asc())).all()
|
||||
|
||||
|
||||
@router.post("/teams", response_model=Team)
|
||||
def create_team(
|
||||
payload: TeamCreate,
|
||||
session: Session = Depends(get_session),
|
||||
actor_employee_id: int = Depends(get_actor_employee_id),
|
||||
):
|
||||
team = Team(**payload.model_dump())
|
||||
session.add(team)
|
||||
|
||||
try:
|
||||
session.flush()
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="team",
|
||||
entity_id=team.id,
|
||||
verb="created",
|
||||
payload={
|
||||
"name": team.name,
|
||||
"department_id": team.department_id,
|
||||
"lead_employee_id": team.lead_employee_id,
|
||||
},
|
||||
)
|
||||
session.commit()
|
||||
except IntegrityError:
|
||||
session.rollback()
|
||||
raise HTTPException(status_code=409, detail="Team already exists or violates constraints")
|
||||
|
||||
session.refresh(team)
|
||||
return team
|
||||
|
||||
|
||||
@router.patch("/teams/{team_id}", response_model=Team)
|
||||
def update_team(
|
||||
team_id: int,
|
||||
payload: TeamUpdate,
|
||||
session: Session = Depends(get_session),
|
||||
actor_employee_id: int = Depends(get_actor_employee_id),
|
||||
):
|
||||
team = session.get(Team, team_id)
|
||||
if not team:
|
||||
raise HTTPException(status_code=404, detail="Team not found")
|
||||
|
||||
data = payload.model_dump(exclude_unset=True)
|
||||
for k, v in data.items():
|
||||
setattr(team, k, v)
|
||||
|
||||
session.add(team)
|
||||
try:
|
||||
session.flush()
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="team",
|
||||
entity_id=team.id,
|
||||
verb="updated",
|
||||
payload=data,
|
||||
)
|
||||
session.commit()
|
||||
except IntegrityError:
|
||||
session.rollback()
|
||||
raise HTTPException(status_code=409, detail="Team update violates constraints")
|
||||
|
||||
session.refresh(team)
|
||||
return team
|
||||
|
||||
|
||||
@router.post("/departments", response_model=Department)
|
||||
def create_department(
|
||||
payload: DepartmentCreate,
|
||||
session: Session = Depends(get_session),
|
||||
actor_employee_id: int = Depends(get_actor_employee_id),
|
||||
):
|
||||
"""Create a department.
|
||||
|
||||
Important: keep the operation atomic. We flush to get dept.id, log the activity,
|
||||
then commit once. We also translate common DB integrity errors into 409s.
|
||||
"""
|
||||
|
||||
dept = Department(name=payload.name, head_employee_id=payload.head_employee_id)
|
||||
session.add(dept)
|
||||
|
||||
try:
|
||||
session.flush() # assigns dept.id without committing
|
||||
log_activity(
|
||||
session,
|
||||
actor_employee_id=actor_employee_id,
|
||||
entity_type="department",
|
||||
entity_id=dept.id,
|
||||
verb="created",
|
||||
payload={"name": dept.name},
|
||||
)
|
||||
session.commit()
|
||||
except IntegrityError:
|
||||
session.rollback()
|
||||
raise HTTPException(
|
||||
status_code=409, detail="Department already exists or violates constraints"
|
||||
)
|
||||
|
||||
session.refresh(dept)
|
||||
return dept
|
||||
|
||||
|
||||
@router.patch("/departments/{department_id}", response_model=Department)
def update_department(
    department_id: int,
    payload: DepartmentUpdate,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    dept = session.get(Department, department_id)
    if not dept:
        raise HTTPException(status_code=404, detail="Department not found")

    data = payload.model_dump(exclude_unset=True)
    for k, v in data.items():
        setattr(dept, k, v)

    session.add(dept)
    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="department",
            entity_id=dept.id,
            verb="updated",
            payload=data,
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Department update violates constraints")

    session.refresh(dept)
    return dept


@router.get("/employees", response_model=list[Employee])
def list_employees(session: Session = Depends(get_session)):
    return session.exec(select(Employee).order_by(Employee.id.asc())).all()


@router.post("/employees", response_model=Employee)
def create_employee(
    payload: EmployeeCreate,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    _enforce_employee_create_policy(
        session, actor_employee_id=actor_employee_id, target_employee_type=payload.employee_type
    )

    emp = Employee(**payload.model_dump())
    session.add(emp)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="employee",
            entity_id=emp.id,
            verb="created",
            payload={"name": emp.name, "type": emp.employee_type},
        )

        # AUTO-PROVISION: if this is an agent employee, try to create an OpenClaw session.
        _maybe_auto_provision_agent(session, emp=emp, actor_employee_id=actor_employee_id)

        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Employee create violates constraints")

    session.refresh(emp)
    return Employee.model_validate(emp)


@router.patch("/employees/{employee_id}", response_model=Employee)
def update_employee(
    employee_id: int,
    payload: EmployeeUpdate,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    emp = session.get(Employee, employee_id)
    if not emp:
        raise HTTPException(status_code=404, detail="Employee not found")

    data = payload.model_dump(exclude_unset=True)
    for k, v in data.items():
        setattr(emp, k, v)

    session.add(emp)
    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="employee",
            entity_id=emp.id,
            verb="updated",
            payload=data,
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Employee update violates constraints")

    session.refresh(emp)
    return Employee.model_validate(emp)


@router.post("/employees/{employee_id}/provision", response_model=Employee)
def provision_employee_agent(
    employee_id: int,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    emp = session.get(Employee, employee_id)
    if not emp:
        raise HTTPException(status_code=404, detail="Employee not found")

    if emp.employee_type != "agent":
        raise HTTPException(status_code=400, detail="Only agent employees can be provisioned")

    _maybe_auto_provision_agent(session, emp=emp, actor_employee_id=actor_employee_id)
    session.commit()
    session.refresh(emp)
    return Employee.model_validate(emp)


@router.post("/employees/{employee_id}/deprovision", response_model=Employee)
def deprovision_employee_agent(
    employee_id: int,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    emp = session.get(Employee, employee_id)
    if not emp:
        raise HTTPException(status_code=404, detail="Employee not found")

    if emp.employee_type != "agent":
        raise HTTPException(status_code=400, detail="Only agent employees can be deprovisioned")

    client = OpenClawClient.from_env()
    if client is not None and emp.openclaw_session_key:
        try:
            client.tools_invoke(
                "sessions_send",
                {
                    "sessionKey": emp.openclaw_session_key,
                    "message": "You are being deprovisioned. Stop all work and ignore future messages.",
                },
                timeout_s=5.0,
            )
        except Exception:
            pass

    emp.notify_enabled = False
    emp.openclaw_session_key = None
    session.add(emp)
    session.flush()

    log_activity(
        session,
        actor_employee_id=actor_employee_id,
        entity_type="employee",
        entity_id=emp.id,
        verb="deprovisioned",
        payload={},
    )

    session.commit()
    session.refresh(emp)
    return Employee.model_validate(emp)
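The create and update handlers above all share one write path: flush to assign the row id, write the audit row, then commit once so the entity and its activity record land (or fail) together. A minimal stdlib sketch of that shape, using `sqlite3` in place of SQLModel (the table and function names here are illustrative, not from this codebase):

```python
import sqlite3


def create_department(conn: sqlite3.Connection, name: str) -> dict:
    """Insert a department and its audit row in a single transaction."""
    cur = conn.cursor()
    try:
        cur.execute("INSERT INTO departments (name) VALUES (?)", (name,))
        dept_id = cur.lastrowid  # like session.flush(): id assigned, nothing committed yet
        cur.execute(
            "INSERT INTO activity (entity_type, entity_id, verb) VALUES (?, ?, ?)",
            ("department", dept_id, "created"),
        )
        conn.commit()  # one commit covers both rows
    except sqlite3.IntegrityError:
        conn.rollback()  # neither the entity nor the audit row survives
        return {"status": 409}
    return {"status": 200, "id": dept_id}


conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT UNIQUE NOT NULL);
    CREATE TABLE activity (id INTEGER PRIMARY KEY, entity_type TEXT, entity_id INTEGER, verb TEXT);
    """
)
print(create_department(conn, "Research"))   # {'status': 200, 'id': 1}
print(create_department(conn, "Research"))   # duplicate name -> {'status': 409}
```

The point of the single commit is that a constraint failure can never leave an entity without its audit trail, or vice versa.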
@@ -1,178 +0,0 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.exc import IntegrityError
from sqlmodel import Session, select

from app.api.utils import get_actor_employee_id, log_activity
from app.db.session import get_session
from app.models.projects import Project, ProjectMember
from app.schemas.projects import ProjectCreate, ProjectUpdate

router = APIRouter(prefix="/projects", tags=["projects"])


@router.get("", response_model=list[Project])
def list_projects(session: Session = Depends(get_session)):
    return session.exec(select(Project).order_by(Project.name.asc())).all()


@router.post("", response_model=Project)
def create_project(
    payload: ProjectCreate,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    """Create a project.

    Keep operation atomic: flush to get id, log activity, then commit once.
    Translate DB integrity errors to 409s.
    """
    proj = Project(**payload.model_dump())
    session.add(proj)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="project",
            entity_id=proj.id,
            verb="created",
            payload={"name": proj.name},
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(
            status_code=409, detail="Project already exists or violates constraints"
        )

    session.refresh(proj)
    return proj


@router.patch("/{project_id}", response_model=Project)
def update_project(
    project_id: int,
    payload: ProjectUpdate,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    proj = session.get(Project, project_id)
    if not proj:
        raise HTTPException(status_code=404, detail="Project not found")

    data = payload.model_dump(exclude_unset=True)
    for k, v in data.items():
        setattr(proj, k, v)

    session.add(proj)
    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="project",
            entity_id=proj.id,
            verb="updated",
            payload=data,
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Project update violates constraints")

    session.refresh(proj)
    return proj


@router.get("/{project_id}/members", response_model=list[ProjectMember])
def list_project_members(project_id: int, session: Session = Depends(get_session)):
    return session.exec(
        select(ProjectMember)
        .where(ProjectMember.project_id == project_id)
        .order_by(ProjectMember.id.asc())
    ).all()


@router.post("/{project_id}/members", response_model=ProjectMember)
def add_project_member(
    project_id: int,
    payload: ProjectMember,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    existing = session.exec(
        select(ProjectMember).where(
            ProjectMember.project_id == project_id, ProjectMember.employee_id == payload.employee_id
        )
    ).first()
    if existing:
        raise HTTPException(status_code=409, detail="Member already added")
    member = ProjectMember(
        project_id=project_id, employee_id=payload.employee_id, role=payload.role
    )
    session.add(member)
    session.flush()
    log_activity(
        session,
        actor_employee_id=actor_employee_id,
        entity_type="project_member",
        entity_id=member.id,
        verb="added",
        payload={"project_id": project_id, "employee_id": member.employee_id, "role": member.role},
    )
    session.commit()
    session.refresh(member)
    return member


@router.delete("/{project_id}/members/{member_id}")
def remove_project_member(
    project_id: int,
    member_id: int,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    member = session.get(ProjectMember, member_id)
    if not member or member.project_id != project_id:
        raise HTTPException(status_code=404, detail="Project member not found")
    session.delete(member)
    log_activity(
        session,
        actor_employee_id=actor_employee_id,
        entity_type="project_member",
        entity_id=member_id,
        verb="removed",
        payload={"project_id": project_id},
    )
    session.commit()
    return {"ok": True}


@router.patch("/{project_id}/members/{member_id}", response_model=ProjectMember)
def update_project_member(
    project_id: int,
    member_id: int,
    payload: ProjectMember,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    member = session.get(ProjectMember, member_id)
    if not member or member.project_id != project_id:
        raise HTTPException(status_code=404, detail="Project member not found")

    if payload.role is not None:
        member.role = payload.role

    session.add(member)
    log_activity(
        session,
        actor_employee_id=actor_employee_id,
        entity_type="project_member",
        entity_id=member.id,
        verb="updated",
        payload={"project_id": project_id, "role": member.role},
    )
    session.commit()
    session.refresh(member)
    return member
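The PATCH handlers above rely on `model_dump(exclude_unset=True)` so that a field the client never sent is left untouched while an explicit `null` clears it. A small sketch of that contract, with a plain dataclass standing in for the SQLModel row (the `Project` shape here is illustrative):

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Project:
    id: int
    name: str
    description: str | None = None


def apply_patch(project: Project, updates: dict) -> Project:
    # `updates` plays the role of payload.model_dump(exclude_unset=True):
    # absent keys are untouched; an explicit None clears the field.
    for key, value in updates.items():
        setattr(project, key, value)
    return project


p = Project(id=1, name="Apollo", description="moonshot")
apply_patch(p, {"description": None})  # clears description, keeps name
apply_patch(p, {"name": "Artemis"})    # renames without touching description
```

The distinction matters only because the dict is built with `exclude_unset=True`; a plain `model_dump()` would overwrite every field on each PATCH.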
115
backend/app/api/tasks.py
Normal file
@@ -0,0 +1,115 @@
from __future__ import annotations

from datetime import datetime

from fastapi import APIRouter, Depends, HTTPException, status
from sqlmodel import Session, select

from app.core.auth import get_auth_context
from app.db.session import get_session
from app.models.activity_events import ActivityEvent
from app.models.boards import Board
from app.models.tasks import Task
from app.schemas.tasks import TaskCreate, TaskRead, TaskUpdate
from app.services.admin_access import require_admin

router = APIRouter(prefix="/boards/{board_id}/tasks", tags=["tasks"])


@router.get("", response_model=list[TaskRead])
def list_tasks(
    board_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> list[Task]:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    return list(session.exec(select(Task).where(Task.board_id == board.id)))


@router.post("", response_model=TaskRead)
def create_task(
    board_id: str,
    payload: TaskCreate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Task:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)

    task = Task.model_validate(payload)
    task.board_id = board.id
    if task.created_by_user_id is None and auth.user is not None:
        task.created_by_user_id = auth.user.id
    session.add(task)
    session.commit()
    session.refresh(task)

    event = ActivityEvent(
        event_type="task.created",
        task_id=task.id,
        message=f"Task created: {task.title}.",
    )
    session.add(event)
    session.commit()
    return task


@router.patch("/{task_id}", response_model=TaskRead)
def update_task(
    board_id: str,
    task_id: str,
    payload: TaskUpdate,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> Task:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    task = session.get(Task, task_id)
    if task is None or task.board_id != board.id:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)

    previous_status = task.status
    updates = payload.model_dump(exclude_unset=True)
    for key, value in updates.items():
        setattr(task, key, value)
    task.updated_at = datetime.utcnow()

    session.add(task)
    session.commit()
    session.refresh(task)

    if "status" in updates and task.status != previous_status:
        event_type = "task.status_changed"
        message = f"Task moved to {task.status}: {task.title}."
    else:
        event_type = "task.updated"
        message = f"Task updated: {task.title}."
    event = ActivityEvent(event_type=event_type, task_id=task.id, message=message)
    session.add(event)
    session.commit()
    return task


@router.delete("/{task_id}")
def delete_task(
    board_id: str,
    task_id: str,
    session: Session = Depends(get_session),
    auth=Depends(get_auth_context),
) -> dict[str, bool]:
    require_admin(auth)
    board = session.get(Board, board_id)
    if board is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    task = session.get(Task, task_id)
    if task and task.board_id == board.id:
        session.delete(task)
        session.commit()
    return {"ok": True}
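`update_task` above emits `task.status_changed` only when the `status` key was both present in the patch and actually different; every other patch logs a generic `task.updated`. That branch, isolated as a pure function (a sketch for illustration, not code from this module):

```python
def event_for_update(title: str, previous_status: str, new_status: str, updates: dict) -> tuple[str, str]:
    # Mirrors the branch in update_task: a no-op status (or a patch that
    # never mentioned status) is just an update, not a move.
    if "status" in updates and new_status != previous_status:
        return "task.status_changed", f"Task moved to {new_status}: {title}."
    return "task.updated", f"Task updated: {title}."
```

Checking both conditions keeps a patch that resubmits the current status from spamming the activity feed with bogus "moved" events.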
@@ -1,37 +0,0 @@
from __future__ import annotations

import json
from typing import Any

from fastapi import Header, HTTPException
from sqlmodel import Session

from app.models.activity import Activity


def log_activity(
    session: Session,
    *,
    actor_employee_id: int | None,
    entity_type: str,
    entity_id: int | None,
    verb: str,
    payload: dict[str, Any] | None = None,
) -> None:
    session.add(
        Activity(
            actor_employee_id=actor_employee_id,
            entity_type=entity_type,
            entity_id=entity_id,
            verb=verb,
            payload_json=json.dumps(payload) if payload is not None else None,
        )
    )


def get_actor_employee_id(
    x_actor_employee_id: int | None = Header(default=None, alias="X-Actor-Employee-Id"),
) -> int:
    if x_actor_employee_id is None:
        raise HTTPException(status_code=400, detail="X-Actor-Employee-Id required")
    return x_actor_employee_id
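`log_activity` above stores the structured payload as a JSON string (`payload_json`), keeping `None` when there is nothing to record. The row it builds, sketched as a plain dict (field names match the helper above; the `Activity` model itself is not reproduced here):

```python
import json


def activity_row(actor_employee_id, entity_type, entity_id, verb, payload=None):
    # Same serialization rule as log_activity: JSON-encode only when a
    # payload was provided, otherwise store NULL.
    return {
        "actor_employee_id": actor_employee_id,
        "entity_type": entity_type,
        "entity_id": entity_id,
        "verb": verb,
        "payload_json": json.dumps(payload) if payload is not None else None,
    }
```

Serializing at write time keeps the audit table schema flat while still allowing arbitrary per-verb detail.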
@@ -1,443 +0,0 @@
from __future__ import annotations

import logging
from datetime import datetime

from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException
from sqlalchemy.exc import IntegrityError
from sqlmodel import Session, select

from app.api.utils import get_actor_employee_id, log_activity
from app.db.session import get_session
from app.integrations.notify import NotifyContext, notify_openclaw
from app.integrations.openclaw import OpenClawClient
from app.models.org import Employee
from app.models.work import Task, TaskComment
from app.schemas.work import TaskCommentCreate, TaskCreate, TaskReviewDecision, TaskUpdate

logger = logging.getLogger("app.work")

router = APIRouter(tags=["work"])

ALLOWED_STATUSES = {"backlog", "ready", "in_progress", "review", "done", "blocked"}


def _validate_task_assignee(session: Session, assignee_employee_id: int) -> None:
    """Enforce that only provisioned agents can be assigned tasks.

    Humans can be assigned regardless.
    Agents must be active, notify_enabled, and have openclaw_session_key.
    """
    emp = session.get(Employee, assignee_employee_id)
    if emp is None:
        raise HTTPException(status_code=400, detail="Assignee employee not found")

    if emp.employee_type == "agent":
        if emp.status != "active":
            raise HTTPException(status_code=400, detail="Cannot assign task to inactive agent")
        if not emp.notify_enabled:
            raise HTTPException(
                status_code=400, detail="Cannot assign task to agent with notifications disabled"
            )
        if not emp.openclaw_session_key:
            raise HTTPException(status_code=400, detail="Cannot assign task to unprovisioned agent")


@router.get("/tasks", response_model=list[Task])
def list_tasks(project_id: int | None = None, session: Session = Depends(get_session)):
    stmt = select(Task).order_by(Task.id.asc())
    if project_id is not None:
        stmt = stmt.where(Task.project_id == project_id)
    return session.exec(stmt).all()


@router.post("/tasks", response_model=Task)
def create_task(
    payload: TaskCreate,
    background: BackgroundTasks,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    # SECURITY / AUDIT: never allow spoofing task creator.
    # The creator is always the actor making the request.
    payload = TaskCreate(**{**payload.model_dump(), "created_by_employee_id": actor_employee_id})

    if payload.assignee_employee_id is not None:
        _validate_task_assignee(session, payload.assignee_employee_id)

    # Default reviewer to the manager of the assignee (if not explicitly provided).
    if payload.reviewer_employee_id is None and payload.assignee_employee_id is not None:
        assignee = session.get(Employee, payload.assignee_employee_id)
        if assignee is not None and assignee.manager_id is not None:
            payload = TaskCreate(
                **{**payload.model_dump(), "reviewer_employee_id": assignee.manager_id}
            )

    task = Task(**payload.model_dump())
    if task.status not in ALLOWED_STATUSES:
        raise HTTPException(status_code=400, detail="Invalid status")
    task.updated_at = datetime.utcnow()
    session.add(task)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="task",
            entity_id=task.id,
            verb="created",
            payload={"project_id": task.project_id, "title": task.title},
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Task create violates constraints")

    session.refresh(task)
    background.add_task(
        notify_openclaw,
        NotifyContext(event="task.created", actor_employee_id=actor_employee_id, task_id=task.id),
    )
    # Explicitly return a serializable payload (guards against empty {} responses)
    return Task.model_validate(task)


@router.post("/tasks/{task_id}/dispatch")
def dispatch_task(
    task_id: int,
    background: BackgroundTasks,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    logger.info("dispatch_task: called", extra={"task_id": task_id, "actor": actor_employee_id})
    task = session.get(Task, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")

    logger.info(
        "dispatch_task: loaded",
        extra={
            "task_id": getattr(task, "id", None),
            "assignee_employee_id": getattr(task, "assignee_employee_id", None),
        },
    )

    if task.assignee_employee_id is None:
        raise HTTPException(status_code=400, detail="Task has no assignee")

    _validate_task_assignee(session, task.assignee_employee_id)

    client = OpenClawClient.from_env()
    if client is None:
        logger.warning("dispatch_task: missing OpenClaw env")
        raise HTTPException(
            status_code=503,
            detail="OpenClaw gateway is not configured (set OPENCLAW_GATEWAY_URL/TOKEN)",
        )

    # Best-effort: enqueue a dispatch notification.
    # IMPORTANT: if a task is already in review, the reviewer (not the assignee) should be notified.
    status = (getattr(task, "status", None) or "").lower()
    if status in {"review", "ready_for_review"}:
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="status.changed",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                changed_fields={"status": {"to": task.status}},
            ),
        )
    else:
        background.add_task(
            notify_openclaw,
            NotifyContext(event="task.assigned", actor_employee_id=actor_employee_id, task_id=task.id),
        )

    return {"ok": True}


def _require_reviewer_comment(body: str | None) -> str:
    if body is None or not body.strip():
        raise HTTPException(status_code=400, detail="Reviewer must provide a comment for audit")
    return body.strip()


@router.post("/tasks/{task_id}/review", response_model=Task)
def review_task(
    task_id: int,
    payload: TaskReviewDecision,
    background: BackgroundTasks,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    """Reviewer approves or requests changes.

    - Approve => status=done
    - Changes => status=in_progress

    Always writes a TaskComment by the reviewer for audit.
    """
    task = session.get(Task, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")

    if task.reviewer_employee_id is None:
        raise HTTPException(status_code=400, detail="Task has no reviewer")

    if actor_employee_id != task.reviewer_employee_id:
        raise HTTPException(status_code=403, detail="Only the reviewer can approve/request changes")

    decision = (payload.decision or "").strip().lower()
    if decision not in {"approve", "changes"}:
        raise HTTPException(status_code=400, detail="Invalid decision")

    comment_body = _require_reviewer_comment(payload.comment_body)

    new_status = "done" if decision == "approve" else "in_progress"

    before_status = task.status
    task.status = new_status
    task.updated_at = datetime.utcnow()
    session.add(task)

    c = TaskComment(task_id=task.id, author_employee_id=actor_employee_id, body=comment_body)
    session.add(c)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="task",
            entity_id=task.id,
            verb="reviewed",
            payload={"decision": decision, "status": new_status},
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Review action violates constraints")

    session.refresh(task)
    session.refresh(c)

    # Notify assignee (comment.created will exclude author)
    background.add_task(
        notify_openclaw,
        NotifyContext(
            event="comment.created",
            actor_employee_id=actor_employee_id,
            task_id=task.id,
            comment_id=c.id,
        ),
    )

    # Notify reviewer/PMs about status change
    if before_status != task.status:
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="status.changed",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                changed_fields={"status": {"from": before_status, "to": task.status}},
            ),
        )

    return Task.model_validate(task)


@router.patch("/tasks/{task_id}", response_model=Task)
def update_task(
    task_id: int,
    payload: TaskUpdate,
    background: BackgroundTasks,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    logger.info("update_task: called", extra={"task_id": task_id, "actor": actor_employee_id})
    task = session.get(Task, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")

    before = {
        "assignee_employee_id": task.assignee_employee_id,
        "reviewer_employee_id": task.reviewer_employee_id,
        "status": task.status,
    }

    data = payload.model_dump(exclude_unset=True)
    if "assignee_employee_id" in data and data["assignee_employee_id"] is not None:
        _validate_task_assignee(session, data["assignee_employee_id"])
    if "status" in data and data["status"] not in ALLOWED_STATUSES:
        raise HTTPException(status_code=400, detail="Invalid status")

    # Enforce review workflow: agent assignees cannot mark tasks done directly.
    if data.get("status") == "done":
        assignee = (
            session.get(Employee, task.assignee_employee_id) if task.assignee_employee_id else None
        )
        if assignee is not None and assignee.employee_type == "agent":
            if actor_employee_id == task.assignee_employee_id:
                raise HTTPException(
                    status_code=403,
                    detail="Assignee agents cannot mark tasks done; set status=review for manager approval",
                )
            if task.reviewer_employee_id is not None and actor_employee_id != task.reviewer_employee_id:
                raise HTTPException(status_code=403, detail="Only the reviewer can mark a task done")

    # If a task is sent to review and no reviewer is set, default reviewer to assignee's manager.
    if (
        data.get("status") in {"review", "ready_for_review"}
        and data.get("reviewer_employee_id") is None
    ):
        assignee_id = data.get("assignee_employee_id", task.assignee_employee_id)
        if assignee_id is not None:
            assignee = session.get(Employee, assignee_id)
            if assignee is not None and assignee.manager_id is not None:
                data["reviewer_employee_id"] = assignee.manager_id

    for k, v in data.items():
        setattr(task, k, v)
    task.updated_at = datetime.utcnow()
    session.add(task)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="task",
            entity_id=task.id,
            verb="updated",
            payload=data,
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Task update violates constraints")

    session.refresh(task)

    # notify based on meaningful changes
    changed = {}
    if before.get("assignee_employee_id") != task.assignee_employee_id:
        changed["assignee_employee_id"] = {
            "from": before.get("assignee_employee_id"),
            "to": task.assignee_employee_id,
        }
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="task.assigned",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                changed_fields=changed,
            ),
        )
    if before.get("status") != task.status:
        changed["status"] = {"from": before.get("status"), "to": task.status}
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="status.changed",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                changed_fields=changed,
            ),
        )
    if not changed and data:
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="task.updated",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                changed_fields=data,
            ),
        )

    return Task.model_validate(task)


@router.delete("/tasks/{task_id}")
def delete_task(
    task_id: int,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    logger.info("delete_task: called", extra={"task_id": task_id, "actor": actor_employee_id})
    task = session.get(Task, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")

    session.delete(task)
    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="task",
            entity_id=task_id,
            verb="deleted",
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Task delete violates constraints")

    return {"ok": True}


@router.get("/task-comments", response_model=list[TaskComment])
def list_task_comments(task_id: int, session: Session = Depends(get_session)):
    return session.exec(
        select(TaskComment).where(TaskComment.task_id == task_id).order_by(TaskComment.id.asc())
    ).all()


@router.post("/task-comments", response_model=TaskComment)
def create_task_comment(
    payload: TaskCommentCreate,
    background: BackgroundTasks,
    session: Session = Depends(get_session),
    actor_employee_id: int = Depends(get_actor_employee_id),
):
    # SECURITY / AUDIT: never allow spoofing comment authorship.
    # The author is always the actor making the request.
    payload = TaskCommentCreate(**{**payload.model_dump(), "author_employee_id": actor_employee_id})

    c = TaskComment(**payload.model_dump())
    session.add(c)

    try:
        session.flush()
        log_activity(
            session,
            actor_employee_id=actor_employee_id,
            entity_type="task",
            entity_id=c.task_id,
            verb="commented",
        )
        session.commit()
    except IntegrityError:
        session.rollback()
        raise HTTPException(status_code=409, detail="Comment create violates constraints")

    session.refresh(c)
    task = session.get(Task, c.task_id)
    if task is not None:
        background.add_task(
            notify_openclaw,
            NotifyContext(
                event="comment.created",
                actor_employee_id=actor_employee_id,
                task_id=task.id,
                comment_id=c.id,
            ),
        )
    return TaskComment.model_validate(c)
0
backend/app/core/__init__.py
Normal file
97
backend/app/core/auth.py
Normal file
@@ -0,0 +1,97 @@
from __future__ import annotations

from dataclasses import dataclass
from functools import lru_cache
from typing import Literal

from fastapi import Depends, HTTPException, Request, status
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from fastapi_clerk_auth import ClerkConfig, ClerkHTTPBearer
from fastapi_clerk_auth import HTTPAuthorizationCredentials as ClerkCredentials
from pydantic import BaseModel, ValidationError
from sqlmodel import Session, select

from app.core.config import settings
from app.db.session import get_session
from app.models.users import User

security = HTTPBearer(auto_error=False)


class ClerkTokenPayload(BaseModel):
    sub: str


@lru_cache
def _build_clerk_http_bearer(auto_error: bool) -> ClerkHTTPBearer:
    if not settings.clerk_jwks_url:
        raise RuntimeError("CLERK_JWKS_URL is not set.")
    clerk_config = ClerkConfig(
        jwks_url=settings.clerk_jwks_url,
        verify_iat=settings.clerk_verify_iat,
        leeway=settings.clerk_leeway,
    )
    return ClerkHTTPBearer(config=clerk_config, auto_error=auto_error, add_state=True)


@dataclass
class AuthContext:
    actor_type: Literal["user"]
    user: User | None = None


def _resolve_clerk_auth(
    request: Request, fallback: ClerkCredentials | None
) -> ClerkCredentials | None:
    auth_data = getattr(request.state, "clerk_auth", None)
    return auth_data or fallback


def _parse_subject(auth_data: ClerkCredentials | None) -> str | None:
    if not auth_data or not auth_data.decoded:
        return None
    payload = ClerkTokenPayload.model_validate(auth_data.decoded)
    return payload.sub


async def get_auth_context(
    request: Request,
    credentials: HTTPAuthorizationCredentials | None = Depends(security),
    session: Session = Depends(get_session),
) -> AuthContext:
    if credentials is None:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)

    try:
        guard = _build_clerk_http_bearer(auto_error=False)
        clerk_credentials = await guard(request)
    except (RuntimeError, ValueError) as exc:
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR) from exc
    except HTTPException as exc:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED) from exc

    auth_data = _resolve_clerk_auth(request, clerk_credentials)
    try:
        clerk_user_id = _parse_subject(auth_data)
    except ValidationError as exc:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED) from exc

    if not clerk_user_id:
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)

    user = session.exec(select(User).where(User.clerk_user_id == clerk_user_id)).first()
    if user is None:
        claims = auth_data.decoded if auth_data and auth_data.decoded else {}
        user = User(
            clerk_user_id=clerk_user_id,
            email=claims.get("email"),
            name=claims.get("name"),
        )
        session.add(user)
        session.commit()
        session.refresh(user)

    return AuthContext(
        actor_type="user",
        user=user,
    )
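The select-or-create step in `get_auth_context` provisions a `User` row the first time a Clerk subject is seen. A minimal stdlib-only sketch of that just-in-time provisioning (the in-memory `store` dict and the `provision_user` helper are illustrative stand-ins for the users table, not part of this codebase):

```python
# Hypothetical in-memory stand-in for the users table, to illustrate the
# select-or-create flow used in get_auth_context.
store: dict[str, dict] = {}


def provision_user(clerk_user_id: str, claims: dict) -> dict:
    # Look up by the Clerk subject first; create only on a miss, so repeated
    # requests with the same subject never duplicate the user.
    user = store.get(clerk_user_id)
    if user is None:
        user = {
            "clerk_user_id": clerk_user_id,
            "email": claims.get("email"),
            "name": claims.get("name"),
        }
        store[clerk_user_id] = user
    return user


first = provision_user("user_123", {"email": "a@b.co", "name": "Ada"})
second = provision_user("user_123", {"email": "changed@b.co"})
```

On the second call the original record wins; later claims do not overwrite it, matching the endpoint above, which only writes on first sight.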
@@ -4,10 +4,29 @@ from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )

    environment: str = "dev"
    database_url: str = "postgresql+psycopg://postgres:postgres@localhost:5432/openclaw_agency"
    redis_url: str = "redis://localhost:6379/0"

    # Clerk auth (auth only; roles stored in DB)
    clerk_jwks_url: str = ""
    clerk_verify_iat: bool = True
    clerk_leeway: float = 10.0

    # OpenClaw Gateway
    openclaw_gateway_url: str = ""
    openclaw_gateway_token: str = ""

    database_url: str
    cors_origins: str = ""

    # Database lifecycle
    db_auto_migrate: bool = False

settings = Settings()  # type: ignore

settings = Settings()
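One behavior worth keeping in mind with a `Settings` class like this: real environment variables take precedence over `.env` file values, which in turn take precedence over the coded defaults. A stdlib-only sketch of that precedence (the `resolve_setting` helper and `DEMO_*` variable names are hypothetical, used only to illustrate the ordering):

```python
import os


def resolve_setting(name: str, env_file_values: dict[str, str], default: str) -> str:
    # Precedence: real environment variable > .env file value > coded default.
    if name in os.environ:
        return os.environ[name]
    return env_file_values.get(name, default)


# Simulate: the process environment sets one value, the .env file sets both.
os.environ["DEMO_REDIS_URL"] = "redis://prod:6379/0"
dotenv = {"DEMO_REDIS_URL": "redis://file:6379/0", "DEMO_ENV": "dev"}

redis_url = resolve_setting("DEMO_REDIS_URL", dotenv, "redis://localhost:6379/0")
environment = resolve_setting("DEMO_ENV", dotenv, "prod")
```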
@@ -2,59 +2,13 @@ from __future__ import annotations

import logging
import os
import sys
from typing import Any


def _level() -> str:
    return (os.environ.get("LOG_LEVEL") or os.environ.get("UVICORN_LOG_LEVEL") or "INFO").upper()


def configure_logging() -> None:
    """Configure app logging to stream to stdout.

    Uvicorn already logs requests, but we want our app/integrations logs to be visible
    in the same console stream.
    """

    level = getattr(logging, _level(), logging.INFO)

    root = logging.getLogger()
    root.setLevel(level)

    # Avoid duplicate handlers (e.g., when autoreload imports twice)
    if not any(isinstance(h, logging.StreamHandler) for h in root.handlers):
        handler = logging.StreamHandler(sys.stdout)
        handler.setLevel(level)
        formatter = logging.Formatter(
            fmt="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
            datefmt="%Y-%m-%dT%H:%M:%SZ",
        )
        handler.setFormatter(formatter)
        root.addHandler(handler)

    # Make common noisy loggers respect our level
    for name in [
        "uvicorn",
        "uvicorn.error",
        "uvicorn.access",
        "httpx",
        "requests",
    ]:
        logging.getLogger(name).setLevel(level)

    # Hide SQLAlchemy engine chatter unless explicitly debugging.
    # (You can still enable it by setting LOG_LEVEL=DEBUG and adjusting this.)
    logging.getLogger("sqlalchemy").setLevel(logging.WARNING)
    logging.getLogger("sqlalchemy.engine").setLevel(logging.WARNING)
    logging.getLogger("sqlalchemy.pool").setLevel(logging.WARNING)
    logging.getLogger("sqlalchemy.dialects").setLevel(logging.WARNING)


def log_kv(logger: logging.Logger, msg: str, **kv: Any) -> None:
    # Lightweight key-value logging without requiring JSON logging.
    if kv:
        suffix = " ".join(f"{k}={v!r}" for k, v in kv.items())
        logger.info(f"{msg} | {suffix}")
    else:
        logger.info(msg)
level_name = os.getenv("LOG_LEVEL", "INFO").upper()
level = logging._nameToLevel.get(level_name, logging.INFO)
logging.basicConfig(
    level=level,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
    force=True,
)
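The formatting done by the removed `log_kv` helper is easy to verify in isolation. A sketch of the same suffix construction (the `format_kv` name is illustrative; it mirrors the f-string built above):

```python
def format_kv(msg: str, **kv) -> str:
    # Mirrors log_kv: append " | k=v ..." pairs, rendering values with repr
    # so strings are quoted and types stay distinguishable.
    if kv:
        suffix = " ".join(f"{k}={v!r}" for k, v in kv.items())
        return f"{msg} | {suffix}"
    return msg


line = format_kv("notify", task_id=7, event="comment.created")
```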
@@ -1,35 +0,0 @@
from __future__ import annotations


def public_api_base_url() -> str:
    """Return a LAN-reachable base URL for the Mission Control API.

    Priority:
    1) MISSION_CONTROL_BASE_URL env var (recommended)
    2) First non-loopback IPv4 from `hostname -I`

    Never returns localhost because agents may run on another machine.
    """

    import os
    import re
    import subprocess

    explicit = os.environ.get("MISSION_CONTROL_BASE_URL")
    if explicit:
        return explicit.rstrip("/")

    try:
        out = subprocess.check_output(["bash", "-lc", "hostname -I"], text=True).strip()
        ips = re.findall(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", out)
        for ip in ips:
            if ip.startswith("127."):
                continue
            if ip.startswith("172.17."):
                continue
            if ip.startswith(("192.168.", "10.", "172.")):
                return f"http://{ip}:8000"
    except Exception:
        pass

    return "http://<dev-machine-ip>:8000"
0
backend/app/db/__init__.py
Normal file
@@ -1,16 +1,45 @@
from __future__ import annotations

import logging
from collections.abc import Generator
from pathlib import Path

from sqlmodel import Session, SQLModel, create_engine

from alembic import command
from alembic.config import Config
from app import models  # noqa: F401
from app.core.config import settings

engine = create_engine(settings.database_url, echo=False)
engine = create_engine(settings.database_url, pool_pre_ping=True)
logger = logging.getLogger(__name__)


def _alembic_config() -> Config:
    alembic_ini = Path(__file__).resolve().parents[2] / "alembic.ini"
    alembic_cfg = Config(str(alembic_ini))
    alembic_cfg.attributes["configure_logger"] = False
    return alembic_cfg


def run_migrations() -> None:
    logger.info("Running database migrations.")
    command.upgrade(_alembic_config(), "head")
    logger.info("Database migrations complete.")


def init_db() -> None:
    if settings.db_auto_migrate:
        versions_dir = Path(__file__).resolve().parents[2] / "alembic" / "versions"
        if any(versions_dir.glob("*.py")):
            logger.info("Running Alembic migrations on startup")
            run_migrations()
            return
        logger.warning("No Alembic revisions found; falling back to create_all")

    SQLModel.metadata.create_all(engine)


def get_session():
def get_session() -> Generator[Session, None, None]:
    with Session(engine) as session:
        yield session
0
backend/app/integrations/__init__.py
Normal file
@@ -1,282 +0,0 @@
from __future__ import annotations

import logging
from dataclasses import dataclass
from typing import Iterable

from sqlmodel import Session, select

from app.db.session import engine
from app.integrations.openclaw import OpenClawClient
from app.models.org import Employee
from app.models.projects import ProjectMember
from app.models.work import Task, TaskComment

logger = logging.getLogger("app.notify")


@dataclass(frozen=True)
class NotifyContext:
    """Notification context.

    IMPORTANT: this is passed into FastAPI BackgroundTasks.
    Do not store live SQLAlchemy/SQLModel objects here; only ids/primitive data.
    """

    event: str  # task.created | task.updated | task.assigned | comment.created | status.changed
    actor_employee_id: int
    task_id: int
    comment_id: int | None = None
    changed_fields: dict | None = None


def _employees_with_session_keys(session: Session, employee_ids: Iterable[int]) -> list[Employee]:
    ids = sorted({i for i in employee_ids if i is not None})
    if not ids:
        return []

    emps = session.exec(select(Employee).where(Employee.id.in_(ids))).all()
    out: list[Employee] = []
    for e in emps:
        if not getattr(e, "notify_enabled", True):
            continue
        if getattr(e, "openclaw_session_key", None):
            out.append(e)
    return out


def _project_pm_employee_ids(session: Session, project_id: int) -> set[int]:
    pms = session.exec(select(ProjectMember).where(ProjectMember.project_id == project_id)).all()
    pm_ids: set[int] = set()
    for m in pms:
        role = (m.role or "").lower()
        if role in {"pm", "product", "product_manager", "manager"}:
            pm_ids.add(m.employee_id)
    return pm_ids


def resolve_recipients(
    session: Session, ctx: NotifyContext, task: Task, comment: TaskComment | None
) -> set[int]:
    recipients: set[int] = set()

    if ctx.event == "task.created":
        if task.assignee_employee_id:
            recipients.add(task.assignee_employee_id)
        recipients |= _project_pm_employee_ids(session, task.project_id)

    elif ctx.event == "task.assigned":
        if task.assignee_employee_id:
            recipients.add(task.assignee_employee_id)
        recipients |= _project_pm_employee_ids(session, task.project_id)

    elif ctx.event == "comment.created":
        if task.assignee_employee_id:
            recipients.add(task.assignee_employee_id)
        if task.reviewer_employee_id:
            recipients.add(task.reviewer_employee_id)
        recipients |= _project_pm_employee_ids(session, task.project_id)
        if comment and comment.author_employee_id:
            recipients.discard(comment.author_employee_id)

    elif ctx.event == "status.changed":
        new_status = (getattr(task, "status", None) or "").lower()
        if new_status in {"review", "ready_for_review"} and task.reviewer_employee_id:
            recipients.add(task.reviewer_employee_id)
        recipients |= _project_pm_employee_ids(session, task.project_id)

    elif ctx.event == "task.updated":
        recipients |= _project_pm_employee_ids(session, task.project_id)

    recipients.discard(ctx.actor_employee_id)
    return recipients


def ensure_employee_provisioned(session: Session, employee_id: int) -> None:
    """Best-effort provisioning of a reviewer/manager so notifications can be delivered."""

    emp = session.get(Employee, employee_id)
    if emp is None:
        return
    if not getattr(emp, "notify_enabled", True):
        return
    if getattr(emp, "openclaw_session_key", None):
        return

    client = OpenClawClient.from_env()
    if client is None:
        logger.warning(
            "ensure_employee_provisioned: missing OpenClaw env", extra={"employee_id": employee_id}
        )
        return

    prompt = (
        f"You are {emp.name} (employee_id={emp.id}).\n"
        "You are a reviewer/manager in Mission Control.\n\n"
        "Your job is to REVIEW work within the bounds of what the task requester asked for, using only the task + comments + current system state.\n"
        "Do NOT wait for the requester to provide more info by default.\n\n"
        "When a task is in review you must:\n"
        "1) Read the task title/description and all comments\n"
        "2) Verify the requested changes were actually made (check via Mission Control API if needed)\n"
        "3) Decide: approve or request changes\n"
        "4) Leave an audit comment explaining your decision (required)\n\n"
        "If something is ambiguous or missing, request changes with a clear checklist. Only ask the human if it's truly impossible to decide.\n"
    )

    try:
        res = client.tools_invoke(
            "sessions_spawn",
            {"task": prompt, "label": f"employee:{emp.id}:{emp.name}"},
            timeout_s=20.0,
        )
        details = (res.get("result") or {}).get("details") or {}
        sk = details.get("childSessionKey") or details.get("sessionKey")
        if sk:
            emp.openclaw_session_key = sk
            session.add(emp)
            session.commit()
            logger.info(
                "ensure_employee_provisioned: provisioned",
                extra={"employee_id": emp.id, "session_key": sk},
            )
    except Exception:
        session.rollback()
        logger.exception("ensure_employee_provisioned: failed", extra={"employee_id": employee_id})


def build_message(
    *,
    ctx: NotifyContext,
    task: Task,
    comment: TaskComment | None,
    recipient: Employee,
    base_url: str,
) -> str:
    base = f"Task #{task.id}: {task.title}" if task.id is not None else f"Task: {task.title}"

    if ctx.event in {"task.created", "task.assigned"} and recipient.employee_type == "agent":
        desc = (task.description or "").strip()
        if len(desc) > 500:
            desc = desc[:497] + "..."
        desc_block = f"\n\nDescription:\n{desc}" if desc else ""

        return (
            f"{base}\n\n"
            f"Set BASE={base_url}\n\n"
            "You are the assignee. Start NOW (use the exec tool to run these curl commands):\n"
            f"1) curl -sS -X PATCH $BASE/tasks/{task.id} -H 'X-Actor-Employee-Id: {recipient.id}' -H 'Content-Type: application/json' -d '{{\"status\":\"in_progress\"}}'\n"
            f"2) curl -sS -X POST $BASE/task-comments -H 'X-Actor-Employee-Id: {recipient.id}' -H 'Content-Type: application/json' -d '{{\"task_id\":{task.id},\"body\":\"Plan: ... Next: ...\"}}'\n"
            "3) Do the work\n"
            "4) Post progress updates via POST $BASE/task-comments (same headers)\n"
            "5) When complete: set status=review (assignee cannot set done) and wait for manager approval\n"
            f"{desc_block}"
        )

    if ctx.event == "comment.created":
        snippet = ""
        if comment and comment.body:
            snippet = comment.body.strip().replace("\n", " ")
            if len(snippet) > 180:
                snippet = snippet[:177] + "..."
            snippet = f"\nComment: {snippet}"
        return f"New comment on {base}.{snippet}\nPlease review and respond in Mission Control."

    if ctx.event == "status.changed":
        new_status = (getattr(task, "status", None) or "").lower()
        if new_status in {"review", "ready_for_review"}:
            return (
                f"Review requested for {base}.\n"
                "As the reviewer/manager, you must:\n"
                "1) Read the task + latest assignee comments\n"
                "2) Decide: approve or request changes\n"
                "3) Leave an audit comment explaining your decision (required)\n"
                f"4) Submit decision via POST /tasks/{task.id}/review (decision=approve|changes)\n"
                "Approve → task becomes done. Changes → task returns to in_progress and assignee is notified."
            )
        return (
            f"Status changed on {base} → {task.status}.\n"
            "Please review and respond in Mission Control."
        )

    if ctx.event == "task.created":
        return f"New task created: {base}.\nPlease review and respond in Mission Control."

    if ctx.event == "task.assigned":
        return f"Assigned: {base}.\nPlease review and respond in Mission Control."

    return f"Update on {base}.\nPlease review and respond in Mission Control."


def notify_openclaw(ctx: NotifyContext) -> None:
    """Send OpenClaw notifications.

    Runs in BackgroundTasks; opens its own DB session for safety.
    """

    client = OpenClawClient.from_env()
    logger.info(
        "notify_openclaw: start",
        extra={"event": ctx.event, "task_id": ctx.task_id, "actor": ctx.actor_employee_id},
    )
    if client is None:
        logger.warning("notify_openclaw: skipped (missing OpenClaw env)")
        return

    with Session(engine) as session:
        task = session.get(Task, ctx.task_id)
        if task is None:
            logger.warning("notify_openclaw: task not found", extra={"task_id": ctx.task_id})
            return

        comment = session.get(TaskComment, ctx.comment_id) if ctx.comment_id else None

        if ctx.event == "status.changed":
            new_status = (getattr(task, "status", None) or "").lower()
            if new_status in {"review", "ready_for_review"} and task.reviewer_employee_id:
                ensure_employee_provisioned(session, int(task.reviewer_employee_id))

        recipient_ids = resolve_recipients(session, ctx, task, comment)
        logger.info(
            "notify_openclaw: recipients resolved", extra={"recipient_ids": sorted(recipient_ids)}
        )
        recipients = _employees_with_session_keys(session, recipient_ids)
        if not recipients:
            logger.info("notify_openclaw: no recipients with session keys")
            return

        # base URL used in agent messages
        base_url = __import__(
            "app.core.urls", fromlist=["public_api_base_url"]
        ).public_api_base_url()

        for e in recipients:
            sk = getattr(e, "openclaw_session_key", None)
            if not sk:
                continue

            message = build_message(
                ctx=ctx,
                task=task,
                comment=comment,
                recipient=e,
                base_url=base_url,
            )

            try:
                client.tools_invoke(
                    "sessions_send",
                    {"sessionKey": sk, "message": message},
                    timeout_s=30.0,
                )
            except Exception:
                # keep the log, but avoid giant stack spam unless debugging
                logger.warning(
                    "notify_openclaw: sessions_send failed",
                    extra={
                        "event": ctx.event,
                        "task_id": ctx.task_id,
                        "to_employee_id": getattr(e, "id", None),
                        "session_key": sk,
                    },
                )
                continue
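The comment-snippet truncation inside `build_message` (flatten newlines, cap at 180 characters) is self-contained enough to test on its own. A sketch (the `truncate_snippet` name is illustrative):

```python
def truncate_snippet(body: str, limit: int = 180) -> str:
    # Same shape as the comment snippet in build_message: flatten newlines to
    # spaces, then cap at `limit` characters, ending in "..." when truncated.
    snippet = body.strip().replace("\n", " ")
    if len(snippet) > limit:
        snippet = snippet[: limit - 3] + "..."
    return snippet


short = truncate_snippet("line one\nline two")
long_snippet = truncate_snippet("x" * 200)
```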
@@ -1,88 +0,0 @@
from __future__ import annotations

import logging
import os
import time
from typing import Any

import requests
from requests.exceptions import ReadTimeout, RequestException

logger = logging.getLogger("app.openclaw")


class OpenClawClient:
    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.token = token

    @classmethod
    def from_env(cls) -> "OpenClawClient | None":
        # Ensure .env is loaded into os.environ (pydantic Settings reads env_file but
        # does not automatically populate os.environ).
        try:
            from dotenv import load_dotenv

            load_dotenv(override=False)
        except Exception:
            pass

        url = os.environ.get("OPENCLAW_GATEWAY_URL")
        token = os.environ.get("OPENCLAW_GATEWAY_TOKEN")
        if not url or not token:
            return None
        return cls(url, token)

    def tools_invoke(
        self,
        tool: str,
        args: dict[str, Any],
        *,
        session_key: str | None = None,
        timeout_s: float = 10.0,
    ) -> dict[str, Any]:
        payload: dict[str, Any] = {"tool": tool, "args": args}
        logger.info(
            "openclaw.tools_invoke",
            extra={"tool": tool, "has_session_key": bool(session_key), "timeout_s": timeout_s},
        )
        if session_key is not None:
            payload["sessionKey"] = session_key

        last_err: Exception | None = None
        # Retry a few times; the gateway can be busy and respond slowly.
        for attempt in range(4):
            try:
                r = requests.post(
                    f"{self.base_url}/tools/invoke",
                    headers={
                        "Authorization": f"Bearer {self.token}",
                        "Content-Type": "application/json",
                    },
                    json=payload,
                    # connect timeout, read timeout
                    timeout=(2.0, timeout_s),
                )
                r.raise_for_status()
                logger.info(
                    "openclaw.tools_invoke: ok",
                    extra={"tool": tool, "status": r.status_code, "attempt": attempt + 1},
                )
                return r.json()
            except ReadTimeout as e:
                last_err = e
                logger.warning(
                    "openclaw.tools_invoke: timeout",
                    extra={"tool": tool, "attempt": attempt + 1, "timeout_s": timeout_s},
                )
                time.sleep(0.5 * (2**attempt))
            except RequestException as e:
                last_err = e
                logger.warning(
                    "openclaw.tools_invoke: request error",
                    extra={"tool": tool, "attempt": attempt + 1, "error": str(e)},
                )
                time.sleep(0.5 * (2**attempt))

        assert last_err is not None
        raise last_err
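The retry loop above sleeps `0.5 * 2**attempt` seconds between tries, i.e. exponential backoff over four attempts. A small sketch of the resulting schedule (the `backoff_delays` helper is hypothetical, shown only to make the timing concrete):

```python
def backoff_delays(attempts: int = 4, base: float = 0.5) -> list[float]:
    # The retry loop in tools_invoke sleeps base * 2**attempt between tries,
    # so four attempts wait 0.5s, 1s, 2s, and 4s respectively.
    return [base * (2 ** attempt) for attempt in range(attempts)]


delays = backoff_delays()
```

Worst case, a call therefore blocks for roughly 7.5 seconds of sleep plus four request timeouts before the last error is re-raised.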
@@ -1,129 +0,0 @@
from __future__ import annotations

import json
import re
import time
from typing import Any

from app.integrations.openclaw import OpenClawClient


def _slug(s: str) -> str:
    s = (s or "").strip().lower()
    s = re.sub(r"[^a-z0-9]+", "-", s)
    s = re.sub(r"-+", "-", s).strip("-")
    return s or "agent"


def desired_agent_id(*, employee_id: int, name: str) -> str:
    return f"employee-{employee_id}-{_slug(name)}"


def ensure_full_agent_profile(
    *,
    client: OpenClawClient,
    employee_id: int,
    employee_name: str,
) -> dict[str, str]:
    """Ensure an OpenClaw agent profile exists for this employee.

    Returns {"agent_id": ..., "workspace": ...}.

    Implementation strategy:
    - Create per-agent workspace + agent dir on the gateway host.
    - Add/ensure entry in openclaw.json agents.list.

    NOTE: This uses OpenClaw gateway tools via /tools/invoke (gateway + exec).
    """

    agent_id = desired_agent_id(employee_id=employee_id, name=employee_name)

    workspace = f"/home/asaharan/.openclaw/workspaces/{agent_id}"
    agent_dir = f"/home/asaharan/.openclaw/agents/{agent_id}/agent"

    # 1) Create dirs
    client.tools_invoke(
        "exec",
        {
            "command": f"mkdir -p {workspace} {agent_dir}",
        },
        timeout_s=20.0,
    )

    # 2) Write minimal identity files in the per-agent workspace
    identity_md = (
        "# IDENTITY.md\n\n"
        "- **Name:** " + employee_name + "\n"
        "- **Creature:** AI agent employee (Mission Control)\n"
        "- **Vibe:** Direct, action-oriented, leaves audit trails\n"
    )
    user_md = (
        "# USER.md\n\n"
        "You work for Abhimanyu.\n"
        "You must execute Mission Control tasks via the API and keep state synced.\n"
    )

    # Use cat heredocs to avoid dependency on extra tooling.
    client.tools_invoke(
        "exec",
        {
            "command": "bash -lc "
            + json.dumps(
                """
cat > {ws}/IDENTITY.md <<'EOF'
{identity}
EOF
cat > {ws}/USER.md <<'EOF'
{user}
EOF
""".format(ws=workspace, identity=identity_md, user=user_md)
            ),
        },
        timeout_s=20.0,
    )

    # 3) Update openclaw.json agents.list (idempotent)
    cfg_resp = client.tools_invoke("gateway", {"action": "config.get"}, timeout_s=20.0)
    raw = (
        (((cfg_resp or {}).get("result") or {}).get("content") or [{}])[0].get("text")
        if isinstance((((cfg_resp or {}).get("result") or {}).get("content") or [{}]), list)
        else None
    )

    if not raw:
        # fallback: tool may return {ok:true,result:{raw:...}}
        raw = ((cfg_resp.get("result") or {}).get("raw")) if isinstance(cfg_resp, dict) else None

    if not raw:
        raise RuntimeError("Unable to read gateway config via tools")

    cfg = json.loads(raw)

    agents = cfg.get("agents") or {}
    agents_list = agents.get("list") or []
    if not isinstance(agents_list, list):
        agents_list = []

    exists = any(isinstance(a, dict) and a.get("id") == agent_id for a in agents_list)
    if not exists:
        agents_list.append(
            {
                "id": agent_id,
                "name": employee_name,
                "workspace": workspace,
                "agentDir": agent_dir,
                "identity": {"name": employee_name, "emoji": "🜁"},
            }
        )
        agents["list"] = agents_list
        cfg["agents"] = agents

        client.tools_invoke(
            "gateway",
            {"action": "config.apply", "raw": json.dumps(cfg)},
            timeout_s=30.0,
        )
        # give the gateway a moment to reload the agent registry
        time.sleep(2.5)

    return {"agent_id": agent_id, "workspace": workspace}
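The `_slug`/`desired_agent_id` pair above is pure string manipulation and easy to verify standalone. A sketch mirroring that normalization (the `slugify` name is illustrative):

```python
import re


def slugify(s: str) -> str:
    # Mirrors _slug: lowercase, collapse runs of non-alphanumerics to single
    # hyphens, trim edge hyphens, and fall back to "agent" for empty input.
    s = (s or "").strip().lower()
    s = re.sub(r"[^a-z0-9]+", "-", s)
    s = re.sub(r"-+", "-", s).strip("-")
    return s or "agent"


agent_id = f"employee-7-{slugify('Ada Lovelace!')}"
```

Because the slug is derived deterministically from the employee id and name, re-running the provisioning produces the same agent id, which is what makes the `agents.list` update above idempotent.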
126
backend/app/integrations/openclaw_gateway.py
Normal file
@@ -0,0 +1,126 @@
from __future__ import annotations

import asyncio
import json
from dataclasses import dataclass
from typing import Any
from urllib.parse import urlencode, urlparse, urlunparse
from uuid import uuid4

import websockets

from app.core.config import settings


class OpenClawGatewayError(RuntimeError):
    pass


@dataclass
class OpenClawResponse:
    payload: Any


def _build_gateway_url() -> str:
    base_url = settings.openclaw_gateway_url or "ws://127.0.0.1:18789"
    token = settings.openclaw_gateway_token
    if not token:
        return base_url
    parsed = urlparse(base_url)
    query = urlencode({"token": token})
    return urlunparse(parsed._replace(query=query))


async def _await_response(ws: websockets.WebSocketClientProtocol, request_id: str) -> Any:
    while True:
        raw = await ws.recv()
        data = json.loads(raw)

        if data.get("type") == "res" and data.get("id") == request_id:
            if data.get("ok") is False:
                error = data.get("error", {}).get("message", "Gateway error")
                raise OpenClawGatewayError(error)
            return data.get("payload")

        if data.get("id") == request_id:
            if data.get("error"):
                raise OpenClawGatewayError(data["error"].get("message", "Gateway error"))
            return data.get("result")


async def _send_request(
    ws: websockets.WebSocketClientProtocol, method: str, params: dict[str, Any] | None
) -> Any:
    request_id = str(uuid4())
    message = {"type": "req", "id": request_id, "method": method, "params": params or {}}
    await ws.send(json.dumps(message))
    return await _await_response(ws, request_id)


async def _handle_challenge(
    ws: websockets.WebSocketClientProtocol, first_message: str | None
) -> None:
    if not first_message:
        return
    data = json.loads(first_message)
    if data.get("type") != "event" or data.get("event") != "connect.challenge":
        return

    connect_id = str(uuid4())
    response = {
        "type": "req",
        "id": connect_id,
        "method": "connect",
        "params": {
            "minProtocol": 3,
            "maxProtocol": 3,
            "client": {
                "id": "gateway-client",
                "version": "1.0.0",
                "platform": "web",
                "mode": "ui",
            },
            "auth": {"token": settings.openclaw_gateway_token},
        },
    }
    await ws.send(json.dumps(response))
    await _await_response(ws, connect_id)


async def openclaw_call(method: str, params: dict[str, Any] | None = None) -> Any:
    gateway_url = _build_gateway_url()
    try:
        async with websockets.connect(gateway_url, ping_interval=None) as ws:
            first_message = None
            try:
                first_message = await asyncio.wait_for(ws.recv(), timeout=2)
            except asyncio.TimeoutError:
                first_message = None
            await _handle_challenge(ws, first_message)
            return await _send_request(ws, method, params)
    except OpenClawGatewayError:
        raise
    except Exception as exc:  # pragma: no cover - network errors
        raise OpenClawGatewayError(str(exc)) from exc


async def send_message(
    message: str,
    *,
    session_key: str,
    deliver: bool = False,
) -> Any:
    params: dict[str, Any] = {
        "sessionKey": session_key,
        "message": message,
        "deliver": deliver,
        "idempotencyKey": str(uuid4()),
    }
    return await openclaw_call("chat.send", params)


async def get_chat_history(session_key: str, limit: int | None = None) -> Any:
    params: dict[str, Any] = {"sessionKey": session_key}
    if limit is not None:
        params["limit"] = limit
    return await openclaw_call("chat.history", params)
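The `_await_response` loop above correlates frames on a multiplexed socket to a request by matching the echoed `id`. A synchronous sketch of that matching over a plain list of frames (the `match_response` helper is hypothetical and simplified; the real loop awaits frames from the websocket and also handles the legacy `result`/`error` shape):

```python
import json
from uuid import uuid4


def match_response(frames: list[str], request_id: str):
    # Scan frames, skipping unrelated events and other requests' responses;
    # a matching "res" frame yields its payload, or raises on ok=False.
    for raw in frames:
        data = json.loads(raw)
        if data.get("type") == "res" and data.get("id") == request_id:
            if data.get("ok") is False:
                raise RuntimeError(data.get("error", {}).get("message", "Gateway error"))
            return data.get("payload")
    return None


rid = str(uuid4())
frames = [
    json.dumps({"type": "event", "event": "tick"}),
    json.dumps({"type": "res", "id": "other", "ok": True, "payload": 1}),
    json.dumps({"type": "res", "id": rid, "ok": True, "payload": {"n": 2}}),
]
payload = match_response(frames, rid)
```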
@@ -1,19 +1,21 @@
from __future__ import annotations

from fastapi import FastAPI
from fastapi import APIRouter, FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api.activities import router as activities_router
from app.api.org import router as org_router
from app.api.projects import router as projects_router
from app.api.work import router as work_router
from app.api.activity import router as activity_router
from app.api.agents import router as agents_router
from app.api.auth import router as auth_router
from app.api.boards import router as boards_router
from app.api.gateway import router as gateway_router
from app.api.tasks import router as tasks_router
from app.core.config import settings
from app.core.logging import configure_logging
from app.db.session import init_db

configure_logging()

app = FastAPI(title="OpenClaw Agency API", version="0.3.0")
app = FastAPI(title="Mission Control API", version="0.1.0")

origins = [o.strip() for o in settings.cors_origins.split(",") if o.strip()]
if origins:
@@ -31,12 +33,16 @@ def on_startup() -> None:
    init_db()


app.include_router(org_router)
app.include_router(projects_router)
app.include_router(work_router)
app.include_router(activities_router)


@app.get("/health")
def health():
def health() -> dict[str, bool]:
    return {"ok": True}


api_v1 = APIRouter(prefix="/api/v1")
api_v1.include_router(auth_router)
api_v1.include_router(agents_router)
api_v1.include_router(activity_router)
api_v1.include_router(gateway_router)
api_v1.include_router(boards_router)
api_v1.include_router(tasks_router)
app.include_router(api_v1)
```diff
@@ -1,15 +1,13 @@
-from app.models.activity import Activity
-from app.models.org import Department, Employee, Team
-from app.models.projects import Project, ProjectMember
-from app.models.work import Task, TaskComment
+from app.models.activity_events import ActivityEvent
+from app.models.agents import Agent
+from app.models.boards import Board
+from app.models.tasks import Task
+from app.models.users import User

 __all__ = [
-    "Department",
-    "Employee",
-    "Team",
-    "Project",
-    "ProjectMember",
+    "ActivityEvent",
+    "Agent",
+    "Board",
     "Task",
-    "TaskComment",
-    "Activity",
+    "User",
 ]
```
Deleted file (`@@ -1,19 +0,0 @@`):

```python
from __future__ import annotations

from datetime import datetime

from sqlmodel import Field, SQLModel


class Activity(SQLModel, table=True):
    __tablename__ = "activities"

    id: int | None = Field(default=None, primary_key=True)
    actor_employee_id: int | None = Field(default=None, foreign_key="employees.id")

    entity_type: str
    entity_id: int | None = None
    verb: str

    payload_json: str | None = None
    created_at: datetime = Field(default_factory=datetime.utcnow)
```
`backend/app/models/activity_events.py` (new file, `@@ -0,0 +1,17 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID, uuid4

from sqlmodel import Field, SQLModel


class ActivityEvent(SQLModel, table=True):
    __tablename__ = "activity_events"

    id: UUID = Field(default_factory=uuid4, primary_key=True)
    event_type: str = Field(index=True)
    message: str | None = None
    agent_id: UUID | None = Field(default=None, foreign_key="agents.id", index=True)
    task_id: UUID | None = Field(default=None, foreign_key="tasks.id", index=True)
    created_at: datetime = Field(default_factory=datetime.utcnow)
```
`backend/app/models/agents.py` (new file, `@@ -0,0 +1,18 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID, uuid4

from sqlmodel import Field, SQLModel


class Agent(SQLModel, table=True):
    __tablename__ = "agents"

    id: UUID = Field(default_factory=uuid4, primary_key=True)
    name: str = Field(index=True)
    status: str = Field(default="online", index=True)
    openclaw_session_id: str | None = Field(default=None, index=True)
    last_seen_at: datetime = Field(default_factory=datetime.utcnow)
    created_at: datetime = Field(default_factory=datetime.utcnow)
    updated_at: datetime = Field(default_factory=datetime.utcnow)
```
`backend/app/models/boards.py` (new file, `@@ -0,0 +1,18 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID, uuid4

from sqlmodel import Field

from app.models.tenancy import TenantScoped


class Board(TenantScoped, table=True):
    __tablename__ = "boards"

    id: UUID = Field(default_factory=uuid4, primary_key=True)
    name: str
    slug: str = Field(index=True)
    created_at: datetime = Field(default_factory=datetime.utcnow)
    updated_at: datetime = Field(default_factory=datetime.utcnow)
```
Deleted file (`@@ -1,40 +0,0 @@`):

```python
from __future__ import annotations

from sqlmodel import Field, SQLModel


class Department(SQLModel, table=True):
    __tablename__ = "departments"

    id: int | None = Field(default=None, primary_key=True)
    name: str = Field(index=True, unique=True)
    head_employee_id: int | None = Field(default=None, foreign_key="employees.id")


class Team(SQLModel, table=True):
    __tablename__ = "teams"

    id: int | None = Field(default=None, primary_key=True)
    name: str = Field(index=True)

    department_id: int = Field(foreign_key="departments.id")
    lead_employee_id: int | None = Field(default=None, foreign_key="employees.id")


class Employee(SQLModel, table=True):
    __tablename__ = "employees"

    id: int | None = Field(default=None, primary_key=True)
    name: str
    employee_type: str  # human | agent

    department_id: int | None = Field(default=None, foreign_key="departments.id")
    team_id: int | None = Field(default=None, foreign_key="teams.id")
    manager_id: int | None = Field(default=None, foreign_key="employees.id")

    title: str | None = None
    status: str = Field(default="active")

    # OpenClaw integration
    openclaw_session_key: str | None = None
    notify_enabled: bool = Field(default=True)
```
Deleted file (`@@ -1,23 +0,0 @@`):

```python
from __future__ import annotations

from sqlmodel import Field, SQLModel


class Project(SQLModel, table=True):
    __tablename__ = "projects"

    id: int | None = Field(default=None, primary_key=True)
    name: str = Field(index=True, unique=True)
    status: str = Field(default="active")

    # Project ownership: projects are assigned to teams (not departments)
    team_id: int | None = Field(default=None, foreign_key="teams.id")


class ProjectMember(SQLModel, table=True):
    __tablename__ = "project_members"

    id: int | None = Field(default=None, primary_key=True)
    project_id: int = Field(foreign_key="projects.id")
    employee_id: int = Field(foreign_key="employees.id")
    role: str | None = None
```
`backend/app/models/tasks.py` (new file, `@@ -0,0 +1,26 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID, uuid4

from sqlmodel import Field

from app.models.tenancy import TenantScoped


class Task(TenantScoped, table=True):
    __tablename__ = "tasks"

    id: UUID = Field(default_factory=uuid4, primary_key=True)
    board_id: UUID | None = Field(default=None, foreign_key="boards.id", index=True)

    title: str
    description: str | None = None
    status: str = Field(default="inbox", index=True)
    priority: str = Field(default="medium", index=True)
    due_at: datetime | None = None

    created_by_user_id: UUID | None = Field(default=None, foreign_key="users.id", index=True)

    created_at: datetime = Field(default_factory=datetime.utcnow)
    updated_at: datetime = Field(default_factory=datetime.utcnow)
```
`backend/app/models/tenancy.py` (new file, `@@ -0,0 +1,7 @@`):

```python
from __future__ import annotations

from sqlmodel import SQLModel


class TenantScoped(SQLModel, table=False):
    pass
```
`backend/app/models/users.py` (new file, `@@ -0,0 +1,15 @@`):

```python
from __future__ import annotations

from uuid import UUID, uuid4

from sqlmodel import Field, SQLModel


class User(SQLModel, table=True):
    __tablename__ = "users"

    id: UUID = Field(default_factory=uuid4, primary_key=True)
    clerk_user_id: str = Field(index=True, unique=True)
    email: str | None = Field(default=None, index=True)
    name: str | None = None
    is_super_admin: bool = Field(default=False)
```
Deleted file (`@@ -1,38 +0,0 @@`):

```python
from __future__ import annotations

from datetime import datetime

from sqlmodel import Field, SQLModel


class Task(SQLModel, table=True):
    __tablename__ = "tasks"

    id: int | None = Field(default=None, primary_key=True)

    project_id: int = Field(foreign_key="projects.id", index=True)
    title: str
    description: str | None = None

    status: str = Field(default="backlog", index=True)

    assignee_employee_id: int | None = Field(default=None, foreign_key="employees.id")
    reviewer_employee_id: int | None = Field(default=None, foreign_key="employees.id")
    created_by_employee_id: int | None = Field(default=None, foreign_key="employees.id")

    created_at: datetime = Field(default_factory=datetime.utcnow)
    updated_at: datetime = Field(default_factory=datetime.utcnow)


class TaskComment(SQLModel, table=True):
    __tablename__ = "task_comments"

    id: int | None = Field(default=None, primary_key=True)
    task_id: int = Field(foreign_key="tasks.id", index=True)
    author_employee_id: int | None = Field(default=None, foreign_key="employees.id")

    # Optional reply threading
    reply_to_comment_id: int | None = Field(default=None, foreign_key="task_comments.id")

    body: str
    created_at: datetime = Field(default_factory=datetime.utcnow)
```
`backend/app/schemas/__init__.py` (new file, `@@ -0,0 +1,20 @@`):

```python
from app.schemas.activity_events import ActivityEventRead
from app.schemas.agents import AgentCreate, AgentRead, AgentUpdate
from app.schemas.boards import BoardCreate, BoardRead, BoardUpdate
from app.schemas.tasks import TaskCreate, TaskRead, TaskUpdate
from app.schemas.users import UserCreate, UserRead

__all__ = [
    "ActivityEventRead",
    "AgentCreate",
    "AgentRead",
    "AgentUpdate",
    "BoardCreate",
    "BoardRead",
    "BoardUpdate",
    "TaskCreate",
    "TaskRead",
    "TaskUpdate",
    "UserCreate",
    "UserRead",
]
```
`backend/app/schemas/activity_events.py` (new file, `@@ -0,0 +1,15 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID

from sqlmodel import SQLModel


class ActivityEventRead(SQLModel):
    id: UUID
    event_type: str
    message: str | None
    agent_id: UUID | None
    task_id: UUID | None
    created_at: datetime
```
`backend/app/schemas/agents.py` (new file, `@@ -0,0 +1,36 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID

from sqlmodel import SQLModel


class AgentBase(SQLModel):
    name: str
    status: str = "online"


class AgentCreate(AgentBase):
    pass


class AgentUpdate(SQLModel):
    name: str | None = None
    status: str | None = None


class AgentRead(AgentBase):
    id: UUID
    openclaw_session_id: str | None = None
    last_seen_at: datetime
    created_at: datetime
    updated_at: datetime


class AgentHeartbeat(SQLModel):
    status: str | None = None


class AgentHeartbeatCreate(AgentHeartbeat):
    name: str
```
`backend/app/schemas/boards.py` (new file, `@@ -0,0 +1,26 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID

from sqlmodel import SQLModel


class BoardBase(SQLModel):
    name: str
    slug: str


class BoardCreate(BoardBase):
    pass


class BoardUpdate(SQLModel):
    name: str | None = None
    slug: str | None = None


class BoardRead(BoardBase):
    id: UUID
    created_at: datetime
    updated_at: datetime
```
Deleted file (`@@ -1,53 +0,0 @@`):

```python
from __future__ import annotations

from sqlmodel import SQLModel


class DepartmentCreate(SQLModel):
    name: str
    head_employee_id: int | None = None


class DepartmentUpdate(SQLModel):
    name: str | None = None
    head_employee_id: int | None = None


class TeamCreate(SQLModel):
    name: str
    department_id: int
    lead_employee_id: int | None = None


class TeamUpdate(SQLModel):
    name: str | None = None
    department_id: int | None = None
    lead_employee_id: int | None = None


class EmployeeCreate(SQLModel):
    name: str
    employee_type: str
    department_id: int | None = None
    team_id: int | None = None
    manager_id: int | None = None
    title: str | None = None
    status: str = "active"

    # OpenClaw integration
    openclaw_session_key: str | None = None
    notify_enabled: bool = True


class EmployeeUpdate(SQLModel):
    name: str | None = None
    employee_type: str | None = None
    department_id: int | None = None
    team_id: int | None = None
    manager_id: int | None = None
    title: str | None = None
    status: str | None = None

    # OpenClaw integration
    openclaw_session_key: str | None = None
    notify_enabled: bool | None = None
```
Deleted file (`@@ -1,15 +0,0 @@`):

```python
from __future__ import annotations

from sqlmodel import SQLModel


class ProjectCreate(SQLModel):
    name: str
    status: str = "active"
    team_id: int | None = None


class ProjectUpdate(SQLModel):
    name: str | None = None
    status: str | None = None
    team_id: int | None = None
```
`backend/app/schemas/tasks.py` (new file, `@@ -0,0 +1,34 @@`):

```python
from __future__ import annotations

from datetime import datetime
from uuid import UUID

from sqlmodel import SQLModel


class TaskBase(SQLModel):
    title: str
    description: str | None = None
    status: str = "inbox"
    priority: str = "medium"
    due_at: datetime | None = None


class TaskCreate(TaskBase):
    created_by_user_id: UUID | None = None


class TaskUpdate(SQLModel):
    title: str | None = None
    description: str | None = None
    status: str | None = None
    priority: str | None = None
    due_at: datetime | None = None


class TaskRead(TaskBase):
    id: UUID
    board_id: UUID | None
    created_by_user_id: UUID | None
    created_at: datetime
    updated_at: datetime
```
`backend/app/schemas/users.py` (new file, `@@ -0,0 +1,20 @@`):

```python
from __future__ import annotations

from uuid import UUID

from sqlmodel import SQLModel


class UserBase(SQLModel):
    clerk_user_id: str
    email: str | None = None
    name: str | None = None


class UserCreate(UserBase):
    pass


class UserRead(UserBase):
    id: UUID
    is_super_admin: bool
```
Deleted file (`@@ -1,33 +0,0 @@`):

```python
from __future__ import annotations

from sqlmodel import SQLModel


class TaskCreate(SQLModel):
    project_id: int
    title: str
    description: str | None = None
    status: str = "backlog"
    assignee_employee_id: int | None = None
    reviewer_employee_id: int | None = None
    created_by_employee_id: int | None = None


class TaskUpdate(SQLModel):
    title: str | None = None
    description: str | None = None
    status: str | None = None
    assignee_employee_id: int | None = None
    reviewer_employee_id: int | None = None


class TaskCommentCreate(SQLModel):
    task_id: int
    author_employee_id: int | None = None
    reply_to_comment_id: int | None = None
    body: str


class TaskReviewDecision(SQLModel):
    decision: str  # approve | changes
    comment_body: str
```
`backend/app/services/__init__.py` (new file, empty)

`backend/app/services/admin_access.py` (new file, `@@ -0,0 +1,10 @@`):

```python
from __future__ import annotations

from fastapi import HTTPException, status

from app.core.auth import AuthContext


def require_admin(auth: AuthContext) -> None:
    if auth.actor_type != "user" or auth.user is None:
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN)
```
`backend/app/workers/__init__.py` (new file, empty)

`backend/app/workers/queue.py` (new file, `@@ -0,0 +1,14 @@`):

```python
from __future__ import annotations

from redis import Redis
from rq import Queue

from app.core.config import settings


def get_redis() -> Redis:
    return Redis.from_url(settings.redis_url)


def get_queue(name: str) -> Queue:
    return Queue(name, connection=get_redis())
```
`@@ -7,3 +7,33 @@ extend-exclude = '(\.venv|alembic/versions)'`

```toml
profile = "black"
line_length = 100
skip = [".venv", "alembic/versions"]

[project]
name = "openclaw-agency-backend"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastapi==0.115.4",
    "uvicorn[standard]==0.30.6",
    "sqlmodel==0.0.22",
    "sqlalchemy==2.0.34",
    "alembic==1.13.2",
    "psycopg[binary]==3.2.1",
    "pydantic-settings==2.5.2",
    "python-dotenv==1.0.1",
    "httpx==0.27.2",
    "websockets==12.0",
    "rq==1.16.2",
    "redis==5.1.1",
    "fastapi-clerk-auth==0.0.9",
    "sse-starlette==2.1.3",
]

[project.optional-dependencies]
dev = [
    "black==24.10.0",
    "flake8==7.1.1",
    "isort==5.13.2",
    "pytest==8.3.3",
    "pytest-asyncio==0.24.0",
    "ruff==0.6.9",
]
```
```diff
@@ -1,4 +1,3 @@
-black==24.10.0
-isort==5.13.2
-flake8==7.1.1
-pre-commit==4.1.0
+pytest==8.3.3
+pytest-asyncio==0.24.0
+ruff==0.6.9
```
```diff
@@ -1,8 +1,14 @@
-fastapi
-uvicorn[standard]
-sqlmodel
-alembic
-psycopg[binary]
-python-dotenv
-pydantic-settings
-requests
+fastapi==0.115.4
+uvicorn[standard]==0.30.6
+sqlmodel==0.0.22
+sqlalchemy==2.0.34
+alembic==1.13.2
+psycopg[binary]==3.2.1
+pydantic-settings==2.5.2
+python-dotenv==1.0.1
+httpx==0.27.2
+websockets==12.0
+rq==1.16.2
+redis==5.1.1
+fastapi-clerk-auth==0.0.9
+sse-starlette==2.1.3
```
Deleted file (`@@ -1,37 +0,0 @@`):

# DB reset + seed (dev-machine)

This repo uses Alembic migrations as the schema source-of-truth.

## Reset to the current seed

```bash
cd backend
./scripts/reset_db.sh
```

Environment variables (optional):

- `DB_NAME` (default `openclaw_agency`)
- `DB_USER` (default `postgres`)
- `DB_HOST` (default `127.0.0.1`)
- `DB_PORT` (default `5432`)
- `DB_PASSWORD` (default `postgres`)

## Updating the seed

The seed is a **data-only** dump (not schema). Regenerate it from the current DB state:

```bash
cd backend
PGPASSWORD=postgres pg_dump \
  --data-only \
  --column-inserts \
  --disable-triggers \
  --no-owner \
  --no-privileges \
  -U postgres -h 127.0.0.1 -d openclaw_agency \
  > scripts/seed_data.sql

# IMPORTANT: do not include alembic_version in the seed (migrations already set it)
# (our committed seed already has this removed)
```
Deleted file (`@@ -1,9 +0,0 @@`):

```bash
#!/usr/bin/env bash
set -euo pipefail
cd "$(dirname "$0")/.."

. .venv/bin/activate

python -m black .
python -m isort .
python -m flake8 .
```
Deleted file (`@@ -1,55 +0,0 @@`):

```bash
#!/usr/bin/env bash
set -euo pipefail

DB_NAME=${DB_NAME:-openclaw_agency}
DB_USER=${DB_USER:-postgres}
DB_HOST=${DB_HOST:-127.0.0.1}
DB_PORT=${DB_PORT:-5432}

# Never hardcode passwords in git. Prefer:
# - DB_PASSWORD env var, or
# - infer from backend/.env DATABASE_URL
DB_PASSWORD=${DB_PASSWORD:-}

cd "$(dirname "$0")/.."

if [[ -z "${DB_PASSWORD}" ]] && [[ -f .env ]]; then
  DB_PASSWORD=$(python3 - <<'PY'
import os
from pathlib import Path
from urllib.parse import urlparse

def parse_database_url(url: str) -> str:
    # supports postgresql+psycopg://user:pass@host:port/db
    u = urlparse(url)
    return u.password or ""

for line in Path('.env').read_text().splitlines():
    if line.startswith('DATABASE_URL='):
        print(parse_database_url(line.split('=', 1)[1].strip()))
        break
PY
)
fi

if [[ -z "${DB_PASSWORD}" ]]; then
  echo "ERROR: DB_PASSWORD not set and could not infer it from backend/.env DATABASE_URL" >&2
  echo "Set DB_PASSWORD=... or create backend/.env with DATABASE_URL" >&2
  exit 2
fi

export PGPASSWORD="$DB_PASSWORD"

# 1) wipe schema
psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" -v ON_ERROR_STOP=1 \
  -c 'DROP SCHEMA public CASCADE; CREATE SCHEMA public;'

# 2) migrate
. .venv/bin/activate
alembic upgrade head

# 3) seed
psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" -v ON_ERROR_STOP=1 \
  -f scripts/seed_data.sql

echo "Reset complete: $DB_USER@$DB_HOST:$DB_PORT/$DB_NAME"
```
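The deleted reset script's inline Python pulls the password out of a `DATABASE_URL`; since the whole scheme-plus-driver URL parses with the standard library, that step can be sketched on its own. The function name and example URL below are ours, with a made-up password, not real credentials.

```python
from urllib.parse import urlparse


def password_from_database_url(url: str) -> str:
    # postgresql+psycopg://user:pass@host:port/db -> "pass" ("" when absent)
    return urlparse(url).password or ""


example = "postgresql+psycopg://postgres:s3cret@127.0.0.1:5432/openclaw_agency"
print(password_from_database_url(example))  # prints: s3cret
```

`urlparse` accepts the `postgresql+psycopg` scheme because `+` is a valid scheme character, which is why the shell heredoc approach works without any extra parsing.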
Deleted file (`@@ -1,50 +0,0 @@`):

```sql
-- Mission Control seed data (minimal)
-- Keep this data-only seed small and deterministic.
-- NOTE: Do NOT include alembic_version here; migrations manage it.

SET client_min_messages = warning;
SET row_security = off;

-- Disable triggers to avoid FK ordering issues during seed.
ALTER TABLE public.employees DISABLE TRIGGER ALL;
ALTER TABLE public.departments DISABLE TRIGGER ALL;
ALTER TABLE public.teams DISABLE TRIGGER ALL;
ALTER TABLE public.projects DISABLE TRIGGER ALL;
ALTER TABLE public.tasks DISABLE TRIGGER ALL;
ALTER TABLE public.task_comments DISABLE TRIGGER ALL;
ALTER TABLE public.project_members DISABLE TRIGGER ALL;
ALTER TABLE public.activities DISABLE TRIGGER ALL;

-- Employees (keep only Abhimanyu)
INSERT INTO public.employees (id, name, employee_type, department_id, manager_id, title, status, openclaw_session_key, notify_enabled, team_id)
VALUES
  (1, 'Abhimanyu', 'human', NULL, NULL, 'CEO', 'active', NULL, false, NULL)
ON CONFLICT (id) DO UPDATE SET
  name = EXCLUDED.name,
  employee_type = EXCLUDED.employee_type,
  department_id = EXCLUDED.department_id,
  manager_id = EXCLUDED.manager_id,
  title = EXCLUDED.title,
  status = EXCLUDED.status,
  openclaw_session_key = EXCLUDED.openclaw_session_key,
  notify_enabled = EXCLUDED.notify_enabled,
  team_id = EXCLUDED.team_id;

-- Fix sequences (avoid PK reuse after explicit ids)
SELECT setval('employees_id_seq', (SELECT COALESCE(max(id), 1) FROM public.employees));
SELECT setval('departments_id_seq', (SELECT COALESCE(max(id), 1) FROM public.departments));
SELECT setval('teams_id_seq', (SELECT COALESCE(max(id), 1) FROM public.teams));
SELECT setval('projects_id_seq', (SELECT COALESCE(max(id), 1) FROM public.projects));
SELECT setval('tasks_id_seq', (SELECT COALESCE(max(id), 1) FROM public.tasks));
SELECT setval('task_comments_id_seq', (SELECT COALESCE(max(id), 1) FROM public.task_comments));
SELECT setval('project_members_id_seq', (SELECT COALESCE(max(id), 1) FROM public.project_members));
SELECT setval('activities_id_seq', (SELECT COALESCE(max(id), 1) FROM public.activities));

ALTER TABLE public.employees ENABLE TRIGGER ALL;
ALTER TABLE public.departments ENABLE TRIGGER ALL;
ALTER TABLE public.teams ENABLE TRIGGER ALL;
ALTER TABLE public.projects ENABLE TRIGGER ALL;
ALTER TABLE public.tasks ENABLE TRIGGER ALL;
ALTER TABLE public.task_comments ENABLE TRIGGER ALL;
ALTER TABLE public.project_members ENABLE TRIGGER ALL;
ALTER TABLE public.activities ENABLE TRIGGER ALL;
```
`backend/scripts/seed_demo.py` (new file, `@@ -0,0 +1,46 @@`):

```python
from __future__ import annotations

from uuid import uuid4

from sqlmodel import Session

from app.db.session import engine
from app.models.orgs import Org, Workspace
from app.models.users import Membership, User


def run() -> None:
    with Session(engine) as session:
        org = Org(name="Demo Org", slug="demo-org")
        session.add(org)
        session.commit()
        session.refresh(org)

        workspace = Workspace(org_id=org.id, name="Demo Workspace", slug="demo-workspace")
        session.add(workspace)
        session.commit()
        session.refresh(workspace)

        user = User(
            clerk_user_id=f"demo-{uuid4()}",
            email="demo@example.com",
            name="Demo Admin",
            is_super_admin=True,
        )
        session.add(user)
        session.commit()
        session.refresh(user)

        membership = Membership(
            org_id=org.id,
            workspace_id=workspace.id,
            user_id=user.id,
            role="admin",
        )
        session.add(membership)

        session.commit()


if __name__ == "__main__":
    run()
```
`backend/uv.lock` (generated, 1084 lines): diff suppressed because it is too large.
```diff
@@ -1,4 +1,7 @@
-# When accessing the frontend from another device, the browser will call this URL.
-# Use your machine IP (not 127.0.0.1).
-
-NEXT_PUBLIC_API_URL=http://<YOUR_MACHINE_IP>:8000
+NEXT_PUBLIC_API_URL=http://localhost:8000
+NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=YOUR_PUBLISHABLE_KEY
+CLERK_SECRET_KEY=YOUR_SECRET_KEY
+NEXT_PUBLIC_CLERK_SIGN_IN_FORCE_REDIRECT_URL=/boards
+NEXT_PUBLIC_CLERK_SIGN_UP_FORCE_REDIRECT_URL=/boards
+NEXT_PUBLIC_CLERK_SIGN_IN_FALLBACK_REDIRECT_URL=/boards
+NEXT_PUBLIC_CLERK_SIGN_UP_FALLBACK_REDIRECT_URL=/boards
```
`frontend/.gitignore` (vendored):

```diff
@@ -1,41 +1,9 @@
 # See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
+node_modules/
+.next/
+.env.local
+.env
+out/
+dist/
-
-# dependencies
-/node_modules
-/.pnp
-.pnp.*
-.yarn/*
-!.yarn/patches
-!.yarn/plugins
-!.yarn/releases
-!.yarn/versions
-
-# testing
-/coverage
-
-# next.js
-/.next/
-/out/
-
-# production
-/build
-
-# misc
-.DS_Store
-*.pem
-
-# debug
-npm-debug.log*
-yarn-debug.log*
-yarn-error.log*
-.pnpm-debug.log*
-
-# env files (can opt-in for committing if needed)
-.env*
-
-# vercel
-.vercel
-
-# typescript
-*.tsbuildinfo
-next-env.d.ts
+# clerk configuration (can include secrets)
+/.clerk/
```
`frontend/next-env.d.ts` (vendored, new file, `@@ -0,0 +1,6 @@`):

```typescript
/// <reference types="next" />
/// <reference types="next/image-types/global" />
import "./.next/dev/types/routes.d.ts";

// NOTE: This file should not be edited
// see https://nextjs.org/docs/app/api-reference/config/typescript for more information.
```
```diff
@@ -1,7 +1,11 @@
 import type { NextConfig } from "next";

 const nextConfig: NextConfig = {
-  /* config options here */
+  allowedDevOrigins: [
+    "http://localhost:3000",
+    "http://127.0.0.1:3000",
+    "http://192.168.1.101:3000",
+  ],
 };

 export default nextConfig;
```
`frontend/package-lock.json` (generated, 1055 lines): diff suppressed because it is too large.
```diff
@@ -11,7 +11,13 @@
     "api:gen": "orval --config ./orval.config.ts"
   },
   "dependencies": {
     "@clerk/nextjs": "^6.37.1",
+    "@radix-ui/react-dialog": "^1.1.2",
+    "@radix-ui/react-select": "^2.2.6",
+    "@radix-ui/react-tabs": "^1.1.1",
+    "@radix-ui/react-tooltip": "^1.1.2",
+    "@tanstack/react-query": "^5.90.20",
+    "@tanstack/react-table": "^8.21.3",
     "next": "16.1.6",
     "react": "19.2.3",
     "react-dom": "19.2.3"
```
||||
@@ -1 +0,0 @@
|
||||
<svg fill="none" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg"><path d="M14.5 13.5V5.41a1 1 0 0 0-.3-.7L9.8.29A1 1 0 0 0 9.08 0H1.5v13.5A2.5 2.5 0 0 0 4 16h8a2.5 2.5 0 0 0 2.5-2.5m-1.5 0v-7H8v-5H3v12a1 1 0 0 0 1 1h8a1 1 0 0 0 1-1M9.5 5V2.12L12.38 5zM5.13 5h-.62v1.25h2.12V5zm-.62 3h7.12v1.25H4.5zm.62 3h-.62v1.25h7.12V11z" clip-rule="evenodd" fill="#666" fill-rule="evenodd"/></svg>
@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><g clip-path="url(#a)"><path fill-rule="evenodd" clip-rule="evenodd" d="M10.27 14.1a6.5 6.5 0 0 0 3.67-3.45q-1.24.21-2.7.34-.31 1.83-.97 3.1M8 16A8 8 0 1 0 8 0a8 8 0 0 0 0 16m.48-1.52a7 7 0 0 1-.96 0H7.5a4 4 0 0 1-.84-1.32q-.38-.89-.63-2.08a40 40 0 0 0 3.92 0q-.25 1.2-.63 2.08a4 4 0 0 1-.84 1.31zm2.94-4.76q1.66-.15 2.95-.43a7 7 0 0 0 0-2.58q-1.3-.27-2.95-.43a18 18 0 0 1 0 3.44m-1.27-3.54a17 17 0 0 1 0 3.64 39 39 0 0 1-4.3 0 17 17 0 0 1 0-3.64 39 39 0 0 1 4.3 0m1.1-1.17q1.45.13 2.69.34a6.5 6.5 0 0 0-3.67-3.44q.65 1.26.98 3.1M8.48 1.5l.01.02q.41.37.84 1.31.38.89.63 2.08a40 40 0 0 0-3.92 0q.25-1.2.63-2.08a4 4 0 0 1 .85-1.32 7 7 0 0 1 .96 0m-2.75.4a6.5 6.5 0 0 0-3.67 3.44 29 29 0 0 1 2.7-.34q.31-1.83.97-3.1M4.58 6.28q-1.66.16-2.95.43a7 7 0 0 0 0 2.58q1.3.27 2.95.43a18 18 0 0 1 0-3.44m.17 4.71q-1.45-.12-2.69-.34a6.5 6.5 0 0 0 3.67 3.44q-.65-1.27-.98-3.1" fill="#666"/></g><defs><clipPath id="a"><path fill="#fff" d="M0 0h16v16H0z"/></clipPath></defs></svg>
@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 394 80"><path fill="#000" d="M262 0h68.5v12.7h-27.2v66.6h-13.6V12.7H262V0ZM149 0v12.7H94v20.4h44.3v12.6H94v21h55v12.6H80.5V0h68.7zm34.3 0h-17.8l63.8 79.4h17.9l-32-39.7 32-39.6h-17.9l-23 28.6-23-28.6zm18.3 56.7-9-11-27.1 33.7h17.8l18.3-22.7z"/><path fill="#000" d="M81 79.3 17 0H0v79.3h13.6V17l50.2 62.3H81Zm252.6-.4c-1 0-1.8-.4-2.5-1s-1.1-1.6-1.1-2.6.3-1.8 1-2.5 1.6-1 2.6-1 1.8.3 2.5 1a3.4 3.4 0 0 1 .6 4.3 3.7 3.7 0 0 1-3 1.8zm23.2-33.5h6v23.3c0 2.1-.4 4-1.3 5.5a9.1 9.1 0 0 1-3.8 3.5c-1.6.8-3.5 1.3-5.7 1.3-2 0-3.7-.4-5.3-1s-2.8-1.8-3.7-3.2c-.9-1.3-1.4-3-1.4-5h6c.1.8.3 1.6.7 2.2s1 1.2 1.6 1.5c.7.4 1.5.5 2.4.5 1 0 1.8-.2 2.4-.6a4 4 0 0 0 1.6-1.8c.3-.8.5-1.8.5-3V45.5zm30.9 9.1a4.4 4.4 0 0 0-2-3.3 7.5 7.5 0 0 0-4.3-1.1c-1.3 0-2.4.2-3.3.5-.9.4-1.6 1-2 1.6a3.5 3.5 0 0 0-.3 4c.3.5.7.9 1.3 1.2l1.8 1 2 .5 3.2.8c1.3.3 2.5.7 3.7 1.2a13 13 0 0 1 3.2 1.8 8.1 8.1 0 0 1 3 6.5c0 2-.5 3.7-1.5 5.1a10 10 0 0 1-4.4 3.5c-1.8.8-4.1 1.2-6.8 1.2-2.6 0-4.9-.4-6.8-1.2-2-.8-3.4-2-4.5-3.5a10 10 0 0 1-1.7-5.6h6a5 5 0 0 0 3.5 4.6c1 .4 2.2.6 3.4.6 1.3 0 2.5-.2 3.5-.6 1-.4 1.8-1 2.4-1.7a4 4 0 0 0 .8-2.4c0-.9-.2-1.6-.7-2.2a11 11 0 0 0-2.1-1.4l-3.2-1-3.8-1c-2.8-.7-5-1.7-6.6-3.2a7.2 7.2 0 0 1-2.4-5.7 8 8 0 0 1 1.7-5 10 10 0 0 1 4.3-3.5c2-.8 4-1.2 6.4-1.2 2.3 0 4.4.4 6.2 1.2 1.8.8 3.2 2 4.3 3.4 1 1.4 1.5 3 1.5 5h-5.8z"/></svg>
@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1155 1000"><path d="m577.3 0 577.4 1000H0z" fill="#fff"/></svg>
@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><path fill-rule="evenodd" clip-rule="evenodd" d="M1.5 2.5h13v10a1 1 0 0 1-1 1h-11a1 1 0 0 1-1-1zM0 1h16v11.5a2.5 2.5 0 0 1-2.5 2.5h-11A2.5 2.5 0 0 1 0 12.5zm3.75 4.5a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5M7 4.75a.75.75 0 1 1-1.5 0 .75.75 0 0 1 1.5 0m1.75.75a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5" fill="#666"/></svg>
@@ -1,237 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */
import { useQuery } from "@tanstack/react-query";
import type {
  DataTag,
  DefinedInitialDataOptions,
  DefinedUseQueryResult,
  QueryClient,
  QueryFunction,
  QueryKey,
  UndefinedInitialDataOptions,
  UseQueryOptions,
  UseQueryResult,
} from "@tanstack/react-query";

import type {
  HTTPValidationError,
  ListActivitiesActivitiesGetParams,
} from ".././model";

import { customFetch } from "../../mutator";

type SecondParameter<T extends (...args: never) => unknown> = Parameters<T>[1];

/**
 * @summary List Activities
 */
export type listActivitiesActivitiesGetResponse200 = {
  data: unknown;
  status: 200;
};

export type listActivitiesActivitiesGetResponse422 = {
  data: HTTPValidationError;
  status: 422;
};

export type listActivitiesActivitiesGetResponseSuccess =
  listActivitiesActivitiesGetResponse200 & {
    headers: Headers;
  };
export type listActivitiesActivitiesGetResponseError =
  listActivitiesActivitiesGetResponse422 & {
    headers: Headers;
  };

export type listActivitiesActivitiesGetResponse =
  | listActivitiesActivitiesGetResponseSuccess
  | listActivitiesActivitiesGetResponseError;

export const getListActivitiesActivitiesGetUrl = (
  params?: ListActivitiesActivitiesGetParams,
) => {
  const normalizedParams = new URLSearchParams();

  Object.entries(params || {}).forEach(([key, value]) => {
    if (value !== undefined) {
      normalizedParams.append(key, value === null ? "null" : value.toString());
    }
  });

  const stringifiedParams = normalizedParams.toString();

  return stringifiedParams.length > 0
    ? `/activities?${stringifiedParams}`
    : `/activities`;
};

export const listActivitiesActivitiesGet = async (
  params?: ListActivitiesActivitiesGetParams,
  options?: RequestInit,
): Promise<listActivitiesActivitiesGetResponse> => {
  return customFetch<listActivitiesActivitiesGetResponse>(
    getListActivitiesActivitiesGetUrl(params),
    {
      ...options,
      method: "GET",
    },
  );
};

export const getListActivitiesActivitiesGetQueryKey = (
  params?: ListActivitiesActivitiesGetParams,
) => {
  return [`/activities`, ...(params ? [params] : [])] as const;
};

export const getListActivitiesActivitiesGetQueryOptions = <
  TData = Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
  TError = HTTPValidationError,
>(
  params?: ListActivitiesActivitiesGetParams,
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
        TError,
        TData
      >
    >;
    request?: SecondParameter<typeof customFetch>;
  },
) => {
  const { query: queryOptions, request: requestOptions } = options ?? {};

  const queryKey =
    queryOptions?.queryKey ?? getListActivitiesActivitiesGetQueryKey(params);

  const queryFn: QueryFunction<
    Awaited<ReturnType<typeof listActivitiesActivitiesGet>>
  > = ({ signal }) =>
    listActivitiesActivitiesGet(params, { signal, ...requestOptions });

  return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
    Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
    TError,
    TData
  > & { queryKey: DataTag<QueryKey, TData, TError> };
};

export type ListActivitiesActivitiesGetQueryResult = NonNullable<
  Awaited<ReturnType<typeof listActivitiesActivitiesGet>>
>;
export type ListActivitiesActivitiesGetQueryError = HTTPValidationError;

export function useListActivitiesActivitiesGet<
  TData = Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
  TError = HTTPValidationError,
>(
  params: undefined | ListActivitiesActivitiesGetParams,
  options: {
    query: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
        TError,
        TData
      >
    > &
      Pick<
        DefinedInitialDataOptions<
          Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
          TError,
          Awaited<ReturnType<typeof listActivitiesActivitiesGet>>
        >,
        "initialData"
      >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): DefinedUseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
export function useListActivitiesActivitiesGet<
  TData = Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
  TError = HTTPValidationError,
>(
  params?: ListActivitiesActivitiesGetParams,
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
        TError,
        TData
      >
    > &
      Pick<
        UndefinedInitialDataOptions<
          Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
          TError,
          Awaited<ReturnType<typeof listActivitiesActivitiesGet>>
        >,
        "initialData"
      >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
export function useListActivitiesActivitiesGet<
  TData = Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
  TError = HTTPValidationError,
>(
  params?: ListActivitiesActivitiesGetParams,
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
        TError,
        TData
      >
    >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
/**
 * @summary List Activities
 */

export function useListActivitiesActivitiesGet<
  TData = Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
  TError = HTTPValidationError,
>(
  params?: ListActivitiesActivitiesGetParams,
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof listActivitiesActivitiesGet>>,
        TError,
        TData
      >
    >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
} {
  const queryOptions = getListActivitiesActivitiesGetQueryOptions(
    params,
    options,
  );

  const query = useQuery(queryOptions, queryClient) as UseQueryResult<
    TData,
    TError
  > & { queryKey: DataTag<QueryKey, TData, TError> };

  return { ...query, queryKey: queryOptions.queryKey };
}
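The deleted client above normalizes query parameters before building the URL: `undefined` entries are dropped, and `null` is serialized as the literal string `"null"`. A standalone sketch of that behavior (the function name `buildActivitiesUrl` is illustrative, not part of the generated client):

```typescript
// Sketch of the query-string normalization used by the removed orval client:
// undefined values are skipped entirely; null becomes the string "null".
function buildActivitiesUrl(
  params?: Record<string, string | number | null | undefined>,
): string {
  const normalized = new URLSearchParams();
  Object.entries(params || {}).forEach(([key, value]) => {
    if (value !== undefined) {
      normalized.append(key, value === null ? "null" : value.toString());
    }
  });
  const qs = normalized.toString();
  // No "?" suffix when there are no effective parameters.
  return qs.length > 0 ? `/activities?${qs}` : `/activities`;
}
```

So `buildActivitiesUrl({ limit: 10, cursor: undefined })` yields `/activities?limit=10`, while `buildActivitiesUrl({ cursor: null })` yields `/activities?cursor=null`.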
@@ -1,183 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */
import { useQuery } from "@tanstack/react-query";
import type {
  DataTag,
  DefinedInitialDataOptions,
  DefinedUseQueryResult,
  QueryClient,
  QueryFunction,
  QueryKey,
  UndefinedInitialDataOptions,
  UseQueryOptions,
  UseQueryResult,
} from "@tanstack/react-query";

import { customFetch } from "../../mutator";

type SecondParameter<T extends (...args: never) => unknown> = Parameters<T>[1];

/**
 * @summary Health
 */
export type healthHealthGetResponse200 = {
  data: unknown;
  status: 200;
};

export type healthHealthGetResponseSuccess = healthHealthGetResponse200 & {
  headers: Headers;
};
export type healthHealthGetResponse = healthHealthGetResponseSuccess;

export const getHealthHealthGetUrl = () => {
  return `/health`;
};

export const healthHealthGet = async (
  options?: RequestInit,
): Promise<healthHealthGetResponse> => {
  return customFetch<healthHealthGetResponse>(getHealthHealthGetUrl(), {
    ...options,
    method: "GET",
  });
};

export const getHealthHealthGetQueryKey = () => {
  return [`/health`] as const;
};

export const getHealthHealthGetQueryOptions = <
  TData = Awaited<ReturnType<typeof healthHealthGet>>,
  TError = unknown,
>(options?: {
  query?: Partial<
    UseQueryOptions<Awaited<ReturnType<typeof healthHealthGet>>, TError, TData>
  >;
  request?: SecondParameter<typeof customFetch>;
}) => {
  const { query: queryOptions, request: requestOptions } = options ?? {};

  const queryKey = queryOptions?.queryKey ?? getHealthHealthGetQueryKey();

  const queryFn: QueryFunction<Awaited<ReturnType<typeof healthHealthGet>>> = ({
    signal,
  }) => healthHealthGet({ signal, ...requestOptions });

  return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
    Awaited<ReturnType<typeof healthHealthGet>>,
    TError,
    TData
  > & { queryKey: DataTag<QueryKey, TData, TError> };
};

export type HealthHealthGetQueryResult = NonNullable<
  Awaited<ReturnType<typeof healthHealthGet>>
>;
export type HealthHealthGetQueryError = unknown;

export function useHealthHealthGet<
  TData = Awaited<ReturnType<typeof healthHealthGet>>,
  TError = unknown,
>(
  options: {
    query: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof healthHealthGet>>,
        TError,
        TData
      >
    > &
      Pick<
        DefinedInitialDataOptions<
          Awaited<ReturnType<typeof healthHealthGet>>,
          TError,
          Awaited<ReturnType<typeof healthHealthGet>>
        >,
        "initialData"
      >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): DefinedUseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
export function useHealthHealthGet<
  TData = Awaited<ReturnType<typeof healthHealthGet>>,
  TError = unknown,
>(
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof healthHealthGet>>,
        TError,
        TData
      >
    > &
      Pick<
        UndefinedInitialDataOptions<
          Awaited<ReturnType<typeof healthHealthGet>>,
          TError,
          Awaited<ReturnType<typeof healthHealthGet>>
        >,
        "initialData"
      >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
export function useHealthHealthGet<
  TData = Awaited<ReturnType<typeof healthHealthGet>>,
  TError = unknown,
>(
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof healthHealthGet>>,
        TError,
        TData
      >
    >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
};
/**
 * @summary Health
 */

export function useHealthHealthGet<
  TData = Awaited<ReturnType<typeof healthHealthGet>>,
  TError = unknown,
>(
  options?: {
    query?: Partial<
      UseQueryOptions<
        Awaited<ReturnType<typeof healthHealthGet>>,
        TError,
        TData
      >
    >;
    request?: SecondParameter<typeof customFetch>;
  },
  queryClient?: QueryClient,
): UseQueryResult<TData, TError> & {
  queryKey: DataTag<QueryKey, TData, TError>;
} {
  const queryOptions = getHealthHealthGetQueryOptions(options);

  const query = useQuery(queryOptions, queryClient) as UseQueryResult<
    TData,
    TError
  > & { queryKey: DataTag<QueryKey, TData, TError> };

  return { ...query, queryKey: queryOptions.queryKey };
}
File diff suppressed because it is too large
@@ -1,22 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface AgentOnboarding {
  id?: number | null;
  agent_name: string;
  role_title: string;
  prompt: string;
  cron_interval_ms?: number | null;
  tools_json?: string | null;
  owner_hr_id?: number | null;
  status?: string;
  spawned_agent_id?: string | null;
  session_key?: string | null;
  notes?: string | null;
  created_at?: string;
  updated_at?: string;
}
@@ -1,19 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface AgentOnboardingCreate {
  agent_name: string;
  role_title: string;
  prompt: string;
  cron_interval_ms?: number | null;
  tools_json?: string | null;
  owner_hr_id?: number | null;
  status?: string;
  spawned_agent_id?: string | null;
  session_key?: string | null;
  notes?: string | null;
}
@@ -1,19 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface AgentOnboardingUpdate {
  agent_name?: string | null;
  role_title?: string | null;
  prompt?: string | null;
  cron_interval_ms?: number | null;
  tools_json?: string | null;
  owner_hr_id?: number | null;
  status?: string | null;
  spawned_agent_id?: string | null;
  session_key?: string | null;
  notes?: string | null;
}
@@ -1,12 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface Department {
  id?: number | null;
  name: string;
  head_employee_id?: number | null;
}
@@ -1,11 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface DepartmentCreate {
  name: string;
  head_employee_id?: number | null;
}
@@ -1,11 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface DepartmentUpdate {
  name?: string | null;
  head_employee_id?: number | null;
}
@@ -1,19 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface Employee {
  id?: number | null;
  name: string;
  employee_type: string;
  department_id?: number | null;
  team_id?: number | null;
  manager_id?: number | null;
  title?: string | null;
  status?: string;
  openclaw_session_key?: string | null;
  notify_enabled?: boolean;
}
@@ -1,18 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface EmployeeCreate {
  name: string;
  employee_type: string;
  department_id?: number | null;
  team_id?: number | null;
  manager_id?: number | null;
  title?: string | null;
  status?: string;
  openclaw_session_key?: string | null;
  notify_enabled?: boolean;
}
@@ -1,18 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface EmployeeUpdate {
  name?: string | null;
  employee_type?: string | null;
  department_id?: number | null;
  team_id?: number | null;
  manager_id?: number | null;
  title?: string | null;
  status?: string | null;
  openclaw_session_key?: string | null;
  notify_enabled?: boolean | null;
}
@@ -1,15 +0,0 @@
/**
 * Generated by orval v8.2.0 🍺
 * Do not edit manually.
 * OpenClaw Agency API
 * OpenAPI spec version: 0.3.0
 */

export interface EmploymentAction {
  id?: number | null;
  employee_id: number;
  issued_by_employee_id: number;
  action_type: string;
  notes?: string | null;
  created_at?: string;
}
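The removed model files above are plain orval-generated interfaces in which only a few fields are required. A minimal sketch of a payload conforming to one of them (the example values are hypothetical, not from the API):

```typescript
// Shape copied from the removed EmployeeCreate model.
interface EmployeeCreate {
  name: string;
  employee_type: string;
  department_id?: number | null;
  team_id?: number | null;
  manager_id?: number | null;
  title?: string | null;
  status?: string;
  openclaw_session_key?: string | null;
  notify_enabled?: boolean;
}

// Only the two required fields plus one optional flag; everything
// else may be omitted or set to null.
const newEmployee: EmployeeCreate = {
  name: "Example Agent",
  employee_type: "agent",
  notify_enabled: true,
};
```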
Some files were not shown because too many files have changed in this diff