feat: Phase 7 — Ace Attorney overlay, NDJSON replay, Twitch integration, Prometheus metrics

Phase 7 features:
- Ace Attorney-style PixiJS renderer with camera presets, character poses,
  dialogue typewriter, and effects engine
- Render directives inferred from role, phase, and dialogue keywords
- Structured case file generated at session start with witness roster,
  evidence inventory, and charge sheet
- Witness statement events emitted during examination
- Audience interaction endpoints: POST /press and /present with rate limiting
- Twitch chat adapter (press, present, objection commands)
- NDJSON session recording and deterministic replay (CLI flags + env vars)
- SSE fixture recorder for browser overlay replay
- Prometheus metrics endpoint (GET /api/metrics) with session lifecycle,
  vote latency, SSE connection, and phase transition telemetry
- Docker: TRUST_PROXY support, 127.0.0.1 bind, network isolation

Quality review fixes:
- C1: Rate-limit /press and /present endpoints (10 req/IP/10s)
- C2: Fix Twitch objection to read current count and increment properly
- I1: Document /api/metrics in README, docs/api.md, operator-runbook
- I2: Comment out unused TWITCH_EVENTSUB_SECRET in .env.example
- I3: Fix event-taxonomy render_directive schema (poses/faces maps)
- I4: Deduplicate dashboard event mapping into session-snapshot.ts
- M1: Match Ace Attorney keywords with or without trailing punctuation
- M3: Add SSE fixture recording docs to docs/api.md
2026-02-28 17:17:59 +00:00
parent 467d7d9a15
commit c08d3ddcc1
50 changed files with 6070 additions and 200 deletions


@@ -2,7 +2,9 @@ OPENROUTER_API_KEY=
LLM_MODEL=deepseek/deepseek-chat-v3-0324:free
LLM_MOCK=false
PORT=3000
# Set to 1 when behind one reverse proxy (Docker ingress, Nginx, Traefik, etc.)
TRUST_PROXY=
API_HOST_PORT=3001
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/juryrigged
TTS_PROVIDER=noop
LOG_LEVEL=info
@@ -27,3 +29,14 @@ TOKEN_COST_PER_1K_USD=0.002
BROADCAST_PROVIDER=noop
OBS_WEBSOCKET_URL=ws://localhost:4455
OBS_WEBSOCKET_PASSWORD=
# Replay + Recording (Phase 7)
RECORDINGS_DIR=recordings
REPLAY_FILE=
REPLAY_SPEED=1
# Twitch Integration (Phase 7)
TWITCH_CHANNEL=
TWITCH_BOT_TOKEN=
TWITCH_CLIENT_ID=
# TWITCH_EVENTSUB_SECRET= (future — not yet used)


@@ -7,7 +7,11 @@ COPY package.json package-lock.json ./
RUN npm ci
COPY tsconfig.json ./
COPY vite.config.ts ./
COPY postcss.config.js ./
COPY tailwind.config.js ./
COPY src ./src
COPY dashboard ./dashboard
COPY public ./public
COPY db ./db

README.md

@@ -11,13 +11,18 @@ This repository is standalone and does not require `subcult-corp` at runtime.
- Multi-agent role orchestration (judge, prosecutor, defense, witnesses, bailiff)
- Strict, forward-only phase progression:
- `case_prompt` → `openings` → `witness_exam` → `evidence_reveal` → `closings` → `verdict_vote` → `sentence_vote` → `final_ruling`
- Optional skip: `witness_exam` → `closings`
- Live per-session SSE stream (`/api/court/sessions/:id/stream`)
- Jury voting APIs with phase gating + anti-spam/rate-limiting
- Main viewer UI (`public/`) and React operator dashboard (`/operator`)
- In-memory or Postgres-backed persistence (auto-selected by `DATABASE_URL`)
- Optional broadcast hook integration (`noop` or `obs`) for production workflows
- **Ace Attorney-style renderer** (Phase 7): PixiJS overlay with camera presets, character poses, dialogue typewriter, effects engine, and evidence presentation
- **Structured case file**: immutable case context with witness roster, evidence inventory, and charge sheet generated at session start
- **Audience interaction**: `/press` and `/present` API endpoints + Twitch chat commands (`!press`, `!present`, `!objection`)
- **Render directives**: backend-inferred visual cues (camera, pose, face, effects) streamed alongside dialogue turns
- **NDJSON recording and deterministic replay**: record live sessions to NDJSON, replay at configurable speed
## Tech Stack
@@ -103,6 +108,7 @@ Container behavior:
- API runs on container port `3001`
- Host mapping defaults to `${API_HOST_PORT:-3001}`
- Migrations run automatically on container startup (`npm run migrate:dist`)
- `TRUST_PROXY` defaults to `1` in compose so IP-based rate limits remain accurate behind one proxy hop
Default compose endpoints:
@@ -115,54 +121,76 @@ Copy `.env.example` and tune as needed.
### Core runtime
| Variable | Purpose |
| -------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- |
| `PORT` | API port for local non-Docker runs (default in `.env.example`: `3000`) |
| `TRUST_PROXY` | Express proxy trust setting (`true`, `false`, hop count like `1`, or CIDR/subnet list) used for accurate client IP detection behind reverse proxies |
| `OPENROUTER_API_KEY` | Required for live LLM calls; empty enables deterministic mock fallback |
| `LLM_MODEL` | OpenRouter model identifier |
| `LLM_MOCK` | Force mock mode (`true`/`false`) |
| `DATABASE_URL` | Enables Postgres-backed durable store; omit for in-memory |
| `LOG_LEVEL` | `debug`, `info`, `warn`, `error` |
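`TRUST_PROXY` maps onto Express's `trust proxy` setting. As a hedged sketch (the helper name and exact normalisation are assumptions, not the project's code), the env string could be converted like so:

```typescript
// Hypothetical helper: normalise the TRUST_PROXY env string into a value
// Express accepts for app.set('trust proxy', ...). Illustrative only.
type TrustProxySetting = boolean | number | string;

function parseTrustProxy(raw: string | undefined): TrustProxySetting | undefined {
  if (raw === undefined || raw.trim() === '') return undefined; // keep Express default (off)
  const value = raw.trim();
  if (value === 'true') return true;
  if (value === 'false') return false;
  const hops = Number(value);
  if (Number.isInteger(hops) && hops >= 0) return hops; // hop count, e.g. "1"
  return value; // CIDR/subnet list, e.g. "loopback, 10.0.0.0/8"
}
```

A server would then apply the result with `app.set('trust proxy', …)` only when the parsed value is defined.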
### Voting + moderation safety
| Variable | Purpose |
| -------------------------------- | --------------------------------- |
| `VERDICT_VOTE_WINDOW_MS` | Verdict poll window duration |
| `SENTENCE_VOTE_WINDOW_MS` | Sentence poll window duration |
| `VOTE_SPAM_MAX_VOTES_PER_WINDOW` | Vote rate cap per window |
| `VOTE_SPAM_WINDOW_MS` | Rate-limit window size |
| `VOTE_SPAM_DUPLICATE_WINDOW_MS` | Duplicate-vote suppression window |
### Witness / token controls
| Variable | Purpose |
| --------------------------- | --------------------------------------------- |
| `WITNESS_MAX_TOKENS` | Max witness response tokens before truncation |
| `WITNESS_MAX_SECONDS` | Max witness response duration |
| `WITNESS_TOKENS_PER_SECOND` | Duration↔token heuristic |
| `WITNESS_TRUNCATION_MARKER` | Marker appended after truncation |
| `JUDGE_RECAP_CADENCE` | Emit recap every N witness cycles |
| `ROLE_MAX_TOKENS_*` | Per-role token budget overrides |
| `TOKEN_COST_PER_1K_USD` | Cost estimation coefficient |
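The token and duration caps interact: the duration cap is converted to tokens via `WITNESS_TOKENS_PER_SECOND`, and the tighter limit applies. A minimal sketch of that heuristic (helper name and min-based clamping are assumptions, not the project's code):

```typescript
// Illustrative sketch of the duration↔token heuristic implied by the table above.
function witnessTokenBudget(
  maxTokens: number,       // WITNESS_MAX_TOKENS
  maxSeconds: number,      // WITNESS_MAX_SECONDS
  tokensPerSecond: number, // WITNESS_TOKENS_PER_SECOND
): number {
  // The duration cap converts to a token count; the tighter cap wins.
  return Math.min(maxTokens, Math.floor(maxSeconds * tokensPerSecond));
}
```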
### Broadcast integration
| Variable | Purpose |
| ------------------------ | -------------------------------------------- |
| `BROADCAST_PROVIDER` | `noop` or `obs` |
| `OBS_WEBSOCKET_URL` | OBS WebSocket endpoint |
| `OBS_WEBSOCKET_PASSWORD` | OBS auth password (optional but recommended) |
See `docs/broadcast-integration.md` for setup details.
### Twitch integration (Phase 7)
| Variable | Purpose |
| ------------------ | ----------------------------------------------- |
| `TWITCH_CHANNEL` | Twitch channel to monitor for audience commands |
| `TWITCH_BOT_TOKEN` | OAuth token for the Twitch bot account |
| `TWITCH_CLIENT_ID` | Twitch application client ID |
When any Twitch variable is unset, the adapter runs in noop mode (no connection).
### Replay + recording (Phase 7)
| Variable | Purpose |
| ---------------- | ---------------------------------------------------------------- |
| `RECORDINGS_DIR` | Directory for NDJSON session recordings (default: `recordings/`) |
| `REPLAY_FILE` | Path to NDJSON file to replay instead of live orchestration |
| `REPLAY_SPEED` | Playback speed multiplier (`1` = real-time, `4` = 4×) |
## API at a Glance
- `GET /api/health`
- `GET /api/metrics` (Prometheus-format telemetry)
- `GET /api/court/sessions`
- `GET /api/court/sessions/:id`
- `POST /api/court/sessions`
- `POST /api/court/sessions/:id/vote`
- `POST /api/court/sessions/:id/press` (Phase 7 — audience press)
- `POST /api/court/sessions/:id/present` (Phase 7 — present evidence)
- `POST /api/court/sessions/:id/phase`
- `GET /api/court/sessions/:id/stream` (SSE)
@@ -172,18 +200,69 @@ Full schemas, error codes, and event contracts: `docs/api.md`.
### npm scripts
| Command | Description |
| ------------------------- | ----------------------------------------- |
| `npm run dev` | Start API in watch mode (`src/server.ts`) |
| `npm run dev:dashboard` | Start Vite dashboard dev server |
| `npm run build` | Compile TS to `dist/` + build dashboard |
| `npm run build:dashboard` | Build dashboard only |
| `npm run start` | Run compiled server (`dist/server.js`) |
| `npm run migrate` | Run migrations from source (`tsx`) |
| `npm run record:sse` | Record SSE session fixture |
| `npm run migrate:dist` | Run migrations from compiled output |
| `npm test` | Run Node test suite |
| `npm run test:ops` | Run ops config tests |
| `npm run smoke:staging` | Run staging smoke script |
### SSE fixture record + replay
Record a live SSE stream to a fixture file:
```bash
npm run record:sse -- --session <SESSION_ID>
```
Defaults:
- base URL: `http://127.0.0.1:${PORT}`
- output: `public/fixtures/sse-<SESSION_ID>-<timestamp>.json`
Optional flags:
- `--base <url>`
- `--out <absolute-or-relative-path>`
- `--max-events <number>`
- `--duration-ms <number>`
Replay a fixture in the browser overlay by adding a query param:
```text
http://localhost:3000/?replayFixture=/fixtures/<fixture-file>.json
```
When fixture replay mode is enabled, live SSE is disabled and recorded events are replayed with their captured offsets.
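The mode switch can be sketched as a small helper that reads the query parameter (illustrative only; the overlay's actual code may differ):

```typescript
// Hypothetical helper: return the fixture URL when replay mode is requested,
// or null when the overlay should use live SSE.
function getReplayFixtureUrl(search: string): string | null {
  const params = new URLSearchParams(search);
  const fixture = params.get('replayFixture');
  return fixture && fixture.length > 0 ? fixture : null;
}
```

In the browser this would be called with `window.location.search`.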
### Server-side NDJSON recording and replay (Phase 7)
During normal live runs, session events are recorded to NDJSON files in `recordings/<SESSION_ID>.ndjson` (override with `RECORDINGS_DIR`).
Start server replay mode from an existing NDJSON file:
```bash
npm run dev -- --replay recordings/<SESSION_ID>.ndjson --speed 4
```
Equivalent environment-variable mode:
```bash
REPLAY_FILE=recordings/<SESSION_ID>.ndjson REPLAY_SPEED=4 npm run dev
```
Notes:
- `--speed` / `REPLAY_SPEED` controls playback rate (`1` = real-time, `4` = 4× faster).
- In replay mode, orchestration is disabled and SSE emits recorded events with captured inter-event timing.
- Existing viewer and operator UIs connect to replay mode using the same SSE endpoint.
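Illustratively, each line of a recording file is one JSON-encoded event. The field names below follow the `CourtEvent` shape documented in `docs/api.md`, but the exact recorded fields and all values are invented placeholders:

```
{"type":"session_started","sessionId":"0b1c2d3e","at":"2026-02-28T17:18:00.000Z","payload":{"sessionId":"0b1c2d3e","startedAt":"2026-02-28T17:18:00.000Z"}}
{"type":"phase_changed","sessionId":"0b1c2d3e","at":"2026-02-28T17:18:05.000Z","payload":{"phase":"openings","phaseStartedAt":"2026-02-28T17:18:05.000Z","phaseDurationMs":30000}}
```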
### Make targets
@@ -201,8 +280,10 @@ npm test
## Repository Layout
- `src/` — server, orchestrator, store, moderation, broadcast, Twitch adapter, tests
- `public/` — viewer UI + overlay
- `public/renderer/` — modular PixiJS renderer (stage, layers, camera, dialogue, effects)
- `public/assets/` — placeholder-first assets (backgrounds, characters, UI, fonts, SFX)
- `dashboard/` — operator dashboard (React + Vite)
- `db/migrations/` — SQL schema migrations
- `docs/` — architecture, API, moderation, ops runbooks
@@ -210,18 +291,18 @@ npm test
## Documentation Map
| Document | Description |
| ----------------------------------------- | ------------------------------------------------- |
| `docs/ADR-001-juryrigged-architecture.md` | Core architectural decisions and invariants |
| `docs/architecture.md` | System components and phase sequencing |
| `docs/api.md` | REST + SSE contracts and schemas |
| `docs/coding-conventions.md` | Team coding style and maintainability conventions |
| `docs/operator-runbook.md` | Operator procedures and incident response |
| `docs/ops-runbook.md` | Staging deploy path, SLI/alert definitions |
| `docs/moderation-playbook.md` | Moderation policy and handling |
| `docs/event-taxonomy.md` | Event taxonomy and payload expectations |
| `docs/broadcast-integration.md` | OBS/broadcast automation + Twitch integration |
| `docs/phase5-6-implementation-plan.md` | Roadmap implementation plan |
## Notes


@@ -1,9 +1,40 @@
import React, { Suspense, lazy, useCallback, useEffect, useState } from 'react';
import { SessionMonitor } from './components/SessionMonitor';
import { useSSE } from './hooks/useSSE';
import { applyEventToSnapshot, mapSessionToSnapshot } from './session-snapshot';
import type { CourtEvent, SessionSnapshot } from './types';
type UnknownRecord = Record<string, unknown>;
const SESSION_DISCOVERY_INTERVAL_MS = 5_000;
function asRecord(value: unknown): UnknownRecord {
  return typeof value === 'object' && value !== null && !Array.isArray(value)
    ? (value as UnknownRecord)
    : {};
}

function asString(value: unknown): string | null {
  return typeof value === 'string' ? value : null;
}

function resolvePreferredSessionId(response: unknown): string | null {
  const payload = asRecord(response);
  const sessions = Array.isArray(payload.sessions)
    ? payload.sessions
    : ([] as unknown[]);
  if (sessions.length === 0) {
    return null;
  }
  const running = sessions.find(
    candidate => asRecord(candidate).status === 'running',
  );
  const selected = asRecord(running ?? sessions[0]);
  return asString(selected.id) ?? asString(selected.sessionId);
}
type DashboardTabId = 'monitor' | 'moderation' | 'controls' | 'analytics';
const loadModerationQueue = () => import('./components/ModerationQueue');
@@ -71,6 +102,7 @@ function App() {
const handleSSEEvent = useCallback((event: CourtEvent) => {
setEvents(prev => [...prev, event]);
setSessionSnapshot(current => applyEventToSnapshot(current, event));
}, []);
const handleSSESnapshot = useCallback(
@@ -100,43 +132,40 @@ function App() {
useEffect(() => {
  let cancelled = false;

  const syncSessionId = async () => {
    try {
      const res = await fetch('/api/court/sessions');
      if (!res.ok) {
        throw new Error(`Unexpected status ${res.status}`);
      }
      const sessionsResponse = await res.json();
      if (cancelled) {
        return;
      }
      const nextSessionId = resolvePreferredSessionId(sessionsResponse);
      setSessionId(current =>
        current === nextSessionId ? current : nextSessionId,
      );
    } catch (err) {
      console.error('Failed to fetch session:', err);
    }
  };

  void syncSessionId().finally(() => {
    if (!cancelled) {
      setSessionLookupLoading(false);
    }
  });

  const intervalId = setInterval(() => {
    void syncSessionId();
  }, SESSION_DISCOVERY_INTERVAL_MS);

  return () => {
    clearInterval(intervalId);
    cancelled = true;
  };
}, []);


@@ -1,4 +1,9 @@
import type {
  CourtEvent,
  SessionSnapshot,
  TranscriptEntry,
  VoteCount,
} from './types';
const DEFAULT_MAX_WITNESS_STATEMENTS = 3;
const DEFAULT_RECAP_INTERVAL = 2;
@@ -67,6 +72,7 @@ function buildTranscript(
new Date().toISOString();
transcript.push({
turnId: turnId ?? undefined,
speaker,
content,
timestamp,
@@ -147,3 +153,103 @@ export function mapSessionToSnapshot(
config: buildConfig(metadata),
};
}
function buildVerdictVotesFromPayload(rawVotes: unknown): VoteCount {
  const votes = asRecord(rawVotes);
  const directGuilty = asNumber(votes.guilty);
  const civilGuilty = asNumber(votes.liable);
  const guilty = directGuilty > 0 ? directGuilty : civilGuilty;
  const innocent =
    asNumber(votes.not_guilty) > 0
      ? asNumber(votes.not_guilty)
      : asNumber(votes.not_liable);
  const total = Object.values(votes).reduce(
    (sum, value) => sum + asNumber(value),
    0,
  );
  return { guilty, innocent, total };
}

export function applyEventToSnapshot(
  current: SessionSnapshot | null,
  event: CourtEvent,
): SessionSnapshot | null {
  if (!current || current.sessionId !== event.sessionId) {
    return current;
  }
  const payload = asRecord(event.payload);
  switch (event.type) {
    case 'phase_changed': {
      const phase = asString(payload.phase);
      if (!phase) return current;
      return { ...current, phase };
    }
    case 'turn': {
      const turn = asRecord(payload.turn);
      const turnId = asString(turn.id);
      const speaker = asString(turn.speaker) ?? 'Unknown';
      const content = asString(turn.dialogue) ?? asString(turn.content) ?? '';
      if (
        turnId &&
        current.transcript.some(entry => entry.turnId === turnId)
      ) {
        return current;
      }
      const timestamp =
        asString(turn.createdAt) ?? asString(turn.at) ?? event.at;
      return {
        ...current,
        transcript: [
          ...current.transcript,
          {
            turnId: turnId ?? undefined,
            speaker,
            content,
            timestamp,
            isRecap: false,
          },
        ],
      };
    }
    case 'judge_recap_emitted': {
      const turnId = asString(payload.turnId);
      if (!turnId) return current;
      let didMarkRecap = false;
      const transcript = current.transcript.map(entry => {
        if (entry.turnId !== turnId || entry.isRecap) return entry;
        didMarkRecap = true;
        return { ...entry, isRecap: true };
      });
      if (!didMarkRecap) return current;
      const recapCount = transcript.reduce(
        (sum, entry) => sum + (entry.isRecap ? 1 : 0),
        0,
      );
      return { ...current, transcript, recapCount };
    }
    case 'vote_updated': {
      return {
        ...current,
        votes: {
          ...current.votes,
          verdict: buildVerdictVotesFromPayload(payload.verdictVotes),
        },
      };
    }
    default:
      return current;
  }
}


@@ -78,6 +78,7 @@ export interface SessionSnapshot {
}
export interface TranscriptEntry {
turnId?: string;
speaker: string;
content: string;
timestamp: string;


@@ -14,6 +14,8 @@ services:
interval: 5s
timeout: 5s
retries: 10
networks:
- internal
api:
build:
@@ -26,6 +28,7 @@ services:
condition: service_healthy
environment:
PORT: 3001
TRUST_PROXY: ${TRUST_PROXY:-1}
DATABASE_URL: postgresql://postgres:postgres@db:5432/juryrigged
OPENROUTER_API_KEY: ${OPENROUTER_API_KEY:-}
LLM_MODEL: ${LLM_MODEL:-deepseek/deepseek-chat-v3-0324:free}
@@ -36,7 +39,15 @@ services:
VERDICT_VOTE_WINDOW_MS: ${VERDICT_VOTE_WINDOW_MS:-20000}
SENTENCE_VOTE_WINDOW_MS: ${SENTENCE_VOTE_WINDOW_MS:-20000}
ports:
- '127.0.0.1:${API_HOST_PORT:-3001}:3001'
networks:
- internal
- web
volumes:
postgres_data:
networks:
internal:
web:
external: true


@@ -21,6 +21,21 @@ Returns service liveness.
---
### `GET /api/metrics`
Returns Prometheus-format telemetry metrics. Includes session lifecycle counters, phase transitions, vote latency histograms, SSE connection gauges, and default Node.js process metrics (prefixed `juryrigged_`).
**Response `200`** — `Content-Type: text/plain; version=0.0.4; charset=utf-8`
```
# HELP juryrigged_votes_cast_total Total number of accepted jury votes
# TYPE juryrigged_votes_cast_total counter
juryrigged_votes_cast_total{vote_type="verdict"} 5
...
```
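For operators, a minimal Prometheus scrape job for this endpoint might look like the following (the job name and target address are illustrative, not part of the project's config):

```yaml
scrape_configs:
  - job_name: juryrigged
    metrics_path: /api/metrics
    static_configs:
      - targets: ['127.0.0.1:3001']
```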
---
## Sessions
### `GET /api/court/sessions`
@@ -53,12 +68,12 @@ Creates and immediately starts a new court session.
**Request body**
| Field             | Type                    | Required | Description |
| ----------------- | ----------------------- | -------- | ----------- |
| `topic`           | `string`                | ❌       | Case description. When provided, must be at least 10 characters and will be screened for safety. If omitted or empty, falls back to a prompt-bank entry. |
| `caseType`        | `"criminal" \| "civil"` | ❌       | Defaults to `"criminal"`. |
| `participants`    | `AgentId[]`             | ❌       | List of agent IDs to include. Defaults to all six agents. Must be at least 4 valid IDs. |
| `sentenceOptions` | `string[]`              | ❌       | Custom sentence choices for the sentencing poll. Defaults to five built-in options. |
**Response `201`** — `{ "session": <CourtSession> }`
@@ -132,6 +147,44 @@ Additional error codes:
---
### `POST /api/court/sessions/:id/press`
Audience "press" action during witness examination. Emits a shake render directive targeting the active witness camera.
**Request body** — none required
**Response `200`**
```json
{ "ok": true, "action": "press" }
```
**Response `404`** — session not found (`SESSION_NOT_FOUND`)
---
### `POST /api/court/sessions/:id/present`
Audience "present evidence" action. Emits a `take_that` render directive with the specified evidence and switches camera to the evidence view.
**Request body**
| Field | Type | Required | Description |
| ------------ | -------- | -------- | ------------------------------ |
| `evidenceId` | `string` | ✅ | ID of the evidence to present. |
**Response `200`**
```json
{ "ok": true, "action": "present", "evidenceId": "exhibit_a" }
```
**Response `400`** — `MISSING_EVIDENCE_ID` — `evidenceId` is required
**Response `404`** — session not found (`SESSION_NOT_FOUND`)
---
### `POST /api/court/sessions/:id/phase`
Manually advance or set the session phase (operator use).
@@ -284,21 +337,55 @@ Every SSE payload is a `CourtEvent`:
### Event Types
| Type | When emitted | Key payload fields |
| ------------------------- | ------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `snapshot` | Immediately on SSE connect | `session`, `turns`, `verdictVotes`, `sentenceVotes`, `recapTurnIds` |
| `session_created` | Session record inserted | `sessionId` |
| `session_started` | Orchestration begins | `sessionId`, `startedAt` |
| `phase_changed` | Phase advances | `phase`, `phaseStartedAt`, `phaseDurationMs` |
| `turn` | A new dialogue turn is stored | `turn: CourtTurn` |
| `vote_updated` | A vote is successfully cast | `voteType`, `choice`, `verdictVotes`, `sentenceVotes` |
| `vote_closed` | Transitioned away from a vote phase; includes frozen tally snapshot | `pollType`, `closedAt`, `votes`, `nextPhase` |
| `witness_response_capped` | Witness response was truncated due to caps | `turnId`, `speaker`, `phase`, `originalLength`, `truncatedLength`, `reason` |
| `judge_recap_emitted` | Judge recap emitted during witness exam | `turnId`, `phase`, `cycleNumber` |
| `token_budget_applied` | Per-role token budget applied to generated turn | `turnId`, `speaker`, `role`, `phase`, `requestedMaxTokens`, `appliedMaxTokens`, `roleMaxTokens`, `source` |
| `session_token_estimate` | Cumulative session token/cost estimate updated | `turnId`, `role`, `phase`, `estimatedPromptTokens`, `estimatedCompletionTokens`, `cumulativeEstimatedTokens`, `costPer1kTokensUsd`, `estimatedCostUsd` |
| `analytics_event` | Poll open/close lifecycle events | `name`, `pollType`, `phase?`, `choice?` |
| `moderation_action` | Turn content was flagged and redacted | `turnId`, `speaker`, `reasons`, `phase` |
| `vote_spam_blocked` | Vote rejected due to rate limiting or duplicate detection | `ip`, `voteType`, `reason`, `retryAfterMs` |
| `session_completed` | Session reached `final_ruling` successfully | `sessionId`, `completedAt` |
| `session_failed` | Orchestration threw an unrecoverable error | `sessionId`, `reason` |
| `render_directive` | Visual/audio directive emitted for renderer | `directive`, `phase`, `emittedAt` |
| `case_file_generated` | Structured case file built at session start | `caseFile`, `sessionId`, `generatedAt` |
| `witness_statement` | Witness testimony statement emitted | `statement`, `phase`, `emittedAt` |
| `evidence_revealed` | Evidence item revealed during examination | `evidenceId`, `label`, `phase` |
---
## SSE Fixture Recording
Record a live SSE stream to a fixture file:
```bash
npm run record:sse -- --session <SESSION_ID>
```
Defaults:
- Base URL: `http://127.0.0.1:${PORT}`
- Output: `public/fixtures/sse-<SESSION_ID>-<timestamp>.json`
Optional flags:
| Flag | Description |
| ------------------- | ----------------------------- |
| `--base <url>` | Override the SSE base URL |
| `--out <path>` | Override the output file path |
| `--max-events <n>` | Stop after recording N events |
| `--duration-ms <n>` | Stop after N milliseconds |
Replay a fixture in the browser overlay by adding a query param:
```
http://localhost:3000/?replayFixture=/fixtures/<fixture-file>.json
```


@@ -301,6 +301,74 @@ All broadcast hooks emit telemetry events via SSE:
---
## Twitch Chat Integration (Phase 7)
> **Phase 7 Feature**: Audience interaction through Twitch chat commands and channel points.
### Overview
The Twitch adapter reads chat messages in a configured channel and forwards recognised commands to the session orchestrator. When Twitch credentials are absent, the adapter is a no-op — no errors, no connections attempted.
### Supported Commands
| Command | Effect |
| ------------------------- | --------------------------------------------------------------------- |
| `!press` | Audience presses the current witness — emits a shake render directive |
| `!present <evidence_id>` | Presents evidence — emits a `take_that` render directive |
| `!objection` | Increments the audience objection counter |
### Configuration
Add to `.env`:
```bash
# Twitch integration (Phase 7)
TWITCH_CHANNEL=your_channel_name
TWITCH_BOT_TOKEN=oauth:your_bot_token
TWITCH_CLIENT_ID=your_client_id
```
When any of these three variables is empty or missing, the adapter defaults to noop mode.
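That gating check can be sketched as a small predicate (hypothetical helper name; the adapter's real implementation may differ):

```typescript
// Sketch: the adapter connects only when all three Twitch variables are
// non-empty; otherwise it stays in noop mode.
function isTwitchConfigured(env: Record<string, string | undefined>): boolean {
  return ['TWITCH_CHANNEL', 'TWITCH_BOT_TOKEN', 'TWITCH_CLIENT_ID'].every(
    key => (env[key] ?? '').trim().length > 0,
  );
}
```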
### API Endpoints
The audience interaction also works via HTTP (for non-Twitch integrations):
- `POST /api/court/sessions/:id/press` — press the current witness
- `POST /api/court/sessions/:id/present` — present evidence (`{ "evidenceId": "..." }`)
Both endpoints emit `render_directive` SSE events consumed by the overlay renderer.
### Architecture
```
Twitch IRC ──► TwitchAdapter.onCommand() ──► wireTwitchToSession()
├──► store.emitEvent('objection_count_changed')
├──► store.emitEvent('render_directive')
└──► console.info (press/present logging)
HTTP POST /press ──► server handler ──► store.emitEvent('render_directive')
HTTP POST /present ──► server handler ──► store.emitEvent('render_directive')
```
### Fail-Safe Behavior
The same non-blocking isolation guarantees apply:
1. Twitch adapter failures never crash the session
2. Missing credentials default to noop (no connection attempt)
3. Malformed commands are silently ignored
4. Command parsing is strict: only `!press`, `!present`, `!objection` are recognised
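The strict-parsing rule above can be sketched as follows (hypothetical names and types; the adapter's real code may differ):

```typescript
// Sketch: recognise exactly !press, !objection, and !present <id>;
// anything else yields null and is silently ignored.
type AudienceCommand =
  | { kind: 'press' }
  | { kind: 'present'; evidenceId: string }
  | { kind: 'objection' };

function parseChatCommand(message: string): AudienceCommand | null {
  const trimmed = message.trim();
  if (trimmed === '!press') return { kind: 'press' };
  if (trimmed === '!objection') return { kind: 'objection' };
  const match = /^!present\s+(\S+)$/.exec(trimmed);
  if (match) return { kind: 'present', evidenceId: match[1] };
  return null;
}
```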
### Current Limitations
- IRC socket connection is deferred (placeholder-first); the adapter logs commands but does not connect to Twitch IRC yet
- Channel-point redemption mapping (via EventSub) is planned for a future iteration
- Per-loop audience action rate limits are enforced at the API layer but not yet at the Twitch command layer
---
## Advanced Configuration
### Custom Adapter Implementation


@@ -643,9 +643,147 @@ Emitted when the objection counter increments (typically on moderation flags).
---
### `render_directive` (Phase 7)
Emitted when a visual/audio render directive is generated for the overlay renderer.
Directives are inferred from role, phase, and dialogue content (e.g. objection keywords trigger effects).
**Severity:** `info`
**Payload**
```ts
{
directive: {
camera?: CameraPreset; // 'wide' | 'judge' | 'prosecution' | 'defense' | 'witness' | 'evidence' | 'verdict'
poses?: Partial<Record<CourtRole, CharacterPose>>; // e.g. { prosecutor: 'point' }
faces?: Partial<Record<CourtRole, CharacterFace>>; // e.g. { prosecutor: 'angry' }
effect?: string; // 'objection' | 'hold_it' | 'take_that' | 'flash' | 'shake' | 'freeze'
evidencePresent?: string; // evidence ID to display
};
phase: CourtPhase;
emittedAt: string; // ISO 8601
}
```
**Example**
```json
{
"type": "render_directive",
"payload": {
"directive": {
"camera": "prosecution",
"effect": "objection",
"poses": { "prosecutor": "point" }
},
"phase": "witness_exam",
"emittedAt": "2024-01-15T10:03:00.000Z"
}
}
```
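The keyword path of the inference can be sketched as follows. This is a simplified illustration only (the real inference also weighs role and phase); it does show the M1 fix of matching keywords with or without trailing punctuation:

```typescript
// Map of dialogue keywords to effect names; sketch values, not the
// renderer's canonical list.
const KEYWORD_EFFECTS: Record<string, string> = {
  'objection': 'objection',
  'hold it': 'hold_it',
  'take that': 'take_that',
};

function inferEffect(dialogue: string): string | null {
  // Normalise case and strip trailing punctuation so "OBJECTION!" and
  // "objection" both match (M1 fix).
  const normalised = dialogue.toLowerCase().replace(/[!?.]+$/, '').trim();
  for (const [keyword, effect] of Object.entries(KEYWORD_EFFECTS)) {
    if (normalised.endsWith(keyword)) {
      return effect;
    }
  }
  return null;
}
```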
---
### `case_file_generated` (Phase 7)
Emitted once at session start when the structured case file is built.
Contains the immutable case summary, charges, witness roster, and evidence inventory.
**Severity:** `info`
**Payload**
```ts
{
caseFile: {
title: string;
synopsis: string;
charges: string[];
witnesses: Array<{ role: string; agentId: string; name: string }>;
evidence: Array<{ id: string; label: string; description: string }>;
sentenceOptions: string[];
};
sessionId: string;
generatedAt: string; // ISO 8601
}
```
**Example**
```json
{
"type": "case_file_generated",
"payload": {
"caseFile": {
"title": "The People v. Office Thermostat",
"synopsis": "The defendant is accused of stealing the office thermostat",
"charges": ["As stated in case prompt"],
"witnesses": [
{ "role": "witness_1", "agentId": "thaum", "name": "Thaum" }
],
"evidence": [
{
"id": "exhibit_a",
"label": "Exhibit A",
"description": "Placeholder evidence"
}
],
"sentenceOptions": ["Community service", "Fine"]
},
"sessionId": "f5d6e7f8-…",
"generatedAt": "2024-01-15T10:00:02.000Z"
}
}
```
---

### `witness_statement` (Phase 7)
Emitted each time a witness delivers testimony during examination.
Statements are accumulated in `session.metadata.witnessStatements` for press/present mechanics.
**Severity:** `info`
**Payload**
```ts
{
statement: {
witnessRole: string; // e.g. 'witness_1'
agentId: AgentId;
statementText: string;
issuedAt: string; // ISO 8601
  };
phase: CourtPhase;
emittedAt: string; // ISO 8601
}
```
**Example**
```json
{
"type": "witness_statement",
"payload": {
"statement": {
"witnessRole": "witness_1",
"agentId": "thaum",
"statementText": "I saw the defendant near the thermostat at approximately 3 PM.",
"issuedAt": "2024-01-15T10:02:15.000Z"
},
"phase": "witness_exam",
"emittedAt": "2024-01-15T10:02:15.000Z"
}
}
```
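Since statements accumulate in `session.metadata.witnessStatements`, a press action resolves against that log. A minimal sketch (helper name and lookup rule are ours, not the server's):

```typescript
// Shape mirroring entries of session.metadata.witnessStatements.
interface WitnessStatement {
  witnessRole: string;
  statementText: string;
  issuedAt: string; // ISO 8601
}

// Assume a press targets the most recent statement by the named witness.
function latestStatementFor(
  statements: WitnessStatement[],
  witnessRole: string,
): WitnessStatement | null {
  for (let i = statements.length - 1; i >= 0; i--) {
    if (statements[i].witnessRole === witnessRole) {
      return statements[i];
    }
  }
  return null;
}
```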
---
## Severity levels
- **info**: `session_created`, `session_started`, `phase_changed`, `turn`, `vote_updated`, `vote_closed`, `witness_response_capped`, `judge_recap_emitted`, `token_budget_applied`, `session_token_estimate`, `analytics_event`, `session_completed`, `broadcast_hook_triggered`, `evidence_revealed`, `objection_count_changed`
- **info**: `session_created`, `session_started`, `phase_changed`, `turn`, `vote_updated`, `vote_closed`, `witness_response_capped`, `judge_recap_emitted`, `token_budget_applied`, `session_token_estimate`, `analytics_event`, `session_completed`, `broadcast_hook_triggered`, `evidence_revealed`, `objection_count_changed`, `render_directive`, `case_file_generated`, `witness_statement`
- **warn**: `moderation_action`, `vote_spam_blocked`, `broadcast_hook_failed`
- **error**: `session_failed`


@@ -33,14 +33,14 @@ cp .env.example .env
Key variables:
| Variable | Default | Description |
|--------------------------|--------------------------------------|-------------|
| `OPENROUTER_API_KEY` | *(empty)* | Set to enable real LLM calls. Leave empty for deterministic mock mode. |
| `LLM_MODEL` | `deepseek/deepseek-chat-v3-0324:free`| OpenRouter model identifier. |
| `PORT` | `3001` | Port the HTTP server listens on. |
| `DATABASE_URL` | *(empty)* | Postgres connection string. Omit for in-memory mode. |
| `VERDICT_VOTE_WINDOW_MS` | `20000` | Duration of the verdict poll in milliseconds. |
| `SENTENCE_VOTE_WINDOW_MS`| `20000` | Duration of the sentence poll in milliseconds. |
| Variable | Default | Description |
| ------------------------- | ------------------------------------- | ---------------------------------------------------------------------- |
| `OPENROUTER_API_KEY` | _(empty)_ | Set to enable real LLM calls. Leave empty for deterministic mock mode. |
| `LLM_MODEL` | `deepseek/deepseek-chat-v3-0324:free` | OpenRouter model identifier. |
| `PORT` | `3001` | Port the HTTP server listens on. |
| `DATABASE_URL` | _(empty)_ | Postgres connection string. Omit for in-memory mode. |
| `VERDICT_VOTE_WINDOW_MS` | `20000` | Duration of the verdict poll in milliseconds. |
| `SENTENCE_VOTE_WINDOW_MS` | `20000` | Duration of the sentence poll in milliseconds. |
### 1.3 (Optional) Run database migrations
@@ -89,7 +89,7 @@ To expose Postgres to the host (e.g., for `psql` inspection), add a `ports` entr
```yaml
ports:
- "5433:5432"
- '5433:5432'
```
---
@@ -203,14 +203,14 @@ To test with very short windows (e.g., smoke tests): set `VERDICT_VOTE_WINDOW_MS
The server logs to `stdout`/`stderr`. Key log prefixes:
| Prefix | Meaning |
|---------------------|---------|
| `[moderation]` | A turn was flagged and redacted. Includes session ID, speaker, and reason codes. |
| `[vote-spam]` | A vote was blocked by the rate limiter. Includes IP and session ID. |
| Prefix | Meaning |
| -------------- | -------------------------------------------------------------------------------- |
| `[moderation]` | A turn was flagged and redacted. Includes session ID, speaker, and reason codes. |
| `[vote-spam]` | A vote was blocked by the rate limiter. Includes IP and session ID. |
All session events are also emitted to the SSE stream (see [api.md](./api.md#sse-event-contracts)).
There is no built-in metrics endpoint. For production observability, pipe logs to a structured logging system or attach to the SSE stream.
A Prometheus-compatible metrics endpoint is available at `GET /api/metrics`. It exposes session lifecycle counters, phase transition totals, vote latency histograms, SSE connection gauges, and default Node.js process metrics (prefixed `juryrigged_`). For production observability, scrape `/api/metrics` with Prometheus or a compatible collector, and pipe logs to a structured logging system.
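A minimal scrape configuration for this endpoint might look like the following (the job name and target are illustrative; adjust to your deployment):

```yaml
scrape_configs:
  - job_name: 'juryrigged'           # illustrative job name
    metrics_path: /api/metrics
    static_configs:
      - targets: ['localhost:3000']  # adjust host:port to your deployment
```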
---
@@ -247,20 +247,27 @@ Use this section during real shows. Dashboard/alert configuration details are do
### 9.1 Startup checklist (T-30 minutes to T-5 minutes before show start)
1. **Environment sanity**
- Confirm `OPENROUTER_API_KEY` present for live mode (or `LLM_MOCK=true` for rehearsal).
- Confirm vote windows and token/recap knobs are set for this show.
- Confirm `OPENROUTER_API_KEY` present for live mode (or `LLM_MOCK=true` for rehearsal).
- Confirm vote windows and token/recap knobs are set for this show.
2. **System readiness**
- Confirm `GET /api/health` is green.
- Confirm API health probe (hard-down threshold from `docs/ops-runbook.md` Section 3) is passing.
- Confirm `GET /api/health` is green.
- Confirm API health probe (hard-down threshold from `docs/ops-runbook.md` Section 3) is passing.
3. **Dry session smoke**
- Create one session and verify SSE stream connects.
- Confirm metric movement in:
- Create one session and verify SSE stream connects.
- Confirm metric movement in:
- SLI A — Session completion rate (`docs/ops-runbook.md` Section 2)
- SLI B — Vote API latency p95 (`docs/ops-runbook.md` Section 2)
- SLI C — Moderation events per 15m (`docs/ops-runbook.md` Section 2)
4. **Alert routing check**
- Validate pager/notification channel receives one test alert payload.
- Verify alert payload contains runbook link back to this document.
- Validate pager/notification channel receives one test alert payload.
- Verify alert payload contains runbook link back to this document.
### 9.2 Live show checklist (continuous)
@@ -333,8 +340,10 @@ Use when fairness, safety, or service stability cannot be restored quickly.
1. Announce mistrial to viewers in the broadcast layer.
2. Move session to closure quickly using **forward-only** phase steps via `POST /api/court/sessions/:id/phase`.
- If in `witness_exam`, move to `closings`.
- Then advance through `verdict_vote` → `sentence_vote` → `final_ruling` with short durations if needed.
- If in `witness_exam`, move to `closings`.
- Then advance through `verdict_vote` → `sentence_vote` → `final_ruling` with short durations if needed.
3. Record incident details and open retrospective action items.
### 11.2 Emergency recap procedure
@@ -363,12 +372,12 @@ Workaround:
## 12 — Dashboard and alert reference map
| Operational concern | Dashboard panel ID | Alert ID |
| --- | --- | --- |
| Session completion health | `session_completion_rate_15m` | `session_completion_rate_low` |
| Vote API responsiveness | `vote_latency_p95_10m` | `vote_latency_high` |
| Moderation intensity | `moderation_events_15m` | `moderation_spike` |
| API + stream liveliness | `stream_and_api_health` | `api_hard_down`, `stream_connectivity_degraded` |
| Operational concern | Dashboard panel ID | Alert ID |
| ------------------------- | ----------------------------- | ----------------------------------------------- |
| Session completion health | `session_completion_rate_15m` | `session_completion_rate_low` |
| Vote API responsiveness | `vote_latency_p95_10m` | `vote_latency_high` |
| Moderation intensity | `moderation_events_15m` | `moderation_spike` |
| API + stream liveliness | `stream_and_api_health` | `api_hard_down`, `stream_connectivity_degraded` |
---

package-lock.json generated

@@ -12,6 +12,7 @@
"express": "^4.21.2",
"express-rate-limit": "^8.2.1",
"postgres": "^3.4.5",
"prom-client": "^15.1.3",
"react": "^18.3.1",
"react-dom": "^18.3.1"
},
@@ -905,6 +906,15 @@
"node": ">= 8"
}
},
"node_modules/@opentelemetry/api": {
"version": "1.9.0",
"resolved": "https://registry.npmjs.org/@opentelemetry/api/-/api-1.9.0.tgz",
"integrity": "sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg==",
"license": "Apache-2.0",
"engines": {
"node": ">=8.0.0"
}
},
"node_modules/@rolldown/pluginutils": {
"version": "1.0.0-beta.27",
"resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz",
@@ -1592,6 +1602,12 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/bintrees": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/bintrees/-/bintrees-1.0.2.tgz",
"integrity": "sha512-VOMgTMwjAaUG580SXn3LacVgjurrbMme7ZZNYGSSV7mmtY6QQRh0Eg3pwIcntQ77DErK1L0NxkbetjcoXzVwKw==",
"license": "MIT"
},
"node_modules/body-parser": {
"version": "1.20.4",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.4.tgz",
@@ -2974,6 +2990,19 @@
"url": "https://github.com/sponsors/porsager"
}
},
"node_modules/prom-client": {
"version": "15.1.3",
"resolved": "https://registry.npmjs.org/prom-client/-/prom-client-15.1.3.tgz",
"integrity": "sha512-6ZiOBfCywsD4k1BN9IX0uZhF+tJkV8q8llP64G5Hajs4JOeVLPCwpPVcpXy3BwYiUGgyJzsJJQeOIv7+hDSq8g==",
"license": "Apache-2.0",
"dependencies": {
"@opentelemetry/api": "^1.4.0",
"tdigest": "^0.1.1"
},
"engines": {
"node": "^16 || ^18 || >=20"
}
},
"node_modules/proxy-addr": {
"version": "2.0.7",
"resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
@@ -3478,6 +3507,15 @@
"node": ">=14.0.0"
}
},
"node_modules/tdigest": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/tdigest/-/tdigest-0.1.2.tgz",
"integrity": "sha512-+G0LLgjjo9BZX2MfdvPfH+MKLCrxlXSYec5DaPYP1fe6Iyhf0/fSmJ0bFiZ1F8BT6cGXl2LpltQptzjXKWEkKA==",
"license": "MIT",
"dependencies": {
"bintrees": "1.0.2"
}
},
"node_modules/thenify": {
"version": "3.3.1",
"resolved": "https://registry.npmjs.org/thenify/-/thenify-3.3.1.tgz",


@@ -14,6 +14,7 @@
"smoke:staging": "bash ./scripts/staging-smoke.sh",
"start": "node dist/server.js",
"migrate": "tsx src/scripts/migrate.ts",
"record:sse": "tsx src/scripts/record-sse-fixture.ts",
"migrate:dist": "node dist/scripts/migrate.js",
"docker:up": "docker compose up --build",
"docker:down": "docker compose down"
@@ -22,6 +23,7 @@
"dotenv": "^17.2.3",
"express": "^4.21.2",
"express-rate-limit": "^8.2.1",
"prom-client": "^15.1.3",
"postgres": "^3.4.5",
"react": "^18.3.1",
"react-dom": "^18.3.1"


@@ -5,6 +5,7 @@ import {
resetStreamState,
shouldAppendTurn,
} from './stream-state.js';
import { createCourtRenderer } from './renderer/index.js';
const topicInput = document.getElementById('topic');
const caseTypeSelect = document.getElementById('caseType');
@@ -29,6 +30,12 @@ const phaseTimer = document.getElementById('phaseTimer');
const phaseTimerFill = document.getElementById('phaseTimerFill');
const activeSpeakerEl = document.getElementById('activeSpeaker');
const captionLineEl = document.getElementById('captionLine');
const pixiStageHost = document.getElementById('pixiStage');
const captionSkipBtn = document.getElementById('captionSkipBtn');
const captionSkipAllToggle = document.getElementById('captionSkipAll');
const captionTypewriterToggle = document.getElementById(
'captionTypewriterToggle',
);
const connectionBanner = document.getElementById('connectionBanner');
const catchupToggleBtn = document.getElementById('catchupToggle');
const catchupBody = document.getElementById('catchupBody');
@@ -41,6 +48,13 @@ let timerInterval = null;
let voteCountdownInterval = null;
let reconnectTimer = null;
let reconnectAttempts = 0;
/** @type {Awaited<ReturnType<typeof createCourtRenderer>> | null} */
let courtRenderer = null;
const runtimeSearchParams = new URLSearchParams(window.location.search);
const fixtureReplayUrl = runtimeSearchParams.get('replayFixture');
const isFixtureReplayMode =
typeof fixtureReplayUrl === 'string' && fixtureReplayUrl.length > 0;
const streamState = createStreamState();
const voteState = {
@@ -62,6 +76,24 @@ const RECONNECT_BASE_MS = 1000;
const RECONNECT_MAX_MS = 10_000;
const CATCHUP_MAX_CHARS = 220;
const TIMER_TICK_MS = 250;
const TYPEWRITER_CHARS_PER_SECOND = 48;
const fixtureReplayState = {
active: false,
timers: [],
};
const dialogueTypewriterState = {
enabled: true,
skipAll: false,
skipRequested: false,
fullText: 'Captions will appear here.',
speakerLabel: 'Waiting for first turn…',
frameId: null,
lineToken: 0,
};
// pixiOverlayState replaced by courtRenderer — see renderer/index.js
const catchupState = {
visible: true,
@@ -98,6 +130,165 @@ function pulseActiveSpeaker() {
activeSpeakerEl.classList.add('speaker-live');
}
function setCaptionText(text) {
captionLineEl.textContent = text;
if (courtRenderer?.ui?.dialogueText) {
courtRenderer.ui.dialogueText.text = text;
}
}
function setActiveSpeakerLabel(label, { pulse = false } = {}) {
activeSpeakerEl.textContent = label;
if (pulse) {
pulseActiveSpeaker();
}
if (courtRenderer?.ui?.speakerText) {
courtRenderer.ui.speakerText.text = label;
}
}
async function bootstrapCourtRenderer() {
if (!pixiStageHost) {
return;
}
courtRenderer = await createCourtRenderer(pixiStageHost);
if (courtRenderer) {
// Sync initial text state into the renderer overlay
courtRenderer.update({
phase: 'idle',
speakerLabel: activeSpeakerEl.textContent,
dialogueContent: captionLineEl.textContent,
nameplate: '',
});
}
}
function clearDialogueTypewriter() {
if (dialogueTypewriterState.frameId !== null) {
cancelAnimationFrame(dialogueTypewriterState.frameId);
dialogueTypewriterState.frameId = null;
}
}
function commitDialogueTypewriterLine() {
clearDialogueTypewriter();
dialogueTypewriterState.skipRequested = false;
setCaptionText(dialogueTypewriterState.fullText);
}
function skipDialogueTypewriter() {
if (dialogueTypewriterState.frameId === null) {
return;
}
dialogueTypewriterState.skipRequested = true;
}
function startDialogueTypewriter(turn) {
const dialogue = typeof turn.dialogue === 'string' ? turn.dialogue : '';
const role = typeof turn.role === 'string' ? turn.role : 'unknown role';
const speaker =
typeof turn.speaker === 'string' ? turn.speaker : 'unknown speaker';
const speakerLabel = `${role} · ${speaker}`;
dialogueTypewriterState.lineToken += 1;
const token = dialogueTypewriterState.lineToken;
dialogueTypewriterState.speakerLabel = speakerLabel;
dialogueTypewriterState.fullText = dialogue;
dialogueTypewriterState.skipRequested = false;
clearDialogueTypewriter();
setActiveSpeakerLabel(speakerLabel, { pulse: true });
if (
dialogueTypewriterState.skipAll ||
!dialogueTypewriterState.enabled ||
dialogue.length <= 1
) {
setCaptionText(dialogue);
return;
}
const startedAt = performance.now();
const tick = timestamp => {
if (token !== dialogueTypewriterState.lineToken) {
return;
}
if (
dialogueTypewriterState.skipRequested ||
dialogueTypewriterState.skipAll
) {
commitDialogueTypewriterLine();
return;
}
const elapsedMs = Math.max(0, timestamp - startedAt);
const characters = Math.min(
dialogue.length,
Math.max(
1,
Math.floor((elapsedMs / 1000) * TYPEWRITER_CHARS_PER_SECOND),
),
);
setCaptionText(dialogue.slice(0, characters));
if (characters >= dialogue.length) {
dialogueTypewriterState.frameId = null;
return;
}
dialogueTypewriterState.frameId = requestAnimationFrame(tick);
};
setCaptionText('');
dialogueTypewriterState.frameId = requestAnimationFrame(tick);
}
function renderDialogueControlState() {
if (captionTypewriterToggle) {
captionTypewriterToggle.textContent =
dialogueTypewriterState.enabled ? 'Typewriter: on' : (
'Typewriter: off'
);
}
if (captionSkipAllToggle) {
captionSkipAllToggle.checked = dialogueTypewriterState.skipAll;
}
}
function setTypewriterEnabled(enabled) {
dialogueTypewriterState.enabled = Boolean(enabled);
renderDialogueControlState();
if (!dialogueTypewriterState.enabled) {
commitDialogueTypewriterLine();
}
}
function setSkipAllCaptions(enabled) {
dialogueTypewriterState.skipAll = Boolean(enabled);
renderDialogueControlState();
if (dialogueTypewriterState.skipAll) {
commitDialogueTypewriterLine();
}
}
function clearFixtureReplayTimers() {
fixtureReplayState.timers.forEach(timerId => clearTimeout(timerId));
fixtureReplayState.timers = [];
fixtureReplayState.active = false;
}
const JURY_STEP_LABELS = Object.freeze({
case_prompt: 'Jury pending — court intro in progress',
openings: 'Jury listening — opening statements',
@@ -140,16 +331,19 @@ function summarizeCaseSoFar(turns) {
return 'The court has just opened. Waiting for opening statements.';
}
return clip(recent.map(turn => `${turn.speaker}: ${turn.dialogue}`).join(' · '));
return clip(
recent.map(turn => `${turn.speaker}: ${turn.dialogue}`).join(' · '),
);
}
function updateCatchupPanel(session) {
const phase = session?.phase;
const turns = session?.turns ?? [];
catchupSummaryEl.textContent = summarizeCaseSoFar(turns);
catchupMetaEl.textContent = phase
? `phase: ${phase} · ${juryStepLabel(phase)}`
: 'phase: idle · Jury pending';
catchupMetaEl.textContent =
phase ?
`phase: ${phase} · ${juryStepLabel(phase)}`
: 'phase: idle · Jury pending';
}
function recordCatchupToggleTelemetry(visible, reason) {
@@ -214,9 +408,7 @@ function appendTurn(turn, { recap = false } = {}) {
item.append(meta, body);
feed.appendChild(item);
feed.scrollTop = feed.scrollHeight;
activeSpeakerEl.textContent = `${turn.role} · ${turn.speaker}`;
pulseActiveSpeaker();
captionLineEl.textContent = turn.dialogue;
startDialogueTypewriter(turn);
}
function markTurnRecap(turnId) {
@@ -450,6 +642,10 @@ function updateTimer(phaseStartedAt, phaseDurationMs) {
}
function scheduleReconnect(sessionId) {
if (isFixtureReplayMode) {
return;
}
if (reconnectTimer || !activeSession || activeSession.id !== sessionId) {
return;
}
@@ -494,8 +690,9 @@ function handleSnapshotEvent(snapshotPayload) {
});
});
if (turns.length === 0) {
activeSpeakerEl.textContent = 'Waiting for first turn…';
captionLineEl.textContent = 'Captions will appear here.';
setActiveSpeakerLabel('Waiting for first turn…');
dialogueTypewriterState.fullText = 'Captions will appear here.';
setCaptionText(dialogueTypewriterState.fullText);
}
renderTally(verdictTallies, verdictVotes);
renderTally(sentenceTallies, sentenceVotes);
@@ -516,6 +713,7 @@ function handleSnapshotEvent(snapshotPayload) {
}
renderActions(session);
renderVoteMeta();
syncRendererState();
}
function handleTurnEvent(turnPayload) {
@@ -531,6 +729,7 @@ function handleTurnEvent(turnPayload) {
recap: isRecapTurn(streamState, turn.id),
});
updateCatchupPanel(activeSession);
syncRendererState();
}
function handleJudgeRecapEvent(recapPayload) {
@@ -549,6 +748,7 @@ function handlePhaseChangedEvent(phasePayload) {
phaseBadge.textContent = `phase: ${phasePayload.phase}`;
updateTimer(phasePayload.phaseStartedAt, phasePayload.phaseDurationMs);
updateCatchupPanel(activeSession);
syncRendererState();
if (phasePayload.phase === 'verdict_vote') {
openVoteWindow(
'verdict',
@@ -606,6 +806,67 @@ function handleAnalyticsEvent(analyticsPayload) {
}
}
/**
* Handle render_directive SSE events — forward to the PixiJS renderer.
*/
function handleRenderDirectiveEvent(directivePayload) {
if (!courtRenderer?.applyDirective) {
return;
}
const directive = directivePayload.directive;
if (directive && typeof directive === 'object') {
courtRenderer.applyDirective(directive);
}
}
/**
* Handle evidence_revealed SSE events — add card to the renderer tray.
*/
function handleEvidenceRevealedEvent(evidencePayload) {
if (!courtRenderer?.evidence) {
return;
}
const evidenceId = evidencePayload.evidenceId;
const evidenceText = evidencePayload.evidenceText;
if (typeof evidenceId === 'string' && typeof evidenceText === 'string') {
courtRenderer.evidence.addCard({ id: evidenceId, text: evidenceText });
}
}
/**
* Push the current overlay state into the PixiJS renderer (if active).
* Extracts role names from the session role assignments when available.
*/
function syncRendererState() {
if (!courtRenderer) {
return;
}
const lastTurn =
activeSession?.turns?.length > 0 ?
activeSession.turns[activeSession.turns.length - 1]
: null;
const roleNames = {};
const assignments = activeSession?.metadata?.roleAssignments;
if (assignments && typeof assignments === 'object') {
for (const [role, name] of Object.entries(assignments)) {
if (typeof name === 'string') {
roleNames[role] = name;
}
}
}
courtRenderer.update({
phase: activeSession?.phase ?? 'idle',
activeSpeakerRole: lastTurn?.role ?? null,
roleNames,
speakerLabel: dialogueTypewriterState.speakerLabel,
dialogueContent: dialogueTypewriterState.fullText,
nameplate: lastTurn ? `${lastTurn.role} · ${lastTurn.speaker}` : '',
});
}
const STREAM_EVENT_HANDLERS = {
snapshot: handleSnapshotEvent,
turn: handleTurnEvent,
@@ -616,8 +877,192 @@ const STREAM_EVENT_HANDLERS = {
session_completed: handleSessionCompletedEvent,
session_failed: handleSessionFailedEvent,
analytics_event: handleAnalyticsEvent,
render_directive: handleRenderDirectiveEvent,
evidence_revealed: handleEvidenceRevealedEvent,
};
function dispatchStreamPayload(message) {
if (!message || typeof message !== 'object') {
return;
}
const type = typeof message.type === 'string' ? message.type : null;
if (!type) {
return;
}
const handler = STREAM_EVENT_HANDLERS[type];
if (!handler) {
return;
}
const payload =
message.payload && typeof message.payload === 'object' ?
message.payload
: {};
handler(payload);
}
function readFixtureMessages(fixturePayload) {
const rawEvents =
Array.isArray(fixturePayload?.events) ? fixturePayload.events : [];
return rawEvents
.map(event => {
const offsetMs = Number(event?.offsetMs);
const delayMs =
Number.isFinite(offsetMs) ? Math.max(0, offsetMs) : 0;
const message =
event?.message && typeof event.message === 'object' ?
event.message
: event;
return { delayMs, message };
})
.filter(entry => entry.message && typeof entry.message === 'object');
}
async function fetchFixturePayload() {
if (!fixtureReplayUrl) {
throw new Error('Missing replayFixture URL');
}
const response = await fetch(fixtureReplayUrl, { cache: 'no-store' });
if (!response.ok) {
throw new Error(
`fixture fetch failed with ${response.status} ${response.statusText}`,
);
}
return await response.json();
}
function buildFixtureSessionFromSnapshot(fixturePayload) {
const replayMessages = readFixtureMessages(fixturePayload);
const snapshotEnvelope = replayMessages
.map(entry => entry.message)
.find(message => message?.type === 'snapshot');
const snapshotPayload =
snapshotEnvelope && typeof snapshotEnvelope.payload === 'object' ?
snapshotEnvelope.payload
: null;
if (
!snapshotPayload?.session ||
typeof snapshotPayload.session !== 'object'
) {
return null;
}
const session = snapshotPayload.session;
const metadata =
typeof session.metadata === 'object' && session.metadata !== null ?
session.metadata
: {};
return {
...session,
turns:
Array.isArray(snapshotPayload.turns) ? snapshotPayload.turns : [],
metadata: {
...metadata,
verdictVotes:
snapshotPayload.verdictVotes ?? metadata.verdictVotes ?? {},
sentenceVotes:
snapshotPayload.sentenceVotes ?? metadata.sentenceVotes ?? {},
recapTurnIds:
snapshotPayload.recapTurnIds ?? metadata.recapTurnIds ?? [],
},
};
}
async function replayFixtureSession(sessionId) {
if (!isFixtureReplayMode || !fixtureReplayUrl) {
return;
}
clearFixtureReplayTimers();
try {
const fixturePayload = await fetchFixturePayload();
const fixtureSessionId =
typeof fixturePayload?.sessionId === 'string' ?
fixturePayload.sessionId
: null;
if (fixtureSessionId && fixtureSessionId !== sessionId) {
setStatus(
`Fixture replay session mismatch: expected ${sessionId.slice(0, 8)}, got ${fixtureSessionId.slice(0, 8)}.`,
'error',
);
}
const replayMessages = readFixtureMessages(fixturePayload);
if (replayMessages.length === 0) {
setStatus(
'Fixture loaded, but no replay events were found.',
'error',
);
return;
}
fixtureReplayState.active = true;
setConnectionBanner(
'Fixture replay mode active. Live SSE is disabled.',
);
setStatus(`Replaying fixture (${replayMessages.length} events).`);
replayMessages.forEach(({ delayMs, message }, index) => {
const timerId = setTimeout(() => {
dispatchStreamPayload(message);
if (index === replayMessages.length - 1) {
fixtureReplayState.active = false;
setStatus('Fixture replay finished.');
}
}, delayMs);
fixtureReplayState.timers.push(timerId);
});
} catch (error) {
fixtureReplayState.active = false;
setStatus(
`Fixture replay failed: ${error instanceof Error ? error.message : String(error)}`,
'error',
);
}
}
async function hydrateFromFixtureReplay() {
if (!isFixtureReplayMode) {
return false;
}
try {
const fixturePayload = await fetchFixturePayload();
const fixtureSession = buildFixtureSessionFromSnapshot(fixturePayload);
if (!fixtureSession?.id) {
throw new Error(
'fixture does not contain a valid snapshot session',
);
}
hydrateSession(
fixtureSession,
'Loaded fixture snapshot. Replaying recorded stream.',
);
return true;
} catch (error) {
setStatus(
`Failed to hydrate fixture replay: ${error instanceof Error ? error.message : String(error)}`,
'error',
);
return false;
}
}
function connectStream(sessionId, isReconnect = false) {
if (source) {
source.close();
@@ -629,6 +1074,11 @@ function connectStream(sessionId, isReconnect = false) {
reconnectTimer = null;
}
if (isFixtureReplayMode) {
void replayFixtureSession(sessionId);
return;
}
source = new EventSource(`/api/court/sessions/${sessionId}/stream`);
source.onopen = () => {
@@ -641,12 +1091,7 @@ function connectStream(sessionId, isReconnect = false) {
source.onmessage = event => {
const payload = JSON.parse(event.data);
const handler = STREAM_EVENT_HANDLERS[payload.type];
if (!handler) {
return;
}
handler(payload.payload);
dispatchStreamPayload(payload);
};
source.onerror = () => {
@@ -659,6 +1104,64 @@ function connectStream(sessionId, isReconnect = false) {
};
}
function hydrateSession(session, statusMessage) {
clearFixtureReplayTimers();
activeSession = session;
activeSession.turns = session.turns || [];
sessionMeta.textContent = `${activeSession.id} · ${activeSession.status}`;
phaseBadge.textContent = `phase: ${activeSession.phase}`;
feed.innerHTML = '';
resetStreamState(streamState, {
turns: activeSession.turns,
recapTurnIds: activeSession.metadata.recapTurnIds ?? [],
});
activeSession.turns.forEach(turn => {
appendTurn(turn, {
recap: isRecapTurn(streamState, turn.id),
});
});
if (activeSession.turns.length === 0) {
setActiveSpeakerLabel('Waiting for first turn…');
dialogueTypewriterState.fullText = 'Captions will appear here.';
setCaptionText(dialogueTypewriterState.fullText);
}
renderTally(verdictTallies, activeSession.metadata.verdictVotes);
renderTally(sentenceTallies, activeSession.metadata.sentenceVotes);
resetVoteState();
if (activeSession.phase === 'verdict_vote') {
openVoteWindow(
'verdict',
activeSession.metadata.phaseStartedAt,
activeSession.metadata.phaseDurationMs,
);
}
if (activeSession.phase === 'sentence_vote') {
openVoteWindow(
'sentence',
activeSession.metadata.phaseStartedAt,
activeSession.metadata.phaseDurationMs,
);
}
renderActions(activeSession);
renderVoteMeta();
updateTimer(
activeSession.metadata.phaseStartedAt,
activeSession.metadata.phaseDurationMs,
);
updateCatchupPanel(activeSession);
connectStream(activeSession.id);
setStatus(statusMessage);
}
startBtn.onclick = async () => {
const topic = topicInput.value.trim();
const caseType = caseTypeSelect.value;
@@ -686,26 +1189,142 @@ startBtn.onclick = async () => {
return;
}
activeSession = data.session;
activeSession.turns = data.session.turns || [];
sessionMeta.textContent = `${activeSession.id} · ${activeSession.status}`;
phaseBadge.textContent = `phase: ${activeSession.phase}`;
feed.innerHTML = '';
renderTally(verdictTallies, activeSession.metadata.verdictVotes);
renderTally(sentenceTallies, activeSession.metadata.sentenceVotes);
renderActions(activeSession);
updateTimer(
activeSession.metadata.phaseStartedAt,
activeSession.metadata.phaseDurationMs,
hydrateSession(
data.session,
'Session started. Court is now in session.',
);
updateCatchupPanel(activeSession);
connectStream(activeSession.id);
setStatus('Session started. Court is now in session.');
} finally {
setStartLoading(false);
}
};
async function connectLatestSession() {
try {
const response = await fetch('/api/court/sessions');
if (!response.ok) {
if (isFixtureReplayMode) {
await hydrateFromFixtureReplay();
}
return;
}
const data = await response.json();
const sessions = Array.isArray(data.sessions) ? data.sessions : [];
if (sessions.length === 0) {
if (isFixtureReplayMode) {
const hydrated = await hydrateFromFixtureReplay();
if (!hydrated) {
setStatus(
'No active session and fixture replay failed.',
'error',
);
}
} else {
setStatus('No active session. Start one to begin.');
}
return;
}
const selectedSession =
sessions.find(session => session?.status === 'running') ??
sessions[0];
const selectedSessionId = selectedSession?.id;
if (!selectedSessionId) {
return;
}
const sessionResponse = await fetch(
`/api/court/sessions/${selectedSessionId}`,
);
if (!sessionResponse.ok) {
return;
}
const sessionData = await sessionResponse.json();
if (!sessionData?.session) {
return;
}
const statusMessage =
sessionData.session.status === 'running' ?
'Connected to live session.'
: 'Loaded latest session snapshot.';
hydrateSession(sessionData.session, statusMessage);
} catch (error) {
if (isFixtureReplayMode) {
const hydrated = await hydrateFromFixtureReplay();
if (hydrated) {
return;
}
}
// eslint-disable-next-line no-console
console.warn(
'Failed to auto-attach latest session:',
error instanceof Error ? error.message : error,
);
}
}
catchupToggleBtn.onclick = () => {
setCatchupVisible(!catchupState.visible);
};
function isEditableElementFocused() {
const focused = document.activeElement;
if (!focused) {
return false;
}
const tag = focused.tagName;
return (
tag === 'INPUT' ||
tag === 'TEXTAREA' ||
tag === 'SELECT' ||
focused.isContentEditable
);
}
if (captionSkipBtn) {
captionSkipBtn.onclick = () => {
skipDialogueTypewriter();
};
}
if (captionSkipAllToggle) {
captionSkipAllToggle.onchange = event => {
const nextChecked = Boolean(event.target?.checked);
setSkipAllCaptions(nextChecked);
};
}
if (captionTypewriterToggle) {
captionTypewriterToggle.onclick = () => {
setTypewriterEnabled(!dialogueTypewriterState.enabled);
};
}
document.addEventListener('keydown', event => {
if (isEditableElementFocused()) {
return;
}
if (event.key === 'Escape' || event.key === 'Enter') {
event.preventDefault();
skipDialogueTypewriter();
}
});
renderDialogueControlState();
dialogueTypewriterState.fullText = captionLineEl.textContent;
setActiveSpeakerLabel(activeSpeakerEl.textContent);
setCaptionText(dialogueTypewriterState.fullText);
if (isFixtureReplayMode) {
setConnectionBanner(
`Fixture replay mode enabled (${fixtureReplayUrl}). Live SSE disabled.`,
);
}
void bootstrapCourtRenderer();
void connectLatestSession();

30
public/assets/README.md Normal file
View File

@@ -0,0 +1,30 @@
# Placeholder Assets
This directory holds visual and audio assets for the Ace Attorney-style renderer.
## Directory Layout
```
assets/
├── backgrounds/ Courtroom and scene background images
├── characters/ Per-character pose and face sprites
├── ui/ UI elements (dialogue box frame, nameplate, badges)
├── fonts/ Custom fonts for dialogue and labels
└── sfx/ Sound effects (objection stinger, gavel, blip)
```
## Placeholder-First Policy
Phase 7 work proceeds without final art. Missing assets are handled gracefully:
- **Backgrounds** — coloured gradient with outlined furniture (bench, podiums,
gallery railing) drawn by `renderer/layers/background.js`.
- **Characters** — labelled rectangles at fixed role positions drawn by
`renderer/layers/characters.js`. Active speaker gets a highlight tint.
- **UI** — rendered directly in PixiJS; no external sprites needed yet.
- **Fonts** — system fonts used (`Inter`, `monospace`).
- **SFX** — silent / console log stub until audio engine lands (Issue #73).
Drop real assets into subdirectories here and update the respective renderer
layer to load them. The renderer will continue to fall back to placeholders
for any asset that is absent.
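The fallback contract can be sketched as follows. This is a minimal illustration, not the renderer's actual loading API: `loadTextureOrNull` and the `load` parameter are hypothetical names standing in for a `PIXI.Assets.load`-style call.

```javascript
// Hypothetical helper: resolve a texture URL, or return null so the calling
// layer keeps drawing its placeholder. `load` stands in for an async loader
// such as PIXI.Assets.load.
async function loadTextureOrNull(url, load) {
  try {
    return await load(url);
  } catch {
    // Missing asset is expected during Phase 7: fall back silently.
    return null;
  }
}
```

A layer would call this once at init and only swap out its placeholder graphic when the result is non-null.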


@@ -0,0 +1 @@
# Placeholder — drop courtroom background images here


@@ -0,0 +1 @@
# Placeholder — drop character pose/face sprites here (one subfolder per character)


@@ -0,0 +1 @@
# Placeholder — drop custom fonts here


@@ -0,0 +1 @@
# Placeholder — drop sound effect files here (objection stinger, gavel, blip)


@@ -0,0 +1 @@
# Placeholder — drop UI element sprites here (dialogue frame, nameplate, badges)


@@ -125,6 +125,30 @@
gap: 8px;
}
.pixi-stage {
min-height: 380px;
border: 1px solid #2b2f41;
border-radius: 10px;
background: linear-gradient(135deg, #0d1423, #171f33);
overflow: hidden;
position: relative;
}
.pixi-stage[data-pixi-ready='false']::before {
content: 'Renderer stage bootstrapping…';
color: var(--muted);
font-size: 12px;
position: absolute;
left: 10px;
top: 8px;
}
.pixi-stage canvas {
width: 100% !important;
height: 380px !important;
display: block;
}
.banner {
border: 1px solid #4f2f35;
background: color-mix(in srgb, var(--danger) 12%, #1a1216);
@@ -146,6 +170,37 @@
color: var(--muted);
}
.caption-controls {
align-items: center;
flex-wrap: wrap;
}
.caption-controls-actions {
display: inline-flex;
align-items: center;
gap: 8px;
justify-content: flex-end;
flex-wrap: wrap;
}
.caption-controls-actions .badge {
background: #171a28;
}
.caption-toggle {
display: inline-flex;
align-items: center;
gap: 6px;
font-size: 12px;
color: var(--muted);
}
.caption-toggle input {
width: auto;
margin: 0;
accent-color: var(--accent);
}
#captionLine {
font-size: 14px;
color: var(--text);
@@ -382,6 +437,12 @@
<span class="badge" id="phaseBadge">phase: idle</span>
</div>
<div class="overlay-shell">
<div
id="pixiStage"
class="pixi-stage"
data-pixi-ready="false"
aria-live="off"
></div>
<div class="overlay-row">
<span>⏱️ Phase timer</span>
<span id="phaseTimer">--:--</span>
@@ -393,13 +454,44 @@
<span>🎙️ Active speaker</span>
<span id="activeSpeaker">Waiting for first turn…</span>
</div>
<div class="overlay-row caption-controls">
<span>⌨️ Dialogue animation</span>
<div class="caption-controls-actions">
<button
id="captionSkipBtn"
type="button"
class="badge"
>
Skip line
</button>
<button
id="captionTypewriterToggle"
type="button"
class="badge"
>
Typewriter: on
</button>
<label class="caption-toggle" for="captionSkipAll">
<input id="captionSkipAll" type="checkbox" />
Always skip
</label>
</div>
</div>
<div id="captionLine">Captions will appear here.</div>
</div>
<div class="catchup-shell">
<div class="catchup-header">
<h3 style="margin: 0; font-size: 14px">🧭 Case so far</h3>
<button id="catchupToggle" type="button" class="badge" aria-expanded="true" aria-controls="catchupBody">
<h3 style="margin: 0; font-size: 14px">
🧭 Case so far
</h3>
<button
id="catchupToggle"
type="button"
class="badge"
aria-expanded="true"
aria-controls="catchupBody"
>
Hide
</button>
</div>
@@ -454,6 +546,7 @@
</aside>
</div>
<script src="https://cdn.jsdelivr.net/npm/pixi.js@8.9.1/dist/pixi.min.js"></script>
<script type="module" src="/app.js"></script>
</body>
</html>

149
public/renderer/camera.js Normal file

@@ -0,0 +1,149 @@
/**
* Camera preset controller — manages zoom / pan transitions on the PixiJS
* stage to create an Ace Attorney-style cinematic feel.
*
* Camera presets are named positions that the stage pivot + scale can
* animate to. Transitions use simple eased interpolation via
* requestAnimationFrame so there are no animation-library dependencies.
*/
export const CAMERA_PRESETS = {
wide: { x: 0.5, y: 0.5, zoom: 1.0 },
judge: { x: 0.5, y: 0.1, zoom: 1.6 },
prosecution: { x: 0.85, y: 0.35, zoom: 1.5 },
defense: { x: 0.13, y: 0.35, zoom: 1.5 },
witness: { x: 0.63, y: 0.25, zoom: 1.4 },
evidence: { x: 0.5, y: 0.5, zoom: 1.3 },
verdict: { x: 0.5, y: 0.3, zoom: 1.1 },
};
const DEFAULT_TRANSITION_MS = 600;
/**
* Ease-out quad.
* @param {number} t Progress 0–1
* @returns {number}
*/
function easeOutQuad(t) {
return t * (2 - t);
}
/**
* @param {import('./stage.js').RendererStage} stage
*/
export function initCamera(stage) {
const { app } = stage;
let currentPreset = 'wide';
let animFrameId = null;
// Resolved fractional values for the current camera state
let camX = 0.5;
let camY = 0.5;
let camZoom = 1.0;
function applyTransform() {
const w = app.screen.width;
const h = app.screen.height;
// Scale the stage and pivot on the camera target
app.stage.scale.set(camZoom);
app.stage.pivot.set(camX * w, camY * h);
// Place the pivot at the screen centre so the target is centred.
// With the wide preset (x: 0.5, y: 0.5, zoom: 1) this is the
// identity transform.
app.stage.position.set(w / 2, h / 2);
}
/**
* Immediately snap the camera to a preset (no animation).
*
* @param {string} presetName
*/
function snapTo(presetName) {
if (animFrameId !== null) {
cancelAnimationFrame(animFrameId);
animFrameId = null;
}
const preset = CAMERA_PRESETS[presetName] ?? CAMERA_PRESETS.wide;
currentPreset = presetName;
camX = preset.x;
camY = preset.y;
camZoom = preset.zoom;
applyTransform();
}
/**
* Animate the camera to a preset over time.
*
* @param {string} presetName Target preset name
* @param {Object} [opts]
* @param {number} [opts.durationMs] Transition duration
* @param {(t: number) => number} [opts.ease] Easing function (default: easeOutQuad)
* @returns {Promise<void>} Resolves when animation completes
*/
function transitionTo(presetName, opts = {}) {
const preset = CAMERA_PRESETS[presetName] ?? CAMERA_PRESETS.wide;
const durationMs = opts.durationMs ?? DEFAULT_TRANSITION_MS;
const ease = opts.ease ?? easeOutQuad;
if (animFrameId !== null) {
cancelAnimationFrame(animFrameId);
animFrameId = null;
}
currentPreset = presetName;
const fromX = camX;
const fromY = camY;
const fromZoom = camZoom;
const startTime = performance.now();
return new Promise(resolve => {
const tick = (timestamp) => {
const elapsed = timestamp - startTime;
const rawT = Math.min(1, elapsed / durationMs);
const t = ease(rawT);
camX = fromX + (preset.x - fromX) * t;
camY = fromY + (preset.y - fromY) * t;
camZoom = fromZoom + (preset.zoom - fromZoom) * t;
applyTransform();
if (rawT < 1) {
animFrameId = requestAnimationFrame(tick);
} else {
animFrameId = null;
resolve();
}
};
animFrameId = requestAnimationFrame(tick);
});
}
/**
* Reset camera to wide shot.
*/
function reset() {
snapTo('wide');
}
/**
* Get the current preset name.
*/
function getCurrentPreset() {
return currentPreset;
}
// Start at wide shot
snapTo('wide');
return { snapTo, transitionTo, reset, getCurrentPreset, CAMERA_PRESETS };
}

225
public/renderer/dialogue.js Normal file

@@ -0,0 +1,225 @@
/**
* Dialogue state machine — Ace Attorney-style typewriter with punctuation
* pauses, skip/advance, and per-line token tracking.
*
* Drives the canvas dialogue text independently of the DOM caption overlay
* so the renderer can own its own timing (important for camera/effect sync).
*/
/** Pause durations (ms) after specific punctuation marks. */
const PUNCTUATION_PAUSES = {
'.': 180,
'!': 200,
'?': 200,
',': 90,
';': 110,
':': 100,
'…': 260,
'—': 140,
};
const DEFAULT_CHARS_PER_SECOND = 48;
const BLIP_INTERVAL_CHARS = 3; // play blip every N characters
/**
* @typedef {Object} DialogueLine
* @property {string} speaker Speaker display label
* @property {string} text Full dialogue text
* @property {number} token Monotonic token to detect stale frames
*/
/**
* @typedef {Object} DialogueCallbacks
* @property {(text: string) => void} onTextUpdate Called with visible text slice
* @property {(speaker: string) => void} onSpeakerUpdate Called with speaker label
* @property {() => void} onLineComplete Called when line finishes
* @property {() => void} [onBlip] Called for typewriter blip SFX
*/
/**
* Create a dialogue state machine.
*
* @param {DialogueCallbacks} callbacks
* @returns {Object}
*/
export function createDialogueStateMachine(callbacks) {
let currentLine = null;
let lineToken = 0;
let charIndex = 0;
let frameId = null;
let paused = false;
let pauseTimeout = null;
let skipRequested = false;
let skipAllMode = false;
let enabled = true;
let charsPerSecond = DEFAULT_CHARS_PER_SECOND;
let lastBlipChar = 0;
function clear() {
if (frameId !== null) {
cancelAnimationFrame(frameId);
frameId = null;
}
if (pauseTimeout !== null) {
clearTimeout(pauseTimeout);
pauseTimeout = null;
}
paused = false;
skipRequested = false;
}
function commitLine() {
clear();
if (currentLine) {
callbacks.onTextUpdate(currentLine.text);
callbacks.onLineComplete();
}
}
function getPunctuationPause(char) {
// Check for ellipsis (three dots)
if (
currentLine &&
charIndex >= 3 &&
currentLine.text.slice(charIndex - 3, charIndex) === '...'
) {
return PUNCTUATION_PAUSES['…'];
}
return PUNCTUATION_PAUSES[char] ?? 0;
}
function tick(timestamp) {
if (!currentLine || lineToken !== currentLine.token) {
return;
}
if (skipRequested || skipAllMode) {
commitLine();
return;
}
if (paused) {
return; // waiting on punctuation pause
}
// Advance characters
const elapsed = timestamp - (tick._startTime ?? timestamp);
if (!tick._startTime) tick._startTime = timestamp;
const targetChars = Math.min(
currentLine.text.length,
Math.max(1, Math.floor((elapsed / 1000) * charsPerSecond)),
);
if (targetChars > charIndex) {
charIndex = targetChars;
const visibleText = currentLine.text.slice(0, charIndex);
callbacks.onTextUpdate(visibleText);
// Blip SFX
if (
callbacks.onBlip &&
charIndex - lastBlipChar >= BLIP_INTERVAL_CHARS
) {
lastBlipChar = charIndex;
callbacks.onBlip();
}
// Check punctuation pause
const lastChar = currentLine.text[charIndex - 1];
const pauseMs = getPunctuationPause(lastChar);
if (pauseMs > 0 && charIndex < currentLine.text.length) {
paused = true;
pauseTimeout = setTimeout(() => {
paused = false;
pauseTimeout = null;
if (currentLine && lineToken === currentLine.token) {
// Exclude the pause from elapsed typing time, otherwise the
// typewriter catches up in a burst after each punctuation pause.
tick._startTime += pauseMs;
frameId = requestAnimationFrame(tick);
}
}, pauseMs);
return;
}
}
if (charIndex >= currentLine.text.length) {
frameId = null;
callbacks.onLineComplete();
return;
}
frameId = requestAnimationFrame(tick);
}
/**
* Start displaying a new dialogue line.
*
* @param {string} speaker Speaker label
* @param {string} text Full dialogue text
*/
function setLine(speaker, text) {
clear();
lineToken += 1;
charIndex = 0;
lastBlipChar = 0;
tick._startTime = null;
currentLine = { speaker, text, token: lineToken };
callbacks.onSpeakerUpdate(speaker);
if (!enabled || skipAllMode || text.length <= 1) {
callbacks.onTextUpdate(text);
callbacks.onLineComplete();
return;
}
callbacks.onTextUpdate('');
frameId = requestAnimationFrame(tick);
}
function skip() {
if (frameId !== null || paused) {
skipRequested = true;
if (paused) {
// Force resume from punctuation pause
if (pauseTimeout !== null) {
clearTimeout(pauseTimeout);
pauseTimeout = null;
}
paused = false;
commitLine();
}
}
}
function setSkipAll(value) {
skipAllMode = Boolean(value);
if (skipAllMode && (frameId !== null || paused)) {
commitLine();
}
}
function setEnabled(value) {
enabled = Boolean(value);
if (!enabled && (frameId !== null || paused)) {
commitLine();
}
}
function setSpeed(cps) {
charsPerSecond = Number.isFinite(cps) && cps > 0 ? cps : DEFAULT_CHARS_PER_SECOND;
}
function isAnimating() {
return frameId !== null || paused;
}
return {
setLine,
skip,
setSkipAll,
setEnabled,
setSpeed,
isAnimating,
destroy: clear,
};
}

174
public/renderer/index.js Normal file

@@ -0,0 +1,174 @@
/**
* Court Renderer — Ace Attorney-style PixiJS renderer façade.
*
* Usage:
* import { createCourtRenderer } from './renderer/index.js';
*
* const renderer = await createCourtRenderer(document.getElementById('pixiStage'));
* if (renderer) {
* renderer.update({ phase, activeSpeakerRole, roleNames, speakerLabel, dialogueContent });
* renderer.applyDirective({ camera: 'judge', effect: 'objection', poses: { judge: 'point' } });
* }
*
* Returns `null` when PixiJS is unavailable — callers should degrade to the
* existing DOM-only overlay in that case.
*/
import { createStage } from './stage.js';
import { initBackground } from './layers/background.js';
import { initCharacters } from './layers/characters.js';
import { initUI } from './layers/ui.js';
import { initEffects } from './layers/effects.js';
import { initEvidence } from './layers/evidence.js';
import { initCamera } from './camera.js';
import { createDialogueStateMachine } from './dialogue.js';
/**
* @typedef {Object} RendererState
* @property {string} phase Current court phase
* @property {string | null} activeSpeakerRole Role key of current speaker
* @property {Record<string, string>} roleNames role → display name map
* @property {string} speakerLabel "role · name" label
* @property {string} dialogueContent Visible dialogue text
* @property {string} nameplate Nameplate label
*/
/**
* @typedef {Object} RenderDirective
* @property {string} [camera] Camera preset name
* @property {string} [effect] Effect cue name
* @property {Object} [effectOpts] Effect options
* @property {Record<string, string>} [poses] role → pose key
* @property {Record<string, string>} [faces] role → face key
* @property {string} [evidencePresent] Evidence ID to present
*/
/**
* @typedef {Object} CourtRenderer
* @property {(state: Partial<RendererState>) => void} update
* @property {(directive: RenderDirective) => void} applyDirective
* @property {() => void} destroy
* @property {{ speakerText: import('pixi.js').Text, dialogueText: import('pixi.js').Text }} ui
*/
/**
* Bootstrap the full court renderer and mount it to `host`.
*
* @param {HTMLElement} host DOM element to mount the PixiJS canvas
* @returns {Promise<CourtRenderer | null>}
*/
export async function createCourtRenderer(host) {
const stage = await createStage(host);
if (!stage) {
return null;
}
const background = initBackground(stage);
const characters = initCharacters(stage);
const ui = initUI(stage);
const effects = initEffects(stage);
const evidence = initEvidence(stage);
const camera = initCamera(stage);
// Dialogue state machine drives the canvas text typewriter
const dialogueSM = createDialogueStateMachine({
onTextUpdate: (text) => {
ui.update({ dialogueContent: text });
},
onSpeakerUpdate: (speaker) => {
ui.update({ speakerLabel: speaker });
},
onLineComplete: () => {
// no-op for now — future: auto-advance to next line
},
});
// Re-layout all size-dependent layers when the window resizes
window.addEventListener(
'resize',
() => {
background.draw();
characters.layout();
ui.layout();
evidence.layoutCards();
},
{ passive: true },
);
/**
* Push the latest session state into every renderer layer.
*
* @param {Partial<RendererState>} state
*/
function update(state) {
characters.update({
activeSpeakerRole: state.activeSpeakerRole ?? null,
roleNames: state.roleNames ?? {},
});
ui.update({
phase: state.phase ?? 'idle',
speakerLabel: state.speakerLabel ?? '',
dialogueContent: state.dialogueContent ?? '',
nameplate: state.nameplate ?? '',
});
}
/**
* Apply a RenderDirective from the backend.
* Translates directive fields into renderer subsystem calls.
*
* @param {RenderDirective} directive
*/
function applyDirective(directive) {
if (!directive) return;
// Camera transition
if (directive.camera) {
camera.transitionTo(directive.camera);
}
// Effect cue
if (directive.effect) {
const effectName = directive.effect;
// Composite cues have convenience methods
if (effectName === 'objection') {
effects.objection();
} else if (effectName === 'hold_it') {
effects.holdIt();
} else if (effectName === 'take_that') {
effects.takeThat();
} else {
effects.trigger(effectName, directive.effectOpts ?? {});
}
}
// Evidence present cutscene
if (directive.evidencePresent) {
evidence.presentEvidence(directive.evidencePresent, effects);
}
}
function destroy() {
dialogueSM.destroy();
stage.destroy();
}
return {
update,
applyDirective,
destroy,
ui: {
speakerText: ui.speakerText,
dialogueText: ui.dialogueText,
},
/** @internal exposed for orchestration */
effects,
evidence,
camera,
dialogue: dialogueSM,
};
}


@@ -0,0 +1,99 @@
/**
* Background layer — draws a courtroom backdrop.
*
* Placeholder-first: when no background sprite is available, renders a
* labelled gradient rectangle with outlined furniture (bench, podiums,
* gallery railing) so the layout is visible during development.
*/
import { STAGE_WIDTH, STAGE_HEIGHT } from '../stage.js';
const BENCH_COLOR = 0x3b2f1e;
const PODIUM_COLOR = 0x2d2417;
const FLOOR_COLOR = 0x1a1428;
const WALL_COLOR = 0x0e1422;
const LINE_COLOR = 0x4a3f2e;
/**
* @param {import('../stage.js').RendererStage} stage
*/
export function initBackground(stage) {
const { PIXI, backgroundLayer, app } = stage;
const gfx = new PIXI.Graphics();
backgroundLayer.addChild(gfx);
const label = new PIXI.Text({
text: 'COURTROOM BACKGROUND (placeholder)',
style: {
fill: 0x555566,
fontSize: 11,
fontFamily: 'monospace',
},
});
label.anchor.set(0.5, 0);
backgroundLayer.addChild(label);
function draw() {
const w = app.screen.width;
const h = app.screen.height;
gfx.clear();
// Wall
gfx.beginFill(WALL_COLOR);
gfx.drawRect(0, 0, w, h);
gfx.endFill();
// Floor
gfx.beginFill(FLOOR_COLOR);
gfx.drawRect(0, h * 0.65, w, h * 0.35);
gfx.endFill();
// Judge bench (center, rear)
gfx.beginFill(BENCH_COLOR, 0.8);
gfx.drawRoundedRect(w * 0.3, h * 0.08, w * 0.4, h * 0.18, 6);
gfx.endFill();
gfx.lineStyle(1, LINE_COLOR, 0.6);
gfx.drawRoundedRect(w * 0.3, h * 0.08, w * 0.4, h * 0.18, 6);
gfx.lineStyle(0);
// Defense podium (left)
gfx.beginFill(PODIUM_COLOR, 0.7);
gfx.drawRoundedRect(w * 0.04, h * 0.42, w * 0.18, h * 0.15, 4);
gfx.endFill();
gfx.lineStyle(1, LINE_COLOR, 0.5);
gfx.drawRoundedRect(w * 0.04, h * 0.42, w * 0.18, h * 0.15, 4);
gfx.lineStyle(0);
// Prosecution podium (right)
gfx.beginFill(PODIUM_COLOR, 0.7);
gfx.drawRoundedRect(w * 0.78, h * 0.42, w * 0.18, h * 0.15, 4);
gfx.endFill();
gfx.lineStyle(1, LINE_COLOR, 0.5);
gfx.drawRoundedRect(w * 0.78, h * 0.42, w * 0.18, h * 0.15, 4);
gfx.lineStyle(0);
// Witness stand (right of center)
gfx.beginFill(PODIUM_COLOR, 0.6);
gfx.drawRoundedRect(w * 0.6, h * 0.3, w * 0.12, h * 0.12, 4);
gfx.endFill();
gfx.lineStyle(1, LINE_COLOR, 0.4);
gfx.drawRoundedRect(w * 0.6, h * 0.3, w * 0.12, h * 0.12, 4);
gfx.lineStyle(0);
// Gallery railing
gfx.lineStyle(2, LINE_COLOR, 0.35);
gfx.moveTo(w * 0.02, h * 0.64);
gfx.lineTo(w * 0.98, h * 0.64);
gfx.lineStyle(0);
// Placeholder text
label.position.set(w / 2, 4);
}
draw();
app.renderer.on('resize', draw);
return { draw };
}


@@ -0,0 +1,271 @@
/**
* Characters layer — renders labelled placeholder silhouettes for each court
* role at fixed positions. The active speaker receives a highlight accent.
*
* #71 CharacterDisplay: each slot has overlay containers for pose, face, and
* per-character effects. When sprite assets are absent the placeholder
* rectangle is drawn; when a sprite texture is set, it replaces the rectangle.
*/
/** Default layout positions (proportional to canvas size). */
const ROLE_POSITIONS = {
judge: { x: 0.5, y: 0.12, w: 0.1, h: 0.16, color: 0xa08040 },
prosecutor: { x: 0.85, y: 0.38, w: 0.1, h: 0.2, color: 0x7b4040 },
defense: { x: 0.13, y: 0.38, w: 0.1, h: 0.2, color: 0x405a7b },
witness_1: { x: 0.64, y: 0.28, w: 0.08, h: 0.14, color: 0x4f6f50 },
witness_2: { x: 0.56, y: 0.28, w: 0.08, h: 0.14, color: 0x4f6f50 },
witness_3: { x: 0.72, y: 0.28, w: 0.08, h: 0.14, color: 0x4f6f50 },
bailiff: { x: 0.36, y: 0.3, w: 0.07, h: 0.14, color: 0x555566 },
};
/** Recognised pose keys — pose sprites are resolved via asset lookup. */
export const POSES = ['idle', 'talk', 'point', 'slam', 'think', 'shock'];
/** Recognised face overlay keys. */
export const FACE_OVERLAYS = ['neutral', 'angry', 'happy', 'surprised', 'sweating'];
const ACTIVE_TINT = 0xffdd44;
const INACTIVE_ALPHA = 0.55;
const ACTIVE_ALPHA = 1.0;
/**
* @param {import('../stage.js').RendererStage} stage
*/
export function initCharacters(stage) {
const { PIXI, charactersLayer, app } = stage;
/**
* @typedef {Object} CharacterSlotEntry
* @property {import('pixi.js').Container} container Root container
* @property {import('pixi.js').Graphics} gfx Placeholder graphic
* @property {import('pixi.js').Text} label Role label
* @property {import('pixi.js').Text} nameLabel Display-name label
* @property {import('pixi.js').Container} poseLayer Pose sprite container
* @property {import('pixi.js').Container} faceLayer Face overlay container
* @property {import('pixi.js').Container} fxLayer Per-character effects
* @property {typeof ROLE_POSITIONS[string]} slot Position definition
* @property {string} currentPose Current pose key
* @property {string} currentFace Current face key
* @property {boolean} hasSprite Whether a sprite texture replaced the placeholder
*/
/** @type {Record<string, CharacterSlotEntry>} */
const slots = {};
for (const [role, slot] of Object.entries(ROLE_POSITIONS)) {
const container = new PIXI.Container();
container.label = role;
const gfx = new PIXI.Graphics();
container.addChild(gfx);
// Pose sprite layer (sits on top of placeholder rectangle)
const poseLayer = new PIXI.Container();
poseLayer.label = `${role}_pose`;
container.addChild(poseLayer);
// Face overlay layer (composites on top of pose)
const faceLayer = new PIXI.Container();
faceLayer.label = `${role}_face`;
container.addChild(faceLayer);
// Per-character effects layer (flash / glow / particles)
const fxLayer = new PIXI.Container();
fxLayer.label = `${role}_fx`;
container.addChild(fxLayer);
const label = new PIXI.Text({
text: role.replace(/_/g, ' ').toUpperCase(),
style: {
fill: 0xcccccc,
fontSize: 9,
fontFamily: 'monospace',
align: 'center',
},
});
label.anchor.set(0.5, 0);
container.addChild(label);
const nameLabel = new PIXI.Text({
text: '',
style: {
fill: 0xeeeeee,
fontSize: 10,
fontFamily: 'Inter, system-ui, sans-serif',
fontWeight: '600',
align: 'center',
},
});
nameLabel.anchor.set(0.5, 0);
container.addChild(nameLabel);
container.alpha = INACTIVE_ALPHA;
charactersLayer.addChild(container);
slots[role] = {
container,
gfx,
label,
nameLabel,
poseLayer,
faceLayer,
fxLayer,
slot,
currentPose: 'idle',
currentFace: 'neutral',
hasSprite: false,
};
}
function layout() {
const w = app.screen.width;
const h = app.screen.height;
for (const entry of Object.values(slots)) {
const { container, gfx, label, nameLabel, poseLayer, faceLayer, fxLayer, slot } = entry;
const sw = slot.w * w;
const sh = slot.h * h;
const sx = slot.x * w - sw / 2;
const sy = slot.y * h;
// Draw placeholder rectangle only when no sprite has been loaded
gfx.clear();
if (!entry.hasSprite) {
gfx.beginFill(slot.color, 0.6);
gfx.drawRoundedRect(0, 0, sw, sh, 4);
gfx.endFill();
gfx.lineStyle(1, 0x888888, 0.3);
gfx.drawRoundedRect(0, 0, sw, sh, 4);
gfx.lineStyle(0);
}
container.position.set(sx, sy);
label.position.set(sw / 2, 2);
nameLabel.position.set(sw / 2, sh + 2);
// Size overlay layers to match the slot
poseLayer.position.set(0, 0);
faceLayer.position.set(0, 0);
fxLayer.position.set(0, 0);
}
}
layout();
app.renderer.on('resize', layout);
/**
* Set a pose sprite for a role. If `texture` is null, the slot falls
* back to the placeholder rectangle.
*
* @param {string} role Court role key
* @param {string} pose Pose key (e.g. 'idle', 'talk')
* @param {import('pixi.js').Texture | null} texture Texture or null
*/
function setPoseSprite(role, pose, texture) {
const entry = slots[role];
if (!entry) return;
entry.currentPose = pose;
entry.poseLayer.removeChildren();
if (texture) {
const sprite = new PIXI.Sprite(texture);
const w = app.screen.width;
const h = app.screen.height;
sprite.width = entry.slot.w * w;
sprite.height = entry.slot.h * h;
entry.poseLayer.addChild(sprite);
entry.hasSprite = true;
entry.gfx.clear(); // hide placeholder
} else {
entry.hasSprite = false;
layout(); // re-draw placeholder
}
}
/**
* Set a face overlay sprite for a role.
*
* @param {string} role Court role key
* @param {string} face Face key (e.g. 'neutral', 'angry')
* @param {import('pixi.js').Texture | null} texture Texture or null
*/
function setFaceOverlay(role, face, texture) {
const entry = slots[role];
if (!entry) return;
entry.currentFace = face;
entry.faceLayer.removeChildren();
if (texture) {
const sprite = new PIXI.Sprite(texture);
// Face overlay is positioned relative to the top of the slot
const w = app.screen.width;
sprite.width = entry.slot.w * w * 0.6;
sprite.height = sprite.width; // square face overlay
sprite.anchor.set(0.5, 0);
sprite.position.set((entry.slot.w * w) / 2, 2);
entry.faceLayer.addChild(sprite);
}
}
/**
* Flash a colour overlay on a specific character slot for effect emphasis.
*
* @param {string} role Court role key
* @param {number} color Hex tint (e.g. 0xff0000 for damage flash)
* @param {number} durationMs Flash duration in milliseconds
*/
function flashCharacter(role, color = 0xffffff, durationMs = 120) {
const entry = slots[role];
if (!entry) return;
const flash = new PIXI.Graphics();
const w = entry.slot.w * app.screen.width;
const h = entry.slot.h * app.screen.height;
flash.beginFill(color, 0.45);
flash.drawRoundedRect(0, 0, w, h, 4);
flash.endFill();
entry.fxLayer.addChild(flash);
setTimeout(() => {
entry.fxLayer.removeChild(flash);
flash.destroy();
}, durationMs);
}
/**
* Update character display state.
*
* @param {Object} state
* @param {string | null} state.activeSpeakerRole Currently speaking role key
* @param {Record<string, string>} state.roleNames Map of role → display name
* @param {Record<string, string>} [state.poses] Map of role → pose key (optional)
* @param {Record<string, string>} [state.faces] Map of role → face key (optional)
*/
function update(state) {
const { activeSpeakerRole, roleNames } = state;
for (const [role, entry] of Object.entries(slots)) {
const isActive = activeSpeakerRole === role;
entry.container.alpha = isActive ? ACTIVE_ALPHA : INACTIVE_ALPHA;
if (isActive) {
entry.gfx.tint = ACTIVE_TINT;
} else {
entry.gfx.tint = 0xffffff;
}
const name = roleNames?.[role];
if (typeof name === 'string') {
entry.nameLabel.text = name;
}
}
}
/**
* Get the current slot entries (read-only reference for testing/debug).
*/
function getSlots() {
return slots;
}
return { update, layout, setPoseSprite, setFaceOverlay, flashCharacter, getSlots };
}

212
public/renderer/layers/effects.js vendored Normal file

@@ -0,0 +1,212 @@
/**
* Effects layer — visual effect cues (flash, shake, freeze/hit-stop,
* stamped overlays like "OBJECTION!" / "HOLD IT!").
*
* Each cue is fire-and-forget: call `trigger(cueName, opts)` and the effect
* runs to completion inside the effects container then self-destructs.
*/
const SHAKE_INTENSITY_PX = 6;
const SHAKE_DURATION_MS = 300;
const FLASH_DURATION_MS = 150;
const FREEZE_DURATION_MS = 400;
const STAMP_DISPLAY_MS = 1200;
/**
* @param {import('../stage.js').RendererStage} stage
*/
export function initEffects(stage) {
const { PIXI, effectsLayer, app } = stage;
let shakeTimer = null;
let shakeBaseX = 0;
let shakeBaseY = 0;
/**
* Full-screen colour flash.
*
* @param {Object} opts
* @param {number} [opts.color=0xffffff] Flash colour
* @param {number} [opts.alpha=0.6] Flash alpha
* @param {number} [opts.durationMs] Duration in ms
*/
function flash(opts = {}) {
const color = opts.color ?? 0xffffff;
const alpha = opts.alpha ?? 0.6;
const duration = opts.durationMs ?? FLASH_DURATION_MS;
const gfx = new PIXI.Graphics();
gfx.beginFill(color, alpha);
gfx.drawRect(0, 0, app.screen.width, app.screen.height);
gfx.endFill();
effectsLayer.addChild(gfx);
setTimeout(() => {
effectsLayer.removeChild(gfx);
gfx.destroy();
}, duration);
}
/**
* Screen shake — oscillates the stage position for a short burst.
* Offsets are applied relative to the camera-set base position so the
* shake composes with camera transforms instead of clobbering them.
*
* @param {Object} opts
* @param {number} [opts.intensity] Pixel amplitude
* @param {number} [opts.durationMs] Duration in ms
*/
function shake(opts = {}) {
const intensity = opts.intensity ?? SHAKE_INTENSITY_PX;
const duration = opts.durationMs ?? SHAKE_DURATION_MS;
// Cancel any running shake and restore its base position
if (shakeTimer !== null) {
clearInterval(shakeTimer);
shakeTimer = null;
app.stage.position.set(shakeBaseX, shakeBaseY);
}
shakeBaseX = app.stage.position.x;
shakeBaseY = app.stage.position.y;
const startTime = performance.now();
shakeTimer = setInterval(() => {
const elapsed = performance.now() - startTime;
if (elapsed >= duration) {
clearInterval(shakeTimer);
shakeTimer = null;
app.stage.position.set(shakeBaseX, shakeBaseY);
return;
}
const decay = 1 - elapsed / duration;
const dx = (Math.random() * 2 - 1) * intensity * decay;
const dy = (Math.random() * 2 - 1) * intensity * decay;
app.stage.position.set(shakeBaseX + dx, shakeBaseY + dy);
}, 16);
}
}
/**
* Freeze / hit-stop — pauses ticker for a brief moment.
*
* @param {Object} opts
* @param {number} [opts.durationMs]
*/
function freeze(opts = {}) {
const duration = opts.durationMs ?? FREEZE_DURATION_MS;
app.ticker.stop();
setTimeout(() => {
app.ticker.start();
}, duration);
}
/**
* Stamped text overlay — centred text that fades in, holds, then fades out.
* Used for "OBJECTION!", "HOLD IT!", "TAKE THAT!", etc.
*
* @param {Object} opts
* @param {string} opts.text Text to stamp
* @param {number} [opts.color=0xff4444] Text fill colour
* @param {number} [opts.fontSize=48] Font size
* @param {number} [opts.displayMs] Total display time
*/
function stamp(opts = {}) {
const text = opts.text ?? 'OBJECTION!';
const color = opts.color ?? 0xff4444;
const fontSize = opts.fontSize ?? 48;
const displayMs = opts.displayMs ?? STAMP_DISPLAY_MS;
const label = new PIXI.Text({
text,
style: {
fill: color,
fontSize,
fontFamily: 'Impact, Arial Black, sans-serif',
fontWeight: '900',
// PixiJS v8 text style: stroke and dropShadow take option objects
stroke: { color: 0x000000, width: 4 },
align: 'center',
dropShadow: { color: 0x000000, distance: 3 },
},
});
label.anchor.set(0.5, 0.5);
label.position.set(app.screen.width / 2, app.screen.height / 2);
label.alpha = 0;
effectsLayer.addChild(label);
// Fade in over 80ms
const fadeInMs = 80;
const fadeOutMs = 200;
const holdMs = Math.max(0, displayMs - fadeInMs - fadeOutMs);
let startTime = performance.now();
const animateIn = () => {
const elapsed = performance.now() - startTime;
label.alpha = Math.min(1, elapsed / fadeInMs);
if (elapsed < fadeInMs) {
requestAnimationFrame(animateIn);
} else {
label.alpha = 1;
setTimeout(() => {
startTime = performance.now();
requestAnimationFrame(animateOut);
}, holdMs);
}
};
const animateOut = () => {
const elapsed = performance.now() - startTime;
label.alpha = Math.max(0, 1 - elapsed / fadeOutMs);
if (elapsed < fadeOutMs) {
requestAnimationFrame(animateOut);
} else {
effectsLayer.removeChild(label);
label.destroy();
}
};
requestAnimationFrame(animateIn);
}
/** Cue name → handler mapping. */
const CUE_HANDLERS = {
flash,
shake,
freeze,
stamp,
};
/**
* Trigger an effect cue by name.
*
* @param {string} cue Effect name ('flash' | 'shake' | 'freeze' | 'stamp')
* @param {Object} [opts] Options forwarded to the handler
*/
function trigger(cue, opts = {}) {
const handler = CUE_HANDLERS[cue];
if (handler) {
handler(opts);
}
}
/**
* Convenience: composite "objection" cue (stamp + flash + shake).
*/
function objection() {
flash({ color: 0xff4444, alpha: 0.35 });
shake({ intensity: 8, durationMs: 350 });
stamp({ text: 'OBJECTION!', color: 0xff4444 });
}
/**
* Convenience: "hold it" cue.
*/
function holdIt() {
flash({ color: 0x44aaff, alpha: 0.3 });
stamp({ text: 'HOLD IT!', color: 0x44aaff });
}
/**
* Convenience: "take that" cue.
*/
function takeThat() {
flash({ color: 0x44ff88, alpha: 0.3 });
stamp({ text: 'TAKE THAT!', color: 0x44ff88 });
}
return { trigger, flash, shake, freeze, stamp, objection, holdIt, takeThat };
}
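The `CUE_HANDLERS` lookup plus `trigger` above form a small dispatch table in which unknown cue names are silently ignored. A minimal standalone sketch of that pattern (the real handlers close over the PIXI stage; these stubs just record calls):

```javascript
// Stub handlers that record their invocations instead of drawing.
const calls = [];
const CUE_HANDLERS = {
  flash: (opts) => calls.push(['flash', opts]),
  stamp: (opts) => calls.push(['stamp', opts]),
};

function trigger(cue, opts = {}) {
  const handler = CUE_HANDLERS[cue];
  if (handler) {
    handler(opts); // unknown cue names fall through as no-ops
  }
}

trigger('stamp', { text: 'HOLD IT!' });
trigger('nonexistent'); // silently ignored
console.log(calls.length); // 1
```

This is why `trigger('typo')` from a malformed render directive degrades to a no-op rather than throwing inside the render loop.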


@@ -0,0 +1,201 @@
/**
* Evidence layer — renders evidence cards and the "Present!" cutscene effect.
*
* Evidence cards appear as styled rectangles with text. When evidence is
* "presented", a brief cutscene animation plays: the card slides to centre,
* scales up, flashes, then settles back to the evidence tray.
*/
const EVIDENCE_TRAY_ALPHA = 0.85;
const EVIDENCE_CARD_WIDTH = 140;
const EVIDENCE_CARD_HEIGHT = 80;
const EVIDENCE_CARD_GAP = 8;
const EVIDENCE_CARD_BG = 0x1e293b;
const EVIDENCE_CARD_BORDER = 0x475569;
const EVIDENCE_CARD_ACTIVE_BORDER = 0xfbbf24;
const PRESENT_ANIMATION_MS = 900;
/**
* @param {import('../stage.js').RendererStage} stage
*/
export function initEvidence(stage) {
const { PIXI, uiLayer, app } = stage;
const evidenceContainer = new PIXI.Container();
evidenceContainer.label = 'evidence';
uiLayer.addChild(evidenceContainer);
/** @type {Array<{id: string, text: string, card: import('pixi.js').Container}>} */
const cards = [];
let presentAnimating = false;
function layoutCards() {
const startX = 12;
const startY = 12;
cards.forEach((entry, i) => {
entry.card.position.set(
startX + i * (EVIDENCE_CARD_WIDTH + EVIDENCE_CARD_GAP),
startY,
);
});
}
/**
* Add an evidence card to the tray.
*
* @param {Object} evidence
* @param {string} evidence.id
* @param {string} evidence.text
*/
function addCard(evidence) {
// Don't add duplicates
if (cards.some(c => c.id === evidence.id)) return;
const card = new PIXI.Container();
card.label = `evidence_${evidence.id}`;
const bg = new PIXI.Graphics();
bg.roundRect(0, 0, EVIDENCE_CARD_WIDTH, EVIDENCE_CARD_HEIGHT, 6);
bg.fill({ color: EVIDENCE_CARD_BG, alpha: EVIDENCE_TRAY_ALPHA });
bg.stroke({ width: 2, color: EVIDENCE_CARD_BORDER });
card.addChild(bg);
const idLabel = new PIXI.Text({
text: `#${evidence.id}`,
style: {
fill: 0x94a3b8,
fontSize: 9,
fontFamily: 'monospace',
},
});
idLabel.position.set(6, 4);
card.addChild(idLabel);
const textLabel = new PIXI.Text({
text: evidence.text.length > 60 ? evidence.text.slice(0, 57) + '…' : evidence.text,
style: {
fill: 0xe2e8f0,
fontSize: 10,
fontFamily: 'Inter, system-ui, sans-serif',
wordWrap: true,
wordWrapWidth: EVIDENCE_CARD_WIDTH - 12,
lineHeight: 13,
},
});
textLabel.position.set(6, 18);
card.addChild(textLabel);
evidenceContainer.addChild(card);
cards.push({ id: evidence.id, text: evidence.text, card });
layoutCards();
}
/**
* Remove all evidence cards (e.g. between sessions).
*/
function clearCards() {
for (const entry of cards) {
evidenceContainer.removeChild(entry.card);
entry.card.destroy({ children: true });
}
cards.length = 0;
}
/**
* Play the "Present!" cutscene for a specific evidence card.
* The card zooms to centre screen, flashes, then returns.
*
* @param {string} evidenceId
* @param {Object} [effectsRef] Reference to effects module (for flash)
* @returns {Promise<void>}
*/
function presentEvidence(evidenceId, effectsRef) {
if (presentAnimating) return Promise.resolve();
const entry = cards.find(c => c.id === evidenceId);
if (!entry) return Promise.resolve();
presentAnimating = true;
const card = entry.card;
const origX = card.position.x;
const origY = card.position.y;
const origScaleX = card.scale.x;
const origScaleY = card.scale.y;
const centreX = app.screen.width / 2 - EVIDENCE_CARD_WIDTH / 2;
const centreY = app.screen.height / 2 - EVIDENCE_CARD_HEIGHT / 2;
const targetScale = 2.2;
const zoomInMs = PRESENT_ANIMATION_MS * 0.35;
const holdMs = PRESENT_ANIMATION_MS * 0.3;
const zoomOutMs = PRESENT_ANIMATION_MS * 0.35;
return new Promise(resolve => {
const startTime = performance.now();
const zoomIn = (timestamp) => {
const elapsed = timestamp - startTime;
const t = Math.min(1, elapsed / zoomInMs);
const eased = t * (2 - t); // ease-out quad
card.position.set(
origX + (centreX - origX) * eased,
origY + (centreY - origY) * eased,
);
card.scale.set(
origScaleX + (targetScale - origScaleX) * eased,
origScaleY + (targetScale - origScaleY) * eased,
);
if (t < 1) {
requestAnimationFrame(zoomIn);
} else {
// Flash at peak
if (effectsRef?.flash) {
effectsRef.flash({ color: 0xfbbf24, alpha: 0.4, durationMs: 100 });
}
setTimeout(() => {
const returnStart = performance.now();
const zoomOut = (ts) => {
const el = ts - returnStart;
const rt = Math.min(1, el / zoomOutMs);
const re = rt * (2 - rt);
card.position.set(
centreX + (origX - centreX) * re,
centreY + (origY - centreY) * re,
);
card.scale.set(
targetScale + (origScaleX - targetScale) * re,
targetScale + (origScaleY - targetScale) * re,
);
if (rt < 1) {
requestAnimationFrame(zoomOut);
} else {
card.position.set(origX, origY);
card.scale.set(origScaleX, origScaleY);
presentAnimating = false;
resolve();
}
};
requestAnimationFrame(zoomOut);
}, holdMs);
}
};
requestAnimationFrame(zoomIn);
});
}
/**
* Get the list of current card IDs.
*/
function getCardIds() {
return cards.map(c => c.id);
}
return { addCard, clearCards, presentEvidence, getCardIds, layoutCards };
}
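The `zoomIn`/`zoomOut` animations above both use the ease-out quad `t * (2 - t)` to interpolate position and scale. The math in isolation, runnable without PixiJS:

```javascript
// Ease-out quad: fast start, gentle settle; maps [0, 1] -> [0, 1].
function easeOutQuad(t) {
  return t * (2 - t);
}

// Linear interpolation between two values by an eased fraction.
function lerp(from, to, eased) {
  return from + (to - from) * eased;
}

// Halfway through the zoom the card has already covered 75% of the distance:
console.log(lerp(0, 100, easeOutQuad(0.5))); // 75
```

This is why the present cutscene reads as snappy: most of the travel happens in the first half of `zoomInMs`.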


@@ -0,0 +1,157 @@
/**
* UI layer — renders the on-canvas heads-up display: phase indicator badge,
* speaker nameplate, and dialogue box with text.
*
* These elements mirror the DOM overlay but live inside the PixiJS canvas so
* they compose with the rest of the scene graph (camera, effects, etc.).
*/
const DIALOGUE_BOX_ALPHA = 0.82;
const DIALOGUE_BOX_COLOR = 0x0e1422;
const DIALOGUE_BOX_RADIUS = 8;
const NAMEPLATE_BG = 0x1a2540;
const NAMEPLATE_RADIUS = 6;
/**
* @param {import('../stage.js').RendererStage} stage
*/
export function initUI(stage) {
const { PIXI, uiLayer, app } = stage;
// -- Dialogue box ---------------------------------------------------------
const dialogueContainer = new PIXI.Container();
dialogueContainer.label = 'dialogue';
uiLayer.addChild(dialogueContainer);
const dialogueBoxBg = new PIXI.Graphics();
dialogueContainer.addChild(dialogueBoxBg);
const speakerText = new PIXI.Text({
text: '',
style: {
fill: 0xa5b4fc,
fontSize: 13,
fontFamily: 'Inter, system-ui, sans-serif',
fontWeight: '600',
},
});
speakerText.position.set(14, 8);
dialogueContainer.addChild(speakerText);
const dialogueText = new PIXI.Text({
text: '',
style: {
fill: 0xf8fafc,
fontSize: 15,
fontFamily: 'Inter, system-ui, sans-serif',
wordWrap: true,
wordWrapWidth: 320,
lineHeight: 20,
},
});
dialogueText.position.set(14, 28);
dialogueContainer.addChild(dialogueText);
// -- Nameplate (above dialogue box) ---------------------------------------
const nameplateContainer = new PIXI.Container();
nameplateContainer.label = 'nameplate';
uiLayer.addChild(nameplateContainer);
const nameplateBg = new PIXI.Graphics();
nameplateContainer.addChild(nameplateBg);
const nameplateText = new PIXI.Text({
text: '',
style: {
fill: 0xd9e6ff,
fontSize: 12,
fontFamily: 'Inter, system-ui, sans-serif',
fontWeight: '600',
},
});
nameplateText.position.set(8, 4);
nameplateContainer.addChild(nameplateText);
// -- Phase badge (top-right) ----------------------------------------------
const phaseContainer = new PIXI.Container();
phaseContainer.label = 'phaseBadge';
uiLayer.addChild(phaseContainer);
const phaseBg = new PIXI.Graphics();
phaseContainer.addChild(phaseBg);
const phaseText = new PIXI.Text({
text: 'phase: idle',
style: {
fill: 0x9da2b6,
fontSize: 11,
fontFamily: 'monospace',
},
});
phaseText.position.set(8, 4);
phaseContainer.addChild(phaseText);
// -- Layout ---------------------------------------------------------------
function layout() {
const w = app.screen.width;
const h = app.screen.height;
const padding = 10;
const boxH = 100;
const boxW = w - padding * 2;
// Dialogue box at bottom
dialogueContainer.position.set(padding, h - boxH - padding);
dialogueBoxBg.clear();
dialogueBoxBg.roundRect(0, 0, boxW, boxH, DIALOGUE_BOX_RADIUS);
dialogueBoxBg.fill({ color: DIALOGUE_BOX_COLOR, alpha: DIALOGUE_BOX_ALPHA });
dialogueText.style.wordWrapWidth = Math.max(200, boxW - 28);
// Nameplate just above dialogue box
const npW = Math.min(180, w * 0.25);
const npH = 24;
nameplateContainer.position.set(padding, h - boxH - padding - npH - 4);
nameplateBg.clear();
nameplateBg.roundRect(0, 0, npW, npH, NAMEPLATE_RADIUS);
nameplateBg.fill({ color: NAMEPLATE_BG, alpha: 0.9 });
// Phase badge top-right
const pbW = Math.min(200, w * 0.3);
const pbH = 22;
phaseContainer.position.set(w - pbW - padding, padding);
phaseBg.clear();
phaseBg.roundRect(0, 0, pbW, pbH, 4);
phaseBg.fill({ color: 0x14141d, alpha: 0.85 });
}
layout();
app.renderer.on('resize', layout);
/**
* Update the UI overlay text.
*
* @param {Object} state
* @param {string} state.phase Current phase string
* @param {string} state.speakerLabel "role · name" label
* @param {string} state.dialogueContent Visible dialogue text
* @param {string} state.nameplate Nameplate label text
*/
function update(state) {
if (typeof state.phase === 'string') {
phaseText.text = `phase: ${state.phase}`;
}
if (typeof state.speakerLabel === 'string') {
speakerText.text = state.speakerLabel;
}
if (typeof state.dialogueContent === 'string') {
dialogueText.text = state.dialogueContent;
}
if (typeof state.nameplate === 'string') {
nameplateText.text = state.nameplate;
}
}
return { update, layout, speakerText, dialogueText };
}
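The `update()` function above applies only the fields that arrive as strings, so callers can send sparse partial state. A standalone sketch of that guard pattern (the field list mirrors the real one; the `makeUpdater` factory is illustrative):

```javascript
// Build an updater that copies only string-typed fields onto the target.
function makeUpdater(target) {
  return (state) => {
    for (const key of ['phase', 'speakerLabel', 'dialogueContent', 'nameplate']) {
      if (typeof state[key] === 'string') target[key] = state[key];
    }
  };
}

const view = { phase: 'idle', speakerLabel: '', dialogueContent: '', nameplate: '' };
const update = makeUpdater(view);

// Sparse update: untouched fields keep their previous values.
update({ phase: 'openings' });
console.log(view.phase); // 'openings'
```

Sending `{ phase: 'verdict_vote' }` alone therefore never blanks the dialogue text.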


@@ -0,0 +1,27 @@
/**
* PixiJS runtime resolver.
*
* Tries the global PIXI object first (loaded via <script> tag), then falls
* back to a dynamic ESM import from the CDN. Returns `null` when neither
* source is available so callers can degrade gracefully.
*/
const CDN_URL = 'https://cdn.jsdelivr.net/npm/pixi.js@8.9.1/+esm';
export async function resolvePixiRuntime() {
const globalPixi = globalThis.PIXI;
if (globalPixi?.Application) {
return globalPixi;
}
try {
return await import(CDN_URL);
} catch (error) {
// eslint-disable-next-line no-console
console.warn(
'PIXI runtime unavailable; continuing without renderer stage.',
error,
);
return null;
}
}
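The resolver's "global first, dynamic import fallback, null on failure" strategy can be exercised without a browser or CDN by injecting the dependencies. A sketch (`resolveRuntime` is a hypothetical dependency-injected analogue of `resolvePixiRuntime`):

```javascript
// Same three-step strategy, with the global object and importer injected.
async function resolveRuntime(globalObj, dynamicImport) {
  if (globalObj?.Application) return globalObj; // prefer the <script> global
  try {
    return await dynamicImport(); // fall back to the CDN module
  } catch {
    return null; // callers degrade gracefully
  }
}

const fakeGlobal = { Application: class {} };
resolveRuntime(fakeGlobal, () => Promise.reject(new Error('offline')))
  .then((runtime) => console.log(runtime === fakeGlobal)); // true
```

Returning `null` rather than throwing is what lets `createStage` mark `pixiReady = 'false'` and keep the DOM overlay functional.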

public/renderer/stage.js (new file, 90 lines)

@@ -0,0 +1,90 @@
/**
* Core stage manager — creates the PixiJS Application and the four ordered
* scene-graph containers (background → characters → ui → effects).
*
* Returns `null` when PixiJS is unavailable so consumers can degrade.
*/
import { resolvePixiRuntime } from './pixi-runtime.js';
/**
* Default canvas dimensions. The canvas is responsive; these are the logical
* reference dimensions the layers use for layout.
*/
export const STAGE_WIDTH = 960;
export const STAGE_HEIGHT = 540;
/**
* @typedef {Object} RendererStage
* @property {import('pixi.js').Application} app
* @property {typeof import('pixi.js')} PIXI
* @property {import('pixi.js').Container} backgroundLayer
* @property {import('pixi.js').Container} charactersLayer
* @property {import('pixi.js').Container} uiLayer
* @property {import('pixi.js').Container} effectsLayer
* @property {() => void} destroy
*/
/**
* Bootstrap the PixiJS application and mount to the given host element.
*
* @param {HTMLElement} host DOM element to mount the canvas into
* @returns {Promise<RendererStage | null>}
*/
export async function createStage(host) {
const PIXI = await resolvePixiRuntime();
if (!PIXI) {
host.dataset.pixiReady = 'false';
return null;
}
try {
const app = new PIXI.Application();
await app.init({
width: STAGE_WIDTH,
height: STAGE_HEIGHT,
antialias: true,
backgroundAlpha: 0,
resizeTo: host,
});
const backgroundLayer = new PIXI.Container();
const charactersLayer = new PIXI.Container();
const uiLayer = new PIXI.Container();
const effectsLayer = new PIXI.Container();
backgroundLayer.label = 'background';
charactersLayer.label = 'characters';
uiLayer.label = 'ui';
effectsLayer.label = 'effects';
app.stage.addChild(backgroundLayer);
app.stage.addChild(charactersLayer);
app.stage.addChild(uiLayer);
app.stage.addChild(effectsLayer);
host.innerHTML = '';
host.appendChild(app.canvas);
host.dataset.pixiReady = 'true';
const destroy = () => {
app.destroy(true, { children: true });
host.dataset.pixiReady = 'false';
};
return {
app,
PIXI,
backgroundLayer,
charactersLayer,
uiLayer,
effectsLayer,
destroy,
};
} catch (error) {
host.dataset.pixiReady = 'false';
// eslint-disable-next-line no-console
console.warn('Failed to bootstrap PIXI renderer stage:', error);
return null;
}
}

quality-review.json (new file, 66 lines)

@@ -0,0 +1,66 @@
{
"scope": "working-tree vs 467d7d9 (HEAD/main)",
"verdict": "Ready with fixes",
"tests": { "pass": 158, "fail": 0, "skipped": 2 },
"triage": {
"docsOnly": false,
"reactNextPerfReview": false,
"uiGuidelinesAudit": true
},
"issues": {
"critical": [
{
"id": "C1",
"title": "/press and /present endpoints have no rate limiting",
"location": "src/server.ts:703-731",
"impact": "Unauthenticated flood of render_directive events via rapid /press calls"
},
{
"id": "C2",
"title": "Twitch !objection emits count: -1 sentinel; nothing increments it",
"location": "src/twitch/adapter.ts:149-153",
"impact": "SSE listeners receive count=-1, corrupting overlay and dashboard UI state"
}
],
"important": [
{
"id": "I1",
"title": "/api/metrics endpoint not documented; operator runbook contradicts",
"location": "docs/operator-runbook.md:213"
},
{
"id": "I2",
"title": "TWITCH_EVENTSUB_SECRET in .env.example is unused in code",
"location": ".env.example:42"
},
{
"id": "I3",
"title": "event-taxonomy.md render_directive payload shows flat pose/face vs actual map",
"location": "docs/event-taxonomy.md"
},
{
"id": "I4",
"title": "Dashboard App.tsx duplicates session-snapshot event mapping logic",
"location": "dashboard/src/App.tsx:10-160"
}
],
"minor": [
{
"id": "M1",
"title": "inferRenderDirective only matches keywords with trailing !",
"location": "src/court/orchestrator.ts:100-115"
},
{
"id": "M2",
"title": "public/app.js growing large (~970 lines)",
"location": "public/app.js"
},
{
"id": "M3",
"title": "record:sse script flags not in docs/api.md",
"location": "README.md"
}
]
},
"untested_new_modules": ["src/metrics.ts", "src/twitch/adapter.ts"]
}

quality-review.md (new file, 124 lines)

@@ -0,0 +1,124 @@
# Quality Review — Phase 7 Uncommitted Changes
## Summary
- **Verdict: Ready with fixes** (2 critical, 4 important)
- **Scope:** Working tree vs HEAD (`467d7d9`) — 20 modified files + ~12 new untracked files
- **Tests:** 158 pass / 0 fail / 2 skipped
## Triage
- Docs-only: **no**
- React/Next perf review: **no** (dashboard is Vite + React 18, no Next.js)
- UI guidelines audit: **yes** — `dashboard/src/App.tsx` and `public/app.js` / `public/index.html` have significant UI changes
- Reason:
- `.tsx` files changed (`dashboard/src/App.tsx`)
- Client JS overlay code heavily extended (`public/app.js`, `public/index.html`)
- Core server, orchestrator, types, events, and new modules added (metrics, replay, twitch)
---
## Strengths
1. **Well-structured event system extension** — Phase 7 event types (`render_directive`, `witness_statement`, `case_file_generated`) follow existing conventions: typed payloads, runtime assertion in `assertEventPayload`, and matching test coverage.
2. **Comprehensive documentation**`docs/event-taxonomy.md`, `docs/api.md`, `docs/broadcast-integration.md`, and `README.md` are all updated with the new endpoints, event types, and Twitch integration docs.
3. **Prometheus metrics module**`src/metrics.ts` is clean, uses a dedicated registry, instruments all store operations via a proxy wrapper (`instrumentCourtSessionStore`), and exposes SSE/vote telemetry.
4. **Replay/recording system** — NDJSON record + replay is well-separated in `src/replay/session-replay.ts` with a clean manager class, test coverage, and both CLI flag and env-var entry points.
5. **Graceful degradation** — Twitch adapter, replay, and metrics all degrade to no-op when unconfigured; LLM client now falls back to mock on empty response instead of returning empty strings.
6. **Docker security hardened** — compose binds to `127.0.0.1` only and adds `TRUST_PROXY` support with proper parsing (bool / int / CIDR list).
---
## Issues
### Critical (Must Fix)
#### C1. `/press` and `/present` endpoints have no rate limiting
- **Location:** [src/server.ts](src/server.ts#L703-L731)
- **What:** The new `POST /api/court/sessions/:id/press` and `POST /api/court/sessions/:id/present` handlers are registered without any rate limiter. The existing vote endpoint has `VoteSpamGuard`; these audience-interaction endpoints have none.
- **Why it matters:** These are public-facing, unauthenticated endpoints. An attacker can flood `render_directive` events via rapid `/press` calls, overwhelming SSE clients and the overlay renderer. The docs themselves note "Per-loop audience action rate limits are enforced at the API layer" (broadcast-integration.md) — but this is not actually implemented.
- **Minimal fix:** Apply `express-rate-limit` (or extend `VoteSpamGuard`) to both endpoints. A sensible default: 10 req/IP/10s.
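A plain-JS sketch of the suggested 10 req/IP/10s policy (`express-rate-limit` or an extended `VoteSpamGuard` would be the real fix; this just shows the fixed-window math):

```javascript
// Fixed-window limiter: per IP, allow at most `max` requests per window.
function createFixedWindowLimiter({ windowMs = 10_000, max = 10 } = {}) {
  const windows = new Map(); // ip -> { start, count }
  return function allow(ip, now = Date.now()) {
    const w = windows.get(ip);
    if (!w || now - w.start >= windowMs) {
      windows.set(ip, { start: now, count: 1 }); // open a fresh window
      return true;
    }
    w.count += 1;
    return w.count <= max;
  };
}

const allow = createFixedWindowLimiter();
let allowed = 0;
for (let i = 0; i < 12; i += 1) {
  if (allow('203.0.113.9', 0)) allowed += 1;
}
console.log(allowed); // 10
```

Wired as Express middleware, the limiter would run before both the `/press` and `/present` handlers and return 429 when `allow` is false.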
#### C2. Twitch `!objection` emits `count: -1` sentinel, but nothing increments it
- **Location:** [src/twitch/adapter.ts](src/twitch/adapter.ts#L149-L153)
- **What:** `wireTwitchToSession` emits `objection_count_changed` with `count: -1` and a comment "sentinel — orchestrator should increment". But the orchestrator only increments `objectionCount` inside `handleModerationRedirect` (moderation path). No code reads the `-1` sentinel from an SSE event and increments the counter.
- **Why it matters:** Every Twitch `!objection` command will broadcast `count: -1` to all SSE listeners. The overlay and dashboard will display `-1` as the objection count, corrupting the UI state. This is a data integrity issue.
- **Minimal fix:** Either (a) read the current `objectionCount` from the session inside `wireTwitchToSession` and emit `currentCount + 1`, or (b) have the store handle the increment internally when it receives this event.
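A sketch of option (a) — read the current count and emit `count + 1` (the `store`/`session` shapes here are assumptions drawn from the review text, not the real types):

```javascript
// Increment the session's objection count and broadcast the real value.
function emitObjection(store, session) {
  const next = (session.metadata.objectionCount ?? 0) + 1;
  session.metadata.objectionCount = next;
  store.emitEvent(session.id, 'objection_count_changed', {
    count: next, // a real count — never the -1 sentinel
    changedAt: new Date().toISOString(),
  });
}

const events = [];
const fakeStore = { emitEvent: (_id, _type, payload) => events.push(payload) };
const fakeSession = { id: 's1', metadata: {} };
emitObjection(fakeStore, fakeSession);
emitObjection(fakeStore, fakeSession);
console.log(events.map((e) => e.count)); // [ 1, 2 ]
```

Either variant works; the invariant that matters is that SSE listeners never see a negative count.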
---
### Important (Should Fix)
#### I1. `/api/metrics` endpoint not documented; operator runbook says it doesn't exist
- **Location:** [docs/operator-runbook.md](docs/operator-runbook.md#L213), README "API at a Glance" section
- **What:** `docs/operator-runbook.md` line 213 states: "There is no built-in metrics endpoint." But the uncommitted code adds `GET /api/metrics` serving Prometheus-format metrics. The README's API listing also omits `/api/metrics`.
- **Why it matters:** Operators won't discover the metrics endpoint; the runbook actively misleads.
- **Minimal fix:** Update `docs/operator-runbook.md` to reference `/api/metrics` and add it to the README's API listing and `docs/api.md`.
#### I2. `TWITCH_EVENTSUB_SECRET` in `.env.example` is unused
- **Location:** `.env.example` line 42
- **What:** `.env.example` declares `TWITCH_EVENTSUB_SECRET=` but no code reads this variable. The Twitch adapter only uses `TWITCH_CHANNEL`, `TWITCH_BOT_TOKEN`, and `TWITCH_CLIENT_ID`.
- **Why it matters:** Misleads operators into thinking EventSub is configured. If a secret is generated and placed here, it creates a false sense of security for a feature that doesn't exist yet.
- **Minimal fix:** Remove the line or comment it with `# (future — not yet used)`.
#### I3. `docs/event-taxonomy.md` payload schema mismatch for `render_directive`
- **Location:** [docs/event-taxonomy.md](docs/event-taxonomy.md) (render_directive payload section)
- **What:** The doc schema shows `directive.pose` (singular string) and `directive.face` (singular string), but the TypeScript type `RenderDirective` in `src/types.ts` uses `poses?: Partial<Record<CourtRole, CharacterPose>>` (plural, map) and `faces?: Partial<Record<CourtRole, CharacterFace>>` (plural, map). The orchestrator's `inferRenderDirective` emits `poses: { [role]: 'point' }`, not a flat `pose: 'point'`.
- **Why it matters:** Frontend consumers implementing against the docs will expect a flat string but receive an object map, causing rendering bugs.
- **Minimal fix:** Update the event taxonomy doc to show the actual `poses`/`faces` map shape.
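For concreteness, the actual wire shape per `src/types.ts` looks like this (the role key and pose/face values here are illustrative, not an exhaustive enum):

```javascript
// render_directive payload: poses/faces are maps keyed by role.
const directive = {
  camera: 'prosecution',
  poses: { prosecutor: 'point' }, // map — not a flat `pose` string
  faces: { prosecutor: 'talk' },  // map — not a flat `face` string
};

console.log(typeof directive.poses.prosecutor); // 'string'
```

A consumer coded against the doc's flat `directive.pose` would read `undefined` and render nothing.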
#### I4. Dashboard `applyEventToSnapshot` duplicates session-snapshot logic
- **Location:** [dashboard/src/App.tsx](dashboard/src/App.tsx#L10-L160) (the entire `applyEventToSnapshot` function)
- **What:** The 160-line `applyEventToSnapshot` function in `App.tsx` re-implements event-to-snapshot mapping that already exists in `dashboard/src/session-snapshot.ts` (`mapSessionToSnapshot`). The two implementations handle overlapping event types (`phase_changed`, `turn`, `judge_recap_emitted`, `vote_updated`) with slightly different parsing logic.
- **Why it matters:** Two sources of truth for the same transformation. If one is updated, the other goes stale. The `App.tsx` version uses its own `asRecord`/`asString`/`asNumber` helpers rather than sharing the snapshot mapper.
- **Minimal fix:** Extend `mapSessionToSnapshot` (or add an `applyEventDelta` function in `session-snapshot.ts`) and import it into `App.tsx`.
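A sketch of what a shared `applyEventDelta` reducer could look like (the function name and snapshot fields are assumptions; the real mapper lives in `session-snapshot.ts`):

```javascript
// Pure reducer: one event in, new snapshot out; unknown events are no-ops.
function applyEventDelta(snapshot, event) {
  switch (event.type) {
    case 'phase_changed':
      return { ...snapshot, phase: event.payload.phase };
    case 'vote_updated':
      return { ...snapshot, votes: event.payload.tally };
    default:
      return snapshot; // leave the snapshot untouched
  }
}

const s0 = { phase: 'idle', votes: {} };
const s1 = applyEventDelta(s0, {
  type: 'phase_changed',
  payload: { phase: 'openings' },
});
console.log(s1.phase, s0.phase); // 'openings' 'idle' (immutable update)
```

With one reducer exported from `session-snapshot.ts`, both the initial `mapSessionToSnapshot` pass and the live SSE path in `App.tsx` stay in sync by construction.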
---
### Minor (Nice to Have)
#### M1. `inferRenderDirective` only matches keywords with a trailing `!`
- **Location:** [src/court/orchestrator.ts](src/court/orchestrator.ts#L100-L115)
- **What:** The function calls `dialogue.toUpperCase()` then checks for `'OBJECTION!'`, `'HOLD IT!'`, `'TAKE THAT!'`. This works, but only matches exact substrings with the exclamation mark. Dialogue like `"Objection, your honor"` (no `!`) won't trigger the effect.
- **Why it matters:** Low severity — the effect miss is cosmetic, but users may expect the Ace Attorney effect on any `objection` keyword.
- **Minimal fix:** Consider also matching without the trailing `!`.
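The regex form the fix landed on in `orchestrator.ts` makes the trailing punctuation optional:

```javascript
// Word-boundary match with optional trailing '!' or '.'.
const OBJECTION_RE = /\bOBJECTION[!.]?/;

console.log(OBJECTION_RE.test('Objection, your honor'.toUpperCase())); // true
console.log(OBJECTION_RE.test('OBJECTION!')); // true
console.log(OBJECTION_RE.test('NO KEYWORD HERE')); // false
```

The `\b` anchor keeps it from firing inside longer words while `[!.]?` tolerates commas or missing punctuation after the keyword.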
#### M2. `public/app.js` growing large
- **Location:** [public/app.js](public/app.js) (~970 lines after diff)
- **What:** The file is accumulating responsibilities: SSE connection, fixture replay, dialogue typewriter, renderer bootstrap, keyboard shortcuts, vote UI, caption controls. No module splitting beyond the renderer.
- **Why it matters:** Harder to maintain as more overlay features land. Not blocking, but worth noting.
#### M3. `record:sse` script not documented in `docs/api.md`
- **Location:** README mentions it; `docs/api.md` does not.
- **What:** The `npm run record:sse` command and its flags (`--session`, `--base`, `--out`, `--max-events`, `--duration-ms`) are documented in README but absent from the API reference doc.
- **Why it matters:** Minor discovery issue for developers who check `docs/api.md` first.
---
## UI Guidelines (terse audit of changed UI files)
- [dashboard/src/App.tsx](dashboard/src/App.tsx#L262): `setInterval` with 5s polling — no `AbortController` or visibility-based pause; wastes battery on background tabs.
- [public/app.js](public/app.js): `document.addEventListener('keydown', ...)` does preventDefault on Enter/Escape globally — may conflict with form fields in future overlays. Current `isEditableElementFocused` guard is adequate for now.
- [public/index.html](public/index.html): New `#pixiStage` container introduced — no `aria-hidden="true"` attribute on the decorative canvas; screen readers may try to parse it.
---
## Verification Evidence
| Check | Result |
| ------------------------- | ------------------------------------------------------- |
| `npm test` | 158 pass, 0 fail, 2 skipped |
| TypeScript compilation | Implicit via test run (tsx) |
| New event payloads tested | Yes — 6 new tests for Phase 7 events |
| Replay module tested | Yes — `session-replay.test.ts`, `server-replay.test.ts` |
| Metrics module | Not tested (no `src/metrics.test.ts`) |
| Twitch adapter | Not tested (no `src/twitch/adapter.test.ts`) |


@@ -7,6 +7,11 @@ import {
safeBroadcastHook,
type BroadcastAdapter,
} from '../broadcast/adapter.js';
import {
createTwitchAdapter,
wireTwitchToSession,
type TwitchAdapter,
} from '../twitch/adapter.js';
import {
applyWitnessCap,
estimateTokens,
@@ -25,6 +30,10 @@ import type {
CourtRole,
CourtSession,
CourtTurn,
RenderDirective,
CameraPreset,
CaseFile,
WitnessStatement,
} from '../types.js';
import type { CourtSessionStore } from '../store/session-store.js';
import { buildCourtSystemPrompt } from './personas.js';
@@ -63,6 +72,68 @@ const MODERATION_REDIRECT_DIALOGUE =
const broadcastBySession = new Map<string, BroadcastAdapter>();
// ---------------------------------------------------------------------------
// Phase 7: Render directive inference (#70)
// ---------------------------------------------------------------------------
const ROLE_CAMERA_MAP: Record<CourtRole, CameraPreset> = {
judge: 'judge',
prosecutor: 'prosecution',
defense: 'defense',
witness_1: 'witness',
witness_2: 'witness',
witness_3: 'witness',
bailiff: 'wide',
};
function inferRenderDirective(
role: CourtRole,
phase: CourtPhase,
dialogue: string,
): RenderDirective {
const directive: RenderDirective = {
camera: ROLE_CAMERA_MAP[role] ?? 'wide',
poses: { [role]: 'talk' } as RenderDirective['poses'],
};
// Detect exclamatory dialogue for effects (match with or without trailing !)
const upper = dialogue.toUpperCase();
if (/\bOBJECTION[!.]?/.test(upper)) {
directive.effect = 'objection';
directive.poses = { [role]: 'point' } as RenderDirective['poses'];
} else if (/\bHOLD IT[!.]?/.test(upper)) {
directive.effect = 'hold_it';
directive.poses = { [role]: 'slam' } as RenderDirective['poses'];
} else if (/\bTAKE THAT[!.]?/.test(upper)) {
directive.effect = 'take_that';
directive.poses = { [role]: 'point' } as RenderDirective['poses'];
}
// Phase-specific camera overrides
if (phase === 'verdict_vote' || phase === 'final_ruling') {
directive.camera = 'verdict';
} else if (phase === 'evidence_reveal') {
directive.camera = 'evidence';
}
return directive;
}
function emitRenderDirective(
store: CourtSessionStore,
session: CourtSession,
directive: RenderDirective,
turnId?: string,
): void {
session.metadata.lastRenderDirective = directive;
store.emitEvent(session.id, 'render_directive', {
directive,
turnId,
phase: session.phase,
emittedAt: new Date().toISOString(),
});
}
type BudgetResolution = {
requestedMaxTokens: number;
appliedMaxTokens: number;
@@ -212,7 +283,8 @@ async function generateTurn(input: {
}
const moderation = moderateContent(dialogue);
const activeBroadcast =
input.broadcast ?? broadcastBySession.get(session.id);
if (moderation.flagged) {
await handleFlaggedModeration({
@@ -238,6 +310,17 @@ async function generateTurn(input: {
appendTurnToSession(session, turn);
// Phase 7: Infer and emit render directive for this turn
const renderDirective = inferRenderDirective(
role,
session.phase,
moderation.sanitized,
);
emitRenderDirective(store, session, renderDirective, turn.id);
// Phase 7: Emit witness statement if applicable
emitWitnessStatement(store, session, turn);
store.emitEvent(session.id, 'token_budget_applied', {
turnId: turn.id,
speaker,
@@ -304,6 +387,98 @@ function createGenerateBudgetedTurn(input: {
});
}
// ---------------------------------------------------------------------------
// Phase 7: Structured case file (#67)
// ---------------------------------------------------------------------------
function buildCaseFile(session: CourtSession): CaseFile {
const meta = session.metadata;
const assignments = meta.roleAssignments;
const witnesses: CaseFile['witnesses'] = [];
for (const wRole of [
'witness_1',
'witness_2',
'witness_3',
] as CourtRole[]) {
const agentId = (assignments as unknown as Record<string, AgentId>)[
wRole
];
if (agentId) {
const agent = AGENTS[agentId];
witnesses.push({
role: wRole,
agentId,
displayName: agent?.displayName ?? agentId,
bio: agent?.description ?? 'Court witness',
});
}
}
const evidenceItems: CaseFile['evidence'] = (meta.evidenceCards ?? []).map(
(card, i) => ({
id: card.id,
label: `Evidence ${i + 1}`,
description: card.text,
revealPhase: 'evidence_reveal' as CourtPhase,
}),
);
return {
title: session.topic,
genre: meta.currentGenre ?? 'absurd_civil',
caseType: meta.caseType,
synopsis: session.topic,
charges:
meta.caseType === 'criminal' ?
['As stated in case prompt']
: ['Damages as alleged'],
witnesses,
evidence: evidenceItems,
sentenceOptions: meta.sentenceOptions,
};
}
function emitCaseFile(store: CourtSessionStore, session: CourtSession): void {
const caseFile = buildCaseFile(session);
session.metadata.caseFile = caseFile;
store.emitEvent(session.id, 'case_file_generated', {
caseFile,
sessionId: session.id,
generatedAt: new Date().toISOString(),
});
}
// ---------------------------------------------------------------------------
// Phase 7: Witness statement emission (#75)
// ---------------------------------------------------------------------------
function emitWitnessStatement(
store: CourtSessionStore,
session: CourtSession,
turn: CourtTurn,
): void {
if (!turn.role.startsWith('witness_')) return;
const statement: WitnessStatement = {
witnessRole: turn.role,
agentId: turn.speaker,
statementText: turn.dialogue,
issuedAt: new Date().toISOString(),
};
if (!session.metadata.witnessStatements) {
session.metadata.witnessStatements = [];
}
session.metadata.witnessStatements.push(statement);
store.emitEvent(session.id, 'witness_statement', {
statement,
phase: session.phase,
emittedAt: new Date().toISOString(),
});
}
export async function runCourtSession(
sessionId: string,
store: CourtSessionStore,
@@ -313,6 +488,13 @@ export async function runCourtSession(
const tts = options.ttsAdapter ?? createTTSAdapterFromEnv();
const broadcast = await createBroadcastAdapterFromEnv(); // Phase 3: Initialize broadcast adapter
broadcastBySession.set(session.id, broadcast);
// Phase 7: Generate and emit structured case file
emitCaseFile(store, session);
// Phase 7: Wire Twitch integration
const twitchAdapter = createTwitchAdapter();
void wireTwitchToSession(twitchAdapter, store, session.id);
const pause = options.sleepFn ?? sleep;
const witnessCapConfig = resolveWitnessCapConfig();
const roleTokenBudgetConfig = resolveRoleTokenBudgetConfig();
@@ -388,6 +570,7 @@ export async function runCourtSession(
await store.failSession(session.id, message);
} finally {
broadcastBySession.delete(session.id);
twitchAdapter.disconnect();
// eslint-disable-next-line no-console
console.info(
`[tts] session=${session.id} provider=${tts.provider} success=${ttsMetrics.success} failure=${ttsMetrics.failure}`,


@@ -434,3 +434,104 @@ test('assertEventPayload: session_failed missing reason', () => {
TypeError,
);
});
// ---------------------------------------------------------------------------
// Phase 7: render_directive
// ---------------------------------------------------------------------------
test('assertEventPayload: render_directive valid', () => {
assert.doesNotThrow(() =>
assertEventPayload(
makeEvent('render_directive', {
directive: { camera: 'judge', effect: 'objection' },
phase: 'witness_exam',
emittedAt: new Date().toISOString(),
}),
),
);
});
test('assertEventPayload: render_directive missing directive object', () => {
assert.throws(
() =>
assertEventPayload(
makeEvent('render_directive', {
phase: 'openings',
emittedAt: new Date().toISOString(),
}),
),
TypeError,
);
});
// ---------------------------------------------------------------------------
// Phase 7: witness_statement
// ---------------------------------------------------------------------------
test('assertEventPayload: witness_statement valid', () => {
assert.doesNotThrow(() =>
assertEventPayload(
makeEvent('witness_statement', {
statement: {
witnessRole: 'witness_1',
agentId: 'chora',
statementText: 'I saw it happen.',
issuedAt: new Date().toISOString(),
},
phase: 'witness_exam',
emittedAt: new Date().toISOString(),
}),
),
);
});
test('assertEventPayload: witness_statement missing statement', () => {
assert.throws(
() =>
assertEventPayload(
makeEvent('witness_statement', {
phase: 'witness_exam',
emittedAt: new Date().toISOString(),
}),
),
TypeError,
);
});
// ---------------------------------------------------------------------------
// Phase 7: case_file_generated
// ---------------------------------------------------------------------------
test('assertEventPayload: case_file_generated valid', () => {
assert.doesNotThrow(() =>
assertEventPayload(
makeEvent('case_file_generated', {
caseFile: {
title: 'Test Case',
genre: 'absurd_civil',
caseType: 'civil',
synopsis: 'A test case',
charges: [],
witnesses: [],
evidence: [],
sentenceOptions: ['warning'],
},
sessionId: 'sess-1',
generatedAt: new Date().toISOString(),
}),
),
);
});
test('assertEventPayload: case_file_generated missing caseFile', () => {
assert.throws(
() =>
assertEventPayload(
makeEvent('case_file_generated', {
sessionId: 'sess-1',
generatedAt: new Date().toISOString(),
}),
),
TypeError,
);
});
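The guard tests above all reduce to two shape checks. A minimal standalone sketch of the helpers the `assertEventPayload` switch calls (`hasObjectKey` / `hasStringKeys` signatures are inferred from their call sites; the real bodies may differ in detail):

```typescript
type Payload = Record<string, unknown>;

// True when payload[key] is a plain object (not null, not an array).
function hasObjectKey(payload: Payload, key: string): boolean {
  const value = payload[key];
  return typeof value === 'object' && value !== null && !Array.isArray(value);
}

// True when every listed key is present as a string.
function hasStringKeys(payload: Payload, keys: string[]): boolean {
  return keys.every(key => typeof payload[key] === 'string');
}
```

Together these encode the contract each Phase 7 event enforces: one required object field plus a set of required string fields.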


@@ -5,7 +5,7 @@
* Use `assertEventPayload` to validate a raw `CourtEvent` at runtime.
*/
import type { CourtEvent, CourtPhase, RenderDirective, CaseFile, WitnessStatement } from './types.js';
// ---------------------------------------------------------------------------
// Payload interfaces
@@ -156,6 +156,27 @@ export interface ObjectionCountChangedPayload {
changedAt: string; // ISO 8601
}
// Phase 7 payload interfaces
export interface RenderDirectivePayload {
directive: RenderDirective;
turnId?: string;
phase: string;
emittedAt: string; // ISO 8601
}
export interface WitnessStatementPayload {
statement: WitnessStatement;
phase: string;
emittedAt: string; // ISO 8601
}
export interface CaseFileGeneratedPayload {
caseFile: CaseFile;
sessionId: string;
generatedAt: string; // ISO 8601
}
// ---------------------------------------------------------------------------
// Shape guard
// ---------------------------------------------------------------------------
@@ -391,6 +412,40 @@ export function assertEventPayload(event: CourtEvent): void {
}
break;
// Phase 7 event types
case 'render_directive':
if (
!hasObjectKey(payload, 'directive') ||
!hasStringKeys(payload, ['phase', 'emittedAt'])
) {
throw new TypeError(
`render_directive payload missing required fields: directive (object), phase, emittedAt`,
);
}
break;
case 'witness_statement':
if (
!hasObjectKey(payload, 'statement') ||
!hasStringKeys(payload, ['phase', 'emittedAt'])
) {
throw new TypeError(
`witness_statement payload missing required fields: statement (object), phase, emittedAt`,
);
}
break;
case 'case_file_generated':
if (
!hasObjectKey(payload, 'caseFile') ||
!hasStringKeys(payload, ['sessionId', 'generatedAt'])
) {
throw new TypeError(
`case_file_generated payload missing required fields: caseFile (object), sessionId, generatedAt`,
);
}
break;
default: {
const _exhaustive: never = type;
throw new TypeError(`Unknown event type: ${String(_exhaustive)}`);

src/llm/client.test.ts Normal file

@@ -0,0 +1,139 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { llmGenerate } from './client.js';
type EnvKey = 'OPENROUTER_API_KEY' | 'LLM_MOCK' | 'LLM_MODEL';
function withTemporaryEnv(
updates: Partial<Record<EnvKey, string>>,
run: () => Promise<void>,
): Promise<void> {
const previous = new Map<EnvKey, string | undefined>();
for (const key of Object.keys(updates) as EnvKey[]) {
previous.set(key, process.env[key]);
process.env[key] = updates[key];
}
return run().finally(() => {
for (const [key, value] of previous.entries()) {
if (value === undefined) {
delete process.env[key];
} else {
process.env[key] = value;
}
}
});
}
test('llmGenerate falls back when provider returns empty message content', async () => {
const originalFetch = globalThis.fetch;
const originalArgv = [...process.argv];
globalThis.fetch = async () =>
new Response(
JSON.stringify({
choices: [
{
message: {
role: 'assistant',
content: '',
reasoning:
'Internal reasoning consumed the token budget before final answer.',
},
},
],
}),
{
status: 200,
headers: {
'content-type': 'application/json',
},
},
);
process.argv = process.argv.filter(arg => arg !== '--test');
await withTemporaryEnv(
{
OPENROUTER_API_KEY: 'test-key',
LLM_MOCK: 'false',
LLM_MODEL: 'stepfun/step-3.5-flash:free',
},
async () => {
const output = await llmGenerate({
messages: [
{
role: 'system',
content: 'You are a courtroom defense attorney.',
},
{
role: 'user',
content: 'Deliver your opening statement.',
},
],
maxTokens: 180,
});
assert.notEqual(
output,
'',
'Expected non-empty fallback text when model content is empty',
);
assert.match(output, /Ladies and gentlemen/i);
},
).finally(() => {
globalThis.fetch = originalFetch;
process.argv = originalArgv;
});
});
test('llmGenerate returns sanitized provider content when non-empty', async () => {
const originalFetch = globalThis.fetch;
const originalArgv = [...process.argv];
globalThis.fetch = async () =>
new Response(
JSON.stringify({
choices: [
{
message: {
role: 'assistant',
content:
'"**Objection!** Visit https://example.com for docs"',
},
},
],
}),
{
status: 200,
headers: {
'content-type': 'application/json',
},
},
);
process.argv = process.argv.filter(arg => arg !== '--test');
await withTemporaryEnv(
{
OPENROUTER_API_KEY: 'test-key',
LLM_MOCK: 'false',
LLM_MODEL: 'stepfun/step-3.5-flash:free',
},
async () => {
const output = await llmGenerate({
messages: [
{ role: 'system', content: 'You are concise.' },
{ role: 'user', content: 'Say one line.' },
],
maxTokens: 120,
});
assert.equal(output, 'Objection! Visit for docs');
},
).finally(() => {
globalThis.fetch = originalFetch;
process.argv = originalArgv;
});
});
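Both tests above exercise the same guard: provider content is only usable when it is a non-empty string after sanitization. A reduced standalone sketch of that check (`extractContent` is a hypothetical name; the real code inlines this logic in `llmGenerate`):

```typescript
interface ChatCompletion {
  choices?: [{ message?: { content?: unknown } }];
}

// Returns undefined when the provider sent no usable text, so the caller
// can substitute mock dialogue instead of emitting an empty turn.
function extractContent(data: ChatCompletion): string | undefined {
  const raw = data.choices?.[0]?.message?.content;
  const text = typeof raw === 'string' ? raw.trim() : '';
  return text === '' ? undefined : text;
}
```

Treating non-string content as empty covers providers that return `null` or structured content blocks in the `content` field.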


@@ -103,11 +103,22 @@ export async function llmGenerate(
}
const data = (await response.json()) as {
choices?: [{ message?: { content?: unknown } }];
};
const rawContent = data.choices?.[0]?.message?.content;
const text = typeof rawContent === 'string' ? rawContent : '';
const sanitized = sanitizeDialogue(text);
if (!sanitized) {
// eslint-disable-next-line no-console
console.warn(
`OpenRouter response returned empty content for model=${model}; falling back to mock dialogue.`,
);
return mockReply(latestUserMessage ?? '');
}
return sanitized;
} catch (error) {
// eslint-disable-next-line no-console
console.warn(

src/metrics.ts Normal file

@@ -0,0 +1,381 @@
import {
Counter,
Gauge,
Histogram,
Registry,
collectDefaultMetrics,
} from 'prom-client';
import {
CourtNotFoundError,
CourtValidationError,
type CourtSessionStore,
} from './store/session-store.js';
import type { CourtPhase, SessionStatus } from './types.js';
export const metricsRegistry = new Registry();
collectDefaultMetrics({
register: metricsRegistry,
prefix: 'juryrigged_',
});
const appInfo = new Gauge({
name: 'juryrigged_app_info',
help: 'Static metadata about the JuryRigged runtime',
labelNames: ['service', 'version'],
registers: [metricsRegistry],
});
const sessionLifecycleTotal = new Counter({
name: 'juryrigged_session_lifecycle_total',
help: 'Total number of JuryRigged session lifecycle events',
labelNames: ['event'],
registers: [metricsRegistry],
});
const sessionsByStatus = new Gauge({
name: 'juryrigged_sessions_status',
help: 'Current number of sessions grouped by status',
labelNames: ['status'],
registers: [metricsRegistry],
});
const phaseTransitionsTotal = new Counter({
name: 'juryrigged_phase_transitions_total',
help: 'Total number of successful phase transitions',
labelNames: ['phase'],
registers: [metricsRegistry],
});
const phaseTransitionRejectionsTotal = new Counter({
name: 'juryrigged_phase_transition_rejections_total',
help: 'Total number of rejected phase transitions',
labelNames: ['reason'],
registers: [metricsRegistry],
});
const sessionStoreErrorsTotal = new Counter({
name: 'juryrigged_session_store_errors_total',
help: 'Total number of store-level operation errors',
labelNames: ['operation', 'error_type'],
registers: [metricsRegistry],
});
const votesCastTotal = new Counter({
name: 'juryrigged_votes_cast_total',
help: 'Total number of accepted jury votes',
labelNames: ['vote_type'],
registers: [metricsRegistry],
});
const votesRejectedTotal = new Counter({
name: 'juryrigged_votes_rejected_total',
help: 'Total number of rejected jury vote attempts',
labelNames: ['vote_type', 'reason'],
registers: [metricsRegistry],
});
const voteCastDurationSeconds = new Histogram({
name: 'juryrigged_vote_cast_duration_seconds',
help: 'Latency of accepted vote submissions',
labelNames: ['vote_type'],
buckets: [0.001, 0.005, 0.01, 0.025, 0.05, 0.1, 0.25, 0.5, 1, 2.5, 5],
registers: [metricsRegistry],
});
const sseConnectionsTotal = new Counter({
name: 'juryrigged_sse_connections_total',
help: 'Total number of SSE stream connections opened',
registers: [metricsRegistry],
});
const sseConnectionsActive = new Gauge({
name: 'juryrigged_sse_connections_active',
help: 'Current number of active SSE stream connections',
registers: [metricsRegistry],
});
const sseDisconnectsTotal = new Counter({
name: 'juryrigged_sse_disconnects_total',
help: 'Total number of SSE disconnections by reason',
labelNames: ['reason'],
registers: [metricsRegistry],
});
const sseEventsSentTotal = new Counter({
name: 'juryrigged_sse_events_sent_total',
help: 'Total number of SSE events sent to clients',
labelNames: ['event_type'],
registers: [metricsRegistry],
});
const sseConnectionDurationSeconds = new Histogram({
name: 'juryrigged_sse_connection_duration_seconds',
help: 'Duration of SSE client connections in seconds',
buckets: [0.1, 0.5, 1, 5, 10, 30, 60, 120, 300, 600, 1800],
registers: [metricsRegistry],
});
const SESSION_STATUSES: SessionStatus[] = [
'pending',
'running',
'completed',
'failed',
];
function sanitizeLabel(value: string, fallback = 'unknown'): string {
const trimmed = value.trim();
if (!trimmed) return fallback;
return trimmed.toLowerCase().replace(/[^a-z0-9_:-]/g, '_').slice(0, 64);
}
function classifyErrorType(error: unknown): string {
if (error instanceof CourtValidationError) return 'validation';
if (error instanceof CourtNotFoundError) return 'not_found';
if (error instanceof Error) return sanitizeLabel(error.name, 'error');
return 'unknown';
}
function logMetricsError(context: string, error: unknown): void {
// eslint-disable-next-line no-console
console.error(
`[metrics] ${context}:`,
error instanceof Error ? error.message : error,
);
}
export function elapsedSecondsSince(startedAt: bigint): number {
return Number(process.hrtime.bigint() - startedAt) / 1_000_000_000;
}
export function recordVoteCast(
voteType: 'verdict' | 'sentence',
durationSeconds: number,
): void {
votesCastTotal.inc({ vote_type: voteType });
voteCastDurationSeconds.observe({ vote_type: voteType }, durationSeconds);
}
export function recordVoteRejected(voteType: string, reason: string): void {
votesRejectedTotal.inc({
vote_type: sanitizeLabel(voteType, 'unknown'),
reason: sanitizeLabel(reason, 'unknown'),
});
}
export function recordSseConnectionOpened(): bigint {
sseConnectionsTotal.inc();
sseConnectionsActive.inc();
return process.hrtime.bigint();
}
export function recordSseEventSent(eventType: string): void {
sseEventsSentTotal.inc({ event_type: sanitizeLabel(eventType) });
}
export function recordSseConnectionClosed(
openedAt: bigint,
reason: string,
): void {
sseConnectionsActive.dec();
sseDisconnectsTotal.inc({ reason: sanitizeLabel(reason) });
sseConnectionDurationSeconds.observe(elapsedSecondsSince(openedAt));
}
async function syncSessionStatusGauges(store: CourtSessionStore): Promise<void> {
const sessions = await store.listSessions();
const counts = new Map<SessionStatus, number>(
SESSION_STATUSES.map(status => [status, 0]),
);
for (const session of sessions) {
const current = counts.get(session.status) ?? 0;
counts.set(session.status, current + 1);
}
for (const status of SESSION_STATUSES) {
sessionsByStatus.set({ status }, counts.get(status) ?? 0);
}
}
export function instrumentCourtSessionStore(
baseStore: CourtSessionStore,
): CourtSessionStore {
let syncInFlight: Promise<void> | undefined;
const scheduleSessionStatusSync = (): void => {
if (syncInFlight) return;
syncInFlight = syncSessionStatusGauges(baseStore)
.catch(error => {
logMetricsError('syncSessionStatusGauges failed', error);
})
.finally(() => {
syncInFlight = undefined;
});
};
const recordStoreError = (operation: string, error: unknown): void => {
sessionStoreErrorsTotal.inc({
operation,
error_type: classifyErrorType(error),
});
};
scheduleSessionStatusSync();
return {
async createSession(input) {
try {
const session = await baseStore.createSession(input);
sessionLifecycleTotal.inc({ event: 'created' });
scheduleSessionStatusSync();
return session;
} catch (error) {
recordStoreError('create_session', error);
throw error;
}
},
async listSessions() {
try {
return await baseStore.listSessions();
} catch (error) {
recordStoreError('list_sessions', error);
throw error;
}
},
async getSession(sessionId) {
try {
return await baseStore.getSession(sessionId);
} catch (error) {
recordStoreError('get_session', error);
throw error;
}
},
async startSession(sessionId) {
try {
const session = await baseStore.startSession(sessionId);
sessionLifecycleTotal.inc({ event: 'started' });
scheduleSessionStatusSync();
return session;
} catch (error) {
recordStoreError('start_session', error);
throw error;
}
},
async setPhase(sessionId, phase, phaseDurationMs) {
try {
const session = await baseStore.setPhase(
sessionId,
phase,
phaseDurationMs,
);
phaseTransitionsTotal.inc({ phase });
return session;
} catch (error) {
phaseTransitionRejectionsTotal.inc({
reason: classifyErrorType(error),
});
recordStoreError('set_phase', error);
throw error;
}
},
async addTurn(input) {
try {
return await baseStore.addTurn(input);
} catch (error) {
recordStoreError('add_turn', error);
throw error;
}
},
async castVote(input) {
try {
return await baseStore.castVote(input);
} catch (error) {
recordStoreError('cast_vote', error);
throw error;
}
},
async recordFinalRuling(input) {
try {
return await baseStore.recordFinalRuling(input);
} catch (error) {
recordStoreError('record_final_ruling', error);
throw error;
}
},
async recordRecap(input) {
try {
await baseStore.recordRecap(input);
} catch (error) {
recordStoreError('record_recap', error);
throw error;
}
},
async completeSession(sessionId) {
try {
const session = await baseStore.completeSession(sessionId);
sessionLifecycleTotal.inc({ event: 'completed' });
scheduleSessionStatusSync();
return session;
} catch (error) {
recordStoreError('complete_session', error);
throw error;
}
},
async failSession(sessionId, reason) {
try {
const session = await baseStore.failSession(sessionId, reason);
sessionLifecycleTotal.inc({ event: 'failed' });
scheduleSessionStatusSync();
return session;
} catch (error) {
recordStoreError('fail_session', error);
throw error;
}
},
async recoverInterruptedSessions() {
try {
const ids = await baseStore.recoverInterruptedSessions();
scheduleSessionStatusSync();
return ids;
} catch (error) {
recordStoreError('recover_interrupted_sessions', error);
throw error;
}
},
subscribe(sessionId, handler) {
return baseStore.subscribe(sessionId, handler);
},
emitEvent(sessionId, type, payload) {
baseStore.emitEvent(sessionId, type, payload);
},
};
}
appInfo.set(
{
service: 'juryrigged',
version: process.env.npm_package_version ?? 'unknown',
},
1,
);
export async function renderMetrics(): Promise<string> {
return metricsRegistry.metrics();
}
export const metricsContentType = metricsRegistry.contentType;
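The module ends by exporting `renderMetrics` and `metricsContentType`, which is all a scrape endpoint needs. A minimal sketch of a `GET /api/metrics` handler over `node:http` (the stubbed `renderMetrics` stands in for the registry-backed export above; route wiring in the real server may differ):

```typescript
import { createServer, type Server } from 'node:http';

// Stand-in for the registry-backed renderMetrics() exported above; the
// real function serializes the prom-client Registry.
async function renderMetrics(): Promise<string> {
  return '# HELP juryrigged_app_info Static metadata\njuryrigged_app_info 1\n';
}
const metricsContentType = 'text/plain; version=0.0.4; charset=utf-8';

// GET /api/metrics: await the serialized registry and echo the registry's
// content type so Prometheus can parse the exposition format.
export function createMetricsServer(): Server {
  return createServer((req, res) => {
    if (req.method === 'GET' && req.url === '/api/metrics') {
      void renderMetrics().then(body => {
        res.writeHead(200, { 'content-type': metricsContentType });
        res.end(body);
      });
      return;
    }
    res.writeHead(404);
    res.end();
  });
}
```

Echoing the registry's own content type matters: Prometheus negotiates the exposition format from that header rather than assuming plain text.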


@@ -0,0 +1,143 @@
import test from 'node:test';
import assert from 'node:assert/strict';
import { readFileSync, existsSync } from 'node:fs';
import { join } from 'node:path';
test('public index includes pixi stage and dialogue skip controls', () => {
const html = readFileSync(join(process.cwd(), 'public/index.html'), 'utf8');
assert.match(html, /id="pixiStage"/);
assert.match(html, /id="captionSkipBtn"/);
assert.match(html, /id="captionTypewriterToggle"/);
assert.match(html, /id="captionSkipAll"/);
});
test('public app wires fixture replay mode and typewriter helpers', () => {
const js = readFileSync(join(process.cwd(), 'public/app.js'), 'utf8');
assert.match(js, /replayFixture/);
assert.match(js, /function\s+bootstrapCourtRenderer\(/);
assert.match(js, /createCourtRenderer/);
assert.match(js, /function\s+startDialogueTypewriter\(/);
assert.match(js, /function\s+skipDialogueTypewriter\(/);
assert.match(js, /function\s+replayFixtureSession\(/);
assert.match(js, /dispatchStreamPayload\(/);
assert.match(js, /syncRendererState\(/);
});
test('renderer scaffold modules exist with expected exports', () => {
const rendererDir = join(process.cwd(), 'public/renderer');
const expectedFiles = [
'index.js',
'stage.js',
'pixi-runtime.js',
'dialogue.js',
'camera.js',
'layers/background.js',
'layers/characters.js',
'layers/ui.js',
'layers/effects.js',
'layers/evidence.js',
];
for (const file of expectedFiles) {
assert.ok(
existsSync(join(rendererDir, file)),
`Missing renderer file: renderer/${file}`,
);
}
const indexJs = readFileSync(join(rendererDir, 'index.js'), 'utf8');
assert.match(indexJs, /export\s+(async\s+)?function\s+createCourtRenderer/);
assert.match(indexJs, /createStage/);
assert.match(indexJs, /initBackground/);
assert.match(indexJs, /initCharacters/);
assert.match(indexJs, /initUI/);
assert.match(indexJs, /initEffects/);
assert.match(indexJs, /initEvidence/);
assert.match(indexJs, /initCamera/);
assert.match(indexJs, /createDialogueStateMachine/);
assert.match(indexJs, /applyDirective/);
const stageJs = readFileSync(join(rendererDir, 'stage.js'), 'utf8');
assert.match(stageJs, /export\s+(async\s+)?function\s+createStage/);
assert.match(stageJs, /backgroundLayer/);
assert.match(stageJs, /charactersLayer/);
assert.match(stageJs, /uiLayer/);
assert.match(stageJs, /effectsLayer/);
});
test('Phase 7 renderer modules have expected exports and structure', () => {
const rendererDir = join(process.cwd(), 'public/renderer');
// Dialogue state machine
const dialogueJs = readFileSync(join(rendererDir, 'dialogue.js'), 'utf8');
assert.match(dialogueJs, /export\s+function\s+createDialogueStateMachine/);
assert.match(dialogueJs, /PUNCTUATION_PAUSES/);
assert.match(dialogueJs, /setLine/);
assert.match(dialogueJs, /skip\b/);
assert.match(dialogueJs, /setSkipAll/);
// Camera controller
const cameraJs = readFileSync(join(rendererDir, 'camera.js'), 'utf8');
assert.match(cameraJs, /export\s+.*CAMERA_PRESETS/);
assert.match(cameraJs, /export\s+function\s+initCamera/);
assert.match(cameraJs, /snapTo/);
assert.match(cameraJs, /transitionTo/);
assert.match(cameraJs, /wide/);
assert.match(cameraJs, /judge/);
assert.match(cameraJs, /prosecution/);
assert.match(cameraJs, /defense/);
// Effects engine
const effectsJs = readFileSync(join(rendererDir, 'layers/effects.js'), 'utf8');
assert.match(effectsJs, /export\s+function\s+initEffects/);
assert.match(effectsJs, /function\s+flash/);
assert.match(effectsJs, /function\s+shake/);
assert.match(effectsJs, /function\s+freeze/);
assert.match(effectsJs, /function\s+stamp/);
assert.match(effectsJs, /function\s+objection/);
assert.match(effectsJs, /function\s+holdIt/);
assert.match(effectsJs, /function\s+takeThat/);
// Characters layer (enhanced)
const charsJs = readFileSync(join(rendererDir, 'layers/characters.js'), 'utf8');
assert.match(charsJs, /export\s+.*POSES/);
assert.match(charsJs, /export\s+.*FACE_OVERLAYS/);
assert.match(charsJs, /poseLayer/);
assert.match(charsJs, /faceLayer/);
assert.match(charsJs, /fxLayer/);
assert.match(charsJs, /setPoseSprite/);
assert.match(charsJs, /setFaceOverlay/);
assert.match(charsJs, /flashCharacter/);
// Evidence layer
const evidenceJs = readFileSync(join(rendererDir, 'layers/evidence.js'), 'utf8');
assert.match(evidenceJs, /export\s+function\s+initEvidence/);
assert.match(evidenceJs, /addCard/);
assert.match(evidenceJs, /clearCards/);
assert.match(evidenceJs, /presentEvidence/);
});
test('app.js handles render_directive and evidence_revealed events', () => {
const js = readFileSync(join(process.cwd(), 'public/app.js'), 'utf8');
assert.match(js, /handleRenderDirectiveEvent/);
assert.match(js, /handleEvidenceRevealedEvent/);
assert.match(js, /render_directive.*handleRenderDirectiveEvent/);
assert.match(js, /evidence_revealed.*handleEvidenceRevealedEvent/);
assert.match(js, /applyDirective/);
});
test('placeholder asset directory structure exists', () => {
const assetsDir = join(process.cwd(), 'public/assets');
const subdirs = ['backgrounds', 'characters', 'ui', 'fonts', 'sfx'];
for (const dir of subdirs) {
assert.ok(
existsSync(join(assetsDir, dir)),
`Missing asset directory: assets/${dir}`,
);
}
});


@@ -0,0 +1,186 @@
import assert from 'node:assert/strict';
import { mkdtemp, readFile } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import test from 'node:test';
import { AGENT_IDS } from '../agents.js';
import { assignCourtRoles } from '../court/roles.js';
import { createCourtSessionStore } from '../store/session-store.js';
import type { CourtEvent } from '../types.js';
import {
buildReplayFrames,
createSyntheticEvent,
parseReplaySpeed,
rewriteReplayEventForSession,
SessionEventRecorderManager,
} from './session-replay.js';
function makeEvent(input: {
type: CourtEvent['type'];
at: string;
sessionId?: string;
payload?: Record<string, unknown>;
}): CourtEvent {
return {
id: `${input.type}-${input.at}`,
sessionId: input.sessionId ?? 'source-session',
type: input.type,
at: input.at,
payload: input.payload ?? {},
};
}
async function createInMemoryStore() {
const previousDatabaseUrl = process.env.DATABASE_URL;
process.env.DATABASE_URL = '';
try {
return await createCourtSessionStore();
} finally {
if (previousDatabaseUrl === undefined) {
delete process.env.DATABASE_URL;
} else {
process.env.DATABASE_URL = previousDatabaseUrl;
}
}
}
test('parseReplaySpeed clamps invalid values to default', () => {
assert.equal(parseReplaySpeed(undefined), 1);
assert.equal(parseReplaySpeed('0'), 1);
assert.equal(parseReplaySpeed(-2), 1);
assert.equal(parseReplaySpeed('4'), 4);
});
test('buildReplayFrames respects inter-event timing and speed multiplier', () => {
const events = [
makeEvent({
type: 'session_started',
at: '2026-02-28T10:00:00.000Z',
}),
makeEvent({
type: 'phase_changed',
at: '2026-02-28T10:00:00.100Z',
payload: {
phase: 'openings',
phaseStartedAt: '2026-02-28T10:00:00.100Z',
},
}),
makeEvent({
type: 'turn',
at: '2026-02-28T10:00:00.300Z',
payload: {
turn: {
id: 'turn-1',
sessionId: 'source-session',
turnNumber: 0,
speaker: 'primus',
role: 'judge',
phase: 'openings',
dialogue: 'Court is now in session.',
createdAt: '2026-02-28T10:00:00.300Z',
},
},
}),
];
const frames = buildReplayFrames(events, 2);
assert.deepEqual(
frames.map(frame => frame.delayMs),
[0, 50, 150],
);
});
test('rewriteReplayEventForSession rewrites top-level and turn session IDs', () => {
const source = makeEvent({
type: 'turn',
at: '2026-02-28T10:00:00.000Z',
sessionId: 'source-session',
payload: {
sessionId: 'source-session',
turn: {
id: 'turn-1',
sessionId: 'source-session',
turnNumber: 1,
speaker: 'primus',
role: 'judge',
phase: 'openings',
dialogue: 'Overruled.',
createdAt: '2026-02-28T10:00:00.000Z',
},
},
});
const rewritten = rewriteReplayEventForSession(source, 'target-session');
assert.equal(rewritten.sessionId, 'target-session');
assert.equal(rewritten.payload.sessionId, 'target-session');
assert.equal(
(rewritten.payload.turn as { sessionId: string }).sessionId,
'target-session',
);
assert.equal(source.sessionId, 'source-session');
assert.equal(source.payload.sessionId, 'source-session');
});
test('SessionEventRecorderManager writes initial and live session events to NDJSON', async () => {
const store = await createInMemoryStore();
const recordingsDir = await mkdtemp(
join(tmpdir(), 'juryrigged-recordings-'),
);
const participants = AGENT_IDS.slice(0, 5);
const session = await store.createSession({
topic: 'Did someone replace all office coffee with soup?',
participants,
metadata: {
mode: 'juryrigged',
casePrompt: 'Did someone replace all office coffee with soup?',
caseType: 'criminal',
sentenceOptions: ['Fine'],
verdictVoteWindowMs: 10,
sentenceVoteWindowMs: 10,
verdictVotes: {},
sentenceVotes: {},
roleAssignments: assignCourtRoles(participants),
},
});
const recorder = new SessionEventRecorderManager(store, recordingsDir);
await recorder.start({
sessionId: session.id,
initialEvents: [
createSyntheticEvent({
sessionId: session.id,
type: 'session_created',
payload: { sessionId: session.id },
at: '2026-02-28T10:00:00.000Z',
}),
],
});
await store.startSession(session.id);
await store.addTurn({
sessionId: session.id,
speaker: participants[0],
role: 'judge',
phase: 'case_prompt',
dialogue: 'All rise.',
});
await store.completeSession(session.id);
await recorder.stop(session.id);
await recorder.dispose();
const recordingPath = join(recordingsDir, `${session.id}.ndjson`);
const lines = (await readFile(recordingPath, 'utf8'))
.split(/\r?\n/)
.filter(Boolean)
.map(line => JSON.parse(line) as CourtEvent);
assert.ok(lines.length >= 4);
assert.equal(lines[0]?.type, 'session_created');
assert.ok(lines.some(event => event.type === 'session_started'));
assert.ok(lines.some(event => event.type === 'turn'));
assert.ok(lines.some(event => event.type === 'session_completed'));
});
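The recorder test can split on bare newlines because NDJSON reading skips anything unparseable. A standalone sketch of that tolerant loop (`parseNdjsonLines` is a hypothetical reduction of `readEventsFromNdjson`, minus the `CourtEvent` coercion):

```typescript
// Parse NDJSON, silently dropping blank lines and lines that are not
// valid JSON objects, so one corrupt line cannot invalidate a recording.
function parseNdjsonLines(raw: string): Array<Record<string, unknown>> {
  const events: Array<Record<string, unknown>> = [];
  for (const line of raw.split(/\r?\n/)) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      const parsed: unknown = JSON.parse(trimmed);
      if (parsed && typeof parsed === 'object' && !Array.isArray(parsed)) {
        events.push(parsed as Record<string, unknown>);
      }
    } catch {
      // Malformed line: skip and keep going.
    }
  }
  return events;
}
```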


@@ -0,0 +1,302 @@
import { randomUUID } from 'node:crypto';
import { createWriteStream, type WriteStream } from 'node:fs';
import { mkdir, readFile } from 'node:fs/promises';
import { dirname, resolve } from 'node:path';
import type { CourtEvent } from '../types.js';
import type { CourtSessionStore } from '../store/session-store.js';
const DEFAULT_REPLAY_SPEED = 1;
const DEFAULT_RECORDINGS_DIR = 'recordings';
export interface ReplayFrame {
delayMs: number;
event: CourtEvent;
}
export interface LoadedReplayRecording {
filePath: string;
speed: number;
events: CourtEvent[];
frames: ReplayFrame[];
}
interface RecorderState {
sessionId: string;
filePath: string;
stream: WriteStream;
unsubscribe: () => void;
closed: boolean;
}
function isRecord(value: unknown): value is Record<string, unknown> {
return !!value && typeof value === 'object' && !Array.isArray(value);
}
function asCourtEvent(raw: unknown): CourtEvent | undefined {
if (!isRecord(raw)) return undefined;
const payload = isRecord(raw.payload) ? raw.payload : {};
const id = typeof raw.id === 'string' ? raw.id : randomUUID();
const sessionId =
typeof raw.sessionId === 'string' ? raw.sessionId.trim() : '';
const type = typeof raw.type === 'string' ? raw.type.trim() : '';
const at = typeof raw.at === 'string' ? raw.at : new Date().toISOString();
if (!sessionId || !type) {
return undefined;
}
return {
id,
sessionId,
type: type as CourtEvent['type'],
at,
payload,
};
}
export function parseReplaySpeed(value: number | string | undefined): number {
const parsed =
typeof value === 'number' ? value : Number.parseFloat(value ?? '');
if (!Number.isFinite(parsed) || parsed <= 0) {
return DEFAULT_REPLAY_SPEED;
}
return parsed;
}
export function resolveRecordingsDir(
env: NodeJS.ProcessEnv = process.env,
): string {
const raw = env.RECORDINGS_DIR?.trim();
return resolve(raw || DEFAULT_RECORDINGS_DIR);
}
export function createSyntheticEvent(input: {
sessionId: string;
type: CourtEvent['type'];
payload: Record<string, unknown>;
at?: string;
}): CourtEvent {
return {
id: randomUUID(),
sessionId: input.sessionId,
type: input.type,
at: input.at ?? new Date().toISOString(),
payload: structuredClone(input.payload),
};
}
export async function readEventsFromNdjson(
filePath: string,
): Promise<CourtEvent[]> {
const raw = await readFile(filePath, 'utf8');
const lines = raw
.split(/\r?\n/)
.map(line => line.trim())
.filter(Boolean);
const events: CourtEvent[] = [];
for (const line of lines) {
try {
const parsed = JSON.parse(line) as unknown;
const event = asCourtEvent(parsed);
if (event) {
events.push(event);
}
} catch {
// Ignore malformed lines to keep replay resilient.
}
}
return events;
}
export function buildReplayFrames(
events: CourtEvent[],
speedInput: number | string | undefined,
): ReplayFrame[] {
const speed = parseReplaySpeed(speedInput);
const frames: ReplayFrame[] = [];
let cumulativeDelayMs = 0;
let previousAtMs: number | undefined;
for (const event of events) {
const currentAtMs = Date.parse(event.at);
const hasCurrentTimestamp = Number.isFinite(currentAtMs);
if (previousAtMs !== undefined && hasCurrentTimestamp) {
const diff = Math.max(0, currentAtMs - previousAtMs);
cumulativeDelayMs += Math.max(0, Math.round(diff / speed));
}
if (hasCurrentTimestamp) {
previousAtMs = currentAtMs;
}
frames.push({
delayMs: cumulativeDelayMs,
event,
});
}
return frames;
}
export async function loadReplayRecording(input: {
filePath: string;
speed?: number | string;
}): Promise<LoadedReplayRecording> {
const filePath = resolve(input.filePath);
const events = await readEventsFromNdjson(filePath);
if (events.length === 0) {
throw new Error(`Replay file has no readable events: ${filePath}`);
}
const speed = parseReplaySpeed(input.speed);
const frames = buildReplayFrames(events, speed);
return {
filePath,
speed,
events,
frames,
};
}
export function rewriteReplayEventForSession(
event: CourtEvent,
sessionId: string,
): CourtEvent {
const payload = structuredClone(event.payload);
if (typeof payload['sessionId'] === 'string') {
payload['sessionId'] = sessionId;
}
if (isRecord(payload['turn'])) {
const turnPayload = payload['turn'];
if (typeof turnPayload['sessionId'] === 'string') {
turnPayload['sessionId'] = sessionId;
}
}
return {
...event,
sessionId,
payload,
};
}
function writeEventLine(stream: WriteStream, event: CourtEvent): void {
stream.write(`${JSON.stringify(event)}\n`);
}
async function closeStream(state: RecorderState): Promise<void> {
if (state.closed) return;
state.closed = true;
state.unsubscribe();
await new Promise<void>(resolve => {
state.stream.end(() => resolve());
});
}
export class SessionEventRecorderManager {
private readonly recorders = new Map<string, RecorderState>();
constructor(
private readonly store: CourtSessionStore,
private readonly recordingsDir = resolveRecordingsDir(),
) {}
async start(input: {
sessionId: string;
initialEvents?: CourtEvent[];
}): Promise<string> {
const existing = this.recorders.get(input.sessionId);
if (existing) {
return existing.filePath;
}
const filePath = resolve(
this.recordingsDir,
`${input.sessionId}.ndjson`,
);
await mkdir(dirname(filePath), { recursive: true });
const stream = createWriteStream(filePath, {
flags: 'a',
encoding: 'utf8',
});
await new Promise<void>((resolveReady, rejectReady) => {
const onOpen = () => {
stream.off('error', onError);
resolveReady();
};
const onError = (error: Error) => {
stream.off('open', onOpen);
rejectReady(error);
};
stream.once('open', onOpen);
stream.once('error', onError);
});
const state: RecorderState = {
sessionId: input.sessionId,
filePath,
stream,
unsubscribe: () => {},
closed: false,
};
const unsubscribe = this.store.subscribe(input.sessionId, event => {
if (state.closed) {
return;
}
writeEventLine(stream, event);
if (
event.type === 'session_completed' ||
event.type === 'session_failed'
) {
void this.stop(input.sessionId);
}
});
state.unsubscribe = unsubscribe;
this.recorders.set(input.sessionId, state);
stream.on('error', error => {
// eslint-disable-next-line no-console
console.warn(
`[replay] recorder stream error session=${input.sessionId} file=${filePath}: ${error instanceof Error ? error.message : String(error)}`,
);
void this.stop(input.sessionId);
});
for (const event of input.initialEvents ?? []) {
writeEventLine(stream, event);
}
return filePath;
}
async stop(sessionId: string): Promise<void> {
const state = this.recorders.get(sessionId);
if (!state) return;
this.recorders.delete(sessionId);
await closeStream(state);
}
async dispose(): Promise<void> {
const sessionIds = [...this.recorders.keys()];
for (const sessionId of sessionIds) {
await this.stop(sessionId);
}
}
}


@@ -0,0 +1,115 @@
import { mkdir, writeFile } from 'node:fs/promises';
import { dirname, resolve } from 'node:path';
import { buildFixtureFileName, recordSseFixture } from './sse-fixture.js';
interface CliOptions {
sessionId: string;
outPath: string;
baseUrl: string;
maxEvents: number;
durationMs: number;
}
function parsePositiveInteger(value: string, label: string): number {
const parsed = Number.parseInt(value, 10);
if (!Number.isFinite(parsed) || parsed <= 0) {
throw new Error(`${label} must be a positive integer`);
}
return parsed;
}
function defaultBaseUrl(): string {
const port = process.env.PORT ?? '3000';
return `http://127.0.0.1:${port}`;
}
function parseArgs(argv: string[]): CliOptions {
let sessionId = '';
let outPath = '';
let baseUrl = defaultBaseUrl();
let maxEvents = 400;
let durationMs = 90_000;
for (let index = 0; index < argv.length; index += 1) {
const token = argv[index];
if (!token.startsWith('--')) {
continue;
}
const value = argv[index + 1];
if (value === undefined || value.startsWith('--')) {
throw new Error(`Missing value for ${token}`);
}
switch (token) {
case '--session':
sessionId = value;
break;
case '--out':
outPath = value;
break;
case '--base':
baseUrl = value;
break;
case '--max-events':
maxEvents = parsePositiveInteger(value, '--max-events');
break;
case '--duration-ms':
durationMs = parsePositiveInteger(value, '--duration-ms');
break;
default:
throw new Error(`Unknown argument: ${token}`);
}
index += 1;
}
if (!sessionId) {
throw new Error('Missing required argument: --session <session-id>');
}
const resolvedOutPath =
outPath ||
resolve('public', 'fixtures', buildFixtureFileName(sessionId));
return {
sessionId,
outPath: resolve(resolvedOutPath),
baseUrl,
maxEvents,
durationMs,
};
}
async function main(): Promise<void> {
const options = parseArgs(process.argv.slice(2));
// eslint-disable-next-line no-console
console.log(
`[sse-fixture] recording session=${options.sessionId} base=${options.baseUrl} maxEvents=${options.maxEvents} durationMs=${options.durationMs}`,
);
const fixture = await recordSseFixture({
sessionId: options.sessionId,
baseUrl: options.baseUrl,
maxEvents: options.maxEvents,
durationMs: options.durationMs,
});
await mkdir(dirname(options.outPath), { recursive: true });
await writeFile(options.outPath, JSON.stringify(fixture, null, 2), 'utf8');
// eslint-disable-next-line no-console
console.log(
`[sse-fixture] wrote ${fixture.events.length} events to ${options.outPath}`,
);
}
main().catch(error => {
// eslint-disable-next-line no-console
console.error(
`[sse-fixture] failed: ${error instanceof Error ? error.message : String(error)}`,
);
process.exit(1);
});

View File

@@ -0,0 +1,42 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { buildFixtureFileName, createSseDataParser } from './sse-fixture.js';
test('createSseDataParser parses chunked data lines into complete SSE messages', () => {
const payloads: string[] = [];
const parser = createSseDataParser(data => {
payloads.push(data);
});
parser.push('data: {"type":"snapshot"}\n\n');
parser.push('data: {"type":"turn","payload":{"speaker":"judge"}}\n');
parser.push('\n');
parser.flush();
assert.deepEqual(payloads, [
'{"type":"snapshot"}',
'{"type":"turn","payload":{"speaker":"judge"}}',
]);
});
test('createSseDataParser supports multiline data payloads', () => {
const payloads: string[] = [];
const parser = createSseDataParser(data => {
payloads.push(data);
});
parser.push('data: {"type":"snapshot",\n');
parser.push('data: "payload": {"phase": "openings"}}\n\n');
parser.flush();
assert.equal(
payloads[0],
'{"type":"snapshot",\n"payload": {"phase": "openings"}}',
);
});
test('buildFixtureFileName sanitizes unsafe characters', () => {
const fileName = buildFixtureFileName('session/alpha:beta', 1234);
assert.equal(fileName, 'sse-session_alpha_beta-1234.json');
});

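For context, a minimal standalone sketch (not the module's implementation) of the data-line accumulation rule the tests above exercise: buffer incoming chunks, split on newlines, collect `data:` lines, and dispatch when a blank line ends the message. The helper name `parseSseData` is hypothetical.

```typescript
// Simplified, self-contained sketch of SSE data-line parsing; the real
// createSseDataParser in sse-fixture.ts is the authoritative version.
function parseSseData(chunks: string[]): string[] {
  const messages: string[] = [];
  let buffer = '';
  let dataLines: string[] = [];
  const processLine = (line: string): void => {
    if (line.length === 0) {
      // A blank line terminates an SSE message.
      if (dataLines.length > 0) {
        messages.push(dataLines.join('\n'));
        dataLines = [];
      }
      return;
    }
    if (line.startsWith('data:')) {
      dataLines.push(line.slice(5).trimStart());
    }
  };
  for (const chunk of chunks) {
    buffer += chunk.replace(/\r\n/g, '\n').replace(/\r/g, '\n');
    let newlineIndex = buffer.indexOf('\n');
    while (newlineIndex !== -1) {
      processLine(buffer.slice(0, newlineIndex));
      buffer = buffer.slice(newlineIndex + 1);
      newlineIndex = buffer.indexOf('\n');
    }
  }
  if (buffer.length > 0) processLine(buffer);
  if (dataLines.length > 0) messages.push(dataLines.join('\n'));
  return messages;
}

// A payload split across three network chunks still yields one message:
// parseSseData(['data: {"ty', 'pe":"snapshot"}', '\n\n'])
//   → ['{"type":"snapshot"}']
```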
src/scripts/sse-fixture.ts Normal file

@@ -0,0 +1,198 @@
import { setTimeout as delay } from 'node:timers/promises';
export interface RecordedSseEvent {
offsetMs: number;
message: Record<string, unknown>;
}
export interface SseReplayFixture {
version: 1;
sessionId: string;
sourceUrl: string;
recordedAt: string;
events: RecordedSseEvent[];
}
export interface RecordSseFixtureOptions {
sessionId: string;
baseUrl: string;
maxEvents?: number;
durationMs?: number;
fetchImpl?: typeof fetch;
}
export interface SseDataParser {
push: (chunk: string) => void;
flush: () => void;
}
function normalizeLineBreaks(input: string): string {
return input.replace(/\r\n/g, '\n').replace(/\r/g, '\n');
}
export function createSseDataParser(
onData: (data: string) => void,
): SseDataParser {
let buffer = '';
let dataLines: string[] = [];
const dispatchIfReady = () => {
if (dataLines.length === 0) {
return;
}
onData(dataLines.join('\n'));
dataLines = [];
};
const processLine = (line: string) => {
if (line.length === 0) {
dispatchIfReady();
return;
}
if (!line.startsWith('data:')) {
return;
}
dataLines.push(line.slice(5).trimStart());
};
return {
push(chunk: string) {
if (!chunk) {
return;
}
buffer += normalizeLineBreaks(chunk);
while (true) {
const lineBreakIndex = buffer.indexOf('\n');
if (lineBreakIndex === -1) {
return;
}
const line = buffer.slice(0, lineBreakIndex);
buffer = buffer.slice(lineBreakIndex + 1);
processLine(line);
}
},
flush() {
if (buffer.length > 0) {
processLine(buffer);
buffer = '';
}
dispatchIfReady();
},
};
}
function toSourceUrl(baseUrl: string, sessionId: string): string {
const trimmed = baseUrl.replace(/\/+$/, '');
return `${trimmed}/api/court/sessions/${encodeURIComponent(sessionId)}/stream`;
}
function asPositiveInteger(
value: number | undefined,
fallback: number,
): number {
if (!Number.isFinite(value)) {
return fallback;
}
const parsed = Math.floor(value ?? fallback);
return parsed > 0 ? parsed : fallback;
}
export async function recordSseFixture(
options: RecordSseFixtureOptions,
): Promise<SseReplayFixture> {
const fetchImpl = options.fetchImpl ?? fetch;
const maxEvents = asPositiveInteger(options.maxEvents, 400);
const durationMs = asPositiveInteger(options.durationMs, 90_000);
const sourceUrl = toSourceUrl(options.baseUrl, options.sessionId);
const abortController = new AbortController();
const timeout = setTimeout(() => abortController.abort(), durationMs);
const response = await fetchImpl(sourceUrl, {
headers: { Accept: 'text/event-stream' },
signal: abortController.signal,
});
if (!response.ok) {
clearTimeout(timeout);
throw new Error(
`SSE stream request failed with ${response.status} ${response.statusText}`,
);
}
if (!response.body) {
clearTimeout(timeout);
throw new Error('SSE response does not expose a readable body stream');
}
const reader = response.body.getReader();
const decoder = new TextDecoder();
const startedAt = Date.now();
const events: RecordedSseEvent[] = [];
const parser = createSseDataParser(data => {
if (events.length >= maxEvents) {
return;
}
try {
const parsed = JSON.parse(data) as Record<string, unknown>;
events.push({
offsetMs: Math.max(0, Date.now() - startedAt),
message: parsed,
});
} catch {
// Ignore malformed data lines while recording; fixture consumers only
// care about valid JSON event envelopes.
}
});
try {
while (events.length < maxEvents) {
const { done, value } = await reader.read();
if (done) {
break;
}
parser.push(decoder.decode(value, { stream: true }));
// Yield to event loop for long-running recordings.
await delay(0);
}
} catch (error) {
if (!(error instanceof Error && error.name === 'AbortError')) {
throw error;
}
} finally {
abortController.abort();
clearTimeout(timeout);
parser.flush();
reader.releaseLock();
}
return {
version: 1,
sessionId: options.sessionId,
sourceUrl,
recordedAt: new Date().toISOString(),
events,
};
}
export function buildFixtureFileName(
sessionId: string,
timestamp = Date.now(),
): string {
const safeSessionId = sessionId.replace(/[^a-zA-Z0-9_-]/g, '_');
return `sse-${safeSessionId}-${timestamp}.json`;
}

src/server-config.test.ts Normal file

@@ -0,0 +1,72 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { parseReplayLaunchConfig, resolveTrustProxySetting } from './server.js';
test('resolveTrustProxySetting returns undefined when TRUST_PROXY is missing or blank', () => {
assert.equal(resolveTrustProxySetting({} as NodeJS.ProcessEnv), undefined);
assert.equal(
resolveTrustProxySetting({ TRUST_PROXY: ' ' } as NodeJS.ProcessEnv),
undefined,
);
});
test('resolveTrustProxySetting parses booleans and hop counts', () => {
assert.equal(
resolveTrustProxySetting({ TRUST_PROXY: 'true' } as NodeJS.ProcessEnv),
true,
);
assert.equal(
resolveTrustProxySetting({ TRUST_PROXY: 'FALSE' } as NodeJS.ProcessEnv),
false,
);
assert.equal(
resolveTrustProxySetting({ TRUST_PROXY: '1' } as NodeJS.ProcessEnv),
1,
);
});
test('resolveTrustProxySetting parses csv lists and passthrough values', () => {
assert.deepEqual(
resolveTrustProxySetting({
TRUST_PROXY: 'loopback, linklocal, uniquelocal',
} as NodeJS.ProcessEnv),
['loopback', 'linklocal', 'uniquelocal'],
);
assert.equal(
resolveTrustProxySetting({
TRUST_PROXY: 'loopback',
} as NodeJS.ProcessEnv),
'loopback',
);
});
test('parseReplayLaunchConfig reads replay path and speed from env', () => {
const config = parseReplayLaunchConfig([], {
REPLAY_FILE: './recordings/session.ndjson',
REPLAY_SPEED: '4',
} as NodeJS.ProcessEnv);
assert.ok(config);
assert.equal(config?.speed, 4);
assert.match(config?.filePath ?? '', /recordings\/session\.ndjson$/);
});
test('parseReplayLaunchConfig applies argv overrides', () => {
const config = parseReplayLaunchConfig(
['--replay', 'fixtures/demo.ndjson', '--speed', '2'],
{
REPLAY_FILE: './recordings/session.ndjson',
REPLAY_SPEED: '1',
} as NodeJS.ProcessEnv,
);
assert.ok(config);
assert.equal(config?.speed, 2);
assert.match(config?.filePath ?? '', /fixtures\/demo\.ndjson$/);
});
test('parseReplayLaunchConfig returns undefined when replay file is absent', () => {
const config = parseReplayLaunchConfig([], {} as NodeJS.ProcessEnv);
assert.equal(config, undefined);
});

src/server-replay.test.ts Normal file

@@ -0,0 +1,184 @@
import assert from 'node:assert/strict';
import { once } from 'node:events';
import { mkdtemp, writeFile } from 'node:fs/promises';
import type { Server } from 'node:http';
import type { AddressInfo } from 'node:net';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import test from 'node:test';
import { createSseDataParser } from './scripts/sse-fixture.js';
import { createServerApp } from './server.js';
import type { CourtEvent } from './types.js';
function makeReplayEvent(input: {
type: CourtEvent['type'];
at: string;
payload: Record<string, unknown>;
}): CourtEvent {
return {
id: `${input.type}-${input.at}`,
sessionId: 'recorded-session',
type: input.type,
at: input.at,
payload: input.payload,
};
}
async function readSseMessages(input: {
url: string;
expectedMessages: number;
timeoutMs: number;
}): Promise<Array<Record<string, unknown>>> {
const response = await fetch(input.url, {
headers: { Accept: 'text/event-stream' },
});
assert.equal(response.ok, true);
assert.ok(response.body);
const reader = response.body.getReader();
const decoder = new TextDecoder();
const messages: Array<Record<string, unknown>> = [];
const parser = createSseDataParser(data => {
try {
messages.push(JSON.parse(data) as Record<string, unknown>);
} catch {
// Ignore malformed chunks for resilience
}
});
const deadline = Date.now() + input.timeoutMs;
try {
while (
messages.length < input.expectedMessages &&
Date.now() < deadline
) {
const chunk = await Promise.race([
reader.read(),
new Promise<never>((_, reject) => {
const remaining = Math.max(1, deadline - Date.now());
setTimeout(
() => reject(new Error('SSE read timeout')),
remaining,
);
}),
]);
if (chunk.done) {
break;
}
parser.push(decoder.decode(chunk.value, { stream: true }));
}
} finally {
parser.flush();
await reader.cancel();
}
return messages;
}
test('replay mode re-emits NDJSON events on SSE with session rewriting', async () => {
const replayDir = await mkdtemp(join(tmpdir(), 'juryrigged-replay-'));
const replayFile = join(replayDir, 'session.ndjson');
const replayEvents: CourtEvent[] = [
makeReplayEvent({
type: 'session_started',
at: '2026-02-28T10:00:00.000Z',
payload: {
sessionId: 'recorded-session',
startedAt: '2026-02-28T10:00:00.000Z',
},
}),
makeReplayEvent({
type: 'turn',
at: '2026-02-28T10:00:00.040Z',
payload: {
turn: {
id: 'turn-1',
sessionId: 'recorded-session',
turnNumber: 0,
speaker: 'primus',
role: 'judge',
phase: 'case_prompt',
dialogue: 'Court is now in session.',
createdAt: '2026-02-28T10:00:00.040Z',
},
},
}),
];
await writeFile(
replayFile,
`${replayEvents.map(event => JSON.stringify(event)).join('\n')}\n`,
'utf8',
);
const previousDatabaseUrl = process.env.DATABASE_URL;
process.env.DATABASE_URL = '';
let server: Server | undefined;
let dispose: (() => void) | undefined;
try {
const created = await createServerApp({
replay: { filePath: replayFile, speed: 4 },
});
dispose = created.dispose;
server = created.app.listen(0);
await once(server, 'listening');
const address = server.address() as AddressInfo | null;
assert.ok(address && typeof address !== 'string');
const baseUrl = `http://127.0.0.1:${address.port}`;
const createResponse = await fetch(`${baseUrl}/api/court/sessions`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
topic: 'Did someone replace all office coffee with soup?',
caseType: 'criminal',
}),
});
assert.equal(createResponse.status, 201);
const createdPayload = (await createResponse.json()) as {
session: { id: string };
};
const sessionId = createdPayload.session.id;
const messages = await readSseMessages({
url: `${baseUrl}/api/court/sessions/${sessionId}/stream`,
expectedMessages: 3,
timeoutMs: 2_000,
});
assert.equal(messages.length >= 3, true);
assert.equal(messages[0]?.type, 'snapshot');
assert.equal(messages[1]?.type, 'session_started');
assert.equal(messages[2]?.type, 'turn');
assert.equal(messages[1]?.sessionId, sessionId);
const turnPayload = messages[2]?.payload as {
turn?: { sessionId?: string };
};
assert.equal(turnPayload.turn?.sessionId, sessionId);
} finally {
if (server) {
await new Promise<void>(resolve => {
server?.close(() => resolve());
});
}
dispose?.();
if (previousDatabaseUrl === undefined) {
delete process.env.DATABASE_URL;
} else {
process.env.DATABASE_URL = previousDatabaseUrl;
}
}
});

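The session-id rewrite this test asserts can be sketched as a standalone helper. This is a simplified, hypothetical reconstruction of the observable behavior, not the actual `rewriteReplayEventForSession` from `replay/session-replay.ts`; the names `SketchEvent` and `rewriteForSession` are illustrative only.

```typescript
// Hypothetical simplified sketch of the session-id rewrite the replay test
// above asserts: the top-level sessionId, a payload-level sessionId, and a
// nested turn.sessionId are all remapped to the live session.
interface SketchEvent {
  id: string;
  sessionId: string;
  type: string;
  at: string;
  payload: Record<string, unknown>;
}

function rewriteForSession(event: SketchEvent, sessionId: string): SketchEvent {
  const payload: Record<string, unknown> = { ...event.payload };
  if (typeof payload.sessionId === 'string') {
    payload.sessionId = sessionId;
  }
  // Nested turn payloads carry their own sessionId and are rewritten too.
  const turn = payload.turn;
  if (turn !== null && typeof turn === 'object') {
    const turnCopy = { ...(turn as Record<string, unknown>) };
    if (typeof turnCopy.sessionId === 'string') {
      turnCopy.sessionId = sessionId;
    }
    payload.turn = turnCopy;
  }
  return { ...event, sessionId, payload };
}
```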

@@ -18,6 +18,26 @@ import {
createCourtSessionStore,
} from './store/session-store.js';
import { VoteSpamGuard } from './moderation/vote-spam.js';
import {
elapsedSecondsSince,
instrumentCourtSessionStore,
metricsContentType,
recordSseConnectionClosed,
recordSseConnectionOpened,
recordSseEventSent,
recordVoteCast,
recordVoteRejected,
renderMetrics,
} from './metrics.js';
import {
createSyntheticEvent,
loadReplayRecording,
parseReplaySpeed,
resolveRecordingsDir,
rewriteReplayEventForSession,
SessionEventRecorderManager,
type LoadedReplayRecording,
} from './replay/session-replay.js';
import type {
AgentId,
CaseType,
@@ -58,7 +78,9 @@ function mapSessionMutationError(input: {
message: string;
} {
const message =
    input.error instanceof Error ?
      input.error.message
    : input.fallbackMessage;
if (input.error instanceof CourtValidationError) {
return {
@@ -88,11 +110,99 @@ function parsePositiveInt(value: string | undefined, fallback: number): number {
return Number.isFinite(parsed) && parsed > 0 ? parsed : fallback;
}
export interface ReplayRuntimeOptions {
filePath: string;
speed?: number;
}
export interface ReplayLaunchConfig {
filePath: string;
speed: number;
}
export function parseReplayLaunchConfig(
argv: string[] = process.argv.slice(2),
env: NodeJS.ProcessEnv = process.env,
): ReplayLaunchConfig | undefined {
let replayFile = env.REPLAY_FILE?.trim() ?? '';
let replaySpeed = parseReplaySpeed(env.REPLAY_SPEED);
for (let index = 0; index < argv.length; index += 1) {
const token = argv[index];
if (token === '--replay') {
const value = argv[index + 1];
if (!value || value.startsWith('--')) {
throw new Error('Missing value for --replay <file-path>');
}
replayFile = value;
index += 1;
continue;
}
if (token === '--speed') {
const value = argv[index + 1];
if (!value || value.startsWith('--')) {
throw new Error('Missing value for --speed <multiplier>');
}
replaySpeed = parseReplaySpeed(value);
index += 1;
}
}
if (!replayFile) {
return undefined;
}
return {
filePath: path.resolve(replayFile),
speed: replaySpeed,
};
}
type TrustProxySetting = boolean | number | string | string[];
export function resolveTrustProxySetting(
env: NodeJS.ProcessEnv = process.env,
): TrustProxySetting | undefined {
const raw = env.TRUST_PROXY?.trim();
if (!raw) {
return undefined;
}
const normalized = raw.toLowerCase();
if (normalized === 'true') {
return true;
}
if (normalized === 'false') {
return false;
}
if (/^\d+$/.test(raw)) {
return Number.parseInt(raw, 10);
}
if (raw.includes(',')) {
const trustedProxies = raw
.split(',')
.map(segment => segment.trim())
.filter(Boolean);
if (trustedProxies.length > 0) {
return trustedProxies;
}
}
return raw;
}
interface SessionRouteDeps {
store: CourtSessionStore;
autoRunCourtSession: boolean;
verdictWindowMs: number;
sentenceWindowMs: number;
recorder: SessionEventRecorderManager;
replay?: LoadedReplayRecording;
}
function createSessionHandler(deps: SessionRouteDeps) {
@@ -130,7 +240,9 @@ function createSessionHandler(deps: SessionRouteDeps) {
}
const userTopic =
      typeof req.body?.topic === 'string' ?
        req.body.topic.trim()
      : '';
if (userTopic && userTopic.length < 10) {
return sendError(
@@ -162,7 +274,9 @@ function createSessionHandler(deps: SessionRouteDeps) {
: selectedPrompt.caseType; // Use selected prompt's case type if not specified
const participantsInput =
      Array.isArray(req.body?.participants) ?
        req.body.participants
      : AGENT_IDS;
const participants = participantsInput.filter(
(id: string): id is AgentId => isValidAgent(id),
@@ -185,7 +299,7 @@ function createSessionHandler(deps: SessionRouteDeps) {
req.body.sentenceOptions
.map((option: unknown) => String(option).trim())
.filter(Boolean)
      : [
'Community service in the meme archives',
'Banished to the shadow realm',
'Mandatory apology haikus',
@@ -222,6 +336,26 @@ function createSessionHandler(deps: SessionRouteDeps) {
},
});
if (deps.autoRunCourtSession && !deps.replay) {
try {
await deps.recorder.start({
sessionId: session.id,
initialEvents: [
createSyntheticEvent({
sessionId: session.id,
type: 'session_created',
payload: { sessionId: session.id },
}),
],
});
} catch (error) {
// eslint-disable-next-line no-console
console.warn(
`[replay] failed to start recorder for session=${session.id}: ${error instanceof Error ? error.message : String(error)}`,
);
}
}
if (deps.autoRunCourtSession) {
void runCourtSession(session.id, deps.store);
}
@@ -229,19 +363,27 @@ function createSessionHandler(deps: SessionRouteDeps) {
return res.status(201).json({ session });
} catch (error) {
const message =
        error instanceof Error ?
          error.message
        : 'Failed to create session';
return sendError(res, 500, 'SESSION_CREATE_FAILED', message);
}
};
}
function createVoteHandler(
  store: CourtSessionStore,
  voteSpamGuard: VoteSpamGuard,
) {
return async (req: Request, res: Response): Promise<Response> => {
const voteType = req.body?.type;
const voteTypeLabel =
typeof voteType === 'string' ? voteType : 'unknown';
const choice =
typeof req.body?.choice === 'string' ? req.body.choice.trim() : '';
if (voteType !== 'verdict' && voteType !== 'sentence') {
recordVoteRejected(voteTypeLabel, 'invalid_vote_type');
return sendError(
res,
400,
@@ -251,7 +393,13 @@ function createVoteHandler(store: CourtSessionStore, voteSpamGuard: VoteSpamGuar
}
if (!choice) {
recordVoteRejected(voteTypeLabel, 'missing_vote_choice');
return sendError(
res,
400,
'MISSING_VOTE_CHOICE',
'choice is required',
);
}
const clientIp = req.ip ?? req.socket.remoteAddress ?? 'unknown';
@@ -262,14 +410,16 @@ function createVoteHandler(store: CourtSessionStore, voteSpamGuard: VoteSpamGuar
choice,
);
if (!spamDecision.allowed) {
const spamReason = spamDecision.reason ?? 'unknown';
recordVoteRejected(voteType, spamReason);
// eslint-disable-next-line no-console
console.warn(
`[vote-spam] blocked ip=${clientIp} session=${req.params.id} reason=${spamReason}`,
);
store.emitEvent(req.params.id, 'vote_spam_blocked', {
ip: clientIp,
voteType,
reason: spamReason,
retryAfterMs: spamDecision.retryAfterMs,
});
const code =
@@ -288,12 +438,15 @@ function createVoteHandler(store: CourtSessionStore, voteSpamGuard: VoteSpamGuar
});
}
const voteStartedAt = process.hrtime.bigint();
try {
const session = await store.castVote({
sessionId: req.params.id,
voteType,
choice,
});
recordVoteCast(voteType, elapsedSecondsSince(voteStartedAt));
return res.json({
sessionId: session.id,
@@ -307,6 +460,7 @@ function createVoteHandler(store: CourtSessionStore, voteSpamGuard: VoteSpamGuar
fallbackCode: 'VOTE_FAILED',
fallbackMessage: 'Failed to cast vote',
});
recordVoteRejected(voteType, mapped.code);
return sendError(res, mapped.status, mapped.code, mapped.message);
}
};
@@ -316,14 +470,20 @@ function createPhaseHandler(store: CourtSessionStore) {
return async (req: Request, res: Response): Promise<Response> => {
const phase = req.body?.phase as CourtPhase;
const durationMs =
      typeof req.body?.durationMs === 'number' ?
        req.body.durationMs
      : undefined;
if (!validPhases.includes(phase)) {
return sendError(res, 400, 'INVALID_PHASE', 'invalid phase');
}
try {
const session = await store.setPhase(
req.params.id,
phase,
durationMs,
);
return res.json({ session });
} catch (error) {
const mapped = mapSessionMutationError({
@@ -337,18 +497,41 @@ function createPhaseHandler(store: CourtSessionStore) {
};
}
function createStreamHandler(
store: CourtSessionStore,
replay?: LoadedReplayRecording,
) {
return async (
req: Request,
res: Response,
): Promise<Response | undefined> => {
const session = await store.getSession(req.params.id);
if (!session) {
return sendError(
res,
404,
'SESSION_NOT_FOUND',
'Session not found',
);
}
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
const openedAt = recordSseConnectionOpened();
const send = (event: unknown) => {
const eventType =
(
typeof event === 'object' &&
event !== null &&
'type' in event &&
typeof (event as { type?: unknown }).type === 'string'
) ?
(event as { type: string }).type
: 'unknown';
recordSseEventSent(eventType);
res.write(`data: ${JSON.stringify(event)}\n\n`);
};
@@ -363,13 +546,47 @@ function createStreamHandler(store: CourtSessionStore) {
},
});
let streamClosed = false;
const cleanup: Array<() => void> = [];
if (replay) {
const timers = replay.frames.map(frame =>
setTimeout(() => {
if (streamClosed) return;
send(
rewriteReplayEventForSession(
frame.event,
req.params.id,
),
);
}, frame.delayMs),
);
cleanup.push(() => {
for (const timer of timers) {
clearTimeout(timer);
}
});
} else {
const unsubscribe = store.subscribe(req.params.id, event => {
send(event);
});
cleanup.push(unsubscribe);
}
const closeStream = (reason: string) => {
if (streamClosed) return;
streamClosed = true;
for (const dispose of cleanup) {
dispose();
}
recordSseConnectionClosed(openedAt, reason);
};
req.on('close', () => closeStream('request_close'));
req.on('aborted', () => closeStream('request_aborted'));
res.on('error', () => closeStream('response_error'));
res.on('close', () => closeStream('response_close'));
return undefined;
};
@@ -383,6 +600,14 @@ const spaIndexLimiter = rateLimit({
max: 100, // limit each IP to 100 requests per windowMs
});
// Rate limiter for audience interaction endpoints (press/present)
const audienceInteractionLimiter = rateLimit({
windowMs: 10_000, // 10 seconds
max: 10, // 10 requests per IP per window
standardHeaders: true,
legacyHeaders: false,
});
function registerStaticAndSpaRoutes(
app: ExpressApp,
dirs: { publicDir: string; dashboardDir: string },
@@ -423,12 +648,29 @@ function registerApiRoutes(
autoRunCourtSession: boolean;
verdictWindowMs: number;
sentenceWindowMs: number;
recorder: SessionEventRecorderManager;
replay?: LoadedReplayRecording;
},
): void {
app.get('/api/health', (_req, res) => {
res.json({ ok: true, service: 'juryrigged' });
});
app.get('/api/metrics', async (_req, res) => {
try {
const metrics = await renderMetrics();
res.setHeader('Content-Type', metricsContentType);
res.status(200).send(metrics);
} catch (error) {
// eslint-disable-next-line no-console
console.error(
'[metrics] failed to render metrics:',
error instanceof Error ? error.message : error,
);
res.status(500).send('failed to render metrics');
}
});
app.get('/api/court/sessions', async (_req, res) => {
const sessions = await deps.store.listSessions();
res.json({ sessions });
@@ -454,6 +696,8 @@ function registerApiRoutes(
autoRunCourtSession: deps.autoRunCourtSession,
verdictWindowMs: deps.verdictWindowMs,
sentenceWindowMs: deps.sentenceWindowMs,
recorder: deps.recorder,
replay: deps.replay,
}),
);
@@ -464,12 +708,81 @@ function registerApiRoutes(
app.post('/api/court/sessions/:id/phase', createPhaseHandler(deps.store));
// Phase 7: Audience interaction endpoints (#77)
app.post(
'/api/court/sessions/:id/press',
audienceInteractionLimiter,
async (req: Request, res: Response) => {
const session = await deps.store.getSession(req.params.id);
if (!session) {
return sendError(
res,
404,
'SESSION_NOT_FOUND',
'Session not found',
);
}
deps.store.emitEvent(req.params.id, 'render_directive', {
directive: {
effect: 'shake',
camera:
session.phase === 'witness_exam' ? 'witness' : 'wide',
},
phase: session.phase ?? 'witness_exam',
emittedAt: new Date().toISOString(),
});
res.json({ ok: true, action: 'press' });
},
);
app.post(
'/api/court/sessions/:id/present',
audienceInteractionLimiter,
async (req: Request, res: Response) => {
const session = await deps.store.getSession(req.params.id);
if (!session) {
return sendError(
res,
404,
'SESSION_NOT_FOUND',
'Session not found',
);
}
const evidenceId =
typeof req.body?.evidenceId === 'string' ?
req.body.evidenceId
: undefined;
if (!evidenceId) {
return sendError(
res,
400,
'MISSING_EVIDENCE_ID',
'evidenceId is required',
);
}
deps.store.emitEvent(req.params.id, 'render_directive', {
directive: {
effect: 'take_that',
evidencePresent: evidenceId,
camera: 'evidence',
},
phase: session.phase ?? 'evidence_reveal',
emittedAt: new Date().toISOString(),
});
res.json({ ok: true, action: 'present', evidenceId });
},
);
app.get(
'/api/court/sessions/:id/stream',
createStreamHandler(deps.store, deps.replay),
);
}
export interface CreateServerAppOptions {
autoRunCourtSession?: boolean;
store?: CourtSessionStore;
replay?: ReplayRuntimeOptions;
}
export async function createServerApp(
@@ -480,8 +793,27 @@ export async function createServerApp(
dispose: () => void;
}> {
const app = express();
const trustProxy = resolveTrustProxySetting();
if (trustProxy !== undefined) {
app.set('trust proxy', trustProxy);
}
const baseStore = options.store ?? (await createCourtSessionStore());
const store = instrumentCourtSessionStore(baseStore);
const replay =
options.replay ?
await loadReplayRecording({
filePath: options.replay.filePath,
speed: options.replay.speed,
})
: undefined;
const autoRunCourtSession = options.autoRunCourtSession ?? !replay;
const recorder = new SessionEventRecorderManager(
store,
resolveRecordingsDir(),
);
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
@@ -523,6 +855,8 @@ export async function createServerApp(
autoRunCourtSession,
verdictWindowMs,
sentenceWindowMs,
recorder,
replay,
});
registerStaticAndSpaRoutes(app, {
@@ -533,6 +867,14 @@ export async function createServerApp(
const restartPendingIds = await store.recoverInterruptedSessions();
if (autoRunCourtSession) {
for (const sessionId of restartPendingIds) {
try {
await recorder.start({ sessionId });
} catch (error) {
// eslint-disable-next-line no-console
console.warn(
`[replay] failed to start recorder for recovered session=${sessionId}: ${error instanceof Error ? error.message : String(error)}`,
);
}
void runCourtSession(sessionId, store);
}
}
@@ -542,12 +884,17 @@ export async function createServerApp(
store,
dispose: () => {
clearInterval(pruneTimer);
void recorder.dispose();
},
};
}
export async function bootstrap(): Promise<void> {
const replayLaunch = parseReplayLaunchConfig();
const { app } = await createServerApp({
replay: replayLaunch,
autoRunCourtSession: replayLaunch ? false : undefined,
});
const port = Number.parseInt(process.env.PORT ?? '3000', 10);
app.listen(port, () => {
@@ -555,6 +902,12 @@ export async function bootstrap(): Promise<void> {
console.log(`JuryRigged running on http://localhost:${port}`);
// eslint-disable-next-line no-console
console.log(`Operator Dashboard: http://localhost:${port}/operator`);
if (replayLaunch) {
// eslint-disable-next-line no-console
console.log(
`[replay] enabled file=${replayLaunch.filePath} speed=${replayLaunch.speed}x`,
);
}
});
}

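The replay branch of the stream handler above schedules one `setTimeout` per `frame.delayMs`. A hedged sketch of how those delays could be derived from recorded timestamps and a speed multiplier (this is hypothetical; it is not the actual `loadReplayRecording` implementation, and `buildReplayFrames` is an illustrative name):

```typescript
// Hypothetical sketch: derive replay frame delays from recorded event
// timestamps, compressing elapsed wall-clock time by a speed multiplier.
interface ReplayFrameSketch {
  at: string;
  delayMs: number;
}

function buildReplayFrames(
  timestamps: string[],
  speed: number,
): ReplayFrameSketch[] {
  if (timestamps.length === 0) return [];
  const base = Date.parse(timestamps[0]);
  return timestamps.map(at => ({
    at,
    // Elapsed time since the first event, divided by the speed multiplier.
    delayMs: Math.max(0, Math.round((Date.parse(at) - base) / speed)),
  }));
}

// With speed=4, an event recorded 40ms after the first replays after 10ms:
// buildReplayFrames(
//   ['2026-02-28T10:00:00.000Z', '2026-02-28T10:00:00.040Z'], 4)
//   → [{ at: ..., delayMs: 0 }, { at: ..., delayMs: 10 }]
```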
src/twitch/adapter.ts Normal file

@@ -0,0 +1,181 @@
/**
* Twitch chat integration — reads chat messages and channel-point redemptions
* to drive audience interactions (#77).
*
* Placeholder-first: when TWITCH_CHANNEL is empty the adapter is a no-op.
* When configured, it connects to a Twitch IRC channel and:
*
* 1. Forwards `!press` / `!present` / `!objection` commands to the server
* via an internal callback.
* 2. Accepts outbound messages from the orchestrator (phase transitions,
* vote prompts) and posts them as chat messages.
*
* This module does NOT handle EventSub directly — it uses a lightweight
* IRC-only approach for MVP. EventSub / channel-point webhooks can be
* added later behind the same adapter interface.
*/
import type { CourtSessionStore } from '../store/session-store.js';
import type { CourtSession } from '../types.js';
export interface TwitchConfig {
channel: string;
botToken: string;
clientId: string;
}
export interface TwitchChatCommand {
command: 'press' | 'present' | 'objection';
username: string;
args: string[];
}
export interface TwitchAdapter {
readonly enabled: boolean;
sendChat(message: string): void;
onCommand(handler: (cmd: TwitchChatCommand) => void): void;
disconnect(): void;
}
/**
* Resolve Twitch config from environment.
*/
export function resolveTwitchConfig(
env: NodeJS.ProcessEnv = process.env,
): TwitchConfig | null {
const channel = env.TWITCH_CHANNEL?.trim();
const botToken = env.TWITCH_BOT_TOKEN?.trim();
const clientId = env.TWITCH_CLIENT_ID?.trim();
if (!channel || !botToken || !clientId) {
return null;
}
return { channel, botToken, clientId };
}
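The resolution rule above is all-or-nothing: if any of the three variables is missing or blank, the adapter stays in no-op mode. A self-contained sketch of that behaviour (mirroring the function above, with a plain object standing in for `process.env`; the values are illustrative):

```typescript
type TwitchConfig = { channel: string; botToken: string; clientId: string };

// Mirrors resolveTwitchConfig above: all three variables must be non-blank.
function resolveTwitchConfig(
  env: Record<string, string | undefined>,
): TwitchConfig | null {
  const channel = env.TWITCH_CHANNEL?.trim();
  const botToken = env.TWITCH_BOT_TOKEN?.trim();
  const clientId = env.TWITCH_CLIENT_ID?.trim();
  if (!channel || !botToken || !clientId) return null;
  return { channel, botToken, clientId };
}

// Missing TWITCH_BOT_TOKEN → null, so createTwitchAdapter falls back to no-op.
const partial = resolveTwitchConfig({ TWITCH_CHANNEL: 'juryrigged' });

const full = resolveTwitchConfig({
  TWITCH_CHANNEL: 'juryrigged',
  TWITCH_BOT_TOKEN: 'oauth:abc',
  TWITCH_CLIENT_ID: 'client123',
});
```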
/**
* Create a no-op Twitch adapter for when Twitch is not configured.
*/
function createNoopAdapter(): TwitchAdapter {
return {
enabled: false,
sendChat: () => {},
onCommand: () => {},
disconnect: () => {},
};
}
/**
* Parse a chat line for recognised commands.
* Recognised: !press, !present <evidence_id>, !objection
*/
function parseCommand(
message: string,
username: string,
): TwitchChatCommand | null {
const trimmed = message.trim();
if (!trimmed.startsWith('!')) return null;
const parts = trimmed.split(/\s+/);
const cmd = parts[0].slice(1).toLowerCase();
if (cmd === 'press' || cmd === 'present' || cmd === 'objection') {
return {
command: cmd as TwitchChatCommand['command'],
username,
args: parts.slice(1),
};
}
return null;
}
/**
* Create the Twitch adapter. Returns a no-op adapter when not configured.
*
 * Note: actual IRC connection is deferred until a real Twitch SDK/lib is
 * integrated. This placeholder adapter logs commands and exposes the
 * interface so the rest of the system can be wired up against it.
*/
export function createTwitchAdapter(
env: NodeJS.ProcessEnv = process.env,
): TwitchAdapter {
const config = resolveTwitchConfig(env);
if (!config) {
return createNoopAdapter();
}
const commandHandlers: Array<(cmd: TwitchChatCommand) => void> = [];
// eslint-disable-next-line no-console
console.info(
`[twitch] adapter created channel=${config.channel} (IRC connection deferred)`,
);
return {
enabled: true,
sendChat(message: string) {
// eslint-disable-next-line no-console
console.info(
`[twitch] sendChat channel=${config.channel} message=${message.slice(0, 100)}`,
);
},
onCommand(handler: (cmd: TwitchChatCommand) => void) {
commandHandlers.push(handler);
},
disconnect() {
commandHandlers.length = 0;
// eslint-disable-next-line no-console
console.info(`[twitch] adapter disconnected`);
},
};
}
/**
* Wire Twitch commands to the session store.
* Auto-emits events when audience interacts through chat.
*/
export async function wireTwitchToSession(
adapter: TwitchAdapter,
store: CourtSessionStore,
sessionId: string,
): Promise<void> {
if (!adapter.enabled) return;
adapter.onCommand(async cmd => {
switch (cmd.command) {
case 'objection': {
// Audience objection: read current count and increment
const session = await store.getSession(sessionId);
const currentCount = session?.metadata?.objectionCount ?? 0;
const newCount = currentCount + 1;
store.emitEvent(sessionId, 'objection_count_changed', {
count: newCount,
phase: session?.phase ?? 'witness_exam',
changedAt: new Date().toISOString(),
});
break;
}
case 'press':
// Audience press — logged for future implementation
// eslint-disable-next-line no-console
console.info(
`[twitch] press command user=${cmd.username} session=${sessionId}`,
);
break;
case 'present':
// Audience present evidence — logged for future implementation
// eslint-disable-next-line no-console
console.info(
`[twitch] present command user=${cmd.username} args=${cmd.args.join(',')} session=${sessionId}`,
);
break;
}
});
}
// Re-export for testing
export { parseCommand as _parseCommand };
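For a quick sanity check, the command grammar can be exercised standalone. This mirrors `parseCommand` above; the usernames and evidence IDs are illustrative:

```typescript
type TwitchChatCommand = {
  command: 'press' | 'present' | 'objection';
  username: string;
  args: string[];
};

// Mirrors parseCommand above: leading '!', case-insensitive verb, rest are args.
function parseCommand(message: string, username: string): TwitchChatCommand | null {
  const trimmed = message.trim();
  if (!trimmed.startsWith('!')) return null;
  const parts = trimmed.split(/\s+/);
  const cmd = parts[0].slice(1).toLowerCase();
  if (cmd === 'press' || cmd === 'present' || cmd === 'objection') {
    return { command: cmd, username, args: parts.slice(1) };
  }
  return null;
}

// '!present ev_knife' → { command: 'present', args: ['ev_knife'] }
const present = parseCommand('!present ev_knife', 'viewer42');
// Uppercase verbs are normalised.
const objection = parseCommand('  !OBJECTION  ', 'mod');
// Plain chatter is ignored.
const chatter = parseCommand('hello court', 'viewer42');
```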


@@ -114,6 +114,10 @@ export interface CourtSessionMetadata {
genreHistory?: GenreTag[]; // Last N genres used
evidenceCards?: EvidenceCard[];
objectionCount?: number;
// Phase 7 additions
caseFile?: CaseFile;
witnessStatements?: WitnessStatement[];
lastRenderDirective?: RenderDirective;
}
export interface CourtSession {
@@ -165,7 +169,11 @@ export type CourtEventType =
| 'broadcast_hook_triggered'
| 'broadcast_hook_failed'
| 'evidence_revealed'
| 'objection_count_changed';
| 'objection_count_changed'
// Phase 7 additions
| 'render_directive'
| 'witness_statement'
| 'case_file_generated';
export interface CourtEvent {
id: string;
@@ -186,3 +194,90 @@ export interface LLMGenerateOptions {
temperature?: number;
maxTokens?: number;
}
// ---------------------------------------------------------------------------
// Phase 7: Render Directives (#70)
// ---------------------------------------------------------------------------
export type RenderEffectCue =
| 'flash'
| 'shake'
| 'freeze'
| 'stamp'
| 'objection'
| 'hold_it'
| 'take_that';
export type CameraPreset =
| 'wide'
| 'judge'
| 'prosecution'
| 'defense'
| 'witness'
| 'evidence'
| 'verdict';
export type CharacterPose =
| 'idle'
| 'talk'
| 'point'
| 'slam'
| 'think'
| 'shock';
export type CharacterFace =
| 'neutral'
| 'angry'
| 'happy'
| 'surprised'
| 'sweating';
export interface RenderDirective {
camera?: CameraPreset;
effect?: RenderEffectCue;
effectOpts?: Record<string, unknown>;
poses?: Partial<Record<CourtRole, CharacterPose>>;
faces?: Partial<Record<CourtRole, CharacterFace>>;
evidencePresent?: string; // evidence ID to present
}
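A directive is a sparse bag of cues; every field is optional so the renderer can apply partial updates. A sketch of a typical objection beat follows (type aliases copied from above; the `CourtRole` subset and evidence ID are assumptions for illustration):

```typescript
type CameraPreset =
  | 'wide' | 'judge' | 'prosecution' | 'defense' | 'witness' | 'evidence' | 'verdict';
type RenderEffectCue =
  | 'flash' | 'shake' | 'freeze' | 'stamp' | 'objection' | 'hold_it' | 'take_that';
type CharacterPose = 'idle' | 'talk' | 'point' | 'slam' | 'think' | 'shock';
type CharacterFace = 'neutral' | 'angry' | 'happy' | 'surprised' | 'sweating';
// Assumed subset — CourtRole is defined elsewhere in src/types.ts.
type CourtRole = 'judge' | 'prosecution' | 'defense' | 'witness';

interface RenderDirective {
  camera?: CameraPreset;
  effect?: RenderEffectCue;
  effectOpts?: Record<string, unknown>;
  poses?: Partial<Record<CourtRole, CharacterPose>>;
  faces?: Partial<Record<CourtRole, CharacterFace>>;
  evidencePresent?: string;
}

// Prosecution slams the desk and presents evidence; the witness reacts.
const directive: RenderDirective = {
  camera: 'prosecution',
  effect: 'objection',
  poses: { prosecution: 'slam', witness: 'shock' },
  faces: { prosecution: 'angry', witness: 'sweating' },
  evidencePresent: 'ev_knife', // hypothetical evidence ID
};
```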
// ---------------------------------------------------------------------------
// Phase 7: Structured Case File (#67)
// ---------------------------------------------------------------------------
export interface CaseFileWitness {
role: CourtRole;
agentId: AgentId;
displayName: string;
bio: string;
}
export interface CaseFileEvidence {
id: string;
label: string;
description: string;
revealPhase: CourtPhase;
}
export interface CaseFile {
title: string;
genre: GenreTag;
caseType: CaseType;
synopsis: string;
charges: string[];
witnesses: CaseFileWitness[];
evidence: CaseFileEvidence[];
sentenceOptions: string[];
}
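A minimal sketch of a populated case file (interfaces restated from above; `GenreTag`, `CaseType`, `CourtPhase`, `CourtRole`, and `AgentId` are assumed aliases here since they are defined elsewhere in the file, and every literal value is invented for illustration):

```typescript
// Assumed aliases — the real types live elsewhere in src/types.ts.
type GenreTag = string;
type CaseType = string;
type CourtPhase = 'opening' | 'witness_exam' | 'verdict';
type CourtRole = 'judge' | 'prosecution' | 'defense' | 'witness';
type AgentId = string;

interface CaseFileWitness { role: CourtRole; agentId: AgentId; displayName: string; bio: string; }
interface CaseFileEvidence { id: string; label: string; description: string; revealPhase: CourtPhase; }
interface CaseFile {
  title: string;
  genre: GenreTag;
  caseType: CaseType;
  synopsis: string;
  charges: string[];
  witnesses: CaseFileWitness[];
  evidence: CaseFileEvidence[];
  sentenceOptions: string[];
}

// A session-start case file: witness roster, evidence inventory, charge sheet.
const caseFile: CaseFile = {
  title: 'The Case of the Vanishing Commit',
  genre: 'noir',
  caseType: 'criminal',
  synopsis: 'A force-push erased an afternoon of work. Someone must answer.',
  charges: ['Destruction of history', 'Reckless rebasing'],
  witnesses: [
    { role: 'witness', agentId: 'agent_gullwing', displayName: 'Gull Wing', bio: 'On-call engineer that night.' },
  ],
  evidence: [
    { id: 'ev_reflog', label: 'Reflog excerpt', description: 'Shows the forced update.', revealPhase: 'witness_exam' },
  ],
  sentenceOptions: ['Community service: code review duty', 'Acquittal'],
};
```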
// ---------------------------------------------------------------------------
// Phase 7: Witness Statement (#75)
// ---------------------------------------------------------------------------
export interface WitnessStatement {
witnessRole: CourtRole;
agentId: AgentId;
statementText: string;
issuedAt: string; // ISO 8601
contradictions?: string[]; // IDs of evidence that contradict
}
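A sketch of a statement as emitted during examination (`CourtRole` and `AgentId` are assumed aliases, and the agent ID, statement text, and contradicting evidence ID are invented for illustration):

```typescript
// Assumed aliases — the real types live elsewhere in src/types.ts.
type CourtRole = 'judge' | 'prosecution' | 'defense' | 'witness';
type AgentId = string;

interface WitnessStatement {
  witnessRole: CourtRole;
  agentId: AgentId;
  statementText: string;
  issuedAt: string; // ISO 8601
  contradictions?: string[]; // IDs of evidence that contradict
}

// The contradictions array lets the overlay flag pressable statements.
const statement: WitnessStatement = {
  witnessRole: 'witness',
  agentId: 'agent_gullwing',
  statementText: 'I was nowhere near the server room that night.',
  issuedAt: new Date('2026-02-28T17:00:00Z').toISOString(),
  contradictions: ['ev_access_log'],
};
```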