Internet-ID: Human-Created Content Anchoring
This repo scaffolds a minimal on-chain content provenance flow:
- A creator hashes their content and signs a manifest.
- The manifest and content are stored off-chain (e.g., IPFS/Web3.Storage).
- A small registry contract on an L2 anchors the content hash and a URI to the manifest.
- Verifiers can recompute the hash, check the signature, and confirm the on-chain anchor.
Note: This proves provenance, not truth. It helps distinguish opted-in human-created content from anonymous deepfakes.
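The creator-side flow above can be sketched in a few lines. This is a minimal illustration, assuming sha256 content hashing and a simplified manifest shape; the field names (`content_hash`, `content_uri`) are illustrative, not the exact schema used by the repo.

```typescript
import { createHash } from "node:crypto";

// Content hash anchored on-chain: sha256 over the raw file bytes.
export function hashContent(bytes: Buffer): string {
  return "0x" + createHash("sha256").update(bytes).digest("hex");
}

// The manifest is stored off-chain (e.g. IPFS); content_uri may be omitted
// when the creator chooses not to upload the file itself.
export function buildManifest(contentHash: string, contentUri?: string) {
  return {
    content_hash: contentHash,
    content_uri: contentUri,
    created_at: new Date().toISOString(),
  };
}
```

A verifier later recomputes the same sha256 over the bytes it holds and compares it against both the manifest and the on-chain anchor.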
📚 Documentation:
For End Users (Content Creators & Viewers)
- 🚀 User Guide - Complete documentation for creators and viewers
- ⚡ Quick Start - Register your first content in 5 minutes
- ❓ FAQ - Frequently asked questions
- 🔍 What is Internet ID? - Learn the basics
- 🛠️ Troubleshooting - Common issues and solutions
For Developers & Contributors
- New here? Start with the Contributor Onboarding Guide
- Architecture Overview: See docs/ARCHITECTURE.md for system design and component interactions
- Plain-English Pitch: PITCH.md explains the problem and solution
- Accessibility: See web/ACCESSIBILITY.md for WCAG 2.1 AA conformance and web/ACCESSIBILITY_TESTING.md for testing guide
- Browser Extension: See extension/README.md for the browser extension that provides seamless verification on YouTube, Twitter, and other platforms
Stack
- Solidity (ContentRegistry)
- Hardhat + TypeScript
- Ethers v6
- IPFS uploads via Infura, Web3.Storage, or Pinata
- Express API with optional API key protection
- Comprehensive input validation using Zod (see docs/VALIDATION.md)
- Prisma ORM (SQLite by default; Postgres optional)
- Redis caching layer for improved performance (optional, see docs/CACHING_ARCHITECTURE.md)
- Asynchronous verification queue using BullMQ for background processing (optional, see docs/VERIFICATION_QUEUE.md)
- Next.js App Router web UI (optional)
- NextAuth for sign-in (GitHub/Google to start), Prisma adapter
- Browser Extension for one-click verification on supported platforms (Chrome, Firefox, Safari - see extension/README.md)
Security
This project implements comprehensive security measures across smart contracts and API:
Smart Contract Security
- ✅ Automated security analysis completed (Slither)
- ✅ No critical or high severity vulnerabilities found
- ✅ Comprehensive access control with `onlyCreator` modifier
- ✅ No reentrancy risks (no external calls)
- ✅ Integer overflow protection (Solidity 0.8+)
- 📋 Professional audit planned before mainnet launch
See: Smart Contract Audit Report | Security Policy
API Security
- ✅ Comprehensive input validation and sanitization
- ✅ XSS (Cross-Site Scripting) prevention
- ✅ SQL injection protection via Prisma ORM
- ✅ Command injection prevention
- ✅ Path traversal protection
- ✅ File upload security with size limits and type restrictions
- ✅ Rate limiting (when configured with Redis)
- ✅ Performance optimization with Redis caching layer
See: Input Validation Documentation | Security Implementation Summary | Caching Security Summary
Web Application Security
- ✅ Content Security Policy (CSP) with nonce-based script execution
- ✅ XSS protection via strict CSP (no `unsafe-eval` or `unsafe-inline` in production)
- ✅ Clickjacking protection (`frame-ancestors`, `X-Frame-Options`)
- ✅ MIME-type sniffing protection (`X-Content-Type-Options`)
- ✅ HSTS (HTTP Strict Transport Security)
- ✅ Referrer policy for privacy
- ✅ Permissions policy for browser features
See: CSP Security Improvements
Reporting Security Issues
We take security seriously. If you discover a vulnerability, please report it responsibly:
- Email: security@subculture.io (or use GitHub Security Advisory)
- DO NOT open public issues for security vulnerabilities
- See our Security Policy for details and potential rewards
Code Quality
This project uses ESLint and Prettier to maintain consistent code style and catch common issues:
- ESLint: Configured for both Node.js/Hardhat scripts (TypeScript) and Next.js app
- Prettier: Shared formatting config across the monorepo
Linting & Formatting
# Run linters across the entire monorepo
npm run lint
# Fix auto-fixable linting issues
npm run lint:fix
# Format all code with Prettier
npm run format
# Check if code is properly formatted
npm run format:check
For the web package specifically:
cd web
npm run lint # ESLint for Next.js app
npm run lint:fix # Auto-fix issues
npm run format # Format with Prettier
Configuration files:
- Root ESLint: `.eslintrc.json` (TypeScript + Node.js)
- Web ESLint: `web/.eslintrc.json` (Next.js)
- Prettier: `.prettierrc.json` (shared)
Continuous Integration
This project uses GitHub Actions to ensure code quality and prevent regressions. The CI workflow runs automatically on pull requests and pushes to the main branch.
CI Workflows
- Backend Job:
- Installs dependencies with npm cache
- Runs ESLint on root package
- Checks code formatting with Prettier
- Validates Prisma schema formatting
- Compiles Solidity contracts with Hardhat
- Generates Prisma client and runs migrations
- Runs all backend tests (Hardhat + unit tests)
- Web Job:
- Installs dependencies for both root and web packages
- Runs ESLint on Next.js app
- Checks code formatting
- Builds the Next.js application
- E2E Tests (manual trigger):
- Runs comprehensive end-to-end tests with Playwright
- Tests across Chromium, Firefox, and WebKit browsers
- Validates mobile responsiveness
- Performs visual regression testing
- Can run against preview deployments
- Security Scanning (automated):
- CodeQL analysis for security vulnerabilities (weekly + on every push/PR)
- Dependency review for new vulnerabilities in PRs
- Dependabot security alerts (daily checks)
View the CI workflow configuration and E2E workflow configuration.
Note: This CI workflow is part of the project roadmap to guard against regressions (see #10).
CI Environment Variables
The CI workflows use the following environment variables:
Backend Job:
- `DATABASE_URL`: PostgreSQL connection for tests (provided by CI service container)
  - Value: `postgresql://internetid:internetid@localhost:5432/internetid_test?schema=public`
- PostgreSQL 16 runs as a service container with health checks
Web Job:
- `DATABASE_URL`: Same as backend (for Prisma client generation)
- `NEXTAUTH_URL`: Base URL for NextAuth (mock value: `http://localhost:3000`)
- `NEXTAUTH_SECRET`: Session encryption secret (mock value for build-time only)
Note: No real secrets or RPC URLs are required for CI. The workflows use:
- Local PostgreSQL for database tests
- Mock values for Next.js build (build runs in standalone mode)
- Hardhat's built-in test network for smart contract tests
Local CI Debugging
To reproduce CI failures locally:
# Backend workflow
npm ci --legacy-peer-deps
npm run lint:root
npm run format:check
npx prisma format --check
npm run build # Compile Hardhat contracts
npm run db:generate # Generate Prisma client
npx prisma migrate deploy # Run migrations
npm test # Run Hardhat tests
# Web workflow
cd web
npm ci --legacy-peer-deps
npm run lint
npm run format:check
npm run build # Build Next.js app
cd ..
PostgreSQL Setup for Local Testing:
# Using Docker
docker run -d \
--name postgres-test \
-e POSTGRES_USER=internetid \
-e POSTGRES_PASSWORD=internetid \
-e POSTGRES_DB=internetid_test \
-p 5432:5432 \
postgres:16-alpine
# Set environment variable
export DATABASE_URL="postgresql://internetid:internetid@localhost:5432/internetid_test?schema=public"
# Run migrations and tests
npm run db:generate
npx prisma migrate deploy
npm test
For more troubleshooting, see the Development Setup Guide.
Dependency Management
The project uses automated tools to keep dependencies up-to-date and secure:
- Dependabot: Automatically creates PRs for dependency updates
- Security updates: Daily checks
- Regular updates: Weekly checks (Mondays)
- Auto-merge for patch/minor updates after CI passes
- Manual review required for major version updates
- CodeQL: Advanced security analysis (weekly + on push/PR)
- Dependency Review: Checks PRs for vulnerable dependencies
Quick Reference: Dependency Update Process
Full Guide: Dependency Management
Setup
Quick Start: See the Contributor Onboarding Guide for detailed setup instructions.
Essential Configuration
- Install dependencies:
  npm install --legacy-peer-deps
- Configure environment:
  cp .env.example .env
  # Edit .env and set:
  # - PRIVATE_KEY (deployer wallet private key)
  # - RPC_URL (blockchain RPC endpoint, e.g., https://sepolia.base.org)
  # - IPFS provider (Web3.Storage, Pinata, or Infura credentials)
  # - DATABASE_URL (default: file:./dev.db for SQLite)
  # See .env.example for all options and descriptions
- Set up database:
  npm run db:generate
  npm run db:migrate
- Compile contracts:
  npm run build
Web App Configuration
If you plan to use the web UI (web/), create web/.env.local:
cp web/.env.example web/.env.local
# Edit web/.env.local and set:
# - NEXT_PUBLIC_API_BASE (API server URL, e.g., http://localhost:3001)
# - NEXTAUTH_SECRET (generate with: openssl rand -base64 32)
# - DATABASE_URL (must match root .env)
# - OAuth provider credentials (GitHub, Google, Twitter, etc.)
# See web/.env.example for complete configuration
Note on Multi-Chain Deployments:
- Each network requires a separate deployment of the ContentRegistry contract
- Deployed addresses are saved in `deployed/<network>.json` files
- The registry service automatically resolves the correct contract address based on the chain ID
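The chain-ID-to-address resolution described above can be sketched as follows. This is an illustrative sketch only: the network-name mapping and the deployment-file shape (`{ address }`) are assumptions, not the repo's actual `config/chains.ts` implementation.

```typescript
// Hypothetical mapping of chain IDs to deployment file names; the real
// project may derive this from config/chains.ts instead.
const NETWORK_BY_CHAIN_ID: Record<number, string> = {
  8453: "base",
  84532: "base-sepolia",
  11155111: "sepolia",
};

// Resolve the registry address for a chain. loadDeployment stands in for
// reading deployed/<network>.json from disk.
export function resolveRegistryAddress(
  chainId: number,
  loadDeployment: (network: string) => { address: string },
): string {
  const network = NETWORK_BY_CHAIN_ID[chainId];
  if (!network) {
    throw new Error(`No deployment configured for chain ${chainId}`);
  }
  return loadDeployment(network).address;
}
```

Failing fast on an unknown chain ID mirrors the project's "no silent fallback" stance on RPC configuration.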
Multi-Chain Support
Internet-ID supports deployment and verification across multiple EVM-compatible chains:
Supported Networks
Mainnets (Production):
- Ethereum Mainnet (chain ID: 1) – High security, higher gas costs
- Polygon (chain ID: 137) – Low cost, good UX, MATIC gas token
- Base (chain ID: 8453) – Coinbase L2, low cost, good UX
- Arbitrum One (chain ID: 42161) – Low cost L2
- Optimism (chain ID: 10) – Low cost L2
Testnets (Development):
- Ethereum Sepolia (chain ID: 11155111)
- Polygon Amoy (chain ID: 80002)
- Base Sepolia (chain ID: 84532)
- Arbitrum Sepolia (chain ID: 421614)
- Optimism Sepolia (chain ID: 11155420)
Chain Configuration
Chain configurations are defined in config/chains.ts with:
- RPC URLs (with environment variable overrides)
- Block explorer URLs
- Native currency details
- Gas settings
You can override default RPC URLs via environment variables:
ETHEREUM_RPC_URL=https://your-eth-rpc.com
POLYGON_RPC_URL=https://your-polygon-rpc.com
BASE_RPC_URL=https://your-base-rpc.com
# See .env.example for all options
Scripts
- `build` – compile contracts
Deployment Scripts (Multi-Chain):
- `deploy:ethereum` – deploy to Ethereum Mainnet
- `deploy:sepolia` – deploy to Ethereum Sepolia testnet
- `deploy:polygon` – deploy to Polygon
- `deploy:polygon-amoy` – deploy to Polygon Amoy testnet
- `deploy:base` – deploy to Base
- `deploy:base-sepolia` – deploy to Base Sepolia testnet
- `deploy:arbitrum` – deploy to Arbitrum One
- `deploy:arbitrum-sepolia` – deploy to Arbitrum Sepolia testnet
- `deploy:optimism` – deploy to Optimism
- `deploy:optimism-sepolia` – deploy to Optimism Sepolia testnet
- `deploy:local` – deploy to local Hardhat node
Other Scripts:
- `register` – hash a file and register its hash + manifest URI on-chain
  - Requires `RPC_URL` for your preferred network. For local, you can use `LOCAL_RPC_URL=http://127.0.0.1:8545`.
  - For IPFS uploads: `IPFS_API_URL` and optional `IPFS_PROJECT_ID`/`IPFS_PROJECT_SECRET`
- `verify` – verify a file against its manifest and on-chain registry
- `bind:youtube` – bind a YouTube videoId to a previously registered master file
- `verify:youtube` – verify a YouTube URL/ID via on-chain binding + manifest
- `start:api` – start the Express API server (default port 3001)
- `lint` – run ESLint on both root and web packages
- `lint:fix` – automatically fix ESLint issues where possible
- `format` – format all code with Prettier
- `format:check` – check if code is formatted correctly
- Web: from the `web/` workspace
  - `npm run dev` – start Next.js dev server on :3000
  - `npm run build && npm start` – production build/start
  - `npm run prisma:generate` – generate Prisma Client for web (uses root schema)
  - `npm run lint` – run ESLint on web package
  - `npm run lint:fix` – automatically fix ESLint issues in web package
  - `npm run format` – format web code with Prettier
  - `npm run test:e2e` – run end-to-end tests with Playwright
  - `npm run test:e2e:ui` – run E2E tests in interactive UI mode
  - `npm run test:e2e:chromium` – run E2E tests on Chromium only
  - `npm run test:e2e:firefox` – run E2E tests on Firefox only
  - `npm run test:e2e:webkit` – run E2E tests on WebKit (Safari) only
  - `npm run test:e2e:mobile` – run E2E tests on mobile viewports
Quickstart
- Compile and deploy
npm i
npx hardhat compile
# Deploy to Base Sepolia (testnet)
npm run deploy:base-sepolia
# Or deploy to other networks
npm run deploy:polygon-amoy # Polygon testnet
npm run deploy:sepolia # Ethereum testnet
npm run deploy:optimism-sepolia # Optimism testnet
npm run deploy:arbitrum-sepolia # Arbitrum testnet
Local node option (no faucets needed)
# Terminal A: start local node (prefunded accounts)
npm run node
# Terminal B: deploy locally
npm run deploy:local
Production Deployments:
For mainnet deployments, ensure you have:
- Sufficient native tokens for gas (ETH, MATIC, etc.)
- `PRIVATE_KEY` set in `.env`
- Appropriate RPC URL configured
npm run deploy:polygon # Polygon mainnet (low cost)
npm run deploy:base # Base mainnet (low cost L2)
npm run deploy:arbitrum # Arbitrum One (low cost L2)
npm run deploy:optimism # Optimism (low cost L2)
npm run deploy:ethereum # Ethereum mainnet (high cost, high security)
- Upload your content and manifest
## Docker Deployment
For production and staging environments, use Docker for containerized deployment:
### Quick Start with Docker Compose
```bash
# Development (local testing)
docker compose up -d
# Staging environment
docker compose -f docker-compose.staging.yml up -d
# Production environment
docker compose -f docker-compose.production.yml up -d
```
Container Images
The project provides two Docker images:
- API Server (`Dockerfile.api`):
  - Express API server
  - Hardhat contracts
  - Prisma database client
  - Multi-stage build for optimized size
- Web Application (`web/Dockerfile`):
  - Next.js application
  - Standalone output for production
  - Multi-stage build for optimized size
Environment-Specific Configurations
- Development: `docker-compose.yml` – local development with SQLite
- Staging: `docker-compose.staging.yml` – staging with PostgreSQL, Redis, auto-deployment
- Production: `docker-compose.production.yml` – production with HA, resource limits, backups
See Deployment Playbook for complete deployment instructions.
IPFS providers
Set one of the following in .env before uploading. By default, the uploader tries providers in this order and falls back on failures: Web3.Storage → Pinata → Infura. You can also run a local IPFS node.
- Infura IPFS: `IPFS_API_URL`, `IPFS_PROJECT_ID`, `IPFS_PROJECT_SECRET`
- Web3.Storage: `WEB3_STORAGE_TOKEN`
- Pinata: `PINATA_JWT`
- Local IPFS node: `IPFS_PROVIDER=local` and (optionally) `IPFS_API_URL=http://127.0.0.1:5001`
- Note: If both Web3.Storage and Pinata are set, Web3.Storage is attempted first. 5xx errors automatically trigger fallback.
Force a specific provider (optional)
- Set `IPFS_PROVIDER=web3storage|pinata|infura` in `.env` to force the uploader to use one provider only (no fallback). Helpful while debugging credentials.
- For local node usage, set `IPFS_PROVIDER=local`.
Troubleshooting
- 401 Unauthorized (Infura): Ensure you created an IPFS project and used those credentials. Ethereum RPC keys won’t work for IPFS. Check `IPFS_PROJECT_ID` and `IPFS_PROJECT_SECRET`.
- 503/5xx (Web3.Storage/Pinata): Temporary outage or maintenance. Either wait, or set `IPFS_PROVIDER` to try another provider.
- Slow or timeouts: The uploader retries with exponential backoff. You can re-run the command; CIDs are content-addressed and idempotent across providers.
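Retry with exponential backoff, mentioned above, follows a simple pattern. This is a generic sketch, not the uploader's actual code; the attempt count and delays are illustrative.

```typescript
// Retry an async operation, doubling the wait after each failure:
// baseDelayMs, 2*baseDelayMs, 4*baseDelayMs, ...
export async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Exponential backoff before the next attempt.
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Because IPFS CIDs are content-addressed, re-running an upload after a failed attempt is safe: the same bytes always yield the same CID.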
Local IPFS quickstart (optional)
If you prefer not to use third-party providers, you can run a local Kubo node:
- Install IPFS (Kubo) from https://github.com/ipfs/kubo
- Initialize and start the daemon:
ipfs init
ipfs daemon
- In `.env`, set:
IPFS_PROVIDER=local
IPFS_API_URL=http://127.0.0.1:5001
- Upload with the same script; it will hit your local node.
Upload your content and note the CID
npm run upload:ipfs -- ./path/to/file
# Make manifest.json (safer: use PRIVATE_KEY from .env)
npm run manifest -- ./path/to/file ipfs://<CID>
# Alternatively (less secure; your key appears in shell history):
npm run manifest -- ./path/to/file ipfs://<CID> <PRIVATE_KEY>
# Optionally upload manifest.json too
npm run upload:ipfs -- ./manifest.json
- Anchor on-chain
# Use the manifest URI (e.g., ipfs://<manifestCID>) and deployed address
npm run register -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress
- Verify a file
npm run verify -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress
- Generate a portable proof bundle (optional)
npm run proof -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress
This writes proof.json with the file hash, manifest details, recovered signer, on-chain entry, and the registration tx (best effort). You can share this alongside your content.
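A hypothetical shape for the proof bundle, inferred from the description above; the real proof.json field names may differ.

```typescript
// Illustrative (assumed) structure of the generated proof bundle.
export interface ProofBundle {
  fileHash: string; // sha256 of the content bytes
  manifestUri: string; // e.g. ipfs://<manifestCID>
  manifest: Record<string, unknown>; // the fetched manifest document
  recoveredSigner: string; // address recovered from the manifest signature
  onChainEntry: { creator: string; manifestUri: string };
  registrationTx?: string; // best effort: may be absent if the tx wasn't found
}
```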
Web UI (optional)
The Next.js app in web/ provides end-to-end flows:
- Upload to IPFS
- One-shot: Upload → manifest → register (can also bind links)
- Manifest creation
- Register on-chain
- Verify and Proof generation
- Bind platform links (single or batch)
- Browse registered contents (with inline verify and Share block)
- Account: Sign in, register, and link platform identities (profile)
Run locally:
cd web
npm i
npm run dev
Set NEXT_PUBLIC_API_BASE to the API origin (default http://localhost:3001).
Privacy by default
The One‑shot flow does not upload the video by default. It computes the hash locally, builds a manifest, uploads the manifest, and registers on-chain. You can opt-in to upload the video to IPFS via a checkbox. The manifest’s content_uri may be omitted when not uploading, preserving privacy while still enabling provenance.
Public Verify page
Viewers can verify a platform link without downloading your master file. The web app exposes a public Verify page at /verify and backend endpoints to resolve bindings:
- `GET /api/resolve` – map a URL or `platform`+`platformId` to the on-chain binding
- `GET /api/public-verify` – resolve binding and return manifest summary
Sharing and badges
Each registered content gets a shareable badge and QR codes. In the UI, the Share block provides:
- Badge image (SVG): `/api/badge/[hash]?theme=dark|light&w=120..640`
- QR code PNG: `/api/qr?url=<encoded_share_url>`
- Share link to the public Verify page: `/verify?platform=...&platformId=...`
- Embed HTML snippet: `<a href="..."><img src="/api/badge/[hash]" /></a>`
- Copy All button: copies a bundle of badge URL, per-link share URLs, QR URLs, and embed HTML
Tip: Set NEXT_PUBLIC_SITE_BASE so badges/links use your canonical host when sharing.
Verifying account ownership via OAuth
To confirm a creator controls a given platform account, use OAuth sign-in and link their provider account(s) to their user profile. Start with GitHub/Google for baseline auth, and add platform-specific providers as available (e.g., X/Twitter, YouTube via Google scopes, etc.). When a user binds a platform URL/ID, you can check their linked Accounts in the database to enforce ownership if desired.
API server
- Start the API: `npm run start:api`
- If `API_KEY` is set in `.env`, the following endpoints require header `x-api-key: $API_KEY`:
  - POST /api/upload
  - POST /api/manifest
  - POST /api/register
  - POST /api/bind
Other endpoints like /api/verify and /api/proof are public by default.
Observability Endpoints
The API includes built-in observability for production monitoring:
- Health Check: `GET /api/health` – service health status with database, cache, and blockchain checks
- Metrics: `GET /api/metrics` – Prometheus-format metrics for monitoring
- Metrics (JSON): `GET /api/metrics/json` – human-readable metrics
All HTTP requests are automatically logged with correlation IDs for request tracing. See Observability Guide for complete details on structured logging, metrics, and monitoring setup.
When calling from the Next.js UI or curl, include the header if enabled:
curl -H "x-api-key: $API_KEY" -F file=@./video.mp4 \
-F registryAddress=0x... -F manifestURI=ipfs://... \
http://localhost:3001/api/register
Performance & Caching
The API includes an optional Redis-based caching layer to improve performance and reduce database load:
Caching Features
- Cache-aside pattern: Automatic fallback to database on cache miss
- Smart TTLs: Different cache lifetimes based on data type
- Content metadata: 10 minutes
- Manifests: 15 minutes
- Platform bindings: 3 minutes
- Verification status: 5 minutes
- Automatic invalidation: Caches cleared on writes (register, bind, verify)
- LRU eviction: Keeps most frequently accessed data in memory
- Graceful degradation: Works without Redis, falls back to database
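The cache-aside pattern with invalidation on writes can be sketched as follows. A `Map` stands in for Redis here, and the class and method names are illustrative, not the project's actual cache service.

```typescript
type Entry<T> = { value: T; expiresAt: number };

// Minimal cache-aside sketch: read from cache, fall back to the database
// on a miss, and invalidate on writes so readers never see stale data.
export class CacheAside<T> {
  private store = new Map<string, Entry<T>>();
  constructor(private ttlMs: number) {}

  async get(key: string, loadFromDb: () => Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
    const value = await loadFromDb(); // cache miss: fall back to the database
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }

  invalidate(key: string): void {
    // Called on writes (register, bind, verify) to clear stale entries.
    this.store.delete(key);
  }
}
```

A content-metadata cache would use a 10-minute TTL (`new CacheAside(600_000)`), per the lifetimes listed above.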
Setup
- Start Redis (Docker recommended):
docker compose up -d redis
- Set Redis URL in `.env`:
REDIS_URL=redis://localhost:6379
- Restart the API - caching will be enabled automatically
Monitoring
Check cache performance at /api/cache/metrics:
curl http://localhost:3001/api/cache/metrics
Returns hit rate, cache hits/misses, and error counts.
See docs/CACHING_ARCHITECTURE.md for detailed documentation.
Database
By default, the project uses a local SQLite file for easy setup.
- Generate Prisma client and apply migrations:
npm run db:generate
npm run db:migrate
- Seed the database with test data (optional but recommended for development):
npm run db:seed
This populates the database with sample users, contents, platform bindings, and verifications. See prisma/SEED_DATA.md for details.
- Inspect data (optional):
npm run db:studio
Resetting the Database
To clear all data and start fresh:
npm run db:reset
This will drop the database, run migrations, and reseed test data. ⚠️ Warning: This deletes ALL data!
Prisma Schema - Single Source of Truth
⚠️ Important: The repository uses a single Prisma schema at prisma/schema.prisma.
This schema generates two separate Prisma Clients:
- Root client (for API/scripts): `./node_modules/@prisma/client`
- Web client (for Next.js): `../web/node_modules/.prisma/client`
Never create duplicate schemas like web/prisma/schema.prisma. The single schema ensures:
- No schema drift between API and web
- Single migration history
- One place to update models
See prisma/README.md for detailed documentation.
Database Performance & Indexing
The database schema includes comprehensive indexes for optimal query performance:
- 17 indexes across all tables prevent full table scans
- Composite indexes optimize common multi-column queries
- Foreign key indexes ensure fast JOINs
- Performance target: Sub-100ms queries for 100k+ records
To verify indexes after migration:
npm run db:verify-indexes
See detailed documentation:
Optional: Postgres via Docker
If you prefer Postgres, a docker-compose.yml is included.
- Start Postgres:
docker compose up -d
- In `.env`, set `DATABASE_URL` to a Postgres URL (see `.env.example`).
- Re-run Prisma generate/migrate so the client matches the Postgres schema.
If you previously generated SQLite migrations, clear them before switching:
rm -rf prisma/migrations/*
npm run db:migrate
Database Backup and Disaster Recovery
The project includes comprehensive automated backup and disaster recovery capabilities for production deployments:
- Automated Backups: Daily full backups and hourly incremental backups via WAL archiving
- Point-in-Time Recovery (PITR): Restore database to any specific timestamp
- Encrypted Storage: S3-compatible backup storage with encryption at rest
- Monitoring & Alerts: Automated backup verification and health checks
- Disaster Recovery Runbook: Tested procedures with RTO/RPO targets
See detailed documentation:
- Database Backup & Recovery Guide - Complete setup and usage
- Disaster Recovery Runbook - Emergency procedures and scenarios
- Backup Monitoring - Monitoring and alerting configuration
- Ops Scripts - Backup and restore scripts
Quick start:
# Run manual backup
cd ops/backup
./backup-database.sh full
# Restore from backup
cd ops/restore
./restore-database.sh full
# Verify backups
cd ops/backup
./verify-backup.sh
Verification sketch
- Recompute the file hash (sha256) and compare with `content_hash` in manifest and on-chain `entries[hash]`.
- Verify `signature` in manifest was produced by the creator key.
- Confirm the creator matches the on-chain entry’s `creator`.
YouTube flow
Because YouTube re-encodes media, the on-platform bytes won’t match your master file hash. Use a binding:
- Anchor your master file as usual (upload → manifest → register)
- After uploading to YouTube, get the `videoId` (from the URL)
npm run bind:youtube -- ./master.mp4 <YouTubeVideoId> 0xRegistry
- Verify a YouTube URL or ID later:
npm run verify:youtube -- https://www.youtube.com/watch?v=<YouTubeVideoId> 0xRegistry
End-to-End Testing
The web application includes a comprehensive E2E test suite built with Playwright:
Quick Start
cd web
npm run test:e2e
Features
- Multi-browser testing: Chromium, Firefox, WebKit (Safari)
- Mobile testing: iPhone 12, Pixel 5 viewports
- Visual regression: Screenshot comparison for UI changes
- Accessibility testing: WCAG compliance, ARIA roles, keyboard navigation
- 85+ test cases covering all major user flows:
- Navigation and page loading
- Authentication with OAuth providers
- Dashboard and content viewing
- Content upload and registration
- Platform binding and verification
- Profile and account management
Running Tests
# Run all E2E tests
npm run test:e2e
# Interactive UI mode (best for development)
npm run test:e2e:ui
# Debug mode with step-by-step execution
npm run test:e2e:debug
# Test specific browsers
npm run test:e2e:chromium
npm run test:e2e:firefox
npm run test:e2e:webkit
# Mobile viewport testing
npm run test:e2e:mobile
CI Integration
E2E tests can be triggered manually via GitHub Actions workflow:
- Go to Actions
- Click "Run workflow"
- Optionally specify custom base URL for testing preview deployments
See the complete E2E Testing Guide for detailed documentation, debugging tips, and best practices.
Documentation
Getting Started
- Contributor Onboarding Guide - Complete setup instructions, development workflow, and troubleshooting
- Architecture Overview - System design, component interactions, and data flow
- PITCH.md - Plain-English explanation of the problem and solution
Technical Documentation
- Input Validation - Zod schemas and security validation
- Caching Architecture - Redis caching implementation details
- Rate Limiting - API rate limiting configuration
- Database Indexing Strategy - Query optimization and indexes
- Multi-Chain Deployment - Deploying to multiple EVM chains
- Platform Verification - Platform binding details (YouTube, TikTok, etc.)
- E2E Testing Guide - End-to-end testing with Playwright
Operations & Security
- Security Policy - Reporting vulnerabilities and security practices
- Smart Contract Audit - Security analysis and audit results
- Observability & Monitoring - Structured logging, metrics, and monitoring setup
- Observability Quick Start - 5-minute guide to monitoring in production
- Database Backup & Recovery - Backup and disaster recovery procedures
- Secret Management - Managing sensitive credentials in production
Deployment & Infrastructure
- Deployment Playbook - Complete guide for staging and production deployments
- Environment Variables Reference - Comprehensive configuration documentation
- Ops Scripts - Backup, restore, and SSL management scripts
Next steps
- Add C2PA manifest embedding for images/video.
- Support Merkle batch anchoring.
- Add selective disclosure/zk proof of “is a real person” VC.
CLI Tool and SDK for Programmatic Access
Internet ID provides multiple ways to interact with the platform programmatically:
CLI Tool
Command-line tool for content registration and verification. Perfect for automation, scripting, and CI/CD workflows.
# Install globally
npm install -g @internet-id/cli
# Configure credentials
internet-id init
# Upload and register content
internet-id upload ./my-video.mp4
# Verify content
internet-id verify ./my-video.mp4
Features:
- ✅ Interactive configuration with `init` command
- ✅ Optional content upload to IPFS
- ✅ Content verification by file or manifest URI
- ✅ Support for multiple IPFS providers (Web3.Storage, Pinata, Infura, local)
- ✅ Multi-chain support (Base, Ethereum, Polygon, Arbitrum, Optimism)
Documentation: CLI README
TypeScript/JavaScript SDK
Official SDK for building integrations and tools.
# Install the SDK
npm install @internet-id/sdk
import { InternetIdClient } from "@internet-id/sdk";
const client = new InternetIdClient({
apiKey: "iid_your_api_key_here",
});
// Verify content by platform URL
const result = await client.verifyByPlatform({
url: "https://youtube.com/watch?v=abc123",
});
console.log(result.verified); // true or false
console.log(result.creator); // Creator's Ethereum address
Features:
- ✅ Full TypeScript support with type definitions
- ✅ Content verification and metadata retrieval
- ✅ API key management
- ✅ JWT authentication
- ✅ Automatic rate limiting and error handling
Documentation: SDK README
Browser Extension
Verify content without leaving the platform: the extension provides one-click verification directly on supported sites.
Installation:
- Chrome/Edge/Brave: Load unpacked from the `extension/` directory (developer mode)
- Coming Soon: Chrome Web Store, Firefox Add-ons, Safari Extensions
Features:
- ✅ Platform detection (YouTube, Twitter/X, Instagram, GitHub, TikTok, LinkedIn)
- ✅ One-click verification from extension popup
- ✅ Verification badges displayed directly on platform pages
- ✅ Quick access to Internet ID dashboard
- ✅ Wallet connection for signing and registration
- ✅ Privacy-conscious with 5-minute cache and local storage only
- ✅ Configurable auto-verify and badge display settings
How It Works:
- Install extension in your browser
- Configure API endpoint in settings
- Visit supported platform (e.g., YouTube video)
- Extension automatically checks verification status
- Verified content displays a badge
- Click extension icon for details or to verify new content
Documentation:
- Browser Extension README - Installation and usage
- Extension Architecture - Technical design and development
Public API
RESTful API for third-party integrations.
Features:
- ✅ Versioned API (`/api/v1/`)
- ✅ Multiple authentication methods (API keys, JWT tokens)
- ✅ Rate limiting per tier (free: 100 req/min, paid: 1000 req/min)
- ✅ OpenAPI/Swagger documentation at `/api/docs`
Documentation:
- Public API Documentation - Complete API reference
- Developer Onboarding Guide - Get started quickly
- Interactive API Explorer: http://localhost:3001/api/docs (when running locally)
API reference (summary)
Legacy Endpoints
Auth: If API_KEY is set, include x-api-key: $API_KEY in requests for protected endpoints.
- `GET /api/health` – server status
- `GET /api/network` – returns `chainId`
- `GET /api/registry` – default registry address (if configured)
- `POST /api/upload` – upload file to IPFS (protected)
- `POST /api/manifest` – build and optionally upload manifest (protected)
- `POST /api/register` – register content hash + manifest on-chain (protected)
- `POST /api/bind` – bind a single platform ID (protected)
- `POST /api/bind-many` – bind multiple platform IDs at once (protected)
- `POST /api/verify` – verify a file against manifest + on-chain
- `POST /api/proof` – generate `proof.json`
- `GET /api/contents` – list registered contents
- `GET /api/verifications` – list recent verifications
- `GET /api/resolve` – resolve URL or platform+id to on-chain binding
- `GET /api/public-verify` – public verification summary for a binding
- Web-only:
  - `GET /api/badge/[hash]` – SVG badge with `theme` and `w` (width)
  - `GET /api/qr?url=...` – QR PNG for a share URL
V1 Public API Endpoints
See Public API Documentation for the complete v1 API reference.