
Internet-ID: Human-Created Content Anchoring

This repo scaffolds a minimal on-chain content provenance flow:

  • A creator hashes their content and signs a manifest.
  • The manifest and content are stored off-chain (e.g., IPFS/Web3.Storage).
  • A small registry contract on an L2 anchors the content hash and a URI to the manifest.
  • Verifiers can recompute the hash, check the signature, and confirm the on-chain anchor.

Note: This proves provenance, not truth. It helps distinguish opted-in human-created content from anonymous deepfakes.

📚 Documentation:

For End Users (Content Creators & Viewers)

For Developers & Contributors

Stack

  • Solidity (ContentRegistry)
  • Hardhat + TypeScript
  • Ethers v6
  • IPFS uploads via Infura, Web3.Storage, or Pinata
  • Express API with optional API key protection
  • Comprehensive input validation using Zod (see docs/VALIDATION.md)
  • Prisma ORM (SQLite by default; Postgres optional)
  • Redis caching layer for improved performance (optional, see docs/CACHING_ARCHITECTURE.md)
  • Asynchronous verification queue using BullMQ for background processing (optional, see docs/VERIFICATION_QUEUE.md)
  • Next.js App Router web UI (optional)
  • NextAuth for sign-in (GitHub/Google to start), Prisma adapter
  • Browser Extension for one-click verification on supported platforms (Chrome, Firefox, Safari - see extension/README.md)

Security

This project implements comprehensive security measures across smart contracts and API:

Smart Contract Security

  • Automated security analysis completed (Slither)
  • No critical or high severity vulnerabilities found
  • Comprehensive access control with onlyCreator modifier
  • No reentrancy risks (no external calls)
  • Integer overflow protection (Solidity 0.8+)
  • 📋 Professional audit planned before mainnet launch

See: Smart Contract Audit Report | Security Policy

API Security

  • Comprehensive input validation and sanitization
  • XSS (Cross-Site Scripting) prevention
  • SQL injection protection via Prisma ORM
  • Command injection prevention
  • Path traversal protection
  • File upload security with size limits and type restrictions
  • Rate limiting (when configured with Redis)
  • Performance optimization with Redis caching layer

See: Input Validation Documentation | Security Implementation Summary | Caching Security Summary

Web Application Security

  • Content Security Policy (CSP) with nonce-based script execution
  • XSS protection via strict CSP (no unsafe-eval or unsafe-inline in production)
  • Clickjacking protection (frame-ancestors, X-Frame-Options)
  • MIME-type sniffing protection (X-Content-Type-Options)
  • HSTS (HTTP Strict Transport Security)
  • Referrer policy for privacy
  • Permissions policy for browser features

See: CSP Security Improvements
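A minimal sketch of building such a header follows; the directive set below is illustrative, while the project's actual policy lives in the Next.js configuration:

```typescript
// Illustrative nonce-based CSP builder: each response gets a fresh nonce,
// and only scripts carrying that nonce may execute (no unsafe-inline).
import { randomBytes } from "node:crypto";

export function newNonce(): string {
  return randomBytes(16).toString("base64");
}

export function buildCsp(nonce: string): string {
  return [
    "default-src 'self'",
    `script-src 'self' 'nonce-${nonce}' 'strict-dynamic'`,
    "object-src 'none'",
    "base-uri 'self'",
    "frame-ancestors 'none'", // clickjacking protection
  ].join("; ");
}
```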

Reporting Security Issues

We take security seriously. If you discover a vulnerability, please report it responsibly:

Code Quality

This project uses ESLint and Prettier to maintain consistent code style and catch common issues:

  • ESLint: Configured for both Node.js/Hardhat scripts (TypeScript) and Next.js app
  • Prettier: Shared formatting config across the monorepo

Linting & Formatting

# Run linters across the entire monorepo
npm run lint

# Fix auto-fixable linting issues
npm run lint:fix

# Format all code with Prettier
npm run format

# Check if code is properly formatted
npm run format:check

For the web package specifically:

cd web
npm run lint        # ESLint for Next.js app
npm run lint:fix    # Auto-fix issues
npm run format      # Format with Prettier

Configuration files:

  • Root ESLint: .eslintrc.json (TypeScript + Node.js)
  • Web ESLint: web/.eslintrc.json (Next.js)
  • Prettier: .prettierrc.json (shared)

Continuous Integration

This project uses GitHub Actions to ensure code quality and prevent regressions. The CI workflow runs automatically on pull requests and pushes to the main branch.

CI Workflows

  1. Backend Job:

    • Installs dependencies with npm cache
    • Runs ESLint on root package
    • Checks code formatting with Prettier
    • Validates Prisma schema formatting
    • Compiles Solidity contracts with Hardhat
    • Generates Prisma client and runs migrations
    • Runs all backend tests (Hardhat + unit tests)
  2. Web Job:

    • Installs dependencies for both root and web packages
    • Runs ESLint on Next.js app
    • Checks code formatting
    • Builds the Next.js application
  3. E2E Tests (manual trigger):

    • Runs comprehensive end-to-end tests with Playwright
    • Tests across Chromium, Firefox, and WebKit browsers
    • Validates mobile responsiveness
    • Performs visual regression testing
    • Can run against preview deployments
  4. Security Scanning (automated):

    • CodeQL analysis for security vulnerabilities (weekly + on every push/PR)
    • Dependency review for new vulnerabilities in PRs
    • Dependabot security alerts (daily checks)

View the CI workflow configuration and E2E workflow configuration.

Note: This CI workflow is part of the project roadmap to guard against regressions (see #10).

CI Environment Variables

The CI workflows use the following environment variables:

Backend Job:

  • DATABASE_URL: PostgreSQL connection for tests (provided by CI service container)
    • Value: postgresql://internetid:internetid@localhost:5432/internetid_test?schema=public
    • PostgreSQL 16 runs as a service container with health checks

Web Job:

  • DATABASE_URL: Same as backend (for Prisma client generation)
  • NEXTAUTH_URL: Base URL for NextAuth (mock value: http://localhost:3000)
  • NEXTAUTH_SECRET: Session encryption secret (mock value for build-time only)

Note: No real secrets or RPC URLs are required for CI. The workflows use:

  • Local PostgreSQL for database tests
  • Mock values for Next.js build (build runs in standalone mode)
  • Hardhat's built-in test network for smart contract tests

Local CI Debugging

To reproduce CI failures locally:

# Backend workflow
npm ci --legacy-peer-deps
npm run lint:root
npm run format:check
npx prisma format --check
npm run build                    # Compile Hardhat contracts
npm run db:generate              # Generate Prisma client
npx prisma migrate deploy        # Run migrations
npm test                         # Run Hardhat tests

# Web workflow
cd web
npm ci --legacy-peer-deps
npm run lint
npm run format:check
npm run build                    # Build Next.js app
cd ..

PostgreSQL Setup for Local Testing:

# Using Docker
docker run -d \
  --name postgres-test \
  -e POSTGRES_USER=internetid \
  -e POSTGRES_PASSWORD=internetid \
  -e POSTGRES_DB=internetid_test \
  -p 5432:5432 \
  postgres:16-alpine

# Set environment variable
export DATABASE_URL="postgresql://internetid:internetid@localhost:5432/internetid_test?schema=public"

# Run migrations and tests
npm run db:generate
npx prisma migrate deploy
npm test

For more troubleshooting, see the Development Setup Guide.

Dependency Management

The project uses automated tools to keep dependencies up-to-date and secure:

  • Dependabot: Automatically creates PRs for dependency updates
    • Security updates: Daily checks
    • Regular updates: Weekly checks (Mondays)
    • Auto-merge for patch/minor updates after CI passes
    • Manual review required for major version updates
  • CodeQL: Advanced security analysis (weekly + on push/PR)
  • Dependency Review: Checks PRs for vulnerable dependencies

Quick Reference: Dependency Update Process
Full Guide: Dependency Management

Setup

Quick Start: See the Contributor Onboarding Guide for detailed setup instructions.

Essential Configuration

  1. Install dependencies:

    npm install --legacy-peer-deps
    
  2. Configure environment:

    cp .env.example .env
    # Edit .env and set:
    # - PRIVATE_KEY (deployer wallet private key)
    # - RPC_URL (blockchain RPC endpoint, e.g., https://sepolia.base.org)
    # - IPFS provider (Web3.Storage, Pinata, or Infura credentials)
    # - DATABASE_URL (default: file:./dev.db for SQLite)
    # See .env.example for all options and descriptions
    
  3. Set up database:

    npm run db:generate
    npm run db:migrate
    
  4. Compile contracts:

    npm run build
    

Web App Configuration

If you plan to use the web UI (web/), create web/.env.local:

cp web/.env.example web/.env.local
# Edit web/.env.local and set:
# - NEXT_PUBLIC_API_BASE (API server URL, e.g., http://localhost:3001)
# - NEXTAUTH_SECRET (generate with: openssl rand -base64 32)
# - DATABASE_URL (must match root .env)
# - OAuth provider credentials (GitHub, Google, Twitter, etc.)
# See web/.env.example for complete configuration

Note on Multi-Chain Deployments:

  • Each network requires a separate deployment of the ContentRegistry contract
  • Deployed addresses are saved in deployed/<network>.json files
  • The registry service automatically resolves the correct contract address based on the chain ID
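That resolution step can be sketched as a lookup from chain ID to deployment file; the mapping below is a partial, illustrative subset of the supported networks:

```typescript
// Hypothetical sketch: resolve the ContentRegistry deployment file for a
// chain. Chain IDs match the networks listed in this README.
const networkByChainId: Record<number, string> = {
  1: "ethereum",
  137: "polygon",
  8453: "base",
  11155111: "sepolia",
  84532: "base-sepolia",
};

export function deploymentFile(chainId: number): string {
  const network = networkByChainId[chainId];
  if (!network) throw new Error(`No deployment recorded for chain ${chainId}`);
  return `deployed/${network}.json`; // e.g. deployed/base-sepolia.json
}
```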

Multi-Chain Support

Internet-ID supports deployment and verification across multiple EVM-compatible chains:

Supported Networks

Mainnets (Production):

  • Ethereum Mainnet (chain ID: 1) High security, higher gas costs
  • Polygon (chain ID: 137) Low cost, good UX, MATIC gas token
  • Base (chain ID: 8453) Coinbase L2, low cost, good UX
  • Arbitrum One (chain ID: 42161) Low cost L2
  • Optimism (chain ID: 10) Low cost L2

Testnets (Development):

  • Ethereum Sepolia (chain ID: 11155111)
  • Polygon Amoy (chain ID: 80002)
  • Base Sepolia (chain ID: 84532)
  • Arbitrum Sepolia (chain ID: 421614)
  • Optimism Sepolia (chain ID: 11155420)

Chain Configuration

Chain configurations are defined in config/chains.ts with:

  • RPC URLs (with environment variable overrides)
  • Block explorer URLs
  • Native currency details
  • Gas settings

You can override default RPC URLs via environment variables:

ETHEREUM_RPC_URL=https://your-eth-rpc.com
POLYGON_RPC_URL=https://your-polygon-rpc.com
BASE_RPC_URL=https://your-base-rpc.com
# See .env.example for all options

Scripts

  • build compile contracts

Deployment Scripts (Multi-Chain):

  • deploy:ethereum deploy to Ethereum Mainnet
  • deploy:sepolia deploy to Ethereum Sepolia testnet
  • deploy:polygon deploy to Polygon
  • deploy:polygon-amoy deploy to Polygon Amoy testnet
  • deploy:base deploy to Base
  • deploy:base-sepolia deploy to Base Sepolia testnet
  • deploy:arbitrum deploy to Arbitrum One
  • deploy:arbitrum-sepolia deploy to Arbitrum Sepolia testnet
  • deploy:optimism deploy to Optimism
  • deploy:optimism-sepolia deploy to Optimism Sepolia testnet
  • deploy:local deploy to local Hardhat node

Other Scripts:

  • register hash a file and register its hash + manifest URI on-chain
    • Set RPC_URL for your preferred network; for local, you can use LOCAL_RPC_URL=http://127.0.0.1:8545.
    • For IPFS uploads: IPFS_API_URL and optional IPFS_PROJECT_ID/IPFS_PROJECT_SECRET
  • verify verify a file against its manifest and on-chain registry
  • bind:youtube bind a YouTube videoId to a previously registered master file
  • verify:youtube verify a YouTube URL/ID via on-chain binding + manifest
  • start:api start the Express API server (default port 3001)
  • lint run ESLint on both root and web packages
  • lint:fix automatically fix ESLint issues where possible
  • format format all code with Prettier
  • format:check check if code is formatted correctly
  • Web: from web/ workspace
    • npm run dev start Next.js dev server on :3000
    • npm run build && npm start production build/start
    • npm run prisma:generate generate Prisma Client for web (uses root schema)
    • npm run lint run ESLint on web package
    • npm run lint:fix automatically fix ESLint issues in web package
    • npm run format format web code with Prettier
    • npm run test:e2e run end-to-end tests with Playwright
    • npm run test:e2e:ui run E2E tests in interactive UI mode
    • npm run test:e2e:chromium run E2E tests on Chromium only
    • npm run test:e2e:firefox run E2E tests on Firefox only
    • npm run test:e2e:webkit run E2E tests on WebKit (Safari) only
    • npm run test:e2e:mobile run E2E tests on mobile viewports

Quickstart

  1. Compile and deploy

npm i
npx hardhat compile

# Deploy to Base Sepolia (testnet)
npm run deploy:base-sepolia

# Or deploy to other networks
npm run deploy:polygon-amoy  # Polygon testnet
npm run deploy:sepolia       # Ethereum testnet
npm run deploy:optimism-sepolia  # Optimism testnet
npm run deploy:arbitrum-sepolia  # Arbitrum testnet

Local node option (no faucets needed)

# Terminal A: start local node (prefunded accounts)
npm run node

# Terminal B: deploy locally
npm run deploy:local

Production Deployments:

For mainnet deployments, ensure you have:

  • Sufficient native tokens for gas (ETH, MATIC, etc.)
  • PRIVATE_KEY set in .env
  • Appropriate RPC URL configured

npm run deploy:polygon   # Polygon mainnet (low cost)
npm run deploy:base      # Base mainnet (low cost L2)
npm run deploy:arbitrum  # Arbitrum One (low cost L2)
npm run deploy:optimism  # Optimism (low cost L2)
npm run deploy:ethereum  # Ethereum mainnet (high cost, high security)

  2. Upload your content and manifest

Docker Deployment

For production and staging environments, use Docker for containerized deployment:

Quick Start with Docker Compose

# Development (local testing)
docker compose up -d

# Staging environment
docker compose -f docker-compose.staging.yml up -d

# Production environment
docker compose -f docker-compose.production.yml up -d

Container Images

The project provides two Docker images:

  1. API Server (Dockerfile.api):

    • Express API server
    • Hardhat contracts
    • Prisma database client
    • Multi-stage build for optimized size
  2. Web Application (web/Dockerfile):

    • Next.js application
    • Standalone output for production
    • Multi-stage build for optimized size

Environment-Specific Configurations

  • Development: docker-compose.yml - Local development with SQLite
  • Staging: docker-compose.staging.yml - Staging with PostgreSQL, Redis, auto-deployment
  • Production: docker-compose.production.yml - Production with HA, resource limits, backups

See Deployment Playbook for complete deployment instructions.

IPFS providers

Set one of the following in .env before uploading. By default, the uploader tries providers in this order and falls back on failures: Web3.Storage → Pinata → Infura. You can also run a local IPFS node.

  • Infura IPFS: IPFS_API_URL, IPFS_PROJECT_ID, IPFS_PROJECT_SECRET
  • Web3.Storage: WEB3_STORAGE_TOKEN
  • Pinata: PINATA_JWT
  • Local IPFS node: IPFS_PROVIDER=local and (optionally) IPFS_API_URL=http://127.0.0.1:5001
  • Note: If both Web3.Storage and Pinata are set, Web3.Storage is attempted first. 5xx errors automatically trigger fallback.
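The fallback order can be sketched as below; the Uploader signature is an assumption for illustration, not the repo's actual uploader API:

```typescript
// Try each configured provider in order (Web3.Storage → Pinata → Infura)
// and fall through to the next one when an upload fails (e.g. a 5xx).
type Uploader = (file: Buffer) => Promise<string>; // resolves to a CID

export async function uploadWithFallback(
  file: Buffer,
  uploaders: Uploader[],
): Promise<string> {
  let lastError: unknown;
  for (const upload of uploaders) {
    try {
      return await upload(file); // first provider to succeed wins
    } catch (err) {
      lastError = err; // record and try the next provider
    }
  }
  throw lastError; // every provider failed
}
```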

Force a specific provider (optional)

  • Set IPFS_PROVIDER=web3storage|pinata|infura in .env to force the uploader to use one provider only (no fallback). Helpful while debugging credentials.
  • For local node usage, set IPFS_PROVIDER=local.

Troubleshooting

  • 401 Unauthorized (Infura): Ensure you created an IPFS project and used those credentials. Ethereum RPC keys won't work for IPFS. Check IPFS_PROJECT_ID and IPFS_PROJECT_SECRET.
  • 503/5xx (Web3.Storage/Pinata): Temporary outage or maintenance. Either wait, or set IPFS_PROVIDER to try another provider.
  • Slow or timeouts: The uploader retries with exponential backoff. You can re-run the command; CIDs are content-addressed and idempotent across providers.
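The retry behavior can be sketched like this (attempt counts and base delay are illustrative, not the uploader's actual tuning):

```typescript
// Retry an async operation with exponential backoff: wait baseMs, 2*baseMs,
// 4*baseMs, ... between attempts, rethrowing after the final failure.
const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

export async function withBackoff<T>(
  fn: () => Promise<T>,
  attempts = 4,
  baseMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= attempts - 1) throw err; // out of retries
      await sleep(baseMs * 2 ** attempt);
    }
  }
}
```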

Local IPFS quickstart (optional)

If you prefer not to use third-party providers, you can run a local Kubo node:

  1. Install IPFS (Kubo) from https://github.com/ipfs/kubo
  2. Initialize and start the daemon:

ipfs init
ipfs daemon

  3. In .env, set:

IPFS_PROVIDER=local
IPFS_API_URL=http://127.0.0.1:5001

  4. Upload with the same script; it will hit your local node.

Upload your content and note the CID


npm run upload:ipfs -- ./path/to/file

# Make manifest.json (safer: use PRIVATE_KEY from .env)

npm run manifest -- ./path/to/file ipfs://<CID>

# Alternatively (less secure; your key appears in shell history):

npm run manifest -- ./path/to/file ipfs://<CID> <PRIVATE_KEY>

# Optionally upload manifest.json too

npm run upload:ipfs -- ./manifest.json

  3. Anchor on-chain

# Use the manifest URI (e.g., ipfs://<manifestCID>) and deployed address

npm run register -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress

  4. Verify a file

npm run verify -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress

  5. Generate a portable proof bundle (optional)

npm run proof -- ./path/to/file ipfs://<manifestCID> 0xYourRegistryAddress

This writes proof.json with the file hash, manifest details, recovered signer, on-chain entry, and the registration tx (best effort). You can share this alongside your content.

Web UI (optional)

The Next.js app in web/ provides end-to-end flows:

  • Upload to IPFS
  • One-shot: Upload → manifest → register (can also bind links)
  • Manifest creation
  • Register on-chain
  • Verify and Proof generation
  • Bind platform links (single or batch)
  • Browse registered contents (with inline verify and Share block)
  • Account: Sign in, register, and link platform identities (profile)

Run locally:


cd web
npm i
npm run dev

Set NEXT_PUBLIC_API_BASE to the API origin (default http://localhost:3001).

Privacy by default

The One-shot flow does not upload the video by default. It computes the hash locally, builds a manifest, uploads the manifest, and registers on-chain. You can opt in to uploading the video to IPFS via a checkbox. The manifest's content_uri may be omitted when not uploading, preserving privacy while still enabling provenance.

Public Verify page

Viewers can verify a platform link without downloading your master file. The web app exposes a public Verify page at /verify and backend endpoints to resolve bindings:

  • GET /api/resolve map a URL or platform+platformId to the on-chain binding
  • GET /api/public-verify resolve binding and return manifest summary
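A viewer-side client might assemble a request to these endpoints as follows; the query parameter names mirror the share-link format used elsewhere in this README and should be treated as assumptions:

```typescript
// Hypothetical helper: build the public-verify URL for a platform binding.
export function publicVerifyUrl(
  base: string,
  platform: string,
  platformId: string,
): string {
  const url = new URL("/api/public-verify", base);
  url.searchParams.set("platform", platform);
  url.searchParams.set("platformId", platformId);
  return url.toString();
}
```

The result can be fetched with any HTTP client; no API key is needed, since these endpoints are public.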

Sharing and badges

Each registered content gets a shareable badge and QR codes. In the UI, the Share block provides:

  • Badge image (SVG): /api/badge/[hash]?theme=dark|light&w=120..640
  • QR code PNG: /api/qr?url=<encoded_share_url>
  • Share link to the public Verify page: /verify?platform=...&platformId=...
  • Embed HTML snippet: <a href="..."><img src="/api/badge/[hash]" /></a>
  • Copy All button: copies a bundle of badge URL, per-link share URLs, QR URLs, and embed HTML

Tip: Set NEXT_PUBLIC_SITE_BASE so badges/links use your canonical host when sharing.

Verifying account ownership via OAuth

To confirm a creator controls a given platform account, use OAuth sign-in and link their provider account(s) to their user profile. Start with GitHub/Google for baseline auth, and add platform-specific providers as available (e.g., X/Twitter, YouTube via Google scopes, etc.). When a user binds a platform URL/ID, you can check their linked Accounts in the database to enforce ownership if desired.

API server

  • Start the API: npm run start:api
  • If API_KEY is set in .env, the following endpoints require header x-api-key: $API_KEY:
    • POST /api/upload
    • POST /api/manifest
    • POST /api/register
    • POST /api/bind

Other endpoints like /api/verify and /api/proof are public by default.

Observability Endpoints

The API includes built-in observability for production monitoring:

  • Health Check: GET /api/health - Service health status with database, cache, and blockchain checks
  • Metrics: GET /api/metrics - Prometheus-format metrics for monitoring
  • Metrics (JSON): GET /api/metrics/json - Human-readable metrics

All HTTP requests are automatically logged with correlation IDs for request tracing. See Observability Guide for complete details on structured logging, metrics, and monitoring setup.

When calling from the Next.js UI or curl, include the header if enabled:


curl -H "x-api-key: $API_KEY" -F file=@./video.mp4 \
 -F registryAddress=0x... -F manifestURI=ipfs://... \
 http://localhost:3001/api/register

Performance & Caching

The API includes an optional Redis-based caching layer to improve performance and reduce database load:

Caching Features

  • Cache-aside pattern: Automatic fallback to database on cache miss
  • Smart TTLs: Different cache lifetimes based on data type
    • Content metadata: 10 minutes
    • Manifests: 15 minutes
    • Platform bindings: 3 minutes
    • Verification status: 5 minutes
  • Automatic invalidation: Caches cleared on writes (register, bind, verify)
  • LRU eviction: Keeps most frequently accessed data in memory
  • Graceful degradation: Works without Redis, falls back to database
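The cache-aside pattern with write-time invalidation can be sketched as follows, with an in-memory Map standing in for Redis:

```typescript
// Sketch of cache-aside: return a fresh cached value if present, otherwise
// load from the database, store it with a TTL, and return it.
type Entry = { value: unknown; expiresAt: number };
const cache = new Map<string, Entry>();

export async function cacheAside<T>(
  key: string,
  ttlMs: number,
  load: () => Promise<T>, // database fallback on cache miss
): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value as T;
  const value = await load();
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Invalidate on writes (register, bind, verify) by key prefix.
export function invalidate(prefix: string): void {
  for (const key of cache.keys()) if (key.startsWith(prefix)) cache.delete(key);
}
```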

Setup

  1. Start Redis (Docker recommended):

docker compose up -d redis

  2. Set Redis URL in .env:

REDIS_URL=redis://localhost:6379

  3. Restart the API - caching will be enabled automatically

Monitoring

Check cache performance at /api/cache/metrics:

curl http://localhost:3001/api/cache/metrics

Returns hit rate, cache hits/misses, and error counts.

See docs/CACHING_ARCHITECTURE.md for detailed documentation.

Database

By default, the project uses a local SQLite file for easy setup.

  1. Generate Prisma client and apply migrations:

npm run db:generate
npm run db:migrate

  2. Seed the database with test data (optional but recommended for development):

npm run db:seed

This populates the database with sample users, contents, platform bindings, and verifications. See prisma/SEED_DATA.md for details.

  3. Inspect data (optional):

npm run db:studio

Resetting the Database

To clear all data and start fresh:


npm run db:reset

This will drop the database, run migrations, and reseed test data. ⚠️ Warning: This deletes ALL data!

Prisma Schema - Single Source of Truth

⚠️ Important: The repository uses a single Prisma schema at prisma/schema.prisma.

This schema generates two separate Prisma Clients:

  • Root client (for API/scripts): ./node_modules/@prisma/client
  • Web client (for Next.js): ../web/node_modules/.prisma/client

Never create duplicate schemas like web/prisma/schema.prisma. The single schema ensures:

  • No schema drift between API and web
  • Single migration history
  • One place to update models

See prisma/README.md for detailed documentation.

Database Performance & Indexing

The database schema includes comprehensive indexes for optimal query performance:

  • 17 indexes across all tables prevent full table scans
  • Composite indexes optimize common multi-column queries
  • Foreign key indexes ensure fast JOINs
  • Performance target: Sub-100ms queries for 100k+ records

To verify indexes after migration:

npm run db:verify-indexes

See detailed documentation:

Optional: Postgres via Docker

If you prefer Postgres, a docker-compose.yml is included.

  1. Start Postgres:

docker compose up -d

  2. In .env, set DATABASE_URL to a Postgres URL (see .env.example).

  3. Re-run Prisma generate/migrate so the client matches the Postgres schema.

If you previously generated SQLite migrations, clear them before switching:


rm -rf prisma/migrations/*
npm run db:migrate

Database Backup and Disaster Recovery

The project includes comprehensive automated backup and disaster recovery capabilities for production deployments:

  • Automated Backups: Daily full backups and hourly incremental backups via WAL archiving
  • Point-in-Time Recovery (PITR): Restore database to any specific timestamp
  • Encrypted Storage: S3-compatible backup storage with encryption at rest
  • Monitoring & Alerts: Automated backup verification and health checks
  • Disaster Recovery Runbook: Tested procedures with RTO/RPO targets

See detailed documentation:

Quick start:

# Run manual backup
cd ops/backup
./backup-database.sh full

# Restore from backup
cd ops/restore
./restore-database.sh full

# Verify backups
cd ops/backup
./verify-backup.sh

Verification sketch

  • Recompute the file hash (sha256) and compare with content_hash in manifest and on-chain entries[hash].
  • Verify signature in manifest was produced by the creator key.
  • Confirm the creator matches the on-chain entry's creator.

YouTube flow

Because YouTube re-encodes media, the on-platform bytes won't match your master file hash. Use a binding:

  1. Anchor your master file as usual (upload → manifest → register)
  2. After uploading to YouTube, get the videoId (from the URL)
  3. Bind the YouTube video to the master file:

npm run bind:youtube -- ./master.mp4 <YouTubeVideoId> 0xRegistry

  4. Verify a YouTube URL or ID later:

npm run verify:youtube -- https://www.youtube.com/watch?v=<YouTubeVideoId> 0xRegistry

End-to-End Testing

The web application includes a comprehensive E2E test suite built with Playwright:

Quick Start

cd web
npm run test:e2e

Features

  • Multi-browser testing: Chromium, Firefox, WebKit (Safari)
  • Mobile testing: iPhone 12, Pixel 5 viewports
  • Visual regression: Screenshot comparison for UI changes
  • Accessibility testing: WCAG compliance, ARIA roles, keyboard navigation
  • 85+ test cases covering all major user flows:
    • Navigation and page loading
    • Authentication with OAuth providers
    • Dashboard and content viewing
    • Content upload and registration
    • Platform binding and verification
    • Profile and account management

Running Tests

# Run all E2E tests
npm run test:e2e

# Interactive UI mode (best for development)
npm run test:e2e:ui

# Debug mode with step-by-step execution
npm run test:e2e:debug

# Test specific browsers
npm run test:e2e:chromium
npm run test:e2e:firefox
npm run test:e2e:webkit

# Mobile viewport testing
npm run test:e2e:mobile

CI Integration

E2E tests can be triggered manually via GitHub Actions workflow:

  1. Go to Actions
  2. Click "Run workflow"
  3. Optionally specify custom base URL for testing preview deployments

See the complete E2E Testing Guide for detailed documentation, debugging tips, and best practices.

Documentation

Getting Started

Technical Documentation

Operations & Security

Deployment & Infrastructure

Next steps

  • Add C2PA manifest embedding for images/video.
  • Support Merkle batch anchoring.
  • Add selective disclosure/zk proof of “is a real person” VC.

CLI Tool and SDK for Programmatic Access

Internet ID provides multiple ways to interact with the platform programmatically:

CLI Tool

Command-line tool for content registration and verification. Perfect for automation, scripting, and CI/CD workflows.

# Install globally
npm install -g @internet-id/cli

# Configure credentials
internet-id init

# Upload and register content
internet-id upload ./my-video.mp4

# Verify content
internet-id verify ./my-video.mp4

Features:

  • Interactive configuration with init command
  • Privacy mode (only manifest uploaded by default)
  • Optional content upload to IPFS
  • Content verification by file or manifest URI
  • Support for multiple IPFS providers (Web3.Storage, Pinata, Infura, local)
  • Multi-chain support (Base, Ethereum, Polygon, Arbitrum, Optimism)

Documentation: CLI README

TypeScript/JavaScript SDK

Official SDK for building integrations and tools.

# Install the SDK
npm install @internet-id/sdk

import { InternetIdClient } from "@internet-id/sdk";

const client = new InternetIdClient({
  apiKey: "iid_your_api_key_here",
});

// Verify content by platform URL
const result = await client.verifyByPlatform({
  url: "https://youtube.com/watch?v=abc123",
});

console.log(result.verified); // true or false
console.log(result.creator); // Creator's Ethereum address

Features:

  • Full TypeScript support with type definitions
  • Content verification and metadata retrieval
  • API key management
  • JWT authentication
  • Automatic rate limiting and error handling

Documentation: SDK README

Browser Extension

Seamless verification workflow without leaving the platform. One-click verification improves UX and conversion significantly.

Installation:

  • Chrome/Edge/Brave: Load unpacked from extension/ directory (developer mode)
  • Coming Soon: Chrome Web Store, Firefox Add-ons, Safari Extensions

Features:

  • Platform detection (YouTube, Twitter/X, Instagram, GitHub, TikTok, LinkedIn)
  • One-click verification from extension popup
  • Verification badges displayed directly on platform pages
  • Quick access to Internet ID dashboard
  • Wallet connection for signing and registration
  • Privacy-conscious with 5-minute cache and local storage only
  • Configurable auto-verify and badge display settings

How It Works:

  1. Install extension in your browser
  2. Configure API endpoint in settings
  3. Visit supported platform (e.g., YouTube video)
  4. Extension automatically checks verification status
  5. Verified content displays a badge
  6. Click extension icon for details or to verify new content

Documentation:

Public API

RESTful API for third-party integrations.

Features:

  • Versioned API (/api/v1/)
  • Multiple authentication methods (API keys, JWT tokens)
  • Rate limiting per tier (free: 100 req/min, paid: 1000 req/min)
  • OpenAPI/Swagger documentation at /api/docs

Documentation:

API reference (summary)

Legacy Endpoints

Auth: If API_KEY is set, include x-api-key: $API_KEY in requests for protected endpoints.

  • GET /api/health server status
  • GET /api/network returns chainId
  • GET /api/registry default registry address (if configured)
  • POST /api/upload upload file to IPFS (protected)
  • POST /api/manifest build and optionally upload manifest (protected)
  • POST /api/register register content hash + manifest on-chain (protected)
  • POST /api/bind bind a single platform ID (protected)
  • POST /api/bind-many bind multiple platform IDs at once (protected)
  • POST /api/verify verify a file against manifest + on-chain
  • POST /api/proof generate proof.json
  • GET /api/contents list registered contents
  • GET /api/verifications list recent verifications
  • GET /api/resolve resolve URL or platform+id to on-chain binding
  • GET /api/public-verify public verification summary for a binding
  • Web-only:
    • GET /api/badge/[hash] SVG badge with theme and w (width)
    • GET /api/qr?url=... QR PNG for a share URL

V1 Public API Endpoints

See Public API Documentation for the complete v1 API reference.