Privacy Design

How the Birthmark Standard protects photographer privacy while enabling verification

Camera Submission Packet

Figure: The Camera Submission Packet, showing the Birthmark Record and the Manufacturer Certificate.

When a camera authenticates an image, it creates two separate data structures that serve different purposes and go to different recipients. This architectural separation ensures no single party has complete information.

The Birthmark Record goes to the blockchain for public verification. It contains:

  • Image hashes - SHA-256 hashes of the raw and processed images
  • Modification level - Raw (0), Validated (1), or Modified (2)
  • Parent image hash - Links edited images to their originals
  • Metadata hashes (optional) - Irreversible hashes of the timestamp, GPS location, lens ID, and owner ID
  • Processing timestamp - Server processing time, rounded to the nearest minute

The Manufacturer Certificate goes only to the camera manufacturer for validation. It contains:

  • Encrypted camera token - Contains the camera's device fingerprint (NUC hash), encrypted with AES-256-GCM
  • Table/Key reference - Specifies which key table and index to use for decryption
  • Authority ID - Identifies which manufacturer should validate this camera

The manufacturer can identify which specific camera authenticated but never sees the image hashes or content. The blockchain stores image hashes but never receives camera identification data.
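
A minimal sketch of the two packets as they might look in code; the field names and types below are illustrative assumptions, not the standard's wire format:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BirthmarkRecord:
    """Public packet: submitted to the blockchain registry."""
    raw_image_hash: str                      # SHA-256 of the raw sensor image (hex)
    processed_image_hash: str                # SHA-256 of the processed image (hex)
    modification_level: int                  # 0 = Raw, 1 = Validated, 2 = Modified
    processing_timestamp: str                # server time, rounded to the minute
    parent_image_hash: Optional[str] = None  # links an edited image to its original
    metadata_hashes: dict[str, str] = field(default_factory=dict)  # optional salted hashes

@dataclass
class ManufacturerCertificate:
    """Private packet: routed only to the camera manufacturer."""
    encrypted_camera_token: bytes            # device fingerprint (NUC hash), AES-256-GCM
    table_index: int                         # which shared key table to use
    key_index: int                           # which key within that table
    authority_id: str                        # which manufacturer validates this camera
```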

What Different Parties See

| Data Type | Submission Server | Camera Manufacturer | Registry (Blockchain) | Public Verifier |
| --- | --- | --- | --- | --- |
| Image Hashes | ✅ Yes | ❌ No | ✅ Yes | ✅ Yes |
| Image Content | ❌ No | ❌ No | ❌ No | ❌ No |
| Metadata Hashes | ✅ Yes | ❌ No | ✅ Yes | ✅ Yes |
| Device Fingerprint (NUC Hash) | ❌ No (encrypted) | ✅ Yes (decrypts token) | ❌ No | ❌ No |
| Table/Key Reference | ✅ Yes | ✅ Yes (for validation) | ❌ No | ❌ No |
| Specific Camera Identity | ❌ No (anonymity sets) | ✅ Yes | ❌ No (no visibility) | ❌ No |
| Validation Result | ✅ Yes (PASS/FAIL) | N/A (generates result) | ❌ No | ❌ No |
| Modification Levels | ✅ Yes | ❌ No | ✅ Yes | ✅ Yes (query result) |
| Parent Image Hash | ✅ Yes | ❌ No | ✅ Yes | ✅ Yes (query result) |
| Authority IDs | ✅ Yes | N/A (is the authority) | ❌ No | ❌ No |
| Processing Timestamp | ✅ Yes (obscured timestamp) | ❌ No | ✅ Yes (obscured timestamp) | ✅ Yes (obscured timestamp) |
| Photographer Identity | ❌ No | ❌ No | ❌ No | ❌ No |
| Photo Location | ❌ No | ❌ No | ❌ No | ❌ No |
| Capture Timestamp | ❌ No | ❌ No | ❌ No | ❌ No |

Key Privacy Protections

  • 🖥️ Submission Server: Can process and route data but cannot decrypt camera tokens; uses anonymity sets to prevent specific camera identification
  • 🏭 Camera Manufacturer: Can validate camera authenticity and see which specific camera authenticated, but has no access to image hashes or content
  • ⛓️ Registry: Stores only irreversible hashes and metadata with obscured timestamps (rounded to the nearest minute); no image content, photographer information, or authority IDs
  • 👥 Public Verifier: Can verify authenticity of images they possess but gains no information about photographer, location, specific camera, or provenance chain

Privacy Mechanisms

Two-Packet Separation

The core privacy mechanism: each authentication produces two separate data structures that are sent to different parties and never combined in any public record.

  • Birthmark Record: Contains image hashes and metadata hashes (public blockchain)
  • Manufacturer Certificate: Contains encrypted camera token (sent to manufacturer)

Result: Connecting a specific camera to a specific image requires compromising multiple independent systems with opposing incentives. The manufacturer can identify the camera but never sees image hashes. The submission server sees image hashes but cannot identify the specific camera without the manufacturer's keys.
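
To make the split concrete, here is a hedged sketch of how the camera token could be sealed and opened with AES-256-GCM (the cipher the standard names). It uses the Python `cryptography` package, and the nonce-plus-ciphertext layout is an assumption of this example, not the standard's format:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def seal_camera_token(table_key: bytes, nuc_hash: bytes) -> bytes:
    """Camera side: encrypt the device fingerprint (NUC hash) under a shared table key."""
    nonce = os.urandom(12)                       # 96-bit nonce, standard for GCM
    ciphertext = AESGCM(table_key).encrypt(nonce, nuc_hash, None)
    return nonce + ciphertext                    # assumed layout: nonce || ciphertext+tag

def open_camera_token(table_key: bytes, token: bytes) -> bytes:
    """Manufacturer side: recover the NUC hash; raises InvalidTag on a forged token."""
    nonce, ciphertext = token[:12], token[12:]
    return AESGCM(table_key).decrypt(nonce, ciphertext, None)
```

The submission server forwards the token and its table/key reference but never holds `table_key`, so it cannot perform the decryption step; only the manufacturer, who holds the key tables, can.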

Key Table Anonymity

Each camera is randomly assigned 3 key tables (out of 2,500 total). Each table is shared by thousands of cameras. When authenticating, the camera randomly selects one of its 3 assigned tables.

Result: Submission servers and public verifiers cannot identify which specific camera authenticated an image—only the manufacturer can potentially identify the device through their private validation process, but they never see what was authenticated.
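
A back-of-the-envelope illustration of the resulting anonymity set; the 2,500-table figure comes from the text, while the fleet size below is a hypothetical assumption:

```python
import secrets

TOTAL_TABLES = 2_500        # from the standard
FLEET_SIZE = 10_000_000     # hypothetical number of cameras in circulation

# With 3 tables per camera, each table is shared by roughly
# FLEET_SIZE * 3 / TOTAL_TABLES cameras (~12,000 here), so a table
# reference alone narrows the source down to thousands of candidates.
cameras_per_table = FLEET_SIZE * 3 / TOTAL_TABLES
print(f"~{cameras_per_table:,.0f} cameras share any one key table")

# At authentication time the camera picks one of its 3 assigned tables at random.
assigned_tables = [41, 907, 2144]               # hypothetical assignment for one camera
chosen_table = secrets.choice(assigned_tables)
```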

Hash-Only Storage

Image content is never transmitted to or stored by any part of the system. Only SHA-256 hashes are submitted to the blockchain.

Result: Complete privacy of image content. A SHA-256 hash cannot be reversed to recover the image, and finding a different image with the same hash is computationally infeasible (a 2^256 output space).
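
As an example of what a public verifier can and cannot learn: a verifier who already possesses the image recomputes its SHA-256 locally and compares it with the on-chain record. The registry lookup below is a hypothetical placeholder:

```python
import hashlib

def image_hash(path: str) -> str:
    """SHA-256 of the local image file; the bytes never leave the verifier's machine."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical Birthmark Record fetched from the registry by hash lookup (lookup not shown).
record = {"processed_image_hash": "ab3f…", "modification_level": 1}
is_authentic = image_hash("photo.jpg") == record["processed_image_hash"]
```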

Timestamp Rounding

All timestamps recorded on the blockchain are rounded up to the nearest minute. This creates anonymity sets where all submissions within the same minute receive identical timestamps.

Result: Even if an adversary observes submission timing patterns, they cannot distinguish individual images authenticated within the same 60-second window. This prevents timing-based correlation attacks and protects photographers who authenticate multiple images in quick succession.
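
A small sketch of the rounding step as described above (rounding up to the minute boundary):

```python
from datetime import datetime, timedelta, timezone

def obscure_timestamp(ts: datetime) -> datetime:
    """Round up to the next whole minute so every submission in the same
    60-second window carries an identical timestamp."""
    floored = ts.replace(second=0, microsecond=0)
    return floored if floored == ts else floored + timedelta(minutes=1)

# 14:07:23 and 14:07:59 both become 14:08:00.
print(obscure_timestamp(datetime(2025, 5, 1, 14, 7, 23, tzinfo=timezone.utc)))
```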

Metadata Hashing (Optional)

Timestamp, GPS coordinates, lens ID, and owner ID are hashed before being included in the Birthmark Record. These features are opt-in and disabled by default.

Owner ID Hash Salt: The owner_hash uses a unique random salt stored in the image's EXIF metadata. This prevents correlation attacks across multiple images—each image from the same photographer produces a different owner_hash, making it impossible to link images by owner ID alone.

Result: Photographers can prove metadata authenticity without revealing location or identity unless they choose to share the original metadata alongside the image. Hashes confirm information you already have but never reveal information you don't possess.
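
A minimal sketch of the salted owner hash, assuming SHA-256 over a random per-image salt concatenated with the owner ID (the exact salt length and concatenation order are not specified here):

```python
import hashlib
import os

def make_owner_hash(owner_id: str) -> tuple[str, str]:
    """Return (salt_hex, owner_hash_hex); the salt is stored in the image's EXIF."""
    salt = os.urandom(16)                # fresh random salt for every image
    digest = hashlib.sha256(salt + owner_id.encode()).hexdigest()
    return salt.hex(), digest

# Two images from the same photographer yield unrelated owner hashes,
# so blockchain records cannot be linked by owner ID alone.
_, hash_a = make_owner_hash("press-card-1234")
_, hash_b = make_owner_hash("press-card-1234")
assert hash_a != hash_b
```

To later prove ownership of a particular image, the photographer would reveal the owner ID together with that image's salt from its EXIF data, letting anyone recompute and compare the hash.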

Separated Concerns

Information is distributed across multiple independent parties so each sees only non-identifying fragments:

  • Manufacturer validates cameras but never sees image hashes or content
  • Submission server sees image hashes but cannot identify specific cameras without manufacturer keys
  • Public blockchain stores permanent records but contains no camera or manufacturer identifying information
  • Public verifiers can check authenticity but learn nothing about the photographer, camera, or manufacturer

Result: Combining the fragments requires cooperation from entities with opposing incentives. No single point of failure or control.

Source Protection

The system protects journalistic sources through architectural design, not just policy. Connecting a specific camera to a specific image requires:

  1. Possessing the original image
  2. Compromising the submission server's encrypted logs to find which manufacturer validated it
  3. Compelling the manufacturer to reveal the camera identity

Result: Fishing expeditions are prevented. Authorities cannot browse the blockchain to identify photographers. Even with complete system access, adversaries can only connect cameras to images they already possess—they cannot predict future images or identify what new images depict.

What This System Does NOT Protect Against

  • Staged scenes: A real camera can photograph a fake scene
  • Screen photos: A camera can authenticate a photo of a screen showing an AI-generated image
  • Metadata leakage by the user: If you share your image with original EXIF data, that data is no longer private

Design Philosophy: The Birthmark Standard proves an image came from a legitimate camera sensor, not AI generation. It does not prove the scene is real, only that a physical camera captured it. Truth verification still requires journalistic judgment and context.