Verification infrastructure for production voice AI

Every AI voice interaction should be provable.

VoiceAttest turns conversations into auditable evidence: scored, signed, and reproducible across providers.

Request first-batch access

Trust collapse

Most teams can ship a voice agent. Very few can prove what it actually said.

$482,000 fraudulent transfer blocked after confidence collapse.

Replay confidence mismatch detected across provider transcripts.

Voice clone divergence: 12.4% above customer baseline.

Forensic replay

See the failure surface collapse in real time.

receipt://appointment-booking-v3/run_7e21
live call · 1.2s ago
Caller: Chase Bank
Voice Match: 97.2%
Deepfake Risk: Low
Verified 1.2s ago · att_9f1c2b40e7
View signed attestation
transcript divergence: “approve transfer” ≠ “confirm transfer”

Provider mismatch triggered policy replay before authorization.

confidence collapse: 98.7 → 61.3

Biometric confidence fell after synthetic prosody entered the call.

blocked action: $482,000

High-risk transfer held pending manual verification.

00:00.000 · Caller audio

Voiceprint accepted · 98.7% match confidence

00:02.814 · Transcript

Provider A: "approve transfer" · Provider B: "confirm transfer"

00:04.109 · Policy

$482,000 fraudulent transfer blocked

00:05.733 · Clone check

Voice clone divergence: 12.4% above baseline

00:06.017 · Receipt

sha256: 9f1c2b40e7027d9f · signed · immutable

Live attestation demo

Interaction timeline

Turn-by-turn verification with barge-in timing, disclosure coverage, and policy failures rendered as machine-readable evidence.

  • p95 turn latency: 402ms
  • verbatim consent: 100%
  • hallucinated pricing: 0 events
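As one hypothetical shape for that machine-readable evidence, a single turn-level verdict could be serialized like this. The field names (`turn_id`, `check`, `verdict`, and so on) are illustrative, not VoiceAttest's actual schema:

```python
import json

# Hypothetical turn-level evidence record; field names are
# illustrative, not VoiceAttest's actual schema.
turn_evidence = {
    "turn_id": 4,
    "timestamp_ms": 2814,
    "check": "disclosure_script_present",
    "verdict": "verified",
    "latency_ms": 402,         # per-turn latency captured alongside the audio
    "consent_verbatim": True,  # disclosure matched the required script exactly
}

# Canonical, sorted serialization so the record hashes deterministically.
print(json.dumps(turn_evidence, sort_keys=True))
```

Keeping each record flat and canonically serialized is what makes it hashable and therefore signable downstream.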

Provider-agnostic replay

Run one scenario across every stack and compare outcomes against the same scoring grammar.

run replay appointment-booking-v3 --providers realtime,11labs,deepgram --assertions policy.json
00:06.017 · price_quote_mismatch · confidence delta -37.4 · receipt signed · action blocked
attestation receipt · scenario: appointment-booking-v3
00:01.238 · caller_intent_detected · 4f8b9a1e · verified
00:03.402 · disclosure_script_present · 9ceaf302 · verified
00:06.017 · price_quote_mismatch · 77d5cb0f · flagged

sha256: 9f1c2b40e7027d9f / signed / immutable
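The `policy.json` passed via `--assertions` is not shown above. One hypothetical shape, assuming a flat list of checks with required outcomes (keys are illustrative, not VoiceAttest's actual format):

```json
{
  "policy": "appointment-booking.v3",
  "assertions": [
    { "id": "caller_intent_detected",    "require": "verified" },
    { "id": "disclosure_script_present", "require": "verified", "verbatim": true },
    { "id": "price_quote_mismatch",      "on_flag": "block_action" }
  ]
}
```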

Verification pipeline

01 · Capture turn-level event stream + audio features

02 · Normalize providers into a single scenario schema

03 · Score each check against policy assertions

04 · Write immutable attestation receipt per interaction
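The four stages above can be sketched end to end. This is a minimal toy, assuming a simplified event model; the function names, field names, and scoring logic are illustrative, not the actual VoiceAttest API:

```python
import hashlib
import json

def normalize(provider_events):
    """02: map provider-specific events onto a single scenario schema."""
    return [
        {"t_ms": e["time_ms"], "kind": e["type"], "text": e.get("text", "")}
        for e in sorted(provider_events, key=lambda e: e["time_ms"])
    ]

def score(events, assertions):
    """03: evaluate each policy assertion against the normalized stream."""
    results = []
    for a in assertions:
        hit = any(a["must_contain"] in e["text"] for e in events)
        results.append({"check": a["id"], "verdict": "verified" if hit else "flagged"})
    return results

def write_receipt(scenario_id, events, results):
    """04: hash the evidence into a deterministic, signable receipt."""
    payload = json.dumps(
        {"scenario": scenario_id, "events": events, "results": results},
        sort_keys=True, separators=(",", ":"),
    )
    return {"sha256": hashlib.sha256(payload.encode()).hexdigest(), "payload": payload}

# 01: captured turn-level event stream (toy data)
captured = [
    {"time_ms": 1238, "type": "transcript", "text": "I'd like to book an appointment"},
    {"time_ms": 3402, "type": "transcript", "text": "This call may be recorded"},
]
assertions = [{"id": "disclosure_script_present", "must_contain": "may be recorded"}]

events = normalize(captured)
receipt = write_receipt("appointment-booking-v3", events, score(events, assertions))
print(receipt["sha256"])
```

Because the payload is canonically serialized before hashing, the same interaction always yields the same digest, which is what makes the receipt reproducible across replays.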

Model + provider agnostic

One scenario file. Same scoring semantics. Any stack.

scenario_id=appointment-booking-v3 // adapter=openai,anthropic,custom-sip // evidence=portable
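A scenario file of that kind might look like the following sketch. The keys and structure are hypothetical, not VoiceAttest's actual format:

```yaml
# Hypothetical scenario file; keys are illustrative.
scenario_id: appointment-booking-v3
adapters: [openai, anthropic, custom-sip]
policy: appointment-booking.v3
turns:
  - caller: "I'd like to book an appointment for Tuesday."
    expect:
      - check: caller_intent_detected
  - agent_must_say: "This call may be recorded"
    expect:
      - check: disclosure_script_present
```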

Enterprise proof layer

Built for teams that answer to regulators, auditors, and boards.

Chain of custody

Each attestation keeps event lineage from raw interaction to final verdict.

Policy versioning

Scorecards are tied to explicit policy revisions, not shifting prompt snapshots.

Incident replay

Re-run historical scenarios against new models before any production release.

Signing authority


Harshil Shah

Building cryptographic attestation primitives for the voice AI stack. Previously [TODO].

First-batch access

Get the specification and sandbox key.

We are onboarding 25 teams. Expect a response within 48 hours.