Policy-Auction
Regulations such as the EU AI Act require vendors to demonstrate that prohibited practices are not in use, but the underlying model weights are typically trade secrets. Policy-Auction uses zero-knowledge proofs so vendors can demonstrate compliance without disclosing weights, and runtime drift can be continuously verified by external probes.
Posture: 🔵 Blue Team (defensive) · Status: alpha (moonshot)
What it does
The gap between trade secrecy and regulatory oversight is long-standing. The EU AI Act's Article 5 lists prohibited AI practices – subliminal manipulation, exploitation of vulnerabilities, social scoring by public authorities, real-time remote biometric identification in public spaces. Vendors are unwilling to disclose model weights to prove the absence of prohibited behavior; regulators are unwilling to accept self-attestation. Policy-Auction provides a four-step protocol intended to bridge that gap.
The regulator publishes a policy as a Nuclei probe-pack carrying the canonical inputs the proof must bind to and a tolerance band for runtime drift. Vendors bid to attest compliance, each bid carrying a zero-knowledge proof that the vendor's model satisfies the prohibited-use predicates without disclosing weights. The vendor signs the attestation publicly – policy id, public inputs, and proof bytes (opaque to the Bureau by design). Dragnet continuously verifies that the vendor's deployed model still matches the signed attestation, and Policy-Auction emits a proof when runtime divergence exceeds the policy's signed tolerance.
Who would use it
- An AI vendor (OpenAI, Anthropic, Mistral, a startup) selling into the EU market who needs a verifiable Article 5 compliance attestation without revealing weights.
- A national regulator (the EU AI Office, the UK FCA, Singapore IMDA) publishing a machine-checkable compliance policy that vendors can prove against.
- A procurement officer at a hospital, government, or bank evaluating which AI vendors have signed attestations against the policies they care about.
- A civil-society auditor (Algorithmic Justice League, Mozilla, AI Now) cross-checking vendor compliance claims against runtime drift reports.
What you'll need
- The Pluck CLI installed (`npm i -g @sizls/pluck-cli`).
- For real deployment: a ZK-SNARK toolchain – circom (a circuit DSL), snarkjs (a JavaScript Groth16 prover/verifier), or halo2 (a Rust framework used by Zcash) – and a circuit that encodes the regulator's policy. Designing such circuits for arbitrary model behavior is research-required today.
Step-by-step
```shell
pluck bureau policy-auction demo
```
The demo publishes a synthetic EU AI Act Art. 5 probe-pack, three vendor bids (one honest, one fraudulent, one drifting), three compliance attestations, and one Dragnet drift report. The honest vendor's proof binds correctly. The fraudulent vendor's proof prefix doesn't match – zk-verify-fail fires. The drifting vendor's proof binds, but its observed runtime divergence (0.18) exceeds the 0.05 tolerance band – drift-detected fires.
```text
policy-auction/demo: ingesting 1 PolicyProbePack + 3 VendorBids + 3 ZkComplianceAttestations + 1 DriftViolation -> 2 PolicyAuctionProofs.
[Bureau/POLICY-AUCTION] proof=1e6ae2ec… kind=zk-verify-fail
[Bureau/POLICY-AUCTION] proof=6d088e28… kind=drift-detected
```
Production CLI (publish for regulators, bid and attest for vendors, verify for auditors) lands in a follow-up.
Run it yourself
Drop this into a Node 18+ project (npm install @sizls/pluck-bureau-policy-auction @sizls/pluck-bureau-core tsx):
```ts
// index.ts
import { createHash } from "node:crypto";
import {
  createPolicyAuctionSystem,
  fingerprintPrivateKey,
  signCanonicalBody,
} from "@sizls/pluck-bureau-policy-auction";
import { generateOperatorKey } from "@sizls/pluck-bureau-core";

const sha256 = (s: string) => createHash("sha256").update(s).digest("hex");

// Let the system's async fact pipeline settle before reading proofs.
const flush = (n = 60) =>
  new Promise<void>((r) => {
    let i = 0;
    const tick = () => (++i >= n ? r() : setImmediate(tick));
    setImmediate(tick);
  });

async function main() {
  const op = generateOperatorKey();
  const opFp = fingerprintPrivateKey(op.privateKeyPem);
  const vendor = generateOperatorKey();
  const vendorFp = fingerprintPrivateKey(vendor.privateKeyPem);

  const system = createPolicyAuctionSystem({
    signingKey: op.privateKeyPem,
    disablePausePoll: true,
    disableLogging: true,
  });

  // A drift report against a fictional attestation -> drift-detected fires.
  const driftBody = {
    schemaVersion: 1 as const,
    attestationId: sha256("attestation:drifting"),
    vendorFingerprint: vendorFp,
    operatorFingerprint: opFp,
    observedAt: "2026-04-15T10:00:00.000Z",
    divergenceFraction: 0.18,
  };
  const driftId = sha256(JSON.stringify(driftBody));
  const { signature } = signCanonicalBody({ ...driftBody, driftId }, op.privateKeyPem);

  try {
    system.reportDrift({ ...driftBody, driftId, signature });
    await flush();
    for (const p of system.facts.proofs()) {
      console.log(`proof kind=${p.kind} id=${p.proofId.slice(0, 16)}…`);
    }
  } finally {
    await system.shutdown();
  }
}

main().catch((err) => { console.error(err); process.exit(1); });
```
Run with `tsx index.ts`. Expected output:

```text
proof kind=drift-detected id=…
```
What you get
A signed ZkAttestation envelope containing the policy id, regulator's public inputs, the proof bytes (opaque), and the proof digest – all bound to the vendor's signing key and Rekor-anchored.
Three classes of red-team proof when the chain breaks:
- `zk-verify-fail` – the vendor's proof doesn't validate against the policy probe-pack's public inputs. Fail-closed: the regulator cannot accept compliance.
- `drift-detected` – Dragnet observes the vendor's deployed model diverging from its signed attestation by more than the operator-supplied tolerance band.
- `vendor-equivocation` – the same vendor signed two attestations on the same policy id with contradictory public inputs.
What it can't do
- The ZK-SNARK math is STUBBED in the alpha. `verifyZkProof()` is a deterministic prefix-check (proof bytes start with the sha256 of the public inputs). It is NOT a security boundary. Do not deploy against adversarial vendors. Real circom / snarkjs / halo2 integration over model-behavior circuits lands post-alpha.
- ZK over arbitrary model behavior is itself a research frontier. For narrow predicates ("this output never names a person from list L") workable circuits exist today; for the full Article 5 vocabulary, circuit design is open research.
- The auction half is a protocol, not a marketplace. No escrow, no settlement, no auction-clearing – just the signed-shape primitives a real auction would need.
- Cross-jurisdiction policy-pack vocabulary deferred – US NIST AI-RMF, UK FCA, Singapore IMDA.
A real-world example
In May 2027, the EU AI Office publishes its Article 5 probe-pack. A French model-as-a-service startup, Vendor A, builds a circom circuit over its model proving that the model's output distribution is bounded away from the prohibited-use vocabulary. They sign and publish the proof. A US-based competitor, Vendor B, forges a proof – copying Vendor A's bytes with the prefix swapped – and submits. Pluck's verifier rejects Vendor B's submission immediately: zk-verify-fail. Vendor B's submission becomes a public Rekor entry of fraud. Six months later, Vendor A's deployed model drifts after a re-finetune for a corporate customer. Dragnet observes the drift. Policy-Auction emits drift-detected. The EU AI Office's procurement system auto-removes Vendor A from the approved list pending re-attestation. The auditor's job – for both vendors – is now reading a cassette, not negotiating an NDA.
For developers
Predicate URIs
| URI | What it attests |
|---|---|
| `https://pluck.run/PolicyAuction.ProbePack/v1` | Regulator R published policy P with public inputs I and tolerance band B. |
| `https://pluck.run/PolicyAuction.VendorBid/v1` | Vendor V bid to attest compliance with policy P, proof digest D. |
| `https://pluck.run/PolicyAuction.ZkAttestation/v1` | Vendor V's signed compliance attestation for policy P. |
| `https://pluck.run/PolicyAuction.Drift/v1` | Operator O observed runtime drift fraction X for vendor V's attestation A. |
| `https://pluck.run/PolicyAuction.Proof/v1` | Class: `zk-verify-fail` \| `drift-detected` \| `vendor-equivocation`. |
The signed body never carries vendor model weights – that is the whole point. Only the proof bytes (opaque), public inputs (regulator-published), and digest fingerprints appear.
Programs composed
- Nuclei – regulators publish policies as probe-packs.
- Dragnet – runtime drift probing.
- Oath – vendor compliance commitments.
- Custody – cryptographic audit-trail anchoring.
- Pluck core's DSSE in-toto envelopes + Sigstore Rekor client.
Threat model + adversary
The attacker is a vendor under regulatory pressure who wants a clean compliance signal without disclosing weights. Their levers: faking the proof (caught by zk-verify-fail once real ZK math is plumbed), drifting the production model (caught by drift-detected), equivocating (caught by vendor-equivocation). The protocol's weakness today is that real ZK math is research-required for the full Article 5 vocabulary.
What's stubbed (alpha – moonshot)
- Real ZK-SNARK math is stubbed – alpha verifier is a deterministic prefix-check, not a security boundary.
- ZK circuits for arbitrary model behavior are research-required.
- Auction marketplace (escrow, settlement, clearing) is protocol-only.
- Sigstore Rekor `notarize` integration stubbed.
- Cross-jurisdiction policy-pack vocabulary deferred.
Verify a published cassette
```shell
pluck bureau verify <bundle-dir>
cosign verify-blob-attestation --key <pubkey.pem> --signature <sig> \
  --type https://pluck.run/PolicyAuction.ZkAttestation/v1 <body.json>
```
See also
- Bureau Foundations
- Threat Model
- Verify a dossier
- Nuclei – declarative compliance probe-packs
- Dragnet – runtime drift detection
- Oath – vendor compliance commitments
- EU AI Act Article 5