Docs · Verify
Merkle V1 spec.
Normative storage attestation algorithm.
This page defines the public Nukez Merkle tree algorithm used for storage attestations. Third-party verifiers can use it to recompute roots, validate per-file inclusion proofs, and compare those values against verification bundles and on-chain anchors.
Machine-readable copy: /specs/nukez-merkle-v1.json.
Identity
Spec identity and field formats.
| Name | nukez-merkle-v1 |
|---|---|
| Schema version | 1.0 |
| Hash | SHA-256 |
| Encoding | UTF-8 for all string material before hashing |
| Digest format | Lowercase 64-character hexadecimal |
| Published root | sha256:<64 lowercase hex> |
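The digest and published-root formats above can be sanity-checked mechanically. The sketch below is illustrative only; the regex and helper name are not part of the spec.

```python
import re

# Matches the published format: "sha256:" plus exactly 64 lowercase hex chars.
DIGEST_RE = re.compile(r"^sha256:[0-9a-f]{64}$")

def is_valid_published_digest(value):
    return DIGEST_RE.fullmatch(value) is not None
```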
Input
File entry schema.
{
"filename": "report.pdf",
"size_bytes": 12048,
"content_hash": "sha256:<64 lowercase hex>"
}
The content_hash may be bare hex or prefixed with sha256:. The prefix is stripped before Merkle leaf hashing. For byte-level file verification, SHA-256 the raw downloaded bytes and compare that digest to the file entry's content_hash.
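The byte-level check described above can be sketched as follows. The helper name and sample entry are illustrative, not part of the spec.

```python
import hashlib

def verify_file_bytes(raw_bytes, entry):
    # Hash the raw downloaded bytes and compare against the manifest's
    # content_hash, tolerating an optional "sha256:" prefix.
    expected = entry["content_hash"]
    if expected.startswith("sha256:"):
        expected = expected[len("sha256:"):]
    return hashlib.sha256(raw_bytes).hexdigest() == expected
```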
Tree math
Root construction.
- Sort file entries lexicographically by filename.
- Compute each leaf as SHA-256 over filename:size_bytes:content_hash.
- Use the content hash without the sha256: prefix in the leaf string.
- Pair nodes left-to-right at every level.
- Parent hash is SHA-256 over concatenated hex strings: left_hex + right_hex.
- If a level has an odd node count, duplicate the final node.
- A single-file tree root is the file's leaf hash.
- Empty file lists are invalid for Nukez attestations.
Python
Reference implementation.
import hashlib
import json

def strip_sha256_prefix(value):
    return value[7:] if value.startswith("sha256:") else value

def leaf_hash(entry):
    content_hash = strip_sha256_prefix(entry["content_hash"])
    material = f'{entry["filename"]}:{entry["size_bytes"]}:{content_hash}'
    return hashlib.sha256(material.encode("utf-8")).hexdigest()

def build_merkle_root(files):
    if not files:
        raise ValueError("empty file lists are invalid for Nukez attestations")
    level = [leaf_hash(entry) for entry in sorted(files, key=lambda e: e["filename"])]
    while len(level) > 1:
        next_level = []
        for i in range(0, len(level), 2):
            left = level[i]
            right = level[i + 1] if i + 1 < len(level) else level[i]
            next_level.append(hashlib.sha256((left + right).encode("utf-8")).hexdigest())
        level = next_level
    return "sha256:" + level[0]

def result_hash(locker_id, files):
    summary = {
        "locker_id": locker_id,
        "files": sorted(
            [
                {
                    "filename": entry["filename"],
                    "size_bytes": entry["size_bytes"],
                    "content_hash": entry["content_hash"],
                }
                for entry in files
            ],
            key=lambda e: e["filename"],
        ),
    }
    canonical = json.dumps(summary, separators=(",", ":"), sort_keys=True, ensure_ascii=False)
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def att_code_from_hash(result_hash_value):
    raw = strip_sha256_prefix(result_hash_value)
    return int(raw[:12], 16) % 1_000_000_000
JavaScript
Reference implementation.
async function sha256Hex(text) {
  const bytes = new TextEncoder().encode(text);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return [...new Uint8Array(digest)]
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

function stripSha256Prefix(value) {
  return value.startsWith("sha256:") ? value.slice(7) : value;
}

async function leafHash(entry) {
  const contentHash = stripSha256Prefix(entry.content_hash);
  return sha256Hex(`${entry.filename}:${entry.size_bytes}:${contentHash}`);
}

export async function buildMerkleRoot(files) {
  if (!files.length) {
    throw new Error("empty file lists are invalid for Nukez attestations");
  }
  let level = await Promise.all(
    [...files]
      .sort((a, b) => (a.filename < b.filename ? -1 : a.filename > b.filename ? 1 : 0))
      .map((entry) => leafHash(entry)),
  );
  while (level.length > 1) {
    const next = [];
    for (let i = 0; i < level.length; i += 2) {
      const left = level[i];
      const right = level[i + 1] ?? level[i];
      next.push(await sha256Hex(left + right));
    }
    level = next;
  }
  return `sha256:${level[0]}`;
}
Result hash
Manifest summary hash.
Nukez also publishes result_hash, which is SHA-256 over canonical JSON of the locker manifest summary. The canonical JSON is compact, sorted by key, and encoded with UTF-8.
manifest_summary = {
    "locker_id": locker_id,
    "files": sorted(
        [
            {
                "filename": entry["filename"],
                "size_bytes": entry["size_bytes"],
                "content_hash": entry["content_hash"]
            }
            for entry in files
        ],
        key=lambda e: e["filename"]
    )
}
canonical = json.dumps(
    manifest_summary,
    separators=(",", ":"),
    sort_keys=True,
    ensure_ascii=False
)
result_hash = "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()
att_code is derived from result_hash, not directly from merkle_root. It is a compact oracle and display value, not the security primitive.
att_code = int(result_hash.removeprefix("sha256:")[:12], 16) % 1_000_000_000
Inclusion proof
Per-file proof verification.
A Merkle proof contains the target leaf and ordered sibling steps. Each step has a sibling hash and whether that sibling is to the left or right of the current hash at that level.
current = leaf_hash(file_entry)
for step in proof:
    sibling = strip_sha256_prefix(step["hash"])
    if step["position"] == "left":
        current = hashlib.sha256((sibling + current).encode("utf-8")).hexdigest()
    else:
        current = hashlib.sha256((current + sibling).encode("utf-8")).hexdigest()
assert "sha256:" + current == merkle_root
Attestation object
Public fields.
{
"receipt_id": "string",
"locker_id": "string",
"result_hash": "sha256:<64 hex>",
"merkle_root": "sha256:<64 hex>",
"manifest_signature": "string",
"file_count": 3,
"total_bytes": 48291,
"files": [{ "filename": "...", "size_bytes": 0, "content_hash": "sha256:..." }],
"attestation_status": "pending | computed | complete",
"attested_at": "2026-05-11T00:00:00Z",
"schema_version": "1.0",
"switchboard_slot": null,
"switchboard_tx": null,
"switchboard_feed": null
}
Endpoints
Public verification surfaces.
| Bundle | GET /v1/storage/verification-bundle?receipt_id={receipt_id} |
|---|---|
| Inclusion proof | GET /v1/storage/merkle-proof?receipt_id={receipt_id}&filename={filename} |
| Storage verify | GET /v1/storage/verify?receipt_id={receipt_id} or POST /v1/storage/verify |
| Attestation code | GET /v1/attest-code?receipt_id={receipt_id} |
| Receipt | GET /v1/receipts/{receipt_id} |
| Receipt hash | GET /v1/receipts/{receipt_id}/verify |
Anchors
On-chain verification fields.
Solana/Switchboard writes the compact att_code to the PullFeed and records the full attestation metadata in an SPL Memo: schema, receipt_id, merkle_root, file_count, and attested_at.
Monad stores Merkle attestations keyed by bytes16(keccak256(receipt_id_string)). The read path returns merkleRoot, solanaSlot, fileCount, and attestedAt.
Test vector
Three-file odd-node tree.
files = [
{"filename": "a.txt", "size_bytes": 3, "content_hash": "sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"},
{"filename": "b.txt", "size_bytes": 5, "content_hash": "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"},
{"filename": "c.txt", "size_bytes": 7, "content_hash": "cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc"}
]
merkle_root = "sha256:a80128f3298c7b6bf0b894576066d61a1e270d8bf4638d01ddd6d8e626f45528"
proof_for_b_txt = [
{"hash": "91481cbebb6c2f6438ed263b130212193ef908a9864c2b9b77d511bd07072879", "position": "left"},
{"hash": "539d42382ade0da0fe370b9f86b80739b31db6f06ac8a482ef1f7390251f6262", "position": "right"}
]
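The vector's proof shape can be checked end-to-end by rebuilding the three-leaf tree locally: level 0 sorts to [leaf_a, leaf_b, leaf_c], the odd node leaf_c is duplicated, and the proof for b.txt is its left sibling leaf_a followed by the duplicated c-pair parent on the right. The sketch below recomputes everything from the vector's entries; if the implementation is faithful to this spec, the resulting root can be compared against the published merkle_root above.

```python
import hashlib

# The three-file test vector from above.
files = [
    {"filename": "a.txt", "size_bytes": 3, "content_hash": "sha256:" + "a" * 64},
    {"filename": "b.txt", "size_bytes": 5, "content_hash": "b" * 64},
    {"filename": "c.txt", "size_bytes": 7, "content_hash": "c" * 64},
]

def leaf(entry):
    content = entry["content_hash"].removeprefix("sha256:")
    material = f'{entry["filename"]}:{entry["size_bytes"]}:{content}'
    return hashlib.sha256(material.encode("utf-8")).hexdigest()

def parent(left, right):
    return hashlib.sha256((left + right).encode("utf-8")).hexdigest()

# Level 0 (already sorted by filename): [leaf_a, leaf_b, leaf_c].
# (leaf_a, leaf_b) pair up; the odd node leaf_c is duplicated.
leaves = [leaf(e) for e in files]
root = parent(parent(leaves[0], leaves[1]), parent(leaves[2], leaves[2]))

# Proof for b.txt: leaf_a is the left sibling at level 0, then the
# duplicated c-pair parent is the right sibling at level 1.
proof = [
    {"hash": leaves[0], "position": "left"},
    {"hash": parent(leaves[2], leaves[2]), "position": "right"},
]
current = leaves[1]
for step in proof:
    if step["position"] == "left":
        current = parent(step["hash"], current)
    else:
        current = parent(current, step["hash"])
```

Walking the proof should land on the same value as the independently computed root.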