feat(aggregation_mode): aggregate proofs in chunks #1896

Open
wants to merge 44 commits into base: staging from feat/aggregate-proofs-in-chunks
Changes from all commits (44 commits):
3200e0c
feat: risc0 aggregation program
MarcosNicolau Apr 15, 2025
2f54502
feat: [wip] risc0 aggregator backend
MarcosNicolau Apr 15, 2025
62bfb60
feat: ProofAggregationService risc0 verification
MarcosNicolau Apr 15, 2025
c475a63
feat: error handling + fetch risc0 proofs
MarcosNicolau Apr 15, 2025
e2dcf30
feat: send risc0 aggregated proof to AlignedProofAggregationService c…
MarcosNicolau Apr 15, 2025
1c4440b
feat: deploy AlignedProofAggregationService contract with risc0 verifier
MarcosNicolau Apr 16, 2025
df1d83a
feat: allow risc0 succinct proofs
MarcosNicolau Apr 16, 2025
4bc1a52
chore: modified risc0 no_pub_inputs to generate a succinct proof inst…
MarcosNicolau Apr 16, 2025
56a9d87
feat: fetch proofs based on zkvm engine
MarcosNicolau Apr 16, 2025
6c544d0
chore: commands to start proof agregator based on verifier
MarcosNicolau Apr 16, 2025
b137020
fix: compute image id bytes in little endian not in be
MarcosNicolau Apr 21, 2025
d0a4be6
feat: local verification of aggregated risc0 proof
MarcosNicolau Apr 21, 2025
8309ea8
Merge branch 'staging' into feat/aggregation-mode-risc0
MarcosNicolau Apr 21, 2025
369cdc6
chore: add gpu feature flag to run with cuda
MarcosNicolau Apr 21, 2025
7720553
docs: add gpu command
MarcosNicolau Apr 21, 2025
ebdc910
chore: undo no_pub_input file changes
MarcosNicolau Apr 21, 2025
1b8873a
ci: fix contracts build + install risc0 toolchain
MarcosNicolau Apr 22, 2025
01f60f4
chore: better proof aggregator makefile targets
MarcosNicolau Apr 22, 2025
20b7d61
docs: update how to run proof aggregator readme
MarcosNicolau Apr 22, 2025
454f80c
ci: skip build on clippy
MarcosNicolau Apr 22, 2025
5e43630
chore: address clippy warnings
MarcosNicolau Apr 22, 2025
79eb6b3
fix: correctly setting risc0 tiny-keccak patch
MarcosNicolau Apr 23, 2025
bd5ad9d
chore: rename prove feature to proving to prevent collision with risc…
MarcosNicolau Apr 23, 2025
ef2b968
Revert "chore: rename prove feature to proving to prevent collision w…
MarcosNicolau Apr 23, 2025
319c112
chore: address juli's comments
MarcosNicolau Apr 23, 2025
ac646c6
refactor: replace sp1,risc0 features for .env variables
MarcosNicolau Apr 23, 2025
74f15be
chore: update makefile for new aggregator env config
MarcosNicolau Apr 23, 2025
b63e875
refactor: aggregation mode reduce amount of types and abstractions
MarcosNicolau Apr 24, 2025
02987a4
Merge branch 'staging' into refactor/agg-mode-redundancies
MarcosNicolau Apr 24, 2025
4059b2c
feat: aggregate proofs in chunks of 512 proofs
MarcosNicolau Apr 25, 2025
e39ed67
fix: aggregated proofs commitment for the second layer
MarcosNicolau Apr 25, 2025
eae61ba
chore: add comments for aggregation function
MarcosNicolau Apr 25, 2025
e957da2
chore: add tracing for chunk aggregation
MarcosNicolau Apr 25, 2025
618688f
Merge branch 'staging' into feat/aggregate-proofs-in-chunks
MarcosNicolau Apr 28, 2025
4f778ad
Merge branch 'staging' into feat/aggregate-proofs-in-chunks
MarcosNicolau Apr 30, 2025
5afbf67
feat: split aggregation in two programs
MarcosNicolau Apr 30, 2025
4487597
Merge and fix sp1
MauroToscano May 13, 2025
a095608
feat: update risc0 programs to correctly compute the merkle tree
MarcosNicolau May 15, 2025
e20cffc
feat: update sp1 programs to correctly compute the merkle tree
MarcosNicolau May 15, 2025
5335d2a
feat: aggregator backends for new programs
MarcosNicolau May 15, 2025
1bea52d
feat: update write_program_image_id_vk_hash with new programs
MarcosNicolau May 15, 2025
68418b6
chore: build program and update chunk aggregator program id in root a…
MarcosNicolau May 15, 2025
4bb095b
Merge remote-tracking branch 'origin/staging' into feat/aggregate-pro…
MarcosNicolau May 15, 2025
8382808
chore: update anvil state
MarcosNicolau May 15, 2025
1 change: 0 additions & 1 deletion aggregation_mode/aggregation_programs/Cargo.toml
@@ -8,4 +8,3 @@ members = ["sp1", "risc0"]
[patch.crates-io]
# Adding RISC Zero keccak precompile support
tiny-keccak = { git = "https://github.com/risc0/tiny-keccak", tag = "tiny-keccak/v2.0.2-risczero.0" }

8 changes: 6 additions & 2 deletions aggregation_mode/aggregation_programs/risc0/Cargo.toml
@@ -16,5 +16,9 @@ lambdaworks-crypto = { git = "https://github.com/lambdaclass/lambdaworks.git", r
path = "./src/lib.rs"

[[bin]]
name = "risc0_aggregator_program"
path = "./src/main.rs"
name = "risc0_chunk_aggregator_program"
path = "./src/chunk_aggregator_main.rs"

[[bin]]
name = "risc0_root_aggregator_program"
path = "./src/root_aggregator_main.rs"
aggregation_mode/aggregation_programs/risc0/src/chunk_aggregator_main.rs (new file)
@@ -0,0 +1,22 @@
#![no_main]

use lambdaworks_crypto::merkle_tree::merkle::MerkleTree;
use risc0_aggregation_program::{ChunkAggregatorInput, Risc0ImageIdAndPubInputs};
use risc0_zkvm::guest::env;

risc0_zkvm::guest::entry!(main);

fn main() {
    let input = env::read::<ChunkAggregatorInput>();

    for proof in &input.proofs_image_id_and_pub_inputs {
        env::verify(proof.image_id.clone(), &proof.public_inputs)
            .expect("proof to be verified correctly");
    }

    let merkle_tree =
        MerkleTree::<Risc0ImageIdAndPubInputs>::build(&input.proofs_image_id_and_pub_inputs)
            .unwrap();

    env::commit_slice(&merkle_tree.root);
}
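Editorial note, not part of the diff: in the zkVM, `env::verify` is resolved against receipts the host supplies as assumptions, so the aggregator backend has to attach every receipt it aggregates. A minimal host-side sketch under that assumption, with a hypothetical helper name and the recent risc0 `default_prover` API:

```rust
use anyhow::Result;
use risc0_zkvm::{default_prover, ExecutorEnv, Receipt};

// Hypothetical helper: prove one chunk. Each aggregated receipt is added as an
// assumption so the guest's env::verify calls can be resolved.
// `ChunkAggregatorInput` is the input type defined in this PR's lib.rs.
fn prove_chunk(
    elf: &[u8],
    input: &ChunkAggregatorInput,
    receipts: Vec<Receipt>,
) -> Result<Receipt> {
    let mut builder = ExecutorEnv::builder();
    for receipt in receipts {
        builder.add_assumption(receipt);
    }
    let env = builder.write(input)?.build()?;
    // The resulting receipt's journal is the chunk's merkle root.
    Ok(default_prover().prove(env, elf)?.receipt)
}
```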
28 changes: 27 additions & 1 deletion aggregation_mode/aggregation_programs/risc0/src/lib.rs
@@ -52,6 +52,32 @@ impl IsMerkleTreeBackend for Risc0ImageIdAndPubInputs {
}

#[derive(Serialize, Deserialize)]
pub struct Input {
pub struct Hash32(pub [u8; 32]);

impl IsMerkleTreeBackend for Hash32 {
    type Data = Hash32;
    type Node = [u8; 32];

    fn hash_data(leaf: &Self::Data) -> Self::Node {
        leaf.0
    }

    fn hash_new_parent(child_1: &Self::Node, child_2: &Self::Node) -> Self::Node {
        let mut hasher = Keccak::v256();
        hasher.update(child_1);
        hasher.update(child_2);
        let mut hash = [0u8; 32];
        hasher.finalize(&mut hash);
        hash
    }
}

#[derive(Serialize, Deserialize)]
pub struct ChunkAggregatorInput {
    pub proofs_image_id_and_pub_inputs: Vec<Risc0ImageIdAndPubInputs>,
}

#[derive(Serialize, Deserialize)]
pub struct RootAggregatorInput {
    pub proofs_and_leaves_commitment: Vec<(Risc0ImageIdAndPubInputs, Vec<[u8; 32]>)>,
}
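Since `Hash32` leaves are already 32-byte hashes, tree nodes can be recomputed outside the guest with plain tiny-keccak. A small self-contained sketch mirroring `hash_new_parent` above (useful, for example, to check a chunk root off-circuit):

```rust
use tiny_keccak::{Hasher, Keccak};

// Keccak-256 over the concatenation of two child nodes, exactly as in
// Hash32::hash_new_parent.
fn keccak_parent(child_1: &[u8; 32], child_2: &[u8; 32]) -> [u8; 32] {
    let mut hasher = Keccak::v256();
    hasher.update(child_1);
    hasher.update(child_2);
    let mut hash = [0u8; 32];
    hasher.finalize(&mut hash);
    hash
}

fn main() {
    let parent = keccak_parent(&[1u8; 32], &[2u8; 32]);
    println!("{:02x?}", parent);
}
```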
24 changes: 0 additions & 24 deletions aggregation_mode/aggregation_programs/risc0/src/main.rs

This file was deleted.

aggregation_mode/aggregation_programs/risc0/src/root_aggregator_main.rs (new file)
@@ -0,0 +1,47 @@
#![no_main]

use lambdaworks_crypto::merkle_tree::merkle::MerkleTree;
use risc0_aggregation_program::{Hash32, RootAggregatorInput};
use risc0_zkvm::guest::env;

risc0_zkvm::guest::entry!(main);

// Generated with `make agg_mode_write_program_ids` and copied from program_ids.json
pub const CHUNK_AGGREGATOR_PROGRAM_IMAGE_ID: [u8; 32] = [
    200, 155, 105, 236, 200, 48, 124, 101, 178, 175, 101, 213, 67, 76, 52, 119, 110, 9, 50, 215,
    92, 126, 5, 172, 211, 193, 88, 83, 150, 62, 51, 74,
];

fn main() {
    let input = env::read::<RootAggregatorInput>();

    let mut leaves: Vec<Hash32> = vec![];

    for (proof, leaves_commitment) in input.proofs_and_leaves_commitment {
        let image_id = proof.image_id;

        // Ensure the aggregated chunk originates from the first-layer (chunk)
        // aggregation program. This validation step guarantees that the proof
        // was genuinely verified by that program. Without this check, a
        // different program committing the same public inputs could bypass
        // verification.
        assert!(image_id == CHUNK_AGGREGATOR_PROGRAM_IMAGE_ID);

        // Ensure the committed root matches the root of the provided leaves
        let merkle_root: [u8; 32] = proof
            .public_inputs
            .clone()
            .try_into()
            .expect("Public input to be the chunk merkle root");

        let leaves_commitment: Vec<Hash32> =
            leaves_commitment.into_iter().map(|el| Hash32(el)).collect();
        let merkle_tree = MerkleTree::<Hash32>::build(&leaves_commitment).unwrap();
        assert!(merkle_root == merkle_tree.root);

        leaves.extend(leaves_commitment);

        // Finally, verify the proof
        env::verify(image_id, &proof.public_inputs).expect("proof to be verified correctly");
    }

    let merkle_tree = MerkleTree::<Hash32>::build(&leaves).unwrap();

    env::commit_slice(&merkle_tree.root);
}
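A consequence of this design worth noting: the final commitment is a tree over all chunk leaves rather than over chunk roots, so a user can check that their proof commitment made it into the aggregation with one uniform merkle path. A sketch of that client-side check, assuming bottom-up sibling ordering with the leaf index choosing the side (hypothetical; the repo's actual verification code may differ):

```rust
use tiny_keccak::{Hasher, Keccak};

fn keccak_parent(left: &[u8; 32], right: &[u8; 32]) -> [u8; 32] {
    let mut hasher = Keccak::v256();
    hasher.update(left);
    hasher.update(right);
    let mut out = [0u8; 32];
    hasher.finalize(&mut out);
    out
}

// Walk a merkle path from a leaf commitment up to the aggregated root.
fn verify_inclusion(root: &[u8; 32], leaf: [u8; 32], mut index: usize, path: &[[u8; 32]]) -> bool {
    let mut node = leaf;
    for sibling in path {
        node = if index % 2 == 0 {
            keccak_parent(&node, sibling) // even index: node is the left child
        } else {
            keccak_parent(sibling, &node) // odd index: node is the right child
        };
        index /= 2;
    }
    &node == root
}
```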
8 changes: 6 additions & 2 deletions aggregation_mode/aggregation_programs/sp1/Cargo.toml
@@ -16,5 +16,9 @@ lambdaworks-crypto = { git = "https://github.com/lambdaclass/lambdaworks.git", r
path = "./src/lib.rs"

[[bin]]
name = "sp1_aggregator_program"
path = "./src/main.rs"
name = "sp1_chunk_aggregator_program"
path = "./src/chunk_aggregator_main.rs"

[[bin]]
name = "sp1_root_aggregator_program"
path = "./src/root_aggregator_main.rs"
aggregation_mode/aggregation_programs/sp1/src/chunk_aggregator_main.rs
@@ -3,22 +3,22 @@ sp1_zkvm::entrypoint!(main);

use lambdaworks_crypto::merkle_tree::merkle::MerkleTree;
use sha2::{Digest, Sha256};
use sha3::Keccak256;
use sp1_aggregation_program::{Input, SP1VkAndPubInputs};
use sp1_aggregation_program::{ChunkAggregatorInput, SP1VkAndPubInputs};

pub fn main() {
let input = sp1_zkvm::io::read::<Input>();
let input = sp1_zkvm::io::read::<ChunkAggregatorInput>();

// Verify the proofs.
for proof in input.proofs_vk_and_pub_inputs.iter() {
let vkey = proof.vk;
let public_values = &proof.public_inputs;
let public_values_digest = Sha256::digest(public_values);

sp1_zkvm::lib::verify::verify_sp1_proof(&vkey, &public_values_digest.into());
}

let merkle_tree: MerkleTree<SP1VkAndPubInputs> =
MerkleTree::build(&input.proofs_vk_and_pub_inputs).unwrap();
let merkle_tree =
MerkleTree::<SP1VkAndPubInputs>::build(&input.proofs_vk_and_pub_inputs).unwrap();

sp1_zkvm::io::commit_slice(&merkle_tree.root);
}
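Host-side, `verify_sp1_proof` only works if the backend writes each compressed proof into the program's stdin. A rough sketch modeled on SP1's published aggregation example; treat the exact client API (`ProverClient::new`, `SP1Proof::Compressed`, `write_proof`) as version-dependent assumptions rather than this PR's code:

```rust
use anyhow::Result;
use sp1_sdk::{ProverClient, SP1Proof, SP1ProofWithPublicValues, SP1Stdin, SP1VerifyingKey};

// Hypothetical helper: aggregate a chunk of compressed SP1 proofs.
// `ChunkAggregatorInput` is the input type defined in this PR's lib.rs.
fn prove_chunk(
    elf: &[u8],
    input: &ChunkAggregatorInput,
    proofs: Vec<(SP1ProofWithPublicValues, SP1VerifyingKey)>,
) -> Result<SP1ProofWithPublicValues> {
    let client = ProverClient::new();
    let (pk, _vk) = client.setup(elf);

    let mut stdin = SP1Stdin::new();
    stdin.write(input);
    for (proof, vk) in proofs {
        // Only compressed (recursion-friendly) proofs can be verified in-guest.
        let SP1Proof::Compressed(compressed) = proof.proof else {
            anyhow::bail!("expected a compressed proof");
        };
        stdin.write_proof(*compressed, vk.vk);
    }

    Ok(client.prove(&pk, stdin).compressed().run()?)
}
```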
30 changes: 27 additions & 3 deletions aggregation_mode/aggregation_programs/sp1/src/lib.rs
@@ -9,7 +9,7 @@ pub struct SP1VkAndPubInputs {
}

impl SP1VkAndPubInputs {
    pub fn hash(&self) -> [u8; 32] {
    pub fn commitment(&self) -> [u8; 32] {
        let mut hasher = Keccak256::new();
        for &word in &self.vk {
            hasher.update(word.to_be_bytes());
@@ -34,7 +34,7 @@ impl IsMerkleTreeBackend for SP1VkAndPubInputs {
    type Node = [u8; 32];

    fn hash_data(leaf: &Self::Data) -> Self::Node {
        leaf.hash()
        leaf.commitment()
    }

    fn hash_new_parent(child_1: &Self::Node, child_2: &Self::Node) -> Self::Node {
@@ -46,6 +46,30 @@ }
}

#[derive(Serialize, Deserialize)]
pub struct Input {
pub struct Hash32(pub [u8; 32]);

impl IsMerkleTreeBackend for Hash32 {
    type Data = Hash32;
    type Node = [u8; 32];

    fn hash_data(leaf: &Self::Data) -> Self::Node {
        leaf.0
    }

    fn hash_new_parent(child_1: &Self::Node, child_2: &Self::Node) -> Self::Node {
        let mut hasher = Keccak256::new();
        hasher.update(child_1);
        hasher.update(child_2);
        hasher.finalize().into()
    }
}

#[derive(Serialize, Deserialize)]
pub struct ChunkAggregatorInput {
    pub proofs_vk_and_pub_inputs: Vec<SP1VkAndPubInputs>,
}

#[derive(Serialize, Deserialize)]
pub struct RootAggregatorInput {
    pub proofs_and_leaves_commitment: Vec<(SP1VkAndPubInputs, Vec<[u8; 32]>)>,
}
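For completeness, a user-side sketch of recomputing a leaf commitment. It mirrors `SP1VkAndPubInputs::commitment` above; the elided tail of that function is assumed to hash the public inputs after the big-endian vk words:

```rust
use sha3::{Digest, Keccak256};

// Keccak-256 over the vk words (big-endian) followed by the public inputs,
// assumed to match SP1VkAndPubInputs::commitment.
fn leaf_commitment(vk: &[u32; 8], public_inputs: &[u8]) -> [u8; 32] {
    let mut hasher = Keccak256::new();
    for &word in vk {
        hasher.update(word.to_be_bytes());
    }
    hasher.update(public_inputs);
    hasher.finalize().into()
}
```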
aggregation_mode/aggregation_programs/sp1/src/root_aggregator_main.rs (new file)
@@ -0,0 +1,48 @@
#![no_main]
sp1_zkvm::entrypoint!(main);

use lambdaworks_crypto::merkle_tree::merkle::MerkleTree;
use sha2::{Digest, Sha256};
use sp1_aggregation_program::{Hash32, RootAggregatorInput};

// Generated with `make agg_mode_write_program_ids` and copied from program_ids.json
pub const CHUNK_AGGREGATOR_PROGRAM_VK_HASH: [u32; 8] = [
    899813587, 1068831252, 2000190855, 1210454658, 1057127892, 56287617, 1572193608, 1379769886,
];

pub fn main() {
    let input = sp1_zkvm::io::read::<RootAggregatorInput>();

    let mut leaves = vec![];

    // Verify the proofs.
    for (proof, leaves_commitment) in input.proofs_and_leaves_commitment {
        let vkey = proof.vk;
        let public_values_digest = Sha256::digest(&proof.public_inputs);

        // Ensure the aggregated chunk originates from the first-layer (chunk)
        // aggregation program. This validation step guarantees that the proof
        // was genuinely verified by that program. Without this check, a
        // different program committing the same public inputs could bypass
        // verification.
        assert!(proof.vk == CHUNK_AGGREGATOR_PROGRAM_VK_HASH);

        let merkle_root: [u8; 32] = proof
            .public_inputs
            .clone()
            .try_into()
            .expect("Public input to be the hash of the chunk tree");

        // Reconstruct the merkle tree and verify that the roots match
        let leaves_commitment: Vec<Hash32> =
            leaves_commitment.into_iter().map(|el| Hash32(el)).collect();
        let merkle_tree: MerkleTree<Hash32> = MerkleTree::build(&leaves_commitment).unwrap();
        assert!(merkle_tree.root == merkle_root);

        leaves.extend(leaves_commitment);

        sp1_zkvm::lib::verify::verify_sp1_proof(&vkey, &public_values_digest.into());
    }

    // Finally, compute the final merkle root with all the leaves
    let merkle_tree: MerkleTree<Hash32> = MerkleTree::build(&leaves).unwrap();

    sp1_zkvm::io::commit_slice(&merkle_tree.root);
}
40 changes: 27 additions & 13 deletions aggregation_mode/bin/write_program_image_id_vk_hash.rs
@@ -1,34 +1,48 @@
use alloy::hex::hex;
use proof_aggregator::aggregators::{
    risc0_aggregator::RISC0_AGGREGATOR_PROGRAM_ID_BYTES, sp1_aggregator,
};
use proof_aggregator::aggregators::{risc0_aggregator, sp1_aggregator};
use serde_json::json;
use sp1_sdk::HashableKey;
use std::{env, fs, path::Path};
use tracing::info;
use tracing_subscriber::FmtSubscriber;

const SP1_PROGRAM_ELF: &[u8] =
    include_bytes!("../aggregation_programs/sp1/elf/sp1_aggregator_program");
const SP1_CHUNK_AGGREGATOR_PROGRAM_ELF: &[u8] =
    include_bytes!("../aggregation_programs/sp1/elf/sp1_chunk_aggregator_program");

include!(concat!(env!("OUT_DIR"), "/methods.rs"));
const SP1_ROOT_AGGREGATOR_PROGRAM_ELF: &[u8] =
    include_bytes!("../aggregation_programs/sp1/elf/sp1_root_aggregator_program");

fn main() {
    let subscriber = FmtSubscriber::builder().finish();
    tracing::subscriber::set_global_default(subscriber).expect("setting default subscriber failed");

    info!("About to write sp1 programs vk hash bytes + risc0 programs image id bytes");
    let sp1_vk_hash = sp1_aggregator::vk_from_elf(SP1_PROGRAM_ELF).bytes32_raw();
    let risc0_image_id_bytes = RISC0_AGGREGATOR_PROGRAM_ID_BYTES;

    let sp1_vk_hash_hex = hex::encode(sp1_vk_hash);
    let risc0_image_id_hex = hex::encode(risc0_image_id_bytes);
    let sp1_chunk_aggregator_vk_hash =
        sp1_aggregator::vk_from_elf(SP1_CHUNK_AGGREGATOR_PROGRAM_ELF).bytes32_raw();
    let sp1_chunk_aggregator_vk_hash_words =
        sp1_aggregator::vk_from_elf(SP1_CHUNK_AGGREGATOR_PROGRAM_ELF).hash_u32();
    let sp1_root_aggregator_vk_hash =
        sp1_aggregator::vk_from_elf(SP1_ROOT_AGGREGATOR_PROGRAM_ELF).bytes32_raw();

    let risc0_chunk_aggregator_image_id_bytes =
        risc0_aggregator::RISC0_CHUNK_AGGREGATOR_PROGRAM_ID_BYTES;
    let risc0_root_aggregator_image_id_bytes =
        risc0_aggregator::RISC0_ROOT_AGGREGATOR_PROGRAM_ID_BYTES;

    let sp1_chunk_aggregator_vk_hash_hex = hex::encode(sp1_chunk_aggregator_vk_hash);
    let sp1_root_aggregator_vk_hash_hex = hex::encode(sp1_root_aggregator_vk_hash);
    let risc0_chunk_aggregator_image_id_hex = hex::encode(risc0_chunk_aggregator_image_id_bytes);
    let risc0_root_aggregator_image_id_hex = hex::encode(risc0_root_aggregator_image_id_bytes);

    let dest_path = Path::new("programs_ids.json");

    let json_data = json!({
        "sp1_vk_hash": format!("0x{}", sp1_vk_hash_hex),
        "risc0_image_id": format!("0x{}", risc0_image_id_hex),
        "sp1_chunk_aggregator_vk_hash": format!("0x{}", sp1_chunk_aggregator_vk_hash_hex),
        "sp1_chunk_aggregator_vk_hash_words": format!("{:?}", sp1_chunk_aggregator_vk_hash_words),
        "sp1_root_aggregator_vk_hash": format!("0x{}", sp1_root_aggregator_vk_hash_hex),
        "risc0_chunk_aggregator_image_id": format!("0x{}", risc0_chunk_aggregator_image_id_hex),
        "risc0_chunk_aggregator_image_id_bytes": format!("{:?}", risc0_chunk_aggregator_image_id_bytes),
        "risc0_root_aggregator_image_id": format!("0x{}", risc0_root_aggregator_image_id_hex),
    });

    // Write to the file
5 changes: 5 additions & 0 deletions aggregation_mode/build.rs
@@ -6,6 +6,10 @@ fn main() {
    sp1_build::build_program_with_args("./aggregation_programs/sp1", {
        sp1_build::BuildArgs {
            output_directory: Some("./aggregation_programs/sp1/elf".to_string()),
            binaries: vec![
                "sp1_chunk_aggregator_program".into(),
                "sp1_root_aggregator_program".into(),
            ],
            // We use Docker to generate a reproducible ELF that will be identical across all platforms
            // (https://docs.succinct.xyz/docs/sp1/writing-programs/compiling#production-builds)
            docker: true,
@@ -21,6 +25,7 @@
        .use_docker(docker_options)
        .build()
        .unwrap();

    risc0_build::embed_methods_with_options(HashMap::from([(
        "risc0_aggregation_program",
        guest_options,
10 changes: 7 additions & 3 deletions aggregation_mode/programs_ids.json
@@ -1,4 +1,8 @@
{
  "risc0_image_id": "0xb0ce0665805c4219d134845b74be6f1f97ab74dbbc1b341b73af4068d33bc1b0",
  "sp1_vk_hash": "0x0035279b39b0a20aac302dd3c569f3a6324a00acc1b8e864af9e3a5fbf453b69"
}
  "risc0_chunk_aggregator_image_id": "0xc89b69ecc8307c65b2af65d5434c34776e0932d75c7e05acd3c15853963e334a",
  "risc0_chunk_aggregator_image_id_bytes": "[200, 155, 105, 236, 200, 48, 124, 101, 178, 175, 101, 213, 67, 76, 52, 119, 110, 9, 50, 215, 92, 126, 5, 172, 211, 193, 88, 83, 150, 62, 51, 74]",
  "risc0_root_aggregator_image_id": "0x3452ff90f5fa471705fd6a213d7cf126c372658ef759271ce687ef1161cceb8a",
  "sp1_chunk_aggregator_vk_hash": "0x006b4421a6fed44853b9c3ec3c82612827e04fba80d6b8606edae2a4523d9e1e",
  "sp1_chunk_aggregator_vk_hash_words": "[899813587, 1068831252, 2000190855, 1210454658, 1057127892, 56287617, 1572193608, 1379769886]",
  "sp1_root_aggregator_vk_hash": "0x004b2cdc6c24f614ac3a66886a6bd96e0b2a6e0e1151b6251c8ba0ec4e8f19d1"
}