
Conversation

@ax0 ax0 commented Dec 8, 2025

No description provided.

@ed255 ed255 left a comment


Overall looks good to me! I've left a few comments.


pub subset_of: CustomPredicateRef,
pub subset_of_recursive: CustomPredicateRef,
pub batch_def: CustomPredicateRef,

I think we're not using most of these fields. We could just remove them.

// These properties are committed on-chain
pub inputs: HashSet<Hash>,
pub key: RawValue,
// TODO: Maybe replace this with a Value -> Value map?

Or even usize -> Value, so we only use numbers to index an item in a batch?
Update: I saw that we're using strings as the index in craftlib. Value as the index sounds good if it gives us more flexibility / ergonomics.
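
For illustration, a minimal sketch of the two shapes being discussed; Value here is only a placeholder type and the struct/field names are assumptions, not the actual crate types:

use std::collections::HashMap;

// Placeholder for the real Value type, only for this sketch.
type Value = String;

// Option A (the TODO above): items in a batch are indexed by an arbitrary
// Value, which also covers the string indices currently used in craftlib.
struct BatchKeysByValue {
    keys: HashMap<Value, Value>,
}

// Option B (this comment): items are indexed purely by their position in the batch.
struct BatchKeysByIndex {
    keys: HashMap<usize, Value>,
}

Option A only needs Value to implement Eq + Hash and keeps craftlib's string indices working unchanged.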


// Build ItemInBatch(item, batch)
Ok(st_custom!(self.ctx,
ItemInBatch() = (

I added a rustfmt rule so that this macro doesn't get formatted. This means it must be formatted manually.
Notice that this line is not indented.

If you have any suggestions to improve this, I'd be happy to hear them!
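
As an aside, and not necessarily what the new rule does, rustfmt also has a per-item opt-out via the #[rustfmt::skip] attribute; a minimal sketch of that alternative:

// rustfmt leaves everything inside an item annotated with #[rustfmt::skip]
// untouched, so a hand-formatted macro body survives `cargo fmt`.
#[rustfmt::skip]
fn hand_formatted_example() {
    let odd_layout = vec![
        1, 2, 3,
            4, 5, 6,
    ];
    assert_eq!(odd_layout.len(), 6);
}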


// Build ItemDef(item, ingredients, inputs, key, work)
Ok(st_custom!(self.ctx,
ItemDef() = (

See previous comment. Notice that this line is indented twice.

write_elems(&mut buffer, &self.item.0);
write_elems(&mut buffer, &self.created_items_root.0);
assert!(self.nullifiers.len() <= 255);
// TODO: What are the constraints on these lengths apart from the type casts below?

There are no other constraints, really.
For convenience we use a fixed-size encoding; we think a maximum of 255 ought to be enough, so we encode the length as a u8.
But if that turns out to be too small we can just use a u16.
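
A minimal sketch of that fixed-size encoding, assuming the same Vec<u8> buffer as in the snippet above (the helper name and generic item type are illustrative):

// Write a one-byte length prefix; the caller then serializes the items
// themselves. Switching to a u16 prefix would only change the assert and the push.
fn write_len_u8<T>(buffer: &mut Vec<u8>, items: &[T]) {
    assert!(items.len() <= u8::MAX as usize);
    buffer.push(items.len() as u8);
}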


On the decoding side we may choose to impose an upper limit on the length to avoid abuse: the ETH cost of committing many items in a batch is the same as committing a single item, but the synchronizer takes on more load.
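
On the decoding side that cap could look roughly like this; MAX_ITEMS_PER_BATCH and the error handling are assumptions for illustration, not existing code:

// Hypothetical limit on how many items a single batch may carry.
const MAX_ITEMS_PER_BATCH: usize = 64;

// Read the one-byte length prefix and reject oversized batches: the ETH cost
// of a large batch is the same as a single item, but the synchronizer's load
// grows with every item it has to process.
fn read_len_u8(buffer: &[u8]) -> Result<usize, String> {
    let len = *buffer.first().ok_or_else(|| "missing length byte".to_string())? as usize;
    if len > MAX_ITEMS_PER_BATCH {
        return Err(format!("batch too large: {len} > {MAX_ITEMS_PER_BATCH}"));
    }
    Ok(len)
}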

}
Err(e) => {
info!("Invalid do_blob: {:?}", e);
error!("Ignoring blob due: Invalid do_blob: {:?}", e);

I understand the need to highlight this so that it stands out, but I'd like to point out that receiving an invalid do_blob is not an error from the synchronizer's point of view.


Makes sense. Initially, when an error was caused by an invalid blob (i.e. a blob labeled as a DO blob but with incorrect formatting, e.g. a wrongly compressed plonky2 proof), it went unnoticed: a lower layer simply panicked and stopped the synchronizer completely. That's how I noticed the problem while adding new features, although the cause wasn't obvious and took some debugging to track down. From there I set it to 'warn' and returned the error (using ? instead of the old .unwrap()), but it was easy to miss among all the other logs; so, coming from a panic (.unwrap()), the error log seemed the natural next step (especially since it was a blob labeled as DO type with a formatting error).
But it's true that someone could send an invalid blob labeled as DO type without that being an actual error, and the synchronizer should not report it as one; before this PR that case would make the synchronizer panic, whereas now it just logs the error message.
Maybe we can change it to warn? (For sure not going back to panicking.)


I wasn't aware this was panicking before! Thanks for fixing that.
A warn sounds good to me.
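
A sketch of the agreed change, assuming the log crate's warn! macro (or the tracing equivalent) is in scope like the info!/error! calls above; the function name and generic error type are placeholders:

use log::warn;

// An invalid DO blob is not a synchronizer failure: make it visible in the
// logs, but keep processing the remaining blobs instead of reporting an
// error (or, as before this PR, panicking).
fn report_invalid_do_blob<E: std::fmt::Debug>(e: &E) {
    warn!("Ignoring blob: invalid do_blob: {:?}", e);
}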

let mut s1 = empty_set.clone();
s1.insert(&wood).unwrap();
let mut inputs = s1.clone();
println!("1");

these prints should be removed

let batch_hash = batch_def.batch_hash(self.params)?;
let dust_hash = hash_values(&[batch_hash.into(), DUST_BLUEPRINT.into()]);
let gem_hash = hash_values(&[batch_hash.into(), GEM_BLUEPRINT.into()]);
dbg!(&batch_def.ingredients.keys);

to be removed

})
.collect::<Result<Vec<_>>>()?;

// TODO instead of 'i', use 'index' (which is the blueprint (which takes

This TODO sounds good; I suggest applying it already.

@ax0 ax0 marked this pull request as ready for review December 9, 2025 04:39
ax0 and others added 2 commits December 9, 2025 14:52
…ce PoW of Stone from 3 to 2 (since now proving in general already involves more recursive steps)
@arnaucube arnaucube force-pushed the multi-outputs branch 3 times, most recently from 18726ab to 6a2ddf4 on December 9, 2025 15:27
shell.nix Outdated
wayland.dev
wayland-protocols
sway
emacs

😆

@ed255 ed255 left a comment


LGTM!

ItemDef(item, ingredients, inputs, key, work, private: batch, index, keys) = AND(
BatchDef(batch, ingredients, inputs, keys, work)
ItemInBatch(item, batch, index, keys)
DictContains(keys, index, key)
@dhvanipa dhvanipa Jan 23, 2026


Why is the DictContains needed if ItemInBatch already includes it?
