Set up basics of protobuf for the monorepo #4586
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 6f235de433
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'protobuf.sim.single_episode_pb2', _globals)
Regenerate protos so module name matches package
The generated module name is protobuf.sim.single_episode_pb2 even though the file lives under the metta package, so the message classes get a __module__ that points at a non-existent top‑level protobuf package. This breaks pickling/unpickling (e.g., multiprocessing, caching, or serialized message objects) with ModuleNotFoundError because Python will try to import protobuf.sim.single_episode_pb2. This stems from compiling with --proto_path=proto/metta (dropping the metta prefix); consider compiling from the proto root so the generated module name becomes metta.protobuf.sim.single_episode_pb2.
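As a rough illustration of the suggested fix, the regeneration call could look like the sketch below, using grpc_tools.protoc from the grpcio-tools package this PR installs. The proto root directory name and the exact location of single_episode.proto are assumptions inferred from the module name above, not confirmed paths.

# Sketch only: paths are assumptions based on the module name discussed above.
from grpc_tools import protoc

protoc.main([
    "protoc",
    # Compiling with --proto_path=proto/metta drops the "metta" prefix and
    # yields the module name protobuf.sim.single_episode_pb2.
    # Compiling from the proto root keeps the prefix, so the generated module
    # becomes metta.protobuf.sim.single_episode_pb2.
    "--proto_path=proto",
    "--python_out=.",
    "proto/metta/protobuf/sim/single_episode.proto",
])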
Excellent point, thank you.
Force-pushed 6f235de to 08fbbd9
We install and use protoc via the "grpcio-tools" Python package, so everyone will use the same version after "uv sync".

We check in the generated protobuf code, so it's ready to use in our Python code. This change includes the script to (re)generate it, but does not wire it into the CI process (where we could run it to confirm that checked-in code and proto files still match).

We warn Claude to warn its user about changes to proto files, and to not include proto file changes in big refactors.

And, we add a message that we may use soon in support of the pure single episode runner. (The details of it are subject to change; I'm still learning what it's like to pass the entire config in to the runner like that, so I may want to refactor it to separate the "game rules" from "this particular episode's configuration for policies and outputs".)
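To illustrate the "ready to use" point, a usage sketch follows. The import path assumes the metta-prefixed module layout discussed in the review above, and the message name SingleEpisode is a placeholder rather than the actual name defined in the .proto file.

# Illustration only: the message name is a placeholder, and the import path
# assumes the regenerated metta.protobuf.sim module layout discussed above.
from metta.protobuf.sim import single_episode_pb2

episode = single_episode_pb2.SingleEpisode()  # hypothetical message type
payload = episode.SerializeToString()         # standard protobuf serialization
restored = single_episode_pb2.SingleEpisode.FromString(payload)
assert restored == episode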
Force-pushed 08fbbd9 to e132ea0
nishu-builder
left a comment
nice! Can we add something to CI that runs this and checks whether anyone has changed the .proto files without running the generate-bindings script?
Also, it might make sense to house the generation script under metta codegen <something>.
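One way such a check could work (a sketch, not the actual CI wiring; the script path below is a placeholder for the real generate-bindings entry point): regenerate the bindings in CI, then fail if the working tree no longer matches what is checked in.

# Sketch of a CI sync check; the generation command is a placeholder.
import subprocess
import sys

# Re-run the (hypothetical) generation script over the checked-in .proto files.
subprocess.run([sys.executable, "scripts/generate_protobuf_bindings.py"], check=True)

# If regeneration changed anything, the checked-in bindings were stale.
result = subprocess.run(["git", "diff", "--exit-code"])
if result.returncode != 0:
    sys.exit("Generated protobuf code is out of date; re-run the generation script and commit the result.")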
Leaving this to sit for the weekend; we can discuss in person on Monday. It's not used for anything yet. The main questions here are, I think:

The messages I added aren't used for anything, and they may not even be a good starting point. The real need is in #4381, but I don't want to add yet another layer of complexity onto that single pull request by making it introduce the protobuf build infra too.

Yes, we should end up with CI verifying that everything's still in sync. (But I wasn't confident that I could hole-in-one the decisions on how to integrate with CI.) I'm going to give @daveey a chance to see this before landing it, maybe midday Monday.
daveey
left a comment
Looks good. Maybe we want a pre-commit check that the .proto files match the generated code? But it's fine without one.