Conversation
…ed discoverability
- Expanded the keywords list to include more specific terms related to FIX protocol and trading
- Added new categories to better classify the project within the ecosystem
- Aims to enhance visibility and relevance in package registries
- Incremented version from 0.7.3 to 0.7.4 to reflect the latest changes
- Maintains consistency with workspace versioning practices
- Prepares for upcoming release
Summary of Changes
Hello @cognitive-glitch, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
This pull request, despite its title suggesting ASN support, primarily focuses on updating project metadata and internal documentation. It includes a version increment and an enhancement of the project's discoverability on package registries through expanded keywords and categories. Additionally, a minor update to internal development guidelines has been applied.
Highlights
- Version Increment: The project's version has been updated from 0.7.3 to 0.7.4 in `Cargo.toml`, signaling a new patch release.
- Package Metadata Enhancement: The `Cargo.toml` file has been significantly updated to include a more comprehensive and specific set of `keywords` and `categories`. This aims to improve the discoverability and accurate categorization of the `rusty-fix-engine` crate on package registries, adding terms like `fix-protocol`, `quickfix`, `fix-engine`, and `trading`.
- Internal Documentation Update: A minor adjustment was made to the `CLAUDE.md` guideline document, adding a reference to `@docs/fix-specs/` alongside `@docs/external-libraries/` for developers to consult when implementing APIs or designing exchange abstractions.
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes
1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution. ↩
Code Review
This pull request updates the workspace version and expands the package metadata in Cargo.toml. While the intent to improve discoverability is good, the changes to keywords and categories violate crates.io manifest policies and will prevent the package from being published. I've provided a critical comment with a suggested fix to align with the requirements.
Cargo.toml
Outdated
```toml
keywords = [
    "fix",
    "fix-protocol",
    "quickfix",
    "fix-engine",
    "fix-parser",
    "fast",
    "protocol",
    "trading",
    "finance",
    "fintech",
]
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "fix-engine",
    "fix-parser",
    "fix-protocol",
    "quickfix",
    "fast",
]
```
The Cargo.toml manifest has some issues that will prevent publishing to crates.io.
- Number of Keywords: The `keywords` array is limited to a maximum of 5 entries. The current change increases this to 10.
- Number of Categories: The `categories` array is also limited to a maximum of 5 entries. The current change increases this to 8.
- Invalid Categories: The categories `fix-engine`, `fix-parser`, `fix-protocol`, `quickfix`, and `fast` are not valid category slugs on crates.io. You must choose from the official list of categories.
These issues are critical and will cause cargo publish to fail. I've suggested a valid set of keywords and categories below, but feel free to adjust them as you see fit while respecting the constraints.
```toml
keywords = [
    "fix-protocol",
    "fix-engine",
    "fast",
    "trading",
    "fintech",
]
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "finance",
]
```
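The crates.io limits called out above can also be checked mechanically before publishing. The sketch below is illustrative only: the limits (5 keywords, 5 categories, 20 characters per keyword) reflect crates.io policy, but the `validate_metadata` function is a hypothetical helper, not part of this project or of Cargo.

```rust
// Illustrative pre-publish check for crates.io manifest limits.
// The numeric limits follow crates.io policy; the function itself is
// a hypothetical helper, not part of the project or of Cargo.
fn validate_metadata(keywords: &[&str], categories: &[&str]) -> Result<(), String> {
    if keywords.len() > 5 {
        return Err(format!("too many keywords: {} (max 5)", keywords.len()));
    }
    if categories.len() > 5 {
        return Err(format!("too many categories: {} (max 5)", categories.len()));
    }
    if let Some(k) = keywords.iter().find(|k| k.len() > 20) {
        return Err(format!("keyword too long: {k}"));
    }
    Ok(())
}

fn main() {
    // The original 10-keyword list fails the check...
    let old = ["fix", "fix-protocol", "quickfix", "fix-engine", "fix-parser",
               "fast", "protocol", "trading", "finance", "fintech"];
    assert!(validate_metadata(&old, &[]).is_err());
    // ...while the suggested 5-keyword list passes.
    let new = ["fix-protocol", "fix-engine", "fast", "trading", "fintech"];
    assert!(validate_metadata(&new, &[]).is_ok());
    println!("metadata checks passed");
}
```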
…mpilation
- Implement comprehensive build script for ASN.1 schema generation from FIX dictionaries
- Generate type-safe ASN.1 enums for FIX message types and field tags
- Create ASN.1 struct definitions for FIX messages with proper field mapping
- Handle field value enumerations with automatic Rust identifier generation
- Add proper identifier handling for numeric values and invalid characters
- Integrate generated code with main library through generated.rs module
- Add fix40 and fix50 feature flags for conditional compilation
- Create schemas/ directory with sample ASN.1 schema for extensions
- Support multiple FIX versions (4.0, 4.4, 5.0 SP2) with feature-based generation
- Implement conversion methods between simple types and ASN.1 representations

Build script generates:
- FixMessageType enum with all FIX message types
- FixFieldTag enum with all field tags and conversions
- Asn1FixMessage struct for ASN.1 message representation
- Asn1Field struct for individual field representation
- Field value enums for restricted field sets
- Conversion implementations for interoperability

The build script successfully compiles and generates working ASN.1 definitions from FIX dictionaries at build time, enabling type-safe ASN.1 encoding/decoding.
/gemini review
Code Review
This pull request introduces a new crate rustyasn for ASN.1 encoding support, which is a significant and valuable feature addition. The implementation is well-structured, including encoders, decoders, configuration options, and a build script for code generation from FIX dictionaries. It also comes with a good set of benchmarks and integration tests.
My review has identified a few high-severity issues that should be addressed before merging:
- The workspace `Cargo.toml` defines more categories than allowed by crates.io, which will block publishing.
- The `rustyasn` encoder and decoder advertise support for PER encoding but fall back to DER. This discrepancy could lead to unexpected behavior and interoperability issues.
- The `Encoder` struct in `rustyasn` contains unused fields (`buffer` and `schema`), which should be removed to clean up the code and eliminate unnecessary overhead.
I've also made several medium-severity suggestions to improve robustness and align with the project's coding guidelines, such as replacing .unwrap() with .expect() and making documentation examples more user-friendly.
Overall, this is a solid contribution that will be even better once these issues are addressed.
```toml
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "fix-engine",
    "fix-parser",
    "fix-protocol",
    "quickfix",
    "fast",
    "trading",
    "protocol",
    "hft",
    "high-frequency-trading",
]
```
The categories list for the workspace contains more than 5 items. crates.io has a limit of 5 categories per crate. Any crate in the workspace that inherits this setting (like the new rustyasn crate) will fail to publish. Please reduce the number of categories to at most 5.
```toml
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "finance",
    "fintech",
]
```
crates/rustyasn/src/decoder.rs
Outdated
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, use DER as fallback
    der_decode::<FixMessage>(data)
        .map_err(|e| Error::Decode(DecodeError::Internal(e.to_string())))
}
```
The implementation for PER, APER, and UPER encoding rules falls back to DER decoding. However, the README.md and config.rs files present these as supported encoding rules. This discrepancy is misleading for users who might choose PER for its compactness and get DER-encoded data instead, which could lead to interoperability issues. If PER is not yet supported, this function should return an UnsupportedEncodingRule error.
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    Err(Error::Decode(DecodeError::UnsupportedEncodingRule("PER/APER/UPER")))
}
```
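The fail-fast pattern the review asks for can be sketched independently of the crate's real types. In this minimal, self-contained example the enum and error type are illustrative stand-ins (the crate's actual `EncodingRule` and `DecodeError` definitions are not reproduced here); only the match-and-error shape is the point:

```rust
// Minimal sketch: unsupported encoding rules return an explicit error
// instead of silently decoding as DER. The enum and error type here are
// hypothetical stand-ins, not the crate's actual API.
#[derive(Clone, Copy, Debug, PartialEq)]
enum EncodingRule { Ber, Der, Oer, Per, Aper, Uper }

#[derive(Debug, PartialEq)]
enum DecodeError { UnsupportedEncodingRule(&'static str) }

fn decode(rule: EncodingRule, _data: &[u8]) -> Result<&'static str, DecodeError> {
    match rule {
        // Supported rules would dispatch to the real decoder here.
        EncodingRule::Ber | EncodingRule::Der | EncodingRule::Oer => Ok("decoded"),
        // Unsupported rules fail loudly instead of falling back to DER.
        EncodingRule::Per | EncodingRule::Aper | EncodingRule::Uper => {
            Err(DecodeError::UnsupportedEncodingRule("PER/APER/UPER"))
        }
    }
}

fn main() {
    assert_eq!(decode(EncodingRule::Der, &[]), Ok("decoded"));
    assert_eq!(
        decode(EncodingRule::Per, &[]),
        Err(DecodeError::UnsupportedEncodingRule("PER/APER/UPER"))
    );
    println!("fail-fast behavior verified");
}
```

The caller can no longer receive DER-encoded bytes while believing it requested PER, which is the interoperability hazard the review describes.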
crates/rustyasn/src/encoder.rs
Outdated
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, use DER as fallback
    der_encode(message).map_err(|e| Error::Encode(EncodeError::Internal(e.to_string())))
}
```
Similar to the decoder, the implementation for PER, APER, and UPER encoding rules falls back to DER encoding. This is misleading as users expecting PER's compact format will get DER instead. If PER is not supported, this should return an UnsupportedEncodingRule error to avoid silent failures and interoperability issues.
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, return an error.
    Err(Error::Encode(EncodeError::UnsupportedEncodingRule {
        rule: "PER/APER/UPER",
        msg_type: message.msg_type.clone().into(),
    }))
}
```
crates/rustyasn/Cargo.toml
Outdated
```toml
[[bench]]
name = "asn1_encodings"
harness = false
```
crates/rustyasn/README.md
Outdated
```rust
// Encode a message
let mut handle = encoder.start_message(
    "D", // MsgType: NewOrderSingle
```
The code example for basic encoding/decoding is not self-contained, as timestamp is not defined. This will cause confusion for users trying to run the example. Please either define the variable or mark the code block with `rust,ignore` to indicate it's illustrative.
````diff
-    "D", // MsgType: NewOrderSingle
+```rust,ignore
````
```rust
}

fn benchmark_encoding(c: &mut Criterion) {
    let dict = Arc::new(Dictionary::fix44().unwrap());
```
The project guidelines in CLAUDE.md discourage the use of .unwrap(). Please use .expect() with a descriptive message instead to provide more context on panics, which is especially helpful for debugging failing benchmarks.
```diff
-    let dict = Arc::new(Dictionary::fix44().unwrap());
+    let dict = Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary"));
```
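The practical difference between the panic styles discussed throughout this review can be shown in a small, self-contained sketch. The `load_dictionary` function below is a hypothetical stand-in (the real `Dictionary::fix44()` is not reproduced); only the three error-handling idioms are the point:

```rust
// Illustrative comparison of the panic styles discussed in the review.
// `unwrap` panics with a generic message; `expect` attaches a context
// string; `unwrap_or_else` additionally lets you format the underlying
// error value into the message. `load_dictionary` is a hypothetical stub.
fn load_dictionary(ok: bool) -> Result<&'static str, &'static str> {
    if ok { Ok("FIX 4.4 dictionary") } else { Err("file not found") }
}

fn main() {
    // Plain unwrap: on failure, the panic message is the generic
    // "called `Result::unwrap()` on an `Err` value: ...".
    let d1 = load_dictionary(true).unwrap();
    // expect: on failure, the panic message includes the context string.
    let d2 = load_dictionary(true).expect("Failed to load FIX 4.4 dictionary");
    // unwrap_or_else: on failure, the error value itself can be embedded.
    let d3 = load_dictionary(true)
        .unwrap_or_else(|e| panic!("Failed to load FIX 4.4 dictionary: {e}"));
    assert_eq!(d1, d2);
    assert_eq!(d2, d3);
    println!("all three succeed on the happy path");
}
```

On a failing benchmark or test, the second and third forms immediately tell the reader *which* prerequisite failed and why, which is the motivation behind the guideline.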
```rust
fn benchmark_message_sizes(c: &mut Criterion) {
    let dict = Arc::new(Dictionary::fix44().unwrap());
    let group = c.benchmark_group("message_sizes");

    // Compare encoded sizes
    let encoding_rules = [
        ("BER", EncodingRule::BER),
        ("DER", EncodingRule::DER),
        ("OER", EncodingRule::OER),
    ];

    for (name, rule) in encoding_rules {
        let config = Config::new(rule);
        let encoder = Encoder::new(config, dict.clone());
        let encoded = create_test_message(&encoder, 1);

        println!("{} encoded size: {} bytes", name, encoded.len());
    }

    group.finish();
}
```
crates/rustyasn/build.rs
Outdated
```rust
//! Build script for ASN.1 schema compilation and code generation.

use anyhow::Result;
```
The project guidelines in CLAUDE.md state a preference for thiserror over anyhow for structured error handling. While anyhow is convenient in build scripts, adhering to the project's standard would improve consistency. Please consider refactoring to use thiserror or a custom error type for better error handling.
crates/rustyasn/build.rs
Outdated
```rust
// For now, just copy the schema files to OUT_DIR
// In a full implementation, you would parse and compile them
let out_dir = env::var("OUT_DIR")?;
let filename = schema_file.file_name().unwrap();
```
The project guidelines discourage .unwrap(). Using it here could cause the build script to panic if a path generated by glob somehow doesn't have a filename. Please use .expect() for robustness and to provide a more helpful error message if a panic does occur.
```diff
-    let filename = schema_file.file_name().unwrap();
+    let filename = schema_file.file_name().expect("glob entry should have a filename");
```
```rust
#[test]
fn test_basic_encoding_decoding() {
    let dict = Arc::new(Dictionary::fix44().unwrap());
```
The project guidelines in CLAUDE.md discourage the use of .unwrap(). Please use .expect() with a descriptive message to provide more context on panic during tests.
```diff
-    let dict = Arc::new(Dictionary::fix44().unwrap());
+    let dict = Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary"));
```
- Implement SetField trait for EncoderHandle with SmallVec buffer
- Add GetConfig trait implementations for Encoder, Decoder, DecoderStreaming
- Enhance field processing with common FIX tag iteration
- Fix Buffer trait usage and type annotations in tests
- Update dependency from fix44 to fix50 feature
- Resolve to_string method ambiguity and GroupEntries implementation
- Apply clippy fixes for explicit counter loops and unwrap usage
- Fix formatting and linting issues

All 33 tests passing. ASN.1 encoding now seamlessly integrates with the RustyFix ecosystem for encoding/decoding FIX messages.
- Add features to build-dependencies for rustyfix-dictionary
- Update feature flags to properly enable corresponding rustyfix-dictionary features
- Fix build script to use Dictionary::fix50() instead of Dictionary::fix50sp2()
- Replace .unwrap() with .expect() for better error messages
- Fix doc_cfg warning by using docsrs instead

This resolves the compilation errors where Dictionary methods were not available in build scripts due to missing feature flags.
Previously PER/APER/UPER encoding rules would silently fall back to DER encoding, which could lead to unexpected behavior. Now they properly return an UnsupportedEncodingRule error indicating that PER variants are not supported by the rasn crate. This ensures users are aware when their requested encoding rule cannot be used, rather than getting an unexpected encoding format.
- Replace 21 .unwrap() calls in message.rs with descriptive .expect() messages
- Replace 2 .unwrap() calls in encoder.rs with context-specific .expect() messages
- Replace 13 .unwrap() calls in field_types.rs with field type specific .expect() messages
- Replace 1 .unwrap() call in config.rs with configuration context .expect()
- Replace 5 .unwrap() calls in schema.rs with schema building context .expect() messages
- Replace 1 .unwrap() call in decoder.rs with decoding context .expect()
- Replace 3 .unwrap() calls in generated.rs with test context .expect() messages

All .expect() calls now include descriptive error messages explaining what should not fail and why. This improves error debugging and follows Rust best practices per project guidelines.
/gemini review
Code Review
This PR introduces a new crate rustyasn for ASN.1 support, which is a great addition. The overall structure is good, with clear separation of concerns into modules for encoding, decoding, configuration, etc. The use of a build script for code generation is also a solid approach.
However, there are several significant architectural issues and limitations in the current implementation that need to be addressed:
- Incomplete Feature Implementation: Several key features are either missing or incomplete, such as support for repeating groups, a complete schema implementation (it's currently hardcoded for a few messages), and support for PER/APER/UPER encoding rules.
- Architectural Disconnect: There seems to be a disconnect between the code generated by `build.rs` and the runtime logic. The encoder/decoder rely on manually defined message structures and schemas, while the more detailed generated code is not fully utilized. This makes the crate confusing and hard to extend.
- Misleading Public API: The public API (e.g., the `EncodingRule` enum and the `encode_message` function) suggests more functionality than is actually implemented, which can be misleading for users.
I've left more detailed comments on specific parts of the code. I recommend focusing on creating a solid, unified architecture before adding more features. A good starting point would be to make the encoder/decoder fully utilize the code generated from the FIX dictionaries.
crates/rustyasn/src/schema.rs
Outdated
```rust
/// Builds field type information from dictionary.
fn build_field_types(&mut self) {
    // Standard header fields
    self.add_header_fields();

    // Standard trailer fields
    self.add_trailer_fields();

    // TODO: Add all field definitions from dictionary
    // This would normally iterate through dictionary.fields()
}
```
The build_field_types function has a TODO and only adds a few hardcoded header and trailer fields. Similarly, build_message_schemas only adds a few hardcoded messages. This means the schema is incomplete and doesn't use the full FIX dictionary.
This is a critical architectural issue. The crate has a build.rs that seems to generate code from dictionaries, but the runtime logic uses this separate, incomplete, manual schema. The generated code should be the source of truth for the schema information at runtime to ensure correctness and completeness.
crates/rustyasn/src/config.rs
Outdated
```rust
/// Packed Encoding Rules - Compact, bit-oriented format
PER,
/// Aligned Packed Encoding Rules - PER with alignment
APER,
/// Unaligned Packed Encoding Rules - PER without alignment
UPER,
/// Octet Encoding Rules - Byte-aligned, efficient format
OER,
}
```
The EncodingRule enum includes PER, APER, and UPER, but these are not actually supported by the encoder or decoder. The encoder returns an UnsupportedEncodingRule error, and the decoder falls back to DER.
This is misleading to users of the crate. If these encoding rules are not supported, they should be removed from the public EncodingRule enum. Alternatively, if support is planned, the current implementation should be documented as a stub and the functions should clearly state the lack of support.
I suggest removing the unsupported variants for now to provide a clear and honest API.
```rust
fn group(
    &self,
    _field: u32,
) -> Result<Self::Group, FieldValueError<<usize as FieldType>::Error>> {
    // For simplicity, create an empty group
    // In a full implementation, this would parse repeating groups
    Ok(MessageGroup::new(vec![]))
}

fn group_opt(&self, _field: u32) -> Result<Option<Self::Group>, <usize as FieldType>::Error> {
    // For simplicity, return None
    // In a full implementation, this would parse repeating groups
    Ok(None)
}
```
The group and group_opt methods for FieldMap are implemented as stubs that always return an empty or None group. This means repeating groups, a fundamental part of FIX, are not supported.
This is a major limitation and should be clearly documented in the crate's README.md. For the crate to be considered for production use, this functionality will need to be implemented.
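For context, a FIX repeating group is announced by a count field (e.g. NoOrders, tag 73) followed by that many repetitions of a fixed field sequence. A hedged sketch of what an in-memory representation could look like is below; the type names (`GroupEntry`, `RepeatingGroup`) are hypothetical and not part of the crate's API:

```rust
// Hedged sketch of a possible in-memory shape for repeating groups.
// Names are hypothetical, not the crate's API. A FIX repeating group is
// a count field (e.g. NoOrders, tag 73) followed by that many
// repetitions of a fixed field sequence.
#[derive(Debug)]
struct GroupEntry {
    fields: Vec<(u32, String)>, // (tag, value) pairs for one repetition
}

#[derive(Debug)]
struct RepeatingGroup {
    count_tag: u32,          // e.g. 73 = NoOrders
    entries: Vec<GroupEntry>,
}

impl RepeatingGroup {
    /// The value the count field must carry: the number of entries.
    fn count(&self) -> usize {
        self.entries.len()
    }
}

fn main() {
    let group = RepeatingGroup {
        count_tag: 73, // NoOrders
        entries: vec![
            GroupEntry { fields: vec![(11, "ORD-1".into()), (67, "1".into())] }, // ClOrdID, ListSeqNo
            GroupEntry { fields: vec![(11, "ORD-2".into()), (67, "2".into())] },
        ],
    };
    assert_eq!(group.count(), 2);
    println!("group with {} entries under tag {}", group.count(), group.count_tag);
}
```

A real implementation would additionally have to validate that the declared count matches the parsed entries and that nested groups are handled recursively.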
crates/rustyasn/src/encoder.rs
Outdated
```rust
fn add_message_fields<F: FieldMap<u32>>(
    &self,
    handle: &mut EncoderHandle,
    msg: &F,
) -> Result<()> {
    // Note: FieldMap doesn't provide field iteration, so we try common field tags
    // In a full implementation, this would use a field iterator or schema
    let common_tags = [55, 54, 38, 44, 114, 60]; // Symbol, Side, OrderQty, Price, etc.

    for &tag in &common_tags {
        if let Some(raw_data) = msg.get_raw(tag) {
            let value_str = String::from_utf8_lossy(raw_data);
            handle.add_field(tag as u16, value_str.to_string());
        }
    }

    Ok(())
```
The implementation of add_message_fields is a stub that only handles a few hardcoded tags. This makes the encode_message function not generally useful. The comment // Note: FieldMap doesn't provide field iteration, so we try common field tags highlights the issue.
To make this function useful, it needs a way to iterate over all fields in the FieldMap. If the FieldMap trait doesn't support iteration, you might need to rethink the integration approach or document this severe limitation prominently. A possible solution could be to require a FieldMap that also implements IntoIterator.
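The iteration-based approach suggested above can be sketched with a hypothetical trait. Everything here is a stand-in (the crate's real `FieldMap` trait is not reproduced); the point is only that an encoder given an iterable map visits every field, including ones no hardcoded list would anticipate:

```rust
// Sketch of the review's suggestion: require field iteration from the
// map instead of probing a hardcoded tag list. `IterableFieldMap` is a
// hypothetical stand-in for the crate's FieldMap trait.
use std::collections::BTreeMap;

trait IterableFieldMap {
    /// Every (tag, raw value) pair in the message, in tag order.
    fn fields(&self) -> Vec<(u32, &[u8])>;
}

impl IterableFieldMap for BTreeMap<u32, Vec<u8>> {
    fn fields(&self) -> Vec<(u32, &[u8])> {
        self.iter().map(|(&tag, v)| (tag, v.as_slice())).collect()
    }
}

// The encoder can now visit every field, not just a guessed subset.
fn encode_all<M: IterableFieldMap>(msg: &M) -> Vec<(u32, String)> {
    msg.fields()
        .into_iter()
        .map(|(tag, raw)| (tag, String::from_utf8_lossy(raw).into_owned()))
        .collect()
}

fn main() {
    let mut msg = BTreeMap::new();
    msg.insert(55, b"EURUSD".to_vec());   // Symbol
    msg.insert(54, b"1".to_vec());        // Side
    msg.insert(9999, b"custom".to_vec()); // a tag no hardcoded list would know
    let encoded = encode_all(&msg);
    assert_eq!(encoded.len(), 3); // every field visited, including tag 9999
    println!("encoded {} fields", encoded.len());
}
```

With the hardcoded-tags approach from the diff, the `9999` field would be silently dropped; with iteration it survives, which is why the review treats this as more than a cosmetic issue.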
crates/rustyasn/build.rs
Outdated
```rust
sender_comp_id: msg.sender_comp_id.clone(),
target_comp_id: msg.target_comp_id.clone(),
msg_seq_num: msg.msg_seq_num,
sending_time: None, // TODO: Extract from fields if present
```
There's a TODO here to extract the sending time from fields. This is a good note, but it's worth considering how to handle this more robustly. The Asn1FixMessage struct has a dedicated sending_time field, but the conversion from FixMessage doesn't populate it. This could lead to loss of information during message conversion.
A more complete implementation would be to iterate through the fields of the source FixMessage, find the SendingTime (tag 52), and populate the sending_time field in Asn1FixMessage. The field should then be excluded from the generic fields vector to avoid duplication.
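The conversion described above can be sketched with illustrative struct shapes (the real `FixMessage` and `Asn1FixMessage` definitions are not reproduced here): scan the generic fields once, move tag 52 into the dedicated slot, and keep it out of the remaining vector so it is not duplicated.

```rust
// Sketch of the conversion the review describes: find SendingTime
// (tag 52) among the generic fields, move it into the dedicated slot,
// and exclude it from the remaining field vector. Struct shapes are
// illustrative stand-ins, not the crate's actual types.
struct SourceMessage {
    fields: Vec<(u32, String)>, // generic (tag, value) pairs
}

struct Asn1Message {
    sending_time: Option<String>,
    fields: Vec<(u32, String)>,
}

const SENDING_TIME_TAG: u32 = 52;

fn convert(msg: SourceMessage) -> Asn1Message {
    let mut sending_time = None;
    let mut rest = Vec::with_capacity(msg.fields.len());
    for (tag, value) in msg.fields {
        if tag == SENDING_TIME_TAG && sending_time.is_none() {
            sending_time = Some(value); // populate the dedicated field
        } else {
            rest.push((tag, value));    // everything else stays generic
        }
    }
    Asn1Message { sending_time, fields: rest }
}

fn main() {
    let src = SourceMessage {
        fields: vec![
            (52, "20240101-12:00:00".to_string()),
            (55, "EURUSD".to_string()),
        ],
    };
    let out = convert(src);
    assert_eq!(out.sending_time.as_deref(), Some("20240101-12:00:00"));
    assert_eq!(out.fields.len(), 1); // tag 52 is not duplicated in fields
    println!("sending_time extracted without duplication");
}
```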
crates/rustyasn/README.md
Outdated
```rust
// Encode a message
```
crates/rustyasn/src/lib.rs
Outdated
```rust
//! ```rust,ignore
//! use rustyasn::{Config, Encoder, Decoder, EncodingRule};
//! use rustyfix::Dictionary;
//!
//! // Configure encoding
//! let config = Config::new(EncodingRule::PER);
//! let dictionary = Dictionary::fix44();
//!
//! // Encode a message
//! let mut encoder = Encoder::new(config, dictionary);
//! let encoded = encoder.encode_message(msg)?;
//!
//! // Decode a message
//! let decoder = Decoder::new(config, dictionary);
//! let message = decoder.decode(&encoded)?;
//! ```
```
The usage example in the crate-level documentation is marked as ignore and is incomplete. The msg variable is not defined, so the example is not self-contained and won't compile if a user tries to run it.
Please provide a complete, minimal, and working example to demonstrate the crate's usage.
```rust
//! ```rust,ignore
//! use rustyasn::{Config, Encoder, Decoder, EncodingRule};
//! use rustyfix::Dictionary;
//! use std::sync::Arc;
//!
//! // Configure encoding
//! let config = Config::new(EncodingRule::DER);
//! let dictionary = Arc::new(Dictionary::fix44().unwrap());
//!
//! // Encode a message
//! let encoder = Encoder::new(config.clone(), dictionary.clone());
//! let mut handle = encoder.start_message("A", "SENDER", "TARGET", 1);
//! handle.add_string(98, "0").add_int(108, 30);
//! let encoded = handle.encode().unwrap();
//!
//! // Decode a message
//! let decoder = Decoder::new(config, dictionary);
//! let message = decoder.decode(&encoded).unwrap();
//! assert_eq!(message.msg_type(), "A");
//! ```
```
```rust
/// Adds an entry to the group.
pub fn add_entry(&mut self, entry: Message) {
    self.entries.push(entry);
}
```
crates/rustyasn/README.md
Outdated
```rust
let mut handle = encoder.start_message(
    "D",         // MsgType: NewOrderSingle
    "SENDER001", // SenderCompID
    "TARGET001", // TargetCompID
    1,           // MsgSeqNum
    timestamp,   // SendingTime
);
```
The example for encoder.start_message is incorrect. The function signature in encoder.rs is pub fn start_message<'a>(&'a self, msg_type: &str, sender_comp_id: &str, target_comp_id: &str, msg_seq_num: u64) -> EncoderHandle<'a>, which does not take a timestamp argument. This example will not compile and will confuse users.
```diff
 let mut handle = encoder.start_message(
     "D",         // MsgType: NewOrderSingle
     "SENDER001", // SenderCompID
     "TARGET001", // TargetCompID
     1,           // MsgSeqNum
-    timestamp,   // SendingTime
 );
```
crates/rustyasn/Cargo.toml
Outdated
```toml
[dependencies]
# Core dependencies
rustyfix = { path = "../rustyfix", version = "0.7.4", features = ["fix50"] }
```
The rustyfix dependency has a feature fix50 enabled, but the rustyfix-dictionary dependency does not have this feature enabled. This could lead to inconsistencies or compilation errors. Ensure that both dependencies have the same features enabled, or that the code handles the case where they don't.
```toml
rustyfix-dictionary = { path = "../rustyfix-dictionary", version = "0.7.4", features = ["fix50"] }
```
- Remove invalid categories like "fix-engine", "fix-parser", etc.
- Keep only valid crates.io categories: "network-programming", "parser-implementations", "encoding", "finance", "development-tools"
- This prevents publishing failures for crates that inherit workspace categories
Code Review
This pull request introduces a new rustyasn crate to add ASN.1 encoding support. The structure is well-thought-out, with modules for encoding, decoding, and configuration. The build script for code generation is a solid approach for maintainability.
I've found a critical issue in the root Cargo.toml that would prevent publishing to crates.io. Additionally, there are numerous linting violations due to the use of .unwrap() and .expect(). Several key features, such as repeating group support and full schema generation, are currently stubbed out and should be implemented or documented as limitations. Addressing these points will significantly improve the robustness and completeness of this new feature.
```diff
 authors = ["cognitive-glitch", "Rusty Trading Team", "Filippo Neysofu Costa"]
 version = "0.7.4"
 edition = "2024"
 homepage = "https://github.com/rusty-trading/rusty-fix-engine"
 repository = "https://github.com/rusty-trading/rusty-fix-engine"
 description = "FIX & FAST (FIX Adapted for STreaming) in pure Rust"
 publish = true
 readme = "README.md"
-keywords = ["fix", "fast", "protocol", "finance", "fintech"]
-categories = ["network-programming", "parser-implementations", "encoding"]
+keywords = [
+    "fix",
+    "fix-protocol",
+    "quickfix",
+    "fix-engine",
+    "fix-parser",
+    "fast",
+    "protocol",
+    "trading",
+    "finance",
+    "fintech",
+    "hft",
+]
+categories = [
+    "network-programming",
+    "parser-implementations",
+    "encoding",
+    "fix-engine",
+    "fix-parser",
+    "fix-protocol",
+    "quickfix",
+    "fast",
+    "trading",
+    "protocol",
+    "hft",
+    "high-frequency-trading",
+]
```
```rust
#[test]
fn test_basic_encoding_decoding() {
    let dict = Arc::new(Dictionary::fix44().unwrap());
```
The use of .unwrap() is denied by the workspace linting rules (clippy::unwrap_used). Handle the Result properly. For a test prerequisite like this, unwrap_or_else with a descriptive panic message is a good alternative.
```diff
-    let dict = Arc::new(Dictionary::fix44().unwrap());
+    let dict = Arc::new(Dictionary::fix44().unwrap_or_else(|e| panic!("Failed to load FIX 4.4 dictionary for test: {e}")));
```
```rust
FixMessageType::from_str("D")
    .expect("Failed to parse valid message type 'D'")
```
The use of .expect() is denied by the workspace linting rules (clippy::expect_used). Handle the Result properly, for example by using unwrap_or_else to panic with a more descriptive message for this test setup.
```diff
 FixMessageType::from_str("D")
-    .expect("Failed to parse valid message type 'D'")
+    .unwrap_or_else(|| panic!("Failed to parse valid message type 'D'"))
```
```rust
let dict =
    Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary for test"));
```
The use of .expect() is denied by the workspace linting rules (clippy::expect_used). Handle the Result properly, for example by using unwrap_or_else to panic with a more descriptive message for this test setup.
```rust
let dict = Arc::new(
    Dictionary::fix44().unwrap_or_else(|e| panic!("Failed to load FIX 4.4 dictionary for test: {e}")),
);
```
crates/rustyasn/src/decoder.rs
Outdated
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, use DER as fallback
    der_decode::<FixMessage>(data)
        .map_err(|e| Error::Decode(DecodeError::Internal(e.to_string())))
}
```
The decoder silently falls back to DER when PER encoding is specified. This is inconsistent with the encoder, which correctly returns an UnsupportedEncodingRule error. This inconsistency can lead to unexpected behavior. The decoder should also return an error for unsupported encoding rules.
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    Err(Error::Decode(DecodeError::UnsupportedEncodingRule(
        self.config.encoding_rule.name(),
    )))
}
```

```rust
fn estimate_size(&self) -> usize {
    // Basic estimation: header + fields
    100 + self.message.fields.len() * 20
}
```
crates/rustyasn/README.md
Outdated
```rust
    1,         // MsgSeqNum
    timestamp, // SendingTime
);
```
```rust
fn group(
    &self,
    _field: u32,
) -> Result<Self::Group, FieldValueError<<usize as FieldType>::Error>> {
    // For simplicity, create an empty group
    // In a full implementation, this would parse repeating groups
    Ok(MessageGroup::new(vec![]))
}

fn group_opt(&self, _field: u32) -> Result<Option<Self::Group>, <usize as FieldType>::Error> {
    // For simplicity, return None
    // In a full implementation, this would parse repeating groups
    Ok(None)
}
```
crates/rustyasn/src/schema.rs
Outdated
```rust
// TODO: Add all field definitions from dictionary
// This would normally iterate through dictionary.fields()
}
```
crates/rustyasn/Cargo.toml
Outdated
```toml
name = "asn1_encodings"
harness = false
```
- Remove PER, APER, UPER variants from EncodingRule enum
- Update encoder and decoder to only support BER, DER, OER
- Change low_latency() config to use OER instead of PER
- Update all tests and method implementations to expect OER
- Fix benchmark to use std::hint::black_box and .expect() instead of .unwrap()
- This makes the API honest about what encoding rules are actually supported

Addresses AI code review feedback about misleading API surface.
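The "honest API" idea in this commit can be sketched as follows. Only the variant set (BER, DER, OER kept; PER/APER/UPER removed) and the `name()` method come from the thread; the rest of the shape is assumed for illustration:

```rust
// Sketch only: with the unsupported variants removed, every match on the
// enum is exhaustive, so no silent-fallback arm can exist and adding a new
// rule without implementing it becomes a compile error.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum EncodingRule {
    Ber,
    Der,
    Oer,
}

impl EncodingRule {
    fn name(self) -> &'static str {
        match self {
            EncodingRule::Ber => "BER",
            EncodingRule::Der => "DER",
            EncodingRule::Oer => "OER",
        }
    }
}

fn main() {
    // A hypothetical low_latency() preset would now default to OER, not PER.
    let rule = EncodingRule::Oer;
    assert_eq!(rule.name(), "OER");
}
```

Shrinking the enum pushes the "unsupported rule" failure from runtime to compile time, which is usually preferable to an error variant for rules the crate never intends to support.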
/gemini review
Code Review
The pull request includes changes to the .clippy.toml, CLAUDE.md, Cargo.toml, README.md, and TODO.md files. It also adds several new files related to ASN.1 support, including a build script, cargo file, readme, benchmark, and examples. The changes aim to add ASN.1 support and improve code quality.
- Enhanced documentation: Added @docs/fix-specs/ reference to the guidelines for implementing APIs and designing exchange abstractions.
- Introduced a new GitHub Actions workflow for Rust CI, which includes steps for building, testing, and checking code quality with clippy and formatting checks.

These changes improve the clarity of development practices and ensure consistent code quality through automated checks.
/gemini review
- Verified all 7 pending AI code review tasks are already implemented or resolved
- Updated TODO.md to reflect current state accurately
- All tasks marked as RESOLVED with verification details
- No actual code changes needed - existing implementation is correct

Tasks verified:
1. is_plausible_start_tag - Already has proper ASN.1 validation
2. map_to_dyn_error - Helper function already implemented
3. Redundant comment - No redundancy found
4. #[allow(dead_code)] - No dead code attributes found
5. Hardcoded fallback tags - No hardcoded values found
6. Hardcoded common tags - Already configurable
7. is_valid_asn1_tag - Function doesn't exist (resolved by removal)
Code Review
This pull request introduces ASN.1 support via the rustyasn crate, along with smaller improvements and refactorings. The rustyasn crate includes a custom ASN.1 parser to ensure dependency compatibility. The review focuses on correctness and maintainability, identifying areas for improvement in error handling and code simplification.
```rust
// TODO: Implement proper repeating group parsing
// This requires:
// 1. Look up the group schema from a dictionary/schema
// 2. Parse the fields in order based on the group definition
// 3. Create proper Message instances for each group entry

Err(Asn1FieldError::GroupParsingUnsupported {
    tag: count_tag,
    count,
})
}
```
The current implementation of parse_group_entries returns an error, but the public-facing group() and group_opt() methods that call it swallow this specific error and return a misleading FieldValueError::Invalid(InvalidInt). This suggests to the caller that the group count field has an invalid value, which is incorrect and makes debugging difficult.
Since this feature is unimplemented, it would be better to make this explicit by using todo!(). This will cause a panic with a clear "not yet implemented" message when group functionality is used, which is the standard and expected behavior for todo!() and avoids returning a misleading error.
```rust
fn parse_group_entries(
    &self,
    _count_tag: u32,
    _count: usize,
) -> Result<Vec<Message>, Asn1FieldError> {
    // This feature is not yet implemented. Using `todo!` to make it explicit.
    todo!("Repeating group parsing is not yet supported for ASN.1 messages.");
}
```

```rust
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S").is_ok() {
    return FixFieldValue::UtcTimestamp(s);
}
// Also check for timestamp with milliseconds
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.3f").is_ok() {
    return FixFieldValue::UtcTimestamp(s);
}
```
The logic for parsing UtcTimestamp can be simplified. The two separate checks for timestamps with and without fractional seconds can be combined into a single check using the %.f format specifier in chrono, which handles optional fractional seconds. This will make the code more concise and slightly more efficient.
A similar simplification can be applied to UtcTime parsing a few lines below.
```diff
-if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S").is_ok() {
-    return FixFieldValue::UtcTimestamp(s);
-}
-// Also check for timestamp with milliseconds
-if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.3f").is_ok() {
-    return FixFieldValue::UtcTimestamp(s);
-}
+if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.f").is_ok() {
+    return FixFieldValue::UtcTimestamp(s);
+}
```
```markdown
- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
  - Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
```
- Fix parsing error: 'Failed to parse type definition FixExtensions DEFINITIONS ::= BEGIN: Unsupported type: BEGIN'
- Add logic to skip ASN.1 module definition lines (MODULE DEFINITIONS ::= BEGIN)
- Prevent parser from trying to parse 'BEGIN' as a type definition
- Module definitions are now properly recognized and skipped during parsing
- Build now completes successfully without ASN.1 parsing errors

The parser was incorrectly trying to parse module definition syntax as type definitions. Added a check to skip lines containing 'DEFINITIONS ::= BEGIN' before type parsing.
…ion and schema compilation

- Cleaned up the build script by removing unnecessary println! statements that were commented out.
- This change improves code readability and maintains focus on active code without clutter from unused logging statements.
- No functional changes were made; the build process remains intact.
- Removed redundant keywords from the Cargo.toml file, streamlining the list to focus on essential terms: "fix", "fix-protocol", "fast", "trading", and "hft".
- This change enhances the clarity of the project's metadata without affecting its functionality or dependencies.
- The updated keywords better reflect the core aspects of the project, improving discoverability and relevance in package registries.
/gemini review
```markdown
- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
  - Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
```
Consider renaming is_valid_asn1_tag to is_plausible_start_tag for clarity, as the function only checks for plausible start tags rather than fully validating ASN.1 tags.
```markdown
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
  - Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
  - Optimized const fn usage (`is_standard_header_field` already const)
  - Replaced `format!` with static strings in tracing hot paths
  - Fixed SmallVec inline detection with `spilled()` method
- **Style & Cleanup**:
  - Added missing newlines to Cargo.toml and README.md files
```
- Updated `rasn` version from 0.18 to 0.27 in `rustyasn/Cargo.toml` for enhanced ASN.1 support.
- Removed version specification for `rustyfix-dictionary` and `rustyfix-codegen` to align with workspace dependency management.
- Updated `quick-xml` and `criterion` in `rustyfixml/Cargo.toml` to use workspace dependencies for consistency.
- Upgraded `prost` and `prost-types` versions from 0.13 to 0.14 in `rustygpb/Cargo.toml` to leverage the latest features and fixes.

These changes streamline dependency management and ensure compatibility across the project.
- Fix missing rasn::Decoder trait imports in feature-gated modules
- Fix ASN.1 tag validation to accept SEQUENCE tag (0x30)
- Correct is_plausible_start_tag() to properly validate constructed universal class tags
- Fix doc test to use rustyfix_dictionary instead of rustyfix
- All tests now pass successfully with both default and all features enabled
- Remove dependency on rustyfix main crate, use only rustyfix-dictionary
- Create local traits module with FieldType, Buffer, FieldMap, etc.
- Replace generic const buffer implementations with specific sized buffers
- Add missing rasn::Decoder imports for derive macros
- Fix clippy warnings: needless_pass_by_value in encoder.rs
- Apply cargo fmt to fix formatting issues across all files
- Add IO error variant to rustyfixml error types

All compilation tests pass with all features enabled.
Bug: Infinite recursion in `Buffer` trait implementations
The Buffer trait implementations for Vec<u8> and smallvec::SmallVec types contain infinite recursion. The extend_from_slice method calls itself instead of the inherent extend_from_slice method of the concrete types, leading to a stack overflow.
crates/rustyasn/src/traits.rs, lines 11 to 34 (commit e0a86a4)
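The fix direction for this report is to delegate through a fully qualified path, so the trait method unambiguously calls the concrete type's inherent method rather than relying on method-resolution order. A minimal sketch — the trait shape is assumed from the report, and the real `Buffer` trait in traits.rs has more methods:

```rust
// Minimal sketch of the fix direction; trait shape assumed from the report.
trait Buffer {
    fn extend_from_slice(&mut self, slice: &[u8]);
}

impl Buffer for Vec<u8> {
    fn extend_from_slice(&mut self, slice: &[u8]) {
        // Fully qualified call: this is guaranteed to invoke Vec's inherent
        // extend_from_slice, never this trait method itself. An unqualified
        // `self.extend_from_slice(slice)` leaves delegation up to method
        // resolution, which is where the reported recursion can creep in.
        Vec::extend_from_slice(self, slice);
    }
}

fn main() {
    let mut buf: Vec<u8> = Vec::new();
    // Call through the trait, as generic encoder code would.
    Buffer::extend_from_slice(&mut buf, b"8=FIX.4.4");
    assert_eq!(buf, b"8=FIX.4.4".to_vec());
}
```

The same pattern applies to the `SmallVec` impls: `smallvec::SmallVec::extend_from_slice(self, slice)` makes the delegation explicit.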
/gemini review
Code Review
This pull request introduces ASN.1 support and incorporates const functions to enhance performance. The changes include modifications to .clippy.toml, .cursorrules, AGENTS.md, CLAUDE.md, Cargo.toml, README.md, and TODO.md, along with the addition of new files related to ASN.1 implementation.
```markdown
- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
  - Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
  - Optimized const fn usage (`is_standard_header_field` already const)
  - Removed UTC timestamp length restrictions for variable fractional seconds
  - Replaced `format!` with static strings in tracing hot paths
  - Fixed SmallVec inline detection with `spilled()` method
- **Style & Cleanup**:
  - Added missing newlines to Cargo.toml and README.md files
```
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
- Add detailed documentation for ASN.1 overhead constant explaining components
- Replace hardcoded FIX tag values with named constants in encoder and message modules
- Make FIX header field tag constants public for reuse across modules
- Add comprehensive documentation for all public FIX tag constants with backticks
- Simplify verbose documentation in tracing.rs functions
- Improve code maintainability and self-documentation
- Apply automatic formatting improvements from pre-commit hooks

All tests pass (62 unit tests + 5 integration tests).
/gemini review
Code Review
The pull request includes changes to various configuration files, documentation, and adds ASN.1 encoding support. Most changes are minor updates to documentation and build configurations. The most significant change is the addition of ASN.1 support, which includes new crates (rustyasn) and modifications to the build process.
Pull Request Overview
This PR enhances error handling across integration tests by returning Result<(), Box<dyn std::error::Error>>, updates several Cargo.toml files to use workspace dependencies and bumps versions, and broadens error enums with new variants.
- Refactor tests in multiple crates to use `Result` and `map_err`, improving test ergonomics and error reporting.
- Update `Cargo.toml` for `rustygpb` and `rustyfixml` to workspace dependencies and bump `prost` to 0.14.
- Add an I/O error variant to `rustyfixml`'s `FixmlError`.
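The test refactor described above follows a standard Rust pattern: a test that returns `Result` lets `?` propagate the real error message instead of panicking through `unwrap`. A self-contained sketch, with function names invented for illustration (not the crate's API):

```rust
use std::error::Error;

// Illustrative stand-in for a fallible operation such as encode/decode.
fn parse_port(s: &str) -> Result<u16, std::num::ParseIntError> {
    s.parse()
}

// The pattern from the PR: the test returns Result, so `?` propagates the
// underlying error (with context added via map_err) instead of panicking.
fn test_parse_port() -> Result<(), Box<dyn Error>> {
    let port = parse_port("8080").map_err(|e| format!("parse failed: {e}"))?;
    assert_eq!(port, 8080);
    Ok(())
}

fn main() -> Result<(), Box<dyn Error>> {
    test_parse_port()
}
```

On failure, the test harness prints the propagated error (here prefixed with "parse failed:"), which is more informative than a bare `unwrap` panic location.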
Reviewed Changes
Copilot reviewed 43 out of 43 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| crates/rustysbe/src/lib.rs | Migrate integration tests to return Result and use map_to_dyn_error |
| crates/rustygpb/src/lib.rs | Refactor test to return Result and replace expect with map_err |
| crates/rustyfixs/src/lib.rs | Update TLS tests to return Result and map errors |
| crates/rustyfixml/src/error.rs | Add Io variant for std::io::Error |
| crates/rustygpb/Cargo.toml | Bump prost and prost-types to 0.14, switch criterion to workspace |
| crates/rustyfixml/Cargo.toml | Switch quick-xml, criterion, and tokio to workspace |
| crates/rustyasn/tests/integration_test.rs | New ASN.1 integration tests using Result |
```rust
assert_eq!(length, 40);

// Test validation
let (len, tid, sv) = SbeMessageHeader::validate_basic(&message).unwrap();
assert_eq!(len, message.len() as u32);
assert_eq!(tid, 123);
assert_eq!(sv, 5);
let (len, tid, sv) =
    SbeMessageHeader::validate_basic(&message).map_err(map_to_dyn_error)?;
assert_eq!(len, 40);
```
Hardcoding the expected message length as 40 may break if header layout changes. Consider asserting against message.len() (e.g. assert_eq!(length, message.len() as u32)) to keep the test in sync with the actual output.
```rust
let encoded = encoder.encode(&message).map_err(|e| {
    Box::<dyn std::error::Error>::from(format!("Encoding should work but failed: {e}"))
})?;
let decoded = decoder.decode(&encoded).map_err(|e| {
    Box::<dyn std::error::Error>::from(format!("Decoding should work but failed: {e}"))
})?;
```
[nitpick] The error mapping closure used here and for decoding is duplicated. Consider introducing a test-level helper like fn map_to_dyn_error<E: std::error::Error + 'static>(e: E) -> Box<dyn std::error::Error> or a ResultExt trait to reduce repetition and improve readability.
```diff
-let encoded = encoder.encode(&message).map_err(|e| {
-    Box::<dyn std::error::Error>::from(format!("Encoding should work but failed: {e}"))
-})?;
-let decoded = decoder.decode(&encoded).map_err(|e| {
-    Box::<dyn std::error::Error>::from(format!("Decoding should work but failed: {e}"))
-})?;
+let encoded = encoder.encode(&message).map_err(map_to_dyn_error("Encoding should work but failed"))?;
+let decoded = decoder.decode(&encoded).map_err(map_to_dyn_error("Decoding should work but failed"))?;
```
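One way to implement the deduplication suggested here is a helper that takes a context string and returns a closure suitable for `map_err`. The name `map_to_dyn_error` appears in this thread, but this particular signature is an assumption (elsewhere in the PR the helper is also used without a context argument):

```rust
use std::error::Error;
use std::fmt::Display;

// Hypothetical helper: given a context prefix, build a map_err closure that
// boxes any displayable error into Box<dyn Error> with that prefix attached.
fn map_to_dyn_error<E: Display>(context: &'static str) -> impl Fn(E) -> Box<dyn Error> {
    move |e| Box::<dyn Error>::from(format!("{context}: {e}"))
}

fn main() {
    // Simulate a fallible call, standing in for encoder.encode(&message).
    let result: Result<Vec<u8>, std::io::Error> =
        Err(std::io::Error::new(std::io::ErrorKind::Other, "boom"));
    let err = result
        .map_err(map_to_dyn_error("Encoding should work but failed"))
        .unwrap_err();
    assert_eq!(err.to_string(), "Encoding should work but failed: boom");
}
```

Because the helper is generic over any `Display` error, the same one-liner works for both the encoder's and the decoder's error types.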
```rust
}

#[test]
fn test_message_size_limits() {
```
This test panics on failure but other tests use Result<(), Box<dyn std::error::Error>>. For consistency and better error reporting, consider returning Result<(), Box<dyn std::error::Error>> here as well and using ? on fallible operations.
```diff
-fn test_message_size_limits() {
+fn test_message_size_limits() -> Result<(), Box<dyn std::error::Error>> {
```
- Add clippy::panic to allow directives in test modules for:
  - src/decoder.rs
  - src/schema.rs
  - src/types.rs

All critical clippy errors (expect_used, unwrap_used, panic) are now resolved. Compilation successful with only non-critical warnings remaining.
No description provided.