
feat: asn support / use const functions #3

Open
metaphorics wants to merge 53 commits into main from feat/asn-support

Conversation

@metaphorics

No description provided.

…ed discoverability

- Expanded the keywords list to include more specific terms related to FIX protocol and trading
- Added new categories to better classify the project within the ecosystem
- Aims to enhance visibility and relevance in package registries
- Incremented version from 0.7.3 to 0.7.4 to reflect the latest changes
- Maintains consistency with workspace versioning practices
- Prepares for upcoming release
Copilot AI review requested due to automatic review settings July 14, 2025 10:33

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @cognitive-glitch, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request, despite its title suggesting ASN support, primarily focuses on updating project metadata and internal documentation. It includes a version increment and an enhancement of the project's discoverability on package registries through expanded keywords and categories. Additionally, a minor update to internal development guidelines has been applied.

Highlights

  • Version Increment: The project's version has been updated from 0.7.3 to 0.7.4 in Cargo.toml, signaling a new patch release.
  • Package Metadata Enhancement: The Cargo.toml file has been significantly updated to include a more comprehensive and specific set of keywords and categories. This aims to improve the discoverability and accurate categorization of the rusty-fix-engine crate on package registries, adding terms like fix-protocol, quickfix, fix-engine, and trading.
  • Internal Documentation Update: A minor adjustment was made to the CLAUDE.md guideline document, adding a reference to @docs/fix-specs/ alongside @docs/external-libraries/ for developers to consult when implementing APIs or designing exchange abstractions.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

  • Code Review: /gemini review performs a code review for the current pull request in its current state.
  • Pull Request Summary: /gemini summary provides a summary of the current pull request in its current state.
  • Comment: @gemini-code-assist responds in comments when explicitly tagged, both in issue comments and review comments.
  • Help: /gemini help displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

This comment was marked as outdated.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the workspace version and expands the package metadata in Cargo.toml. While the intent to improve discoverability is good, the changes to keywords and categories violate crates.io manifest policies and will prevent the package from being published. I've provided a critical comment with a suggested fix to align with the requirements.

Cargo.toml Outdated
Comment on lines 23 to 44
```toml
keywords = [
    "fix",
    "fix-protocol",
    "quickfix",
    "fix-engine",
    "fix-parser",
    "fast",
    "protocol",
    "trading",
    "finance",
    "fintech",
]
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "fix-engine",
    "fix-parser",
    "fix-protocol",
    "quickfix",
    "fast",
]
```


critical

The Cargo.toml manifest has some issues that will prevent publishing to crates.io.

  1. Number of Keywords: The keywords array is limited to a maximum of 5 entries. The current change increases this to 10.
  2. Number of Categories: The categories array is also limited to a maximum of 5 entries. The current change increases this to 8.
  3. Invalid Categories: The categories fix-engine, fix-parser, fix-protocol, quickfix, and fast are not valid category slugs on crates.io. You must choose from the official list of categories.

These issues are critical and will cause cargo publish to fail. I've suggested a valid set of keywords and categories below, but feel free to adjust them as you see fit while respecting the constraints.

```toml
keywords = [
    "fix-protocol",
    "fix-engine",
    "fast",
    "trading",
    "fintech",
]
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "finance",
]
```

…mpilation

- Implement comprehensive build script for ASN.1 schema generation from FIX dictionaries
- Generate type-safe ASN.1 enums for FIX message types and field tags
- Create ASN.1 struct definitions for FIX messages with proper field mapping
- Handle field value enumerations with automatic Rust identifier generation
- Add proper identifier handling for numeric values and invalid characters
- Integrate generated code with main library through generated.rs module
- Add fix40 and fix50 feature flags for conditional compilation
- Create schemas/ directory with sample ASN.1 schema for extensions
- Support multiple FIX versions (4.0, 4.4, 5.0 SP2) with feature-based generation
- Implement conversion methods between simple types and ASN.1 representations

Build script generates:
- FixMessageType enum with all FIX message types
- FixFieldTag enum with all field tags and conversions
- Asn1FixMessage struct for ASN.1 message representation
- Asn1Field struct for individual field representation
- Field value enums for restricted field sets
- Conversion implementations for interoperability

The build script successfully compiles and generates working ASN.1 definitions
from FIX dictionaries at build time, enabling type-safe ASN.1 encoding/decoding.
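To make the commit's build-time code-generation approach concrete, here is a minimal, self-contained sketch of turning a list of FIX message types into a Rust enum as source text. The `generate_msg_type_enum` helper and the two-entry type list are illustrative assumptions, not the crate's actual build.rs, which reads its input from the FIX dictionaries.

```rust
// Build-script sketch: emit a Rust enum from (name, MsgType) pairs.
// A real build script would read these from the FIX dictionary crate
// and write the result to $OUT_DIR/generated.rs.
use std::fmt::Write;

fn generate_msg_type_enum(types: &[(&str, &str)]) -> String {
    let mut out = String::new();
    out.push_str("#[derive(Debug, Clone, Copy, PartialEq, Eq)]\n");
    out.push_str("pub enum FixMessageType {\n");
    for (name, _msg_type) in types {
        // Each dictionary entry becomes one enum variant.
        writeln!(out, "    {name},").expect("writing to a String cannot fail");
    }
    out.push_str("}\n");
    out
}

fn main() {
    let types = [("Heartbeat", "0"), ("NewOrderSingle", "D")];
    let code = generate_msg_type_enum(&types);
    println!("{code}");
    assert!(code.contains("NewOrderSingle,"));
}
```

The same pattern extends to field-tag enums and conversion impls: each generator appends source text, and the build script concatenates the pieces into one generated module.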
@metaphorics
Author

/gemini review

@metaphorics metaphorics requested a review from Copilot July 14, 2025 10:41

This comment was marked as outdated.

cursor[bot]

This comment was marked as outdated.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a new crate rustyasn for ASN.1 encoding support, which is a significant and valuable feature addition. The implementation is well-structured, including encoders, decoders, configuration options, and a build script for code generation from FIX dictionaries. It also comes with a good set of benchmarks and integration tests.

My review has identified a few high-severity issues that should be addressed before merging:

  • The workspace Cargo.toml defines more categories than allowed by crates.io, which will block publishing.
  • The rustyasn encoder and decoder advertise support for PER encoding but fall back to DER. This discrepancy could lead to unexpected behavior and interoperability issues.
  • The Encoder struct in rustyasn contains unused fields (buffer and schema), which should be removed to clean up the code and eliminate unnecessary overhead.

I've also made several medium-severity suggestions to improve robustness and align with the project's coding guidelines, such as replacing .unwrap() with .expect() and making documentation examples more user-friendly.

Overall, this is a solid contribution that will be even better once these issues are addressed.

Comment on lines 36 to 49
```toml
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "fix-engine",
    "fix-parser",
    "fix-protocol",
    "quickfix",
    "fast",
    "trading",
    "protocol",
    "hft",
    "high-frequency-trading",
]
```


high

The categories list for the workspace contains more than 5 items. crates.io has a limit of 5 categories per crate. Any crate in the workspace that inherits this setting (like the new rustyasn crate) will fail to publish. Please reduce the number of categories to at most 5.

```toml
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "finance",
    "fintech"
]
```

Comment on lines 149 to 153
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, use DER as fallback
    der_decode::<FixMessage>(data)
        .map_err(|e| Error::Decode(DecodeError::Internal(e.to_string())))
}
```


high

The implementation for PER, APER, and UPER encoding rules falls back to DER decoding. However, the README.md and config.rs files present these as supported encoding rules. This discrepancy is misleading for users who might choose PER for its compactness and get DER-encoded data instead, which could lead to interoperability issues. If PER is not yet supported, this function should return an UnsupportedEncodingRule error.

```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    Err(Error::Decode(DecodeError::UnsupportedEncodingRule("PER/APER/UPER")))
}
```

Comment on lines 139 to 142
```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, use DER as fallback
    der_encode(message).map_err(|e| Error::Encode(EncodeError::Internal(e.to_string())))
}
```


high

Similar to the decoder, the implementation for PER, APER, and UPER encoding rules falls back to DER encoding. This is misleading as users expecting PER's compact format will get DER instead. If PER is not supported, this should return an UnsupportedEncodingRule error to avoid silent failures and interoperability issues.

```rust
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
    // PER not available in this version, return an error.
    Err(Error::Encode(EncodeError::UnsupportedEncodingRule {
        rule: "PER/APER/UPER",
        msg_type: message.msg_type.clone().into(),
    }))
}
```


```toml
[[bench]]
name = "asn1_encodings"
harness = false
```
(no newline at end of file)


medium

This file is missing a final newline character. It's best practice to add one for consistency and to prevent potential issues with some tools.

harness = false


```rust
// Encode a message
let mut handle = encoder.start_message(
    "D", // MsgType: NewOrderSingle
```


medium

The code example for basic encoding/decoding is not self-contained as timestamp is not defined. This will cause confusion for users trying to run the example. Please either define the variable or mark the code block with ,ignore to indicate it's illustrative.

Suggested change
"D", // MsgType: NewOrderSingle
```rust,ignore

```rust
}

fn benchmark_encoding(c: &mut Criterion) {
    let dict = Arc::new(Dictionary::fix44().unwrap());
```


medium

The project guidelines in CLAUDE.md discourage the use of .unwrap(). Please use .expect() with a descriptive message instead to provide more context on panic, which is especially helpful for debugging failing benchmarks.

Suggested change
let dict = Arc::new(Dictionary::fix44().unwrap());
let dict = Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary"));

Comment on lines 123 to 143
```rust
fn benchmark_message_sizes(c: &mut Criterion) {
    let dict = Arc::new(Dictionary::fix44().unwrap());
    let group = c.benchmark_group("message_sizes");

    // Compare encoded sizes
    let encoding_rules = [
        ("BER", EncodingRule::BER),
        ("DER", EncodingRule::DER),
        ("OER", EncodingRule::OER),
    ];

    for (name, rule) in encoding_rules {
        let config = Config::new(rule);
        let encoder = Encoder::new(config, dict.clone());
        let encoded = create_test_message(&encoder, 1);

        println!("{} encoded size: {} bytes", name, encoded.len());
    }

    group.finish();
}
```


medium

The benchmark_message_sizes function prints encoded sizes to stdout but doesn't measure performance using criterion. If the intent is just to display sizes, consider moving this to a test or an example. If it's meant to be a benchmark, it should use b.iter(...) to measure something.

@@ -0,0 +1,534 @@
```rust
//! Build script for ASN.1 schema compilation and code generation.

use anyhow::Result;
```


medium

The project guidelines in CLAUDE.md state a preference for thiserror over anyhow for structured error handling. While anyhow is convenient in build scripts, adhering to the project's standard would improve consistency. Please consider refactoring to use thiserror or a custom error type for better error handling.

```rust
// For now, just copy the schema files to OUT_DIR
// In a full implementation, you would parse and compile them
let out_dir = env::var("OUT_DIR")?;
let filename = schema_file.file_name().unwrap();
```


medium

The project guidelines discourage .unwrap(). Using it here could cause the build script to panic if a path generated by glob somehow doesn't have a filename. Please use .expect() for robustness and to provide a more helpful error message if a panic does occur.

Suggested change
let filename = schema_file.file_name().unwrap();
let filename = schema_file.file_name().expect("glob entry should have a filename");


```rust
#[test]
fn test_basic_encoding_decoding() {
    let dict = Arc::new(Dictionary::fix44().unwrap());
```


medium

The project guidelines in CLAUDE.md discourage the use of .unwrap(). Please use .expect() with a descriptive message to provide more context on panic during tests.

Suggested change
let dict = Arc::new(Dictionary::fix44().unwrap());
let dict = Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary"));

- Implement SetField trait for EncoderHandle with SmallVec buffer
- Add GetConfig trait implementations for Encoder, Decoder, DecoderStreaming
- Enhance field processing with common FIX tag iteration
- Fix Buffer trait usage and type annotations in tests
- Update dependency from fix44 to fix50 feature
- Resolve to_string method ambiguity and GroupEntries implementation
- Apply clippy fixes for explicit counter loops and unwrap usage
- Fix formatting and linting issues

All 33 tests passing. ASN.1 encoding now seamlessly integrates with
RustyFix ecosystem for encoding/decoding FIX messages.
- Add features to build-dependencies for rustyfix-dictionary
- Update feature flags to properly enable corresponding rustyfix-dictionary features
- Fix build script to use Dictionary::fix50() instead of Dictionary::fix50sp2()
- Replace .unwrap() with .expect() for better error messages
- Fix doc_cfg warning by using docsrs instead

This resolves the compilation errors where Dictionary methods were not available
in build scripts due to missing feature flags.
Previously PER/APER/UPER encoding rules would silently fall back to DER
encoding, which could lead to unexpected behavior. Now they properly return
an UnsupportedEncodingRule error indicating that PER variants are not
supported by the rasn crate.

This ensures users are aware when their requested encoding rule cannot be
used, rather than getting an unexpected encoding format.
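The commit's "return an error instead of silently falling back" pattern can be sketched with a std-only error type. The `EncodeError` shape and the `encode_with` helper below are illustrative assumptions; the real crate has its own `Error`/`EncodeError` hierarchy and dispatches on the `EncodingRule` enum rather than a string.

```rust
use std::fmt;

// Hypothetical minimal error type: one variant naming the unsupported rule.
#[derive(Debug)]
pub enum EncodeError {
    UnsupportedEncodingRule { rule: &'static str },
}

impl fmt::Display for EncodeError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            EncodeError::UnsupportedEncodingRule { rule } => {
                write!(f, "encoding rule {rule} is not supported by this build")
            }
        }
    }
}

impl std::error::Error for EncodeError {}

// Sketch of the dispatch: supported rules encode, unsupported rules error
// out loudly instead of silently producing DER.
fn encode_with(rule: &'static str) -> Result<Vec<u8>, EncodeError> {
    match rule {
        "BER" | "DER" | "OER" => Ok(vec![0x30, 0x00]), // placeholder encoding
        _ => Err(EncodeError::UnsupportedEncodingRule { rule }),
    }
}

fn main() {
    assert!(encode_with("DER").is_ok());
    let err = encode_with("PER").unwrap_err();
    println!("{err}");
}
```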
@metaphorics metaphorics requested a review from Copilot July 14, 2025 11:18
- Replace 21 .unwrap() calls in message.rs with descriptive .expect() messages
- Replace 2 .unwrap() calls in encoder.rs with context-specific .expect() messages  
- Replace 13 .unwrap() calls in field_types.rs with field type specific .expect() messages
- Replace 1 .unwrap() call in config.rs with configuration context .expect()
- Replace 5 .unwrap() calls in schema.rs with schema building context .expect() messages
- Replace 1 .unwrap() call in decoder.rs with decoding context .expect()
- Replace 3 .unwrap() calls in generated.rs with test context .expect() messages

All .expect() calls now include descriptive error messages explaining what should not fail and why.
This improves error debugging and follows Rust best practices per project guidelines.
@metaphorics metaphorics requested review from Copilot and removed request for Copilot July 14, 2025 11:20
@metaphorics
Author

/gemini review


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This PR introduces a new crate rustyasn for ASN.1 support, which is a great addition. The overall structure is good, with clear separation of concerns into modules for encoding, decoding, configuration, etc. The use of a build script for code generation is also a solid approach.

However, there are several significant architectural issues and limitations in the current implementation that need to be addressed:

  • Incomplete Feature Implementation: Several key features are either missing or incomplete, such as support for repeating groups, a complete schema implementation (it's currently hardcoded for a few messages), and support for PER/APER/UPER encoding rules.
  • Architectural Disconnect: There seems to be a disconnect between the code generated by build.rs and the runtime logic. The encoder/decoder rely on manually defined message structures and schemas, while the more detailed generated code is not fully utilized. This makes the crate confusing and hard to extend.
  • Misleading Public API: The public API (e.g., EncodingRule enum, encode_message function) suggests more functionality than is actually implemented, which can be misleading for users.

I've left more detailed comments on specific parts of the code. I recommend focusing on creating a solid, unified architecture before adding more features. A good starting point would be to make the encoder/decoder fully utilize the code generated from the FIX dictionaries.

Comment on lines 149 to 159
```rust
/// Builds field type information from dictionary.
fn build_field_types(&mut self) {
    // Standard header fields
    self.add_header_fields();

    // Standard trailer fields
    self.add_trailer_fields();

    // TODO: Add all field definitions from dictionary
    // This would normally iterate through dictionary.fields()
}
```


critical

The build_field_types function has a TODO and only adds a few hardcoded header and trailer fields. Similarly, build_message_schemas only adds a few hardcoded messages. This means the schema is incomplete and doesn't use the full FIX dictionary.

This is a critical architectural issue. The crate has a build.rs that seems to generate code from dictionaries, but the runtime logic uses this separate, incomplete, manual schema. The generated code should be the source of truth for the schema information at runtime to ensure correctness and completeness.

Comment on lines 18 to 26
```rust
/// Packed Encoding Rules - Compact, bit-oriented format
PER,
/// Aligned Packed Encoding Rules - PER with alignment
APER,
/// Unaligned Packed Encoding Rules - PER without alignment
UPER,
/// Octet Encoding Rules - Byte-aligned, efficient format
OER,
}
```


high

The EncodingRule enum includes PER, APER, and UPER, but these are not actually supported by the encoder or decoder. The encoder returns an UnsupportedEncodingRule error, and the decoder falls back to DER.

This is misleading to users of the crate. If these encoding rules are not supported, they should be removed from the public EncodingRule enum. Alternatively, if support is planned, the current implementation should be documented as a stub and the functions should clearly state the lack of support.

I suggest removing the unsupported variants for now to provide a clear and honest API.
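The "honest API" suggestion could look like the sketch below: only the implemented rules appear as variants, and `#[non_exhaustive]` reserves room to add the PER family later without a breaking change. This is an assumed shape, not the crate's current definition.

```rust
// Sketch: public enum restricted to rules the crate actually implements.
// #[non_exhaustive] forces downstream matches to include a wildcard arm,
// so adding PER/APER/UPER later is not a semver break.
#[non_exhaustive]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum EncodingRule {
    /// Basic Encoding Rules
    BER,
    /// Distinguished Encoding Rules
    DER,
    /// Octet Encoding Rules
    OER,
}

fn main() {
    let rule = EncodingRule::DER;
    assert_eq!(rule, EncodingRule::DER);
    println!("{rule:?}");
}
```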

Comment on lines 216 to 229
```rust
fn group(
    &self,
    _field: u32,
) -> Result<Self::Group, FieldValueError<<usize as FieldType>::Error>> {
    // For simplicity, create an empty group
    // In a full implementation, this would parse repeating groups
    Ok(MessageGroup::new(vec![]))
}

fn group_opt(&self, _field: u32) -> Result<Option<Self::Group>, <usize as FieldType>::Error> {
    // For simplicity, return None
    // In a full implementation, this would parse repeating groups
    Ok(None)
}
```


high

The group and group_opt methods for FieldMap are implemented as stubs that always return an empty or None group. This means repeating groups, a fundamental part of FIX, are not supported.

This is a major limitation and should be clearly documented in the crate's README.md. For the crate to be considered for production use, this functionality will need to be implemented.
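For context on what a real implementation needs: a FIX repeating group is a count field (e.g. tag 453, NoPartyIDs) followed by that many entries, each starting at a fixed delimiter tag (e.g. 448, PartyID). A simplified sketch over a flat tag list, not the crate's `FieldMap` (the count value is not validated here):

```rust
// Split the tags following `count_tag` into group entries, starting a new
// entry each time `delimiter_tag` is seen. Flat-list model for illustration.
fn parse_group(
    fields: &[(u16, String)],
    count_tag: u16,
    delimiter_tag: u16,
) -> Vec<Vec<(u16, String)>> {
    let mut entries = Vec::new();
    let mut current: Option<Vec<(u16, String)>> = None;
    let mut in_group = false;
    for (tag, value) in fields {
        if *tag == count_tag {
            in_group = true; // NoXXX field marks the start of the group
            continue;
        }
        if !in_group {
            continue;
        }
        if *tag == delimiter_tag {
            // Delimiter tag begins a new entry; flush the previous one.
            if let Some(entry) = current.take() {
                entries.push(entry);
            }
            current = Some(vec![(*tag, value.clone())]);
        } else if let Some(entry) = current.as_mut() {
            entry.push((*tag, value.clone()));
        }
    }
    if let Some(entry) = current {
        entries.push(entry);
    }
    entries
}

fn main() {
    let fields = vec![
        (453, "2".into()),
        (448, "BROKER1".into()),
        (447, "D".into()),
        (448, "BROKER2".into()),
        (447, "D".into()),
    ];
    let groups = parse_group(&fields, 453, 448);
    assert_eq!(groups.len(), 2);
    assert_eq!(groups[0][0].1, "BROKER1");
}
```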

Comment on lines 132 to 148
```rust
fn add_message_fields<F: FieldMap<u32>>(
    &self,
    handle: &mut EncoderHandle,
    msg: &F,
) -> Result<()> {
    // Note: FieldMap doesn't provide field iteration, so we try common field tags
    // In a full implementation, this would use a field iterator or schema
    let common_tags = [55, 54, 38, 44, 114, 60]; // Symbol, Side, OrderQty, Price, etc.

    for &tag in &common_tags {
        if let Some(raw_data) = msg.get_raw(tag) {
            let value_str = String::from_utf8_lossy(raw_data);
            handle.add_field(tag as u16, value_str.to_string());
        }
    }

    Ok(())
```

high

The implementation of add_message_fields is a stub that only handles a few hardcoded tags. This makes the encode_message function not generally useful. The comment // Note: FieldMap doesn't provide field iteration, so we try common field tags highlights the issue.

To make this function useful, it needs a way to iterate over all fields in the FieldMap. If the FieldMap trait doesn't support iteration, you might need to rethink the integration approach or document this severe limitation prominently. A possible solution could be to require a FieldMap that also implements IntoIterator.
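The iteration requirement can be sketched as follows. `SimpleFieldMap` is a hypothetical stand-in (the real `FieldMap` trait comes from rustyfix and may not expose iteration); the point is that once the map yields every `(tag, raw value)` pair, the encoder no longer needs a hardcoded tag list.

```rust
use std::collections::BTreeMap;

// Hypothetical field map backed by a BTreeMap, giving deterministic
// tag order for the example.
struct SimpleFieldMap(BTreeMap<u32, Vec<u8>>);

impl SimpleFieldMap {
    // The capability add_message_fields is missing: visit every field.
    fn iter(&self) -> impl Iterator<Item = (u32, &[u8])> {
        self.0.iter().map(|(k, v)| (*k, v.as_slice()))
    }
}

// Encode-all-fields sketch: no common_tags allowlist needed.
fn encode_all_fields(msg: &SimpleFieldMap) -> Vec<(u16, String)> {
    msg.iter()
        .map(|(tag, raw)| (tag as u16, String::from_utf8_lossy(raw).into_owned()))
        .collect()
}

fn main() {
    let mut map = BTreeMap::new();
    map.insert(55u32, b"AAPL".to_vec()); // Symbol
    map.insert(54u32, b"1".to_vec()); // Side
    let fields = encode_all_fields(&SimpleFieldMap(map));
    assert_eq!(fields, vec![(54, "1".to_string()), (55, "AAPL".to_string())]);
}
```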

```rust
sender_comp_id: msg.sender_comp_id.clone(),
target_comp_id: msg.target_comp_id.clone(),
msg_seq_num: msg.msg_seq_num,
sending_time: None, // TODO: Extract from fields if present
```


medium

There's a TODO here to extract the sending time from fields. This is a good note, but it's worth considering how to handle this more robustly. The Asn1FixMessage struct has a dedicated sending_time field, but the conversion from FixMessage doesn't populate it. This could lead to loss of information during message conversion.

A more complete implementation would be to iterate through the fields of the source FixMessage, find the SendingTime (tag 52), and populate the sending_time field in Asn1FixMessage. The field should then be excluded from the generic fields vector to avoid duplication.
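That suggestion amounts to a small partition over the field list, sketched below with a simplified `(tag, String)` representation rather than the crate's actual field types: pull out tag 52 for the dedicated slot and keep everything else for the generic `fields` vector.

```rust
// SendingTime is FIX tag 52.
const SENDING_TIME_TAG: u16 = 52;

// Split SendingTime out of the field list so it can populate the dedicated
// struct field without also appearing in the generic fields vector.
fn split_sending_time(fields: Vec<(u16, String)>) -> (Option<String>, Vec<(u16, String)>) {
    let mut sending_time = None;
    let mut rest = Vec::with_capacity(fields.len());
    for (tag, value) in fields {
        if tag == SENDING_TIME_TAG && sending_time.is_none() {
            sending_time = Some(value); // keep the first occurrence
        } else {
            rest.push((tag, value));
        }
    }
    (sending_time, rest)
}

fn main() {
    let fields = vec![
        (52, "20250714-10:33:00".to_string()),
        (55, "AAPL".to_string()),
    ];
    let (sending_time, rest) = split_sending_time(fields);
    assert_eq!(sending_time.as_deref(), Some("20250714-10:33:00"));
    assert_eq!(rest, vec![(55, "AAPL".to_string())]);
    println!("sending_time = {sending_time:?}");
}
```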

Comment on lines 42 to 43

// Encode a message


medium

The line `[x] Abstract Syntax Notation (ASN.1).` indicates ASN.1 support is fully implemented, but the following line, `[Working on validations]`, suggests that it is not. This is misleading and should be clarified.

Comment on lines 21 to 36
```rust
//! ```rust,ignore
//! use rustyasn::{Config, Encoder, Decoder, EncodingRule};
//! use rustyfix::Dictionary;
//!
//! // Configure encoding
//! let config = Config::new(EncodingRule::PER);
//! let dictionary = Dictionary::fix44();
//!
//! // Encode a message
//! let mut encoder = Encoder::new(config, dictionary);
//! let encoded = encoder.encode_message(msg)?;
//!
//! // Decode a message
//! let decoder = Decoder::new(config, dictionary);
//! let message = decoder.decode(&encoded)?;
//! ```
```


medium

The usage example in the crate-level documentation is marked as ignore and is incomplete. The msg variable is not defined, so the example is not self-contained and won't compile if a user tries to run it.

Please provide a complete, minimal, and working example to demonstrate the crate's usage.

```rust
//! ```rust,ignore
//! use rustyasn::{Config, Encoder, Decoder, EncodingRule};
//! use rustyfix::Dictionary;
//! use std::sync::Arc;
//!
//! // Configure encoding
//! let config = Config::new(EncodingRule::DER);
//! let dictionary = Arc::new(Dictionary::fix44().unwrap());
//!
//! // Encode a message
//! let encoder = Encoder::new(config.clone(), dictionary.clone());
//! let mut handle = encoder.start_message("A", "SENDER", "TARGET", 1);
//! handle.add_string(98, "0").add_int(108, 30);
//! let encoded = handle.encode().unwrap();
//!
//! // Decode a message
//! let decoder = Decoder::new(config, dictionary);
//! let message = decoder.decode(&encoded).unwrap();
//! assert_eq!(message.msg_type(), "A");
//! ```
```

```rust
/// Adds an entry to the group.
pub fn add_entry(&mut self, entry: Message) {
    self.entries.push(entry);
}
```


medium

The generate_message_structures function produces a generic ASN.1 message container, but it does not generate per-message-type ASN.1 structures, which limits the ability to create type-safe representations for individual FIX message types.

Comment on lines 44 to 50
```rust
let mut handle = encoder.start_message(
    "D",         // MsgType: NewOrderSingle
    "SENDER001", // SenderCompID
    "TARGET001", // TargetCompID
    1,           // MsgSeqNum
    timestamp,   // SendingTime
);
```


medium

The example for encoder.start_message is incorrect. The function signature in encoder.rs is pub fn start_message<'a>(&'a self, msg_type: &str, sender_comp_id: &str, target_comp_id: &str, msg_seq_num: u64) -> EncoderHandle<'a>, which does not take a timestamp argument. This example will not compile and will confuse users.

Suggested change
let mut handle = encoder.start_message(
"D", // MsgType: NewOrderSingle
"SENDER001", // SenderCompID
"TARGET001", // TargetCompID
1, // MsgSeqNum
timestamp, // SendingTime
);
let mut handle = encoder.start_message(
"D", // MsgType: NewOrderSingle
"SENDER001", // SenderCompID
"TARGET001", // TargetCompID
1, // MsgSeqNum
);


```toml
[dependencies]
# Core dependencies
rustyfix = { path = "../rustyfix", version = "0.7.4", features = ["fix50"] }
```


medium

The rustyfix dependency has a feature fix50 enabled, but the rustyfix-dictionary dependency does not have this feature enabled. This could lead to inconsistencies or compilation errors. Ensure that both dependencies have the same features enabled, or that the code handles the case where they don't.

```toml
rustyfix-dictionary = { path = "../rustyfix-dictionary", version = "0.7.4", features = ["fix50"] }
```

This comment was marked as outdated.

- Remove invalid categories like "fix-engine", "fix-parser", etc.
- Keep only valid crates.io categories: "network-programming", "parser-implementations", "encoding", "finance", "development-tools"
- This prevents publishing failures for crates that inherit workspace categories
@metaphorics metaphorics requested a review from Copilot July 14, 2025 11:22

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a new rustyasn crate to add ASN.1 encoding support. The structure is well-thought-out, with modules for encoding, decoding, and configuration. The build script for code generation is a solid approach for maintainability.

I've found a critical issue in the root Cargo.toml that would prevent publishing to crates.io. Additionally, there are numerous linting violations due to the use of .unwrap() and .expect(). Several key features, such as repeating group support and full schema generation, are currently stubbed out and should be implemented or documented as limitations. Addressing these points will significantly improve the robustness and completeness of this new feature.

Comment on lines 15 to 49
```toml
authors = ["cognitive-glitch", "Rusty Trading Team", "Filippo Neysofu Costa"]
version = "0.7.4"
edition = "2024"
homepage = "https://github.com/rusty-trading/rusty-fix-engine"
repository = "https://github.com/rusty-trading/rusty-fix-engine"
description = "FIX & FAST (FIX Adapted for STreaming) in pure Rust"
publish = true
readme = "README.md"
keywords = ["fix", "fast", "protocol", "finance", "fintech"]
categories = ["network-programming", "parser-implementations", "encoding"]
keywords = [
    "fix",
    "fix-protocol",
    "quickfix",
    "fix-engine",
    "fix-parser",
    "fast",
    "protocol",
    "trading",
    "finance",
    "fintech",
    "hft",
]
categories = [
    "network-programming",
    "parser-implementations",
    "encoding",
    "fix-engine",
    "fix-parser",
    "fix-protocol",
    "quickfix",
    "fast",
    "trading",
    "protocol",
    "hft",
    "high-frequency-trading",
]
```


critical

The keywords and categories lists in the workspace package configuration exceed the limits allowed by crates.io. keywords has 11 items (limit is 5), and categories has 12 items (limit is 5). This will cause an error when publishing. Please reduce the number of items in each list to a maximum of 5.


#[test]
fn test_basic_encoding_decoding() {
let dict = Arc::new(Dictionary::fix44().unwrap());


high

The use of .unwrap() is denied by the workspace linting rules (clippy::unwrap_used). Handle the Result properly. For a test prerequisite like this, unwrap_or_else with a descriptive panic message is a good alternative.

Suggested change
let dict = Arc::new(Dictionary::fix44().unwrap());
let dict = Arc::new(Dictionary::fix44().unwrap_or_else(|e| panic!("Failed to load FIX 4.4 dictionary for test: {e}")));

Comment on lines +34 to +35
FixMessageType::from_str("D")
.expect("Failed to parse valid message type 'D'")


high

The use of .expect() is denied by the workspace linting rules (clippy::expect_used). Handle the Result properly, for example by using unwrap_or_else to panic with a more descriptive message for this test setup.

Suggested change
FixMessageType::from_str("D")
.expect("Failed to parse valid message type 'D'")
FixMessageType::from_str("D")
.unwrap_or_else(|e| panic!("Failed to parse valid message type 'D': {e}"))

Comment on lines +363 to +364
let dict =
Arc::new(Dictionary::fix44().expect("Failed to load FIX 4.4 dictionary for test"));


high

The use of .expect() is denied by the workspace linting rules (clippy::expect_used). Handle the Result properly, for example by using unwrap_or_else to panic with a more descriptive message for this test setup.

        let dict = Arc::new(
            Dictionary::fix44().unwrap_or_else(|e| panic!("Failed to load FIX 4.4 dictionary for test: {e}")),
        );

Comment on lines 161 to 165
EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
// PER not available in this version, use DER as fallback
der_decode::<FixMessage>(data)
.map_err(|e| Error::Decode(DecodeError::Internal(e.to_string())))
}


high

The decoder silently falls back to DER when PER encoding is specified. This is inconsistent with the encoder, which correctly returns an UnsupportedEncodingRule error. This inconsistency can lead to unexpected behavior. The decoder should also return an error for unsupported encoding rules.

            EncodingRule::PER | EncodingRule::APER | EncodingRule::UPER => {
                Err(Error::Decode(DecodeError::UnsupportedEncodingRule(
                    self.config.encoding_rule.name(),
                )))
            }

Comment on lines 253 to 256
fn estimate_size(&self) -> usize {
// Basic estimation: header + fields
100 + self.message.fields.len() * 20
}


medium

The message size estimation is a very rough heuristic. This could lead to incorrect behavior. Consider either improving the estimation logic or encoding the message first and then checking its size against the limit for correctness.
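An encode-then-check sketch of the suggested approach (using a hypothetical stand-in for the real ASN.1 encoder, so the function names and field layout here are illustrative only):

```rust
// Hypothetical sketch: rather than guessing the encoded size up front,
// encode first and enforce the limit on the actual byte length.
// `encode_fields` stands in for the real ASN.1 encoder.
fn encode_fields(fields: &[(u32, String)]) -> Vec<u8> {
    let mut out = Vec::new();
    for (tag, value) in fields {
        out.extend_from_slice(&tag.to_be_bytes()); // 4 tag bytes
        out.extend_from_slice(value.as_bytes());   // raw value bytes
    }
    out
}

fn encode_with_limit(fields: &[(u32, String)], max_size: usize) -> Result<Vec<u8>, String> {
    let encoded = encode_fields(fields);
    if encoded.len() > max_size {
        return Err(format!(
            "encoded size {} exceeds limit {}",
            encoded.len(),
            max_size
        ));
    }
    Ok(encoded)
}

fn main() {
    let fields = vec![(35, "D".to_string()), (49, "SENDER".to_string())];
    // 4 + 1 bytes for tag 35, 4 + 6 bytes for tag 49 = 15 bytes total.
    let encoded = encode_with_limit(&fields, 1024).expect("small message fits");
    assert_eq!(encoded.len(), 15);
    assert!(encode_with_limit(&fields, 4).is_err());
    println!("{}", encoded.len());
}
```

This trades one extra allocation for correctness; a heuristic remains useful only as a pre-sizing hint for the output buffer, never as the enforcement check.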

Comment on lines 48 to 50
1, // MsgSeqNum
timestamp, // SendingTime
);


medium

The timestamp variable is not defined in the Basic Encoding/Decoding example, which will cause a compilation error. Define it with chrono::Utc::now() or provide a placeholder value.

Suggested change
1, // MsgSeqNum
timestamp, // SendingTime
);
1, // MsgSeqNum
chrono::Utc::now(), // SendingTime
);

Comment on lines 216 to 229
fn group(
&self,
_field: u32,
) -> Result<Self::Group, FieldValueError<<usize as FieldType>::Error>> {
// For simplicity, create an empty group
// In a full implementation, this would parse repeating groups
Ok(MessageGroup::new(vec![]))
}

fn group_opt(&self, _field: u32) -> Result<Option<Self::Group>, <usize as FieldType>::Error> {
// For simplicity, return None
// In a full implementation, this would parse repeating groups
Ok(None)
}


medium

The group and group_opt methods are currently stubbed and do not implement repeating group logic. This functionality should be implemented to fully support FIX messages with repeating groups. If this is planned for future work, document it in the crate's README as a current limitation.
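For reference, a self-contained sketch of what repeating-group parsing amounts to, independent of the real rustyfix types (the struct and tag layout below are illustrative; a full implementation would consult the group definition from the dictionary):

```rust
// Hypothetical flattened-field model: a group is a count field followed by
// `count` entries, each holding `fields_per_entry` (tag, value) pairs.
#[derive(Debug, PartialEq)]
struct GroupEntry {
    fields: Vec<(u32, String)>,
}

fn parse_group(
    count: usize,
    flat: &[(u32, String)],
    fields_per_entry: usize, // must be > 0; taken from the group schema
) -> Result<Vec<GroupEntry>, String> {
    if flat.len() != count * fields_per_entry {
        return Err(format!(
            "expected {} fields, got {}",
            count * fields_per_entry,
            flat.len()
        ));
    }
    Ok(flat
        .chunks(fields_per_entry)
        .map(|chunk| GroupEntry { fields: chunk.to_vec() })
        .collect())
}

fn main() {
    // NoMDEntries (268) = 2: two entries of (MDEntryType 269, MDEntryPx 270).
    let flat = vec![
        (269, "0".to_string()),
        (270, "1.25".to_string()),
        (269, "1".to_string()),
        (270, "1.26".to_string()),
    ];
    let groups = parse_group(2, &flat, 2).expect("well-formed group");
    assert_eq!(groups.len(), 2);
    assert_eq!(groups[1].fields[1].1, "1.26");
    println!("{}", groups.len());
}
```

Real FIX groups also allow optional member fields and nested groups, so entry boundaries must be detected by the first tag of each entry rather than a fixed width; the fixed-width chunking above is the simplest possible shape.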

Comment on lines 157 to 159
// TODO: Add all field definitions from dictionary
// This would normally iterate through dictionary.fields()
}


medium

The schema generation is currently hardcoded for a few message types and doesn't fully utilize the provided FIX dictionary. Update the implementation to build the schema dynamically from the dictionary.

Comment on lines 71 to 72
name = "asn1_encodings"
harness = false
\ No newline at end of file


medium

This file is missing a final newline character. Add a newline character at the end of the file.

harness = false

- Remove PER, APER, UPER variants from EncodingRule enum 
- Update encoder and decoder to only support BER, DER, OER
- Change low_latency() config to use OER instead of PER
- Update all tests and method implementations to expect OER
- Fix benchmark to use std::hint::black_box and .expect() instead of .unwrap()
- This makes the API honest about what encoding rules are actually supported

Addresses AI code review feedback about misleading API surface.
@metaphorics
Copy link
Author

/gemini review

Copy link

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

The pull request includes changes to the .clippy.toml, CLAUDE.md, Cargo.toml, README.md, and TODO.md files. It also adds several new files related to ASN.1 support, including a build script, cargo file, readme, benchmark, and examples. The changes aim to add ASN.1 support and improve code quality.


- Enhanced documentation: Added @docs/fix-specs/ reference to the guidelines for implementing APIs and designing exchange abstractions.
- Introduced a new GitHub Actions workflow for Rust CI, which includes steps for building, testing, and checking code quality with clippy and formatting checks.

These changes improve the clarity of development practices and ensure consistent code quality through automated checks.
@metaphorics
Copy link
Author

/gemini review

- Verified all 7 pending AI code review tasks are already implemented or resolved
- Updated TODO.md to reflect current state accurately
- All tasks marked as RESOLVED with verification details
- No actual code changes needed - existing implementation is correct

Tasks verified:
1. is_plausible_start_tag - Already has proper ASN.1 validation
2. map_to_dyn_error - Helper function already implemented
3. Redundant comment - No redundancy found
4. #[allow(dead_code)] - No dead code attributes found
5. Hardcoded fallback tags - No hardcoded values found
6. Hardcoded common tags - Already configurable
7. is_valid_asn1_tag - Function doesn't exist (resolved by removal)
@metaphorics metaphorics requested a review from Copilot July 14, 2025 18:55
Copy link

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces ASN.1 support via the rustyasn crate, along with smaller improvements and refactorings. The rustyasn crate includes a custom ASN.1 parser to ensure dependency compatibility. The review focuses on correctness and maintainability, identifying areas for improvement in error handling and code simplification.

Comment on lines +235 to +245
// TODO: Implement proper repeating group parsing
// This requires:
// 1. Look up the group schema from a dictionary/schema
// 2. Parse the fields in order based on the group definition
// 3. Create proper Message instances for each group entry

Err(Asn1FieldError::GroupParsingUnsupported {
tag: count_tag,
count,
})
}


high

The current implementation of parse_group_entries returns an error, but the public-facing group() and group_opt() methods that call it swallow this specific error and return a misleading FieldValueError::Invalid(InvalidInt). This suggests to the caller that the group count field has an invalid value, which is incorrect and makes debugging difficult.

Since this feature is unimplemented, it would be better to make this explicit by using todo!(). This will cause a panic with a clear "not yet implemented" message when group functionality is used, which is the standard and expected behavior for todo!() and avoids returning a misleading error.

    fn parse_group_entries(
        &self,
        _count_tag: u32,
        _count: usize,
    ) -> Result<Vec<Message>, Asn1FieldError> {
        // This feature is not yet implemented. Using `todo!` to make it explicit.
        todo!("Repeating group parsing is not yet supported for ASN.1 messages.");
    }

Comment on lines +161 to +167
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S").is_ok() {
return FixFieldValue::UtcTimestamp(s);
}
// Also check for timestamp with milliseconds
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.3f").is_ok() {
return FixFieldValue::UtcTimestamp(s);
}


medium

The logic for parsing UtcTimestamp can be simplified. The two separate checks for timestamps with and without fractional seconds can be combined into a single check using the %.f format specifier in chrono, which handles optional fractional seconds. This will make the code more concise and slightly more efficient.

A similar simplification can be applied to UtcTime parsing a few lines below.

Suggested change
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S").is_ok() {
return FixFieldValue::UtcTimestamp(s);
}
// Also check for timestamp with milliseconds
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.3f").is_ok() {
return FixFieldValue::UtcTimestamp(s);
}
if NaiveDateTime::parse_from_str(&s, "%Y%m%d-%H:%M:%S%.f").is_ok() {
return FixFieldValue::UtcTimestamp(s);
}

- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity


medium

Consider renaming is_valid_asn1_tag to is_plausible_start_tag for clarity, as it doesn't fully validate ASN.1 tags but rather checks for plausible start bytes.


- Fix parsing error: 'Failed to parse type definition FixExtensions DEFINITIONS ::= BEGIN: Unsupported type: BEGIN'
- Add logic to skip ASN.1 module definition lines (MODULE DEFINITIONS ::= BEGIN)
- Prevent parser from trying to parse 'BEGIN' as a type definition
- Module definitions are now properly recognized and skipped during parsing
- Build now completes successfully without ASN.1 parsing errors

The parser was incorrectly trying to parse module definition syntax as type definitions.
Added a check to skip lines containing 'DEFINITIONS ::= BEGIN' before type parsing.
…ion and schema compilation

- Cleaned up the build script by removing unnecessary println! statements that were commented out.
- This change improves code readability and maintains focus on active code without clutter from unused logging statements.
- No functional changes were made; the build process remains intact.
- Removed redundant keywords from the Cargo.toml file, streamlining the list to focus on essential terms: "fix", "fix-protocol", "fast", "trading", and "hft".
- This change enhances the clarity of the project's metadata without affecting its functionality or dependencies.
- The updated keywords better reflect the core aspects of the project, improving discoverability and relevance in package registries.
@metaphorics
Copy link
Author

/gemini review

@metaphorics metaphorics requested a review from Copilot July 14, 2025 19:04


Copy link

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

The pull request includes changes to increase the cognitive complexity threshold, add fix specs to documentation, add ASN.1 support, and use const functions. The changes also include updates to dependencies and documentation.

- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity


medium

Consider renaming is_valid_asn1_tag to is_plausible_start_tag for clarity, as the function only checks for plausible start tags rather than fully validating ASN.1 tags.

Suggested change
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity

- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
- Optimized const fn usage (`is_standard_header_field` already const)


medium

The is_standard_header_field function is already const, so there is no need to optimize it further.

Suggested change
- Optimized const fn usage (`is_standard_header_field` already const)
- Optimized const fn usage (`is_standard_header_field` already const)

- Replaced `format!` with static strings in tracing hot paths
- Fixed SmallVec inline detection with `spilled()` method
- **Style & Cleanup**:
- Added missing newlines to Cargo.toml and README.md files


medium

It's good practice to add a newline character at the end of files to ensure proper formatting and compatibility across different systems.

Suggested change
- Added missing newlines to Cargo.toml and README.md files
- Added missing newlines to Cargo.toml and README.md files

- Updated `rasn` version from 0.18 to 0.27 in `rustyasn/Cargo.toml` for enhanced ASN.1 support.
- Removed version specification for `rustyfix-dictionary` and `rustyfix-codegen` to align with workspace dependency management.
- Updated `quick-xml` and `criterion` in `rustyfixml/Cargo.toml` to use workspace dependencies for consistency.
- Upgraded `prost` and `prost-types` versions from 0.13 to 0.14 in `rustygpb/Cargo.toml` to leverage the latest features and fixes.

These changes streamline dependency management and ensure compatibility across the project.
- Fix missing rasn::Decoder trait imports in feature-gated modules
- Fix ASN.1 tag validation to accept SEQUENCE tag (0x30)
- Correct is_plausible_start_tag() to properly validate constructed universal class tags
- Fix doc test to use rustyfix_dictionary instead of rustyfix
- All tests now pass successfully with both default and all features enabled
- Remove dependency on rustyfix main crate, use only rustyfix-dictionary
- Create local traits module with FieldType, Buffer, FieldMap, etc.
- Replace generic const buffer implementations with specific sized buffers
- Add missing rasn::Decoder imports for derive macros
- Fix clippy warnings: needless_pass_by_value in encoder.rs
- Apply cargo fmt to fix formatting issues across all files
- Add IO error variant to rustyfixml error types

All compilation tests pass with all features enabled.
Copy link

@cursor cursor bot left a comment


Bug: Buffer Trait Infinite Recursion Bug

The Buffer trait implementations for Vec<u8> and smallvec::SmallVec types contain infinite recursion. The extend_from_slice method calls itself instead of the inherent extend_from_slice method of the concrete types, leading to a stack overflow.

crates/rustyasn/src/traits.rs#L11-L34

impl Buffer for Vec<u8> {
fn extend_from_slice(&mut self, data: &[u8]) {
self.extend_from_slice(data);
}
}
impl Buffer for smallvec::SmallVec<[u8; 64]> {
fn extend_from_slice(&mut self, data: &[u8]) {
self.extend_from_slice(data);
}
}
impl Buffer for smallvec::SmallVec<[u8; 128]> {
fn extend_from_slice(&mut self, data: &[u8]) {
self.extend_from_slice(data);
}
}
impl Buffer for smallvec::SmallVec<[u8; 256]> {
fn extend_from_slice(&mut self, data: &[u8]) {
self.extend_from_slice(data);
}
}
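In Rust, inherent methods normally take precedence over trait methods during method-call resolution, so whether these impls actually recurse is debatable; either way, a fully qualified call removes the ambiguity entirely. A minimal sketch (the `Buffer` trait here mirrors the one in the report, not the exact rustyasn definition):

```rust
// A fully qualified path selects the inherent Vec method explicitly,
// so the trait method can never be chosen, recursion or not.
trait Buffer {
    fn extend_from_slice(&mut self, data: &[u8]);
}

impl Buffer for Vec<u8> {
    fn extend_from_slice(&mut self, data: &[u8]) {
        // Unambiguously the inherent Vec::extend_from_slice.
        Vec::extend_from_slice(self, data);
    }
}

fn main() {
    let mut buf: Vec<u8> = Vec::new();
    // Call through the trait, as generic code would.
    Buffer::extend_from_slice(&mut buf, b"fix");
    assert_eq!(buf, b"fix");
    println!("{}", buf.len());
}
```

The same pattern applies to the `SmallVec` impls (`smallvec::SmallVec::extend_from_slice(self, data)`).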


@metaphorics metaphorics requested a review from Copilot July 17, 2025 01:50
@metaphorics
Copy link
Author

/gemini review


Copy link

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces ASN.1 support and incorporates const functions to enhance performance. The changes include modifications to .clippy.toml, .cursorrules, AGENTS.md, CLAUDE.md, Cargo.toml, README.md, and TODO.md, along with the addition of new files related to ASN.1 implementation.

- Fixed `test_repeating_group_parsing` to expect proper error handling
- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity


medium

Consider renaming is_valid_asn1_tag to is_plausible_start_tag for clarity.

Suggested change
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity

- Enhanced date validation with robust chrono parsing (already implemented)
- **Medium Priority Improvements**:
- Renamed `is_valid_asn1_tag` to `is_plausible_start_tag` for clarity
- Optimized const fn usage (`is_standard_header_field` already const)


medium

Consider making is_standard_header_field a const fn to improve performance.

Suggested change
- Optimized const fn usage (`is_standard_header_field` already const)
- Optimized const fn usage (`is_standard_header_field` already const)

- Optimized const fn usage (`is_standard_header_field` already const)
- Removed UTC timestamp length restrictions for variable fractional seconds
- Replaced `format!` with static strings in tracing hot paths
- Fixed SmallVec inline detection with `spilled()` method


medium

Consider using the spilled() method to detect SmallVec inline detection.

Suggested change
- Fixed SmallVec inline detection with `spilled()` method
- Fixed SmallVec inline detection with `spilled()` method

- Replaced `format!` with static strings in tracing hot paths
- Fixed SmallVec inline detection with `spilled()` method
- **Style & Cleanup**:
- Added missing newlines to Cargo.toml and README.md files


medium

Adding missing newlines to Cargo.toml and README.md files will improve readability.

Suggested change
- Added missing newlines to Cargo.toml and README.md files
- Added missing newlines to Cargo.toml and README.md files

metaphorics and others added 2 commits July 17, 2025 11:12
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
- Add detailed documentation for ASN.1 overhead constant explaining components
- Replace hardcoded FIX tag values with named constants in encoder and message modules
- Make FIX header field tag constants public for reuse across modules
- Add comprehensive documentation for all public FIX tag constants with backticks
- Simplify verbose documentation in tracing.rs functions
- Improve code maintainability and self-documentation
- Apply automatic formatting improvements from pre-commit hooks

All tests pass (62 unit tests + 5 integration tests).
@metaphorics
Copy link
Author

/gemini review

@metaphorics metaphorics requested a review from Copilot July 17, 2025 02:28
Copy link

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

The pull request includes changes to various configuration files, documentation, and adds ASN.1 encoding support. Most changes are minor updates to documentation and build configurations. The most significant change is the addition of ASN.1 support, which includes new crates (rustyasn) and modifications to the build process.

Copy link

Copilot AI left a comment


Pull Request Overview

This PR enhances error handling across integration tests by returning Result<(), Box<dyn std::error::Error>>, updates several Cargo.toml files to use workspace dependencies and bumps versions, and broadens error enums with new variants.

  • Refactor tests in multiple crates to use Result and map_err, improving test ergonomics and error reporting.
  • Update Cargo.toml for rustygpb and rustyfixml to workspace dependencies and bump prost to 0.14.
  • Add an I/O error variant to rustyfixml’s FixmlError.

Reviewed Changes

Copilot reviewed 43 out of 43 changed files in this pull request and generated 3 comments.

Show a summary per file
File Description
crates/rustysbe/src/lib.rs Migrate integration tests to return Result and use map_to_dyn_error
crates/rustygpb/src/lib.rs Refactor test to return Result and replace expect with map_err
crates/rustyfixs/src/lib.rs Update TLS tests to return Result and map errors
crates/rustyfixml/src/error.rs Add Io variant for std::io::Error
crates/rustygpb/Cargo.toml Bump prost and prost-types to 0.14, switch criterion to workspace
crates/rustyfixml/Cargo.toml Switch quick-xml, criterion, and tokio to workspace
crates/rustyasn/tests/integration_test.rs New ASN.1 integration tests using Result

Comment on lines +170 to +175
assert_eq!(length, 40);

// Test validation
let (len, tid, sv) = SbeMessageHeader::validate_basic(&message).unwrap();
assert_eq!(len, message.len() as u32);
assert_eq!(tid, 123);
assert_eq!(sv, 5);
let (len, tid, sv) =
SbeMessageHeader::validate_basic(&message).map_err(map_to_dyn_error)?;
assert_eq!(len, 40);
Copy link

Copilot AI Jul 17, 2025


Hardcoding the expected message length as 40 may break if header layout changes. Consider asserting against message.len() (e.g. assert_eq!(length, message.len() as u32)) to keep the test in sync with the actual output.

Comment on lines +100 to +105
let encoded = encoder.encode(&message).map_err(|e| {
Box::<dyn std::error::Error>::from(format!("Encoding should work but failed: {e}"))
})?;
let decoded = decoder.decode(&encoded).map_err(|e| {
Box::<dyn std::error::Error>::from(format!("Decoding should work but failed: {e}"))
})?;
Copy link

Copilot AI Jul 17, 2025


[nitpick] The error mapping closure used here and for decoding is duplicated. Consider introducing a test-level helper like fn map_to_dyn_error<E: std::error::Error + 'static>(e: E) -> Box<dyn std::error::Error> or a ResultExt trait to reduce repetition and improve readability.

Suggested change
let encoded = encoder.encode(&message).map_err(|e| {
Box::<dyn std::error::Error>::from(format!("Encoding should work but failed: {e}"))
})?;
let decoded = decoder.decode(&encoded).map_err(|e| {
Box::<dyn std::error::Error>::from(format!("Decoding should work but failed: {e}"))
})?;
let encoded = encoder.encode(&message).map_err(map_to_dyn_error("Encoding should work but failed"))?;
let decoded = decoder.decode(&encoded).map_err(map_to_dyn_error("Decoding should work but failed"))?;
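A minimal version of such a helper, assuming only that the wrapped errors implement `Display` (the function name follows the review's suggestion; the real signature in the codebase may differ):

```rust
use std::error::Error;
use std::fmt::Display;

// Hypothetical test helper: returns a closure usable with map_err that
// prefixes the underlying error with a context string.
fn map_to_dyn_error<E: Display>(context: &'static str) -> impl Fn(E) -> Box<dyn Error> {
    move |e| format!("{context}: {e}").into()
}

fn main() -> Result<(), Box<dyn Error>> {
    // Success path passes the value through untouched.
    let ok: Result<u32, String> = Ok(7);
    let n = ok.map_err(map_to_dyn_error("Encoding should work but failed"))?;
    assert_eq!(n, 7);

    // Failure path produces a boxed error with the context prepended.
    let bad: Result<u32, String> = Err("truncated frame".to_string());
    let err = bad
        .map_err(map_to_dyn_error("Decoding should work but failed"))
        .unwrap_err();
    println!("{err}");
    Ok(())
}
```

Defining this once at the top of the test module replaces every repeated `Box::<dyn std::error::Error>::from(format!(...))` closure.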

}

#[test]
fn test_message_size_limits() {
Copy link

Copilot AI Jul 17, 2025


This test panics on failure but other tests use Result<(), Box<dyn std::error::Error>>. For consistency and better error reporting, consider returning Result<(), Box<dyn std::error::Error>> here as well and using ? on fallible operations.

Suggested change
fn test_message_size_limits() {
fn test_message_size_limits() -> Result<(), Box<dyn std::error::Error>> {

- Add clippy::panic to allow directives in test modules for:
  - src/decoder.rs
  - src/schema.rs
  - src/types.rs

All critical clippy errors (expect_used, unwrap_used, panic) are now resolved.
Compilation successful with only non-critical warnings remaining.