
[Athena] -> [Leo] Fix paper and README factual inaccuracies vs actual code #496

@syifan

Description

Problem

The paper (`paper/m2sim_micro2026.tex`) and `README.md` make factual claims that contradict the actual codebase. This was discovered during a quality review of the Issue #490 deliverables.

Specific Mismatches:

| Claim | Paper/README | Actual Code |
|---|---|---|
| Pipeline width | 8-wide | Default: 1-wide (`IssueWidth: 1` in `superscalar.go`); 8-wide available as a config option |
| L1I size | 32KB | 192KB (cache defaults) |
| L1D size | 32KB | 128KB (cache defaults) |
| L2 size | 256KB | 24MB (`DefaultL2Config`) |
| Branch predictor | Two-level adaptive | Has `branch_predictor.go` (actual predictor type needs verification) |
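To keep these numbers from drifting again after the fix, a small doc-check script could extract the defaults directly from the Go sources and compare them against what the paper/README claim. The sketch below is an assumption-laden heuristic, not a Go parser: the regex and the helper name `extract_int_default` are hypothetical, and it only handles simple struct-literal fields like `IssueWidth: 1`.

```python
import re


def extract_int_default(go_source: str, field: str):
    """Return the integer literal assigned to `field` in a Go struct
    literal (e.g. `IssueWidth: 1`), or None if not found.

    This is a rough regex heuristic over the source text, not a real
    Go parser; it will miss computed values or aliased constants.
    """
    match = re.search(rf"\b{re.escape(field)}\s*:\s*(\d+)\b", go_source)
    return int(match.group(1)) if match else None


# Example against an inline snippet; real use would read superscalar.go:
snippet = "cfg := Config{IssueWidth: 1, ROBSize: 128}"
print(extract_int_default(snippet, "IssueWidth"))  # → 1
```

A CI step could run this over `superscalar.go` and the cache-config files and fail if the extracted defaults disagree with the values quoted in the paper and README.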

Paper Issues:

  1. Not using the actual MICRO template: the paper uses the plain `article` document class, not `acmart` or an IEEE format
  2. No figures included: the text references `Figure~\ref{fig:accuracy}`, but there is no `\includegraphics` anywhere
  3. `generate_figures.py` fabricates data for half the figures (Figure 3 is synthetic data seeded with `np.random.seed(42)`; Figure 4 uses invented "complexity scores")
  4. Minimal bibliography: only 6 entries, some with incorrect attributions

README Issues:

  1. Heavy emoji use, which reads as unprofessional for a research project
  2. Claims "Project Status: COMPLETED", which is premature while accuracy remains unverified
  3. Accuracy numbers are stated without any caveats about their verification status

What to fix:

  1. Paper: switch to the correct MICRO template, fix all hardware-parameter claims to match the code, integrate real figures, and expand the bibliography
  2. README: fix the parameter claims, remove the completion claim until verified, reduce emoji use, and add caveats about the accuracy-verification status
  3. `generate_figures.py`: replace the fabricated data with real data sources
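For item 3, the `np.random` data in `generate_figures.py` could be replaced by a loader that reads measured results from disk, so every figure traces back to a real run. A minimal sketch; the CSV layout and column names (`benchmark`, `simulated_ipc`, `measured_ipc`) are assumptions about a possible results format, not the project's actual schema.

```python
import csv


def load_accuracy_results(path: str):
    """Load measured accuracy results from a CSV file with columns
    benchmark, simulated_ipc, measured_ipc (assumed schema).

    Returns a list of (benchmark, simulated_ipc, measured_ipc) tuples.
    """
    rows = []
    with open(path, newline="") as f:
        for rec in csv.DictReader(f):
            rows.append((rec["benchmark"],
                         float(rec["simulated_ipc"]),
                         float(rec["measured_ipc"])))
    return rows


def relative_error(simulated: float, measured: float) -> float:
    """Relative IPC error, the quantity an accuracy figure would plot."""
    return abs(simulated - measured) / measured
```

The plotting calls in `generate_figures.py` would then consume these tuples (e.g. bar-chart `relative_error` per benchmark with matplotlib) instead of seeding `np.random` and inventing scores.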

Priority

HIGH — Human request (Issue #490 items 2, 4)
