fix(tests): honor parametrized dtype in test_autotuner; correct SQNR var in test_integration error msg #4340

Open

Anai-Guo wants to merge 3 commits into pytorch:main from Anai-Guo:fix/test-dtype-and-msg-4339

Conversation

@Anai-Guo
Contributor

Fixes two unrelated test bugs reported in #4339.

Bug 1: parametrized dtype silently ignored in test/kernel/test_autotuner.py

TestQuantFlow.test_int_mm, test_int_mm_float8, and test_int_scaled_mm are each @parameterized.expand(...)'d over both torch.bfloat16 and torch.float16, but each method's body opens with dtype = torch.bfloat16, overwriting the parameter. The float16 variants run with bfloat16 inputs, giving false coverage.
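
A self-contained sketch of the shadowing pattern (test and class names here are illustrative, not the actual torchao tests):

import unittest
import torch
from parameterized import parameterized

class TestDtypeShadowing(unittest.TestCase):
    @parameterized.expand([(torch.bfloat16,), (torch.float16,)])
    def test_example(self, dtype):
        dtype = torch.bfloat16  # bug: shadows the parameter; every branch runs as bfloat16
        x = torch.randn(4, 4, dtype=dtype)
        self.assertEqual(x.dtype, torch.bfloat16)  # passes even in the "float16" branch

if __name__ == "__main__":
    unittest.main()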

Removing the three dtype = torch.bfloat16 overrides also exposes a hardcoded assertion at the bottom of test_int_scaled_mm:

out32_1 = intmm.safe_int_mm(x_int, w_int) * scales
assert out32_1.dtype == torch.bfloat16   # <-- always bfloat16 even when dtype=float16

out32_1.dtype follows scales.dtype, which follows the parametrized dtype. Fixed to assert against the parameter:

assert out32_1.dtype == dtype
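
The corrected assertion holds because the dtype propagation is plain PyTorch type promotion: an int32 tensor multiplied by floating-point scales yields the scales' dtype. A standalone check (no torchao required; shapes are arbitrary):

import torch

for dtype in (torch.bfloat16, torch.float16):
    out32 = torch.randint(-128, 128, (4, 4), dtype=torch.int32)  # stand-in for safe_int_mm output
    scales = torch.rand(4, 1, dtype=dtype)
    assert (out32 * scales).dtype == dtype  # promotion follows the floating-point operand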

Bug 2: Wrong variable in error message in test/integration/test_integration.py

In test_save_load_qtensors (around line 641):

assert SQNR(ref_f, test) > min_sqnr, (
    f"got sqnr: {SQNR(ref_f, ref_q)}, expected: {min_sqnr}"   # <-- prints SQNR vs ref_q, not test
)

The assertion compares ref_f against test, but the failure message reports SQNR(ref_f, ref_q) — i.e. SQNR between the float reference and the compiled quantized reference computed earlier in the test, not between the float reference and the actual loaded model output. When the assertion fails, the printed value is meaningless for debugging.
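
For reference, SQNR is the signal-to-quantization-noise ratio in dB. A common definition, and an assumption here since the exact helper used by the test suite may differ:

import torch

def sqnr(ref: torch.Tensor, test: torch.Tensor) -> torch.Tensor:
    # ratio in dB of the reference's norm to the error's norm;
    # higher means `test` is closer to `ref`
    return 20 * torch.log10(torch.linalg.norm(ref) / torch.linalg.norm(ref - test))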

Fixed to print SQNR(ref_f, test). (The earlier assert SQNR(ref_f, ref_q) > min_sqnr block at ~line 619 is already self-consistent and is left untouched.)
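
With the fix, the message reports the quantity actually under test:

assert SQNR(ref_f, test) > min_sqnr, (
    f"got sqnr: {SQNR(ref_f, test)}, expected: {min_sqnr}"
)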

Test plan

  • Both fixes are pure test-suite hygiene; they do not change library behavior.
  • Removing the dtype overrides means the float16 parametrize branches now actually execute as float16. If any of those branches were previously hiding a real failure, this PR will surface it — please re-run the relevant CI.
  • test_int_scaled_mm's output dtype follows scales.dtype, which is now correctly the parametrized dtype; the new assertion out32_1.dtype == dtype matches.

🤖 Generated with Claude Code

@pytorch-bot

pytorch-bot Bot commented Apr 26, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/4340

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla

meta-cla Bot commented Apr 26, 2026

Hi @Anai-Guo!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!

meta-cla Bot added the CLA Signed label on Apr 27, 2026

Labels

CLA Signed: This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.
