
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias #2573

Merged
tianyu-l merged 1 commit into pytorch:main from lizamd:use-float8trainingopconfig on Mar 14, 2026

Conversation

Contributor

@lizamd commented on Mar 14, 2026

Summary

FP8GroupedMMConfig was a temporary backward-compatibility alias in torchao that has been removed in pytorch/ao#4069. This PR updates torchtitan to use the canonical Float8TrainingOpConfig name directly.

Change

One-line rename in torchtitan/components/quantization/float8.py:

  • FP8GroupedMMConfig → Float8TrainingOpConfig (import + usage)
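
For context, a minimal before/after sketch of the rename. The torchao import path and the `grouped_mm_config` name below are illustrative assumptions, not copied from `float8.py`:

```python
# Hedged sketch of the one-line rename; the import path is an assumed
# example, not necessarily the exact module used in
# torchtitan/components/quantization/float8.py.

# Before -- temporary backward-compat alias, removed in pytorch/ao#4069:
#   from torchao.prototype.moe_training import FP8GroupedMMConfig
#   grouped_mm_config = FP8GroupedMMConfig()

# After -- canonical name with identical defaults:
try:
    from torchao.prototype.moe_training import Float8TrainingOpConfig
except ImportError:  # torchao absent, or the config is exposed elsewhere
    Float8TrainingOpConfig = None

if Float8TrainingOpConfig is not None:
    grouped_mm_config = Float8TrainingOpConfig()
```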

Test plan

  • No behavior change — FP8GroupedMMConfig was an alias for Float8TrainingOpConfig with identical defaults.
  • Existing MoE FP8 training tests cover this code path.

🤖 Generated with Claude Code

FP8GroupedMMConfig was a temporary BC alias in torchao that has been
removed (pytorch/ao#4069). Use the canonical Float8TrainingOpConfig name.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) on Mar 14, 2026
Contributor

@danielvegamyhre left a comment


LGTM, thanks @lizamd

@tianyu-l merged commit 7222941 into pytorch:main on Mar 14, 2026
11 checks passed
joecummings pushed a commit to joecummings/torchtitan that referenced this pull request Mar 17, 2026
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias (pytorch#2573)

weifengpy pushed a commit to weifengpy/torchtitan that referenced this pull request Mar 27, 2026
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias (pytorch#2573)

TXacs pushed a commit to McmillanTAC/torchtitan that referenced this pull request Apr 13, 2026
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias (pytorch#2573)

ACharacterInASimulation pushed a commit to ACharacterInASimulation/torchtitan that referenced this pull request Apr 21, 2026
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias (pytorch#2573)
