Commit f7b1985
Use Float8TrainingOpConfig instead of removed FP8GroupedMMConfig alias (pytorch#2573)
## Summary
`FP8GroupedMMConfig` was a temporary backward-compatibility alias in
torchao that has been removed in pytorch/ao#4069. This PR updates
torchtitan to use the canonical `Float8TrainingOpConfig` name directly.
## Change
One-line rename in `torchtitan/components/quantization/float8.py`:
- `FP8GroupedMMConfig` → `Float8TrainingOpConfig` (import + usage)
## Test plan
- No behavior change — `FP8GroupedMMConfig` was an alias for
`Float8TrainingOpConfig` with identical defaults.
- Existing MoE FP8 training tests cover this code path.
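The "alias with identical defaults" claim is what makes the rename behavior-preserving. A minimal self-contained sketch of the pattern (hypothetical fields; stand-ins for the torchao classes, not the actual torchao source):

```python
from dataclasses import dataclass

# Stand-in for torchao's canonical Float8TrainingOpConfig.
@dataclass
class Float8TrainingOpConfig:
    emulate: bool = False  # hypothetical field, for illustration only

# A backward-compat alias of the kind torchao provided before pytorch/ao#4069:
# the old name is bound directly to the new class, so both names resolve to
# the same object with the same defaults.
FP8GroupedMMConfig = Float8TrainingOpConfig

# Identical class, identical defaults: renaming call sites cannot change behavior.
assert FP8GroupedMMConfig is Float8TrainingOpConfig
assert FP8GroupedMMConfig() == Float8TrainingOpConfig()
```

Because the alias is a plain name binding rather than a subclass or wrapper, removing it breaks imports of the old name but nothing else, which is why this PR is a pure rename.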
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-authored-by: Li <lizli102@ctr2-alola-ctrl-01.amd.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
1 file changed: 2 additions & 2 deletions
[Diff table not recoverable from the page extraction. The line numbers show the rename touches lines 277 and 296 of `torchtitan/components/quantization/float8.py` (one deletion plus one addition at each site).]