## Summary

This issue proposes three interconnected improvements to the multi-conversation export flow:
### 1. Raise / unify the selection cap

Currently `EXPORT_LIMIT = 100` is hardcoded in `ExportDialog.tsx`, while Settings → Export limit defaults to 1000 fetched conversations. Users who load 500+ conversations via Settings can never actually select more than 100.

Proposed: the maximum selectable count should be driven by the same `exportAllLimit` setting (or a dedicated configurable setting), not a second magic number.
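As a minimal sketch, assuming the setting is stored as a plain number and keeps its current `exportAllLimit` name, the cap could be derived like this (the helper and fallback constant are hypothetical):

```typescript
// Hypothetical sketch: derive the selection cap from the existing
// exportAllLimit setting instead of a second hardcoded EXPORT_LIMIT.
// DEFAULT_EXPORT_LIMIT mirrors the Settings default of 1000.
const DEFAULT_EXPORT_LIMIT = 1000

function getSelectionCap(exportAllLimit?: number): number {
    // Guard against missing or invalid values from stored settings
    if (!exportAllLimit || exportAllLimit <= 0) return DEFAULT_EXPORT_LIMIT
    return Math.floor(exportAllLimit)
}
```

With this shape, a user who raised the fetch limit to 500 in Settings could also select all 500.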
### 2. Date-range filter on the conversation list

The conversation list API already returns `create_time` (Unix seconds) on every `ApiConversationItem`. Adding optional from / to date inputs in the Export dialog would let users narrow exports without manual checkbox hunting, which is especially useful when exporting thousands of conversations.

The filter lives inside the existing dialog (no separate overlay on the ChatGPT page) and is applied after the title search in a `useMemo`.
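A rough sketch of the filter, assuming the from/to values arrive as ISO date strings from the dialog inputs; the interface below is trimmed to only the fields used, and the function name is hypothetical:

```typescript
// Trimmed stand-in for ApiConversationItem: only the fields the filter touches.
interface ConversationItem {
    title: string
    create_time: number // Unix seconds, as returned by the list API
}

// Hypothetical filter applied after the title search (e.g. inside the
// existing useMemo). An empty input leaves that bound open.
function filterByDateRange(
    items: ConversationItem[],
    from?: string,
    to?: string,
): ConversationItem[] {
    const fromTs = from ? Date.parse(from) / 1000 : -Infinity
    // Add one day so the "to" date is inclusive of that whole day
    const toTs = to ? Date.parse(to) / 1000 + 86400 : Infinity
    return items.filter(c => c.create_time >= fromTs && c.create_time < toTs)
}
```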
### 3. Batched API fetch + download (100 conversations per wave)

Right now, after the `RequestQueue` fetches all selected conversations, every exporter (`exportAllToMarkdown`, `exportAllToJson`, `exportAllToHtml`) builds one JSZip with all the data, then calls `generateAsync` once. For 500+ conversations this can cause memory pressure and flaky browser downloads.

Proposed design:

- Define a fixed constant `EXPORT_OPERATION_BATCH = 100`.
- Pipeline the fetch and download: fetch 1–100 → build + download `part-01-of-NN.zip` → fetch 101–200 → download part 02 → …
- This keeps memory bounded per wave, reduces the risk of 429 rate-limit errors, and avoids browser multi-download blocking (with a small delay between waves).
- The `RequestQueue` backoff / `Retry-After` handling already in place continues to govern throttling within each wave.
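The wave loop above can be sketched roughly as follows. `fetchWave` and `downloadWave` stand in for the existing `RequestQueue` fetch and the per-exporter zip/download step, and the delay value is an assumption, not a measured number:

```typescript
const EXPORT_OPERATION_BATCH = 100

// Split the selected ids into waves of at most EXPORT_OPERATION_BATCH.
function chunk<T>(items: T[], size: number): T[][] {
    const waves: T[][] = []
    for (let i = 0; i < items.length; i += size) {
        waves.push(items.slice(i, i + size))
    }
    return waves
}

// Hypothetical pipeline driver: fetch one wave, download it as
// part-XX-of-NN, pause briefly, then move on. The real implementation
// would call the existing exporters once per wave.
async function exportInWaves(
    ids: string[],
    fetchWave: (ids: string[]) => Promise<unknown[]>,
    downloadWave: (data: unknown[], part: number, total: number) => Promise<void>,
    delayMs = 1000, // assumed pause to avoid browser multi-download blocking
): Promise<void> {
    const waves = chunk(ids, EXPORT_OPERATION_BATCH)
    for (let i = 0; i < waves.length; i++) {
        const data = await fetchWave(waves[i])        // e.g. 1–100, 101–200, …
        await downloadWave(data, i + 1, waves.length) // part-01-of-NN.zip, …
        if (i < waves.length - 1) await new Promise(r => setTimeout(r, delayMs))
    }
}
```

Because each wave's data goes out of scope before the next fetch begins, peak memory stays proportional to one batch rather than the whole selection.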
### Impact

| Area | Change |
| --- | --- |
| `src/constants.ts` | Add `EXPORT_OPERATION_BATCH = 100` (fixed, not user-tunable) |
| `src/ui/SettingContext.tsx` + `SettingDialog.tsx` | Expose configurable fetch/selection limit only |
| `src/ui/ExportDialog.tsx` | Date filter, unified selection cap, wave-based export pipeline |
| `src/utils/download.ts` | `buildZipFileName` part-suffix support |
| `src/exporter/*.ts` | Pass derived name / custom zip base when called per-batch |
| `src/locales/*` + `src/i18n.ts` | New strings for date labels, "part i of n", tooltips |
### Edge cases

- Local file source (`conversations.json`): filter by `create_time` when present.
- Selection ≤ 100: single-file export, no part suffix.
- Archive/Delete: unchanged; still operates on the full current selection.
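One possible shape for the `buildZipFileName` part-suffix support, covering the no-suffix single-file edge case above; the helper name and two-digit padding are assumptions, not the existing API:

```typescript
// Hypothetical helper for the part-suffix support in buildZipFileName.
// A single wave keeps the plain name; multiple waves get
// part-01-of-NN style suffixes, matching the naming in the proposal.
function withPartSuffix(base: string, part: number, total: number): string {
    if (total <= 1) return `${base}.zip` // selection fits one batch: no suffix
    const pad = (n: number) => String(n).padStart(2, '0')
    return `${base}-part-${pad(part)}-of-${pad(total)}.zip`
}
```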
A fork with an implementation branch is underway at https://github.com/cjam28/chatgpt-exporter — will link a PR once complete.