Cache complexity analyzer in-memory/redis #5632

@sobrinho

Description

Is your feature request related to a problem? Please describe.

After the optimization in #5631, our GraphQL analyze step dropped from ~806ms
to ~176ms. However, 176ms is still significant for a hot path: the complexity
analyzer runs on every request, even when the same query has been analyzed
before.

Describe the solution you'd like

I would like a way to cache the complexity analysis result across requests,
either in-process (a Ruby Hash or LRU cache, sub-microsecond lookups) or in a
shared store such as Redis (~1ms per lookup). The gem itself does not need to
know how to cache; that responsibility can stay with the application. What the
gem needs to provide is a stable fingerprint for the incoming query so the
application can use it as a cache key.
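Assuming the gem exposed such a fingerprint, the application-side cache could
stay very small. A minimal sketch (ComplexityCache and its memoized fetch are
illustrative names, not part of any existing gem API):

```ruby
# Illustrative application-side cache keyed on an opaque query fingerprint.
# The fingerprint itself would come from the gem; @store is an in-process
# Hash here, but could be swapped for Redis in multi-process deployments.
class ComplexityCache
  def initialize
    @store = {}
  end

  # Returns the cached complexity for this fingerprint, running the
  # expensive analysis (the block) only on a cache miss.
  def fetch(fingerprint)
    @store.fetch(fingerprint) { @store[fingerprint] = yield }
  end
end
```

Usage would look like `cache.fetch(fingerprint) { run_complexity_analysis(query) }`,
so the analyzer only runs the first time a given fingerprint is seen.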

The challenge is determining what the fingerprint should cover. The complexity
of a query depends on its structure and on certain variables — most notably
pagination arguments (e.g. first, last) — so those must be included in the
fingerprint.

Describe alternatives you've considered

Keying the cache on the full query string plus all variables is too
conservative: two calls with the same query structure but different id
arguments would always miss the cache even though their complexity is
identical.

Keying on the query string alone (ignoring all variables) is too aggressive:
it would treat first: 10 and first: 1000 as the same, producing incorrect
complexity results.

The right fingerprint is the query structure (document or normalized AST)
combined with only the variables that influence complexity, such as pagination
arguments.
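A minimal sketch of such a fingerprint, assuming the set of complexity-relevant
variable names is known (first and last here) and using whitespace
normalization as a crude stand-in for real AST normalization:

```ruby
require 'digest'
require 'json'

# Illustrative list of variable names that influence complexity; a real
# implementation would derive this from the schema or the query's arguments.
COMPLEXITY_VARIABLES = %w[first last].freeze

# Hashes the normalized query structure together with only the
# complexity-relevant variables, ignoring everything else (e.g. id values).
def complexity_fingerprint(query_string, variables)
  normalized = query_string.gsub(/\s+/, ' ').strip
  relevant = variables.select { |k, _| COMPLEXITY_VARIABLES.include?(k.to_s) }
  Digest::SHA256.hexdigest([normalized, relevant.sort.to_json].join("\n"))
end
```

With this shape, two requests that differ only in an id variable share a
fingerprint (cache hit), while first: 10 and first: 1000 produce distinct
fingerprints (cache miss, re-analysis), matching the behavior described above.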

Additional context

#5631
