Queryset result cache causes unbounded memory growth on large datasets
Severity: critical
Category: Resource Contention
Updated: Mar 15, 2026
Technologies: Django, PostgreSQL
How to detect:
When a queryset is evaluated, Django loads every result row into an in-memory list (_result_cache) on the queryset object. A row that occupies ~40 bytes in PostgreSQL grows to 400-600 bytes once materialized as a Python model object, so processing 50 million rows can consume 8+ GB of RAM and push the server into swap.
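The overhead is easy to see without Django: a rough, stdlib-only illustration (exact numbers vary by Python version and platform) comparing a 40-byte payload to the same payload wrapped in a Python object with an attribute dict, which is approximately what a model instance is.

```python
import sys

# ~40 bytes of payload, as the row might sit in PostgreSQL.
raw_row = b"x" * 40

class Row:
    """Stand-in for a model instance: attributes live in __dict__."""
    def __init__(self, payload):
        self.id = 1
        self.payload = payload

obj = Row(raw_row)

# Wrapper object plus its attribute dict; the payload itself is extra.
obj_size = sys.getsizeof(obj) + sys.getsizeof(obj.__dict__)

print(sys.getsizeof(raw_row))  # bytes object holding the payload
print(obj_size)                # per-row Python wrapper overhead
```

The per-row multiplier here is smaller than a real Django model instance, which also carries field descriptors and queryset bookkeeping, but the direction of the effect is the same.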
Recommended action:
Use .iterator(chunk_size=N) to stream rows from the database cursor instead of caching them. For querysets expected to return more than ~10,000 rows, always use .iterator(). Tune chunk_size to the row width (the default is 2000). When full model instances aren't needed, use .values() or .values_list() to cut per-row memory by roughly 80%.
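The difference between the two access patterns can be sketched with a plain generator standing in for the database cursor (a simulation of the chunked-streaming idea, not Django's implementation; fetch_rows and iterate_chunked are hypothetical names):

```python
from itertools import islice

def fetch_rows(n):
    """Stand-in for a server-side cursor yielding rows lazily."""
    for i in range(n):
        yield {"id": i, "name": f"row-{i}"}

def iterate_chunked(rows, chunk_size=2000):
    """Simulates .iterator(chunk_size=...): pull one chunk at a time
    and yield its rows, so at most chunk_size rows are alive at once."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return
        yield from chunk  # previous chunk is garbage-collected here

# Cached pattern: materializes everything, like _result_cache.
cached = list(fetch_rows(10_000))  # all 10,000 rows in memory at once

# Streaming pattern: memory bounded by chunk_size, not total row count.
total = sum(1 for _ in iterate_chunked(fetch_rows(10_000), chunk_size=500))
print(total)  # 10000
```

In real Django code the equivalent is simply `for obj in qs.iterator(chunk_size=500): ...`; note that iterating a queryset this way skips _result_cache, so re-iterating issues a fresh query.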