Apache DataFusion · Apache Spark

Out of memory errors terminate pipeline execution

critical
Resource Contention · Updated Feb 17, 2026
How to detect:

The pipeline fails with an OutOfMemoryError when executor memory is insufficient for the configured batch size or data volume.
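One way to confirm this failure mode is to scan the executor logs for the OOM signature. This is a hedged sketch: `/tmp/spark-logs` is a placeholder path (the sample log file is created here only so the snippet is self-contained); point the search at wherever your deployment writes executor logs.

```shell
# Illustrative detection check: search executor logs for OOM signatures.
# /tmp/spark-logs stands in for your real Spark executor log directory.
mkdir -p /tmp/spark-logs
echo "java.lang.OutOfMemoryError: Java heap space" > /tmp/spark-logs/executor-1.log
grep -rl "java.lang.OutOfMemoryError" /tmp/spark-logs
```

A match in an executor log, combined with the pipeline terminating mid-batch, points at this issue rather than a data or logic error.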

Recommended action:

Increase executor memory via the system.resources.memory runtime argument (e.g., 4096), reduce the source fetch size so each batch processes fewer records, or increase Spark executor memory via the system.spark.executor.memory runtime argument (e.g., 8g).
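If the pipeline is ultimately submitted as a Spark job, the equivalent adjustment can be sketched directly with Spark configuration properties. The memory values and the `my_pipeline.py` entry point below are illustrative assumptions, not values from this card; size them to your batch volume.

```shell
# Illustrative only: raise executor memory for a Spark job.
# 8g and 1g are example values; my_pipeline.py is a hypothetical entry point.
spark-submit \
  --conf spark.executor.memory=8g \
  --conf spark.executor.memoryOverhead=1g \
  my_pipeline.py
```

Raising `spark.executor.memoryOverhead` alongside `spark.executor.memory` can matter when the OOM is container-level (off-heap) rather than a Java heap exhaustion.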