Apache Spark Metric

spark_executor_max_memory

Maximum memory across all executors working for a particular application.
Dimensions: None
Available on: Datadog (1)
Related Insights (1)
Executor Memory Pressure from Oversized Partitions (critical)

Spark executors fail with OOM errors when processing partitions significantly larger than 200-500MB, exhausting executor heap memory and causing cascading failures across the cluster.
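A common mitigation for this kind of memory pressure is to repartition the dataset so each partition stays well below the 200-500 MB danger zone. The sketch below is illustrative only (not from this page): a hypothetical helper, `target_partition_count`, that computes how many partitions a dataset of a given size needs to keep each partition near an assumed 128 MB target, a figure commonly cited in Spark tuning guidance.

```python
# Assumption: 128 MB per partition as a safe target size; this constant and the
# helper name are illustrative, not part of the metric's definition.
TARGET_PARTITION_BYTES = 128 * 1024 * 1024

def target_partition_count(dataset_bytes: int,
                           target_bytes: int = TARGET_PARTITION_BYTES) -> int:
    """Number of partitions needed so each one holds at most target_bytes."""
    if dataset_bytes <= 0:
        return 1
    # Ceiling division: round up so no partition exceeds the target size.
    return max(1, -(-dataset_bytes // target_bytes))

# A 10 GiB dataset split at ~128 MiB per partition needs 80 partitions,
# comfortably below the 200-500 MB range where executors start to OOM.
partitions = target_partition_count(10 * 1024**3)
```

In a Spark job, the computed count would typically be passed to `DataFrame.repartition(n)` before a wide transformation; the right target size still depends on executor heap size and the operations being run.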