
What does it mean when a spark job has a Peak Execution Memory Onheap of 0.0B?

asked 2023-07-09 12:23:11 +0000

by ladyg


1 Answer


answered 2023-07-09 12:35:01 +0000

by lalupa

When a Spark job reports a Peak Execution Memory (on-heap) of 0.0 B, it means its tasks did not use any on-heap execution memory. Execution memory is the pool Spark draws on for shuffles, joins, sorts, and aggregations, so a job made up only of narrow transformations (for example, a simple map/filter pipeline) can legitimately report zero. It can also happen when the job is configured to use off-heap execution memory (spark.memory.offHeap.enabled=true), in which case the usage appears in the off-heap metric instead of the on-heap one. Finally, a zero can indicate that the metric is simply not being captured or reported correctly for that job. It is worth monitoring execution memory accurately, since it is a key input when tuning executor sizing and diagnosing spills.
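One way to see the off-heap case in practice: with the following settings (a minimal sketch; the 2g size is just an illustrative value), execution memory is allocated off-heap, so the on-heap peak can legitimately read 0.0 B while the off-heap metric shows the real usage.

```
# spark-defaults.conf (or pass as --conf flags to spark-submit)
spark.memory.offHeap.enabled   true
spark.memory.offHeap.size      2g
```

If these are set in your job, check the off-heap column of the metric before concluding that no execution memory was used.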





Stats


Seen: 8 times

Last updated: Jul 09 '23