What is the default value of spark.default.parallelism when creating an RDD with parallelize in a job launched via spark-submit, and how does it affect parallelization?

asked 2021-08-29 by plato

1 Answer

answered 2021-05-18 by lalupa

The default value of spark.default.parallelism depends on the cluster manager. In local mode it is the number of cores on the local machine; on standalone and YARN clusters it is the total number of cores on all executor nodes, or 2, whichever is larger. This setting controls the default number of partitions for RDDs returned by operations such as parallelize when no partition count is given; the number of tasks that can actually run simultaneously is bounded by the cores available to the executors, not by this setting.
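A minimal PySpark sketch of how the default shows up in practice (the local[4] master and the partition counts in the comments are illustrative assumptions for a local run, not fixed behavior on every deployment):

from pyspark.sql import SparkSession

# Hypothetical local run: under local[4], defaultParallelism is the 4 local cores.
spark = (SparkSession.builder
         .appName("parallelism-demo")
         .master("local[4]")
         .getOrCreate())
sc = spark.sparkContext

print(sc.defaultParallelism)          # 4 under local[4]

rdd = sc.parallelize(range(100))      # no numSlices given -> uses defaultParallelism
print(rdd.getNumPartitions())         # 4

rdd8 = sc.parallelize(range(100), 8)  # an explicit partition count overrides the default
print(rdd8.getNumPartitions())        # 8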

When you parallelize an RDD in a job launched with spark-submit and do not pass an explicit partition count, spark.default.parallelism determines how many partitions, and therefore how many tasks, the RDD is split into. Raising it exposes more parallelism and can shorten execution times as long as there are idle cores; setting it far too high produces many tiny tasks whose scheduling and memory overhead can degrade cluster performance instead.
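To override the default for a whole job, you can pass the setting on the spark-submit command line with --conf spark.default.parallelism=16, or set it programmatically before the context is created. A sketch of the programmatic route (the value 16 is just an illustrative choice, not a recommendation):

from pyspark import SparkConf, SparkContext

# spark.default.parallelism must be set before the SparkContext exists;
# 16 is an assumed example value here.
conf = (SparkConf()
        .setAppName("override-parallelism")
        .set("spark.default.parallelism", "16"))
sc = SparkContext(conf=conf)

rdd = sc.parallelize(range(1000))     # picks up the configured default
print(rdd.getNumPartitions())         # 16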

Therefore, the ideal value of spark.default.parallelism depends on the available resources, the data size, and the complexity of the computation. The Spark tuning guide suggests roughly 2-3 tasks per CPU core as a starting point; tune from there according to the specific requirements of the job to achieve optimal performance.
