Ask Your Question

At the workspace level in Databricks, what is the process to incorporate Spark configuration using Azure Databricks?

asked 2021-12-22 11:00:00 +0000

pufferfish


1 Answer


answered 2021-06-22 17:00:00 +0000

ladyg

In Azure Databricks, Spark configuration is set per cluster through the workspace UI. To add it, follow these steps:

  1. Open your Databricks workspace and click the "Clusters" icon in the left-hand sidebar.
  2. Click the name of the cluster you want to configure, then click "Edit".
  3. Expand "Advanced Options" and open the "Spark" tab; enter your settings in the "Spark config" box.
  4. Enter one configuration per line, with the key and value separated by a space (not a comma-separated list).
  5. After setting the values, click "Confirm" to save.
  6. Restart the cluster to apply the new configuration values.
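
For example, the Spark config box for step 4 might contain entries like the following (these particular keys and values are illustrative, not required settings):

```
spark.sql.shuffle.partitions 100
spark.executor.memory 4g
spark.databricks.delta.preview.enabled true
```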

Note that these settings apply only to the selected cluster. To apply Spark configuration across the whole workspace rather than one cluster at a time, consider cluster policies or a global init script, which can enforce or inject the same settings on every cluster.
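
To see what the cluster actually picked up, you can read a value back in a notebook with `spark.conf.get("spark.sql.shuffle.partitions")`. Conceptually, each line of the Spark config box is split on its first whitespace into a key and a value; the small Python sketch below illustrates that mapping (the `parse_spark_config` helper is hypothetical, for illustration only, not a Databricks API):

```python
def parse_spark_config(text: str) -> dict:
    """Parse Spark-config-box style text: one 'key value' pair per line.

    Hypothetical helper to show how the box content maps to key/value
    pairs; blank lines and '#' comments are skipped.
    """
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        # Split on the first space: everything before is the key,
        # everything after is the value.
        key, _, value = line.partition(" ")
        conf[key] = value.strip()
    return conf

box = """\
spark.sql.shuffle.partitions 100
spark.executor.memory 4g
"""
print(parse_spark_config(box))
```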



