
To set Spark configuration for a cluster in your Azure Databricks workspace, follow these steps:

  1. Open your Databricks workspace and click on the "Clusters" icon in the left-hand sidebar.
  2. Click on the cluster name for which you want to set Spark configuration.
  3. In the "Spark Config" tab, you can add/edit configuration values as per your requirement.
  4. Enter one configuration pair per line, with the key and value separated by a space.
  5. After setting the values, click "Save".
  6. Now, restart the cluster to apply the new configuration values.
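
As an illustration of step 4, the Spark Config box accepts plain key-value pairs, one per line. The property names below are standard Apache Spark settings used here only as examples; substitute the values your workload actually needs:

```
spark.sql.shuffle.partitions 200
spark.executor.memoryOverhead 1g
spark.serializer org.apache.spark.serializer.KryoSerializer
```

These values apply to every notebook and job that attaches to the cluster once it restarts.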

Note that settings applied this way take effect at the cluster level; repeat the process for each cluster in the workspace that needs the configuration.