
Azure Key Vault can be used to access Azure Blob Storage from Databricks by following these steps:

  1. Create a Key Vault in Azure and store the storage account access key in it as a secret.
  2. Grant Databricks access to the Key Vault, typically by creating an Azure Key Vault-backed secret scope in Databricks (this registers the AzureDatabricks application in the Key Vault's access policies).
  3. In your Databricks notebook, retrieve the storage account key from the secret scope with the dbutils.secrets.get command.
  4. Use the retrieved key to authenticate a client from the Azure Blob Storage library (azure-storage-blob).
  5. Access the containers and blobs in the storage account through the authenticated client.
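
Steps 1 and 2 can be sketched from a terminal with the Azure CLI and the Databricks CLI. This is only an illustration: the names my-rg, my-keyvault, mystorageacct, storage-account-key, and my-kv-scope are placeholder values, not from the original answer, and the subscription ID must be filled in with your own.

```shell
# 1. Create the Key Vault and store the storage account key as a secret
#    (placeholder names throughout; substitute your own)
az keyvault create --name my-keyvault --resource-group my-rg

STORAGE_KEY=$(az storage account keys list \
    --account-name mystorageacct --resource-group my-rg \
    --query "[0].value" -o tsv)

az keyvault secret set --vault-name my-keyvault \
    --name storage-account-key --value "$STORAGE_KEY"

# 2. Create a Key Vault-backed secret scope in Databricks so that
#    dbutils.secrets.get can read secrets from this vault
databricks secrets create-scope --scope my-kv-scope \
    --scope-backend-type AZURE_KEYVAULT \
    --resource-id "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.KeyVault/vaults/my-keyvault" \
    --dns-name "https://my-keyvault.vault.azure.net/"
```

The secret scope can also be created from the Databricks UI (the #secrets/createScope page) if you prefer not to use the CLI.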

Here's an example snippet showing how to use Key Vault to access Azure Blob Storage in Databricks:

# Import the Blob Storage client from the azure-storage-blob package
# (install it with %pip install azure-storage-blob if it is not already on the cluster)
from azure.storage.blob import BlobServiceClient

# Retrieve the storage account key from the Key Vault-backed secret scope
# (dbutils is provided automatically in Databricks notebooks)
storage_account_key = dbutils.secrets.get(scope="<Key Vault Scope>", key="<Storage Account Key Secret Name>")

# Connect to Blob Storage using the storage account name and the retrieved key
blob_service_client = BlobServiceClient(
    account_url="https://<Storage Account Name>.blob.core.windows.net",
    credential=storage_account_key,
)

# List the containers in the Blob Storage account
for container in blob_service_client.list_containers():
    print(container.name)

Note: Replace <Key Vault Scope>, <Storage Account Key Secret Name>, and <Storage Account Name> with your specific values.
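
As an alternative to the SDK, the same secret can be handed to Spark so that files are read directly from the container. This is a sketch of that pattern, assuming a Key Vault-backed secret scope as above; the angle-bracket names are placeholders you must replace with your own values.

```python
# Retrieve the storage account key from the secret scope, as before
storage_account_key = dbutils.secrets.get(
    scope="<Key Vault Scope>", key="<Storage Account Key Secret Name>"
)

# Make the key available to Spark's Azure filesystem client
# (spark is the SparkSession provided by the Databricks notebook)
spark.conf.set(
    "fs.azure.account.key.<Storage Account Name>.blob.core.windows.net",
    storage_account_key,
)

# Read a file from the container using the wasbs:// scheme
df = spark.read.csv(
    "wasbs://<Container Name>@<Storage Account Name>.blob.core.windows.net/<path>.csv",
    header=True,
)
df.show()
```

This avoids pulling data through the driver with the SDK and lets Spark read the files in parallel.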