Azure Key Vault can be used to access Azure Blob Storage in Databricks by following these steps:
1. Store the storage account key as a secret in Key Vault and create a Key Vault-backed secret scope in Databricks.
2. Use the dbutils.secrets.get command to retrieve the storage account key from that secret scope.
3. Connect to Blob Storage with the retrieved key.
Here's an example code snippet that shows how to use Key Vault to access Azure Blob Storage in Databricks:
# Import the Azure Blob Storage Connector
from azure.storage.blob import BlobServiceClient
# Retrieve the storage account key from Key Vault
storage_account_key = dbutils.secrets.get(scope="<Key Vault Scope>", key="<Storage Account Key Secret Name>")
# Connect to Blob Storage using the storage account name and retrieved key
blob_service_client = BlobServiceClient(account_url="https://<Storage Account Name>.blob.core.windows.net", credential=storage_account_key)
# List the containers in the Blob Storage account
containers = blob_service_client.list_containers()
for container in containers:
    print(container.name)
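If you also need to read the contents of a specific blob with the same client, a minimal sketch could look like the following; the container name <Container Name> and blob name <Blob Name> are placeholder examples, not values from the original question:
# Download the contents of a single blob (container and blob names are placeholders)
blob_client = blob_service_client.get_blob_client(container="<Container Name>", blob="<Blob Name>")
blob_data = blob_client.download_blob().readall()
print(blob_data[:100])  # Show the first 100 bytes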
Note: Replace <Key Vault Scope>, <Storage Account Key Secret Name>, and <Storage Account Name> with your specific values.
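Alternatively, the same secret can be used with Spark directly, without the azure-storage-blob SDK. The following is a minimal sketch, assuming a hypothetical container <Container Name> and file path <path/to/file.csv>; it sets the storage account key in the Spark configuration and reads the file over the wasbs protocol:
# Configure Spark to authenticate to the storage account with the key from Key Vault
spark.conf.set(
    "fs.azure.account.key.<Storage Account Name>.blob.core.windows.net",
    storage_account_key
)

# Read a CSV file from a container over the wasbs protocol (placeholders are examples)
df = spark.read.format("csv").option("header", "true").load(
    "wasbs://<Container Name>@<Storage Account Name>.blob.core.windows.net/<path/to/file.csv>"
)
display(df)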