Yes, you can retrieve secrets stored in Secret Manager from a Dataproc (PySpark) job. Use the Secret Manager API to fetch the secret value and pass it to your PySpark code. Here's an example:
First, authenticate with the Secret Manager API using the appropriate credentials:
from google.cloud import secretmanager
# Authenticate with the Secret Manager API
client = secretmanager.SecretManagerServiceClient()
Next, retrieve the secret:
# Retrieve the secret
project_id = '<your-project-id>'
secret_id = '<your-secret-id>'
version_id = 'latest' # alias for the most recent version; a version is required in the resource name
name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
response = client.access_secret_version(request={"name": name})
# Extract the secret value
secret_string = response.payload.data.decode('UTF-8')
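The name-building and decoding steps above can be wrapped in a small helper. `secret_version_name` and `get_secret` are hypothetical names for this sketch, and the API call assumes Application Default Credentials with access to the secret:

```python
def secret_version_name(project_id: str, secret_id: str, version_id: str = "latest") -> str:
    # Build the fully qualified resource name the API expects
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"

def get_secret(project_id: str, secret_id: str, version_id: str = "latest") -> str:
    # Deferred import: google-cloud-secret-manager must be installed on the cluster
    from google.cloud import secretmanager
    client = secretmanager.SecretManagerServiceClient()
    name = secret_version_name(project_id, secret_id, version_id)
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")
```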
Finally, pass the secret to your PySpark code:
from pyspark.sql import SparkSession
# Pass the secret as a Spark configuration option. Note that a comment cannot
# follow a line-continuation backslash, and config values are visible in the
# Spark UI, so avoid this for highly sensitive data.
spark = SparkSession.builder \
    .appName("MyApp") \
    .config("spark.some.config.option", "some-value") \
    .config("mysecret", secret_string) \
    .getOrCreate()
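For the import to work on the cluster, the google-cloud-secret-manager package must be installed on the Dataproc nodes. One way is the `dataproc:pip.packages` cluster property at creation time; the cluster name, region, and package version below are illustrative assumptions:

```shell
# Install the Secret Manager client on all cluster nodes at creation time
# (dataproc:pip.packages requires pinned package==version entries)
gcloud dataproc clusters create my-cluster \
    --region=us-central1 \
    --properties='dataproc:pip.packages=google-cloud-secret-manager==2.16.0'

# Then submit the PySpark job as usual
gcloud dataproc jobs submit pyspark my_job.py \
    --cluster=my-cluster \
    --region=us-central1
```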
Asked: 2022-03-01 11:00:00 +0000
Last updated: Sep 30 '22