To use execute_values to insert a JSON array into Postgres, you can follow these steps. First, parse the JSON array and build a list of one-element tuples, one per row:
import json
import psycopg2
import psycopg2.extras  # execute_values lives here; it is not imported by "import psycopg2" alone

# Parse the JSON array into a list of Python dicts
json_data = '[{"name": "John", "age": 30}, {"name": "Jane", "age": 25}]'
data = json.loads(json_data)

# execute_values expects a sequence of tuples, so serialize each dict back to a JSON string
values = [(json.dumps(row),) for row in data]
In this example, we load a JSON array of objects into Python and convert each object back into a JSON string before adding it to the list of tuples. Note that json.dumps is the function that serializes each object to a JSON string.
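As a quick sanity check, the transformation above can be run on its own, with no database required. This sketch reuses the same sample payload from the example:

```python
import json

# Sample payload mirroring the one above
json_data = '[{"name": "John", "age": 30}, {"name": "Jane", "age": 25}]'
data = json.loads(json_data)

# One single-element tuple per row, each holding a JSON string
values = [(json.dumps(row),) for row in data]

print(len(values))  # 2
print(values[0])    # ('{"name": "John", "age": 30}',)
```

Each tuple corresponds to one row of the VALUES list that execute_values will build, and each tuple element maps to one %s in the row template.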
Next, call execute_values to insert the data into the table. For example:

conn = psycopg2.connect(database="mydb", user="myuser", password="mypassword", host="localhost", port="5432")
cur = conn.cursor()
sql = "INSERT INTO mytable (myjsoncolumn) VALUES %s"
psycopg2.extras.execute_values(cur, sql, values)
conn.commit()
cur.close()
conn.close()
In this example, we execute a SQL statement that inserts the list of JSON strings into a column named myjsoncolumn in a table named mytable. The single %s placeholder marks where execute_values substitutes the batched rows in the query; it must not be quoted or repeated per row. We pass the cursor, the SQL query, and the list of tuples to execute_values, then commit the transaction and close the cursor and connection objects.
Note that execute_values batches the rows into a single multi-row INSERT (in pages of page_size rows, 100 by default) and handles parameter escaping for you, which can greatly improve insertion performance compared with calling execute once per row.