There are several ways to efficiently move a million rows from one PostgreSQL server to another using Java:

  1. Use JDBC Batch Insert: JDBC provides a batch feature that lets you queue many INSERT statements and send them to the server together. This reduces the number of round trips to the database, which is usually the main bottleneck when inserting row by row (see the first sketch after this list).

  2. Use COPY command: COPY is a native PostgreSQL command for bulk-loading large amounts of data from a file or stream. From Java you can export the source rows to a CSV file (or stream them directly) and feed them to `COPY ... FROM STDIN` on the target database through the JDBC driver's CopyManager API (see the second sketch after this list).

  3. Use PreparedStatement with Batch Updates: You can also use prepared statements with batch updates to insert a large amount of data into a PostgreSQL database. Prepared statements cache the parsed SQL statement and speed up database access by reducing parsing overhead, while batch updates let you send many statements to the server in a single round trip (see the third sketch after this list).

  4. Use pg_bulkload: pg_bulkload is a third-party command-line utility for PostgreSQL that loads large data sets into a database quickly. You can use the Java Process API to run the pg_bulkload command from Java (see the last sketch after this list).
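
For option 1, here is a minimal sketch of a JDBC batch insert. The connection URL, credentials, and the `rows (id, name)` table are placeholders; adjust them to your own schema. The PostgreSQL JDBC driver can also rewrite batched inserts into multi-row INSERT statements if you add `reWriteBatchedInserts=true` to the connection URL.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- replace with your own.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://target-host:5432/targetdb", "user", "password")) {
            conn.setAutoCommit(false); // commit per batch instead of per row

            String sql = "INSERT INTO rows (id, name) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                final int batchSize = 10_000;
                for (int i = 0; i < 1_000_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();              // queue the row locally
                    if ((i + 1) % batchSize == 0) {
                        ps.executeBatch();      // send the whole batch in one round trip
                        conn.commit();
                    }
                }
                ps.executeBatch();              // flush the remaining rows
                conn.commit();
            }
        }
    }
}
```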
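
For option 2, the PostgreSQL JDBC driver exposes COPY through its `CopyManager` class, so you can feed a CSV file (or any `Reader`/`InputStream`) to `COPY ... FROM STDIN` without shelling out to psql. A minimal sketch, assuming a CSV file at `/tmp/rows.csv` and a target table `rows (id, name)`:

```java
import java.io.FileReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CopyCsvExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and CSV path -- replace with your own.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://target-host:5432/targetdb", "user", "password");
             Reader csv = new FileReader("/tmp/rows.csv")) {

            // CopyManager exposes the COPY protocol through the PostgreSQL JDBC driver.
            CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();

            // Stream the CSV file straight into the target table.
            long rowsCopied = copyManager.copyIn(
                    "COPY rows (id, name) FROM STDIN WITH (FORMAT csv)", csv);

            System.out.println("Copied " + rowsCopied + " rows");
        }
    }
}
```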
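
For option 3, here is a sketch that reads rows from the source server and writes them to the target with a batched prepared statement, so the whole data set never has to fit in memory. The host names, table, and columns are assumptions; the fetch size and batch size are tuning knobs:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class SourceToTargetBatchExample {
    public static void main(String[] args) throws Exception {
        // Placeholder hosts, table, and columns -- adapt to your schema.
        try (Connection source = DriverManager.getConnection(
                 "jdbc:postgresql://source-host:5432/sourcedb", "user", "password");
             Connection target = DriverManager.getConnection(
                 "jdbc:postgresql://target-host:5432/targetdb", "user", "password")) {

            source.setAutoCommit(false);   // required for cursor-based (streamed) fetches
            target.setAutoCommit(false);   // commit in chunks on the target side

            try (Statement read = source.createStatement();
                 PreparedStatement write = target.prepareStatement(
                     "INSERT INTO rows (id, name) VALUES (?, ?)")) {

                read.setFetchSize(10_000); // stream rows instead of loading them all at once
                try (ResultSet rs = read.executeQuery("SELECT id, name FROM rows")) {
                    int count = 0;
                    while (rs.next()) {
                        write.setInt(1, rs.getInt("id"));
                        write.setString(2, rs.getString("name"));
                        write.addBatch();
                        if (++count % 10_000 == 0) {
                            write.executeBatch();
                            target.commit();
                        }
                    }
                    write.executeBatch();  // flush the last partial batch
                    target.commit();
                }
            }
        }
    }
}
```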
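
For option 4, you can launch pg_bulkload from Java with `ProcessBuilder`. The control file path, database name, and arguments below are assumptions; check the pg_bulkload documentation for the options that match your installation:

```java
import java.io.IOException;

public class PgBulkloadExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Assumed invocation: a control file describing the input CSV, target table,
        // and load options. The exact arguments depend on your pg_bulkload version.
        ProcessBuilder pb = new ProcessBuilder(
                "pg_bulkload", "-d", "targetdb", "/tmp/rows.ctl");
        pb.inheritIO(); // forward pg_bulkload's output to this process's console

        Process process = pb.start();
        int exitCode = process.waitFor();
        if (exitCode != 0) {
            throw new RuntimeException("pg_bulkload failed with exit code " + exitCode);
        }
    }
}
```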

Overall, the method you choose depends on the size of the data set, the frequency of updates, and the available resources. The COPY command or pg_bulkload is generally recommended for larger data sets, while JDBC batch inserts with prepared statements are simpler and usually sufficient for smaller ones.