To transfer Oracle tables in bulk to Azure Storage using Azure Data Factory, follow these steps:

  1. Create an Azure Data Factory pipeline: The pipeline orchestrates the transfer from Oracle to Azure Storage. At minimum it contains a Copy activity that reads from an Oracle source dataset and writes to an Azure Storage sink dataset.
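As a rough sketch, a minimal pipeline with a single Copy activity might look like the following JSON definition (all names here, such as OracleSourceDataset and BlobSinkDataset, are illustrative placeholders, and the exact schema may vary with the ADF API version):

```json
{
  "name": "CopyOracleToBlobPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyOracleTable",
        "type": "Copy",
        "inputs": [
          { "referenceName": "OracleSourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "BlobSinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "OracleSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```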

  2. Configure the Source: In Azure Data Factory, connection details live in a linked service rather than in the dataset itself. Create an Oracle linked service with the Oracle server name (host), port, service name, user name, and password, then create a source dataset that references it and identifies the table(s) you want to transfer.
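For illustration, a hedged sketch of the Oracle side — a linked service holding the connection details and a dataset pointing at one table (host, schema, and table names are placeholders, not values from the question):

```json
{
  "name": "OracleLinkedService",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "host=<oracle-server>;port=1521;serviceName=<service-name>;user id=<user>;password=<password>"
    }
  }
}
```

```json
{
  "name": "OracleSourceDataset",
  "properties": {
    "type": "OracleTable",
    "linkedServiceName": {
      "referenceName": "OracleLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "schema": "HR",
      "table": "EMPLOYEES"
    }
  }
}
```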

  3. Configure the Sink: Likewise, create an Azure Storage linked service with the storage account name and access key, then create a sink dataset that references it and specifies the target container (and, optionally, a folder path and file format such as delimited text or Parquet).
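A matching sketch for the storage side, again with placeholder account, container, and folder names, writing CSV files with a header row:

```json
{
  "name": "AzureBlobStorageLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<access-key>"
    }
  }
}
```

```json
{
  "name": "BlobSinkDataset",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "oracle-export",
        "folderPath": "tables"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}
```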

  4. Transform the Data (optional): In some cases you may want to reshape the data before it lands in Azure Storage, for example to aggregate, filter, or join tables. A Mapping Data Flow activity can perform these transformations; for a straight table-to-file copy, a plain Copy activity is enough.

  5. Execute the Pipeline: Once the pipeline is configured, run it with a Debug run or a manual trigger (Trigger now), or attach a schedule trigger for recurring loads. During execution, data is copied from Oracle to Azure Storage according to the configured settings.
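Since the question is about transferring tables in bulk, the common pattern is to wrap the Copy activity in a ForEach activity that iterates over a list of table names and passes each one to a parameterized source dataset. A hedged sketch of such an activity (the tableList parameter and the dataset's tableName parameter are assumed to be defined elsewhere in the pipeline and dataset, respectively):

```json
{
  "name": "ForEachTable",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@pipeline().parameters.tableList",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyOneTable",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "OracleSourceDataset",
            "type": "DatasetReference",
            "parameters": { "tableName": "@item()" }
          }
        ],
        "outputs": [
          { "referenceName": "BlobSinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "OracleSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

A Lookup activity can also feed the ForEach, for example by querying Oracle's data dictionary for the table list instead of hard-coding it as a parameter.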

  6. Monitor the Pipeline: Use the Monitor tab in Azure Data Factory to follow the pipeline and activity runs and to inspect any errors. Once a run completes, check the contents of your Azure Storage container to verify that the data arrived as expected.