Extract Process:

The Extract process captures all changes made to the objects we configure in GoldenGate for synchronization. It reads the data either from the database transaction logs or from archive log files.

Once those transactions are committed, the Extract process writes their details to trail files for propagation to the target system.

We can configure two types of Extract process:

  1. Primary Extract process: The primary Extract can read data directly from the source tables for the initial load. For ongoing change capture, it reads changed data from the database transaction logs or archive logs, and it can write to local trail files or remote trail files.
  2. Secondary (Data Pump) Extract process: This is an optional but highly recommended process. It reads the local trail files and distributes the data to a remote server.

Setting Up Change Capture:

The Extract process is added on the source system.

Steps to configure Change Capture using the Primary Extract:

1. Add the primary Extract. This process has an associated parameter file, and it reads from the transaction logs located on the source system.

Log in to GGSCI:
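For example, from the GoldenGate home directory (the ogguser database credentials here are placeholders):

    $ ./ggsci
    GGSCI> DBLOGIN USERID ogguser, PASSWORD ogguser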

Add the Extract:
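A minimal sketch, using ext2 as the extract name (the same group referenced in step 3 below) and capturing from the transaction logs starting now:

    GGSCI> ADD EXTRACT ext2, TRANLOG, BEGIN NOW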

Verify with the INFO ALL command that the extract has been added:
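Illustrative output (lag values and timestamps will differ on your system):

    GGSCI> INFO ALL

    Program     Status      Group       Lag at Chkpt  Time Since Chkpt
    MANAGER     RUNNING
    EXTRACT     STOPPED     EXT2        00:00:00      00:00:07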

2. Edit the parameter file:
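A minimal parameter file sketch; the scott.emp table and the ./dirdat/aa trail prefix are assumptions for illustration:

    GGSCI> EDIT PARAMS ext2

    EXTRACT ext2
    -- database login for the capture process (placeholder credentials)
    USERID ogguser, PASSWORD ogguser
    -- local trail that captured changes are written to (assumed prefix)
    EXTTRAIL ./dirdat/aa
    -- objects to capture (assumed table)
    TABLE scott.emp;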

3. Add the EXTTRAIL (or RMTTRAIL) for extract ext2:
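For a local trail, assuming the same trail prefix used in the parameter file above:

    GGSCI> ADD EXTTRAIL ./dirdat/aa, EXTRACT ext2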

4. Start the Extract process:
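From the GGSCI prompt:

    GGSCI> START EXTRACT ext2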

5. Check that the Extract process is running using the INFO ALL command; its status should move from STARTING to RUNNING.
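Illustrative output once the extract is up:

    GGSCI> INFO ALL

    Program     Status      Group       Lag at Chkpt  Time Since Chkpt
    MANAGER     RUNNING
    EXTRACT     RUNNING     EXT2        00:00:00      00:00:02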

Steps to Configure Change Capture with the Secondary Extract:

The Data Pump is called the secondary Extract in an Oracle GoldenGate environment.

A Data Pump Extract can write to one or many remote trails on one or many servers.

The tasks of the Data Pump Extract:

1. Reads the local trail files.

2. Manipulates the data or passes it through unchanged.

3. Sends the data to one or many targets.

It is optional but highly recommended, as it provides the benefits below:

1. The Data Pump Extract is a safeguard against network or target failures. When we use only a primary Extract writing to a remote trail, a network or target server outage can cause trail data to fail or be lost; with a Data Pump Extract, the trail file is at least stored locally first.

2. The Data Pump Extract is useful for breaking complex data filtering and transformation into phases.

3. Using a Data Pump Extract, we can consolidate data from many sources.

4. We can use a Data Pump Extract to synchronize one source with multiple targets.

Let us add a Data Pump Extract for the primary Extract we created in the previous example:

Step 1: Add the extract with EXTTRAILSOURCE pointing to the primary extract's trail files.
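A sketch, assuming the primary extract writes to the local trail prefix ./dirdat/aa as configured above:

    GGSCI> ADD EXTRACT pump2, EXTTRAILSOURCE ./dirdat/aa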

Step 2: Check that the extract was added with the INFO ALL command:
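Illustrative output showing both groups:

    GGSCI> INFO ALL

    Program     Status      Group       Lag at Chkpt  Time Since Chkpt
    MANAGER     RUNNING
    EXTRACT     RUNNING     EXT2        00:00:00      00:00:04
    EXTRACT     STOPPED     PUMP2       00:00:00      00:00:10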

Step 3: Configure the parameters of the pump2 (secondary) extract.

I have set the following parameters for the pump2 extract: the remote host is the replication server (repserver), and the remote trail (RMTTRAIL) is ./dirdat/ba.

This process will copy the local trail files to the remote destination repserver, at the location defined by RMTTRAIL.
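A minimal parameter file sketch for pump2; repserver and ./dirdat/ba come from the description above, while the manager port 7809 (the GoldenGate default) and the scott.emp table are assumptions:

    GGSCI> EDIT PARAMS pump2

    EXTRACT pump2
    -- pass data through without filtering or transformation
    PASSTHRU
    -- target system and its manager port (7809 is the default, assumed here)
    RMTHOST repserver, MGRPORT 7809
    -- remote trail on the target, as described above
    RMTTRAIL ./dirdat/ba
    -- objects to pump (assumed table)
    TABLE scott.emp;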

Step 4: Add the RMTTRAIL for the pump2 extract from the GGSCI prompt.
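For example:

    GGSCI> ADD RMTTRAIL ./dirdat/ba, EXTRACT pump2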

Step 5: Start the pump2 extract:
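    GGSCI> START EXTRACT pump2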

Step 6: Check that the pump2 process is running:
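Illustrative output; both extracts should now show RUNNING:

    GGSCI> INFO ALL

    Program     Status      Group       Lag at Chkpt  Time Since Chkpt
    MANAGER     RUNNING
    EXTRACT     RUNNING     EXT2        00:00:00      00:00:03
    EXTRACT     RUNNING     PUMP2       00:00:00      00:00:01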

In the next article, we will see how to configure change delivery.

