Oracle GoldenGate Supported RDBMS to GCP BigQuery

Create the initial-load (SOURCEISTABLE) extract parameter file for the source tables:
GGSCI (pocgg) 9> edit param EINLOAD
GGSCI (pocgg) 10> view param EINLOAD
SOURCEISTABLE
SOURCEDB sourcegg@pocgg, USERID ggsuser_S, PASSWORD Ggsuser$1234
RMTHOST 10.128.0.6, MGRPORT 7820
RMTFILE /GGbigdata/OGGHOME/dirdat/ia, megabytes 200
TABLE sourcegg.categories;
TABLE sourcegg.customers;
TABLE sourcegg.departments;
TABLE sourcegg.order_items;
TABLE sourcegg.orders;
TABLE sourcegg.products;
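SOURCEISTABLE marks EINLOAD as an initial-load task. It can either be registered and started from GGSCI or run directly from the OS command line, which is what is done later in this post. The GGSCI route would look roughly like this (a sketch; if you register the task this way, the parameter file also needs an EXTRACT EINLOAD line at the top):
GGSCI (pocgg) > ADD EXTRACT EINLOAD, SOURCEISTABLE
GGSCI (pocgg) > START EXTRACT EINLOAD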
Next, create the change-data-capture extract parameter file:
GGSCI (pocgg) 14> edit param EGCPBIG
GGSCI (pocgg) 15> view param EGCPBIG
EXTRACT EGCPBIG
SOURCEDB sourcegg@pocgg, USERID ggsuser_S, PASSWORD Ggsuser$1234
DISCARDFILE ./dirrpt/egcpbig.dsc, append, MEGABYTES 10
TRANLOGOPTIONS ALTLOGDEST "/var/lib/mysql/binlog.index"
EXTTRAIL /oggmysql/OGG19c/dirdat/ed
TABLE sourcegg.categories;
TABLE sourcegg.customers;
TABLE sourcegg.departments;
TABLE sourcegg.order_items;
TABLE sourcegg.orders;
TABLE sourcegg.products;
Before starting the extract process, check for supplemental logging on the tables. For MySQL, ADD TRANDATA owner.tablename doesn't work; capture relies on the binary log instead (see the check after this list). By default, GoldenGate determines the row identifier based on:
1. The primary key.
2. The first unique key, alphanumerically, that does not contain a timestamp or non-materialized computed column.
3. If neither a primary key nor a unique key exists, Oracle GoldenGate constructs a pseudo key of all columns that the database allows to be used in a unique key.
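Since MySQL capture reads the binary log (pointed to by TRANLOGOPTIONS ALTLOGDEST above), it is worth confirming that binary logging is enabled and set to ROW format before adding the extract. A minimal check from the shell; the login details are assumptions for this environment:
mysql -u ggsuser_S -p -e "SHOW VARIABLES LIKE 'log_bin'; SHOW VARIABLES LIKE 'binlog_format';"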
GGSCI (pocgg) 5> dblogin SOURCEDB sourcegg@pocgg, USERID ggsuser_S, PASSWORD Ggsuser$1234
Successfully logged into database.
In our case all tables have a primary key. Now add and start the extract process:
GGSCI (pocgg DBLOGIN as ggsuser_S) 12> ADD EXTRACT EGCPBIG, TRANLOG, BEGIN NOW
EXTRACT added.
GGSCI (pocgg DBLOGIN as ggsuser_S) 13> ADD EXTTRAIL /oggmysql/OGG19c/dirdat/ed , EXTRACT EGCPBIG
EXTTRAIL added.
GGSCI (pocgg DBLOGIN as ggsuser_S) 14> info all

Program     Status      Group       Lag at Chkpt    Time Since Chkpt
MANAGER     RUNNING
EXTRACT     STOPPED     EGCPBIG     00:00:00        00:00:07
REPLICAT    RUNNING     RCDC1       00:00:00        00:00:02
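The new extract is still STOPPED at this point. It can be started and checked like this (a sketch using the group name added above):
GGSCI (pocgg) > START EXTRACT EGCPBIG
GGSCI (pocgg) > INFO EXTRACT EGCPBIG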
Now create the data pump parameter file, which reads the local trail and ships it to the GoldenGate for Big Data host:
GGSCI (pocgg) 8> edit param PUMPGCP
GGSCI (pocgg) 9> view param PUMPGCP
EXTRACT PUMPGCP
SOURCEDB sourcegg@pocgg, USERID ggsuser_S, PASSWORD Ggsuser$1234
DISCARDFILE ./dirrpt/pumpgcp.dsc, append, MEGABYTES 10
RMTHOST 10.128.0.6, MGRPORT 7820
RMTTRAIL /GGbigdata/OGGHOME/dirdat/rd
TABLE sourcegg.categories;
TABLE sourcegg.customers;
TABLE sourcegg.departments;
TABLE sourcegg.order_items;
TABLE sourcegg.orders;
TABLE sourcegg.products;
GGSCI (pocgg) 27> ADD EXTRACT PUMPGCP, EXTTRAILSOURCE /oggmysql/OGG19c/dirdat/ed
EXTRACT added.
GGSCI (pocgg) 28> ADD RMTTRAIL /GGbigdata/OGGHOME/dirdat/rd , EXTRACT PUMPGCP
RMTTRAIL added.
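The pump also has to be started once the manager on the remote host (10.128.0.6) is reachable; a sketch:
GGSCI (pocgg) > START EXTRACT PUMPGCP
GGSCI (pocgg) > INFO EXTRACT PUMPGCP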
Run the initial-load extract directly from the OS command line:
[opc@pocgg OGG19c]$ extract paramfile /oggmysql/OGG19c/dirprm/einload.prm reportfile /oggmysql/OGG19c/dirrpt/einload.rpt
Then check the file size of the remote trail file:
[opc@pocgg OGG19c]$ ls -ltrh /GGbigdata/OGGHOME/dirdat/ia*
-rw-r-----. 1 opc mysql 30M Nov 20 18:18 /GGbigdata/OGGHOME/dirdat/ia000000
Also check the report file to confirm the status of the load.
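For example, the report written by the run above can be reviewed on the source host to confirm there were no errors; a minimal sketch:
[opc@pocgg OGG19c]$ less /oggmysql/OGG19c/dirrpt/einload.rpt
[opc@pocgg OGG19c]$ grep -i error /oggmysql/OGG19c/dirrpt/einload.rpt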
We need to download the latest version of the Java client library for BigQuery; the link is below.
https://developers.google.com/api-client-library/java/apis/bigquery/v2
I will share the list of Java client library JARs below; keep all of these JARs in one directory (/home/opc/bigquery/libs/). This path will be the value of the gg.classpath property.
E.g. gg.classpath=/home/opc/bigquery/libs/*
https://download.jar-download.com/cache_jars/com.google.cloud/google-cloud-bigquery/1.124.4/jar_files.zip
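For example, the downloaded bundle could be unpacked into that directory like this (a sketch; the host prompt is a placeholder for the GoldenGate for Big Data VM, and the zip file name matches the link above):
[opc@bigdata ~]$ mkdir -p /home/opc/bigquery/libs
[opc@bigdata ~]$ unzip jar_files.zip -d /home/opc/bigquery/libs/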
The "gg.handler.bigquery.projectId" property is your GCP project ID, e.g. gg.handler.bigquery.projectId=oracle-******8106
For the "gg.handler.bigquery.datasetId" property, go to the BigQuery console. On the left side you will see your project ID; click on it, then create a dataset and give it a name and a location. In this case the value is gg.handler.bigquery.datasetId=CDC_GG.
for "gg.handler.bigquery.credentialsFile" property Go to IAM & Admin console, then go to Service Accounts and create a new service account and download the file. Transfer that file to your VM. 
gg.handler.bigquery.credentialsFile=/home/opc/oracle-****8106-717fb0cdacad.json
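Putting these properties together, the BigQuery handler properties file and the replicat that uses it would look roughly like the sketch below on the Big Data side. This follows the standard GoldenGate for Big Data layout; the file names bigquery.props and rbq.prm, the replicat name RBQ, the GGSCI prompt, and the MAP clause are assumptions, while the property values are the ones shown above.

dirprm/bigquery.props (sketch):
gg.handlerlist=bigquery
gg.handler.bigquery.type=bigquery
gg.handler.bigquery.projectId=oracle-******8106
gg.handler.bigquery.datasetId=CDC_GG
gg.handler.bigquery.credentialsFile=/home/opc/oracle-****8106-717fb0cdacad.json
gg.classpath=/home/opc/bigquery/libs/*

dirprm/rbq.prm (sketch):
REPLICAT RBQ
TARGETDB LIBFILE libggjava.so SET property=dirprm/bigquery.props
MAP sourcegg.*, TARGET sourcegg.*;

The replicat would then be added against the remote trail written by the pump and started:
GGSCI (bigdata) > ADD REPLICAT RBQ, EXTTRAIL /GGbigdata/OGGHOME/dirdat/rd
GGSCI (bigdata) > START REPLICAT RBQ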
Note:
1. An initial load of larger tables will take a long time using GoldenGate.
2. If Oracle GoldenGate for the RDBMS is running in the client data center and GoldenGate for Big Data is installed on a GCP VM, make sure the network bandwidth between the two VMs is high; otherwise you may see lag in the pump process.
