RestAPI for Oracle DBaaS Cookbook

Hello Friends,

Great news to share with you all!!! I am delighted to announce the launch of my first Oracle technical book, “RestAPI for Oracle DBaaS Cookbook”.

This book, authored by myself (Hariprasath Rajaram, Cloud OCP on DBaaS), Skant Gupta (Oracle Certified Master) and Stuti Kaushal (Oracle Certified Associate), is a hands-on guide imparting practical working knowledge and guidance on RestAPI concepts.

The Target Audience

The primary target audience is Oracle DBAs and Oracle Cloud DBAs (DBaaS). This book has been diligently planned for DBAs of all experience levels. Coming straight from real-world practicing DBAs, it is written as a hands-on guide imparting practical working knowledge and guidance on RestAPI concepts.

What this book covers

Chapter 1, Introduction to RestAPI covers the fundamentals of RestAPI: verbs, methods and status codes.

Chapter 2, Access Rules discusses creating, viewing, enabling, disabling and deleting an access rule using RestAPI in Oracle Database Cloud Service.

Chapter 3, Backup and Recovery discusses methods to start a backup operation, view all backup operations, start a recovery operation and view all recovery operations.

Chapter 4, Patches illustrates the various operations that can be performed on Oracle Database Cloud patches using RestAPI.

Chapter 5, Service Instances gives an overview of managing Oracle Database Cloud Service instances using various RestAPI methods.

Chapter 6, Snapshots discusses the procedure for creating, viewing and deleting a database snapshot using RestAPI.

Chapter 7, SSH Keys describes using the Oracle Database Cloud Service RestAPI endpoints to view and manage the Secure Shell (SSH) public keys of a database.

This book is available in Paperback, Kindle Edition and PDF versions.

RestAPI for Oracle DBaaS Cookbook-Paperback

  
BUY ON AMAZON   https://www.amazon.com/dp/1731311249/

Co-Authors

Special thanks to Skant Gupta for initiating the writing of this book and, along with Stuti Kaushal, helping me complete it on time successfully.

Skant Gupta
Stuti Kaushal
RestAPI for Oracle DBaaS Cookbook-Kindle Edition
BUY ON AMAZON   https://www.amazon.in/dp/B07L2C5GKH/

Family Members

Special thanks to my wife and son, who were patient during this time and helped me complete this book. Without your support, I would not have completed this!!!

Doyensys Team

Thanks to Somu Chockalingam, President of Doyensys Inc, for his encouragement and motivation to complete this book successfully. Thanks to the entire Doyensys DBA team who supported me during this journey. Special thanks to Marimuthu Pandu.

Aioug Chennai Chapter Team

Thanks to the AIOUG team for their overall support and guidance, especially Hariharaputhran and Harish.

Hariharaputhran Vaithinathan
Harish Panduranga Rao
Justin Michael Raj
Veeratteshwaran Sridhar

 

Sincerely,
Hariprasath Rajaram
Oracle DBA Senior Consultant,
Doyensys Inc., Chennai,
India
Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DkeGfZsxfzXxHD6gTg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

Oracle SQL*Loader (sqlldr) Utility Tips And Tricks

 

SQL*Loader loads data from external files into tables of an Oracle database. It has a powerful data parsing engine that puts little limitation on the format of the data in the data file. You can use SQL*Loader to do the following:

  • Load data across a network if your data files are on a different system than the database.
  • Load data from multiple data files during the same load session.
  • Load data into multiple tables during the same load session.
  • Specify the character set of the data.
  • Selectively load data (you can load records based on the records’ values).
  • Manipulate the data before loading it, using SQL functions.
  • Generate unique sequential key values in specified columns.
  • Use the operating system’s file system to access the data files.
  • Load data from disk, tape, or named pipe.
  • Generate sophisticated error reports, which greatly aid troubleshooting.
  • Load arbitrarily complex object-relational data.
  • Use secondary data files for loading LOBs and collections.
  • Use conventional, direct path, or external table loads.
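
For example, a direct path load (one of the capabilities listed above) is requested with the DIRECT command-line option. A minimal sketch, using an illustrative control file name:

sqlldr hari/hari control=/home/oracle/employee.ctl direct=true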

  1. Input data file contains the data to be loaded. Its location (and optionally its record format) is specified in the control file with the INFILE parameter.
    cat /home/oracle/employee.txt
    
    100,Hari,MCA,5000
    200,Karthi,Technology,5500
    300,Sunil,Technology,7000
  2. Control file contains the instructions to the sqlldr utility. It tells sqlldr the location of the input file, the format of the input file, and other optional metadata required by sqlldr to load the data into Oracle tables.
load data
infile '/home/oracle/employee.txt'
into table employee
fields terminated by ","
( emp_id,emp_name,dept,salary )

The above control file indicates the following:

  • infile – Indicates the location of the input data file
  • into table – Indicates the table name where this data should be inserted
  • fields terminated by – Indicates the delimiter that is used in the input file to separate the fields
  • ( emp_id, emp_name, dept, salary ) – Lists the columns in the table into which the data should be loaded
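
As an illustration of the optional clauses, the control file below is only a sketch: it assumes a hypothetical data file that also carries a join date as a fifth field (for example 13-SEP-2018), and it applies a SQL function to one column and a date mask to another:

load data
infile '/home/oracle/employee2.txt'
append
into table employee
fields terminated by ","
trailing nullcols
( emp_id,
  emp_name "UPPER(:emp_name)",
  dept,
  salary,
  join_date DATE "DD-MON-YYYY" )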

3. Discard file contains rows that were rejected because they were filtered out by a condition (such as a WHEN clause) in the SQL*Loader control file. Records written to any database table are not written to the discard file.

4. Bad file contains the records that were rejected either by SQL*Loader or by the database because of bad formatting or a data type mismatch. By default it has the same name as the data file, with a .bad extension.

5. Log File  contains a detailed summary of the load, including a description of any errors that occurred during the load.
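
For reference, the bad, discard and log file locations can also be given directly on the sqlldr command line (the file names below are illustrative):

sqlldr hari/hari control=/home/oracle/employee.ctl log=/home/oracle/employee.log bad=/home/oracle/employee.bad discard=/home/oracle/employee.dsc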

Basic execution of SQL*Loader :-

Create the table structure:-

SQL> create table employee(emp_id integer,emp_name varchar2(10),dept varchar2(15),salary integer,Join_date date);

Table created.

Input Data :-(Datafile)

cat /home/oracle/employee.txt

100,Hari,MCA,5000
200,Karthi,Technology,5500
300,Sunil,Technology,7000

INSERT : The default load method for SQL*Loader. It requires the target table to be empty at the start of the load.

Sqlldr control file :-

[oracle@test]  cat > sqlldr-add-records.ctl
load data
infile '/home/oracle/employee.txt'
into table employee
fields terminated by ","
( emp_id,emp_name,dept,salary )

Run the SQL*Loader utility:-

[oracle@orcl:~ orcldemo] sqlldr hari/hari control=/home/oracle/sqlldr-add-records.ctl

SQL*Loader: Release 12.2.0.1.0 - Production on Thu Sep 13 02:40:28 2018

Copyright (c) 1982, 2017, Oracle and/or its affiliates. All rights reserved.

Path used: Conventional
Commit point reached - logical record count 3

Table EMPLOYEE:
3 Rows successfully loaded.

Check the log file:
sqlldr-add-records.log
for more information about the load.

Check the loaded data in the table :-

SQL> select * from hari.employee;

EMP_ID     EMP_NAME   DEPT            SALARY     JOIN_DATE
---------- ---------- --------------- ---------- ---------
100        Hari       MCA             5000
200        Karthi     Technology      5500
300        Sunil      Technology      7000

APPEND :

APPEND adds the new rows to the table without disturbing the existing rows.

Input data to append into the existing table employee :-

cat /home/oracle/newemployee.txt

400,Ram,DBA,5500
500,Siva,Developer,7000
[oracle@test]  cat > sqlldr-append-records.ctl
load data
infile '/home/oracle/newemployee.txt'
append
into table employee
fields terminated by ","
( emp_id,emp_name,dept,salary )
[oracle@orcl:~ orcldemo] sqlldr hari/hari control=/home/oracle/sqlldr-append-records.ctl

SQL*Loader: Release 12.2.0.1.0 - Production on Thu Sep 13 02:47:18 2018Copyright (c) 1982, 2017, Oracle and/or its affiliates. All rights reserved.

Path used: Conventional
Commit point reached - logical record count 2

Table EMPLOYEE:
2 Rows successfully loaded.

Check the log file:
sqlldr-append-records.log
for more information about the load.
SQL> select * from hari.employee;

EMP_ID     EMP_NAME   DEPT            SALARY    JOIN_DATE
---------- ---------- --------------- ---------- ---------
100        Hari      MCA              5000
200        Karthi    Technology       5500
300        Sunil     Technology       7000
400        Ram       DBA              5500
500        Siva      Developer        7000

5 rows selected.

TRUNCATE :

When TRUNCATE is specified, SQL*Loader truncates the table (removing all existing rows) before loading the new data.

Table Structure :

SQL> select * from hari.employee;

EMP_ID     EMP_NAME   DEPT            SALARY     JOIN_DATE
---------- ---------- --------------- ---------- ---------
100        Hari       MCA             5000
200        Karthi     Technology      5500
300        Sunil      Technology      7000
400        Ram        DBA             5500
500        Siva       Developer       7000

5 rows selected.

Input Data file :-

cat /home/oracle/newemployee.txt

400,Sam,DBA,5500
500,Scott,Developer,7000

SQL*Loader control file :

cat > sqlldr-truncate-records.ctl
load data
infile '/home/oracle/newemployee.txt'
truncate
into table employee
fields terminated by ","
( emp_id,emp_name,dept,salary )

SQL*Loader execution :-

[oracle@orcl:~ orcldemo] sqlldr hari/hari control=/home/oracle/sqlldr-truncate-records.ctl

SQL*Loader: Release 12.2.0.1.0 - Production on Fri Sep 14 15:22:16 2018

Copyright (c) 1982, 2017, Oracle and/or its affiliates. All rights reserved.

Path used: Conventional
Commit point reached - logical record count 2

Table EMPLOYEE:
2 Rows successfully loaded.

Check the log file:
sqlldr-truncate-records.log
for more information about the load.

Check the table data,

SQL> select * from hari.employee;

EMP_ID     EMP_NAME   DEPT            SALARY     JOIN_DATE
---------- ---------- --------------- ---------- ---------
400        Sam        DBA             5500
500        Scott      Developer       7000

REPLACE : 

When REPLACE is specified, the entire table is replaced, not just individual rows. SQL*Loader deletes all existing rows from the table and then inserts the new data, even if the new data is the same as the original data.

Table Structure :-

SQL> select * from hari.employee;

EMP_ID     EMP_NAME    DEPT         SALARY  JOIN_DATE
---------- ----------  -----------  ------- ---------
4000        Steven     ITlead        50000
5000        Brad       SystemAdmin   10000

Input data :-

[oracle@orcl:~ orcldemo] cat /home/oracle/newemployee.txt
4000,Steven,ITlead,50000
5000,Brad,SystemAdmin,10000

Control file :-

[oracle@orcl:~ orcldemo] cat sqlldr-replace-records.ctl

load data
infile '/home/oracle/newemployee.txt'
replace
into table employee
fields terminated by ","
( emp_id,emp_name,dept,salary )

SQL*Loader execution :-

[oracle@orcl:~ orcldemo] sqlldr hari/hari control=/home/oracle/sqlldr-replace-records.ctl

SQL*Loader: Release 12.2.0.1.0 - Production on Fri Sep 14 17:08:22 2018

Copyright (c) 1982, 2017, Oracle and/or its affiliates. All rights reserved.

Path used: Conventional
Commit point reached - logical record count 2

Table EMPLOYEE:
2 Rows successfully loaded.

Check the log file:
sqlldr-replace-records.log
for more information about the load.

Check Table data,

SQL> select * from hari.employee;

EMP_ID     EMP_NAME    DEPT         SALARY  JOIN_DATE
---------- ----------  -----------  ------- ---------
4000        Steven     ITlead        50000
5000        Brad       SystemAdmin   10000

Bad and Discard file Scenario :-

Table Structure :-

SQL> create table dept (deptno number,dname varchar(20),location varchar(20));

Table created.

Input data:-

cat sqlloader.dat

10,ACCOUNTING,PAK
30,SALES,RUSSIA
4D,OPERATIONS,USA
50,HUMAN RESOURCE,USA
60,IT,USA
70,PRODUCTION,ENG
80,QUALITY,USA

Control file :

cat sqlload.ctl
load data
infile '/home/oracle/sqlloader.dat'
badfile '/home/oracle/badrecords.bad'
discardfile '/home/oracle/dicardload.dsc'
into table dept
WHEN LOCATION!='USA'
fields terminated by ","
(DEPTNO,DNAME,LOCATION)

Run the SQL*Loader utility :-

[oracle@orcl:~ orcldemo] sqlldr hari/hari control=/home/oracle/sqlload.ctl

SQL*Loader: Release 12.2.0.1.0 - Production on Fri Sep 14 14:43:30 2018

Copyright (c) 1982, 2017, Oracle and/or its affiliates. All rights reserved.

Path used: Conventional
Commit point reached - logical record count 7

Table DEPT:
3 Rows successfully loaded.

Check the log file:
sqlload.log
for more information about the load.

Discard file record:-

[oracle@test:~ orcldemo] cat dicardload.dsc
4D,OPERATIONS,USA
50,HUMAN RESOURCE,USA
60,IT,USA
80,QUALITY,USA

Bad file record:-

[oracle@test:~ orcldemo] cat badrecords.bad
10,ACCOUNTING,PAK
30,SALES,RUSSIA
70,PRODUCTION,ENG

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

 

 

 

All In One Datapump Parameters Up To Oracle 18c (A-Z)

Description:-

In this article we cover all the Data Pump parameters up to Oracle Database 18c.

Oracle 10g Release 1

ATTACH https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-attach-parameter/

CONTENT https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-content-parameter-2/

DIRECTORY https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-directory-parameter/ 
    
DUMPFILE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-dumpfile-parameter/          
        
ESTIMATE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-estimate-parameter/

ESTIMATE_ONLY  http://www.oracledbwr.com/18c-datapump/oracle-18c-datap…e_only-parameter/


EXCLUDE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-exclude-parameter/


FLASHBACK_SCN https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-flashback_scn-parameter/  
 
FLASHBACK_TIME https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-flashback_time-parameter/  
 
FULL https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-database-level/

HELP https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-help-parameter/

INCLUDE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-include-parameter/

JOB_NAME https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-job_name-parameter/

NETWORK_LINK https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-network-link/

NOLOGFILE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-nologfile-parameter/

PARALLEL https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-parallel-parameter/

PARFILE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-parfile-parameter/

QUERY https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-query-parameter/

REMAP_DATAFILE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-remap_datafile-parameter/

REMAP_SCHEMA https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-remap_schema-parameter/ 

REMAP_TABLESPACE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-remap_tablespace-parameter/         
            
REUSE_DUMPFILES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-reuse_dumpfiles-parameter/

SCHEMAS https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-schema-level/

SKIP_UNUSABLE_INDEXES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-skip_unusable_indexes-parameter/

SQLFILE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-sqlfile-parameter/

STATUS https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-status-parameter/

 
TABLE_EXISTS_ACTION  https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-table_exists_action-parameter/


TABLES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-table-level/

TABLESPACES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-tablespace-level/

TRANSPORT_DATAFILES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-transport_datafiles-parameter/

TRANSPORT_TABLESPACES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-transport_tablespaces-parameter/

VERSION https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-version-parameter/

CONTINUE_CLIENT https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-continue_client-parameter/

EXIT_CLIENT https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-exit_client-parameter/

HELP https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-help-parameter/

KILL_JOB https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-kill_job-parameter/

PARALLEL https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-parallel-parameter/

START_JOB https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-start_job-parameter/

STATUS https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-status-parameter/

STOP_JOB https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-stop_job-parameter/

ADD_FILE https://www.oracledbwr.com/18c-database/oracle-18c-datapump-add_file-parameter/


Oracle 10g Release 2

ENCRYPTION_PASSWORD https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-encryption_password-parameter/


FILESIZE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-filesize-parameter/

SAMPLE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-sample-parameter/

COMPRESSION https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-compression-parameter/

Oracle 11g Release 1


PARTITION_OPTIONS https://www.oracledbwr.com/18c-datapump/oracle-18c-datap…ptions-parameter/


REMAP_DATA http://www.oracledbwr.com/18c-datapump/oracle-18c-datap…p_data-parameter/


REMAP_TABLE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-remap_table-parameter/

ENCRYPTION https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-encryption-parameter-2/

ENCRYPTION_ALGORITHM https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-encryption_algorithm-parameter/

ENCRYPTION_MODE https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-encryption_mode-parameter/

Oracle 11g Release 2

SOURCE_EDITION https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-source_edition-parameter/

TARGET_EDITION https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-target_edition-parameter/

Oracle 12c Release 1

VIEWS_AS_TABLES https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-views_as_tables-parameter/

METRICS https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-metrics-parameter/

MASTER_ONLY http://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-master_only-parameter/


KEEP_MASTER https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-keep_master-parameter/

ENCRYPTION_PWD_PROMPT https://www.oracledbwr.com/uncategorized/oracle-18c-datapump-encryption_pwd_prompt-parameter/


DISABLE_ARCHIVE_LOGGING https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-disable_archive_logging-parameter/

LOGTIME https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-logfile-parameter/


Oracle 12c Release 2

REMAP_DIRECTORY https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-remap_directory-parameter/

TRANSPORT_DATAFILES https://www.oracledbwr.com/18c-datapump/oracle-18c-datap…afiles-parameter/

LOGTIME https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-logtime-parameter/

Oracle 18c Release 1

DATABASE LINK https://www.oracledbwr.com/18c-datapump/oracle-18c-datapump-database-link/

 

Oracle 18c Datapump-TARGET_EDITION Parameter

  • The TARGET_EDITION parameter is used in the Data Pump import (impdp) process.
  • You should remove the objects from the target edition before importing the dump. In this example, the dump file containing the views is imported into the ORA$BASE edition.
  • Drop the view present in the ORA$BASE edition under the HARI user, then run the import with TARGET_EDITION, as shown below.
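
For reference, the impdp syntax for this parameter is simply the name of the edition in the target database into which the objects should be imported:

TARGET_EDITION=name_of_edition
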
SQL> SELECT SYS_CONTEXT('userenv', 'current_edition_name') FROM DUAL;

SYS_CONTEXT('USERENV','CURRENT_EDITION_NAME')
----------------------------------------------
ORA$BASE
SQL> conn hari/oracle;
Connected.

SQL> ALTER SESSION SET EDITION =ORA$BASE;

Session altered.

SQL> drop view ed_emp_view_ORABASE;

View dropped.

Import the dump file / views into the ORA$BASE edition.

[oracle@18c empdata]$ impdp dumpfile=edition.dmp directory=TEST_DIR target_edition=ORA\$BASE

Import: Release 18.0.0.0.0 - Production on Sat Aug 25 02:38:44 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Username: system
Password:

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=edition.dmp directory=TEST_DIR target_edition=ORA$BASE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at Sat Aug 25 02:38:51 2018 elapsed 0 00:00:03

After the import, you can verify the imported objects after setting the corresponding edition.

SQL> conn hari/oracle;
Connected.
SQL> ALTER SESSION SET EDITION =ORA$BASE;

Session altered.

SQL> SELECT SYS_CONTEXT('userenv', 'current_edition_name') FROM DUAL;

SYS_CONTEXT('USERENV','CURRENT_EDITION_NAME')
---------------------------------------------
ORA$BASE

SQL> desc ed_emp_view_ORABASE;

Name       Null?    Type
---------- -------- --------
ID                  NUMBER
SAL                 NUMBER

 

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

 

Oracle 18c Datapump-SOURCE_EDITION Parameter

  • The SOURCE_EDITION parameter is used in the Data Pump export (expdp) process.
  • Oracle allows different versions of the same object to exist under different editions. This option lets you back up and restore those versions across editions.
  • During application development you can create multiple versions of objects for different releases, so if you want to migrate code from one version to another within a database, or across databases, you can use this option.
  • Editions apply only to a limited set of object types, namely those that do not have storage (see the list of editionable types below).
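
For reference, the expdp syntax is simply the name of the edition from which the objects should be exported:

SOURCE_EDITION=edition_name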

EDITIONABLE AND NONEDITIONABLE SCHEMA OBJECT TYPES :

These schema objects types are editionable:

  • SYNONYM
  • VIEW
  • All PL/SQL object types:
    • FUNCTION
    • LIBRARY
    • PACKAGE and PACKAGE BODY
    • PROCEDURE
    • TRIGGER
    • TYPE and TYPE BODY
  • All other schema object types are noneditionable. A table is an example of a noneditionable type.
  • A schema object of an editionable type is editioned if its owner is editions-enabled; otherwise, it is potentially editioned.
  • A schema object of a noneditionable type is always noneditioned, even if its owner is editions-enabled. A table is an example of a noneditioned object.

Grant the HARI user access to create editions

SQL> GRANT CREATE ANY EDITION, DROP ANY EDITION to HARI;

Grant succeeded.

Enable editions for the user HARI

SQL> select USERNAME,EDITIONS_ENABLED from dba_users where USERNAME='HARI';

USERNAME      E
-----------  ---
HARI          N


SQL> alter user HARI enable editions;

User altered.

SQL> grant create any view to HARI;

Grant succeeded.

SQL> select USERNAME,EDITIONS_ENABLED from dba_users where USERNAME='HARI';

USERNAME         E
-----------     ---
HARI             Y

Check the current & new edition for HARI

SQL> conn hari/oracle;
Connected.


SQL> create edition NEW_EDITION;

Edition created.


SQL> SELECT SYS_CONTEXT('userenv', 'current_edition_name') FROM DUAL;

SYS_CONTEXT('USERENV','CURRENT_EDITION_NAME')
-----------------------------------------------
ORA$BASE

SQL> create editioning view ED_EMP_VIEW_ORABASE as select ID,SAL from emp_data;

View created.

SQL> desc ed_emp_view_ORABASE;
Name       Null?    Type
---------- -------- --------
ID                  NUMBER
SAL                 NUMBER


SQL> alter session set edition=NEW_EDITION;

Session altered.

SQL> SELECT SYS_CONTEXT('userenv', 'current_edition_name') FROM DUAL;

SYS_CONTEXT('USERENV','CURRENT_EDITION_NAME')
-----------------------------------------------
NEW_EDITION

Take the export of the view

[oracle@18c empdata]$ expdp dumpfile=edition.dmp schemas=HARI include=view directory=TEST_DIR source_edition=NEW_EDITION

Export: Release 18.0.0.0.0 - Production on Sat Aug 25 02:20:35 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Username: hari
Password:

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "HARI"."SYS_EXPORT_SCHEMA_05": hari/******** dumpfile=edition.dmp schemas=HARI include=view directory=TEST_DIR source_edition=NEW_EDITION
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Master table "HARI"."SYS_EXPORT_SCHEMA_05" successfully loaded/unloaded
******************************************************************************
Dump file set for HARI.SYS_EXPORT_SCHEMA_05 is:
/u01/empdata/edition.dmp
Job "HARI"."SYS_EXPORT_SCHEMA_05" successfully completed at Sat Aug 25 02:20:52 2018 elapsed 0 00:00:12

 

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

 

Oracle 18c Datapump-REMAP_DATA Parameter

REMAP_DATA :-

It is a common requirement for a DBA to export data from production for various purposes. In a restricted or production environment it is a must to mask particular data while exporting from production; REMAP_DATA lets you apply a user-defined PL/SQL function to a column's data as it is exported or imported.

Syntax :- REMAP_DATA=[schema.]tablename.column_name:[schema.]pkg.function

schema — the schema containing the table to be remapped. By default, this is the schema of the user doing the export.
tablename — the table whose column will be remapped.
column_name — the column whose data is to be remapped. The maximum number of columns that can be remapped for a single table is 10.
schema — the schema containing the PL/SQL package you have created that contains the remapping function. By default, this is the schema of the user doing the export.
pkg — the name of the PL/SQL package you have created that contains the remapping function.
function — the name of the function within the PL/SQL package that will be called to remap the column data in each row of the specified table.

Create a masking package (specification and body) in the source database

create or replace package datapump_remap_emp
as
function idcard(id varchar2) return varchar2;
end;
/

Package created.

create or replace package body datapump_remap_emp
as
function idcard(id varchar2) return varchar2 is
begin
return translate(id,
'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz1234567890!@#$%^&*()_+-=\/ ',
'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx');
end;
end;
/

Package body created.
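
As a quick sanity check (not part of the original walkthrough), the masking function can be called directly while connected as the package owner; every character in the first translate list is replaced with 'x':

SQL> select datapump_remap_emp.idcard('Hari123') as masked from dual;

MASKED
----------
xxxxxxx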

REMAP_DATA :

[oracle@18c ~]$ expdp system/oracle directory=DATA_PUMP_DIR dumpfile=emp_prod18c.dmp logfile=emp_prod18c.log REMAP_DATA=hari.emp.id:sys.datapump_remap_emp.idcard

Export: Release 18.0.0.0.0 - Production on Tue Aug 21 02:09:18 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_02": system/******** directory=DATA_PUMP_DIR dumpfile=emp_prod18c.dmp logfile=emp_prod18c.log REMAP_DATA=hari.emp.id:sys.datapump_remap_emp.idcard
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
. . exported "SYSTEM"."SYS_EXPORT_SCHEMA_01" 263.0 KB 1456 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_02" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_02 is:
/u01/app/oracle/admin/prod18c/dpdump/emp_prod18c.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_02" successfully completed at Tue Aug 21 02:09:58 2018 elapsed 0 00:00:40

During a normal import, the emp table's ID column arrives in the target database with the masked values, since that is what was written to the dump file. REMAP_DATA can also be specified during the impdp process; if the remapping function were reversible, the reverse function could be supplied at import time to restore the original column data.
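
For illustration only (a sketch, not run in this article), an import that applies a remapping function would look like the command below. The function sys.datapump_remap_emp.unmask shown here is hypothetical; the translate-based idcard function above is not reversible, so a real reverse function would have to be written differently:

impdp system/oracle directory=DATA_PUMP_DIR dumpfile=emp_prod18c.dmp logfile=emp_imp18c.log REMAP_DATA=hari.emp.id:sys.datapump_remap_emp.unmask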

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

Oracle 18c Datapump-PARTITION_OPTIONS Parameter

The PARTITION_OPTIONS parameter determines how partitions will be handled during export and import operations. The syntax is shown below.

Syntax:

PARTITION_OPTIONS={none | departition | merge}

NONE: The partitions are created exactly as they were on the system the export was taken from.
DEPARTITION: Each partition and sub-partition is created as a separate table, named using a combination of the table and (sub-)partition name.
MERGE: This option imports all partition data into a single, non-partitioned table.
The NONE and MERGE options are not possible if the export was done using the TRANSPORTABLE parameter with a partition or subpartition filter. If there are any grants on objects being departitioned, an error message is generated and the objects are not loaded.
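
The walkthrough below demonstrates DEPARTITION and MERGE. For completeness, a NONE import (the default behaviour) would look like the sketch below; it simply recreates the partitions as they were at export time (this command was not run in this article):

impdp hari/oracle dumpfile=employee.dmp directory=test_dir partition_options=NONE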

Partition Table creation:

SQL> create table employee (emp_id number,sal INT NOT NULL) PARTITION BY RANGE (sal)( PARTITION employee_e1 VALUES LESS THAN (001),PARTITION employee_e2 VALUES LESS THAN (002));

Table created.
SQL> SELECT partition_name FROM user_tab_partitions WHERE table_name = 'EMPLOYEE';

PARTITION_NAME
----------------
EMPLOYEE_E1
EMPLOYEE_E2
[oracle@18c empdata]$ expdp hari/oracle dumpfile=employee.dmp directory=test_dir tables=employee

Export: Release 18.0.0.0.0 - Production on Fri Aug 24 23:32:41 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "HARI"."SYS_EXPORT_TABLE_01": hari/******** dumpfile=employee.dmp directory=test_dir tables=employee
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "HARI"."EMPLOYEE":"EMPLOYEE_E1" 0 KB 0 rows
. . exported "HARI"."EMPLOYEE":"EMPLOYEE_E2" 0 KB 0 rows
Master table "HARI"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
**********************************************************
Dump file set for HARI.SYS_EXPORT_TABLE_01 is:
/u01/empdata/employee.dmp
Job "HARI"."SYS_EXPORT_TABLE_01" successfully completed at Fri Aug 24 23:33:00 2018 elapsed 0 00:00:18

DEPARTITION :-

Drop EMPLOYEE table, 

SQL> conn hari/oracle;
Connected.
SQL> drop table employee purge;

Table dropped.

 

[oracle@18c empdata]$ impdp hari/oracle dumpfile=employee.dmp directory=test_dir partition_options=DEPARTITION

Import: Release 18.0.0.0.0 - Production on Fri Aug 24 23:48:09 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Master table "HARI"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "HARI"."SYS_IMPORT_FULL_01": hari/******** dumpfile=employee.dmp directory=test_dir partition_options=DEPARTITION
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "HARI"."EMPLOYEE_EMPLOYEE_E1" 0 KB 0 rows
. . imported "HARI"."EMPLOYEE_EMPLOYEE_E2" 0 KB 0 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "HARI"."SYS_IMPORT_FULL_01" successfully completed at Fri Aug 24 23:48:12 2018 elapsed 0 00:00:03
SQL> select * from tab where tname like 'EMP%';

TNAME                TABTYPE       CLUSTERID
-------------------- ------------- ----------
EMPLOYEE               TABLE
EMPLOYEE_EMPLOYEE_E1   TABLE
EMPLOYEE_EMPLOYEE_E2   TABLE

SQL> select partition_name from user_tab_partitions where table_name='EMPLOYEE';

no rows selected

MERGE :-

[oracle@18c empdata]$ impdp hari/oracle dumpfile=employee.dmp directory=test_dir partition_options=MERGE

Import: Release 18.0.0.0.0 - Production on Sat Aug 25 00:36:31 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Master table "HARI"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "HARI"."SYS_IMPORT_FULL_01": hari/******** dumpfile=employee.dmp directory=test_dir partition_options=MERGE
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "HARI"."EMPLOYEE":"EMPLOYEE_E1" 0 KB 0 rows
. . imported "HARI"."EMPLOYEE":"EMPLOYEE_E2" 0 KB 0 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "HARI"."SYS_IMPORT_FULL_01" successfully completed at Sat Aug 25 00:36:35 2018 elapsed 0 00:00:02
SQL> select * from tab where tname like 'EMP%';

TNAME                TABTYPE       CLUSTERID
-------------------- ------------- ----------
EMPLOYEE               TABLE
EMPLOYEE_EMPLOYEE_E1   TABLE
EMPLOYEE_EMPLOYEE_E2   TABLE

SQL> select partition_name from user_tab_partitions where table_name='EMPLOYEE';

no rows selected

SQL> select table_name , partitioned from dba_tables where table_name='EMPLOYEE';

TABLE_NAME           PAR
-------------------- ---
EMPLOYEE             NO

 

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

Oracle 18c Datapump-KEEP_MASTER Parameter

Description:-

Indicates whether the master table should be deleted or retained at the end of a Data Pump job that completes successfully. The master table is automatically retained for jobs that do not complete successfully.

Without Keep_master

[oracle@prod101:~ orcl101] expdp scott/tiger tables=emp,oracledbwr directory=TEST_DIR dumpfile=tbs.dmp logfile=tbs.log Job_name=oradbwr24

Export: Release 18.0.0.0.0 - Production on Sat Aug 25 00:12:03 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "SCOTT"."ORADBWR24": scott/******** tables=emp,oracledbwr directory=TEST_DIR dumpfile=tbs.dmp logfile=tbs.log Job_name=oradbwr24 
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
. . exported "SCOTT"."ORACLEDBWR" 3.067 MB 458752 rows
. . exported "SCOTT"."EMP" 0 KB 0 rows
ORA-39173: Encrypted data has been stored unencrypted in dump file set.
Master table "SCOTT"."ORADBWR24" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.ORADBWR24 is:
/u01/app/oracle/datapump/tbs.dmp
Job "SCOTT"."ORADBWR24" successfully completed at Sat Aug 25 00:12:07 2018 elapsed 0 00:00:04

 

SQL> select owner,segment_name,segment_type,tablespace_name,(bytes/1024/1024)MB from dba_segments where segment_name='ORADBWR24';

no rows selected

With keep_master

[oracle@prod101:~ orcl101] expdp scott/tiger tables=emp,oracledbwr directory=TEST_DIR dumpfile=tbs1.dmp logfile=tbs1.log Job_name=oradbwr24 keep_master=y

Export: Release 18.0.0.0.0 - Production on Sat Aug 25 00:14:10 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "SCOTT"."ORADBWR24": scott/******** tables=emp,oracledbwr directory=TEST_DIR dumpfile=tbs1.dmp logfile=tbs1.log Job_name=oradbwr24 keep_master=y 
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
. . exported "SCOTT"."ORACLEDBWR" 3.067 MB 458752 rows
. . exported "SCOTT"."EMP" 0 KB 0 rows
ORA-39173: Encrypted data has been stored unencrypted in dump file set.
Master table "SCOTT"."ORADBWR24" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.ORADBWR24 is:
/u01/app/oracle/datapump/tbs1.dmp
Job "SCOTT"."ORADBWR24" successfully completed at Sat Aug 25 00:14:14 2018 elapsed 0 00:00:03

With KEEP_MASTER=Y the master table is not deleted after the job completes. Retaining the master table does not lead to any data dictionary corruption, and dropping it manually afterwards will not cause any corruption either.

SQL> select owner,segment_name,segment_type,tablespace_name,(bytes/1024/1024)MB from dba_segments where segment_name='ORADBWR24';

OWNER           SEGMENT_NAME         SEGMENT_TYPE       TABLESPACE_NAME      MB
--------------- -------------------- ------------------ -------------------- ----------
SCOTT           ORADBWR24            TABLE              USERS                     .1875
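
Once you have finished inspecting it, the retained master table can be dropped manually (a sketch, run as or on behalf of the job owner):

SQL> drop table scott.oradbwr24 purge;

Table dropped.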

 

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

Oracle 18c Datapump-METRICS Parameter

Description:-

When METRICS=YES is used, the number of objects and the elapsed time are recorded in the Data Pump log file.

METRICS=[YES | NO]

Additional information can be obtained about the number of objects that were processed and the time it took to process them.

The METRICS parameter does not create any additional trace files. Instead, the extra details are logged in the log file of the Export Data Pump or Import Data Pump job.

[oracle@prod101 ~]$ expdp scott/tiger directory=test_dir dumpfile=oradbs.dmp logfile=oradbs.log tables=oracledbwr metrics=y

Export: Release 18.0.0.0.0 - Production on Fri Aug 24 23:48:28 2018
Version 18.3.0.0.0
Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "SCOTT"."SYS_EXPORT_TABLE_01": scott/******** directory=test_dir dumpfile=oradbs.dmp logfile=oradbs.log tables=oracledbwr metrics=y 
W-1 Startup took 0 seconds
W-1 Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
W-1 Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
W-1 Completed 1 INDEX_STATISTICS objects in 1 seconds
W-1 Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
W-1 Completed 1 TABLE_STATISTICS objects in 0 seconds
W-1 Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
W-1 Completed 1 MARKER objects in 7 seconds
W-1 Processing object type TABLE_EXPORT/TABLE/TABLE
W-1 Completed 1 TABLE objects in 4 seconds
W-1 Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
W-1 Completed 1 INDEX objects in 0 seconds
W-1 . . exported "SCOTT"."ORACLEDBWR" 3.067 MB 458752 rows in .9999999999999999999999999999999999999996 seconds using direct_path
W-1 Completed 1 TABLE_EXPORT/TABLE/TABLE_DATA objects in 1 seconds
W-1 Master table "SCOTT"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_TABLE_01 is:
/u01/app/oracle/datapump/oradbs.dmp
Job "SCOTT"."SYS_EXPORT_TABLE_01" successfully completed at Fri Aug 24 23:49:00 2018 elapsed 0 00:00:25

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba

Oracle 18c Datapump-ESTIMATE_ONLY Parameter

This parameter is useful when you want to get the approximate size of the dump file. Suppose you have only limited space available on disk and want to check whether your dump will fit into the available space; you can use this parameter to estimate the size of the dump without doing the actual export. The default is ESTIMATE_ONLY=NO.

  • You do not need to specify any of the other parameters such as DIRECTORY or DUMPFILE.

Syntax :

ESTIMATE_ONLY=[YES | NO]

If ESTIMATE_ONLY=YES, then Export estimates the space that the export would consume, but the export operation is not actually performed.

[oracle@18c empdata]$ expdp system/oracle TABLESPACES=USERS estimate_only=YES

Export: Release 18.0.0.0.0 - Production on Fri Aug 24 22:51:52 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "SYSTEM"."SYS_EXPORT_TABLESPACE_01": system/******** TABLESPACES=USERS estimate_only=YES
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. estimated "SAM"."EMP" 2 MB
. estimated "SAM"."SYS_EXPORT_SCHEMA_02" 1.75 MB
. estimated "SAM"."SYS_EXPORT_SCHEMA_03" 1.75 MB
. estimated "SAM"."SYS_EXPORT_SCHEMA_04" 1.75 MB
. estimated "SAM"."SYS_EXPORT_SCHEMA_01" 320 KB
Total estimation using BLOCKS method: 7.562 MB
Job "SYSTEM"."SYS_EXPORT_TABLESPACE_01" successfully completed at Fri Aug 24 22:51:57 2018 elapsed 0 00:00:04

 

Catch Me On:- Hariprasath Rajaram

Telegram:https://t.me/joinchat/I_f4DhGF_Zifr9YZvvMkRg
LinkedIn:https://www.linkedin.com/in/hari-prasath-aa65bb19/
Facebook:https://www.facebook.com/HariPrasathdba
FB Group:https://www.facebook.com/groups/894402327369506/
FB Page: https://www.facebook.com/dbahariprasath/?
Twitter: https://twitter.com/hariprasathdba