In this article, we will learn about the parameters of the Oracle Data Pump Import utility (impdp).

Parameter Description
abort_step undocumented feature
access_method data access method: default is AUTOMATIC
attach attach to an existing job: no default
cluster start workers across the cluster: default is Y
content content to import: default is ALL
data_options import data layer options
current_edition application edition to be used on the local database
directory default directory specification
dumper_directory directory for the stream dumper
dumpfile import dump file names: format is (file1, file2, ...)
encryption_password encryption key to be used
estimate calculate size estimate: default is BLOCKS
exclude import exclude option
flashback_scn system change number to be used for flashback import: no default
flashback_time database time to be used for flashback import: no default
full indicates a full-mode import
help display descriptions of the import parameters: default is N
include import include option: no default
ip_address IP address for the PL/SQL debugger
job_name job name: no default
keep_master retain the job table upon completion
logfile log important messages to the specified file
master_only only import the master table associated with this job
metrics enable/disable object metrics reporting
mp_enable enable/disable multiprocessing for the current session
network_link network-mode import
nologfile no import log file is created
package_load specify how to load PL/SQL objects
parallel degree of parallelism: default is 1
parallel_threshold degree of DML parallelism
parfile parameter file: name of a file that contains parameter specifications
partition_options options that determine how partitions should be handled: default is NONE
query query used to select a subset of rows for a table
remap_data transform data in user tables
remap_schema remap source schema objects to a new schema
remap_table remap tables to a different name
remap_tablespace remap objects to a different tablespace
reuse_datafiles re-initialize existing datafiles
schemas schemas to import: format is '(schema1, ..., schemaN)'
service_name service name that the job will charge against
silent display information: default is NONE
skip_unusable_indexes skip indexes which are in the unusable state
source_edition application edition to be used on the remote database
sqlfile write appropriate SQL DDL to the specified file
status interval between status updates
streams_configuration import streams configuration metadata
table_exists_action action taken if the table to import already exists
tables tables to import: format is '(table1, ..., tableN)'
tablespaces tablespaces to transport: format is '(ts1, ..., tsN)'
trace trace option: enable sql_trace and timed_stat: default is 0
transform metadata transforms
transportable use transportable data movement: default is NEVER
transport_datafiles list of datafiles to be plugged into the target system
transport_tablespaces transportable tablespace option: default is N
transport_full_check verify that tablespaces to be used do not have dependencies
tts_closure_check enable/disable transportable containment check: default is Y
userid user/password used to connect to Oracle: no default
version job version: default is COMPATIBLE
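To see how several of these parameters combine in practice, here is a minimal sketch of an impdp invocation. The directory object (dp_dir), dump file (hr_exp.dmp), and schema names (hr, hr_dev) are hypothetical placeholders, not values from this article.

```
# Hypothetical example: import the HR schema from a dump file,
# remap it into the HR_DEV schema, replace any tables that already
# exist, and run with four parallel workers.
# (dp_dir, hr_exp.dmp, hr, and hr_dev are illustrative names.)
impdp system directory=dp_dir dumpfile=hr_exp.dmp logfile=hr_imp.log \
      schemas=hr remap_schema=hr:hr_dev \
      table_exists_action=replace parallel=4
```

The same options can also be placed in a text file and supplied via the parfile parameter, which avoids shell quoting issues and makes the job easier to rerun.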

