Ans: You can use the CONTENT=METADATA_ONLY parameter during export; it unloads only object definitions (DDL), with no table data.
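For example, a metadata-only schema export might look like this sketch (the schema, directory object, and file names are placeholder assumptions):

```shell
# Export only the DDL (no rows) for the SCOTT schema.
# SCOTT, DATA_PUMP_DIR, and the file names are example values.
expdp system/password SCHEMAS=scott \
      CONTENT=METADATA_ONLY \
      DIRECTORY=DATA_PUMP_DIR \
      DUMPFILE=scott_meta.dmp LOGFILE=scott_meta.log
```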
Q2: Which memory area is used by Data Pump processes?
Ans: Data Pump uses the streams pool in the SGA (the queues used for communication between the master and worker processes are allocated there). You can check its current size with:
SQL> show parameter streams_pool_size

NAME               TYPE        VALUE
------------------ ----------- -----
streams_pool_size  big integer 96M
- Allocate sufficient streams_pool_size memory; a sizing query against the instance can suggest a recommended value for this parameter.
- Use CLUSTER=N: in a RAC environment this can improve the speed of Data Pump API based operations.
- Set PARALLEL_FORCE_LOCAL to TRUE, keeping in mind that PARALLEL_FORCE_LOCAL has a wider scope of effect than just Data Pump API based operations.
- Use EXCLUDE=STATISTICS: skipping the generation and export of statistics shortens any export operation. Run DBMS_STATS.GATHER_DATABASE_STATS on the target database once the import completes.
- Use PARALLEL: if more than one CPU is available, the environment is not already CPU, disk I/O, or memory bound, and multiple dump files (ideally on different spindles) are listed in the DUMPFILE parameter, then parallelism has the greatest potential to improve performance.
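The tuning suggestions above can be gathered into a parameter file; here is a minimal sketch (the schema, directory object, and file names are placeholder assumptions):

```shell
# Write an example Data Pump parameter file applying the tuning
# advice above. All object and file names are placeholders.
cat > exp_tuning.par <<'EOF'
SCHEMAS=scott
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=scott_%U.dmp
LOGFILE=scott_exp.log
EXCLUDE=STATISTICS
CLUSTER=N
PARALLEL=4
EOF
cat exp_tuning.par
# The export would then be run as:
#   expdp system/password PARFILE=exp_tuning.par
```

Using %U in DUMPFILE lets each parallel worker write to its own dump file, which is what allows PARALLEL to take effect.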
Q4: How do you monitor the status of export/import (Data Pump) operations/jobs?
Ans: Query DBA_DATAPUMP_JOBS (and V$SESSION_LONGOPS for progress), check the job's log file, or attach to the running job with expdp/impdp ATTACH=<job_name> and issue the STATUS command.
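For instance, running jobs and their rough progress can be checked with queries along these lines (the LIKE filter assumes the default SYS_EXPORT*/SYS_IMPORT* job names):

```sql
-- List all Data Pump jobs and their current state.
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- Rough progress of a long-running Data Pump operation.
-- Default job names look like SYS_EXPORT_SCHEMA_01, SYS_IMPORT_FULL_01, etc.
SELECT opname, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 1) AS pct_done
  FROM v$session_longops
 WHERE (opname LIKE 'SYS_EXPORT%' OR opname LIKE 'SYS_IMPORT%')
   AND totalwork > 0;
```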
1. Test the connection to the remote database with tnsping.
2. Create a database link to the remote database.
3. Specify the database link in the NETWORK_LINK parameter of your expdp or impdp command.
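Put together, a network-mode import might look like this sketch (the TNS alias, link name, and credentials are assumptions):

```shell
# 1) Verify the remote service is reachable.
tnsping remdb

# 2) In SQL*Plus on the target database, create a link to the source
#    (run as a user with the CREATE DATABASE LINK privilege):
#      CREATE DATABASE LINK remdb_link
#        CONNECT TO system IDENTIFIED BY password USING 'remdb';

# 3) Import directly over the link -- no dump file is written on the source.
impdp system/password SCHEMAS=scott NETWORK_LINK=remdb_link \
      DIRECTORY=DATA_PUMP_DIR LOGFILE=net_imp.log
```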
Q10: What are some of the parameters you have used during export?
Ans:
CONTENT                Specifies data to unload; valid keywords are (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY              Directory object to be used for dump files and log files.
DUMPFILE               List of destination dump files (expdat.dmp), e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ESTIMATE_ONLY          Calculate job estimates without performing the export.
EXCLUDE                Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE               Specify the size of each dump file in units of bytes.
FLASHBACK_SCN          SCN used to set session snapshot back to.
FULL                   Export entire database (N).
HELP                   Display help messages (N).
INCLUDE                Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME               Name of export job to create.
LOGFILE                Log file name (export.log).
NETWORK_LINK           Name of remote database link to the source system.
NOLOGFILE              Do not write log file (N).
PARALLEL               Change the number of active workers for current job.
PARFILE                Specify parameter file.
QUERY                  Predicate clause used to export a subset of a table.
SCHEMAS                List of schemas to export (login schema).
TABLES                 Identifies a list of tables to export – one schema only.
TRANSPORT_TABLESPACES  List of tablespaces from which metadata will be unloaded.
VERSION                Version of objects to export; valid keywords are (COMPATIBLE), LATEST, or any valid database version.
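As a sketch of several of these parameters used together (all object and file names are examples; for complex QUERY predicates a PARFILE avoids shell quoting issues):

```shell
# Export two tables from SCOTT, with only the EMP rows matching a
# predicate, split across multiple dump files capped at 2 GB each.
expdp scott/tiger TABLES=emp,dept \
      QUERY='emp:"WHERE deptno=10"' \
      DIRECTORY=DATA_PUMP_DIR \
      DUMPFILE=scott_%U.dmp FILESIZE=2G \
      PARALLEL=2 LOGFILE=scott_tab.log
```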
Q11: You are getting an undo tablespace error during import; how will you avoid it?
Ans: With the original imp utility you can use the COMMIT=Y option, which commits after each array insert rather than after each table and so keeps undo usage small. (Data Pump impdp has no COMMIT parameter; there the usual fix is to enlarge the undo tablespace or import in smaller pieces.)
Q12: Can we import an 11g dump file into a 10g database using Data Pump?
Ans: Yes, provided the dump file was created with the VERSION parameter set to the target release (e.g. VERSION=10.2 during the 11g export). A dump file exported without VERSION cannot be imported into the lower release.
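For example (schema, directory object, and file names are placeholders):

```shell
# On the 11g source database: write a dump file readable by 10.2.
expdp system/password SCHEMAS=scott VERSION=10.2 \
      DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_10g.dmp

# On the 10g target database: import it normally.
impdp system/password SCHEMAS=scott \
      DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_10g.dmp
```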