need to speed up backup of large table by using Datapump

Subject: need to speed up backup of large table by using Datapump
Author: Jill Salalila, Philippines
Date: Dec 02, 2018, 16:17
Os info: Oracle Linux 6.7
Oracle info: 11.2.0.4 & 12.1.0.2
Message: Dear all,

I have a big table (for example, a table of 2 TB in size).
How do I increase the backup speed of a large table using Data Pump?

I know we can set the PARALLEL option.
Are there any other options besides PARALLEL?
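For reference, a minimal sketch of the PARALLEL approach I mean (the directory object dp_dir, schema APP, and file names here are hypothetical placeholders, not my actual setup):

    # each parallel worker needs its own file, hence the %U wildcard
    expdp system DIRECTORY=dp_dir \
        TABLES=app.big_table \
        DUMPFILE=big_table_%U.dmp LOGFILE=big_table.log \
        PARALLEL=8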

Thanks

Subject: Re: need to speed up backup of large table by using Datapump
Author: Bruno Vroman, Belgium
Date: Dec 04, 2018, 16:25
Score: 300 Pts
Message: Hello Jill,

(Note that 11.2.0.4 versus 12.1.0.2 can make a big difference: I have been struggling for months with Oracle Support about an issue, which should soon be categorized as a bug, that makes Data Pump jobs in 12.1 and 12.2, both expdp and impdp, extremely slow when the dump files are on NFS filesystems as opposed to local filesystems.)

Keep in mind that expdp is not really what is called a backup, especially if you export a single table (what about referential integrity with other tables, for example?).

Isn't your 2 TB table partitioned? Do you have to export the complete content each time? (I expect that you could partition it in such a way that most of the data doesn't need to be exported again.)
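For illustration only (table, partition, and directory names are hypothetical), Data Pump can export a single partition, so only the partition that still changes needs to be dumped each time:

    # export only the most recent partition of the table
    expdp system DIRECTORY=dp_dir \
        TABLES=app.big_table:p_2018q4 \
        DUMPFILE=big_table_p_2018q4.dmp LOGFILE=big_table_p_2018q4.log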

One option I often use for expdp (and for impdp, when it has not been done in the expdp session): EXCLUDE=STATISTICS.
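For example (same hypothetical names as above); skipping statistics shortens the job, and statistics can be regathered with DBMS_STATS after an import anyway:

    expdp system DIRECTORY=dp_dir \
        TABLES=app.big_table \
        EXCLUDE=STATISTICS \
        DUMPFILE=big_table_%U.dmp LOGFILE=big_table.log \
        PARALLEL=8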

Another idea: maybe your large table has its own dedicated tablespace? You could take copies of the relevant datafiles and metadata (with RMAN), in the spirit of "transportable tablespaces"...
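A rough sketch of that approach, assuming a hypothetical tablespace BIG_TS that holds only this table (all names and paths are placeholders):

    SQL> ALTER TABLESPACE big_ts READ ONLY;

    $ expdp system DIRECTORY=dp_dir \
          TRANSPORT_TABLESPACES=big_ts \
          DUMPFILE=big_ts_meta.dmp LOGFILE=big_ts_meta.log

    $ rman target /
    RMAN> BACKUP AS COPY TABLESPACE big_ts FORMAT '/backup/big_ts_%U.dbf';

    SQL> ALTER TABLESPACE big_ts READ WRITE;

The expdp step dumps only the metadata; the datafile copies carry the 2 TB of data, which is usually much faster than a row-by-row export.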

Best regards,

Bruno Vroman.