Large volume of data in inserting in our database.

Message | Score | Author | Date
Dear friends, We need to import 7 tb of data in o...... | | Prasathi Ji | Sep 07, 2018, 23:03
Hello Prasathi, big file tablespace manage...... | 400 Pts | Bruno Vroman | Sep 08, 2018, 13:46
Dear Bruno, Your review for Bigfile tablespace ...... | | Prasathi Ji | Sep 08, 2018, 19:12
In addition to what Bruno said: The biggest con...... | 100 Pts | Jan Schnackenberg | Sep 10, 2018, 08:55
And now, in an extra post to separate my feelings ...... | 200 Pts | Jan Schnackenberg | Sep 10, 2018, 08:56


Subject: Large volume of data to insert into our database.
Author: Prasathi Ji, India
Date: Sep 07, 2018, 23:03
OS info: RedHat 7
Oracle info: Oracle 12c CDB and PDB instances
Message: Dear friends,
We need to import 7 TB of data into our Oracle 12c database server, in a CDB/PDB environment. Please tell me the best practices for tablespace management.
1. Should we use a bigfile tablespace? But bigfile tablespace management is not easy.
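For reference, the two options look like this at creation time; a minimal sketch, where the tablespace names, file paths, and sizes are just placeholders:

    -- Bigfile: exactly one datafile per tablespace, which can grow
    -- very large (up to 2^32 blocks, i.e. ~32 TB with 8 KB blocks).
    CREATE BIGFILE TABLESPACE big_data
      DATAFILE '/u01/oradata/CDB1/pdb1/big_data01.dbf'
      SIZE 100G AUTOEXTEND ON NEXT 10G MAXSIZE 7T;

    -- Smallfile (the default): many datafiles per tablespace, each
    -- limited to 2^22 blocks (~32 GB with 8 KB blocks).
    CREATE SMALLFILE TABLESPACE small_data
      DATAFILE '/u01/oradata/CDB1/pdb1/small_data01.dbf'
      SIZE 20G;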

Please share your views.

Best Regards
Prasathi

Subject: Re: Large volume of data to insert into our database.
Author: Bruno Vroman, Belgium
Date: Sep 08, 2018, 13:46
Score: 400 Pts
Message: Hello Prasathi,

  "Big file tablespace management is not easy"?

Well, isn't it, on the contrary, supposed to be easier? ;-)

Personally I dislike bigfiles because I want to limit the damage in case of a file problem (I would rather lose and recover one file of 20 GB than one file of 1 TB), and to have "small" pieces to send to the backup software... But my tablespaces are not very large (the largest are in the 1 TB range and I use files of ~20 GB, so about fifty files for the largest tablespaces).
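In practice that approach just means growing the tablespace by adding fixed-size files as needed; a sketch (tablespace name and path are placeholders):

    -- Add another ~20 GB piece to a smallfile tablespace
    -- whenever it needs more room.
    ALTER TABLESPACE app_data
      ADD DATAFILE '/u01/oradata/CDB1/pdb1/app_data51.dbf'
      SIZE 20G;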

One issue with bigfile might be that you reach the OS-level limit for a single file, and hence for the whole tablespace (I suppose that your 7 TB of data to load are not all going into the same tablespace, but what is the maximum (target) size of your tablespaces?)
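If you want to check how large your existing tablespaces can grow, something like this against the standard dictionary view dba_data_files gives a quick overview (just a sketch):

    -- Current vs. maximum size per tablespace, in GB
    SELECT tablespace_name,
           ROUND(SUM(bytes) / 1024/1024/1024) AS current_gb,
           -- maxbytes is 0 when AUTOEXTEND is off, so take the
           -- larger of the current and maximum size per file
           ROUND(SUM(GREATEST(bytes, maxbytes)) / 1024/1024/1024) AS max_gb
      FROM dba_data_files
     GROUP BY tablespace_name
     ORDER BY tablespace_name;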

You can also have a look at Oracle MOS document 262472.1 "10g: BIGFILE Type Tablespaces Versus SMALLFILE Type" (hmmm, yes, "10g"), or search the web for "bigfile vs smallfile"...

Best regards,

Bruno Vroman.

Subject: Re: Large volume of data to insert into our database.
Author: Prasathi Ji, India
Date: Sep 08, 2018, 19:12
Message: Dear Bruno,

Your review of bigfile tablespace management is excellent. I will follow your guidelines and read the suggested documents.

Best Regards
Prashansh Kumar

Subject: Re: Large volume of data to insert into our database.
Author: Jan Schnackenberg, Germany
Date: Sep 10, 2018, 08:55
Score: 100 Pts
Message: In addition to what Bruno said:

The biggest concerns he mentioned were with regard to backup and recovery.

In 12c, using RMAN for backups, you can use segmented backups of datafiles (called "multisection backups"), allowing you to back up a bigfile datafile in parallel. And using block-level media recovery, you don't need to restore the complete datafile if there are corrupted blocks.
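Roughly, both features look like this in RMAN (just a sketch; the section size, datafile number, and block number are placeholders):

    # Multisection backup: split one large datafile into sections
    # that the allocated channels can back up in parallel.
    RUN {
      ALLOCATE CHANNEL c1 DEVICE TYPE DISK;
      ALLOCATE CHANNEL c2 DEVICE TYPE DISK;
      BACKUP SECTION SIZE 64G DATAFILE 5;
    }

    # Block media recovery: repair only the damaged blocks instead
    # of restoring the whole datafile.
    RECOVER DATAFILE 5 BLOCK 42;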

Both features (parallel backup and block-level media recovery) require an Enterprise Edition database but no additional-cost option.

So basically, you probably only need to worry about the maximum file size of your filesystem. By using ASM you can pretty much forget about that, too.

Regards, Jan

Subject: Re: Large volume of data to insert into our database.
Author: Jan Schnackenberg, Germany
Date: Sep 10, 2018, 08:56
Score: 200 Pts
Message: And now, in an extra post to separate my feelings from the facts, I'll tell you that I basically never use bigfile tablespaces. ;)