Inserting a large volume of data into our database

Message | Score | Author | Date
Dear friends, We need to import 7 tb of data in o...... | | Prasathi Ji | Sep 07, 2018, 23:03
Hello Prasathi, big file tablespace manage...... | Score: 400 Pts | Bruno Vroman | Sep 08, 2018, 13:46
Dear Bruno, Your review for Bigfile tablespace ...... | | Prasathi Ji | Sep 08, 2018, 19:12
In addition to what Bruno said: The biggest con...... | Score: 100 Pts | Jan Schnackenberg | Sep 10, 2018, 08:55
And now, in an extra post to separate my feelings ...... | Score: 200 Pts | Jan Schnackenberg | Sep 10, 2018, 08:56



Subject: Inserting a large volume of data into our database
Author: Prasathi Ji, India
Date: Sep 07, 2018, 23:03
Os info: Redhat7
Oracle info: Oracle 12c cdb and Pdb instances
Message: Dear friends,
We need to import 7 TB of data into our Oracle 12c DB server, in a CDB and PDB environment. Please tell me the best practices for tablespace management.
1. Should we use bigfile tablespaces? But bigfile tablespace management is not easy.
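
For concreteness, the two options we are weighing look like this (a minimal sketch; tablespace names, file paths, and sizes are placeholders, not our real setup):

    -- bigfile: the tablespace is a single (very large) datafile
    CREATE BIGFILE TABLESPACE data_big
      DATAFILE '/u01/oradata/PDB1/data_big01.dbf'
      SIZE 1T AUTOEXTEND ON NEXT 50G MAXSIZE 7T;

    -- smallfile (the default): up to 1022 datafiles per tablespace
    CREATE SMALLFILE TABLESPACE data_small
      DATAFILE '/u01/oradata/PDB1/data_small01.dbf'
      SIZE 20G AUTOEXTEND ON NEXT 1G MAXSIZE 30G;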

Please share your views.

Best Regards
Prasathi

Subject: Re: Inserting a large volume of data into our database
Author: Bruno Vroman, Belgium
Date: Sep 08, 2018, 13:46
Score: 400 Pts
Message: Hello Prasathi,

  "big file tablespace management is not easy"?

Well, isn't it, on the contrary, supposed to be easier? ;-)

Personally, I dislike bigfiles because I want to limit the damage in case of a file problem (I would rather lose and recover one 20 GB file than one 1 TB file), and to have "small" pieces to send to the backup software... But my tablespaces are not very large (the largest are in the 1 TB range and I use files of ~20 GB, so about fifty files for the largest tablespaces).
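
For illustration, that layout looks something like this (a sketch only; names, paths, and sizes are made up):

    -- start with one ~20 GB file...
    CREATE TABLESPACE big_data
      DATAFILE '/u01/oradata/PROD/big_data01.dbf' SIZE 20G;

    -- ...and add more ~20 GB files as the tablespace fills up,
    -- up to roughly fifty files for a ~1 TB tablespace
    ALTER TABLESPACE big_data
      ADD DATAFILE '/u01/oradata/PROD/big_data02.dbf' SIZE 20G;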

One issue with bigfile might be that you reach the OS-level limit for a single file, and hence for the whole tablespace (I suppose that your 7 TB of data to load are not all going into the same tablespace, but what is the maximum (target) size of your tablespaces?)
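
A quick way to see where you stand today (standard dictionary views; nothing here is specific to your setup):

    -- current size, autoextend ceiling, and file count per tablespace
    SELECT tablespace_name,
           ROUND(SUM(bytes)    / 1024 / 1024 / 1024) AS current_gb,
           ROUND(SUM(maxbytes) / 1024 / 1024 / 1024) AS max_gb,
           COUNT(*)                                  AS datafiles
      FROM dba_data_files
     GROUP BY tablespace_name
     ORDER BY current_gb DESC;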

You can also have a look at Oracle MOS document 262472.1, "10g: BIGFILE Type Tablespaces Versus SMALLFILE Type" (hmmm, yes, "10g"), or search the web for "bigfile vs smallfile"...

Best regards,

Bruno Vroman.

Subject: Re: Inserting a large volume of data into our database
Author: Prasathi Ji, India
Date: Sep 08, 2018, 19:12
Message: Dear Bruno,

Your review of bigfile tablespace management is excellent. I will follow your guidelines and read the suggested documents.

Best Regards
Prashansh Kumar

Subject: Re: Inserting a large volume of data into our database
Author: Jan Schnackenberg, Germany
Date: Sep 10, 2018, 08:55
Score: 100 Pts
Message: In addition to what Bruno said:

The biggest concerns he mentioned were in regard to backup and recovery.

In 12c, using RMAN for backups, you can use segmented backups of datafiles (called "multisection backups"), allowing you to back up a bigfile datafile in parallel. And with block-level media recovery you don't need to restore the complete datafile if there are corrupted blocks.

Both features (parallel backup and block-level media recovery) require an Enterprise Edition database but no additional-cost option.
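
A sketch of both in RMAN (the datafile number, block number, and section size are placeholders):

    RMAN> CONFIGURE DEVICE TYPE DISK PARALLELISM 4;  -- several channels share the work

    -- multisection backup: the bigfile datafile is split into 64 GB
    -- sections that the channels back up in parallel
    RMAN> BACKUP SECTION SIZE 64G DATAFILE 12;

    -- block-level media recovery: repair only the corrupt blocks
    -- instead of restoring the whole datafile
    RMAN> RECOVER DATAFILE 12 BLOCK 42;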

So basically, you probably only need to worry about the maximum file size on your filesystem. And by using ASM you can pretty much forget about that, too.

Regards, Jan

Subject: Re: Inserting a large volume of data into our database
Author: Jan Schnackenberg, Germany
Date: Sep 10, 2018, 08:56
Score: 200 Pts
Message: And now, in an extra post to separate my feelings from the facts, I'll tell you that I basically never use bigfile tablespaces. ;)