
Cumulative SUM of Data




Subject: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 07, 2020, 14:10, 21 days ago
Os info: Windows
Oracle info: 11g
Message: Hi DBAs,

Below is my requirement :
I need a cumulative addition result. For example, there is one record of txn A dated 01-Nov-2020 and no record on 02-Nov-2020; even so, I need a row for 02-Nov-2020 in my output.

Example :
Tnx Date Amount
01-Nov-2020 50

Expected Output:
----------------
Tnx Date Amount
01-Nov-2020 50
02-Nov-2020 100 (50+50) (01-Nov + 02-Nov)
03-Nov-2020 150 (50+50+50) (01-Nov + 02-Nov + 03-Nov)
and So on......

I wrote the query below, but it gives me only the 1st record, and on the second record it gives me the sum of the 1st and 2nd records.

select trunc(record_date), round(sum(pass_price/100/30) over (order by record_date),2) as Pass
from op_accounting_unit
where operation_type_id in (1,3)
and pass_id=63
and record_date>=to_date('02/01/2020 01:10:00','MM/DD/YYYY HH24:MI:SS')
and record_date<=to_date('04/01/2020 01:10:00','MM/DD/YYYY HH24:MI:SS')
and sw_serial_number=1232
order by trunc(record_date);

Output:
Trunc(Record_date) PASS
01.02.2020 00:00:00 26.67
02.03.2020 00:00:00 53.33 (26.67+ 26.67)


Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 08, 2020, 17:26, 20 days ago
Score: 200 Pts
Message: Hello Rahul,

you need to clarify your requirement a bit for the case where there are multiple rows for a given day (example: 01-NOV-2020 at 13:30, 01-NOV-2020 at 16:00, 03-NOV-2020 at 08:45).
From what I see, my guess is that you want one line per day even if there is more than 1 corresponding row...

Your "main question" is about the "missing lines", like 02-NOV-2020 in your example. This comes from the fact that you query op_accounting_unit and there are no rows on 02-NOV-2020. To see a line anyway, the "trick" is to generate the list of dates you want and use it as the basis for the SELECT, with an OUTER join to op_accounting_unit, so that if rows exist they are reported, and if no row exists the date is listed anyway.
(Remark: beware of the date formats... You mention things about November 01 to 04, but your example lists 01-FEB-2020 to 01-APR-2020, and even if I assume a typo and that you meant '02/11/2020 01:10' and '04/11/2020 01:10', this would be 11-FEB to 11-APR, not 02-NOV to 04-NOV...)

Here are two examples to generate "the dates that you want": make your choice or take this as a basis to derive your own set of dates.
WITH mydates AS
( SELECT DATE '2020-11-01' + LEVEL - 1 day
FROM dual
CONNECT BY DATE '2020-11-01' + LEVEL - 1 <= DATE '2020-11-06'
)
this will list the dates from 01-NOV-2020 to 06-NOV-2020.
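(note: this is only the WITH part; to run it standalone, append a final query, for example:
SELECT m.day FROM mydates m ORDER BY m.day;
which should return the 6 rows 01-NOV-2020 through 06-NOV-2020)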
WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini
, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.operation_type_id IN ( 1, 3 )
AND o.pass_id = 63
AND o.record_date >= TO_DATE( '11/02/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date <= TO_DATE( '11/04/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.sw_serial_number = 1232
)
, mydates AS
( SELECT mm.mini + LEVEL - 1 day FROM mini_maxi mm CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi )
this will list all the relevant dates for the query.

Then I will assume that you want one line of output per day, even if there are several rows for a given day.
(in my example I add rows that we don't want to select, on purpose, to show how it works)
(for simplicity I will keep only 2 columns in the table)

(to be continued...)

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 08, 2020, 17:28, 20 days ago
Score: 700 Pts
Message: (... continued)
CREATE TABLE op_accounting_unit( record_date DATE, pass_price NUMBER( 9, 0 ) );

INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202008080808', 'YYYYMMDDHH24MI' ), 147000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202009090909', 'YYYYMMDDHH24MI' ), 150000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011011330', 'YYYYMMDDHH24MI' ), 153000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011011600', 'YYYYMMDDHH24MI' ), 156000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011030845', 'YYYYMMDDHH24MI' ), 159000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011040000', 'YYYYMMDDHH24MI' ), 162000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011050101', 'YYYYMMDDHH24MI' ), 165000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011080202', 'YYYYMMDDHH24MI' ), 168000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011080303', 'YYYYMMDDHH24MI' ), 171000 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202012131415', 'YYYYMMDDHH24MI' ), 174000 );
COMMIT;

WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini
, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.record_date >= TO_DATE( '10/30/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date < TO_DATE( '11/09/2020', 'MM/DD/YYYY' )
)
, mydates AS
( SELECT mm.mini + LEVEL - 1 day
FROM mini_maxi mm
CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi
)
, grouped AS
( SELECT TRUNC( o.record_date ) d, ROUND( SUM( pass_price ) / 100 / 30 ) amount
FROM op_accounting_unit o
, mini_maxi mm
WHERE o.record_date >= mm.mini
AND o.record_date < mm.maxi + 1
GROUP BY TRUNC( o.record_date )
)
SELECT m.day
, NVL( g.amount, 0 ) this_day
, SUM( g.amount ) OVER ( ORDER BY m.day RANGE UNBOUNDED PRECEDING ) tot
FROM mydates m
LEFT OUTER JOIN grouped g
ON m.day = g.d
ORDER BY 1
;
DAY THIS_DAY TOT
------------- ---------- ----------
Sun 01-NOV-20 103 103
Mon 02-NOV-20 0 103
Tue 03-NOV-20 53 156
Wed 04-NOV-20 54 210
Thu 05-NOV-20 55 265
Fri 06-NOV-20 0 265
Sat 07-NOV-20 0 265
Sun 08-NOV-20 113 378
8 rows selected.
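(side note: with an ORDER BY present, Oracle's analytic functions default to RANGE UNBOUNDED PRECEDING, so the shorter SUM( g.amount ) OVER ( ORDER BY m.day ) would give the same running total here)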
HTH,

Bruno Vroman

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 08, 2020, 18:13, 20 days ago
Message: Thanks Bruno, I will check and update you. Also, in future I will try to be clearer about my requirement.

Subject: Re: Cumulative SUM of Data
Author: Michel Cadot, France
Date: Nov 08, 2020, 19:15, 20 days ago
Message:
... and provide a test case: CREATE TABLE and INSERT statements for some data, and the result you want for them. This spares those who want to help the effort and time of building it themselves.
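For example, something as small as this (table and values are only placeholders):
CREATE TABLE t ( record_date DATE, amount NUMBER );
INSERT INTO t VALUES ( DATE '2020-11-01', 50 );
plus the exact output you expect for these rows.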

Regards
Michel

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 09, 2020, 05:20, 20 days ago
Message: Hi Bruno

I am dividing the Pass_price amount over 30 days; for every row this amount should be added instead of 0 (the previous days' amounts should be added in, including the current day's amount).
In your sample data, the THIS_DAY column on 02-Nov-20 should have 103 instead of 0, so that it can be added into TOT.

Your sample:

DAY THIS_DAY TOT
------------- ---------- ----------
Sun 01-NOV-20 103 103
Mon 02-NOV-20 0 103

Expected sample:

DAY THIS_DAY TOT
------------- ---------- ----------
Sun 01-NOV-20 103 103
Mon 02-NOV-20 103 206 (103+103)
Tue 03-NOV-20 103 309 (103+103+103)


Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 10:22, 19 days ago
Message: Hi Rahul,

oh, I had divided "pass_price" by 100 and by 30 to give the "amount" just to do "like in your select", but without knowing the reason... You give more or less a reason for the "/30", but the "/100" is still arbitrary (maybe pass_price is in cents...)

It seems now that you want to "smooth" pass_price over 30 days, or over one month? But once again the requirement needs clarification.
Please provide some info about:
- you divide by 30 because 30 is a constant that you chose, or because it is the number of days in November 2020? (in the latter case, must rows of December 2020 be divided by 31, and rows of February 2020 by 29?)
- if a row has record_date 09-NOV-2020 14:15:16 and a pass_price of 1000, do we have to consider slices of 1000 / 100 /30 on 09-NOV, on 10-NOV, ..., on 30-NOV and stop, or continue on 01-DEC, 02-DEC, ..., 08-DEC-2020 (to reach 30 pieces)?
- has the time portion any importance? (well, in your select you state "rows between a_given_day_at_01:10:00 and another_given_day_at_01:10:00" but this looks weird; why not "TRUNC( all the dates )"?)
- any other clarification that you might think of. You might for example give the output that you want if all the input is:
 29-OCT-2020 20:20 147000
 02-NOV-2020 00:30 150000
 02-NOV-2020 15:15 153000
 04-NOV-2020 16:16 156000
 05-NOV-2020 17:17 159000
 08-NOV-2020 18:18 162000
 01-DEC-2020 19:19 165000

Once I better understand I can make a try.

Best regards,

Bruno

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 09, 2020, 10:39, 19 days ago
Message: Hi Bruno,

Below are the answers to your queries:

1. you divide by 30 because 30 is a constant that you chose, or because it is the number of days in November 2020? (in the latter case, must rows of December 2020 be divided by 31, and rows of February 2020 by 29?)

Ans : I am taking 30 constant for a month.

2. if a row has record_date 09-NOV-2020 14:15:16 and a pass_price of 1000, do we have to consider slices of 1000 / 100 /30 on 09-NOV, on 10-NOV, ..., on 30-NOV and stop, or continue on 01-DEC, 02-DEC, ..., 08-DEC-2020 (to reach 30 pieces)?

Ans: Time is not required in this case but the date is important. There is a product of 3000 bucks and I need to divide it equally over each day of the month (in my case 30 days), so 100 bucks daily, with the condition that every day 100 bucks is added on top of the previous days.

Example: even if my table has no data for a day, the query output should still show me 100 bucks. Sample output:

Date Amount Total
01-Nov-20 100 100
02-Nov-20 100 200 (100+100)
03-Nov-20 100 300 (100+100+100)
and so on...

3. has the time portion any importance? (well, in your select you state "rows between a_given_day_at_01:10:00 and another_given_day_at_01:10:00" but this looks weird; why not "TRUNC( all the dates )"?)

Ans: No, the time is not important, only the date; I can truncate the times.



Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 11:18, 19 days ago
Score: 200 Pts
Message: Hello,

answer 1 is ambiguous: "I am taking 30 constant for a month". I understand that you always take 30; the month can be November, December, February or anything else, the number is 30.

answer 2 is missing or not clear, I have to guess again:
if we have 3000 for a row of 09-NOV, we will split it in slices:
100 counted for every day between 01-NOV and 30-NOV, then the totals shown will be:
01-NOV: 100
02-NOV: 200
and so on until
30-NOV: 3000
(with this rule, even the day is not so important, only the month is: if we have 3000 on 09-NOV and 6000 on 17-NOV, we will have 300 on 01-NOV (100 from the row "09-NOV" and 200 from the row "17-NOV"), 600 on 02-NOV, ..., 9000 on 30-NOV. In other words: every day of Nov will have the same "delta" amount.)

answer 3 is OK: time has no importance (but then why were you querying with "at 01:10" in the original post?)

answer 4 is missing: output for the given input, other clarifications (for example if there are rows of several months... Do you always ask the report for a given month? What if you ask for February 2020 (still divide by 30?)...)

(for example with the input I have given, maybe your wish is:
-a- we ignore the rows of October and December
-b- so we have to consider only the rows of November: 150K, 153K, 156K, 159K, 162K.
-c- hence: the total is 780000, split into 30 slices of 260 (we also divide by 100), and we have:
01-NOV-2020 260
02-NOV-2020 520
03-NOV-2020 780
04-NOV-2020 1040
...
30-NOV-2020 7800
(we see that for any row of November, the effect on the final output is independent of the "day_of_month"; for example an amount of 100000 on any day of November will add 33 on line 01-NOV, 66 on line 02-NOV, 99 on line 03-NOV, and so on until 990 on 30-NOV; the missing 10 are due to rounding)

Please provide the missing answers or I will work "for nothing".

Best regards,

Bruno

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 09, 2020, 12:24, 19 days ago
Message: Hi Bruno,

Sorry for the inconvenience.


a. if a row has record_date 09-NOV-2020 14:15:16 and a pass_price of 1000, do we have to consider slices of 1000 / 100 /30 on 09-NOV, on 10-NOV, ..., on 30-NOV and stop, or continue on 01-DEC, 02-DEC, ..., 08-DEC-2020 (to reach 30 pieces)?

Ans: Yes, it should go on until 30 days are reached, no matter the month.

b. To make it simpler, can we do one thing where I only break down the rows into equal pieces divided by 30, without adding subsequent rows?

For Example

Day Amount Total
01-Nov-20 27 27
02-Nov-20 0 27
03-Nov-20 0 27

till
30-Nov-20 0 27

01-Dec-20 27 27
02-Dec-20 0 27
03-Dec-20 0 27

till
30-Dec-20 0 27

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 13:35, 19 days ago
Score: 200 Pts
Message: Hello Rahul,

sorry, I am having a lot of difficulty understanding you. So I will make one last guess, and after this I am afraid you will have to figure it out by yourself.
My understanding:
-a- we are interested in a subset of the table:
  rows with record_date between two given days (example: 02-NOV-2020 to 10-NOV-2020 included)
  (and other conditions on type_id, pass_id, serial_number... but we can ignore these for the clarity of the test)
  (and for simplicity I will now forget the mysterious "divide also by 100")
-b- each of the relevant rows contains info like, for example, [ record_date: 04-NOV-2020 13:14:15, pass_price: 1500 ]
  each of these rows has to be spread over 30 days, from its initial day to the day 29 days later (for example: the row of 04-NOV will be split into 1500/30 = 50 on 04-NOV, 50 on 05-NOV, ..., 50 on 03-DEC)
-c- and we want to present a cumulative total: for the example, 50 on 04-NOV, but 100 on 05-NOV, 150 on 06-NOV, ... until 1500 on 03-DEC
-d- if we have many rows, we just sum the various values.
(-e- to show a consistent result set, it has to extend up to 29 days after the day of the last relevant row)


In other words: for one row to consider, n days after the record_date (n varying from 0 to 29), we have a cumulated contribution of (n+1)/30 of the amount.
With the example: the row [04-NOV 1500] will contribute, on 04-NOV + 10 days (just an example) = 14-NOV, 1500*11/30, i.e. 550.

I don't know if you have millions of rows to treat, but I will work in a "didactic way" that might be far from optimal from a performance point of view (the goal is to see clearly what we do):

first I select the relevant subset of dates: get the mini and maxi dates and generate the list of dates between mini and maxi+29 (note that we go 29 days beyond the last "record_date" so that the last record is fully taken into account)

then I compute for each relevant row (like 04-NOV 1500) the initial day, the last destination day, and the daily contribution

then for each day to consider, I sum the daily contributions weighted by the correct number of days
(example: on 14-NOV, the row (04-NOV 1500) will contribute 11*50)

(last remark: I "round" a bit differently to decrease rounding errors)
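(to illustrate with a pass_price of 1000: rounding each slice first gives ROUND( 1000 / 30 ) = 33, hence 30 * 33 = 990 on the last day, while rounding the cumulated product, ROUND( 1000 / 30 * 30 ) = 1000, loses nothing)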

DROP TABLE op_accounting_unit;
CREATE TABLE op_accounting_unit( record_date DATE, pass_price NUMBER( 9, 0 ) );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202008080808', 'YYYYMMDDHH24MI' ), 1470 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202009090909', 'YYYYMMDDHH24MI' ), 1500 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011011330', 'YYYYMMDDHH24MI' ), 60 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011030900', 'YYYYMMDDHH24MI' ), 15 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011031030', 'YYYYMMDDHH24MI' ), 15 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202011090000', 'YYYYMMDDHH24MI' ), 1500 );
INSERT INTO op_accounting_unit VALUES ( TO_DATE( '202012251000', 'YYYYMMDDHH24MI' ), 30 );
COMMIT;
(to be continued...)

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 13:38, 19 days ago
Score: 200 Pts
Message: (...continued)
Let's say that we want the rows with record_date in [ 01-NOV-2020 01:10:00, 10-NOV-2020 01:09:59 ]:
WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini
, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.record_date >= TO_DATE( '11/01/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date <= TO_DATE( '11/10/2020 01:09:59', 'MM/DD/YYYY HH24:MI:SS' )
)
, mydates AS
( SELECT mm.mini + LEVEL - 1 day
FROM mini_maxi mm
CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi + 29
)
, relevant_slices AS
( SELECT TRUNC( o.record_date ) d1
, TRUNC( o.record_date ) + 29 d2
, SUM( pass_price ) / 30 slice_not_rounded
FROM op_accounting_unit o
, mini_maxi mm
WHERE o.record_date >= mm.mini
AND o.record_date < mm.maxi + 1
GROUP BY TRUNC( o.record_date )
)
SELECT m.day
, NVL( ROUND( SUM( r.slice_not_rounded
* ( m.day - r.d1 + 1 )
)
)
, 0
) amount
FROM mydates m
LEFT OUTER JOIN relevant_slices r
ON m.day BETWEEN r.d1 AND r.d2
GROUP BY m.day
ORDER BY 1
;
DAY AMOUNT
--------- ----------
01-NOV-20 2
02-NOV-20 4
03-NOV-20 7
04-NOV-20 10
05-NOV-20 13
06-NOV-20 16
07-NOV-20 19
08-NOV-20 22
09-NOV-20 75
10-NOV-20 128
11-NOV-20 181
12-NOV-20 234
13-NOV-20 287
14-NOV-20 340
15-NOV-20 393
16-NOV-20 446
17-NOV-20 499
18-NOV-20 552
19-NOV-20 605
20-NOV-20 658
21-NOV-20 711
22-NOV-20 764
23-NOV-20 817
24-NOV-20 870
25-NOV-20 923
26-NOV-20 976
27-NOV-20 1029
28-NOV-20 1082
29-NOV-20 1135
30-NOV-20 1188
01-DEC-20 1179
02-DEC-20 1230
03-DEC-20 1250
04-DEC-20 1300
05-DEC-20 1350
06-DEC-20 1400
07-DEC-20 1450
08-DEC-20 1500
38 rows selected.
Remark: to see the effect better, change the end date (in "mini_maxi") to 31-DEC-2020, for example, and rerun the query...
WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini
, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.record_date >= TO_DATE( '11/01/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date <= TO_DATE( '12/31/2020 23:59:59', 'MM/DD/YYYY HH24:MI:SS' )
and so on...
HTH,

Bruno

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 13:46, 19 days ago
Score: 200 Pts
Message: (explanations of first rows of output)

we have initial data: 60 on 01-NOV, and 15 on 03-NOV twice, so:

01-NOV: we have the initial slice of 60/30 for the first row

02-NOV: we have two slices of the same: 60/30 * 2

03-NOV: we have three slices of the same, 60/30 * 3, plus the first slice due to the two contributions of 03-NOV, (15+15)/30, for a total of 6 + 1 = 7

04-NOV: we have four slices of 60/30 and two slices of (15+15)/30, so 8 + 2 = 10

and so on.

Best regards,

Bruno.

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 09, 2020, 15:16, 19 days ago
Message: Hi Bruno,

Thanks for the great help; this is exactly what I needed. You are great and very helpful.

A little more help, from a learning point of view: if I do not want the subsequent addition, then what would be the strategy? Like below:

b. To make it simple, can we do one thing where I only break down the rows into equal slices, without adding previous row data?


For Example

Day Amount Total
01-Nov-20 27 27
02-Nov-20 0 27
03-Nov-20 0 27

till
30-Nov-20 0 27
------------------------
01-Dec-20 27 27
02-Dec-20 0 27
03-Dec-20 0 27

till
30-Dec-20 0 27

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 09, 2020, 16:42, 19 days ago
Score: 600 Pts
Message: Hi Rahul,

in the "main select" of the previous case, the amount of each slice is multiplied by the number of days (to have one slice on day 1, two slices on day 2, ... thirty on last day); this is done via

, NVL( ROUND( SUM( r.slice_not_rounded
* ( m.day - r.d1 + 1 )
)
)
, 0
) amount
where you can see that r.slice_not_rounded is the value of one slice, and ( m.day - r.d1 + 1 ) is the number of days, or equivalently the number of slices, to consider for the day...
Consequently, if we simplify the query by keeping only ONE slice whatever the day, we have what you want:

, NVL( ROUND( SUM( r.slice_not_rounded ) ), 0 ) amount
(try to see if this can be rewritten in a "cheaper" way, but it should do the job)
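For instance, keeping the mini_maxi, mydates and relevant_slices parts unchanged, the main SELECT becomes (a sketch, not tested against your real data):
SELECT m.day
, NVL( ROUND( SUM( r.slice_not_rounded ) ), 0 ) amount
FROM mydates m
LEFT OUTER JOIN relevant_slices r
ON m.day BETWEEN r.d1 AND r.d2
GROUP BY m.day
ORDER BY 1
;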

Best regards,

Bruno

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 11, 2020, 06:53, 18 days ago
Message: Hi Bruno,

thanks Bruno for your help. I have modified the query, and below are a few small issues:



WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.record_date >= TO_DATE( '01/05/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date <= TO_DATE( '03/21/2020 01:09:59', 'MM/DD/YYYY HH24:MI:SS' )
)
, mydates AS
( SELECT mm.mini + LEVEL - 1 day
FROM mini_maxi mm
CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi + 29
)
, relevant_slices AS
( SELECT TRUNC( o.record_date ) d1, TRUNC( o.record_date ) + 29 d2,
ROUND( SUM( 85000 ) / 100 / 30 ) AS P_800,
ROUND( SUM( 112500 ) / 100 / 30 ) AS P_1125,
ROUND( SUM( 140000 ) / 100 / 30 ) AS P_1400
FROM op_accounting_unit o, mini_maxi mm
WHERE o.record_date >= mm.mini
AND o.record_date < mm.maxi + 1
AND o.pass_id = 63
AND o.sw_serial_number = 5452143
GROUP BY TRUNC( o.record_date )
)
SELECT m.day,
NVL( ROUND( SUM( r.P_800 ) ), 0 ) P_800,
NVL( ROUND( SUM( r.P_1125 ) ), 0 ) P_1125,
NVL( ROUND( SUM( r.P_1400 ) ), 0 ) P_1400
FROM mydates m
LEFT OUTER JOIN relevant_slices r
ON m.day BETWEEN r.d1 AND r.d2
GROUP BY m.day
ORDER BY 1

Output :

DAY P_800 P_1125 P_1400
29.01.2020 00:00:00 0 0 0
30.01.2020 00:00:00 0 0 0
31.01.2020 00:00:00 0 0 0
01.02.2020 00:00:00 28 38 47
02.02.2020 00:00:00 28 38 47
03.02.2020 00:00:00 28 38 47
04.02.2020 00:00:00 28 38 47
05.02.2020 00:00:00 28 38 47
06.02.2020 00:00:00 28 38 47
07.02.2020 00:00:00 28 38 47
08.02.2020 00:00:00 28 38 47
09.02.2020 00:00:00 28 38 47
10.02.2020 00:00:00 28 38 47
11.02.2020 00:00:00 28 38 47
12.02.2020 00:00:00 28 38 47
13.02.2020 00:00:00 28 38 47

Issues:
1. P_1125 and P_1400 have no records in the given day range; why are they appearing?

2. If I use CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi + 30 instead of CONNECT BY mm.mini + LEVEL - 1 <= mm.maxi + 29, then on the next record for that sw_serial_number the amounts double for that day. Why?

Output with mm.maxi + 30 (see day 11.02.2020 00:00:00):

DAY P_800 P_1125 P_1400
---- (earlier rows omitted)
29.01.2020 00:00:00 0 0 0
30.01.2020 00:00:00 0 0 0
31.01.2020 00:00:00 0 0 0
01.02.2020 00:00:00 28 38 47
02.02.2020 00:00:00 28 38 47
03.02.2020 00:00:00 28 38 47
04.02.2020 00:00:00 28 38 47
05.02.2020 00:00:00 28 38 47
06.02.2020 00:00:00 28 38 47
07.02.2020 00:00:00 28 38 47
08.02.2020 00:00:00 28 38 47
09.02.2020 00:00:00 28 38 47
10.02.2020 00:00:00 28 38 47
11.02.2020 00:00:00 56 76 94
12.02.2020 00:00:00 28 38 47
13.02.2020 00:00:00 28 38 47

---- (later rows omitted)


Expected Output :

DAY P_800 P_1125 P_1400
29.01.2020 00:00:00 0 0 0
30.01.2020 00:00:00 0 0 0
31.01.2020 00:00:00 0 0 0
01.02.2020 00:00:00 28 0 0
02.02.2020 00:00:00 28 0 0
03.02.2020 00:00:00 28 0 0
04.02.2020 00:00:00 28 0 0
05.02.2020 00:00:00 28 0 0
06.02.2020 00:00:00 28 0 0
07.02.2020 00:00:00 28 0 0
08.02.2020 00:00:00 28 0 0
09.02.2020 00:00:00 28 0 0
10.02.2020 00:00:00 28 0 0
11.02.2020 00:00:00 56 0 0
12.02.2020 00:00:00 28 0 0
13.02.2020 00:00:00 28 0 0

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 11, 2020, 06:54, 18 days ago
Message: Expected Output :

DAY P_800 P_1125 P_1400
29.01.2020 00:00:00 0 0 0
30.01.2020 00:00:00 0 0 0
31.01.2020 00:00:00 0 0 0
01.02.2020 00:00:00 28 0 0
02.02.2020 00:00:00 28 0 0
03.02.2020 00:00:00 28 0 0
04.02.2020 00:00:00 28 0 0
05.02.2020 00:00:00 28 0 0
06.02.2020 00:00:00 28 0 0
07.02.2020 00:00:00 28 0 0
08.02.2020 00:00:00 28 0 0
09.02.2020 00:00:00 28 0 0
10.02.2020 00:00:00 28 0 0
11.02.2020 00:00:00 28 0 0
12.02.2020 00:00:00 28 0 0
13.02.2020 00:00:00 28 0 0

Subject: Re: Cumulative SUM of Data
Author: Bruno Vroman, Belgium
Date: Nov 11, 2020, 13:50, 17 days ago
Message: Hi Rahul,

sorry, I don't understand your questions.
Note a few things:
a) you select constants, whatever the values in op_accounting_unit are:
  ROUND( SUM( 85000 ) / 100 /30 ) as P_800,
  ROUND( SUM( 112500 ) / 100 /30 ) as P_1125,
  ROUND( SUM( 140000 ) / 100 /30 ) as P_1400
(BTW the 1st one might be called P_850 instead of P_800)
(and the values for each day will always be the same: 28 38 47 if there is one row for a given date, 57 75 93 if there are two, and so on)

b) to select mini and maxi you should also consider the conditions on pass_id and sw_serial_number (as you have them for the "relevant_slices"):
  WHERE ... AND o.pass_id = 63 AND o.sw_serial_number = 5452143
The fact that you don't is what causes the extra dates (see the sketch below)

c) as P_800, P_1125, P_1400 are selected as constants, I cannot understand how you can get 28 38 47 one time (well, that one I understand) and 28 0 0 another time...
but we don't have your data, so we cannot run the same query as you.

d) remarks about "records hidden" and "30 instead of 29": no idea what you mean.
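(for point b, a sketch of the corrected first subquery, reusing the literals from your post:
WITH mini_maxi AS
( SELECT TRUNC( MIN( o.record_date ) ) mini
, TRUNC( MAX( o.record_date ) ) maxi
FROM op_accounting_unit o
WHERE o.record_date >= TO_DATE( '01/05/2020 01:10', 'MM/DD/YYYY HH24:MI' )
AND o.record_date <= TO_DATE( '03/21/2020 01:09:59', 'MM/DD/YYYY HH24:MI:SS' )
AND o.pass_id = 63
AND o.sw_serial_number = 5452143
)
and the rest of the query unchanged)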

Best regards,

Bruno

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 11, 2020, 15:17, 17 days ago
Message: Hi Bruno,

Below are the answers to your queries:

Ques: (BTW the 1st one might be called P_850 instead of P_800)

--> It's a typo; it should be P_800. It's an alias.

Ques: and the values for each day will always be the same: 28 38 47 if there is one row for a given date, 57 75 93 if there are two, and so on

--> There is only ever 1 record per day w.r.t. operation_type_id 1 and 3 and sw_serial_number.

--> There might be thousands of records per day with different sw_serial_number values.

--> I have used the constants because I want to display each product's utilization separately over the course of 30 days, and I am assuming that the Pass_price (constant) utilization would be Pass_price/100/30 each day.

Ques: as P_800, P_1125, P_1400 are selected as constants, I cannot understand how you can get 28 38 47 one time (well, that one I understand) and 28 0 0 another time... but we don't have your data, so we cannot run the same query as you.

--> Each product is either P_800 or P_1125 or P_1400 at one time; I don't know why this is happening.

Ques: remarks about "records hidden" and "30 instead of 29": no idea what you mean.

Ans: I can't paste all the rows; that's why I wrote "hidden".

I have written one query whose output I am comparing with the raw-data sum, and it gives me the correct output.

Please review and understand the logic.

Subject: Re: Cumulative SUM of Data
Author: rahul singh, India
Date: Nov 11, 2020, 15:17, 17 days ago
Message: (continued...)

SELECT d0, a0 D0
FROM (

select Trunc(record_date) d0,Round(Sum(PASS_PRICE/100/30),2) AS a0 ,
Trunc(record_date+1) d1, Round(Sum(PASS_PRICE/100/30),2) AS a1 ,
Trunc(record_date+2) d2, Round(Sum(PASS_PRICE/100/30),2) AS a2 ,
Trunc(record_date+3)d3, Round(Sum(PASS_PRICE/100/30),2) AS a3 ,
Trunc(record_date+4)d4, Round(Sum(PASS_PRICE/100/30),2) AS a4 ,
Trunc(record_date+5)d5, Round(Sum(PASS_PRICE/100/30),2) AS a5 ,
Trunc(record_date+6)d6, Round(Sum(PASS_PRICE/100/30),2) AS a6 ,
Trunc(record_date+7)d7, Round(Sum(PASS_PRICE/100/30),2) AS a7 ,
Trunc(record_date+8)d8, Round(Sum(PASS_PRICE/100/30),2) AS a8 ,
Trunc(record_date+9)d9, Round(Sum(PASS_PRICE/100/30),2) AS a9 ,
Trunc(record_date+10)d10, Round(Sum(PASS_PRICE/100/30),2) AS a10 ,
Trunc(record_date+11)d11, Round(Sum(PASS_PRICE/100/30),2) AS a11 ,
Trunc(record_date+12)d12, Round(Sum(PASS_PRICE/100/30),2) AS a12 ,
Trunc(record_date+13)d13, Round(Sum(PASS_PRICE/100/30),2) AS a13 ,
Trunc(record_date+14)d14, Round(Sum(PASS_PRICE/100/30),2) AS a14 ,
Trunc(record_date+15)d15, Round(Sum(PASS_PRICE/100/30),2) AS a15 ,
Trunc(record_date+16)d16, Round(Sum(PASS_PRICE/100/30),2) AS a16 ,
Trunc(record_date+17)d17, Round(Sum(PASS_PRICE/100/30),2) AS a17 ,
Trunc(record_date+18)d18, Round(Sum(PASS_PRICE/100/30),2) AS a18 ,
Trunc(record_date+19)d19, Round(Sum(PASS_PRICE/100/30),2) AS a19 ,
Trunc(record_date+20)d20, Round(Sum(PASS_PRICE/100/30),2) AS a20 ,
Trunc(record_date+21)d21, Round(Sum(PASS_PRICE/100/30),2) AS a21 ,
Trunc(record_date+22)d22, Round(Sum(PASS_PRICE/100/30),2) AS a22 ,
Trunc(record_date+23)d23, Round(Sum(PASS_PRICE/100/30),2) AS a23 ,
Trunc(record_date+24)d24, Round(Sum(PASS_PRICE/100/30),2) AS a24 ,
Trunc(record_date+25)d25, Round(Sum(PASS_PRICE/100/30),2) AS a25 ,
Trunc(record_date+26)d26, Round(Sum(PASS_PRICE/100/30),2) AS a26 ,
Trunc(record_date+27)d27, Round(Sum(PASS_PRICE/100/30),2) AS a27 ,
Trunc(record_date+28)d28, Round(Sum(PASS_PRICE/100/30),2) AS a28 ,
Trunc(record_date+29)d29, Round(Sum(PASS_PRICE/100/30),2) AS a29 ,
Trunc(record_date+30)d30, Round(Sum(PASS_PRICE/100/30),2) AS a30

from op_accounting_unit where operation_type_id IN (1,3) AND PASS_ID=63
AND record_date >= TO_DATE( '01/22/2020 01:10:00','MM/DD/YYYY HH24:MI:ss')
AND record_date < TO_DATE( '03/21/2020 01:10:00','MM/DD/YYYY HH24:MI:ss')
--AND sw_Serial_number=5461608
GROUP BY Trunc(record_date),Trunc(record_date+1),Trunc(record_date+2),Trunc(record_date+3),Trunc(record_date+4),Trunc(record_date+5),Trunc(record_date+6)
,Trunc(record_date+7),Trunc(record_date+8),Trunc(record_date+9),Trunc(record_date+10),Trunc(record_date+11),Trunc(record_date+12),Trunc(record_date+13),Trunc(record_date+14)
,Trunc(record_date+15),Trunc(record_date+16),Trunc(record_date+17),Trunc(record_date+18),Trunc(record_date+19),Trunc(record_date+20),Trunc(record_date+21)
,Trunc(record_date+22),Trunc(record_date+23),Trunc(record_date+24), Trunc(record_date+25)
,Trunc(record_date+26),Trunc(record_date+27)
,Trunc(record_date+28),Trunc(record_date+29),Trunc(record_date+30)
ORDER BY Trunc(record_date) );

Output:

D0 D0
------------------
22-01-2020 37.5
23-01-2020 27029.17
24-01-2020 22646.67

and so on

20-03-2020 7923.33