7/30/2019 BPC blogs.docx http://slidepdf.com/reader/full/bpc-blogsdocx

Running BPC process chains within non-BPC process chains 

Posted by Pravin Datar in pravin.datar on Jan 26, 2009 10:59:42 AM

Business Planning and Consolidation version for Netweaver extensively uses process chains for running BPC processes. These process chains are automatically invoked by the BPC application when the BPC user executes processes from the front end. Should these process chains be executed exclusively within BPC, or should we be able to execute them outside BPC using native Netweaver BW, perhaps from within custom process chains that we create? Is there any need to do so? And finally, is there any way to do it? Let us try to answer these questions in this blog. Let us begin by seeing whether we have any business reason to run a BPC process chain outside the BPC application. In order to do that, we need to understand how the optimization process works in BPC version for Netweaver.

Optimizing the BPC data model: 

A dimension in BPC is equivalent to a characteristic in Netweaver BW and dimension members

in BPC are equivalent to characteristic values in Netweaver BW. Taking this further, when a user

creates a dimension in BPC version for Netweaver, a Netweaver BW characteristic is generated

in the BPC namespace for the same. When a user creates a dimension member for that

dimension in BPC version for Netweaver, a characteristic value is generated in Netweaver BW in the master data of the characteristic corresponding to that BPC dimension. When a user creates

a BPC application in BPC version for Netweaver by selecting a few of the BPC dimensions, an

infocube (as well as a multiprovider containing that infocube) is generated in the BPC

namespace that includes all the characteristics corresponding to the selected BPC dimensions.

(You can read more about the BPC namespace at A reservation of a different kind – why, what

and how of BPC namespace) 

We should distinguish the BPC dimension from the Netweaver BW dimension. In Netweaver BW, the term dimension is used to group characteristics. How are the characteristics in a BPC infocube organized among the Netweaver BW dimensions within the generated BPC infocube? Well, it depends upon the number of dimensions included in the BPC application. If the number of BPC dimensions in the BPC application is 13 or fewer, then all of them are automatically modeled as line item dimensions in the BPC infocube. This is because Netweaver BW allows up to 13 user-defined dimensions in an infocube. If the number of BPC dimensions exceeds 13, then the BPC infocube model is automatically generated for those BPC dimensions. The data model thus generated while creating the cube may not remain the most optimized one as the fact table of the cube begins to grow. BPC version for Netweaver gives the BPC user the option to optimize the data model from the front end. As shown below, there are two options to optimize: Lite optimize and Full optimize.


The Lite Optimize option does not make any changes to the data model. It just closes the open request, compresses and indexes the cube, and updates database statistics. The Full Optimize option is the one that may rearrange the characteristics among the 13 user-defined Netweaver BW dimensions. The Full Optimize process checks whether the size of each dimension table is less than 20% of the fact table and creates as many line item dimensions as possible. In order to do this reconfiguration, it takes the appset offline, creates a shadow cube with an optimal data model, links the new optimal cube to the multiprovider for the application, moves data to the shadow cube, deletes the original cube, closes the open request, compresses and indexes the cube, updates database statistics, and brings the appset online again. Though this results in creating a new infocube, the multiprovider remains the same, and all BPC reports are built on the multiprovider and not the underlying infocube. Hence this optimization does not affect the BPC reports reporting this data.

Using ETL for BPC infocubes: 

Since the data that the BPC user enters from the BPC front end is stored in the underlying real-time infocube for that application, one may ask whether it is possible to load data to that cube with a normal Netweaver BW ETL process. The answer is 'yes' - but with a caveat. We can use Netweaver BW ETL for BPC infocubes. Here is an example of a DTP to load data through a flat file to a BPC infocube.


Now if the BPC user chooses to do a 'Full Optimize' for this application, it may result in the creation of a new infocube with a more optimal data model. That new infocube, though it gets automatically linked to the multiprovider for the BPC application, at present does not inherit the ETL structure that was built on the original cube. So in the above example, if the BPC user executes a 'Full Optimize' for the Finance application, the new optimal infocube for the Finance application may not inherit the DTP created on the original /CPMB/DZID30P infocube. The source system, data source, infosource etc. will remain, but the transformation that links these to the infocube will get deleted and has to be recreated. If this optimization happens in the production system, then the transformation may have to be recreated and transported up the landscape.

A way to obviate such a situation is to execute the process chains used by BPC to load data using native Netweaver BW tools, outside the BPC application. In the above example, a flat file is being loaded to the BPC infocube using Netweaver BW ETL tools. However, the BPC application itself offers the front-end functionality of Data Manager to load data either through a flat file or from any other Infoprovider. Data Manager uses BPC process chains in the background to load the data as shown below.

If we can run these process chains outside BPC - from the EDW layer using native Netweaver BW - then not only can we integrate this with custom process chains, but we also obviate the issue of ETL structures getting deleted on 'Full Optimize'. Running BPC process chains outside BPC is also important if we are using open hub and want to automate the flat file load to BPC cubes by creating a user-defined process chain that integrates the file creation of the open hub and the loading of that file to the BPC cube. If, by any means, our user-defined (custom) process chain (that we create in transaction 'rspc') can run the BPC process chain to load the data to the BPC cube, then we have an 'industrial strength' solution for loading data to BPC infocubes using the Netweaver toolset. The question now becomes how to accomplish this. Let us try to understand the steps involved.

Steps in using a BPC process chain within a non-BPC process chain: 

The first step is to upload the flat file. If we want to use open hub, then the open hub can place the file at any specified location on the BPC application server, or we can upload the flat file to the BPC File service (transaction 'ujfs') as shown below.


The second step is to create a transformation file using the BPC front end. Though we want to run the BPC process chain with native Netweaver tools, this is the only step that we have to do with the BPC front end. This is because the BPC process chain looks for the XML version of the transformation file. When we process the transformation file from the BPC front end, this XML version of the transformation file is automatically created and stored in the file service.

The third step is to create an answer prompt file that passes the required parameters to the BPC process chain. This file should be a tab-delimited file. The format of the answer prompt file is as follows:

%FILE% 'csv file path in file service'

%TRANSFORMATION% 'transformation file path in file service'

%CLEARDATA% 1/0

%RUNLOGIC% 1/0

%CHECKLCK% 1/0

Here is an example of the answer prompt file:
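To make the format concrete, an answer prompt file for a flat-file load might look like the following (a hypothetical sketch: the appset, application, and file names are invented, and each parameter is separated from its value by a tab):

```text
%FILE%	\ROOT\WEBFOLDERS\MYAPPSET\FINANCE\DATAMANAGER\DATAFILES\SalesData.csv
%TRANSFORMATION%	\ROOT\WEBFOLDERS\MYAPPSET\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\SalesTransformation.xml
%CLEARDATA%	1
%RUNLOGIC%	0
%CHECKLCK%	1
```

Here the 1/0 flags mirror the prompts that the Data Manager package would normally ask interactively: whether to clear existing data in the scope, whether to run default logic, and whether to check work-status locks.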


The fourth step is to run program ujd_test_package with the right Appset and Application. We should use the answer prompt file created in the above step and save the variant for the program as shown below. However, please note that this ujd_test_package program was originally designed to assist in debugging data manager packages. Hence it may not be a bad idea to copy this program to a user-defined program and use that in the next step - just to be on the safer side, so that if future development changes the nature of this program, we don't get unnecessary surprises!

Now, in the final step, we are ready to create our custom process chain that executes the BPC process chain. As shown below, create a user-defined process chain in transaction 'rspc' and include a process type to execute an ABAP program. Include the ujd_test_package program (or the user-defined program created from it) with the saved variant.


Activate and execute the process chain.

Thus we can run the BPC process chain from within non-BPC process chains. These steps will work not only for the process chain that loads a flat file into a BPC infocube with open hub, but also for loading data from another Infoprovider to a BPC infocube (using the BPC process chain to load data from an Infoprovider).

BPC Script logic for Dummies? (Part 1) 

Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on May

24, 2011 10:45:56 AM

Even though I have long experience with BPC, if someone asked me to write a script logic, I might not be able to write that code within 10 minutes. It is not a problem of knowledge: writing a script logic requires understanding your financial requirements and knowing your application, such as account member IDs and the names of properties.

There are some documents, help files and how-to guides about script logic, but end users may feel it is not easy. I agree with that, but if you understand its structure and concepts, I can guarantee you will be able to read and understand what a script logic means and what its purpose is. On top of that, you might be able to modify or create one. It is like books: writing one is not easy, but reading one is a different story.

Let's learn it step by step.

1. Understand 3 logic parts. 

Logic is a special script engine, and it consists of 3 parts: Scoping, Calculation (create/record) and Writing.

2. Scoping 

BPC is based on NW BI or MSAS, which holds a lot of data. Therefore, if you don't specify a scope, it will take a lot of time to read data. Let's say you need to calculate, for January 2011 and actual data, only one account like 'Discounted External Sales', based on External Sales. How can we scope this from a big database?

The answer is... *XDIM_MEMBERSET. *XDIM_MEMBERSET is used to scope data by each dimension. Here is the grammar of *XDIM_MEMBERSET:

*XDIM_MEMBERSET <DIMENSIONNAME> = <MEMBERNAME 1>,<MEMBERNAME 2>...<MEMBERNAME n>

Now, let's scope the above example:

For scoping 2011 January: *XDIM_MEMBERSET TIMEDIM=2011.JAN

For scoping actual: *XDIM_MEMBERSET CATEGORYDIM=ACTUAL

For scoping external sales: *XDIM_MEMBERSET ACCOUNTDIM=EXTSALES

(Note: we need to scope External Sales because Discounted External Sales will be calculated based on External Sales.)


3. Now we have finished scoping, so it is time to calculate (create) data. 

Unlike other script engines, there is no temporary variable in the logic script engine, so it creates a record with the same structure as the fact table, and it replaces or changes the value using the '*REC' command. (Note: *REC means 'Record'.) Here is the grammar of the *REC statement:

*REC[(FACTOR|EXPRESSION={Expression}[,{dim1}={member},{dim2}={member},...])]

Using that grammar, we can write our script as below:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNT_EXTSALES")

This means: multiply the currently scoped record by 0.9 and replace the account member with DISCOUNT_EXTSALES.

Here is an example of what happens with the above statement:

<Scoped record> 

EXTSALES,2011.JAN,ACTUAL,10000 

<Generated record> 

DISCOUNT_EXTSALES,2011.JAN,ACTUAL,9000 

What if you want to put the generated record into the BUDGET category? Then the statement should be:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="BUDGET")

Now suppose you also want to put an 80% value into FORECAST at the same time. What should we do? We can use another *REC statement at the same time:


*REC (FACTOR = 0.9,ACCOUNT="DISCOUNT_EXTSALES",CATEGORY="BUDGET")

*REC (FACTOR = 0.8,ACCOUNT="DISCOUNT_EXTSALES",CATEGORY="FORECAST")

<Scoped record> 

EXTSALES,2011.JAN,ACTUAL,10000 

<Generated record> 

DISCOUNT_EXTSALES,2011.JAN,BUDGET,9000 

DISCOUNT_EXTSALES,2011.JAN,FORECAST,8000 

Getting easier? I hope so. Please keep in mind the rules below.

a. Each *REC instruction generates ONE new record.

b. Each source record can generate as many records as desired. It means you scoped 1 record, but you can create multiple records from it. Currency translation is the best example, because you need multiple converted currencies from a single local currency record. Therefore, you can imagine there will be multiple *REC statements in your currency conversion script logic.

c. The destination cell can be the same. All values will be accumulated. Each *REC statement generates a new record, so it doesn't matter even if the destination is the same.
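The rules above can be checked with a small Python simulation of *REC record generation (an illustration only; the rec helper below is an invented stand-in for the BPC logic engine, not a BPC API):

```python
# Minimal simulation of *REC semantics: each *REC produces ONE new record
# per scoped record; unspecified dimensions keep their scoped values.

def rec(scoped_records, factor=1.0, **overrides):
    generated = []
    for r in scoped_records:
        new = dict(r)                          # copy the scoped record
        new.update(overrides)                  # replace the specified dimension members
        new["SIGNEDDATA"] = r["SIGNEDDATA"] * factor
        generated.append(new)
    return generated

# One scoped record: EXTSALES, 2011.JAN, ACTUAL, 10000
scope = [{"ACCOUNT": "EXTSALES", "TIME": "2011.JAN",
          "CATEGORY": "ACTUAL", "SIGNEDDATA": 10000}]

# Two *REC statements against the same scope, as in the example above
out = (rec(scope, factor=0.9, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="BUDGET")
       + rec(scope, factor=0.8, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="FORECAST"))

for r in out:
    print(r["ACCOUNT"], r["TIME"], r["CATEGORY"], r["SIGNEDDATA"])
# DISCOUNT_EXTSALES 2011.JAN BUDGET 9000.0
# DISCOUNT_EXTSALES 2011.JAN FORECAST 8000.0
```

One scoped record yields one generated record per *REC; records written to the same destination would simply accumulate when posted.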

4. As a final step, we need to write the data into the database. 


The script statement for this is a really simple one:

*COMMIT

Fortunately, it doesn't have any parameters. Just use *COMMIT. When the BPC script engine executes *COMMIT, the generated records are posted to the table using the BPC sending engine, which is the same engine used when you submit data from an Excel workbook.

We have reviewed the three main parts of BPC script logic as a first step. I will explain advanced scoping, recording and the commit command in the next post.

BPC Script logic for Dummies? (Part 2) 

Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Jun 20,

2011 10:19:13 AM

I explained the 3 basic parts of script logic in the last post: Scoping, Calculation and Writing. In this post we will look at more advanced features for scoping.

1. Scope using member property 

We saw how to use *XDIM_MEMBERSET last time. *XDIM_MEMBERSET scopes data based on the member ID. What if a user wants to scope members based on a specific property value? For example, a user wants to filter the Account dimension members that are assets.


To achieve this, we need to use the ACCTYPE property, which holds the account type. AST is the value for ASSET accounts. (Note: the value is based on the APSHELL of BPC.) The command is *XDIM_FILTER. The usage is:

*XDIM_FILTER <DIMENSIONNAME> = [<DIMENSIONNAME>].PROPERTIES("Property name") = "Property value"

So the above example can be written as below:

*XDIM_FILTER ACCOUNT = [ACCOUNT].PROPERTIES("ACCTYPE") = "AST"

Let's say the Account dimension has the members below.

ID        ACCTYPE
EXTSALES  INC
CASH      AST
TAXES     EXP
NETINCOME INC

Then the *XDIM_FILTER statement above will select the CASH member only.

Let's assume you have already used multiple *XDIM_MEMBERSET commands, and below are the selected records from the fact table:

*XDIM_MEMBERSET TIME = 2011.JAN

*XDIM_MEMBERSET CATEGORY = BUDGET

<Result> 

EXTSALES , 2011.JAN, BUDGET, 9000

CASH , 2011.JAN, BUDGET, 3000

TAXES , 2011.JAN, BUDGET, 800

NETINCOME, 2011.JAN, BUDGET, 1500

Now add *XDIM_FILTER against the ACCOUNT dimension:

*XDIM_MEMBERSET TIME = 2011.JAN

*XDIM_MEMBERSET CATEGORY = BUDGET

*XDIM_FILTER ACCOUNT = [ACCOUNT].PROPERTIES("ACCTYPE") = "AST"

Then only one record will be selected from the above result, because CASH is the only account member that has the value 'AST' in its ACCTYPE property.

<Result> 

CASH , 2011.JAN, BUDGET, 3000
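The effect of *XDIM_FILTER can be mimicked in a few lines of Python (an illustration only, using the dimension table and scoped records from the example above):

```python
# Dimension master data: member ID -> ACCTYPE property (per the example above)
acctype = {"EXTSALES": "INC", "CASH": "AST", "TAXES": "EXP", "NETINCOME": "INC"}

# Records already scoped by TIME=2011.JAN and CATEGORY=BUDGET
scoped = [
    ("EXTSALES",  "2011.JAN", "BUDGET", 9000),
    ("CASH",      "2011.JAN", "BUDGET", 3000),
    ("TAXES",     "2011.JAN", "BUDGET", 800),
    ("NETINCOME", "2011.JAN", "BUDGET", 1500),
]

# *XDIM_FILTER on ACCTYPE = "AST": keep only asset accounts
filtered = [r for r in scoped if acctype[r[0]] == "AST"]
print(filtered)  # [('CASH', '2011.JAN', 'BUDGET', 3000)]
```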

2. Scope using member value 

We just figured out how to scope the source data based on a property. Then someone might ask this question: "Can we scope based on the value? For example, can we select all records where an account value is greater than 1000?" Of course we can. The command is *XDIM_GETMEMBERSET. Unlike other commands, it needs an *ENDXDIM command to close the definition. Here is the grammar of *XDIM_GETMEMBERSET, followed by an example:

*XDIM_GETMEMBERSET {dimension} [={member set}]

[*APP={application}] //optional

[*XDIM_MEMBERSET {dimension} [={member set}]] //as many of these as needed

[*QUERY_TYPE= 0 | 1 | 2] //optional

*CRITERIA {expression} //required

*ENDXDIM

*XDIM_GETMEMBERSET P_CC=[P_CC].[H1].[AAPJ].CHILDREN

*APP=PLANNING

*XDIM_MEMBERSET P_DataSrc=INPUT

*CRITERIA [P_ACCT].[H1].[CE0001000]>1000

*ENDXDIM

It will get the records that are:

a. children members of AAPJ in the P_CC dimension, AND

b. from the PLANNING application, AND

c. the INPUT member of the P_DataSrc dimension, AND

d. where the value of member CE0001000 of the P_ACCT dimension is greater than 1000.

Let's assume the fact table has the records below.

CE0001000, 2011.JAN, ACTUAL, INPUT, KOREA , 2500

CE0002000, 2011.JAN, ACTUAL, INPUT, CHINA , 5000

CE0001000, 2011.JAN, ACTUAL, ADJ , CHINA , 3000

CE0002000, 2011.JAN, ACTUAL, INPUT, JAPAN , 1999

CE0003000, 2011.JAN, ACTUAL, INPUT, JAPAN , 2222

CE0001000, 2011.FEB, BUDGET, ADJ , KOREA , 345

CE0001000, 2011.FEB, BUDGET, INPUT, TURKEY, 1999

CE0003000, 2011.JAN, ACTUAL, INPUT, TURKEY, 1100

CE0001000, 2011.FEB, BUDGET, INPUT, CHINA , 1050

CE0001000, 2011.FEB, BUDGET, INPUT, JAPAN , 450

Which records will be selected?

The answer is

CE0001000, 2011.JAN, ACTUAL, INPUT, KOREA, 2500

CE0001000, 2011.FEB, BUDGET, INPUT, CHINA, 1050


The records below will not be selected, even though P_ACCT is CE0001000, because the value is less than 1000, or DataSrc is not INPUT, or the entity is not a child member of AAPJ (Asia Pacific):

CE0001000, 2011.JAN, ACTUAL, ADJ , CHINA , 3000 (DataSrc is not INPUT)

CE0001000, 2011.FEB, BUDGET, ADJ , KOREA , 345 (DataSrc is not INPUT)

CE0001000, 2011.FEB, BUDGET, INPUT, TURKEY, 1999 (TURKEY is not a child member of AAPJ)

CE0001000, 2011.FEB, BUDGET, INPUT, JAPAN , 450 (value is less than 1000)
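The selection above can be double-checked with a short Python sketch (an illustration only; the set of AAPJ children is assumed from the example, where KOREA, CHINA and JAPAN are the Asia-Pacific entities):

```python
# Fact table records: (account, time, category, datasrc, entity, value)
facts = [
    ("CE0001000", "2011.JAN", "ACTUAL", "INPUT", "KOREA",  2500),
    ("CE0002000", "2011.JAN", "ACTUAL", "INPUT", "CHINA",  5000),
    ("CE0001000", "2011.JAN", "ACTUAL", "ADJ",   "CHINA",  3000),
    ("CE0002000", "2011.JAN", "ACTUAL", "INPUT", "JAPAN",  1999),
    ("CE0003000", "2011.JAN", "ACTUAL", "INPUT", "JAPAN",  2222),
    ("CE0001000", "2011.FEB", "BUDGET", "ADJ",   "KOREA",  345),
    ("CE0001000", "2011.FEB", "BUDGET", "INPUT", "TURKEY", 1999),
    ("CE0003000", "2011.JAN", "ACTUAL", "INPUT", "TURKEY", 1100),
    ("CE0001000", "2011.FEB", "BUDGET", "INPUT", "CHINA",  1050),
    ("CE0001000", "2011.FEB", "BUDGET", "INPUT", "JAPAN",  450),
]

aapj_children = {"KOREA", "CHINA", "JAPAN"}  # children of AAPJ (assumed from the example)

# Apply the scope: account CE0001000, DataSrc INPUT, entity under AAPJ, value > 1000
selected = [f for f in facts
            if f[0] == "CE0001000" and f[3] == "INPUT"
            and f[4] in aapj_children and f[5] > 1000]

for f in selected:
    print(f)
```

Running this keeps exactly the two records given in the answer above: KOREA with 2500 and CHINA with 1050.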

Here are some important notes on using this command.

Note 1: This command works only for BPC MS.

Note 2: If you don't specify a scope for a dimension, the command operates on the corresponding members of the pre-selected region, which is defined by the *XDIM_MEMBERSET of a previous line or passed by Data Manager.

Note 3: This command generates an MDX statement, so it takes more time to execute. If your data set has only base members, you can use *XDIM_GETINPUTSET instead (please refer to the help file).

3. When a user wants to add more members on top of the currently scoped data 

Let's say a user wants to add the USASales entity on top of a predefined member set. In that case, the user defines it as below:

*XDIM_ADDMEMBERSET Entity = USASales

The main reasons why we need this are:

a. Sometimes *XDIM_MEMBERSET doesn't work with some specific functions like BAS(parent). For example, in BPC NW, *XDIM_MEMBERSET Entity = BAS(US),CANADA will not work. Therefore, we should use *XDIM_MEMBERSET and *XDIM_ADDMEMBERSET together. (Note: in BPC MS, BAS() will not work with *XDIM_MEMBERSET.)

b. If a user always wants a specific member set to be processed whenever the logic runs, they should use *XDIM_ADDMEMBERSET.

4. Dynamic Scope and saving it to a variable 

Sometimes we need to save our scoped data into a script logic variable. But what if your dimension members are updated frequently? As far as I know, almost every customer updates their dimensions at least once a month. If a customer changes their dimension members, what will happen in your script logic? You can use *XDIM_FILTER, but it may not work all the time. Then we can use the *SELECT and *MEMBERSET commands as dynamic scoping tools.

Like other script engines, logic script also supports variables to save some data. A variable is defined using the % symbol; here are some examples: %MYTIME%, %CUR% etc. So how can we save data into a variable, and when can it be used? Usually, a variable is filled using the *SELECT command or the *MEMBERSET command. Both of them are scoping commands, but *SELECT will be faster because it creates an SQL statement. Here is the grammar of the *SELECT command:

*SELECT ({variable}, {"what"}, {dimension}, {where condition})


Let's see how to use the *SELECT command:

*SELECT(%REPORTING_CURRENCIES%, "ID", "CURRENCY", "[GROUP] = 'REP'")

This command gets the member ID (what) from the currency dimension (from) where the GROUP property has the value 'REP' (where). Actually, it creates an SQL statement as below:

SELECT ID FROM mbrCurrency WHERE [GROUP] = 'REP'

After executing this SQL command, the result is saved into the %REPORTING_CURRENCIES% variable.

Here is an example of *MEMBERSET, which produces the same result but executes an MDX statement instead of SQL:

*MEMBERSET(%REPORTING_CURRENCIES%, Filter([CURRENCY].MEMBERS, [CURRENCY].PROPERTIES("GROUP") = 'REP'))

The variable can be used anywhere in the logic, like in this example:

*XDIM_MEMBERSET CURRENCY=%REPORTING_CURRENCIES%

Let's assume the Currency dimension has the members below.

ID  GROUP
USD REP
EUR REP
KRW
JPY

Then the above statement will be resolved as:

*XDIM_MEMBERSET CURRENCY = USD,EUR
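*SELECT is essentially a lookup against the dimension table; its effect can be sketched in Python (an illustration only, using the example dimension above):

```python
# Currency dimension master data: member ID -> GROUP property
currency_dim = {"USD": "REP", "EUR": "REP", "KRW": "", "JPY": ""}

# *SELECT(%REPORTING_CURRENCIES%, "ID", "CURRENCY", "[GROUP] = 'REP'")
reporting_currencies = [mbr for mbr, grp in currency_dim.items() if grp == "REP"]

# The variable then expands inside *XDIM_MEMBERSET CURRENCY=%REPORTING_CURRENCIES%
print("*XDIM_MEMBERSET CURRENCY = " + ",".join(reporting_currencies))
# *XDIM_MEMBERSET CURRENCY = USD,EUR
```

Because the member list is looked up at run time, the scope stays correct even when members are added to or removed from the dimension later.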


When you define and fill a variable using *SELECT or *MEMBERSET, remember it as a 'MEMBERSET variable'. (Note: the *MEMBERSET command is only supported by the MS version.)

We have reviewed the key scoping commands today. We will review advanced calculation commands and control commands like *IF or *FOR - *NEXT in the next post.

BPC Script logic for Dummies? (Part 3) 

Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Aug 4,

2011 7:37:49 PM

I am sorry for the late posting of this series, but I had to take my vacation and needed to get some training about HANA. Let's start to learn how to calculate and write some data using the script logic. Again, the script logic consists of 3 parts: Scoping, Calculation and Writing.

1. Basic concept of Writing and *REC statement 

As we saw in the first post of this series, the *REC statement is used for writing data. You need to keep in mind that *REC will create records based on the scoped records. For example, if your scoped record is as below:


  <Scoped record>

EXTSALES, 2011.JAN, ACTUAL, USA, 10000 

and your *REC statement is as below:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNTED_EXTSALES", CATEGORY="BUDGET")

Then your generated record will be

<Generated record> 

DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, USA, 9000

What if your scoped record is not a single record but multiple records?

<Scoped record> 

EXTSALES, 2011.JAN, ACTUAL, USA, 10000 

EXTSALES, 2011.JAN, ACTUAL, KOREA, 3000 

EXTSALES, 2011.JAN, ACTUAL, CANADA, 5000 

Then your generated records will be

<Generated record> 

DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, USA, 9000 

DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, KOREA, 2700 

DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, CANADA, 4500 

As you can see, we changed the Account value, the Category value and the signed data value (or measure value) using the *REC statement. The other dimensions that are not specified in the *REC statement stay the same as in the scoped data, so 2011.JAN and each country (entity) are not changed.


2. Grammar of *REC statement. 

Here is the grammar of the *REC statement. You can use FACTOR or EXPRESSION for various calculations on the signed data value (or measure value), and specify a dimension name and member to change its value:

*REC[(FACTOR|EXPRESSION={Expression}[,{dim1}={member},{dim2}={member},...])]

3. What is the difference between FACTOR and EXPRESSION? 

FACTOR is a factor (multiplier) by which the retrieved amount is multiplied.

Here is an example.

<Scoped record> 

EXTSALES, 2011.JAN, ACTUAL, 10000

*REC(FACTOR=6/2)

<Generated record> 

EXTSALES, 2011.JAN, ACTUAL, 30000

What if you want to add or divide? Then you should use EXPRESSION. EXPRESSION is any formula that will result in the new value to post. The formula can include regular arithmetic operators, fixed values, and the script logic keyword %VALUE%, which represents the original retrieved value of the scoped record.

Here is an example.

<Scoped record> 

EXTSALES, 2011.JAN, ACTUAL, 10000 

*REC(EXPRESSION=%VALUE% + 5000)


  <Generated record> 

EXTSALES, 2011.JAN, ACTUAL, 15000 
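The arithmetic difference between FACTOR and EXPRESSION can be summarized in two lines of Python (an illustration only):

```python
value = 10000  # the scoped signed data: EXTSALES, 2011.JAN, ACTUAL

# *REC(FACTOR=6/2): the scoped value is multiplied by the factor
factor_result = value * (6 / 2)

# *REC(EXPRESSION=%VALUE% + 5000): %VALUE% stands for the scoped value itself
expression_result = value + 5000

print(factor_result, expression_result)  # 30000.0 15000
```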

Now we have the basics of the *REC statement, but you may ask the questions below:

"There is some scoped data and I need to do different calculations based on each specific dimension member."

"I need to copy a value to multiple destinations!"

"How can I get a value from another application?"

"I want to use some value from other records to calculate the result."

"Can I use a property value to calculate the result?"

Script logic can handle all of the above requirements. I will answer the first question in this post and the others in the next post.

"There is some scoped data and I need to do different calculations based on each specific dimension member."

Yes. That's why you MUST use the *REC statement together with the *WHEN ~ *IS ~ *ELSE ~ *ENDWHEN statement.

Let's assume you want to create forecast values of salary and put them into the FORECAST category based on each country's actual salary values of January 2011. We need to increase by 10% for the US, 5% for Canada and 3% for other countries. Let's assume the ENTITY dimension has the country information. To do this, you need to scope first:

*XDIM_MEMBERSET ACCT = SALARY

*XDIM_MEMBERSET TIME = 2011.JAN

*XDIM_MEMBERSET CATEGORY = ACTUAL

Now you need to write the *REC statements:


*REC(FACTOR = 1.1, CATEGORY="FORECAST")   // 10%

*REC(FACTOR = 1.05, CATEGORY="FORECAST")  // 5%

*REC(FACTOR = 1.03, CATEGORY="FORECAST")  // 3%

Finally, you should specify a condition for each *REC statement. For doing this, you MUST use the *WHEN ~ *IS ~ *ELSE ~ *ENDWHEN statement.

First, write down *WHEN and *ENDWHEN around the *REC statements:

*WHEN
    *REC(FACTOR = 1.1, CATEGORY="FORECAST")   // 10%
    *REC(FACTOR = 1.05, CATEGORY="FORECAST")  // 5%
    *REC(FACTOR = 1.03, CATEGORY="FORECAST")  // 3%
*ENDWHEN

NOTE: You don't need to indent code in script logic, but I recommend indentation for better readability.

Second, write the dimension name that you want to compare next to *WHEN. In this example, it will be the ENTITY dimension:

*WHEN ENTITY
    *REC(FACTOR = 1.1, CATEGORY="FORECAST")   // 10%
    *REC(FACTOR = 1.05, CATEGORY="FORECAST")  // 5%
    *REC(FACTOR = 1.03, CATEGORY="FORECAST")  // 3%
*ENDWHEN

Third, put an *IS statement on top of each *REC statement and an *ELSE statement on top of the last *REC statement. We need two *IS statements and one *ELSE statement because there are two explicit conditions, and all other entities are handled by one catch-all condition:

*WHEN ENTITY
*IS
    *REC(FACTOR = 1.1, CATEGORY="FORECAST")   // 10%
*IS
    *REC(FACTOR = 1.05, CATEGORY="FORECAST")  // 5%
*ELSE
    *REC(FACTOR = 1.03, CATEGORY="FORECAST")  // 3%
*ENDWHEN

Fourth, put each condition value next to *IS.

*WHEN ENTITY

*IS USA

*REC(FACTOR = 1.1, CATEGORY="FORECAST") // 10%

*IS CANADA

*REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%

*ELSE

*REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%

*ENDWHEN

As a last step, put *COMMIT at the end of the script so that the logic engine can post the data to the database.

The final version should look like the code below.

*XDIM_MEMBERSET ACCT = SALARY

*XDIM_MEMBERSET TIME = 2011.JAN

*XDIM_MEMBERSET CATEGORY = ACTUAL

*WHEN ENTITY

*IS USA

*REC(FACTOR = 1.1, CATEGORY="FORECAST") // 10%

*IS CANADA

*REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%

*ELSE

*REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%

*ENDWHEN

*COMMIT
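To see concretely what the logic engine does with this script, here is a minimal Python sketch of the same calculation. The sample entities and salary values are hypothetical, and the real engine works on the application's fact table, not on Python dictionaries:

```python
# Minimal sketch of the *WHEN ENTITY / *IS / *ELSE / *REC logic above.
# The sample records are hypothetical; the real engine reads the scoped fact table.
records = [
    {"ACCT": "SALARY", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "USA", "SIGNEDDATA": 1000.0},
    {"ACCT": "SALARY", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "CANADA", "SIGNEDDATA": 2000.0},
    {"ACCT": "SALARY", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "KOREA", "SIGNEDDATA": 3000.0},
]

def forecast(rec):
    # *WHEN ENTITY ... *IS USA / *IS CANADA / *ELSE
    if rec["ENTITY"] == "USA":
        factor = 1.1      # 10%
    elif rec["ENTITY"] == "CANADA":
        factor = 1.05     # 5%
    else:
        factor = 1.03     # 3%
    # *REC changes only the dimensions it names (here CATEGORY) and scales the value
    return dict(rec, CATEGORY="FORECAST", SIGNEDDATA=rec["SIGNEDDATA"] * factor)

generated = [forecast(r) for r in records]
for g in generated:
    print(g["ENTITY"], g["CATEGORY"], round(g["SIGNEDDATA"], 2))
```

Just as *REC does, the sketch writes new FORECAST records and leaves the ACTUAL source records untouched.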

Note 1 : You can use multiple condition values like *IS VALUE_A, VALUE_B

Note 2 : You can use > and <= with a numeric value in the *IS statement. ex) *IS > 4

By default, the comparison is equal (=), so it is OK even if you don't specify an operator.

Note 3 : You can't use AND, OR, and NOT with *IS.

Note 4 : " (double quotation) is not mandatory for comparing string values with the *IS statement.

Note 5 : *WHEN statement can be nested. For example,

*WHEN xxx

*IS "A"

*REC(…)

*REC(…)

*IS "B"

*REC(…)

*WHEN yyy

*IS "C","D","E"

*REC(…)

*ELSE

*REC(…)

*ENDWHEN

*ENDWHEN

Note 6 : You can use property value with *IS statement. ex) *IS Intco.Entity

Now we have finished learning the three basic parts of script logic.

As I explained in the first post of this series, I hope you feel that script logic is not too complex.

I will cover a couple of advanced features like LOOKUP in the next post to answer the other questions.

BPC Script logic for Dummies? (Part 4) 

Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Aug 12,

2011 11:11:21 AM

OK, let's start to find the answer to one of the questions that we had in the last post.

"How can I get the value from the other application?"

The simple answer is... USE *LOOKUP/*ENDLOOKUP! 

The simplest example is currency conversion, because you need to read rate values from the rate application to convert the currency values of the finance application.

(NOTE: *LOOKUP/*ENDLOOKUP can also be used for reading values from the current application.)

Here is the syntax of *LOOKUP/*ENDLOOKUP

The syntax is:

*LOOKUP {Application}

*DIM [{LookupID}:] {DimName}="Value" | {CallingDimensionName}[.{Property}]

*DIM ...

*ENDLOOKUP

{Application} is the name of the application from which you will retrieve values.

{DimName} is a dimension in the lookup application.

{CallingDimensionName} is a dimension in the current application.

{LookupID} is an optional variable that will hold the retrieved value so that you can use it in the script.

This is only required when multiple values must be retrieved.

Now, let's do it step by step.

Here are our requirements for the currency conversion.

1. You need to get the rate values from rate application for currency conversion (LC to USD and

EUR).

2. The member id of RATE dimension in the rate application should be the same as RATETYPE

property of the account dimension in the finance application.

3. The member id of the RATEENTITY dimension in the rate application should be "DEFAULT".

4. The rule of currency conversion is 'DESTINATION CURRENCY / CURRENT CURRENCY'.

First, you need to define *LOOKUP with application name.

*LOOKUP RATE 

*ENDLOOKUP 

Page 26: BPC blogs.docx

7/30/2019 BPC blogs.docx

http://slidepdf.com/reader/full/bpc-blogsdocx 26/44

Second, specify the dimensions of the RATE application with *DIM statements.

(Let's assume the rate application has the RATEENTITY, INPUTCURRENCY, RATE, CATEGORY and TIME dimensions.)

*LOOKUP RATE

*DIM RATEENTITY 

*DIM INPUTCURRENCY 

*DIM RATE 

*DIM CATEGORY 

*DIM TIME 

*ENDLOOKUP

Third, assign a member id value to each dimension from the current application (Finance), or use a fixed value.

If you need to retrieve multiple values according to different member id values of a specific dimension, make copies of that dimension and assign the different values.

*LOOKUP RATE

*DIM RATEENTITY="DEFAULT" // Fixed value

*DIM INPUTCURRENCY="USD" // Fixed value

*DIM INPUTCURRENCY="EUR" // Fixed value, Copy same dimension for another value

*DIM INPUTCURRENCY=ENTITY.CURR // Variable value, one more copy added for the currency conversion

*DIM RATE=ACCOUNT.RATETYPE // Variable value based on the current application

*DIM CATEGORY

*DIM TIME

*ENDLOOKUP

Fourth, put a variable name for each retrieved value in front of each duplicated dimension name.

*LOOKUP RATE

*DIM RATEENTITY="DEFAULT"

*DIM DESTCURR1:INPUTCURRENCY="USD"

*DIM DESTCURR2:INPUTCURRENCY="EUR"

*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR

*DIM RATE=ACCOUNT.RATETYPE

*DIM CATEGORY

*DIM TIME

*ENDLOOKUP

-------------------------------------------------------------------------

Note: If you want to get a value based on two or more dimensions, you should use the same variable name when you map those dimensions.

Here is an example.

*LOOKUP OWNERSHIP

*DIM INTCO="IC_NONE"

*DIM PARENT="MYPARENT"

*DIM PCON:ACCOUNTOWN="PCON" // PCON is used for ACCOUNTOWN

*DIM PCON:ENTITY=ENTITY // PCON is used for ENTITY

*DIM IC_PCON:ACCOUNTOWN="PCON" // IC_PCON is used even though it searches the same "PCON"

*DIM IC_PCON:ENTITY=INTCO.ENTITY // IC_PCON is used for INTCO.ENTITY

*ENDLOOKUP

Even though the member id of the ACCOUNTOWN dimension is the same, a different variable must be defined because the member id of the ENTITY dimension differs in the combination.

If the 'ENTITY' property of the INTCO dimension has the value I_FRANCE, the above *LOOKUP will select the two records below, and each variable will hold a different value.

IC_NONE,MYPARENT,PCON,FRANCE,100 => PCON

IC_NONE,MYPARENT,PCON,I_FRANCE,80 => IC_PCON

---------------------------------------------------------------------------
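As a rough sketch of this two-key selection, the Python fragment below mimics how the two LookupIDs pick different records even though both use ACCOUNTOWN="PCON". The fact records come from the example above; the current entity value and the linear scan are simplifications for illustration:

```python
# Sketch of the PCON / IC_PCON lookup above: all *DIM lines sharing one LookupID
# form a single AND-combined key against the OWNERSHIP records.
ownership_facts = [
    # INTCO, PARENT, ACCOUNTOWN, ENTITY, SIGNEDDATA
    ("IC_NONE", "MYPARENT", "PCON", "FRANCE", 100),
    ("IC_NONE", "MYPARENT", "PCON", "I_FRANCE", 80),
]
current_entity = "FRANCE"           # current ENTITY member (assumed for illustration)
intco_entity_property = "I_FRANCE"  # 'ENTITY' property of the current INTCO member

def lookup(accountown, entity):
    # all *DIM selections of one LookupID are combined with AND
    for intco, parent, acc, ent, value in ownership_facts:
        if (intco, parent, acc, ent) == ("IC_NONE", "MYPARENT", accountown, entity):
            return value
    return None

pcon = lookup("PCON", current_entity)            # the PCON variable -> 100
ic_pcon = lookup("PCON", intco_entity_property)  # the IC_PCON variable -> 80
```

Because the ENTITY part of each key differs, the same ACCOUNTOWN member yields two different values, which is why two distinct variables are required.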

Last, remove the dimension names (TIME and CATEGORY) that don't have any fixed or variable value, because the current member value will be passed automatically.

*LOOKUP RATE

*DIM RATEENTITY="DEFAULT"

*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR

*DIM DESTCURR1:INPUTCURRENCY="USD"

*DIM DESTCURR2:INPUTCURRENCY="EUR"

*DIM RATE=ACCOUNT.RATETYPE

*ENDLOOKUP

Now we have the values, so how can we use them?

You can use them via LOOKUP(Variable) in your *REC statement as below.

*WHEN ACCOUNT.RATETYPE

*IS "AVG","END"

*REC(FACTOR=LOOKUP(DESTCURR1)/LOOKUP(SOURCECUR),CURRENCY="USD")

*REC(FACTOR=LOOKUP(DESTCURR2)/LOOKUP(SOURCECUR),CURRENCY="EUR")

*ENDWHEN

NOTE: You can also use LOOKUP(variable) with the *WHEN and *IS statements.

Ex) *WHEN LOOKUP(PCON) //as a condition value of when

*IS <= LOOKUP(IC_PCON) //as a value of IS

*REC(FACTOR=-1, PARENT ="MYPARENT",DATASRC="ELIM")

*ENDWHEN

We reviewed how to define the *LOOKUP/*ENDLOOKUP statement and how to use it.

Now it is time to find out how it works in the script engine.

Let's assume the records below are in the rate application and see what happens during execution of the script logic.

RATEENTITY, INPUTCURRENCY, RATE, CATEGORY, TIME, SIGNEDDATA

DEFAULT, USD, AVG, ACTUAL, 2011.JAN, 1

DEFAULT, EUR, AVG, ACTUAL, 2011.JAN, 1.22

DEFAULT, CHF, AVG, ACTUAL, 2011.JAN, 0.91

DEFAULT, USD, END, ACTUAL, 2011.JAN, 1

DEFAULT, EUR, END, ACTUAL, 2011.JAN, 1.24

DEFAULT, CHF, END, ACTUAL, 2011.JAN, 0.93

RATCALC, USD, AVG, ACTUAL, 2011.JAN, 1

RATCALC, USD, AVG, ACTUAL, 2011.JAN, 1

Here are your current finance application records that need to be processed.

ACCOUNT, ENTITY, CATEGORY, TIME, CURRENCY, SIGNEDDATA

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, LC, 5000

REVENUE, SWITZERLAND, ACTUAL, 2011.JAN, LC, 1000

As you can see, there is no relationship between the finance application and the rate application.

We know Switzerland's currency is CHF, but that information is not in the fact table records.

They only have the LC (local currency) value.

Then, how can script logic find the value and calculate it?

The key point is 'ENTITY.CURR', which we used for the dimension mapping as below.

*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR

ENTITY.CURR means 'CURR' property value of ENTITY dimension.

Therefore, Switzerland, which is one of the Entity dimension members, should have the value 'CHF' in its 'CURR' property.

The same applies to mapping the RATE dimension of the rate application as below.

*DIM RATE=ACCOUNT.RATETYPE

So the 'RATETYPE' property of the INVENTORY and REVENUE accounts should have the value 'AVG' or 'END'.

Therefore, the Script engine will do the following steps to process the first record of the fact

table.

1. Select RATEENTITY = "DEFAULT"

2. Select INPUTCURRENCY = "CHF" (because the current Entity member's 'CURR' property value is 'CHF')

OR INPUTCURRENCY = "USD"

OR INPUTCURRENCY = "EUR"

3. Select RATE = "END" (because the current account member's 'RATETYPE' property value is 'END')

4. Select CATEGORY = "ACTUAL" (There is no statement, so it is the same as the current application CATEGORY value.)

5. Select TIME = "2011.JAN" (There is no statement, so it is the same as the current application TIME value.)

All of the above selections are combined with an 'AND' condition.

So the 3 records below will be selected, and their SIGNEDDATA values will be assigned to the variables.

DEFAULT, USD, END, ACTUAL, 2011.JAN, 1 => DESTCURR1 will be 1

DEFAULT, EUR, END, ACTUAL, 2011.JAN, 1.24 => DESTCURR2 will be 1.24

DEFAULT, CHF, END, ACTUAL, 2011.JAN, 0.93 => SOURCECUR will be 0.93

After the script logic engine executes the statements below, it will generate 2 records.

*WHEN ACCOUNT.RATETYPE

*IS "AVG","END"

*REC(FACTOR=LOOKUP(DESTCURR1)/LOOKUP(SOURCECUR),CURRENCY="USD")

*REC(FACTOR=LOOKUP(DESTCURR2)/LOOKUP(SOURCECUR),CURRENCY="EUR")

*ENDWHEN

ACCOUNT, ENTITY, CATEGORY, TIME, CURRENCY, SIGNEDDATA

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, LC, 5000

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, USD, 5376.34 // 5000 * (1/0.93)

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, EUR, 6666.67 // 5000 * (1.24/0.93)

For the 2nd record in the fact table, the 3 records below will be selected from the rate application fact table because the REVENUE account has the 'AVG' RATETYPE.

DEFAULT, USD, AVG, ACTUAL, 2011.JAN, 1

DEFAULT, EUR, AVG, ACTUAL, 2011.JAN, 1.22

DEFAULT, CHF, AVG, ACTUAL, 2011.JAN, 0.91

After it processes the 'REVENUE' record, there will be 6 records in the fact table as below.

(4 records will have been generated in total.)

ACCOUNT, ENTITY, CATEGORY, TIME, CURRENCY, SIGNEDDATA

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, LC, 5000

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, USD, 5376.34

INVENTORY, SWITZERLAND, ACTUAL, 2011.JAN, EUR, 6666.67

REVENUE, SWITZERLAND, ACTUAL, 2011.JAN, LC, 1000

REVENUE, SWITZERLAND, ACTUAL, 2011.JAN, USD, 1098.90 // 1000 * (1/0.91)

REVENUE, SWITZERLAND, ACTUAL, 2011.JAN, EUR, 1340.66 // 1000 * (1.22/0.91)
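The whole walk-through can be condensed into a small Python sketch. The rate records and property values are the ones listed above; the AND-combined selection and the FACTOR arithmetic follow the steps just described. This is an illustration of the mechanics, not the engine's actual implementation:

```python
# Sketch of the *LOOKUP selection and *REC conversion described above.
rate_facts = [
    # RATEENTITY, INPUTCURRENCY, RATE, CATEGORY, TIME, SIGNEDDATA
    ("DEFAULT", "USD", "AVG", "ACTUAL", "2011.JAN", 1.0),
    ("DEFAULT", "EUR", "AVG", "ACTUAL", "2011.JAN", 1.22),
    ("DEFAULT", "CHF", "AVG", "ACTUAL", "2011.JAN", 0.91),
    ("DEFAULT", "USD", "END", "ACTUAL", "2011.JAN", 1.0),
    ("DEFAULT", "EUR", "END", "ACTUAL", "2011.JAN", 1.24),
    ("DEFAULT", "CHF", "END", "ACTUAL", "2011.JAN", 0.93),
]
entity_curr = {"SWITZERLAND": "CHF"}                       # ENTITY.CURR property
account_ratetype = {"INVENTORY": "END", "REVENUE": "AVG"}  # ACCOUNT.RATETYPE property

def rate(currency, ratetype, category, time):
    # all *DIM selections are combined with AND
    for rateentity, cur, rt, cat, t, value in rate_facts:
        if rateentity == "DEFAULT" and cur == currency and rt == ratetype \
                and cat == category and t == time:
            return value
    return None

def convert(account, entity, category, time, lc_value):
    src = rate(entity_curr[entity], account_ratetype[account], category, time)  # SOURCECUR
    usd = rate("USD", account_ratetype[account], category, time)                # DESTCURR1
    eur = rate("EUR", account_ratetype[account], category, time)                # DESTCURR2
    # FACTOR = LOOKUP(DESTCURRn) / LOOKUP(SOURCECUR)
    return round(lc_value * usd / src, 2), round(lc_value * eur / src, 2)

print(convert("INVENTORY", "SWITZERLAND", "ACTUAL", "2011.JAN", 5000))
print(convert("REVENUE", "SWITZERLAND", "ACTUAL", "2011.JAN", 1000))
```

The two calls reproduce the USD and EUR amounts shown in the fact table above.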

We have finished learning how to use the *LOOKUP/*ENDLOOKUP statement.

Here are some questions and answers that I get frequently.

Question 1: What if the rate application doesn't have the value?

Then the currency conversion will not happen.

Question 2: I don't have any records in the fact table of the current application. What will happen?

The script logic always reads records from the current application.

Therefore, if there are no records in the fact table of the current application, nothing will happen.

Question 3: Can we look up a parent member value instead of a base member (leaf member)?

The MS version can do it with the *OLAPLOOKUP statement instead of *LOOKUP, but the NW version doesn't have it yet.

I will explain *FOR/*NEXT in the next post.

BPC Script logic for Dummies? (Part 5) 

Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Oct 3,

2011 7:39:07 AM

Like other programming languages and scripts, script logic also supports a loop statement.

Let's see how it works.

Here is the syntax of *FOR - *NEXT statement.

*FOR {variable1} = {set1} [AND {variable2} = {set2}]

{other statement...}

*NEXT

And here is an example.

*FOR %CURR%=USD,EURO 

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=%CURR%)

*NEXT

So what is the meaning of the example above?

1. We set a variable as %CURR%.

2. The %CURR% variable will be replaced with the two values USD and EURO.

3. The *REC statement includes the %CURR% variable.

4. Therefore, it will be translated into the two statements below, because the *REC statement exists inside of the *FOR - *NEXT statement.

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=USD) 

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=EURO) 

Let's assume the %CURR% variable has USD,EURO,CHF,KRW; then it will be translated as below.

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=USD) 

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=EURO) 

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=CHF) 

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=KRW) 

Someone may say, "we can use multiple lines of *REC statements".

Of course, that is correct for simple cases, but we need the *FOR - *NEXT statement because it can be used in a nested form.

For example,

*FOR %YEAR%=2003,2004,2005

*FOR %MONTH%=JAN,FEB,MAR,APR,MAY,JUN,JUL,AUG,SEP,OCT,NOV,DEC

*REC(FACTOR=GET(ACCOUNT="TOT.OVRHEAD",TIME="TOT.INP")/100,TIME="%YEAR%.%MONTH%")

  *NEXT

*NEXT 

If the user wrote plain *REC statements, he or she would have to write 36 statements instead of the simple *FOR - *NEXT statement above.

(NOTE: BPC NW supports nested FOR - NEXT from version 7.5.)

In addition, the user may use two variable sets like the example below.

*FOR %X%=1,2,3 AND %Y%=A,B,C 

*REC(FACTOR=1.5,TIME=%X%,CURRENCY=%Y%)

*NEXT

So the first variable set and the second variable set will be matched 1 to 1 and then replaced as in the example below.

*REC(FACTOR=1.5,TIME=1,CURRENCY=A)

*REC(FACTOR=1.5,TIME=2,CURRENCY=B)

*REC(FACTOR=1.5,TIME=3,CURRENCY=C)

What if the number of values does not match between the first and second variables?

If the first variable has fewer values than the second variable, the extra values of the second variable will be ignored.

If the first variable has more values than the second variable, the missing values of the second variable will be assumed to be null, so please be careful to match the number of values.
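The expansion and the 1-to-1 matching rules can be sketched in Python as follows. Treating a missing second-set value as an empty string is my reading of "assumed null" here, and the expander itself is a simplified stand-in for what the logic engine does:

```python
# Hypothetical sketch of how a *FOR - *NEXT template could be expanded,
# including the 1-to-1 matching rule for two variable sets.
from itertools import zip_longest

def expand_for(template, var1, set1, var2=None, set2=None):
    """Replace %VAR% placeholders with each value pair, like *FOR ... AND ... *NEXT."""
    lines = []
    if var2 is None:
        pairs = [(v, None) for v in set1]
    else:
        # extra values in set2 are dropped; missing ones become empty ("null")
        pairs = list(zip_longest(set1, set2, fillvalue=""))[: len(set1)]
    for v1, v2 in pairs:
        line = template.replace(var1, v1)
        if var2 is not None:
            line = line.replace(var2, v2)
        lines.append(line)
    return lines

expanded = expand_for(
    "*REC(FACTOR=1.5,TIME=%X%,CURRENCY=%Y%)",
    "%X%", ["1", "2", "3"],
    "%Y%", ["A", "B", "C"],
)
for line in expanded:
    print(line)
```

With unequal sets, e.g. `["1", "2", "3"]` against `["A"]`, the sketch yields empty CURRENCY values for the unmatched entries, which mirrors the caution above.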

The last thing about *FOR - *NEXT is using a data set variable as the value list.

Users can use a data set like %TIME_SET% instead of specifying all time members.

This is very useful when we use script logic with a dynamic data set.

For example, we can use

*FOR %MYTIME%=%TIME_SET% 

instead of 

*FOR %MYTIME%=2003.JAN,2004.JAN,2005.JAN

Therefore, users can execute script logic dynamically based on the passed data sets.

We will look at how to use the GET and FLD functions and memory variables in the next post, as the last topic.

Loading transactional data from any infocube to BPC Application in BPC7NW 

Posted by Pravin Datar in pravin.datar on Apr 16, 2009 6:54:44 PM

Business Planning and Consolidation version for Netweaver successfully leverages the Netweaver infrastructure, and we can use the data stored in infocubes in the Enterprise Data Warehouse in BPC7NW. In this blog we will discuss the tools and techniques we can use to get the transactional data stored in any infocube in BW into a BPC Application using Data Manager in BPC7NW.

Options available for cube to cube data load: 

There are many options available to the users of BPC7NW to get transactional data from a BW

cube to BPC Application. Here are some of them:

1.  We can export the transactional data from any infocube to a flat file and then use Data Manager in BPC7NW to load that data from the flat file into the BPC Application. The advantage of this option is that it is relatively simple to administer and very flexible, since we are dealing with a flat file when it comes to importing data into the BPC Application. The limitation of this option is that we are not really leveraging the BW platform. If we had to load data from any other non-BW source, we could use exactly the same approach - export data from that external data source to a flat file and then import that flat file into BPC. So in essence, in this option we are treating the BW infocube as any other external data source. There is another ostensible limitation - that this process cannot be automated. If we have to get a flat file export from an infocube and then import that flat file into BPC, then it may appear that there has to be a handoff from the flat file export to the import of the flat file. However, we can overcome this limitation, if we want to, by creating a custom process chain that in turn

executes the process chain to import the flat file data into the BPC Application. You can read more about that at Running BPC process chains within non-BPC process chains.

2.  We can leverage BW Extraction, Transformation and Loading (ETL) tools to transfer

transactional data from any infocube to the infocube used by BPC Application. BPC

Application in BPC7NW generates its own infocube in its namespace. We can use

standard BW ETL techniques to transform and load the data to the cube used by the BPC

Application. (You can read more about BPC namespace at A reservation of a different

kind – why, what and how of BPC namespace). This option is also a valid one: it indeed leverages the BW platform, and it can very well be automated if we desire to do so. Further, this option can handle deltas very efficiently since we can use the delta

handling mechanisms in BW. Lastly, using the ETL toolset is a proven, scalable and very

robust technique to handle data within BW. However this option is also beset with

several limitations. Firstly, the infocube for the BPC Application will need to be switched to loading mode while loading it using ETL and then switched back to planning mode after loading is complete. This means that during that time, it will not be available for planning through the BPC front end. So the ETL load has to be coordinated with the BPC users

to avoid any potential mishaps like the automated process switching the BPC cube to

loading mode when a planner is in the middle of updating his/her plan. Secondly, in

BPC7NW, we can maintain validations to validate the data written to BPC Application so

that invalid records are not written and data integrity is maintained. If we use BW ETL,

it will bypass this validation mechanism completely and there is a risk of invalid records

being written to BPC application. The validation mechanism will not check the records

that have already been written. Thirdly, the BPC audit logs will not be updated if we use

BW ETL since it won't invoke any BPC audit functionality. Fourthly, the data will always

be additive - for example if we have 10 in the cube and then write 100, the result will

always be 110. This is just a consideration rather than a limitation. Finally, if and when

the BPC user executes ‘full optimize' for the BPC Application, BPC may end up

generating a totally new cube which is more optimized than the previous one. In that

case, all the ETL work that was done for the previous cube will be dropped from the new

cube. At present, the ‘full optimize' option does not automatically inherit the ETL

configuration done on the earlier BPC cube. The building blocks for the ETL, like the datasource, infosource etc., will still be there in BW, but they would need to be linked again to the new cube.

3.  This brings us to the third option which we will discuss in much more detail. This option

is to use the Data Manager in BPC7NW to load transactional data from any Infoprovider

in BW to the BPC Application. This option overcomes almost all the limitations

enumerated above since this is executed from within BPC itself. This can very well be

scheduled from within Data Manager or by invoking the Data Manager process chain

from a custom process chain. So let us see how exactly we should go about realizing

this.

Creating transformation file: 

Before we run the data manager package to load data, we should prepare the transformation

and conversion files. An example of a transformation file that we can use to load data from

other infoproviders is given below.

Please note the mapping section. The dimension names in the BPC Application are mapped to

the corresponding characteristics from the Infoprovider from where we want to load the data.

Please note that as far as the BPC dimension names are concerned, we are using the dimension

names (and not the technical names of the BW characteristics corresponding to those

dimensions) whereas when we map them to the Infoprovider characteristics, we must use the

technical names of the BW characteristics. For example, the TIME dimension in BPC is mapped

to 0FISCPER characteristic in BW. Here TIME is the BPC Dimension name whereas 0FISCPER is

the technical name of the BW characteristic in the source Infoprovider. Also please note the

mapping for AMOUNT. The AMOUNT is mapped to the technical name of the key figure in the source Infoprovider. So in this case, ZMAOUNT01 is the technical name of the key figure in the source Infoprovider.

In this regard, please note that the source BW Infoprovider can have multiple key figures. If so,

we can choose only one key figure in a transformation file since our BPC cube has only one key

figure. If for any reason, you have a situation where you have to get data from two or more key

figures, you can use multiple transformation files and run the data manager package multiple

times with each transformation file. However in that case, please note that the data from all

those key figures will end up in the same single key figure in the BPC cube.

What if there is no corresponding characteristic in the source Infoprovider for a dimension in

BPC? What if we want to have a single default value in the data load regardless of what exists in

the source Infoprovider? Well, in that case, we can use the keyword *NEWCOL in the

transformation file for the corresponding BPC dimension. Please see the example of the

transformation file below.

Here for the category dimension, we are forcing the default value of FORECAST in the data load.

Similarly for the dimension P_DATASRC, even if there is no corresponding BW characteristic in

the source Infoprovider, it is OK - we can always pass the default value through our

transformation file.

What if we want to load only a subset of the data from the source Infoprovider? In that case,

we can enter our selection in the transformation file. Please see the example of the following

transformation file:

Please note that in the OPTIONS section, we have entered a selection to select data from only two characteristic values, C2000 and C1000, of the characteristic ZACCT01. Hence we are selecting only a subset of the source Infoprovider. You can enter selections from multiple characteristics here and thus load data from a specific data slice in the source Infoprovider.

Creating conversion file: 

In addition to the transformation file, we can have conversion files and refer them in the

transformation files. This is necessary if the master data values of the BPC dimensions in the BPC Application (dimension members) and the characteristic values in the source Infoprovider are different. We can refer to those conversion files in the transformation file as shown below:

Here, the conversions for TIME dimension are read from the file ZBICGR01TIME.XLS. That

conversion file can have the mapping between internal (BPC) time dimension members and

external (0FISCPER) values. That conversion file is shown below:

Here ? is used as a wildcard so that we don't have to write a conversion for each year. This conversion file can work for any year. Another example, a conversion file for account, is shown below:

In addition to writing such conversion files for BPC dimensions, we can write the conversion for

AMOUNT if necessary. For example, if we want to change the data coming over for a particular

dimension member during the load, we can write the formula in the conversion file for

AMOUNT as shown below:

So in this case, the data for account CE0004220 will be increased by 10% during the load.
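To make the wildcard idea concrete, here is a small Python sketch of how a '?'-based conversion rule might be applied. The '2011001'-style external 0FISCPER layout and the three sample rules are assumptions for illustration only; in reality the Data Manager applies the conversion file, not code like this:

```python
# Hypothetical sketch of a '?' wildcard conversion rule like the TIME mapping above.
# Assumption: external 0FISCPER values look like "2011001" (year + 3-digit period).
import re

# EXTERNAL pattern -> INTERNAL pattern; each '?' matches a single character, and the
# matched characters fill the '?'s of the internal pattern in order, so one rule
# per period covers every year.
rules = {"????001": "????.JAN", "????002": "????.FEB", "????003": "????.MAR"}

def convert_time(external):
    for ext_pat, int_pat in rules.items():
        m = re.match("^" + ext_pat.replace("?", "(.)") + "$", external)
        if m:
            out, chars = "", list(m.groups())
            for ch in int_pat:
                out += chars.pop(0) if ch == "?" else ch
            return out
    return None  # no rule matched

print(convert_time("2011001"))
```

With this single set of rules, "2011001" and "2012001" both map to the corresponding ".JAN" members without a per-year entry, which is exactly the benefit of the wildcard.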

Validating the transformation file: 

After we prepare the transformation and conversion files, the next step would be to get the

transformation validated with the source Infoprovider. BPC7NW gives additional options for

validation as shown below:

We should select the option 'transaction data from infocube' and enter the technical name of the Infoprovider. Please note that though the option here reads 'transaction data from infocube', it works with a DSO in the same way.

The validation then validates our transformation and conversion files against the data in the source Infoprovider and gives us the result as shown below:

Running the data manager package: 

Once we have our transformation files validated, we are in good shape to run our data

manager package to load the data from the source Infoprovider. There is a delivered process

chain and data manager package to load data from Infoprovider as shown below:

If we run this package, we can enter the technical name of the source Infoprovider and the

transformation file and schedule the load. Upon completion, we can get the success message in

the data manager package status as shown below:

Loading data from other appsets: 

So far we have seen how we can use the data manager features to load data from any

Infoprovider in BW to BPC Application. This raises another question. Can we use the same

technique to load data from one BPC application to another BPC application in the same Appset

or for that matter from any application in any other Appset within BPC7NW? The answer is an

emphatic ‘yes'. We can treat the BPC application in the same or other appset as just another

BW cube with characteristics and a key figure. The only consideration in using this approach to load data from applications in other appsets is that we have to maintain the /CPMB namespace technical names of the characteristics of those dimensions in the transformation file against the dimension names of the target application, and while running the package, we have

to enter the /CPMB namespace technical name of the infocube pertinent to the source BPC

application. An example of the transformation file used for such data transfer is shown below.

Thus we can see that we can effectively leverage the Netweaver platform to transfer the

transactional data from one infocube to another using Data Manager package.