
Upload: vijay-chandra

Post on 07-Apr-2018



LO STEPS

• Go to transaction code RSA3 and check whether any data is available for your DataSource. If data is there in RSA3, go to transaction code LBWG (Delete Setup Data) and delete the data by entering the application name.

• Go to transaction SBIW --> Settings for Application-Specific DataSources --> Logistics --> Managing Extract Structures --> Initialization --> Filling in the Setup Table --> Application-Specific Setup of Statistical Data --> Perform Setup (relevant application).

• In OLI*** (for example, OLI7BW for the statistical setup of old documents: orders), give the name of the run and execute. All the available records from R/3 are now loaded into the setup tables.

• Go to transaction RSA3 and check the data.

• Go to transaction LBWE and make sure the update mode for the corresponding DataSource is serialized V3 update.

• Go to the BW system, create an InfoPackage, and under the Update tab select Initialize Delta Process. Schedule the package. All the data available in the setup tables is now loaded into the data target.

• For the delta records, go to LBWE in R/3 and change the update mode for the corresponding DataSource to direct/queued delta. Records then bypass SM13 and go directly to RSA7. Go to transaction code RSA7; there you can see a green light. As soon as new records are added, you can see them in RSA7.

• Go to the BW system and create a new InfoPackage for delta loads. Double-click on the new InfoPackage; under the Update tab you can see the Delta Update radio button.

• Now you can go to your data target and see the delta records.
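The flow above can be sketched as a toy model. None of the class or method names below are SAP APIs; they simply mimic the behaviour of the setup tables, the init load, and the RSA7 delta queue:

```python
# Toy model of LO extraction: an init load reads the setup tables,
# and subsequent document postings land in a delta queue (RSA7-like).
# Illustrative only -- these classes are not SAP APIs.

class LoDataSource:
    def __init__(self):
        self.setup_table = []   # filled by the OLI*BW setup runs
        self.delta_queue = []   # RSA7-like queue (direct/queued delta)
        self.delta_active = False

    def fill_setup_table(self, documents):
        """Statistical setup run: copy historical documents."""
        self.setup_table.extend(documents)

    def init_delta(self):
        """Init InfoPackage: load the setup table, activate the delta."""
        loaded = list(self.setup_table)
        self.delta_active = True
        return loaded

    def post_document(self, doc):
        """A new posting goes straight to the delta queue once delta is active."""
        if self.delta_active:
            self.delta_queue.append(doc)

    def delta_load(self):
        """Delta InfoPackage: drain the queue."""
        loaded, self.delta_queue = self.delta_queue, []
        return loaded

ds = LoDataSource()
ds.fill_setup_table(["order1", "order2"])
target = ds.init_delta()          # 2 historical records via init load
ds.post_document("order3")        # new posting -> delta queue
target += ds.delta_load()         # delta load pulls order3
```

After the init load, only the newly posted document travels through the queue, which is the point of the delta mechanism.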

INFO CUBE MODELING

InfoCube

Definition

An object that can function as both a data target and an InfoProvider.

From a reporting point of view, an InfoCube describes a self-contained dataset, for example, of a business-oriented area. This dataset can be evaluated in a BEx query.

An InfoCube is a set of relational tables arranged according to the star schema: a large fact table in the middle, surrounded by several dimension tables.
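As an illustration of the star schema idea (the table contents here are invented), a query joins the central fact table to a dimension table and aggregates:

```python
# Minimal star schema: one fact table whose rows point at dimension
# tables via keys. Illustrative only -- not BW's actual table layout.
dim_customer = {1: "ACME", 2: "Globex"}
dim_material = {10: "Bolt", 20: "Nut"}

# fact rows: (customer_key, material_key, revenue)
fact = [(1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0)]

def revenue_by_customer(fact_rows, customers):
    """Star-schema query: join fact rows to a dimension and aggregate."""
    totals = {}
    for cust_key, _mat_key, revenue in fact_rows:
        name = customers[cust_key]
        totals[name] = totals.get(name, 0.0) + revenue
    return totals
```

The fact table stays narrow (keys plus key figures), while descriptive attributes live in the dimension tables.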

Use

InfoCubes are supplied with data from one or more InfoSources or ODS objects (Basic InfoCube) or with data from a different system (RemoteCube, SAP RemoteCube, virtual InfoCube with services, transactional InfoCube).

Structure

There are various types of InfoCube:

1. Physical data stores:

Basic InfoCubes

Transactional InfoCubes

2. Virtual data stores:

RemoteCube

SAP RemoteCube

Virtual InfoCube with Services

Only Basic InfoCubes and transactional InfoCubes physically contain data in the database. Virtual InfoCubes are only logical views of a dataset. By definition, they are not data targets. However, the InfoCube type is of no importance from the reporting perspective, since an InfoCube is accessed as an InfoProvider.

Integration

You can access the characteristics and key figures defined for an InfoCube in the Query Definition in the BEx Web or in the BEx Analyzer.

Basic InfoCube

Definition

A Basic InfoCube is a type of InfoCube that physically stores data. It is filled with data using BW Staging. Afterwards, it can be used as an InfoProvider in BEx Reporting.

Structure

As with other InfoCube types, the structure of a Basic InfoCube corresponds to the star schema.

For more information, see InfoCube.

Integration

The Basic InfoCube is filled using the Scheduler, provided that update rules have been maintained. It is then made available to reporting as an InfoProvider. It can also be updated into additional data targets or form a MultiProvider together with other data targets.

Transactional InfoCubes

Definition

Transactional InfoCubes differ from Basic InfoCubes in their ability to support parallel write accesses. Basic InfoCubes are technically optimized for read access at the expense of write access.

Use

Transactional InfoCubes are used in connection with the entry of planning data; see also Overview of Planning with BW-BPS. The data from this kind of InfoCube is accessed transactionally, meaning data is written to the InfoCube (possibly by several users at the same time). Basic InfoCubes are not suitable for this; use them for read-only access (for example, when reading reference data).

Structure

Transactional InfoCubes can be filled with data using two different methods: using the BW-BPS transactions to enter planning data, or using BW Staging; in the latter case, planning data cannot be loaded at the same time.

You have the option to convert a transactional InfoCube: select Convert Transactional InfoCube in the context menu of your transactional InfoCube in the InfoProvider tree. By default, Transactional Cube Can Be Planned, Data Loading Not Permitted is selected. Switch this setting to Transactional Cube Can Be Loaded With Data; Planning Not Permitted if you want to fill the cube with data via BW Staging.

During the entry of planning data, the data is written to a data request of the transactional InfoCube. As soon as the number of records in a data request exceeds a threshold value, the request is closed and a rollup is carried out for this request in the defined aggregates (asynchronously). You can still roll up, define aggregates, collapse, and so on, as before.
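The request-closing behaviour described above can be sketched as follows (the threshold value and class names are illustrative, not SAP internals):

```python
# Sketch of the request handling in a transactional cube: planning
# records are written to an open request; once the record count
# exceeds a threshold, the request is closed (a rollup for it would
# then run asynchronously) and a new request is opened.
THRESHOLD = 3  # illustrative value, not the real SAP threshold

class TransactionalCube:
    def __init__(self):
        self.closed_requests = []  # requests already closed for rollup
        self.open_request = []     # request currently accepting writes

    def write(self, record):
        self.open_request.append(record)
        if len(self.open_request) > THRESHOLD:
            # close the full request; rollup would happen asynchronously
            self.closed_requests.append(self.open_request)
            self.open_request = []

cube = TransactionalCube()
for i in range(5):
    cube.write(i)
```

Several users writing in parallel would each append to the open request; closing it in bulk is what keeps the write path cheap.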

Depending on the database on which they are based, transactional InfoCubes differ from Basic InfoCubes in the way they are indexed and partitioned. For an Oracle DBMS this means, for example, no bitmap indexes for the fact table and no partitioning of the fact table (initiated by SAP BW) according to the package dimension.

Reduced read performance is accepted as a drawback of transactional InfoCubes in return for the option of parallel (transactional) writes and improved write performance.

Creating a transactional InfoCube

Select the Transactional indicator when creating a new (Basic) InfoCube in the Administrator Workbench.

Converting a Basic InfoCube into a transactional InfoCube

InfoCube conversion: Removing transaction data

If the Basic InfoCube already contains transaction data that you no longer need (for example, test data from the implementation phase of the system), proceed as follows:

1. In the InfoCube maintenance in the Administrator Workbench, choose InfoCube → Delete Data Content from the main menu. The transaction data is deleted and the InfoCube is set to "inactive".


2. Continue with the same procedure as for creating a transactional InfoCube.

InfoCube conversion: Retaining transaction data

If the Basic InfoCube already contains transaction data from production operation that you still need, proceed as follows:

Execute the ABAP report SAP_CONVERT_TO_TRANSACTIONAL, specifying the name of the corresponding InfoCube. For InfoCubes with more than 10,000 data records, you should schedule this report as a background job to avoid a potentially long runtime.

Integration

The following typical scenarios arise for the use of transactional InfoCubes in BW-BPS:

1st Scenario:

Actual data (read-only access) and planned data (read and write access) have to be held in different InfoCubes. Therefore, use a Basic InfoCube for actual data and a transactional InfoCube for planned data. Data integration is achieved using a multi-planning area that contains the areas assigned to the InfoCubes. Access to the two different InfoCubes is controlled here by the characteristic "Planning area", which is added automatically.

2nd Scenario:

In this scenario, the planned and actual data have to be together in one InfoCube. This is the case, for example, with special rolling forecast variants. Here you have to use a transactional InfoCube, since both read-only and write accesses take place. Data can then no longer be loaded directly into the InfoCube by means of an upload or import source. To be able to load data nevertheless, you have to make a copy of the transactional InfoCube that is identified as a Basic InfoCube, not as transactional. Data is loaded as usual there and subsequently updated to the transactional InfoCube.

RemoteCube

Definition

A RemoteCube is an InfoCube whose transaction data is not managed in the Business Information Warehouse but externally. Only the structure of the RemoteCube is defined in BW. The data is read for reporting using a BAPI from another system.

Use

Using a RemoteCube, you can report on data in external systems without having to physically store the transaction data in BW. You can, for example, include an external system from market data providers using a RemoteCube. This reduces the administrative work on the BW side and saves memory space.

Structure

When reporting using a RemoteCube, the Data Manager, instead of using a BasicCube filled with data, calls the RemoteCube BAPI and transfers the following parameters:

Selection

Characteristics

Key figures

As a result, the external system transfers the requested data to the OLAP processor.
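The delegation described above can be sketched like this; the reader function below is a plain stand-in for the RemoteCube BAPI, and all the data is invented:

```python
# Sketch of RemoteCube reporting: the data manager does not scan a
# local fact table but forwards selection, characteristics and key
# figures to an external reader. Illustrative only -- the function
# below merely stands in for the RemoteCube BAPI of a source system.

def external_reader(selection, characteristics, key_figures):
    """Stand-in for the source system behind the BAPI call."""
    data = [
        {"customer": "ACME", "region": "EU", "revenue": 100.0},
        {"customer": "Globex", "region": "US", "revenue": 75.0},
    ]
    # apply the selection, then project the requested fields
    rows = [r for r in data if all(r[k] == v for k, v in selection.items())]
    return [{k: r[k] for k in characteristics + key_figures} for r in rows]

def data_manager_query(selection, characteristics, key_figures, reader):
    """Instead of reading a basic cube, delegate to the remote reader."""
    return reader(selection, characteristics, key_figures)

rows = data_manager_query({"region": "EU"}, ["customer"], ["revenue"],
                          external_reader)
```

The BW side holds only the structure; the rows come back from the external system at query time.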

Integration

To report using a RemoteCube, you have to carry out the following steps:

1. In BW, create a source system for the external system that you want to use.

2. Define the required InfoObjects.

3. Load the master data: create a master data InfoSource for each characteristic, and load texts and attributes.

4. Define the RemoteCube.

5. Define the queries based on the RemoteCube.


SAP RemoteCube

Definition

An SAP RemoteCube is a RemoteCube that allows the definition of queries with direct access to transaction data in other SAP systems.

Use

Use SAP RemoteCubes if:

• You need very up-to-date data from an SAP source system

• You only access a small amount of data from time to time

• Only a few users execute queries simultaneously on the database

Do not use SAP RemoteCubes if:

• You request a large amount of data in the first query navigation step, and no appropriate aggregates are available in the source system

• A lot of users execute queries simultaneously

• You frequently access the same data

Structure

SAP RemoteCubes are defined based on an InfoSource with flexible updating. They copy the characteristics and key figures of the InfoSource. Master data and hierarchies are not read directly in the source system; they are already replicated in BW when you execute a query.

The transaction data is called in the source system during the execution of a query. During this process, the selections are passed on to the InfoObjects if the transformation is only a simple mapping of the InfoObject. If you have specified a constant in the transfer rules, the data is transferred only if this constant can be fulfilled. With more complex transformations such as routines or formulas, the selections cannot be transferred, and it takes longer to read the data in the source system because the amount of data is not limited. To prevent this, you can create an inversion routine for every transfer routine. Inversion is not possible with formulas, which is why SAP recommends that you use routines instead of formulas.
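The difference between a simple mapping and a routine can be illustrated with a toy transfer rule. The field name KUNNR and the zero-padding logic are assumptions made for the example, not taken from the text:

```python
# Sketch of selection pushdown in transfer rules. With a plain 1:1
# mapping, a query selection on the InfoObject translates directly
# back into a source-field selection. A routine transforms the value,
# so an explicit, hand-written inversion routine is needed to turn
# the query selection back into something the source understands.
# Field name KUNNR and the padding logic are illustrative assumptions.

def rule_mapping(src):
    """Simple mapping: selections can be pushed down unchanged."""
    return src["KUNNR"]

def rule_routine(src):
    """Transfer routine: e.g. strips leading zeros from the customer."""
    return src["KUNNR"].lstrip("0")

def invert_routine(selected_value):
    """Inversion routine: rebuild the source-field selection."""
    return {"KUNNR": selected_value.rjust(10, "0")}

# A query selects customer '4711'; the inversion turns this back into
# a source selection, so the source system does not read all data.
source_selection = invert_routine("4711")
```

Without the inversion, the selection could not be transferred and the source system would have to read the full dataset.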

Integration

To be assigned to an SAP RemoteCube, a source system must meet the following requirements:

• BW Service API functions (contained in the SAP R/3 Plug-In) are installed.

• The release status of the source system is at least 4.0B.

• In BW, a source system ID has been created for the source system.

DataSources from the source system that are released for direct access are assigned to the InfoSource of the SAP RemoteCube. There are active transfer rules for these combinations.

Virtual InfoCubes with Services

Definition

A virtual InfoCube with services is an InfoCube that does not physically store its own data in BW. The data source is a user-defined function module. You have a number of options for defining the properties of the data source more precisely. Depending on these properties, the Data Manager provides services to convert the parameters and data.

Use

You use a virtual InfoCube with services if you want to display data from non-BW data sources in BW without having to copy the data set into the BW structures. The data can be local or remote. You can also use your own calculations to change the data before it is passed to the OLAP processor.

This function is used primarily in the SAP Strategic Enterprise Management (SEM) application.

In comparison to the RemoteCube, the virtual InfoCube with services is more generic. It offers more flexibility, but also requires more implementation effort.

Structure

When you create an InfoCube, you can specify its type. If you choose Virtual InfoCube with Services as the type for your InfoCube, an extra Detail pushbutton appears on the interface. This pushbutton opens an additional dialog box, in which you define the services.


1. Enter the name of the function module that you want to use as the data source for the virtual InfoCube. There are different default variants for the interface of this function module. A method for determining the correct variant, together with a description of the interfaces, is given at the end of this documentation.

2. The next step is to select options for converting/simplifying the selection conditions. You do this by selecting the Convert Restrictions option. These conversions only change the transfer table in the user-defined function module. The result of the query is not changed, because the restrictions that are not processed by the function module are checked later in the OLAP processor.

Options:

No restrictions: If this option is selected, no restrictions are passed to the InfoCube.

Only global restrictions: If this option is selected, only global restrictions (FEMS = 0) are passed to the function module. Other restrictions (FEMS > 0) that are created, for example, by setting restrictions on columns in queries, are deleted.

Simplify selections: This option is not yet implemented.

Expand hierarchy restrictions: If this option is selected, restrictions on hierarchy nodes are converted into the corresponding restrictions on the characteristic values.

3. Pack RFC: This option packs the parameter tables in BAPI format before the function module is called, and unpacks the data table that the function module returns after the call is performed. Since this option is only useful in conjunction with a remote function call, if you select it you also have to define a logical system that is used to determine the target system for the remote function call.

4. SID support: If the data source of the function module can process SIDs, you should select this option. If this is not possible, the characteristic values are read from the data source and the Data Manager determines the SIDs dynamically. In this case, wherever possible, restrictions that are applied to SID values are converted automatically into the corresponding restrictions for the characteristic values.

5. With navigation attributes: If this option is selected, navigation attributes and restrictions applied to navigation attributes are passed to the function module.

If this option is not selected, the navigation attributes are read in the Data Manager once the user-defined function module has been executed. In this case, you need to have selected the characteristics that correspond to these attributes in the query. Restrictions applied to the navigation attributes are not passed to the function module in this case.

6. Internal format (key figures): In SAP systems, a separate internal format is often used to store currency key figures. The value in this internal format differs from the correct value in that the decimal places are shifted. The currency tables are used to determine the correct value for this internal representation. If this option is selected, the OLAP processor incorporates this conversion during the calculation.
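The decimal shift described in point 6 can be illustrated with a toy conversion; the decimals table below is a small invented stand-in for the system's currency tables:

```python
# Sketch of the internal currency format: amounts are stored as if
# every currency had two decimal places; a currency table gives the
# real number of decimals, and the stored value is shifted to match.
# The table below is an illustrative stand-in, not the real one.
from decimal import Decimal

currency_decimals = {"EUR": 2, "JPY": 0, "KWD": 3}

def internal_to_external(amount, currency):
    """Shift the stored value to the currency's real decimal places."""
    shift = 2 - currency_decimals[currency]
    return amount.scaleb(shift)
```

For a zero-decimal currency such as JPY, the stored value 12.34 actually represents 1234; for a three-decimal currency the shift goes the other way.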

Dependencies

If you use a remote function call, SID support must be switched off and the hierarchy restrictions must be expanded.

Description of the interfaces for user-defined function modules

Variant 1:

Variant 2:

Additional parameters for variant 2 for transferring hierarchy restrictions, if they are not expanded:

With hierarchy restrictions, an entry with 'COMPOP' = 'HI' (for hierarchy) is created at the appropriate place in table I_T_RANGE (for FEMS 0) or I_TX_RANGETAB (for FEMS > 0), and the 'LOW' field contains a number that can be used to read the corresponding hierarchy restriction from table I_TSX_HIER, using field 'POSIT'. Table I_TSX_HIER has the following type:
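A sketch of this lookup, with the record layouts simplified to dictionaries (field contents are invented):

```python
# Sketch of resolving a hierarchy restriction: a range entry with
# COMPOP = 'HI' does not carry the restriction itself; its LOW field
# holds a number that points, via POSIT, into the hierarchy table
# (I_TSX_HIER in the text). Record layouts are simplified dicts and
# the contents are invented for illustration.

i_t_range = [
    {"CHANM": "0REGION", "COMPOP": "EQ", "LOW": "EU"},
    {"CHANM": "0COSTCENTER", "COMPOP": "HI", "LOW": "1"},
]

i_tsx_hier = [
    {"POSIT": "1", "HIENM": "CC_STANDARD", "NODE": "DEPT_A"},
]

def resolve_hierarchy_restrictions(ranges, hier_table):
    """Replace 'HI' entries with the hierarchy restriction they point to."""
    hier_by_posit = {h["POSIT"]: h for h in hier_table}
    resolved = []
    for entry in ranges:
        if entry["COMPOP"] == "HI":
            resolved.append(hier_by_posit[entry["LOW"]])
        else:
            resolved.append(entry)
    return resolved

restrictions = resolve_hierarchy_restrictions(i_t_range, i_tsx_hier)
```

Ordinary range entries pass through unchanged; only the 'HI' placeholders are dereferenced into the hierarchy table.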

Variant 3:

SAP advises against using this interface. The interface is intended for internal use only, and only half of it is given here. Note that SAP may change the structures used in the interface.

Method for determining the correct variant for the interface

The following list describes the procedure for determining the correct interface for the user-defined function module. Go through the list from top to bottom; the first appropriate case is the variant that you should use:

If Pack RFC is activated: Variant 1

If SID Support is deactivated: Variant 2


http://wiki.sdn.sap.com/wiki/display/BI/About+DSO+types+in+BI

DataStore object types:

Standard DataStore object

• Data provided using a data transfer process

• SID values can be generated

• Data records with the same key are aggregated during activation

• Data is available for reporting after activation

Write-optimized DataStore object

• Data provided using a data transfer process

• SID values cannot be generated

• Records with the same key are not aggregated

• Data is available for reporting immediately after it is loaded

DataStore object for direct update

• Data provided using APIs

• SIDs cannot be generated

• Records with the same key are not aggregated

A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level. This data can be evaluated using a BEx query.

A DataStore object contains key fields (such as document number, document item) and data fields that, in addition to key figures, can also contain character fields (such as order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes (standard) and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems.

Unlike multidimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. The system does not create fact tables or dimension tables.
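The activation behaviour of a standard DataStore object, as described above, can be sketched as a toy model (class and field names are illustrative; real DSOs also support additive key figures, which this sketch omits):

```python
# Toy model of a standard DSO: newly loaded records wait in an
# activation queue; activation merges them by key into the flat
# active table and records before/after images in a change log.
# Simplified -- this sketch only overwrites by key.

class StandardDSO:
    def __init__(self):
        self.activation_queue = []
        self.active = {}        # key -> data record (flat table)
        self.change_log = []

    def load(self, key, data):
        """A data transfer process writes into the activation queue."""
        self.activation_queue.append((key, data))

    def activate(self):
        """Merge the queue into the active table, logging the delta."""
        for key, data in self.activation_queue:
            before = self.active.get(key)
            if before is not None:
                self.change_log.append(("before", key, before))
            self.change_log.append(("after", key, data))
            self.active[key] = data
        self.activation_queue = []

dso = StandardDSO()
dso.load("doc1", {"status": "open"})
dso.activate()
dso.load("doc1", {"status": "closed"})
dso.activate()
```

The change log is what later feeds the delta update into connected InfoProviders: it records both the old and the new image of each changed key.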


Use

Overview of DataStore Object Types

• Standard DataStore object: consists of three tables (activation queue, table of active data, change log); data is supplied from a data transfer process; SIDs are generated. More information: Standard DataStore Object, Operational Scenario for Standard DataStore Objects.

• Write-optimized DataStore object: consists of the table of active data only; data is supplied from a data transfer process; no SIDs are generated. More information: Write-Optimized DataStore Object, Operational Scenario for Write-Optimized DataStore Objects.

• DataStore object for direct update: consists of the table of active data only; data is supplied from APIs; no SIDs are generated. More information: DataStore Objects for Direct Update, Operational Scenario for DataStore Objects for Direct Update.

You can find more information about the management and further processing of DataStore objects under:

Managing DataStore Objects

Further Processing of Data in DataStore Objects

Integration

 You can find out more about integration under Integration into the Data Flow.

Management of DataStore Objects

Features

The DataStore object is displayed in the top table. You only have to select a DataStore object from the DataStore objects available if you called DataStore object administration from the monitor.

In the top toolbar, choose Contents to display the contents of the table of active data for the DataStore object you have selected. With Delete Contents, you can delete the contents of the DataStore object. You can also display an application log and a process overview.

Tab Page: Contents

You can display the content of the change log table, the newly loaded data table (activation queue), or the active data (A table). You can also selectively delete requests from the DataStore object.

More information: DataStore Object Content

Tab Page: Requests

This tab page provides information about all requests that have run in the DataStore object. You can also delete requests here or schedule an update.

More information: Requests in DataStore Objects

Tab Page: Reconstruction

You use this function to fill a DataStore object with requests that have already been loaded into the BI system or into another DataStore object. This function is only necessary for DataStore objects that obtained their data from InfoPackages.

More information: Reconstruction of DataStore Objects

Tab Page: Archiving

If you have created a data archiving process, you see this additional tab page.

More information: Administration of Data Archiving Processes

Automatic Further Processing

If you still use a 3.x InfoPackage to load data, you can activate several automatisms to further process the data in the DataStore object. However, if you use the data transfer process and process chains, as SAP recommends, you cannot use these automatisms.

We recommend that you always use process chains.

More information: Including DataStore Objects in Process Chains

If you choose the main menu path Environment → Automatic Request Processing, you can specify that the system automatically sets the quality status of the data to OK after the data has been loaded into the DataStore object. You can also activate and update DataStore data automatically. Data that has the quality status OK is transferred from the activation queue into the table of active data, and the change log is updated. The data is then updated to other InfoProviders.

You can also make these settings when you create DataStore objects. More information: DataStore Object Settings

Only switch on automatic activation and automatic update if you are sure that these processes do not overlap. More information: Functional Constraints of Processes


Delete Change Log Data

You can delete data from the change log. To do this, choose Environment → Delete Change Log Data in the main menu. Continue as described under Deleting Requests from the PSA.

For more information about deleting from the change log, see Deleting from the Change Log.

Deleting from the Change Log 

Use

We recommend that you delete data from the change log of a DataStore object if several requests that are no longer needed for the delta update, and are no longer used for an initialization from the change log, have already been loaded into the DataStore object. If a delta initialization is available for updates to connected InfoProviders, requests have to be updated before the corresponding data can be deleted from the change log. In this way a temporary, limited history is retained. In some cases the change log becomes so large that it is advisable to reduce the volume of data and delete data from a specific time period.

If you have deleted requests from the change log, and the requests can still be seen on the Requests tab of the DataStore object administration screen, these requests cannot be deleted again on the DataStore object administration screen. This is because the requests have already been deleted from the change log, which means that it is no longer possible to perform a rollback.

Procedure

Inserting the deletion of requests from the change log into the process chain

You are in the plan view of the process chain where you want to insert the process variant.

1. To insert a process variant for deleting requests from the change log into the process chain, double-click the process type Deletion of Requests from the Change Log in process category Further BI Processes.

2. In the next dialog box, enter a name for the process variant and choose Create.

3. On the next screen, enter a description for the process variant and choose Continue. The maintenance screen for the process variant appears.

4. Under Type of Object, select Change Log Table.

5. Under Name of Object, select one or more DataStore objects for which requests are to be deleted from the relevant change log tables.


6. Specify the requests to be deleted by determining the days or dates (local time zone). You also have the option to specify whether you want to delete only successfully updated requests, and/or only incorrect requests that are no longer updated in an InfoProvider (this only applies to transaction data).

7. Save your entries and return to the previous screen.

8. On the next screen, confirm the insertion of the process variant into the process chain.

The plan view of the process chain appears. The process variant for deleting requests from the change log is included in your process chain.

Deleting requests from the change log in the administration of the DataStore object

1. In the main menu in DataStore object administration, choose Environment → Delete Change Log Data.

2. Specify the requests to be deleted by determining the days or dates (local time zone). You also have the option to specify whether you want to delete only successfully updated requests, and/or only incorrect requests that are no longer updated in an InfoProvider (this only applies to transaction data).

3. Define the start date values under Start Conditions.

4. Select Start (Schedule Deletion).

The deletion is scheduled in the background according to the selection conditions.

You can also display a request list for the change log. In this list, you can select requests and then delete them directly.
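The request selection described in step 2 can be sketched as follows (field names and dates are invented for the example):

```python
# Sketch of selecting change-log requests for deletion: requests
# older than a given number of days are picked, optionally only the
# successfully updated ones. Field names are illustrative, not the
# real request metadata.
from datetime import date, timedelta

requests = [
    {"id": 1, "loaded_on": date(2018, 1, 5), "updated_ok": True},
    {"id": 2, "loaded_on": date(2018, 3, 1), "updated_ok": False},
    {"id": 3, "loaded_on": date(2018, 4, 1), "updated_ok": True},
]

def requests_to_delete(reqs, older_than_days, today, only_updated=True):
    """Pick request IDs older than the cutoff, honoring the update flag."""
    cutoff = today - timedelta(days=older_than_days)
    return [r["id"] for r in reqs
            if r["loaded_on"] < cutoff
            and (r["updated_ok"] or not only_updated)]

ids = requests_to_delete(requests, 30, date(2018, 4, 7))
```

Restricting deletion to successfully updated requests mirrors the safeguard in the dialog: a request whose delta has not yet reached the connected InfoProviders should not lose its change-log images.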

Further Processing of Data in DataStore Objects 

Purpose

If you have loaded data into a DataStore object, you can use this DataStore object as the source for another InfoProvider. To do this, the data must be active. Use process chains to ensure that one process has ended before any subsequent processes are triggered.

More information: Process Chains and Including DataStore Objects in Process Chains

Process Flow

Process flow for updating DataStore object data:

1. Activating the DataStore object data: The data is in the activation queue. When you activate the data, the change log is filled with the data required for a delta update, and the data appears in the table of active data.


2. Updating the data to the connected InfoProviders: Using the transformation rules, the change log data (the delta) that has not yet been processed is updated to other InfoProviders. The data is already available in a cleansed and consolidated format.