
SAP BI:[Business Intelligence]

BW 3.5 (SAP NetWeaver 2004)
BI 7.0 (SAP NetWeaver 2004s)

Overview of BI 7.0:
----------------------------
BI - Business Intelligence. Up to BW 3.5 the concept was "Data Warehousing"; from BI 7.0 onwards the concept is "Business Intelligence".

New Features in BI 7.0

MODELING:
----------------------
- RSA1 - Enterprise Data Warehouse Workbench; RSA1OLD - Administrator Workbench.
- Info Source is not mandatory.
- PSA is the only transfer method by default. There is no direct PSA access, but PSA data can be viewed via Manage on the Data Source.
- TR and UR are replaced with Transformations. In Transformations we have the "END ROUTINE" and the "EXPERT ROUTINE".
- The Info Package is used to load data from the Source System to the PSA.
- The Standard DTP (Data Transfer Process) is used to load data from the PSA to the Data Targets.
- Error records are written to the Error Stack; we can edit them there and use an Error DTP to load them from the Error Stack into the Data Targets.
- ODS is now called DataStore Object. 3 types of DataStore Objects:
  1) Standard
  2) Write Optimized
  3) Direct Update

- The Info Cube creation screen has changed and now resembles the ODS creation screen of BW 3.5.
- Basic Cube - Standard Cube; Transactional Cube - Real Time Cube; Remote Cubes - Virtual Providers; SAP Remote Cube - Virtual Provider with Direct Access DTP.
- Info Cubes can also be included in Info Sets (a maximum of 2 Info Cubes per Info Set).
- Re-partitioning and Re-modeling.
- RDA - Real Time Data Acquisition.
- No event chains and no Info Package groups.
- Process Chains - a display mode is provided, and a process chain can be copied directly from the Process Chain menu in RSPC.
- Decision process type implemented with the Formula Editor.
- Integrated Planning (Excel based / Web based).
- Info Spokes can be built on a MultiProvider / Info Set.
- In a MultiProvider - "Aggregation Levels" (similar to planning levels in BPS); these aggregation levels are used for Integrated Planning built on a REAL-TIME CUBE.

REPORTING:
-----------------------
- The GUI of the Query Designer has changed.
- BEx Analyzer, Web Analyzer, Report Designer, Visual Composer.
- All the functionality of the Reporting Agent has moved to Information Broadcasting (old Reporting Agent T-code: REPORTING_AGENT).
- BI Accelerator is implemented with hardware components called Blades to improve query performance.
- Integrated Planning - similar to BPS, with more functionality.
- Data Archiving, Data Mining.
- Source system connections - UD Connect, Web Services.
- Ready flexibility for unit translations.
- APD - Analysis Process Designer.
- RS*ADMIN - authorizations.

New Data Flow:
-------------------------
- Info Source is not mandatory; by default: Data Source [PSA], IP loads to PSA, Standard DTP loads onwards
- Transformations: End Routine, Expert Routine
- Master Data loading, Remote Master Data
- Types of Info Cubes
- DSO & its types & Data Mart
- MultiProvider, Info Set
- Open Hub Destination
- Process Chains (changes)
- Migration of 3.x to BI 7.x
- DTP & its types
- RDA [Real Time Data Acquisition]
- Re-Modeling, Re-partitioning
- BI Statistics & Administration Cockpit

Reporting
------------------
- New changes in the BEx Query Designer
- BEx Analyzer - Design Tool Box & Analysis Tool Box
- WAD - new items & commands
- Report Designer
- Web Analyzer
- Information Broadcasting (changes)
- Visual Composer (iView) [Dashboards]
- Integrating BI objects with EP; integrating the BEx components with EP
- Upgrades
- DB Connect, Web Services

Performance Tuning
-----------------------------
- BIA (BI Accelerator)
- IP [Integrated Planning] (overview)
- BI Statistics
- Unit & Currency Conversion

Creating MD [ATTR & TEXT] & loading from a flat file with the BI 7 data flow:
--------------------------------------------------------------------------------------------------
Steps:
--------
1. Insert the characteristic as a Data Target (Info Provider)

2. Create the Data Sources [PSA] - one for ATTR & one for TEXT

3. Create the Transformations to connect DS & Master Data Objects

4. Create the Info package & schedule the load

5. Create the Standard DTP and start the load

How to create an Info Cube & load it from a flat file without an Info Source:
--------------------------------------------------------------------------------------------

Note:
-------
1. Info Source is not mandatory
2. TR & UR are replaced with "Transformations"
3. The Data Source acts as the PSA
4. The Info Package can be used to load data only to the PSA / Data Source (Full, Init, Delta)
5. The Standard DTP is used to load data from the Data Source to the Data Target (Full, Delta)
6. A Standard DTP cannot be scheduled in the background directly; we can only schedule the load in the background by using Process Chains

Pre-Requisites:
------------------
- The Info Cube must be active
- Flat file Source System connection

Steps:
--------
1. Create the Data Source with the flat file as the source

2. Create the Transformations to connect DS and IC

3. Create the Info package and load the data from SS to DS (PSA)

4. Create the Standard DTP to load the data from DS to Data Target

How to create an Info Cube & load it from a flat file with an Info Source:
----------------------------------------------------------------------------------------
Steps:
--------
1. Create the Info Source

2. Create the Data Source [PSA]

3. Create the Transformations to connect DS & IS

4. Create the Transformations to connect IS & Info Cube

5. Create the Info package & schedule the load

6. Create the Standard DTP and start the load

When do we use an Info Source:
--------------------------------------
- Multiple Data Sources feeding multiple Data Targets

- When we need to implement multiple levels of transformations

DTP [Data Transfer Process]
------------------------------------------

Why DTP?
----------------
- Error handling is made easy: valid records are not processed again (error handling with the Error Stack).
- Delta mechanism per data target (e.g. weekly, monthly, daily) - a different delta mechanism for each data target.
- We can implement a snapshot scenario with one level of the cube.
- Data is easy to handle at all levels because of transparency (data transparency).
- We can reduce the data flow (remove unnecessary processes).

4 types of DTPs
----------------------
1) Standard DTP
2) Error DTP
3) RDA DTP
4) Direct Access DTP

1) Standard DTP:
------------------
Purpose:
-------------
We use the Standard DTP to extract data from a Data Source and update it to the Data Targets [Info Cube, DSO, Info Objects converted to Info Providers].

Importance of the Standard DTP:
---------------------------------------
We can set up a different delta mechanism for each Data Target of a single Data Source, because the Standard DTP runs its delta based on the request. For example: daily delta to Cube 1, hourly delta to Cube 2, weekly delta to Cube 3, all from the same Data Source.

Properties of Standard DTP:-------------------------------------

i) Extraction Tab:
----------------------
a) Data Source (source object)

b) Extraction Mode:
   1) Delta: the delta mechanism is enabled based on the request number.
   2) Full: extracts all the data from the Data Source.


c) Only Get Delta Once - process the request only once, irrespective of whether the request has been deleted in the data target or not
d) Get Data Request by Request
e) Filter
f) Data Packet Size
g) Semantic Keys
h) Access data directly from the source (for small amounts of data)


ii) Update Tab (Data Target):
----------------------------------------

iii) Execute Tab:
--------------------
A Standard DTP cannot be scheduled directly in the background; we must use Process Chains to achieve this.

- Settings for Batch Manager:-------------------------------

- Temporary storage:------------------------

DTP Loading Process (for each data packet)
----------------------------------------------------------
Steps: Extraction -> Filtering -> Error Handling -> Transformations -> Updating the Data Target

1. Extraction of data from the source object based on the filter selections.
2. Error handling preparation - checks for error records; any error records found are written to the Error Stack.
3. The valid records are then processed through the Transformations.
4. The records are updated into the Data Target.

Transformations
------------------------
- New concept: the UR & TR of 3.x are replaced with Transformations.
- We use Transformations to perform all kinds of transformations on the data coming from the source system.

Types of Transformations:
------------------------------------

Example - field routine: look up the ZSUPPLIER attribute from the active table of DSO ZPO_ORD (/BIC/AZPO_ORD00) using YPONUMBER from the source record.

DATA : LV_SUPPLIER TYPE /BIC/AZPO_ORD00-/BIC/ZSUPPLIER.

* Read the supplier for the current PO number
SELECT SINGLE /BIC/ZSUPPLIER
  FROM /BIC/AZPO_ORD00
  INTO LV_SUPPLIER
  WHERE /BIC/YPONUMBER = SOURCE_FIELDS_RULE-/BIC/YPONUMBER.

IF SY-SUBRC = 0.
  RESULT = LV_SUPPLIER.
ENDIF.

Note: this SELECT runs once per record; Start Routine example 1 below buffers the lookup table instead.

ROUTINES
---------------
1) Start Routine:
----------------------
- Executed before the individual transformation rules, packet by packet.
- SOURCE_PACKAGE: an internal table without a header line; its structure is similar to the source object.
- The object-oriented concept of ABAP/4 is used, so we work with field symbols such as <source_fields>.

1) Buffering the lookup table in the Start Routine:

DATA : BEGIN OF WA_ZPO_ORD,
         /BIC/YPONUMBER TYPE /BIC/AZPO_ORD00-/BIC/YPONUMBER,
         /BIC/ZSUPPLIER TYPE /BIC/AZPO_ORD00-/BIC/ZSUPPLIER,
       END OF WA_ZPO_ORD.

DATA : IT_ZPO_ORD LIKE TABLE OF WA_ZPO_ORD.

* Fetch only the suppliers needed for this data packet
* (check that SOURCE_PACKAGE is not empty before FOR ALL ENTRIES)
SELECT /BIC/YPONUMBER /BIC/ZSUPPLIER
  FROM /BIC/AZPO_ORD00
  INTO TABLE IT_ZPO_ORD
  FOR ALL ENTRIES IN SOURCE_PACKAGE
  WHERE /BIC/YPONUMBER = SOURCE_PACKAGE-/BIC/YPONUMBER.

The corresponding field routine then reads from the buffered table instead of the database:

READ TABLE IT_ZPO_ORD INTO WA_ZPO_ORD
  WITH KEY /BIC/YPONUMBER = SOURCE_FIELDS_RULE-/BIC/YPONUMBER.

IF SY-SUBRC = 0.
  RESULT = WA_ZPO_ORD-/BIC/ZSUPPLIER.
ENDIF.

2) Deleting records packet by packet:

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  IF <source_fields>-/BIC/YCS_PRC < 10.
*   Delete the current row from the data packet
    DELETE SOURCE_PACKAGE.
  ENDIF.
ENDLOOP.

2) End Routine:
---------------------
- Executed after the individual transformation rules, packet by packet.
- RESULT_PACKAGE: an internal table without a header line; its structure is similar to the target object.
- The object-oriented concept of ABAP/4 is used, so we work with field symbols.

1)
LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
* Assumption: the operator is <>, i.e. keep only the HYDERABAD records
  IF <result_fields>-/BIC/YM_ADD <> 'HYDERABAD'.
    DELETE RESULT_PACKAGE.
  ENDIF.
ENDLOOP.

3) Expert Routine:
------------------------
- New in BI 7. The Expert Routine is a single routine that feeds data to all the target fields.
- When we implement an Expert Routine, we cannot implement a Start Routine, an End Routine or individual transformation rules.
- SOURCE_PACKAGE - similar to the structure of the source object; RESULT_PACKAGE - similar to the structure of the target object.
- We use the Expert Routine when we want to feed almost all the Info Objects of the target with one routine, to split records, or to de-normalize records [the 3.x Return Table].

When do we use the Expert Routine:
-----------------------------------------
- Case 1: When we need to implement a routine for most of the target fields (e.g. lookups that feed most of the Info Objects in the Data Target).
- Case 2: Return table behaviour - splitting one record into multiple records by de-normalizing the data.

How to implement the routine:
-------------------------------------
SOURCE_PACKAGE - source object structure
RESULT_PACKAGE - target object structure; RESULT_FIELDS - work area for one target record

Example 1:
**********
DATA : lv_week TYPE scal-week.

* Calculate the current week
CALL FUNCTION 'DATE_GET_WEEK'
  EXPORTING
    date = sy-datum
  IMPORTING
    week = lv_week.

* Loop through SOURCE_PACKAGE, split each source record (weekly demand
* columns WK1..WK5) into five records and append them to RESULT_PACKAGE.

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.

* FIRST RECORD
  RESULT_FIELDS-/BIC/YCO_AREA = <source_fields>-COAREA.
  RESULT_FIELDS-/BIC/YMNO = <source_fields>-MNO.
  IF <source_fields>-COAREA = 'A1'.
    RESULT_FIELDS-CALWEEK = LV_WEEK.
  ELSE.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 1.
  ENDIF.
  RESULT_FIELDS-/BIC/YCS_DMD = <source_fields>-WK1.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

* SECOND RECORD
  RESULT_FIELDS-/BIC/YCO_AREA = <source_fields>-COAREA.
  RESULT_FIELDS-/BIC/YMNO = <source_fields>-MNO.
  IF <source_fields>-COAREA = 'A1'.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 1.
  ELSE.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 2.
  ENDIF.
  RESULT_FIELDS-/BIC/YCS_DMD = <source_fields>-WK2.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

* THIRD RECORD
  RESULT_FIELDS-/BIC/YCO_AREA = <source_fields>-COAREA.
  RESULT_FIELDS-/BIC/YMNO = <source_fields>-MNO.
  IF <source_fields>-COAREA = 'A1'.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 2.
  ELSE.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 3.
  ENDIF.
  RESULT_FIELDS-/BIC/YCS_DMD = <source_fields>-WK3.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

* FOURTH RECORD
  RESULT_FIELDS-/BIC/YCO_AREA = <source_fields>-COAREA.
  RESULT_FIELDS-/BIC/YMNO = <source_fields>-MNO.
  IF <source_fields>-COAREA = 'A1'.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 3.
  ELSE.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 4.
  ENDIF.
  RESULT_FIELDS-/BIC/YCS_DMD = <source_fields>-WK4.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

* FIFTH RECORD
  RESULT_FIELDS-/BIC/YCO_AREA = <source_fields>-COAREA.
  RESULT_FIELDS-/BIC/YMNO = <source_fields>-MNO.
  IF <source_fields>-COAREA = 'A1'.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 4.
  ELSE.
    RESULT_FIELDS-CALWEEK = LV_WEEK + 5.
  ENDIF.
  RESULT_FIELDS-/BIC/YCS_DMD = <source_fields>-WK5.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

ENDLOOP.

Ex 2)
---------------
* One source record with four cost columns becomes four target records,
* one per key indicator (LC, MC, OC, RC).

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.

  RESULT_FIELDS-/BIC/ZCOAREA = <source_fields>-/BIC/ZCOAREA.
  RESULT_FIELDS-CALYEAR = <source_fields>-CALYEAR.
  RESULT_FIELDS-/BIC/ZKEYIND = 'LC'.
  RESULT_FIELDS-/BIC/ZCOST = <source_fields>-/BIC/ZLC.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

  RESULT_FIELDS-/BIC/ZCOAREA = <source_fields>-/BIC/ZCOAREA.
  RESULT_FIELDS-CALYEAR = <source_fields>-CALYEAR.
  RESULT_FIELDS-/BIC/ZKEYIND = 'MC'.
  RESULT_FIELDS-/BIC/ZCOST = <source_fields>-/BIC/ZMC.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

  RESULT_FIELDS-/BIC/ZCOAREA = <source_fields>-/BIC/ZCOAREA.
  RESULT_FIELDS-CALYEAR = <source_fields>-CALYEAR.
  RESULT_FIELDS-/BIC/ZKEYIND = 'OC'.
  RESULT_FIELDS-/BIC/ZCOST = <source_fields>-/BIC/ZOC.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

  RESULT_FIELDS-/BIC/ZCOAREA = <source_fields>-/BIC/ZCOAREA.
  RESULT_FIELDS-CALYEAR = <source_fields>-CALYEAR.
  RESULT_FIELDS-/BIC/ZKEYIND = 'RC'.
  RESULT_FIELDS-/BIC/ZCOST = <source_fields>-/BIC/ZRC.
  APPEND RESULT_FIELDS TO RESULT_PACKAGE.

ENDLOOP.

Ex 3)
-------------
DATA : v_mgrp TYPE /bic/pztmno-/BIC/ZTMGRP.
DATA : v_mng TYPE /bic/azdemo00-/BIC/ZTREGMNG.

LOOP AT source_package ASSIGNING <source_fields>.

  result_fields-/BIC/ZTCNO = <source_fields>-/BIC/ZTCNO.
* TOUPPER is unknown here (it is neither a field nor an available
* method), so convert to upper case with TRANSLATE instead.
  TRANSLATE result_fields-/BIC/ZTCNO TO UPPER CASE.
  result_fields-/BIC/ZTMNO = <source_fields>-/BIC/ZTMNO.
  result_fields-/BIC/ZTREGION = <source_fields>-/BIC/ZTREGION.

* Look up the group from the master data (P) table of ZTMNO
  SELECT SINGLE /BIC/ZTMGRP FROM /bic/pztmno INTO v_mgrp
    WHERE /bic/ztmno = <source_fields>-/bic/ztmno.
  IF sy-subrc = 0.
    result_fields-/BIC/ZTMGRP = v_mgrp.
  ENDIF.

* Look up the regional manager from the DSO active table
  SELECT SINGLE /BIC/ZTREGMNG FROM /bic/azdemo00 INTO v_mng
    WHERE /BIC/ZTREGION = <source_fields>-/BIC/ZTREGION.
  IF sy-subrc = 0.
    result_fields-/BIC/ZTREGMNG = v_mng.
  ENDIF.

  result_fields-/BIC/ZTPRC = <source_fields>-/BIC/ZTPRC.
  result_fields-/BIC/ZTQTY = <source_fields>-/BIC/ZTQTY.
  result_fields-/BIC/ZTREV = <source_fields>-/BIC/ZTPRC * <source_fields>-/BIC/ZTQTY.

  APPEND result_fields TO result_package.
ENDLOOP.

Pseudo code:
------------------
LOOP AT SOURCE_PACKAGE.
* do the necessary transformations for each record
* append the records to the table RESULT_PACKAGE
ENDLOOP.

Multiple Update Groups
-----------------------------------
- In a single Transformation we can have multiple update-group mappings; when we want to implement multiple sets of mappings within a single Transformation, we go for multiple update groups.
- Each update group represents one set of mappings; the characteristic mapping can be different for each key figure.
- Mainly used when we need to feed data from multiple fields into a single Info Object.
- During the load, the records are processed once per update group.

DSO: [DataStore Object]
----------------------------------
The ODS of 3.x is referred to as the DSO in BI 7.0.

3.x ODS:
------------
- Standard ODS
- Transactional ODS

Types of DSO in BI 7.0:
------------------------------
1) Standard DSO (Standard ODS)
2) Direct Update DSO (Transactional ODS) - APD
3) Write Optimized DSO ***** new, very important

1) Standard DSO:
-----------------------
- 3 tables: New Data, Active Data, Change Log.
- Change in the New Data table: instead of the request number, the SID of the request number is stored.
- Reporting - yes; Data Mart - yes [delta]; ETL supported - yes.
- Settings (new): SID generation upon activation; unique data records.

Structure:

2) Direct Update DSO:
----------------------------
- Similar to the Transactional ODS in 3.x.
- 1 table: Active Data table.
- Reporting - yes; ETL is not supported; no activation process; Data Mart supported - yes [no delta].

Usage:
--------
1. We can use a Direct Update DSO as a Data Target in an APD.
2. We can insert records into this DSO using a BAPI (see the sketch below).
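A minimal sketch of inserting into a Direct Update DSO from ABAP using the function module RSDRI_ODSO_INSERT. The DSO name ZDU_DSO, its active table /BIC/AZDU_DSO00, the field YPONO and the exact parameter names are assumptions for illustration; verify the actual interface in SE37 before use.

* Hypothetical DSO ZDU_DSO with active table /BIC/AZDU_DSO00 (assumption)
DATA: lt_data TYPE STANDARD TABLE OF /bic/azdu_dso00,
      ls_data LIKE LINE OF lt_data.

ls_data-/bic/ypono = '4500000001'.   "assumed key field
APPEND ls_data TO lt_data.

* Parameter names below are assumptions - check SE37 before use
CALL FUNCTION 'RSDRI_ODSO_INSERT'
  EXPORTING
    i_odsobject = 'ZDU_DSO'
    i_t_data    = lt_data
  EXCEPTIONS
    OTHERS      = 1.
IF sy-subrc <> 0.
* Handle the failed insert (e.g. raise a message)
ENDIF.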

3) Write Optimized DSO:
--------------------------------
- New in BI 7.0.
- 1 table: Active Data table.
- Structure: technical keys [Request No + Data Packet No + Record No] + semantic keys + data fields + record mode.
- ETL is supported - yes; no activation process; reporting is possible - yes; Data Mart is supported - yes [delta based on request].

Usages:
--------
- Overwrite functionality - Standard DSO
- Detailed level of data - Write Optimized DSO
- Staging - Write Optimized DSO
- BAPI or APD - Direct Update DSO

* RSAN_PROCESS_EXECUTE

APD (Analysis Process Designer)
---------------------------------------------
- Data Mining tool. T-code: RSANWB.
- 3 aspects: Source -> Transformations -> Data Targets.
- Source objects (Info Cube, DSO, MD, File, Table, BEx query) -> Transformations -> Target objects (Direct Update DSO or Master Data).

Source        Transformations    Data Target
---------     ---------------    -----------------
Info Cube     ROUTINE            MD
DSO           FORMULA            DIRECT UPDATE DSO
MD            SORTING
FF            AGGREGATING
TABLE         ABC ANALYSIS
QUERY

Case 1:
----------
- ZPO_001

Case 2:
----------
- Routine parameters: IT_SOURCE, IS_SOURCE, ET_TARGET, LS_TARGET

Scenario 1:-------------- To improve the performance of the query

Scenario 2:-------------- Demand Aggregation Ratio

Types of Cubes
---------------------
1. Standard Cube (Basic Cube)
2. Real Time Cube (Transactional Cube)
3. Based on Direct Access DTP (SAP Remote Cube)
4. Based on BAPI (General Remote Cube)
5. Based on FM (With Services)

1) Standard Cube:----------------------- Same as Basic Cube in 3.x

2) Real Time Cube:
-------------------------
- Same as the Transactional Cube in 3.x; used for BPS or IP.
- Aggregation Level - virtual.
- Plan Mode / Load Mode - 2 process types in Process Chains.

3) Info Cube based on Direct Access DTP:
----------------------------------------------------------------
- Used to access data remotely from an SAP source, DB Connect or a flat file.
- The Data Source must be enabled for Direct Access.
- Has the Display Data option (right-click on the cube).

How to implement?
-------------------------
Steps:
--------
1. Create a Generic Data Source - RSO2
2. Replicate the Data Source - RSA1
3. Migrate the Data Source from ISFS to RSDS
4. Create the Info Cube of type "DTP for Direct Access"
5. Create the Transformations connecting the DS & the Info Cube
6. Create the DTP of type Direct Access

4) Based on BAPI:
-------------------------
- Same as the General Remote Cube in 3.x; SAP or non-SAP applications via BAPI.

5) Based on FM:
---------------------
- Same as "With Services" in 3.x; used within SAP BI when we want to access data remotely using a function module. Example: 0FIGL_V10.

MultiProvider
---------------------
- Can be built on Info Cubes, DSOs (any type), Info Sets, Info Objects and Aggregation Levels.
- Display Data option available.

Info Set:
-----------
- Can be built on DSOs, Info Objects and Info Cubes (max. number of cubes: 2).
- Display Data and Adjust options are available.
- An Info Cube cannot be used as the left operand of a left outer join.
- An Info Set can also be used as a source object.

ZPO_001, ZWR_001, ZERROR

Open Hub Destination:
----------------------------
- Retraction

Case 1:--------- Info Cube to Flat file

Steps:
---------
1. Create the Destination
2. Create the Transformations to connect the source object and the Open Hub Destination
3. Create the Standard DTP and schedule the load

Problems solved from 3.x:
-----------------------------------
- Destinations can now be built on a MultiProvider / Info Set, and we can transform the data by using normal Transformations.

Re-modeling:------------------

Re-partitioning:
------------------
- Merging partitions
- Adding new partitions
- Completely new set of partitions

Analyzing BW objects (Repairs / Tests):
--------------------------------------------------
- RSRV
- Size of the cube

Authorizations:
----------------------
- RSECADMIN, SUIM, SU53

SDN.SAP.COM

E-LEARNING - BI Administration Cockpit

Migration:--------------

How to migrate the objects from 3.x to BI 7.x?
--------------------------------------------------------

Case 1: [With Routines]
-----------------------------
Steps:
---------
0. Copy all the routines into Notepad
1. Migrate the UR to Transformations
2. Migrate the TR to Transformations
3. Migrate the Data Source (With Export / Without Export; program RSDS_EXPORT, T-code RSDS)

Migrating the Data Source deletes the 3.x transfer rules, and the Info Package processing type is fixed / changed to "ONLY PSA".

4. Create the Standard DTP & Do necessary changes to corresponding Process chains

5. Migrating BEx query

open 3.x query in new query designer

6. Migrating BEx Workbook

open the 3.x workbook in new Analyzer

7. Migrating Web template

Case 2: [Without Routines]
-----------------------------------
Steps:
--------
1. Migrate the Data Source - With Export

Note:
* This will delete the existing transfer rules
* The Info Package settings will be changed: Processing - Only PSA; Data Targets will be disabled
* With Export & Without Export: RSDS_EXPORT / RSDS

2. Create the Transformation - DS & IC
3. Create the Standard DTP & make the necessary changes to the corresponding Process Chains
4. Migrate the BEx Query
5. Migrate the BEx Workbook
6. Migrate the Web Template

RDA [Real Time Data Acquisition]
--------------------------------------------

Steps:
--------
1. Create a Generic Data Source on the table, enabling it for RDA
2. Replicate the Data Source
3. Migrate the Data Source from ISFS to RSDS
4. Create the Standard DSO
5. Create the Transformations connecting the DS & the DSO
6. Create the Info Package and schedule the load with "INIT"
7. Create the Standard DTP & schedule the load
8. Create the IP with RDA and assign it to the daemon
9. Convert the Standard DTP to an RDA DTP
10. Create the daemon, schedule and assign - RSRDA

Points to be noted:
--------------------------
- The RDA DTP is supported only for Standard DSOs.
- We use this DTP to have real-time data available for reporting.
- RDA can be enabled only for Data Sources based on Myself, SAP connections, Web Services or BAPI.
- The standard Data Sources 2LIS_11_VAITM and 2LIS_03_BF are supported for RDA; other standard Data Sources can be enabled for RDA by customizing the table contents of ROOSOURCE.
- Limitation: it uses a lot of resources continuously (we may have to improve the hardware capabilities).
- Scheduling of the RDA IP and the RDA DTP is done via the daemon - RSRDA.

LIMITATION:
---------------------
- Resource constraints

BI Statistics:
-----------------
- The 3.x BW Statistics technical content is obsolete.
- New set of standard cubes, virtual cubes, MultiProviders & BEx queries.
- Maintaining the BI Statistics tables & views; technical content.
- BI Administration Cockpit.

BIA [BI Accelerator]:
---------------------------

- To improve query performance. BIA is a separate server, containing file storage & 64-bit Xeon blade servers.
- BIA Index - horizontally partitioned & vertically decomposed; indexes are built with dictionary-based integer coding.
- Query execution process; T-codes RSDDV, RSDDBIAMON2.
- Every table will have an index.
- Compressing the Info Cube reduces the database space occupied by the Info Cube on the BIA server.
- MD Index, TD Index; the MD Index is global for all the TD Indexes.
- Based on the throughput we decide on the number of blade servers - sizing.
- TREX search mechanism.

BEx Reporting
--------------------
Components:

1. BEx Query Designer
2. BEx Analyzer
3. BEx WAD
4. BEx Report Designer (new)
5. BEx Web Analyzer (new)

BEx
-------
1. Query Designer
2. BEx Analyzer

Web
--------
1. WAD
2. Report Designer
3. Web Analyzer

1) BEx Query Designer:
------------------------------
- New GUI built with the .NET framework.
- Understanding the default output of the query.
- Simple query.

Filter:
--------
- Default Values - can be changed at query runtime.
- Characteristic Restrictions - cannot be changed at query runtime.

Conditions:
---------------
- Along rows (most detailed characteristic in the rows)
- Along columns (most detailed characteristic in the columns)

Exceptions:
---------------
- The evaluated key figure & the displayed key figure can be different.
- Controlling the display of exceptions.

Properties:
---------------
- Key figure properties [Exception Aggregation]
- Unit conversion
- Level up & level down for key figures
- Variables
Note: Exception Aggregation can be nested.

2) BEx Analyzer:
---------------------
- 3.x Analyzer + BEx Design Toolbox.
- BEx Analysis Toolbox & Design Toolbar.
- Design Toolbox items (Excel-based dashboards): Analysis Grid, Text Elements, Buttons, Checkbox, Radio Buttons, Exceptions, Conditions, Navigation Block / Filter, Drop Down Box.

To design workbooks (formatted reports), use the Design Toolbox items and specify the BEx query as the data provider. A default workbook layout is available.

Functionality:
---------------------
- Drag & drop flexibility
- Add local formula
- Create condition
- Convert to formula

How to create BEx Workbooks?
--------------------------------------------
Case 1:
---------
- Understand the default workbook

Case 2:
---------
- Analysis Grid & Navigation Block / Filter & Chart & Text Elements

"Convert to Formula" replaces the grid with cell formulas such as:

= BExGetData("DP_1",H$4,$F15)

where "DP_1" is the data provider and the cell references identify the row and column members.

Case 3:--------- Drop Down Box & Analysis Grid & Check Box, Radio Button

Case 4:---------- Conditions & Exceptions

Case 5:--------- Insert Text

Case 6:--------- Command Button

Case 7:---------- workbook settings

3) BEx WAD [Web Application Designer]
--------------------------------------------------
Why? To design web templates by using web items. The procedure for creating web templates is still the same as in 3.x.

What is new in BI 7.x?
-----------------------------
- XHTML
- New set of web items (Tab Pages, Container, Container Layout, Button Group, Information Fields, Script, ...)
- URL commands are obsolete - no longer supported; they are replaced with commands provided via navigation options.

Case 1:[Real time Scenario]----------------------------------

Case 2:[Tab Pages]---------------------------

4) BEx Report Designer:
------------------------------
- Web reports
- Layout formatting

Case 1:--------- Formatting

Case 2:--------- Header / Footer Information Fields (Field Catalog)

Case 3: --------- Conditional Formatting Report structure

Case 4:--------- Static Query & Dynamic Query

Case 5: [Multiple Data Providers]-----------------------------------------

Case 6: [Images, Charts, Header, Footer]----------------------------------------------------

Case 7: [Report Field]---------------------------

Case 8: [Group Levels]------------------------------

Note :- Report Web item in WAD

5) BEx Web Analyzer:
---------------------------
- Tool for designing ad-hoc reports

* Integrating BEx queries / reports with SAP Enterprise Portal (integrating BEx objects with SAP EP)

SSO

Case 1:---------- BEx query to EP

* Visual Composer:
--------------------------
- To design iViews by integrating data from any application

Case 1:--------- BEx query - Table Display

Case 2:---------- BEx query - Graph Display

Case 3:--------- Layers (Tabs)

Information Broadcasting:
----------------------------------
- IB is available in 3.5, but with limited functionality (broadcasting to e-mail only); in 3.x we used it to distribute query outputs by mail (Reporting Agent, e-mail bursting).
- In BI 7.0, all the Reporting Agent functionality has moved into IB.
- New: we can also broadcast BEx workbooks.

Pre-requisite settings:
----------------------------
- Connection between the SAP BI server & the mail server (SMTP) - configured in SCOT.
- A mail ID has to be maintained for the SAP user - SU01.

How to broadcast BEx queries?
----------------------------------------------
- Distribution type:

Process Chains:
-------------------
- Display mode
- New process types

Management Cycle:-------------------------

Integrated Planning: [IP]
---------------------------------
- Planning is anything we decide in advance: short-term planning, long-term planning, rolling forecasts.
- In 3.5 we had BPS for planning.

4 aspects:
-------------
1. Modeling the planning structures - RTC, Aggregation Level, MultiProvider, Filters, Characteristic Relationships, Data Slices
2. Manual planning - [input-ready query, planning layouts]
3. Planning functions & planning sequences - to automate the planning applications we use functions (COPY, REPOST, DELETE, DISTRIBUTION BY KEYS, Revaluation, FOX, ...)
4. Lock management

Business requirements of planning:
--------------------------------------------
- Volume planning
- Profit margin / revenue planning
- Cost center planning

Modeling the planning structures - RTC, AL, MP, Filters, Char. Relationships, Data Slice
---------------------------------------------------------------------------------------------
- RTC:

- Aggregation Level:

- Filters:

- Data Slice:

- Char. relationships - ATTR, Hier , DSO

- Manual Planning - [Input-Ready Query, Planning Layouts]
--------------------------------------------------------------
Steps:
---------
1. Gather the requirements for planning

2. Identify the Detailed Level / Granularity of Planning.

3. Based on the Detailed Level / Granularity of Planning design the RTC.

4. Create the Info Objects.

5. Maintain the Master data.

6. Create a RTC as per the Design.

7. Load the template data to this cube.

8. Go to Planning Modeler, Specify the Info Provider and create the Aggregation Level.

9. Define the Filter

10. Define the input-ready query
----------------------------------
* An input-ready query is built on an Aggregation Level
* All characteristics of the AL must be used in the query

11. Create the Excel based planning layout by using BEx Analyzer.

12. Create the Planning Layout with Web Based

- Planning functions: COPY, Repost, Delete, Revaluation

- Lock Management:

- Delete locks - SM12
- Define the filters at a very detailed level - RSPLSE

- Customized Planning function - uploading data from a flat file.


Case Study: [Manual Planning]
----------------------------------------
Steps:
-------
1. Gather the requirements
2. Identify the detailed granularity of planning
3. Design the RTC
4. Create the Info Objects & load the respective master data
5. Create the RTC
6. Load the RTC with template data
7. Using the COPY function, implement the planning application to set up the data for every forecast period, so the planners can use the same data for planning (periodicity - beginning of every forecast)

- Copy

Region   Version   Revenue
------------------------------------------
SOUTH    V1        10000

AFTER

Region   Version   Revenue
------------------------------------------
SOUTH    V1        10000
NORTH    V1        10000

7.1) Specify the Info Provider - RSPLAN
7.2) Create the Aggregation Level
7.3) Create the Filter
7.4) Create the planning function - COPY
7.5) Create the planning sequence

Manual planning for the dealer
-----------------------------------
- Define the filter
- Define the input-ready query for the dealer
- Define the reporting query
- Define the Excel-based layout for planning
- Define the web template for planning

Case [Complex Aggregation Level]:
----------------------------------------------
Steps:
-------
1. Create the MultiProvider
2. Specify the Info Provider
3. Create the Aggregation Level
4. Define the filter (Info Provider: both the standard cube & the planning cube)
5. Define the planning function - COPY
6. Define the planning sequence

Other planning functions:
-------------------------------
- Delete:

Region   Version   Revenue
---------------------------------------
South    V1        10000

After:

Region   Version   Revenue
------------------------------------------
South    V1        10000
South    V1       -10000

Note:
-------
- The filter is mandatory; it should be defined to select the data to be deleted.

Repost:
---------
Region   Version   Revenue
-------------------------------------------------------------
South    V1        10000

After:

Region   Version   Revenue
-----------------------------------------------
South    V1        10000
South    V1       -10000
North    V1        10000

Note:
-------
- The filter is mandatory and must contain the field to be changed, with both FROM and TO values.

Revaluation:
----------------
- We use this function when we need to increase the key figure values by a certain percentage (see the FOX sketch below).
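The same effect can be sketched in FOX; a minimal one-line example, assuming the ZVOLUME key figure that the FOX case below also uses, revaluing every record in the filter selection by 10%:

{ZVOLUME} = {ZVOLUME} * 1.1.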

Distribution by keys:
---------------------------
Year   Period   Volume
--------------------------------------------
2011   #        1300

After:

Year   Period   Volume
--------------------------------------------
2011   #        1300
2011   #       -1300
2011   001       650
2011   002       650

Note:
-------
- The filter is mandatory and must contain the field to be changed, with both FROM and TO values.

FOX [Formula]:
--------------------
- Extended formula language

Case 1:
----------
V1   1000   2011   15,000
V1   1000   2011   15,000
V2   1000   2010   16,000

DATA ZLREGION TYPE ZBREGION.
DATA V_COUNT TYPE I.
DATA V_LOOP TYPE I.

V_LOOP = 1.
* VARC returns the number of values of the variable ZGREGION
V_COUNT = VARC(ZGREGION).

DO.
* VARI returns the V_LOOP-th value of the variable ZGREGION
  ZLREGION = VARI(ZGREGION,V_LOOP).
  IF ZLREGION = 'V1'.
    {ZVOLUME} = {ZVOLUME} - 5000.
  ELSEIF ZLREGION = 'V2'.
    {ZVOLUME} = {ZVOLUME} - 3000.
  ENDIF.
  IF V_LOOP = V_COUNT.
    EXIT.
  ENDIF.
  V_LOOP = V_LOOP + 1.
ENDDO.

Calling function modules in FOX:
------------------------------------------
- The function module must be registered in the table RSPLF_FDIR.
- We can also create our own custom planning functions: create the class in SE24, then use the class to create the function type in RSPLF1 (e.g. uploading data from a flat file through IP).

Data Slice:
--------------
- Locks the data against planning

Characteristic relationships:
------------------------
- Attribute: without derivation - matching & blank values; with derivation - only the matching value
- Hierarchies
- DSO (Standard DSO)
- Exit class

Lock Management:
------------------------
To reduce locks:
----------------------
- Define the filters at a very detailed level
- Lock management - RSPLSE
- Delete locks in SM12

Currency Conversion:
2 ways:
-----------
- Within the Transformations (see the sketch below)
- In Reporting

TCURC - list of currencies supported in SAP BW
TCURR - list of all the exchange rates
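For the transformation-side conversion, a minimal ABAP sketch using the standard function module CONVERT_TO_LOCAL_CURRENCY, which reads its rates from TCURR; the currencies, the amount and the rate type 'M' below are illustrative assumptions:

* Convert an amount from USD to EUR at today's average ('M') rate.
* The amounts and currencies are illustrative assumptions.
DATA: lv_foreign TYPE p DECIMALS 2 VALUE '100.00',
      lv_local   TYPE p DECIMALS 2.

CALL FUNCTION 'CONVERT_TO_LOCAL_CURRENCY'
  EXPORTING
    date             = sy-datum
    foreign_amount   = lv_foreign
    foreign_currency = 'USD'
    local_currency   = 'EUR'
    type_of_rate     = 'M'     "exchange rate type from TCURR
  IMPORTING
    local_amount     = lv_local
  EXCEPTIONS
    no_rate_found    = 1
    OTHERS           = 2.
IF sy-subrc <> 0.
* No exchange rate maintained in TCURR for this combination
ENDIF.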

Reporting:
--------------
Steps:
-----------
1. Create the currency translation type - RRC1 / RSCUR
2. Use the translation type in the BEx Query Designer

Visual Composer:
----------------------
- To design iViews by integrating BEx queries or BAPIs from different applications. Requires Flash Player.

Case 1:--------- Integrate the BEx query and display in form of a Table


DB Connect:
---------------
1. Create the Data Source

Unit conversion:
------------------------