
MODELLING
Following are YouTube links that explain the BW architecture very efficiently. The videos are valuable for the holistic view they capture of the interaction between the various sections of BW across versions 3.5 and 7.0. A video on 7.3 is still awaited.

http://www.youtube.com/watch?v=fovmnSIplpc (BW 3.5 architecture)

http://www.youtube.com/watch?v=dCcyKmdwVYk (BW 7.0 architecture)


How to search for the data in the PSA – option 1

Once you are working with SAP Business Warehouse (BW), your job is most likely about data analysis. You need to check the data in the data targets (InfoCubes/ODSs), explore how the data got into these objects and what the transformation logic is, and, when a bug is suspected somewhere here, compare the data in the source or in the PSA area with the data in the targets.
One way to search for data in the PSA is described in the following material. First navigate to your DataSource in the PSA tree within the Modeling section of the Administrator Workbench (transaction RSA1) and choose "Delete PSA data" from the right-click menu:
On the next screen that appears, scroll to the right of the first section and try to find the field called "DDIC table of the PSA". The content of this field is the dictionary table of the PSA data.
You can browse it in SE11, for instance:
Here you can choose the "Contents" icon or use the keyboard shortcut CTRL+SHIFT+F10.
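Alternatively, once you know the PSA table name, you can read it in a small ABAP report. This is only a sketch: the report name and the table name /BIC/B0000123000 are placeholders (substitute the value found in the "DDIC table of the PSA" field), and cl_demo_output requires a newer ABAP release; on older systems, loop over the table and WRITE the fields you need.

REPORT zbrowse_psa.
* Placeholder name - substitute the PSA table from your system.
CONSTANTS c_psa TYPE tabname VALUE '/BIC/B0000123000'.

DATA lr_tab TYPE REF TO data.
FIELD-SYMBOLS <lt_psa> TYPE STANDARD TABLE.

* Create an internal table typed dynamically on the PSA table.
CREATE DATA lr_tab TYPE STANDARD TABLE OF (c_psa).
ASSIGN lr_tab->* TO <lt_psa>.

* Fetch a handful of records, including the technical fields
* (request number, data packet, record number, partition).
SELECT * FROM (c_psa) INTO TABLE <lt_psa> UP TO 20 ROWS.

cl_demo_output=>display( <lt_psa> ).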

SDN Articles - BI Modelling

List Of Articles:


1) Data Source





7) Open Hub Destination (OHD) and APD


Analysis Process Designer: Step by Step Process for Formatting the Query Extract


APD to Update Marketing Attributes from SAP BI to SAP CRM


APDs and Open hubs Sending SAP BW Data to 3rd Party


InfoSpokes and OpenHubs in SAP BI


Insert Custom Header and Remove Trailing Commas from APD Generated (or any)-CSV file


Join and Union options in APD


Open Hub Destination - Basics


Open Hub Destination - Make use of Navigational Attributes


Open Hub Destination-Use Same Logical Path for Multiple Directories


SAP BW Infospoke – Dynamic Update Selection Screen Values 

8) RDA (Real-time Data Acquisition)


How to do Real-Time Data Acquisition through Web Service (Push Method)


Real Time Data Acquisition (RDA) – Overview and Step-by-Step Guide (SAPI and Web Services)


RDA – Step by Step


Real Time Data Acquisition (RDA) Steps - Business Intelligence


9) Miscellaneous


0RECORDMODE and Delta Type Concepts in Delta Management


Record Mode Concept in Delta Management


Aggregation of Key Figures


ALE Settings for Communication between a BW System and an SAP System


Basics of Non – Cumulative Key Figures


Direct Access to Source System Data using VirtualProviders


SAP BW - Virtual Characteristic (Multiprovider & Infoset) - RSR_OLAP_BADI


 Introduction to Database:


The main purpose of a database is to store data so that it can be used later (for analysis). Any big enterprise always stores its business data in a database, so that it can use this data to analyze its business.

Any database (Oracle, MS Access, SQL Server, Sybase, ...) always stores data in the form of a structure called a "TABLE".

Table: a table is a set of ROWS and COLUMNS.
Each row in a table is referred to as a record.

Primary Key:

It is a column in a database table which maintains uniqueness across all the records in the table. It can be used to uniquely identify each and every record in the table.

For example, in a Customer table the "Customer Number" is the Primary Key, as in the following sample:
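A minimal illustrative Customer table (sample values, not from a real system):

Customer Number | Customer Name | City
C100            | John          | New York
C200            | Ravi          | Chennai

Customer Number is unique for every record, so it serves as the primary key here.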

Every table must have a Primary Key (in exceptional cases we can have a table without a primary key).

We have 2 types of columns in a table:
            1) Key Column
            2) Non-Key Column
Key Column: a column which is part of the Primary Key.
Non-Key Column: a column which is not part of the Primary Key.

All Non-Key Columns act as attributes or properties for the Key Columns.



Composite Key:

When 2 or more columns together act as the primary key of a table.

For example, Bill No & S.No can together act as the primary key of a database table, so they are referred to as a Composite Key (see the sample below).
It is a limitation of the database that a maximum of 16 columns can act as the composite key.
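A small illustrative bill-items table (sample values): no single column is unique on its own, but Bill No + S.No together identify each record.

Bill No | S.No | Material | Qty
1001    | 1    | M100     | 2
1001    | 2    | M200     | 1
1002    | 1    | M100     | 5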

De-Normalized Table:

When we store all the data in a single big table and find the data being stored redundantly (duplicated), we call it a De-Normalized Table.

Limitations of De-Normalized Table:

1) Database space is wasted
2) Complexity is high

Normalized Table:

Instead of storing the data in a single table, we split the data into multiple smaller tables connected through Primary Key – Foreign Key relationships, with no data redundancy – these are Normalized tables.

Normalization:

The process of converting de-normalized tables into normalized tables by applying the normal forms.

Foreign Key:

When the primary key of one table appears in another table, it is called a Foreign Key there (see the example below).
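For example, splitting de-normalized billing data into two normalized tables (sample values):

Customer table (primary key: Customer Number)
Customer Number | Customer Name | City
C100            | John          | New York

Bill table (primary key: Bill No)
Bill No | Customer Number | Amount
1001    | C100            | 250
1002    | C100            | 480

Customer Number is the primary key of the Customer table and a foreign key in the Bill table, so the customer details are stored only once, however many bills the customer has.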

Software Engineering process:

Whenever we develop software we follow the SDLC (Software Development Life Cycle). The SDLC contains 6 steps:
  1) Requirement Gathering
  2) Design
  3) Develop
  4) Testing
  5) Deploy
  6) Maintenance & Support
Requirement Gathering:
At this phase we gather the requirements from the end users and understand the business process.
Design:
As we know, every application has a front end (interface screens) and a back end (database). As part of this phase we design the front-end screens and design the database (database design is covered in the next section).
Develop
At this phase we develop the software by using some programming language and some database.
Testing
At this phase we test the software to see whether it works as per the user requirements or not.
Deploy
Once the software is tested thoroughly, we deploy it at the client location so that the business can start using it.
Maintenance & Support:
Once the software is deployed, we have to provide maintenance & support for any issues the client / business faces.
Business:
- Business / Business Process: a set of business activities / transactions (selling & buying, sales order, delivery, billing, ...); it happens when 2 or more entities, parties, or objects interact with each other to perform an event.
- Entity: any object which can perform work by itself, or which we can use to perform some work (a noun).

- Product id, name, qty, price, cno, cname, date, time, branch, ... (Transaction Data)
- Detailed information about an entity (Master Data)


Applications & Types:

- A programming language is used to design the front end (i.e., interface screens & application logic)
- A database is used to design the back end, to store data
- An operating system is needed to run the application
- The concept is the reason the application is designed in the first place

We have 2 types of applications:
1)    OLTP
2)    OLAP

OLTP [Online Transaction Processing]:

OLTP applications are mainly used to record all the transactions of the business.

OLAP[Online Analytical Processing]:

OLAP applications take in the transaction data from different OLTP applications and provide reports for analysis.


DATABASE DESIGN:

Database Design in OLTP:


- The ER Model [Entity Relationship Model] is used to design the database for OLTP applications.
- A database designed with the ER Model is 2-dimensional & completely normalized.

Database design in OLAP:

In OLAP applications we store data in a multidimensional format by using the following models:

  1) Star Schema or Traditional Star Schema
  2) Extended Star Schema or BW Star Schema or BI Star Schema
  3) Snowflake
  4) Hybrid
Star Schema:

A star schema is an MDM (Multi-Dimensional Model) which contains the Fact table / transaction data table at the center, surrounded by Dimension tables / master data tables existing within the cube.

These Dimension tables / master data tables are linked to the Fact table / transaction data table with a Primary Key – Foreign Key relationship.



Difference between ER Model & Star Schema:

ER Model        | Star Schema
2 Dimensional   | Multi Dimensional
Normalized      | De-Normalized

Limitations or Disadvantages of Star Schema:

          Master data is not reused:
In the case of a star schema, master data is stored inside the cube, so it cannot be reused in other cubes.
          Degraded performance:
Since the tables inside the cube contain alphanumeric data, query performance degrades, because processing numerics is much faster than processing alphanumerics.
          Limited analysis:
In the case of a star schema, we are limited to only 16 dimensions.

Extended Star Schema:

In the case of the extended star schema, the Fact table is connected to the Dimension tables, the Dimension tables are connected to the SID tables, and the SID tables are connected to the master data tables.
The Fact table and Dimension tables are inside the cube.
The SID tables and master data tables are outside the cube.

One Fact table can be connected to 16 Dimension tables, and one Dimension table can be assigned a maximum of 248 SID tables (248 characteristics).


Master data & SID tables:

Every characteristic InfoObject has its own SID table to convert its alphanumeric values to numeric values. A key figure InfoObject does not have an SID table, because the key figure value is already numeric.

Whenever we insert a value into a characteristic InfoObject, the system generates an SID number (a numeric value) in the SID table.
Each characteristic can have its own master data tables (ATTR, TEXT, HIER).
 

The attribute table is used to store all the attribute / property data.
The text table is used to store the description in multiple languages.
The hierarchy table is used to store the parent-child data.
For example, a Material attribute table holds attribute information such as Material Group and Material Price, while the Material text table holds the description in multiple languages.

So when we load master data, SIDs are generated in the SID table, as in the sample below.
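For example, when materials M100 and M200 are loaded for the first time, entries like these are created (the SID numbers are generated by the system; all values are illustrative):

SID table:
Material | SID
M100     | 1
M200     | 2

Attribute table:
Material | Material Group | Material Price
M100     | MG01           | 25
M200     | MG02           | 40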

Fact Table & Dimension Tables:

Fact Table: 

The Fact table will have Dimension IDs and key figures.

- Maximum DIM IDs – 16
- Maximum key figures – 233
- The Dimension IDs in the Fact table are connected to the Dimension tables.
- The Fact table must have at least one Dimension ID.

Dimension Table:

A Dimension table contains a Dimension ID column and SID columns.
- One column is used for the Dimension ID
- We have a maximum of 248 SID columns
- We can assign a maximum of 248 characteristics to one dimension
When we load transaction data into an InfoCube, the system generates DIM IDs based on the SIDs and uses the DIM IDs in the Fact table.
We can load transaction data before the master data; in this case the system first inserts the master data values into the master data tables, then generates the SIDs, and based on these SIDs it generates the DIM IDs used in the fact table. A worked example follows.
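For example, when a transaction record (Material M100, Customer C200, Revenue 500) is loaded (all values illustrative):
- the SIDs for M100 and C200 are looked up in (or, if missing, created in) the SID tables, say SID 1 and SID 2;
- the system generates a DIM ID for this SID combination, say DIM ID 1, and stores (DIM ID 1, SID_Material 1, SID_Customer 2) in the dimension table;
- the fact table row is then simply (DIM ID 1, Revenue 500) – purely numeric, which is what makes the extended star schema fast.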

Standards to Design Info Cube:

- If we have 2 characteristics which are related 1:1 or 1:M, we should assign them to the same Dimension table.
- If we have 2 characteristics which are related M:M, we should assign them to different Dimension tables.

Modeling of Characteristics:
- If we model a characteristic as an attribute of another characteristic, it gives the present truth, because the property of master data is overwrite.
- If we model it as a separate characteristic assigned to a Dimension table, it gives the fact.

Modeling of Key figures:
- If we model a key figure as an attribute of a characteristic, it gives the present truth, because the property of master data is overwrite.
- If we model the key figure inside the Fact table, it gives the fact, because the property of the InfoCube is additive.
SAP BW Architecture:
Points to be Noted regarding SAP BW:

- In SAP BW we work with objects like InfoCubes, ODSs, InfoSources, DataSources, InfoPackages, update rules, transfer rules, BEx queries, ...
- In SAP BW we have 2 types of objects:

1) Standard or Business Content objects:
- These are the ready-made objects delivered by SAP.
- All standard objects have a technical name starting with the number 0.
- All Business Content objects are in the Delivered version.

2) Customized objects:
- These are the objects we create as per our requirements.

- Every object in SAP BW has a technical name and a description.
- Once an object is created we cannot change its technical name, but we can change its description.

Info Area:

Info Area is like “Folder” in Windows.
It is used to organize the objects in SAP BW.

Info Object Catalogs:

Similar to Info Area, Info Object catalog is used to organize the Info Objects based on their type.
So we will have Info objects catalogs of type Characteristics & Key figures.

Info Objects:

It is the basic unit or object in SAP BW, used to create any structure in SAP BW (InfoCube, ODS, InfoSource, ...).
Each field in the source system is referred to as an InfoObject in SAP BW.
We have 5 types of InfoObjects:

1) Characteristic
- All business subjects that we analyze
- Ex: Customer Number, Material Group, Company Code, Employee Group
2) Key figure
- All quantitative measures used to analyze the subjects
- Ex: Price, Revenue, Qty, Number of Employees, VAT %
3) Time characteristic
- Characteristics which maintain time information
- We cannot create time characteristics
- Ex: 0CALDAY, 0CALMONTH, 0CALYEAR, ...
4) Unit characteristic
- Characteristics which hold currencies and units, like 0CURRENCY, 0UNIT
- We always have to create a unit characteristic by taking 0CURRENCY or 0UNIT as the reference.
5) Technical characteristic
- Characteristics which hold technical details like request number, data packet number, record number.
- Ex: 0REQUID, ...

Info Cubes:

- An InfoCube is a multi-dimensional object which is used to store transaction data.
- An InfoCube contains a Fact table & Dimension tables.
- An InfoCube is referred to as a data target because it physically holds data.
- An InfoCube is referred to as an InfoProvider because we can do reporting on it.
- The property of an InfoCube is additive.

ODS:

- ODS stands for Operational Data Store
- It is a 2-dimensional object
- The property of an ODS is overwrite
- We use an ODS for staging the data and also for detailed reporting

Info Source:

- An InfoSource defines the communication structure
- The communication structure is a group of InfoObjects required to communicate the fields coming from the source system
- We have 2 types of InfoSources:
- Direct Update: used to load master data objects
- Flexible Update: used to load transaction data to any data target (InfoCube, ODS)

Data Source:

- A DataSource defines the transfer structure
- The transfer structure indicates which fields are transferred from the source system, and in what sequence
- We have 4 types of DataSources:
1) Attr: used to load master data attributes
2) Text: used to load text data
3) Hier: used to load hierarchy data
4) Transaction data: used to load transaction data to an InfoCube or ODS

Source System:

Source System Connections

We use source system connections to connect different OLTP applications to SAP BW.
We have different adapters / connectors available:

- SAP Connection Manual
- SAP Connection Automatic
Both these connections are used to connect any SAP application to SAP BW via RFC.
Ex: we use this connection to connect SAP R/3, SAP APO, and SAP CRM to SAP BW.

- Myself Connection:
We use this connector to connect SAP BW to the same SAP BW server.
We generally use this to load data from one InfoCube to another InfoCube.

- Flat File Interface:
We use this adapter to load data from flat files (it only supports ASCII or CSV files).

- DB Connect:
We use this connector to connect any SAP-certified database to SAP BW (certified databases such as Oracle, SQL Server, DB2, ...).

- External Systems with BAPI:
We use this connector to connect 3rd-party ETL tools such as Informatica or DataStage.

Info Package:

- An InfoPackage is used to schedule the loading process.
- An InfoPackage is specific to a DataSource.
- All the properties we see in the InfoPackage depend on the properties of the DataSource.

Business Explorer[BEx]:


We use BEx components to design all the reports in SAP BW.

RSA1: [Administrator workbench]:

- Modeling

We create the BW objects (Info Areas, InfoObjects, InfoCubes, InfoSources, ODSs, MultiProviders, InfoSets).
We also perform the procedures to load data into these objects.

- Monitoring

We monitor all the BW objects.
We also monitor the loading processes.
- Reporting Agent
  To run / schedule the BEx reports in the background.
- Transport Connection
We use this tab to transport objects from one BW server to another.
- Documents
  BDS(Business Document Services), used to maintain documents within SAP BW
- Business Content
All the Business Content objects are in the Delivered version, but to use an object it must be available in the Active version. So in the Business Content tab we install the Business Content objects (creating a copy of the Delivered-version objects in the Active version).

- Translation:

We use this tab to translate objects from one language to another.
When we translate an object, only its description changes, not its technical name.

- Meta Data:

Metadata is nothing but data about data.
Metadata is maintained in the Metadata Repository, which is managed by the Metadata Manager.


How to load data from Flat file to Info Cube:

Case Study – 1



Prerequisites:

A source system connection between the flat file and SAP BW.

Steps:

1) Design the Info Cube.
2) Implement the Design in SAP BW
      2.1) Create the Info Area
      2.2) Create the Info Object Catalogs
      2.3) Create the Info Objects
      2.4) Create the Info Cube
      2.5) Loading the Info Cube from Flat file.
2.5.1) Create the Application Component
         2.5.2) Create the Info Source
         2.5.3) Assign the Data Source to Info Source
         2.5.4) Activate the Transfer rules
         2.5.5) Create the Update Rules
         2.5.6) Create the Info Package and schedule the load


Deleting Data in Info Cube:

1) Deleting data based on a request
2) Delete data (complete deletion)
3) Selective deletion

ETL Process:
When we start the InfoPackage, SAP BW triggers the loading process with the steps below:
1. SAP BW sends a request to the source system.
2. It gets the confirmation "OK" from the source system.
3. Then SAP BW sends the data request.
4. Based on the data selections, the data is extracted from the source system.
5. The same data is transferred to SAP BW (transfer methods PSA & IDoc).
6. Then the data is loaded into the data target through the transfer rules & update rules.

Transfer Methods [PSA & IDOC]:

PSA:
- Persistent Staging Area
- It is a 2-dimensional table.
- Data is transferred directly to the PSA table, and information is transferred through Info IDocs.
- When we activate the transfer rules with PSA as the transfer method, the system automatically creates the PSA table.
- Every DataSource has its own PSA table.
- We can find the PSA table of a DataSource by using transaction SE11.
- Structure of the PSA table: transfer structure + 4 technical fields (Request No, Data Packet ID, Record No, Partition No).
- The PSA holds a replica of the data coming from the source.
- We can edit data in the PSA.
- Error handling is possible with the PSA.
- We can reload records from the PSA to the data target by using Reconstruction.
- We can delete data in the PSA (generally we delete PSA data that is older than 7 days).
- Allowing special characters into SAP BW - RSKC [76].
- RSALLOWEDCHAR

IDOC:
- Intermediate Document
- It is the standard used to transfer data in an SAP environment.
- Data is transferred through data IDocs and information is transferred through Info IDocs.
- IDoc maintenance



Infopackage Update Modes:
Full Update:

It extracts all the records from the source system with respect to the data selections.

Initialize Delta Update:

It is similar to a full update, but it enables us to run delta updates once the init is successful.

Delta Update:

It extracts only the data that has been newly created or modified since the last update.
Note:
- The delta update must run with the same data selections that were used for the init update.
- In the InfoPackage we can see the Delta update option only if the DataSource supports delta.


Transfer Rules:

By using transfer rules, we map the transfer structure to the communication structure.
Types of transfer rules:
1) Direct Mapping: we use this option to map the value of a source field in the transfer structure to the target InfoObject in the communication structure.
2) Constant: we use this option to specify a fixed/constant value for the records loaded through the transfer rules.
3) Formula: we use this option to implement a formula by using the formula editor.
4) Routine [Transfer Routine]: we use this option to transform the data by using ABAP/4 code. When implementing a transfer routine we refer to the source fields via the structure TRAN_STRUCTURE, e.g. TRAN_STRUCTURE-/BIC/PRICE. When debugging, the name of the transfer routine is formed as COMPUTE_FIELDNAME. A sketch follows this list.
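A minimal sketch of a transfer routine body; the surrounding FORM frame is generated by BW, and the InfoObject PRICE and the defaulting rule are assumptions for illustration:

* Sketch of a transfer routine body for InfoObject PRICE (assumed name).
IF TRAN_STRUCTURE-/BIC/PRICE IS INITIAL.
  RESULT = 0.                          " default an empty source price to zero
ELSE.
  RESULT = TRAN_STRUCTURE-/BIC/PRICE.  " pass the source value on unchanged
ENDIF.
* RETURNCODE <> 0 skips the record; ABORT <> 0 cancels the load.
RETURNCODE = 0.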
Update Rules:

Update rules specify the mapping between a source object and a target object. We use update rules to perform all kinds of transformations. Update rules update the data into the data target.

Types of Update rules:

Key figure:
- Source key figure or Direct Mapping
- Formula: we use this option to implement a formula by using the formula editor.
- Routine (update routine): we use this option to transform the value of a key figure by using ABAP/4 code. When implementing an update routine we refer to the source fields via the structure COMM_STRUCTURE, e.g. COMM_STRUCTURE-/BIC/PRICE. When debugging, the name of the update routine is formed as ROUTINE_001. A sketch follows this list.
- Routine with Unit: we use this option to transform the value of a key figure and also the value of the unit characteristic associated with it by using ABAP/4 code.
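A minimal sketch of an update routine body for a key figure; the FORM frame is generated by BW, and the region check and the InfoObjects ZCREG/PRICE are assumptions for illustration:

* Sketch of an update routine body (FORM frame generated by BW).
IF COMM_STRUCTURE-/BIC/ZCREG = 'AMR'.
  RESULT = COMM_STRUCTURE-/BIC/PRICE * 2.   " assumed rule for region AMR
ELSE.
  RESULT = COMM_STRUCTURE-/BIC/PRICE.
ENDIF.
RETURNCODE = 0.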

Characteristic:

- Source characteristic or Direct Mapping
- Constant
- Master Data Attribute of: we use this option to feed the value by doing a lookup on the master data tables.
- Formula
- Routine
- Initial Value: populates no value (NULL by default)

Time Characteristics:

- Source characteristic or Direct Mapping (automatic time conversion)
- Constant
- Master Data Attribute of: we use this option to feed the value by doing a lookup on the master data tables.
- Formula
- Routine
- Initial Value: populates no value (NULL by default)
- Time Distribution: we use this option to distribute values from a higher-level time characteristic to lower-level time characteristics (for example, a monthly value spread across the days of that month).


Start Routine:
- The start routine is executed before the individual update rules.
- It is executed packet by packet.
- So we use the start routine to implement any logic that must run before the update rules.
- When implementing a start routine we work with an internal table called DATA_PACKAGE.

Sample code (keeps only the records whose region is 'AMR'):

LOOP AT DATA_PACKAGE.
  IF DATA_PACKAGE-/BIC/ZCREG <> 'AMR'.
    DELETE DATA_PACKAGE.   " drop records of all other regions from this packet
  ENDIF.
ENDLOOP.
Return Table:

We use this option when we want to split one record from the source into multiple records in the data target, as sketched below.
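A hedged sketch of a key figure routine body with return table; the FORM frame and the exact type of RESULT_TABLE are generated by BW, and the idea of splitting a price into a net line plus an assumed 10% tax line is purely illustrative:

* Sketch of a routine body with return table (PRICE is an assumed name).
DATA ls_result LIKE LINE OF RESULT_TABLE.

CLEAR RESULT_TABLE.
* First target record: the net amount from the source record.
ls_result-/BIC/PRICE = COMM_STRUCTURE-/BIC/PRICE.
APPEND ls_result TO RESULT_TABLE.
* Second target record: an assumed 10% tax line from the same source record.
ls_result-/BIC/PRICE = COMM_STRUCTURE-/BIC/PRICE * '0.1'.
APPEND ls_result TO RESULT_TABLE.

In a real routine the characteristic values of the target records would also have to be filled; they are omitted here to keep the sketch short.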

Difference between Update Rules & Transfer Rules:

Transfer Rules                                   | Update Rules
Transfer rules just transfer the data            | Update rules update the data into the data target
Transfer rules are specific to the source system | Update rules are specific to the data target

Different Data Flow Designs:

- One InfoSource to multiple data targets – Yes
- Multiple InfoSources to a single data target – Yes
- One InfoSource can be assigned multiple DataSources – Yes
- The same DataSource cannot be assigned to multiple InfoSources
Master Data:

Detailed information about any entity is called master data.
Ex: detailed information about a customer – customer master data.
In SAP BW, we have 3 types of master data:
               ATTR
               TEXT
               HIER

ATTR: used to store all the attributes / properties of an entity.
TEXT: used to store all the descriptions in different languages.
HIER: used to store parent-child data.

How to load Master Data ATTR & TEXT from a Flat File

Steps:

1) Create the Application Component
2) Create the InfoSource of type Direct Update
3) Assign the DataSource to the InfoSource
4) Activate the transfer rules for the ATTR & TEXT DataSources
5) Create the InfoPackages and schedule the loads
Note: we need to create one InfoPackage for the ATTR DataSource and one for the TEXT DataSource.


Hierarchies:
When do we go for hierarchies?

When the characteristics are related 1:M and, in reporting, we need to display the values using hierarchies (a tree-like display).

Types of hierarchies:
- Hierarchy not time-dependent
- Hierarchy structure dependent on time
- Entire hierarchy dependent on time

How to load a hierarchy from a flat file?
Steps:
- Create the InfoSource
- Assign the DataSource
- Create the file as per the hierarchy
- Create the InfoPackage and schedule the load



Reference & Template: 

Reference

If we have an InfoObject 'A' and we create InfoObject 'B' by taking 'A' as the reference, all the properties of 'A' are copied to 'B' and we cannot change any properties of 'B'. We cannot load any data into InfoObject 'B'; it refers to the data / data dictionary tables of the main InfoObject 'A'.

Template

If we have an InfoObject 'A' and we create InfoObject 'C' by taking 'A' as the template, all the properties of 'A' are copied to 'C', we can change the properties of 'C', and we can load separate master data for InfoObject 'C'.

When do we go with reference?

When we want to create a new master data object that is supposed to hold data which is a subset of some other master data object, we create it with reference.

Reference examples:

- Sold-to Party, Ship-to Party, Bill-to Party, and Payer are created with reference to Customer.
- Sender Cost Center and Receiver Cost Center are created with reference to Cost Center.

Converting Master Data into a Data Target:

ODS

ODS: (Operational Data Store)
- An ODS is also an InfoProvider, like an InfoCube.
- An ODS is 2-dimensional.
- The property of an ODS is overwrite.
- We prefer an ODS for detailed-level reporting.
- We also use an ODS for staging.

An ODS CONTAINS 3 TABLES:
- New Data Table
- Active Data Table
- Change Log

1. Active Data Table:
- /BIC/AXXXXXX00 (/BI0/... for business content objects)
- Structure: all key fields [primary key] + all data fields + Recordmode
- Used for reporting
- The active data table is the source when we schedule an Init / Full update for a data mart

2. New Data Table:
- /BIC/AXXXXXX40
- Structure: technical keys [loading request no + data packet no + record no] (primary key) + all key fields + all data fields + Recordmode
- The first table where the data is staged in the ODS.

3. Change Log Table:
- /BIC/B000*
- Structure: technical keys [activation request no + data packet no + record no + partition no] (primary key) + all key fields + all data fields + Recordmode
- A registry of all the changes in the ODS.
- The change log table is the source when we schedule a Delta update for a data mart
Points to be noted:
- When we design a report on the ODS, it fetches data from the active data table.
- When we load data from an ODS to an InfoCube with "Full update / Initialize delta update", it takes the data from the active data table of the ODS.
- When we load data from an ODS to an InfoCube with "Delta update", it takes the data from the change log table of the ODS.
- Note: the maximum number of key fields is 16.
- We cannot use key figures in the key fields.

How does the overwrite functionality work using these tables?
When we load data into an ODS, the data is initially loaded into the new data table. Using "Set quality status to OK" we turn the request status from yellow to green. Once the request status is green we "Activate the data in the ODS": activation moves the records from the new data table to the active data table, overwriting any records with the same key-field combination, writes the corresponding entries to the change log table, and then deletes the records from the new data table. A worked example follows.
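For example, assume an ODS keyed on Order No with a data field Qty (all values illustrative):
- Load 1 brings (Order 4711, Qty 10). After activation, the active table holds (4711, 10) and the change log records a new image (4711, +10, recordmode 'N').
- Load 2 brings (Order 4711, Qty 15). Activation overwrites the active table to (4711, 15); the change log records a before image (4711, -10, recordmode 'X') and an after image (4711, +15).
A delta load to an additive InfoCube then posts -10 and +15, so the cube ends up at 15, matching the active table.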

How to create the ODS?
How to load data into the ODS from a flat file?

Prerequisites:
1) The flat file source system connection should be ready
2) The flat file should be ready

Steps:
1. Create the Application Component
2. Create the InfoSource [flexible update]
3. Assign the DataSource to the InfoSource
4. Connect the ODS to the InfoSource with update rules
5. Create the InfoPackage and run the load

Deleting Data:
1) Delete Data
- Deletes all the contents of all three tables.
2) Deleting based on a request
- Deletes data in all the tables.
- When we delete a request in an ODS, it deletes the selected request and all the requests above it.
3) Selective deletion
- Used when we want to delete records in the ODS based on the values of a particular characteristic.
- Deletes data only in the active data table.
4) Delete change log data
- Deletes data only in the change log table, based on the request (number of days, before a particular date).

Condense / Do not condense into a single request:
- When we activate multiple requests in an ODS at a time, if we select the option "Do not condense requests into one request when activation takes place", each request gets its own activation request. If we do not select that option, all the requests share the same activation request, so if we later delete one particular request, all the other requests in the ODS with the same activation request are deleted as well.

  Activation Serially / Parallel
Data Marts:

Case 1: Loading data from an ODS to an InfoCube

Prerequisites:

1) Myself source system connection
2) Application Component - Data Mart (DM)

Steps:

1) Identify the source object and the target object.
                SO - yo_sd01
                TO - yc_dm1
2) Check whether the source object already has an export DataSource. If not, we have to generate it explicitly ("Generate Export DataSource").
                SO - yo_sd01 - export DataSource - 8yo_sd01
3) Connect the source object to the target object with the help of update rules.
4) Find the InfoSource that is created automatically when we build the update rules; this InfoSource is assigned to the Myself source system connection under the Application Component (Data Marts).
5) Create the InfoPackage and schedule the load.

Case 2: Loading data from an InfoCube to an InfoCube

Prerequisites:

1) Myself source system connection
2) Application Component - Data Mart (DM)

Steps:

1) Identify the source object and the target object.
                SO - yc_dm1
                TO - yc_dm2
2) Check whether the source object already has an export DataSource. If not, we have to generate it explicitly ("Generate Export DataSource").
3) Connect the source object to the target object with the help of update rules.
4) Find the InfoSource that is created automatically when we build the update rules; this InfoSource is assigned to the Myself source system connection under the Application Component (Data Marts).
5) Create the InfoPackage and schedule the load.

- How to correct a delta load?
- Different update mechanisms to different data targets?

In the case of an ODS: we cannot do a "Full update" after doing "Init" or "Delta" updates, because this would reset the delta management of the ODS. To overcome this problem we use a "Repair Full Request".

In the case of an ODS: when full updates have already been done to the ODS, the ODS will not accept "Init and Delta updates". By using the function module RSSM_REQUEST_REPAIR_FULL_FLAG we convert all the "Full update" requests in the ODS to "Repair Full Requests".

DSO (Data Store Object)

Delta Concepts ABR, AIE, ADD Methods

Management of DataStore Object (DSO)

Record Mode Concept in Delta Management

Step by Step Procedure for DSO Creation

Understanding DSO (DataStore Object) Part 1- Standard DSO

Understanding DSO (DataStore Object) Part 2- Write-Optimized DSO

Understanding DSO (DataStore Object) Part 3- Direct Update DSO




DATA SOURCES


          Explore Data Source-Part1       

        Exploring Data Sources – Part 2        

         Better View of Data Source Features in BW 7.0   



DSO (DataStore Object)


Definition

A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.
This data can be evaluated using a BEx query.

A DataStore object contains key fields (such as document number, document item) and data fields that, in addition to key figures, can also contain character fields (such as order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes (standard) and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems.

Unlike multidimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. The system does not create fact tables or dimension tables.



Overview of DataStore Object Types:

Type                               | Structure                                                         | Data Supply                | SID Generation
Standard DataStore Object          | Three tables: activation queue, table of active data, change log | From data transfer process | Yes
Write-Optimized DataStore Object   | Table of active data only                                         | From data transfer process | No
DataStore Object for Direct Update | Table of active data only                                         | From APIs                  | No

You can find more information about management and further processing of DataStore objects under:


         Understanding DSO (DataStore Object) Part 1- Standard DSO

         Understanding DSO (DataStore Object) Part 2- Write-Optimized DSO


         Understanding DSO (DataStore Object) Part 3- Direct Update DSO



A DataStore object in SAP NetWeaver 2004s BI is the successor of the ODS object from earlier BI releases. The name change aligned it with prevalent data warehousing terminology. A DataStore object stores consolidated and cleansed transaction data or master data at document and item level (basic level) from one or several data sources. This data can be evaluated using a BEx query (primarily to support operational reporting).
A DataStore object contains key fields (for example, document number, document item) and data fields that can also contain characteristics (for example, order status, customer) in addition to key figures. The data in a DataStore object can be updated by delta update into InfoCubes and/or further DataStore objects or master data tables (attributes or texts) in the same system or across systems. Unlike multi-dimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. Fact and dimension tables are not created.



 
