SAP Certified Application Associate – Modeling and Data Management with SAP BW and SAP BI Certification Exam Questions
These questions are similar to the ones asked in the actual test.
How do I know? Because although I earned my SAP Certified Application Associate – Modeling and Data Management with SAP BW and SAP BI certification five years ago, I have re-certified with the latest version of the Associate certification test.
Before you start, here are some key features of the SAP Certified Application Associate – Modeling and Data Management with SAP BW and SAP BI Certification Exam:
– The exam is computer-based, and you have three hours to answer 80 questions.
– The questions are (mostly) multiple-choice, and there is NO penalty for an incorrect answer.
– Some questions have more than one correct answer. You must get ALL the options correct to be awarded points.
– The official pass percentage is 65% (but this can vary). You will be told the exact passing percentage before you begin your test.
Sample Questions
1. SAP BW does not support more than how many InfoCubes in one InfoSet?
Please choose the correct answer.
a. One
b. Two
c. Three
d. Four
Answer: b
Explanation:
Recommendations for Modeling InfoSets:
The following is a list of recommendations for modeling InfoSets:
- We recommend that you use only one object of the InfoSet (DSO, InfoCube, or master data table) for ambiguous characteristic values. When joining a DSO with an InfoCube, use all key characteristics of the DSO in the join condition for the InfoCube (provided that the InfoCube has the visible key figures). In addition, when joining a master data table that is compounded with an InfoCube, join all key characteristics of the master data table with characteristics of the InfoCube.
- Do not use many InfoProviders in one InfoSet. Instead, define several InfoSets.
- Only use a small number of joins in one InfoSet (especially if you load a lot of data).
- Use left outer joins in InfoSets only if necessary. The performance of left outer joins is poorer than that of inner joins.
- Do not make any calculations before the aggregation, since this may cause incorrect query results.
- For time-dependent InfoSets in the query definition, do not use the Valid From (0DATEFROM) and Valid To (0DATETO) fields. Instead, use the Valid Time Interval dimension (VALIDTIMEINTERVAL), which is only visible in the Query Designer and is used for time selection.
- If all characteristics affected by the join condition are not in the drilldown of a query, the key figure values of InfoCubes and DSOs are duplicated when you join them to InfoProviders (for more information, see SAP Note 592785). Therefore, interpreting the results of joins with non-unique InfoProviders becomes more difficult as you include more InfoProviders.
To avoid problems caused by duplicated key figure values, we recommend that you stage the key figures in the InfoSet query for only one DSO or InfoCube of the InfoSet. To achieve this, set the marker in the first column of the InfoSet maintenance.
- For performance reasons, it is not possible to define an InfoCube as the right operand of a left outer join.
- SAP BW does not support more than two InfoCubes in one InfoSet. In contrast to the star schema (for which the potentially useful database access plans are limited by the table structure), several InfoCubes exist for a join, and several fact tables or DSO tables exist if you join InfoCubes with DSOs. This means that the schema is no longer based on just one large table, and choosing a good access plan is much more difficult. Therefore, the average response time increases exponentially with the number of InfoCubes included.
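The key-figure duplication described in SAP Note 592785 is easy to reproduce with plain SQL. The sketch below is a generic SQLite illustration, not SAP code; all table and column names are invented. It joins a "sales" fact table with a "costs" table on a characteristic that is not unique in the sales data:

```python
import sqlite3

# Invented tables: "sales" plays the role of one InfoProvider, "costs"
# another; "customer" is the join characteristic, "amount"/"cost" the
# key figures.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (customer TEXT, product TEXT, amount INTEGER);
CREATE TABLE costs (customer TEXT, cost INTEGER);
INSERT INTO sales VALUES ('C1', 'P1', 100), ('C1', 'P2', 200);
INSERT INTO costs VALUES ('C1', 50);
""")

# "product" is not in the drilldown (not in GROUP BY), so the single cost
# row of customer C1 is matched once per sales row and counted twice.
row = con.execute("""
    SELECT s.customer, SUM(s.amount) AS amount, SUM(c.cost) AS cost
    FROM sales AS s
    JOIN costs AS c ON s.customer = c.customer
    GROUP BY s.customer
""").fetchone()
print(row)  # ('C1', 300, 100): the true total cost is only 50
```

Because customer C1 has two sales rows, its single cost row is read twice and the aggregated cost is doubled from 50 to 100, which is exactly the effect that staging key figures from only one DSO or InfoCube of the InfoSet is meant to avoid.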
2. HybridProviders are available for write-optimized DSOs. Is this statement true?
Please choose the correct answer.
a. Yes
b. No
c. Depends on certain conditions.
Answer: b
Explanation:
Features of HybridProviders:
- HybridProviders are not available for write-optimized DSOs.
- Direct access is only an option for DataSources that offer valid delta criteria and guarantee acceptable transfer times.
- Direct access does not support DataSources that use the delta queue.
3. The analytic index is used to generate which of the following?
Please choose the correct answer.
a. CompositeProvider
b. TransientProvider
c. Both a and b
d. None of the above
Answer: b
Explanation:
An analytic index is a data container whose data is stored in the SAP BW Accelerator or in the SAP HANA database.
The analytic index is used to generate a TransientProvider. Analytic indexes can be created in the Analysis Process Designer tool and filled with (transformed) data quickly. They are intended for ad hoc scenarios. They can also be created as InfoProviders without reference to InfoObjects. They are therefore not integrated into the metadata repository and cannot be transported.
A CompositeProvider is an object that can join existing analytic indexes via UNION, inner, and left outer joins. The calculation is performed at query runtime.
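In plain relational terms, the two ways a CompositeProvider combines data behave like SQL set and join operators. The following is a minimal sketch using generic SQLite with invented table names, not the actual SAP implementation:

```python
import sqlite3

# "plan" and "actual" stand in for two analytic indexes combined by a
# CompositeProvider.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE plan (region TEXT, amount INTEGER);
CREATE TABLE actual (region TEXT, amount INTEGER);
INSERT INTO plan VALUES ('EMEA', 100), ('APAC', 80);
INSERT INTO actual VALUES ('EMEA', 90);
""")

# UNION: the rows of both providers are stacked on top of each other.
union_rows = con.execute("""
    SELECT region, amount FROM plan
    UNION ALL
    SELECT region, amount FROM actual
""").fetchall()
print(len(union_rows))  # 3 rows in total

# LEFT OUTER JOIN: rows are matched on the shared characteristic; a region
# without actual data (APAC) is kept, padded with NULL on the right side.
join_rows = con.execute("""
    SELECT p.region, p.amount, a.amount
    FROM plan AS p
    LEFT OUTER JOIN actual AS a ON p.region = a.region
    ORDER BY p.region
""").fetchall()
print(join_rows)  # [('APAC', 80, None), ('EMEA', 100, 90)]
```

The left outer join keeps unmatched rows from the left operand, which is also why it is slower than an inner join: the database cannot discard non-matching rows early.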
4. If you want to structure your data model only according to semantic criteria, you have to use which of the following?
Please choose the correct answer.
a. Hybrid Provider
b. Semantically Partitioned Object (SPO)
c. SAP BW Accelerator-Only InfoCube
d. All of the above
Answer: c
Explanation:
Use an InfoCube on the SAP BW Accelerator only in the following cases:
- If you want to benefit from the SAP BW Accelerator not only in terms of query performance but also to save database memory
- If you want to structure your data model only according to semantic criteria
- If you want to move the reporting layer to the SAP BW Accelerator
Use a HybridProvider in the following cases:
- If you want performant reporting on near real-time data
- If you are confronted with a high volume of (historical) data
- If you want to avoid a complex staging mechanism to cater for the above scenario
Use an SPO in the following cases:
- If you want to handle high data volumes for staging and/or reporting
- If you want to minimize manual modeling times
- If you want to speed up design changes for logically separated data targets
5. Can a DSO (DataStore Object) for direct update be exported to other InfoProviders?
Please choose the correct answer.
a. Yes
b. No
c. Depends on certain conditions.
Answer: a
Explanation:
The following list describes the important aspects of a DSO for direct update:
- A DSO for direct update consists only of a table with active data.
- It contains the key that was chosen in the definition of the DSO (the semantic key).
- It has no change log and no activation queue.
- It can be used in the Analysis Process Designer (APD) as a data target.
- It cannot be used for transformation scenarios or upload scenarios.
- It is not a target for SAP BW loading processes.
- However, it can be exported to other InfoProviders.
- Reporting on the DSO for direct update is possible.
Usage:
DSOs for direct update ensure that the data is made available quickly. Access to data of this type of DSO is transactional, which means that data is written to the DSO (possibly by several users simultaneously) and may be read immediately. It is not a replacement for the standard DSO, but instead offers an additional function that can be used for special applications.
Application Scenarios:
The application scenarios for DSOs for direct update are as follows:
- A DSO for direct update is used to directly enter (external) transaction data, such as an SAP BW table for direct user interactions.
- An Application Programming Interface (API) is available with function modules, some of which are Remote Function Call (RFC)-enabled. This enables the automatic transfer of data from external source systems to DSOs for direct update.
- It functions as a data target in APD processes.
6. Write-optimized DSOs are partitioned automatically. Is this statement true?
Please choose the correct answer.
a. Yes
b. No
c. Depends on certain conditions.
Answer: a
Explanation:
The following are tips for general data modeling:
- Use only as many keys as required. Reduce the granularity as much as possible.
- Use only as many data fields as required. Reduce the information fields as much as possible.
- Load only as many data records as required.
- SID generation
Do not select SID generation if no reporting is required. This prevents time-consuming characteristic SID generation during activation.
- Partitioning
Check specific database settings to ensure optimum read, write, and delete access. Write-optimized DSOs are partitioned automatically.
- Indexing
If the selection criteria in reporting do not correspond to the key specifications in the DSO definition, additional secondary indexes help to improve query response times. You can maintain indexes in the DSO definition.
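The effect of a secondary index can be sketched with any relational database. The example below uses generic SQLite with invented table and index names, not SAP code; it shows the query plan switching from a full table scan to an index search once an index on the reporting characteristic exists:

```python
import sqlite3

# "doc_no" plays the role of the DSO key; "region" is the reporting
# characteristic that the selection criteria actually filter on.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dso (doc_no TEXT PRIMARY KEY, region TEXT, amount INTEGER)")
con.executemany("INSERT INTO dso VALUES (?, ?, ?)",
                [(f"D{i}", "EMEA" if i % 2 else "APAC", i) for i in range(1000)])

# The selection does not match the key, so the whole table is scanned.
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM dso WHERE region = 'EMEA'").fetchall()
print(plan_before)  # the plan shows a SCAN of the table

# A secondary index on the reporting characteristic avoids the full scan.
con.execute("CREATE INDEX idx_region ON dso (region)")
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM dso WHERE region = 'EMEA'").fetchall()
print(plan_after)  # the plan now searches via idx_region
```

This is the same trade-off as in the DSO definition: the index speeds up selective reads at the cost of extra writes during loading.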
7. Back up and restore processes are implemented for SAP BW Accelerator lnfoCubes. Is this statement true?
Please choose the correct answer.
a. Yes
b. No
c. Depends on certain conditions.
Answer: b
Explanation:
InfoCubes with data persistency in the SAP BW Accelerator have the following advantages over standard InfoCubes:
- Data redundancy is avoided in the SAP BW system.
This saves database space and reduces the SAP BW system load, which improves the overall database performance.
- The data analysis layer is relocated.
The data analysis layer is relocated from the SAP BW system to the SAP BW Accelerator. This results in improved performance for data analysis.
- There is no need to create database aggregates for the InfoCube.
The data warehouse layer remains in the SAP BW system and provides a stable environment for staging the data.
Backup and restore processes are not implemented for SAP BW Accelerator lnfoCubes.
8. The HybridProvider based on a DataStore Object (DSO) comprises which of the following?
Please choose the correct answer.
a. DSO
b. DSO and InfoCube
c. InfoCube
d. None of the above
Answer: b
Explanation:
The HybridProvider based on a DSO is a combination of a DSO and an InfoCube. When this HybridProvider is activated, the objects required to control the data flow (the Data Transfer Process (DTP) and the transformation between the DSO and the InfoCube, as well as the associated process chain) are also generated at the same time.
9. The HybridProvider based on direct access is a combination of VirtualProvider and InfoCube. Is this statement true?
Please choose the correct answer.
a. Yes
b. No
c. Depends on certain conditions.
Answer: a
Explanation:
The HybridProvider based on direct access is a combination of a VirtualProvider and an InfoCube. Both objects have the same structure. When you edit the HybridProvider, you edit both InfoProviders in the same way.
The data is loaded using a DTP that you create from the DataSource to the InfoCube. The system also creates a DTP between the DataSource and the VirtualProvider that uses the same transformation.
10. SAP HANA-optimized objects can be which of the following?
Note: There is more than one correct answer to this question.
a. InfoCubes
b. DataStore Objects
c. Semantically Partitioned Objects
Answer: a, b, c
Explanation:
A Comparison of LSA and LSA++:
- Use of SAP HANA-optimized objects
- Optimized EDW core
- Enhanced virtualization layer
- Additional LSA++ layers
SAP HANA-optimized objects can be InfoCubes, DSOs, and SPOs. An SAP HANA-optimized flat InfoCube, for example, does not contain dimension tables.
To optimize the EDW core you can, among other things, reduce the number of persistent InfoProviders. For example, with an SAP HANA database it is not always necessary to use a persistent architected data mart layer with InfoCubes.
The virtualization layer consists of MultiProviders and, in rare cases, InfoSets, because joins in InfoSets are slow. In LSA++, the virtualization layer can be enhanced, for example by using a CompositeProvider that can be used to join data.
An example of an additional layer in LSA++ is the Open Operational DataStore (ODS) layer. It allows you to push data directly into SAP HANA. Data in the Open ODS layer can be accessed for reporting, so the Open ODS layer replaces the data acquisition and ODS layers that we know from LSA.