SAP Data Services

Data Services Scenario Questions Part 1

Updated on Oct 02, 2020

In this tutorial we will discuss some scenario-based questions and their solutions using SAP Data Services. This article is mainly meant for Data Services beginners.

This article is one of a series written to showcase solutions to different business scenarios in SAP Data Services. You may browse all the scenarios from the list below.

  1. Cumulative Sum of salaries, department wise
  2. Getting the value from the previous row in the current row
  3. Getting the value from the next row in the current row
  4. Getting total Sum of a value in every row
  5. Cumulative String Concatenation (Aggregation of string)
  6. Cumulative String Aggregation partition by other column
  7. String Aggregation

Consider the following Source data in a flat file:

DEPTNO | SALARY
-------|-------
10     | 1000
20     | 2000
30     | 3000
40     | 4000

Scenario 1: Let's try to load the Cumulative Sum of salaries of the departments into the target table. The target table data should look like below:

DEPTNO | SALARY | CUMULATIVE_SALARY
-------|--------|------------------
10     | 1000   | 1000
20     | 2000   | 3000
30     | 3000   | 6000
40     | 4000   | 10000
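Before building the job, the target logic can be sketched in a few lines of Python (a minimal illustration using the sample data above; the running total carried across rows plays the role of the $PREV_SAL parameter used later in the Data Services custom function):

```python
# Sketch of the cumulative-sum logic the Data flow will implement.
rows = [(10, 1000), (20, 2000), (30, 3000), (40, 4000)]

prev_sal = 0.0  # corresponds to the Data flow parameter initialized to 0.00
target = []
for deptno, salary in rows:
    prev_sal += salary  # accumulate salary of the current row
    target.append((deptno, salary, prev_sal))
```

Each target row carries the department, its salary, and the running total of all salaries processed so far.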

Solution:

1. Let us first define the Source File Format. This same file format will be reused for the next set of scenario questions.

File Format

2. Next we create a new Batch Job, say JB_SCENARIO_DS. Within the Job we create a Data Flow, say DF_SCENARIO_1.

3. At the Data flow level, i.e. Context DF_SCENARIO_1, we insert a new Parameter using the Definitions tab. Let's name it $PREV_SAL, with Data type decimal(10,2) and Parameter type Input.

Parameters - Data flow
Parameter Properties

At the Job level, i.e. Context JB_SCENARIO_DS, we initialize the Parameter $PREV_SAL using the Calls tab. We set the Argument value to 0.00.

Parameters - Job
Parameter Value

4. Next we create a New Custom Function from the Local Object Library. Let's name it CF_CUME_SUM_SAL.

Custom Function

Within the Custom Function Smart Editor, we first insert two Parameters, $CURR_SAL and $PREV_SAL, both with Data type decimal(10,2) and with Parameter types Input and Input/Output respectively.

Custom Function Definition

Also we modify the Return Parameter Data type to decimal(10,2).

5. Next we define the custom function as below and Validate the same.

$PREV_SAL = $CURR_SAL + $PREV_SAL;

Return $PREV_SAL;

The purpose of defining the Parameter and the Custom Function this way is to perform Parameter short-circuiting. Within the function, we set the Input/Output Parameter $PREV_SAL to the sum of salaries up to the current processing row. Because it is of type Input/Output, the calculated running sum is passed back into the Data flow Parameter after each call. So by using a Custom Function we can modify and pass values back to a Data flow Parameter: the Parameter defined at Data flow level is short-circuited with the Input/Output Parameter of the Custom Function.
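The write-back behavior of the Input/Output parameter can be modeled in Python (a hedged sketch, not Data Services code: the Input/Output parameter is represented as a value that the caller passes in and then overwrites with the function's result, so the running total survives between rows):

```python
# Model of CF_CUME_SUM_SAL: the function body mirrors the two-line
# Data Services script defined in step 5.
def cf_cume_sum_sal(curr_sal: float, prev_sal: float) -> float:
    prev_sal = curr_sal + prev_sal  # $PREV_SAL = $CURR_SAL + $PREV_SAL;
    return prev_sal                 # Return $PREV_SAL;

prev_sal = 0.0  # $PREV_SAL initialized to 0.00 at Job level
results = []
for salary in [1000, 2000, 3000, 4000]:
    # Assigning the return value back emulates the Input/Output
    # parameter short-circuiting: the Data flow parameter is updated
    # after every row.
    prev_sal = cf_cume_sum_sal(salary, prev_sal)
    results.append(prev_sal)
```

After the last row, prev_sal holds the grand total, and the intermediate values form the CUMULATIVE_SALARY column.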

6. Let's go back and design the Data flow. First of all we take the File Format defined earlier, from the Local Object Library as Source.

Data flow

7. Next we place a Query transform, say QRY_CUME_SUM. First we select the columns DEPTNO and SALARY from the Schema In of the Query transform and map them to the output.

Next we specify a New Function Call in Schema Out of the Query transform. Choose the Custom Functions from the Function categories and select the Function name CF_CUME_SUM_SAL.

Next we Define Input Parameters. We specify the inputs as below:

$CURR_SAL = FF_SRC_DEPT.SALARY

$PREV_SAL = $PREV_SAL
Function Input Parameters

Select the Return column as the Output Parameter.

Query transform

8. Finally we place a Template Table as Target in the Target Datastore.

Data Preview

Click here to read the next scenario - Getting the value from the previous row in the current row