Data archiving: customer and vendor master data

This blog explains how to archive customer and vendor master data via the objects FI_ACCRECV and FI_ACCPAYB. The generic technical setup must already have been executed; it is explained in this blog.

This archiving is mostly used to delete customers and vendors from the system that were created wrongly.

The text below mainly focuses on a traditional ECC system. In an S4HANA system, both customers and vendors are integrated as business partners. For archiving the customer and/or vendor sections of business partners, read OSS note 3321585 – Archiving for Business Partner and Customer / Suppliers.

If you also want to archive/delete the LFC1 and KNC1 tables, implement the FI_TF_DEB and FI_TF_CRE archiving objects as well.

Object FI_ACCRECV (customers)

Go to transaction SARA and select object FI_ACCRECV (customers).

Dependency schedule:

There are a lot of dependencies: everywhere a customer number is used in an object. This makes it almost impossible to archive a customer master record. Still, it can be done to delete wrongly created master data, as long as no transaction data has been created yet.

Main tables that are archived:

  • KNA1: General customer master data
  • KNB1: Company code specific customer master data

Object FI_ACCPAYB (vendors)

Go to transaction SARA and select object FI_ACCPAYB (vendors).

Dependency schedule:

There are quite some dependencies: everywhere a vendor number is used in an object. This makes it almost impossible to archive a vendor master record. Still, it can be done to delete wrongly created master data, as long as no transaction data has been created yet.

Main tables that are archived:

  • LFA1: General vendor master data
  • LFB1: Company code specific vendor master data

Technical programs and OSS notes

Write program customers: FI_ACCRECV_WRI

Delete program customers: FI_ACCRECV_DEL

Write program vendors: FI_ACCPAYB_WRI

Delete program vendors: FI_ACCPAYB_DEL

Relevant OSS notes:

Application specific customizing

There is no application specific customizing for customer and vendor archiving. You can use XD06 for customer master deletion flag setting and XK06 for vendor master deletion flag setting.

Executing the write run and delete run: customers

For customers: in transaction SARA, select the write run for object FI_ACCRECV:

Important here are the validation links and the deletion indicator. The customer deletion indicator can be set with transaction XD06.

Select your data, save the variant and start the archiving write run.

There is a sequence inconsistency: the online help gives the sequence FI, SD, general, while OSS note 788105 – Archiving FI_ACCRECV gives the sequence SD, FI, general.

You have to do the run three times: for FI, SD and general.

The deletion run is standard: select the archive file and start the deletion run.

Executing the write run and delete run: vendors

For vendors: in transaction SARA, select the write run for object FI_ACCPAYB:

Important here are the validation links and the deletion indicator. The vendor deletion indicator can be set with transaction XK06.

Select your data, save the variant and start the archiving write run.

You have to do the run three times: for FI, MM and general. A sequence is not given in the OSS note, nor in the online help.

The deletion run is standard: select the archive file and start the deletion run.

Load balancing analysis tool

With SAP note 3515065 – Load Balancing Analysis, SAP delivers a new load balancing analysis tool.

Prerequisites

There are two prerequisites for the new load balancing analysis tool to work:

  1. Install OSS note 3515065 – Load Balancing Analysis
  2. Make sure snapshot monitoring is active (read this blog on activation)

Running the tool

To start the tool, go to transaction SE38 and start program /SDF/RSLOADANALYSIS.

Selection screen:

Select the date range you want to analyze. The delta factor is the percentage deviation from the average that the tool uses to decide whether the load is balanced: with a factor of 10, a server deviating more than 10% from the average is flagged as not balanced. The default of 10 is a bit too low; only 10% difference from the average is too idealistic. Increase it for a more realistic result.
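Besides starting the program manually, you can also schedule the analysis as a recurring background job. Below is a minimal sketch, assuming the default selection values are sufficient; the job name is just an example:

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZLOAD_ANALYSIS', "example name
      lv_jobcount TYPE tbtcjob-jobcount.

* Open a background job, submit the analysis program into it,
* then close the job with an immediate start.
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT /sdf/rsloadanalysis
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = 'X'.  "start immediately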

The output screen has three parts.

The first part is the load balancing analysis.

An overview is given on batch server groups, logon groups and RFC server groups. You can see which groups are defined, and how they are distributed over the application servers.

The second part is the work process analysis part.

Here you can see how load is distributed over the application servers using the snapshot monitoring statistics. The central instance can be excluded from the load balancing and hence show as ‘not balanced’.

The third part is host machine data.

Here you can see whether the servers have equal CPU power and memory. If there is no data for a server, check in ST06 whether it is configured properly.

It can be that CPU and memory are identical, but that older infrastructure was used for some servers. Then the CPU and memory look the same, but there can still be a significant difference in CPU speed and memory speed. To rule this out, run the ABAPMETER tool.

SAP GUI for slow or remote network

Sometimes SAP users are far away from the server, which causes high latency. For a global SAP system this is unavoidable. In some cases you might need to support a remote location that has a slow and/or low bandwidth connection.

In that case, it is best to set up the SAP GUI connection to use the Low Speed Connection setting.

The default is as shown above. For low speed users, ask them to select the Low Speed Connection option.

Some minor usability functions will be lost (see OSS note 161053 – Use of SAP GUI in WAN):

But overall, the performance gain will normally outweigh these minor setbacks.

SAP help file: reference.

/SDF/SMON_DISPLAY to display snapshot monitoring data

The snapshot monitor tool captures a lot of good data, but displaying it can be a bit harder. This is where /SDF/SMON_DISPLAY helps.

Generic OSS note for this display is: 3210905 – Display Snapshot Monitor Data.

Setting the link to plotly upfront

Before /SDF/SMON_DISPLAY works, you have to set a link to the plotly library. You can do this for all users, or for your personal user only, by setting an SU3 parameter:

Using /SDF/SMON_DISPLAY

Simply start transaction /SDF/SMON_DISPLAY:

Fill out the measurements you want to see and the last n minutes. The results are automatically shown in a separate window:

Extra enhanced functions

Extra functions are released in new OSS notes:

SICF tips and tricks

SICF is an abbreviation for SAP Internet Communication Framework.

It is used to expose internet services like SAP ABAP Web Dynpro, OData, etc.

Checking active services

As per OSS note 1555208 – ICF services become inactive after upgrade or SP update, you can find the list of active services with the report RS_ICF_SERV_ADMIN_TASKS (choose the option Export of Active Services into CSV file).

On table level: check the table ICFSERVLOC. All active services are marked with an "X" flag.
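If you want this list programmatically, a simple SELECT does the job. A minimal sketch, assuming field ICF_NAME holds the service name and ICFACTIVE is the active flag described above:

* List all active SICF services from table ICFSERVLOC.
* Assumption: field names ICF_NAME and ICFACTIVE as described above.
SELECT icf_name
  FROM icfservloc
  WHERE icfactive = 'X'
  INTO TABLE @DATA(lt_active_services).

LOOP AT lt_active_services INTO DATA(ls_service).
  WRITE: / ls_service-icf_name.
ENDLOOP.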

Mass processing

SICF mass processing is done via program RS_ICF_SERV_MASS_PROCESSING.
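If you need to trigger it from your own code, a plain SUBMIT is enough; the selection parameters differ per release, so this sketch simply shows the selection screen:

* Start the mass processing report with its selection screen.
SUBMIT rs_icf_serv_mass_processing VIA SELECTION-SCREEN AND RETURN.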

Logging of SICF changes

To enable logging of SICF changes: switch on table logging for table ICFSERVLOC.
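Note that table logging only takes effect if it is also enabled at system level. The settings involved (the rec/client value is an example; align it with your client strategy):

  1. In SE13, for table ICFSERVLOC, tick Log Data Changes in the technical settings.
  2. Enable table logging in the instance profile, for example: rec/client = ALL
  3. Display the resulting change log with transaction SCU3.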

Various OSS notes around SICF

SAP Focused Run APIs

SAP Focused Run offers some nice APIs that you can use and re-use.

API’s available:

  • LMDB API
  • Work mode management API
  • Guided procedure API
  • Advanced analytics API

LMDB API

The LMDB now has a REST API available to read data in a structured way. You can search for hosts, technical systems, software components, installed product versions and instances.

The full specification for this API can be found on this link.
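As an illustration, such a REST API can be consumed from ABAP with the standard HTTP client. This is only a sketch: the URL below is a placeholder, take the real endpoint path from the API specification:

* Sketch: read a REST endpoint with the standard ABAP HTTP client.
* The URL is a placeholder, not the real LMDB API path.
DATA lo_client TYPE REF TO if_http_client.

cl_http_client=>create_by_url(
  EXPORTING
    url    = 'https://<focused-run-host>/lmdb/api/hosts'  "placeholder URL
  IMPORTING
    client = lo_client ).

lo_client->send( ).
lo_client->receive( ).

* The response payload (e.g. JSON) as a string:
DATA(lv_payload) = lo_client->response->get_cdata( ).
WRITE: / lv_payload.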

Ad hoc work mode creation via function module

Via function module FM_START_ADHOC_WORKMODE you can create an ad-hoc work mode to stop monitoring for a system. You can start monitoring again by stopping the work mode by calling function module FM_STOP_ADHOC_WORKMODE.

The full specification of all the work mode APIs can be found in the PDF attached to OSS note 2508346 – Work Mode Management API Documentation for Focused Run.
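A minimal sketch of calling the start function module from ABAP; the parameter names used here are hypothetical and for illustration only, so take the real interface from SE37 and the documentation PDF:

* Sketch only: IV_SYSTEM_ID and IV_DURATION are hypothetical parameter
* names; check the real interface of the function module in SE37.
CALL FUNCTION 'FM_START_ADHOC_WORKMODE'
  EXPORTING
    iv_system_id = 'ABC'  "hypothetical: system to stop monitoring for
    iv_duration  = 60.    "hypothetical: work mode duration in minutes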

Triggering guided procedure via function module

Function module FM_EXTRN_GP_EXEC can be used to call a guided procedure. Unfortunately, you need to pass the GUID of the guided procedure to the function module.

The full guided procedure API can be found on this SAP page.
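A minimal sketch of such a call; the parameter name and GUID value are hypothetical and for illustration only:

* Sketch only: IV_GP_GUID is a hypothetical parameter name; look up
* the real interface of FM_EXTRN_GP_EXEC in SE37.
DATA lv_guid TYPE guid_32.  "GUID of the guided procedure to start

lv_guid = '...'.  "fill with the GUID of your guided procedure

CALL FUNCTION 'FM_EXTRN_GP_EXEC'
  EXPORTING
    iv_gp_guid = lv_guid.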

Advanced Analytics API

The specification for the Advanced Analytics API can be found here.

User based debugging

In some cases you need to debug the session of another user, for example when you need to solve an ABAP issue for a FIORI app. The end user does his work until the break point is reached; then you take over the session using the normal debugging tools. The basic principle is explained in OSS note 1919888 – Debugging the applications of another user, and in this SAP help file.

Prerequisites:

Then let the user start the work. You will take over as soon as the break point is reached.

Checklist for issues can be found in OSS note 2462481 – External debug / breakpoint is not recognized.

Set the user ID to be debugged

For your user ID choose the menu option Utilities/Settings. Then select main tab ABAP editor and subtab Debugging:

Now replace your user with the user name for which you want to take the session over using the external break point.

Custom SCI class for checking AUTHORITY-CHECK statement

Follow the steps explained in this blog to set up a new custom check. We will use these steps to set up an extra SCI class that checks whether the AUTHORITY-CHECK statement is present in ABAP code.

New SCI check coding

Step 1: create the ZCL_CI_SCAN_AUTH class

In SE24, copy class CL_CI_TEST_FREE_SEARCH to ZCL_CI_SCAN_AUTH. In the attributes of the copied class, set the variable C_MY_NAME to 'ZCL_CI_SCAN_AUTH'. Also set the error code MCODE_0001 to 'Z001'.

Step 2: redo the CONSTRUCTOR

Go to the constructor of the new class and overwrite the existing code with this code snippet:

super->constructor( ).

description    = 'Search authority check statement'(001).  "required
category       = 'ZCL_OWN_CHECKS'.                         "required
version        = '000'.                                    "required
has_attributes = c_false.                                  "optional
attributes_ok  = c_false.                                  "optional

DEFINE fill_message.
  CLEAR smsg.
  smsg-test = c_my_name.
  smsg-code = &1.  "message code
  smsg-kind = &2.  "message priority
  smsg-text = &3.  "message text
  smsg-pcom = &4.  "pseudocomment
  INSERT smsg INTO TABLE scimessages.
END-OF-DEFINITION.

fill_message 'Z001' 'E' 'Search authority check statement'(001) ' '.

Don’t forget to double click on the 001 to generate the text message.

Step 3: adapt the RUN code

Now the check itself has to be built in the RUN method:

DATA:
  l_include       TYPE sobj_name,
  l_row           TYPE token_row,
  l_column        TYPE token_col,
  l_tokennr       LIKE statement_wa-from,
  l_code          TYPE sci_errc,
  l_search_string LIKE LINE OF search_strings VALUE 'AUTHORITY-CHECK',
  l_position      TYPE i,
  l_found         TYPE c VALUE ' '.

*  IF search_strings IS INITIAL.
*    RETURN.
*  ENDIF.

IF ref_scan IS INITIAL.
  CHECK get( ) = 'X'.
ENDIF.

CHECK ref_scan->subrc = 0.

*-- loop at all statements
LOOP AT ref_scan->statements INTO statement_wa.
  CHECK statement_wa-from <= statement_wa-to.
  l_position = sy-tabix.
  IF statement_wa-type = 'S' OR
     statement_wa-type = 'P'.
    CHECK comment_mode = 'X'.
  ENDIF.

  LOOP AT ref_scan->tokens INTO token_wa
         FROM statement_wa-from TO statement_wa-to.
    l_tokennr = sy-tabix.
    IF token_wa-type = 'S'.
      CHECK literal_mode = 'X'.
    ENDIF.

*-- does the ABAP token contain the search string?
    IF token_wa-str CP l_search_string.
      UNPACK sy-tabix TO l_code(4).
      l_include = get_include( ).
      l_row     = get_line_abs( l_tokennr ).
      l_column  = get_column_abs( l_tokennr ).
      l_found   = 'X'.
      EXIT.
    ENDIF.

  ENDLOOP.
ENDLOOP.

IF l_found NE 'X'.
  inform( p_sub_obj_type = c_type_include
          p_sub_obj_name = l_include
          p_position     = l_position
          p_line         = l_row
          p_column       = l_column
          p_kind         = 'E'
          p_test         = c_my_name
          p_code         = 'Z001'
          p_suppress     = '"#EC CI_NOAUTH'
          p_param_1      = token_wa-str ).
ENDIF.

Basically the code looks for the statement 'AUTHORITY-CHECK'. If it is found, nothing happens. If it is not found, the check generates a message.

Step 4: generating the message

In the method GET_MESSAGE_TEXT overwrite the code with this new code:

DATA:
  l_code TYPE sci_errc.

IF p_test <> myname OR p_code = c_code_not_remote_enabled.
  super->get_message_text( EXPORTING p_test = p_test
                                     p_code = p_code
                           IMPORTING p_text = p_text ).
  RETURN.
ENDIF.

l_code = p_code.
SHIFT l_code LEFT DELETING LEADING space.
p_text = 'No authority-check statement found'(101).
REPLACE FIRST OCCURRENCE OF '&N' IN p_text WITH l_code.

SCI settings

Use the steps from blog xxx to add the new check to the SCI variant ZTEST.

SCI variant

Test program

We have written a simple test program without AUTHORITY-CHECK.

Test program ZTEST
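A minimal version of such a test program could look like this (a sketch; any report without an AUTHORITY-CHECK statement will do):

* A report that reads data but contains no AUTHORITY-CHECK statement,
* so the custom SCI check should flag it.
REPORT ztest.

DATA lt_kna1 TYPE STANDARD TABLE OF kna1.

SELECT * FROM kna1
  INTO TABLE lt_kna1
  UP TO 10 ROWS.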

When running the SCI with our test variant, this is the result:

SCI check on AUTHORITY-CHECK statement

 

S4HANA conversion downtime

When converting from ECC 6.0 EHP8 to S4HANA, you will face significant downtime: in most cases a whole weekend or more.

To get insights into your estimated business downtime, first run the SAP S4HANA Readiness Check.

On the results website, scroll to the Planned Downtime Calculator tile:

Now the details will show the total estimated downtime split into phases:

Phases:

  • System Ramp-Down
  • Downtime Preparations
  • Technical Downtime (SUM)
  • Technical Postprocessing and Data Conversion Preparation
  • Finance and Material Ledger Data Conversion
  • Application Postprocessing
  • Business Validation
  • Go/No-Go Decision
  • System Ramp-Up
  • Fallback Buffer

Each of the phases is described, and an estimated number of hours is assigned to it. Some estimates are empirical, some are based on your data volume.

You can update the times in the graph by entering the numbers relevant for your situation and then pressing Save: