Duplicate Payments

In addition to the rule-based duplicate payment detection, dab AnalyticSuite also offers the possibility to enhance the output using machine learning: the results of the analytic are flagged manually, and the algorithm uses these flags to reduce the number of false positives over time in a continuous approach. The audit step is named AP_DuplicatePayments_Enhanced in dab Nexus Content Studio.


Flagging the results

To make use of the machine learning algorithm, the results need to be flagged manually (a brief sketch of this workflow follows the list below). Two tables are required for this:

  1. AP_DuplicatePaymentsFlagged
    This table is generated on every run and contains the entries that need to be flagged as well as the results of the machine learning algorithm indicating whether each entry is estimated to be a false positive. The following columns are relevant here:

    Column: Target_Variable
    Description: The user's manual assessment of whether the entry is a duplicate payment or not. The following values are accepted:
    • New
    • Resolved
    • False Positive
    • Duplicate Payment

    Column: AI_Assessment
    Description: Result of the AI algorithm indicating whether the entry is a duplicate payment.

    Column: AI_Potential
    Description: Probability that the entry is a true duplicate payment, ranging from 0.0 to 1.0.

  2. AP_DuplicatePaymentsPool
    This table contains the previously flagged duplicate payments saved for use in the next analytic run. It is updated on every run with the newly flagged entries. If you delete the table, you will lose all your progress on already flagged entries.
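For illustration, the sketch below shows one way the manual review of AP_DuplicatePaymentsFlagged could be scripted. It assumes the table content has been exported to a CSV file and that c_UniqueId identifies an entry; the file name, the key column, and the example id are assumptions for this sketch, while the table and column names otherwise follow the description above.

```python
import pandas as pd

# Assumption: AP_DuplicatePaymentsFlagged has been exported to CSV for review.
flagged = pd.read_csv("AP_DuplicatePaymentsFlagged.csv")

# Accepted values for Target_Variable (see the column description above).
ACCEPTED = {"New", "Resolved", "False Positive", "Duplicate Payment"}

# Look at the most suspicious entries first: AI_Potential ranges from 0.0 to 1.0.
review_queue = flagged.sort_values("AI_Potential", ascending=False)
print(review_queue[["c_UniqueId", "AI_Assessment", "AI_Potential"]].head(10))

def set_assessment(df: pd.DataFrame, unique_id: str, value: str) -> None:
    """Record a manual assessment for one entry (key column is an assumption)."""
    if value not in ACCEPTED:
        raise ValueError(f"{value!r} is not an accepted Target_Variable value")
    df.loc[df["c_UniqueId"] == unique_id, "Target_Variable"] = value

# The id below is a placeholder, not a real entry.
set_assessment(flagged, "EXAMPLE-ID-0001", "False Positive")

# Write the flagged results back so they can be fed into the next run.
flagged.to_csv("AP_DuplicatePaymentsFlagged.csv", index=False)
```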

For further information and details regarding the columns and their reference to the underlying data, refer to the analytic step documentation provided by your contact at dab.


Setting up the analytic in dab Nexus for continuous execution

The AI-enhanced duplicate payment analytic is meant for continuous execution, not as a one-time run. The following steps should be considered when running it in a continuous scenario. Refer to the dab Nexus documentation for details.

  1. Create a data strategy that always uses the same database for every run
  2. Create an analytic group containing AP_DuplicatePayments_Enhanced
  3. Set up an analytic task using the created data strategy and analytic group
  4. You are good to go!
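The reason step 1 matters is that AP_DuplicatePaymentsPool lives in that database and accumulates the flags from every run. Conceptually, each run carries the reviewed entries over into the pool roughly as in the sketch below; the actual merge is handled by the analytic itself, so this is only an illustrative assumption using pandas DataFrames in place of the two tables, with c_UniqueId assumed as the key.

```python
import pandas as pd

def update_pool(pool: pd.DataFrame, flagged: pd.DataFrame) -> pd.DataFrame:
    """Conceptual sketch: carry manually flagged entries over into the pool.

    `pool` stands in for AP_DuplicatePaymentsPool, `flagged` for
    AP_DuplicatePaymentsFlagged; key column and merge logic are assumptions
    for illustration only.
    """
    # Only entries that received a manual assessment are worth keeping.
    reviewed = flagged[flagged["Target_Variable"].notna()]
    # Append them to the pool, keeping the latest assessment per entry.
    combined = pd.concat([pool, reviewed], ignore_index=True)
    return combined.drop_duplicates(subset="c_UniqueId", keep="last")
```

If the data strategy pointed to a different database on the next run, the previously built-up pool would not be available there, which is why step 1 asks for the same database every time.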

Avoiding execution failures

For proper operation, the analytic AP_DuplicatePayments needs to have the following columns selected in Content Studio. If they are not selected, the run of this analytic will fail (a simple completeness check is sketched after the list):

  • c_UpdateKey
  • c_UniqueId
  • BSAK_AUGDT
  • BSAK_WAERS
  • T001_WAERS
  • BSAK_BUKRS
  • BSAK_GJAHR
  • c_Posting_Status
  • c_BKPF_USNAM_DEPARTMENT
  • c_Clearing_Department
  • BSAK_ZTERM
  • LFA1_SORTL
  • LFA1_LAND1
  • c_Status_PO
  • LFA1_KTOKK
  • SKAT_TXT20
  • BSAK_BELNR
  • BSAK_SGTXT
  • BSAK_HKONT
  • c_FinalDueDate
  • LFA1_NAME1
  • LFA1_ORT01
  • BSAK_XBLNR
  • BSAK_BUDAT
  • BSAK_CPUDT
  • BKPF_USNAM
  • c_BKPF_USNAM_Clearing
  • BKPF_TCODE
  • c_BKPF_TCODE_Clearing
  • BKPF_AWTYP
  • BSAK_BLART
  • c_BKPF_BLART_Clearing
  • BKPF_DBBLG
  • BSAK_ZFBDT
  • BKPF_STBLG
  • BSAK_MWSKZ
  • BKPF_CPUTM
  • c_BSAK_DMBTR_LC
  • c_BSAK_DMBTR_RC
  • BSAK_BLDAT
  • BSAK_ZLSCH

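To catch a missing column before a run fails, the required names can be compared against the current selection. The sketch below is a minimal check assuming the selected column names are available as a plain Python list; how that list is obtained from Content Studio is outside the scope of this example, and the required set shown is abridged.

```python
# Required columns for AP_DuplicatePayments (from the list above, abridged).
REQUIRED_COLUMNS = {
    "c_UpdateKey", "c_UniqueId", "BSAK_AUGDT", "BSAK_WAERS", "T001_WAERS",
    "BSAK_BUKRS", "BSAK_GJAHR", "c_Posting_Status", "BSAK_BELNR",
    "BSAK_XBLNR", "BSAK_BUDAT", "BKPF_USNAM", "c_BSAK_DMBTR_LC",
    # ... extend with the remaining entries from the list above
}

def missing_columns(selected: list[str]) -> set[str]:
    """Return the required columns that are not part of the current selection."""
    return REQUIRED_COLUMNS - set(selected)

# Example: 'selected' would come from your Content Studio configuration.
selected = ["c_UpdateKey", "c_UniqueId", "BSAK_AUGDT"]
gap = missing_columns(selected)
if gap:
    print("Missing columns, the analytic run would fail:", sorted(gap))
```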
