Duplicate Payments

In addition to the rule-based duplicate payment detection, dab AnalyticSuite also offers the possibility to enhance the output using machine learning: by flagging the output of the analytic, the number of false positives is reduced in a continuous approach. The audit step is named AP_DuplicatePayments_Enhanced in dab Nexus Content Studio.

Flagging the results

To make use of the machine learning algorithm, the results need to be flagged manually. Two tables are required for this:

  1. AP_DuplicatePaymentsFlagged
    This table is generated on every run and contains the entries which need to be flagged, along with the machine learning algorithm's estimate of whether each entry is a false positive. The following columns are relevant here (a flagging sketch follows after the two tables):

    Column           Description
    ---------------  ------------------------------------------------------------
    Target_Variable  The manual assessment by the user of whether this is a
                     duplicate payment or not; the following values are accepted:
                     • No Duplicate Payment
                     • True Duplicate Payment
    AI_Assessment    Result of the AI algorithm indicating whether this is a
                     duplicate payment
    AI_Potential     Probability that this is a true duplicate payment, in the
                     range from 0.0 to 1.0

  2. AP_DuplicatePaymentsPool
    This table contains the previously flagged duplicate payments, saved for use in the next analytic run. It is continuously updated at every run with the newly flagged entries. If you delete the table, you will lose all your progress on already flagged entries.
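
To illustrate the flagging step, below is a minimal sketch of how the manual assessment could be written back to AP_DuplicatePaymentsFlagged from Python. The table and column names are taken from above; everything else is an assumption: SQL Server reached via pyodbc, a placeholder connection string, c_UniqueId as the row key, and unflagged rows carrying NULL in Target_Variable. Adapt these to your environment.

    # Flagging sketch -- assumptions: SQL Server via pyodbc, placeholder
    # connection string, c_UniqueId as row key, NULL = not yet flagged.
    import pyodbc

    CONN_STR = (  # assumed; replace with your actual connection details
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-server;DATABASE=your-db;Trusted_Connection=yes"
    )

    # The two accepted values for Target_Variable, as documented above.
    NO_DUPLICATE = "No Duplicate Payment"
    TRUE_DUPLICATE = "True Duplicate Payment"

    def entries_to_review(threshold: float = 0.8):
        """Return unflagged entries the model scores as likely duplicates."""
        with pyodbc.connect(CONN_STR) as conn:
            return conn.cursor().execute(
                "SELECT c_UniqueId, AI_Assessment, AI_Potential "
                "FROM AP_DuplicatePaymentsFlagged "
                "WHERE Target_Variable IS NULL AND AI_Potential >= ?",
                threshold,
            ).fetchall()

    def flag_entries(assessments: dict) -> None:
        """Write manual verdicts (c_UniqueId -> accepted value) back."""
        with pyodbc.connect(CONN_STR) as conn:
            cur = conn.cursor()
            for unique_id, verdict in assessments.items():
                cur.execute(
                    "UPDATE AP_DuplicatePaymentsFlagged "
                    "SET Target_Variable = ? WHERE c_UniqueId = ?",
                    verdict, unique_id,
                )
            conn.commit()

Entries flagged this way are then picked up into AP_DuplicatePaymentsPool on the next run, as described above.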

For further information and details regarding the columns and their reference to the underlying data, refer to the analytic step documentation provided by your contact at dab.

Setting up the analytic in dab Nexus for continuous execution

The AI-enhanced duplicate payment analytic is meant for continuous execution, not as a one-time run. The following steps should be considered when running it in a continuous scenario; refer to the dab Nexus documentation for details.

  1. Create a data strategy that always uses the same database for every run (a sanity check for this follows after these steps)
  2. Create an analytic group containing AP_DuplicatePayments_Enhanced
  3. Set up an analytic task using the created data strategy and analytic group
  4. You are good to go!
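
Because the pool table must survive between runs, it can be worth verifying after each scheduled execution that AP_DuplicatePaymentsPool is still in the same database and keeps accumulating entries. The sketch below is one way to do that; the pyodbc connection and the local log file are assumptions of this sketch, not part of dab Nexus.

    # Sanity check -- the pool must persist between runs; compare the current
    # row count against the last recorded one. The pyodbc setup and the log
    # file are assumptions of this sketch.
    import pyodbc
    from datetime import datetime, timezone

    CONN_STR = (  # assumed; replace with your actual connection details
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-server;DATABASE=your-db;Trusted_Connection=yes"
    )
    LOG_FILE = "pool_size.log"  # hypothetical run-over-run record

    def check_pool_growth() -> int:
        """Record the row count of AP_DuplicatePaymentsPool; warn on shrink."""
        with pyodbc.connect(CONN_STR) as conn:
            count = conn.cursor().execute(
                "SELECT COUNT(*) FROM AP_DuplicatePaymentsPool"
            ).fetchone()[0]

        previous = None
        try:
            with open(LOG_FILE) as fh:
                lines = fh.read().splitlines()
            if lines:
                previous = int(lines[-1].split("\t")[1])
        except FileNotFoundError:
            pass  # first run, nothing to compare against

        with open(LOG_FILE, "a") as fh:
            fh.write(f"{datetime.now(timezone.utc).isoformat()}\t{count}\n")

        if previous is not None and count < previous:
            print(f"WARNING: pool shrank from {previous} to {count} rows; "
                  "was the table deleted or a different database used?")
        return count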

Avoiding execution failures

For proper operation, the analytic AP_DuplicatePayments needs the following columns selected in Content Studio; if they are not selected, the analytic will crash when run (a column check sketch follows after this list):

  • c_UpdateKey
  • c_UniqueId
  • BSAK_AUGDT
  • BSAK_WAERS
  • T001_WAERS
  • BSAK_BUKRS
  • BSAK_GJAHR
  • c_Posting_Status
  • c_BKPF_USNAM_DEPARTMENT
  • c_Clearing_Department
  • BSAK_ZTERM
  • LFA1_SORTL
  • LFA1_LAND1
  • c_Status_PO
  • LFA1_KTOKK
  • SKAT_TXT20
  • BSAK_BELNR
  • BSAK_SGTXT
  • BSAK_HKONT
  • c_FinalDueDate
  • LFA1_NAME1
  • LFA1_ORT01
  • BSAK_XBLNR
  • BSAK_BUDAT
  • BSAK_CPUDT
  • BKPF_USNAM
  • c_BKPF_USNAM_Clearing
  • BKPF_TCODE
  • c_BKPF_TCODE_Clearing
  • BKPF_AWTYP
  • BSAK_BLART
  • c_BKPF_BLART_Clearing
  • BKPF_DBBLG
  • BSAK_ZFBDT
  • BKPF_STBLG
  • BSAK_MWSKZ
  • BKPF_CPUTM
  • c_BSAK_DMBTR_LC
  • c_BSAK_DMBTR_RC
  • BSAK_BLDAT
  • BSAK_ZLSCH
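
Since a missing column only surfaces as a crash at run time, it can help to verify the selection up front. The sketch below compares the required list against the columns actually present in a database table; which table to point it at depends on your setup (the name used here is hypothetical), and the pyodbc connection is again an assumption.

    # Pre-run check: confirm every required column exists before executing
    # the analytic (assumed pyodbc/SQL Server setup; TABLE is hypothetical).
    import pyodbc

    CONN_STR = (  # assumed; replace with your actual connection details
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-server;DATABASE=your-db;Trusted_Connection=yes"
    )
    TABLE = "AP_DuplicatePayments_Input"  # hypothetical -- use your table

    REQUIRED = [
        "c_UpdateKey", "c_UniqueId", "BSAK_AUGDT", "BSAK_WAERS", "T001_WAERS",
        "BSAK_BUKRS", "BSAK_GJAHR", "c_Posting_Status",
        "c_BKPF_USNAM_DEPARTMENT", "c_Clearing_Department", "BSAK_ZTERM",
        "LFA1_SORTL", "LFA1_LAND1", "c_Status_PO", "LFA1_KTOKK", "SKAT_TXT20",
        "BSAK_BELNR", "BSAK_SGTXT", "BSAK_HKONT", "c_FinalDueDate",
        "LFA1_NAME1", "LFA1_ORT01", "BSAK_XBLNR", "BSAK_BUDAT", "BSAK_CPUDT",
        "BKPF_USNAM", "c_BKPF_USNAM_Clearing", "BKPF_TCODE",
        "c_BKPF_TCODE_Clearing", "BKPF_AWTYP", "BSAK_BLART",
        "c_BKPF_BLART_Clearing", "BKPF_DBBLG", "BSAK_ZFBDT", "BKPF_STBLG",
        "BSAK_MWSKZ", "BKPF_CPUTM", "c_BSAK_DMBTR_LC", "c_BSAK_DMBTR_RC",
        "BSAK_BLDAT", "BSAK_ZLSCH",
    ]

    def missing_columns() -> set:
        """Return required columns absent from TABLE."""
        with pyodbc.connect(CONN_STR) as conn:
            rows = conn.cursor().execute(
                "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
                "WHERE TABLE_NAME = ?", TABLE,
            ).fetchall()
        present = {row[0] for row in rows}
        return set(REQUIRED) - present

    if __name__ == "__main__":
        gap = missing_columns()
        if gap:
            print("Select these columns first:", ", ".join(sorted(gap)))
        else:
            print("All required columns present.")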

