We were asked to assist a client with their S/4HANA brownfield upgrade with respect to their Neptune Software application portfolio. This is how we leveraged AI and basic Python parsing to help estimate development effort and the associated remediations.

Which Neptune Apps Are Being Used?

Neptune provides a number of analysis tools out of the box. Here we ran the application usage analysis to understand which apps actually needed to be migrated (a lot of custom code is never used).

We Built a Remediation Estimation Tool

As we were quoting on the effort, we did not yet have VPN access to the client’s system, so we asked them to provide the front-end Neptune apps (XML) as well as the associated back-end data provider ABAP classes.

With just these two artifacts per app we were able to provide a complexity rating and a development estimate to remediate the applications and make sure they run as cleanly as possible in the S/4 environment.
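The parsing step can be sketched roughly as follows. This is a minimal illustration, not the actual tool: the legacy-pattern list, scoring weights, and thresholds are hypothetical stand-ins for the real heuristics, which combine the Python parse with the LLM analysis.

```python
import re
import xml.etree.ElementTree as ET

# Illustrative legacy patterns to flag in a data provider class.
# The real tool uses a far richer set, informed by the LLM and the
# SAP cloudification repository.
LEGACY_PATTERNS = {
    "direct_table_select": re.compile(
        r"\bSELECT\b[^.]*\bFROM\s+(EKKO|EKPO|MARA|VBAK)\b", re.I | re.S
    ),
    "obsolete_bapi": re.compile(
        r"\bCALL\s+FUNCTION\s+'BAPI_PO_GETDETAIL'", re.I
    ),
}

def score_app(xml_source: str, abap_source: str) -> dict:
    """Rough complexity estimate from one Neptune app XML + its ABAP class."""
    root = ET.fromstring(xml_source)
    ui_elements = sum(1 for _ in root.iter())  # size of the UI tree
    hits = {name: len(rx.findall(abap_source))
            for name, rx in LEGACY_PATTERNS.items()}
    score = ui_elements // 10 + 5 * sum(hits.values())
    band = "low" if score < 10 else "medium" if score < 25 else "high"
    return {"ui_elements": ui_elements, "legacy_hits": hits, "complexity": band}

abap = "SELECT * FROM ekko INTO TABLE @DATA(lt_po)."
xml = "<App><Page><Table/><Button/></Page></App>"
print(score_app(xml, abap))
```

A score like this is only the mechanical input; the LLM pass then turns the flagged findings into remediation proposals.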

The tool also proposes code snippets and successor objects to use (more on that later in the blog):

SELECT PurchaseOrder, CompanyCode, Supplier,
       PurchasingOrganization, PurchasingGroup
  FROM I_PurchaseOrderAPI01
  INTO TABLE @DATA(lt_po_header)
  WHERE PurchaseOrder IN @lt_po_numbers.

Short Video Demo

Here is a short video of how the analysis tool works:

Technical Details

We used FastHTML, SQLite, an Anthropic LLM, and an external MCP server from Clement Ringot that introspects the SAP cloudification repository.

The MCP server provides the S/4 CDS views and other function modules and objects that should replace the legacy calls in your data provider class.
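Conceptually, the lookup the MCP server performs is a mapping from classic objects to their released successors. The sketch below uses a hard-coded dictionary as a stand-in; the real cloudification repository ships JSON release-state files whose exact schema differs, and the object pairs shown are illustrative.

```python
# Assumed, simplified shape of the cloudification-repository data:
# (object type, classic object name) -> (successor type, successor name).
SUCCESSORS = {
    ("TABL", "EKKO"): ("DDLS", "I_PurchaseOrderAPI01"),
    ("FUNC", "BAPI_PO_GETDETAIL"): ("DDLS", "I_PurchaseOrderAPI01"),
}

def successor(obj_type: str, obj_name: str):
    """Look up the released S/4 successor for a classic object, if any."""
    return SUCCESSORS.get((obj_type, obj_name.upper()))

print(successor("TABL", "ekko"))
```

Each legacy hit found in the parsing step gets run through a lookup like this, and the successor object feeds into the proposed code snippet.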

We built it as a multi-tenant service, so we can easily provision access for potential clients. One feature we designed in from the start is the ability to download the Markdown or Excel files for use in other tools or documentation. We provide default instruction files, which can then be tailored to each client’s specific scenario.
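The Markdown export is the simplest part to show. Here is a minimal sketch that renders per-app findings as a Markdown table; the field names are illustrative, and the Excel variant (e.g. via openpyxl or pandas) is omitted.

```python
def to_markdown(rows: list[dict]) -> str:
    """Render analysis rows as a Markdown table for download."""
    headers = ["app", "complexity", "effort_days"]
    lines = [
        "| " + " | ".join(headers) + " |",
        "|" + "|".join("---" for _ in headers) + "|",
    ]
    for row in rows:
        lines.append("| " + " | ".join(str(row[h]) for h in headers) + " |")
    return "\n".join(lines)

report = to_markdown([
    {"app": "Z_PO_APPROVAL", "complexity": "medium", "effort_days": 4},
    {"app": "Z_GOODS_RECEIPT", "complexity": "low", "effort_days": 1},
])
print(report)
```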

Other Analysis Types

We are already using this for other analysis scenarios. We are working on custom instructions for the new Neptune SAP Integration Hub offering, to create leaner API footprints within SAP (that is probably worth another blog post). Each analysis has its own data pipeline and custom LLM instructions for specific output.
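One way to picture the per-analysis setup is a small registry that pairs each analysis type with its own parsing pipeline and instruction file. This is a hypothetical sketch of the idea, not the tool's actual internals; all names and paths are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AnalysisType:
    name: str
    instructions_file: str            # default LLM instructions, tailorable per client
    pipeline: Callable[[str], dict]   # parsing step that feeds the LLM

def neptune_pipeline(source: str) -> dict:
    # Placeholder pipeline: a real one would parse the XML/ABAP inputs.
    return {"lines": source.count("\n") + 1}

REGISTRY = {
    "s4_remediation": AnalysisType(
        "s4_remediation", "instructions/s4.md", neptune_pipeline
    ),
}

analysis = REGISTRY["s4_remediation"]
print(analysis.pipeline("line1\nline2"))
```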

This Is Not a Silver Bullet

This will never give you a perfect answer; you always need an experienced architect to review and sense-check the analysis.

If your data provider class is well designed, a lot of the logic will be hidden in deeper calls that a two-file analysis cannot see. In that case, using online analysis tools like the ABAP VSP tool would make a lot of sense.

If you are interested in running an analysis, get in touch. Of course, my Anthropic key is paying for this, so I don’t want to share access too widely.