
SAP data model design and HANA

By Nick Porter | 05 Aug 2011

Categories: Data and Application integration, Data Warehouse, SAP, Saphir

I just commented on a fascinating article http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/25147 by Jamie Oswald over on the SAP Community Network that relates to my earlier blog on BI and complex data environments.

He amusingly points up the data model optimisation “battle lines” between the BI and developer communities. I quote… “Why are transactional data models really hard to report off of you say? Because application developers build (currently out of necessity) really crappy data models. Data is stored a hundred times in a hundred ways so that the user experience is fast but the master data management is a painful process.”

He goes on, “Data is stored in painfully normalized (or denormalized – typically whichever makes the least sense from an analytics perspective) which makes ugly, inefficient, and often not-reliably-correct multipass queries necessary when trying to actually analyze the data. Finally, data is stored in [please feel free to insert your own “trying to report off of a transactional database” horror story in here]. The bottom line is typically that app guys/girls need to design their database in such a way that reporting folks throw up in their mouths a little bit when they see the schema because no matter what the application needs to respond FAST.”
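Jamie’s point about join-heavy reporting can be sketched with a toy example. The tables and names below are entirely invented for illustration – they are not the real SAP tables – but they show the pattern: an OLTP model splits an order across header and item tables with master data held separately, so even a simple “revenue per customer” report becomes a multi-table join.

```python
import sqlite3

# Hypothetical miniature OLTP schema (illustrative only, not SAP's):
# orders are split across a header and an items table, with customer
# master data in its own table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers    (cust_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE order_header (order_id INTEGER PRIMARY KEY, cust_id INTEGER);
CREATE TABLE order_item   (order_id INTEGER, item_no INTEGER, amount REAL);

INSERT INTO customers    VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO order_header VALUES (10, 1), (11, 2);
INSERT INTO order_item   VALUES (10, 1, 100.0), (10, 2, 50.0), (11, 1, 75.0);
""")

# Even "revenue per customer" must traverse the header and item levels --
# the kind of join chain a BI developer faces on an OLTP-optimised model.
rows = cur.execute("""
    SELECT c.name, SUM(i.amount)
    FROM customers c
    JOIN order_header h ON h.cust_id  = c.cust_id
    JOIN order_item   i ON i.order_id = h.order_id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 75.0)]
```

In a real SAP landscape the same report can involve many more tables and pooled or clustered storage, which is precisely why the analytic and operational camps pull the design in opposite directions.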

Methinks it’s slightly harsh to consider the SAP operational data model in these terms, but the BI design challenge on top of this complex and optimised data model is very real. As I say in my comment on his blog, we spend our lives helping BI developers interpret this OLTP-optimised data model via the Saphir toolset and advising on methods of data discovery as input to the BI data design.

This issue really is not going away in any short timescale given the investment that SAP (and Oracle Applications) have in their sophisticated and complex data models, particularly as this complexity is increased at the majority of their customer sites by the practice of adding more tables to meet individual design needs.

Jamie floats the idea that HANA may offer some way forward in optimising the data model/design for both operational and BI needs. Personally I can’t see SAP rushing to fundamentally change things for the above reasons, but I’d be interested to hear back from anybody with views on this.

In the meantime we continue to expound the use of good data modelling techniques for BI design, and to refine our tools and methods in support of deep-dive data discovery from complex data sources such as SAP and Oracle Applications.

2 Comments

  1. silwoodblog

    Good point about heritage factors such as pool and cluster tables etc.
    I think it’s important to say that we are not setting out to criticise the SAP data model per se.
    It is what it is, and it copes with an immense range of functionality and extensibility.
    To a large extent this is not an issue when operating in the SAP ecosystem.
    But nowadays the enterprise level development and BI challenges demand that SAP (and other ERP and CRM packages) are as transparent as all other systems and data sources.
    So helping modellers and analysts outside of the SAP team to more easily get to grips with the SAP data model and integrate it with other tools delivers real benefit to many corporate projects.

  2. Dominik Tylczyński

    You are 100% right on the SAP transactional data model. There are a couple more factors that “contribute” to data model quality.
    1. It carries all the legacy of ancient databases, i.e. pool tables and cluster tables
    2. It has been developed inconsistently by different groups of designers, i.e. even cross-application functionality such as status management uses different approaches (compare the status management tables of production orders and handling units)
    3. New functionalities added additional layers to the data model without touching the fundamentals. Hence it resembles the layers of an onion.

    I’m pretty sure the transactional data model is not going to improve substantially, as there is too much investment poured into it and the system is too mammoth to allow swift fundamental change.
