5 Best Practices for SAP Master Data Governance

By David Loshin, President, Knowledge Integrity, Inc.
Sponsored by Winshuttle, LLC. © 2012 Winshuttle, LLC. All rights reserved. 4/12

Introduction

In most organizations, business application development is aligned by corporate function, such as sales or finance, or by line of business. However, many key business activities, such as order-to-cash or procure-to-pay, cut across functional and organizational boundaries. To be effective, applications that automate these cross-functional activities must be able to share and exchange data among multiple domains. However, the mismatch between siloed application development and cross-functional business processes often causes the same or similar concepts to be modeled differently, creating differences in structure and meaning that can lead to difficulties in data integration and sharing across functions.

Fortunately, a set of disciplines referred to as master data management (MDM) has been developed for creating shared views of uniquely identifiable data concepts. MDM is particularly useful for sharing data across cross-functional processes such as those implemented within ERP applications. But there are potentially many touch points for shared master objects (the customer master and the material master are both good examples), and coordinating among the different participants across the different steps of workflow processes requires some oversight. That oversight is delivered in the form of a Master Data Governance (MDG) program that is centralized within the organization. In this paper we look at some best practices that effectively facilitate:

- Collecting business requirements for master data oversight
- Capturing and managing master data business rules
- Managing the processes for creating and updating master data entities

Motivating Factors

Master data management disciplines are intended to support the sharing of high-quality master data repositories.
However, solving the infrastructure problems without addressing policies and processes can often lead to delays or even project failures. Introducing Master Data Governance can address challenges such as:

- Collective data expectations: In a siloed environment, each data system is engineered to meet the data quality and usability needs of the business function it supports. However, when a master data system is deployed, the shared records must meet the needs of all the consumers of the data asset. That means there must be well-defined processes in place for identifying who the data consumers are, how they plan to use the data, and what their data quality requirements are. In turn, those business rules must be centrally managed to ensure that everyone's needs are met.

- Consolidation vs. shared use: There is a big difference between using a master repository as a channel for consolidating data dumped from multiple sources and managing it as a channel for effective data sharing that supports end-to-end processes. The former approach is prone to inconsistency and errors, whereas focusing on the usage characteristics of a shared data asset will improve productivity, reduce errors, and speed time to value. By necessity, though, the shared-use perspective requires oversight and governance.

- Employing a collaborative operational model: In usage scenarios that have many data touch points performed by different individuals along the way, there is a need to oversee the sharing of the data associated with workflow processes that cross functional lines.

Five Best Practices for Master Data Governance

Master Data Governance establishes policies and processes for the creation and maintenance of master data. Observing these policies will establish a level of trust in the quality and usability of master data.
In order to institute policies governing the creation and maintenance of master data, we recommend the following best practices; each is intended to address the challenges of centrally managing shared master data.

1: Establish a Master Data Governance Team and Operating Model

Recognize that most business processes cross functional boundaries. Multiple individuals are involved in the collaborative workflow processes for creating master data as well as in executing the business processes. But because organizations are typically organized around function, a question arises as to who owns the cross-functional master data processes. To ensure that those processes are not impeded by data errors or process failures, the first best practice we recommend is that executive management mandate the creation of a central data governance team. This data governance team should be composed of representatives from each area of the business and be vested with the authority to:

- Define and approve policies governing the lifecycle for master data to ensure data quality and usability.
- Oversee the process workflows that touch master data.
- Define and manage business rules for master data.
- Inspect and monitor compliance with defined master data policies.
- Notify individuals when data errors or process faults are putting the quality and usability of the data at risk.

2: Identify and Map the Production and Consumption of Master Data

By virtue of sharing data managed within a master data repository, we are able to benefit from reuse of the data even though it may be stored and managed in a single central location. The quality and usability characteristics of the master data must be suitable for all of its multiple purposes. This means that you will need to ensure that cross-functional requirements are identified from all master data consumers and that those requirements can be observed throughout the data lifecycles associated with the cross-functional processes.
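One concrete (and purely illustrative) way to capture these cross-functional requirements is to record, for each workflow step, which master objects it produces and which it consumes. The sketch below assumes invented step and object names; it is not based on any specific SAP process.

```python
# Hypothetical "production/consumption map" for an order-to-cash style process.
# Each entry: (workflow step, master objects it produces, master objects it consumes).
ORDER_TO_CASH_MAP = [
    ("create_customer",    {"customer_master"}, set()),
    ("create_sales_order", set(), {"customer_master", "material_master"}),
    ("post_goods_issue",   set(), {"material_master"}),
    ("generate_invoice",   set(), {"customer_master"}),
]

def touch_points(process_map, master_object):
    """Return the steps that produce or consume a given master data object,
    i.e., the touch points where governance checks can later be inserted."""
    produced = [step for step, prod, _ in process_map if master_object in prod]
    consumed = [step for step, _, cons in process_map if master_object in cons]
    return {"produced_by": produced, "consumed_by": consumed}

print(touch_points(ORDER_TO_CASH_MAP, "customer_master"))
```

Even a simple table like this makes explicit who the producers and consumers of each master object are, which is the raw material for the horizontal view discussed next.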
In order to properly assess their data requirements and expectations for availability, usability, and quality, you must first have a horizontal view of the workflow processes and understand who the data producers and data consumers are. This is the basis of our second recommended best practice: documenting the business process workflows and determining the production and consumption touch points for master data.

Whether you are looking at the workflow processes that create shared master data or reviewing the activities that consume it, you can link the assurance of consistency, accuracy, and usability to specific master data touch points. Documenting process flow maps is a practical exercise that lets you consider all the touch points for master data (creation, use, and updates) along each workflow activity. Any point of data creation or updating provides an opportunity for inserting inspection and monitoring of observance of data policies and data quality rules. Reviewing the usage scenarios provides traceability for the lineage of shared master values and exposes the data dependencies that must be carefully monitored to ensure consistency in definition, aggregation, and reporting. The result is that this information production map identifies the opportunities for instituting the governance directives that ensure compliance with enterprise data policies.

3: Govern Shared Metadata: Concepts, Data Elements, Reference Data, and Rules

One of the main drivers for Master Data Governance is the discrepancies and inconsistencies associated with shared data use, particularly in terms of the characteristics and attributes of master data concepts such as customer or product. And incredible as it may seem, inconsistencies in reference data sets (such as product codes or customer category codes) appear to be rampant.
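Such overlapping code sets can be reconciled through a governed mapping into one standard set. The sketch below is a minimal illustration, assuming invented source systems, codes, and categories; a real mapping would be maintained by the governance team as shared metadata.

```python
# Hypothetical standard reference data set, centrally owned by governance.
STANDARD_CODES = {"RET": "Retail", "WHL": "Wholesale", "GOV": "Government"}

# Per-source mappings into the standard set. Each silo historically used its
# own customer-category codes; the mapping normalizes them.
SOURCE_MAPPINGS = {
    "sales":   {"R": "RET", "W": "WHL", "G": "GOV"},
    "finance": {"01": "RET", "02": "WHL", "03": "GOV"},
}

def normalize(source, code):
    """Map a source-specific code to the standard code. An unmapped code
    raises an error, signaling a case for governance review rather than
    silently passing inconsistent data through."""
    try:
        return SOURCE_MAPPINGS[source][code]
    except KeyError:
        raise ValueError(f"unmapped code {code!r} from source {source!r}")

print(normalize("sales", "R"))     # both silos resolve to the same code
print(normalize("finance", "01"))
```

The design choice worth noting is that normalization fails loudly on unmapped codes: silent pass-through is exactly how overlapping code sets cause the insidious bugs described below.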
The duplication and overlap of user-defined code sets causes particular problems within an ERP environment when there is no centralized authority to define and manage standards for reference data. Incorrect use of overlapping codes can lead to incorrect behavior within hard-coded applications; these bugs are not only insidious, they can be extremely hard to track down. To reduce the potential negative impacts of inconsistencies, our third best practice for Master Data Governance is centralizing the oversight and management of shared metadata, focusing on entity concepts (such as customer or supplier) and reference data with its corresponding code sets, along with the data quality standards and rules used for validation at the different touch points in the data lifecycle.

Because each business function may have its own understanding of what is implied by the terms used to refer to master data concepts, use a collaborative process to compare definitions and resolve differences, yielding a standard set of master concept definitions. Governance and management of shared metadata involves defining and observing data policies for resolving variance across similarly named data elements, along with procedures for normalizing reference data code sets, values, and associated mappings. Normalizing shared master reference data can alleviate a significant amount of pain from failed processes and the repeated reconciliations that must be done when reporting does not match the operational systems. Lastly, if (as our second best practice suggests) we are creating a collected set of data quality requirements, the data quality rules used for validation can be centralized and governed to ensure consistency in the application of those rules for inspection, monitoring, and notification of errors or invalid data.
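Centralized validation rules of this kind can be sketched as a shared rule table applied at each creation or update touch point. The field names, formats, and rules below are invented for illustration; actual rules would come from the collected data quality requirements.

```python
import re

# Hypothetical centrally governed validation rules for a customer record:
# field -> (human-readable expectation, predicate).
CUSTOMER_RULES = {
    "customer_id": ("a 10-digit identifier",
                    lambda v: bool(re.fullmatch(r"\d{10}", v or ""))),
    "country":     ("a 2-letter country code",
                    lambda v: bool(re.fullmatch(r"[A-Z]{2}", v or ""))),
    "name":        ("a non-empty name",
                    lambda v: bool(v and v.strip())),
}

def validate(record, rules):
    """Apply the shared rules; an empty violation list means the record may
    proceed, otherwise the violations feed inspection and notification."""
    return [f"{field}: expected {expectation}"
            for field, (expectation, passes) in rules.items()
            if not passes(record.get(field))]

good = {"customer_id": "0001234567", "country": "DE", "name": "Acme GmbH"}
bad  = {"customer_id": "12", "country": "Germany", "name": ""}
print(validate(good, CUSTOMER_RULES))  # []
print(validate(bad, CUSTOMER_RULES))
```

Because every touch point consults the same rule table, inspection and monitoring apply the rules consistently instead of each silo re-implementing its own checks.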
4: Institute Policies for Ensuring Quality at Stages of the Data Lifecycle

The fundamental value of instituting master data management is the reduction in data variance, inconsistency, and incompleteness. Cleansing or correcting data downstream is a reactive measure, and as long as any corrections are shared among all the data creators and consumers, this may be an acceptable last resort. But realize that these types of data issues are merely symptoms of the absence of data quality control processes intended to prevent errors from occurring in the first place. Instead, this fourth best practice seeks to ensure that the data meets the needs specified by the collected data requirements. This suggests a more directed approach of defining data policies and instituting data governance processes for data discovery and quality assessment. By analyzing potential anomalies within the master data set, this practice helps to identify structural inconsistencies, code set issues, and semantic differences inherent in the data. Once the potential issues are identified, many can be resolved with stop-gap controls architected within the common master data models used for information sharing. At the same time, define policies for instituting controls within the process workflows and application programs, controls that can be inserted into the workflows based on the analysis of the information production maps documented using our second best practice. This will help reduce the introduction of errors into the environment in the first place.

5: Implement Discrete Approval and Separation of Duties Workflows

Because cross-functional processes (such as order-to-cash) are not owned by any particular line of business or functional area, the question of workflow process ownership becomes key in governing their successful execution. Drilling down on this question exposes different facets of the challenge.
One facet involves navigating the relationship between the business team members who define the workflow process and the IT teams who develop and implement the applications that automate it. Another facet is that there are different types of processes: some require complex IT solutions, while others can be easily deployed by business people without IT involvement. A third facet involves the operational aspects of overseeing the decisions that are necessary within process workflows, such as reviewing newly entered items, approving new records, or signing off on the completion of a workflow, to name a few.

Because process workflow ownership is effectively distributed across the areas of the business, there is a need for a set of policies and methods to centrally oversee the successful execution of all stages of the workflow. This leads to our fifth best practice: integrating discrete approval of tasks into the operational aspects of Master Data Governance. Separation of duties can be discretely integrated into the application environment through the proper delineation of oversight as part of an approval process that focuses on the business use of the data and does not require IT intervention.

Implementing this practice on a small scale might be managed manually. Yet as the number of processes grows, the number of touch points increases, and the need for discrete separation of duties expands, a manual solution will become insufficient to provide the enterprise-wide effectiveness that is required, while relying on IT as the conduit for application development introduces a bottleneck in getting processes into production. To alleviate both of these pressures, as part of this best practice we suggest exploring how different tools can enable some degree of self-service for business people to develop and deploy workflow processes without the need for IT intervention.
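The core of such a discrete approval step can be sketched in a few lines: the approver must hold the approver role and must not be the person who created the record. The roles, names, and record identifiers below are invented; in practice they would come from the organization's authorization system.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical approver role assignment (would come from an identity system).
APPROVERS = frozenset({"lee", "kim"})

@dataclass
class PendingRecord:
    record_id: str
    created_by: str
    approved_by: Optional[str] = None

    @property
    def approved(self) -> bool:
        return self.approved_by is not None

def approve(record: PendingRecord, approver: str) -> PendingRecord:
    """Enforce the approval policy: the approver must hold the approver role
    and must not be the record's creator (separation of duties)."""
    if approver not in APPROVERS:
        raise PermissionError(f"{approver} is not an authorized approver")
    if approver == record.created_by:
        raise PermissionError("creator may not approve their own record")
    record.approved_by = approver
    return record

rec = PendingRecord("MAT-0042", created_by="lee")
approve(rec, "kim")
print(rec.approved)  # True
```

The point of the sketch is that the separation-of-duties check is a data-level policy, enforceable without IT building a bespoke application for each workflow.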
The best types of technologies would automate the management of these processes along with their approval procedures, and this master governance could then be directly integrated into the execution of the processes.

Summary

The successful deployment of ERP solutions can revolutionize the way a company works in terms of increased opportunities, decreased operational costs, and an improved level of trust in generated reports. Yet the transition from a vertical, function-based mode of operations to one that is integrated horizontally across the different areas of the business introduces challenges that are often ignored. Introducing Master Data Governance in alignment with the use of ERP frameworks for cross-functional workflow processes can help you anticipate potential failures and lead to rapid time to value. But don't let governance strangle flexibility and adaptability. Governance is an evolving and ongoing process. Policies and practices must be reviewed and updated to ensure that business performance metrics are properly measured and business objectives are met. Introducing practical master data governance practices such as those described in this paper can help your organization deploy the proper oversight over cross-functional activities.

Automating these Master Data Governance practices relies on a variety of core competencies, so when evaluating solutions, look for these types of capabilities:

- Broad access to shared master data (such as data integrated within ERP systems)
- Centralized management of data element definitions, structures, reference data sets, and other relevant metadata
- Shared data quality business rules
- Flexibility in developing and implementing workflow processes that reduce the IT bottleneck
- Incorporation of inspection and monitoring of compliance with data quality and validation rules
- Incorporation of inspection and monitoring of compliance with discrete approval and decisioning rules
About the Author

David Loshin, president of Knowledge Integrity, Inc. (www.knowledge-integrity.com), is a recognized thought leader and expert consultant in the areas of data quality, master data management, and business intelligence. David is a prolific author on BI best practices and has written numerous books and papers on BI and data quality. His book Business Intelligence: The Savvy Manager's Guide (June 2003) has been hailed as a resource that allows readers to gain an understanding of business intelligence, business management disciplines, data warehousing, and how all of the pieces work together. He is the author of Master Data Management, which has been endorsed by data management industry leaders, and the recently released The Practitioner's Guide to Data Quality Improvement, which focuses on practical processes for improving information utility.

About Winshuttle

The best practices outlined in this paper offer broad guidelines for conceptualizing and planning a Master Data Governance program that safeguards the quality and integrity of data assets. The preceding pages have made clear that MDG is not solely or even primarily a technical process: it is a system of roles, rules, and rights that determines how people interact with data. Nonetheless, most organizations look to technology to implement and manage the policies and processes necessary for successful MDG. For users of SAP, Winshuttle provides tools for creating and implementing a best-practice-based MDG program.
The Winshuttle MDG solution focuses on solving the practical master data problems that most directly hinder organizational performance, including:

- Poor data quality: Winshuttle helps companies gain control over the extraction, creation, update, and maintenance of master data, both at the user level, with desktop tools that reduce errors and standardize execution of SAP data operations, and at the process level, with tools and services for creating workflows that ensure compliance with MDG policies.

- Tension between governance and flexibility: MDG is necessarily restrictive. But an MDG regime that can't adapt to the demands of a changing business environment can end up harming rather than aiding performance. Winshuttle's approach to MDG promotes rapid innovation and problem solving within a controlled framework. The Winshuttle Platform is a codeless development environment that allows analysts and non-programmers to safely author and modify solutions, ensuring that technology never becomes a barrier to MDG effectiveness.

- Incomplete process tracking: Winshuttle's logging and auditing capabilities ensure that data stewards and shared-service groups can document performance against SLAs and create benchmarks for continuous improvement.

Winshuttle provides software products that improve how business users work with SAP. For customers who struggle with rigid, expensive, and inefficient processes that limit their ability to adapt to changing business conditions, Winshuttle has the solution. The Winshuttle Platform enables customers to build and adapt Excel- and SharePoint-based interactive forms and workflows for SAP without programming. Thousands of Winshuttle customers have radically accelerated SAP processes, saving and redirecting millions of dollars every day. Winshuttle supports customers worldwide from offices in the United States, United Kingdom, France, Germany, and India.
Corporate Headquarters: Bothell, WA
France: Maisons-Alfort, France
United Kingdom: London, U.K.
India Research & Development: Chandigarh, India
Germany: Bremerhaven, Germany