DAMA Data Management Body of Knowledge Guide - Reading Notes 15

Chapter 15 Data Management Maturity Assessment

1. Introduction

A maturity model defines maturity levels by describing the capability characteristics of each stage. When an organization's practices match the characteristics of a stage, its maturity level can be assessed and a plan developed to improve its capabilities. The assessment also lets the organization compare itself with competitors or partners and use the graded results to guide improvement.

Assessments help clarify what is being done well, what is being done poorly, and where the organization has gaps. Based on the assessment results, the organization develops a roadmap to achieve the following goals:

  • High-value improvement opportunities related to processes, methods, resources, and automation.
  • Capabilities that align with business strategy.
  • Governance processes for regular, model-based evaluation of organizational capabilities.

1.1 Business Drivers

Organizations conduct capability maturity assessments for several reasons:

  • Regulation. Regulatory requirements impose minimum maturity levels on data management.
  • Data governance. Data governance requires a maturity assessment for planning and compliance purposes.
  • Organizational readiness for process improvement. An organization recognizes that improving its practices should begin with an assessment of its current state.
  • Organizational change. Organizational change, such as a merger, creates data management challenges. A DMMA provides input for planning to meet these challenges.
  • New technology. Advances in technology offer new ways to manage and use data. The organization wants to understand the likelihood of successful adoption.
  • Data management issues. When it is time to address data quality issues or other data management challenges, the organization wants to assess its current state in order to better decide how to implement change.

1.2 Objectives and principles

The primary goal of a data management capability assessment is to evaluate the current state of critical data management activities so that targeted improvement plans can be developed. Beyond its primary goal, a DMMA can positively influence culture by helping to:

  • Introduce data management concepts, principles and practices to stakeholders.
  • Clarify stakeholder roles and responsibilities regarding organizational data.
  • Emphasize the need to manage data as a critical asset.
  • Expand awareness of data management activities across the organization.
  • Help improve the collaboration needed for effective data governance.

1.3 Basic concepts

1.3.1 Evaluation levels and characteristics

Level 0: No capability. No organized data management activities or formal enterprise processes.
Level 1: Initial / ad hoc
  • Little or no governance
  • Limited tool set
  • Roles defined only within individual siloed systems
  • Controls applied inconsistently, if at all
  • Unresolved data quality issues
Level 2: Repeatable
  • Governance begins to emerge
  • A consistent tool set is introduced
  • Some roles and processes are defined
  • Growing awareness of the impact of data quality issues
Level 3: Defined
  • Data is seen as an organizational enabler
  • Scalable processes and tools reduce manual work
  • Process outcomes, including data quality, are more predictable
Level 4: Managed
  • Centralized planning and governance
  • Data-related risks are managed
  • Data management performance metrics are in place
  • Data quality improvement can be measured quantitatively
Level 5: Optimized
  • Highly predictable processes
  • Lower risk
  • Metrics for data quality and process quality are well understood
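
To make the levels above concrete, here is a minimal Python sketch that models them as a small lookup structure. The level names and condensed characteristics are paraphrased from the list above; the code itself is purely illustrative, not an official DMBOK artifact.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MaturityLevel:
    level: int
    name: str
    characteristics: tuple  # condensed from the list above

LEVELS = (
    MaturityLevel(0, "No capability", ("No organized data management activities",)),
    MaturityLevel(1, "Initial / ad hoc", ("Little or no governance", "Limited tool set",
                                          "Roles defined only within siloed systems")),
    MaturityLevel(2, "Repeatable", ("Governance begins to emerge", "Consistent tool set",
                                    "Some roles and processes defined")),
    MaturityLevel(3, "Defined", ("Data seen as an organizational enabler",
                                 "Scalable processes and tools", "More predictable outcomes")),
    MaturityLevel(4, "Managed", ("Centralized planning and governance",
                                 "Data-related risks managed", "Performance metrics in place")),
    MaturityLevel(5, "Optimized", ("Highly predictable processes", "Lower risk",
                                   "Well-understood quality metrics")),
)

def describe(level: int) -> MaturityLevel:
    """Look up the descriptor for a maturity level (0-5)."""
    return LEVELS[level]

print(describe(3).name)  # -> Defined
```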

1.3.2 Existing DMMA frameworks

(1) CMMI Data Management Maturity Model (DMM)

CMMI developed the Data Management Maturity Model (CMMI-DMM), which provides assessment criteria for the following data management areas:

  • Data management strategy
  • Data governance
  • Data quality
  • Platform and architecture
  • Data operations
  • Supporting processes

(2) EDM Council DCAM

The DCAM describes thirty-seven capabilities and 115 sub-capabilities related to developing a sustainable data management program. The assessment focuses on the level of stakeholder engagement, the formality of processes, and the artifacts that demonstrate capabilities.

(3) IBM Data Governance Council Maturity Model

The purpose of the model is to help organizations build consistency and quality control in governance through proven technologies, collaborative approaches, and best practices. The model is organized around four key categories:

  • Outcomes. Data risk management and compliance; value creation.
  • Enablers. Organizational structure and awareness; policy; stewardship.
  • Core disciplines. Data quality management; information lifecycle management; information security and privacy.
  • Supporting disciplines. Data architecture; classification and metadata; audit information logging and reporting.

(4) Stanford Data Governance Maturity Model

This model focuses on data governance rather than data management, but it provides a foundation for a comprehensive assessment of data management. The model is divided into foundational components (awareness, formalization, metadata) and project components (data stewardship, data quality, master data). Within each component, the model describes the drivers for people, policies, and capabilities, articulates the characteristics of each maturity level, and provides qualitative and quantitative measurements for each level.

(5) Gartner’s Enterprise Information Management Maturity Model

Gartner has released an Enterprise Information Management Maturity Model that establishes criteria for evaluating vision, strategy, metrics, governance, roles and responsibilities, lifecycle and infrastructure.

2. Activities

The assessment is conducted by soliciting input from business, data management, and information technology stakeholders, with the aim of reaching a consensus view of current-state capabilities, supported by evidence. Assessments can be tailored to meet organizational needs, but modifications must be made with caution: if a model is trimmed or altered, it may lose its original rigor or traceability. Maintain the integrity of the model when customizing it.

2.1 Planning assessment activities

Assessment planning includes defining the overall approach and communicating with stakeholders before and during the assessment to ensure their engagement. The assessment itself involves gathering and evaluating inputs and communicating the results, recommendations, and action plans.

2.1.1 Define goals

Drivers must be articulated as goals that describe the scope and focus of the assessment. Managers and business units must clearly understand the objectives of the assessment to ensure it aligns with the organization's strategic direction. The objectives also provide criteria for decisions such as which assessment model to use, which business areas to evaluate first, and who should provide direct input.

2.1.2 Select a framework

Review candidate frameworks against assumptions about the current state and the assessment goals in order to select one that makes sense for the organization. The choice of framework affects how the assessment is conducted, so the assessment team must know the model and its corresponding methodology.

2.1.3 Define organizational scope

  • Localized assessment. Can go into greater detail and can be completed faster because its scope is limited.
  • Enterprise assessment. Focuses on the broad, and sometimes disconnected, parts of an organization. An enterprise assessment can be composed of multiple localized assessments or can be a standalone exercise.

2.1.4 Define interaction methods

Information-gathering activities may include workshops, interviews, surveys, and artifact reviews. Adopting an approach that works well within the organizational culture minimizes the time demanded of participants, allows assessment activities to take place while participants' understanding of the process is still fresh, and enables the assessment to be completed quickly.

2.1.5 Plan communication

Stakeholders should be informed of the expectations for the assessment before it begins. Communications should describe:

  • The purpose of the data management maturity assessment
  • How the assessment will be conducted
  • The part they will play in it
  • The schedule of assessment activities

During any assessment activity, ensure there is a clear agenda, including a plan for handling open issues. Continually remind participants of the goals and objectives of the effort, thank them for their continued participation, and describe the next steps.

2.2 Execute maturity assessment

2.2.1 Collecting Information

The information collected includes, at a minimum, formal ratings against the assessment criteria. It may also include findings from interviews and focus groups, system analysis and design documents, data surveys, email threads, procedure manuals, standards, policies, document repositories, approval workflows, various work products, metadata repositories, data and integration reference architectures, and templates and forms.

2.2.2 Perform assessment

The overall rating and interpretation tasks usually occur in multiple phases. Participants may rate the same assessment topic differently, and consensus must be reached through discussion. Input is provided by individual participants and then refined through artifact review or examination by the assessment team. Refinement follows this process:

  • Review the rating method and assign a preliminary rating to each work product or activity.
  • Document the supporting evidence.
  • Discuss with participants to reach consensus on a final rating for each area. Where appropriate, weight criteria by importance (see the sketch after this list).
  • Record statements about how the rating meets the model criteria, together with reviewers' interpretations, as documentation of the rating.
  • Develop visualizations to display and explain the assessment results.
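
As a minimal sketch of the consensus-and-weighting step above: the domain names, criteria, weights, and 1-to-5 rating scale below are illustrative assumptions, not values prescribed by any DMM framework, and the mean stands in for the consensus normally reached through discussion.

```python
from statistics import mean

# Participant ratings per criterion (1-5 scale), grouped by assessment domain.
ratings = {
    "Data governance": {"policy": [2, 3, 2], "stewardship": [1, 2, 2]},
    "Data quality":    {"profiling": [3, 3, 4], "remediation": [2, 2, 3]},
}
# Relative importance of each criterion within its domain.
weights = {"policy": 0.6, "stewardship": 0.4, "profiling": 0.5, "remediation": 0.5}

def domain_score(criteria: dict) -> float:
    """Weighted average of per-criterion consensus ratings."""
    total_weight = sum(weights[c] for c in criteria)
    return sum(weights[c] * mean(scores) for c, scores in criteria.items()) / total_weight

for domain, criteria in ratings.items():
    print(f"{domain}: {domain_score(criteria):.2f}")
```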

2.3 Interpretation of results and recommendations

Interpreting the results consists of identifying improvement opportunities aligned with organizational strategy and recommending the actions needed to seize those opportunities.

2.3.1 Reporting evaluation results

The assessment report should include:

  • The business drivers for the assessment
  • Overall assessment results
  • Ratings, with gaps, by topic
  • Recommended approaches to closing the gaps
  • Observed organizational strengths
  • Risks to making progress
  • Investment and outcome options
  • Governance and the metrics for measuring progress
  • Analysis of resources and their potential future uses
  • Artifacts that can be used or reused within the organization

2.3.2 Develop management briefings

The assessment team should prepare a management briefing summarizing the findings: strengths, gaps, and recommendations. Management uses this briefing as input for decisions about goals, plans, and timelines. The team must tailor the briefing to clarify the likely impacts and benefits for each executive group.

2.4 Develop targeted improvement plans

The results of a DMM assessment should be detailed and comprehensive enough to support a multi-year data management improvement plan, including initiatives to adopt best practices and build data management capabilities. Since change occurs in organizations primarily through projects, new projects must adopt the improved practices. The roadmap or reference plan should include:

  • A sequence of activities to improve specific data management functions.
  • A timeline for implementing the improvement activities.
  • The expected improvements in DMMA ratings once the activities have been implemented.
  • Oversight activities, including the maturing of oversight itself over the timeline.

2.5 Re-evaluate maturity

Reassessments should be carried out regularly as part of a cycle of continuous improvement:

  • Establish a baseline rating through the first assessment.
  • Define reassessment parameters, including organizational scope.
  • Repeat the DMM assessment on a published schedule, or as needed.
  • Track trends relative to the initial baseline (illustrated in the sketch below).
  • Develop recommendations based on the reassessment results.
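
A minimal sketch of trending reassessment results against the first (baseline) assessment; the knowledge areas and ratings are invented for illustration.

```python
# Baseline ratings from the first assessment vs. a later reassessment.
baseline = {"Governance": 1.8, "Data quality": 2.1, "Metadata": 1.5}
reassessment = {"Governance": 2.6, "Data quality": 2.4, "Metadata": 1.6}

for area, base in baseline.items():
    current = reassessment[area]
    print(f"{area:12} baseline={base:.1f} current={current:.1f} change={current - base:+.1f}")
```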

3. Tools

  • Data management maturity framework. The main tool used in maturity assessment is the DMM framework itself.
  • Communication plan. The communications plan includes stakeholder engagement models, the types of information to be shared and timelines, etc.
  • Collaboration tools. Collaboration tools allow sharing of assessment results. Evidence of data management practices can be found in emails, completed templates, and review documents produced through a standard process of collaborative design, operations, incident tracking, review, and approval.
  • Knowledge management and metadata repository. Data standards, policies, methodologies, agendas, meeting minutes or decisions can be managed in these repositories, as well as business and technical components used as proof of practice.

4. Method

4.1 Select DMM framework

When choosing a DMM framework, the following criteria should be considered:

  • Ease of use. Practice activities are described in non-technical terms that convey the functional essence of the activity.
  • Comprehensiveness. The framework addresses a wide range of data management activities, including business engagement, not just IT processes.
  • Scalability and flexibility. The framework is structured to support enhanced industry-specific or additional disciplines and can be used in whole or in part depending on an organization's needs.
  • Built-in future evolution path. Although the priorities determined by different organizations vary, the DMM framework describes the logical way forward for each function.
  • Industry-agnostic versus industry-specific. Depending on the organization's context, the framework may need to incorporate industry-specific (vertical) data management best practices or remain industry-neutral.
  • Level of abstraction or detail. Practices and assessment criteria are expressed at a sufficient level of detail to guide implementation.
  • Non-prescriptive. A framework describes what needs to be performed, not how it must be performed.
  • Organized by topic. The framework places data management activities into appropriate contexts, allowing each activity to be evaluated independently while dependencies can be identified.
  • Repeatable. The framework allows for consistent interpretation and supports repeatable results for comparing one organization to others in other industries and tracking progress over time.
  • Supported by a neutral, independent organization. To avoid conflicts of interest, the model should be vendor-neutral and widely available, ensuring broad representation of best practices.
  • Technology neutral. The focus of the model should be on practice, not on tools.
  • Training support. The model is supported by comprehensive training, enabling professionals to master the framework and optimize its use.

4.2 Use of DAMA-DMBOK framework

DAMA-DMBOK can be used to prepare for a DMMA or to establish assessment criteria. Participants will see a direct link between segmented capabilities and the corresponding DMBOK knowledge areas, activities, and deliverables. A specific DMM framework can be configured based on the areas to be measured, their supporting activities, dependencies, and the time available. This quick-checklist approach can be used to identify areas that need deeper analysis, represent gaps, or point to hot spots for remediation.
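
As a rough illustration of the quick-checklist approach, the following sketch keys yes/no checks to DMBOK knowledge areas and flags areas that warrant deeper analysis. The check items are hypothetical examples, not items defined by DMBOK.

```python
# Each DMBOK knowledge area maps to yes/no checks for supporting
# activities and deliverables observed in the organization.
checklist = {
    "Data Governance":     {"charter approved": True, "policies published": False},
    "Metadata Management": {"business glossary": False, "metadata repository": False},
    "Data Quality":        {"profiling in place": True, "issue log maintained": True},
}

# Flag knowledge areas with any missing item as candidates for deeper analysis.
hot_spots = [area for area, checks in checklist.items() if not all(checks.values())]
print("Needs deeper analysis:", hot_spots)
```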

5. Maturity Management and Governance

5.1 DMMA process supervision

Oversight of the DMMA process belongs to the data governance team; the breadth and depth of oversight depend on the DMMA's scope. Every function involved has a voice in the execution, approach, structure, and roadmap of the overall assessment. Each data management area and organizational unit involved has its own independent view, and the DMM framework gives them a common language.

5.2 Metrics

In addition to being a core component of an improvement strategy, metrics are a key communication tool. The initial DMMA metrics are the ratings representing the current state of data management; these can be reassessed periodically to show improvement trends. Each organization should develop metrics tailored to its target-state roadmap. Example metrics include:

  • DMMA ratings. DMMA ratings provide a snapshot of the organization's capability level.
  • Resource utilization. A powerful metric that helps express the cost of data management in countable form.
  • Risk exposure. The ability to respond to risk scenarios reflects the organization's capabilities relative to its DMMA rating.
  • Spend management. Shows how data management costs are distributed across the organization and the impact of these costs on sustainability and value. These metrics overlap with data governance metrics: data management sustainability, achievement of initiative goals and objectives, effectiveness of communication, effectiveness of education and training, speed of change adoption, data management value, contribution to business objectives, risk reduction, and improvement of operational efficiency.
  • DMMA inputs. Core inputs can include counts, coverage, availability, number of systems, data volumes, teams involved, etc.
  • Speed of change. The rate at which an organization improves its capabilities. A baseline is established through the DMMA, and periodic reassessments show improvement trends (see the sketch below).
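
A minimal sketch of computing the speed-of-change metric between two DMMA snapshots; the dates and overall ratings are illustrative assumptions.

```python
from datetime import date

# Two DMMA snapshots: (assessment date, overall rating).
first = (date(2022, 1, 15), 1.8)
latest = (date(2024, 1, 15), 2.6)

# Average rating improvement per year since the baseline assessment.
years = (latest[0] - first[0]).days / 365.25
print(f"Speed of change: {(latest[1] - first[1]) / years:.2f} rating points per year")
```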

Source: blog.csdn.net/baidu_38792549/article/details/125068013