Capability Maturity Models (CMMs) typically have five stages - Initial, Managed, Defined, Quantitatively Managed, and Optimized (see CMMI) - and the stages are described so they can be easily understood by non-specialists (including management). A CMM delimits a set of states that characterize a certain maturity level in terms of different aspects of the topic considered. It helps organizations benchmark their maturity against other organizations while identifying the road ahead.
Dell’s maturity model is simplistic and seems to reflect the Business Intelligence (BI) aspects more than the Data Management (DM) ones. The functions of DM can facilitate the BI functions, as data quality, integration, operations and/or security can positively impact the use of and trust in reports and metrics; however, the implications of the two are quite different, even if some (e.g. DAMA) consider BI & Data Warehousing as DM function(s).
Data maturity as growth can be easily read from a staircase representation, especially if one tracks progress as a percentage of a given maturity level. In addition, the analysis needs to consider an organization’s goals. Not all maturity requirements apply, as organizations have their own particularities, and some aspects of a maturity model might not apply or are simply ignored. Maturity levels are not black & white characterizations; a multitude of shades of grey lie in between. One can make fast progress, for example, by focusing on quick wins or by taking an “80/20” approach when pursuing targets.
An organization’s growth depends on parameters like sales volume, market share or number of customers, and not necessarily on the maturity with which it manages its data. Comparing growth with data maturity is an interesting approach, though a more valuable study would compare the investment in DM & BI capabilities with the data maturity achieved.
Fitting a curve to match another curve might not reflect the actual growth, which can differ from one organization to another. In extremis the result might even falsify theories or lead to false conclusions. Moreover, providing analyses or charts without providing access to the underlying data makes the story less credible, even if the story sounds pleasant or right. Without data, conclusions are nothing but stories or wishes, especially when the models or graphics are idealized representations (as in most of the graphics provided).
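The curve-fitting pitfall can be illustrated with a small sketch (the growth figures below are hypothetical, not real data): a high-degree polynomial can fit a handful of observations perfectly in-sample yet extrapolate wildly, while a simple linear fit stays near the observed trend.

```python
import numpy as np

# Hypothetical yearly growth observations for one organization
# (illustrative numbers only, not real data)
years = np.array([1, 2, 3, 4, 5], dtype=float)
growth = np.array([10.0, 14.0, 15.0, 21.0, 24.0])

# A degree-4 polynomial passes through all five points exactly...
coeffs = np.polyfit(years, growth, deg=4)
fitted = np.polyval(coeffs, years)
print("in-sample fit is exact:", np.allclose(fitted, growth))

# ...yet extrapolating just one year ahead can swing far away from
# the trend, while a plain linear fit stays close to the observed slope.
lin = np.polyfit(years, growth, deg=1)
print("degree-4 forecast for year 6:", np.polyval(coeffs, 6.0))
print("linear forecast for year 6:  ", np.polyval(lin, 6.0))
```

The same data thus support very different forecasts depending on the curve chosen, which is precisely why a fitted curve, shown without its data, proves little.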
It’s true that correlation does not imply causation; however, to draw conclusions one must have a population that allows drawing them, and a base of 1-10 cases typically doesn’t allow representative conclusions that apply to larger populations or to all organizations. When the analysis is based on a single organization’s perspective, all one can do is tell that organization’s story. With 1-2 further organizations added to the picture one can make an analysis that differentiates the various aspects, but even then it is not possible to draw general conclusions.
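The small-sample problem can be made concrete with a quick simulation (pure Python, illustrative only): even when two variables are completely independent, tiny samples routinely produce large spurious correlations.

```python
import random
import statistics


def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


random.seed(42)

# Draw many 5-point samples of two independent variables; there is no
# real relationship, yet a sizeable share of samples show |r| > 0.7.
trials = 1000
extremes = 0
for _ in range(trials):
    xs = [random.random() for _ in range(5)]
    ys = [random.random() for _ in range(5)]
    if abs(pearson(xs, ys)) > 0.7:
        extremes += 1
print(f"{extremes / trials:.0%} of 5-point samples show |r| > 0.7")
```

With samples this small, a strong-looking correlation between, say, growth and data maturity may be nothing more than noise.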
It’s true that an organization’s problems can multiply with its growth, though this stems mainly from the fact that its data volume increases along with its sales, employees, processes and systems. An organization can address data-related issues from the start, and this doesn’t require a data strategy if things are done right by design and/or people’s experience contributes to them.
A data strategy can become a bottleneck for an organization if, for example, it doesn’t reflect the organization’s needs and/or goals, or if the strategy imposes limitations (e.g. bureaucracy) on its growth. A data/BI strategy must reflect, and be aligned with, the organization’s needs and goals, and be an engine for growth. If this doesn’t happen, then something in the process has gone wrong.