By Michael Gonzales, PhD
The focus of the Analytic Ecosystem Inventory (AEI) is to collect, document, and quantify the current level of analytics being executed and consumed in the organization. While other assessment instruments collect opinions from a broad range of analytic consumers, the AEI is designed to gather quantitative metrics related to employed analytic applications, the technology and data on which they are implemented, and the communities they serve.
Specifically, the inventory includes:
- Antecedents – Formal documents related to the analytics being conducted, their standards, and corporate objectives
- Applications – Analytic applications, the size of user communities they serve, and their life stage
- Technologies – Analytic technologies on which these applications are based
- Data – Profiles of the data sourced by these applications, including size, number of sources, and data type
A standardized instrument is the best way to ensure consistency in the collection of information across the ecosystem, even when multiple team members conduct the inventory. A spreadsheet can readily be crafted to include the columns of information outlined in Table 1, along with drop-down menus listing the options available to the investigator.
In general, antecedents are any relevant, formal documentation for assessing dimensions of maturity. For example, a business strategy document with content incorporating analytics provides evidence of analytics being leveraged for competitive advantage, which is a critical sign that analytics is being embraced by the enterprise. Although not exhaustive, the following are specific artifacts the assessment team should consider when evaluating the overall maturity of an organization:
- Business strategy where analytics is referenced (redacted if necessary)
- Analytic strategy document
- Organization chart for analytics team(s)
- Organization chart for data governance
- Analytic development process and implementation standards
- Example of requirements document for the analytic environment
- Example of a test plan for an analytic application implemented
- Example of a Service Level Agreement offered to user communities by the analytics team(s)
- Education curriculum for analytics
- Example of a course evaluation that participants would complete after taking a course offered by/through the analytics team(s)
Few resources are needed to gather antecedents. The assessment team simply requests any relevant documents from the client. This can be done during the kick-off session, coupled with feedback and suggestions from assessment team members, and followed up with an email providing examples of documents, or actual document titles if team members supply them.
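A simple checklist can make this collection step repeatable. The sketch below is illustrative only: the document names paraphrase the list above, and the coverage summary is one assumed way a team might track completeness.

```python
# Track which antecedent documents have been received from the client.
ANTECEDENT_CHECKLIST = [
    "Business strategy referencing analytics",
    "Analytic strategy document",
    "Organization chart for analytics team(s)",
    "Organization chart for data governance",
    "Analytic development process and implementation standards",
    "Requirements document example",
    "Test plan example",
    "Service Level Agreement example",
    "Education curriculum for analytics",
    "Course evaluation example",
]

def coverage(received):
    """Return the fraction of checklist documents received, plus what is missing."""
    received = set(received)
    missing = [doc for doc in ANTECEDENT_CHECKLIST if doc not in received]
    return 1 - len(missing) / len(ANTECEDENT_CHECKLIST), missing

pct, missing = coverage(["Analytic strategy document",
                         "Education curriculum for analytics"])
print(f"{pct:.0%} collected; still missing {len(missing)} documents")
```

The missing-document list feeds directly into the follow-up email described above.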
Technical, Data, and Application Examination
An Excel spreadsheet can be constructed to guide the gathering of fundamental information about technology licenses, the data used, the user communities supported, and the applications built on that technology.
For brevity, Table 1 outlines the specific columns of the inventory spreadsheet.
Table 1 – Architecture Inventory
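Since Table 1 is not reproduced in this excerpt, the sketch below illustrates the kind of record and drop-down-style validation the inventory spreadsheet implies. All field names and allowed values are assumptions drawn from the surrounding prose, not the actual Table 1 columns.

```python
from dataclasses import dataclass

# Hypothetical inventory record; the actual Table 1 columns may differ.
@dataclass
class InventoryRecord:
    application: str    # analytic application name
    department: str     # owning department
    technology: str     # supporting technology/license
    version: str        # technology version
    life_stage: str     # e.g., "New", "Expanding", "Mature", "Legacy"
    user_count: int     # size of the user community served
    data_sources: int   # number of data sources consumed
    data_size_gb: float # approximate data volume
    data_latency: str   # e.g., "batch", "real-time", "on-demand"

# Constrain free-text fields the way a spreadsheet drop-down would.
LIFE_STAGES = {"New", "Expanding", "Mature", "Legacy"}
LATENCIES = {"batch", "real-time", "on-demand"}

def validate(rec: InventoryRecord) -> list[str]:
    """Return a list of validation problems (empty list means the record is clean)."""
    problems = []
    if rec.life_stage not in LIFE_STAGES:
        problems.append(f"unknown life stage: {rec.life_stage!r}")
    if rec.data_latency not in LATENCIES:
        problems.append(f"unknown latency: {rec.data_latency!r}")
    return problems
```

A spreadsheet drop-down serves the same purpose as `validate`: it keeps categorical fields limited to a fixed set of options so the inventory can be aggregated consistently later.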
Techniques for Analyzing the AEI
There are two areas of the inventory that must be assessed:
- all formal antecedent documentation
- inventory of analytic applications, supporting technology, and data
This section provides recommendations on how to evaluate each. However, it is important that the assessment team does not merely examine the antecedents and the AEI individually, but rather in the context of all the information gathered during the overall assessment effort. Figure 1 illustrates the overlapping relationships among surveys/interviews, antecedents, and the inventory of analytic applications.
Figure 1 – Overlapping Information Channels
It is this author’s recommendation that information gleaned from one channel be compared and contrasted against information from the other areas of the assessment. For example, if SME surveys identify technology standards used for analytics, there should be evidence of those standards in the technologies (and their versions) supporting analytic applications. Or, if SMEs report that no formal analytic education is provided to users and yet a clear and robust analytic curriculum is offered, the assessment team needs to investigate why such a discrepancy exists.
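The standards comparison described above amounts to a set difference between what SMEs declare and what the inventory actually records. A minimal sketch, assuming both lists have already been normalized to common technology names:

```python
# Compare technologies SMEs declare as standards (from surveys) against
# technologies actually observed in the application inventory.
def standards_gaps(declared_standards, observed_technologies):
    """Return (declared standards with no evidence, undeclared technologies in use)."""
    declared = set(declared_standards)
    observed = set(observed_technologies)
    return sorted(declared - observed), sorted(observed - declared)

unused, off_standard = standards_gaps(
    declared_standards={"Tableau", "Python"},
    observed_technologies={"Python", "SAS", "Excel"},
)
print("Declared but unused:", unused)          # standards with no supporting evidence
print("In use but undeclared:", off_standard)  # possible legacy or shadow tools
```

Either non-empty list is exactly the kind of discrepancy the assessment team should investigate rather than explain away.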
A reasonable approach for examining antecedent documents is for the assessment team to review and comment on each document. That process, however, is a “black box” that is difficult to repeat. Elements of the antecedent evaluation can nonetheless be scored in a methodical, repeatable manner: assessment members can use a Likert scale to answer specific questions about each document, such as those outlined in Table 2.
Table 2 – Assessing Antecedents
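Since Table 2 is not reproduced in this excerpt, the sketch below shows one assumed way to aggregate Likert ratings per document across multiple assessors; the questions shown are illustrative, not Table 2’s actual questions.

```python
from statistics import mean

def score_document(ratings):
    """Average 1-5 Likert ratings for one antecedent document.

    `ratings` maps each assessment question to the list of scores
    given by individual assessors.
    """
    per_question = {q: mean(scores) for q, scores in ratings.items()}
    return per_question, mean(per_question.values())

per_question, overall = score_document({
    "Document explicitly references analytics": [5, 4],
    "Document is current": [3, 3],
    "Document is actively used": [4, 4],
})
print(f"Overall maturity signal: {overall:.2f} / 5")
```

Averaging per question first, then across questions, keeps a question rated by many assessors from dominating the document's overall score.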
Analyzing the Inventory
This inventory provides plenty of opportunities for the assessment team to identify patterns based on the quantitative metrics gathered. For example, questions that can be answered from the information gathered include, but are not limited to, the following:
- Support for Standards – How diverse are the technologies used to support analytic applications? And how many different versions of the same technologies are used?
- Application Maturity – What is the mode of maturity in analytic applications? Are most applications “New” or “Expanding”? Or are they “Mature” or “Legacy”?
- User Communities – Do the applications support large user communities or are they relevant only to a selected group of analysts?
- Identification – Are the majority of analytic applications concentrated in one or two departments, or are they spread across the enterprise?
- Data – Is the latency of the data used for most analytic applications historical and consumed in periodic batches or is the data real-time or available on-demand?
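Several of the questions above reduce to simple frequency counts over the inventory rows. A minimal sketch, assuming hypothetical field names and sample records:

```python
from collections import Counter

# Sample inventory rows; field names and values are illustrative.
inventory = [
    {"app": "Churn model", "dept": "Marketing", "tech": "Python",
     "version": "3.8", "life_stage": "Expanding", "users": 12, "latency": "batch"},
    {"app": "Sales dashboard", "dept": "Sales", "tech": "Tableau",
     "version": "2019.4", "life_stage": "Mature", "users": 250, "latency": "on-demand"},
    {"app": "Risk scoring", "dept": "Finance", "tech": "Python",
     "version": "2.7", "life_stage": "Legacy", "users": 8, "latency": "batch"},
    {"app": "Demand forecast", "dept": "Sales", "tech": "Python",
     "version": "3.8", "life_stage": "Mature", "users": 30, "latency": "batch"},
]

# Support for standards: how many distinct technology/version pairs are in use?
tech_versions = Counter((r["tech"], r["version"]) for r in inventory)

# Application maturity: the mode of the life-stage column.
stage_mode = Counter(r["life_stage"] for r in inventory).most_common(1)[0][0]

# Identification: are applications concentrated in one or two departments?
dept_counts = Counter(r["dept"] for r in inventory)

# Data: what share of applications consume data in periodic batches?
batch_share = sum(r["latency"] == "batch" for r in inventory) / len(inventory)

print(f"{len(tech_versions)} tech/version combinations; "
      f"modal life stage: {stage_mode}; batch share: {batch_share:.0%}")
```

The same counts can be computed directly in the spreadsheet with pivot tables; the point is that every question above has a quantitative, repeatable answer.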
Analytic Maturity Assessment Resources
For more information regarding an analytic maturity assessment process, please refer to the following resources available on LinkedIn, including:
- (Video) Comprehensive Analytic Maturity Assessment: https://www.linkedin.com/posts/michael-gonzales-231b9085_strategy-competitiveadvantage-competitiveness-activity-6653330269333970944-qcJR
- (Video) CAMA Instruments: https://www.linkedin.com/posts/michael-gonzales-231b9085_datascience-businessinteligence-strategicplanning-activity-6658427283214254081-mvDQ
- (Article) How to Conduct an Executive Interview: https://www.linkedin.com/pulse/how-conduct-executive-interview-michael-gonzales
- (Article) Analytic Maturity Model for the Real-World: https://www.linkedin.com/pulse/analytic-maturity-model-real-world-michael-gonzales
This article is taken from a section in the CAMA Guide: How to Conduct an Analytic Maturity Assessment.
dss42, LLC. Copyright © 2019–2020. All Rights Reserved.
About the Author
Michael L. Gonzales, Ph.D., is an active practitioner in the IT space with over 30 years of industry experience serving in roles of chief architect and senior solutions strategist. He specializes in the formulation of business analytics for competitive advantage in global organizations. Recent engagements include companies in the top global 100.
Dr. Gonzales holds a Ph.D. from the University of Texas with a concentration in Information and Decision Science. He has presented and published his research at leading international IT conferences, including the Decision Sciences Institute, the Americas Conference on Information Systems, and the Hawaii International Conference on Systems Science. His research streams include analytics against extremely large data and success factors for IT-enabled competitive advantage.
Dr. Gonzales is a successful author and industry speaker, and is currently the Managing Partner of dss42, LLC, and Senior Data Scientist at Prolifics.