
Driving Principles of Modern Data Management
28 Jan

Organizations rely on their Business Intelligence (BI) platforms to help leaders make data-driven decisions based on timely insights. Today, the central data repositories at the heart of BI platforms are under pressure from the explosion in data sources, transactional volumes, and data formats. At the same time, customers increasingly expect BI technology teams to deliver meaningful information to target consumers and systems at an accelerating pace, overwhelming traditional data management architectures and technologies. As a result, many of the traditional business intelligence / data warehouse systems deployed by organizations today have grown overly complex, lack clear data governance, and are not designed to meet the needs of their respective businesses.
These trends have led many data analysis projects to fail because they:
- Deliver systems that are too rigid to adapt to changing business requirements
- Require sweeping changes across the business to be effective
- Have ballooning costs to satisfy expanding performance requirements
- Provide a solution that simply does not address the needed use cases
Successful implementations of cloud-based data management require an approach that addresses key considerations across the domains of data management and governance, data lifecycle, and data storage to ensure that the solution aligns with the goals of the business. This will ensure the delivery of a scalable set of capabilities that meet the operational requirements, planned analysis, and associated use cases, while providing mechanisms to manage the cost of operating and maintaining the system.
Challenges with Traditional Data Warehouses
Most traditional BI / data warehouse systems today are built on on-premises infrastructure using largely proprietary platforms that work well, or once did, as long as all work was contained within a single vendor's ecosystem of products. As needs evolved, data sources and targets multiplied, volumes grew, and more consumers required access to information, tools proliferated to meet this growing demand for data and address the associated data management challenges. This has led to the following problems:
- Complex data collection impacts extract, transform, load (ETL) processes, driving up expenses
- Inconsistent data and unreliable analytics constrain scalability
- Limited agility to handle new data impacts data availability and required analysis
- Lack of openness and vendor lock-in prevents future extensibility of data analytics capabilities
- User access to analytical capabilities lacks standardization, driving up training costs and negatively impacting the user experience
While not an exhaustive list, the challenges outlined above describe the hurdles organizations face as they embark on improving their BI service and its underpinning data management processes and technologies to meet the needs of their business customers. Information management teams need to ensure that their next-generation architectures are:
- Flexible and vendor agnostic
- Truly engineered for the cloud, not legacy products retrofitted and “modernized” for the cloud
- Highly scalable
- Open to common formats and integrations across technologies
- Automated for high-volume recurring tasks
- Built with agile design and deployment methodologies to meet current and future customer needs while balancing the required capabilities of the service in a cost-responsible manner
Driving Principles of Today’s Data Management Infrastructure
Swiftly changing organizational, development, and data management needs and priorities have driven a shift to a philosophy of “develop fast and develop frequently” without sacrificing quality. The rise of agile development practices and Lean principles (e.g., the Scaled Agile Framework (SAFe)) enables teams to align approaches, processes, and platforms to business needs. The resulting platform can therefore not only handle emergent changes but also provide flexibility, resilience, scalability, and velocity. This fosters the efficiency to deliver on today's needs while ensuring investments continue to provide future benefits through a sound platform that continually broadens access to analytical capabilities for typical business users.
The Data Management Maturity (DMM) model developed by the CMMI Institute encourages organizations to increase the maturity of their data handling by letting documented principles influence design. It is a comprehensive framework of data management practices designed to “help organizations benchmark their capabilities, identify strengths and gaps, and leverage their data assets to improve business performance.” Criterion has drawn extensively on this model to help develop its data management services. These are the driving principles of our services as well as their influence on architecture and implementation:
- Business Agility: The environment, governance, and implementation process must respond to change in real time and without the delay of lengthy projects.
- Automation: Data infrastructure automation should encompass the entire data lifecycle from planning, analysis, and design through development and extending into operations, maintenance, governance, change management, and documentation.
- Data Warehouse Quality: Inject quality across the entire lifecycle. Discover data unavailability, data quality problems, or elusive and difficult-to-define business rules as early as possible to reduce wasted time and resources.
- Design Patterns: Identify and reuse patterns to simultaneously achieve consistency, quality, speed, agility, and cost savings.
- Data Latency: Replicate data changes in real time when individual changes need to be tracked, and include streaming data in the overall architecture.
- Sustainability, Maintainability, and Operability: Ensure consistency of components in a data warehouse by building in standards and conventions.
- Cost Savings: Build better, faster, and change quickly when needed to bring substantial cost savings to data warehouse development, operation, maintenance, and evolution.
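As one illustration of the Data Latency principle above, low-latency replication is commonly built around a high-water mark: each poll applies only the changes recorded since the last one, rather than reloading the full dataset. The following minimal Python sketch is purely illustrative; the function name, data shapes, and in-memory "target" store are our own assumptions, not any specific product's API.

```python
# Minimal sketch of incremental (low-latency) replication, assuming a
# source that exposes changed rows with monotonically increasing change ids.
# All names here are illustrative, not part of any particular platform.

def replicate_changes(source_rows, target, last_seen_id):
    """Apply only the rows changed since last_seen_id to the target store.

    source_rows: iterable of dicts with 'change_id', 'key', and 'value'
    target: dict standing in for the replicated store
    Returns the new high-water mark so the next poll resumes where this left off.
    """
    high_water = last_seen_id
    for row in sorted(source_rows, key=lambda r: r["change_id"]):
        if row["change_id"] > last_seen_id:
            target[row["key"]] = row["value"]  # upsert the changed record
            high_water = row["change_id"]
    return high_water

# Example: a later change to key 'a' supersedes the earlier one.
changes = [
    {"change_id": 1, "key": "a", "value": 10},
    {"change_id": 2, "key": "b", "value": 20},
    {"change_id": 3, "key": "a", "value": 11},  # update to key 'a'
]
target = {}
mark = replicate_changes(changes, target, last_seen_id=0)
```

In a production setting, the same pattern underlies change data capture and streaming pipelines: the high-water mark is persisted (as an offset, log sequence number, or timestamp) so replication can resume after a failure without reprocessing history.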
Implementing an Effective Data Analysis Platform
Criterion’s approach to implementing an effective data analysis platform addresses key considerations across the domains of data management and governance, data lifecycle, and data storage to ensure that the solution aligns with the goals of the business. Our customers benefit from solutions that have the following features:
- Open Architecture: Integrates best-of-breed products and enables customers to extend the architecture as business needs change.
- Open Source: Leverages the quality work of multiple organizations to deliver advanced capabilities while giving wider access to skilled resources by using more common, industry-standard tools, infrastructure, and products.
- Scalability: Offers a cloud-based strategy to scale ingestion, processing, and storage as needed.
- Vendor Agnostic: Provides the best products for business needs while avoiding vendor lock-in and expensive upfront licensing costs.
- Agile Implementation: Ensures the platform is aligned with the goals and objectives of the business while driving capability and implementing features in an optimal manner.
We will be sharing more on our BI/data management approaches in future articles. To read past articles in the series, click here.