Aligning Metrics, Data Architecture, and People
Business intelligence (BI) plays an integral role in every organization, turning data into meaningful information for making the right decisions. Increasing regulatory scrutiny, evolving business needs, and a relentless focus on cost and efficiency are challenging businesses to modernize their processes and decision-making methodology. Business decisions can no longer be based on gut feeling alone; they must follow a sensible, calculated approach grounded in data.
Thinking differently, and more strategically, about integrating and communicating data can help any organization improve. Too often, companies trying to modernize their business processes and technologies simply rely on solution suites or procure data visualization tools such as Tableau or Qlikview and expect them to deliver results.
However, the majority of the organizations that take this approach end up disillusioned due to poor adoption on account of people, process, and technology issues, including:
- Data visualization products that require highly technical skills, limiting business user adoption
- Lack of awareness and change management
- Metrics that are not useful, or simply too many metrics
- Tools that are incompatible with the latest devices
- Lack of proper data management infrastructure
These sub-optimal outcomes stem from the lack of a comprehensive, structured, and tailored strategy: one that starts with a focused assessment of key issues and gaps in data processes, and aligns those processes with people and technology.
By adopting certain best practices which form the core of a BI solution and following guiding principles when transforming their existing BI architecture, organizations can ensure they get the most out of their business intelligence solutions.
1. Metrics

A metric is a direct numerical measure that depicts a business concept. Successful business intelligence transformation requires choosing the right metrics for measuring success. These metrics should be planned in collaboration with the leadership and business units to effectively check the health of a company.
A good metric aligns with a company’s goals, objectives, and core business. For instance, month-on-month customer acquisition may be a good metric for many companies, but if a company’s objective is revenue growth, then most metrics should relate to factors affecting revenue growth.
All metrics should be predictable, achievable, and actionable, and they should be easy to track over time. A good metric should also be comparable to competitors’ metrics where possible, enabling a company to identify areas for improvement.
There are two types of metrics:
Leading indicators: These measure the activities necessary to achieve a company’s goals.
Lagging indicators: These measure actual results, showing whether or not a company hit its goals.
Ideally, a combination of both should be used. For example, if a company wants to acquire one million customers, the month-on-month customer acquisition rate is a leading indicator of the activity driving the goal, while the cumulative number of customers acquired is the lagging indicator showing whether the goal was actually met.
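As a sketch of how the two indicator types relate, the following uses assumed monthly acquisition figures (all numbers hypothetical) to compute a leading indicator (month-on-month acquisition) and a lagging indicator (cumulative customers measured against the goal):

```python
from datetime import date

# Hypothetical monthly acquisition figures (assumed data, for illustration).
monthly_acquisitions = {
    date(2023, 1, 1): 60_000,
    date(2023, 2, 1): 75_000,
    date(2023, 3, 1): 90_000,
}

GOAL = 1_000_000  # target: one million customers

# Leading indicator: month-on-month change in acquisitions, a measure of
# the activity driving the goal.
months = sorted(monthly_acquisitions)
mom_growth = [
    monthly_acquisitions[b] - monthly_acquisitions[a]
    for a, b in zip(months, months[1:])
]

# Lagging indicator: cumulative customers acquired, the actual result
# measured against the goal.
total_acquired = sum(monthly_acquisitions.values())
progress = total_acquired / GOAL

print(mom_growth)         # [15000, 15000]
print(total_acquired)     # 225000
print(f"{progress:.1%}")  # 22.5%
```

Tracking both series side by side shows not only whether the goal was hit, but whether current activity is on pace to hit it.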
Choosing the right metric is an iterative process; no one gets it right the first time. Metrics should be revisited when there is a change in business goals.
2. Business Rules
Business rules are used to define entities, attributes, relationships, and constraints. Usually, though, they are used to explain a policy, procedure, or principle for an organization that stores or uses data.
Once the metrics are set, business rules need to be defined and agreed upon by the different business units. These rules are guided by a variety of elements including regulatory agencies, industry standards, business acumen, or common sense. They often vary from country to country, industry to industry, and even business to business. Domain experts define and manage these business rules within an organization.
For example, a banking organization may classify an open account as dormant when it has had no transactions within the past 90 days. This definition may vary across geographies, hence the need to collaborate with different business units and regions to agree upon and document business rules.
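The dormancy rule above can be sketched as a simple, parameterized check; the 90-day threshold is an assumption that each geography or business unit would configure to its own agreed definition:

```python
from datetime import date

DORMANCY_DAYS = 90  # assumed threshold; varies by geography and business unit

def is_dormant(last_transaction: date, as_of: date,
               threshold_days: int = DORMANCY_DAYS) -> bool:
    """An open account is dormant when it has had no transactions
    within the past `threshold_days` days."""
    return (as_of - last_transaction).days > threshold_days

print(is_dormant(date(2023, 1, 1), date(2023, 6, 1)))  # True  (151 days idle)
print(is_dormant(date(2023, 5, 1), date(2023, 6, 1)))  # False (31 days idle)
```

Keeping the threshold a named parameter rather than a hard-coded constant makes it easy to document per region and change without touching the rule's logic.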
There are three types of business rules:
Derivation rule: The derivation rule transforms information received into values returned, such as APR calculations.
Constraint rule: The constraint rule is used to determine if the values of a transaction or operation are consistent. For example, this could include saying the state and ZIP code must match for a customer to be able to proceed.
Invariant rule: Invariant rules look at multiple changes and ensure that the composite results are consistent. For example, the balance of a savings account must equal the previous balance plus the credits and minus the debits. If the result differs, the system is leaking money, and it’s time to escalate the issue.
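A minimal sketch of the three rule types, using hypothetical values; the APR formula is deliberately simplified, and the state/ZIP lookup is a stand-in for a real postal database:

```python
# Derivation rule: transforms inputs into a derived value
# (a simplified monthly rate from an APR).
def monthly_rate(apr: float) -> float:
    return apr / 12

# Constraint rule: rejects inconsistent transaction values.
# ZIP_BY_STATE is a hypothetical lookup, not real postal data.
ZIP_BY_STATE = {"NY": {"10001"}, "CA": {"90210"}}

def state_zip_consistent(state: str, zip_code: str) -> bool:
    return zip_code in ZIP_BY_STATE.get(state, set())

# Invariant rule: after a batch of changes, the composite result must hold:
# new balance == previous balance + credits - debits.
def balance_invariant(previous: float, credits: float,
                      debits: float, new: float) -> bool:
    return abs(previous + credits - debits - new) < 1e-9

print(monthly_rate(0.12))                       # ≈ 0.01
print(state_zip_consistent("NY", "10001"))      # True
print(balance_invariant(100.0, 50.0, 30.0, 120.0))  # True: 100 + 50 - 30 == 120
```

In practice, derivation rules run inside ETL transformations, constraint rules guard data entry, and invariant rules run as reconciliation checks after the fact.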
3. Analytical Process
Analytics helps in better understanding data and provides the insights needed to form business strategies. It can drive innovation and operational effectiveness. The analytical process should be designed with the following points in mind:
Objective: The entire process should be governed by an objective, such as building a 360-degree view of the customer in order to provide the best service at the lowest cost and increase customer profitability.
Design: The analytics process needs to be designed to run on different timelines and cycles. Full automation may or may not be the best solution for a given business problem. For each process, planners need to decide what role each participant should play, such as serving as a subject matter expert.
Type: Define the type of analysis and the tools and techniques to be used, such as statistical techniques, strategy building, and other approaches.
4. Data Architecture and Technology
Designing the data architecture is essential to determine whether the BI solution is feasible to implement. A typical architecture has the following layers:
a. Source systems: Frontend applications capture information entered by an agent, customer, or other user, and that data is stored in the source systems in its raw format. There may be many source systems. Whenever requirements change, the source systems need to be updated accordingly to capture the data and ensure it flows into the data warehouse.
b. Landing area: This is an intermediate storage area used for data processing during the extract, transform and load (ETL) process. The raw data is collected from multiple source systems and stored in the landing area. This is the initial stage of the database where the table is loaded without applying any transformation or business rules.
c. Data warehouse: Data warehouses (DWs) are central repositories of integrated data from one or more disparate sources. They store current and historical data in a single place and serve as the central repository for all of an organization’s data.
d. Data cube: The data cube represents data along some measures of interest. It is a data abstraction for evaluating aggregated data from a variety of viewpoints. Despite the name, a data cube can have one, two, three, or more dimensions, depending on what’s needed. Each dimension represents an attribute of interest, while the cells in the cube hold the facts being aggregated.
Data cubes are mainly categorized into two categories:
Multidimensional Data Cube: Most online analytical processing (OLAP) products are built on a structure in which the cube is patterned as a multidimensional array. These multidimensional OLAP (MOLAP) products usually offer better performance than other approaches, mainly because the structure of the data cube can be indexed directly to gather subsets of data. As the number of dimensions grows, however, the cube becomes sparser: many cells representing particular attribute combinations contain no aggregated data. This inflates storage requirements, sometimes to undesirable levels, making MOLAP untenable for huge data sets with many dimensions. Compression techniques can help, but their use can damage MOLAP’s natural indexing.
Relational OLAP: Relational OLAP (ROLAP) makes use of the relational database model. A ROLAP data cube is implemented as a set of relational tables (roughly twice as many as the number of dimensions) rather than as a multidimensional array. Each of these tables, known as a cuboid, signifies a specific view of the data.
The data cube acts as the golden source for all reporting, avoiding different numbers being reported for the same metric and duplication of data. If a visualization tool like Tableau is used, the data needs to be aggregated and fed into the cube, since such tools cannot perform heavy data manipulation. For example, if a user wishes to see customer balances across products, the data cube would store this information at an aggregated level, rolled up against dimensions like product, geographic location, and month. It is then easy to pull the data into Tableau and slice and dice the information as required.
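As an illustration of that pre-aggregation, the following sketch rolls up hypothetical balance facts along every combination of three dimensions (product, region, month), using "*" as the all-values wildcard, so that a visualization tool only has to slice pre-computed totals instead of raw rows:

```python
from collections import defaultdict
from itertools import product

# Hypothetical fact rows: (product, region, month, balance) — assumed data.
facts = [
    ("savings", "EU", "2023-01", 100.0),
    ("savings", "US", "2023-01", 200.0),
    ("loans",   "EU", "2023-01", 300.0),
    ("savings", "EU", "2023-02", 150.0),
]

# Pre-aggregate balances for every subset of the three dimensions.
# A dropped dimension is rolled up under the wildcard "*".
cube = defaultdict(float)
for prod, region, month, balance in facts:
    for keep in product([True, False], repeat=3):
        key = (
            prod if keep[0] else "*",
            region if keep[1] else "*",
            month if keep[2] else "*",
        )
        cube[key] += balance

print(cube[("savings", "*", "*")])   # total savings balance: 450.0
print(cube[("*", "EU", "2023-01")])  # EU balance in January: 400.0
print(cube[("*", "*", "*")])         # grand total: 750.0
```

Real OLAP engines use far more sophisticated storage and indexing, but the idea is the same: every roll-up a report might ask for is computed once, so each query is a cheap lookup.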
5. Visualization Layer
Data visualization is both an art and a science. To communicate information clearly and efficiently, data visualization uses statistical graphics, plots, information graphics, and other tools. Numerical data may be encoded using dots, lines, or bars, to visually communicate a quantitative message. Effective visualization helps users analyze and reason about data and evidence. It makes complex data more accessible, understandable and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic follows the task. Tables are generally used where users will look up a specific measurement, while charts of various types are used to show patterns or relationships in the data for one or more variables.
There are various visualization tools available. A careful analysis of the pros and cons of each tool is needed to understand which one best suits an organization, based on the type, size, frequency, and architecture of its data and the business insights to be derived from it. This understanding is driven by knowing whether the need is strategic, tactical, or operational, and by the data visualization skills available in the organization. The following overview of popular visualization tools and their salient features can help organizations choose the right tool for their purposes.
Qlikview may be an appropriate solution if users have the right mindset to work with a programmatic interface and the ability to frame the right questions from the very start, particularly when data arrives in different forms. This solution also calls for a readiness to invest extra effort in maintaining proper reporting.
Klipfolio is a BI solution that resides entirely in the cloud, with no desktop application required. It provides a genuinely insightful tool for data visualization and dashboard composition, processing data efficiently enough to support real-time monitoring rather than periodic batch reporting. Klipfolio provides a powerful platform for building data dashboards and enables access to ever-changing live data sources. It is best used for live monitoring and control of continuous data flows whose dynamics are of great importance and may require urgent decisions; its live data connections keep dashboards consistent and current enough to support fast, reliable decision making.
Tableau is intended for the easy creation and distribution of interactive data dashboards that depict dynamics, trends, and data distributions through simple yet effective visuals. A core distinction from competitors is Tableau’s data blending feature. Another distinctive feature is real-time collaboration, making it a valuable investment for commercial and non-commercial organizations alike. Tableau is easy to pick up as a working tool; its learning curve is fairly gentle, even for those who have not previously been exposed to the technical details of visualization workflows.
Geckoboard is yet another cloud-based visualization solution. It can be an appropriate tool when one simply needs to display values using basic predefined widgets. Geckoboard provides the ability to compose dashboards from various widgets and has a rich library of ready-made integrations with the Facebook, Twitter, and Salesforce APIs, enabling users to instantly visualize social media engagement or business data. It is one of the most accessible visualization tools, popular among individual users, freelancers, businesses, and corporate users.
Power BI is a software solution developed and supported by Microsoft for business intelligence and analytics needs. At the core of Power BI is an online service with various options for interaction, featuring several outlets for connection to data provided by a third-party software and services. Power BI was created and designed with the aim to build upon the functionalities of MS Excel and extend its operability even further to unlock new use cases, cover more platforms, and reach out to the cloud.
Google Data Studio: The youngest tool on the list today is part of Google’s analytics solutions. Being relatively new to the field, it strives to take its position among many competitors via ease of usage, simple yet beautiful design, innovative problem solving and the ability to share dashboards with the same ease that people share documents. While still not fully released, Google Data Studio gives an interesting insight into how it can process the data. The tool has had a decent start, but time will show whether it will perform well in the long run.
Before making investments in a tool, the BI planning team should make sure to have a clear idea of the requirements and ensure that the same tool is used across the organization.
6. Change Management

Any BI transformation initiative is incomplete without a proper change management strategy in place. This has two aspects:
a. Alignment of BI Strategy and Business Unit Collaboration: Leadership must ensure that the BI strategy is communicated across the business units and everyone is aligned to a common goal. Good strategic alignment has an amazing effect on organizational performance. People perform better when they fully understand and accept the purpose and goals of their organization, and they develop a better sense of ownership when they understand what a difference they make in achieving those goals.
On the other hand, a lack of strategic alignment is one of the major causes for organizations to fail. Many organizations over time lose track of their key business purpose, finding it hard to answer the question, “Why are we doing what we are doing?” While this seems like a simple question with a simple answer, businesses lose sight of their key business purpose all too often.
b. Training, Awareness, and Adoption: Adoption is another major challenge facing most organizations. People are used to seeing numbers and charts in certain formats, and old habits are hard to break. Introducing a new BI or data visualization tool presents several challenges and considerations:
I. Comfort zone: People are often unwilling to change their habits, even if there is a more efficient way of working.
II. Unclear ownership: People are unclear as to who is responsible for the adoption process.
III. Measuring success: As the saying goes, it’s impossible to manage what isn’t measured. Success parameters need to be clearly defined.
IV. Communicate the change: There should be enough internal marketing done for the change and its benefits. Share success stories and the impact of transforming BI strategies.
V. Incentivize change: Provide people with the proper incentives to change. Incentives need not be monetary.
Adoption can also be facilitated through the following ways:
I. Trainings: Conduct workshops to upskill colleagues, walk through any new dashboards, and educate the end users as to how they benefit from using the tools.
II. Handover documents: Create easy-to-use documents as a handbook on how the users can use the dashboards.
Irrespective of the business vertical, the best practices above are fundamental to successfully implementing any BI transformation program. By assessing the current BI architecture against these six parameters, enterprises can gain clarity about their current reporting mechanisms and the business insights expected at different levels of the organizational hierarchy. This leads to a streamlined BI workflow in which relevant insights are accessible to all roles across the enterprise.