Traditional and Modern BI Tools

Enterprise analytic designs, guidance and visibility (20 years of direct experience)

Get the experience needed to evolve your enterprise analytic systems for increased coordination and profitable growth.  Alexicon focuses on enterprise data models, processes, data science and numerous analytic tools.  We provide the guidance needed during this dynamic time, when companies are integrating greater compute power with newer frontend visualization and statistical tools.

Most companies struggle to integrate Modern BI Tools such as Tableau, Qlik, Lumira, Spotfire and Microsoft Power BI with Traditional BI and analysts' statistical tools and techniques.  We help by building structures that serve both traditional and modern BI tools and by advancing enterprise information integration so all tools access the same proven data models.

Typical large enterprise analytic systems have thousands of company-shared (public) and personal reports.  Our first step is to catalog these systems, which provides visibility into the As-is Environment.  After this, change is possible toward achieving the designed To-be Environment.  We put analytics on the analytic systems to provide the foundation and visibility for change.

Added Plus for Enterprise Analytic Systems (time duration)

Our organization has worked with numerous high-technology companies, and we understand why they are experts in engineering and manufacturing: both require managing and balancing innovation, schedules and cost.  To meet schedules and desired cost or customer pricing, time duration is key for measuring and managing company and program processes and activities.  If you cannot measure the time spans between activities, it is hard to speed up operations.  We have expertise with process and time-duration methodologies and focus on overall company operations when designing and modeling analytic systems.  Our goal is to enable company associates with innovative and advanced analytics to manage time duration in addition to quantities, counts and dollars.  This brings process-based analytics to life, fostering the organizational process awareness needed to manage and speed activities, which improves financial results.
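As a minimal sketch of time-duration analytics, the snippet below computes the elapsed hours between consecutive activities in a hypothetical process event log.  The order IDs, activity names and timestamps are invented for illustration, not taken from any specific system:

```python
from datetime import datetime

# Hypothetical event log: (order_id, activity, timestamp) rows such as an
# enterprise system might capture for a build-to-ship process.
events = [
    ("A100", "order_received", "2024-03-01 09:00"),
    ("A100", "build_complete", "2024-03-03 14:30"),
    ("A100", "shipped",        "2024-03-04 08:15"),
]

def step_durations(rows):
    """Return (from_activity, to_activity, elapsed_hours) for each step."""
    parsed = [(act, datetime.strptime(ts, "%Y-%m-%d %H:%M")) for _, act, ts in rows]
    return [
        (prev_act, act, round((ts - prev_ts).total_seconds() / 3600, 2))
        for (prev_act, prev_ts), (act, ts) in zip(parsed, parsed[1:])
    ]

for frm, to, hours in step_durations(events):
    print(f"{frm} -> {to}: {hours} h")
```

Rolling these per-step durations up by product line or program is what turns raw timestamps into the time-duration KPIs described above.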

Below is our process for new or enhancement efforts:

Intelligence is gained by "Organizational Use" when company associates are able to slice-and-dice dimensions and numeric values in this integrated and tuned environment.  Key numeric values like Revenue, COGS, Expenses and Goals/Targets need to be fully communicated at the needed levels.  Key Performance Indicators (KPIs) around metrics like dollars, quantities, time durations, counts and unique computations are needed throughout the enterprise to drive top-level P&L or Income Statement performance.  With strong "Organizational Use", key discoveries made by users while designing and using the analytic systems can then be "baked in" or integrated within the EDW for mass use, which improves company operations through shared information.

Large Enterprise Databases

Superfast databases have become the new norm for BI systems sourcing from the EDW.  These databases combine software and hardware platform components.  They are Massively Parallel Processing (MPP) databases built for very large datasets, delivering advanced speed for heavy query workloads and computations across large data sets.  If traditional databases are like one-cylinder motors, MPP databases have multiple cylinders that can exceed 100.  MPPs chew through large datasets and perform computations like "Market Basket Analysis", where retailers work to understand the purchase behavior of customers.  MPPs complete these types of computations in minutes versus hours in traditional one-cylinder databases.  Hadoop and Spark (large cluster computing frameworks) now allow computations on tens of billions of records, which lets businesses understand and analyze even more extreme datasets in minutes or seconds.  Big Data result sets from Hadoop and Spark can also be fed to the MPP/EDW for integration with structured company data and/or explored with Big Data visualization tools.
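A minimal sketch of the Market Basket idea: count how often product pairs appear together and compute their support (the fraction of baskets containing the pair).  The products and baskets here are invented toy data; a real MPP system would run the equivalent logic in SQL over billions of line items:

```python
from itertools import combinations
from collections import Counter

# Toy transactions; each basket is the set of products in one purchase.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "milk"},
]

# Count co-occurrence of each product pair across all baskets.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Support: fraction of baskets containing the pair.
support = {pair: n / len(baskets) for pair, n in pair_counts.items()}
print(support[("bread", "butter")])  # bread+butter appear in 2 of 4 baskets -> 0.5
```

The same pairwise counting scales out naturally on MPP or Spark clusters, which is why these platforms finish in minutes what single-node databases take hours to compute.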

As datasets have grown, Big Data systems and smaller databases alike need business logic, summarizations and calculations handled in the database layer.  Many systems handle these important operations in the middle layer or BI Server (2) and/or in the frontend or Visualization (1) layer.  Moving to bigger data and faster systems requires different, newer data model builds; trying to compensate with additional software and hardware purchases yields less-than-acceptable performance gains because the slow or "hard-to-analyze" data models remain.  Alexicon can provide help and advice to avoid performance problems while opening up analysis capabilities as data scales, by pushing Visualization- or BI Server-level work down to the database layer.

Below are the major parts of a BI or Analytics system:

1. Visualizations, Dashboards and Reports

Traditionally, BI has been focused on users being able to run standard reports with prompt interactivity and even “ad hoc” abilities or a “self-service facility” to build reports.  Dashboards evolved from these systems and gained popularity in the last 10 years with Mobile BI, BI Search and Interactive Visualization tools following in recent years.  In addition, statistical client tools have been used throughout all these periods and continue to be a powerful tool that complements current “Big Data” efforts to assist with in-database computations.

2. Business Intelligence/Analytics (BI) Server

Frontend visualizations, dashboards and reports typically depend on the Business Intelligence Server (2) which sits in the middle of the overall BI Landscape.

The BI Server is used to aggregate reasonably sized data sets and present those data sets to Frontend Visualizations for OLAP or static use.  The server acts as a broker between the users and the database.  It provides a place to administer users and to organize content or visualization objects (typically thousands of users and thousands of visualization objects).

The key is to use the BI Server as a pass-through for the hard work performed at the Database layer (quick data throughput to users).  The Database should perform filtering, aggregations and computations where possible.  In addition, BI Server Meta Data should follow the same rule and should complement or be a pass-through for Meta Data at the Data Warehouse layer.

3. Databases: MPPs and Large-Compute SQL

The Database is where all the sorting, filtering, aggregations and computations should be performed.  This utilizes the database's computing power and passes summarized results to the BI Server layer, which passes the summarized data to the Visualization layer.  One example is a simple need to aggregate 20 billion base records into five regions.  If the summarization is done in the database, it produces five rows with the region names and accompanying numeric values as columns and passes those to the BI Server, which renders those five rows in the web visualization layer or passes the data to a client analytics tool (desktop application).

While this is an extreme case to produce five rows of data, these types of aggregations are critical for databases to handle (big or small).  Typically, the BI Server is overworked because this basic rule is not followed.  While 20 billion rows of data would definitely break most BI Servers, there are still many systems that run large aggregations in the BI Server and sometimes even in the visualization layer.  Users experience slow response times as a result of the BI Server being overworked.  With Big Data, this becomes painfully clear quickly, as in our example.  As data size and velocity increase, BI Servers start failing; data manipulations and computations need to be done in the database.
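To illustrate the rule, here is a small sketch using Python's built-in sqlite3 module as a stand-in database: the GROUP BY runs inside the database, and only the five summary rows travel onward to the BI layer.  The table name, regions and amounts are invented for illustration:

```python
import sqlite3

# In-memory stand-in for the warehouse database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
rows = [("North", 10.0), ("South", 20.0), ("East", 5.0),
        ("West", 7.5), ("Central", 12.5), ("North", 2.5)]
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# Aggregation is pushed down to the database: no matter how many base
# records exist, only one summary row per region leaves this query.
summary = con.execute(
    "SELECT region, SUM(amount) AS revenue FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
print(summary)  # five rows, one per region, regardless of base table size
```

The same SELECT ... GROUP BY pattern applies unchanged on an MPP warehouse with 20 billion base rows; what changes is only which engine does the heavy lifting.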

Often performance problems creep in over time through incremental builds with no Big Data strategy.  Alexicon provides the needed guidance to transition traditional data warehouses to exploit Big Data computing power for efficient and effective reporting through the BI Server and Visualization layers.  This involves modern data models.

The "Data Model"

Data Warehouses depend on the right data model to produce data structures for advanced Visualizations.  Traditionally, this has been done with heavy transforms in the Extract, Transform and Load (ETL) process, which are still needed in most cases.  As data Volume, Variety and Velocity (the 3V's) continue to increase, more advanced ways of ingesting data into large data structures are emerging, with Machine Learning (ML) and fast algorithms for streaming data loads.  This is seen more with modern ingest tools that remove much of the manual work in the ETL process.  Typically, we see ML with Big Data systems.

Financial reporting is an example where data models must be accurate to the penny or decimal place.  Many larger or Big Data sets are used for exploration and have been called Data Lakes (raw data).  EDWs typically store data prior to ETL work in the Operational Data Store (ODS).  ETL is used to clean and transform data from the ODS (raw data) and place clean data, conformed to Master Data (Enterprise Dimensions) tables, in the EDW.  This is where precise loading of the EDW matters for corporate reporting accuracy.  Data Lakes, like the ODS, hold raw data; used directly, they can provide valuable research insights, but they are typically not clean enough for financial-type reporting or for integration with existing reporting systems.  There is still creative tension in the software industry between financial-type accuracy and exploration or data science work with very large data sets and existing corporate reporting systems.
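A small illustrative sketch of this conformance step, with invented customer names and master-data keys: raw ODS rows are matched against the master dimension, and non-matching rows are quarantined rather than loaded, so EDW totals stay accurate:

```python
# Hypothetical master-data (Enterprise Dimensions) lookup: name -> surrogate key.
master_customers = {"ACME CORP": 1001, "GLOBEX": 1002}

# Raw ODS rows with the messy spellings typical of source systems.
ods_rows = [
    {"customer": " Acme Corp ", "amount": 19.99},
    {"customer": "globex",      "amount": 5.00},
    {"customer": "Initech",     "amount": 3.25},  # unknown to master data
]

clean, rejects = [], []
for row in ods_rows:
    # Normalize before matching against the master dimension.
    key = row["customer"].strip().upper()
    if key in master_customers:
        clean.append({"customer_id": master_customers[key], "amount": row["amount"]})
    else:
        # Quarantine for data stewardship review instead of loading bad keys.
        rejects.append(row)

print(len(clean), len(rejects))  # 2 conformed rows, 1 quarantined
```

Real ETL tools implement this matching declaratively, but the principle is the same: only rows conformed to master data reach the EDW tables that feed financial reporting.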

We have 20 years of progressive experience building advanced data models for enterprise BI and analytic systems across many industries and functional company areas.  Our data models provide thorough analysis capability and quick response times.  We have found that most large data warehouses do not have the KPI and metric analysis capabilities needed in today's fast-changing and competitive world.

It is surprising what performance gains can be achieved when a company develops the right analytic systems and unleashes a large number of users with dashboards and/or ad hoc BI tools (traditional and modern BI tools).  These tools provide needed summary and drill-down views for standard reporting needs and act as a discovery facility for users to gain new insights (self-service facility).  These systems also need to be backed by the best fit user deployment model, appropriate IT governance and shared stewardship with business sponsors and users.

Contact Us

Home Page