Enterprise Analytics

Enterprise analytic designs, guidance and visibility (20 years of direct experience)

We are small, skilled and agile. Large companies typically have a large consulting firm, a large integration company and multiple large analytics and data management suppliers, which means many moving organizations and parts. When designs, costs or timelines are challenged, we can assist by working independently and directly with your management team while coordinating with all suppliers.

We embrace Lean Six Sigma along with modern data science. Like martial arts, businesses have many compact moves, or processes, that must be coordinated and timed to achieve the overall result; world-class athletes work all areas and constantly refine these moves. We apply Lean Six Sigma Black Belt methods and techniques within Enterprise Data Warehouses (EDWs), around process-based data models and system user procedures. This helps reduce the classic Lean wastes of defects, over-production, waiting, non-utilized talent, transportation, inventory, motion and extra-processing, and ultimately supports increased profitability and/or growth.

QUALITIES - "It's not the daily increase but the daily decrease; hack away at the unessential." - Bruce Lee

Cloud migrations are also accelerating. We help in these areas by integrating structures for traditional and modern BI tools and by advancing enterprise information integration so that all analytic tools access the same proven enterprise data model and sub-models. This approach promotes high-end data models over siloed code in visualizations or reports. Services cover both on-prem and cloud-native landscapes.

We use business, management, Lean, Six Sigma, data science and big data methods, techniques and computations in EDWs.

Typical large enterprise analytic systems have thousands of company-shared (public) and personal reports. Our first step is to catalog these systems, which provides visibility into the As-is Environment. Only then is change toward the designed To-be Environment possible.

Large Enterprise Databases

Superfast databases have become the new norm for analytic systems sourcing from the EDW. These databases combine software and hardware platform components. They are Massively Parallel Processing (MPP) databases built for very large datasets, delivering high speed for heavy query workloads and large computations across entire data sets.
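As one concrete illustration of MPP physical design (a sketch only, using Amazon Redshift-style syntax; the table and column names are hypothetical), rows are distributed across compute nodes so that each node can scan and aggregate its own slice of the data in parallel:

```sql
-- MPP physical design sketch (Redshift-style syntax): rows are spread
-- across compute nodes by region_key, so large aggregations run in
-- parallel, one slice of the table per node.
CREATE TABLE sales_fact (
    order_id     BIGINT,
    region_key   INT,
    region_name  VARCHAR(50),
    order_date   DATE,
    sales_amount DECIMAL(18,2)
)
DISTKEY (region_key)
SORTKEY (order_date);
```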

As datasets have grown, both Big Data systems and smaller databases need business logic, summarizations and calculations to be handled in the database layer. Many systems instead handle these important operations in the middle layer, the Analytics Server (2), and/or in the frontend Visualization (1) layer. Moving to bigger data and faster systems requires different and newer data model builds rather than compensating with additional software and hardware purchases, which deliver less-than-acceptable performance gains when the underlying data models remain slow or "hard to analyze." Alexicon can help avoid these performance problems, and open up analysis capability as data scales, by pushing Visualization- and Analytics-Server-level work down to the database layer.
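As a minimal sketch of what pushing that work down looks like (the order_detail table and its columns are hypothetical), the filtering, business logic and aggregation below all execute inside the database, so only a handful of summary rows ever reach the Analytics Server or the visualization tool:

```sql
-- Filtering, derived business measures and aggregation all run in the
-- database; the result is one small row per product line.
SELECT product_line,
       SUM(revenue) - SUM(cost)                       AS gross_margin,
       SUM(revenue - cost) / NULLIF(SUM(revenue), 0)  AS margin_pct
FROM   order_detail
WHERE  order_date >= DATE '2019-01-01'
GROUP  BY product_line;
```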

Below are the major parts of a BI or Analytics system:

1. Visualizations, Dashboards and Reports

Traditionally, BI and analytics have focused on users running standard reports with prompt interactivity, including "ad hoc" abilities or a "self-service facility" for building reports. Dashboards evolved from these systems and gained popularity over the last ten years, with Mobile BI, BI Search and interactive visualization tools following in recent years. In addition, statistical client tools have been used throughout all of these periods and remain a powerful ad hoc complement to enterprise analytic efforts, assisting with in-database computations.

2. Analytics Server

Frontend visualizations, dashboards and reports typically depend on the Analytics Server (2), which sits in the middle of the overall BI landscape.

The Analytics Server aggregates reasonably sized data sets and presents them to frontend visualizations for OLAP or static use. The server acts as a broker between the users and the database, and it provides a place to administer users and to organize content or visualization objects (typically thousands of users and thousands of visualization objects).

The key is to use the Analytics Server as a pass-through for the hard work performed at the database layer, giving users quick data throughput. The database should perform filtering, aggregations and computations wherever possible. Analytics Server metadata should follow the same rule: it should complement, or pass through, the metadata at the data warehouse layer.
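One common way to enforce this rule (a sketch under assumed names; the view, table and columns are hypothetical, and DATE_TRUNC follows PostgreSQL-family syntax) is to publish warehouse views that already carry the business logic, so the Analytics Server's metadata simply maps onto them and passes queries through:

```sql
-- Business logic lives in the warehouse; the Analytics Server's metadata
-- maps report fields onto this view and passes user queries through.
CREATE VIEW monthly_region_summary AS
SELECT region_name,
       DATE_TRUNC('month', order_date) AS order_month,
       SUM(sales_amount)               AS total_sales,
       COUNT(*)                        AS order_count
FROM   sales_fact
GROUP  BY region_name, DATE_TRUNC('month', order_date);
```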

3. Database Servers (MPPs, Large Compute R, Python and SQL)

The Database is where all sorting, filtering, aggregations and computations should be performed. This uses the database's computing power and passes summarized results to the Analytics Server layer, which in turn passes the summarized data to the Visualization layer. One example is a simple need to aggregate 20 billion base records across five regions. If the summarization is done in the database, it produces just five rows: the region names with their accompanying numeric values as columns. The database passes those to the Analytics Server, which renders the five rows in the web visualization layer or passes the data to a client analytics tool (a desktop application).
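Expressed as a query (a sketch; sales_fact and its columns are hypothetical stand-ins), the five-region example looks like this. The database scans all 20 billion base records, and only the five summary rows travel upward:

```sql
-- The database does the heavy scan and aggregation; with five regions,
-- exactly five rows are returned to the Analytics Server.
SELECT region_name,
       SUM(sales_amount) AS total_sales,
       COUNT(*)          AS record_count
FROM   sales_fact        -- ~20 billion base records
GROUP  BY region_name;   -- five regions, five result rows
```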

While producing five rows from 20 billion is an extreme case, these types of aggregations are critical for databases, big or small, to handle. Typically, the Analytics Server is overworked because this basic rule is not followed. While 20 billion rows of data would break most Analytics Servers outright, many systems still run large aggregations in the Analytics Server, and sometimes even in the visualization layer. Users then experience slow response times because the Analytics Server is overworked. With Big Data this becomes painfully clear quickly: Analytics Servers start failing as data size and velocity increase. This is why data manipulations and computations need to be done in the database.

The "Data Model"

Data Warehouses depend on the right overall, or enterprise, data model to produce data structures for reports, dashboards and visualizations. As data Volume, Variety and Velocity (the 3 V's) continue to increase, more advanced ways of ingesting, storing and providing data structures for analytic tools become available. Cloud solutions have added to the ease of scaling analytics systems, and with their added "Big Compute" abilities, clouds are even more attractive. Alexicon focuses on the overall enterprise data model, indicators, sub data models, summaries and detail records. The goal is to drill from top levels as quickly as possible, and with accuracy, to any level in the system.
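A minimal sketch of that drill path (all object names hypothetical, reusing the summary view sketched earlier): a top-level indicator reads a few rows from a small summary structure, and the same keys carry the user straight down to the detail records behind any summary cell:

```sql
-- Top level: the indicator reads a small, fast summary.
SELECT region_name, order_month, total_sales
FROM   monthly_region_summary
WHERE  order_month = DATE '2019-06-01';

-- Drill-down: the same keys fetch the detail records behind one cell.
SELECT order_id, order_date, sales_amount
FROM   sales_fact
WHERE  region_name = 'West'
  AND  order_date >= DATE '2019-06-01'
  AND  order_date <  DATE '2019-07-01';
```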

Experience Curve

Alexicon has 20 years of progressive, successful experience implementing enterprise analytic systems across a variety of industries and functional company areas. Our data models, the heart of a system, provide thorough analysis capability and quick response times. It is surprising how quickly performance gains appear once a company develops the right analytic systems to match its strategy. We can assist with your company's strategic alignment of its analytic and data management solutions (fixes, enhancements and new analytic landscapes).

Contact Us