
Welcome to Navigatinganalytics.com!

Put together by Stephen Ogunyemi, Ph.D., this website shares strategic approaches, practical know-how, implementation techniques and hands-on experience for unlocking the power of innovative strategic data management solutions. These span small and big data, advanced analytics and modeling, machine learning algorithms, applied optimization, business simulations, optional and economic scenario generators, scenario testing and planning, and business intelligence reporting processes, products and services, with static and real-time access and delivery through decisioning boards, dashboards, scorecards, operations command centers, virtual command centers and mobile channels.

 

There is also navianalytics, which aims to combine the power of in-memory processing in nanotechnologies with the possibilities of quantum computing to handle more complex, concurrent, streaming, distributed, dynamic and refactoring calculations, in pursuit of what comes next in applied analytics as part of successful strategic business deliverables.

 

Simply put, navigatinganalytics.com shows how to leverage different types of analytics in business: practices, insights, process enhancement, tools and techniques, plus tangible steps in gathering Voice of the Customer (VOC), gap assessment, strategic planning, implementation, monitoring and reporting, to ensure successful and sustainable deployment of superior strategic business capabilities that include:

There are comprehensive lists of statistical, risk, econometric, mathematical and machine learning techniques learned, tested and used for solving practical business problems. These techniques have added tremendous value to the bottom lines of many companies in different industries. The lists are available for:

Stephen A. Ogunyemi, MBA, Ph.D.
   Vice President,
   Data Management Services (previously VP, Analytics)
CBE Companies

Executive Builder and Manager of Analytics Teams, Structures, Models (Governance, Operating, Organization and Deployment Models), plus Products, Services and Processes for:

  • Big and Small Data Management Services

  • Advanced Predictive and Digital Analytics

  • Mathematical and Statistical Modeling

  • Applied Optimization

  • Scenario Analysis

  • Applied Simulation

  • Business Rule Engines

  • Business Intelligence Reporting, Visual Analytics and Dashboards

  • Interactive Decisioning Boards

  • Real-time Streaming from Cloud Technologies

  • Operations and Virtual Command Centers

  • Virtual Prescriptive Analytics

  • Applied Machine Learning

Enhanced Analytics on Demand (EAoD): Drilling Down on Performance Metrics Through Mobile Channels

In terms of context and content, analytics in general is not new, but it is only now becoming able to flawlessly integrate and leverage new advanced technologies. Some of these advanced technologies are shown in Commercial Analytics.

Next Frontier: Marching Forward with Quantum Computing + In-Memory Processing Solutions with Nanotechnologies

We have mostly been riding on the back of grid computing for our most sophisticated complex calculations, using micro technologies to generate advances in scalable storage systems; distributed databases; distributed file systems; massively parallel processing (MPP) databases; distributed real-time computation systems for fast processing of large streaming data; fast and scalable in-memory databases, analytics and intelligence platforms; digital analytics tools; data mining grids; and cloud computing platforms.

With these advances, we now have tremendous opportunities to leverage technological best practices to push analytics value propositions, and the strategic business deliverables they drive, to heights we once thought possible only in analytics dreamland!

 

As we look into the future to realize the best in data, analytics, modeling and business intelligence (BI) deliverables, existing infrastructures are becoming inadequate. With continuous pressure from the stock markets on Facebook, Twitter, RenRen, Sina Weibo, QQ and WeChat to grow membership, and by implication to grow social data, the future will need more powerful processing engines and more storage. Two related elements of interest coming down the pipeline are quantum computing and nanotechnologies.

Emerging Interactive Decisioning: Sophisticated and Complex Calculations, Machine Learning and Modeling Can Be Accelerated with More Processing Power

Quantum Computing:

In many tests, quantum computing is displaying enormous power to handle extremely rapid, complex, streaming and refactoring calculations with thousands of interacting categorical and continuous variables.

Apart from social data, it is now more obvious than ever that we will keep generating tremendous amounts of data through different activities. An example is the sub-atomic particle-smashing experiments by scientists at the Large Hadron Collider (LHC) in Switzerland. Built by the Conseil Européen pour la Recherche Nucléaire (CERN) between 1998 and 2008, the LHC became the world's largest particle collider. Since the initial experiments, the Worldwide LHC Computing Grid (WLCG) project has grown to more than 170 computing centers in 42 countries, with the aim of handling the storage, distribution and analysis of the more than 30 petabytes (30 million gigabytes) of data that the LHC generates annually.

The 27-kilometre Large Hadron Collider (LHC) at CERN is currently the largest data generator in the world, running at 13 TeV. With China's proposed 52-kilometre 'Higgs factory' targeting an eventual 240 GeV by 2028,

We Are in for Lots of DATA to Come!

The question is: Do we have the data storage and processing power to get the BEST out of these generated data for robust pattern and segmentation analyses?!

The LHC was shut down in 2013 for upgrades. It reopened in April 2015, and in May 2015 it collided particles at an unprecedented energy of 13 trillion electron volts (TeV), with more to come. This is a tremendous increase over the 8 TeV of the initial experiments, which ended in 2012.

Noticeable from the LHC experiments was the lack of adequate processing power and storage infrastructure. Even with China's supercomputer Tianhe-2 (MilkyWay-2), the fastest computer in the world, the rapid increase in the complexity, size and dimensionality of data is forcing the search for processing power that can go beyond treating data as binary digits (0s and 1s)! Statistically, in analyzing independent variables in modeling, we in analytics "throw away" independent variables as "insignificant". The entanglement at the heart of quantum computing can potentially be used to process the correlated part of those "insignificant" independent variables, so as to increase the coefficient of determination, the R-squared or percent of variance explained of the dependent variable.
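The statistical point above can be seen at small scale. The sketch below uses synthetic data and ordinary least squares (all variable names are illustrative): keeping a correlated predictor that might otherwise be dropped raises the R-squared relative to the reduced model.

```python
import numpy as np

def r_squared(X, y):
    """Ordinary least squares R-squared, with an intercept column added."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)   # correlated, "weak" predictor
y = 2.0 * x1 + 0.3 * x2 + rng.normal(size=n)

full = r_squared(np.column_stack([x1, x2]), y)   # keep both predictors
reduced = r_squared(x1.reshape(-1, 1), y)        # "throw away" x2
print(f"reduced R^2: {reduced:.3f}  full R^2: {full:.3f}")
```

Because the reduced model's regressors are a subset of the full model's, the full R-squared is always at least as large, which is the variance-explained gain the paragraph describes.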

 

The superposition principle becomes the precursor for the addition of two time series. How awesome that will be for robust predictive analytics of business and stock market performance! With entanglement and superposition, a quantum computer can process a vast number of calculations simultaneously. A classical computer works with ones and zeros; a quantum computer has the advantage of using qubits: ones, zeros and "superpositions" of ones and zeros. Certain difficult tasks long thought impossible or intractable for classical computers may now be achieved quickly and efficiently with quantum computing.
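As a down-to-earth illustration of superposition (a toy classical state-vector simulation, not actual quantum hardware), a Hadamard gate places a qubit in an equal mix of |0> and |1>, and an n-qubit register in uniform superposition carries 2^n amplitudes at once:

```python
import numpy as np

# Single-qubit basis state |0> and the Hadamard gate
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = H @ ket0               # equal superposition of |0> and |1>
probs = np.abs(state) ** 2     # Born rule: measurement probabilities
print(probs)                   # [0.5 0.5]

# An n-qubit register in uniform superposition spans 2**n amplitudes
n = 10
register = np.ones(2 ** n) / np.sqrt(2 ** n)
print(len(register))           # 1024 amplitudes for 10 qubits
```

The exponential growth of that amplitude vector is exactly why classical simulation runs out of memory quickly, and why quantum hardware is attractive for the calculations described above.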

Quantum computing will be faster in environments that need Markov Chain Monte Carlo (MCMC) based Bayesian statistics over large datasets to fit large hierarchical models requiring integration over hundreds or even thousands of unknown parameters. Quantum computing will likewise be friendlier to complex machine learning algorithms. Grover's search algorithm, Shor's factoring algorithm and the rapid advances in D-Wave's processors are affirming the possibility of high-level reasoning in the decision-making processes that are in the pipeline for the next big future in data management, analytics, modeling, machine learning and business intelligence.
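For a feel of the MCMC workload in question, here is a minimal random-walk Metropolis sampler for the mean of a normal likelihood; the data, prior and tuning values are invented for illustration:

```python
import math
import random

random.seed(42)
data = [random.gauss(5.0, 2.0) for _ in range(200)]   # synthetic observations

def log_post(mu):
    # Flat prior on mu; normal likelihood with known sigma = 2
    return -sum((x - mu) ** 2 for x in data) / (2 * 2.0 ** 2)

mu, samples = 0.0, []
for _ in range(5000):
    prop = mu + random.gauss(0, 0.5)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                              # accept the move
    samples.append(mu)

burned = samples[2500:]                        # discard burn-in
est = sum(burned) / len(burned)
print(f"posterior mean estimate: {est:.2f}")   # should land near 5.0
```

Real hierarchical models repeat this loop over thousands of parameters and far more data, which is where the processing-power argument above bites.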

 

Can exascale computers take us to where we can handle more complex calculations better than we do today? With the US President's order of July 29, 2015 to build exascale computers, which would overtake China's supercomputer Tianhe-2 (MilkyWay-2) as the fastest in the world, there is hope that exascale computers may end up imitating the level of performance expected from quantum computers, or even bettering it!

With Highly Intelligent and Interactive Decisioning-boards Replacing Dashboards, There is the Urgent Need for Very Powerful Processing and In-memory Technologies

In-Memory Processing Solutions with Nanotechnologies:

With compliance regimes such as Sarbanes-Oxley (SOX), the Health Insurance Portability and Accountability Act of 1996 (HIPAA), SEC Rules 17a-3 and 17a-4, the Gramm-Leach-Bliley Act (GLB), the USA PATRIOT Act, the Computer Fraud and Abuse Act (CFAA), Basel III and the European Union's Data Protection Act in the background of information access, information management and data security, nanotechnologies provide better alternatives to current technologies such as RAID devices and Storage Area Networks (SANs). The opportunities nanotechnologies offer for in-memory storage and processing for databases, analytics and business intelligence will provide the quick seek times, reliability and predictability needed to test and validate the business strategies of the future.

 

Going away are the days when databases had to continuously access disk-based storage to fetch data. MemSQL, VoltDB, Microsoft, SAP, Teradata, IBM and Oracle are among the companies making giant strides with new in-memory database management system (IMDBMS) technologies, making it easier for analytics developers and others to build ultra-fast-response applications with analytics and business intelligence capabilities. The days of pre-created analytic metadata are shortening, allowing for more robust analytics deliverables based on in-memory databases.
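The in-memory idea can be tried at toy scale with Python's built-in sqlite3 module in `:memory:` mode. This is a small stand-in for illustration, not one of the vendor IMDBMS products named above; the table and data are made up:

```python
import sqlite3

# ":memory:" keeps the entire database in RAM, so queries never touch disk
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("East", 120.0), ("East", 80.0), ("West", 200.0)],
)

# Aggregation runs entirely in memory
rows = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)   # [('East', 200.0), ('West', 200.0)]
con.close()
```

Production IMDBMS products add durability, concurrency and scale on top of this basic idea of serving queries from RAM.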

Solving and Modeling Complex Business Situations with Quantum Driven Technologies: A Dream Coming True for Strategic and Operation War Rooms Deliverables

The combination of nanotechnologies with in-memory processing technologies opens a new frontier in data storage, data processing and in-memory systems. Nano-scale data storage can potentially replace current memory and storage technologies like SSD and DRAM as non-volatile solid-state memory, providing building blocks for both grid and quantum computing.

Also important: combined with quantum computing, nanotechnologies offer the opportunity to store possible configurations in memory as in-memory analytics.

Strategic Big Data + Open Source + Inexpensive High Powered Parallel Processing + Grid Technologies = Driving the Best in Analytics to the Next Levels To Support Successful Businesses

Strategic Data covers the strategic management of big data and small data, including data governance for both. Companies make decisions every day. Whether the company is driving performance to maintain its share of the current market, entering a promising new market, introducing new policies to manage customer churn and buying habits and keep its share of wallet, or keeping up with new regulation coming into play across the board, the company needs data.

 

Many of the decisions that are long-lasting and sustainable in competitive environments are decisions driven by data. Many of the companies that out-compete are strategic, and so is the data those companies use to make these decisions.

The data is strategic in that it provides the input for determining the capabilities needed to drive long-term goals and objectives. It helps in assessing the company's maneuvering boundaries within its industry; how the company can break out of the industry's limitations; how it can change the name of the game in its current or aspiring industry; and how it can leverage its competencies, resources, processes, and insights into customers' needs, market trends and changes in the economy to design, develop and market new products and services. That is what can turn the company into a leader rather than a follower, able to outgrow its competitors in revenue generation, profit maximization and growth in shareholder value.

Managing the Big Data to the Point That Analytics Can Be Conducted and Insights Can Be Deduced for Actionable Deliverables

Navigatinganalytics.com provides two types of value propositions within the context of strategic data.  These include:

  1. The company has to accept that, in order to collect, know, standardize, access, use, derive value from and secure its data, it has to invest in data as part of the company's capabilities.

  2. Once these capabilities are planned, built or acquired, and implemented, they have to be leveraged correctly to generate a return on the capital invested in data, analytics and business intelligence tools, technologies, processes, infrastructure and manpower, and to provide the overall competitive edge for the company to outperform its competitors in revenue generation, profit maximization and growth in shareholder value.

Translating Big Data into Big Insights @ Our Fingertips

While there is tremendous interest in big data and the big insights being generated, an analytics function that wants the complete picture of the company, in order to maximize its capabilities, will have to combine big and small data. Different, interesting and sophisticated outputs can be generated when big and small data are merged. So the big and small data are leveraged to add value in the implementation of analytics, as shown in:

Some of these solutions represent the next generation of products and services that will add tremendous value through effective implementation of big and small data as part of the company's capabilities.
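One concrete, if simplified, form of that big-and-small merge: join a high-volume event stream against a small customer reference table. The record layouts and values here are invented for illustration:

```python
# "Small" reference data: one row per customer (e.g. a CRM extract)
customers = {101: "Gold", 102: "Silver", 103: "Gold"}

# "Big" data: a high-volume event stream (tiny illustrative sample)
events = [
    {"cust": 101, "spend": 40.0},
    {"cust": 103, "spend": 15.0},
    {"cust": 101, "spend": 25.0},
    {"cust": 102, "spend": 60.0},
]

# Join each event to its customer tier, then aggregate spend by tier
spend_by_tier = {}
for e in events:
    tier = customers[e["cust"]]
    spend_by_tier[tier] = spend_by_tier.get(tier, 0.0) + e["spend"]

print(spend_by_tier)   # {'Gold': 80.0, 'Silver': 60.0}
```

At production scale the same join runs in a distributed engine, but the value comes from the same place: the small reference data gives the big stream its business meaning.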

Enterprise Analytics: ​Leveraging Analytics to Stay Ahead in Competitive Environments

Enterprise Analytics addresses whether to have an enterprise analytics organization or to confine the analytics organization to the part of the business that needs it most. Whether or not the analytics organization is enterprise-wide, we include highlights of what is needed to manage an existing enterprise analytics organization, and of what is needed to build one from the ground up. The highlights start from the basics of understanding where the company wants to go in terms of strategic direction.

​Whether managing or building an enterprise analytics organization, or just confining the analytics organization to a part of the business as a support functional organization, there are basic deliverables that the analytics organization must deliver. Some of these deliverables include:

 

There is also a look at some of the analytics methods, including statistical, risk, econometric, mathematical and machine learning methods, plus some of the critical success factors in sustaining the existence of an analytics organization.

Designing, Developing and Leveraging Metrics That Matter: From Data Catalog to Analytic Catalog and From Metric Catalog to Reporting Catalog in Search of How Best To Reduce, Mitigate and Manage Variance to Plan (VOP) in the Quest to Ensure That Performance is Aligned to Pre-set Objectives in Terms of Productivity, Sales, Revenue and Profits
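Variance to plan itself is a simple calculation. A minimal sketch, with made-up plan and actual figures:

```python
def variance_to_plan(actual, plan):
    """Variance to plan as an absolute gap and as a percent of plan."""
    gap = actual - plan
    return gap, 100.0 * gap / plan

# Illustrative monthly metrics: (actual, plan)
metrics = {"Sales": (950_000, 1_000_000), "Revenue": (1.05e6, 1.0e6)}
for name, (actual, plan) in metrics.items():
    gap, pct = variance_to_plan(actual, plan)
    print(f"{name}: gap {gap:+,.0f} ({pct:+.1f}% of plan)")
```

The catalogs named above govern where `actual` and `plan` come from and how the resulting metric is reported; the arithmetic stays this simple.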

Digital Analytics: ​Strengthening Digital Strategy With Digital Analytics

Digital Analytics explores a whole new world that was not possible some time ago. For data and analytics enthusiasts with a philosophy background, this could be considered transcendental, except that it is real and here now! The emergence of big data technologies, among others, has elevated the typical digital marketing strategy to a new height and beyond.

We need Digital Analytics to provide qualitative, semi-quantitative and quantitative approaches to discovery, description, prediction, prescription, visualization, actionable insights and recommendations in areas that include social media, CRM, search engine marketing, viral marketing, mobile marketing, email marketing, online reputation management, pay-per-click (PPC) systems, affiliate marketing and display media.
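A basic digital-analytics calculation of the kind these areas rely on is funnel conversion; the stage names and counts below are hypothetical:

```python
# Hypothetical counts at each stage of a brand/purchase funnel
funnel = [("Visit", 10_000), ("Product view", 4_000),
          ("Add to cart", 1_200), ("Purchase", 300)]

# Step conversion: each stage as a share of the previous one
for (prev, p_n), (stage, n) in zip(funnel, funnel[1:]):
    print(f"{prev} -> {stage}: {100.0 * n / p_n:.1f}%")

# Overall conversion: final stage as a share of the first
overall = 100.0 * funnel[-1][1] / funnel[0][1]
print(f"Overall conversion: {overall:.1f}%")
```

The step rates show where the funnel leaks, which is where the customer questions in the banner below get answered.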

Towards a Competitive Edge with Digital Analytics: What Are the Effects of Our Digital Strategy on This Customer? To Which Customer Segment Does She Belong? What Is Her Buying Behavior? At What Stage of the Brand/Purchase Funnel Is She? What Will She Buy? Will She Repurchase? Will She Come Back or Defect? Will She Become a Loyal Customer? Will We Be Able to Retain Her?

Commercial Analytics: ​Putting Together Strategically Designed Packages for Business Solutions

Commercial Analytics: According to International Data Corp. (IDC), global spending on business analytics services is projected to rise from $51.6 billion in 2014 to $89.6 billion in 2018, a compound annual growth rate (CAGR) of 14.7 percent. Within navigatinganalytics.com, Commercial Analytics lays out the basic requirements of how best to develop or package the data, analytics, modeling and business intelligence (BI) deliverables to meet clients' expectations.
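The IDC projection quoted above can be sanity-checked with the standard CAGR formula:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)**(1/years) - 1."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# IDC projection cited above: $51.6B (2014) -> $89.6B (2018)
rate = cagr(51.6, 89.6, 2018 - 2014)
print(f"CAGR: {rate:.1%}")   # about 14.8%, in line with IDC's ~14.7%
```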

Virtual Prescriptive Analytics + Your Five-Year Strategic Business Plan

Run Your Current Business Today and See What Your Business Looks Like in 2027

Powering with Navianalytics: Business Leaders Can Run Their Businesses of the Future Today to See What Their Businesses Will Look Like in Terms of Growth Opportunities, Acquisitions, Changes in National and Global Economies, etc.

Virtual Prescriptive Analytics pushes the frontier in the next evolution of advanced tools to derive value from data, analytics, modeling, machine learning and business intelligence (BI) engagements. It sets out to use the combined power of multi-objective machine learning and in-memory processing, together with data, analytics, modeling, simulation and optimization, to run the strategic plan of a company into the future. This is beyond predictive analytics!

Navianalytics and the navianalytics cloud are designed to accept streaming data from both big data and traditional sources. In addition, navianalytics has built-in machine learning modules that constantly learn the data, constantly produce machine-learned analytics equations, and refactor and optimize those equations, which are then moved through a process into a machine-learned business intelligence module that generates the combination of information the business users need. At each level of data, analytics and business intelligence, in-memory capabilities constantly store machine-learned data, machine-learned analytics equations and machine-learned business intelligence information.

Just like age progression in Adobe Photoshop or GIMP, the technological capability of navianalytics allows business leaders to see what their current strategic plan will actually turn out to be at the end of five years. No, it is not scenario analysis, and it is not simulation. It is the ability of machine learning technologies with in-memory capabilities to accommodate numerous interactions of both continuous and discrete variables to produce time series output that can guide business leaders on how best to execute a three-to-five-year strategic plan without getting lost in the shuffle or in the weeds!

Navigatinganalytics.com shares high-level depictions of:

Cloud-Based Services: Leveraging Cloud-Based Services (DaaS, AaaS, MaaS & RaaS) to Deliver Real-Time Analytics and Reporting to the Business

Cloud-Based Analytics offers efficient opportunities to deliver data, analytics, modeling and business intelligence (BI) services. With Cloud-Based Analytics, data can be stored in processed or unprocessed form. Analytics can be scheduled, stored or downloaded on demand. Models can reside in memory and be augmented as the model manager notices degradation. Interactive business intelligence and visualization objects can be accessed from different devices.

Operations Command Center: Leveraging Analytics to Dynamically Re-Distribute Work to More Productive Resources to Meet and Exceed Revenue Goals

Command Center: For those in the military, the Command Center is an invaluable place from which to execute a plan flawlessly. A Command Center sitting on top of data, analytics, modeling and business intelligence (BI) services allows for tactical alignment of employee skill-sets with the needs of the business units, re-distributing work to available resources during operating hours and dynamically focusing on meeting revenue goals.
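A toy version of such re-distribution logic, with invented work types and agent skills, might greedily route each batch of queued work to the least-loaded agent who has the required skill:

```python
# Hypothetical work queue and agent skills for a command-center view
queue = [("claims", 5), ("billing", 3), ("claims", 2)]   # (work type, units)
agents = {"Ann": {"claims"}, "Ben": {"billing", "claims"}}
load = {name: 0 for name in agents}

# Greedy rule: send each batch to the least-loaded agent with the skill
for work_type, units in queue:
    able = [a for a, skills in agents.items() if work_type in skills]
    chosen = min(able, key=lambda a: load[a])
    load[chosen] += units

print(load)   # {'Ann': 5, 'Ben': 5}
```

A real command center would re-run a rule like this continuously as the queue and agent availability change during operating hours.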

Next Generation of Command Center Strategic and Operation War Rooms: Designed for Reliable and Stable Strategic and Tactical Actionable Options

Please note that the word "Analytics" is used loosely here to include: Data Management, Analytics, Modeling, Optimization, Business Intelligence Dashboards, Machine Learning and Real-Time Streaming from Cloud Technologies. Thanks, Stephen Ogunyemi, Ph.D.
