
Techaisle Analyst Insights

Trusted research and strategic insight decoding SMBs, the Midmarket, and the Partner Ecosystem.
Davis Blair

Meet the New Boss: Big Data (WSJ) – Techaisle Take

Wall Street Journal Article

This is an interesting article from the WSJ concerning how we are slowly allowing decision-making processes to move away from people and be handled by algorithms instead. It caught our attention at a time when we are completing survey work for a Business Intelligence report. As discussed in an earlier post, one of the key trends in BI is how deeply it is being embedded into all kinds of applications, and this article is a good example. Please let us know what you think: comment, like, tweet or forward.

Laying the Foundation


Analytic software has evolved through several generations over the last 70 years, beginning around WWII, when a series of unprecedented number-crunching challenges gave rise to Decision Support Systems (DSS) designed to solve problems such as finding the best equipment production mix given material constraints, logistically supporting the Allied invasion of Europe, splitting the atom, or breaking the Japanese code. These kinds of problems tended to be monolithic, using stochastic models to find the best answer, and were a major catalyst for the development of the mainframe computer.

Business Intelligence (BI) replaced this linear and iterative approach with one that supported solving business problems, mostly operational, within the divisions and departments of large commercial organizations, using more distributed equipment for a wider audience, i.e., Finance, Operations and Distribution. In the late 1990s there was an explosion of data resulting from the widespread adoption of CRM, the killer app of the Client/Server era, adding mountains of Sales and Marketing data to the volumes of operational information. There was a growing need for a top-down view of how performance in one area of the organization was impacting the others, to take a more structured approach to understanding cause and effect, setting objectives and consistently measuring performance to improve results. BI was evolving into Enterprise Performance Management (EPM), which is where market leaders are today.

EPM is characterized by using Business Intelligence software to understand best-performance scenarios, measure actual results against key performance indicators (KPIs) and determine how to close the gaps, using exception reporting for most front-office functions (CRM/SFA) and rules-based processing for the back office (Process Manufacturing/Real-Time Bidding, SCM/Advanced Web Analytics).
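The exception-reporting idea can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation; the KPI names, targets and 5% tolerance are all invented:

```python
# Hypothetical sketch of KPI exception reporting: flag metrics whose actual
# value misses target by more than a tolerance. Names and figures are
# illustrative, not from any specific EPM product.

def kpi_exceptions(kpis, tolerance=0.05):
    """Return KPIs whose actual falls short of target by > tolerance (fraction)."""
    exceptions = []
    for name, target, actual in kpis:
        gap = (target - actual) / target
        if gap > tolerance:
            exceptions.append((name, round(gap, 3)))
    return exceptions

kpis = [
    ("quarterly_revenue", 1_000_000, 920_000),  # 8% short -> exception
    ("on_time_delivery",  0.95,      0.94),     # ~1% short -> within tolerance
    ("pipeline_coverage", 3.0,       2.4),      # 20% short -> exception
]

print(kpi_exceptions(kpis))
# -> [('quarterly_revenue', 0.08), ('pipeline_coverage', 0.2)]
```

Rules-based back-office processing works the same way in principle, except the rule triggers an action rather than a report line.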

Optimization Nation


Equally important as the individual BI technology advances are some of the underlying rules that have accompanied the evolution: Moore's Law, Metcalfe's Law and the Law of Accelerating Returns all drove exponential growth in production, adoption and utility. Over a 20-year period, the cumulative effect of these technology investments has amounted to a slow-motion Black Swan event, with huge impacts on our society, including but not limited to the following optimization activities:

Economy – development of consumer mortgage products designed to optimize sales volume regardless of risk; bundling them into bonds to optimize profit on the debt; creation of derivatives to optimize transactions and create demand for increasingly suspect debt; development of new financial instruments with no underlying value, such as synthetic derivatives that have nothing but conceptual paper profits behind them; etc. By 2008 these financial instruments had optimized leverage to create risk greater than the combined GDP of the industrialized world.

Employment – The WSJ article goes into depth about how algorithms have already replaced a hiring manager's decisions, based on probabilities of how the employee might behave under certain circumstances. Employer choices have also been optimized by a flattening of the market caused by oceans of virtually unlimited supply from sites like Monster.com, 100K Jobs, Dice, etc. Middle management has been optimized out of the organizational chart and replaced with productivity tools, more efficient communications and a lower ratio of managers to workers. And the actual number of staff required to hit the bottom line has been optimized down, even as CEO salaries have been optimized up. Looking a little further down the line, Andrew McAfee's POV is deep on this subject, and more technical than mine.

Industry – We all know that manufacturing was moved offshore en masse over the past three decades to optimize production costs, but several other industry segments have been optimized as well, including Retail, which has optimized through consolidation, and Healthcare, which has optimized revenue per patient. Retail has been optimized at a structural level to provide one-stop shopping for almost everything you need in a single location, while volume has been optimized to produce the absolute lowest price at any cost, including optimizing the number of worker hours to keep an optimal ratio of full-time to part-time employees and save the resulting benefit costs. It has also optimized the number and variety of retail outlets and small businesses required to serve an optimized population per square mile. Healthcare prices have been optimized to take advantage of tax structures, potential lawsuits, healthcare insurance gaps, maximizing federal matching funds, Board and C-Suite compensation, pharma industry profits, and more.

Government – Automation has also enabled a profitable business model that optimizes the use of Federal Government funds and ensures that every available dollar is spent, whether it is to make sure everybody gets a state-of-the-art mobile wheelchair, their 120 monthly catheters, a speaking glucose meter, maximum disability benefits, etc.  “Don’t worry - we’ll handle all the paperwork.”

Set it and Forget it


The imminent next generation of analytics involves truly "optimized" complex systems with human intervention largely removed from the process. Not to single out Wall Street, but it offers one of the best examples of the unbridled application of technology in the singular pursuit of optimization, in its case profit for itself and its shareholders. The Financial Services industry has invested billions in technology and employed thousands of physicists and Ph.D.-level mathematicians to achieve a transaction advantage of a couple of milliseconds, and has programmed algorithms to use that advantage and change the rules (i.e., the assumption that share price represents perfect information is no longer true). This has not always produced predictable results, and the ghost in the machine has put us back on the precipice more than once, as seen in this TED video by Kevin Slavin.

As we move into a brave new world that combines optimization software with networks that operate too fast for human intervention, more of our lives will be controlled by how rules are programmed into the system than by what we do as individuals to affect the results. One of the best examples of where this is heading is IBM's Smarter Cities Initiative, which combines intelligent networks that manage Traffic, Water and Electric Utilities, Transportation, Healthcare, Public Safety, Education and others into an overall "Intelligent City". Everyone hates traffic, so the video example from the IBM innovation site does more to explain this than I can by writing more on the subject.

Whether you agree with it or not, we are on a direct course to this future, and it is almost impossible to divert. This is a philosophical question, and everyone will have their own opinion about the cost/benefit of chasing optimization. Comments and opinions are welcome; please let us know what you think.

Davis Blair

WOW..The Art of Technology

On a lighter note than usual, our post today is a photo journal of Google's Data Centers, which we found on Mashable.com. These stunning images show some of the infrastructure that powers the Internet, and the incredible job done by Google on its path to becoming one of the world's most powerful companies - all the more impressive considering the internal development and self-sufficiency of the underlying systems. The video is equally impressive.

Google Evolution from Stanford Lab to Global Leader

 



Inside Google's Data Centers from Techaisle on Vimeo.
Dr. Cooram Ramacharlu Sridhar

Predictive Analytics – The Predicament

Predictive Analytics (PA) is emerging as an important tool in the area of business decision-making. Predictive Analytics primarily deals with making forecasts based on several inputs. In this and the blogs that follow, I will share my experiences with Predictive Modelling (PM), with a view to contributing to the current knowledge base in the Predictive Analytics world.

In the world of business, most predictive analytical tools are quantitative: numeric data is used to build an input-output model, and the output is the prediction for specific inputs. For example, "a 10% increase in advertising in January will result in a 1% increase in sales in May" is a typical output from predictive analytics.
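That advertising-to-sales statement is just a lagged linear input-output relationship. A toy sketch, where the 0.1 elasticity and the four-month lag are assumed figures for illustration, not research findings:

```python
# Toy illustration of the input-output form described above: a lagged
# elasticity model where a change in advertising in month t affects sales
# in month t+4. The 0.1 elasticity (10% ad increase -> 1% sales lift) is
# an assumed, illustrative figure.

def predicted_sales_lift(ad_change_pct, elasticity=0.1):
    """Percent sales lift four months later, given percent change in ad spend."""
    return ad_change_pct * elasticity

print(predicted_sales_lift(10))  # 10% more advertising in January
# -> 1.0 (predicted +1% sales in May)
```

Real models estimate the elasticity and the lag from historical data rather than assuming them, but the input-output shape is the same.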

Common Mistake of Predictive Modellers: The Assumption of Linearity

Predictive models are largely based on statistical techniques. The Multiple Linear Regression (MLR) model is what most users will confront when they look at predictive models. This model works in the background whether one is using multiple time series or multi-level modelling.

Multiple Linear Regression Models are developed based on a crucial assumption: the output is linearly dependent on the inputs. But all experience shows that in most business situations the assumption of linearity is not valid. Hence the statistical models have a poor fit and low predictive capability. In addition, the business world also suffers from Black Swan problems that no modelling can manage with any level of confidence.

The net effect of the linearity assumption, which is ubiquitous in almost all statistical modelling, and the resultant poor fit and low predictive capability, has been a frustrated user community. Hence, business executives look at models with suspicion and trust their 'gut' to make decisions.
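The poor-fit problem is easy to demonstrate. The sketch below, in pure Python with synthetic data chosen purely for illustration, fits a straight line by ordinary least squares to a diminishing-returns (nonlinear) response and computes R squared, which comes out well below 1 even though the data contain no noise at all:

```python
# Sketch: ordinary least squares straight-line fit to nonlinear data.
# The data are synthetic (a saturating response curve), for illustration only.
import math

x = list(range(1, 11))
y = [100 * (1 - math.exp(-0.5 * xi)) for xi in x]  # diminishing-returns curve

# Closed-form OLS for slope and intercept
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

# R squared: share of variance explained by the straight line
ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - my) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))  # noticeably below 1: the line misses the curvature
```

The residuals are not random noise here; they are systematic, which is exactly the signature of a structurally wrong (linear) model applied to nonlinear behaviour.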

Predicament

The Predictive Modellers' predicament is this: how do we get away from the linearity assumption on which almost all statistical tools are based, when that assumption is known to be a poor, in fact a very poor, approximation of real-world behaviour?

The story of our approach to modelling starts from this predicament, which we have shared with everyone else in the field, and the path we cut to get out of it.

Dr. Cooram Ramacharlu Sridhar (Doc)
Managing Director and Advisor, Segmentation & Predictive Modeling

Dr. Cooram Ramacharlu Sridhar

Predictive Modelling – Watch out for land mines

Before you attempt any modelling, you should first look at the inputs and outputs that you want to go into your model. Here is the matrix:

[Figure: a 2x2 matrix classifying input variables along two axes, Measurable vs. Not Measurable and Controllable vs. Not Controllable, giving the four cells discussed below.]


What you need to do is make a laundry list of the variables (inputs) that affect the output. Typically, in a marketing company one would look at sales as the output and a whole lot of variables as inputs. Let me look at a few examples for these cells.

1. Measurable-Controllable

GRPs of your brand through TV advertising are measurable and controllable.

2. Measurable-Not-Controllable

Inflation is measurable but not controllable.

3. Not-Measurable-Not-Controllable

The amount of investment your competition makes in dealer incentives is neither easy to measure accurately nor within your control, but it impacts the sales of your brand.

4. Not-Measurable-Controllable

"Not measurable" generally refers to qualitative factors, which are quite often measured by a pseudo variable, for example the quality of your salespeople.

In your business environment, if the majority of your input variables fall in Cells 1 and 2, and you feel that these make a big impact, then modelling will be successful. If instead many variables fall in Cells 3 and 4, modelling will not be a success.
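This preliminary exercise is easy to mechanize. Below is a hypothetical sketch, where the variable list and the 60% threshold are illustrative assumptions rather than a recommended rule:

```python
# Hypothetical sketch of the preliminary exercise described above: classify
# each input variable into the 2x2 matrix and check whether enough of them
# are measurable (Cells 1 and 2) to justify building a model. Variable
# names and the 60% threshold are illustrative assumptions.

def modelling_feasible(variables, threshold=0.6):
    """variables: list of (name, measurable, controllable) tuples.
    Returns (feasible, share_of_measurable_variables)."""
    measurable = [v for v in variables if v[1]]
    share = len(measurable) / len(variables)
    return share >= threshold, share

inputs = [
    ("brand_tv_grps",         True,  True),   # Cell 1: measurable, controllable
    ("inflation",             True,  False),  # Cell 2: measurable, not controllable
    ("competitor_incentives", False, False),  # Cell 3: neither
    ("salesforce_quality",    False, True),   # Cell 4: controllable, not measurable
    ("own_price",             True,  True),   # Cell 1
]

ok, share = modelling_feasible(inputs)
print(ok, share)
# -> True 0.6
```

The point is not the threshold itself but that the classification forces the conversation about data availability to happen before modelling starts, not after.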

Most companies do not undertake this simple preliminary exercise of classifying the variables that impact their business, and then hit potholes throughout design, testing and implementation.

Unclassified variables are veritable landmines. Watch out for them.

Dr. Cooram Ramacharlu Sridhar (Doc)
Techaisle
