
Techaisle Blog

Insightful research, flexible data, and deep analysis by a global SMB IT Market Research and Industry Analyst organization dedicated to tracking the Future of SMBs and Channels.
Dr. Cooram Ramacharlu Sridhar

What is the big deal with ANN?

In the thirty years since Shunu Sen posed the marketing-mix problem, I have been busy with marketing research. I tried modeling most of the studies and discovered that market research data alone is not amenable to statistical predictive modeling. Take imagery, for example. Is there a correlation between image parameters and purchase-intention scores? There should be. But one rarely gets a correlation coefficient above 0.35. Try to link awareness, imagery, intention to buy, product knowledge, brand equity, and so on to the brand's performance in the marketplace, and one discovers land mines, unanswered questions, and findings that cannot be acted upon.

This is where ANN steps in.

Technically, ANNs (Artificial Neural Networks) offer a number of advantages that statistical models do not. I will list a few of them.

    1. Non-linear models are a tremendous advantage to a modeler. The real world is non-linear, and any linear model is a huge approximation.

    2. In a statistical model, the model dictates the error, and one can do precious little to decrease it. With an ANN, one can specify the error tolerance; for example, we can fit a model to an 85, 90, 95 or 99% accuracy target. It takes some expertise to figure out whether there is an overfit and what the optimum acceptable error is.

    3. Statistical models make assumptions about distributions that do not hold in the real world. ANNs make no distributional assumptions.

    4. Most ANN software available today does not identify the functions that are fitted. We, on the other hand, have been able to identify the fitted functions, extract the weights, and build them into an algorithm.
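To make the non-linearity point concrete, here is a minimal toy sketch (our own illustration, not Techaisle's production code or data): a linear least-squares fit and a small hand-rolled one-hidden-layer network are trained on the same non-linear response curve, and their errors compared.

```python
import numpy as np

# Synthetic non-linear "market response": y = sin(x) plus noise.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# (a) Linear model y ~ a*x + b, solved in closed form.
X = np.hstack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
lin_mse = float(np.mean((X @ coef - y) ** 2))

# (b) One-hidden-layer tanh network, trained by plain gradient descent.
H, lr = 16, 0.1
W1 = rng.standard_normal((1, H)) * 0.5; b1 = np.zeros(H)
W2 = rng.standard_normal((H, 1)) * 0.5; b2 = np.zeros(1)
for _ in range(4000):
    h = np.tanh(x @ W1 + b1)            # forward pass
    err = (h @ W2 + b2) - y             # gradient of (1/2)*MSE w.r.t. output
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

nn_mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"linear MSE={lin_mse:.3f}  ANN MSE={nn_mse:.3f}")
```

The linear model cannot follow the curve no matter how it is tuned, while even this tiny network tracks it closely; that gap is the point of advantage 1.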



How do we bring the differentiation?

Our biggest strength is data integration that combines market research and economic data with transaction data into a single file. This is tricky and requires some ingenuity. We use Monte Carlo techniques to build these files and then use ANN to build the simulation models. Optimization then becomes clear and straightforward, since we do not use statistical models. Optimization using statistical modeling, which most modelers use, is a nightmare. Most of the large IT vendors, and even analytics companies, continue to use statistical modeling for optimization. And therein lies the problem. Nor are these companies aware of the possibilities that ANN can provide. Most modeling is done using aggregate data, whereas we handle the data at the respondent level. Conventional modeling is macro-data oriented, whereas we are micro-data oriented. Hence the possibilities we can generate with micro data for modeling are huge compared to macro data.
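The Monte Carlo integration idea can be sketched roughly as follows. This is a hypothetical illustration only: the segment names, fields, and distributions are invented, and the actual Techaisle method is not public.

```python
import numpy as np

# Survey respondents and transaction records share no common key, so for each
# respondent we impute a transaction value by Monte Carlo sampling from the
# empirical spend distribution of that respondent's segment.
rng = np.random.default_rng(42)

# Respondent-level survey data: (respondent_id, segment, intention_to_buy 1-5).
survey = [(i, rng.choice(["SMB", "midmarket"]), int(rng.integers(1, 6)))
          for i in range(1000)]

# Transaction data by segment: observed monthly spend values (synthetic).
transactions = {
    "SMB":       rng.lognormal(mean=6.0, sigma=0.5, size=5000),
    "midmarket": rng.lognormal(mean=7.5, sigma=0.4, size=5000),
}

# Monte Carlo fusion: draw a spend value for each respondent from the
# distribution of that respondent's segment.
fused = [(rid, seg, intent, float(rng.choice(transactions[seg])))
         for rid, seg, intent in survey]

# Each fused record now carries survey and transaction variables together,
# ready for a respondent-level (micro) model.
print(fused[0])
```

The fused file keeps the analysis at the respondent level, which is what makes micro-data modeling possible downstream.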

We have crossed the stage of theories. There are many projects that we have executed successfully that have gone on to become a must-have analytical marketing input mechanism.

Doc
Techaisle

Davis Blair

WOW..The Art of Technology

On a lighter note than usual, our post today is a photo journal of Google's data centers, which we found on Mashable.com. These stunning images show some of the infrastructure that powers the Internet, and the incredible job done by Google on its path to becoming one of the world's most powerful companies - all the more impressive considering the internal development and self-sufficiency of the underlying systems. The video is equally impressive.

Google Evolution from Stanford Lab to Global Leader

Inside Google's Data Centers from Techaisle on Vimeo.


Dr. Cooram Ramacharlu Sridhar

Brand Market Modeling Solution Delivers 85-95% Accuracy

What is the marketing-mix problem that I keep talking about, the one that has stayed unsolved for so many years? The problem itself is simple: how much does my brand sell for different marketing inputs? A regression model needs at least three years of historical data. Most companies either do not have that much data, or the markets have changed over three years and the data is pretty much useless. In addition, regression models cannot accommodate changes in product formulation or advertising campaigns. Further, economic data like GDP, CPI and inflation can never be built into the regression model. Hence all statistical modeling leads you into a cul-de-sac.

So what did I do that was so different? I went back to market research data. Almost all companies do advertising or brand tracking. The sample sizes in these tracks vary from 800 to 4,000, every month. The tracks capture primarily exposure to media, awareness of the advertising and the brands, brand imagery, and intention to buy. Companies set up benchmarks against each parameter and take marketing decisions. For example: if claimed exposure to TV in April is 30% and it was 40% in March, they call the advertising agency and ask them to change the channels. If the imagery is not improving, they change the advertising. These decisions are ad hoc, because they have no a priori idea of how each decision will play out; the marketing manager's knowledge is always hindsight. So what we did was pick up the brand track data, with 800 to 4,000 respondents, and model it. The basic modules were:

    1. Exposure to Awareness model

    2. Awareness to Image model

    3. Image to Intention-to-Buy model

    4. Intention-to-Buy to Actual Sales model
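The four modules chain together, each stage's output feeding the next. Purely as an illustration (synthetic data, and simple linear stages standing in for the ANNs actually used), the chain can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_stage(x, y):
    """Fit one module as y ~ a*x + b; in practice each stage is an ANN."""
    a, b = np.polyfit(x, y, 1)
    return lambda v: a * v + b

# Synthetic one-month track data (fractions of respondents at each stage).
exposure  = rng.uniform(0.2, 0.8, 500)
awareness = 0.9 * exposure + 0.05 + rng.normal(0, 0.03, 500)
image     = 0.7 * awareness + 0.10 + rng.normal(0, 0.03, 500)
intention = 0.6 * image + rng.normal(0, 0.03, 500)
sales     = 1000 * intention + rng.normal(0, 10, 500)   # units sold

# Fit the four modules from this month's track.
m1 = fit_stage(exposure, awareness)
m2 = fit_stage(awareness, image)
m3 = fit_stage(image, intention)
m4 = fit_stage(intention, sales)

# Simulate next month: what if media buys push exposure to 60%?
predicted_sales = m4(m3(m2(m1(0.60))))
print(round(float(predicted_sales), 1))
```

Because every module is refitted from the latest month's track, the forecast never depends on a long brand history.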



With this approach we had solved two problems: the requirement of a long history of the brand's performance, and the irrelevance of historical data to the current scenario. We could take the brand track data from April and model it for May. Nothing can be more recent than that.

Now came the tricky part: how does one link media and other marketing inputs? This was done through a heuristic algorithm. Each respondent in the track was assigned a value for each marketing input, giving us a file with track and input variables in each record. After this it was easy to integrate all other data. Once we had all the data together, the independent modules were built using ANN, and a dashboard was designed to make life easy for the user.
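One way such a linking heuristic could look is sketched below. The field names, the spend figures, and the exposure-weighted apportioning rule are all illustrative assumptions, not the actual algorithm.

```python
# Month's media spend by channel, in $M (hypothetical figures).
media_spend = {"TV": 40.0, "print": 15.0, "digital": 25.0}

def assign_inputs(respondent):
    """Heuristic: apportion each channel's spend to a respondent according to
    that respondent's claimed exposure, so every track record also carries
    marketing-input columns."""
    weights = respondent["claimed_exposure"]          # e.g. {"TV": 0.8, ...}
    total = sum(weights.values()) or 1.0
    return {ch: media_spend[ch] * w / total for ch, w in weights.items()}

# One track respondent with claimed-exposure scores from the survey.
respondent = {"id": 17, "claimed_exposure": {"TV": 0.8, "print": 0.2, "digital": 0.5}}
row = {**respondent, **assign_inputs(respondent)}     # track + inputs in one record
print(row["TV"])
```

Once every record carries both track variables and input values, the rest of the data integration is straightforward.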

What were the results? Based on each month's inputs we estimated the sales. We delivered 85-95% forecast accuracy compared to the company's sales-out figures, far exceeding the original mandate of a solution with 75% accuracy.

How does this help the brand managers? Here is a list.

    1. Understanding the brand's characteristics using the tabulator cuts the time spent extracting information from different sources by 80%.

    2. Simulating the brand's response to variations in inputs to find the optimum response.

    3. When there is competitive activity in the market, checking what investments to make to reduce its effect on one's own brand.

    4. Using the simulator during brand and advertising reviews.



Our Brand Modeling solution is a huge help in planning a brand effectively to meet its targets. With our unique Brand Modeling Solution, a 30-year-old problem that brand marketers have been grappling with was put to rest. We have a successful solution.

Dr. Cooram Ramacharlu Sridhar (Doc)
Techaisle

Davis Blair

Meet the New Boss: Big Data (WSJ) – Techaisle Take

Wall Street Journal Article

This is an interesting article from the WSJ about how we are slowly allowing decision-making processes to move away from people and into the hands of algorithms. It caught our attention at a time when we are completing survey work for a Business Intelligence report. As discussed in an earlier post, one of the key trends in BI is how deeply it is being embedded into all kinds of applications, and this article is a good example. Please let us know what you think: comment, like, tweet or forward.

Laying the Foundation


Analytic software has evolved through several generations over the last 70 years, from around WWII, when a series of unprecedented number-crunching challenges gave rise to Decision Support Systems (DSS) designed to solve problems such as the best equipment production mix given material constraints, how to logistically support the Allied invasion of Europe, split the atom, or break the Japanese code. These kinds of problems tended to be monolithic, using stochastic models to find the best answer, and were a major catalyst for the development of the mainframe computer.

Business Intelligence (BI) replaced this linear and iterative approach with one that supported solving business problems, mostly operational, within divisions and departments of large commercial organizations, using more distributed equipment and reaching a wider audience, i.e., Finance, Operations and Distribution. In the late 1990s there was an explosion of data resulting from the widespread adoption of CRM, the killer app of the client/server era, adding mountains of sales and marketing data to the volumes of operational information. There was a growing need for a top-down view of how performance in one area of the organization was impacting the others - to begin taking a more structured approach to understanding cause and effect, setting objectives and consistently measuring performance to improve results. BI was evolving into Enterprise Performance Management (EPM) - which is where market leaders are today.

EPM is characterized by using Business Intelligence software to understand the best performance scenarios, measure actual performance indicators (KPIs) and determine how to close the gaps, using exception reporting for most front office functions (CRM/SFA) and rules-based processing for the back office (Process Manufacturing/Real Time Bidding, SCM/Advanced Web Analytics).

Optimization Nation


Equally important as the individual BI technology advances are some of the underlying rules that have accompanied the evolution: Moore's Law, Metcalfe's Law and the Law of Accelerating Returns all drove exponential growth in production, adoption and utility. Over a 20-year period, these have resulted in a slow-motion Black Swan event built on the cumulative effect of technology investments, with huge impacts on our society, including but not limited to the following optimization activities:

Economy – development of consumer mortgage products designed to optimize sales volume regardless of risk, bundling them into bonds to optimize profit on the debt, creation of derivatives to optimize transactions and create demand for increasingly suspect debt, development of new financial instruments with no underlying value, such as synthetic derivatives that truly have nothing but conceptual paper profits behind them, and so on. By 2008 these financial instruments had optimized leverage to create risk greater than the combined GDP of the industrialized world.

Employment – The WSJ article goes into depth about how algorithms have already replaced a hiring manager's decisions, based on probabilities of how an employee might behave under certain circumstances. Employer choices have also been optimized by a flattening of the market caused by oceans of virtually unlimited supply from sites like Monster.com, 100K Jobs, Dice, etc. Middle management has been optimized out of the organizational chart and replaced with productivity tools, more efficient communications and a lower ratio of managers to workers. The actual number of staff required to hit the bottom line has been optimized, even as CEO salaries have been optimized. Looking a little further down the line, Andrew McAfee's POV is deep on this subject, and more technical than mine.

Industry – We all know that manufacturing was moved offshore en masse over the past three decades to optimize production costs, but several other industry segments have been optimized as well, including Retail, which has optimized through consolidation, and Healthcare, which has optimized revenue per patient. Retail has been optimized at a structural level to provide one-stop shopping for almost everything you need in a single location, while volume has been optimized to produce the absolute lowest price at any cost, including optimizing the number of worker hours to keep an optimal ratio of full-time to part-time employees and save the resulting benefit costs. It has also optimized the number and variety of retail outlets and small businesses required to serve a given population across its square miles. Healthcare prices have been optimized to take advantage of tax structures, potential lawsuits, health insurance gaps, maximum federal matching funds, Board and C-Suite compensation, pharma industry profits, and many more.

Government – Automation has also enabled a profitable business model that optimizes the use of Federal Government funds and ensures that every available dollar is spent, whether it is to make sure everybody gets a state-of-the-art mobile wheelchair, their 120 monthly catheters, a speaking glucose meter, maximum disability benefits, etc.  “Don’t worry - we’ll handle all the paperwork.”

Set it and Forget it


The imminent next generation of analytics involves truly "optimized" complex systems with human intervention largely removed from the process. Not to single out Wall Street, but it offers one of the best examples of the unbridled application of technology in the singular pursuit of optimization - in its case, profit for itself and its shareholders. The Financial Services industry has invested billions in technology and employed thousands of physicists and Ph.D.-level mathematicians to achieve a couple-millisecond transaction advantage, and has programmed algorithms to use that advantage and change the rules (i.e., "share price represents perfect information" is no longer true). This has not always produced predictable results, and the ghost in the machine has put us back on the precipice more than once, as seen in this TED video by Kevin Slavin. As we move into a brave new world that combines optimization software with networks that operate too fast for human intervention, more of our lives will be controlled by how rules are programmed into the system than by what we do as individuals to impact the results. One of the best examples of where this is heading is IBM's Smarter Cities initiative, which combines intelligent networks that manage Traffic, Water and Electric Utilities, Transportation, Healthcare, Public Safety, Education and others into an overall "Intelligent City". Everyone hates traffic, so the video example from the IBM innovation site does more to explain this than I can by writing more on the subject.

Whether you agree with it or not, we are on a direct course to this future that is almost impossible to divert. This is a philosophical question and everyone will have their own opinion about the cost/benefit of chasing optimization. Comments and Opinions are welcome, please let us know what you think.

Research You Can Rely On | Analysis You Can Act Upon
