
Techaisle Blog

Insightful research, flexible data, and deep analysis by a global SMB IT Market Research and Industry Analyst organization dedicated to tracking the Future of SMBs and Channels.
Anurag Agrawal

Amazon's Role in an Emerging Cloud Service: Analytics-as-a-Service (no acronym allowed)

Many organizations are starting to think about "analytics-as-a-service" (no acronym allowed) as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we increasingly talk about leveraging private or public cloud computing to build an analytics-as-a-service model.


The strategic goal is to harness data to drive insights and better decisions faster than the competition, as a core competency. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.


Analytics is moving out of the IT function and into the business: marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from 6 weeks to 6 months to procure and configure servers, and then another several months to load, configure and test software. That is not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud-based analytics alternative is gaining traction with business users.


The "analytics-as-a-service" operating model that businesses are thinking about is already being facilitated by Amazon, Opera Solutions, eBay and others like LiquidHub. They anticipate value migrating from traditional, outmoded BI to an analytics-as-a-service model. We believe that Amazon's analytics-as-a-service model provides a directional and aspirational target for IT organizations that want to build an on-premise equivalent.

 

Situation/Problem Summary: The Challenges of Departmental or Functional Analytics


The dominant design of analytics today is static, dependent on specific questions or dimensions. With the need for predictive, analytics-driven business insights growing at ever-increasing speed, it's clear that current departmental stove-pipe implementations are unable to meet the demands of the increasingly complex KPIs, metrics and dashboards that will define the coming generation of Enterprise Performance Management. The fact that this capability will also be available to SMBs follows the trend of embedded BI and dashboards that is already sweeping the market as an integral part of SaaS applications. As we have written in the past, true mobile BI can be provided as application "bolt-ons" that work in conjunction with existing enterprise applications, or as pure-play BI applications developed from scratch to take advantage of new technologies like HTML5. Generally, large companies do the former through acquisition and integration of existing technology, and rely on start-ups for the latter. Whether at the departmental or enterprise level, the requirements to hold down costs, minimize complexity and increase access and usability are pretty much universal, especially for SMBs, who are quickly moving away from on-premise equipment, software and services.


After years of cost cutting, organizations are looking for top-line growth again. They are finding that, with the proliferation of front-end analytics tools and back-end BI tools, platforms and data marts, the burden of managing, maintaining and developing the "raw data to insights" value chain is growing in cost and complexity; a balance that brings SaaS and on-premise benefits together is needed.


The perennial challenge of a good BI deployment remains: it is becoming increasingly necessary to bring disparate platforms, tools and information into a more centralized yet flexible analytical architecture. Add to this the growth in volume of Big Data across all company types, and the challenges accelerate.


Centralization of analytics infrastructure conflicts with the business requirements of time-to-impact, high quality and rate of user adoption; time can be more important than money if the application is strategic. Line-of-business teams need usable, adaptable, flexible and constantly changing insights to keep up with customers. The front-line teams care about revenue, alignment with customers and sales opportunities. So how do you bridge the two worlds and deliver the ultimate flexibility at the lowest possible cost of ownership?


The solution is Analytics-as-a-Service.

 

Emerging Operating Model:  Analytics-as-a-Service


It’s clear that sophisticated firms are moving along a trajectory of consolidating their departmental platforms into general purpose analytical platforms (either inside or outside the firewall) and then packaging them into a shared services utility.


This model is about providing a cloud computing model for analytics to anyone within, or even outside, an organization. Fundamental building blocks (or enablers) such as information security, data integrity, data and storage management, and iPad and mobile capabilities, which are critical, don't have to be designed, developed and tested again and again. More complex enablers like operations research, data mining, machine learning and statistical models can also be offered as services.


Enterprise architects are migrating to “analytics-as-a-service” because they want to address three core challenges – size, speed, type – in every organization:

    • The vast amount of data that needs to be processed to produce accurate and actionable results

    • The speed at which one needs to analyze data to produce results

    • The type of data being analyzed: structured versus unstructured



The real value of this service bureau model lies in achieving economies of scale and scope: the more virtual analytical apps one deploys, the better the overall scalability and the higher the cost savings. With growing data volumes and dozens of virtual analytical apps, chances are that more and more of them will use processing capacity at different times, with different usage patterns and frequencies, which is one of the main selling points of service pooling in the first place.

 

Amazon Analytics-as-a-Service in the Cloud


Amazon.com is becoming a market leader in supporting the analytics-as-a-service concept. It is attacking this as a cloud-enabled business-model innovation opportunity rather than an incremental BI extension. This is a great example of value migrating from outmoded methods to new architectural patterns that are better able to satisfy business priorities.


Amazon is aiming at firms that deal with lots and lots of data and need elastic, flexible infrastructure. This spans domains such as gene sequencing, clickstream analysis, sensors, instrumentation, logs, cyber-security, fraud, geolocation, oil exploration modeling, HR/workforce analytics and others. The challenge is to harness data and derive insights without spending years building complex infrastructure.


Amazon is betting that traditional enterprise "hard-coded" BI infrastructure will be unable to handle the growth in data volume, data structure flexibility and data dimensionality. Also, even if the IT organization wants to evolve from the status quo, it is hamstrung by resource constraints, talent shortages and tight budgets. Predicting infrastructure needs for emerging (and yet-to-be-defined) analytics scenarios is not trivial.


Analytics-as-a-service that supports dynamic requirements requires some serious heavy lifting and complex infrastructure. Enter the AWS cloud. The cloud offers some interesting value: 1) on-demand provisioning; 2) pay-as-you-go pricing; 3) elasticity; 4) programmability; 5) abstraction; and, in many cases, 6) better security.


The core differentiator for Amazon is parallel efficiency: the effectiveness of distributing large workloads over pools and grids of servers, coupled with techniques like MapReduce and Hadoop.


Amazon has analyzed the core requirements for general analytics-as-a-service infrastructure and provides the core building blocks: 1) scalable persistent block storage (Amazon Elastic Block Store); 2) scalable object storage (Amazon S3); 3) elastic on-demand compute (Amazon Elastic Compute Cloud, or EC2); and 4) tools like Amazon Elastic MapReduce. It also offers choice in database images (Amazon RDS, Oracle, MySQL, etc.).
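To make the pay-as-you-go model concrete, here is a minimal sketch of provisioning an Elastic MapReduce cluster from Python with the boto3 SDK. The post does not prescribe any tooling, so the SDK choice, bucket name, region, instance types and release label below are all illustrative assumptions.

```python
# Minimal sketch (not from the original post) of spinning up an EMR cluster
# on demand with boto3. Bucket, region, instance types and release label are
# illustrative assumptions; valid AWS credentials are assumed.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

cluster = emr.run_job_flow(
    Name="analytics-as-a-service-poc",
    ReleaseLabel="emr-6.15.0",                          # assumed release label
    Applications=[{"Name": "Hadoop"}, {"Name": "Spark"}],
    LogUri="s3://example-analytics-bucket/emr-logs/",   # hypothetical S3 bucket
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 10,                            # elastic: size per workload
        "KeepJobFlowAliveWhenNoSteps": False,           # terminate when done
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("started cluster:", cluster["JobFlowId"])
```

Because the cluster terminates when its steps finish, the business pays only for the hours the analysis actually ran, which is exactly the elasticity argument made above.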

 

How does Amazon Analytics-in-the-Cloud work?


Best Buy had a clickstream analysis problem: 3.5 billion records, 71 million unique cookies, and 1.7 million targeted ads required per day. How to make sense of this data? The company used a partner to implement an analytic solution on Amazon Web Services and Elastic MapReduce. The solution was a 100-node cluster spun up on demand; processing time was reduced from more than two days to eight hours.
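The post does not describe Best Buy's actual implementation, but a minimal sketch of the MapReduce pattern that Elastic MapReduce parallelizes across such a cluster looks like this; the clickstream schema and values are made up.

```python
from collections import defaultdict

# Toy clickstream records: (cookie_id, url, timestamp). Hypothetical schema.
clicks = [
    ("cookie-1", "/tv/deals", "2012-05-01T10:00:00"),
    ("cookie-2", "/laptops",  "2012-05-01T10:00:05"),
    ("cookie-1", "/tv/4k",    "2012-05-01T10:01:10"),
]

def map_phase(records):
    # Emit one (key, value) pair per page view; EMR runs this on many nodes.
    for cookie_id, _url, _ts in records:
        yield cookie_id, 1

def shuffle(pairs):
    # Group intermediate pairs by key (the framework does this in real Hadoop).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Aggregate values per key: page views per unique cookie.
    return {key: sum(values) for key, values in grouped.items()}

print(reduce_phase(shuffle(map_phase(clicks))))
# {'cookie-1': 2, 'cookie-2': 1}
```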


Predictive exploration of data, separating "signals from noise," is the base use case. This manifests in different problem spaces such as targeted advertising and clickstream analysis, data warehousing, bioinformatics, financial modeling, file processing, web indexing, data mining and BI. Amazon analytics-as-a-service is well suited to compute-intensive scenarios in financial services such as credit ratings, fraud models, portfolio analysis and VaR calculations.


The ultimate goal for Amazon in analytics-as-a-service is to provide unconstrained tools for unconstrained growth. What is interesting is that an architecture mixing commercial off-the-shelf packages with core Amazon services is also possible.

 

The Power of Amazon’s Analytics-as-a-Service


So what does the future hold? The market in predictive analytics is shifting. It is moving from "data-at-rest" to "data-in-motion" analytics.


The service infrastructure to do "data-in-motion" analytics is complicated to set up and execute. The complexity ranges from the core (e.g., analytics and query optimization), to the practical (e.g., horizontal scaling), to the mundane (e.g., backup and recovery). Doing all of this well while insulating the end user is where Amazon.com will be most dominant.

 

Data in motion analytics


Data-in-motion analytics is the analysis of data before it has come to rest on a hard drive or other storage medium. Given the vast amount of data being collected today, it is often not feasible to store the data before analyzing it. And even when there is space to store the data first, the additional time required to store and then analyze it is unacceptable for some use cases.
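As an illustration only (the post names no specific streaming technology), a data-in-motion analysis keeps a rolling window in memory and emits an insight per event, never waiting for the data to land on disk:

```python
from collections import deque

WINDOW_SECONDS = 60          # analyze only the most recent minute of events

window = deque()             # (timestamp, value) pairs currently in the window
running_total = 0.0

def ingest(timestamp, value):
    """Process one event as it arrives, before it is written to storage."""
    global running_total
    window.append((timestamp, value))
    running_total += value
    # Evict events that have aged out of the window.
    while window and window[0][0] < timestamp - WINDOW_SECONDS:
        _, old = window.popleft()
        running_total -= old
    # Emit an on-the-fly insight: the rolling average over the window.
    return running_total / len(window)

# Simulated stream of sensor readings (hypothetical values).
for t, v in [(0, 10.0), (20, 12.0), (45, 11.0), (90, 15.0)]:
    print(f"t={t:>3}s rolling average = {ingest(t, v):.2f}")
```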

 

Data at rest analytics


Due to the vast amounts of data stored, technology is needed to sift through it, make sense of it, and draw conclusions from it. Much of this data sits in relational or OLAP stores, but an increasing share is not stored in a structured manner. With the explosive growth of unstructured data, technology is required to provide analytics across relational, non-relational, structured and unstructured data sources.


Now, Amazon AWS is not the only show in town attempting to provide analytics-as-a-service. Google BigQuery, a managed data analytics service in the cloud, is aimed at analyzing big data sets: one can run query analysis on five to ten terabytes of data and get a response back quickly, in a matter of seconds, typically ten to twenty. That is pretty useful when you just want a standardized, self-service big data analysis service. How is BigQuery used? Claritic has built an application for game developers to gather real-time insights into gaming behavior. Another firm, Crystalloids, built an application to help a resort network "analyze customer reservations, optimize marketing and maximize revenue." (THINKstrategies' Cloud Analytics Summit in April; Ju-kay Kwek, product manager for Google's cloud platform.)
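For comparison, querying BigQuery from Python looks roughly like this, using the google-cloud-bigquery client library; the project id and clickstream table are hypothetical, and authentication is assumed to be configured in the environment.

```python
from google.cloud import bigquery

# Hypothetical project id; credentials assumed to be configured already.
client = bigquery.Client(project="example-analytics-project")

# Hypothetical events table; BigQuery scans terabytes and answers in seconds.
sql = """
    SELECT url, COUNT(DISTINCT cookie_id) AS unique_visitors
    FROM `example-analytics-project.game_events.clickstream`
    GROUP BY url
    ORDER BY unique_visitors DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row.url, row.unique_visitors)
```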

 

Bottom-line and Takeaways


Analytics is moving from the domain of departments to the enterprise level. As demand for analytics grows rapidly, CIOs and IT organizations are going to be under increasing pressure to deliver. It will be especially interesting to watch how companies that have outsourced and offshored extensively (50+%) to Infosys, TCS, IBM, Wipro, Cognizant, Accenture, HP, CapGemini and others will adapt and leverage their partners to deliver analytics innovation.


At the enterprise level a shared utility model is the right operating model. But given the multiple BI projects already in progress and the vendor stacks already in place (sunk cost and effort), it is going to be extraordinarily difficult for most large corporations to rip and replace. They will instead take a conservative, incremental integrate-and-enhance-what-we-have approach, which will put them at a disadvantage. Users will increasingly complain that IT is not able to deliver what innovators like Amazon Web Services are providing.


Amazon's analytics-as-a-service platform strategy shows exactly where the enterprise analytics marketplace is moving, or needs to go. But most IT groups are going to struggle to follow this trajectory without strong leadership support, experimentation and program management. We expect this enterprise analytics transformation to take a decade to play out (an innovation-to-maturity cycle).



Dr. Cooram Ramacharlu Sridhar

What is the big deal with ANN?

In the thirty years since Shunu Sen posed the marketing-mix problem, I have been busy with marketing research. I have tried modeling most of the studies and discovered that market research data alone is not amenable to statistical predictive modeling. Take, for example, imagery. Is there a correlation between image parameters and purchase-intention scores? There should be, but rarely does one get more than a 0.35 correlation coefficient. Try to link awareness, imagery, intention to buy, product knowledge, brand equity and so on to the performance of the brand in the marketplace, and one discovers land mines, unanswered questions and a lack of actionability.

This is where ANN steps in.

Technically ANN (Artificial Neural Networks) offers a number of advantages that statistical models do not. I will list a few of them.

    1. Non-linear models are a tremendous advantage to a modeler. The real world is non-linear, and any linear model is a huge approximation.

    2. In a statistical model, the model dictates the error and one can do precious little to decrease it. In an ANN one can specify the error tolerance; for example, we can fit a model for 85, 90, 95 or 99% error tolerance. It requires some expertise to figure out whether there is an overfit and what the optimum acceptable error is. (A fitting sketch follows this list.)

    3. Statistical models make distributional assumptions that rarely hold in the real world. ANNs make no distribution assumptions.

    4. Most ANN software available today does not identify the functions that are fitted. We, on the other hand, have been able to identify the functions that are fitted, extract the weights and build them into an algorithm.
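To illustrate the second point, here is a minimal sketch with scikit-learn's MLPRegressor; the post does not say which ANN software we use, so the library, the synthetic data and the tolerance setting are illustrative assumptions only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic, non-linear "marketing response" data: sales as a saturating
# function of ad spend plus noise. Purely illustrative.
ad_spend = rng.uniform(0, 100, size=(500, 1))
sales = 200 * (1 - np.exp(-ad_spend[:, 0] / 30)) + rng.normal(0, 5, 500)

# A small feed-forward ANN; `tol` controls when training stops improving.
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, tol=1e-4,
                     random_state=0)
model.fit(ad_spend, sales)

predicted = model.predict(ad_spend)
print("in-sample R^2:", round(r2_score(sales, predicted), 3))
# Out-of-sample validation on a holdout set is what guards against overfit.
```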



How do we bring the differentiation?

Our biggest strength is in data integration that combines market research and economic data with transaction data into a single file. This is tricky and requires some ingenuity. We use Monte Carlo techniques to build these files and then use ANN to build the simulation models. Optimization then becomes clear and straightforward because we do not use statistical models. Optimization using statistical modeling, which most modelers use, is a nightmare. Most of the large IT vendors, and even analytics companies, continue to use statistical modeling for optimization, and therein lies the problem. Nor are these companies aware of the possibilities that ANN can provide. Most modeling is done using aggregate data, whereas we handle the data at the respondent level. Conventional modeling is macro-data oriented whereas we are micro-data oriented. Hence the possibilities that we can generate with micro data for modeling are huge compared to macro data.
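Our actual integration method is proprietary, but a heavily simplified sketch of the Monte Carlo idea, with made-up purchase rates and survey scores, looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)

# Respondent-level market research file: (respondent_id, intention-to-buy score 1-5).
survey = [(i, int(rng.integers(1, 6))) for i in range(1000)]

# Aggregate transaction data: observed monthly purchase rate by intention score.
# Hypothetical figures used only to drive the simulation.
purchase_rate_by_intention = {1: 0.02, 2: 0.05, 3: 0.12, 4: 0.25, 5: 0.45}

def simulate_month(survey_records, n_draws=500):
    """Monte Carlo draw: convert each respondent's intention score into a
    simulated buy/no-buy outcome, repeated n_draws times."""
    totals = []
    for _ in range(n_draws):
        buys = sum(rng.random() < purchase_rate_by_intention[score]
                   for _, score in survey_records)
        totals.append(buys)
    return np.mean(totals), np.std(totals)

mean_buyers, sd_buyers = simulate_month(survey)
print(f"simulated buyers per 1000 respondents: {mean_buyers:.0f} +/- {sd_buyers:.0f}")
```

The point of the simulation is that every record in the integrated file stays at the respondent level, which is what makes later ANN modeling and optimization tractable.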

We have crossed the stage of theories. There are many projects that we have executed successfully that have gone on to become a must-have analytical marketing input mechanism.

Doc
Techaisle

Dr. Cooram Ramacharlu Sridhar

Brand Market Modeling Solution Delivers 85-95% Accuracy

What is the marketing-mix problem that I keep talking about, which has stayed unsolved for so many years? The problem by itself is simple: how much does my brand sell for different marketing inputs? A regression model needs at least three years of historical data. Most companies either do not have that much data, or the markets have changed in three years and the data is pretty much useless. In addition, regression models cannot accommodate changes in product formulation or advertising campaigns. Further, economic data like GDP, CPI and inflation can never be built into the regression model. Hence all statistical modeling leads you into a cul-de-sac.

So what did I do that was so different? I went back to market research data. Almost all companies run advertising or brand tracking studies. The sample sizes in these tracks vary from 800 to 4,000 every month. The tracks primarily capture exposure to media, awareness of advertising and the brands, brand imagery and intention to buy. Companies set up benchmarks against each parameter and take marketing decisions. For example: if claimed exposure to TV in April is 30% and it was 40% in March, they call the advertising agency and ask them to change the channels. If the imagery is not improving, they change the advertising. These decisions are ad hoc, because they have no clue, a priori, how each of these decisions will play out; the marketing manager's knowledge is always hindsight. So what we did was pick up the brand track data, with 800 to 4,000 respondents, and model it. The basic modules were (a chained sketch follows the list):

    1. Exposure to Awareness Model

    2. Awareness to Image Model

    3. Image to Intention to Buy Model

    4. Intention to Buy to Actual Sales Model


With this approach we solved two problems: the requirement of a long history of the brand's performance, and the irrelevance of historical data to the current scenario. We could take the brand track data from April and model it for May. Nothing can be more recent than that.

Now came the tricky part: how does one link media and other marketing inputs? This was done through a heuristic algorithm. Each respondent in the track was assigned a value for each marketing input, so we had a file with track and input data in each record. After this it was easy to integrate all other data. Once we had all the data together, the independent modules were built using ANN and a dashboard was designed to make life easy for the user.

What were the results? Based on each month's inputs we estimated the sales. We delivered 85-95% forecast accuracy compared to the company's sell-out figures, far exceeding the original mandate of a solution with 75% accuracy.
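As a hypothetical illustration of how such forecast accuracy might be scored (the post does not publish the underlying figures), accuracy can be expressed as 100% minus the mean absolute percentage error:

```python
# Hypothetical monthly forecasts vs. the company's actual sell-out figures.
forecast = [1020, 980, 1150, 1210, 990, 1075]
actual   = [1100, 1000, 1080, 1250, 950, 1120]

# Accuracy expressed as 100% minus the mean absolute percentage error (MAPE).
ape = [abs(f - a) / a for f, a in zip(forecast, actual)]
accuracy = 100 * (1 - sum(ape) / len(ape))
print(f"forecast accuracy: {accuracy:.1f}%")   # lands around 95% for these numbers
```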

How does this help the brand managers? Here is a list.

    1. Understanding the brand characteristics using the tabulator cuts down the time of extracting information from different sources by 80%.

    2. Simulate the brand's response to variations in brand inputs to find the optimum response.

    3. If there is competitive activity in the market, it is possible to check what investments should be made to reduce the effect of those activities on one's own brand.

    4. Use the simulator during brand and advertising reviews.


Our Brand Modeling solution is a huge help in planning a brand effectively to meet its targets. With our unique Brand Modeling Solution, a 30-year-old problem that brand marketers have been grappling with has been put to rest. We have a successful solution.

Dr. Cooram Ramacharlu Sridhar (Doc)
Techaisle

Dr. Cooram Ramacharlu Sridhar

Predictive Modelling – Watch out for land mines

Before you attempt any modelling, you should first look at the inputs and outputs that you want to go into your model. Here is the matrix:

[Figure: a 2x2 matrix classifying input variables as Measurable or Not Measurable, and Controllable or Not Controllable]


What you need to do is make a laundry list of the variables (inputs) that affect the output. Typically, in a marketing company one would look at sales as the output and a whole lot of variables as inputs. Let me look at a few examples for these cells.

1. Measurable-Controllable

GRPs of your brand through TV advertising are measurable and controllable.

2. Measurable-Not Controllable

Inflation is measurable but not controllable.

3. Not Measurable-Not Controllable

The amount your competition invests in dealer incentives is neither easy to measure accurately nor within your control, yet it impacts the sales of your brand.

4. Not Measurable-Controllable

"Not measurable" generally refers to qualitative factors, which are quite often measured through a pseudo variable, for example the quality of your salespeople.

In your business environment, if the majority of your input variables are in Cells 1 and 2, and you feel that these make a big impact, then modelling will be successful. If not, and many variables fall in Cells 3 and 4, modelling will not be a success. (A short classification sketch follows.)
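Here is a short sketch of that preliminary exercise, with illustrative variables and a simple go/no-go rule in the spirit of the post; the variable names and cell assignments are assumptions for the example.

```python
# Classify each candidate input variable into the 2x2 matrix from the post.
# The example variables and cell assignments are illustrative only.
variables = {
    "TV GRPs of our brand":         {"measurable": True,  "controllable": True},   # Cell 1
    "Inflation":                    {"measurable": True,  "controllable": False},  # Cell 2
    "Competitor dealer incentives": {"measurable": False, "controllable": False},  # Cell 3
    "Salesperson quality":          {"measurable": False, "controllable": True},   # Cell 4
}

def cell(v):
    if v["measurable"] and v["controllable"]:
        return 1          # Measurable-Controllable
    if v["measurable"]:
        return 2          # Measurable-Not Controllable
    if not v["controllable"]:
        return 3          # Not Measurable-Not Controllable
    return 4              # Not Measurable-Controllable

cells = [cell(v) for v in variables.values()]
share_modelable = sum(c in (1, 2) for c in cells) / len(cells)

# Rough go/no-go check: a majority of inputs in Cells 1 and 2 favors modelling.
print("share of inputs in Cells 1-2:", share_modelable)
print("modelling likely to succeed" if share_modelable > 0.5 else "watch out for land mines")
```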

Most companies do not undertake this simple preliminary exercise of classifying the variables that impact their business, and then hit potholes throughout design, testing and implementation.

Unclassified variables are veritable landmines. Watch out for them.

Dr. Cooram Ramacharlu Sridhar (Doc)
Techaisle
