
Techaisle Blog

Insightful research, flexible data, and deep analysis by a global SMB IT Market Research and Industry Analyst organization dedicated to tracking the Future of SMBs and Channels.
Anurag Agrawal

Amazon's Role in Emerging Cloud Service: Analytics-as-a-Service (no acronym allowed)

Many organizations are starting to think about “analytics-as-a-service” (no acronym allowed) as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we are increasingly talking about leveraging private or public cloud computing to build an analytics-as-a-service model.


The strategic goal is to harness data, as a core competency, to drive insights and better decisions faster than the competition. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.


Analytics is moving out of the IT function and into the business — marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from six weeks to six months to procure and configure servers, and then several more months to load, configure and test software. That is not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud-based analytics alternative is gaining traction with business users.


The “analytics-as-a-service” operating model that businesses are thinking about is already being facilitated by Amazon, Opera Solutions, eBay and others like LiquidHub. They are anticipating value migrating from traditional, outmoded BI to an analytics-as-a-service model. We believe that Amazon’s analytics-as-a-service model provides a directional and aspirational target for IT organizations that want to build an on-premise equivalent.

 

Situation/Problem Summary: The Challenges of Departmental or Functional Analytics


The dominant design of analytics today is static, dependent on specific questions or dimensions. With the need for predictive, analytics-driven business insights growing at ever increasing speed, it is clear that current departmental stove-pipe implementations are unable to meet the demands of the increasingly complex KPIs, metrics and dashboards that will define the coming generation of Enterprise Performance Management. The fact that this capability will also be available to SMBs follows the trend of embedded BI and dashboards that is already sweeping the market as an integral part of SaaS applications.

As we have written in the past, true mobile BI can be provided either as application "bolt-ons" that work in conjunction with existing enterprise applications, or as pure-play BI applications developed from scratch to take advantage of new technologies like HTML5. Generally, large companies do the former by acquiring and integrating existing technology, while start-ups do the latter. Whether at the departmental or enterprise level, the requirements to hold down costs, minimize complexity and increase access and usability are pretty much universal, especially for SMBs, who are quickly moving away from on-premise equipment, software and services.


After years of cost cutting, organizations are looking for top-line growth again, and finding that with the proliferation of front-end analytics tools and back-end BI tools, platforms and data marts, the burden of managing, maintaining and developing the “raw data to insights” value chain is growing in cost and complexity. A balance that brings SaaS and on-premise benefits together is needed.


The perennial challenge of a good BI deployment remains: it is becoming increasingly necessary to bring the disparate platforms/tools/information into a more centralized but flexible analytical architecture. Add to this the growth in volume of Big Data across all company types and the challenges accelerate.


Centralization of analytics infrastructure conflicts with the business requirements of time-to-impact, quality and user adoption — time can be more important than money if the application is strategic. Line-of-business teams need usable, adaptable, flexible and constantly changing insights to keep up with customers. Front-line teams care about revenue, alignment with customers and sales opportunities. So how do you bridge the two worlds and deliver the ultimate flexibility with the lowest possible cost of ownership?


The solution is Analytics-as-a-Service.

 

Emerging Operating Model:  Analytics-as-a-Service


It’s clear that sophisticated firms are moving along a trajectory of consolidating their departmental platforms into general purpose analytical platforms (either inside or outside the firewall) and then packaging them into a shared services utility.


This model is about providing a cloud computing model for analytics to anyone within, or even outside, an organization. Fundamental but critical building blocks (or enablers) — information security, data integrity, data and storage management, iPad and mobile capabilities, and others — do not have to be designed, developed and tested again and again. More complex enablers such as operations research, data mining, machine learning and statistical models are also treated as services.


Enterprise architects are migrating to “analytics-as-a-service” because they want to address three core challenges – size, speed, type – in every organization:

    • The vast amount of data that needs to be processed to produce accurate and actionable results
    • The speed at which one needs to analyze data to produce results
    • The type of data that one analyzes — structured versus unstructured



The real value of this service bureau model lies in achieving economies of scale and scope: the more virtual analytical apps one deploys, the better the overall scalability and the higher the cost savings. With growing data volumes and dozens of virtual analytical apps, chances are that more and more of them will spread their processing across different times, usage patterns and frequencies — one of the main selling points of service pooling in the first place.

 

Amazon Analytics-as-a-Service in the Cloud


Amazon.com is becoming a market leader in supporting the analytics-as-a-service concept. It is attacking this as a cloud-enabled business-model innovation opportunity rather than an incremental BI extension. This is a great example of value migrating from outmoded methods to new architectural patterns that are better able to satisfy business priorities.


Amazon is aiming at firms that deal with lots and lots of data and need elastic/flexible infrastructure.  This can be domain areas like Gene Sequencing, Clickstream analysis, Sensors, Instrumentation, Logs, Cyber-Security, Fraud, Geolocation, Oil Exploration modeling, HR/workforce analytics and others. The challenge is to harness data and derive insights without spending years building complex infrastructure.


Amazon is betting that traditional enterprise “hard-coded” BI infrastructure will be unable to handle the growth in data volume, the need for data structure flexibility and the issues of data dimensionality. And even if the IT organization wants to evolve from the status quo, it is hamstrung by resource constraints, talent shortages and tight budgets. Predicting infrastructure needs for emerging (and yet-to-be-defined) analytics scenarios is not trivial.


Analytics-as-a-service that supports dynamic requirements requires some serious heavy lifting and complex infrastructure. Enter the AWS cloud. The cloud offers some interesting value: 1) on demand; 2) pay-as-you-go; 3) elastic; 4) programmable; 5) abstraction; and, in many cases, 6) better security.


The core differentiator for Amazon is parallel efficiency - the effectiveness of distributing large amounts of workload over pools and grids of servers coupled with techniques like MapReduce and Hadoop.
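
To make that map/reduce pattern concrete, here is a minimal, purely local Python sketch. The toy clickstream, the notion of counting clicks per page, and the two-worker pool are invented for illustration; a real deployment would run the same map and reduce functions across many machines via Hadoop or Amazon Elastic MapReduce.

    # Minimal local illustration of the map/shuffle/reduce pattern.
    from collections import defaultdict
    from multiprocessing import Pool

    def map_clicks(chunk):
        """Map step: emit (page, 1) pairs for each click event in a chunk."""
        return [(event["page"], 1) for event in chunk]

    def reduce_counts(all_pairs):
        """Reduce step: sum the counts emitted for each page."""
        totals = defaultdict(int)
        for pairs in all_pairs:
            for page, count in pairs:
                totals[page] += count
        return dict(totals)

    if __name__ == "__main__":
        # Toy clickstream split into chunks, one per worker.
        chunks = [
            [{"page": "/home"}, {"page": "/deals"}],
            [{"page": "/home"}, {"page": "/checkout"}],
        ]
        with Pool(processes=2) as pool:            # workload distributed over a pool
            mapped = pool.map(map_clicks, chunks)  # parallel map phase
        print(reduce_counts(mapped))               # {'/home': 2, '/deals': 1, '/checkout': 1}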


Amazon has analyzed the core requirements for general analytics-as-a-service infrastructure and is providing core building blocks that include 1) scalable persistent storage such as Amazon Elastic Block Store; 2) scalable object storage such as Amazon S3; 3) elastic on-demand compute such as Amazon Elastic Compute Cloud (Amazon EC2); and 4) tools such as Amazon Elastic MapReduce. It also offers choice in database images (Amazon RDS, Oracle, MySQL, etc.).

 

How does Amazon Analytics-in-the-Cloud work?


BestBuy had a clickstream analysis problem: 3.5 billion records, 71 million unique cookies, and 1.7 million targeted ads required per day. How to make sense of this data? They used a partner to implement an analytic solution on Amazon Web Services and Elastic MapReduce. The solution was a 100-node cluster on demand; processing time was reduced from more than two days to eight hours.
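
A hedged sketch of that "100-node cluster on demand" pattern, using today's boto3 SDK purely as an illustration (the BestBuy engagement predates it). The bucket name, job script, instance types and IAM roles below are placeholders, not details from the actual project.

    # Spin up a transient EMR cluster, run one clickstream job, then terminate.
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.run_job_flow(
        Name="clickstream-analysis",
        ReleaseLabel="emr-6.15.0",
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 100,                  # cluster sized on demand, then discarded
            "KeepJobFlowAliveWhenNoSteps": False,  # terminate when the job finishes
        },
        Steps=[{
            "Name": "sessionize-clickstream",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://example-bucket/jobs/sessionize.py"],
            },
        }],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    print("Cluster started:", response["JobFlowId"])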


Predictive exploration of data, separating “signals from noise” is the base use case. This manifests in different problem spaces like targeted advertising / clickstream analysis; data warehousing applications; bioinformatics; financial modeling; file processing; web indexing; data mining and BI.  Amazon analytics-as-a-service is perfect for compute intensive scenarios in financial services like Credit Ratings, Fraud Models, Portfolio analysis, and VaR calculations.


The ultimate goal for Amazon in Analytics-as-a-Service is to provide unconstrained tools for unconstrained growth. What is interesting is that an architecture of mixing commercial off-the-shelf packages with core Amazon services is also possible.

 

The Power of Amazon’s Analytics-as-a-Service


So what does the future hold?  The market in predictive analytics is shifting.  It is moving from “Data-at-Rest” to “Data-in-motion” Analytics.


The service infrastructure to do “data-in-motion” analytics is pretty complicated to setup and execute.  The complexity ranges from the core (e.g., analytics and query optimization), to the practical (e.g., horizontal scaling), to the mundane (e.g., backup and recovery).  Doing all these well while insulating the end-user is where Amazon.com will be most dominant.

 

Data in motion analytics


Data “in motion” analytics is the analysis of data before it has come to rest on a hard drive or other storage medium. Given the vast amount of data being collected today, it is often not feasible to store the data before analyzing it. And even when there is space to store the data first, the extra time needed to store and then analyze it is unacceptable in many use cases.
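
A minimal sketch of the data-in-motion idea: events are aggregated in a sliding time window as they arrive, so the analytic result is available immediately and no raw record has to be written to disk first. The event source, the one-minute window and the running-average metric are invented for illustration.

    # Sliding-window average over a stream of readings.
    import time
    from collections import deque

    WINDOW_SECONDS = 60

    window = deque()      # (timestamp, value) pairs currently inside the window
    running_sum = 0.0

    def observe(timestamp, value):
        """Ingest one event and return the current in-window average."""
        global running_sum
        window.append((timestamp, value))
        running_sum += value
        # Evict events that have fallen out of the window.
        while window and window[0][0] < timestamp - WINDOW_SECONDS:
            _, old = window.popleft()
            running_sum -= old
        return running_sum / len(window)

    # Simulated stream: the result is updated the moment each event arrives.
    for i, reading in enumerate([3.0, 5.0, 4.0, 9.0]):
        print(observe(time.time() + i, reading))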

 

Data at rest analytics


Due to the vast amounts of data stored, technology is needed to sift through it, make sense of it, and draw conclusions from it. Much of this data sits in relational or OLAP stores, but a growing share of today’s data is not stored in a structured manner. With the explosive growth of unstructured data, technology is required to provide analytics across relational, non-relational, structured, and unstructured data sources.


Amazon AWS is not the only show in town attempting to provide analytics-as-a-service. Competitors such as Google BigQuery, a managed data analytics service in the cloud, are aimed at analyzing big data sets: one can run query analysis on data sets of 5 to 10 terabytes and get a response back quickly, in a matter of ten to twenty seconds. That is pretty useful when you just want a standardized, self-service analytics capability. How is BigQuery used? Claritic has built an application for game developers to gather real-time insights into gaming behavior. Another firm, Crystalloids, built an application to help a resort network “analyze customer reservations, optimize marketing and maximize revenue.” (Ju-kay Kwek, product manager for Google’s cloud platform, at THINKstrategies’ Cloud Analytics Summit in April.)
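
A hedged sketch of the BigQuery usage pattern described above, written against the current google-cloud-bigquery client (the interview predates this library). The project, dataset, table and column names are placeholders, not details from Claritic's or Crystalloids' applications.

    # Run an interactive query against terabyte-scale storage; only the small
    # aggregated result set comes back, typically within seconds.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    sql = """
        SELECT player_id, COUNT(*) AS sessions
        FROM `example-project.game_events.sessions`
        WHERE event_date >= '2024-01-01'
        GROUP BY player_id
        ORDER BY sessions DESC
        LIMIT 10
    """

    for row in client.query(sql).result():
        print(row.player_id, row.sessions)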

 

Bottom-line and Takeaways


Analytics is moving from the domain of departments to the enterprise level. As the demand for analytics grows rapidly, CIOs and IT organizations are going to be under increasing pressure to deliver. It will be especially interesting to watch how companies that have outsourced and offshored extensively (50+%) to Infosys, TCS, IBM, Wipro, Cognizant, Accenture, HP, CapGemini and others will adapt and leverage their partners to deliver analytics innovation.


At the enterprise level a shared utility model is the right operating model. But given the multiple BI projects already in progress and the vendor stacks in place (sunk cost and effort), it is going to be extraordinarily difficult for most large corporations to rip and replace. They will instead take a conservative, incremental integrate-and-enhance-what-we-have approach, which will put them at a disadvantage. Users will increasingly complain that IT is not able to deliver what innovators like Amazon Web Services are providing.


Amazon’s analytics-as-a-service platform strategy shows exactly where the enterprise analytics marketplace is moving, or needs to go. But most IT groups are going to struggle to follow this trajectory without strong leadership support, experimentation and program management. We expect this enterprise analytics transformation to take a decade to play out (the innovation-to-maturity cycle).


Shirish Netke

Tavishi Agrawal

2013 SMB & Channel Outlook, Trends & Predictions: Techaisle Take

All predictions below are compiled based on SMB and channel surveys conducted in 2012 covering Cloud, Mobility, Virtualization, Business Intelligence, Marketing Automation, Managed Services, Business Issues, IT Priorities, Channel Challenges across several countries.

SMB-focused Predictions

  • As SMBs continue to adopt cloud computing aggressively they will continue to move away from capital budgets; revenue, rather than tight cost control, will become the focus. The buyer will move toward the department that is responsible for delivering business results, and thereby revenue; the CMO becomes increasingly important as this unfolds. Further, countries are coming out of the economic slump that was a major factor in the initial move to the cloud, when firms were scrambling to reduce capital outlays and OPEX. SMBs are now priming themselves for growth, and the cloud is firmly established as an important tool to build the business.
  • SMBs’ emphasis on front-office, revenue-generating applications will continue with CRM at the hub and with more marketing automation and business intelligence applications. The base of marketing automation vendors will continue to consolidate as start-ups fail and pure-plays are acquired and big players roll out integrated solutions. Cloud CRM spend will continue to grow at a healthy rate of 21 percent.
  • Communications, Collaboration, Content and Context will be the primary computing scenarios of SMB IT departments, driven by Mobility, Cloud-based Applications and Process Optimization. Virtualization, Cloud, Mobility, Managed Services will together form the Four Pillars of IT that will support the transformation of SMB, enabling them to reach their full potential in the shortest period of time. The foundation for these four pillars will be the datacenter, both off-site and on-premise depending upon SMB segmentation. Techaisle forecasts that global SMB Cloud spend will grow by 22 percent in 2013, Mobility by 14 percent, Managed Services by 15 percent, Virtualization by 25 percent and datacenter by 8 percent.
  • The adoption of cloud-based productivity suites among SMBs will accelerate, which will begin to balance usage of collaborative and individual SaaS applications. Office 365 will go mainstream, along with increased usage of ERP and more sophisticated applications, offering new customer and other value-added opportunities in data and application integration. We expect SMB cloud productivity suite spend to almost double from relatively slow adoption through 2012.
  • There will be a significant increase in emphasis on data integration rather than application integration; data will be combined from several sources to power different application blocks and embedded business intelligence functionality, as we first predicted in 2008.
  • The SMB server and network will start becoming less visible as they progressively move offsite, both physically and from a remote-management perspective. Cloud-based server spend will likely grow by 40 percent, compared to on-premise new server spend growth of 5 percent, the benefits of remote management overwhelming on-premise operation for headcount-constrained firms.
  • Although social media will gain importance, SMBs will continue to struggle to determine ROMI from their social media initiatives and its usage will be considered a “productivity drainer” by many lean-staffed and short-skilled SMBs, unless they are in a local business that requires high customer intimacy to grow and build business. Aggressive SMB adopters will realize benefits but many others will be disillusioned unless advised, encouraged and shown a path by early adopters. The market will be inundated by advisors causing more confusion, especially as big data analytics start showing strong results for Enterprise-level companies.
  • ISVs will focus their attention on developing client applications that integrate email, context and workflow to build other productivity applications. New business models and solutions from ISVs and Service providers will appear for SMB mobile apps that will deliver content based on context, beginning with a few verticals and then spreading horizontally.
  • BYOD will be the new normal, with SMBs prioritizing data and application management over device management.
  • The next generation of business intelligence and mobile BI will be widely adopted within SMBs. Upper midmarket firms will experiment with Big Data using combinations of Hadoop and other technologies (e.g., Greenplum), whereas lower midmarket and small businesses will look for insights from federated big data deliverables provisioned by cloud application vendors.

Channel Partner-focused Predictions

  • There will be an accelerating trend toward vendor-direct sales through the development of remote integrated-service interfaces and inbound marketing initiatives. To counter this — and the rise of the independent consultant — channel partners will aggressively develop outbound sales capabilities so they are not cut out of the distribution chain.
  • Successful Channels will finally realize and pursue their individual respective competencies and roles as consultants, business process advisors, integrators, aggregators or plain vanilla cloud deliverers.
  • Expect that channel partners will be more successful going deep with integrated suites or a few applications that they integrate rather than trying to provide a complete infrastructure, communications, applications and vertical solutions.
  • Channel partners will begin to put together a repeatable, profitable SMB solution that will include proprietary integration value-added services or software, accelerated with productivity suites and collaborative combinations, such as Office 365 and SharePoint, or Google Apps, or the new Citrix ApplicationMe@Work or XenDesktop.
  • Cloud aggregators will continue to enter the market; however, few will be profitable, as aggregators will need to be able to manage reseller relationships with structured sales and marketing programs, implementation and post-implementation support for the channel, and tier-2 customer support for end users.
  • Mid-market focused channels will look to their vendor partners to help combine mobility, cloud and virtualization offerings, while others will rely on a partner-to-partner network.

Tavishi Agrawal
Techaisle

Dr. Cooram Ramacharlu Sridhar

What is the big deal with ANN?

In the thirty years since Shunu Sen posed the marketing-mix problems, I have been busy with marketing research. I tried modeling most of the studies and discovered that market research data alone is not amenable to statistical predictive modeling. Take, for example, imagery. Is there a correlation between image parameters and purchase-intention scores? There should be. But rarely does one get more than a 0.35 correlation coefficient. Try to link awareness, imagery, intention to buy, product knowledge, brand equity, etc. to the performance of the brand in the marketplace and one discovers land mines, unanswered questions and unactionable results.

This is where ANN steps in.

Technically ANN (Artificial Neural Networks) offers a number of advantages that statistical models do not. I will list a few of them.

    1. Non-linear models are a tremendous advantage to a modeler. The real world is non-linear and any linear model is a huge approximation.

    2. In a statistical model, the model gives the error and one can do precious little to decrease it. In ANN one can specify the error tolerance; for example, we can fit a model for an 85, 90, 95 or 99% error tolerance. It requires some expertise to figure out whether there is an overfit and what is the optimum error one can accept (a minimal sketch of this idea follows the list).

    3. Statistical models make assumptions about distributions that do not hold in the real world. ANNs make no distribution assumptions.

    4. Most ANN software available today does not identify the functions that are fitted. We, on the other hand, have been able to identify the fitted functions, extract the weights and build them into an algorithm.
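
A minimal sketch of the error-tolerance idea in point 2: training continues until the error falls below a threshold the modeler chooses, rather than accepting whatever error a fixed statistical fit produces. The tiny one-hidden-layer network and the XOR-style toy data are invented for illustration; real marketing-mix models would be far larger.

    # Train a small neural network until the mean squared error drops below a
    # user-specified tolerance.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    TOLERANCE = 0.01   # the modeler picks the acceptable error
    LR = 0.5

    for epoch in range(100_000):
        h = sigmoid(X @ W1 + b1)           # forward pass
        out = sigmoid(h @ W2 + b2)
        err = out - y
        mse = float(np.mean(err ** 2))
        if mse < TOLERANCE:                # stop once the tolerance is reached
            print(f"reached tolerance at epoch {epoch}, mse={mse:.4f}")
            break
        # backward pass: plain gradient descent on the squared error
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= LR * h.T @ d_out; b2 -= LR * d_out.sum(axis=0)
        W1 -= LR * X.T @ d_h;   b1 -= LR * d_h.sum(axis=0)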



How do we bring the differentiation?

Our biggest strength is in data integration that combines market research and economic data with transaction data into a single file. This is tricky and requires some ingenuity. We use Monte Carlo techniques to build these files and then use ANN for building the simulation models. Optimization then becomes clear and straightforward because we do not use statistical models. Optimization using statistical modeling, which most modelers use, is a nightmare. Most of the large IT vendors and even analytics companies continue to use statistical modeling for optimization, and therein lies the problem; nor are these companies aware of the possibilities that ANN can provide. Most modeling is done using aggregate data, whereas we handle the data at the respondent level. Conventional modeling is macro-data oriented whereas we are micro-data oriented. Hence the possibilities that we can generate with micro data for modeling are huge compared to macro data.
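
A hedged sketch of what such a Monte Carlo fusion step could look like: each survey respondent's attitudinal signal is combined with a spend distribution observed in transaction data, producing a single respondent-level file. All field names, segments and distributions are invented for illustration; the actual Techaisle procedure is not public.

    # Fuse survey (attitudinal) data with transaction-derived spend distributions.
    import random

    random.seed(42)

    survey = [
        {"id": 1, "segment": "premium", "purchase_intent": 0.8},
        {"id": 2, "segment": "value",   "purchase_intent": 0.3},
    ]
    # Observed spend distributions by segment (mean, std dev) from transaction data.
    spend_by_segment = {"premium": (120, 30), "value": (45, 10)}

    def fuse(respondent, n_draws=2000):
        """Attach a simulated expected spend to one survey respondent."""
        mean, sd = spend_by_segment[respondent["segment"]]
        total = 0.0
        for _ in range(n_draws):
            buys = random.random() < respondent["purchase_intent"]   # survey signal
            total += random.gauss(mean, sd) if buys else 0.0          # transaction signal
        return {**respondent, "expected_spend": total / n_draws}

    fused_file = [fuse(r) for r in survey]   # single file: attitudes + behaviour
    for row in fused_file:
        print(row)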

We have crossed the stage of theories. There are many projects that we have executed successfully that have gone on to become a must-have analytical marketing input mechanism.

Doc
Techaisle

Davis Blair

Meet the New Boss: Big Data (WSJ) – Techaisle Take

Wall Street Journal Article

This is an interesting article from the WSJ about how we are slowly allowing decision-making processes to move away from people and be handled by algorithms instead. It caught our attention at a time when we are completing survey work for our Business Intelligence report. As discussed in an earlier post, one of the key trends in BI is how deeply it is being embedded into all kinds of applications, and this article is a good example. Please let us know what you think: comment, like, tweet or forward.

Laying the Foundation


Analytic software has evolved through several generations over the last 70 years, from around WWII, when a series of unprecedented number-crunching challenges gave rise to Decision Support Systems (DSS) designed to solve problems such as the best equipment production mix given material constraints, how to logistically support the Allied invasion of Europe, split the atom, or break the Japanese code. These kinds of problems tended to be monolithic, using stochastic models to find the best answer — and they were a major catalyst for the development of the mainframe computer.

Business Intelligence (BI) followed, replacing this linear and iterative approach with one that supported solving business problems — mostly operational — within the divisions and departments of large commercial organizations, using more distributed equipment and reaching a wider audience, i.e., Finance, Operations and Distribution. In the late 1990s there was an explosion of data resulting from the widespread adoption of CRM, the killer app of the client/server era, adding mountains of sales and marketing data to the volumes of operational information. There was a growing need to get a top-down view of how performance in one area of the organization was impacting the others, and to begin taking a more structured approach to understanding cause and effect, setting objectives and consistently measuring performance to improve results. BI was evolving into Enterprise Performance Management (EPM) — which is where market leaders are today.

EPM is characterized by using business intelligence software to understand best-case performance scenarios, measure actual key performance indicators (KPIs) and determine how to close the gaps, using exception reporting for most front-office functions (CRM/SFA) and rules-based processing for the back office (process manufacturing/real-time bidding, SCM/advanced web analytics).
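
A small illustration of the exception-reporting idea: compare actual KPI values against targets and surface only the misses that need attention. The KPI names, targets and threshold below are invented for illustration.

    # Flag KPIs that miss their target by more than a relative threshold.
    kpis = [
        {"name": "pipeline_coverage", "target": 3.0,  "actual": 2.1},
        {"name": "on_time_delivery",  "target": 0.95, "actual": 0.97},
        {"name": "churn_rate",        "target": 0.05, "actual": 0.08, "lower_is_better": True},
    ]

    def exceptions(rows, threshold=0.05):
        flagged = []
        for k in rows:
            gap = (k["target"] - k["actual"]) / k["target"]
            if k.get("lower_is_better"):
                gap = -gap                 # for churn-style KPIs, higher is worse
            if gap > threshold:
                flagged.append((k["name"], round(gap, 3)))
        return flagged

    print(exceptions(kpis))   # only the KPIs needing management attention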

Optimization Nation


Equally important as the individual BI technology advances are some of the underlying laws that have accompanied the evolution: Moore’s Law, Metcalfe’s Law and the Law of Accelerating Returns all drove exponential growth in production, adoption and utility. Over a 20-year period these have resulted in a slow-motion Black Swan event based on the cumulative effect of technology investments, with huge impacts on our society, including but not limited to the following optimization activities:

Economy – development of consumer mortgage products designed to optimize sales volume regardless of risk, bundling them into bonds to optimize profit on the debt, creation of derivatives to optimize transactions and create demand for increasingly suspect debt, development of new financial instruments that have no underlying value, such as synthetic derivatives with nothing but conceptual paper profits behind them, etc. By 2008 these financial instruments had optimized leverage to create risk greater than the combined GDP of the industrialized world.

Employment – The WSJ article goes into depth about how algorithms have already replaced a hiring manager's decisions, based on probabilities of how an employee might behave under certain circumstances. Employer choices have also been optimized by a flattening of the market caused by the virtually unlimited supply of candidates on sites like Monster.com, 100K Jobs, Dice, etc. Middle management has been optimized out of the organizational chart and replaced with productivity tools, more efficient communications and a lower ratio of managers to workers. The actual number of staff required to hit the bottom line has been optimized, even as CEO salaries have been optimized. Looking a little further down the line, Andrew McAfee's point of view is deep on this subject, and more technical than mine.

Industry – We all know that manufacturing was moved offshore en masse over the past three decades to optimize production costs, but several other industry segments have been optimized as well, including retail, which has optimized through consolidation, and healthcare, which has optimized revenue per patient. Retail has been optimized at a structural level to provide one-stop shopping for almost everything you need in a single location, while volume has been optimized to produce the absolute lowest price at any cost, including optimizing the number of worker hours to keep an optimal ratio of full-time to part-time employees and save the resulting benefit costs. It has also optimized the number and variety of retail outlets and small businesses required to serve a population across a given number of square miles. Healthcare prices have been optimized to take advantage of tax structures, potential lawsuits, healthcare insurance gaps, maximizing federal matching funds, board and C-suite compensation, pharma industry profits, and more.

Government – Automation has also enabled a profitable business model that optimizes the use of Federal Government funds and ensures that every available dollar is spent, whether it is to make sure everybody gets a state-of-the-art mobile wheelchair, their 120 monthly catheters, a speaking glucose meter, maximum disability benefits, etc.  “Don’t worry - we’ll handle all the paperwork.”

Set it and Forget it


The imminent next generation of analytics involves truly “optimized” complex systems with human intervention largely removed from the process. Not to single out Wall Street, but it offers one of the best examples of the unbridled application of technology in the singular pursuit of optimization — in its case, profit for itself and its shareholders. The financial services industry has invested billions in technology and employed thousands of physicists and Ph.D.-level mathematicians to achieve a transaction advantage of a couple of milliseconds, and has programmed algorithms to use that advantage and change the rules (i.e., the assumption that share price represents perfect information is no longer true). This has not always produced predictable results, and the ghost in the machine has put us back on the precipice more than once, as seen in this TED video by Kevin Slavin. As we move into a brave new world that combines optimization software with networks that operate too fast for human intervention, more of our lives will be controlled by how rules are programmed into the system than by what we do as individuals to influence the results. One of the best examples of where this is heading is IBM’s Smarter Cities initiative, which combines intelligent networks that manage traffic, water and electric utilities, transportation, healthcare, public safety, education and more into an overall “Intelligent City”. Everyone hates traffic, so the video example from the IBM innovation site does more to explain this than I can by writing more on the subject.

Whether you agree with it or not, we are on a direct course to this future that is almost impossible to divert. This is a philosophical question and everyone will have their own opinion about the cost/benefit of chasing optimization. Comments and Opinions are welcome, please let us know what you think.
