The Inaugural Hurwitz & Associates Predictive Analytics Victory Index is complete!

For more years than I like to admit, I have been focused on the importance of managing data so that it helps companies anticipate changes and be prepared to take proactive action. So, as I watched the market for predictive analytics emerge, I thought it was important to provide customers with a holistic perspective on the value of commercial offerings, grounded in real-world factors. I am therefore delighted to announce the release of the Hurwitz & Associates Victory Index for Predictive Analytics! I’ve been working on this report for quite some time, and I believe it will be a very valuable tool for companies looking to understand predictive analytics and the vendors that play in this market.

Predictive analytics has become a key component of a highly competitive company’s analytics arsenal. Hurwitz & Associates defines predictive analytics as:

A statistical or data mining solution consisting of algorithms and techniques that can be used on both structured and unstructured data (together or individually) to determine future outcomes. It can be deployed for prediction, optimization, forecasting, simulation, and many other uses.

So what is this report all about? The Hurwitz & Associates Victory Index is a market research assessment tool that analyzes vendors across four dimensions: Vision, Viability, Validity, and Value. We take a holistic view of the value and benefit of important technologies, assessing not just the technical capability of a technology but its ability to provide tangible value to the business. For the Victory Index we examined more than fifty attributes, including customer satisfaction, value/price, time to value, technical value, breadth and depth of functionality, customer adoption, financial viability, company vitality, strength of intellectual capital, business value, ROI, and clarity and practicality of strategy and vision. The report also examines important trends in the predictive analytics market and provides detailed overviews of vendor offerings in the space.
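
As a toy illustration of how a multi-attribute assessment like this can be rolled up into a single score, consider the following Python sketch. This is emphatically not the Victory Index’s actual scoring model; the attributes, weights, and scores are invented.

```python
# Toy weighted-attribute roll-up. Not Hurwitz & Associates' methodology;
# attribute names, weights, and scores are invented for illustration.
weights = {
    "customer_satisfaction": 0.30,
    "time_to_value": 0.20,
    "breadth_of_functionality": 0.25,
    "financial_viability": 0.25,
}

vendors = {
    "Vendor A": {"customer_satisfaction": 8, "time_to_value": 7,
                 "breadth_of_functionality": 9, "financial_viability": 6},
    "Vendor B": {"customer_satisfaction": 6, "time_to_value": 9,
                 "breadth_of_functionality": 7, "financial_viability": 8},
}

for name, scores in vendors.items():
    total = sum(weights[attr] * score for attr, score in scores.items())
    print(f"{name}: weighted score {total:.2f} out of 10")
```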

Some of the key vendor highlights include:
• Hurwitz & Associates named six vendors as Victors across two categories: SAS, IBM (SPSS), Pegasystems, Pitney Bowes, StatSoft, and Angoss.
• Other vendors recognized in the Victory Index include KXEN, Megaputer Intelligence, Rapid-I, Revolution Analytics, SAP, and TIBCO.

Some of the key market findings include:
• Vendors have continued to place an emphasis on improving the technology’s ease of use, making strides towards automating model building capabilities and presenting findings in business context.
• Predictive analytics is no longer relegated to statisticians and mathematicians. The user profile for predictive analytics has shifted dramatically as the ability to leverage data for competitive advantage has placed business analysts in the driver’s seat.
• As companies gather greater volumes of disparate kinds of data, both structured and unstructured, they require solutions that can deliver high performance and scalability.
• The ability to operationalize predictive analytics is growing in importance as companies come to understand the advantage of incorporating predictive models into their business processes. For example, statisticians at an insurance company might build a model that predicts the likelihood of a claim being fraudulent; a toy sketch of that idea follows.
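
To make the insurance example concrete, here is a minimal sketch of how such a model might be built and operationalized with scikit-learn. It is an illustration only: the file name, feature columns, and the 0.8 routing threshold are all hypothetical.

```python
# Hypothetical sketch: scoring insurance claims for fraud likelihood.
# The CSV file, feature names, and threshold are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

claims = pd.read_csv("claims.csv")  # assumed to hold the columns used below
features = ["claim_amount", "days_since_policy_start", "prior_claims"]

X_train, X_test, y_train, y_test = train_test_split(
    claims[features], claims["is_fraud"], test_size=0.2, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# "Operationalizing" the model: score each incoming claim and route
# high-scoring ones to an investigator instead of straight-through processing.
claims["fraud_score"] = model.predict_proba(claims[features])[:, 1]
flagged = claims[claims["fraud_score"] > 0.8]
```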

I invite you to find out more about the report by visiting our website: www.hurwitz.com

Five Analytics Predictions for 2011

In 2011 analytics will take center stage as a key trend because companies are at a tipping point with the volume of data they have and their urgent need to do something about it. So, with 2010 now past and 2011 to look forward to, I wanted to take the opportunity to submit my predictions (no pun intended) regarding the analytics and advanced analytics market.

Advanced Analytics gains more steam. Advanced analytics was hot last year and will remain so in 2011. Growth will come from at least three sources. First, advanced analytics will increase its footprint in large enterprises: a number of predictive and advanced analytics vendors worked to make their tools easier to use in 2009-2010, so expect new users to come on board in companies already deploying the technology. Second, more companies will begin to purchase the technology because they see it as a way to increase top-line revenue while gaining deeper insights about their customers. Finally, small and midsized companies will get into the act, looking for lower-cost, user-friendly tools.
Social Media Monitoring Shake-Out. The social media monitoring and analysis market is a crowded and confused space, with close to 200 vendors competing across no-cost, low-cost, and enterprise-cost solution classes. Expect 2011 to be a year of folding and consolidation, with at least a third of these companies tanking. Before that happens, expect new entrants to the market for low-cost social media monitoring platforms, and everyone screaming for attention.
Discovery Rules. Text analytics will become a mainstream technology as more companies finally understand the difference between simply searching information and actually discovering insight. Part of this will be due to the impact of social media monitoring services that use text analytics to discover topics and patterns in unstructured data, rather than simply search for them. However, innovative companies will continue to build text analytics solutions that do more than just analyze social media.
Sentiment Analysis is Supplanted by Other Measures. Building on prediction #3, by the end of 2011 sentiment analysis won’t be the be-all and end-all of social media monitoring. Yes, it is important, but the reality is that most low-cost social media monitoring vendors don’t do it well. They may tell you that they get 75-80% accuracy, but it ain’t so. In fact, it is probably more like 30-40% (a simple way to check a vendor’s claim yourself is sketched at the end of this post). After enough users get burned by not questioning sentiment scores, they will begin to look for other meaningful measures.
Data in the cloud continues to expand, as does BI SaaS. Expect there to still be a lot of discussion around data in the cloud. Business analytics vendors will continue to launch SaaS BI solutions, and companies will continue to buy them, especially small and midsized companies that find the SaaS model a good alternative to some pricey enterprise solutions. Expect to see at least ten more vendors enter the market.

On-premise becomes a new word. This last prediction is not really related to analytics (hence five predictions rather than six), but I couldn’t resist. People will continue to use the term “on-premise” rather than “on-premises” when referring to cloud computing, even though it is incorrect. This will continue to drive many people crazy, since premise means “a proposition supporting or helping to support a conclusion” (dictionary.com) and is not a singular form of premises. Those of us in the know will finally give up correcting everyone else.
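
As promised in prediction #4, here is a minimal sketch of how to pressure-test a vendor’s claimed sentiment accuracy yourself: pull a random sample of scored posts, label them by hand, and measure agreement. All of the labels below are invented; in practice you would sample a few hundred posts.

```python
# Hypothetical sketch: comparing a tool's sentiment labels against
# hand-assigned labels on a random sample. All data is invented.
import random

vendor_labels = {"post_001": "positive", "post_002": "negative",
                 "post_003": "neutral",  "post_004": "positive",
                 "post_005": "positive", "post_006": "negative"}
human_labels  = {"post_001": "negative", "post_002": "negative",
                 "post_003": "neutral",  "post_004": "negative",
                 "post_005": "positive", "post_006": "neutral"}

sample = random.sample(sorted(vendor_labels), k=len(vendor_labels))
hits = sum(vendor_labels[p] == human_labels[p] for p in sample)
print(f"Observed agreement: {hits / len(sample):.0%}")  # 50% here
```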

SAS and the Business Analytics Innovation Centre

Last Friday, SAS announced that it was partnering with Teradata and Elder Research Inc. (a data mining consultancy) to open a Business Analytics Innovation Centre. According to the press release:

“Recognising the growing need and challenges businesses face driving operational analytics across enterprises, SAS and Teradata are planning to establish a centralised “think tank” where customers can discuss analytic best practices with domain and subject-matter experts, and quickly test or implement innovative models that uncover unique insights for optimising business operations.”

The center will include a lab for pilot programs, analytic workshops, and customer proofs of concept. I was excited about the announcement because it further validates that business analytics continues to gain steam in the market. I had a few questions, however, that I sent to SAS. Here are the responses.

Q. Is this a physical center or a virtual center?  If physical – where is it located and how will it be staffed?  If virtual, how will it be operationalized?

R. The Business Analytics Innovation Center will be based at SAS headquarters in Cary, North Carolina.  We will offer customer meetings, workshops and projects out of the Center. 

Q. Will there be consulting services around actually deploying analytics into organizations?  In other words, is it business action oriented or more research oriented?

R.  The Business Analytics Innovation Center will offer consulting services around how best to deploy analytics into organizations, as well as conduct research-based activities to help businesses improve operational efficiency. 

Q.  Should we expect to hear more announcements from SAS around business analytics, similar to what has been happening with IBM?

R. As the leader in business analytics software and services, SAS continues to make advances in its business analytics offerings. You can expect to hear more from SAS in this area in 2010.

I’m looking forward to 2010!

IBM Business Analytics and Optimization – The Dawn of a New Era

I attended the IBM Business Analytics and Optimization (BAO) briefing yesterday at the IBM Research facility in Hawthorne, NY. At the meeting, IBM executives from Software, Global Business Services, and Research (yes, Research) announced the company’s new consulting organization, which will be led by Fred Balboni. The initiative includes 4,000 GBS consultants working together with the Software Group and Research to deliver solutions to customers dedicated to advanced business analytics and business optimization. The initiative builds off of IBM’s Smarter Planet.

IBM believes that there is a great opportunity for companies that can take all of the information they are being inundated with and use it effectively. According to IBM (based on a recent study), only 13% of companies are utilizing analytics to their advantage. The business drivers behind the new practice include the pressure on companies to make smarter decisions faster. Optimization is key, as is the ability for organizations to become more predictive. In fact, the word predictive was used a lot yesterday.

According to IBM, with an instrumented data explosion, powerful software will be needed to manage this information, analyze it, and act on it. This goes beyond business intelligence and business process management to what IBM terms business analytics and optimization. BAO operationalizes this information via advanced analytics and optimization, meaning that advanced analytics operating on lots of data will be part of solutions sold to customers. BAO will go to market with industry-specific applications.

‘Been doing this for years

IBM was quick to point out that they have been delivering solutions like this to customers for a number of years. Here are a few examples:

• The Sentinel Group, an organization that provides healthcare anti-fraud and abuse services, uses IBM software and advanced analytics to predict insurance fraud.
• The Fire Department of New York is using IBM software and advanced analytics to “build a state of the art system for collecting and sharing data in real-time that can potentially prevent fires and protect firefighters and other first responders when a fire occurs”.
• The Operational Risk data exchange (ORX) is using IBM to help its 35 member banks better analyze operational loss data from across the banking industry. This work is being done in conjunction with IBM Research.

These solutions were built in conjunction with members of IBM Research who have been pioneering new techniques for analyzing data, a group of 200 mathematicians and other quantitative scientists. In fact, according to IBM, Research has been part of a very large number of client engagements. A few years back, the company formalized the bridge between GBS and Research via the Center for Business Optimization. The new consulting organization is a further outgrowth of this.

The Dawn of a New Era

The new organization will provide consulting services in the following areas:

• Strategy
• Business Intelligence and Business Performance Management
• Advanced Analytics and Optimization
• Enterprise Information Management
• Enterprise Content Management

It was significant that the meeting was held at the Research Labs. We lunched with researchers, met with Brenda Dietrich, VP of Research, and saw a number of solution demos that utilized intellectual property from Research. IBM believes that its research strength will help differentiate it from competitors.

The research organization is doing interesting work in many areas of data analysis, including mining blogs, sentiment analysis, machine learning, and predictive analysis. While some researchers on the team are more traditional and measure success by how many papers they publish, a large number get excited about solving real problems for real customers. Brenda Dietrich requires that each lab participate in real-world work.

Look, I get excited about business analytics; it’s in my blood. I agree that the world of data is changing and that companies that make the most effective use of information will come out ahead. I’ve been saying this for years. I’m glad that IBM is taking the bull by the horns. I like that Research is involved.

It will be interesting to see how effectively IBM can reuse its IP and scale it across different customers in different industries to solve complex problems. According to IBM, once a specific piece of IP has been used several times, they can effectively make it work across other solutions. On a side note, it will also be interesting to see how this IP might make its way into the Cognos platform. That is not the thrust of this announcement (which is more GBS-centric), but it is worth mentioning.

Four Questions about Innovations in Analysis

Several weeks ago, Hurwitz & Associates deployed a short survey entitled “Four questions about innovations in analysis”. Well, the results are in, and they are quite interesting!

THE SURVEY

First, a few words about the survey itself and who responded.

  1. We wanted to make the survey short and sweet. We were interested in what kinds of analytical technology companies thought were important and, specifically, how companies were using text analytics to analyze unstructured information. Finally, since there has been a lot of buzz about analyzing social media, we asked about this as well.
  2. Let me say up front that, given the nature of our list, I would categorize most of the respondents as fairly technology savvy. In all, 61 people responded to the survey; 32% of these respondents were from high technology companies. Other verticals included professional services, followed by manufacturing, financial/insurance, healthcare, and pharmaceutical. There were also some responses from governmental agencies, telecommunications, and energy companies. So, while the results are unscientific in terms of a random sample across all companies, they probably do reflect the intentions of potential early adopters, although not in a statistically significant manner.
  3. In analyzing the results, I first looked at the overall picture and then examined individual verticals, as well as filtering the results by other attributes (such as those using text analytics vs. those not using the technology) to get a feel for what these companies were thinking about and whether one group differed from another. These subgroups are, of course, quite small, and the results should be viewed accordingly. A minimal sketch of this kind of subgroup analysis follows.
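
For what it’s worth, here is a minimal pandas sketch of the kind of subgroup comparison described in point 3. The column names and responses are invented, not the actual survey data.

```python
# Hypothetical sketch of the survey subgroup analysis; data is invented.
import pandas as pd

responses = pd.DataFrame({
    "vertical": ["high_tech", "high_tech", "healthcare", "finance", "finance"],
    "uses_text_analytics": [True, False, True, False, True],
    "predictive_importance": [5, 3, 4, 2, 5],  # 1 = not important, 5 = critical
})

# The overall picture, then the same measure split by a respondent attribute.
print(responses["predictive_importance"].mean())
print(responses.groupby("uses_text_analytics")["predictive_importance"].mean())
```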

THE RESULTS

The importance of innovative technologies

We first asked all of the respondents to rate a number of technologies in terms of importance to their companies. Figure 1 shows the results. Overall, most of these technologies were at least somewhat important to this technology-savvy group, with query and reporting leading the pack. This isn’t surprising. Interestingly, OLAP data cubes appeared to be the least important analytical technology, at least with this group of respondents. Other technologies, such as performance management, predictive modeling, and visualization, ranked fairly high as well. Again, not surprisingly, text analytics ranked lower than some of the other technologies, probably because it is just moving out of the early adopter stage. Some of the respondents, from smaller firms, had no idea what any of these technologies were. In terms of text analytics, one company commented, “yeekes, this must be big time company kind of stuff. Way up in the clouds here, come down to earth.” They, no doubt, are still using Excel and Access for their analytical needs. Other smaller companies were very interested in “non-cube” technologies such as some of the visualization products on the market today.

Decisions and Consequences

Not everything is easy. I analyzed data for decision-making for many years using advanced techniques such as predictive modeling, machine learning, and even influence diagrams. With the rush to pervasive BI, we often forget about the need for truly sophisticated analysis to aid complex decision making. I’m talking about decision support for critical strategic initiatives such as managing a portfolio of investments, preparing for terrorist threats, or modeling sales spending for drug marketing when dealing with competing products. In other words, analysis of dynamic situations where multiple outcomes are possible.

Past performance is not a guarantee of future results

What is constant is that the world does not remain constant. The future is dynamic, change is expected, and traditional BI can only take you so far in the decision game. Often it is necessary to determine a series of plausible futures or explore the consequences of possible decisions. DecisionPath (www.decpath.com), a Boston-based company, uses the “past performance” phrase above to drive home some of the limitations of BI. I had a very interesting briefing with Richard Adler, the CTO of DecisionPath, the other week. He correctly pointed out the following:

  • BI technology helps to examine the past and the present and how we got there.
  • Predictive analysis is useful if the future doesn’t change, which of course it will, necessitating updating the models (if possible); a toy sketch of that monitoring problem follows this list.
  • BI can provide high-quality input into decision-making, but it doesn’t provide the whole picture because the world is dynamic.
  • BI does not actually support the process of decision-making (i.e., actively enabling or enhancing it). Think about the word process here.
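
As a toy sketch of the monitoring problem behind the second bullet, the snippet below tracks a deployed model’s accuracy over time and flags when it has drifted enough to need retraining. The accuracy numbers and the 10-point tolerance are invented.

```python
# Toy drift check: flag a model for retraining once its accuracy falls
# too far below its launch baseline. All numbers are invented.
weekly_accuracy = [0.86, 0.85, 0.84, 0.79, 0.72, 0.65]
baseline, tolerance = weekly_accuracy[0], 0.10

for week, acc in enumerate(weekly_accuracy, start=1):
    if baseline - acc > tolerance:
        print(f"Week {week}: accuracy {acc:.0%} has drifted; retrain the model")
        break
else:
    print("Model still within tolerance")
```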

DecisionPath offers a product called ForeTell that helps develop and test decisions. ForeTell combines various complementary simulation techniques in a single framework that works together with BI systems to model, simulate, and explore possible decision outcomes and test alternative decisions. So, whereas a software vendor might provide some of these techniques individually, DecisionPath has put them together. Here is an illustration, provided by DecisionPath, that describes the relationship between BI and ForeTell:

[Figure: ForeTell and its relationship to BI. Source: DecisionPath]
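
ForeTell itself is a commercial product, but the general idea of simulating many plausible futures to compare alternative decisions can be shown with a toy Monte Carlo sketch. Every parameter below is invented; this says nothing about how ForeTell actually works.

```python
# Toy Monte Carlo comparison of decision alternatives under uncertainty.
# All parameters are invented for illustration.
import random

def simulate_revenue(ad_spend, trials=10_000):
    """Simulate one year of revenue ($M) under a given ad-spend decision."""
    outcomes = []
    for _ in range(trials):
        market_growth = random.gauss(0.03, 0.05)          # uncertain future
        lift = 0.8 * ad_spend * random.uniform(0.5, 1.5)  # uncertain response
        outcomes.append(100.0 * (1 + market_growth) + lift - ad_spend)
    return outcomes

for spend in (0.0, 5.0, 10.0):  # alternative decisions, in $M
    results = sorted(simulate_revenue(spend))
    mean = sum(results) / len(results)
    downside = results[len(results) // 20]  # roughly the 5th percentile
    print(f"spend={spend:4.1f}M  mean={mean:6.1f}M  5%-worst-case={downside:6.1f}M")
```

The point of the exercise is the comparison across rows: a decision that maximizes the mean outcome may carry an unacceptable worst case, which is exactly the kind of tradeoff a BI report alone will not surface.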

This is not your ma and pa BI, and it is clearly not for everyone. DecisionPath has made good inroads in the government, pharmaceutical, and financial sectors, where complex analysis is the norm. However, alternative decisions with complex tradeoffs exist in all industries to some degree, so the approach is certainly applicable to a wider range of verticals than those listed here.

Seven guiding principles for analyzing data

I was talking to an old friend the other day who is involved in using the results of research to help grow a business. He told me some interesting stories that made me revisit some basic tenets of good analysis. Yes, you may think that some of these are obvious, but they still bear repeating. Here are seven interrelated principles to start with:

  • Process is a way of thinking, not a substitute for thinking. You’d be surprised at how many people fall into this trap. For example, in behavioral research certain metrics might be the norm for capture. These might include the number of times that eye contact was made, or the quality of the interaction with the examiner. However, simply because others have used these “tried and true” measures doesn’t mean that they necessarily fit the situation you’re currently examining. Think about it.
  • Data needs to be thought about and reported in context. This is a pet peeve of mine. If someone tells me that 1.5 million Americans were out of work at some point during the Great Depression, I may think that is terrible, but I don’t really understand what it means because the fact was not put into context. I don’t know what percent of the working population this represents, or, for that matter, whether it includes women or other groups. When a vendor tells me that Company X saved $20M by utilizing its product, that’s great, but what does it really mean? What percent of its overall costs (whether by department or company) does this represent? How is another company, looking at this information, supposed to respond unless it understands what the data mean in context?
  • Look before you leap. Before you start applying statistical techniques or cranking out charts and reports, take a good hard look at the data you’ve collected. Be thoughtful. Ask yourself some basic questions such as, “Do the data seem reasonable, complete, and accurate?” “What are the data suggesting?” “Is there some sort of hypothesis I can propose to test based on the data?” Too often, people simply jump into running every sort of analysis on their data, simply because they can. (A minimal sketch of this kind of pre-analysis sanity check follows this list.)
  • Question everything. If you are using the results from someone else’s analysis to build upon, you need to question how they got their results. Did this analysis make sense? How big was the sample? This is (I hope) a basic principle in scientific research but I haven’t seen it necessarily carried over into business. If your sales figures have jumped by 50%, you need to ask yourself, “Why?” Perhaps new products were added or new markets were tapped. Whatever the reason, make sure the data makes sense. Data quality is obviously important here.
  • Do a gut check. This is an extension of the question-everything principle. Once you’ve done some analysis, ask yourself whether it makes sense. Remember the old saying: if something seems too good to be true, it probably is. If your sales figures have jumped by 150%, you need to ask yourself whether this is possible and then go and figure it out.
  • Coincidence is not the same as causality. Just because it may appear that two variables are somehow related it doesn’t mean that they are. Remember to question everything and do a gut check.
  • Just because the data exist doesn’t mean the data are relevant. Here, you need to ask yourself what you are trying to figure out. Just because you have the data doesn’t mean that the data are necessarily useful to your analysis.
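
As promised above, here is a minimal sketch of what “look before you leap” and “do a gut check” can mean in practice before any modeling starts. The file name and columns are hypothetical.

```python
# Hypothetical pre-analysis sanity checks; file name and columns are invented.
import pandas as pd

sales = pd.read_csv("monthly_sales.csv")  # assumed: month, region, revenue

print(sales.describe())            # do the ranges and means look reasonable?
print(sales.isna().mean())         # share of missing values per column
print(sales.duplicated().sum(), "duplicate rows")

# Gut check: a 150% jump in revenue deserves an explanation before analysis.
jumps = sales["revenue"].pct_change()
print(sales[jumps > 1.5])          # rows where revenue jumped by more than 150%
```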

I’m sure you can think of more and I know I will certainly come up with others. But, that is all for now.
