My Take on the SAS Analyst Conference

I just got back from the SAS analyst event that was held in Steamboat Springs, Colorado.   It was a great meeting.  Here are some of the themes I heard over the few days I was there:

SAS is a unique place to work.

Consider the following: SAS revenue per employee is somewhat lower than the software industry average because everyone is on the payroll.  That's right.  Everyone from the groundskeepers to the health clinic professionals to the advertising staff is on the SAS payroll.  The company treats its employees very well, providing fitness facilities and on-site day care (also on the payroll). You don't even have to buy your own coffee or soda! The company has found that these kinds of perks have a positive impact.  SAS announced there would be no layoffs in 2009, which further increased morale and productivity, and the company actually saw increased profits that year.  SAS executives also made the point that even though they keep functions such as advertising in-house, they do not want to be insular.  The company knows it needs new blood and new ideas.  On that note, check out the next two themes:

Innovation is very important to SAS.

Here are some examples:

  • Dr. Goodnight gave his presentation using the latest version of the SAS BI dashboard, which looked pretty slick.
  • SAS has recently introduced some very innovative products and the trend will continue. One example is its social network analysis product that has been doing very well in the market.  The product analyzes social networks and can, for example, uncover groups of people working together to commit fraud.  This product was able to find $32M in welfare fraud in several weeks.
  • SAS continues to enhance its UI, which it has taken criticism for in the past. We also got pre-briefed on some new product announcements that I can't talk about yet, but other analysts did tweet about them at the conference.  There were a lot of tweets at this conference, and they were analyzed in real time.

The partnership with Accenture is a meaningful one.

SAS execs stated that although they may not have many partnerships, they try to make the ones they have very real.  While, on the surface, the recent announcement regarding the Accenture SAS Analytics Group might seem like a me-too after IBM BAO, it is actually different.  Accenture's goal is to transform the front office, just as ERP/CRM were transformed.  It wants to "take the what and turn it into so what and now what?"  It views analytics not simply as a technology, but as a new competitive management science that enables agility.  It obviously won't market it that way, as the company takes a business focus.  Look for the Accenture SAS Analytics Group to put out services such as churn management as a service and risk and fraud detection as a service, operationalized as part of a business process.

The Cloud!

SAS has a number of SaaS offerings in the market and will, no doubt, introduce more.  What I found refreshing was that SAS takes the issues around SaaS very seriously.  You'd expect a data company to be concerned about its customers' data, and it is.

Best line of the conference

SAS is putting a lot of effort into making its products easier to use and that is a good thing.  There are ways to get analysis to those people who aren’t that analytical.  In a discussion about the skill level required for people to use advanced analytics, however, one customer commented, “Just because you can turn on a stove doesn’t mean you know how to cook.”  More on this in another post.

Five Predictions for Advanced Analytics in 2010

With 2010 now upon us, I wanted to take the opportunity to talk about five advanced analytics technology trends that will take flight this year.  Some of these are up in the clouds, some down to earth.

  • Text Analytics:  Analyzing unstructured text will continue to be a hot area for companies. Vendors in this space have weathered the economic crisis well, and the technology is positioned to do even better once a recovery begins.  Social media analysis really took off in 2009, and a number of text analytics vendors, such as Attensity and Clarabridge, have already partnered with online providers to offer this service. Those that haven't will do so this year.  Additionally, numerous "listening post" services, dealing with brand image and voice of the customer, have also sprung up. However, while voice of the customer has been a hot area and will continue to be, I think other application areas such as competitive intelligence will also gain momentum.  There is a lot of data out on the Internet that can be used to gain insight about markets, trends, and competitors.
  • Predictive Analytics Model Building:  In 2009, there was a lot of buzz about predictive analytics.  For example, IBM bought SPSS, and other vendors, such as SAS and Megaputer, also beefed up their offerings.  A newish development that will continue to gain steam is predictive analytics in the cloud.  For example, vendors Aha! software and Clario are providing predictive capabilities to users in a cloud-based model.  While different in approach, they both speak to the trend that predictive analytics will be hot in 2010.
  • Operationalizing Predictive Analytics:  While not every company can or may want to build a predictive model, there are certainly a lot of uses for operationalizing predictive models as part of a business process.  Forward-looking companies are already using this as part of the call center process, in fraud analysis, and in churn analysis, to name a few use cases.  The momentum will continue to build, making advanced analytics more pervasive.
  • Advanced Analytics in the Cloud:  Speaking of putting predictive models in the cloud, business analytics in general will continue to move to the cloud for mid-market companies and others that deem it valuable.  Companies such as QlikTech introduced a cloud-based service in 2009.  There are also a number of pure-play SaaS vendors out there, like GoodData and others, that provide cloud-based services in this space.  Expect to hear more about this in 2010.
  • Analyzing Complex Data Streams:  A number of forward-looking companies with large amounts of real-time data (such as RFID or financial data) are already investing in analyzing these data streams.  Some are using the on-demand capacity of a cloud-based model to do this.  Expect this trend to continue in 2010.

Operationalizing Predictive Analytics

There has been a lot of excitement in the market recently around business analytics in general and specifically around predictive analytics. The promise of moving away from the typical rear view mirror approach to a predictive, anticipatory approach is a very compelling value proposition. 

But, just how can this be done?  Predictive models are complex.  So, how can companies use them to their best advantage?  A number of ideas have emerged to make this happen including 1) making the models easier to build in the first place and 2) operationalizing models that have been built so users across the organization can utilize the output of these models in various ways.  I have written several blogs on the topic.

Given the market momentum around predictive analytics, I was interested to speak to members of the Aha! team about their spin on this subject, which they term "Business Embedded Analytics." For those of you not familiar with Aha!, the company was formed in 2006 to provide a services platform (a SaaS platform called Axel) to embed analytics within a business.  The company currently has customers in healthcare, telecommunications, and travel and transportation.  The idea behind the platform is to allow business analysts to utilize advanced business analytics in their day-to-day jobs by implementing a range of deterministic and stochastic predictive models and then tracking, trending, forecasting, and monitoring business outcomes based on the output of the models.

An example

Here’s an example.  Say you work at an insurance company and you are concerned about customers not renewing their policies.  Your company might have a lot of data about both past and present customers, including demographic data, the type of policy they have, how long they’ve had it, and so on.  This kind of data can be used to create a predictive model of customers who are likely to drop their policy, based on the characteristics of customers who have already done so.  The Aha! platform allows a company to collect the data necessary to run the model, implement the model, get the results from the model, and continue to update and track it as more data becomes available.  This, by itself, is not a new idea.  What is interesting about the Axel Services Platform is that the output from the model is displayed as a series of dynamic Key Performance Indicators (KPIs) that the business analyst has created.  These KPIs are really important metrics, such as current membership, policy terminations, % disenrolled, and so on.  The idea is that once the model is chugging away and getting more data, it can produce these indicators on an ongoing basis, and analysts can use this information to actively understand and act on what is happening to their customer base.  The platform enables analysts to visualize these KPIs, trend them, forecast on them, and change the value of one KPI in order to see the impact that might have on the overall business.  Here is a screen shot of the system:

In this instance, these are actual, not forecasted, values of the KPIs (although this could represent a modeled goal).  For example, the KPI in the lower right-hand corner of the screen is called Internal Agent Member Retention.  This is actually a drill-down of information from the Distribution Channel Performance KPI.  The KPI might represent the number of policies renewed on a particular reference date, year to date, etc. If it were a modeled KPI, it might represent the target value for that particular KPI (i.e., in order to make a goal of selling 500,000 policies in a particular time period, an internal agent must sell, say, 450 of them).  This goal might change based on seasonality, risk, time periods, and so on.

Aha! provides tools for collaboration among analysts and a dashboard, so that this information can be shared with members across the organization or across companies. Aha! provides a series of predictive models, but also enables companies to pull in models from outside sources such as SAS or SPSS. The service is currently targeted at enterprise-class companies.
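To make the modeling half of this concrete, here is a minimal sketch, in Python with scikit-learn, of the kind of policy-lapse model described above. The file names and column names are hypothetical, and this illustrates the general technique rather than anything Aha! actually exposes in its platform.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical historical policy data: one row per customer, with a known
# outcome column "lapsed" (1 = did not renew, 0 = renewed).
df = pd.read_csv("policy_history.csv")

features = ["age", "tenure_years", "policy_type", "premium", "claims_last_year"]
X = pd.get_dummies(df[features], columns=["policy_type"])  # one-hot encode policy type
y = df["lapsed"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

# Check the model on held-out customers before trusting it.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score the current book of business: probability that each customer lapses.
current = pd.get_dummies(pd.read_csv("current_policies.csv")[features],
                         columns=["policy_type"]).reindex(columns=X.columns, fill_value=0)
lapse_risk = model.predict_proba(current)[:, 1]
```

In a platform like Axel, the interesting part is what happens after this step: the scores feed the KPIs that analysts track day to day.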

So what?

What does this mean?  Simply this: the model, once created, is not static.  Rather, its results are part of the business analyst’s day-to-day job.  In this way, a company can develop a strategy (for example, around acquisition or retention), create a model to address it, and then continually monitor, analyze, and act on what is happening to its customer base.

When most analytics vendors talk about operationalizing predictive analytics, they generally mean putting a model into a process (say, for a call center) that can be used by call center agents to tell them what they should be offering customers.  Call center agents can provide information back into the model, but I haven’t seen a solution where the model represents the business process in quite this way and continuously monitors the process.  This can be a tremendous help in the acquisition and retention efforts of a company. I see these kinds of models and processes being very useful in industries that have a lot of small customers who aren’t that “sticky,” meaning they have the potential to churn.  In this case, it is not enough to run a model once; it really needs to be part of the business process. In fact, the outcome analytics of the business user are the necessary feedback to calibrate and tune the predictive model (i.e., you might build a model, but it isn’t really the right model).  As offers, promotions, etc. are provided to these customers, the results can be understood in a dynamic way, in a sense letting you get out ahead of your customer base.

Text Analytics Meets Publishing

I’ve been writing about text analytics for a number of years, now. Many of my blogs have included survey findings and vendor offerings in the space.  I’ve also provided a number of use cases for text analytics; many of which have revolved around voice of the customer, market intelligence, e-discovery, and fraud.  While these are all extremely valuable, there are a number of other very beneficial use cases for the technology and I believe it is important to put them out there, too.

Last week, I spoke with Daniel Mayer, a product marketing manager at TEMIS, about the publishing landscape and how text analytics can be used in both the editorial and the new product development parts of the publishing business.  It’s an interesting and significant use of the technology.

First, a little background.  I don’t believe it comes as a surprise to anyone that publishing as we used to know it has changed dramatically.  Mainstream newspapers and magazines have given way to desktop publishing and the Internet as economics have changed the game.  Chris Anderson wrote about this back in 2004 in Wired, in an article he called “The Long Tail” (it has since become a book).  Some of the results include:

  • Increased Competition.  There are more entrants, more content and more choice on the Internet and much of it is free.
  • Mass market vs. narrow market.  Additionally, whereas the successful newspapers and magazines of the past targeted a general audience, the Internet economically enables publications with a narrower appeal.
  • Social, real time.  Social network sites, like Twitter, are fast becoming an important source of real-time news.

All of this has caused mainstream publishers to rethink their strategies in order to survive.  In particular, publishers realize that content needs to be richer, interactive, timely, and relevant.

Consider the following example.  A plane crashes over a large river, close to an airport.  The editor in charge of the story wants to write about the crash itself, and also wants to include historical information about the causes of plane crashes (e.g., time of year, time of day, equipment malfunction, pilot error, etc., based on other plane crashes over the past 40 years) to enrich the story.  Traditionally, publishers have annotated documents with key words and dates.  Typically, this was a manual process, and not all documents were thoroughly tagged.  Past annotations might not meet current expectations. Even if the documents were tagged, they might have been tagged only at a high level (e.g., plane crash), so that the editor is overwhelmed with information.  This means that it might be very difficult for her to find similar stories, much less analyze what happened in other relevant crashes.

Using text analytics, all historical documents could be culled for relevant entities, concepts, and relationships to create a much richer annotation scheme.  Information about the plane crash, such as location, type of planes involved, dates, times, and causes, could be extracted from the text.  This information would be stored as enriched metadata about the articles and used when needed (a small extraction sketch follows the list below).  The Luxid Platform offered by TEMIS would also suggest topics close to the given topic.  What does this do?

  • It improves the productivity of the editor.  The editor has a complete set of information that he or she can easily navigate.  Additionally, if text analytics can extract relationships such as cause, these can be analyzed and used to enrich a story.
  • It provides new opportunities for publishers.  For example, Luxid would enable the publisher to provide the consumer with links to similar articles or set up alerts when new, similar content is created, as well as tools to better navigate or analyze the data (these might be used by fee-based subscription services).  It also enables publishers to create targeted microsites and topical pages, which might be of interest to consumers.
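As promised above, here is a minimal sketch of this kind of entity extraction using the open-source spaCy library in Python. It illustrates the general technique only; it is not how TEMIS's Luxid platform works, and the sample sentence and airport name are invented.

```python
import spacy

# A small general-purpose English model (install with:
#   pip install spacy && python -m spacy download en_core_web_sm)
nlp = spacy.load("en_core_web_sm")

article = ("A twin-engine turboprop crashed into a river near Riverside Airport "
           "on 3 March 1998, minutes after takeoff in heavy fog.")

doc = nlp(article)

# Collect the entities (dates, locations, facilities, etc.) that could be
# stored as enriched metadata alongside the article.
metadata = {}
for ent in doc.ents:
    metadata.setdefault(ent.label_, []).append(ent.text)

print(metadata)
# The exact labels depend on the model, but the output groups spans such as
# "Riverside Airport" and "3 March 1998" under entity types like FAC and DATE.
```

In a real editorial workflow, the extracted entities would be written back into the content management system as searchable metadata, which is what makes the navigation and alerting scenarios above possible.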

Under many current schemes, advertisers pay online publishers.  Enhancing navigation means more visits, more page views, and a more focused audience, which can lead to more advertising revenue for the publisher.  Publishers, in some cases, are trying to go even further, by transforming readers into sales leads and receiving a commission from sales. There are other models that publishers are exploring, as well.  Additionally, text analytics could enable publishers to re-package content, on the fly (called content repurposing), which might lead to additional revenue opportunities such as selling content to brand sponsors that might resell it.  The possibilities are numerous.

I am interested in other compelling use cases for the technology.

A different spin on analyzing content – InfoSphere Content Assessment

IBM made a number of announcements last week at IOD regarding new products and offerings to help companies analyze content.  One was Cognos Content Analytics, which enables organizations to analyze unstructured data alongside structured data.  It also looks like IBM may be releasing a “voice of the customer” type of service to help companies understand what is being said about them in the “cloud” (i.e., blogs, message boards, and the like).  Stay tuned on that front; it is currently being “previewed”.

I was particularly interested in a new product called IBM InfoSphere Content Assessment, because I thought it was an interesting use of text analytics technology.  The product uses content analytics (IBM’s term for text analytics) to analyze “content in the wild”.  This means that a user can take the software and run it over servers that might contain terabytes (or even petabytes) of data to understand what is being stored on them.  Here are some of the potential use cases for this kind of product (a small triage sketch follows the list):

  • Decommission data.  Once you understand the data that is on a server, you might choose to decommission it, thereby freeing up storage space.
  • Records enablement.  InfoSphere Content Assessment can also be used to identify which records need to go into a records management system for a record retention program.
  • E-Discovery.  Of course, this technology could also be used in litigation, investigations, and audits.  It can analyze unstructured content on servers, which can help discover information that may be used in legal matters or information that needs to meet certain audit requirements for compliance.
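IBM has not published the internals of InfoSphere Content Assessment, but the basic triage idea behind these use cases, crawling a file share and getting a rough picture of what is sitting on it, can be sketched in a few lines of Python. The mount point and keyword rules below are purely hypothetical; a real assessment would use full content analytics, not keyword lists.

```python
import os
from collections import Counter

# Hypothetical classification rules: in a real assessment these would be
# text-analytics models, not keyword lists.
RULES = {
    "contract_record": ["agreement", "hereby", "termination clause"],
    "possible_litigation": ["subpoena", "privileged", "settlement"],
}

def classify(path):
    """Return a rough label for one file based on its text content."""
    try:
        with open(path, errors="ignore") as f:
            text = f.read().lower()
    except OSError:
        return "unreadable"
    for label, keywords in RULES.items():
        if any(k in text for k in keywords):
            return label
    return "routine"

summary = Counter()
for root, _dirs, files in os.walk("/mnt/fileshare"):  # hypothetical mount point
    for name in files:
        summary[classify(os.path.join(root, name))] += 1

print(summary)  # rough picture of what is sitting on the server
```

Even a crude pass like this gives a first answer to "what is on these servers?", which is the question the product is aimed at.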

The reality is that the majority of companies don’t formally manage their content.  It is simply stored on file servers.  The IBM product team’s view is that companies can “acknowledge the chaos”, but use the software to understand what is there and gain control over the content.  I had not seen a product positioned quite this way before and I thought it was a good use of the content analysis software that IBM has developed.

If anyone else knows of software like this, please let me know.

SAS and the Business Analytics Innovation Centre

Last Friday, SAS announced that it was partnering with Teradata and Elder Research Inc. (a data mining consultancy) to open a Business Analytics Innovation Centre.  According to the press release,

“Recognising the growing need and challenges businesses face driving operational analytics across enterprises, SAS and Teradata are planning to establish a centralised “think tank” where customers can discuss analytic best practices with domain and subject-matter experts, and quickly test or implement innovative models that uncover unique insights for optimising business operations.”

The center will include a lab for pilot programs, analytics workshops, and proof-of-concept work for customers.  I was excited about the announcement because it further validated that business analytics continues to gain steam in the market. I had a few questions, however, that I sent to SAS.  Here are the responses.

Q. Is this a physical center or a virtual center?  If physical – where is it located and how will it be staffed?  If virtual, how will it be operationalized?

R. The Business Analytics Innovation Center will be based at SAS headquarters in Cary, North Carolina.  We will offer customer meetings, workshops and projects out of the Center. 

Q. Will there be consulting services around actually deploying analytics into organizations?  In other words, is it business action oriented or more research oriented?

R.  The Business Analytics Innovation Center will offer consulting services around how best to deploy analytics into organizations, as well as conduct research-based activities to help businesses improve operational efficiency. 

Q.  Should we expect to hear more announcements from SAS around business analytics, similar to what has been happening with IBM?

R.  As the leader in business analytics software and services, SAS continues to make advances in its business analytics offerings. You can expect to hear more from SAS in this area in 2010.

I’m looking forward to 2010!

Is it Possible to Make Predictive Analytics Pervasive?

I just got back from the IBM Information on Demand (IOD) conference in Las Vegas.  A key message was that the future is in analytics and predictive analytics at that.  IBM has already invested $12B ($8B acquisitions, $4B organic growth) in analytics since 2005.  Its recent purchase of SPSS has enabled the company to put a stake in the ground regarding leading the analytics charge.

Predictive analytics uses historical data to try to predict what might happen in the future.  There are different technologies that can help you do this, including data mining and statistical modeling.  For example, a wireless telecommunications company might try to predict churn by analyzing the historical data associated with customers who disconnected the service vs. those who did not.  Attributes that might serve as predictors include dropped calls, calling volume (in network, out of network), demographic information, and so on.  An insurance company might try to predict future fraud using past claims where the outcome is known.  Adam Gartenberg’s blog describes more examples of this.  IBM plans to make predictive analytics more pervasive in several ways.

  • Making models easier to build. IBM will make predictive modeling tools easier to use for those who build the models.  A good example of this is the SPSS PASW Modeler product, which uses a visual paradigm to build various kinds of models.  I stopped by the SPSS booth at the show and saw the software in the demo area; it is nice, with a lot of features and functionality built in.  Training is available (and, I would argue, necessary), for example, to understand when you might want to use a certain kind of model.
  • Embedding the predictive model in a process.  Here, the predictive model would become part of a business process. For example, a predictive model might be built into a claims analysis process.  The model determines the characteristics and predictors of claims that might be classified as fraudulent.  As the claims come through the process, those that are suspicious, based on the model, would get kicked out for further examination.  
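As a rough illustration of this second approach, here is a minimal sketch in Python with scikit-learn of scoring incoming claims against an already-trained model and kicking the suspicious ones out for review. The file and column names are invented, and this shows the general pattern rather than any specific IBM or SPSS product.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = ["claim_amount", "days_since_policy_start", "prior_claims", "police_report"]

# Train once on historical claims where the fraud outcome is already known.
history = pd.read_csv("claims_history.csv")  # hypothetical file
model = LogisticRegression(max_iter=1000)
model.fit(history[FEATURES], history["fraud"])

# Inside the claims process: score each new claim as it arrives and route
# suspicious ones to an investigator instead of auto-processing them.
def route_claim(claim: dict, threshold: float = 0.7) -> str:
    score = model.predict_proba(pd.DataFrame([claim])[FEATURES])[0, 1]
    return "refer_to_investigator" if score >= threshold else "auto_process"

print(route_claim({"claim_amount": 12000, "days_since_policy_start": 20,
                   "prior_claims": 3, "police_report": 0}))
```

Where the threshold sits is a business decision: set it too low and investigators drown in referrals, too high and fraudulent claims slip through.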

So, given these two approaches, can predictive analytics become pervasive? 

In the case of making predictive modeling tools easier to use, the question isn’t whether someone can use a tool, but whether he or she can use it correctly.   The goal of a tool like PASW is to enable business users to build advanced models. Could a BI power user who is accustomed to slicing and dicing and shaking and baking data effectively use a tool like this?  Possibly, if they have the right thought process and they pay attention to the part of the training that describes what type of technique to use for what type of problem.  It is a good goal.  Time will be the judge.

As for embedding predictive analytics in business processes, this is already starting to happen, and here is where the possibility of making prediction more pervasive gets exciting.  For example, telecommunications companies can embed predictive analytics into a call center application to understand an action that a customer might take.  A call center representative can make use of the results of the model (without understanding the model or what it does).  He or she is simply fed information from the model, in real time, to help service a customer most effectively.  The model can be created by a skilled analytics person, but deployed in such a way that it can help a lot of other people across an organization.  One key will be the ability to integrate a model into the actual code and culture behind a business process.
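To show what being fed information from the model in real time might look like behind the agent's screen, here is a tiny hypothetical sketch: the call center application calls a scoring function, and the representative only ever sees the resulting guidance, never the model. The attribute names and offer wording are invented.

```python
import pandas as pd

def guidance_for_agent(customer: dict, churn_model) -> str:
    """Turn a model score into plain-language guidance for the agent's screen.

    `customer` holds the attributes the model was trained on (hypothetical
    names; they must match the training features), and `churn_model` is any
    fitted classifier exposing predict_proba.
    """
    risk = churn_model.predict_proba(pd.DataFrame([customer]))[0, 1]
    if risk > 0.8:
        return "High churn risk: offer the loyalty discount and a device upgrade."
    if risk > 0.5:
        return "Moderate churn risk: review plan fit and mention retention offers."
    return "Low churn risk: handle the request as usual."
```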

Look, I don’t have a crystal ball (little predictive modeling humor there), but I am very excited about the possibilities of predictive modeling.  I did this kind of modeling for years at Bell Laboratories, way back when, and it is great to see it finally gaining traction in the marketplace.  Predictive analytics can be a truly powerful weapon in the right hands.

Four reasons why the time is right for IBM to tackle Advanced Analytics

IBM has dominated a good deal of the news in the business analytics world recently. On Friday, it completed the purchase of SPSS and solidified its position in predictive analytics.  This is certainly the biggest leg of a recent three-pronged attack on the analytics market that also includes:

  • Purchasing Red Pill.  Red Pill is a privately-held company headquartered in Singapore that provides advanced customer analytics services –  especially in the business process outsourcing arena.  The company has talent in the area of advanced data modeling and simulation for various verticals such as financial services and telecommunications. 
  • Opening a series of solutions centers focused on advanced analytics.  Four centers are operating now: in New York (announced last week), Berlin, Beijing, and Tokyo.  Others are planned for Washington, D.C. and London.

Of course, there is a good deal of organizational (and technology) integration that needs to be done to get all of the pieces working together, and working with all of the other software purchases IBM has made recently.  But what is compelling to me is the size of the effort that IBM is putting forth.  The company obviously sees an important opportunity in the advanced analytics market.  Why?  I can think of at least four reasons:

  • More Data and different kinds of data.  As the amount of data continues to expand, companies are finally realizing that they can use this data for competitive advantage, if they can analyze it properly.  This data includes traditional structured data as well as data from sensors and other instruments that pump out a lot of data, and of course, all of that unstructured data that can be found both within and outside of a company.
  • Computing power.  The computing power now exists to actually analyze this information.  This includes analyzing unstructured information along with utilizing complex algorithms to analyze massive amounts of structured data. And, with the advent of cloud computing, if companies are willing to put their data into the cloud, the compute power increases.
  • The power of analytics.  Sure, not everyone at every company understands what a predictive model is, much less how to build one.  However, a critical mass of companies have come to realize the power that advanced analytics, such as predictive analysis can provide.  For example, insurance companies are predicting fraud, telecommunications companies are predicting churn.  When a company utilizes a new technique with success, it is often more willing to try other new analytical techniques. 
  • The analysis can be operationalized.  Predictive models have been around for decades.  The difference is that 1) the compute power exists and 2) the results of the models can be utilized in operations.  I remember developing models to predict churn many years ago, but the problem was that it was difficult to actually put these models into operation.  This is changing.  For example, companies are using advanced analytics in call centers.  When a customer calls, an agent knows if that customer might be likely to disconnect a service.  The agent can utilize this information, along with recommendations for new services, to try to retain the customer.

So, as someone who is passionate about data analysis, I am glad to see it finally gaining the traction it deserves.

What is location intelligence and why is it important?

Visualization can change the way that we look at data and information.   If that data contains a geographic/geospatial component then utilizing location information can help provide a new layer of insight for certain kinds of analysis.  Location intelligence is the integration and analysis of visual geographic/geospatial information as part of the decision making process.  A few examples where this might be useful include:

  • Analyzing marketing activity
  • Analyzing sales activity
  • Analyzing crime patterns
  • Analyzing utility outages
  • Analyzing  military options

I had the opportunity to meet with the team from SpatialKey the other week.  SpatialKey offers a location intelligence solution, targeted at decision makers, in a Software as a Service (SaaS) model.  The offering comes from Universal Mind, a consulting company that specializes in design and usability and has done a lot of work on dashboards, Geographic Information Systems, and the like.  Based on that experience, the company developed a cloud-based service to help people utilize geographic information more effectively.

According to the company, all a user needs to get started is a CSV file with their data. The file must contain either an address, which SpatialKey will geocode, or latitude and longitude for mapping purposes, and it can contain any other structured data as well.  Here is a screen shot from the system.  It shows approximately 1,000 real estate transactions from the Sacramento, California area that were reported over a five-day period.

[Screen shot: heat map of Sacramento-area real estate transactions]

There are several points to note in this figure.  First, the data can be represented as a heat map, meaning that areas with a large number of transactions appear in red and areas with fewer transactions in green.  Second, the software gives the user the ability to add visualization pods, which are graphics (on the left) that drill down into the information.  The service also allows you to incrementally add other data sets so you can visualize patterns.  For example, you might choose to add crime rates or foreclosure rates on top of the real estate transactions to understand the area better.  The system also provides filtering capabilities through pop-ups and sliders.
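SpatialKey itself is a hosted service, but the core visualization described here, a density heat map built from a CSV of geocoded points, is easy to approximate with open-source Python tools. The file name and column names below are hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV with one row per real estate transaction,
# already geocoded to latitude/longitude.
df = pd.read_csv("sacramento_sales.csv")

fig, ax = plt.subplots(figsize=(8, 6))

# Hexagonal binning approximates a heat map: dense areas show up red,
# sparse areas green (reversed red-yellow-green colormap).
hb = ax.hexbin(df["longitude"], df["latitude"], gridsize=40, cmap="RdYlGn_r", mincnt=1)
fig.colorbar(hb, ax=ax, label="transactions per cell")

ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("Real estate transactions (density)")
plt.show()
```

Layering a second data set, say foreclosure rates, would simply mean adding another layer on the same axes, which is roughly what the "add other data sets" feature described above does.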

SpatialKey has just moved out of beta and into trial.  The company does not intend to compete with traditional BI vendors.  Rather, its intention is to provide a lightweight alternative to traditional BI and GIS systems.  The idea would be to simply export data from different sources (either your company data stores or even other cloud sources such as Salesforce.com) and allow end users to analyze it via a cloud model.

The future of data is more data.  Location intelligence solutions will continue to become important as the number of devices, such as RFID tags and other sensors, continues to explode.  As these devices spew even more data into organizations, people will want a better way to analyze this information.  It makes sense to include geographic visualization as part of the business analytics arsenal.

2009 Text Analytics Survey

 Several weeks ago, in preparation for the Text Analytics Summit, I deployed a short survey about the state of text analytics.  I supplemented the end-user survey with vendor interviews.   Here are some of the end-user findings. 

First, a few words about the survey itself and who responded to the survey.

  • I wanted to make the survey short and sweet.  I was interested in companies’ plans for text analytics and whether the economy was affecting these plans.
  • Let me say up front that, given the topic of the survey and our list, I would categorize most of the respondents as fairly analytical and technology savvy.  Approximately 50 companies responded to the survey, split evenly between those companies that were deploying the technology and those that were not (note that this is a self-selecting sample and does not imply that 50% of companies are currently using text analytics).  The respondents represent a good mix across a number of verticals, including computing, telecommunications, education, pharmaceuticals, financial services, government, and CPG.  There were also a few market researchers in the mix.  Likewise, there was a mix of companies of various sizes.
  • Here’s my caveat:  I would not view the respondents as a scientific sample and I would view these results as qualitative.  That said, many of the results paralleled results from previous surveys.  So, while the results are unscientific, in terms of a random sample and size, I believe they probably do reflect what many companies are doing in this space.

Results:

Kinds of applications, implementation schemes

I asked those respondents who were deploying text analytics what kinds of applications they were using it for.  The results were not surprising.  The top three responses, Voice of the Customer (VoC), Competitive Intelligence, and eDiscovery, were also in the top three the last time I asked the question. Additionally, many of the respondents were deploying more than one type of application (e.g., VoC and quality analysis).  This was a pattern that also emerged in a study I did on text analytics back in 2007. Once a company gains value from one implementation, it then sees the wider value of the technology (and realizes that it has a vast amount of information that needs to be analyzed).

I asked those companies that were planning to deploy the technology which applications they were considering.  In this case, VoC and Competitive Intelligence were again the top two.  Brand Management and Product R&D were tied for third.  This is not surprising.  Companies are quite concerned with customer feedback and any issues that impact customer retention.  Companies want to understand what competitors are up to and how their brand is being perceived in the market.  Likewise, they are also trying to get smarter about how they develop products and services to be more cost effective and more market focused.

[Chart: text analytics applications in use and planned]

How Text Analytics is being deployed

I also wanted to find out how companies were deploying the technology. In particular, we’ve heard a lot this past year about organizations utilizing text analytics in a Software as a Service (SaaS) model.  This model has become particularly attractive in the Voice of the Customer/Competitive Intelligence/Brand Management area for several reasons.  For one thing, this kind of analysis might involve some sort of external information source, such as news feeds and blog postings.  Even product R&D would draw from external sources such as trade journals, news about competitive products, and patent files.  Additionally, the SaaS model generally has a different price point than enterprise solutions.

In fact, SaaS was the model of choice for implementing the technology.  The SaaS model does offer the flexibility and price point that many companies are looking for, especially in some of the above-mentioned areas.  However, that is not to say that companies are not deploying text analytics in other ways (note the values on the X axis).  Interestingly, companies are starting to deploy text analytics in conjunction with their content management systems.  I think we will see more of this as the technology continues to become more mainstream.

[Chart: how text analytics is being deployed]

Just as an FYI, all of the companies that had deployed text analytics stated that the implementations either met or exceeded their expectations.  And, close to 60% stated that text analytics had actually exceeded expectations. 

 What about those companies that aren’t deploying the technology?

Equally important to understanding the market are those companies that are not deploying text analytics.  I asked those companies if they had any plans to utilize the technology.  Eleven percent stated that plans had been put on hold due to funding constraints.  Twenty-eight percent stated that they had no plans to implement the technology.  Another 28% stated that they planned to implement the technology in the next year and 33% said they planned to implement it in the next few years. 

Reasons cited for not implementing the technology included not understanding enough about text analytics to implement it.  Other companies just never considered implementing it, or had other analytic projects on the radar.

What about the economy?

There have been numerous articles written about whether certain technologies are recession-proof, with various BI-related technology vendors stating/hoping/praying that their technology falls into this category.  And certainly, companies do feel the need to gain insight about operational efficiency, their customers, the market, and the competition with perhaps a greater urgency than in the past.  This has helped keep business analytics vendors moving forward in this economy.

The 11% number is relatively small.  However, I wonder what part of the 61% that said they would be deploying it in the future might actually fall into the on-hold category.  When I asked text analytics vendors (mostly private companies) whether the economy was impacting the sales cycle, they pretty much said the same thing.  Existing customers were not dropping projects (there is too much value there, as supported by this survey).  However, sales cycles are longer (companies are not necessarily rushing), and potential clients may be looking for creative financing and contracting options.

I am participating in an analyst panel at the Text Analytics Summit in June.  I have more to say about the topic, as, I am sure, do the other analysts who will be participating.
