Who is using advanced analytics?

Advanced analytics is currently a hot topic among businesses, but who is actually using it and why? What are the challenges and benefits to those companies that are using advanced analytics? And, what is keeping some companies from exploring this technology?

Hurwitz & Associates would like your help in answering a short (5 minute) survey on advanced analytics. We are interested in understanding what your company’s plans are for advanced analytics. If you’re not planning to use advanced analytics, we’d like to know why. If you’re already using advanced analytics, we’d like to understand your experience.

If you participate in this survey we would be happy to send you a report of our findings. Simply provide us your email address at the end of the survey! Thanks!

Here is the link to the survey:
Click here to take survey

Analyzing Big Data

The term “Big Data” has gained popularity over the past 12-24 months as a) the amount of data available to companies continually increases and b) technologies have emerged to manage this data more effectively. Of course, large volumes of data have been around for a long time. For example, I worked in the telecommunications industry for many years analyzing customer behavior. This required analyzing call records. The problem was that the technology (particularly the infrastructure) couldn’t necessarily support this kind of compute-intensive analysis, so we often analyzed billing records rather than streams of call detail records, or sampled the records instead.

Now companies are looking to analyze everything from the genome to Radio Frequency ID (RFID) tags to business event streams. And newer technologies have emerged to handle massive (terabyte and petabyte) quantities of data more effectively. Often this processing takes place on clusters of computers, meaning that processing is spread across many machines. The advent of cloud computing and the elastic nature of the cloud have furthered this movement.

A number of frameworks have also emerged to deal with large-scale data processing and support large-scale distributed computing. These include MapReduce and Hadoop:

-MapReduce is a software framework introduced by Google to support distributed computing on large data sets. It is designed to take advantage of cloud resources, with the computation spread across large clusters of computers; each machine in a cluster is referred to as a node. MapReduce can deal with both structured and unstructured data. Users specify a map function that processes a key/value pair to generate a set of intermediate pairs, and a reduce function that merges these pairs (a minimal sketch of the pattern follows this list).
-Apache Hadoop is an open source distributed computing platform that is written in Java and inspired by MapReduce. Data is stored across many machines in blocks that are replicated to other servers. Intermediate key/value pairs are partitioned (typically by hashing the key) so that records with the same key end up together, and the output of a map step can be written to a table, to memory, or to a temporary file for further analysis.
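
To make the map and reduce steps concrete, here is a minimal sketch of the pattern in plain Python, using the classic word-count example. The function names and data are illustrative only; in a real Hadoop job the same logic runs as distributed tasks across the cluster.

    # A minimal, self-contained sketch of the map/reduce pattern in plain Python.
    # The word-count example, function names, and data are illustrative only.
    from collections import defaultdict

    def map_phase(document):
        # Emit an intermediate (key, value) pair for every word in the document.
        for word in document.split():
            yield (word.lower(), 1)

    def reduce_phase(intermediate_pairs):
        # Merge all values that share the same intermediate key.
        totals = defaultdict(int)
        for key, value in intermediate_pairs:
            totals[key] += value
        return dict(totals)

    documents = ["the quick brown fox", "the lazy dog", "the quick dog"]
    pairs = [pair for doc in documents for pair in map_phase(doc)]
    print(reduce_phase(pairs))  # {'the': 3, 'quick': 2, 'brown': 1, ...}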

But what about tools to actually analyze this massive amount of data?

Datameer

I recently had a very interesting conversation with the folks at Datameer. Datameer was formed in 2009 to provide business users with a way to analyze massive amounts of data. The idea is straightforward: provide a platform to collect and read different kinds of large data stores, put them into a Hadoop framework, and then provide tools for analyzing that data. In other words, hide the complexity of Hadoop and provide analysis tools on top of it. The folks at Datameer believe their solution is particularly useful for data sets greater than 10 TB, where a company may have hit a cost wall with traditional technologies but where a business user might still want to analyze some kind of behavior. So website activity, CRM systems, phone records, and POS data might all be candidates for analysis. Datameer provides 164 functions (e.g., group, average, median) for business users, with APIs to target more specific requirements.

For example, suppose you’re in marketing at a wireless service provider and you offered a “free minutes” promotion. You want to analyze the call detail records of those customers who made use of the program to get a feel for how customers would use cell service if given unlimited minutes. The chart below shows the call detail records from one particular day of the promotion – July 11th. The chart shows the call number (MDN) as well as the time the call started and stopped and the duration of the call in milliseconds. Note that the data appear under the “analytics” tab. The “Data” tab provides tools to read different data sources into Hadoop.

This is just a snapshot – there may be TB of data from that day. So, what about analyzing this data? The chart below illustrates a simple analysis of the longest calls and the phone numbers those calls came from. It also illustrates basic statistics about all of the calls on that day – the average, median, and maximum call duration.
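
For readers who prefer code to charts, the aggregation described here boils down to something like the following sketch. The field names and sample records are hypothetical, and in practice Datameer would push this kind of work down to Hadoop rather than run it in memory.

    # Illustrative only: a plain-Python sketch of the call detail record (CDR)
    # analysis described above. Field names and sample values are hypothetical.
    from statistics import mean, median

    cdrs = [
        {"mdn": "555-0101", "duration_ms": 125000},
        {"mdn": "555-0102", "duration_ms": 2400000},
        {"mdn": "555-0103", "duration_ms": 640000},
    ]

    longest = sorted(cdrs, key=lambda r: r["duration_ms"], reverse=True)[:10]
    durations = [r["duration_ms"] for r in cdrs]

    print("Longest calls:", [(r["mdn"], r["duration_ms"]) for r in longest])
    print("Average:", mean(durations), "Median:", median(durations), "Max:", max(durations))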

From this brief example, you can start to visualize the kind of analysis that is possible with Datameer.

Note too that since Datameer runs on top of Hadoop, it can deal with unstructured as well as structured data. The company has some solutions in the unstructured realm (such as basic analysis of Twitter feeds), and is working to provide more sophisticated tools. Datameer offers its software either under a SaaS license or on premises.

In the Cloud?

Not surprisingly, early adopters of the technology are using it in a private cloud model. This makes sense, since companies often want to keep control of their own data. Some of these companies already have Hadoop clusters in place and are looking for analytics capabilities for business use. Others are dealing with big data but have not yet adopted Hadoop; they are looking at a complete “big data BI” type of solution.

So, will there come a day when business users can analyze massive amounts of data without having to drag IT entirely into the picture? Utilizing BI adoption as a model, the folks from Datameer hope so. I’m interested in any thoughts readers might have on this topic!

Five requirements for Advanced Analytics

The other day I was looking at the analytics discussion board that I moderate on the Information Management site. I had posted a topic entitled “the value of advanced analytics.” I noticed that the number of views on this topic was at least 3 times as many as on other topics that had been posted on the forum. The second post that generated a lot of traffic was a question about a practical guide to predictive analytics.

Clearly, companies are curious and excited about advanced analytics. Advanced analytics uses sophisticated techniques to understand patterns and predict outcomes, drawing on statistical modeling, machine learning, linear programming and other mathematical methods, and even natural language processing (on the unstructured side). While many kinds of “advanced analytics” have been around for 20+ years (I used them extensively in the 80s) and the term may simply be a way to invigorate the business analytics market, the point is that companies are finally starting to realize the value this kind of analysis can provide.

Companies want to better understand the value this technology brings and how to get started. And, while the number of users interested in advanced analytics continues to increase, the reality is that there will likely be a skills shortage in this area. Why? Because advanced analytics isn’t the same beast as what I refer to as “slicing and dicing and shaking and baking” data to produce reports that might include information such as sales per region, revenue per customer, and so on.

So what skills does a business user need to face the advanced analytics challenge? It’s a tough question. There is a certain thought process that goes into advanced analytics. Here are five skills (there are, no doubt, more) that, at a minimum, you should have:

1. It’s about the data. So, thoroughly understand your data. A business user needs to understand all aspects of his or her data. This includes answers to questions such as, “What is a customer?” “What does it mean if a data field is blank?” “Is there seasonality in my time series data?” It also means understanding what kind of derived variables (e.g. a ratio) you might be interested in and how you want to calculate them.
2. Garbage in, garbage out. Appreciate data quality issues. A business user analyzing data cannot simply assume that the data (from whatever source) is absolutely fine. It might be, but you still need to check. Part of this ties back to understanding your data, but it also means looking at the data first and asking whether it makes sense (see the sketch after this list). And, what do you do with data that doesn’t make sense?
3. Know what questions to ask. I remember a time in graduate school when I was so excited to finally have data that I wanted to start throwing models at it immediately; a wise professor told me not to use statistical models on the data simply because I could. First, know what questions you are trying to answer from the data. Ask yourself whether you have the right data to answer them. Look at the data to see what it is telling you. Then start to consider the models. Knowing what questions to ask requires business acumen.
4. Don’t skip the training step. Know how to use the tools and what they can do for you. Again, it is easy to throw data at a model, especially if the software suggests a certain model. However, it is important to understand what the models are good for. When does it make sense to use a decision tree? What about survival analysis? Certain tools will take your data and suggest a model, but if you don’t know what the model means, it will be harder to defend your output. That is why vendors suggest training.
5. Be able to defend your output. At the end of the day, you’re the one who needs to present your analysis to your company. Make sure you know enough to defend it. Turn the analysis upside down, ask questions of it, and make sure you can articulate the output.
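
As a concrete illustration of points 1 and 2, here is a minimal first-pass data sanity check. The records, field names, and the “10 times the median” rule are all made up for illustration; the point is simply to look at the data before you model it.

    # A made-up first-pass sanity check: count blank fields and flag values that
    # look implausible (here, more than 10 times the median). Records and the
    # threshold are hypothetical.
    from statistics import median

    records = [
        {"customer_id": 1, "monthly_spend": 120.0},
        {"customer_id": 2, "monthly_spend": 95.0},
        {"customer_id": 3, "monthly_spend": None},      # blank field - what does it mean?
        {"customer_id": 4, "monthly_spend": 110.0},
        {"customer_id": 5, "monthly_spend": 98000.0},   # plausible, or a data entry error?
    ]

    missing = [r["customer_id"] for r in records if r["monthly_spend"] is None]
    values = [r["monthly_spend"] for r in records if r["monthly_spend"] is not None]
    typical = median(values)
    suspicious = [r["customer_id"] for r in records
                  if r["monthly_spend"] is not None and r["monthly_spend"] > 10 * typical]

    print("Records with missing spend:", missing)    # [3]
    print("Records worth questioning:", suspicious)  # [5]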

I could go on and on, but I’ll stop here. Advanced analytics tools are simply that – tools. They will be only as good as the person using them, which requires understanding the tools as well as how to think and strategize around the analysis. So my message? Used properly, these tools can be great. Used incorrectly – well – it’s analogous to a do-it-yourself electrician who burns down the house.

Metrics Matter

I had a very interesting conversation last week with Dyke Hensen, SVP of Product Strategy for PivotLink. For those of you not familiar with PivotLink, the company is a SaaS BI provider that has been in business for about 10 years (since before the term SaaS became popular). Historically, the company has worked with small to mid-sized firms, which often had large volumes of data (hundreds of millions of rows) drawn from disparate data sources. For example, the company has done a lot of work with retail POS systems, ecommerce sites, and the like. PivotLink enables companies to integrate information into a large columnar database and create dashboards to help slice and dice the information for decision-making.

Recently, the company announced ReadiMetrix, a SaaS BI service designed to provide “best practices-based metrics to accelerate time to insight.” The company provides metrics in three areas: Sales, Marketing, and HR. These are actionable measures that companies can use to measure themselves against their objectives. If some of this sounds vaguely familiar (e.g. LucidEra), you might be asking yourself, “Can this succeed?” Here are four reasons to think that it might:

  • PivotLink has been around for the past decade. It has an established customer base and business model. The company knows what its customers want; it should be able to upsell existing customers, and it knows how to sell to new customers.
  • From a technical perspective, ReadiMetrix is not a tab in Salesforce.com like many other business SaaS services. Rather, the company is partnering with integrators like Boomi to provide the connectors to on-premises as well as cloud-based applications. So the company is not trying to do the integration itself (which often trips companies up). The integration also utilizes a SOA-based approach, which enables flexibility.
  • The company is building a community of best practices around metrics to continue to grow what it can provide and to raise awareness around the importance of metrics.
  • SaaS BI has some legs.  Since the economic downturn, companies realize the importance of gaining insight from their data and BI companies of all kinds (on and off premises) have benefited from this.  Additionally, the concept of a metric is not necessarily new (think Balanced Scorecard and other measurement systems), so the chasm has been crossed in that regard.

Of course, a key to success will be whether or not companies actually think they need or want these kinds of metrics. Many companies may believe that they are “all set” when it comes to metrics. However, all too often I’ve seen firms think that “more is better” when it comes to information, rather than considering a selected number of metrics with drill-down capability underneath. The right metrics require some thought. I think the idea of an established set of metrics, developed in conjunction with a best practices community, might be appealing for companies that do not have expertise in developing their own. It will be important for PivotLink to educate the market on why these categories of metrics matter and what their value is.

Social Network Analysis: What is it and why should we care?

When most people think of social networks they think of Facebook and Twitter, but social network analysis has its roots in psychology, sociology, anthropology, and math (see John Scott’s Social Network Analysis for more details). The phrase has a number of different definitions depending on the discipline you’re interested in, but for the purposes of this discussion, social network analysis can be used to understand the patterns of how individuals interact. For other definitions, look here.

I had a very interesting conversation with the folks from SAS last week about social network analysis. SAS has a sophisticated social network analysis solution that draws upon its analytics arsenal to solve some very important problems. These include discovering banking or insurance fraud rings and identifying tax evasion, social services fraud, and health care fraud (to name a few). These are huge issues. For example, the 2009 ABA Deposit Account Fraud Survey found that eight out of ten banks reported check fraud losses in 2008. A new report by the National Insurance Crime Bureau (NICB) shows an increase in claims related to “opportunistic fraud,” possibly due to the economic downturn. These include workers’ compensation claims and staged or caused accidents.

Whereas some companies (and there are a number of them in this space) use mostly rules (e.g. If the transaction is out of the country, flag it) to identify potential fraud, SAS utilizes a hybrid approach that can also include:

  • Anomalies: e.g., the number of unsecured loans exceeds the norm (see the sketch after this list)
  • Patterns: using predictive models to understand account opening and closing patterns
  • Social link analysis: e.g., identifying transactions to suspicious counterparties
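
A toy sketch of how the rule-based and anomaly checks might combine follows. The field names, thresholds, and “norm” are hypothetical and are not SAS’s actual logic; the predictive models and link analysis described here would layer on top of checks like these.

    # Illustrative hybrid check: a simple rule plus an anomaly threshold.
    accounts = [
        {"id": "A1", "country": "US", "unsecured_loans": 2},
        {"id": "A2", "country": "XX", "unsecured_loans": 1},   # rule fires: out of country
        {"id": "A3", "country": "US", "unsecured_loans": 9},   # anomaly: loan count above the norm
    ]

    NORM_UNSECURED_LOANS = 3  # assumed portfolio norm

    def flag(account):
        reasons = []
        if account["country"] != "US":                          # rule-based check
            reasons.append("out-of-country transaction")
        if account["unsecured_loans"] > NORM_UNSECURED_LOANS:   # anomaly check
            reasons.append("unsecured loans exceed the norm")
        return reasons

    for acct in accounts:
        print(acct["id"], flag(acct))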

Consider the following fraud ring:

  • Robert Madden shares a phone number with Eric Sully, and their accounts have been shut down
  • Robert Madden also shares an address with Chris Clark
  • Chris Clark shares a phone with Sue Clark, and she still has open accounts
  • Sue Clark and Eric Sully also share an address with Joseph Sullins, who has open accounts and who is soft-matched to Joe Sullins, who has many open accounts and has been doing a lot of cash cycling between them

This is depicted in the ring of fraud that the SAS software found, which is shown above.   The dark accounts indicate accounts that have been closed.  Joe Sullins represents a new burst of accounts that should be investigated.
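
The link-analysis piece of this example can be sketched very simply: connect accounts that share identifying attributes and pull out the connected component as a candidate ring. The code below mirrors the names in the example but is purely illustrative; it is not SAS’s implementation.

    # Connect accounts that share identifying attributes (phone, address, a soft
    # name match) and walk the links to recover the candidate ring.
    from collections import defaultdict

    shared_links = [
        ("Robert Madden", "Eric Sully"),     # shared phone number
        ("Robert Madden", "Chris Clark"),    # shared address
        ("Chris Clark", "Sue Clark"),        # shared phone number
        ("Sue Clark", "Joseph Sullins"),     # shared address
        ("Eric Sully", "Joseph Sullins"),    # shared address
        ("Joseph Sullins", "Joe Sullins"),   # soft name match
    ]

    graph = defaultdict(set)
    for a, b in shared_links:
        graph[a].add(b)
        graph[b].add(a)

    def connected_component(start):
        # Depth-first walk over the shared-attribute links.
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(graph[node] - seen)
        return seen

    print(connected_component("Robert Madden"))  # the whole ring, including Joe Sullins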

The SAS solution accepts input from many sources (including text, where it can use text mining to extract information from, say, a claim). The strength of the solution lies in its ability to take data from many sources and in the depth of its analytical capability.

Why is this important?

Many companies set up Investigation Units to look into potential fraud. However, there are often large numbers of false positives (i.e. cases flagged as potential fraud that aren’t), which cost the company a lot of money to investigate. Just think about how many times you’ve been called by your credit card company after making a big purchase, or after traveling out of the country and forgetting to tell them, and you get a sense of the dollars wasted on false positives. This cost, of course, pales in comparison to the billions of dollars lost each year to fraud. Social network analysis, especially using more sophisticated analytics, can be used to find previously undetected fraud rings.

Of course, social network analysis has other use cases besides fraud detection. SAS uses social network analysis as part of its Fraud Framework, but it is expanding its vision to include customer churn and viral marketing (i.e. understanding how customers are related to each other). Other use cases include terrorism and crime prevention, company organizational analysis, and various kinds of marketing applications such as finding key opinion leaders.

Social network analysis for marketing is an area where I expect to see more action in the near term, although people will need to be educated about social networks, the difference between social network analysis and social media analysis (as well as where they overlap), and the value of the use cases. There seems to be some confusion in the market, but that is the subject of another blog.

My Take on the SAS Analyst Conference

I just got back from the SAS analyst event that was held in Steamboat Springs, Colorado.   It was a great meeting.  Here are some of the themes I heard over the few days I was there:

SAS is a unique place to work.

Consider the following: SAS revenue per employee is somewhat lower than the software industry average because everyone is on the payroll. That’s right. Everyone from the groundskeepers to the health clinic professionals to those involved in advertising is on the SAS payroll. The company treats its employees very well, providing fitness facilities and on-site day care (also on the payroll). You don’t even have to buy your own coffee or soda! The company has found that these kinds of perks have a positive impact. SAS announced there would be no layoffs in 2009, which further increased morale and productivity; the company actually saw increased profits in 2009. Executives from SAS also made the point that even though they might have their own advertising and so on, they do not want to be insular. The company knows it needs new blood and new ideas. On that note, check out the next two themes:

Innovation is very important to SAS.

Here are some examples:

  • Dr. Goodnight gave his presentation using the latest version of the SAS BI dashboard, which looked pretty slick.
  • SAS has recently introduced some very innovative products and the trend will continue. One example is its social network analysis product that has been doing very well in the market.  The product analyzes social networks and can, for example, uncover groups of people working together to commit fraud.  This product was able to find $32M in welfare fraud in several weeks.
  • SAS continues to enhance its UI, which it has been beaten up about in the past. We also got pre-briefed on some new product announcements that I can’t talk about yet, but other analysts did tweet about them at the conference. There were a lot of tweets at this conference, and they were analyzed in real time.

The partnership with Accenture is a meaningful one.

SAS execs stated that although they may not have that many partnerships, they try to make the ones they have very real. While, on the surface, the recent announcement regarding the Accenture SAS Analytics Group might seem like a “me too” after IBM BAO, it is actually different. Accenture’s goal is to transform the front office, much as ERP/CRM were transformed. It wants to “take the what and turn it into so what and now what.” It views analytics not simply as a technology but as a new competitive management science that enables agility. It obviously won’t market it that way, as the company takes a business focus. Look for the Accenture SAS Analytics Group to put out offerings such as churn management as a service and risk and fraud detection as a service, operationalized as part of a business process.

The Cloud!

SAS has a number of SaaS offerings in the market and will, no doubt, introduce more. What I found refreshing was that SAS takes the issues around SaaS very seriously. You’d expect a data company to be concerned about its customers’ data, and it is.

Best line of the conference

SAS is putting a lot of effort into making its products easier to use and that is a good thing.  There are ways to get analysis to those people who aren’t that analytical.  In a discussion about the skill level required for people to use advanced analytics, however, one customer commented, “Just because you can turn on a stove doesn’t mean you know how to cook.”  More on this in another post.

Five Predictions for Advanced Analytics in 2010

With 2010 now upon us, I wanted to take the opportunity to talk about five advanced analytics technology trends that will take flight this year.  Some of these are up in the clouds, some down to earth.

  • Text Analytics: Analyzing unstructured text will continue to be a hot area for companies. Vendors in this space have weathered the economic crisis well, and the technology is positioned to do even better once a recovery begins. Social media analysis really took off in 2009, and a number of text analytics vendors, such as Attensity and Clarabridge, have already partnered with online providers to offer this service; those that haven’t will do so this year. Additionally, numerous “listening post” services dealing with brand image and voice of the customer have sprung up. However, while voice of the customer has been a hot area and will continue to be, I think other application areas such as competitive intelligence will also gain momentum. There is a lot of data out on the Internet that can be used to gain insight about markets, trends, and competitors.
  • Predictive Analytics Model Building: In 2009, there was a lot of buzz about predictive analytics. For example, IBM bought SPSS, and other vendors, such as SAS and Megaputer, also beefed up their offerings. A newish development that will continue to gain steam is predictive analytics in the cloud. For example, the vendors Aha! software and Clario are providing predictive capabilities to users in a cloud-based model. While different in approach, they both speak to the trend that predictive analytics will be hot in 2010.
  • Operationalizing Predictive Analytics: While not every company can or may want to build a predictive model, there are certainly a lot of uses for operationalizing predictive models as part of a business process. Forward-looking companies are already doing this in the call center, in fraud analysis, and in churn analysis, to name a few use cases. The momentum will continue to build, making advanced analytics more pervasive.
  • Advanced Analytics in the Cloud: Speaking of putting predictive models in the cloud, business analytics in general will continue to move to the cloud for mid-market companies and others that deem it valuable. Companies such as QlikTech introduced cloud-based services in 2009. There are also a number of pure-play SaaS vendors out there, like GoodData and others, that provide cloud-based services in this space. Expect to hear more about this in 2010.
  • Analyzing Complex Data Streams: A number of forward-looking companies with large amounts of real-time data (such as RFID or financial data) are already investing in analyzing these data streams. Some are using the on-demand capacity of a cloud-based model to do this. Expect this trend to continue in 2010.

Operationalizing Predictive Analytics

There has been a lot of excitement in the market recently around business analytics in general and specifically around predictive analytics. The promise of moving away from the typical rear view mirror approach to a predictive, anticipatory approach is a very compelling value proposition. 

But, just how can this be done?  Predictive models are complex.  So, how can companies use them to their best advantage?  A number of ideas have emerged to make this happen including 1) making the models easier to build in the first place and 2) operationalizing models that have been built so users across the organization can utilize the output of these models in various ways.  I have written several blogs on the topic.

Given the market momentum around predictive analytics, I was interested to speak with members of the Aha! team about their spin on this subject, which they term “Business Embedded Analytics.” For those of you not familiar with Aha!, the company was formed in 2006 to provide a services platform (a SaaS platform called Axel) to embed analytics within a business. The company currently has customers in healthcare, telecommunications, and travel and transportation. The idea behind the platform is to allow business analysts to utilize advanced business analytics in their day-to-day jobs by implementing a range of deterministic and stochastic predictive models and then tracking, trending, forecasting, and monitoring business outcomes based on the output of those models.

An example

Here’s an example. Say you work at an insurance company and you are concerned about customers not renewing their policies. Your company might have a lot of data about both past and present customers, including demographic data, the type of policy they have, how long they’ve had it, and so on. This kind of data can be used to create a predictive model of customers who are likely to drop their policy, based on the characteristics of customers who have already done so. The Aha! platform allows a company to collect the data necessary to run the model, implement the model, get the results from the model, and continue to update and track it as more data becomes available. This, by itself, is not a new idea. What is interesting about the Axel Services Platform is that the output from the model is displayed as a series of dynamic Key Performance Indicators (KPIs) that the business analyst has created. These KPIs are really important metrics, such as current membership, policy terminations, percent disenrolled, and so on. The idea is that once the model is chugging away and getting more data, it can produce these indicators on an ongoing basis, and analysts can use this information to actively understand and act on what is happening to their customer base. The platform enables analysts to visualize these KPIs, trend them, forecast them, and change the value of one of the KPIs in order to see the impact that might have on the overall business. Here is a screen shot of the system:

In this instance, these are actual rather than forecasted values of the KPIs (although they could represent a modeled goal). For example, the KPI in the lower right-hand corner of the screen is called Internal Agent Member Retention. This is actually a drill-down of information from Distribution Channel Performance. The KPI might represent the number of policies renewed on a particular reference date, year to date, and so on. If it were a modeled KPI, it might represent the target value for that particular KPI (i.e. in order to make a goal of selling 500,000 policies in a particular time period, an internal agent must sell, say, 450 of them). This goal might change based on seasonality, risk, time periods, and so on.
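
To make the idea of model-driven KPIs concrete, here is a hypothetical sketch of how scored policy records might be rolled up into the kinds of indicators described above. The records, churn scores, channel names, and target are invented for illustration; the Axel platform itself is not shown here.

    # Hypothetical sketch: roll scored policy records up into KPI-style metrics.
    policies = [
        {"policy_id": 1, "channel": "internal_agent", "renewed": True,  "churn_score": 0.10},
        {"policy_id": 2, "channel": "internal_agent", "renewed": False, "churn_score": 0.82},
        {"policy_id": 3, "channel": "broker",         "renewed": True,  "churn_score": 0.35},
        {"policy_id": 4, "channel": "internal_agent", "renewed": True,  "churn_score": 0.55},
    ]

    current_membership = sum(1 for p in policies if p["renewed"])
    terminations = sum(1 for p in policies if not p["renewed"])
    pct_disenrolled = 100.0 * terminations / len(policies)
    at_risk = sum(1 for p in policies if p["churn_score"] > 0.5)  # model-driven, forward-looking KPI

    # Drill-down KPI: internal agent member retention against an assumed target.
    internal = [p for p in policies if p["channel"] == "internal_agent"]
    internal_retention = sum(1 for p in internal if p["renewed"]) / len(internal)
    TARGET_RETENTION = 0.90  # assumed goal for the reference period

    print(f"Membership: {current_membership}, Terminations: {terminations}, "
          f"% disenrolled: {pct_disenrolled:.1f}, At risk: {at_risk}")
    print(f"Internal agent retention: {internal_retention:.0%} (target {TARGET_RETENTION:.0%})")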

Aha! provides tools for collaboration among analysts and a dashboard, so that this information can be shared with members across the organization or across companies. Aha! provides a series of predictive models, but also enables companies to pull in models from outside sources such as SAS or SPSS. The service is currently targeted at enterprise-class companies.

So what?

What does this mean? Simply this: the model, once created, is not static. Rather, its results are part of the business analyst’s day-to-day job. In this way, companies can develop a strategy (for example, around acquisition or retention), create a model to address it, and then continually monitor, analyze, and act on what is happening to their customer base.

When most analytics vendors talk about operationalizing predictive analytics, they generally mean putting a model into a process (say, for a call center) that can be used by call center agents to tell them what they should be offering customers. Call center agents can provide information back into the model, but I haven’t seen a solution where the model represents the business process in quite this way and continuously monitors the process. This can be a tremendous help in the acquisition and retention efforts of a company. I see these kinds of models and processes being very useful in industries that have a lot of small customers who aren’t that “sticky,” meaning they have the potential to churn. In this case, it is not enough to run a model once; it really needs to be part of the business process. In fact, the outcome analytics of the business user is the necessary feedback to calibrate and tune the predictive model (i.e. you might build a model, but it isn’t really the right model). As offers, promotions, and the like are provided to these customers, the results can be understood in a dynamic way, in a sense getting out ahead of your customer base.

Text Analytics Meets Publishing

I’ve been writing about text analytics for a number of years now. Many of my blogs have included survey findings and vendor offerings in the space. I’ve also provided a number of use cases for text analytics, many of which have revolved around voice of the customer, market intelligence, e-discovery, and fraud. While these are all extremely valuable, there are a number of other very beneficial use cases for the technology, and I believe it is important to put them out there, too.

Last week, I spoke with Daniel Mayer, a product marketing manager at TEMIS, about the publishing landscape and how text analytics can be used in both the editorial and the new product development sides of the publishing business. It’s an interesting and significant use of the technology.

First, a little background. I don’t believe it comes as a surprise to anyone that publishing as we used to know it has changed dramatically. Mainstream newspapers and magazines have given way to desktop publishing and the Internet as economics have changed the game. Chris Anderson wrote about this back in 2004 in Wired, in an article he called “The Long Tail” (it has since become a book). Some of the results include:

  • Increased Competition.  There are more entrants, more content and more choice on the Internet and much of it is free.
  • Mass market vs. narrow market. Additionally, whereas the successful newspapers and magazines of the past targeted a general audience, the Internet economically enables publications with narrower appeal.
  • Social, real time. Social network sites, like Twitter, are fast becoming an important source of real-time news.

All of this has caused mainstream publishers to rethink their strategies in order to survive.  In particular, publishers realize that content needs to be richer, interactive, timely, and relevant.

Consider the following example. A plane crashes over a large river, close to an airport. The editor in charge of the story wants to write about the crash itself, and also wants to include historical information about the causes of plane crashes (e.g. time of year, time of day, equipment malfunction, pilot error, etc., based on other plane crashes over the past 40 years) to enrich the story. Traditionally, publishers have annotated documents with key words and dates. Typically, this was a manual process, and not all documents were thoroughly tagged. Past annotations might not meet current expectations. Even if the documents were tagged, they might have been tagged only at a high level (e.g. “plane crash”), so the editor is overwhelmed with information. This means it might be very difficult for her to find similar stories, much less analyze what happened in other relevant crashes.

Using text analytics, all historical documents could be culled for relevant entities, concepts, and relationships to create a much more enriched annotation scheme.  Information about the plane crash such as location, type of planes involved, dates, times, and causes could be extracted from the text.  This information would be stored as enriched metadata about the articles and used when needed.  The Luxid Platform offered by TEMIS would also suggest topics close to the given topic.  What does this do? 

  • It improves the productivity of the editor. The editor has a complete set of information that he or she can easily navigate. Additionally, if text analytics can extract relationships such as cause, these can be analyzed and used to enrich a story.
  • It provides new opportunities for publishers. For example, Luxid would enable the publisher to provide the consumer with links to similar articles or set up alerts when new, similar content is created, as well as tools to better navigate or analyze data (which might be used by fee-based subscription services). It also enables publishers to create targeted microsites and topical pages, which might be of interest to consumers (a toy illustration of the underlying extraction step follows this list).
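
As a rough illustration of the enrichment step (and only that), the sketch below uses simple regular expressions and a keyword list to pull dates, locations, and causes out of a made-up article and store them as metadata. Production platforms such as TEMIS’s Luxid rely on far more sophisticated linguistic extraction than this.

    # A deliberately crude enrichment step over a made-up article.
    import re

    article = ("A small aircraft crashed into the Green River near Smithville Airport "
               "on 12 March 1998; investigators cited equipment malfunction as the cause.")

    KNOWN_CAUSES = ["bird strike", "pilot error", "equipment malfunction", "weather"]
    KNOWN_PLACES = ["Green River", "Smithville Airport"]

    metadata = {
        "dates": re.findall(r"\b\d{1,2} \w+ \d{4}\b", article),
        "locations": [p for p in KNOWN_PLACES if p in article],
        "causes": [c for c in KNOWN_CAUSES if c in article.lower()],
    }

    print(metadata)
    # {'dates': ['12 March 1998'], 'locations': ['Green River', 'Smithville Airport'],
    #  'causes': ['equipment malfunction']}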

Under many current schemes, advertisers pay online publishers. Enhanced navigation means more visits, more page views, and a more focused audience, which can lead to more advertising revenue for the publisher. Publishers, in some cases, are trying to go even further by transforming readers into sales leads and receiving a commission from sales. There are other models that publishers are exploring as well. Additionally, text analytics could enable publishers to re-package content on the fly (called content repurposing), which might lead to additional revenue opportunities such as selling content to brand sponsors that might resell it. The possibilities are numerous.

I am interested in other compelling use cases for the technology.

SAS and the Business Analytics Innovation Centre

Last Friday, SAS announced that it was partnering with Teradata and Elder Research Inc. (a data mining consultancy) to open a Business Analytics Innovation Centre.  According to the press release,

“Recognising the growing need and challenges businesses face driving operational analytics across enterprises, SAS and Teradata are planning to establish a centralised “think tank” where customers can discuss analytic best practices with domain and subject-matter experts, and quickly test or implement innovative models that uncover unique insights for optimising business operations.”

The center will include a lab for pilot programs, analytic workshops and proof of concept for customers.  I was excited about the announcement, because it further validated the fact that business analytics continues to gain steam in the market. I had a few questions, however, that I sent to SAS.  Here are the responses. 

Q. Is this a physical center or a virtual center?  If physical – where is it located and how will it be staffed?  If virtual, how will it be operationalized?

R. The Business Analytics Innovation Center will be based at SAS headquarters in Cary, North Carolina.  We will offer customer meetings, workshops and projects out of the Center. 

Q. Will there be consulting services around actually deploying analytics into organizations?  In other words, is it business action oriented or more research oriented?

R.  The Business Analytics Innovation Center will offer consulting services around how best to deploy analytics into organizations, as well as conduct research-based activities to help businesses improve operational efficiency. 

Q.  Should we expect to hear more announcements from SAS around business analytics, similar to what has been happening with IBM?

R.  As the leader in business analytics software and services, SAS continues to make advances in its business analytics offerings. You can expect to hear more from SAS in this area in 2010.

I’m looking forward to 2010!
