What is advanced analytics?

There has been a lot of discussion recently around advanced analytics. I’d like to throw my definition into the ring. I spent many years at Bell Laboratories in the late 1980s and 1990s deploying what I would call advanced analytics. This included utilizing statistical and mathematical models to understand customer behavior, predict retention, or analyze trouble tickets. It also included new approaches for segmenting the customer base and thinking about how to analyze call streams in real time. We also tried to utilize unstructured data from call center logs to help improve the predictive power of our retention models, but the algorithms and the compute power didn’t exist at the time to do this.

Based on my own experiences as well as what I see happening in the market today as an analyst, I view advanced analytics as an umbrella term that includes a class of techniques and practices that go well beyond “slicing and dicing and shaking and baking” data for reports. I would define advanced analytics as:

“Advanced analytics provides algorithms for complex analysis of either structured or unstructured data. It includes sophisticated statistical models, machine learning, neural networks, text analytics, and other advanced data mining techniques. Among its many use cases, it can be deployed to find patterns in data, make predictions, optimize, forecast, and perform complex event processing/analysis. Examples include predicting churn, identifying fraud, market basket analysis, and understanding website behavior. Advanced analytics does not include database query and reporting or OLAP cubes.”

Of course, the examples in this definition are marketing-centric, and advanced analytics obviously extends into multiple arenas. Hurwitz & Associates is going to do a deep dive into this area in the coming year. We are currently fielding a study about advanced analytics and we’ll be producing additional reports. For those of you who are interested in completing my survey, here is the link:

Advanced Analytics and the skills needed to make it happen: Takeaways from IBM IOD

Advanced Analytics was a big topic at the IBM IOD conference last week. As part of this, predictive analytics was again an important piece of the story along with other advanced analytics capabilities IBM has developed or is in the process of developing to support optimization. These include Big Insights (for big data), analyzing data streams, content/text analytics, and of course, the latest release of Cognos.

One especially interesting topic that was discussed at the conference was the skills required to make advanced analytics a reality. I have been writing and thinking a lot about this subject, so I was very happy to hear IBM address it head on during the second day keynote. This keynote included a customer panel and another speaker, Dr. Atul Gawande, and both offered some excellent insights. The panel included Scott Friesen (Best Buy), Scott Futren (Gwinnett County Public Schools), Srinivas Koushik (Nationwide), and Greg Christopher (Nestle). Here are some of the interrelated nuggets from the discussions:

• Ability to deliver vs. the ability to absorb. One panelist made the point that a lot of new insights are being delivered to organizations. In the future, it may become difficult for people to absorb all of this information (and this will require new skills too).
• Analysis and interpretation. People will need to know how to analyze and how to interpret the results of an analysis. As Dr. Gawande pointed out, “Having knowledge is not the same as using knowledge effectively.”
• The right information. One of the panelists mentioned that putting analytics tools in the hands of line people might be too much for them, and instead the company is focusing on giving these employees the right information.
• Leaders need to have capabilities too. If executives are accustomed to using spreadsheets and relying on their gut instincts, then they will also need to learn how to make use of analytics.
• Cultural changes. From call center agents using the results of predictive models to workers on the line seeing reports to business analysts using more sophisticated models, change is coming. This change means people will be changing the way that they work. How this change is handled will require special thought by organizations.

IBM executives also made a point of discussing the critical skills required for analytics. These included strategy development, developing user interfaces, enterprise integration, modeling, and dealing with structured and unstructured data. IBM has, of course, made a huge investment in these skills. GBS executives emphasized the 8,500 employees in its Global Business Services Business Analytics and Optimization group. Executives also pointed to the fact that the company has thousands of partners in this space and that 1 in 3 IBMers will attend analytics training. So, IBM is prepared to help companies in their journey into business analytics.

Are companies there yet? I think that it is going to take organizations time to develop some of these skills (and some they should probably outsource). Sure, analytics has been around a long time. And sure, vendors are making their products easier to use and that is going to help end users become more effective. Even if we’re just talking about a lot of business people making use of analytic software (as opposed to operationalizing it in a business process), the reality is that analytics requires a certain mindset. Additionally, unless someone understands the context of the information he or she is dealing with, it doesn’t matter how user friendly the platform is – they can still get it wrong. People using analytics will need to think critically about data, understand their data, and understand context. They will also need to know what questions to ask.

I whole-heartedly believe it is worth the investment of time and energy to make analytics happen.

Please note:

As luck would have it, I am currently fielding a study on advanced analytics! I am interested in understanding what your company’s plans are for advanced analytics. If you’re not planning to use advanced analytics, I’d like to know why. If you’re already using advanced analytics, I’d like to understand your experience.

If you participate in this survey I would be happy to send you a report of our findings. Simply provide your email address at the end of the survey! Here’s the link:

Click here to take survey

Who is using advanced analytics?

Advanced analytics is currently a hot topic among businesses, but who is actually using it and why? What are the challenges and benefits to those companies that are using advanced analytics? And, what is keeping some companies from exploring this technology?

Hurwitz & Associates would like your help in answering a short (5 min) survey on advanced analytics. We are interested in understanding what your company’s plans are for advanced analytics. If you’re not planning to use advanced analytics, we’d like to know why. If you’re already using advanced analytics we’d like to understand your experience.

If you participate in this survey we would be happy to send you a report of our findings. Simply provide us your email address at the end of the survey! Thanks!

Here is the link to the survey:
Click here to take survey

Analyzing Big Data

The term “Big Data” has gained popularity over the past 12-24 months as a) the amounts of data available to companies continually increase and b) technologies have emerged to more effectively manage this data. Of course, large volumes of data have been around for a long time. For example, I worked in the telecommunications industry for many years analyzing customer behavior. This required analyzing call records. The problem was that the technology (particularly the infrastructure) couldn’t necessarily support this kind of compute intensive analysis, so we often analyzed billing records rather than streams of call detail records, or sampled the records instead.
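Sampling a stream of records whose total size you don’t know up front is a well-understood problem. A minimal sketch of uniform reservoir sampling, using a simulated stream of hypothetical call records (the record fields here are invented for illustration):

```python
import random

def reservoir_sample(stream, k, seed=None):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    rng = random.Random(seed)
    sample = []
    for i, record in enumerate(stream):
        if i < k:
            sample.append(record)
        else:
            # Replace an existing element with probability k / (i + 1)
            j = rng.randint(0, i)
            if j < k:
                sample[j] = record
    return sample

# Example: sample 5 records from a simulated stream of 10,000 call records
records = ({"caller": f"555-{n:04d}", "duration_s": n % 600} for n in range(10_000))
subset = reservoir_sample(records, k=5, seed=42)
print(len(subset))  # 5
```

Each record in the stream ends up in the sample with equal probability, which is why sampling was a workable stand-in when full-stream analysis wasn’t feasible.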

Now companies are looking to analyze everything from the genome to Radio Frequency ID (RFID) tags to business event streams. And, newer technologies have emerged to handle massive (TB and PB) quantities of data more effectively. Often this processing takes place on clusters of computers meaning that processing is occurring across machines. The advent of cloud computing and the elastic nature of the cloud has furthered this movement.

A number of frameworks have also emerged to deal with large-scale data processing and support large-scale distributed computing. These include MapReduce and Hadoop:

-MapReduce is a software framework introduced by Google to support distributed computing on large sets of data, and it is designed to take advantage of cloud resources. Processing is spread across large clusters of computers; each machine in a cluster is referred to as a node. MapReduce can deal with both structured and unstructured data. Users specify a map function that processes a key/value pair to generate a set of intermediate key/value pairs, and a reduce function that merges all intermediate values associated with the same intermediate key.
-Apache Hadoop is an open source distributed computing platform that is written in Java and inspired by MapReduce. Data is stored across many machines in blocks that are replicated to other servers. Hadoop uses a hash function to partition intermediate key/value pairs among reducers, and map output can be written to a table, to memory, or to a temporary file to be analyzed.
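The map/reduce split described above can be illustrated with the classic word-count example. This is a local, single-process sketch of the programming model, not a real distributed job:

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit an intermediate (key, value) pair for each word
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reduce: merge all values that share the same intermediate key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

docs = ["big data big insights", "big data streams"]
intermediate = [pair for doc in docs for pair in map_phase(doc)]
print(reduce_phase(intermediate))  # {'big': 3, 'data': 2, 'insights': 1, 'streams': 1}
```

In a real cluster, the map calls run on many nodes in parallel and the framework shuffles each intermediate key to the reducer responsible for it; the programmer only writes the two functions.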

But what about tools to actually analyze this massive amount of data?

Datameer

I recently had a very interesting conversation with the folks at Datameer. Datameer formed in 2009 to provide business users with a way to analyze massive amounts of data. The idea is straightforward: provide a platform to collect and read different kinds of large data stores, put the data into a Hadoop framework, and then provide tools for analysis of this data. In other words, hide the complexity of Hadoop and provide analysis tools on top of it. The folks at Datameer believe their solution is particularly useful for data greater than 10 TB, where a company may have hit a cost wall using traditional technologies but where a business user might want to analyze some kind of behavior. So website activity, CRM systems, phone records, and POS data might all be candidates for analysis. Datameer provides 164 functions (e.g., group, average, median) for business users, with APIs to target more specific requirements.

For example, suppose you’re in marketing at a wireless service provider and you offered a “free minutes” promotion. You want to analyze the call detail records of those customers who made use of the program to get a feel for how customers would use cell service if given unlimited minutes. The chart below shows the call detail records from one particular day of the promotion – July 11th. The chart shows the call number (MDN) as well as the time the call started and stopped and the duration of the call in milliseconds. Note that the data appear under the “analytics” tab. The “Data” tab provides tools to read different data sources into Hadoop.

This is just a snapshot – there may be TB of data from that day. So, what about analyzing this data? The chart below illustrates a simple analysis of the longest calls and the phone numbers those calls came from. It also illustrates basic statistics about all of the calls on that day – the average, median, and maximum call duration.
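To make the analysis concrete outside of Datameer’s UI, here is what the same summary (the longest calls plus average, median, and maximum duration) looks like in plain Python; the field names and values are hypothetical stand-ins for the call detail records described above:

```python
import statistics

# Hypothetical call detail records: phone number (MDN) and call duration in ms
cdrs = [
    {"mdn": "555-0101", "duration_ms": 125_000},
    {"mdn": "555-0102", "duration_ms": 2_410_000},
    {"mdn": "555-0103", "duration_ms": 48_000},
    {"mdn": "555-0104", "duration_ms": 900_000},
]

# The longest calls and the numbers they came from
longest = sorted(cdrs, key=lambda r: r["duration_ms"], reverse=True)[:2]
print([(r["mdn"], r["duration_ms"]) for r in longest])

# Basic statistics over all calls that day
durations = [r["duration_ms"] for r in cdrs]
print(statistics.mean(durations), statistics.median(durations), max(durations))
```

The point of a tool like Datameer is that these operations run as Hadoop jobs over terabytes of records rather than an in-memory list, but the analytic vocabulary (sort, group, average, median, max) is the same.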

From this brief example, you can start to visualize the kind of analysis that is possible with Datameer.

Note too that since Datameer runs on top of Hadoop, it can deal with unstructured as well as structured data. The company has some solutions in the unstructured realm (such as basic analysis of Twitter feeds), and is working to provide more sophisticated tools. Datameer offers its software either on a SaaS license or on premises.

In the Cloud?

Not surprisingly, early adopters of the technology are using it in a private cloud model. This makes sense, since companies often want to keep control of their own data. Some of these companies already have Hadoop clusters in place and are looking for analytics capabilities for business use. Others are dealing with big data but have not yet adopted Hadoop. They are looking at a complete “big data BI” type solution.

So, will there come a day when business users can analyze massive amounts of data without having to drag IT entirely into the picture? Utilizing BI adoption as a model, the folks from Datameer hope so. I’m interested in any thoughts readers might have on this topic!

Social Network Analysis: What is it and why should we care?

When most people think of social networks they think of Facebook and Twitter, but social network analysis has its roots in psychology, sociology, anthropology, and math (see John Scott’s Social Network Analysis for more details). The phrase has a number of different definitions, depending on the discipline you’re interested in, but for the purposes of this discussion social network analysis can be used to understand the patterns of how individuals interact. For other definitions, look here.

I had a very interesting conversation with the folks from SAS last week about social network analysis. SAS has a sophisticated social network analysis solution that draws upon its analytics arsenal to solve some very important problems. These include discovering banking or insurance fraud rings, identifying tax evasion, social services fraud, and health care fraud (to name a few). These are huge issues. For example, the 2009 ABA Deposit Account Fraud Survey found that eight out of ten banks reported having check fraud losses in 2008. A new report by the National Insurance Crime Bureau (NICB) shows an increase in claims related to “opportunistic fraud,” possibly due to the economic downturn. These include workers’ compensation claims and staged or caused accidents.

Whereas some companies (and there are a number of them in this space) use mostly rules (e.g. If the transaction is out of the country, flag it) to identify potential fraud, SAS utilizes a hybrid approach that can also include:

  • Anomalies: e.g., the number of unsecured loans exceeds the norm
  • Patterns: using predictive models to understand account opening and closing patterns
  • Social link analysis: e.g., identifying transactions to suspicious counterparties
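As one concrete illustration of the anomaly item, an account might be flagged when its count of unsecured loans sits well above the population norm. This is a generic z-score rule of my own, not SAS’s method, and the account data and threshold are hypothetical:

```python
import statistics

def flag_anomalies(loan_counts, z_threshold=1.5):
    """Flag accounts whose unsecured-loan count is far above the population norm."""
    values = list(loan_counts.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [acct for acct, n in loan_counts.items()
            if stdev > 0 and (n - mean) / stdev > z_threshold]

# Hypothetical accounts and their unsecured-loan counts
counts = {"A-100": 1, "A-101": 2, "A-102": 1, "A-103": 2, "A-104": 12}
print(flag_anomalies(counts))  # ['A-104']
```

In practice a hybrid system layers rules, anomaly scores, predictive models, and link analysis, so a single flag like this is a lead for investigators rather than a verdict.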

Consider the following fraud ring:

  • Robert Madden shares a phone number with Eric Sully and their accounts have been shut down
  • Robert Madden also shares an address with Chris Clark
  • Chris Clark shares a phone with Sue Clark and she still has open accounts
  • Sue Clark and Eric Sully also share an address with Joseph Sullins who has open accounts and who is soft matched to Joe Sullins who has many open accounts and has been doing a lot of cash cycling between them.
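Links like these are exactly what a graph traversal can surface: connect any two account holders who share an identifying attribute, then pull out the connected component. A stdlib-only sketch using the names from the example above (the specific phone numbers and addresses are invented placeholders):

```python
from collections import defaultdict, deque

# Each person lists identifying attributes; shared attributes create links
people = {
    "Robert Madden":    {"phone:555-1", "addr:12 Oak St"},
    "Eric Sully":       {"phone:555-1", "addr:9 Elm St"},
    "Chris Clark":      {"addr:12 Oak St", "phone:555-2"},
    "Sue Clark":        {"phone:555-2", "addr:9 Elm St"},
    "Joseph Sullins":   {"addr:9 Elm St"},
    "Unrelated Person": {"phone:555-9"},
}

# Index attributes, then link everyone who shares one
attr_index = defaultdict(set)
for name, attrs in people.items():
    for attr in attrs:
        attr_index[attr].add(name)

adjacency = defaultdict(set)
for members in attr_index.values():
    for member in members:
        adjacency[member] |= members - {member}

def ring_containing(start):
    """Breadth-first search for everyone reachable through shared attributes."""
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in adjacency[queue.popleft()]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(sorted(ring_containing("Robert Madden")))
```

Starting from any one suspect account, the traversal recovers the whole ring while leaving unconnected customers out, which is the basic mechanic behind the ring the SAS software found.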

This is depicted in the ring of fraud that the SAS software found, which is shown above.   The dark accounts indicate accounts that have been closed.  Joe Sullins represents a new burst of accounts that should be investigated.

The SAS solution accepts input from many sources (including text, where it can use text mining to extract information from, say a claim).  The strength of the solution is in its ability to take data from many sources and in the depth of its analytical capability.

Why is this important?

Many companies set up investigation units to investigate potential fraud. However, oftentimes there are large numbers of false positives (i.e., investigations that show up as potential fraud but aren’t), which cost the company a lot of money to investigate. Just think about how many times you’ve been called by your credit card company when you’ve made a big purchase or traveled out of the country and forgot to call them, and you understand the dollars wasted on false positives. This cost, of course, pales in comparison to the billions of dollars lost each year to fraud. Social network analysis, especially using more sophisticated analytics, can be used to find previously undetected fraud rings.

Of course, social network analysis has use cases beyond fraud detection. SAS uses social network analysis as part of its Fraud Framework, but it is expanding its vision to include customer churn and viral marketing (i.e., to understand how customers are related to each other). Other use cases include terrorism and crime prevention, company organizational analysis, as well as various kinds of marketing applications such as finding key opinion leaders.

Social network analysis for marketing is an area I expect to see more action in the near term, although people will need to be educated about social networks, the difference between social network analysis and social media analysis (as well as where they overlap) and the value of the use cases.  There seems to be some confusion in the market, but that is the subject of another blog.

My Take on the SAS Analyst Conference

I just got back from the SAS analyst event that was held in Steamboat Springs, Colorado.   It was a great meeting.  Here are some of the themes I heard over the few days I was there:

SAS is a unique place to work.

Consider the following: SAS revenue per employee is somewhat lower than the software industry average because everyone is on the payroll. That’s right. Everyone from the groundskeepers to the health clinic professionals to those involved in advertising is on the SAS payroll. The company treats its employees very well, providing fitness facilities and on-site day care (also on the payroll). You don’t even have to buy your own coffee or soda! The company has found that these kinds of perks have a positive impact. SAS announced no layoffs in 2009, and this further increased morale and productivity. The company actually saw increased profits in 2009. Executives from SAS also made the point that even though they might have their own advertising, etc., they do not want to be insular. The company knows it needs new blood and new ideas. On that note, check out the next two themes:

Innovation is very important to SAS.

Here are some examples:

  • Dr. Goodnight gave his presentation using the latest version of the SAS BI dashboard, which looked pretty slick.
  • SAS has recently introduced some very innovative products and the trend will continue. One example is its social network analysis product that has been doing very well in the market.  The product analyzes social networks and can, for example, uncover groups of people working together to commit fraud.  This product was able to find $32M in welfare fraud in several weeks.
  • SAS continues to enhance its UI, which it has been beaten up about in the past. We also got pre-briefed on some new product announcements that I can’t talk about yet, but other analysts did tweet about them at the conference. There were a lot of tweets at this conference and they were analyzed in real time.

The partnership with Accenture is a meaningful one.

SAS execs stated that although they may not have that many partnerships, they try to make the ones they have very real. While, on the surface, the recent announcement regarding the Accenture SAS Analytics Group might seem like a “me too” after IBM BAO, it is actually different. Accenture’s goal is to transform the front office, the way ERP/CRM transformed the back office. It wants to “take the what and turn it into so what and now what.” It views analytics not simply as a technology, but as a new competitive management science that enables agility. It obviously won’t market it that way, as the company takes a business focus. Look for the Accenture SAS Analytics Group to put out services such as churn management as a service and risk and fraud detection as a service. They will operationalize this as part of a business process.

The Cloud!

SAS has a number of SaaS offerings in the market and will, no doubt, introduce more.  What I found refreshing was that SAS takes issues around SaaS very seriously.  You’d expect a data company to be concerned about their customers’ data and they are. 

Best line of the conference

SAS is putting a lot of effort into making its products easier to use and that is a good thing.  There are ways to get analysis to those people who aren’t that analytical.  In a discussion about the skill level required for people to use advanced analytics, however, one customer commented, “Just because you can turn on a stove doesn’t mean you know how to cook.”  More on this in another post.

Five Predictions for Advanced Analytics in 2010

With 2010 now upon us, I wanted to take the opportunity to talk about five advanced analytics technology trends that will take flight this year.  Some of these are up in the clouds, some down to earth.

  • Text Analytics:  Analyzing unstructured text will continue to be a hot area for companies. Vendors in this space have weathered the economic crisis well and the technology is positioned to do even better once a recovery begins.  Social media analysis really took off in 2009 and a number of text analytics vendors, such as Attensity and Clarabridge, have already partnered with online providers to offer this service. Those that haven’t will do so this year.  Additionally, numerous “listening post” services, dealing with brand image and voice of the customer have also sprung up. However, while voice of the customer has been a hot area and will continue to be, I think other application areas such as competitive intelligence will also gain momentum.  There is a lot of data out on the Internet that can be used to gain insight about markets, trends, and competitors.
  • Predictive Analytics Model Building:  In 2009, there was a lot of buzz about predictive analytics.  For example, IBM bought SPSS and other vendors, such as SAS and Megaputer, also beefed up offerings.  A newish development that will continue to gain steam is predictive analytics in the cloud.  For example, vendors Aha! software and Clario are providing predictive capabilities to users in a cloud-based model.  While different in approach they both speak to the trend that predictive analytics will be hot in 2010.
  • Operationalizing Predictive Analytics:  While not every company can or may want to build a predictive model, there are certainly a lot of uses for operationalizing predictive models as part of a business process.  Forward looking companies are already using this as part of the call center process, in fraud analysis, and churn analysis, to name a few use cases.  The momentum will continue to build making advanced analytics more pervasive.
  • Advanced Analytics in the Cloud: Speaking of putting predictive models in the cloud, business analytics in general will continue to move to the cloud for mid-market companies and others that deem it valuable. Companies such as QlikTech introduced a cloud-based service in 2009. There are also a number of pure play SaaS vendors out there, like GoodData and others, that provide cloud-based services in this space. Expect to hear more about this in 2010.
  • Analyzing Complex Data Streams: A number of forward-looking companies with large amounts of real-time data (such as RFID or financial data) are already investing in analyzing these data streams. Some are using the on-demand capacity of a cloud-based model to do this. Expect this trend to continue in 2010.