Top of Mind – Data in the Cloud

I attended Cloud Camp Boston yesterday. It was a great meeting with some good discussions, and several hundred people attended. What struck me about the general session (when all attendees were present) was how much interest there was around data in the cloud. For example, during the “unpanel” (where people become panelists in real time), 50% of the questions up for grabs (5 of 10) dealt with data in the cloud. That’s pretty significant. The questions included:

  • How do I integrate large amounts of enterprise data in the cloud? (answers ranged from more traditional approaches to newer vendor technologies)
  • How do I move my enterprise data into the cloud? (answers included shipping it to the provider on a hard drive via FedEx, making sure there is a proven chain of custody around the transfer; a rough back-of-the-envelope comparison follows this list)
  • How do I ensure the security of my data in the cloud? (no answer – that deserved its own breakout session)
  • What is the maximum sustained data transfer rate in the cloud? (answers included “no one knows until it takes a server down,” though someone mentioned that a year ago a sustained 8 gigabytes per second took down a cloud provider)
  • How do applications (and data) interoperate in the cloud? (answers included that standards need to rule)
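
On that second question, the economics are easy to see with a rough back-of-the-envelope calculation. The numbers below are my own illustrative assumptions (10 TB of data and a 100 Mbit/s sustained upload link), not figures from the session:

```python
# Back-of-the-envelope: moving 10 TB to a cloud provider over the network
# versus shipping it on a hard drive. All figures are illustrative assumptions.

data_bytes = 10 * 10**12                    # assumed 10 TB dataset

link_mbit_per_s = 100                       # assumed sustained upload bandwidth
link_bytes_per_s = link_mbit_per_s * 10**6 / 8

network_days = data_bytes / link_bytes_per_s / 86_400
shipped_days = 1.5                          # assumed: overnight courier plus time to copy and verify

print(f"Network transfer: ~{network_days:.1f} days")   # ~9.3 days at these rates
print(f"Shipped drive:    ~{shipped_days:.1f} days")
```

At anything like those numbers, a courier with a proven chain of custody starts to look very attractive.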

There were some interesting breakout sessions as well: one on the aforementioned security (and audit), an intro to cloud computing (moderated by Judith Hurwitz), one about channel strategies, and a number of others. I attended a breakout session about Analytics and BI in the cloud and again, for obvious reasons, much of the discussion was data centric. Some of the discussion items included:

  • What public data sets are available in the cloud? 
  • What is the data infrastructure needed to support various kinds of data analysis? 
  • What SaaS vendors offer business analytics in the cloud? 
  • How do I determine what apps/data make sense to move to the cloud?

 The upshot?  Data in the cloud – moving it, securing it, accessing it, manipulating it, and analyzing it – is going to be a hot topic in 2010.

Operationalizing Predictive Analytics

There has been a lot of excitement in the market recently around business analytics in general and specifically around predictive analytics. The promise of moving away from the typical rear view mirror approach to a predictive, anticipatory approach is a very compelling value proposition. 

But just how can this be done? Predictive models are complex. So how can companies use them to their best advantage? A number of ideas have emerged to make this happen, including 1) making the models easier to build in the first place and 2) operationalizing models that have already been built so that users across the organization can utilize their output in various ways. I have written several blog posts on the topic.

Given the market momentum around predictive analytics, I was interested to speak to members of the Aha! team about their spin on this subject, which they term “Business Embedded Analytics.” For those of you not familiar with Aha!, the company was formed in 2006 to provide a services platform (a SaaS platform called Axel) to embed analytics within a business. The company currently has customers in healthcare, telecommunications, and travel and transportation. The idea behind the platform is to let business analysts use advanced business analytics in their day-to-day jobs by implementing a range of deterministic and stochastic predictive models and then tracking, trending, forecasting, and monitoring business outcomes based on the models’ output.
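
As a rough illustration of the “tracking, trending, forecasting” part of that description, here is a minimal sketch of trending and forecasting a single KPI. This is a generic sketch under my own assumptions (the KPI series, the smoothing factor, and the forecast horizon are made up), not the Axel platform or its API:

```python
# Minimal sketch: trend and forecast a business KPI (e.g., a monthly policy
# retention rate) produced downstream of a predictive model.
# The values and smoothing factor below are illustrative assumptions.

retention_rate = [0.91, 0.90, 0.92, 0.89, 0.88, 0.87, 0.88, 0.86]  # monthly KPI history

def exp_smooth_forecast(series, alpha=0.4, horizon=3):
    """Simple exponential smoothing; returns a flat forecast for `horizon` periods."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return [round(level, 3)] * horizon

print("Smoothed level / 3-month outlook:", exp_smooth_forecast(retention_rate))
```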

An example

Here’s an example. Say you work at an insurance company and you are concerned about customers not renewing their policies. Your company might have a lot of data about both past and present customers, including demographic data, the type of policy they have, how long they’ve had it, and so on. This kind of data can be used to create a predictive model of customers who are likely to drop their policy, based on the characteristics of customers who have already done so. The Aha! platform allows a company to collect the data necessary to run the model, implement the model, get the results from the model, and continue to update and track it as more data becomes available. This, by itself, is not a new idea.

What is interesting about the Axel Services Platform is that the output from the model is displayed as a series of dynamic Key Performance Indicator (KPI) models that the business analyst has created. These KPIs are really important metrics, such as current membership, policy terminations, % disenrolled, and so on. The idea is that once the model is chugging away and getting more data, it can produce these indicators on an ongoing basis, and analysts can use this information to actively understand and act on what is happening to their customer base. The platform enables analysts to visualize these KPIs, trend them, forecast on them, and change the value of one KPI in order to see the impact that might have on the overall business. Here is a screen shot of the system:

In this instance, these are actual rather than forecasted values of the KPIs (although they could represent a modeled goal). For example, the KPI in the lower right-hand corner of the screen is called Internal Agent Member Retention. This is actually a drill-down of information from the Distribution Channel Performance KPI. The KPI might represent the number of policies renewed on a particular reference date, year to date, and so on. If it were a modeled KPI, it might represent the target value for that particular KPI (for example, in order to meet a goal of selling 500,000 policies in a particular time period, an internal agent must sell, say, 450 of them). This goal might change based on seasonality, risk, time periods, and so on.
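
To make the renewal example above more concrete, here is a minimal sketch of that kind of churn model and the simple retention KPI it could feed. This is my own illustration using scikit-learn, not Aha!’s implementation; the column names, data, and features are all hypothetical:

```python
# Minimal churn-model sketch for the insurance renewal example.
# Data, feature names, and the train/test split are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Historical policyholders: demographics, policy type, tenure, and whether they lapsed
history = pd.DataFrame({
    "age":            [34, 51, 29, 62, 45, 38, 57, 41],
    "tenure_years":   [1,  8,  2, 12,  5,  3, 10,  4],
    "is_auto_policy": [1,  0,  1,  0,  1,  1,  0,  1],
    "lapsed":         [1,  0,  1,  0,  0,  1,  0,  1],   # 1 = did not renew
})

X, y = history.drop(columns="lapsed"), history["lapsed"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Score "current" customers and roll the output up into a simple retention KPI
current = X_test.copy()
current["churn_probability"] = model.predict_proba(X_test)[:, 1]
projected_retention = 1 - current["churn_probability"].mean()

print(current)
print(f"Projected retention KPI: {projected_retention:.1%}")
```

In a platform like the one described above, scores like these would be refreshed as new data arrives and rolled up into the dashboards the analyst watches, rather than produced as a one-off report.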

Aha! provides tools for collaboration among analysts and a dashboard, so that this information can be shared across the organization or across companies. Aha! provides a series of predictive models, but also enables companies to pull in models from outside sources such as SAS or SPSS. The service is currently targeted at enterprise-class companies.

So what?

What does this mean? Simply this: the model, once created, is not static. Rather, its results are part of the business analyst’s day-to-day job. In this way, a company can develop a strategy (for example, around acquisition or retention), create a model to address it, and then continually monitor, analyze, and act on what is happening to its customer base.

When most analytics vendors talk about operationalizing predictive analytics, they generally mean putting a model into a process (say, for a call center) that can be used by call center agents to tell them what they should be offering customers. Call center agents can provide information back into the model, but I haven’t seen a solution where the model represents the business process in quite this way and continuously monitors the process. This can be a tremendous help in a company’s acquisition and retention efforts. I see these kinds of models and processes being very useful in industries that have a lot of small customers who aren’t that “sticky,” meaning they have the potential to churn. In this case, it is not enough to run a model once; it really needs to be part of the business process. In fact, the outcome analytics of the business user is the necessary feedback to calibrate and tune the predictive model (you might build a model, but it isn’t really the right model). As offers, promotions, and the like are provided to these customers, the results can be understood in a dynamic way, in a sense allowing you to get out ahead of your customer base.
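
Here is a sketch of what that feedback loop might look like, assuming a scikit-learn-style model. The function shape, the AUC threshold, and the retraining policy are my own illustrative assumptions, not any vendor’s API:

```python
# Sketch: keeping a predictive model "in the loop" of a business process.
# Each cycle scores the current customer base, the business acts on the scores,
# observed outcomes come back, and the model is recalibrated when it drifts.
# The 0.7 threshold and function shape are illustrative assumptions.
from sklearn.base import clone
from sklearn.metrics import roc_auc_score

def retention_cycle(model, features, observed_outcomes, auc_floor=0.7):
    """One monitoring cycle over the current customer base.

    features          : customer attributes the model was trained on
    observed_outcomes : 1 if the customer actually lapsed this period, else 0
    """
    churn_scores = model.predict_proba(features)[:, 1]     # churn risk per customer

    # (offers, outreach, or call-center prompts would be driven by churn_scores here)

    auc = roc_auc_score(observed_outcomes, churn_scores)   # did the risk ranking hold up?
    if auc < auc_floor:                                     # model has drifted: retrain on fresh outcomes
        model = clone(model).fit(features, observed_outcomes)
    return model, auc
```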
