Achieving Analytics Maturity: 3 Tips from the Experts

What does it take to achieve analytics maturity? Earlier this week, Dave Stodder and I hosted a webcast with a panel of vendor experts from Cloudera, MicroStrategy, and Tableau. All three companies sponsor the Analytics Maturity Model, an assessment tool that measures where your organization stands in analytics maturity relative to its peers.

Many good points were made during the discussion. A few particularly caught my attention because they focused on the organizational aspects of analytics maturity, which are often the most daunting.

Crawl, Walk, Run: TJ Laher, from Cloudera, pointed out that their customers often crawl, walk, and then run to analytics. I’ve said before that there is no silver bullet for analytics. TJ stressed the need for organizations to have a clear vision of strategic objectives and to start with some early projects that might take place over a six-month time frame. He spoke about going deep with the use cases you already have and then becoming more advanced, such as by bringing in new data types. Cloudera has observed that success in these early projects often helps to facilitate the walking and then, ultimately, the running (i.e., becoming more sophisticated) with analytics.

Short-term victories have long-term implications: Vijay Anand from MicroStrategy also touched upon the idea of early wins and pointed out that these can have long-term implications. It is important to think about these early victories in terms of what is down the road. For instance, say the business implements a quick BI solution. That’s great. However, business and IT need to work together to build a certified environment to avoid conflicting and non-standardized information. It is important to think it through.

IT builds the car and the business drives it: Ian Coe, from Tableau, also talked about IT and the business working together. He said that organizations achieve success and become mature when teams collaborate on a number of prototypes using an Agile approach. The over-the-wall, waterfall approach IT used in the past won’t cut it, because moving forward with analytics involves people and rapidly changing questions. Tableau believes the ideal model for empowering users is a self-service BI approach: business people are responsible for doing analysis, while IT is responsible for managing and securing data. This elevates IT from the role of dashboard factory to architect and steward of the company’s data assets. IT can work in quick cycles to give the business what it needs and check in regularly.

Of course, each expert came to the discussion table with their own point of view.  And, these are just some of the insights that the panel provided.  The webcast is available on demand.   I encourage you to listen to it and, of course, take the assessment!


Next-Generation Analytics: Four Findings from TDWI’s Latest Best Practices Report

I recently completed TDWI’s latest Best Practices Report: Next-Generation Analytics and Platforms for Business Success. Although the phrase “next-generation analytics and platforms” can evoke images of machine learning, big data, Hadoop, and the Internet of Things (IoT), most organizations are somewhere between that technology vision and today’s reality of BI and dashboards. For some organizations, next generation can simply mean pushing past reports and dashboards to more advanced forms, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis. The market is on the cusp of moving forward.

What are some of the newer next-generation steps that companies are taking to move ahead?

  • Moving to predictive analytics. Predictive analytics is a statistical or data mining technique that can be used on both structured and unstructured data to determine outcomes such as whether a customer will “leave or stay” or “buy or not buy.” Predictive analytics models provide probabilities of certain outcomes. Popular use cases include churn analysis, fraud analysis, and predictive maintenance. Predictive analytics is gaining momentum, and the market is primed for growth if users stick to their plans and can be successful with the technology. In our survey, 39% of respondents stated they are using predictive analytics today, and an additional 46% are planning to use it in the next few years. Organizations often move in fits and starts when it comes to more advanced analytics, but predictive analytics, along with other techniques such as geospatial analytics, text analytics, social media analytics, and stream mining, is gaining interest in the market.
  • Adding disparate data to the mix. Currently, 94% of respondents stated they are using structured data for analytics, and 68% are enriching this structured data with demographic data for analysis. However, companies are also getting interested in other kinds of data. Sources such as internal text data (27% today), external Web data (29% today), and external social media data (19% today) are set to double or even triple in use for analysis over the next three years. Likewise, while IoT data is used by fewer than 20% of respondents today, another 34% expect to use it in the next three years. Real-time streaming data, which goes hand in hand with IoT data, is also set to grow in use (18% today).
  • Operationalizing and embedding analytics. Operationalizing refers to making analytics part of a business process; i.e., deploying analytics into production. In this way, the output of analytics can be acted upon. Operationalizing occurs in different ways. It may be as simple as manually routing all claims that seem to have a high probability of fraud to a special investigation unit, or it might be as complex as embedding analytics in a system that automatically takes action based on the results. The market is still relatively new to this concept. Twenty-five percent of respondents have not operationalized their analytics, and another 15% stated they operationalize using manual approaches. Fewer than 10% embed analytics in system processes to operationalize it.
  • Investing in skills. Respondents cited the lack of skilled personnel as a top challenge for next-generation analytics. To overcome this challenge, some respondents talked about hiring fewer but more skilled personnel such as data analysts and data scientists. Others talked about training from within because current employees understand the business. Our survey revealed that many organizations are doing both. Additionally, some organizations are building competency centers where they can train from within. Where funding is limited, organizations are engaging in self-study.
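The first and third bullets above, predictive scoring and operationalizing its output, can be sketched together. Here is a minimal illustration in Python, assuming a toy logistic model with made-up weights, feature names, and customers (not any particular vendor's implementation): a model turns features into a churn probability, and operationalizing means acting on that probability, here by routing high-risk cases to a review queue.

```python
import math

def churn_probability(weights, features, bias=0.0):
    """Logistic model: map a feature vector to a probability between 0 and 1."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Made-up weights for (months_inactive, support_calls, on_promotion);
# in practice these would come from training on historical outcomes.
weights = [0.8, 0.5, -1.2]

customers = {
    "C001": [3, 4, 0],  # long inactive, many support calls -> likely churner
    "C002": [0, 1, 1],  # active and on a promotion -> likely stays
}

# "Operationalizing": make the score part of a business process by
# routing anyone above a threshold to a retention (or investigation) queue.
THRESHOLD = 0.7
review_queue = [cid for cid, feats in customers.items()
                if churn_probability(weights, feats) > THRESHOLD]
```

The same pattern covers the fraud example in the report: swap the retention queue for a special investigation unit and the churn score for a fraud score.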

These are only a few of the findings in this Best Practices Report. To download the complete report, click here.

To learn more about all things data, attend a TDWI conference! Each TDWI Conference features a unique program taught by highly qualified, vetted instructors teaching full- and half-day courses on topics of specific interest to the analytics/BI/DW professional.


Data visualization and the dynamic dashboard

I’m a big fan of data visualization because it really helps people understand information. You can certainly derive a lot of insight from a report, but sometimes it helps to look at data in a different way. Dashboards are one way to get information to people in an easily digestible format. However, for dashboards to be effective, they need to be:


  • Engaging – you have to want to look at them
  • Useful – they provide valuable information that is easily understandable


Anyone can say they provide a dashboard if they have a few gauges and charts, but flat representations of information don’t go a long way toward displaying complex information. On the other hand, there can’t be so much happening in the dashboard that the user becomes overwhelmed and confused.


Now, I have to admit that I wasn’t a big fan of certain dashboard visuals – such as the gauge.  This may be because when I was developing executive information systems (way back when), our gauges were fairly static, so they didn’t convey much information.  However, I recently had a conversation with Shadan Malik, CEO of iDashboards, and I found the interactive dashboards he showed me quite engaging and useful.


Here’s an example of what I mean. This is a static screenshot of a dashboard from iDashboards (which actually illustrates my point about some static displays of information). But click here to see the actual dynamic dashboard, built using Flex® and Flash® technology.





This dashboard presents information from two different bank call centers – one called Auburn Bank and the other Regel Bank. This is a real dashboard, but the data and names have been changed to protect the innocent, which explains why some axes aren’t labeled and why some of the data looks a bit suspect – but you can get the idea of how useful a dashboard can be.


There’s a lot of information on the dashboard, but it isn’t overwhelming. The dashboard actually tells a story. In this case, it is the story of two different banks and how well each bank’s call center is performing. Right away, you can see that (for Auburn Bank) over the past six months there have been some issues with cost and budget, that the percentage of calls coming through the call center (vs. the Web) has increased, and that abandonment rates are decreasing. The interactive mode helps to drive some of the information home, however. For example, Auburn Bank has been having some issues with answering calls in a timely manner. I found this out in just a few seconds by hovering over the cost/budget chart month by month, watching what was happening in the gauges, and seeing when the trend moved into the red zone. You can also toggle between the two banks to compare their performance, and drill down into any piece of data to get more information.


The company recently announced that it has incorporated Flex technology into its product. This increases the speed of data processing and enables even more interactivity. Of course, there is a fine line between visualizing just the right amount of information and overwhelming the user with too much. My understanding is that iDashboards works with many of its customers to help deal with data issues, determine important metrics, and walk through the storyboarding of the dashboard. This is probably a good thing, given the power behind the product.

Four Questions about Innovations in Analysis

Several weeks ago, Hurwitz & Associates deployed a short survey entitled “Four questions about innovations in analysis.” Well, the results are in, and they are quite interesting!




First, a few words about the survey itself and who responded to the survey.


  1. We wanted to keep the survey short and sweet. We were interested in what kinds of analytical technology companies thought were important and, specifically, how companies were using text analytics to analyze unstructured information. Finally, since there has been a lot of buzz about analyzing social media, we asked about this as well.
  2. Let me say up front that, given the nature of our list, I would categorize most of the respondents as fairly technology savvy. In all, 61 people responded to the survey; 32% of these respondents were from high-technology companies. Other verticals included professional services, followed by manufacturing, financial/insurance, healthcare, and pharmaceutical. There were also some responses from governmental agencies, telecommunications, and energy companies. So, while the results are unscientific in terms of a random sample across all companies, they probably do reflect the intentions of potential early adopters, although not in a statistically significant manner.
  3. In analyzing the results, I first looked at the overall picture and then examined individual verticals, as well as filtering the results by other attributes (such as those using text analytics vs. those not using the technology), to get a feel for what these companies were thinking and whether one group differed from another. These subgroups are, of course, quite small, and the results should be viewed accordingly.


The importance of innovative technologies

We first asked all of the respondents to rate a number of technologies in terms of importance to their companies. Figure 1 shows the results. Overall, most of these technologies were at least somewhat important to this technology-savvy group, with query and reporting leading the pack. This isn’t surprising. Interestingly, OLAP data cubes appeared to be the least important analytical technology – at least with this group of respondents. Other technologies, such as performance management, predictive modeling, and visualization, ranked fairly high as well. Again, not surprisingly, text analytics ranked lower than some of the other technologies, probably because it is just moving out of the early adopter stage. Some of the respondents, from smaller firms, had no idea what any of these technologies were. In terms of text analytics, one company commented, “yeekes, this must be big time company kind of stuff. Way up in the clouds here, come down to earth.” They, no doubt, are still using Excel and Access for their analytical needs. Other smaller companies were very interested in “non-cube” technologies, such as some of the visualization products on the market today.




Four questions about BI Innovation

As many of you know, I have been spending a great deal of time researching innovations in BI. Yesterday, I posted a short four-question survey regarding how companies might be using some of the analysis technologies on the market today. I’m starting to get some interesting responses!
I’d be interested in hearing your thoughts.
Simply click on the link below and answer a few questions. It should take no more than 30 seconds.

Innovations in Data Visualization – Animation and more

Robin Bloor and I were briefed by SAS about some of its visualization technologies last week as part of our research into innovations in BI.

SAS has thought a lot about visualization. In fact, the company has an interesting user-centric UI model that looks at classes of users across various visualization techniques, including dashboards, reporting, application graphics, and interactive graphics. What was particularly intriguing to us was its interactive graphics product, JMP.

It’s not new

I admit that I was unaware of this visualization tool, and I suspect I am not alone. SAS actually developed JMP in the late 1980s to link graphics and data. The product now runs with an in-memory data structure that can handle upwards of 32 gigabytes of data (depending on your setup). The visualization options that SAS provides run the gamut from basic to sophisticated, with links to its more complex analytics. The latest version of JMP provides a data-filtering feature that allows users to focus on subsets of data and highlight across attributes. JMP 7.0 also provides some well-designed bubble plots, new three-dimensional scatter plots, and non-parametric density contours (with spinning features and live scales!). You can see some examples by clicking here.

Particularly exciting to both Robin and me is how SAS is incorporating animation into the product. Robin wrote about this on his blog this past week. The folks at SAS (correctly) appreciate that people can understand information better through animation, and that visualizing how data changes over time can be very helpful in analysis. JMP makes it easy to automate this animation via a series of sliders.

Visualization techniques will continue to grow in importance because people need a better way to gain insight from data than simple charts and reports can provide. We’ve only touched the tip of the iceberg with SAS, and I’m sure we’ll both have more to say on the topic. Stay tuned.

Innovations in Data Visualization – Visual analytics

A wise man once told me, “Look at the data. What are the data telling you?” That was my dissertation advisor, some twenty years ago, before the term data visualization was even coined. And that’s the sensible advice I’ve followed throughout my career when analyzing all different kinds of data.

Data visualization in the form of slicing and dicing, charting and pivoting is standard for most knowledge workers performing data analysis. BI vendors provide visualization in the form of charts and tables of data cut different ways. Microsoft provides its ever-popular pivot table, but dealing with the data can be cumbersome, especially if you want to explore the data quickly across multiple dimensions.

Marcia Kaufman and I recently got a chance to meet with Christian Chabot, CEO and co-founder, and Elissa Fink, VP of Marketing, of Tableau Software in Seattle, Washington. They impressed us both with Tableau’s innovations in data visualization.

The visualization is the query

So, what’s so interesting about Tableau’s approach?

Consider the following typical analysis problem. You are trying to analyze sales for different categories of TV sets at ten different store locations for the first half of the year. The data include location, region, TV type (flat panel LCD, flat panel plasma, LCD projection, etc.), date sold, dollar value, and salesperson, as well as information about promotions and warranty plans. If you used a pivot table in Microsoft Excel, you could cross-tab and slice and dice the information; you could even drag and drop various attributes onto a chart. At the end of the day, however, you are still left looking at a two-dimensional static plot, or (more likely) a bunch of static plots, trying to derive insight.
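The pivot-table baseline described above amounts to a cross-tab aggregation. Here is a minimal sketch in Python, with made-up sales records (the data and field names are hypothetical):

```python
from collections import defaultdict

# Hypothetical sales records: (region, tv_type, dollar_value)
sales = [
    ("East", "flat panel LCD", 1200.0),
    ("East", "flat panel plasma", 1500.0),
    ("West", "flat panel LCD", 900.0),
    ("West", "LCD projection", 700.0),
]

# What a pivot table computes: total dollars for each (region, TV type) cell
crosstab = defaultdict(float)
for region, tv_type, dollars in sales:
    crosstab[(region, tv_type)] += dollars
```

Each run of this loop yields one static cross-tab; asking a different question means re-cutting the data, which is exactly the friction an interactive approach removes.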

With Tableau, it’s not about slotting the data into a plot or report to examine; it’s about rapid visual analysis of the data.

Tableau reads in structured data from many sources, such as Excel, Access, text files, SQL Server, Oracle, DB2, MySQL, PostgreSQL, Firebird, Netezza, SQL Server Analysis Services, and Hyperion Essbase. The columns in an Excel spreadsheet, for example, are read into Tableau and classified as what it calls dimensions (non-numeric) or metrics (numeric), which are listed on the left-hand side of the Tableau screen. The user then simply drags and drops as many of these dimensions and metrics as desired onto the palette, and the visual representation of the data changes.
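The dimension/metric split described above can be sketched as a simple type test. This is only an illustration of the rule as the paragraph states it, using hypothetical spreadsheet rows; Tableau's actual classification logic is richer than this:

```python
# Hypothetical rows as read from a spreadsheet
rows = [
    {"Location": "Store 1", "Region": "East", "TV Type": "flat panel LCD",
     "Units Sold": 12, "Dollar Value": 14400.0},
]

# Non-numeric columns become dimensions; numeric columns become metrics.
dimensions, metrics = [], []
for column, value in rows[0].items():
    if isinstance(value, (int, float)):
        metrics.append(column)
    else:
        dimensions.append(column)
```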

In the example above, you might first start an analysis looking at sales of TVs by category and by region.

TVs by Category by Region

Then you might drag another “column” onto the palette that further breaks this down by the type of TV in each category (flat panel LCD, flat panel plasma, etc.). This changes the visualization to include the additional dimension.





By interacting with the visual in this manner, the user is querying the visual. The product makes it easy to look at the data dynamically from all different angles, thereby enabling rapid analysis and discovery.

Here are a few of the features that make the analysis quick:

  • The product makes good use of color; for example, losses are shown in red. There are also very nice graphical representations to work with.
  • If the underlying data permit, Tableau lets users look across any time dimension (daily, weekly, monthly, quarterly, yearly) with a simple click of a drop down menu.
  • If you don’t want a particular time dimension included in the analysis, simply select and remove it and the visual changes.
  • Tableau lets the user drill down into the visual, to see the underlying data.
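The time-granularity switching in the second bullet amounts to re-keying the same aggregation. Here is a minimal sketch in Python, with hypothetical daily sales records (an illustration of the idea, not Tableau's implementation):

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily sales records: (day, amount)
daily = [
    (date(2008, 1, 5), 100.0),
    (date(2008, 1, 20), 150.0),
    (date(2008, 2, 3), 80.0),
]

def rollup(records, key):
    """Total the amounts under whatever time key (month, year, ...) is chosen."""
    totals = defaultdict(float)
    for day, amount in records:
        totals[key(day)] += amount
    return dict(totals)

monthly = rollup(daily, key=lambda d: (d.year, d.month))  # monthly view
yearly = rollup(daily, key=lambda d: d.year)              # yearly view
```

Switching from the monthly to the yearly view is just a different `key` function over the same underlying data, which is why a tool can offer it as a single click in a drop-down menu.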

The product is flexible and extremely easy to use. It’s also visually appealing – the company definitely practices what it preaches. The charts are clean and crisp, and there is good use of color. The latest version of Tableau (3.5) also includes Tableau Server, a Web-based sharing and publishing solution that enables users to share their results with others. The Personal Edition is a visual analysis and reporting solution for data stored in Excel, MS Access, or text files, with a price tag of $999.00. It’s worth looking into.

