What is location intelligence and why is it important?

Visualization can change the way we look at data and information.  If that data contains a geographic or geospatial component, then using location information can provide a new layer of insight for certain kinds of analysis.  Location intelligence is the integration and analysis of visual geographic/geospatial information as part of the decision-making process.  A few examples where this might be useful include:

  • Analyzing marketing activity
  • Analyzing sales activity
  • Analyzing crime patterns
  • Analyzing utility outages
  • Analyzing military options

I had the opportunity to meet with the team from SpatialKey the other week.  SpatialKey offers a location intelligence solution, targeted at decision makers, in a Software as a Service (SaaS) model.  The offering comes from Universal Mind, a consulting company that specializes in design and usability and has done a lot of work on dashboards, Geographic Information Systems, and the like.  Based on that experience, it developed a cloud-based service to help people use geographic information more effectively.

According to the company, all a user needs to get started is a CSV file with their data. Files must contain either an address, which SpatialKey will geocode, or latitude and longitude for mapping purposes.  The file can contain any other structured data components.  Here is a screen shot from the system.  It shows approximately 1,000 real estate transactions from the Sacramento, California area reported over a five-day period.
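As a sketch of what that file-level requirement looks like in practice, here is a minimal check in Python. This assumes pandas, and the column names and rows are hypothetical, not SpatialKey's actual schema:

```python
import io
import pandas as pd

# Hypothetical CSV export: each row needs either an address (to be
# geocoded by the service) or explicit latitude/longitude columns.
csv_data = io.StringIO(
    "address,latitude,longitude,price\n"
    "123 Main St Sacramento CA,,,250000\n"
    ",38.58,-121.49,310000\n"
)
df = pd.read_csv(csv_data)

# Verify every row is mappable: it has coordinates or an address.
has_coords = df["latitude"].notna() & df["longitude"].notna()
has_address = df["address"].notna()
mappable = has_coords | has_address
print(f"{int(mappable.sum())} of {len(df)} rows can be mapped")
```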

[Screenshot: Sacramento real estate transactions]

There are several points to note in this figure.  First, the data can be represented as a heat map, meaning areas with large numbers of transactions appear in red and areas with lower numbers in green.  Second, the software gives the user the ability to add visualization pods, which are graphics (on the left) that drill down into the information.  The service also allows you to incrementally add other data sets so you can visualize patterns.  For example, you might choose to layer crime rates or foreclosure rates on top of the real estate transactions to understand the area better.  The system also provides filtering capabilities through pop-ups and sliders.
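A heat map of this kind boils down to binning point coordinates and coloring each cell by its count. A rough sketch of that underlying idea, using NumPy with made-up coordinates (not SpatialKey's implementation):

```python
import numpy as np

# Hypothetical transaction coordinates around Sacramento.
rng = np.random.default_rng(0)
lats = 38.5 + rng.random(1000) * 0.2
lons = -121.6 + rng.random(1000) * 0.2

# Bin the points into a coarse grid; cell counts drive the color
# ramp (high counts -> red, low counts -> green).
counts, _, _ = np.histogram2d(lats, lons, bins=10)
print(f"busiest cell holds {int(counts.max())} transactions")
```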

SpatialKey has just moved out of beta and into trial.  The company does not intend to compete with traditional BI vendors.  Rather, its intention is to provide a lightweight alternative to traditional BI and GIS systems.  The idea is to simply export data from different sources (either your company's data stores or even other cloud sources such as Salesforce.com) and let end users analyze it via a cloud model.

The future of data is more data.  Location intelligence solutions will only become more important as the number of devices, such as RFID tags and other sensors, continues to explode.  As these devices spew even more data into organizations, people will want better ways to analyze this information.  It makes sense to include geographic visualization as part of the business analytics arsenal.

Four Questions about Innovations in Analysis

Several weeks ago, Hurwitz & Associates deployed a short survey entitled, “Four questions about innovations in analysis”.  Well, the results are in, and they are quite interesting!

THE SURVEY

First, a few words about the survey itself and who responded.

  1. We wanted to keep the survey short and sweet.  We were interested in what kinds of analytical technology companies thought were important and, specifically, in how companies were using text analytics to analyze unstructured information.  Finally, since there has been a lot of buzz about analyzing social media, we asked about this as well.
  2. Let me say up front that, given the nature of our list, I would categorize most of the respondents as fairly technology savvy.  In all, 61 people responded to the survey; 32% of these respondents were from high technology companies.  Other verticals included professional services, followed by manufacturing, financial/insurance, healthcare, and pharmaceutical.  There were also some responses from governmental agencies, telecommunications, and energy companies.  So, while the results are unscientific in terms of a random sample across all companies, they probably do reflect the intentions of potential early adopters, although not in a statistically significant manner.
  3. In analyzing the results, I first looked at the overall picture and then examined individual verticals, as well as filtering the results by other attributes (such as those using text analytics vs. those not using the technology) to get a feel for what these companies were thinking about and whether one group differed from another.  These subgroups are, of course, quite small, and the results should be viewed accordingly.

THE RESULTS

The importance of innovative technologies

We first asked all of the respondents to rate a number of technologies in terms of importance to their companies.  Figure 1 shows the results.  Overall, most of these technologies were at least somewhat important to this technology-savvy group, with query and reporting leading the pack.  This isn’t surprising.  Interestingly, OLAP data cubes appeared to be the least important analytical technology, at least with this group of respondents.  Other technologies, such as performance management, predictive modeling, and visualization, ranked fairly high as well.  Again not surprisingly, text analytics ranked lower than some of the other technologies, probably because it is just moving out of the early adopter stage.  Some of the respondents, from smaller firms, had no idea what any of these technologies were.  In terms of text analytics, one company commented, “yeekes, this must be big time company kind of stuff. Way up in the clouds here, come down to earth.”  They, no doubt, are still using Excel and Access for their analytical needs.  Other smaller companies were very interested in “non-cube” technologies such as some of the visualization products on the market today.

Is This the Death of the Data Cube? (continued)

I’ve never been a fan of the data cube. In fact, I’ve always disliked it because it seems so constraining. I don’t want to be chained to a certain thought process when I’m analyzing data. Maybe that’s because I try to use a variety of analytical approaches when gaining insight from data.

Recently, Robin Bloor and I have begun to research innovations in Business Intelligence. One of the areas we’re actively looking into is new analytical approaches. A few weeks ago, Robin wrote about a company called QlikTech in his blog, and I want to add my thoughts about the company to his. Robin discussed the fact that the company’s product, called QlikView, builds its data structure in memory on a server using the associated schema information from databases. He said that the power is in that data structure. This means that QlikView can build any multidimensional view into that data in a fraction of a second.

Let me build on this because it is important.

As I just mentioned, I never liked cubes. I suppose they served their purpose in that they could provide a multidimensional view, but they lack flexibility. Generally, every time a user wants something new, someone else needs to get involved. This takes time and can be frustrating. With the advent of cheap memory and increased processor speed, this no longer has to be the case. Users shouldn’t have to be constrained. And, this is what QlikView is about.

It’s associative

QlikView reads data from your company’s data sources into its data structure. QlikView can handle a maximum of 2 billion records per table. The practical limitation is the amount of data that can reside in the RAM of the computer QlikView is running on. QlikView compresses the data as it is brought into memory.
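QlikView's actual compression scheme isn't described here, but column-oriented tools commonly rely on dictionary encoding: each distinct value is stored once, and rows hold small integer codes. A rough illustration of why that helps, using pandas categoricals as a stand-in:

```python
import numpy as np
import pandas as pd

# A column with many repeated values (here, hypothetical country
# codes) compresses well under dictionary encoding.
n = 100_000
countries = pd.Series(
    np.random.default_rng(1).choice(["US", "UK", "DE", "FR"], size=n)
)

raw_bytes = countries.memory_usage(deep=True)
encoded_bytes = countries.astype("category").memory_usage(deep=True)
print(f"raw: {raw_bytes} bytes, dictionary-encoded: {encoded_bytes} bytes")
```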

The user can develop various ways to view the data. Here’s a screen shot from QlikView. In this case, we are looking at customers for certain biking-related products. The data consist of information about customers, including age, gender, marital status, country of residence, orders, spend, the category of product they bought, subcategories, etc. The data are available for the years 2005-2007. The view here consists of charts and plots that were created on the fly to examine how age affects purchases. Various plotting and charting options are available in QlikView.

The list boxes on the left of the screen show that the analysis concerns customers in the United States only, although the data exist to look across various countries. My view examines the years 2005-2007 (seen at the top of the screen). My charts and plots indicate that most of the sales come from two age groups, 22-33 and 33-44, which also have the highest average sale, although there are some interesting patterns in the average sales figures for older customers. Other charts indicate some interesting statistics regarding orders per customer.

[Screenshot: customer summary]

Here’s the associative part. This view may lead me to ask the question, “Has it been like this every year?” or any other question for that matter. The beauty of the QlikView approach is that the data to ask all sorts of additional questions is at my fingertips. So, let’s assume I want to look at what happened in 2007 only. I simply select 2007 and presto, the view changes immediately to examine this year in particular. The screen shot below illustrates this. You can see the changes to the plots and charts and the fact that data is only available for part of the year (the other months are grayed out). I note any pattern changes here and then I decide to look at 2006 and then 2005 – you get the idea.
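The pattern here, one active selection driving the recomputation of every view, can be sketched in a few lines of pandas (the records and age groups are made up for illustration):

```python
import pandas as pd

# Hypothetical order records spanning 2005-2007.
orders = pd.DataFrame({
    "year": [2005, 2006, 2007, 2007, 2006, 2007],
    "age_group": ["22-33", "33-44", "22-33", "45-56", "22-33", "33-44"],
    "sale": [120.0, 340.0, 95.0, 410.0, 150.0, 220.0],
})

def summarize(df):
    """Recompute every 'chart' from whatever selection is active."""
    return df.groupby("age_group")["sale"].agg(["count", "mean"])

# Full view, then the same views recomputed for a 2007-only selection.
print(summarize(orders))
print(summarize(orders[orders["year"] == 2007]))
```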

[Screenshot: customer summary with 2007 selected]

The in-memory advantage

This is a simple example. But what I like about QlikView is that I can create different views on the fly and examine the data in different ways, instantaneously, because the data is in memory and the calculations are done when I need them.

I couldn’t do this with a cube.

Innovations in Data Visualization – Visual analytics

A wise man once told me, “Look at the data. What are the data telling you?” That was my dissertation advisor, some twenty years ago, before the term data visualization was even coined. And that’s the sensible advice I’ve followed throughout my career when analyzing all different kinds of data.

Data visualization in the form of slicing and dicing, charting and pivoting is standard for most knowledge workers performing data analysis. BI vendors provide visualization in the form of charts and tables of data cut different ways. Microsoft provides its ever-popular pivot table, but dealing with the data can be cumbersome, especially if you want to explore the data quickly across multiple dimensions.

Marcia Kaufman and I recently got a chance to meet with Christian Chabot, CEO and co-founder, and Elissa Fink, VP of Marketing, from Tableau Software of Seattle, Washington. They impressed us both with Tableau’s innovations in data visualization.

The visualization is the query

So, what’s so interesting about Tableau’s approach?

Consider the following typical analysis problem. You are trying to analyze sales for different categories of TV sets at ten different store locations for the first half of the year. Data include location, region, TV type (flat panel LCD, flat panel plasma, LCD projection, etc.), date sold, dollar value, sales person, as well as information about promotions and warranty plans. If you used a pivot table in Microsoft Excel, you could cross-tab and slice and dice information; you could even drag and drop various attributes onto a chart. At the end of the day, however, you are still left looking at a two-dimensional static plot, or (more likely) a bunch of static plots, trying to derive insight.
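To make the contrast concrete, here is roughly what one of those static cross-tabs looks like in code, using pandas rather than Excel (the store names and figures are made up):

```python
import pandas as pd

# Hypothetical TV sales records for part of the year.
sales = pd.DataFrame({
    "location": ["Store 1", "Store 2", "Store 1", "Store 3"],
    "tv_type": ["flat panel LCD", "flat panel plasma",
                "LCD projection", "flat panel LCD"],
    "month": ["Jan", "Jan", "Feb", "Mar"],
    "dollars": [1200.0, 1800.0, 900.0, 1500.0],
})

# A cross-tab answers one fixed question; asking a different
# question means building another table.
by_type = pd.pivot_table(sales, values="dollars", index="tv_type",
                         columns="month", aggfunc="sum")
print(by_type)
```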

With Tableau, it’s not about slotting the data into a plot or report to examine; it’s about rapid visual analysis of the data.

Tableau reads in structured data from many sources, such as Excel, Access, text files, SQL Server, Oracle, DB2, MySQL, PostgreSQL, Firebird, Netezza, SQL Server Analysis Services, and Hyperion Essbase. The columns in an Excel spreadsheet, for example, are read into Tableau and classified as either dimensions (non-numeric) or metrics (numeric), which are listed on the left-hand side of the Tableau screen. The user then simply drags and drops as many of these dimensions and metrics as desired onto the palette, and the visual representation of the data changes.
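That dimension/metric split is straightforward to sketch: anything non-numeric becomes a dimension, anything numeric a metric. A minimal illustration in pandas (the spreadsheet contents are hypothetical):

```python
import pandas as pd

# Hypothetical sales sheet as it might arrive from Excel.
sheet = pd.DataFrame({
    "region": ["West", "East", "West"],
    "tv_type": ["flat panel LCD", "flat panel plasma", "LCD projection"],
    "units": [12, 7, 4],
    "revenue": [14400.0, 9100.0, 3600.0],
})

# Split columns the way Tableau presents them: non-numeric columns
# become dimensions, numeric columns become metrics.
dimensions = [c for c in sheet.columns
              if not pd.api.types.is_numeric_dtype(sheet[c])]
metrics = [c for c in sheet.columns
           if pd.api.types.is_numeric_dtype(sheet[c])]
print("dimensions:", dimensions)
print("metrics:", metrics)
```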

In the example above, you might first start an analysis looking at sales of TVs by category and by region.

[Screenshot: TV sales by category and region]

Then you might drag another “column” onto the palette that further breaks this down into the type of TV in each category, such as flat panel LCD, flat panel plasma, etc. This changes the visualization to include the additional dimension.

[Screenshot: TV sales by category, region, and type]

By interacting with the visual in this manner, the user is querying the visual. The product makes it easy to look at the data dynamically from all different angles, thereby enabling rapid analysis and discovery.

Here are a few of the features that make the analysis quick:

  • The product makes good use of color, so for example, losses would be shown in red. There are also very nice graphical representations to work with.
  • If the underlying data permit, Tableau lets users look across any time dimension (daily, weekly, monthly, quarterly, yearly) with a simple click of a drop down menu.
  • If you don’t want a particular time dimension included in the analysis, simply select and remove it and the visual changes.
  • Tableau lets the user drill down into the visual, to see the underlying data.
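The time-grain switching described above maps naturally onto resampling: the same daily data rolled up at whatever frequency the user picks. A small pandas sketch of that idea, with made-up data (not Tableau's internals):

```python
import pandas as pd

# Hypothetical daily sales; switching the time grain is just a
# different resample frequency over the same underlying data.
idx = pd.date_range("2008-01-01", periods=90, freq="D")
daily = pd.Series(range(90), index=idx, name="sales")

weekly = daily.resample("W").sum()      # weekly rollup
monthly = daily.resample("MS").sum()    # monthly rollup
print(f"daily rows: {len(daily)}, weekly: {len(weekly)}, "
      f"monthly: {len(monthly)}")
```

Whatever the grain, the totals agree, since each rollup is just a regrouping of the same underlying rows.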

The product is flexible and extremely easy to use. It’s also visually appealing; the company definitely practices what it preaches. The charts are clean and crisp, and there is good use of color. The latest version of Tableau (3.5) also includes Tableau Server, a Web-based sharing and publishing solution that enables users to share their results with others. The Personal Edition is a visual analysis and reporting solution for data stored in Excel, MS Access, or text files, with a price tag of $999.00. It’s worth looking into.
