Search Discovery and DTM (Dynamic Tag Management)

I haven’t updated my blog here in quite some time, but I wanted to give a quick update as to where I am and what I’m up to in analytics. Back in September 2013, I joined the great team at Search Discovery. This is the company that created the Satellite product, which was later purchased by Adobe and renamed Dynamic Tag Management (DTM). Over the last six months, I’ve gotten to immerse myself in DTM and in analytics implementations using this fantastic technology. It really does open things up and change how you see analytics implementation, especially for those on the analyst side who have always had to wait on dev teams to get analytics requirements in place. The technology can do so much, so easily, that I can’t really explain it all here.

If you’re interested in seeing what can be done with DTM, please feel free to shoot me an email.

When to Stop Using Excel

Without a doubt, Microsoft Excel is the most widely used “business intelligence tool.” Yes, those quotes are there for a reason. Excel is a great tool, and one that almost every analyst has used in excess and been heavily dependent upon. However, there does come a point where a business should evaluate whether it is time to grow up and move beyond spreadsheets for analysis, reporting and data visualization. Over the years, I’ve worked on many projects where the final deliverable is some kind of automated spreadsheet. Through this experience, I’ve seen many things that make me think, “okay, Excel is NOT the best way to meet this business requirement in the long term, because of X.” I thought I’d share a few of the things I’ve seen done in Excel that indicate it’s time to start looking for another way to analyze, report and visualize your data:

  • Ownership – Could a new person jump right into the spreadsheet and continue the work?
    • Many times, I’ve seen (or created) spreadsheets that would be next to impossible to follow all the way from how the data is inserted, through how it’s transformed and tied together, to how it’s eventually reported upon. This makes maintaining a spreadsheet solution very difficult in the long term. When a spreadsheet’s functionality becomes so complex that no one aside from the original creator can edit it, it’s time to consider other solutions. This becomes even more true when you start to introduce things such as VBA scripting, large pivot tables (these in general are an indicator that you need a real BI solution) and complex macros into the mix.
  • Size – Plain and simple, Excel is not meant for large datasets
    • This one is pretty much a no-brainer. A single Excel worksheet can only hold about one million rows of data (1,048,576, to be exact). But good luck not pulling your hair out trying to manipulate (or even just open) an Excel file with that much data in it. Even though a worksheet can technically hold that many rows, it takes far less data than that to essentially make a spreadsheet useless (or a nightmare to use). Even inserting a new, blank column into such a spreadsheet means sitting there for a couple of minutes waiting on your machine’s CPU to chug away at the task.
  • Geographic Data – It’s a spreadsheet, not a mapping tool
    • When dealing with geographic data, your visualizations in Excel are pretty limited. There is no integrated way to visualize geographic data on any kind of map in Excel (at least out of the box). Visualizing geographic data on a map is great for the consumers of the data, as it allows them to immediately put the data into proper context without first scanning a ranked list of locations (or even just the top 10 rows, etc.) or an entire bar chart of locations (neither of which shows where the location actually is).
  • Data Integration – It’s a spreadsheet, not a database
    • A “vlookup” is not really joining multiple data sets. Excel can tie data from multiple datasets together if there are common keys, but you have to ask yourself whether Excel is the best way to maintain a complex set of data where integration and data relationships are critical. Put simply, if multiple data sets are critical to analysis, you will eventually run into the ownership and size issues mentioned above. At some point, your business requirements will be better served by a true relational database (even Access would scale better, and MySQL is a free relational database).
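
To make the data-integration point concrete, below is a minimal sketch of the kind of join a “vlookup” only approximates, done in an actual relational database. It uses SQLite (which ships with Python) purely for illustration; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical example: joining page metadata to visit counts with SQL,
# instead of maintaining per-row lookup formulas in a spreadsheet.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pages  (page_id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE visits (page_id INTEGER, visits INTEGER);
    INSERT INTO pages  VALUES (1, 'Home'), (2, 'Products');
    INSERT INTO visits VALUES (1, 500), (2, 320), (1, 75);
""")

# One declarative query both joins and aggregates the two tables.
rows = conn.execute("""
    SELECT p.title, SUM(v.visits) AS total_visits
    FROM visits v
    JOIN pages p ON p.page_id = v.page_id
    GROUP BY p.title
    ORDER BY total_visits DESC
""").fetchall()
print(rows)  # [('Home', 575), ('Products', 320)]
```

As the relationships between data sets grow (more tables, more keys), this approach keeps scaling where nested lookup formulas quickly become unmaintainable.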

Even though spreadsheets are very useful in the right situation, my primary arguments against them follow from the points above. So in summary, a spreadsheet isn’t the best long-term solution for business intelligence, analysis and data visualization if: a) it’s too complex for anyone else to quickly update or expand upon, b) you have a large amount of data, c) you need to effectively analyze geographic data, or d) you have many data sets that are related and need to be joined (or even visualized alongside one another).
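To make the size point (b) concrete, here is a minimal Python sketch of streaming aggregation: rows are processed one at a time rather than loaded all at once, which is how you handle a file well past a worksheet’s row limit. The sample data and column names are hypothetical stand-ins for a large CSV on disk.

```python
import collections
import csv
import io

# Hypothetical sample standing in for a multi-gigabyte CSV; with a real
# file you would instead use: with open(path, newline="") as f: ...
sample = io.StringIO(
    "country,views\n"
    "US,120\n"
    "DE,80\n"
    "US,45\n"
)

totals = collections.Counter()
for row in csv.DictReader(sample):  # reads one row at a time
    totals[row["country"]] += int(row["views"])

print(totals.most_common())  # [('US', 165), ('DE', 80)]
```

Memory use stays roughly constant no matter how many rows the source has, because only the current row and the running totals are held at any moment.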


Data Visualization Best Practices

Below is a presentation deck from Tableau which was presented during one of their recent “roadshow” events. Regardless of the tools you have at hand to visualize your data, the best practices presented here are worth your time. Of special interest within the deck are the information and research on how humans best consume data visually, as well as how certain relationships in data are best visualized for efficient consumption.

From Web Analytics to Business Intelligence to Analytics

Right now, there is a shift happening in how organizations see Web analytics. This shift is part of the maturation of data usage within organizations. Before Web analytics, many organizations had investments in business intelligence (BI) solutions and technologies. Then, the Web came about, and dedicated Web analytics companies (WebTrends, Omniture, CoreMetrics, etc.) sprung up to quickly address these new data and reporting needs.

With their existing capabilities to handle large data sets and provide custom reporting, traditional business intelligence solutions really missed the boat; all they needed to do was figure out how to collect and store the data. But now is their chance to catch back up, as organizations begin realizing that the Web is just one piece of the puzzle.

In order for this to happen, business intelligence solutions (Microstrategy, Cognos, Business Objects, etc.) need to develop competing offerings that allow organizations to quickly hit the ground running, with the goals of:

  1. Integrating Web traffic data into their solution from existing Web Analytics players (as mentioned above)
  2. Capturing Web traffic data and storing it in a raw form with a proper database
  3. Selling reporting solutions and visualizations that immediately address the shortcomings of “canned” Web analytics solutions.

In addition to the traditional BI providers mentioned above, there are now reporting-focused solutions such as QlikView and Tableau that enable organizations to quickly drop a visualization and reporting layer on top of an existing data source. So, once an organization can figure out the data collection and storage side of online performance (no small feat, of course), these solutions can surpass the canned reporting limitations of the traditional Web analytics providers.

I’m not trying to say that anyone should leave Web analytics for BI (in favor of one over the other). What I am saying is that now is the time for organizations to consider how important it is to integrate Web analytics data with other data sources, and what they could do if they owned their own data, had ready access to the raw data, and were not limited by “canned” solutions. The line between Web analytics and BI is starting to blur. If the choice were mine today, this is the approach (simplified, of course) that I would take as the owner of analytics (not Web or BI) within an organization as we head into the future:

  1. Acknowledge that the Web is only another source of insight
  2. Collect and store my own data (I’m very intrigued by Pion as a collection tool)
  3. Deploy a reporting solution where I could create any visualization or reporting needed by business stakeholders (QlikView and Tableau could do this once you’ve solved the data storage side of things)
  4. Socialize reporting, analysis, insights and recommendations. Make analytics and knowledge sharing collaborative (again, QlikView and Tableau can facilitate this)

As an analyst, why would you not want access to raw data and the ability to create your own reporting and visualization solutions? With that access, you are no longer limited by the reporting and data integration capabilities of a “canned” solution that tries to do collection, storage and reporting within a single environment.

This is all easier said than done, of course, and could be more expensive than the “canned” solution. But there are trade-offs to be made in whichever direction you head. Will you sacrifice greater data integration, data ownership and collaboration, or will you sacrifice the safer, easier-to-implement solution? The decision is yours to make, but make sure you weigh both options.

Digital Analytics Tools Aren’t a Strategy

What is your digital analytics strategy and how does it help your business achieve success?

Hold that thought until the end…

You’ve got some great analytics solutions in place that measure digital marketing performance (online sentiment, digital marketing automation, site optimization, site performance, digital customer experience, etc.), right? And I’m guessing that you might also have some great documentation that details how these solutions are implemented, and maybe even what they’re measuring. But it has been my experience that most organizations consider this their digital analytics strategy. A tool is a means to an end, not a strategy in and of itself.

This is putting the cart (digital analytics) before the horse (the strategy that should pull the company and your digital analytics efforts in the right direction).

Additionally, once a tool or technology is put into place (even if there was a strategy beforehand), we often see that it is used to simply measure and report on the number of times things happen and what things are happening most frequently. At a strategic level, companies aren’t looking for a digital analytics strategy that simply counts the number of times things happen. Companies need a digital analytics strategy that will provide answers and recommendations that specifically address the business goals and objectives of the company. Nothing less.

So now back to that original question. Can you truly answer it without naming or considering the tools that your company uses? If not, you need to consider developing a true strategic direction for your digital analytics. Here’s one possible answer to that question:

Our digital analytics strategy is to analyze online sentiment, social media impact, site performance, site optimization and digital marketing efforts so that we can provide clear recommendations and timely information to directly impact our strategic business goals and objectives.

From there you could further elaborate on the details such as:

  • What those specific goals/objectives are and how they are impacted by the strategic areas of digital measurement
  • What information (not data points) is needed from the specific digital marketing efforts
  • How you’re going to get the information and disseminate it
  • Etc…

So there’s a lot to consider here well before a tool (again, a means to an end) even enters the picture.

Put your horse back in front of the cart.


Omniture SiteCatalyst Data Processing Order

Since there are now so many ways for an Omniture customer to manipulate their data (standard data collection, APIs, processing rules, VISTA rules, marketing channel processing rules, etc.), I thought it’d be helpful to share this graphic (credit to Omniture) of the order in which data is processed: