Measuring a successful visit to a content site

So you have a content-based site, and you want to know whether your visitors’ time on your site was successful.

You have two options:

  1. Attempt to measure this via their on-site behaviour; or
  2. Ask them, via one of the many “voice of customer” solutions.

This post will deal only with #1.

Measuring the success of a visit on a content site can be challenging, simply because there’s not necessarily one path to follow. Rather, revenue is often generated via advertising, where page views = ad impressions = revenue.

If you are trying to measure the success of your content site, there are a few ways you can go about this.

  • Page Views per Visit: Seeing a large number of PVs/Visit could indicate a visitor has found information that is useful to them and has had a successful visit. However, a lost or confused visitor would also generate a large number of page views. How do you distinguish the two?
  • Time on Site: This too could indicate a successful visit. However, it could also indicate that someone is spending time searching for (and not finding) what they want.

So how could you better measure success?

  • Focus on valuable pages. A high number of page views to actual content suggests a more successful visit than a high number of page views that might include, say, site searches. Therefore, focusing on PVs/Visit (or Time Spent) for a subset of pages can be more valuable than site-wide PVs/Visit or Time Spent.
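As a rough sketch of that idea (the page paths and the `/article/` convention below are invented for illustration), a content-only pages-per-visit metric might look like:

```python
# Hedged sketch: compare site-wide pages per visit against pages per visit
# counted only over "valuable" content pages. All paths are hypothetical.
visits = [
    {"id": 1, "pages": ["/home", "/search", "/article/alpha"]},
    {"id": 2, "pages": ["/search", "/search", "/search"]},
    {"id": 3, "pages": ["/article/alpha", "/article/beta", "/home"]},
]

def is_content(page):
    # Assumed convention for this sketch: real content lives under /article/
    return page.startswith("/article/")

sitewide_pvs_per_visit = sum(len(v["pages"]) for v in visits) / len(visits)
content_pvs_per_visit = sum(
    sum(1 for p in v["pages"] if is_content(p)) for v in visits
) / len(visits)
```

Visit 2 (all searches) inflates the site-wide figure, but the content-only metric isn’t fooled by it.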

But you can do better. First, you need to assess why your content site exists. What behaviour can a visitor perform that would indicate they successfully found what they were looking for?

  • For example, your site exists to provide information X – that’s the goal and purpose of your site. Therefore, a visitor seeing content X achieves that goal, and suggests they had a successful visit.
  • If your site exists for reasons X, Y and Z, a successful visit could be one that saw one or more of X, Y or Z.
  • Setting up goals or segments around these behaviours can help you measure over time whether your visitors are performing these behaviours. Can better navigation drive up the percentage of visitors successfully completing this task? Which tasks are more popular? Are you even doing a good job of communicating what your site exists for? (If very few actually complete that main task or tasks, I’d suggest probably not!)
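As a minimal sketch of that kind of segment (the goal URLs here are invented placeholders, not anything from a real configuration):

```python
# Hypothetical sketch: a visit counts as "successful" if it viewed any of
# the pages your site exists to deliver. URLs below are placeholders.
GOAL_PAGES = {"/content/x", "/content/y", "/content/z"}

def is_successful(visit_pages):
    return any(page in GOAL_PAGES for page in visit_pages)

visits = [
    ["/home", "/content/x"],         # successful
    ["/home", "/search", "/about"],  # not successful
    ["/content/z"],                  # successful
]
success_rate = sum(is_successful(v) for v in visits) / len(visits)
```

Tracking that rate over time is what lets you ask whether better navigation is driving it up.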

A final note: the intention of measuring a successful visit to your site is to measure this success from the point of view of the visitor. Is your site doing a good job of providing what visitors want?

This “success” doesn’t necessarily tie to short-term revenue for a content site. After all, a successful visit might be one where the visitor comes in, finds what they’re looking for immediately, and leaves. However, that visitor might generate more ad impressions by getting completely lost on your site. Good for you … in the short term. But doesn’t mean they had a successful visit to your site, nor does it bode well for your long-term revenue.

Therefore, measurement of visit success should be analysed alongside measures of revenue success, while carefully weighing the long-term benefits of successful visits (and happy visitors) against the short-term revenue generated by “lots and lots of page views”.

Happy Jojoba-mas

Okay, for those who have not heard Emer Kirrane‘s (slightly frightening) plots regarding Jojoba, here’s the backstory.

It started many months ago, with a random DM from Emer (aka @Exxx on Twitter) informing me: Hey, I just realised when I read your surname [Hinojosa] my brain reads it as Jojoba. (Pronounced “Ho-Ho-Bah”)

Chuckle, chuckle. Never thought much of it.

Next followed random Jojoba references. (Apparently it replaced my name…) Then others joined in. (They need no encouragement!)

Soon after, I was notified about the ultimate Jojoba campaign goal. And then I became afraid.

This campaign was advertised to the world in Emer’s “Silly-Season” interview of me. Or, as I call it, her recruiting minions. (For the record, read the other interviews too. The psychosis doesn’t just end with mine.)

Did I also mention that my husband joined in? I think Emer probably cried when he used the #jojoba hashtag – or perhaps just cackled evilly to herself …

At this point, I had two choices. 1) Hide terrified under my bed from a tiny Oirish lass, or 2) Embrace it.

So … let’s check out which I did:  Drumroll … click!

PS. I’ll still be using my current @michelehinojosa account. So, uh, please make sure you keep referencing it. Otherwise you’re totally gonna kill my Twitalyzer scores!

PPS. Wondering what the hell is up with “Jojojoba”? (The extra “Jo”?) Yeah, that’s “ho-ho-ho”, for the holiday season. The official rules dictate that Jojojoba is only available for use over the holiday season, and ends on 12/31 of each year. Uh, yeah … Don’t blame me for this. I didn’t make these rules … Though I did watch in amazement as they were discussed and set out via Twitter.

PPPS. In case you’re curious, yes the @jojoba username was taken. The exchange required for me to actually get this username involved a little stalking (with generous help from Tim Wilson!), a very kind soul who accepted my plea and cause, plus a monkey, peach, and bucket of jello.

So now that the story is told … Happy Jojobamas, everyone!

Simplifying vs Oversimplifying vs Acknowledging Complexity

As analysts, we need to work with complexity, while simplifying for end-users, yet avoid oversimplifying. Naturally, this is easier said than done …

Simplifying for others: This is incredibly important. If you can’t explain the problem and findings to someone in 25 seconds or less, you 1) will likely lose their attention, and 2) possibly don’t understand it well enough yourself to explain it yet. That’s our job. We work with the details and bring others in on the conclusions.

Oversimplifying: The balance required is to simplify the final conclusions without oversimplifying the problem, the data, or your analysis. The struggle, however, is that our brains are hard wired to simplify information.

Think about the amount of stimuli your brain receives every day. For example, you are crossing the street. I am outside. I see a long stretch of gray. That is a road. There is a red thing coming towards me. I hear a noise. The red thing is making the noise. That red thing is a car. Cars can hit me. It is going at 45mph. I am stationary. It will reach my location in 4 seconds. I will take 10 seconds to cross the street. I should not walk yet. And of course, I’m completely understating all that goes through our brains for even simple tasks. If our brains didn’t find a way to make sense of a high volume of inputs, we simply wouldn’t function.

Acknowledging Complexity: The challenge for analysts is to try to simplify the answer, without oversimplifying the questions along the way. If you make erroneous assumptions because they (over)simplify your analysis, you could end up drawing the wrong conclusions. You will probably make your analysis easier, but render it less valuable.

We need to acknowledge, work with (and enjoy) complexity. (And we had better get used to it, because the digital measurement space is not getting simpler.) However, we need to avoid oversimplifying more than is necessary to sift signal from noise. We need to question what we know, evaluate what we assume, and separate fact from opinion. And if in doubt, invite someone else to question you or poke holes in your analysis. Chances are, they’ll spot something you didn’t.

What percentage of your company’s online revenue should be invested in analytics?

There is a lot of discussion in the web analytics community on what percentage of your analytics budget should be spent on tools vs. people. However, the question I’m posing is what percentage of your company’s online revenue should be invested in analytics to begin with? (Aka, 50:50 of what?)

With free solutions like Google Analytics out there, I’m not surprised some companies initially baulk at the cost of an enterprise solution. (After all, it’s more than “free”.)

I thought it might help us all to understand what other companies are doing. Therefore, if you’re at liberty to divulge (keeping in mind this is anonymous – I’m not asking who you are, or what company you work for) would you answer the following two questions?

I will happily share the findings with the community once complete.
An example is below, under the poll.

Example:

Your company generates a total of $200MM in yearly revenue.
(Note: revenue, not profit.)

$100MM of that comes from the online channel.

You spend $1MM total on analytics:
$300K spent on tools
$700K spent on people.

= 1% spent on analytics
0.3% spent on tools
0.7% spent on people.
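The example works out as straightforward arithmetic against online revenue:

```python
# The worked example above as arithmetic; percentages are taken against
# online revenue, per the example figures.
online_revenue = 100_000_000  # $100MM from the online channel
tools_spend = 300_000         # $300K on tools
people_spend = 700_000        # $700K on people
analytics_spend = tools_spend + people_spend  # $1MM total

pct_analytics = analytics_spend / online_revenue * 100  # 1%
pct_tools = tools_spend / online_revenue * 100          # 0.3%
pct_people = people_spend / online_revenue * 100        # 0.7%
```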

Shut up and listen

Shut up and listen – you might learn something.

I won’t lie. I’m not shy, I talk a lot, and I enjoy contributing to conversations and debates. However, sometimes (read: often) I need to give myself a slap upside the head to shut up and listen.

Why? I already know everything I’m going to say. There’s nothing I can talk about that I don’t already know. For the time my mouth is engaged, my brain learns nothing.

On the other hand, if I shut up and listen, I might learn something new.

eMetrics DC, 2010: Full wrap up

eMetrics 2010 was a great experience. Lots of smart, interesting speakers, and great food for thought. Here are some of my favourite points, quotes, etc:

Jim Sterne: Social Media – Time to Rethink Your Marketing Metrics

“Yes, I like data, but I’m all about customer centricity.”

Purpose of social media is to:

  • Gain awareness
  • Change attitudes
  • Consider your influence
  • Check on your competition
  • Take action and
  • Generate value.

Larry Freed from ForeSee Results: Managing Forward: Moving from Measuring the Past to Managing the Future

Measure what matters most – your customers. Need to measure what constitutes success from their view, not yours.

The consumer is in charge. Because of the internet, it is easy to shop around amongst competitors. You no longer have to go from store to store but can check another store’s prices from a mobile device. Power therefore now rests with a cross-channel consumer.

Satisfaction drives conversion, loyalty, retention and word of mouth. Satisfy your customers and be fiscally responsible to survive and thrive in the years ahead.

Knowledge is power. Integration of metrics yields big dividends. Turn data into information, information into intelligence.

Social Media Metrics Framework Faceoff

Possible objectives of social:

  • Fostering a dialogue
  • Promoting advocacy
  • Facilitating support
  • Spurring innovation
  • Generating awareness
  • Driving revenue

Measuring social media is still new, and it is challenging to figure out the right KPIs. Sentiment analysis is not yet at a point where it can be automated, as sentiment comes from a reader’s reaction. (And note, people’s reactions are not all the same!)

Hardest thing about social is getting a VP/Director/etc to focus on why they want to be in social. What are the goals? Put channel last (e.g. don’t worry about whether to be on Facebook or Twitter) and think first about why you want to be in social at all.

PR and Analytics often use different tools to measure social media success, and often don’t mix or share information. Ideally, however, they should integrate and share knowledge.

Joe Megibow, Expedia

“I like to build stuff and blow sh*t up”

Expedia achieved success via data integration and uniting teams. Originally there were Tealeaf analysts, data warehouse analysts, BI analysts (etc.) in different countries, with no global standards of definition and measurement. There were multiple (conflicting) data systems, and while decisions were made by numbers, accuracy was in question, and decisions weren’t being made on facts.

[No wonder the web analytics industry doesn’t yet have standard definitions and measurement. Expedia didn’t even have them within one company!]

Now a united team, with consistent definitions and integrated data systems.

Joe’s six lessons:

  1. If you’re not working with your peers, you’re competing with them. Make sure people aren’t coming to different people for the same questions and having them compete against each other!
  2. Learn from finance. CFO has “the truth”, and seasoned analysts who work with it. Find your source of truth.
  3. Don’t just count, DO! Go from data collection to data action. Analysts can actually push projects through!
  4. Sign up for results. Are you willing to bet your job, your team?
  5. Manage expectations. If something will take a month, be clear about that. Business leaders just want plans and forecasts they can count on.
  6. Start small, and communicate, communicate, communicate. Find little business wins, and proactively deliver. Earn the right to do more.

The moral of the story? Get stuff done. Take ownership, get results, get more ownership.

Vendor two-minute presentations

I think this is all that really has to be shared!

Adam Greco, Salesforce.com

Web data and CRM data are like peanut butter and chocolate – better together. Web analytics data is good. Integrated with other systems, it drives real value.

  • Add offline success metrics to online variables.
  • Segment web analytics data by CRM fields.
  • Target web promotions using web and CRM data.

PS. Awesome success via integrating golf handicap from CRM system into ad promotion on site!

Add an easy-to-understand value for each website visitor via scoring. (E.g. reaching Page X is worth 1 point, Page Y is 2 points, etc.) Every visitor therefore has a score/value for their visit.
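A hedged sketch of the scoring idea – the page names and point values below are invented for illustration:

```python
# Illustrative visitor scoring: each "valuable" page carries a point value,
# and a visit's score is the sum over the pages viewed. Values are made up.
PAGE_POINTS = {"/page-x": 1, "/page-y": 2, "/demo-request": 5}

def visit_score(pages_viewed):
    # Pages without an assigned value contribute nothing to the score.
    return sum(PAGE_POINTS.get(page, 0) for page in pages_viewed)

score = visit_score(["/home", "/page-x", "/page-y"])  # 0 + 1 + 2 = 3
```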

To justify your analytics budget, you need to show them the money. Tie your success to revenue. It’s what CEOs understand.

Stephane Hamel: Measuring Your Organisation’s Web Analytics Maturity

Only way to success: Creativity in continuous improvement and attention to details. Generate innovative ideas and manifest them through to reality. Requires original thinking and producing. Analytics should use creativity. Analytics without creativity is just theory and pure mathematics.

Analytics = how a business arrives at an optimal and realistic decision, based on existing data.

Need people, process and technology.

Good thoughts from the session:

  • No data is enough if someone doesn’t want to believe. (Seth Godin)
  • Creativity is always constrained in some way. (Stephane’s daughter!)
  • Constraints make us push our limits (Peter Gabriel)
  • If data blocks your creativity, either your idea sucks, or you’re not being creative enough. (Jim Sterne)

Anyone can make the simple complicated. The challenge is to make the hard stuff easy. Analysis requires breaking a complex topic into smaller parts, to gain a better understanding of it.

Web analytics may be new, but we can (and should) learn from existing disciplines, rather than recreate the wheel.

The maturity model: the essential elements of effective processes, describing an evolutionary improvement from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness.

Example:

Analytics Maturity Model

Defined by level of maturity on a variety of axes. Aim is to develop but also remain even (e.g. no sense being mature in technology and having poorly developed analyst resources.) Growing one level of maturity is equal to one year of development.

Take homes:

  1. Get down to earth
  2. Fix your issues
  3. Gain experience
  4. Maintain balance (no point being well developed in one area, and low in another.)

Bob Page, eBay

eBay has an executive team that believes you can optimise the business with data. (And FYI, eBay has a ridiculous amount of data.)

They operate via distributed teams, organised datawarehouses, virtual data marts, but guided by common “North Star” metrics – those that are most important to the business.

Have developed an analytics community – like a Facebook for their many analysts. Allows information sharing, knowledge, and building relationships.

Structure of eBay analytics:

  • Dual hub & spoke model
  • Centralised technical team under the CTO
  • Centralised business analysts under the CFO
  • Distributed product analysts

The validation of groups’ findings helps keep a “separation of church and state” – keeps the businesses and teams honest when held accountable to another team.

Web analysts need to speak with a common vocabulary to and with finance.

On testing:

  • Exploration and testing are core pillars of an analytics driven organisation.
  • If all of your tests succeed, you’re not pushing hard enough. You need to do silly things, and fail.

Ensighten Tag Management

Interesting new kid on the block regarding tag management. “Just because we’ve always done it that way, doesn’t mean it’s the right way.”

KPI Clinic (June Li, Stephane Hamel, Angie Brown)

KPIs are your “Oh, sh*t!” metrics. If it doesn’t matter if they move up or down, that’s not a KPI. If they never change, that’s also not a KPI.

KPIs must also be actionable.

To make KPIs meaningful, they should be tied to people’s bonuses. Those are key for people.

There can (and should) be layers of KPIs: executives will have just a top-line few, middle will have more, analysts will have a lot. Being closer to the product and daily decisions will mean you should be looking at more detail.

Take the lead as an analyst in defining KPIs, and get buy-in. (If you ask for input, you may end up with 10,000 “Key” Performance Indicators.)

Agree on an expiration/review date, to make sure they get revisited from time to time.

Some think of KPIs as your dashboard, but John Lovett proposes thinking of them as your low fuel light, because you have to take action.

High level KPIs are great, but you must segment to make sure that they’re not hiding a lower-level trend. (E.g. One segment up, one down.)
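A toy illustration of that trap, with invented numbers: the aggregate rate stays flat month over month while the underlying segments move in opposite directions.

```python
# Hypothetical data: (visits, conversions) per segment for two months.
last_month = {"new": (1000, 40), "returning": (1000, 60)}
this_month = {"new": (1000, 30), "returning": (1000, 70)}

def overall_rate(segments):
    visits = sum(v for v, _ in segments.values())
    conversions = sum(c for _, c in segments.values())
    return conversions / visits

def segment_rate(segments, name):
    visits, conversions = segments[name]
    return conversions / visits

# Overall conversion is 5% both months, yet "new" fell from 4% to 3%
# while "returning" rose from 6% to 7%.
```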

Not sure whether a KPI is useful, or whether anyone is looking at it? Go “metric radio silent” (or use fake data!!) to see if anyone notices!

Analysts, take the initiative, but collaborate. And don’t groan when a change in KPIs is needed …

Global Analytics at IBM

Analytics managers: Set a clear goal, give the team the right resources, and watch them soar. (Margaret Escobar from IBM)

IBM global web analytics team is a centralised and paid internal service. Internal clients contract with analytics at the beginning of the year as to the number of resources needed, the skill level, etc.

IBM focuses on:

  • People (need to communicate, especially in a global team)
  • Process (trying to establish “reusable assets” aka templates, as well as sharing methodology and focusing on documentation)
  • Tools

Trying to balance:

  1. Consistency, with customisation
  2. Meeting demand, without overloading the team
  3. Learning and exploring, but still getting the work done.

IBM product management: Need to embed metrics and analytics into your program development, and decide what qualifies as success before you start.

Other notable

  • “I received the oddest compliment today: ‘You make analytics and fun no longer seem mutually exclusive.’ But later (different person): ‘I’m not sure you do anything smart …’ ” – Lee Isensee
  • “Analysts take a complex topic, then break it up to gain a better understanding. Don’t throw data at people. Tell the story.” – Michelle Rutan
  • Question: “Is there a tool that will integrate data from various sources?” Answer: “You’re the tool.”
  • “Counting is not analytics. Seems obvious, but what did you do last week?” – @sutterbomb
  • “Analysts need to get better at the why, not the what.” Pat LaPointe, Michael Dunn session.
  • “Good web analysts have perspective and good problem solving skills, an intellectual curiosity and a desire to know why.” Pat LaPointe, Michael Dunn. I agree: http://www.michelehinojosa.com/2010/06/19/the-most-valuable-trait-of-analysts-that-you-cant-teach/
  • The web analytics industry is maturing. Companies are pushing hard. In the future, an increase in digital and data savvy CEOs will help this further.

Shameless self promotion

My presentation and review.

eMetrics Washington DC, 2010 (preliminary thoughts)

I’m headed home from eMetrics (in fact, I’m writing this from cruising altitude. Love in-flight wi-fi!) It was my first time at an eMetrics event, and I have to say – it not only met but exceeded my hopes and expectations. (And for the record, I was very excited to go, so my expectations were set high!)

I’m going to write a more in-depth post at a later date, but wanted to share a few thoughts. If these don’t suffice, feel free to check out the Twitter stream on the conference, which I will fully admit I dominated.

However, I wanted to share a few thoughts and favourite quotes of the day.

“I like to build stuff, and blow sh*t up.” (Joe Megibow, Expedia)
“KPIs are your ‘Oh, sh*t!’ metrics.” (Angie Brown) [Apparently it was word of the day …]
Kill unnecessary reports by trying to go “metrics radio silent” and see who notices (Lee Isensee, Unica)

The highlights for me were Joe Megibow, Stephane Hamel and Bob Page (with Adam Greco rounding out the list) and definitely the networking, conversations, and smart people I got to spend three days with. I feel very fortunate to have found such an amazing industry to be a part of, and can’t wait to get more involved via the Analysis Exchange, Web Analytics Association, and hopefully more conference presentations!

I think most conferences end up having a focus. This definitely seems the year of social and the holy grail of multi-channel analytics and a complete view of the customer. (Note: we’re now getting demanding. It’s not enough to tie together everything online. We need traditional and POS and phone data – oh my!)

I also presented about analytics at Kelley Blue Book (the presentation is available here) which was a great first presentation experience. Someone very silly put me in the biiiiig ballroom, but happily, people showed up, and even had questions later!

I’ll share more once I have a chance to compile notes and tweets.

eMetrics presentation

Today I presented at eMetrics DC 2010.

eMetrics DC 2010: “Marketing Metrics: The Publisher Perspective”

When your business model is advertising, your focus is different. Michele talks about the tools, techniques and unique KPIs at Kelley Blue Book in this session about audience measurement, web analytics and data integration to forecast site traffic, on-site advertising inventory and revenue. Michele goes deep about statistical rigor, automated reporting and managing an internal advertising data warehouse.

[APOLOGIES – PRESENTATION CURRENTLY UNAVAILABLE]

What analysts can learn from group fitness instructors

Les Mills RPM

I am an analyst and a certified Les Mills group fitness instructor for BodyPump (weight training), RPM (indoor cycling), BodyCombat (mixed martial arts based group fitness) and BodyJam (dance based group fitness.)

While analyst and group fitness instructor seem very different, there’s actually a lot that analysts can learn from instructors.

When we are trained as instructors, we spend a lot of time thinking about how different people learn, and how to teach to all of them.

Visual learners need to see it to understand. In group fitness, these participants need you to demonstrate a move, not explain it. In analytics, this may mean visually displaying data, using diagrams, graphs and flow charts instead of data tables – and perhaps even hitting up the whiteboard from time to time.

Auditory learners need to hear it. In group fitness, they rely on verbal cues from the instructor. In analytics, you may have a thousand beautiful visual displays or PowerPoint slides, but it’s your commentary and explanation that will help these people understand.

Kinesthetic learners need to feel it to understand, to experience what you’re talking about. In group fitness, you can show them and tell them, but what they need is to feel the difference between “the right way” and “the wrong way” (for example, “Oh, now I can feel how muscle x engages when I turn my heel!”) This is the same group that tends to need repetition to perfect what they’re doing. In analytics, these are often the people who need to be led through your logic. It’s not enough to show them your findings and display the final results. They need to see the steps along the way that you used to answer your questions.

Now here’s where it gets trickier. When you are presenting to a group, they won’t all be the same type of learner. Which means that a good group fitness instructor, like a good analyst, needs to explain the same thing in different ways to ensure that everyone understands. For an analyst, this may mean using visual displays of information on your slides, talking through the explanation, and giving a step-by-step example to put everyone on the same page.

Keep in mind that you too have your own learning style. Your analysis and presentation style will likely match your learning style. (If you are a visual learner, a visual presentation will come easy to you.) It may take a more conscious effort to make sure you incorporate the learning styles you do not share. However, by tailoring your message to ensure you hit all learning styles, you stand the best chance of getting everyone to the same understanding.

Are you an expert?

Guru. Ninja. Rockstar. Expert. These descriptions are all over the place. (*cough* Twitter bios *cough cough*)

Done learning. This is what I hear.

Here’s the deal. Calling yourself an expert sounds like you think you’ve got nothing left to learn. How can you be an expert in web analytics, or social media? These fields have been around for all of about forty-five seconds. (And they’ve changed twenty-seven times since then!)

My $0.015:  Don’t ever call yourself an expert, a guru, a rockstar. (And don’t just replace it with samurai or swami. You get my point.) Someone else may call you that, but let’s be honest, even then you should shrug it off.

The most appealing trait is a desire to learn, improve, to continue honing your skills. Focus on that. Let your work and development prove yourself. Not a self-appointed noun.