What’s in a name?

As we’ve shared the idea of Lean Analytics, we’ve obsessed—as most authors do—with naming. When I co-wrote Complete Web Monitoring a few years ago, I talked to a friend of mine, author Mitch Joel, about its initial name: Watching Websites. Ultimately, my co-author Sean and I decided that name would be misinterpreted as a book on online video streaming, and went with the more descriptive (but less sexy) title Complete Web Monitoring. When Mitch heard this, he shook his head in dismay: he felt that by clarifying what the book was, we’d dramatically reduced its audience to a particular subset.

Every author wants an Airport Book. This is the kind of book that fills the racks in an airport bookstore. It makes you feel smarter for having bought it. It’s packed with little gems you can share at conferences. And it’s often got a compelling title: Made To Stick, Switch, Lean Startup. Short titles, with few syllables and a verb in them, do well.

It turns out there is, and isn’t, a lot of research behind a successful title. There’s plenty of data on book sales, of course—so much that publisher Lulu even has a title-scorer app that uses research on bestsellers to predict (in a tongue-in-cheek way) what chance a title has of becoming a bestseller.

The advent of the electronic book, and its accompanying electronic bookstore, helps collect data on what people browse, and can give a publisher early indications of success in real time.

Electronic bookstores have also polarized purchasing: getting featured in online stores is key to success, as is getting mentioned by the Oprah of your particular industry (or, for that matter, Oprah.) Unfortunately, the choice of title, cover art, and other aspects of a book remain something of a dark art for traditional publishers. This New York Times piece explains it in detail.

Most in the industry seem to see consumer taste as a mystery that is inevitable and even appealing, akin to the uncontrollable highs and lows of falling in love or gambling.

Eric Simonoff, a literary agent at Janklow & Nesbit Associates, said that whenever he discusses the book industry with people in other industries, “they’re stunned because it’s so unpredictable, because the profit margins are so small, the cycles are so incredibly long, and because of the almost total lack of market research.”

As analytics types, we don’t like this. My biggest concern is the bubble in which we live and write. Consider another book in O’Reilly’s Lean series, Ash Maurya’s Running Lean. I’ve told plenty of people about it; it’s doing really well, and Ash is in high demand as a speaker and subject matter expert. When a potential reader comes from the startup world, they get the title immediately. But outside our world? They think it’s a book on jogging to lose weight.

No, seriously: if you asked a million random people what the book “Running Lean” was about, what do you think they’d say? Here’s a post from a car tuning forum.

In our case, many people I’ve talked to in the business world (that is, in established businesses that are well past finding their product/market fit) think “Lean Analytics” is about doing web analytics on a shoestring budget, or about Business Intelligence with a minimum of technology. One even thought it was about lifelogging to shed pounds by tracking weight. None of those topics will leap off airport shelves. This presents us with a dilemma:

  • If we target a market narrowly with a title that resonates, we’ll get good adoption within that market.
  • But if we over-target the branding, we limit our ability to reach a broader audience.

It’s a tough balance to strike. It’s one we hope to understand through analysis and experimentation, but even then, this is difficult: we mostly know how to survey startup types. Tim Ferriss did some in-store testing before choosing his title (making many of us wonder how much he squeezes into four hours.)

He took a book of about the same size, put a bunch of different covers on it, placed it in the new non-fiction section, then sat back and watched people’s reactions for the next few hours. An overwhelming number (something like 300% more people) picked up The 4-Hour Workweek cover compared to the others.

Ferriss and his writing team came up with 12 alternate names. To break the deadlock and pick the title the book eventually carried, Ferriss ran a Google AdWords campaign: “He bought ads for relevant keywords for all twelve potential book titles and tracked which titles performed the best. The clickthrough rate for The 4 Hour Work Week was by far the highest, so that is what his book is called.”

Ferriss proved it’s possible to use Google AdWords to pick a book title. As Zeigler reports, it was “a smart and novel approach to write a great book title,” and “Google Adwords is a cheap and real time focus group.”

Ben and I have a number of ideas for titles that might work. But they’re just ideas—hypotheses to be tested. We’ll try a variety of titles with different audiences and see what works, because that’s what analytical types do when confronted with uncertainty. Not sure whether we’ll camp out in bookstores, but ads and surveys seem like a good start.

On that note, we have a survey going right now with 5 name options. Previously we had many more, and ran a survey on new sign-ups for the book. After 70 or so survey completions, we took the five highest ranked names and updated the survey.

Please take a look here: help us name our book!

The survey takes no more than a minute to complete, and will be one of the data points we use in picking a final name for the book. Thank you!

The One Metric That Matters

One of the things Ben and I have been discussing a lot is the concept of the One Metric That Matters (OMTM) and how to focus on it.

Founders are magpies, chasing the shiniest new thing they see. Many of us use a pivot as an enabler for chronic ADD, rather than as a way to iterate through ideas in a methodical fashion.

That’s why it’s better to run the risk of over-focusing (and missing some secondary metric) than it is to throw metrics at the wall and hope one sticks (the latter is what Avinash Kaushik calls Data Puking.)

That doesn’t mean there’s only one metric you care about from the day you wake up with an idea to the day you sell your company. It does, however, mean that at any given time, there’s one metric you should care about above all else. Communicating this focus to your employees, investors, and even the media will really help you concentrate your efforts.

There are three criteria you can use to help choose your OMTM: the business you’re in; the stage of your startup’s growth; and your audience. There are also some rules for what makes a good metric in general.

First: what business are you in?

We’ve found there are a few big business-model Key Performance Indicators (KPIs) that companies track, dictated largely by the company’s main goal. Most online businesses are transactional, collaborative, SaaS-based, media, game, or app-centric. I’ll explain.

Transactional

Someone pays you money in return for something.

Transactional sites are about shopping cart conversion, cart size, and abandonment. This is the typical transaction funnel that anyone who’s used web analytics is familiar with. To be useful today, however, it should be a long funnel that includes sources, email metrics, and social media impact. Companies like Kissmetrics and Mixpanel are doing a lot to champion this approach these days.
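
To make that concrete, here’s a minimal sketch of the basic transactional numbers; the variable names and figures are invented, purely for illustration:

    # Basic transactional KPIs with made-up numbers.
    visits = 10_000        # sessions that reached the store
    carts_started = 1_200  # sessions that added something to a cart
    purchases = 300        # sessions that completed checkout
    revenue = 13_500.00    # total revenue for the period

    cart_conversion = purchases / carts_started  # 25% of carts convert
    overall_conversion = purchases / visits      # 3% of visits convert
    abandonment = 1 - cart_conversion            # 75% of carts are abandoned
    average_cart_size = revenue / purchases      # $45 per order

    print(f"conversion {overall_conversion:.1%}, abandonment {abandonment:.1%}, "
          f"average cart ${average_cart_size:.2f}")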

Collaborative

Someone votes, comments, or creates content for you.

Collaboration is about the amount of good content versus bad, and the percent of users that are lurkers versus creators. This is an engagement funnel, and we think it should look something like Charlene Li’s engagement pyramid.

Collaboration varies wildly by site. Consider two companies at opposite ends of the spectrum. Reddit probably has a very high percentage of users who log in: it’s required in order to upvote posts, and the login process doesn’t demand an email confirmation loop, so anonymous accounts are permitted. On the other hand, an adult site likely has a low rate of sign-ins; the content is extremely personal, and nobody wants to share their email details with a site they may not trust.

On Reddit, there are several tiers of engagement: lurking, voting, commenting, submitting links, and creating subreddits. Each of these represents a degree of collaboration by a user, and each segment represents a different lifetime customer value. The key for the site is to move as many people into the more lucrative tiers as possible.
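
As a rough illustration (the tier counts below are hypothetical, not Reddit’s real numbers), the collaborative metric is simply the share of users in each tier:

    # Share of users in each engagement tier (hypothetical counts).
    tiers = {
        "lurking": 80_000,
        "voting": 12_000,
        "commenting": 5_000,
        "submitting links": 2_500,
        "creating subreddits": 500,
    }
    total = sum(tiers.values())

    for tier, users in tiers.items():
        print(f"{tier:>20}: {users / total:.1%} of users")

Watching those percentages shift over time tells you whether you’re actually moving people up the pyramid.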

SaaS

Someone uses your system, and their productivity means they don’t churn or cancel their subscription.

SaaS is about time-to-complete-a-task, SLA, and recency of use; and maybe uptime and SLA refunds. Companies like Totango (which predicts churn and upsell for SaaS), as well as uptime transparency sites like Salesforce’s trust.salesforce.com, are examples of this. There are good studies that show a strong correlation between site performance and conversion rates, so startups ignore this stuff at their peril.
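
Churn and recency boil down to very simple arithmetic; here’s a minimal sketch with invented numbers:

    # Monthly churn and recency of use, with invented data.
    from datetime import date

    subscribers_at_start = 2_000
    cancellations_this_month = 90
    monthly_churn = cancellations_this_month / subscribers_at_start  # 4.5%

    # Recency: days since each account last used the product.
    last_seen = {"acme": date(2012, 7, 2), "globex": date(2012, 6, 12)}
    today = date(2012, 7, 15)
    days_idle = {name: (today - seen).days for name, seen in last_seen.items()}

    print(f"monthly churn {monthly_churn:.1%}; days idle: {days_idle}")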

Media

Someone clicks on a banner, pay-per-click ad, or affiliate link.

Media is about time on page, pages per visit, and clickthrough rates. That might sound pretty standard, but the variety of revenue models can complicate things. Consider Pinterest’s affiliate URL rewriting model, which requires that the site take into account the likelihood someone will actually buy something, as well as the percentage of clickthroughs (see also this WSJ piece on the subject.)
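
To see why the purchase likelihood matters as much as the clickthrough rate, here’s a back-of-the-envelope sketch (all numbers invented):

    # Expected affiliate revenue per page view (invented numbers).
    page_views = 50_000
    affiliate_clicks = 1_500
    purchases = 60
    average_commission = 4.00  # dollars earned per completed purchase

    clickthrough_rate = affiliate_clicks / page_views  # 3%
    purchase_rate = purchases / affiliate_clicks       # 4% of clicks buy
    revenue_per_view = clickthrough_rate * purchase_rate * average_commission

    print(f"CTR {clickthrough_rate:.1%}; expected ${revenue_per_view:.4f} per page view")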

Game

Players pay for additional content, time savings, extra lives, in-game currencies, and so on.

Game startups care about Average Revenue Per User (ARPU) per month, and about lifetime ARPU. Companies like Flurry do a lot of work in this space, and many application developers roll their own code to suit the way their games are used.
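
The math itself is straightforward; here’s a minimal sketch with invented numbers:

    # ARPU per month and lifetime ARPU, with invented numbers.
    monthly_revenue = 25_000.00
    monthly_active_users = 40_000
    average_lifetime_months = 6

    arpu_per_month = monthly_revenue / monthly_active_users   # $0.63 per user per month
    lifetime_arpu = arpu_per_month * average_lifetime_months  # $3.75 per user

    print(f"ARPU ${arpu_per_month:.2f}/month, lifetime ARPU ${lifetime_arpu:.2f}")

The hard part isn’t the arithmetic; it’s instrumenting the game so the revenue and active-user counts are trustworthy.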

Game developers walk a fine line between compelling content, and in-game purchases that bring in money. They need to solicit payments without spoiling gameplay, keeping users coming back while still extracting a pound of flesh each month.

App

Users buy and install your software on their device.

App is about the number of users, the percentage that have loaded the most recent version, uninstalls, sideloading versus app-store installs, and ratings and reviews. Ben and I saw a lot of this with High Score House and Localmind while they were in Year One Labs. While similar to SaaS, there are enough differences that it deserves its own category.

App marketing is also fraught with grey-market promotional tools. A large number of downloads makes an application more prominent in the App Store. Because of this, some companies run campaigns to artificially inflate download numbers using mercenaries. This gets the application some visibility, which in turn brings it legitimate users.

It’s not that simple

No company belongs in just one bucket. A game developer cares about the “app” KPI when getting users, and the “game” or “SaaS” KPI when keeping them; Amazon cares about “transactional” KPIs when converting buyers, but also “collaboration” KPIs when collecting reviews.

There are also some “blocking and tackling” metrics that are basic for all companies, many of which are captured in lists like Dave McClure’s Pirate Metrics:

  • Viral coefficient (how well your users become your marketers; see the sketch after this list.)
  • Traffic sources and campaign effectiveness (the SEO stuff, measuring how well you get attention.)
  • Signup rates (how often you get permission to contact people; and the related bounce rate, opt-out rate, and list churn.)
  • Engagement (how long since users last used the product) and churn (how fast people go away). Peter Yared did a great job explaining this in a recent post on “Little Data.”
  • Infrastructure KPIs (cost of running the site; uptime; etc.) This is important because it has a big impact on conversion rates.
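
Here’s the sketch promised above for the first of these, the viral coefficient. The numbers are invented; the metric simply multiplies invitations per user by how well those invitations convert:

    # Viral coefficient: invites per user x invite conversion rate (invented numbers).
    users = 1_000
    invitations_sent = 3_000
    invitees_who_joined = 240

    invites_per_user = invitations_sent / users                  # 3.0
    invite_conversion = invitees_who_joined / invitations_sent   # 8%
    viral_coefficient = invites_per_user * invite_conversion     # 0.24

    # Above 1.0, each user brings in more than one new user and growth compounds;
    # below 1.0, you still need other acquisition channels.
    print(f"viral coefficient k = {viral_coefficient:.2f}")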

Second: what stage are you at?

A second way to split up the OMTM is to consider the stage that your startup is at.

Attention, please

Right away you need attention generation to get people to sign up for your mailing list, MVP, or whatever. This is usually a “long funnel” that tracks which proponents, campaigns, and media drive traffic to you; and which of those are best for your goals (mailing list enrollment, for example.)

We did quite a lot of this when we launched the book a few weeks ago using Bit.ly, Google Analytics, and Google’s URL shortener. We wrote about it here: Behind the scenes of a book launch

Spoiler alert: for us, at least, Twitter beats pretty much everything else.

What do you need?

Then there’s need discovery. This is much more qualitative, but there are quantifiable things to track: survey completions, which fields aren’t being answered, top answers, and which messages generate the most interest or discussion. For many startups, the metric will be something like “how many qualitative surveys did I do this week?”

On a slightly different note, there’s also the number of matching hits for a particular topic or term—for example, LinkedIn results for lawyers within 15km of Montreal—which can tell you how big your reachable audience is for interviews.

Am I satisfying that need?

There’s MVP validation—have we identified a product or service that satisfies a need? Here, metrics like amplification (how much does someone tell their friends about it?), Net Promoter Score (would you recommend it to your friends?), and Sean Ellis’ One Question That Matters (from Survey.io—”How would you feel if you could no longer use this product or service?“) are useful.

Increasingly, platforms like Indiegogo and Kickstarter are ways to launch, get funding, and test an idea all at the same time, and we’ll be looking at what works there in the book. Meanwhile, Ben found this excellent piece on Kickstarter stats. We’re also talking with the guys behind Pen Type-A about their experiences (and I have a shiny new pen from them sitting on the table; it’s wonderful.)

Am I building the right things?

Then there’s feature optimization. As we figure out what to build, we need to look at things like how much a new feature is being used, and whether adding the feature to a particular cohort or segment changes something like signup rates, time on site, and so on.

This is an experimentation metric—obviously, the business KPI is still the most important one—but the OMTM is the result of the test you’re running.

Is my business model right?

There’s business model optimization. When we change an aspect of the service (charging by month rather than by transaction, for example), what does that do to our essential KPIs? This is about whether you can grow or hire, and whether you’re getting the organic growth you expected.

Later, many of these KPIs become accounting inputs—stuff like sales, margins, and so on. Lean tends not to touch on these things, but they’re important for bigger, more established organizations who have found their product/market fit, and for intrapreneurs trying to convince more risk-averse stakeholders within their organization.

Third: who is your audience?

A third way to think about your OMTM is to consider the person you’re measuring it for. You want to tailor your message to your audience. Some things you share internally won’t help you in a board meeting; some metrics the media will talk about are just vanity content that won’t help you grow the business or find product/market fit.

For a startup, audiences may include:

  • Internal business groups, trying to decide on a pivot or a business model
  • Developers, prioritizing features and making experimental validation part of the “Lean QA” process
  • Marketers optimizing campaigns to generate traffic and leads
  • Investors, when we’re trying to raise money
  • Media, for things like infographics and blog posts (like what Massive Damage did.)

What makes a good metric?

Let’s say you’ve thought about your business model, the stage you’re at, and your audience. You’re still not done: you need to make sure it’s a good metric. Here are some rules of thumb for what makes a number that will produce the changes you’re looking for.

  • A rate or a ratio rather than an absolute or cumulative value. New users per day is better than total users (see the sketch after this list.)
  • Comparative to other time periods, sites, or segments. Increased conversion from last week is better than “2% conversion.”
  • No more complicated than a golf handicap. Otherwise people won’t remember and discuss it.
  • For “accounting” metrics you use to report the business to the board, investors, and the media, choose something which, when entered into your spreadsheet, makes your predictions more accurate.
  • For “experimental” metrics you use to optimize the product, pricing, or market, choose something which, based on the answer, will significantly change your behaviour. Better yet, agree on what that change will be before you collect the data.
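
Here’s the sketch promised in the first bullet: turning a cumulative total into a rate you can act on (the readings are invented):

    # Convert cumulative user totals (one reading per day) into new users per day.
    cumulative_users = [1_000, 1_040, 1_095, 1_150, 1_230]

    new_users_per_day = [today - yesterday
                         for yesterday, today in zip(cumulative_users, cumulative_users[1:])]

    print(new_users_per_day)  # [40, 55, 55, 80] -- a trend, not just a bigger total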

The squeeze toy

There’s another important aspect to the OMTM. And I can’t really explain it better than with a squeeze toy.


If you optimize your business to maximize one metric, something important happens. Just like a bulging stress-relief toy, when you squeeze it in one place, it bulges out in others. And that’s a good thing.

A smart CEO I worked with once asked me, “Alistair, what’s the most important metric in the business right now?”

I tried to answer him with something glib and erudite. He just smiled knowingly.

“The one that’s most broken.”

He was right, of course. That’s what focusing on the OMTM does. It squeezes that metric, so you get the most out of it. But it also reveals the next place you need to focus your efforts, which often happens at an inflection point for your business:

  • Perhaps you’ve optimized the number of enrolments in your gym—but now you need to focus on cost per customer so you turn a profit.
  • Maybe you’ve increased traffic to your site—but now you need to maximize conversion.
  • Perhaps you have the foot traffic in your coffee shop you’ve always wanted—but now you need to get people to buy several coffees rather than just stealing your wifi for hours.*

Whatever your current OMTM, expect it to change. And expect that change to reveal the next piece of data you need to build a better business faster.

(* with apologies to the excellent Café Baobab in Montreal, where I’m doing exactly that.)

Behind the scenes of a book launch

A couple of weeks ago, we launched the Lean Analytics website. Since we’re writing about lean analytics, we figured we should probably track a few things along the way. In the interest of transparency, here’s what we did and how it went.

Creating the site

Our first step was to create the website. We did this with the help of Nudge Design, a designer who knows WordPress really well and was able to turn things around quickly. We had to work fast, because Ben’s company had just completed its acquisition by Salesforce and Startupfest was around the corner.

Setting up analytics and goals

We put Google Analytics into the site, and set up a couple of goals. These were simple enough. We wanted to know how many people would:

  • Click on the book cover to find out what an MVC was.
  • Sign up for our mailing list. (Come to think of it, you should probably do that now.)
  • Complete our brief survey to tell us about themselves.

Here’s what we configured in Google Analytics:

Note that we couldn’t easily track mailing list signups, since they happened elsewhere. We also configured the survey completion goal halfway through the launch; more on that later.

Generating tracking codes

We planned on telling some of our friends and proponents about the book. These are people with big followings: Tim O’Reilly, from our publisher; Eric Ries, the Lean Startup founder and series editor; Avinash Kaushik, arguably the smartest (and most irreverent) person in analytics; and Julien Smith, a Montrealer and bestselling author.

We also wanted to use our blogs (Solve for Interesting and Instigatorblog) as sources for traffic, as well as two events we’d be speaking at (Startupfest and Lean Startup Machine.)

Tracking all of these mentions across the “long funnel” from the first mention to the eventual visit is accomplished by embedding tags in the URL. Google has a page that helps you build these tracking URLs, allowing you to segment visitors by source, campaign, and so on.

The tags help distinguish visitors by their source, the campaign used, the medium, and so on. We didn’t want to burden our proponents with too many codes, but I did generate some codes for myself which identified different media (LinkedIn, Twitter, Facebook) to demonstrate how it works.
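
If you’d rather not click through a web form every time, the same tags are easy to build by hand. Here’s a minimal sketch; the URL and campaign values are placeholders, not the actual codes we sent out:

    # Build a campaign-tagged URL, the same thing Google's URL builder does.
    from urllib.parse import urlencode

    base_url = "http://example.com/"
    tags = {
        "utm_source": "twitter",        # where the link was shared
        "utm_medium": "social",         # what kind of channel it is
        "utm_campaign": "book-launch",  # which push it belongs to
    }

    print(base_url + "?" + urlencode(tags))
    # http://example.com/?utm_source=twitter&utm_medium=social&utm_campaign=book-launch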

Generating short URLs

A long URL from Google’s tool won’t fit easily into social media platforms. Plus, it looks messy and it’s hard to remember. So we needed to shorten these, which we did by generating short URLs with bit.ly.

For many of the shared URLs, we just used the default random string of letters that bit.ly gave us, but for some of the events we generated custom ones that were easy to remember. For example, in my presentation on coefficients of friction at Startupfest, I used bit.ly/Leanfriction in my closing slide.
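
You can script this step too. Here’s a rough sketch against bit.ly’s v4 REST API; the endpoint, token handling, and helper function are assumptions for illustration, not necessarily how we generated ours:

    # Shorten a long, tagged URL via the bit.ly v4 API (illustrative; requires a token).
    import requests

    def shorten(long_url: str, token: str) -> str:
        response = requests.post(
            "https://api-ssl.bitly.com/v4/shorten",
            headers={"Authorization": f"Bearer {token}"},
            json={"long_url": long_url},
        )
        response.raise_for_status()
        return response.json()["link"]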

Giving the codes to famous people

Armed with these codes, we sent them to our proponents, and they mentioned the site to their followers. The race was on: who would generate the most traffic? Who would send us the most completed surveys?

We could have asked these proponents to use several tracking codes apiece to get richer data, but that’s too big an “ask.” Giving them a single URL to mention is good; in fact, the less work, the better. I’ve run campaigns in the past where we actually gave each proponent a calendar invite, so they’d remember to do the thing we asked at the time we asked. Famous people are busy; you have to make it easy for them wherever you can if you want their support.

Watching the results on bit.ly

Bit.ly provides some detailed information about the way short links are shared, so we were able to see how each proponent’s URL spread across a particular medium. Here’s Eric’s mention spreading across Twitter on the day he announced it.

Note that we can also tell some things about how his audience interacts with content—some use Facebook, some Hootsuite, and the vast majority came through Twitter’s shortened URL (t.co) which in turn encapsulated our bit.ly URL.

Watching realtime on Google Analytics

Google Analytics has a beta feature in which realtime traffic scrolls across your screen. Here’s what we saw on the receiving end of Eric Ries’ mentions. Note that any shortfall (people who clicked his link, but for whom we didn’t see a visit) is an indicator of problems with the site; slow loading, for example, may cause someone to cancel their visit before the page loads. This is known as the sessions-to-clicks ratio.
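
A quick way to quantify that shortfall (the numbers here are invented):

    # Sessions-to-clicks: how many bit.ly clicks actually became analytics sessions.
    bitly_clicks = 1_400
    analytics_sessions = 1_150

    sessions_to_clicks = analytics_sessions / bitly_clicks  # ~82%
    shortfall = bitly_clicks - analytics_sessions            # 250 clicks never became visits

    print(f"{sessions_to_clicks:.0%} of clicks became sessions; {shortfall} were lost")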

I’m not sold on the benefits of realtime data unless you can react to things in realtime, but it is a good tool for seeing if you’ve got something hideously wrong (such as the wrong landing page in a link) so you can go fix it. Plus, it’s fun to watch.

Seeing the Facebook page traction

We also set up a Facebook page. Once it hit 30 people, we had access to some stats on traffic and growth.

Facebook doesn’t give you a ton of data, but it’s enough to see whether traction is growing and to know how big your reachable audience is.

Join us on Facebook »

Traffic to our own blogs

Ben and I both wrote posts about the book on our own sites. I created different URL tags for Facebook, LinkedIn, and Twitter, turned these into three different URLs, and used a different one on each social platform. The results show me how useful each of the three sources is for traffic to the site.

Note that this is traffic to the Solve For Interesting post about the book, which in turn links to the book site. As such, it’s part of a long funnel that begins with a tweet on Twitter; leads to this page; and then (hopefully) sends traffic to the Lean Analytics book site.

Seeing the Mailchimp traction

We use Mailchimp to track our mailing list size, as well as to manage subscriptions, opt-out, and other aspects of staying in touch with our readers. Mailchimp provides charts and data on signup growth.

Once we start sending out messages and posts, Mailchimp can measure things like unsubscriptions, bounce rate, and so on. Until we’ve sent something out and cleaned up junk email addresses, we don’t know how many of those 572 subscribers are actually useful.

And by the way, if you haven’t already, you should probably sign up for the mailing list and take the survey. The form is over on the right, at the top of this page. Go on; we’ll wait.

Once we sent the first blog post to the mailing list, we had a better idea of our list quality. Here’s Mailchimp’s dashboard for a mail sent to all the people who asked to receive blog updates.

Our open rate was 55.5%; the average for our industry (“consulting”) is a paltry 15.8%. And Mailchimp thinks we can actually reach 308 people as a result of all this work. We’re not currently using Mailchimp for other platforms like Twitter or Facebook, though.

At this point, we have to confess to something: we didn’t track survey completion properly. We use Wufoo for our survey, and only the paid version is able to direct someone to a separate page on completion of the survey. We’re cheap bastards, and hadn’t paid for this, so halfway through the launch we realized the error of our ways, turned on the forwarding feature, and sent respondents to a thank-you page.

This matters because the only way we could easily track survey completion was when someone came back to that page. We should have generated Google Analytics “events” when someone put their cursor in the signup box, and embedded an event in Wufoo when someone filled out the survey. We could even have used a tool like Clicktale to understand which fields in the form weren’t being completed. Our bad.

What it means is that the people who mentioned our book later in the launch process (Avinash, Julien) appear to have followers who are more likely to complete a survey than those who mentioned it early on (Tim, Eric.) It’s not their fault; it’s ours. This should underscore the challenge of Getting Your Shit Together First.

Setting up a Google Analytics dashboard

Finally, we set up a custom report in Google Analytics to compare traffic, survey rates, and the effect each proponent had on our goals. We have data on pretty much the whole campaign; and we know who we have to buy drinks for, and whose blog generates the most traffic.

Some things we can tell from all this:

  • Julien had a tremendous number of followers visit the site (and Chris Brogan, his co-author, helped spread the word) but relatively few of them completed our survey compared to Avinash’s followers.
  • Eric Ries’ followers were much less likely to wonder what an MVC was and click on the picture of the book than Tim’s were.
  • “Uninvited” visitors from Twitter and Facebook clicked on the MVC picture, but Twitter users were far more likely to finish the survey.
  • Ben’s blog drove some traffic, but mine didn’t even make the top ten. Sigh.
  • Google Plus drove traffic our way, mostly due to Tim O’Reilly’s use of it; LinkedIn mentions didn’t drive anything.
  • It turns out the Facebook page did in fact generate quite a bit of traffic to the site.

And so on. Here’s a second custom report, which compares the unique visitors and goal completions to the referring URL.

Sadly, of the 99 people who followed links to Solve For Interesting (see “traffic to our blogs” above) only ten clicked through to the book site; and of those, only one completed the survey. Ben, on the other hand, had eight people do so. In other words, Ben’s followers are eight times as likely as mine to complete an online survey. Ouch.

Finding the unknown unknowns

Avinash likes to point out that Donald Rumsfeld was an analyst.

There are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don’t know. But there are also unknown unknowns—there are things we do not know we don’t know.

Known knowns are simply facts we regurgitate (time on site.) Known unknowns are simply analysis we perform based on things we already believe are important (“which sites send me the most traffic?”) But it’s the unknown unknowns that really matter, because that’s where the magic comes from. Here’s a great session from him at Strata earlier this year.

Admittedly, we have a very small site and a very small amount of traffic. Right now, it’s enough to go through by hand. But as it grows, having the system tell us what’s interesting is critical. Here’s an example of Google Analytics’ “intelligence events”. Basically, Google forms an opinion about what it expects to happen, and then tells us when that opinion was wrong—when something was unexpected.

In this case, Google expected 1.18 to 4.74% of visitors to complete the survey, but on the 14th, 9.4% of visitors did so. We can then drill into this to see what provoked the change. (Note that we’re lying a bit here to make a point—the increase in exits from this page was almost certainly due to us changing the survey landing page halfway through the launch process. But the point is still valid: often, the most interesting thing that happens is a deviation from your expectation.)
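
You don’t need Google to apply the underlying idea. This isn’t Google’s actual algorithm, but a simple expected-range check on your own daily numbers gets you most of the way (the data below is invented):

    # Flag days whose survey-completion rate falls outside an expected range
    # built from recent history (mean +/- 2 standard deviations).
    from statistics import mean, stdev

    daily_rate = [2.1, 3.0, 2.4, 1.8, 2.9, 3.3, 2.2, 9.4]  # % of visitors per day
    history, today = daily_rate[:-1], daily_rate[-1]

    low = mean(history) - 2 * stdev(history)
    high = mean(history) + 2 * stdev(history)

    if not low <= today <= high:
        print(f"unexpected: {today}% vs expected {low:.1f}-{high:.1f}%")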

One more piece of eye candy: We can further segment visitors by other dimensions and find out more detail. For example, when we split our four proponents’ traffic into new versus returning visitors, we find out that Julien’s returning visitors are much more likely to complete the survey.

(Again, these results are skewed because Tim and Eric didn’t have the benefit of the survey landing page to track the people they sent. We have 572 people in the mailing list, but only 186 visits to the survey landing page—clearly, Tim and Eric were responsible for a whole lot of survey enrolment. But you get the point.)

So what?

Right now, the thing we care most about is the followers and survey respondents we have. We want people’s attention (and nearly 3,000 of you have visited the site, so that’s good.) But we also want the permission to reach out to you in order to ask you questions, send you updates, and ultimately, have you buy the book or pay to see us speak. For us, that’s the One Metric That Matters.

We know, from this data, that we should work on Facebook, Julien, and Ben’s blog for traffic. And because we know who sent us which visitors, as we write things we’ll learn whose followers like what kinds of content—which means we’ll be able to ask different proponents to promote different content based on what their followers seem to like.

It’s early days for the blog, but we’ve got a pretty solid foundation in place with which to understand our audience in the coming months.

Putting it all together

We put all of this together into an infographic. It’s awesome. But we want something in return. We want you to sign up.

So put your email in the signup form below, and we’ll send you a beautiful, informative, no-holds-barred look at the campaign from start to finish.

You’ll also get future updates from us, and a bunch of chances to participate in the creation of the book by sharing your stories, taking surveys, attending regional events, and more.


Note: If you’re already a subscriber to our list, we’ll be sending the infographic in a couple days.

Lean Startup Machine Montreal

On Saturday, I presented at the Lean Startup Machine event in Montreal, along with Jeremy Edberg (Reddit, Netflix, eBay) and some others. Several of the presenters were in town for the International Startup Festival.

It was the first time I’d talked about the One Metric That Matters (OMTM) in public, and the feedback from the audience was good. Admittedly, these folks were in the very early stages of their startups, spending the weekend getting out of the building in order to understand what was going on, so some of the later, more strategic, metrics weren’t as relevant.

We’ll get into what makes a good OMTM shortly; for now, here’s the deck, fresh from our shiny new Slideshare account.