Staying Afloat: How to Thrive in Deep Data

FortyFour's Ryan Anderson has toiled tirelessly behind the scenes, using data to drive success. Here's his story.

Digital agency FortyFour sits in a former industrial building just a short walk from Atlanta’s Historic Fourth Ward Park. On a clear blue day, the surrounding bistros and bars spill over with patrons eating outdoors while the agency’s neighbor, King of Pops, serves up gourmet popsicles to the park’s joggers and dog walkers.

Ryan Anderson, director of analytics at FortyFour, sits near the agency’s kitchen as sunlight pours through the window behind him. He feels lucky to be here.

“After college I started working in supply chain logistics,” Anderson says. “It took about 18 months to realize I was just not very passionate about finding more efficient ways to move boxes.”

Anderson left that job to work for startups and tech companies until Adam Roe approached him. Roe, along with co-founder Thomas Frank, started FortyFour as a website design agency. But once the websites were built, clients wanted to see how they performed.

“FortyFour was becoming more focused on marketing and analytics when I joined on,” Anderson says. “When you’ve got the data, you want to learn how to drive performance. But it was a bit of a transition for me. I had to learn how to pivot between multiple accounts in a workday, and good communication with the client becomes crucial. Once I got through that learning curve, though, I really came to enjoy agency life.”

Secular data-ism

FortyFour has worked for the American Cancer Society, Cartoon Network’s Robot Chicken, and Coca-Cola’s wildly popular “Share a Coke” campaign. Through it all, Anderson has toiled tirelessly behind the scenes, using data to drive success. That means more than collecting analytics after the fact. First and foremost, it means understanding the client’s goals.

“We start with making sure we understand how clients view their business, and what they want it to do,” Anderson says. “Are they trying to grow their retail sales? Optimize their website? What metrics are actually meaningful? We also want to know if they think they’re already getting that data in some shape or form, so we understand where they’re coming from.”

Many of Anderson’s clients are already engaged in some kind of data gathering, whether it’s in the form of a WordPress stats panel or Google Analytics. This kind of data can be useful at the outset, but it’s not strictly necessary.

“We’re platform agnostic: If people like their analytics platforms, we’re happy to work with them,” Anderson says. “The main thing, from our perspective, is determining whether we can answer the questions they have with the data they’ve gathered. If the answer is no, that’s when we start adding more tracking to their website and on platforms like Facebook and Mailchimp.”

But, Anderson says, analytics tools don’t always agree. That’s when it’s important to know when to be concerned—and when not to be.

Nuclear subs and great white whales

Imagine, for a moment, that you ask your coworkers to describe your boss’s car. There may be some disagreement on the model year, or the shade of red. Those kinds of small differences are to be expected. It’s not a problem until somebody says that, actually, your boss drives a nuclear submarine.

“People worry a lot when they see a mismatch between their measurement platform and their database of record,” Anderson says. “Google Analytics, for instance, tends not to match Shopify—and it’s pretty consistent in how it misses. You’ll almost always see a 3 to 5% discrepancy there. But we know that, we expect it, so that doesn’t worry us. It’s when you get a real outlier that there’s a concern.”
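
To make that concrete, here’s a minimal sanity-check sketch, assuming you already have matching daily revenue totals exported from both platforms. The dates, figures, and the 5% threshold are illustrative, chosen to echo the 3 to 5% drift Anderson describes.

```python
# Flag days where two platforms disagree by more than the expected band.
# The 0.05 threshold mirrors the "3 to 5%" discrepancy Anderson calls normal
# between a measurement platform and the database of record.

EXPECTED_MAX_DISCREPANCY = 0.05  # 5%

def discrepancy(a: float, b: float) -> float:
    """Relative difference between two revenue figures, using the larger as the base."""
    base = max(a, b)
    return abs(a - b) / base if base else 0.0

daily_revenue = [
    # (date, measurement platform, database of record) -- illustrative numbers
    ("2023-03-01", 9_700.0, 10_000.0),
    ("2023-03-02", 11_800.0, 12_100.0),
    ("2023-03-03", 6_200.0, 10_400.0),  # a real outlier worth investigating
]

for date, measured, recorded in daily_revenue:
    gap = discrepancy(measured, recorded)
    if gap > EXPECTED_MAX_DISCREPANCY:
        print(f"{date}: {gap:.1%} gap -- outside the expected band, investigate")
    else:
        print(f"{date}: {gap:.1%} gap -- normal measurement drift")
```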

Part of the problem is with attribution. Different marketing platforms have their own ways of counting clicks. Making these platforms get along isn’t easy.

“The general attribution problem is the great white whale of analytics, the problem everyone’s trying to solve,” Anderson says. “It can drive you crazy. But at a certain point, you decide on the formula you’re going to use, and make decisions based on what you’ve got in front of you. Otherwise you can paralyze yourself with data.”
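
As a rough illustration of what “deciding on the formula” can look like, here is a hedged sketch of two simple attribution rules, last-touch and an even linear split. The channels and order value are made up for the example, and neither rule is presented as FortyFour’s actual model.

```python
# Two common ways to "decide on the formula" for attribution.
# The touchpoint path and channel names are purely illustrative.

from collections import defaultdict

def last_touch(touchpoints: list[str], revenue: float) -> dict[str, float]:
    """Give all credit to the final channel before the purchase."""
    return {touchpoints[-1]: revenue}

def linear(touchpoints: list[str], revenue: float) -> dict[str, float]:
    """Split credit evenly across every channel in the path."""
    share = revenue / len(touchpoints)
    credit: dict[str, float] = defaultdict(float)
    for channel in touchpoints:
        credit[channel] += share
    return dict(credit)

path = ["email", "facebook", "organic search", "email"]  # one customer's journey
order_value = 120.0

print(last_touch(path, order_value))  # {'email': 120.0}
print(linear(path, order_value))      # {'email': 60.0, 'facebook': 30.0, 'organic search': 30.0}
```

Whichever rule you pick, the point Anderson makes is to pick one, apply it consistently, and make decisions rather than chasing a perfect model.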

To avoid such a malady, Anderson has a simple solution.

The right metrics for the job

“Which metrics allow you to take action? Those are the ones that matter,” Anderson says. “In e-commerce, for example, everyone wants to track revenue, and for good reason. But in terms of action, there’s no revenue lever I can pull to increase those numbers. Measurement doesn’t translate into action.”

Instead, Anderson says, he directs clients to look at the factors that result in sales.

“Revenue is made up of traffic times conversion rate times average order size, right? And those are metrics you can build strategies around. You can figure out ways to increase those pieces, and that in turn results in more sales,” Anderson says.
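
That decomposition is easy to sanity-check with a quick worked example. The numbers below are hypothetical; the point is that lifting any one factor flows straight through to revenue.

```python
# Sketch of the decomposition Anderson describes:
# revenue = traffic x conversion rate x average order value.

def revenue(traffic: int, conversion_rate: float, average_order_value: float) -> float:
    return traffic * conversion_rate * average_order_value

baseline = revenue(traffic=50_000, conversion_rate=0.02, average_order_value=60.0)
# 50,000 visits * 2% * $60 = $60,000

lifted = revenue(traffic=50_000, conversion_rate=0.025, average_order_value=60.0)
# Raising conversion from 2% to 2.5% lifts revenue to $75,000 with no new traffic.

print(f"baseline: ${baseline:,.0f}  with conversion lift: ${lifted:,.0f}")
```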

Another way to avoid data-paralysis is to pick your metrics and stick with them.

“We tend to have two kinds of clients: One wants numbers for everything—they want as much data as you can give them, they want updates minute by minute, they love it,” Anderson says. “The other only looks at numbers occasionally, and might feel a little uncomfortable with them. In both cases, we recommend that they start a campaign by focusing on 3 to 5 metrics. It’s important to move incrementally in marketing automation, because if you have too many data points, they won’t make sense together, and you end up spending all your time trying to sift out answers to data questions instead of strategizing the things that will impact your business.”

"You need to understand what you’re testing and how you’re testing it."

Like your data, love your brand

With email analytics, Anderson suggests a similar strategy: Prioritize a few metrics related to your goals, and track them diligently.

“Mailchimp allows some pretty deep segmentation, so we can track who’s receiving email, how often, and what actions they’re taking,” Anderson says. “And starting with a few simple metrics still allows for some pretty sophisticated stuff as you gather more feedback. As we see what works with each segment of an audience, we can tailor our messaging to each group.”

As results come in, Anderson gives priority to the data that best informs the next decision. In e-commerce, that can mean the sale matters more than the open.

“Of course we want people to enjoy the email, but ultimately the goal is still to sell things,” Anderson says. “In e-commerce, we’re more concerned with tracking items added to carts, purchases, and amount per sale. We’re less concerned with the open rate numbers because those can be tweaked over time. What’s harder for us to understand early on is who’s reading an email and then actually making a purchase. That’s where the most valuable data comes from.”
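
As a loose illustration of connecting the open to the sale, the sketch below joins per-segment email opens with purchase records to show who opened and then actually bought. The records and field names are assumptions made for the example, not Mailchimp’s data model; in practice the inputs might come from campaign exports and the store’s order history.

```python
# Connect email engagement to purchases, summarized per audience segment.
# All records and field names here are assumptions for illustration.

from collections import defaultdict

email_opens = [  # (subscriber_id, segment)
    ("a1", "repeat buyers"), ("b2", "new subscribers"), ("c3", "repeat buyers"),
]
purchases = [  # (subscriber_id, order_total)
    ("a1", 85.0), ("c3", 42.0),
]

opens_by_segment: dict[str, set[str]] = defaultdict(set)
for subscriber, segment in email_opens:
    opens_by_segment[segment].add(subscriber)

order_totals = dict(purchases)

for segment, openers in opens_by_segment.items():
    buyers = [s for s in openers if s in order_totals]
    revenue = sum(order_totals[s] for s in buyers)
    print(f"{segment}: {len(buyers)}/{len(openers)} openers purchased, ${revenue:.2f} in sales")
```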

Focusing on a few key metrics not only makes strategic decisions easier—it can also strengthen the brand overall.

“You get a lot more leeway from your audience if you’re heartfelt and passionate about your business,” Anderson says. “It’s hard to maintain that level of passion if you’re constantly focusing on every little piece of data, or making changes that improve your results by tiny percentages. You’re much better served in the long run if you’re willing to stand by your brand’s identity instead of allowing it to be dictated by the numbers. That’s how you build long-term affinity between your customers and your brand.”

4 ways to deal with bad data

Sooner or later, it happens to all of us: The data just stops making sense. Maybe your measurement platforms don’t agree on your traffic. Maybe the engagement numbers lead you to try terrible subject lines and nonsensical calls to action. Whatever the particulars, everyone working in analytics eventually hits a snag.

Bad data happens to the best of us. Here’s how to deal with it:

1. Trust your gut (mostly). It can be tempting to run with the data even when it doesn’t make sense, but that can take you down some weird rabbit holes. Instead of denying that there’s a problem with the figures, trust your instincts. “When you’re trying to weed out bad data, some of it is just common sense,” Anderson says. “If something makes no sense at all, you need to look at your data to figure out what’s happening before you change your strategy.”

2. Clarify the question. Fuzzy data can be the result of a fuzzy question or hypothesis. Maybe it’s worth revisiting the basics, or maybe you just need to get feedback from someone who’s not so close to the campaign. Either way, the goal is the same. “You need to understand what you’re testing and how you’re testing it,” Anderson says. That means clear questions, few variables, and as large a sample size as you can get (a rough sample-size sketch follows this list).

3. Find a bigger picture. “If you run a lot of little tests that result in tiny tweaks, you only maximize locally,” Anderson says. “You get to the top of one hill, but you end up missing out on this completely different hill that would require a drastic change to reach.” If you’re spending a lot of time collecting data to execute a strategy that results in only marginal gains, you may not be thinking big enough. Instead of A/B testing subject lines, it may be time to test 2 all-new templates.

4. Admit your assumptions. “Bad data is a little like the problem of marketing attribution,” Anderson says. “When you have disparate information, at some point you have to admit what you’re going to assume and just live in that world for a while.” If you come across conflicting data, it’s okay to go with what fits your initial assumptions—just so long as you’re aware that you’ve made them, and you’re prepared to revisit them later on as more information becomes available.
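
As mentioned in point 2 above, here is a rough sketch of the sample-size side of that advice: an approximate per-variant estimate for an A/B test on conversion rate. The baseline and lift figures are hypothetical, and the formula is a standard two-proportion approximation rather than anything specific to FortyFour.

```python
# Rough per-variant sample-size estimate for an A/B test on conversion rate,
# at roughly 95% confidence and 80% power. Baseline and lift are hypothetical.

from math import ceil

def sample_size_per_variant(p_baseline: float, p_expected: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed in each variant to detect the difference."""
    variance = p_baseline * (1 - p_baseline) + p_expected * (1 - p_expected)
    effect = abs(p_expected - p_baseline)
    return ceil(((z_alpha + z_beta) ** 2 * variance) / effect ** 2)

# Detecting a lift from a 2% to a 2.5% conversion rate:
print(sample_size_per_variant(0.02, 0.025))  # about 13,800 visitors per variant
```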
