When to Stop Blindly Following Your Platform Data: From Search Console to Facebook

Chloe Stoker, Client Partner

As client partners at Portent, we are constantly looking at the results we drive for our clients. We also talk a lot about fearless accountability for remarkable results. But what happens when the KPIs you’re getting from advertising or analytics platforms don’t tell the full story? Whether it’s attribution bias from on-platform data, or a shortcoming in modeling the fully loaded cost of a sale, it’s a roadblock to measuring what’s actually working for our clients, and it’s a problem that requires critical thinking and constant vigilance.

Similarly, as analytics and ad platforms struggle with how much data they can and should share, we’ll see places where they come up short in explaining acquisition paths, user behavior, etc.

Here are a few recent examples I hope you can apply in your own work, and use to think about where else to bring a critical eye to your raw conversion and cost data.

Questioning Google Search Console’s Reliability? Confirm with a Third-Party Tool

I recently asked some other smart folks at Portent about cases where data or reporting was leading our clients astray. Zac Heinrichs, our SEO Team Lead, shared a situation he encounters quite often: Google Search Console's data being unreliable.

While Google Search Console (GSC) provides insight into what Google has crawled and seen on your site, its crawl is often not as up to date as the live version of the site. For example, GSC may insist that 500 URLs are broken on your site when you know for a fact that they've been fixed.
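When GSC insists that long-fixed URLs are still broken, one quick sanity check is to re-request those URLs against the live site and keep only the ones that still error. A minimal sketch of that idea (the URL and User-Agent string below are placeholders for illustration, not anything from Search Console itself):

```python
# Hedged sketch: re-check URLs that Search Console reports as broken
# against the live site, so stale errors can be filtered out.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def recheck(urls, timeout=5):
    """Return {url: live HTTP status code, or None if unreachable}."""
    results = {}
    for url in urls:
        req = Request(url, method="HEAD", headers={"User-Agent": "recheck/0.1"})
        try:
            with urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status
        except HTTPError as err:
            results[url] = err.code   # e.g. 404 if it really is still broken
        except OSError:
            results[url] = None       # DNS failure, timeout, etc.
    return results

# Keep only URLs whose live status is still an error (4xx/5xx) or unreachable.
live = recheck(["https://example.com/fixed-page"])  # placeholder URL
still_broken = {u: s for u, s in live.items() if s is None or s >= 400}
```

Feeding in GSC's broken-URL export and diffing against `still_broken` turns "500 broken URLs" into the (usually much shorter) list that actually needs attention.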

Also misleading is GSC's inclusion of any external link to the site, whether it points to an actual page or not. For instance, if someone links to a misspelled version of your blog's URL, Search Console will include that in its report of links to your site. Zac laments that Search Console provides "all the things they found wrong with your site across all of space and time." This creates a problem, especially for larger sites, in filtering out the erroneous, non-broken pages.

However, Zac explains that "the onus isn't on Google Search Console" to update its data and make our lives easier. Instead, he insists on balancing this free service with insights from other tools that have fresher data. Using a combination of SEMrush, Ahrefs, and Moz will give you more accurate data, as well as different angles from which to assess and cross-check your site.

So, can Google Search Console be trusted? The answer is a resounding: “Yeah, sometimes. It depends what you need it for.”

Google Analytics Leading You Astray? Try Log Files

Often, our clients come to us unable to understand why their website isn't being indexed. Search engine spiders eventually exhaust their crawl budget and move on, and they'll do so even sooner if your site structure looks like this:

[Figure: a tangled crawl tree illustrating a confusing site structure]

Your first inclination may be to open Google Analytics to diagnose the issue. But remember it’s Google’s crawler that has figuratively thrown its hands in the air and accepted defeat. In this scenario, Google Analytics isn’t going to provide a lot of useful information, since you’re working with partial data by virtue of the original crawlability problem.

Log Files To The Rescue

Instead, consider accessing your website's log files. Thorough to the point of obsessive, log files record every request made to your site. When Google has long since given up on your website, log files are still there. First time reading log files? Read Ian Lurie's guide to decoding log files.
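If you want a taste of what that analysis looks like, here's a minimal sketch that pulls Googlebot's requests out of a log and tallies the status codes and paths it actually saw. It assumes the common Apache/Nginx combined log format, and the sample lines are fabricated for illustration:

```python
import re
from collections import Counter

# Minimal parser for combined-log-format lines (Apache/Nginx style).
# Adjust the pattern if your server logs a different format.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Count status codes and crawled paths for Googlebot requests."""
    statuses, paths = Counter(), Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        statuses[m.group("status")] += 1
        paths[m.group("path")] += 1
    return statuses, paths

# Fabricated sample lines; in practice, iterate over the real log file.
sample = [
    '66.249.66.1 - - [10/Oct/2019:13:55:36 -0700] "GET /blog/ HTTP/1.1" '
    '200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2019:13:56:01 -0700] "GET /old-page HTTP/1.1" '
    '404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
statuses, paths = googlebot_hits(sample)
print(statuses)  # which status codes Googlebot actually received
print(paths.most_common(5))  # which URLs it crawled most
```

Unlike analytics tools, this works even when Googlebot never renders a page; every request it makes still lands in the log.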

During my first log file analysis with our SEO team I wondered: why don't we use these more often? Why are they so underutilized? I realized that in an industry that values speed of execution, accuracy sometimes comes in a not-so-close second. I like to think of log files as the tortoise in The Tortoise and The Hare, where slow and steady wins the race.

Facebook Lead Costs and Quality – Consider a Manual or Offline Cost Reconciliation

Enough harping on Google; let's pick on Facebook.

A caveat: this is really a parable about taking stock of lead quality. And a caution that as marketers we need to consider how much time and money it takes to convert leads into customers, rather than simply following the scoreboard of converted leads on any platform.

Alex DeLeon, Director of Search & Social at Portent, shared a recent example where our team was initially misled by eCPA data on a Facebook lead gen campaign, in a way that could've led to a lot of waste if left unchecked.

The Scenario: Alex and his team built a lead-generation funnel for a client, advertising primarily through social media. The team identified three basic levels of lead readiness or quality that mapped roughly to where leads were in the customer journey and how much information they provided during an initial conversion:

  • Lead (a user who simply clicked on the ad through to our client’s website)
  • Qualified Lead (a user who provided their name and e-mail address)
  • Phone-Qualified Lead (a user who overtly requested to be contacted by a representative)

Using Facebook data to assess the costs for leads at each level allowed the team to play with optimizing their spend to drive the highest number of leads and converted customers at the lowest cost.

Initial Lead Cost / Conversion Rate to Paying Customer = Effective Cost Per Acquisition

Intuitively this seems sound, and it should look familiar to any marketer who's evaluated the trade-off of adding more or fewer required fields to a form for conversion rate optimization, for instance.

What the team and the client nearly failed to build into the model was that while Facebook was accurately reporting the cost per lead at each level, the landed lead cost was inflated by inside sales team fees and the cost of actually speaking with a representative. It wasn't simply that a smaller percentage of lower-intent users were going on to purchase. It actually cost the company significantly more to convert each one.
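To make that gap concrete, here's a hedged sketch of the reconciliation. The tier names echo the levels above, but every number is invented for illustration, not the client's actual data:

```python
# Illustrative sketch: platform-only eCPA vs. fully loaded ("landed") eCPA.
# All costs and rates below are hypothetical.

def effective_cpa(lead_cost, conversion_rate, handling_cost=0.0):
    """Cost per lead (plus any per-lead offline handling cost, e.g. inside
    sales time) divided by the lead-to-customer conversion rate."""
    return (lead_cost + handling_cost) / conversion_rate

tiers = {
    # tier: (Facebook cost/lead, lead->customer rate, offline cost/lead)
    "lead":            (1.00, 0.02, 8.00),  # cheap, but sales spends time qualifying
    "qualified":       (6.00, 0.08, 4.00),
    "phone_qualified": (20.00, 0.20, 1.00),
}

for name, (cost, rate, handling) in tiers.items():
    platform_only = effective_cpa(cost, rate)
    landed = effective_cpa(cost, rate, handling)
    print(f"{name:16} platform eCPA ${platform_only:7.2f}  landed eCPA ${landed:7.2f}")
```

With these made-up numbers, the platform-only view says the cheapest tier wins ($50 vs. $100 per acquisition), but once offline handling costs are folded in, the ranking flips ($450 vs. $105). That's the shape of the trap Alex's team caught: technically accurate platform data, wrong conclusion.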

The solution was to work more closely with the client to manually pull and report the landed lead cost, rather than relying on technically accurate but misleading platform data. Ultimately this story had a happy ending, and the client is running lead generation on social to this day. However, Alex admits they were so confident in Facebook's abilities that they didn't question the data at first glance.

The Lesson

Looking back personally, I realize that beyond building my digital marketing foundation, the training I went through for platforms like Google and Facebook built a strong sense of loyalty to these companies. (Turns out content marketing might really work!) When I needed guidance on industry changes, I looked to Google’s most recent update. When I needed performance insights, I went to Google Analytics without hesitation. While these and other platforms provide valuable training and line of sight, they aren’t the infallible source of truth that they’re often taken for.

To be clear, I’m not telling you to stop using these platforms or their analytics altogether. I’m simply recommending you put down the Kool-Aid, explore some other options, and sanity-check your lead gen cost models to make sure they tell the whole story of customer acquisition cost.


See how Portent can help you own your piece of the web.
