Direct Distribution vs. Data Aggregators: Part 1


It’s long been accepted that distributing citations across hundreds of sites will increase organic rankings for individual business locations. As Location3 Chairman Andrew Beckman noted last month, it was almost ten years ago that David Mihm lauded such a strategy, deeming citations the “new link”. Despite major shifts in Google’s algorithm and several changes in how it displays search results, Mihm’s conclusions are still held up today as a guiding principle of local SEO.

The thought goes something like this: It’s important for brands to pay data aggregators, which push listings data out to other citation sources at different frequencies. These aggregators are better equipped to generate more citations, and because the volume of citations indicates relevance and authority, brands will be rewarded with considerable gains in organic search rankings.

It’s perplexing that agencies and brands continue to cling to this outdated practice — even after considerable evidence that an aggregator-dependent strategy has little to no bearing on organic rankings in the major search engines.

So why are really smart agencies and brands still using a strategy that’s flawed at best and inept at worst? Well, because it’s easy. Submitting to data aggregators is a simple task that allows the SaaS platform, the agency, and/or the brand CMO to check a box and enjoy the peace of mind that comes with: “We have local listings covered.”

They are then free to move on to their next marketing objective. And there’s something to be said for that progression in today’s marketing landscape, where DSPs, native ad formats, updated algorithms, IoT, emerging paid social opportunities, and the transition from traditional ad formats to a digital model form a pot-holed path to success.

I understand that covering all of your bases is satisfying, but I’m confident that these agencies and CMOs didn’t get to their positions by cutting corners. It’s more likely that they worked hard, paid attention to the details, and were quick and savvy adopters of innovation. I encourage these professionals to rediscover that enthusiasm and not simply settle for less when it comes to listings success and local SEO.

I mentioned that an aggregator-dependent strategy has largely been discredited by industry experts as well as by our own research at Location3. Let’s take a look at a study conducted by local SEO leader Darren Shaw. This study, presented at MozCon Local 2016, compares manual citation creation to submitting through each data aggregator.

The results of Shaw’s study, in which he submitted listings data for seven “new” businesses to several major aggregators, showed minimal impact. Of the three data aggregators that charge brands a fee to correct data in their databases, Acxiom fared the worst — generating just 1 citation despite partnering with 100+ sources of listings content. To be fair, the other two aggregators didn’t perform much better: in 3 months, one generated just 10 citations and the other only 4.

Now, let’s compare those results to the results of a managed strategy executed by Shaw’s team at Whitespark. Over four months, the managed strategy resulted in 49 relevant citations — more than double any aggregator in the study. Perhaps more importantly, Shaw concluded that “citations alone won’t have much of an impact on rankings.” When we consider the cost to use these aggregators as part of a bulk listings management program, it’s difficult to find any logical argument for employing an aggregator strategy.

It’s important to note that these conclusions are not only supported by Shaw’s study, but also by comments directly from Google employees, as well as our own testing results.

To view Part 2 of this post and dive into our test results and insights, CLICK HERE.
