How and why the SEO tools industry should develop technical standards
Could SEO benefit from a collaborative effort to establish industry standards for SEO software? Contributor Michael King discusses the value this would have -- as well as the challenges it would present.
The SEO technology space could benefit tremendously from the establishment of technical standards. Implementation of Google’s own specifications is inconsistent within our tools and can lead less-experienced SEOs to believe their sites are in better shape than they are.
In the same way that the W3C rallied around the definition of protocol standards in 1994 and the Web Standards Project (WaSP) standardized coding practices in 1998, it’s our turn to button up our software and get prepared for what’s coming next.
Stop me if you’ve heard this one. On December 4, I received an email from DeepCrawl telling me that my account was out of credits. That didn’t make any sense, though, because my billing cycle had just restarted a few days prior — and, frankly, we haven’t really used the tool much since October, as you can see in the screenshot below. I should still have a million credits.
Logging in, I remembered how much I prefer other tools now. Noting the advancements that competitors like On-Page.org and Botify have made in recent months, I’ve found myself annoyed with my current subscription.
The only reason I still have an account is that historical client data is locked in the platform. Sure, you can export a variety of CSVs, but then what? There’s no easy way to move my historical data from DeepCrawl to On-Page or Botify.
That is because the SEO tools industry has no technical standards. Every tool has a wildly different approach to how and what they crawl, as well as how data is stored and ultimately exported.
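To make the portability problem concrete, here is a minimal sketch of what a shared export schema could enable. Everything in it is hypothetical — the field names, the per-tool column mappings, and the sample exports are invented for illustration and do not reflect any vendor's actual format:

```python
import csv
import io

# Hypothetical common schema: each crawl record reduced to a shared set of fields.
COMMON_FIELDS = ["url", "status_code", "title", "indexable"]

# Invented per-tool column mappings; real exports differ in naming and structure.
TOOL_MAPPINGS = {
    "tool_a": {"url": "Address", "status_code": "Status Code",
               "title": "Title 1", "indexable": "Indexability"},
    "tool_b": {"url": "page_url", "status_code": "http_status",
               "title": "page_title", "indexable": "is_indexable"},
}

def normalize_export(csv_text, tool):
    """Map one tool's CSV export onto the common field names."""
    mapping = TOOL_MAPPINGS[tool]
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{common: row.get(source, "") for common, source in mapping.items()}
            for row in reader]

# The same crawled page, exported by two hypothetical vendors:
export_a = ("Address,Status Code,Title 1,Indexability\n"
            "https://example.com/,200,Home,Indexable\n")
export_b = ("page_url,http_status,page_title,is_indexable\n"
            "https://example.com/,200,Home,true\n")

records_a = normalize_export(export_a, "tool_a")
records_b = normalize_export(export_b, "tool_b")
```

With an agreed schema, the mapping layer disappears entirely: every tool would import and export the same record shape, and moving years of crawl history between platforms would be a file transfer rather than a migration project.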
Opinions expressed in this article are those of the guest author and not necessarily MarTech Today.