At major brands and websites, technical SEO sits on a long list of developer responsibilities, frequently far below the threshold of serious attention. Even though many in our industry have learned the dangers of ignoring search engines' crawling and ranking guidelines, many Fortune 1000 companies remain skeptical. So this is a good opportunity to review some common SEO mistakes.
Ignoring Useful Keywords
In my estimation, only around a dozen of the Fortune 500 companies use proper keyword research and targeting; the rest entrust page content and title tags to "creative ad writers."
Image-based and Flash Content
The content search engines need most, along with site navigation, is painfully concealed in files that spiders can't read. Despite earlier assurances that engines would soon be able to crawl Flash content (or read the text embedded in images), we still appear to be a long way off.
URL Canonicalization Issues
Canonical content goes unrewarded when "print-friendly" versions, multiple navigation paths leading to the same page, and plain duplication split a page's link equity across several duplicate URLs.
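As a minimal sketch of the idea, duplicate URL variants (mixed-case hosts, trailing slashes, tracking or "print" parameters) can be collapsed to a single canonical form before linking or redirecting. The parameter names below are assumptions for illustration; a real site would tune the list to its own URL scheme.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical duplicate-creating parameters; adjust for the site in question.
DUPLICATE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "print"}

def canonicalize(url: str) -> str:
    """Map duplicate URL variants to one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()
    # Drop duplicate-creating parameters; keep the rest in a stable order.
    params = sorted(
        (k, v) for k, v in parse_qsl(query) if k.lower() not in DUPLICATE_PARAMS
    )
    # Collapse trailing slashes so /page and /page/ resolve identically.
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((scheme, netloc, path, urlencode(params), ""))

print(canonicalize("https://Example.com/article/?utm_source=ad&id=7"))
# https://example.com/article?id=7
```

A 301 redirect from every variant to the canonical URL (or a canonical link element) then tells engines which copy should accumulate the links.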
Distribution of Content & Partnerships
Many large content owners have canonicalization problems on their own websites and also license their content to dozens (sometimes hundreds) of outlets. The only thing worse than having six copies of your material on your own website is having six versions on six large, influential websites.
Requiring Cookies or Session Variables
If even a spider needs a cookie drop to get through, someone else will be getting your traffic. Big sites that don't give spiders a way to access their content are asking for trouble.
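The fix is to make the cookie optional rather than a gate. The handler below is a hypothetical sketch (the function names and cookie key are illustrative, not from any real framework): content is served even when no session cookie is present, so cookie-less clients such as spiders still reach the page.

```python
# Hypothetical request-handler sketch: serve content even with no session
# cookie, instead of bouncing cookie-less clients (like spiders) through a
# set-cookie redirect where the content is never reached.

def handle_request(path: str, cookies: dict) -> tuple[int, str]:
    session = cookies.get("sessionid")
    if session is None:
        # The anti-pattern would be: return (302, "/set-cookie?next=" + path)
        # Instead, serve the page; personalization falls back to defaults.
        return (200, render_page(path, user=None))
    return (200, render_page(path, user=lookup_user(session)))

def render_page(path: str, user) -> str:
    who = user or "anonymous visitor (or crawler)"
    return f"<h1>{path}</h1><p>Served to {who}</p>"

def lookup_user(session: str) -> str:
    return f"user-{session}"  # stand-in for a real session store

status, body = handle_request("/pricing", cookies={})
print(status)  # 200 — a crawler with no cookies still gets the content
```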
Restricting Content Access
The NY Times, The Economist, and Salon.com don't see anywhere near the link-popularity growth of their more permissive rivals. Requiring registration or paid access means far fewer visitors will bother to link, even when you let the spiders through.
Creating Multiple Sites
Rather than launching projects under their core domain, big firms seem to take pride in releasing six new websites every time their ad agency changes the campaign tagline.
Unfamiliar With the Sandbox
One of the most fascinating people I met at the last Pubcon was a "search proxy architect," whose job is to work around big brands' spider-unfriendly websites by producing alternative pages for search engines to crawl and index. The engines tolerate this sophisticated form of cloaking because they would rather be able to crawl the content from these sites than have it dropped from the index. I hadn't realized how widespread the technique was; "ethical cloaking" turns out to be far more common than you might expect. Unfortunately, I can't share the cases I'm aware of, but if you can, please do so in the comments.
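In its simplest form, the search-proxy idea can be sketched as user-agent detection: known crawlers get a plain, crawlable HTML rendering of content that is otherwise locked behind Flash or scripts. The user-agent tokens and page bodies below are illustrative assumptions, not a real implementation, and whether any given use of this pattern is acceptable depends on each engine's guidelines.

```python
# Simplified sketch of "ethical cloaking": serve crawlers a plain HTML
# version of content that regular visitors see inside a spider-opaque
# Flash object. Tokens and markup are illustrative only.

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # assumed examples

def is_crawler(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def serve(path: str, user_agent: str) -> str:
    if is_crawler(user_agent):
        # Plain, link-rich HTML version of the same content for spiders.
        return f"<html><body><h1>{path}</h1><p>Full article text…</p></body></html>"
    # Regular visitors get the interactive (spider-opaque) experience.
    return f"<html><body><object data='{path}.swf'></object></body></html>"

print(is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
```

Note that both branches serve the *same* content; the engines' tolerance hinges on the proxy exposing what already exists, not something different.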
So the next time someone asks whether cloaking is ethical or unethical, you can answer, "It depends."
About ESage Digital
At ESage Digital, our technical SEO team stays current with the latest practices and techniques in technical optimization.