I've been quite the SEO traveler this year, speaking on optimization and search usability in Germany, England, China, and Ireland (my current favorite). Though I greatly admire and respect my colleagues for spreading the word about effective SEO, I'm also concerned.
In some cultures, it's considered impolite and unprofessional to disagree with colleagues in a public business setting. So I generally hold my tongue at these conferences. But now I'm back in the States for a short time and no longer keeping quiet about some of the inaccurate material presented by my colleagues.
Interestingly, some myths and misconceptions about SEO won't go away here in the U.S., either. This week's column dispels three myths I commonly hear worldwide.
High Spidering Frequency Equals High Rankings
At conferences, my jaw hits the floor every time I hear this SEO myth because it's been around for many, many years. Some SEO professionals say they like to change their content every three to five days to achieve higher rankings. Some believe they must add blogs to sites so the fresh content will generate higher rankings.
Basically, this myth is a bad cause-and-effect conclusion. Search engines can crawl a Web page, and that Web page might never make it into the search engine index. The content might be filtered out because it's redundant or of poor quality.
Crawl frequency has nothing to do with the main search results rankings. Many Web pages rank well without having a high crawl frequency. Remember the fundamental principles of optimization.
Those principles are what affect search engine rankings. Sure, update your site's content when it needs to be updated. A news or publisher site might need to be updated daily, but not a business-to-business (B2B) site. Both site types can and do rank well and receive qualified search engine traffic over time. But the news site won't rank better just because it has a higher crawl frequency.
Replace Graphic Images With CSS-Formatted Text
On the surface, it seems like sound advice to change all graphic images into CSS-formatted text. For example, if a site is composed of all graphic images or all Flash elements, then modifying those items will certainly make the site more search-engine friendly.
However, this advice is simply inaccurate. Search engines want what users want. And though it might be difficult for many people to believe, there are many situations where searchers prefer graphic images over CSS-formatted text. In fact, the usability tests, corresponding Web analytics data, and ROI computations I've done since 1995 show a marked preference for sites that consist of both graphic images and CSS-formatted text, including navigational elements.
Graphic images typically generate more clicks due to visual affordance. Sometimes, I honestly believe many SEO professionals and Web developers make that overly simplistic statement because they lack graphic design skills. They also cite usability data about download time as proof of their convictions. (Note: you might want to research actual vs. perceived download time before creating usability tests.)
Trust me when I say reading books and newsletters written by Jakob Nielsen, Jared Spool, Eric Schaffer, and others doesn't make one a usability professional. Though I highly respect these gentlemen's work, even they would support me on this.
You have to actually conduct usability tests to draw conclusions. Many SEO professionals like to pay lip service to Web site usability, and kudos to them for giving a persuasive sales pitch. But that's all it is -- a pitch. When your SEO firm talks about search usability, make sure it actually performs usability tests and analyzes the results accurately. Make sure it's using the appropriate tests.
Additionally, Web developers shouldn't test their own designs; they aren't objective enough. Heck, I never test my own designs. That's why I have a director of usability. I test others' designs, though.
Web 2.0 and SEO
OK, Web 2.0 evangelists, pay attention. I understand your enthusiasm and zeal for Web standards and new technologies. Really, I do. But to make the bold statement that Web 2.0 sites rank better than non-2.0 sites? Completely false.
Many SEO professionals and Web developers who make this statement are merely trying to promote their unique selling proposition (USP), to separate themselves from other design firms. I understand that. I'm sure every professional marketer on the planet understands that.
Go back to those fundamental principles. Repeat them. Make them a mantra. Tattoo them somewhere on your body, if that's what it will take. I love many Web 2.0 sites and create them as well. But making inaccurate cause-and-effect statements as a USP just to close a sale? Maybe the SEO professionals honestly believe those statements and that's why they make them.
All SEO professionals must test, test, test. It takes time, patience, and objectivity to accurately test and analyze results. Unfortunately, in our I-want-it-yesterday Internet world, time is often a luxury. Objectivity is sacrificed when people want to believe so badly that their designs and methodologies are correct. Some Web 2.0 goodies haven't been around long enough to be truly tested and analyzed accurately.
Meet Shari at Search Engine Strategies on June 12-13 in Toronto.
Shari Thurow is the founder and SEO director at Omni Marketing Interactive, a full-service search engine marketing, Web, and graphic design firm. Acknowledged as a leading expert on search engine friendly Web sites worldwide, she is the author of the top-selling marketing book, "Search Engine Visibility," published through Peachpit Press. Shari's areas of expertise include site design, search engine optimization, and usability.