In part one, we discussed the long-term effects of personalized internal and external search functionality, which combine to produce increasingly irrelevant individualized results over time. To enhance the contextual relevancy of large, complex Web sites that have historically diluted their relevancy because they sell too much stuff, online retailers have embraced content management systems that help refine the navigational ontology of the goods and services they sell.
Meanwhile, the major search engines have expanded their collection of contextual suggestion boxes to help readily refine search queries that have lost their singular sense of relevancy. All of these factors have united to transform the way we searchandise large database-driven Web sites.
(If you still don’t know what searchandising means, please take a moment to read part one. The concept of searchandising will be crystal-clear. Really, it will.)
In part two, we drilled down to examine what happens when the major search engines meet guided navigation and extreme pagination — two byproducts of the quest for relevancy. As an example, we reviewed how a single database is sliced and diced into 25-plus internal site search results pages, otherwise known as guided navigation, which further fragments contextual relevancy by introducing pagination schemas that produce duplicative site bloat.
(If you don’t understand the concept of how guided navigation contextually shatters large Web sites and can’t get your head around the idea of extreme pagination constructs, please read part two. You’ll be glad you did, because today’s article will actually make sense.)
Remember, search engine spiders hate bloated Web sites. It takes longer to crawl distended content, and search engine spiders thrive on efficiency. Search engine spiders want to get in and out of a Web site quickly. (Yes, spiders feel desire.) Bloated sites tend to have smaller portions of their content crawled on each spider's visit. (Spiders don't overeat.) Abbreviated crawl patterns mean that the entire site takes longer to crawl and index. This results in stale indexation levels. (Spiders like fresh meat.) Out-of-date information usually gets fewer visitors, and in consequence so begins the downward spiral of search referrals, because the content has lost its click appeal. It's a sad, tragic story that ends in a Web site's slow, lingering death.
Here’s the great news: You can stop relevancy leaks by sticking your finger in the database dike to hold back the flow of irrelevant contextually deprived search results. All you have to do is pick and choose the internal search results pages that have the greatest relevancy. Hint: they usually make the most money.
Start with some intensive keyword research, focusing first on those terms and phrases that are already driving serious volumes of search referrals your way. Bind these words to conversions. In other words, know which words are associated with critical search referrals. These are the words and phrases that make you money. Now expand the list of words and phrases to include terms you'd like to make money on. Compare these lists to your navigational ontology, adjust where possible and poof! Your site is contextually optimal.
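The keyword-to-conversion binding described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the keywords, visit counts, and revenue figures are invented, and your analytics export will have its own format.

```python
# Hypothetical sketch: bind search-referral keywords to conversions.
# All keywords and numbers below are illustrative assumptions.
from collections import defaultdict

# (keyword, visits) from a search-referral report
referrals = [("red widgets", 1200), ("blue widgets", 800), ("widget parts", 150)]

# (keyword, revenue) from conversion tracking
conversions = [("red widgets", 3400.0), ("widget parts", 90.0)]

revenue_by_keyword = defaultdict(float)
for keyword, revenue in conversions:
    revenue_by_keyword[keyword] += revenue

# Rank keywords by revenue per visit to surface the money terms
ranked = sorted(
    ((kw, revenue_by_keyword[kw] / visits) for kw, visits in referrals),
    key=lambda pair: pair[1],
    reverse=True,
)
for keyword, revenue_per_visit in ranked:
    print(f"{keyword}: ${revenue_per_visit:.2f} per visit")
```

The top of that ranked list tells you which results pages deserve the optimization effort; the terms you'd *like* to make money on can then be appended with a target revenue-per-visit and compared against your navigational ontology.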
OK, it’s not quite that easy, but you get the idea. In its simplest form, natural search engine optimization is a three-step process (that sometimes leads to a 12-step program). First, get your site crawlable. Second, focus on optimizing the content contained in the site. And finally, link build to critical parts of your Web site until the cows come home. (For all you non-farmers’ daughters, that means infinity.)
So you can enhance the contextual relevancy of critical category pages within a complex database-driven Web site by understanding what keywords and phrases drive your revenue. But you still need to contend with that wonky pagination scheme that’s killing your crawl equity. There are a couple of things you can do about that:
- Adjust your pagination scheme to include more stuff. Search engine spiders tend to crawl to page three of any 1-2-3 or next-next-next pagination scheme, so do the math. If you limit your pagination scheme to six items per page, then 18 product pages could potentially be indexed. But if you expand your pagination scheme to 12 items per page, then you create the opportunity to have 36 product pages indexed. Just remember to keep users in mind. If longer, scrolling pages don't convert as well as shorter, non-scrolling pages, then you'll need to strike a balance between what converts and what doesn't.
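The back-of-the-envelope math in that bullet reduces to one multiplication. A tiny sketch, assuming (as the article does) that spiders reliably follow "next" links only three pages deep:

```python
# Sketch of the pagination math above. The three-page crawl depth is
# the article's rule of thumb, not a guarantee from any search engine.
CRAWL_DEPTH = 3  # results pages a spider typically follows via "next" links

def indexable_products(items_per_page: int, crawl_depth: int = CRAWL_DEPTH) -> int:
    """Product pages that could be discovered given a pagination scheme."""
    return items_per_page * crawl_depth

print(indexable_products(6))   # 6 items/page * 3 pages = 18
print(indexable_products(12))  # 12 items/page * 3 pages = 36
```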
- Enhance the internal and external context of the content. Individual product pages naturally tend to be keyword rich and well-targeted. The problem is that category and even subcategory pages can be contextually scattered. You can:
- Nofollow navigational links that lead to suboptimal results pages, knowing that the words used in those anchor text links can still dilute the general keyword theme.
- Set up XML sitemap feeds that help accentuate those pages you want indexed by the search engines, knowing that incongruous results are common at this time.
- Just link-build the heck out of pertinent pages, knowing that strategically executed external contextual embellishments take time.
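The XML sitemap option above is the most mechanical of the three. Here is a minimal sketch of generating a sitemap that accentuates the pages you want indexed; the URLs and priority values are illustrative assumptions, and `<priority>` is only a hint to the engines, not a command.

```python
# Minimal sketch: a sitemap that nudges spiders toward money pages.
# Example URLs and priority values are hypothetical.
from xml.sax.saxutils import escape

def build_sitemap(urls_with_priority):
    """Return sitemap XML, giving critical results pages a higher priority hint."""
    entries = "\n".join(
        f"  <url><loc>{escape(url)}</loc><priority>{priority:.1f}</priority></url>"
        for url, priority in urls_with_priority
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

sitemap = build_sitemap([
    ("https://example.com/widgets/red", 0.9),    # high-converting category page
    ("https://example.com/widgets?page=7", 0.2), # deep, low-value paginated page
])
print(sitemap)
```

Feeding the engines a list weighted toward your revenue-driving category pages is exactly the "accentuate those pages you want indexed" move — just remember the article's caveat that incongruous results are common.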
No matter which path you take to optimize a large database-driven Web site, hopefully you're now a bit more enlightened about the journey. Natural search engine optimization is an outing that requires intensive introspection and analytical extrospection, along with a little astral projection. Enjoy your adventures in searchandising as part of the trip and you'll get a great ride from the search engines, every step of the way.