Understanding multi-keyword targeting, XML feeds, and the differences between paid inclusion and crawler-based search engines.
The first column in this series looked at some of the marketing challenges and false starts of Horse-Threads.com, a dressage and jump seat supply store. Dennis Buchheim, general manager of paid inclusion at Yahoo (the parent company of Inktomi), offered some guidance on how owner Kelly Springer could use paid inclusion to grow her business.
Dennis offered thoughts on keyword targeting and why paid inclusion makes sense for search engines and advertisers alike. He begins with this premise:
"We definitely encourage sites to use XML feeds because feeds generally drive the best user experience and the best results for advertisers. Feeds foster a relationship between an advertiser and the search engine by supporting a more structured interaction."
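To make the idea of a feed concrete: a paid inclusion feed is essentially a structured file listing the URLs an advertiser wants indexed, along with metadata for each page. The element names and structure below are purely illustrative, a sketch of the general concept rather than the actual schema of Yahoo's or any other program:

```xml
<!-- Hypothetical paid inclusion feed. Element names are illustrative
     and do not reflect any particular program's actual schema. -->
<feed site="http://www.horse-threads.com">
  <page>
    <url>http://www.horse-threads.com/products.asp?id=1042</url>
    <title>Full Seat Breeches | Horse-Threads</title>
    <description>Black full seat riding breeches for dressage riders.</description>
  </page>
  <page>
    <url>http://www.horse-threads.com/products.asp?id=1088</url>
    <title>Leather Riding Crops | Horse-Threads</title>
    <description>Black leather riding crops and schooling whips.</description>
  </page>
</feed>
```

Because the advertiser supplies the URL, title, and description directly, the search engine doesn't have to guess at a page's subject the way a crawler must; that structured interaction is what Buchheim means by feeds fostering a relationship between advertiser and search engine.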
To put this in context: when the "search experience" is improved, the advertiser benefits. When more people use the search engine, click on results, and find what they're looking for, the advertiser is more successful.
What does this mean to someone like Kelly, who promotes a small online horse supply store? In short, when the search engine does a better job producing a relevant, high quality experience, Kelly benefits by receiving relevant clicks from a potentially larger base of searchers. Her listings help her make sales when they're seen by people searching with an intent to buy her products. That only happens if the search engine builds a high quality index that attracts a qualified audience. It's a virtuous cycle.
Yahoo is betting paid inclusion will produce a better index with deeper and higher quality content than its competitors can provide by crawling alone, particularly from dynamically generated sites. In short, Yahoo believes paid inclusion will ultimately produce a better search experience. Whether or not you philosophically agree with the premise, marketers must recognize it's here to stay. Search marketers must consider its implications and uses to maximize return on this necessary and important part of their budget.
Buchheim also points out that the Index Connect paid inclusion program has human editors who oversee quality control and maintain content standards.
Yahoo's own crawler, and crawlers in general, are getting much better at spidering all of a Web site's deep content, Buchheim reminds us. The remaining challenge is dealing with all the noise the crawler encounters in the data.
It's very easy for a human to look at a Web page describing a hotel room and recognize it's about a hotel. A search engine algorithm often cannot differentiate between the navigation, the restaurant, the local attractions, and the hotel information that all appear on that same page.
We've all seen hotel Web sites that discuss how close they are to various attractions: restaurants and local landmarks appear on the same page that describes a hotel property. Equal text may be devoted to describing the amenities at the hotel and the nearby attractions.
The size, positioning, and content of a page's graphics may be the best clues to understanding it: a large photograph of the building, the hotel lobby, and perhaps the guest rooms makes it plain the page is about the hotel. But a crawler cannot really interpret those nuances of graphic size and positioning. The algorithm sees hotel, local attraction, restaurant, weather, and other content on the page as all potentially equal, because similar amounts of copy are devoted to each.
When it comes to indexing large Web sites, search engine crawlers often capture more total information, but not necessarily the most relevant or "right" pages that would improve the search experience.
Google plainly states in their information to Webmasters, "We are able to index dynamically generated pages. However, because our Web crawler can easily overwhelm and crash sites serving dynamic content, we limit the amount of dynamic pages we index." In short, Google will gather some, but not all, dynamically generated Web pages that make up your site.
A paid inclusion program overcomes this crawler limitation by allowing site owners to determine which pages of their Web sites must be included in the index.
Google's Webmaster information page continues by pointing out, "If your site's internal link structure does not provide a path to all your pages, our robot may not see all the pages on your site. Google follows links from one page to the next, so pages that are not linked to by others may be missed."
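The link-following behavior Google describes can be sketched with a toy breadth-first crawl. The in-memory "site" below is hypothetical, a dictionary mapping each page to the pages it links to, but it shows the consequence directly: a page no other page links to is simply never discovered.

```python
from collections import deque

def crawl(site, start):
    """Return the set of pages reachable by following links from start.

    site: dict mapping each page URL to a list of URLs it links to.
    """
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

# Hypothetical site structure for illustration.
site = {
    "/": ["/about", "/products"],
    "/products": ["/products?id=1"],
    "/about": [],
    "/products?id=1": [],
    # "/products?id=2" exists on the server, but no page links to it.
    "/products?id=2": [],
}

found = crawl(site, "/")
# The orphan page "/products?id=2" is absent from `found` -- a
# link-following crawler never sees it, just as Google warns.
```

A paid inclusion feed sidesteps this entirely: the site owner lists the orphan URL explicitly rather than hoping the crawler stumbles onto it.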
Dennis notes that to some, having every single page of a site in a search engine may seem like nirvana. But what if a few hundred products are discontinued? Stale content is a limitation every search engine faces.
"When crawling dynamic content, a crawler isn't always following links," he said. "In fact, a lot of dynamic content isn't linked at all. Instead, the crawler effectively queries the database behind a site and, as a result, some captured pages are inaccessible to someone navigating the site, or perhaps they're out of date."
As mentioned above, Google's own Webmaster information page acknowledges they restrict how many dynamically generated pages they crawl. They also say they may not find content that isn't linked, including some dynamically generated pages. Google would argue it indexes the most important content searchers seek, regardless of who pays a fee, to produce the best results. Yahoo would argue paid inclusion will enable more comprehensive, higher quality content to draw on for search result sets. The winning philosophy will be proven over time. Each has its merits.
Buchheim notes that in some crawler-based search engines, you often find multiple versions of contact pages or help pages in search results. That's often an indication the crawler was unable to recognize the pages are similar. Paid inclusion solves this problem because editors oversee content verification and quality assurance.
Dennis views the paid inclusion channel as one where three separate checks are conducted on all submitted data, something autonomous search engine crawlers cannot claim. "The ability to distinguish unique content is surprisingly difficult. It's invaluable to have editors involved in these assessments," he asserts. "Paid inclusion involves a Web site owner selecting their unique content for inclusion; a search marketing firm or agency putting their reputation and [paid inclusion relationship] agreement on the line in validating that it's unique, high quality content; and Yahoo's own in-house inspection team reviewing the SEM firm's inclusion feeds.
"So three human checks ensure the submitted content is truly unique and valuable. This is why we believe our strategy will ultimately produce search results with the highest relevance and best quality."
Dennis views keyword targeting in terms of "head" and "tail" search terms. The head is comprised of more common or frequently queried words, like "horse" or "tack" in the case of the Horse-Threads site. The tail includes more descriptive and targeted phrases, like "black leather riding crop."
Dennis acknowledges it's very difficult to achieve a top ranking on head search terms because there's so much competition. More important, even when a Web site ranks high on a broad keyword, the conversion rate may be low.
"Addressing head queries is really more focused on branding. Often, the conversions won't be great. But it does give you branding and visibility. It's with the tail that you'll see greater conversions."
In the case of keyword selection for the horse-threads.com site, "breeches," which was queried 1,000 times last month (according to Overture's search term suggestion tool), might be the head of the query. The modifying tail might be "full seat" (a specific type of riding pants). The keyword phrase, "full seat breeches" was queried only 59 times last month, but that increased specificity indicates a greater intent to purchase.
A recent OneStat study revealed 58 percent of all queries are now comprised of two or three words. Add in the percentage of people who search using four- and five-word queries, and the total is well over 75 percent of all queries. This is good news for people using paid inclusion. Often, more specific queries constructed of two-, three-, four- or even five-word phrases, produce search results that reach into the deepest parts of a site's content. That content is often best indexed with the aid of paid inclusion.
For maximum impact, I'd encourage Kelly to add relevant multi-word phrases of three and four words into her title tags and meta data so that, when fed into a paid inclusion feed, the pages attain high rankings on keywords that will drive conversions, not just traffic.
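As a sketch of what that means in markup, a product page's title tag and meta data might lead with the specific multi-word phrase rather than the broad head term. The exact phrases and copy below are illustrative, not Kelly's actual pages:

```html
<!-- Hypothetical product page head, targeting the tail phrase
     "full seat breeches" rather than the head term "breeches" alone. -->
<head>
  <title>Full Seat Breeches for Dressage Riders | Horse-Threads</title>
  <meta name="description"
        content="Shop black full seat breeches and other dressage riding apparel at Horse-Threads.com.">
  <meta name="keywords"
        content="full seat breeches, dressage riding pants, schooling breeches">
</head>
```

When pages marked up this way are submitted through a paid inclusion feed, the descriptive title and meta data give the engine exactly the multi-word phrases that signal purchase intent.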
Kelly has lots of work to do to get Horse-Threads.com ready for primetime, and not just in terms of generating qualified traffic. The site needs a facelift and some usability improvements. In the interest of demonstrating various SEM strategies such as paid inclusion, we'll begin to submit Horse-Threads.com into an XML paid inclusion feed and apply other strategies. Dennis had estimated for Kelly's 250 pages of content, she should budget roughly $250 per month for paid inclusion. Kelly estimates she needs roughly $1,000 in revenue to break even, and twice that to justify the investment. In a future column, I'll report back on her progress, and on the sales produced through paid inclusion.
Fredrick Marckini is the founder and CEO of iProspect. Established in 1996 as the nation's first SEM-only firm, iProspect provides services that maximize online sales and marketing ROI through natural SEO, PPC advertising management, paid inclusion management, and Web analytics services.
Fredrick is recognized as a leading expert in the field of SEM and has authored three of the SEM industry's most respected books: "Secrets To Achieving Top-10 Positions" (1997), "Achieving Top-10 Rankings in Internet Search Engines" (1998), and "Search Engine Positioning" (2001, considered by most to be the industry bible). Considered a pioneer of SEM, Fredrick was named to the Top 100 Marketers 2005 list from "BtoB Magazine."
Fredrick is a frequent speaker at industry conferences around the country, including Search Engine Strategies, ad:tech, Frost & Sullivan, and the eMarketing Association. In addition to his ClickZ columns, he has written bylined articles for Search Engine Watch, "BtoB Magazine," "CMO Magazine," and numerous other publications. He has been interviewed and profiled in a variety of media outlets, including "The Wall Street Journal," "BusinessWeek," "The New York Times," "The Washington Post," "Financial Times," "Investor's Business Daily," "Internet Retailer," and National Public Radio.
Fredrick serves on the board for the Ad Club of Boston and was a founding board member of the Search Engine Marketing Professional Organization (SEMPO). He earned a bachelor's degree from Franciscan University in Ohio.