Last time, I examined SEM hypochondria, or some marketers’ tendency to worry about site details they can’t control or shouldn’t worry about.
An opposite condition exists, too. While I’m not sure Noah Webster considers hyperchondria a valid term, the less-revered Urban Dictionary defines hyperchondria as the opposite of hypochondria. In other words, “never thinking you are sick, even when you are.”
Still, hyperchondriac accurately describes site owners who refuse to address certain issues no matter how they affect a site’s search performance.
Unique Copy, and Plenty of It
One of the most difficult arguments to make to clients, especially for B2C retail sites, is that the manufacturer-supplied catalog copy isn’t necessarily going to cut it on product pages. “But it’s the ‘official’ copy,” they say. “It’s what the manufacturer wants us to say.”
Problem is, that’s the approach taken by 5,000 other affiliates, retailers, and shopping aggregator sites selling the same merchandise. And when 5,000 sites all use the exact same copy to sell an item, the rankings and traffic naturally gravitate toward the sites that supplement that information with other content audiences appreciate.
Why do some book titles rank well on Amazon.com? Is it because of the publisher’s blurb about the title that each online bookseller is allowed to post verbatim? That’s unlikely. It’s more probable that comments and reviews of each title help generate further attention, and consequently, more unique content.
Targeted Titles, Regardless of Depth
We frequently analyze sites and find that many deep pages, most of which contain genuinely strong content, have no meta description and only the company name for a title. It’s easy to understand why, but it’s imperative to explain to clients why this oversight hurts them.
Consider a “traditional” pyramidal site structure, with a few general pages at the top spreading out to many tightly focused pages. When the entire site is optimized, traffic flows into the site in an “inverted pyramid” pattern. In other words, on a per-page level, more traffic flows into the top pages, driven by shorter, more general query terms. Smaller volumes of traffic, typically stemming from longer, more focused query strings, land on the deeper pages, those with a tighter focus.
Many sites optimize a few levels deep and ignore the potential of their deep, targeted content. When one product category splits into 20 different products, it gets hard to find time to write original titles and descriptions. But it’s time well spent, because the “long tail” takes hold when the many small streams of visits to the wide array of deep URLs eventually overtake the traffic sent to broader, top-level pages. Many companies fail to realize that the traffic sent to deep pages is often more targeted and readier to buy.
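Catching these gaps doesn’t have to be manual. As a minimal sketch (the `audit` helper and the sample markup are hypothetical, not from any particular site), here is how you might flag deep pages that ship with no meta description or an empty title, using only Python’s standard library:

```python
# Minimal sketch: flag pages missing a <title> or meta description.
# The audit() helper and sample HTML below are illustrative assumptions.
from html.parser import HTMLParser


class MetaAudit(HTMLParser):
    """Collects the page title and meta description, if present."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def audit(html):
    """Return a list of problems found on one page."""
    parser = MetaAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing <title>")
    if not parser.description.strip():
        problems.append("missing meta description")
    return problems


# A deep page carrying only the company name as its title passes this
# check but still deserves a hand-written, targeted title.
print(audit("<html><head><title>Acme Inc.</title></head>"
            "<body>Great deep content...</body></html>"))
# -> ['missing meta description']
```

A check like this only finds the mechanical gaps; deciding what a targeted title for each deep page should say is still editorial work.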
Broad Content Offerings
Beyond the reluctance to beef up, or even replace, manufacturer text with original contributions lies a second reluctance: building out a broad base of content that addresses interest at every level of the product and service lines.
One example (I’ve changed the specific industry) involves a manufacturer of, let’s say, broadband routers. I informed the client that people were searching for information about their routers, and that their pages were performing well. But I added that keyword research showed greater potential in more content built around specific models, not just that company’s routers in general.
A marketer winced. “That’s a lot of pages,” he said.
He’s right. That’s a lot of pages, and a lot of work. But the work involved stands to bring in a lot of traffic and an opportunity to reclaim his brand’s role in the conversation already in progress about those products.
Initial Activation Energy
Building effective, search-friendly sites is similar to some chemical reactions. Getting the reaction started requires a great amount of initial activation energy; once the reaction begins, keeping it going requires less and less external energy. For Web site development, the activation energy is the initial thought required to build the site and populate the pages with content. The gradually decreasing amount of required energy might come from users themselves, who contribute reviews and forum posts to the site, or it might simply reflect that you’re getting more efficient at the work. Either way, it gets easier as you move forward, and the benefits are worth it.