We are not alone; we share the Internet with spiders, crawlers, and bots. Marketers often downplay or downright ignore the potential of marketing to these non-human users of the Internet.
At this point, I’d like to draw a distinction between two types of marketing: the long-term brand building of the corporate website, and the short-term, somewhat disposable ad campaign. The corporate site is often handed over to SEO experts who look to wring every conceivable advantage from search algorithms. The landing pages of ad campaigns are often overlooked, because advertising is expected to be the key driver of traffic to the page.
This would all be fine if people clicked on banners or were willingly corralled into the marketing pathways we set, but unfortunately, that’s not the case. The campaign landing page is unloved from an SEO perspective, and all too often, unfindable.
In an earlier post, I talked about how clicking on banners is a fairly alien experience for people. Other media (like TV and outdoor) are not clicked on either, yet marketers fully expect people to find their own way from the ad to the information. For some reason, some marketers still believe banners should be exempt from this rule, even while freely admitting they never click banners themselves.
Too many times I have seen a campaign with TV and online elements that is completely invisible on a search engine unless you type the exact branded name of the promotion. We have to admit it is at least plausible that a percentage of the audience will search using a generic or inexact term.
If the page is not accessible by generic search and we are relying on the audience remembering a specific URL from a TV ad or clicking on the banner unit, then we have not done our job of spreading our message very well.
That brings us back to the non-humans. If we take care to market our landing pages to non-humans, we can solve (or at least go some way towards fixing) this SEO disconnect. Bots cruise the Internet looking for information humans might find useful and package it up into an accessible format for humans to use: search engines.
For some strange reason, we tend to make things hard for the bots on landing pages, often in ways we never would on the corporate site. We build pages with the key text rendered as images, heavy use of Flash, and many other elements unfriendly to the non-human users of the site.
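To see why text-as-images hurts, it helps to look at a page the way a simple text crawler does: by stripping the markup and indexing whatever plain text remains. The sketch below is a minimal illustration using Python's standard library; the two sample pages and the "summer sale" headline are hypothetical, not taken from any real campaign.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the plain text a simple crawler would index."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

def crawler_visible_text(html):
    """Return the text content of a page as a basic bot would see it."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Landing page with the headline baked into an image: nothing to index.
image_page = '<body><img src="summer-sale-banner.png"></body>'

# The same headline as real markup: fully visible to bots.
text_page = '<body><h1>Summer Sale: 50% off all shoes</h1></body>'

print("summer sale" in crawler_visible_text(image_page).lower())  # False
print("summer sale" in crawler_visible_text(text_page).lower())   # True
```

Real search engine crawlers are far more sophisticated, but the principle holds: if the campaign's key message only exists as pixels, there is nothing for the gatekeepers to index.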
Every medium needs to do its part in a holistic campaign. These days we will almost always have multiple media sources in the mix, and the ultimate destination for the audience to learn more about what the campaign offers is online. Spiders, crawlers, and bots are the gatekeepers of the search engines. Taking them into account when we plan a campaign will improve how campaigns perform and improve the user experience, as the audience will reach our offers with less hassle.