We are not alone; we share the Internet with spiders, crawlers, and bots. Marketers often downplay or downright ignore the potential of marketing to these non-human users of the Internet.
At this point, I’d like to draw a distinction between two types of marketing: the long-term brand fostering of the corporate website, and the short-term, somewhat disposable ad campaign. The corporate site is often handed over to SEO experts who look to wring every conceivable advantage from search algorithms. The landing pages of ad campaigns, by contrast, are often overlooked, because advertising is expected to be the key driver of traffic to the page.
This would all be fine if people clicked on banners or were willingly corralled into the marketing pathways we set, but unfortunately, that’s not the case. The campaign landing page is unloved from an SEO perspective, and all too often, unfindable.
In an earlier post, I talked about how clicking on banners is a fairly alien experience for people. Other media (like TV and outdoor) cannot be clicked on at all, yet marketers fully expect people to find their own way from the ad to the information. For some reason, some marketers still believe banners should break this rule entirely, even when they freely admit they never click banners themselves.
Too many times I have seen a campaign with TV and online elements that is completely invisible on a search engine unless you type the exact branded name of the promotion. We have to admit that it is at least plausible that a percentage of the audience will search using a generic or inexact term.
If the page is not accessible by generic search and we are relying on the audience remembering a specific URL from a TV ad or clicking on the banner unit, then we have not done our job of spreading our message very well.
That brings us back to the non-humans. If we take care to market our landing pages to non-humans, we can solve (or at least go some way toward fixing) the SEO disconnect. It is the bots that cruise the Internet looking for information humans might find useful and package it up neatly into accessible formats for humans to use: search engines.
For some strange reason, we tend to make things hard for the bots on landing pages, often in ways we never would on the corporate site. We build pages with the key text rendered as images, loads of Flash, and many other things unfriendly to the non-human users of the site.
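To see why image-based text hurts, it helps to remember what a crawler actually receives: the raw markup, nothing more. A minimal sketch below (using Python's standard `html.parser`, with hypothetical page markup) shows that a headline baked into an image contributes no indexable words at all; only the real text in the HTML survives.

```python
from html.parser import HTMLParser

# A crude stand-in for a search-engine crawler: it collects only the
# text it can read in the markup. Words painted into images or Flash
# never appear here.
class TextOnlyCrawler(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = []

    def handle_data(self, data):
        self.words.extend(data.split())

# Hypothetical landing page: the headline is an image, the small print is text.
page = """
<html><body>
  <img src="headline.png" alt="">
  <p>Terms and conditions apply.</p>
</body></html>
"""

crawler = TextOnlyCrawler()
crawler.feed(page)
print(crawler.words)  # only the small print; the headline is invisible
```

The fix is equally simple: put the campaign's key message in real text (or at minimum in `alt` attributes), and the bots can carry it into the index for you.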
Each medium needs to do its part in a holistic campaign. These days we will almost always have multiple media sources in the mix, and the ultimate place for the audience to learn more about what the campaign offers is online. Spiders, crawlers, and bots are the gatekeepers for the search engines. Taking them into account when we plan a campaign will improve how campaigns perform and improve the user experience, as the audience will reach your offers with less hassle.