There are opportunities to quickly increase the number of quality links pointing to your site with minimal effort, and it's completely white hat, encouraged even.
Link building has become more than a touchy subject in the wake of Google's Penguin updates. What's the right way to do it? Which links are safe? How many are too many? Everyone's questions and attention are heavily focused on gaining new links. Yet there are often plenty of opportunities to quickly increase the number of quality links pointing to your site with minimal effort, and it's completely white hat, encouraged even. It requires little research time, no outreach time, and has a 100 percent success rate in claiming the link. On top of all that, Google will even identify the opportunities automatically for you, and you can point each link at any page on your site that you choose. Sound too good to be true? It's not, and the strategy is called link reclamation.
Inside Google Webmaster Tools there is a report called "Crawl Errors," which lists the error pages Google has encountered while crawling your site. The "Not Found" report shows the 404 error pages it has reached via links, both internal links on your own site and links from third-party sites. All of that link value is essentially going to waste because it points at an error page rather than a valid page. Whether the content has been moved or deleted, or a malformed link is creating the error, those links could be passing high-quality value to your site for ranking purposes, as well as sending qualified traffic. Depending on the size and complexity of your site, the number of pages listed here can vary significantly. A smaller site may only have a handful, as in the example below:
However, large sites may have thousands of these errors, such as in the example shown below:
That's 30,000 links currently passing no value that a webmaster has complete control over, and can point at any page using a simple 301 redirect. It can be a daunting task to determine where to point these links for optimal results, so it's critical to prioritize and spend time on those that matter most. Those with access to a link tool such as Majestic SEO can do this easily with the Excel plugin from Niels Bosma. By pulling in link metrics such as ACRank (a metric used for scoring the quality/value of all links pointing to a page), external backlinks (the total number of links pointing to the error page), and referring domains (the total number of unique domains linking to the page), you can easily sort and prioritize which error pages to redirect first. Those with a higher ACRank and a wider distribution of linking domains should be made a priority. Those with only a handful of links and no ACRank may not be worth the effort initially.
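The sorting step above can be sketched in a few lines of Python. This is a minimal sketch, assuming you have joined the "Not Found" URLs with the Majestic SEO metrics and exported them to a CSV; the column names (`url`, `acrank`, `referring_domains`, `external_backlinks`) are illustrative assumptions, not any tool's actual export format.

```python
import csv

def prioritize(rows):
    """Order error pages so the strongest redirect candidates come
    first: highest ACRank, then most referring domains, then most
    total external backlinks."""
    return sorted(
        rows,
        key=lambda r: (
            int(r["acrank"]),
            int(r["referring_domains"]),
            int(r["external_backlinks"]),
        ),
        reverse=True,
    )

if __name__ == "__main__":
    # Hypothetical filename for the combined GWT + Majestic export.
    with open("not_found_with_metrics.csv", newline="") as f:
        for row in prioritize(list(csv.DictReader(f)))[:20]:
            print(row["url"], row["acrank"], row["referring_domains"])
```

The same ordering is easy to reproduce with a multi-column sort in Excel; the script is just a repeatable version of it.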
Once mapped, ideally the redirects can all be taken care of in one fell swoop using an .htaccess file or a redirect table. If this isn't possible, such as in .NET environments, you'll likely need to be prepared to justify having the web team implement this many redirects. Having the Majestic SEO data readily available will help those conversations by pointing at all of the value currently being missed. The additional work may also be easier for the web team to swallow if you provide the redirect recommendations in smaller chunks, such as in sets of 100 to 200 at a time.
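On Apache, each mapping becomes a one-line `Redirect 301` rule in .htaccess. The sketch below, with hypothetical paths and an assumed batch size of 200, generates those rules from a broken-to-live URL mapping and splits them into deployment-sized chunks for handoff to the web team:

```python
def redirect_rules(mapping):
    """One 'Redirect 301 /old /new' .htaccess line per broken URL."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

def batches(rules, size=200):
    """Yield the rules in chunks small enough to review and deploy."""
    for i in range(0, len(rules), size):
        yield rules[i:i + size]

if __name__ == "__main__":
    # Illustrative mapping; in practice this comes from the
    # prioritized "Not Found" list.
    mapping = {
        "/old-whitepaper.pdf": "/resources/whitepapers",
        "/2009/press-release": "/newsroom",
    }
    for n, batch in enumerate(batches(redirect_rules(mapping)), start=1):
        print(f"# batch {n}")
        print("\n".join(batch))
```

Each emitted line can be pasted directly into .htaccess; on other platforms the same mapping feeds a redirect table instead.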
Link reclamation is one of the easiest, yet most often overlooked, methods of quickly building high-quality links to your site. Not only do the redirects improve your site's overall link profile and potential organic visibility, but they also improve the user experience, which, at the end of the day, is arguably the most important aspect of your website. Has anyone else used this method of reclaiming the links you've already built and earned over the years? What kinds of improvements did you see from the redirects, and what internal challenges did you have to overcome? Feel free to post your comments below.
Crispin Sheridan is the Senior Director, Global Search at SAP. As part of the digital team, he established and leads the search and testing practices at SAP. Crispin is responsible for paid, natural, and mobile search and all online testing. Search and testing at SAP are fully centralized and globally funded and run under a hybrid in-house and agency model.
Crispin has proven that search learnings and keyword insights work hand in hand with social media marketing and together can effectively drive B2B lead generation. Furthermore, the development of the SAP.com Test Lab has contributed significant success to SAP's digital marketing efforts.
A frequent guest speaker at conferences, including SES New York, San Francisco, Toronto, London, Delhi, Shanghai, and Hong Kong, Crispin was appointed to the SES Advisory Board in December 2009. He has also been a guest speaker at the e-Metrics Summit and ad:Tech, and is a member of Google's B2B Technology Council. You can follow him on Twitter at @crispinsheridan and read his monthly SEO column on ClickZ.