Link Building: Fast and Easy Competitive Research

One of my biggest challenges when it comes to link building is finding new, industry-relevant websites to get links from, whether those are industry blogs, directories of companies in your industry, or some other potential source you might not even know about yet. Whether you're just not sure what's out there, or you've hit a brick wall when it comes to new ideas, one way to scrounge up fresh leads is through competitive research.

Competitive Research Steps:

  1. Identify keywords that you would like to rank better for
  2. Do a Google search for those terms
  3. Collect the top 20-100 URL results
  4. Check the backlinks of every site that ranks higher than you
  5. Identify websites, types of links, or strategies that you could implement
  6. Add to your link building spreadsheet

Identifying Keywords

If you’ve already done your keyword research and figured out which keywords you want to build links for, great. Use those, and proceed to the next step.

If you need help figuring out what keywords to use, check out this great post about keyword research here.

Finding Competitors with Advanced Google Search options

Got your keywords? Great. We’re going to put them into Google to see who’s ranking higher than you are. Those are your competitors. But first, let’s adjust Google’s search settings to get the best results. Google personalizes all of its search results now, so you’re never guaranteed to see exactly the same results your customers see, but there are ways to make sure your search is as relevant as possible.

First, take one of your keywords. Let’s say you are a gym located in Winnetka, IL, and one of the things you want to rank for is “kickboxing classes in winnetka il.” Let’s type that in. (Note: to get results that are as non-personalized as possible, I use Chrome’s “Incognito Mode.” It ensures that I’m not signed into any of my accounts and that all cookies are cleared, so Google isn’t taking any of my previous searches into account and I have as clean a slate as possible. If you don’t use Google Chrome or incognito mode, you can get the same effect by clearing your browser history, cache, and cookies, but I find that something of a hassle on my regular browser.)

Go to Google and perform the search, so that you can get to the search results page (SERP for short). Once there, find the little settings button in the top right and choose “search settings” (see picture 1)

Advanced Google Search Settings

In the search settings, make sure Google Instant predictions are set to “Never Show Instant Results.” Google Instant restricts you to 10 results per page; turning it off lets you choose how many search results you see per page. I usually choose 100, but if you’re already ranking decently well for a term, or don’t want to be overwhelmed, you might only need the top 20. (see picture 2)

Google Instant Results Settings

Once you’ve set these, go to the bottom and hit “save.” A little box should pop up that says “Your Preferences have been saved.” Click OK, and it will take you back to your search results.

Collecting the Top 20-100 Search Results

Now that you have your settings adjusted, let’s put this giant SERP to good use. One way to do this is just to scan through the search results and copy/paste the URLs that rank higher than you into a spreadsheet. Just doing this is enough to take you to the next step, so feel free to move on if you’d like.

If, on the other hand, you’re interested in a quick way to scrape every URL that shows up in the search results, I have just the thing for you. Bear in mind, though, that it involves finding the source code of a web page and requires a Regex-enabled text editor such as Notepad++. If that sounds like something you can handle, read on.

How to clean up Google Search Results when scraped from the source code of a SERP

This works because every search result starts with <!--m--> and ends with <!--n--></li>. We’ll use that knowledge to delete all the HTML code we don’t need, leaving a clean list of URLs that can be pasted into an Excel spreadsheet. The steps use some simple regular expressions (regex for short) to make finding the extraneous code easier.

  1. First, find the source code of the SERP. In both Firefox and Chrome, the shortcut for this is Ctrl+U (in Windows). You can also usually right-click on the page and choose the “view page source” option. This brings up all of the HTML that makes up the SERP.
  2. Copy and paste this into the Regex-able text editor of your choice. Once again, I use Notepad++ because I use Windows. If you know of a good text editor that can do this on a Mac, leave a note in the comments.
  3. Make sure the Regex search is enabled. In Notepad++, this is found by opening the “find” window and making sure that “search mode” is set to “regular expression.”
  4. Find the first search result by searching for the first instance of <!--m-->. Delete everything above it.
  5. Add line breaks between each search result. Find: <!--m--> replace with: \n<!--m-->
  6. Delete everything before the URL. Find: <!--m-->.+?<a href=" replace with nothing.
  7. Delete almost everything after the URL. Find: " onmousedown="return.+?<!--n--></li> replace with nothing.
  8. Delete the remaining tags after the URL. Find: <li class="g"> replace with nothing.
  9. Almost done! Delete everything from the last link down. If you’re lazy, the last link is usually followed by </ol></div><!--z--></div></div><div id="bottomads"></div> so search for that, and delete it and everything after.
  10. If you’ve done everything right, you should have a mostly clean list of URLs. This method doesn’t clean local results (with the map pins) or sometimes image results, so you’ll either have to extract those URLs on your own or just delete them.
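If you’d rather not do the find-and-replace passes by hand, the same idea can be scripted. Here is a minimal Python sketch that condenses steps 5-8 into a single regex pass. It assumes you’ve saved the SERP’s source code to a file named serp.html (a name chosen for this example), and it relies on the same <!--m--> marker described above, so if Google’s source code changes, the pattern will need tweaking just like the manual steps.

```python
import re

def extract_urls(html):
    # Each organic result starts with <!--m-->, and the URL is the first
    # <a href="..." inside that block (the same markers used in steps 5-7).
    # re.DOTALL lets .*? span line breaks in the page source; the lazy
    # quantifier stops at the first link of each result.
    pattern = re.compile(r'<!--m-->.*?<a href="([^"]+)"', re.DOTALL)
    return pattern.findall(html)

if __name__ == "__main__":
    with open("serp.html", encoding="utf-8") as f:
        html = f.read()
    for url in extract_urls(html):
        # One URL per line: paste this output straight into a spreadsheet column.
        print(url)
```

As with the manual method, this won’t clean local (map-pin) or image results, so expect to delete a few stray entries by hand.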

And there you have it. My (formerly) super-secret method of getting a list of URLs from a search results page.

Checking Competitors’ Links

Now that you have some sort of list of competitors, and you’ve put it in a spreadsheet (or in some sort of list that you’ll be able to keep track of), it’s time to start analyzing their backlink profiles. What sites link to them? If there are sites that link to these competitors but don’t link to you, that could be one of the reasons they’re outranking you.

Common tools for finding a website’s backlinks include opensiteexplorer.org (by Moz), majesticSEO.com, and Ahrefs.com. Each will give you slightly different information. Open Site Explorer and Ahrefs will give you the most information for free, but all of them require a subscription to see all backlinks, or to run more than a couple of backlink lookups a day.

Identifying Link Building Sites and Strategies

I go through the backlinks one at a time, sorting them into categories like those that Alex mentioned: Self-Submit Resource, Purchasable Resource, Non-Competitive Resource, Competitive Resource. I also usually add a column for “Ideas,” for when I might not be able to get a link from that exact resource, but it gives me an idea of something I could do, or could suggest to the client (sponsor a little league team, speak at a small business conference, etc.)

Add To Your Link Building Spreadsheet

If you are already using the methods Alex talked about in his previous post, you should already have one of these. If you don’t have one, now’s a great time to start! Keep all of these great new link ideas and opportunities organized.

After that? Well, that’s the hard part. I’ve just shown you one way to get started, but the outreach and follow-through is on you.


  • Dario Civinelli

    Hmm…those RegEx steps to cull URLs from a SERP look familiar… ;-)

    • Lauren Hartman

      I know, I was rereading through and thinking I needed some kind of callout to you for the steps. I’ve had to tweak them since you gave them to me since I think Google’s source code has changed, though.

      • Alexander Bruner

        Brilliant

  • http://spudart.org/ spudart

You asked for Regex text editors for the Mac. I’ve been using TextWrangler for years. It’s great and free. http://www.barebones.com/products/textwrangler/ The program is so lightweight, I even use it to start writing blog posts. I find Evernote and Google Drive just a tad too slow. When I want to plop down an idea, I don’t want to wait for Evernote to load a new document or for Google Drive to make things appear. It sounds crazy, but TextWrangler is much more responsive.

  • John O’Riordain

I’m wondering why people do that stuff manually when there are plenty of online and desktop-based tools for it (e.g. SEO SpyGlass, or Webmeup.com)

  • http://435digital.com/about-us/our-team/alex-bruner/ Alex Bruner

    John,

We use this method along with several other pieces of paid-for software. There are also lots of beginners and small businesses that don’t have the budget for paid tools.

    That being said I’ll take a look at the two softwares you mentioned and see if they help me in my day to day.

    Best,

    Alex