Q&A With Google’s Matt Cutts On How To Use The Link Disavow Tool

It’s been almost two weeks since Google launched its link disavowal tool. Some have been busy diving in and using it, but others have had more detailed questions about it. We’ve got some answers, from the head of Google’s web spam team, Matt Cutts.


How do people know what links they should remove?


When we’re taking targeted action on some specific links, the emails that go out now include examples of bad links. We provide example links to guide sites that want to clean up the bad links.


Why not list the bad links?


That’s related to the first question, of course. We don’t want to help bad actors learn how to spam better, which is why we don’t provide an exhaustive list.


Who should do this?


The post [Google’s announcement post last week] says anyone with an unnatural link warning. It also mentions anyone hit by Penguin, but I keep getting asked about this. I’m going to reiterate that if you were hit by Penguin and know or think you have bad links, you should probably use this too.


What if you don’t try to remove links? Given what a pain it is to get links off the web, why wouldn’t someone just use disavow? I know Google recommends requesting link removals, but from a technical standpoint, if they don’t do that and just disavow, it’s pretty much going to work, right?


No, I wouldn’t count on this. In particular, Google can look at the snapshot of links we saw when we took manual action. If we don’t see any links actually taken down off the web, then we can see that sites have been disavowing without trying to get the links taken down.


How are you dealing with index files? Do you have to remove all variations, such as: http://badsiteiwanttodisavow.com




We tried to cover this in the last two to three questions. Technically these are different URLs, so if you want to be ultra-safe, then you would list the URL variants. Practically speaking though, Google normally canonicalizes such URLs to a single URL, so if you’re going off the backlinks that you download from google.com/webmasters/, then you should normally only need to list one URL.


If you download and reupload a disavow list, is it still a several week delay between when the fresh upload is acted upon, even if you upload a fresh list the same day, perhaps after catching a mistake?


I would count on it potentially still being a several week delay. If you have URLs A and B and you download the file and edit it to add a new URL C then it shouldn’t really affect A and B, but it will take time for disavowing C to go into effect.


How long will it take sites to see any potential improvement? It seems like potentially months. I.e., say you upload a file. It takes several weeks for that to be read. Then you might wait several weeks for the next Penguin Update, until the change would be reflected, right?

Or when you say multiple weeks, do you mean that really, the file might get read right away, but the changes might not be reflected until some Penguin or other update can act on those changes?


It can definitely take some time, and potentially months. There’s a time delay for data to be baked into the index. Then there can also be the time delay after that for data to be refreshed in various algorithms.


Just to double-check, reconsideration should only be done if they’ve gotten a message about a manual action, correct?


That’s correct. If you don’t have a manual webspam action, then doing a reconsideration request won’t have any effect.


Do manual actions specifically say if they are related to bad links?


The message you receive does indicate what the issue with your site is. If you have enough bad links that our opinion of your entire site is affected, we’ll tell you that. If we’re only distrusting some links to your site, we now tell you that with a different message and we’ll provide at least some example links.


What about the www prefix? It sounds like to be safe, you should do both of these:
domain:badsite.com
domain:www.badsite.com



You only need the first line. If you do domain:badsite.com, then that also ignores all links from www. [NOTE: I’m pretty sure this also means Cutts is saying that if you only disavow from a domain with the www prefix, and it also has a non-www variation, those will still be counted. But I’m double-checking on this.]
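Putting the answers above together, a disavow file is a plain text file with one entry per line: comment lines start with “#”, whole domains use the domain: prefix, and everything else is treated as an individual URL. A minimal sketch of what such a file might look like (the domains here are made up):

```
# Contacted site owner on 10/1/2012 asking for link removal, no response
domain:spamdomain1.com

# One spammy page; the rest of the site seems fine
http://www.spamdomain2.com/badpage.html
```

Per the Q&A above, the domain: form covers both www and non-www links from that domain, so it is usually the safer choice when an entire site is the problem.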


What prevents (and I can’t believe I’m saying this) the seemingly inevitable concerns about “negative negative SEO”? In other words, someone decides to disavow links from good sites, perhaps as an attempt to send Google signals that these are bad? More to the point, are you mining this data to better understand what are bad sites?


Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests. We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good. We may do spot checks, but we’re not planning anything more broadly with this data right now. If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the “build a good site” stage. 🙂


One last try on something I asked when the tool launched. Why not simply discount links so there’s no need for people to disavow, rather than considering some links as negative votes capable of harming a site?


As part of our efforts to be more open about manual actions, we’ve been providing more information to site owners, about when links to their site are affecting our opinion of their site. Because of that additional information, webmasters have been paying more attention to their link profile and trying to move toward higher quality links. That’s a good thing. But we understand that migrating toward higher-quality links also means that some sites feel the need to clean up previous spammy or low-quality links. Right now it can be a difficult task to clean up a site’s backlinks, and from listening to the SEO community we wanted to provide a tool that could help after site owners had already taken substantial steps to try to clean up their site’s backlinks.

Any last thoughts, comments or perhaps warnings of mistakes you’ve seen people make?

I have gotten a couple of people asking “If I disavow links, do I still need to do a reconsideration request?” We answered that in the blog post, but the answer is yes. We want to reiterate that if you have a manual action on your site (if you got a message in Webmaster Tools, for example), and you decide to disavow links, you do still need to do a reconsideration request.

We recommend waiting a day or so after disavowing links before doing the reconsideration request to give our reconsideration request system time to pick up the disavowed links, and we also recommend mentioning that you disavowed links in the reconsideration request itself.

Top 15 Most Popular Social Networking Sites | October 2012

Here are the 15 most popular social networking sites as derived from the eBizMBA Rank, which is a constantly updated average of each website’s Alexa Global Traffic Rank and U.S. Traffic Rank from both Compete and Quantcast. “*NA*” denotes an estimate for sites with limited Compete or Quantcast data. If you know a website that should be included on this list based on its traffic rankings, please let us know.

#  | Site         | eBizMBA Rank | Est. Unique Monthly Visitors | Compete Rank | Quantcast Rank | Alexa Rank
1  | Facebook     | 2   | 750,000,000 | 2     | 2     | 2
2  | Twitter      | 13  | 250,000,000 | 24    | 5     | 9
3  | LinkedIn     | 27  | 110,000,000 | 44    | 23    | 14
4  | MySpace      | 84  | 70,500,000  | 51    | 62    | 138
5  | Google Plus+ | 95  | 65,000,000  | *NA*  | *NA*  | *NA*
6  | DeviantArt   | 183 | 25,500,000  | 346   | 74    | 130
7  | LiveJournal  | 303 | 20,500,000  | 605   | 203   | 102
8  | Tagged       | 315 | 19,500,000  | 447   | 217   | 282
9  | Orkut        | 350 | 17,500,000  | *NA*  | *NA*  | 156
10 | Pinterest    | 375 | 15,500,000  | 205   | 811   | 109
11 | CafeMom      | 451 | 12,500,000  | 127   | 82    | 1,144
12 | Ning         | 456 | 12,000,000  | 617   | 411   | 339
13 | Meetup       | 621 | 7,500,000   | 838   | 516   | 509
14 | myLife       | 728 | 5,400,000   | 122   | 391   | 1,670
15 | Badoo        | 952 | 2,500,000   | 1,596 | 1,148 | 112

Source: Most Popular Social Networking Websites | Updated 10/15/2012 | eBizMBA

Getting Along with Siri

Today, 90 percent of mobile searches result in an action such as visiting a business location or purchasing a product. Consumers are increasingly comfortable shopping from their mobile devices. In fact, a full 60 percent of shoppers with smartphones search for a product on their mobile device before buying. These aren’t rare occurrences, either; 50 percent of mobile searchers made a mobile purchase in the last six months.

Siri and other mobile search assistants have been pegged by search catastrophists as the death of search optimisation, but this is a dramatic overreaction.
The truth is that Siri presents unique challenges and opportunities to marketers, which should be approached intelligently when optimising for mobile web search.
Siri often bypasses traditional search

  • For many searches, such as for stores and restaurants, Siri will ignore the traditional search results and compare your location with Yelp listings.
  • The Solution: Ensure that your mobile SEO strategy pays appropriate attention to optimisation in places like Yelp, Google Places, Yahoo Places, etc.
Siri affects the social landscape

  • Siri and similar global search aids are programmed to draw results from particular social services. If you aren’t equally relevant on all the majors, you risk being left behind.
  • The Solution: Build your following on all relevant social networks and influential communities to ensure relevancy.

Siri ignores PPC altogether

  • Pay-per-click might as well not exist to Siri; none of the results the software returns contain ads.
  • The Solution: If your current strategy focuses strongly on pay-per-click, you’ll need to broaden your strokes and cover more ground in the mobile search space.

Google’s Disavow Tool

The new tool was announced by the head of Google’s web spam team, Matt Cutts, during his keynote speech at the Pubcon conference in Las Vegas. It has been tested for a number of weeks by selected SEOs and is now live and ready to use.

The introduction of the disavow links tool aims to help webmasters that believe their Google search ranking to have been affected by low quality links from spam sites. Google are quick to suggest that the tool should only be used after alternative methods have been tried, “If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first,” says the updated Google help section. “You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.”

If your efforts of trying to have a link taken down are unsuccessful you can then go ahead and use the disavow links tool. But be warned, if you disavow a good link in error, it could be a long wait for that link to be reinstated – if ever.

Webmasters will be able to disavow individual URLs, or entire domains, in a text file uploaded to Google’s Webmaster Tools. Cutts said that the tool works much like the “nofollow” attribute, which allows sites to link to other sites without passing ranking credit to those sites.

If you want to learn more about the tool, there is a video of Matt Cutts talking about Disavow Links here.

Hit By Penguin? Take the Red Pen Test

Like a lot of folks in the SEO world, I’ve been analyzing, dissecting, and pondering the significant changes that Google made in April 2012 – from Panda to Penguin and a lot of little things in between.

Leslie and I have the good fortune to work with hundreds of clients in our training and coaching programs, which provides us with access to detailed Analytics data on a large number of websites.

Within that group, we’ve now identified over two dozen cases of “confirmed” Penguin hits.
The analysis of those sites has proven to be very interesting indeed.

While much of the Penguin reporting to date has focused on inbound links as a potential issue, we’ve struggled to find many cases within our customer base that fit the bill for “unnatural” linking – possibly because many of our clients don’t really do traditional “link building” at all.

We have, however, in *every* case, found significant on-site issues that could be contributing to issues with the Penguin update, which Google described as targeting “webspam.”

“In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.”

Source: Google Inside Search Blog

So how do you respond to this? Well, if you’ve gone nuts building unnatural links, I would encourage you to at least consider making some changes to your link building practices. However, if you are working with a site that has been affected by the Penguin update, now would be a very good time to look at potential on-site issues as well.

Take The “Red Pen” Test

Here’s a little exercise we’ve been having our clients do for a while. The purpose of the exercise is to identify potential “excessive” SEO on your website, and help you improve the site’s design to serve users better, convert better, and consistently rank better through algorithm changes in search engines.

What you will need:

  1. A couple of highlighters – I like to use green and yellow, but any two colors will do.
  2. A nice big fat red marker – the kind your mean old teachers used in school to grade papers.
  3. An open mind – you can lie to me, but don’t lie to yourself, because it’s bad for you.

Step 1: Print Out a Copy of Your Home Page

For many sites, you can simply print it out “as is” – but by doing so you may actually *miss* some things like hidden text, duplicated text, and links that are “concealed” by styling them to look exactly like the surrounding text. If you know how to disable CSS, you can print the page with all styles disabled, and get a pretty good printout to work with.

Most folks will find it easier to just use the text-only version of Google’s cached copy of the page. To find this, use the following procedure:

  1. From www.google.com or a Google search box in your browser, search for cache:URL or cache:www.domain.com
  2. For example, you can type cache:seomoz.org in a Google search box to get the cached copy of the SEOMoz home page.
  3. Click the little link that says “Text Only version” and print that out.

Not so pretty, is it? But this is a lot more like what your pages look like to spiders!
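The cache lookup above can also be scripted if you want to pull the text-only printout for many pages. A minimal sketch, assuming the webcache.googleusercontent.com viewer and its strip=1 text-only parameter (both recalled from how the cache viewer behaves, not guaranteed to stay stable):

```python
from urllib.parse import quote

def text_only_cache_url(page_url):
    """Build the text-only Google cache URL for a page.

    NOTE: the webcache.googleusercontent.com endpoint and the
    strip=1 ("Text-only version") parameter are assumptions based
    on observed cache-viewer URLs, not a documented API.
    """
    return ("http://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe="") + "&strip=1")

# Same example as in the steps above: the SEOMoz home page.
print(text_only_cache_url("seomoz.org"))
```

You can then open the printed URL in a browser and print that page for the exercise.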

Step 2: Highlight the “SEO Keywords” on the Page

Okay – grab the printout, and pick up one of your highlighters. Start at the top of the page, and highlight every use of a “keyword” on the page that isn’t absolutely required to let human visitors do one of three things:

  • Figure Out Where They Are: Am I on the right web site? Where am I within the web site? (Keywords rarely help with this)
  • Figure Out What To Do Next: Where do I click to find the women’s shoes? How do I sign up? (Keywords sometimes help)
  • Whatever Else The Page Is For: Are there any other things the page is supposed to accomplish? (Aside from “rank #1 in Google”)

Now, before you finish this task, open your mind, and think. If you sell “pool supplies,” and it’s obvious to your visitors that this is all you do, would the links to the “pool chemicals” category page *really* need to say any more than “Chemicals” in the anchor text? No, they wouldn’t – so go back with your highlighter and finish the job.

The point here is to uncover how many times you’re using keywords for the benefit of both you and your users, and how many times they’re really just there to “check a box” for SEO purposes.

That doesn’t mean you won’t ever use keywords in your copy, if it’s the right word to persuade a user to take action. However, if you didn’t need to use a keyword to say it, you probably want to highlight it.

It can be helpful to have your website open in a browser while you do this exercise – it can be amazing how many times you discover that someone stuffed every keyword you have into the alt attribute of an image, for example.
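If you’d like a rough machine-assisted version of this highlighting pass, a short script can count exact keyword occurrences in the visible copy and in image alt attributes separately. This is a sketch using only the standard library; the sample HTML and the “pool” keyword are illustrative, not from any real site:

```python
from html.parser import HTMLParser

class KeywordCounter(HTMLParser):
    """Count exact keyword occurrences in visible text vs. alt attributes."""

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.in_hidden = 0   # depth inside <script>/<style>, which users never read
        self.text_hits = 0   # occurrences in body copy
        self.alt_hits = 0    # occurrences stuffed into alt attributes

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_hidden += 1
        for name, value in attrs:
            if name == "alt" and value:
                self.alt_hits += value.lower().count(self.keyword)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.in_hidden:
            self.in_hidden -= 1

    def handle_data(self, data):
        if not self.in_hidden:
            self.text_hits += data.lower().count(self.keyword)

html = '<p>Pool supplies and pool chemicals.</p><img alt="pool supplies">'
counter = KeywordCounter("pool")
counter.feed(html)
print(counter.text_hits, counter.alt_hits)  # → 2 1
```

A high alt-attribute count relative to body copy is exactly the kind of “check a box for SEO” usage the exercise is meant to surface.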

Step 3: Highlight the Excess Links on the Page

For the next step, switch highlighters, and start at the bottom of the page. Highlight every link on the page that fits the following criteria:

  • Useless: Examples – a link to the designer’s site, to an “articles” section that people aren’t intended to read, etc. Does the link help your visitors accomplish one of the three objectives listed in step 2? If not, it had better be “required by law” or it gets highlighted.
  • Redundant: Links that already exist somewhere higher on the page, or links to redundant categories/pages that only exist to cover extra keywords with the same content.
  • Hidden/Concealed: You can see a blue underlined link on the “text only” printout, but you can’t actually see the link on the website in your browser.

If you use visitor analytics tools like CrazyEgg, you’ll find that a lot of the links you have highlighted during this step actually don’t get clicked at all.

Step 4: Time for the Red Pen Test (optional)

At this point you will have a page with, well, either a little bit of highlighter on it, or a lot. Scan, photograph, or copy the highlighted page for future reference… then grab the red marker, and run it over all of the highlighted text on the page. Stand back and admire your work.

Many of our clients have pointed out that this step is not strictly necessary – with comments like “okay, I get it, I get it” often occurring before they’ve even finished with the first highlighter – but there is a point to using the red pen.

Step 5: Analyzing Your Results

Of course, this exercise is probably worth repeating (at least mentally) with other pages on your site – maybe even all of them.

If you have more than a little bit of red ink on the page, there’s a very good chance that the following statements are true:

  1. You have been “doing SEO” on your site for a long time, and adding more keywords to more places helps you feel like you are “doing something” to rank better. You can find a lot of great tips and advice on SEO – different things that could help you rank better – but you’re not meant to do all of it, just enough to get the job done.
  2. Adding more keywords to more places is not actually helping you to rank better, and never has been. I’ve “owned” top positions in many markets for years, with a standing rule to never allow more than two exact occurrences of a target keyword in on-page copy. In fact, I’ve got plenty of pages that have ranked for years with zero (0) occurrences of the exact keyword in the copy, and no “unnatural” links either.
  3. You’ve been hit by Penguin, and/or Panda, and/or the new “keyword stuffing filters,” etc. – and you’re ready to make a change.

The truth is what it is: a lot of people have been going way, way, way, over the top with on page and on site SEO – even if they haven’t gone nuts with their link building. At this point, these practices are very likely to be hurting your rankings. If you’ve been able to rank in spite of these practices for a while, that’s great – but you knew it couldn’t last forever, right?

Consider the Following:

Panda, Penguin and the less-publicized “keyword stuffing” update from April are all based on document classifiers. Their job is to look at a document (your web page), and use a set of signals to determine whether it should, or should not, be identified as “low” or “high” quality, “webspam,” “stuffed with keywords,” etc.

The “red pen” test gives you some idea what the result is going to look like, and how they’re going to get there – because the job of these document classifiers, in a sense, is to simulate the outcome of a human being performing such an exercise.

If your home page came back covered in red ink, and you’ve been “slapped by Google,” now might be a really good time to turn the “on page SEO” all the way back to the bare minimum that is required for users, and work forward from there, with little changes that hint at ranking, instead of wholesale keyword stuffing on every page of your site.

If you have pages on your site – or entire sections of your content, that are solely devoted to stuffing in more internal keyword links for ranking purposes, now would be a good time – while you’re at “rock bottom” – to clean that mess up.

And if your business is going to survive, it might also be a good time to start thinking about redesigning your website from the ground up, with human users and human intentions in mind. That’s another exercise (and another post…) for another day, but it starts with asking these questions about your visitors:

  • Identity: Who are they? How would they identify themselves?
  • Motivation: Why are they here? What do they care about right now?
  • Framing: How did they get here? Where did they come from?
  • Purpose: What specifically are they trying to accomplish?

The better each page of your site does at addressing these things, the better you’ll do in the long run. Let’s talk about that soon, okay?

Disclaimer: Author is not responsible for you ruining your pants or furniture due to leakage, seepage, or dripping of red ink from an ink-soaked printout of your home page. If your site is primarily designed for keywords and not for humans, wear appropriate protective garb before performing this exercise.

This post was originally published by Dan Thies of SEO BRAINTRUST.