A week ago, Google made waves when it rolled out updates to its local search algorithm.
The "Pigeon" update aims to deliver improved local search results, with updated location ranking parameters.
According to Google, the new local search algorithm ties deeper into Google's web search capabilities, leveraging its many ranking signals along with search features such as spelling correction, synonyms, and the Knowledge Graph.
There isn't an SEO in existence that doesn't love crawling a site. There's something undeniably powerful in clicking a button and having all-important SEO elements brought to you. Makes the job easier.
What's hard is quantifying and prioritizing that crawl data, then applying it to the site in a way that makes sense to the client.
What follows is a workflow idea that begins not just with a site crawl, but with what most clients already know intimately: the types of pages on their site. Focusing on benchmarking non-ranking URLs by page type, we'll use Google site search to help provide an actionable starting point and show why crawling isn't necessarily the best first step when auditing a site.
Non-Ranking URL Page Type Workflow
We're all familiar with identifying page types, benchmarking, then tracking over time the URLs we want to see in search results. How about applying that same concept to URLs we don't want to see in search results? For the purposes of the workflow described below, ranking URLs are those we want in search results; non-ranking URLs are those we don't.
Consider the typical technical workflow:
Crawl site for SEO elements and improvements.
Try to quantify the scale of the issue (applying it to page types) or provide a couple of examples.
Insert findings into client deliverable organized by common SEO issues.
With the typical technical workflow, the site crawl is limited to internal linking, which means some URLs may not be found. Identifying the scale of each issue and which page types the findings apply to happens secondarily. Finally, these findings are organized in a client deliverable structured around common SEO issues, a structure that is unfamiliar to non-SEOs.
Now consider the page type workflow:
Identify and create comprehensive page type lists.
Crawl lists for SEO elements and improvements.
Insert into client deliverable organized by areas of the site the client is familiar with.
By first identifying and crawling comprehensive page type lists, scale is immediately apparent (number of paginated pages for example), existing SEO elements have been identified and benchmarked, a recommendation is given, and everything falls under the umbrella of an area of the site any client can easily understand.
Using Site Search to Identify Non-Ranking Page Types
There are many ways to begin identifying all the page types on a site, but probably the easiest, most widely accessible way is the ol' trusty Google site search. This is less comprehensive than using analytics (non-organic content views work great), but it provides valuable indexation metrics along the way.
Costco Sort URLs Example
Quickly clicking through the Costco site, it's obvious that they use the sortBy parameter to invoke the sorting functionality for product category pages.
A quick site search shows just over 8,500 URLs in Google's index.
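As a rough illustration, the kind of query used here combines the site: operator with inurl: to isolate the parameter; something along these lines (illustrative only, not an exact reproduction of the search used):

```
site:costco.com inurl:sortBy
```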
Using the page type workflow, we've identified a page type, and because we're using site search, we know these URLs are indexed. By changing the search settings to show 100 results and using an SEO browser extension like SEOquake, the results can be exported 100 at a time.
This list can then be run through an SEO crawler, identifying the SEO elements on the pages and whether they carry any directives or annotations. Based on these findings, a recommendation can be created and inserted into the deliverable under a client-friendly section entitled "Sort URLs" or "URLs Generated by Sorting."
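Once exported, the list can be fed to any crawler or to a short script. Below is a minimal Python sketch of that step; the input filename, output filename, and the handful of elements checked are assumptions for illustration, and a dedicated SEO crawler will capture far more:

```python
# Minimal sketch: fetch a list of exported URLs and record basic SEO elements.
# "sort_urls.txt" and "sort_urls_audit.csv" are hypothetical filenames.
import csv

import requests
from bs4 import BeautifulSoup

def audit(url):
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", attrs={"rel": "canonical"})
    return {
        "url": url,
        "status": response.status_code,
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "meta_robots": robots.get("content", "") if robots else "",
        "canonical": canonical.get("href", "") if canonical else "",
    }

with open("sort_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("sort_urls_audit.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["url", "status", "title", "meta_robots", "canonical"])
    writer.writeheader()
    for u in urls:
        writer.writerow(audit(u))
```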
Google Drive Subdomain Example
Rather than identify page types by clicking manually through the site, we can use advanced site search to identify non-ranking URL page types.
Take a look at what might be considered the longest advanced Google search ever:
Each of these can be separately searched using similar advanced operator techniques to get an estimate of how many pages Google has indexed as a benchmark, then checked again after the recommendation has been implemented to track the effect. For example, drive.google.com ideally requires sign-in to access content, yet Google has almost one million unique URLs indexed for the subdomain. A good recommendation might be to remove this subdomain from the index, referring back to this advanced search as the benchmark.
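The original advanced search isn't reproduced here, but a benchmark query of that general shape, combining site: with -inurl: exclusions to isolate one page type at a time, might look something like the following (the excluded paths are hypothetical placeholders, not the actual operators from the example):

```
site:drive.google.com -inurl:folderview -inurl:open
```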
Client: So I heard Google's Matt Cutts said links are [dead/bad/over], so I stopped all my link building, fired my SEO agency, and wanted to ask you what we should do next.
Me: (long pause) Umm, you did what? Were you penalized or positioning badly?
Client: Oh no, ranking in the top 3 for most of our keywords, great traffic, but you know I have been reading a little and then I saw Matt said to stop all link building, so I thought I better do that.
Me: Ummmm. OK, no...
Amazingly, even in 2014, many people have heard bits of information about websites and search engine optimization (SEO) that are either no longer relevant, completely misplaced, or simply erroneous. These all need to die really horrible deaths.
SEO Mythbusting
Myths can start so easily and quickly become pervasive. Listening to them can do some serious damage to your site or business. So let's review five of the most common myths.
Myth 1: If You Build it, They Will Come
This is a favorite mantra of the content marketing crowd. Just build an amazing website with even more amazing content and watch the traffic roll on in.
Really? Well, no.
If it were that easy, all SEO professionals would be writers or out of work. Unfortunately, while great writing and great content is a large part of SEO and something your site definitely needs, it also needs links, a strong technical base, fast page downloads, and the list goes on and on.
Create content, but other SEO tactics are needed unless you want to be sitting all alone in a big field.
Myth 2: Link Building is Dead
While Cutts would prefer you never build another link to your website, this isn't "Field of Dreams". You can build it, but it doesn't mean traffic will come to your site.
First, Cutts never said links were dead. In fact, it's quite the opposite.
Cutts said Google has tried excluding links from the algorithm and the results were "much worse." So while I don't think it will stay this way forever, we have years before links could potentially go away as a factor, according to Cutts.
Links aren't dead. You need links. What do you do?
You don't want to go to a link farm, buy links, and potentially get your site penalized, because that method is dead (unless you really, really know what you're doing or are using "churn and burn" domains).
However, you can hire someone with experience to go create a strategic link acquisition plan and help you implement it. This means that you use strategic methods to acquire links in a way that would appear natural.
For example, let's say you sponsor a charity event every year, making sure to tie that in with local news, press coverage, and maybe the charity's own news release. All tied back to you. This is a very obvious way you can acquire natural links to your site that are part of a link building campaign, but NOT a link buying campaign.
A link builder will have many more inventive and fantastic ways to do this, and the best part is that it will all appear completely natural because, basically, the links are natural, just with a little push. Most important to note, this is the most highly scrutinized area of SEO right now, so hire well.
Links are alive. Just some of the methods died.
Myth 3: Using Google Analytics Lets Google Spy On You
Analytics is a must-have. Yet so often we hear that a client isn't putting Google Analytics on their site because Google spies on them, so they fly blind.
Is this true? Does Google use Google Analytics to spy on you? Well, yes and no.
For instance, if you're creating multiple domains that are being used for nefarious things (in Google's eyes) and these sites all share the same Google Analytics code, then yes, Google now knows you own these domains (i.e., you have linked them together and told Google you own them).
If this is your link network, well you have now outed yourself. Have a penalty? Decided to just start a new site, not fix the old one? Did you use the same Google Analytics code? Well, same thing.
However, is Google using Google Analytics as part of site positioning? No.
How do we know? Because Cutts said so (here, here, and again here). Now, we don't believe everything Cutts tells us, but this is just common sense.
These are separate arms of the same company and they simply don't interact at that level. Also, many sites don't use Google Analytics, so if Google used Analytics to determine the results, it would probably be worse than excluding links as a ranking factor. It doesn't make practical sense.
Bad data in = bad product out = bad business. So if you're a regular company with a regular site, go ahead and add Google Analytics. The only one spying on you is the NSA.
Myth 4: Ranking (Positioning) Doesn't Matter
You've probably heard this before: "We don't care about ranking. Traffic is what we measure." While there is truth in this, it's also a bit deceptive.
Sure, there is no true top 10 anymore. With geolocation, personalization, and other factors, you can no longer pull up a definite top 10 and know you are seeing what anyone else is seeing.
In fact, my agency no longer calls it ranking; we call it positioning, because ranking implies a definite numeric order with start and end points, whereas positioning is a more loosely defined placement within the SERPs.
That being said, as much as relevant, converting traffic is the most important metric when measuring the ROI of your investment dollars, the difference between 1st and 5th, or between 5th and 10th, greatly affects the flow and amount of that traffic. So even if we can't be sure how everyone is seeing the site in the SERPs, we can get a good idea of the opportunities for traffic increases, and of where drops are happening, if we follow the keyword positions.
Position does matter. Rankings maybe not.
Myth 5: Social is the New Link Building
No. Can you get links from social? Yes, once they leave the walled garden, but not from the sharing itself (Google+ excluded here).
The reason is simple. There is a negative history between Facebook and Google, and between Google and Twitter. Neither company is willing to give Google consistent access to its fire hose, so Google simply can't factor them into the algorithm.
A while back, when Google did factor Twitter into the algorithm and had access to the fire hose, Twitter activity could help you rank well, but that changed when Twitter pulled that access from Google. You can learn more about that here.
Social is not the new link building; link building is the new link building.
10 SEO Tips
OK, now that we've debunked those common myths, what shouldn't you ignore? Let's look at some things that actually matter. Here are 10 of the most commonly missed SEO opportunities.
1. Google Authorship
Often people write about the debate surrounding this tag – is it or is it not helping with rankings? What is often missed is the very basic concept that an image by your result in the search engine increases your click-through rates.
Don't add this tag everywhere. Make sure it is only on well written, good content, but add it. Give your site that extra lift.
2. Citations
Make sure that wherever you list your business's name online (or wherever anyone else has listed it), every listing matches the others exactly. The name, phone number, contact, address, and so on must be the same across all listings, not merely similar.
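For a handful of listings, even a tiny normalization pass can surface mismatches that are easy to miss by eye. The sketch below (with made-up listing data) strips punctuation before comparing, on the assumption that formatting differences like "(555) 123-4567" vs. "555-123-4567" are acceptable while real differences in name, number, or address are not:

```python
# Sketch: normalize NAP (name, address, phone) data and flag listings that differ.
# The listing data below is made up for illustration.
import re

listings = [
    {"source": "Website", "name": "Acme Plumbing, Inc.", "phone": "(555) 123-4567", "address": "12 Main St."},
    {"source": "Directory A", "name": "Acme Plumbing Inc", "phone": "555-123-4567", "address": "12 Main St"},
    {"source": "Directory B", "name": "Acme Plumbing", "phone": "555.123.4567", "address": "12 Main Street"},
]

def normalize(listing):
    # Lower-case and strip punctuation so only substantive differences remain.
    return (
        re.sub(r"[^a-z0-9]", "", listing["name"].lower()),
        re.sub(r"\D", "", listing["phone"]),
        re.sub(r"[^a-z0-9]", "", listing["address"].lower()),
    )

baseline = normalize(listings[0])
for listing in listings[1:]:
    if normalize(listing) != baseline:
        print("Mismatch vs. website listing:", listing["source"])
```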
3. Content
Google likes specific content, so keep topics clear and on point. Also, make sure your content averages over 600 words per page, or you risk it being penalized as thin.
4. URLs
Make sure the URLs in your code are absolute, not relative. They shouldn't be too long or contain multiple query parameters. If they need to be rewritten, make sure you have rewritten them.
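One quick way to see which links in a page's markup are relative, and what they resolve to, is a short check like the one below; the page URL is a placeholder and requests/BeautifulSoup is just one possible toolchain:

```python
# Sketch: list relative links on a page alongside their absolute equivalents.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

page_url = "https://www.example.com/category/widgets"  # hypothetical page
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    href = a["href"]
    if not urlparse(href).netloc:  # no host part means the link is relative
        print("relative:", href, "-> absolute:", urljoin(page_url, href))
```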
5. The Alt Attribute
Use this tag properly, but use it. It belongs on every image on your site (sometimes it will be empty).
When an image is used as a link, the alt attribute acts much like the anchor text of that link. Don't stuff it, though; we have seen stuffed alt text cited as part of penalty actions.
Remember, this is an attribute for the blind. Treat it with respect.
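A hedged sketch of auditing this: the snippet below lists images that are missing the alt attribute entirely (an empty alt="" can be legitimate for decorative images, so only a missing attribute is flagged). The page URL is a placeholder:

```python
# Sketch: report images that have no alt attribute at all.
import requests
from bs4 import BeautifulSoup

page_url = "https://www.example.com/"  # hypothetical page
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    if not img.has_attr("alt"):
        print("Missing alt:", img.get("src", "(no src)"))
```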
6. Page Speed
Check your site with Google's PageSpeed tool. Speed matters, especially on mobile. Get your score above 85-90.
7. Robots.txt
Your robots.txt doesn't block your web pages from being indexed, only from being crawled. If you want to block a page from being indexed, leave it out of robots.txt and add a noindex tag instead; if the page is blocked from crawling, Google never sees the noindex directive.
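To make the distinction concrete, here is a small sketch that checks the two signals separately for a URL: whether robots.txt allows crawling, and whether the page itself carries a noindex directive (the URL is a placeholder):

```python
# Sketch: crawlability (robots.txt) and indexability (meta robots) are separate signals.
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/private/report.html"  # hypothetical URL

# 1. Crawlability: does robots.txt allow Googlebot to fetch this page?
robots = RobotFileParser(urljoin(url, "/robots.txt"))
robots.read()
print("Crawlable by Googlebot:", robots.can_fetch("Googlebot", url))

# 2. Indexability: does the page itself carry a noindex directive?
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
print("Meta robots:", meta.get("content", "(none)") if meta else "(none)")
```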
8. Penalties
If you try to recover your site and you fail the first time, get an expert to help you. Google penalties are tricky subjects. You may not know enough about penalty recovery to get your site out of penalty status.
The longer your site stays penalized, the harder it will be to recover your site. A site auditor will know what to look for and how to communicate with Google in order to get the best outcome.
9. New Sites
There is only one way to really make sure Google won't find your new site: lock it down with a login. Robots.txt doesn't prevent indexing, and Google doesn't need a link to find you. If you don't lock it down, don't be surprised when your new site is indexed before you wanted it to be.
10. Get a Site Audit Every Year
There have been so many new aspects of SEO in the past two years that it can be very difficult for the average business owner to keep up. Add to these changes the number of simple penalties a site can receive, and you can quickly be in over your head.
A yearly site audit will help you avoid issues, avoid penalties, and make sure your site is running smoothly. An SEO professional can also help you build out strategies and assess missed income opportunities.
A good auditor is worth their weight in traffic, conversions, and avoided penalties. Just make sure you aren't getting simple tool output, but rather someone with real knowledge and expertise.
SEO Isn't Voodoo (or Black Magic or Even 'Bovine Feces')
Remember a few years back when SEO was considered "voodoo" or "black magic" by anyone who didn't understand it?
Really, SEO is based on the rules of a mathematical algorithm. This means the site either meets certain points on a mathematical scorecard or it doesn't, and your site is then adjusted accordingly.
Though we haven't been given these rules by the search engines, we can test against the algorithm and do things we know work because a + b = c. Math is predictable, testable, and somewhat verifiable.
So yes, even though Google does still suggest that SEO is... well, um, "bovine feces" – it really isn't. As long as you're doing it right.
Panda. Google vs. Bing. Penguin. Guest blogging. The years of 2011 to 2013 were nothing short of tumultuous for the search industry. And as the voice and face of Google, Matt Cutts felt the wrath of angry webmasters and marketers.
We continue our look back at some of Cutts' blog posts, videos, and thoughts to get a better understanding of where Google's been, which in turn can be a great way to get a feel for where Google (and therefore SEO) is going next.
If you're just joining us, we've been going year by year, highlighting two or three of the biggest splashes he made. This post has been split into three time periods:
From 2011 onward, things should be pretty familiar to most of you. Still, there is much to learn from the past few years.
Matt Cutts in 2011
This was the year of the first Panda update and, let's be honest, it's easier to remember things from 3 years ago than from 13. Believe me, in searching for the past stories I knew were out there, I was off by as much as a couple of years on events from the early 2000s.
So let's look at the top few things from Cutts in 2011 ...
Rel="author"
This video is a good watch for anyone interested in how Google wanted to treat the authorship tag:
At this stage authorship wasn't cross-domain, though that's alluded to. More interesting (to me anyway) is when Cutts discusses authors themselves holding a value that will pass to their content on the sites of others (when cross-domain authorship applies). There's also a little slip around the four-minute mark where he talks in the present tense about cross-domain authorship (but catches himself quickly).
Authorship is important; I think we all know that. It's interesting to hear what it's intended to do and while we can debate now how authorship value is passing, knowing what it's intended to be can shed light on what Google is likely working toward in their quest to understand individuals and their trustability.
Bing Copying Their Results
Cutts doesn't seem to get angry much (I suppose that's easy enough when you can simply get even), but when rumors spread that Bing might be copying Google's search results, Google ran a test that confirmed it, and Cutts blogged about it. Well... yeah. I suppose since he couldn't get even with Bing, it makes sense that he seemed a little mad.
Before we get into his comments on the subject let's compare a sample set of search results for gibberish phrases that Google purposely set the results for in their testing:
There were more examples, and if you read his full blog post on the subject here, you can see them, read his full rant, and even watch a video of him squaring off against Bing's Harry Shum and Rich Skrenta from Blekko. As he noted, he's not great at being snarky, but he directs those hoping for more to watch the following:
But to hear it straight from Cutts, his words on the subject were:
"If clicks on Google really account for only 1/1000th (or some other trivial fraction) of Microsoft's relevancy, why not just stop using those clicks and reduce the negative coverage and perception of this? And if Microsoft is unwilling to stop incorporating Google's clicks in Bing's rankings, doesn't that argue that Google's clicks account for much more than 1/1000th of Bing's rankings?
I really did try to be calm and constructive in this post, so I apologize if some frustration came through despite that–my feelings on the search panel were definitely not feigned. Since people at Microsoft might not like this post, I want to reiterate that I know the people (especially the engineers) at Bing work incredibly hard to compete with Google, and I have huge respect for that. It's because of how hard those engineers work that I think Microsoft should stop using clicks on Google in Bing's rankings. If Bing does better on a search query than Google does, that's fantastic. But an asterisk that says "we don't know how much of this win came from Google" does a disservice to everyone. I think Bing's engineers deserve to know that when they beat Google on a query, it's due entirely to their hard work. Unless Microsoft changes its practices, there will always be a question mark."
Why is this important? Well, for one, this is the first time I think I've ever read or heard Cutts get publicly mad. I'm sure it happens privately, but I'd yet to actually witness it (even when I thought it was deserved).
Secondly, this event highlights a moment in search history: the moment when not just the technologies were being infringed on, but the final product. Google may have patented everything else, but apparently that didn't extend to the finished product. If I were a lawyer I probably would have had some fun with that, but I'm not, and it's 2014, not 2011.
A Panda Attack
I promised it earlier, and it's only fair to include what was the biggest SEO event of the year: the attack of the Panda. It hit hard on February 23, affecting 12 percent of search queries. In describing the update he said:
So it wasn't an attack on spam; it was an improvement to search quality. An important internal distinction, I'm sure, but to me that's a bit like saying (as he has), "There is no sandbox, there are just elements of the algorithm that may look and act like one." (I'm paraphrasing.) In November of that year he spoke on the subject of recovery, saying:
"Improve the quality of content or if there is some part of your site that has got especially low-quality content, or stuff that really not all that useful, then it might make sense to not have that content on your site."
He also discussed the issue of scrapers in a reply to a question sent to him:
The Panda update was a turning point in SEO and, as had become usual, Cutts was the figurehead for Google in helping webmasters get a handle on what was happening. Onsite SEO took a hard turn from a focus on keyword densities (remember those?) and pure mathematics to compelling copy and visitor experience.
While SEOs might not have loved this change (or any other major shift, for that matter), it put the focus where it should be to maximize a website's health. Finally, Google's goal of working toward results that users like matches the efforts of SEOs in seeking rankings. In the end it works well for site owners who want conversions, where the path to those conversions is via rankings.
Matt Cutts in 2012
2012 was probably the most turbulent year in SEO. Pandas and Penguins ransacked the results pages and Cutts was at the center of it all (at least from a public standpoint).
On top of that, the nature of the Internet itself was in question: PIPA was on the table. And of course, Cutts was vocal about that too. Let's take a look at some of the top Cuttsisms.
PIPA (Protect IP Act)
If not before, Cutts made his position on the issue very clear in his blog. In it he writes:
"Now it's time to rally and get loud. It's time to call your Senators. Heck, it's time to ask your parents to call their Senators. If you think the internet is something different, something special, then take a few minutes to protect it. Groups that support SOPA have contributed nine times more money in Washington D.C. than our side. We need to drown out that money with the sound of our voices. I'd like to flood every Senator's phone, email, and office with messages right up until January 24th."
So Cutts extended his influence past search and into the political spectrum (not for the first time, but certainly the most aggressively I've seen). While interesting purely in the context of Cutts' career, it's also interesting to look back through his time with Google and the technology sector as he grew from the SafeSearch guy with not a ton to say into the search guy speaking not just on Google, but on Bing's thefts and now political issues.
Google Penguin
Penguin was an algorithm designed to target low quality links built only to impact search rankings. Rather than work on ways to determine which links are low quality, Google opted to punish website owners who had these links.
The logic of this approach is that if the punishment is severe enough, people won't use the strategies, thus making the life of a Google engineer easier. Like cutting off the hand of a thief for stealing a pie.
Yes, the punishment far exceeds the crime, but people will think twice about taking that pie, no matter how hungry they are. Yes, what I'm saying is that Google acts a lot like the medieval judiciary system.
But back to Cutts...
When discussing the update and whether it was a penalty he said:
"No, neither Penguin nor Panda are manual penalties," explaining that Penguin was designed to tackle "the stuff in the middle;" between fantastic, high quality content and spam. Panda was all about spam, but the need for Penguin arose from this middle ground.
"It does demote web results, but it's an algorithmic change, not a penalty. It's yet another signal among over 200 signals we look at."
"A penalty is a manual action taken against a site, and you will pretty much always be notified in Webmaster Tools if it's a penalty affecting your site."
So, what we have here is the explanation that Penguin was built to filter link quality. I found it a bit coincidental that the Penguin update took place at roughly the same time as 1 million "unnatural links" warnings were sent out to webmasters.
Now, I pick on Google a bit (as you can tell) but in the end, we as SEOs created the bed we were lying in. I can blame Google for over-punishing (true in many cases) but when we look back at what Cutts has told us over the years and how any advice got almost immediately abused, we really can't blame them.
So now website owners were suffering due to previously successful strategies that we had been told not to use. But there was a darker side, and Cutts needed to address that too. If poor quality links could trigger a penalty (or algorithmic devaluation), then another issue needed to be addressed...
Negative SEO
2012 was the year that negative SEO returned to the forefront of our consciousness. I know that I personally had a client suffer a sudden spike in poor quality links with the anchor text "payday loans" and other similar permutations. The client wasn't involved with loans or the financial sector at all, and yet... the unnatural links warning followed.
The concern (a legitimate one, obviously) was that a competitor could simply purchase large numbers of known-bad links and negatively impact your domain. Here's what Cutts had to say on the subject:
First, it's interesting that he refers to the webspam team thinking about negative SEO when it's in the context of algorithm updates, which he previously noted weren't from that team, but let's set that aside.
Second, the big problem here is that Cutts is basically suggesting that website owners now have extra work on their plates in monitoring all their links and making sure to disavow the ones that might be a threat. This assumes that all website owners know how to do that or even that they should.
Another issue from this video is that he states that if a competitor is trying to frame someone in the eyes of the webspam team, there's a simple way to deal with it using the disavow tool. This directly contradicts Google's own statements on reconsideration.
Let's say that the webspam team takes the bait (and I've seen it with my own eyes). Google specifically states that once a penalty is in place they need to see efforts to remove those links prior to a reconsideration request being filed. So now the site owner is dealing with lost revenue, the cost and/or effort of getting the links removed as best they can, the time delay in hearing back from Google and then the delay in getting their rankings back.
While I obviously don't blame Cutts for this issue, his explanation of how Google works for site owners here is simplistic and not altogether truthful.
And this brings us to...
Matt Cutts in 2013
2013 will be the last year we cover as we've just begun 2014. The piece will be updated in January of 2015 with the best of Cutts this year. Will it be the knowing look at iAquire purchased links? Will it be the MyBlogGuest issues or is there even better to come?
Speaking of guest blogging, let's begin there with...
Guest Blogging
Let's be clear, this wasn't the first time we'd heard about guest blogging from Matt but he gave some great clarification. With all the confusion over guest blogging based on a few of his previous statements, Cutts does a decent job of bringing it all back to Earth in this video.
In short, the message is: "Don't build links on crappy sites, and use guest blogging as one method among others to get your name out there." A good reminder that there's very little black and white in the realm of SEO, just common sense. Thanks, Matt.
Matt Cutts on Everything
Speaking at PubCon 2013, Matt pretty much covered the full gamut of SEO-related topics, from discussing Hummingbird (more on that later) to rich snippets. The video can be viewed at:
I obviously can't cover all the areas he addressed (see a great recap here), but the big takeaway in the context of this article (who is Cutts and why is this of key importance?) is that in this video he discusses the "why" behind some of the huge changes. At about the 3-minute mark, Cutts gets goaded into replying to the thrashing he took the day before from Jason Calacanis (which is a bit entertaining), but at the 35-minute mark he does a great job showing us what they're dealing with on their end.
It's interesting; we generally view Google as an entity without remembering that it's filled with people. He lists off the critiques of Google's services by major media outlets and the problems outlined in them (scraper sites, spammers, content farms, etc.), and the picture he paints, along with his reaction, guarded as always, was one of the first times it really hit home to me personally that on their end it's not a matter of fighting little battles. It's a matter of having the world watch you and say you're not doing a good job as you put in another late night. Suddenly the view of Cutts and the crew at Google takes on a more human form.
While I've always found Cutts to be a pleasant guy when I've talked to him, one always goes away with the impression that while humorous and personable, he's got skin as thick as a tank and the ability to take you down. This was a change from all that: we saw that he is affected by what's going on around him, and that black hat activity feels almost like a personal attack.
While there were tons of tidbits in the session on a pure SEO level, none of it was really unknown at the time (though it was nice to have it all put together in one long presentation). The reason this video struck me as one of the more important ones the second I saw it was this human element I hadn't seen before.
The final Cuttsism we'll cover for 2013 will leave us with one of the highlights of the year...
Hummingbird
The actual Cutts comments on Hummingbird are in the Pubcon video above but the quiet launch of it on August 20 (announced to the public on September 26) makes the following video from July 8 more interesting:
In the video, Cutts is answering a question about how voice search is changing query syntax. He gets into discussing the changing way page content needs to be viewed. While I didn't first see this video and think, "Hey, they can't do that effectively yet," it was very interesting when the update did finally roll out.
Hummingbird was more of an infrastructure change. While 90 percent of queries showed adjustments, they were very minor. This update was more about adjusting the infrastructure and rewriting the algorithms to deal with the more complex tasks Google was working on (like voice).
This video is important because it highlights the importance of listening not just to what Cutts is saying, but to what it might mean. When he talks about changes in how Google needs to treat content in light of shifting user behavior or technology, we need to stop and think, "Have any of the recent algorithm updates done that?" If the answer is "no" or "not well" and it's a topic he's covering, you know that a change is coming.
Matt Cutts in Conclusion
I'll let each reader draw their own conclusions about the impact Cutts has had, and about what's most important. These are the highlights of his career through my eyes and in retrospect, but as mentioned at the beginning, I welcome your additions in the comments below.
Before we end this post, I'll leave you with a few of the funner (not officially a word, but it should be) things I've seen over the years and while compiling and sorting the content for this article. Just a few things to lighten the mood while still calling it work.
Matt Cutts at School
Want to read the papers Cutts worked on when he was at UNC? Well you can at http://www.cs.unc.edu/~cutts/papers/.
Matt Cutts as a T-Rex
Enough said.
Matt Cutts on Ranking #1
This mashup by Sam Applegate is a great way to close out the article. Enjoy!
Have a favorite Cutts highlight that isn't included here? Share it in the comments!
Welcome to the years of paid links, link bait, Caffeine, Google bombs, and page speed as a ranking factor. By this point, Google's Matt Cutts had plenty of important things to teach us all about the evolving landscape of search and SEO.
Our story continues by looking back at some of Cutts' blog posts, videos, and thoughts from 2006 to 2010 to get a better understanding of where Google's been, which in turn can be a great way to get a feel for where Google (and therefore SEO) is going next.
If you're just joining us, we've been going year by year, highlighting two or three of the biggest splashes he made. This post has been split into three time periods:
2006 is a hard year to consolidate into just a few snippets (as is pretty much every year after it) but we're going to try. Here's how 2006 is summarized from this author's take on Cutts' public voice.
BMW
For those who don't know, BMW got busted for hidden-content doorway pages on February 4, 2006. Tsk tsk, BMW. On the 7th, Cutts posted the following:
"I appreciate BMW's quick response on removing JavaScript-redirecting pages from BMW properties. The webspam team at Google has been in contact with BMW, and Google has reincluded bmw.de in our index. Likewise, ricoh.de has also removed similar doorway pages and has been reincluded in Google's index."
OK – so the quote itself is nothing special but I needed to include this as it let the world know something very specific. Google is like George Orwell's "Animal Farm". In that book the commandment is, "All animals are equal, but some animals are more equal than others." It seems the same is true at Google.
I don't expect that if I got busted for cloaking that Google would be in touch with me personally and that I'd actually get an expression of appreciation from Cutts for addressing a violation of their guidelines. This animal is less equal it seems.
Paid Links
On his blog, Cutts often answers user-submitted questions. Here's one:
Q: If one were to offer to sell space on their site (or consider purchasing it on another), would it be a good idea to offer to add a NOFOLLOW tag so to generate the traffic from the advertisement, but not have the appearance of artificial PR manipulation through purchasing of links?
A: Yes, if you sell links, you should mark them with the nofollow tag. Not doing so can affect your reputation in Google.
While Cutts was careful to say it will affect reputation, not that you'd get penalized, he's reiterating that paid links (either buying or selling) can and likely will have a negative impact on your rankings. One could easily add "if caught," as it was 2006 and, for those who remember, paid links tended to work far better then than they do now.
There was always a lot of flak pointed at Google for statements about nofollowing paid links, with the assertion that webmasters shouldn't have to do Google's job for them (I've heard the same about schema). Either way, it's an important Q&A in the context of highlighting the continuing battle between Google and SEOs in the area of paid links. We'll see more on this below.
GoogleGuy on Google Video
I'd love to be able to post Cutts' initial videos from their original source, but alas, they were published over on Google Video before Google had purchased YouTube. Still, it's worth noting that even before YouTube, Cutts was making videos to help webmasters understand how to deal with Google.
On a similar tangent (and as alluded to above), in August 2006 Cutts admitted to being GoogleGuy, a username he used on a variety of forums to answer questions, and he acknowledged it again at SES San Jose 2007 (a conference I had the pleasure of speaking at, which gave me the opportunity to witness this confession live).
While not directly related to a Cutts statement, it was on October 9 of 2006 that the announcement was made that Google would be acquiring YouTube (for a paltry $1.65B).
Matt Cutts in 2007
Let's just cut straight to it as 2007 was an interesting year in search.
Privacy
In discussing privacy, Cutts said in his blog:
"I've seen firsthand how much Google works to protect users' privacy. I personally believe that we take more precautions and safeguards than any other major search engine. We also strongly protect users' privacy outside of Google (e.g. last year when the DOJ tried to get access to users' queries, and Google was the only company out of 30+ that said "no" and went to court about it – and won)."
I'm going to give credit where it's due. Say what you will (see the comments below, I'm sure), but given the data stores they have, he's right that Google has done a decent job of protecting user data from outside access. He goes on to say:
"... your ISP has a superset of data that Google has, because everything you do passes through your ISP. So your ISP may have much more detailed records about places where you go on the net, plus they have a verified identity with something like a credit card, and they actually know which IPs you're on."
Suddenly the switch to (not provided) in 2013 makes a lot more sense. Of course, there's the part about keyword data passing for AdWords but that's a different story.
Snippets
Let's begin with a video Cutts produced with the help of Google's Kirkland offices:
In the video, Cutts talks about how data is selected to appear in the search results. Aside from being interesting in and of itself, it's a unique opportunity to hear how it was done in 2007. I also find it interesting that when the office had Cutts at their disposal and an hour to kill, the first thing they thought to do was create some videos.
SEO Emails and Other Pubcon Musings
Probably my favorite note from Cutts in all of 2007 was when he stated in his keynote at Pubcon, "[The cold call emailers] even e-mail Google with automated messages that say 'we can increase the visibility of Google.com.' Here I thought we were a pretty well-known site."
But that's more for humor. Other great quotes from Pubcon were:
"Linkbaiting is essentially white hat SEO."
As true today as it was then, assuming the bait itself is ethical (i.e., just good content). And,
"Building your strategy around showing up #1 for your trophy phrase is not a good approach. If you're going after that, it's fantastic if you get it, but diversification is even better."
From copy to links to keywords, diversity is security. Write it on your hand so you don't forget.
Essentially, Cutts spent Pubcon and much of 2007 taking Google's message from "Here's what not to do" and adding in "Here's what you should be doing," in ways that gave webmasters actual action items, not just confusing jargon that left us more confused than helped and worried that the next thing we dreamt up would get added to the naughty list anyway.
Matt Cutts in 2008
Free Links
We all love links, right? The only problem is that they take so darned long to develop (assuming you're not looking for ones that trigger an "unnatural links" warning or penalty). With this in mind, note the wording Cutts used to announce the launch of what is probably one of the best features in Webmaster Tools. He wrote:
"I can't believe a new feature from Google isn't getting more notice, because it converts already-existing links to your site into much higher quality links, for free. The Google webmaster blog just announced that you can find the pages that link to 404 pages on your site."
Even today this feature doesn't get the attention it deserves but that's OK... let's just keep it between us.
Matt Cutts Likes Keyword Cramming
OK, that's obviously tongue-in-cheek, but a video interview with Cutts helps illustrate the limitations Google faced then and the contradictory nature of what webmasters are told to do.
What Cutts says here is that we need to make sure to get in all the keywords people might search. The problem is twofold:
This isn't going to read properly and is going to use terms people might not be familiar with as synonyms. Fortunately, Google has addressed this and is much better at filling in the blanks itself (e.g., understanding that "usb drive" and "thumb drive" are likely meant to produce the same results). The worse problem is,
This is exactly the opposite of what would be recommended today. It's this conflicting element that results in animosity and confusion. How can Google hold against me today what they told me to do yesterday?
I include this as a critical bit of Cutts from the year as, while the information itself is interesting enough, it's more this conflict that makes this memorable.
Irrelevant Link Bait
In an interview with Eric Enge, Cutts went on record stating:
"So, what are the links that will stand the test of time? Those links are typically given voluntarily. It is an editorial link by someone, and it's someone that's informed. They are not misinformed, they are not tricked; there is no bait and switch involved. It's because somebody thinks that something is so cool, so useful, or so helpful that they want to make little sign posts so that other people on the web can find that out.
Now, there is also the notion of link bait or things that are just cool; maybe not helpful, but really interesting. And those can stand the test of time as well. Those links are links generated because of the sheer quality of your business or the value add proposition that you have that's unique about your business. Those are the things that no one else can get, because no one else has them or offers the exact same thing that your business offers."
So the interesting thing here that made it one of my favorite tidbits of the year was that Cutts essentially told the truth (for the time) despite the fact that it contradicts previous statements. Here he's saying that a link to a resource that isn't particularly helpful but is cool holds weight which could be construed as meaning that the voluntary nature of a link is more important than its relevancy or usefulness.
Matt Cutts in 2009
PageRank Evaporation
For anyone who's been an SEO for more than 4 or 5 years, you'll be familiar with the idea of PageRank sculpting. For those who are not, the premise is simple: if you have 10 internal links on a page, the internal PageRank will flow with 1/10th heading to each target page (I'm brutally over-simplifying here, but I'm hoping you'll forgive me).
The idea behind PageRank sculpting was that if you nofollowed 5 of those links (say, to your privacy policy or other non-keyword-targeting pages), then you would pass 2/10ths of the weight through each of the remaining 5. And in fact, that was exactly the way it worked. Until he revealed that:
"So what happens when you have a page with "ten PageRank points" and ten outgoing links, and five of those links are nofollowed? Let's leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn't count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each."
Essentially this turned everything upside down. Fortunately, I personally didn't chase PageRank sculpting as a strategy because it didn't make sense to me, but rather than simply ceasing to be a positive, it turned into a negative, with (in the example above) 50 percent of the internal PageRank going absolutely nowhere. This became known as PageRank Evaporation.
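A tiny worked version of that arithmetic, using the same ten-point, ten-link page from the quote above (the numbers are illustrative and, like the quote, ignore the damping factor):

```python
# Sketch: PageRank flow per followed link before and after Google's change,
# for a page with 10 "PageRank points", 10 outgoing links, 5 of them nofollowed.
pagerank_points = 10
total_links = 10
followed_links = 5

# Old behavior: nofollowed links were excluded from the denominator.
old_flow_per_link = pagerank_points / followed_links               # 2.0 points each
# New behavior: all links count in the denominator; the nofollowed share evaporates.
new_flow_per_link = pagerank_points / total_links                  # 1.0 point each
evaporated = pagerank_points - new_flow_per_link * followed_links  # 5.0 points lost

print(old_flow_per_link, new_flow_per_link, evaporated)
```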
Of everything from Cutts in 2009, this may be the most important.
Or was it...
Caffeine
In an interview, Cutts was asked about the Caffeine update. While the full update didn't roll out across Google until 2010, they had put it in a publicly accessible sandbox location in August and rolled it out on one live datacenter later in the year. This video gives great insight into how Google works.
As an infrastructure update more than an algorithmic one, the changes were huge and far-reaching, and really showed the push into speed and faster indexing to allow for a broader spectrum of search capabilities, foreshadowing advances into new areas of search and new features.
Google Bombs
And one I'm going to include in my shortlist of important tidbits from Cutts was when he wrote on his blog about Google bombs. In answer to a question about how automated the detection of Google bombs is, asked when Obama's Whitehouse page stopped ranking for "failure" only a few hours after the bomb became public, he replied:
"The short answer is that we do two different things – both of them algorithmic – to handle Google bombs: detect Google bombs and then mitigate their impact. The second algorithm (mitigating the impact of Google bombs) is always running in our productionized systems. The first algorithm (detecting Google bombs) has to process our entire web index, so in most typical cases we tend not to run that algorithm every single time we crawl new web data. I think that during 2008 we re-ran the Google bomb detection algorithm 5-6 times, for example.
The defusing algorithm is running all the time, but the algorithm to detect Google bombs is only run occasionally. We re-ran our algorithm last week and it detected both the "failure" and the "cheerful achievement" Google bombs, so our system now minimizes the impact of those Google bombs. Instead of a whitehouse.gov URL, you now see discussion and commentary about those queries."
This discussion with Cutts is important for two reasons:
It illustrates the legitimate questioning about the highly coincidental timing of a Googlebomb being publicly mentioned and its defusing by Google. Are there manual actions being taken? Not if you ask Cutts, but I do sometimes wonder. And
Nostalgia. I remember the Googlebombs well and it's fun to think back to them.
If you don't know about the Googlebombs you can find a bit more info on his blog at http://www.mattcutts.com/blog/defuse-googlebomb/.
And a Quote ...
To close the 2009 section of this post, I'd like to end with a great quote from Cutts on his blog:
"The objective is not to 'make your links appear natural', the objective is that your links are natural."
Good advice.
Matt Cutts in 2010
Google I/O
The video is an hour long but it's an interesting enough watch if you have time. I didn't watch it this time but I did watch it when it was first uploaded. Cutts hosts a session at the Google I/O conference and tears some sites apart. The video is at:
The funniest part comes in at about 5:48 where he essentially recommends being lazy. If you closed your eyes and didn't know who was speaking you'd almost think he was a blackhat during this part.
The video itself has nothing revolutionary in it, but I've referenced it a number of times, and if you want to get a good understanding of where Google was at in 2010, this is a great video. It also gave me personally a different take on Cutts and reinforced to me that:
He's a human.
He quasi-contradicts himself over time. (See previous point)
Page Speed
In February, Cutts put out a video discussing the importance of page speed vs. relevancy:
In it, he alluded to Google potentially using page speed as a factor in ranking.
Two months later he discussed the announcement that they were doing just that in his blog. On the subject he wrote:
"I know that there will be a lot of discussion about this change, and some people won't like it. But I'm glad that Google is making this step, both for the sake of transparency (letting webmasters know more about how to do better in Google) and because I think this change will make the web better. My takeaway messages would be three-fold: first, this is actually a relatively small-impact change, so you don't need to panic. Second, speeding up your website is a great thing to do in general. Visitors to your site will be happier (and might convert more or use your site more), and a faster web will be better for all. Third, this change highlights that there are very constructive things that can directly improve your website's user experience. Instead of wasting time on keyword meta tags, you can focus on some very easy, straightforward, small steps that can really improve how users perceive your site."
This was a pivotal moment in SEO. Until this point all we'd heard from Cutts in regards to rankings had mainly to do with content, links, link structure, and making sure the code allowed the bots to get through and prioritize.
This was the first time crawlable code was compared with other crawlable code and one was deemed better than the other, to the point where it was made a ranking factor. SEO was no longer just about getting good content in front of bots... err, visitors; it had become about making changes no one would notice to eke out fraction-of-a-second improvements in things like load time.
SEO grew up then, and Cutts was the one who announced it (to me at least, though it had been posted to the Google blog a couple of hours prior, and even that post was done in part by... Cutts).