Matt Cutts kicked off day two at Pubcon with another of his signature
keynotes, dispelling myths, talking about spammers, and, at the urging
of the audience, weighing in on Jason Calacanis' keynote from day one.
First, Cutts spoke about Google’s “Moonshot changes,” which he broke down into these areas:
- Knowledge Graph
- Voice search
- Conversational search
- Google Now
- Deep Learning
He explained that Deep Learning is the ability to draw relationships
between words and extend them to related words, and that it can help
search better handle the nuances of search queries.
Deep Learning in Regular and Voice Search
He explained that voice search is changing the types of queries people
use, and that a query can now be refined without repeating earlier
parts of the search. Google does this when it knows the user is still
searching the same topic but drilling down to more specifics.
Cutts shared an example in which he searched for weather and kept
refining the query without having to repeat "What is the weather?"
because Google recognizes that the user is still refining the previous
search. "Will it rain tomorrow?" in voice search brings up the weather
forecast for Las Vegas, Nevada. When he follows up with "What about in
Mountain View?" Google shows the weather for Mountain View, understanding
it as a refinement of the earlier query. Asking "How about this weekend?"
then returns the weekend forecast for Mountain View.
Hummingbird, Panda & Authorship
Next up, Cutts spoke about Hummingbird, saying he feels that many of
the blog posts about how to rank with Hummingbird are not that relevant.
The fact is, Hummingbird was live for a month and no one noticed.
Hummingbird is primarily a core quality change; it doesn't impact SEO
that much, he said, despite the many blog posts claiming otherwise.
Of most interest to some SEOs is that Google is looking at softening
Panda. Sites caught in Panda's grey areas, if they are quality sites,
could start ranking again.
Google is also looking at boosting authority through authorship. We
have seen authorship becoming more and more important when it comes to
search results and visibility in those results; Cutts confirmed this is
the direction in which Google will continue to move.
Google on Mobile Search Results
Next, he discussed the role of smartphones and their impact on search
results. This is definitely an area SEOs need to continue to focus on,
as it is clear that sites that are not mobile-friendly will see a
negative impact on their rankings in the mobile search results.
Smartphone ranking will take several things into account, he explained:
- If your phone doesn’t display Flash, Google will not show Flash sites in your results.
- If your website is Flash heavy, you need to consider its use, or ensure the mobile version of your site does not use it.
- If your website routes all mobile traffic to the homepage rather than the internal page the user was attempting to visit, it will be ranked lower.
- If your site is slow on mobile phones, Google is less likely to rank it.
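The redirect point above is the one sites most often get wrong. A minimal sketch of the fix (the `m.` subdomain and URLs here are hypothetical, not from the talk): preserve the requested path and query when sending mobile visitors to the mobile version, rather than dumping every visitor on the homepage.

```python
from urllib.parse import urlsplit, urlunsplit

def mobile_url(desktop_url, mobile_host="m.example.com"):
    # Keep the path, query, and fragment intact so a visitor asking for
    # an internal page lands on that page's mobile version, not the homepage.
    parts = urlsplit(desktop_url)
    return urlunsplit((parts.scheme, mobile_host, parts.path,
                       parts.query, parts.fragment))

print(mobile_url("http://www.example.com/products/widget?id=42"))
# -> http://m.example.com/products/widget?id=42
```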
Cutts was pretty clear that with the significant increase in mobile
traffic, not having a mobile-friendly site will seriously impact the
amount of mobile traffic Google will send you. Webmasters should begin
prioritizing their mobile strategy immediately.
Penguin, Google’s Spam Strategy & Native Advertising
Matt next talked about Google's spam strategy. When Penguin originally
launched and spammers on blackhat webmaster forums bragged that they
weren't touched by it, the webspam team's response was, "OK, well, we
can turn that dial higher," and they increased its impact on search
results. Cutts said that when spammers post about wanting to do him
bodily harm, he knows his spam team is doing its job well.
He did say they are continuing their work on some specific keywords
that tend to be very spammy, including “payday loans,” “car insurance,”
“mesothelioma,” and some porn keywords. Because they are highly
profitable keywords, they attract the spammers, so they are working on
keeping those specific areas as spam-free as possible through their
algorithms.
He discussed advertorials and native advertising, and how Google
continues to penalize those who use them without proper disclaimers
showing that the content is paid advertising. Google has taken action on
several dozen newspapers in the US and UK that were not labeling
advertorials and native advertising as such, and that were passing
PageRank. He did say there is nothing wrong with advertorials and native
advertising when they are disclosed as such; it is only undisclosed paid
content that Google will take action against.
Spam networks are still on Google’s radar and they are still bringing them down and taking action against them.
Bad News for PageRank Fans
For PageRank devotees, there is some bad news. PageRank is updated
internally within Google on a daily basis, and every three months or so
that data was pushed out to the Google Toolbar, where it was visible to
webmasters. Unfortunately, the pipeline used to push the data to the
toolbar broke, and Google does not have anyone working on fixing it. As
a result, Cutts said we shouldn't expect to see any PageRank updates
anytime soon, certainly not this year. He doesn't know whether it will
be fixed, but Google is going to judge the impact of not updating it. As
things currently stand, the speculation that PageRank could be retired
is not far from the truth.
Communication with Webmasters, Rich Snippets, JavaScript & Negative SEO
Google continues to increase its communication with webmasters. It has
made new videos covering malware and hacking, as Google is seeing these
problems more and more, yet not all webmasters are clear about what they
are and how to fix them. Google is also working on including more
concrete examples in its guidelines, to make it easier for people to
determine the types of things that cause ranking issues and to point
webmasters in the right direction to fix them.
Cutts stressed that he is not the only face of Google search. The team
does 100 speaking events per year and runs Hangouts on Air to educate
webmasters. It also holds Webmaster Office Hours to increase
communication and give users the chance to engage and ask questions of
the search team.
Google is becoming smarter at reading JavaScript, which has definitely
been used for evil by spammers. However, Cutts cautioned that even
though Google is doing a better job of reading it, don't use that as an
excuse to build an entire site in JavaScript.
Rich snippets could get a revamp, with Google dialing back the number
of websites able to display them. "More reputable websites will get rich
snippets while less reputable ones will see theirs removed," said Matt.
Matt also said negative SEO isn't as common as people believe and is
often self-inflicted. One person approached him claiming a competitor
was ruining them by pointing paid links at their site. Yet when Cutts
looked into it, he found paid links from 2010 pointing to the site, and
noted there was no way a competitor would have bought paid links back in
2010 to point at the site, since the algorithm penalizing paid links
didn't launch until a couple of years after those links went live.
The Future of Google Search: Mobile, Authorship & Quality Search Results
On the future of search, he again stressed the importance of mobile
site usability. Mobile has skyrocketed from 6 percent of YouTube traffic
two years ago, to 25 percent last year, to 40 percent this year. Some
countries now have more mobile traffic than desktop traffic. Cutts
reiterated, "If your website looks bad in mobile, now is the time to fix
that."
Google is also working on machine learning and training their systems
to be able to comprehend and read at an elementary school level, in
order to improve search results.
Authorship is another area where Google wants to improve, because
tying an identity to an authorship profile can help keep spam out of
Google. They plan to tighten up authorship to combat spam and they found
if they removed about 15 percent of the lesser quality authors, it
dramatically increased the presence of the better quality authors.
They are also working on the next generation of hacked site
detection, where Cutts said he is not talking about ordinary blackhat,
but “go to prison blackhat.” Google wants to prevent people from being
able to find any results for the really nasty search queries, such as
child porn. Cutts said, “If you type in really nasty search queries, we
don’t want you to find it in Google.”
Cutts' current advice to webmasters, again, is to get ready for mobile.
He spoke about the convenience for visitors when sites use Google's
autocomplete annotations on web forms, making it easier for people to
fill out forms on your site. The markup to add to forms is easy to
implement and will be available in the next few months.
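Cutts didn't show the markup itself, and it hadn't shipped at the time of the talk; presumably it works along the lines of the standard HTML `autocomplete` attribute, which tells the browser which saved value can pre-fill a field. A hedged sketch (field names are hypothetical) generating such annotated inputs:

```python
def annotated_input(name, token):
    # The autocomplete token (e.g. "given-name", "email") tells the
    # browser which saved value can pre-fill this field for the visitor.
    return f'<input type="text" name="{name}" autocomplete="{token}">'

for field, token in [("fname", "given-name"), ("email", "email")]:
    print(annotated_input(field, token))
```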
The next generation of the algorithm will look at the issue of
ad-heavy websites, particularly those with a large number of ads placed
above the fold. This is not really a surprise, as it makes for a bad
user experience and Google has previously announced that its page
layout algorithm targets this. But sites using JavaScript to make it
appear to Googlebot that the ads aren't above the fold should look at
moving those ads before the algorithm impacts them.
Matt Cutts Q&A
During Q&A, Cutts discussed links from press release sites. He said
Google identified the sites that were press release syndication sites
and simply discounted them. He stressed that press release links weren't
penalized, because press release sites do have value for press and
marketing reasons, but those links won't pass PageRank.
The problem of infinite-scrolling websites was raised, such as how
Twitter keeps loading more tweets as you scroll down. He cautioned that
while Google tries to do a good job with these, other search engines
don't handle infinite scrolling as well. He suggested that any site
using infinite scrolling also provide static links, such as a
pagination structure, so bots that don't wait for the page's infinite
loading can still access all the information.
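The pagination fallback he describes can be sketched simply: alongside the script-driven infinite scroll, the server emits plain prev/next anchors a crawler can follow. A minimal illustration (the `/tweets` path and markup are hypothetical, not from the talk):

```python
def pagination_links(base_path, page, total_pages):
    # Static prev/next anchors that a crawler can follow even if it
    # never executes the page's infinite-scroll JavaScript.
    links = []
    if page > 1:
        links.append(f'<a rel="prev" href="{base_path}?page={page - 1}">Previous</a>')
    if page < total_pages:
        links.append(f'<a rel="next" href="{base_path}?page={page + 1}">Next</a>')
    return links

for tag in pagination_links("/tweets", 2, 10):
    print(tag)
```

Each page of content thus remains reachable through ordinary links, while scrolling users still get the seamless loading experience.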
Someone asked whether being very prolific and publishing many posts
per day has any impact on search rankings. Cutts used the Huffington
Post as an example: it has a huge number of authors, so logically it
publishes many posts daily. However, he said posting as much as your
audience expects to see is the best way to go.
In closing, Cutts said they are keeping a close eye on the mix of
organic search results with non-organic search results and says he would
also like to hear feedback on it.
While no new features were announced during his keynote at Pubcon,
Cutts packed his presentation with many takeaways for webmasters.
Source: http://searchenginewatch.com/article/2302895/Matt-Cutts-on-SEO-PageRank-Spam-the-Future-of-Google-Search-at-Pubcon-Las-Vegas