Monday, September 14, 2015

Gary Illyes: Panda Will Update Slowly with Refreshes, Penguin Updates Will Be Real-Time – Eventually


Google Webmaster Trends Analyst Gary Illyes joined Bruce Clay for a candid chat about topics vital to SEO, including content for mobile, Gary’s dream of a world where everyone employs HTTPS, and the reality of the Phantom updates — plus plenty of talk about penalties and algorithm updates.

Here are key highlights right off the bat:

  • Panda updates will come slow with refreshes and Penguin updates will happen in real time (eventually).
  • HTTPS can be seen as a tiebreaker between otherwise equal sites.
  • Phantom was a core algorithm change.
  • Google considers Panda part of the core algorithm now and does not think of it as a penalty.
  • We are “months away” from the next Panda update.

Watch the entire interview, and/or read the transcript!

Bruce Clay: There’s been a lot of discussion about mobile and HTTPS, local and performance and other things that are not specifically on-page edits. What are some of the most important factors out of these? How do they play?

Gary Illyes: I wouldn’t pick one as most important because each of them is important for its own reasons. For example, mobile is big. You can’t deny that. It’s really, really very big. In at least ten countries we have more mobile searches nowadays than desktop searches. So everyone should focus on mobile, regardless of their business model and where their users are coming from. Users demand content they can consume on their mobile devices – this doesn’t mean it has to be a mobile-friendly site. For example, it can be app-indexing enabled. We have tons of documentation both for apps and for mobile websites.

“Users demand content they can consume on their mobile devices.” — @Methode

You mentioned local. From Google’s point of view, I’m not too familiar with Local as they’re a separate branch. But as a Google user, I think Local is really important for specific businesses. If you have, for example, a restaurant you definitely want to be on local because people want to be able to find you easily when searching for restaurants.

As another example, I’m kind of a bookworm and a collector. For really old books that I want to add to my collection, I usually do extensive research. In those cases I’m not looking for a local entry in the SERP. However, if I want to pass time and I have two hours in London and I want to go to a bookshop, then Local can give me relevant and useful information. And in those cases, Local becomes super important. Same for restaurants — if I search for an Italian restaurant in Boston, then I expect a Local result because I want to get there fast.

HTTPS is important for me and also for Google. I do wish that everyone on the Internet would go HTTPS to protect their users but of course this is just wishful thinking. I hope that I see more and more websites on HTTPS because I think that privacy is important, but of course I can’t expect everyone to go HTTPS.

If you’re in a competitive niche it can give you an edge from Google’s point of view. The HTTPS ranking boost acts like a tie-breaker. For example, if all quality signals are equal for a given result, the one that has HTTPS may get the added boost.
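The tiebreaker behavior Gary describes can be pictured as a simple two-level sort: rank primarily by quality/relevance signals, and only when those are equal does HTTPS nudge a result ahead. The sketch below is purely a hypothetical illustration of that idea — the site names and scores are invented, and this is in no way Google's actual ranking implementation.

```python
# Hypothetical sketch of an HTTPS "tiebreaker" in a ranking sort.
# Scores and URLs are invented for illustration; this is not Google's algorithm.

def rank(results):
    # Sort by relevance score, descending. Among equal scores,
    # HTTPS results (https=True) sort ahead of non-HTTPS results,
    # because `not True` (0) sorts before `not False` (1).
    return sorted(results, key=lambda r: (-r["score"], not r["https"]))

results = [
    {"url": "http://example-a.com",  "score": 0.90, "https": False},
    {"url": "https://example-b.com", "score": 0.90, "https": True},
    {"url": "http://example-c.com",  "score": 0.95, "https": False},
]

for r in rank(results):
    print(r["url"])
```

Note that HTTPS only matters for the two sites tied at 0.90 — the higher-scoring site still wins outright, which matches the "all quality signals are equal" framing above.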

“#HTTPS can give you an edge; it acts like a tiebreaker.” @Methode  #SEO

BC: Google laid out a lot of penalties over the last few years – a lot of people came to us and other firms to remove penalties, and we’ve been diligently editing the customers’ sites, pruning bad links and stabilizing the content. Because the algorithm isn’t updated as often as we’d hope, those companies are actually wondering if it will work. What will happen? When are we going to get updates? How often will they happen? Are they part of the algorithm? What do you recommend a company do after they’ve fully repaired a site?

GI: (I will assume you’re talking about Panda and Penguin and not random algorithm updates that you don’t know about.) With Panda, we don’t think of it as a penalty. We think of it as a general ranking update. Long term, I would like to change people’s perception about how they think about Panda, for example, because in our view that’s really not a penalty. Sites sometimes may become really prominent for queries they don’t have great value content for. In those cases, since we want to offer our own users the best possible experience on our search results — we want them to find the correct information for their query — we have to adjust those overly prominent sites in our results. Not just Panda but some of our algorithms try to do that. They try to give more value to content that answers the user’s query better. If you think about it, that’s pretty much in line with what our general algorithms do. It’s just that with Panda we can’t update as fast as we do with our general algorithms.

As for Penguin, I was on vacation for a few weeks and fell out of the information flow about these things — Penguin refreshes, in the long run, as far as I know, will happen almost or close to real time. As usual, I don’t have anything to announce, and I think I already said this, but an update is still really far away. I think we are still talking about months.

BC: If we are able to work with clients on quality issues and we make changes, we might, then, be able to see a ranking change within days or weeks but in the case of links it will be longer?

GI: I think with Penguin, real-time crawling or re-indexing would give results already. With Panda we are still talking about refreshes, and as far as I know these refreshes will be rolled out really slowly. This is an infrastructure change on our end and I think we are going to stick with it.

BC: It seems like if the issue is quality, then we just have to keep working on it and wait for the updates and refreshes. And if it’s linking, and we have removed a large number of the bad links, we may be able to see something almost real-time. In many cases, we’ve removed and disavowed many, many files and the client doesn’t recognize a change in ranking. That could be because there are multiple algorithmic factors dampening their rankings. Is there something in the algorithm that says ‘Your quality was low so we’re going to lower you?’ Is that a manual process or is it algorithmic?

GI: No, it’s algorithmic. It’s part of our core ranking algorithm. With quality, SEOs tend to overthink it. I work a lot with websites; it’s part of my job. And what I see is that in many cases SEOs ‘over-SEO’ a website. They are trying to rank for keywords or terms that the site doesn’t have great content for. If I search for an Italian restaurant in New York, I don’t actually want results in the tri-state area — I want results for New York City. And what I see often is people trying to rank for queries they don’t have high quality and great value content for. Sooner or later the algorithm will catch it. Don’t overthink it. It’s simple content analysis and they will adjust the rank for the site and that’s it.

BC: You also indicated that Panda is part of the standard algorithm and Penguin is part of it and it’s real-time — is that right?

GI: That’s our plan — to integrate Panda into our core ranking algorithm; I don’t know if we’ve managed to do that yet. With Penguin, we still have lots of work to do as far as I know.

BC: So we’re going to be waiting for various implementations. How will we find out when they’re implemented?

GI: In general we tend to not announce core ranking algorithms. We don’t think that people should focus too much on them. Most search engines publish guidelines about how to build quality websites, and that’s what people should focus on. If you are doing more than that, probably you are overdoing it. If we are asked about them, maybe we will confirm that we rolled out something that is associated with Panda or Penguin, but in general for core ranking algorithm changes we don’t confirm anything.

BC: Does the Phantom update exist? Is it imaginary? What is Phantom?

GI: As far as I know it was a core ranking algorithm change. I don’t know much about it – and, in fact, I don’t want to know much about it, even though I am sometimes involved in ranking changes. We can go back to the quality guidelines and the Webmaster Guidelines. I would really recommend focusing on those rather than things like the Phantom update. It’s not productive and I’m very sure there are way better things to focus on.

BC: Is there something else that we should chiefly be paying attention to other than the Webmaster Guidelines?

GI: There’s nothing more that you really want to focus on. If you follow the Webmaster Guidelines, you will do well. I see many, many websites that are not doing much SEO on their websites and they are doing remarkably well. If they can do it, then pretty much anyone can do it. It’s not just Google that has these guidelines. Other search engines have done the same thing. Just focus on creating quality and great value content for the user. Don’t try to rank for overly vague keywords. It won’t work out well in the long run.

Thank you so much to Gary Illyes for joining us all the way from Switzerland! His insights and commitment to knowledge transfer are appreciated by the entire digital marketing community. Catch him speaking at SMX East and Pubcon Las Vegas!
