Google Webmaster Central Hangout 13th Jan 2015

Google has for a while been holding a Google Webmaster Central Hangout on a weekly basis, and you can find the previous hangout here, which contains a rather great discussion on Google Panda and poor-quality comments. Anyone can join in on the hangout, and you can post questions in advance, with John Mueller answering the most popular upvoted ones.

If you don’t have time to watch the whole video then you are in luck. We have included all the relevant parts of the transcript below (after removing all the superfluous conversation) as well as expanded on the topics covered with our brief opinions and links for further reading. You will also see a time reference next to each question title so that you can move to the relevant part of the video should you wish.

To help you find the questions of most interest to you, there is a summary of all the questions below (with internal links to take you to the relevant part of the page):

Transcript Commentary of the Google Webmaster Central Hangout 13th Jan 2015

Introduction – 0:02

JOHN MUELLER: OK, welcome everyone to this year’s first Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland and part of my role is to communicate with webmasters like you all, publishers, people making websites, and make sure that you’re all aware of the things that we’re working on, maybe specific issues that we found outside when crawling and indexing the web, and also bringing all of the information that you guys have back to our engineers so that we can work with your input as well. So we’ve had, I guess, two years now or three years almost of these Hangouts. And I think these have been running fairly well. So traditionally, I ask anyone for a question in the beginning. Do any of you want to ask this year’s first question?

Question 1: 1:02 – How important will it be for a company to switch to HTTPS?

BARUCH LABUNSKI: Well, I just wanted to ask about HTTPS. […] So for 2015, how important will it be for a company to switch, to make the change to go HTTPS? Because in reality, if I’m a blogger, and I’m sitting in a coffee shop, and I’m updating my posts, and there’s somebody behind me that wants to hack, sure, they can probably hack, because it’s HTTP. But if I’m wired, do I really need to make that change if you’re doing it all in a company that’s all wired and secure?

JOHN MUELLER: So I guess there are two aspects there. On the one hand, you’re talking about an admin panel for your website. And that’s something where, if you’re using public Wi-Fi, I’d definitely try to find a way to secure that. That could be by using HTTPS. It could be by using a VPN setup, something like that and that’s something you might want to consider in general with any kind of a public Wi-Fi that you might be using, because you never really know what else is happening out there on the Wi-Fi. You can’t really tell.

JOHN MUELLER: The other aspect of making your website run on HTTPS in general for all of your users, I’d say that’s something that is worth keeping in mind for the long run. I wouldn’t say that it’s like the most critical issue that you should address immediately on your website, at least for most websites. Of course, if you’re running a banking website, then maybe that’s a little bit different, and you should probably have some kind of security there. But if you have a normal blog and moving it to HTTPS is a big issue, a big hassle with the hosting, hassle with the embeds, maybe with the ads, with the tracking, whatever you have there, I’d say that’s something to keep in mind for the long run, and at some point, you’re probably going to switch to HTTPS anyway, but it’s not something where I’d say you need to do this as a first step this year.

BARUCH LABUNSKI: Yes, John, but– but there is an incentive that you guys are giving users. And the incentive is that there’s a ranking factor with that.

JOHN MUELLER: Yes. […] It’s small. It’s a small ranking factor. […] And it’s something where I think if you improve your website in general, you’ll probably see bigger changes in search. But it’s something I think that kind of helps boost sites in situations where we essentially have equivalent sites ranking, where I would say, OK, we have this site that’s pretty good, pretty relevant for this query. This other site is also pretty relevant. Which one should we show first? It’s pretty close. So we’ll pick the one that’s running on a secure site out there. It’s not the case that you’re going to jump from number ten to number one and bypass all your competitors just by switching to HTTPS. I don’t think that’s what users would expect when they’re doing searches. So that’s something that we’re still taking into account. I think HTTPS is definitely something to keep in mind. If you want to stay ahead of the trends, you should definitely learn how this works, how it’s set up. Understand the subtle problems that come up with HTTPS, like embeds, like the ads, like tracking pixels, tracking scripts, all of that. And that’s something that I think anyone who wants to stay ahead of the curve would be interested in anyway. But I wouldn’t see it as something where I could say all of your clients have to switch to HTTPS immediately. […]

JOHN MUELLER: So this is for search in general more of a long-term issue and not something that will turn the whole search results around next month because we decide to change the ranking factor to be 500 times as strong, something like that. I don’t think that’s going to happen. So focus on it in the long run. Be realistic with the expectations. And when you’re making bigger changes on your website and you’re moving to a different hoster, maybe think about whether you can move to HTTPS while you’re doing that as well.

Best Host News Commentary – This topic seems to be a recurring theme lately, with the last Google Webmaster Central Hangout discussing whether HTTPS boosts traffic. Whilst many are not seeing the results that they would like, with some reports of webmasters abandoning HTTPS, what John Mueller stresses here is that whilst the advantages are not large at present, the factor’s influence is likely to grow over time. There is clearly no rush to change over to HTTPS, but if you do wish to, there is a great guide by Yoast SEO here. In addition, you will want to make sure to use the Change of Address tool within Webmaster Tools.
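If you do make the switch, it is worth verifying that every old HTTP URL sends a single permanent redirect to its HTTPS equivalent. Below is a minimal sketch of such a check in Python (standard library only); the example.com URLs are placeholders for your own pages, and the helper is our own illustration rather than anything Google provides.

```python
import http.client
from urllib.parse import urlparse

def server_redirect(url):
    """Request a URL WITHOUT following redirects, returning the raw
    status code and Location header the server actually sends."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    conn.close()
    return resp.status, location

def is_good_https_redirect(status, location):
    """For a permanent site move we want a 301 pointing at HTTPS."""
    return status == 301 and location.startswith("https://")

# Offline demonstration of the check itself:
print(is_good_https_redirect(301, "https://www.example.com/page"))  # True
print(is_good_https_redirect(302, "https://www.example.com/page"))  # False
```

To audit a real site you would loop `server_redirect` over the URLs in your old sitemap and flag anything that is not a clean 301 to HTTPS.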

Question 2: 5:18 – Discussion about how HTTPS and 301 Redirects may cause fluctuations

MALE SPEAKER 1: OK. I sent you a link on chat. This is a new client that I just took on, like, yesterday, and at the end of October, they did a couple of things on their website. They moved from HTTP to HTTPS. And at the same time, they also redirected some of their categories. So basically, they had two sets of 301 redirects. And a couple of days later, their traffic tanked. Their organic traffic tanked. They say it’s because Google didn’t get the redirects properly. I checked them. They’re pretty fine, as far as I saw. One thing would be that the HTTP to HTTPS redirect from Chrome or any browser returns a 307. But in Webmaster Tools and everything else, as we know, they are 301s. So I’m thinking that’s just a browser thing. And I told them I think there might be some algorithmic issues, because the end of October was also Penguin. So is there any chance you could tell me if there are any problems besides these redirects, if possible?

JOHN MUELLER: I’d probably have to take a more detailed look. From a first glance, it looks like these are essentially just normal algorithmic fluctuations and that can happen when you make bigger changes on your website. I don’t know what else was changed on the website or if it was just at the tripping point, essentially but this is something where, when I look at it and see it on a website, usually it’s just a matter of algorithmic fluctuations that settle down after a while.

[continued discussion, but John Mueller couldn’t add anything else useful without taking a good look at the site.]

Best Host News Commentary – Certainly it was the case that we saw major fluctuations when we changed our Best Host News site over to HTTPS. It lasted about a week and then settled down. We also saw a massive spike in our site being crawled, so we put it down to the process of Google recognizing the site move simply taking time as it recrawled all the links etc. Of course, there are many examples of people suffering drops after switching… see here for one example. John Mueller seems to suggest that these fluctuations are normal whenever you make larger changes to the website. Of course it is always possible there is a bug, as happened to Buffer.
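On the 307 the questioner mentions: when a browser already knows a site is HTTPS-only (via an HSTS header or the preload list), Chrome rewrites the http:// request internally and reports “307 Internal Redirect” without ever contacting the server, while crawlers and Webmaster Tools, which talk to the server directly, still record the real 301. The notes below are our own cheat-sheet, sketched in Python:

```python
def explain_redirect_status(status):
    """Our own summary of redirect codes commonly seen during an HTTPS move."""
    notes = {
        301: "permanent server-side redirect; what crawlers record for a site move",
        302: "temporary server-side redirect; avoid for a permanent move",
        307: ("temporary redirect; Chrome's '307 Internal Redirect' means the "
              "browser upgraded to HTTPS itself via HSTS before the request "
              "ever left the machine"),
    }
    return notes.get(status, "not a common redirect status")

print(explain_redirect_status(307))
```

So a 307 seen only in the browser while every crawl tool reports 301 is, as the questioner suspected, “just a browser thing”.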

Question 3: 8:24 – Does a high number of outgoing dead links affect ranking?

BARUCH LABUNSKI: If a site has a lot of dead links, suppose like his client, let’s say, had a lot of dead links within his blog or website and then does that affect like ranking? I mean, if a lot of– if you have like, let’s say, 300 dead links internally to a site, like in a site that’s pointing– I don’t know– to government websites or whatever, and they’re all dead links?

JOHN MUELLER: No. […] That– I mean, it’s kind of bad user experience. You’re losing your users that way. But it’s not something where I’d say our algorithms would pick up on that and say, hey, you’re not maintaining the links on your site. Therefore, you’re a low-quality website. I don’t think our algorithms would pick up on something like that. So if you see that happening with your website and you run some kind of a crawler over it and you see, oh, 50% of my outbound links have kind of decayed and aren’t working anymore, then, of course, fix that. I’d just fix that for the usability in general. But it wouldn’t have any effect on the ranking, or the way we crawl and index the site.

Best Host News Commentary – This has been the subject of many discussions over the years, with the consensus of many SEOs seeming to be that this was a bad thing, and there are many articles by reputable blogs that classify this as a sure way to get penalized by Google. John Mueller however has set the record straight that it does not affect ranking, but instead should be a concern for usability purposes only.

Question 4: 9:40 – When changing to HTTPS do we need to change all internal links to point to HTTPS version?

MALE SPEAKER 2: My question is that if we use 301 redirect to HTTPS and we do not change any URL on the website, that means whatever URL are being hyperlinked on the website are HTTP version, just only that we need to click and they are being 301 redirected. So what is the right way of doing? Is it an efficient way of doing with cross-fitting a website to HTTPS? Or do we need to remove entire HTTP version of URLs and we need to put HTTPS as in hyperlinked as well.

JOHN MUELLER: Setting up the redirect is definitely a good first step. I would also make sure you have the rel=canonical set up properly. I would not block the old version of the website. That’s something that should remain crawlable and indexable. We want to be able to find those redirects. So set up the redirect, make sure you have rel=canonical set up properly, make sure that pages actually work on HTTPS so that you don’t have issues with mixed content type problems. And then essentially you should be fine.

JOHN MUELLER: This is still kind of a site move situation, so it’s not the case that you set up these redirects, and everything just works properly without having to worry about anything else. It can still take a while for everything to be processed. You might still see fluctuations happening there. So those are the kind of things to keep in mind.

Best Host News Commentary – We have already mentioned above that you should follow the proper site move procedures in Webmaster Tools, but as part of an effective site move you would do a wildcard redirect. Of course, redirects lose a little link juice, so keeping redirect chains to a minimum is recommended. You can find a great article on redirection best practices here.
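One of the “mixed content type problems” John mentions is easy to check for yourself: scan each page for resources still embedded over plain http://, since browsers block or warn about those on an HTTPS page. A small sketch using only Python’s standard library (the sample markup and hostnames are made up):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect embedded resources (scripts, images, stylesheets, iframes)
    loaded over plain HTTP, which would be flagged as mixed content
    once the page itself is served over HTTPS."""
    RESOURCE_TAGS = {"script", "img", "link", "iframe", "source", "embed"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in self.RESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)

def find_mixed_content(html):
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure

page = '<img src="http://cdn.example.com/a.png"><script src="https://ok.example.com/b.js"></script>'
print(find_mixed_content(page))  # ['http://cdn.example.com/a.png']
```

Running something like this over your templates before flipping the switch catches most of the embed, ad, and tracking-pixel issues John keeps warning about.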

Question 5: 11:24 – Why move to a mobile-friendly site?

Written Submitted Question: All right, here’s a similar question about the move to mobile friendly. Why move to a mobile-friendly site? Engagement on mobile-friendly websites should be higher than on websites that are not. On page speed improvements: if the changes get a score of at least 90 out of 100, will I see a boost in ranking or traffic?

JOHN MUELLER: Essentially, moving to a mobile-friendly site is a great way to help your users when they’re using a smartphone to actually get to your content and be able to use it a little bit easier. So that’s something that you’re primarily doing for your users. I suspect, over time, this is something that we’ll pick up on as well in search, and we’ll try to kind of bubble up the mobile-friendly status a little bit more visibly. We do show it kind of as a badge in the search results already, if you’re searching on a smartphone. Maybe it makes sense to rank these a little bit higher in the long run. I’m not sure where we’ll be heading there. But essentially, primarily you’d be doing this for your users. We see a lot of people using smartphones on the web.

JOHN MUELLER: There are studies I think that were recently done that show 60% of the e-commerce activity I think on Black Friday was done on smartphones. So that’s something there’s a large group of people who are using smartphones, and they might want to visit your site as well. And if you’re blocking them by showing them a site that they can’t actually use on their phone, then that’s going to be the primary impact that you’re seeing, that people go to your site, they can’t actually do anything. They can’t buy anything, or it’s too complicated for them to get through all the processes that are set up for desktop that don’t work that well on mobile. And they’ll jump off and go to a competitor. So that’s essentially the primary change you’d see there. I wouldn’t just move to mobile friendly for SEO reasons. Of course, moving to mobile friendly is always a good thing. And if you’re saying, I’ll only do it for SEO reasons, then I’m not going to hold you back from doing that. But I wouldn’t expect a gigantic jump in rankings just by having a mobile-friendly website.

Best Host News Commentary – This is one of those things that is extremely obvious, especially when you start to investigate how popular mobile usage is. Just take this research by Morgan Stanley Research that shows the growth in mobile internet usage since 2007:


If you want to know more you can read a great article here about mobile usage that is up to date as of January 2015.
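One quick self-check before worrying about rankings: does each of your pages even declare a responsive viewport? This is only one of several signals a mobile-friendly test looks at, but it is the most common omission. A rough heuristic sketch of our own (not Google’s actual test):

```python
import re

def has_responsive_viewport(html):
    """Rough heuristic: a page intended for mobile normally declares
    <meta name="viewport" content="width=device-width, ...">.
    This checks one signal only and is not a full mobile-friendly test."""
    match = re.search(
        r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.IGNORECASE)
    return bool(match and "width=device-width" in match.group(0))

print(has_responsive_viewport(
    '<meta name="viewport" content="width=device-width, initial-scale=1">'))  # True
print(has_responsive_viewport('<meta charset="utf-8">'))  # False
```

A fixed-width viewport (e.g. width=1024) would also fail this check, which is the behaviour you want when hunting for non-responsive templates.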

Question 6: 13:39 – Quick mention of the Usability Report

BARUCH LABUNSKI: Right, and I am seeing the crawl errors come down after two months. You know what I want? I want to mark it as fixed, but hey. So all the errors from our crawl are kind of coming down, and it takes time, right? But what I like is that it takes you to the page and sites, and you can see everything that you’ve corrected and so forth.

JOHN MUELLER: Yeah, the usability report, that’s pretty cool, yeah. Good to see that the numbers are coming down.

Best Host News Commentary – You can find the Mobile Usability Report in Webmaster tools under the Search Traffic menu.  If you have a mobile friendly site you should see zero errors, with something like this:

[Screenshot: Webmaster Tools – Mobile Usability report for www.besthostnews.com]

Question 7: 14:10 – Discussion on Ad placement on mobile and using the .selector {display:none} to hide different ad blocks on mobile which may cause duplicated hidden content

MALE SPEAKER 3: Hey, John. On a related note, I’m filling in for Lyle about the mobile friendly. […]  We were wondering–like, we have ad blocks on the desktop version and we would like to move them, keep them up higher on the mobile. Right now, we have moved them below so it all follows in line. Is that frowned upon? Can that be looked on in any bad way of hiding blocks and having other blocks reappear, shifting where like ad blocks would appear on mobile versus desktop?

JOHN MUELLER: Generally, you have to change the layout if you’re going from a desktop layout to a mobile layout. And usually what you do is you take the individual blocks, and you shuffle them around a bit and see where each block makes sense. So, for example, if you have a sidebar, then that might be something you move into the main column. It might be something you kind of hide from the main column completely if you think that it’s too confusing for mobile users. So essentially, that’s up to you. The way that you shuffle these ad blocks around, the way that you shuffle the content blocks around, that’s something kind of between you and your mobile users. It’s not something where we would say, you need to keep exactly the same order, or you need to keep it in an equivalent order. We expect the content to be equivalent, the primary content of these pages. But how you do the ads on those pages– if you show them at all, if you show different types of ads– that’s essentially up to you.

MALE SPEAKER 3: So that wouldn’t be a negative. Google wouldn’t see it as listed twice. One time it was– on the desktop, it was showing up in one spot. And on the mobile, it would be unhidden on a different spot.

JOHN MUELLER: That’s fine. That’s totally up to you.

MALE SPEAKER 3: OK, thank you, John.

JOHN MUELLER: And that might be something you just want to test, where you like A/B tests with your users and see how they interact with your pages. I think there are lots of subtle things you could try out there. But that’s not something that would have an effect on search.

Best Host News Commentary – Just remember to follow the ad implementation policies. For example, AdSense says that you should not hide adverts using display:none unless you are implementing responsive ad units.

Question 8: 16:16 – Does Google consider brand searches as an indicator of quality and use it as a trust factor for the site?

Written Submitted Question: Does Google consider brand searches as an indicator of quality and use it as a trust factor for the site? Also, are you using clickthrough rate to measure top 10 organic results?

JOHN MUELLER: I’m not aware of us using brand searches specifically as a way to recognize the quality of a site. I think it’s a good thing that people are searching specifically for your brand, but I don’t know if that’s a sign that we would need to rank this site higher in general. So if people are searching specifically for your site, then, of course, we want to show your site. But if people are searching for a specific product that you’re offering, just because people were also searching for your brand name doesn’t necessarily mean that that product is going to be more relevant for them. So from that point of view, I’d shy away from connecting brand searches with ranking elsewhere on your website.

JOHN MUELLER: With regards to clickthrough rate, I think that’s something that’s really useful for a webmaster to look at, because they understand their site best. From our point of view, we primarily use that as a way to kind of check our algorithms in general. So if we make specific changes on our algorithms, we’ll look to see, are people still finding the results they’re looking for in the top positions? Or does this ranking change, for example, result in people clicking on things that are lower on the page? Then maybe we did something wrong with our ranking there. So that’s something on a very aggregated level that makes sense for us to use. On a very detailed site or page-wide level, it’s a very, very noisy signal. So I don’t think that would really make sense as something to use as a ranking factor there.

Best Host News Commentary – This is kind of a contested issue.  Sure we all want to believe John Mueller, but others have carried out extensive tests that might say otherwise.  John Mueller’s last paragraph above does seem to be a little cryptic and does imply that they use CTR to change their algorithms generally, but that it does not actually influence individual site results.  I’d have to say this sounds about right.

Question 9: 17:56 – Follow-up Discussion on Exact Match Domains

MALE SPEAKER 4: Can I follow up on that? […] How do you differentiate between a brand search and a domain search, if it’s something like ours where we’re Xperience Days. And that’s what we sell. So whether someone’s searching for the or an Xperience Day being what we sell. And the same for anyone with an exact match domain, because there’s lots of talk on forums about penalties for exact match domains, but I’m not sure there’s any truth in that. Perhaps you can expand on that. But how do you know what’s what unless someone’s actually putting the dot-com in there?

JOHN MUELLER: I don’t know. […] I don’t have a great answer for that. It might be that we’re not actually looking for brand searches, that we’re just trying to match to relevance there. And if we can recognize the relevance based on the content on the page, the external factors that we might have associated with the page. And that’s something that’s a good sign, I guess, for relevance and for ranking. And usually for brand names, those kind of things, that kind of aligns automatically. So if people are linking to your site with your brand name and someone is searching for your brand name, then that’s something that we can pick up and combine there. With regards to exact match domains, the main issue that we sometimes see is that people will go out and buy an exact match domain, and they’ll expect their site to always rank for those terms. And that’s definitely not going to be the case. It’s not going to be the case that if you have a domain name, like– I don’t know– bestseocompany.com, that your site will automatically rank for best SEO company. We take into account a lot of factors in our rankings. So just because a domain name matches the keywords does not automatically mean that it’ll result in a ranking. Obviously, if you have a very obscure brand name and people are searching for something like– I don’t know– Google, which isn’t really a keyword that people would otherwise be searching for, then it’ll be a lot easier for you to rank for that kind of a keyword, because you have this really weird name and the only website that’s out there that’s really relevant for that weird name is your website. And if people are explicitly searching for that weird name, then obviously we don’t have that many results to pick from. But I think this is something where people have too high expectations for exact match domains. And just because it matches the keywords that people are searching for doesn’t mean that it’ll rank for that.

MALE SPEAKER: Right, and that should be fairly obvious. But I’ve seen specific talk of suppression or penalties for EMDs when I don’t know that that’s ever been confirmed.

JOHN MUELLER: For us, the biggest problem with regards to web spam is if we see like this big collection of sites that are essentially targeting a whole collection of keywords, and they’re really low-quality sites that we essentially just want to throw out anyway, then that’s something where we could correlate the low-quality sites with this exact match domain and take them out like that from a web spam point of view. But it’s not something where I’d say, if you have an exact match domain and this is your main site and people have been searching for it like that, that you’d have to worry about it there. It’s really a matter of this combination of exact match domains and really low-quality sites that kind of runs into that. So just having an exact match domain isn’t really going to cause problems for a site.

BARUCH LABUNSKI: But sometimes– sometimes there is a boost. Remember you were saying one time, there is a boost maybe like two months or something like that. And then the site will go back to page three or four for places that are not that competitive, if you think it’s highly– if the site is really well-made.

JOHN MUELLER: I think that goes back to the other point where if we think this is a relevant site, then we’ll try to show it in the search results. And that’s primarily independent of the domain name, so if you have bestseocompany.com, the domain name, or if you call yourself something else and have that as a domain name, it can still rank for those keywords. And that’s not something that is dependent on those keywords being in the domain name. So if you’re out there trying to target a specific set of keywords, I wouldn’t just go out and buy an exact match domain name and put a website up there and assume that it’ll rank there. But rather, if you’re buying a domain for the first time, make sure that you pick something that maybe leaves a little bit of room to grow, maybe embeds your brand or the image that you want to expose to users, something that you want to keep for the really long run. And use that. Because if you just jump out and just get like this three-keyword combination domain name that just happens to be free, you’re kind of narrowing your target audience down in a way that maybe makes it harder for you in the long run, because you’ll have to either get more domain names to spread out for the new things that you’re working on, or switch to a completely different brand name over time. All of those things are additional hassles that, if you can avoid them, make your life a lot easier.

BARUCH LABUNSKI: Right, yeah. Makes a lot of sense. Users are scared to come to you anyway, because they’re like, OK, your name is Gennifer Flowers. Like there’s so many of them, you know?

JOHN MUELLER: I wouldn’t assume that the average user would pick up on this exact match domain stuff. They probably don’t realize that this is some SEO trick from essentially the last century. It’s just something to kind of keep in mind, especially when you’re setting up a new website for the first time for a company. Picking an exact match domain kind of narrows things down for you. And picking something that works as a brand for the long run where you can say, I can definitely live with this for the next 10, 20 years, that makes it a lot easier for you to grow over time.

BARUCH LABUNSKI: For sure, yeah.

JOSHUA BERG: I always wondered if Google recognizes proper names better as being a brand name rather than some kind of keywords or something. But I think from what you’re saying is maybe the association can be picked up better in queries and stuff.

JOHN MUELLER: I’m not sure what you’re hinting at.

JOSHUA BERG: I mean, like– […] –would look at the site has a strong brand presence. Well, it would be difficult to do that without knowing what that brand is or maybe that’s just overrated.

JOHN MUELLER: I think that’s partially overrated at least from an SEO point of view. I think from a marketing point of view, that’s definitely something worth focusing on, because if people know about your company and they’re explicitly searching for your company, that’s always a good thing. And that’s something that, I think, will always have a positive effect on a website, because people are going to explicitly search for your website. Or instead of searching for blue shoes, they’re searching for blue shoes yourbrandname.com or your brand name. And of course, they’ll find their way to your site, because they’re explicitly searching for your site, too. So I think working to create this presence with your users so that they want to come back to your site. They know about your site. They remember your website. They can explicitly recommend it to other people by saying, hey, I always get my stuff there. That’s something that’s always a good thing to aim for. I imagine it doesn’t work in every niche. It doesn’t work for every type of site. But if you can get that kind of loyalty from your users, that’s always a good thing to have. And that’s completely independent of any SEO aspect there.

JOSHUA BERG: Yes, I was wondering what was mentioned earlier also with the queries being more associated with particular sites, if those sites are frequently searched for those queries. But I realize that that would have a big potential for abuse as well. But it seems like when you see the Google autocompletes the searches, that a lot of brands, persons, individuals– like my name, Joshua Berg, will appear next to SEO, and then Joshua Berg actor, because there’s another Joshua Berg who’s an actor. So just because those are common queries, that those associations are made, is that reasonable to assume that that’s how that works? Because that’s what we see appearing in the autocomplete.

JOHN MUELLER: I’m not sure how autocomplete pulls that together. So I wouldn’t assume that it’s using the same things that we would use for web search for ranking, for example. So sometimes you’ll see things happening in autocomplete that are kind of their own island there. So it’s not something where I’d assume, because autocomplete is doing it like this, that this is the same type of thing we do in web search. But it’s always interesting to see how these kind of different parts of Google pull in different information to try to get similar or maybe slightly different information as well.

Best Host News Commentary – I think EMDs have been a big thing in SEO for a long time, diminished slightly with the EMD update Google did back in September 2012. When we first started in SEO we played around with EMDs and they worked extremely well… at least for a while. Certainly, they may still have a slight benefit, but what John Mueller is saying now is that the benefit is no longer significant.

Question 10: 28:30 – What is the difference between useful content and thin content?

MALE SPEAKER 5: I was wondering about the difference between what’s useful content and what can be considered thin content. An example, if you’re doing a guide for shops in a city and you are publishing a website of their store hours, this is not very much information, but it’s very useful. So it could be considered, if you are the only website with this data, you have collected all this information in one. Can that be considered thin content, or the quality score would be good?

JOHN MUELLER: The quality score, I think, is an AdWords metric. So I can’t speak for that. But essentially, if you’re fulfilling what people are looking for, that sounds like a good thing. And that might be very little information if that’s all people are actually looking for. If they’re looking for the opening hours, if they’re looking for the address, and they land on a page with that information, that’s great. You don’t have to write a novel for that.

MALE SPEAKER 5: OK, thank you. I was wondering, if we include maybe a custom-made quality metric for those stores, in order to give more help to people, this is made by ourselves. Could it be good or bad? Maybe it’s a good idea to do it ourselves, or do we need to try a source, an independent source, for that?

JOHN MUELLER: I think that’s always a good idea. So any time you can go through your content and try to recognize the good parts and the bad parts and then take action on that. I think that’s always a great thing to do. And sometimes there are easy ways to do that, depending on your website. Sometimes it’s a lot harder. So you can’t always say you should focus on this metric and use that as a quality proxy. Maybe there are some things that work better for your site. Maybe there are other things that don’t work as well.

Best Host News Commentary – This is something that we struggle with ourselves. With all the Google algorithms and recent changes to those, it is getting more and more difficult to tie in what you want to do with what Google wants. Just ask Seroundtable, which was discussed at length in the last hangout. Quite often we will write an article with plenty of detail, references and links, just to find we have used a word too many times (over-optimization), and you then find yourself not writing naturally but trying to change the article to please Google.

Question 11: 30:44 – Do mobile-friendly websites get more search traffic and engagement metrics?

Written Submitted Question: I’m preparing a case study about mobile-friendly sites. I’m wondering if you’re seeing that mobile-friendly websites are getting more search traffic and engagement metrics compared to non-mobile-friendly sites.

JOHN MUELLER: I don’t know so much about engagement traffic. That’s really hard to say. With regards to more traffic, I recently saw a blog post about someone saying as soon as they kind of updated their site to be mobile friendly and got the mobile-friendly label, the clickthrough rate for their site in search rose significantly. So that’s something that might be playing a role there. I know there are lots of studies out there in general about how users react to mobile-friendly sites. So that’s kind of what I’d be looking for there. For example, the Boston Consulting Group put out a study, I think in December, regarding mobile sites, mobile internet users in general, about how they interact with mobile sites, how they use their mobile internet. And that can probably work as a basis as well and kind of help you to make a decision around making mobile-friendly sites.

Best Host News Commentary – We kind of touched on this topic earlier.

Question 12: 32:03 – Does Google use clickthrough rate as a ranking factor?

Written Submitted Question: Does Google use clickthrough rate as a ranking factor?

JOHN MUELLER: We talked about that briefly before.

Question 13: 32:09 – Does Google Use Country Specific Algorithms like Panda?

Written Submitted Question: Different versions of Panda in small countries like Chile– […] But I’ve seen cases of search results where you simply do not understand how the latest versions of Panda apply.

JOHN MUELLER: In general, we try to make our algorithms work as globally as possible. So I don’t think we have any country-specific kind of algorithms out there. But of course, within the country-specific search results, you’ll see different changes depending on the type of content that we have in those countries. So it’s not that the algorithms are country specific, but the search results are country specific, and the algorithms are kind of interacting in a general way with those search results. So you’re bound to see different changes in different countries depending on how these algorithms pick up that content.

For example, if you’re looking at a country where most of the sites are pretty low quality and a handful of sites are really high quality, then, of course, it might happen that a change like Panda that kind of looks for higher-quality sites has a more visible change, because that’s just what we have to work with. Like these sites are there. There’s a bigger threshold between the higher-quality and the lower-quality sites, so that might be more visible there. But it’s not the case that the Panda algorithm is country-specific. It’s just the websites that are out there that we have to work with are specific to those countries and might have specific aspects that are kind of unique or special in those countries.

Best Host News Commentary – John Mueller makes a great point, but variations could also be due to how some of the algorithms are rolled out.  Some, like Penguin, are rolled out over several weeks, so this can create a lot of flux between countries.

Question 14: 33:42 – Is using display:none to hide content on responsive sites OK?

Written Submitted Question: For mobile-friendly RWD sites, lots of content must be hidden on mobile, mostly navigation.  Some images are hidden or displayed depending on resolution. I’m worried that doing this might violate the rule against hidden text. Is using display:none in CSS for responsive sites OK?

JOHN MUELLER: That’s definitely OK. That’s not something where we’d say this is a problem. Essentially what we want is that the user sees an equivalent page on their mobile phone. And equivalent might mean that the images are slightly different. It might mean that the sidebar is gone. It might be that the navigation is a little bit different, as long as the equivalent primary content is the same. So if you have a page about– I don’t know– Ford car insurance, and on mobile it’s still about Ford car insurance, then that’s perfect. On the other hand, if on mobile you look at it and it’s about Ford– I don’t know– car wheels or something else, something different, then that, of course, would be problematic from our point of view. But if the primary content is equivalent, even if the text isn’t one-to-one exactly the same, that’s fine.
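In practice, the hiding John describes is usually done with a CSS media query. A minimal sketch (the class names here are our own hypothetical examples, not from the question) might look like:

```css
/* Hide secondary elements on narrow viewports only;
   the primary content stays identical across breakpoints. */
@media (max-width: 600px) {
  .sidebar-nav,
  .hero-image-large {
    display: none;
  }
}
```

The point from the answer above is that display:none here trims secondary elements for small screens rather than serving different primary content, so it is not the kind of hidden text that violates the guidelines.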

Question 15: 35:07 – How long does calculating rankings for a new subpage take after indexing?

Written Submitted Question: How long does calculating rankings for a new subpage take after indexing? I mean possible rankings for relevant keywords.

JOHN MUELLER: It’s really hard to say. So the first step is crawling and indexing the pages. That’s something that can happen in a matter of minutes to a matter of months depending on the website and how quickly and deeply we crawl the website. So that’s, I think, the first step. You’re looking at a time frame from minutes to a couple of months, which is already pretty big. And with regards to ranking that page, that’s something that, again, can happen within a matter of minutes, or it could take a couple of months to understand how that page fits in with the rest of the web. So there is no absolute answer here where I’d say after five days, you will see exactly this. It really depends on the website. It depends on the page, on the information that we have about the page, and other signals that we’ve collected there.

Best Host News Commentary – Here are some tips to getting your site crawled and indexed in Google Faster:

  • Create a Sitemap – The easiest way to do this, if you use WordPress, is to simply install the Yoast SEO plugin.
  • Submit Sitemap to Webmaster Tools – You can find full instructions on how to do this here.
  • Use the Fetch tool in Webmaster Tools – In Webmaster Tools, under the search menu, there will be an option to “Fetch as Google”.  Enter your URL here, and once retrieved, submit that URL to the index.
  • Post your content to all your social channels

Of course, there are other Black Hat ways to get your content indexed, but these are not a good idea from a long-term SEO strategy point of view, so we will not be discussing them.
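If you are not on WordPress, the sitemap itself is just a small XML file; as a rough sketch (the URLs are placeholders and `build_sitemap` is our own hypothetical helper, not a standard tool), generating one might look like this:

```python
# Minimal sitemap generator sketch. The URLs are placeholders;
# swap in your own pages before submitting to Webmaster Tools.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    # One <url><loc>...</loc></url> entry per page.
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )

if __name__ == "__main__":
    print(build_sitemap(["http://example.com/", "http://example.com/about"]))
```

Save the output as sitemap.xml at your site root, then submit it in Webmaster Tools as described above.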

Question 16: 36:10 – If you have an SEO agency, would you offer link building as a service?

Written Submitted Question: If you have an SEO agency, would you offer link building as a service?

JOHN MUELLER: So I don’t know. On the one hand, I don’t have an SEO agency, so I can’t really give you an absolute answer there. I think this is something that probably depends a little bit on how you define things like link building. There are a lot of things you can do to kind of promote your brand, to promote the content that you have that essentially might look similar to link building. So I don’t really have any yes or no answer there where I’d say, yes, I’d do exactly this. I chatted about this with the other people here in the office briefly. And it was an interesting discussion, but it’s not the case that I’d say, yes, I would have a link building team that would go out and email everyone to try to get links to my clients’ websites. Or no, I would never talk to any other website at all. I think there’s some middle ground there where finding the right balance makes sense.

Best Host News Commentary – This must have been difficult to answer, especially after Matt Cutts (head of Google’s spam team, but currently on a break) declared war on Guest Blogging.  This answer seems to imply that the war on Guest Blogging is not black and white, but more differing shades of grey.

Question 17: 37:26 – Did you learn anything about the fate of Search Engine Roundtable?

Written Submitted Question: Did you learn anything about the fate of Search Engine Roundtable?

JOHN MUELLER: I don’t have anything specific to share about that at the moment, very sorry.

Question 18: 37:39 – Why don’t some subpages show in rankings for unique content?

Written Submitted Question: Why don’t some subpages show in rankings for unique content?

JOHN MUELLER: I’d have to look at some examples there to see what exactly you’re looking at. But there are definitely reasons why maybe we wouldn’t show a specific page for a specific set of queries, where we’d say, well, maybe we look at this page and it looks like a keyword stuffing page or it looks like an artificial spammy page. Maybe we don’t trust the site in general. There are some aspects that might come into play there. But I’d really need to look at the specific query and the specific URLs that you’re looking at there.

Question 19: 38:28 – Bilingual website redirect question regarding HTTPS and Country specific redirect

Written Submitted Question: I have a bilingual targeted website, Greek and English. We’ll be implementing HTTPS. Is this setup correct? […]  Everything will be redirected to .gr and then 302 redirected to .com.

JOHN MUELLER: So in general, there are two aspects I would look at there. I’d have to look at the site specifically to see what exactly is happening there. But in general, there are two things you should aim for. On the one hand, keep the number of redirects that are chained to an absolute minimum. So instead of setting up this complicated chain of redirects where you say, well, this version redirects to this, and that redirects to this and redirects to this, try to redirect directly to your target page as much as possible. When it comes to creating bilingual content, make sure that each of these language versions is crawlable separately, so that we can crawl the English version, we can crawl the Greek version, and we can index those versions separately. If you can set up hreflang between those versions, if you have equivalent content, then that’s even better for us. If you have one version that tries to recognize the user’s language, settings, or location and redirect them automatically, then I would set that up as the default in the hreflang set. So then you would have your default version, then you would have your Greek and your English version, and we’d be able to crawl and index those versions separately. So that’s essentially what I would aim for. On the one hand, make sure that the redirect count is at a minimum. On the other hand, make sure that you have these separate versions that are crawlable and indexable separately, and that you have hreflang set up appropriately between those versions.

Best Host News Commentary – We have never really had to look into this ourselves so our knowledge of the Hreflang tag is limited.  However, you may want to check out this great guide on it by Moz, as well as the Google Help article here, and the video below:
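For reference, the setup John describes (a Greek version, an English version, and an auto-redirecting default) is typically annotated with link elements in each page’s head. A sketch with placeholder domains:

```html
<!-- On every version of the page; domains here are placeholders -->
<link rel="alternate" hreflang="el" href="https://www.example.gr/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<!-- The version that auto-detects language/location becomes x-default -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each version lists the full set, including a self-reference, so the annotations point at each other consistently.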

Question 20: 40:22 – If you’re loading content by AJAX, should the AJAX response have a no index HTTP header or a canonical header pointing to the full version of the page?

Written Submitted Question: If you’re loading content by AJAX, should the AJAX response have a no index HTTP header or a canonical header pointing to the full version of the page?

JOHN MUELLER: No, I would not do that. We’re generally not going to be indexing AJAX responses directly, so the noindex header wouldn’t really make sense there. And what we’ve sometimes seen is that people break the AJAX response by trying to inject HTML meta tags into it as well. So if you’re not paying attention completely, you can kind of break things for your site, or for indexing in general. So that’s something I’d try to focus on there. Make sure that you’re serving your content directly.  The noindex tag won’t have any effect there, so I wouldn’t even include it. Because any time you’re adding additional complexity, you’re adding additional room for things to break.

Question 21: 41:18 –  Can you ignore the really basic SEO questions in these Hangouts which have been answered in the past?

Written Submitted Question: Can you ignore the really basic SEO questions in these Hangouts which have been answered in the past? Even the ones that have been voted up– these Hangouts should be for really difficult and technical questions only.

JOHN MUELLER: This is good feedback to have, but I think because people are still asking a lot of these normal questions, it’s also important to get these answers out there. So I’m not going to try to censor these questions in any specific way, but rather take the things that are voted up and make sure that we have answers for them. If there are specific technical issues that you really need to have answered, put them in there early so that people can vote on them early. And usually, if they’re in there earlier, they can collect votes a little bit over time. So that’s essentially what I would do there.

Best Host News Commentary – Hopefully we will continue to provide a summary of the Google Webmaster Central Hangouts, so as to create a resource of all the questions asked over time.

Question 22: 42:11 –  Question about AngularJS and Google Crawling Ajax Javascript

Written Submitted Question: We’re running a single page app on AngularJS for property listings. Besides not being able to serve escaped fragment versions fast enough, we also have dynamic URLs for the results. Where should we put our efforts first?

JOHN MUELLER: I think if you’re setting up your site to run with a JavaScript framework like Angular or Ember, or something like that, I’d skip the step at the moment of creating the AJAX crawling site. That’s something that has helped us a lot in the past to understand this data better. But more and more, we’re able to crawl and index these pages even if they’re using JavaScript to pull in the content. So make sure that your JavaScript versions have unique URLs, so that we can crawl and index those URLs separately. And past that, make sure that we can actually crawl all of the content, the JavaScript files, the AJAX responses, all of that. So that’s something you can test with the new Fetch as Google feature, the render view, to see what Googlebot would be able to see from your site.

Question 23: 43:23 – Is Disavowing links to remove Penguin Penalty Enough?

MIHAI APERGHIS: Hey John, speaking of JavaScript frameworks, I have a question about Penguin. […]  I don’t know if you know […] Josh Bachynski had a Whiteboard Friday, and it stirred quite a bit of controversy. He was basically saying that disavowing links for removing the Penguin penalty isn’t enough.  You cannot simply get out of a Penguin penalty just by disavowing links. You have to at least take other steps, like actually removing the links themselves. Otherwise, just disavowing them won’t be enough. Is this something that you can shed some light on?

JOHN MUELLER: It’s kind of unrelated to JavaScript, but I’ll take a shot at it. So for manual actions, we definitely expect you to also remove links. So if there’s a manual action involved for unnatural links that was taken on a site, we definitely expect you to do that. With regards to algorithmic changes where we’re picking up on problematic links, technically you don’t need to do that. You can just use the disavow tool. Practically, I think it always makes sense to make sure that you’re covering those problems as broadly as possible, that you’re cleaning up these issues that you, or maybe your SEO, or a previous SEO for the site created. So cleaning it up, I think, always makes sense. From a technical point of view, for the algorithms, the disavow file is essentially sufficient in those cases.

Question 24: 45:11 – Overcome Penguin penalty just by building new high quality links?

[continuing on from previous question]

BARUCH LABUNSKI: But you also said that– so if you’ve never had a manual action– […] But you said that if you get better links, then it will kind of overwrite the Penguin, if you were affected by Penguin, overwrite the–

JOHN MUELLER: Yeah, I don’t know. That’s been pulled up a little bit recently.

BARUCH LABUNSKI: I said, hey, no need to disavow.

JOHN MUELLER: No. So from my point of view, if you’re aware of any problems that you were creating with regards to these links, you should always clean that up. It’s like if you have– I don’t know– typos on your website, then you know you have typos on your website. It’s just like saying, well, people accept that there are typos on my website. It’s fine. That’s something you want to clean up. It’s a problem. You’re aware of it. Cleaning it up always makes sense, especially in cases where you suspect an algorithm may have picked up on this and kind of took action based on that. Cleaning that up always makes sense. The disavow tool is a technical tool. It essentially takes those links out of the calculations for our algorithms. So it’s not like an I’m-guilty sign that you hold up, where anyone who looks at your site and sees you have a disavow file says, oh, you’re an SEO, and you were buying links, and you’re going to be penalized for having done that, even if you’ve cleaned that up now. The disavow tool is essentially a technical tool that takes these out of calculation. It’s not something you should be ashamed of using. It’s not something where you should say, I don’t want to use this because it does something crazy for my website. It’s essentially– it’s kind of like a meta tag you put on your pages. The noindex meta tag says, OK, don’t index those pages. The disavow tool says, don’t take these links into account.
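For comparison, the meta tag John is referring to is the standard one-line noindex directive:

```html
<meta name="robots" content="noindex">
```

In the same spirit, the disavow file is just a mechanical instruction to ignore certain links, not an admission of guilt.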


JOHN MUELLER: Kind of something like that, yeah. It’s a technical measure. It’s a tool you can use. It’s not something you need to be ashamed of.

[continued explanation summarizing what John Mueller said a minute ago]

MIHAI APERGHIS: John, so you said the disavow file is sufficient for removing a penalty. Would you at least say that actually removing the links, or 404ing the pages that they lead to, would be read faster by the algorithms than the disavow file? Or is it all the same?

JOHN MUELLER: It’s essentially equivalent. So I think one aspect to mention is that, from our point of view, Penguin isn’t a penalty. It’s essentially a search quality algorithm. I know some people call it a penalty because they see it as taking the same kind of steps. But from our point of view, a penalty is really something that was done manually by a web spam team member. And the algorithms– the search quality algorithms around web spam issues, like the Penguin algorithm– are essentially things that just run automatically.  It’s not that someone has manually looked at your site and penalized your site for that. So these algorithms look into the links that are still in our link graph for your website. And if you’ve taken them out with the disavow file, if you’ve removed those links from the other website, if you’ve removed the pages on your side, all of those things are essentially equivalent, in that those links drop out of our algorithm. We don’t take them into account anymore. And the next time the algorithm runs, it can focus on the rest of your site.

Question 25: 50:17 – Does Webmaster Tools show all Backlinks that you take into account for Penguin Penalty?

JOSHUA BERG: So John, you’ve mentioned that the main links that you need to be concerned about for the disavow and sorting are the ones in Webmaster Tools, and that it would be kind of scraping the bottom of the barrel to go scraping for a lot of other links on the internet.  And so if that was the case, then an assumption might be made that Penguin only uses the links that are in Webmaster Tools. So would that be correct?

JOHN MUELLER: No. We show a sample of the links in Webmaster Tools, but Penguin and our other algorithms do try to take into account the full picture. So usually what you’ll see with the sampling in Webmaster Tools– for example, if you have a site-wide link on one website, you’ll see maybe one or two or three of those pages listed. But actually, we know that this is on all of those pages on that website. So just disavowing those individual URLs is not really going to solve that problem. So that’s a case where, for example, using the domain directive in the disavow file kind of helps you to sidestep that problem. Even if only one of these URLs is listed in Webmaster Tools, with the domain directive you can make sure that anything else on that domain is disavowed. So that kind of helps there. Also, with regards to the type of links there, if you look into Webmaster Tools and you see a bunch of forum links, then you can think about whether maybe a previous SEO ran some kind of script to drop the link on hundreds or thousands of forums out there. And maybe it’s worthwhile taking that sample and seeing if it matches a bigger-picture view that you have from somewhere else. So from that point of view, I think we try to show the general picture in Webmaster Tools and the links there. But I wouldn’t say that these are 100% just the links that you need to rely on if you have a problem with regards to unnatural links to your site.

Best Host News Commentary – We have found that the best way to find all links pointing to your site is to use a service such as CognitiveSEO.  It pulls in data from all the best sources, such as Moz, Majestic and Ahrefs, so it provides more of a complete picture.
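Once you have the full list, the disavow file itself is plain text with one entry per line. A sketch with made-up domains (the domain: directive John mentions above covers every page on that site):

```text
# Example disavow.txt – the domains below are placeholders
# Site-wide spam: covers every URL on the domain
domain:spammy-forum.example.com
# A single unnatural link
http://some-directory.example.net/listing/page123.html
```

Upload this through the disavow links tool in Webmaster Tools; lines starting with # are treated as comments.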

Question 26: 52:40 – How long does it take to process the disavow file?

MALE SPEAKER 6: Just a very quick question with regards to the disavow file. What sort of time does it take to process? So after you upload the disavow, is it instantaneous, or is it something that runs on a crawl that happens every week or two? I’m referring slightly back to the blog post that was mentioned earlier, with regards to seeing recoveries within a couple of days, which I don’t see as physically possible, but you know.

JOHN MUELLER: So the disavow file is processed immediately. But that’s essentially just the contents of the file that are processed immediately. And you’ll see the status update in Webmaster Tools. What actually happens with those links is only reprocessed when we reprocess those URLs. So when we recrawl those pages that had the links and we see that there’s a disavow file there, we’ll drop that link out. So that’s something that can take a couple of months. It can take half a year, maybe even longer, for all of that to be reprocessed completely. So it kind of depends on how quickly and how often we crawl those individual pages. If there’s an individual link on a homepage somewhere, we can probably recrawl and reprocess that within a couple of days. If there’s a site-wide problem, if there are links on a lot of obscure forum pages, then that’s going to take half a year or longer to actually recrawl all of that. So that’s something where I wouldn’t expect to see changes within a couple of days. And even after having recrawled and reprocessed those URLs, the algorithm data has to be updated as well. So it’ll still take a while for that data to actually be updated and become visible in search. So seeing changes within a couple of days seems a little bit unrealistic. That doesn’t mean that there weren’t maybe other things that were changed and happened to be processed during that time, so that you suddenly see a jump because of that. But that seems more like a coincidence– these things happened to be timed fairly close together– than a real relationship between updating the disavow file and the site jumping up in rankings a couple of days later.

BARUCH LABUNSKI: So why not just put more in brackets? Because you get an email after, saying, hey, you’ve updated the file. It’s been updated. OK, most webmasters do understand that. So what about putting in brackets something like, only the file got updated, you know? Like so they won’t– I don’t know, because there’s all sorts of webmasters.

JOHN MUELLER: I haven’t looked at that message for a while. I– yeah, seems like something we should double check. We’re going through all of these messages that we’re sending out in Webmaster Tools anyway to kind of update them to a new layout. So I’m sure we’ll get to that message as well and see if we need to make any text changes there, too.

Best Host News Commentary – This is quite interesting, in that it splits the process into two parts: recrawling the links in the disavow file, and then, once that is done, waiting for an algorithm refresh.  There is a great article here on best practices for using the disavow tool.

Question 27: 55:32 – Does the Penguin algorithm penalize over-optimization?

MALE SPEAKER: John, my question is essentially the same– whether the Penguin penalty, the Penguin algorithm, penalizes over-optimization. Does it?

JOHN MUELLER: Penguin is a web spam algorithm. So it looks for issues that are kind of related to web spam. And I don’t know. I mean, depending on how you define optimization, that could apply to that. If you’re defining optimization, for example, as creating unnatural links on various directories that are just out there to kind of pass links on, then you could call that over-optimization. But in general, it looks for web spam issues. And if you over-optimize your site in the sense that you’re over-optimizing for user experience, then there’s nothing wrong with that. There’s nothing web-spammy about that. So just because you’re optimizing doesn’t mean that you’re creating web spam.

Question 28: 56:37 – What would you consider to be best practice for using a Slider?

JOSHUA BERG: What would you consider to be best practice for using a slider? I don’t prefer using sliders on the home page of a site. But in certain industries– and we talked about it a few months ago– they’re preferred because they provide big images that are changing. So if we have clients that are quite certain they definitely want to use a slider on the home page, what would you consider some best practices? You mentioned not burying things that are important on the third or later slides. Anything else on that?

JOHN MUELLER: So essentially what I’d look for there is that the page is still clearly usable, even if there’s a part of the content that’s kind of missing or hidden. So usually with a slider, you’ll have some slides visible, some that are kind of hidden because you have to kind of flip through to them to get that content. And if there’s critical information that’s not visible directly when someone views that page, that might be a problem for algorithms, because we might not pick that up. But if the slider just provides additional information, for example, you have a real estate homepage and there’s lots of information on that page about the type of work that you do, and the slider just shows different samples of links there, then that’s something that’s perfectly fine. That’s not critical information for those pages. It’s just additional information that helps people to find the right content within your website.

JOSHUA BERG: So do you think the above-the-fold algorithms, like a layout-type algorithm, is going to take much of a hit there? Or will that only get tripped if certain other low-quality indicators might be there as well?

JOHN MUELLER: Usually that’s more of a problem if you have things like random ads above the fold, no actual content. When you’re looking at a normal website and the slider contains samples of the content from the rest of the site, that’s not an ad. That’s like a part of your content, and that’s not something where I’d say is a problem from that point of view.

Best Host News Commentary – This kind of brings us back to the discussion we had at the last hangout regarding hidden content.  Essentially, any content on the slides that is hidden won’t be taken into account by the search engine, and therefore you should ensure that the information on those slides (apart from the first one) is not important, or is at least repeated elsewhere on the page.

Question 29: 59:09 – Can you get the “search related to” results removed at bottom of search queries?

BARUCH LABUNSKI: It’s the best, I think, question for 2015 that, well, since this Hangout started, nobody’s asked this. OK, so on the bottom of the search results, it says “search related to.” If you’re a brand name, if you’re a brand name and somebody keeps on typing example whatever 1, 2, 3, and keeps on typing it over and over and over and over again, and then uses like complaints or whatever and so on and so on, that ends up in the search results. I mean, how does a brand name remove that? Because a lot of people are having difficulties with this, and no SEO can remove such a thing. Do we just– what? Does the brand take a lawyer and send you guys– I don’t know. Like how do you remove that?

JOHN MUELLER: That’s essentially completely algorithmic. So that’s not something where there’s a magic button that we can push or that a webmaster can push and say I don’t want to have this found there. Essentially, over time, as we see how people search, as we recognize problematic behavior as well, that’s stuff that these algorithms will take into account.

BARUCH LABUNSKI: But John, it can affect the brand like overall, right?

JOHN MUELLER: It can. And sometimes I think there are good reasons for that. I mean, if you’re looking at– I don’t know– something really problematic that a brand did, for example, and people are searching for that, then I think that’s the right type of information that we should be giving people. So it’s not something where I would artificially filter the type of content that we show there, but rather try to expose what we think is relevant there. And sometimes our algorithms get it right. Sometimes we get it wrong. And if we get it wrong in those cases, I think that kind of feedback is always useful to have. It’s not the case that we have any kind of direct feedback loop where we say, oh, if you put this meta tag on your pages, then we won’t show this specific related query, or if you upload them to this part of Webmaster Tools, it won’t show up like that. It’s really essentially algorithmic. And we try to improve our algorithms with the feedback. But we can’t guarantee that we can manually take action on these things.

BARUCH LABUNSKI: But it does change, yeah? Like because it’s been already like five years. It’s the same thing.

JOHN MUELLER: Of course, yeah. I mean, it changes over time, just like the algorithms change over time.


JOHN MUELLER: So that’s something. If we see things happen or change with user behavior, we’ll take that into account.

JOHN MUELLER: All right. Thank you everyone for your time and your questions. I wish you guys a good start into the new year. And maybe we’ll see each other in one of the future Hangouts.

Got something to add?  Please let us have your opinions in the comments.

You can reference by Question no. as appropriate for quick reference.  If you have helpful links feel free to include, but they will go directly to moderation first, so don’t worry if your comment doesn’t appear straight away.
