Summary of English Google Webmaster Central office-hours hangout

This is an AI-generated summary. There may be inaccuracies.

00:00:00 - 01:00:00

During the English Google Webmaster Central office-hours hangout, John Mueller addressed a range of webmaster concerns. He discussed whether disavowing links is still worthwhile after the next Penguin algorithm update, advised website owners on handling 404 pages and duplicate content, and explained that simply moving to an older domain will not necessarily improve SEO. He also addressed negative-SEO attempts in which competitors hire people to visit a site and fake a bad user experience in order to damage its rankings. In addition, he covered improvements being made to Webmaster Tools, how nofollow links are treated, using reviews for rich snippets, and optimizing images for SEO.

  • 00:00:00 In this section, a caller asks John Mueller about thin content on their website, which was hit by the latest Panda update. Mueller offers to take a quick look at the site to find the areas the algorithm considers thin content. The caller also suggests that Google add a feature to the Webmaster Tools interface that lets users mark which 404s they do not want to see on their list, which Mueller thinks is a good idea. Another user asks about HubPages and its relationship with Panda.
  • 00:05:00 In this section, a user asks whether having a lot of new content from Squidoo appear in a short period can negatively impact their site's ranking. John Mueller confirms that new content by itself wouldn't be an issue and that Google's systems should be able to handle it. He suggests focusing on maintaining high-quality content and finding ways to recognize low-quality content so it can be noindexed if necessary (a sketch of one way to do this appears after this list). Another user asks if subdomains are treated as separate sites in Google's ranking algorithm; Mueller says they are considered separate if they are essentially independent websites. Finally, a user whose site was pharma-hacked asks if a clean page cache automatically means the site is entirely clean. Mueller suggests fetching the pages with Google's tools to see what Googlebot actually receives, and warns that hacked pages can cloak, showing clean content to visitors but spam to crawlers. He also explains that the snippet is sometimes slow to update, but that this shouldn't last longer than a week.
  • 00:10:00 In this section, John Mueller of Google Webmaster Central explains that it is normal for pages that used to have hacked content to still rank for that content on a site query despite being cleaned up. He also clarifies that 404 pages are ordered in importance in Webmaster Tools, with the top ones being considered more important than the bottom ones, based on factors like whether the URL is in the site map and if it had search traffic before. Mueller also mentions that there is no specific release date yet for the Penguin algorithm update, and that it is difficult to estimate when it will be ready as it involves a lot of testing and data review by Google.
  • 00:15:00 In this section, John Mueller discusses the impact of disavowing links after the next Penguin algorithm update. Even if the Penguin algorithm has already run by the time a webmaster disavows bad links, disavowing is still worthwhile: changes are always happening in search, and the data can be picked up by a future update (the disavow file format is sketched after this list). However, Mueller cautions that cleaning up a site does not guarantee a full recovery to its previous state, since there are many factors involved; for example, the site may have ranked unnaturally high because of the bad links that have since been removed.
  • 00:20:00 In this section, John Mueller explains that Google takes action on problematic links, such as those from junk sites selling "PR6" links, without needing to update toolbar PageRank, which hasn't been refreshed in more than a year; Google has many ways of recognizing such sites and can block PageRank from passing through them anyway. To make it easier to report web spam and to act on reports faster, Google is working internally on infrastructure that recognizes and prioritizes high-quality spam reports. Using nofollow, or a redirect blocked by robots.txt, is an acceptable way to keep affiliate links from passing PageRank (both techniques are sketched after this list). Google also recognizes common affiliate systems such as Amazon's by their domains and treats those links appropriately, with no intention of penalizing affiliates.
  • 00:25:00 In this section, John Mueller advises website owners on how to handle 404 pages and duplicate content. For 404 errors, owners should check the top 404 pages in Webmaster Tools and work out whether they are all random URLs or whether important URLs are accidentally returning a 404 (a small status-check script is sketched after this list). Returning a 404 for pages that were intentionally removed, or for URLs that never existed, is fine; there is no need to redirect or mask them. On duplicate content, Google will choose one URL to show in search results and won't penalize either site. However, webmasters should be careful about creating multiple websites for every location: a handful of location sites is fine, but generating many near-identical pages for spammy purposes isn't. Finally, Mueller recommends choosing one domain and redirecting the others, while acknowledging that moving domains can cause fluctuations, lost signals, and a temporary drop in rankings.
  • 00:30:00 In this section, John Mueller explains that simply moving to an older domain will not necessarily improve SEO. The algorithms recognize that a site with a long history has accumulated many signals over the years, but those signals cannot simply be transferred to a random new site, as that would not be fair to other sites. When a participant asks about a scraper site copying their content, Mueller explains that Google's systems can generally work out which site the content originally came from, and that there is no need to disavow links from the scraper because the scraped site has done nothing wrong.
  • 00:35:00 In this section, John Mueller addresses a viewer's concern about competitors hiring people to visit a website and falsely appear to have had a bad user experience with it, in a bid to damage its rankings. Mueller confirms that this kind of activity has been happening for a long time and says he isn't worried about it from Google's point of view. He suggests that the best defense is simply to have a really good site with lots of visitors who love it. The section also covers how Google determines which page to rank for a search term, and how technical issues or an unclear internal linking structure can cause ranking problems.
  • 00:40:00 In this section, John Mueller explains that there is no direct mapping of keywords to landing pages in Google's search engine. In some cases Google will show keywords and their respective landing pages in Webmaster Tools, but not always. If the landing pages Google shows differ from the ones a company believes match its keywords, the business can work on the site's technical foundation and quality: making sure internal links work properly and that content is set up correctly within the website.
  • 00:45:00 In this section, John Mueller discusses improvements to Webmaster Tools, how nofollow links are treated, using reviews for rich snippets, and optimizing images for SEO. He states that nofollow is primarily a technical tool that Google respects as much as possible, while reserving the right to act on abusive cases. Images can be optimized for SEO by using a keyword in the file name and useful alt text, and by having a specific landing page per image (a small audit script for these signals is sketched after this list). Finally, if a website is affected by Panda, cleaning up and fixing everything is the next step; nothing technical is needed beyond the cleanup.
  • 00:50:00 In this section, John Mueller advises webmasters that for site-wide issues such as Panda, altering the crawling of individual pages is not going to make a significant difference. Instead, it's essential to make sure the overall quality of the site is high by analyzing what users are actually trying to do and ensuring their expectations are met. This is not a technical issue; it takes careful analysis, and building a high-quality website takes time and effort rather than being a simple matter of changing content. On the question of rich snippet guidelines, Mueller says he would have to check them before commenting. Regarding Panda updates, he emphasizes that they will be rolled out more frequently than before.
  • 00:55:00 In this section, John Mueller explains that webmasters may not see the full count of indexed pages in Webmaster Tools because the count looks at the exact URLs in the sitemap file, and indexed pages sometimes have slightly different URLs that aren't counted. One tip is to break the sitemap file into smaller ones, which makes it easier to find which sections aren't being indexed correctly (a sketch of this appears after this list). It's important to use unique reviews on your own website for rich snippets rather than scraping them from other sites. As for reviews collected by email, it's difficult to map them onto star ratings, and selectively choosing which ones to display may not be an appropriate use of rich snippet markup.
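
The bullets above reference a few illustrative sketches. None of this is code from the hangout; these are hedged examples of the techniques mentioned. First, for the 00:05:00 advice about noindexing recognizably low-quality content: a minimal Python sketch assuming a crude word-count heuristic. The 300-word threshold and the helper name are invented for illustration, not anything Mueller specified.

```python
# Hypothetical sketch: flag thin pages and emit a robots meta tag for them.
# The 300-word threshold is an assumed quality heuristic, not a Google rule.

MIN_WORDS = 300  # assumption: tune for your own content

def robots_meta_for(page_text: str) -> str:
    """Return a robots meta tag: noindex for thin pages, default otherwise."""
    word_count = len(page_text.split())
    if word_count < MIN_WORDS:
        # Keep thin pages out of the index without removing them from the site.
        return '<meta name="robots" content="noindex, follow">'
    # "index, follow" is the default; returned here only for clarity.
    return '<meta name="robots" content="index, follow">'

if __name__ == "__main__":
    print(robots_meta_for("Just a stub paragraph."))  # -> noindex, follow
```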
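
For the 00:15:00 discussion of disavowing bad links: a minimal sketch that writes a file in the format Google's disavow tool accepts (one URL or `domain:` entry per line, `#` for comments). The example domains and URLs are invented.

```python
# Hypothetical sketch: build a disavow file from a link audit's results.
# All domains and URLs below are made up.

bad_domains = ["spammy-links.example", "paid-pr6.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

def build_disavow(domains, urls) -> str:
    lines = ["# Disavow file generated after a link audit"]
    lines += [f"domain:{d}" for d in sorted(domains)]  # disavow whole domains
    lines += sorted(urls)                              # disavow single URLs
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    with open("disavow.txt", "w", encoding="utf-8") as fh:
        fh.write(build_disavow(bad_domains, bad_urls))
```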
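
For the 00:20:00 point that nofollow or a robots.txt-blocked redirect keeps affiliate links from passing PageRank: a hypothetical sketch showing both techniques together. The `/out/` redirect path and its query parameter are assumptions for illustration; any path your server redirects from would do.

```python
# Hypothetical sketch: render affiliate links so they do not pass PageRank,
# via rel="nofollow" plus a local redirect path that robots.txt disallows.

from urllib.parse import quote

# Rule to place in robots.txt so crawlers never follow the redirect path.
ROBOTS_TXT_RULE = "User-agent: *\nDisallow: /out/\n"

def affiliate_anchor(target_url: str, text: str) -> str:
    """Render an affiliate link routed through the blocked /out/ path."""
    # /out/?to=... would be a server-side redirect (an assumption here);
    # rel="nofollow" expresses the same intent directly on the anchor.
    return (f'<a href="/out/?to={quote(target_url, safe="")}" '
            f'rel="nofollow">{text}</a>')

if __name__ == "__main__":
    print(ROBOTS_TXT_RULE)
    print(affiliate_anchor(
        "https://www.amazon.com/dp/B000EXAMPLE?tag=mysite-20", "Buy the book"))
```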
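
For the 00:25:00 advice on triaging 404s: a small sketch that checks the HTTP status of a list of important URLs so accidental 404s stand out from intentional ones. The URL list is invented; a real audit would start from your sitemap and the top 404s reported in Webmaster Tools.

```python
# Hypothetical sketch: flag important URLs that unexpectedly return 404.

import urllib.error
import urllib.request

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/widget",
]

def status_of(url: str) -> int:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 404, 410, 500, ...
    except urllib.error.URLError:
        return -1        # network problem, not an HTTP status

if __name__ == "__main__":
    for url in important_urls:
        code = status_of(url)
        if code == 404:
            print(f"NEEDS ATTENTION (unexpected 404): {url}")
        else:
            print(f"{code}: {url}")
```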
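
For the 00:45:00 notes on image SEO: a sketch that scans HTML for images missing alt text or carrying non-descriptive file names, the two on-page signals mentioned. The sample markup and the "generic name" pattern are assumptions.

```python
# Hypothetical sketch: audit <img> tags for missing alt text and for
# camera-style generic file names (IMG_0042.jpg and the like).

import re
from html.parser import HTMLParser

GENERIC_NAME = re.compile(
    r"(img|image|dsc|photo)?[_-]?\d+\.(jpe?g|png|gif|webp)$", re.I)

class ImgAudit(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        if not attrs.get("alt"):
            print(f"missing/empty alt: {src}")
        if GENERIC_NAME.search(src):
            print(f"non-descriptive file name: {src}")

if __name__ == "__main__":
    sample = (
        '<img src="/photos/IMG_0042.jpg">'
        '<img src="/photos/blue-widget-side-view.jpg" alt="Blue widget, side view">'
    )
    ImgAudit().feed(sample)  # flags only the first image
```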
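
For the 00:55:00 tip about splitting a large sitemap: a sketch that writes several smaller sitemap files plus a sitemap index, so a gap in indexing can be localized to one file. The chunk size of 1,000 and the file names are arbitrary choices; the sitemap protocol itself allows up to 50,000 URLs per file.

```python
# Hypothetical sketch: split a URL list into small sitemap files plus an index.

CHUNK = 1000  # assumption: smaller chunks make indexing gaps easier to spot
BASE = "https://www.example.com"

def write_sitemaps(urls):
    chunks = [urls[i:i + CHUNK] for i in range(0, len(urls), CHUNK)]
    names = []
    for n, chunk in enumerate(chunks, start=1):
        name = f"sitemap-{n:03d}.xml"
        names.append(name)
        body = "".join(f"  <url><loc>{u}</loc></url>\n" for u in chunk)
        with open(name, "w", encoding="utf-8") as fh:
            fh.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                     '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                     f"{body}</urlset>\n")
    # The index file points Webmaster Tools at each smaller sitemap.
    with open("sitemap-index.xml", "w", encoding="utf-8") as fh:
        entries = "".join(
            f"  <sitemap><loc>{BASE}/{n}</loc></sitemap>\n" for n in names)
        fh.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                 '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                 f"{entries}</sitemapindex>\n")

if __name__ == "__main__":
    write_sitemaps([f"{BASE}/page-{i}" for i in range(2500)])  # 3 files + index
```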

01:00:00 - 01:00:00

During this English Google Webmaster Central office-hours hangout, John Mueller emphasizes the importance of cleaning up indexing errors, such as indexing the wrong content or having URLs that vary slightly, and suggests updating the sitemap file or internal links to point at the correct URLs. He notes that having a mobile-friendly version of a page matters more than having streamlined forms with autocomplete, and that a hard-to-use form is the website's loss rather than a search problem. He also mentions the upcoming Hangouts, which will focus on best practices for 2014 and on Google News publishers.

  • 01:00:00 In this section, John Mueller discusses the importance of cleaning up indexing errors, such as indexing the wrong thing or having URLs that differ slightly. He suggests updating the sitemap file or internal links on the website to reflect the correct version of each URL (a small normalization sketch follows). Regarding web forms, John mentions that having streamlined forms with autocomplete is not important for search quality algorithms; having a mobile-friendly version of the page matters more, and a difficult-to-use form is the website's loss, not a problem with search. John also mentions the upcoming Hangouts, focusing on general best practices for 2014 and on Google News publishers.
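
A minimal sketch of the URL cleanup described above: normalizing slightly different URL variants to one canonical form before they go into the sitemap or internal links. The chosen conventions (https, no "www", no trailing slash) are assumptions; pick whichever form your site actually serves.

```python
# Hypothetical sketch: collapse URL variants to one canonical form so the
# sitemap and internal links agree on a single URL per page.

from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]                   # assumed convention: non-www host
    path = parts.path.rstrip("/") or "/"  # assumed convention: no trailing slash
    return urlunsplit(("https", host, path, parts.query, ""))

if __name__ == "__main__":
    variants = [
        "http://www.example.com/products/",
        "https://example.com/products",
        "https://www.example.com/products",
    ]
    print({canonicalize(u) for u in variants})  # one canonical URL
```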
