In a recent Google Webmaster Central office-hours hangout, John Mueller offered advice to a site owner who discovered roughly 90 indexed subdomains that didn't actually exist, suggesting the removal of a wildcard DNS entry while noting that duplicate content resulting from this kind of issue "shouldn't cause problems." Mueller also discussed how Google identifies high-quality content through a variety of algorithm updates and techniques, including rewriting titles to improve search results, and responded to questions about the impact of frequently changing metadata and the slow recovery of previous rankings after a site migration to HTTPS. He further clarified that keyword stuffing can cause a drop in rankings but is not necessarily critical, and addressed how Google deals with email spam and negative sentiment surrounding links.
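The wildcard-DNS fix Mueller describes can be illustrated with a minimal sketch of a BIND-style zone file; the domain, IP address, and record layout here are hypothetical, not taken from the hangout:

```
; Hypothetical zone file for example.com
; The "*" wildcard record makes every possible subdomain resolve,
; so stray links to nonexistent hosts become crawlable, indexable pages.
www   IN  A  203.0.113.10
*     IN  A  203.0.113.10   ; removing this line stops nonexistent subdomains from resolving
```

Once the wildcard record is removed, requests for unintended subdomains fail to resolve, and the already-indexed pages drop out over time as Google recrawls them.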
During the "English Google Webmaster Central office-hours hangout," John Mueller from Google also provided advice for website owners: he suggested implementing CAPTCHAs to stop scripts from creating pages, using nofollow links, and disavowing domains that cause harm, and offered to check a submitted URL to confirm that it is clean. Mueller also said that a website built entirely with AJAX, canvas, and WebGL shouldn't be treated as a bad site, but rendering options need to be used, and having separate URLs is essential so that Google can crawl and index individual pages and send visitors to the right page. He explained that Google attempts to take the rel=canonical signal into account but cannot trust it when it is used in a negative manner. Lastly, Mueller described how disavow files can help legitimate sites that have high-quality content and good links but are hurt by spammy links, which ultimately affects the rest of the content on the website.
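The disavow files Mueller mentions follow Google's documented plain-text format: one entry per line, either a full URL or a `domain:` prefix, with `#` marking comments. The domain names below are hypothetical examples:

```
# Hypothetical disavow.txt uploaded via Google's disavow links tool
# Disavow every link from an entire domain:
domain:spammy-links.example
# Disavow a single linking page:
http://another-site.example/bad-page.html
```

The file is uploaded per-property through Search Console rather than placed on the site itself, and it acts as a strong suggestion that Google ignore those links when assessing the site.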