The Google Sandbox refers to an alleged filter that prevents new websites from ranking in Google’s top results. Think of it like a probation period—even if you do everything right, your site won’t get decent rankings until it comes to an end. The Google Sandbox has never been officially confirmed by Google. But many SEOs are confident in its existence, as they see sandbox-like effects when trying to rank new websites. So what’s the truth? Does the Google Sandbox exist in 2018? If so, how do you prevent Google from “sandboxing” your site? To answer these questions, I reached out to a few SEO practitioners to learn what they think about the Sandbox in 2018, based on their experience with new websites.
Huge thanks to these guys:

Back in 2004, webmasters and SEO professionals noticed that their newly launched websites were not ranking well on Google for the first few months, despite their SEO efforts. New websites were duly indexed by Google, but they did not rank even for relatively low-competition keywords and phrases. Yet they ranked well on other search engines, such as Bing and Yahoo. This “sandbox” effect lasted from a few weeks to several months. Keeping in mind that Google wants to serve authoritative, high-quality content to its users and might not trust brand-new websites, that actually made sense. Rand Fishkin, for example, believed that SEOmoz had been sandboxed for nine months, despite its perfectly natural and strong backlink profile.
Holding back brand-new websites from ranking could give Google more time to evaluate their quality and fight spam. A week-old website suddenly breaking into the SERPs for a competitive keyword, beating pages that have ranked there for years, is suspicious, right? Besides, we all know that backlinks are one of the most important ranking factors, and it is highly unlikely that pages on a new website will accrue strong backlink profiles within a few weeks. In addition, many SEOs strongly believe that Google factors user behaviour into its ranking algorithms via click-through rates, dwell time, and other metrics that it can potentially collect and analyze.