Is it still possible to trust keyword position tools?

Google is more and more versatile: if you search a keyword from one place and later search the same keyword from another place, you may get different results. Your position depends on which data center you are connected to. Sometimes datacenters show approximately the same results, sometimes there are huge differences! A good example is this SEO Challenge: look at Beanseohero's ranking, or my ranking, and you will see huge differences, especially with Patrick's rankings, which make us look better than we really are.

2 ways to know your real rankings on Google:

1 – Google Search Console

Obviously, Google gives you the most accurate data possible for your own website. There are only two problems:

  1. The data Google gives you is 3 days old
  2. Google keeps the data for 90 days, then it's lost
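One simple way around the 90-day limit is to archive your own exports regularly. Here is a minimal sketch in Python (the folder name, row layout, and the idea of daily CSV snapshots are my own assumptions, not a Search Console feature):

```python
import csv
import datetime
import pathlib

def archive_rows(rows, out_dir="gsc-archive"):
    """Write today's Search Console rows to a date-stamped CSV.

    rows: iterable of (query, clicks, impressions, position) tuples,
    e.g. parsed from a manual Search Console export. Keeping one file
    per day preserves history beyond Google's 90-day window.
    Returns the path of the file written.
    """
    path = pathlib.Path(out_dir)
    path.mkdir(exist_ok=True)
    out = path / f"{datetime.date.today().isoformat()}.csv"
    with out.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["query", "clicks", "impressions", "position"])
        writer.writerows(rows)
    return out
```

Run it once a day (cron, Task Scheduler, whatever you like) and your history survives past Google's retention window.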

2 – : your new best friend!

This website is a tool that lets you query multiple datacenters with a single request. It is a simple, free tool which helps you immediately find your position for a keyword across multiple Google data centers, using multiple dedicated proxies. In the end you get an average position and a detailed position per datacenter. A big gap between datacenters generally signals upcoming moves in the SERP.
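To make the aggregation step concrete, here is a small sketch: given the position found at each datacenter (however you collect them), it computes the average position and the gap between datacenters. The datacenter labels and input format are hypothetical, not the tool's actual API:

```python
from statistics import mean

def summarize_positions(positions):
    """Aggregate per-datacenter rankings for one keyword.

    positions: dict mapping a datacenter label to the ranking found
    there, or None when the keyword was not in the checked results.
    Returns (average position, max gap) over datacenters that returned
    a ranking, or (None, None) when nothing was found.
    """
    found = [p for p in positions.values() if p is not None]
    if not found:
        return None, None
    return mean(found), max(found) - min(found)

# Hypothetical results for one keyword across four datacenters.
sample = {"dc-1": 12, "dc-2": 15, "dc-3": 14, "dc-4": None}
avg, gap = summarize_positions(sample)
```

The gap value is the interesting part: a large spread between datacenters is exactly the "future moves on the SERP" signal described above.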

How do I use it for the SEO Hero Challenge?

1 – Checking rankings of my competitors

For a few days now, everyone has been out of the top 100, even , so it's hard to take a screenshot. For the example I will use a non-eligible hero website:

2 – I also used the tool to run some tests on the Google contest filter

Is Google filtering the website or the whole “SEO Hero” SERP?

-> Testing a related, non-competition query: the website still ranks there, while it is filtered on the contest keyword

=> Only the SERP is filtered