Near the beginning of July, Google began rolling out changes to Google+ pages. And by "changes," I mean whatever exists a step beneath "genocide." Links to G+ pages began mysteriously redirecting to Google 404s in bulk, and non-Gmail Google accounts were receiving "deactivated" notices upon sign-in. In the following weeks, Google rolled out its "Pigeon" algorithm, which altered the way Local rankings are calculated.
Since Business Promotion specializes mostly in local services, many of our clients felt the impact of Pigeon almost immediately, and figuring out how to adapt to it became my primary focus. So last week, when I stumbled upon a series of clues that suggested a theory about the major changes, I volunteered to take over the blog post for the week and set out to unravel the mysteries of Google - in a week. How hard could that really be?
The process I arrived at seemed simple enough.
Step 1: Find pages that would not have ranked on Google Local before Pigeon but are ranking now. There were plenty of these pages to be found: they have no reviews, no posts, many of them are unverified, some have no website to link to, many have incorrect information, and some belong to businesses that closed their doors years ago.
Step 2: Figure out what these pages were doing right. It seemed unlikely that Google picked Local rankings willy-nilly, so these pages must be doing something right. By isolating the things they do well, we can identify which elements Pigeon now takes into consideration that were not considered before.
To find these anomalies, I needed cities with relatively high populations but little competition in Google Local. A city where all seven of the Local listings were top-notch gives me no new information at all. Strangely enough, these criteria led me to a specific stretch of the West Coast: from Bellevue, Washington, down to San Francisco, California - roughly the corridor between Bing and Google headquarters.
Thesis: Pigeon has changed its emphasis from Google+ reviews to any reviews for a business posted on a reputable site.
Initially, I discovered that four of the seven Local listings in a specific search all used the same service. This service is used for appointment reminders and follows up after the appointment by texting the client a request for a review. As a result, this service typically generates six times as many reviews as Google+, Yahoo, or Yelp (in the industry I examined). This finding seemed to show a strong correlation between reviews and Local rankings. Of course, this specific service is rather popular due to the value it provides beyond the reviews it generates. Alone, the finding was inconclusive; however, many of the anomalies had nothing else going for them - from an SEO standpoint, that is.
I decided that the logical next step was to go from city to city, cataloguing each of the 7 Pack listings and counting how many total reviews I could find for each around the internet. As a control group, I then counted the reviews for the top seven organic results that were not featured in the 7 Pack. That is fourteen listings per city, over an 800-mile stretch of country, with one week to gather and analyze data. Somewhere along that road, I learned a valuable lesson about overambition.
What I can say about the data is this: there is still a very strong, albeit inconclusive, correlation between Google Local rankings and third-party reviews. I intend to continue compiling data and will post again when I have concrete statistics.
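For readers curious what "a strong correlation" means in practice, here is a minimal sketch of the kind of check described above. The review counts below are hypothetical placeholders, not figures from my study: each listing gets its 7 Pack position (1 = top) and the total third-party reviews found for it, and a Pearson correlation coefficient is computed over the pairs.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical tallies for one city's 7 Pack:
# position in the pack vs. total reviews found around the web.
positions = [1, 2, 3, 4, 5, 6, 7]
reviews = [48, 41, 35, 30, 22, 9, 5]

r = pearson_r(positions, reviews)
print(round(r, 3))  # → -0.991 on this sample data
```

A strongly negative coefficient here would mean better (lower-numbered) positions go with more reviews; values near zero would mean no relationship. Real data would, of course, be far noisier than this illustration.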
Author: Todd Brown
Position: SEO Agent
Todd Brown is an SEO Agent at Business Promotion. He has worked in marketing since high school, where he began promoting concerts and events using both printed ads and social media. In recent years, he has worked more closely with promoting events for visual arts. Todd spends his free time enjoying live music, hosting parties and events, and reading poetry and metaphysical philosophy.