Has Local Search 2.0 Arrived?
MIHMORANDUM NO. 117 | August 27th, 2008
Basically, Danny’s thesis is that search engines have undergone three quantum leaps in functionality and refinement since the days of Excite, AltaVista, and the other early players in the mid-’90s. Here’s my quick synopsis of what I understood from Danny at the seminar:
Search 1.0: On-Page Factors—Meta Tags & Direct Submissions
Search 2.0: Off-Page Factors—PageRank Formula / Link Data
Search 3.0: Universal & Blended Vertical Search
Search 4.0: Personalized Search based on User Data
Local is truly at the convergence of Search 3.0 and 4.0—the number of times we see Local results blended into Web SERPs is dramatically increasing. Google is straight up asking people to enter their Zip Code so they can show more Local results blended into the organic SERPs even when people don’t explicitly geo-target their queries. So ranking well in Local is not only going to be more important for Mobile search, but also for desktop search.
Local Search 1.0
Search 1.0 seems to me to map fairly well (no pun intended) to Local 1.0. In my opinion, the official “birth” of Local Search was January 30, 2007, when Bill Slawski published this article about Google beginning to integrate Local Results into Web SERPs, coinciding neatly with Danny’s “birthdate” of Search 3.0.
Like Search 1.0, Local 1.0 was highly vulnerable to spam tactics (some would argue that Local is still extremely vulnerable to spammy submissions, but I think Google has gotten demonstrably better at filtering them out in the past several months). Local 1.0 relied heavily on old-school “on-listing” factors like proximity to centroid, keywords in the business title, and direct submission to the Local Business Center.
Cue David Bowie (“Ch-ch-ch-ch-ch-anges…”)
Not to prejudice any contributors for next year’s Local Search Ranking Factors, but over the past 8-10 months, I think we’ve seen an evolution of sorts in Local, at least at Google. Off-listing factors seem to be taking on increasing importance.
- Proximity to centroid seems to be dying (though I don’t think it’s completely dead, contrary to what Carter Maslan implied a couple of months ago; see the research I did with Mike Blumenthal and others).
- Google seems to be taking citations and web references more and more seriously. Heck, they’ve even added a whole new tab for User-Generated Content!
- From my own personal research (unpublished–more on this in future posts; perhaps October?), they seem to be spidering a wider variety of sources for reviews.
- LBC categories are still important, but so are categories from other sources (in fact, as Mary Bowling pointed out in the LSRF study, Google relies heavily on traditional Internet Yellow Pages categories in the case of unclaimed LBC listings).
- Local search expert Chris Silver Smith speculates that clickthrough data from the 10-pack may be a factor in ranking well in future 10-packs.
- KML Sitemaps are beginning to gain more attention.
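Since KML Sitemaps come up less often than the other factors above, a quick illustration may help: a geo sitemap is just a small KML file describing your business location(s), which you then reference from (or alongside) your regular XML sitemap. Here’s a minimal sketch—the business name, address, and coordinates are entirely hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Example Plumbing Co.</name>
      <address>123 Main St, Anytown, CO 80401</address>
      <Point>
        <!-- Note: KML lists longitude first, then latitude -->
        <coordinates>-105.2211,39.7555</coordinates>
      </Point>
    </Placemark>
  </Document>
</kml>
```

You’d save this as something like locations.kml on your site and list its URL in your sitemap.xml so crawlers can discover it.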
…Where Is Google Headed?
Part of the problem in Local is that not every vertical offers the same quality or quantity of information. The recent research compiled by Mike Blumenthal, myself, and several others suggests that old-school 1.0-style factors matter more when Google has less information to go on. In other words, Proximity to Centroid and Business Title dominate in a vertical like Plumbing, where there are very few citations or links (let alone reviews), and even fewer businesses with websites.
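To make “Proximity to Centroid” concrete: it’s just the great-circle distance between a business’s coordinates and the point Google treats as the center of a city or zip code. Here’s a quick sketch using the haversine formula—the coordinates are hypothetical, and this is only my guess at the general kind of calculation involved, not Google’s actual code:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical city centroid vs. two competing businesses
centroid = (40.7580, -73.9855)
biz_a = (40.7600, -73.9900)   # a few blocks from the centroid
biz_b = (40.7000, -74.0100)   # several kilometers away

# Under a pure 1.0-style algorithm, biz_a would tend to outrank biz_b
print(haversine_km(*centroid, *biz_a) < haversine_km(*centroid, *biz_b))  # True
```

The point of the sketch is just that this signal is trivially computable from a listing’s coordinates alone—which is exactly why it dominates in verticals where Google has little else to go on.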
But in a vertical like hotels, where 99.9% of businesses have a website, there are plenty of off-listing factors, including UGC and reviews, to consider.
One of the main reasons I think Google is already obfuscating citation data, or soon will be, in much the same way it obfuscates link data is that its Local algorithm will become increasingly dependent on it. As more and more Local portals pop up, the quantity of information about businesses in previously poorly represented verticals (like plumbing) will grow, and the 1.0-style factors will start to matter less.
To date, I don’t think “gaming” the citation system, or even mining competitors’ citations, is occurring on any kind of widespread scale. If you’re reading this article, you’re in the minority of people in Search who care about Local, let alone one of the two or three hundred people in the world who understand the distinction between citations and links. But it stands to reason that less-than-whitehat SEOs will eventually try to manipulate this part of the algorithm once the more obvious “1.0” factors dwindle in importance.
As with link data, I think it’s entirely likely that small business owners logged into their Local Business Center account will be able to see this data presented more-or-less accurately, but that this information will not be presented to unauthenticated users.
Ironically, on-page factors may start to become more important as off-listing factors for Local. Things like title tags and keywords on the landing page linked from an LBC listing are already thought to be highly important by many Local experts.
So is what I’m talking about really Local 2.0? Or is it just an advancement of, or improvement upon, Local 1.0? Is this shift perceptible to any of you yet? Perhaps Google is just learning how to better incorporate more data for a broader range of industries, rather than actually tweaking its algorithm.
I’m not really sure that the semantics of it matter, but I wanted to share my thoughts on the subject with you and I would love to see a lively discussion happen in the comments below!