Has Local Search 2.0 Arrived?

MIHMORANDUM NO. 117 | August 27th, 2008


Danny Sullivan’s “keynote” at the recent SEOmoz Expert Seminar highlighted the evolution of search, which he’s also blogged about before at Search Engine Land.

Basically, Danny’s thesis is that search engines have undergone three quantum leaps in functionality and refinement since the days of Excite, AltaVista, and the other early players in the mid-’90s. Here’s my quick synopsis of what I understood from Danny at the seminar:

Search 1.0: On-Page Factors—Meta Tags & Direct Submissions
Search 2.0: Off-Page Factors—PageRank Formula / Link Data
Search 3.0: Universal & Blended Vertical Search
Search 4.0: Personalized Search based on User Data

Local is truly at the convergence of Search 3.0 and 4.0—the number of times we see Local results blended into Web SERPs is dramatically increasing. Google is straight up asking people to enter their Zip Code so they can show more Local results blended into the organic SERPs even when people don’t explicitly geo-target their queries. So ranking well in Local is not only going to be more important for Mobile search, but also for desktop search.

Local Search 1.0

Search 1.0 seems to me to map fairly well (no pun intended) to Local 1.0. In my opinion, the official “birth” of Local Search was January 30, 2007, when Bill Slawski published this article about Google beginning to integrate Local Results into Web SERPs, coinciding neatly with Danny’s “birthdate” of Search 3.0.

Like Search 1.0, Local 1.0 was highly vulnerable to spam tactics (some would argue that Local still is extremely vulnerable to spam and submissions, but I think Google’s gotten demonstrably better at filtering in the past several months). Local 1.0 relied heavily on old-school “on-listing” factors like proximity to centroid, keywords in business title, and direct submission to the Local Business Center.
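
To make those “on-listing” factors concrete, here’s a toy scoring sketch in Python. It’s purely illustrative: the weights, the distance decay, and the factor list are my own inventions for demonstration, not Google’s actual formula. (The listing data is hypothetical, borrowing Juniper Flowers as an example; the coordinates are approximate.)

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in miles."""
        r = 3958.8  # Earth's radius in miles
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def local_1_0_score(listing, keyword, centroid):
        # Toy "Local 1.0" score: proximity to the city centroid, a keyword
        # match in the business title, and a direct LBC submission all help.
        # Weights are invented for illustration only.
        dist = haversine_miles(listing["lat"], listing["lon"],
                               centroid[0], centroid[1])
        proximity = 1.0 / (1.0 + dist)  # decays away from the centroid
        title_match = 1.0 if keyword.lower() in listing["title"].lower() else 0.0
        submitted = 1.0 if listing.get("in_lbc") else 0.0
        return 0.5 * proximity + 0.4 * title_match + 0.1 * submitted

    # Hypothetical example: a Seattle florist
    centroid = (47.6062, -122.3321)
    listing = {"title": "Juniper Flowers", "lat": 47.61, "lon": -122.33,
               "in_lbc": True}
    print(round(local_1_0_score(listing, "flowers", centroid), 3))

Notice that under a model like this, a keyword-stuffed business title moves the needle as much as a downtown address, which is exactly why Local 1.0 was so easy to spam.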

Cue David Bowie (“Ch-ch-ch-ch-ch-anges…”)

Not to prejudice any contributors for next year’s Local Search Ranking Factors, but over the past 8-10 months, I think we’ve seen an evolution of sorts in Local, at least at Google. Off-listing factors seem to be taking on increasing importance.

Google’s recent “bug” in the Local data they’re showing for certain searches (I’m still not entirely convinced that it’s just a bug), combined with Danny’s keynote at the seminar, got me thinking…

…Where Is Google Headed?

Part of the problem in Local is that not every vertical offers the same quality or quantity of information. The recent research compiled by Mike Blumenthal, myself, and several others suggests that old-school 1.0-style factors matter more when Google has less information to go on. In other words, Proximity to Centroid and Business Title dominate in a vertical like Plumbing, where there are very few citations or links (let alone reviews), and even fewer businesses with websites.

But in a vertical like hotels, where 99.9% of businesses have a website, there are plenty of off-listing factors to consider, including UGC and reviews.
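
To illustrate how that might play out algorithmically, here’s a hypothetical sketch of a score that slides from on-listing to off-listing signals as the citation pool for a vertical grows. The sliding weight and its constants are entirely my own invention; nobody outside Google knows the real blend.

    def blended_score(on_listing, off_listing, citation_count):
        # Weight on off-listing signals rises as more citation data exists
        # for the vertical; capped so on-listing factors never vanish.
        # All constants are invented for illustration.
        w = min(0.9, citation_count / (citation_count + 25.0))
        return (1 - w) * on_listing + w * off_listing

    # Sparse vertical (plumbing): 1.0-style on-listing factors dominate
    print(blended_score(on_listing=0.8, off_listing=0.3, citation_count=3))    # ~0.75
    # Data-rich vertical (hotels): off-listing signals dominate
    print(blended_score(on_listing=0.8, off_listing=0.3, citation_count=500))  # ~0.35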

One of the main reasons that I think Google is already, or will soon be, obfuscating citation data in much the same way they obfuscate link data is that their Local algorithm will become increasingly dependent on it. As more and more Local portals pop up, the quantity of information about businesses in previously poorly represented verticals (like plumbing) will grow, and the 1.0-style factors will start to matter less.

To date, I don’t think “gaming” the citation system, or even mining of competitors’ citations, is occurring on any kind of widespread scale. If you’re reading this article, you’re part of the minority in Search that cares about Local, let alone one of the two or three hundred people in the world who understand the distinction between citations and links. But it stands to reason that less-than-whitehat SEOs will eventually try to manipulate this part of the algorithm once the more obvious “1.0” factors start to dwindle in importance.

As with link data, I think it’s entirely likely that small business owners logged into their Local Business Center account will be able to see this data presented more-or-less accurately, but that this information will not be presented to unauthenticated users.

Ironically, on-page factors may start to become more important as off-listing factors for Local. Things like title tags and keywords on the landing page linked from an LBC listing are already thought by many Local experts to be highly important.
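
If you want to sanity-check the landing page attached to an LBC listing, here’s a quick stdlib-only Python sketch that fetches a page and reports whether your target terms appear in its title tag. The URL and terms in the example are placeholders of my own, not a recommendation of specific keywords.

    import re
    import urllib.request

    def title_contains(url, terms):
        """Fetch a page and report which terms appear in its <title> tag."""
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = m.group(1).strip() if m else ""
        return title, {t: t.lower() in title.lower() for t in terms}

    # Placeholder URL and terms, for demonstration only
    title, hits = title_contains("http://example.com/", ["florist", "seattle"])
    print(title, hits)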

So is what I’m talking about really Local 2.0? Or is it just an advancement of, or improvement upon, Local 1.0? Is this shift perceptible to any of you yet? Perhaps Google is just learning how to better incorporate more data for a broader range of industries, rather than actually tweaking its algorithm.

I’m not really sure that the semantics of it matter, but I wanted to share my thoughts on the subject with you and I would love to see a lively discussion happen in the comments below!

10 Responses to “Has Local Search 2.0 Arrived?”

  1. Mike Blumenthal says:

    Good questions! Is Google Maps changing dramatically or is it our perception of it changing or is there just more data available for Google to do a better job…

    My vote is on the last two…when I go back and look at patents from 2004, 2005, etc., on things like location prominence, I see the elements of what we are looking at today. The reason we didn’t see them much sooner was, as in the case of the plumbers, a lack of data that Google could use. There have been changes, to be sure, but I think the basics are essentially the same.

    Mike Blumenthal

  2. Neil Street says:

    Great post, Dave. I also followed the link and read your blog post about Citations. I see Google is already “obfuscating” citation results: I have a client who is sitting on top of the Ten Pack for his niche in his market, and it indicates that he has 76 web pages in his citations list. But when you click to review them, there is only one link showing — his home page! So Google plays “hide the data” as usual!

    I have long been a fan of getting my clients into as much structured data as possible, even the ones that don’t provide a good, old-fashioned, direct link. I always figured that if latent semantic indexing is in the pipeline, then any form of legit citation (I do like that term) is good stuff. And experience seems to bear this out.

    One point you made in the Citations post, I’d raise a flag on. You mentioned:

    “Crappy-looking links might not be crappy after all. If we go a bit deeper and see the kinds of links that are being counted by the Local algo, just using #1 Juniper Flowers as an example, we see a heavy dose of usual suspects Citysearch and Yelp, but we also see things like Washingtonfloristonline.com and Locateaflowershop.com. I don’t know about you, but based on domain name alone, those look like pretty spammy sites to me, and not worth pursuing as a link for the organic algorithm, but they’re just the kind of links that seem to be propelling Juniper to the #1 Local ranking”.

    I actually don’t think those two sites are spammy. They seem OK at first glance. But there are tons of spammy directory sites out there. Any idiot can get a bunch of links from them, probably at a rate of about $10 for 150 or whatever. But just because Google hasn’t (perhaps) gotten around to filtering these out of the Local Search algo yet, it doesn’t mean they won’t get around to it. To my way of thinking, links from spammy sites send a huge signal that you are trying to game the system — and most importantly, after you sign up for junk like that, you will never be able to get out of them. So I would caution readers against grabbing a bunch of cheapo links just because today Google is not slamming them. They will eventually.

    Great post, thanks.

  3. David Mihm says:

    Mike, I’m inclined to agree with you — the shift seems to have happened more in the amount (I wouldn’t necessarily say quality) of data Google has to go on now. But with all of the other stuff G is rolling out these days like personalized search and now search suggest, I wouldn’t be surprised if this was an actual shift in the algo either.

    Neil, thanks for your thoughtful response. As Mike pointed out in the comments of one of my last posts, you can still get accurate citation data through maps.google.com (as opposed to google.com), though I’m not sure how much longer that’ll last. And you’re absolutely right that QUALITY citations are still more important to obtain in the long run, but quantity does seem to be the name of the game in this early phase of Local search. It’s a matter of having both a short-term AND a long-term strategy. Thanks for stopping by!

  4. MiriamEllis says:

    Excellent post, David, and I will surely sphinn it promptly.

    I agree that the personalization of Local does smack of Danny’s 4.0, let alone 2.0, but like Mike, I feel like we are still dealing with mainly the same metrics that have been around for the past couple of years…trying to determine their real strength. UGC is a new one, though!

    Oh, if you’re right in your prediction that Google will begin obfuscating citation data, I will applaud you…but I will be so mad! :)

    Neil says:
    …have a client who is sitting on top of the Ten Pack for his niche in his market, and it indicates that he has 76 web pages in his citations list. But when you click to review them, there is only one link showing — his home page! So Google plays “hide the data” as usual…

    I can’t help wondering if this is a glitch. I haven’t seen anything like this, and don’t recall seeing any results like this doing the Mike Blumenthal Code Cracking project. Mike, David, have you guys seen this???

    David, the 1.0, 2.0 question is a good one, and in addition to Danny’s definitions of these phases, I associate a metric of ease and quality of results with each of the different stages the Internet is progressing through. That being said, articles like Mike’s recent one about the phony health care listings make me feel that Google is still in the dark ages when it comes to what I see as their overall basic goal with Local. The gaming of the results, in addition to pure error, is standing in the way of Local being the usable, trustworthy tool it could be. I don’t envy Google the headache of making it so, but having set themselves up as the Local king, they’ve got to find a way to do better than this. My 2 cents!

    Miriam

  5. Mike Blumenthal says:

    @Miriam

    I do think this particular case of them not showing up is a glitch. David wrote a post on it a while back. My opinion is that it is a function of their recent blue line upgrade, which seems to have caused all sorts of problems. My experience is that it comes and goes from the TEXT view, sometimes it’s there and sometimes not, but it has been consistently visible in the Maps view. They seem to have been very involved in fixing the problems caused by the upgrade (inability to reorder directions, etc.) and have let other, less obvious problems (inconsistency in showing web page totals) slide.

    That being said, even without this obvious obfuscation, Google has clearly not been showing ALL web page citations & references in the Web Pages tab. We know this because the quoted snippet that you often see at the highest level of a Maps results page is frequently not shown in the Web Pages tab…

    We have, however, seen a strong correlation between the Web Pages total and ranking, so while we are not seeing every citation, we are seeing some indication of relative strength within that industry.

    I think of the tabs as Google’s way of saying: “These things are important, Pay attention!” as opposed to them giving us absolute insight into Google’s index.

    Mike

  6. MiriamEllis says:

    “My experience is that it comes and goes from the TEXT view, sometimes it’s there and sometimes not, but it has been consistently visible in the Maps view.”

    Mike, shall we create a new category of bugs called Rabbit Ears? :)

    I know; I see wonky things come and go constantly while using Maps. Adjusting the rabbit ears can sometimes help: going away for a day or two, forgetting about it, and coming back later to see changes.

    Thanks for answering my question, Mike.
    Miriam

  7. Mike Blumenthal says:

    Hmm…that’s an interesting graphic for Maps…Google rabbit ears: twist ‘em just right and you might just see something…

    Mike

  8. Gab Goldenberg says:

    Re: Hiding citations. It should be interesting to see when a hub finder tool for local search citations will be developed, as that is the core of basic link building.

  9. Neil Street says:

    Good tip about using maps.google.com as opposed to google.com.
    When I go that route, I see about 35 of my client’s citations (although the tab indicates there are about 76).
    I just went back and checked the citations showing under google.com, and it’s still the same: the tab indicates 76 citations, but only shows the home page when you click on it.

  10. Dave Oremland says:

    Nice article, David. Also interesting comments from readers.

    I still see a lot of traffic coming uniquely from organic search, as opposed to Google Maps results inserted into search for local phrases. I see it with secondary terms for which Google doesn’t generate a map. I see it on searches wherein a state name is used rather than a city name. And I see search phrases in a metro region wherein the searcher mentions more than one geo area…and no maps emerge.

    Does anyone have a feel from clients as to the effectiveness of ranking #1 and lower within Maps? I’m curious. One thing I can verify from a couple of examples: an authoritative onebox is an awesome traffic generator, IMV. LOL, frankly, I’d love to turn a phrase or business into a recipient of an authoritative onebox, in lieu of the anticipated ten-pack for a phrase; not unlike the experience with “denver flowers” last December and the first few months of this year.

    BTW, happy Labor Day!

    Dave

