Highlights of Eric Enge’s Interview with Google Local Director Carter Maslan
Eric Enge just posted a remarkably thorough interview with Carter Maslan of Google Local. I’m going to run through the highlights here, but please, please, please read the interview in its entirety.
What a fabulous interview. I applaud Carter for his openness in talking about the factors their algorithm is looking at, and I applaud Eric for being so thorough with his questions.
Highlight #1: What’s Google’s stance on the same business having multiple listings in the 10-pack?
[E]ven though there is nothing inherently wrong with a single business ranking prominently in the top results, we are working to modify the way we handle cases like that. The challenge spans multiple areas around the quality of the data, the way we index it, the way we score it, and the way we present the UI in showing it.
So we are working on variations on ways to handle those cases better, but that’s not to say that there’s something inherently wrong with one business having a prominent presence in the top ten. (emphasis mine)
It’s so nice to see Google’s sensitivity to the issue of multiple business listings in the 10-pack for the same company. I’ve seen cases of both obvious spam AND legitimate regional businesses and can completely understand Google’s dilemma. They don’t want to allow spam in their index, and yet they don’t want to throw out companies for whom consumers expect to see multiple locations either.
Personally, I’d think that a URL filter would solve this problem to a large degree, since multiple listings from the same company would conceivably point back to the same domain. So filter the results prior to presenting them in the 10-pack such that any one company is limited to some maximum number of listings (three, say) out of 10. But it’s a very tough situation for Google.
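The URL filter I’m imagining could be sketched roughly like this. To be clear, this is purely hypothetical on my part — the function name, the cap of three, and the listing format are all my own assumptions, not anything Google has described:

```python
# Hypothetical sketch of a per-domain cap applied before rendering
# the 10-pack. The cap of 3 and all names here are assumptions.
from urllib.parse import urlparse
from collections import Counter

def filter_ten_pack(listings, max_per_domain=3, pack_size=10):
    """Take ranked (title, url) pairs; return up to pack_size results,
    allowing at most max_per_domain listings from any one domain."""
    seen = Counter()
    result = []
    for title, url in listings:  # listings assumed already ranked
        domain = urlparse(url).netloc
        if seen[domain] < max_per_domain:
            seen[domain] += 1
            result.append((title, url))
        if len(result) == pack_size:
            break
    return result
```

So a legitimate multi-location business could still hold several spots, but no single domain could monopolize the whole pack.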
Highlight #2: Bulk Uploading and Spam
That’s something that we are taking a look at, because the bulk upload has been the primary source of map spam, so we are reexamining that. We are looking at a combination of factors to address it either through added verification or through, for example, consistency with the listings that have been manually verified. (emphasis mine)
It’s not necessarily surprising to hear this, but one can appreciate Carter’s candor in describing the vulnerability of this feature. From nearby sections of the interview, it sounds like they may be re-instituting the bulk upload feature, but with a more stringent verification process in the near future…
Highlight #3: Carter Confirms Citations as a Major Player in the Algorithm
We try to find as many references to the businesses as possible, and then determine if they are referencing the same business, and then present that as related information to that business. So we try to look as comprehensively as possible for sources of information on a business.
Just the fact that there is an implicit link in the geospatial world, is not as strong as the explicit anchor text that goes straight to a URL that we know is a definitive domain for a business. But yes, it does help to have your business well-described and geo-coded in references on different pages. (emphasis mine)
Not much more to be said here. But definitely get to know these guys to leverage your pool of citations for maximum exposure.
Highlight #4: The role of Business Title in the Local Business Center
Products and keywords in the title record is much the same as it is with web search, where it is a double-edged issue. You don’t want to spam your title, in general, you just want to describe your business accurately.
It was absolutely terrific to get clarification from Carter about the role of the Business Title in the LBC. This was the most valuable comment I’ve ever heard come directly from a search engine rep. I’ve been advising clients to structure their LBC titles as “Company Widgets – Geolocation” for a while without really knowing whether it was white hat or not. Carter’s response appears to indicate that Google sees descriptive titles as positive and keyword-stuffed titles as spam. I’d totally agree with that!
Highlight #5: My ONE disagreement with Carter
Exactly, as for proximity to centroid that is an old issue which we’ve addressed, so that is not too important any more. Early in the history of local search people would try to setup locations near the centroid, but that is just not that important any more. (emphasis mine)
The one statement that I’d take issue with in the ENTIRE interview is that centroid doesn’t matter much anymore. I’ve seen plenty of searches in non-competitive industries or locations where centroid still makes plenty of difference in the rankings, presumably in the absence of strong signals from elsewhere on the web.