I have been following with fascination (and fielding many calls from research analysts) the public market buzz around Local.com (NASDAQ: LOCM). This firestorm was triggered by John Gilliam’s recent blog post on SeekingAlpha. While I personally think John’s assessment of the patent is overzealous (he did disclose he was long the stock), I couldn’t have written a better rebuttal than industry veteran Martin Himmelstein, whose comments I agree with entirely.
There is an incredible amount of prior art affecting the local search and web mapping space, due to their dependence on geospatial technology – which has been around for over 35 years.
Marty’s blog links to a Vicinity patent (now owned by Microsoft) which cited 11 other patents when it was filed in 1998. Local.com, by contrast, cites only 5 references – even though they filed in 2005. Perhaps LCOM didn’t pay their patent lawyers enough… Anybody involved in software and internet patents knows that the USPTO is broken, and should be encouraging their children to pursue a career in patent law. For fun, you can check out patent 6,611,751, now owned by Cquay, where our prior art search found over 50 cases to cite in this class – but of course not Marty’s (because his had not yet been granted). Like Marty, we don’t see much advancement of the state of the art in LCOM’s patent.
This debate about location references in web content and business data, geocoding, geotags, push-pin maps, and geographic search is far from over, and I predict that a flurry of litigation will emerge. In that sense, John Gilliam is correct, but LCOM’s patent is likely not defensible. The noise around paid local search is already at fever pitch, and with the amount of money involved in local advertising, the litigators must be drooling to take a run at Google, Microsoft, Yahoo, AOL, and IAC.
The invention described in Cquay’s patent is the basis of this whole series on PlaceSmart Search™. To help position this concept, I need to start with Marty’s patent, which describes a process for geocoding web content very similar to the one LCOM claims. Included in this patent is the notion of a “point key” – used primarily to associate the geographic location of an address with areas (polygons), enabling geographic search by, for example, city or zip code boundary.
We believe this patent is a little thin too, because GIS systems have been using boundary features to retrieve point features and associated content for 20 years. Further, points derived from geocoded addresses carry error factors and only allow content to be linked to theoretical locations on a map (such as a push-pin feature). Anyone who tried to use Amazon’s A9 local search engine with all those expensive building photos (now turned off) experienced this issue. While the pictures all had lat/lon tags, and you could get a panoramic view of building facades at a particular location, you could rarely tell which picture was associated with a specific address. Without local geographic knowledge to tell me which side of the street to look on, and with only a fraction of the pictures showing visible building numbers, I found myself lost in a sea of pictures.
GeoTags on Flickr photos are just as frustrating. A much better approach is the one Zaio (TSXV: ZAO) is using for the commercial and residential real estate industry, where the photo, legal assessment data, and address are all tied together in a database, and the geocode is recorded at the same time the picture of the building is taken.
The primary flaw in first-generation location-centric search engines, as discussed in a prior post, is that mapping systems were never designed for information retrieval – and geocoding technology, while useful in mapping, GIS, and navigation applications like TomTom, was never intended as a method for indexing and integrating disparate content sources about specific places.
We refer to this as the “which place?” problem, rather than the “where is it?” problem.
More specifically, location coordinates derived from geocoding addresses do not uniquely identify a building, let alone a suite in an office tower or a store in a mall. In fact, getting back to real-world addresses from lat/lon coordinates is terribly inaccurate – run in reverse, geocoders will invent building numbers and even get the street name wrong. Geocoding does allow for geographic search and proximity-based content retrieval (and, of course, map spam) – but it does not allow us to index, integrate, share, and find information about places.
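A toy sketch makes the point concrete. The geocoder, addresses, and coordinates below are all hypothetical – real geocoders are far more elaborate – but they illustrate why an interpolated street coordinate cannot serve as a unique key for a place:

```python
# Hypothetical sketch: why a geocoded coordinate cannot uniquely
# identify a place. Addresses and coordinates are illustrative only.

def naive_geocode(address: str) -> tuple[float, float]:
    """Resolve an address to a single interpolated street point,
    the way typical address geocoders do."""
    street_points = {"100 Main St": (37.7920, -122.3970)}
    # Everything after the street address (suite, floor, store) is
    # ignored, so every unit in the building collapses to one point.
    street = address.split(",")[0]
    return street_points[street]

a = naive_geocode("100 Main St, Suite 200")
b = naive_geocode("100 Main St, Suite 1400")

# Two distinct real-world places, one coordinate: the lat/lon pair
# cannot distinguish them, so it cannot index content about them.
assert a == b
```

Two different suites – two different “places” in every sense that matters to a searcher – are indistinguishable once reduced to a coordinate pair.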
Cquay’s patented technology and Common Ground™ platform were designed to solve this problem. We started with “addressable locations” or “places” in a database of places, and added a unique key or identifier for every place (in the early days we called this a Universal Spatial Locator, or USL; we now refer to it simply as a Place ID™ or Place Tag™). Encoded in the key is information about the address (down to the sub-division level), the place name (such as Embarcadero Center or Mall of America – if there is one), and the location (points or polygons) – along with all the hierarchical and geographical relationships between and among other places in the database. A hierarchy of places, if you will. The technology includes a registry, index, search, and ranking method for real-world places that works like a URL or internet address for virtual “places” on the web. With this approach, location precision down to a store in a mall (using the geometry from a mall floor plan) is possible, and a reverse geocode from a GPS coordinate will produce a list of actual real-world buildings, ranked by proximity.
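To illustrate the shape of this idea – and only the shape; the class names, ID format, and coordinates below are my own invention, not Cquay’s actual implementation – here is a minimal registry in which every place carries a hierarchical key, and a reverse geocode returns real registered places ranked by proximity rather than an invented street address:

```python
import math
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: hierarchical place keys plus a reverse
# geocode that returns registered places, not fabricated addresses.

@dataclass
class Place:
    place_id: str            # hierarchical key, e.g. "US/MN/Bloomington/MallOfAmerica/Store-214"
    name: str
    lat: float
    lon: float
    parent: Optional[str] = None  # parent place in the hierarchy

class PlaceRegistry:
    def __init__(self):
        self.places: dict[str, Place] = {}

    def register(self, place: Place) -> None:
        self.places[place.place_id] = place

    def reverse_geocode(self, lat: float, lon: float, limit: int = 3) -> list[Place]:
        """Return registered places ranked by proximity to (lat, lon)."""
        def dist(p: Place) -> float:
            return math.hypot(p.lat - lat, p.lon - lon)
        return sorted(self.places.values(), key=dist)[:limit]

registry = PlaceRegistry()
registry.register(Place("US/MN/Bloomington/MallOfAmerica",
                        "Mall of America", 44.8549, -93.2422))
registry.register(Place("US/MN/Bloomington/MallOfAmerica/Store-214",
                        "Store 214", 44.8551, -93.2425,
                        parent="US/MN/Bloomington/MallOfAmerica"))

# A GPS fix near the mall yields actual registered places, ranked by
# distance – never a made-up building number.
nearby = registry.reverse_geocode(44.8550, -93.2423)
```

The key design point is that the answer set is drawn only from places that exist in the registry, and the hierarchical key preserves the mall-to-store relationship that a bare coordinate throws away.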
This technology forms the basis of our Common Ground information integration and retrieval platform, which we currently license to customers like Bell Canada, MTS Allstream, and the US Sheriffs Association, and which will be used to launch a suite of PlaceSmart Search™ and PlaceSmart Information Services in 2008.
(To be continued…)