Thursday, November 5, 2009

Was the Google Maps data change a big mistake?

So the discussions about the great Google map data change in the US rage on, and we are seeing more and more reports of significant data quality issues. I wrote about how Central City Parkway was completely missing, and I reported this to Google to see how the change process would work. I posted later about how it had been partially fixed, with the new geometry visible but not routable, and with the wrong road name and classification. The latest state (on November 5, after reporting the error on October 9) is that it is now routable, but it still has the wrong road classification, being shown as a minor road rather than a major highway. This means that if you calculate the best route from Denver to Central City, Google gets it wrong and doesn't use Central City Parkway, choosing some much smaller mountain roads instead, which take a lot longer. Microsoft Bing, Yahoo, MapQuest and CloudMade (using OpenStreetMap) all calculate the correct route using Central City Parkway. Another substantial error I found recently is that if you search for one of the companies I work for, Enspiria Solutions (by name or address), the location returned was about a mile off. This has now been partially but not entirely fixed (after I reported it).

Steve Citron-Pousty recently wrote about whether Google made the data change too soon. He talked about how his wife has always used Google Maps, but it has got her lost four times in the past week and she has now switched to using MapQuest or Bing. And Google got Steve lost in the Bay Area last week too. Maitri says she is "splitting up with Google Maps" over issues in Ohio, as "there is no excuse for such shoddy mapping when MapQuest and Yahoo do exceptional work in this area the first time around". She links to an article in a Canton, Ohio, newspaper about how the town was mis-named after the recent data change (we call it Canton, Google calls it Colesville). James Fee pointed out an error with Google showing a lake that hadn't been there for 25 years or so. Matt Ball did a round-up discussion on the importance of trusted data. The well known tech journalist Walt Mossberg reviews the new Motorola Droid phone (which uses the new Google data for navigation), and in passing says when reviewing the navigation application "but it also gave me a couple of bad directions, such as sending me the wrong way at a fork in the road". And then in news which is presumably unrelated technically (being in the UK), there was a lot of coverage of a story about how Google Maps contained a completely fictitious town called Argleton - which, even though a separate issue, does not help the public perception of the reliability of Google Maps data.

Update: see quite a few more stories about data issues in the comments below.

So anyway, this is a long and maybe somewhat boring list, but I think that it is worth getting a feel for the number of stories that are appearing about map data errors. As anyone in the geo world knows, all maps have errors, and it's hard to do a really rigorous analysis on Google's current dataset versus others. But I think there is strong evidence that the new Google dataset in the US is a significant step down in quality from what they had before, and from what Microsoft, Yahoo and MapQuest have (via Tele Atlas or NAVTEQ).

Google clearly hopes to clean up the data fairly quickly by having users notify them of errors. But looking at the situation, I think that they may have a few challenges with this. One is just that the number of errors seems to be pretty large. But more importantly, I think the question for Google is whether consumers will be motivated to help them fix up the data, when there are plenty of good free alternatives available. If Google gives you the wrong answer once maybe you let it slide, and perhaps you notice the link to inform them of the problem and maybe fill it out. But if it happens a couple of times, is the average consumer likely to keep informing Google of errors, or just say "*&#% this, I'm switching to MapQuest/Bing/Yahoo"?

Google has made some reasonable progress with Google MapMaker (its crowdsourced system for letting people create their own map data) in certain parts of the world, but these are generally places where there are not good alternative maps already, or people may be unaware of alternatives like OpenStreetMap. So in those cases, people have a clearer motivation to contribute their time to making updates. People who contribute time to OpenStreetMap have a range of motivations, but in general for most of them it is important that the data is open and freely available, which is not the case with Google (at least not so much, I won't get into the details of that discussion here). Most if not all the people I know who contribute effort to OpenStreetMap (myself included) would not be inclined to contribute significant updates to Google (except for some experiments to see how good or bad the update process is).

Consumer confidence is a fickle thing, and you probably don't need too many stories in the newspapers of mishaps due to bad data, or more than a couple of direct experiences of getting lost yourself due to bad data, to switch to a different provider (especially when you are choosing between different free systems - you have a bit more incentive to stick with a navigation system and try to make it work if you've spent a few hundred dollars on it).

The risks are even higher with real time turn by turn directions - no matter how many caveats you put on it, you are likely to get some drivers who follow the directions from the system and don't notice relevant road signs. You only need a couple of accidents because people drove the wrong way up one way streets because of bad data to damage consumer confidence even further.

So I think it will be very interesting over the next few months to see whether the data quality issues are bad enough to result in significant numbers of users moving away from Google Maps in the US or not - and whether Google will get significant uptake in terms of number of people contributing error reports in the US (beyond the initial wave of curiosity-driven updates just to test if the process works). Obviously the answer to the second question is likely to have a big influence on the first. Stay tuned ...

20 comments:

gletham Communications said...

I think Google may have bitten off even more than they can chew on this one. Again they are set on reinventing the wheel. Why not partner up with OSM on this effort? They have clearly messed with something that wasn't broken and now may have "effed" it up. As a former, avid gMaps mobile user I am now finding many other more suitable and useful mobile navigation apps (Waze for example) that are clearly superior to what Google has going on at the moment - you clearly can't do it all Google... why insist on burning all your bridges... this one may really come back to haunt them one day.
Glenn (2gletham) blog.gisuser.com

Ian Turton said...

Once again I'm having to tell people visiting not to use Google maps as my house (and whole subdivision) has vanished. Some of the errors I reported on the first day have been fixed but others have had no response beyond the acknowledgment. I suspect they have been overwhelmed and now people are going to give up.

As to using OSM I heard that the license was the problem - no clue as to what part of it though.

Unknown said...

Guys, I think you have to give Google some time to work on this. Yes, there are errors; as Peter points out, all data sets have errors. What will be interesting is how quickly these errors are fixed - already you are seeing errors corrected in weeks, rather than months as previously.

As for OSM licensing Andrew has a great explanation of some of the issues with opengeodata licensing.
http://highearthorbit.com/the-need-for-clear-data-licenses/

Peter Batty said...

Ed, I think my key questions though are whether the average consumer will (a) give Google some time and (b) be motivated to submit error reports - or will they just go elsewhere? And I think we just have to wait and see on that.

I understand the motivation for what you guys have done and if you can make it work it will have a lot of advantages for you - but I think the next few months will be interesting as we see whether end users have the patience to see this transition period through.

Steven Romalewski said...

Interesting post as always, Peter. Unrelated, I think it's kind of amusing that the AdSense ads in a blog post critiquing Google -- are ads for Google!

Peter Batty said...

Thanks Steven! Yes I was seeing large Google ads too, which is quite amusing. I wonder if that's just because of lots of mentions of Google, or do they figure out that this post may be questioning Google's infallibility and give you an extra large dose of Google goodness ads to counteract that :) ??

Ian Turton said...

Ed: I read Andrew's post on licensing and I'm still in the dark as to why Google couldn't use OSM data?

Peter Batty said...

Ian, I asked Michael Jones about this at the GeoWeb conference this year and here is his answer. OpenStreetMap is currently in the process of changing its license to ODbL (more info here) which will address some of the issues though not necessarily all - some people still have concerns about ODbL being "too viral" as Andrew said.

I suspect that Google may be too far down a different path now to do anything with OSM - but who knows, if their new data initiative doesn't work out so well maybe that might be motivation for them to reconsider at some point?

Kirk Kuykendall said...

I suspect Google intends to compete with Inrix by using android devices to form a sensor web, similar to Inrix's Dust Network.

http://www.inrix.com/news_DustNetwork_23May2006.asp

Google Street View Funny said...

How many GIS Specialists does Google have now? It's great for the GIS industry! Google will need quite a few geonerds to keep those maps up to date! - though... I hate to say this, geonerds in developing countries are priced very well and almost impossible to compete with!

Richard Fairhurst said...

I don't think the Argleton issue is technically unrelated - I think it's essentially the same. That is, data-mining as a source, though insanely clever, is by its nature "fuzzier" than surveying.

Google became the best text search engine by mining data better than anyone else. Early 'search', particularly Yahoo, required manually surveying the web.

But where maps are concerned, I'm not convinced that mining can currently get you the accuracy you need - or indeed that it ever will.

I've written a long (UK-centric) screed on how even little POIs are being screwed up on Google Maps because the mining/conflation stuff isn't ready for prime-time yet. Apply that across a whole country's road network and the result isn't going to be pretty.

Peter Batty said...

Kirk, Google is already doing "sensor network" stuff using Google Maps Mobile, so that will be in the navigation application too, as that's just an addition to the existing app. That will help identify potential issues with the street data, but only gets you so far - won't help with road or town names, addressing issues, etc.

Peter Batty said...

Hi Richard, interesting post of yours on Argleton! And interesting discussion on data mining. I talked a while back about some Google Local Search issues which may be related to some of the things you talk about. But I'm not sure the issues with the core street network data in the US are related to data mining. I think that data is a mixture of data they've captured themselves from their streetview cars (which in general seems reasonably good), combined with TIGER data from the US census bureau in areas where they don't have streetview data (and this data is generally not so good). Obviously they hope they can clean up the TIGER data fairly quickly, but how quickly and how well is the big question.

Anonymous said...

In response to Ed's comment about give Google a break.

Google, by being Google, providing Google Maps free to the community, and driving the demand, has turned the product into a utility / necessity for people, and as such people have a higher level of expectations - so I think Google has very little time to correct these issues. It has taken other firms years to perfect their data, and Google is not good at data creation.

I find it interesting that no one has questioned Google's motives here. I recently read a great article about how Google fakes its support for open source. They use the community to their advantage and ask the community for assistance building their content / data, and in turn do not provide the data back to the community in the same fashion.

In essence all of the employers of the world are subsidizing Google, since employees of these firms come home and help Google build the content empire, and Google continues to get richer and put legitimate businesses out of business (PNDs as an example - I would hate to work for Garmin or TomTom after last week's announcement of Android turn-by-turn navigation).

Just my two cents as I watch an empire grow and turn evil just by its sheer size and "trying to better the community".

Steven said...

I'm surprised that they've got it so wrong - or maybe it's because we're geonerds that we're hearing about the errors?
To give Google some credit, surely they ran the two networks in tandem behind the scenes, compared what their network was giving vs Tele Atlas, and then felt they had crossed some threshold before rolling it out...

Anonymous said...

With the relatively coincidental release of their own data and the announcement of free turn-by-turn navigation, my guess is Google had to drop Teleatlas so they wouldn't lose money on licensing costs for the nav application.

OSM has no validation for the data entered by users, so it is probably of limited usefulness to Google.

Adam said...

Just anecdotally, I have noticed the quality of Google Maps in my area (suburban Boston) has gone down since the switch. Streets themselves appear to be rendered correctly, but addresses are not being properly geocoded. For example, Massachusetts Avenue is a long road which runs through many towns in the eastern part of the state. Locally, people refer to it as "Mass Ave" and it was always possible to use that abbreviation in Google Maps in the past -- not anymore. Many street addresses are no longer plotted in the correct place on the map. My place of business, which has existed for 50 years, is not recognized as a valid address by Google anymore. In my area, at least, this change in map data has not been good at all.

Peter Batty said...

@Anonymous, you say "OSM has no validation for the data entered by users, so it is probably of limited usefulness to Google". I think you underestimate the power of crowdsourcing. Would you say that Wikipedia is of limited use as it has no validation? Both of them have validation from the community (though additional tools to help with this certainly could be - and are - being built). Google itself has bet on this approach and built a clone of OpenStreetMap called ClosedStreetMap, sorry I mean Google MapMaker :), which is just used to create data for certain countries. And I recommend you look at Muki Haklay's rigorous analysis of OpenStreetMap data quality in which he concludes that it is "beyond good enough" for many applications.

Peter Barnes said...

Google have, I think, hit what the satnav vendors have so far avoided - the ability of the customer to choose other content (in this case data) at little or no cost when what they're given doesn't cut it. The disconnected satnav market makes devices that are typically bound to a single dataset, which is both good and bad for the vendors. Google, on the other hand, operate in an area where the end customer can easily switch to Bing, MapQuest etc., like flicking the channel on the TV when the programme isn't good. As Peter pointed out, customers are fickle!

The change does point to a serious risk for those who invest their development capital in apps that use the APIs of the major commercial players such as Google & Bing. The ability of our apps to meet our customers' needs is dependent on the data but we have no say in whether the source of this data, and consequently its fitness for our purpose, changes.

Unknown said...

My office is now 5 miles from where it should be because of errors in the use of "Pike", "Ave", "E" & "W" and a Naval Development airbase that closed 13 years ago and was divided into various public and private properties has reappeared even larger than I ever remember it being in the last 30 years. Add another half-dozen small items reported and then multiply over the entire country and the total errors really start to become a concern.

The ability to report problems is better than I remember from Navteq and Tele Atlas, which only matters if actually used. I would venture to say that a large percentage of end-users who rely on Google Maps are not even aware that the data was switched over, nor have a reason to know such information. These are the users that Google needs to submit error reports. The problem is that once these users start receiving incorrect data, they will not be Google's partners in the field providing feedback at a grass-roots level, they'll just be ex-Google Maps users.