
Sunday, September 4, 2011

Crowdsourcing my State of the Map keynote talk

A couple of months ago I gave a talk at the monthly OpenStreetMap meetup in Denver, and I decided to try crowdsourcing the content (in the best OpenStreetMap tradition). It worked out very well - I received a good amount of interesting content from several folks (thanks everyone!). Now I've rashly agreed to do a keynote at the upcoming State of the Map conference in Denver (rashly since I'm chairing the FOSS4G conference right after that, and am more than a little busy with that, not to mention my day job :O !!). So I thought I would do the same thing again and appeal for ideas on things I should include in a tour of what's new and cool in OpenStreetMap.

You can send me links to stories, slides if you have them, or anything else that you think would be useful. If you did a presentation on something cool and interesting at SotM EU but can't make it to Denver, I'd be happy to mention it over here. And even if you are presenting something over here, I'm happy to include a slide or two as a "trailer" for your talk. Of course I can't promise to include everything, depending on how much material I end up with (and I do have some of my own of course!), but will do my best.

In general I'm interested in hearing about things that you think are (reasonably) new and interesting in the OpenStreetMap world including:
  • Cool applications using OpenStreetMap data 
  • New (or improved) tools for creating / editing OpenStreetMap data 
  • Examples of businesses or government organizations using OpenStreetMap 
  • Anything else you think is interesting! 

Please just drop me an email, or comment below. I will give credit to all contributors, of course!

Monday, June 20, 2011

Speaking at OpenStreetMap Meetup in Denver tomorrow

As mentioned previously, I'll be speaking at tomorrow's OpenStreetMap meetup in Denver, at the cool new MapQuest offices downtown. My attempt to crowdsource the presentation got a great response, and I got lots of suggestions for interesting content. I plan to have something for everyone, along the following lines:
  • A quick intro to OpenStreetMap for any newcomers
  • Some tips on using the Potlatch 2 map editor, which provides some cool new features - this should be good for both newcomers and experienced mappers
  • Examples of how government agencies from various parts of the world are using OpenStreetMap
  • Cool new tools for developers using OpenStreetMap, like Leaflet and Kothic (see the quick Leaflet sketch at the end of this post)
  • Some interesting applications using OpenStreetMap
  • Last but not least, how OpenStreetMap was used to show that you can in fact walk across Dublin without passing a pub (pubs have always been a mainstay of OpenStreetMap!)
We'll also have some discussion about the state of OpenStreetMap in Denver and Colorado, and brainstorm on things we might want to focus on before State of the Map and FOSS4G come to Denver in September! You can get full details on the meetup and sign up here.
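For anyone curious why Leaflet is generating buzz among developers, here is a minimal sketch (not taken from any of the demos above) of what a Leaflet map backed by OpenStreetMap tiles looks like. It assumes the Leaflet library and its CSS are loaded on the page; the div id, coordinates and tile URL are just placeholder examples, and you should check the OSM tile usage policy before pointing a real application at the public tile servers.

```typescript
// A minimal Leaflet map showing OpenStreetMap tiles, centered on downtown Denver.
// Assumes Leaflet (and its CSS) is available, e.g. via `npm install leaflet`,
// and that the page contains <div id="map"></div> with a height set in CSS.
import * as L from 'leaflet';

// Create the map and center it on Denver (placeholder coordinates).
const map = L.map('map').setView([39.7392, -104.9903], 13);

// Add an OpenStreetMap tile layer; URL template and attribution follow the
// usual OSM conventions (respect the tile usage policy in production).
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
  maxZoom: 19,
  attribution: '&copy; OpenStreetMap contributors',
}).addTo(map);

// A marker with a popup, just to show the basic shape of the API.
L.marker([39.7392, -104.9903]).addTo(map).bindPopup('Downtown Denver');
```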

Tuesday, June 14, 2011

Looking for content for "State of OpenStreetMap" presentation

I'm doing a couple of upcoming presentations on OpenStreetMap, the first one next week at the very cool MapQuest office in downtown Denver, so I encourage you to come along to that if you're in the neighborhood ... and this may well evolve into a presentation for State of the Map in Denver too!

So in the best OpenStreetMap tradition I thought I'd try a little crowdsourcing to help me pull this together. I'd be interested if you could send me links (or other info) about things that you think are (reasonably) new and interesting in the OpenStreetMap world including:
  • Cool applications using OpenStreetMap data
  • New (or improved) tools for creating / editing OpenStreetMap data
  • Examples of businesses or government organizations using OpenStreetMap
  • Anything else you think is interesting!
Please just drop me an email, or comment below. In fact if anyone has presentation slides covering interesting stuff that would be relevant, those would be great too. Happy to give credit to all contributors, of course!

Thursday, January 27, 2011

Geospatial in the cloud

As mentioned previously, earlier in the week Brian Timoney, Chris Helm and I gave a set of presentations and demos on geospatial technology in the cloud to the Boulder Denver Geospatial Technologists group. We aimed to give a quick taste of a variety of interesting geo-things currently happening in the cloud, split into six slots of about ten minutes each; apart from my introductory opening slot, these were all demos:
  • Peter: Why the cloud?
  • Brian: Google Fusion Tables
  • Chris: the OpenGeo stack on Amazon (PostGIS, GeoServer, OpenLayers, etc)
  • Peter: Ubisense myWorld and Arc2Earth
  • Chris: GeoCommons
  • Peter: OpenStreetMap
We got a lot of good feedback on the session. Here's the video (for best quality click through to vimeo and watch in HD):

Geo in the cloud from Peter Batty on Vimeo.

Here are links to the demos we used, or related sites:
And finally, here are my slides on slideshare:

Thursday, October 21, 2010

Triple geo-conference goodness coming to Denver!!

Denver has always been known as a center for geospatial activity, and we have a great triple bill of events lined up, one in the near future and two back to back in September 2011.

The one coming up is WhereCamp5280 on November 19th. Eric Wolf, Ben Tuttle and I ran the inaugural one last year, which was a great success - see James Fee's review. I hear a rumor that James will be back this year, so I guess he must have liked it! Eric and I have both been a bit swamped with other things recently, so Steve Coast has kindly taken up the organizing reins this year - thanks to Steve for that! Last year we were kindly hosted for free by Denver University (DU); this year we will be at the University of Colorado Denver on their Auraria Campus, which has the advantage of being within easy walking distance of downtown. And this year we've decided to do one day rather than two. But two things that haven't changed since last year are that the event is FREE, and that we'll be holding the social event on Friday evening at my loft - I expect there will be plenty of geo-beer from the Wynkoop Brewing Company downstairs, and that may fuel some geo-karaoke later on. All this is thanks to our kind sponsors, who at the time of writing include Enspiria Solutions, ESRI, Google, MapQuest and Waze.
WhereCamp5280 party
I'm expecting a great group of interesting attendees and presentations again this year, so highly encourage you to come along. And remember it's an unconference, so we are looking for as many people as possible to participate - prepare a short presentation or come prepared to lead a discussion on a topic that interests you!

Sign up for WhereCamp5280 here, and if you feel like sponsoring at anywhere from $16 to $1024 (can you tell that a techie geek set the sponsorship amounts?!) that would be great, but otherwise just sign up and enjoy the great free education, networking, and beer :).

So WhereCamp5280 is a great local event, but in September 2011 the global geo community will be converging on Denver for a fantastic double bill of FOSS4G and SotM.

For those who don't know, FOSS4G stands for Free and Open Source Software for Geospatial and is an annual international gathering organized by OSGeo. The last North American event was in 2007 in Victoria, BC, and since then it's been in Cape Town, Sydney and Barcelona, so we're delighted to have Denver join that list, and expecting a great turnout from around the world.

Eric Wolf and I led the bid to bring FOSS4G to Denver (which is one of the things we were busy on that was competing for time with WhereCamp5280). Eric was originally slated to be the conference chair, but unfortunately due to circumstances beyond his control he has had to stand down from that, and I have just taken over that role in the last week (well unless the OSGeo board fails to approve the change at their next meeting, but I'm assured that's not very likely!). I'd like to publicly thank Eric for all the work he did to bring the conference here - it was his idea initially, and definitely wouldn't have happened without all his efforts. We have the core of a great local organizing group set up already, but are still interested in recruiting a couple more folks, so if you'd like to help out please let me know. It's going to be a great event, and I'll be blogging plenty more about it over the coming months.

And on top of that it was announced today that Denver has also been selected to host State of the Map (SotM), the global OpenStreetMap conference, also in September 2011. I attended SotM in Amsterdam in 2009 and thought it was a fantastic event. Unfortunately I wasn't able to make it this year, but I will definitely be there next year :) ! The two events are distinct, but several people were involved in both bids, and we recognized that a lot of people would be interested in attending both, so the intent is for them to run back to back. The SotM date isn't fixed yet, but FOSS4G is locked in for September 12-16.

So if you're in the Denver area already, plan to be at WhereCamp5280 on Nov 19, and if you're not, make plans to be here in September 2011!

Sunday, February 14, 2010

OpenStreetMap in Haiti - video

I am sure most readers of this blog have heard about how there has been a huge effort to map Haiti using OpenStreetMap - Harry Wood gave a good summary of efforts a few weeks ago. Currently Schuyler Erle and Tom Buckley are down in Haiti helping out with mapping for the relief efforts on the ground - Schuyler's blog has really interesting accounts of what they've been doing. Schuyler tweeted today that he and Tom were "starting to post raw, iffy quality, unedited audio and video - please feel free to edit/remix/repost". So here's my quick edit of what they've posted so far - I'll try to do something similar if they post more in the coming days. The video features people from UNOSAT and UN OCHA explaining how useful OpenStreetMap data was to them - and also some clips from a prayer march. Click through on the video to see a larger version (and choose 480p).

Update: some quotes from the video ...

Catherine from OCHA (the UN Office for the Coordination of Humanitarian Affairs): "Basically it's the best source of transportation information that we have for Haiti ... it's the most comprehensive one, and the most up to date, of course. This is information that everyone is using here. There is a big GIS group here for this humanitarian response, and everybody is using OpenStreetMap."

Schuyler: "OpenstreetMap was used for the UNOSAT building damage assessment ..."

Olivier from UNOSAT (UN Operational Satellite Applications Programme): "Of course we used OpenStreetMap on a daily basis ... every day it was possible to improve our maps ... with new information. Usually it is impossible to get this information ... so thanks to this volunteer platform, it has been fantastic for us, that's for sure."

Friday, February 12, 2010

OpenStreetMap meetup in Denver next Wednesday

On Wednesday next week, there will be an informal meetup at the Wynkoop Brewing Company in downtown Denver, to drink beer and discuss OpenStreetMap. Steve Coast, the founder of OpenStreetMap, will be there, and it should be an interesting group of local geo folks. We'll be there from 5:30pm until 9ish or later!
Steve Coast and Peter Batty
If you’re already involved with OpenStreetMap, this is a chance to get together with other contributors and discuss ideas – I would love to see us having more coordinated initiatives in the Denver area than we’ve had so far. And if you haven’t been involved but are curious about OpenStreetMap, this is a great opportunity to learn more. Hope to see a good turnout – please sign up here! That's Steve and me in the photo above by the way, to help you track us down if you don't know us!

Tuesday, December 8, 2009

The great OpenStreetMap license debate

If you are involved with OpenStreetMap, you may or may not be aware that a lot of work has been going on to develop a new license. If you are signed up for any of the OpenStreetMap mailing lists, then you certainly know that this work has been going on, as there has been a torrent of emails containing very heated debate on the topic over the past week or so (although work in this area has been going on for a couple of years).

From the long and passionate emails of a small number of people, you might get the impression that the new license is some sort of subversive scheme to somehow take over or undermine OpenStreetMap.

Duty Calls

Everyone is entitled to their opinion of course, but a few key points I would make are as follows:
  • I support the license change and encourage you to do so too.
  • A lot of very bright people who know way more about data licensing than I do have spent a huge amount of time working on the new license and I respect their knowledge and opinions very highly.
  • The License Working Group followed a very open and consultative process that allowed for lots of input from the community (you can see the minutes of over 50 meetings they have held this year alone).
  • The current license from Creative Commons has some significant shortcomings for databases - it is intended to apply to "creative works" and even Creative Commons says that it should not be used for databases.
  • The new license embodies very much the same spirit as the old license, but is much more enforceable, and better protects both the data and users of the data. So anyone who was comfortable contributing data to OpenStreetMap under the old license should be comfortable with the principles of the new license.
  • Switching from a "ShareAlike" license (which basically is more aggressive about ensuring that OpenStreetMap data remains open) to a "Public Domain" license (which has no restrictions) is something that has been discussed, but it would be a much bigger change than what is being proposed. This is worthy of further debate in the future, but even if you support the Public Domain approach (which I lean towards but am undecided on), I believe that the right short term approach is to get the project on a more solid legal footing than it is on currently (but without a major change to the original license philosophy).
  • There have been concerns expressed about the possibility of losing some data if existing contributors refuse to make their data available under the new license terms. There is a little risk here but I think it is overstated. The two licenses are so similar in their philosophy that it is hard for me to see a sensible reason why someone who was happy to contribute data under the old one would not do so under the new one. With data imports from larger organizations it may take a little time to work through some bureaucracy but I really think that should be doable, and various members of the community have volunteered to help out with this as needed.
This "human readable summary" does a good job of simply conveying the key principles of the new license.

If you would like to hear a longer discussion of these topics, you can listen to this 45 minute podcast, a discussion between me and various other, more knowledgeable people, including several members of the License Working Group.

As I said in an email to one of the OpenStreetMap mailing lists, I think that the License Working Group has put in a huge amount of work on this effort, and I would like to sincerely thank them for that. As I said above, I think that this proposal would move OpenStreetMap onto a much stronger legal footing.

Finally, as I said at the end of the podcast and in an email to one of the lists, while the license is important, it is not the main aim of OpenStreetMap - that is to produce a great free and open map of the world. A lot of work has gone into the new license and big improvements have been made, but now is the time to vote and move on, and to let the community focus more of its energy on more important items like getting more people mapping, further improving the quality of the map data, and so on.

Wednesday, November 18, 2009

OpenStreetMap helps free Ordnance Survey data with suicide bombing mission

So as I talked about in my previous post, Ordnance Survey is going to make its small scale data freely available. I think that in many ways, OpenStreetMap has been a major influence in making this happen. The growth of OpenStreetMap has increased the awareness of the benefits of free geospatial data, and it was becoming apparent that there would no longer be a significant market for the Ordnance Survey to sell small scale map data, certainly not at the sort of high prices it has traditionally charged.

However, the fact that this is happening raises some major questions about the future of OpenStreetMap in the UK, and could even lead to its demise there. At the very least, it dramatically changes the nature of OpenStreetMap in the UK. People have different motivations for contributing to OpenStreetMap. Some do it just because they think it's fun, and they like mapping their local area. For many people there is motivation around the fact that they believe it's important to have freely available and open map data. Suddenly at a stroke, the second motivation is seriously diminished (in the UK), as this aim has been achieved if the Ordnance Survey makes a high quality and very complete dataset freely available. Now we don't know for sure yet what Ordnance Survey will release - it is possible that it could just make raster map data available (like Google does). But it seems likely to me that they will probably make the small scale vector data available too - there is certainly lots of demand for this.

We also don't know the licensing terms yet, but it seems likely that the Ordnance Survey data will be in the public domain. So ironically it will be more open than OpenStreetMap, whose current and new licenses are fairly "viral" - roughly speaking they say that if you enhance the data, you have to make those enhancements available on the same terms as the original data (i.e. the enhanced data has to be freely available). This more or less precludes having a commercial ecosystem of "value added" data providers on top of OpenStreetMap. And many commercial companies, like Google, have expressed concern about using OpenStreetMap because of licensing (even with the new license that should be rolled out soon). But potentially Google, Microsoft et al will be free to use the Ordnance Survey data with no constraints.

So where does this leave OpenStreetMap in the UK? It is interesting to compare the situation in the UK with the US. OpenStreetMap took off very quickly in the UK, driven in many ways by frustration with the Ordnance Survey and the lack of free map data. In the US it has taken off more slowly, and this is widely thought to be because there are more sources of free map data (albeit often poor quality ones, as I've discussed previously). There has also been a lot of spirited discussion recently on the OpenStreetMap mailing lists about the pros and cons of importing TIGER data as a starting point in the US. There is a strong contingent that argues that cleaning up existing data is less interesting and motivating than mapping something from scratch, and that this is why there is less interest in OpenStreetMap in the US than the UK. The counter-argument, which I support in general, is that we are much further along in the US with TIGER data than we would have been without it. But anyway, suddenly the UK finds itself in a similar situation to the US, but with a much higher quality free data source (assuming there are no licensing issues, which there won't be if the data is public domain, which is what I expect).

This raises a lot of practical issues in terms of data imports, which we have already faced (but not solved) with OpenStreetMap in the US. OpenStreetMap in the UK has a rich database already - according to Muki Haklay, it is about 65% complete in terms of geometry, and 25% complete if you consider attributes. Now you have a 100% complete high quality dataset that you could import, but how do you reconcile this with existing data? This is a complex problem to solve. And how about subsequent updates? Do you just do a one time import of OS data, and let the community do updates after that? Will people be motivated to do this, if the OS is updating its own dataset for free in parallel? Is there some way of using the OS data for items that they maintain, and having OpenStreetMap focus on more detailed items (benches, trash cans / bins, etc)?

The ideal world might be to have some sort of integration between OpenStreetMap and the Ordnance Survey. I have spoken often about the disruptive impact of crowdsourcing and how government agencies and commercial companies need to leverage the power of this approach to bring down the cost of creating and maintaining data. Now that Ordnance Survey will have reduced revenues and require increased subsidies from taxpayers, they will be under increasing pressure to cut costs. If there were a way to leverage the power of the thriving OpenStreetMap community in the UK, that could reduce costs quite significantly. There are challenges with doing this and it may just be wishful thinking ... but we can hope :).

So anyway, this move raises lots of questions about what OpenStreetMap will look like in the UK in future. If you regarded the mission of OpenStreetMap in the UK as being to create a free, open and high quality map of the UK, you can argue that the mission is completed (or will be in April), perhaps in a slightly unexpected and sudden fashion, like the falling of the Berlin Wall. Steve Coast quotes Gandhi on the OpenGeoData blog: "First they ignore you, then they laugh at you, then they fight you, then you win." The question is should we add "... and then you die"? (Or less drastically perhaps, retire, or have no reason to exist any more?)

There are some other aspects to OpenStreetMap of course, like I alluded to before - making more detailed maps of aspects of your neighborhood than the Ordnance Survey does for example. But working out how those other aspects can coexist alongside the new reality of free OS data is complex. And how many OpenStreetMappers will lose the incentive to participate in this new world, if there is an alternative source of good quality, free and open data? We live in interesting times in the geo world today - this is the second hugely disruptive announcement (following the Google earthquake) in a month or so!

I should just reiterate that of course all these specific questions apply to OpenStreetMap in the UK; they don't affect its aims and benefits in the rest of the world - except that a lot of energy for the global movement has come from the UK, so if that energy diminishes it could have some knock-on effect in the rest of the world. But I hope not!

This move by Ordnance Survey will also increase pressure on National Mapping Agencies in other countries to make more data freely available (where it isn't already).

Ordnance Survey free data: right decision, various wrong justifications cited

So yesterday the UK government announced that some data sets (not all) from the Ordnance Survey (the UK national mapping agency) will be made available for free - 1:10,000 scale data and above is included (so this includes popular OS maps like the 1:25,000 and 1:50,000, in digital form). The more detailed maps (1:1250 and 1:2500) are not included - but I believe that issues related to derived data will also be resolved, which will be useful in regard to those datasets too. Overall I think this is the right decision, in what is a much more complex issue than most people realize, as I discussed briefly in my recent georant on the topic. This approach makes available most of the data that is useful to the broader world, while minimizing the impact to Ordnance Survey's revenue, most of which comes from the large scale 1:2500 and 1:1250 map data (known as MasterMap). I was recently asked for my opinions by some consultants working for the UK government on the Ordnance Survey strategy, and this is the option I favored.

As I pointed out in my georant, making all Ordnance Survey data free would cost the UK taxpayer an extra 50 million pounds a year (a total of 100 million pounds a year). This approach should cost the taxpayer substantially less - though there will still be a cost, in the order of a few tens of millions a year. Nobody has said where spending will be cut to pay for this - but I personally think it will be money well spent in this scenario (Update: I have heard other estimates that the lost sales revenue may be more in the £5-10m range, but nobody seems to have a firm estimate yet - well, nobody knows exactly which products are involved, so that makes it harder. My number was just an order of magnitude guess).

The Guardian, though my favorite newspaper, continues to make several incorrect statements in support of this move. They say "the move will bring the UK into line with the free publication of maps that exists in the US". What maps are they talking about? Again as I talked about in my georant, there are two main sources of central government maps in the US, the USGS "national map" and the US Census Bureau TIGER data. Both of these have very limited funding (as they are paid for by the taxpayer and deemed low priority compared to other things), and their map products are not good enough quality to be used as base maps by local government or infrastructure companies like gas, electric and telecoms. As a result, all of these companies and agencies do their own base mapping, leading to huge inefficiencies with all cities being mapped multiple times in inconsistent fashion, so data from the electric utility often doesn't line up at all with data from the gas utility, for example. More detailed map data created by local government agencies (like parcel maps) has a huge array of different policies - some give it away free, some free for non-commercial use only, and some charge significant amounts for it. So please don't hold up the US as an example of what you want in a national mapping infrastructure, it's a real mess here I'm afraid! I really hope that the UK government will step up to the increased taxpayer funding that Ordnance Survey now needs to continue its work as one of the premier National Mapping Agencies in the world, and that funding for mapping won't be cut drastically as it has been in the US (where for example, USGS has gone from having ~2500 people working on the national map in the 1980s to two hundred and something today).

The other thing that annoys me is that the Guardian cites the so-called "Cambridge Report" (PDF), which in my opinion is a very questionable document anyway, in a totally incorrect way. They say that the report says "making all OS data free would cost the government £12m and bring a net gain of £156m". Firstly, that quote alone is very misleading, which is just one of the problems I have with the Cambridge Report, but I won't get into that here (but may return to it in a future post). However, the scenario studied in the Cambridge report was not "making all OS data free", it studied the option of giving away all the large scale data for free, and not giving away the small scale data - in other words the EXACT OPPOSITE of what is being proposed. So the specifics of the Cambridge Report in regard to the Ordnance Survey have ZERO RELEVANCE to the decision which has been announced (except to reinforce that there is some benefit to society in making free map data available, which is stating the obvious anyway). I am in favor of the decision as I said, but like to think of myself as a scientist and a scholar :), and it really annoys me when people blatantly misrepresent evidence to make their case.

So anyway, don't get me wrong - I think that this is a very good thing for the UK geospatial industry, and for the general principle of open data, which I am a strong supporter of, despite the fact that I will also point out the challenges with it when appropriate! I think that the right broad option has been chosen out of a complex array of possible choices. But there are risks with the decision too, including the potential for reduced funding and deterioration in quality of Ordnance Survey maps. And there are likely to be some big losers too - including NAVTEQ and Tele Atlas (again), and in many ways OpenStreetMap, which is the topic of my next post.

Thursday, November 5, 2009

Was the Google Maps data change a big mistake?

So the discussions about the great Google map data change in the US rage on, and we are seeing more and more reports of significant data quality issues. I wrote about how Central City Parkway was completely missing, and I reported this to Google to see how the change process would work. I posted later about how it had been partially fixed, with a new geometry visible but not routable, and with the wrong road name and classification. The latest state (on November 5, after reporting the error on October 9) is that it is now routable, but it still has the wrong road classification, being shown as a minor road rather than a major highway. This means that if you calculate the best route from Denver to Central City, Google gets it wrong and doesn't go on Central City Parkway, choosing some much smaller mountain roads instead, which take a lot longer. Microsoft Bing, Yahoo, MapQuest and CloudMade (using OpenStreetMap) all calculate the correct route using Central City Parkway. Another substantial error I found recently: if you search for one of the companies I work for, Enspiria Solutions (by name or address), the location returned was about a mile off. This has now been partially but not entirely fixed (after I reported it).

Steve Citron-Pousty recently wrote about whether Google made the data change too soon. He talked about how his wife has always used Google Maps, but it has got her lost four times in the past week and she has now switched to using MapQuest or Bing. And Google got Steve lost in the Bay Area last week too. Maitri says she is "splitting up with Google Maps" over issues in Ohio, as "there is no excuse for such shoddy mapping when MapQuest and Yahoo do exceptional work in this area the first time around". She links to an article in a Canton, Ohio, newspaper about how the town was mis-named after the recent data change (we call it Canton, Google calls it Colesville). James Fee pointed out an error with Google showing a lake that hadn't been there for 25 years or so. Matt Ball did a round-up discussion on the importance of trusted data. The well known tech journalist Walt Mossberg reviews the new Motorola Droid phone (which uses the new Google data for navigation), and in passing says, when reviewing the navigation application, "but it also gave me a couple of bad directions, such as sending me the wrong way at a fork in the road". And then in news which is presumably unrelated technically (being in the UK), there was a lot of coverage of a story about how Google Maps contained a completely fictitious town called Argleton - which, even though it's a separate issue, does not help the public perception of the reliability of Google Maps data.

Update: see quite a few more stories about data issues in the comments below.

So anyway, this is a long and maybe somewhat boring list, but I think that it is worth getting a feel for the number of stories that are appearing about map data errors. As anyone in the geo world knows, all maps have errors, and it's hard to do a really rigorous analysis on Google's current dataset versus others. But I think there is strong evidence that the new Google dataset in the US is a significant step down in quality from what they had before, and from what Microsoft, Yahoo and MapQuest have (via Tele Atlas or NAVTEQ).

Google clearly hopes to clean up the data fairly quickly by having users notify them of errors. But looking at the situation, I think that they may have a few challenges with this. One is just that the number of errors seems to be pretty large. But more importantly, I think the question for Google is whether consumers will be motivated to help them fix up the data, when there are plenty of good free alternatives available. If Google gives you the wrong answer once maybe you let it slide, and perhaps you notice the link to inform them of the problem and maybe fill it out. But if it happens a couple of times, is the average consumer likely to keep informing Google of errors, or just say "*&#% this, I'm switching to MapQuest/Bing/Yahoo"?

Google has made some reasonable progress with Google MapMaker (its crowdsourced system for letting people create their own map data) in certain parts of the world, but these are generally places where there are not good alternative maps already, or people may be unaware of alternatives like OpenStreetMap. So in those cases, people have a clearer motivation to contribute their time to making updates. People who contribute time to OpenStreetMap have a range of motivations, but in general for most of them it is important that the data is open and freely available, which is not the case with Google (at least not so much, I won't get into the details of that discussion here). Most if not all the people I know who contribute effort to OpenStreetMap (myself included) would not be inclined to contribute significant updates to Google (except for some experiments to see how good or bad the update process is).

Consumer confidence is a fickle thing, and you probably don't need too many stories in the newspapers of mishaps due to bad data, or more than a couple of direct experiences of getting lost yourself due to bad data, to switch to a different provider (especially when you are choosing between different free systems - you have a bit more incentive to stick with a navigation system and try to make it work if you've spent a few hundred dollars on it).

The risks are even higher with real time turn by turn directions - no matter how many caveats you put on it, you are likely to get some drivers who follow the directions from the system and don't notice relevant road signs. It would only take a couple of accidents caused by people driving the wrong way up one way streets because of bad data to damage consumer confidence even further.

So I think it will be very interesting over the next few months to see whether the data quality issues are bad enough to result in significant numbers of users moving away from Google Maps in the US or not - and whether Google will get significant uptake in terms of number of people contributing error reports in the US (beyond the initial wave of curiosity-driven updates just to test if the process works). Obviously the answer to the second question is likely to have a big influence on the first. Stay tuned ...

Monday, October 26, 2009

Talk on "The Geospatial Revolution" in Minnesota

Here is a video of my recent keynote talk at the Minnesota GIS/LIS conference in Duluth, which was an excellent event. There were about 500 people there, which is great in the current economic climate. It was mainly a "traditional GIS" audience, and I got a lot of good feedback on the talk which was nice.

I talk about current trends in the industry in three main areas: moving to the mainstream (at last!); a real time, multimedia view of the world; and crowdsourcing. There's a lot of the same material that I presented in my talk with the same title at AGI GeoCommunity (which doesn't have an online video), but this one also has additional content (~50 minutes versus 30 minutes).

Click through to vimeo for a larger video, and if you click on "HD" you will get the full high definition version!! I used a different approach to produce this video compared to previous presentation videos, using a separate camera and a different layout for combining the slides and video. I like the way this came out - I'll do a separate blog post soon with some tips on how to video presentations, I think.

The Geospatial Revolution (Minnesota) from Peter Batty on Vimeo.

You can also view the slides here:

Tuesday, October 13, 2009

More on the "Google data earthquake"

Following on from my previous post about Google shaking up the geospatial data industry, Steve Coast invited me and James Fee to join him for a discussion on the topic. James' blog post on the topic has 138 comments at the time of writing, which is a good indication of the interest in this change! You can listen to the podcast on the "Google data earthquake" here.

One topic I talk about in the call which I didn't cover in my previous post is where the Google street data comes from (they haven't said anything about this). To me it looks like a mixture of data they have captured from their StreetView cars, which seems to be good quality, and then probably TIGER data, which is much lower quality, in the areas their cars haven't driven. Since the change, quite a few people have reported finding errors in street data that weren't there previously. I found that Central City Parkway, a pretty major highway that was completed in 2004, was missing from the map entirely. You can see this below, with OpenStreetMap on the left and Google Maps on the right (screen shot using GeoFabrik's nice map compare tool):

Central City Parkway missing from Google Maps

I've reported the error to Google, so it will be interesting to see how quickly it gets fixed - and in general, how quickly they are able to fix up the apparently lower quality data in areas they haven't driven yet (though this assessment is not based on anything scientific).

Wednesday, October 7, 2009

Google shakes up the geospatial data industry

Well, the big news of the day is that Google has dumped Tele Atlas as the main data provider for Google Maps in the US, and is providing its own map data from a variety of sources (presumably also including its own Streetview teams). They've also added the ability to point out errors in the map, another addition to the crowdsourcing techniques they've been using. The announcement has caused a flurry of discussion of course. James raises questions about various aspects of the data (especially parcels). Steve speculates that the same thing will happen in Europe and that the beneficiary there will probably be AND.

The new Google data certainly adds details in some places, from a quick random sampling - for example check out Commons Park in downtown Denver using the nice GeoFabrik Map Compare tool. None of those paths were there previously in Google. Still not quite as good as OpenStreetMap in this case though :).

This does dramatically reshape the geospatial data industry though. Previously there were two commercial providers with a detailed routable database of roads in the US, NAVTEQ (owned by Nokia) and Tele Atlas (owned by TomTom), now at a stroke there is a third in Google. OpenStreetMap is a fourth provider of course, not quite up with the other three in terms of coverage and routing quality in the US yet, but getting there very quickly.

This raises lots of interesting questions:
  • Will Google sell its data to providers of third party navigation systems and compete with Tele Atlas and NAVTEQ? Or indeed will they sell/license it to others who could use it (users of GIS, etc)?
  • Will Google Maps on the iPhone (and other mobile devices) get real time turn by turn directions? This was previously prohibited by licensing terms from Tele Atlas and NAVTEQ. Existing real time navigation systems using data from these two providers generally cost in the region of $100. Will Google add this to the free maps offering? Or sell a version that does real time turn by turn directions?
  • Will Google contribute any of this data to open data initiatives like OpenStreetMap? Or make it available to USGS for the US National Map? In the past they have cited licensing constraints from their data providers as a reason for not being more open with their geospatial data; that reason largely goes away now (though we don't know all the new data providers and their terms). I'm not holding my breath on this one, but we can hope!
  • Will this negatively or positively impact OpenStreetMap? Previously in areas with active communities, OpenStreetMap had significantly more detail, and more current data, than Google - this appears to move Google forward in that regard. But will Google taking another step towards total world domination encourage more people to want an open alternative?
So anyway, definitely a very interesting development for the geospatial data industry (albeit one that has been on the cards for a little while). It will take a little while to understand the full implications. I'm sure Tele Atlas is glad they are no longer an independent company, I wouldn't like to have seen how their stock price would have dropped today otherwise :O !!

Update: some more discussion and a link to a podcast on this topic featuring Steve Coast, James Fee and me in this follow up post.

Thursday, July 16, 2009

OffMaps for iPhone review

On my recent trip to Amsterdam for the OpenStreetMap State of the Map conference, I made extensive use of the OffMaps mapping application for the iPhone - which incidentally uses OpenStreetMap data. The big advantage of OffMaps compared to the standard Google Maps application is that it can run offline, which is really important if you are going abroad, as data usage charges make it prohibitively expensive to use an online mapping application (you quickly get into hundreds of dollars worth of data charges!). The functionality is fairly simple - currently it's basically map display, without routing or search for points of interest (though these things are planned for the future, I heard from Felix Lamouroux, the main developer, who I met at the conference). But it makes good use of the GPS and the compass in the iPhone 3GS, which make it very intuitive and easy to use when walking around. Performance is very good since all the data is local. There is a simple interface for choosing the data to download and store locally (which you want to do ahead of time when you're at home, or connected via WiFi). You can run it in an online mode too, where it will dynamically fetch data over the network if it isn't already stored locally.
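For those curious how offline storage like this typically works for a slippy map, here's a rough sketch (based on the standard tile-numbering scheme documented on the OpenStreetMap wiki, not on OffMaps' actual code, which I haven't seen) of how you would work out which z/x/y tiles cover an area you want to cache before a trip. The tile server URL and the Amsterdam bounding box are just placeholder examples:

```typescript
// Standard OSM "slippy map" tile math: convert a lat/lon bounding box into the
// set of z/x/y tiles to fetch and cache for offline use. This is a general
// sketch of the scheme on the OSM wiki, not OffMaps' actual implementation.

function lonToTileX(lon: number, zoom: number): number {
  return Math.floor(((lon + 180) / 360) * Math.pow(2, zoom));
}

function latToTileY(lat: number, zoom: number): number {
  const latRad = (lat * Math.PI) / 180;
  return Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) *
      Math.pow(2, zoom)
  );
}

// Enumerate the tile URLs covering a bounding box at one zoom level.
// The tile server URL is a placeholder; a real app must respect the
// tile provider's usage policy (or render its own tiles).
function tilesForBBox(
  minLat: number, minLon: number, maxLat: number, maxLon: number, zoom: number
): string[] {
  const urls: string[] = [];
  const xMin = lonToTileX(minLon, zoom);
  const xMax = lonToTileX(maxLon, zoom);
  const yMin = latToTileY(maxLat, zoom); // note: tile y grows southwards
  const yMax = latToTileY(minLat, zoom);
  for (let x = xMin; x <= xMax; x++) {
    for (let y = yMin; y <= yMax; y++) {
      urls.push(`https://tile.openstreetmap.org/${zoom}/${x}/${y}.png`);
    }
  }
  return urls;
}

// Example: how many tiles cover central Amsterdam at zoom 15 (placeholder bounds).
console.log(tilesForBBox(52.355, 4.85, 52.39, 4.92, 15).length);
```

Multiply this across the zoom levels you care about and you can see why choosing the download area carefully matters - the tile count grows by roughly 4x with each extra zoom level.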

The OpenStreetMap maps for Amsterdam were very good quality and nice looking. This application illustrates one of the key advantages of OpenStreetMap over Google Maps and other commercial solutions, which is that Google licensing prohibits you from using the data offline. This fact, plus the availability of the Cloudmade iPhone Maps Library, is making OpenStreetMap data a popular option for mapping applications on the iPhone - just today, Dopplr announced a new iPhone application which also uses OpenStreetMap data (though strangely it uses different rendering from the other applications I've tried, and none of the freeways in Denver show up, though they do elsewhere - but I'm sure that will be fixed shortly).

So I would strongly recommend OffMaps, especially if you're going on an overseas trip - I am currently downloading data for Vancouver in preparation for my upcoming trip to GeoWeb, so if you're there and want to check it out please ask me and I'll show you!

Tuesday, July 14, 2009

My presentation on "Geodata creation: past, present and future" at State of the Map 2009 #sotm09

A video of my presentation at the recent OpenStreetMap State of the Map conference is now online at Vimeo. There's also a copy of the slides at SlideShare. I talk about the four major business models that have been tried in regard to creating geodata, and how they are all handicapped by the very high costs involved when you use the traditional approach which involves paying the people who create the data. And I discuss how crowdsourcing completely changes things by reducing this labor cost to zero. From everything I saw at State of the Map, I am convinced that use of crowdsourcing in general, and OpenStreetMap in particular, is going to massively grow over the next couple of years.

Geodata creation: past, present and future - Peter Batty (Spatial Networking) from State of the Map 2009 on Vimeo.

Other videos are being posted online at the sotm09 channel on Vimeo - I recommend you take a look. There are lots of great presentations, but one that I guarantee will make you smile is this 5 minute one on Mapping of Historical Sites in Japan - check it out :).

Monday, July 13, 2009

Quick report on OpenStreetMap State of the Map 2009 conference

This is just a quick initial report on the OpenStreetMap State of the Map conference, which has just finished in Amsterdam. First I'd say read Steven Feldman's summary, which I completely agree with. As I said in my keynote talk here (see the slides, video should be online at some point soon, and I'll summarize it in a future post), the industry has always been hampered by the very large cost of creating and maintaining geodata. Despite all the technological advances over the years, this remains a very labor-intensive process and therefore is just fundamentally expensive by traditional methods (where you pay the people doing the data capture).

Crowdsourcing changes the paradigm by having volunteers contribute their time, and having access to free data (without complex licensing constraints) completely changes the economics of developing geospatial applications. Obviously the first reaction of most traditional geospatial people is to ask whether you can get good enough data quality using this approach. Dr Muki Haklay presented a very thorough and rigorous analysis of the quality of OpenStreetMap data against Ordnance Survey data in the UK, and his conclusion was "OpenStreetMap quality is beyond good enough, it is a product that can be used for a wide range of activities" (not in general - today - for very large scale mapping, but for small or medium scale mapping - the type of applications that today might use data from Google or Microsoft, NAVTEQ or Tele Atlas).

The conference itself had a tremendous buzz about it, with 250 people from thirty-something countries (I think), and great enthusiasm and excitement from everyone participating. The presentations were a great mixture of people talking about really cool and innovative technical things, and heart-warming and moving stories about people creating maps from all parts of the world - in many parts of the developing world, OpenStreetMap is way ahead of any other data source.

Anyway, I'm off to be a tourist in Amsterdam for the day, I will write more in due course. If you are attending the ESRI user conference this week I really recommend you seek out the OpenStreetMap booth to find out more about what they are up to. And thanks to everyone who attended State of the Map for making it such a fun and inspiring event!

Monday, October 13, 2008

Statements that come back to haunt you

Just read this article in Forbes, which says:
"I laugh when I hear that you can make a map by community input alone," says Tele Atlas founder De Taeye. He says that if tens of thousands of users travel a road without complaining, then Tele Atlas can be fairly certain that its map of the road is correct.
The first statement is demonstrably false already (see my earlier post about Oxford University and OpenStreetMap for just one example). But then it's a little bizarre that he follows that up by saying that the reason they know that their data is correct is through (lack of) feedback from the community. I've done my share of interviews and it's quite possible that these two statements were taken out of context, but it's an odd juxtaposition. I'm sure the owners of Encyclopedia Britannica laughed at the notion that you could produce a comparable online encyclopedia by community input alone, but they aren't laughing any more and are even moving to accommodate community input, as are most of the main mapping data providers of course (including Tele Atlas).

Community input isn't the answer to all data creation and maintenance problems, but it provides an excellent solution in a number of areas already, and the extent of its applicability will increase rapidly.

Wednesday, October 8, 2008

OpenStreetMap mapping party in Denver

Just a quick note to say that there will be a mapping party in Denver the weekend of October 18-19, following on from the first one we had back in July. Richard Weait from Cloudmade will be here to coordinate things, and I'll be hosting it at my loft in downtown Denver, as I did last time. I think we'll be starting at 1pm each day, and migrating to the Wynkoop Brewing Company downstairs at around 5pm.

If you haven't been to a mapping party before (or if you have!) I encourage you to come along. Last time we had a decent turnout, about 5 or 6 people each day, and we made some good progress on mapping downtown Denver.

OpenStreetMap in downtown Denver

The screen shot above shows some details that you won't find in Google or Microsoft maps of Denver, including various footpaths and the light rail line. Click here for a live link. I hope we will finish up a pretty complete map of the downtown area at this level of detail over that weekend. And of course if you have other interests like cycle paths, etc, you are free to work on whatever you like! If you have any questions let me know. More information will be posted at Upcoming shortly.