The GeoWeb 2009 conference was very good as usual. The presentations I attended were a bit of a mixed bag - some excellent, a few so-so - but the networking and hallway conversations were great. James has done a good writeup. I'll just comment on a few things that I thought were interesting.
One trend was increasing acknowledgment that the traditional approach to SDI (Spatial Data Infrastructure) using OGC standards is really not working well in comparison to newer "geoweb" type mechanisms for sharing data that have come from the "neogeography" developments of recent years (I won't get back into the label debate just now!). This was most striking for me in the presentation by Clemens Portele, who is a highly respected member of the OGC community - he won the Gardels Award in 2007. Clemens gave an excellent and very frank talk on "The gap between SDIs and the GeoWeb". A couple of quotes that I noted were "So far the impact of SDIs on the integration of data as a ubiquitous component of the web seems low" and "There is not evidence that SDIs have increased the market volume of government data by a significant amount". He noted that OGC web services are based largely on an architecture and approach to web services developed around 10 years ago, well before many recent web innovations.
Another interesting comment was on metadata - he observed that there is much more interest in user-centric and user-provided metadata (did others use the data? Who? What worked, what didn't?) than in "traditional" metadata. This is not a new debate, but it was interesting to see someone who has invested so much in OGC, and who is so well respected in OGC circles, taking a very "neo" view of the world (for lack of a better short label!). My friend Carl Reed, CTO of OGC, was moderating the session and commented at the end that OGC was aware of the issues with not having a simpler API and was looking at them, but that the solutions were complex.
Andrew Turner gave an excellent talk on geoweb standards, which you can see at geogeektv (kudos to Dave Bouwman for broadcasting a number of presentations - we really should see a lot more live transmission and recording of presentations; all you need is a cheap webcam and an Internet connection). There were some interesting dynamics, as the talk was introduced by Ron Lake, the father of GML, and Andrew discussed his views on the shortcomings of GML pretty frankly! Andrew observed that we are lacking a "middle ground" in web standards between the simple (GeoRSS, KML) and the complex (GML, Shape).
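To make that contrast concrete, here is a small sketch of my own (not from Andrew's talk) that writes the same point in GeoRSS "simple" and in the GeoRSS GML profile that Ron mentions below - the namespaces are the real ones, the coordinates are made up:

```python
# Sketch: the same point encoded as GeoRSS "simple" and as the GeoRSS GML
# profile. The namespaces are real; the coordinates are made up.
import xml.etree.ElementTree as ET

GEORSS = "http://www.georss.org/georss"
GML = "http://www.opengis.net/gml"

def simple_point(lat, lon):
    # GeoRSS simple: a single element with "lat lon" as its text
    el = ET.Element("{%s}point" % GEORSS)
    el.text = "%s %s" % (lat, lon)
    return el

def gml_point(lat, lon):
    # GeoRSS GML profile: the same point wrapped in georss:where / gml:Point
    where = ET.Element("{%s}where" % GEORSS)
    point = ET.SubElement(where, "{%s}Point" % GML)
    pos = ET.SubElement(point, "{%s}pos" % GML)
    pos.text = "%s %s" % (lat, lon)
    return where

for el in (simple_point(49.25, -123.1), gml_point(49.25, -123.1)):
    print(ET.tostring(el).decode())
```

Even at this tiny scale you can see the tradeoff: the simple form is one element, while the GML form buys you explicit geometry types (and, in full GML, things like coordinate reference systems) at the cost of more nesting.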
Update: Ron responded via Twitter (as I said recently, hardly anyone seems to comment on blogs any more!). This is a summary of his points (in bullets of 140 characters or less :)
- Good review. I also thought that there was a growing realization of limits/roles of things like KML, GeoRSS outside mapping.
- I think GML already fills that middle ground although I agree that the spec is daunting!
- It also does not reflect the fact that one can profile GML (take subsets) - can be small like lo-pidf proposed in ipv6!
- Or a bit larger like geoRSS GML - which is only slightly more complex than geoRSS "simple"
- That is one of the things that struck me also - the range of views on what constitutes the "web" stood out very clearly (this was in response to my comment that one of the interesting things about GeoWeb is bringing together a range of perspectives)
- Many versions of the web to consider. The web of hyperlinked documents. The web of database access. The web of real time collaboration ..
Thanks for the comments Ron, definitely an interesting debate!
I think we have seen in this area and others that on the web especially (and probably in computing / life in general!), simple is good. We have repeatedly seen greater success in the technology space with an approach that says: do something simple and useful that covers maybe 80% of what you eventually want, with 20% of the effort, get it out there and being used, and iterate from there - versus an approach that tries to tackle all the complex cases up front, takes much longer to implement, and ends up being complex to use, which is not uncommon in a consensus-based standards process. OpenStreetMap is another example of the "keep it simple" philosophy that has been very successful.
Another general trend was a lot of discussion of how the "geoweb" is not just about running geospatial applications on the web, but about making applications and geospatial data fit in with key web concepts like search and linking. Jason Birch gave a good presentation on the work he is doing at Nanaimo, which is discussed on his blog and by James. To see an example, do a Google search for 2323 Rosstown Rd. A tweet I liked from Kevin Wiebe said "If a dataset available on the web is in a format that can't be indexed by Google, does it make a sound?". Clearly this approach to making geospatial data searchable is hugely useful, and there are places where making data linkable is very useful too. I think there is a question about how far the linking paradigm can go with geospatial data though - Mano Marks from Google has an interesting discussion on this (and other related topics). Another potential item for a future post!
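Going back to the Nanaimo example, here is a minimal sketch of the general idea (my own toy version, not the city's actual implementation - the parcel record, coordinates and file layout are all made up): publish one plain HTML page per feature, so that an ordinary crawler can find and index each address.

```python
# Sketch: generate one plain HTML page per feature so a crawler can index it.
# This is a toy, not the City of Nanaimo's actual code; the parcel record,
# coordinates and file layout below are all made up.
import os

features = [
    {"address": "2323 Rosstown Rd", "lat": 49.19, "lon": -123.99,
     "zoning": "Residential (illustrative value)"},
]

PAGE = """<html>
<head><title>%(address)s</title></head>
<body>
  <h1>%(address)s</h1>
  <p>Zoning: %(zoning)s</p>
  <p>Location: %(lat)s, %(lon)s</p>
</body>
</html>"""

os.makedirs("parcels", exist_ok=True)
for f in features:
    slug = f["address"].lower().replace(" ", "-")
    with open(os.path.join("parcels", slug + ".html"), "w") as out:
        out.write(PAGE % f)
```

Once pages like this exist at stable URLs, Google can index them like any other web page - no special geospatial plumbing required - and each feature becomes linkable too.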
In many ways the "geoweb" is not a good concept (as Sean Gillies and others have observed), except as a very broad label for geospatial activity on the web (let's not get back into terminology discussions again!). There isn't a separate "geoweb"; there's one web which handles many types of data, one of which is data that has a location / geo element to it. We don't talk about the "numeric web" or the "varchar web". Alex Miller from ESRI included in his presentation a picture of the web with lots of "GIS servers", and I always have an uncomfortable feeling about that. It is often said that 80-85% of data has a location component. Does that mean that 80-85% of servers on the Internet (or in your enterprise) should be "GIS servers"? No, of course not. But any general data server should be able to support location data types. Of course there may be some servers that are especially focused on certain kinds of geospatial processing, and maybe "GIS server" is a reasonable term for those. But I think it's important to understand that the great majority of location data we use will not be coming from those specialized "GIS servers".
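As a toy illustration of that last point (with made-up data), an ordinary database - no "GIS server" in sight - can store and query location perfectly well for many everyday cases:

```python
# Sketch: an ordinary database (no "GIS server") storing and querying
# location as plain lat/lon columns. The table and data are made up.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stores (name TEXT, lat REAL, lon REAL)")
db.executemany("INSERT INTO stores VALUES (?, ?, ?)", [
    ("Downtown Denver", 39.7392, -104.9903),
    ("Boulder", 40.0150, -105.2705),
])

# a simple bounding-box query around central Denver
rows = db.execute(
    "SELECT name FROM stores"
    " WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
    (39.6, 39.8, -105.1, -104.9),
).fetchall()
print(rows)  # [('Downtown Denver',)]
```

Obviously this doesn't replace real spatial indexing or analysis, but it makes the point that location is just another data type that any general-purpose data server ought to handle.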
Michael Jones of Google gave a very animated, entertaining and thought-provoking talk as always, which you can see (cut into 10-minute chunks) on youtube (hint to GeoWeb organizers: use Vimeo, where you aren't constrained to 10-minute videos!). His major themes were the pace of change in our world, something which was a common theme from other invited speakers too, and user-created data. He said "the future is user created data" and talked a lot about Google Mapmaker. He also made a big deal about the fact that Google has an internal organization called the "Data Liberation Front", whose mission is to ensure that any data you put into Google, you can get out - see this piece here. I thought it was curious that he made such a big deal of this, since one of the primary differences between Google Mapmaker and OpenStreetMap is that you can't get the raw data (vector data with attributes) that you have created out of Google Mapmaker, but you can get it back from OpenStreetMap. Google Mapmaker just lets you display a standard Google Map created from your data, but doesn't let you get the data itself. Ed Parsons argues that getting the underlying data is "an edge case". I would agree that most users don't want to access this data directly - but I think that most users would want the result of their efforts to be more broadly available. I haven't yet used any of the raw data I've contributed to OpenStreetMap directly, but I like the fact that a whole range of third party applications can use that data for innovative purposes - things they couldn't do if they just had access to something like the Google Maps API.
I asked Michael about this at the end of his presentation, and also about his thoughts on OpenStreetMap in general (see video). He said that he didn't think it was likely that the policy on making map data available would change (it is available to non-profits but nobody else) - though he did say "perhaps for non-commercial use", so maybe there is a glimmer of hope there :). In regard to OpenStreetMap, he said that Google had had conversations with them for a year or so but couldn't resolve some legal issues, the key problem being that contributors of data retained the rights to that data, which opens up the possibility of Google being sued by contributors if it uses that data. He made the interesting comment that "if they [OpenStreetMap] had their legal act in order we would not ever have done anything on our own" (i.e. Google Mapmaker). OpenStreetMap is in the process of changing its licensing in a way which will resolve this specific issue, so it will be interesting to see whether this re-opens the possibility of collaboration (which some people in the OSM community would probably like, and others wouldn't) - though there are other licensing complexities which may be obstacles too, and it may be that the two of them are too far down separate paths now.
One other interesting snippet mentioned by Michael was that Google Maps has had 470 versions in 4 years! There is a new one every Tuesday, and additional ones if necessary to fix a problem. That certainly gives a whole new perspective on agile development. As he said, it's hard for companies who do a software release once every year or two to keep up with innovation these days.
Chris Helms (a fellow Denverite) from the National Renewable Energy Laboratory (NREL) gave an interesting talk on an application he has developed called IMBY (In My Back Yard), which lets you look at the potential effect of installing a solar photovoltaic (PV) array or wind turbine at your home or business.
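To give a flavor of the kind of calculation behind a tool like this, here's a back-of-envelope sketch - this is emphatically not NREL's actual model, and the derate factor and sun-hours figure are illustrative assumptions:

```python
# Sketch: a back-of-envelope annual PV output estimate, the general flavor
# of calculation behind tools like IMBY. This is NOT NREL's actual model;
# the derate factor and sun-hours figure are illustrative assumptions.

def annual_pv_output_kwh(system_kw, peak_sun_hours_per_day, derate=0.77):
    # system_kw: DC rating of the array
    # peak_sun_hours_per_day: average local solar resource
    # derate: combined losses (inverter, wiring, temperature, soiling, ...)
    return system_kw * peak_sun_hours_per_day * 365 * derate

# e.g. a 4 kW rooftop array with ~5.5 peak sun hours per day
print(round(annual_pv_output_kwh(4.0, 5.5)))  # -> 6183 (kWh/year)
```

Presumably the real application layers location-specific solar resource data and economics on top of this sort of core estimate.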
Sean Gorman gave a good overview of his efforts at FortiusOne to demystify and democratize geospatial analysis, and empower end users who are not "GIS professionals". I very much agree with his philosophy - as he says, you don't need to do a training course in statistics to use a spreadsheet, or have a qualification in graphic design to use PowerPoint.
Dale Lutz of Safe Software talked about the new opportunities they see in 3D data and BIM (Building Information Modeling) - I think this is a great new opportunity for Safe, with much greater complexity than traditional (largely 2D) geospatial data. The growth in the number of formats supported by FME is always an interesting chart in Dale and Don's presentations - it has now reached 238 and has grown linearly since they started, and Dale said that if anything it is now trending slightly above linear (getting into 3D data may well contribute to that). He also said in passing that his mother can handle a 3D PDF, which I thought might explain something about Dale ;)
Just as at Where 2.0, one of the coolest demos of the show was the automatically generated 3D models done by C3 - check out some of the videos on their site.
I think that had better do - I have other things to get to. Apologies to those who gave other interesting presentations that I didn't mention! And if you made it this far, I should mention that I haven't forgotten my "top 10" influential people list - that will be coming soon. Quite a few of them were at GeoWeb :). Thanks to the organizers, including Ron Lake, Galdos, and GITA, for another excellent event.