Thursday, June 28, 2007

Thoughts on GE's next generation system based on Oracle - part 2

This is a continuation of my previous post about GE (General Electric)'s next generation system based on Oracle - please read that before reading this if you haven't already done so. At the end of part 1, I said that in part 2 I would talk about the real reason why this product could be a significant jump forward for the utility industry, which really hasn't been highlighted in the GE announcements or in any of the commentary I've seen - and also why this same factor could be the reason that the product fails. And last but not least, I'll talk about some of the challenges which GE faces in positioning the new product with regard to the existing Smallworld products. So here we go ...

I think that the factor that could make this product a significant jump forward is that, as I understand it from contacts at GE, they are really trying to produce an off-the-shelf product which can be configured rather than customized (the distinction being that configuration just involves setting parameters that control how the application behaves, whereas customization involves writing code). Now GE didn't really talk about this in their announcement, and this information comes from informal conversations, so it's possible this emphasis may not be as strong as I have inferred - but either way it is interesting to talk about the pros and cons of this approach.
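To make the distinction concrete, here is a minimal hypothetical sketch - none of these names come from GE's product, they are invented for illustration. Configuration treats behavior as data that vendor-shipped code interprets; customization means the utility writes its own code, which the vendor then has to carry through every upgrade.

```python
# Hypothetical illustration of configuration vs. customization.
# All names here are invented; nothing is taken from GE's product.

# Configuration: behavior is driven by data that vendor code interprets.
config = {
    "voltage_units": "kV",
    "snap_tolerance_m": 0.5,
    "require_phase_on_conductors": True,
}

def validate_conductor(feature, config):
    """Vendor-shipped logic; the utility only tunes the parameters."""
    if config["require_phase_on_conductors"] and not feature.get("phase"):
        return False
    return True

# Customization: the utility writes new code, e.g. wrapping or replacing
# vendor behavior - which the vendor must now support across upgrades.
def validate_conductor_custom(feature, config):
    """Utility-written extension with a rule the off-the-shelf
    product never anticipated."""
    if not validate_conductor(feature, config):
        return False
    return feature.get("owner") == "ACME Electric"
```

The support burden follows directly: a thousand customers with different `config` dictionaries are one product, while a thousand customers with different `validate_conductor_custom` functions are, in effect, a thousand products.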

Historically in the utility industry, the implementation of geospatial systems has involved buying a software product as a starting point (with GE, Intergraph and ESRI being the three primary vendors in the space), and then doing some customization (data modeling, further software development, etc.) on top of that core product to meet each individual utility's specific requirements. This approach enables each utility to get a system which closely meets its specific requirements, but has drawbacks: apart from the additional cost of the initial customization, having many different custom systems makes support and upgrade procedures harder for vendor and user alike. All the major vendors have generally had a "starter system" or "template" for the customization, which reduces the cost of the initial implementation, but in general (in my experience) has not really helped in terms of simplifying ongoing support and upgrade issues, and thereby reducing ongoing cost of ownership. In the early 2000s, when Intergraph launched its G/Technology product, the initial intent was that it would be a completely off-the-shelf system, allowing some configuration but not customization. While many utilities liked the principle, nearly all wanted additional functionality in their systems (and of course different utilities typically wanted different additional functionality). So Intergraph ended up having to rearchitect their initial approach to allow the system to be customized in a more flexible way, which was not a trivial undertaking and probably cost them a few years in terms of getting a competitive version of G/Technology to market.

So it will be interesting to see if GE really does pursue the angle that this will be an "off-the-shelf solution". It is hard to be wishy-washy about this - if you say that you think it will meet many customer needs off the shelf but that you can still customize it if you want, then you really don't address the ongoing cost of ownership issues associated with the complexity of supporting and upgrading all these different systems in the field (unless you put very strict constraints on the customization, and really constrain yourself not to modify APIs from one release to the next). If GE does take a truly off-the-shelf approach then this would differentiate the product in the market IF it is functionally rich enough - but the risk if the product is not functionally rich enough is that you won't be able to win business, and your entry to the market will be set back until either you allow customization (which makes you less differentiated) or you add sufficient functionality to be competitive - either of which may take a significant amount of time.

This leads on to the challenges that GE faces in terms of positioning the new product versus the current Smallworld product. Now GE is specifically trying not to position this as a replacement for Smallworld, and is saying that it will continue to develop new functionality on both platforms. It's fine to say that, but obviously the challenge with this approach is that if you develop all functionality on two different platforms which don't support common code, you can only develop half as much new functionality with the same number of developers (well, maybe a bit more than half as much, since some design work could be common across both platforms, but certainly a lot less than you could do if you just focused on a single platform). So that really doesn't seem like a sustainable approach over a long period of time, unless GE is prepared to substantially increase the size of its development team, which I imagine would be hard to justify. GE is initially focusing the new product specifically on North American mid-size electric utilities, so migrating to the new product will not yet be an option for the fairly large majority of customers who fall outside that category.

For those customers who are in a position to consider migration (the North American mid-size electric utilities - and presumably the segments addressed will expand over time), GE will face the classic challenges of any company going through a major technology upgrade (and which Intergraph went through - and is still going through - with the migration from its older FRAMME product to G/Technology, and ESRI from Arc/Info 7 to ArcGIS). One challenge is that with a product like Smallworld that is 16 years old and exceptionally customizable, most customers have very rich functionality which it is hard to replace with a product that has only been under development for a year or two. There will either be custom development necessary to replace custom functionality in the old system, or the customer will need to be persuaded to give up some existing functionality to get other benefits that can be obtained from the new architecture. This generally means that the migration from the old system to the new system is a large enough project that most organizations will take the opportunity to evaluate other systems on the market and decide whether to stay with the same vendor or switch to a new one. There has been little turnover in the utility market in recent years, so on the rare occasions that utilities do choose a new system, all the vendors are very hungry for those opportunities and pricing tends to be very competitive, especially since the three major systems are not highly differentiated these days. As I said, none of this part is unique to GE, it is the same situation that ESRI and Intergraph have gone through as they have been migrating their customers to newer technologies.

One other challenge GE may have is with the customers who are not yet addressed by the new product. It will presumably take several years before the new product is an option for all of them (I believe that GE is saying that they will have a beta out sometime this year, then a first release first half of next year, and unless it's different from all other software products it will probably need a second release before it's really ready for serious use, so suppose that arrives late 2008 or early 2009, then probably the product is not going to be expanding to substantial additional market segments until 2009 or 2010). Now by and large I think that the Smallworld customer base is still pretty happy, so maybe customers will be willing to continue, assuming that GE continues to invest in Smallworld as it says it will. But there is also the risk that some customers will decide that this all means that the writing is on the wall for the Smallworld products, even if they keep going for a few more years, so maybe they should just go out and look at Intergraph and ESRI, who are running on more mainstream architectures which are in production today.

One other challenge for GE as they try to address moving their larger customers to the new platform at some point in the future (assuming they do) will be how to provide the same level of scalability that Smallworld VMDS does - perhaps that's a topic for a future discussion.

So anyway, it will be interesting to see how all this pans out over the next few years. I wish the GE guys good luck with it - as I said before, the industry can use additional innovation and competition!

5 comments:

Paul Ramsey said...

This is really cool stuff. Am I right to assume that there is not any greenfield market left at this point? That GE basically has to either upgrade existing clients or steal clients from some competing vendor (who competes with GE in the utility space?).

So, in some ways this is a new market, it's the market of customers who think that "classic" SmallWorld doesn't cut it anymore, and need "something" new. Which means the vendor lock is broken, and everything is back on the table. Does this put GE head-to-head with ESRI, Intergraph and others for the upgrade business?

It is interesting to note that when Autodesk's legacy customer base for MapGuide approached this tipping point, Autodesk staunched a potentially gaping chest wound (of customers moving to ESRI or open source) by creating a new product that met the needs, but within the open source space. Legacy customers were still willing to pay support for the new product (though not a capital upgrade cost) so they weren't lost as a revenue stream, and customers who otherwise wouldn't give MapGuide a second glance are taking it for a serious trial run because of the open source aspect.

Think GE could go that way?

Roberto Falco said...

I completely agree with you that the success-or-failure balance turns on the "off-the-shelf" versus customizable dilemma. Having gone through that as part of the Intergraph community some years ago when G/Tech arrived, and now being part of the GE Smallworld community, this seems to me somewhat like deja vu …

The SAP example should be looked at carefully. No one has more power as a software vendor to impose things on customers than SAP, and they have never abandoned the customization path, no matter what their marketing message and implementation evangelism say.

Business software - and I'm assuming GIS is a piece of business software for any utility, I hope readers will agree with me :) - needs to comply with business processes, and those are subject to human creativity. No matter how flexible a piece of business software is in terms of configuration, I doubt that flexibility will meet everyone's needs.

Hope GE understands that.

randy george said...

Interesting.
I would think moving off a proprietary DB to Oracle Geospatial makes the customization potential broader, at least in the sense that there is a broader pool of talent and a wider scope of tools available. The concern as a utility would be how to migrate my previous custom tier into the new model.

Oracle Spatial is absorbing more and more functionality into the core database, which must be scary for GE, Intergraph and ESRI as they retreat into the middle tier.

In the meantime the evolving rich client browser world will continue to chip away at the client side leaving the middle core smaller and smaller.

Jonathan Hartley said...

Hey. Nice article Peter, really interesting stuff.

My interest and excitement was tinged with a frisson of trepidation though, at the description of the configurable GIS behaviour. I know there's a whole phalanx of very smart engineers at GE, who are no doubt immeasurably more cognisant of the following issues than myself. But, as is my wont, I'm not going to let that get in the way of an opportunity for a jolly good rant, so here goes.

Obviously data-driven behaviour is brilliant. However, it's only good up to a point. Once the behaviour in question becomes complex enough (and a GIS definitely qualifies), there's a real risk of the Inner Platform effect: in an effort to replicate the flexibility and expressiveness of the programming languages it is designed to replace, the configuration system ends up replicating all their features - badly.

Such an endeavour is caught between two stools. If it is insufficiently ambitious, the configuration will not be flexible enough to meet clients' needs. If it does manage to capture the power of the programming languages it replaces, then it is Turing complete, and all you have done is convert customisation using a standard, proven, well-known programming language into configuration using a ghastly proprietary language that is embedded within your configuration schema.
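A hypothetical sketch of how the Inner Platform effect creeps in (every name here is invented for illustration): a configuration schema sprouts conditions and operators until the vendor has, without ever deciding to, shipped an ad-hoc interpreter for a poorly specified language.

```python
# Invented example: a "configuration" schema that has quietly become a
# programming language the vendor must now parse, document and debug.
rule = {
    "if": {"op": "and", "args": [
        {"op": "eq", "field": "type", "value": "transformer"},
        {"op": "gt", "field": "kva", "value": 500},
    ]},
    "then": {"set": {"inspection_cycle_months": 6}},
}

def eval_cond(cond, feature):
    """The interpreter the vendor never meant to write."""
    op = cond["op"]
    if op == "and":
        return all(eval_cond(c, feature) for c in cond["args"])
    if op == "eq":
        return feature.get(cond["field"]) == cond["value"]
    if op == "gt":
        return feature.get(cond["field"], 0) > cond["value"]
    raise ValueError(f"unknown op {op!r}")  # the 'language' keeps growing

def apply_rule(rule, feature):
    """Mutates the feature if the rule's condition holds."""
    if eval_cond(rule["if"], feature):
        feature.update(rule["then"]["set"])
    return feature
```

Every new customer requirement adds another `op`, and before long this dictionary syntax needs loops, variables, error handling and a debugger - all of which any established language already has.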

In addition, creating such a configuration actually ends up being much harder for clients. It requires deep proprietary skills, as opposed to common skills like C#, and it cannot lean on any of the supportive ecosystem of tools and knowledge that an established language has built up over the years. Perhaps worst of all, it will *still* require software engineering skills to perform the configuration, and all the good software engineers will have run a mile.

Like I say, no doubt there are smart people at GE who have been figuring out solutions to all this for years by now, but I'd have to hear what those solutions were before I'd trust such a system.

To my way of thinking, the best solution to the problem is to acknowledge that the best way of specifying behaviour is to use a programming language - that is exactly what they were designed for. Trying to sidestep that is simply swimming against the current, and you need to embrace it. You want a domain specific language (DSL), so that it is concise and intuitive for clients, but you don't want it to be a proprietary language - that way lies madness. So what you need to provide is a library that extends an existing, established language, making it into your GIS DSL. Creating DSLs from Python or Ruby is all the rage these days, and I believe the above chain of reasoning is why.
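A rough sketch of the internal-DSL approach the comment describes - all class and function names here are hypothetical, not any vendor's actual API. The "configuration" is ordinary Python, so clients keep their editors, debuggers and test tools, but the vocabulary is domain-specific:

```python
# Hypothetical internal DSL for GIS validation rules, embedded in Python.
# All names are invented for illustration.

class Rule:
    def __init__(self, name, predicate):
        self.name = name
        self.predicate = predicate

    def check(self, feature):
        return self.predicate(feature)

_rules = []

def rule(name):
    """Decorator: registers a plain Python function as a domain rule."""
    def register(fn):
        _rules.append(Rule(name, fn))
        return fn
    return register

# Client-side "configuration" is just Python with domain vocabulary:
@rule("transformers need a rating")
def transformer_rating(feature):
    return feature["type"] != "transformer" or "kva" in feature

@rule("conductors need a phase")
def conductor_phase(feature):
    return feature["type"] != "conductor" or bool(feature.get("phase"))

def violations(feature):
    """Names of all registered rules the feature fails."""
    return [r.name for r in _rules if not r.check(feature)]
```

The point is that a rule is an ordinary function: a full language is available when the client needs it, yet simple cases read almost like configuration.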

Nixta said...

Some very interesting comments from some very varied directions. And of course, Pete, a great article.

But this is the same problem that we've all faced for a long time, and that many of us have been involved in developing failed solutions to.

Oracle App Server is a fairly successful example, I guess, of this kind of move with many services provided by the Oracle suite of things, but as Mr. Hartley says, you end up with proprietary hardships incurred anyway, and a bunch of highly paid contractors. Oracle have progressively been elbowing in the same kind of model with Spatial, and from your article that approach is clearly working in business terms (after all, it's committed GE to building atop it).

But there's a proprietary element here that seems impossible to escape from. Sure, SQL is a basis for an interface to a database, but will the same configuration of an Oracle Spatial solution drive a PostGIS solution or, say, some other spatial DB solution (wink wink)?
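The portability problem is easy to see with a simplified concrete example: the same "find poles within 100 m of a point" question looks quite different in Oracle Spatial and PostGIS, so any configuration that embeds one dialect won't drive the other. A small Python sketch - the table and column names (`poles`, `geom`) are invented, though the spatial operators themselves are real:

```python
# The same spatial question in two SQL dialects. Table and column names
# are invented for illustration; the operators are the real ones.

def within_distance_oracle(lon, lat, meters):
    # Oracle Spatial: the SDO_WITHIN_DISTANCE operator
    return (
        "SELECT id FROM poles WHERE SDO_WITHIN_DISTANCE(geom, "
        f"SDO_GEOMETRY(2001, 4326, SDO_POINT_TYPE({lon}, {lat}, NULL), "
        f"NULL, NULL), 'distance={meters} unit=m') = 'TRUE'"
    )

def within_distance_postgis(lon, lat, meters):
    # PostGIS: ST_DWithin over geography, for a distance in meters
    return (
        "SELECT id FROM poles WHERE ST_DWithin(geom::geography, "
        f"ST_SetSRID(ST_MakePoint({lon}, {lat}), 4326)::geography, "
        f"{meters})"
    )
```

Nothing about the first string helps you produce the second; a product whose configuration captures queries like these is locked to one back end unless the vendor builds an abstraction layer on top.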

ESRI did a good, if heavy-handed job of providing a half-way decent development environment over Smallworld's when they chose Windows COM to develop a huge library of components (even if they're flawed, closed and unreliable). I loved developing in Magik (as did Charlie Savage, but we're geeks), but any workman needs decent tools (even if they're not allowed to blame them), and Emacs doesn't compare to a rich IDE. Of course, ESRI has other concerns that counter that free IDE integration by (in my mind unforgivably and somewhat despicably, given their costs) leaving customers with often insurmountable closed-box problems rather than the open sandpit that Smallworld gave them to dig around in (I still believe a little that this is symptomatic of an American solutions approach vs Smallworld's European one, but that's a different story). So near and yet so far.

I think a clear solution is to develop open interface/configuration standards and open source plugins on top. I suppose like a standard-driven beans approach: "Here's your application platform. It's also, btw, a spatial database with defined services. Now here are our modules on top to cover everything you need - you just need to configure them - but if you need to hook into them or replace them, you'll find a standard over there to cover each of those things, and boy do we adhere to it." I think Oracle's push puts us a bit of the way there, but it's still proprietary. You want to be able to take your modules framework and just drop it in on any underlying platform. You can choose one that's more scalable (wrap VMDS to provide the platform) or cheaper (push PostGIS as the platform) or suitable to your IT team (use Oracle).

I hate to say it, but it hasn't worked before, and I feel somewhat sceptical about it working now. With the best intentions in the world, such an approach rarely appeals to those holding the purse-strings - it's too much of a risk to huge margins and license revenues - compromises will be made, openness will be clipped... And with a company like GE (from an outsider's commie perspective), long-term commitment to the best intentions isn't even an idealist's dream - the framework of a company like that can't even handle such an abstract notion.

Why can't we all just get along?