As some of you will know, the FOSS4G 2011 conference is coming up in Denver and I am the conference chair. I have only been to one previous FOSS4G, which was in Victoria in Canada in 2007. That event had a profound impact on my perspective on the geospatial industry, and on the software platforms I've chosen to implement geospatial applications on since then. And it has saved my companies a lot of money! So I wanted to share some perspectives about my experience at FOSS4G and what I've learned about open source software, to explain why you should come to FOSS4G, especially if you've never been before.
Back in 2007 I had just left my job as CTO of Intergraph and was weighing up what to do next, and looking at ideas for a geospatial startup company. I'd spent 20 years in the geospatial industry working with closed source products, and knew very little about open source geospatial products. Towards the end of my time at Intergraph, I'd been getting quite a few questions from sales guys, in Canada in particular, about the fact that their customers were showing interest in open source software, which was free, and how should they sell against that? So I'd done a bit of research and had become interested in particular in PostGIS, the open source spatial database, as a possible platform for applications I was looking at in my new (yet to be created) startup, Spatial Networking. The fact that it was free was obviously attractive to a new startup owner, especially as I was looking at a system that (I hoped) would need to be deployed on many servers to cope with large numbers of users.
Open source software tends not to have such flashy marketing material as closed source software, so after a bit of digging around online and not finding all the information I was after, I got in touch with Paul Ramsey, who is one of the main people behind PostGIS. He suggested I should come up to FOSS4G to find out more. I did and was really impressed by the whole experience, both the event in general and what I found out about PostGIS - you can read my writeup at the time here. As I said in that writeup, there was much more energy and buzz than I had seen at other geospatial conferences I was used to attending.
I went ahead and used PostGIS at Spatial Networking and on other projects, including my current project Ubisense myWorld. I continue to be very impressed with PostGIS - it does all the core things you expect a spatial database to do (I had a lot of previous experience with Oracle Spatial and other systems), and it's FREE! I have never hit a bug during my time using it.
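To give a flavor of what this looks like in practice, here is a minimal sketch of the kind of spatial query PostGIS makes easy, built as plain SQL strings so any PostgreSQL driver can run them. The table and column names (`places`, `geom`) are hypothetical, not from any of my actual projects; `ST_DWithin` and EWKT point literals are standard PostGIS.

```python
# Build a "find everything within N metres of a point" query for PostGIS.
# Table and column names here are made up for illustration.

def point_ewkt(lon, lat, srid=4326):
    """An EWKT point literal that PostGIS can parse directly."""
    return f"SRID={srid};POINT({lon} {lat})"

def within_metres_sql(metres):
    """Parameterised query: rows within `metres` of a point.

    Casting both sides to geography makes ST_DWithin measure in
    metres rather than degrees."""
    return (
        "SELECT name FROM places "
        "WHERE ST_DWithin(geom::geography, %s::geography, "
        f"{float(metres)})"
    )

# With a driver such as psycopg2 you would run something like:
#   cur.execute(within_metres_sql(500), (point_ewkt(-104.99, 39.74),))
```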
So what have I learned about open source geospatial software in those four years? First of all, let me say that I have no strong predisposition to open or closed source development approaches per se. I am happy to choose either open or closed source products depending on what I need in a given situation. What I do have is a strong predisposition to FREE. Obviously a product needs to meet your requirements, but assuming it does, then free is rather attractive compared to having to pay for something. This is especially true in a cloud environment, where you may scale up to running many servers, and traditional per-server licensing costs can really hurt you financially.
People in the closed source world often raise concerns about support in the open source world. My experience hasn't borne out this concern. With PostGIS, I've never needed support - it just works. With MapFish, another open source product we're using for myWorld, we needed a few enhancements. Some were addressed by the community within a month or two; others we were able to do ourselves, as we had access to the product source code. I very much doubt that we could have got enhancements made in a mature closed source product in that timeframe. There are also more and more options in the open source geospatial world to pay people to do enhancements or fixes for you. It's a bit dangerous to make generalizations: you can get good or poor support with different closed source products, and likewise with different open source products. But my experience with the open source products we've chosen has been very good.
I think that after price, perhaps the aspect of open source that I value most is longevity and predictability. Many times during my career, I have seen projects suffer because a vendor has decided to stop development of a product (or feature). I have also seen dramatic changes in terms of service or costs of online services. Google App Engine is one example of the latter - many people put significant effort into developing applications that were running for free, then Google changed the pricing model and people found themselves facing large costs they hadn't planned on. With an open source product that has a strong community behind it, there is much more long term stability. You know it's not going to go away tomorrow. Even if some developers leave, others are there to cover for them. And in the worst case you have access to the source code so could continue to maintain it yourself (though that's a very unlikely scenario as long as the community of developers has a certain critical mass). I have been moving more components of myWorld towards open source because of this predictability.
So anyway, if you are still paying for geospatial software you owe it to yourself and your company to come to FOSS4G and find out what all this open source software is about. There is a parallel universe out there with software products that have great capabilities and are FREE! It's also worth saying that in general there is good interoperability between open and closed source systems, so it's not an all or nothing proposition. In general open source web and database products are very strong, and they may well be able to complement your investment in existing applications.
This is the first time that FOSS4G has been in North America for 4 years, and it is not likely to be here again for another 3, so this is a rare opportunity if you are based here to meet a wide range of people developing and using these products. So I hope to see you in Denver in September at FOSS4G!
To finish up, check out this video featuring members of the organizing team talking about why you should be there:
Sunday, August 7, 2011
Monday, March 28, 2011
Pete Warden's Data Science Toolkit offers cool geo capabilities
I just had an interesting chat with Pete Warden, a fellow Brit who was living in Boulder for a while and is now out in San Francisco, and who has worked on various interesting development projects including quite a bit of geo stuff. He is most famous for his cool map of Facebook users, which led to Facebook threatening to sue him :( !!

He has just launched a new project called the Data Science Toolkit at the GigaOM Structure Big Data Conference.
It's an open source project and contains a variety of tools for analyzing data, including several geospatial ones. Everything is nicely packaged up as an Amazon AMI, including some large databases, so you can just fire up one or more Amazon machines and use the functionality (and you can do some basic testing here).
- Geocoding (currently US only) - this uses the open source geocoder developed by geoIQ using TIGER data. The nice thing about this is that there are no transaction limits or restrictive terms associated with it. And you can run it offline if you like. Lack of good geocoding is currently a weakness of OpenStreetMap, so this is a nice complement to that. It found my home address very accurately - not an extensive test but a good start :). While this is all open source, it takes a good bit of effort to download the full TIGER database and get this all set up, so having a packaged version is a good thing.
- Reverse geocoding - takes a point and gives you information about where it is - country, city, district, etc. Again this is not unique, but has the same advantages as the geocoding functionality in terms of setup and lack of restrictions.
- GeoDict is a tool that emulates Yahoo's Placemaker and pulls location data out of unstructured English text - its API is identical to Placemaker.
- IP address to location
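As a sketch of how you might call these services from Python: the `/street2coordinates` endpoint path below follows the DSTK API as published, but treat the exact paths and response fields as assumptions to verify against your own running instance.

```python
# Minimal client sketch for a Data Science Toolkit instance. Only the
# URL building is pure; geocode() performs a real HTTP call.
import json
import urllib.parse
import urllib.request

def dstk_url(host, endpoint, value):
    """Build a DSTK request URL such as http://host/street2coordinates/..."""
    return f"http://{host}/{endpoint}/{urllib.parse.quote(value)}"

def geocode(host, address):
    """Geocode a US street address against a DSTK instance (network call)."""
    with urllib.request.urlopen(dstk_url(host, "street2coordinates", address)) as resp:
        return json.load(resp)

# Against your own Amazon AMI you might run something like:
#   geocode("ec2-xx.compute.amazonaws.com", "1 Main St, Boulder CO")
```

Because the whole stack runs on your own machine, there are no per-transaction limits on calls like this - which is the point of the packaging.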
Monday, February 28, 2011
New open source server options for Ubisense myWorld
We have been busy working away on various aspects of Ubisense myWorld. One of the biggest enhancements is behind the scenes, with support for new server options, so that we can run in the cloud or in house.
Up to this point we’ve been working with Arc2Earth, which runs on top of Google App Engine, and both these platforms have worked very well for us, and were a great way of getting an initial system up and running quickly. We see a lot of benefits to running in the cloud, as I’ve talked about on several occasions.
However, a number of our customers, including large utilities and telecom companies, have said that they really like what we’re doing with myWorld, but they would be more comfortable with a solution where the server can run in house. So to support this we have added a new server architecture based on the open source products MapFish and PostGIS. As many of you will know, PostGIS is a very robust spatial database, built on top of PostgreSQL. I have used this on a few projects including whereyougonnabe and have always been very impressed with its functionality and performance. MapFish provides services using data from PostGIS (or from other spatial data sources, including Oracle Spatial, MySQL and Spatialite), using a very similar REST API to that used by Arc2Earth, so that made the migration straightforward and means we can support both server options with a largely common set of code. We’re just using server side components of MapFish, not its client side components (though we might consider using those in the future).
This new server code can run on various operating systems, including Linux and Windows (and Mac!). Customers can run the server in house, while we can now offer services using many different cloud infrastructure providers. We’re currently using Amazon, which has been working well, but it’s good to have alternatives available. We've continued to be pleased with PostGIS, and MapFish too based on our experience so far.
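As an illustration of the kind of REST query involved, here is a rough sketch of building a bounding-box request against a MapFish-style layer endpoint. The layer name is made up, and the `box`/`limit` parameter names follow the MapFish protocol as I understand it, so check them against the version you deploy; responses come back as GeoJSON.

```python
# Build a MapFish-protocol style GET URL asking a layer for features
# inside a bounding box. The layer name is hypothetical.
import urllib.parse

def bbox_query_url(base_url, layer, minx, miny, maxx, maxy, limit=100):
    """URL for up to `limit` features of `layer` within the given bbox."""
    params = urllib.parse.urlencode({
        "box": f"{minx},{miny},{maxx},{maxy}",
        "limit": limit,
    })
    return f"{base_url.rstrip('/')}/{layer}?{params}"

# e.g. bbox_query_url("http://server/myworld", "assets",
#                     -105.1, 39.6, -104.8, 39.9)
```

Keeping the client talking to a simple REST interface like this is what lets the same front-end code work against either server option.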
Stay tuned for more news on other cool new functionality on the front end of myWorld coming soon!
Thursday, January 27, 2011
Geospatial in the cloud
As mentioned previously, earlier in the week Brian Timoney, Chris Helm and I did a set of presentations and demos on geospatial technology in the cloud for the Boulder Denver Geospatial Technologists group. We were aiming to give a quick taste of a variety of interesting geo-things currently happening in the cloud, so we did it as six slots of about ten minutes each; apart from my introductory opening slot, these were all demos:
- Peter: Why the cloud?
- Brian: Google Fusion Tables
- Chris: the OpenGeo stack on Amazon (PostGIS, GeoServer, OpenLayers, etc)
- Peter: Ubisense myWorld and Arc2Earth
- Chris: GeoCommons
- Peter: OpenStreetMap
We got a lot of good feedback on the session. Here's the video (for best quality click through to vimeo and watch in HD):
Geo in the cloud from Peter Batty on Vimeo.
Here are links to the demos we used, or related sites:
- Utah Oil Wells in Fusion Tables (Brian)
- Denver Elections map in Fusion Tables (Brian)
- Centennial address lookup (Brian)
- FOSS4G 2010 attendance map in Fusion Tables (Peter)
- FOSS4G 2010 attendance map in GeoCommons (Peter)
- How to run an OpenGeo Amazon instance (Chris) - blog post from OpenGeo, our demo instance is not running permanently
- Video demo of Ubisense myWorld (Peter) - public demo not currently available
- GeoCommons (Chris)
- OpenStreetMap (Peter)
And finally, here are my slides on slideshare:
Geo in the cloud
Labels:
cloud,
GeoCommons,
geospatial,
google,
open source,
openstreetmap,
presentation
Thursday, October 21, 2010
Triple geo-conference goodness coming to Denver!!
Denver has always been known as a center for geospatial activity, and we have a great triple bill of events lined up, one in the near future and two back to back in September 2011.
The one coming up is WhereCamp5280 on November 19th. Eric Wolf, Ben Tuttle and I ran the inaugural one last year, which was a great success - see James Fee's review. I hear a rumor that James will be back this year, so I guess he must have liked it! Eric and I have both been a bit swamped with other things recently, so Steve Coast has kindly taken up the organizing reins this year - thanks to Steve for that! Last year we were kindly hosted for free by Denver University (DU); this year we will be at the University of Colorado Denver on their Auraria Campus, which has the advantage of being within easy walking distance of downtown. And this year we've decided to do one day rather than two. But two things haven't changed since last year: the event is FREE, and we'll be holding the social event on Friday evening at my loft. I expect there will be plenty of geo-beer from the Wynkoop Brewing Company downstairs, and that may fuel some geo-karaoke later on. All this is thanks to our kind sponsors, who at the time of writing include Enspiria Solutions, ESRI, Google, MapQuest and Waze.

I'm expecting a great group of interesting attendees and presentations again this year, so highly encourage you to come along. And remember it's an unconference, so we are looking for as many people as possible to participate - prepare a short presentation or come prepared to lead a discussion on a topic that interests you!
Sign up for WhereCamp5280 here, and if you feel like sponsoring at anywhere from $16 to $1024 (can you tell that a techie geek set the sponsorship amounts?!) that would be great, but otherwise just sign up and enjoy the great free education, networking, and beer :).
So WhereCamp5280 is a great local event, but in September 2011 the global geo community will be converging on Denver for a fantastic double bill of FOSS4G and SotM.
For those who don't know, FOSS4G stands for Free and Open Source Software for Geospatial and is an annual international gathering organized by OSGeo. The last North American event was in 2007 in Victoria, BC, and since then it's been in Cape Town, Sydney and Barcelona, so we're delighted to have Denver join that list, and expecting a great turnout from around the world.
Eric Wolf and I led the bid to bring FOSS4G to Denver (which is one of the things we were busy on that was competing for time with WhereCamp5280). Eric was originally slated to be the conference chair, but unfortunately due to circumstances beyond his control he has had to stand down from that, and I have just taken over that role in the last week (well unless the OSGeo board fails to approve the change at their next meeting, but I'm assured that's not very likely!). I'd like to publicly thank Eric for all the work he did to bring the conference here - it was his idea initially, and definitely wouldn't have happened without all his efforts. We have the core of a great local organizing group set up already, but are still interested in recruiting a couple more folks, so if you'd like to help out please let me know. It's going to be a great event, and I'll be blogging plenty more about it over the coming months.
And on top of that it was announced today that Denver has also been selected to host State of the Map (SotM), the global OpenStreetMap conference, also in September 2011. I attended SotM in Amsterdam in 2009 and thought it was a fantastic event. Unfortunately I wasn't able to make it this year, but I will definitely be there next year :) ! The two events are distinct, but several people were involved in both bids, and we recognized that a lot of people would be interested in attending both, so the intent is for them to run back to back. The SotM date isn't fixed yet, but FOSS4G is locked in for September 12-16.
So if you're in the Denver area already, plan to be at WhereCamp5280 on Nov 19, and if you're not, make plans to be here in September 2011!
Labels:
conference,
foss4g,
geospatial,
mapping,
open source,
openstreetmap
Monday, October 26, 2009
Talk on "The Geospatial Revolution" in Minnesota
Here is a video of my recent keynote talk at the Minnesota GIS/LIS conference in Duluth, which was an excellent event. There were about 500 people there, which is great in the current economic climate. It was mainly a "traditional GIS" audience, and I got a lot of good feedback on the talk which was nice.
I talk about current trends in the industry in three main areas: moving to the mainstream (at last!); a real time, multimedia view of the world; and crowdsourcing. There's a lot of the same material that I presented in my talk with the same title at AGI GeoCommunity (which doesn't have an online video), but this one also has additional content (~50 minutes versus 30 minutes).
Click through to vimeo for a larger video, and if you click on "HD" you will get the full high definition version!! I used a different approach to produce this video compared to previous presentation videos, using a separate camera and a different layout for combining the slides and video. I like the way this came out - I'll do a separate blog post soon with some tips on how to video presentations, I think.
The Geospatial Revolution (Minnesota) from Peter Batty on Vimeo.
You can also view the slides here:
Labels:
conference,
ESRI,
geospatial,
GIS,
google,
neogeography,
open source,
openstreetmap,
presentation
Monday, January 26, 2009
Great turnout for "open source geospatial for managers" event in Denver
Last Friday I went to a FRUGOS (Front Range Users of Geospatial Open Source) event organized by Brian Timoney in Denver, entitled "FRUGOS for Managers". It ran for a couple of hours on a Friday afternoon, and around 50 people showed up, which I thought was a great turnout. Brian includes a list of organizations who attended in his summary of the event - they included a good assortment of local government organizations in the area. Previous FRUGOS events had been more targeted at techies (like many open source events) - this was the first to consciously target a less technical audience. The turnout, and feedback, certainly suggested that there is a strong demand for this type of event.
The main focus of the event was on a couple of case studies using PostGIS, MapServer and GeoServer. The most interesting presentation for me was from Matt Krusemark of DRCOG, the Denver Regional Council of Governments, which exists to foster regional cooperation between county and municipal governments in the Denver metropolitan area. They have a geospatial group focused on collecting data from member organizations and sharing it, and they will be launching a new web site very soon using PostGIS, GeoServer and OpenLayers with a Google basemap. This is especially notable as they use ESRI software on the desktop, as most US government organizations do. I think this "hybrid" approach of using closed source solutions on the desktop and open source solutions (and/or solutions from Microsoft or Google) for web mapping will become increasingly common. DRCOG also intends to make all this data available to the public for free (gratis). You'll be able to find this via DRCOG's GIS page when it goes live.
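For anyone curious what the web-facing half of such a hybrid stack looks like, here is a sketch of fetching vector features from GeoServer over WFS. The workspace and layer names below are hypothetical, not DRCOG's actual layers; the query parameters themselves are standard OGC WFS 1.0.0.

```python
# Build a standard WFS 1.0.0 GetFeature request URL for a GeoServer layer.
import urllib.parse

def wfs_getfeature_url(geoserver_base, type_name, max_features=50):
    """URL requesting up to `max_features` features of `type_name`."""
    params = urllib.parse.urlencode({
        "service": "WFS",
        "version": "1.0.0",
        "request": "GetFeature",
        "typeName": type_name,
        "maxFeatures": max_features,
    })
    return f"{geoserver_base.rstrip('/')}/wfs?{params}"

# e.g. wfs_getfeature_url("http://host/geoserver", "drcog:parcels")
# An OpenLayers client issues essentially the same request under the hood.
```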
Congratulations to Brian on a great event - I look forward to more of these.
Tuesday, December 4, 2007
No data creation in neogeography - errr????
I found this post at All Points Blog rather bizarre, on how "neogeography is not GIS". It quotes Mike Hickey of Pitney Bowes (the company formerly known as MapInfo) saying that "there is no data creation in neogeography", when perhaps the most notable trend in the industry at the moment is how crowd-sourced or community generated data is radically changing the way we create and maintain data. I discussed one example in my previous post about OpenStreetMap. There was an interesting link in the comments on that post from my former Smallworld colleague Phil Rodgers on the Cambridge Cycling Campaign's route planner, which also uses community generated data to provide a level of detail not available from any commercial data providers. In doing some further reading on OpenStreetMap I also came across this interesting comparison of their data versus Google's in the small town of Haywards Heath in England, via an interview on the ZXV blog. And of course there are hosts of other sites generating many different types of geospatial data via community input. The post also said that there is "no spatial analysis" in neogeography, when again there are many interesting developments in this area outside the traditional GIS space - for example what FortiusOne is doing with GeoCommons, and companies like BP implementing increasingly sophisticated applications with Virtual Earth.
King Canute trying to turn back the tide
I'm afraid this comes across to me as another rather poor attempt by old school GIS guys to justify their continued existence in a rapidly changing geospatial world. Absolutely there will continue to be specialized analytical applications which require specialized software and skills, but the new generation of geospatial software systems will continue to eat into applications which were previously the domain of the traditional GIS companies at a rapid rate, and making blatantly incorrect assertions about "neogeography" isn't going to change that trend.
Labels:
data,
geospatial,
google,
Microsoft,
open source
Monday, December 3, 2007
Oxford University using OpenStreetMap data
I came across this interesting post from Nick Black saying that Oxford University (my former hangout) is now using data from OpenStreetMap on its web site for detailed maps, as its data is better than Google Maps for Oxford. Since I know the city well I thought I'd check it out. Here are a couple of sample screen shots around my old college, Balliol.
Here's a screen shot from the OpenStreetMap version (for live version click here and zoom in):
Note all the footpaths and alleyways, of which Oxford has a lot. On the west side it includes a footpath along the canal and on the east side it shows an all important very narrow alleyway off New College Lane which leads to a great old pub called the Turf Tavern, which is easily overlooked. It also correctly shows that Broad Street is no longer a through street, which is a relatively recent change (some time in the last few years, not sure exactly when). None of these details are shown on the following Google map (for live version click here):

OpenStreetMap is probably something that people are less aware of in North America than in Europe - for those not familiar with it, the following description from their site sums it up pretty well:
OpenStreetMap is a free editable map of the whole world. It is made by people like you.
OpenStreetMap allows you to view, edit and use geographical data in a collaborative way from anywhere on Earth.
So it's essentially a "crowdsourcing" approach to geospatial data collection. There was a lot of interest in OpenStreetMap at the FOSS4G conference this year. I had been impressed with everything I had seen about the project, but I have to confess that I had been thinking of it mainly as a "cheap and cheerful" (cheap=free) alternative to other more expensive but higher quality data sources. It is interesting to see that it has already moved past that in some locations (though obviously not all) to where it is more comprehensive and more up to date than data from commercial sources - this is just a taste of things to come in this regard, I am sure.
Wednesday, September 26, 2007
Review of FOSS4G
I've been up at the FOSS4G conference in Victoria this week, which has just finished (apart from the related "code sprint" tomorrow). It was really an excellent conference - congratulations to conference chair Paul Ramsey and the rest of the organizing committee for putting on a great event. The quality of the sessions I went to was consistently high, and there was a real energy and buzz around the whole event (much more than at most of the more established geospatial conferences I have been to recently). Adena Schutzberg said in her closing comments that her overall impression of the conference and the open source geospatial community was one of maturity (and she will expand on that theme at Directions magazine next week). The event reaffirmed the belief I had before coming here that the role of open source software in the geospatial industry will continue to grow quickly. And personally I enjoyed learning a lot of new things and meeting a new crowd of people (as well as quite a few old friends).
One specific aim for me in coming here was to learn more about PostGIS, which I regarded beforehand as the front runner for the database technology to use for my new company Spatial Networking. Paul Ramsey, a busy man this week who gave a number of very good presentations, presented an interesting set of PostGIS case studies. These included IGN, the French national mapping agency, who maintain a database of 100 million geographic features with frequent updates from 1800 users, and a fleet management company which stores GPS readings from 100 vehicles 10 times a minute for 8 hours a day - 480,000 records a day. In a separate presentation, Alejandro Chumaceiro of SIGIS in Venezuela talked about a similar fleet management application with very high update volumes. Interestingly, they use partitioned tables and create a new partition every hour! Incidentally, I talked with Alejandro afterwards and it turns out that he worked for IBM on their GFIS product from 1986 to 1991, and knew me from those days - it's a small world in the geospatial community :). Kevin Neufeld from Refractions Research also gave a lot of useful hints about partitioning and other performance-related topics. Brian Timoney talked about the work he has done using both Google Earth and Virtual Earth on the front end, with PostGIS on the back end doing a variety of spatial queries and reports, including capabilities like buffering, in a way which is very easy to use for people with no specialized knowledge. And Tyler Erickson of Michigan Tech Research Institute talked about some interesting spatio-temporal analysis of environmental data he is doing using PostGIS, GeoServer and Google Earth. Overall I was very impressed with the capabilities and scalability of PostGIS, and was reassured that this is the right approach for us to use at Spatial Networking.
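For those curious what the hourly partitioning approach looks like in practice, here is a minimal sketch of how a scheduled job might generate the DDL for one hourly partition, using the inheritance-plus-CHECK-constraint mechanism PostgreSQL offered in that era. The table and column names (`gps_readings`, `recorded_at`) are my own invention for illustration, not details from the presentation:

```python
from datetime import datetime, timedelta

def hourly_partition_ddl(parent: str, hour_start: datetime) -> str:
    """Generate DDL for one hourly child partition of a GPS readings
    table, using PostgreSQL's inheritance-based partitioning with a
    CHECK constraint so the planner can exclude irrelevant partitions.
    All names here are illustrative."""
    hour_end = hour_start + timedelta(hours=1)
    child = f"{parent}_{hour_start:%Y%m%d_%H}"
    return (
        f"CREATE TABLE {child} (\n"
        f"    CHECK (recorded_at >= '{hour_start:%Y-%m-%d %H:%M:%S}'\n"
        f"       AND recorded_at <  '{hour_end:%Y-%m-%d %H:%M:%S}')\n"
        f") INHERITS ({parent});"
    )

# A cron-style job would run something like this once an hour:
print(hourly_partition_ddl("gps_readings", datetime(2007, 9, 26, 14)))
```

The payoff of this scheme for very high insert volumes is that old data can be dropped by removing a whole child table rather than running an expensive bulk DELETE.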
Another topic which featured in several sessions I attended was that of data. As Schuyler Erle said in introducing a session about the OSGeo Public Geodata Committee, a key difference between geospatial software and other open source initiatives is that the software is no use without data, so finding ways to create, maintain, and share freely available geodata, and to help users find it, is also an important element of OSGeo's work, in addition to software. Nick Black, a fellow Brit, gave a good talk about OpenStreetMap, which is getting a lot of interest. The scope of what they are doing is broader than I had realized, including not just streets but points of interest (pubs are apparently especially well represented!) and address information which can be used for geocoding, and they are working on building up a US database based on TIGER data. The ubiquitous Geoff Zeiss, a man without whom no GIS conference would be complete :), gave an interesting review of the wide variety of government policies with regard to geospatial data around the world. One curious snippet from that was that in Malaysia and some other Asian countries, you need to have an application reviewed by the police and the army before being able to receive a government-issued map! In the opening session, I enjoyed the talk by Andrew Turner of Mapufacture on Neogeography Data Collection, which was a great overview of the wide range of approaches being used for "community generated" data, including things like cheap aerial photography using remote-control drones from Pict'Earth - they have a nice example of data captured at Burning Man. This was one of a number of five minute lightning talks, which went over pretty well - several people told me that they enjoyed the format. I also gave one of those, on the topic of the past, present and future of the geospatial industry, and managed to fit into the allotted time - though next time I might choose a slightly more focused topic :)!
I will write up my talk in a separate post at some point (it will take a lot longer than 5 minutes to do that though!). Ed McNierny of Topozone had the most intriguing title, "No one wants 3/8 inch drill bits" - the punchline was that what they actually want is 3/8 inch holes, and we should focus on the results that our users need, not the specific tools they use to achieve them. Schuyler Erle gave one of the more entertaining presentations on 7-dimensional matrices that I have seen (and I say that as a mathematician).
Also in the opening session, Damian Conway gave a good talk entitled "geek eye for the suit guy", on how to sell "suits" on the benefits of open source software. Roughly half his arguments applied to geospatial software, and half were more specific to Linux - Adena has done a more detailed writeup.
Brady Forrest of O'Reilly Media gave an interesting presentation on "Trends of the Geo Web". His three main themes were "Maps, Maps Everywhere", "The Web as the GeoIndex", and "Crowdsourced Data Collection". One interesting site he mentioned that I hadn't come across before was Walk Score, which ranks your home based on how "walkable" the neighborhood is (my loft in downtown Denver rated an excellent 94 out of 100). It seems as though every time I see a presentation like this I discover some interesting new sites, and now I listen slightly nervously hoping that I don't discover someone doing what we plan to do with Spatial Networking, but so far that hasn't happened!
I also was on the closing panel for the conference, which I thought went well - we had a pretty lively discussion. The closing session also included a preview of next year's conference which will be in Cape Town, South Africa. I had the pleasure of spending a few days in Cape Town in 2002, followed by a safari in Botswana which still ranks as the best of the many trips I've done to different parts of the world (check out my pictures). So I certainly hope to make it to the conference, and highly recommend that others try to make it down there and spend some additional time in that part of the world too.
Apologies to those I missed out of this somewhat rambling account, but the Sticky Wicket pub is calling, so I will wrap it up here, for now at least.
Tuesday, August 28, 2007
Upcoming gigs
Just a quick note to say that I am going to be speaking at a couple of upcoming conferences. The first is GIS in the Rockies, in Denver on September 12, where they have a pretty strong line-up of the usual suspects for their "all-star panel":
Peter Batty, Former Chief Technology Officer of Intergraph Corporation
Joseph K. Berry, Keck Scholar in Geosciences, University of Denver and Principal, Berry & Associates
Jack Dangermond, President, Environmental Systems Research Institute (ESRI)
William Gail, PhD, Director, Strategic Development - Microsoft Virtual Earth
Geoff Zeiss, Director of Technology, Infrastructure Solutions Division, Autodesk, Inc.
Andy Zetlan, National Director of Utility Industry Solutions, Oracle Corporation
GIS in the Rockies has always been one of the strongest regional GIS conferences, and I think they are anticipating 650+ people. There's a "social mixer" right after the panel, which hopefully most of the panelists will attend, at the Wynkoop Brewing Company in Denver, which I happen to live above, so I will definitely be there!
The second event is FOSS4G, in Victoria, Canada, where I'm giving a 5 minute lightning talk in the opening session on "The past, present and future of the geospatial industry" - I felt like a bit of a challenge :)! I'm looking forward to that actually, having never done a 5 minute presentation before - it will be interesting to figure out what to say. And I'm also going to be on the closing panel, with Tim Bowden of Mapforge Geospatial, Mark Sondheim of BC Integrated Land Management Bureau, and Frank Warmerdam, President of the Open Source Geospatial Foundation (OSGeo). It will be my first FOSS4G and I'm looking forward to meeting a different crowd and getting some new perspectives on things.
Tuesday, July 3, 2007
Follow up discussion on GE next generation system (about customization and open source)
My two part post last week on General Electric's next generation system based on Oracle (part 1 and part 2) generated some interesting follow up comments and questions (mainly after part 2). I got somewhat sidetracked with the whole iPhone extravaganza over the past few days, but wanted to circle back and follow up on a couple of threads.
The first thread related to customization versus configuration, and the potential attractions of being able to provide an "off the shelf" system, which is just configured and not customized, for a specific application area - in this case design and asset management for mid-size electric utilities in North America. If you can achieve this, then potentially you can significantly reduce implementation costs and (in particular) ongoing support and upgrade costs. However, the challenge lies in whether you can meet customer needs well enough in this type of complex application area with configuration alone. People have tried to do this multiple times in the utility industry, but so far nobody has been successful, and everyone has fallen back to an approach which requires customization. Both Roberto Falco and Jonathan Hartley were skeptical that a pure configuration approach could be successful, and Jonathan makes a good argument in his response that if you implement a very complex configuration system, then you may end up just recreating a customization environment which is less flexible and less standard than if you had just allowed appropriate customization in the first place. I don't disagree with either of them in general - though I would make the comment to Jon that I don't think we're talking about "GIS" in general, but a specific application of it to electric utilities in a relatively narrow market segment, which gives you a little more chance of defining the problem specifically enough that a configurable system could make sense. One other observation I would make is that cost is an element of this equation.
If an "off the shelf" system meets 90% of an organization's stated requirements and costs 80% of the amount of a fully customized system which meets 100% of their requirements, that is much less compelling than if the off the shelf system cost (say) 20% of the fully customized system, in which case they might be more willing to make a few workarounds or changes to business processes to accommodate such a system. This may be stating the obvious, but I wonder whether organizations trying to develop this "off the shelf" approach go into it thinking that they will have to offer a substantially lower price (or provide other compelling benefits) to persuade customers to adopt it. Finally on this thread, I think it is also worth observing that the "GIS" (if you call it that) in a utility is not a standalone system - it typically requires some sort of integration or interfaces with many other systems, such as customer information, work management, outage management, workforce management, and so on. This makes it an even bigger challenge to develop something which does not require customization.
Paul Ramsey raised a different question, that of open source, which I think is interesting in this context. But first I will just answer a couple of other questions that Paul raised. He asked if this was essentially a replacement market, and the answer is yes - especially among the medium and large utilities, and in the more established markets (North America, Western Europe, Japan, Australia / New Zealand, etc), pretty well everyone has a system implemented, and GE/Smallworld, Intergraph and ESRI are the dominant vendors. Because of the customized nature of these systems, and the amount of integration with other business systems, switching to a new vendor is typically a multi-million dollar project, even if the new software is provided for free. So this is obviously a big reason why this is a "sticky" market and customers don't change systems very often. The other clarification is that Paul asks about defining the market of customers "who think that 'classic' Smallworld doesn't cut it anymore", and overall I wouldn't particularly categorize things that way. I think that satisfaction levels are probably as high among the Smallworld customer base as with any of the other vendors, and the system is still functionally very competitive; the main concern is probably in obtaining and retaining people with the appropriate specialized skills, especially for smaller organizations. But there has been very little movement of utilities from Smallworld to other companies up to this point. Ironically, GE bringing out a next generation system (which they had to do for new business) is something which may cause existing customers to re-evaluate their strategy. Again though, this is just a standard challenge of moving to a new generation architecture for any software company - you're damned if you do and damned if you don't.
Anyway, back to open source - Paul raises the question of whether GE doing something with open source may be an option. I have heard a few people raise this before in regard to Smallworld, and it is an interesting question. There are three major elements to the Smallworld core software. The first is the Magik programming language, which was developed by Smallworld back in the late eighties and is very similar to Python and Ruby. Smallworld was way ahead of its time in recognizing the productivity benefits of this type of language, and this was a key reason for its success. The core Magik virtual machine has been stable for a long time, the original developers of it left Smallworld a number of years ago, and I suspect that nothing much has changed at this level of the system recently. The second key element is VMDS, the Version Managed Datastore. There is some very complex and low-level code in both of these components which I suspect would make it hard to open source them effectively, given the amount of time it would take people to get into the details of these components (currently known to only a very few people either inside GE, or who have left GE), and the relatively small size of the market for the products would probably be a disincentive for people to make the effort to learn this. However, both these components are very stable and probably don't really need much maintenance or change. The third element is the application software itself: the vast majority of the system is written in Magik, and the bulk of the Magik source code for Smallworld GIS has always been shipped with the product, which has allowed for a hugely flexible customization environment (much more flexible than Smallworld's main competitors'). There is a pretty large base of technical resources in customers and Smallworld partners who know how to enhance the system using this environment.
If GE could manage enhancements at this level of the system in an open source fashion, to leverage the strength of the existing Smallworld technical community, who are on the whole very loyal to Smallworld and have a strong interest in seeing it continue to be successful, this could be a very powerful thing. As I noted in my previous post, one of the challenges for GE is that to develop new applications on two platforms concurrently (Smallworld and the new Oracle/Java based platform) will significantly decrease the amount of new functionality they can develop with a finite set of resources.
Of course there are a lot of challenges for an existing commercial software company in making such a switch. A critical one for GE would be that they could maintain their existing maintenance revenue from the Smallworld user base (at least to a large degree), otherwise it would be a non-starter from a business perspective I think. But it is quite conceivable that this approach could harness the talents of a much larger group of developers than Smallworld currently has working on the product, and produce more enhancements and fixes to the products than customers currently see, so you can see scenarios in which customers would be happy to continue to pay maintenance for support. As Paul points out, Autodesk has made this work and most of their customers still pay maintenance. There are differences in the Autodesk scenario as they open sourced a new product rather than an old one, and I think that a big driver for them was that they saw that it was going to be increasingly hard to make money for basic web mapping products, as that area becomes increasingly commoditized, due to the efforts of both the Googles and Microsofts of the world as well as the open source community. I think that this factor probably helped them decide to make the big leap to an open source approach, and it seems to have been a successful one for them.
Could GE also consider open sourcing portions of the new product, which I think was really Paul's question? That could also be an interesting possibility, to help gain traction in the market. If they open sourced some of the more generic base level components of the product, they could still build their specific business applications on top of these components and charge money for those, but leverage the open source community to provide additional functionality.
So the motivations are somewhat different for GE than for Autodesk, and there are multiple scenarios where open source could play a role, but I could envision an open source version of Smallworld significantly extending the product's life, by harnessing the very good technical resources which exist in the broader Smallworld community. Internal GE resources could then be freed up to work on the new product line (with just a small number focused on managing the open source developments of the established Smallworld product line). There are certainly a number of challenges and a lot of complexity in making something like this fly from a business perspective, and I'm not sure if GE would have the imagination or the boldness to take a risk on it, but it's an interesting thing to speculate about!
The first thread related to customization versus configuration, and the fact that there are potential attractions in being able to provide an "off the shelf" system, which is just configured and not customized, for a specific application area - in this case design and asset management for mid-size electric utilities in North America. If you can achieve this, then potentially you can significantly reduce implementation costs and (in particular) ongoing support and upgrade costs. However, the challenge lies in whether you can meet customer needs well enough in this type of complex application area with just configuration. People have tried to do this multiple times in the utility industry, but so far nobody has been successful, and everyone has fallen back to an approach which requires customization. Both Roberto Falco and Jonathan Hartley were skeptical that a pure configuration approach could be successful, and Jonathan makes a good argument in his response that if you implement a very complex configuration system, then you may end up just recreating a customization environment which is less flexible and less standard than if you had just allowed appropriate customization in the first place. I don't disagree with either of them in general - though I would make the comment to Jon that I don't think we're talking about "GIS" in general, it's a specific application of that to electric utilities in a relatively narrow market segment, so that gives you a little more chance of defining the problem specifically enough that a configurable system could make sense. One other observation I would make is that I think that cost is an element of this equation. 
If an "off the shelf" system meets 90% of an organization's stated requirements and costs 80% of the amount of a fully customized system which meets 100% of their requirements, that is much less compelling than if the off the shelf system cost (say) 20% of the fully customized system, in which case they might be more willing to make a few workarounds or changes to business processes to accommodate such a system. This may be stating the obvious, but I wonder whether organizations trying to develop this "off the shelf " approach go into it thinking that they will have to offer a substantially lower price (or provide other compelling benefits) to persuade customers to adopt it. Finally on this thread, I think it is also worth observing that the "GIS" (if you call it that) in a utility is not a standalone system - it typically requires some sort of integration or interfaces with many other systems, such as customer information, work management, outage management, workforce management, etc etc. This makes it an even bigger challenge to develop something which does not require customization.
Paul Ramsey raised a different question, that of open source, which I think is interesting in this context. But first I will just answer a couple of other questions that Paul raised. He asked if this was essentially a replacement market, and the answer is yes - especially among the medium and large utilities, and in the more established markets (North America, Western Europe, Japan, Australia / New Zealand, etc), pretty well everyone has a system implemented, and GE/Smallworld, Intergraph and ESRI are the dominant vendors. Because of the customized nature of these systems, and the amount of integration with other business systems, switching to a new vendor is typically a multi-million dollar project, even if the new software is provided for free. So this is obviously a big reason why this is a "sticky" market and customers don't change systems very often. The other clarification is that Paul asks about defining the market of customers "who think that 'classic' Smallworld doesn't cut it anymore", and overall I wouldn't particularly categorize things that way. Overall I think that satisfaction levels are probably as high among the Smallworld customer base as with any of the other vendors, the system is functionally very competitive still, the main concern is probably in obtaining and retaining people with the appropriate specialized skills, especially for smaller organizations. But there has been very little movement of utilities from Smallworld to other companies up to this point. Ironically, GE bringing out a next generation system (which they had to do for new business) is something which may cause existing customers to re-evaluate their strategy. Again though this is just a standard challenge of moving to a new generation architecture for any software company - you're damned if you do and damned if you don't.
Anyway, back to open source - Paul raises the question of whether GE doing something with open source may be an option. I have heard a few people raise this before in regard to Smallworld, and it is an interesting question. There are three major elements to the Smallworld core software. The first is the Magik programming language, which was developed by Smallworld back in the late eighties, and it is very similar to Python and Ruby. Smallworld was way ahead of its time in recognizing the productivity benefits of this type of language, and this was a key reason for its success. The core Magik virtual machine has been stable for a long time, the original developers of it left Smallworld a number of years ago and I suspect that nothing much has changed at this level of the system recently. The second key element is VMDS, the Version Managed Datastore. There is some very complex and low level code in both of these components which I suspect would make it hard to open source them effectively, given the amount of time it would take people to get into the details of these components (currently known to only a very few people either inside GE, or who have left GE), and the relatively small size of the market for the products would probably be a disincentive for people to make the effort to learn this. However, both these components are very stable and probably don't really need much maintenance or change. The vast majority of the system is written in Magik, and the bulk of the Magik source code for Smallworld GIS has always been shipped with the product, which has allowed for a hugely flexible customization environment (much more flexible than Smallworld's main competitors). There is a pretty large base of technical resources in customers and Smallworld partners who know how to enhance the system using this environment. 
If GE could manage enhancements at this level of the system in an open source fashion, to leverage the strength of the existing Smallworld technical community, who are on the whole very loyal to Smallworld and have a strong interest in seeing it continue to be successful, this could be a very powerful thing. As I noted in my previous post, one of the challenges for GE is that to develop new applications on two platforms concurrently (Smallworld and the new Oracle/Java based platform) will significantly decrease the amount of new functionality they can develop with a finite set of resources.
Of course there are a lot of challenges for an existing commercial software company in making such a switch. A critical one for GE would be maintaining their existing maintenance revenue from the Smallworld user base (at least to a large degree); otherwise it would be a non-starter from a business perspective, I think. But it is quite conceivable that this approach could harness the talents of a much larger group of developers than Smallworld currently has working on the product, and produce more enhancements and fixes than customers currently see, so you can imagine scenarios in which customers would be happy to keep paying maintenance for support. As Paul points out, Autodesk has made this work, and most of their customers still pay maintenance. There are differences in the Autodesk scenario, as they open sourced a new product rather than an old one, and I think a big driver for them was seeing that it would be increasingly hard to make money from basic web mapping products as that area becomes commoditized, thanks to the efforts of the Googles and Microsofts of the world as well as the open source community. I think this factor probably helped them decide to make the big leap to an open source approach, and it seems to have been a successful one for them.
Could GE also consider open sourcing portions of the new product, which I think was really Paul's question? That could also be an interesting possibility, to help gain traction in the market. If they open sourced some of the more generic base level components of the product, they could still build their specific business applications on top of these components and charge money for those, but leverage the open source community to provide additional functionality.
So the motivations are somewhat different for GE than for Autodesk, and there are multiple scenarios where open source could play a role, but I could envision an open source version of Smallworld significantly extending the product's life, by harnessing the very good technical resources which exist in the broader Smallworld community. Internal GE resources could then be freed up to work on the new product line (with just a small number focused on managing the open source developments of the established Smallworld product line). There are certainly a number of challenges and a lot of complexity in making something like this fly from a business perspective, and I'm not sure if GE would have the imagination or the boldness to take a risk on it, but it's an interesting thing to speculate about!
Labels: General Electric, geospatial, open source, Smallworld, utilities
Saturday, June 16, 2007
Quick report on FRUGOS "unconference"
As mentioned previously, FRUGOS (Front Range Users of Geospatial Open Source) held an "unconference" in Boulder, CO, today. I went along with over twenty others, and it was a really excellent event. Many thanks to Sean Gillies and Brian Timoney for organizing things (insofar as an unconference can admit to having organizers!), and to Tom Churchill of Churchill Navigation for hosting us in a beautiful location right at the foot of the Flatirons in Boulder.
Having never been to an unconference before (as was the case for most attendees, I think), I wasn't sure what to expect, but an intentionally fairly random process - people volunteering to speak, demo or lead discussions, with very minimal organizing of the agenda - produced a set of sessions of higher quality than those at most highly structured conferences I have been to. We had some sessions with the whole group and some where we broke into two groups. As well as strictly open source topics, there were several sessions on Google Earth and Maps, and I talked about general geospatial future trends in a shortened version of my recent GITA presentation.
I don't have time to review everything now, but will mention a few highlights. Scott Davis, author of various books, gave an interesting talk on "rolling your own Google Maps", with a sequence of 12 simple web pages that gradually built up functionality until he had implemented a page with "slippy map" behavior, allowing dynamic panning and zooming across multiple layers of image tiles. You can check out these examples here - you will just see a directory listing; start with the readme and then work your way through each of the sample pages. It's a great little JavaScript tutorial, and a good way of understanding some of the principles that make Google Maps so performant and easy to use. Chris Helm from the University of Colorado talked about how they have used various products, including MapServer, PostGIS and Google Earth, to view glacier data and related imagery. The system links together over 10,000 KML files which are loaded as needed, avoiding the overhead of downloading very large KML files. There was another interesting talk on NASA WorldWind.
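The tile pyramid behind a slippy map rests on simple arithmetic: at zoom level z the world is divided into a 2^z by 2^z grid of tiles, and the tile containing a given longitude/latitude follows from the Web Mercator projection. Here is a minimal sketch of that standard tile formula (my own illustration, not code from Scott's tutorial):

```javascript
// Convert a longitude/latitude to slippy-map tile coordinates at a zoom level.
// Standard Web Mercator tile math: at zoom z the world is a 2^z x 2^z grid.
function lonLatToTile(lon, lat, zoom) {
  const n = Math.pow(2, zoom); // number of tiles across one axis
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y };
}

// Boulder, CO (roughly 40.01 N, 105.27 W) at zoom 10:
console.log(lonLatToTile(-105.27, 40.01, 10)); // { x: 212, y: 387 }
```

Each pan or zoom simply swaps in the adjacent or next-level tile images, which is why the interaction feels so fast.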
Gregor Allensworth-Mosheh of HostGIS talked about his HostGIS Linux distribution, which comes with all sorts of open source geospatial goodies installed and ready to run right out of the box, along with a nice set of examples. I have a copy of the CD and plan to give it a try when I have time. He also talked about "how to display 10,000 points in Google Maps", which I thought was great. He had two approaches: one downloaded all the points to the client in JSON format (which is much more compact than options like KML) and did all the processing on the client to combine multiple nearby points into a single marker; the other used a WMS service which combined points on the server and rendered a raster image, but still allowed selection of points via a round trip to the server. Both approaches overcome one of my pet peeves, which is displaying large result sets in multiple "pages" - a lazy solution that is not at all useful in most circumstances. For example, if I look at the photos I have geocoded in flickr, the result is split into 12 pages, so I just get a random one twelfth of the 1,200 or so photos I have geocoded, with no idea of the real geographic distribution of the whole set. Come on flickr guys, you can do better than this!
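The client-side approach boils down to bucketing nearby points so each bucket is drawn as one marker. A minimal sketch of that idea in plain JavaScript (the grid-cell strategy, cell size, and data shape are my own assumptions for illustration, not HostGIS's actual code):

```javascript
// Cluster points into grid cells so each cell is rendered as a single marker.
// cellSize is in degrees; real code would scale it to the current zoom level.
function clusterPoints(points, cellSize) {
  const cells = new Map();
  for (const p of points) {
    // Points in the same cell share a key and get merged.
    const key =
      Math.floor(p.lon / cellSize) + ":" + Math.floor(p.lat / cellSize);
    const cell = cells.get(key) || { count: 0, lonSum: 0, latSum: 0 };
    cell.count += 1;
    cell.lonSum += p.lon;
    cell.latSum += p.lat;
    cells.set(key, cell);
  }
  // One marker per cell, placed at the centroid of its points.
  return Array.from(cells.values()).map((c) => ({
    lon: c.lonSum / c.count,
    lat: c.latSum / c.count,
    count: c.count,
  }));
}

const markers = clusterPoints(
  [
    { lon: -105.27, lat: 40.01 }, // Boulder
    { lon: -105.28, lat: 40.02 }, // Boulder, very close to the first point
    { lon: -104.99, lat: 39.74 }, // Denver, well away from the Boulder pair
  ],
  0.1
);
console.log(markers.length); // 2 - the two Boulder points merge into one marker
```

With 10,000 points this keeps the marker count manageable at low zooms, while zooming in (smaller cells) naturally breaks clusters back apart.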
Tom Churchill ended the day by showing us his very cool touch table, running an application that overlaid live video from a helicopter on top of a map containing both imagery and vector data - it was dynamic and very impressive! This is just a side project for them; their main efforts are concentrated on producing a new generation of in-car navigation system, which aims to do for that category what Google Earth did for online map display - make it much more dynamic and fun. You can get a flavor of what they are up to from these videos, but they really don't do full justice to what Tom showed us. I look forward to seeing how their system develops!
There is certainly a great energy about all that is going on with this new generation of geospatial systems right now, and it was good to meet a number of the people in the Front Range area who are making things happen in this area. I look forward to future FRUGOS events.