
Wednesday, November 4, 2009

I've joined the Advisory Board of FortiusOne

Today it was announced that I've joined the new Advisory Board of FortiusOne, together with Jeff Harris, who has a very distinguished background in the Intelligence world, and Michael Frankel and Wolf Ruzicka, who bring great expertise in Business Intelligence and enterprise software. We actually had the first Advisory Board meeting just recently and it's a great group.

I've followed the development of FortiusOne with interest for a few years now, and I did a bit of consulting for them back in the fairly early days of the company. Their CEO Sean Gorman and CTO Andrew Turner are two of the leading thinkers in the geospatial industry. I am a big proponent of their philosophy of demystifying geospatial analysis and making it accessible to a much broader audience of non-specialists. You can check out their free GeoCommons site, which lets you easily search for, upload and download geospatial data, and produce great-looking maps and spatial analysis like this:



(Click on the map to go to the interactive version)

One cool feature of GeoCommons is the ability to upload spreadsheets containing addresses or place names, which will be automatically geocoded using the free and open source GeoCommons geocoder. There are lots of nice examples of using GeoCommons on the FortiusOne blog, for example these posts on health care, the Afghan Elections, and home foreclosures. FortiusOne sells enterprise versions of their application (as a service or an appliance), which have additional analytic capabilities beyond those on the free public site, but with the same focus on simplicity and ease of use. I look forward to working with the team at FortiusOne, and watch for more cool new things coming soon!
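To illustrate what the geocoding step does conceptually, here is a toy sketch. The lookup table, function name, and coordinates below are purely hypothetical illustrations, not the GeoCommons geocoder's actual API or data; a real geocoder resolves free-form addresses against large reference datasets rather than a tiny dictionary.

```python
import csv
import io

# Toy gazetteer standing in for a real geocoder's reference data
# (place names and coordinates are rounded/illustrative, not authoritative).
GAZETTEER = {
    "denver, co": (39.74, -104.99),
    "washington, dc": (38.90, -77.04),
}

def geocode_rows(csv_text, place_column):
    """Attach lat/lon to each spreadsheet row whose place name we can resolve."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        key = row[place_column].strip().lower()
        lat_lon = GAZETTEER.get(key)
        if lat_lon:
            row["lat"], row["lon"] = lat_lon
        rows.append(row)
    return rows

# An uploaded "spreadsheet" as CSV text (invented sample data).
data = 'city,sales\n"Denver, CO",120\n"Washington, DC",95\n'
geocoded = geocode_rows(data, "city")
```

Each row that matches the gazetteer comes back with `lat` and `lon` columns added, ready to be placed on a map.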

Monday, September 29, 2008

Sneak Preview of GeoCommons Maker from FortiusOne

Sean Gorman from FortiusOne was kind enough to let me have a play with the new GeoCommons Maker application ahead of its upcoming release. I don't have time for a detailed review right now, but overall my first impressions are very good. Maker is their new product focused on enabling non-expert users to make nice maps. Map layers can be one of three types: basic reference data, choropleth thematic maps, or graduated symbol thematic maps.
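A choropleth map works by classifying a numeric attribute into a small number of bins and shading each area by its bin. A common scheme is quantile classification, sketched minimally below (the data values are invented, and this is a generic illustration of the technique, not Maker's actual implementation):

```python
def quantile_breaks(values, n_classes):
    """Upper break values splitting the sorted data into n roughly equal classes."""
    ordered = sorted(values)
    breaks = []
    for i in range(1, n_classes + 1):
        # Index of the last value that falls into class i.
        idx = int(round(i * len(ordered) / n_classes)) - 1
        breaks.append(ordered[idx])
    return breaks

def classify(value, breaks):
    """0-based class index for a value, given the upper break values."""
    for i, upper in enumerate(breaks):
        if value <= upper:
            return i
    return len(breaks) - 1

# Invented attribute values, e.g. a per-county statistic.
counts = list(range(1, 101))
breaks = quantile_breaks(counts, 4)   # four classes of ~25 values each
```

Each class index then maps to one color in a graduated palette.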

The following is a graduated symbol map showing number of Facebook users by city in the US.

Maker screen shot
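Graduated symbol maps like this one conventionally scale the symbol's area, not its radius, with the data value, so a city with four times the users gets a circle twice as wide. A minimal sketch of that scaling (the values and the 30-pixel maximum are invented for illustration; this is the standard cartographic convention, not necessarily Maker's exact formula):

```python
import math

def symbol_radius(value, max_value, max_radius=30.0):
    """Radius in pixels such that circle AREA is proportional to the data value."""
    return max_radius * math.sqrt(value / max_value)
```

With `max_value=100`, a value of 25 yields a radius of 15 pixels: a quarter of the value, half the width, a quarter of the area.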

This is an example of one of the map creation screens - the whole process has nice clear graphics which are well thought out to explain the process and options, with ways of getting more information when you want it.

Maker screenshot 2

I am a big fan of FortiusOne's vision of putting spatial analysis in the hands of people who are not geospatial specialists. There are still a lot of traditional GIS people who think the sky will fall in if you let "untrained" people analyze geospatial data (they may draw invalid conclusions if they don't understand data accuracy, and so on), but I think this is nonsense. The best analogy is the spreadsheet: it lets "untrained" people do useful analysis on all kinds of non-spatial data. Of course people can draw incorrect conclusions from any kind of data, geospatial or not, but in the great majority of cases this does not happen, and they discover useful insights instead. FortiusOne is trying to democratize access to spatial analysis in the same way that the spreadsheet has done for non-spatial data, and the benefits of having a much larger set of people able to do basic geospatial analysis are huge.

As I said above, I think that this first release looks great, and I look forward to seeing where they take this in the future. I understand that the public release of this will be coming soon.

Thursday, February 28, 2008

If you could do geospatial analysis 50 to 100 times faster …

… than you can today, what compelling new things would this enable you to do? And yes, I mean 50 to 100 times faster, not 50 to 100 percent faster! I’m looking for challenging geospatial analytical problems that would deliver a high business value if you could do this, involving many gigabytes or terabytes of data. If you have a complex analysis that takes a week to run, but you only need to run it once a year for regulatory purposes, there is no compelling business value to being able to run it in an hour or two. But if you are a retail chain and you need to run some complex analysis to decide whether you want to buy an available site for a new store within the next three days, it makes a huge difference whether you can just run one analysis which takes two days, or dozens of analyses which take 30 minutes each, allowing you to try a range of assumptions and different models. Or if you’re a utility or emergency response agency running models to decide where to deploy resources as a hurricane is approaching your territory, being able to run analyses in minutes rather than hours could make a huge difference to being able to adjust your plans to changing conditions. There may be highly valuable analyses that you don’t even consider running today as they would take months to run, but which would have very high value if you could run them in a day.

If you have problems in this category I would be really interested to hear about them, either in the comments here or by email if you prefer.

Update: I should say that this is not just a hypothetical question, but I can't talk about any details yet. See the comments for more discussion.