Archive for the 'Locality and Space' Category

Topology vs. Simple Features

Posted in OpenStreetMap on April 21st, 2007 at 19:41:20

Lars Aronsson on the OSM list said:

The result from this is Steve’s current data model and the fact that the rest of us accept this as a viable solution. Those who don’t, because they know more of GIS, like Christopher Schmidt, are repelled by everything they find under the hood of OSM.


Part of my response:

I’m actually not repelled by everything. It’s simply a different choice than I would make. Specifically:

  • OSM uses topology as its base storage. Topology is good for making graphs, which is important when you need to do routing. For this reason, it seems to me that OSM was built towards the goal of creating driving directions. Great goal for a project to have. However,
  • Most geo-software uses Simple Features — not topology — for handling data. The result is very different — Simple Features are designed for making maps. If you’d like evidence, look at how the mapnik maps are built: the topology is turned into simple features, and stored in PostGIS. My MapServer demos just under a year ago worked the same way.

The difference to me is simple:

If I want to draw an OSM feature on a map, I have to fetch a large number of pieces of data from the API individually, and combine them to create a geographic feature.

Example:

Way ID 4213747:

  • 1 way.
  • 21 segments.
  • 22 nodes.

So, to visualize this one way, I have to make 44 fetches to the API.
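The arithmetic above can be sketched out in a few lines. The shape of the `way` object here is an invented stand-in for what the 2007-era OSM API returned (a way listing segment IDs, each segment referencing two nodes); only the counting logic is the point:

```javascript
// Sketch of the fetch pattern the topology model implies. The object
// shape here is a hypothetical stand-in for the OSM API's responses;
// only the counting logic matters.
function countFetches(way) {
  // 1 fetch for the way itself, which lists its segment IDs...
  var fetches = 1;
  // ...1 fetch per segment to learn its two node IDs...
  fetches += way.segments.length;
  // ...and 1 fetch per distinct node to get actual coordinates.
  fetches += way.nodeCount;
  return fetches;
}

// Way 4213747 from the example: 21 segments, 22 distinct nodes.
var total = countFetches({ segments: new Array(21), nodeCount: 22 });
// total is 44
```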

Now, if I switch to a simple features model:

JSON Simple Feature output of same geometry

I’m given a geometry (“Line”), a list of coordinates, and a list of properties. (This is JSON output: you can also see it as HTML by adding ‘.html’ to the end, or as Atom by adding ‘.atom’ to the end.)

“Line” can also be “Polygon”, or “Point”. (Or “MULTIPOLYGON”, etc., though FeatureServer doesn’t support those.)
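As a rough sketch, the single response looks something like the following — the field names and values here are illustrative approximations, not FeatureServer’s exact schema:

```javascript
// Illustrative single-fetch response in the simple features style.
// Field names and values are approximations, not FeatureServer's
// exact output format.
var feature = {
  geometry: {
    type: "Line",
    coordinates: [[-71.06, 42.35], [-71.05, 42.36], [-71.04, 42.36]]
  },
  properties: { name: "Massachusetts Avenue" }
};

// Everything needed to draw arrives in one object: a geometry type,
// the coordinate list, and the attributes.
var canDraw = feature.geometry.type === "Line" &&
              feature.geometry.coordinates.length > 0;
```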

This is one fetch. I can now draw the feature. I can also query for other features which have the same name, and get the information for those, too:

Attribute query on name
This shows me that there is also a feature, ID 4213746, which has the same name. I can draw all these features on a map with the output of one query.

In OSM, that would be 88. 88 queries to the API, just so I can display two features — not to mention the fact that at the moment, there’s no way to query attributes quickly.

If there were a strong reason for storing topology — that is, if OSM were really not about maps, and were instead about making driving directions — this could make sense. In fact, it may make sense: I may have a serious lack of understanding about how the project data is being used. However, I think that the most common usage of OSM is *making maps* — in fact, Steve even backs me up on this, in his post:

OpenStreetMap is driven by this principle that we just want a fscking map.

Topology makes a graph, not a map. This is the reason why I’m in favor of a simple features-based data model: Features-based models are what you use for making maps. Topology is what you use for doing analysis.

The upshot of this? The tools to make topology out of simple features *already exist*: GRASS will do it. PostGIS + pgdijkstra will do it. Any application out there which needs topology knows how to get it, because mapping data is almost always distributed as something that isn’t topological. These are all technical problems: mapping back and forth is possible. The best way to do it is hard to determine, but the OSM project has no shortage of hard-working participants, and I’m sure that over time we will see easier-to-use UIs and editors for editing and creating data.
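The conversion in that direction is conceptually simple: treat each consecutive coordinate pair in a line feature as a graph edge, keyed by coordinate. This toy sketch is only an illustration of the idea — it is not what GRASS or pgdijkstra actually do internally:

```javascript
// Toy sketch of deriving topology from simple features: every
// consecutive coordinate pair in a line becomes an undirected graph
// edge, with shared coordinates merging into one graph node.
// Illustration only -- not GRASS's or pgdijkstra's actual algorithm.
function buildGraph(lines) {
  var adjacency = {};
  lines.forEach(function (coords) {
    for (var i = 0; i < coords.length - 1; i++) {
      var a = coords[i].join(","), b = coords[i + 1].join(",");
      (adjacency[a] = adjacency[a] || []).push(b);
      (adjacency[b] = adjacency[b] || []).push(a);
    }
  });
  return adjacency;
}

// Two line features sharing an endpoint become a connected graph:
var graph = buildGraph([
  [[0, 0], [1, 0]],
  [[1, 0], [1, 1]]
]);
// graph["1,0"] has two neighbours: "0,0" and "1,1"
```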

OpenLayers Blog

Posted in Locality and Space, OpenLayers on March 27th, 2007 at 22:15:50

OpenLayers now has its own blog.

Thanks to the creation of Planet OSGeo for the final inspiration to actually put it together.

Fed up with PlanetGS

Posted in Locality and Space on March 26th, 2007 at 08:12:33

I finally got fed up with Planet GeoSpatial this weekend. Too much Google, too many formatting mess ups, and in general, too much crap. Although I appreciate that those who are working in the GeoSpatial space have a large interest in ESRI, Google Earth and Maps, and the general ‘state of the industry’, my target interests are much smaller. I just want to know about what’s going on in the world which affects Open Source GeoSpatial software: I wanted what is essentially “Planet OSGeo”, rather than “Planet Geo”.

(Note that this is a commentary mostly on my specific interests, rather than on Planet GeoSpatial, which James Fee has done a wonderful job on maintaining for the wider target audience he has.)

To that end, and with Sean Gillies’ excellent recommendation of Venus (omg, a version of Planet that *works*?), I’ve set up Planet OSGeo, a collection of Open Source GIS blogs.

Of note, I’ve not included several topics here, even though they are at least tangentially related to Open Source Geo:

  • GeoRSS. There are a half dozen people who are regularly blogging about GeoRSS — not least, the GeoRSS blog. However, GeoRSS is not in and of itself “software”, which is my target interest, and I think the primary field in which OSGeo has thus far expressed an interest.
  • OpenStreetMap. Although collection of Open Geodata is within the realm of OSGeo, at the moment I’m targeting software, and most of OSM’s blogging is not about software development.

(I’m happy to take feedback on my choices — or suggestions for more blogs!)

Notably lacking in this Planet:

  • Blog for OpenLayers. Thus far, I’ve not set up a separate infrastructure for OpenLayers blogging, sticking instead to the MetaCarta Labs blog. I think the time has come to grow out of that, and move into some OpenLayers infrastructure for blogging.
  • Blog for GDAL. The library at the base of much of the Geo software on the web doesn’t have an RSS feed — for either announcements or general project discussion. Some of this is probably representative of the stability of the project: certainly after as many years as GDAL has been around, there’s limited content in terms of “Rapid Development” that many other tools like web mapping clients are still undergoing. Still, an announcements log with an RSS feed would be cool.
  • MapServer. MapServer’s website doesn’t seem to offer a blog or RSS feed with announcement-style updates, which would be good to see.

I think the things I see as ‘missing’ clearly demonstrate my bias in development and usage of tools — so I’m sure more people can point out what else I’m missing.

Looking forward to any feedback you might have.

OpenLayers Vector Support

Posted in default, Locality and Space, OpenLayers on March 11th, 2007 at 11:05:52

So, last week was an OpenLayers hack week. One of the things that we did was make adding support for new vector formats trivial. Instead of modifying several parts of the code, you only need to create two functions: a ‘read’, which takes a set of data — XML, strings, Javascript object, what have you — and returns a list of OpenLayers.Feature.Vector objects, and a ‘write’ which does the reverse — takes a list of objects and returns a string or object or XML.
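The shape of that contract can be sketched like this, using a trivial made-up “lon,lat lon,lat” text format — a simplified stand-in for the pattern, not the actual OpenLayers Format code:

```javascript
// Simplified sketch of the two-function read/write contract, using a
// trivial "lon,lat lon,lat" text format. Not the real OpenLayers
// code; it only demonstrates the pattern.
function Feature(geometry) { this.geometry = geometry; }

var TextFormat = {
  // read: raw data in, list of feature objects out.
  read: function (data) {
    return data.trim().split(/\s+/).map(function (pair) {
      var parts = pair.split(",");
      return new Feature({ x: parseFloat(parts[0]), y: parseFloat(parts[1]) });
    });
  },
  // write: list of feature objects in, serialized string out.
  write: function (features) {
    return features.map(function (f) {
      return f.geometry.x + "," + f.geometry.y;
    }).join(" ");
  }
};

var features = TextFormat.read("-71.06,42.35 -71.05,42.36");
var roundTrip = TextFormat.write(features);
// roundTrip === "-71.06,42.35 -71.05,42.36"
```

Because every format speaks the same feature-list interface, reading with one format and writing with another is just `OtherFormat.write(TextFormat.read(data))` — which is exactly what makes the KML-in, GeoRSS-out trick below possible.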

To prove this, I set out to write some additional vector format support last night. I decided to add one read, and one write.

  • Read: KML. I added support for KML point display in about 20 minutes, including the time to find data and write a demo HTML page loading some example data. Adding LineString support was another 15 minutes.
  • Write: GeoRSS. Support for writing georss:simple points, lines, and polygons was simple… once I found data. I asked for a live example, and was unfortunately unable to find any valid line data outside the GeoRSS website, so I just generated something that was as close as I could come to the examples. I’m lazy, so the export is just RSS 2, and I’m sure that someone will come along and criticize it, but that’s one of the benefits of Open Source: Anyone can offer up a patch. Time from when I created the file stub to when I committed the code was 27 minutes, again, including a demo.

Altogether, the Format support in the new OpenLayers is pretty cool. Because of the way it’s built, I can even do something that is pretty damn ridiculous: Import KML, and export GeoRSS (or GML), all from the browser. Certainly, this is an incredibly crazy thing to do, but OpenLayers is a pretty crazy project.

I’m convinced that there’s nothing in the code that would make it difficult for someone who’s comfortable working with Javascript to write support for any simple-to-parse format. Now, to get the code back to trunk and get the patches rolling in.

From Data To Map

Posted in Locality and Space, OpenLayers, QGIS, TileCache on February 14th, 2007 at 00:47:25

Earlier this evening, Atrus pointed out that DC has a bunch of cool data in their GIS Data Catalog. I decided I would play with it a bit and see what I could come up with.

I grabbed the Street Centerlines, played with it in QGIS to do a bit of cartography, and then (eventually) got it exported to a MapServer .map file (which describes styling info). I was then able to set the file up in MapServer, serve it out to OpenLayers, and then to stick TileCache in the mix. The result isn’t the prettiest thing in the world, but it works.
After going through it once, I decided I’d go through it all again, to see how long it took.

  • 12:15AM: Open Firefox to the DC Data Catalog to find some data to map.
  • 12:16AM: Pick out Structures Polygons.
  • 12:17AM: Download complete, open QGIS
  • 12:18AM: Open file in QGIS
  • 12:19AM: Save QGIS project file, save map file from project file
  • 12:20AM: Copy both shapefile and mapfile to server
  • 12:21AM: Tweak mapfile: adjust PNG output to not be interlaced (for TileCache usage), change background color
  • 12:22AM: Test mapfile in mapserv CGI. Find out I misspelled something, fix it.
  • 12:23AM: Edit TileCache config to add new layer information.
  • 12:24AM: Copy an existing tile URL, ensure that it works in TileCache with the different layer.
  • 12:25AM: Edit OpenLayers config to include additional layer
  • 12:26AM: Edit OpenLayers config to include layerswitcher.
  • 12:27AM: Marvel at the result

In less than 15 minutes I was able to turn a raw dataset into a browsable, lazily cached web map, using QGIS, MapServer, OpenLayers, and TileCache. Not bad at all.
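The “lazily cached” part boils down to tile addressing: a request’s bounding box maps to grid indices, and the cache renders a tile only the first time that index is asked for. This toy function shows the index math for a lon/lat grid, using the conventional geodetic zoom-0 resolution of 0.703125 degrees/pixel (180/256); it’s a simplification, not TileCache’s actual code:

```javascript
// Toy sketch of mapping a point to tile-grid indices, the way a tile
// cache addresses its tiles. Simplified; not TileCache's actual code.
// Assumes a lon/lat grid with origin (-180, -90), 256-pixel tiles, and
// the conventional geodetic zoom-0 resolution of 0.703125 deg/pixel.
function tileIndex(lon, lat, zoom) {
  var res = 0.703125 / Math.pow(2, zoom); // degrees per pixel at this zoom
  var tileSpan = 256 * res;               // degrees covered by one tile
  return {
    x: Math.floor((lon - (-180)) / tileSpan),
    y: Math.floor((lat - (-90)) / tileSpan)
  };
}

// Roughly Washington, DC at zoom 3:
var t = tileIndex(-77.03, 38.9, 3);
// t is {x: 4, y: 5}
```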

Free Maps for Free Guides

Posted in Locality and Space, Mapserver, OpenGuides, OpenLayers, TileCache on February 11th, 2007 at 08:46:05

A bit more than a year ago, when I was just learning how to use the Google Maps API, I put together a patch for the OpenGuides software, adding Google Maps support. It seemed the logical way to go: It wasn’t perfect, since Google Maps are obviously non-free, but it seemed like a better way to get the geographic output from OpenGuides out there than anything else at the time.

Since I did that, I’ve learned a lot. Remember that 18 months ago, I’d never installed MapServer, had no idea what PostGIS was, and didn’t realize that there were free alternatives to some of the things that Google had done. Also, 9 months ago, there was no OpenLayers, or any decent open alternative to the Google Maps API.

In the past 18 months, that’s all changed. I’ve done map cartography, I’ve set up map servers, and I worked full time for several months on the OpenLayers project. Although my direction has changed slightly, I still work heavily with maps on a daily basis, and spend more of my time on things like TileCache, which lets you serve map tiles at hundreds of requests/second.

So, about a month ago, I went back to the Open Guide to Boston, and converted all the Google Maps API calls to OpenLayers API calls. The conversion took about an hour, as I replaced all the templates with the different code. (If I were writing it again, it would have taken less time, but this was my first large scale open source Javascript undertaking, long before I gained the knowledge I now have from working with OpenLayers.) In that hour, I was able to convert all the existing maps to use free data from MassGIS, rather than the copyrighted data from Google, and to have Google as a backup: a Map of Furniture Stores can show you the difference. You’ll see that there are several layers — one of which is a roadmap provided by me, one from Google — and one from the USGS, topographic quad charts.

It’s possible that some of this could have been done using Google as the tool. There’s nothing really magical here. But now, the data in the guide is no longer displayed by default on top of closed source data that no one can have access to. Instead, it’s displayed on top of an open dataset provided by my state government.

This is how the world should work. The data that the government collects should be made available to the people for things exactly like this. It shouldn’t require a ‘grassroots remapping’: There are examples out there of how to do it right. I find it so depressing to talk to friends in the UK, whose government not only doesn’t provide the 1:5000 scale quality road data that Massachusetts does, but doesn’t even provide the TIGER-level data that the geocoder on the Open Guide to Boston uses.

Free Guides, with Free Maps. That’s the way it should be. The fact that it isn’t everywhere is sad, but at least it’s good to know that the technology is there. Switching from Google to OpenLayers is an easy task — it’s what happens next that is a problem. You need the data from somewhere, and it’s unfortunate that that ‘somewhere’ needs to be Google for so many people. I’m thankful to MassGIS and to the US Government for providing the data I can use, and to all the people who helped me learn enough to realize that using Google for everything is heading the wrong way when you don’t want to be beholden to a specific set of restrictions imposed by a corporate entity.

Yahoo! Pipes: Turning Pipes into Application

Posted in Ning, OpenLayers, Pipes on February 10th, 2007 at 20:29:19

So it seems clear to me that the Pipes application is a step in a really cool direction. I don’t know if there’s anything incredibly innovative in the idea of making programming easy, but Yahoo! has gone a long way towards the goals that other people have put into place. Ning thought that letting people code would be the way forward: give them a sandbox, let them copy paste, and they’ll build applications. The idea was right: there are a lot more people out there who want to be builders than there are who can. It turned out that the people who wanted to be builders didn’t have the skill level they needed to write PHP code, even with mix/match and copy/paste.

Yahoo! Pipes is the followthrough on that idea: make it possible for people to take a set of input, and get a set of output, passing it through multiple filters.

The next step is obvious: Let people turn the filter settings into a web page, with the output being another web page. Search for all content 5 miles from a given Craigslist location: Take the user input as drop down boxes or something in an HTML form, and make the output a Yahoo! Map. Boom: you’ve turned everyone who can create a pipe into a web application builder. Stick ads along the bottom, and you’ve done one of the things that Ning tried to do: make money off applications in the same way that so many have made money off content.

I’m sure that Yahoo! already has this in mind, whether they’ve written about it or done it yet or not. It’s only a matter of time. It does make me wonder if someone could build something that did this without needing Yahoo! to do it… It seems like at the moment it would require altering a pipe on the fly, which I don’t see a way to do, so either there needs to be a further API, or we’ll all just need to wait for it to get done 🙂

Update: Looking today, you can control the input of text inputs from the URL that you fetch the RSS with. This means that I can go ahead and build the pipe thingy for my own pipes as is. That’s pretty cool. I’ll show one with MetaCarta stuff on Monday.
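Controlling a text input from the URL amounts to appending it as a query parameter on the pipe’s feed URL. Something like this — where the pipe ID and the `textinput1` parameter name are invented placeholders, not a real pipe:

```javascript
// Sketch of driving a pipe's text input from the URL its RSS is
// fetched with. The pipe ID and the "textinput1" parameter name are
// invented placeholders for illustration.
function pipeUrl(pipeId, params) {
  var query = ["_id=" + encodeURIComponent(pipeId), "_render=rss"];
  for (var key in params) {
    query.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return "http://pipes.yahoo.com/pipes/pipe.run?" + query.join("&");
}

var url = pipeUrl("abc123", { textinput1: "MetaCarta" });
// url carries "textinput1=MetaCarta" in its query string
```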

Perhaps I’ll build an OpenLayers based Yahoo Pipe output viewer. It wouldn’t be that different from the GeoRSS viewer… but it would need a way to visualize non-Geo content. Ponder ponder.

Yahoo! Pipes: Make it work at all?

Posted in Locality and Space, Pipes on February 10th, 2007 at 09:53:11

A prize to anyone who can make a simple Atom entry or a simple RSS entry get geocoded by the Yahoo! Pipes Location Extractor. I’ve spent the last 30 minutes on it, and failed.

Non-working pipe is my attempt. GeoRSS works, but location extraction doesn’t.

Update:

  1. Location Extractor seems to work against the HTML pages referenced by the feed, not the content in the feed.
  2. Minor changes to the HTML page seem to break the parsing — it seems to be very targeted towards Craigslist postings. A page with my address but a different map link seems to extract the Sebastopol address, while a page with just a map link doesn’t seem to extract at all.

I guess I don’t need to sell my MetaCarta stock yet… unless I’m way off, this shows that MetaCarta is significantly ahead of the game for extracting locations from unstructured text. Not that this is a surprise to me 😉

Yahoo! Pipes: Make your own Module?

Posted in Locality and Space, MetaCarta, Pipes on February 10th, 2007 at 09:20:15

Is it possible to make your own module with Yahoo! Pipes? I was looking around and didn’t see anything… I’d really like to be able to hook up something that grabs locations from the MetaCarta Web Services, and then let people drop it into their own pipelines… I’d be willing to bet that the Location Extractor pipe module wouldn’t pull out “20 miles north of London”, but with the MetaCarta GeoTagger, I could…

FOSS4G2007 Call For Workshops

Posted in FOSS4G 2007, Locality and Space on February 6th, 2007 at 02:17:04

This has been posted to a couple mailing lists, but posting it here can’t hurt:

The FOSS4G (Free and Open Source Software for Geospatial) conference is pleased to announce the Call for Workshops for the 2007 conference, being held September 24-27 in beautiful Victoria, British Columbia, Canada.

FOSS4G is the premier conference for the open source geospatial community, providing a place for developers, users, and people new to open source geospatial to get a full-immersion experience in both established and leading edge geospatial technologies.

This is your chance to showcase your favorite application, integration solution, or other topic. You will use your superior classroom skills to lead a group of attendees through your chosen topic in either a half-day or ninety minute lab or classroom format. Half-day workshops will be delivered on Monday, September 24 (the Workshops day), while the ninety minute workshops will run concurrently with the presentations during the remainder of the conference.

While we are open to workshops on a wide range of topics, we strongly encourage workshop submissions on the following topics:

  • Practical Introduction to __________
  • Interoperability
  • NeoGeography and NovelGeography
  • Using a Software Stack
  • 3D Worlds

In the tradition of previous FOSS4G events, we expect that the majority of workshops will be “hands on”, with participants seated in front of computers and able to follow along with the instructor, working directly with the software and applications under discussion.

Be prepared to spend considerable effort in creating your workshop. Past experience has shown that a high quality workshop requires about three days of preparation for each hour of presentation time. As part of this preparation you will be expected to develop material for attendees to take away with them, such as handouts, a ‘workbook’, CDROM, etc.

In recognition of this effort, workshop presenters will receive a reduction in the price of conference registration:

  • free registration for delivering a half-day workshop
  • half-price registration for delivering a 90-minute workshop

Because of limited space, you may want to consider submitting two versions of your topic, one for each length format.

Please visit the workshops page on our website to download the submission templates and instructions for sending them in:

http://www.foss4g2007.org/workshops.html

The deadline for workshop submissions is February 28, 2007. Submit early, submit often!