Fed up with PlanetGS

Posted in Locality and Space on March 26th, 2007 at 08:12:33

I finally got fed up with Planet GeoSpatial this weekend. Too much Google, too many formatting mess-ups, and in general, too much crap. Although I appreciate that those who are working in the GeoSpatial space have a large interest in ESRI, Google Earth and Maps, and the general ‘state of the industry’, my target interests are much narrower. I just want to know what’s going on in the world that affects Open Source GeoSpatial software: what I want is essentially “Planet OSGeo”, rather than “Planet Geo”.

(Note that this is a commentary mostly on my specific interests, rather than on Planet GeoSpatial, which James Fee has done a wonderful job on maintaining for the wider target audience he has.)

To that end, and with Sean Gillies’ excellent recommendation of Venus (omg, a version of Planet that *works*?), I’ve set up Planet OSGeo, a collection of Open Source GIS blogs.

Of note, I’ve not included several topics here, even though they are at least tangentially related to Open Source Geo:

  • GeoRSS. There are a half dozen people who are regularly blogging about GeoRSS — not least, the GeoRSS blog. However, GeoRSS is not in and of itself “software”, which is my target interest, and I think the primary field in which OSGeo has thus far expressed an interest.
  • OpenStreetMap. Although collection of Open Geodata is within the realm of OSGeo, at the moment I’m targeting software, and most of OSM’s blogging is not about software development.

(I’m happy to take feedback on my choices — or suggestions for more blogs!)

Notably lacking in this Planet:

  • Blog for OpenLayers. Thus far, I’ve not set up a separate infrastructure for OpenLayers blogging, sticking instead to the MetaCarta Labs blog. I think the time has come to grow out of that, and move into some OpenLayers infrastructure for blogging.
  • Blog for GDAL. The library at the base of much of the Geo software on the web doesn’t have an RSS feed — for either announcements or general project discussion. Some of this is probably representative of the stability of the project: certainly after as many years as GDAL has been around, there’s limited content in terms of “Rapid Development” that many other tools like web mapping clients are still undergoing. Still, an announcements log with an RSS feed would be cool.
  • MapServer. MapServer doesn’t seem to have a blog or RSS feed on its website that provides interesting announcement-style updates, which would be good to see.

I think the things I see as ‘missing’ clearly demonstrate my bias in development and usage of tools — so I’m sure more people can point out what else I’m missing.

Looking forward to any feedback you might have.

OpenLayers Screencast

Posted in default on March 18th, 2007 at 20:51:24

Made a ‘screencast’ (at least, I think that’s the right buzzword) of OpenLayers + QGIS today:

Mapping Your Data: Using QGIS, OpenLayers, and MapServer to serve open data

Tried uploading it to YouTube: the resolution was shrunk to the point of being unusable. Tried uploading to Google Video: despite the upload completing 11 hours ago, the upload is still ‘processing’. So I went through with ffmpeg and converted it to Flash myself, and set up the environment for playing it on openlayers.org.

In the video, I used TextMate (Shareware), QGIS (Open Source + Free), Firefox, and OpenLayers.

(There is no audio in the presentation, because I felt like my voice would be a distraction. I suppose next time I should put some fancy soundtrack in place.)

OpenLayers: One Laptop Per Child Project

Posted in default on March 17th, 2007 at 22:08:37

At BarCampBoston today, I got to talking to the OLPC folks. I was demoing our new vector editing support in OpenLayers, and talking to them about it. They wanted to see if it would run on their laptop, and we got an error that SVG wasn’t supported in their browser. After a bit of effort once I got home tonight, I was able to find out how to check for SVG 1.0 instead of 1.1 — after modifying the OpenLayers code to do a 1.0 check instead of a 1.1 check, I got confirmation that vector display works on the One Laptop Per Child laptop! Woot. Filed a ticket with a patch for OpenLayers: 540; now I just need to get someone to review it and check it in.
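The check involved is the standard DOM feature test; a rough sketch of the idea (not the actual OpenLayers code — the implementation object is passed in here, and the OLPC stand-in is hypothetical, so the logic can be exercised outside a browser):

```javascript
// Sketch of a DOM feature test for SVG support. In a browser you would
// pass document.implementation; the version string is the key difference:
// the OLPC browser reported SVG 1.0 support but not 1.1.
function supportsSVG(domImplementation, version) {
  return domImplementation.hasFeature("org.w3c.dom.svg", version);
}

// Hypothetical stand-in for the OLPC browser's DOM implementation:
var olpcImpl = {
  hasFeature: function (feature, version) {
    return feature === "org.w3c.dom.svg" && version === "1.0";
  }
};

supportsSVG(olpcImpl, "1.1"); // false — the original 1.1 check failed here
supportsSVG(olpcImpl, "1.0"); // true — the relaxed 1.0 check succeeds
```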

So, the next step is to help the OLPC folks get up and running with OpenLayers. 🙂

OpenLayers Vector Support

Posted in default, Locality and Space, OpenLayers on March 11th, 2007 at 11:05:52

So, last week was an OpenLayers hack week. One of the things that we did was make adding support for new vector formats trivial. Instead of modifying several parts of the code, you only need to create two functions: a ‘read’, which takes a set of data — XML, strings, Javascript object, what have you — and returns a list of OpenLayers.Feature.Vector objects, and a ‘write’ which does the reverse — takes a list of objects and returns a string or object or XML.
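The read/write contract can be sketched with a toy format (this is not the real OpenLayers.Format API — the names and the trivial “TYPE:x,y;x,y” string encoding here are purely illustrative of the two-function shape described above):

```javascript
// Toy vector feature: just a geometry type and a coordinate list.
function Feature(type, coords) {
  this.type = type;
  this.coords = coords;
}

// A minimal format: 'read' turns raw data into a list of features,
// 'write' does the reverse. The "format" here is a made-up
// "TYPE:x,y;x,y" line encoding, one feature per line.
var ToyFormat = {
  read: function (data) {
    return data.split("\n").map(function (line) {
      var parts = line.split(":");
      var coords = parts[1].split(";").map(function (pair) {
        return pair.split(",").map(Number);
      });
      return new Feature(parts[0], coords);
    });
  },
  write: function (features) {
    return features.map(function (f) {
      return f.type + ":" + f.coords.map(function (c) {
        return c.join(",");
      }).join(";");
    }).join("\n");
  }
};

var features = ToyFormat.read("POINT:1,2\nLINE:0,0;3,4");
ToyFormat.write(features); // round-trips to the original string
```

Any real format — KML, GeoRSS, GML — slots into the same two-function shape; only the parsing and serializing inside `read` and `write` changes.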

To prove this, I set out to write some additional vector format support last night. I decided to add one read, and one write.

  • Read: KML. I added support for KML point display in about 20 minutes, including the time to find data and write a demo HTML page loading some example data. Adding LineString support was another 15 minutes.
  • Write: GeoRSS. Support for writing georss:simple points, lines, and polygons was simple… once I found data. I asked for a live example, and was unfortunately unable to find any valid line data outside the GeoRSS website, so I just generated something that was as close as I could come to the examples. I’m lazy, so the export is just RSS 2, and I’m sure that someone will come along and criticize it, but that’s one of the benefits of Open Source: Anyone can offer up a patch. Time from when I created the file stub to when I committed the code was 27 minutes, again, including a demo.

Altogether, the Format support in the new OpenLayers is pretty cool. Because of the way it’s built, I can even do something that is pretty damn ridiculous: Import KML, and export GeoRSS (or GML), all from the browser. Certainly, this is an incredibly crazy thing to do, but OpenLayers is a pretty crazy project.

I’m convinced that there’s nothing in the code that would make it difficult for someone who’s comfortable working with Javascript to write support for any simple-to-parse format. Now, to get the code back to trunk and get the patches rolling in.

From Data To Map

Posted in Locality and Space, OpenLayers, QGIS, TileCache on February 14th, 2007 at 00:47:25

Earlier this evening, Atrus pointed out that DC has a bunch of cool data in their GIS Data Catalog. I decided I would play with it a bit and see what I could come up with.

I grabbed the Street Centerlines, played with it in QGIS to do a bit of cartography, and then (eventually) got it exported to a MapServer .map file (which describes styling info). I was then able to set the file up in MapServer, serve it out to OpenLayers, and then to stick TileCache in the mix. The result isn’t the prettiest thing in the world, but it works.

After going through it once, I decided I’d go through it all again, to see how long it took.

  • 12:15AM: Open Firefox to the DC Data Catalog to find some data to map.
  • 12:16AM: Pick out Structures Polygons.
  • 12:17AM: Download complete, open QGIS
  • 12:18AM: Open file in QGIS
  • 12:19AM: Save QGIS project file, save map file from project file
  • 12:20AM: Copy both shapefile and mapfile to server
  • 12:21AM: Tweak mapfile: adjust PNG output to not be interlaced (for TileCache usage), change background color
  • 12:22AM: Test mapfile in mapserv CGI. Find out I misspelled something, fix it.
  • 12:23AM: Edit TileCache config to add new layer information.
  • 12:24AM: Copy an existing tile URL, ensure that it works in TileCache with the different layer.
  • 12:25AM: Edit OpenLayers config to include additional layer
  • 12:26AM: Edit OpenLayers config to include layerswitcher.
  • 12:27AM: Marvel at the result

In less than 15 minutes I was able to turn a dataset into a browsable, lazily cached, web-viewable dataset, using QGIS, MapServer, OpenLayers, and TileCache. Not bad at all.
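The “lazily cached” part comes down to tile addressing: OpenLayers asks for tiles by zoom/column/row, and TileCache renders each one through MapServer only on first request. A sketch of building a TMS-style tile URL of the kind TileCache serves (host and layer name are made up for the example):

```javascript
// Build a TMS-style tile URL (1.0.0/<layer>/<z>/<x>/<y>.<ext>), the
// addressing scheme TileCache can serve. The host and the
// "dc_structures" layer name are hypothetical.
function tileUrl(base, layer, z, x, y) {
  return base + "/1.0.0/" + layer + "/" + z + "/" + x + "/" + y + ".png";
}

tileUrl("http://example.com/tilecache", "dc_structures", 4, 7, 5);
// "http://example.com/tilecache/1.0.0/dc_structures/4/7/5.png"
```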

Yahoo! Pipes: Callbacks Hooray

Posted in Pipes on February 12th, 2007 at 00:13:27

Edward Ho responds regarding callbacks in Pipes:

Apologies for the inconsistency here. JSON Callback is something we believe in for our APIs and we plan on adding it to Pipes shortly.
Thanks!

Woot.

Pipes: Something working, RSS MetaCarta search output

Posted in MetaCarta, Pipes on February 11th, 2007 at 12:18:04

So, I went through this morning and created GeoRSS output for MetaCarta search results. It seems likely that for the foreseeable future, the way to get data into and out of Yahoo! Pipes is to build a URL-in, RSS-out format. Since we already do KML, adding RSS took about 20 minutes — just changing some tag names, etc.
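The kind of output involved is simple enough to sketch: a GeoRSS-Simple point is just a “lat lon” pair in a single `georss:point` element inside an RSS item. A minimal illustration (the item fields here are invented; the element name follows the GeoRSS-Simple convention):

```javascript
// Emit an RSS 2.0 <item> carrying a GeoRSS-Simple point.
// GeoRSS-Simple encodes a point as "lat lon" in one georss:point element.
function rssItemWithPoint(title, lat, lon) {
  return "<item>" +
    "<title>" + title + "</title>" +
    "<georss:point>" + lat + " " + lon + "</georss:point>" +
    "</item>";
}

rssItemWithPoint("result", 42.36, -71.06);
// "<item><title>result</title><georss:point>42.36 -71.06</georss:point></item>"
```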

After doing that, I started playing with some pipes. After some missteps (dragging modules into the interface seemed not to work in ways that later did, presumably just part of the fun of dealing with these funky interfaces), I ended up with something I think is pretty cool coming out.

  • Take NY Times input. Recent news is relevant and interesting.
  • Pass the NY Times input through the content analyzer. This gets a set of relevant keywords.
  • For each item, take the keywords and pass them through a subpipe…
  • The subpipe goes to the MetaCarta search interface and does a keyword search, taking the session information and bounding box from the text inputs for the pipe, and returning the 5 most relevant items.
  • Add the 5 results to the item, and then send the output to the end of the pipe.

I’ve published the pipe, and after running it you can get the JSON output. (It seems that the sub-attributes are not available in the RSS output, at least as far as I can tell.)
Next steps:

  • Create an OpenLayers interface, and use the current bounding box as the bounding box for the queries.
  • Have the interface set the session and token variables: the pipe currently works with the defaults, but only for the next couple hours, since the tokens are time limited. (Currently, the MetaCarta Web Services are wide open, but eventually, they won’t be.)

The Query MC subpipe is probably useful for anyone who wants to do something like this: I’d recommend checking it out if you’re interested. Next step is to add RSS support to the other services, specifically the GeoTagger. I think that this would let me do what I want: take description text from an RSS item, tag it, and set the location attributes of the returned item.

I’m hoping that someone can answer how to use a callback on JSON output of pipes, since that would let me skip the need for a proxy, and make it entirely client side so that other people could run it on their own sites without a proxy.
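What I’m after is plain JSONP: the service wraps its JSON in a call to a function you name in the request URL, so a dynamically inserted script tag can deliver the data cross-domain with no proxy. A sketch of the client side (the `_callback` parameter name is a guess — services differ on what they call it):

```javascript
// Build a JSONP request URL: ask the service to wrap its JSON output
// in a call to the named function. The query parameter name varies by
// service; "_callback" here is just an assumption.
function jsonpUrl(base, callbackName) {
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_callback=" + encodeURIComponent(callbackName);
}

// In a browser you would then inject a script tag, e.g.:
//   var s = document.createElement("script");
//   s.src = jsonpUrl("http://example.com/pipe.json", "handleResults");
//   document.body.appendChild(s);
// and define window.handleResults to receive the data.

jsonpUrl("http://example.com/pipe.json?id=1", "handleResults");
// "http://example.com/pipe.json?id=1&_callback=handleResults"
```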

Free Maps for Free Guides

Posted in Locality and Space, Mapserver, OpenGuides, OpenLayers, TileCache on February 11th, 2007 at 08:46:05

A bit more than a year ago, when I was just learning how to use the Google Maps API, I put together a patch for the OpenGuides software, adding Google Maps support. It seemed the logical way to go: It wasn’t perfect, since Google Maps are obviously non-free, but it seemed like a better way to get the geographic output from OpenGuides out there than anything else at the time.

Since I did that, I’ve learned a lot. Remember that 18 months ago, I’d never installed MapServer, had no idea what PostGIS was, and didn’t realize that there were free alternatives to some of the things that Google had done. Also, 9 months ago, there was no OpenLayers, or any decent open alternative to the Google Maps API.

In the past 18 months, that’s all changed. I’ve done map cartography, I’ve set up map servers, and I worked full time for several months on the OpenLayers project. Although my direction has changed slightly, I still work heavily with maps on a daily basis, and spend more of my time on things like TileCache, which lets you serve map tiles at hundreds of requests/second.

So, about a month ago, I went back to the Open Guide to Boston, and converted all the Google Maps API calls to OpenLayers API calls. The conversion took about an hour, as I replaced all the templates with the different code. (If I were writing it again, it would have taken less time, but this was my first large-scale open source Javascript undertaking, long before I gained the knowledge I now have from working with OpenLayers.) In that hour, I was able to convert all the existing maps to use free data from MassGIS, rather than the copyrighted data from Google, and to have Google as a backup: a Map of Furniture Stores can show you the difference. You’ll see that there are several layers — a roadmap provided by me, one from Google, and topographic quad charts from the USGS.

It’s possible that some of this could have been done using Google as the tool. There’s nothing really magical here. But now, the data in the guide is no longer displayed by default on top of closed source data that no one can have access to. Instead, it’s displayed on top of an open dataset provided by my state government.

This is how the world should work. The data that the government collects should be made available to the people for things exactly like this. It shouldn’t require a ‘grassroots remapping’: there are examples out there of how to do it right. I find it so depressing to talk to friends in the UK, who not only lack the 1:5000-scale quality road data that Massachusetts provides, but don’t even get TIGER-level data like that used by the geocoder on the Open Guide to Boston.

Free Guides, with Free Maps. That’s the way it should be. The fact that it isn’t everywhere is sad, but at least it’s good to know that the technology is there. Switching from Google to OpenLayers is an easy task — it’s what happens next that is a problem. You need the data from somewhere, and it’s unfortunate that that ‘somewhere’ needs to be Google for so many people. I’m thankful to MassGIS and to the US Government for providing the data I can use, and to all the people who helped me learn enough to realize that using Google for everything is heading the wrong way when you want to not be beholden to a specific set of restrictions placed on a corporate entity.

Yahoo! Pipes: Turning Pipes into Application

Posted in Ning, OpenLayers, Pipes on February 10th, 2007 at 20:29:19

So it seems clear to me that the Pipes application is a step in a really cool direction. I don’t know if there’s anything incredibly innovative in the idea of making programming easy, but Yahoo! has gone a long way towards the goals that other people have put into place. Ning thought that letting people code would be the way forward: give them a sandbox, let them copy and paste, and they’ll build applications. The idea was right: there are a lot more people out there who want to be builders than there are people who already are. It turned out that the people who wanted to be builders didn’t have the skill level they needed to build PHP code, even with mix/match and copy/paste.

Yahoo! Pipes is the followthrough on that idea: make it possible for people to take a set of input, and get a set of output, passing it through multiple filters.

The next step is obvious: Let people turn the filter settings into a web page, with the output being another web page. Search for all content 5 miles from a given Craigslist location: Take the user input as drop down boxes or something in an HTML form, and make the output a Yahoo! Map. Boom: you’ve turned everyone who can create a pipe into a web application builder. Stick ads along the bottom, and you’ve done one of the things that Ning tried to do: make money off applications in the same way that so many have made money off content.

I’m sure that Yahoo! already has this in mind, whether they’ve written about it or done it yet or not. It’s only a matter of time. It does make me wonder if someone could build something that did this without needing Yahoo! to do it… It seems like at the moment it would require altering a pipe on the fly, which I don’t see a way to do, so either there needs to be a further API, or we’ll all just have to wait for it to get done 🙂

Update: Looking today, you can control the input of text inputs from the URL that you fetch the RSS with. This means that I can go ahead and build the pipe thingy for my own pipes as is. That’s pretty cool. I’ll show one with MetaCarta stuff on Monday.

Perhaps I’ll build an OpenLayers based Yahoo Pipe output viewer. It wouldn’t be that different from the GeoRSS viewer… but it would need a way to visualize non-Geo content. Ponder ponder.

Yahoo! Pipes: Make it work at all?

Posted in Locality and Space, Pipes on February 10th, 2007 at 09:53:11

A prize to anyone who can make a simple Atom entry or a simple RSS entry get geocoded by the Yahoo! Pipes Location Extractor. I’ve spent the last 30 minutes on it, and failed.

Non-working pipe is my attempt. GeoRSS works, but location extraction doesn’t.

Update:

  1. Location Extractor seems to work against the HTML pages referenced by the feed, not the content in the feed.
  2. Minor changes to the HTML page seem to break the parsing — it seems to be very targeted towards Craigslist postings. A page with my address but a different map link seems to extract the Sebastopol address, while a page with just a map link doesn’t seem to extract at all.

I guess I don’t need to sell my MetaCarta stock yet… unless I’m way off, this shows that MetaCarta is significantly ahead of the game for extracting locations from unstructured text. Not that this is a surprise to me 😉