
Map of Life and CartoDB

Friday, December 30, 2011

It's been an exciting month for Map of Life! We had a great time at TDWG 2011 in sunny New Orleans, where John Wieczorek and I presented Map of Life's big dream: to use existing maps to make better maps of where species actually are. John and Aaron Steele also presented some radical ideas about hooking CouchDB and CouchApp together to build simple, powerful applications. Their shift in strategy made us wonder whether we could pull off something similar with Map of Life, too.

It was in this frame of mind that we attended Javier de la Torre's demonstration of CartoDB, a Google Fusion Tables-like application for storing and rendering mapping data. The more we saw, the more we liked: it's open source and available on GitHub (http://github.com/vizzuality/cartodb), it's built on PostGIS and PostgreSQL (already our platform of choice), and it incorporates Mapnik, the super-fast tile-rendering engine we discussed in our last blog post. The CartoDB team has also been quick to respond to our requests: just last week they added support for per-request tile styling, an essential feature for the next phase of our development.
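
To make that concrete, here's a rough sketch (in Python, standard library only) of what a per-request styled tile URL could look like. The account name, table name, endpoint layout, and the "style"/"sql" parameter names are our own illustrative stand-ins rather than the exact CartoDB API; the point is simply that the CartoCSS travels with each tile request.

```python
# Sketch: building a tile request that carries its own CartoCSS style.
# Host, path layout, and the "style"/"sql" parameter names are assumptions
# for illustration, not the exact CartoDB tiler API.
from urllib.parse import urlencode

ACCOUNT = "mol"            # hypothetical CartoDB account
TABLE = "species_ranges"   # hypothetical table of range polygons


def styled_tile_url(z, x, y, carto_css, sql=None):
    """Return a tile URL whose styling (and optional row filter) is set per request."""
    params = {"style": carto_css}
    if sql is not None:
        params["sql"] = sql
    return "https://{a}.cartodb.com/tiles/{t}/{z}/{x}/{y}.png?{q}".format(
        a=ACCOUNT, t=TABLE, z=z, x=x, y=y, q=urlencode(params))


if __name__ == "__main__":
    css = "#species_ranges { polygon-fill: #1f78b4; polygon-opacity: 0.6; }"
    url = styled_tile_url(
        3, 2, 3, css,
        sql="SELECT * FROM species_ranges WHERE scientificname = 'Jabiru mycteria'")
    print(url)  # the tiler would render this tile using the style passed above
```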

CartoDB

Over the last two months, we've been moving our map-tiling infrastructure over to CartoDB while continuing to use Google App Engine for indexing and searching. Although there are still a few small glitches to work out before we can claim full success, our system now works in two parts:

1. a set of scripts which upload our data into a CartoDB database; and
2. a frontend which queries that database to create the map we show our users.

In doing so, we've reaped the rewards of a much smaller, simpler code base. Many of the more complicated tasks we used to handle ourselves, such as indexing our attributes or drawing the map layers, are now handled by programs designed for exactly those tasks (PostgreSQL and CartoDB, respectively). So our job has been simplified to doing what we do best: managing the data, combining it easily and quickly in our front end, and analysing it for global patterns on our back end. We'll be working to further simplify the upload process soon (a rough sketch of that side of the system follows below), and we'll be showing off more of our new architecture shortly. Stay tuned!
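
To give a feel for how small the upload side can be, here's a hedged sketch of a loader pushing rows into a CartoDB table through an HTTP SQL endpoint. The endpoint URL, parameter names, input file, and table schema are illustrative assumptions (a real loader would batch rows and POST large geometries rather than send one GET per row), but it captures the shape of the scripts, with PostGIS doing the heavy geometric lifting.

```python
# Sketch: a tiny loader that pushes rows into a CartoDB table over an HTTP SQL API.
# The endpoint URL, the "q"/"api_key" parameters, and the table schema are
# illustrative assumptions; our real upload scripts differ.
from urllib.parse import urlencode
from urllib.request import urlopen

SQL_API = "https://mol.cartodb.com/api/v1/sql"  # hypothetical account + API version
API_KEY = "YOUR_API_KEY"                        # placeholder credential


def insert_range(table, scientificname, wkt_polygon):
    """INSERT one species range; PostGIS parses the geometry and keeps it indexed."""
    sql = ("INSERT INTO {t} (scientificname, the_geom) VALUES "
           "('{n}', ST_SetSRID(ST_GeomFromText('{g}'), 4326))").format(
               t=table,
               n=scientificname.replace("'", "''"),  # naive quoting, fine for a sketch
               g=wkt_polygon)
    return urlopen(SQL_API + "?" + urlencode({"q": sql, "api_key": API_KEY})).read()


if __name__ == "__main__":
    # Hypothetical input: one line per species, "name<TAB>WKT polygon".
    with open("ranges.tsv") as f:
        for line in f:
            name, wkt = line.rstrip("\n").split("\t", 1)
            insert_range("species_ranges", name, wkt)
```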