It is my pleasure to introduce here on Monster Swell a new collaboration and a spectacular piece of work. Arjan Scherpenisse of Miracle Things will be collaborating with us in the field of data visualization.
The TIMEMAPS project written up just before this post is the first of, we hope, many forays into data visualization for Arjan, and we look forward to collaborating on many such projects in the future.
TIMEMAPS visualizes how the map of the Netherlands would look if it were scaled proportionally to the travel times (by train) between cities. I was asked by the designer of the concept, Vincent Meertens of graphsic, to transform his manually crafted PDF files into a real-time, interactive visualization. TIMEMAPS has been exhibited at the Graduation Show during the Dutch Design Week 2011.
The map is interactive in real time. Clicking a city sets the perspective to that city. Hovering over the map shows a pop-up with the time it takes to travel to the city the mouse is currently over. Every coloured “ring” on the map denotes 30 minutes of travel time at the current time.
Drawing the map with Canvas
The visualization is done using the HTML5 canvas. Why canvas, and not SVG, one might ask? Good question: I wanted to learn more about canvas and so was a bit biased. I think the project could have been done with SVG as well.
The map consists of a set of polygons: the outline of the Netherlands and its various islands. All the cities, with their 379 train stations, are located on those shapes. Furthermore, there are several bridges between the islands, like the big “Afsluitdijk”, each of which connects two vertices of the polygons.
The initial, un-transformed shape of the country and the station positions are the same as those on the famous yellow overview map that the NS uses in its stations: a schematic view of the Netherlands, constrained to a grid of 0-, 45- and 90-degree lines.
The drawing algorithm first draws all the polygons and bridges, and subsequently fills those areas with a pattern of colored concentric circles. In canvas this is done by blitting a pre-rendered image of the circles over the drawn shapes using the globalCompositeOperation property. The distances between the circles are scaled to represent 30 minutes of travel time. Then the cities are drawn as big or small dots (main stations are bigger) and connected to the current city by a thin white line.
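As a rough illustration of that masking step, here is a minimal sketch (with assumed names, not the actual TIMEMAPS source): the land shapes are filled first, and the pre-rendered ring pattern is then composited so it only shows up inside them.

```typescript
// Minimal sketch: fill the land polygons, then composite a pre-rendered
// ring pattern so it only appears inside the shapes just drawn.
type Point = { x: number; y: number };

function drawMap(
  ctx: CanvasRenderingContext2D,
  polygons: Point[][],       // outlines of the mainland and the islands (assumed input)
  rings: HTMLCanvasElement   // pre-rendered concentric-circle pattern (assumed input)
): void {
  // 1. Fill every land polygon in a solid colour.
  for (const outline of polygons) {
    ctx.beginPath();
    ctx.moveTo(outline[0].x, outline[0].y);
    for (const p of outline.slice(1)) ctx.lineTo(p.x, p.y);
    ctx.closePath();
    ctx.fill();
  }

  // 2. Composite the ring image on top; with 'source-atop' the rings only
  //    appear where the canvas already has pixels, i.e. inside the shapes.
  ctx.save();
  ctx.globalCompositeOperation = 'source-atop';
  ctx.drawImage(rings, 0, 0);
  ctx.restore(); // back to the default 'source-over' for the dots and lines
}
```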
The information hovers (a plain HTML div) are done by listening to the “mousemove” event on the canvas and calculating which city is closest to the current mouse location. Clicking a city causes the current perspective to shift to the clicked city in an animated fashion, using a simple (cosine) transition.
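In rough TypeScript, those two interactions might look something like this; the names and structure are assumptions for illustration, not the project’s actual code.

```typescript
// Sketch of the hover hit test and the cosine ease used for the transition.
interface City { name: string; x: number; y: number; }

// Find the city closest to the mouse position (squared distance is enough
// for comparison, so no square root is needed).
function nearestCity(cities: City[], mx: number, my: number): City {
  let best = cities[0];
  let bestDist = Infinity;
  for (const c of cities) {
    const d = (c.x - mx) ** 2 + (c.y - my) ** 2;
    if (d < bestDist) { bestDist = d; best = c; }
  }
  return best;
}

// Cosine ease: maps t in [0, 1] to a smooth 0..1 curve, used to interpolate
// between the old and the new perspective city over the animation.
function cosineEase(t: number): number {
  return (1 - Math.cos(Math.PI * t)) / 2;
}
```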
Map deformation
The angles at which cities view each other are kept constant. So, for example, viewed from Rotterdam, Utrecht Centraal always lies at a 45-degree angle, regardless of the time it takes to travel from Rotterdam to Utrecht. The actual city location is scaled proportionally along that angle: if the journey takes less time, the city is pulled closer; if it takes more time, it is pushed further away. But the angle remains constant.
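A minimal sketch of that displacement, assuming a global minutes-to-pixels scale factor and hypothetical names:

```typescript
// Angle-preserving scaling: each city keeps its bearing from the current
// perspective city, but its distance is replaced by one proportional to
// the travel time.
interface Station { x: number; y: number; }

function displace(
  origin: Station,         // the city the map is currently viewed from
  city: Station,           // the station to reposition
  travelMinutes: number,   // travel time from origin to this city
  pixelsPerMinute: number  // assumed global scale (e.g. one ring per 30 min)
): Station {
  const dx = city.x - origin.x;
  const dy = city.y - origin.y;
  const angle = Math.atan2(dy, dx);          // bearing stays fixed
  const r = travelMinutes * pixelsPerMinute; // radius follows travel time
  return {
    x: origin.x + r * Math.cos(angle),
    y: origin.y + r * Math.sin(angle),
  };
}
```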
For the islands, this mesh-stretching was mixed 60%/40% with a simple vertex displacement to prevent the islands from becoming unrecognizable: since there are no stations on the islands, their feature points (cities) lie further away and they are prone to heavier deformation.
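The blend itself is just a weighted average of the two candidate positions; something along these lines (again an illustrative sketch, not the project’s code):

```typescript
// 60/40 blend: island vertices take 60% of the mesh-stretched position and
// 40% of a simpler displaced position, which keeps their shape recognizable.
type Vec = { x: number; y: number };

function blendVertex(stretched: Vec, displaced: Vec, weight = 0.6): Vec {
  return {
    x: weight * stretched.x + (1 - weight) * displaced.x,
    y: weight * stretched.y + (1 - weight) * displaced.y,
  };
}
```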
Problems in the visualization
The shape of the map is sometimes deformed beyond recognition because, in certain cases, cities that are normally close are pushed away beyond cities that are normally far away. This causes the polygon to turn “inside out” and makes cities appear to be located in the sea.
Another issue is the 45-degree grid constraint: the mesh-stretching algorithm does not take it into account because the constraint is applied in a later calculation stage, which sometimes causes cities to end up in the sea as well. A temporary fix was to add more vertices to the polygons so the map had more flexibility while stretching.
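One simple way to add those extra vertices is to split every polygon edge at its midpoint; a hedged sketch of that idea (an assumed illustration, not the project’s actual code):

```typescript
// Subdivide a closed polygon by inserting the midpoint of every edge,
// giving the mesh more points to stretch with.
type Pt = { x: number; y: number };

function subdivide(polygon: Pt[]): Pt[] {
  const result: Pt[] = [];
  for (let i = 0; i < polygon.length; i++) {
    const a = polygon[i];
    const b = polygon[(i + 1) % polygon.length]; // wrap around to close the ring
    result.push(a, { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 });
  }
  return result;
}
```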
Application to other maps
The Netherlands is a pretty ideal country in the way its transportation system is organized: viewed from the center, “de Randstad”, or from Utrecht or Amersfoort, travel times do indeed increase almost linearly with geographical distance. I do not think this holds for every country: especially with the advance of faster railways (the fast Fyra train was not taken into account in our implementation!), the map might deform in ways that are beyond recognition and beyond representation in the 2D domain. It would nevertheless be an interesting experiment to apply the same techniques to a different country.
This article is the second in a series about the TIMEMAPS project. TIMEMAPS’ concept and design are by Vincent Meertens, the implementation is by Arjan Scherpenisse.
Finally got around to going to the AUB Ticketshop at Leidse Square during the daytime to view the Foursquare Display we set up in action (previous blog post).
A video of the screen:
The screen in context:
It is a welcome change from the static posters and static videos that usually litter these high-profile locations. The Foursquare-coloured view of the area is always fresh and shows the local flavour and the people that visit the venues around it.
From an urban development point of view it may be odd to draw more attention to the already highly crowded Leidse Square area. But it stands to reason that new developments such as these will be tested in high-density locations first. We would be very interested in creating augmentations in public space that make locations in Amsterdam’s periphery more appealing.
First: I do not agree with the premise that most apps created in government challenges are quickly abandoned. I have not done a tally of our Apps for Amsterdam contest, but the completeness and polish of most apps submitted was impressive. I still use several of the apps from that contest regularly; Snelstepontje.nl, which tells you which ferry to take, is a godsend, to name just one.
Maintenance is indeed an issue. It is my personal experience that if the app is deployed to a suitably robust platform (Google App Engine is a notable one), it may continue to run unsupervised for many years.
But yes, I do have my own doubts when it comes to the sustainability of apps from app contests as I have stated in my review of Apps for Amsterdam.
Data quality is the largest issue on all levels and it needs to be addressed. From gathering data, to publishing it, to responding adequately to issues. Most datasets that are released for contests are not of the highest quality due to time constraints. And after the contest is over they are seldom kept up to date by the publishing office. When it comes to sustainability, government should first turn to itself and start releasing their data in a way that is sustainable.
Besides releasing the data in a proper format, a very important consideration is licensing. Re-use of data should happen under conditions as liberal as possible (CC0 preferred) so as not to deter companies from investing in using that data.
But even then, creating apps that are successful and sustainable at scale may be too lofty a goal. Productizing apps in a professional way implies conceiving, building and expanding a startup company. If one or more such initiatives come out of a hackathon, that may be called a resounding success. But what of the rest?
Well, communities of practice are built on exactly that: practice. Data does not become readily at hand and usable overnight. It takes a lot of hard work from all of us.
Having organized several hackdays, we are seeing an increase in the number of people attending and in their proficiency, as well as a wider awareness of the possibilities of data in journalism, government and politics. Those are exactly the things we need if we want to make open data (and not just applications) the foundational fabric of our information society.
I have written here before about the need for web developers to learn more about GIS technologies and how to either work with or work around the traditional geographical software packages and data formats. There is a lot of synergy to be achieved in working together.
In the summer lull over at Hack de Overheid we are organizing a day of programming at a fortress which in itself already is a unique event: Apps for Noord Holland. But during the day the people from ESRI will give a workshop about geo data which we think is very worthwhile for any programmer who wants to get started in this field.
So if you want to spend a day on a fortress learning about GIS and programming, go right ahead and register. It promises to be a terrific day.
Somebody brought to my attention again the Foursquare user adoption animation they created in honour of their 10 millionth member. A great achievement for Foursquare and just the beginning of many more awesome things I am sure.
In the animation, if you look at the still at August of 2009, you see the US gaining some traction and this flare across the pond. That is Amsterdam where at the time Foursquare was being adopted hand over fist.
The story behind that is somewhat interesting and has been told before, but this graphic makes it poignant again. Having visited SxSWi that March, Robert Gaal and I saw the launch of Foursquare and quickly got hooked. That was the year location had not been played out yet at all: Latitude was fresh, Fire Eagle was still relevant and Brightkite was being used. Location was on the cusp.
Back in the Netherlands we quickly got in touch with the guys to get the service launched here. We thought waiting would probably result in the Netherlands being served last (as usual). After some back and forth we got everything up and running and Amsterdam was the first international city on Foursquare. The rest is history as can be seen in the graph.
It’s been some time in the making but today we are proud to do a very early beta release of Statlas, the project we have been working on these past months. The Dutch Press Innovation fund funded this project and we collaborated with Fluxility and Alexander Zeh on this version. So please do check out: Statlas
There are several similar tools out there that help you create your own map, but we feel they are not as easy as they should be, and almost all of them are built in Flash. Statlas is built on Polymaps and therefore fully compatible with the open web. Creating a map is as simple as painting by numbers.
Our initial explorations set us on our way to create the easiest and most generative atlas tool we could imagine. Statlas is set up to let you choose a group of regions and, for each of those regions, enter a value (numerical, colour or other) to create a map colouring. That map can then be shared, printed or embedded wherever you want. But anybody can also take a public map and edit it, to improve upon the existing data or to express where they differ from it. It is also possible to export data to CSV, use other tools to collect statistics and re-import them back into Statlas.
Feedback
This initial release is geared towards the Dutch context, as we have been developing it with the Netherlands in mind first. We are going to add more regions quickly and we are soliciting requests for regions you may want to see added. If you have ideas, requests and/or Shapefiles, please send them our way so we can add them.
This is a very preliminary beta release of a functional piece of software. We envision much more data-heavy and live-updating views in the near future, but a project of this scope can balloon too easily. We have heard from no end of people who wanted to use it for one cause or another, and we wanted to show something first. After this release we will see which direction is most in demand.
Tonight the Apps for Amsterdam awards ceremony takes place and stage one of the Dutch open data trajectory will be completed.
Last year at the end of summer I helped Thijs Kleinpaste and Stefan de Bruijn co-author a proposal to sponsor open data within the municipality of Amsterdam. This proposal was accepted nearly unanimously by the commission in November (full write-up) and it started a roller coaster ride for open data in Amsterdam that is now starting to have far wider effects throughout the Netherlands.
Hack de Overheid (Hack the Government), the soon-to-be foundation on whose board I sit, partnered with the City of Amsterdam and Waag Society to realize the competition and a series of events. For us this series culminated in Hack de Overheid #3, an inspiring day and hackathon where over a hundred developers built civic apps.
The completion of the contest tonight, and the sometimes stunning applications submitted to it (many of which display excellence in cartography and visualization), mark another high point I am proud to be a part of.
What’s next?
But as I said, this completes just the first stage of what is bound to be a long and tortuous road. As we speak, local initiatives are being formed to open up data in at least Enschede, Rotterdam, Utrecht, Eindhoven and The Hague. It will be interesting to see what comes out of that, and whether some of the smaller cities may in fact outpace us here in the capital.
But we need to do more. Recent questions about privacy violations in data releases make it more than a little obvious that there is a massive issue in data literacy. I wholeheartedly agree with Adam Greenfield when he says that data and its affordances need to be a core subject from school onwards. We need to explore materials, interventions and processes that allow us to teach data literacy, and that allow others to teach it for us, if we ever want to spread this knowledge at scale.
Literacy is required not only of school children but also of decision makers in business and government if we want to keep the momentum we have right now. Future developments run the risk of being hamstrung by backlashes against the malignant consequences of data, or of open data going unused because the ecosystem is not in tune. There are still lots of issues to be resolved around ownership, privacy, responsibility, licensing and business models.
From a commercial point of view, the sustainability of many of the applications in the contest is doubtful. Creating proof-of-concept apps for the data is more than a good start, but it is by no means enough. The real need is for open yet comprehensive systems where open data is a given. That data needs to be technically excellent and fully engrained in the fabric of our information society, so that everybody can use it to enrich their app, site or discourse. Data owners and producers need to participate, be accountable for their data and accept feedback from the public, both in the specific and in the general case. Such a system cannot simply be built, nor can it be static: it needs to be grown and to evolve continuously. The only thing we can do is plant, nurture and weed.
So tonight will be fun, but let that not distract us from the massive amount of work still ahead. We are ready for it. Will you join us?
The agenda is filling up again just before the summer break. Alper will speak at:
May 24th – Technical review of city dashboard concepts at HvA
A brief bit of teaching with design and technical critique of city visualization dashboards developed by students.
May 25th – Apps for Amsterdam Awards Night
Judging and attending the awards for the Amsterdam open data application contest.
May 27th – What Design Can Do
Presenting an engaged data-centric approach for designers’ benefit (blurb).
Here are the slides for a talk I gave at /dev/haag last Friday, ambitiously titled “Fixing Reality with Data Visualization”, which was well received. I promised to write it up here, so here it is.
Starting off with some introductions. We are Monster Swell; this equation is the central challenge of our practice.
To start with the title inspiration for this talk. I recently finished this book by Jane McGonigal.
“Reality is Broken” by Jane McGonigal recently came out. The title is not really true, but it is quite opportune. Reality isn’t broken, but there is, as always, lots that can be improved. Slapping a gamification label on that is a false exit because it implies that such improvement can be done easily by the magic of games.
The core idea of the book is that:
1. Reality can be fixed by game mechanics (voluntary participation, epic stories, social collaboration, fitting rewards), and
2. That reality should be fixed by game mechanics.
Both of these points, the possibility and the desirability of such a fix, are the subject of fierce debate, both within game design circles and without.
We are now seeing a superficial trend of gamification, badge-ification and pointification, where everybody is rushing to add as many ‘game-like’ features as possible to their application or concept, to look tuned in to the fun paradigm.
Fortunately this does not work. Checking in for points and badges is fun at first, but is hardly a sustainable engagement vector. Foursquare mostly did a bait and switch with their game until they got enough critical mass to be useful along other vectors.
Things that are difficult remain difficult even if they are gamified. ‘An obstacle remains an obstacle even with a cherry on top.’
Ian Bogost terms this exploitationware. Our own discussions concluded that if you are not the one playing, you are being played.
In our practice we look for deeper ways to engage people and affect them. There are hardly any one-to-one mappings to be found and the effects that are most worthwhile are the higher order ones. As Kars Alfrink says:
“We don’t tell them to coordinate, we create a situation within which the way to win is to coordinate.”
Corollary: A game about violence does not immediately make people violent.
But another way of looking at it might render it as a map. The metaphor of men and liberties and territory to occupy already points towards that comparison.
Looking at it in another way it could also be a Cartesian grid with binary data values plotted onto it. A data visualization of a phenomenon we don’t know (yet).
Coming back to the map parallel, this picture of center pivot irrigation systems (by NASA) in Garden City, Kansas looks awfully similar to the goban and this is just an aerial photograph with some processing applied to it.
So to come to this point:
‘Any sufficiently abstract game is indistinguishable from a data visualization.’
The difference is just that a game is a visualization of a game model and its rules. The whole point of playing a game is learning those rules, and uncovering the model of the game is in essence ‘breaking’ it. After that point it usually ceases to be fun.
And its complementary point:
‘Any sufficiently interactive data visualization is indistinguishable from a game.’
And indeed the best ones are highly interactive and offer various controls, abstraction levels and displays of data deep enough to engage users/players for a long time. It is also the reason that in our practice we don’t occupy ourselves much with visualizations in print media.
To continue the point about games: many games are either quite concrete or very abstract simulations. This is most obvious with sim games such as Sim City pictured below.
Simulations are subjective projections of reality, both because of the choices the designer of the simulator has embedded in the projection and because of the interpretation of the player: their ingrained notions shape how they read the simulation.
Ian Bogost (picture) in his book Unit Operations coins a state of being called ‘Simulation Fever’.
Bogost says that all games in some way are simulations, and that any simulation is subjective. The response people have to this subjectivity is one of either resignation (uncritically subjecting oneself to the rules of the simulation, taking it at face value) or of denial (rejecting simulations wholesale since their subjectivity makes them useless). Taken together, Bogost calls these reactions simulation fever. A discomfort created by the friction between our idea of how reality functions and how it is presented by a game system. The way to shake this fever, says Bogost, is to work through it, that is to say, to play in a critical way and to become aware of what it includes and excludes.
I think we could use the correspondence between games and visualizations to coin a corresponding term called Visualization Fever.
Those are my most important points, that good and interesting games and good and interesting data visualizations share many of the same characteristics. We can use data and its correspondence with reality (or lack thereof) to create a similar fever.
(This graphic is somewhat rudimentary but it was made within Keynote in five minutes and I hope it gets the point across.)
The visualization process shares a lot of similarities with the open data process that we are involved in. It is a perpetual conversation and the visual part is only one place where it can be improved. Data collection, discussion on results and errors, sharing of data and the resulting products, controllability of the outputs and being able to remix and reuse them and incorporating this process as feedback back into atoms are all areas that need active participation.
There is nothing easy about this. It is a ton of hard work and long tedious conversations. Fortunately most of it is worth it.
Some examples of visualization fever in action.
Verbeter de Buurt is the Dutch version of See Click Fix and it works admirably. It creates a subjective map of an area with the issues that a group of people have signalled in their neighborhood. Nothing is said about who these people are, or whether these issues are indeed the most pressing ones (we all know the annoying neighbour who complains about dog poo to whoever will hear it). By making issues visible, this map imposes its view of the city onto the councils and effects change.
Planning systems at an urban scale is a very difficult process. These planning stages are being opened up to the general public using consultation and other means but it remains to be seen if and how citizens can comprehend the complex issues that underlie city planning.
One step to help both experts and laypeople come to grips with the city they inhabit is to create macroscopes: views that show the entire scale of a system and everything in it, in such a way that we can make (some) sense of it. These Flowprints by Anil Bawa-Cavia are a great example of doing this for public transportation.
And done right, these visualizations can reveal the systems of the world, or in this case the flow of trains in the Netherlands. Everybody knows how crowded Dutch rail is and which trains go where along which routes, but actually seeing it happen in front of your eyes in a real-time visualization gives you an insight into and a tangible grip on the system that you did not have before.
So what do we fix?
We use visualizations and their compressed interactive views to expose system design choices and errors. They can also be used to give depth to a specific point, something journalists increasingly find necessary. People consuming data-heavy news want to be able to poke at that data themselves.
A lot of visualizations I have seen thus far serve little more than to reinforce pre-existing judgements, almost as if the person creating the visualization sought to build the thing they wanted to see. Visualizations will need to be better, more flexible and draw upon more data if we want to break out of these troughs of shallow insight.
The point, as also stated by the nice people at Bloom, is that having a visualization serve solely as visual output is too limited a use of the interactions created. You should be able to use the same interactions in the visualization to also influence the underlying model, either directly or indirectly. That is to say, the model and the representation should influence each other in both directions.
Planetary, the latest app by Bloom, is a great example of that. It shows you a beautifully crafted astromusical view, but it also allows you to play your music library from within that very same visualization.
We need to bring visualization and deep data literacy to the web and infuse every relevant site and system (that is to say, all of them) with it. Many people asking for data visualization think it is some magical fairy dust that will make a site awesome by its very touch. This is of course not true.
Data and interactive visuals can generate value and insight for any site that employs them properly.
In his presentation Data Visualization for Web Designers, Tom Carden remarks that web developers already know how to do all of this. These are exactly the tools we have been using over the last years to create interactive experiences (and we plan to use them more and more).
Internet Explorer is still the crippled old man of the web, but given understanding clients (and users) and some compatibility layers, you may be able to get away with using a lot of this stuff, as long as the result is awesome enough.
The other trend is the idea that bridges need to be built between web people and GIS people, preferably around how to create GIS-like experiences using the affordances the web requires. It is a trend we had been thinking about that was neatly summarized (blog) by Mike Migurski at a #NoGIS meetup.
GIS people have tremendous tools and knowledge, but they are not accustomed to working in a very web way: quick, usable, beautiful. Web people can build nice sites pretty quickly, but they tend to fall flat when they need to work with geographical tools more complex than the Google Maps API.
If we can combine these two powers, the gains will be immense.
We can create subjective views to exert power upon reality and try to fix things for the better. The subjectivity is not a problem, as often the values embedded in the views are the very point. Subjectivity creates debate and debate moves things forward.
The tools we have to create these views are getting ever more powerful, but there is also a lot of work to be done.
As a wise man said: “The best way to complain is to make things.” (picture)