Tonight the Apps for Amsterdam awards ceremony takes place and stage one of the Dutch open data trajectory will be completed.
Last year at the end of summer I helped Thijs Kleinpaste and Stefan de Bruijn co-author a proposal to sponsor open data within the municipality of Amsterdam. This proposal was accepted nearly unanimously by the commission in November (full write-up) and it started a roller coaster ride for open data in Amsterdam that is now starting to have far wider effects throughout the Netherlands.
Hack de Overheid (Hack the Government), the soon-to-be foundation I’m on the board of, partnered with the City of Amsterdam and Waag Society to realize the competition and a series of events. This series culminated for us in Hack de Overheid #3, an inspiring day and hackathon for over a hundred developers who built civic apps.
But as I said, this completes just the first stage of what is bound to be a long and tortuous road. As we speak there are local initiatives being formed to open up data in at least Enschede, Rotterdam, Utrecht, Eindhoven and The Hague. It will be interesting to see what comes out of that and if some of the smaller cities may in fact outpace us here in the capital.
But we need to do more. Recent questions about privacy violations in data releases make it more than a little obvious that there is a massive issue in data literacy. I wholeheartedly agree with Adam Greenfield when he says that data and its affordances need to be a core subject starting from school onwards. We need to explore materials, interventions and processes that allow us to teach data literacy and that allow others to teach it for us if we ever want to spread this knowledge at scale.
Literacy is required not only in school children but also in decision makers in business and government if we want to keep the momentum we have right now. Future developments run the risk of being hamstrung by backlashes against the malignant consequences of data, or by open data going unused because the ecosystem is not in tune with it. There are still lots of issues to be resolved around ownership, privacy, responsibility, licensing and business models.
From a commercial point of view, the sustainability of many of the applications in the contest is doubtful. Creating proof-of-concept apps for the data is more than a good start, but it is by no means enough. The real need is for open but comprehensive systems where open data is a given. That data needs to be technically excellent and fully ingrained in the fabric of our information society so that everybody can use it to enrich their app/site/discourse. Data owners and producers need to participate, be accountable for their data and accept feedback from the public, both on specific points and in general. Such a system cannot be built once or remain static; it needs to be grown and to evolve continuously. The only thing we can do is plant, nurture and weed.
So tonight will be fun, but let that not distract us from the massive amount of work still ahead. We are ready for it. Will you join us?
The agenda is filling up again just before the summer break. Alper will speak at:
May 24th – Technical review of city dashboard concepts at HvA
A brief bit of teaching with design and technical critique of city visualization dashboards developed by students.
May 25th – Apps for Amsterdam Awards Night
Judging and attending the awards for the Amsterdam open data application contest.
May 27th – What Design Can Do
Presenting an engaged data-centric approach for designers’ benefit (blurb).
Here are the slides for a talk I gave at /dev/haag last Friday, ambitiously titled “Fixing Reality with Data Visualization”, which was well received. I promised to write it up here, so here it is.
Starting off with some introductions. We are Monster Swell; this equation is the central challenge of our practice.
To start with the title inspiration for this talk. I recently finished this book by Jane McGonigal.
“Reality is Broken” by Jane McGonigal recently came out, and while the title is not really true, it is quite opportune. Reality isn’t broken, but there is —as always— lots that can be improved. Slapping a gamification label on that is a false exit, because it implies that such improvement can be done easily by the magic of games.
The core idea of the book is that:
1. Reality can be fixed by game mechanics (voluntary participation, epic stories, social collaboration, fitting rewards), and
2. That reality should be fixed by game mechanics.
Both of these points, the possibility and the desirability of such a fix, are the subject of fierce debate both within game design circles and without.
We are now seeing a superficial trend of gamification, badge-ification and pointification where everybody is rushing forward to add as many ‘game-like’ features as possible to their application or concept to look tuned into the fun paradigm.
Fortunately this does not work. Checking in for points and badges is fun at first, but is hardly a sustainable engagement vector. Foursquare mostly did a bait and switch with their game until they got enough critical mass to be useful along other vectors.
Things that are difficult remain difficult even if they are gamified. ‘An obstacle remains an obstacle even with a cherry on top.’
Ian Bogost terms this exploitationware. Our own discussions concluded that if you are not the one playing, you are being played.
In our practice we look for deeper ways to engage people and affect them. There are hardly any one-to-one mappings to be found and the effects that are most worthwhile are the higher order ones. As Kars Alfrink says:
“We don’t tell them to coordinate, we create a situation within which the way to win is to coordinate.”
Corollary: A game about violence does not immediately make people violent.
But another way of looking at it might render it as a map. The metaphor of men and liberties and territory to occupy already points towards that comparison.
Looking at it in another way it could also be a Cartesian grid with binary data values plotted onto it. A data visualization of a phenomenon we don’t know (yet).
Coming back to the map parallel, this picture of center pivot irrigation systems (by NASA) in Garden City, Kansas looks awfully similar to the goban and this is just an aerial photograph with some processing applied to it.
So to come to this point:
‘Any sufficiently abstract game is indistinguishable from a data visualization.’
The only difference is that a game is a visualization of a game model and its rules. The whole point of playing a game is learning those rules, and uncovering the model of the game is in essence ‘breaking’ the game. After this point it usually ceases to be fun.
And its complementary point:
‘Any sufficiently interactive data visualization is indistinguishable from a game.’
And indeed the best ones are highly interactive and offer various controls, abstraction levels and displays of data deep enough to engage users/players for a long time. It is also the reason that in our practice we don’t occupy ourselves much with visualizations in print media.
To continue the point about games: many games are either quite concrete or very abstract simulations. This is most obvious with sim games such as Sim City pictured below.
Simulations are subjective projections of reality, both because of the choices the designer of the simulator has embedded in the projection and because of how the player’s ingrained notions shape their interpretation of the simulation.
Ian Bogost (picture) in his book Unit Operations coins a state of being called ‘Simulation Fever’.
Bogost says that all games in some way are simulations, and that any simulation is subjective. The response people have to this subjectivity is one of either resignation (uncritically subjecting oneself to the rules of the simulation, taking it at face value) or of denial (rejecting simulations wholesale since their subjectivity makes them useless). Taken together, Bogost calls these reactions simulation fever. A discomfort created by the friction between our idea of how reality functions and how it is presented by a game system. The way to shake this fever, says Bogost, is to work through it, that is to say, to play in a critical way and to become aware of what it includes and excludes.
I think we could use the correspondence between games and visualizations to coin a corresponding term called Visualization Fever.
Those are my most important points, that good and interesting games and good and interesting data visualizations share many of the same characteristics. We can use data and its correspondence with reality (or lack thereof) to create a similar fever.
(This graphic is somewhat rudimentary but it was made within Keynote in five minutes and I hope it gets the point across.)
The visualization process shares a lot of similarities with the open data process that we are involved in. It is a perpetual conversation and the visual part is only one place where it can be improved. Data collection, discussion on results and errors, sharing of data and the resulting products, controllability of the outputs and being able to remix and reuse them and incorporating this process as feedback back into atoms are all areas that need active participation.
There is nothing easy about this. It is a ton of hard work and long tedious conversations. Fortunately most of it is worth it.
Some examples of visualization fever in action.
Verbeter de Buurt is the Dutch version of See Click Fix and it works admirably. It creates a subjective map of an area with the issues that a group of people have signalled in their neighbourhood. Nothing is said about who these people are or whether these issues are indeed the most pressing ones (we all know the annoying neighbour who complains about dog poo to whoever will listen). By making issues visible, this map imposes its view of the city onto the councils and exerts change.
Planning at an urban scale is a very difficult process. These planning stages are being opened up to the general public through consultation and other means, but it remains to be seen if and how citizens can comprehend the complex issues that underlie city planning.
One step to help both experts and laypeople come to grips with the city they inhabit is to create macroscopes: views that show the entire scale of a system and everything in it at once, in such a way that we can make (some) sense of it. These Flowprints by Anil Bawa-Cavia are a great example of doing this for public transportation.
And done right, these visualizations can reveal the systems of the world, or in this case the flow of trains in the Netherlands. Everybody knows how crowded Dutch rail is and which trains go where along which routes, but actually seeing it happen in front of your eyes in a real-time visualization gives you an insight into and a tangible grip on the system that you did not have before.
So what do we fix?
We use visualizations and their compressed interactive views to expose system design choices and errors. They can also be used to give depth to a specific point, something which journalists increasingly find necessary. People consuming data-heavy news want to be able to poke at that data themselves.
A lot of visualizations I have seen thus far serve little more than to reinforce pre-existing judgements, almost as if the person creating the visualization sought to build that which they wanted to see. Visualizations will need to be better, more flexible and draw upon more data if we want to break out of these troughs of shallow insight.
The nice people at Bloom make a similar point in their brief: having a visualization serve solely as a visual output is too limited a use of the interactions created. You should be able to use the same interactions in the visualization to also influence the underlying model, either directly or indirectly. That is to say, the model and the representation should influence each other bidirectionally.
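To make that bidirectional coupling concrete, here is a minimal sketch in Python. The `Model` and `View` classes are hypothetical illustrations of the pattern, not anything from Bloom’s actual work: the model pushes changes to the view, and interacting with the view writes back into the model.

```python
class Model:
    """Holds the data and notifies listeners when it changes."""

    def __init__(self, value):
        self._value = value
        self._listeners = []

    def subscribe(self, fn):
        self._listeners.append(fn)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        for fn in self._listeners:  # model -> view
            fn(new)


class View:
    """Renders the model and pushes interactions back into it."""

    def __init__(self, model):
        self.model = model
        self.rendered = None
        model.subscribe(self.render)
        self.render(model.value)

    def render(self, value):
        self.rendered = f"[{value}]"

    def interact(self, new_value):
        self.model.value = new_value  # view -> model


m = Model(1)
v = View(m)
v.interact(5)       # manipulating the view updates the model...
print(m.value)      # 5
print(v.rendered)   # [5]  ...which re-renders the view
```

The same observer wiring scales up: in an interactive visualization the “render” step is the drawing code and the “interact” step is whatever direct manipulation the user performs.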
Planetary, the latest app by Bloom, is a great example of that. It shows you a beautifully crafted astromusical view, but it also allows you to play your music library from within that very same visualization.
We need to bring visualization and deep data literacy to the web and infuse every relevant site and system (that is to say, all of them) with it. Many people asking for data visualization think that it is some magical fairy dust that will make a site awesome by its very touch. This is of course not true.
Data and interactive visuals can generate value and insight for any site that employs them properly.
In his presentation Data Visualization for Web Designers, Tom Carden remarks that web developers already know how to do all of this. These are exactly the tools we have been employing over the last years to create interactive experiences (and we plan to use them more and more).
Internet Explorer is still the crippled old man of the web, but given understanding clients (and users) and some compatibility layers, you may be able to get away with using a lot of this stuff as long as the result is awesome enough.
The other trend is the idea that bridges need to be built between web people and GIS people, preferably by creating GIS-like experiences with the affordances that the web necessitates. A trend we had been thinking about was neatly summarized (blog) by Mike Migurski at a #NoGIS meetup.
GIS people have tremendous tools and knowledge, but they are not accustomed to working in a very web way: quick, usable, beautiful. Web people can build nice sites pretty quickly, but they tend to fall flat when they need to work with geographical tools more complex than the Google Maps API.
If we can combine these two powers, the gains will be immense.
We can create subjective views to exert power upon reality and try to fix things for the better. The subjectivity is not a problem, as often the values embedded in the views are the very point. Subjectivity creates debate and debate moves things forward.
The tools we have to create these views are getting ever more powerful, but there is also a lot of work to be done.
As a wise man said: “The best way to complain is to make things.” (picture)
For the Amsterdam UIT Bureau and I Amsterdam we created this Foursquare map, designed to display nightlife activity around the Leidseplein (entertainment) area with recent check-ins, specials, the current mayor and photographs of a selected group of venues. We strongly believe in creating autonomous displays that take cues from the environment —in this case using Foursquare— and deliver clear actions to the audience, as well as a sense that the area they are in is alive and all they have to do is go out and connect to it.
Technically we used Foursquare’s OAuth2 API, which is outstanding. To be able to share one token across all requests we employ a file-based PHP cache that relays the necessary requests for us. The main technology was created in collaboration with Panman Productions.
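The actual relay was written in PHP, but the idea is simple enough to sketch in a few lines of Python: cache one OAuth2 token in a file, reuse it for every request, and refresh it when the cache goes stale. The file location, TTL and `fetch_new_token` stand-in are all hypothetical; a real version would perform the OAuth2 exchange against Foursquare’s token endpoint there.

```python
import json
import os
import time
import urllib.parse
import urllib.request

CACHE_FILE = "/tmp/foursquare_token.json"  # hypothetical cache location
TTL = 3600  # treat the cached token as stale after an hour


def fetch_new_token():
    # Placeholder: a real implementation would do the OAuth2
    # client-credentials exchange here and return the access token.
    return "EXAMPLE_TOKEN"


def get_token():
    """Return the shared token, refreshing the file cache when stale."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            cached = json.load(f)
        if time.time() - cached["fetched_at"] < TTL:
            return cached["token"]
    token = fetch_new_token()
    with open(CACHE_FILE, "w") as f:
        json.dump({"token": token, "fetched_at": time.time()}, f)
    return token


def relay(path, **params):
    """Proxy an API request for the display, attaching the shared token."""
    params["oauth_token"] = get_token()
    query = urllib.parse.urlencode(params)
    url = f"https://api.foursquare.com/v2/{path}?{query}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

Because every request on the display goes through `relay`, only the server ever sees the token, and the token exchange happens once per TTL rather than once per screen refresh.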