Review of the Visualizing Europe Conference

The power and potential of data visualization from a European perspective was the core topic of Visualizing Europe, a one-day conference organized by Visualizing.org, Information Aesthetics and the Open Knowledge Foundation. With a speaker list that left no wishes open, an intimate and focused audience, and stellar organization, the event was an overall success.

The Sessions

Introduction

Adam Bly of Seed Media Group welcomes the audience and gives a brief introduction to Visualizing.org, a place to publish, share and discuss visualizations. Visualizing.org strives to give professionals like teachers or journalists access to high-quality visualizations under a Creative Commons license. Following Adam, Greg Farret of GE, a partner of both Visualizing.org and this very conference, explains GE’s interest and engagement in the field of data visualization.

“One of the toughest challenges we face today is communication. Getting the message across.”

Session 1: The Power and Potential of Data Visualization

Santiago Ortiz, Bestiario (Spain)
Santiago starts by presenting an interactive species tree visualization and then moves on to introduce the audience to a tool they’ve been working on called Impure. It’s a free, web-based visualization tool that lets non-programmers create, share and publish data visualizations. To explain the concepts and workflow of Impure, Santiago creates a simple Twitter visualization on the fly.

Moritz Stefaner, Well-formed Data (Germany)
Moritz shows us two of his most recent client projects — Notabilia for Wikipedia and the Better Life Index for the OECD. He gives us an insight into his workflow:

“All my work starts with data and I cannot get started without it. I quickly prototype different visualization methods to see what’s viable and useful.”

Moritz discusses a new trend he sees in visualization: the idea of remixing existing works. As data sources, and oftentimes also the source code, are open, fellow designers and developers can step in to re-imagine and re-create. The second thought he shares with us is the concept he calls “full circle”: getting data back from the usage of visualizations to inform design decisions in future refinements.

Enrico Bertini, Fell in Love with Data (Germany)
Enrico stresses the importance of bridging the gap between practitioners and academia, as he feels that some people dismiss what scientists have studied over the past 25 years. He shows some scientific work and argues that the contained data relies on visualization to be understood by the researcher. He moves on to explain that, although only a small subset of the population finds visualization indispensable, a) these are still millions of people and b) they are working on solutions to problems that benefit the whole world, like making our cities safer or curing our diseases.

“Data visualization is not useful, it’s indispensable.”

David McCandless, Information is Beautiful (United Kingdom)
David disagrees with Moritz, stating that he is not looking for 1,000 stories but for one meaningful story told well. He uses the metaphor of photography: visualization serves him as a new type of camera that gives a specific look at a situation. He sees the task of storytelling as “relaxing and letting the data unfold in people’s minds”. David also calls for more play in visualization to find new ways of doing things.

Q&A

Question by Paul Kahn to Moritz Stefaner:
What gives you the idea that you’ve created something people will understand? Does the development process include any testing by end users to validate usability?

Moritz admits that he chooses to come up with a solution without giving too much attention to the target audience. He focuses on the underlying data and on conveying it in a readable and understandable manner, but doesn’t include any user testing in his approach.

Session 2: A Vision for Europe

Gregor Aisch, Driven by Data (Germany)
Gregor’s talk focuses on the topic of open visualization as the sum of a) open source code, b) openness to different data sets and c) openness to the community. Open visualization translates to a fair share of knowledge, as producing complex visualizations is cost-intensive and thus might not be feasible for everyone. It also benefits the sustainability of visualizations, as projects can be forked and advanced by everybody. He mentions some shortcomings of popular projects in this regard: for ManyEyes, development has slowed down and the community seems rather inactive; Gapminder is perceived as being too closed; and Wordle lacks further improvements. In a quirky encounter, a representative of the agency that produced a visualization critiqued by Gregor as being too closed stepped up and mentioned that the source code of the Flex-based project is openly available.

“We can’t imagine how people will use our visualizations, thus we should make them as open as possible.”

Assaf Biderman, MIT (USA)
Assaf challenges us to think of a city as a ubiquitous computer up in the air, as we all carry around tiny terminals called smartphones. How do we compute it? Who should program it, and with what technology?

“A form of self-governance comes into place as we, as developers and designers, can analyze and interact with city structures of all kinds.”

To underline how this can be done, Assaf shows a few projects from MIT’s Senseable City Lab, like the Copenhagen Wheel, Trash Tracking and Live Singapore.

Salvatore Iaconesi, Art Is Open Source (Italy)
The first project Salvatore talks about is Squatting Supermarket, an augmented reality shopping experience. It relates goods to their impact on the part of the environment that is of personal interest to the user, along the lines of: “tell me what your favorite beach is, and I’ll tell you how this shampoo is polluting this beach”. He shows some more projects by his organization FakePress, which develops new publishing models and editorial projects enabled by new scenarios in technology, productivity and contemporary culture. At this intersection, FakePress creates a range of interesting and innovative solutions.

Peter Miller, ITO (United Kingdom)
Peter gives the audience an overview of the possibilities of the OpenStreetMap project as a visualization tool for geographic information. Examples include public transportation, flight patterns, electricity grids and street conditions. After just a few years, the user-generated data on OpenStreetMap is just as good as data published by the UK government.

Q&A

Question by me, Benjamin Wiederkehr, to Assaf Biderman:
How can anybody who is not from MIT, but wants self-governance in the way they interact with their environment, get access to detailed data that is not yet publicly available?

Having a university behind your efforts certainly helps a lot, but it hasn’t always been easy for MIT to convince its partners to share their data. He suggests clearly showing them your intentions and the desired outcome of your project, and talking to them about the benefits and value of your work for them.

Session 3: Where Do We Go From Here?

In the closing round with panelists from the European Commission and the OECD, Adam Bly asks them to share their key takeaways from today’s conference and how these could influence their day-to-day work in the near and distant future.

Jean-Claude Burgelman, European Commission (Belgium)
Jean-Claude mentions using these tools both to illustrate complex information and to generate new knowledge by combining parts that could not be combined before. We need to view this in a global perspective and convince stakeholders to open up architectures.

Franco Accordino, European Commission (Belgium)
Franco even sees this time as a turning point in history, as we as users can access so much information and work with it. In the public sector, this area has to obey certain regulations, and it’s crucial that these regulations are set with expert knowledge and reason.

Toby Green, OECD (France)
Elements of trust, as we have them in academia, are very important to establish in visualization as well. Making data and its visualizations citable and linkable will be mandatory.

The Conclusion

The group of speakers, the intimacy and focus of the audience and the networking opportunities made this an excellent event. One point of criticism that I discussed with the organizers was the limited amount of time reserved for the third session. I believe that influencers like Franco Accordino, Jean-Claude Burgelman and Toby Green are of crucial importance for moving our field forward. It would have been great to hear more about the needed regulations and existing limitations for visualization in politics and economics. And once more, kudos to the folks at Visualizing.org for the stellar organization and treatment.

If you want to reread the Twitter backchannel, I have curated a list with the most telling tweets here.

Give Feedback

  • http://twitter.com/moritz_stefaner Moritz Stefaner

    Hi,

    great summary! For the record and with respect to: “Moritz admits that he chooses to come up with a solution without giving too much attention to the target audience. He focuses on the underlying data and on conveying it in a readable and understandable manner, but doesn’t include any user testing in his approach.”, I know I gave a sort of cheeky answer in the original discussion, but would like to expand it here:
    In fact, I do think a lot about the target audience, and always try to make things work for them. I just don’t think that applying user testing necessarily will take you there. As stated, I am quite interested in analysing how people actually use the tools I build, and, quite often, have corrected my position in how to design interaction details (like, for instance, the mixer in the OECD tool) based on user feedback. But you have to be very clear about what you are testing, and what you expect from testing. And, in visualization, it is very difficult to test prototypes at an early stage, as data and design and code have to come together nicely to actually be able to test anything meaningful. The second thing one should keep in mind is that visualization is not always merely a tool to be evaluated based on its utility, but a cultural artifact as well. So in a usability study, you could only capture a fraction of the nature of, for instance, the Notabilia visualization. Heck, I wouldn’t even know what to test about this one.

    • http://benjaminwiederkehr.com Benjamin Wiederkehr

      Hey Moritz,

      I didn’t intend to put you in the wrong light here. I tried to capture the essence of your answer and of course, there are more sides to this. Especially the tongue-in-cheek beginning of your answer couldn’t possibly be captured in written words — it was ingenious!

      Thanks for your great explanation of your process and your view on usability testing for visualizations in general. This could be a topic for a whole other discussion.