Data is Beautiful Conference
Data from the view of Society, Science, Art, Design and Technology
2–6 October 2012, Budapest

The catalogue documents the event series of the Data is Beautiful project, held in Budapest on 2–6 October 2012. The main aim was to create a national and international platform among Hungarian educational institutions, institutes and organizations to collaboratively investigate one of the most timely and significant phenomena of digital culture: data and data visualization.


Table of contents

Introduction
Participants

Science Section
Zoltán Kolláth
András Sik
Márton Deák
Krisztián Bodzsár
Irén Bálint
Ákos Kereszturi, Henrik Hargitai and Zsófia Merk
Mátyás Gede

Art, Design, Technology
Patricio Davila
Bettina Schülke
Dóra Fónagy
Mag. art. Harald Moser
Nina Czeglédy and André P. Czeglédy
Zsófia Ruttkay

Society
Noémi Alexa
Hackathon

Photos
Credit, impressum


Introduction
Nina Czeglédy¹ and Eszter Bircsák²

¹ Senior Fellow, KMDI, University of Toronto; Adjunct Associate Professor, Concordia University, Montreal; Senior Fellow, Hungarian University of Fine Arts
² Freelance curator, co-founder of Kitchen Budapest, MER

Data Visualization at the intersection of Art, Science, Society, Design and Technology

The fundamental aim of the Data is Beautiful project is to create an international and national platform, in partnership with Hungarian educational institutions and international organizations, to investigate and present data as a contemporary entity.

Thanks to rapidly developing technologies and the influx of new concepts, innovative design techniques, novel approaches, structures and working methods, data visualization offers brand new potential, whether in visual, sonic or interactive form. Beyond aesthetic functions, demands for effectiveness, accessibility, social innovation, transparency and usefulness challenge the experts who have developed database visualization and who now make the available tools usable for professionals and the general public.

Simultaneously with this process, the characterization of Data Visualization requires interdisciplinary re-definition, as well as clarification of its terminology for the general public. Data Visualization has been practiced for decades by financial experts, engineers and research scientists (for example in astronomy and biology); it is only in recent years, however, that database-driven Data Visualization has become noticeable in the Arts. Globally, new university departments, new research opportunities and new visualization curricula have been established in academia – mostly at students' demand.

It is interesting to note that while most of the hardware and basic software techniques used are similar in all disciplines, the motivations and outcomes differ entirely across the visualization community. The intent of this project is to explore and foreground these seldom-presented intersections and connections within the framework of the three-day Data is Beautiful events. The aim is to introduce and dissect questions and problems from professional viewpoints, thus creating a dialogue between presenters and event participants.

The ultimate goal of the project is to generate collaboration, strengthen participation and create broad visibility for the wide variety of questions concerning data visualization. The program therefore emphasizes knowledge transfer throughout all events and the ensuing publication, in order to obtain a comprehensive interdisciplinary overview of data visualization.

In conclusion, it is important to emphasize that the numerous possibilities in these fields inevitably lead to new professional opportunities for the emerging generation – the expert professionals of the future.


Participants

The primary goal of the selection of international and local guests was to include mature and emerging researchers, artists and professional experts with a broad overview of the theme.

1. Science section:
+ Zoltán Kolláth, Visualization and sonification of stellar data (MTA KTM CSKI)
+ András Sik, GIS-based visualization of the surface morphology on Mars by satellite imagery integration (ELTE Dept of Physical Geography)
+ Márton Deák, The colorful hyperspectrum (ELTE Dept of Physical Geography)
+ Krisztián Bodzsár, Visualisation of the radio frequency spectrum (NMHH)
+ Irén Bálint, Electromagnetic field-strength prediction and visualisation of radio coverage (NMHH)
+ Ákos Kereszturi, Henrik Hargitai, Landing site visualization (Konkoly Astronomical Observatory - ELTE Cosmic Materials Space Research Group)
+ Mátyás Gede, Visualization methods of the spatial distribution of geotagged photography (ELTE Department of Cartography & Geoinformatics)

Workshop section: Visualisation of the radio frequency spectrum

2. Art, Design and Technology section:
+ Harald Moser, media artist, Ars Electronica Center, Linz
+ Patricio Davila, Assistant Professor, Faculty of Design, OCAD University, Toronto
+ Bettina Schuelke, Austrian artist and PhD researcher at the University of Lapland
+ Zsófia Ruttkay, founder of the Creative Technology Lab at the Media Institute, MOME University of Art and Design, Budapest
+ Dóra Fónagy, founder of DEFO Labor Budapest

3. Society section:
Keynotes:
+ Noémi Alexa, director, Transparency International Hungary
+ Zsuzsa Szvetelszky, sociologist

Hackathon on corruption visualization. Partner: Transparency International.


Science Section

Pulsating stars: visualization and sonification
Zoltán Kolláth, MTA CSFK Konkoly Observatory

Abstract

Pulsating variable stars play an important role in astrophysics. The mechanism of stellar oscillations is, in principle, very simple: it is similar to the propagation of acoustic waves in organ pipes. However, the interaction of stellar vibration modes makes the stellar "sounds" quite complex. The basically simple pulsation mechanism has turned out to be very rich in dynamical features, so special ways of visualization and sonification are crucial to understanding the underlying processes.

Introduction

In astronomy and astrophysics, our knowledge is in most cases based on image-like observations. However, these empirical data usually represent only a projection or some mapping of the real physical processes, which are hidden or obscured by other events and objects. One of the real arts of astrophysics is to reconstruct the fundamental principles and processes of celestial bodies from the available observations and to provide the real "image" behind theories of planets, stars, galaxies and their systems. Data are obtained from images, then the data are transformed into different types of images. This complex sequence of analysis and visualization provides the final way of understanding the processes behind the images. Photograph-like observations are formed not only by optical radiation but by the whole range of wavelengths of electromagnetic radiation, from radio waves to gamma rays. In some cases even acoustic waves help us to map on astronomical scales. It is generally accepted that before hydrogen and helium recombined from their ionized states in the early history of our Universe, acoustic waves were able to propagate in the plasma. These acoustic waves caused the patterns found both in the Cosmic Microwave Background and in the large-scale structures we see in the Universe today.

Another analogy can be found in the stars. Imagine that, by listening to a radio broadcast and then analyzing the sound of an acoustic musical instrument, one tries to reconstruct the shape of a trumpet or a horn played far away in the radio studio. If a perfect microphone was used during the recording and all the information in the audio signal is transmitted by radio waves, then a complete audio 'fingerprint' of the musical instrument can be recorded at the receiver. Then, at least in principle, information on the instrument itself can be deduced. Interestingly, the same idea is used in asteroseismology to determine the inner structure of stars.

Important diagnostic methods are based on acoustic waves. A 3D spatial image of a fetus in the womb can be obtained using ultrasound. Some animals, like bats and dolphins, use a similar technique, echolocation, to obtain an image of their surroundings. Similarly, acoustic oscillations inside stars provide a great deal of information about the physical processes of celestial bodies. These sounds can be detected from the brightness variation of pulsating stars. Like earthquake vibrations, the acoustic oscillations of stars make it possible to perform seismic studies of the internal structure of heavenly bodies.


Astronomy, science and music

The miracles of the Universe and the beauty of the night sky have always inspired composers. The music of the spheres was one of the first concepts in which mathematical considerations were applied to music. This idea is credited to Pythagoras, as is the connection of the musical scale to the overtone structure of strings. In Ptolemy's cosmology, the stars and planets were attached to crystal spheres, which rotated slowly in the heavens (Figure 1). As they moved, their rotation produced melodies and harmonies, which humans might possibly learn to hear. Thus the Music of the Spheres came into our language, first as an exact literal description of a physical process, and then as a metaphor for something unutterably beautiful and perfect. Later, when Kepler found that the motion of planets is not a combination of circular orbits on crystal spheres but single elliptical trajectories, he dropped the original idea of the music of the spheres. In Keplerian cosmology there are no voices of crystal spheres, but the idea of connecting music to astronomy survived: Kepler used the changes in the velocity of the planets as a model for tunes (Figure 2).

Figure 1. Motion prescribed by the simple "rolling" of spheres within each other produces a complex motion. This was the basic idea of Ptolemy's cosmology of planetary motion, but it also gave a physical sense to the Music of the Spheres.

The explanation of the rhythmic light-change of the stars was only revealed in the 20th century: as a consequence of the periodic contraction-expansion and pulsation of the star, its size and surface temperature change, which causes alterations in the energy of the light emitted. For the observer on Earth the star is a point without extent; nevertheless we see its pulsation through its light changes. This is an interesting chain of mappings which, once understood and inverted, can further our understanding of the inner structure of stars.

Essentially, the rhythmic changes of stars are due to the fact that acoustic waves travel through all matter and gas. In some of the stars these sounds may be kept in constant "humming" by the flow of energy from the core to the surface. Of course we are incapable of hearing these sounds, as they do not travel through the vacuum between the stars and the Earth, not to mention that their pitch is substantially below the range of the human ear. The sounds are nevertheless mapped into the changes in the light of the stars, as observed above. In the end, using multiple transformations in which we must also use the physics of stars, we are able to survey their inner structure.

The above procedure is a nice scientific tool, but we can go one step further. The observations of stellar oscillations can be made audible and/or turned into geometric shapes and forms. The sonification of stellar variations has turned out to be a nice tool to demonstrate astronomical processes. By speeding up the observed light variation data, these distant infra-bass sounds become audible. The voices of stars are astonishingly interesting, providing a wide range of sounds from a simple buzz through colorful noises to strange, polyphonic booming. Going beyond just listening, these stellar audio sources can be treated as virtual musical instruments, giving a new sense to the Music of the Spheres. The visualization of stellar oscillations and their transitions provides additional possibilities for the interaction between science and art. Animations and sound effects have already been used as ambient background in a fine art exhibition. New observations, data and models provide further temptation for exciting adventures on the frontiers between art and science.
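The speed-up sonification described here is easy to prototype. A minimal sketch, assuming a uniformly sampled light curve; the synthetic flux series, the 30-minute cadence and the output file name are illustrative stand-ins, not the Stellar Music Project's actual pipeline.

```python
# Sketch: replay a slow stellar light curve at audio rate to make it audible.
import numpy as np
from scipy.io import wavfile

cadence_s = 1800.0        # assumed sampling: one brightness point every 30 minutes
audio_rate = 44100        # playback rate in Hz

# Hypothetical stand-in for an observed light curve: a fundamental mode with a
# period of about half a day, plus a weak subharmonic (period-doubling flavor).
t = np.arange(200000) * cadence_s
f0 = 1.0 / (0.5 * 86400.0)
flux = np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(np.pi * f0 * t)

# Playing one sample per 1/44100 s speeds the signal up by a factor of
# 1800 * 44100, lifting the ~23 microhertz fundamental to roughly 1.8 kHz.
signal = flux - flux.mean()
signal /= np.abs(signal).max()
wavfile.write("star.wav", audio_rate, (signal * 32767).astype(np.int16))
```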

Figure 2. Musical scores of the planets from Kepler’s “Harmonices Mundi”


Parallels between science and art have always helped in expressing new scientific ideas to non-scientists. Projects based on such parallels also offer a great opportunity to narrow the gap between the sciences and the humanities. The connection of music and astronomy is straightforward after Ptolemy's and Kepler's ideas, but the same interaction can be found in other disciplines too: the cooperation of a musician and a biologist resulted in the 'sonification' of proteins (Dunn & Clark 1999), which was fruitful for both fields and for the publicity of DNA science as well. Biologist Mary Anne Clark and algorithmic artist John Dunn work at the molecular scale, using protein and gene sequence data from humans and other species as the basis of the musical scales for their songs. In 1998 they published an audio CD titled 'Life Music', a collection of pieces based on data from a variety of proteins.

It is straightforward to turn the signal of radio telescopes into audible sound. Fiorella Terenzi went one step further, transferring astrophysical information into soundscapes and music. She introduced the concept of 'Acoustic Astronomy', an experiment that allows us to transform radiation from space into sound. The transformation is based on the notational similarity between radiation and musical notes: both are characterized by intensity, frequency and temporal evolution.

These fusions between science and music also reflect a close relationship that has long been in existence. It is well known that human arts have always drawn inspiration from the natural world. Furthermore, there is an even more physical connection between musical instruments and some of the celestial bodies: the trapped acoustic waves in stars.

Figure 3. A star and a trumpet. Virtual image of a wind instrument designed to mimic the overtone structure of a pulsating star.

Parallels: Pulsating stars and musical instruments

Pulsating stars provide a novel parallel between astrophysics and musical acoustics. In contrast to the earlier ideas, where the connection was somewhat arbitrary (planetary motion or radio signals), here the interior physics of stars can be used as a starting point. The acoustic waves inside stars are analogous to the waves inside organ pipes or other wind instruments. Stars are complex objects with nuclear energy production, radiation transfer coupled with hydrodynamics, and so on, which means that stellar oscillations are likewise multifaceted. To understand the collection of effects involved in stellar pulsation, we first have to simplify our models. In the simplest model, only very small oscillations are assumed. Even in this case, the possible frequencies of stellar overtones are given very precisely. If motion is allowed only in the radial direction, and only adiabatic variations are assumed, then in the linear limit (very small amplitude oscillations) the pulsation can be described with a relatively simple partial differential equation, which has a form very close to that of the Schrödinger equation. This simple model is mathematically and physically analogous to the physics of organ pipes or wind instruments (Buchler, Yecko & Kolláth 1997). The possible frequencies of a wind instrument depend on the variation of its cross-section along its length. The analogy between the equations that connect the possible sounds of wind instruments to their shapes and Schrödinger-like equations is very well known in the study of wind instruments (Morse & Ingard 1968; Benade 1973). Then, based on the analogies among wind instruments, the Schrödinger equation and simplified stellar pulsations, the cross-section of an equivalent wind instrument, a "stellar trumpet", can be designed for a given star. Figure 3 shows a virtual stellar trumpet. We should emphasize that the analogy is not only mathematical but also physical, since acoustic waves have the same properties in both systems.
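To make the wind-instrument side of this analogy concrete, here is the textbook reduction for a duct of slowly varying cross-section S(x), known as Webster's horn equation (cf. Morse & Ingard 1968); this is a sketch of the general principle, not the authors' exact stellar formulation:

```latex
% Webster's horn equation for the acoustic pressure p(x) at angular frequency \omega:
\frac{1}{S}\frac{d}{dx}\!\left(S\,\frac{dp}{dx}\right) + \frac{\omega^{2}}{c^{2}}\,p = 0
% The substitution \psi = p\sqrt{S} removes the first-derivative term and yields
% a Schrodinger-like equation, with the shape of the bore acting as the potential:
\frac{d^{2}\psi}{dx^{2}} + \left[\frac{\omega^{2}}{c^{2}} - V(x)\right]\psi = 0,
\qquad V(x) = \frac{\bigl(\sqrt{S}\bigr)''}{\sqrt{S}}
```

Solving the eigenvalue problem for a given V(x) gives the admissible frequencies; inverting it, one can design a bore shape whose overtones mimic those of a star, the "stellar trumpet" of Figure 3.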

One of the major differences between the stellar trumpet and a real trumpet is the sound generation mechanism. While the valve mechanism of the lips is extremely important for the sound generation and the quality of sound in wind instruments, we use only the characteristics of the acoustic cavity (resonator) of the stellar trumpet. In stars the sound generation mechanism is located inside the cavity (due to the interaction of the waves and the radiation of energy).


Figure 4. Transition from simple to complex. A periodic signal can be represented by a closed loop, indicating the self-similarity of the oscillation: it repeats itself along the same loop. Period doubling results in a double loop, and an additional period doubling makes the loop 4-folded, but still strictly repetitive. The sequence of period doublings may lead to chaos, where the cycles never return to their initial state, but some level of regularity and predictability remains in the variation.

Sounds produced by modern musical instruments – guitars, for example – consist of multiple tones arranged in specific ratios, and these amalgamations give each instrument its distinct musical character. But the tones from stars don't have any of these familiar ratios, so virtual musical instruments based on stars have a unique, unusual sound. Using numerical models of pulsating stars and sound synthesis, stellar music can be composed with physical and logical fidelity to the underlying relations (Kolláth & Keuler 2006). The adoption of stellar sounds to construct musical structures and to create musical compositions poses many questions and, aesthetically, provides a new challenge. The results are musical compositions that sound appropriately alien and eerie. Three musical pieces were composed by Jenő Keuler and the author of this paper: 'Stellar Music No 1', 'Stellar Music No 2' and 'Stellar Sound Spaces'. These compositions can be listened to and downloaded from the 'Stellar Music Project' homepage: http://www.konkoly.hu/stellarmusic/

From simple to complex: challenges for visualization

One of the main stellar instruments used in the composition 'Stellar Music No 1' was based on a series of numerical models. While only a few selected partial tones can be observed in real stars, computer models of their physics give the full details of the spectrum. For harmonic spectra, the frequency ratio of consecutive overtones monotonically decreases with the order of the overtone. However, there is a feature visible in the shape of the "stellar trumpet", a narrowing part at the neck, which distorts the frequency structure of the overtones of the stars. If these vibrations are transposed to the audible range with the same ratio, then sounds with different pitches have different timbres. These changes in the structure of the overtones are well represented by avoided crossings of the eigenvalues (the possible frequencies) of a star. The reason for this feature is that, in addition to the normal overtones of the star, there exists a so-called strange mode overtone. In the prediction of these modes the parallel between stellar and musical acoustics helped a lot: it confirmed that the effect is not a numerical artifact but is based on real physical processes.

The existence of these strange modes was predicted by theory more than a decade ago (Buchler, Yecko, Kolláth 1997; Buchler, Kolláth 2001), but their observational confirmation has turned out to be very hard. On their own, these strange modes have only a very small effect on the light variation of stars, hardly visible next to the normal variations. However, when a star 'plays' this note together with the main mode of the pulsation, the sound behaves quite differently. This effect was found in the precise observations of the Kepler Space Telescope, whose measurements have brought about a golden age of asteroseismology (see e.g. Cowen 2012).


Figure 5. Transitions of the return map. From top left: periodic, double periodic, high order resonance (17P repetition), resonance, resonance with second period doubling, chaotic arches mixed with resonance.

The key to confirming the existence of the strange mode is period doubling, a phenomenon often observed in dynamical systems. This effect has been found in stellar models and in actual pulsating variables in recent decades. Period doubling means that the observed quantity of the system alternates between a high-amplitude cycle and a low-amplitude cycle. Dynamical systems such as the simple Rössler oscillator are typically capable of period-doubling bifurcation, and through a series of bifurcations called the Feigenbaum cascade they can evolve to chaotic behaviour (Figure 4). Period doubling was not expected in the case of RR Lyrae type variable stars (one of the best known types of pulsating stars; they have a period of about half a day and are important for the cosmic distance scale and for testing stellar evolution). But alternating maxima in the variations of RR Lyrae stars have indeed been found in Kepler observations. Our theoretical understanding of the musical acoustics of pulsating stars helped us again, and the puzzle of the observed complex behaviour was solved (Kolláth, Molnár and Szabó 2011) as a 9:4 resonance of the fundamental mode and the 9th overtone.

The hidden interaction of stellar sounds not only results in period doubling; in parallel, the stability of the new state of the main, fundamental sound changes significantly as well. It turns out that if a chord of three tones is "played" in a star, the behaviour of the system can be very complex. Really unusual events can happen then: e.g. two pitches can synchronize in a period-7 sequence. Then 14 periods repeat themselves, and this repetition time is just 19 times the period of the overtone. It is a resonance that is almost impossible to find in nature. While the detection of period doubling is quite simple if the right visualization tool is used, like the phase-space plots shown in Figure 4, such plots cannot reveal the real nature of these three-mode interacting systems. Periodic variations are transformed into loops in the phase space; complex variations, however, result in very complex and obscured structures of loops. The plot has to be simplified further to make the "essence" visible. One possibility is to take a section of the loops with a plane: a single loop then becomes a point, a period-doubled loop becomes a pair of points, and so on. A special way to draw such images is to plot the maxima of the variation against the consecutive maxima – these are the so-called return maps. Many chaotic signals appear in this plot as a set of points populated along a simple curve, like a parabola, but after magnification these seemingly simple lines display a fine structure of curves. If two pure (strictly periodic) sounds are played simultaneously, the phase-space plot is a curve on a three-dimensional torus. Even this simple case can be hard to recognize in a phase plot, but the return map again simplifies it to a set of points on a single closed loop. If the dataset is long enough and the periods are incommensurable, the points fill the loop; a resonance between the two pitches, however, results in a fixed number of points along the loop. All the above scenarios appear in an RR Lyrae model sequence as the parameters are changed, indicating the really complex music of these stars. In Figure 5 these transitions appear as simple structures, and with this visualization important underlying processes of pulsating stars can be recognized that would otherwise stay hidden in the raw data.
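The return-map construction described above is easy to reproduce. A minimal sketch, assuming nothing about the stellar models themselves: the classic Rössler oscillator stands in for a pulsating star, its successive maxima are extracted, and each maximum is plotted against the next.

```python
# Sketch of a return map: plot each local maximum of a signal against the next.
# The Rossler system and its parameters are illustrative stand-ins for the
# hydrodynamical stellar models discussed in the text.
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp
from scipy.signal import argrelmax

def rossler(t, v, a=0.2, b=0.2, c=5.7):      # classic chaotic parameter set
    x, y, z = v
    return [-y - z, x + a * y, b + z * (x - c)]

sol = solve_ivp(rossler, (0, 1000), [1.0, 1.0, 0.0], max_step=0.01)
x = sol.y[0][20000:]                         # drop the initial transient

maxima = x[argrelmax(x, order=5)[0]]         # values of the local maxima
plt.plot(maxima[:-1], maxima[1:], ".", markersize=2)
plt.xlabel("$m_n$")
plt.ylabel("$m_{n+1}$")
plt.title("Return map of successive maxima")
plt.show()
```

A strictly periodic signal collapses to a single point on this plot, a period-doubled one to two points, and chaos to the kind of thin curved structures described in the text.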


Conclusion

Co-operation between artists and scientists, especially between musicians and stellar astrophysicists, can be profitable. The sound examples of stellar processes and the compositions are already in use in public-relations activities. They have been presented in Hungarian and BBC television programs, used as film music (http://whentheworldturnedround.com/) and at a fine arts exhibition in the Kiscell Museum in the summer of 2006 (Csutak 2011), and in many public talks all around the world. An astronomer interested in music can search for ever newer interesting types of stellar sounds. Parallels between disciplines help us to solve complex puzzles of natural processes. The picture of the possible interactions of stars has changed fundamentally. But the pulsation itself raises many interesting questions, such as the long-term amplitude modulation of the variation of most RR Lyrae stars. This modulation, the Blazhko effect, has been known for more than 100 years, but its exact mechanism is still a mystery. Perhaps one day stellar acoustics will help solve this puzzling mechanism. Finally, we hope that the Stellar Music Project can be extended with more novel astrophysical ingredients and new musical compositions.

References

Benade AH (1973) Physics of brasses. Sci. Am. 229:24-35 (reprinted in: Kent EL (ed) Musical acoustics: Piano and wind instruments. Dowden, Hutchinson & Ross, Stroudsburg, PA, 1977)

Buchler JR, Yecko PA, Kolláth Z (1997) The nature of strange modes in classical variable stars. Astronomy & Astrophysics, 326:669-681

Buchler JR, Kolláth Z (2001) Strange Cepheids and RR Lyrae Stars. The Astrophysical Journal, 555:961-966

Csutak M (2011) A nulla megközelítése / Die Annäherung an die Null / Approaching Zero, Beke L (ed) Budapest, MTA MKI

Cowen R (2012) Kepler’s surprise: The sounds of the stars. Nature 481:18-19

Dunn J, Clark MA (1999) Life Music: The Sonification of Proteins. Leonardo, 32, 25

Kolláth Z, Keuler J (2006) Stellar acoustics as input for music composition. Musicae Scientiae, Special Issue 2005-2006, 161

Kolláth Z, Molnár L, Szabó R (2011) Period-doubling bifurcation and high-order resonances in RR Lyrae hydrodynamical models. Monthly Notices of the Royal Astronomical Society, 414:1111-1118

Morse PM, Ingard KU (1968) Theoretical acoustics (McGraw-Hill: New York)


GIS-based visualization of the surface morphology on Mars by satellite imagery integration
András Sik, Assistant professor, Department of Physical Geography, Institute of Geography and Earth Sciences, Eötvös Loránd University

Abstract

Instead of telescopes, modern investigation of Mars is carried out by space probes. Optical satellite images, digital elevation models and other types of planetary datasets acquired via remote sensing techniques by orbiter instruments are accessible in public internet databases, and can thus be integrated and analyzed with geoinformatical software – but many technical problems have to be solved first.

Introduction

As part of my ongoing research work I have completed the morphological analysis of two groups of ice-related Martian slope features, focusing on 8 different study areas on the surface of Mars (Sik 2011).

One group contains the periglacial debris aprons of mid-latitudinal regions, developed in the last few hundred million years as a result of the slow downslope movement and plastic deformation of rock-ice mixtures with a cemented inner structure. They are fossil or inactive landforms, indicating climatic conditions in a past period of the planet's evolution different from the current surface environment.

The other group consists of dark slope streaks observed on the dune fields of the sub-polar Martian regions. These are the signatures of pore-volume-filling downslope seepage of at least a microscopic quantity of liquid interfacial H2O, appearing temporarily during the spring-summer melting of the water ice contained in the near-surface material; they are therefore areas of active seasonal surface changes.


Data sources

For these 8 selected regions I searched for and downloaded all the available optical satellite images, digital elevation models (DEM), spectrometer-based temperature values and other kinds of spatial datasets, coming from 10 different instruments on 4 orbiter space probes (Table 1).


Table 1. Characteristics of the most important instruments onboard recent and current orbiters around Mars

Figure 1. The same area visualized by images of different field resolution – a) MGS MOC from 2001 with 3 meter/pixel field resolution; b) MRO HiRISE from 2011 with 0.25 meter/pixel field resolution (figure by Sik A.)

Figure 2. The same area visualized by different digital elevation models – a) MGS MOLA from 2001 with 300 meter/pixel field resolution; b) MEX HRSC from 2006 with 50 meter/pixel field resolution; c) MRO HiRISE from 2011 with 1 meter/pixel field resolution (figure by Sik A.)


The most important property of these data sources is the level of detail, called field resolution, which is expressed in meter/pixel. This improved significantly over the last orbiter missions (Figures 1-2), causing some difficulties with the image co-registration process. In addition, the optical instruments' footprint size (Figure 3) and global coverage differ as well (Figure 4).

Figure 3. Comparison of typical footprint size and pixel number/area for different optical images (figure by Sik A.)

Software

During several years of planetary data handling I have gained practical experience with a wide variety of desktop software products related to remote sensing, digital image processing and geoinformatics (GIS). As an almost complete list, I have used the following computer programs for different tasks in my data processing workflow:

• Adobe Photoshop;
• CAT for ITT VIS ENVI;
• ESRI ArcGIS Desktop version 9.0;
• ESRI ArcGIS Desktop version 9.2;
• ESRI ArcGIS Desktop version 9.3;
• ESRI ArcGIS Desktop version 10.0;
• ESRI ArcView version 3.3;
• GIMP;
• Golden Software Surfer version 8.0;
• IDRISI for Windows;
• Leica Geosystems ERDAS Imagine;
• NASAView.

Figure 4. Correlation of maximum field resolution with global coverage for different optical images (figure by Sik A.)


Finally I concluded that, in general, the ESRI ArcGIS Desktop software (Figure 5) is the most appropriate for my research purposes, because it is a robust and versatile environment which natively supports the planetary ellipsoid of Mars (and those of some other Solar System bodies as well), recognizes some raster formats used by NASA data archives, and is ideal for database handling (van Gasselt, Nass 2011).

Figure 5. Mars-GIS, slope category map, high resolution image footprints and cross section of a study area in the graphic user interface of ESRI ArcGIS Desktop 9.3 software’s ArcMap module (figure by Sik A.)

Mars-GIS standards

Difficulties of Martian orbiter data integration primarily stem from the diverse spatial reference systems applied by the different instruments' science teams (Sik, Kereszturi 2006). Until the recent past they used several ellipsoids, latitude definitions (Figure 6), longitude measuring directions and map projections (Timár et al. 2005) as well (Figure 7). Fortunately, in the last decade these discrepancies faded considerably (Duxbury 2002), and a very beneficial standardization took place (Seidelmann et al. 2005). As a result, some very important standards became consolidated in the field of Mars-GIS (Table 2).

Figure 6. Difference between latitude definitions – a) Planetocentric; b) Planetographic (figure by Sik A.)

Figure 7. Preferred projections – a) Equirectangular; b) Polar Stereographic; c) Sinusoidal (figure by Sik A.)


Table 2. The most important standards of Mars-GIS

File format parameters

The Martian data sources, for example the NASA Planetary Data System (PDS), primarily offer raster-based data files, which can have a wide range of format parameters. The most frequent file extensions are .IMG (ERDAS type or NASA PDS type), .TIF, .JPG and .JP2 (JPEG2000), plus the "geo-aware" versions of some of them, which contain spatial reference and extent information internally, like GeoTIFF and GeoJP2.

But in addition to the file extension, many other technical details – for example binary layout, byte order, header information structure, number of bands, pixel depth and processing level – influence the opening method of these raster datasets as well.

Combined data search

For any chosen area on the surface of Mars it can be very useful to search all the available data files through a single data mining interface. Based on my personal experience, combined data search is most effective with the following applications/web services:

• Google Earth (version 5.0 and above): http://www.google.com/earth;
• Mars Image Explorer (Arizona State University): http://viewer.mars.asu.edu;
• Mars Orbital Data Explorer (also containing the global coverage footprint collections for the different instruments' high resolution images): http://ode.rsl.wustl.edu/mars;
• NASA Planetary Image Atlas: http://pds-imaging.jpl.nasa.gov/search/search.html#QuickSearch;
• Webmap of all Mars images (Arizona State University): http://themis.asu.edu/maps.

Data integration

For morphological and morphometrical analysis the integration of different Martian data sources into a geoinformatical system is essential. Therefore, in the following I present a step-by-step "user manual" for this, based on the ESRI ArcGIS Desktop 9.3 software environment (a small open-source sketch follows after the list):

• opening a new Data Frame in ArcMap, changing the store option for path names to relative;
• setting the spatial reference system of the Data Frame;
• creating a new File geodatabase in ArcCatalog, into which every raster-, vector- or table-format data source should be loaded, except the extremely large MRO HiRISE images;
• using the MGS MOLA DEM as the basemap (in recent years the MOLA spatial reference system became a widely accepted standard in the world of Mars-GIS);
• changing the Data Frame's display units to "Decimal Degrees" (for easier orientation on the surface);
• setting the central meridian of the Data Frame (0° is the practical choice);
• in ArcCatalog, assigning its own spatial reference system to every data file;
• integrating the global coverage footprint collections of the different data sources;
• based on these, selecting and downloading all available data files for the investigated areas;


• if needed, georeferencing the downloaded raster data files, following the MGS MOLA DEM → MEX HRSC → MRO CTX → MGS MOC NA → MRO HiRISE sequence;
• performing the planned analysis and storing the results in the File geodatabase;
• preparing map layouts, harmonized with planetary nomenclature (Hargitai 2006).
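The workflow above is specific to ArcGIS. As a rough open-source cross-check, one step of it (deriving a slope-category raster from a DEM, as shown in Figure 5) can be sketched with GDAL and NumPy; the file name, cell size and category boundaries below are placeholders, not values from the paper.

```python
# Hedged sketch: slope categories from a Mars DEM, outside ArcGIS.
# "mola_dem.tif" is a hypothetical GeoTIFF export of an MGS MOLA tile.
import numpy as np
from osgeo import gdal

ds = gdal.Open("mola_dem.tif")
dem = ds.GetRasterBand(1).ReadAsArray().astype(float)   # elevations in meters
cell = ds.GetGeoTransform()[1]                          # pixel size in map units

# Elevation change per meter along rows (y) and columns (x), then slope angle.
dz_dy, dz_dx = np.gradient(dem, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Reclassify into a slope-category map; the boundaries are illustrative.
bins = [5, 15, 25, 35]
categories = np.digitize(slope_deg, bins)               # 0..4, gentle to steep
```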

Application example 1: Three-dimensional visualization

If the geoinformatical software has a three-dimensional module, the results of the performed analysis can be visualized in perspective view as well. ESRI ArcGIS Desktop 9.3 has the ArcScene module for this functionality, which can produce a very spectacular three-dimensional representation (Figure 8).

Figure 8. Three-dimensional visualization of a study area (the same as shown in Figure 5.) with approximately 1:3 vertical exaggeration, in the graphic user interface of ESRI ArcGIS Desktop 9.3 software's ArcScene module (figure by Sik A.)

The three-dimensional visualization is a particularly useful method in the morphological analysis of high resolution images, mainly for evaluating their topographical context (Figure 9). For example, it is possible to drape a lower resolution optical image over the digital elevation model with a top layer of contour lines (Figure 10), or with a top layer of high resolution optical images (Figure 11). Furthermore, it can be used for the visualization of volumetric calculations as well – which is needed to estimate the total H2O-ice content of a periglacial landform (Figure 12).
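The volumetric calculation mentioned above reduces, in its simplest form, to summing DEM heights above a base level inside the landform outline. A minimal sketch, assuming the DEM array, a boolean boundary mask, the base level and the cell size are already prepared; this mirrors the logic of Figure 12 rather than reproducing the author's exact procedure.

```python
# Hedged sketch of a landform volume estimate from a DEM.
# `dem`, `mask`, `base_level` and `cell` are assumed, pre-prepared inputs.
import numpy as np

def landform_volume(dem: np.ndarray, mask: np.ndarray,
                    base_level: float, cell: float) -> float:
    """Volume (m^3) between the DEM surface and a horizontal base level,
    restricted to pixels inside the landform boundary mask."""
    height = np.clip(dem - base_level, 0.0, None)   # ignore cells below base
    return float(height[mask].sum() * cell * cell)

# Cf. Figure 12: e.g. apron volume = volume inside the combined outline
# minus the vertical-wall volume of the table mountain.
```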

Figure 9. Three-dimensional visualization of a study area by a height-colored digital elevation model, with approximately 1:3 vertical exaggeration (figure by Sik A.)


Figure 10. Three-dimensional visualization of an eroded table mountain and a periglacial debris apron around it, based on its digital elevation model, a lower resolution color optical image and a top layer of contour lines draped over it, with approximately 1:3 vertical exaggeration (figure by Sik A.)

Figure 11. Three-dimensional visualization of an eroded table mountain and a periglacial debris apron around it, based on its digital elevation model, a lower resolution color optical image and a top layer of high resolution optical images draped over it, with approximately 1:3 vertical exaggeration (figure by Sik A.)


Figure 12. Volumetric calculation for an eroded table mountain and a periglacial debris apron around it – a) boundary line of the periglacial debris apron on the three-dimensional model of the area; b) separation of the table mountain from the periglacial debris apron and the total estimated volume of the combined landform; c) base level contour of the table mountain; d) vertical-wall model of the table mountain and its estimated volume; e) vertical-wall model of the periglacial debris apron and its estimated volume (figure by Sik A.)

Application example 2: Change detection between years

Monitoring of high resolution images from different years of a sub-polar dark dune area (Figure 13) shows 40-50% spatial reoccurrence of dark slope streaks, which resemble seepages (Kereszturi et al. 2009).
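One plausible way to compute such a reoccurrence figure, as a sketch: given two co-registered boolean masks of the streaks (the green and blue pixels of Figure 13), measure the share of the 2001 streak area that recurs in 2003. Whether the cited 40-50% was derived exactly this way is an assumption.

```python
# Hedged sketch of the streak-reoccurrence measure between two years.
# `mask_2001` and `mask_2003` are assumed co-registered boolean arrays.
import numpy as np

def reoccurrence(mask_2001: np.ndarray, mask_2003: np.ndarray) -> float:
    overlap = np.logical_and(mask_2001, mask_2003)  # the red pixels of Figure 13f
    return overlap.sum() / mask_2001.sum()

# A result of 0.4-0.5 would correspond to the reported 40-50% reoccurrence.
```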

Figure 13. Comparison of dark slope streaks in a dark dune area between 2001 and 2003 – a) position of overlapping high resolution images; b) subset of the 2001 image; c) subset of the 2003 image; d) dark slope streaks (green) of the 2001 image; e) dark slope streaks (blue) of the 2003 image; f) overlapping dark slope streaks (red) of the 2001 and 2003 images (figure by Sik A.)


Figure 14. Comparison of dark slope streaks in a dark dune area during northern spring and summer on Mars (Ls: 0-180°) – a) most of the slope area is covered with a CO2-ice layer; b) 22 sols later the dark slope streak in box 1 has elongated; c) after a further 11 sols a new dark slope streak has appeared in box 2; d) after a further 16 sols the slope area is less than half-covered with the CO2-ice layer and dark slope streaks proliferate; e) in the summer period the CO2-ice layer is gone and dark slope features are not visible (figure by Sik A.)


Application example 3: Observation of seasonal changes

Monitoring of high resolution images from different Martian days (sols) during spring and summer for a sub-polar dark dune area shows the elongation of dark slope streaks and the formation of new ones (Figure 14), which can be interpreted as melting of the H2O-content in the near-surface dune material (Kereszturi et al. 2010).

Conclusions

From the Martian surface evolution perspective, the most important conclusion of this ongoing research work is that the development of the periglacial debris aprons and of the dark dune fields' dark slope streaks can be explained by the same process: the partial melting of the H2O-content in the shallow subsurface layers. But the periglacial debris aprons are large, ancient landforms, while the slope streaks are smaller, recent features. Therefore both can be considered reliable locations and reachable sources of the frozen and/or liquid H2O reservoirs to be found on the planet.

Besides, the three examples presented above clearly demonstrate that GIS-based visualization of orbiter instruments' data sources can be effectively applied in Martian morphological analysis and morphometrical research as well. Therefore Mars-GIS can be considered the planetary extension, and a promising future sub-discipline, of traditional physical geography.

Acknowledgement

This ongoing research work is supported by the Bolyai János Research Grant of the Hungarian Academy of Sciences.

References

Duxbury T. C. et al. (2002): Mars Geodesy/Cartography Working Group Recommendations on Mars Cartographic Constants and Coordinate Systems. ISPRS Archives/IAPRS Proceedings of Commission IV. Symposium – Geospatial Theory, Processing and Applications 34(4).

van Gasselt S., Nass A. (2011): Planetary mapping – The data model's perspective and GIS framework. Planetary and Space Science 59(11-12):1231-1242., doi:10.1016/j.pss.2010.09.012.

Hargitai H. (2006): Planetary Maps: Visualization and Nomenclature. Cartographica 41(2):149-164., doi:10.3138/9862-21JU-4021-72M3.

Kereszturi Á., Möhlmann D., Bérczi Sz., Gánti T., Kuti A., Sik A., Horváth A. (2009): Recent rheologic processes on dark polar dunes of Mars: Driven by interfacial water? Icarus 201(2):492-503., doi:10.1016/j.icarus.2009.01.014.

Kereszturi Á., Möhlmann D., Bérczi Sz., Gánti T., Horváth A., Kuti A., Sik A., Szathmáry E. (2010): Indications of brine related local seepage phenomena on the northern hemisphere of Mars. Icarus 207(1):149-164., doi:10.1016/j.icarus.2009.10.012.

Seidelmann P. K. et al. (2005): Report of the IAU/IAG Working Group on Cartographic Coordinates and Rotational Elements: 2003. Celestial Mechanics and Dynamical Astronomy 91(3-4):203-215., doi:10.1007/s10569-004-3115-4.

Sik A. (2011): REMOTE SENSING AND MORPHOLOGY: GIS-based integration of orbiter datasets for the investigation of ice-related slope features on Mars. PhD Dissertation, Budapest, 171 p.

Sik A., Kereszturi Á. (2006): A Mars felszínalaktani vizsgálata űrfelvételek alapján. Geodézia és Kartográfia LVIII:12-20.

Timár G., Székely B., Molnár G. (2005): Definition of a map grid system for minimum distortion representation of the topography of the planet Mars. Geophysical Research Abstracts 7 #EGU05-A-04931.


The colorful hyperspectrum
Márton Deák, Eötvös Loránd University, Institute of Geography and Earth Sciences, Department of Physical Geography

Abstract

Hyperspectral remote sensing is one of the fastest-improving methods for land cover analysis. It allows us to collect extremely detailed information about the quality of vegetation, rocks or soils. The obvious way to visualize these data would be through different maps, but there are more diverse, spectacular methods which make this "colorless", infrared spectrum of light very colorful.

Introduction

One of the most rapidly advancing fields in remote sensing today is "hyperspectral remote sensing", also referred to as "imaging spectroscopy". It helps us to collect information about the surface of the Earth, for example by separating different plant species or discovering pollutants in the soil. Sunlight reflects differently from different surface types depending on their physical-chemical attributes. Polluted soils have a different reflectance compared to non-polluted soils, just as limestone and basalt or the coniferous Austrian pine (Pinus nigra) and the deciduous sessile oak (Quercus petraea) do (Hargitai et al. 2006).

The look of the resulting data is also very important, because hyperspectral remote sensing offers many ways of data visualization which can be very colorful – even from an artistic point of view.

Remote sensing and the range of light

According to the Hungarian Institute of Geodesy, Cartography and Remote Sensing, remote sensing can be defined as follows: "Through remote sensing we can collect information about an object (or its surface), while we do not maintain direct contact with it. With remote sensing we analyze mainly the surface of the Earth." We can define two types of remote sensing: active, where the instrument itself sends out a signal and measures the differences between the sent and the reflected signal; and passive, where the sensor only works with incoming information, usually reflected electromagnetic radiation.

The spectrum of electromagnetic (EM) radiation can be separated into segments depending on wavelength: gamma rays, X-rays, ultraviolet, visible light, infrared, microwave and radio waves. Visible light lies between (approximately) 380 nm and 740 nm – although this varies a bit from person to person (Alpern et al. 1965). Multi- and hyperspectral remote sensing mainly operates in the range between 740 nm and 8000 nm (from the near infrared to the thermal infrared), where photons are also present but at wavelengths our eyes cannot detect.

Hyper- and multispectral sensors work with these defined wavelengths of light. They measure the intensity of one selected wavelength at many points of the Earth's surface, creating a raster image from the values measured at each point. Depending on the resolution of the output image, the data set will contain information about the reflectance of the surface over the same area as the resolution (for example 2x2 m – or 2 m/pixel – is frequent for aerial photographs, and 30 m/pixel is common for satellite images).


Figure 1: Comparison of a multispectral (A) and a hyperspectral (B) reflectance curve corresponding to the same area. The X axis shows the band numbers and the Y axis the intensity values. Figure: M Deák

It is possible to take several overlapping raster images and stack them on top of each other if images at different wavelengths are available. As a result, we will have several values at different wavelengths corresponding to one pixel – and therefore to one coordinate on a map. Connecting the values of every overlapping pixel creates a curve. It visualizes the data on a graph called the "spectral reflectance curve" (Figure 1B). This graph is the main source for data processing, usually with different multidimensional mathematical approaches (Kozma and Berke 2010).
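A minimal sketch of this stacking-and-sampling idea; the wavelength grid and the random stand-in bands are hypothetical, and the point is only the cube layout and the per-pixel spectral curve.

```python
# Sketch: stack co-registered single-wavelength rasters and read one pixel's
# spectral reflectance curve. Random arrays stand in for real band images.
import numpy as np
import matplotlib.pyplot as plt

wavelengths = np.linspace(420, 2300, 220)                # nm, illustrative sampling
bands = [np.random.rand(100, 100) for _ in wavelengths]  # one 2-D raster per band

cube = np.stack(bands, axis=-1)     # shape (rows, cols, n_bands)
row, col = 42, 17                   # one pixel = one map coordinate
spectrum = cube[row, col, :]        # its spectral "fingerprint"

plt.plot(wavelengths, spectrum)
plt.xlabel("wavelength (nm)")
plt.ylabel("reflectance")
plt.show()
```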

Hyperspectral data and resolution

The visualization methods discussed later apply not only to hyperspectral images (as the title suggests) but also to multispectral data. The difference between them is spectral resolution – basically the interval of the samples taken from the reflectance spectrum with respect to the covered range. The same reflectance curve can look very different at lower spectral resolution: hyperspectral data has higher spectral resolution (200-300 bands for the 420-2300 nm range), while multispectral data has lower (7-10 bands for the same spectral range, Figure 1A). Coarser multispectral data is still sufficient for most analyses (Móricz et al. 2005) – for example to differentiate coniferous and deciduous plants – but it doesn't allow the collection of detailed information about mineral composition or water content.

One of the most challenging problems in evaluating the spectra is their purity. Spectra collected through aerial photographs or satellite images make it almost impossible to gain pure information: the collected reflectance curve will be a mixture of the reflectance of every surface cover type present in the area represented by the pixel (for example soil, vegetation and rocks). For accurate analysis a laboratory spectrophotometer is needed (Figure 2). It measures the reflectance of a small sample with high precision together with high spectral resolution.

In conclusion, the higher the spatial resolution is, the purer the analyzed spectrum will be. Accordingly, hyperspectral aerial photos have purer spectra, while the result from satellite images is usually a very mixed one – but this also depends on the analyzed area. The reflectance spectrum of a large, homogeneous area (for example a desert or a large plantation) looks similar on a 2 m/pixel resolution aerial photograph and a 30 m/pixel satellite image – taking into account the atmospheric effects on the latter.

Figure 2: A Shimadzu UV-1601 laboratory spectrophotometer. Figure: Shimadzu


Visualization

While it would be obvious to create maps from this type of data, the best way to use them is for composite, mixed-color images based on the three primaries.

Color mixing

The human eye works in a similar way to a computer monitor. Our retina has photoreceptors – called "cone cells" – which allow the discrimination of different wavelengths of light. Light at a wavelength around 430 nm is blue, around 540 nm green and around 580 nm red – of course these values are just the sensitivity peaks of the cone cells; they can perceive other wavelengths as well. For example, when the light sensed by our cone cells consists of 50% 430 nm blue, 0% 540 nm green and 50% 580 nm red light, our brain mixes them and creates a purple color.

It is important to note that not every species has trichromatic vision. Cats and dogs have two types of cone cells, so they have bichromatic vision. Biologists have also discovered that some animals have five types of cone cells (giving them some vision in the UV and infrared spectrum), so they have in fact pentachromatic vision (Sabbah et al. 2010).

The image displayed by a computer is created in almost the same way: we mainly use RGB images, which means that one displayed image on the screen consists of three overlapping raster images. The first contains the red, the second the green and the third the blue intensity values. The displaying software then mixes these values just like the human brain does and assigns the proper color to each pixel.

This method enables us to create different composite satellite/aerial images using different overlapping raster images (so-called bands) as components. If we create an RGB composite image where we assign the red wavelength to the red component, the green to the green and the blue to the blue, the resulting image will be a so-called "true color" image – similar to what we would see with our own eyes. If we mix them up a little, or use wavelengths our eyes can't detect, we create "false-color" images.
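A hedged sketch of building such composites: any three co-registered bands are stretched for display and assigned to the R, G and B channels. The `band` lookup, the chosen wavelengths and the percentile stretch are illustrative choices, not a prescribed recipe.

```python
# Sketch: true-color and false-color RGB composites from co-registered bands.
# `band` is a hypothetical dict mapping center wavelength (nm) to a 2-D array.
import numpy as np

def stretch(a: np.ndarray) -> np.ndarray:
    """Linear 2-98 percentile stretch to the 0..1 display range."""
    lo, hi = np.percentile(a, (2, 98))
    return np.clip((a - lo) / (hi - lo), 0.0, 1.0)

def composite(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> np.ndarray:
    return np.dstack([stretch(r), stretch(g), stretch(b)])

# true color: red -> R, green -> G, blue -> B
# true_rgb = composite(band[640], band[540], band[470])
# false color with vegetation in red: near infrared -> R (cf. Figure 3)
# false_rgb = composite(band[820], band[640], band[540])
```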

Spectra

For deeper analysis we have to look at the values of the overlapping pixels. We can visualize the data with a continuous reflectance curve, where the X axis shows wavelength values (and therefore also the position of the different bands) and Y shows intensity. Since this curve is unique for every material, it is often called a "spectral fingerprint".

It is possible to read the values of each band and create composite images according to the feature we would like to display. For example, if a Hyperion satellite image is used and band 47 (corresponding to 823.65 nm) is assigned to the red, band 150 (1648.9 nm) to the green and band 29 (640 nm) to the blue component, forest will appear red, water black and bare soil cyanish-bluish. We can create many combinations, even for aerial photographs, showing a lot of features in a very colorful way (Figure 3).

Reflectance curves also have special ranges. For example, the wavelengths between 700 nm and 750 nm correspond to chlorophyll, those between 1700 nm and 1780 nm to various leaf waxes and oils, and those from 2100 nm to 2300 nm to clays in the soil (Datt et al. 2009). Using only these ranges it is possible to create composite images where (for example) the reddish areas are rich in leaf oils and waxes and the bluish ones are vegetation types which contain less (Figure 4). Using this method we can easily discriminate, for example, sunflower and lavender from other vegetation species.

Examining the typical peaks of the spectral curve yields specific information about the sample. For example, variations in the range between 400 nm and 1400 nm are associated with FeO content (Ben-Dor et al. 1999), and the asymmetry of the peak around 2200 nm correlates very well with arsenic content (Choe et al. 2008). Elimination of specific spectral ranges is also possible with this method – for example, using only the longer wavelengths the imaging instrument can "bypass" smoke, giving us a view of an area hidden from our eyes (Figure 5).


Figure 3: Three false-color aerial photographs displaying the vegetation in the Sárvíz valley, Hungary, in different color combinations. Although the same bands were used, the color combinations are different: on the first image vegetation is shown in red and bare soil in green, on the second they are yellow and blue, while on the third cyan and red. The images were made with an AISA Dual Eagle/Hawk sensor with 2 m/pixel resolution during an archeological survey in August 2011. The images show an approx. 1 km x 0.5 km area. The three wavelengths used: 610 nm (green), 720 nm (red) and 820 nm (near infrared). Figure: M Deák

Figure 4: A false-color image displaying an agronomical area south of Budapest. Reddish areas are plants with a lot of wax, bluish ones contain less. Figure: M Deák, USGS, Hyperion

Figure 5: Two images captured by a hyperspectral sensor covering the same area near Yakutsk, Russia, on May 24, 2011, during a massive forest fire. The left is a true-color image, the right a false-color image using only the longer wavelengths (areas already burnt down show in red, live vegetation in green). Observe the difference between the areas covered with smoke in the eastern part of the images. Figure: USGS, ALI, Hyperion


Indices

Similar to the visualization method mentioned above, it is possible to construct spectral indices to visualize two or more parts of the reflectance spectrum with respect to each other. Different mathematical equations are used whose input data are raster images corresponding to specific wavelengths – therefore specific bands. The calculations are made with the overlapping pixels, creating a one-band raster image whose values are no longer the intensity values of sunlight at one wavelength but the results of the equation: the indices. Two of the most commonly used indices are the NDVI (Normalized Difference Vegetation Index) and the MCARI (Modified Chlorophyll Absorption Ratio Index). Example for NDVI: (Band730 nm – Band630 nm) / (Band730 nm + Band630 nm) (Tucker 1979). 730 nm corresponds to chlorophyll reflectance and 630 nm to chlorophyll absorbance, hence the resulting value (between -1 and +1) shows the chlorophyll content – and thus the plant coverage – of the analyzed area.
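The quoted NDVI formula translates directly into array arithmetic. A minimal sketch, assuming two co-registered floating-point reflectance rasters at roughly 730 nm and 630 nm:

```python
# Sketch of the NDVI computation quoted above.
import numpy as np

def ndvi(band_730: np.ndarray, band_630: np.ndarray) -> np.ndarray:
    num = (band_730 - band_630).astype(float)
    den = (band_730 + band_630).astype(float)
    # avoid division by zero where both bands are empty
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

# Values run from -1 to +1; higher values mean denser green vegetation.
# Any color table can then be applied for display, e.g.
# plt.imshow(ndvi(a, b), cmap="RdYlGn")
```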

The resulting raster therefore contains numerical values to which colors can be assigned. The easiest method is to use different shades of grey, where the lowest value is black and the highest is white. However, if we use a different color table, for example the colors of the rainbow, our data will become not only more informative and detailed but also more spectacular (Figure 6.). There are different indices for different types of data, and they do not necessarily use only two wavelengths. The possibilities are almost infinite and they are growing fast as newer detectors become able to sense new ranges of light.
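A sketch of the two colorization options mentioned above, applied with Matplotlib color tables (the index raster here is a random placeholder):

```python
import numpy as np
import matplotlib.pyplot as plt

index = np.random.uniform(-1, 1, (100, 100))  # placeholder index raster

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.imshow(index, cmap='gray', vmin=-1, vmax=1)     # black = lowest, white = highest
ax2.imshow(index, cmap='rainbow', vmin=-1, vmax=1)  # rainbow color table
plt.show()
```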

Hyperspectral data cube

Hyperspectral data also has a unique method to visualize the spectra corresponding to the pixels. It is possible to assign colors to the numerical intensity values of a reflectance spectrum – similar to the method mentioned above. This way the reflectance curve is turned into a colorful line. The next step is to assign this line to each pixel in the image, so that a cube is created where the spectra of the boundary pixels are seen at once (Figure 7.).

Looking at such a cube it is possible to tell a lot about the spectral features of the area shown on the image. For example, a practiced expert can tell whether the soil surface seen on the image is rich in iron oxides or not. Water absorbs almost all light in the infrared range, therefore the spectral curve of a pixel covered with water appears blue on the image above, just as a cloud – which reflects almost everything – appears red.
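Spectral Python (credited under Figure 7) can render such a cube directly; a minimal sketch, assuming an ENVI-format scene scene.hdr (hypothetical file name) and a working wxPython/OpenGL setup:

```python
import spectral

data = spectral.open_image('scene.hdr').load()
# top face: an RGB composite from three chosen bands;
# the sides show the color-coded spectra of the boundary pixels
spectral.view_cube(data, bands=[29, 19, 9])
```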

Figure 6: NDVI of the Zemplén Mountains with different colorization methods. On the first two images white, on the third red and on the fourth green represents the higher values – thus the higher plant coverage. Figure: USGS, M. Deák

The future of hyperspectral imaging

There are many possibilities in the application of multi- and hyperspectral remote sensing, while the number of areas we have this type of data from is growing fast (Hargitai et al. 2004). The newest hyperspectral instruments are often called “ultraspectral” sensors and their application on satellites is widely discussed (Zhou et al. 2007). Today there are such sensors orbiting not only the Earth, but Mars (CRISM – Compact Reconnaissance Imaging Spectrometer) and the Moon as well (M3 – Moon Mineralogy Mapper on Chandrayaan-1). They are sending us plenty of information about the surfaces of these celestial bodies.

The range of visualization methods is also expanding quickly because new software and data types are introduced almost every month. We can only imagine what type of colorful images will be assigned to the reflectance spectrum in the future.

Figure 7: Hyperspectral data cube showing an agricultural area. A true-color image is on the top; the sides of the cube represent the intensity values of the corresponding pixels – blue means lower, red means higher values. Figure: USGS, Spectral Python

References

Alpern M, Thompson S, Lee SM (1965): Spectral Transmittance of the Visible Light by the Living Human Eye. JOSA, Vol. 55, Issue 6, pp. 723-725

Ben-Dor E, Irons JR, Epema GF (1999): Soil reflectance. In: Rencz AN (ed), Remote sensing for the earth sciences: Manual of remote sensing, New York: John Wiley & Sons, pp. 111-188

Choe E, van der Meer F, van Ruitenbeek F, van der Werff H, de Smeth B, Kim K-W (2008): Mapping of heavy metal pollution in stream sediments using combined geochemistry, field spectroscopy, and hyperspectral remote sensing: A case study of the Rodalquilar mining area, SE Spain. Remote Sensing of Environment, Vol. 112, pp. 3222-3233

Datt B, McVicar TR, Van Niel TG, Jupp DLB, Pearlman JS (2003): Preprocessing EO-1 Hyperion Hyperspectral Data to Support the Application of Agricultural Indices. IEEE Transactions on Geoscience and Remote Sensing, Vol. 41, pp. 1246-1259

Hargitai H, Vekerdy Z, Turdukulov U, Kardeván P (2004): Képalkotó spektrométeres távérzékelési kísérlet Magyarországon (Imaging spectrometer experiment in Hungary, in Hungarian). Térinformatika, 2004/6, pp. 12-15

Hargitai H, Kardeván P, Horváth F (2006): Az első magyarországi képalkotó spektrométeres repülés és adatainak elemzése erdőtípusok elkülönítésére (Data analysis for forest type classification of the first airborne imaging spectrometry experiment in Hungary, in Hungarian). Geodézia és Kartográfia, 2006/9, pp. 21-33

Kozma BV, Berke J (2010): New Evaluation Techniques of Hyperspectral Data. Journal of Systemics, Cybernetics and Informatics. Vol. 8, No. 5

Móricz N, Mari L, Mattányi Zs, Kohán B (2005): Modelling of Potential Vegetation in Hungary by using Methods of GIS. GeoBit/GIS 11/2005

Sabbah S, Laria LR, Gray SM, Hawryshyn CW (2010): Functional diversity in the color vision of cichlid fishes. BMC Biology 8:133

Tucker CJ (1979): Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment, Vol. 8, pp. 127-150

Zhou DK, William LS, Xu L, Allen ML, Stephen AM, Hung-Lung H (2007): Physically Retrieving Cloud and Thermodynamic Parameters from Ultraspectral IR Measurements. Journal of the Atmospheric Sciences, Vol. 64, pp. 969–982

Visualization of the radio frequency spectrum
Krisztián Bodzsár, National Media and Infocommunication Authority

Abstract

Introduction to the visualization of our inspections of the electromagnetic spectrum and of the direction finding measurements used for locating a transmitter site, together with the presentation and visualization of the data collected during our measurements.

Introduction

All national authorities around Europe operate a monitoring service which is based on the International Telecommunication Union (ITU) recommendations, and each country has its own telecommunication law regulating these activities. These monitoring services are assigned to monitor the man-made radio services and the different interferences (which may be intentional or unintentional).

The electromagnetic spectrum

Electromagnetic waves are one possible way to transfer information from a transmitter to one or more receivers. Electromagnetic radiation consists of transversal waves (with electric and magnetic components). The two main parameters of an electromagnetic radiation are the wavelength (λ, measured in metres [m]) and the frequency (f, measured in Hertz [Hz]). Every radiation can be decomposed into a linear combination of spectral components. The typical frequency range supervised by the authorities is 9 kHz (9×10³ Hz) to 300 GHz (300×10⁹ Hz) (Fig. 1).

Figure 1, the electromagnetic spectrum (Courtesy of Inductiveload, NASA, 26 October 2007)

Visualization of the spectrum

There is a small problem when we try to observe the radio frequency spectrum: we usually perceive everything in the time domain (in real time), while the spectral components live in the frequency domain. To visualize the spectrum we first have to “convert time into frequency” somehow. To do this we use the so-called Discrete Fourier Transform (DFT), a linear transformation that converts a finite sequence of time-domain samples into the frequency domain. The variant mainly used by computers today is the Fast Fourier Transform (FFT), which computes the DFT and its inverse much faster than a direct evaluation. Using this transformation we can easily display the level (e.g. voltage, field strength) of the different frequency components, and this way we can measure and regulate the different radio services. Figure 2 shows a typical frequency spectrum diagram, with the level on the y axis and the frequency on the x axis.
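A minimal NumPy sketch of this time-to-frequency conversion; the sample rate and the test signal are assumptions for illustration:

```python
import numpy as np

fs = 2_000_000                    # assumed receiver sample rate (Hz)
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 400e3 * t) + 0.1 * np.random.randn(t.size)  # test signal

window = np.hanning(x.size)                  # window to reduce spectral leakage
X = np.fft.rfft(x * window)                  # FFT: time domain -> frequency domain
freqs = np.fft.rfftfreq(x.size, 1 / fs)      # frequency axis (Hz)
level_db = 20 * np.log10(np.abs(X) + 1e-12)  # level of each frequency component
```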

Figure 2, an example of the frequency spectrum diagram ranging from 20MHz to 1.3GHz with each spike representing the level for the corresponding frequency (each spike represents a different radio service) (Courtesy of Krisztián Bodzsár, R&S Argus, 2012)

The waterfall diagram

For displaying the spectrum we mainly use the spectrum diagram, but when we would like to see how the spectrum evolves over time, we use the so-called waterfall diagram. The name comes from the motion of the diagram (it looks like water falling down a waterfall). The waterfall diagram can be 2D or 3D, depending on how we want the spectra displayed over time (Fig. 3, 4).

Figure 3, a spectrum diagram (top) and the corresponding 2D waterfall diagram (below) (Courtesy of Krisztián Bodzsár, R&S Argus, 2012)

In the case of the 2D diagram the frequency is still on the x axis (it is often locked to a spectrum diagram), the level is coded by different colors, and the y axis displays the elapsed time. The 3D waterfall diagram is a bit different. It looks as if many spectrum diagrams were offset in space from each other – which is actually a correct description, because here the z axis represents time while the x and y axes are exactly the same as in a spectrum diagram (Fig. 4).
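A sketch of a 2D waterfall using Matplotlib's spectrogram function; note that specgram puts time on the x axis and frequency on the y axis, i.e. the transpose of the orientation described above:

```python
import numpy as np
import matplotlib.pyplot as plt

fs = 1_000_000
t = np.arange(0, 0.05, 1 / fs)
x = np.sin(2 * np.pi * (100e3 + 2e6 * t) * t)  # slowly sweeping test signal

plt.specgram(x, NFFT=1024, Fs=fs, noverlap=512)  # level coded by color
plt.xlabel('Time [s]')
plt.ylabel('Frequency [Hz]')
plt.show()
```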

Frequency vs. time diagram

If we only want to analyze one discrete frequency, we can use the frequency vs. time diagram, which greatly simplifies the analysis of a chosen signal. This kind of diagram helps when, for example, we would like to see the trend of the field strength of a transmitter which we suspect periodically emits greater output power than permitted.
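A sketch of extracting such a diagram from a series of stored spectra (the array layout is an assumption):

```python
import numpy as np

def frequency_vs_time(spectra_db, freqs, f0):
    """Level history of one chosen frequency.

    spectra_db: 2D array, one row per sweep, one column per frequency bin
    freqs: the common frequency axis of the sweeps (Hz)
    f0: the frequency to monitor (Hz)
    """
    bin_idx = np.argmin(np.abs(freqs - f0))  # closest bin to f0
    return spectra_db[:, bin_idx]
```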

Figure 4, a 3D waterfall diagram of the aeronautical radio services (Courtesy of Krisztián Bodzsár, R&S Argus, 2012)

Figure 5, a time diagram for a specific frequency with the field strength on the y axis and the time on the x axis; the difference between the two level values indicates that the transmitter was shut down at the date shown by the marker (Courtesy of Krisztián Bodzsár, R&S Argus, 2012)

Differences between the spectra of radio services

Each radio service has a different spectrum view. Well-trained engineers can often identify a service type simply by looking at the shape of its spectrum. The differences between services lie in the occupied bandwidth, the modulation and the shape of the signal. There is also a significant difference between the shapes of digital and analog services when looking at their spectra. For example, figure 6 shows the GSM (Global System for Mobile Communication) downlink channels, figure 7 the 470 MHz–863 MHz terrestrial television broadcast channels and figure 8 the 87.5 MHz–108 MHz VHF (Very High Frequency) FM (Frequency Modulated) broadcast radios.

Figure 6, GSM downlink channels near 950MHz (Krisztián Bodzsár, RFeye Live, 2012)

Figure 7, the 470 MHz–863 MHz TV spectrum; the two regions shown within the circles illustrate the difference between the analog and digital terrestrial TV spectrum (Courtesy of Krisztián Bodzsár, RFeye Live, 2012)

Direction Finding

Direction Finding (DF) and geo-location is the technique used to find unlicensed transmitters or interferences. DF only tells us the bearing of a signal, but when we combine three or more bearings we can triangulate the location. There are three different types of DF techniques we currently use: Angle of Arrival (AoA), Time Difference of Arrival (TDoA) and Power of Arrival (PoA). Figure 10 compares the results of these different DF techniques using a simulator program called RFeye SIM.

These different DF and geo-location techniques all have their pros and cons, and it is not rare to see them working side by side in a country-wide monitoring system.
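A minimal sketch of AoA triangulation on a local flat plane, intersecting two bearing rays; a real system would use geodetic coordinates and more than two bearings with error estimation:

```python
import numpy as np

def intersect_bearings(p1, brg1, p2, brg2):
    """Intersect two bearing rays (AoA fixes) on a local flat plane.

    p1, p2: receiver positions as (x, y) in metres
    brg1, brg2: measured bearings in degrees, clockwise from north
    Returns the estimated transmitter position (x, y).
    """
    d1 = np.array([np.sin(np.radians(brg1)), np.cos(np.radians(brg1))])
    d2 = np.array([np.sin(np.radians(brg2)), np.cos(np.radians(brg2))])
    # solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# two DF stations 10 km apart, both seeing the same signal
print(intersect_bearings((0, 0), 45.0, (10_000, 0), 315.0))  # -> [5000. 5000.]
```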

Figure 8, an example for an AoA DF antenna array on top of a mobile DF vehicle (Courtesy of Péter Forman, NMIAH, 2012)

Figure 9, an example for a Mobile Monitoring System (MMS) with a DF antenna array on top (Courtesy of Krisztián Bodzsár, NMIAH, 2011)

Figure 10, comparison of the AoA TDoA and PoA location finding techniques (in order) (Courtesy of Krisztián Bodzsár, RFeye SIM, 2012)

Mobile monitoring

The newest addition to our measuring capabilities is the introduction of the new-generation mobile monitoring system. Previously we had mobile systems, but those were service-specific measuring systems. The new system is capable of scanning the spectrum from 10 MHz to 6 GHz with a very fast sweep time, and is also capable of adding GPS (Global Positioning System) stamps to each measurement. Using the GPS-stamped measurements we can easily display the results on a map. Figures 11 and 12 show two samples of measurements made with this new system.
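A sketch of turning GPS-stamped level measurements into colourable map tiles (the tile size and data layout are assumptions):

```python
import numpy as np

def grid_levels(measurements, tile_deg=0.001):
    """measurements: iterable of (lat, lon, level_db) tuples with GPS stamps.
    Returns {(lat_index, lon_index): mean level} for colouring map tiles."""
    tiles = {}
    for lat, lon, level in measurements:
        key = (int(lat / tile_deg), int(lon / tile_deg))
        tiles.setdefault(key, []).append(level)
    return {k: float(np.mean(v)) for k, v in tiles.items()}
```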

Figure 11, the 2.1 GHz UMTS (Universal Mobile Telecommunication System) services of the different service providers; the color difference on the map represents the different levels, and there is also a spectrum diagram on the left side of the figure (Courtesy of Krisztián Bodzsár, RFeye DAS, 2012)

Figure 12, the 2.4 GHz voltage coverage of different WLAN (Wireless Local Area Network or Wi-Fi) networks in the city of Fót; as in the previous figure, the color difference of the tiles on the map represents the different levels, and there is also a spectrum diagram on the left side of the figure (Courtesy of Krisztián Bodzsár, RFeye DAS, 2012)

Following the technology

For a regulatory authority it is very important to keep pace with the newest infocommunication technologies, because we can only enforce our spectrum policy with a well-equipped and well-trained team. Fortunately, for the time being, these conditions are available at the National Media and Infocommunications Authority – Hungary (NMIAH).

Bibliography

International Telecommunication Union (ITU) Spectrum Monitoring Handbook 2011, Radiocommunication Bureau, Geneva, Switzerland

Visualisation of the Radio Coverage
Irén Bálint, National Media and Infocommunications Authority

Abstract

The implementation of new radio technologies and the ever more intensive use of frequency resources require management of the electromagnetic spectrum through engineering and administrative procedures to maximize the efficiency of the use of radio frequencies. Efficient spectrum management involves accurate frequency planning based on different wave propagation models, detailed electromagnetic compatibility analysis and intensive international frequency coordination. In most countries these activities are supported by sophisticated computer programs based on digital terrain models. This paper gives a general view of the radio coverage prediction aspects of digital radio networks with a particular view on the visualization and interpretation of the graphical results.

Introduction

For the efficient use of frequency resources, careful planning is needed before the implementation of different types of radiocommunication services, to enable different systems to operate without causing or suffering unacceptable interference. Modern computer-based frequency management systems provide sophisticated radio network planning frameworks for calculating field strength values and predicting coverage based on terrain data and accurate radio wave propagation models. Depending on the system particularities, different prediction methods can be used for planning different radio systems. Before the implementation of broadcasting systems, the planning of the coverage area is particularly important. The aim of this paper is to present planning aspects of digital broadcasting networks, including a planning exercise for a nationwide terrestrial digital audio broadcasting (T-DAB) network in Hungary. The calculations and the visualization of the graphical results were made using the CHIRplus_BC planning tool developed by LStelcom AG (CHIRplus_BC 2011).

Basic Elements for Radio Coverage Prediction

The most important step in the planning process is the determination of the wanted and interfering electromagnetic field strength at different geographical positions. The value of the field strength depends on the technical data of the radiocommunication transmitters and is also influenced by the terrain characteristics between the transmitter and the receiver site (e.g. terrain obstacles). The basic elements for planning include:

• databases
• electromagnetic wave propagation models
• technical criteria for planning
• field strength calculation and aggregation methods to determine the cumulated field strength

Abstract

Data is Beautiful Conference 38

Page 39: Data Is Beautiful

Databases

The accuracy of the coverage prediction is greatly influenced by the technical data of the transmitters and by the definition of the digital terrain models available for the calculation. The planning tools used for the calculations are based on a number of databases, such as: transmitter databases containing technical data of transmitters (effective radiated power, antenna pattern, etc.) and corresponding planning parameters (reference field strength, protection ratios, signal-to-noise ratios, reception modes), geographical data (coordinates, height above sea level, etc.), morphological data (corrections for urban areas, etc.), population databases, and different vectors and maps for overlay calculation. Technical data used for the calculations are presented in Figure 1.

Figure 1: Technical data of transmitters (CHIRplus_BC 2011)

Propagation models

For the planning of radiocommunication networks – depending on the frequency bands used and the type of the radio services – various prediction models are available. For different telecommunication links, different terrain, path, obstructions, atmospheric conditions and other phenomena have to be taken into account; as a result, different models exist for different types of radio links under different conditions. In general, the path loss along a radio link serves as the dominant factor characterizing the propagation. The path loss for free-space propagation (L) depends on the frequency (wavelength λ) and the distance from the transmitter (d):
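L = (4πd / λ)²

(or, expressed in decibels, L = 20·lg(4πd/λ) dB – the standard free-space loss relation)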

However, path loss also depends on other conditions: the environment (mountainous, urban, rural, dense urban, suburban, open, forest, sea, etc.), atmospheric conditions, indoor/outdoor situation, etc. In most cases radio propagation models are empirical: they are developed from large collections of measurement data for specific topographic configurations. Depending on the conditions taken into consideration, some of these models are so-called site-general methods, while path-specific methods based on digital terrain data allow better accuracy of calculation. Like all empirical models, radio propagation models do not give the exact value of the field strength; rather, they predict these values with some probability under the specified conditions. For the terrestrial broadcasting service a number of propagation models may be used; in the examples presented in this paper, results based on a site-general method (the model described in Recommendation ITU-R P.1546) and a path-specific method (the IRT 2D model) will be compared.

A terrain profile between two transmitters is presented in Figure 2. The vertical lines indicate the transmitter and receiver heights, and the horizontal line indicates the distance between the transmitters. The line connecting the transmitter and receiver is the direct line of sight between the two points. The Fresnel zone (red line) and the diffraction path (magenta line) are also displayed.

The prediction method described in Recommendation ITU-R P.1546 is used as reference in the broadcasting service by most regulators and in the international frequency coordination process as well. This Recommendation describes a method for point-to-area radio propagation predictions for terrestrial services in the frequency range 30 MHz to 3 000 MHz. The method takes account of the effective height of the transmitting/base antenna, which is the height of the antenna above the terrain height averaged between distances of 3 and 15 km in the direction of the receiving/mobile antenna (ITU Rec. 1546 – 2009).

Figure 2: Presentation of the terrain profile between two transmitters (CHIRplus_BC 2011). Legend: direct line of sight between transmitter and receiver; red line: Fresnel zone; magenta line: diffraction path interconnecting obstacles

Figure 3: Determination of the effective antenna height for distances greater than 15 km (NMHH 2008)
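A sketch of this effective-height computation on a sampled terrain profile (the data layout is an assumption):

```python
import numpy as np

def effective_antenna_height(profile_m, step_km, antenna_asl_m):
    """Effective transmitting antenna height in the spirit of ITU-R P.1546:
    antenna height above the terrain height averaged between 3 and 15 km
    from the transmitter, towards the receiver.

    profile_m: terrain heights (m above sea level) sampled every step_km,
               starting at the transmitter site
    antenna_asl_m: antenna height above sea level (m)
    """
    i0, i1 = int(round(3 / step_km)), int(round(15 / step_km)) + 1
    return antenna_asl_m - float(np.mean(profile_m[i0:i1]))
```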

Most path-specific models are based on the multiple knife-edge diffraction loss concept (the loss caused by obstacles between transmitter and receiver is taken into account by different types of summation methods). For the further calculations in this paper the ITU-R P.1546 model (site-general model) and the IRT 2D model (path-specific model) will be used.

Field strength calculation

For the prediction of the radio coverage, the calculation of the wanted and interfering electromagnetic field strength values is required, together with the minimum field strength level necessary to meet the signal quality for the desired reception conditions (fixed rooftop / mobile / portable indoor or outdoor). Several field strength summation algorithms are available; separate summation methods can be used for the wanted and the interfering field summations.

The field strength produced by a single frequency network (SFN) calculated with a site-general prediction method and with a path-specific method is presented in Figures 4 and 5, respectively.

Figure 6 presents the difference between the cumulated field strength values produced by the transmitters in the SFN network calculated with different propagation models (difference between Figure 4 and Figure 5).

Figure 4: Coverage of a single frequency network based on Rec. ITU-Rec. 1546 (CHIRplus_BC 2011)

Legend: wanted transmitters, interfering transmitters, coverage area

Figure 6: Difference between the coverages of SFN calculated with different propagation models (CHIRplus_BC 2011)

Legend: wanted transmitters, interfering transmitters, coverage area

Figure 5: Coverage of a single frequency network based on IRT 2D model (CHIRplus_BC 2011)

Legend: wanted transmitters, interfering transmitters, coverage area

The network gain due to the use of a single frequency network is visualized in Figure 7. The network gain is the increase in field strength due to the summation of the field strength values produced by all transmitters of the SFN that arrive within the guard interval, compared to the field strength of the single SFN transmitter giving the best coverage alone, without this summation, for the same pixel.
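A sketch of this comparison under one common convention, namely that field strengths expressed in dB(µV/m) are power-summed; the convention, not the tool's exact algorithm, is the assumption here:

```python
import numpy as np

def network_gain(levels_db):
    """levels_db: field strengths (dB(uV/m)) from all SFN transmitters
    arriving within the guard interval at one pixel. The gain is the
    power sum referenced to the strongest single transmitter."""
    e = np.asarray(levels_db, dtype=float)
    e_sum = 10 * np.log10(np.sum(10 ** (e / 10)))  # power summation
    return e_sum - e.max()

print(network_gain([60.0, 60.0, 54.0]))  # ~3.5 dB gain for this pixel
```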

Figure 7: Network gain produced by the SFN (CHIRplus_BC 2011)

Interference analysis

The coverage of radiocommunication networks can be significantly influenced by the presence of interfering radio transmitters. Interference can be co-channel or adjacent-channel interference produced by other networks, and self-interference may also occur within an SFN. In the latter case interference is produced by the different time delays between the transmitters and a given receiver point. This may occur if the distances between individual transmitters in the network exceed a certain value defined by the length of the guard interval specified for the given digital system. Figure 8 compares the coverage of a DAB single frequency network without interfering stations and in the presence of interferers. The area in blue is lost due to the impact of the interferers.
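A minimal sketch of the delay test behind SFN self-interference; the guard interval value in the example is the one specified for DAB transmission mode I:

```python
C = 299_792_458.0  # speed of light (m/s)

def self_interferers(distances_m, guard_interval_s):
    """distances_m: distances from each SFN transmitter to a receiver point.
    Returns the indices of transmitters whose delay relative to the first
    arriving signal exceeds the guard interval, i.e. self-interferers."""
    delays = [d / C for d in distances_m]
    t0 = min(delays)
    return [i for i, t in enumerate(delays) if t - t0 > guard_interval_s]

# DAB transmission mode I uses a guard interval of ~246 microseconds
print(self_interferers([10_000, 60_000, 95_000], 246e-6))  # -> [2]
```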

Figure 8: Impact of the interfering stations on the coverage of an SFN (CHIRplus_BC 2011)

Legend: wanted transmitters, interfering transmitters, coverage area, coverage area lost because of interferers

Example for a Nationwide DAB Network Planning

A planning exercise has been made for a nationwide digital terrestrial radio broadcasting network for Hungary in the VHF frequency band based on the DAB standard. Acceptable national population coverage (above 90 percent of the population) could be achieved by using 5 single frequency networks (SFNs) comprising 39 transmitters. It is to be noted that the technical data planned for these transmitters have been chosen only for this planning exercise; the real data of the transmitters would have to be coordinated with all affected neighbouring countries.

Figure 10 shows the coverage produced by the planned T-DAB network taking the interferers, including self-interference, into account. The coverage of the different single frequency networks (networks consisting of transmitters sharing the same frequency) is indicated with different colours.

The impact of the interferers is presented in Figure 11.

Figure 9: Coverage level produced by the transmitters of a national T-DAB network without interfering stations (CHIRplus_BC 2011)

Legend: portable indoor coverage level calculated without interferers, mobile coverage level calculated without interferers, other field strength thresholds

Figure 10: Coverage produced by the planned T-DAB network taking the interferers into account (different single frequency networks displayed with different colors) (CHIRplus_BC 2011)

Figure 11: Impact of the interfering stations on the national coverage (CHIRplus_BC 2011). Legend: mobile coverage in the presence of interferers, coverage lost because of the impact of the interferers

Visualization of Regular Networks

In flat areas it is very efficient to optimize networks by using regular hexagonal networks composed of 7 transmitters, each transmitter having the same effective radiated power and effective height. By defining different colours for the field strength thresholds, network gain levels or other planning parameters, the results of the calculations and network analysis may lead to beautiful pictures: artistic shapes which have a real physical interpretation. A planning exercise has been made for a regular network planned in the flat area of South-East Hungary. Figure 12 shows the SFN network gain for a regular network.

The coverage probability result found in Figure 13 indicates the probability in %, ranging from 0% to 100%.

Figure 14 shows the interference level of the transmitters in the regular SFN (field strength sum of all transmitters outside the guard interval).

Figure 15 shows the summation of the wanted field strength values for all transmitters in the SFN.

The geographical information systems integrated into most planning tools make overlay calculations and a convenient presentation and analysis of the results possible. Planning based on visual analysis, supported by different picture functions and intuitive user interfaces, offers new creative means for engineers.

Figure 12: Network gain of a regular SFN (CHIRplus_BC 2011)

Figure 13: Coverage probability of a regular SFN in % (CHIRplus_BC 2011)

Figure 14: Visualization of the interference level of the transmitters in the regular SFN (different colors represent different field strength values) (CHIRplus_BC 2011)

Figure 15: Visualization of the summation of the wanted field strength values for the transmitters of a regular SFN (different colors represent different field strength values) (CHIRplus_BC 2011)

References

RRC-06 (2006) Final Acts of the Regional Radiocommunication Conference for planning of the digital terrestrial broadcasting service in parts of Regions 1 and 3, in the frequency bands 174-230 MHz and 470-862 MHz (Geneva, 2006)

CHIRplus_BC (2011) LStelcom AG, CHIRplus_BC version 5.6.1. (20.10.2011)

ITU SG3 Recommendation ITU-R P.1546-4 (2009) Method for point-to-area predictions for terrestrial services in the frequency range 30 MHz to 3 000 MHz

NMHH (2008) Technical directives for planning of digital broadcasting – National Media and Infocommunications Authority, February 2008

Science and art in landing site visualization
Akos Kereszturi1,2, Henrik Hargitai3 and Zsófia Merk4
1 Konkoly Thege Miklos Astronomical Institute, Research Centre for Astronomy and Earth Sciences, Hungarian Academy of Sciences
2 Nagy Karoly Astronomical Foundation
3 Dept of Physical Geography, Eötvös Loránd University
4 Department of Cartography and Geoinformatics, Eötvös Loránd University

Abstract

Landing sites are the most important “hot spots” on extraterrestrial surfaces for humans. Mission planning uses various methods for landing site selection; maps for post-mission analysis, outreach maps and those created during mission planning have different purposes, but they all visualize the same areas. This paper shows the essential differences between scientific and “artistic” landing site visualization.

Introduction

Planetary cartography is, like cartography in general, science, technique and art. The technique part of planetary cartography is data acquisition and instrumentation. The science part is the quantitative and qualitative post-mission analysis of the data acquired, and the pre-mission and real-time cartographic support of planetary and lunar missions. In this paper we give examples of scientific maps and of outreach maps of landing sites.

1. The science perspective

Visualization of geospatial data is a meeting point of different disciplines, whose results may be visualized both for experts and for the public. Landing site selection (Arvidson et al. 2008; Golombek et al. 2009; Bridges et al. 2003; Kereszturi 2012) is a specific field of planetary and space science which is heavily based on such visualization techniques. Here the aim is to visualize important issues and compare various scenarios during the selection of surface landing sites for spacecraft designed to do in-situ surface analysis. We present an overview of the landing site selection process using the example of Mars.

The selection requires the joint analysis of (1) scientifically important issues (locations where answers to relevant questions could be found), and (2) engineering constraints (providing the background for safe landing and trafficability on the surface). Both issues are essential, and the final landing site is determined by an iteration process selecting fewer candidate sites in each step of the work. During this process the data of the candidate sites should be presented both to engineers and to researchers from different scientific disciplines.

During this visualization the following issues are addressed:
(1) distribution and size of surface structures that are ideal for scientific analysis: outcrops, water-containing minerals, sedimentary layers, ancient riverbeds etc.
(2) distribution and size of surface structures and other features that should be avoided: dust covered areas, very rough terrain, high slope angles etc.
(3) unfavourable locations at regional scale: high altitude, strongly wind-blasted areas, high geographic latitudes with cold winters etc.

The different quantitative parameters and their spatial distribution are visualized on maps or map sections containing topographic, temperature or compositional data. During the visualization and scientific analysis various datasets are overlain on each other, usually two or three different data types. In the following, such examples are presented.

In Figure 1. the terrains are coloured as a function of rock density, to show the very rocky terrains that might be dangerous both for landing and for surface movements. Here the surface structures are indicated at low resolution (0.5 km). Figure 2. gives an example at higher resolution (50 m) for the classification of (1) unimportant, (2) interesting but dangerous, and (3) interesting and accessible locations. The planning of a rover's work is influenced by this type of visualization as it helps to identify the locations worth visiting. Figure 3. displays a part of the area shown in Figure 2. on a digital terrain model, where different colours indicate accessible and inaccessible sites. It shows those areas that should be avoided by the rover because of too high slope angles.
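A sketch of the accessibility classification shown in Figure 3, thresholding the slope computed from a digital terrain model; the maximum slope value is an illustrative assumption, not a figure from the paper:

```python
import numpy as np

def accessible(dem, cell_size_m, max_slope_deg=20.0):
    """Classify DEM cells as rover-accessible (True) or not (False) by
    thresholding the local slope angle; max_slope_deg is an assumed
    engineering constraint."""
    dz_dy, dz_dx = np.gradient(dem.astype(float), cell_size_m)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg <= max_slope_deg
```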

Figure 1. Comparison of rock densities at the last four candidate landing sites selected for Mars Science Laboratory (MSL) / Curiosity rover landing: a: Eberswalde crater (a fluvial delta), b: Gale crater (layered mountain), c: Holden crater (paleolake bed), d: Mawrth Vallis (outflow channel). Red colour marks high spatial rock densities, blue shows low spatial density of rocks. The graphical representation method visualizes the level or probability of danger in one certain issue, the occurrence of large rocks (after Golombek et al. 2012)

Figure 2. Outcrops and related features in Gale crater: (a) HiRISE image PSP_009149_1750, (b) graphical interpretation of terrain types, and (c1)–(c4) magnified insets of characteristic features (Kereszturi 2012)

Figure 3. Accessible (green) and inaccessible (red) locations by a rover at the perimeter of the Aeolis Mons sedimentary hill inside Gale crater (Google Earth, NASA, JPL, UA)

On a large scale, surfaces that are too dusty should be avoided because they are dangerous for the rover: because of their low thermal inertia they warm and cool very fast, and dust particles may cover the rover. To identify “too dusty” areas, a global albedo map, a global thermal inertia map and a map of the dust cover index are used.
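A sketch of combining the three global datasets into a single avoidance mask; the threshold values are illustrative placeholders, not values from the paper:

```python
import numpy as np

def too_dusty(albedo, thermal_inertia, dust_cover_index,
              albedo_min=0.25, ti_max=100.0, dci_max=0.94):
    """Boolean mask of areas to avoid: bright (high albedo), fast to warm
    and cool (low thermal inertia) and spectrally dust-covered.
    All thresholds are illustrative assumptions."""
    return ((np.asarray(albedo) > albedo_min)
            & (np.asarray(thermal_inertia) < ti_max)
            & (np.asarray(dust_cover_index) < dci_max))
```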

Summary

Engineering constraints and scientific goals are shown in thematic maps and diagrams that are used together to quantitatively select sites that are safe for landing and have interesting, rover-accessible features nearby. The result helps to justify the rationale for in-situ activity by a particular mission there.

At the first level, surface parameters quantified by remote sensing techniques are indicated with global coverage. This analysis helps in finding optimal landing sites. Once on the surface, the ideal rover traverses have to be determined, along which the rover can travel safely and visit the sites of scientific importance. As the engineering and scientific requirements are often conflicting, the multilayer maps are useful as they pinpoint the possible sites, which are often small in number and size.

These landing site maps are very often spectacular 3D visualizations of the surface at high resolution, but only the science and technique aspects of cartography are fully utilized here; colors and shapes etc. are chosen to be clear and informative. But the art concept plays no role here.

The art perspective

There is a thin dividing line between maps and other representations of spatial phenomena. When art comes into focus in cartography, the accuracy of the map may be questioned. Art here is understood not as marginal elements on a scientific map, not as decoration, but as an essential part of the map itself. When maps are designed to be easy to understand, generalization of reality is necessary. But from which point can we say that a map is not a map anymore because it is oversimplified or essential elements are missing? Non-maps range from uncontrolled, automatically created photomosaics to artistic paintings based on a cartographically projected image. However, in everyday use, especially in elementary schools where measurements on the map are not considered, and especially in the case of planetary maps, which depict an alien landscape, it is more important that the image shown in the map be understandable and attractive. Attractiveness does not necessarily include clarity. Landscapes – surfaces – of extraterrestrial bodies are hard to interpret based on our terrestrial experiences. There are no landmarks, the scales are hard to feel, the features are unknown. On terrestrial maps there are various features that help navigation and the identification of scales: such as hydrology on physical geographic maps, or man-made real objects (roads) or imagined features (country borders) on political maps. None of these are present on other bodies. What can be mapped then? Automated topographical maps therefore provide relatively little help in understanding the surface – craters and some volcanoes are identifiable but other landforms are not necessarily. Photomaps provide the same enigmatic view of planetary surfaces. Photomaps do not provide interpretations, and are non-narrative. At large scales, boulders and valleys are visible, but the context is missing: is it a lava flow or a fluvial valley? Are we looking at a volcanic lava field or a floodplain? These questions are sometimes difficult to answer even for planetary scientists, but for K-12 education, regional and global features may be of higher importance. What is the Martian or Lunar equivalent of the continent-ocean dichotomy? We have to narrate, to interpret the visual data in order to make it visible for the human mind. Depending on the visual narrative, the map reader will create a different mental map (Hargitai 2012).

Even though, as we have seen, landing sites are selected primarily to meet engineering constraints, and even though they may not be the most interesting places on a body regardless of their scientific importance, these sites are unquestionably the most important hot spots on planetary surfaces for us humans. Those places where humans walked, or where our devices were or are operating, keep a record of our physical or tele-presence.

The case of Apollo-11 landing site visualization

In the following we present different approaches to landing site visualization, using the example of the Apollo-11 landing site: first an objective photo, then a scientific map and finally artistic maps (Fig. 4-7.). Graphic artists and cartographers should work together to produce effective maps.

Figure 4. shows the photodocumentation of the landing site of Apollo-11. It is accompanied by some explanations, but the acronyms are not transparent for the average reader. The image has no narrative: it shows reality as it is. It is not a landing site map, even though there is a scale bar, so measurements can be made. The original caption of this image adds: “their tracks cover less area than a typical city block!” (Robinson 2012) However, the map itself should visually tell scales.

Figure 5. is the landing site map that was compiled and presented to President Richard Nixon (Schaber et al. 1969). It is a narrative map, attempting to explain what happened at the landing site and who went where, but it hardly shows anything to a general reader. Its message is that this was a very scientific, very complicated task that you won't be able to understand, despite its being clearly explained in the legends. Neither its narrative nor its spatial or geological context is transparent.

A new generation of landing site visualizations was produced by Thomas Schwagmeier in 2005 in close collaboration with Eric Jones, editor of the Apollo Lunar Surface Journal, and Joseph O'Dea (Schwagmeier 2012, Jones and Glover 2012) (Fig. 6 and Fig. 7.). These maps provide a new sense of space for the map reader. The first idea came from the American Eric Jones, who overlaid “a sketch version of Figure 3-16 from the Apollo 11 Preliminary Science Report onto a drawing of a baseball diamond” in 2004 (Jones and Glover 2012). Thomas Schwagmeier created a high-quality superposition of the Apollo 11 traverse map on a baseball diamond in 2005 (Fig. 6).

Figure 4. Apollo 11 landing site. LROC image M175124932R. NASA/GSFC/Arizona State University

It is based on the “scientific” map (Fig. 5), and still suggests the extreme complexity of the surface operations, but puts the walking paths into a spatial context: the baseball field. It is hard to find a common spatial experience shared by both East and West, independent of culture, so the German Schwagmeier also produced a twin of the overlay, this time on a football field, a “comparison of greater interest to readers outside North America” (Jones and Glover 2012) (Fig. 7.). The baseball image visualizes both the spatial dimensions and the geological (local) context, i.e. the crater on the right side, which explains why the astronaut turned back. The latter remains unexplained in the football field map. Toponyms are missing from both maps.

Figure 5. Apollo 11 landing site “preliminary traverse map”, 1969. USGS.

Figure 6. Apollo-11 landing site visualization on a baseball diamond (2005) (Schwagmeier 2012). Created by Thomas Schwagmeier, Eric Jones and Joseph O'Dea. Courtesy of Thomas Schwagmeier.

Figure 7. Apollo 11 landing site visualization on a football field (2005). Created by Thomas Schwagmeier, Eric Jones and Joseph O'Dea. Courtesy of Thomas Schwagmeier.

Common experiences in spatial dimensions

We have common experiences up to the size of a stadium, but for much larger areas our experiences differ: we live in different cities, so well-known landmarks may not be helpful in creating a sense of space. In this case we see two possibilities: either a localized visualization or a freely replaceable layer of the landing site. In the latter case the user would virtually move the landing site layer over their own living area, which they are familiar with. Both small and large distances will be easily perceived, starting from “home”. This would be a personalized, interactive map designed to be understood by only one or very few people.

Figure 8. This map puts the Curiosity landing site on Mars into a spatial context on the Earth. Landforms (dunes) in the south make the Martian landscape more transparent. This is a preliminary map showing the concept only; no graphic artist was involved in this work. (Credit: Merk Zs., Hargitai H.)

Applying the artistic and scientific requirements outlined above, we have created a traverse map of the Curiosity rover. As this map was shown to an audience in Budapest, we used the downtown map of Budapest.

Acknowledgment

This work was supported by the OTKA PD 105970 project.

References

Arvidson R., Adams D., Bonfiglio G., Christensen P., Cull S., Golombek M., Guinn J., Guinness E., Heet T., Kirk R., Knudson A., Malin M., Mellon M., McEwen A., Mushkin A., Parker T., Seelos F., Seelos K., Smith P., Spencer D., Stein T., Tamppari L., 2008. Mars Exploration Program 2007 Phoenix landing site selection and characteristics. Journal of Geophysical Research 113(E6).

Bridges, J.C., Seabrook, A.M., Rothery, D.A., Kim, J.R., Pillinger, C.T., Sims, M.R., Golombek, M.P., Duxbury, T., Head, J.W., Haldemann, A.F.C., Mitchell, K.L., Muller, J.-P., Lewis, S.R., Moncrieff, C., Wright, I.P., Grady, M.M., Morley, J.G., 2003. Selection of the landing site in Isidis Planitia of Mars probe Beagle2. Journal of Geophysical Research 108(E1).

Golombek M., Huertas A., Kipp D., Hanus V., Sun Y. 2010. Rocks at the MSL Landing Sites. 4th MSL Landing Site Workshop. Monrovia, USA

Golombek, M., Grant, J., Parker, T.J., Kass, D.M., Crisp, J.A., Squyres, S.W., Haldemann, A.F.C., Adler, M., Lee, W.J., Bridges, N.T., Arvidson, R.E., Carr, M.H., Kirk, R.L., Knocke, P.C., Roncoli, R.B., Weitz, C.M., Schofield, J.T., Zurek, R.W., Christensen, P.R., Fergason, R.L., Anderson, F.S., Rice, J.W., 2003. Selection of the Mars Exploration Rover landing sites. Journal of Geophysical Research 108(E12), CiteID 8072.

Hargitai H. (2012) Interpretation of Surface Features of Mars as a Function of Its Verbal—Toponymic—and Visual Representation. In: L. Zentai and J.R. Nunez (eds.) Maps for the Future, Lecture Notes in Geoinformation and Cartography 5. Springer: Berlin Heidelberg.

Kereszturi A. 2012. Landing site rationality scaling for subsurface sampling on Mars—Case study for ExoMars Rover-like missions. Planetary and Space Science 72, 78-90.

Robinson M. (2012) LROC’s Best Look at the Apollo 11 Landing Site http://lunarscience.nasa.gov/articles/lrocs-best-look-at-the-apollo-11-landing-site/

Schaber G, Batson R, Hait T (1969) Apollo 11 Landing site, preliminary traverse map. USGS photograph P854c, F27042. USGS Open-File Report 2005-1190, Figure 064. - ID. Project Apollo (1960-1973) 064 - pap00064 - U.S. Geological Survey

Schwagmeier T. (2012) http://www.apollo-11-mission.de/.

Jones E.M., Glover K. (eds) (2012) Apollo Lunar Surface Journal. Updates. http://www.hq.nasa.gov/alsj/journal.corrections.html

Visualization Methods of Spatial Distribution of Geotagged Photography
Mátyás Gede, Department of Cartography and Geoinformatics, Eötvös Loránd University

Abstract

Thanks to the spread of Web 2.0, several photo-sharing sites can be found on the internet where users can upload their photographs and share them with the public. More and more of these sites offer the possibility of “geotagging” the photos, i.e. adding information about their geographic position. In practice this means that geographic coordinates are assigned to the pictures.

The method has several advantages, of which the most important is that geotagged photos can be searched using spatial filters; e.g. it is possible to find the photos taken within a hundred metres of a given point.

A few photo-sharing sites allow access to their data via an API (Application Programming Interface). These interfaces let us use the data of uploaded photos in our own web pages or applications. Their most useful function for the current topic is that they can send the data of photos within a given geographic quadrangle upon the appropriate request. Using this feature we can download the data of a specific area and store it in a specially designed database which facilitates further data processing and visualization.

If the number of photos of a given area is large enough, it is possible to visualize their spatial distribution on maps. Examining these “photo-distribution maps” can reveal interesting correlations between the geographical properties or highlights of the area and the number of photos taken there.

Further analysis of the data can also produce interesting derived information. If the number of photos is large enough, it is possible to differentiate pictures taken by locals from those taken by visitors with rather good accuracy, by examining the temporal distribution of a specific user's photos within a given area.

Taking advantage of the visualization possibilities of modern web cartography, it is possible to show the gained information in three dimensions on the surface of a digital globe application, which makes the correlation between photo density and the geographical objects of a given area even more expressive.

Introduction

The touristic attractiveness of a place is well indicated by the number of photographs taken there. Since it is possible to record the geographic position along with the pictures on the various photo-sharing sites, all these data can be visualized on a map or, taking advantage of technical development, on a virtual globe. Analyzing the attributes of the shared photos, we can get even more information.

Previous research on geotagged photography focused mainly on the relation between the location and the content of photos (Crandall et al 2009), while this paper uses only the geographic location and a few other attributes of the images and tries to find the causes of the different textures in their spatial distribution.

Geotagged Photographs and APIs of Photo-Sharing Sites

There are several photo-sharing sites across the internet which let users upload and share their photographs. More and more of these sites can supplement pictures with a “geotag”, i.e. with geographic relevance. Geotags are usually geographic coordinates (latitude/longitude).

This method has several advantages. The most important one is that searching among the pictures can be based not only on their title, keywords, etc., but also on their location. For instance, it is possible to select the pictures taken within a distance of 100 metres from a specified point.

A photo can be supplemented with coordinates in several ways. The common cases are:
- The photographing device is able to acquire positional information as well. There are cameras with a built-in GPS module, but nowadays the most popular devices are smartphones. These are usually equipped with GPS, but positional information can also be gained from WiFi routers or GSM cell information.
- A separate GPS instrument logs the position while the user is rambling around and photographing. Positions are merged into the photographs' metadata by special software. Merging is based on the timestamps of the photos (see the sketch after this list).
- Users can manually add geotags to pictures while uploading them to photo-sharing sites. Usually an interactive map helps this process: one simply has to point out the place on the map. Alternatively, it is often possible to type in an address which can be geocoded.
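A minimal sketch of this timestamp-based merging, matching each photo to the nearest GPS fix in time:

```python
import bisect

def geotag(photo_times, track):
    """Assign each photo the position of the nearest GPS fix in time.

    photo_times: list of datetime objects (e.g. from EXIF metadata)
    track: list of (datetime, lat, lon) tuples sorted by time (GPS log)
    Returns a list of (lat, lon) pairs, one per photo.
    """
    times = [t for t, _, _ in track]
    result = []
    for pt in photo_times:
        i = bisect.bisect_left(times, pt)
        # pick whichever neighbouring fix is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(track)]
        j = min(candidates, key=lambda j: abs((times[j] - pt).total_seconds()))
        result.append(track[j][1:])
    return result
```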

Geotagging is popular. This is indicated by the number of uploaded photos: users of Flickr added more than 25,000 geotagged photos taken in Budapest in a single year, according to the coordinates.

Some of the photo-sharing sites also provide an Application Programming Interface (API) which gives web applications access to the photograph database. These interfaces make it possible to use the shared photos and/or their attribute data on web sites. A very useful function of these APIs is the possibility of downloading the photograph data within a given bounding box. Using this feature we can collect the picture data of an area and visualize or analyze it (Google 2012, Yahoo 2012).

The author examined the APIs of Panoramio and Flickr. An advantage of the first one is that it can be used without registration, but it also has a serious drawback: the tests pointed out that the requested data is filtered in an undocumented way, which causes contradictory results. A good example of these anomalies is that the number of photos within a given bounding box is not equal to the total number of photos within the two halves of that box.

To use the other service, however, the data collector application has to be registered before use. Its definite advantage is that no “undocumented” behavior has been experienced so far, so the downloaded data is likely more reliable.

Considering all these issues, the author decided to use Flickr's data.

Downloading, storing and analyzing photograph attributes

All of the examined APIs are REST services (Fielding, 2000). The application sends an HTTP request in which all the necessary information is given. The server sends the required data in the response, in XML, JSON or a similar format. In the case of Flickr the request is as follows (replacing {key} with the API key received during registration):

http://api.flickr.com/services/rest/?api_key={key}&method=flickr.photos.search&bbox=18,47,18.1,47.1&min_taken_date=2010-01-01&min_upload_date=2010-01-01&format=php_serial&page=1&perpage=250&extras=geo,date_taken

This request asks for the first 250 geotagged photos within the bounding box given in the parameter bbox, taken and uploaded since 1 January 2010. (The maximum number of photos in one response is limited to 250.)

The response contains the data of the requested photos, and also the total number of photos matching the search criteria. If this number is more than 250, the request has to be repeated after increasing the number in the parameter page. Due to a limitation (the maximum number of downloadable photos in a given bounding box is 4000), this solution works only up to the 16th page. This restriction, however, can be circumvented by a recursive algorithm: if the number of photos in a bounding box is more than 4000, divide the box into four parts and run the algorithm on each; otherwise, download the data.
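A minimal Python sketch of this recursive subdivision (the author's implementation used JavaScript and PHP; JSON output is requested here instead of the php_serial format shown above):

```python
import requests

API = 'http://api.flickr.com/services/rest/'

def search(key, bbox, page=1):
    """One flickr.photos.search request; bbox = (min_lon, min_lat, max_lon, max_lat)."""
    params = {
        'api_key': key, 'method': 'flickr.photos.search',
        'bbox': ','.join(map(str, bbox)),
        'min_taken_date': '2010-01-01', 'min_upload_date': '2010-01-01',
        'format': 'json', 'nojsoncallback': 1,
        'page': page, 'per_page': 250, 'extras': 'geo,date_taken',
    }
    return requests.get(API, params=params).json()['photos']

def collect(key, bbox, limit=4000):
    """Recursively split the bounding box until each part holds <= limit photos."""
    first = search(key, bbox)
    if int(first['total']) > limit:
        w, s, e, n = bbox
        mx, my = (w + e) / 2, (s + n) / 2
        quads = [(w, s, mx, my), (mx, s, e, my), (w, my, mx, n), (mx, my, e, n)]
        return [p for q in quads for p in collect(key, q, limit)]
    photos = list(first['photo'])
    for page in range(2, int(first['pages']) + 1):  # at most 16 pages here
        photos += search(key, bbox, page)['photo']
    return photos
```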

Figure 1: Working mechanism of the photo downloader application

The number of pictures within a big town can be several tens of thousands (or even hundreds of thousands). Downloading that amount of data means hundreds or thousands of HTTP requests, which take quite a long time (in our experience, even 20-30 minutes); therefore it is advisable to store the gained data in a database. This way the data of a specific area only has to be downloaded once.

The author created a web application to facilitate the data collection which works as follows: The desired area (bounding box) can be defined using an interactive map. A JavaScript code controls the recursive algorithm explained above. A supplementary PHP script sends the HTTP requests to the Flickr API and stores the data extracted from the responses in a MySQL database (Figure 1). The attributes stored are the ID of the photo, the ID of its owner, the timestamp of its creation and the coordinates of the geotag.

It is possible to run analyses on the gained data. An interesting task is, for example, to determine whether a photo was taken by a local or by a tourist. The author's solution is based on the idea of Eric Fischer's album “Locals and Tourists” (Fischer, 2010): let's calculate the difference between the timestamps of a user's first and last image taken in the specified area. If this difference is smaller than 30 days, the user can be considered a “visitor”, otherwise he/she is a “local”. This is obviously a rather simplified model, but interpreting its results indicates that it works quite well.
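A sketch of this classification, assuming the photo table has been loaded as (user id, timestamp) pairs:

```python
from datetime import timedelta

def classify_users(photos, threshold=timedelta(days=30)):
    """photos: iterable of (user_id, taken_at) pairs for one area.
    A user whose first and last photo are less than `threshold` apart
    is classified as a visitor, otherwise as a local (after Fischer 2010)."""
    spans = {}
    for user, taken in photos:
        first, last = spans.get(user, (taken, taken))
        spans[user] = (min(first, taken), max(last, taken))
    return {u: 'visitor' if last - first < threshold else 'local'
            for u, (first, last) in spans.items()}
```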

Visualization

A map based on the geotags of photographs can be rather impressive. The easiest way is simply placing a dot on the map at each photo's location (Figure 2). In the case of Budapest the major tourist attractions (Castle Hill, the Danube banks with the bridges, the Parliament, Andrássy Avenue, Heroes' Square etc.) are easily recognisable at first sight.

Digital globe applications such as Google Earth give us the possibility of visualizing spatial information using the third (altitude) dimension as well. Let's take advantage of this opportunity: a rectangular grid is placed onto the map, and the number of photographs taken within each grid cell is visualized by a 3D bar above the cell. The height of the bar is a linear, square or logarithmic function of the count.

The author created a simple, parametrisable PHP script to implement this task. The script generates a KML file which can be opened with any digital globe application. The script also implements the previously introduced differentiation of local and tourist photographs, and adds a semi-transparent dark overlay to the surface, which gives more contrast to the theme. The results can be seen in Figure 3.
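A Python sketch of such a KML generator for one grid cell (the author's tool is a PHP script; the cell size and height scaling here are illustrative):

```python
import math

def kml_bar(lon, lat, count, cell=0.005, scale=50.0):
    """One extruded KML polygon whose height encodes the photo count of a
    grid cell (logarithmic scaling, one of the options mentioned above).
    Placemarks must be wrapped in the usual <kml><Document> envelope."""
    h = scale * math.log10(count + 1)
    ring = [(lon, lat), (lon + cell, lat), (lon + cell, lat + cell),
            (lon, lat + cell), (lon, lat)]
    coords = ' '.join(f'{x},{y},{h}' for x, y in ring)
    return ('<Placemark><Polygon><extrude>1</extrude>'
            '<altitudeMode>relativeToGround</altitudeMode>'
            '<outerBoundaryIs><LinearRing><coordinates>' + coords +
            '</coordinates></LinearRing></outerBoundaryIs>'
            '</Polygon></Placemark>')
```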

Figure 2. The “photo map” of Budapest

Figure 3. The distribution of photos made by visitors (above) and locals (below) visualized in Google Earth

While the spatial distribution of local and visitor photos looks rather similar at first sight, a closer look reveals several differences in the patterns. Although the major touristic highlights appear as “congestions” in both cases, the photographs of locals are less concentrated at these sights: they are visible everywhere around the city, with much lower density. Beyond this, there are places where only local pictures are concentrated. These are typically stadiums, sports fields, pubs, university or college campuses etc.

Conclusion, Further Plans, Possibilities

The examples of this paper make it clear that the mass of geotagged photographs contains much more information than the pictures themselves. The spatial distribution of the geotags correlates with the geographic features of the given area. Analyzing the geotags together with other attributes of the photos can give us further, derived information about their owners, such as the local vs. visitor differentiation introduced above.

Further research plans focus on the analysis of photos for urban development:
- Temporal analysis: examining the photo distribution before and after a specific urban development project (e.g. a new pedestrian zone). The change in distribution can be a measure of the impact of the development, which can be used later in decision making. Comparing photos taken in summer or winter, on working days or weekends, in the morning or the afternoon hours etc. can also give interesting results.
- Connecting different photo locations by lines (naturally using some connection rules) may show further information, e.g. creating typical paths by connecting the positions of photos with small time differences, taken by the same user.
- Developing additional visualization methods together with the above mentioned analyses.
- Defining the “touristic graph” of a city: the nodes could be created by automated clustering of photo locations, while the edges can be the paths of photographers between these nodes. Analyzing this graph may lead to new conclusions regarding the structure of the city.

References

Crandall D, Backstrom L, Huttenlocher D, Kleinberg J (2009): Mapping the World's Photos. WWW 2009 Conference, Madrid. ACM 978-1-60558-487-4/09/04

Google (2012): Panoramio API. http://www.panoramio.com/api/data/api.html

Fielding RT (2000): Architectural Styles and the Design of Network-based Software Architectures. Chapter 5. Representational State Transfer (REST). http://www.ics.uci.edu/~fielding/pubs/dissertation/rest_arch_style.htm

Fischer E (2010): Geotaggers’ atlas of the world – Locals and Tourists. http://www.flickr.com/photos/walkingsf/sets/72157624209158632/

Yahoo (2012): Flickr API documentation. http://www.flickr.com/services/api/


Art, Design, Technology


The Aesthetics of Public Visualization
Patricio Davila, Assistant Professor at OCAD University (Faculty of Design)

The expansion of the mediated city produces a paradox: a dazzling array of possibilities for new forms of creativity and experience is offered while, at the same time, a surplus of fluid images threatens to dull the senses and eliminate the possibility of perceiving any difference between noise and signal. This has led me to think about how place, indexicality and site-specificity can potentially increase the apperception of complex issues, specifically as they are represented through visualization.

In order to explore this potential, the following is a very brief survey of examples of visualization and art strategies that use public space in ways that augment our knowledge of, and connection to, a specific issue. These examples include projects that may readily be recognized as information or data visualization, but there are others where the definition of visualization is stretched considerably.

All the examples in this exploration engage three distinct but related issues: aesthetics, visualization and the public. Aesthetics, for instance, has a long history of philosophical investigation in which the question of "what is beautiful" has been a core part of the trajectory. Implied in this question is the aspect of how our senses interact with image, object and space in order to determine beauty. Following this aspect, we may also think about the affective dimension of aesthetics: how we feel also impacts how we know something. The emotional, physiological and cognitive aspects of visualization, in particular in relation to participation, should be considered when thinking about how visualization makes information available to various publics.

A consideration of the public with respect to the aesthetics of visualization invites us to imagine a number of visually-based associations. For instance, public can be explained as a space in which common and competing interests are presented and played out. It is a space in which things are meant to be made visible, in contradistinction to private, where things are meant to be made invisible. Public aspires to be transparent in order for anyone to clearly discern the structure of a particular thing. It is also viewable from many points of view since it promises to be accessible to everyone. The same goes for communication: it is imagined that everyone can make their message heard, although not everyone has the same level of access — individuals can't address people as easily as corporations or governments can. Finally, public is a space constituted by the communicative capacity of people.

Visualization supports the aims of public in that it attempts to make things visible and therefore understandable. The mantra of visualization is “to make the invisible visible”. It attempts to do this by way of augmenting visual cognition. This is often done through the simplification of complexity and the abstraction of phenomena. Necessarily, visualization implies the act of representation which, in turn, implies translation, reduction or expansion. Beyond representing data, visualization offers the elucidation of relations between variables.

An initial example that can be used to examine what is at play in public visualization could be the weather beacon. The weather beacon became popular in the middle of the last century and provided a public service through an advertising capacity. Many beacons of the era had the name of their corporate sponsor directly on or nearby the structure.

These pylon structures fitted with bands of lights became matrices on which animations ran indicating a variety of weather conditions such as rain, snow or clear skies and warmer, cooler or steady temperatures. This is not sophisticated by today’s standards but it does point to the simplification of complex phenomena, in this case meteorological phenomena, to an aesthetic experience that has a narrowly determined purpose for the user.

Environment and weather seem to be a popular choice for artists and designers because of our increasing ability to measure aspects of them, our common experience of them and our pressing need to create a relationship with the complex links between what we see and feel and that which is actually far removed from our immediate experience.

For instance, Andrea Polli and Chuck Varga's installation, Particle Falls (2010), monitors air quality in the immediate vicinity and displays a generative animation that responds to the number of particulates in the samples taken.


This installation literally makes the invisible visible. In effect, it amplifies the size of the particulate matter in the air so that the human visual apparatus can register a pattern and a movement.

In other words, this installation, like many others of its kind, makes data visible and, by extension, public. Something that was invisible, because either the phenomenon was undetected or the data was unparsed or filtered, is made visible and public. In the case of Particle Falls all this may not be the case; rather, what was missing was the will or ability to assemble different variables onto one site for us to see and witness.

Phenomena are thus presented in such a way that they are visible, and then, through some contextualization, we as spectators code this data as meaningful information. Implied in this process is that to see is to understand — we visualize in order to augment our cognitive capacity.

As mentioned above, our cognition is aided by the arrangement of data such that we can spatially and/or temporally compare different variables. This is the standard by which visualizations are judged to be good and, in aesthetic terms, beautiful. The bigger the data set, the more complex the phenomena and the more defined the question, the greater the need for a better visualization.

Maybe the opposite is true for "effective" public visualizations? In other words, the simpler the data and the more open the question, the more successful the visualization may be. Take, for instance, Alfredo Jaar's 1999 installation, Lights in the City. This light installation took the participation of clients of homeless shelters in the city of Montreal as a very simple data stream with which to animate a red light in the cupola of the Marché Bonsecours. The cupola is a well-known landmark in the city with a well-known history, which includes several fires that have destroyed the building. Jaar exploits the notion of crisis in the cupola's history together with the crisis of homelessness in the city.

The data is simple and the visualization is even simpler in its formal aspects. At first glance, the question that this visualization answers is: how many people use the homeless shelters in Montreal? Although an important number, this measurement is too simple to sustain much interest. Instead, the question opens a network of issues and queries including our attitudes towards this constant crisis, the anonymity of homeless people and the physical proximity of the spectators to the homeless. Through the highlighting of these issues homelessness is made visible and public.

One of the ways in which Jaar’s installation opens the question up is by way of “mapping”. Mapping is a basic process in visualization. It occurs in cartography, in which different kinds of information are overlaid in order to compare two or more aspects of a geographic territory, e.g. boundaries and topography.

Donna Cox expands this notion by employing the term visaphor, or visual metaphor, which can help us understand the process of mapping data onto new contexts. Cox suggests that visualizations are particularly powerful in how they recontextualize data. For instance, when demographic data is placed on a visual representation of the city, a source domain is mapped onto a target domain. Data, according to Cox, represents the source domain, while its translation into a visual model, the target domain, produces the visaphor and thus recontextualizes the source data. In other words, meaning is borrowed from one in order to create new meaning.

While this mapping function occurs regularly in visualizations presented on paper and screens, we may also think of ways this applies to mapping virtual data onto other physical spaces. Visualization that takes on an architectural scale often presents new data resting on a built structure, where both components bring a network of meanings. In order to employ Cox's notion of the visaphor we need to adapt it to the way new meaning is created with physical structures. In this case, the visual model includes both the image (colour, animation, shape) and the physical structure (building, facade, area).

In Jaar’s project the flow of people entering the homeless shelters around the installation is mapped onto the cupola — the city’s invisible inhabitants are mapped onto a powerful symbol of the city — source domain and target domain are combined. The question that arises therefore is framed within homelessness and disaster. The baggage borrowed from one (cupola) complicates the reading of the other (homelessness). This is underlined by the fact that “bonsecours” means “good help” in French.

The presentation of information at this scale in an urban environment and with potentially hundreds or thousands of viewers leads us to also consider the visualization in public space in terms of spectacle. This term is particularly loaded as it emphasizes the act of looking in an environment dominated by images. Guy Debord took the term to indict capitalist society with a specifically visual form of alienation which further rendered a separation of humans from themselves and each other.


Spectacle has more recently been reclaimed to an extent through Jacques Rancière's use of the term "spectator", which lifts the viewer from the lowly status of dupe to the ranks of co-producer of meaning. Visualization, in the sense that we make the invisible visible, conjures another notion of aesthetics — specifically, what is "sensible". If we think of Rancière's notion of the "distribution of the sensible" and the "emancipation of the spectator", we can begin to unpack how the politics of what is seen and what people can participate in relates to visualization on a fundamental level.

Spectator is also reimagined as "spect-actor", a term coined by Augusto Boal and elaborated in his theory of the "theatre of the oppressed", which emphasizes the participatory dimension of public theatre performance. Public visualization has the potential to borrow from these formulations, specifically through the participation of the public in order to gather data or through interaction by participants to uncover relationships among the data.

To understand this potential it may be necessary to look at some parallel practices that use similar frameworks while not operating as data visualization. These frameworks include participation and performance, and mediate these two aspects through the use of projection, video and site-specificity.

Krzystof Wodiczko’s Tijuana Projection (2001) is a good example of the incorporation of public, visibility and participation. In this installation Wodiczko invited women who work in the maquiladora factories in the area, which tend to provide very poor working conditions for a predominantly female workforce, to describe their experiences. Women’s faces were projected onto a very large building, the main cultural centre in the city, while they individually recounted experiences of abuse, violence and family disintegration that often occurs to these women workers.

In another project, the St. Louis Projections (2004), Wodiczko orchestrates communication between convicted prisoners and victims of crime. Live video of a prisoner's hands is projected on the facade of the St. Louis Public Library while another participant, a crime victim, addresses the prisoner through a microphone placed in front of the large projection. Like the Tijuana Projection, this project stages a spectacle in which some participants provide the content and others act as recipients.

Using this example as an aligned practice to public visualization prompts me to consider the following: what if people looking at visualizations were not thought of simply as users or readers or viewers? What if public visualization had an audience composed of spectators, participants and, most importantly, witnesses? This would require us to consider public visualizations as performances and not simply interactive installations.

The strength of these two examples is that the image isn't naturalized as part of the media landscape. It is contingent and temporary, as well as distorted, truncated and highly dramatic. Both Jaar's and Wodiczko's installations are contingent because they work by mapping one meaning onto another physical context to create a critical awareness. In fact, Wodiczko's St. Louis projection had to move from the courthouse building to the library building because the mapping was too powerful.

The mapping of data onto a new or old context can also be seen in some of Peter Greenaway's latest projection work. Greenaway's Wedding at Cana project highlights an important aspect of this modality of spectating or witnessing. The installation is based on the re-staging of Paolo Veronese's Wedding at Cana painting. The original, appropriated by Napoleon in the late 18th century, is displayed in the Louvre in Paris. For the installation, a full-scale facsimile of the original, located in the original refectory of San Giorgio Maggiore in Venice, was used as the foundation of a music and light projection performance.

The story of the painting and its projection/intervention gains meaning from its scale, its geographic location and its staging within the refectory. The site itself informs the reception of the story, and the spectators watch the story unfold before them. Here time and place are specific and not easily reproducible, granting the spectators the role of witnesses who can recall temperature, lighting, spatial arrangement, mood, silence, etc.

Place is also important in the Nuage Vert project (started in 2008) by the artist collective Hehe (Helen Evans & Heiko Hansen). The first installation took place in Helsinki, Finland, and it has since moved to various venues. And although it moves, location for this project is extremely important: each time the project is mounted, the same kind of place is chosen — smokestacks near residential areas. Hehe's project literally highlights (with powerful green laser projectors) the plume of smoke exiting the tops of smokestacks. In the case of the Helsinki project, the projection was coordinated with an "unplug" event in which local residents were encouraged to participate in a collective reduction of power consumption on a specified date and time, for a short period.


In addition to engaging in spectacle, visualization, site and participation, Nuage Vert also deploys another aspect of public. Bruno Latour, using John Dewey's conception of public, reminds us that publics are formed through the gathering of attention towards a common concern. Publics are assembled through controversies by virtue of the involvement of people in the generation, communication and incorporation of information. The same is attempted by the design collective Realities United (Tim Edler and Jan Edler) with their projection project Big Vortex, which will highlight the airborne carbon output of a waste-to-energy plant in Copenhagen, Denmark, designed by BIG Architects. Using technology similar to Hehe's Nuage Vert, Big Vortex will help Copenhagen's citizens count the number and frequency of carbon emissions, in the form of smoke-rings, expelled into the air directly above the city. Meant as a permanent installation, this project will presumably create a public around the issue of energy consumption.

On a practical level, making something public also entails using scale. Historically the realm of government and capital, scale ensures visibility to a wide audience. The Nuage Vert and Big Vortex projects necessarily work with this aspect. In the case of Big Vortex, the smoke-rings are estimated to represent 250 kilograms of carbon each, measuring 30 meters in diameter and 3 meters in height. These projections acquire a scale normally reserved for architecture, advertising and monuments.

Scale also implies power. The lure of this platform must surely be that images can achieve a magnitude at which there is the potential for multitudes of people to witness them. It must also be tempting due to the transgressive possibilities of temporarily re-writing the space and its surfaces. Since being public entails the notion of dominance, the transgressive potential relies on sudden and apparently off-script messaging and the reversal of this power dynamic, e.g. Nuage Vert.

The E-Tower project I created with Dave Colangelo and exhibited at Toronto's Nuit Blanche event in 2010 works with this potential, although to less subversive ends. The CN Tower, one of the world's tallest free-standing structures, and its external LED lighting array were used in an installation that visualized the "energy" of people attending Nuit Blanche throughout downtown Toronto. Over several hours (between 7pm and 7am) 5000 people sent SMS text messages to a specified number, registering their participation in the installation. As more people participated, the light emitted from the half-kilometre-tall structure cycled through progressions of colour and rhythm.

This project capitalized on the power of the large structure that dominates the urban landscape but was limited in communicative power due to the limited lighting array (most arrays on the tower were two "pixels" wide). Rather than use this platform to relay a concrete message, we attempted to engender and represent connectedness by showing that many people were able to do something at the same time throughout the evening. People contributed their "data" by participating, and these same people, as well as many more, witnessed the results of their collective action.

Participation can create a relational space (a term coined by Scott McQuire), which emphasizes a loose organization of spaces for play and interaction. Visualization's mantra of "making the invisible visible" implies, as discussed above, a revealing of hidden meaning. Relational space, when created through visualization techniques, may also manifest the act of spectating/participating. It comes into existence through the watching and not before it. It therefore may not so much aim to reveal but rather to engender participation.

It seems there is a voracious appetite for more data streams in order to make more visualizations. If data is a material, we want more of it in order to construct bigger things revealing more complex arrangements that represent previously unexamined aspects of society, politics or the environment. Yet the projects I have explored above show a different practice that takes relatively simple data streams and even simpler visual representations to open up complex questions and/or engender participation.

James Corner, through his use of the old German word "landschaft", suggests that there is a way of recovering what we may wish to highlight in images of landscape. The word, Corner explains, implies a deep and connected relationship between people, objects and space that is often organized through intervals of time. I would like to conclude this exploration by echoing Corner's emphasis on the process and unfolding of events, and therefore thinking of "program" rather than static description, as a way of thinking about how visualization in public space may help manifest the dynamic ways in which we are all implicated in a complex network of social connections, material effects and spatial arrangements.


From data to visual space
The impact of Data Visualization on deconstructing and mediating the extension of space
Bettina Schülke, Artist and Ph.D. Researcher at the University of Lapland

Abstract

Data visualization is currently a widely discussed topic in several scientific and artistic disciplines. While data is absorbed into virtual space, its physical and spatial ramifications are rarely considered. It has to be noted that all data is stored in a certain space, and data progressively transforms space into data space. Consequently, the rapidly growing field of data visualization requires, amongst other aspects, a radical redefinition of space. Former static definitions of visibility and information are being replaced by a new flexibility, transparency, dynamism, mobility and intangible qualities. Rigid forms mutate into fluidity with open access, characterized by immateriality and nonlocation. This process implies that our current understanding of spatial concepts is also going through a radical change of perception, of which data visualization is an essential part. Space is no longer exclusively the housing for context; instead, networked contexts generate space and revoke it again. The spatial henceforth characterizes a state of information production and generation, and redefines space as a dimension of interfaces. This transformation of visual information raises the question of how data visualization generates a new understanding of space and, vice versa, how spatial concepts influence data visualization. This paper focuses on the above-mentioned themes within the framework of a historical contextualization.

Introduction

Initiated by the rapidly growing popularity of data visualization, numerous novel tools and software programs have been developed of late. Data scientists are continually asking new questions of their data and revealing unexpected insights into our world.



Figure 1. Mindmap

My interest in this topic is driven by my background as a visual artist and by my ongoing research on space-related topics. Since data visualization deals with the transformation of data into visual information, it also becomes a question of spatial dimensions. Animated or interactive data visualizations unfold in space and time. Spatial topics such as topology, geography, mapping, and the visualization of distances, positions, connections or paths, to name a few, point to a rather close connection between data visualization and space or place. Countless scientific and artistic projects in this category deal in one way or another with topics closely related to spatial representations of data. To a lesser degree so far, artists and scientists have dealt with the question of how data visualization can affect our perception of space and data. This will be the main focus of this article, in which I will provide an overview of space-related topics of data visualization and our radically changing concept of space.


Spatial variables
Space in data visualization and information visualization: early beginnings

From a historical point of view it can be said that changing spatial concepts did not concern the early stage of data visualization, when data or information were transformed into graphic representation simply with a pencil. The first statistical data graphics date back more than 200 years, to when the Scottish engineer William Playfair invented the first graphical methods of statistics. Switching from pencils to computers did not, at its early stage, affect the core idea of visualization.

Today, as Lev Manovich points out, what various visualization techniques have in common is the reduction of data and the use of spatial variables such as position, size, shape and, more recently, curvature of lines and movement to represent key differences in the data and reveal the most important patterns and relations. In other words, he states that "infovis privileges spatial dimensions over other visual dimensions. We map the properties of our data that we are most interested in into topology and geometry. Other less important properties of the objects are represented through different visual dimensions - tones, shading patterns, colors, or transparency of the graphical elements."

All these parameters essentially have an effect on spatial ramifications and on objects and their relation to each other. By contrast, color effects can be addressed as non-spatial variables, and their function is different: they can help improve readability or add to beauty, but do not necessarily add new spatial information.

Figure 2. W. Playfair's Commercial and Political Atlas

Spatial relations in data visualization
How data and objects relate and how they affect each other

At the end of the 18th and the beginning of the 19th century, new techniques of visual representation started being developed. Several graph types, like the bar chart, the line graph and the scatter plot, were introduced and are still in use today. This scientific revolution of the graphic display of data first appeared in William Playfair's Commercial and Political Atlas. Spatial variables such as position and distance between points transformed data, creating better understanding. Today, for example, network visualizations still operate with spatial dimensions, like bar charts and line graphs, to show the connections between data and objects and their relation to each other.

An example of this kind of spatial data visualization is Manuel Lima's authoritative gallery visualcomplexity.com, which currently houses over 700 network visualization projects.

It can be observed that today we still follow some of the early major principles of information visualization: spatial arrangement for the most important dimensions of the data and the use of other variables for the remaining dimensions.

Figure 3. The Geotaggers' World Atlas


Lev Manovich explains in this context: "the privileging of spatial over other visual dimensions was also true of plastic arts in Europe between the 16th and 19th centuries. A painter first worked out the composition for a new work in many sketches; next, the composition was transferred to a canvas and shading was fully developed in monochrome. Only after that color was added. This practice assumed that the meaning and emotional impact of an image depends most of all on the spatial arrangements of its parts, as opposed to colors, textures and other visual parameters." As mentioned above, spatial arrangements are an essential element of diverse visualization techniques. Their relevance extends from the earliest stages until today. What changes, however, is our understanding and perception of spatial and temporal dimensions. Consequently, these developments affect our understanding of the space-related topics of data visualization.

Spaces are the result of action
New spatial concepts, characterized by fluidity and the extension of space

For millennia our perception of space was dominated by a body-focused spatial experience. Today, static definitions of visibility are increasingly replaced by a new flexibility, fluidity, dynamism and mobility. The physical coexists with electronic space, yet at the same time the sensual feeling of space vanishes into the virtual world. While we are blurring the boundaries of our physical space towards a virtual extension of space, processes connected with a spatially specific focus might drastically change in the near future. The transformation of data into visual information creates an additional impact on the representation of virtual spaces.

Figure 4. Invisible Cities


Our here and now is fundamentally influenced by telematic media and machines, which clearly affects the relation of location and space as a variable. Location mutates to nonlocation, presence to absence, and the disappearance of space is widely discussed. The hype about the spatial turn (first mentioned by Edward W. Soja in 1989) has expressed its impact in several scientific disciplines. In this context it has always been emphasized that this paradigm shift was not only about space itself, but also about the concept of space, in other words how space can be defined. However, already in 1967 Michel Foucault pointed out that the great obsession of the 19th century was history, but that our own era may rather be defined as the age of space.

When data was previously transformed in or through virtual space, it was often still represented as a static dimension. Initiated by a rapidly growing number of interactive visualization techniques, this static dimension is increasingly vanishing from the field of data visualization. As a result, animated or interactive data visualizations are becoming a dimension of space and time.

Thus in 2001 the German sociologist Martina Löw released her "Soziologie des Raumes", in which she introduced her idea of relational space-models. The goal of her theory is to break down the separation between absolute and relative positions of spatial thinking. She points out that spaces are the result of action. In her theory Löw takes leave of the image of the container space in favour of a social construction of space. Similar ideas can be applied to various interactive networked data visualization projects. Open access creates interaction in which data-spaces become the result of action. It can be observed that data visualization progressively reflects social interaction. Consequently, open data networks interact with participants, who have an impact on the construction of space.


Data space as interaction and information space
Space as the housing of content

Processes connected with a spatially specific focus are in the act of transformation. Manfred Fassler explains in this context that not only User Generated Content (UGC) will be the outcome; instead, he introduces the term User Generated Spaces (UGS). He works out that in such "simulation spaces" a fusion arises between storage space (which can contain data), generation and presence, following the transport and communication logic of information technologies. In other words, User Generated Spaces can be seen as temporary fusion spaces. The digital microphysics of the spatial becomes connected with communication-logical macrostructures. Such structures are able to combine worldwide storage spaces with incidentally arising real-local, individual needs of use. This provides the basis for Fassler's theory of Cyberlocalism, which he defines as a new dimension of the global production of space.

The sensual feeling of space vanishes, and at the same time its static category disappears. Space is no longer the housing for context; instead, networked contexts generate space and revoke it again. As Fassler expresses it, the spatial henceforth characterizes a state of information production and generation, and he defines space as a dimension of interfaces.

Figure 5. Science Matrix

Spatial sensual experience of data
Blurring the boundaries between the physical and the virtually absorbed dataspace

Data visualization can build a bridge between the physical and the virtual world, where participants can become the bonding link between the two.

Data used to contain information mostly stored in numbers, texts or bytes, and was frequently characterized by a static state of being. Through the knowledge transfer initiated by the process of data visualization, visual information gains a spatial dimension, frequently blurring the boundaries between the physical and the virtual space. It can be observed that an increasing number of artistic projects in this field deal with active perceiver participation and the involvement of sensual experience. A dialogue arises from what people feel and experience through data, and this will certainly need further investigation. Initiated by engagement with a more emotional experience, our perception is shifting. New terms like "data physicalisation", "data embodiment" and "datacrafting" were introduced of late at the Beyond Data Lab.

Image 6. Compath

We can only fully understand and perceive our physical environment through experience. In constructing reality, various modes, whether passive or active senses such as smell, taste and touch, affect us, while motions tint our experience.

Yi-Fu Tuan, one of the internationally most influential humanist geographers, explores in his "Space and Place: The Perspective of Experience" the ways in which people feel and think about space. For him it is essential to understand how experience is directed towards the external world. He explains that "to see and think are closely related processes. In English, 'I see' means I understand. Seeing, it has long been recognized, is not the simple recording of light stimuli; it is a selective and creative process in which environmental stimuli are organized into flowing structures that provide signs meaningful to the purposive organism."

But how can we ever develop a clear sense of place in cyberspace? The word cyberspace already contains the word space, which, compared to place, is far more anonymous and undefined; as Yi-Fu Tuan points out, space is freedom, while place is characterized by security. Will sizes matter anymore? Probably not, since they have already become variable as well. And what about sensuality? We are certainly increasingly influenced by a hybrid world. Physical space is extended into the virtual. Data visualization is connected to both of them.


Data spaces and how data can represent space in a new dimension
The transformation of data: creating a visual space from 2-D to 3-D

We live in a three-dimensional world but can understand four dimensions: we can only see three dimensions, while the fourth, that of time and space, we can only understand. Scientists, mathematicians, physicists and artists are already working on models that make higher-dimensional spaces visible. In their project "Quantum Cinema", Peter Weibel and Renate Quehenberger are currently investigating methods of making the world of quantum physics visible through a new media art project based on scientific research.

Figure 7. Quantum Cinema

Peter Weibel explains that since the 1920s abstract sculpture in particular has been influenced by higher-dimensional spatial visualization and mathematized spaces. Together with an interdisciplinary team of artists, mathematicians, physicists and programmers, they are trying to make phenomena of quantum physics and higher-dimensional theories visible through 3-D dynamic geometry for the visualization of unitary structures. The aim of the project is to explore the borders of spatial perception and to find clues for the depiction of space and time at the quantum level.

The team explains: "The Quantum Cinema Project is reuniting Art and Science by following the claim for a coherent picture of the Quantum World, approaching it by geometrical means as an item of art – in this case: Digital New-Media Art." XIV

"The Quantum Cinema holographic depiction model is based on the idea that geometry must not remain restricted to the 2-D page - merely remaining a shadow of algebraic concepts. We estimate that only the invention of digital media makes the appropriate outlook of higher mathematics possible and animated 3-D graphics are the new medium for a new geometry (…)." XV

Conclusion

If data visualization is seen as another way of learning, understanding and experiencing visual and space-related topics, it needs to be considered that our current concept of space is heading towards a radical change. Traditional concepts of space are equated with limits, sense and authenticity. These values still have their authority, yet at the same time they are no longer applicable in such a narrow sense today. One of the main reasons is that space has become a dimension of the interface. It can be observed that there is a need to think beyond traditional forms of data visualization towards a more intuitive or emotional understanding. This tendency goes along with an increasing interest in spatial experience. Physicality, environment and social issues demand a fresh approach to data visualization in a rather cognitive and sensual context.


Bibliography

http://www.psych.utoronto.ca/users/spence/Spence%20(2006).pdf. Accessed 29.9.2012

http://manovich.net/2010/10/25/new-article-what-is-visualization/ Accessed 29.9.2012

http://www.visualcomplexity.com/vc/ Accessed 29.9.2012


Soja, Edward W. 1989. Postmodern Geographies. London: Verso.

Foucault, Michel. 1992. Andere Räume. In: Barck, Karlheinz et al. (eds.), Aisthesis. Wahrnehmung heute oder Perspektiven einer anderen Ästhetik. Leipzig.

Löw, Martina. 2001. Soziologie des Raumes. Frankfurt am Main: Suhrkamp.

Fassler, Manfred. 2008. Cybernetic Localism: Space, Reloaded. In: Spatial Turn: Das Raumparadigma in den Kultur- und Sozialwissenschaften. Bielefeld: transcript.


"Beyond Data". 2012. Beyond Data Lab at the Lift 12 Conference.

Tuan, Yi-Fu. 1977. Space and Place: The Perspective of Experience. Minneapolis: University of Minnesota Press.


http://www1.uni-ak.ac.at/medientheorie/blog/?p=87. Accessed 29.9.2012

http://www.researchcatalogue.net/view/22616/22617 Accessed 29.9.2012

Quehenberger, R. C. Z., Weibel, P., Rauch, H., Katzengraber, H., Friemel, R. 2012. A New Digital 3-D Dynamic Geometry for the Visualization of Complex Number Space. Article. New Delhi.

Images

Figure 1. Mindmap. Bettina Schülke
Figure 2. http://www.datavis.ca/gallery/missed.php#playfair
Figure 3. The Geotaggers' World Atlas, http://www.visualcomplexity.com/vc/, Author: Eric Fischer
Figure 4. Invisible Cities, http://www.visualcomplexity.com/vc/, Author: Christian Marc Schmidt, Liangjie Xia
Figure 5. Science Matrix, http://www.visualcomplexity.com/vc/, Author: Oliver H. Beauchesne
Image 6. Compath, http://www.visualcomplexity.com/vc/, Author: Norimichi Hirakawa
Figure 7. Quantum Cinema, http://quantumcinema.uni-ak.ac.at/site/research/discrete-mathematics/


Design and Re-design
Dóra Fónagy, Founder of DEFO Lab

Abstract

What do we actually design? The question is appropriate in Hungary, because in most cases the client and the designer mean something completely different by the term design. It is therefore a common problem to make ourselves understood by the other side. This text comes from a creative person, more as an exploration of working with non-designers than as a set of given facts.

The ultimate aim of all creative activity is to bring happiness to people's lives. Happiness is an emotion that comes as a result of positive experiences and affects human beings. As the founder and leader of defo labor (http://defolabor.com/), my team and I design feelings. It sounds unusual, doesn't it? Let me explain.



Historically, design has been treated as a downstream step in the development process—the point where designers, who have played no earlier role in the substantive work of innovation, come along and put a beautiful wrapper around the idea. To be sure, this approach has stimulated market growth in many areas by making new products and technologies aesthetically attractive and therefore more desirable to consumers or by enhancing brand perception through smart, evocative advertising and communication strategies.

Figure 1: The rhythm of design strategy and design communication during creation

During the latter half of the twentieth century design became an increasingly valuable competitive asset in, for example, the consumer electronics, automotive, and consumer packaged goods industries. But in most others it remained a late-stage add-on. Now, however, rather than asking designers to make an already developed idea more attractive to consumers, companies are asking them to create ideas that better meet consumers’ needs and desires. The former role is tactical, and results in limited value creation; the latter is strategic, and leads to dramatic new forms of value.

Now design means: 'Design is one of the basic characteristics of what it is to be human, and an essential determinant of the quality of human life. It affects everyone in every detail of every aspect of what they do throughout each day.'1 This means design is not only about the direct construction of an object; it is also a roadmap or a strategic approach for someone to achieve a unique expectation. It defines, specialises and optimises, and the result is universal: a language which is understandable to all.

In the last few years we have had several projects connected to services, such as hotels or restaurants. When a client decides to give us the mission, he or she usually thinks about a place full of furniture, so the zero step for us is to teach him or her that we design for people. This means that during a project we have to consider the aesthetic, functional and economic dimensions, but we also have to do research, analysis, problem solving and development, and this process may involve modeling and re-design too. In other words, service design is not driven by a single design discipline. Instead, it requires a cross-discipline perspective that considers multiple aspects of the brand/business/environment/experience, from product, packaging and retail environment to the clothing and attitude of employees. It may sound overcomplicated, so I would like to go into the details by explaining one of our latest projects, called Laci!Pecsenye?.

The name Laci!Pecsenye? refers to a unique Hungarian custom: eating at the butcher's. There are several 'fast food' dishes made from all kinds of meat, but 'pecsenye' is made of beef; it is practically a thin steak served with cabbage. This restaurant was planned for the CET building, which is very close to the Main Marketplace and surrounded by several universities. The client's goal was to offer a gastronomical experience in a simple and fresh way.

Figure 2: Laci!Pecsenye?

Location is highly important. CET stands for Central European Time; 'cet' is also the Hungarian word for whale. The mixed-use development CET stands at the Közraktárak, between Petőfi Bridge and Szabadság Bridge. The CET concept refers to Budapest as an important metropolitan centre in the heart of Central Europe, and its shape refers to the smooth and friendly streamlined body of a whale. The name and shape of the CET symbolize its cultural potential and commercial pole position in one of the best-preserved cities in the world. The body of the CET landmark building is developed along the flow of the Danube; its architectural and urban expression evolves with the direction of the flow. The CET's origin stems from the side of the city centre, grows in size between the two parallel existing buildings of the Közraktárak, and then culminates at the south side, the side of the National Theatre and the new Cultural Centre, in a striking landmark building representing the state of the art in architectural design and building technology. Its impact on the city will not be unlike that of the demolished 19th-century Elevator Building, from where goods were distributed to the six warehouses which originally occupied the banks of the Danube.

1 John Heskett (2002): Design: A Very Short Introduction. Oxford University Press, London, p. 2


Our work started with observing the environment while researching meat market culture. We found raw, basic, somewhat direct indoor places surrounded by the wonderful Danube and nature. By collecting objects such as furniture, font styles, lamps and even colours, we started to translate this traditional atmosphere into an easy-going contemporary form.

Unfortunately the project could not succeed in this form, because there were some troubles with the opening of CET; but the owner still wanted to open such a place, so he commissioned a new interior in the heart of Budapest, in the neighbourhood of St. Stephen's Basilica. The basilica is named in honour of Stephen, the first King of Hungary, whose right hand is housed in its reliquary. Today it is the most important church building in Hungary and one of the most significant tourist attractions in Budapest. Its architectural style is neo-classical, and it is fronted by a spacious square. We needed courage to place our restaurant in this sacred location, which of course also meant re-design.

Besides the original elements of Laci!Pecsenye?, we had a new opportunity: a huge terrace. This district is full of bars, cafés and confectioneries, all in all places to sit and relax, so we had to find a way to catch people's attention and make them stay, start conversations with local people, or just have fun. We found that an installation was the right tool to achieve this.

Installation as a term for a specific form of art came into use fairly recently. Essentially, installation takes into account a broader sensory experience, rather than floating framed points of focus on a 'neutral' wall or displaying isolated objects on a pedestal. This may leave space and time as its only dimensional constants, implying the dissolution of the line between 'art' and 'life'.

We wanted to create an installation which involves the users and moves them in ways in which they do not usually act. In the end, our traditional form guided us down an irregular path. Laci!Pecsenye?'s traditional voice needed another special Hungarian beverage, called fröccs: a popular drink mixed from wine and soda water in varying proportions. Drinking wine and eating with others has been one of the basic experiences since the beginning of human life. But why not spice it up with some more experiences?

Figure 4: Location

Figure 3: Furniture, typography research and colour code for Laci!Pecsenye?

Figure 5: Fröccsterasz bar concept


Our goal was to:
• Make people feel confident in themselves.
• Make people feel they can do something better; empower them to do it in a better way.
• Give people an enjoyable and fun time during the experience, thus making life worth living.
• Surprise people in a magical way, bringing delight to the eyes and making the mind wonder.
• Create an emotional connection between everyone involved, the experience itself and the one supporting the experience (a brand or a person).
• Make the world a better place to live.
• Strengthen relationships between people who share the same experience.

All my thoughts introduce the idea of design thinking, the collaborative process by which the designer's sensibilities and methods are employed to match people's needs with what is technically feasible and viable as a business strategy. In short, design thinking converts need into demand. It is a human-centered approach to problem solving that helps people and organizations become more innovative and more creative. Design thinking is not just applicable to so-called creative industries or people who work in the design field. It is a discipline that uses the designer's sensibility and methods to match people's needs with what is technologically feasible and what a viable business strategy can convert into customer value and market opportunity. So, all in all, (responsible) design focuses on people's actual needs rather than trying to persuade them to buy into what businesses are selling. Because people feel before they think.

Figure 6: Sketches for installation

Figure 7: Fröccsterasz installation concept


Graphical and physical data visualization in urban space
Mag. art. Harald Moser, Ars Electronica Futurelab

www.aec.at/zeitraum

www.wollle.com/index.php?option=com_content&task=view&id=13&Itemid=28&lang=en

Abstract

Urban data visualization can explore many different representations. Focusing on a graphical ("ZeitRaum") or a physical ("GoE") presentation signifies two distinct approaches. At the core of "ZeitRaum" is an imaginary space, one at the interface of all the world's airports. Passengers enter it when they pass through a security checkpoint prior to takeoff, and leave it after touching down at their final destination. This space's boundaries are constantly shifting in accordance with current air traffic. The borders of nation-states, time zones and geographic allocations lose their relevance. This space hosts more than five billion people a year, people who are total strangers and yet feel that they're temporarily interconnected as fellow members of a temporary community. "GoE" also connects urban spaces, as a comparison of metropolises dealing with pollution. Real-time pollution data is projected onto physical objects (lettuce), which react to a metropolis's environmental space.

Introduction

From the beginnings of humanism, philosophers have tried to understand the concepts of time and space. Plato, Aristotle, Augustine, Newton, Kant and many more developed theories seeking answers about these important factors.

In modern history (perhaps since the industrial revolution) time and space have become even more significant in our society, and they will become still more essential (since the digital revolution started), partly because the amount of transferred data increases constantly, as Moore's law, for example, describes. Not only is the amount of data important; the real-time aspect of transferring data is even more relevant in these projects.



"ZeitRaum" Aesthetics

To visualize the data of an urban connective space, the so-called airport, one has to start thinking about the essence of an airport and of travelling and transportation in general, and furthermore about how passengers can participate in the airport as a connective space. The link between airports, and the airport as an imaginary space with its own society (5 billion people a year), laws and time, is visualized to the inhabitants. The goal is to find a visualization of the air traffic data and link this data directly to the passenger. Raw air traffic data (see Image 1) is transformed from a technical basis into a landscape-based text visualization.

Every arriving or departing flight is transformed into hills and valleys, all of them constantly in motion because their growth is a function of current arrivals and departures. Every takeoff engenders a hill, every landing a valley.
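As a hedged illustration of this rule only (the installation's actual software is not reproduced in this paper), the deformation can be sketched over a one-dimensional height field; the event format, amplitude and falloff shape are assumptions of mine.

```python
def apply_event(heights, position, kind, amplitude=1.0, radius=5):
    """Raise a hill for a takeoff, carve a valley for a landing."""
    sign = 1.0 if kind == "takeoff" else -1.0
    for i in range(max(0, position - radius),
                   min(len(heights), position + radius + 1)):
        # Linear falloff: strongest at the event position, fading outwards.
        falloff = 1.0 - abs(i - position) / (radius + 1)
        heights[i] += sign * amplitude * falloff

landscape = [0.0] * 40
apply_event(landscape, 10, "takeoff")   # a departure raises a hill
apply_event(landscape, 25, "landing")   # an arrival carves a valley
```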

Invited authors from several disciplines have written texts about this special phenomenon of the "ZeitRaum". These texts have been written in 7 languages; they are triggered by the different languages of the destinations the planes depart to or have arrived from.

Image 1: Incoming data stream

Image 2: design prototype I

Image 3: design prototype II

Image 4: design prototype III


GoE Aesthetics

The visualization aspects of the data focus on the physical output – how the lettuce is influenced by the ozone. The aesthetics are driven by this different approach to designing the installation. Pedestals are built in a very minimalist way, with only a few lines, few colors and little information. In terms of color, the only bright color in the artwork is green, which is repeated in the salad, the Japanese letters and the digital display showing the current pollution level of the city. The proportions in which this color is used also reflect the importance of the sole piece of information: the visitor's view is conducted from the most important thing, the salad, to the name of the city and finally to a mere number on the back side of the box, which reflects the amount of pollution. These aesthetics were chosen to create the stern and cold impression of a scientific experiment.

Scientific experiments have a very simple setup, as they concern very determined research. In these conditions, the way of gaining new knowledge seems to be unstoppable (otherwise the knowledge can't be reached) and therefore cold and cruel – there is no place for feelings. In this hopeless situation people start to feel very empathetic towards the salad. When they begin to realize that this situation reflects reality, beholders are left with a very strong impression. Another point we want to make by quoting scientific projects is to make a kind of alien out of the salad. Something we eat every day, or at least quite often, suddenly becomes something we don't know anymore, something we have to research. This refers to daily life and should make people wonder about their basic knowledge.

Image 5: Garden of Eden at the Ars Electronica Festival 2007

Concept ZeitRaum

Passengers enter the ZeitRaum when they pass through a security checkpoint prior to takeoff, and leave it after touching down at their final destination. From the moment they enter, they share this space with all passengers at all airports globally.

This space’s boundaries are constantly shifting in accordance with current air traffic. Passengers who are total strangers and yet feel that they’re temporarily interconnected as fellow members of a temporary community. The expansion of this space is diversified by the imagination of its “temporary residents”.

This space is measured in the "metric sense of time". The borders of nation states, time zones and geographic allocations lose their relevance. Connected destinations suddenly become much closer, while the rest of the world remains far away. This imaginary space, the so-called airport, brought us to the concept of a real-time interactive data visualization environment.

The Check-in 3 terminal at Vienna airport gives its passengers – the "residents on time" – an insight into this melting pot of cultures, languages, times and imaginations.


After check-in – at the moment of the boarding pass check – a person's approach triggers a cloud of letters falling down the wall. They come to rest at the bottom and settle into texts that, in turn, form the topography of a landscape. Hills and valleys take shape in this way, all of them constantly in motion because their growth is a function of current arrivals and departures. Every takeoff engenders a hill, every landing a valley. The more passengers are in the ZeitRaum, the more detailed its "description". The expansion and deformation of the space correspond to the distance of a flight and its passenger numbers. Passers-by leave marks in their ZeitRaum and affect, by the degree of their attention, the deformation of the space.

Image 6: Terminal Check In 3

The terminal is the “gate” to the ZeitRaum. Here the boarding pass control is located. The moment passengers enter, a deformation of the landscape appears and letters begin falling from the wall at the back.

Image 7: Terminal Check In 3 – A12 Wall

Image 8: Terminal Check In 3 – A3 Wall


Time Creatures

The diversity of the time creatures, with their specific behavioral and interaction patterns, influences the ZeitRaum. These interaction patterns show a word-by-word translation of the texts into the national language of the country a flight departs for or arrives from. In total, 60 languages are integrated.

Image 9: Time Creatures

They appear in the landscape as windows into the virtual ZeitRaum. How many time creatures appear depends on where an aircraft is coming from and how many passengers are on board. When an aircraft arrives at a gate, the current time displayed is that of the departure airport’s time zone. The international air traffic events thus represent time as the metric of a space in constant motion.

Concept GoE

The project Garden of Eden (GoE) focuses on the problem of air pollution. Eight pedestals, each covered with an airtight Plexiglas box, were exhibited. Via the internet, the latest air pollution levels in the capitals of the G8 countries are retrieved and sent to the control system of the installation.

Based on this data, the system artificially reproduces these pollution levels inside the boxes, each of which contains a lettuce that serves as an indicator of the air quality inside the capsule. The lettuce, exhibited in showcase-like containers, becomes an object, a sculpture that speaks in nature’s own language about its state.
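To illustrate the data path just described – pollution readings fetched over the internet and forwarded to the installation’s control system – here is a minimal Python sketch. The JSON endpoint, field names, refresh interval and controller callback are all hypothetical stand-ins; the text does not specify the installation’s real data sources or hardware interface.

```python
import json
import time
from urllib.request import urlopen

G8_CAPITALS = ["Berlin", "London", "Moscow", "Ottawa",
               "Paris", "Rome", "Tokyo", "Washington"]

# Hypothetical endpoint returning e.g. {"city": "Tokyo", "ozone_ppb": 41}
API = "https://example.org/air-quality?city={}"

def fetch_ozone_ppb(city):
    """Fetch the latest pollution reading for one capital (assumed schema)."""
    with urlopen(API.format(city)) as resp:
        return json.load(resp)["ozone_ppb"]

def update_boxes(send_to_controller):
    """Poll each capital and forward its pollution level to one box."""
    for box_id, city in enumerate(G8_CAPITALS):
        level = fetch_ozone_ppb(city)
        send_to_controller(box_id, level)  # e.g. a serial write to the gas-mixing unit

while True:
    update_boxes(lambda box, ppb: print(f"box {box}: reproduce {ppb} ppb ozone"))
    time.sleep(15 * 60)  # refresh every 15 minutes (assumed interval)
```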

Image 10: Garden of Eden, Ars Electronica Festival 2007


GoE is an artwork in the domain of Hybrid Art & Data Visualization, which combines the classic and scientific setup of plant chambers with new media technologies (such as the internet). Thus, it connects the modern zeitgeist, in which all information is freely accessible – and “google”-able – with rather traditional scientific methods.

Image 11: Garden of Eden, Lettuce

Conclusion

There is already a variety of urban data visualization environments in our society, and inhabitants will be confronted with ever more data in public space. Transforming raw (technical) data into data that is easy to decode will be the task of its designers.

Both projects involve a dedicated community that is able to interact with the system. In the ZeitRaum project this interaction is immediately displayed as graphical output; in GoE the interaction process is displayed as physical output. Interacting – trying to produce less pollution – is a long-term process for city inhabitants.


Spectacle and Mediation: The Body Visualized1

Nina Czegledy1 and André P. Czeglédy2 1 Senior Fellow, KMDI, University of Toronto, Adjunct Associate Professor, Concordia University, Montreal, Senior Fellow, Hungarian University of Fine Arts

2 Associate Professor, Program Director, Anthropology, Wilfrid Laurier University, Waterloo, Canada

Introduction

The development of advanced biomedical technologies, particularly those linked to visualization, has radically changed the relationship between subject/patient and the agency of diagnosis and/or bodily interpretation. The increasing use of such technologies in biomedicine, as well as their proliferation into the popular imagination through the various fora of education, art and commerce, has furthermore resulted in a revision of the bodily visual that can no longer be ignored.

Imaging systems and visualization techniques have successively contributed to a significant shift to multiple perceptions of the human body that have had profound social and cultural consequences in changing the way we see ourselves. This has led to a situation whereby the visual material of our bodily reality is increasingly supplied through a variety of technological means and the corporeal is now often seen in ways distinct from human vision, and completely dependent upon the mechanics of biomedicine. What does this mean in terms of both the science and the art of the body as a social concept? How have contemporary artists, in particular, sought to capture the intricate shifts in corporeal identity? What is the significance of technical mediation in translating the images – and perhaps imaginaries – involved? These and other questions are addressed in a reflective discussion that focuses attention not only on the ways in which biomedicine has created new sights of the body, but also on how issues of mediation may increasingly play a role in understanding what might be termed the longer historical process of visual alienation. Such alienation has its conceptual roots in other forms of separation, most notably the sort of economic alienation proposed by Marx and its social counterpart discussed by Durkheim.

The history of bodily visualization is ably represented in the field of medical science. Medicine originally sought to capture sight of the body as a definitive field of scientific endeavor suitable to both the ending of debilitating illness and the general prolongation of life. In the process of this never-ending quest, it developed new ways of “seeing” the body and interpreting its presence via substantive images of analysis outside the bounds of corporeal reality. Such mediated representation in the service of Science quickly became accepted as a revealed Truth beyond a subject’s own possession and knowledge. Centered on the above themes and preceded by historical contextualization, our discussion focuses attention on how the new perspectives of biomedical visualization have inspired artists to produce novel aesthetic and artistic creations.



In his Economic and Philosophic Manuscripts (1844), Karl Marx details four types of Entfremdung (alienation) that accrue to the worker under capitalism and within a stratified society.2 Each involves a different form of disenfranchisement that nonetheless ultimately leads to the worker losing control of themselves as much as their livelihood, entering into a Faustian bargain in order to survive in an environment that is calculatingly exploitative, if not outright hostile. For current purposes, it is this encompassing environment and element of knowing manipulation that will be of acute resonance within our discussion, for both highlight the play of power as a fundamental yet often overlooked dynamic in the social and cultural relations that are entwined with all forms of technology, whether of the industrial kind reminiscent of a century (or two) ago, or of the biomedical kind which has become so prevalent in society today.

In contrast to Marx’s conceptual foundation lies the later work of Émile Durkheim on anomie (social alienation) which features the individual as someone whose life is so governed by societal norms that their response is to seek distance from social conventions altogether. Such alienation is often the result of a fundamental dislocation between the images prevalent in wider society and the ability of the individual to realize them. The psychological stresses of conformity may therefore lead to extreme behavior – which is why Durkheim’s discussion lies within his volume on Suicide (1897). The ideas of social alienation here are every bit as influential in reminding us just how subtle and personally destructive various processes of separation can really be. While alienation in these terms takes place on a strictly individual basis (as opposed to the collective nature of class perspective), Durkheim’s concern is primarily to understand the tensions between individual and society that develop out of the unease, dissatisfaction, and general disillusionment that accompanies modernity. These tensions are, if anything, magnified in the post-industrial setting of contemporary biomedicine, for the self-interpretation of our bodies has been increasingly eroded on the long historical road to today.

History and Sight

Until quite recently, the human body could only be viewed through the naked eye. Then in the 17th century the invention of the compound lens microscope altered the nature of direct sight by adding an ability to see beyond ordinary vision. Entire new micro-worlds of existence were suddenly opened up to examination and we were able to reflect on forms of internal scale hitherto undreamt of. Nevertheless, there was little room for bodily distortion due to the necessity of corporeality: the visual knowledge of the body was directly connected to its presence in terms of the technology used. Simply said, without tissue being present, it remained impossible to see the corporeal. Even into the 18th and 19th centuries, with the exception of clinical photography, visualization technologies were uniformly rooted in immediate visual sensibility. Consequently, visualizing the body depended upon a temporality of visual access which ensured that personal control of the material body involved a significant amount of control over its image. This was largely because, with the exception of graphic/sculptural art, there was no way to fully reproduce the body apart from its presence. However, in the modern era, the development of advanced biomedical technologies linked to visualization has radically changed this relationship. The relationship has been most affected – and considerably effected – by the body being subsumed under technological control and visual re-creation. Consequently, just when bodily imaging has been placed at the centre of much of the new diagnostic and clinical medicine, the medical image of the body has been de-centred from both personal and much of immediate sensory experience. This process has meant that the body and its image are far less a matter of conventional visualization, and therefore common knowledge, than previously. Instead, we now have greater and greater technological mediation that reinterprets and (re)presents our corporeal selves. How so?

Prior to the appearance of complex visualization technologies, it was the sensory body that universally informed judgments of diagnosis, so that the only way to see beyond the plainly material realm of human sight was to engage artistic visions of one kind or another. The Physician explained what we were ourselves seeing, and the Artist provided different ways of depicting our sight. The former might refer to esoteric presences such as the balance of so-called bodily ‘humors’ which were once thought in a befuddled Europe to account for all manner of physiological conditions and ailments. In contrast, the latter could – at most – be accused of some obvious distortion by way of technique and/or graphic style. But more importantly, it was then very clear that (especially) the physician relied on an established corpus of anatomical knowledge that was no different from the realm of self-analysis open to the patient (or for that matter, the artist’s model and audience). In such terms, the empirical basis of consideration was substantially the same for anyone interested in both seeing and understanding the body. Today, however, the rise of new visualization technologies has allowed the visual material of our bodily reality to be supplied through a widening variety of technological means that frequently downplays ordinary sensory knowledge and experience.

1 This discussion draws upon Czegledy & Czegledy (2000).
2 Alienation is also covered by Marx in The German Ideology (1846).


What we see is no longer enough. Moreover, it is increasingly no longer even the ‘reality’ accepted by many medical authorities. This means that we are increasingly being told that our bodies are not simply what we see, hear, touch and smell, but a range of deep structures and processes captured by the different eyes of technological visualization. The result has left the physical sciences ranging from chemistry, physics and biochemistry through to astronomy, neuroscience and molecular genetics all depending upon forms of technological imaging that are changing the way we see ourselves as much as the world around us.

Technologies of Visualization

The discovery of x-rays by Wilhelm Röntgen in 1895 fundamentally changed both the way we see the body and the ways in which we understand it as a site of material reality and as a set of relationships linking people together in personal as well as medical dimensions. The new expertise which emerged from this discovery made technology as important as its interpreter, in addition to making its interpreter far more ‘knowledgeable’ about the state of a person’s body than the person him/herself. All of a sudden, our bodies were no longer what we saw them as, but what they were shown to be. This was because while x-rays more or less stripped the body of its corporeal encumbrances, they also set the nascent precedent for technology as the very arbiter of bodily reality. Not even many medical practitioners were a part of this elite development, as higher and higher forms of technology became the preserve of authority for fewer (and narrower) specialists working with data removed from our bodies under their expert interrogation and interpretation. An intrinsic part of the process of separation that saw the subjects (patients) of especially medical science removed from knowledge of their own bodies was the increasing fetishism of technology far beyond its previously functional roles. While this was but part and parcel of the more general social shift in attitudes towards technology in the modern era, it also involved an over-appreciation of the very instruments of progress beyond their utility. On one hand, such appreciation resulted from the startling realization that for the first time in human history the tools of humankind allowed us to see the unimaginable – and made it manageable in the process. On the other, it drew on the public’s fascination with the very evident power of technology to extend the body beyond itself (McLuhan 1964). Never before had tools of one kind or another become so important as both the creators and the carriers of knowledge. This was particularly the case with electron microscopy, the first technological leap to truly bridge the internal gap in scale between what could be seen and what became practically imagined.

Although the Hungarian physicist Leó Szilárd tried to convince Dennis Gabor to build the first electron microscope as early as 1928, it was only realized in 1931, when the German physicist Max Knoll succeeded in enlisting Ernst Ruska to develop the first working prototype. Importantly, the four-hundred-power magnification of this instrument did not exceed the maximum of conventional optical microscopy of the time, but within two years a more advanced version had established parameters far beyond the application of ordinary lenses. Such a revolution in magnification added an important new dimension to how we see the body. Unlike previous technologies which revealed the corporeal to scale, completely new terrains of corporeal structure were revealed to both patient and practitioner alike. These internal landscapes repositioned us within an entirely new spectrum of living environments and, thereby, directed attention back within our bodies. The problem, of course, was that the technology involved was so remote, so esoteric and so complex that it was easiest for anyone but an expert to skip even routine questioning of the sorts of interpretation that might lie within the images produced – and immediately proceed straight to the results as if they were unambiguous, absolute values representing a plain reality. In the process, all sense of criticism was lost in the technological shuffle that had separated our sight of the body from the visually recognizable.

More recently, high-resolution electron microscopy has provided even more powerful micrographic images on a molecular level. Other technical advances related to enhanced visualization include TI (Thermal Imaging), MRI (Magnetic Resonance Imaging), CT (Computed Tomography X-ray imaging) and PET (Positron Emission Tomography). All of these technologies have worked to increase the ways in which seeing the body is no longer enough to understand it. Only by way of technology can we now be sure that what we see through our senses is ‘real’.

Of course, the reality that technology has given us is in fact a multitude of realities, given that a range of new visualization technologies now provide very different ways of seeing the human body. Today, the body as an image may be (re)constructed through heat signatures (in thermal imaging), electrical impulses (electrocardiography), magnetic impulses (magnetic resonance imaging) or the reverberation of sound waves (sonography), for example. This increasing variety provides very different perspectives (on supposedly the same thing), thereby widening (instead of simply clarifying) interpretation of the body. Accordingly, we are faced with greater and greater challenges in making connecting diagnoses that unify both the technologies in play and their disparate interpretations.


Much of the method of making connections between the multiple interpretations of technology seems to lie in the newest application to all forms of data representation, let alone just visualization: digital technology. Digital technology allows us to play with both time and space by compression or expansion; it simultaneously gives us the opportunity to quickly – and masterfully – manipulate input data in ways that can fundamentally cut out nearly anything, transform anything, and replace it with anything… else. The simultaneous ease and potential involved remind one of the original practitioners of visual representation, who chose to experiment with a growing number of techniques and perspectives to deal with the fundamental problem of verisimilitude well before Karl Popper’s (1963, 2001) questioning of the scientific path towards understanding humanity as much as society. Not surprisingly in our post-modern era, it is artists again who are challenging contemporary truths regarding the value of visual representation.

Art and Sight

The new perspectives of biomedical visualization, as well as new software tools, have both contributed to and inspired many artists to create novel artworks that comment on the nexus between technology, society and the corporeal. Data visualization is usually regarded as a specialized field restricted to the more technical sides of computer science, engineering and, to a lesser extent, commerce. Yet in contrast to popular belief, data visualization in the arts is an evolving field that is finding considerable ground in developing new perspectives on the role of technology in society. While the basic techniques used by many artists would be familiar to those in the scientific visualization community, the motivations and creations are very different. This is not least because artistic visualizations are visualizations of data done by artists with the intent of making art that reflects on the connections between technology, society and culture. More specifically, artists are interested in understanding the abstract relations that envelope and are carried through technology.

Justine Cooper, for example, constructed Rapt by using Magnetic Resonance Imaging scans and medical software to animate the slices of her own body in order to reflect on issues of corporeal reflexivity. Writing about her work with medical visualization technologies, she notes how: “The fluid transformations of the body in these works echo the materiality lost at the moment of its digital imaging, and its re-materialization as information. Just as the body is re-codified through medical technology, so its internal spaces and brute physicality are remapped and made accessible in these works. Living flesh is translated into malleable data.”3 On the one hand, Cooper’s art speaks to the notion of artifice and the ways in which all kinds of visual representation are fundamentally open to manipulation. On the other, from her perspective there is a sense in which our bodies remain substantially unknowable to us, and the history of western art – and especially post-modern art – is duly studded with moments in which new and surprising images of the human body emerge, shocking us into a fresh recognition of our place in the world, and of the worlds within us.

Such a perspective on bodily positioning is shifted rather than amplified by Fred Laforge in his own deconstructions of the human body via pixelated drawings that emphasize a fascination with non-standard morphologies.4 This approach investigates the formal aesthetics of classical art along with more subjective concepts like Beauty. It thereby questions constructs of Permanence and Translation while simultaneously reminding us that so much of the visual framework of our society is open to technological mediation. Interestingly, even though Laforge reduces humans to their most basic geometry, each artwork still seeks to retain the unique character of an individual – his way of noting that despite the techniques of abstraction there remains a necessity for recognizing the core individuality of our humanity. In this way, Laforge reveals the uniqueness of the human body by reducing it to basic visual elements whose arrangement differs from person to person. He highlights the binary opposition of the general and the specific which is often integral to forms of representation (Hall 1997:234-235). By doing so, however, he also suggests an evanescence of the body, and possibly even its eventual disappearance.

In contrast, Marta de Menezes works at the intersection of art and biology, aiming to visualize what is “hidden under our skin” – especially in her Functional Portrait: Picturing the Invisible Body installations.5 Within this larger project, she reconstructs – with the aid of Magnetic Resonance Images – her own portrait presenting a cerebral map. By her own admission: “In this work I have been using functional magnetic resonance imaging (fMRI) of the brain, in order to visualize the regions of the brain that are active while a given task is being performed. With this visual information it becomes possible to create portraits – functional portraits – where besides the physical appearance of the subject, the function of its brain while performing a chosen task is represented.”

3 http://www.acmi.net.au/justine_cooper.htm#essay
4 See http://www.galeriesas.com/spip.php?article742
5 http://www.martademenezes.com/?page_id=102


In this way, her portraiture becomes a commentary not simply on the self, but also on the relationship between the person and what they do in society – where they fit into the complexity of modern life. As Morus (2002:3) emphasizes, “Anthropologists have long been aware of the relationship between social organization and bodily demeanour and description.” This knowledge underscores the connections between bodily activity and identity while pointing, as a corollary, to the significance of representation as an active social dynamic.

Not surprisingly, given contemporary issues of intellectual property and copyright, Dario Neira’s artwork similarly utilizes visual representations of the artist’s own body. Somato Landscape is a scanned self-portrait offering a microscopical self-image, fundamental and organic, yet still narrating the subject’s emotional state. “In the same direction, Neira’s works tell of a third nature, a dimension that constitutes the union of art, science and the sacred, since human beings, aware of their body processes and mechanisms, have always wondered about the mysteries of existence and of death.”6 This particular constellation of elements has precedent in Victorian England, when many spiritualists from the 1850s onwards “… sought to defend the conventions of the spirit circle by appealing to analogies between séance bodies and scientific instruments although this did little to thwart spiritualism’s fiercest opponents” (Noakes 2002: 126). Many of the critics of spiritualism pointed towards its seeming affinities with the magical arts of illusionists, conjurers and the theatrical. Some of them went so far as to replicate the ‘visions’ of the séance in order to debunk the medium profession. Yet even when audiences were informed of the visual manipulation beforehand, they were nonetheless drawn to performances which played with their imagination in such a way as to accept the power of technology over and above personal bodily experience.

Perhaps because of an awareness of the historicity of changing social and cultural approaches to technology and its visualization of the body, Joyce Cutler-Shaw’s work questions whether present-day medical visualization techniques point to a significant shift in the perception of the body (in contrast to long-standing historical explorations when the body was solely viewed through the naked eye). The ongoing Anatomy Lesson has, since 1994, investigated aspects of the history of anatomy as a way of understanding the range of shifts within practices of visualization. The resultant work centers on the body, but does not fetishize it. “Exploring across the disciplines of art and medicine”, writes Cutler-Shaw, “… I have discovered the medical field to be an arena for the newest forms of body representation. It is at the intersection of art and medical science that new insights in interpreting the physical self can emerge. Moreover, no field confronts issues more contentious than the medical, such as when life begins and when it ends, and the limits of normalcy and the aberrant.”7 For this reason, in her work she evokes the current options of an enhanced body fitting into the contemporary social/cultural/medical environment. Represented by digital imaging, our bodies become transparent and more virtual than physical. For her, our challenge “is to understand and respond to the implications and consequences of these advancing phenomena that culturally define us.”8

Revealing the Body

Cutler-Shaw’s artwork reminds us not only that the nexus between technology, the body and visualization is a complicated one, but also that it possesses a varied history in which an extremely heterogeneous mix of agents, forces and perspectives have taken part. The work of the artists noted above investigates the intersecting nexus of visual art, medical science and technology, thereby highlighting insights into interpreting the representation of the physical self by way of the products of various forms of instrumentation. Such enquiry becomes ever more important as the spectacle of the human body has made corporeality more transparent, exposed, and vulnerable than ever before. What is at stake here is not only the extent – and the different ways – in which the human body can be visually ‘read’, but also the level and limits of mediation in both representation and interpretation. An intrinsic part of the dynamic at play is whether or not (or simply the extent to which) we are party as individuals to the machines and machinations involved. Are we aware of what is going on? Would we understand its implications even if we were told? These two questions rebound within the tensions caused by technology’s power to appropriate the image of the body as well as the development of interpretation as an expertise completely removed from ordinary understanding. Never mind that in biomedical circles it is now possible to consider physical manipulation at the genetic level…

As Herminio Martins points out in his analysis of the problem of technology in the post-modern era, the great philosopher of history Augustin Cournot (1801-77) may have got it wrong. He notes that Cournot believed that “organic life will never be understood in as fundamental a way – at least as far as scientific cognition is concerned – as the physical or the human worlds, both of which are susceptible to indefinite mechanization in a way that organic life could never be” (1998:155). Yet Cournot did not (and could not) anticipate the power of new technologies such as those found in biomedicine beyond his time. These evolving technologies do not just grant us new ways of seeing ourselves, they also change the very possibilities of intervention.


Bibliography

Czegledy, Nina & Czegledy, André 2000 “Mediated Bodies” in The Body Caught in the Intestines of the Computer & Beyond: Women’s Strategies and/or Strategies by Women in Media, Art and Theory Ljubljana: MASKA, 6-10

Durkheim, Émile 1897 Suicide New York: The Free Press

Hall, Stuart 1997 “The Spectacle of the Other” in Stuart Hall (ed.) Representation: Cultural Representations and Signifying Practices Sage: London, 223-290

Martins, Herminio 1998 “Technology, Modernity, Politics” in James Good & Irving Velody (eds.) The Politics of Modernity Cambridge: Cambridge University Press

Marx, Karl 1844 Economic and Philosophical Manuscripts Moscow: Progress Publishers

Marx, Karl [1846] 1932 The German Ideology Moscow: Marx-Engels Institute

McLuhan, Marshall [1964] 1994 Understanding Media: The Extensions of Man Cambridge, Mass.: MIT Press

Morus, Iwan Rhys 2002 “Introduction” in I. R. Morus (ed.) Bodies/Machines Oxford: Berg, 1-15

Noakes, Richard 2002 “Instruments to Lay Hold of Spirits: Technologizing the Bodies of Victorian Spiritualism” in I. R. Morus (ed.) Bodies/Machines Oxford: Berg, 125-163

Popper, Karl 1963 Conjectures and Refutations: The Growth of Scientific Knowledge London: Routledge

Popper, Karl [1958] 2001 All Life is Problem Solving London: Routledge

6 http://www.darioneira.com/
7 http://medhum.med.nyu.edu/blog/?p=169
8 http://medhum.med.nyu.edu/blog/?p=169


Visualizing Szindbád, revealing the music of a literary text

Zsófia Ruttkay
Moholy-Nagy University of Art and Design Budapest, Creative Technology Lab

A literary text is a special type of data. Its visualization may be motivated by providing insight into the semantic, structural, or rhythmic and melodic aspects of the literary work, or by generating some aesthetic, abstract “illustration” which is algorithmically linked to the text. While in the first case the data visualization serves scientific purposes, and in the second purely aesthetic ones, in both cases it is a challenge to choose the design parameters (mapping principles and the colours and forms used for visualization) in such a way that the result is visually pleasing and informative.

Nowadays many graphic designers and programmers play around with literary texts, and scholars of literature are slowly starting to discover the potential of digital corpora and text visualization for scientific research.

I demonstrate live my own exploration of visualizing the text of “Szindbád” (Sindbad) by the Hungarian writer Gyula Krúdy. Krúdy’s prose is known for its musical quality; it is often associated with the sound of a cello. It has been suggested that Krúdy owes some of his magic to the melodic effect of the sounds of the text, as well as to the rhythmic patterns emerging from the alternation of short and long syllables in his sentences. He is also known for telling stories in long sentences that express several embedded and associative pieces of thought. Interestingly, these hypotheses are not supported by any publications. This may have to do with the tedious work that would have been required to analyse a sizeable body of text by hand – the only way available until recently. So I was eager to use computational linguistics and data visualization techniques to take a closer look at the hidden music of the text, caused by its repetitions, the structure of its sentences, acoustic characteristics such as occurrences of “bright” and “dark” vowels, and its rhythms. I will show the results of this analysis in easy-to-grasp, often interactive visual forms, and I will compare the image of different contemporary texts (by Krúdy and other authors) for reference.
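As a small illustration of one measurement mentioned above – the balance of “bright” (front) and “dark” (back) vowels in a Hungarian text – a sketch in Python follows. The original visualizations were written in Processing, and this fragment is not taken from them; the sample phrase is a stock Hungarian expression, not a Krúdy quotation.

```python
# Hungarian vowel classes: back ("dark") vs. front ("bright") vowels.
DARK = set("aáoóuú")
BRIGHT = set("eéiíöőüű")

def vowel_profile(text):
    """Return the share of bright vowels among all vowels in the text."""
    letters = text.lower()
    bright = sum(ch in BRIGHT for ch in letters)
    dark = sum(ch in DARK for ch in letters)
    total = bright + dark
    return bright / total if total else 0.0

# Example with a stock Hungarian phrase ("once upon a time"):
sample = "egyszer volt, hol nem volt"
print(f"bright-vowel ratio: {vowel_profile(sample):.2f}")
```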

The last playful visualization – a work of the MOME graphic design student Miklós Ferencz – presents each story as an abstract flower, based on different acoustic aspects of the text.

All the programming was done in Processing. The results may be seen at http://create.mome.hu/ruttkay/krudy.


Society


Invisible Set of Data

Noémi Alexa
Transparency International Hungary

Introduction

Data is beautiful – and this is especially true for corruption. Since corruption is a hidden phenomenon in which, in most cases, both the payer and the receiver of a bribe are interested in keeping the deal secret, gathering corruption-related data is always a challenge. The presentation will focus on the data created by the many ways of measuring corruption and the types of data Transparency International works with worldwide and in Hungary. TI uses very different quantitative and qualitative methods to map corruption and find the best ways to tackle it. Data provides the scientific background for anticorruption work, and the visualization of data is one of the most effective tools to raise awareness about corruption.

Measuring corruption

There are several tools to measure corruption, but none of them can provide an exact figure of how rampant the phenomenon is in a country or in a sector.

Criminal statistics are a starting point, but due to the high latency of corrupt actions, the number of proceedings in corruption cases or criminal charges imposed reflects only a fraction of all corruption-related crimes. In addition, the Criminal Code sanctions only some types of corrupt behaviour, such as bribery or undue influence; other phenomena, like nepotism, will never appear in these statistics.

Corruption Perception Index

Since we cannot measure corruption by counting corruption cases, the academic world uses measurement tools that focus on perception. The best-known set of data is the Corruption Perception Index of Transparency International. This tool is the index of indices: it uses 13 international surveys that rank countries according to factors related to corruption. The scores each country receives in these international indices are projected onto a scale from 0 to 100 and an average is generated. Based on this final score, countries are ranked year by year. Again, this tool does not provide exact data about corruption issues in a country. It is a snapshot of the perception of how corrupt public institutions are in a country, but it does not explain the roots and causes of corruption.
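As a stylized illustration of the aggregation just described – each source’s score projected onto a 0–100 scale, then averaged – consider the sketch below. TI’s published methodology involves further standardization and imputation details that are omitted here, and the numbers are invented.

```python
def rescale(score, lo, hi):
    """Project a source's score from its native range [lo, hi] onto 0-100."""
    return 100 * (score - lo) / (hi - lo)

def cpi_score(source_scores):
    """Average the rescaled scores of the surveys covering one country.

    source_scores: list of (score, source_min, source_max) tuples,
    one per international survey (the CPI draws on 13 of them).
    """
    rescaled = [rescale(s, lo, hi) for s, lo, hi in source_scores]
    return sum(rescaled) / len(rescaled)

# Hypothetical example: three sources with different native scales.
print(cpi_score([(6.2, 0, 10), (71, 0, 100), (3.8, 1, 7)]))  # ~59.9
```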


1. picture: Corruption Perception Index 2012, Map


2. picture: Corruption Perception Index 2012, Countries

3. picture: National Integrity System Pillars


National Integrity System Study

In order to understand the underlying nature of corruption in a country, Transparency International uses a qualitative analysis called the National Integrity System country study. This tool examines 13 pillars of a country’s integrity system: political parties, election management, the legislature, executive power, the public sector, the media, the state audit institution, civil society, the business sector, the judiciary, the ombudsman, anticorruption agencies and law enforcement agencies. The methodology provides a detailed analysis of the legislative background of each pillar as well as of its performance in practice.

The underlying principle behind the methodology is that if these pillars of a country function well then they are able to counter corruption risks and ensure the integrity of the system.

Should any of the pillars be weak or function improperly, the stability of the whole ‘temple’ – i.e. the integrity system of a nation – is in danger. Even though it is a qualitative analysis, Transparency International uses a scoring method so that pillars can be compared to each other. Pillars are assessed according to their capacities, their governance structure and the role they play in the fight against corruption. The Hungarian results of 2011 are shown in the chart.
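A toy sketch of such a scoring scheme might look as follows. The three assessment dimensions follow the text, but the pillar selection, the scores and the equal-weight averaging are illustrative assumptions rather than TI’s actual formula.

```python
from statistics import mean

# Each pillar is assessed along three dimensions (per the text);
# equal weighting and the scores themselves are illustrative.
pillars = {
    "judiciary":     {"capacity": 75, "governance": 50, "role": 50},
    "media":         {"capacity": 50, "governance": 25, "role": 75},
    "civil society": {"capacity": 50, "governance": 50, "role": 50},
}

pillar_scores = {name: mean(dims.values()) for name, dims in pillars.items()}
for name, score in sorted(pillar_scores.items(), key=lambda kv: kv[1]):
    print(f"{name:>14}: {score:5.1f}")  # weakest pillars listed first
```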

The National Integrity System methodology gives an excellent insight into the functioning of a country. But like all corruption measurement tools, it has its limitations, since it does not use concrete corruption cases and concentrates only on ‘elite’ corruption. Petty corruption, which most people experience in their everyday life, does not fall within the scope of this assessment method.

Global Corruption Barometer

To measure how widespread petty corruption is, Transparency International uses the Global Corruption Barometer. This is a population survey on a representative sample of society in which interviewees are asked whether they have paid bribes in the last 12 months, and if so, how much and to which institutions. People are also asked how much they trust different institutions and how corrupt they perceive them to be. Results are comparable across countries worldwide and provide a picture of everyday corruption issues. Latency, though, might distort the data.
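A minimal sketch of how such survey responses can be tabulated – the share of respondents who paid a bribe, broken down by institution – is given below. The records are invented, and the real Barometer involves sampling weights that are omitted here.

```python
from collections import Counter

# Toy survey records (hypothetical): did the respondent pay a bribe in the
# last 12 months, and to which institution?
responses = [
    {"bribed": False, "institution": None},
    {"bribed": True,  "institution": "healthcare"},
    {"bribed": True,  "institution": "police"},
    {"bribed": False, "institution": None},
    {"bribed": True,  "institution": "healthcare"},
]

bribe_rate = sum(r["bribed"] for r in responses) / len(responses)
by_institution = Counter(r["institution"] for r in responses if r["bribed"])

print(f"bribery rate: {bribe_rate:.0%}")   # share of respondents who paid
print(by_institution.most_common())        # most affected institutions
```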

Making corruption visible

Modern technology is a great weapon against corruption, since it makes a great deal of information accessible to everybody. There are several great examples worldwide of how to use information and communication tools in the fight against corruption, from easily understandable infographics and searchable databases of MPs’ asset declarations to websites where people can report corruption cases.

I will introduce some of the tools Transparency International Hungary has used so far to make the issue easily understandable for the general public.

4. picture: Hungarian National Integrity System, 2011

5. picture: Hungarian National Integrity System, 2011

6. picture: Global Corruption Barometer, 2011


Database of criminal proceedings

Investigating corruption takes a long time, but lengthening an investigation is also a way to lose evidence and protect corrupt criminals. It is therefore important to inform the public about the phases of investigations. Transparency International Hungary developed a website where all criminal proceedings related to major corruption cases are listed and the time elapsed is visualized. Since the police do not provide this kind of information, the database is compiled from articles in the press.

Antitrust database

This database was developed to make public how companies behave when it comes to complying with competition regulations. After careful analysis of the decisions of the Competition Authority, Transparency International Hungary enters all relevant information into a searchable database. Companies and public authorities can then easily check whether their competitors, suppliers or partners have been punished for not complying with competition law.

Hypocrisy – database of campaign spending

According to Hungarian law, parties can spend up to HUF 386 million (c. EUR 1.4 million) during an election campaign. It is an open secret that parties spend much more, but because of the law they only report this amount. Transparency International Hungary monitored the campaign spending of the important parties in the 2010 elections. With the help of the media and volunteers, TI Hungary obtained information about different campaign events and advertisements. The cost of each campaign element was estimated with the help of marketing, PR and communication experts. Besides this database, the website also provides information about how the law should be reshaped.
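The estimation step is essentially a sum of expert-priced campaign elements compared against the statutory cap; a toy sketch follows. All the cost figures and categories are invented for illustration.

```python
LEGAL_CAP_HUF = 386_000_000  # statutory campaign spending limit per party

# Hypothetical expert estimates for one party's observed campaign elements.
estimated_costs = {
    "billboards": 410_000_000,
    "TV and radio spots": 220_000_000,
    "campaign events": 95_000_000,
    "print advertisements": 60_000_000,
}

total = sum(estimated_costs.values())
print(f"estimated total: {total:,} HUF")
print(f"exceeds legal cap by: {total - LEGAL_CAP_HUF:,} HUF"
      if total > LEGAL_CAP_HUF else "within legal cap")
```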

7. picture: Database of Criminal Proceedings, Transparency International Hungary

The Network

This is an ongoing project of K-Monitor, the Hungarian Civil Liberties Union and Transparency International Hungary. The aim of the open source project is to disclose and visualize personal and business connections between Hungarian politicians and businesspeople. It contains data on more than 30,000 transactions (parties, value, type and subject of the transaction, date, any connecting transaction), more than 20,000 individuals (business interests, political affiliations and personal relationships) and more than 25,000 organizations (stakeholders and officers of private companies, officers of municipalities, related transactions). Most of the data is scraped from public databases, while CVs and types of relations are manually edited. The database will be launched in 2013.
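The structure described – people, organizations and transactions linked into one queryable web – is essentially a graph. A minimal sketch of such a model, using the networkx library, might look like this; the node names, fields and query are illustrative and do not reflect the project’s actual schema.

```python
import networkx as nx

# A small property graph of the kind the text describes (illustrative data).
G = nx.MultiDiGraph()
G.add_node("J. Kovacs", kind="person", affiliations=["party X"])
G.add_node("Acme Kft.", kind="organization")
G.add_node("City of Y", kind="organization")

# Edges carry relationship or transaction attributes.
G.add_edge("J. Kovacs", "Acme Kft.", relation="officer")
G.add_edge("City of Y", "Acme Kft.", relation="transaction",
           subject="road repair", value=120_000_000, date="2011-05-02")

# Query: which organizations is a given politician connected to,
# and through which transactions?
for org in G.successors("J. Kovacs"):
    deals = [d for _, _, d in G.in_edges(org, data=True)
             if d.get("relation") == "transaction"]
    print(org, deals)
```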

Summary

Corruption is a phenomenon that is hard to see and follow. Making corruption patterns visible is one of the most effective ways to stand up against it. Transparency International has deliberately decided to widen its portfolio of actions and, besides traditional advocacy activities such as negotiating with governments, to mobilize people to raise their voice and say no to corruption. To reach this aim, TI has started to use new information and communication tools that help citizens understand and report corruption. We truly believe that these actions will reach a critical mass sufficient to break the vicious circle of corruption patterns in Hungary.

8. picture: Database of Campaign Spendings, 2010


A Hackathon by Transparency International and Kitchen Budapest

5–6 October 2012
Location: Kitchen Budapest

During the Data is Beautiful event-series we organized a Hackathon – a 24-hour event for programmers and designers who build creative open source solutions around a given theme – as part of the global call by Transparency International and Random Hacks of Kindness (RHoK), in which, in cooperation with several chapters from different regions, we wanted to promote, scale up and integrate innovative and sustainable information and communications technology solutions and visualization methods in the fight against corruption.

Every participating country (Indonesia, Russia, Colombia, Lithuania, Morocco and Hungary) had a local theme related to corruption; in Hungary we worked on the relationship between corruption and young people, based on a questionnaire and a study made by TI Hungary this year. Our aim was to encourage young people to recognize corruption in their everyday life and to support them with online/offline tools to fight against it.

Previously we had a call for people who are:
- experts in fighting corruption
- programmers who are good at open source platforms
- graphic designers or artists who are familiar with visualizing data and information

The Hungarian database (in English) was here: http://www.transparency.hu/uploads/docs/Youth_Integrity_Survey_Hungary_EN_final.345.pdf

The registration site was here: http://www.rhok.org/event/hacks-against-corruption-budapest

We had 4 groups who made the following projects:

Máté Cziner: Cheat or Starve (TI Awarded)
http://www.rhok.org/solutions/cheat-or-starve

Péter Kadlót: Youth Integrity (Prezi.com Award)
http://www.rhok.org/solutions/youth-and-corruption-hackathon-budapest-ti-kibu

Péter Szekeres: Your Life (TI Awarded)
http://www.rhok.org/solutions/ti-hackathon-youth-and-corruption

Emese Szilágyi – Péter Puklus: Corrupted City Guide Tour App, Panic Button, iConfess, The Bribe Cards
http://issuu.com/peterpuklus/docs/hac_bp_2012

Krisztina Szűcs: Data visualization of the GVH database
http://szucskrisztina.hu/corruption.png

Official Partners of the Hackathon:
Transparency International Hungary
Random Hacks of Kindness (RHoK)
Prezi.com
Kreatív Magazine

Media Partner:
HVG Magazine

Hack with us and be part of the fight against corruption!

Photos

Editor: Eszter Bircsák
Graphic design: Eszter Balogh, Réka Harsányi

Photography: Zoltán Csík-Kovács and the authors

ISBN 978-963-08-5452-8

Published by: Kitchen Budapest

The event-series curated by Nina Czeglédy, Eszter Bircsák
Advisor: Nemes Attila
Initiated by Kitchen Budapest

Distribution partner is www.visualizing.org.

Financial background of the program-series provided by Kitchen Budapest and the Ministry of National Resources (NKA)

Supported by:

Partners of the Hackathon:
