
Cartographic Interface

In the document Mobile Screens (pages 150-154)

With a touchscreen, camera, compass, GPS, network connectivity, and the divergent mapping applications being developed for them, smartphones such as Android devices and the iPhone can be considered fundamentally cartographic interfaces. The hybrid interface of the gadget allows for navigation not only within the machine and on the screen, but also within the physical space surrounding the device. It provides an interface for navigating bits, pixels, and spatial coordinates.

A wide range of innovative navigation software is being developed for handheld devices such as the iPhone, enabling new ways of navigating urban space. Interactive tours, augmented reality, social locative media, and mobile navigation contribute to an expanding and transforming field of cartographic screen practices that not only represent space, but also truly make space – operating as performative cartography by generating a hybrid screenspace.

The hybridity of the interface compels us to investigate the complexity of navigation as it takes shape as a prominent cartographic and epistemological model, or a visual regime of navigation in today’s culture of mobility. This navigational model, as I argue, entails a shift in cartography. Originating in the art of making maps, but as such putting forward a regime of understanding and representing space, a new mobile cartography infuses spatial representation with a temporal and procedural dimension: a performative cartography, a dynamic map which emerges and changes during the journey. Moreover, divergent spatial categories of information or data space and physical space are connected in the map as a hybrid screenspace. The physical engagement of the user-navigator with the iPhone in this temporally dynamic and spatially layered process of making maps while reading them entails a collapse of making images and viewing them. This brings forward the co-operation of the device’s (hardware) specifications, the applications’ (software) affordances, and the user’s activity (the interfacing) in processes of connectivity, participation, and mobility.

5. performative cartography 149

The iPhone – a handheld, mobile, and hybrid device, and a console for multiple uses – invites us to interrogate the characteristics of the screen gadget as an interface for mobile use. However, simply asserting that smartphones such as the iPhone are hybrid devices glosses over the complex and layered structure of characteristics and affordances of the interface of the device, as well as the different interactive practices involved in this hybridity. The iPhone raises questions about the specificity of this type of screen gadget as a hybrid object. In this sense, it is just as much a theoretical object as the Nintendo DS, which I alleged as such in Chapter 3. To be specific, because it is a mobile device, questions about the iPhone’s hybridity are intrinsically related to movement, touch, and the process of spatial transformation. This is situated in an entanglement of technologies, applications, and interactive practices that iPhone interfacing entails.

Handling the iPhone takes place within what we have called a mobile screening arrangement, or dispositif. As a hybrid object, the device is embedded within a mobile dispositif that encompasses both the perceptual positioning of the (mobile) user and the physical (interactive) interfacing with the screen. The screening arrangement in motion, taking place within public space and making connections with this space, establishes a mobile sphere: a space that is marked by mobility and connectivity, and constructed within the (mobile) arrangement of user, location, and device.

This mobility in space is intricately bound to the mobility, or flexibility, of the on-screen space itself: the interactive touchscreen that in fact requires physical manipulation for its operation. Considering the use of the iPhone as a machine for navigation, the mobility of the device makes it a visceral interface: the entire body of the user is incorporated in mobility and in making space.

The iPhone has a cartographic interface for the simultaneous navigation of both on- and off-screen space. As a machine, it enables navigation within the machine itself, as well as navigation within physical space with the machine in hand. This makes the screen use of the iPhone distinct from historical screen uses such as televisual or cinematic viewing. The multi-touchscreen and the divergent practices of mobile touchscreening problematize the distinction between making, transmitting, and receiving images. Moreover, characteristic of the mobile screen is its positioning within a mobile sphere – or dispositif – implying an ambulant locatedness and, hence, a flexible site-specificity.

This mobility and physicality, I argue, point toward a performative and embodied notion of interactivity as characteristic of navigation: not only as a spatial decoding of map information, orientation, and mobility, but as a cultural trope structuring our sense of (spatial) presence – as well as the (temporal) present – as hybrid and flexible categories. This establishes a new spatial category, screenspace, which is activated by the simultaneous construction of on- and off-screen spaces when traversing in fluid motions with navigation devices in our hands.

(Verhoeff 2008)

As a device for navigation, the iPhone comprises a layered interface. While these layers are phenomenally intricately connected and hard to separate or isolate, we can conceptually discern three (non-hierarchical) levels, all essential for navigation.

First, it encompasses the level of the internal interfacing of applications: the back-end operating system and software. This includes so-called application programming interfaces (APIs), making communication between applications possible, as well as the communication of the software with the graphical user interface (GUI) that enables the human user of the applications to ‘read’, or understand, and use them.

The Google Maps API is a good example here. The fact that it is openly available makes Google Maps a highly adaptable framework for all kinds of implementations. This is very suitable for mapping applications, because it provides the tools for mash-ups, or web-application hybrids: the integration of data from different sources within, in this case, the mapping environment of Google Maps. This level is the processing of data.
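The mash-up principle described here – merging records from an external data source into a map layer – can be sketched schematically. The following is not actual Google Maps API code; the feed, the URL scheme, and the function names are hypothetical stand-ins used only to illustrate how heterogeneous data is combined into one mapping environment:

```python
# Sketch of the mash-up principle: points from an external data source
# are merged into a map layer as markers. The data and URL scheme are
# hypothetical stand-ins, not a real mapping-service API.

def to_marker_params(points):
    """Turn (label, lat, lon) records into marker query parameters."""
    return ["marker=%s@%.4f,%.4f" % (label, lat, lon)
            for label, lat, lon in points]

def mashup_url(base_url, points):
    """Combine a base map URL with markers from an external source."""
    return base_url + "?" + "&".join(to_marker_params(points))

# External data (e.g. from a hypothetical geotagging feed):
feed = [("cafe", 52.0907, 5.1214), ("museum", 52.0852, 5.1290)]
print(mashup_url("https://maps.example/embed", feed))
```

The point of the sketch is that the map service only supplies the base layer; the semantics of what is plotted come from elsewhere, which is what makes the framework so adaptable.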

The second layer of the interface is the spatial positioning and connectivity of the apparatus in relation to physical as well as data space: the interface of internal instruments of the iPhone that connect with external space. This entails the digital camera, GPS, Wi-Fi/3G connectivity, compass, and motion sensor or accelerometer – calculating position, orientation, and velocity – and the screen. This level of the interface communicates between the hardware of the device and ‘the world’.

It includes what is called an inertial navigation system, defined by Oliver J. Woodman as follows:

Inertial navigation is a self-contained navigation technique in which measurements provided by accelerometers and gyroscopes are used to track the position and orientation of an object relative to a known starting point, orientation and velocity. (2007: 4)

This inertial positioning system is combined with the absolute positioning system of GPS, which is based on triangulation of geographical coordinates (and currently only works outdoors). This ability to calculate position and orientation is necessary for, e.g., gravimetric (rather than marker-based) augmented reality applications as interfaces for location-based data, or ambient intelligence. Moreover, Internet connectivity also positions the device via wireless connection. The second layer of the interface, then, concerns connecting and positioning the interface, whether based on inertial, absolute, camera-based, or wireless technologies.8
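Woodman's definition can be illustrated with a minimal dead-reckoning sketch: acceleration is integrated once to update velocity and again to update position relative to a known start. This is a deliberate simplification – a real inertial navigation system also integrates gyroscope data to track orientation and must correct for drift – and the function and variable names are illustrative, not drawn from any actual system:

```python
# Minimal dead-reckoning sketch: integrate accelerometer samples to
# track position relative to a known starting point and velocity.
# (Illustrative only: a real inertial navigation system also uses
# gyroscope readings for orientation and must correct for sensor drift.)

def dead_reckon(start_pos, start_vel, accel_samples, dt):
    """Integrate acceleration twice to estimate position over time."""
    pos, vel = start_pos, start_vel
    for a in accel_samples:
        vel = vel + a * dt   # v := v + a*dt
        pos = pos + vel * dt  # x := x + v*dt
    return pos, vel

# One second of constant 1 m/s^2 acceleration, sampled at 10 Hz,
# starting from rest at the origin:
pos, vel = dead_reckon(0.0, 0.0, [1.0] * 10, 0.1)
print(round(vel, 2), round(pos, 2))  # prints 1.0 0.55
```

Because every step builds on the previous estimate, small measurement errors accumulate over time – which is precisely why inertial positioning is combined with the absolute positioning of GPS.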

This positioning, then, is communicated to the user, who might, for example, see the on-screen image tilt, or find a representation of position and movement signified by an arrow-shaped icon in the on-screen maps, and can then read this orientation and subsequently act or move. This takes place on a third level of the interface, that of user interaction, enabling the communication between the user and the internal operation of the device (first level) as it is connected to the space surrounding it (second level). The first level of the applications interface also includes the software operation of the graphical user interface (GUI). However, the way in which this data is visualized and made understandable – its output – operates at this third level of user interaction. This level contains user feedback input options such as the touchscreen, buttons, and the ‘shake control’ (making use of the inertial system), but also the representational conventions of the GUI. In the case of navigation, this means the way that spatial information is represented on the screen and interacted with by the user.

Significant for the touchscreen of the iPhone is that at the level of user interaction it is both an instrument for input and for output. This is the level of ‘access’ (to data) and of the ‘experience’ of it. Where the action takes place is, literally, on the screen. Moreover, it is a multi-touchscreen in a technological and practical sense: multi-touch technology allows for multifarious ways of touching, such as swiping, virtual scrolling or swirling, and two-fingered pinch movements for enlarging or shrinking. The dynamic horizontal or vertical scrolling of screen content establishes a connection between the image on the screen and its off-screen spaces: the frame is always a detail of a larger whole. The map is always larger than the part that is displayed on screen. Objects can be moved outside and brought into the frame by the swipe of a fingertip. And tapping the screen to give commands makes additional buttons, keys, sticks, or a mouse controller redundant. For example, pressure can make the screen image zoom in, simulating a virtual camera lens.

Seen within a layered constellation of the interface, the iPhone requires a triple perspective: it is a machine that processes and combines data, it is a sensor that connects and positions data, and it is a medium that produces perception. Within this constellation, its products, results, or yields, received as visuals on screen by the user, cannot be approached as fixed entities, or ‘texts’, both in a temporal sense and in terms of authorship – or better, of agency. While walking and using the iPhone for an interactive tour, for example, the different layers of the interface operate together: location-based information is processed and communicated to the user via the screen. This complex layering of the interfacing process is not experienced as such because it is filtered by the user interaction interface. However, the integration of these processes (data processing, spatial positioning and connectivity, and communication with the user) is the condition of possibility for creative navigation: an integration of the mechanisms and affordances that underlie our actions, but that are not experienced as discrete layers. As such, the hybridity of the iPhone interface provides the conditions for creative navigation of screenspace as a performative cartography.

These creative practices that make use of the affordances of the (layered) interface of the iPhone as a navigation device involve different interactive engagements with an array of cartographic applications. We can discern three different ways in which the broad concept of interactivity becomes specific for navigation, as the point where interface and agency meet and where performativity is actualized: navigation understood as a constructive form of interactivity, as a participatory form of interactivity, and as yielding a haptic engagement with screenspace.

Let me briefly explore how this performative cartography constructs an urban space in which pervasive presence, embedded pasts, and evolving futures intersect, according to my triple interpretation of the index. I take locative media, or geomedia practices, and augmented reality navigation as popular and (at the moment of writing) innovative uses of mobile screens and sketch the way they revamp some cartographic principles. I first address the three aspects of tagging, plotting, and stitching in the following section, before elaborating on augmented reality navigation.
