6.3 The First Periscope Experience Prototype

6.3.2 Implementing the First Experience Prototype

Figure 6.4: Second storyboard visualizing the enhanced 'Periscope' concept. A discovery can be 'locked' before passing the prototype on (frame 3), and the user can zoom in and out of the visualized map (frame 5). Drawn by Marc Landau.

its private character as it belongs to one of the passengers, which contradicts the moment of sharing the experience with others when passing on the Periscope. The design decision of realizing a dedicated device is an example of how the implementation of a concept must pay attention to all aspects of the story in order to achieve a consistent experience.

Figure 6.5: The first Periscope experience prototype and a display providing additional information about discovered points of interest are placed in the dashboard in front of the co-driver.

driver’s point of view. Points of interest (POI) such as sights or lakes are marked on the map.

When the user turns to the left or right while holding the prototype, the visible part of the map changes accordingly. It is possible to zoom in and out of the map by rotating a part of the housing similar to camera zoom lenses. When zooming towards a POI, the color of the marker changes to indicate that this sight is selected. Now, the user can click a button to

'freeze' the image he is currently seeing. Thus, when handing over the Periscope to share a discovery, it will not get lost due to movements of the prototype. When the button is pressed again, the visible part of the map adjusts to the direction the Periscope is currently pointing in.

Whenever a POI is selected, the co-driver can press a button on his display, and further information such as opening hours or historic details appears on the screen. Additionally, a sound representing the POI is played. When the group decides to visit a POI, the co-driver can press a button to send the new destination to the navigation system.

Hardware

Figure 6.7 illustrates the setup of the prototype. A piece of drainpipe forms the housing of the Periscope, defining its telescope-like structure. The pipe is divided into two parts by

5 https://www.google.com/earth/ (accessed 01/02/2015)

Figure 6.6: The representation of the environment visible when looking through the Periscope prototype. Markers represent points of interest. The graphic was made by Maximilian Hackenschmied based on material obtained from Google Earth⁵.

a chassis made out of acrylic glass holding a smartphone, the Samsung Galaxy S running Android. We decided to use a smartphone because it provides an accelerometer to track the movements of the prototype, a display to show the map, storage to save the map image, as well as a processor to determine and update the screen contents. The front part of the Periscope carries a button that can be clicked to freeze the part of the map that is currently visible. Besides this button, this part of the prototype does not hold any other electronic parts that would otherwise block the view of the smartphone's screen.

The back part of the Periscope is a rotatable wheel implementing the zoom functionality.

The wheel is attached to a rotary potentiometer measuring if and how far it is turned. An Arduino Micro, a smaller version of the Arduino microcontroller platform, reads the inputs of the button and the potentiometer. The Arduino is powered via a USB cable connected to a computer.
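The wire format in which the Arduino reports the button and potentiometer readings to the computer is not documented in the text. As a minimal sketch, assuming the Arduino writes one line per reading in a hypothetical `BTN:<0|1>;POT:<0-1023>` format, the server side could parse it like this:

```java
// Hypothetical parser for serial lines sent by the Arduino Micro.
// The assumed line format ("BTN:1;POT:512") is illustrative only;
// the thesis does not specify the actual protocol.
public class SerialReading {
    public final boolean buttonPressed;
    public final int potentiometerValue; // raw 10-bit ADC value, 0..1023

    public SerialReading(boolean buttonPressed, int potentiometerValue) {
        this.buttonPressed = buttonPressed;
        this.potentiometerValue = potentiometerValue;
    }

    /** Parses one line of the assumed "BTN:<0|1>;POT:<value>" format. */
    public static SerialReading parse(String line) {
        String[] fields = line.trim().split(";");
        boolean btn = fields[0].endsWith("1");
        int pot = Integer.parseInt(fields[1].substring("POT:".length()));
        return new SerialReading(btn, pot);
    }
}
```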

A tablet PC displays additional information about POIs and plays audio via its internal speakers. The tablet as well as the Periscope prototype are located in the dashboard in front of the co-driver (see figure 6.5).

Software

The software architecture of the first Periscope prototype consists of four parts, namely the Arduino, the smartphone, a tablet PC and a desktop computer. The computer runs a Java server and thus constitutes the central element of the client-server architecture. The Arduino, the first client, reads the input values generated by the button and the rotary potentiometer.

These are transferred via a USB connection to the computer. The server calculates the current zoom level and determines the freeze-state of the map based on the sensor values.
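The server-side computation can be sketched as follows. The value ranges (a 10-bit potentiometer reading, five discrete zoom levels) and the rising-edge toggle for the freeze-state are assumptions for illustration; the thesis does not specify them.

```java
// Sketch of the server logic: map the raw potentiometer reading to a
// discrete zoom level, and toggle the freeze-state on a rising edge of
// the button. Ranges (0..1023, zoom 1..5) are assumed, not from the text.
public class PeriscopeState {
    private static final int POT_MAX = 1023; // typical 10-bit Arduino ADC
    private static final int MIN_ZOOM = 1;
    private static final int MAX_ZOOM = 5;

    private boolean frozen = false;
    private boolean lastButton = false;

    /** Maps a raw potentiometer value to a discrete zoom level. */
    public int zoomLevel(int potValue) {
        int levels = MAX_ZOOM - MIN_ZOOM + 1;
        int level = MIN_ZOOM + (potValue * levels) / (POT_MAX + 1);
        return Math.min(Math.max(level, MIN_ZOOM), MAX_ZOOM);
    }

    /** Toggles and returns the freeze-state on a rising button edge. */
    public boolean updateFreeze(boolean buttonPressed) {
        if (buttonPressed && !lastButton) {
            frozen = !frozen;
        }
        lastButton = buttonPressed;
        return frozen;
    }
}
```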

A WLAN connection to the computer embeds the smartphone as the second client into the network. The smartphone stores a high-resolution image of a map (see figure 6.6). An Android application constantly calculates the visible part of the map based on sensor values of the integrated accelerometer and displays the resulting image on the screen. Additionally, the app receives zoom level as well as freeze-state from the computer via TCP and updates the screen accordingly.
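How the app derives the visible section from the sensor values is not spelled out in the text. A minimal sketch, assuming the static panorama covers a full 360° horizontally and a heading angle has already been derived from the motion sensors, could compute the window offset like this (image and viewport widths are placeholder values):

```java
// Sketch of mapping a heading angle to the visible window of the
// panorama image. The 360-degree coverage, image width, and viewport
// width are illustrative assumptions, not values from the thesis.
public class MapViewport {
    private final int imageWidth;    // width of the panorama in pixels
    private final int viewportWidth; // visible window in pixels

    public MapViewport(int imageWidth, int viewportWidth) {
        this.imageWidth = imageWidth;
        this.viewportWidth = viewportWidth;
    }

    /**
     * Returns the left pixel offset of the visible window for a heading
     * in degrees (0 = straight ahead), wrapping around the panorama.
     */
    public int leftOffset(double headingDegrees) {
        double normalized = ((headingDegrees % 360) + 360) % 360;
        int center = (int) Math.round(normalized / 360.0 * imageWidth);
        int left = center - viewportWidth / 2;
        return ((left % imageWidth) + imageWidth) % imageWidth;
    }
}
```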

Figure 6.7: Blueprint of the first Periscope experience prototype. A smartphone tracks movements and displays the map of the environment. An Arduino determines whether the button is clicked to freeze the image or the wheel is rotated to zoom in and out of the map. Based on [26].

The third client is the tablet PC that is embedded using a WLAN connection. The tablet runs an Android application displaying additional information on the POI that is currently selected and plays corresponding audio feedback. By clicking a button on the screen, the address of the POI can conceptually be forwarded to the navigation system as a new destination.
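The messages exchanged between the clients are not described in detail. As a minimal sketch, assuming a simple line-based TCP message that announces the currently selected POI to the tablet, encoding and decoding could look like this (the `SELECT <id>` format is a hypothetical example):

```java
// Hypothetical line-based TCP message announcing the selected POI to
// the tablet client. The "SELECT <id>" format is an assumption for
// illustration, not the protocol actually used in the prototype.
public class PoiMessage {
    /** Builds a selection message for the given POI identifier. */
    public static String encodeSelection(String poiId) {
        return "SELECT " + poiId + "\n";
    }

    /** Returns the POI id, or null if the line is not a selection message. */
    public static String decodeSelection(String line) {
        String trimmed = line.trim();
        if (!trimmed.startsWith("SELECT ")) {
            return null;
        }
        return trimmed.substring("SELECT ".length());
    }
}
```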

The map of the environment is represented by a static panorama image (see figure 6.6). It does not show the actual environment of the lab where the evaluations will take place. The visible section does not change or adapt according to the virtual driving situation described in the experience story, but only according to the movements of the person holding the prototype.