
Temporal-Spatial Visualization of Interaction Data for Visual Debugging

Roman Rädle

Human-Computer Interaction Group, University of Konstanz

Werner A. König

Human-Computer Interaction Group, University of Konstanz

Harald Reiterer

Human-Computer Interaction Group, University of Konstanz

1 Introduction

The design of novel input devices and interaction techniques is a highly demanding task. Interaction designers need programming environments that provide high flexibility and complex functionality. Above all, such tools have to be easy to learn and should support the analysis and interpretation of interaction data even for interaction designers with little or no programming experience. Squidy is a Zoomable Design Environment which eases the design, integration and combination of novel input devices as well as appropriate interaction techniques. By providing a simple yet powerful visual programming language based on the pipe-and-filter metaphor, combined with details-on-demand techniques, Squidy facilitates rapid prototyping and a fast iteration cycle [König et al. 2009]. To this end, Squidy offers a versatile collection of ready-to-use nodes such as devices, filters and interaction techniques, which can be visually arranged and connected and thereby define the desired interaction technique. This can be, for example, optical tracking of a laser pointer's spot to interact with a distant display wall, as can be seen in [Figure 1]. Here, the 2-dimensional position data passes through several filters and finally controls a mouse cursor and a TUIO-based application.

However, such a visual design of an interaction technique with Squidy currently requires a deep understanding of the data flow and the semantics of the designed pipeline. Current debugging techniques are based on textual output, mostly combined with table-based representations that allow hierarchical navigation. However, they neither provide information about the spatial location of the data nor follow a chronological sequence.

We introduce an approach to visually analyze and interpret the data flow inside the design environment.
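To make the pipe-and-filter metaphor more concrete, the following sketch illustrates how a pipeline like the one in [Figure 1] could be expressed in code. Squidy itself is Java-based and is programmed visually rather than textually; the class names below (Node, ChangeRecognition, MouseEmulation) are hypothetical and only outline the dataflow structure, not Squidy's actual API.

```python
# Hypothetical sketch of the pipe-and-filter metaphor; the class names do not
# reflect Squidy's actual (Java-based) API.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Position2D:
    """A single 2D position sample, e.g. from optical laser-pointer tracking."""
    x: float
    y: float
    timestamp: float

class Node:
    """A processing node: receives data, optionally transforms it, passes it on."""
    def __init__(self) -> None:
        self.outputs: List["Node"] = []

    def connect(self, other: "Node") -> "Node":
        """Create a pipe from this node to another; returns the target for chaining."""
        self.outputs.append(other)
        return other

    def process(self, data: Position2D) -> Optional[Position2D]:
        return data  # identity by default; filter nodes override this

    def publish(self, data: Position2D) -> None:
        result = self.process(data)
        if result is not None:          # None means the node consumed the sample
            for node in self.outputs:
                node.publish(result)

class ChangeRecognition(Node):
    """Forwards a sample only if it moved more than a threshold (cf. Figure 1)."""
    def __init__(self, threshold: float = 1.0) -> None:
        super().__init__()
        self.threshold = threshold
        self._last: Optional[Position2D] = None

    def process(self, data: Position2D) -> Optional[Position2D]:
        if self._last is not None and \
           abs(data.x - self._last.x) < self.threshold and \
           abs(data.y - self._last.y) < self.threshold:
            return None
        self._last = data
        return data

class MouseEmulation(Node):
    """Sink node that would move the system cursor; here it only prints."""
    def process(self, data: Position2D) -> Optional[Position2D]:
        print(f"move cursor to ({data.x:.1f}, {data.y:.1f})")
        return data

# Wiring a simplified version of the pipeline in Figure 1:
# laser-pointer input -> change recognition -> mouse emulation
source = Node()
source.connect(ChangeRecognition(threshold=2.0)).connect(MouseEmulation())
source.publish(Position2D(x=120.0, y=80.0, timestamp=0.0))
source.publish(Position2D(x=120.5, y=80.3, timestamp=0.02))  # dropped: too little change
```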

2 Our Approach

To recognize interaction patterns such as gestures or multimodal input [Bolt 1980], interaction designers need a concise overview of the chronological flow of data within a predefined time span. Using the concept of semantic zooming, the user is able to navigate to a visual layer that provides a top-down view of the flow of the interaction data [Figure 2 (b)]. This layer is located directly at each pipe, thus indicating a connection between two nodes [Figure 2 (a)]. The duration of a specific interaction depends on the length of its pattern.
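One plausible way to obtain such a per-pipe, chronological view is to let each pipe record the data passing through it within a sliding time window, grouped by data type so that, for example, 2D positions and recognized gestures can later be plotted differently. The sketch below, including the RecordingPipe class and its methods, is an assumption made for illustration and not Squidy's actual mechanism.

```python
# Hypothetical sketch: a pipe that buffers timestamped data for a time-based
# dataflow view (cf. Figure 2 (b)); not Squidy's actual implementation.
import time
from collections import defaultdict, deque
from typing import Any, Deque, Dict, Tuple

class RecordingPipe:
    def __init__(self, time_span: float = 5.0) -> None:
        self.time_span = time_span  # seconds of history shown in the view
        # one buffer per data type, e.g. "Position2D" or "Gesture"
        self._buffers: Dict[str, Deque[Tuple[float, Any]]] = defaultdict(deque)

    def forward(self, data: Any) -> Any:
        """Called for every sample travelling through the pipe."""
        now = time.monotonic()
        buffer = self._buffers[type(data).__name__]
        buffer.append((now, data))
        # discard everything older than the visible time span
        while buffer and now - buffer[0][0] > self.time_span:
            buffer.popleft()
        return data

    def snapshot(self) -> Dict[str, list]:
        """Data the visualization layer would plot, grouped by type."""
        return {name: list(buf) for name, buf in self._buffers.items()}
```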

The time-based view can therefore be manipulated interactively by the user and provides insight into the currently flowing data. The types of interaction data vary in the dimensions of their atomic values, and so their visual plots vary in representation as well. For instance, the representation of a 2D position differs from the representation of a recognized gesture [Figure 2 (b)]. Users are able to inspect frequently and concurrently occurring data at a glance, according to its spatial and chronological location. Thus, interaction designers benefit from the insight into the interaction data flow and are able to directly apply changes to it. These changes instantly affect the behavior of the interaction design, making it possible to gradually refine and test the configuration at run-time (e.g. changing the noise level of a Kalman filter to compensate for users' natural hand tremor) and allowing the designer to achieve more natural and reliable interaction techniques.
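As a concrete example of such run-time refinement, the following sketch shows a textbook one-dimensional Kalman filter (constant-position model, one instance per axis) whose process-noise level can be changed while data is flowing: a lower noise level smooths hand tremor more aggressively, a higher one follows the raw samples more closely. The AxisKalman class is hypothetical and does not represent the Kalman filter node shipped with Squidy.

```python
# Hedged sketch: a 1D Kalman filter (constant-position model) per axis, with a
# process-noise parameter adjustable at run-time; generic textbook formulation,
# not Squidy's actual Kalman filter node.
class AxisKalman:
    def __init__(self, process_noise: float = 1e-3, measurement_noise: float = 1e-1):
        self.q = process_noise       # how much the true position is expected to move
        self.r = measurement_noise   # how noisy the sensor is expected to be
        self.x = 0.0                 # current state estimate
        self.p = 1.0                 # current estimate uncertainty
        self._initialized = False

    def set_noise_level(self, process_noise: float) -> None:
        """Run-time tuning: lower values smooth tremor more, higher values lag less."""
        self.q = process_noise

    def update(self, measurement: float) -> float:
        if not self._initialized:
            self.x = measurement
            self._initialized = True
            return self.x
        # predict: position assumed constant, uncertainty grows by the process noise
        self.p += self.q
        # correct: blend prediction and measurement using the Kalman gain
        k = self.p / (self.p + self.r)
        self.x += k * (measurement - self.x)
        self.p *= (1.0 - k)
        return self.x

# Usage: one filter per axis; the noise level can be changed while data flows.
kx, ky = AxisKalman(), AxisKalman()
smooth = (kx.update(120.3), ky.update(80.7))
kx.set_noise_level(1e-2)   # e.g. the designer relaxes smoothing at run-time
```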

e-mail: Roman.Raedle@uni-konstanz.de

e-mail: Werner.Koenig@uni-konstanz.de

e-mail: Harald.Reiterer@uni-konstanz.de

Figure 1: View of a pipeline in the Squidy Design Environment. The pipeline receives position, button and inertial data from a laser pointer, applies a Kalman filter, a filter for change recognition and a filter for selection improvement, and finally emulates a standard mouse to interact with conventional WIMP applications. The data is alternatively sent via TUIO to listening applications.

Figure 2: Dataflow visualization for visual debugging within the Squidy Zoomable Design Environment, based on the concept of semantic zooming. (a) Zoomed-out representation of the dataflow visualization. (b) Chronological flow of 2-dimensional positions (red and blue) and a gesture (green) that has been recognized by a gesture recognizer filter.

In our future research we will focus on the integration of Magic Lenses [Bier et al. 1993] to allow manipulation of the interaction data itself (e.g. the ability to annotate data on the fly) and to provide deeper insight into users' interaction.

References

BIER, E. A., STONE, M. C., PIER, K., BUXTON, W., AND DEROSE, T. D. 1993. Toolglass and magic lenses: The see-through interface. In SIGGRAPH '93: Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, ACM, New York, NY, USA, 73–80.

BOLT, R. A. 1980. "Put-that-there": Voice and gesture at the graphics interface. In SIGGRAPH '80: Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, ACM, New York, NY, USA.

KÖNIG, W. A., RÄDLE, R., AND REITERER, H. 2009. Squidy: A zoomable design environment for natural user interfaces. In CHI 2009 Extended Abstracts on Human Factors in Computing Systems, Work-in-Progress Session, ACM Press.

First publ. in: Computer Graphics Forum 28 (2009), 3 (EuroVis 09)

Konstanzer Online-Publikations-System (KOPS) URN: http://nbn-resolving.de/urn:nbn:de:bsz:352-opus-106759

URL: http://kops.ub.uni-konstanz.de/volltexte/2010/10675/
