Lecture Notes in Computer Science 5136

Commenced Publication in 1973

Founding and Former Series Editors: Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board

David Hutchison, Lancaster University, UK
Takeo Kanade, Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler, University of Surrey, Guildford, UK
Jon M. Kleinberg, Cornell University, Ithaca, NY, USA
Alfred Kobsa, University of California, Irvine, CA, USA
Friedemann Mattern, ETH Zurich, Switzerland
John C. Mitchell, Stanford University, CA, USA
Moni Naor, Weizmann Institute of Science, Rehovot, Israel
Oscar Nierstrasz, University of Bern, Switzerland
C. Pandu Rangan, Indian Institute of Technology, Madras, India
Bernhard Steffen, University of Dortmund, Germany
Madhu Sudan, Massachusetts Institute of Technology, MA, USA
Demetri Terzopoulos, University of California, Los Angeles, CA, USA
Doug Tygar, University of California, Berkeley, CA, USA
Gerhard Weikum, Max-Planck Institute of Computer Science, Saarbruecken, Germany

T.C. Nicholas Graham and Philippe Palanque (Eds.)

Interactive Systems Design, Specification, and Verification

15th International Workshop, DSV-IS 2008
Kingston, Canada, July 16-18, 2008
Proceedings

Volume Editors

T.C. Nicholas Graham
School of Computing, Queen's University
Kingston, Ontario, Canada
E-mail: [email protected]

Philippe Palanque
IRIT, University Paul Sabatier (Toulouse 3)
Toulouse, France
E-mail: [email protected]

Library of Congress Control Number: 2008930133

CR Subject Classification (1998): H.5.2, H.5, I.3, D.2, F.3

LNCS Sublibrary: SL 2 – Programming and Software Engineering

ISSN 0302-9743
ISBN-10 3-540-70568-6 Springer Berlin Heidelberg New York
ISBN-13 978-3-540-70568-0 Springer Berlin Heidelberg New York

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965,
in its current version, and permission for use must always be obtained from Springer. Violations are liable to prosecution under the German Copyright Law.

Springer is a part of Springer Science+Business Media
springer.com

© Springer-Verlag Berlin Heidelberg 2008
Printed in Germany

Typesetting: Camera-ready by author, data conversion by Scientific Publishing Services, Chennai, India
Printed on acid-free paper
SPIN: 12436422 06/3180 5 4 3 2 1 0

Preface

The modern world has made available a wealth of new possibilities for interacting with computers, through advanced Web applications, while on the go with handheld smart telephones, or using electronic tabletops or wall-sized displays. Developers of modern interactive systems face great problems: how to design applications that will work well with newly available technologies, and how to efficiently and correctly implement such designs.

Design, Specification and Verification of Interactive Systems 2008 was the 15th in a series of annual workshops devoted to helping designers and implementers of interactive systems unleash the power of modern interaction devices and techniques. DSV-IS 2008 was held at Queen's University in Kingston, Canada, during July 16–18, 2008. This book collects the best papers submitted to the workshop. There were 17 full papers, 10 late-breaking and experience report papers, and two demonstrations. Keynote presentations were provided by Judy Brown of Carleton University and Randy Ellis of Queen's University.

The first day of the workshop addressed the problems of user interface evaluation and specification, with particular emphasis on the use of task models to provide high-level approaches for capturing the intended functionality of a user interface. Day two continued this theme, examining techniques for modeling user interfaces, particularly for mobile and ubiquitous applications. Presenters also discussed advanced implementation techniques for interactive systems.
Finally, day three considered how to architect interactive systems, and returned to the themes of evaluation and specification.

The workshop was hosted by IFIP Working Group 2.7/13.4 on User Interface Engineering. We thank the 30 members of our international Program Committee for their hard work in the paper selection process. We also gratefully acknowledge Precision Conference for their generous donation of the PCS reviewing system.

We hope that you enjoy this record of the DSV-IS 2008 workshop, and find it fruitful for your work and research.

July 2008

T.C. Nicholas Graham
Philippe Palanque

Organization

Conference Chairs

T.C. Nicholas Graham, Queen's University, Canada
Philippe Palanque, IHCS-IRIT, Université Paul Sabatier, France

Program Committee

Simone Diniz Junqueira Barbosa, PUC-Rio, Brazil
Rémi Bastide, IRIT - C.U. Jean-François Champollion, France
Regina Bernhaupt, ICT&S, University of Salzburg, Austria
Ann Blandford, UCL, UK
Judith Brown, Carleton University, Canada
Gaëlle Calvary, University of Grenoble, France
José Creissac Campos, University of Minho, Portugal
Stéphane Chatty, ENAC, France
Prasun Dewan, University of North Carolina, USA
Anke Dittmar, University of Rostock, Germany
Alan Dix, Lancaster University, UK
Gavin Doherty, Trinity College Dublin, Ireland
Peter Forbrig, University of Rostock, Germany
Philip Gray, University of Glasgow, UK
Morten Borup Harning, Priway, Denmark
Michael Harrison, University of Newcastle, UK
Chris Johnson, University of Glasgow, UK
Joaquim A. Jorge, Technical University of Lisbon, Portugal
Kris Luyten, Expertise Centre for Digital Media, Hasselt University, Belgium
Mieke Massink, CNR, Pisa, Italy
Francisco Montero, UCLM, Spain
Laurence Nigay, University of Grenoble, France
Nuno Nunes, University of Madeira, Portugal
Fabio Paternò, ISTI-CNR, Pisa, Italy
Greg Phillips, Royal Military College, Canada
Kevin Schneider, University of Saskatchewan, Canada
Harold Thimbleby, University of Swansea, Wales
Claus Unger,
University of Hagen, Germany
Jean Vanderdonckt, Université Catholique de Louvain, Belgium
Marco Winckler, IHCS-IRIT, Université Paul Sabatier, France

Table of Contents

EMU in the Car: Evaluating Multimodal Usability of a Satellite Navigation System ..... 1
  Ann Blandford, Paul Curzon, Joanne Hyde, and George Papatzanis

Comparing Mixed Interactive Systems for Navigating 3D Environments in Museums ..... 15
  Emmanuel Dubois, Cédric Bach, and Philippe Truillet

An Attentive Groupware Device to Mitigate Information Overload ..... 29
  Antonio Ferreira and Pedro Antunes

Multi-fidelity User Interface Specifications ..... 43
  Thomas Memmel, Jean Vanderdonckt, and Harald Reiterer

HOPS: A Prototypical Specification Tool for Interactive Systems ..... 58
  Anke Dittmar, Toralf Hübner, and Peter Forbrig

Systematic Analysis of Control Panel Interfaces Using Formal Tools ..... 72
  J. Creissac Campos and M.D. Harrison

Investigating System Navigation Ergonomics through Model Verification ..... 86
  Alexandre Scaico, Maria de F.Q. Vieira, Markson R.F. de Sousa, and Charles Santoni

Tool Support for Representing Task Models, Dialog Models and User-Interface Specifications ..... 92
  D. Reichart, A. Dittmar, P. Forbrig, and M. Wurdel

Towards a Library of Workflow User Interface Patterns ..... 96
  Josefina Guerrero García, Jean Vanderdonckt, Juan Manuel González Calleros, and Marco Winckler

Specification and Verification of Multi-agent Systems Interaction Protocols Using a Combination of AUML and Event B ..... 102
  Leila Jemni Ben Ayed and Fatma Siala

Pattern Languages as Tool for Discount Usability Engineering ..... 108
  Elbert-Jan Hennipman, Evert-Jan Oppelaar, and Gerrit van der Veer

Cascading Dialog Modeling with UsiXML .....
121
  Marco Winckler, Jean Vanderdonckt, Adrian Stanciulescu, and Francisco Trindade

Designing Graphical Elements for Cognitively Demanding Activities: An Account on Fine-Tuning for Colors ..... 136
  Gilles Tabart, Stéphane Conversy, Jean-Luc Vinot, and Sylvie Athènes

Lightweight Coding of Structurally Varying Dialogs ..... 149
  Michael Dunlavey

ReWiRe: Designing Reactive Systems for Pervasive Environments ..... 155
  Geert Vanderhulst, Kris Luyten, and Karin Coninx

Toward Multi-disciplinary Model-Based (Re)Design of Sustainable User Interfaces ..... 161
  Jan Van den Bergh, Mieke Haesen, Kris Luyten, Sofie Notelaers, and Karin Coninx

A Model-Based Approach to Supporting Configuration in Ubiquitous Systems ..... 167
  Tony McBryan and Phil Gray

Exploiting Web Services and Model-Based User Interfaces for Multi-device Access to Home Applications ..... 181
  Giulio Mori, Fabio Paternò, and Lucio Davide Spano

Resources for Situated Actions ..... 194
  Gavin Doherty, Jose Campos, and Michael Harrison

An Architecture and a Formal Description Technique for the Design and Implementation of Reconfigurable User Interfaces ..... 208
  David Navarre, Philippe Palanque, Jean-François Ladry, and Sandra Basnyat

COMET(s), a Software Architecture Style and an Interactors Toolkit for Plastic User Interfaces ..... 225
  Alexandre Demeure, Gaëlle Calvary, and Karin Coninx

Executable Models for Human-Computer Interaction ..... 238
  Marco Blumendorf, Grzegorz Lehmann, Sebastian Feuerstack, and Sahin Albayrak

A Middleware for Seamless Use of Multiple Displays ..... 252
  Satoshi Sakurai, Yuichi Itoh, Yoshifumi Kitamura, Miguel A.
Nacenta, Tokuo Yamaguchi, Sriram Subramanian, and Fumio Kishino

Graphic Rendering Considered as a Compilation Chain ..... 267
  Benjamin Tissoires and Stéphane Conversy

Towards Specifying Multimodal Collaborative User Interfaces: A Comparison of Collaboration Notations ..... 281
  Frédéric Jourde, Yann Laurillau, Alberto Moran, and Laurence Nigay

Towards Characterizing Visualizations ..... 287
  Christophe Hurter and Stéphane Conversy

Towards Usability Evaluation for Smart Appliance Ensembles ..... 294
  Gregor Buchholz and Stefan Propp

Task Model Refinement with Meta Operators ..... 300
  Maik Wurdel, Daniel Sinnig, and Peter Forbrig

Utilizing Dynamic Executable Models for User Interface Development ..... 306
  Grzegorz Lehmann, Marco Blumendorf, Sebastian Feuerstack, and Sahin Albayrak

Author Index ..... 311

EMU in the Car: Evaluating Multimodal Usability of a Satellite Navigation System

Ann Blandford¹, Paul Curzon², Joanne Hyde³, and George Papatzanis²

¹ UCL Interaction Centre, University College London, Remax House, 31-32 Alfred Place, London WC1E 7DP, U.K.
[email protected]
http://www.uclic.ucl.ac.uk/annb/
² Queen Mary, University of London, U.K.
³ formerly at Middlesex University, U.K.

Abstract. The design and evaluation of multimodal systems has traditionally been a craft skill. There are some well-established heuristics, guidelines and frameworks for assessing multimodal interactions, but no established methodologies that focus on the design of the interaction between user and system in context. In this paper, we present EMU, a systematic evaluation methodology for reasoning about the usability of an interactive system in terms of the modalities of interaction. We illustrate its application using an example of in-car navigation.
EMU fills a niche in the repertoire of analytical evaluation approaches by focusing on the quality of interaction in terms of the modalities of interaction, how modalities are integrated, and where there may be interaction breakdowns due to modality clashes, synchronisation difficulties or distractions.

Keywords: usability evaluation, multimodal systems, in-car navigation systems, satellite navigation systems.

1 Introduction

There is a substantial literature on the design and use of multimodal systems, most of which takes either a system or a user perspective. Taking a system perspective, issues of concern include how to select output modalities to communicate most effectively (e.g. [9]) and how to integrate user input expressed through multiple modalities to correctly interpret the user's meaning (e.g. [15]). Conversely, much work from a user perspective is concerned with how users perceive and work with system output in different modalities (e.g. [8]) or how users select modalities of communication (e.g. [14]). Little work has taken an integrative approach, considering both user and system perspectives in parallel. The work reported here takes such an approach, developing a prototype methodology for reasoning about the design of multimodal interactive systems to accommodate both input and output within the interaction. As an integrative approach, it does not consider the fine-grained details of either system implementation or user cognition, but focuses more broadly on how the two interact.

T.C.N. Graham and P. Palanque (Eds.): DSVIS 2008, LNCS 5136, pp. 1–14, 2008. © Springer-Verlag Berlin Heidelberg 2008

The method, Evaluating Multimodal Usability (EMU), was initially developed and tested using as the main case study a robotic arm interface [10]. The approach presented and illustrated here is a refinement of the method, as described below.
2 Background: Multimodal Interaction

Multimodal systems are widely considered to be ones that integrate multiple modes of input or output, typically using non-standard interaction devices. The standard configuration of keyboard and mouse for input and graphics, text and audio for output is rarely described as "multimodal", though for our purposes it would class as such.

Many definitions of a "modality" effectively consider a data stream from a particular source. For example, Lin and Imamiya [13] discuss assessing user experience by measuring various user attributes – eye gaze, pupil size, hand movements, verbal reports – and refer to each of these inputs as a modality. Similarly, Sun et al. [15] discuss data fusion across speech and gesture modalities, and Oviatt et al. [14] focus on how people select alternative modalities (i.e. input devices) for interacting with a computer system. Considering both user input and computer system output, Coutaz et al. [5] consider the combinations of modalities in terms of Complementarity, Assignment, Redundancy and Equivalence. Here, 'assignment' means that information has to be communicated through a particular modality, and 'equivalence' means that the same information can be communicated equally effectively through alternative modalities. Complementarity and redundancy refer to how information is communicated (using different modalities in complementary ways, or presenting equivalent information through multiple modalities).

From a user perspective, much work on modalities has focused on how people integrate information received through different senses. Wickens and Hollands [17] present a multiple resource theory that considers cognitive capabilities and limitations in terms of perceptual channels (vision, hearing, touch, etc.), information form (of which the two most important for our purposes are lexical and symbolic) and stages of processing.
They highlight limitations on sensory input (multiple streams of information cannot easily be received through the same channel, such as eyes or ears, simultaneously) and on input, processing and action (competing information in the same code, verbal or spatial, cannot easily be processed simultaneously). Other approaches that take a user focus include Barnard et al.'s work on Interacting Cognitive Subsystems [7], which considers the transformation and integration of sensory inputs through central processing to generate action, and Kieras et al.'s [12] work on Executive Process-Interactive Control (EPIC), which models human information processing in complex, multimodal tasks.

Since our concern is with assessing the usability of interactive systems, the capabilities and constraints on human processing, as well as those on system processing, have to be accommodated within any definition of a modality. Drawing on the insights from earlier work on modalities, we propose a definition: a modality is a temporally based instance of information perceived by a particular sensory channel. This definition comprises three key elements (time, form and channel), which need some explanation.
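The three elements of this definition can be made concrete in a small sketch. The following Python code is purely illustrative and not part of EMU itself: the `Modality`, `Channel`, and `Form` names, the interval representation of time, and the `potential_clash` helper are our own assumptions. It encodes a modality as a (time, form, channel) triple and flags the kind of same-channel or same-code competition that multiple resource theory warns about.

```python
from dataclasses import dataclass
from enum import Enum

class Channel(Enum):          # sensory channel (hypothetical subset)
    VISION = "vision"
    HEARING = "hearing"
    TOUCH = "touch"

class Form(Enum):             # information form, per Wickens and Hollands
    LEXICAL = "lexical"
    SYMBOLIC = "symbolic"

@dataclass(frozen=True)
class Modality:
    """A temporally based instance of information on a sensory channel."""
    start: float              # time the information becomes available (s)
    end: float                # time it ceases to be available (s)
    form: Form
    channel: Channel

def overlap(a: Modality, b: Modality) -> bool:
    """True if the two instances of information are present simultaneously."""
    return a.start < b.end and b.start < a.end

def potential_clash(a: Modality, b: Modality) -> bool:
    """Flag a possible breakdown: two simultaneous streams competing for
    the same sensory channel, or carrying the same information form."""
    return overlap(a, b) and (a.channel == b.channel or a.form == b.form)

# In-car example: a spoken instruction, the on-screen map, and a road sign.
speech = Modality(0.0, 2.0, Form.LEXICAL, Channel.HEARING)
map_view = Modality(0.5, 3.0, Form.SYMBOLIC, Channel.VISION)
road_sign = Modality(1.0, 1.5, Form.LEXICAL, Channel.VISION)

print(potential_clash(speech, map_view))    # False: different channel and form
print(potential_clash(map_view, road_sign)) # True: both compete for vision
```

A fuller treatment would also record the source of each modality (user or system) so that the CARE-style relationships between modalities could be expressed, but the triple above is enough to illustrate the definition.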